Archive for December 11, 2019

2019 Postman “State of the API” Report Reveals APIs Expanding Beyond Developers

Posted in Commentary with tags on December 11, 2019 by itnerd

Postman today released the results of its annual 2019 Postman “State of the API” Report. The report is based on a survey of more than 10,000 API (Application Programming Interface) developers, users, testers, and executives. The respondents provided insights on everything from how their time with APIs is spent to what they see as the most significant issues and opportunities for APIs in 2020.

While the survey reports that developers work with APIs more than anyone else in a typical organization, the reach of APIs increasingly extends beyond those who code. Only 46.6% of respondents identified as either a front-end or back-end developer (compared to 58.6% last year), with QA engineers, technical team leads, API architects, DevOps specialists, and others rounding out the field.

Key data highlights from the survey include:

API Security: While API security is a hot topic—driven by frequent reports of API security breaches and misuse—respondents feel confident in their API security postures. Nearly three-quarters feel that their APIs are “very secure” or have “above-average security.” Only 2.4% stated that their APIs were not at all secure.

API Documentation: The most helpful enhancement that API producers can make is to provide better examples in the documentation (63.5%), followed by standardization (59.4%) and sample code (57.8%). API consumers also find real-world use cases, better workflows, additional tools, and SDKs helpful, although to a lesser extent.
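
To illustrate what “better examples plus sample code” looks like in practice, here is a minimal sketch of the kind of copy-paste-ready snippet API consumers are asking for. The endpoint, fields, and API key below are hypothetical and purely illustrative; the Postman report does not point to any specific API.

```python
import requests  # pip install requests

# Hypothetical endpoint and credentials, for illustration only;
# the Postman report does not reference any specific API.
API_KEY = "YOUR_API_KEY"
BASE_URL = "https://api.example.com/v1"

# A documentation-quality example shows authentication, a realistic
# request with parameters, and how to consume the response.
response = requests.get(
    f"{BASE_URL}/orders",
    headers={"Authorization": f"Bearer {API_KEY}"},
    params={"status": "shipped", "limit": 10},
    timeout=10,
)
response.raise_for_status()

for order in response.json()["orders"]:
    print(order["id"], order["total"])
```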

Additional data points:

  • Experience: 78.2% of developers have 5 or fewer years of experience developing APIs; 12.2% have 10 or more years of experience
  • Team Size: 72.6% work on teams of 10 members or fewer, 25.7% on 11-50 member teams, and 1.7% on 50+ member teams
  • Time Spent: 26.1% of time is spent on development, 22.2% on debugging and manual testing, 11.4% on automated testing, 11.2% on designing and mocking, 9.1% on managing others, 7.3% on documentation, 5.7% on monitoring, 3.6% on publishing, and 3.3% on writing about APIs (NOTE: 70% spend more time on manual testing and debugging than they thought they should)
  • Number of APIs: 39% generally work with 1-5 APIs, 22% with 6-10, 14% with 11-20, 11% with 20-50, and 13% work with more than 50 APIs
  • Internal vs. External: 52.8% of the APIs used are internal, 28.4% are shared only among integration partners, and 18.8% are public
  • Performance: 47.6% feel that their APIs do not break, stop working, or materially change specification often enough to matter; 28.4% say it happens monthly, 15.7% weekly, and 3.2% daily
  • Industry: 52.3% work in technology, 41.2% in business and/or IT services, followed in order by banking and finance, healthcare, retail, manufacturing, government/defense, advertising/agencies, nonprofits, and a variety of other industries
  • Technology: 53.9% cited microservices as the most exciting technology for developers in the next year, while 45.5% cited containers and 44.0% serverless architecture (NOTE: OpenAPI 3.0, GraphQL, HTTP 2.0, and WebSocket were also considerations)

The complete “State of the API” report can be found here: https://www.getpostman.com/resources/infographics/api-survey-2019/

Infogix Identifies Six Data Management Trends To Keep Your Eye Out For In 2020

Posted in Commentary with tags on December 11, 2019 by itnerd

Infogix today revealed its fourth annual list of trending challenges and opportunities in data management.

Every year, Infogix’s global experts and influencers identify top data trends based on their decades of knowledge and experience working with clients worldwide.

Below are the six trends Infogix has identified for 2020.

Real-Time Data to Disrupt the Future

Massive amounts of data are generated across a diverse set of industry domains, including social networks, e-commerce, transactions, IoT devices and web applications, requiring organizations to react quickly to extract value from that data. Traditional batch processing, where data is sent on a schedule from system to system, will not meet the demands of the changing data landscape. Companies are increasingly turning to event-driven architectures to handle growing volumes of streaming data. They are using distributed streaming platforms like Apache Kafka, ActiveMQ, Apache Pulsar, Amazon Kinesis and many others to provide high-throughput, low-latency real-time streaming, flexible data retention, redundancy and scalability. In a world that demands lightning-fast speed-to-insight and real-time access to data, data quality has never been more important. Organizations must enlist vendors who can safeguard data quality to prevent data assets from becoming liabilities, and who can provide validation at a speed and scale that matches their data in motion.
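
To make the batch-versus-streaming contrast concrete, here is a minimal event-driven sketch using kafka-python, one of the community clients for Apache Kafka. The broker address, topic name, and payload are assumptions for illustration, not anything from the Infogix report.

```python
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python
import json

# Assumed local broker and topic name; adjust for a real deployment.
BROKER = "localhost:9092"
TOPIC = "orders"

# Producer: events are published the moment they happen,
# instead of being collected into a scheduled batch file.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 1001, "amount": 49.50})
producer.flush()

# Consumer: downstream systems react as events arrive, which is
# what enables real-time validation of data in motion. This loop
# blocks and processes messages as they come in.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # e.g. {'order_id': 1001, 'amount': 49.5}
```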

Cultural Change through Data Governance

More and more organizations are embracing data governance as a means to improve enterprise data understanding and create a data-driven culture. Yet many still struggle to bridge the technical/business divide. Business-focused data governance encourages collaboration between business and technical stakeholders to build user-friendly tools, like data catalogs, that explain technical data in a business context and capture critical institutional business knowledge. A business-oriented approach prioritizes business users’ understanding, empowering them to quickly turn data assets into actionable business insights. Business users won’t use or depend on data they don’t trust, making data quality a critical element of any data governance effort. Data governance that includes end-to-end data quality monitoring and metrics gives both technical and non-technical users a 360-degree view of data, which in turn leads to increased revenue, customer retention and competitive advantage.
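
At its simplest, a data catalog entry of the kind described above is just structured metadata pairing a technical asset with business context. The schema below is an illustrative assumption, not Infogix’s or any particular vendor’s model.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Illustrative data-catalog record pairing technical metadata
    with business context; field names are hypothetical."""
    table_name: str              # technical identifier
    business_name: str           # the name business users search for
    definition: str              # institutional business knowledge
    owner: str                   # accountable data steward
    quality_score: float = 0.0   # fed by data quality monitoring
    tags: list = field(default_factory=list)

entry = CatalogEntry(
    table_name="crm.cust_acct_v3",
    business_name="Customer Accounts",
    definition="One row per active customer account, refreshed nightly.",
    owner="jane.doe@example.com",
    quality_score=0.98,
    tags=["customer", "PII"],
)
print(entry.business_name, entry.quality_score)
```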

Conquering Bad Data

Even though data quality is one of the most persistent and pervasive challenges in data management, historically organizations only prioritized quality when revenue, reputation or mission-critical data was at risk. But that is changing. Complex regulatory compliance and the ever-increasing speed and scale of data have prompted organizations to prioritize data quality as a critical component of their enterprise data governance initiatives. By building a data quality-powered data governance framework, organizations improve enterprise data value and resolve data quality issues before they proliferate across systems. They understand they can’t wait for “data quality horror stories to provide evidence that poor data quality is having an impact on your organization,” as this article notes. By then, the damage is done.

Maturing Data Privacy Laws

The European Union’s General Data Protection Regulation (GDPR) was implemented nearly two years ago, serving as a global catalyst for data privacy legislation. In the U.S., states like Nevada and California have already passed sweeping legislation to protect the personal data of consumers, with many other states poised to follow suit. Noncompliant companies risk both significant financial fines and reputational damage, prompting many organizations to evaluate and address any potential compliance gaps. Businesses need strong data governance to identify and protect personal data, control data access and track lineage as data moves from sources to systems and processes, but data quality also plays a critical role in mitigating compliance risk. Poorly maintained data and poor quality data can both easily result in compliance violations that impact an organization’s brand and bottom line.

Self-Service Technologies on the Rise

Tools and technologies with machine learning (ML) and automation capabilities that enable self-service data analytics took off in 2019. Still, we often see these tools leveraged as part of a departmental project, rather than an enterprise program. To scale enterprise-wide, organizations must encourage data literacy among users so self-service analysis yields accurate and actionable results. Organizations must also establish policies for data access and usage, and ensure the accuracy of high-value data with key capabilities including timeliness, completeness and integrity checks. Only quality data will yield quality business insights.
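
The timeliness, completeness and integrity checks mentioned above map onto simple, automatable rules. Here is a minimal sketch using pandas on a made-up dataset; the column names and the 24-hour freshness window are assumptions for illustration.

```python
from datetime import datetime, timedelta
import pandas as pd  # pip install pandas

# Hypothetical dataset for illustration only.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "loaded_at": [datetime.now()] * 4,
})

# Completeness: required fields must not be null.
completeness = 1 - df[["customer_id", "email"]].isna().mean()

# Integrity: a key column must be unique (after dropping nulls).
duplicate_keys = df["customer_id"].dropna().duplicated().sum()

# Timeliness: data must have landed within the freshness window.
is_fresh = (datetime.now() - df["loaded_at"].max()) < timedelta(hours=24)

print(completeness.to_dict(), duplicate_keys, is_fresh)
```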

The 2020 Buzzword: Automation

In the coming year, expect everyone to be talking automation! From hyper-automation using machine learning and AI, to workforce automation that eliminates jobs, to IoT building automation for physical plant efficiency—automation will be a top focus in data and technology. In analytics, companies will have to take self-service to the next level, not just empowering business users to analyze data, but completely automating data science tasks so they can focus more on leveraging insights than generating them. With automated data and analytics, data integrity will be even more critical, demanding automated data quality detection, monitoring and improvement.

To learn more about these data management trends for 2020 and beyond, visit http://www.infogix.com or @infogix.

Using Data For Good – Sault Ste. Marie, Ontario Sets A ‘Smart’ Example

Posted in Commentary on December 11, 2019 by itnerd

By leveraging public data for public good, Sault Ste. Marie has set an example for what it means to be a ‘smart’ city. The recent federally funded case study, Building Data-Smart City Solutions, examines how Acorn Information Solutions, a division of the Sault Ste. Marie Innovation Centre, and its partners use data to improve the community.

With 20 years of experience, approximately 50 partnerships and clients locally, and over 100 across Ontario and Canada, Acorn Information Solutions is a leading example of how municipalities can use multi-enterprise Geographic Information System (GIS) data to create efficiencies in municipal operations and planning, improve health and human services, stimulate economic development and more.

CFN Consultants Inc., a security and defence consultancy based in Nova Scotia, is quoted in the document describing Acorn as “without question, the most advanced organization of its type in Canada.”

To read Building Data-Smart City Solutions, visit https://ssmic.com/news-resources/publications-resources/. Completion of the case study was made possible through funding received from FedNor.

Siemplify Announces New Linux-based Platform

Posted in Commentary on December 11, 2019 by itnerd

Siemplify today released a new version of its flagship security operations platform. Boasting a high-performance Linux-based architecture, the new version delivers improved investigation, automation and response capabilities that set new standards for enterprise readiness and ease of playbook lifecycle management.

The new Siemplify Security Operations Platform has been redesigned with scalability, robustness and the cloud in mind. For example, one global managed security service provider (MSSP) currently processes 50,000 correlated alerts each day from 15 different SIEMs across more than 50 customer sites, while a Fortune 100 energy conglomerate enriches more than 100,000 alerts each week.

The new version also extends Siemplify’s market-leading ability to seamlessly manage SOAR (security orchestration, automation and response) across multiple customer environments – addressing the unique needs of MSSPs, as well as enterprises with multiple discrete business units. The new Siemplify lightweight remote agent securely collects alerts, enriches them and performs ad-hoc actions and remediations across the remote environment, complete with full redundancy and simple, yet powerful central management.

The new version also introduces a modular approach to incident response playbook design that eliminates redundant actions, dramatically simplifying playbook lifecycle management. By introducing a new “block” concept to playbooks, users can create one block of actions for a given use case, such as enrichment or response, and reuse that block in any playbook that requires it. Any changes made to the individual blocks automatically cascade through all the playbooks that contain them. This approach to playbook design, combined with the advanced expression builder released earlier this year, delivers unparalleled ease of playbook creation and maintenance.
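
Siemplify has not published the internals of its block model, but the core idea, reusable blocks whose edits cascade into every playbook that references them, can be sketched in a few lines. All class, playbook, and action names here are hypothetical.

```python
# Hypothetical sketch of the "block" concept: playbooks hold shared
# references to blocks, so editing a block's actions automatically
# changes every playbook that contains it.

class Block:
    def __init__(self, name, actions):
        self.name = name
        self.actions = actions  # ordered list of callables

    def run(self, alert):
        for action in self.actions:
            action(alert)

class Playbook:
    def __init__(self, name, blocks):
        self.name = name
        self.blocks = blocks  # shared references, not copies

    def run(self, alert):
        for block in self.blocks:
            block.run(alert)

# One enrichment block reused by two different playbooks.
enrich = Block("enrich", [lambda a: a.setdefault("geo", "ip-lookup")])
phishing = Playbook("phishing", [enrich])
malware = Playbook("malware", [enrich])

# A change to the block cascades: both playbooks now also tag severity.
enrich.actions.append(lambda a: a.setdefault("severity", "medium"))

alert = {"src_ip": "203.0.113.7"}
phishing.run(alert)
print(alert)  # {'src_ip': '203.0.113.7', 'geo': 'ip-lookup', 'severity': 'medium'}
```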

For more information on the latest release, visit the Siemplify blog at: https://www.siemplify.co/blog/product-update-whats-new-in-v5-3-of-the-siemplify-security-operations-platform

Review: Mujjo Double-Insulated Touchscreen Gloves

Posted in Products with tags on December 11, 2019 by itnerd

This past week I have been testing the Mujjo Double-Insulated Touchscreen Gloves, which are designed to keep your hands warm while allowing you to use your phone or smart watch. They have two key features that you should be aware of. The first is that they are triple-layered and double-insulated, fully laminated with both wind-resistant Micro Fleece and 3M Thinsulate. That’s all designed to keep your hands warm.

The second is the textured surface, which allows you to grip your phone without worrying about it slipping out of your hands. Now I normally use my iPhone XS with a case, but in the interest of science, I took it out of the case and used these gloves to hold my phone. It felt very secure, and at no time did I feel that it was going to slip out of my hand.

But here’s the key claim that these gloves make: they are engineered to withstand much colder temperatures. Testing that was a problem, as the weather never got below -4C here in Toronto. But I was able to go out for a brisk 20-minute walk at night once the temperature got down to -4C, and my hands warmed up to the point that they were sweating 12 minutes into the walk. Based on that, I can see how these gloves would keep your hands warm in much colder weather.

As for using them with your phone, that’s a total win on two fronts. The first is that with other touchscreen winter gloves I’ve tried, I had to use the very tip of my finger to interact with my phone; with the Mujjo gloves, interacting with my phone felt completely natural, no different than using the phone with no gloves at all. The second is that it was insanely easy to interact with my Apple Watch while wearing these gloves, which is something I have had issues with when using other touchscreen winter gloves.

Top Tip: If you are ordering these gloves online, it is super important that you visit this site and print out the templates to figure out what your size is. By doing so, you can ensure that the gloves will fit you properly and that the touchscreen functionality will work perfectly.

Mujjo Double-Insulated Touchscreen Gloves sell for 49.50 Euros. They’re absolutely worth a look if you’re someone who wants a pair of gloves that will keep your hands warm, but will also allow you to interact with your phone or smart watch with ease.