EQ Bank Partners With TransferWise To Shake Up International Money Transfers

Posted in Commentary with tags on December 12, 2019 by itnerd

EQ Bank is changing the game again, announcing a partnership with TransferWise, the global technology company for international money transfers, that shakes up how Canadians send money overseas. The result is fully transparent, remarkably fast international money transfers that are up to 8x cheaper for EQ Bank customers.

Working with TransferWise for Banks, EQ Bank has integrated TransferWise’s API directly into its infrastructure. This integration allows EQ Bank customers to send money right from their Savings Plus Account at the real exchange rate, paying only a small, transparent TransferWise fee, all while earning 2.30 per cent interest until the moment they hit send.
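
To make the “real exchange rate” claim concrete, here is a minimal sketch of the fee math in Python. Every number in it is hypothetical and does not reflect actual TransferWise or bank pricing; the point is simply that a visible flat-plus-percentage fee on the mid-market rate is easy to compare, while a marked-up exchange rate hides the cost in the spread.

    # Hypothetical fee math: all rates and fees below are invented for illustration.

    def transparent_cost(amount_cad, mid_rate, fixed_fee, pct_fee):
        """Convert at the mid-market rate and charge a visible fee up front."""
        fee = fixed_fee + amount_cad * pct_fee
        return (amount_cad - fee) * mid_rate, fee

    def marked_up_cost(amount_cad, mid_rate, markup_pct, wire_fee):
        """Convert at a marked-up rate; the spread is the hidden fee."""
        effective_rate = mid_rate * (1 - markup_pct)
        hidden_fee = amount_cad * markup_pct + wire_fee
        return (amount_cad - wire_fee) * effective_rate, hidden_fee

    mid = 0.68  # hypothetical CAD-to-EUR mid-market rate
    eur_a, fee_a = transparent_cost(1000, mid, 3.50, 0.005)  # ~8.50 CAD in fees
    eur_b, fee_b = marked_up_cost(1000, mid, 0.025, 25.00)   # ~50.00 CAD in fees
    print(f"transparent fee: {eur_a:.2f} EUR received, ~{fee_a:.2f} CAD in fees")
    print(f"marked-up rate:  {eur_b:.2f} EUR received, ~{fee_b:.2f} CAD in fees")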

For EQ Bank, a leader in digital banking and the first Canadian bank to move its core system to the cloud, this partnership marks another significant step in driving innovation in financial services.

While Canada is on the cusp of delivering open banking, partnerships with fintechs, improving digital technologies, and investing in agile cloud architectures will be critical to delivering first-class services across Canada, paving the way for more sophisticated ways for Canadians to bank.

Why I Am Now Using DNS.Watch As My DNS Provider…. At Least For Now

Posted in Commentary on December 12, 2019 by itnerd

Yesterday I came home to discover that my Internet wasn’t working properly. The symptoms: some websites would load while others were inaccessible, and even when a site did load, there was a chance its content would not display properly. My suspicion was that the DNS, or Domain Name System, server that I was using was not working properly.

A quick tutorial on what DNS is: DNS is the system that translates the domain names you type into a browser into the IP addresses required to reach those sites. You need access to a DNS server to do anything on the Internet, and your ISP will provide you with access to theirs. But that may not be a good thing, for reasons that I will get to in a minute.
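
If you want to see DNS in action, here is a quick sketch using nothing but Python’s standard library. It asks whatever DNS server your system is configured to use to resolve a hostname (example.com is just a placeholder) into addresses:

    import socket

    # Resolve a hostname to its IP addresses using the system's DNS server.
    for family, _, _, _, sockaddr in socket.getaddrinfo(
        "example.com", 443, proto=socket.IPPROTO_TCP
    ):
        print(sockaddr[0])  # an IPv4 or IPv6 address for the site

If this lookup hangs or fails while a direct connection to a known IP address still works, your DNS server is the likely culprit, which is exactly the sort of failure I was seeing.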

As a troubleshooting step, I removed the Cloudflare DNS service that I had been using instead of my ISP’s DNS service and replaced it with the DNS service provided by Rogers, which is my ISP of the moment. By doing that, all my problems went away.

Now a word about why I don’t use the DNS service provided by Rogers. A few years ago Rogers was caught doing DNS redirection, meaning that if you mistyped an address in your browser, the Rogers DNS service would take you to a search page with ads. Besides displaying unwanted ads, it’s also a security risk that I wrote about here. It was a pretty shady thing for Rogers to be doing, as ISPs in my opinion shouldn’t be doing stuff like that, and I haven’t trusted them enough since then to use their DNS service full time. But to be fair to Rogers, they weren’t the only ones doing this sort of thing, as Bell was caught doing something similar. The result is that I have used public DNS services ever since. I started off with OpenDNS. Then I moved to Level3’s DNS service when OpenDNS got bought by Cisco and they wanted you to register to use it. More recently I had been using Cloudflare’s DNS service, as that was the new cool thing to use. But last night forced me to move again because of the issues that I was seeing.

Since I refuse to use the Rogers DNS service under any circumstances, as I don’t know if they still do DNS redirection, I have at least for now moved to DNS.Watch for the following reasons:

  • DNS Neutrality — The servers do not censor any DNS requests. This differs from some ISPs around the world who actively censor what you can and cannot access.
  • Privacy Protection — The company does not log any DNS queries, so it is not recording any of your actions. By contrast, a typical ISP DNS server may log your history, and some don’t even anonymize the data collected.
  • Data is not for sale — The company, as far as I am aware, does not have any business deals in place with ad networks or other institutions that have an interest in learning about your online habits.
  • No ISP DNS Hijacking — This goes back to the sorts of things that Rogers and Bell have been caught doing. DNS.Watch doesn’t do that at all, which is a good thing for you. 
  • IPv4 and IPv6 Support — It supports both IPv4 and IPv6 addresses. That way you future-proof yourself, seeing as IPv4 addresses are running out.
  • DNSSEC Support — DNS.Watch supports DNSSEC, which is used by many sites to ensure that the data you receive legitimately comes from the real site and not from a hijacked domain or other trickery.

The only weakness of DNS.Watch is that their setup instructions aren’t the best, as they don’t have instructions for setting up your average consumer-grade router to use the service. For experienced users that’s not a big deal, as they’ll figure it out. But for the average user it can be a bit of a challenge to set up, and DNS.Watch should really address that.
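
That said, if you just want to kick the tires on DNS.Watch from a computer before reconfiguring your router, the sketch below is one way to do it. It uses the third-party dnspython package, and it assumes 84.200.69.80 and 84.200.70.40 are DNS.Watch’s IPv4 resolvers (those are the addresses they publish, but verify them on dns.watch before relying on this):

    import dns.flags
    import dns.resolver  # pip install dnspython (2.x for resolve())

    resolver = dns.resolver.Resolver(configure=False)
    # DNS.Watch resolvers; confirm the current addresses on dns.watch.
    resolver.nameservers = ["84.200.69.80", "84.200.70.40"]
    resolver.use_edns(0, dns.flags.DO, 1232)  # ask for DNSSEC records

    answer = resolver.resolve("example.com", "A")
    for record in answer:
        print(record.address)

    # The AD (Authenticated Data) flag means the resolver validated DNSSEC.
    print("DNSSEC validated:", bool(answer.response.flags & dns.flags.AD))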

I’ve been using it for the last few hours and it seems very quick and responsive. For the present time I am going to stick with it and see how it performs, but I am also going to look at my options for a public DNS service to see which is the best one. In the meantime, I would love to know what happened last night to Cloudflare’s DNS service. I’ve talked to a couple of people and they had issues with it as well, but I can’t find anything online that speaks to what happened. Some clarity on that would be nice so that I can pick the best public DNS service for me.

Review: ASUS AiMesh AX6100 Wi-Fi system

Posted in Products with tags on December 12, 2019 by itnerd

Mesh WiFi has been a thing for a while now, and to be frank, ASUS is kind of late to the market. But that’s not a bad thing. In fact, it’s a very good thing based on my experience with their AiMesh AX6100 Wi-Fi system, which I have been testing over the last few days.

[Photo: the two AiMesh AX6100 nodes]

I got two of these nodes from ASUS, and they kind of have the vibe of the company’s gaming routers, except that they don’t look as over the top. That’s a good thing, as it opens this mesh WiFi system up to more people, and the nodes will look good in more places. The AiMesh AX6100 Wi-Fi system has a lot going for it that frankly puts other mesh WiFi products to shame. Let’s start with this:

[Photo: the rear of a node, showing the Ethernet ports]

It comes with four gigabit LAN ports, plus a gigabit WAN port that connects to your modem.

[Photo: the USB 2.0 and USB 3.0 ports]

You also get a USB 2.0 port and a USB 3.0 port. All of that allows you to connect storage devices, printers, wired computers, and the like, which gives you a number of use cases for this mesh WiFi system. And what’s cool is that both nodes are identical, so you can connect either unit to the modem supplied by your Internet service provider via an Ethernet cable.

The real star of the show from a specifications perspective is that this is a Wi-Fi 6 mesh system. And rather than simply offering Wi-Fi 6 for the people who have iPhone 11s and Galaxy S10s, Wi-Fi 6 can be used for wireless backhaul between the two nodes. That works incredibly well, and I’ll give you details on it in a second. But let me get to the setup process first.

Set-up was insanely easy using the ASUS Router app (available for iOS and Android). The app recommends placing the two units within a few meters of each other during the initial pairing process, after which you can place them farther apart. In my case, I put one in the living room of my sub-1,000 sq ft condo and the other in the bedroom. For what it’s worth, ASUS says a pair of these routers provides wireless coverage of up to 5,500 sq ft.

Once I set things up, I went about my testing and found that this setup produces some of the fastest speeds that I have seen from a mesh WiFi setup. I clocked an average download speed of 550 Mbps to 600 Mbps on my MacBook Pro running 802.11ac, placed in the bedroom of my condo. By comparison, other brands of mesh routers that I had previously tested weren’t even in the same star system when it came to speed in the same scenario. I was blown away by this result. And for what it’s worth, it is possible to connect the two nodes by gigabit Ethernet, so in a bigger house you could use that for backhaul instead, which may be better in that use case as a wired connection is more stable than a wireless one. Having said that, I had no issues with the wireless setup that I tested.

The AX6100 also comes bundled with a couple of extras:

  • AiProtection Pro, which is a suite of home network security tools powered by Trend Micro that defends your connected home devices from cyber threats, and includes parental controls to restrict Internet access or block inappropriate content from reaching children.
  • WTFast, which promises lower latency and less lag while gaming by funneling gaming traffic through optimized network routes.

In terms of price, expect to pay $550 CDN or so for the two pack that I tested. Each additional node is about $270 CDN. If you need a mesh WiFi system, this is the one to get at present.

2019 Postman “State of the API” Report Reveals APIs Expanding Beyond Developers

Posted in Commentary with tags on December 11, 2019 by itnerd

Postman today released the results of its annual 2019 Postman “State of the API” Report. The report is based on a survey of more than 10,000 API (Application Programming Interface) developers, users, testers, and executives. The respondents provided insights on everything from how they spend their time with APIs to what they see as the most significant issues and opportunities for APIs in 2020.

While the survey reports that more developers work with APIs than anyone else in a typical organization, the reach of APIs increasingly extends beyond those who code. Only 46.6% of respondents identified as either front-end or back-end developers (compared to 58.6% last year), with QA engineers, technical team leads, API architects, DevOps specialists, and others rounding out the field.

Key data highlights from the survey include:

API Security: While API security is a hot topic—driven by frequent reports of API security breaches and misuse—respondents feel confident in their API security postures. Nearly three-quarters feel that their APIs are “very secure” or have “above-average security.” Only 2.4% stated that their APIs were not at all secure.

API Documentation: The most helpful enhancement that API producers can make is to provide better examples in the documentation (63.5%), followed by standardization (59.4%) and sample code (57.8%). API consumers also find real-world use cases, better workflows, additional tools, and SDKs helpful, although to a lesser extent.

Additional data points:

  • Experience: 78.2% of developers have 5 or fewer years of experience developing APIs; 12.2% have 10 or more years of experience
  • Team Size: 72.6% work on teams of 10 members or fewer, 25.7% on 22-50 member teams, and 1.7% on 50+ member teams
  • Time Spent: 26.1% of time is spent on development, 22.2% on debugging and manual testing, 11.4% on automated testing, 11.2% on designing and mocking, 9.1% on managing others, 7.3% on documentation, 5.7% on monitoring, 3.6% on publishing, and 3.3% on writing about APIs (NOTE: 70% spend more time on manual testing and debugging than they thought they should)
  • Number of APIs: 39% generally work with 1-5 APIs, 22% with 6-10, 14% with 11-20, 11% with 20-50, and 13% work with more than 50 APIs
  • Internal vs. External: 52.8% of APIs used are internal, 28.4% used were shared only among integration partners, and 18.8% used are public
  • Performance: 47.6% feel that their APIs do not break, stop working, or materially change specification often enough to matter, 28.4% said it happens monthly, 15.7% weekly, and 3.2% daily
  • Industry: 52.3% work in technology, 41.2% in business and/or IT services, followed in order by banking and finance, healthcare, retail, manufacturing, government/defense, advertising/agencies, nonprofits, and a variety of other industries
  • Technology: 53.9% said microservices is the most exciting technology for developers in the next year, while 45.5% said containers and 44.0% said serverless architecture (NOTE: OpenAPI 3.0, GraphQL, HTTP 2.0, and WebSocket were also considerations)

The complete “State of the API” report can be found here: https://www.getpostman.com/resources/infographics/api-survey-2019/

Infogix Identifies Six Data Management Trends To Keep Your Eye Out For In 2020

Posted in Commentary with tags on December 11, 2019 by itnerd

Infogix today revealed its fourth annual list of trending challenges and opportunities in data management.

Every year, Infogix’s global experts and influencers identify top data trends based on their decades of knowledge and experience working with clients worldwide.

Below are the six trends Infogix has identified for 2020.

Real-Time Data to Disrupt the Future

Massive amounts of data are generated from a diverse set of industry domains, including social networks, e-commerce, transactions, IoT devices and web applications, requiring organizations to react quickly to extract value from that data. Traditional batch processing, where data is sent on a schedule from system to system, will not meet the demands of the changing data landscape. Companies are increasingly turning to event-driven architectures to handle growing volumes of streaming data. They are using distributed streaming platforms like Apache Kafka, ActiveMQ, Apache Pulsar, Amazon Kinesis and many others to provide high-throughput, low latency real-time streaming, flexible data retention, redundancy and scalability. In a world that demands lightning-fast speed-to-insights and real-time access to data, data quality has never been so important. Organizations must enlist vendors who can safeguard data quality to prevent data assets from becoming liabilities and provide validation at a speed and scale to match their data-in-motion.
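
As a rough illustration of the difference between scheduled batch jobs and event-driven processing, here is a minimal producer/consumer sketch against Kafka using the kafka-python package. The broker address and topic name are assumptions made up for this sketch:

    import json
    from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

    BROKER = "localhost:9092"  # hypothetical broker
    TOPIC = "orders"           # hypothetical topic

    # Producer side: emit each event as it happens instead of batching on a schedule.
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"order_id": 42, "amount": 19.99})
    producer.flush()

    # Consumer side: react to events with low latency as they arrive.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(message.value)  # a natural place to validate data-in-motion quality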

Cultural Change through Data Governance

More and more organizations are embracing data governance as a means to improve enterprise data understanding and create a data-driven culture. Yet many still struggle to bridge the technical/business divide. Business-focused data governance encourages collaboration between business and technical stakeholders to build user-friendly tools, like data catalogs, that explain technical data in a business context and include critical institutional business knowledge. A business-oriented approach prioritizes business user understanding, empowering them to quickly turn data assets into actionable business insights. Business users won’t use or depend on data they don’t trust, making data quality a critical element of any data governance effort. Data governance that includes end-to-end data quality monitoring and metrics gives both technical and non-technical users a 360-degree view of data that will lead to increased revenue, customer retention and competitive advantage.

Conquering Bad Data

Even though data quality is one of the most persistent and pervasive challenges in data management, historically organizations only prioritized quality when revenue, reputation or mission-critical data was at risk. But that is changing. Complex regulatory compliance and the ever-increasing speed and scale of data have prompted organizations to prioritize data quality as a critical component of their enterprise data governance initiatives. By building a data quality-powered data governance framework, organizations improve enterprise data value and resolve data quality issues before they proliferate across systems. They understand they can’t wait for “data quality horror stories to provide evidence that poor data quality is having an impact on your organization,” as this article notes. By then, the damage is done.

Maturing Data Privacy Laws

The European Union’s General Data Protection Regulation (GDPR) was implemented nearly two years ago, serving as a global catalyst for data privacy legislation. In the U.S., states like Nevada and California have already passed sweeping legislation to protect the personal data of consumers, with many other states poised to follow suit. Noncompliant companies risk both significant financial fines and reputational damage, prompting many organizations to evaluate and address any potential compliance gaps. Businesses need strong data governance to identify and protect personal data, control data access and track lineage as data moves from sources to systems and processes, but data quality also plays a critical role in mitigating compliance risk. Poorly maintained data and poor quality data can both easily result in compliance violations that impact an organization’s brand and bottom line.

Self-Service Technologies on the Rise

Tools and technologies with machine learning (ML) and automation capabilities that enable self-service data analytics took off in 2019. Still, we often see these tools leveraged as part of a departmental project, rather than an enterprise program. To scale enterprise-wide, organizations must encourage data literacy among users so self-service analysis yields accurate and actionable results. Organizations must also establish policies for data access and usage, and ensure the accuracy of high-value data with key capabilities including timeliness, completeness and integrity checks. Only quality data will yield quality business insights.
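
To make “timeliness, completeness and integrity checks” concrete, here is a minimal sketch of what such checks might look like in pandas. The column names, sample data, and 90-day freshness window are all invented for illustration:

    from datetime import datetime, timedelta, timezone
    import pandas as pd

    df = pd.DataFrame({
        "customer_id": [1, 2, 2, None],
        "updated_at": pd.to_datetime(
            ["2020-01-01", "2020-01-02", "2020-01-02", "2019-06-01"], utc=True
        ),
    })

    # Completeness: what share of a key field is populated?
    completeness = df["customer_id"].notna().mean()

    # Timeliness: how much of the data was refreshed inside the freshness window?
    cutoff = datetime.now(timezone.utc) - timedelta(days=90)
    timeliness = (df["updated_at"] >= cutoff).mean()

    # Integrity: a key field should not contain duplicates.
    duplicates = int(df["customer_id"].dropna().duplicated().sum())

    print(f"completeness={completeness:.0%} timeliness={timeliness:.0%} duplicates={duplicates}")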

The 2020 Buzzword: Automation

In the coming year, expect everyone to be talking automation! From hyper-automation using machine learning and AI, to workforce automation that eliminates jobs, to IoT building automation for physical plant efficiency—automation will be a top focus in data and technology. In analytics, companies will have to take self-service to the next level, not just empowering business users to analyze data, but completely automating data science tasks so they can focus more on leveraging insights than generating them. With automated data and analytics, data integrity will be even more critical, demanding automated data quality detection, monitoring and improvement.

To learn more about these data management trends for 2020 and beyond, visit http://www.infogix.com or @infogix.

Using Data For Good – Sault Ste. Marie Ontario Sets A ‘Smart’ Example

Posted in Commentary on December 11, 2019 by itnerd

By leveraging public data for public good, Sault Ste. Marie has set an example for what it means to be a ‘smart’ city. The recent federally funded case study, Building Data-Smart City Solutions, examines how Acorn Information Solutions, a division of the Sault Ste. Marie Innovation Centre, and its partners use data to improve the community.

With 20 years of experience, approximately 50 partnerships and clients locally, and over 100 across Ontario and Canada, Acorn Information Solutions is the best example of how municipalities can use multi-enterprise Geographic Information System (GIS) data to create efficiencies in municipal operations and planning, improve health and human services, stimulate economic development and more.

CFN Consultants Inc., a security and defence consultancy out of Nova Scotia, is quoted in the document describing Acorn as “without question, the most advanced organization of its type in Canada.”

To read Building Data-Smart City Solutions, visit https://ssmic.com/news-resources/publications-resources/. Completion of the case study was made possible through funding received from FedNor.

Siemplify Announces New Linux-based Platform

Posted in Commentary on December 11, 2019 by itnerd

Siemplify today released a new version of its flagship security operations platform. Boasting a high-performance Linux-based architecture, the new version delivers improved investigation, automation and response capabilities that set new standards for enterprise readiness and ease of playbook lifecycle management.

The new Siemplify Security Operations Platform has been redesigned with scalability, robustness and the cloud in mind. For example, one global managed security service provider (MSSP) currently processes 50,000 correlated alerts each day from 15 different SIEMs across more than 50 customer sites, while a Fortune 100 energy conglomerate enriches more than 100,000 alerts each week.

The new version also extends Siemplify’s market-leading ability to seamlessly manage SOAR (security orchestration, automation and response) across multiple customer environments – addressing the unique needs of MSSPs, as well as enterprises with multiple discrete business units. The new Siemplify lightweight remote agent securely collects alerts, enriches them and performs ad-hoc actions and remediations across the remote environment, complete with full redundancy and simple, yet powerful central management.

The new version also introduces a modular approach to incident response playbook design that eliminates redundant actions, dramatically simplifying playbook lifecycle management. By introducing a new “block” concept to playbooks, users can create one block of actions for common use cases, such as enrichment or response, and reuse those blocks in any playbook that requires them. Any changes made to the individual blocks automatically cascade through all the playbooks that contain them. This approach to playbook design, combined with the advanced expression builder released earlier this year, delivers unparalleled ease of playbook creation and maintenance.
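
Siemplify hasn’t published the internals, but the cascade behaviour is easy to picture: if playbooks hold references to shared blocks rather than private copies, editing a block updates every playbook that uses it. A purely illustrative Python model (none of these names come from the Siemplify product):

    from dataclasses import dataclass, field

    @dataclass
    class Block:
        """A reusable sequence of actions, e.g. enrichment or response."""
        name: str
        actions: list = field(default_factory=list)

    @dataclass
    class Playbook:
        name: str
        blocks: list = field(default_factory=list)  # references, not copies

        def run(self, alert):
            for block in self.blocks:
                for action in block.actions:
                    action(alert)

    # One enrichment block, reused by reference in two playbooks.
    enrich = Block("enrich", [lambda alert: alert.setdefault("geo", "looked-up")])
    phishing = Playbook("phishing", [enrich])
    malware = Playbook("malware", [enrich])

    # A change to the shared block cascades to every playbook containing it.
    enrich.actions.append(lambda alert: alert.setdefault("whois", "looked-up"))

    alert = {"id": 1}
    phishing.run(alert)
    print(alert)  # {'id': 1, 'geo': 'looked-up', 'whois': 'looked-up'}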

For more information on the latest release, visit the Siemplify blog at: https://www.siemplify.co/blog/product-update-whats-new-in-v5-3-of-the-siemplify-security-operations-platform