Aptum Earns Microsoft Azure Expert Managed Service Provider Recognition

Posted in Commentary with tags on January 30, 2023 by itnerd

Aptum, a hybrid multi-cloud managed service provider (MSP), today announced it has been recognized by Microsoft as an Azure Expert MSP. This designation identifies Aptum as a qualified global partner to deliver Azure solutions to customers.

Aptum is among a select group of MSPs globally to earn this certification, having completed an extensive audit by an independent third party. The certification process consisted of a rigorous review of 66 controls in areas such as:

  • Business Health and Managed Service Focus
  • Microsoft Services
  • Assessment and Design
  • Build and Migration
  • Cloud Operations and Service Management 
  • Security and Governance
  • Cloud SLAs, Customer Satisfaction, and Cost Optimization
  • Continual Improvement and Process Optimization

Aptum also provided multiple customer references for projects successfully delivered over the last 12 months. 

As an Azure Expert MSP, Aptum is well equipped to help organizations meet their evolving technology needs and achieve their business objectives. The company recently earned other Microsoft partner designations, highlighting its commitment to training and accreditation, as well as its expertise. 

  • The Microsoft Solutions Partner for Data & AI (Azure) designation demonstrates Aptum’s ability to assist customers with the management of their data across multiple systems to build analytics and AI solutions
  • The Microsoft Solutions Partner for Digital & App Innovation (Azure) certification establishes Aptum’s capability to help customers build, run, and manage applications across multiple clouds, on premises, and at the edge, with frameworks and tools customers choose
  • As a Microsoft Solutions Partner for Infrastructure (Azure), Aptum is identified as a partner that can help customers accelerate migration of key infrastructure workloads to Microsoft Azure.

Developers Are Fleeing Twitter For Mastodon

Posted in Commentary with tags on January 30, 2023 by itnerd

There’s been a fair amount of news about users fleeing Twitter for Mastodon. But what’s now starting to come to light is that developers are doing the same thing. They’re being driven out by the ban on third party clients on the platform, and as a result are looking for a new place to call home:

When Twitter quietly updated its developer policies to ban third-party clients from its platform, it abruptly closed an important chapter of Twitter’s history. Unlike most of its counterparts, which tightly control what developers are able to access, Twitter has a long history with independent app makers.

Now, the developers of some Twitter clients are turning their attention to another upstart platform: Mastodon. This week, Tapbots, the studio behind Tweetbot, released Ivory, a Mastodon client based on its longtime Twitter app. Matteo Villa, the developer behind Twitter app Fenix, is testing a Mastodon client of his own called Wooly. Junyu Kuang, the indie developer behind Twitter client Spring, is working on a Mastodon app called Mona. Shihab Mehboob, developer of Twitter app Aviary, is close to launching a Mastodon client called Mammoth.

The one-time Twitter developers join a growing group of independent app makers who have embraced Mastodon, the open-source social network that’s seen explosive growth since Elon Musk took over Twitter. The decentralized service now has more than 1.5 million users across nearly 10,000 servers. That, coupled with Mastodon’s open-source, “API-first” approach, has attracted dozens of developers eager to put their own spin on the service.
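That “API-first” approach is concrete: every Mastodon server exposes the same documented REST endpoints, such as `GET /api/v1/timelines/public`, so client developers can build against them without any gatekeeping or approval process. As a rough sketch of what consuming that endpoint looks like, here is a small Python example; the payload below is invented for illustration, though the field names follow Mastodon’s documented Status entity:

```python
import json

# A trimmed, made-up example of the JSON a Mastodon server returns from
# GET /api/v1/timelines/public. Field names follow Mastodon's documented
# Status entity; the values are invented.
sample_response = json.dumps([
    {
        "id": "109742000000000000",
        "created_at": "2023-01-30T12:00:00.000Z",
        "content": "<p>Hello, fediverse!</p>",
        "account": {"acct": "alice@example.social", "display_name": "Alice"},
        "favourites_count": 3,
        "reblogs_count": 1,
    }
])

def summarize_timeline(raw_json: str) -> list[str]:
    """Turn a public-timeline payload into one-line summaries."""
    statuses = json.loads(raw_json)
    return [
        f'{s["account"]["acct"]}: {s["favourites_count"]} favourites'
        for s in statuses
    ]

print(summarize_timeline(sample_response))
```

Because every server speaks this same API, a client like Ivory or Mona works against any instance its user happens to sign up on, which is exactly what makes the platform attractive to third-party developers.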

I question the number of users on Mastodon that is quoted in the article because an account on Mastodon which tracks the number of users on the platform says this:

But besides that, developers moving to Mastodon will help grow the platform, as it not only gives users more choice of Mastodon clients, but also drives innovation on the platform. Both of those will help make Mastodon a much better option than Twitter for those who want to be on some form of social media, since there’s no innovation going on at Twitter at the moment, and you can only use their client or their web page to see Tweets. And to be frank, Twitter’s native client sucks; third party clients were always a much better way to access Twitter.

Bottom line: You can add this to the list of reasons why Twitter is a train wreck next to a dumpster fire.

Microsoft Posts Report On Last Week’s Outage

Posted in Commentary with tags on January 29, 2023 by itnerd

Last week, Microsoft had a major outage that affected a lot of their services including:

  • Teams
  • Xbox Live
  • Outlook
  • Microsoft 365 
  • Minecraft
  • Azure
  • GitHub
  • Microsoft Store

At the time, Microsoft said that a networking change caused this. And at the time, I said this:

My question for Microsoft, which I hope they answer, is what specifically happened and what they will do to ensure that it doesn’t happen again. Microsoft does give some version of this information out, so I for one will be interested to see what they say.

And now Microsoft has posted a Preliminary Post Incident Review that goes into more detail and answers the questions that I had:

We determined that a change made to the Microsoft Wide Area Network (WAN) impacted connectivity between clients on the internet to Azure, connectivity across regions, as well as cross-premises connectivity via ExpressRoute. As part of a planned change to update the IP address on a WAN router, a command given to the router caused it to send messages to all other routers in the WAN, which resulted in all of them recomputing their adjacency and forwarding tables. During this re-computation process, the routers were unable to correctly forward packets traversing them. The command that caused the issue has different behaviors on different network devices, and the command had not been vetted using our full qualification process on the router on which it was executed.

And this is how they responded:

Our monitoring initially detected DNS and WAN related issues from 07:12 UTC. We began investigating by reviewing all recent changes. By 08:10 UTC, the network started to recover automatically. By 08:20 UTC, as the automatic recovery was happening, we identified the problematic command that triggered the issues. Networking telemetry shows that nearly all network devices had recovered by 09:00 UTC, by which point the vast majority of regions and services had recovered. Final networking equipment recovered by 09:35 UTC.

Due to the WAN impact, our automated systems for maintaining the health of the WAN were paused, including the systems for identifying and removing unhealthy devices, and the traffic engineering system for optimizing the flow of data across the network. Due to the pause in these systems, some paths in the network experienced increased packet loss from 09:35 UTC until those systems were manually restarted, restoring the WAN to optimal operating conditions. This recovery was completed at 12:43 UTC.

And this is how they will stop this from happening again:

  • We have blocked highly impactful commands from getting executed on the devices (Completed)
  • We will require all command execution on the devices to follow safe change guidelines (Estimated completion: February 2023)
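Microsoft hasn’t published what its command blocking actually looks like, but the underlying idea, gating device commands behind an allowlist of vetted commands plus an explicit blocklist, is simple to illustrate. The Python sketch below is purely hypothetical; the command names and policy are invented for illustration and are not Microsoft’s:

```python
# Hypothetical pre-execution command gate, loosely inspired by the
# mitigations above. All command names here are invented.
VETTED_COMMANDS = {"show interface", "show route"}   # passed full qualification
BLOCKED_COMMANDS = {"clear bgp all"}                 # known to be highly impactful

def may_execute(command: str) -> bool:
    """Deny blocked commands outright; otherwise allow only vetted ones."""
    if command in BLOCKED_COMMANDS:
        return False
    return command in VETTED_COMMANDS

print(may_execute("show route"))        # → True
print(may_execute("clear bgp all"))     # → False (explicitly blocked)
print(may_execute("reset forwarding"))  # → False (never vetted, denied by default)
```

The key design point is deny-by-default: anything that hasn’t been through a qualification process is refused, which is one way to read the “safe change guidelines” mitigation above.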

This is all good, and I really wish that other companies would do the same thing, as you’re more likely to trust a company that is open and transparent. Kudos to you, Microsoft.

Guest Post: 5 Essential Data Privacy Regulations for Businesses to Know in 2023

Posted in Commentary with tags on January 28, 2023 by itnerd


Happy 2023 Data Privacy Week!

Just as everyone started to get more or less cozy with the regulatory landscape in data privacy and protection, and individuals and businesses learned to navigate the shallow waters of data subject requests, risk management, and impact assessments – BOOM – another tidal wave of regulatory requirements and new challenges rushed in!

2023 is the perfect moment to start internalizing new acronyms (get ready for #NIS2, #DORA, #DPDPB, #CPRA, #CCPA, #CPA, #CDPA, #UCPA, #VCDPA, #ADPPA, #PrivacyPenaltyBill) and legislative acts they stand for.

The underlying motive of the upcoming changes is to boost and enhance the cybersecurity postures of various organizations and manage evolving cyber risks more effectively.

Here is a helicopter view of selected legal developments around the world:

  • EU – Directive (EU) 2022/2555 on measures for a high common level of cybersecurity across the Union (NIS2)
  • EU – Regulation on digital operational resilience for the financial sector (DORA)
  • US – State & Federal privacy laws
  • India – Digital Personal Data Protection Bill (DPDPB)
  • Australia – Privacy Penalty Bill & overhaul of the Privacy Act 1988


According to ENISA, general spending on cybersecurity by organizations in the EU is 41 per cent lower than by their US counterparts. With the arrival of NIS2, this ratio is expected to shift, at least partially closing this enormous gap. Conservative estimates are that NIS2’s entry into force will translate into a ~22 per cent increase in ICT spending over a 3–4-year period.

NIS2 was published just before year-end, and EU Member States now have 21 months to transpose the requirements and mechanisms it describes into national laws. The 2016 NIS Directive – despite its shortcomings – served as a cornerstone for increasing Member States’ cybersecurity capabilities. Now, NIS2 will expand the scope and the list of impacted organizations. It is expected that as many as 160,000 organizations will be subject to this new legislation, including digital service providers (platforms and data centre services), electronic communications networks and service providers, manufacturing, food, and the public sector.

NIS2 aims to strengthen cybersecurity postures by, among other things: improving cybersecurity governance, addressing the security of supply chains, streamlining reporting obligations (early warnings and shortened notification periods), and introducing more stringent supervisory measures and stricter enforcement requirements.

What can you do right now?

  • First, try to understand which obligations will apply to your organization and which compliance bucket your organization will fall into: “Essential Entity,” “Important Entity,” or maybe “other.”
  • Next, see if you can create synergies and leverage existing technical and organizational measures implemented during preceding compliance efforts (e.g., GDPR, NIS1, etc.)
  • Start looking for the right partners that can adequately support your compliance efforts. Engage your vendors in discussing the approach that best fits your organization.
  • Last but not least, initiate planning for increased spending to address any remaining gaps. Non-compliance could result in administrative fines of up to 10 million euros or up to 2 per cent of the organization’s total annual worldwide turnover.


DORA aims to achieve “a high common level of digital operational resilience,” mitigating cyber threats and ensuring resilient operations across the EU financial sector. It will become directly applicable from January 17th, 2025. It will impact the financial sector (banks, insurance companies, investment firms) and its ICT providers (i.e., cloud platforms) – roughly around 22,000 organizations.

New requirements imposed by DORA will effectively boil down to reviewing and updating risk management practices. Financial sector customers will need to transfer as many regulatory risks as possible to ICT providers or apply different risk-mitigating strategies. In any case, ICT providers will need to be able to assure adherence to DORA’s requirements. The whole industry will also need to reassess contractual relations with vendors. DORA will incorporate requirements for contracts between financial companies and their critical ICT providers, including the location where data is processed, service level agreement descriptions, reporting requirements, rights of access, and circumstances that would lead to terminating the contract.

In a separate post – Commvault’s Product Team will perform a more technical deep-dive into DORA’s requirements related to detection (art. 10), response and recovery (art. 11), and backup (art. 12).

US data privacy laws – CPRA/CCPA, CPA, CDPA, UCPA, VCDPA, ADPPA

As of January 1st, 2023, the California Privacy Rights Act (CPRA) amendments to the California Consumer Privacy Act of 2018 went into effect. Many temporary exemptions have now expired, imposing additional obligations on companies dealing with California residents’ personal information, e.g., regarding employment-related personal data and opt-outs from the sale of personal information.

2023 is also the year when the Colorado Privacy Act (CPA), the Connecticut Data Privacy Act (CDPA), the Utah Consumer Privacy Act (UCPA), and the Virginia Consumer Data Protection Act (VCDPA) will become effective. The risk of legislative fragmentation is imminent and substantial, and this is the kind of risk that caused the European Union to harmonize its regulatory approach. Let us see whether the same will be true in 2023 in the case of the American Data Privacy and Protection Act (ADPPA), a proposal for a federal, general data privacy law.

India – DPDPB

Indian legislators plan to introduce a very ambitious Digital Personal Data Protection Bill (DPDPB) this year. When enacted, this long-awaited legislation will undoubtedly impact all kinds of organizations, given India’s role as a tech powerhouse and a global outsourcing hub.

Australia – Privacy Penalty Bill & overhaul of the Privacy Act

Australian authorities announced yet another complete overhaul of the Privacy Act 1988. The current legislation was summarized as “out of date and not fit for purpose in the digital age.”

In the meantime, still in 2022, Australia passed the Privacy Penalty Bill that increased privacy-related sanctions to levels comparable with trends introduced by GDPR (up to 50m AUD) and expanded regulatory powers of the Office of the Australian Information Commissioner (OAIC) and the Australian Communications and Media Authority (ACMA).


The relentless compliance clock just started ticking again. Cross-functional teams consisting of IT, compliance, privacy, legal professionals, and business analysts will spend considerable amounts of time analyzing the impact of the cloudburst of legislative developments that emerged at the end of last year and will materialize throughout 2023.

Be aware that the legislative developments presented here are not exhaustive. You can be sure, however, that they will become standard talking points not only in 2023 but for years to come.

Today Is Data Privacy Day

Posted in Commentary with tags on January 28, 2023 by itnerd

Data Privacy Day, also known in Europe as Data Protection Day, is globally recognized each year on January 28th. Some have now even extended this to a weeklong celebration. The event’s purpose is to raise awareness and promote privacy and data protection best practices. 

Executives from Datadobi, DH2i, Folio Photonics, Nexsan, Nyriad, Hammerspace, Fortra and Retrospect had this to say about this very timely and important topic: 

Carl D’Halluin, CTO, Datadobi: 

“A staggering amount of unstructured data has been and continues to be created. In response, a variety of innovative new tools and techniques have been developed so that IT professionals can better get their arms around it. Savvy IT professionals know that effective and efficient management of unstructured data is critical in order to maximize revenue potential, control costs, and minimize risk across today’s heterogeneous, hybrid-cloud environments. However, savvy IT professionals also know this can be easier said than done without the right unstructured data management solution(s) in place. And on Data Privacy Day, we are reminded that data privacy is among the many business-critical objectives faced by those trying to rein in their unstructured data. 

The ideal unstructured data management platform is one that enables companies to assess, organize, and act on their data, regardless of the platform or cloud environment in which it is being stored. From the second it is installed, users should be able to garner insights into their unstructured data. From there, users should be able to quickly and easily organize the data in a way that makes sense and to enable them to achieve their highest priorities, whether it is controlling costs, CO2, or risk – or ensuring end-to-end data privacy.”

​​Don Boxley, CEO and Co-Founder, DH2i:

“The perpetual concern around data privacy and protection has led to an abundance of new and increasingly stringent regulations around the world. According to the United Nations Conference on Trade and Development (UNCTAD), 71% of countries now have data protection and privacy legislation, with another 9% having draft legislation. 

This increased scrutiny makes perfect sense. Data is being created and flowing not just from our business endeavors, but countless personal interactions we make every day – whether we are hosting an online conference, making an online purchase, or using a third party for ride-hailing, food delivery, or package transport. 

Today, as organizations endeavor to protect data – their own as well as their customers’ – many still face the hurdle of trying to do so with outdated technology that was simply not designed for the way we work and live today. Most notably, many organizations are relying on virtual private networks (VPNs) for network access and security. Unfortunately, both external and internal bad actors are now exploiting VPN’s inherent vulnerabilities. However, there is light at the end of the tunnel. Forward looking IT organizations have discovered the answer to the VPN dilemma. It is an innovative and highly reliable approach to networking connectivity – the Software Defined Perimeter (SDP). This approach enables organizations to build a secure software-defined perimeter and use Zero Trust Network Access (ZTNA) tunnels to seamlessly connect all applications, servers, IoT devices, and users behind any symmetric network address translation (NAT) to any full cone NAT: without having to reconfigure networks or set up complicated and problematic VPNs. With SDP, organizations can ensure safe, fast and easy network and data access; while ensuring they adhere to internal governance and external regulations compliance mandates.”

Steve Santamaria, CEO, Folio Photonics: 

“It is no secret that data is at the center of everything you do. Whether you are a business, a nonprofit, an educational institution, a government agency, or the military, it is vital to your everyday operations. It is therefore critical that the appropriate person(s) in your organization have access to the data they need anytime, anywhere, and under any conditions. However, it is of equal importance that you keep it from falling into the wrong hands. 

Therefore, when managing current and archival data, a top concern must be data security and durability, not just today but for decades upon decades into the future. The ideal data storage solution must offer encryption and WORM (write-once, read-many) capabilities. It must require little power and minimal climate control. It should be impervious to EMPs, salt water, high temps, and altitudes. And, all archive solutions must have 100+ years of media life and be infinitely backward compatible, while still delivering a competitive TCO. But most importantly, the data storage must have the ability to be air-gapped as this is truly the only way to prevent unauthorized digital access.”

Surya Varanasi, CTO, Nexsan: 

“Digital technology has revolutionized virtually every aspect of our lives. Work, education, shopping, entertainment, and travel are just a handful of the areas that have been transformed. Consequently, today, our data is like gravity – it’s everywhere. 

On Data Privacy Day, we are reminded of this fact, and the need to ensure our data’s safety and security. Fortunately, there are laws and regulations that help to take some of the burden off of our shoulders; such as the General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), and Health Insurance Portability and Accountability Act (HIPAA).

However, some of the responsibility remains on our shoulders, as well as those of the data management professionals we rely upon. Today, it would be extremely challenging to find an organization (or an individual for that matter) that isn’t backing up their data. Unfortunately, however, today that just isn’t enough. Cyber criminals have become increasingly aggressive and sophisticated, along with their ransomware and other malware. And now the threat isn’t just that they will hold your data hostage until payment; cyber criminals are now threatening to make personal and confidential data public if not paid. It is therefore critical that cyber hygiene include protecting backed up data by making it immutable and by eliminating any way that data can be deleted or corrupted. 

This can be accomplished with an advanced Unbreakable Backup solution, which creates an immutable, object-locked format, and then takes it a step further by storing the admin keys in another location entirely for added protection. With an Unbreakable Backup solution that encompasses these capabilities, users can ease their worry about the protection and privacy of their data, and instead focus their expertise on activities that more directly impact the organization’s bottom-line objectives.”

Andrew Russell, Chief Revenue Officer, Nyriad: 

“Data Privacy Day serves as a great reminder of the value and power of data. In addition to your people, data is without question the most strategic asset of virtually any organization. Data and the ability to fully leverage, manage, store, share, and protect it, enables organizations to be successful across virtually every facet – from competitive advantage, to innovation, the employee experience, and customer satisfaction, to legal and regulations compliance competency. 

Consequently, savvy data management professionals recognize that while a storage solution able to deliver unprecedented performance, resiliency, and efficiency with a low total cost of ownership is priority number one for fully optimizing data and intelligence for business success, they likewise need the ability to protect against, detect, and restore data and operations in the event of a successful cyber-attack, in order to protect their data – for business survival.” 

Brian Dunagan, Vice President of Engineering, Retrospect: 

“Every organization, regardless of size, faces the real possibility that they could be the next victim of a cyberattack. That is because today’s ransomware, which is easier than ever for even the novice cybercriminal to obtain via ransomware as a service (RaaS), strikes repeatedly and randomly without even knowing whose system it is attacking. Ransomware now simply searches for that one crack, that one vulnerability, that will allow it entry to your network. Once inside, it can lock down, delete, and/or abscond with your data and demand payment should you wish to keep your data private and/or have it returned. 

As an IT professional, it is therefore critical that, beyond protection, steps be taken to detect ransomware as early as possible to stop the threat and ensure the ability to remediate and recover. A backup solution that includes anomaly detection to identify changes in an environment that warrant the attention of IT is a must. In order to ensure its benefit, users must be able to tailor the backup solution’s anomaly detection to their business’s specific systems and workflows, with capabilities such as customizable filtering and thresholds for each of their backup policies. And those anomalies must be immediately reported to management, as well as aggregated for future ML/analytics purposes.”
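Retrospect doesn’t spell out its detection algorithm here, but threshold-based anomaly detection on a backup metric is straightforward to sketch. The following Python example is a hypothetical illustration only; the metric (changed-file count), the sample numbers, and the default threshold are all invented:

```python
from statistics import mean, stdev

def is_anomalous(history: list[int], current: int, threshold: float = 3.0) -> bool:
    """Flag a backup run whose changed-file count deviates from the
    historical mean by more than `threshold` standard deviations."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# Typical nightly runs change roughly 100 files; a ransomware-style
# mass rewrite touching 5,000 files stands out immediately.
normal_runs = [95, 102, 98, 110, 101, 97]
print(is_anomalous(normal_runs, 5000))  # → True
print(is_anomalous(normal_runs, 104))   # → False
```

The per-policy customization the quote calls for would amount to choosing the metric, history window, and `threshold` separately for each backup policy, since a file server and a database have very different normal churn.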

Molly Presley, SVP of Marketing at Hammerspace:  

“With global rules governing how data should be stored, used, and shared, combined with escalating data losses, explosive personal data growth, and customer expectations, addressing data privacy is now an obligatory business requirement. However, as organizations expand and navigate compliance and legal requirements in the rapidly evolving age of big data, AI/ML, and government regulations, the existing processes surrounding data privacy need to evolve to 1) automate processes and 2) scale to meet increasingly complex new challenges.   

Privacy and security concerns increasingly impact multiple vertical markets, including finance, government, healthcare and life sciences, telecommunications, IT, online retail, and others, as they quickly outgrow legacy data storage architectures. As a result, there is increasing pressure to develop and implement a data strategy and architecture for decentralized data that is more cohesive, making access to critical information simplified and secure.

To protect the organizations’ and individual users’ sensitive data, organizations must take the steps necessary to control how data is shared and eliminate the proliferation of data copies outside the controls of IT security systems. Accelerating IT modernization efforts while managing the ever-increasing volumes of data requires a data solution that simplifies, automates, and secures access to global data. Most importantly, to ensure data privacy and secure data collaboration, a data solution must be able to put data to use across multiple locations and to multiple users while simplifying IT Operations by automating data protection and data management to meet policies set by administrators.”

Jason Lohrey, CEO of Arcitecta:   

“In this information age, data is the critical element of transformation, serving as a foundation for strategic decision-making. Data Privacy Day reminds us that data influences everything we do, from building services, products, customer experiences, and employee relationships. With the acceleration of technology, we are more connected than ever before and using data to facilitate high-value achievements for businesses and consumers.  

But with new threats, it is now more imperative than ever to protect data from those who seek to gain an advantage by exploiting others. It is becoming increasingly easier to infiltrate systems around the world. Organizations need to increase the resilience of their data so that it remains continuously available, and IT leaders must shift their focus from successful backups to successful recoveries to ensure that valuable data doesn’t become compromised by landing in the wrong hands.”  

Nick Hogg, Director of Technical Training at Fortra:

“With the rise of remote working, sharing sensitive files is now taken for granted. Therefore, awareness days and weeks, like Data Privacy Week, are a great way to remind organizations and their stakeholders of the importance of storing and handling data properly.

It’s essential for organizations to re-evaluate their security awareness and compliance training programs to move away from the traditional once-a-year, ‘box-ticking’ exercises that have proven to be less effective. The goal is to deliver ongoing training that keeps data security and compliance concerns front and center in employees’ minds, allowing them to better identify phishing and ransomware risks, as well as reducing user error when handling sensitive data.

They will also need to use digital transformation and ongoing cloud migration initiatives to re-evaluate their existing data loss prevention and compliance policies. The goal is to ensure stronger protection of their sensitive data and meet compliance requirements, while replacing complex infrastructure and policies to reduce the management overhead and interruptions to legitimate business processes.”

Wade Barisoff, Director of Product, Data Protection at Fortra (on the recent introduction of new privacy laws in the states of California and Virginia):

“As new states contemplate their own flavors of data privacy legislation, the only consistency will be the fact that each new law is different. We are already seeing this now; for example, in California, residents can sue companies for data violations, whereas in other states it’s their attorney general’s offices that can impose the fines. In Utah, the standards apply to fewer businesses compared to other states. As each state seeks to highlight how much they value their citizens’ rights over the next, we’ll see an element of, for example, ‘What’s good for California isn’t good enough for Kansas’ creep in, and this developing complexity will have a significant impact on organizations operating across the country.

Before GDPR there were (and still are) many different country laws for data privacy. GDPR was significant not merely because it was a unifying act that enshrined the rights of people and their digital identities to govern how their data could be handled, but because it was the first legislation with real teeth. Fines for non-compliance were enough to force companies into action.

So far, five states have (or will have) individual laws, but there are 45 more yet to come. The amount of money and time companies will spend enacting the proper controls for these individual privacy laws fuels the argument for a more unified national approach to data privacy standards, as the penalties for non-compliance are significant. Also, as states begin to increase the demands on business, usually without fully understanding the technology landscape and how businesses work with shared and cloud-based technologies, there’s a potential that companies will be forced to make the decision not to conduct business in certain areas. A national approach would allow businesses to tackle data privacy once, but as it stands, with the federated states model, doing business within the U.S. is likely to get more complicated and expensive.”

FBI Pwns Ransomware Gang… Yes You Read That Right

Posted in Commentary with tags on January 27, 2023 by itnerd

The FBI revealed yesterday that it had shut down the prolific ransomware gang called Hive. To do this, they hacked the hackers, which I have to admit is a novel approach:

At a news conference, U.S. Attorney General Merrick Garland, FBI Director Christopher Wray, and Deputy U.S. Attorney General Lisa Monaco said government hackers broke into Hive’s network and put the gang under surveillance, surreptitiously stealing the digital keys the group used to unlock victim organizations’ data.

They were then able to alert victims in advance so they could take steps to protect their systems before Hive demanded the payments.

“Using lawful means, we hacked the hackers,” Monaco told reporters. “We turned the tables on Hive.”

News of the takedown first leaked on Thursday morning when Hive’s website was replaced with a flashing message that said: “The Federal Bureau of Investigation seized this site as part of coordinated law enforcement action taken against Hive Ransomware.”

That is impressive. But I should point something out: there were no arrests. So the gang is still out there, and perhaps they are rebuilding to launch new attacks. Or they could be scared off and never surface again. We’ll have to see.

UPDATE: Brian Johnson, Chief Security Officer of Armorblox had this to say:

This action from the US agencies is definitely a step in the right direction. Specifically looking at attack vectors like ransomware and credential phishing across our 58,000+ tenants, we see a concentration into a few different threat actors at the top – including Hive – so taking them out will have a large impact on the number of attacks that organizations would see. 

At the same time, precisely because of regulatory and law enforcement actions, we are seeing threat actors moving away from ransomware and crypto based attacks to easier attack methods to compromise organizations and steal money or credentials. In the past two years, the two most common cyber insurance claims have been business email compromise and vendor fraud, not ransomware. The arrival of chatGPT is showing attackers the art of the possible when it comes to using language models to create more realistic and successful phishing and business compromise attacks, and in response organizations will need to do the same to defend themselves against the next wave of attacks.

How To Make Your Apple Watch Ultra More “Ultra” (Also Applies To Other Apple Watch Models)

Posted in Products with tags on January 27, 2023 by itnerd

When the Apple Watch Ultra first popped up, people naturally compared it to Garmin sports watches and pointed out the shortcomings that the Apple Watch Ultra had. These shortcomings were:

  • No offline maps for the Apple Watch Workout app
  • No recovery and training advice

Now to be fair, these are shortcomings that the Apple Watch has always had. But they were magnified by how the Apple Watch Ultra was marketed: directly against established sports watches that have these features baked into their offerings. The good news is that you can easily add these features to any Apple Watch, not just the Apple Watch Ultra, to make it more “Ultra”. Let’s start with recovery advice, as that is important to the Apple Watch Ultra’s target market.

For recovery and training advice, I have been using an app called Athlytic for the last couple of years on both my iPhone and Apple Watch. I’m going to use the developer’s own description, as it does a great job of encapsulating what the app offers:

Athlytic is an app that works with both the iPhone and the Apple Watch to leverage the data in Apple Health, giving you daily, personalized insights into and coaching about your health and daily training.

More specifically, Athlytic uses the health data, collected by your Apple Watch, to help you gauge three things for the current day: how ready your body is to perform, how much cardiovascular exertion you should aim to put on your body, and how much cumulative cardiovascular exertion you’ve put on your body.

Athlytic generates three primary metrics: a Recovery score, a Target Exertion Zone, and an Exertion score.

So in short, it helps me figure out how hard I can, or more accurately should, push myself when I work out, or whether I should take it easy. In case you are wondering, I do a workout every day: either on the bike outdoors, on the bike indoors via the Zwift platform, or cross country skiing in the winter, plus other stuff like hiking and walking.

Let me walk you through how I use it.

When I wake up in the morning and open Athlytic, the recovery screen is the first place I go. I’ve been doing some hard workouts lately on Zwift, and it shows that over the last week my body really isn’t recovering from the efforts I have put in. So based on this, I should be doing less intense workouts to let my body fully recover. Athlytic goes deep into the weeds to help you understand how these numbers are calculated, which you can read here. But the main metric that feeds into this recovery score is HRV, or heart rate variability.
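Athlytic doesn’t publish its exact formula, so the sketch below is purely illustrative: it scores today’s HRV against a rolling week of morning readings, which is the general shape of how HRV-driven recovery scores tend to work. The function name, the z-score mapping, and the 0–100 scale are all my own assumptions, not Athlytic’s.

```python
# Illustrative only: compare today's HRV (rMSSD, in ms) against a
# rolling baseline and map the deviation onto a 0-100 recovery score.
from statistics import mean, stdev

def recovery_score(todays_hrv_ms: float, recent_hrv_ms: list[float]) -> int:
    """Score today's HRV against a rolling baseline (higher = more recovered)."""
    baseline = mean(recent_hrv_ms)
    spread = stdev(recent_hrv_ms) or 1.0      # avoid divide-by-zero on flat data
    z = (todays_hrv_ms - baseline) / spread   # deviation in standard units
    score = 50 + z * 16.7                     # map roughly [-3, +3] onto 0..100
    return max(0, min(100, round(score)))

week = [62, 58, 65, 60, 57, 63, 61]  # last 7 mornings of rMSSD in ms
print(recovery_score(48, week))      # well below baseline -> low score
```

The point of the sketch is the shape of the logic: a reading well below your own recent baseline signals poor recovery, regardless of what the absolute number is.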

The next screen I go to is the sleep screen, to see how well (or not so well) I slept the previous night. On this night I had decent sleep, as it was north of 7.5 hours.

I also pay attention to my sleep debt, which illustrates whether I am consistently getting 7.5 or more hours of sleep, which in turn pays off in better recovery scores. And I pay attention to my sleep time consistency, which illustrates whether my bedtime is the same every night, something that helps me get a better night’s sleep. Both of these are in a very good place at the moment.
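The “sleep debt” idea is simple enough to sketch: shortfall against a nightly target, accumulated over recent nights. Athlytic’s exact accounting is an assumption here; the 7.5-hour target is the figure mentioned above.

```python
# Hypothetical sketch of sleep debt: hours short of a nightly target,
# summed over recent nights (surplus nights don't cancel out debt here).
TARGET_HOURS = 7.5

def sleep_debt(nights: list[float]) -> float:
    """Total hours of shortfall vs. the nightly target."""
    return round(sum(max(0.0, TARGET_HOURS - h) for h in nights), 1)

week = [7.6, 6.9, 7.5, 8.0, 6.5, 7.2, 7.8]
print(sleep_debt(week))  # 0.6 + 1.0 + 0.3 = 1.9 hours short
```

Whether surplus sleep should offset debt is a design choice; this version assumes it doesn’t, which matches how most recovery apps treat chronic shortfall.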

I also tag what happened the day before. For example, the day before this I had two cups of coffee in the morning. I do this because Athlytic can trend recovery relative to these tags, so I can see what positively or negatively affects my recovery.

Athlytic measures a number of metrics via your Apple Watch and presents them in a summary page. If anything is out of line, you’ll get an alert, as having any of these out of line may be an indication of fatigue or sickness.

The final screen I look at is the trends screen, which shows my exertion (how hard I worked out) in blue and my recovery in grey. It illustrates that earlier this week I was working out way harder than I should have, as the blue line was way above the grey line, and I paid for it later in the week. I got that under control, but by then my body was clearly fatigued. So something I need to focus on is bringing those lines closer together, as I will get more fitness gains by not overtraining.

So with that out of the way: my wife and I planned to go cross country skiing, and given this recovery level, we planned to do two laps of a loop that was just over 5K. I would do the second lap by myself at my typical pace, which is way faster than my wife’s. That’s where another app called WorkOutDoors comes in:

The Apple Watch Workout app is really inadequate. It doesn’t have anywhere near the level of customization that a dedicated sports watch, such as a Garmin, has. It also doesn’t support the pairing of sensors like bike power meters, for example. And more importantly, it doesn’t support offline maps, which a lot of endurance athletes rely upon.

WorkOutDoors solves all of that and really leverages the big screen of the Apple Watch Ultra, as seen here. I can use the iPhone app to create custom screens like this one to display the information that I need to see, as well as download routes in .gpx format so I can follow a route, including in situations where I have no cellular service, which is something the built-in Workout app cannot do. I will admit that when you first try to customize your screens, the app can seem intimidating. But I encourage you to experiment with different views and try them out, as it really isn’t that hard once you dig in. One big plus of WorkOutDoors is that it can upload directly to the sports social networking site Strava, because if your workout isn’t posted to Strava, it didn’t happen as far as your friends are concerned.

WorkOutDoors has mostly replaced the Workout app on my Apple Watch Ultra as it is simply far more usable and functional, with one exception: it really needs to leverage the always-on display, as it doesn’t offer “live” views when the screen is dimmed. If they fixed that, this app would be perfect.
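As a side note, .gpx is just XML, which is why routes are so easy to pass between tools. A minimal sketch of reading the track points out of one looks like this; the sample data and function name are illustrative, not tied to WorkOutDoors itself.

```python
# Sketch: extract (lat, lon) pairs from a GPX 1.1 document.
import xml.etree.ElementTree as ET

GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def route_points(gpx_xml: str) -> list[tuple[float, float]]:
    """Return the (lat, lon) of every track point in a GPX document."""
    root = ET.fromstring(gpx_xml)
    return [
        (float(pt.get("lat")), float(pt.get("lon")))
        for pt in root.findall(".//gpx:trkpt", GPX_NS)
    ]

sample = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="43.6532" lon="-79.3832"/>
    <trkpt lat="43.6540" lon="-79.3810"/>
  </trkseg></trk>
</gpx>"""

print(route_points(sample))  # [(43.6532, -79.3832), (43.654, -79.381)]
```

Because the format is this simple, any route you build or export elsewhere can be loaded onto the watch for offline navigation.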

So, after my wife and I did our laps of the route that we planned, I can go back to Athlytic and see how hard I worked.

This was the second lap of the just-over-5K route I spoke of earlier, where I was pushing myself a bit harder. Athlytic can display the heart rate data from my cross country ski run and then go into the weeds about what it means.

In this case, it showed me that I was working hard, but not insanely hard. Most of my heart rate was in zone four, which is good for building my VO2 max capacity. I also note that this workout was scored as a 4.01 in terms of effort. Combined with my first lap, which was much easier, I got an exertion score of 5.9, well within the range of 4.5 to 6.5 that I was aiming for. It also shows how intense the workout was, because you can do a workout and think “wow, that was hard” when it actually wasn’t. In this case, it validates that I was working hard without going over the top.
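For readers unfamiliar with heart rate zones: a common five-zone model is based on percentage of your maximum heart rate, and can be sketched as below. The zone boundaries and the age-based max HR estimate are generic assumptions, not Athlytic’s exact model.

```python
# A common five-zone model: zones 1-5 as fractions of maximum heart rate.
def hr_zone(bpm: int, max_hr: int) -> int:
    """Return training zone 1-5 for a heart rate, given an estimated max HR."""
    pct = bpm / max_hr
    bounds = [0.60, 0.70, 0.80, 0.90]  # upper limits for zones 1-4
    for zone, upper in enumerate(bounds, start=1):
        if pct < upper:
            return zone
    return 5  # anything at or above 90% of max

max_hr = 220 - 45            # classic age-based estimate for a 45-year-old
print(hr_zone(150, max_hr))  # 150/175 is about 86% of max -> zone 4
```

Zone four (roughly 80–90% of max) is the threshold range the post refers to, which is why time spent there builds VO2 max capacity.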

In fact, Athlytic says so as 76% of this lap was anaerobic. This is the sort of workout that will help my cycling when the road season starts up again.

For me, the combination of these apps lets me really focus on how I train and how I recover, as well as see the real-time metrics that I need to work out effectively. Both have resulted in a significant gain in fitness for yours truly. Thus I consider these apps to be a must if you’re serious about using the Apple Watch to up your fitness game. Athlytic is a subscription app which costs $30.99 CDN a year. WorkOutDoors is a one-time purchase of $8.49 CDN. Both support Family Sharing, so up to six people in your family who want to up their fitness game can do so for one fee. If you want to make your Apple Watch more “Ultra”, regardless of whether you have an Apple Watch Ultra or another model, and make up for the lack of this functionality from Apple, you should have a serious look at both of these apps.

Guest Post: Efficiency and visibility with the benefit of SAP monitoring

Posted in Commentary on January 27, 2023 by itnerd

By Gregg Ostrowski, Executive CTO, Cisco AppDynamics

Today, SAP is a vital part of business operations, giving enterprise companies the ability to deliver goods and services to customers around the world. From front end to back end, many businesses depend on SAP to run their most critical operations.

As IT environments become more complex and dynamic, IT teams are finding it increasingly challenging to manage the availability and performance of both SAP and non-SAP applications. Partial visibility into SAP environments and their dependencies on third-party applications can become an enormous obstacle to effective problem resolution, inflating mean time to resolution (MTTR) and leading to repeated outages and lost revenue.

Organizations must therefore reevaluate how they approach monitoring their SAP environments, strengthening their monitoring strategies to optimize application availability and performance, and observing the status of key business transactions live and in real time.

SAP monitoring challenges

The majority of organizations still deploy a multitude of tools to monitor dependent systems, or they monitor SAP with a siloed tool that is completely independent of the rest of their IT stack. A fragmented approach like this means they cannot see the full end-to-end flow or correlate business performance with their SAP landscape.

This forces many companies to keep manually correlating SAP performance data with business events on an ad hoc basis, or to do so only after business problems occur. They waste a lot of time troubleshooting by manually reviewing records, which adds to MTTR, and as dynamic environments generate ever more data, this approach will not scale.

In fact, even when companies can measure and monitor performance this way, it only allows them to react to problems. They cannot prioritize the most important business operations because they cannot establish which issues most directly affect customers and the business. IT teams are left simply putting out fires, wasting valuable time and resources that could be focused on strategic priorities.

The importance of visibility in the IT environment

Without question, businesses need a single source of truth about their SAP environments and how they are driving company-wide performance.

This means ensuring they have deep, end-to-end visibility for a comprehensive view of their entire IT landscape. With this information, technologists are able to see and understand upstream service dependencies – as well as user experience – within SAP.

IT teams need a solution that can understand ABAP (Advanced Business Application Programming) code issues at a microscopic level, so that developers can easily pinpoint the root cause of application performance issues. This level of visibility creates more stability within the application environment, enhancing technologists’ ability to deliver reliably on IT and business outcomes and on customer expectations.

It is important for organizations to move on from piecemeal, heavily manual methods of monitoring SAP and non-SAP apps. Trying to recreate issues is not always possible, and this approach is too time-consuming and increases the risk of ongoing performance problems that affect end users and hurt the bottom line.

Technologists can make use of dynamic baselining to avoid having to manually update static thresholds as priorities change and environments grow. To avoid endless alert storms, enterprises can leverage artificial intelligence (AI) and machine learning (ML) to proactively assess the status of transactions and address issues as they arise.
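The dynamic-baseline idea can be sketched simply: instead of a fixed threshold, flag a metric when it strays too far from a rolling window of its own recent history. Real APM products (AppDynamics included) use far more sophisticated models; this toy version only illustrates why the baseline adapts as environments change.

```python
# Toy dynamic baseline: flag a value when it deviates more than k standard
# deviations from the rolling mean of recent samples.
from collections import deque
from statistics import mean, stdev

class DynamicBaseline:
    def __init__(self, window: int = 20, k: float = 3.0):
        self.samples = deque(maxlen=window)  # rolling window of recent values
        self.k = k

    def is_anomaly(self, value: float) -> bool:
        """True if value deviates more than k sigma from the rolling baseline."""
        anomalous = False
        if len(self.samples) >= 5:  # need some history before judging
            mu, sigma = mean(self.samples), stdev(self.samples)
            anomalous = sigma > 0 and abs(value - mu) > self.k * sigma
        self.samples.append(value)  # baseline adapts as traffic patterns shift
        return anomalous

baseline = DynamicBaseline()
for t in [100, 102, 98, 101, 99, 103, 500]:  # response times in ms
    print(baseline.is_anomaly(t))            # only the 500 ms spike is flagged
```

Because the window slides forward, a gradual rise in load (say, holiday traffic) raises the baseline with it, whereas a static threshold would fire continuously.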

Through this proactive functionality, organizations can adjust their resource investment based on scenarios specific to their business and potentially affecting performance, such as high traffic volumes due to holiday shopping or other seasonal events, month-end closing, and product launches, among other business activities.

This is especially relevant when organizations take advantage of cloud-native observability solutions, which provide real-time performance metrics, helping technologists resolve issues as they identify bottlenecks and freeing them to focus on the innovation that can drive organizational growth.

Report: Commvault Leads the Industry in Kubernetes Data Protection for Third Consecutive Year

Posted in Commentary with tags on January 27, 2023 by itnerd

Commvault, a global enterprise leader in intelligent data services across on-premises, cloud, and SaaS environments, today announced that leading industry research firm GigaOm has named Commvault a “Leader” and “Outperformer” in the new GigaOm Radar for Kubernetes Data Protection for the third year running.

Commvault was evaluated along with 14 other vendors based on execution, roadmap and ability to innovate. According to GigaOm, Commvault is “doing very well by combining solutions for SaaS applications, on-premises (VM-based) infrastructure, containers, and databases efficiently.” Commvault provides Kubernetes data protection through its Commvault Complete™ Data Protection software and Metallic Data Management as a Service (DMaaS) solutions, giving customers the flexibility to choose their preferred storage vendor through their extensive ecosystem.

Kubernetes and containers have not (yet) replaced all cloud and traditional applications – they have integrated into the application landscape and need to be protected accordingly. Over the last year, Commvault has significantly advanced its protection for Kubernetes workloads by integrating fully automated management, replication, migration, and security enhancements across its portfolio of Intelligent Data Services. According to the GigaOm Radar for Kubernetes Data Protection, Commvault provides effective protection for “hybrid applications that run across Kubernetes, VMs, and cloud services, consolidating backup operations on a single platform.”

To learn more about how Commvault and our Metallic SaaS portfolio ranked in the GigaOm Radar for Kubernetes Data Protection, view the report here.

The TELUS Data Trust Survey Reveals Some Interesting Facts On How Canadians View Data Privacy

Posted in Commentary with tags on January 26, 2023 by itnerd

It’s Data Privacy Week – an issue that is top of mind as the federal government prepares to debate Bill C-27, which proposes to strengthen Canada’s private sector privacy law while ensuring the privacy of Canadians is protected and that innovative businesses can benefit from clear rules as technology continues to evolve.

In a recent poll commissioned by TELUS, only 30 per cent of Canadians said they trust organizations to protect their personal information, while more than half of Canadians feel confident data and technology can be used to positively impact healthcare, education, safety and security. Here are some other key points from the TELUS Data Trust Survey:

About the survey:

  • The purpose of TELUS’ Data Trust Survey was to explore Canadians’ feelings around data privacy and trust to drive conversations in light of rapid digital transformation of our economy and lives.
  • Trust, as it pertains to data, is essential to technological and social innovation. With data breaches on the rise, it’s more important than ever for organizations to protect the data entrusted to them and to be transparent about how data is used and protected to maintain and build trust.


  • Audience: Responses were nationally representative, from 2,016 Canadian adults 18 years of age or older
  • Timeframe: August 31, 2022 to September 19, 2022
  • Location: Canada
  • Language: English, French
  • Sample provider: Dynata, Inc.

Key Survey Findings

Canadians care about their data privacy.

  • 97% of Canadians feel they understand what personal information is.
  • More than eight-in-ten claim to understand how their personal information can be used online.
  • Three-quarters of Canadians take regular steps to protect their personal information.
  • More than half are extremely concerned about personal information and online privacy.

Half have experienced a breach, impacting their trust.

  • In the last two years, almost half of Canadians experienced some sort of breach.
  • Since experiencing a breach, two-thirds of Canadians have become more guarded with their personal information.
  • Since experiencing a breach, 30% of Canadians have lost trust in the company associated with the breach.

Canadians believe in the power of technology for good.

  • Half of Canadians are extremely excited about technology improvements.
  • More than half of Canadians feel confident data and technology can be used to positively impact healthcare, education, safety and security.
  • 42% believe that sharing data with trustworthy companies can provide them with useful products and services

Trust matters to Canadians when choosing companies to engage with.

  • 79% of Canadians agree that a company’s reputation for how it treats personal information and privacy changes the way they think about the company or brand.
  • 62% are more likely to buy products or use services from companies they trust.
  • Only 31% trust organizations to protect their personal information.
  • Nearly one-quarter of Canadians say they don’t trust telecoms, and 14% say they don’t trust virtual care providers.