$3.72B USD in Cyber Week Sales Expected In Canada: Salesforce

Posted in Commentary with tags on November 24, 2025 by itnerd

Salesforce is predicting a strong Cyber Week (Thursday, Nov. 27 through Monday, Dec. 1), with digital traffic and sales running higher over the last seven weeks compared to 2024. This is based on data from over 1.5 billion global shoppers across 1.5 trillion page views, including Canada.

Consumers’ feelings towards AI-powered shopping are quickly changing (48% of AI users would trust an agent to make a purchase on their behalf), putting AI agents center stage this shopping season and driving an anticipated $73 billion globally during Cyber Week. 

In Canada:

  • Cyber Week digital sales are expected to reach $3.72B USD, with 2% YoY growth.
  • Early-season momentum is strong: from Oct. 1 to Nov. 15, Canadian Gross Merchandise Value (GMV) is up 2% YoY, and digital traffic is up 3% YoY.

You can read the post on this here

Cybersecurity Continues to Strengthen at MicroAge

Posted in Commentary with tags on November 24, 2025 by itnerd

MicroAge is proud to share that it has successfully completed a rigorous security audit known as System and Organization Controls 2 (SOC 2) Type 2 as of November 21, 2025. The examination, conducted by Johanson Group, LLP, found that MicroAge meets high standards for keeping systems and data secure. The audit resulted in a clean report, meaning MicroAge met all the required criteria without any issues.

A SOC 2 audit provides independent, third-party validation that a service organization’s information security practices meet industry standards required by the American Institute of Certified Public Accountants (AICPA). During the audit, a service organization’s non-financial reporting controls related to the security of a system are tested. The SOC 2 report delivered by Johanson Group, LLP, verified the suitability of the design and operating effectiveness of MicroAge controls to meet the standards for these criteria.

Additionally, MicroAge has earned the Cybersecurity Maturity Model Certification (CMMC) Level 1 attestation, which focuses on the protection of Federal Contract Information (FCI) by having organizations implement 15 foundational cybersecurity requirements. This certification is a critical step for companies working with the U.S. Department of Defense and demonstrates MicroAge’s ability to meet essential security requirements for protecting sensitive information.

What This Means to Clients
CMMC Level 1 attestation assures clients that MicroAge adheres to robust cybersecurity practices designed to protect sensitive federal information. By meeting these requirements, MicroAge provides an added layer of trust and compliance that allows clients to confidently engage in projects that demand strong security standards.

MicroAge intends to continue executing and improving its internal controls and providing consistent peace of mind to clients through annual SOC 2 reporting and ongoing compliance with CMMC requirements. Above all, the safety of client and company information remains a top priority.

Shai-Hulud malware infects 500 npm packages, leaks secrets on GitHub

Posted in Commentary with tags on November 24, 2025 by itnerd

New research shows that hundreds of trojanized versions of well-known packages have been planted in the npm registry in a new Shai-Hulud supply-chain campaign. Which is, of course, really, really bad.

Ensar Seker, CISO at SOCRadar, has provided the following comment: 

“This campaign marks a dramatic escalation in software supply‑chain threats. Unlike earlier attacks that compromised only a handful of packages or relied on drop‑in malicious dependencies, Shai‑Hulud is a self‑propagating worm that abuses developer workflows, steals developer and CI/CD credentials, publishes them to public GitHub repositories, and then uses those credentials to infect additional packages. 

What makes it especially pervasive is that it targets npm packages with multi‑million download counts; packages such as @ctrl/tinycolor, Zapier, ENS Domains, PostHog and Postman have been impacted. Through the injection of malicious scripts (often via lifecycle hooks like postinstall/preinstall) and hidden GitHub Actions workflows, the attacker turns every infected developer workstation and CI runner into a distribution node. 

To defend against this kind of attack, dev and security teams must treat npm package management and CI/CD pipelines as part of the threat surface. This means enforcing strict token/scoped access policies, limiting or auditing lifecycle scripts (especially preinstall/postinstall hooks), monitoring secrets in build environments and using behavioral analytics to detect unusual GitHub Actions workflows or outbound connections from build hosts. Given the worm‑like nature of Shai‑Hulud, time is of the essence: any delay in rotating tokens or cleaning compromised build agents can lead to rapid spread.

In short, Shai‑Hulud isn’t a typical “package compromise”; it’s a worm embedded into the dev supply chain. It signals that attackers are shifting from targeting compiled binaries and runtime environments toward the very processes developers use to build and ship software. No organization should assume “we don’t use npm, so we’re safe”, because even downstream dependencies or dev toolchains can become the launch pad.”
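The lifecycle-script auditing Seker recommends can be sketched in a few lines. The following is a minimal illustration, not an official tool; the function name and the assumption of a standard node_modules layout are mine:

```python
import json
from pathlib import Path

# Install-time lifecycle hooks that run arbitrary code during "npm install",
# the mechanism Shai-Hulud abuses to spread.
RISKY_HOOKS = ("preinstall", "install", "postinstall")

def find_lifecycle_hooks(node_modules: str) -> list:
    """Walk a node_modules tree and flag packages declaring install hooks."""
    findings = []
    for manifest in Path(node_modules).rglob("package.json"):
        try:
            scripts = json.loads(manifest.read_text()).get("scripts", {})
        except (json.JSONDecodeError, OSError):
            continue  # unreadable or malformed manifest: skip, don't crash
        for hook in RISKY_HOOKS:
            if hook in scripts:
                findings.append((manifest.parent.name, hook))
    return findings
```

A flagged hook isn't proof of compromise (many legitimate packages use postinstall), but it gives a review list; teams can also disable these scripts outright via npm's ignore-scripts setting.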

This illustrates the need for a software bill of materials (SBOM), so that it is clear where software components come from. But beyond that, developers need to know and have full confidence in the components that they use. That way, the chances of this sort of attack succeeding are reduced.
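As a hypothetical illustration of how an SBOM helps here, the sketch below assumes a CycloneDX-style JSON document with a top-level "components" array and checks it against a set of package names reported as compromised. The function name and data shapes are assumptions, not a real tool:

```python
import json

def flag_compromised(sbom_json: str, bad_packages: set) -> list:
    """Return name@version strings for SBOM components on a bad-package list."""
    # CycloneDX-style SBOMs list dependencies under "components".
    components = json.loads(sbom_json).get("components", [])
    return sorted(
        f'{c["name"]}@{c.get("version", "?")}'
        for c in components
        if c.get("name") in bad_packages
    )
```

The point is speed: when a campaign like Shai-Hulud is disclosed, an organization with SBOMs can answer "are we shipping any of these?" in seconds instead of days.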

Kyndryl and Microsoft study reveals that 78% of leading organizations highlight IT as a key enabler of environmental goals

Posted in Commentary with tags on November 24, 2025 by itnerd

Kyndryl, in collaboration with Microsoft, today announced the findings of the third annual Global Sustainability Barometer Study, conducted by Ecosystm. The study reveals that integration-focused organizations – those that align sustainability with business strategy, empower employees and adopt advanced technologies like AI – are driving measurable business value and lasting impact in today’s rapidly changing world.

Integration-focused organizations lead the way globally

The 2025 Global Sustainability Barometer Study identifies a decisive shift from years prior: leading organizations embed sustainability into their core business processes to outperform their peers across regions and industries. These leaders turn sustainability from a side initiative into a value-creation engine, driving resilience, competitiveness and market differentiation. Notably, 78% of integration-focused organizations highlight IT as a key enabler in achieving sustainability goals, leveraging data, automation and AI for measurable impact, and 56% of IT teams now lead sustainability efforts beyond IT, up from 38% in 2024.

Key global findings

  • Core driver of strategy: 62% of integration-focused organizations embed sustainability into their innovation, cost savings and resilience strategies – compared to 34% of others – transforming sustainability from a compliance requirement into a catalyst for long-term growth and competitive advantage.
  • Financial gains: 59% of organizations worldwide report financial benefits from sustainability investments, primarily through operational efficiency, customer retention and new market opportunities.
  • Early agentic AI adoption: Globally, 30% of all organizations are piloting or deploying agentic AI for sustainability, with early adopters reporting measurable gains in cost savings, innovation and compliance.
  • Connecting policy, people, and purpose: 73% of organizations globally report strong alignment between technology and sustainability teams. By connecting departmental objectives, empowering employees and engaging stakeholders, these leaders move sustainability from a compliance exercise to a driver of business value and lasting impact.
  • Regional and industry momentum: Europe leads in aligning tech modernization and AI adoption for sustainability, propelled by robust regulatory frameworks. Across all regions, countries accelerating sustainability cite clearer return on investment (ROI) or new revenue opportunities as the top drivers (67%). Additionally, industries leading in agentic AI adoption and experimentation include energy and utilities, banking and transport – with focus placed beyond energy and emissions optimization, on operational resilience, resource efficiency and sustainable product design.

The study findings align with the 2025 Kyndryl Readiness Report and reflect the deeper integration between sustainability and IT. The Readiness Report reveals that 27% of businesses that invest in IT modernization achieve sustainability-based benefits through efficiency, innovation, security and compliance, while 22% cite improved energy efficiency or sustainability as a critical outcome for digital transformation ROI.


About the Global Sustainability Barometer Study
The third edition of the Global Sustainability Barometer Study, conducted by Ecosystm and commissioned by Kyndryl and Microsoft, reflects the perspectives of 1,286 enterprise leaders spanning 20 countries and nine industry groups. Conducted between August and September 2025, this study aims to provide a comprehensive view of how integration, strategy, and technology are transforming sustainability from compliance to competitive advantage.

Learn more about the study, From Planning to Progress: AI-Driven Sustainability in Practice

Safer Black Friday and Cyber Week Shopping Demand More Than One Tool – Research Yields Consumer Security Tips

Posted in Commentary with tags on November 24, 2025 by itnerd

As millions of consumers prepare their budgets, credit cards and digital wallets for Black Friday and Cyber Week, the common wisdom is clear: use a VPN to protect your financial data. But in a study conducted by PureVPN, researchers with Ontario Tech University and CQR Cybersecurity found that relying on a standalone VPN, or juggling it alongside separate password managers and ad blockers, creates a false sense of security that cyber thieves are ready and able to exploit. PureVPN also announced a Black Friday and Cyber Week pricing discount of 88 percent at $1.49/month for unified, attack-thwarting online shopping and communications.

According to the study “The Cost of Fragmentation: Measuring Time, Spend and Risk in Personal Cybersecurity Tool Stacks,” the use of separate security tools for VPNs, password management, and ad blocking creates a dangerous security gap. The data shows that 38% of modern cyberattacks now exploit stolen credentials and exposed connections, specifically by taking advantage of the data exposed by non-integrated tools.

The Hidden Risk of the Security Gap

Shoppers often assume they are safe if they have a password manager and a VPN installed. However, when these tools don’t communicate and integrate with one another, risks emerge. A typical example: a consumer auto-fills credit card details or passwords on a mobile device while their separate VPN is disconnected (a common occurrence due to “alert fatigue”), and those credentials travel over the exposed network.

Alert Fatigue: The Enemy of Safe Shopping

The rush of online Black Friday deals and the contention for in-store “doorbuster” specials are chaotic. Adding a barrage of security notifications can make this chaos worse – and for many, overwhelming. That’s when a shopper turns to risky behaviors like turning off their VPN. The study found that the average consumer manages 3.4 distinct security apps and spends up to 27 hours a year maintaining them, leading to a cycle of “alert chaos”:

  • 44% of users receive overlapping alerts from different apps.
  • 38% of those who receive overlapping alerts admit to ignoring these alerts entirely due to the volume.
  • 29–34% leave essential tools disabled or miss paid features, turning fragmented apps into “open doors” for attackers.

Safer Shopping Solution: Integrated and Easy Protection, Not Competing Apps

For a safer holiday shopping season, PureVPN is offering discounts on its new Unified Security Suite, which was specifically designed to close security gaps for mobile and online shoppers – especially those who aren’t IT hobbyists. Combining a VPN, Password Manager, Dark Web Monitoring, and Tracker Blocking into a single app, the suite ensures that critical actions are protected automatically.

Key PureVPN Unified Security Suite protections for Black Friday shoppers include:

  • Secured Autofill: The integrated VPN and Password Manager ensure that any time credentials or credit card numbers are autofilled, they travel through an encrypted tunnel.
  • Real-Time Anti-Tracking: The built-in Tracker & Ad Blocker stops advertisers and malicious scripts from building profiles based on shopping habits.
  • Real Savings: By replacing redundant subscription apps, users can eliminate the cost of overlapping, non-integrated tools, which can run as high as $850 annually, according to the study’s findings.

PureVPN’s Black Friday and Holiday Season Pricing – Now 88% Off.

PureVPN has launched a $1.49/month Black Friday offer on its Unified Security Suite app to help protect privacy after a year of rising cybercrime, offering consumers the best value-to-feature ratio among VPN providers.

Availability

The new unified PureVPN app is now available on Android and iOS. This Black Friday, shoppers can secure their digital footprint not just with a VPN, but with a complete, integrated defensive perimeter. With the Unified Security Suite now live on both platforms, PureVPN is redefining personal protection: one app, zero complexity, complete peace of mind.

To learn more, visit: https://www.purevpn.com/order

So About Android Phones Getting AirDrop…. Apple May Not Have Signed Off On This

Posted in Commentary with tags , on November 22, 2025 by itnerd

A couple of days ago I posted a story about the Pixel 10, and ultimately all Android phones, getting the ability to support Apple’s AirDrop functionality. One thing that popped into my head at the time was that Apple was never mentioned as having signed off on this. As a result, I did some looking around and found my answer via a statement that Google provided to Android Authority:

We accomplished this through our own implementation. Our implementation was thoroughly vetted by our own privacy and security teams, and we also engaged a third party security firm to pentest the solution.

So Apple was not involved. That really sounds like the whole Beeper situation, where Beeper reverse engineered iMessage to give Android users the ability to send and receive iMessages in a very sketchy way. As a result, Apple went scorched earth on Beeper to stop that from working. But Beeper was a very tiny company which truly had zero chance against Apple. Google is a much bigger company that will stand up to Apple if the latter tries to break this functionality. It should also be noted that Apple gets billions of dollars from Google via an agreement to have Google’s search engine as the default search engine on iDevices, so Apple may have a financial incentive not to do anything. Thus the fact that Apple didn’t sign off on this may, as far as I can tell, be a non-factor. But we’ll find out soon enough.

2026 Predictions from Cerabyte 

Posted in Commentary with tags on November 21, 2025 by itnerd

Martin Kunze, founder and CMO of Cerabyte, was kind enough to offer the following 2026 technology predictions about important trends in AI, data centers and data storage. 

Tackling the Data Center Waste Crisis

The industry will confront an uncomfortable truth: data center efficiency has plateaued. In 2026, sustainability will become a competitive differentiator as hyperscalers and enterprises face mounting pressure to curb waste from short-lived media. That waste increasingly translates into economic cost, overprovisioned storage, and unused capacity. Expect to see new architectures that prioritize efficiency, longevity, and circular economy principles in hardware design.

The Future of AI Depends Not Just on Algorithms, but on Storage

AI innovation has been dominated by advances in algorithms and compute, but 2026 will mark the year storage infrastructure takes center stage. The ability to store, access, and preserve exabyte-scale datasets efficiently will define which companies lead in AI. Those who treat storage as a first-class citizen — not a bottleneck — will gain strategic advantage.

Sustainability Becomes the Hyperscalers’ Biggest Concern

In 2026, the race to power AI will collide head-on with the race to decarbonize. Hyperscalers will face increasing scrutiny over their environmental footprints, from embodied carbon in data centers to the long-term sustainability of their storage strategies. Technologies that extend data lifespan, minimize energy consumption, and reduce material waste will shift from “nice-to-have” to “must-adopt.”

Salesforce confirms 200+ orgs impacted by another third-party breach, this time via Gainsight

Posted in Commentary on November 21, 2025 by itnerd

In an early morning advisory yesterday, Salesforce said it revoked refresh tokens linked to Gainsight-published applications while it investigates data theft attacks targeting potentially hundreds of customers.

The company highlighted that the incident doesn’t originate from a vulnerability within its platform, as all evidence points to malicious activity related to the Gainsight app’s external connection to Salesforce.

“Salesforce has identified unusual activity involving Gainsight-published applications connected to Salesforce, which are installed and managed directly by customers. Our investigation indicates this activity may have enabled unauthorized access to certain customers’ Salesforce data through the app’s connection.

“Upon detecting the activity, Salesforce revoked all active access and refresh tokens associated with Gainsight-published applications connected to Salesforce and temporarily removed those applications from the AppExchange while our investigation continues,” Salesforce said in a Thursday morning advisory.

During the August 2025 Salesloft breach, “Scattered Lapsus$ Hunters” stole sensitive information from the customers of 760 companies using stolen OAuth tokens for Salesloft’s Drift AI chat integration with Salesforce, resulting in the theft of 1.5 billion Salesforce records.

On Thursday, ShinyHunters told BleepingComputer that they gained access to 285 Salesforce instances after breaching Gainsight via data stolen in the Salesloft Drift breach.

Gainsight did not say how its customers’ access tokens may have been compromised, but previously said it was also one of the Salesloft Drift customers impacted in the previous attacks.

Gainsight has an update and FAQ page for customer support, while Salesforce has alerted all impacted customers of this incident. 

John Carberry, Solution Sleuth, Xcape, Inc. had this to say:

   “Salesforce’s confirmation that over 200 organizations were exposed through misconfigured Gainsight apps is another sobering reminder that your biggest danger in the SaaS world is frequently someone else’s integration.

   “This incident demonstrates how long the tail of a supply-chain vulnerability can be. It builds immediately on the previous Salesloft/Drift breach, in which attackers allegedly stole OAuth tokens and are now utilizing that access to pivot into 285 Salesforce instances.

   “Technically, Salesforce did the right thing by removing all Gainsight-related tokens and removing the apps from the AppExchange, but for customers, this highlights an unsettling reality. Even if the core platform isn’t vulnerable, over-privileged third-party apps can still gain access to your CRM crown jewels.

   “This incident makes it abundantly evident that, even in cases when a core platform is secure, the broad permissions given to integrated applications that appear to be harmless continue to be the weakest link in the cloud ecosystem.

   “Moving forward, companies must handle linked apps as high-risk identities. Inventory them, give them the least privilege required, keep an eye on their activity, and be prepared to quickly revoke trust when anomalous behavior is detected. Attackers will have easy access to your client data if you don’t regularly examine your SaaS integrations and tighten OAuth scopes.

  “In 2025, the real zero day isn’t in your CRM; it’s in the third-party app you forgot was connected to it.”

Lydia Zhang, President & Co-Founder, Ridge Security Technology Inc., followed up with this:

   “It’s clear that once attackers succeed in a large-scale breach, it becomes progressively easier for them to leverage the compromised data and tokens to achieve additional attacks.

   “The message for defenders is that patching the initially ‘broken’ door isn’t enough, you must thoroughly inspect every part of your environment to ensure the attackers cannot reuse access from a prior breach to open new doors.”

Denis Calderone, CRO & COO, Suzu Labs, adds this:

   “We’ve been warning clients about this scenario for years, that the SaaS integration trust chain is almost always longer and more complex than anyone realizes.

   “This is like a Russian nesting doll: Salesloft gets breached, which exposes Gainsight, which compromises 200+ Salesforce customers. You might know you’re using Gainsight, but do you know Gainsight integrates with Salesloft? That visibility gap is where these cascading breaches live.

   “Organizations should focus heavily on OAuth hygiene and conditional access policies. Organizations need to continuously monitor OAuth token usage for abnormalities: unusual data volumes, unexpected geographic access, dormant tokens suddenly going active. When something doesn’t look right, automatically revoke refresh tokens. Don’t wait for vendor disclosure. If a token that’s been quiet for months suddenly pulls gigabytes of data, that’s your signal.

   “And here’s the simple part: if you see a dormant OAuth token that hasn’t been used in 60 or 90 days, just revoke it. This will limit your blast radius with minimal impact on user experience.”
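The dormant-token policy Calderone describes can be sketched as a simple rule. This is a hypothetical illustration; the record shape (an "id" plus a "last_used" timestamp) and the function name are assumptions, and actual revocation would go through the identity provider's or SaaS platform's API:

```python
from datetime import datetime, timedelta, timezone

def tokens_to_revoke(tokens, max_idle_days=90, now=None):
    """Flag OAuth tokens with no activity in the last max_idle_days.

    Each token record is assumed to be a dict with an "id" and a
    timezone-aware "last_used" datetime (illustrative shape only).
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_idle_days)
    return [t["id"] for t in tokens if t["last_used"] < cutoff]
```

Running this on a daily schedule and feeding the result into the platform's revocation endpoint implements exactly the "quiet for 60 or 90 days, just revoke it" rule, limiting blast radius with minimal user impact.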

Supply chain attacks are becoming as bad as ransomware, with organizations falling victim to them left, right, and center. This reinforces that organizations need to take action to mitigate this threat right now.

Active Archive Alliance 2026 Predictions

Posted in Commentary on November 21, 2025 by itnerd

Members of the Active Archive Alliance recently shared their predictions for data management as it relates to active archives in 2026. These newly released predictions reveal a major shift: active archives are no longer a “nice-to-have” tier. They are becoming the architectural backbone that enables AI at scale.

Below is a list of the top 13 upcoming trends for your review:

Active archives will play a central role in ensuring high-value datasets remain instantly accessible. Organizations will increasingly adopt a combination of active archives, intelligent tiering, and hybrid cloud architectures to optimize storage utilization at scale. Tiering is necessary to group large datasets and assign them levels of importance and priority. An active archive serves this purpose well, as it allows data to be relegated to a lower tier while still being available rapidly should it be needed by the AI engine. Organizations that fail to modernize their storage strategies will risk higher costs, slower AI deployment, and diminished competitiveness in an increasingly data-driven world. – Eric Polet, Director of Product Marketing, Arcitecta
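The age-and-priority tiering described above might be sketched as a simple policy rule. This is a hypothetical illustration, not any vendor's actual logic; the tier names and thresholds are assumptions:

```python
def assign_tier(days_since_access: int, priority: str) -> str:
    """Assign a storage tier from data age and business priority."""
    # High-priority or recently touched data stays on performance storage.
    if priority == "high" or days_since_access <= 30:
        return "performance"
    # Relegated but still rapidly retrievable when the AI engine needs it.
    if days_since_access <= 365:
        return "active-archive"
    # Rarely touched data moves to the cheapest long-term tier.
    return "deep-archive"
```

Real tiering engines weigh many more signals (access frequency, dataset size, compliance holds), but the core idea is the same: policy-driven demotion that keeps data addressable rather than dormant.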

Tape for long-term storage

Driven by exponential data growth and the need for lower-cost, energy-efficient, long-term storage, tape is poised to become a cornerstone of active archival tiers within hybrid storage architectures. – Marc Steinhilber, CEO, BDT Media Automation GmbH

Active archives will require sophisticated data analytics

The industry hit the storage wall as we predicted last year – rising lead times, media and stock prices tell the story. Active archives require sophisticated data analytics, as archives evolve from data dumps to data sources. Data storage must be accessible, sustainable, and affordable to unleash the full potential of AI. – Martin Kunze, CMO and Co-Founder, Cerabyte

Managing data growth is becoming more than just a challenge with IT teams barely able to keep up with demands for performance storage

In the AI data-driven world of 2026 and beyond, IT teams will be compelled to strategically leverage active archiving. With intelligent data management, an active archive solution allows for automated movement of data, based on user defined policy, moving data from expensive, energy intensive performance storage to eco-friendly, economy storage tiers such as today’s modern automated tape systems. This frees up overwhelmed performance storage tiers while maintaining ease of access to always online active archive content. – Rich Gadomski, Dir. Channel Sales and New Business Development, FUJIFILM North America Corp., Data Storage Solutions

Modern object storage will expand to include long-term tape solutions
The explosion of Generative AI and increased demand for unstructured data retention is exceeding modern IT budget growth. Standardized object storage interfaces are making it easy to move data, but object storage was designed as a single tier utilizing hard disk drives. Tiering will become a standard requirement for active data object storage vendors. Modern object storage solutions will expand support to include tape and other long-term storage mediums as an object storage deep archive target, at a fraction of the cost of cloud archives.  Cloud will continue to be part of the hybrid data protection strategy. The result will be lowered costs for organizations storing Petabytes of data.  – Mark Hill, Business Line Executive Data Retention Infrastructure, IBM

AI as the “archivist’s assistant” for value extraction

The role of the active archive will fundamentally change from a secure ‘holding tank’ to a ‘Data Intelligence Sandbox.’ AI will move beyond just classification and indexing to provide more robust and useful search, automatically identifying and connecting data—such as linking a decades-old research document to a currently active patent—transforming long-tail archived data into a persistent, accessible “corporate memory” that drives net-new R&D and revenue either for that organization or by monetizing the data to offer other organizations. – Paul Luppino, Director, Global Digital Solutions, Iron Mountain

The evolution of cold storage

Cold storage solutions will evolve to provide near-instant access (within seconds) to archived data, making it truly “active” rather than dormant. – Pete Paisley, Owner, MagStor

The rise of geo-distributed active archives based on S3-to-tape technology
By 2026, the deployment of geo-distributed active archives leveraging modern tape libraries is expected to accelerate across enterprises and data center environments. This development is driven by sustained data growth, rising energy and storage costs, and growing demands for data resilience and regulatory compliance.

Advancements in tape system integration, such as S3 object storage compatibility, metadata-driven access, and seamless connection to cloud workflows, are transforming S3-to-Tape systems into geo-aware active archives. These systems enable cost-efficient, sustainable, and cyber-resilient data preservation across multiple geographic locations.

Consequently, S3-to-Tape solutions will play a pivotal role in shaping long-term, distributed data management architectures. – Thomas Thalmann, CEO, PoINT Software & Systems GmbH

Companies that adopt active data archive solutions will be driven not just by cost savings but by compounding pressures such as:

  • Exponential growth in attack surfaces, vectors and points of entry
  • Required recovery of minimum business operations without ransomware payment
  • Regulatory enforcement that punishes non-compliance heavily
  • Rising cost of infrastructure and energy
  • Corporate sustainability mandates
  • Increasing volumes of AI-derived data with long-term retention requirements

These forces will make active archives strategically essential. As the most modern and efficient long-term data storage architecture designed for AI-era complexities, active archives enable early adopters to gain competitive advantages through lower compliance risk, reduced long-term costs, faster audit response, and lessened environmental impact. – Rick Bump, Chief Executive Officer and Co-Founder, SAVARTUS

AI meets its infrastructure reckoning

The race to scale artificial intelligence will collide head-on with the physical limits of power, space, and sustainability. The world’s data centers—already consuming nearly 5% of global electricity—will face unprecedented pressure as exabyte-scale datasets multiply and GPU-driven workloads demand 24/7 throughput. The winners in this next phase won’t be those who build the biggest models, but those who build the smartest infrastructure. Expect a paradigm shift that incorporates the concept of active archives—energy-aware, cyber-resilient tiers where cold data moves from cloud and disk to modern tape systems that consume virtually no power at rest yet remain immediately accessible. This balance of intelligence and efficiency will define digital progress in 2026 and beyond: AI innovation sustained not by endless compute, but by thoughtful, scalable data preservation that keeps the lights on—literally. – Ted Oade, Director of Product Marketing, Spectra Logic

Cloud-based active archives will no longer be thought of as secondary storage; they become an extension of primary storage. We expect demands for instant access to archived content to only increase – perhaps double or triple over the next few years – as adoption of AI, analytics, threat hunting, media workflows, and compliance accelerates. Data has gravity, and cloud-based archives are a way to balance storage costs with demand for accessibility; we expect demand for more active, “always available” storage to grow unabated. – George Hamilton, Director of Product Marketing, Wasabi

AI infrastructure will demand smarter access to all data

As AI workloads grow in complexity and scale, the way data centers manage and access storage is undergoing a fundamental shift. Traditional architectures are struggling to keep up with the demands of real-time analytics, model training, and inference. What’s emerging is a need for infrastructure that’s not only high-performance, but also flexible enough to span edge, core, and cloud environments. To support the full AI lifecycle, systems must deliver consistent performance while also enabling intelligent access to archival data, ensuring that even historical information can be leveraged efficiently and meaningfully.  – Scott Hamilton, Senior Director, Product Management, Marketing & CX, Western Digital

Hybrid deployments for large active archives

The use of hybrid configurations that combine on-premises storage and cloud will continue to grow. This is especially the case for large active media archives where on-premises storage provides high performance and cost-effectiveness and, when combined with cloud object storage, the solution provides a high level of data protection. – Phil Storey, CEO, XenData

Hammerspace Breaks IO500 Barriers: First Standards-Based Linux + NFS System To Achieve True HPC-Class Performance

Posted in Commentary with tags on November 21, 2025 by itnerd

Hammerspace has announced a breakthrough IO500 10-Node Production result that establishes a new era for high-performance data infrastructure. For the first time, a fully standards-based architecture — standard Linux, the upstream NFSv4.2 client, and commodity NVMe flash — has delivered a fully reproducible 10-node Production IO500 result traditionally achievable only by proprietary parallel filesystems.

This result is the first IO500 Production benchmark to demonstrate that standards-based Linux and NFS can meet the extreme performance requirements of high-performance computing (HPC) and artificial intelligence (AI) workloads — without proprietary client software, specialized networking stacks or complex parallel filesystem infrastructure.

A Milestone Moment for Data Platforms — As Transformative as Linux Was for Compute

In the late 1990s, researchers like Dr. David Bader, Distinguished Professor and founder of the Department of Data Science in the Ying Wu College of Computing and Director of the Institute for Data Science at New Jersey Institute of Technology, transformed the HPC world by proving that clusters built on Linux and commodity components could rival proprietary supercomputers. That work reshaped HPC architecture then, laid the groundwork for the machine learning (ML) and AI architectures that followed, and ultimately made Linux the standard powering nearly every large-scale compute environment on Earth. 

This vision laid the foundations for the AI architectures that are emerging even today. Hyperion Research estimates that “over $300 billion in revenue has been generated from selling supercomputers. This represents a sizable economic gain, especially since the use of these systems generated research valued at least ten times over the purchase price. While it is difficult to fully measure the value that supercomputers have generated, even looking at just automotive, aircraft, and pharmaceuticals, supercomputers have contributed to products valued at more than $100 trillion over the last 25 years.”

Hammerspace’s IO500 achievement represents the next chapter of that evolution, this time in the data layer.

Just as Linux revolutionized compute architecture, the combination of standards-based Linux and pNFS is now proving it can revolutionize high-performance data architecture for HPC and AI.
The First Architecture That Meets the Demands of Both HPC and AI

This achievement marks the industry’s proof that open, interoperable infrastructure can deliver the performance required by AI and HPC workloads without proprietary lock-in.

HPC environments have traditionally relied on deep institutional expertise to operate complex proprietary filesystems, but the rapid rise of AI has changed the landscape. AI is scaling far faster — across enterprises, cloud providers, sovereign AI platforms, service providers and thousands of new data-intensive applications — and it is impossible to meet this demand with architectures that require niche expertise to deploy and maintain. Every systems administrator already knows how to operate Linux and NFS; however, very few have the specialized knowledge required for legacy parallel file systems. As AI infrastructure becomes mainstream, organizations need HPC-class performance delivered through tools and protocols familiar to the broader IT community. This IO500 result proves that the performance required for both HPC and AI can now be achieved using standard Linux, standard NFS and widely understood operational models, finally aligning extreme performance with the scale and accessibility the AI industry demands.

Standards-Based Architecture, Industry-Leading Performance

The submission by Samsung, leveraging the Hammerspace Data Platform, achieved the fastest standards-based IO500 10-Node Production result ever recorded. Hammerspace not only contributes a substantial share of the pNFS capabilities in the Linux kernel, but its Data Platform is also engineered from the ground up to capitalize on those upstream performance enhancements.

Unlike traditional storage platforms and legacy parallel file systems that treat Linux as a compatibility layer or pNFS as an add-on interface, Hammerspace’s architecture is built directly on top of — and actively contributes to — the same NFSv4.2 and pNFS innovations driving modern HPC and AI performance. This deep alignment allows Hammerspace to take immediate advantage of new capabilities such as lower-latency I/O paths, advanced client-side parallelism and improved failover logic, translating Linux’s ongoing advancements directly into real-world application speedups. As a result, organizations can benefit from cutting-edge performance improvements in standard Linux distributions without deploying proprietary clients or rearchitecting their infrastructure.

Unlike legacy parallel file systems that rely on complex, vendor-specific clients, Samsung’s Hammerspace submission used: 

  • Standard RHEL/Ubuntu Linux
  • Standard upstream NFSv4.2 (pNFS) client
  • Standard NVMe SSDs from Samsung
  • Standard IP-over-InfiniBand
  • Standard server platforms
  • Hammerspace’s standards-based parallel global file system leveraging the pNFS client

The submission used no proprietary client, no custom kernel modules and no exotic parallel file system.
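To illustrate what “standards-based” means in practice, the sketch below shows how a share could be mounted with the stock Linux NFSv4.2 client that ships in RHEL and Ubuntu. This is a hypothetical configuration fragment, not taken from the submission: the server name `mds.example.com`, the export path `/data`, and the mount point are placeholders.

```shell
# Hypothetical sketch: mount a share using only the upstream Linux
# NFSv4.2 client -- no vendor client software to install.
# "mds.example.com:/data" is a placeholder metadata server and export.
sudo mkdir -p /mnt/data
sudo mount -t nfs4 -o vers=4.2,proto=tcp mds.example.com:/data /mnt/data

# With pNFS, the client obtains layouts from the metadata server and
# then issues I/O directly to the data servers in parallel.
nfsstat -m    # confirm the mount negotiated vers=4.2
```

The point of the sketch is that nothing here is vendor-specific: any administrator who has mounted an NFS share already knows every command involved.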

Modern HPC and AI workloads can now run at elite speeds using standards-based infrastructure and data architectures.

Upstream Linux Innovation Unlocks New Performance

The step-function improvement achieved between the ISC25 and SC25 events is the result of:

  • Enhanced pNFS Flexible File layout parallelism
  • Upstream NFS client improvements contributed by Hammerspace
  • Upstream NFS server improvements that avoid page cache contention, allowing improved sustained performance and reduced resource utilization, contributed by Hammerspace
  • File-level objective-based policy optimizations
  • Latency reductions and throughput gains in metadata access
  • High-performance NVMe data placement managed through the Hammerspace global file system
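For readers who want to see whether these upstream pNFS paths are active on their own systems, a few standard diagnostics suffice. This is a hedged sketch assuming a Linux host with an NFSv4.2 mount already in place; the exact output varies by kernel version and distribution.

```shell
# Hypothetical diagnostic sketch on a host with an NFSv4.2 mount.
# Check that the mount negotiated NFSv4.2:
nfsstat -m | grep 'vers=4.2'

# Check that the flexible-files pNFS layout driver is loaded:
lsmod | grep nfs_layout_flexfiles

# Per-mount NFS statistics, including pNFS layout operations
# such as LAYOUTGET, appear in mountstats:
grep -i layout /proc/self/mountstats
```

Because all of this surfaces through standard kernel interfaces, the same familiar tooling applies whether the backend is a legacy filer or a parallel global file system.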

These enhancements strengthen the entire Linux ecosystem — echoing the transformative Linux HPC contributions of the late 1990s and early 2000s.

Hammerspace’s Top-10 IO500 performance is more than a benchmark victory. It is the first empirical proof that standards-based Linux and NFS can power high-performance data systems at the top of the HPC and AI stack.

Linux democratized supercomputing and now standards-based data infrastructure is positioned to democratize high-performance storage — and reshape the future of global-scale computing.