Salesforce confirms 200+ orgs impacted by another third-party breach via Gainsight

Posted in Commentary on November 21, 2025 by itnerd

In an early morning advisory yesterday, Salesforce said it has revoked refresh tokens linked to Gainsight-published applications while it investigates data theft attacks targeting potentially hundreds of customers.

The company highlighted that the incident does not originate from a vulnerability within its platform, as all evidence points to malicious activity involving the Gainsight app’s external connection to Salesforce.

“Salesforce has identified unusual activity involving Gainsight-published applications connected to Salesforce, which are installed and managed directly by customers. Our investigation indicates this activity may have enabled unauthorized access to certain customers’ Salesforce data through the app’s connection.

“Upon detecting the activity, Salesforce revoked all active access and refresh tokens associated with Gainsight-published applications connected to Salesforce and temporarily removed those applications from the AppExchange while our investigation continues,” Salesforce said in a Thursday morning advisory.

During the August 2025 Salesloft breach, “Scattered Lapsus$ Hunters” used stolen OAuth tokens for Salesloft’s Drift AI chat integration with Salesforce to steal sensitive information from the customers of 760 companies, making off with 1.5 billion Salesforce records.

On Thursday, ShinyHunters told BleepingComputer that they gained access to 285 Salesforce instances after breaching Gainsight using data stolen in the Salesloft Drift breach.

Gainsight has not said how its customers’ access tokens may have been compromised, but it previously confirmed that it was among the Salesloft Drift customers impacted by the earlier attacks.

Gainsight has published an update and FAQ page for customers, and Salesforce has alerted all impacted customers of this incident.

John Carberry, Solution Sleuth, Xcape, Inc. had this to say:

   “Salesforce’s confirmation that over 200 organizations were exposed through misconfigured Gainsight apps is another sobering reminder that your biggest danger in the SaaS world is frequently someone else’s integration.

   “This incident demonstrates how long the tail of a supply-chain vulnerability can be. It builds immediately on the previous Salesloft/Drift breach, in which attackers allegedly stole OAuth tokens and are now utilizing that access to pivot into 285 Salesforce instances.

   “Technically, Salesforce did the right thing by removing all Gainsight-related tokens and removing the apps from the AppExchange, but for customers, this highlights an unsettling reality. Even if the core platform isn’t vulnerable, over-privileged third-party apps can still gain access to your CRM crown jewels.

   “This incident makes it abundantly evident that, even in cases when a core platform is secure, the broad permissions given to integrated applications that appear to be harmless continue to be the weakest link in the cloud ecosystem.

   “Moving forward, companies must handle linked apps as high-risk identities. Inventory them, give them the least privilege required, keep an eye on their activity, and be prepared to quickly revoke trust when anomalous behavior is detected. Attackers will have easy access to your client data if you don’t regularly examine your SaaS integrations and tighten OAuth scopes.

  “In 2025, the real zero day isn’t in your CRM; it’s in the third-party app you forgot was connected to it.”
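
Carberry’s “inventory them” advice has a concrete starting point inside Salesforce itself. Below is a minimal sketch, not an official procedure, that queries the standard OauthToken object for connected-app tokens and their last-use dates; the org URL and session token are placeholders, and you should verify the object and field names against your org’s API version.

import requests

INSTANCE = "https://yourorg.my.salesforce.com"  # placeholder org URL
ACCESS_TOKEN = "00D...redacted"                 # placeholder admin session token

# OauthToken is Salesforce's standard record of connected-app tokens.
SOQL = (
    "SELECT AppName, User.Username, LastUsedDate, UseCount "
    "FROM OauthToken ORDER BY LastUsedDate"
)

resp = requests.get(
    f"{INSTANCE}/services/data/v61.0/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": SOQL},
    timeout=30,
)
resp.raise_for_status()

# Tokens with old LastUsedDate values are candidates for review or revocation.
for rec in resp.json()["records"]:
    print(rec["AppName"], rec["User"]["Username"],
          rec["LastUsedDate"], rec["UseCount"])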

Lydia Zhang, President & Co-Founder, Ridge Security Technology Inc., followed up with this:

   “It’s clear that once attackers succeed in a large-scale breach, it becomes progressively easier for them to leverage the compromised data and tokens to achieve additional attacks.

   “The message for defenders is that patching the initially ‘broken’ door isn’t enough, you must thoroughly inspect every part of your environment to ensure the attackers cannot reuse access from a prior breach to open new doors.”

Denis Calderone, CRO & COO, Suzu Labs, adds this:

   “We’ve been warning clients about this scenario for years, that the SaaS integration trust chain is almost always longer and more complex than anyone realizes.

   “This is like a Russian nesting doll: Salesloft gets breached, which exposes Gainsight, which compromises 200+ Salesforce customers. You might know you’re using Gainsight, but do you know Gainsight integrates with Salesloft? That visibility gap is where these cascading breaches live.

   “Organizations should focus heavily on OAuth hygiene and conditional access policies. Organizations need to continuously monitor OAuth token usage for abnormalities: unusual data volumes, unexpected geographic access, dormant tokens suddenly going active. When something doesn’t look right, automatically revoke refresh tokens. Don’t wait for vendor disclosure. If a token that’s been quiet for months suddenly pulls gigabytes of data, that’s your signal.

   “And here’s the simple part: if you see a dormant OAuth token that hasn’t been used in 60 or 90 days, just revoke it. This will limit your blast radius with minimal impact on user experience.”
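
Calderone’s “just revoke it” step is also scriptable when your own integration holds the tokens. The sketch below, with a hypothetical token store and a 90-day staleness window, posts stale refresh tokens to Salesforce’s standard OAuth 2.0 revocation endpoint; revoking a refresh token also invalidates the access tokens issued from it.

from datetime import datetime, timedelta, timezone

import requests

REVOKE_URL = "https://login.salesforce.com/services/oauth2/revoke"
STALE_AFTER = timedelta(days=90)  # illustrative staleness window

# Hypothetical token store: (refresh_token, last_used_at) pairs that your
# integration already tracks.
token_store = [
    ("5Aep861...redacted", datetime(2025, 6, 1, tzinfo=timezone.utc)),
]

now = datetime.now(timezone.utc)
for refresh_token, last_used in token_store:
    if now - last_used < STALE_AFTER:
        continue  # recently used; leave it alone
    # Standard OAuth revocation: POST the token as a form parameter.
    resp = requests.post(REVOKE_URL, data={"token": refresh_token}, timeout=30)
    resp.raise_for_status()
    print("revoked token last used", last_used.date())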

Supply chain attacks are becoming as damaging as ransomware, with organizations falling victim left, right, and center. This reinforces that organizations need to take action to mitigate this threat right now.

Active Archive Alliance 2026 Predictions

Posted in Commentary on November 21, 2025 by itnerd

Members of the Active Archive Alliance recently shared their predictions for data management as it relates to active archives in 2026. These newly released predictions reveal a major shift: active archives are no longer a “nice-to-have” tier. They are becoming the architectural backbone that enables AI at scale.

Below is a list of the top 13 upcoming trends for your review:

Active archives will play a central role in ensuring high-value datasets remain instantly accessible. Organizations will increasingly adopt a combination of active archives, intelligent tiering, and hybrid cloud architectures to optimize storage utilization at scale. Tiering is necessary to group large datasets and assign them levels of importance and priority. An active archive serves this purpose well, as it allows data to be relegated to a lower tier while still being available rapidly should it be needed by the AI engine. Organizations that fail to modernize their storage strategies will risk higher costs, slower AI deployment, and diminished competitiveness in an increasingly data-driven world. – Eric Polet, Director of Product Marketing, Arcitecta

Tape for long-term storage

Driven by exponential data growth and the need for lower-cost, energy-efficient, long-term storage, tape is poised to become a cornerstone of active archival tiers within hybrid storage architectures. – Marc Steinhilber, CEO, BDT Media Automation GmbH

Active archives will require sophisticated data analytics

The industry hit the storage wall as we predicted last year – rising lead times and rising media and stock prices tell the story. Active archives require sophisticated data analytics as archives evolve from data dumps into data sources. Data storage must be accessible, sustainable, and affordable to unleash the full potential of AI. – Martin Kunze, CMO and Co-Founder, Cerabyte

Managing data growth is becoming more than just a challenge, with IT teams barely able to keep up with demands for performance storage

In the AI data-driven world of 2026 and beyond, IT teams will be compelled to strategically leverage active archiving. With intelligent data management, an active archive solution allows for automated movement of data based on user-defined policy, moving data from expensive, energy-intensive performance storage to eco-friendly economy storage tiers such as today’s modern automated tape systems. This frees up overwhelmed performance storage tiers while maintaining ease of access to always-online active archive content. – Rich Gadomski, Dir. Channel Sales and New Business Development, FUJIFILM North America Corp., Data Storage Solutions

Modern object storage will expand to include long-term tape solutions

The explosion of generative AI and increased demand for unstructured data retention is exceeding modern IT budget growth. Standardized object storage interfaces are making it easy to move data, but object storage was designed as a single tier utilizing hard disk drives. Tiering will become a standard requirement for active data object storage vendors. Modern object storage solutions will expand support to include tape and other long-term storage media as an object storage deep archive target, at a fraction of the cost of cloud archives. Cloud will continue to be part of the hybrid data protection strategy. The result will be lowered costs for organizations storing petabytes of data. – Mark Hill, Business Line Executive Data Retention Infrastructure, IBM
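
To put the tiering idea in concrete terms: in S3-style object storage, the tier policy is expressed as a lifecycle rule. Here is a minimal sketch using boto3 with a hypothetical bucket and thresholds; an S3-to-tape system would present the same interface, typically with a vendor-specific storage class for its tape tier.

import boto3

s3 = boto3.client("s3")

# Move objects under raw/ to a colder tier at 30 days and to deep
# archive at 180 days; bucket name and day counts are illustrative.
s3.put_bucket_lifecycle_configuration(
    Bucket="research-archive",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-cold-data",
            "Status": "Enabled",
            "Filter": {"Prefix": "raw/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "GLACIER_IR"},
                {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }]
    },
)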

AI as the “archivist’s assistant” for value extraction

The role of the active archive will fundamentally change from a secure ‘holding tank’ to a ‘Data Intelligence Sandbox.’ AI will move beyond classification and indexing to provide more robust and useful search, automatically identifying and connecting data—such as linking a decades-old research document to a currently active patent—transforming long-tail archived data into a persistent, accessible “corporate memory” that drives net-new R&D and revenue, either for that organization or by monetizing the data for other organizations. – Paul Luppino, Director, Global Digital Solutions, Iron Mountain

The evolution of cold storage

Cold storage solutions will evolve to provide near-instant access (within seconds) to archived data, making it truly “active” rather than dormant. – Pete Paisley, Owner, MagStor

The rise of geo-distributed active archives based on S3-to-tape technology

By 2026, the deployment of geo-distributed active archives leveraging modern tape libraries is expected to accelerate across enterprises and data center environments. This development is driven by sustained data growth, rising energy and storage costs, and growing demands for data resilience and regulatory compliance.

Advancements in tape system integration, such as S3 object storage compatibility, metadata-driven access, and seamless connection to cloud workflows, are transforming S3-to-Tape systems into geo-aware active archives. These systems enable cost-efficient, sustainable, and cyber-resilient data preservation across multiple geographic locations.

Consequently, S3-to-Tape solutions will play a pivotal role in shaping long-term, distributed data management architectures. – Thomas Thalmann, CEO, PoINT Software & Systems GmbH

Companies that adopt active data archive solutions will be driven not just by cost savings but by compounding pressures such as:

  • Exponential growth in attack surfaces, vectors and points of entry
  • Required recovery of minimum business operations without ransomware payment
  • Regulatory enforcement that punishes non-compliance heavily
  • Rising cost of infrastructure and energy
  • Corporate sustainability mandates
  • Increasing volumes of AI-derived data with long-term retention requirements

These forces will make active archives strategically essential. As the most modern and efficient long-term data storage architecture designed for AI-era complexities, active archives enable early adopters to gain competitive advantages through lower compliance risk, reduced long-term costs, faster audit response, and lessened environmental impact. – Rick Bump, Chief Executive Officer and Co-Founder, SAVARTUS

AI meets its infrastructure reckoning

The race to scale artificial intelligence will collide head-on with the physical limits of power, space, and sustainability. The world’s data centers—already consuming nearly 5% of global electricity—will face unprecedented pressure as exabyte-scale datasets multiply and GPU-driven workloads demand 24/7 throughput. The winners in this next phase won’t be those who build the biggest models, but those who build the smartest infrastructure. Expect a paradigm shift that incorporates the concept of active archives—energy-aware, cyber-resilient tiers where cold data moves from cloud and disk to modern tape systems that consume virtually no power at rest yet remain immediately accessible. This balance of intelligence and efficiency will define digital progress in 2026 and beyond: AI innovation sustained not by endless compute, but by thoughtful, scalable data preservation that keeps the lights on—literally. – Ted Oade, Director of Product Marketing, Spectra Logic

Cloud-based active archives will no longer be thought of as secondary storage; they will become an extension of primary storage. We expect demands for instant access to archived content to only increase – perhaps double or triple over the next few years – as adoption of AI, analytics, threat hunting, media workflows, and compliance use cases accelerates. Data has gravity, and cloud-based archives are a way to balance storage costs with demand for accessibility; we expect demand for more active, “always available” storage to grow unabated. – George Hamilton, Director of Product Marketing, Wasabi

AI infrastructure will demand smarter access to all data

As AI workloads grow in complexity and scale, the way data centers manage and access storage is undergoing a fundamental shift. Traditional architectures are struggling to keep up with the demands of real-time analytics, model training, and inference. What’s emerging is a need for infrastructure that’s not only high-performance, but also flexible enough to span edge, core, and cloud environments. To support the full AI lifecycle, systems must deliver consistent performance while also enabling intelligent access to archival data, ensuring that even historical information can be leveraged efficiently and meaningfully.  – Scott Hamilton, Senior Director, Product Management, Marketing & CX, Western Digital

Hybrid deployments for large active archives

The use of hybrid configurations that combine on-premises storage and cloud will continue to grow. This is especially the case for large active media archives where on-premises storage provides high performance and cost-effectiveness and, when combined with cloud object storage, the solution provides a high level of data protection. – Phil Storey, CEO, XenData

Hammerspace Breaks IO500 Barriers: First Standards-Based Linux + NFS System To Achieve True HPC-Class Performance

Posted in Commentary with tags on November 21, 2025 by itnerd

Hammerspace has announced a breakthrough IO500 10-Node Production result that establishes a new era for high-performance data infrastructure. For the first time, a fully standards-based architecture — standard Linux, the upstream NFSv4.2 client, and commodity NVMe flash — has delivered a fully reproducible 10-node Production IO500 result of a kind traditionally achievable only by proprietary parallel filesystems.

This result is the first IO500 Production benchmark to demonstrate that standards-based Linux and NFS can meet the extreme performance requirements of high-performance computing (HPC) and artificial intelligence (AI) workloads — without proprietary client software, specialized networking stacks, or complex parallel filesystem infrastructure.

A Milestone Moment for Data Platforms — As Transformative as Linux Was for Compute

In the late 1990s, researchers like Dr. David Bader, Distinguished Professor and founder of the Department of Data Science in the Ying Wu College of Computing and Director of the Institute for Data Science at New Jersey Institute of Technology, transformed the HPC world by proving that clusters built on Linux and commodity components could rival proprietary supercomputers. That work transformed HPC architecture then, laid the groundwork for the machine learning (ML) and AI architectures that followed, and ultimately made Linux the standard powering nearly every major compute environment on Earth.

This vision laid the foundations for the AI architectures that are emerging even today. Hyperion Research estimates that “over $300 billion in revenue has been generated from selling supercomputers. This represents a sizable economic gain, especially since the use of these systems generated research valued at least ten times over the purchase price. While it is difficult to fully measure the value that supercomputers have generated, even looking at just automotive, aircraft, and pharmaceuticals, supercomputers have contributed to products valued at more than $100 trillion over the last 25 years.”

Hammerspace’s IO500 achievement represents the next chapter of that evolution, this time in the data layer.

Just as Linux revolutionized compute architecture, the combination of standards-based Linux and pNFS is now proving it can revolutionize high-performance data architecture for HPC and AI.

The First Architecture That Meets the Demands of Both HPC and AI

This achievement marks the industry’s proof that open, interoperable infrastructure can deliver the performance required by AI and HPC workloads without proprietary lock-in.

HPC environments have traditionally relied on deep institutional expertise to operate complex proprietary filesystems, but the rapid rise of AI has changed the landscape. AI is scaling far faster — across enterprises, cloud providers, sovereign AI platforms, service providers and thousands of new data-intensive applications — and it is impossible to meet this demand with architectures that require niche expertise to deploy and maintain. Every systems administrator already knows how to operate Linux and NFS; however, very few have the specialized knowledge required for legacy parallel file systems. As AI infrastructure becomes mainstream, organizations need HPC-class performance delivered through tools and protocols familiar to the broader IT community. This IO500 result proves that the performance required for both HPC and AI can now be achieved using standard Linux, standard NFS and widely understood operational models, finally aligning extreme performance with the scale and accessibility the AI industry demands.

Standards-Based Architecture, Industry-Leading Performance

The submission by Samsung, leveraging the Hammerspace Data Platform, achieved the fastest standards-based IO500 10-Node Production result ever recorded. Hammerspace not only contributes a substantial share of the pNFS capabilities in the upstream Linux kernel, but its Data Platform is also engineered from the ground up to capitalize on those upstream performance enhancements.

Unlike traditional storage platforms and legacy parallel file systems that treat Linux as a compatibility layer or pNFS as an add-on interface, Hammerspace’s architecture is built directly on top of — and actively contributes to — the same NFSv4.2 and pNFS innovations driving modern HPC and AI performance. This deep alignment uniquely allows Hammerspace to take immediate advantage of new capabilities such as lower-latency I/O paths, advanced client-side parallelism and improved failover logic, translating Linux’s ongoing advancements directly into real-world application speedups. As a result, organizations can benefit from cutting-edge performance improvements in standard Linux distributions without deploying proprietary clients or rearchitecting their infrastructure.

Unlike legacy parallel file systems that rely on complex, vendor-specific clients, the Samsung/Hammerspace submission used:

  • Standard RHEL/Ubuntu Linux
  • Standard upstream NFSv4.2 (pNFS) client
  • Standard NVMe SSDs from Samsung
  • Standard IP-over-InfiniBand
  • Standard server platforms
  • Hammerspace’s standards-based parallel global file system leveraging the pNFS client

The submission used no proprietary client, no custom kernel modules, and no exotic parallel file system.
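
To make the “no proprietary client” claim concrete, here is a minimal sketch of what client-side setup looks like with the stock kernel client, assuming a pNFS-capable NFSv4.2 server; the server name and export are hypothetical, the script must run as root, and a nonzero LAYOUTGET counter in /proc/self/mountstats is one way to confirm the client actually negotiated pNFS layouts.

import subprocess
from pathlib import Path

SERVER = "metadata.example.com"  # hypothetical metadata server
EXPORT = "/hs"                   # hypothetical export path
MOUNTPOINT = "/mnt/hs"

Path(MOUNTPOINT).mkdir(parents=True, exist_ok=True)

# The stock upstream client: a plain NFSv4.2 mount, no vendor module.
subprocess.run(
    ["mount", "-t", "nfs", "-o", "vers=4.2", f"{SERVER}:{EXPORT}", MOUNTPOINT],
    check=True,
)

# Nonzero LAYOUTGET counts in the kernel's per-mount stats indicate the
# client is fetching pNFS layouts and doing parallel I/O to data servers.
for line in Path("/proc/self/mountstats").read_text().splitlines():
    if line.strip().startswith("LAYOUTGET:"):
        print("pNFS layout ops:", line.strip())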

Modern HPC and AI workloads can now run at elite speeds using standards-based infrastructure and data architectures.

Upstream Linux Innovation Unlocks New Performance

The step-function improvement achieved between the ISC25 and SC25 events is the result of:

  • Enhanced pNFS Flexible File layout parallelism
  • Upstream NFS client improvements contributed by Hammerspace
  • Upstream NFS server improvements that avoid page cache contention, allowing improved sustained performance and reduced resource utilization, contributed by Hammerspace
  • File-level objective-based policy optimizations
  • Latency reductions and throughput gains in metadata access
  • High-performance NVMe data placement managed through the Hammerspace global file system

These enhancements strengthen the entire Linux ecosystem — echoing the transformative Linux HPC contributions of the late 1990s and early 2000s.

Hammerspace’s Top-10 IO500 performance is more than a benchmark victory. It is the first empirical proof that standards-based Linux and NFS can power high-performance data systems at the top of the HPC and AI stack.

Linux democratized supercomputing and now standards-based data infrastructure is positioned to democratize high-performance storage — and reshape the future of global-scale computing.

Ridge Security Brings AI-Powered Penetration Testing to Microsoft Azure Marketplace

Posted in Commentary with tags on November 21, 2025 by itnerd

Ridge Security has made its flagship AI-powered solution, RidgeBot®, available on the Microsoft Azure Marketplace, Microsoft’s online store providing applications and services for use on Azure. Customers and partners can seamlessly deploy RidgeBot within their Azure environments to perform continuous, AI-driven validation that identifies, validates, and remediates vulnerabilities at scale.

Transforming Security Validation Through Automation

Powered by AI-driven automation, RidgeBot autonomously identifies attack surfaces, safely executes exploitations, and generates actionable reports aligned with OWASP and MITRE frameworks. This enables security and DevOps teams to detect, validate, and remediate vulnerabilities faster, reducing risk exposure and strengthening resilience across hybrid and multi-cloud environments.

By leveraging Azure’s global infrastructure, RidgeBot allows organizations to unify automated penetration testing and continuous security validation under a single cloud platform. The result is a scalable, always-on approach that replaces infrequent manual testing with continuous, data-driven assurance of real-world defenses.

Designed for Enterprises and Partners Alike

RidgeBot on Microsoft Azure Marketplace provides flexible deployment options tailored to different security and operational needs:

  • Managed Application (Subscription): Deploy RidgeBot directly from Azure with a simple monthly plan covering one web app or up to twenty IPs, billed through Azure.
  • Virtual Machine (BYOL): Deploy RidgeBot as a VM using a license purchased directly from Ridge Security for greater configuration flexibility.
  • Microsoft Sentinel Joint Solution: Centrally manage multiple RidgeBot deployments across on-premises and cloud environments for unified alerts and analytics.

These options empower enterprises and partners to scale security validation seamlessly within their Azure ecosystems. Learn more at www.ridgesecurity.ai/azure.

Guest Post – Travel hack: Restart this one phone setting for faster internet

Posted in Commentary with tags on November 21, 2025 by itnerd

Just toggling airplane mode can improve the connection

A reliable internet connection while traveling is a modern necessity, but your mobile device can sabotage its quality. Connectivity experts explain that this happens because your device may prioritize one network over another, regardless of their quality or strength.

“When you land in a foreign country, your phone automatically scans and registers available networks,” says Vykintas Maknickas, CEO of Saily, NordVPN’s travel eSIM app. “If your home network has several roaming partners at your destination with equal priority, your device will usually latch onto the strongest signal at the airport — and then stick with it.”

Airports typically provide strong coverage from multiple carriers, since carriers have an incentive to connect as many devices as possible. If your phone attaches to one network in the terminal, that initial connection can persist even when a stronger or faster network is available at your hotel or local area, potentially leaving you with a weaker connection.

Instead of relying on unsecured Wi-Fi or searching for a local SIM card, a simple fix can help. “Once you reach your hotel, apartment, or office — the spot where you’ll spend the most time — toggle airplane mode once. That will force a clean network reselection and let your phone choose the best local tower and partner for your location,” says Maknickas.

The $0 travel hack

To secure a better quality connection, follow these steps:

  • On arrival, let your phone connect as usual so you can get moving.
  • When you reach your base, turn on airplane mode for 10-20 seconds, then turn it off.
  • Give it a minute to settle. Watch for signal bars and 4G/5G indicators and, more importantly, run a quick speed test or load a map.

This break triggers a fresh scan of available networks. Because you’re now in a different environment, the best option may change, and your phone will lock onto that instead of keeping the airport pick.

Additional tips to keep in mind

To avoid potential issues, Maknickas also shares a few additional pieces of advice:

  • Some carriers can nudge your phone toward preferred partners. The toggle still helps the device reassess signal quality, but you may be steered back. If performance is poor, switch to manual network selection in settings and pick the carrier that has the best performance.
  • Don’t overdo it. Constant toggling while moving can make the device chase towers and drain battery. The point of this hack is one reset at the main location.

About

Saily is a travel eSIM app that ensures a secure and affordable mobile internet connection abroad. Saily offers 24/7 instant customer support, flexible plans, and coverage in 200+ destinations. Saily was created by the experts behind NordVPN — the advanced security and privacy app. For more information: https://saily.com

Vykintas Maknickas is the CEO of Saily. After spending the better part of a decade at Nord Security, Vykintas now oversees Saily, the newest addition to Nord Security’s product family. He is a professional with comprehensive knowledge of the past, present, and future of cybersecurity and stays up to date with the latest threats and opportunities in the digital world.

The Day a Database Permission Change Broke the Internet: A Cloudflare Story – Liquibase analysis 

Posted in Commentary with tags on November 20, 2025 by itnerd

Ryan McCurdy, VP with Liquibase, the leader in database DevOps, has just published “The Day a Database Permission Change Broke the Internet: A Cloudflare Story.” His analysis details how:

  • A minor adjustment, routine in most organizations, touched a hidden part of Cloudflare’s architecture and awakened a dependency no one had considered dangerous.
  • What happened next revealed how modern systems fail today: Nodes that loaded the expanded file went dark. Nodes that loaded the old file continued to serve traffic. The network oscillated, recovering for minutes at a time before failing again, as if trapped between two different realities.
  • Once every shard of the ClickHouse cluster adopted new permissions, every file produced was oversized and every proxy that touched it entered the same panic.

The analysis clearly and compellingly details how the cascading failure occurred and the ways in which most data-driven organizations are exposed to the same risk.
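
The pattern those bullets describe (a consumer with a hard cap that treats an oversized input as fatal) is easy to sketch in the abstract. The snippet below is illustrative only, not Cloudflare’s actual code; the names and the 200-row limit are hypothetical stand-ins for the behavior the analysis describes, contrasting fail-closed crashing with keeping a last-known-good config.

FEATURE_LIMIT = 200  # hypothetical hard cap baked into the consumer

def load_features(rows: list[dict]) -> list[dict]:
    # A permission change upstream can make a metadata query return each
    # row once per database it is now visible in; deduplicate defensively.
    unique = list({r["name"]: r for r in rows}.values())
    if len(unique) > FEATURE_LIMIT:
        raise ValueError(f"{len(unique)} rows exceeds cap of {FEATURE_LIMIT}")
    return unique

def refresh(current: list[dict], incoming: list[dict]) -> list[dict]:
    try:
        return load_features(incoming)
    except ValueError:
        # Failing closed here is what turns a bad file into an outage;
        # keeping the last-known-good config lets nodes keep serving.
        return current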

He notes:

“Cloudflare is one of the most capable engineering organizations in the world. Their systems are built to survive pressure that would overwhelm most companies. Their teams live in incident response. Their infrastructure is distributed, hardened, and instrumented with extraordinary detail. Yet the event that brought them down started with a quiet change in who could read what inside a database.”

He concludes by noting that the only real path forward is a new level of discipline at the data layer: databases must be governed with the same rigor applied to application pipelines. He closes with specific, vendor-neutral recommendations.

Hell Has Frozen Over… iPhones Can Now AirDrop To Android Users

Posted in Commentary with tags on November 20, 2025 by itnerd

Google announced today Quick Share, a new cross-platform feature that allows file sharing between iPhones and Pixel 10 devices:

We built this with security at its core, protecting your data with strong safeguards that were tested by independent security experts. It’s just one more way we’re bringing better compatibility that people are asking for between operating systems, following our work on RCS and unknown tracker alerts.

We’re looking forward to improving the experience and expanding it to more Android devices.

This addresses a major pain point that has always bugged iPhone and Android users. And when it spreads to more Android phones, this will be huge. I have to ask if you’re as surprised as I am that this is even a thing. Post a comment below and share your thoughts.

Samsung’s Must-Have Tech Picks to Wrap Up 2025

Posted in Commentary with tags on November 20, 2025 by itnerd

Here’s Samsung’s top tech that’s perfect for someone on your gift-giving list. There’s a wide variety of options for everybody:

Tablets for Productivity & Fun 

  • Galaxy Tab A11+ – Premium design, 256GB storage (expandable to 2TB), and Galaxy AI features like Gemini, Circle to Search, and Solve Math. 
  • Galaxy Tab S10 Lite – Includes S Pen, 256GB storage (expandable to 2TB), and Galaxy AI tools such as Handwriting Help, Object Eraser, and Generative Edit. 

Smartwatches to Track Your Health 

Buds for Your Everyday Audio 

  • Galaxy Buds3 Pro – Comfortable fit, active noise cancellation, plus real-time Interpreter & Live Translate. 

Foldables for Maximum Productivity & Style 

  • Galaxy Z Flip7 – 200MP camera, slim design, expansive unfolding screen, and Galaxy AI features including Generative Edit and Handwriting Help. 
  • Galaxy Z Fold7 – 200MP camera, slim design, massive screen real estate, and full Galaxy AI suite for ultimate multitasking. 

Galaxy S25 Series: Phones That Do It All 

  • Galaxy S25 Edge – 200MP camera, slim design, and Galaxy AI features. 
  • Galaxy S25 & S25+ – Snapdragon 8 Elite, long-lasting battery, and Galaxy AI tools. 
  • Galaxy S25 Ultra – Includes S Pen, 200MP camera, 100x Space Zoom, long-lasting battery, and Galaxy AI for productivity and creativity.  

Adistec and Fortra Announce Strategic Partnership to Accelerate Cybersecurity Growth Across Latin America

Posted in Commentary with tags on November 20, 2025 by itnerd

Adistec, a leading value-added distributor with 20+ years of experience developing IT channels across Latin America, is excited to announce a strategic alliance with global cybersecurity leader Fortra to expand the availability of its cybersecurity portfolio throughout all regional LATAM operations. This distribution agreement positions Adistec as a key regional enabler for Fortra’s go-to-market strategy.

Fortra offers an integrated ecosystem of advanced offensive and defensive cybersecurity solutions that help organizations break the cyber attack chain. Its portfolio includes data security technologies like DLP, DSPM, and data classification; brand protection; and offensive security products such as Core Impact, Cobalt Strike, and Outflank Security Tooling.

A Strategic Alliance to Transform LATAM Cybersecurity Readiness

The integration of Fortra’s solutions into Adistec’s regional portfolio gives partners access to a cohesive suite designed to support multiple security requirements — from offensive validation of defenses to regulatory compliance and protection of sensitive data.

Marcelo Gardelin, Strategic Alliance Director for Adistec, adds: “Our collaboration with Fortra reinforces Adistec’s mission to strengthen digital resilience across Latin America. By combining our value-added capabilities with Fortra’s industry-leading technologies, we empower partners to deliver continuous, integrated security to organizations of every size.”

Stuut Technologies Raises $29.5 Million Series A Led by Andreessen Horowitz to Automate Accounts Receivable Work

Posted in Commentary with tags on November 20, 2025 by itnerd

Stuut Technologies, the first AI platform that automates accounts receivable work for companies, today announced it has raised $29.5 million in combined Series A funding led by Andreessen Horowitz, with participation from Activant Capital, Khosla Ventures, 1984.vc, Page One Ventures, Vesey Ventures, Carya Venture Partners, and Valley Ventures. Seema Amble from Andreessen Horowitz and Steve Sarracino from Activant Capital will join the board. The funding will accelerate product development and expand Stuut’s autonomous accounts receivable capabilities for mid-market and enterprise companies across six key functionalities: collections, payments, cash application, deductions, credits, and disputes.

Companies lose up to 5% of EBITDA because AR teams spend their days chasing customers, logging into portals, and matching payments by hand. This problem is particularly acute for manufacturers, distributors, CPG, logistics, outsourced services, and medical device organizations with complex customer relationships and high transaction volumes. Traditional software has tried to help for years, but it all hits the same wall: it can’t actually do the work—it just gives humans tools to do it themselves.

Stuut is the AI coworker that knows every customer across your entire cash process, helping businesses collect 40% more revenue on time by doing their AR work faster and better than manual processes. Unlike traditional accounts receivable platforms that require 6-18 months to implement and constant human oversight, Stuut executes complete workflows independently while integrating seamlessly with existing ERP systems in under a week.

What Stuut Does:

  • Actually Does the Work: AI handles customer outreach, payment matching, dispute resolution, and portal management autonomously. Learns each customer’s patterns and executes complete workflows from start to payment across all formats (SMS, email, voice).
  • Knows Each Customer: Context travels across collections, payments, cash application, and deductions—remembering every interaction. Gets smarter with each customer interaction and applies learning to future decisions across your entire process. Credits and disputes capabilities are coming soon.
  • Fast Implementation: Up and running in days versus 6-18 months for traditional software. Integrates into existing systems without disruption with immediate results that compound over time.
  • Proven Results: 40% reduction in overdue balances and 70% reduction in manual tasks. Customers including ZoomInfo, Bishop Lifting, Honeywell, and PerkinElmer are live and collecting immediately with 3-4 day implementation.