World Backup Day, observed today, was started in 2011 by a group of concerned internet users and tech enthusiasts. The initiative was led by Ismail Jadun, a digital strategy consultant from Ohio, and his friends, who were inspired to create World Backup Day after reflecting on the fact that many people were not backing up their data regularly and, as a result, were putting themselves and their organizations at risk. The first World Backup Day was observed on March 31, 2011, and it has since become an annual event that encourages people to take action to protect their digital estate.
Data loss can occur for many reasons, including hardware failure, software corruption, malware attacks, natural disasters, and simple human error. How much money a business loses to data loss varies with factors such as the size of the business, the industry, and the type of data lost, but studies suggest the cost can be significant, with estimates ranging from thousands to millions of dollars per incident. And one can imagine the devastating consequences if an organization like a hospital, an emergency responder, or a military agency lost access to critical data.
Datadobi’s Carl D’Halluin, DH2i’s Don Boxley, and Folio Photonics’ Steven Santamaria had this to say about this important day and why it touches virtually every corner of the datacenter, across virtually every industry, around the world:
Carl D’Halluin, Chief Technology Officer (CTO), Datadobi:
“Failing to back up your data can have catastrophic consequences, as a single hardware failure, cyber-attack, or natural disaster can wipe out all your valuable information, leaving you with no way to recover it. Years of hard work can be lost in an instant, with no chance of retrieval. Even the cost of losing just a portion of your important data can be immeasurable, with potential financial, legal, and reputational implications that can last for years.
Identifying the vital data that requires protection should be the first step in the process. But even if you know and can ‘describe’ what data must be protected, finding it has always been another matter – and you cannot back up what you cannot find. To effectively address this enormous and complicated undertaking, users should look for a data management solution that is vendor-agnostic and can manage a variety of unstructured data types, such as file and object data, regardless of whether they are stored on-premises, remotely, or in the cloud. The solution should be capable of evaluating and interpreting various data characteristics – such as size, format, creation date, type, level of complexity, access frequency, and other factors specific to your organization. It should then allow the user to organize the data into a structure that best suits the organization’s particular needs and empower the user to take action based on the analyzed data – in this case, backing up the necessary data to the appropriate environment(s). And, if necessary, the solution should enable the user to identify data that belongs in a ‘golden copy’ and move that copy to a confidential, often air-gapped environment.
To sum it up… Don’t let the nightmare of data loss become your reality – always back up your data.”
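For readers who like to see the idea in code, here is a rough sketch of the kind of classify-then-tier workflow D’Halluin describes. To be clear, the tier names, file types, and thresholds below are invented for illustration — this is not Datadobi’s actual logic, just a toy showing how unstructured data might be bucketed by its characteristics before backup.

```python
import os
import time
from dataclasses import dataclass

@dataclass
class FileInfo:
    path: str
    size_bytes: int
    age_days: float
    extension: str

def classify(f: FileInfo) -> str:
    """Assign a backup tier based on simple, hypothetical rules."""
    critical_types = {".db", ".sql", ".xlsx", ".docx"}
    if f.extension in critical_types and f.age_days < 365:
        return "golden-copy"        # candidate for an air-gapped archive
    if f.age_days < 90:
        return "primary-backup"     # recently touched, back up frequently
    return "cold-archive"           # stale data, cheaper storage

def scan(root: str) -> dict[str, list[str]]:
    """Walk a directory tree and group files by tier (you cannot back up
    what you cannot find, so discovery comes first)."""
    tiers: dict[str, list[str]] = {}
    now = time.time()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            st = os.stat(full)
            info = FileInfo(
                path=full,
                size_bytes=st.st_size,
                age_days=(now - st.st_mtime) / 86400,
                extension=os.path.splitext(name)[1].lower(),
            )
            tiers.setdefault(classify(info), []).append(full)
    return tiers
```

A real solution would weigh many more characteristics (access frequency, complexity, compliance labels), but the shape — discover, evaluate, organize, act — is the same.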
Don Boxley, CEO and Co-Founder, DH2i:
“World Backup Day is an annual event that is intended to raise awareness of the importance of data backup and protection. It serves as a reminder for individuals and organizations to take proactive measures to safeguard critical data against unexpected incidents that can result in data loss, such as hardware or software failure, cyber-attacks, natural disasters, and human error. And, while the exact cost can vary depending on factors such as the size of the organization, the type and amount of data lost, the cause of the loss, and the duration of the downtime, according to various studies, it can cost organizations upwards of billions of dollars each year.
That’s why, for systems architects and IT executives alike, zero is the ultimate hero. And to achieve it, they are taking a multi-pronged approach to data protection. To achieve zero downtime, zero security holes, and zero wasted resources, they are layering on smart high availability (HA) clustering and software-defined perimeter (SDP) technology that enables them to securely connect and fail over enterprise applications — from anywhere, to anywhere, at any time.
On World Backup Day and all year long, it is critical to remember that businesses that invest in data protection are better equipped to navigate unexpected data loss events, maintain regulatory compliance, and protect their critical assets and reputation. Bottom line: investing in data protection is not just smart, it’s essential for business success.”
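The failover behavior at the heart of HA clustering can be sketched in a few lines. This is a deliberately minimal, hypothetical example — real products such as DH2i’s add secure SDP tunnels, quorum, and fencing, none of which is shown here — but it illustrates the core loop: count consecutive missed health checks and promote a standby past a threshold.

```python
class Cluster:
    def __init__(self, nodes, max_missed=3):
        self.nodes = list(nodes)          # nodes[0] is the active primary
        self.max_missed = max_missed
        self.missed = 0                   # consecutive failed health checks

    def primary(self):
        return self.nodes[0]

    def record_health_check(self, healthy: bool):
        """Count consecutive failures; fail over past the threshold."""
        if healthy:
            self.missed = 0
            return
        self.missed += 1
        if self.missed >= self.max_missed and len(self.nodes) > 1:
            failed = self.nodes.pop(0)
            self.nodes.append(failed)     # demote failed primary to the back
            self.missed = 0
```

The threshold matters: failing over on a single miss invites flapping on transient network blips, while waiting too long stretches downtime — which is why production systems tune this carefully.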
Steven Santamaria, CEO, Folio Photonics:
“The world’s most valuable resource is data, and it is of utmost importance to properly store, protect, and preserve this resource. The safekeeping of data is essential because it represents the foundation upon which many modern businesses are built, and its loss can have far-reaching consequences for organizations and individuals alike. As such, ensuring the safety and longevity of data should be a top priority for any entity that relies on this precious resource.
On World Backup Day, we are reminded of this, and the criticality of backup as one of the key safety nets against data loss, whether it’s due to technology failures, cyber-attacks, or human error.
Today, I would offer that the most effective data protection strategy should also incorporate a data storage platform that can be securely archived in an off-site location, with the added benefit of being taken off-line and air-gapped for even greater security. This means that the storage platform is physically separated from the main network and disconnected from the internet, making it highly resistant to cyber-attacks and other forms of data breaches. In essence, a well-designed data protection strategy should prioritize both physical and digital security to safeguard critical data and ensure business continuity.”
Molly Presley, SVP of Marketing at Hammerspace:
“The coming year will be about automation to help identify and protect data assets. Human-managed processes are challenging to scale as the number and variety of data-creating devices continually increase. As a result, setting data protection services at a global level that automatically apply policies that meet corporate governance compliance requirements will be increasingly important.
Automation will include identifying newly created data on any infrastructure in the global data environment, automating controls on data copy creation, and automating data services to ensure global protection on any infrastructure.”
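The policy-driven automation Presley describes boils down to matching newly discovered data against a global rulebook so protection is applied without a human in the loop. The sketch below is purely illustrative — the policy names, fields, and thresholds are invented, not Hammerspace’s API.

```python
# First matching rule wins; order encodes priority. Hypothetical rules only.
POLICIES = [
    (lambda d: d["classification"] == "regulated", "3-copies-encrypted"),
    (lambda d: d["size_gb"] > 100,                 "2-copies-standard"),
]

def assign_policy(dataset: dict) -> str:
    """Return the protection policy for a newly discovered dataset."""
    for predicate, policy in POLICIES:
        if predicate(dataset):
            return policy
    return "1-copy-default"   # fallback when no governance rule matches
```

Because the rules live in one global table rather than in per-device scripts, adding a new compliance requirement means editing one list — which is exactly why this scales where human-managed processes do not.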
Darren Yablonski, Senior Director of Sales Engineering leading teams in Canada, U.S. and LATAM at Commvault:
“As the sophistication of cybercriminals has changed over the last few years, so too has data protection — significantly. In the past, cybercriminals would typically gain access to an organization’s data and encrypt it so it could no longer be read, rendering it useless to the business. This is why ensuring you have a secure copy of your data is so important. With a spare dataset to restore, business can continue as usual.
Lately, cybercriminals are increasingly moving from encrypting the data, to instead holding it for ransom and threatening to publish it. This has much broader consequences, including reputational damage as well as possible loss of competitive advantage as your customer and company data could be available to the entire industry. As a result, organizations should consider changing their approach to data protection.
Gone are the days when it was enough to just back up your data. Organizations need to prevent cybercriminals from accessing systems in the first place by leveraging, for example, an early detection system. Cyber deception can give companies the upper hand and put them one step ahead of potential attackers: decoys are deployed to throw attackers off course, drawing them to artificial assets instead of legitimate ones. The minute an attacker enters the decoy IT environment, the organization is notified so it can act immediately and isolate the asset. With response time significantly reduced, cybercriminals are far less likely to get into any real systems.
Backups will always remain important, because unfortunately the worst can always happen — from a natural disaster that destroys your servers to a cyberattack. However, in the face of the sophisticated cybercriminal, it’s vital to have a proactive approach to data protection in tandem with traditional reactive methods.”
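To make the deception idea concrete, here is a toy model of the trip-wire logic Yablonski outlines: decoy assets sit alongside real ones, and any touch of a decoy raises an alert and isolates the source. The asset names are hypothetical, and real deception platforms deploy full fake hosts and services rather than a lookup table — this just shows the alert-and-isolate flow.

```python
class DeceptionGrid:
    def __init__(self, real_assets, decoys):
        self.real = set(real_assets)
        self.decoys = set(decoys)
        self.alerts = []            # (source, decoy) pairs for the SOC
        self.isolated = set()       # sources cut off from the environment

    def access(self, source: str, asset: str) -> bool:
        """Return True if access is allowed, False if blocked."""
        if source in self.isolated:
            return False
        if asset in self.decoys:
            # Touching a decoy is a high-confidence attack signal:
            # no legitimate workflow should ever reach it.
            self.alerts.append((source, asset))
            self.isolated.add(source)
            return False
        return asset in self.real
```

The key property is that decoys generate almost no false positives — legitimate users have no reason to touch them — which is what lets the response be immediate and automatic.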
2025 Predictions: Self-Optimizing Clusters, Cross-Cloud HA, & Enhanced Security and Isolation
Posted in Commentary with tags DH2i on November 22, 2024 by itnerd

When it comes to AI and High Availability (HA) Clustering, the synergy between AI’s capabilities and HA’s needs is expected to drive more advanced, resilient, and self-managing clusters. Here are a few predictions from Don Boxley, CEO and Co-Founder, DH2i on how this convergence will shape the future.
1.) Self-Optimizing Clusters – AI will enable HA clusters to self-optimize by analyzing workload patterns, resource usage, and performance metrics in real-time. This means that clusters can automatically adjust resource allocation, distribute workloads more evenly, and maintain optimal performance without human intervention, even under fluctuating loads.
“Managing HA clusters manually often leads to inefficiencies, with resources sitting idle during low usage and systems struggling to keep up under peak loads.”
“AI eliminates these inefficiencies by continuously analyzing workloads and resource usage, allowing clusters to self-optimize and maintain peak performance without manual oversight.”
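Stripped of the AI layer, the self-optimizing behavior described above reduces to comparing per-node load and shifting work from hot nodes to cool ones. The sketch below is a single greedy rebalancing step of my own devising — a real AI-driven cluster would learn from historical workload patterns rather than react to a snapshot.

```python
def rebalance(placement: dict[str, list[str]]) -> dict[str, list[str]]:
    """Move one workload from the most- to the least-loaded node."""
    nodes = sorted(placement, key=lambda n: len(placement[n]))
    coolest, hottest = nodes[0], nodes[-1]
    # Only move if it actually evens things out; otherwise leave it alone.
    if len(placement[hottest]) - len(placement[coolest]) > 1:
        workload = placement[hottest].pop()
        placement[coolest].append(workload)
    return placement
```

Run in a loop against live metrics (CPU, memory, queue depth rather than a simple workload count), this is the "no human intervention" adjustment the prediction is pointing at.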
2.) Cross-Cloud High Availability – As organizations adopt multi-cloud strategies, AI-driven HA clustering will help maintain HA across different cloud environments by managing clusters that span multiple providers. It will also leverage adaptive load balancing, where AI learns usage patterns and traffic surges and analyzes performance across providers to intelligently distribute workloads across nodes. This approach will minimize latency and prevent bottlenecks, keeping HA clusters performant and responsive.
“Organizations relying on multi-cloud strategies frequently encounter challenges in ensuring consistent performance and availability across providers, leading to latency and bottlenecks.”
“AI simplifies cross-cloud HA by dynamically analyzing traffic and distributing workloads intelligently across providers, ensuring seamless performance and responsiveness.”
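One simple way to ground the adaptive load balancing idea: keep a smoothed latency estimate per provider and route each request to the current best. The sketch below uses an exponentially weighted moving average (EWMA) — my choice for illustration, not anything DH2i has described — where a production system would also weigh cost, capacity, and traffic forecasts.

```python
class AdaptiveBalancer:
    def __init__(self, providers, alpha=0.3):
        # None means "no latency sample yet" for that provider.
        self.latency = {p: None for p in providers}
        self.alpha = alpha            # higher alpha reacts faster to change

    def observe(self, provider: str, latency_ms: float):
        """Fold a new latency sample into the provider's moving average."""
        old = self.latency[provider]
        if old is None:
            self.latency[provider] = latency_ms          # seed the average
        else:
            self.latency[provider] = (
                self.alpha * latency_ms + (1 - self.alpha) * old
            )

    def choose(self) -> str:
        """Route to the provider with the lowest smoothed latency."""
        # Providers never observed are tried first (treated as latency 0).
        return min(self.latency, key=lambda p: self.latency[p] or 0.0)
```

The smoothing is what keeps routing stable: a single slow response nudges the estimate rather than triggering an immediate stampede to another provider.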
3.) Enhanced Security and Isolation – AI-powered monitoring will enable HA clusters to detect unusual behaviors that may signify security breaches or potential insider threats. By identifying anomalies, AI can isolate affected nodes or reroute traffic away from potential threats, enhancing the security and reliability of HA clusters.
“Traditional monitoring tools often miss subtle threats or fail to respond quickly enough, leaving HA clusters vulnerable to breaches and downtime.”
“AI-powered monitoring detects anomalies in real-time and isolates threats immediately, ensuring the security and reliability of high availability clusters without delays.”
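The anomaly-then-isolate loop in this last prediction can be sketched with basic statistics. The example flags a node whose latest metric sits more than three standard deviations from its own history, then marks it isolated — a one-signal z-score stand-in for the multi-signal models a real AI monitoring system would use, with invented thresholds throughout.

```python
import statistics

class NodeMonitor:
    def __init__(self, threshold=3.0, min_samples=10):
        self.history: dict[str, list[float]] = {}
        self.isolated: set[str] = set()
        self.threshold = threshold      # z-score that counts as anomalous
        self.min_samples = min_samples  # baseline needed before judging

    def record(self, node: str, value: float) -> bool:
        """Record a metric sample; return True if the node was isolated."""
        samples = self.history.setdefault(node, [])
        if len(samples) >= self.min_samples:
            mean = statistics.fmean(samples)
            stdev = statistics.stdev(samples)
            if stdev > 0 and abs(value - mean) / stdev > self.threshold:
                self.isolated.add(node)   # reroute traffic away from node
                return True
        samples.append(value)
        return False
```

Note that each node is judged against its own baseline, not a cluster-wide one — a subtle insider threat shows up as a deviation from that node’s normal even when the cluster as a whole looks healthy.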