Archive for DH2i

2025 Predictions: Self-Optimizing Clusters, Cross-Cloud HA, & Enhanced Security and Isolation

Posted in Commentary on November 22, 2024 by itnerd

When it comes to AI and High Availability (HA) Clustering, the synergy between AI’s capabilities and HA’s needs is expected to drive more advanced, resilient, and self-managing clusters. Here are a few predictions from Don Boxley, CEO and Co-Founder, DH2i on how this convergence will shape the future.

1.)  Self-Optimizing Clusters – AI will enable HA clusters to self-optimize by analyzing workload patterns, resource usage, and performance metrics in real-time. This means that clusters can automatically adjust resource allocation, distribute workloads more evenly, and maintain optimal performance without human intervention, even under fluctuating loads.

“Managing HA clusters manually often leads to inefficiencies, with resources sitting idle during low usage and systems struggling to keep up under peak loads.”

“AI eliminates these inefficiencies by continuously analyzing workloads and resource usage, allowing clusters to self-optimize and maintain peak performance without manual oversight.”
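The self-optimizing idea can be reduced to a toy sketch. This is an illustration of the concept, not DH2i's implementation: given per-node utilization readings, shift load from the busiest node toward the idlest until the spread falls within a tolerance.

```python
# Illustrative only: rebalance workloads when node utilization drifts apart.

def rebalance(node_load: dict[str, float], threshold: float = 0.2) -> dict[str, float]:
    """Shift load from the busiest node to the idlest until the spread
    between them falls within `threshold` (as a fraction of capacity)."""
    load = dict(node_load)
    while True:
        busiest = max(load, key=load.get)
        idlest = min(load, key=load.get)
        spread = load[busiest] - load[idlest]
        if spread <= threshold:
            return load
        shift = spread / 2  # move half the gap each pass; converges quickly
        load[busiest] -= shift
        load[idlest] += shift
```

A real system would of course weigh migration cost, affinity rules, and predicted (not just current) demand, but the feedback loop — measure, compare, redistribute — is the same.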

2.)  Cross-Cloud High Availability – As organizations adopt multi-cloud strategies, AI-driven HA clustering will help maintain HA across different cloud environments by managing clusters that span multiple providers. AI-driven HA clustering will also leverage adaptive load balancing, where AI learns usage patterns, traffic surges and analyzes performance across providers to intelligently distribute workloads across nodes. This approach will minimize latency and prevent bottlenecks, keeping HA clusters performant and responsive.

“Organizations relying on multi-cloud strategies frequently encounter challenges in ensuring consistent performance and availability across providers, leading to latency and bottlenecks.”

“AI simplifies cross-cloud HA by dynamically analyzing traffic and distributing workloads intelligently across providers, ensuring seamless performance and responsiveness.”
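A minimal sketch of the adaptive load-balancing idea (provider names invented for illustration): weight each provider inversely to its observed latency, so slower providers receive proportionally less traffic.

```python
# Illustrative only: latency-aware traffic weighting across cloud providers.

def traffic_weights(latency_ms: dict[str, float]) -> dict[str, float]:
    """Weight each provider by inverse observed latency (assumed positive),
    normalized so the weights sum to 1."""
    inverse = {p: 1.0 / ms for p, ms in latency_ms.items()}
    total = sum(inverse.values())
    return {p: w / total for p, w in inverse.items()}
```

The learning component the prediction describes would continually refresh these latency observations and fold in traffic-surge patterns; this sketch shows only the distribution step.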

3.)  Enhanced Security and Isolation – AI-powered monitoring will enable HA clusters to detect unusual behaviors that may signify security breaches or potential insider threats. By identifying anomalies, AI can isolate affected nodes or reroute traffic away from potential threats, enhancing the security and reliability of HA clusters.

“Traditional monitoring tools often miss subtle threats or fail to respond quickly enough, leaving HA clusters vulnerable to breaches and downtime.”

“AI-powered monitoring detects anomalies in real-time and isolates threats immediately, ensuring the security and reliability of high availability clusters without delays.”
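The anomaly-detection piece can be illustrated with a simple statistical check (a production system would use far richer models): flag a node whose current metric deviates from its own history by more than three standard deviations, then isolate it or reroute around it.

```python
# Illustrative only: z-score anomaly check on a per-node metric.
from statistics import mean, stdev

def is_anomalous(history: list[float], current: float, z_max: float = 3.0) -> bool:
    """Return True when `current` lies more than `z_max` standard
    deviations from the historical mean of this node's metric."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_max
```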

Keeping Data and Systems Secure and Available on Black Friday and Cyber Monday

Posted in Commentary on November 7, 2024 by itnerd

Black Friday and Cyber Monday are just around the corner, and I don’t know about you, but in addition to being excited about the potential holiday shopping deals, I always feel for the IT professionals who are tasked with keeping their systems and data secure and available. This continues to become an exponentially more difficult task as cyber-criminals become increasingly aggressive and sophisticated. I can only imagine how stressful this responsibility is at this time of year! 

Don Boxley, CEO and Co-Founder of DH2i and DeeDee Kato, Vice President of Corporate Marketing for Foxit had this to say about these important days and critical topics: 

Don Boxley, CEO and Co-Founder, DH2i

“Let’s talk about VPNs – you know, those tools so many have relied on for secure online connections? Well, here’s the thing – they’re not quite the safety blanket they were back when they were invented over two decades ago. They actually have some serious flaws – like they do not protect your anonymity, they won’t protect you from malware or phishing attacks, and they serve as an access point that can be leveraged to compromise your entire, unsegmented network. Additionally, VPN services can access your personal info (PII), see your IP address, and see what websites you visit and what you do while you are there – and that is just a start! During crazy shopping times like Black Friday, hackers have had a field day with VPNs – messing with prices, running scams, and basically sneaking into business systems like they own the place. 

Here’s the good news! Enter Software-Defined Perimeter (SDP) – think of it as your super-smart security guard who trusts no one (in a good way). Instead of just letting people in because they have the right password, SDP does a full background check – every… single… time. It’s like having a bouncer who looks at every ID, does pat-downs, and even checks everyone’s shoes before letting anyone into the club.

What makes SDP really shine is how it keeps an eagle eye on everything happening in your network, sets up specific rules for who can access what, and basically builds virtual walls between different parts of your system. So, when the holiday shopping chaos hits, businesses can breathe easier knowing their own and their customers’ data is not just “VPN secure,” but actually secure and truly locked down.”
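The "verify every single time" behavior can be reduced to a toy rule (users and resources below are hypothetical): deny by default, and allow a request only when identity, device posture, and an explicit entitlement all check out on that request.

```python
# Illustrative only: per-request, default-deny authorization in the
# spirit of a software-defined perimeter. Names are invented.

ALLOWED = {("alice", "orders-db"), ("bob", "inventory-api")}

def authorize(user: str, device_healthy: bool, resource: str) -> bool:
    """Deny unless the device passes posture checks AND the
    (user, resource) pair is explicitly permitted — checked every time,
    never cached from a previous 'trusted' session."""
    return device_healthy and (user, resource) in ALLOWED
```

Contrast this with a VPN, where passing the perimeter check once typically grants broad network reach; here every request re-earns its access.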

DeeDee Kato, Vice President of Corporate Marketing, Foxit

“PDFs are oftentimes the unsung heroes, the workhorses in the background, quietly enabling businesses to streamline communication, enhance accessibility, and ensure consistency. Certainly, on Black Friday and Cyber Monday this is true — as the modest PDF makes it possible for retailers/e-tailers to better manage and roll-out large-scale promotions, expedite customer communications, and ensure operational efficiency. From product catalogs, gift cards, and digital coupons to receipts, shipping confirmations, and return policies, PDFs enable businesses to provide customers with clear, consistent, easily accessible and downloadable information across devices. 

My advice for retailers/e-tailers at this make-or-break time of year is to make the most of PDFs during these shopping days. They must choose a PDF solution that optimizes PDFs for mobile and web viewing to accommodate customers browsing on various devices. They must also be able to compress promotions, order summaries, gift guides, and the like for faster download speeds without compromising readability, reducing bounce rates during peak traffic. And, last but certainly not least, retailers/e-tailers must organize PDFs logically – with clear filenames and well-placed links to enhance the customer experience – especially during high-stress moments when users need quick access to coupons or order details. 

Of course, let’s not forget that security is essential on Black Friday and Cyber Monday – yes, especially for PDFs, which can become a doorway for phishing and fraud. Retailers/e-tailers must employ PDF solutions that allow for password protection, encryption, and even secure digital signatures – to protect not only their customers’ personal information, but their employees’ as well. 

So to retailers/e-tailers this year, I say: make PDF efficiency and security your mantra. In doing so you will deliver a premium experience for your customers and employees – when it matters most.”

DH2i to Showcase DxEnterprise Smart High Availability Software and Its SQL Server Operator for Kubernetes at PASS Data Community Summit 2024

Posted in Commentary on October 28, 2024 by itnerd

DH2i today announced it will be showcasing its DxEnterprise Microsoft SQL Server high availability software for instances and containers at this year’s PASS Data Community Summit. PASS Summit 2024 will bring thousands of data platform professionals together for an in-person event in Seattle, WA, from November 4-8, 2024.

DH2i Booth #204:

Attendees will have the opportunity to experience DH2i’s industry-leading DxEnterprise software firsthand and learn how to drive Microsoft SQL Server downtime and data loss to near-zero across on-prem, remote, cloud, and hybrid environments – all while eliminating management complexity. DH2i will also show how to easily set up multi-site clusters for disaster recovery (DR), manage Windows and Linux SQL Server side-by-side in the same cluster, and easily and securely usher in the era of containers with DxOperator by DH2i, the industry’s preferred SQL Server Operator for Kubernetes (K8s). 

In-Booth Demos & Raffles, DH2i Booth #204:

  • Wednesday, November 6: 10:50 am 
  • Thursday, November 7: 3:10 pm 
  • Friday, November 8: 10:00 am 

Don’t Miss:

General Session: “Harness the Power of Kubernetes to Achieve Truly Cloud-Agnostic SQL Server”

  • Wednesday, November 6: 3:30 pm – 4:30 pm in Room 345-346
  • DH2i’s CTO and Co-Founder, OJ Ngo, to join Microsoft’s Principal Product Manager, Amit Khandelwal, to discuss how organizations can cost-effectively maintain “5-nines” SQL Server uptime while eliminating the risk of cloud vendor lock-in. While Kubernetes and containers offer infrastructure autonomy, they also introduce complexity and downtime risks. Ngo and Khandelwal will present a solution to these concerns by demonstrating a cloud-agnostic SQL Server environment using Kubernetes, allowing multi-cloud deployments and automatic failover across platforms. This approach accelerates digital transformation, enabling unified HA management for SQL Server across Windows, Linux, and Kubernetes in a single framework.

10-Minute Lightning Talk: “Deploy Highly Available SQL Server Containers in AKS in 3 Easy Steps”

  • Thursday, November 7 at 11:15 am in Room 343-344
  • DH2i’s CTO and Co-Founder, OJ Ngo, will present a 10-minute lightning talk to demonstrate how easy SQL Server container deployment and HA can be with the industry’s preferred SQL Server Operator for Kubernetes. Whether you’re an expert or just curious about the benefits of database containers, this session will show an easy-to-execute, 3-step approach to deploy a customizable, Always-On Availability Group in an Azure Kubernetes Service (AKS) cluster.

20-Minute Session at AWS Booth: “Deploy a SQL Server Availability Group on Amazon EKS with Ease using DH2i”

  • Wednesday, November 6 at 1:00 pm at the AWS Booth
  • DH2i’s CTO and Co-Founder, OJ Ngo, will present alongside Yogi Barot from AWS as they demonstrate an easy, operator-driven approach to deploy highly available SQL Server on Amazon Elastic Kubernetes Service and the combined ability of this solution stack to ensure the industry’s lowest downtime for SQL Server containers.

Guest Post: Navigating Microsoft SQL Server and Kubernetes in a Hybrid and Multi-Cloud Era

Posted in Commentary on February 15, 2024 by itnerd

By Don Boxley, CEO and Co-Founder, DH2i

In a business world that’s increasingly leaning on hybrid and multi-cloud environments for agility and competitiveness, DH2i’s recent launch of DxOperator couldn’t be more timely. For those managing SQL Server within Kubernetes — especially when dealing with the intricacies of operating across various cloud platforms — it is a true game changer. 

DxOperator is the result of a close relationship with the Microsoft SQL Server team, which led to the creation of a tool that is ideally suited to automate SQL Server container deployment in Kubernetes. What makes it truly unique and a stand-out in this space is DxOperator’s ability to take complex setups and make them simple — which ensures that HA and operational efficiency are easily achievable, even across multi-cloud environments.

Of course, another reason that DxOperator is in a league of its own is how it turns your specific requirements into optimized actions. DxOperator handles everything from custom pod naming to node selection with such finesse that managing SQL Server containers becomes a breeze. It’s all about making sure that your deployments are not just efficient but also best practice compliant.

Microsoft’s Rob Horrocks praised DxOperator (see announcement) for its ease-of-use and effectiveness, noting its potential to simplify complex deployments for those who might not be Kubernetes experts. DxOperator’s user-friendly nature, together with its robustness is reshaping how businesses approach database management.

“Previously, deploying this type of setup could require up to 30 minutes and numerous pages of code. However, with the DxOperator feature, it’s been streamlined to a mere 3-5 minutes and a handful of code lines. This makes the transition to K8s significantly smoother for those experienced with SQL Server but new to K8s,” Horrocks explained.

OJ Ngo, DH2i’s CTO and Co-Founder, also shared that DxOperator was built with a focus on practical automation and efficient management of SQL Server availability groups. OJ and his team met their goal with flying colors! DxOperator is the industry’s most versatile tool — aligning with Kubernetes’ best practices while meeting the modern demands of IT infrastructures, particularly in hybrid and multi-cloud scenarios.

Tailored for Hybrid and Multi-Cloud Strategies

For organizations embracing hybrid and multi-cloud models, DxOperator is a significant boon. DxOperator streamlines the deployment of SQL Server across various settings, aligning seamlessly with the scalable and adaptable characteristics of hybrid cloud approaches. The result is that businesses have the flexibility to allocate their resources more wisely and keep spending under control. Moreover, digital security is enhanced with our cutting-edge DxEnterprise with secure tunneling technology, ensuring safe and private data exchange across any network. And, at the same time, it ensures everything runs smoothly, no matter where their data and applications are hosted in the cloud.

Highlights:

  • Efficient Deployment: DxOperator facilitates quick and intelligent setup of SQL Server instances, ideally suiting the complex requirements of hybrid and multi-cloud settings.
  • High Availability: The tool ensures that your SQL Server environments are always up and running, smoothly integrating into Always On Availability Groups for continuous operation across any cloud setting.
  • Simplified Management: With DxOperator, the complexity of managing SQL Server environments is significantly reduced, freeing up IT teams to focus on strategic initiatives.

For those interested in exploring DxOperator and how it can streamline your SQL Server deployments, especially within hybrid and multi-cloud frameworks, I encourage you to check out DH2i’s website. (Click here for comprehensive guides and details on how to get started with DxOperator.) 

DH2i Announces General Availability of Revolutionary DxOperator for Streamlined SQL Server Container Deployment on Kubernetes

Posted in Commentary on February 6, 2024 by itnerd

DH2i today announced the general availability (GA) launch of DxOperator, a major advancement for Kubernetes and SQL Server integration. DxOperator is engineered to meet the growing demands of businesses seeking efficient, scalable, and highly available (HA) database environments. It is the ideal choice for customers looking to streamline their SQL Server container deployments on Kubernetes, with unparalleled ease of use, robustness, and automation capabilities.

DxOperator was meticulously developed from the ground up by DH2i in collaboration with the Microsoft SQL Server team. It is designed to automate the deployment of DxEnterprise clusters and streamline the orchestration of Microsoft SQL Server availability group (AG) workloads within Kubernetes environments. DxOperator provides extensive control to users over their SQL instances and availability groups, encompassing a wide range of functionalities. It adeptly translates user-defined directives into precise, low-level actions, ensuring deployments are not only efficient but also adhere to the best practices embedded within its logic. With features like custom pod naming, node selection and affinity, SQL AG customization, and load balancing, DxOperator is more than just a tool; it’s a gateway to deploying highly available, resilient, and scalable SQL Server containers with an unprecedented level of ease and precision. Its ability to handle complex configurations, like custom annotations, specific container specifications, and quality of service parameters, further accentuates its role as a crucial enabler for robust, production-grade SQL Server deployments in Kubernetes.

Key Features of DxOperator:

  • Efficient Deployment: Enables the rapid deployment of SQL Server instances on Kubernetes clusters with precise MSSQL-config parameters.
  • High Availability (HA): Automates the configuration of DxEnterprise clusters and the seamless integration of SQL Server instances into Always On Availability Groups (AGs).
  • Simplified Management: Reduces the complexity of managing SQL Server environments on Kubernetes, offering a user-friendly approach with minimal commands.
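To make the declarative, operator-driven workflow concrete, here is a hypothetical sketch. The field names below are invented for illustration and are not DxOperator's actual custom-resource schema: the user describes the desired availability group in a manifest, and an operator reconciles the cluster to match it.

```python
# Hypothetical illustration only — invented schema, not DxOperator's API.

def sql_ag_manifest(name: str, replicas: int = 3) -> dict:
    """Build a declarative spec describing a highly available SQL Server
    availability group; an operator watches for such objects and does the
    low-level deployment and configuration work automatically."""
    return {
        "apiVersion": "example.com/v1",  # placeholder group/version
        "kind": "SqlServerAvailabilityGroup",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,       # one primary plus secondaries
            "synchronousCommit": True,  # favor zero data loss on failover
        },
    }
```

The point of the pattern is that the user states intent (a few lines of spec) rather than scripting every step — which is what makes the "minutes instead of pages of code" claim above plausible.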

Key Benefits for Users:

  • Enhanced Productivity: DxOperator’s streamlined processes allow IT teams to focus on more strategic tasks, leaving the intricacies of deployment and management to the operator.
  • Scalability: Catering to the dynamic needs of businesses, DxOperator makes scaling SQL Server environments on Kubernetes a straightforward process.
  • Cost-Efficiency: The automation and efficiency provided by DxOperator significantly reduce the total cost of ownership for SQL Server deployments.

Getting Started with DxOperator: For more information on DH2i’s DxOperator and to begin leveraging its capabilities, interested customers can visit https://dh2i.com/dxoperator-preview/. Here, they can access a wealth of resources, including a comprehensive quick-start guide and details on obtaining a DxEnterprise developer license.

Today Is Data Privacy Day

Posted in Commentary on January 28, 2024 by itnerd

Data Privacy Day is today. Led by the National Cyber Security Alliance (NCSA), this event is a key part of a yearly global campaign focused on safety, security, and privacy. The theme for this year is “Take Control of Your Data.” It represents a worldwide endeavor to raise awareness about the importance of respecting privacy, protecting personal information, and cultivating trust.

Executives from Appdome, Datadobi, DH2i, Folio Photonics, and Mission Cloud had this to say about this important day and the incredibly important topic it represents:

Carl D’Halluin, CTO, Datadobi

“On January 28, we celebrate Data Privacy Day. Initiated in the United States and Canada in 2008 by the National Cyber Security Alliance, its aim is to raise awareness and promote privacy and data protection best practices. 

I would say the number one data privacy best practice is pretty simple: make sure you can get the right data to the right place at the right time. Wherever the data is in its lifecycle, it should be protected and only accessible as needed. Of course, this tends to be easier said than done. But, there is perhaps nothing more critical and imperative than implementing the right strategies and technologies to do so. After all, while data is an organization’s most valuable asset (in addition to its people), it also represents its greatest potential risk. 

Balancing these two aspects is key. In other words, effective data management enables you to optimize your business intelligence, make faster and smarter decisions, and gain a competitive edge, as well as better meet business requirements such as internal governance and legal mandates, external regulations, and financial obligations and goals.” 

Don Boxley, CEO and Co-Founder, DH2i

“Data privacy isn’t just important for businesses – it is a matter of corporate survival. A company can make just one small mistake, neglect one small security check-box, and the consequences can be catastrophic. One small mistake could lead to a data breach that causes legal and regulatory fines, as well as irreparable damage to the company’s reputation — a nightmare from which recovery is near-impossible.

A software-defined perimeter (SDP) solution could be the answer! Many SDP solutions are engineered to provide secure network connectivity across on-prem, cloud, and hybrid environments. SDP enables its users to transform their traditional network-based perimeter security with a more sophisticated one that creates micro-perimeters around data. SDP enables secure connections between data centers and across private and public cloud platforms without needing a VPN or direct connect, thereby significantly reducing security vulnerabilities even further. In addition, for those focused on data protection and privacy, SDP enables the ability to create secure tunnels for specific applications, as opposed to entire network access. Ideally, such a solution would be streamlined and straightforward to manage, equipped with an intuitive interface that eases the configuration, and ongoing management of secure connections. This combination — increased security, ease-of-use, and adaptability – makes SDP the ideal choice for protecting data and ensuring data privacy.”

Steve Santamaria, CEO, Folio Photonics:

“On Data Privacy Day, we are reminded of the business-critical importance of safeguarding sensitive information – both professional and personal – at a time when data breaches and cyber threats have become all too common. For data protection professionals, this should not be viewed as a gentle nudge but rather a polite – yet strong – shove toward reviewing and fortifying the technology and policies that serve as the underpinnings of your data protection strategy.

How can anyone not admire those responsible for their organization’s data protection? As we in the business know – it’s no walk in the park! The good news, of course, is that smarter and more powerful technology solutions continuously enter the marketplace, ready to take their place in the data protection professional’s arsenal. Active archives built on an optical storage foundation can offer an ideal data protection solution for several compelling reasons. Firstly, they provide a high level of security, as data stored on optical discs is read-only, rendering it resistant to cyber threats like ransomware. Optical storage is also highly durable — able to withstand physical damage from factors like magnetic fields, moisture, and temperature fluctuations, ensuring the safety of critical data. What’s more, optical storage media boasts a long lifespan, making it ideal for data archival and compliance requirements while also being cost-effective in the long term. And last but certainly not least, it can be easily air-gapped – adding a virtually impenetrable defense against a cyber-attack. 

Retrieving data from optical storage is quick and reliable due to fast read speeds, making archived data readily accessible. And if that isn’t enough — it is environmentally friendly, consuming less energy and having a lower carbon footprint compared to alternative storage options.”

Alan Bavosa, VP of Security Products at Appdome:

In the spirit of Data Privacy Week, we should champion initiatives that prioritize security and resiliency.  

Protecting consumer data and privacy isn’t just about how a company uses their data internally or with partners; it is also about how that data is guarded from wider threats, such as cyber attackers. In fact, data privacy and cybersecurity are intrinsically interlinked – you can’t ensure consumer data is kept private if you don’t prioritize cybersecurity. And this includes the protections on a brand’s mobile app offering, especially as mobile stands as the dominant channel for people’s interactions, making apps an eager target for criminals.  

If brands don’t pay attention to how they protect their consumers via mobile apps, they are putting themselves at a huge commercial and reputational risk, as customers may leave. For instance, nearly three-quarters of global mobile consumers stated that they’d be likely or very likely to stop using an app – and tell their friends to stop using it too – following a data breach or upon discovering that it didn’t protect their data. 

Clearly, brands that have privacy and security built into their mobile applications stand to benefit a great deal. Not only will it address cybersecurity fears and build consumer trust, but it will put them on course to comply with regulations such as DORA (the Digital Operational Resilience Act) and the NIS 2 Directive, both of which require cybersecurity resilience.

Ryan Ries, Chief Data Science Strategist at Mission Cloud:

Data privacy is a very difficult topic to understand because there are so many rules and regulations that are constantly changing and differ from state to state and country to country. People have to look at what kind of data they have and understand all the rules associated with it, which is very time-consuming and a serious endeavor. We often see customers that had this under control when they were a smaller company, but as they grow they have to really focus on ensuring they are doing the right things with the data and understanding which rules it falls under. There are so many different layers to data privacy and how you handle it: does it fall under PII, PHI, or HIPAA? Do I need to worry about GDPR or data residency? There is a lot to consider, and you need to be diligent that you are handling your data properly.

2024 Predictions In Regards To Downtime Prevention, Enhanced Cybersecurity, Kubernetes, AI And More

Posted in Commentary on December 16, 2023 by itnerd

As we approach the end of the year, it’s always intriguing to explore predictions for the upcoming period across various industries. Let’s turn our attention to the 2024 forecasts by Steve Leeper of Datadobi and Don Boxley of DH2i. Their perspectives collectively paint a comprehensive picture of an evolving landscape where data becomes a central component of intelligent business strategies and robust IT systems.

Steve Leeper, VP Product Marketing, Datadobi

“As artificial intelligence (AI) continues to weave into the fabric of modern business, the year 2024 is likely to witness a surge in the demand for enhanced data insight and mobility. Companies will need to gain insight into their data to strategically feed AI and machine learning platforms, ensuring the most valuable and relevant information is utilized for analysis. This granular data insight will become a cornerstone for businesses as they navigate the complexities of AI integration. At the same time, the mobility of data will emerge as a critical factor, with the need to efficiently transfer large and numerous datasets to AI systems for in-depth analysis and model refinement. The era of AI adoption will not just be about possessing vast amounts of data but about unlocking its true value through meticulous selection and agile movement.

The trajectory of storage technology is also poised for a significant shift as the year 2024 approaches, with declining flash prices driving a broad-scale transition towards all-flash object storage systems. This shift is expected to result in superior system performance, catering adeptly to the voracious data appetites and rapid access demands of AI-driven operations. As flash storage becomes more financially accessible, its integration into object storage infrastructures is likely to become the norm, offering the swift performance that traditional HDD-based object storage lacks and the scalability that NAS systems lack. This evolution will be particularly beneficial for handling the large datasets integral to AI workloads, which necessitate rapid throughput and scalability. Consequently, a data mobility wave may be seen, with datasets and workloads being transferred from outdated and sluggish storage architectures to cutting-edge all-flash object storage solutions. Such a move is anticipated not just for its speed but for its ability to meet the expanding data and performance requisites of burgeoning AI initiatives.

Also importantly, in 2024, the landscape of data management will undergo a profound transformation as the relentless accumulation of data heightens the necessity for robust management solutions. According to Gartner’s projections, by 2027, it is expected that no less than 40% of organizations will have implemented data storage management solutions to classify, garner insights, and optimize their data assets, a significant leap from the 15% benchmark set in early 2023. This trend is likely to be propelled by the relentless expansion of data volumes, outpacing the rate at which companies can expand their IT workforce, thus elevating the indispensability of automation for data management at scale.

2024 is set to be a pivotal time for data management, with a shift towards API-centric architectures for meshed applications gaining traction. As customers increasingly demand that data management vendors offer API access to their functionalities, we are likely to see a mesh of interconnected applications seamlessly communicating with one another. Imagine ITSM (IT Service Management) and/or ITOM (IT Operations Management) software triggering actions in other applications via API calls in response to tickets — this interconnectedness will become commonplace. The trend towards API-first strategies will likely accelerate, driven by the desire to embed data management more integrally within the broader IT ecosystem. As a result, the development of self-service applications will flourish, enabling automated workflows and facilitating access to data management services without the need for manual oversight. This move towards a more integrated, automated IT environment is not just anticipated; it is imminent, reflecting a broader shift towards efficiency and interconnectivity within the technological landscape.
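The ticket-triggered API pattern described above can be sketched as a toy event dispatcher (all names below are invented for illustration): an ITSM ticket event is routed to a registered handler, which would in practice make an authenticated API call into the data management service.

```python
# Illustrative only: a minimal ticket-event dispatcher. Event and
# dataset names are hypothetical.

HANDLERS = {}

def on_ticket(event_type):
    """Register a handler function for a given ticket event type."""
    def register(fn):
        HANDLERS[event_type] = fn
        return fn
    return register

@on_ticket("storage.capacity_warning")
def archive_cold_data(ticket):
    # A real handler would call the data-management vendor's API here;
    # this sketch just returns the action it would take.
    return f"archive dataset {ticket['dataset']}"

def dispatch(ticket):
    """Route an incoming ticket to its registered handler."""
    return HANDLERS[ticket["type"]](ticket)
```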

Finally, as we look toward 2024, we predict that an intensified focus on risk management will become a strategic imperative for companies worldwide.  Governance, risk, and compliance (GRC) practices are anticipated to receive heightened attention as companies grapple with the complexities of managing access to data, aging data, orphaned data, and illegal/unwanted data, recognizing these as potential vulnerabilities. Moreover, immutable object storage and offline archival storage will continue to be essential tools in addressing the diverse risk management and data lifecycle needs within the market.”

Don Boxley, CEO and Co-Founder, DH2i

“In 2024, there will be four key trends. To start, the increasing complexity of IT infrastructures, especially with the widespread adoption of containerized environments like Kubernetes, will drive the need for more sophisticated downtime prevention solutions. These systems will leverage predictive analytics to identify potential issues before they cause system failures. Automation will play a key role, with features like automatic failover processes that ensure continuous operation without manual intervention. The focus will be on creating solutions that are not only reactive in addressing issues but also proactive in preventing them.

Next, the cybersecurity landscape is rapidly evolving, with more sophisticated and frequent attacks. In response, the adoption of advanced network technologies like software-defined perimeter (SDP) and Zero Trust Network Access (ZTNA) will become critical in 2024. These technologies offer a more dynamic and adaptive approach to network security compared to traditional VPNs. SDP provides a way to create secure, context-aware connections between users and network resources, effectively reducing the attack surface. ZTNA, on the other hand, operates on the principle of “never trust, always verify,” ensuring that access to network resources is strictly controlled and monitored. These technologies will be especially important for protecting multi-cloud environments and remote work infrastructures.

And, as organizations continue to diversify their IT portfolios, the need for solutions that offer cross-platform compatibility and seamless integration will grow in 2024. These solutions will need to support a variety of environments – from cloud services provided by different vendors to on-premises data centers and emerging container technologies. The key will be in providing a unified management interface that can handle various systems, offering efficient and coherent control over diverse IT assets. This trend is not just about compatibility; it’s about integration that is deep enough to allow different systems to work together harmoniously, enhancing overall system efficiency and reducing operational complexities.

Last but not least, now that it’s been demonstrated that SQL Server Kubernetes (K8s) clusters perform much faster on physical servers than on virtual machines, solutions will be developed in 2024 that enable customers to deploy SQL Server Availability Groups in K8s environments in seconds, with greater customization. These solutions will make it easy for customers to realize reductions in OS licensing, CPU clock cycles, and memory when using K8s as opposed to VMs. They will also offer cross-platform compatibility and seamless integration with existing non-K8s environments, and will take full advantage of Zero Trust networking technology to allow multi-region/multi-cloud compatibility for true cloud independence.”

Guest Post: Like the VMs Before Them – SQL Server Containers Are Exploding in Use

Posted in Commentary with tags on November 10, 2023 by itnerd

By Don Boxley Jr

In the tech industry, we often see game-changing trends that redefine how we handle computing. Sometimes, these trends escalate to full-blown explosions of technology usage. It happened with virtual machines (VMs), and now it’s happening with SQL Server containers.

A Look Back at the Rise of Virtual Machines 

Think back to around 15 years ago when VMs emerged on the scene. Organizations embraced VMs virtually overnight due to the numerous advantages they offered. For instance, VMs allowed for the consolidation of multiple physical servers onto a single, high-capacity host. This consolidation translated into substantial cost savings by reducing hardware requirements, enhancing resource utilization, and minimizing the physical footprint of data centers.

In addition to cost efficiency, VMs provided greater flexibility and scalability. IT could now easily create and manage VMs, enabling rapid deployment of apps and services. This increased agility and enabled organizations to respond swiftly to changing market dynamics and gain a competitive edge. 

Last but not least, VMs improved disaster recovery (DR) and business continuity efforts. Capabilities such as snapshots and virtualization-based backup simplified recovery from failures and data loss, helping to ensure uninterrupted operations. 

Shifting the Conversation to SQL Server Containers

Today, there is a similar excitement brewing, but this time, it is focused on containers – especially in terms of their significant impact on Microsoft SQL Server environments!

Rob Horrocks, Microsoft’s Senior Cloud Solution Architect, recently shed light on the SQL Server container phenomenon: “The trajectory of SQL Server containers is reminiscent of the VM explosion we witnessed over a decade ago. As with VMs then, containers now offer a compelling value proposition for modern enterprises – agility, efficiency, and scalability.”

During a recent presentation, Horrocks walked us through a live demo of migrating a SQL Server 2022 instance from Windows to Kubernetes (K8s). He artfully employed Contained Availability Groups (AGs) and DH2i’s DxEnterprise Smart High Availability Clustering software to achieve this Windows Server to K8s migration. The demonstration showcased how quick and easy SQL Server container modernization can actually be.

Why Are SQL Server Containers the New Buzz?

  • Unlock Optimized SQL Server: Containers for a SQL Server instance offer a multitude of benefits, including improved performance, reduced operating costs, and one-click deployments.
  • Unify SQL Server Environments: With tools like DxEnterprise, you can achieve unified management for SQL Server on Linux, Windows, and Kubernetes. This multi-platform software can manage Windows, Linux, and multiple SQL Server containers cohesively, ensuring an uninterrupted workflow.
  • Absolute Minimum Downtime: With DxEnterprise’s unique failover capabilities, you get industry-leading failover performance with fully automatic failover for SQL Server AGs in Kubernetes. These are all the ingredients needed to take production SQL Server workloads to containers and reap the full benefits.
  • Simplified Modernization: Migration has often been a daunting task. But with DxEnterprise software, even the most intricate infrastructure and configurations can smoothly transition to Kubernetes in mere minutes.
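The automatic-failover bullet above can be sketched generically. To be clear, this is not DxEnterprise’s actual mechanism — just a simplified, assumed pattern in Python: a monitor probes the primary a few times and, if every probe fails, promotes a healthy replica:

```python
def probe(node_state):
    """Stand-in health probe; a real cluster would check the AG replica."""
    return node_state["healthy"]

def elect_primary(nodes, current, failures_to_trip=3):
    """Generic automatic-failover loop (illustrative only): keep the
    primary while any probe succeeds; after `failures_to_trip` misses,
    promote the first healthy replica."""
    for _ in range(failures_to_trip):
        if probe(nodes[current]):
            return current  # primary answered a probe; no failover
    for name, state in nodes.items():
        if name != current and probe(state):
            return name  # promote this replica to primary
    raise RuntimeError("no healthy replica available")

nodes = {"node-a": {"healthy": False}, "node-b": {"healthy": True}}
print(elect_primary(nodes, "node-a"))  # prints "node-b"
```

The requirement for consecutive misses before tripping is what separates a deliberate failover policy from flapping on a single dropped probe.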

Given these powerful advantages, it’s clear why the SQL Server containers vs. virtual machines debate is all but settled. SQL Server containerization is on the rise for good reason, and, as Rob Horrocks theorized, this might just be the tip of the iceberg.

The Future is Contained

The road to digital transformation can be paved with challenges. However, with the rise of SQL Server containers, we are better equipped than ever to overcome them. 

For years, DH2i has stood at the forefront of SQL Server container technology, anticipating this future trend and earning the rightful position as an innovator in the field of SQL Server software. By investing in technologies like DxEnterprise, enterprises can not only harness our extensive experience but also future-proof their SQL Server deployments and enjoy the immense benefits of modernization.  

Bottom line, it’s crucial to recognize that the SQL Server container revolution isn’t merely a passing fad. It represents the inevitable future of SQL Server. As we move forward, the tools and technologies supporting this transformation will continue to advance, offering unprecedented ease, efficiency, and optimization. 

Embrace this evolution now, or risk falling behind in the competitive landscape because thousands of organizations are already evaluating SQL Server containers and even deploying them in production. The time to act is now!

DH2i to Showcase DxEnterprise Smart High Availability Clustering Software at PASS Data Community Summit 

Posted in Commentary with tags on November 6, 2023 by itnerd

DH2i, the world’s leading provider of always-secure and always-on IT infrastructure solutions, today announced it will showcase its DxEnterprise smart high availability clustering software at next week’s PASS Data Community Summit (November 14-17, Seattle, WA).

Visitors to DH2i’s Booth 101 will have the opportunity to see firsthand how DxEnterprise enables seamless SQL Server modernization with containers, unlocking your path to zero downtime and providing fully automatic database-level failover for Availability Groups (AGs) in Kubernetes.

In addition, don’t miss OJ Ngo, CTO at DH2i, and Amit Khandelwal, Senior Program Manager at Microsoft, as they present “Containerize with Ease with Cross-Platform SQL Server AGs” on November 16, 10:15 AM-11:30 AM, in Room 400.

This talk will explore the motivators behind the SQL Server container explosion taking place across the globe. Organizations are in pursuit of enhanced scalability and portability, reduced management and operating costs, and peak utilization, amongst a wealth of other benefits. However, one of the greatest hurdles that stands in their way is high availability (HA). Attendees will learn how to break through the traditional constraints of Always On Availability Groups (AGs) and perfectly unify Windows, Linux, and Kubernetes in a single, highly available AG. To learn more about this session, please visit: https://passdatacommunitysummit.com/sessions/2078/

To learn more about the PASS Data Community Summit and register to attend, please visit: https://passdatacommunitysummit.com/. You can use code: DH150CP for a $150 discount.

Not attending the PASS Data Community Summit? Learn more about DH2i’s approach to smart high availability technology with a one-on-one demo today. Sign up here: https://dh2i.com/demo/

Today Is World Backup Day

Posted in Commentary with tags , , on March 31, 2023 by itnerd

World Backup Day is today and it was started by a group of concerned internet users and tech enthusiasts in 2011. The initiative was led by Ismail Jadun, a digital strategy consultant from Ohio, and his friends. They were inspired to create World Backup Day after reflecting on the fact that many people were not backing up their data regularly, and as a result, were putting themselves and their organizations at risk. The first World Backup Day was observed on March 31, 2011, and since then, it has become an annual event that encourages people to take action to protect their digital estate.

Data loss can occur due to a number of reasons such as hardware failure, software corruption, malware attacks, natural disasters, and even human error. The amount of money that businesses lose due to data loss can vary depending on various factors such as the size of the business, the industry, and the type of data lost. However, studies suggest that the cost of data loss can be significant, with some estimates ranging from thousands to millions of dollars per incident. And one can imagine the devastating consequences if an organization like a hospital, emergency responders, or military agency lost access to critical data. 

Datadobi’s Carl D’Halluin, DH2i’s Don Boxley, and Folio Photonics’ Steve Santamaria had this to say about this important day and why it affects virtually every corner of the datacenter, across virtually every industry, around the world:

Carl D’Halluin, Chief Technology Officer (CTO), Datadobi:

“Failing to backup your data can have catastrophic consequences, as a single hardware failure, cyber-attack, or natural disaster can wipe out all your valuable information, leaving you with no way to recover it. This means that years of hard work can all be lost in an instant, with no chance of retrieval. Even the cost of losing just a portion of your important data can be immeasurable, with potential financial, legal, and reputational implications that can last for years. 

Identifying the vital data that requires protection should be the first step in the process. But even if you know and can ‘describe’ what data must be protected, finding it has always been another matter – and you cannot backup what you cannot find. To effectively address this enormous and complicated undertaking, users should look for a data management solution that is agnostic to specific vendors and can manage a variety of unstructured data types, such as file and object data, regardless of whether they are stored on-premises, remotely, or in the cloud. The solution should be capable of evaluating and interpreting various data characteristics such as data size, format, creation date, type, level of complexity, access frequency, and other specific factors that are relevant to your organization. Subsequently, the solution should allow the user to organize the data into a structure that is most suitable for the organization’s particular needs and empower the user to take action based on the analyzed data. In this case, backup the necessary data to the appropriate environment(s). And, if necessary, the solution should enable the user to identify data that should be organized into a ‘golden copy’ and move that to a confidential, often air-gapped environment.
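As a toy illustration of that kind of data-characteristic evaluation, the Python sketch below buckets files into backup tiers by type and last-access age. The extensions, thresholds, and tier names are assumptions made up for the example, not features of any particular product:

```python
import time
from pathlib import Path

# Illustrative assumptions, not taken from any specific solution:
GOLDEN_EXTENSIONS = {".db", ".sql"}      # data worth an air-gapped "golden copy"
COLD_AGE_SECONDS = 180 * 24 * 3600       # untouched ~6 months -> archive tier

def classify(path: Path, now=None):
    """Sort a file into a backup tier by type and last-access age."""
    now = now or time.time()
    if path.suffix.lower() in GOLDEN_EXTENSIONS:
        return "golden-copy"
    age = now - path.stat().st_atime
    return "archive" if age > COLD_AGE_SECONDS else "standard-backup"

def plan_backup(root: str):
    """Bucket every file under `root` into tiers for backup planning."""
    tiers = {}
    for p in Path(root).rglob("*"):
        if p.is_file():
            tiers.setdefault(classify(p), []).append(str(p))
    return tiers
```

A real data management platform would weigh many more attributes (format, complexity, access frequency, business relevance), but the shape of the task — evaluate characteristics, then route each item to the right protection environment — is the same.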

To sum it up… Don’t let the nightmare of data loss become your reality – always backup your data.”

Don Boxley, CEO and Co-Founder, DH2i

“World Backup Day is an annual event that is intended to raise awareness of the importance of data backup and protection. It serves as a reminder for individuals and organizations to take proactive measures to safeguard critical data against unexpected incidents that can result in data loss, such as hardware or software failure, cyber-attacks, natural disasters, and human error. And, while the exact cost can vary depending on factors such as the size of the organization, the type and amount of data lost, the cause of the loss, and the duration of the downtime, according to various studies, it can cost organizations upwards of billions of dollars each year.

That’s why, for systems architects and IT executives alike, zero is the ultimate hero. To achieve zero downtime, zero security holes, and zero wasted resources, they are taking a multi-pronged approach to data protection, layering on smart high availability (HA) clustering and software-defined perimeter (SDP) technology that enables them to securely connect and failover enterprise applications — from anywhere, to anywhere, anytime.

On World Backup Day and all year long, it is critical to remember that businesses that invest in data protection are better equipped to navigate unexpected data loss events, maintain regulatory compliance, and protect their critical assets and reputation. Bottom line, investing in data protection is not just smart, it’s essential for business success.”

Steven Santamaria, CEO, Folio Photonics

“The world’s most valuable resource is data, and it is of utmost importance to properly store, protect, and preserve this resource. The safekeeping of data is essential because it represents the foundation upon which many modern businesses are built, and its loss can have far-reaching consequences for organizations and individuals alike. As such, ensuring the safety and longevity of data should be a top priority for any entity that relies on this precious resource.

On World Backup Day, we are reminded of this, and the criticality of backup as one of the key safety nets against data loss, whether it’s due to technology failures, cyber-attacks, or human error. 

Today, I would offer that the most effective data protection strategy should also incorporate a data storage platform that can be securely archived in an off-site location, with the added benefit of being taken off-line and air-gapped for even greater security. This means that the storage platform is physically separated from the main network and disconnected from the internet, making it highly resistant to cyber-attacks and other forms of data breaches. In essence, a well-designed data protection strategy should prioritize both physical and digital security to safeguard critical data and ensure business continuity.”

Molly Presley, SVP of Marketing at Hammerspace:

“The coming year will be about automation to help identify and protect data assets. Human-managed processes are challenging to scale as the number and variety of data-creating devices continually increase. As a result, setting data protection services at a global level that automatically apply policies that meet corporate governance compliance requirements will be increasingly important.

Automation will include identifying newly created data on any infrastructure in the global data environment, automating controls on data copy creation, and automating data services to ensure global protection on any infrastructure.”

Darren Yablonski, Senior Director of Sales Engineering leading teams in Canada, U.S. and LATAM at Commvault:

“As the sophistication of cybercriminals has changed over the last few years, so too has data protection ­­— significantly. In the past, cybercriminals would typically gain access to an organization’s data and encrypt it so employees could no longer read it, rendering it useless to the business. This is why ensuring you have a secure copy of your data is so important. With a spare dataset to restore, business can continue as usual.

Lately, cybercriminals are increasingly moving from encrypting the data to stealing it, holding it for ransom, and threatening to publish it. This has much broader consequences, including reputational damage as well as possible loss of competitive advantage, as your customer and company data could be available to the entire industry. As a result, organizations should consider changing their approach to data protection.

Gone are the days when it was enough to just backup your data. Organizations need to prevent cybercriminals from accessing systems to begin with by leveraging, for example, an early detection system. Cyber deception can give companies the upper hand and put them one step ahead of any potential attackers. Decoys are deployed to throw attackers off course, drawing them to artificial assets instead of legitimate ones. The minute an attacker enters the decoy IT environment, the organization is notified so it can act immediately and isolate the asset. With response time significantly reduced, cybercriminals are far less likely to get into any real systems.

Backups will always remain important, because unfortunately the worst can always happen — from a natural disaster that destroys your servers to a cyberattack. However, in the face of the sophisticated cybercriminal, it’s vital to have a proactive approach to data protection in tandem with traditional reactive methods.”