Operational Costs of VDI Outweigh Promised Benefits, Say IT Workers

Posted in Commentary with tags on December 10, 2024 by itnerd

Nexthink has announced research showing that Virtual Desktop Infrastructure (VDI) procurement and management processes are riddled with contradictions. The survey of 1,000 frontline IT workers found that:

  • 92% say the employee experience is an important consideration when choosing a VDI solution
  • However, 91% admit that cost considerations trump performance when choosing a provider
  • 95% believe that VDI offers an equal or better experience than desktops
  • Yet 92% confess that it has primarily been designed to make life easier for IT, rather than the end-user

The cost of these contradictions is significant: a third of organizations (31%) report daily VDI issues that require L3 VDI specialist support, and a further 40% face them weekly, as L1 and L2 support are often unable to manage VDI's complexity. This means that, despite cost control being a key driver of VDI deployment, enterprises are spending huge sums on operations and maintenance.

The confusion over VDI is further compounded by the fact that a substantial proportion of these escalated issues were not necessarily specific to VDI. Application functionality failures (54%) and slow performance (47%) accounted for two of the top three most-reported issues, neither of which is inherently a VDI problem.

To address these issues, businesses need a unified view of all VDI sessions, with end-to-end visibility and automated workflows that enable remediation with minimal interruption to the user experience. Moreover, instant insight into where problems are occurring can end the blame game between functions and enable better collaboration, both within IT departments and with the wider organization.

To find out more about the challenges of VDI management, click here for the full report.

Rogers Xfinity introduces Storm-Ready WiFi

Posted in Commentary with tags on December 10, 2024 by itnerd

Rogers today announced the launch of Rogers Xfinity Storm-Ready WiFi, an innovative new product designed to keep customers connected when there is an outage.

Rogers Xfinity Storm-Ready WiFi brings Rogers' advanced network technology together with a device that automatically switches to a cellular backup connection when there's a network or power outage. The device and its battery backup seamlessly keep customers' homes online so they can work and stream without interruption.

The launch of Storm-Ready WiFi follows the company’s recent introduction of Rogers Xfinity, a suite of in-home services that leverage Comcast’s world-class product and technology platform.

Rogers Xfinity Storm-Ready WiFi delivers:

  • Extended Battery Backup
    Keep streaming for up to four hours during a power outage with a rechargeable battery backup
  • Enhanced Reliability
    Automatically switches to Rogers cellular network when the power or primary internet service is interrupted with real-time connection and battery status notifications on the Rogers Xfinity app
  • Simple Setup and Seamless Integration
    Ready in minutes and seamlessly integrates with Rogers Xfinity Internet
  • Stronger WiFi Coverage
    Device provides enhanced coverage, doubling as a WiFi extender for everyday use, making it Rogers' best WiFi Boost Pod ever

Rogers Xfinity Storm-Ready WiFi is now available for customers in British Columbia, part of Rogers' commitment to deliver innovative products to Western Canada, and will be rolling out across the country in February. Customers can pre-order their Storm-Ready WiFi device today or visit Rogers.com for more information.

MacWeb Unleashes Bare Metal Mac Cloud Services Featuring The M4 and M4 Pro Apple Silicon

Posted in Commentary with tags on December 10, 2024 by itnerd

MacWeb, a provider of on-demand, bare-metal cloud services for Apple developers and IT teams, has launched three dedicated Mac mini configurations powered by Apple’s latest high-performance M4 and M4 Pro chips. These new offerings, based on the world’s fastest CPU core, provide developers with unparalleled performance, scalability, and affordability — enabling them to accelerate development workflows and boost productivity.

MacWeb’s new Mac mini cloud service offers three tiers to meet the diverse needs of power users:

  • MacWeb Base M4: Perfect for providing virtual remote desktops. Priced at only $99 per month, this Mac mini tier provides a cost-effective solution for small to medium-sized businesses, schools, and universities.
  • MacWeb Power M4 Pro: Ideal for more demanding workloads such as application development and testing, this tier offers a significant performance boost for faster build times and enhanced productivity. Priced at $199 per month, this cloud Mac mini service uses the M4 Pro chip with a 12-core CPU and 24GB of unified memory.
  • MacWeb Ultimate M4 Pro: Designed for mission-critical production applications and AI models, this tier delivers unparalleled performance and robust storage for the most demanding tasks. Priced at $299 per month, this cloud Mac mini service offers a 14-core CPU, 20-core GPU, and 64GB of unified memory.

MacWeb also continues to offer its popular M2-based Mac mini cloud services, providing developers with a range of options to meet their specific needs and macOS version requirements.

Thunderbolt 5 local networking now available in the cloud for M4 Pro-based services

Experience blazing-fast file transfers and seamless connectivity. Built into the latest Mac mini models with the M4 Pro chip, Thunderbolt 5 enables 80 Gbps of bi-directional bandwidth and is up to 800% faster than 10G Ethernet. Thunderbolt 5 unlocks high-speed clustering of Mac minis, allowing developers to combine multiple systems for demanding workloads like AI, video editing, and software testing, creating a powerful and scalable solution in the cloud.
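
As a rough back-of-the-envelope illustration of what that bandwidth gap means in practice (assuming line-rate throughput, which real transfers won't reach, and a hypothetical 100 GB dataset):

```python
# Back-of-envelope transfer times at nominal link rates; protocol overhead
# and real-world throughput limits are deliberately ignored.
def transfer_seconds(size_gb: float, link_gbps: float) -> float:
    """Seconds to move size_gb gigabytes over a link running at link_gbps."""
    return size_gb * 8 / link_gbps  # gigabytes -> gigabits, then divide by rate

dataset_gb = 100  # hypothetical build artifact or video project
print(transfer_seconds(dataset_gb, 80))  # Thunderbolt 5: 10.0 s
print(transfer_seconds(dataset_gb, 10))  # 10G Ethernet: 80.0 s
```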

In addition to exceptional performance, MacWeb’s new M4 Mac mini cloud service offers:

  • Instant activation: Get started in minutes with easy setup and instant access to your dedicated Mac mini.
  • High availability: Enjoy a reliable and secure cloud environment with 99.9% uptime.
  • Custom configurations: Tailor your cloud environment to your specific needs with customizable configurations, such as cluster nodes for distributed computing, and tap into expert consulting from MacWeb’s dedicated support team.

New Supply Chain Visibility & Risk Research Reveals Containers Have 600+ Vulnerabilities On Average

Posted in Commentary with tags on December 10, 2024 by itnerd

NetRise has released a new report that explores software compositions, vulnerability risks, and non-CVE risks in different asset classes in every organization’s software supply chain. The report analyzes the scope and scale of the components and risks found across 70 of the most commonly downloaded Docker Hub container images.

Key findings from NetRise researchers include:

  • After analyzing 70 randomly selected container images from 250 of Docker Hub’s most commonly downloaded images and generating a detailed SBOM, NetRise discovered that each container image had an average of 389 software components.
  • NetRise found that one in eight components had no software manifest, lacking the formal metadata that typically records dependencies, version numbers, and the package's source.
  • The average container had 604 known vulnerabilities in its underlying software components, over 45% of which were 2 to 10+ years old.
  • Over 4% of the 16,557 identified CVEs with a Critical or High CVSS severity ranking were weaponized vulnerabilities: known to botnets that spread ransomware, used by threat actors, or used in known attacks.
  • Containers averaged 4.8 misconfigurations each, including 146 “world writable and readable directories outside tmp,” and had overly permissive identity controls, with an average of 19.5 usernames per container.

You can read the report here.
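
Per-image averages like those in the report can be computed from SBOM output with a few lines of Python. This is a minimal sketch assuming CycloneDX-style dictionaries with "components" and "vulnerabilities" arrays; it is not NetRise's actual tooling, and the sample inputs are invented:

```python
# Minimal sketch: averaging component and CVE counts across container SBOMs.
# Assumes CycloneDX-style dicts; NetRise's actual pipeline is not public here.
from statistics import mean

def sbom_stats(sboms):
    """Return average component and vulnerability counts per image."""
    return {
        "avg_components": mean(len(s.get("components", [])) for s in sboms),
        "avg_vulns": mean(len(s.get("vulnerabilities", [])) for s in sboms),
    }

# Tiny illustrative inputs (real SBOMs have hundreds of entries per image).
sboms = [
    {"components": [{"name": "openssl"}, {"name": "zlib"}],
     "vulnerabilities": [{"id": "CVE-2023-0001"}]},
    {"components": [{"name": "busybox"}],
     "vulnerabilities": []},
]
print(sbom_stats(sboms))  # {'avg_components': 1.5, 'avg_vulns': 0.5}
```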

Unveiling Cyber Operation by Nemesis & Shiny Hunters

Posted in Commentary with tags on December 9, 2024 by itnerd

vpnMentor has just published a report about a cyber operation that exploited vulnerabilities in public-facing sites, leading to unauthorized access to sensitive customer data, infrastructure credentials, and proprietary source code. The investigation identified individuals behind the incident and linked it to the groups “Nemesis” and “Shiny Hunters.” The report provides a detailed analysis of the attack’s tactics, techniques, and procedures, and the researchers collaborated with the AWS Fraud Team on mitigation measures.

You can check the full report here: https://www.vpnmentor.com/news/shiny-nemesis-report/

Mission Named AWS Security Partner of the Year

Posted in Commentary with tags on December 9, 2024 by itnerd

Mission, a CDW company and US-based Amazon Web Services (AWS) Premier Tier Services and ISV Accelerate Partner, today announced that it has been named the AWS Security Partner of the Year. This award recognizes partners who have demonstrated excellence in securing every stage of cloud adoption, from initial migration through ongoing day-to-day management.

In addition to winning Security Partner of the Year, Mission achieved finalist status in four prestigious global award categories:

  • Migration Consulting Partner of the Year
  • Generative AI Consulting Partner of the Year
  • SaaS Consulting Partner of the Year
  • Aerospace & Satellite Consulting Partner of the Year

These recognitions showcase Mission’s comprehensive expertise in cloud migration, security, generative AI, SaaS consulting, and industry-specific solutions.

Announced during the Partner Awards Gala at AWS re:Invent 2024, the Geo and Global AWS Partner Awards recognize partners whose business models continue to evolve and thrive on AWS as they support their customers. These prestigious awards and nominations cap off a landmark year for Mission, marked by the announcement of its partnership with leading security provider CrowdStrike, the launch of multiple software offerings in the AWS Marketplace, and the development of many AI production workloads for customers.

Taylor Swift Fans in Vancouver Break Record for Most Data Ever Used at Single Event on The Rogers 5G Network

Posted in Commentary with tags on December 9, 2024 by itnerd

Taylor Swift fans on the Rogers 5G network used over 11 terabytes (TB) of mobile data in just a few hours at BC Place to share and stream her last concert of The Eras Tour – setting a new Canadian record.

This shatters the record set at the Taylor Swift concert at Rogers Centre in Toronto on November 21, when fans used 7.4 TB of data on the Rogers 5G network.

The data used on December 8 is the equivalent of uploading 307,000 photos and 2,180 hours of video streaming. Based on data usage spikes, the most-shared moments of the show were when Taylor Swift came on stage and the start of the ‘Reputation’ era. 

As presenting sponsor of Taylor Swift | The Eras Tour in Canada, Rogers invested $10 million to enhance 5G connectivity at BC Place ahead of Taylor’s last stop on her global tour. This included a full network redesign and the installation of a new in-stadium network system to bring fans the best experience across her three nights in Vancouver. Planning and installation took 10,000 hours, with a crew of over 40.

The upgrades increase 5G network capacity 38-fold throughout the stadium, equivalent to the coverage provided by 20 towers in Vancouver. Over the course of Taylor’s three performances in Vancouver, fans on the Rogers 5G network at BC Place used 32 TB of data.

5G technology is critical for today’s concert experiences, providing faster speeds, lower latency and more capacity, as large numbers of fans livestream and share the moment in real-time.

As part of planning for the expected crowds outside of BC Place, Rogers also installed two temporary Cell on Wheels to increase wireless capacity and ensure reliable connectivity.

The Roku Channel Hits 150+ FAST Channel Milestone in Canada

Posted in Commentary on December 9, 2024 by itnerd

Roku reached a milestone: Canadian viewers can now access more than 150 free ad-supported streaming TV (FAST) channels on The Roku Channel. With new additions available starting today that range from news to comedy, Roku has expanded and diversified its selection of streaming entertainment in Canada.

Recently added channels include: Corner Gas Channel, CTV @ Home, CTV Laughs, Noovo Cinéma, TSN The Ocho, and Wicked Tuna. In the coming weeks, The Roku Channel will also add three new CBC News local channels (Manitoba, Quebec, and Nova Scotia, joining BC, Toronto, and the national CBC News channel), along with Zoomer Television, EarthDay 365, Canadian favourites Heartland and Murdoch Mysteries, the world-renowned The Graham Norton Show, and the iconic The Ed Sullivan Show.

The Roku Channel is the home of free and premium entertainment on the Roku platform, and the exclusive home of Roku Originals. In Canada, free, ad-supported content on The Roku Channel is only available on Roku streaming players and Roku TV™ models.

For the full list of channels, please visit the Roku blog.

Three Disruptors In Tech Give Their 2025 Predictions

Posted in Commentary on December 9, 2024 by itnerd

Here are some of the most captivating predictions from three disruptors in three critical technology fields.

1. CLOUD NATIVE | Ratan Tipirneni, President and CEO, Tigera:

  • Kubernetes will bridge the gap for GenAI workloads: In many GenAI applications, enterprises will use retrieval-augmented generation (RAG) with proprietary data, which will often be confidential and sensitive. To address concerns around data security, privacy, and integrity, some will deploy GenAI in their local data centers, but many will want to run GenAI across both cloud and on-premises environments, and Kubernetes will bridge the gap.

2. IT | Ofer Regev, CTO at Faddom:

  • IT skills gap will accelerate lightweight automation: The global shortage of skilled IT professionals will worsen in 2025, pushing businesses to adopt more lightweight, automated tools. Complex solutions requiring extensive expertise will lose ground to agentless technologies that rapidly simplify deployment and deliver value. 
  • Rise of Zero Trust beyond devices: Zero Trust will expand beyond devices and networks to include identity verification frameworks for all digital interactions. With the surge of remote work and decentralized systems, traditional identity models will fall short. This will demand tools capable of tracking and validating user and system behaviors across dynamic IT landscapes.

3. DEVOPS | Steve Fenton, Principal DevEx Researcher, Octopus Deploy:

  • Platform Engineering will be thinner: Platform engineering has become a path towards DevOps efficiency and developer productivity. In 2025, organizations will realize they can achieve the goals of platform engineering with fewer lines of bespoke code. Instead of trying to build a grand unifying platform, existing tools will provide solutions that reduce fragmentation, apply standards, and integrate security into software delivery.
  • Continuous delivery is dead… Long live continuous delivery! As organizations shift to platform-as-a-service, Kubernetes, and serverless offerings, they often lose good practices along the way. The continuous delivery pipelines they built for traditional self-hosted and IaaS environments embodied solid practices that should be carried over to the new environments.

Active Archive Alliance 2025 Data Storage Trends And Predictions

Posted in Commentary with tags on December 8, 2024 by itnerd

Here are some 2025 predictions from members of the Active Archive Alliance.

Modern Object Storage Will Expand to Include Long-Term Tape Solutions

The explosion of generative AI and increased demand for unstructured data retention are outpacing IT budget growth. Standardized object storage interfaces make it easy to move data, but object storage was designed as a single tier built on hard disk drives. Tiering will become a standard requirement for active-data object storage vendors, and modern object storage solutions will expand support to include tape and other long-term storage media as a deep-archive target, at a fraction of the cost of cloud archives. Cloud will continue to be part of the hybrid data protection strategy. The result will be lower costs for organizations storing petabytes of data.

– Mark Hill, Business Line Executive, Data Retention Infrastructure, IBM

Sustainability Makes a Comeback in Data Storage with Active Archives

Despite daily examples of the devastating effects of climate change, broader corporate sustainability initiatives have in many cases moved off center stage due to unachievable and overly aggressive goals with poor return on investment. Meanwhile, in the IT industry, the shiny new thing is AI with its energy intensive GPUs dwarfing the energy requirements of traditional CPUs. AI also requires massive volumes of data to feed its training models and more data gets generated in the process that may never be deleted even after it goes cold. Sustainable active archive solutions with intelligent data management capabilities can leverage ultra energy-efficient and extremely cost-effective tiers of storage such as S3 compatible object-based tape libraries. This will be needed to offset the voracious energy consumption of truly cutting-edge and breakthrough AI applications as the AI age evolves in 2025 and beyond.

– Rich Gadomski, Head of Tape Evangelism, FUJIFILM North America Corp., Data Storage Solutions

Active Archive Based on Standards and Established Tape Technology: Essential for Future Data Centers

Standardization plays a major role in the data center sector. This applies not only to hardware, but also to software. The need for data archiving in data centers will increase as data volumes grow rapidly due to applications such as AI. For example, active archive concepts based on established tape technology and standardized object-based software interfaces will be used to enable the active use of archived data. Active archives based on scalable and rack-mountable tape libraries that are designed for use in data centers and can be integrated via standardized software interfaces such as S3, will become indispensable in future data centers.

– Thomas Thalmann, CEO, PoINT Software & Systems

Revolutionizing Data Management: AI-Driven Solutions for Smart Storage and Seamless Access

Artificial intelligence (AI) has the potential to revolutionize data storage and active archives by enhancing efficiency and accessibility. As data volumes soar, AI can optimize storage management by predicting usage patterns and minimizing costs, potentially making decisions about how and where to store data at the point of creation. In the realm of active archives, AI can analyze and prioritize data, ensuring frequently accessed information is readily available while less critical data is stored cost-effectively. Automated classification, tagging, and indexing could simplify the search process, allowing for intelligent data handling. For example, sensitive intellectual property could be air-gapped to tape for security, while short-term, frequently accessed data could be stored in a cloud tier. This strategic approach could lead to significant improvements in data management, enabling organizations to respond more effectively to their needs and streamline their operations.

– Paul Luppino, Director, Global Digital Solutions, Iron Mountain

AI Workloads Will Fuel More Storage Disaggregation

As the AI train keeps moving full-steam ahead, more companies will realize that server-bound storage will be less cost effective and at times inadequate when compared to what can be accomplished via disaggregated storage. Simply put, disaggregated storage is external storage, attached to the server via SAS or fabric. This disaggregation has been proven to deliver the performance and capacity required to meet the requirements of demanding GPU-related workloads which are at the heart of AI and machine learning processes. Disaggregating storage from the server accomplishes two key things: (1) it enables storage to be shared across multiple servers offering greater flexibility and utilization of storage resources, and (2) demonstrations show that disaggregated storage delivers the performance needed to keep GPU processing fully saturated. Over time these external storage architectures will become standard with HDD for active archives and with flash for performance workloads and will ultimately migrate to fabric as opposed to SAS given the convenience and distance benefits of fabrics.

– Mark Pastor, Platform Product Management, Western Digital

AI-based Applications Fuel the Rise of Accessible Cold Storage, Enabling the Processing of Data Within Seconds

The digitalization of our everyday lives has created the need for immutable records and the permanent capture of sensor data from all kinds of applications, such as autonomous cars, medical diagnostics, smart IoT, images, and videos. Much of this cold-born data is retained indefinitely for personal or liability reasons or for later monetization, including browsing data, medical risk assessment, and training data to enhance AI. This data must be stored as cost-effectively as possible; otherwise, new AI-based business models will struggle to turn a profit. In the future, large chunks of data will quickly be heated up to enable ML and AI-powered tools to generate insights on large datasets. The need for fast, accessible, high-performance active archive solutions is obvious and will drive accelerating demand in 2025!

– Martin Kunze, CMO and Co-Founder, Cerabyte

Tape Enables More Cost-Effective Active Archives in 2025

The percentage of total data that goes on tape will increase within active archive environments. I don’t just mean more data will go on tape than before, which has been the case for a while. I’m saying more data will go on tape, relative to the amount going on disk, thanks to tape’s cost, energy, and long-term reliability advantages.

– W. Curtis Preston, Technology Evangelist, S2|Data

The Rise of Storage Virtualization and the Data Fabric

As organizations look to optimize their storage strategies in 2025, the rise of storage virtualization is making it easier to interconnect various data storage technologies. Businesses can maximize their existing investments and avoid vendor lock-in by leveraging a data fabric—an architecture that unifies cloud, disk, tape, and flash storage into a single, logical namespace. This trend towards virtualization allows for a more flexible approach to data management, enabling businesses to mix and match technologies to meet specific needs. For example, high-performance workloads can run on flash storage while colder data is moved to tape in an active archive. The ability to integrate various storage solutions seamlessly will be a key enabler for organizations aiming to improve efficiency, reduce costs, and scale their operations.

– Jason Lohrey, CEO, Arcitecta
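
The mix-and-match placement described above (flash for hot workloads, tape for cold data) can be sketched as a toy tiering policy; the tiers and thresholds below are invented for illustration and are not any vendor's actual rules:

```python
# Toy policy-based tier selection for a unified storage namespace.
# Thresholds and tier names are illustrative only.
def choose_tier(days_since_access: int, size_gb: float) -> str:
    if days_since_access <= 7:
        return "flash"         # hot data: performance workloads
    if days_since_access <= 90:
        return "disk"          # warm data: general access
    if size_gb >= 1:
        return "tape"          # cold and sizeable: active archive
    return "cloud-object"      # cold but tiny: convenience tier

print(choose_tier(2, 50))    # flash
print(choose_tier(30, 50))   # disk
print(choose_tier(400, 50))  # tape
```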

AI’s Rising Energy Costs: Why Data Centers Will Be Turning to Tape Storage

As AI continues to reshape industries, the energy demands on data centers are intensifying. AI servers consume up to 14 times more power than traditional systems, raising both operational costs and environmental concerns. In response, the focus must shift to energy-efficient solutions, and magnetic tape—a 70-year-old technology—offers a relevant answer. Modern tape storage is not only highly durable but also incredibly energy-efficient, particularly when compared to disk storage. By offloading cold data to tape in an active archive, data centers can free up energy for AI workloads, maximizing efficiency. As energy becomes a factor potentially limiting the growth of AI, businesses that embrace sustainable practices will gain a competitive edge in 2025 and beyond.

– Ted Oade, Director of Product Marketing, Spectra Logic

Trend Towards Hybrid Active Archives

As the volume of stored digital data increases year over year, many organizations are turning to public cloud storage. This holds for active archives because cloud object storage is scalable, secure and, most importantly, convenient. However, for large archives it is a costly option. For this reason, we are seeing a trend towards hybrid storage solutions in which data is stored both on premises and in the cloud. This can minimize egress fees, and it avoids costly charges when an organization decides to migrate its content from one cloud provider to an alternative, whether that be a different provider or an on-premises solution.

– Phil Storey, CEO, XenData

Tape-based Active Archive Complements the Fastest AI Storage

QStar believes the use of AI to provide added insight into multiple types of data will be a major driving force in many IT departments over the next year and beyond. The use of tier 1 primary storage and new tier 0 GPU-based storage will require significant data sets or project data to be available for relatively short periods during processing. At other times, this data must be stored securely, remain readily available for when it is next needed, and be kept at low cost, given the size of the data sets involved.

Multi-node tape-based active archive solutions provide everything an AI environment requires, using many tape drives in parallel to significantly increase raw performance. Tape media is the lowest-cost and most secure form of storage. AI applications can access this data through a file system (SMB or NFS) or the S3 API.

– David Thomson, SVP Sales and Marketing, QStar Technologies

Healthcare: Using AI in Cyber Risk Reduction

Although artificial intelligence (AI) is at the forefront of discussions in active archiving for healthcare, continuing to use AI in cyber security will be a must in 2025. Hackers keep advancing AI technology to develop more sophisticated attacks, and ethical hackers must do the same to prevent them. The push for interoperability in healthcare will require advanced AI-driven prevention techniques, and 2025 is going to be a big year for prevention and balance: how to share data and still keep it secure. Concerns over AI’s ability to learn and evaluate critical thinking processes are not unfounded. Patient safety is already a concern as hackers try to infiltrate medical devices, and this will definitely need to be a focus in 2025. 2024 provided good examples of areas of vulnerability. In 2025, initiatives should expand to include analysis of these gaps in protection, with budgets increased to provide the funding to actually close them.

– Kel Pults, DHA, MSN, RN, NI-BC, NREMT, Chief Clinical Officer and VP Government Strategy, MediQuant