Other World Computing Announces Plug-and-Play 10G Ethernet Adapter & Thunderbolt Pro Dock 

Posted in Commentary with tags on November 6, 2024 by itnerd

Other World Computing today announced it is making it easier than ever for Mac users to maximize their devices’ performance with the OWC 10G Ethernet Adapter, delivering lightning-fast 10G Ethernet connectivity to any Mac with Thunderbolt (USB-C), including the latest M4 series. The plug-and-play adapter offers a high-performance, reliable solution for Mac models without built-in 10G Ethernet – allowing users to connect at 10G speeds and leverage faster local networks and network-attached storage (NAS).

To complement this upgrade, Other World Computing offers its OWC Thunderbolt Pro Dock, which transforms any workspace with versatile port expansion and 10G Ethernet capabilities. The Thunderbolt Pro Dock – with the ability to power a Mac laptop and add essential ports through a single Thunderbolt cable – is the ideal accessory for users needing high-speed networking and streamlined connectivity for multiple devices.

Product highlights: 

OWC Thunderbolt 10G Ethernet Adapter – The Portable Ethernet Supercharger

  • Blazing-fast – Over 900MB/s real-world tested transfer speed for large file transfers, video editing, and live-streaming gaming sessions
  • Compatible – Use with any Thunderbolt-equipped Mac or Windows computer
  • Capable – Up to 100-meter cable distance with Cat6a for 10G, Cat6 for 5G, and Cat5e for 2.5G
  • Smart – Supports auto-negotiation for 10Gb/s, 5Gb/s, 2.5Gb/s, 1Gb/s and 100Mb/s Base-T Ethernet standards
  • Accessible – Wake-on-LAN ready for remote access of a home or work computer
  • AVB ready – Perfect for use in pro audio and video applications where synchronization of data streams is critical
  • Securable – Features Kensington-compatible lock slot for anti-theft cabling
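The Wake-on-LAN feature mentioned above works by listening for a "magic packet": 6 bytes of 0xFF followed by the target machine's MAC address repeated 16 times, conventionally sent over UDP broadcast. As a rough illustration of the protocol (this is not OWC's implementation, and the MAC address in the comment is made up), building and sending such a packet looks like this:

```python
import socket

def magic_packet(mac: str) -> bytes:
    # A Wake-on-LAN magic packet: 6 bytes of 0xFF, then the MAC 16 times.
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    # Broadcast the packet on the local network (UDP port 9 is conventional).
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# e.g. send_wol("00:11:22:33:44:55") wakes the machine with that MAC,
# provided its network adapter is configured to honour magic packets.
```

The packet is always 102 bytes (6 + 16 × 6), which is why adapters can recognize it even while the host is asleep.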

OWC Thunderbolt Pro Dock – The Best Networking, Media, and Docking Solution for Pro Creatives

  • Fast & diverse – Transfer files in a blink, daisy chain up to five Thunderbolt devices, use USB-C accessories, and add an HDMI display with two Thunderbolt (USB-C) ports
  • Full throttle bandwidth – Share and stream up to 90% faster with a 10Gb/s Ethernet port
  • AVB ready – Perfect for use in pro audio and video applications where synchronization of data streams is critical
  • Ingest content – Easily handle multiple video and photo card uploads at up to 1630MB/s with CFexpress and SD card readers
  • Full-speed connectivity – Use your devices and accessories at their max speed with one USB-C and three USB Type-A 10Gb/s ports
  • Add more viewspace – Connect two 4K displays or a single 8K display via DisplayPort 1.4
  • Powerful – Keep your notebook mobile use ready with 85W charging power
  • Silent and secure – Fanless cooling mode and locking power connection for on-set use
  • Certified – Thunderbolt certified for Mac and Windows
  • Dante compatible – Easy Plug and Play – no software needed – into networks designed for bit-perfect audio with super low latency and sample-accurate synchronization

The OWC Thunderbolt 10G Ethernet Adapter is available now for $199.99 – learn more and purchase here: https://eshop.macsales.com/item/OWC/TB3ADP10GBE/.

The OWC Thunderbolt Pro Dock is available now for $349.99 – learn more and purchase here: https://eshop.macsales.com/shop/owc-thunderbolt-pro-dock

Arcitecta to Showcase New Data Management Solutions and Collaborative Presentations at SC24

Posted in Commentary with tags on November 6, 2024 by itnerd

Arcitecta, a creative and innovative data management software company, will host a visually immersive booth at the SC24 show in Atlanta, Georgia, November 17-22, 2024. The company will showcase and demonstrate breakthroughs in data management for high-performance computing, research data and supercomputing environments.

Arcitecta Co-LAB: Future Thinking

Arcitecta Co-LAB is an exciting joint endeavor with the company’s customers, partners and friends. The lab offers a unique opportunity to engage with HPC thought leaders, including Cerabyte, Princeton University, the University of Melbourne and many others, who will ‘take over’ the Arcitecta booth. They will delve into forward-thinking ideas, share insights and experiences, present groundbreaking research, and discuss topics ranging from the future of big data to strategies for resilience against loss. 


Date & Time: November 19-21, 2024, from 10:00 a.m. to 4:00 p.m.
Location: Arcitecta Booth #2851

Customer Spotlight – User Group Meeting

Arcitecta’s User Group meeting will feature presentations from Princeton University and the University of Melbourne highlighting Mediaflux’s adaptable capabilities in HPC environments. This event will also include an open discussion on a variety of topics and provide an opportunity to connect with fellow Mediaflux users, share insights and experiences, and explore innovative approaches to common challenges.

Date & Time: Monday, November 18, 2024, from 9:00 a.m. – 12:00 p.m.
Location: W Hotel, Studio 5, 45 Ivan Allen Jr Blvd NW, Atlanta, GA

Product Spotlight – Mediaflux Bundles

Arcitecta will provide demonstrations at SC24 of Mediaflux Multi-Site, Mediaflux Edge and Mediaflux Burst, bundled with Dell PowerScale and ECS/ObjectScale to optimize data workflows and foster collaboration in geographically distributed workforces. Mediaflux is built for any environment to handle data anytime, at any scale, and in any location, enabling organizations to leverage additional resources, boost computational power, access data instantly, optimize storage for cost-effectiveness, and ensure resilience against data loss.

Beowulf Bash Event 

Arcitecta is proud to sponsor the fabulous Beowulf Bash event again this year. Join the Arcitecta team for food and beverages and a great time. For more details, visit: https://beowulfbash.com/

Date & Time: Monday, November 18, 2024, from 9:00 p.m. - 12:00 a.m.
Location: World of Coca-Cola, 121 Baker Street NW, Atlanta, GA
 

Active Archive Alliance Cocktail Reception

The Active Archive Alliance will host a cocktail reception on Tuesday, November 19, 2024, from 5:00 p.m. – 8:00 p.m. Stop by the Arcitecta Booth #2851 for details and an invitation.

Immersive Booth Experience

Once again, the Arcitecta booth will provide an immersive space for connection and inquiry. Sit back and immerse yourself in the ideas circulating on its state-of-the-art LED screens. Enter an environment where a diverse team of in-house artists respond to new ideas in creative computing, exploring the transformative relationship between technology and art. 

To schedule a meeting with the Arcitecta team at SC24, visit: https://www.arcitecta.com/events/2024/sc/chat/

Review: TP-Link Deco X50 AX3000 WiFi Router/Access Point

Posted in Products with tags on November 6, 2024 by itnerd

When I signed up for Distributel, one of the things that I was offered was what they described to me as a “WiFi pod”. I only took one as I had zero intention of using the gear that an ISP provides, as that’s a form of lock-in by said ISP. But what I got was a pre-configured TP-Link Deco X50 WiFi router/access point as opposed to something unique and custom-made for the ISP. That piqued my interest.

The Deco X50 is a WiFi 6 dual-band router that can also deliver mesh WiFi when you have two or more units. Let’s have a look at it:

From the front, it’s rather unremarkable. Which is good as it will fit into any decor.

From the back you get three Gigabit Ethernet ports. Any port can be used to connect to your ISP. But because they are all Gigabit, you’re limited to Gigabit speeds, which makes the X50 the wrong choice if you have faster-than-gigabit service.

Now each X50 unit has only two radios: a 2.4GHz one and a 5GHz one that handles both client connections and backhaul traffic. A dedicated third radio for backhaul would be better for people who have a lot of devices in a mesh setup, as sharing the 5GHz radio creates some amount of congestion between units if the traffic between them is high enough. But only having two radios keeps the price down, which I suspect was the priority here.
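To put a rough number on that congestion (this is my own back-of-the-envelope rule of thumb, not TP-Link’s figures), a shared-radio mesh roughly halves usable throughput per wireless hop, because the same 5GHz radio has to both receive from the client and retransmit to the next unit:

```python
def effective_throughput(link_mbps: float, wireless_hops: int) -> float:
    # Each hop over a shared (non-dedicated) backhaul radio roughly halves
    # the airtime available, so throughput falls by about 2x per hop.
    return link_mbps / (2 ** wireless_hops)

# Client connected directly to the main unit: no backhaul hop.
print(effective_throughput(1200, 0))  # 1200.0
# Client on a satellite unit, one shared-radio hop back to the router.
print(effective_throughput(1200, 1))  # 600.0
```

A tri-band design with a dedicated backhaul radio avoids that halving, which is exactly the trade-off TP-Link made here for price.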

Power users will be disappointed that there’s no web page with advanced configuration options, as everything is done through the Deco app, although there is a web-based status page. To be fair, you can configure a number of things via the app, from the initial setup, which by the way is easy enough for the average person to do, to more complex tasks such as turning on QoS or using their parental controls subscription service. Speaking of which, the fact that parental controls are a subscription service that you have to pay for is a bit of a #fail, as competitors like ASUS offer that for free.

Now in my testing, performance was actually decent. Let’s start with the performance from the Optical Network Terminal that Distributel supplied to the X50:

Now I had to test this by plugging an Ethernet cable into my MacBook Pro, because I couldn’t find a speed test built into the router or the Deco app. But this result is better than my ASUS Zen WiFi XT8, especially on the upload end of things, as the XT8 isn’t that good when it comes to upload in PPPoE scenarios. That further validates that I need to dump the ASUS gear for something better, as I am clearly leaving some speed on the table by using the ASUS hardware.

From a WiFi perspective, the performance was also decent. This result was from about 5 feet from the X50:

That’s competitive with the XT8 which also supports WiFi 6. So from a performance perspective, it doesn’t suck. Though I do wonder how it would perform with multiple units and a lot of traffic given that there’s no dedicated backhaul.

So would I recommend the X50? It depends on the type of user you are. If your needs are modest, as in you need WiFi 6 in your home and you’re not doing anything crazy, this might be an option for you. If you’re a power user or you have faster-than-gigabit internet, you should likely look elsewhere. At least the price is better than decent, as I found a pair of these for $179 CAD on Amazon. Thus, if you fit the use case for this WiFi router/access point, it’s worth looking at in my opinion.

Fortra Discovers A Nearly 200% Spike in Abuse of Cloudflare’s Trusted Platforms

Posted in Commentary with tags on November 6, 2024 by itnerd

Fortra has uncovered a significant surge in cybercriminal abuse of Cloudflare Pages (a 198% increase) and Workers (104%) over the past year. The data also reveals that monthly incidents on Cloudflare Pages alone could surpass 1,600 by year’s end—a 257% year-over-year increase.

What’s surprising?

While it’s primarily used legitimately, Cloudflare Pages can be exploited for malicious purposes due to its reputation, free hosting, ease of use, and global Content Delivery Network (CDN). Threat actors can create convincing malicious sites, using custom domains and secure HTTPS connections to deceive victims. Similarly, while designed to help developers deploy and run JavaScript code directly at the edge of Cloudflare’s CDN, Cloudflare Workers can be exploited to bypass security controls or automate attacks like brute-force login attempts. In short, attackers are using Cloudflare’s strengths to lure victims into a false sense of security.

Cybersecurity teams may need to change their approach

While Cloudflare does implement threat detection and phishing prevention mechanisms, Fortra’s report suggests a growing trend of abuse on reputable platforms, highlighting the need for more vigilant monitoring, even in environments perceived as secure. Security teams should be aware of these increased attacks and proactively monitor for suspicious activity, as the platform can often be misused before the abuse is detected.

Tips for cybersecurity teams:

Cloudflare has several security measures in place to combat abuse, including threat detection systems, phishing detection, and user reporting mechanisms to take down malicious content. Despite these efforts, cybercriminals can still exploit the platform before malicious content is detected. The risk is in how cybercriminals are misusing the service, and not in the technology itself.

Users can protect themselves from phishing by following several best practices. First, they should be cautious when interacting with unfamiliar websites, especially those requesting personal or sensitive information. Verifying the legitimacy of URLs and ensuring that the domain matches the expected source can help identify phishing attempts. Additionally, enabling two-factor authentication (2FA) for accounts adds an extra layer of security.

Developers using Cloudflare Pages should implement strong security measures such as regularly updating their site’s dependencies, using HTTPS for secure connections, and monitoring for suspicious activity. It’s also important to report any phishing attempts or malicious activity to Cloudflare for further investigation and takedown, helping to prevent wider abuse.
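One of the checks above, verifying that a link’s domain actually matches the expected source, can be automated. Here is a minimal sketch (the function name and example URLs are mine, for illustration only); a real deployment would also want a public-suffix list to handle multi-part TLDs correctly:

```python
from urllib.parse import urlparse

def domain_matches(url: str, expected_domain: str) -> bool:
    # True only if the URL's hostname is the expected domain itself
    # or a genuine subdomain of it.
    host = (urlparse(url).hostname or "").lower()
    expected = expected_domain.lower()
    return host == expected or host.endswith("." + expected)

# A legitimate subdomain passes...
print(domain_matches("https://docs.example.com/invoice", "example.com"))  # True
# ...but a lookalike that merely embeds the brand name does not.
print(domain_matches("https://example.com.evil.net/invoice", "example.com"))  # False
```

Note the second case: checking only whether a URL *contains* the brand name would be fooled; comparing the parsed hostname suffix is what catches lookalike domains.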

StorageMAP 7.1 launch adds more solutions to solve unstructured data challenges

Posted in Commentary with tags on November 6, 2024 by itnerd

Today, Datadobi is launching the latest version of their powerful heterogeneous unstructured data management platform. When they announced StorageMAP 7.0 in June, they talked about offering enterprises insights to drive better control, portability, optimization, and management of their unstructured data. StorageMAP 7.1 takes it a step further and solves some focused challenges facing their customers globally: an innovative HDI Archive Appliance Bypass feature, example dashboards, and, most importantly, improvements to scalability and performance. Here’s a closer look:

Solving for: Scalability and performance 

StorageMAP is constantly evolving and improving. With the release of version 7.1, this trend continues with many behind-the-scenes improvements in scalability and performance.

For example, the Unstructured Data Mobility Engine (uDME) at the core of StorageMAP has been updated with new enhancements to address ever-growing scalability and performance challenges present in modern, unstructured data environments.

Why this matters

Scale is one of the most critical factors in dealing with unstructured data management in today’s large and complex environments. The challenge has two dimensions: 1) the capacity being managed and 2) the number of items (i.e., files and objects) being managed. Capacity and item count combine to create a challenge only StorageMAP can address. In contrast, trying to manage a large environment with a solution that cannot handle scale will result in disappointment, failed projects, and a sunk cost in software that doesn’t deliver the desired value. With its industry-best ability to handle scale, StorageMAP solves the problem altogether.

Solving for: Migration inefficiencies and performance bottlenecks

StorageMAP 7.1 employs an HDI Archive Appliance Bypass feature to drastically increase migration performance for archived data using the Hitachi Data Ingestor (HDI).

The “bypass” involves using multiple StorageMAP connections to the storage systems – one connection to the primary storage system and a second connection to the archive storage system. These connections effectively bypass the middleware HDI archiving appliance, which is responsible for both relocating data to the archive storage system and retrieving it when a client application requests archived data.

Why this matters

The problem with the middleware archiving appliance is its significant performance limitations that make migrating all active and archived data an extremely slow process typically riddled with errors. Additionally, the migration workload on the archiving appliance hinders continued archival and retrieval operations. In bypassing the middleware and reading data directly from the primary and archive storage systems, StorageMAP greatly accelerates and enhances the accuracy of an otherwise problematic migration.

Solving for: Onboarding and usability

StorageMAP 7.1 offers sample dashboards to help customers get started with the creation of custom dashboards.

While StorageMAP version 7.0 introduced the ability to create a library of custom dashboards, version 7.1 provides example dashboards to seed the library for a new installation. These can be used out-of-the-box with a new installation of StorageMAP so customers can realize value even before they create their own custom dashboards.

Why this matters

With the 7.1 release, Datadobi is providing example dashboards that customers can mine for ideas to include in their own custom dashboards. Customers can also refer to the definitions of the widgets included in the example dashboards as a training aid, helping them derive value from StorageMAP quickly.

The bottom line

Datadobi continues to raise the bar on what it means to deliver the world’s most powerful, comprehensive, and real-world proven unstructured data management platform – not to mention the only true vendor-neutral option on the market today. Whether you’re dealing with complex migrations, working to lower risk and/or cost, or looking for a seamless way to gain greater value and insights from your data, StorageMAP 7.1 is the answer.

Ready to experience the difference? Reach out today — https://datadobi.com/contact/ — to schedule a demo and to learn more about how StorageMAP can transform your cloud data file management strategy.

Martello Launches Partner Network 

Posted in Commentary with tags on November 5, 2024 by itnerd

Martello Technologies Group Inc., an expert in user experience management for Microsoft 365, is excited to announce the launch of the Martello Partner Network. This ecosystem brings together managed service providers (MSPs), Microsoft Gold VARs, and complementary product vendors, who can combine their offerings with Martello Vantage DX to quickly create differentiated Microsoft 365 and Teams solutions, enabling them to meet the evolving needs of enterprise clients.

Microsoft 365 and Teams are essential to enterprise digital transformation, and partners today are seeking ways to stand out. According to IDC’s recent study, titled “Microsoft Partners: Driving Economic Value and AI Maturity,” every $1 of Microsoft revenue translates into $8.45 of services revenue and $10.93 of software revenue for partners. The rise of generative AI and Microsoft Copilot further accelerates this opportunity.

The Martello Partner Network Welcomes Key Players

In addition to existing partners such as Orange and Yorktel, Martello welcomes:

Perception Integrate: A leading UK-based AV integrator, Perception Integrate partnered with Martello to deliver proactive monitoring for Microsoft Teams Rooms. Technical Director Adam Southgate noted, “This partnership empowers us to proactively alert clients to potential Teams Room performance issues.”

LoopUp: Microsoft Teams Phone is the fastest growing Cloud Telephony solution, and large multinational enterprises are looking for a single global provider. With industry-leading coverage, LoopUp answers that call. Michael Boggia, VP Sales at LoopUp, said “LoopUp brings a key business function, telephony, into Teams. That means identifying issues in Teams and assigning accountability is critical, which makes Vantage DX an excellent complement.”

Entergrade Solutions: Delivering enterprise-grade cloud solutions, Entergrade provides consulting, development, and products to maximize Microsoft 365 value for enterprises. Co-Founder Michael LaMontagne, a Microsoft MVP, stated, “With Martello, we ensure optimal performance and an exceptional user experience for our clients.”

Martello Partner Network Benefits:

  • Access to innovative tools like Vantage DX to strengthen and differentiate Microsoft 365 offerings
  • Sales and technical enablement for partners
  • Go-to-market support, including co-branded collateral, lead generation strategies, and co-selling opportunities to drive revenue with superior digital experience management

To learn more about joining the Martello Partner Network and how it can drive revenue growth, cost optimization and impeccable client service for Microsoft 365 and Teams, visit https://martellotech.com/partners/martello-partner-program/.

Majority of Toronto-Waterloo IT decision-makers see data privacy and sovereignty as a key priority, finds new research from OVHcloud

Posted in Commentary with tags on November 5, 2024 by itnerd

OVHcloud, a global cloud player, today hosts its inaugural Canada’s Innovation Future event alongside the Balsillie School of International Affairs (BSIA) and the Centre for International Governance Innovation (CIGI) in Waterloo. The event brings together thought-leaders, academics, and industry experts from Ontario’s vibrant tech sector to address the intersection of innovation and regulation, focusing on fostering open, trusted and sustainable technologies.

As Canada’s digital landscape reaches a pivotal moment, discussions will center on creating frameworks that balance technological growth with responsible governance, particularly in areas like artificial intelligence (AI). As businesses increasingly adopt AI solutions, there is a growing emphasis on establishing regulatory frameworks that ensure ethical practices and protect data sovereignty. Striking a balance between fostering a competitive tech landscape and addressing privacy and security concerns, in this evolving regulatory environment presents both challenges and opportunities for organizations in the Waterloo-Toronto corridor.

During the event, OVHcloud is making two major announcements demonstrating its commitment to driving innovation, sustainability and data sovereignty within Canada’s technology landscape. In collaboration with market research firm, Léger, OVHcloud released new research on the evolving technology landscape in the Waterloo-Toronto corridor. 

Addressing critical topics such as data sovereignty, AI adoption, and sustainability, this research provides key insights into the challenges and opportunities faced by tech organizations in their pursuit of technological freedom and efficiency.  Among the report’s key findings, respondents highlighted that:

  • Data protection is a top priority, with almost nine in ten respondents (88%) saying it is important for their organization to store data locally. 
  • Sustainability is a key focus for many organizations, with 81% saying environmental sustainability is important in their organization’s technology and data management strategies. However, only 36% of respondents believe they receive clear information about their providers’ carbon emissions. 
  • Almost two-thirds (64%) of respondents feel constrained to use the services of U.S. cloud giants, citing limited flexibility in switching vendors. 
  • Almost two-thirds (64%) of respondents believe their organizations are prepared to manage the privacy and security risks posed by the growing use of AI and machine learning in data-related activities.
  • More than two-thirds (68%) believe current regulations in Canada sufficiently support digital innovation while protecting privacy and security, while 21% believe that the Canadian regulations are too restrictive or not strong enough.

The findings present a pivotal opportunity to reframe the conversation around data governance, sustainability, and innovation within the Waterloo-Toronto corridor. OVHcloud’s strong values and differentiators in Canada significantly enhance the region’s ability to tackle these challenges while promoting data sovereignty and sustainable innovation. Complying with local regulations and remaining fully immune to extraterritorial legislation, OVHcloud demonstrates a strong commitment to data protection through its multi-site footprint in Canada.

Its innovative approach is reflected in industry-leading energy efficiency, including water-cooled data centers, as well as a comprehensive carbon calculator that addresses all three scopes of emissions, including manufacturing and lifecycle impacts. OVHcloud’s commitment to energy efficiency also addresses the growing demand for sustainable technology solutions, enabling organizations to reduce their carbon footprint while managing costs. This combination of affordability and environmental responsibility positions OVHcloud as a trusted partner for Canadian businesses.

In a landscape where reliance on U.S. providers can create constraints, OVHcloud champions an open cloud model free from vendor lock-in, offering fully interoperable and reversible solutions that empower organizations to maintain complete control over their infrastructure and data journey.

In addition to the research, OVHcloud also announced a new partnership between OVHcloud and the Balsillie School of International Affairs, highlighting the need for sustained discussions on technology and governance to shape Canada’s innovation future. The partnership further demonstrates the company’s dedication and commitment to contributing to the tech ecosystem of the region. 

Through this partnership, OVHcloud will support the BSIA’s technology governance internship program, which places graduate students in corporate, government and not-for-profit organizations. This initiative allows students to gain hands-on experience in governance, policy, and emerging technologies while providing a platform for organizations like OVHcloud to benefit from fresh perspectives on critical issues such as data protection and sovereignty.

This partnership is an important step in deepening OVHcloud’s engagement with the academic and tech communities in North America, helping to shape the future of technology governance and further promoting sovereignty and sustainability for a better digital future.

Firmly established in Canada since 2011, OVHcloud has expanded its presence with the opening of a new data centre in Cambridge (ON) in March 2024. Located in one of North America’s most dynamic innovation clusters, this new data centre delivers trusted cloud solutions that cater to Canadian companies’ needs for enhanced performance, resilience and data governance.

Download the OVHcloud/Léger report here.

Variable Refresh Rates On Monitors Seem To Be Broken On macOS Sequoia 15.1

Posted in Commentary with tags on November 5, 2024 by itnerd

I am beginning to think that I should have stayed on macOS Sonoma. I say that because hot on the heels of posting this issue with Sequoia comes a new issue. If you have a monitor capable of variable refresh rates, this may not work on Sequoia. For example, my new BenQ MOBIUZ EX321UX monitor, which had no issues doing variable refresh rates when I was running Sonoma, can no longer do so in Sequoia. Instead, I found it to be locked to 144Hz in my case.

Now I know that it’s not the monitor because I grabbed a PC laptop that I had in my office and tested variable refresh rate support, and it worked fine. I also tested the same scenario on my wife’s Mac which runs Sonoma, and that worked fine as well. And while researching this, I found a single post that sounds similar to what I am experiencing.


Reading through this, it appears that variable refresh rates worked in Sequoia 15 and 15.0.1, which implies that this is some sort of regression. But it could also be Apple deliberately limiting support for variable refresh rates, and HDR along with it, as described here and here. At this point, it’s not clear which it is. But as more people discover this, more people will not be happy, especially if they have a MacBook Pro and are used to ProMotion, which is Apple’s implementation of variable refresh rates. To work around this, I have set my refresh rate to 100Hz at 2560×1440 resolution, which gives me the added advantage of being able to use the option of having my MacBook Pro render slightly sharper text while using HDR. And I will be retesting this when Sequoia 15.2 comes out, as I am hoping that it will address this issue.

Have you experienced this issue? If so, post a comment below and share your experience.

Cybercriminals Exploit DocuSign’s APIs to Send Authentic-Looking Invoices

Posted in Commentary with tags on November 4, 2024 by itnerd

Wallarm has unveiled a report showing that hackers are exploiting DocuSign APIs to send authentic-looking invoices in a new breed of cyber threat. Exploiting trusted platforms like DocuSign through their APIs marks a concerning evolution in cybercriminal strategies. By embedding fraudulent activities within legitimate services, attackers increase their chances of success while making detection more challenging.

While beneficial for businesses, DocuSign’s API-friendly environment inadvertently provides a fertile ground for malicious actors to exploit. With paid accounts and access to official templates, attackers can customize invoices to match the branding of target companies, including unauthorized use of trademarks like Norton’s.

You can read the report here.

Backing Up Via Time Machine Is Broken In macOS Sequoia

Posted in Commentary with tags on November 3, 2024 by itnerd

Immediately after updating to macOS Sequoia, specifically the 15.1 version, I noticed two problems with Apple’s Time Machine utility:

  1. Scheduled backups would fail with the error message “Time Machine couldn’t complete the back up to <INSERT THE NAME OF MY NAS HERE>”. What made this interesting is that my wife’s Mac, which is still on Sonoma, backs up via Time Machine without an issue. It’s only the Macs that I have that are running Sequoia that have this issue, basically implying that Apple broke Time Machine on Sequoia rather than the NAS being the issue.
  2. Adding insult to injury is the fact that the “preparing to back up” phase of backing up on one of my Sequoia Macs can take over 30 minutes. Again, my wife’s Mac doesn’t seem to have this issue which implies that this is a Sequoia issue.

I seem not to be alone in having problems with Time Machine on Sequoia. I have found post after post after post after post after post on this, implying that issues with Time Machine are a widespread problem that Apple has yet to address. Now after looking through all of these posts, along with others that I have not linked to, I noted some common themes among them:

  • For some, disabling the macOS firewall seems to fix these issues, especially if you are on Sequoia 15.0 or 15.0.1. I say that because whatever this firewall issue is, it appears to have been fixed in Sequoia 15.1 for some but not for all. My take on this is that the firewall is on for a reason and you should not mess with it. Thus a better course of action is to try updating to Sequoia 15.1 and see if your issues go away.
  • For some who back up over WiFi to something like a NAS, Apple has a feature that obfuscates the MAC (Media Access Control) address of your WiFi adapter to stop third parties from tracking you if you are using public WiFi. Turning this feature off has resolved these issues for some. The way you do that is as follows:
    • Go to System Settings
    • Go to WiFi
    • Choose the WiFi network that is used to back up to the NAS and click on the three dots on the right hand side and choose network settings.
    • Under “Private WiFi Address” set it to Off and click okay.
  • For some, deleting the backup volume from Time Machine and re-adding it fixes this issue. If you want to test that, here’s what you do (Note: This will NOT delete your backup data in case you are worried about that):
    • Go to System Settings
    • Go to General
    • Go to Time Machine
    • Highlight the backup volume and click on the “-“
    • Click on the “+” and add the backup volume back. It will ask you if you want to retain the existing backup history or delete it. Choose the option to retain the backup history.

Now I am testing removing and re-adding the Time Machine volume now on both my Sequoia Macs along with trying the WiFi suggestion as well. It will take me a few days to get a sense if either of those resolves the issue. But what I have tested and can give you feedback on is the slow speed in terms of “preparing to back up”. I dug out an old trick from my memory banks to test a theory (more on that theory in a moment) and found that it does validate my theory. But there’s a catch to doing what I am about to tell you that I will get to in a minute. First, this is what I did:

  1. I went to the Applications folder.
  2. Then I went to the Utilities folder.
  3. I started the Terminal application.
  4. I then typed this command: sudo sysctl debug.lowpri_throttle_enabled=0
  5. I pressed Return, and it then prompted me for the password for the user account on my Mac. I entered that and hit Return again.
  6. I then closed the Terminal.

What this command does is disable throttling for Time Machine, because Apple’s use case for Time Machine is that you’re backing up every hour by default. As a result, your Mac by default will throttle how fast it backs up so that it doesn’t negatively affect anything else that you might be doing. However, by disabling throttling, your Mac will back up as fast as it can. When I tested this with throttling turned off, it would take about 10 minutes to start backing up my Mac. When I turned throttling back on, it would take 30 minutes or so, as I mentioned above. Beyond that, the backup was faster overall with throttling turned off.
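If you want to script the toggle rather than type the command each time, here is a small wrapper (my own sketch, not an Apple tool; it defaults to a dry run, since actually applying the setting requires macOS and administrator rights):

```python
import subprocess

def set_tm_throttle(throttle_enabled: bool, dry_run: bool = True) -> list:
    # Build the sysctl command that enables (1) or disables (0) macOS's
    # low-priority I/O throttling, which governs Time Machine backup speed.
    setting = 1 if throttle_enabled else 0
    cmd = ["sudo", "sysctl", f"debug.lowpri_throttle_enabled={setting}"]
    if dry_run:
        print("would run:", " ".join(cmd))
    else:
        subprocess.run(cmd, check=True)  # sudo will prompt for your password
    return cmd

set_tm_throttle(False)  # disable throttling: faster backups, busier Mac
set_tm_throttle(True)   # restore the default behaviour
```

One thing worth knowing: sysctl changes made this way do not persist across a reboot, so the default throttling quietly returns after a restart.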

Now turning throttling off has the side effect of making your Mac slower, because it’s going as fast as it can to back up data, affecting everything else you might be doing in the process. You may not want that, especially if you’re still on an Intel Mac. But in my case, I use a third-party utility called TimeMachineEditor, which I wrote about here, to schedule my backups to happen when I am asleep. Thus disabling throttling has no negative effect for me and my use case. And it really doesn’t seem to affect anything on my M1 Pro MacBook Pro, though I will admit that I may put it back to the default setting once everything is sorted and Time Machine works as expected, as I try to run my Macs in as close to a default state as possible. Having said all of that, this test validates my theory: Apple, for reasons that I do not understand, has changed the behaviour of Time Machine in Sequoia to more aggressively throttle backup speed. Because on a Mac with an earlier version of macOS, this process of “preparing to back up” happens much faster.

Like I said earlier, I will report back in terms of how this works, or doesn’t, as I suspect that it may take a week or so before I get an idea on that front. But if you rely on Time Machine and you’re thinking of updating to macOS Sequoia, you may want to hold off until Apple officially fixes whatever they broke. And if you have any insight on these issues, feel free to leave a comment below and share your thoughts.