Archive for Privacy

Google Analytics Declared Unlawful In Denmark

Posted in Commentary on September 22, 2022 by itnerd

Denmark yesterday declared the use of Google Analytics unlawful. The Danish Data Protection Agency concluded that lawful use of the tool would require the ‘implementation of supplementary measures in addition to the settings provided by Google’. The Agency stated that the decision reflects a common European position on the protection of citizens’ personal data. Here are the key details:

The Danish Data Protection Agency has looked into the tool Google Analytics, its settings, and the terms under which the tool is provided. On the basis of this review, the Danish Data Protection Agency concludes that the tool cannot, without more, be used lawfully. Lawful use requires the implementation of supplementary measures in addition to the settings provided by Google.

In short, if you’re in Denmark you can’t use Google Analytics. Full stop.

Mark Bower, VP of Product Management of Anjuna Security:

“The ever-expanding bulk collection of consumer data and its handling will continue to land under the EU regulatory microscope, especially with the recent expansion of GDPR scope around inferred data following recent rulings in Lithuania that propagate across the union. Under this new extension, data that is derived from personal data is considered in scope. If breached, it has the same consequence as primary personal identifiers, including massive fines. This has sweeping impact and risk for organizations: traditional approaches to compliance, which often assume that personal data can be identified in advance of collection and then protected, may no longer work or be fit for purpose, especially with machine learning models where new derived outcomes and inference are coveted by data processors across industry, especially ad-tech, payments, financial services and retail. Organizations handling personal data must therefore look at more thorough and innovative protection strategies, in addition to carefully analyzing the risk of bulk collection itself. It’s no surprise then that the top of the data food chain is the first to be put in the spotlight – but they will not be the last.”

You have to assume that a bunch of people at Google are not happy about this as gathering data and making money off of it is their core business. And I would not be surprised if other places on the planet start to do similar things.

Sucks to be Google.

Morgan Stanley Gets Slapped With $35 Million Fine After Failing To Wipe And/Or Encrypt Hard Drives That Eventually Were Resold

Posted in Commentary on September 22, 2022 by itnerd

Well, this is one hell of a screw up.

A reader pointed out to me that the SEC has fined Morgan Stanley $35 million. The press release that the SEC put out has these details:

The Securities and Exchange Commission today announced charges against Morgan Stanley Smith Barney LLC (MSSB) stemming from the firm’s extensive failures, over a five-year period, to protect the personal identifying information, or PII, of approximately 15 million customers. MSSB has agreed to pay a $35 million penalty to settle the SEC charges.

The SEC’s order finds that, as far back as 2015, MSSB failed to properly dispose of devices containing its customers’ PII. On multiple occasions, MSSB hired a moving and storage company with no experience or expertise in data destruction services to decommission thousands of hard drives and servers containing the PII of millions of its customers. Moreover, according to the SEC’s order, over several years, MSSB failed to properly monitor the moving company’s work. The staff’s investigation found that the moving company sold to a third party thousands of MSSB devices including servers and hard drives, some of which contained customer PII, and which were eventually resold on an internet auction site without removal of such customer PII. While MSSB recovered some of the devices, which were shown to contain thousands of pieces of unencrypted customer data, the firm has not recovered the vast majority of the devices.

The SEC’s order also finds that MSSB failed to properly safeguard customer PII and properly dispose of consumer report information when it decommissioned local office and branch servers as part of a broader hardware refresh program. A records reconciliation exercise undertaken by the firm during this decommissioning process revealed that 42 servers, all potentially containing unencrypted customer PII and consumer report information, were missing. Moreover, during this process, MSSB also learned that the local devices being decommissioned had been equipped with encryption capability, but that the firm had failed to activate the encryption software for years.

Wow. There are a lot of #fails here. And quite honestly, if I were a Morgan Stanley customer, I would be pissed.

Yes I said it.

The fact is that in 2015, never mind 2022, this is completely unacceptable. Companies need to handle Personally Identifiable Information, or PII, with the utmost care. Morgan Stanley didn’t, and it’s cost them. And seeing as they agreed to pay this fine to make this problem go away, I suspect they figured out that they were in deep trouble when the SEC knocked on their door.

Hopefully, companies who handle PII are paying attention to this and hopefully the SEC doles out more punishment like this to send the message that if you screw this up, you will pay.

#Fail: Feelyou Exposes 70k Personal Emails

Posted in Commentary on July 19, 2022 by itnerd

From the #Fail department comes the story of anonymous mental health app Feelyou, which accidentally exposed 70,000 users’ personal emails by failing to require any authentication to access the app’s GraphQL API.

That truly is a #Fail.

The vulnerability, discovered by security researcher Maia Arson Crimew, was patched over the weekend. Which is cold comfort if you use this app.

Yariv Shivek, VP of Product, Neosec had this comment:

“Healthcare APIs carry sensitive data and therefore must be secure. However, without proper automated controls – such as API monitoring – it’s hard to know when you’re providing sensitive information without correct authentication.”
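To make the failure concrete, here is a minimal sketch of the kind of server-side guard Feelyou’s GraphQL endpoint apparently lacked: reject any query that arrives without a valid session token, before any resolver runs. All names here (`check_auth`, `handle_graphql`, the token store) are hypothetical — this is not Feelyou’s actual code, just the general pattern.

```python
# Hypothetical sketch: authenticate a GraphQL request BEFORE executing
# any resolver. An API that skips this step serves data (like 70k user
# emails) to anyone who asks.

VALID_TOKENS = {"s3cret-session-token"}  # stand-in for a real session store

def check_auth(headers: dict) -> bool:
    """Return True only if the request carries a known bearer token."""
    auth = headers.get("Authorization", "")
    return auth.removeprefix("Bearer ").strip() in VALID_TOKENS

def handle_graphql(headers: dict, query: str) -> dict:
    # Gate every query on authentication; return a 401-style error
    # payload instead of running the query for anonymous callers.
    if not check_auth(headers):
        return {"errors": [{"message": "unauthenticated"}], "status": 401}
    # (A real handler would execute the query against a schema here.)
    return {"data": {"ok": True}, "status": 200}
```

The point of the sketch is simply that the check has to sit in front of the resolvers: GraphQL itself has no built-in authentication, so an endpoint that omits this guard is open to the world.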

This is really embarrassing for the makers of this app, and hopefully they take this opportunity to make sure that personal info stays secure.

Senators Demand Answers On Data Collection And Sharing Policies From Two Mental Health App Providers

Posted in Commentary on July 2, 2022 by itnerd

Earlier this week I wrote about the potential pitfalls of data that relates to abortions in the wake of the removal of reproductive rights in the U.S. Today I have another example of how sensitive data can be misused. The Verge is reporting that US Senators Ron Wyden (D-OR), Elizabeth Warren (D-MA), and Cory Booker (D-NJ) have written to two leading mental health app providers, Talkspace and BetterHelp, and are demanding answers about their data collection and sharing practices:

In letters to BetterHelp and Talkspace executives on Thursday, Warren — along with Sens. Cory Booker (D-NJ) and Ron Wyden (D-OR) — called on the mental health companies to explain how their apps collect and use data obtained from their patients. Specifically, lawmakers requested information on the apps’ relationships with online advertisers, data brokers, and social media platforms like Facebook as well as how those relationships are disclosed to users.

Reviewing the companies’ privacy policies, the senators wrote that “unfortunately, it appears possible that the policies used by your company and similar mental health platforms allow third-party Big Tech firms and data brokers, who have shown remarkably little interest in protecting vulnerable consumers and users, to access and use highly confidential personal and medical information.”

The letter follows a report published in May by the Mozilla Foundation, which warned consumers that online talk therapy apps could be profiting off of their mental health data. While both BetterHelp and Talkspace promise not to sell a user’s medical data without their consent, the researchers determined that personal information — like a patient’s name, phone number, and email — could still be sold or accessed by third parties for advertising and marketing purposes.

Well, that’s shady. But sadly, not a new phenomenon. And Dan Weiss, SVP of Application & Network Security Services at GRIMM, agrees:

This problem is neither new nor unique. Instead, it represents a challenge to both users and regulators. Technology’s trend to move more rapidly than regulation and to expand in ways that are difficult to predict is both well-known and a consistent story over time. This case highlights that consumers still operate under the mistaken bias that data (of any sort) that passes through a mobile device has any assumption of privacy. The truth is that ensuring “privacy” in this landscape is a nearly impossible challenge, one which most application developers implement imperfectly, if at all.
Coupled with the fact that user data is the primary product for many applications, the challenge of restricting unintended use cases is one that regulators have failed to address. Some platforms, such as Apple, have worked to provide a level of additional visibility and control to users (although a cynic might rightfully question the completeness and motivation for these changes). However, attempting to solve the problem through current regulatory strategies is essentially doomed to fail for the reasons alluded to above. Ultimately, it will fall to the consumer to understand how the application is using their data and fully understand that the conventional definition of privacy no longer applies to data that transits the vast majority of commercial mobile applications.

It will be interesting to see how these companies respond to these senators. But it will be even more interesting to see if they suddenly make changes to hopefully make the scrutiny on them go away.

Watch this space for more.

App Data Could Be Used To Prosecute Women Under Anti-Abortion Laws

Posted in Commentary on June 30, 2022 by itnerd

In the wake of the US Supreme Court overturning reproductive rights, there is now a legitimate concern that prosecutors getting access to data from period tracking apps and other apps like search engines and text messages is a real possibility. Under some state legislation, it could even be illegal to send a text message offering help or support. Via CNN:

A wave of new legislation taking aim at abortion rights across the country is raising concerns about the potential use of personal data to punish people who seek information about or access to abortion services online.

In some of the most restrictive states, digital rights experts warn that people’s search histories, location data, messages and other digital information could be used by law enforcement agencies investigating or prosecuting abortion-related cases.

Concerns about the digital privacy implications of abortion restrictions come amid a movement by Republican-controlled states, including Georgia, Texas, Mississippi and Oklahoma, in recent years to pass laws severely curtailing access to the service. And they take on additional significance following the leak Monday of the Supreme Court draft opinion that would overturn Roe v. Wade, which guarantees a person’s Constitutional right to terminate a pregnancy before viability (usually around 24 weeks). Overturning the landmark 1973 court ruling would transform the landscape of reproductive health in America, leaving abortion policy up to individual states and potentially paving the way for more than 20 states to pass new laws restricting abortions.

That story from CNN was published before the US Supreme Court struck down reproductive rights. Now, in the wake of that decision, states are moving ahead with this sort of legislation. That has led Democrats to pursue legislation to provide legal protection for the privacy of this data. But in the here and now, the risk is still very real. Jake Williams, Executive Director of Threat Intelligence at SCYTHE, provides this view and advice:

Search providers are required to comply with subpoenas from law enforcement when the search results themselves are evidence of a crime. Given the rapidly changing laws around access to abortion, searches for abortion and abortion related topics can be risky. While some have recommended searching using private browsing (or Incognito mode), these searches are still tied to your IP address. Ownership or use of the IP can be revealed through your ISP or mobile provider. You should ideally use a VPN when searching for legally ambiguous topics. Some past subpoenas have relied on geofencing to locate mobile phone subscribers within a particular area. It is also conceivable that this technique will be used to identify those who have traveled to a specific location where abortion or abortion related services are offered.
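The geofencing technique Williams mentions boils down to a point-in-radius test: given location pings obtained from a provider, keep every subscriber whose coordinates fall within some distance of a target. A rough sketch, with entirely illustrative coordinates and a standard haversine distance:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def in_geofence(point, center, radius_km):
    return haversine_km(point[0], point[1], center[0], center[1]) <= radius_km

# Hypothetical: a 1 km geofence around a location of interest, checked
# against location pings obtained from a mobile provider. Coordinates
# are illustrative (midtown Manhattan vs. several km away).
center = (40.7580, -73.9855)
pings = [(40.7585, -73.9850), (40.6892, -74.0445)]
matches = [p for p in pings if in_geofence(p, center, 1.0)]  # only the first ping
```

The chilling part is how little is needed: the filter requires no names up front, because matching a device to the fence is itself what identifies the person.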

This is a troubling time for American women, as things have moved into a place more akin to the Margaret Atwood novel “The Handmaid’s Tale”. Thus it makes sense for anyone in this position to take reasonable precautions to ensure their safety.

The Office Of The Privacy Commissioner Of Canada Says Tim Hortons Was Tracking You Illegally

Posted in Commentary on June 1, 2022 by itnerd

Tim Hortons was recently under investigation by The Office of the Privacy Commissioner of Canada over accusations that it was tracking you without your permission. The results of that investigation are now out and….

People who downloaded the Tim Hortons app had their movements tracked and recorded every few minutes of every day, even when their app was not open, in violation of Canadian privacy laws, a joint investigation by federal and provincial privacy authorities has found.

The investigation concluded that Tim Hortons’ continual and vast collection of location information was not proportional to the benefits Tim Hortons may have hoped to gain from better targeted promotion of its coffee and other products.

The Office of the Privacy Commissioner of Canada, Commission d’accès à l’information du Québec, Office of the Information and Privacy Commissioner for British Columbia, and Office of the Information and Privacy Commissioner of Alberta issued their Report of Findings today.

The Tim Hortons app asked for permission to access the mobile device’s geolocation functions, but misled many users to believe information would only be accessed when the app was in use. In reality, the app tracked users as long as the device was on, continually collecting their location data.

The app also used location data to infer where users lived, where they worked, and whether they were travelling. It generated an “event” every time users entered or left a Tim Hortons competitor, a major sports venue, or their home or workplace.

The investigation uncovered that Tim Hortons continued to collect vast amounts of location data for a year after shelving plans to use it for targeted advertising, even though it had no legitimate need to do so.

The company says it only used aggregated location data in a limited way, to analyze user trends – for example, whether users switched to other coffee chains, and how users’ movements changed as the pandemic took hold.

While Tim Hortons stopped continually tracking users’ location in 2020, after the investigation was launched, that decision did not eliminate the risk of surveillance. The investigation found that Tim Hortons’ contract with an American third-party location services supplier contained language so vague and permissive that it would have allowed the company to sell “de-identified” location data for its own purposes.

There is a real risk that de-identified geolocation data could be re-identified. A research report by the Office of the Privacy Commissioner of Canada underscored how easily people can be identified by their movements.
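The re-identification risk the report flags is easy to illustrate. Even with names stripped, a person’s inferred home-plus-work location pair is very often unique, so an observer with a little outside knowledge can join a “de-identified” trace back to an individual. A toy sketch with hypothetical data:

```python
# "De-identified" location records: no names, just inferred home and
# work coordinates. All data here is hypothetical, for illustration.
deidentified = [
    {"id": "u1", "home": (43.651, -79.347), "work": (43.645, -79.380)},
    {"id": "u2", "home": (43.700, -79.400), "work": (43.645, -79.380)},
    {"id": "u3", "home": (43.651, -79.347), "work": (43.760, -79.410)},
]

def reidentify(home, work):
    """Return records whose home+work pair matches outside knowledge.

    If exactly one record matches, the "anonymous" trace is that person:
    de-identification has failed.
    """
    return [r for r in deidentified if r["home"] == home and r["work"] == work]

# An observer who knows where one person lives and works finds exactly
# one matching trace -- even though the dataset contains no names.
hits = reidentify((43.651, -79.347), (43.645, -79.380))
```

Note that u1 shares a home location with u3 and a workplace with u2, yet the *pair* still singles u1 out — which is exactly the pattern the OPC research report warns about.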

None of this sounds good, and it leaves Tim Hortons looking terrible. I suspect that because of that, Tim Hortons agreed to these items:

  • Delete any remaining location data and direct third-party service providers to do the same;
  • Establish and maintain a privacy management program that: includes privacy impact assessments for the app and any other apps it launches; creates a process to ensure information collection is necessary and proportional to the privacy impacts identified; ensures that privacy communications are consistent with, and adequately explain app-related practices; and
  • Report back with the details of measures it has taken to comply with the recommendations.

This report makes it clear that you can’t trust Tim Hortons and their apps. And even after the above items are implemented, I still wouldn’t trust them. So what does Tim Hortons have to say about this? Not much, really:

“Data from this geolocation technology was never used for personalized marketing for individual guests. The very limited use of this data was on an aggregated, de-identified basis to study trends in our business — and the results did not contain personal information from any guests,” spokesperson Michael Oliveira said in an email.

“We’ve strengthened our internal team that’s dedicated to enhancing best practices when it comes to privacy and we’re continuing to focus on ensuring that guests can make informed decisions about their data when using our app.”

Sure they are. I am not buying what they are saying.

It is worth noting that Tim Hortons is still facing four class-action lawsuits in B.C., Ontario, and Quebec. So their issues are not over by a long shot.

India Orders VPN Providers To Retain Data…. VPN Providers Are Considering Their Options Including Leaving The Country

Posted in Commentary on May 9, 2022 by itnerd

India has ordered VPN providers to collect and store users’ data, including names, addresses, contact numbers, email and IP addresses, for up to five years. With this move, Wired reported that VPN providers have since threatened to quit India:

The justification from the country’s Computer Emergency Response Team (CERT-In) is that it needs to be able to investigate potential cybercrime. But that doesn’t wash with VPN providers, some of whom have said they may ignore the demands. “This latest move by the Indian government to require VPN companies to hand over user personal data represents a worrying attempt to infringe on the digital rights of its citizens,” says Harold Li, vice president of ExpressVPN. He adds that the company would never log user information or activity and that it will adjust its “operations and infrastructure to preserve this principle if and when necessary.”

Artur Kane, CMO at GoodAccess had this to say:

“Though controversial upon inception, the so-called data retention legislation has now been with us for decades. Most technologically developed countries enforce these directives with varying retention periods, usually ranging from 6 months to 2 years. In some countries, all expenses on data retention are even covered by the government.

Until now, the data retention obligations were limited to infrastructure providers (internet service providers, telecommunications), and asking the same of VPN vendors is without precedent in democratic countries.

The use of VPNs, in the past widely adopted by companies to provide remote access to company IT resources, has rapidly spread to millions of consumers over the past decade, who use it to avoid surveillance by internet providers, bypass country-based content filtering, and other restrictions. In my opinion, cybercriminals had been using VPNs to anonymize their activities even before ordinary users jumped on the trend.

Now, forcing VPN providers to track user traffic and their private data (like source and destination IP, port, protocol, and timestamps) is going to invalidate one of the last remaining safeguards of personal privacy on the public internet while helping to expose only a handful of lawbreakers. 

The value for the price doesn’t add up, either. Privacy is a basic human need, legally protected in many free countries, and people have the right to protect it, especially now, when their sensitive data is more valuable than ever and is being collected on a shocking scale.

Law on the public internet can be enforced in other ways that do not impact user privacy, such as the use of behavioral algorithms by vendors, looking for characteristic patterns of potentially malicious behaviors, or disabling VPN services to those accounts where such events were detected.”

I have been to India a number of times and this news is very disappointing. India really needs to reconsider this as this is a massive overreach by the Indian Government. And it risks making them a very repressive country that nobody will want to visit or do business in.

Ikea Canada Had An “Internal” Data Breach…. WTF??

Posted in Commentary on May 7, 2022 by itnerd

Over the last month, my wife and I have been shopping at Ikea Canada. But I may be rethinking that, as it came to light this past week that Ikea Canada had what they term an “internal” data breach that affected 95,000 Canadians. Global News has the details:

Ikea Canada told Global News it was made aware that some of its customers’ personal information appeared in the results of a generic search made by an employee between March 1 and March 3.

A spokesperson added that the information was accessed by the person using Ikea’s customer database.

“While we can’t speculate as to why the search was made, we can share that we have taken actions to remedy this situation,” Ikea Canada PR leader Kristin Newbigging said.

“We have also reviewed our internal processes and reminded our co-workers of their obligation to protect customer information.”

Okay. The fact that you have to remind your employees not to do something like this is a huge problem. And the fact that an employee did this is a massive problem. It likely shows that their internal controls weren’t on point.

Here’s the best news out of this:

Ikea Canada has submitted a breach report to the Office of the Privacy Commissioner of Canada (OPC).

OPC officials confirmed they are in communication with the company to get more information and determine next steps. They would not say what those steps could be.

Hopefully the OPC smacks Ikea Canada silly as this is pretty unacceptable from my perspective. In the meantime, affected customers have already been notified by email.

Google Collects Data From Google Dialer And Messages Without Your Consent Or Ability To Opt Out…. WTF?

Posted in Commentary on March 23, 2022 by itnerd

People have said to me that I am such an Apple Fanboy because I tend to gravitate towards Apple products. The reality is that while I don’t trust any company completely, I trust Apple more than Google. And this story is a clear reason why I feel that way:

According to a research paper, “What Data Do The Google Dialer and Messages Apps On Android Send to Google?” [PDF], by Trinity College Dublin computer science professor Douglas Leith, Google Messages (for text messaging) and Google Dialer (for phone calls) have been sending data about user communications to the Google Play Services Clearcut logger service and to Google’s Firebase Analytics service.

“The data sent by Google Messages includes a hash of the message text, allowing linking of sender and receiver in a message exchange,” the paper says. “The data sent by Google Dialer includes the call time and duration, again allowing linking of the two handsets engaged in a phone call. Phone numbers are also sent to Google.” The timing and duration of other user interactions with these apps has also been transmitted to Google. And Google offers no way to opt-out of this data collection.
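The paper’s point about hashing is worth unpacking: a hash hides the literal message text, but because it is deterministic, the same message produces the same digest on both handsets, so telemetry from two phones can still be joined into one conversation. A toy illustration (the paper does not specify that Google uses SHA-256; the hash function and the log records below are assumptions for demonstration):

```python
import hashlib

def message_hash(text: str) -> str:
    # Deterministic: the same message always yields the same digest,
    # so logs collected from two different handsets can be joined on
    # it even though the text itself was never transmitted.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hypothetical telemetry records uploaded from two different phones:
sender_log = {"phone": "handset-A", "hash": message_hash("see you at 6")}
receiver_log = {"phone": "handset-B", "hash": message_hash("see you at 6")}

# Matching digests link the two handsets as the ends of one exchange.
linked = sender_log["hash"] == receiver_log["hash"]
```

This is why “we only send a hash” is weaker privacy than it sounds: the hash functions perfectly well as a join key between sender and receiver.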

So in short, Google is unsurprisingly harvesting user data. Something that they don’t exactly confirm. But they don’t exactly deny either:

Google confirmed to The Register on Monday that the paper’s representations about its interactions with Leith are accurate. “We welcome partnerships – and feedback – from academics and researchers, including those at Trinity College,” a Google spokesperson said. “We’ve worked constructively with that team to address their comments, and will continue to do so.”

The paper raises questions about whether Google’s apps comply with GDPR but cautions that legal conclusions are out of scope for what is a technical analysis. We asked Google whether it believes its apps meet GDPR obligations but we received no reply.

Hopefully politicians in both the US and Europe are paying attention because this is something that merits an investigation. And perhaps some form of punishment.

Clearview AI Brags About Being Able To Ensure “Almost Everyone In The World Will Be Identifiable”

Posted in Commentary on February 19, 2022 by itnerd

I’ve written about Clearview AI before. And the fact that when it comes to violating your privacy, they’re right up there with Meta/Facebook. Well after reading this Washington Post story, I have to say that Meta/Facebook doesn’t seem so bad compared to these clowns:

Clearview AI is telling investors it is on track to have 100 billion facial photos in its database within a year, enough to ensure “almost everyone in the world will be identifiable,” according to a financial presentation from December obtained by The Washington Post. 

Those images — equivalent to 14 photos for each of the 7 billion people on Earth — would help power a surveillance system that has been used for arrests and criminal investigations by thousands of law enforcement and government agencies around the world. And the company wants to expand beyond scanning faces for the police, saying in the presentation that it could monitor “gig economy” workers and is researching a number of new technologies that could identify someone based on how they walk, detect their location from a photo or scan their fingerprints from afar. 

The 55-page “pitch deck,” the contents of which have not been reported previously, reveals surprising details about how the company, whose work already is controversial, is positioning itself for a major expansion, funded in large part by government contracts and the taxpayers the system would be used to monitor. The document was made for fundraising purposes, and it is unclear how realistic its goals might be. The company said that its “index of faces” has grown from 3 billion images to more than 10 billion since early 2020 and that its data collection system now ingests 1.5 billion images a month.

With $50 million from investors, the company said, it could bulk up its data collection powers to 100 billion photos, build new products, expand its international sales team and pay more toward lobbying government policymakers to “develop favorable regulation.”

Doesn’t that make you feel creeped out? It sure makes me feel creeped out. The only solution is to regulate this company to death so that its business model is made non-viable. And if I had to place bets on who might be likely to do that, I would say the EU would take the lead, because they simply don’t tolerate violations of privacy. But it would mean a lot more if the US, which is where Clearview AI is based, did something, as it would send a clear message that this sort of business model is unacceptable.