CloudSEK has published research showing that 22 popular Android applications, collectively installed on more than 500 million devices, contain hardcoded Google API keys that now provide full, unauthorized access to Google’s Gemini artificial intelligence platform.
The report, released today by CloudSEK’s BeVigil security search engine, reveals a structural flaw at the crossroads of decade-old developer practices and Google’s rapidly expanding AI infrastructure. A link to the full report appears at the end of this article.
Background: A Decade-Old Assumption, Quietly Broken
For more than a decade, Google told developers that API keys in the AIza… format were safe to embed in public-facing applications. They were treated as public identifiers, not secrets.
That changed with Gemini. When a developer enables the Gemini API on a Google Cloud project, every existing API key on that project silently inherits access to Gemini endpoints, with no warning, no notification, and no opt-in prompt.
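This silent inheritance is easy to verify: an old key either is or is not accepted by Gemini's public model-listing endpoint. The sketch below, assuming the documented `v1beta/models` REST path and the `key` query parameter, builds that probe URL and reports whether a given key is accepted (HTTP 200). The key value is a placeholder; this only checks acceptance and sends nothing to a model.

```python
import urllib.error
import urllib.parse
import urllib.request

GEMINI_MODELS_ENDPOINT = "https://generativelanguage.googleapis.com/v1beta/models"

def gemini_probe_url(api_key: str) -> str:
    """Build the model-listing URL authenticated by an AIza... API key."""
    return f"{GEMINI_MODELS_ENDPOINT}?{urllib.parse.urlencode({'key': api_key})}"

def key_has_gemini_access(api_key: str) -> bool:
    """Return True if Gemini accepts the key (HTTP 200 on the models list)."""
    try:
        with urllib.request.urlopen(gemini_probe_url(api_key), timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        # 400/403 responses mean the key does not (yet) reach Gemini.
        return False
```

A developer auditing their own project would call `key_has_gemini_access("AIza...")` for each key they ship; a `True` result on a key that was only ever meant for Maps or Firebase is exactly the condition the report describes.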
Developers who embedded Maps or Firebase keys years ago, following Google’s own documentation, now unknowingly hold live credentials to one of the world’s most powerful AI systems.
BeVigil scanned the top 10,000 Android apps by install count and confirmed 32 such live keys across 22 applications.
The Affected Apps: Household Names, Global Reach
The 22 vulnerable applications span e-commerce, travel, finance, education, news, and productivity. They include:
- OYO Hotel Booking App (100M+ installs)
- Google Pay for Business (50M+ installs)
- Taobao (50M+ installs)
- apna Job Search App (50M+ installs)
- ELSA Speak: AI English Learning (10M+ installs) – confirmed data exposure
- The Hindu: India and World News (10M+ installs)
- Shutterfly: Prints, Cards and Gifts (10M+ installs)
- JioSphere Web Browser (10M+ installs)
- Muslim: Ramadan 2026, Athan (10M+ installs)
- 30 Day Fitness Challenge, Krishify, ISS Live Now, and 10 others
CONFIRMED DATA EXPOSURE: Using the key found in ELSA Speak’s publicly downloadable app, CloudSEK researchers queried Google’s Gemini Files API and received a live response listing uploaded audio files. The files were likely speech recordings submitted by users for AI-powered pronunciation coaching.
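The query the researchers describe amounts to a single authenticated GET against the Files API list endpoint. A minimal sketch of reproducing the check against one's own key follows; the `v1beta/files` path and the `files`/`displayName` fields of the response are taken from Google's public Gemini REST documentation, and the parsing helper is split out so it can be exercised without a live key.

```python
import json
import urllib.parse
import urllib.request

FILES_ENDPOINT = "https://generativelanguage.googleapis.com/v1beta/files"

def extract_file_names(payload: dict) -> list:
    """Pull human-readable names from a Files API list response.

    The response looks like {"files": [{"name": "files/abc", "displayName": ...}]}.
    """
    return [f.get("displayName") or f.get("name", "") for f in payload.get("files", [])]

def list_uploaded_files(api_key: str) -> list:
    """List files uploaded under the key's Google Cloud project."""
    url = f"{FILES_ENDPOINT}?{urllib.parse.urlencode({'key': api_key})}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return extract_file_names(json.load(resp))
```

A non-empty result from `list_uploaded_files` on a key extracted from a public APK is precisely the live exposure CloudSEK observed with ELSA Speak.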
What an Attacker Can Do With a Single Exposed Key
Any person who decompiles a vulnerable app and extracts its hardcoded key can:
- Access and download private user files, including documents, audio, and images, stored in the Gemini Files API
- Make unlimited Gemini API calls, potentially generating thousands of dollars in charges on the developer’s Google Cloud account
- Exhaust the organization’s API quotas, knocking out AI-powered features for real users
- Read cached AI context windows, which may contain sensitive prompts and internal data
- Continue exploiting the key across multiple app update cycles, as hardcoded keys often survive app versioning
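Extracting such a key requires no special skill: once an APK is unpacked (for example with apktool), one regex pass over the decoded files finds any embedded key, because Google API keys follow a fixed pattern of `AIza` plus 35 URL-safe characters. A minimal scanner sketch, assuming the decompiled app sits in a local directory:

```python
import re
from pathlib import Path

# Google API keys are "AIza" followed by 35 URL-safe characters (39 chars total).
AIZA_KEY_RE = re.compile(r"AIza[0-9A-Za-z_\-]{35}")

def find_api_keys(text: str) -> set:
    """Return every substring matching the Google API key pattern."""
    return set(AIZA_KEY_RE.findall(text))

def scan_decompiled_apk(root: str) -> set:
    """Scan every file under a decompiled-APK directory for embedded keys."""
    keys = set()
    for path in Path(root).rglob("*"):
        if path.is_file():
            try:
                keys |= find_api_keys(path.read_text(errors="ignore"))
            except OSError:
                pass  # unreadable file; skip it
    return keys
```

This is essentially the pattern-matching step an automated scanner like BeVigil performs at scale, and it is why a key shipped inside an app binary must be treated as public the moment the app is published.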
Real Losses: Three Cases of Gemini API Key Abuse
The following highlights three publicly reported cases where stolen or exposed Google API keys led to severe financial harm:
Case 1: $15,400 overnight. A solo developer’s startup nearly collapsed after an attacker used his exposed key to flood Gemini with inference requests. The developer revoked the key within 10 minutes of a $40 billing alert. Due to a 30-hour reporting lag in Google Cloud’s billing system, the damage had already reached $15,400 by the time the dashboard updated.
Case 2: $128,000 and a company facing bankruptcy. A Japanese company using the Gemini API for internal tools saw approximately 20.36 million yen (around $128,000) in unauthorized charges accumulate after its key was compromised, even though firewall-level IP restrictions were in place. Google initially denied an adjustment request.
Case 3: $82,314 in 48 hours, a 455-times spike. A three-person development team in Mexico with a typical monthly cloud spend of $180 had their key stolen between February 11 and 12, 2025. Within 48 hours, attackers generated $82,314 in Gemini charges. Google’s representative initially held the company liable under the platform’s Shared Responsibility Model, citing an amount that exceeded the company’s total bank balance.
Full Report: https://www.cloudsek.com/blog/hardcoded-google-api-keys-in-top-android-apps-now-expose-gemini-ai

90% Run Enterprise GenAI at Scale, Yet 65% Lack Confidence in Data Security Controls
Posted in Commentary on April 8, 2026 by itnerd
MIND has released new research, “The Impact of Data Trust on AI Initiative Success,” which examines the role of data trust in AI success, revealing a widening gap between rapid AI adoption and the ability to secure and govern the data that powers it.
Key findings include:
There is a wide gap between visibility and enforcement: Most organizations have written policies for AI. They have governance frameworks, acceptable-use documents, and AI councils. What they cannot do is enforce those policies effectively at machine speed. In fact, 70% struggle to enforce policies on GenAI tools, 66% cannot enforce AI agent policies, and 98% report at least one AI security challenge.
Data fundamentals are lacking and impede AI projects: Every day an AI tool operates against an unclassified, ungoverned data estate, it surfaces exposure that no one can see or manage. The challenge is urgent: 68% do not know what data their agents are accessing, 65% do not know what data is accessible for AI input, and 41% report known Shadow GenAI in their environment.
AI does not behave like a human: Policies written for human actors are insufficient for AI agents that execute without hesitation. Data estates that were never fully classified become comprehensively and immediately exposed the moment an agent is pointed at them. These agents behave in ways that existing security frameworks were never built to track. Alarmingly, 90% of organizations have given broad data access to enterprise GenAI, 68% cannot determine what data their agents are accessing, and 32% have unknown agents already operating in their environment.
You can look at the research here: https://mind.io/content/research-report-impact-of-data-trust-on-ai-success