Vibe Coding Is Leaking Your Secrets — Here's the Fix

VaultProof Team · 6 min read

The vibe coding explosion

"Vibe coding" — the practice of describing what you want to an AI and letting it write the code — has gone mainstream. Tools like Cursor, Lovable, Bolt, v0, and Claude Code are turning ideas into working apps in minutes instead of weeks.

The speed is intoxicating. You can build a full-stack SaaS app in an afternoon. Ship a prototype before lunch. Deploy to production by dinner.

But there's a catch that nobody talks about until it's too late: AI-generated code is leaking your secrets at an unprecedented rate.

The numbers are terrifying

  • 81% surge in AI-related credential leaks
  • 1 in 5 vibe-coded sites expose API keys
  • 45% of AI-generated code has security flaws

These aren't hypothetical risks. GitHub's 2024 report found 39 million secrets leaked in public repositories — the highest number ever recorded. The 2025 numbers are tracking even higher, driven largely by AI-assisted development.

The pattern is clear: faster development + less security review = more leaks.

Real incidents that happened

Moltbook: 1.5 million API keys exposed

A vibe-coded platform built with AI assistance shipped to production with API keys embedded directly in client-side JavaScript. Security researchers found 1.5 million API keys from services including OpenAI, Stripe, and AWS exposed in the browser source code. The developer had no idea.

SANDWORM_MODE: 19 malicious npm packages

Attackers published 19 npm packages specifically designed to harvest API keys from projects built with AI coding tools. The packages mimicked popular AI utility libraries and silently exfiltrated environment variables. Over 47,000 downloads before removal.

Lovable platform: 170 apps exposed

Security researchers found that 170 applications built on the Lovable AI coding platform had exposed Supabase credentials, API keys, and database URLs. The AI code generator was embedding secrets directly into frontend code by default.

Why this keeps happening

The root cause isn't that developers are careless. It's that the AI-assisted development workflow has fundamental security gaps:

  1. AI puts keys directly in code. When you tell an AI "connect to OpenAI," it writes apiKey: 'sk-proj-...' inline. That's what it learned from training data. It doesn't know better.

  2. You push without reviewing. When AI generates 500 lines and it works, you commit and push. Who's reading every line? Nobody. That's the whole point of vibe coding.

  3. .env files are visible to AI agents. Even if you use a .env file, tools like Cursor and Claude Code have filesystem access. They can read your .env, and they often include its contents in code suggestions or context windows.

  4. Deploy platforms auto-expose secrets. Many AI-integrated deploy platforms inject environment variables into client bundles by default. If a variable name starts with NEXT_PUBLIC_ or VITE_, it ships to the browser.

The fix: keep keys out of your codebase entirely

The only way to guarantee a key can't leak from your code is to make sure it was never in your code. That's what VaultProof does.

With VaultProof's proxy URL, your OpenAI/Anthropic/Mistral key never enters your codebase, your .env, your shell history, or your AI agent's context. You change one line:

Before (dangerous):

const client = new OpenAI({
  apiKey: 'sk-proj-abc123...',  // leaked in 5 minutes
});

After (secure):

const client = new OpenAI({
  baseURL: 'https://proxy.vaultproof.dev/v1/openai/<VAULT_ID>',
  apiKey: 'unused',  // key never touches your code
});

Already have a .env file full of secrets? VaultProof's import feature lets you upload your .env in one click. Each key gets Shamir-split and stored securely. Then delete the .env and use proxy URLs instead.

The AI agent sees a proxy URL. Not a secret. It can't leak what it doesn't have.
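Conceptually, the core step a secrets proxy performs is simple: accept a request that carries no real credential, attach the real key (which only the proxy can read), and forward it upstream. A hypothetical sketch of that step — forwardWithRealKey and vaultLookup are illustrative names, not VaultProof's actual API:

```javascript
// Hypothetical sketch of a secrets proxy's core step: the client's request
// arrives with a placeholder credential, and the proxy swaps in the real key
// before forwarding to the upstream provider.
function forwardWithRealKey(incoming, vaultLookup) {
  const realKey = vaultLookup(incoming.vaultId); // key exists only server-side
  return {
    url: 'https://api.openai.com' + incoming.path,
    method: incoming.method,
    headers: {
      ...incoming.headers,
      Authorization: `Bearer ${realKey}`, // replaces the 'unused' placeholder
    },
  };
}
```

The client-side code, your git history, and your AI agent's context only ever contain the proxy URL and the placeholder; the Authorization header is rewritten after the request leaves your machine.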

Import your .env in one click

Upload your .env file. VaultProof splits and encrypts every key. Replace secrets with proxy URLs. Delete the .env.

Get Started Free