
You're deploying your Next.js application. The build starts fine, static pages generate successfully, and then—crash. FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory. You increase Node's memory allocation to 8GB, try again, same result. Sound familiar?
This Next.js build out of memory error is particularly frustrating because the usual fix—throwing more RAM at the problem—doesn't work. The real culprit is often hiding in plain sight: large files accidentally committed to your repository that get bundled into the build context.
In this article, we'll diagnose why this happens, trace it to a common root cause involving service account files, and implement a proper fix that keeps your builds lightweight and your secrets secure.
Let's look at a typical scenario. You're building a Next.js 15 application—perhaps for a restaurant loyalty program, an e-commerce platform, or any project that connects to external services. Locally, everything seems fine during development. But when you run next build or deploy to Vercel, you get hit with this:
✓ Compiled successfully
✓ Collecting page data
✓ Generating static pages (18/18)
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 0x100a3d0f8 node::Abort() [node]
2: 0x100a3d278 node::OOMErrorHandler(char const*, v8::OOMDetails const&) [node]
3: 0x100bc1990 v8::internal::V8::FatalProcessOutOfMemory(...)
The build completes the compilation phase without issues. It even generates your static pages successfully. Then, during the final bundling or optimization phase, memory consumption spikes uncontrollably until Node.js crashes.
You try the standard remediation:
# ❌ Doesn't help with this particular issue
NODE_OPTIONS="--max-old-space-size=8192" npm run build
Even allocating 8GB of heap memory doesn't solve it. The build still fails. This indicates the problem isn't about the memory limit being too low—it's about something in your project causing unbounded memory consumption.
This issue was reported in GitHub Issue #76704, where a developer experienced the exact same pattern: a "not really large app" that inexplicably consumed all available memory during builds. The breakthrough came when they discovered large service account files hidden in their repository.
Here's what's actually happening. During the Next.js build process, the bundler (whether Webpack or Turbopack) analyzes your project's dependency graph. It traces imports, resolves modules, and processes files to create optimized bundles. When certain files end up in the build context—either through direct imports or because they're present in directories the bundler scans—they get loaded into memory.
Service account files are a particularly insidious offender. Consider this common pattern for Firebase or Google Cloud integration:
// ❌ Broken: Importing a local JSON file directly
import serviceAccount from "./firebase-service-account.json";
import admin from "firebase-admin";
admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
});
export const db = admin.firestore();
This looks innocent enough. But here's the problem:
Service account files can be large. While a typical service account JSON is only a few KB, some projects accumulate multiple service account files, backup copies, or files with embedded certificates that balloon in size.
They may contain or reference other large files. In some configurations, service account directories include large key files, certificate chains, or bundled dependencies.
They get processed during static analysis. Even if you only import a single field from the JSON, the entire file (and anything it references) gets loaded into the bundler's memory (see the sketch after this list).
Multiple copies multiply the problem. If you have firebase-admin-sdk.json, firebase-admin-sdk-backup.json, and firebase-admin-sdk-prod.json all in your repo, and any of them get pulled into the build context, you're tripling your memory overhead.
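To make the static-analysis point concrete: even though only one field is used, the bundler still inlines the whole JSON file. A simplified sketch, with a hypothetical file name:
// ❌ Only project_id is used, but the bundler still reads, parses, and
// carries the entire JSON file through its pipeline during the build.
import serviceAccount from "./firebase-service-account.json";

export const projectId = serviceAccount.project_id;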
Beyond service account files, similar issues arise from large seed or data JSON files, SQL dumps and local database files, and backup copies (.bak, .old) that duplicate large dependencies. The bundler doesn't discriminate. If a file is in a directory it scans, and it's importable, it becomes part of the build context.
When the bundler encounters a large file or a file that triggers extensive processing, it doesn't just load the file once. It may parse it into a module, keep both the original source and the transformed output in memory, duplicate it across server and client bundles, and hold further copies while generating source maps.
This explains why increasing --max-old-space-size doesn't help. You're not dealing with a "needs more memory" problem—you're dealing with a "shouldn't be processing this at all" problem.
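One way to confirm the diagnosis before restructuring anything is to temporarily keep the suspect file out of the bundler's module graph and re-run the build. A minimal sketch for next.config.js, assuming the Webpack bundler and a hypothetical firebase-service-account.json; this is a diagnostic aid, not the fix:
// next.config.js (diagnostic only): ignore the suspect file and re-run the build.
// If memory usage drops back to normal, you've found the culprit.
module.exports = {
  webpack: (config, { webpack }) => {
    config.plugins.push(
      new webpack.IgnorePlugin({
        resourceRegExp: /firebase-service-account\.json$/,
      })
    );
    return config;
  },
};
Any code that imports the ignored file will fail at runtime, so treat this strictly as a way to isolate the problem.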
The fix has two parts: remove the problematic files from your repository, and restructure your code to load credentials from environment variables instead.
First, find large files in your repository:
# Find files larger than 100KB in your project
find . -type f -size +100k -not -path "./node_modules/*" -not -path "./.git/*"
# Check for common service account patterns
find . -name "*service-account*" -o -name "*credentials*" -o -name "*firebase-admin*"
Look for JSON files containing private keys, certificate files, or any unexpectedly large files in your source directories.
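If you'd rather automate this check (for example, as part of CI), here is a rough Node sketch that flags JSON files containing a private_key field; the script name and heuristic are only suggestions:
// scan-credentials.js (sketch): flag JSON files that look like service account keys
const fs = require("fs");
const path = require("path");

function walk(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    if (entry.name === "node_modules" || entry.name === ".git") continue;
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      walk(full);
    } else if (entry.name.endsWith(".json")) {
      const contents = fs.readFileSync(full, "utf8");
      if (contents.includes('"private_key"')) {
        console.warn(`Possible credential file: ${full}`);
      }
    }
  }
}

walk(process.cwd());
Run it from the project root with node scan-credentials.js.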
Once identified, remove these files from Git tracking:
# Remove from Git but keep local copy (you'll need the values for env vars)
git rm --cached firebase-service-account.json
git rm --cached google-credentials.json
# Commit the removal
git commit -m "Remove service account files from repository"
Add them to .gitignore to prevent future commits:
# .gitignore
# Service account and credential files
**/firebase-service-account*.json
**/google-credentials*.json
**/service-account*.json
**/*-credentials.json
*.pem
*.key
# Environment files (should already be here)
.env
.env.local
.env.*.local
Now, restructure your code to load credentials from environment variables:
// ✅ Fixed: Loading credentials from environment variables
import admin from "firebase-admin";
// Parse the service account from an environment variable
const serviceAccount = JSON.parse(
  process.env.FIREBASE_SERVICE_ACCOUNT_KEY || "{}"
);
if (!admin.apps.length) {
  admin.initializeApp({
    credential: admin.credential.cert(serviceAccount),
  });
}
export const db = admin.firestore();
Set the environment variable in your deployment platform. For Vercel:
# In your Vercel project settings, add:
# Name: FIREBASE_SERVICE_ACCOUNT_KEY
# Value: (paste the entire JSON content, minified)
For local development, use .env.local:
# .env.local (DO NOT commit this file)
FIREBASE_SERVICE_ACCOUNT_KEY='{"type":"service_account","project_id":"your-project",...}'
Service account JSON files are multi-line, which can cause issues with some environment variable parsers. Here's a more robust approach:
// ✅ Fixed: Robust credential loading with error handling
import admin from "firebase-admin";
function getFirebaseAdmin() {
  if (admin.apps.length) {
    return admin;
  }
  const serviceAccountKey = process.env.FIREBASE_SERVICE_ACCOUNT_KEY;
  if (!serviceAccountKey) {
    throw new Error(
      "FIREBASE_SERVICE_ACCOUNT_KEY environment variable is not set"
    );
  }
  try {
    // Handle both stringified JSON and base64-encoded values
    let credentials;
    if (serviceAccountKey.startsWith("{")) {
      credentials = JSON.parse(serviceAccountKey);
    } else {
      // Assume base64 encoded
      const decoded = Buffer.from(serviceAccountKey, "base64").toString(
        "utf-8"
      );
      credentials = JSON.parse(decoded);
    }
    admin.initializeApp({
      credential: admin.credential.cert(credentials),
    });
    return admin;
  } catch (error) {
    throw new Error(`Failed to parse Firebase credentials: ${error.message}`);
  }
}
export const db = getFirebaseAdmin().firestore();
To use base64 encoding (recommended for complex JSON):
# Encode your service account file
cat firebase-service-account.json | base64
# Add the output as FIREBASE_SERVICE_ACCOUNT_KEY in your deployment platform
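If you want to verify the value before pasting it into your platform (or you're on a system without the base64 command), a small Node sketch encodes the file and confirms it decodes back to valid JSON; the file name is assumed:
// encode-key.js (sketch): base64-encode the service account and verify the round trip
const fs = require("fs");

const raw = fs.readFileSync("firebase-service-account.json", "utf8");
const encoded = Buffer.from(raw, "utf8").toString("base64");

// Throws if the encoded value doesn't decode back to valid JSON
const decoded = JSON.parse(Buffer.from(encoded, "base64").toString("utf8"));

console.log(encoded); // paste this as FIREBASE_SERVICE_ACCOUNT_KEY
console.log("Decodes to project:", decoded.project_id);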
Fixing the immediate problem is essential, but preventing recurrence requires establishing proper practices for credential and large file management.
Start with a .gitignore that explicitly excludes credential patterns:
# Credentials and secrets
**/credentials/
**/secrets/
*.pem
*.key
*.p12
*.pfx
*-service-account*.json
*-credentials*.json
firebase-adminsdk*.json
google-cloud-key*.json
# Environment files
.env
.env.*
!.env.example
# Large data files
*.sql
*.sqlite
*.db
data/*.json
seed-data/
# Build artifacts that shouldn't be committed
.next/
out/
build/
dist/
# IDE and OS files
.DS_Store
*.swp
.idea/
.vscode/settings.json
For production deployments, consider dedicated secret management instead of environment variables:
| Platform | Secret Management |
|---|---|
| Vercel | Environment Variables (encrypted at rest) |
| AWS | AWS Secrets Manager or Parameter Store |
| Google Cloud | Secret Manager |
| Azure | Key Vault |
| Self-hosted | HashiCorp Vault, Doppler |
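For example, with Google Cloud Secret Manager the server fetches the key at runtime instead of reading an environment variable. Here is a sketch of a variant of the earlier getFirebaseAdmin helper, assuming the @google-cloud/secret-manager client library and a hypothetical secret named firebase-service-account:
// ✅ Sketch: fetch the service account from Google Cloud Secret Manager at runtime
import { SecretManagerServiceClient } from "@google-cloud/secret-manager";
import admin from "firebase-admin";

async function getFirebaseAdmin() {
  if (admin.apps.length) return admin;

  const client = new SecretManagerServiceClient();
  const [version] = await client.accessSecretVersion({
    // Hypothetical project and secret names: adjust to your setup
    name: "projects/your-project/secrets/firebase-service-account/versions/latest",
  });

  const credentials = JSON.parse(version.payload.data.toString("utf8"));
  admin.initializeApp({ credential: admin.credential.cert(credentials) });
  return admin;
}
The same pattern applies to the other platforms in the table; only the client library changes.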
Use Git hooks to prevent accidental commits of sensitive files:
# Install husky for Git hooks
npm install --save-dev husky
# Initialize husky
npx husky init
Create a pre-commit hook:
#!/bin/sh
# .husky/pre-commit
# Check for potential credential files
if git diff --cached --name-only | grep -E "(service-account|credentials|\.pem|\.key)"; then
  echo "ERROR: Attempting to commit potential credential files!"
  echo "Remove these files from staging and add them to .gitignore"
  exit 1
fi
Run periodic checks for large or sensitive files:
# Find large files in Git history
git rev-list --objects --all | \
git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' | \
sed -n 's/^blob //p' | \
sort -rnk2 | \
head -20
# Check for files that match sensitive patterns
git ls-files | grep -E "(credential|secret|service-account|\.pem|\.key)"
Beyond credentials, maintain awareness of what enters your build. Audit what actually ends up in each bundle with @next/bundle-analyzer, and keep heavy optional dependencies out of the main module graph with dynamic imports.
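A minimal sketch of wiring the analyzer into next.config.js, assuming @next/bundle-analyzer is installed; the ANALYZE flag name is just a convention:
// next.config.js (sketch): generate bundle reports only when ANALYZE=true
const withBundleAnalyzer = require("@next/bundle-analyzer")({
  enabled: process.env.ANALYZE === "true",
});

module.exports = withBundleAnalyzer({
  // ...your existing Next.js config
});
Run the build with ANALYZE=true to see which modules dominate each chunk.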
For heavy optional dependencies, use dynamic imports so the code is loaded only when it is actually needed rather than being pulled into every build path:
// ✅ Better: Dynamic import for optional heavy dependencies
async function generatePDF(data) {
  const { default: PDFDocument } = await import("pdfkit");
  // Use PDFDocument...
}
Periodically run find . -type f -size +100k and check for service account or credential files before they make it into a build. If your builds are still failing after removing large files, investigate dynamic imports, bundle splitting, or consider using standalone output mode for more controlled production builds. The goal is always the same: keep your build context as lightweight as possible.