Performance & Optimization
Bundle Size Optimization in Next.js: Under 150KB Initial JS
TL;DR
Every kilobyte of JavaScript you ship to the browser costs you users. Parse time, execution time, memory pressure — it all compounds, especially on mobile. My hard rule is under 150KB of initial JavaScript on every project I ship. On FreshMart, a UK grocery e-commerce platform, the initial JS bundle was 380KB when I started the performance audit. After a systematic optimization pass — bundle analysis, dynamic imports, library swaps, Server Components, and aggressive tree-shaking — I brought it down to 120KB. This article walks through the exact process, the tools I use, and the decisions that made the biggest impact.
Why Bundle Size Matters for SEO
Google has made it clear: page experience is a ranking signal. But most developers fixate on Lighthouse scores without understanding the chain reaction that bloated JavaScript triggers.
Here is what happens when your initial JS exceeds 200KB:
- Download time increases. On a 3G connection (still common in emerging markets), 200KB of gzipped JS takes over 2 seconds just to download. That is before a single line executes.
- Parse and compile time spikes. JavaScript is the most expensive resource byte-for-byte. A 200KB image and 200KB of JS are not equivalent. The browser has to parse, compile, and execute JS. Images just decode.
- Time to Interactive (TTI) suffers. The main thread is blocked while JS executes. Users tap buttons and nothing happens. They leave.
- LCP gets delayed. If your hero content depends on client-side rendering (and more of it does than you think), bloated JS pushes your Largest Contentful Paint past the 2.5s threshold.
- INP degrades. More JS on the main thread means longer tasks. Longer tasks mean delayed responses to user interactions. Google's Interaction to Next Paint metric punishes this directly.
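The download-time claim above is simple arithmetic worth making concrete. A back-of-envelope sketch, assuming ~750 kbps effective throughput (a common "Regular 3G" throttling preset — real-world 3G varies widely):

```javascript
// Rough transfer-time estimate: size in kilobytes → seconds on the wire.
// The 750 kbps default is an assumption, not a measurement.
function downloadSeconds(kilobytes, kbps = 750) {
  return (kilobytes * 8) / kbps; // KB → kilobits, then divide by throughput
}

console.log(downloadSeconds(200).toFixed(1)); // ≈ 2.1 seconds before any parse/execute
```

And that is only the wire time — parse, compile, and execute costs come on top of it.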
The SEO impact is not theoretical. When I optimized FreshMart's bundle, their mobile Lighthouse performance score jumped from 62 to 96. Organic traffic to product pages increased 23% over the following eight weeks. Google was crawling and indexing pages faster because the pages were lighter.
Under 150KB initial JS is not a vanity target. It is a business decision.
Using the Bundle Analyzer
You cannot optimize what you cannot measure. The first thing I do on any Next.js performance engagement is install @next/bundle-analyzer and look at what is actually being shipped.
Setup
```bash
npm install @next/bundle-analyzer
```

```js
// next.config.js
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true',
});

module.exports = withBundleAnalyzer({
  // your existing Next.js config
});
```

Then run it:

```bash
ANALYZE=true npm run build
```

This opens an interactive treemap in your browser showing every module in your client and server bundles, sized proportionally. The visual is immediately revealing. You will spot the problems within 30 seconds.
What to Look For
When I opened the FreshMart bundle analyzer for the first time, three things jumped out:
- moment.js was included at 67KB gzipped — used in exactly two components for date formatting.
- lodash (the full library, not cherry-picked) was pulling in 24KB gzipped for three utility functions.
- A rich text editor component (used only on the admin dashboard) was being loaded on every page because it was imported at the top level of a shared layout.
Those three issues alone accounted for over 100KB of unnecessary initial JS. Most bundle bloat follows this pattern: a heavy library imported broadly, used narrowly.
I run ANALYZE=true builds at least once a week during active development. Bundle size creep is real. A developer adds one import, it looks harmless, and suddenly you have shipped an extra 30KB to every user.
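To make that weekly habit low-friction, I keep an npm script for it. A minimal sketch (the script name is my choice, not a Next.js convention):

```json
{
  "scripts": {
    "analyze": "ANALYZE=true next build"
  }
}
```

Then `npm run analyze` is all anyone on the team needs to remember.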
Dynamic Imports for Below-Fold Content
The single highest-impact optimization in Next.js is dynamic imports. If a component is not visible on initial page load, it should not be in the initial bundle. Period.
The Pattern
```jsx
import dynamic from 'next/dynamic';

const ReviewSection = dynamic(() => import('@/components/ReviewSection'), {
  loading: () => <ReviewSectionSkeleton />,
});

const RelatedProducts = dynamic(() => import('@/components/RelatedProducts'), {
  loading: () => <ProductGridSkeleton count={4} />,
});
```

On FreshMart, the product page had five sections below the fold: reviews, related products, nutritional information, recipe suggestions, and a store locator. All five were statically imported. Every user paid the cost of downloading and parsing those components, even though most users never scrolled past the product image and add-to-cart button.
After converting those five sections to dynamic imports:
| Section | Size (gzipped) | Loaded On |
|---|---|---|
| ReviewSection | 18KB | Scroll into view |
| RelatedProducts | 12KB | Scroll into view |
| NutritionInfo | 8KB | Tab click |
| RecipeSuggestions | 15KB | Tab click |
| StoreLocator | 22KB | Tab click |
That is 75KB moved out of the initial bundle. The components still load quickly when needed — each chunk is small and fetches over an already-warm connection — but the initial page load no longer pays for content the user has not requested.
When NOT to Use Dynamic Imports
Do not dynamically import above-the-fold content. If a component is visible on first paint, dynamic importing it will make your LCP worse because you are adding a network round trip before it can render. Dynamic imports are for below-fold, on-interaction, or conditionally rendered content only.
Tree-Shaking — What Actually Gets Removed
Tree-shaking is the process where your bundler (webpack in Next.js) removes unused exports from your bundle. In theory, it sounds automatic. In practice, it breaks constantly, and most developers do not realize it.
Why Tree-Shaking Fails
Tree-shaking only works on ES modules (import/export). It cannot work on CommonJS (require/module.exports) because CommonJS exports are dynamic — the bundler cannot statically determine what is used.
Here is what kills tree-shaking in real projects:
- Barrel files with side effects. If you have an `index.ts` that re-exports everything from a directory, and any of those modules have side effects (code that runs on import), the bundler cannot safely remove anything.
- CommonJS dependencies. Many npm packages still ship CommonJS. When you `import { debounce } from 'lodash'`, webpack cannot tree-shake it because lodash's main entry point is CommonJS.
- Dynamic property access. If anywhere in your code you do `icons[iconName]`, the bundler cannot determine which icons are used, so it includes all of them.
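The dynamic-property-access failure has a straightforward fix: replace the runtime lookup with an explicit map, so the bundler can statically see which modules are referenced. A framework-free sketch with stand-in string values (in a real app the map values would be imported icon components):

```javascript
// BAD (for illustration): `icons[iconName]` on a namespace import forces the
// bundler to keep every icon, because iconName is only known at runtime.

// GOOD: an explicit map — only these entries are statically reachable,
// so everything else can be tree-shaken away.
const iconMap = {
  search: 'SearchIcon', // stand-in; would be an imported component
  menu: 'MenuIcon',
};

function getIcon(name) {
  const icon = iconMap[name];
  if (!icon) throw new Error(`Unknown icon: ${name}`);
  return icon;
}

console.log(getIcon('search'));
```

The error branch also gives you a loud failure for typos, which the dynamic lookup would silently swallow.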
How I Fix It
```js
// BAD — imports the entire lodash library (70KB)
import { debounce } from 'lodash';

// GOOD — imports only the debounce module (1.5KB)
import debounce from 'lodash/debounce';

// BETTER — use a tree-shakeable alternative (0.5KB)
import { debounce } from 'lodash-es';
```

For icon libraries, this is critical:
```js
// BAD — a library whose entry point defeats tree-shaking can pull in
// every icon (200KB+)
import { Search, Menu, X } from 'react-icons/fa';

// GOOD — a tree-shakeable library: only the three icons used are bundled (3KB total)
import { LuSearch, LuMenu, LuX } from 'lucide-react';
```

Lucide is inherently tree-shakeable because each icon is an independent ES module. Some icon libraries are not. Check the bundle analyzer output — if you see a massive icons chunk, your tree-shaking is broken.
Verifying Tree-Shaking Works
Add this to your next.config.js to have Next.js optimize barrel imports for specific packages:

```js
// next.config.js
module.exports = {
  experimental: {
    optimizePackageImports: ['lucide-react', '@radix-ui/react-icons'],
  },
};
```

Next.js 13.5+ supports optimizePackageImports, which forces barrel file optimization for specified packages. I add every UI library I use to this list.
Heavy Libraries and Their Alternatives
This is where the biggest wins hide. A single library swap can save 50KB+ with zero functionality loss. Here are the swaps I make on every project:
| Heavy Library | Size (gzip) | Alternative | Size (gzip) | Savings |
|---|---|---|---|---|
| moment.js | 67KB | date-fns | 3KB (per function) | ~60KB |
| lodash | 24KB | lodash-es (cherry-picked) | 1-3KB | ~20KB |
| chart.js | 45KB | lightweight-charts | 18KB | ~27KB |
| uuid | 4KB | crypto.randomUUID() | 0KB (native) | 4KB |
| axios | 13KB | fetch (native) | 0KB | 13KB |
| numeral.js | 8KB | Intl.NumberFormat (native) | 0KB | 8KB |
| classnames | 1KB | clsx | 0.3KB | 0.7KB |
| yup | 12KB | zod (better types too) | 8KB | 4KB |
On FreshMart, swapping moment.js for date-fns was a 15-minute change that removed 60KB from the bundle. The moment.js library includes every locale by default. Even if you configure it to use only English, the tree-shaking failure I described above means the full library ships.
```js
// Before — 67KB
import moment from 'moment';
const formatted = moment(date).format('DD MMM YYYY');

// After — 1.2KB for this specific function
import { format } from 'date-fns';
const formatted = format(date, 'dd MMM yyyy');
```

The native platform APIs deserve special attention. Modern browsers ship fetch, crypto.randomUUID(), Intl.NumberFormat, Intl.DateTimeFormat, structuredClone, and AbortController. Every one of these replaces a library dependency with zero bundle cost. I actively look for npm packages that polyfill what the browser already provides.
Code Splitting Strategies
Dynamic imports handle component-level splitting. But there are broader architectural decisions that determine how Next.js splits your code.
Route-Based Splitting (Automatic)
Next.js App Router automatically code-splits by route. Each page in app/ gets its own JS chunk. This is free — you do not need to configure anything. But you can break it.
The most common way to break route-based splitting is shared layouts that import heavy dependencies:
```jsx
// app/layout.tsx — BAD
// This rich text editor ships to EVERY page
import RichTextEditor from '@/components/RichTextEditor';

export default function RootLayout({ children }) {
  return (
    <html>
      <body>
        <RichTextEditor /> {/* Why is this in the root layout? */}
        {children}
      </body>
    </html>
  );
}
```

On FreshMart, the admin's rich text editor was imported in a layout component that wrapped both the storefront and the admin panel. Every customer visiting the storefront downloaded 35KB of editor code they would never use. Moving the editor import into the admin layout (a separate route group) instantly removed it from the customer-facing bundle.
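The fix generalizes: App Router route groups give each area of the app its own layout, so admin-only dependencies never enter the storefront's module graph. A hypothetical structure (directory names are illustrative):

```
app/
  (store)/
    layout.tsx              # storefront layout — no editor import
    products/[id]/page.tsx
  (admin)/
    layout.tsx              # imports RichTextEditor — only admin routes pay for it
    dashboard/page.tsx
```

The parenthesized group names do not appear in URLs; they exist purely to scope layouts.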
Conditional Feature Loading
```jsx
'use client';

import { useState } from 'react';

function ProductPage({ product }) {
  // Hold the lazily loaded component in state so it is in scope at render time
  const [ARViewer, setARViewer] = useState(null);

  async function handleARView() {
    const mod = await import('@/components/ARViewer');
    // Wrap in a function — passing a component to setState directly
    // would invoke it as an updater
    setARViewer(() => mod.ARViewer);
  }

  return (
    <div>
      <ProductDetails product={product} />
      <button onClick={handleARView}>View in AR</button>
      {ARViewer && <ARViewer model={product.arModel} />}
    </div>
  );
}
```

The AR viewer component was 45KB and used by fewer than 2% of users. An inline dynamic import on user interaction means 98% of users never download it.
Server Components Reduce Client JS
This is the architectural shift that changes everything about bundle size in Next.js. Server Components (RSC) render on the server and send HTML to the client. Their JavaScript never reaches the browser.
In the App Router, every component is a Server Component by default. The moment you add 'use client' at the top of a file, that component and everything it imports becomes part of the client bundle.
The Rule
Ask one question: Does this component need interactivity or browser APIs?
- If no: keep it as a Server Component. Zero client JS.
- If yes: make it a Client Component, but make it as small as possible.
The Pattern I Use
```jsx
// app/products/[id]/page.tsx — Server Component (0KB client JS)
import { getProduct } from '@/lib/db';
import { ProductDetails } from '@/components/ProductDetails';
import { AddToCartButton } from '@/components/AddToCartButton';

export default async function ProductPage({ params }) {
  const product = await getProduct(params.id);
  return (
    <div>
      {/* Server Component — renders to HTML, no client JS */}
      <ProductDetails product={product} />
      {/* Client Component — only this ships JS to the browser */}
      <AddToCartButton productId={product.id} price={product.price} />
    </div>
  );
}
```

ProductDetails includes the product title, description, specifications table, and image gallery markup. All of that renders as static HTML on the server. The only Client Component is the AddToCartButton — a small interactive element that handles cart state.
On FreshMart, converting the product page from a fully client-rendered page to this Server Component pattern removed 85KB of client JS. The product data fetching, formatting, rich text rendering, and layout logic all moved to the server. The client only received the HTML output plus the small interactive pieces.
Watch the Boundary
The biggest mistake I see: developers add 'use client' to a parent component, which forces every child to become a Client Component too. Push 'use client' as deep into the tree as possible. The interactive leaf components should be client-side. Everything above them should be server-rendered.
Third-Party Script Loading
Third-party scripts are the silent bundle killers. Analytics, chat widgets, ad scripts, social embeds — they add up fast, and they are often loaded synchronously, blocking your main thread.
The next/script Component
```jsx
import Script from 'next/script';

// Analytics — load after the page is interactive
<Script
  src="https://www.googletagmanager.com/gtag/js?id=GA_ID"
  strategy="afterInteractive"
/>

// Chat widget — load when the browser is idle
<Script
  src="https://widget.intercom.io/widget/APP_ID"
  strategy="lazyOnload"
/>

// Critical A/B testing — load before hydration
<Script
  src="https://cdn.optimizely.com/js/PROJECT_ID.js"
  strategy="beforeInteractive"
/>
```

The strategy prop is everything:
| Strategy | When It Loads | Use Case |
|---|---|---|
| `beforeInteractive` | Before hydration | A/B testing, bot detection |
| `afterInteractive` | After hydration | Analytics, tracking |
| `lazyOnload` | During browser idle | Chat, social widgets |
| `worker` (experimental) | In a web worker | Heavy analytics processing |
On FreshMart, the Intercom chat widget was loading synchronously on every page. Moving it to lazyOnload reduced the main thread blocking time by 180ms on mobile. The chat bubble still appears — just 2-3 seconds later, which no user notices because nobody opens a chat widget within 3 seconds of landing on a page.
Self-Hosting Critical Scripts
For scripts you control (your own analytics, feature flags), self-host them. Third-party origins require a DNS lookup, TCP connection, and TLS handshake before the script even starts downloading. Self-hosting eliminates that overhead and gives you control over caching headers.
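A sketch of what self-hosting can look like in Next.js: the script lives under /public so it is served from your own origin, and the config below adds long-lived caching (the path and filename are hypothetical — rename the file when its contents change, since the cache is immutable):

```js
// next.config.js — long-lived caching for a self-hosted, versioned script
module.exports = {
  async headers() {
    return [
      {
        source: '/js/analytics.v3.js',
        headers: [
          { key: 'Cache-Control', value: 'public, max-age=31536000, immutable' },
        ],
      },
    ];
  },
};
```

Repeat visitors then pay zero network cost for the script, and first-time visitors skip the extra DNS/TCP/TLS round trips entirely.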
My Optimization Process
Here is the exact process I follow on every project. I run through this during initial development and again before every major release through my performance optimization services.
Step 1: Measure the Baseline
```bash
ANALYZE=true npm run build
```

Record three numbers:
- First Load JS (shown in the Next.js build output)
- Largest page bundle (from the build output table)
- Total client JS (from the bundle analyzer treemap)
Step 2: Identify the Top 5 Offenders
Open the bundle analyzer treemap. Sort mentally by size. The top 5 largest modules are your targets. In my experience, the top 5 account for 60-80% of the bloat.
Step 3: Classify Each Offender
For each large module, ask:
- Is it used on every page? If not, dynamic import it.
- Is the full library needed? If not, cherry-pick or swap.
- Does it need to run on the client? If not, move to a Server Component.
- Is there a native browser API? If so, drop the library entirely.
Step 4: Implement Changes, Measure Again
After each change, run the analyzer again. I track the numbers in a simple table:
| Change | Before | After | Saved |
|---|---|---|---|
| moment.js to date-fns | 380KB | 315KB | 65KB |
| lodash cherry-picking | 315KB | 292KB | 23KB |
| Dynamic import below-fold | 292KB | 218KB | 74KB |
| Server Components conversion | 218KB | 145KB | 73KB |
| Rich text editor to admin only | 145KB | 120KB | 25KB |
Step 5: Set a Budget and Enforce It
```js
// next.config.js
module.exports = {
  experimental: {
    optimizePackageImports: ['lucide-react', 'date-fns', '@radix-ui/react-icons'],
  },
};
```

Next.js has no built-in way to fail a build on bundle size, so the budget itself is enforced in CI. I add a check using bundlewatch:
In `package.json`:

```json
{
  "bundlewatch": {
    "files": [
      {
        "path": ".next/static/**/*.js",
        "maxSize": "150KB"
      }
    ]
  }
}
```

If a PR pushes the bundle over 150KB, the CI check fails. No exceptions. This is how you prevent bundle size regression over time.
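Wiring that check into CI can be a single workflow. A sketch assuming GitHub Actions (workflow name and Node version are my choices; bundlewatch reads the `bundlewatch` config already in package.json):

```yaml
# .github/workflows/bundle-size.yml — hypothetical workflow
name: bundle-size
on: pull_request
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run build
      - run: npx bundlewatch
        env:
          BUNDLEWATCH_GITHUB_TOKEN: ${{ secrets.BUNDLEWATCH_GITHUB_TOKEN }}
```

The token is optional — without it bundlewatch still fails the step, it just cannot post a status back to the PR.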
Real Before/After Numbers
Here are the actual numbers from the FreshMart optimization, a UK grocery e-commerce platform I built and optimized:
Before Optimization
| Metric | Value |
|---|---|
| First Load JS (homepage) | 380KB |
| First Load JS (product page) | 342KB |
| First Load JS (category page) | 295KB |
| Lighthouse Performance (mobile) | 62 |
| LCP (mobile, field data) | 4.1s |
| TTI (mobile) | 6.8s |
| Total Client JS (all routes) | 1.2MB |
After Optimization
| Metric | Value | Improvement |
|---|---|---|
| First Load JS (homepage) | 120KB | -68% |
| First Load JS (product page) | 108KB | -68% |
| First Load JS (category page) | 95KB | -68% |
| Lighthouse Performance (mobile) | 96 | +34 points |
| LCP (mobile, field data) | 1.8s | -56% |
| TTI (mobile) | 2.9s | -57% |
| Total Client JS (all routes) | 410KB | -66% |
The most impactful changes, ranked by kilobytes saved:
- Server Components conversion — 85KB saved. Moving data fetching, formatting, and static rendering to the server was the single biggest win.
- Dynamic imports for below-fold content — 75KB saved. Five sections on the product page moved to lazy loading.
- moment.js to date-fns — 65KB saved. A 15-minute library swap.
- Rich text editor scoped to admin — 25KB saved. A layout restructure that took 20 minutes.
- lodash cherry-picking — 23KB saved. Changing import paths.
- axios to native fetch — 13KB saved. One-to-one API replacement.
Total time spent: about 6 hours of focused work. The ROI was immediate — faster pages, better SEO rankings, higher conversion rates.
Key Takeaways
- Set a hard budget. Mine is 150KB initial JS. FreshMart ships at 120KB. Put it in CI and enforce it on every PR.
- Measure before you optimize. Install `@next/bundle-analyzer` on day one. Run it weekly. Know exactly what you are shipping.
- Server Components are the biggest lever. If a component does not need interactivity, it should not ship JS to the browser. Push `'use client'` to the leaves.
- Dynamic import everything below the fold. If users have to scroll to see it, it should not be in the initial bundle.
- Swap heavy libraries for lighter alternatives. moment.js, lodash, axios — all have smaller or native replacements. Check the table above.
- Tree-shaking is not automatic. Barrel files, CommonJS, and dynamic property access break it silently. Verify with the bundle analyzer.
- Third-party scripts need a loading strategy. Use `next/script` with the right `strategy` prop. Never load a chat widget synchronously.
- Bundle size is an SEO ranking factor — indirectly through Core Web Vitals, but the impact is measurable and significant.
Every kilobyte matters. Not in an abstract, theoretical sense — in real user experience, real conversion rates, and real search rankings. If you want this level of optimization on your project, check out my services or take a look at the FreshMart case study for the full breakdown.
*Uvin Vindula is a Web3 and AI engineer based between Sri Lanka and the UK, building production-grade web applications at iamuvin.com. He specializes in Next.js performance, full-stack TypeScript, and blockchain development. Every project he ships targets under 150KB initial JavaScript — no exceptions.*