Last Tuesday, I refreshed Google PageSpeed Insights for the hundredth time and saw something that made me do a double-take: 99/100/100/100. Not just on my homepage, but across my entire Next.js portfolio.
For context, I've been obsessing over web performance for months. Not because I love green numbers (though I do), but because I kept reading statistics about how the average website loads slower than a Windows 95 boot sequence, and frankly, that felt embarrassing for our industry.
Let me tell you the story of how this happened, and more importantly, why it matters for your projects too.
The Reality Check That Started Everything
I used to think my old portfolio was "pretty fast." Then I ran Lighthouse on a few competitor sites and had a wake-up call. Here's what I found:
What "Normal" Actually Looks Like
- Most developer portfolios: 60-80 performance scores
- Popular agency websites: often 40-60
- E-commerce sites: frequently below 50
- That "blazing fast" startup's landing page: 35
Meanwhile, Users Are Waiting
- 2.5 seconds for content to appear (industry average)
- 300-600ms of "why isn't this button working?" delay
- Layout jumping around as images load
- 53% abandonment rate if it takes over 3 seconds
The Next.js 15 Decision That Changed Everything
Here's where the story gets interesting. Instead of optimizing my existing React app, I decided to rebuild on Next.js 15 with React 19. Not because I love bleeding-edge tech (okay, maybe a little), but because of one killer feature: Server Components by default.
Think about it: most websites ship a bundle of JavaScript that then fetches data to display content. It's like mailing someone an empty envelope with instructions to call you for the actual letter. Why not just send the letter?
The Old Way: Client-Side Fetching
// Ship JavaScript, then fetch data
import { useEffect, useState } from 'react';

export default function ProjectsPage() {
  const [projects, setProjects] = useState([]);

  useEffect(() => {
    fetch('/api/projects')
      .then((res) => res.json())
      .then(setProjects);
  }, []);

  return projects.length ? <ProjectGrid projects={projects} /> : <Loading />;
}
The New Way: Server Components
// Data is already there when the page loads
import { prisma } from '@/lib/prisma'; // adjust to wherever your Prisma client lives

export default async function ProjectsPage() {
  const projects = await prisma.project.findMany({
    where: { status: 'PUBLISHED' },
  });

  return <ProjectGrid projects={projects} />;
}
The difference? No loading spinner. No "flash of empty content." No JavaScript bundle just to show some text and images. The content is just... there.
The Bundle Diet That Actually Worked
Most performance advice sounds like "just compress your images lol." But the real wins come from shipping less code in the first place. Here's what that looked like for my bundle:
The secret sauce was treating different parts of my application as separate bundles. Why should visitors downloading my portfolio also get code for features they'll never use?
import dynamic from 'next/dynamic';

// Dynamic imports keep heavy features out of the main bundle
const Dashboard = dynamic(() => import('./dashboard/Dashboard'), {
  loading: () => <div>Loading...</div>,
});

// Only loads when actually needed
const HeavyChart = dynamic(() => import('./charts/InteractiveChart'), {
  ssr: false, // Skip server-side rendering for client-only components
});
The Animation Performance Hack
Everyone wants smooth animations, but most developers ship the entire Framer Motion library (50KB) to animate a button hover. I found a better way.
The Heavy Approach
Most developers do this:
// Importing everything (50KB)
import { motion } from 'framer-motion';

<motion.div animate={{ opacity: 1 }} />
The Optimized Solution
Instead, I use LazyMotion with selective features:
// LazyMotion with only needed features
import { LazyMotion, domMax, m } from 'framer-motion';

export const MotionProvider = ({ children }) => (
  <LazyMotion features={domMax} strict>
    {children}
  </LazyMotion>
);

// Now 'm' works like 'motion' but 20KB lighter
<m.div animate={{ opacity: 1 }} />
The best optimization is the code you don't ship.
The Caching Strategy That Actually Makes Sense
Most caching advice is either "cache everything forever" or "cache nothing because it's too complicated." I went with a middle path that actually works for real websites:
// Different cache rules for different content types
const cacheHeaders = {
  // Build assets: cache forever (they have hashes anyway)
  static: 'public, max-age=31536000, immutable',

  // Pages: cache but allow updates
  pages: 'public, max-age=3600, stale-while-revalidate=86400',

  // Dynamic content: don't cache
  dynamic: 'no-store',
};

// ISR keeps content fresh without rebuilding everything
export const revalidate = 21600; // 6 hours
This means returning visitors get instant page loads, but the content stays fresh. Best of both worlds.
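Those header values don't do anything on their own until they're attached to responses. In Next.js, one common place to do that is the headers() function in next.config.js. The following is a sketch rather than my exact config, and the route patterns are illustrative:

// next.config.js (sketch): attach Cache-Control per route pattern
/** @type {import('next').NextConfig} */
const nextConfig = {
  async headers() {
    return [
      {
        // Hashed build assets are safe to cache forever
        source: '/_next/static/:path*',
        headers: [
          { key: 'Cache-Control', value: 'public, max-age=31536000, immutable' },
        ],
      },
      {
        // Example of a dynamic route that should never be cached
        source: '/api/:path*',
        headers: [{ key: 'Cache-Control', value: 'no-store' }],
      },
    ];
  },
};

module.exports = nextConfig;

Next.js already caches hashed static assets aggressively by default; the point of the sketch is simply where per-route rules live when you need them.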
The Accessibility Win That Surprised Me
Here's something that caught me off-guard: performance and accessibility tend to improve together. When I optimized for screen readers, keyboard navigation, and reduced motion, the site got faster too.
// Respect user preferences
const useAnimationBudget = () => {
  const prefersReducedMotion = useMediaQuery('(prefers-reduced-motion: reduce)');

  return {
    shouldAnimate: !prefersReducedMotion,
    particleCount: prefersReducedMotion ? 0 : 50,
  };
};
Users who prefer reduced motion get a faster site with less JavaScript running. Users who love animations get smooth 60fps experiences. Everyone wins.
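The useMediaQuery call above isn't shown, and any version of it is just a hook around window.matchMedia. If you don't already have one handy, here's a minimal sketch built on React's useSyncExternalStore, treating the server render as "no preference":

import { useSyncExternalStore } from 'react';

// Minimal media-query hook: subscribes to matchMedia and re-renders on changes
export function useMediaQuery(query) {
  return useSyncExternalStore(
    (onChange) => {
      const mql = window.matchMedia(query);
      mql.addEventListener('change', onChange);
      return () => mql.removeEventListener('change', onChange);
    },
    () => window.matchMedia(query).matches, // client snapshot
    () => false // server snapshot: assume no preference during SSR
  );
}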
Why Most Sites Still Struggle
After analyzing dozens of slow websites, I noticed a pattern. It's not that developers don't know about performance. It's that they optimize the wrong things at the wrong time.
The Typical Approach
- Build all the features quickly
- Add animations and interactions
- Deploy and celebrate
- Maybe optimize later (spoiler: later never comes)
What Actually Works
- Choose a performance-first architecture from day one
- Set performance budgets and stick to them
- Monitor bundle size on every commit (see the sketch after this list)
- Optimize as you build, not after
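To make "monitor bundle size on every commit" concrete: there's more than one tool for this, but something like the size-limit package (with its @size-limit/file preset) turns a budget into a failing check the moment a chunk crosses it. A sketch, with an illustrative path and limit:

// .size-limit.js (sketch): run `npx size-limit` in CI after the build
module.exports = [
  {
    path: '.next/static/chunks/**/*.js',
    limit: '170 KB', // illustrative budget, not a magic number
  },
];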
The difference compounds like interest: small decisions early on create massive advantages over time.
The Business Impact Nobody Talks About
Here's the part that makes this more than just vanity metrics: fast sites make more money.
When Amazon improved their load time by 100ms, they saw a 1% increase in revenue. For them, that's hundreds of millions of dollars. For your project, it might be the difference between someone reading your entire blog post or bouncing to a competitor.
The Tools That Made the Difference
I didn't achieve these results through willpower and guesswork. These tools were essential:
Bundle Analysis
ANALYZE=true pnpm build
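That ANALYZE=true flag is typically wired to @next/bundle-analyzer, which opens a treemap of every chunk after the build so you can see exactly what you're shipping. The setup is roughly this (a sketch, assuming the package is installed):

// next.config.js (sketch): only generate the treemap when ANALYZE=true
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true',
});

module.exports = withBundleAnalyzer({
  // ...the rest of your Next.js config
});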
Performance Monitoring
- Lighthouse CI in my GitHub Actions
- Real User Monitoring with Core Web Vitals (sketch after this list)
- Regular PageSpeed Insights audits
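For the real-user side, Next.js ships a hook that reports Core Web Vitals from actual visitors; where you send the metrics is up to you, and the /api/vitals endpoint below is a placeholder:

'use client';

import { useReportWebVitals } from 'next/web-vitals';

// Drop this component into the root layout to collect field data
export function WebVitalsReporter() {
  useReportWebVitals((metric) => {
    // Placeholder endpoint: forward to whatever analytics you actually use
    navigator.sendBeacon('/api/vitals', JSON.stringify(metric));
  });

  return null;
}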
Code Quality
- Bundle size limits in my build process
- Performance budgets that fail CI if exceeded (sketch after this list)
- Automatic image optimization with Next.js Image
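The "fail CI if exceeded" part is what Lighthouse CI's assertions are for. Here's a sketch of a lighthouserc.js in the spirit of the scores above; the thresholds and commands are illustrative, not my exact setup:

// lighthouserc.js (sketch): budgets that turn a slow build into a red check
module.exports = {
  ci: {
    collect: {
      startServerCommand: 'pnpm start', // assumes the production build already ran
      url: ['http://localhost:3000/'],
      numberOfRuns: 3,
    },
    assert: {
      assertions: {
        'categories:performance': ['error', { minScore: 0.95 }],
        'categories:accessibility': ['error', { minScore: 1 }],
        'total-byte-weight': ['warn', { maxNumericValue: 500000 }],
      },
    },
  },
};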
What You Can Implement Today
You don't need to rebuild your entire site. Here are the changes that provide the biggest performance wins:
Easy Wins (1 hour)
- Add priority to your largest above-the-fold image (see the sketch after this list)
- Enable Next.js Image optimization
- Use dynamic imports for heavy components
- Check your bundle size with npm run build
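For the priority tip, the change is literally one prop on the hero image; the path and dimensions below are placeholders:

import Image from 'next/image';

// 'priority' preloads the image and opts it out of lazy loading,
// which is what you want for the largest above-the-fold element (your LCP)
export function Hero() {
  return (
    <Image
      src="/images/hero.jpg" // placeholder path
      alt="Hero image for the landing page"
      width={1200}
      height={630}
      priority
    />
  );
}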
Medium Effort (1 day)
- Implement proper caching headers
- Split dashboard/public code into separate bundles
- Add performance budgets to your CI
- Optimize your largest JavaScript dependencies
Bigger Investment (1 week)
- Migrate to Next.js App Router with Server Components
- Implement ISR for dynamic content
- Add comprehensive performance monitoring
- Optimize your Core Web Vitals across all pages
The Real Lesson Here
Achieving 99/100/100/100 Lighthouse scores isn't about perfection for its own sake. It's about building things that work better for the people who use them.
When your site loads in under a second, people notice. When animations are smooth and interactions feel instant, it creates trust. When accessibility is built-in from day one, you reach more people.
My Next.js portfolio now loads faster than most native apps. Your project could too. The question isn't whether you should optimize for performance, but how quickly you can start.
Speed is a feature. Make it a priority.
Want to Work Together?
If you're working on a performance challenge or building something that needs to load fast, I'd love to help. I offer consulting and coaching for developers who want to optimize their applications without compromising on features.
Whether you're dealing with slow load times, poor Core Web Vitals, or just want to learn modern performance techniques, we can work through the specific challenges your project faces. The tools and techniques I've shared here work with any modern JavaScript framework, not just Next.js.
Feel free to reach out through my contact form if you're interested in collaborating or want to discuss your performance optimization goals.
What performance challenge are you working on next?