DEV Community

Mitu Das

Posted on • Originally published at ccbd.dev

React vs WordPress: Why I Stopped Fighting the SEO Battle and Started Automating It

I spent five hours debugging why Google couldn't crawl a single page of a React app I'd built. No meta tags. No Open Graph data. A Lighthouse SEO score of 43. The client had left WordPress specifically because they wanted "modern tech", and I'd handed them an invisible website. If you're a React developer who's ever had a client ask "why aren't we ranking?", this one's for you.

The Real Reason WordPress Still Wins on SEO (It's Not Magic)

Let's be honest about why agencies keep defaulting to WordPress: it just works, out of the box, for SEO. Install Yoast, write some content, Google finds you. Done.

But the reason isn't that WordPress is better software. It's that WordPress renders HTML on the server, which means every page Googlebot visits already has its title, description, canonical URL, and Open Graph tags baked into the response. There's no JavaScript to execute. No async hydration to wait for.

React apps, by default, ship a nearly empty index.html and build the DOM client-side. Googlebot can execute JavaScript, but it often won't wait long enough, and crawl budget is finite. The result: orphaned pages, missing meta, and rankings that flatline.
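You can see the problem for yourself by inspecting the raw HTML response, before any JavaScript runs, the way a crawler's first fetch does. Here's a small diagnostic sketch (the tag list is a minimal assumption, not an exhaustive SEO checklist):

```javascript
// Quick diagnostic: does the *raw* HTML response (no JS executed) already
// contain the tags crawlers need? Run it against curl-ed output or a
// fetched response body.
function auditRawHtml(html) {
  const checks = {
    title: /<title>[^<]+<\/title>/i.test(html),
    description: /<meta[^>]+name=["']description["']/i.test(html),
    canonical: /<link[^>]+rel=["']canonical["']/i.test(html),
    ogTitle: /<meta[^>]+property=["']og:title["']/i.test(html),
  };
  const missing = Object.keys(checks).filter(k => !checks[k]);
  return { ok: missing.length === 0, missing };
}

// A bare client-side-rendered shell fails every check:
const emptyShell = '<html><head></head><body><div id="root"></div></body></html>';
console.log(auditRawHtml(emptyShell)); // all four tags reported missing
```

A server-rendered WordPress page passes all four checks on the raw response; a default client-rendered React shell passes none of them.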

The fix isn't to abandon React. It's to give your React app the same server-side head tag discipline that WordPress enforces by default.

Step 1: Stop Relying on document.title. Use a Proper Head Manager

The first thing most React devs reach for is document.title = "My Page" inside a useEffect. This is wrong for two reasons: it fires after render (too late for crawlers), and it leaves your <meta> tags completely untouched.

Use react-helmet-async instead. It renders head content during SSR and stays reactive client-side:

npm install react-helmet-async

Wrap your app:

// main.jsx
import ReactDOM from 'react-dom/client';
import { HelmetProvider } from 'react-helmet-async';
import App from './App';

ReactDOM.createRoot(document.getElementById('root')).render(
  <HelmetProvider>
    <App />
  </HelmetProvider>
);

Then on any page component:

// BlogPost.jsx
import { Helmet } from 'react-helmet-async';

export default function BlogPost({ post }) {
  return (
    <>
      <Helmet>
        <title>{post.title} | My Blog</title>
        <meta name="description" content={post.excerpt} />
        <meta property="og:title" content={post.title} />
        <meta property="og:description" content={post.excerpt} />
        <meta property="og:image" content={post.coverImage} />
        <link rel="canonical" href={`https://yourdomain.com/blog/${post.slug}`} />
      </Helmet>
      <article>{/* your content */}</article>
    </>
  );
}

Result: Every route now has unique, crawlable meta. This alone typically moves a Lighthouse SEO score from the 40s to the 80s.

Step 2: Generate Your Sitemap Automatically (Not Manually)

WordPress auto-generates a sitemap. Your React app doesn't, unless you build it. Most developers either forget this entirely or maintain a hand-written sitemap.xml that's always three sprints out of date.

The right move: generate it at build time from your actual routes or CMS data.

// scripts/generate-sitemap.js
import { writeFileSync } from 'fs';
import { routes } from '../src/routes.js'; // your route config

const BASE_URL = 'https://yourdomain.com';
const today = new Date().toISOString().split('T')[0];

const publicRoutes = routes.filter(r => !r.private);

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${publicRoutes
  .map(r => `  <url>
    <loc>${BASE_URL}${r.path}</loc>
    <lastmod>${today}</lastmod>
    <changefreq>${r.changefreq || 'weekly'}</changefreq>
    <priority>${r.priority || '0.7'}</priority>
  </url>`)
  .join('\n')}
</urlset>`;

writeFileSync('./public/sitemap.xml', sitemap);
console.log(`✅ Sitemap generated with ${publicRoutes.length} URLs`);
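The script assumes a `routes.js` module that exports route metadata. A minimal shape might look like this (the `private`, `changefreq`, and `priority` fields are conventions the sitemap script above relies on, not anything your router requires; adapt the names to your setup):

```javascript
// src/routes.js — hypothetical route config shape for the sitemap script
export const routes = [
  { path: '/', changefreq: 'daily', priority: '1.0' },
  { path: '/blog', changefreq: 'daily', priority: '0.8' },
  { path: '/about' },                 // falls back to weekly / 0.7
  { path: '/admin', private: true },  // excluded from the sitemap
];
```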

Wire it into your build:

// package.json
{
  "scripts": {
    "build": "node scripts/generate-sitemap.js && vite build"
  }
}

Result: A fresh, accurate sitemap on every deploy. No manual updates. No forgotten pages.

Step 3: Automate Meta Generation for Content-Heavy Apps

Here's where things get interesting. If your React app pulls content from a headless CMS, an API, or a database, writing <Helmet> blocks by hand for every content type doesn't scale. You end up with inconsistent descriptions, missing OG images, and titles that don't match page content.

This is the exact problem the @power-seo package was built for. It's a lightweight utility that generates structured meta tags from a content object, useful when you're pulling dynamic data and don't want to hand-wire every field.

npm install @power-seo

// components/SEOHead.jsx
import { generateMeta } from '@power-seo';
import { Helmet } from 'react-helmet-async';

export function SEOHead({ content }) {
  const meta = generateMeta({
    title: content.title,
    description: content.description,
    image: content.thumbnail,
    url: content.canonicalUrl,
    type: content.type || 'article',
  });

  return (
    <Helmet>
      {meta.map(tag =>
        tag.property
          ? <meta key={tag.property} property={tag.property} content={tag.content} />
          : <meta key={tag.name} name={tag.name} content={tag.content} />
      )}
      <title>{content.title}</title>
      <link rel="canonical" href={content.canonicalUrl} />
    </Helmet>
  );
}

Now every content type (blog posts, product pages, profile pages) gets consistent, correctly structured meta without you writing the same boilerplate seventeen times.
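In practice your CMS rarely returns fields named exactly `title`, `description`, and `thumbnail`, so it pays to normalize records once at the boundary instead of inside every component. A sketch, where the CMS field names (`seo_title`, `summary`, `hero_image`, `kind`) are hypothetical placeholders for whatever your API actually returns:

```javascript
// Normalize a raw CMS record into the `content` shape SEOHead expects.
// The CMS field names here are hypothetical — map your real API fields.
function toSeoContent(record, baseUrl) {
  return {
    title: record.seo_title || record.title,
    description: (record.summary || '').slice(0, 160), // keep within snippet length
    thumbnail: record.hero_image,
    canonicalUrl: `${baseUrl}/${record.slug}`,
    type: record.kind === 'product' ? 'product' : 'article',
  };
}

// Usage: <SEOHead content={toSeoContent(post, 'https://yourdomain.com/blog')} />
```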

A deeper breakdown of this pattern, including how it was applied on a real headless CMS project, is documented at ccbd.dev/blog/react-vs-wordpress.

Step 4: Validate Before You Ship

None of this matters if you don't verify it's working. Add this to your CI pipeline or run it locally before every deploy:

# Install once
npm install -g lighthouse

# Run against your local build
npx serve dist &
lighthouse http://localhost:3000 --only-categories=seo \
  --output=json --output-path=stdout --quiet | \
  node -e "const d=JSON.parse(require('fs').readFileSync(0,'utf8'));console.log('SEO Score:', d.categories.seo.score * 100)"

You want 90+. Anything below 80 means something is missing: usually canonical URLs, meta descriptions, or mobile viewport config.

Also paste your URLs into search.google.com/test/rich-results before launch. It shows you exactly what Googlebot sees, which is the only opinion that matters.

What I Learned

  • WordPress doesn't win on SEO because it's better; it wins because it enforces discipline by default. React gives you full control, which means you're also fully responsible.
  • Client-side document.title is not SEO. Use react-helmet-async from day one, not as a retrofit.
  • Sitemaps should be generated, not written. Automate this in your build script or you will forget to update it.
  • Dynamic content needs a systematic meta strategy. Hand-written <Helmet> blocks don't scale past a dozen content types. Abstract it into a hook or utility.
  • Measure it. Lighthouse SEO scores are blunt but honest. Run them in CI so regressions don't slip into production.

Your Turn

Have you migrated a project from WordPress to React and had to rebuild the SEO layer from scratch? What did you wish you'd known going in?

Drop your war stories in the comments, especially if you found a crawlability issue that turned out to be something completely unexpected. These threads are always more useful than the article itself.
