mathew lam

Posted on • Originally published at jerseytome.com

How I Got 98 Pages Indexed by Google in 24 Hours (Without Waiting for Crawlers)

The Problem With New Sites

You deploy a content site with 100+ pages. You submit your sitemap to Google Search Console. Then you wait.

And wait.

After two weeks, Google has crawled maybe 15 pages. Your carefully written content is invisible. Sound familiar?

I ran into this exact problem with my side project — a 270-page content site across 5 locales. After a week, only ~35 pages were indexed. I needed a faster way.

The Solution: Google Indexing API + Bing URL Submission

Most developers don't know that Google has an Indexing API that lets you proactively notify Google when pages are created or updated. Officially it's documented for job postings and livestream pages, but in practice it accepts any URL on a property you've verified in Search Console.

Step 1: Set Up OAuth2 Credentials

You need a Google Cloud project with the Indexing API enabled:

  1. Go to the GCP Console and create (or select) a project
  2. Enable the "Web Search Indexing API" for that project
  3. Create an OAuth 2.0 Client ID (Desktop app type)
  4. Run the OAuth flow once to get a refresh token (see the sketch below)

One gotcha: the Google account you authorize must be a verified owner of the site in Search Console, or every submission will come back with a 403.
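
The one-time flow doesn't need a library: build the consent URL by hand, approve it in the browser, and exchange the resulting code for tokens. A minimal sketch (the http://localhost redirect and the paste-the-code step are assumptions; adjust to your client's settings):

// 1. Build the consent URL, open it in a browser, approve, and copy the ?code= value
const consentUrl =
  'https://accounts.google.com/o/oauth2/v2/auth?' +
  new URLSearchParams({
    client_id: process.env.GOOGLE_CLIENT_ID!,
    redirect_uri: 'http://localhost',                  // must match the OAuth client
    response_type: 'code',
    scope: 'https://www.googleapis.com/auth/indexing', // Indexing API scope
    access_type: 'offline',                            // required to receive a refresh token
    prompt: 'consent',
  });
console.log(consentUrl);

// 2. Exchange the pasted code for tokens; store refresh_token in your env
const tokens = await fetch('https://oauth2.googleapis.com/token', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({
    client_id: process.env.GOOGLE_CLIENT_ID!,
    client_secret: process.env.GOOGLE_CLIENT_SECRET!,
    code: 'PASTE_CODE_HERE',
    grant_type: 'authorization_code',
    redirect_uri: 'http://localhost',
  }),
}).then((r) => r.json());
console.log(tokens.refresh_token);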
From then on, your script exchanges the long-lived refresh token for a short-lived access token:

// Exchange refresh token for access token
async function getAccessToken(): Promise<string> {
  const response = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: process.env.GOOGLE_CLIENT_ID!,
      client_secret: process.env.GOOGLE_CLIENT_SECRET!,
      refresh_token: process.env.GOOGLE_REFRESH_TOKEN!,
      grant_type: 'refresh_token',
    }),
  });
  const { access_token } = await response.json();
  return access_token;
}

Step 2: Submit URLs Programmatically

Once you have an access token, pushing a URL is one API call:

// Notify Google that a URL was added or updated.
// (Use type: 'URL_DELETED' instead when you remove a page.)
async function submitToGoogle(url: string, accessToken: string) {
  const response = await fetch(
    'https://indexing.googleapis.com/v3/urlNotifications:publish',
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${accessToken}`,
      },
      body: JSON.stringify({
        url,
        type: 'URL_UPDATED',
      }),
    }
  );
  return response.ok; // false on quota (429) or permission (403) errors
}

Step 3: Generate URLs From Your Content

Don't hardcode URLs. Read them from your content directory:

import fs from 'node:fs';
import path from 'node:path';

const SITE_URL = 'https://jerseytome.com'; // your production origin

function getAllIndexableUrls(): string[] {
  const urls: string[] = [];
  const contentDir = path.join(process.cwd(), 'content/en');

  // Walk the content directory recursively (recursive readdir needs Node 18.17+)
  for (const file of fs.readdirSync(contentDir, { recursive: true })) {
    if (file.endsWith('.mdx')) {
      // e.g. nba/lakers.mdx -> /nba/lakers
      const slug = '/' + file.replace(/\.mdx$/, '').split(path.sep).join('/');
      urls.push(`${SITE_URL}${slug}`);
    }
  }

  return urls;
}
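Note that this only walks content/en, while the site itself has 5 locales. A hedged extension, assuming one directory per locale and locale-prefixed URLs (both are assumptions; adapt to your routing):

const LOCALES = ['en', 'es', 'fr', 'de', 'ja']; // assumed list; use your actual locale folders

function getAllLocaleUrls(): string[] {
  return LOCALES.flatMap((locale) => {
    const dir = path.join(process.cwd(), `content/${locale}`);
    if (!fs.existsSync(dir)) return []; // skip locales with no content yet
    return fs.readdirSync(dir, { recursive: true })
      .filter((file) => file.endsWith('.mdx'))
      .map((file) => {
        const slug = file.replace(/\.mdx$/, '').split(path.sep).join('/');
        return `${SITE_URL}/${locale}/${slug}`;
      });
  });
}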

Step 4: Add Bing for Free

Bing's URL Submission API is even simpler: all you need is an API key from Bing Webmaster Tools.

const BING_KEY = process.env.BING_API_KEY!; // generated in Bing Webmaster Tools
const SITE = 'https://jerseytome.com';      // your verified site URL

async function submitToBing(url: string) {
  // The API key rides in the query string; siteUrl and url go in the JSON body
  const response = await fetch(
    `https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl?apikey=${BING_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ siteUrl: SITE, url }),
    }
  );
  return response.ok;
}

Step 5: Run on Every Deploy

Add to your CI/CD pipeline or run manually after deploying new content:

# Submit all URLs
npx tsx scripts/indexing.ts --all

# Or just recently modified
npx tsx scripts/indexing.ts --recent 7
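If you're wondering what the script itself looks like, here's a minimal sketch of a driver tying the pieces above together (the structure and the --recent behavior are assumptions, not the author's exact script):

// scripts/indexing.ts -- minimal driver sketch (names and structure assumed)
async function main() {
  const urls = getAllIndexableUrls();         // Step 3 (filter by file mtime for --recent)
  const accessToken = await getAccessToken(); // Step 1

  for (const url of urls) {
    const ok = await submitToGoogle(url, accessToken); // Step 2
    await submitToBing(url);                           // Step 4
    console.log(`${ok ? 'ok  ' : 'FAIL'} ${url}`);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});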

Results

I ran this against 98 URLs on my site. All 98 were accepted by Google's Indexing API in a single run. Within 48 hours, Google Search Console showed a significant jump in indexed pages.

The default quota is 200 URLs per day (you can request more in GCP), which is plenty for most content sites.

Key Takeaways

  1. Don't rely on passive crawling for new sites — Google may take weeks to discover your pages through your sitemap alone
  2. The Indexing API is underused — most tutorials only mention it for job postings, but it works for any verified site
  3. Combine with proper structured data — submit URLs that already have JSON-LD (Article, Product, FAQ schemas) so Google can process rich results on the first crawl (a minimal sketch follows this list)
  4. Automate it — wire it into your deploy pipeline so new content gets pushed immediately
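
For reference, a minimal Article JSON-LD payload might look like this (all values are illustrative, not from the actual site):

// Minimal JSON-LD Article object (illustrative values)
const articleJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  headline: 'Example article title',
  datePublished: '2024-01-01',
  dateModified: '2024-06-01',
  author: { '@type': 'Person', name: 'Author Name' },
};

// In a Next.js page, render it into the markup:
// <script type="application/ld+json"
//         dangerouslySetInnerHTML={{ __html: JSON.stringify(articleJsonLd) }} />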

Bonus: Sitemap Best Practices

While you're at it, fix your sitemap's lastModified dates. I see so many Next.js sites doing this:

// ❌ Bad — every page looks "modified today" on every build
lastModified: new Date()

// ✅ Good — use actual content modification dates
lastModified: article.meta.dateModified
  ? new Date(article.meta.dateModified)
  : new Date('2024-01-01')  // fallback to a real date

Google uses lastModified to prioritize which pages to re-crawl. If everything says "today", it trusts nothing.
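
In a Next.js App Router project this lives in app/sitemap.ts. A minimal sketch, assuming a getAllArticles() helper that exposes the frontmatter dates (that helper is an assumption; use whatever your content layer provides):

// app/sitemap.ts -- sketch; getAllArticles() is an assumed content helper
import type { MetadataRoute } from 'next';

export default function sitemap(): MetadataRoute.Sitemap {
  return getAllArticles().map((article) => ({
    url: `https://jerseytome.com${article.slug}`,
    lastModified: article.meta.dateModified
      ? new Date(article.meta.dateModified)
      : new Date(article.meta.datePublished ?? '2024-01-01'), // real date, not build time
  }));
}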


Full implementation is in my side project: JerseyToMe — an NBA jersey encyclopedia with 270+ pages across 5 languages. The indexing script processes all URLs in under 30 seconds.
