Hi! Hoping someone can help point me in the right direction
Background
- I built a site with between 20,000 and 30,000 pages (representing different stores)
- The general structure of the pages is the same, but the content/data differs for each store
- I programmatically generated a sitemap (in Next.js)
- I submitted the sitemap to Google in June 2024
- Initially Google indexed all of my pages and my site was getting several hundred impressions per day
- After ~2 weeks, impressions dropped off a cliff to ~5 impressions per day
- It's been that way for 8 months now
- About 2 months ago (late November 2024), I received advice that I shouldn't have started with such a large sitemap, and that I should reduce it to ~100 pages and focus on getting those indexed and ranked before expanding coverage again
- I changed the sitemap to just 150 page URLs and resubmitted it to Google (a sketch of how that generation looks is below)
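
For context, the truncation happens at generation time. This is only a minimal sketch assuming the App Router sitemap convention (app/sitemap.ts, Next.js 13.3+); the domain, the /stores/ path, and getStoreSlugs() are placeholders for however the store data is actually loaded:

```ts
// app/sitemap.ts -- minimal sketch; BASE_URL, the /stores/ path, and
// getStoreSlugs() are hypothetical placeholders, not the real implementation.
import type { MetadataRoute } from 'next'

const BASE_URL = 'https://example.com' // placeholder domain
const MAX_URLS = 150                   // the reduced sitemap size

// Placeholder for whatever DB/CMS query actually returns store slugs.
async function getStoreSlugs(): Promise<string[]> {
  return []
}

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const slugs = await getStoreSlugs()

  // Emit only the first MAX_URLS stores; everything else is simply left out
  // of the sitemap (the pages themselves stay live and internally linked).
  return slugs.slice(0, MAX_URLS).map((slug) => ({
    url: `${BASE_URL}/stores/${slug}`,
    lastModified: new Date(),
    changeFrequency: 'weekly' as const,
    priority: 0.7,
  }))
}
```

The point is just that the served /sitemap.xml genuinely contains only ~150 entries.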
Problem
In the 2 months since I shrank the sitemap, Google has mostly dropped the pages it had previously indexed, but has been very slow to clear out the pages it marked "Crawled, but not indexed".
I know crawling and indexing are a slow game, but this seems much slower than I'd expect.
November 30
- Indexed: ~3,000 pages
- Crawled, but not indexed: ~24,000 pages
January 30
- Indexed: ~22 pages
- Crawled, but not indexed: ~19,000 pages
Strangely, when I re-submit my sitemap to Google (<sitename>/sitemap.xml), Search Console still shows tens of thousands of discovered URLs, even though the live sitemap only contains ~150.
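
For reference, this is roughly how I'm confirming the live file really contains only ~150 entries (just a sketch with a placeholder URL; it only counts <loc> tags in a flat sitemap, not a sitemap index):

```ts
// check-sitemap.ts -- sanity check with no dependencies (Node 18+ global fetch).
// SITEMAP_URL is a placeholder; counting <loc> tags only works for a flat
// (non-index) sitemap file.
const SITEMAP_URL = 'https://example.com/sitemap.xml'

async function countSitemapUrls(url: string): Promise<number> {
  const res = await fetch(url)
  if (!res.ok) throw new Error(`Fetch failed: ${res.status} ${res.statusText}`)
  const xml = await res.text()
  return (xml.match(/<loc>/g) ?? []).length
}

countSitemapUrls(SITEMAP_URL)
  .then((n) => console.log(`Live sitemap contains ${n} URLs`))
  .catch((err) => console.error(err))
```

So as far as I can tell, the file being served is correct.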
I've tried everything I can think of in Google Search Console, but nothing seems to be working.
Search Console reports no manual actions.
Anyone have advice on what I can do or try?
Happy to provide more information if helpful.