We submitted a sitemap in Search Console a few months ago with 40k+ URLs, most of which were thin search result pages. 600 of them got indexed, and 3k are crawled but not indexed. We've since fixed our sitemap to exclude the excessive, low-quality URLs, but we're having trouble getting new pages crawled and indexed, and I suspect this is why. We know we should delete the low-quality pages and serve a 410 status code, but that only matters for the pages that actually got indexed, right?
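For anyone wondering what serving a 410 looks like in practice, here's a minimal sketch, assuming a Flask app and a hypothetical /search path where the thin result pages lived (both are assumptions, not details from the original post). A 410 Gone tells crawlers the page was removed deliberately, which tends to get it dropped from the index faster than a 404:

```python
# Minimal sketch: serve 410 Gone for removed thin search-result pages.
# Assumptions: a Flask app, and that the low-quality pages live under
# a hypothetical /search path.
from flask import Flask, abort

app = Flask(__name__)

@app.route("/search")
def search():
    # These pages were removed on purpose; 410 signals "permanently gone"
    # so crawlers stop retrying and deindex sooner than with a 404.
    abort(410)
```

The same idea works at the web server layer (e.g. a rule in your server config returning 410 for the matching URL pattern) if you'd rather not handle it in application code.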
I'm more concerned about the 3k crawled-but-not-indexed pages and the remaining ~40k unindexed pages. Will they continue to eat into our crawl budget, so to speak, for a long time? Any advice on how to handle this?
Thanks so much for any insight you can offer 🙏🏻