I have a website, and a handful of its posts are not being indexed; they're reported as Excluded by 'noindex' tag.
But here's the problem: I've got a lot of pages being indexed in Google, but when I look into specific pages, some of them are not. Does anybody know why this is happening?
When I run URL inspection on these specific posts that are not being indexed, it says:
"Indexing allowed? No: 'noindex' detected in 'X-Robots-Tag' http header"
Does anybody know how to fix this? My website allows crawling, so it's not that. I'm pretty new to SEO, so apologies if this is a noob question. Thanks!