
Content duplicates on multi-location pages in 2022-2023

Let's have a discussion. I suspect I'm not the only one wondering about the tactic of multi-location pages built on duplicate content, where only the geo keywords change.

In many niches, both white-hat and gray-hat, this tactic works, and quite successfully: in principle, the logic is that the pages won't compete with each other within a single keyword cluster. But Google calls this a technical duplicate and recommends against the practice.
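To make the pattern concrete, here is a minimal sketch (all names, the template text, and the cities are invented for illustration) of how such pages are typically generated from one template, and why Google sees them as near-duplicates: swapping only the geo keyword leaves almost all of the text identical.

```python
import re

# Hypothetical shared template: only {city} varies between pages.
TEMPLATE = (
    "Looking for reliable plumbing services in {city}? "
    "Our {city} team handles leaks, installs, and emergency repairs. "
    "Call today for a free quote anywhere in {city}."
)

def render_page(city: str) -> str:
    """Fill the shared template with a single geo keyword."""
    return TEMPLATE.format(city=city)

def jaccard_similarity(a: str, b: str) -> float:
    """Word-level Jaccard similarity: 1.0 means identical vocabularies."""
    words_a = set(re.findall(r"[a-z]+", a.lower()))
    words_b = set(re.findall(r"[a-z]+", b.lower()))
    return len(words_a & words_b) / len(words_a | words_b)

page_austin = render_page("Austin")
page_dallas = render_page("Dallas")

# Only the city name differs, so word overlap is very high --
# exactly the pattern that reads as a technical duplicate.
print(jaccard_similarity(page_austin, page_dallas))
```

With only the city swapped, the similarity comes out above 0.9, which is the whole trade-off the rest of this post is about: cheap near-identical pages versus expensive unique content.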

This is where the dilemma arises:

Writing unique content for each location, which costs an absurd amount of money, versus the gallant arrival of some Google content update in 2023 that knocks hundreds of multi-location pages out of the SERPs, followed by an all-out panic. And that's still the good outcome, if the damage stops at pages dropping from the index rather than a manual penalty.

What do you think?

Step on the gas and crank out lots of "Industry + Location" pages while keeping your money in your pocket, or spend heavily on unique content and sleep soundly for the coming years?

submitted by /u/NefariousnessPrior53

from Search Engine Optimization: The Latest SEO News https://ift.tt/4612eGB

