
Best Practices for Handling Duplicate Pages/Content with Multiple Menu Paths

I have a situation where a service is accessible through at least three different paths in the menu structure, so distinct menu paths ultimately display the same content (page). I'm considering several options. The first is to expose three unique URLs for the same content, even though they would all be duplicates. The breadcrumbs would align with each path, so users could see exactly how they arrived from the breadcrumb trail and URL structure. This would create duplicate pages, but we could mark them all with a canonical tag pointing to the original.
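For reference, the canonical-tag setup in that first option might look like the sketch below. The domain `www.example.com` and the choice of `/menu-a/submenu/service-x` as the primary URL are placeholders; any of the three URLs could be designated canonical.

```html
<!-- Hypothetical sketch: each duplicate URL serves the same page content,
     but its <head> declares one chosen URL as canonical.
     Here /menu-a/submenu/service-x is assumed to be the primary version. -->

<!-- Served at /menu-b/submenu/service-x and /menu-c/submenu/service-x: -->
<head>
  <link rel="canonical" href="https://www.example.com/menu-a/submenu/service-x">
</head>

<!-- The primary page can also self-reference its own canonical URL: -->
<head>
  <link rel="canonical" href="https://www.example.com/menu-a/submenu/service-x">
</head>
```

Note that search engines treat `rel="canonical"` as a hint rather than a directive, so consistent internal linking to the primary URL helps reinforce it.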

The second option is to attempt creating unique content for these pages based on the navigation path, though this would feel very forced. The third option is to create a new path (e.g., Path 4 under "services") to which all three paths would link, though we are unsure how to structure the breadcrumbs and URLs for that. I would appreciate any suggestions.

| Path | Breadcrumb trail | URL |
|---|---|---|
| Path 1 | Home > Menu A > Submenu A > Service X | /menu-a/submenu/service-x |
| Path 2 | Home > Menu B > Submenu B > Service X | /menu-b/submenu/service-x |
| Path 3 | Home > Menu C > Submenu C > Service X | /menu-c/submenu/service-x |
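Whichever option is chosen, the breadcrumb trail for a path like Path 1 above can also be exposed as structured data so search engines display it consistently. A hedged sketch, assuming a `www.example.com` domain and the URL structure from the table:

```html
<!-- Hypothetical BreadcrumbList markup for Path 1.
     Names and URLs are placeholders based on the table above;
     the final ListItem (the current page) may omit "item". -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Menu A",
      "item": "https://www.example.com/menu-a" },
    { "@type": "ListItem", "position": 3, "name": "Submenu A",
      "item": "https://www.example.com/menu-a/submenu" },
    { "@type": "ListItem", "position": 4, "name": "Service X" }
  ]
}
</script>
```

If the duplicate URLs are canonicalized to one primary page, the breadcrumb markup would typically describe the canonical path rather than the path the user happened to take.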

submitted by /u/FlexibleFury

from Search Engine Optimization: The Latest SEO News https://ift.tt/q8VXOL9
