
Question: Choosing between user comments and passing Core Web Vitals?

I represent a popular news content site, and we produce a large volume of unique, author-written articles.

We have been working diligently on our IT and site-speed issues and have reached the following position:

All content on our site passes Google's Core Web Vitals on both mobile and desktop, with the exception of our Disqus comment threads. Because of how Disqus injects its embed, those threads fail CLS and, in some cases, LCP.
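(For reference, the usual mitigation for the Disqus CLS hit is to reserve the thread's space up front and lazy-load the embed only when the reader scrolls near it. A minimal sketch in TypeScript, assuming the standard `disqus_thread` container and a hypothetical `example-shortname` in place of the site's real Disqus shortname:

```typescript
// Defer the Disqus embed until the thread container nears the viewport,
// and reserve vertical space up front so the late injection does not
// shift layout (the usual cause of the CLS failure).
const container = document.getElementById("disqus_thread");

if (container) {
  // Reserving an approximate height prevents the layout shift when Disqus loads.
  container.style.minHeight = "400px";

  const observer = new IntersectionObserver((entries, obs) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      const script = document.createElement("script");
      // "example-shortname" is a hypothetical placeholder.
      script.src = "https://example-shortname.disqus.com/embed.js";
      script.setAttribute("data-timestamp", String(Date.now()));
      document.body.appendChild(script);
      obs.disconnect(); // load the embed only once
    }
  });

  observer.observe(container);
}
```

Whether this is enough to pass depends on how accurately the reserved height matches the rendered thread.)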

So we are sitting at roughly a 95% Core Web Vitals passing ratio for the entire site and are left with two choices:

  1. Accept the failing Web Vitals on the comment threads, but keep the threads.
  2. Set all the comment threads to NOINDEX, which would bring our Web Vitals passing rate up to 100% for the site (see the sketch after this list).
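(A minimal sketch of how option 2 might be implemented, assuming, hypothetically, that the comment threads live on dedicated URLs under a `/comments/` path and that the site runs a Node/Express front end. A per-path `X-Robots-Tag: noindex` header asks Google to drop those URLs from the index, which is presumably how the passing rate would reach 100%, since Search Console only reports on indexed URLs:

```typescript
import express from "express";

const app = express();

// Hypothetical: all comment-thread pages are served under /comments/.
// The X-Robots-Tag response header asks Google to drop these URLs from
// the index, removing them from the Core Web Vitals passing ratio.
app.use("/comments", (_req, res, next) => {
  res.setHeader("X-Robots-Tag", "noindex");
  next();
});

app.listen(3000);
```

The same behavior can come from a robots meta tag with `content="noindex"` in each thread page's head instead of a header.)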

However, Google has mentioned that there is a "positive effect" to having user-generated comments on a content site, and that sites which remove them should expect some negative effect from the algorithm.

What do you think the correct solution is? The big question is which matters more: a perfect Web Vitals score, or having user comments visible to Google?

submitted by /u/bols-larry

from Search Engine Optimization: The Latest SEO News https://ift.tt/FeZzcNw

