Posts

Google Is Testing New Bot Authorization Standard via @sejournal, @martinibuster

Google is testing a cryptographic protocol for verifying bot traffic that could make unwanted crawlers easier to identify. The post Google Is Testing New Bot Authorization Standard appeared first on Search Engine Journal: https://ift.tt/eLDdVpt
Recent posts

How do you not burn out waiting for results?

I just launched my platform and I started to build content around it, SEO mainly. Everywhere I see that meaningful results come after 6-12 months. How do you actually stay motivated during that period?

Submitted by /u/vshaddix. From Search Engine Optimization: The Latest SEO News https://ift.tt/pri7bTw

Nick from Profound banned on Reddit

Submitted by /u/WebLinkr. From Search Engine Optimization: The Latest SEO News https://ift.tt/583IgXk

Managing Crawl Budget for a news website

I'm providing SEO advice to a company that does web development for a large news agency. They publish around 700 articles daily and have more than 10m URLs in total.

Their website has thousands, maybe even hundreds of thousands up to a million, of topic URLs that contain only IDs and are non-indexable. They serve as topical pages but contain nothing besides a list of links to articles. I aim to help them improve their crawl budget, and I'm unsure whether disallowing these URLs would help, or whether it could prevent some pages from being crawled and discovered.

Furthermore, the website has author pages that provide basically no value. These pages are non-indexable and don't have bios, images, or anything else. I told them to disallow them, but I'm not sure whether that was the right move. Any advice?

Submitted by /u/DukeVeljko. From Search Engine Optimization: The Latest SEO News https://ift.tt/CJ29nWS
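If the ID-only topic pages and author pages follow predictable URL patterns, a robots.txt rule can keep Googlebot from spending crawl budget on them. A minimal sketch, assuming hypothetical path patterns like /topic/ and /author/ (the site's real patterns will differ):

```
# robots.txt — hypothetical paths, adjust to the site's actual URL structure
User-agent: *
# Block ID-only topic pages that list links but carry no content
Disallow: /topic/
# Block empty author pages (no bios, images, or unique content)
Disallow: /author/

# Keep article discovery independent of the blocked listing pages
Sitemap: https://example.com/sitemap-news.xml
```

Two caveats worth keeping in mind: a Disallow rule stops crawling but not indexing, so a blocked URL can still appear in results if it is linked externally (a noindex tag, by contrast, only works if the page remains crawlable); and any article reachable only through the disallowed topic pages would lose that discovery path, so articles should also be surfaced through sitemaps or other internal links before the rules go live.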