
Posts

Chinese Traffic

For the last 2 weeks I have been getting tons of traffic from China, visiting different pages. My guess is that they are copying my topics. Has anyone seen this recently? submitted by /u/No_Statement_3317 from Search Engine Optimization: The Latest SEO News https://ift.tt/1rOeMlk
Recent posts

Built a CLI that tells me if GPTBot/ClaudeBot/Perplexity can actually reach my site (and where the block is)

I kept getting "your AI visibility is low" reports from various tools that wouldn't tell me *why*. Was the block in robots.txt? At the CDN? At origin? Different fixes, different teams. I guess this sits somewhere in the "generative engine optimization" bucket, but I wanted the tool to stay very concrete: can these crawlers reach the site, and if not, where are they being blocked? So I wrote a small Node CLI that just answers that question deterministically:

```
npx @geosuite/ai-crawler-bots robots https://my-site.com
```

What it actually does:

- Parses robots.txt with line-level provenance — when a bot is Disallow'd it tells me *which line in which group*.
- For each tracked bot (24 right now: GPTBot, ChatGPT-User, OAI-SearchBot, ClaudeBot, PerplexityBot, Perplexity-User, Bytespider, etc.), reports the verdict.
- Detects Cloudflare's "Managed Content" markers (`# BEGIN Cloudflare Managed content` … `# END`) and tells me whether my own rules ...
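To make the "which line in which group" idea concrete, here is a minimal sketch of how such a robots.txt check could work. This is not the CLI's actual implementation, just an illustration under simplifying assumptions (exact user-agent token match, no wildcard or `$` support):

```javascript
// Sketch: parse robots.txt into user-agent groups, keeping the source line
// number of every Allow/Disallow rule, then report which rule decides a
// given bot's access to a given path.
function parseRobots(text) {
  const groups = [];
  let current = null;
  text.split(/\r?\n/).forEach((raw, i) => {
    const line = raw.replace(/#.*$/, "").trim(); // strip comments
    const m = line.match(/^(user-agent|allow|disallow)\s*:\s*(.*)$/i);
    if (!m) {
      if (line === "") current = null; // blank line ends the group
      return;
    }
    const [, field, value] = m;
    if (field.toLowerCase() === "user-agent") {
      // Consecutive User-agent lines share one group; a new group starts
      // once the previous one already has rules.
      if (!current || current.rules.length) {
        current = { agents: [], rules: [] };
        groups.push(current);
      }
      current.agents.push(value.toLowerCase());
    } else if (current) {
      current.rules.push({ type: field.toLowerCase(), path: value, line: i + 1 });
    }
  });
  return groups;
}

function verdict(groups, bot, path) {
  const name = bot.toLowerCase();
  // Prefer the group naming this bot exactly; otherwise fall back to "*".
  const group =
    groups.find(g => g.agents.includes(name)) ||
    groups.find(g => g.agents.includes("*"));
  if (!group) return { allowed: true, reason: "no matching group" };
  // Longest matching rule path wins (ties keep the earlier rule).
  let best = null;
  for (const r of group.rules) {
    if (r.path && path.startsWith(r.path)) {
      if (!best || r.path.length > best.path.length) best = r;
    }
  }
  if (!best) return { allowed: true, reason: "no matching rule" };
  return {
    allowed: best.type === "allow",
    reason: `line ${best.line}: ${best.type}: ${best.path}`,
  };
}
```

Real-world robots.txt parsing has more edge cases (wildcards, `$` anchors, percent-encoding, substring user-agent matching), which is exactly why reporting the deciding line and group is useful when debugging a block.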

Adding schema didn’t boost citations on any platform [Ahrefs SEO Case Study]

Yet another blow for the GEO schema bros' marketing and propaganda. No LLM OEMs actually said this; people just felt they could parrot it for everyone and nobody would figure it out. If you've been parroting it, that's fine, that's up to you, but do not come for anyone just because you don't like this. You're free to run your own peer-reviewed case study (which requires evidence of an actual study).

Link: https://ahrefs.com/blog/schema-ai-citations/

Adding schema didn’t boost citations on any platform. We tracked 1,885 web pages that added JSON-LD schema between August 2025 and March 2026, matched them against 4,000 control pages, and measured citation changes across Google AI Overviews, AI Mode, and ChatGPT. Adding schema produced no major uplift in citations on any platform.

| AI source | Effect on citations | Verdict |
| --- | --- | --- |
| Google AIO | −4.6% | Small but statistically significant decline relative to matched controls (both groups were declining together, but treated pages fell ... |

Planning to buy the SEMrush Plan

Planning to recommend SEMrush internally for SEO audits, on-page, outreach, backlinks, and AI citation tracking. I’ll be handling around 5 projects. Which SEMrush plan would you recommend? Also planning to try the trial version first. Would love expert suggestions. submitted by /u/toppo_prema from Search Engine Optimization: The Latest SEO News https://ift.tt/PsTE4pF

How much do exact match primary keywords still matter in blog posts now?

I’m trying to sanity check how people are handling on-page SEO today. For years, a lot of SEO guidance pushed exact or near-exact primary keywords into places like:

- title
- H1
- first paragraph
- excerpt
- SEO title
- meta description
- at least one H2
- repeated naturally throughout the body

But I’m seeing a tradeoff. When the keyword is a little awkward, forcing it into all of those spots can make the post read like SEO bait instead of something a human would naturally write. For example, a keyword like “interview questions behavioral” technically fits the search intent, but using that exact phrase in the title, H1, first sentence, and excerpt makes the writing feel weird. For people doing SEO now, how strict are you with exact match primary keyword placement? Do you still try to include the exact keyword in all of those fields, or do you mostly optimize for intent, natural language, topical coverage, internal links, and useful structure? I’m especially curious how you handle awkwa...

Horrible SEO on a website with a lot of products

I started recently at a new place and I have to do a lot. They use Drupal and have maybe around 2K product listings. The huge issue was that GSC showed more than 300K not-indexed pages, so I blocked certain characters in robots.txt and vibe coded an extension that puts nofollow on those faceted URLs. The second issue was that some products are on request and have no prices, which Google flags as an issue. Indexed pages are around 3K. I have allowed Claude and ChatGPT, generated llms.txt, updated the sitemap, put short articles about the services and an FAQ about certain products, then interlinked those to the same products from sitemap URLs. On PageSpeed Insights I got everything to 100% except the performance of the website, which is ~50%; they have some .js and .css files from modules, and Drupal uses a TON of extensions. It gets too technical and I do NOT like this platform. But our competitor ranks in the top 10 and has way worse PageSpeed Insights! The store is not in the top 10 searches for the certain prod...
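For blocking faceted URLs at the crawl level, the usual pattern is to disallow the query parameters those facets generate. A hedged sketch, where the parameter names (`f%5B`, `sort`) are hypothetical and depend on which Drupal facet module generates the URLs:

```
# Hypothetical sketch: block crawling of faceted/filtered URLs.
# Verify the actual parameter names against real faceted URLs first
# (the Drupal Facets module typically emits f[0]=..., URL-encoded as f%5B0%5D=...).
User-agent: *
Disallow: /*?f%5B
Disallow: /*?*&f%5B
Disallow: /*?sort=
Disallow: /*?*&sort=
```

One caveat worth knowing: robots.txt only stops crawling; URLs that are already indexed stay indexed until they drop out or carry a noindex that Google can actually fetch, so a Disallow alone won't clear out an existing "not indexed" backlog quickly.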