Let me explain.
I specialize in local SEO.
I’ve been diving deep into semantic search and Google’s natural language processing (NLP) tooling, and I’ve been having fun with tools highly recommended by the experts in our industry, like Surfer AI and SemRush’s Writing Assistant.
Great fun.
But here’s the problem NO ONE is talking about.
These tools and methods can rely heavily on taking the breakdown from the top-ranking sites, i.e.:
• the ideal range of mentions for a given keyword and its synonyms
• the ideal ranges for the number of headers, paragraphs, and images
Those factors are then combined into a content score benchmarked against Google’s NLP tool.
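For anyone who hasn’t peeked under the hood, here’s roughly what that benchmarking step amounts to. A minimal Python sketch of the general idea, not Surfer’s or SemRush’s actual code; the URLs are placeholders and the min/max “ideal range” heuristic is my assumption:

```python
import re

import requests
from bs4 import BeautifulSoup


def page_stats(url: str, keyword: str) -> dict:
    """Tally the on-page factors these tools benchmark against."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ").lower()
    return {
        "keyword_mentions": len(re.findall(re.escape(keyword.lower()), text)),
        "headers": len(soup.find_all(["h1", "h2", "h3"])),
        "paragraphs": len(soup.find_all("p")),
        "images": len(soup.find_all("img")),
    }


def ideal_ranges(top_urls: list[str], keyword: str) -> dict:
    """Treat the min/max observed across the top results as the 'ideal' range."""
    stats = [page_stats(u, keyword) for u in top_urls]
    return {
        factor: (min(s[factor] for s in stats), max(s[factor] for s in stats))
        for factor in stats[0]
    }


# Placeholder URLs standing in for the current top 3 results.
print(ideal_ranges(
    ["https://example.com/a", "https://example.com/b", "https://example.com/c"],
    "general contractor augusta me",
))
```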
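And for what it’s worth, “Google’s NLP tool” is something you can query directly: the Cloud Natural Language API will tell you which entities it extracts from your copy and how salient it considers each one. A minimal sketch (requires the google-cloud-language package and GCP credentials; whether the scoring tools hit exactly this endpoint is my assumption):

```python
from google.cloud import language_v1


def entity_salience(text: str) -> list[tuple[str, float]]:
    """Return (entity, salience) pairs, most salient first."""
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )
    response = client.analyze_entities(document=document)
    # Sort so the terms Google considers most central to the page come first.
    return sorted(
        ((e.name, e.salience) for e in response.entities),
        key=lambda pair: pair[1],
        reverse=True,
    )


# Hypothetical page copy for the local keyword in question.
for name, salience in entity_salience(
    "Smith & Sons is a general contractor serving Augusta, ME."
):
    print(f"{name}: {salience:.3f}")
```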
What happens when the top-ranking sites for a local keyword are all junk? How do you know how to format your page to achieve that ever-so-sweet 💯% (or as close as possible) optimization score?
For example, if I want to optimize for “General Contractor Augusta ME” but all the leading sites suck at optimizing for it, then there’s no point of reference on which to base a solid semantic-search foundation.
So, do you then scale up and go national with the keyword, then localize the content? For example, optimize for the term “general contractor” and pepper in your local terms and language?
What’re your thoughts? Am I thinking about this too deeply?
Lay it on me.