On this week’s episode of the Niche Pursuits podcast, Sergey Lucktinov and I discuss how semantic SEO is evolving in response to the rise of AI-powered search engines and large language models (LLMs). We dive into how AI’s content fetching and ranking systems differ from traditional SEO, and what website owners need to change in their approach to content creation, site structure, and technical performance to stay ahead.
While the interview explores some deep technical ideas, the core takeaway is this: the SEO game is shifting, and content creators must adapt or be left behind.
Watch the full episode
From traditional SEO to semantic SEO: the evolution
Sergey has been active in SEO for over 15 years, first in-house, then at agencies, and finally managing his own affiliate sites. For most of his career he had no major problems with Google updates, until one hit his sites a few years ago.
That led to a deeper dive into Semantic SEO, specifically the approach pioneered by Koray Tuğberk Gübür, which emphasizes building topical authority and semantically structuring websites rather than relying heavily on backlinks.
Semantic SEO depends on:
- Macro-semantics: how your website is structured across categories and topics.
- Micro-semantics: how individual pages are structured and written.
- Topical authority: covering a topic in depth to build trust with search engines.
- Topical maps: organizing content into macro (broad), seed (mid-level), and node (specific) pages.
After studying AI infrastructure, Sergey realized that roughly 90% of Koray's system reflects the technical principles underlying AI search.
How LLMs retrieve and rank content
The biggest change Sergey highlights is how LLMs like ChatGPT retrieve information differently than Google’s traditional search engine.
- Search engines are deterministic; they return the same results for the same query under the same conditions.
- LLMs use probabilities. They take information from various sources and synthesize an answer using the ‘cheapest’ and clearest content available.
This change in methodology means that your content now must serve two masters: the algorithmic consistency of search engines and the probabilistic logic of LLMs. What LLMs look for when retrieving content:
- Clarity and confidence in language.
- A tight, well-defined structure that reflects their internal knowledge systems.
- Semantic relevance, including related entities and topics.
- Speed of delivery, as slow websites are immediately excluded from retrieval.
According to Sergey, even a five-second delay can make your page ineligible for the retrieval process.
What is Semantic Retrieval Optimization (SRO)?
Sergey introduces his concept of Semantic Retrieval Optimization (SRO), an evolution of Semantic SEO specifically tailored to the way LLMs process and extract content.
SRO is about shaping your content and website structure to match the way AI systems retrieve, evaluate and compile responses. Key components of SRO:
Website structure
- Macro pages cover broad topics and link up to the home page.
- Seed pages cover narrower topics and link to node pages.
- Node pages handle long-tail queries and link back up the chain.
Strict hierarchical linkage
- Macro → Seed → Node.
- Node → Seed → Macro.
- Don't skip levels or link across them at random.
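The linking rules above can be sanity-checked programmatically. Here is a toy sketch (not from the episode; the page URLs and data layout are illustrative assumptions) that flags any internal link skipping a level of the macro → seed → node hierarchy:

```python
# Hypothetical sketch: audit internal links against the macro/seed/node
# hierarchy. A link is allowed only between adjacent levels.

LEVELS = {"macro": 0, "seed": 1, "node": 2}

def link_is_valid(source_level: str, target_level: str) -> bool:
    """A link is allowed only between adjacent hierarchy levels."""
    return abs(LEVELS[source_level] - LEVELS[target_level]) == 1

# Example site map: each page declares its level and its outgoing links.
pages = {
    "/seo-guide": ("macro", ["/seo-guide/on-page"]),
    "/seo-guide/on-page": ("seed", ["/seo-guide", "/seo-guide/on-page/title-tags"]),
    "/seo-guide/on-page/title-tags": ("node", ["/seo-guide/on-page"]),
}

def audit(pages):
    """Return every link that skips a level, e.g. macro -> node."""
    bad = []
    for url, (level, links) in pages.items():
        for target in links:
            target_level = pages[target][0]
            if not link_is_valid(level, target_level):
                bad.append((url, target))
    return bad

print(audit(pages))  # an empty list means the hierarchy is respected
```

Running an audit like this on a crawl export is one simple way to enforce the "never skip levels" rule at scale.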
Content clarity
- Each page and section should focus on one idea.
- Use H2s and H3s as separate ‘chunks’ of information.
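To make the "chunks" idea concrete, here is an illustrative sketch of splitting an article into heading-delimited sections, mirroring how the episode describes LLMs treating H2/H3 blocks as separate units. The splitting logic is my assumption, not a documented AI behavior:

```python
# Hypothetical sketch: split markdown into (heading, body) chunks at
# H2/H3 headings, so each chunk covers exactly one idea.
import re

def chunk_by_headings(markdown: str):
    """Split markdown text into (heading, body) pairs at H2/H3 headings."""
    parts = re.split(r"^(#{2,3} .+)$", markdown, flags=re.MULTILINE)
    chunks = []
    # parts alternates: [preamble, heading, body, heading, body, ...]
    for i in range(1, len(parts), 2):
        chunks.append((parts[i].strip(), parts[i + 1].strip()))
    return chunks

article = """## What is semantic SEO?
Semantic SEO structures content around meaning.

### Why it matters
Clear chunks are easier for retrieval systems to reuse.
"""

for heading, body in chunk_by_headings(article):
    print(heading, "->", len(body.split()), "words")
```

If a chunk produced this way needs its heading to make sense, or covers two unrelated ideas, that is a sign the section should be restructured.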
Micro-semantics: writing for LLMs
Once your site structure is in place, your content should follow suit. This is where micro-semantics comes into play.
What makes content “high-quality” for LLMs:
- Use of semantic triples: Simple, clear sentence structures like “X is Y” that help AI understand relationships.
- Concise, focused paragraphs: LLMs process content in parts, and each part must cover one topic.
- Factual accuracy: LLMs will penalize content that does not reflect their prior training or contradicts known facts.
- Entity-rich writing: Pages should mention closely related topics (e.g. ‘Paris’ when discussing ‘Eiffel Tower’).
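The "semantic triple" idea above can be illustrated with a deliberately naive sketch. This is not how production NLP systems extract relations; it only handles simple "X is Y" sentences, which is exactly the sentence shape the episode recommends writing:

```python
# Toy illustration of semantic triples: subject-predicate-object
# statements like "X is Y". The regex below is a naive assumption
# that only matches simple "X is Y" sentences.
import re

def extract_is_triples(text: str):
    """Pull (subject, 'is', object) triples from simple 'X is Y' sentences."""
    triples = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        m = re.match(
            r"(?:The\s+)?([\w\s]+?)\s+is\s+(?:a\s+|an\s+|the\s+)?([\w\s]+?)[.!?]?$",
            sentence,
        )
        if m:
            triples.append((m.group(1), "is", m.group(2)))
    return triples

text = "The Eiffel Tower is a landmark in Paris. Paris is the capital of France."
print(extract_is_triples(text))
```

The point is not the extraction code itself but the writing style it rewards: sentences structured this plainly give a parser, human or machine, an unambiguous relationship to record.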
According to Sergey, an ideal article is readable, accurate and built in such a way that it can be processed ‘cheaply’ by an LLM. That balance is key to ranking in AI-driven environments.
The role of technical SEO in an AI-driven world
Speed is everything. While structure and semantics matter, Sergey stresses that technical SEO is still fundamental. He estimates that getting speed, structure, and micro-semantics right covers 80% to 90% of what SRO requires.
Technical factors to be given priority:
- Page speed: If your content doesn’t load fast enough, it won’t be eligible for retrieval.
- Site structure: Clear navigation helps LLMs understand content relationships.
- Clean code and schema: AI systems favor well-structured content with clear markup.
Why speed is important:
- LLMs can retrieve more than 200 results for a search.
- Sites will be immediately disqualified if they are too slow.
- The first filter layer is completely based on speed.
Sergey describes the AI retrieval process as a multi-step filtering system:
- LLMs generate potential questions based on user input.
- They pull the top results from search engines.
- Only the fastest, clearest, and most reliable content makes it into the final answer.
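The filtering steps above can be mocked in a few lines. Everything here is an illustrative assumption (the field names, the clarity score, the cutoff), not a real AI pipeline; only the roughly five-second disqualification threshold comes from the episode:

```python
# Simplified mock of the multi-step retrieval filter: speed first,
# then a crude clarity ranking. All fields and scores are invented
# for illustration.

MAX_LOAD_SECONDS = 5.0  # the episode cites ~5s as a disqualifying delay

def retrieval_filter(candidates, top_k=3):
    """Drop slow pages, then keep the clearest remaining ones."""
    fast = [c for c in candidates if c["load_seconds"] <= MAX_LOAD_SECONDS]
    ranked = sorted(fast, key=lambda c: c["clarity_score"], reverse=True)
    return ranked[:top_k]

candidates = [
    {"url": "/fast-and-clear", "load_seconds": 1.2, "clarity_score": 0.9},
    {"url": "/fast-but-vague", "load_seconds": 0.8, "clarity_score": 0.4},
    {"url": "/brilliant-but-slow", "load_seconds": 7.5, "clarity_score": 0.95},
]

for page in retrieval_filter(candidates):
    print(page["url"])
```

Note how the slow page never gets a chance to compete on quality: in this model, as in Sergey's description, speed is a hard gate, not a tie-breaker.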
Injecting new information: the right way
LLMs use pre-trained data for general knowledge, but for recent or niche information they rely on web content. So injecting new insights is a great way to stand out if you do it right.
Strategies for incorporating new content:
- Support claims with logic or referenced case studies.
- Avoid wild, unsupported statements.
- Mention research institutions or studies when citing data, even without outbound links.
LLMs don’t necessarily follow links, but they do evaluate context and perceived authority.
Keywords are dead: long live meaning
One of Sergey’s most useful recommendations is to move away from keyword-based thinking and start optimizing based on meaning and intent.
How to change your mindset:
- Start with your customer’s journey, not keywords.
- Identify pain points and connect the content to their questions.
- Consider what a customer needs at every stage and write content that solves that.
Keyword volume matters far less now. AI cares about the depth and clarity of your answers, not whether your phrase gets 1,000 searches a month.
Tools and tactics: what to use
Sergey is building a custom SaaS suite to support this methodology, but in the meantime he uses:
- Custom GPTs trained on semantic SEO principles.
- Surfer SEO for basic optimization (though not LLM-focused).
- Manual content checks to clean up redundant or conflicting pages.
He notes that while entity graph tools exist, many are difficult to use and don’t provide actionable insights.
Final thoughts
Semantic SEO has evolved beyond search engine rankings. With the rise of LLMs, Sergey Lucktinov's Semantic Retrieval Optimization offers a way to future-proof your content strategy. Its core insights, the roughly 90% alignment between AI systems and semantic SEO, speed-based disqualification, and penalties for entity misalignment, highlight how different this new era of optimization has become.
This is how you stay competitive:
- Structure your site to reflect macro, seed, and node logic.
- Write for clarity, consistency and semantic accuracy.
- Prioritize page speed and factual accuracy.
- Focus on meaning over keywords.
- Train writers to use semantic triples and micro-semantic tactics.
The future of content isn’t just SEO-friendly. It’s AI-ready.