
Practical implementation of AI-first SEO: Tools, cases and strategies (Part 2)

In the first part, we explored how AIs are fundamentally transforming search, with AI Overviews reaching 2 billion users and reducing traditional CTR by 34.5%. We talked about the differences between platforms, the critical importance of E-E-A-T and why 95% of AI citations do not follow traditional SEO logic.

Now it's time to move from theory to practice. In this second part, we will reveal exactly how to implement a successful AI-first strategy, which tools you need, real cases with concrete numbers, and the critical aspects that almost everyone is ignoring.


Structured data: the secret language of AIs

Here's a technical fact that can make the difference between success and total failure in AI-first SEO: AI crawlers such as GPTBot, ClaudeBot and PerplexityBot cannot execute JavaScript. This means that all your structured data must be present in the initial HTML served by the server. If your site relies on JavaScript to render important content, you are invisible to AIs.
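One quick way to see what these crawlers see is to request the raw HTML yourself, without executing any JavaScript, and check whether your structured data is already in the response. The Python sketch below is a minimal illustration of that check; it looks for JSON-LD blocks (the format discussed next), and the URL and user-agent string are placeholders rather than the exact strings the real crawlers send.

```python
import json
import re

import requests

# Placeholder URL and an illustrative (not official) crawler-style user agent.
URL = "https://example.com/blog/ai-first-seo"
HEADERS = {"User-Agent": "GPTBot/1.0"}

# Fetch the raw server response the way a crawler that cannot run JavaScript would.
html = requests.get(URL, headers=HEADERS, timeout=10).text

# Look for JSON-LD blocks in the initial HTML. If they only appear after
# client-side rendering, an AI crawler will never see them.
blocks = re.findall(
    r'<script[^>]+type="application/ld\+json"[^>]*>(.*?)</script>',
    html,
    flags=re.DOTALL | re.IGNORECASE,
)

if not blocks:
    print("No JSON-LD found in the server-rendered HTML: invisible to AI crawlers.")
else:
    for raw in blocks:
        data = json.loads(raw)
        # Handle both a single object and an array of objects.
        items = data if isinstance(data, list) else [data]
        print("JSON-LD types present:", [item.get("@type") for item in items])
```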

JSON-LD has become the format of choice because of its ease of implementation and lower error-proneness. But it's not just a matter of having schema markup; it's a matter of having it correctly implemented.

The data confirm this: 72% of the results that appear on the first page of Google have some form of schema markup implemented, and users click on enriched results 58% of the time, compared to only 41% for non-enriched results.

Article Schema establishes clear authorship and publication dates, key elements for assessing freshness and authority. FAQ Schema has proven particularly powerful, generating over 3,600 clicks and 129,000 impressions in documented successful implementations.

For e-commerce, Product Schema with ratings, availability and pricing is fundamental. Person and Author Schema are essential for E-E-A-T signals, while Organization Schema establishes the business authority and legitimacy of the site.
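To make this concrete, here is roughly what a minimal FAQ Schema block can look like. The sketch below builds it as a Python dictionary and serializes it to JSON-LD; the questions and answers are placeholders, and in production the output would be embedded in the initial HTML inside a script tag of type application/ld+json.

```python
import json

# Illustrative FAQPage markup built as a Python dict and serialized to JSON-LD.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is AI-first SEO?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "AI-first SEO optimizes content so AI assistants can find, understand and cite it.",
            },
        },
        {
            "@type": "Question",
            "name": "Do AI crawlers execute JavaScript?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "No. Structured data must be present in the HTML served by the server.",
            },
        },
    ],
}

print(json.dumps(faq_schema, indent=2, ensure_ascii=False))
```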

The documented results speak for themselves:

  • Food Network saw a 35% increase in visits after implementing recipe schema on 80% of its pages.
  • Express Legal Funding managed to appear in Knowledge Panels after a comprehensive implementation of author markup.
  • Rakuten reported 1.5x longer time on pages and 3.6x higher interaction rate after implementing structured data correctly.

But basic implementation is no longer enough. The advanced technique that is making the difference is connected schema markup: instead of isolated blocks of structured data, you create relationships between entities. Imagine your schema as a knowledge graph where the article connects to its author, the author to the organization, and the organization to its products and services, creating an interconnected information network that AIs can navigate and understand holistically.
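A minimal sketch of what connected schema markup can look like is shown below, with Article, Person and Organization entities cross-referencing each other through @id. All names, dates and URLs are placeholders invented for illustration.

```python
import json

# "Connected" schema markup: entities reference each other via @id, forming a
# small knowledge graph instead of isolated blocks. Everything below is a placeholder.
ORG_ID = "https://example.com/#organization"
AUTHOR_ID = "https://example.com/authors/jane-doe/#person"

connected_graph = {
    "@context": "https://schema.org",
    "@graph": [
        {
            "@type": "Organization",
            "@id": ORG_ID,
            "name": "Example Corp",
            "url": "https://example.com",
        },
        {
            "@type": "Person",
            "@id": AUTHOR_ID,
            "name": "Jane Doe",
            "jobTitle": "Head of SEO",
            "worksFor": {"@id": ORG_ID},  # author -> organization
        },
        {
            "@type": "Article",
            "headline": "Practical implementation of AI-first SEO",
            "datePublished": "2025-05-01",
            "author": {"@id": AUTHOR_ID},  # article -> author
            "publisher": {"@id": ORG_ID},  # article -> organization
        },
    ],
}

print(json.dumps(connected_graph, indent=2))
```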

This difference between isolated and connected schema can be the difference between being cited occasionally and becoming the reference source in your industry.

How to measure your AI visibility?

Measuring visibility in AI requires an entirely new set of tools and metrics that go beyond traditional ranking.

1. Tools

  • Google Search Console, that free tool we all know, already tracks AI Overviews in its Performance Report, although without dedicated filtering yet. When your site appears in an AI Overview, it always shows position number 1, which can be misleading if you are used to traditional ranking metrics.
  • Semrush launched its AI SEO Toolkit at $99 per month per domain, which tracks your presence on ChatGPT, SearchGPT, Google AI Mode and Overviews, Gemini and Perplexity, calculating a comprehensive AI Visibility Score with sentiment analysis included.
  • Ahrefs responded with Brand Radar, focusing on the tracking of brand mentions through
    AI platforms with deep competitive analysis.
  • BrightEdge, always at the forefront of business, offers its Generative Parser with real-time monitoring from November 2023.
  • Otterly.AI, at only $29 per month, stands out for its focus on neutral monitoring without personalization bias, crucial for understanding how new users view your content.
  • Peec AI offers plans from $89 to $499 per month with tracking from 25 to 300+ prompts and multi-country support.
  • WriteSonic recently launched AI Traffic Analytics, a tool that promises to specifically track traffic coming from ChatGPT, Gemini and Perplexity.

2. Metrics

The metrics you need to track are fundamentally different from the traditional ones:

  • AI Visibility Score: measures the percentage of relevant queries where your brand appears in AI responses.
  • Citation Rate: tracks how often you are cited as a source.
  • Share of Voice: calculates your share of mentions versus your competitors.

And perhaps the most important metric: AI traffic conversion, which according to recent data converts 4.4 times better than traditional organic traffic, amply justifying the investment in this new form of optimization.
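To make these definitions concrete, here is a minimal sketch of how the three metrics could be computed from a hand-rolled tracking log. The data structure and the figures are invented for illustration; commercial tools compute the same ratios over much larger prompt samples.

```python
from dataclasses import dataclass

# Hypothetical tracking log: for each monitored prompt we record whether the brand
# appeared in the AI answer, whether it was cited as a source, and how many brand
# mentions (ours and total, including competitors) the answer contained.
@dataclass
class PromptResult:
    prompt: str
    brand_appeared: bool
    brand_cited: bool
    brand_mentions: int
    total_brand_mentions: int  # our mentions + competitors' mentions

results = [
    PromptResult("best ai seo tools", True, True, 1, 4),
    PromptResult("how to appear in ai overviews", True, False, 1, 3),
    PromptResult("schema markup for llms", False, False, 0, 5),
]

n = len(results)
ai_visibility_score = 100 * sum(r.brand_appeared for r in results) / n
citation_rate = 100 * sum(r.brand_cited for r in results) / n
share_of_voice = 100 * sum(r.brand_mentions for r in results) / sum(
    r.total_brand_mentions for r in results
)

print(f"AI Visibility Score: {ai_visibility_score:.1f}%")
print(f"Citation Rate:       {citation_rate:.1f}%")
print(f"Share of Voice:      {share_of_voice:.1f}%")
```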

Success stories and failures

Real cases with concrete numbers are the best way to understand the potential and risks of AI-first SEO.

Transformational success

The Search Initiative achieved a 2,300% growth in monthly AI traffic in just 12 months, going from zero to having 90 keywords consistently appearing in AI Overviews.

Their strategy was not revolutionary in concept, but it was revolutionary in execution:

  1. They created meticulously structured content for AI readability, with 60-100 word paragraphs.
  2. They aggressively targeted long-tail conversational queries.
  3. They systematically enhanced their E-E-A-T signals with author bios and verifiable credentials.
  4. They obsessively monitored their performance across multiple AI platforms.

Xponent21 took this even further with organic traffic growth of 4,162% year over year, reaching 10.5 million search impressions and 20,100 clicks by May 2025.

Its 14-step framework included:

  • Rapid creation of content clusters, with 100+ pieces in a matter of months.
  • A multi-format content ecosystem of articles, videos, infographics and podcasts.
  • Strategic publishing on external platforms to maximize their digital footprint.
  • Advanced schema markup implemented on every piece of content.

The result was not just vanity traffic: they generated a multi-million dollar sales pipeline with qualified leads coming in daily from AI searches.

Lyzr AI proved that you don't need years to see results, achieving 150% traffic growth in just 3 months using AI-specific optimization tools like Surfer SEO.

Rocky Brands transformed its business with a 30% increase in search revenue and 74% year-over-year revenue growth, all attributable to its AI-first strategy.

Instructive failures

Failures are equally instructive and serve as a crucial warning.

The case of Causal is particularly sobering. They partnered with Byword for massive AI content generation, publishing hundreds of articles in a few weeks without human oversight or quality checking.

The result was catastrophic: they dropped from 650,000 monthly visitors to just 3,000 in 30 days when Google identified the content as AI-generated spam.

The March 2024 algorithm update was particularly brutal, causing massive de-indexing of sites that relied excessively on AI-generated content.

The patterns of failure are consistent: scale without quality, automated keyword stuffing, lack of real expertise signals, generic content creation without unique value, and lack of human oversight.

The lesson is clear: AI is a powerful tool, but without strategy and human oversight, it is a recipe for disaster.


Critical issues that 90% ignore

While everyone is obsessing over text and keywords, there are massive opportunities that almost no one is taking advantage of. Multimodal and voice search represents one of these overlooked opportunities.

Google Lens processes 20 billion visual searches per month, a number that is growing exponentially. There are 86.1 million active voice search users, and most relevant for e-commerce: 20% of visual searches are specifically for shopping.

Optimization for these modalities requires a completely different approach. The alt text must
be detailed and descriptive, not only for accessibility but also for AI understanding.
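As a starting point, a simple audit script can flag images whose alt text is missing or too thin to be genuinely descriptive. The sketch below is illustrative: the URL is a placeholder and the 25-character threshold is an arbitrary assumption, not an official guideline.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Placeholder page to audit; swap in your own URL.
URL = "https://example.com/products"

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    # Assumed heuristic: anything under 25 characters is unlikely to be descriptive.
    if len(alt) < 25:
        print(f"Weak alt text on {img.get('src')!r}: {alt!r}")
```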

Image-specific schema markup is critical. Your content must be conversational and answer natural questions. And something few people consider: how your content sounds when read aloud by voice assistants. A paragraph that reads well on screen can sound terrible when Alexa pronounces it.

The technical differences between AI models run deeper than they appear on the surface.

ChatGPT doesn't just use the Bing API; it has exclusive content deals with Associated Press, Reuters and the Financial Times that significantly influence its responses. Perplexity, with its proprietary crawler, has drawn controversy over accusations of ignoring robots.txt files, but its love for Reddit and user-generated content is undeniable, with 3.2 million citations. Claude stands out for searching automatically, without manual user triggers, whenever it detects that it needs updated information.

Gemini maintains a more agnostic stance but with a clear preference for YouTube content, taking advantage of the synergy within the Google ecosystem.

The ethical and legal risks are real and growing. GDPR compliance requires consideration of how content is processed by AIs, especially when it includes personal data. Algorithmic bias is a documented problem: AIs perpetuate and amplify existing biases in their training data.

Google has shown that it will aggressively detect and penalize mass-produced AI-generated content that lacks human oversight. And there's a new emerging standard that few know about: the llms.txt file, similar to robots.txt but specifically aimed at how AI crawlers access and use your content.
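For reference, the sketch below writes one possible llms.txt, following the draft convention of a markdown file served at the root of the site with a title, a short summary and curated links. The format is still an emerging proposal, so treat the exact structure, and every name and URL in the example, as an assumption.

```python
from pathlib import Path

# A sketch of what an llms.txt file might contain, following the draft llms.txt
# proposal (a markdown file served at /llms.txt). Structure and links are placeholders.
LLMS_TXT = """\
# Example Corp

> Example Corp publishes practical guides on AI-first SEO and structured data.

## Guides

- [AI-first SEO, part 1](https://example.com/ai-first-seo-1): how AI assistants change search
- [AI-first SEO, part 2](https://example.com/ai-first-seo-2): tools, cases and strategies

## Policies

- [Content usage policy](https://example.com/ai-policy): how our content may be reused
"""

Path("llms.txt").write_text(LLMS_TXT, encoding="utf-8")
print(Path("llms.txt").read_text(encoding="utf-8"))
```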

Step-by-step implementation strategy

Successful implementation of an AI-first strategy requires a systematic approach that I have seen work consistently. During the first two weeks, the focus should be on in-depth auditing and preparation. Honestly assess your current E-E-A-T situation:

  1. Do you have detailed and verifiable author biographies?
  2. Are you consistently citing reliable sources?
  3. Does your content demonstrate real expertise or does it just repeat information available elsewhere?

Simultaneously, review your current structured data. Many sites have incorrectly or incompletely implemented schema. Validate everything with the Schema.org Validator and prioritize Article, FAQ and Organization as fundamental schemas.

Also analyze your competition in the context of AI: search for key terms in your industry on ChatGPT, Perplexity and Claude, meticulously documenting who appears and, more importantly, analyzing why they appear.

Reformat your existing content to be AI-friendly: 60- to 100-word paragraphs that can be easily excerpted and cited, clear structure with descriptive subheadings, and direct, concise answers at the beginning of each section. New content should be created specifically with AI in mind: comprehensive guides of 1,500+ words that cover topics in depth, case studies with real data and transparent methodology, and detailed comparisons, remembering that comparative listicles account for 32.5% of all AI citations.

💡 Listicle is a content format that combines an article and a list, presenting information in an organized manner through numbered or bulleted points, each with its respective description.
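Coming back to the 60-100 word recommendation above: a trivial script can flag the paragraphs of an existing post that fall outside that range before you invest in a full rewrite. The sketch below is a rough heuristic with placeholder text, not a rule.

```python
# Illustrative heuristic: flag paragraphs outside the 60-100 word range recommended
# above for easily excerptable, citation-friendly content. ARTICLE is placeholder text.
ARTICLE = """\
First paragraph of an existing post that may be far too short.

Second paragraph of the post, which in a real audit could be a long wall of text.
"""

for i, paragraph in enumerate(ARTICLE.split("\n\n"), start=1):
    words = len(paragraph.split())
    if not 60 <= words <= 100:
        print(f"Paragraph {i}: {words} words - consider rewriting (target 60-100).")
```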

Improving trust signals should be continuous but systematic. Add clearly visible publication and update dates, include methodology sections when presenting data or conclusions, and cite reliable external sources liberally. Don't be afraid to link to competitors when relevant; AIs value objectivity and comprehensiveness over commercial bias.

Measurement and adjustment should become a weekly ritual:

  • Set up Google Search Console to start collecting AI Overviews data immediately.
  • Invest in at least one third-party tool.
  • Set brand alerts on all major AI platforms.
  • Check your AI Visibility Score, per-platform Citation Rate and, above all, the conversion of traffic coming from AI every week.
  • Perform constant A/B testing of formats and structures, adjust your strategy according to which platform is generating the best results and aggressively scale what works while quickly abandoning what doesn't.

The future of SEO is here

Data-backed predictions paint a clear and inevitable picture. By 2025, 80% of SEO experts consider AI "extremely important" to their strategy. Gartner predicts a 25% drop in traditional search volume by 2026.

The AI SEO market will reach $4.97 billion by 2033. And the market for AI agents, those automated assistants that will perform SEO for us, will reach $103.6 billion by 2032, growing at an annual rate of 44.9%.

Industry leaders are being brutally honest about the magnitude of the change:

  • Barry Schwartz states that "search as we knew it is disappearing before our eyes."
  • Rand Fishkin is even more direct: "60% of searches result in zero clicks. The golden age of SEO is over."
  • Even Danny Sullivan, now at Google, acknowledges that there is a real and substantiated perception of advantage for big brands in this new paradigm.

The consensus among 22 leading SEO experts recently interviewed converges on key points
that define the immediate future:

  1. Technical excellence is no longer optional but the price of entry.
  2. The shift in focus from keywords to user intent and experience is irreversible.
  3. Building brand authority through E-E-A-T is more important than any technical hack.
  4. Multi-platform presence beyond Google is essential for survival.
  5. Integrating AI as a tool while maintaining human expertise as a differentiator is the balance that defines success.
  6. And perhaps most importantly: authentic community building and real engagement is becoming the ultimate quality signal that no AI can fake.

The time to act is now

SEO is not dead, but the version we knew is fast disappearing. The future of
SEO is unequivocally AI-first. The challenge is no longer simply to be on the first page of Google, but to train machines to make your brand the automatic reference when a human asks an AI about your industry.

The evidence is overwhelming and consistent. Companies that have adopted AI-first strategies are seeing growth of 2,300% to 4,162%. Those that are sticking to traditional tactics are seeing their traffic evaporate month over month.

AI traffic converts 4.4 times better than traditional organic traffic, justifying any necessary investment.

The key elements for success are clear:

  • Quality has definitively triumphed over quantity: one exceptional article that the AIs consider authoritative is worth more than a thousand mediocre keyword-optimized pages.
  • E-E-A-T is not a suggestion; it is the price of entry. Structured data has gone from being a best practice to absolutely mandatory.
  • Constant measurement and continuous adjustment are vital because each platform evolves on a daily basis.
  • The human factor remains irreplaceable: authenticity, real experience and the unique human voice are more valuable than ever.

The next SEO battle isn't against other websites; it's to earn the trust of AIs. And while this may seem daunting, it also represents the greatest opportunity in the history of digital marketing for those willing to evolve.

The data is clear, the tools are available, the success stories are documented and you now have the complete framework to implement your own AI-first strategy. The question is not whether you should adapt your SEO strategy for the AI era. The question is whether you will do it fast enough to capitalize on this historic transformation, or whether you will be another casualty of the most fundamental shift in search since Google came along more than two decades ago.

Software engineer specialized in AI


Antonio Romero

I love difficult technical challenges and I'm always learning something new. I see every project as my opportunity to transform ideas into real code, using the latest technologies and collaborating with the team. Python is my go-to language for developing robust and scalable systems.
