Google Updates Snippet Guidance and Robots.txt Rules as EU Proposes Search Data Sharing
Google clarified “Read more” deep links and tightened its robots.txt interpretation, while Brussels moved to force search data sharing with rivals and AI chatbots.

Three developments have made technical SEO feel less like guesswork and more like compliance work. Google’s Search Central now spells out best practices for “Read more” deep links, Google has rewritten its robots.txt guidance to explain how it interprets the file, and the European Commission has advanced a proposal that would open Google search data to competitors under the Digital Markets Act.
The snippet change is small on paper but important in practice. Google defines a “Read more” deep link as a link within a snippet that takes users to a specific section on the page. By adding best practices for those links, Google has given publishers and SEO teams a clearer standard for how section-level content can surface in search. For agencies, that raises the value of auditing page structure, snippet presentation, and the parts of a page most likely to be used as a destination for deeper navigation.
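A deep link can only land on a section that is addressable, so one practical starting point is checking whether a page’s key headings carry fragment anchors at all. The sketch below is an illustrative audit script, not anything from Google’s documentation; the URL is a placeholder and the class name is hypothetical.

```python
# Illustrative audit sketch (not a Google tool): list which headings on a
# page expose an id attribute that could serve as a section anchor for a
# "Read more" style deep link. The URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class HeadingAnchorAudit(HTMLParser):
    """Collect h1-h6 tags and whether each carries an id anchor."""
    def __init__(self):
        super().__init__()
        self.headings = []  # list of (tag, id-or-None)

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings.append((tag, dict(attrs).get("id")))

url = "https://example.com/some-long-guide"  # placeholder page to audit
audit = HeadingAnchorAudit()
audit.feed(urlopen(url).read().decode("utf-8", errors="replace"))

for tag, anchor in audit.headings:
    status = f"anchored as #{anchor}" if anchor else "no id: not linkable as a section target"
    print(f"{tag}: {status}")
```

A heading without an id is invisible as a deep-link destination no matter how well the section is written, which makes this one of the cheaper checks to automate across a client’s templates.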
The bigger operational risk sits with robots.txt. Google’s updated specification page now explains more explicitly how it interprets robots.txt rules, and Google says its crawlers respect the file during automated crawling. Google has also signaled that it may expand its list of unsupported robots.txt rules by studying real-world usage patterns in HTTP Archive, which puts more pressure on teams that still rely on nonstandard directives, old templates, or the assumption that search crawlers will quietly forgive messy syntax. If a rule has only ever worked by accident, this is the moment to find out.
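For teams that want a quick, scriptable sanity check before rereading the full spec, Python’s standard library ships a robots exclusion parser. The sketch below uses it with a made-up rule set and example paths; note that urllib.robotparser implements the generic protocol rather than Google’s documented interpretation (its rule-precedence behavior differs, for instance), so treat any disagreement as a prompt to consult Google’s spec directly.

```python
# Minimal robots.txt sanity check using Python's standard library parser.
# Caveat: urllib.robotparser implements the generic robots exclusion
# protocol, not Google's exact interpretation, so edge cases can diverge.
# The rule set and paths here are illustrative examples.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: Googlebot
Allow: /private/press/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for path in ("/private/report.html", "/private/press/launch.html", "/blog/"):
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

Running a check like this against the rules a site actually serves is a fast way to surface directives that do nothing, or that block more than anyone intended.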
That makes robots.txt the first file to audit, because it controls crawl behavior at scale and now sits closer to documented enforcement. Next comes deep links, where agencies should check whether important sections are structured in a way that can support snippet-level navigation. The third priority is policy exposure, because the regulatory side of search is moving in parallel with the product side.
On April 16, 2026, the European Commission issued preliminary findings to Alphabet and proposed measures that would require Google to share search data, including ranking, query, click, and view data, with third-party search engines on fair, reasonable, and non-discriminatory terms. The Commission also said it is seeking feedback from interested third parties. In proceedings opened on January 27, 2026, it explicitly raised whether AI chatbot providers should be eligible to access the data as part of the same interoperability and search sharing framework.

Google’s own Search Central documentation updates page shows how frequently these rules are changing. For agencies, the message is straightforward: technical SEO is now a documentation watchlist, a crawl-control discipline, and a policy problem at the same time.