Googlebot renders, indexes AJAX-style dynamic content driven by XHR POST

Google now renders and indexes client-side, AJAX-style JavaScript POST requests. This is good news for sites that use modern JavaScript to query online APIs and produce dynamic content, because that content is now finally indexed along with the rest of a page's static content. The change comes with the Evergreen Googlebot's move to an up-to-date Chromium rendering engine, although limited AJAX crawling was supported earlier. It also retires Google's own previous advice that "the browser can execute JavaScript and produce content on the fly – the crawler cannot."

HTTP Background

The HTTP standard defines a number of request methods, of which GET is the most widely used. Browsers use GET to retrieve a URL when you type in the address bar or click a link or bookmark. POST differs from GET in that it carries a payload in the request body, meant to be unpacked on the server by an application program. HTML forms, for example, use POST to send the text of input fields to the server for processing.
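
To make the difference concrete, here is roughly what the two request types look like on the wire; the host, path and field name below are invented for illustration. Note that the GET request has no body, while the POST request carries its payload after the headers:

GET /search?q=widgets HTTP/1.1
Host: example.com

POST /search HTTP/1.1
Host: example.com
Content-Type: application/x-www-form-urlencoded
Content-Length: 9

q=widgets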

POST Problems

After early experimentation, search engines have generally avoided seeding input fields to make POST requests on their own. If a site's valuable database content is reachable only through a site-search field, with no easily discoverable links to its results pages, we wouldn't expect that content to get indexed – even by today's Googlebot. These traditional indexing problems affect pages with client-side XHR POST requests, too.

Client-side POST requests have been programmatically available through the browser's XMLHttpRequest (XHR) object since before the jQuery days. XHR lets JavaScript in the browser make a sub-request to query an online API and retrieve information to produce content "on the fly."
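
As a sketch of that older pattern, a vanilla XHR POST might look like the following; the /api/content endpoint and the element id are hypothetical, not from any real site:

// Minimal sketch of a classic XHR POST (hypothetical endpoint and element id)
var xhr = new XMLHttpRequest();
xhr.open('POST', '/api/content');
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.onreadystatechange = function () {
  // readyState 4 means the sub-request has completed
  if (xhr.readyState === 4 && xhr.status === 200) {
    // Inject the API response into the page "on the fly"
    document.getElementById('dynamic').textContent = xhr.responseText;
  }
};
xhr.send(JSON.stringify({ query: 'example' }));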

Some POST Requests Now Work with Google

Google’s new Evergreen Googlebot can now crawl and index content produced by XHR POST requests. The question of whether it could was raised by technical SEO Valentin Pletzer, who follows the Evergreen Googlebot closely by examining its support for emerging JavaScript features. Be aware that other crawlers do not yet have this capability.

Proof of Concept

Google developer Martin Splitt at first expressed doubt that XHR-style POST requests would work with the new Evergreen Googlebot, but curiosity led him to write a proof of concept, and he found that they do. Interestingly, he wrote it with neither old vanilla-JavaScript XHR patterns nor jQuery; he used the modern fetch() API. View the source code on Glitch.
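
His actual code is on Glitch; as a rough sketch of the same idea (not Splitt's implementation), a fetch()-based POST that produces content on the fly might look like this, again with a hypothetical endpoint and element id:

// Minimal sketch of the same POST using the modern fetch() API
fetch('/api/content', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: 'example' })
})
  .then(function (response) { return response.text(); })
  .then(function (text) {
    // Render the fetched content into the page
    document.getElementById('dynamic').textContent = text;
  });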

Plenty of JavaScript pages are still written with older patterns, especially jQuery, even though fetch() is the more forward-looking choice. It follows that if Googlebot can interpret a modern fetch()-based POST, it can surely interpret the older JavaScript patterns that accomplish the same request. A jQuery equivalent appears below.
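
For comparison, the jQuery equivalent of the same hypothetical request is nearly a one-liner:

// jQuery version of the same POST, assuming the same hypothetical endpoint
$.post('/api/content', { query: 'example' }, function (data) {
  $('#dynamic').text(data);
});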

Things to Note

When Google renders dynamic content driven by XHR POST requests, each sub-request counts against your crawl budget. The content fetched by the POST isn’t cached as part of the page, so your effective budget shrinks by the number of XHR requests needed to assemble each page. If you had a crawl budget of 100 fetches, for example, and each page’s template made one XHR POST request for on-the-fly content, each page would cost two fetches – so it appears only 50 of your pages would get crawled and cached for use with Google’s search index.


 
