Webmasters Asked on November 3, 2021
I’m trying to implement AJAX calls that load parts of a page only when the user reaches that part of the page, while making sure that Google is still able to read the content.
Here is what I have done:
I add `#!module=comments` to the URL. It works fine: when I click the link, I see the static content being loaded. If I access the URL directly (via `_escaped_fragment_`), it also displays the content correctly.
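The hash-bang setup described above can be sketched roughly as follows. The function name, the `module=` fragment format, and the `/ajax/` endpoint are assumptions for illustration, not the poster's actual code:

```javascript
// Parse a hash-bang fragment like "#!module=comments" into a module name.
// Returns null if the hash does not match the expected format.
function moduleFromHash(hash) {
  var match = /^#!module=([\w-]+)$/.exec(hash);
  return match ? match[1] : null;
}

// In the browser, a click handler would set location.hash and then load
// the matching static fragment (endpoint name is hypothetical):
if (typeof document !== "undefined") {
  window.addEventListener("hashchange", function () {
    var module = moduleFromHash(window.location.hash);
    if (module) {
      fetch("/ajax/" + module + ".html")
        .then(function (res) { return res.text(); })
        .then(function (html) {
          document.getElementById(module).innerHTML = html;
        });
    }
  });
}
```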
Now the SEO part with GoogleBot…
Google Search Console lets you simulate a crawl. I tested the URL with the script and logged all visits to the static HTML page to see whether Googlebot reached it. The good news is that I can see Googlebot visiting the static HTML page.
How do I know whether the static content of this page has been added to the main page (and not treated as separate content)? I’d like to convert all my pages to AJAX loading, but only if I’m sure I’m doing the right thing.
Hash bang URLs are deprecated. Googlebot no longer treats them any differently than any other URL. There is no reason to use hash bang URLs anymore. These days it is fine to use normal URLs rather than use special crawlable AJAX URLs.
Googlebot doesn't click on anything on the page. Anything that loads into the page when users click won't be indexed as part of the page. Googlebot does follow links on the page. If it can follow a link to the comments, it will see those comments as a separate page rather than part of the page.
You can get Google to index the text of the comments and still use AJAX. Rather than require users to click to read the comments, you can load the comments into the page every time the page loads. Googlebot now executes JavaScript that runs when the page loads and includes any loaded content as part of the page.
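A minimal sketch of that recommendation: fetch the comments as soon as the page loads rather than on click, so the rendered page that Googlebot indexes already contains them. The endpoint path, element ID, and `data-article-id` attribute are assumed names for illustration:

```javascript
// Build the URL of the comments fragment for a given article ID
// (the "/ajax/comments" endpoint is hypothetical).
function commentsUrl(articleId) {
  return "/ajax/comments?article=" + encodeURIComponent(articleId);
}

// Load the comments on page load instead of on click, so that
// Googlebot's renderer sees them as part of the page.
if (typeof document !== "undefined") {
  document.addEventListener("DOMContentLoaded", function () {
    var container = document.getElementById("comments");
    fetch(commentsUrl(container.dataset.articleId))
      .then(function (res) { return res.text(); })
      .then(function (html) { container.innerHTML = html; });
  });
}
```

Because the fetch fires on `DOMContentLoaded` rather than inside a click handler, no user interaction is needed for the content to appear in the rendered DOM.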
Answered by Stephen Ostermiller on November 3, 2021