Webmasters — Asked by Loko on March 10, 2021
I currently use ReactJS for my website and was wondering whether changing to server-side rendering (NextJS) would impact my search engine rankings.
I've tried some tools that give your website an "SEO score". It doesn't seem as if their crawlers are penalizing my website for the client-side rendering.
So does it actually impact my SEO if I change from client-side rendering to server-side rendering, even if only minimally?
I know there are already questions here about server-side rendering, like "Server-side rendering for search engines only (SEO)", but they don't answer my direct question. I also keep reading conflicting things about it.
Google is getting much better at crawling and indexing sites built with JavaScript on the client side. Some client-side rendered sites are starting to do fairly well with SEO. Server-side rendering is no longer a requirement. However, there are some limitations to client-side rendering:
Search engines other than Google don't have crawling technology advanced enough for client-side rendering. Bing, Yandex, and Baidu are not currently able to index client-side rendered sites. If you want to appear in those search engines, you need server-side rendering. That said, Google holds about 90% of the search market share in many countries, so appearing only in Google isn't always a deal breaker.
It is more computationally intensive to crawl and process client-side rendered sites. Google has been working hard to speed up the process. Google has reported that their "render queue" is now only minutes long. However, I've seen reports that indexing can still take weeks longer for a client-side rendered site.
You need to break up your site into pages. Google expects to be able to deep link into the specific content that users search for. A "single page application" is not search engine friendly. Even if you load content via AJAX for users, you need to make sure that the URL changes when new content appears (by using pushState) and that when somebody lands on a deep URL they see the correct content.
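A minimal sketch of that pattern (the /api/articles endpoint, the #content container, and the URL scheme are all hypothetical placeholders):

```js
// Render the content for one article into the page (no URL change)
async function renderArticle(slug) {
  const res = await fetch(`/api/articles/${slug}`);
  const article = await res.json();
  document.querySelector('#content').innerHTML = article.html;
}

// Navigate: give the new content its own crawlable, shareable URL
function navigateToArticle(slug) {
  history.pushState({ slug }, '', `/articles/${slug}`);
  renderArticle(slug);
}

// Back/forward buttons must restore the matching content
window.addEventListener('popstate', (event) => {
  if (event.state && event.state.slug) renderArticle(event.state.slug);
});

// Someone landing directly on a deep URL must see the right content
if (location.pathname.startsWith('/articles/')) {
  renderArticle(location.pathname.split('/').pop());
}
```

With server-side rendering (as in NextJS), most of this bookkeeping disappears, because every URL already returns its own rendered HTML.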
Googlebot doesn't behave like a normal visitor. It doesn't scroll the page and it doesn't click on anything. Any functionality that can only be accessed by scrolling or clicking won't get indexed. Only content that loads into the page within a couple of seconds of the onload event, without any user interaction, will get indexed.
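As an illustration, the difference often comes down to what triggers the request (the loader and endpoint here are hypothetical):

```js
// Hypothetical loader that fetches comments and renders them into the page
async function fetchComments() {
  const res = await fetch('/api/comments'); // placeholder endpoint
  const data = await res.json();
  document.querySelector('#comments').innerHTML = data.html;
}

// Googlebot never scrolls, so this version would never get indexed:
//   window.addEventListener('scroll', fetchComments, { once: true });

// Fetching immediately on load keeps the content indexable, provided
// it renders within a few seconds:
document.addEventListener('DOMContentLoaded', fetchComments);
```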
You need to link to all your pages. When Googlebot renders pages, it scans the document object model (DOM) for links to other pages on your site. You should ensure that navigation elements are rendered with <a href="page.html"> links so that Googlebot can find all the pages on your site. You can intercept clicks on these links for visitors and load content via AJAX, but Googlebot needs anchor elements for navigation. Using other types of elements with click events for navigation kills your SEO.
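A sketch of that interception pattern, reusing the hypothetical navigateToArticle helper from the earlier example:

```js
// Navigation stays as plain <a href="/articles/..."> markup in the DOM,
// which Googlebot can follow. For real visitors, clicks are intercepted
// and handled client-side instead of triggering a full page load.
document.addEventListener('click', (event) => {
  const link = event.target.closest('a[href^="/articles/"]');
  if (!link) return;
  event.preventDefault();
  navigateToArticle(link.getAttribute('href').split('/').pop());
});
```

A div with an onclick handler would behave identically for users, but it is invisible to Googlebot's link discovery.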
Answered by Stephen Ostermiller on March 10, 2021
So does it actually impact my SEO if I change from client-side rendering to server-side rendering?
No. Just make sure that Google can crawl and render the content that you are serving. Test it using the URL Inspection tool in Google Search Console or https://search.google.com/test/rich-results, and check the JS loading issues and the rendered source code to make sure the content is visible.
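Besides those tools, a quick local check is to fetch the raw HTML your server returns and see whether the important content is there before any JavaScript runs. A sketch for Node 18+ (the URL and phrase are placeholders):

```js
// Run as an ES module (e.g. node check.mjs). This fetches the raw server
// response without executing any JavaScript, then looks for key content.
const res = await fetch('https://example.com/some-page');
const html = await res.text();

if (html.includes('Expected headline text')) {
  console.log('Content is present in the server-rendered HTML.');
} else {
  console.log('Content appears only after client-side rendering.');
}
```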
Answered by Amine Dahimene on March 10, 2021