
I’m developing a website that offers free tools like converters, compressors, and calculators. My setup includes ASP.NET Core as the backend API and React.js for the frontend.

While researching ways to enhance SEO, I came across mixed opinions about Google's ability to index client-side rendered apps. Since React apps rely on client-side rendering (CSR), checking the page source shows only minimal HTML; React generates the full content dynamically with JavaScript.
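To make that concrete, here is a minimal sketch of the difference (the markup and the helper function below are illustrative assumptions, not taken from any real build):

```javascript
// Typical page source of a client-side-rendered React app:
// the server sends an empty shell and JavaScript fills it in later.
const csrShell = `
<!DOCTYPE html>
<html>
  <head><title>Free Tools</title></head>
  <body>
    <div id="root"></div>
    <script src="/static/js/main.js"></script>
  </body>
</html>`;

// What a server-side-rendered response might look like instead:
// the content is already in the HTML before any script runs.
const ssrPage = `
<!DOCTYPE html>
<html>
  <head><title>Free Tools</title></head>
  <body>
    <div id="root"><h1>Image Compressor</h1><p>Compress PNG and JPEG files.</p></div>
    <script src="/static/js/main.js"></script>
  </body>
</html>`;

// Illustrative check: does the raw HTML contain any visible text
// inside the root element, before any JavaScript executes?
function rootHasContent(html) {
  const match = html.match(/<div id="root">([\s\S]*?)<\/div>/);
  if (!match) return false;
  return match[1].replace(/<[^>]*>/g, "").trim().length > 0;
}

console.log(rootHasContent(csrShell)); // false: a crawler that skips JS sees an empty shell
console.log(rootHasContent(ssrPage));  // true: content is visible without JS
```

This is the gap the question is about: whether Google's crawler actually runs the JavaScript needed to get from the first state to the second.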

Here’s my concern: if I continue developing the site with React.js, will this negatively impact my SEO traffic? Will Google's crawlers be able to index the pages effectively? Or should I switch to server-side rendering (SSR), perhaps by using ASP.NET Core to render the pages fully?

When I inspect the React page in Google Search Console, Google correctly shows the crawled page with the final HTML. Does this mean I'm good to go with React without moving to SSR?

I’d appreciate insights from those who have dealt with React and SEO challenges or have recommendations on the best approach for my project. I still have time to shift to a server-side solution if needed.

  • I think my answer to "Do search engines perform JS rendering while crawling?" should have everything you need. It isn't specifically about React, but your question could apply to any client-side rendered framework. Commented Oct 15, 2024 at 22:06
  • Thank you @StephenOstermiller. In Google Search Console, when I open the URL inspection section and click View Crawled Page, I can clearly see the whole HTML. That means Google can see the HTML, right? So there is no need to change the current rendering method to server-side rendering? Commented Oct 21, 2024 at 8:37
  • With the caveat that indexing is much slower when you make changes or add content. Commented Oct 21, 2024 at 8:59

2 Answers


You should really only consider SSR if:

  1. Your site isn’t being indexed well and errors pop up in Search Console.
  2. Your pages are rendering very slowly, which could cause issues for search bots when crawling.

If everything’s fine on those fronts, just relax.

The main thing is to keep an eye on the state of your pages in GSC.


Google will wait for JavaScript to render, as long as it's not too slow.

  • Crawling Phase: Googlebot fetches a URL from the crawling queue by making an HTTP request
  • Rendering Queue: Googlebot queues all pages for rendering, unless a robots meta tag or header tells Google not to index the page
  • Rendering Phase: Once Google's resources allow, a headless Chromium renders the page and executes the JavaScript

From Understand JavaScript SEO Basics
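The crawl-then-render flow above can be sketched as a toy model (purely illustrative; the function and data shapes are assumptions, not Googlebot internals):

```javascript
// Toy model of the crawl -> render pipeline described above.
// Pages marked noindex never reach the rendering queue.
function processCrawlQueue(pages) {
  const renderQueue = [];
  const rendered = [];

  // Crawling phase: each fetched page is queued for rendering,
  // unless a robots meta tag or header says not to index it.
  for (const page of pages) {
    if (!page.noindex) {
      renderQueue.push(page);
    }
  }

  // Rendering phase: when resources allow, a headless browser
  // executes the JavaScript and produces the final HTML.
  while (renderQueue.length > 0) {
    const page = renderQueue.shift();
    rendered.push({ url: page.url, html: page.jsContent });
  }

  return rendered;
}

const result = processCrawlQueue([
  { url: "/compress", noindex: false, jsContent: "<h1>Compressor</h1>" },
  { url: "/private",  noindex: true,  jsContent: "<h1>Private</h1>" },
]);
// Only /compress reaches the rendering phase.
```

The practical takeaway is the same as the quoted documentation: the JavaScript does get executed, but in a deferred second phase, which is why indexing of CSR pages can lag behind plain HTML.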
