11 Sensitive Issues With Dynamic Rendering
Koray Tuğberk GÜBÜR
CEO and Founder of Holistic SEO & Digital - Fortune 500 SEO and Business Consultant - Topical Authority - Venture Capitalist - Investor
Dynamic Rendering is a solution for JavaScript-driven and JavaScript-powered websites. If your website cannot be crawled or seen correctly by user-agent bots, Dynamic Rendering can solve that. Prerender.io, Puppeteer, Rendertron, and similar middleware are helpers for this process. It shows a Server-Side Rendered (SSR) page to the crawlers, and Google doesn't count this as cloaking.
Still, there are some sensitive aspects of Dynamic Rendering that should be handled carefully.
1- Unintentional Cloaking
The SSR page and the Client-Side Rendered (CSR) page should be identical, even when the SSR page is served from a cache. If they differ, Google can treat this as abuse and penalize the website or the dynamically rendered pages of your site. So you shouldn't add different or additional links, different SEO-optimized content, or extra keywords to the SSR page that are absent from the CSR page.
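To catch unintentional cloaking early, you can compare what the bots receive with what a real browser renders. Below is a minimal sketch, assuming Node 18+ (built-in fetch) and the puppeteer package; the URL and user-agent strings are illustrative. It fetches the SSR page with a Googlebot user-agent, renders the CSR page in a headless browser, and compares link counts:

import puppeteer from "puppeteer";

const URL_TO_CHECK = "https://example.com/"; // illustrative
const BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

function countLinks(html: string): number {
  // Rough count of anchor tags; enough to spot large SSR/CSR gaps.
  return (html.match(/<a\s[^>]*href=/gi) ?? []).length;
}

async function main() {
  // SSR side: a plain HTTP fetch with the Googlebot user-agent, which the
  // dynamic renderer should intercept and answer with prerendered HTML.
  const ssrRes = await fetch(URL_TO_CHECK, { headers: { "User-Agent": BOT_UA } });
  const ssrHtml = await ssrRes.text();

  // CSR side: a real headless browser that executes the JavaScript.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(URL_TO_CHECK, { waitUntil: "networkidle0" });
  const csrHtml = await page.content();
  await browser.close();

  console.log("SSR links:", countLinks(ssrHtml), "CSR links:", countLinks(csrHtml));
  // A large gap suggests one version exposes content the other lacks.
}

main().catch(console.error);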
2- Device Issue
As we mentioned in our previous article, Dynamic Rendering can be problematic when it comes to desktop-mobile differences. Sometimes the SSR page is crawled by the mobile user-agent bot instead of the desktop user-agent bot, and your desktop content ends up being shown to the mobile user-agent. To solve this problem, you should send the "Vary: User-Agent" HTTP response header. If your website is completely responsive, this may not be a problem for you, but you should still know about it.
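As a sketch, here is how that header could be set on an Express server; the framework choice, port, and route are assumptions, and any stack can send the same header:

import express from "express";

const app = express();

// Tell caches (and crawlers) that the response body depends on the
// requesting user-agent, so a desktop render is not served to a mobile bot.
app.use((req, res, next) => {
  res.setHeader("Vary", "User-Agent");
  next();
});

app.get("/", (req, res) => {
  res.send("<html>...</html>"); // your normal page or prerendered HTML
});

app.listen(3000);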
3- Web Server Speed Issues
Dynamic Rendering puts an extra load on the web server. The dynamic renderer and the web server deliver the static HTML to the crawl bots through your infrastructure's bandwidth, and a long response time can cause the crawling process to be abandoned. To prevent this obstacle, you should cache the static HTML (the SSR page).
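Here is a minimal caching sketch, assuming Puppeteer as the renderer; the in-memory Map and the 15-minute TTL are illustrative, and production setups usually cache to disk, Redis, or a CDN instead:

import puppeteer from "puppeteer";

const TTL_MS = 15 * 60 * 1000; // re-render at most every 15 minutes
const cache = new Map<string, { html: string; renderedAt: number }>();

async function getPrerenderedHtml(url: string): Promise<string> {
  const hit = cache.get(url);
  if (hit && Date.now() - hit.renderedAt < TTL_MS) {
    return hit.html; // served from cache: no new headless-browser work
  }
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });
  const html = await page.content();
  await browser.close();
  cache.set(url, { html, renderedAt: Date.now() });
  return html;
}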
4- Non-Active URL Parameters
To reduce the load on the web server, you should reduce the number of URLs that will be cached and prerendered. In this context, you should ignore unnecessary URL parameters that don't change the content. To do this, cache and prerender the canonical URL of these non-active, long URLs. Also, using the robots.txt file actively and correctly to protect your crawl budget can help in this situation.
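A sketch of such normalization before the cache lookup, using the standard URL API; the list of ignored parameters is an assumption you should adapt to your own site:

// Common tracking parameters that don't change the content (assumed list).
const IGNORED_PARAMS = ["utm_source", "utm_medium", "utm_campaign", "gclid", "sessionid"];

function canonicalize(rawUrl: string): string {
  const url = new URL(rawUrl);
  for (const param of IGNORED_PARAMS) {
    url.searchParams.delete(param);
  }
  url.searchParams.sort(); // stable order: a=1&b=2 and b=2&a=1 share one cache entry
  return url.toString();
}

// canonicalize("https://example.com/shoes?utm_source=x&color=red")
//   -> "https://example.com/shoes?color=red"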
5- Possible Prerender Failures
There is a helper method for preventing these kinds of failures, or at least reducing them. You should return an initial 503 status code that turns into a 200 status code after the page has loaded. This works well because the prerendering service will keep trying to download the page and its JavaScript until it gets a 200 status code.
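One way to implement this, assuming Prerender.io, which reads a "prerender-status-code" meta tag from the rendered DOM: ship the tag with a 503 value in the initial HTML and remove it once the page has loaded its data, so the snapshot is served with a 200. A sketch, where loadCriticalData is a hypothetical stand-in for your app's real data fetching:

// Initial HTML is assumed to contain:
//   <meta name="prerender-status-code" content="503">

declare function loadCriticalData(): Promise<void>; // hypothetical app loader

async function markSnapshotReady(): Promise<void> {
  await loadCriticalData(); // wait until the page content is really there
  // Removing the tag means the prerender snapshot falls back to a 200 status.
  document.querySelector('meta[name="prerender-status-code"]')?.remove();
}

markSnapshotReady().catch(console.error);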
6- Monitor the Logs
You should regularly check how user-agent bots see your web pages. If there is a mistake, remember: it won't affect your users, but it will affect your rankings and the search engines' attitude toward your website. If you don't notice faulty log entries as they begin, you may be too late to fix the problem with minimal damage.
Google Search Console's Crawl Stats report also offers more detail.
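A rough sketch of such a log check, assuming a combined-format Nginx access log at an illustrative path; it counts Googlebot requests per HTTP status so rendering failures (5xx spikes) surface early:

import { readFileSync } from "node:fs";

const lines = readFileSync("/var/log/nginx/access.log", "utf8").split("\n");
const statusCounts = new Map<string, number>();

for (const line of lines) {
  if (!line.includes("Googlebot")) continue;
  // Combined log format: ... "GET /x HTTP/1.1" 200 ... -> capture the status.
  const match = line.match(/" (\d{3}) /);
  if (match) {
    statusCounts.set(match[1], (statusCounts.get(match[1]) ?? 0) + 1);
  }
}

console.log(statusCounts); // e.g. Map(2) { '200' => 1423, '503' => 12 }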
7- Make Regular Audits For SSR Problems
With tools such as Botify, DeepCrawl, or Little Warden, you can audit your websites regularly. You can run log analyses; track changes in titles, H1s, content, and prices; and review rendering issues. These problems won't be visible to users right away, but user-agent bots will see them, so you should use these tools to catch even minor changes. Also, some CDNs and Cloudflare DNS setups block the Botify, DeepCrawl, or Little Warden crawlers, so you should whitelist their bots' IP addresses.
8- Use the Right User-Agent for Auditing
If you make a change to your Dynamic Rendering service, you should run an audit. But if you use the wrong user-agent, or don't view the page as the user-agent bots do, you will only see the Client-Side Rendered page; the Server-Side Rendered page is served only to user-agent bots. You should set your user-agent to that of the search engine you are targeting. If you want to see your prerendered content as it appears in Google's mobile results, you should use the mobile Googlebot user-agent.
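A minimal sketch, assuming Node 18+; the user-agent string follows Google's published Googlebot Smartphone pattern, but the Chrome version inside it changes over time, so treat it as illustrative:

const GOOGLEBOT_SMARTPHONE_UA =
  "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) " +
  "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 " +
  "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function auditAsMobileBot(url: string): Promise<void> {
  const res = await fetch(url, {
    headers: { "User-Agent": GOOGLEBOT_SMARTPHONE_UA },
  });
  // Should print the prerendered (SSR) HTML, not the empty app shell.
  console.log(await res.text());
}

auditAsMobileBot("https://example.com/").catch(console.error);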
9- Check Design and CSS Mistakes in the Server-Side Rendered Page
Remember, your dynamically rendered page will be seen only by search engine bots. Search engine bots will fetch and render your web page through your CSS and design structure. To prevent accidental cloaking and an unwanted page appearance, you should examine the interaction between your CSS, design layout, and JavaScript files by viewing the page through the search engine bots' eyes.
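A sketch of such a visual check with Puppeteer, assuming your dynamic renderer keys on the user-agent; the viewport and output path are illustrative:

import puppeteer from "puppeteer";

const BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function screenshotAsBot(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setUserAgent(BOT_UA); // so the renderer serves the SSR version
  await page.setViewport({ width: 1280, height: 800 });
  await page.goto(url, { waitUntil: "networkidle0" });
  await page.screenshot({ path: "ssr-view.png", fullPage: true });
  await browser.close();
}

screenshotAsBot("https://example.com/").catch(console.error);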
10- Invisible Contents and Dynamic Rendering
If you hide content with inline CSS, it may be visible on the SSR page but not on the CSR page, or, conversely, visible on the CSR page while hidden on the SSR page. Invisible content can still be discovered, but it may be treated differently. The best way to prevent this situation is to make sure that all of your resources are visible.
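A sketch that flags text hidden with inline CSS in the rendered DOM, assuming Puppeteer; you could run it against both the SSR and the CSR version of a URL and compare the lists:

import puppeteer from "puppeteer";

async function findHiddenText(url: string): Promise<string[]> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });
  const hidden = await page.evaluate(() =>
    Array.from(document.querySelectorAll<HTMLElement>("[style]"))
      .filter((el) => el.style.display === "none" || el.style.visibility === "hidden")
      .map((el) => el.textContent?.trim() ?? "")
      .filter((text) => text.length > 0)
  );
  await browser.close();
  return hidden; // text blocks that exist in the DOM but are invisible
}

findHiddenText("https://example.com/").then(console.log).catch(console.error);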
11- Dependent Contents And Dynamic Rendering
If your website includes content that appears only after an onClick action, Dynamic Rendering may not be the best solution to your indexing problem, because Dynamic Rendering creates static HTML files that contain the first version of the content, not the post-click version. If Google or other search engine bots cannot see your web page correctly, they won't index it, which hurts your crawl budget, and Dynamic Rendering exists to prevent exactly that. If you want to use this workaround technique, be sure that you use only visible and interaction-independent content.
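A minimal illustration of the problem, with hypothetical element ids: the content below exists only after a click, so a prerenderer's static snapshot will never contain it:

document.getElementById("show-details")?.addEventListener("click", () => {
  const details = document.createElement("p");
  details.textContent = "Full product specification..."; // injected on click only
  document.getElementById("product")?.appendChild(details);
  // A prerenderer never clicks, so its static HTML will not include this <p>.
});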
You can watch the video in which Google explains when and how it recommends the Dynamic Rendering solution.
Thank you for reading.