- Client-Side Rendering (CSR)
- Server-Side Rendering (SSR)
- Dynamic Rendering (DR)
We can achieve better indexability across a wider array of search engines by using server-side rendering (SSR), where pages are rendered on the server so crawlers receive complete HTML straight away. This is especially useful for large websites such as eCommerce sites, where crawl budget is more of a concern because you are expecting the search engine to crawl and index hundreds of thousands of pages.
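To make the idea concrete, here is a minimal SSR sketch, assuming an Express server and React (the route, component, and port are all hypothetical); frameworks like Next.js package the same pattern:

```tsx
// Minimal SSR sketch (hypothetical route and component): the server renders
// the page to an HTML string before responding, so crawlers receive complete
// markup without having to execute client-side JavaScript.
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";

// A trivial product page; a real eCommerce site would fetch product data here.
const ProductPage = ({ name }: { name: string }) => (
  <main>
    <h1>{name}</h1>
  </main>
);

const app = express();

app.get("/product/:name", (req, res) => {
  // Render the React tree on the server and send the finished HTML.
  const html = renderToString(<ProductPage name={req.params.name} />);
  res.send(`<!DOCTYPE html><html><body><div id="root">${html}</div></body></html>`);
});

app.listen(3000);
```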
Bilal highlighted that it is important to note the differences between how users see a website and how Google’s crawlers ‘view’ it. For example, when Google crawls a page, it doesn’t scroll down the page as a user would, but rather takes in the page as a whole, using what is known as a ‘viewport’.
For pages with a long scroll, many developers use lazy loading for images, so that they are not loaded until needed, in order to conserve resources and improve page load time for users. Googlebot does support lazy loading, but it cannot scroll a page like a user would, so it resizes its viewport in order to see the whole content. So you want to make sure your images are lazy-loaded not with a scroll event handler, but with an event listener that can fire once the page DOM is complete.
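One common way to implement this is with the standard IntersectionObserver browser API, sketched below; this is a hedged illustration rather than Bilal’s exact implementation, and the `data-src` attribute and function name are assumptions. Each image carries its real URL in `data-src`, and the observer swaps it in when the image enters the viewport, which works whether the viewport moves by scrolling or by Googlebot resizing it:

```ts
// Hypothetical lazy-loading setup driven by an IntersectionObserver instead
// of scroll events. Googlebot does not scroll, but its viewport resizing
// still triggers intersection callbacks, so below-the-fold images get loaded.
function lazyLoadImages(): void {
  const images = document.querySelectorAll<HTMLImageElement>("img[data-src]");

  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src!; // swap in the real image URL
      obs.unobserve(img); // each image only needs to load once
    }
  });

  images.forEach((img) => observer.observe(img));
}

// Hook into DOM completion rather than scroll events, as described above.
document.addEventListener("DOMContentLoaded", lazyLoadImages);
```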
Bilal says there are several things to remember when using lazy loading:
- Don’t lazy load images ‘above the fold’.
- Do use a lazy load JS library as a fallback for browsers that do not yet support native lazy loading (see the sketch after this list).
- Scale images according to the viewport, and consider scaling for mobile viewports.
- If you are using tracking or other third-party tools, lazy load those scripts in the footer, if possible.
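The fallback point can be handled with simple feature detection, sketched here under the assumption that below-the-fold images carry `loading="lazy"` plus a `data-src` attribute; `lazyLoadImages` refers to the IntersectionObserver sketch above, and lazysizes is just one example library. Above-the-fold images simply omit the attribute so they load immediately:

```ts
// Hypothetical feature-detection fallback: prefer the browser's native lazy
// loading, and fall back to a JS approach where it is not supported.
declare function lazyLoadImages(): void; // from the IntersectionObserver sketch above

if ("loading" in HTMLImageElement.prototype) {
  // Native support: copy data-src into src and let the browser defer loading.
  document
    .querySelectorAll<HTMLImageElement>('img[loading="lazy"][data-src]')
    .forEach((img) => {
      img.src = img.dataset.src!;
    });
} else {
  // No native support: hand off to a JS solution such as the
  // IntersectionObserver sketch above, or a library like lazysizes.
  lazyLoadImages();
}
```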
Want more Deepcrawl webinars? Check out our full archive of on-demand webinars.