Why is Rendered Crawling Important?
By enabling our latest feature, you can:
- Cache resources found in a crawl so that your server is not overloaded.
Here’s how you can get going with DeepCrawl’s new rendering capabilities:
- Before the crawl
- During the crawl
  - Pages that inject links and content into the DOM will be crawled, and the rendered DOM will be parsed.
- After the crawl
  - Once the crawl has completed, all of DeepCrawl's reports will be based on the rendered version of each page, for all of the URL sources you've added to the crawl.
The Nuts and Bolts of Our Latest Release
Find Out How Pages Compare Before and After Rendering
It’s important to understand a site before it has been rendered, because Google first indexes the raw HTML version of a page and only indexes the rendered version after a delay. While Google is working to reduce the delay between indexing the two versions, making body content available pre-rendering is still recommended, especially for sites with time-sensitive content.
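The gap between the two versions can be illustrated with a small sketch. The HTML snippets and helper names below are purely illustrative (not DeepCrawl's implementation): in practice the rendered markup would come from a headless browser snapshot.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def extract_links(html: str) -> set:
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

# Raw HTML as served by the server (what the first indexing pass sees).
raw_html = '<html><body><a href="/home">Home</a></body></html>'

# The same page after JavaScript has injected an extra link into the DOM
# (hard-coded here for illustration).
rendered_html = ('<html><body><a href="/home">Home</a>'
                 '<a href="/js-only-page">JS-only page</a></body></html>')

# Links that only exist after rendering are invisible until the
# rendered version of the page is indexed.
render_only = extract_links(rendered_html) - extract_links(raw_html)
print(render_only)  # {'/js-only-page'}
```

Comparing the two link sets per page is exactly the kind of before/after report that makes render-dependent content easy to spot.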
Rendering Timeouts That Reflect What Search Engines See
While rendering a page, DeepCrawl uses a default timeout of around 10 seconds, after which the current state of the DOM is captured and used. This timeout is comparable to the one search engines allow, so it highlights slow pages whose content takes too long to appear for search engines to render it.
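The timeout behaviour can be sketched as a simple polling loop. The function and callback names here are hypothetical, and a real renderer would wait on browser events (e.g. network idle) rather than a lambda:

```python
import time

def render_with_timeout(get_dom, is_settled, timeout=10.0, poll=0.1):
    """Poll a rendering page until it settles or the timeout expires.

    get_dom:    returns the page's current DOM snapshot (a string here).
    is_settled: returns True once the page has finished rendering.

    Whatever the DOM contains when the deadline hits is what gets
    parsed -- mirroring how a renderer on a fixed budget sees a
    slow page.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if is_settled():
            break
        time.sleep(poll)
    return get_dom()

# A page whose scripts never finish within the budget: only the
# partial DOM present at the timeout is ever seen.
dom = render_with_timeout(
    get_dom=lambda: "<body>partial content</body>",
    is_settled=lambda: False,
    timeout=0.3,
)
print(dom)
```

If the page settles before the deadline, the loop exits early and the complete DOM is used instead.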
Ad Tracking and Analytics Scripts Blocked by Default
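A request filter of this kind can be sketched as a host blocklist checked before each resource request during rendering. The hosts listed below are illustrative examples, not DeepCrawl's actual list:

```python
from urllib.parse import urlparse

# Illustrative blocklist of ad-tracking/analytics hosts.
BLOCKED_HOSTS = {
    "www.google-analytics.com",
    "www.googletagmanager.com",
    "doubleclick.net",
}

def is_blocked(url: str, blocked_hosts=BLOCKED_HOSTS) -> bool:
    """Return True if the request should be dropped during rendering.

    Matches the exact host or any subdomain of a blocked host, so
    'stats.doubleclick.net' is caught by the 'doubleclick.net' entry.
    """
    host = urlparse(url).hostname or ""
    return any(host == b or host.endswith("." + b) for b in blocked_hosts)

print(is_blocked("https://www.google-analytics.com/analytics.js"))  # True
print(is_blocked("https://example.com/app.js"))                     # False
```

Dropping these requests keeps rendering focused on the page's own content, and presumably avoids rendering traffic registering in a site's analytics.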
Resource Caching Avoids Overloading Servers
As part of our rendering, we cache the resources encountered during a crawl, so that your server isn’t overloaded by individual rendering servers repeatedly requesting the same resources.
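The idea can be sketched as a shared cache keyed by URL that sits between rendering workers and the origin server. The class and fetch function below are a minimal illustration, not DeepCrawl's implementation:

```python
class ResourceCache:
    """In-memory cache keyed by URL, shared across rendering workers.

    The first request for a resource hits the origin server; every
    later request for the same URL is served from the cache, so the
    origin sees each resource roughly once per crawl.
    """
    def __init__(self, fetch):
        self._fetch = fetch          # function: url -> response body
        self._store = {}
        self.origin_hits = 0         # requests that reached the server

    def get(self, url: str):
        if url not in self._store:
            self.origin_hits += 1
            self._store[url] = self._fetch(url)
        return self._store[url]

# Simulated origin fetch (a real crawler would do an HTTP GET).
cache = ResourceCache(fetch=lambda url: f"body of {url}")

# Ten rendering workers all request the same stylesheet...
for _ in range(10):
    cache.get("https://example.com/site.css")

print(cache.origin_hits)  # 1 -- the server was only asked once
```

Without the cache, every rendered page that references the stylesheet would trigger its own request to the origin.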
If you have any questions, don’t hesitate to send us a message, or contact your customer success representative directly.