You can watch the full recording of the webinar here:
Has there been any progress on updating Google’s Web Rendering Service (WRS) to run alongside the Chrome release schedule?
In Martin’s words, “we’re getting there.” Keep an eye out for Google announcements around this by checking the upcoming Google I/O and Chrome Dev Summit talks, as well as the Google Webmasters Twitter account. Martin wants this to happen as much as SEOs do, and he will get the news out as soon as he can. Progress is looking good so far though.
Can you rank server-side, client-side, hybrid and dynamic rendering from best to worst?
- Server-side rendering – Content is available straight away for search engine crawlers. However, deciding when server-side rendering should happen can be complicated. For example, should it only run when a user clicks on something? You may also need to incorporate caching to make the implementation work well.
- Dynamic rendering – This method is purely a workaround for search engine crawlers. The search engine’s request is sent to a dynamic renderer, which returns the rendered content to the crawler. However, users still have to render content client-side, so you don’t get any of the user experience benefits with dynamic rendering that you do with hybrid rendering.
- Client-side rendering – Content only becomes available once JavaScript has executed in the browser or in Google’s renderer, so crawlers see nothing useful until rendering completes.
Bear in mind that all of these rendering methods require code changes.
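As a rough illustration of the dynamic rendering workaround described above, the core routing decision can be sketched as follows. The bot patterns and return labels here are illustrative, not a complete or official list:

```javascript
// Dynamic rendering sketch: requests from known search engine crawlers are
// routed to a pre-rendered snapshot, while regular users get the normal
// client-side app. Patterns and labels below are illustrative only.
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i, /Baiduspider/i];

function isSearchBot(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}

function chooseResponse(userAgent) {
  // Crawlers receive server-rendered HTML; everyone else renders client-side.
  return isSearchBot(userAgent) ? "prerendered-html" : "client-side-app";
}
```

In a real setup the user-agent check would sit in your server or CDN layer, in front of a pre-rendering service.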
What’s the basic tool set needed to cover the fundamentals?
Google Search Console, Chrome DevTools and Lighthouse are all good tools for running tests.
React and Angular have the highest developer adoption and developer satisfaction rates. A more up-and-coming framework that’s growing in popularity is Vue. Martin has analysed all three of these frameworks and they all work. None of them is inherently inaccessible to search engine crawlers, and none of them should hurt your SEO efforts.
A note on Angular: If Angular sites aren’t showing up in search or the Mobile-friendly Test, this could be because Angular has a particular polyfill that’s disabled by default which needs to be enabled. Talk to your developer about doing this.
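For reference, in Angular CLI projects these polyfills live in src/polyfills.ts, where several ship commented out by default; enabling one is a matter of uncommenting the relevant import. The exact imports below are an illustrative excerpt from the core-js-based setup of that era, so check your own polyfills.ts with your developer:

```typescript
// src/polyfills.ts (illustrative excerpt): these imports ship commented out
// in a default Angular CLI project; uncomment the ones that older browsers
// (and, at the time, Google's rendering service) need.
import 'core-js/es6/symbol';
import 'core-js/es6/object';
import 'core-js/es6/array';
import 'core-js/es7/reflect';
```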
A note on React: React doesn’t need any additional polyfills to be enabled, but you’ll need the React Helmet library to serve page titles and meta descriptions.
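As a sketch of what that looks like (assuming the react-helmet package; the page content below is placeholder), React Helmet lets a component declare the head tags for its page:

```jsx
import React from "react";
import { Helmet } from "react-helmet";

// Sketch: React Helmet injects the title and meta description into the
// document head, so they appear in the rendered HTML that crawlers see.
function ProductPage() {
  return (
    <div>
      <Helmet>
        <title>Example Product – Example Store</title>
        <meta name="description" content="Placeholder meta description." />
      </Helmet>
      <h1>Example Product</h1>
    </div>
  );
}
```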
The choice of framework should be based on what you’re trying to do with the website, not the size of the site. The outcome of this choice won’t matter to Google, so it comes down to what works best for your developers.
How long does it take for extremely dynamic sites to be rendered by Google?
This depends on how Google rates the quality of your website and what its crawl budget is like. Issues mainly arise if you have more dynamically-changing pages than Google’s overall crawl budget allocation for your site.
It’s difficult to put a hard date on rendering because this process relies on crawling first, and existing crawling queues still apply. If you increase Google’s ability to crawl your website more quickly, this will naturally decrease the time it takes for rendering.
A polyfill implements features that a browser doesn’t support natively, making them available to code running in that browser.
Polyfills have their own issues and caveats to be aware of. For example, the Shadow DOM polyfill weighed in at around 500KB, so you’ll need to test things like this to make sure they’re not contributing too much to page weight.
Polyfills provide a good experience for both users and search engines though, so overall they’re good to use.
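To make the concept concrete, here is a minimal sketch of a polyfill for Array.prototype.includes. Modern browsers have this natively, so the guard means the replacement only runs where the feature is missing:

```javascript
// Minimal polyfill sketch: only define Array.prototype.includes if the
// browser doesn't already provide it natively.
if (!Array.prototype.includes) {
  Array.prototype.includes = function (searchElement, fromIndex) {
    const start = Math.max(fromIndex || 0, 0);
    for (let i = start; i < this.length; i++) {
      // Handle NaN explicitly, which a plain === comparison would miss.
      if (
        this[i] === searchElement ||
        (typeof this[i] === "number" &&
          Number.isNaN(this[i]) &&
          Number.isNaN(searchElement))
      ) {
        return true;
      }
    }
    return false;
  };
}
```

Real polyfill libraries such as core-js are more thorough than this, which is part of why their file sizes add up.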
If Google fails to render a page, is there a risk of it seeing duplicate content?
To prevent this from happening with server-side rendering, only render the content that’s needed on a URL basis.
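A minimal sketch of that idea, with hypothetical routes and page data: keying the server-rendered output off the requested URL means no two URLs ever share an identical fallback shell.

```javascript
// Hypothetical page data, keyed by URL path.
const PAGE_DATA = {
  "/": { title: "Home", body: "Welcome to the example store." },
  "/shoes": { title: "Shoes", body: "Browse the example shoe range." },
};

// Render only the content that belongs to the requested URL; unknown paths
// get no shared generic shell (the server can respond with a 404 instead).
function renderForPath(path) {
  const page = PAGE_DATA[path];
  if (!page) return null;
  return `<title>${page.title}</title><main>${page.body}</main>`;
}
```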
What are the main items that should be server-side rendered?
The main content, structured data, page titles, meta descriptions, canonical tags, hreflang and date annotations. Structured data can be difficult to server-side render, but try to get it into the static content.
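Put together, the server-rendered static HTML for a page might carry those items like this (all URLs and values are placeholders):

```html
<head>
  <title>Example Page</title>
  <meta name="description" content="Placeholder description.">
  <link rel="canonical" href="https://www.example.com/page">
  <link rel="alternate" hreflang="de" href="https://www.example.com/de/page">
  <!-- Structured data as JSON-LD, shipped in the static HTML -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example Page",
    "datePublished": "2019-03-13"
  }
  </script>
</head>
```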
What is your advice on generating and pre-rendering highly personalised content?
The first question is: will Google see this content? No, because Googlebot is stateless and doesn’t store cookies. Highly personalised content also isn’t very useful to display in the SERPs.
With dynamic rendering it can take up to 12 seconds to generate content, and Google uses page load performance as a ranking factor, so this could be an issue. Consider pre-rendering only your crucial pages to reduce rendering time and costs.
Another thing to bear in mind is that some cloud hosting providers don’t support SPA routing and will serve a 404 for any route that isn’t a physical file. However, they do allow custom 404 pages, which can have content added to them. If you add content to these pages they will work for users, but they won’t show up in search because the server still responds with a 404 status. Make sure your pages return a 200 status code.
Google has nothing to offer around this right now. You could use the Google Search Console API to see how many pages are being indexed, and which ones are being skipped or have errors.
Google is looking into providing more support on this. However, Google wants to avoid opening up rendering as an API to give these insights, because this would eat into the resources that Google needs for rendering itself. Once Google’s WRS catches up with Chrome, it will become easier to use tools like Puppeteer for this kind of analysis.
A lot of sites have scripts that, when rendered, insert iframes into the head of the page. Does this cause specific issues?
HTML and browsers are designed in a way that allows you to make mistakes. An iframe doesn’t belong in the head, because it is a visual element and visual elements belong in the body. So, rather than panicking and breaking the entire page, the browser will see an iframe in the head and assume the body has started.
This means that some browsers will completely ignore titles or other important tags if they are below an iframe in the head. Make sure hreflang is included in the head and isn’t being mistakenly read as being in the body.
Be careful with third-party scripts and where they’re adding tags, and remember: iframes never belong in the head.
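For example, markup like the following (illustrative) triggers the problem: the browser assumes the body has started at the iframe, so the hreflang annotation after it is treated as body content and may be ignored.

```html
<head>
  <title>Example Page</title>
  <!-- A third-party script injected this iframe into the head... -->
  <iframe src="https://thirdparty.example.com/widget"></iframe>
  <!-- ...so browsers treat everything from here on as body content,
       and this hreflang annotation may be ignored. -->
  <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page">
</head>
```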
What do you think is the potential cause for the divide between SEOs and developers?
Martin has seen firsthand that both sides have been burned by bad actors on the other side.
Also, developers can be resistant to changing the way they work because they need to be able to work as efficiently and productively as possible, so they usually stick with the tools they’re familiar with rather than learning new tools or different ways of working. This is a real problem in the developer community because efficiency isn’t the most important thing; user experience is. Developers should take the time to learn what makes a good web experience for the user, rather than sticking with what they know.
However, there is a real interest in the developer community around technical SEO and discoverability. No developer that cares about their work wants to work on a website that no one will see.
In the past, bad SEOs have spread fear and uncertainty around search issues by making claims without facts. Make sure you back up your requests with proof to avoid developers investing in projects that have no real benefit for performance, and could even backfire. Martin is trying to bridge this gap by providing more documentation on what actually works for Google so the good actors in both communities have the facts to be able to work together towards the right goals.
Sign up for our next webinar on dashboarding for enterprise SEO with Nick Wilsdon
If you’ve been enjoying our monthly webinars, then we’ve got many more to come. The next one will be with Nick Wilsdon who will be explaining how you can create and use effective dashboarding to win in enterprise SEO.