Notes from the Google Webmaster Hangout on the 16th of November 2018.
Inconsistency in PSI Alerts Suggests Being on the Edge of Google’s Speed Requirements
The PageSpeed Insights team try to keep their tests stable, so if a site fluctuates between receiving alerts and passing the tests, this suggests it is right on the edge of Google’s speed requirements.
New PageSpeed Insights Tool Uses Lighthouse Data & Runs Lighthouse Audits
The new PageSpeed Insights tool runs Lighthouse audits for you and combines them with real-world performance data gathered from users across different devices.
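As a side note on working with the tool programmatically (not something covered in the hangout), below is a minimal Python sketch that queries the public PageSpeed Insights v5 API and reads the Lighthouse performance score from the response; the endpoint and field names follow Google’s API documentation, and the test URL is a placeholder.

```python
import json
import urllib.parse
import urllib.request

# Minimal sketch: query the public PageSpeed Insights v5 API for a page.
# An API key can be added for heavier use; the "strategy" parameter can be
# "mobile" or "desktop".
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def run_psi(url: str, strategy: str = "mobile") -> dict:
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{API}?{query}") as response:
        return json.load(response)

if __name__ == "__main__":
    report = run_psi("https://example.com/")  # placeholder URL
    # The Lighthouse performance score is reported on a 0-1 scale.
    score = report["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Lighthouse performance score: {score * 100:.0f}/100")
```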
Treat Tag Pages Like Any Other Page & Only Noindex Low-Quality Ones
John recommends treating tag pages like any other page on your site: differentiate between useful tag pages and low-quality ones, and noindex the low-quality pages.
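Deciding which tag pages are low quality is an editorial call, but as a rough illustration, a small Python script can at least report which tag pages already carry a noindex directive; the URL list below is a placeholder, and the requests and BeautifulSoup libraries are assumed to be installed.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical check: which tag pages already carry a noindex directive?
# The URL list is a placeholder; judging page quality remains a manual task.
TAG_PAGES = [
    "https://example.com/tag/widgets/",
    "https://example.com/tag/misc/",
]

for url in TAG_PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    directives = (robots.get("content", "") if robots else "").lower()
    status = "noindexed" if "noindex" in directives else "indexable"
    print(f"{url}: {status}")
```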
Optimize for Relevance Rather than Ranking Factors
Avoid focusing on ranking factors and instead focus on Google’s long-term strategy, which is to show great, relevant websites to users.
Pages in GSC’s ‘Discovered – Currently Not Indexed’ Report Haven’t Been Prioritised for Crawling & Indexing
Google knows about pages in the ‘Discovered – currently not indexed’ report in Google Search Console but hasn’t prioritised them for crawling and indexing. This is usually due to internal linking and content duplication issues.
Test How Search Engines can Crawl Internal Linking Using Crawling Tools
John recommends using tools like DeepCrawl (now Lumar) to test how your internal linking is set up and whether there are any technical issues which would prevent Googlebot from crawling certain pages on your website.
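Dedicated crawlers do this far more thoroughly, but as a rough sketch of the underlying idea, the Python snippet below follows same-host links breadth-first from the homepage so you can see which URLs are reachable through internal linking alone; the start URL and crawl limit are placeholders.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Rough sketch of an internal-link crawl: breadth-first from the homepage,
# following only same-host <a href> links, to see what internal linking
# alone makes reachable.
START = "https://example.com/"  # placeholder
HOST = urlparse(START).netloc

def crawl(start: str, limit: int = 500) -> set:
    seen, queue = {start}, deque([start])
    while queue and len(seen) < limit:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in response.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"]).split("#")[0]
            if urlparse(link).netloc == HOST and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

reachable = crawl(START)
print(f"{len(reachable)} URLs reachable via internal links")
```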
Being Part of Organizations like The Trust Project Could Indirectly Improve Rankings
Belonging to organizations like The Trust Project won’t have a direct ranking impact, but may indirectly influence rankings as users may trust and value your content more if your website is associated with them.
Fetch & Render Tool in GSC Doesn’t Reflect Real Rendering
Getting ‘temporarily unreachable’ messages in the Fetch & Render tool doesn’t reflect how Google is rendering content for its index. Google’s rendering service has a longer cutoff time and uses caching.
Combine Separate CSS, JS & Tracking URLs to Increase Googlebot Requests to Your Server
To improve site speed and allow Googlebot to send requests to your server more frequently, reduce the number of separate URLs (CSS, JS and tracking scripts) that need to be loaded for each page. For example, combine your CSS files into one URL, or as few URLs as possible.
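A build tool or CDN would normally handle this, but as a simple illustration of the idea, the Python sketch below concatenates several stylesheets into a single file so a page needs one CSS request instead of many; the file paths are placeholders.

```python
from pathlib import Path

# Illustrative only: concatenate several stylesheets into one file so a page
# needs a single CSS request. Real builds would also minify and fingerprint
# the output; the paths here are placeholders.
CSS_FILES = ["css/reset.css", "css/layout.css", "css/theme.css"]
OUTPUT = Path("css/combined.css")

def combine(files, output: Path) -> None:
    parts = []
    for name in files:
        path = Path(name)
        parts.append(f"/* {path.name} */\n{path.read_text(encoding='utf-8')}")
    output.write_text("\n".join(parts), encoding="utf-8")

if __name__ == "__main__":
    combine(CSS_FILES, OUTPUT)
    print(f"Wrote {OUTPUT} from {len(CSS_FILES)} source files")
```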
Pages With Long Download Times Reduce Googlebot’s Crawl Budget
If a page takes a long time to download then this will use up crawl budget for Googlebot, meaning it will have less time to crawl other pages on your site. Look at the ‘time spent downloading a page’ report in GSC to spot these issues.
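The GSC report shows Googlebot’s own figures, but as a rough cross-check you can time the HTML download of a sample of URLs yourself; the Python sketch below measures from your own connection, so treat the numbers as indicative only, and the URLs are placeholders.

```python
import time

import requests

# Rough cross-check of page download times for a sample of URLs. This measures
# your own connection rather than Googlebot's, so the numbers are indicative
# only; the URLs are placeholders.
SAMPLE_URLS = [
    "https://example.com/",
    "https://example.com/category/widgets/",
]

for url in SAMPLE_URLS:
    start = time.perf_counter()
    response = requests.get(url, timeout=30)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{url}: {elapsed_ms:.0f} ms, {len(response.content) / 1024:.0f} KB")
```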
Internally Link to Seasonal Content so Google Will Index It
Publish seasonal content far enough in advance for Google to index it for the required period, and internally link to this content so Google knows these pages are important and relevant to users, which will improve indexing.
Use One URL for All Seasonal Content
John recommends using one URL to host all of your seasonal content, regardless of topic. For example, have Thanksgiving content on a page, then replace it with Christmas content. This will accumulate link equity, making the page more important to Google.
Quality Rater Guidelines & EAT Don’t Directly Impact Rankings
EAT and the Quality Rater Guidelines show where Google is heading in the future for providing better content to users, but they don’t directly impact Google’s algorithms.
Adopting Unsupported Structured Data Can Influence What Google Will Support in Future
If Google sees websites widely implementing markup for particular elements, then that can influence which types of structured data it will work to support in future.
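For context, structured data is usually added to a page as schema.org JSON-LD; the Python sketch below simply generates such a block, with the type and properties chosen purely for illustration rather than reflecting what Google currently supports.

```python
import json

# Purely illustrative: build a schema.org JSON-LD block for a page. The type
# and properties are examples only, not a statement of what Google supports.
data = {
    "@context": "https://schema.org",
    "@type": "Park",
    "name": "Example Riverside Park",
    "openingHours": "Mo-Su 06:00-22:00",
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(data, indent=2)
    + "\n</script>"
)
print(snippet)
```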
Reduce Number of Unique Downloads if Seeing ‘Uncommon Downloads’ Issue in GSC
The ‘uncommon downloads’ warning in the Security Issues section of GSC can be caused by sites that generate unique downloads for each user. Reduce the number of unique downloads so Googlebot is able to check them before a user accesses them.
Malware Can be Served via Your Site if You Serve External Content via JavaScript
If you serve third-party content via JavaScript, be aware that if the websites hosting that content get hacked or change their JS code to serve malware, Google will flag your site as serving malware.
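This wasn’t covered in the hangout, but one hedge is to monitor the third-party scripts you embed and alert when their contents change unexpectedly; the Python sketch below hashes a script and compares it to the last recorded value, with the script URL and stored hash as placeholders.

```python
import hashlib

import requests

# Assumption for illustration: periodically hash a third-party script you
# embed so unexpected changes (which could include injected malware) get
# noticed. The URL and stored hash below are placeholders.
THIRD_PARTY_JS = "https://cdn.example-partner.com/widget.js"
KNOWN_SHA256 = "replace-with-the-hash-you-recorded-last-time"

body = requests.get(THIRD_PARTY_JS, timeout=10).content
current = hashlib.sha256(body).hexdigest()

if current != KNOWN_SHA256:
    print(f"WARNING: {THIRD_PARTY_JS} has changed (sha256 {current})")
else:
    print("Third-party script unchanged")
```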
Submit Subdirectories to URL Removal Tool to Get Around Individual URL Limits
The URL Removal Tool limits the number of individual URLs that can be submitted to be removed per day. To get around this, you can submit subdirectories to get entire sections of content removed from Google’s index.
Only Use Sitemap Files Temporarily to Get Removed URLs Deindexed
Sitemap files are a good temporary solution for getting Google to crawl and deindex lists of removed URLs quickly. However, make sure these sitemaps aren’t being served to Google for too long.
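A minimal sketch of such a temporary sitemap, assuming you have a list of removed URLs and (as one common approach, not something specified in the hangout) setting each entry’s lastmod to the date the page was removed; the URLs and dates below are placeholders.

```python
from xml.sax.saxutils import escape

# Minimal sketch: build a temporary sitemap listing removed URLs, with
# <lastmod> set to the date each page was taken down, so Google recrawls and
# drops them sooner. Remove this sitemap once the URLs are deindexed.
REMOVED = [
    ("https://example.com/old-product-1/", "2018-11-10"),
    ("https://example.com/old-product-2/", "2018-11-12"),
]

entries = "".join(
    f"  <url>\n    <loc>{escape(url)}</loc>\n    <lastmod>{date}</lastmod>\n  </url>\n"
    for url, date in REMOVED
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}</urlset>\n"
)

with open("removed-urls-sitemap.xml", "w", encoding="utf-8") as handle:
    handle.write(sitemap)
```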
Word Count Has Nothing to Do with Google’s Ranking Algorithms
The word count of a page plays no part in Google’s algorithms when deciding which pages should rank higher. Website owners can analyse word count to see which pages may be high quality or thin, but this doesn’t factor into search.
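If you do analyse word counts for your own quality review, a rough Python sketch like the one below can surface unusually thin pages for manual inspection; the URLs and threshold are placeholders, and the count itself has no bearing on ranking.

```python
import requests
from bs4 import BeautifulSoup

# Rough quality-review helper: report word counts so unusually thin pages can
# be reviewed by hand. Word count itself is not a ranking factor.
PAGES = [
    "https://example.com/guide/",
    "https://example.com/tag/misc/",
]
THIN_THRESHOLD = 150  # illustrative cut-off, not a Google figure

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    words = len(soup.get_text(separator=" ").split())
    flag = " <- review as possibly thin" if words < THIN_THRESHOLD else ""
    print(f"{url}: {words} words{flag}")
```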