
Key Takeaways From #BrightonSEO

Brighton was once again invaded by the UK’s SEO community on Friday, and what a glorious day it was too. Here are our takeaways from the talks we managed to squeeze in…
 

Tom Bennet: Log file analysis – the gateway to assessing crawl behaviour

First up in the 11am slot of the technical thread was Tom Bennet from Builtvisible, who talked us through the value of log files (slides here) and how analysing them can significantly improve organic search performance once you’ve addressed the issues they reveal.

So what can log files tell us? Well, Tom explained that they are a record of all hits that a server receives – from humans and robots alike – including the server IP, timestamp, method, request URI, HTTP status code and user-agent. This lets you see exactly what happened when Googlebot had an issue crawling your site, including how severe the problem is and what outcome to expect; Tom compared log files to the CCTV footage from the scene of a crime.
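To make that concrete, here’s a minimal Python sketch of what one of those hits looks like – the log line is made up, and we’re assuming the common Apache/NGINX “combined” log format (your server’s configuration may differ):

```python
import re

# One made-up hit in the common Apache/NGINX "combined" log format, covering the
# fields Tom lists: client IP, timestamp, method, request URI, status code, user-agent.
SAMPLE = ('66.249.66.1 - - [12/Sep/2014:10:15:32 +0100] '
          '"GET /blog/category/page-2/ HTTP/1.1" 200 5123 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

COMBINED_LOG = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<uri>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<user_agent>[^"]*)"'
)

# Pull the individual fields out of the raw line.
hit = COMBINED_LOG.match(SAMPLE).groupdict()
print(hit['timestamp'], hit['method'], hit['uri'], hit['status'])
```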

And the point? Conserving crawl budget (the number of URLs that Google crawls on each visit to your site) is key to organic performance. It’s much better to have a small budget that is well spent on high-quality content than to waste a large budget on lots of pages full of thin content.

Here are Tom’s basic tips for preparing your data:

  • There are tools that will allow you to do this, but you can’t beat Excel.
  • Convert the .log file to CSV.
  • A sample of 60–120k Googlebot requests (rows) is ideal.
  • Use Text to Columns with a space delimiter (a scripted alternative is sketched after this list).
  • Create a table, label your columns and sort by timestamp.
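If you’d rather script the prep than do it by hand in Excel, here’s a rough Python equivalent – it assumes the same “combined” log format as the sketch above, and access.log / googlebot_hits.csv are just placeholder filenames:

```python
import csv
import re

# Rough scripted equivalent of the manual .log -> CSV prep above; same "combined"
# log format assumption as the earlier sketch.
COMBINED_LOG = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<uri>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<user_agent>[^"]*)"'
)

with open('access.log') as log, open('googlebot_hits.csv', 'w', newline='') as out:
    writer = csv.writer(out)
    writer.writerow(['ip', 'timestamp', 'method', 'uri', 'status', 'user_agent'])
    for line in log:
        hit = COMBINED_LOG.match(line)
        # Keep Googlebot requests only; verify the IPs separately if spoofing is a concern.
        if hit and 'Googlebot' in hit.group('user_agent'):
            writer.writerow([hit.group('ip'), hit.group('timestamp'), hit.group('method'),
                             hit.group('uri'), hit.group('status'), hit.group('user_agent')])
```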

You will then be able to drill down to the issues affecting Googlebot most, including crawl frequency over time, HTTP response codes, robots.txt, faceted navigation, sitemaps and CSS/JS files (are these absorbing crawl budget without being needed to render the page?).
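As a starting point, here’s a minimal pandas sketch of that kind of drill-down, assuming the googlebot_hits.csv produced above – the checks worth running will depend on your site:

```python
import pandas as pd

# Minimal sketch of the drill-down, assuming the googlebot_hits.csv produced above.
hits = pd.read_csv('googlebot_hits.csv')

# HTTP response codes: how much of the crawl is hitting errors or redirects?
print(hits['status'].value_counts())

# Crawl budget absorbed by CSS/JS assets.
is_asset = hits['uri'].str.contains(r'\.(?:css|js)(?:\?|$)', regex=True)
print(f"{is_asset.sum()} of {len(hits)} requests were for CSS/JS")

# Crawl frequency over time (daily hit counts).
hits['date'] = pd.to_datetime(hits['timestamp'], format='%d/%b/%Y:%H:%M:%S %z').dt.date
print(hits.groupby('date').size())

# Parameter / faceted-navigation URLs eating budget.
params = hits[hits['uri'].str.contains(r'\?', regex=True)]
print(params['uri'].value_counts().head(20))
```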
 

Tom Whittam: Proactive measures for good site health

Our third speaker in this thread was Tom from Ayima – you can find his slides here.
Tom’s talk was all about being proactive with our backlinks, to avoid manual actions and Penguin penalties. Every type of site is different: large corporate websites might be at risk from large, hard-to-manage backlink profiles and historic techniques, while SMEs might have small budgets and low brand awareness. But all SEOs should be aware of the threat of negative SEO – and even of getting hacked.

We’d highly recommend looking at Tom’s slides for all of his advice, but here are our notes as a starting point:

  • Identifying negative SEO:
    • Look at Ahrefs new/lost links, GWT link dates, most common anchors, who links the most and number of links to identify potential negative SEO.
  • Avoiding hackers:
    • Routine backups, update plugins, regular password updates.
  • Link monitoring:
    • Affiliate system check, implement nofollows, disavow if all else fails.
    • What to look at: redirect path, inspect link path, inspect element, block parameter in GA (universal analytics/robots.txt).
  • Referrer spam:
    • Identify/exclude spam referrers and then block all spam bots in Google Analytics (see the sketch after this list).
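For the referrer-spam point, here’s a minimal Python sketch of the identification step – the filename, column name and blocklist are all illustrative assumptions; the flagged domains are what you’d then exclude with a Google Analytics filter or block at server level:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Count referring domains in a referral export and flag suspicious ones.
# referrals.csv and its 'referrer' column are placeholders; the blocklist below
# is illustrative only and should be built out from your own data.
KNOWN_SPAM = {'semalt.com', 'buttons-for-website.com'}

domains = Counter()
with open('referrals.csv') as f:
    for row in csv.DictReader(f):
        domain = urlparse(row['referrer']).netloc.lower().removeprefix('www.')
        if domain:
            domains[domain] += 1

for domain, count in domains.most_common(50):
    flag = 'SPAM?' if domain in KNOWN_SPAM else ''
    print(f'{count:6d}  {domain}  {flag}')
```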

 

Matthew Brown: Rich snippets after the fire

All SEOs want to know what Google’s going to do next, and during the 2:30pm slot Matthew Brown from Moz shared some of his predictions with us. Matthew’s slides are available at bit.ly/TheSnippets, and here are Matthew’s key predictions/actions:
 

Everything changes, and it’s all for the users…

Google is really image-conscious – but if something is used just by SEOs, then they have no problem pulling it. That’s a good way to figure out whether a feature might be at risk in the future: video snippets and authorship being the recent victims.
 

Google UK is just behind on local features, but that will soon change


Start targeting ‘localised’ results and add entity relationships to your SEO strategy – use the entity feature of Alchemy API to target relevant entities as well as your main keyword.
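As a very rough illustration of where entities fit in – and only as a crude stand-in for what a proper entity-extraction service like AlchemyAPI returns – here’s a Python sketch that pulls capitalised phrases out of page copy and ranks them by frequency:

```python
import re
from collections import Counter

# Crude stand-in for an entity-extraction service: find capitalised multi-word
# phrases in page copy and rank them by frequency. Real entity extraction is far
# smarter; this only shows where entities would slot into keyword research.
def rough_entities(text, top_n=10):
    candidates = re.findall(r'\b[A-Z][A-Za-z]+(?:\s+[A-Z][A-Za-z]+)+\b', text)
    return Counter(candidates).most_common(top_n)

# Made-up page copy for illustration.
page_copy = ("Cutters Crabhouse is a seafood restaurant in Seattle. "
             "Cutters Crabhouse serves Dungeness crab on Elliott Bay.")
print(rough_entities(page_copy))
```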
 

Google is now using unstructured data

Google is getting better at scraping data, whether it’s structured or not. Matthew used the example of Cutters Crabhouse in the US: the company is not web-centric, but Google still manages to list its menu in the search results.
 

Google’s biggest ad spend opportunity is on mobile

We should be predicting what will happen with desktop SERPs based on what Google is already using on mobile: Google Now is a preview of future SERPs. Matthew predicted that in the future Google will ‘require’ responsive for ranking, to improve user experience for all mobile searches.
 

The future is entities…

…And they are based on links. At some point we’ll start talking about entity rank – whether that rumour comes from Google or not.

Matthew’s recommended actions include (a markup example follows this list):
  • Use of Facebook Open Graph and Twitter Cards
  • Yandex ‘Islands’
  • Gmail events and reservations
  • Google SERPs
  • Schema.org (see bit.ly/googleactions and bit.ly/schemaactions for more)
  • Look for rising JSON-LD adoption (more at bit.ly/jsonldmarkup)
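To make the Schema.org/JSON-LD items concrete, here’s a minimal sketch that builds a snippet of markup in Python – the business details are made up, and LocalBusiness is just one of many available types:

```python
import json

# Minimal Schema.org-via-JSON-LD example; the business details are made up and
# LocalBusiness is just one of many available types.
markup = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Crab House",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Brighton",
        "addressCountry": "GB",
    },
    "telephone": "+44 1273 000000",
}

print('<script type="application/ld+json">')
print(json.dumps(markup, indent=2))
print('</script>')
```

The resulting script block can sit in the page’s head or body; check Google’s documentation for which types and properties are currently eligible for rich snippets.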

If you don’t yet have a Lumar account, you can request a live demo here on a site you’re working on, or log in now to get started.

Tristan Pirouz

Marketing Strategist

Tristan is an SEO enthusiast, strategist, and the former Head of Marketing at Lumar.
