The White Hat SEO Techniques You Should Prioritise in 2018



April 26th, 2018 / Clayton C

Articles on search engine optimisation (SEO) and SEO techniques often use the term “White Hat SEO” (and its polar opposite, “Black Hat SEO”). Due to the nature of SEO, there can be a fine line between White Hat SEO principles and Black Hat SEO.

For anyone unsure of what White Hat and Black Hat SEO actually mean: White Hat SEO refers to SEO techniques that follow the guidelines published by search engines such as Google. It can also include emerging techniques that – while not part of any published guidelines – have been mentioned positively by search engine employees such as John Mueller, or discussed publicly by other search engine authority figures. Black Hat SEO eschews White Hat practices in favour of identifying and exploiting loopholes, acting against known SEO principles. The draw of Black Hat SEO is that it can achieve high rankings for competitive customer queries in short time frames and at lower cost – a shortcut, if you like, to the front page or top of search results.

But these are also frequently short-term wins: search engine algorithms or the web spam team often detect the use of underhanded techniques and adjust the ranking accordingly, or worse, impose a penalty on your site. White Hat SEO, by focusing on following the rules and guidelines, usually results in slower gains, which can be frustrating for anyone expecting immediate results. But the benefit is that these gains – which with a strong strategy can be incremental and ongoing – are seldom fleeting. Most negative changes to your ranking will be the result of somebody else having a stronger strategy than you, and you can regain the advantage by strengthening your own. And changes to search engine algorithms will typically not have a significant long-term impact on your site’s rankings.

Common White Hat SEO Practices

Because they follow established guidelines, broad White Hat SEO practices generally remain the same year in and year out, with occasional small changes to the techniques that feed into the overarching principles. These include:

  • Regular keyword research.
  • Site speed.
  • Support for HTTP/2.
  • Creating high-quality content.
  • A link building strategy that includes attracting quality inbound links, and gaining legitimate links via guest blogging, etc.
  • Proper use of page titles and other important meta tags.
  • A site structure that makes it easy for visitors and search engines to navigate your site, including the use of a clear URL structure. This also includes the use of a sitemap, and a robots.txt file to control which pages search engines crawl (see the example after this list).
  • Good use of headers and keywords throughout your site.
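
For the last two points, a minimal robots.txt sketch might look like the following – the disallowed paths and sitemap URL are placeholders, not recommendations for any particular site:

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/

    Sitemap: https://www.example.com/sitemap.xml

A file like this tells crawlers to skip the listed directories and points them at your sitemap; pages you want kept out of the index entirely are better handled with a noindex meta tag, since robots.txt only controls crawling.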

However, as noted above, small changes are sometimes necessary. These could come about as a result of new technology, updates to search engine guidelines, or changes to how people perform searches. If you regularly read articles on SEO, you will sometimes pick up on which changes to implement, or which techniques to prioritise, but it is also easy to miss something that could be of great benefit to your business in search results.

White Hat SEO Techniques to Prioritise in 2018

With the above in mind, let’s now take a look at some of the White Hat SEO techniques you should be focusing on this year, without neglecting any of the standard SEO techniques.

Improve page speed

Early in 2018 Google announced the “Speed Update”, which will be rolling out in July 2018. Although page load speed has been a ranking factor for the past eight years, the new update will see page speed on mobile devices also become a ranking factor. This means that if your site loads quickly on desktop but underperforms on mobile devices, your rankings on SERPs could be affected. In the same announcement, Google stated the update was expected to affect only a small number of queries, based on pages that deliver the slowest experience to users.

But this doesn’t mean you can ignore page speed: it also influences your click-through rate and bounce rate, with users abandoning sites that load slowly and never returning. Now is the best time to run various page speed tests and find ways of implementing the recommendations they return. Speed tests are performed on individual pages – not the whole site at once – so focus on popular pages first, before tackling lower-priority pages. Many of the recommendations – when implemented properly – will improve page load speed across your site as a whole, but there could still be opportunities on individual pages too.
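
One practical way to test an individual page – assuming you have Node.js installed – is Google’s open-source Lighthouse tool run from the command line; the URL and output file name below are placeholders:

    npm install -g lighthouse
    lighthouse https://www.example.com/popular-page --output html --output-path ./speed-report.html

The generated report lists the same kinds of recommendations (render-blocking resources, uncompressed assets, and so on) that web-based tools such as PageSpeed Insights return.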

Switch to using HTTPS

Most algorithmic changes to Google are only confirmed after they have already been implemented, but other changes to Google’s index are announced well in advance, such as the “Speed Update”. Another change that was communicated some time ago was the push for more secure browsing, with Google encouraging site operators to use HTTPS throughout their site, not just on pages involving sensitive data.

How Google Chrome will start treating all HTTP pages from July 2018

Since 2017, Google has gradually been marking more and more pages as “Not secure” in the Chrome browser, beginning with pages that request sensitive data such as credit card details and/or passwords. With the release of Chrome 62 in October 2017, this expanded to pages that allow users to enter any form of data, including site search queries and anything typed into contact forms or sign-up fields. Starting in July 2018 – with the release of Chrome 68 – all HTTP pages will be marked as “Not secure”.

While Google has described HTTPS as only a lightweight ranking signal, the “Not secure” warning could still have an impact on your site traffic and conversions, which could indirectly influence your rankings. Switching to HTTPS throughout your site is relatively simple, but there are a few important steps to take to ensure all traffic is properly redirected to the secure versions of all pages.
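
As an illustrative sketch – one of several ways to do it – a permanent redirect from HTTP to HTTPS in an nginx configuration could look like this, using a placeholder domain and assuming a valid TLS certificate is already installed:

    server {
        listen 80;
        server_name example.com www.example.com;

        # Permanently redirect every HTTP request to its HTTPS equivalent
        return 301 https://$host$request_uri;
    }

An equivalent rule can be written for Apache in an .htaccess file. In either case, remember to also update canonical tags, internal links, and your sitemap to reference the HTTPS URLs.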

Compress all images

The internet has always been a very visual medium, and as internet speeds have improved over the years, so too has our use of large, high-resolution images. Since 2011 the average size of a web page has tripled, with images accounting for more than half of the average page’s weight.

How page size has changed since 2011

Page load speed matters more than page size, but certain elements that contribute to large page sizes – such as images – do influence page load speed. A common recommendation from page speed tests is to compress images, and if you aren’t already doing so, it is a step that needs to be added to your development process.

TinyPNG after compressing images

Image compression software is much more powerful than it used to be, with tools such as ImageOptim, Caesium, and TinyPNG able to compress images by up to 90 percent without visible loss of quality. And these tools work on JPEGs, SVGs, PNGs, and other image formats. Compressing all images not only speeds up page load times, but also improves the overall user experience.
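
If you prefer to build compression into your own workflow rather than rely on a desktop tool or web service, a rough Python sketch using the Pillow library is shown below; the folder names and quality setting are assumptions for illustration only:

    from pathlib import Path
    from PIL import Image  # pip install Pillow

    SOURCE = Path("images/original")     # hypothetical input folder
    OUTPUT = Path("images/compressed")   # hypothetical output folder
    OUTPUT.mkdir(parents=True, exist_ok=True)

    for path in SOURCE.glob("*.jpg"):
        image = Image.open(path)
        # Re-encode with optimised settings and reduced quality;
        # adjust the quality value until artefacts become noticeable.
        image.save(OUTPUT / path.name, "JPEG", optimize=True, quality=80)

Whatever tool you use, keep the uncompressed originals so you can re-export if your quality requirements change.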

Check and update all metadata

Page titles have always been an important part of SEO, since they are among the first things users see on SERPs. Descriptive page titles that incorporate relevant keywords increase the chances of users clicking on your result, especially if the title comes close to matching their query. Pay attention to what users are searching for in relation to your business, and also to your competitors’ page titles. How do yours compare, and are they enticing enough?

Example of a search snippet in Google

Meta descriptions for all pages are equally important, since they give a greater indication of relevance in relation to the user’s original search query. And since November 2017, Google has displayed up to 320 characters in search snippets, roughly double the previous limit. This gives you more room to write impactful meta descriptions that align more closely with user intent.

Page titles and meta descriptions don’t directly influence rankings, but they have the potential to increase click-through rates and time spent on site, which can influence rankings along with conversions. However, it is also worth noting that Google now generates search snippets dynamically, so SERPs will sometimes use a page’s meta description and sometimes pull content directly from the page copy.
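
As a simple illustration – the page name and copy are invented – both elements live in the page’s head:

    <head>
      <title>Handmade Leather Satchels | Example Store</title>
      <meta name="description" content="Browse our range of handmade leather
        satchels, shipped Australia-wide with free 30-day returns.">
    </head>

Keep titles within roughly 60 characters so they aren’t truncated on SERPs, and write each description as a genuine summary of the page rather than a list of keywords.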

Check use of structured data

Structured data for websites has been around – and encouraged – for more than five years, yet many sites still don’t use it at all, or don’t use it effectively.

As noted by John Mueller, structured data does not offer a ranking boost, but it can make it easier for search engines to understand what a page is about. It can also help your website perform better in local search, and make your site’s appearance on SERPs stand out and be more relevant to user queries.

Example of a rich snippet in Google search results

Proper use of structured data makes it possible for Google to display rich snippets and other search features, which could include ratings, additional product info, and richer information about your business, including location and business hours. By including more information in your results on SERPs, your entry not only stands out from the other entries, but the displayed information may also be exactly what a user was searching for, making it more likely that they will select your entry. And as before, higher click-through rates lead to more conversions and more time spent on site, which all increases the chances of your site ranking higher in future.

Audit your site for opportunities to use structured data, looking not only at information such as your address, contact numbers, and business hours, but also at products (in online stores), customer reviews, and blog posts/articles. Look for opportunities to include relevant breadcrumbs and sitelinks search boxes in your search results, along with opportunities to display products, etc. in a carousel.
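
As a sketch of what business details can look like in the recommended JSON-LD format – every name, number, and address below is a placeholder – a block embedded in a page might resemble:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Café",
      "telephone": "+61 3 9000 0000",
      "openingHours": "Mo-Fr 08:00-17:00",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Melbourne",
        "addressRegion": "VIC",
        "postalCode": "3000",
        "addressCountry": "AU"
      }
    }
    </script>

Markup like this can be validated with Google’s Structured Data Testing Tool before it goes live.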

Using structured data appropriately on your site doesn’t guarantee that the data will be incorporated in rich snippets, but it also won’t have any negative impact on your site.

Optimise for voice search

The final White Hat SEO technique you should be focusing on is one that needs to become part of your long-term strategy too: voice search. With mobile devices now responsible for more searches than desktop computers, users are also starting to rely more on voice search than typed queries. Digital assistants like Alexa, Google Assistant, Siri, and Cortana are now not only standard on most mobile devices, but also built into a variety of smart devices like Google Home and the Amazon Echo. And all of them make it possible for users to interact using only their voice, asking the assistant to perform an action or to search the web for information.

Keyword Magic Tool in SEMRush

What changes with a voice search is how the query is expressed: while the use of natural language has become more prominent in typed queries, it is even more common in spoken ones. This means paying more attention to researching long-tail keywords and how they appear in your content, along with the questions users ask when searching. SEMrush, for example, has a tool in development – the Keyword Magic Tool – which not only improves keyword research, but makes it possible to display only the questions that use certain keywords.
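
A small, invented example of shaping content around a spoken question – pairing the phrasing users actually speak with a direct answer:

    <h2>How long does it take to resole a pair of boots?</h2>
    <p>Most resoling jobs are completed within five to seven business days,
       although hand-stitched soles can take up to two weeks.</p>

Content structured this way gives voice assistants (and featured snippets) a concise answer to lift, while still reading naturally for visitors.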

Conclusion

The rules and guidelines that search engines like Google follow when indexing and ranking websites will continue to change – not because search engines are continually looking for ways to make ranking more difficult, but because they are constantly looking for ways to improve the results returned for user queries, so that they are always as relevant as possible. As long as you follow these rules and guidelines by employing White Hat SEO techniques, your rankings should never be negatively impacted in a significant manner. However, it is important to remember that while the principles of White Hat SEO won’t change significantly, the techniques employed will continue to evolve in line with search engine algorithms, rules, and guidelines.




