If a site has poor load times, it creates a host of problems. Many users will abandon a site that takes too long to load, and you may never get them back. Poor site structure can lengthen load times and make it harder for a search engine crawler to find your content.
As a result, your pages may not get crawled at all: if they eat up too much crawl budget, crawlers will simply move on to other sites. A poorly optimised page may therefore never appear on Google.
Technical SEO aims to eradicate these sorts of issues. In short, it is the management of the technical side of a website, ensuring that nothing holds a site back from reaching its potential.
One of the first major priorities is ensuring that site speed is adequate. There are thousands of potential causes of a slow site, but more often than not it comes down to a few typical culprits.
Kissmetrics found that 40% of users abandon a site that takes more than three seconds to load.
Source: https://blog.kissmetrics.com/loading-time/
Oversized image files will cause a webpage to load slowly. Uncompressed images on an image-heavy page drag down load times, and high-quality images are typically far larger in file size than anything else on the page.
Compressing images solves this by reducing file sizes to an optimal level. At times, though, compression significantly reduces image quality, which is a real problem for photographers and e-commerce sites: unattractive images put users off.
However, there’s a newer file format for images on the web called WebP. Developed by Google, it compresses images to an optimal level while retaining their quality.
Great-looking images with no hindrance to load times: the perfect remedy, with great click-through results. The catch is that not all browsers and sites support the format yet, so it isn’t a complete solution on its own.
This is where technical SEO can step in. Currently, Google Chrome and Opera support WebP, and other browsers are rumoured to be testing the format. For browsers that don’t support WebP, make sure a different file format, such as PNG, is set up to fall back on.
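As a rough sketch, the HTML picture element handles this fallback natively. The file names below are hypothetical:

```html
<!-- Browsers that support WebP use the first <source>;
     everything else falls back to the PNG in the <img> tag. -->
<picture>
  <source srcset="product-photo.webp" type="image/webp">
  <img src="product-photo.png" alt="Product photo">
</picture>
```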
HTTP compression benefits all other content outside of images. A site’s HTML, CSS and JavaScript can be compressed before being sent to the browser, so each response is smaller, the site requires less bandwidth and download times drop. Compression does need to be configured carefully, though, as it can introduce technical issues of its own.
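How you enable compression depends on your server. As a minimal sketch, assuming an Apache server with mod_deflate available, gzip compression can be switched on with a few lines of configuration:

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser.
  # Images are left alone: formats like JPEG and PNG are already compressed.
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```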
Making it easy for Google’s crawlers to scout your web pages goes a long way. Lay down a simple pathway for crawlers so they can easily reach the content you want on Google. There are several ways to make this possible.
Good site architecture is a principle that makes everyone’s lives better. No page should sit more than four levels deep: if content is buried any further, most users are unlikely to find it, and search engine crawlers may miss it too. In short, that’s content that won’t be found.
A site needs consistent, user-friendly architecture. It keeps Google’s crawlers happy, gives users a better experience and increases traffic to your site. Unfortunately, it’s not always as straightforward as sticking to four levels.
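To illustrate the four-level rule, a hypothetical e-commerce site might keep every product within four clicks of the homepage:

```
example.com/                             (level 1: homepage)
example.com/menswear/                    (level 2: category)
example.com/menswear/shirts/             (level 3: sub-category)
example.com/menswear/shirts/oxford-blue  (level 4: product page)
```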
Canonicalisation is a way to manage duplicate content for search engines. Duplicate content is where multiple unique URLs serve the same page, often created deliberately so that users have several routes to the same content.
Search engines won’t see it that way: they treat the different URLs as different pages, and crawling duplicates wastes a crawler’s resources.
A canonical tag tells search engines which URL is the definitive version of a page, so they can ignore the other URLs associated with it. As a result, only the canonical URL appears in search engine results.
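As a simple sketch (the URLs here are hypothetical), a filtered or sorted version of a product listing can point search engines back at the preferred URL with a single line in its head:

```html
<!-- Placed in the <head> of each duplicate URL,
     e.g. example.com/shirts?sort=price&colour=blue -->
<link rel="canonical" href="https://www.example.com/shirts" />
```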
A sitemap is a map of your site for crawlers, while robots.txt is a plain-text file telling crawlers which pages they may and may not crawl.
Imagine a huge mansion. The sitemap is the map of the mansion, and robots.txt is the set of red and green lights above the doors showing which rooms crawlers can and can’t enter.
Together, they ensure crawlers know exactly where to go, and that goes a long way: helping Google’s crawlers helps you rank well on Google.
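A minimal robots.txt sketch, with hypothetical paths standing in for the rooms you’d rather keep crawlers out of, might look like this:

```
# Applies to all crawlers
User-agent: *

# Red lights: pages crawlers should not enter
Disallow: /checkout/
Disallow: /admin/

# Hand the crawler its map of the mansion
Sitemap: https://www.example.com/sitemap.xml
```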
The use of schema markup has become very popular in the search industry. Schema is code you insert into your HTML that improves the way search engines read and represent your webpage.
Using schema, you can shape how your page appears in a search engine. For example, you can add a star rating based on user reviews beneath your page link; if the site is more writing-based, you can add a publish date and author.
It essentially adds a little extra flair to your page in search results. Something as small as a star rating could be the difference between someone clicking on your site or not.
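For a writing-based page, that markup is commonly added as a JSON-LD snippet in the page’s HTML. The headline, author and date below are made-up placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A Beginner's Guide to Technical SEO",
  "author": { "@type": "Person", "name": "Jane Smith" },
  "datePublished": "2018-03-01"
}
</script>
```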
Schema markup was created collaboratively by Google, Yahoo and Bing to make everyone’s experience of search engines better. Adding schema gives you the opportunity to surface something small but important about your company in the SERPs.
It’s like a virtual business card. Some analysts say that pages using schema rank significantly higher than those that don’t. In short, schema not only makes your page more attractive to users, it can also help you rank. Our schema markup article offers more detail on this.
Great site architecture and data compression are sure-fire ways to set yourself up for the future. But other factors are becoming more prevalent as the years roll by.
It’s no secret that a huge share of users now surf the internet from their smartphones. Google has announced that more than 50% of search queries globally come from smartphones, and that number is likely to keep growing as smartphones play such a big part in the modern world.
It goes without saying that a mobile-friendly version of your site should be a huge priority; many marketers now feel that mobile should take priority over everything else.
Sites that are poorly optimised or structured for smartphones will rank lower in search engines, so ensuring your site is mobile-friendly sets you up for the future.
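A basic first step, for example, is the viewport meta tag, which tells mobile browsers to scale the page to the device’s screen rather than rendering it at desktop width:

```html
<!-- Without this tag, mobile browsers render the page at desktop
     width and shrink it down, making text hard to read. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```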
Voice search is becoming more popular across multiple devices. Smartphones, tablets and even desktops now have voice search capabilities: Apple has Siri on Mac, iPhone and iPad; Windows has Cortana; and Google has Voice Search on Android devices. There are now even dedicated voice-command devices, like the Amazon Echo, taking the market by storm.
Currently, voice search is still young and far from perfect, but it gets better and more popular as the years roll on. Voice search has been attempted many times over the decades and never really succeeded. So what makes it different now?
The answer comes down to one major factor: accuracy. It’s what so many previous voice search technologies got wrong. Different accents couldn’t be picked up, and your tone of voice would make the system think you were saying something else. It just wasn’t very good.
Now, although it isn’t perfect, it understands what you’re saying and is a viable way to search. It won’t be long before it not only understands accents with near-100% accuracy but is also smart enough to grasp the context of what you’re saying, leading to actual conversations rather than static commands. Imagine an exchange like this:
Me: What’s the best BBQ for under £500?
VA: Do you want a traditional charcoal BBQ or a gas one?
Me: What do you think is better?
VA: As you regularly buy ribs, brisket and shoulders of pork, you may enjoy a traditional charcoal BBQ, as you can smoke the meat.
Me: Cool. Let’s go for a charcoal one then.
VA: Good choice. How often will you use it?
Me: I want to use it as much as I can in the summer.
VA: Well on average, London only gets 62 days of sunshine a year. So it’s worth getting a rain cover for the BBQ.
This sort of experience isn’t here yet, but it could be just around the corner. For now, voice search is more command-based; the key thing that makes it so viable and popular is that it works, and it’s pretty accurate.
Google voice search currently sits at around a 90% accuracy rate. Beyond that, the major appeal is ease of use: the average person types around 40 words per minute but speaks around 150.
Hound, a third-party smartphone app dubbed the “fastest and smartest voice assistant”, can keep up with the pace of natural speech. It understands long-winded sentences spoken very quickly, and when you follow up, it grasps the context and returns results still relevant to your original query. It’s quite staggering.
Voice search has the potential to completely change the SEO industry, and it’s something many businesses and SEO teams are thinking about. Experts believe that by 2020, 50% of all searches will be made by voice. It’s already changing SEO strategy, with priority shifting towards long-tail keywords.
Long-tail keyword searches are more specific and detailed; they read more like a sentence with the keyword included. Typically, when someone searches by voice, they are more specific about what they want and phrase it as a proper sentence.
Someone typing into Google search might enter “Weather Madrid”, whereas someone using voice search might ask “What is the weather like in Madrid today?” or “Is the weather going to be nice today in Madrid?”. Prioritising long-tail keywords essentially means basing keywords around these more natural sentences.
Setting your business up so that it can appear in results from voice searches will go a long way in the future. It doesn’t require much extra work: if you’ve followed the basics covered earlier in this article, you’ll be in good shape for voice search.
Technical SEO can be very complicated, with tons of depth to the subject. Hopefully, this article helps you understand its core fundamentals and why they matter. Ensuring that your site doesn’t suffer from any technical hindrances greatly improves performance and will make a difference to your business.