A quality search engine doesn’t just attempt to return pages that match the query the user typed in. A really good search engine tries to answer the underlying question the user is asking. If you have realised this then you are already one step ahead of many others, and you will understand why Google, Bing and others use very complex algorithms, or rather a series of huge algorithms, to decide which results to return.
There are multiple factors in the algorithm (over 220 in Google’s), and some are hard factors, such as the number of backlinks to a page, the age of the page and website, the quantity of content on the page, and potentially social signals such as Facebook likes or Instagram follows.
These are just a few of the internal and external influences; on-page factors are also vital. These include the quality of the HTML code, the size of the web page, the speed with which it loads and how usable the algorithm judges it to be, all of which play a role in how a page is ranked in search. It is only by analysing both the off-site and on-page factors that Google or the other search engines can determine which pages will best answer the query that has been asked. Specialist digital agencies and SEO professionals spend thousands of hours trying to reverse-engineer Google’s highly secretive algorithm. Whilst Google are masters at not ‘showing their hand’, some core factors are known, including site age, authority, backlinks, site quality and site traffic. You can build these qualities by earning some top-quality backlinks. You can buy guest posts from a reliable source to gain some authority quickly, but you still have to be patient and avoid rushing. Studying Google’s many patents is also a good way to reverse-engineer their algorithm, if you have the time and a strong stomach for complex maths.
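To make the idea of combining many weighted factors concrete, here is a deliberately simplified sketch in Python. The factor names, weights and scores below are invented for illustration; Google’s real factors, weights and formula are secret and vastly more complex.

```python
# Toy model: a ranking score as a weighted sum of normalised factors.
# All factor names and weights here are assumptions for demonstration only.

def ranking_score(page, weights):
    """Combine normalised factor values (0.0 to 1.0) into a single score."""
    return sum(weights[factor] * page.get(factor, 0.0) for factor in weights)

# Hypothetical weights over a handful of the known factor families.
weights = {
    "backlinks": 0.30,
    "site_age": 0.10,
    "content_quality": 0.35,
    "page_speed": 0.15,
    "social_signals": 0.10,
}

# A hypothetical page scored against each factor.
page = {
    "backlinks": 0.8,
    "site_age": 0.5,
    "content_quality": 0.9,
    "page_speed": 0.7,
    "social_signals": 0.2,
}

print(round(ranking_score(page, weights), 3))  # → 0.73
```

Pages would then be sorted by this score, highest first. The point is not the numbers but the shape: hundreds of signals are reduced to an ordering of results.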
Search engines have evolved massively over the years; initially they could only deal with Boolean operators. To simplify, they worked on whether the search term was included in the document or not: true or false, a one or a zero. You could also use search operators like ‘AND’, ‘OR’ and ‘NOT’ to search for documents containing multiple terms, exclude terms or add additional terms. Moving on to the modern day, search engines have a far better understanding of how to decode a webpage, index its content and rank it, placing more valuable and relevant documents nearer the top of the search results: the coveted first page of Google or Bing. These are just some of the elements that make up an overview of Google’s algorithm.
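The Boolean model above is simple enough to sketch in a few lines of Python. The documents and terms below are made up for illustration; a term either appears in a document or it doesn’t, and AND, OR and NOT become set operations on an inverted index.

```python
# Minimal sketch of early Boolean retrieval. Documents are invented examples.

docs = {
    1: "seo agency london backlinks",
    2: "painter and decorator glasgow",
    3: "seo backlinks guide",
}

# Build an inverted index: term -> set of document ids containing it.
index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

def contains(term):
    """Documents containing the term (the 'one or zero' membership test)."""
    return index.get(term, set())

# "seo AND backlinks NOT london": intersection, then set difference.
result = (contains("seo") & contains("backlinks")) - contains("london")
print(sorted(result))  # → [3]
```

Note there is no notion of relevance here: a document either matches the expression or it doesn’t, which is exactly the limitation modern ranking algorithms were built to overcome.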
In reality Bing accounts for a very small share of internet search traffic, reported by some SEO experts as 5% to 10%.
The difference in quality between Bing and Google is night and day, as you can see by searching for the same thing in both search engines. Google has a much better local search algorithm, laser-targeting search traffic with locally relevant results for many niches. For instance, if you search for ‘painter and decorator’, you don’t want a large London company as result one if you live in Scotland. Bing may give you a result for a random area, often an irrelevant company that is not what you are looking for. Google, however, will usually show only local companies on page one, and as a general rule results stay geographically relevant until there are no more local results to give, at which point it widens the geographic spread.
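One simple way to picture local-first ranking is sorting candidates by distance from the searcher, so nearby businesses surface first and the radius effectively widens as local options run out. The sketch below is an assumption-laden toy, not Google’s method: the business names and coordinates are invented, and distance is computed with the standard haversine formula.

```python
import math

# Toy sketch of geo-biased ranking: nearest businesses first.
# Names and coordinates below are invented for illustration.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

searcher = (55.95, -3.19)  # hypothetical searcher in Edinburgh

businesses = [
    ("Decorator A (London)", 51.51, -0.13),
    ("Decorator B (Glasgow)", 55.86, -4.25),
    ("Decorator C (Edinburgh)", 55.95, -3.20),
]

# Sort by distance from the searcher: local results rank first.
ranked = sorted(
    businesses,
    key=lambda b: haversine_km(searcher[0], searcher[1], b[1], b[2]),
)
print([name for name, _, _ in ranked])
```

In reality distance would only be one signal blended with relevance and prominence, but the effect described above, local results first and a wider spread afterwards, falls out naturally from this kind of geographic weighting.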
What’s going to be big in Google search in 2020? Voice search will continue to grow, and local search will continue to be refined, along with a healthy dose of AI to try and improve the search user experience (UX)!