In January 2023, Yandex experienced a source code leak that gave the SEO community valuable insight into which site features can be improved to benefit organic rankings. Yandex and Google share many ranking factors, but it should be noted that, although the two search engines work similarly, they aim to satisfy two very different audiences. Therefore, not all of the learnings from the Yandex leak will necessarily be valuable to a site that is trying to rank on Google.
What is Yandex?
Yandex was founded in 1997 and over the years has become one of Russia's tech leaders and the fourth-biggest search engine worldwide by market share.
Yandex is one of the biggest Russian multinational internet companies and is widely considered the Russian alternative to Google. The two search engines work similarly, with the difference that Yandex is designed to serve a predominantly Russian-speaking audience. Yandex also offers a variety of other applications and services, such as:
Taxi services (like Uber)
Email services (like Gmail)
App analytics (like Google Analytics)
Maps (like Google Maps)
Smart home technologies (like Google Home or Amazon Alexa)
Music Streaming app (like Spotify or Apple Music)
Online Advertising (like Google Ads services)
Yandex Source Code Leak
Towards the end of January, Yandex experienced a leak of its source code, which gave the SEO community great insight. Most search engines are secretive about the details of their ranking algorithms to prevent exploitation by spammers and other malicious actors, so when elements of Yandex’s algorithm leaked, it was big news for the SEO community!
Like Google, Yandex has a detailed algorithm that updates and changes on a regular basis. They both work toward the common goal of providing users with sites that are related to their queries. This suggests that there are many similarities in the design of their search algorithms.
The Yandex code leaked on 31st of January 2023. Around 17,000 ranking factors were found in the codebase, measured through a variety of metrics, offering a better understanding of the inner workings of a modern search engine and of which ranking factors are most influential. The leaked data includes details about the algorithms and techniques the search engine uses to determine the relevance and quality of websites, and this information can be used to inform optimization strategies. It should be noted that the Yandex algorithm is not the same as Google’s, so the findings may have varying levels of relevance to a Google-based SEO strategy.
From the leak, multiple SEO factors were identified, with some carrying more weight than others. Some of the most important ranking signals for Yandex were found in the three areas below:
1. Backlinks
The source code showed that Yandex uses a PageRank algorithm to rank results on the Search Engine Results Page (SERP). PageRank is based on the idea that a link from one website to another acts as a vote of trust and authority: the more links (votes) that point to a page, the more it can be trusted and the higher it should rank. This makes link building an important part of an SEO strategy; implemented correctly, it should bring improvement in organic traffic and website interactions.
Overall, the findings show that backlinking strategy takes priority when it comes to SEO, as it helps search engines identify the relevancy and trustworthiness of your website. Furthermore, backlinks from the main page (homepage) tend to be more effective.
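The "votes" idea behind PageRank described above can be sketched in a few lines of Python. This is a minimal illustration of the classic algorithm, not Yandex's actual implementation, and the three-page link graph is invented for the example:

```python
# Minimal PageRank sketch: pages are nodes, links are "votes",
# and each page spreads its score evenly across the pages it links to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Sum the share of rank passed on by every page linking here.
            incoming = sum(
                rank[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            new_rank[page] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

# Hypothetical three-page site: "home" is linked to by both other
# pages, so it collects the most "votes" and scores highest.
graph = {
    "home": ["blog"],
    "blog": ["home", "contact"],
    "contact": ["home"],
}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # "home"
```

Because "home" receives links from both other pages, it accumulates the largest score, which mirrors why homepage backlinks tend to carry the most weight.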
2. User Signals
User signals are behavioral patterns that search engines use to evaluate the ranking of your site. They include metrics such as click-through rate (CTR), time on site, bounce rate and the number of visitors returning to the site. Returning users, for example, give search engines an indication that your site offers high-quality, trustworthy information: if someone goes back to a site, chances are its content is valuable and may be difficult to find elsewhere. This is a factor that search engines value highly.
Another important factor is the quality of the user experience on the site. This can affect user signals more broadly: the longer the load time, the higher the chance that users bounce off the site and return to the search bar. If few users spend a substantial amount of time navigating the site, search engines may take this as a sign that the content is poor, which can result in lower rankings.
Yandex also looks at specific factors such as the ratio of clicks on your URL relative to all the clicks on that search. Essentially, the greater the share of clicks your site earns for a query, the better it will be perceived by search engines, consequently benefiting rankings.
Click ratios are often broken down by region in order to provide more specific answers to the user.
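The click-share ratio described above is straightforward to compute: a URL's clicks divided by all clicks recorded for that query. The figures below are invented for illustration, not real Yandex data:

```python
# Hypothetical click counts for a single search query.
clicks = {
    "example.com/guide": 120,
    "other-site.com/page": 60,
    "third-site.com/post": 20,
}

# Click share: each URL's fraction of all clicks for this query.
total = sum(clicks.values())
click_share = {url: n / total for url, n in clicks.items()}

print(click_share["example.com/guide"])  # 0.6
```

In this sketch, the first URL captures 60% of the query's clicks, which under this signal would be read as a strong vote of user preference.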
3. Relevance Evaluation
Relevance is a measure of how appropriate a given page is for a given query. Search engines go through website content and evaluate how relevant each page is to a specific keyword, as well as the intention behind the copy. The aim is to meet users’ needs and help them find the information they are looking for, rather than relying on one-to-one keyword matching, which can lead to a poor SERP.
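One simple way to see how relevance can go beyond counting exact keyword hits is term-weighting: rare query terms count for more than common ones. The sketch below uses a simplified TF-IDF score over invented page texts; real engines combine far richer signals than this:

```python
# Simplified TF-IDF relevance scoring (illustrative only).
import math
from collections import Counter

pages = {  # invented page texts for the example
    "pricing": "our pricing plans and subscription costs",
    "blog": "tips and tricks for search engine optimization",
    "contact": "contact our team for a pricing quote",
}

def relevance(query, pages):
    n = len(pages)
    docs = {name: Counter(text.split()) for name, text in pages.items()}
    results = {}
    for name, counts in docs.items():
        s = 0.0
        for term in query.split():
            # df: how many pages contain this term at all.
            df = sum(1 for c in docs.values() if term in c)
            if df:
                # term frequency * inverse document frequency:
                # rarer terms contribute more to the score.
                s += counts[term] * math.log((n + 1) / df)
        results[name] = s
    return results

scores = relevance("pricing plans", pages)
print(max(scores, key=scores.get))  # "pricing"
```

Here "plans" appears on only one page, so it is weighted more heavily than the more common "pricing", and the pricing page wins even though two pages mention "pricing".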
All of these learnings are great indicators of what search engines are looking for in websites. However, not all Yandex learnings should be applied to Google SEO strategies; they are, nonetheless, great points to keep in mind and review regularly.
Differences between Yandex and Google
Yandex and Google have a lot in common, as both have stayed at the cutting edge of technology by attending conferences such as the Special Interest Group on Information Retrieval (SIGIR), where tech companies gather to explore how computers can better satisfy people’s search requests, and by making their findings and innovations partially public. Although the platforms are similar, the learnings from the leak should be treated as suggestions rather than items that must be added to your SEO checklist.
This leak has been very valuable, and one of the biggest learnings relates to user signals. The code shows that the quality of the traffic that comes to your website has an impact on your rankings. The aim should be to build a high-quality audience that returns regularly and spends a significant amount of time browsing the site’s pages, which can be achieved with the right on-page SEO strategy.
If you’d like to find out more about SEO Strategies and how it can help your business, call us on 0207 288 6204 or get in touch via our website!