How do we use the results of our Search Analytics Lab in practice?



Now we are ready to share how we use the results of Search Analytics Lab research in practical work with clients. At the Optimization-2020 conference, practical cases were presented by Olga Yudina, Digital Director of “Ashmanov and Partners”.

Case #1: Online pet supply store.

Result: the number of queries in the Tops doubled; visibility nearly tripled.

Start of work: March 2020.

The client’s business is pet supplies. At the start of the work the site already ranked fairly high for high-frequency queries but could not climb any further. For example, in Yandex the site was in 14th place in the ranking of competitors, in Google in 7th. In other words, the client had average visibility in the topic.

Search analytics lab audit

Recall that the Search Analytics Lab of “Ashmanov and Partners” analyzes changes in ranking algorithms every month on a large sample of search queries. To do this, it collects the values of 600 parameters and evaluates pages in the search results both automatically and manually. Based on these studies, large factor reports are released annually: extensive documents with figures, statistical analysis of the data, parameter correlations and commentary, available to everyone. The research looks quite complicated, so let’s show with concrete examples how we apply it in our work and how you can do the same.

The figure shows part of the table generated by our analysis of the client’s site in the Search Analytics Lab; analysts (not optimizers!) work with it. The full table contains several hundred factors against which the client’s site is checked. The parameters from which we draw conclusions about strong or weak influence on ranking are computed with statistical methods such as Spearman’s correlation coefficient and Fisher’s test. (This article is not intended as a dive into the statistical side of the analysis, so we will not discuss them in detail.)
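To make the statistical side a little more concrete, here is a minimal sketch (with hypothetical factor values, not the Lab’s real data) of how Spearman’s rank correlation can relate a factor to search position: a strongly negative coefficient means higher factor values go with better (lower-numbered) positions.

```python
# Estimating a factor's influence on ranking with Spearman's rho.
# The factor values below are illustrative, not taken from the Lab.

def rank(values):
    """Assign 1-based average ranks to values (ties get the mean rank)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Positions 1..10 in the results and a hypothetical factor value for each:
positions = list(range(1, 11))
factor = [9.1, 8.7, 8.9, 7.2, 6.8, 6.1, 5.5, 5.9, 4.2, 3.8]

print(round(spearman(positions, factor), 2))  # prints -0.98
```

A value near -1 (as here) suggests the factor is strongly associated with better positions; values near 0 suggest little or no influence.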

We run the calculations and obtain values for each parameter: one for our client’s site, the rest for competitors’ sites. Now we will show more concretely how we use the data in this table and decide which metrics to influence and how. We will divide them into categories.

Traffic Metrics

The first thing we looked at was traffic metrics. The analysis showed that:

  • The site is inferior to competitors in terms of traffic metrics;
  • The site’s ICS (Yandex site quality index) is much lower than that of the Top 10 sites;
  • Alexa Rank is also worse than competitors’.

Thus, in traffic metrics our site falls far short of the desired values, and we will have to compensate for this by working on other factors.

Further analysis in the Lab revealed that the site lagged behind competitors and the Top leaders on the “multiple addresses” parameter. Having several addresses can positively affect ranking, and in this case we saw the relationship clearly: 67% of the Top sites in our topic listed multiple addresses.

At the start of the work our client did not list many addresses on the site. We issued a recommendation to add them, and the client (for which we are very grateful; it is always a pleasure to work with clients who listen to recommendations and implement them!) placed a map and added the addresses of offices, warehouses and pickup points. By the way, in one of our articles we have already discussed how to build an effective relationship between a client and an SEO agency.


The next parameter is warranty information, another important element that should be present on the site. Here, too, our client lagged behind competitors: about 50% of the sites in the Top provided warranty information. Naturally, we recommended adding it.

We also saw that 90% of the sites in the Top offered self-pickup. Our client’s site had no such option at the time.

What did we do?

We added multiple addresses, warranty information, and information on pickup from the warehouse and pickup points.

Text Factors

Next we analyzed the parameters relating to text factors and risks. The site turned out to have a high risk of the Baden-Baden filter: the risk score was 14.4, which is very high, while competitors’ scores were much lower (2.92 to 9.82).

There were also too few occurrences of queries in alt. Alt is an attribute of the img tag containing a verbal (“alternative”) description of the image; it is what the search robot indexes to understand what the picture shows. Therefore, when placing pictures or other images on a site, alt should be filled in with the page’s keyword queries. On this parameter we also lagged behind competitors: 0.86 on our site against 6.95–17.2 for the competitors.
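Finding images with missing or empty alt attributes is easy to automate. Here is a small standard-library sketch (not the Lab’s actual tooling; the file paths are made up for illustration):

```python
# Find <img> tags whose alt attribute is missing or empty,
# using only Python's standard html.parser module.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without a usable alt

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if not (a.get("alt") or "").strip():
            self.missing.append(a.get("src", "?"))

page = """
<img src="/img/dog-food.jpg" alt="dry dog food, 10 kg bag">
<img src="/img/cat-toy.jpg" alt="">
<img src="/img/banner.png">
"""

checker = AltChecker()
checker.feed(page)
print(checker.missing)  # prints ['/img/cat-toy.jpg', '/img/banner.png']
```

Each flagged src is a candidate for an alt text written around the page’s keyword queries.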

What we did:

The texts for the site were checked with the Turgenev service, also developed by the Lab. It assesses the overall Baden-Baden risk and identifies excessive keyword density, the share of meaningful text and a number of other parameters. By reworking the texts according to Turgenev’s recommendations, we reduced the risk to 7 and below (medium and insignificant). The missing queries in alt were also added.
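One of the checks a service like Turgenev performs is keyword density. A rough sketch of that single metric (this is our own simplification, not Turgenev’s actual algorithm or thresholds) might look like this:

```python
# Keyword density: share of the most frequent content word among all
# content words (here, words of 4+ letters). Illustrative only.
import re
from collections import Counter

def keyword_density(text, min_len=4):
    words = [w for w in re.findall(r"[a-zа-яё]+", text.lower())
             if len(w) >= min_len]
    if not words:
        return None, 0.0
    word, count = Counter(words).most_common(1)[0]
    return word, count / len(words)

text = ("Buy dog food online. Our dog food store ships dog food fast. "
        "Dog food prices start low.")
word, density = keyword_density(text)
print(word, round(density, 2))  # prints: food 0.4
```

A density this high for one word would be a clear over-optimization signal; rewriting spreads the meaning across synonyms and natural phrasing.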

Linking parameters

Next we analyzed the link parameters. Our site was inferior both in the number of domains linking to it and in the LinkRank of the links to the site.

We developed a content marketing strategy that allowed us to strengthen the lagging factors.

The results we arrived at in this case study:

As a result of implementing the recommendations, including improvements to traffic, link and text factors, the number of queries in the Tops of Yandex and Google roughly doubled: in Yandex from 1,075 to 2,129, in Google from 737 to 1,400.

Visibility grew even more significantly, almost threefold: from 9.39% to 27.47% in Yandex and from 10.61% to 29.88% in Google. This is the effect of implementing the recommendations above, plus a few others.
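The growth multiples quoted above follow directly from the case numbers and are easy to verify:

```python
# Checking the reported growth multiples from the case figures above.
def growth(before, after):
    return after / before

print(round(growth(1075, 2129), 1))   # Yandex queries in the Tops: prints 2.0
print(round(growth(737, 1400), 1))    # Google queries in the Tops: prints 1.9
print(round(growth(9.39, 27.47), 1))  # Yandex visibility: prints 2.9
print(round(growth(10.61, 29.88), 1)) # Google visibility: prints 2.8
```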

Opinion of the expert

Olga Yudina, Director of the Internet Marketing Department at “Ashmanov and Partners”:

“The factors shown are quite simple; I admit that many optimizers could analyze them without Lab data. But analyzing and making recommendations is not enough! The big problem is convincing the client to implement a particular change on the site. When the optimizer has statistics like those above, backed by numbers and data on competitors and the Top, it is much easier to show clients what to do and why, and to convince them to implement the recommendations.”

Case #2: a site in the aircraft rental industry

Result: the number of queries in the Top 10 grew more than 10-fold in Yandex and almost 4-fold in Google; visibility grew 16-fold in Yandex and almost 3-fold in Google.


Start of work: March 2020.

The site’s visibility in search was very low: it was only 89th in the ranking of topic leaders in Yandex and 17th in Google.

Audit in the Laboratory of search analytics

We conducted a search engine audit and an analysis of the site in the Lab. The figure shows a fragment of the interface the optimizer works with directly.

Parameters are considered for each factor and for each competitor in the Top 10 and Top 30. Let’s analyze some of them in detail.

Commercial factors

There were no prices on the client’s pages, while 75% of competitors’ sites in the Top 10 and 60% in the Top 30 listed prices. Of course, we recommended adding prices, which the client did, and we are very grateful for that.

Reviews are another significant factor, and they were also missing from our site, while 39% of sites in the Top 10 and 32% in the Top 30 had taken the trouble to post them. We showed this to the business together with the appropriate recommendation. Especially since reviews existed, and good ones at that: all that remained was to scan and post the testimonials from their high-profile clients, which was done.

Further analysis revealed that the site was losing to the competition because it had no section answering frequently asked questions, while such sections were found on 36% of competitors in the Top 10 and 42% in the Top 30. As we know, and as the figures above suggest, this factor can increase user loyalty, so the client added an FAQ section to the site.

Text Factors

An audit of the site’s texts showed that the content was inferior to competitors’ in the number of keywords and synonyms: the number of occurrences of individual query words in the text was 7.93, against 13.5 in the Top 10 and 11.8 in the Top 30.
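The underlying metric is simple to reproduce: count how often each query word, or one of its synonyms, appears in the page text. A sketch of that idea (the query words, synonyms and sample text are our own illustration, not the Lab’s data):

```python
# Count occurrences of query words in a text, treating listed synonyms
# as equivalent to the query word they belong to.
import re

def occurrences(text, query_words, synonyms=None):
    synonyms = synonyms or {}
    tokens = re.findall(r"\w+", text.lower())
    total = 0
    for word in query_words:
        variants = {word.lower()} | {s.lower() for s in synonyms.get(word, [])}
        total += sum(1 for t in tokens if t in variants)
    return total

text = ("Charter a private jet today. Private jet hire and "
        "aircraft rental made simple.")
print(occurrences(text, ["jet", "rental"], {"rental": ["hire"]}))  # prints 4
```

Comparing this count for your page against the Top 10 and Top 30 averages shows how far the text lags.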

The situation with occurrences of individual query words and synonyms in the description was similar, so we recommended optimizing the texts on the site. In one of our articles we have already described how to make working with the semantic core easier for the client.

Linking factors

Analyzing link ranking, we identified weaknesses in how link mass was distributed across the site’s pages. In particular, the homepage received too large a share of the links: 0.99 against 0.57 for competitors in the Top 10 and 0.58 in the Top 30. So we did content marketing work to correct this imbalance.
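The 0.99-vs-0.57 figures above can be read as the homepage’s share of all incoming links. A small sketch of that calculation (the link targets are hypothetical):

```python
# Share of incoming links that point at the homepage rather than
# inner pages. Illustrative data, not the case's real link profile.
def homepage_share(links):
    """links: list of target paths, one per incoming link."""
    if not links:
        return 0.0
    home = sum(1 for path in links if path in ("/", ""))
    return home / len(links)

links = ["/"] * 99 + ["/fleet/jets"]  # 99 of 100 links hit the homepage
print(homepage_share(links))  # prints 0.99
```

Content marketing that earns links to inner pages (guides, fleet pages, FAQ) pulls this share down toward the Top 10 norm.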


The results we arrived at in this case study:

Growth in the number of queries in the Top 10: more than 10-fold in Yandex and almost 4-fold in Google; in numbers, from 11 to 170 and from 57 to 205, respectively.

Visibility dynamics were also significant: from 1.33% to 21.31% in Yandex and from 8.05% to 23.1% in Google.


Key ranking factor projections for 2021

Predictions are, of course, a thankless task. However, Michael Volovich, head of the Search Analytics Lab, shared his view of how search engine ranking will change in the coming year. Recall that the Lab studies more than 600 parameters of search results across various topics and publishes reports on its findings.

What can we expect in 2021?

  • It appears that the monopoly of large sites in the Yandex Top is ending. The search engine will cautiously limit the advantages of the largest sites.
  • This will weaken some parameters (ICS, traffic, linking), but smaller sites will have a slightly better chance of breaking into the top positions.
  • We expect Yandex and Google (which never had such an abundance of large, broad-assortment sites) to become more similar to each other than before.
  • Sites with mobile adaptive layouts will rank closer to the Top 3. Yandex’s Turbo pages will gain traction in mobile rankings. There is no data yet suggesting that AMP pages will make a big difference in mobile Google rankings.
  • More latent factors will be involved in ranking at the top positions, while some parameters (linking, social, etc.) will show a weaker correlation with getting into the Top 3.
  • Google will increase the gap between the “head” (Top 1-2-3) and the “tail” of the rankings.
  • Google will increase the weight of site authority (E-A-T). Recall that E-A-T assessments gauge a site’s authority and reliability and can demote otherwise relevant pages with unreliable information. We have covered Google’s rankings and algorithms in more detail on our blog.
  • The influence of technical parameters (HTTPS, adaptive layout, etc.) and some non-commercial factors will increase, though they will not fully determine a site’s position.


Conclusions

  • When working with clients, the professionals who can back up their recommendations with data from analytics and research win.
  • The practical cases show that implementing recommendations based on the Lab’s research has a predictable positive effect.
  • In 2021 the trends of previous years continue in search overall, but the role of a number of parameters will change: some will become less important, others more significant.
  • A more detailed assessment of the correlations can be found in the Search Analytics Lab’s research reports.
