How to Access and Interpret Real-Time Search Engine Data
Google doesn’t publish official information about its index of websites, so webmasters have to rely on proxy metrics to make informed decisions. Although less accurate and subject to bias, these proxies provide valuable insight into the results of algorithm changes and can be used to explain a website’s statistics. So when near-real-time search data is available from sources like SEOmoz, it makes sense to use it fully.
To begin with, many articles can be evaluated on the basis of how widely they are shared on social media, so tools like Social Mention are extremely useful. Simply typing a company or website name sets the search to work determining the content’s popularity in real time. The tool also tries to establish the sentiment behind social mentions – positive, negative, or neutral – as well as the associated anchor text. It presents the information concisely in tables, covering data such as the origin of mentions. Naturally, everything should be taken with a grain of salt, as the interface is not perfect. Alternatively, there is Twingly. Aimed predominantly at Twitter users, Twingly offers extensive customization and search options. It can also turn searches into RSS feeds so that you can always stay on top of what is going on.
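Because Twingly delivers search results as RSS, they can be consumed with nothing more than the standard library. The sketch below parses a hypothetical RSS 2.0 sample; a real feed URL would come from a saved Twingly search, and the item fields shown are assumptions based on standard RSS, not Twingly’s exact output.

```python
# Sketch: reading an RSS 2.0 search feed with Python's standard library.
# SAMPLE_FEED is a made-up example; a real feed would be fetched from a URL.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Search: example.com</title>
    <item><title>Post mentioning example.com</title><link>http://blog.a/1</link></item>
    <item><title>Another mention</title><link>http://blog.b/2</link></item>
  </channel>
</rss>"""

def feed_items(feed_xml):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in feed_items(SAMPLE_FEED):
    print(title, "->", link)
```

Subscribing a feed reader to such a feed is what lets you monitor new mentions without re-running the search by hand.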
Those in search of more serious data should consider MOZcast, which is updated daily and keeps three months of history for a thorough comparison exercise. Its database samples a large number of queries to provide useful insight into the Google algorithms in use during that time frame; its data should still be considered carefully, though, as it does not cover all available results. The small temperature calendar shows how much SERP movement occurred on a specific day and whether the news is good or bad for the industry.
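The idea behind that daily “temperature” can be sketched in a few lines: compare two days’ rankings for the same query and score how much they changed. The scoring below is a simplified illustration of the concept, not MOZcast’s actual formula, and the domain names are hypothetical.

```python
# Sketch of SERP flux: the fraction of rank positions whose URL changed
# between two days' result lists for the same query. A real tracker would
# aggregate this across many queries to get a daily "temperature".

def serp_flux(day1, day2):
    """Fraction of positions (by index) holding a different URL on day 2."""
    changed = sum(1 for a, b in zip(day1, day2) if a != b)
    return changed / len(day1)

yesterday = ["a.com", "b.com", "c.com", "d.com", "e.com"]
today     = ["a.com", "c.com", "b.com", "d.com", "f.com"]
print(serp_flux(yesterday, today))  # 0.6: positions 2, 3, and 5 changed
```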
The first hard statistic is domain diversity, which shows how many different websites appear across the SERPs. The greater the diversity, the less crowding, meaning more websites get a realistic shot at the top positions and, theoretically, no single website monopolizes user traffic or achieves an unusually high PageRank. Looking at the current numbers, diversity has decreased by two percent despite Google’s August efforts and algorithm changes.
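One simple way to operationalize domain diversity is the share of unique domains among the result URLs, as in the sketch below; the URLs are hypothetical and this need not be the exact metric MOZcast computes.

```python
# Sketch: domain diversity as unique domains / total results on a SERP.
from urllib.parse import urlparse

def domain_diversity(urls):
    """Return the fraction of result slots held by distinct domains."""
    domains = [urlparse(u).netloc for u in urls]
    return len(set(domains)) / len(domains)

serp = [
    "http://shop.example.com/page1",
    "http://shop.example.com/page2",   # same domain crowding a second slot
    "http://other.example.org/",
    "http://news.example.net/story",
]
print(domain_diversity(serp))  # 0.75: one domain holds two of four slots
```

A value of 1.0 would mean every slot belongs to a different site; lower values mean crowding.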
The second tab tracks the number of results per SERP, which was previously almost always 10, occasionally 9; with the introduction of the 7-result SERP on 13th August, the average has dropped below 9.5. This means that websites ranked 7th to 10th for the affected queries will see a sharp drop in traffic, in many ways similar to a penalty. These industries will also see increased competition, since the number of positions that deliver sustainable traffic has effectively decreased, although each remaining spot will attract a higher share of clicks. There are, unfortunately, wide variations in this tab.
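The arithmetic behind that falling average is straightforward: mix enough 7-result pages into a sample of mostly 10-result pages and the mean slips below 9.5. The counts below are made up for illustration.

```python
# Sketch: average SERP length across a sample of tracked queries.

def average_results(counts):
    """Mean number of organic results per SERP in the sample."""
    return sum(counts) / len(counts)

# Hypothetical sample: mostly traditional 10-result pages, with two
# queries that now return the newer 7-result page.
sample = [10, 10, 7, 10, 10, 7, 10, 10, 10, 10]
print(average_results(sample))  # 9.4, i.e. below the old near-10 average
```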
The next tabs cover exact-match and partial-match domain influence, measuring whether having an EMD or PMD improves ranking in the SERPs. Currently, the effect of the EMD filter is still visible, with 10% of EMDs being excluded from the top results. The EMD algorithm change was made with the aim of improving the SERPs, but innocent bystanders may have been hurt.
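To make the EMD/PMD distinction concrete, here is a rough heuristic classifier: a domain whose name equals the query keywords is exact-match, one that merely contains them is partial-match. This is an illustrative sketch with made-up domains, not Google’s actual classifier.

```python
# Sketch: classifying a domain as exact-match (EMD), partial-match (PMD),
# or neither, relative to a query. Purely an illustrative heuristic.

def match_type(domain, query):
    name = domain.split(".")[0]          # drop the TLD, keep the name
    keyword = query.replace(" ", "")     # "cheap shoes" -> "cheapshoes"
    if name == keyword:
        return "EMD"
    if keyword in name:
        return "PMD"
    return "none"

print(match_type("cheapshoes.com", "cheap shoes"))        # EMD
print(match_type("cheapshoesonline.com", "cheap shoes"))  # PMD
print(match_type("footwearhub.com", "cheap shoes"))       # none
```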
The final tab shows the big 10 websites that dominate the SERPs; they carry so much content that they effectively act as indexed databases in their own right. The 90-day view makes the general trend of their growing influence apparent, which means it will be nearly impossible to compete with them in their niches.
In conclusion, webmasters should take advantage of all available information to make informed judgments about their websites. A larger amount of data may be harder to analyze, but its depth yields better statistics.