Online analytics – Signals or Noise?


Online analytics sources – Signals or Noise?

A critique of some of the most common sources of online data.

Chart: Online data sources – Signals or Noise?

Google Webmaster Tools, Live’s Webmaster Center, Yahoo! Site Explorer and many SEO products provide (incomplete) information about your own site only. They report only the data their respective search engine or tool holds, so you cannot use them to compare your site on a like-for-like basis with competitors’ sites.

SEO/SEM tools: SEO tools are generally a grab-bag of utilities, neither sequenced, structured nor weighted to produce sub-scores and final scores. SEM tools are heavily PPC-oriented. Both types commonly fail to disclose where the data they report comes from or the basis for their output, especially concerning link data, which they provide on a delayed-reporting basis (i.e. not in real time).

SERP and keyword trackers check the search-results position of a site or page for a keyword query, so you have to enter both a URL and a keyword: they cannot check a keyword alone (which is what searchers actually type), nor do they weight or rank the search positions or tell you who holds them (one tool that does do these things is http://ranktank.eu/).
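As a rough illustration of what such a tracker does, the sketch below (Python) finds the rank of a given domain within an already-fetched, ordered list of search results for one keyword. The results list, the domain value and the fetching step are placeholders invented for illustration; they are not tied to any particular tool’s API.

```python
from urllib.parse import urlparse

def serp_position(results, domain):
    """Return the 1-based rank of `domain` in an ordered list of result URLs,
    or None if the domain does not appear at all."""
    for rank, url in enumerate(results, start=1):
        host = urlparse(url).netloc.lower()
        if host == domain or host.endswith("." + domain):
            return rank
    return None

# Hypothetical results for a single keyword query, best match first.
results = [
    "https://en.wikipedia.org/wiki/Brand_audit",
    "https://www.example.com/services/brand-audit",
    "https://blog.example.org/what-is-a-brand-audit",
]
print(serp_position(results, "example.com"))  # -> 2
```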

You cannot use a report from search engine or social network A to analyze how competitive your brand may be in search engine or social network B.

Social media monitoring services

Social media and online monitoring services monitor conversations around your brand and your competitors, which at best highlights individuals you should be talking to (is that CRM or customer support?) and at worst delivers verbatim transcripts of ten thousand postings. They:

  • contain a lot of noise, fail to differentiate (sometimes called disambiguate) between various uses of a word, are skewed by false positives and don’t recognize cynicism and irony (which are the lingua franca of the social Web);
  • show a snapshot of how you’re engaging with the social networks they subscribe to data from, but they don’t cover all social networks (which are only one part of the social Web), have almost zero coverage of forums and bulletin boards, and are weak in the blogosphere;
  • listen to the chatter that surfaces on social networks and report on the end result, but they don’t read the flow of underlying data that produces the chatter, which would enable you to intervene at the beginning of the process rather than at the end of it;
  • don’t show how the overall Web engages with you in terms of the underlying algorithmically driven signals that drive online visibility;
  • largely treat all social media platforms and all user actions as equal, which ignores the reality that some social networks are more influential than others and that some types of social interactions have a higher value than others (a rough scoring sketch follows this list);
  • are largely “blind” to the traditional Web and are thus unable to provide actionable insight into the flow of data (the signals) between it and the social Web. Thus, they can’t advise you on where and how to insert your data into this flow, or on the format that data should be in so that it can be understood semantically by machines as well as intuitively by humans.
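To make the weighting point above concrete, here is a minimal sketch of scoring interactions rather than merely counting them. The platform and action weights are invented purely for illustration; they are assumptions, not published values from any monitoring service or from SignalsRank.

```python
# Illustrative weights only; real values would need to be derived from data.
PLATFORM_WEIGHT = {"twitter": 1.0, "facebook": 0.8, "niche_forum": 1.5}
ACTION_WEIGHT = {"mention": 1.0, "share": 2.0, "like": 0.5}

def weighted_score(interactions):
    """Score (platform, action, count) tuples instead of treating every
    platform and every action as equal."""
    return sum(
        PLATFORM_WEIGHT.get(platform, 1.0) * ACTION_WEIGHT.get(action, 1.0) * count
        for platform, action, count in interactions
    )

interactions = [("twitter", "share", 10), ("facebook", "like", 200), ("niche_forum", "mention", 5)]
print(weighted_score(interactions))  # 107.5 – a weighted total, not a raw count of 215
```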
Online data sources – example 1

Most of the tools provided by social networks themselves require you to be logged in to your own account/profile – they can’t help you benchmark yourself against another profile or score profiles against one another.

Traditional market research services

These services focus narrowly on individual users or single social networks rather than on the strategic picture of what the overall Internet is signalling. They:

  • rely heavily on opt-in panels, focus groups, polls and surveys, so they have to extrapolate from a small user sample;
  • help you understand the inner thoughts of a handful of social media users, but not the big-picture view formed by the thousands of signals coming from all sources;
  • like social media and online monitoring services, fail to weight different platforms and channels by their relative importance – the biggest social networks are not usually the most influential for niche topics – and treat all online interactions as equal (ignoring that some are short-lived while others are written to the graph);
  • tend to be PPC- and paid-media-oriented (with plenty of old-line metrics like OTC), which ignores the fact that Web and social search rankings are editorially driven and that a correctly configured website and social profiles can influence each other’s search positioning far more than paid placement;
  • don’t explain how to improve your brand’s performance in Web or social search, because they don’t connect the user numbers with the technical configuration and code quality of the assets or platforms counted.
Online data sources – example 2

In common with social media and online monitoring services, traditional market research reports count users and actions instead of ranking and scoring them.

In summary, the Web and social Web are merging. Visibility on any given social network can be significantly influenced (positively or negatively) by acts or omissions that occur off that network, including interactions with websites. The reality is that most sources of online data are either account-specific or oriented to Big Brands: smaller companies have to work harder to find relevant data. And social media and online monitoring services don’t relate social media to general Web presence.

SignalsRank analytics compared to online data sources

SignalsRank reads and scores the flow of data between websites, social networks and search engines. This flow carries signals used by Google, Facebook, Bing, Twitter and others to algorithmically determine Web and social media visibility. We are, to our knowledge, the only company that connects underlying technical factors (signals embedded in code, links, social objects) that are visible to machines but not to people, to the surface results (tweets, likes, tags, mentions) that people interact with.
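One concrete example of such code-level signals is Open Graph and schema.org markup in a page’s <head>: tags that search engines and social networks read when deciding how to interpret and display the page, but that visitors never see. The sketch below, which uses only Python’s standard library, pulls such tags out of raw HTML; it is an illustration of the idea, not a description of SignalsRank’s implementation.

```python
from html.parser import HTMLParser

class SignalTagParser(HTMLParser):
    """Collect <meta> tags that carry machine-readable signals,
    e.g. Open Graph (og:*) properties and schema.org itemprop values."""
    def __init__(self):
        super().__init__()
        self.signals = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        key = attrs.get("property") or attrs.get("itemprop")
        if key:
            self.signals[key] = attrs.get("content", "")

# Hypothetical page fragment; a real audit would fetch the live HTML.
html = """
<head>
  <meta property="og:title" content="Online analytics - Signals or Noise?">
  <meta property="og:type" content="article">
  <meta itemprop="datePublished" content="2013-01-01">
</head>
"""
parser = SignalTagParser()
parser.feed(html)
print(parser.signals)
```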
