Category Archives: Implementation

Is it a good idea to run two digital analytics tools in tandem?

These questions come up a lot:

“Is it a good idea to run a double analytics implementation on one site/mobile app (i.e. Adobe + Google)?”

“Can we install GA and Adobe Analytics on the same website or do they conflict?”

You can add as many tracking tools as you want; they won’t adversely affect each other. I’ve seen companies that have Adobe Analytics, Google Analytics, Parse.ly, Mixpanel, and Coremetrics installed on the same site concurrently. It might affect page load time, but it won’t affect data quality.

While you technically can have multiple tools in place, that doesn’t always mean you should. As a consultant I feel like I should have to put a dollar in a jar every time I say, “It depends,” but… it depends. Some of my clients do this; most don’t. There are reasons to do it, and reasons to avoid it.

Advantages of tracking with two different tools:

  • A second source of data can be very handy for validation purposes or as a backup in case something catastrophic happens with the other tool.
  • Adobe is strong in some areas where GA is weak, and vice versa. Having both implemented gives you the best of both worlds.
    Novice analysts will be able to ramp up on GA faster, but seasoned analysts might be frustrated by the limitations of GA and need the more sophisticated functionality of Adobe.
  • Most companies use AdWords, so it’s likely that GA is already in place anyway.

Disadvantages of tracking with two different tools:

  • $$$ – If you’re on GA360, then you’re paying for two tools.
  • Double the tools, double the maintenance. New tagging requests will likely need to be implemented separately on each tool.
  • The idea of “one source of truth” can get muddied if you don’t have strong governance in place to choose and enforce which one is your system of record. Analysts could potentially cherry-pick data and provide biased analysis.
  • You will inevitably end up down the data discrepancy wormhole when users report that the data from Adobe and GA don’t match perfectly. Expect to spend time and resources investigating those issues and/or educating your user base that it is normal and expected for two tools to yield slightly different numbers, and that a 2% variance is nothing to freak out about.

If the advantages are something that you need and the disadvantages are something you are equipped to deal with, then running two tools in tandem might be a good option for you.

If the cons outweigh the pros, save yourself the $$$ and the trouble of implementing and maintaining two tools.

Cross-posted from https://www.quora.com/Is-it-a-good-idea-to-run-a-double-analytics-implementation-on-one-site-mobile-app-i-e-Adobe-Google/answer/Halee-Kotara.

Common Analytics Gotchas for Project Managers

In the excitement of overhauling a site, analytics tracking requirements commonly get overlooked. It’s understandable that the focus is on the shinier, prettier aspects like the design and functionality rather than the less visible (but equally critical) aspects like tracking and measurement. This scenario happens more often than anyone would like to admit:

[Image: a post-launch conversation that ends with, “What data?”]

These are some analytics gotchas that every digital project manager should be aware of…


Analytics tracking is more complex than just enabling it or switching it on. It requires defining which data points will be captured on which pages, how they will be captured, what values get sent to the reports, and having a developer code it accordingly. There should be project phases for:

  1. Gathering analytics requirements from the business for what should be tracked.
    • “We need a report with user IDs.”
  2. Designing the implementation based on those requirements.
    • “Capture the user ID on each page in eVar12.”
  3. Implementing the analytics tracking code to collect the data.
    • s.eVar12 = "123456"
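
The three phases above might look like this in practice. This is a hypothetical sketch assuming an Adobe Analytics implementation, where the business requirement is a user ID report and the design maps the user ID to eVar12; the `buildPageViewData` function and its variable names are illustrative, not a real API:

```javascript
// Sketch: carrying a business requirement ("we need a report with user IDs")
// through the design ("capture user ID in eVar12") to implementation code.
// buildPageViewData and getUserId-style helpers are hypothetical.

function buildPageViewData(userId, pageName) {
  // Per the design spec: user ID goes into eVar12 on every page load.
  return {
    pageName: pageName,
    eVar12: userId,
  };
}

// On each page load, the developer would copy these values onto the
// Adobe "s" object and fire the page-view beacon, e.g.:
//   s.pageName = data.pageName;
//   s.eVar12   = data.eVar12;
//   s.t();  // sends the page-view server call
var data = buildPageViewData("123456", "home");
```

Keeping the requirement-to-variable mapping in one documented place like this makes it much easier for the next developer (or auditor) to see why eVar12 contains what it does.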

Involve the analytics team early in the process to allow sufficient time to design and implement the solution.


Analytics data can be captured on the page load, or on the click of some element. From a scalability and ease of implementation perspective, it is far easier to implement tracking on page load where a new URL is rendered. That’s not to say that onclick tracking should be avoided entirely, but be aware that onclick tracking lends itself to being more complex, more manual, and less scalable.

There are ways to make onclick tracking more scalable by adding CSS classes or data attributes used specifically for data collection, but the front-end developers would have to be provided with a spec for which attributes need to be added to which elements, what the values should look like, etc.
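
One common pattern, sketched below under assumptions of my own (the attribute names `data-track-id` and `data-track-action` are made up; they are exactly the kind of thing the spec for the front-end developers would define), is to tag clickable elements with `data-*` attributes and use a single delegated listener, so new elements become trackable without new JavaScript:

```javascript
// Sketch of scalable onclick tracking via data attributes.
// Attribute names are hypothetical and would come from the tracking spec.

// Pure helper: pull tracking values out of an element's dataset.
// (data-track-id becomes dataset.trackId, per the HTML dataset API.)
function getTrackingPayload(dataset) {
  if (!dataset.trackId) return null; // element is not tracked
  return {
    id: dataset.trackId,
    action: dataset.trackAction || "click", // default action if unspecified
  };
}

// In the browser, one delegated listener covers every tagged element:
//   document.addEventListener("click", function (e) {
//     var el = e.target.closest("[data-track-id]");
//     if (!el) return;
//     var payload = getTrackingPayload(el.dataset);
//     // hand payload off to the analytics tool here
//   });
var navClick = getTrackingPayload({ trackId: "nav-home", trackAction: "nav" });
var untracked = getTrackingPayload({});
```

The win is that adding tracking to a new button becomes a markup change governed by the spec, not a new piece of custom JavaScript per element.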


Onclick tracking does not happen automatically. All tracking based on click or hover interactions must be custom-defined and custom-coded.


Excessive onclick tracking can have an effect on cost. Certain tools like Adobe Analytics have a pricing model based on server calls. Using Adobe as an example, each Adobe contract specifies a certain amount of allotted server calls per month. A server call happens each time a packet of data is sent to Adobe, which means one server call each time a visitor loads a page, and one server call each time a visitor clicks on a tracked element. Gratuitously adding onclick tracking can therefore cause server call overages and additional costs.
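
A rough back-of-the-envelope sketch of the effect (all figures here are invented for illustration; check your own contract for real allotments and rates):

```javascript
// Hypothetical estimate of monthly server calls under a pricing model
// like Adobe's, where every page view and every tracked click sends
// one server call. The traffic numbers are invented for illustration.

function estimateMonthlyServerCalls(pageViews, trackedClicksPerPageView) {
  var clickCalls = pageViews * trackedClicksPerPageView;
  return pageViews + clickCalls;
}

// 10M page views/month with no click tracking: 10M server calls.
var withoutClicks = estimateMonthlyServerCalls(10e6, 0);

// Add tracked clicks averaging 1.5 per page view: 25M server calls --
// the click tracking more than doubles the billable volume.
var withClicks = estimateMonthlyServerCalls(10e6, 1.5);
```

Running this kind of estimate before approving a batch of new onclick tagging requests is a cheap way to avoid a surprise overage conversation later.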


Use a data layer and tag management tool. Industry best practice is to have a tool-agnostic data layer on all the pages of your site, and then a tag management tool like Tealium or Adobe Dynamic Tag Management (DTM) to map the data layer values to the appropriate variables in your reporting tool.
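
For instance, a tool-agnostic data layer might look like the following sketch. The object shape and names here are illustrative, not any vendor’s required schema, and the `mapToAdobe` function stands in for mapping rules that would actually live inside the tag manager:

```javascript
// Illustrative tool-agnostic data layer: the page publishes neutral
// business values, with no Adobe- or GA-specific names in page code.
var digitalData = {
  page: { name: "products:widgets", section: "products" },
  user: { id: "123456", loggedIn: true },
};

// Inside the tag manager, each tool gets its own mapping. A sketch of
// an Adobe mapping (variable assignments are hypothetical):
function mapToAdobe(dl) {
  return {
    pageName: dl.page.name,
    eVar12: dl.user.id, // per the design spec: user ID in eVar12
  };
}

// A GA mapping would read the same digitalData object, so the page
// only has to be tagged once, no matter how many tools are deployed.
var adobeVars = mapToAdobe(digitalData);
```

This separation is also what makes running two tools in tandem (or swapping one out later) far less painful: the page code never changes, only the mappings in the tag manager do.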


Resist the urge to track everything. Only track what is essential, and what will yield actionable information. It’s better to have 10 really solid reports with meaningful information, than 100 reports full of “maybe that will be interesting to look at someday” information.