Categories
Google Analytics Reporting

How to Figure Out Origin of Direct Traffic

Original question from Quora:

How do I figure out where direct traffic in Google Analytics found my website in the first place? No one just enters a URL of a website first time visiting it; they need to know about the website through ads, organic search, referrals etc.

The short answer is: you can’t. The fact that the traffic is being classified under “Direct” by definition means that there is no referring information for Google Analytics to use to determine where the user came from.

You’re right that it seems strange and unlikely that someone’s first visit to a website would start by them typing in the URL, so here are a couple of possible explanations for that:

  • It’s not actually the user’s first visit. These days browsers are deleting cookies more aggressively and more users are adopting ad/tracking blockers. GA’s new / return visitor dimension depends on the GA cookie being present. If the GA cookie gets deleted for whatever reason, when that user comes back (even if they have been to your site dozens of times) they will look like a new visitor in the GA data.

  • Referral data or UTM parameters are getting stripped. It’s not unheard of for those details to get lost if there are a lot of redirects in between the origin site and the destination site. There are also some sites that are configured to strip any non-whitelisted URL parameters before the page (and therefore the GA tracking code) loads.

  • Vanity URLs. If you’re doing offline marketing where you’re using vanity URLs like http://website.com/freegift and you don’t do anything special to flag those referrals in GA, they’ll come through as direct.
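One way to keep those vanity URL visits out of direct is to make sure the vanity URL ultimately lands the visitor on a URL that carries campaign parameters, whether via a server-side redirect or something as simple as the client-side sketch below. The destination page and UTM values here are made-up placeholders.

    // Hedged sketch for a vanity URL page like /freegift: immediately forward the
    // visitor to the real landing page with campaign parameters appended, so the
    // session gets attributed to the offline campaign instead of "direct".
    // The destination URL and UTM values are placeholders.
    window.location.replace(
      'https://website.com/landing-page' +
      '?utm_source=offline&utm_medium=print&utm_campaign=freegift'
    );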

My heart always sinks when I’m digging into some anomaly and find that it’s related to direct traffic, because that’s virtually a dead end for my investigation. You can try breaking down direct by landing page to get some sense of which page the user started on. If it’s a page deeper in the site, it might give you a hint as to how/why they got there. But odds are it’s the homepage, and you’ll be stuck in another analytical dead end.

I wish I had a different and more helpful answer for you, but this is one of the painful and annoying realities of digital analytics!

Categories
Analysis

How to Identify Those Pesky Bots

Tell me if you’ve heard this one…

You see a major spike in traffic.

You get excited.

You congratulate yourself for putting together such an effective marketing campaign and/or having such a great brand. You feel like the most popular kid in school. There is an extra strut in your step.

Then…

There are no corresponding conversions. Bounce rate is through the roof. All this exciting new incremental traffic is from St. Petersburg, Russia.

You have a bot problem.

Here are some tips on where you can look in your data to confirm:

  • Operating System: Linux or "not set"
  • Browser: "not set" or "unknown"
  • Browser Version: many versions prior to the most current version
  • Location: coming from the same city
  • ISP: coming from the same ISP
  • New vs. Returning: all new users
  • Pageviews per Visit: almost all the suspect traffic has only 1 pageview per visit and a 100% bounce rate, OR almost all the suspect traffic has an inordinately high number of pageviews per visit
  • Landing Page: this isn't necessarily an indicator in and of itself, but seeing where this suspect traffic is entering the site can give you some hints. Example: if a lot of the landing pages are a search results page with a keyword of a specific SKU, it's probably a crawler making a database of products.
  • Pages: non-bounced traffic is consistently hitting the same pages in the same sequence
  • User Agent String: this is not available by default in Google Analytics or Adobe Analytics reporting, but you can easily copy it to a custom dimension or eVar for further investigation (see the sketch below)
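For the user agent string, here is a minimal sketch of the kind of on-page or tag manager JavaScript you might use. The custom dimension index and eVar number are placeholders, not prescribed slots; map them to whatever is free in your own implementation.

    // Hedged sketch: copy the browser's user agent string into reporting variables.
    // "dimension15" and eVar30 are placeholders -- use whichever custom dimension
    // index / eVar number is available in your implementation.

    // Google Analytics (Universal Analytics / analytics.js):
    ga('set', 'dimension15', navigator.userAgent);

    // Adobe Analytics (AppMeasurement):
    s.eVar30 = navigator.userAgent;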

If it is a bot, you can:

  • use view filters in Google Analytics to exclude it from your reporting view. (This will only exclude it from that date going forward.)
  • adjust your segment definition in Adobe Analytics to exclude it from your virtual report suite. (Depending on the scope of your segment, this may or may not exclude it from that date going forward.)
  • invest in a bot detection tool like White Ops to proactively identify and block bots before they can even muck up your data.
Categories
Google Analytics

Understanding Time on Page / Time on Site / Bounces in Google Analytics

I have clients who put a lot of focus on time-spent metrics like “time on page” or “average session duration,” and there is a lot of time spent on my part making sure everyone understands how those metrics are counted (and how misleading they can be). It’s a common misconception that there is a countdown timer ticking on each page, and that said counter is only running when someone is actively looking at / scrolling down the page in the active browser tab, and that the counter stops when someone closes the tab or browser.

Sadly, those are completely inaccurate assumptions and often lead people to misinterpret what’s in those reports. All time-spent metrics are calculated based on timestamps of when packets of data are sent to your reporting tool. The reporting tools have no idea whether or not your users were actively on your site, only that beacons of data were sent. Here’s a good example of why time-spent metrics shouldn’t be blindly trusted:

  • Best case scenario: If you were on Page A at 12:00pm and stayed engaged reading the page for a few minutes and navigated to Page B at 12:02pm, Google Analytics would show that your time on Page A was 2 minutes. Great!
  • More realistic scenario: If you were on Page A at 12:00pm, got distracted and looked at another site in another browser tab for 10 minutes, then had to take a 10 minute phone call, then finally returned and navigated to Page B at 12:20pm, GA would show that your time on Page A was 20 minutes. No it wasn’t, yikes!

Rather than using these types of passive metrics which can be misleading, I recommend using metrics that show users actively engaging: page views per session, scroll event tracking, page depth, etc. But if you do want to continue using the time-spent reports, it’s important to understand exactly what is being calculated and how.

  1. All time-spent metrics are simple difference calculations based on the timestamps of when hits are sent to GA.
  2. Average Session Duration is the difference between the timestamps of the first and last hits of the session (and that final hit can be either a pageview or an event). If only one hit happened in the session, the average session duration will be 0 since there is no second timestamp to perform the calculation. (Google Help Article)
  3. Average Time on Page is the time difference between subsequent pageviews. The final page (or, in the case of a single-page visit, the only page) will have a time on page of 0 because there is no subsequent timestamp for the calculation.
    • Note that events are not taken into account in time on page calculations.
  4. Bounces are sessions that trigger only one hit, and that hit can be a pageview or an event. If a user visited only one page and triggered a single pageview but scrolled down enough to make a scroll event fire, that’s 2 hits, and therefore the session will not be a bounce. (Google Help Article)
    • Note that GA has an option for “non-interaction events,” which are designed to be ignored for the purposes of bounce rate (see the sketch below).
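For example, if you track scroll depth but don’t want those events to keep single-page sessions from counting as bounces, you can flag them as non-interaction hits. Here is a minimal analytics.js sketch; the event category and action values are placeholders.

    // Hedged sketch: send a scroll-depth event that GA ignores when calculating
    // bounce rate. The category and action values are placeholders.
    ga('send', 'event', 'Scroll Depth', '50%', {
      nonInteraction: true
    });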

I always find it helpful to see detailed examples, so I generated data for a few scenarios to illustrate:

Session 1:
3 Pageviews, 0 Events

Page 1 @ 2:58pm
Page 2 @ 3:00pm
Page 3 @ 3:02pm

  • Avg Session Duration = 4 minutes (3:02pm minus 2:58pm)
  • Bounce Rate = 0%
  • Page 1 Avg Time on Page = 2 minutes (3:00pm minus 2:58pm)
  • Page 2 Avg Time on Page = 2 minutes (3:02pm minus 3:00pm)
  • Page 3 Avg Time on Page = 0 minutes (there is no subsequent timestamp to calculate)

Session 2:
3 Pageviews, 2 Events

Page 1 @ 3:02pm
Page 2 @ 3:03pm
Page 3 @ 3:04pm
Event A @ 3:05pm
Event B @ 3:10pm

  • Avg Session Duration = 8 minutes (3:10pm minus 3:02pm)
  • Bounce Rate = 0%
  • Page 1 Avg Time on Page = 1 minute (3:03pm minus 3:02pm)
  • Page 2 Avg Time on Page = 1 minute (3:04pm minus 3:03pm)
  • Page 3 Avg Time on Page = 0 minutes (there is no subsequent pageview timestamp to calculate)

Session 3:
1 Pageview, 1 Event

Page 1 @ 3:10pm
Event A @ 3:12pm

  • Avg Session Duration = 2 minutes (3:12pm minus 3:10pm)
  • Bounce Rate = 0%
  • Page 1 Avg Time on Page = 0 minutes (there is no subsequent pageview timestamp to calculate)

Session 4:
1 Pageview, 0 Events

Page 1 @ 3:13pm

  • Avg Session Duration = 0 minutes (there is no subsequent timestamp to calculate)
  • Bounce Rate = 100%
  • Page 1 Avg Time on Page = 0 minutes (there is no subsequent pageview timestamp to calculate)

Session 5:
3 Pageviews, 1 Event

Page 1 @ 3:32pm
Page 2 @ 3:34pm
Event A @ 3:36pm
Page 3 @ 3:38pm

  • Avg Session Duration = 6 minutes (3:38pm minus 3:32pm)
  • Bounce Rate = 0%
  • Page 1 Avg Time on Page = 2 minutes (3:34pm minus 3:32pm)
  • Page 2 Avg Time on Page = 4 minutes (3:38pm minus 3:34pm)
    • NOTE: this is ignoring the event and is calculating the time difference from the previous pageview
  • Page 3 Avg Time on Page = 0 minutes (there is no subsequent pageview timestamp to calculate)
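If it helps to see the same logic in code, here is a rough sketch (an illustration of the math, not GA’s actual processing) that reproduces the Session 5 calculations from the hit timestamps:

    // Hedged sketch that mimics the time-spent math using Session 5's hits.
    var hits = [
      { type: 'pageview', page: 'Page 1',  time: new Date('2018-10-10T15:32:00') },
      { type: 'pageview', page: 'Page 2',  time: new Date('2018-10-10T15:34:00') },
      { type: 'event',    name: 'Event A', time: new Date('2018-10-10T15:36:00') },
      { type: 'pageview', page: 'Page 3',  time: new Date('2018-10-10T15:38:00') }
    ];

    // Session duration = last hit timestamp minus first hit timestamp (any hit type).
    var sessionMinutes = (hits[hits.length - 1].time - hits[0].time) / 60000; // 6

    // Time on page = difference between consecutive pageview timestamps only.
    var pageviews = hits.filter(function (h) { return h.type === 'pageview'; });
    pageviews.forEach(function (pv, i) {
      var next = pageviews[i + 1];
      var minutes = next ? (next.time - pv.time) / 60000 : 0; // last page gets 0
      console.log(pv.page + ': ' + minutes + ' minute(s)');
    });
    // Page 1: 2, Page 2: 4 (the event in between is ignored), Page 3: 0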

Hopefully that clarifies exactly how things are counted, and will empower you to interpret those reports correctly!

Categories
Reporting

Explanations for Data Discrepancies Between Reporting Tools

If you’ve ever been dragged kicking and screaming into a data discrepancy discussion or forced to spend your time investigating a 2% variance between your ad-side reporting and your digital analytics reporting, this post is for you.

I’m not saying that discrepancies aren’t a concern or shouldn’t be investigated; it’s important that everyone feels confident in the quality of data. But I am asking you NOT to make me spend 10 hours looking into why your Facebook clicks in Facebook Analytics don’t 100% match your Facebook referral sessions in Google Analytics. Here’s why…

Different tools process data differently

No two reporting tools will ever be a 100% match. Data collection methodologies vary, the way data is processed varies, how sessions are defined can vary, etc. My general rule of thumb is that if it’s within 10-15% and the trendlines are the same, you’re in excellent shape!

Ad-side click data will always be higher than site-side page load data

Be aware that most ad-side reporting tools (Google AdWords, Facebook, Twitter, Pardot) collect data on the click, whereas site-side digital analytics tools (Adobe Analytics and Google Analytics) collect data on page load. If the user abandons the landing page or clicks to another page before the landing page fully loads, the digital analytics tracking code with the campaign parameters might not execute. In that case, the user will be counted in the ad-side reporting but not the digital analytics reporting.

Ad-side reporting tools will always show higher numbers compared to site-side reporting tools, and this can be exacerbated if you have slow page load time.

Not all metrics are comparable

Different metrics are processed and calculated differently, so be aware of which metrics you’re trying to compare. Don’t mix and match simple counter metrics with ones that are deduplicated across a session or user:

  • Clicks and pageviews are typically simple counters and will increment each time they happen. These types of metrics are always higher than the ones that follow in this list.
  • Unique clicks, unique pageviews, sessions, and visits are deduplicated at the session level, so even if someone clicks or looks at that page multiple times in a session, it would only be counted once. These metrics are lower than clicks and pageviews for that reason.
  • Users and visitors are deduplicated across the user’s lifespan (or until their visitor ID cookie expires or is cleared). These metrics are lower than sessions.

Not all conversion rates are comparable

The definition of a “conversion rate” can vary depending on the context. Be sure you know what metrics are being used in the conversion rate calculation before trying to compare them.

Ad-side reporting often shows “clicks ÷ impressions” while site-side reporting typically shows “orders ÷ visits”.

Not all tools use the same attribution

Different tools use different attribution models which can skew data. Typically ad-side reporting tools will give 100% of all conversions to themselves because they are not taking other marketing touchpoints into account.

Digital analytics tools have to share attribution across multiple marketing channel touchpoints, so the data per channel or campaign is often lower than what the ad-side reporting tool shows.

For example, consider this user journey:

  1. two days ago, a visitor came to your site via a Facebook ad
  2. one day ago, a visitor returned to your site via a Google paid search ad
  3. today the visitor returned to your site via an email and made a $100 purchase

Here’s how the data will look in the various reporting tools:

  • Facebook reporting is going to claim $100 for itself
  • Google Ads reporting is going to claim $100 for itself
  • Email reporting is going to claim $100 for itself
  • Digital analytics reporting has to share that $100 across the 3 touchpoints, and the allocation will vary depending on your digital analytics tool’s attribution defaults or configurations. Assuming it’s configured for last non-direct click attribution, the digital analytics reporting will show (see the sketch after this list for the logic):
    • Facebook ad = $0
    • Google ad = $0
    • Email = $100
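To make the attribution difference concrete, here is a rough sketch of last non-direct click logic. It illustrates the concept only; it is not any tool’s actual attribution engine.

    // Hedged sketch of "last non-direct click" attribution: all of the order's
    // revenue is credited to the most recent non-direct touchpoint.
    function lastNonDirectClick(touchpoints, revenue) {
      for (var i = touchpoints.length - 1; i >= 0; i--) {
        if (touchpoints[i] !== 'direct') {
          var credit = {};
          credit[touchpoints[i]] = revenue;
          return credit;
        }
      }
      return { direct: revenue }; // every touchpoint was direct
    }

    // The journey above: Facebook ad -> Google paid search -> email.
    lastNonDirectClick(['facebook', 'paid search', 'email'], 100);
    // returns { email: 100 } -- Facebook and paid search get $0 in this model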

Ad blockers can affect data collection

Some users block tracking in their browsers which could prevent your digital analytics tool from collecting data. (Presumably it would also prevent the ad-side data from being collected, but that completely depends on what blocker utility the user is using and what it does/doesn’t block.)

Time zones can skew daily numbers

If two reporting tools are configured for different time zones, the data will not align when you’re looking at daily numbers.

Revenue can be defined in many different ways

Digital analytics revenue data is typically pure product demand revenue collected the moment the order is placed. It is not fair to compare to shipped revenue, which accounts for situations like fraudulent orders or order cancellations. Digital analytics data also doesn’t normally account for returned purchases.

Feel free to copy / paste that next time you get the dreaded data discrepancy questions!

Categories
Google Analytics

When Does a Google Analytics Session Expire, Really?

One of the many ways that Google Analytics and Adobe Analytics differ is their individual definitions of what constitutes a session. Both abide by the industry standard of a session ending after 30 minutes of inactivity*.

* Unless you’ve changed the session timeout setting in a Google Analytics property, or the visit definition/visit timeout setting in an Adobe Analytics virtual report suite.

But Google Analytics has an additional trigger for ending a session that can be a real gotcha when trying to matchback data across the two tools: it ends a session and starts a new one “every time a user’s campaign source changes.” That statement always perplexes me because I’m not sure whether to interpret that as:

  1. when any UTM parameter changes, or …
  2. when only the utm_campaign parameter changes

I always assumed #1, but I decided to test this out so I’d be 100% sure. In each test scenario below, I opened a new guest browser window so none of my activity from each test would overlap.

If all other UTM parameters are the same but utm_medium is different, it will start a new session.

The proof: I loaded two URLs with identical values for utm_source, utm_campaign, and utm_id, but different values for utm_medium.

kotaraindustries.com?utm_medium=affiliate&utm_source=halee.com&utm_campaign=201810101234&utm_id=201810101234
kotaraindustries.com?utm_medium=partner&utm_source=halee.com&utm_campaign=201810101234&utm_id=201810101234

When I pull up the report by one of the other parameters, it shows two sessions, validating that a new session was started when I loaded that second page.

If all other UTM parameters are the same but utm_source is different, it will start a new session.

The proof: In a new, fresh guest window, I loaded two URLs with identical values for utm_medium, utm_campaign, and utm_id, but different values for utm_source.

kotaraindustries.com?utm_medium=affiliate&utm_source=benicetobears.com&utm_campaign=201810101232&utm_id=201810101232
kotaraindustries.com?utm_medium=affiliate&utm_source=benicetootters.com&utm_campaign=201810101232&utm_id=201810101232

Again, two sessions:

If all other UTM parameters are the same but utm_campaign is different, it will start a new session.

The proof: In a newer, fresher guest window, I loaded two URLs with identical values for utm_medium, utm_source, and utm_id, but different values for utm_campaign.

kotaraindustries.com?utm_medium=affiliate&utm_source=nxt.com&utm_campaign=201810101230&utm_id=201810101230
kotaraindustries.com?utm_medium=affiliate&utm_source=nxt.com&utm_campaign=201810101231&utm_id=201810101230

How many sessions? Two sessions!

If all other UTM parameters are the same but utm_content is different, it will start a new session.

The proof: In the newest, freshest of guest windows, I loaded two URLs with identical values for utm_medium, utm_source, and utm_campaign, but different values for utm_content.

kotaraindustries.com?utm_medium=affiliate&utm_source=halee.com&utm_campaign=201810101235&utm_id=201810101235&utm_content=version-a
kotaraindustries.com?utm_medium=affiliate&utm_source=halee.com&utm_campaign=201810101235&utm_id=201810101235&utm_content=version-b

Two sessions:

If all other UTM parameters are the same but utm_term is different, it will start a new session.

The proof: I loaded two URLs (in yet another guest window) with identical values for utm_medium, utm_source, and utm_campaign, but different values for utm_term.

kotaraindustries.com?utm_medium=affiliate&utm_source=halee.com&utm_campaign=201810101237&utm_id=201810101237&utm_term=keyword-1
kotaraindustries.com?utm_medium=affiliate&utm_source=halee.com&utm_campaign=201810101237&utm_id=201810101237&utm_term=keyword-2

This is getting boring and predictable, two sessions:

If all other UTM parameters are the same but utm_id is different, it will start a new session.

The proof: Another test, another guest window. I loaded two URLs with identical values for utm_medium, utm_source, and utm_campaign, but different values for the elusive and not-often-used utm_id.

kotaraindustries.com?utm_medium=affiliate&utm_source=wwe.com&utm_campaign=201810101228&utm_id=201810101228
kotaraindustries.com?utm_medium=affiliate&utm_source=wwe.com&utm_campaign=201810101228&utm_id=201810101229

Dos sesiones:

So there you go. Irrefutable evidence that a new session starts on change of any of the following:

  • utm_medium
  • utm_source
  • utm_campaign
  • utm_content
  • utm_term
  • utm_id
Categories
Adobe Analytics, Google Analytics, Implementation

Is it a good idea to run two digital analytics tools in tandem?

These questions come up a lot:

“Is it a good idea to run a double analytics implementation on one site/mobile app (i.e. Adobe + Google)?”

“Can we install GA and Adobe Analytics on the same website or do they conflict?”

You can add as many tracking tools as you want; they won’t adversely affect each other. I’ve seen companies that have Adobe Analytics, Google Analytics, Parse.ly, Mixpanel, and Coremetrics installed on the same site concurrently. It might affect page load time, but it won’t affect data quality.

While you technically can have multiple tools in place, it doesn’t always mean you should. As a consultant I feel like I should have to put a dollar in a jar every time I say, “It depends,” but… it depends. Some of my clients do this, most don’t. There are reasons to do it, and reasons to avoid.

Advantages of tracking with two different tools:

  • A second source of data can be very handy for validation purposes or as a backup in case something catastrophic happens with the other tool.
  • Adobe is strong in some areas where GA is weak, and vice versa. Having both implemented gives you the best of both worlds.
    • Novice analysts will be able to ramp up on GA faster, but seasoned analysts might be frustrated by the limitations of GA and need the more sophisticated functionality of Adobe.
  • Most companies use AdWords, so it’s likely that GA is already in place, anyway.

Disadvantages of tracking with two different tools:

  • $$$ – Assuming you’re on GA360, you’re now paying for two tools.
  • Double the tools, double the maintenance. New tagging requests will likely need to be implemented separately on each tool.
  • The idea of “one source of truth” can get muddied if you don’t have strong governance in place to choose and enforce which one is your system of record. Analysts could potentially cherry-pick data and provide biased analysis.
  • You will inevitably end up down the data discrepancy wormhole when users report that the data from Adobe vs. GA don’t match perfectly. Expect to spend time and resources investigating those issues and/or educating your user base that it is normal and expected for two tools to yield slightly different data and to not freak out about a 2% variance.

If the advantages are something that you need and the disadvantages are something you are equipped to deal with, then running two tools in tandem might be a good option for you.

If the cons outweigh the pros, save yourself the $$$ and the trouble of implementing and maintaining two tools.

Cross-posted from  https://www.quora.com/Is-it-a-good-idea-to-run-a-double-analytics-implementation-on-one-site-mobile-app-i-e-Adobe-Google/answer/Halee-Kotara.

Categories
Careers

4 Reasons Why All Digital Careers Need to Understand Digital Analytics

Data is everywhere. It’s captured with every step we take and every button we click. If you’re conducting business online, data should be at the heart of everything you do. Here are four reasons why it is so important for everyone on the digital team to be able to understand and apply analytics.

  1. The days are over where it took months to gather data, and days to run a report. In the digital world, campaigns and content can be changed in minutes. Online interactions can be tracked and reported in real-time. The data we have about our customers today is more detailed, more robust, and processed more quickly than ever before. In order to effectively leverage real-time data, the ultimate end users of the data must take a more active role in reporting and analyzing it.
  2. The days are over where being an analyst required a PhD in statistics and an office in the basement next to the mainframe computer. Thanks to technology, many of the digital analytics tools like Google Analytics and Adobe Analytics are designed to do the statistical heavy lifting. Whether what’s needed is advanced segmentation queries or complex data visualizations, the tools make it achievable to accomplish these tasks without an advanced stats degree. Analytics tools these days help make reporting more visual, creating a story to tell utilizing the data. Even the least mathy among us can be empowered to do very sophisticated analysis once armed with the fundamentals of digital analytics.
  3. The days are over where marketers and copywriters relied on gut feelings to make decisions. It’s not enough to proclaim yourself a subject matter expert or cite your years of experience anymore, especially when it comes to justifying digital business decisions. With the quality and quantity of data at our disposal, if you aren’t using your digital analytics data, you are ignoring your customers. This is self-sabotage in an increasingly customer-centric digital world. Data can also be used to improve website user flows and calculate ROI, which can be very useful to marketers. In order to know if your digital efforts are a success, it is imperative to set goals and measure performance against them. If you can’t measure it, you can’t improve it.
  4. The days are over where the marketers and copywriters worked separately from the analysts. The best digital analysts aren’t the people with the deepest math knowledge or the flashiest spreadsheet skills. The best digital analysts are the people with subject matter expertise in the topic they are analyzing. A report is just numbers if it doesn’t have context around it. A content manager who deeply understands the content hierarchy and flow of the site with only 1 month of analytics experience will likely find deeper insights and opportunities in the data than an analyst with 5 years of experience who doesn’t understand the content. It’s no longer a world where the marketers and copywriters merely work alongside the analysts. The marketers and content creators *are* the analysts.

There will always be a need for pure analysts or “data scientists” to handle complex data challenges. But thanks to the hard work of these data scientists, digital analytics tools exist that make data accessible to all of us. Not only is data everywhere, data is for everyone. Whether you’re in charge of marketing, content, user experience, or technical infrastructure, there is digital analytics data that exists that will help you better understand your customers and more effectively do your job. If you want to take your career to the next level, incorporate analytics into everything you do even if “analyst” isn’t in your title.

Categories
Reporting

Adding Historical and Statistical Context to Your Trended Reports

Traffic and conversion numbers go up and down every day. When looking at trended data, it can be difficult to know when an increase or decrease is truly significant. Sometimes our stakeholders can unnecessarily panic about a dip, or overly congratulate themselves about a spike. 

This post shows how to add historical and statistical context to trended data using a simple standard deviation calculation in Excel. There are also tips for how to visualize this data to make the statistical concepts very simple for the report recipients to read and understand.

Why this is helpful… 

  • Provides context for trended data
  • Accounts for seasonality
  • Removes the guesswork from deciding whether an increase or decrease requires action
  • Especially useful in post-launch scorecards to help stakeholders decide whether or not to roll back changes

How to do it… 

For a given metric, pull historical data as far back as possible. For example, if you’re analyzing weekly homepage visits, pull historical weekly data back at least 53 weeks, but ideally 2-5 years if you have the data to support it. The more historical data you have, the better.

In Excel, use the STDEV function to find the standard deviation across all the historical values for that metric. This calculation will yield a number that gives the normal range of variance for that set of values:
[Image: STDEV calculation in Excel]

Using the standard deviation in conjunction with the data points from the prior year, it is possible to create a series of contextual bands on your chart. Then when the current year’s data is overlaid on top of those bands, it makes it very clear whether or not this year’s performance is within the normal range of variance:
[Image: trended chart with standard deviation bands]

  • 1 standard deviation above or below = acceptable
  • 2 standard deviations above or below = outperforming/underperforming
  • 3 standard deviations above or below = requires attention

To make the banded chart, you will need a summary table that looks something like this:
[Image: summary data table]

  • Prior Year – Data point from the previous year for the same week.
  • This Year – Data point from this year, which is the central point of this report.
  • Attention Required – A formula that subtracts two standard deviations from the prior year data point for that week. This will be the bottom threshold of the banded chart.
  • Underperforming / Outperforming / Great – The standard deviation value. We will use this value to make a stacked area chart.
  • Acceptable – Also used for the stacked area chart, but because the acceptable band spans one standard deviation above and one standard deviation below the prior year value, this value needs to be the standard deviation multiplied by 2. (A sketch of these calculations follows this list.)
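Here is a rough sketch of the same math outside of Excel, in case it helps to see the band calculations spelled out. The historical numbers are placeholders.

    // Hedged sketch of the banded-chart math. "history" would hold as many weeks
    // of historical values as you have; these numbers are placeholders.
    var history = [1200, 1340, 1280, 1150, 1420, 1390, 1310]; // weekly visits

    // Sample standard deviation (what Excel's STDEV function calculates).
    var mean = history.reduce(function (a, b) { return a + b; }, 0) / history.length;
    var variance = history.reduce(function (sum, x) {
      return sum + Math.pow(x - mean, 2);
    }, 0) / (history.length - 1);
    var stdev = Math.sqrt(variance);

    // Band values for one week, anchored to the prior year's data point.
    var priorYear = 1300;                          // same week last year
    var attentionRequired = priorYear - 2 * stdev; // bottom threshold of the chart
    var bandWidth = stdev;                         // underperforming / outperforming / great
    var acceptableWidth = 2 * stdev;               // one standard deviation above and below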

When configuring the chart in Excel, the prior year and current year should be line chart types. In order to create the bands, configure all the standard deviation-related data points as stacked area charts. All the data series should remain on the same primary axis.
[Image: Excel chart series and chart type configuration]

What to do with the information…  

If a key performance indicator dips into the “Attention Required” zone, that means performance has been very negatively affected and it should be investigated and addressed immediately. In the case of a site or page overhaul or campaign launch, the team should consider rolling back.

Categories
Implementation

Common Analytics Gotchas for Project Managers

In the excitement of overhauling a site, analytics tracking requirements commonly get overlooked. It’s understandable that the focus is on the shinier, prettier aspects like the design and functionality rather than the less visible (but equally critical) aspects like tracking and measurement. This scenario happens more often than anyone would like to admit:

[Image: "What data?"]

These are some analytics gotchas that every digital project manager should be aware of…


Analytics tracking is more complex than just enabling it or switching it on. It requires defining which data points will be captured on which pages, how they will be captured, what values get sent to the reports, and having a developer code it accordingly. There should be project phases for:

  1. Gathering analytics requirements from the business for what should be tracked.
    • “We need a report with user IDs.”
  2. Designing the implementation based on those requirements.
    • “Capture the user ID on each page in eVar12.”
  3. Implementing the analytics tracking code to collect the data.
    • s.eVar12 = "123456";

Involve the analytics team early in the process to allow sufficient time to design and implement the solution.


Analytics data can be captured on the page load, or on the click of some element. From a scalability and ease of implementation perspective, it is far easier to implement tracking on page load where a new URL is rendered. That’s not to say that onclick tracking should be avoided entirely, but be aware that onclick tracking lends itself to being more complex, more manual, and less scalable.

There are ways to make onclick tracking more scalable by adding CSS classes or data attributes specifically for data collection, but the front-end developers would have to be provided with a spec for which classes or attributes need to be added to which elements, what the values should look like, etc.
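As a hedged illustration of what that spec might boil down to, the sketch below uses a made-up data-track attribute and a single delegated click listener instead of hand-coding every element. The attribute name and eVar number are assumptions, not a standard.

    // Hedged sketch of more scalable onclick tracking. Front-end developers add a
    // made-up "data-track" attribute to elements per the spec, for example:
    //   <a href="/signup" data-track="header|signup-cta">Sign up</a>
    // A single delegated listener then sends a beacon for any tracked element.
    document.addEventListener('click', function (e) {
      var el = e.target.closest('[data-track]');
      if (!el) { return; }
      var value = el.getAttribute('data-track'); // e.g. "header|signup-cta"

      // Adobe Analytics example: send a custom link beacon with the value in an
      // eVar (eVar45 is a placeholder -- use whatever slot is free).
      s.eVar45 = value;
      s.tl(true, 'o', 'click: ' + value);
    });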


Onclick tracking does not happen automatically. All tracking based on click or hover interactions must be custom-defined and custom-coded.


Excessive onclick tracking can have an effect on cost. Certain tools like Adobe Analytics have a pricing model based on server calls. Using Adobe as an example, each Adobe contract specifies a certain number of allotted server calls per month. A server call happens each time a packet of data is sent to Adobe, which means one server call each time a visitor loads a page and one server call each time a visitor clicks on a tracked element. Gratuitously adding onclick tracking can cause server call overages and additional costs.


Use a data layer and tag management tool. Industry best practice is to have a tool-agnostic data layer on all the pages of your site, and then a tag management tool like Tealium or Adobe Dynamic Tag Manager to map the data layer values to the appropriate variables in your reporting tool.
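As a rough sketch, a tool-agnostic data layer is just a JavaScript object rendered on each page that describes it in business terms; the tag management tool then maps those values to the right GA dimensions or Adobe eVars. The object name and fields below are illustrative assumptions, not a required schema.

    // Hedged sketch of a tool-agnostic data layer object rendered on each page.
    // The object name and fields are examples only -- define whatever your
    // business actually needs to report on.
    window.digitalData = {
      page: {
        name: 'home',
        section: 'homepage',
        language: 'en-us'
      },
      user: {
        loginStatus: 'logged-in',
        userId: '123456'
      }
    };
    // The tag management tool reads these values and maps them to reporting
    // variables, e.g. digitalData.user.userId -> eVar12 or a GA custom dimension.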


Resist the urge to track everything. Only track what is essential, and what will yield actionable information. It’s better to have 10 really solid reports with meaningful information than 100 reports full of “maybe that will be interesting to look at someday” information.

Categories
Adobe Analytics, Google Analytics

Does Adobe Analytics Have Its Own Version of Google’s Custom (UTM) URLs for Campaign Tracking?

Adobe Analytics has similar functionality, but it works slightly differently.

Adobe Analytics has a pre-defined variable called s.campaign which is reserved for campaign tracking. However, the URL parameter that sends data to s.campaign is NOT pre-defined. As is usually the case with Adobe Analytics, it’s customizable and you can choose your own query string parameter. My clients often use “cid” or “cmpid” or “camp,” but you can use whatever you want.

In order to map data from that query string parameter to s.campaign, you’ll need to do one of the following (a minimal sketch of the underlying logic follows the list):

    1. Implement the getQueryParam plugin.
    2. Specify the parameter and map it to s.campaign in your tag management tool.
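For illustration, here is a minimal, plugin-free sketch of the underlying idea: read the query string parameter (assuming "cid") and copy it into s.campaign before the page view beacon fires. The real getQueryParam plugin handles more edge cases, so treat this as a sketch rather than the plugin’s code.

    // Hedged sketch: grab the campaign ID from a "cid" query string parameter
    // (the parameter name is an assumption -- use whatever your organization chose)
    // and copy it into s.campaign before the page view beacon is sent.
    var params = new URLSearchParams(window.location.search);
    var cid = params.get('cid');
    if (cid) {
      s.campaign = cid;
    }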

In Google Analytics, there are five dimensions related to campaign tracking: utm_source, utm_medium, utm_term, utm_content, utm_campaign. Adobe Analytics was designed to use only one dimension / campaign ID, and then to supplement that unique campaign ID using classifications (formerly known as SAINT). These classifications add supplemental data into Adobe Analytics using that campaign ID as a key, and can be accomplished by either uploading a lookup table, or using Classification Rule Builder to write logical/conditional rules to automatically generate the classifications on the fly.

One benefit of Adobe’s method is that it gives you far more campaign-related dimensions. With Google Analytics you get five, but in Adobe Analytics you can have up to 30 classifications. I typically see classifications for Channel (paid search, display, social, email, etc.) and Campaign Name at a minimum. But it’s completely customizable and you can use it for whatever makes sense for your company: the Campaign Manager responsible for a campaign, what day of the week / time of day an email was sent, ad placement size, ad network, etc.

Another benefit of the classification method is that you can retroactively add or modify the supplemental data. If a campaign manager accidentally passed the wrong values to any of the utm campaign tracking parameters in Google Analytics, the data will be incorrect in the reporting. (It would be possible to write some transformation filters to correct it, but it would be a pain and not scalable.) In Adobe Analytics it would be a simple matter of either correcting/re-uploading the lookup table, or correcting the rules in Classification Rule Builder.

As far as what values to pass to s.campaign, here are the two most common and effective ways to handle it:

    1. Using an obfuscated campaign ID – The value in the query string parameter would be some value like “234987423,” and then all the details about that ID would be captured elsewhere, e.g. in a simple spreadsheet, within campaign planning software, or in a tool like Tracking First | Campaign Code Validation. Then the supplemental data would be exported from that other tool and imported into Adobe Analytics.
    2. Using a concatenated series of values – The value in the query string parameter would be a delimited, concatenated series of values like “soc:111:555,” where “soc” is a reference to the channel being Social, “111” is Campaign = Memorial Day, and “555” is Partner = Facebook. Using Classification Rule Builder you can parse those three delimited values apart using regex, and dynamically populate “soc,” “111,” and “555” into their respective classifications (the sketch after this list illustrates the parsing logic).

      If you wanted to get fancy, you could also use sub-classifications (a classification on a classification) to give a friendlier name to the other values so instead of seeing “111” as a line item in the Campaign report, it would say “Memorial Day.”
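To illustrate the parsing logic in plain JavaScript (in practice you would configure equivalent regex rules in Classification Rule Builder rather than write code):

    // Hedged sketch of how a concatenated campaign ID like "soc:111:555" breaks
    // apart into channel / campaign / partner classifications.
    var campaignId = 'soc:111:555';
    var match = campaignId.match(/^([^:]+):([^:]+):([^:]+)$/);
    if (match) {
      var classifications = {
        channel:  match[1], // "soc" -> Social
        campaign: match[2], // "111" -> Memorial Day
        partner:  match[3]  // "555" -> Facebook
      };
    }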

So while Adobe Analytics doesn’t work exactly like Google Analytics, it does have similar (and arguably more sophisticated) campaign tracking functionality. The difference is that there is a lot more upfront configuration work to set up Adobe Analytics campaign tracking, whereas Google Analytics campaign tracking is ready to go out of the box. But on the flip side, there is a lot more flexibility and ability to customize the data in Adobe Analytics, whereas you’re stuck with the defaults in Google Analytics.

And honestly you can extend that observation to any feature that exists in both tools!

Cross-posted from https://www.quora.com/Does-Adobe-Analytics-have-its-own-version-of-Googles-custom-UTM-URLs-for-campaign-tracking/answer/Halee-Kotara