When Does a Google Analytics Session Expire, Really?

One of the many ways that Google Analytics and Adobe Analytics differ is their individual definitions of what constitutes a session. Both abide by the industry standard of a session ending after 30 minutes of inactivity*.

* Unless you’ve changed the session timeout setting in a Google Analytics property, or the visit definition/visit timeout setting in an Adobe Analytics virtual report suite.

But Google Analytics has an additional trigger for ending a session that can be a real gotcha when trying to matchback data across the two tools: it ends a session and starts a new one “every time a user’s campaign source changes.” That statement always perplexes me because I’m not sure whether to interpret that as:

  1. when any UTM parameter changes, or …
  2. when only the utm_campaign parameter changes

I always assumed #1, but I decided to test this out so I’d be 100% sure. In each test scenario below, I opened a new guest browser window so none of my activity from each test would overlap.

If all other UTM parameters are the same but utm_medium is different, it will start a new session.

The proof: I loaded two URLs with identical values for utm_source, utm_campaign, and utm_id, but different values for utm_medium.


When I pull up the report by one of the other parameters, it shows two sessions, validating that a new session was started when I loaded that second page.

If all other UTM parameters are the same but utm_source is different, it will start a new session.

The proof: In a new, fresh guest window, I loaded two URLs with identical values for utm_medium, utm_campaign, and utm_id, but different values for utm_source.


Again, two sessions:

If all other UTM parameters are the same but utm_campaign is different, it will start a new session.

The proof: In a newer, fresher guest window, I loaded two URLs with identical values for utm_medium, utm_source, and utm_id, but different values for utm_campaign.


How many sessions? Two sessions!

If all other UTM parameters are the same but utm_content is different, it will start a new session.

The proof: In the newest, freshest of guest windows, I loaded two URLs with identical values for utm_medium, utm_source, and utm_campaign, but different values for utm_content.


Two sessions:

If all other UTM parameters are the same but utm_term is different, it will start a new session.

The proof: I loaded two URLs (in yet another guest window) with identical values for utm_medium, utm_source, and utm_campaign, but different values for utm_term.


This is getting boring and predictable, two sessions:

If all other UTM parameters are the same but utm_id is different, it will start a new session.

The proof: Another test, another guest window. I loaded two URLs with identical values for utm_medium, utm_source, and utm_campaign, but different values for the elusive and not-often-used utm_id.


Two sessions:

So there you go. Irrefutable evidence that a new session starts on change of any of the following:

  • utm_medium
  • utm_source
  • utm_campaign
  • utm_content
  • utm_term
  • utm_id
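
The behavior demonstrated above can be sketched as a small function. This is an illustration of the observed behavior, not Google's actual implementation:

```javascript
// Treat a change in ANY UTM parameter as a "campaign source change"
// that ends the current session and starts a new one.
const UTM_KEYS = ["utm_source", "utm_medium", "utm_campaign",
                  "utm_content", "utm_term", "utm_id"];

function getUtmParams(url) {
  const params = new URL(url).searchParams;
  const utm = {};
  for (const key of UTM_KEYS) {
    if (params.has(key)) utm[key] = params.get(key);
  }
  return utm;
}

function startsNewSession(previousUrl, currentUrl) {
  const prev = getUtmParams(previousUrl);
  const curr = getUtmParams(currentUrl);
  // Any difference in any UTM parameter triggers a new session.
  return UTM_KEYS.some(key => prev[key] !== curr[key]);
}
```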

Is it a good idea to run two digital analytics tools in tandem?

These questions come up a lot:

“Is it a good idea to run a double analytics implementation on one site/mobile app (i.e. Adobe + Google)?”

“Can we install GA and Adobe Analytics on the same website or do they conflict?”

You can add as many tracking tools as you want; they won’t adversely affect each other. I’ve seen companies that have Adobe Analytics, Google Analytics, Parse.ly, Mixpanel, and Coremetrics installed on the same site concurrently. It might affect page load time, but won’t affect data quality.

While you technically can have multiple tools in place, it doesn’t always mean you should. As a consultant I feel like I should have to put a dollar in a jar every time I say, “It depends,” but… it depends. Some of my clients do this, most don’t. There are reasons to do it, and reasons to avoid it.

Advantages of tracking with two different tools:

  • A second source of data can be very handy for validation purposes or as a backup in case something catastrophic happens with the other tool.
  • Adobe is strong in some areas where GA is weak, and vice versa. Having both implemented gives you the best of both worlds.
    Novice analysts will be able to ramp up on GA faster, but seasoned analysts might be frustrated by the limitations of GA and need the more sophisticated functionality of Adobe.
  • Most companies use AdWords, so it’s likely that GA is already in place, anyway.

Disadvantages of tracking with two different tools:

  • $$$ – Assuming you’re on GA360, you’re paying for two tools.
  • Double the tools, double the maintenance. New tagging requests will likely need to be implemented separately on each tool.
  • The idea of “one source of truth” can get muddied if you don’t have strong governance in place to choose and enforce which one is your system of record. Analysts could potentially cherry-pick data and provide biased analysis.
  • You will inevitably end up down the data discrepancy wormhole when users report that the data from Adobe and GA don’t match perfectly. Expect to spend time and resources investigating those issues and/or educating your user base that it is normal and expected for two tools to yield slightly different data, and not to freak out about a 2% variance.

If the advantages are something that you need and the disadvantages are something you are equipped to deal with, then running two tools in tandem might be a good option for you.

If the cons outweigh the pros, save yourself the $$$ and the trouble of implementing and maintaining two tools.

Cross-posted from  https://www.quora.com/Is-it-a-good-idea-to-run-a-double-analytics-implementation-on-one-site-mobile-app-i-e-Adobe-Google/answer/Halee-Kotara.

4 Reasons Why All Digital Careers Need to Understand Digital Analytics

Data is everywhere. It’s captured with every step we take and every button we click. If you’re conducting business online, data should be at the heart of everything you do. Here are four reasons why it is so important for everyone on the digital team to be able to understand and apply analytics.

  1. The days are over where it took months to gather data, and days to run a report. In the digital world, campaigns and content can be changed in minutes. Online interactions can be tracked and reported in real-time. The data we have about our customers today is more detailed, more robust, and processed more quickly than ever before. In order to effectively leverage real-time data, the ultimate end users of the data must take a more active role in reporting and analyzing it.
  2. The days are over where being an analyst required a PhD in statistics and an office in the basement next to the mainframe computer. Thanks to technology, many of the digital analytics tools like Google Analytics and Adobe Analytics are designed to do the statistical heavy lifting. Whether what’s needed is advanced segmentation queries or complex data visualizations, the tools make it achievable to accomplish these tasks without an advanced stats degree. Analytics tools these days help make reporting more visual, creating a story to tell utilizing the data. Even the least mathy among us can be empowered to do very sophisticated analysis once armed with the fundamentals of digital analytics.
  3. The days are over where marketers and copywriters relied on gut feelings to make decisions. It’s not enough to proclaim yourself a subject matter expert or cite your years of experience anymore, especially when it comes to justifying digital business decisions. With the quality and quantity of data at our disposal, if you aren’t using your digital analytics data, you are ignoring your customers. This is self-sabotage in an increasingly customer-centric digital world. Data can also be used to improve website user flows and calculate ROI, which can be very useful to marketers. In order to know if your digital efforts are a success, it is imperative to set goals and measure performance against them. If you can’t measure it, you can’t improve it.
  4. The days are over where the marketers and copywriters worked separately from the analysts. The best digital analysts aren’t the people with the deepest math knowledge or the flashiest spreadsheet skills. The best digital analysts are the people with subject matter expertise in the topic they are analyzing. A report is just numbers if it doesn’t have context around it. A content manager who deeply understands the content hierarchy and flow of the site with only 1 month of analytics experience will likely find deeper insights and opportunities in the data than an analyst with 5 years of experience who doesn’t understand the content. It’s no longer a world where the marketers and copywriters merely work alongside the analysts. The marketers and content creators *are* the analysts.

There will always be a need for pure analysts or “data scientists” to handle complex data challenges. But thanks to the hard work of these data scientists, digital analytics tools exist that make data accessible to all of us. Not only is data everywhere, data is for everyone. Whether you’re in charge of marketing, content, user experience, or technical infrastructure, there is digital analytics data that exists that will help you better understand your customers and more effectively do your job. If you want to take your career to the next level, incorporate analytics into everything you do even if “analyst” isn’t in your title.

Adding Historical and Statistical Context to Your Trended Reports

Traffic and conversion numbers go up and down every day. When looking at trended data, it can be difficult to know when an increase or decrease is truly significant. Sometimes our stakeholders can unnecessarily panic about a dip, or overly congratulate themselves about a spike. 

This post shows how to add historical and statistical context to trended data using a simple standard deviation calculation in Excel. There are also tips for how to visualize this data to make the statistical concepts very simple for the report recipients to read and understand.

Why this is helpful… 

  • Provides context for trended data
  • Accounts for seasonality
  • Removes the guesswork from deciding whether an increase or decrease requires action
  • Especially useful in post-launch scorecards to help stakeholders decide whether or not to roll back changes

How to do it… 

For a given metric, pull historical data as far back as possible. For example, if you’re analyzing weekly homepage visits, pull historical weekly data back at least 53 weeks, but ideally 2-5 years if you have the data to support it. The more historical data you have, the better.

In Excel, use the STDEV function to find the standard deviation across all the historical values for that metric. This calculation yields a single number that represents the typical amount of variation for that set of values:


Using the standard deviation in conjunction with the data points from the prior year, it is possible to create a series of contextual bands on your chart. Then when the current year’s data is overlaid on top of those bands, it makes it very clear whether or not this year’s performance is within the normal range of variance:

  • 1 standard deviation above or below = acceptable
  • 2 standard deviations above or below = outperforming/underperforming
  • 3 standard deviations above or below = requires attention

To make the banded chart, you will need a summary table that looks something like this:

Summary Data Table

  • Prior Year – Data point from the previous year for the same week.
  • This Year – Data point from this year which is the central point of this report.
  • Attention Required – A formula subtracting two standard deviations from the prior-year data point for that week. This will be the bottom threshold for the banded chart.
  • Underperforming / Outperforming / Great – The standard deviation value. We will use this value to make a stacked area chart.
  • Acceptable – Also used for the stacked area chart; because the acceptable band spans one standard deviation above and one standard deviation below the prior-year value, this value needs to be the standard deviation multiplied by 2.
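
The summary-table math can be sketched in code. The column names mirror the table above, and the band heights are assumptions based on the ±1/±2/±3 standard deviation thresholds listed earlier:

```javascript
// Sample standard deviation, matching Excel's STDEV function.
function stdDev(values) {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance = values.reduce((sum, v) => sum + (v - mean) ** 2, 0) /
                   (values.length - 1);
  return Math.sqrt(variance);
}

// One row of the summary table, given the prior-year and current-year
// data points for a week and the metric's standard deviation.
function summaryRow(priorYear, thisYear, sd) {
  return {
    priorYear,
    thisYear,
    attentionRequired: priorYear - 2 * sd, // bottom threshold of the chart
    underperforming: sd,                   // band height: -2sd up to -1sd
    acceptable: 2 * sd,                    // band height: -1sd up to +1sd
    outperforming: sd,                     // band height: +1sd up to +2sd
    great: sd                              // band height: +2sd up to +3sd
  };
}
```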

When configuring the chart in Excel, the prior-year and current-year series should be line chart types. To create the bands, configure all the standard deviation-related series as stacked area chart types. All the data series should remain on the same primary axis.


What to do with the information…  

If a key performance indicator dips into the “Attention Required” zone, that means performance has been very negatively affected and it should be investigated and addressed immediately. In the case of a site or page overhaul or campaign launch, the team should consider rolling back.

Common Analytics Gotchas for Project Managers

In the excitement of overhauling a site, analytics tracking requirements commonly get overlooked. It’s understandable that the focus is on the shinier, prettier aspects like the design and functionality rather than the less visible (but equally critical) aspects like tracking and measurement. This scenario happens more often than anyone would like to admit:

What data?
These are some analytics gotchas that every digital project manager should be aware of…


Analytics tracking is more complex than just enabling it or switching it on. It requires defining which data points will be captured on which pages, how they will be captured, what values get sent to the reports, and having a developer code it accordingly. There should be project phases for:

  1. Gathering analytics requirements from the business for what should be tracked.
    • “We need a report with user IDs.”
  2. Designing the implementation based on those requirements.
    • “Capture the user ID on each page in eVar12.”
  3. Implementing the analytics tracking code to collect the data.
    • s.eVar12="123456"

Involve the analytics team early in the process to allow sufficient time to design and implement the solution.


Analytics data can be captured on the page load, or on the click of some element. From a scalability and ease of implementation perspective, it is far easier to implement tracking on page load where a new URL is rendered. That’s not to say that onclick tracking should be avoided entirely, but be aware that onclick tracking lends itself to being more complex, more manual, and less scalable.

There are ways to make onclick tracking more scalable by adding CSS elements specifically related to data collection, but the front-end developers would have to be provided with a spec for what additional CSS definitions need to be added to which elements, what the values look like, etc.
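
One common pattern, sketched here with a hypothetical `data-track` attribute rather than CSS classes, is a single delegated click listener that reads the tracking value off whatever element the developers have tagged:

```javascript
// Map a data-track value to the payload for Adobe's custom link call.
// ("o" is the generic custom link type in s.tl().)
function buildLinkPayload(trackValue) {
  return { linkType: "o", linkName: trackValue };
}

// One delegated listener covers every element the front-end devs tag,
// e.g. <a href="/shop" data-track="nav:shop">.
if (typeof document !== "undefined") {
  document.addEventListener("click", function (event) {
    const el = event.target.closest("[data-track]");
    if (!el) return;
    const payload = buildLinkPayload(el.getAttribute("data-track"));
    s.tl(el, payload.linkType, payload.linkName); // s = Adobe tracking object
  });
}
```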


Onclick tracking does not happen automatically. All tracking based on click or hover interactions must be custom-defined and custom-coded.


Excessive onclick tracking can have an effect on cost. Certain tools like Adobe Analytics have a pricing model based on server calls. Using Adobe as an example, each Adobe contract specifies a certain amount of allotted server calls per month. A server call happens each time a packet of data is sent to Adobe, which means one server call each time a visitor loads a page, and one server call each time a visitor clicks on a tracked element. Gratuitously adding onclick tracking can cause server call overages and additional costs.


Use a data layer and tag management tool. Industry best practice is to have a tool-agnostic data layer on all the pages of your site, and then a tag management tool like Tealium or Adobe Dynamic Tag Management (DTM) to map the data layer values to the appropriate variables in your reporting tool.
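
A minimal sketch of what that looks like. The field names are illustrative (loosely modeled on the W3C digital data layer convention), and the mapping function stands in for what the tag management tool would do:

```javascript
// Tool-agnostic data layer, populated on each page by the site itself.
const digitalData = {
  page: { pageName: "home", siteSection: "homepage" },
  user: { userId: "123456", loginStatus: "logged-in" }
};

// The tag manager maps data layer values to tool-specific variables,
// e.g. capturing the user ID in eVar12.
function mapToAdobe(dataLayer, s) {
  s.pageName = dataLayer.page.pageName;
  s.eVar12 = dataLayer.user.userId;
  return s;
}
```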


Resist the urge to track everything. Only track what is essential, and what will yield actionable information. It’s better to have 10 really solid reports with meaningful information, than 100 reports full of “maybe that will be interesting to look at someday” information.

Does Adobe Analytics Have Its Own Version of Google’s Custom (UTM) URLs for Campaign Tracking?

Adobe Analytics has similar functionality, but it works slightly differently.

Adobe Analytics has a pre-defined variable called s.campaign which is reserved for campaign tracking. However, the URL parameter that sends data to s.campaign is NOT pre-defined. As is usually the case with Adobe Analytics, it’s customizable and you can choose your own query string parameter. My clients often use “cid” or “cmpid” or “camp,” but you can use whatever you want.

In order to map data from that query string parameter to s.campaign, you’ll either need to:

    1. Implement the getQueryParam plugin.
    2. Specify the parameter and map it to s.campaign in your tag management tool.
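
Either way, the underlying logic is simple: read the parameter from the URL and assign it to s.campaign. A rough sketch of what the getQueryParam plugin does, using “cid” as the example parameter:

```javascript
// Return the value of a query string parameter, or "" if absent.
function getQueryParam(name, url) {
  return new URL(url).searchParams.get(name) || "";
}

// In page code or the doPlugins section:
// s.campaign = getQueryParam("cid", window.location.href);
```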

In Google Analytics, there are five dimensions related to campaign tracking: utm_source, utm_medium, utm_term, utm_content, utm_campaign. Adobe Analytics was designed to use only one dimension / campaign ID, and then to supplement that unique campaign ID using classifications (formerly known as SAINT). These classifications add supplemental data into Adobe Analytics using that campaign ID as a key, and can be accomplished by either uploading a lookup table, or using Classification Rule Builder to write logical/conditional rules to automatically generate the classifications on the fly.

One benefit of Adobe’s method is that it gives you far more campaign-related dimensions. With Google Analytics you get five, but in Adobe Analytics you can have up to 30 classifications. I typically see classifications for Channel (paid search, display, social, email, etc.) and Campaign Name at a minimum. But it’s completely customizable and you can use it for whatever makes sense for your company: the Campaign Manager responsible for a campaign, what day of the week / time of day an email was sent, ad placement size, ad network, etc.

Another benefit of the classification method is that you can retroactively add or modify the supplemental data. If a campaign manager accidentally passed the wrong values to any of the UTM campaign tracking parameters in Google Analytics, the data will be incorrect in the reporting. (It would be possible to write some transformation filters to correct it, but it would be a pain and not scalable.) In Adobe Analytics it would be a simple matter of either correcting/re-uploading the lookup table, or correcting the rules in Classification Rule Builder.

As far as what values to pass to s.campaign, here are the two most common and effective ways to handle:

    1. Using an obfuscated campaign ID – The value in the query string parameter would be some value like “234987423,” and then all the details about that ID would be captured elsewhere, e.g. in a simple spreadsheet, within campaign planning software, or in a tool like Tracking First | Campaign Code Validation. Then the supplemental data would be exported from that other tool and imported into Adobe Analytics.
    2. Using a concatenated series of values – The value in the query string parameter would be a delimited, concatenated series of values like “soc:111:555,” where “soc” is a reference to the channel being Social, “111” is Campaign = Memorial Day, and “555” is Partner = Facebook. Using Classification Rule Builder you can parse those three delimited values apart using regex, and dynamically populate “soc,” “111,” and “555” into their respective classifications.

      If you wanted to get fancy, you could also use sub-classifications (a classification on a classification) to give a friendlier name to the other values so instead of seeing “111” as a line item in the Campaign report, it would say “Memorial Day.”
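
The parsing that Classification Rule Builder does with regex can be illustrated in plain code, using the “soc:111:555” example from above (the field names are assumptions):

```javascript
// Split a delimited campaign code into its classification values.
function parseCampaignCode(code) {
  const match = /^([^:]+):([^:]+):([^:]+)$/.exec(code);
  if (!match) return null; // malformed code
  return {
    channel: match[1],  // "soc" → Social
    campaign: match[2], // "111" → Memorial Day
    partner: match[3]   // "555" → Facebook
  };
}
```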

So while Adobe Analytics doesn’t work exactly like Google Analytics, it does have similar (and arguably more sophisticated) campaign tracking functionality. The difference is that there is a lot more upfront configuration work to set up Adobe Analytics campaign tracking, whereas Google Analytics campaign tracking is ready to go out of the box. But on the flipside there is a lot more flexibility and ability to customize the data in Adobe Analytics, where you’re just stuck with the default in Google Analytics.

And honestly you can extend that observation to any feature that exists in both tools!

Cross-posted from https://www.quora.com/Does-Adobe-Analytics-have-its-own-version-of-Googles-custom-UTM-URLs-for-campaign-tracking/answer/Halee-Kotara

What are the biggest differences between Adobe SiteCatalyst and Google Analytics Premium?

The short answer I always give is that Google Analytics is like a one size fits all t-shirt, and Adobe Analytics is like a custom-tailored suit. The rest of my answer is not so short.

Google Analytics:

  • The standard reporting interface is usually easier for beginners to learn, but isn’t very flexible when you want to orient the data differently than the default. Same goes for dashboards.
  • With Universal Google Analytics there are a lot more implementation options than there were in Classic Google Analytics, including far more ecommerce tracking options, along with custom dimensions and metrics. With enhanced ecommerce you can now track products from product impression to product view to cart to checkout to purchase, whereas in Classic GA you could only track products on the order confirmation page. With non-Premium Google Analytics you get 20 custom dimensions and 20 custom metrics. With Premium you get 200 custom dimensions and 200 custom metrics. However, the amount of customization and the degree to which you can configure these custom dimensions and metrics are very limited compared to Adobe.
  • One of Google Analytics Premium’s selling points is you get unsampled data, but to be clear, you do NOT get unsampled data in the browser-based reporting interface. You simply get an option to download the unsampled data into a .csv or .tsv file.
  • With Premium you also get support service, which I believe is Google partnering you up with an official Google Analytics third-party partner.

Adobe Analytics:

  • The standard reporting interface has always been super flexible. You’ve always been able to drag whichever metrics you want into a report, break down line items by other dimensions, and customize the reporting menu to your company’s needs and internal terminology. And now with their new Analysis Workspace interface that launched on 9/17/2015, it is even MORE flexible. Honestly, Analysis Workspace is the most nimble digital analytics interface I’ve ever seen. You can pivot and segment the data quickly and easily, and the load time is extremely fast. Adobe has other ancillary tools like Report Builder, which is an Excel add-in; Ad Hoc (formerly known as Discover), with very powerful segmentation features; and Data Workbench (formerly known as Insight), which allows you to pull in other data sources.
  • The implementation options are really only limited by your imagination and availability of dev resources to put them into place! In regular Adobe Analytics you have 75 eVars and 100 events, and in Adobe Analytics Premium you get 250 eVars and 1,000 events. Each individual eVar can be customized on which value gets credit for conversions (first click / last click / shared) and how long values get credit for conversions (time-based, session-based, or on the completion of some other event). Similarly with events, they can be configured to be incremented, deduplicated per session, or deduplicated across multiple sessions with the use of some unique ID. Not to mention you can easily create calculated metrics in Admin, too.
  • Data is always unsampled.
  • Support is usually directly with Adobe’s account team or Adobe Client Care. The account team seems to be far more focused on selling you other tools in the Marketing Cloud than helping you with the ones you’ve got, and Client Care is usually worthless.

I like that Google Analytics gives you more out of the box to start with, and there is less need to implement or configure the basic types of reports that are applicable across all sites (pages, campaigns, products, etc.).

On the flip side, I like that Adobe is so customizable and so flexible. This is a blessing and a curse. As others have stated, the quality of Adobe Analytics data is totally dependent on how well it is implemented. Going back to my original analogy, imagine someone who only knows how to sew on a button trying to sew an entire custom-fitted suit from scratch!

Cross-posted from https://www.quora.com/What-are-the-biggest-differences-between-Adobe-SiteCatalyst-and-Google-Analytics-Premium/answer/Halee-Kotara

Adobe Summit Breakout Session: Data Storytelling

This session was not very technical and really had nothing to do with SiteCatalyst specifically, but data presentation in general. As analysts we’re comfortable in the details and the numbers, so it can be hard to pull ourselves out of the weeds and remember that our audience doesn’t care about what segmentation logic we used, but “what should I do to make my business better?” This session had some good best practices for packaging up the data and putting a cherry on top. (PDF of the presentation can be found here.)

A few random quotes:

  • “We hear facts, but we feel stories.”
  • “If you lead with data, people’s shields come up. Dry, factual arguments are met with skepticism.”
  • “If you lead with a story + data, shields come down. The story allows people to drop their intellectual guard.”

NOTE: Every single presenter must have been given that third piece of advice, because I swear the first 10 minutes of every single presentation at Summit was some useless, often unrelated tangent: ReportBuilder and U2, Discover and river surfing, etc.  So while I agree a story can be useful, I would argue the story needs to be RELEVANT.

A reminder on the basics of a literary story arc, and mapping those to a data story:

  1. Introduction / Setup:  background of current situation, identify main problem or opportunity
  2. Inciting Incident / Rising Action: share the findings from the data to reveal deeper insights
  3. Climax / Denouement: present major finding or key insight (a-ha! moment)
  4. Falling Action / Resolution / Conclusion: give recommendations

Some advice on visuals:

  • build the customer journey with visual elements (use screenshots of the site to remind audience what the visitor sees)
  • identify the right data (add context by comparing to another date range, another segment, etc.)
  • identify the right visualizations for that data (don’t get fancy because you can, 2D charts along common aligned scales are best for accurate comparisons, avoid pie charts)
  • remove unnecessary noise (no more than 4 lines in a line graph, if you must use a pie chart have no more than 5 slices and lump all “other” into 5th slice)
  • highlight what’s important (use colors, annotations, etc…don’t make the audience work to find what you want them to see)
  • make it easy to consume: “Never force your audience to memorize, organize, or calculate numbers in their head.” (omit unnecessary chart features, use clear labels and chart titles)

5 things to avoid which will derail your story:

  1. not knowing your audience
  2. using unfamiliar analytics jargon
  3. sharing too much detail  (be selective in what you share and don’t feel obligated to substantiate everything, the audience likely trusts our expertise and is not interested in the analytics process, only the insights)
  4. leaving out valuable context
  5. talking too much and not allowing time for discussion

And tips from a client who has a “Chief Storyteller” on staff:

  • get your point across, one point or illustration per slide, overall story should have a moral or lesson
  • when presenting to an executive, distill into how it will drive business, and two cocktail talking points
  • you don’t need to show the analysis, only the results (put the analysis in an appendix)

Ouch. That last one hurts. But it’s true. When it requires 25 data points broken into 10 segments trended over 13 months to find some insight, we want to show our hard work.  And as linear thinkers, we want to walk through the logic of how we came to our conclusion.  It’s a good reminder that sometimes the best analysis (even if it took 2 days and 5 pivot tables) can be distilled into one sentence…

Unlike this post.

Adobe Summit Breakout Session: ReportBuilder

At Summit 2014, there was a session called “Adobe ReportBuilder: Attaining data mashing (Excel)ence”.  (Here’s the PDF download of the presentation.  It will show more detail than this post.)  It was one of the more helpful sessions. If you don’t know what ReportBuilder is, it’s an add-in to Excel that allows you to build data blocks within your spreadsheet that pull data directly from SiteCatalyst. (It used to be called Excel Client.)  It’s a lifesaver for recurring scorecards where you constantly have to pull the same metrics with updated time ranges. No more exporting / copying / pasting!

While I find ReportBuilder to be pretty self-explanatory and Adobe has training videos on the basic functionality, at Summit they shared these useful new features and/or best practices:

Use separate Excel tabs for controls versus data

One of the most powerful features of ReportBuilder is that you can have the report blocks set up to dynamically pull certain report parameters from cells in Excel. Adobe’s first tip was have one dedicated Excel tab for all your controls/inputs, and separate data tab(s) that contain the data blocks:

Advanced / shortcut rolling date expressions

If you choose “Rolling Dates” from the date selection dropdown, there is an additional link for “Show Advanced Options”. Those allow you to write expressions to more exactly identify your rolling dates. Customized Expression Elements:

  • cd = current day
  • cw = current week
  • cm = current month
  • cq = current quarter
  • cy = current year

Expression Examples:

  • From cm-13m to cm-1d (returns the last full 13 months of data)
  • From cw-13w To cw-1d (returns the last full 13 weeks of data)
  • From cd-7d To cd-1d (returns the last 7 days of data)
  • *always add “-1d” to eliminate partial, current day data


Editing multiple report blocks at once

While using cell-based inputs can eliminate the need for this feature, if you happen to still do it the old clunky way, there is an option that will save you time by allowing you to mass-edit several report blocks at once. On the “Add-Ins” ribbon in Excel, select “Manage”, highlight the report blocks you want to mass-edit, select “Edit Multiple”, and follow the wizard to update the report parameters. Don’t forget to refresh!

Dependent data blocks

These are useful if you have a report that returns a list of line items that you want to use as a filter in another data block. I didn’t know this was even possible!

Anomaly detection

Using the previous 30 days as a training period, ReportBuilder can estimate the expected, upper, and lower range of data for a given metric.  You must be looking at a trended report with daily granularity to be able to use this functionality.  Enabling this will help you identify weird outliers in your trended data.


Since it took me 2 months to write this post, with the May 22 upgrade it looks like all the events/metrics are now in the same pane and no longer separated by traffic/conversion metrics!  No more having to build separate data blocks for visits and revenue, yay!

Adobe Digital Marketing Summit 2014

This year was the first time I was able to attend Summit since the beginning of my web analytics career 10 years ago. After attending eTail and Shop.org, my expectations of any digital marketing conference were super low. In my recent experience, these conferences have solely been venues for marketing douchebags to see how many times they can toss about the latest marketing buzzwords. “Big data!” “Omni-channel!” “Predictive analytics!”

Although some of my buddies who also attended Summit said it wasn’t as good as past years, I was delighted. All the content was extremely relevant and tactical. I walked away with a few secret tools and lots of new ideas. And I’m going to share all those tools and secrets with you now. “Big data omni sharing!”

I’ll be writing up some specific posts about sessions I attended in person and share my extended notes and screencaps on those:

And the single document that was worth the price of admission for the entire conference:

  • Excel Template That Automatically Extracts SiteCatalyst Admin Settings