Channel: Dan Wilkerson – LunaMetrics

Instantiating & Using The Google Tag Manager dataLayer – Data Layer Best Practices Pt 1


We’ve collected a series of technical best practices designed to help you successfully interoperate with the Google Tag Manager Data Layer. These best practices are designed to help eliminate some of the peskier and harder-to-debug issues we run into when working with clients. This part will discuss the proper way to instantiate and work with the dataLayer.

Always Use .push()

Google documentation frequently demonstrates populating values in the data layer by simple instantiation, e.g.:

<head>
<!-- Some other HTML -->
  <script type="text/javascript">  
    dataLayer = [{
      'foo': 'bar',
      'baz': 'boo',
      'primeDirective': [2, 3, 5, 7, 11, 13, ...],
      'ecommerce': {
        ...
      }
      // Etc, etc, etc...
    }];
  </script>
</head>
<body>
  <!-- 
    ******************
      The GTM Snippet 
    ******************
  -->
  ...
</body>

However, in certain circumstances, directly instantiating the dataLayer like this is dangerous. Why? If the above code were placed below the Google Tag Manager snippet on the page, something not-so-nice would happen, e.g.:

<head>
  <!-- Some other HTML -->
</head>
<body>
  <!-- 
    ******************
      The GTM Snippet 
    ******************
  -->
  ...
  <!-- Moved script tags to bottom for speed - Mr. WellMeaningDev, 10/16/15 -->
  <script type="text/javascript">  
    dataLayer = [{
      'foo': 'bar',
      'baz': 'boo',
      'primeDirective': [2, 3, 5, 7, 11, 13, ...],
      'ecommerce': {
        ...
      }
      // Etc, etc, etc...
    }];
  </script>
</body>

By moving the dataLayer = [{}] statement below our Google Tag Manager snippet, we actually destroy the true dataLayer. Any subsequent .push() calls will seemingly have no effect, never appearing in our Debug Panel. And it would be extremely hard to debug, since events prior to the overwrite will appear in Debug Mode, while events fired after will not.

The problem is that our initialization code doesn’t check if there’s already a variable named dataLayer. Because of this, it effectively overwrites whatever is already associated with the namespace dataLayer when it’s executed.

Making things worse, Google Tag Manager is still using the now-un-namespaced dataLayer, so if we .push() additional events into the new dataLayer, Google Tag Manager misses them completely. The Debug Panel is no help, either, as it still will show whatever events were caught up until that point. It’s only by manually polling dataLayer in the console that you’ll discover the issue.
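You can reproduce the failure in isolation. In this sketch, `gtmInternalRef` is a stand-in for the reference GTM holds internally after its snippet runs; reassigning `dataLayer` orphans that original array:

```javascript
// GTM keeps a reference to the ORIGINAL array, so a later
// `dataLayer = [...]` assignment orphans it.
var dataLayer = [{ event: 'gtm.js' }];
var gtmInternalRef = dataLayer; // what GTM effectively holds after its snippet runs

dataLayer = [{ foo: 'bar' }];   // well-meaning re-instantiation below the snippet

dataLayer.push({ event: 'purchase' });
console.log(gtmInternalRef.length); // still 1; GTM never sees the new pushes
```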


Proper dataLayer Instantiation

To fix this, copy the big G. In Google’s scripts and snippets, they frequently have to interact with globals that may or may not be ready and available when the code is executing. To get around this issue, they use the following pattern:

var someArray = window.someArray || [];
someArray.push(something);

Look familiar? You might remember this snippet of the Classic tracking code:

var _gaq = _gaq || [];
_gaq.push([ ... ])

This pattern is incredibly useful in JavaScript. Literally translated, it says “set the value of the variable named someArray to whatever is already named someArray, or, if someArray doesn’t exist yet, set it to an empty array”. This pattern lets us sprinkle in commands to various services throughout the code, to be executed when the service is ready to go.

Google isn’t alone in using this pattern, either; Facebook and many others employ the same strategy for managing asynchronous resource loading and command queueing and execution.

We take it one step further in our best practice; although the above method is good 99.99% of the time, our instantiation syntax is 100% bulletproof. Whenever we’re interacting with the dataLayer, we use this syntax:

var dataLayer = window.dataLayer = window.dataLayer || [];
dataLayer.push({
  'foo': 'bar'
});

By using this syntax, you’ll always reference or instantiate the global dataLayer, and scope the variable dataLayer locally to prevent any funky hoisting or scope collisions.

The reason you should take this approach is simple: over time, other development teams will add their own dataLayer code, or shift your code around. If you use the dataLayer = [{}]; style of instantiation, you’ll end up with some hard-to-debug issues whenever this happens (and trust me, it will).

Using var dataLayer = window.dataLayer = window.dataLayer || []; dataLayer.push({ ... }); ensures you’ll never run the risk of these issues popping up. Making this the standard syntax also prevents two teams from accidentally overwriting or deleting another team’s dataLayer values when they add their own code later on.
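As a quick sanity check, here is the pattern in miniature, with a plain object standing in for the real window:

```javascript
// A toy version of the safe pattern; `win` stands in for the window object.
function getDataLayer(win) {
  return (win.dataLayer = win.dataLayer || []);
}

var fakeWindow = {};

// Team A instantiates first...
getDataLayer(fakeWindow).push({ event: 'teamA.ready' });

// ...and Team B, running the same pattern later, gets the SAME array:
getDataLayer(fakeWindow).push({ event: 'teamB.ready' });

console.log(fakeWindow.dataLayer.length); // 2, nothing was lost
```

No matter how many times or in what order the teams run the snippet, every push lands in the one shared array.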

Join us for Part 2 of this series, where we’ll discuss how to push values into the dataLayer within Google Tag Manager Custom HTML Tags.

What are your thoughts on dataLayer interaction? Have you developed another clever solution to this issue? Share with us in the comments below.

The post Instantiating & Using The Google Tag Manager dataLayer – Data Layer Best Practices Pt 1 appeared first on LunaMetrics.


Integrating AngularJS and Google Tag Manager

If you’ve added Google Tag Manager and Google Analytics to an Angular app, you were probably in for a surprise. After deploying your code, you may have popped open the Real Time reports and seen… nothing.

Well, not nothing, but not a whole lot of anything. Where are the page paths? Hell, where are the pageviews? You dove into the Google Analytics docs and came back disappointed. Finally, you hit the search results – someone, somewhere had to know what to do.

Welcome, friend! You’ve come to the right place. Let’s get you straightened out.

But First, The Easy Button


Not interested in what’s going on under the hood? The module you’re looking for is Angulartics (Full disclosure: I’m a contributor on the GA and GTM libraries). It supports:

  • ngRoute and UIRouter
  • Google Tag Manager & Hard-coded Google Analytics
  • Declarative event tracking
  • Advanced hit types like User Timings
  • Exception tracking, scroll tracking, and more advanced features
  • 20+ analytics tools

And there’s a GTM Container file for one-click configuration written by yours truly in the repository.

Get the code here!

What Is Google Tag Manager?

Google Tag Manager (also referred to as GTM) is a tool for delivering Google Analytics tracking code and other tracking snippets on a page. It helps de-clutter your code by moving tracking snippets into a single location, instead of littered throughout the page. It features a WYSIWYG-style interface for creating, testing, and publishing additional tracking snippets on the fly, called Tags. Tags are assigned conditions on when they should be executed, called Triggers. In a nutshell, here’s how it works:

  1. A user interaction (clicking, submitting a form) or programmatic notification is observed by GTM (called an event)
  2. Each Trigger is evaluated against the event to see if the firing conditions set in the interface are met. If they are, the Trigger ‘fires’
  3. All Tags assigned to any Triggers that fire are executed

Most commonly, Google Tag Manager is used to deploy Google Analytics tags, from basic pageview tracking all the way up to complex custom Google Analytics Events. For a more in-depth explanation, my colleague Kaelin wrote a great introduction to Google Tag Manager and the relationship between GTM and Google Analytics.

Our Cookbook

Below, we’ve shared some components that we’ve used in the past. These components are designed to minimize the amount of additional code required to integrate Google Tag Manager and an Angular app. Because GTM sits on top of Google Analytics, the data that you give to it is inert by default; whether anything is fired is controlled in the interface. Because of this, it’s a good idea to be inclusive with the data you share with the dataLayer.

Pageview Tracking

The default Trigger for firing pageviews is called All Pages. The event that this Trigger corresponds to is gtm.js, which fires only when the GTM snippet is first loaded by the browser. This is why your pageviews haven’t been showing up. You need to add an internal listener in your app that notifies GTM when a state change occurs. If you’re using ngRoute, it would look like this:

(function(angular) {

  angular
    .module('app', ['ngRoute'])
    .run($run);

  // Safely instantiate dataLayer
  var dataLayer = window.dataLayer = window.dataLayer || [];

  $run.$inject = ['$rootScope', '$location'];

  function $run($rootScope, $location) {

    $rootScope.$on('$routeChangeSuccess', function() {

      dataLayer.push({
        event: 'ngRouteChange',
        attributes: {
          route: $location.path()
        }
      });

    });

  }

})(angular);

Once you’ve wired this into your app, Google Tag Manager will be notified any time a route change occurs. In order to translate this into a pageview in Google Analytics, you’ll need to create a Trigger for your ngRouteChange event, like so:

GTM Trigger for Angular Route Change

Then create a Variable to extract your page path from the dataLayer, like so:

GTM Variable for Angular Route

And you’ll need to create a Google Analytics pageview tag, like so:

GTM Tag for Angular Pageview

Don’t forget the Fields to Set configurations!

  • cookieDomain is set to auto
  • page is set to {{DLV – Angular Route}}

If you’re unwilling to wire into your router, you can create a Trigger using the History Change listener built into GTM, watching for pushState events, but we don’t recommend this approach.

Model Change Tracking

Often, you may want to track changes to model values in Google Analytics. A simple solution is a custom Directive that binds to the change event and notifies Google Tag Manager when a model’s value changes.

(function(angular) {

  angular
    .module('app', [])
    .directive('notifyGtm', notifyGtm);

  function notifyGtm() {

    var dataLayer = window.dataLayer = window.dataLayer || [];

    function link(scope, element, attributes, ngModel) {

      element.bind('change', function() {

        var el = element && element[0] ? element[0] : '';

        if (el) {

          dataLayer.push({
            event: 'ngNotifyGtm',
            attributes: {
              element: el,
              modelValue: ngModel.$modelValue,
              viewValue: ngModel.$viewValue
            }
          });

        }

      });

    }

    return {
      require: 'ngModel',
      restrict: 'A',
      link: link,
      scope: {
        ngModel: '='
      }
    };

  }

})(angular);

You can then add notify-gtm on any element with an ng-model that you’d like to observe with Google Tag Manager. In Google Tag Manager, you can use a Data Layer Variable to extract the value of the model or view, or use the element object reference to pull out additional data.

Notifying GTM After A Template Renders

Often, we depend on information in the DOM when capturing data to send to Google Analytics. Normally, we can use the gtm.dom or gtm.load events as safe-guards for firing our code, but with an Angular app, it can be hard to ensure the data is ready to go. Unfortunately, the best solution I’ve seen so far is equally ugly: creating a Directive that triggers a function after rendering (hat tip to Guilherme Ferreira).

<div after-render="missionCompleted"></div>

This provides a balance of flexibility and discretion, but it means you’ll have to remember to add the directive at the bottom of each of your templates. If you’ve got a more clever solution, please share it in the comments below.

Tracking Other Interactions

If you’d like to use Google Tag Manager and Google Analytics more extensively in your application, take advantage of Google Tag Manager’s reusable Variables to streamline the process. First, you’ll want to build a reusable service for interacting with GTM.

(function(angular) {

  angular
    .module('app', [])
    .service('GTMService', GTMService);

  var dataLayer = window.dataLayer = window.dataLayer || [];

  function GTMService() {

    return function GTMService(obj) {

      dataLayer.push(obj);

    };

  }

})(angular);

This can be further evolved to streamline things like event tracking.

(function(angular) {

  angular
    .module('app', [])
    .service('GTMService', GTMService);

  var dataLayer = window.dataLayer = window.dataLayer || [];

  function GTMService() {

    function push(obj) {

      dataLayer.push(obj);

    }

    push.trackEvent = function(obj) {

      var attr = obj.attributes;
      var abort;

      angular.forEach(['category', 'action'], function(el) {

        // Only abort when a required property is actually missing
        if (typeof attr[el] === 'undefined') {
          console.log('trackEvent: Missing required property ' + el + '. Aborting hit.');
          abort = true;
        }

      });

      if (abort) return;

      push({
        'event': 'ngTrackEvent',
        'attributes': {
          'category': attr.category,
          'action': attr.action,
          'label': attr.label,
          'value': attr.value,
          'nonInteraction': attr.nonInteraction
        }
      });

    }

    return push;

  }

})(angular);

With the above example, you could create a single Tag and Trigger for your Event Tracking, and simply invoke GTMService.trackEvent whenever you’d like to fire an event.
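Stripped of the Angular plumbing, the callable-with-methods pattern the factory returns behaves like this sketch:

```javascript
// A standalone (non-Angular) sketch of the service above: the returned
// function can be called directly OR through its convenience method.
var dataLayer = [];

function GTMService() {
  function push(obj) { dataLayer.push(obj); }
  push.trackEvent = function(attr) {
    push({ event: 'ngTrackEvent', attributes: attr });
  };
  return push;
}

var gtm = GTMService();
gtm({ event: 'custom' });                              // plain push
gtm.trackEvent({ category: 'Video', action: 'Play' }); // convenience wrapper
console.log(dataLayer.length); // 2
```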

The Angulartics source code is a great place to learn more about different ways to integrate the two for maximum ease of use.

Other Front-end Frameworks

Many of these same concepts apply to other frameworks, e.g. Knockout, Backbone, and so on. If you’d be interested in knowing how we would approach integrating Google Tag Manager with another framework, let us know in the comments and we’ll see about writing a post covering that particular library.

To Recap

These components should help get you started. Remember, the data pushed to the dataLayer is inert by default, so lean towards ‘over-sharing’. You can automate pageview tracking, model change observation, and template rendering with our example components, and you can use a simple service to interact with the dataLayer in your controllers and other components. Finally, you can further streamline things like event tracking by creating your own services, and I’d recommend reviewing the Angulartics module if you’d like to learn more.

Have you integrated Google Tag Manager and Angular before? What was your experience? Share it with us in the comments.


Extending Google Analytics with Programmatic Data Import

There are lots of external data sources with juicy information you might like to see modeled in Google Analytics. Maybe you’ve got cost data from non-Google ad networks you’d like to see accounted for in your reports. Maybe you have data from a CRM that you want to use to enhance your Google Analytics reporting. For these and many other cases, Data Import is your huckleberry.

My colleague Jim Gianoglio has already written a great post detailing how to use Data Import via the web interface, where you can quickly and easily upload a CSV of custom data for Google Analytics to use. But what about when you’ve got data that updates frequently? Can we automate that process?

The answer is yes; we can do programmatic Data Import with Google Analytics. Today we’ll look at two examples – Salesforce lead information and Facebook Ads Cost Data.

Step 0: Prep Work

In order to bring our data into Google Analytics, we have to be able to join it with the data already in Google Analytics using a shared key. If you’re not familiar with this concept, think of a coat check – you hand in your coat and get a ticket. The coat check employee puts a matching ticket on your coat and hangs it.

Later, you can exchange your ticket for the same jacket. Your ticket is the key – both the coat and the owner have the same number. This allows the coat check employee to not have to remember all the names and faces of jacket owners, and the jacket owners to not have to recite all the details about their coat when they try and claim it.
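In code, the matching works like this toy sketch (the rows and values are made-up samples), with source/medium playing the part of the ticket number:

```javascript
// Two datasets joined on a shared key, the way GA matches uploaded cost
// rows to hits on utm_source / utm_medium.
var hits  = [{ source: 'facebook', medium: 'cpc', sessions: 40 }];
var costs = [{ source: 'facebook', medium: 'cpc', adCost: 13.12 }];

var joined = hits.map(function(hit) {
  var match = costs.filter(function(c) {
    return c.source === hit.source && c.medium === hit.medium;
  })[0];
  // Merge the cost columns onto the matching hit row
  return match ? Object.assign({}, hit, match) : hit;
});

console.log(joined[0].adCost); // 13.12
```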

Coat Check

For our Facebook data, we’ll be using our Campaign Source and Medium as our coat check ticket – on all of our Facebook ad destination URLs, we’ll add campaign parameters like this:

http://oursite.com/ads/facebook-ad-landing-page?utm_source=facebook&utm_medium=cpc&utm_campaign=facebook-ads&utm_content=banana+smoothie+post

For our Salesforce data, we’ll be using our Google Analytics Client ID. We’ll have to have stored the Client ID in Salesforce, too. I’d recommend utilizing a hidden field on all Salesforce forms to capture Client IDs. You’ll also need to have created a Custom Dimension in Google Analytics and stored Client IDs there, too.

Editor’s Note: This post will walk through the programmatic data import process for Google Analytics, but won’t go into the specifics required to create your data sets, set up automated scripts, and handle authentication.

We’re assuming you’re technically savvy enough to gain authorized access to Google Analytics, Salesforce, and Facebook (or your data source of choice). For more on how to access your Salesforce data, check out their documentation on their web server OAuth flow. You’ll find Facebook’s docs on authentication here.

Fair warning: all of these authentication schemes are relatively tool-specific and confusing. For Salesforce, make sure your App Client has the refresh permission configured, and for Facebook, you’ll need to get the ads_read permission from a user managing those ads and trade up for a long-lived token. And for Google Analytics, you’ll want to follow the service-type authorization flow. Hopefully those tips save you some time!

One more note before we jump in – we’re going to assume that you’ve got a server somewhere that you can use to host a service that handles the orchestration of everything we’ll discuss below. If you’re looking for something on the cheap and don’t anticipate high volumes of data, I’d suggest checking out Google Apps Script. It’s a great fit for this situation; you can set up triggers and make HTTP calls to external services. Just be aware that your scripts will die if they go over the six minute execution limit.

Step 1: Creating Our Data Sets

In order to send data into Google Analytics, we’ll need to create a Data Import Data Set. We can create and configure our Data Sets from within the Data Import configuration interface, which you can find under the Property column in our Admin Tab:

Data Import Option

Inside the Data Import interface, click New Data Set.

New Data Set

Then, we’ll select the Cost Data data set type and click Next Step.

Cost Data Set Option

We’ll give our data set a name, and select which views we would like the data set to bring data into.

Select Views

Finally, we will configure the schema for our data set. Depending on what data we have available, we might use more or fewer dimensions and metrics. Then we’ll be ready to save our data set.

Once we’ve saved our data set, we’ll need to do two things:

  • Click Get Schema and copy down what we see
  • Click Get Custom Data Source ID and copy down the ID of our data set

Setting up Our Facebook Cost Data

I’m going to add the Campaign, Ad Content, and Destination URL dimensions. I’ll also add the Impressions, Cost, and Clicks metrics. Finally, I’ll select Overwrite for my cost data import behavior. This means that if I upload two sets of data with duplicate keys, Google Analytics will overwrite the values from the first set of data with the values from the second set. You may prefer the other option, Summation, which will sum up the values of duplicate keys instead. Then we’ll click Save.

Define Data Import Schema

And that’s it! Our Facebook Cost Data data set is ready to go. Make sure you copy down the schema and data source ID, as outlined above.

Setting up Our Salesforce Data Set

For our Salesforce data set, we’re going to first need to create a Custom Dimension named Salesforce ID. Which ID you use (Opportunity, Lead, Contact, etc.) is up to you – you’ll have to ask yourself how you want to model your data in Google Analytics. For our example, I’ll be using the Lead ID.

Make sure you’ve got your Client ID dimension set up too, as outlined in Step 0. If you want to include any other data from Salesforce, e.g. Department Contacted, you’ll need to create additional Custom Dimensions for that data. We’re going to keep it simple for today.

Once you’ve got that in place, head over to our Data Import interface and create a new Data Set. You might be tempted to pick User Data, but that relies on the User ID dimension as our joining key, which we may or may not have. We want to use our Client ID, a Custom Dimension, as our joining key, so we’ll select Custom Data.

Next, we’ll select Client ID as our Key, which we’ll find nested under Custom Dimensions in the dropdown. Then, we’ll select Salesforce ID as our Imported Data.

At the end of the creation wizard, we have the option to select whether we’d like to overwrite dimension values if a hit contains values for the same dimensions. If you’re planning on sending in this data along with hits from another source and you have reason to believe the value might change, you might select No, and defer to the value with your hit. If you believe your automatic upload will always contain the most correct data, you can select ‘Yes’. We’ll go with Yes, because we don’t expect our Salesforce ID to ever show up on a hit and if it did, it would be better to defer to our uploaded data. You should end up with something like this:

Salesforce Data Import Schema

Save the data set, copy down the Data Source ID and schema, and we’re ready to start sending in data!

Importing the Data

Now that our data sets are in order, it’s time to begin uploading the data to Google Analytics. Remember the schema that Google Analytics provided us for our data sets? For example, our Facebook Cost data set has this schema:

ga:date,ga:medium,ga:source,ga:adClicks,ga:adCost,ga:impressions,ga:campaign,ga:adContent,ga:adDestinationUrl

Once we have the data, we’ll need to format it to follow the schema provided by Google and then stuff it into a CSV file. Ultimately, the file should look something like this:

ga:date,ga:medium,ga:source,ga:adClicks,ga:adCost,ga:impressions,ga:campaign,ga:adContent,ga:adDestinationUrl
20160101,cpc,facebook,10,13.12,1001,Dan's Awesome Campaign,Dan's Even Better Ad Name,http://www.lunametrics.com?utm_source=facebook&utm_medium...
...

Or for our Salesforce data set:

ga:dimension1,ga:dimension2
350989407.1454899563,00Q310000173c4m
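Generating these files is mostly string assembly: a header row matching the schema, then one comma-joined row per record. A hypothetical helper (the function name and sample row are illustrative, not part of any API):

```javascript
// Build the CSV body Google Analytics Data Import expects:
// header row = the schema copied from the Data Set interface.
function buildCsv(schema, rows) {
  return [schema.join(',')]
    .concat(rows.map(function(row) { return row.join(','); }))
    .join('\n');
}

var csv = buildCsv(
  ['ga:dimension1', 'ga:dimension2'],
  [['350989407.1454899563', '00Q310000173c4m']]
);

console.log(csv);
// ga:dimension1,ga:dimension2
// 350989407.1454899563,00Q310000173c4m
```

Note that any field containing a comma would need CSV quoting; a real implementation should use a proper CSV library.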

To upload the data to Google Analytics, simply POST your CSV to:

https://www.googleapis.com/upload/analytics/v3/management/accounts/accountId/webproperties/webPropertyId/customDataSources/customDataSourceId/uploads

Use the MIME Type application/octet-stream, where the webPropertyId is our Property ID (a.k.a. UA number or Tracking ID), the accountId is the middle numbers of our UA number (e.g. UA-XXXXXX-YY), and the customDataSourceId is our Data Source ID from earlier. Once we’ve posted our data, Google Analytics will return an uploads resource with data specific to your freshly uploaded data set, or any errors that occurred.
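The endpoint is easy to get wrong, since three different IDs are interpolated into it. A hypothetical helper to assemble it (the example IDs are made up):

```javascript
// Build the Management API upload endpoint from the three IDs
// described above (account ID, property ID, custom data source ID).
function uploadUrl(accountId, webPropertyId, customDataSourceId) {
  return 'https://www.googleapis.com/upload/analytics/v3/management' +
    '/accounts/' + accountId +
    '/webproperties/' + webPropertyId +
    '/customDataSources/' + customDataSourceId +
    '/uploads';
}

console.log(uploadUrl('123456', 'UA-123456-1', 'aBcDeFg'));
```

POST your CSV body to the resulting URL with your OAuth credentials attached.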

A few quick notes on limits – in addition to the normal Google API restrictions, you’re only able to create 50 data sets per property and upload 50 CSVs of data per day, each with a maximum size of 1GB. You’re also limited to importing 100MB of data per ga:date value in Cost Data imports.

Using The Data in Reports

Once you’ve uploaded your cost data, you should be able to see the results in the Cost Analysis report, as well as in the MCF reports. You can also create a Custom Report to analyze your shiny new cost data.

Your Salesforce data will behave a little differently; Google Analytics will only populate the data associated with the Client IDs you specified after a subsequent session from that Client ID is recorded. So, if your visitor never returns after submitting a Salesforce lead, their Salesforce ID will not show up in the reports.

This applies to other keys, too; your data will not be populated retroactively. For this reason, it’s often a better idea to use a non-interaction Event via the Measurement Protocol in order to send in custom data to Google Analytics.
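A Measurement Protocol hit is just a URL-encoded payload of parameters. Here is a hedged sketch of what a non-interaction Event carrying the Salesforce ID might look like; the category/action names and the choice of cd2 as the Salesforce ID slot are assumptions for illustration, not fixed by the protocol:

```javascript
// Assemble a Measurement Protocol v1 Event payload. Only v, t, tid, and
// cid are required by the protocol; the rest are our illustrative choices.
function buildMpPayload(propertyId, clientId, salesforceId) {
  return [
    'v=1',
    't=event',
    'tid=' + propertyId,
    'cid=' + clientId,
    'ec=CRM',               // event category (assumed)
    'ea=Salesforce%20Lead', // event action (assumed, URL-encoded)
    'ni=1',                 // non-interaction: won't affect bounce rate
    'cd2=' + salesforceId   // assumes Custom Dimension 2 holds the Salesforce ID
  ].join('&');
}

var payload = buildMpPayload('UA-123456-1', '350989407.1454899563', '00Q310000173c4m');
console.log(payload.indexOf('ni=1') > -1); // true
```

You would POST this body to https://www.google-analytics.com/collect from your server-side script.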

What data sources would you like to import into Google Analytics? Share in the comments below.


Increase Your Google Analytics Page Speed Hit Limit


It’s not much of a secret that slow-loading pages on your site decrease the chances of a visit converting. Walmart, Google, and others have released data that point to real-world impacts from sluggish page load times. Enter the Page Timings report in Google Analytics – chock-a-block full of actionable metrics. You can see exactly which pages on your site are loading slowly, and you can apply Segments to focus on just the traffic you’re interested in.

There’s just one catch – the data is pretty heavily sampled. By default, Google Analytics collects timing data on just 1% of users who visit your site, or 10,000 hits per day, whichever comes first. You can increase the percentage of users that are sampled, but once you hit 10,000 hits, you’ll get no more data. My colleague Samantha Barnes has written an excellent guide that fleshes out this feature toe-to-tip.

What I’m here to tell you today is that you actually can exceed the 10,000 hit limit for your page timing data. How’s that, you’re asking? By using Events, of course.

How It Works

The page timing data you see in Google Analytics isn’t gathered by any kind of JavaScript witchcraft in the browser: it comes from the Navigation Timing interface, an API exposed by modern browsers that holds data on a slew of different and important steps the browser takes on the way to rendering a page. The Mozilla Developer Network has a great guide to this API here.

Since this interface is available to any Joe Schmoe, we, too, can use it to get data about how long our pages take to load.

Capturing The Data

You can implement this through JavaScript on your page or through Google Tag Manager. We’ve even created a Page Load Timing GTM Recipe to make your life easier!

I’ve also provided instructions below:

  1. Create a new Google Analytics Event. Fill in all the normal information, your Google Analytics Tracking ID, etc.
  2. Give it whatever Category name pleases you best; I used “Site Speed”
  3. For the Action, fill in the Variable {{Page URL}}. Either type this, or choose it from the lego dropdown box.
  4. For the label, click the Variables drop down and choose Create a new Variable.
  5. Select the Variable type Custom JavaScript and paste the below into the field:

function() {
 
  var timing = window.performance.timing;
  var ms = timing.loadEventStart - timing.navigationStart;
  
  return Math.round(ms / 100) / 10;
  
}

  6. Finally, select ‘True’ for the Non-Interaction Hit field. We’ll be firing this automatically and we don’t want it to artificially affect our bounce rate.
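For reference, the rounding in the Custom JavaScript Variable above converts milliseconds into seconds with one decimal place:

```javascript
// Math.round(ms / 100) / 10 rounds milliseconds to tenths of a second.
function toSeconds(ms) {
  return Math.round(ms / 100) / 10;
}

console.log(toSeconds(2347)); // 2.3
console.log(toSeconds(850));  // 0.9
```

Bucketing load times to tenths of a second keeps the Event Label cardinality manageable in your reports.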

In the end, your Event should look something like this:

Once you’ve got that configured, you’ll need to create two Triggers – one to fire the tag, and another to block it when the API isn’t available. The firing Trigger is super simple – we’ll just use the built-in Window Load Pageview Type:

Next, we’ll need to create our blocking trigger; it should look like this:

Why Not Custom Metrics?

Custom Metrics are a great choice! If you’d like to gather the same metric in a Custom Metric, or pull out additional Timing API data and plug it into Events there, be my guest. The reason I’m recommending an Event here is that we need to fire the Event on window.load, which can come well after we’d like to send a Pageview, and thus we have to fire a second hit – so why not just use that hit for our data? If you’ve already got an Event firing on every pageview, you can just use a Custom Metric instead.

Keep in mind that adding this Event will effectively double the hits you’re generating; if you’re running close to that 10 million hit limit on a monthly basis, you should sample this hit, either through the sampleRate Field To Set or another Blocking Exception like the Blocking Trigger #7 Jon Meck outlines here.

So What Does This Do For Me?

Using your page speed timing data, you can learn which pages of your site are taking an abnormally long time to load. By sorting first by the Label, you can drill into just the pages that are going over a certain threshold, e.g. 2.0 seconds or higher, and begin to investigate why they’re not up to snuff. If you want to be really fancy, try the new Histogram Buckets feature in version four of the Core Reporting API. Some additional tools that will help you along your way:

  • WebpageTest.org, which allows you to run up to 9 simultaneous loads of a page and returns a handy waterfall of the results
  • PageSpeed Tools, which provide concrete recommendations on how to improve the performance of your pages
  • Chrome’s Timeline Tool, found in the developers console

Armed with your data and these tools, you can drastically improve the performance of your site (and hopefully your conversion rate, too).

Any questions? Sound off in the comments below.


Track Angular Exceptions Automagically In Google Analytics

We use AngularJS at LunaMetrics as part of the stack for our internal toolset. One of the things we wanted to do early on was track exceptions thrown in the application with Google Analytics, something I’ve always advocated for. The project I contribute to, Angulartics, DOES support error tracking, but it’s up to you to invoke it. I wanted something more automagic, so I set about digging in the Angular documentation to see what I could do.

After doing a little research, I knew that Angular handled exceptions internally by way of the $exceptionHandler service. The documentation spelled out that replacing the service was a viable option for changing the way exceptions were handled; I didn’t want to lose the default behavior, though, so I kept looking.

The method I ended up settling on was to use a service decorator. Service decorators are called during service instantiation, and are passed the original service instance, annotated $delegate in the docs. You can then modify, extend, replace, or patch that service to your heart’s content.

Using this approach, I extended the existing $exceptionHandler to push data about the exception into the dataLayer. I’m using Angulartics in my project, so I chose to make use of the trackException method – ultimately, this is just a fancy wrapper for an Event that gets sent to Google Analytics.

EDIT: Since this post was initially drafted, this functionality was introduced to Angulartics. Read more here. Shoutout to Oded Niv for the PR. If you’d like automagic exception tracking without using the Angulartics library, you can use the below code as a template – just replace $analyticsProvider with the provider for your own Analytics wrapper, or (shame on you) just access the global ga/_gaq objects directly.

(function(angular) {

  /**
   * This has been added to the core angulartics library since this post was drafted; do not add this in to your app!
   **/
  angular
    .module('app', [
      'angulartics',
      'angulartics.google.tagmanager'
    ])
    .config(Config);
  
  Config.$inject = ['$analyticsProvider', '$provide'];

  function Config($analyticsProvider, $provide) {

    // Extend exception handling to pass along to GA
    $provide.decorator('$exceptionHandler', ['$delegate', function($delegate) {

      return function(exception, cause) {

        // We have to instantiate a service instance with .$get() to use .exceptionTrack()
        var $analytics = $analyticsProvider.$get();

        // $delegate is a reference to the original handler
        $delegate(exception, cause);

        $analytics.exceptionTrack({
          description: exception.message,
          stack: cause + '\n' + exception.stack
        }); 

      };   

    }]);
    
  }
  /**
   * This has been added to the core angulartics library since this post was drafted; do not add this in to your app!
   **/
  
})(angular);

You can easily swap out the $analytics.exceptionTrack() call with a different way of transmitting the data to Google Analytics. If you’re using Angulartics with Google Tag Manager, make sure that you’ve got your Angulartics Event tag configured (hint: import the container JSON instead of configuring it manually).

Once you’ve got it in place, any time an exception is emitted from an Angular expression, it will be automatically sent off to Google Analytics for you to analyze/respond to. Plus, you’ll get all the associated hit data that Google Analytics collects, like browser type, device, and browser version, which can help with reproducing any tricky bugs.

Got a different approach? Share it in the comments.

The post Track Angular Exceptions Automagically In Google Analytics appeared first on LunaMetrics.

Eventbrite & Google Analytics: Setting Up Cross Domain Tracking


Eventbrite recently rolled out the ability to do true cross domain tracking with Google Analytics. What is cross domain tracking? Read more about it here. All caught up? Okay, let’s get started.

In order to do cross domain tracking with your Eventbrite events, you’ll need to decorate any links that point at Eventbrite with a special, dynamic query parameter. The parameter key is _eboga. Eventbrite requires that the value of the parameter be the user’s client ID (which is a little different from the Linker setup you might be used to).

First, the bad news

Although this attempt at a cross domain integration is better than previous iterations, the code still has a few opportunities to grow. The current implementation lacks:

  • Checks to ensure that the client ID is inherited by the correct user

    Google’s Linker plugin checks a hash of the User Agent and a timestamp to prevent “collisions”, like when a client ID is in the URL and the user shares it with another user. Eventbrite’s custom system doesn’t, which means annoying things can happen, like the page being indexed with some random user’s CID. Given that most clicks will be sent to expiring pages, this seems like a marginal risk, but it’s still a disappointment.

  • A mechanism to prevent collisions with other Eventbrite users

    Each vendor has their _eboga value stored in a cookie named _eboga on the Eventbrite site. This means if a user travels between multiple Eventbrite sites, they could wind up changing their Client ID several times, meaning they’ll be tracked as several different users. It would have been nice to see the code make use of namespaced localStorage or cookies (although both admittedly have challenges, and the structure of Eventbrite’s site doesn’t help).

  • The ability to utilize the integration with their embedded event iframes

    The code doesn’t appear to show up in the code served in Eventbrite’s embedded iframes, at least in my own tests. To test this, I found clients who were using the integration and manually created an iframe-like URL (e.g. https://www.eventbrite.com/tickets-external?eid=XXXXXXXXXXX&ref=etckt). When I inspected the source of the page, I wasn’t able to locate the code used to extract the _eboga parameter and set the cookie.

TL;DR: If you’re making heavy use of the Eventbrite iframes on your site, or are very concerned about your data hygiene, you might want to sit this one out. Sorry to be the bearer of bad news.

The Code

If you’re using Google Tag Manager, we’ve got a handy container file for you to import & instructions. Feel free to read the below, regardless.

Extracting the Client ID

To get to the client ID, you’ve got two options:

  • Ask GA politely for it
  • Pull it out of the _ga cookie

Option #1: Asking Politely

This little snippet will grab the Client ID by using the tracker.get() method.

// Returns a Google Analytics Client ID from the first
// tracker it finds or from a tracker with a given Property ID.
function getClientId(targetTrackingId) {

  var gaName = window.GoogleAnalyticsObject || 'ga';
  var trackers = window[gaName].getAll();
  var len = trackers.length;
  var tracker,
    i;

  if (!targetTrackingId) return trackers[0].get('clientId');

  for (i = 0; i < len; i++) {

    tracker = trackers[i];

    if (tracker.get('trackingId') === targetTrackingId) {

      return tracker.get('clientId');

    }

  }

}

You can pass the function a UA number, e.g. ‘UA-123456-6’, and it will extract the client ID associated with that Property. This is only important if you’re using the cookieName setting – otherwise, you can just call the function without passing anything.

Option #2: Pulling Teeth

If you can’t/don’t want to interact with the ga API, you can also extract a user’s Client ID from their _ga cookie. Here’s a snippet to get that done for you:

// Returns the Google Analytics Client ID for the user
function getClientId() {

  var _gaCookie = document.cookie.match(/(^|[;,]\s?)_ga=([^;,]*)/);
  if(_gaCookie) return _gaCookie[2].match(/\d+\.\d+$/)[0];
  
}

Decorating the links

Once we’ve gotten our client ID, we then need to find and decorate all links pointing at Eventbrite with it (and our special parameter). Here’s how to get that done:

// Adds the Eventbrite cross domain tracking parameter to any
// links pointing to www.eventbrite.com
function bindUrls() {

  var urls = document.querySelectorAll('a');
  var clientId = getClientId();
  var parameter = '_eboga=' + clientId;
  var url,
    i;

  // Bail out and log an error if we can't find a Client ID
  if (!clientId) {

    window.console && window.console.error('Unable to detect Client ID. Verify you are using Universal Analytics, the code is firing after Google Analytics has set a Client ID, and the correct targetTrackingId is set, if any.');

    return;

  }

  for (i = 0; i < urls.length; i++) {

    url = urls[i];

    if (url.hostname === 'www.eventbrite.com') {

      url.search = url.search ? url.search + '&' + parameter : '?' + parameter;

    }

  }

}

Firing The Code

Now that we’ve got all the pieces in place, we need to determine when to fire our code. We need to make sure the a elements we’re targeting are in place before we try and bind to them. The simplest way to do that is to fire our code on the DOMContentLoaded event (or window.load for older browsers). We also need to make sure that Google Analytics has loaded before the code fires, and that a client ID is available. To do this, we’ll take advantage of the Universal Analytics command queue. Let’s put it all together:

<script id="eventbrite-cross-domain-tracking">
  (function(document, window) {

    // Set this to your UA number to use a specific tracker Client ID. Defaults to
    // first tracker registered, which is fine for 99.9% of users.
    var targetTrackingId = '';

    if (!document.querySelector) return;

    var gaName = window.GoogleAnalyticsObject || 'ga';

    // Safely instantiate our GA queue.
    window[gaName]=window[gaName]||function(){(window[gaName].q=window[gaName].q||[]).push(arguments)};window[gaName].l=+new Date;

    if(document.readyState !== 'loading') {

      init();

    } else {

      // On IE8 this fires on window.load, all other browsers will fire when DOM ready
      document.addEventListener ? 
        simpleAddEvent(document, 'DOMContentLoaded', init) : 
        simpleAddEvent(window, 'load', init);

    }

    function init() {
      window[gaName](function() {

        // Defer to the back of the queue if no tracker is ready
        if (!window[gaName].getAll().length) {

          window[gaName](bindUrls);

        } else {

          bindUrls();
 
        }
 
      });

    }

    function bindUrls() {

      var urls = document.querySelectorAll('a');
      var clientId = getClientId();
      var parameter = '_eboga=' + clientId;
      var url,
        i;

      // Bail out and log an error if we can't find a Client ID
      if (!clientId) {

        window.console && window.console.error('GTM Eventbrite Cross Domain: Unable to detect Client ID. Verify you are using Universal Analytics and the correct targetTrackingId is set, if any.');

        return;

      }

      for (i = 0; i < urls.length; i++) {

        url = urls[i];

        if (url.hostname === 'www.eventbrite.com') {

          url.search = url.search ? url.search + '&' + parameter : '?' + parameter;

        }

      }

    }

    function getClientId() {

      var trackers = window[gaName].getAll();
      var len = trackers.length;
      var tracker,
        i;

      if (!targetTrackingId) return trackers[0].get('clientId');

      for (i = 0; i < len; i++) {

        tracker = trackers[i];

        if (tracker.get('trackingId') === targetTrackingId) {

          return tracker.get('clientId');

        }

      }

    }

    // Very simple event binding w/ support for attachEvent (IE8)
    function simpleAddEvent(el, evt, handler) {

      if ('addEventListener' in document) {

        el.addEventListener(evt, handler);

      } else if ('attachEvent' in document) {

        el.attachEvent('on' + evt, function(e) {

          handler.call(el, e);

        });

      }

    }

  })(document, window);
</script>

And there we have it! The script will load after both GA and the DOM are ready to go, ensuring we can tack on our client ID and _eboga parameter to all of the Eventbrite links on our page.

Have another approach? Spot a bug in my script? Sound off in the comments below.

The post Eventbrite & Google Analytics: Setting Up Cross Domain Tracking appeared first on LunaMetrics.

Tracking Very Large Transactions with Google Analytics & Google Tag Manager


Google Analytics will allow you to send 8192 bytes of data with a single request. By default, it will switch from a GET request to a POST request once you’ve crossed a threshold of about 2000 bytes, which is where certain browsers begin to get upset with the length of the GET’s query string. However, sometimes 8192 just isn’t enough.
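One quick way to see whether a given transaction is at risk is to measure the ecommerce object before sending it. This is a rough sketch only, under the assumption that the serialized JSON size roughly tracks the size of the final hit payload:

```javascript
// Rough sketch: approximate the eventual hit size by serializing the
// ecommerce object. The 8192-byte cap applies to the whole payload,
// so this is an estimate, not an exact measurement.
var HIT_BYTE_LIMIT = 8192;

function exceedsHitLimit(ecommerce) {

  return JSON.stringify(ecommerce).length > HIT_BYTE_LIMIT;

}
```

A check like this is also how you’d decide, at runtime, whether to fall back to a chunking strategy or send the hit normally.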

This issue is particularly thorny when the hit in question contains Transaction information. Believe it or not, this issue has come up with my clients – so I’m sharing my solution in case anyone else finds themselves in this, admittedly, very niche situation.

Unlike product impressions, which lend themselves nicely to chunking, transactions have associations that need to be preserved (the products in the purchase to the purchase ID). The good news is, we can trick Google Analytics into letting us break a transaction into several hits and still report on it as a cohesive unit. We do this by breaking the transaction into chunks, then changing the data very slightly between hits.

Unfortunately, this approach has some downsides – our Transactions metric will be wildly inflated, for example. With a little creativity, though, we can use Custom Metrics or Goals in order to replicate the transaction counts we would normally expect in the interface.

We have a few options to solve this problem:

Send Less Data

I can hear your booing and hissing. Hear me out – ask yourself if you really need all of the data you want to send along with your Transaction. You do? Alright, read on.

Use Data Import

Data Import is an advanced feature of Google Analytics that lets us bring in extra data to the interface without sending the data along with the hit. There are two types of Data Import – Query Time (Analytics 360 née Premium only) and Process Time. Using Data Import, we can whittle our transaction and product data down to the bare minimums – just IDs and hit/session/user-specific data.

Break Up the Transaction into Sub-transactions with Different IDs

Instead of sending the Transaction with just one hit, break it into multiple hits and append an index to the end of the Transaction ID, e.g. ‘T12345’ becomes ‘T12345-1, T12345-2, . . .’. This approach gets the data into Google Analytics, but can lead to high-cardinality problems for large organizations with lots of transaction IDs, and it inflates the count of transactions in the interface. Then there’s all the nasty side effects that we have to watch for, like revenue changes – in short, it’s a messy fix.

Break Up The Transaction into Sub-transactions with the Same ID

I know, I know. You’re saying “Dan, that’s not going to work – Google Analytics de-duplicates transactions with the same ID within the same session. Go home, kid, you’re drunk.” Well, you’re right – Google Analytics does de-duplicate duplicate transactions fired within the same session. However, the de-duplication isn’t at the Transaction ID level – it’s at the transaction level. If any data you send with the transaction changes – for example, the coupon field – the transaction will not be de-duplicated. Neat, right?

How Does It Work?

You read right! If you change any of the data about a transaction, it will not be de-duplicated – Google Analytics will only ignore transactions that are exactly the same. So, if you were to send a transaction with ID 1234 and revenue $12.50, and then immediately send a transaction with ID 1234 and revenue $12.51, guess what? You’ll see the Transaction ID 1234 in your reports, with a total of $25.01 in revenue. Go on, try it yourself. I’ll wait.

Okay, satisfied?

Armed with this knowledge, we can see that if we were to cut our very large transaction into 8000 byte chunks and change an unused field between each hit, we could track the transaction using the same ID. This would allow us to view our transactions in the interface as Google intended – we’ll see the correct total for the correct Transaction ID, and we can click on that ID to see all the products associated with the purchase.
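As a sketch of the idea (the container file below does the real work; the function and field choices here are illustrative, including the assumption that revenue should ride along on only one chunk so the reported total stays correct), chunking might look like this:

```javascript
// Illustrative only - the GTM container below implements this for real.
// Split a transaction's products into groups, reuse the same Transaction
// ID, and vary the (otherwise unused) coupon field per chunk so Google
// Analytics won't de-duplicate the hits.
function chunkTransaction(purchase, productsPerChunk) {

  var chunks = [];
  var i;

  for (i = 0; i < purchase.products.length; i += productsPerChunk) {

    chunks.push({
      id: purchase.id,                                      // same ID on every chunk
      revenue: chunks.length === 0 ? purchase.revenue : 0,  // assumption: send the total once
      coupon: 'chunk-' + (chunks.length + 1),               // differs between hits
      products: purchase.products.slice(i, i + productsPerChunk)
    });

  }

  return chunks;

}
```

Each object in the returned array would then be pushed to the data layer and sent as its own hit.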

Interested? Great! We’ve already got a Container file ready for you to use. Here’s how to install the chunker.

Note: This solution is for Enhanced Ecommerce only with Google Tag Manager. File this solution under “Google Analytics Hacks” and test everything thoroughly!

Step 1 – Download & Import the Container File

This step is pretty easy. If you’re not sure how to import a container file, check out my colleague Jim Gianoglio’s rundown on how to do so.

Get Container File

Step 2 – Change Your Existing Ecommerce Tag

You’ll need to locate your tag that you’re using to carry in your Enhanced Ecommerce data and adjust the Ecommerce settings. This might be your Pageview tag that fires on all pages, or it might be a separate Pageview or Event Tag. In this tag, uncheck the “Use data layer” box, and from the dropdown that appears, select “JS – Ecommerce Object”. This Variable will detect if the transaction needs to be chunked, and if so, it will return an empty ecommerce object. Otherwise, the ecommerce object that was supposed to be in place at this step will be returned, as usual.

Note: If you’re already using a Variable for this step, you’ll need to do some tinkering inside of the Variable. You should be able to swap DLV - Ecommerce - v1 Variable with the Variable you’ve been using. As always, test extensively before publishing!

Step 3 – Adjust The Firing Trigger

When there is a large transaction that needs to be chunked, we’ll use a Custom HTML Tag to do the work. By default, this Custom HTML tag will fire on “Pageview” on any page where the result of {{JS - Ecommerce Hit Should Be Chunked}} is true. This assumes that your data layer is loading before GTM, with the right ecommerce information loaded on the page. If you’re firing your transactions using a custom event, like below, you’ll need to tweak the firing trigger.

dataLayer.push({
  event: 'transactionReady',
  ecommerce: {
     ... // etc
  }
});

To fix this complication, change the Trigger {{Pageview - Should Ecommerce Hit Be Chunked Equals true}} to use the trigger type “Custom Event” and change the Event name field to match your event, e.g. “transactionReady”.

Step 4 – Add Your Google Analytics Tracking ID

So now your page has loaded, the transaction has been determined to be too large, so our Custom HTML Tag is going to split it into chunks and then use data layer pushes to fire in the chunked Transactions using a Google Analytics Event tag. The tag and trigger are all set up for you, but you need to go into the tag called GA - Event - Transaction Chunk and update your Tracking ID to match your other Google Analytics tags.

Step 5 [Optional] – Set the Transaction Custom Metric

Since we’ll be sending a transaction for each chunk of our total transaction, we’re going to blow our Transactions metric way, way out of the water. This is the only downside to this approach. There is a way we can replicate our transaction count, by using a Custom Metric. Create the Custom Metric in Google Analytics, jot down the index number, then configure it on the original tag that typically sends in transactions. We’ve included a Variable named JS - Transaction Count Custom Metric that you can use for the value of the Custom Metric.
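To illustrate the idea only – the imported container includes the real Variable, and this is a guess at the concept, not its actual contents – such a Variable just needs to emit 1 exactly once per logical transaction, so that summing the Custom Metric reproduces the true purchase count:

```javascript
// Hypothetical sketch. Because the original tag fires once per real
// transaction, returning 1 whenever a purchase is present on that hit
// makes the Custom Metric's sum equal the true transaction count, even
// though the built-in Transactions metric is inflated by the chunks.
function transactionCountMetric(ecommerce) {

  // Count only when there's a purchase on this hit
  return ecommerce && ecommerce.purchase ? 1 : undefined;

}
```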

Step 6 – Test!

That’s it! Once you’ve got the above configurations in place, run a test transaction that would overflow the 8000 byte limit. You should see several transChunk events appear in the Debug panel.

Downsides & Considerations

Of course, there are still some downsides to this approach. For starters, each chunk is going to count as a separate transaction. In turn, the Conversion metrics in your other reports are going to go way, way up. Consider anywhere you see Conversions (Google Analytics, Google AdWords, etc.) and think if this will affect those numbers. That means you’re going to need to make it clear to stakeholders that the Transaction metric is no longer going to cut the mustard.

We use the Coupon field to make sure each transaction is slightly different. If you’re currently using the Coupon field, you’ll need to do some tweaking.

If we’re using Events for all of our transaction tracking, we can set up a Goal to track the purchases instead. Goals are fired once or zero times per session, so if a session can include multiple purchases, this may not be the solution we’re looking for. If we can’t use Goals, we can configure our tags to use Custom Metrics instead. No matter what we do, since this solution breaks standard reporting in a fundamental way, it’s critical that anyone who uses the account gets a proper introduction as to what is going on when they’re viewing the reports.

And that’s it! By adjusting the transaction configuration ever so slightly between our chunks, we can send in transactions with greater than 8192 bytes of data and still see them represented in the interface as a single, cohesive unit. Don’t forget – this solution should only be used if every other option is exhausted.

Any questions? Sound off below.

The post Tracking Very Large Transactions with Google Analytics & Google Tag Manager appeared first on LunaMetrics.

Vimeo Tracking Plugin for Google Analytics & Google Tag Manager

We’re rolling out a new plugin to complement our suite of Google Tag Manager recipes. The latest addition targets the popular video-sharing platform Vimeo. Just like our YouTube Tracking plugin, this library allows you to track Play, Pause, and Watch to End events, as well as track when the user has viewed past specific percentages of the video.

You can find our complete documentation (and a handy link to download a container file with the plugin) here. If you’ve never imported a container into Google Tag Manager before, check out this handy guide from my colleague Jim Gianoglio.

The default settings will track Play, Pause, and Watch to End events, as well as when the user views past the 10%, 25%, 50%, 75%, 90%, and 100% marks. Currently, it has been tested and shown to work in the following browsers:

  • IE10+
  • Edge 14
  • Chrome 57
  • Firefox 52
  • Opera 44
  • Safari 10
  • Yandex 14

To see our plugin in action, play the video below. Events will appear in the textarea beneath as they are tracked.



Here’s the HTML of the above:

<style>
  #demo-logger {
    padding: 7px 10px;
    line-height: 1.50;
    font-size: 12px;
    height: 300px;
    width: 100%;
    resize: none;
    background-color: #404040;
    color: #fff;
    font-family: "Lucida Console", "Lucida Sans Typewriter", monaco, "Bitstream Vera Sans Mono", monospace;
    border-radius: 2px;
  }
</style>
<iframe id="vimeo" src="https://player.vimeo.com/video/12345" width="500" height="281" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>
<textarea id='demo-logger' readonly=true>Press play to begin logging events.</textarea>
<script>
  (function(document) {

    var iframe = document.getElementById('vimeo');

    autoResize(iframe, '16:9');

    window.addEventListener('resize', autoResize.bind(this, iframe, '16:9'));

    function autoResize(div, ratio) {

      var dims = ratio.split(':').reduce(function(dims, val, ind) {

        var key = ind ? 'width' : 'height';
        dims[key] = Number(val);

        return dims;

      }, {});

      div.style.width = '100%';
      div.style.height = (dims.width / dims.height * Number(div.getBoundingClientRect().width)) + 'px';

    }

  })(document);
</script>

For our demo, a Custom HTML tag is fired in GTM to append the event to the textarea on this page. It is triggered by vimeoTrack events.

<script>
  (function(document) {
    var loggerId = 'demo-logger';
    var el = document.getElementById(loggerId);
	if (el.value === 'Press play to begin logging events.') el.value = '';

    var str = '[' + +new Date() + ']:\n' + [
      'Event: {{Event}}',
      'Video Action: {{DLV - Video Action}}',
      'Video Name: {{DLV - Video Name}}'
    ].join('\n') + '\n';

    if (!el.value) {

      el.value = str;

    } else {

      el.value = el.value.split('\n').concat(str).join('\n')

    }
    
    el.scrollTop = el.scrollHeight;

  })(document);
</script>

If you’re curious about Vimeo’s Player API, check out their documentation. Vimeo’s Player API wasn’t a huge chore to work with, but be aware that video metadata is loaded asynchronously after the video itself. To run event bindings after video metadata is loaded, make use of the ‘Player.getVideoTitle()’ method. This method returns a ‘Promise’-like object, which you can then call ‘.then(callback)’ on and run the rest of your bindings with the required data in hand.
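For example, deferring bindings until the metadata resolves might look like the sketch below. The helper and handler names are my own, not part of Vimeo’s API; in the browser, player would be something like new Vimeo.Player(iframeElement):

```javascript
// Sketch: wait for video metadata before binding, since Vimeo loads it
// asynchronously after the player itself. getVideoTitle() returns a
// Promise-like object per Vimeo's Player API.
function bindAfterMetadata(player, onEvent) {

  return player.getVideoTitle().then(function(title) {

    // Metadata (including the title) is now available, so it's safe
    // to report it alongside player events.
    player.on('play', function() {

      onEvent('Play', title);

    });

    return title;

  });

}
```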

Ready to get started? Head over to our recipe page and download the plugin. If you’d like to contribute or discover a bug, submit a pull request on our GitHub repository.

The post Vimeo Tracking Plugin for Google Analytics & Google Tag Manager appeared first on LunaMetrics.


Wistia Tracking Plugin for Google Analytics & Google Tag Manager

We’re rolling out a new plugin to complement our suite of Google Tag Manager recipes. The latest addition targets the popular video-hosting platform Wistia. Just like our YouTube Tracking plugin, this library allows you to track Play, Pause, and Watch to End events, as well as track when the user has viewed past specific percentages of the video.

You can find our documentation (and a handy link to download a container file with the plugin) here. If you’ve never imported a container into Google Tag Manager before, check out this handy guide from my colleague Jim Gianoglio.

The default settings will track Play, Pause, and Watch to End events, as well as when the user views past the 10%, 25%, 50%, 75%, 90%, and 100% marks. Currently, it has been tested and shown to work in the following browsers:

  • IE8+
  • Edge 14
  • Chrome 57
  • Firefox 52
  • Opera 44
  • Safari 10
  • Yandex 14

IMPORTANT: By default, if Wistia detects that Google Analytics is installed on the page, it will already send events to your account. If you’re using hard-coded Google Analytics, this might be just fine for your use case – however, if you’re using GTM, you must add the class ‘googleAnalytics=false’ to your embed code, as seen in our example below. In our tests, every tracker is dispatched a hit when a Wistia automagic event goes out – if you’re using GTM, you’re almost certainly over-reporting on these events, as GTM creates a new tracker for every hit sent on the page.

To see our plugin in action, play the video below. Events will appear in the textarea beneath as they are tracked.


 


Here’s the HTML of the above:

<style>
  #demo-logger {
    padding: 7px 10px;
    line-height: 1.50;
    font-size: 12px;
    height: 300px;
    width: 100%;
    resize: none;
    background-color: #404040;
    color: #fff;
    font-family: "Lucida Console", "Lucida Sans Typewriter", monaco, "Bitstream Vera Sans Mono", monospace;
    border-radius: 2px;
  }
</style>
<script src="//fast.wistia.com/embed/medias/j38ihh83m5.jsonp" async></script>
<script src="//fast.wistia.com/assets/external/E-v1.js" async></script>
<div id="wistia" class="wistia_embed wistia_async_j38ihh83m5 googleAnalytics=false" style="height:349px;width:620px">&nbsp;</div> <textarea id='demo-logger' readonly=true>Press play to begin logging events.</textarea>
<script>
  (function(document) {

    var iframe = document.getElementById('wistia');

    autoResize(iframe, '16:9');

    window.addEventListener('resize', autoResize.bind(this, iframe, '16:9'));

    function autoResize(div, ratio) {

      var dims = ratio.split(':').reduce(function(dims, val, ind) {

        var key = ind ? 'width' : 'height';
        dims[key] = Number(val);

        return dims;

      }, {});

      div.style.width = '100%';
      div.style.height = (dims.width / dims.height * Number(div.getBoundingClientRect().width)) + 'px';

    }

  })(document);
</script>

For our demo, a Custom HTML tag is fired in GTM to append the event to the textarea on this page. It is triggered by wistiaTrack events.

<script>
  (function(document) {
    var loggerId = 'demo-logger';
    var el = document.getElementById(loggerId);
	if (el.value === 'Press play to begin logging events.') el.value = '';

    var str = '[' + +new Date() + ']:\n' + [
      'Event: {{Event}}',
      'Video Action: {{DLV - Video Action}}',
      'Video Name: {{DLV - Video Name}}'
    ].join('\n') + '\n';

    if (!el.value) {

      el.value = str;

    } else {

      el.value = el.value.split('\n').concat(str).join('\n')

    }
    
    el.scrollTop = el.scrollHeight;

  })(document);
</script>

If you’re curious about Wistia’s Player API, check out their documentation. Having built similar plugins in the past, we feel their simple command-queue style syntax is a breeze to work with. We also love that they support applying a command to all videos without having to watch for them to load or iterate through on-page elements. Huge kudos to the Wistia team for designing such a pleasant API to work with.
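As a sketch of that command-queue style – wrapped in a function here so it reads standalone; on a real page the queue is window._wq (i.e. you’d pass window._wq = window._wq || []), and the onEvent handler is my own illustrative addition:

```javascript
// Sketch of Wistia's queue pattern: push a handler onto the queue and
// it runs once per matching video; the special id '_all' applies the
// command to every video on the page.
function registerForAllVideos(wq, onEvent) {

  wq.push({
    id: '_all',
    onReady: function(video) {

      video.bind('play', function() {

        onEvent('Play', video.name());

      });

    }
  });

}
```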

Ready to get started? Head over to our recipe page and download the plugin. If you’d like to contribute or discover a bug, submit a pull request on our GitHub repository.

The post Wistia Tracking Plugin for Google Analytics & Google Tag Manager appeared first on LunaMetrics.

A Developer’s Guide To Implementing The Data Layer


This is a post written for developers. If you’re not a developer or you do not have access to make changes to the source code of the site that you’d like to add initial dataLayer values to, forward this post to the appropriate persons. Seriously, this post is for developers only. Get out. Go.

Hello Developer!

I understand you recently received an (email|task|ticket|request|sticky note) asking YOU to implement something called a ‘data layer’ on your site, possibly with some details about what it should include. This guide is meant to flesh out that request so you’ll know what you’re doing.

Before We Talk Data Layer

In order to set this up, you need a Google Tag Manager or Google Optimize snippet (or potentially both, if your team has asked for both products to be deployed) from the team that submitted the request. If you don’t have that, request it from your team and cool your heels until they send them over. Both of these tools can use the same data layer!

The snippets look similar to what’s below, but will have special IDs for your organization instead of FOO, which I’ve used below. These snippets must go as high in the head of your page as possible.

Google Optimize Snippet

Google Optimize is a great free/paid A/B testing tool from Google that allows you to create and run experiments on your website. Before you can start using it, however, you need to install it on your website. Need this? Check out these detailed Google Optimize installation instructions.

Here’s generally what it will look like, and you’ll need to update certain values.

<script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
  ga('create', 'UA-123456-1', 'auto');
  ga('require', 'GTM-FOO');
</script>

Google Tag Manager Snippet

Google Tag Manager is a tool that makes it easier to add tags and tracking to your site for analytics, advertising, SEO fixes, you name it! It’s our preferred method for adding Google Analytics to a page. Learn more here about how Google Tag Manager and Google Analytics work together.

Just like Google Optimize, a developer will need to add this to your website before you can start using the tool. Here are detailed instructions for Google Tag Manager installation.

<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-FOO');</script>
<!-- End Google Tag Manager -->

Note: To keep it simple, the vast majority of websites can safely ignore the GTM iframe snippet. More on what that is here.

Why Do We Need A Data Layer?

The above instructions and links are all that you need to successfully install and start using tools like Google Optimize and Google Tag Manager. These tools will load in the browser for the user that is viewing your site and begin performing the tasks they’ve been instructed to do, whether that’s tracking clicks on PDFs or showing two different versions of a headline. Great!

Both of these tools use information from the page itself, as well as information entered manually into the tool, to make decisions and share information with other tools.

Making decisions comes in the form of deciding when to Trigger certain tags inside of Google Tag Manager, or determining when an experiment is shown in Google Optimize. Sharing data means sending information about your page or users to another tool, like Google Analytics, Google AdWords, or third-party tools.

We use the data layer to surface information from your server, so these tools can make decisions or share data with other tools.

Depending on what platform you use to host your website, you likely have a wealth of information on the server about the content of your pages and the users that are accessing it. The data layer is how we make that information available to tools like Google Tag Manager and Google Optimize so they can be used easily. The setup on the backend will be specific to your platform, but the output will be standardized.

The Data Layer Snippet

Immediately before the GTM and/or Optimize snippets, place this code:

var dataLayer = window.dataLayer = window.dataLayer || [];
dataLayer.push({
  key: 'value'
  ...
});

Replace the ellipses with data from your backend. The team that requested the change should be able to tell you what data they need; you’ll need to extract that data from your system and populate the values here. This data will appear as key/value pairs. The keys can be named almost anything you like, and the values should be dynamic. Important rules here:

  1. It must be on the page when the browser receives the initial response from the server. Absolutely no AJAX.
  2. It must not be edited via code after the snippets. No going back and adding stuff – the data must be in place at the time the page is sent to the user.
  3. It must always appear ABOVE the GTM and/or Optimize snippets.
  4. If no data is required for a given page, the Data Layer snippet can be omitted – both snippets will see that it isn’t present and initialize a blank one of their own.

Placing Values on the Data Layer

This part gets a little tricky. As a website developer, you know best how to get the data out of your platform and echo it onto the page. I can’t really help you here. Here’s an example from a post that we wrote about pulling information from WordPress to create date range cohorts for content, but every platform will be unique.

The part I can help with is what it should look like on the page!

Here’s a complete example with a data layer and Google Tag Manager snippet, based on an implementation we used with a client:

<script>
var dataLayer = window.dataLayer = window.dataLayer || [];
dataLayer.push({
  page: {
    category1: 'Auto',
    category2: 'Life hacks',
    platform: 'Foo',
    wordCount: 40,
    length: 400
  },
  user: {
    backendId: '20d75b5c-5143-11e7-b114-b2f933d5fe66'
  },
  site: {
    owner: 'bar'
  },
  session: {
    status: 'anonymous',
    checkedOut: false
  }
});
</script>
<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-FOO');</script>
<!-- End Google Tag Manager -->

Wrapping It All Up

So there you have it – information is taken from your server, and added to the page in the correct format. By following these instructions, now your team members using Google Tag Manager and Google Optimize will be able to use the information you’ve given them in a variety of ways.

Want a visual guide? Here’s a handy example of taking Category information from a blog post (this one!), storing it on the data layer, accessing it in Google Tag Manager and Google Optimize, then seeing the final result.

LunaMetrics Data Layer Infographic

Frequently Asked Questions

There’s a lot of documentation around these items, and a lot of our recommendations have come from our consulting experience with customers.

Q: Can’t I just do dataLayer = [{}]?

A: I don’t have enough fingers and toes to count the number of times that sloppy instantiation like that has led to data loss. Just use my syntax. Why? Not checking for an existing variable (like above) can overwrite the reference that GTM depends on, and it can be a real pain to troubleshoot, and it happens all. The. Time. Please, please just take my advice and don’t get cute. Remember: if it breaks and you changed it, who’s going to get yelled at? You. It’s going to be you. And I can be very loud.
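If you’re curious what actually goes wrong, here’s a toy demonstration of the failure mode in plain JavaScript; `gtmsView` merely stands in for the reference GTM keeps internally, it’s not GTM’s actual code:

```javascript
// GTM captures a reference to the dataLayer array when it loads.
var dataLayer = [];
var gtmsView = dataLayer; // stands in for GTM's internal reference

// A later script "instantiates" the data layer the sloppy way...
dataLayer = [{ foo: 'bar' }];
dataLayer.push({ baz: 'boo' });

// ...and GTM's reference still points at the original, empty array.
console.log(gtmsView.length); // 0 -- neither object ever reaches GTM
```

The always-push syntax avoids this entirely: `window.dataLayer = window.dataLayer || []` preserves any existing array, and `.push()` mutates it in place, so every consumer sees every message.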

Q: What kind of things go in the dataLayer part?

A: Here’s a great post from my very talented colleague Dorcas Alexander that lists off a bunch of ideas for inspiration. Again, think about any information that will be helpful for making decisions or that may need to be shared with another tool.

Q: But what’s actually going on under the hood? I want to know how it works.

A: Great! Here’s a laundry list of blog posts to go through; they’ll teach you just about everything you’ll need.

Still have questions? Is this guide incomplete? Sound off in the comments below and I’ll take a look.

The post A Developer’s Guide To Implementing The Data Layer appeared first on LunaMetrics.

Google Analytics API v4: Histogram Buckets

$
0
0

Google Analytics Histogram Bucket

Back in April of last year, Google released version 4 of their reporting API. One of the new features they’ve added is the ability to request histogram buckets straight from Google, instead of binning the data yourself. Histograms allow you to examine the underlying frequency distribution of a set of data, which can help you make better decisions with your data. They’re perfect for answering questions like:

  • Do most sessions take about the same amount of time to complete, or are there distinct groups?
  • What percentage of page loads happen in under two seconds?
  • What is the relationship between session count and transactions per user?

Want to see for yourself? We’ve got a handy demo you can use to visualize some of your very own data. To get started, click ‘Connect’ below.

Try It Yourself

We’ve put together a simple demo that you can use to do a little exploring.

How It Really Works

Here’s how to use this new Histogram feature yourself with the API.

Note: we’re assuming you’ve got the technical chops to handle authorizing access to your own data and issuing the requests to the API.

Here’s what a typical query looks like with the new version of the API:

{
  "reportRequests": [
    {
      "viewId": "VIEW_ID",
      "dateRanges": [
        {
          "startDate": "30daysAgo",
          "endDate": "yesterday"
        }
      ],
      "metrics": [
        {
          "expression": "ga:users"
        }
      ],
      "dimensions": [
        {
          "name": "ga:hour"
        }
      ],
      "orderBys": [
        {
          "fieldName": "ga:hour",
          "sortOrder": "ASCENDING"
        }
      ]
    }
  ]
}

This query will return a row for each hour, with the number of users that generated a session during that hour for each row; simplified, it’d be something like this:

[
  ['0', 100],
  ['1', 100],
  ['2', 100],
  ['3', 110],
  ['4', 120],
  ['5', 140],
  ['6', 220],
  ['7', 300],
  ...
]

Wouldn’t this data be more useful if it were dayparted? Let’s use the histogram feature to bucket our data into traditional TV dayparts:

Early Morning 6:00 AM – 10:00 AM
Daytime 10:00 AM – 5:00 PM
Early Fringe 5:00 PM – 8:00 PM
Prime Time 8:00 PM – 11:00 PM
Late News 11:00 PM – 12:00 AM
Late Fringe 12:00 AM – 1:00 AM
Post Late Fringe 1:00 AM – 2:00 AM
Graveyard 2:00 AM – 6:00 AM

To request our data be returned in these new buckets, we’ll need to make two modifications to our query from before. The first change we’ll make is to add a histogramBuckets array to the ga:hour object in our dimensions array. We’ll populate this with ["0", "2", "6", "10", "17", "20", "22", "24"]. Each number in this sequence marks the beginning of a new histogram bin.

The end of the bin is inferred by the number that follows it, and if values exist below the first bin’s minimum, an additional bin will be tacked on at the beginning to contain those values. For example, if we had started our histogramBuckets with "2" instead of "0", the API would add a new bucket named "<2" to the beginning, and it would contain the values for matching rows where the ga:hour dimension was 0 or 1.

The second change we need to make is to add "orderType": "HISTOGRAM_BUCKET" to the orderBys portion of our request.

{
  "reportRequests": [
    {
      "viewId": "70570703",
      "dateRanges": [
        {
          "startDate": "30daysAgo",
          "endDate": "yesterday"
        }
      ],
      "metrics": [
        {
          "expression": "ga:users"
        }
      ],
      "dimensions": [
        {
          "name": "ga:hour",
          "histogramBuckets": [
            "0",
            "2",
            "6",
            "10",
            "17",
            "20",
            "22",
            "24"
          ]
        }
      ],
      "orderBys": [
        {
          "fieldName": "ga:hour",
          "orderType": "HISTOGRAM_BUCKET",
          "sortOrder": "ASCENDING"
        }
      ]
    }
  ]
}

Here’s what the response for that query looks like for some data from a personal site:

{
  "reports": [
    {
      "columnHeader": {
        "dimensions": [
          "ga:hour"
        ],
        "metricHeader": {
          "metricHeaderEntries": [
            {
              "name": "ga:users",
              "type": "INTEGER"
            }
          ]
        }
      },
      "data": {
        "rows": [
          {
            "dimensions": [
              "0-1"
            ],
            "metrics": [
              {
                "values": [
                  "31"
                ]
              }
            ]
          },
          {
            "dimensions": [
              "2-5"
            ],
            "metrics": [
              {
                "values": [
                  "113"
                ]
              }
            ]
          },
          {
            "dimensions": [
              "6-9"
            ],
            "metrics": [
              {
                "values": [
                  "155"
                ]
              }
            ]
          },
          {
            "dimensions": [
              "10-16"
            ],
            "metrics": [
              {
                "values": [
                  "247"
                ]
              }
            ]
          },
          {
            "dimensions": [
              "17-19"
            ],
            "metrics": [
              {
                "values": [
                  "52"
                ]
              }
            ]
          },
          {
            "dimensions": [
              "20-21"
            ],
            "metrics": [
              {
                "values": [
                  "25"
                ]
              }
            ]
          },
          {
            "dimensions": [
              "22-23"
            ],
            "metrics": [
              {
                "values": [
                  "21"
                ]
              }
            ]
          }
        ],
        "totals": [
          {
            "values": [
              "644"
            ]
          }
        ],
        "rowCount": 7,
        "minimums": [
          {
            "values": [
              "21"
            ]
          }
        ],
        "maximums": [
          {
            "values": [
              "247"
            ]
          }
        ],
        "isDataGolden": true
      }
    }
  ],
  "queryCost": 1
}

Some Downsides

As of this writing, the chief advantage of this feature is that it can save you a little logic and time when your own application wants to use histograms with your Google Analytics data. There’s no “give me X buckets” though – you have to know the range of your data ahead of time. Additionally, data is coerced into an integer, so floats are out.

That means if you want to generate bins dynamically (like we’re doing in our example), you need to first get the range of the data from Google Analytics, then calculate those buckets and send a second request. You may wish to simply request the raw data and calculate the histogram yourself.
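If you do decide to bin the raw data yourself, the logic is straightforward. Here’s a minimal sketch in plain JavaScript; `binRows` is our own helper (not part of any Google library), and it mimics the API’s behavior on rows shaped like the simplified output shown earlier, including the extra "<N" bucket for values below the first boundary:

```javascript
// Bin [dimensionValue, metricValue] rows into buckets defined by
// ascending integer boundaries, mimicking the API's histogram behavior.
function binRows(rows, boundaries) {
  var buckets = boundaries.map(function (start, i) {
    var next = boundaries[i + 1];
    return {
      label: next !== undefined ? start + '-' + (next - 1) : start + '+',
      total: 0
    };
  });
  // Values below the first boundary get an extra "<N" bucket.
  var underflow = { label: '<' + boundaries[0], total: 0 };

  rows.forEach(function (row) {
    var value = parseInt(row[0], 10);
    var metric = row[1];
    if (value < boundaries[0]) {
      underflow.total += metric;
      return;
    }
    // Find the last boundary that is <= value.
    for (var i = boundaries.length - 1; i >= 0; i--) {
      if (value >= boundaries[i]) {
        buckets[i].total += metric;
        break;
      }
    }
  });

  return (underflow.total > 0 ? [underflow] : []).concat(buckets);
}

var rows = [['0', 100], ['1', 100], ['2', 100], ['3', 110], ['6', 220], ['7', 300]];
binRows(rows, [2, 6]);
// [{label: '<2', total: 200}, {label: '2-5', total: 210}, {label: '6+', total: 520}]
```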

Hopefully Google will add some more functionality to this feature to simplify dynamic binning, too. I’d also welcome the ability to create histograms within the Google Analytics interface! Hopefully this API feature is a sign that something like that is in the works.

There are a limited set of dimensions that can be queried in this manner; here’s a complete list:

Count of Sessions ga:sessionCount
Days Since Last Session ga:daysSinceLastSession
Session Duration ga:sessionDurationBucket
Days to Transaction ga:daysToTransaction
Year ga:year
Month of the year ga:month
Week of the Year ga:week
Day of the month ga:day
Hour ga:hour
Minute ga:minute
Month Index ga:nthMonth
Week Index ga:nthWeek
Day Index ga:nthDay
Minute Index ga:nthMinute
ISO Week of the Year ga:isoWeek
ISO Year ga:isoYear
Hour Index ga:nthHour
Any Custom Dimension ga:dimensionX (where X is the Custom Dimension index)

Great Example Use Cases

Wondering how you might use this feature? Here are some more examples to get your juices flowing:

  • Use Events to capture more accurate page load times and store the time in the label, then bin the times using the API.
  • Capture blog publish dates and see when blog posts peak in engagement
  • Look at months and transactions to identify seasonality
  • Compare Session Count and Revenue to see, in general, the number of sessions required to drive your highest revenue.

Have a clever use case of your own? Let me know about it in the comments.

The post Google Analytics API v4: Histogram Buckets appeared first on LunaMetrics.

Tracking Inbound Campaigns in Google Analytics

$
0
0

Campaigns in Google Analytics

Understanding how users get to your website can help with everything from marketing strategy and advertising budgets to content production. It’s also one of the most popular reasons to use Google Analytics. There are certain things that you get out of the box with a Google Analytics implementation, like what pages are viewed or how someone reached your site. There are also steps you can take to improve these reports!

Normally, when I visit your site, Google Analytics tries to figure out where I’ve come from. It does this for each session and stores the data in the Acquisition -> All Traffic -> Source / Medium report (technically the data is used all over the place, but this report focuses on these dimensions).

How Does It Work?

For most traffic coming to your site, Google Analytics looks at the page that I was on previously to determine which bucket of traffic I belong in. Google Analytics figures out where I’ve come from by using the “referrer” header, a special piece of data the browser attaches to a request for a page whenever I travel to that page from a previous page, e.g.:

1) I google “google analytics training”

2) I click the link to the LunaMetrics site; when I do that, the browser tacks on the “referrer” header set to “https://www.google.com”

3) The page loads, my visit begins, and Google Analytics determines that I must be coming from google / organic as my source and medium.

The referrer header (spelled “referer” in the wild thanks to an early spelling error) is parsed by Google Analytics when the page loads on the first pageview of my visit (called a session in GA parlance). It uses the data there to try and categorize my visit to your site into one of three buckets:

(direct) / (none) (Direct traffic)

Often this is referred to as “bookmark” or “brand” traffic, which is wrong. (direct) / (none) does not guarantee that someone typed in the URL directly (a square is a rectangle but a rectangle isn’t necessarily a square); it just means we don’t know where they came from. In short, when they loaded their first pageview, the referrer header was empty. We’ve got more details about direct traffic misconceptions in another post.

othersite.com / referral

This is called “referral” traffic, and it includes any session that began when a user clicked a link from another website (e.g. example.com has a link that points to lunametrics.com).

google / organic

Google Analytics has a special list of sites it considers search engines (including itself, Bing, Baidu, etc). If the referrer header is from one of those sites, my session will be categorized as searchenginename / organic instead of as a referral.
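To make the decision concrete, here’s a rough sketch of the kind of classification GA performs. This is purely an illustration, not Google’s actual code; the real search-engine list is far longer and the parsing far more involved:

```javascript
// Illustrative only: a simplified version of the decision GA makes
// when it inspects the referrer on the first pageview of a session.
var SEARCH_ENGINES = ['google.', 'bing.', 'baidu.', 'yahoo.']; // abbreviated list

function classifyReferrer(referrer) {
  // No referrer header at all: we can't tell where the visit came from.
  if (!referrer) {
    return { source: '(direct)', medium: '(none)' };
  }
  var hostname = new URL(referrer).hostname;
  var isSearch = SEARCH_ENGINES.some(function (engine) {
    return hostname.indexOf(engine) > -1;
  });
  if (isSearch) {
    // e.g. 'www.google.com' -> 'google'
    return { source: hostname.replace('www.', '').split('.')[0], medium: 'organic' };
  }
  return { source: hostname, medium: 'referral' };
}

classifyReferrer('');                         // { source: '(direct)', medium: '(none)' }
classifyReferrer('https://www.google.com/');  // { source: 'google', medium: 'organic' }
classifyReferrer('https://example.com/page'); // { source: 'example.com', medium: 'referral' }
```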

Paid Traffic To Your Site

If you enable account integrations with Google AdWords or DoubleClick Campaign Manager, a fourth type of traffic will be recorded – google / cpc, which will include sessions that clicked on AdWords ads instead of organic results and traveled to your site that way.

Similarly, if you’ve integrated with DoubleClick Campaign Manager, you’ll see – dfa / cpm, showing sessions that came to you via an advertisement that they interacted with.

Custom Campaign Parameters

Now here comes the fun part! Those previous buckets of traffic are determined automatically when someone arrives on your website, using their previous page (the referrer) or special query parameters (paid search). These determinations can be useful, but don’t necessarily provide a lot of information.

Campaign parameters (a.k.a. UTM parameters, campaign tags, ad nauseam) are special query parameters that are appended to a URL in order to override Google Analytics default detection of where someone comes from. These are small pieces of info that we manually add to a link before we share them. We use special words that Google Analytics is set up to recognize, which will then map to the fields in Google Analytics, like source and medium.

If I was on lunametrics.com and I clicked a link to your site that looked like this:

http://www.example.com/?utm_source=Marlon&utm_medium=Brando

Instead of seeing a session referred from lunametrics.com / referral in Google Analytics, you’ll see:

Marlon / Brando

Weird, right?

How It Works

In my example above, I simply added the query parameters by typing them onto the end of the URL. You don’t need to implement these by hand: Google makes available a simple tool called URL builder that does the hard work – you just give it a URL and your desired source, medium, and so on.
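If you’d rather build tagged links in code, say for generating them in bulk from a spreadsheet, the standard URL API does the heavy lifting. A quick sketch; the `tagUrl` helper and the lowercasing convention are ours, not part of any Google tool:

```javascript
// Append campaign parameters to a landing-page URL.
// Values are lowercased to avoid case-sensitive duplicate rows in GA reports.
function tagUrl(landingPage, params) {
  var url = new URL(landingPage);
  ['source', 'medium', 'campaign', 'term', 'content'].forEach(function (key) {
    if (params[key]) {
      url.searchParams.set('utm_' + key, String(params[key]).toLowerCase());
    }
  });
  return url.toString();
}

tagUrl('http://www.example.com/', { source: 'Marlon', medium: 'Brando' });
// 'http://www.example.com/?utm_source=marlon&utm_medium=brando'
```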

Certain social media and email campaign tools will help do this for you automatically, or allow you to add campaign parameters to all outgoing emails or posts with just a little customization. Those are great to help ensure consistency and make sure you don’t forget!

We’ve written about a few other options before – we love using Google Sheets to keep everyone in your company organized. Here’s a Google Sheets template we put together a few years ago.

These custom parameters go on the link before you share it or email it out. Then, when a person clicks on that link and arrives on your page, Google Analytics takes care of the rest – using those values to override the default automatic traffic naming.

How Do I Choose What To Add?

We’ve also written extensively about how to choose what to call your traffic sources. Check out our naming conventions post for ideas and tips. In general, there are a few standards (like medium = email) that you can follow, but it should make sense to you and your team.

Unlimited Potential

Campaign Parameters give you the ability to define how Google Analytics classifies a given session. In general, when we’re sharing a link that we have control over (for example, in an email newsletter), we should use campaign parameters.

Let’s repeat that – when you paste/post/share a link on another site in an effort to drive traffic to your own site, you should use campaign parameters.

Taking Credit for Our Efforts

Separating our traffic into traffic we’ve generated versus traffic we’ve just happened to acquire helps shape marketing priorities. For example, if we tag all the links that we share on Facebook, we can segment out traffic that stumbled onto our site through Facebook from traffic our marketing has earned, either through posts we published or ads that we ran. Without that distinction, we might attribute wins to our social media marketing that it wasn’t really responsible for (and vice versa).

Traffic from Facebook by itself might be lumped into facebook.com / referral. Sometimes you’ll also see m.facebook.com or l.facebook.com.

Now let’s say you’re actively sharing links to your most recent blog posts from your company page, and you’ve tagged those with custom campaign parameters. Now, we’ll ALSO see traffic coming in from the source and medium that you’ve defined, perhaps something like facebook / social.

Extra Detail From Tagged Links

Take advantage of additional parameters to store extra context; I showed you utm_source and utm_medium, but there’s also:

utm_campaign – used to store cross-channel campaign names so that all traffic can be rolled up and reported on. For example, you run a quarterly sales event that you promote using display advertising, social advertising, and email marketing. If you add &utm_campaign=quarterly-promo to all the links in each channel, you can open the Acquisition -> Campaigns -> All Campaigns report and find quarterly-promo, where you’ll be able to see in one place how many users and sessions the campaign acquired.

utm_term – Used for non-Google search engine ads to store the keyword you bid against, e.g. “google analytics training”.

utm_content – a catch-all for additional context you may wish to capture about the link, e.g. “boy laughing creative”

Combating Direct Traffic

Traffic is very frequently labeled (direct) / (none) when in actuality it came to the site through a method that does not set a referrer header. A few examples:

  • Clicking a link on a https: site that points to an http: site (for security reasons, the browser does not set a referrer header).
  • Opening a link from a native application (e.g. Outlook or a mobile app).
  • Clicking a link that is redirected many times, where the referrer header is either lost or set to the “wrong” referrer.

You won’t always be able to combat these issues; however, using Campaign Parameters will definitely help reduce the amount of Direct, or unknown, traffic by taking credit when possible. Consider sending out emails to your email list: if someone opens an email in a native application like Outlook, clicking a link will open a new browser window and load the right page. In this example, there is no previous page, which means there’s no referrer, which means the visit will automatically show up in Google Analytics as Direct.

With Campaign Parameters, you can define a source / medium for all of your outbound emails, something like newsletter / email. Now, regardless of what email provider a recipient is using, if they click on that link – they’ll show up in your reports correctly.

Going Forward

Once you’ve tagged your link, if you’ve set up either Goals or Ecommerce tracking, you’ll be able to see “last-touch” attribution for your marketing efforts in most of your reports; for example, in the Source / Medium report, the final three columns in the table will contain conversion data for users with that given source and medium. When you get to be a more advanced user, the Conversions -> Multi-Channel Funnels section of reports can be used to analyze behavior over many sessions (e.g. did any users hear about us from that email and then come back a few weeks later and convert).

A few important warnings:

  • Never tag internal links with campaign parameters (e.g. from your homepage to a sale page). This will cause a second session to be started for the user and will destroy the original context of their visit. Use Events or Enhanced Ecommerce Promotions.
  • GA is case-sensitive, so Marlon / Brando and marlon / brando will show up as two distinct rows. That can lead to reporting errors, so make sure everyone is on the same page. We recommend using a tool like Google Sheets to organize everyone. Additionally, Filters can be used to help avoid any mistakes.

The post Tracking Inbound Campaigns in Google Analytics appeared first on LunaMetrics.

Google Analytics Performance Tuning: “True” Direct

$
0
0


Howdy y’all; me here again to talk to you about “Direct” traffic. A few quick bullet points:

  • Direct or (direct) / (none) traffic isn’t “bookmark” traffic, it’s traffic whose source we cannot identify.
  • Many sessions that are actually Direct are re-labeled to the user’s previous session referring information instead.
  • … except in the Multichannel Funnel (MCF) reports, where they’ll be labeled as “Direct” instead, but sometimes these are not Direct.
  • And additionally, sometimes UTM parameters can rain on your parade, too.

This blog is a part of our Attribution series, so we’ll be sprinkling links to other posts throughout.

A Very Important Warning

This post covers tweaks and code necessary to overhaul attribution in the reports to something a bit more straightforward. Whether or not you should implement these changes will be a decision for you and your company. Even analytics experts are divided on this topic; it is really a personal decision that depends largely on how you use the data, or plan on using it.

Be warned: the changes I’m going to suggest are not for everyone; they’ll irreversibly change your Google Analytics reporting and data collection in very fundamental ways. Do not make these changes without properly considering the implications and discussing them with your team. You’ve been warned.

1) Identifying Direct Traffic That Isn’t

Use campaign parameters, duh. If you don’t know what these are, this post isn’t for you. Seriously, get out of here. Here’s a blog post I wrote. Get it together.

2) Stop Campaign Inheritance

Campaign inheritance is what I call the session re-labeling “feature”. This is how Google Analytics handles figuring out traffic sources, something called “last non-direct click attribution.” Not familiar? Say a user clicks your AdWords ad and has a nice visit on your site. Later on, they’re feeling nostalgic (“Oh, honey, remember that parallax scrolling background! Yeah, with the scroll hijacking.”), so they come back, typing in the URL by hand.

You might expect their second session to be labeled as “(direct) / (none)” – nope, it’s “google / cpc”. Google Analytics figures that it’s more helpful to you as a marketer to attribute that session to the last non-direct attribution session they’ve got; that might have had something to do with this second session, after all, whereas “(direct) / (none)” is just a black hole.

For more information, head over to Becky’s post which goes into full detail about last non-direct click attribution.

If you’d prefer the session be labeled “(direct) / (none)” instead of “google / cpc”, you need to change the Campaign Timeout setting (under Property Settings > Tracking Info > Session Settings) to 0. This will tell Google Analytics to attribute each session independently of the last.

You can expect to see a greater percentage of your traffic attributed to the Direct channel after this change is made.

3) Fix Non-Direct Direct in the MCF Reports

As previously mentioned, MCF reports attribute “Direct” traffic a little differently. At first blush, it appears that sessions that were really direct (no referral/UTM parameters) are marked and treated as such in these reports, even as they’re attributed to their most recent non-direct attribution in the standard reports. This turns out not to be the case.

The reality is that sessions are marked as “Direct” when the session would have been attributed to the same campaign information as the most recent session for the user (remember: direct sessions are set to the most recent values, per above). isTrueDirect and the “Direct Session” dimension behave the same way. If someone comes via Google Search to your site twice in a row, the second session will be marked as Direct in the MCF reports, even though they used a search engine each time.

Good news! This is taken care of by changing the Campaign Timeout. Once set to zero, these reports will also show the real campaign information or Direct. Dust off your hands, you badass, you.

4) Fixing Errant UTM Parameters

UTM parameters are great, but sometimes they do undesirable things. For example, if I came to your site via a tagged link and bookmarked that page, every time I’d come back I’d register a new session attributed to that campaign. I might share that URL with a bunch of friends, who now all end up being attributed to that campaign, too (this might be desirable, from your perspective).

Add a hitCallback that will scrub campaign parameters from the URL after the user’s first page view.

// Scrubs utm_* campaign parameters from the URL once the first
// pageview has been sent, so reloads, bookmarks, and copied links
// don't re-attribute later sessions to the campaign.
// Usage: ga('send', 'pageview', { hitCallback: scrubCampaignParams });
function scrubCampaignParams() {

  if (!window.history) return;

  var path = document.location.pathname;
  var search = scrub(document.location.search.replace('?', ''));
  var hash = scrub(document.location.hash.replace('#', ''));
  var newPath = path + (search ? '?' + search : '') + (hash ? '#' + hash : '');

  window.history.replaceState('', '', newPath);

  function scrub(s) {

    if (!s) return;

    s = '&' + s;

    var params = ['medium', 'source', 'campaign', 'content', 'term'];
    var b,
        e,
        i;

    for (i = 0; i < params.length; i++) {

      b = s.indexOf('&utm_' + params[i] + '=');

      if (b > -1) {
        // Remove everything from this parameter up to the next '&',
        // or to the end of the string if it's the last parameter.
        e = s.indexOf('&', b + 1);
        s = s.slice(0, b) + (e > -1 ? s.slice(e) : '');
      }

    }

    return s.slice(1);

  }

}

Once the first page view has been recorded in Google Analytics, the callback here will clean those parameters out of the URL. This will keep the user from gunking up your data with them later on.


With all of these changes in place, you’ll get reports that are more literal in their interpretation of Direct traffic, with hopefully fewer visits marked as Direct, too.

Have questions? Sound off below.


This blog is a part of our Attribution series, check out all the posts in the series below:

The post Google Analytics Performance Tuning: “True” Direct appeared first on LunaMetrics.

Tracking Single Page Applications with Google Analytics

$
0
0

single page applications google analytics

Google Analytics and Google Tag Manager (GTM), though designed with traditional round-trip-based websites and web applications in mind, can be configured to work properly with single page applications (or SPAs). Common technical issues encountered when tracking SPAs with these tools are:

  • Only the first page is tracked
  • Page paths do not include fragment data (e.g. /app#/page-path is /app)
  • Page paths or page titles are incongruent with application state
  • Duplicate tracking of the first page
  • Misleading page timings data

These complications occur whether you’re using Angular, React, Backbone, or any other front-end framework or code that manipulates the History API or fragment alongside changes to on-page content.

There are also a few issues that arise specifically when using GTM to track your SPA.

  • Campaign information overriding
  • Accidental data inheritance
  • DOM state uncertainty

Let’s take a look at how to solve these issues.

A Note On Syntax

There are currently three supported syntaxes you might be using on your project. We’ll outline the steps for each, but make sure you know which one you’re using; in practice, these different syntaxes overlap. For example, Google Tag Manager (GTM) uses the global window.dataLayer as its interface, which gtag.js also uses, albeit indirectly. Both GTM and gtag.js load analytics.js, the Google Analytics library. All of this interdependency can cause some confusion; make sure you’re only using one syntax.

Common Issues

Only the First Page Is Tracked

If you’ve already tried to implement Google Analytics, you’ve probably already noticed that only the first page view of your application is being recorded. You might have thought that Google Analytics had some mechanism to automagically track page views. There’s no magic here; GTM, analytics.js and gtag.js are just APIs for issuing page views and other hits, and the neat trick they’ve done is to embed a call to the page view method in the standard snippet the tool asks you to install on your site.

If you’re using analytics.js, it looks like this:

<script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-0000000-1', 'auto');
  ga('send', 'pageview');
  ^^^^^^^^^^^^^^^^^^^^^^
</script>

And if you’re using gtag.js, it looks like this:

<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-57570530-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'UA-0000000-1');
  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
</script>

If you’re using Google Tag Manager, it’s here:

<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
                                                     ^^^^^^^^^^^^^^
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-0000000');</script>
<!-- End Google Tag Manager -->

For websites where each page of content requires a round trip, the result is that every time a page loads, a page view is sent to Google Analytics. Because your SPA doesn’t trigger a full round trip when content changes on the page, you’ll need to add calls to your code wherever you want to track a page view. Often, we can do this automatically by binding to a routing handler in our code.

  • In Angular 1.X we can bind to $routeChangeSuccess or $stateChangeSuccess event on the $rootScope
  • With react-router we can extend Route with a component that fires our page view in componentDidMount
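The binding itself can be kept framework-agnostic. Here’s a minimal sketch (the function and argument names are our own, not a library API) that can be wired to any router’s change event; the Angular wiring in the comment assumes the standard analytics.js ga command queue:

```javascript
// A tracker that ignores repeated notifications for the same path and
// issues page view commands through an injected "send" function.
function createPageviewTracker(send) {
  var lastPath = null;
  return function (path) {
    if (path === lastPath) return;
    lastPath = path;
    send('set', 'page', path);
    send('send', 'pageview');
  };
}

// Wiring it up in an Angular 1.x app with analytics.js might look like:
//   var track = createPageviewTracker(ga);
//   $rootScope.$on('$routeChangeSuccess', function () {
//     track($location.path());
//   });
```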

Although automatic tracking can be helpful, consider what you are actually interested in measuring. Automatically triggered hits can be too frequent and become less meaningful. There are also limits on how many hits can be sent on a per-client and per-account basis.

  • 500 hits per session
  • 200,000 hits per user per day
  • 10,000,000 hits per month (unless you’re a GA360 customer)

An excellent implementation will track only a few things automatically. Here is the syntax to trigger a page view for each library:

If you’re using analytics.js:

ga('send', 'pageview');

If you’re using gtag.js:

gtag('config', 'UA-0000000-1');

If you’re using gtm.js:

dataLayer.push({event: 'pageview'});

GTM also requires that you configure a Google Analytics Tag and a Trigger that fires on the custom event ‘pageview’:

Page Paths Do Not Include Fragment Data (e.g. /app#/page-path is /app)

Frameworks and code may leverage the fact that browsers will allow the hash or fragment to be changed without triggering a full page reload. Instead of changing the URL with the History API, the fragment is changed instead, usually with what appears to be a page path:

/app#/page-path
/app#/subsequent-page-path

However, Google Analytics will not include the fragment in the Page dimension inside of the reports; instead, the above two paths would both be represented as /app.

To fix this issue, you’ll need to customize the data that you send to Google Analytics and “teach” the tool to include the fragment in the page path. For more on how to do this, read on.
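For example, with analytics.js, one common approach (a sketch; the helper name is ours) is to build the page value yourself from the path, query string, and fragment:

```javascript
// Build a Page value that keeps the fragment, so /app#/page-path and
// /app#/subsequent-page-path report as distinct pages.
function pageWithFragment(loc) {
  return loc.pathname + loc.search + loc.hash;
}

// Usage with analytics.js:
//   ga('set', 'page', pageWithFragment(document.location));
//   ga('send', 'pageview');
```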

Page Paths or Page Titles Are Incongruent with Application State

Often, we’ll want to adjust the page path or page title we send to Google Analytics. By default, Google Analytics will use the value of document.location.pathname (plus any query string), and it will be stored as the Page dimension in Google Analytics:

You can override it with a path of your own. The value you set to the Page dimension must start with a forward slash (/) and should be formatted in the same manner as a page path.

You can also change the value for the Page Title dimension. By default, this will be whatever is in the <title> tag on the page at the time the hit is sent. If your application already changes the <title> tag when the state changes, you’re all set. If that’s not the case, see below for examples of how to override that value.

A note: generally speaking, Google Analytics page-level reporting revolves around the Page dimension, and the Page Title dimension requires extra clicks to access or apply. As a result, we recommend orienting your implementation around the Page dimension and only using the Page Title to add additional context for specific reporting needs.

If you’re using analytics.js:

ga('set', 'page', '/your-own-path');
ga('set', 'title', 'Your custom title');
ga('send', 'pageview');

If you’re using gtag.js:

gtag('config', 'UA-0000000-1', {
  'page_title' : 'Your custom title',
  'page_path': '/your-own-path'
});

If you’re using GTM, you can use whatever dataLayer keys you wish; here’s a pattern we like:

dataLayer.push({
  event: 'pageview',
  page: {
    path: '/your-own-path',
    title: 'Your custom title'
  }
});

Note that you’ll need to create Data Layer Variables to extract those values from the dataLayer, then set them in your Google Analytics Settings Variable (or directly on your tag). First, make a version 1 Data Layer Variable for page. Then create Custom JS Variables to retrieve the values, e.g. one that returns the path property. For more on why you must take this approach, see Accidental Data Inheritance in the GTM-specific issues section below.
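As a sketch, the Custom JS Variable’s logic looks like this ({{DLV - page}} is an assumed name for your version 1 Data Layer Variable):

```javascript
// Inside GTM the variable body would read the Data Layer Variable directly:
//   function() {
//     var page = {{DLV - page}};
//     return page ? page.path : undefined;
//   }
// The same logic, expressed as a plain helper:
function getPagePath(page) {
  return page ? page.path : undefined;
}
```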

Duplicate Tracking of the First Page

Once you’ve enabled automatic page view tracking, you may start accidentally tracking two page views on pages where your SPA is delivered alongside the standard header and footer content for your site. The problem is that the Google Analytics snippet for the rest of the site fires its standard page view, and then your application loads and fires a second one. To fix the issue, you’ll need to either:

  1. Add logic to your backend that detects when the SPA is shipping and removes the initial page view call, or
  2. Add logic to your SPA that detects that an initial page view has already been fired.

The first approach can be difficult to implement; maybe your application ships as part of the page and isn’t immediately bootstrapped, or the part of your templating pipeline involved doesn’t know whether the page will ship the SPA or not.

The second approach can be simpler to implement:

let hasFired = false;

function trackPageview() {

  // Skip the first call; the on-page snippet already sent the initial page view
  if (!hasFired) {
    hasFired = true;
    return;
  }

  // Fire page view as normal now
  ga('send', 'pageview');

}

But it can also present challenges and can feel a little gross. The best solution will depend on your codebase.

Misleading Page Timings Data

Google Analytics will track page speed timings for you and report on how long pages on your site took to load for visitors. Timing data is collected immediately after a page view hit is dispatched and is sourced from the window.performance API; because the page never reloads in a SPA, the net result is that every page view after the first uses the same timing data as the first page view.

This can lead to bad analysis. Further complicating matters is the way in which timing hits are sampled.

This is a point of frustration for clients, who often want insight into how performant their SPAs actually are. To solve for this, we recommend either:

  • Fishing out the data you need from performance.getEntries()
  • Storing a timestamp at an agreed-upon pre-loading point, then capturing the difference after a load has occurred

Additionally, we recommend using events to capture this data instead of timing hits.
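A sketch of the timestamp-based approach (the function names and the event category/action are our own conventions, not GA requirements):

```javascript
// Record when a route change begins, then report the elapsed render time
// as a Google Analytics event rather than a timing hit.
var routeStart = null;

function markRouteStart(now) {
  routeStart = now;
}

function elapsedSince(now) {
  return routeStart === null ? null : Math.round(now - routeStart);
}

// When the new view has finished rendering:
//   var elapsed = elapsedSince(performance.now());
//   ga('send', 'event', 'SPA Performance', 'render', path, elapsed, {
//     nonInteraction: true
//   });
```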

Google Tag Manager-Specific Issues

Google Tag Manager has some specific design details that cause problems when trying to couple the tool with SPAs. Here are a few of those, and how to avoid them. If you’re not using GTM, you may skip the below.

Campaign Information Overriding

Due to the way that GTM handles issuing commands to Google Analytics, visits that include both an HTTP referrer value and special tracking parameters will be accidentally split into multiple sessions and incorrectly attributed inside the reports.

To fix this issue, import our SPA Campaign Information Fix recipe and set the customTask Field in your Google Settings Variable to {{JS - customTask - Null Conflicting Referrers}}. This will prevent this issue from impacting your container.

We’ve also got a SPA Container Bootstrap with all of the above setup for you right here.

Accidental Data Inheritance

GTM is designed to encourage code reuse; a page view can be triggered by many conditions (e.g. the page begins loading AND {event: 'pageview'}). This can lead to issues of accidental inheritance; a tag is fired more than once with data that it was only intended to use a single time, or stale data.

dataLayer.push({
  event: 'pageview',
  page: {
    path: '/some-page-path',
    title: 'Foo'
  }
});
// GA Hit has page=/some-page-path and title=Foo
// later on
dataLayer.push({
  event: 'pageview',
  page: {
    path: '/subsequent-page-path',
  }
});
// GA Hit has page=/subsequent-page-path AND ALSO title=Foo

To prevent this, either use a clean-up tag to null sensitive keys or a “version 1” Data Layer Variable and Custom JS Variable.
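As an illustration of the clean-up tag approach (a sketch; when and how the clean-up fires is up to your tag sequencing), pushing null over the sensitive keys prevents the next pageview from inheriting them:

```javascript
// Guard for non-browser environments so the sketch stays self-contained
var dataLayer = (typeof window !== 'undefined')
  ? (window.dataLayer = window.dataLayer || [])
  : [];

// Clear the keys the pageview tag consumed
dataLayer.push({
  page: {
    path: null,
    title: null
  }
});
```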

Note: Unlike with version 2, you cannot access nested keys of version 1 Data Layer Variables. Instead, you must use a Custom JS variable to retrieve the value.

DOM State Uncertainty

The traditional round-trip model provides a simple lifecycle:

  1. The page is requested
  2. The initial HTML is received and the browser begins to parse the document into the DOM, requesting other resources as it parses them
  3. When initial parsing finishes, the document emits the DOMContentLoaded event
  4. When all resources have loaded, the window emits a load event

GTM listens for these lifecycle events and allows a user to trigger “tags” (code snippets, e.g. a Google Analytics page view) when they occur. Often, tags will require data stored somewhere in the DOM (e.g. the h3 of a widget) or will require the page be finished rendering before firing (e.g. scroll tracking). Because of this, we recommend adding a dataLayer.push() call when significant changes have completely finished rendering in your application.

import React from 'react';

let dataLayer = window.dataLayer = window.dataLayer || [];

class Page extends React.Component {

  render() {

    return (
      <div className="page">
        {this.props.children}
      </div>
    );

  }

  componentDidMount() {

    // Signal GTM that this component has finished rendering
    dataLayer.push({
      event: 'domReady'
    });

  }

}

export default Page;

In Closing

Remember to verify that your application:

  • Tracks page views on every meaningful state change
  • Correctly sets the page and title for the hit to values congruent with the state of the application
  • Does not fire two page views for the same page when the application first loads

Additionally, if you want to track page load times, use events and roll your own method of measurement. If you’re using GTM, consider adding a .push() after the DOM has finished rendering, and watch out for accidental data inheritance. Finally, make sure to import our campaign information overriding fix if you plan on using GTM, too.

Tracking SPAs with Google Analytics can be a lot of fun; because so much of the logic can live in the front end, it’s easy to add tracking with a deep knowledge of how the application works and the state that the application, session, and user are in at the time of data collection. Make sure to avoid these common pitfalls to enjoy a useful Google Analytics implementation.

The post Tracking Single Page Applications with Google Analytics appeared first on LunaMetrics.

Reusing Google Analytics Campaign Information with Google Tag Manager


There are many situations where it would be helpful to have the source, medium, and other campaign information for a user’s session. For example, we may want to capture the source / medium of a session when a user submits a Contact Us form to store inside of Salesforce.

This used to be very easy to do; in Classic Analytics, campaign information would be stored in the __utmz cookie, right on the user’s browser.

Universal Analytics moved campaign processing from the user’s browser to Google’s servers, eliminating the __utmz cookie (and easy access to that data). So how can we retrieve that information with Universal Analytics?

The proper way to do it is to store the user’s Client ID in a 3rd-party system and in a Custom Dimension, then run a periodic script that retrieves the data via the Reporting API and syncs it between systems. However, for many clients, that kind of implementation is out of reach.

The next best solution is to re-create the __utmz cookie. One traditional solution has been to include Classic Analytics and send its hits to a dummy UA number. This does the job, but adds a lot of JavaScript that we don’t need.

A few years ago we wrote a UTMZ Cookie emulator recipe that we’re releasing to the public today. The script it contains will do its best to faithfully recreate the __utmz cookie of old, and will inherit values from any existing __utmz cookie and apply the same rules that ga.js would to values therein.

Import The Recipe

For instructions on how to import the recipe file, check out my colleague Jim Gianoglio’s evergreen post on using the Import feature of Google Tag Manager. Once you’ve got it imported, proceed.

Use the Variables

The recipe will import the following Variables into your container:

  • JS – GA Source
  • JS – GA Medium
  • JS – GA Campaign
  • JS – GA Content
  • JS – GA Keyword

These correspond to dimensions in Google Analytics. We can use these Variables as conditions in Triggers or to pass values to Tags.
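Under the hood, these Variables read campaign fields out of the emulated __utmz cookie. As a rough sketch of that parsing (the layout follows the classic ga.js __utmz format; the helper name and sample value are ours):

```javascript
// A __utmz value looks like:
//   domainHash.timestamp.sessionCount.campaignNumber.utmcsr=...|utmccn=...|utmcmd=...
// Everything after the fourth dot is a pipe-delimited list of campaign fields.
function parseUtmz(cookieValue) {
  var fields = {};
  var payload = cookieValue.split('.').slice(4).join('.');
  payload.split('|').forEach(function (pair) {
    var parts = pair.split('=');
    fields[parts[0]] = parts[1];
  });
  return {
    source: fields.utmcsr,
    medium: fields.utmcmd,
    campaign: fields.utmccn,
    keyword: fields.utmctr,
    content: fields.utmcct
  };
}
```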

In a Trigger

Let’s set up a Trigger that fires when a user visits from a search engine. We’ll create a Trigger that fires on Pageview when the value of the JS – GA Medium Variable is “organic”. Here’s how to set that up:

We can use this concept with any of our campaign variables, too; want to fire a modal for users visiting from a specific campaign? Use the JS – GA Campaign Variable.

In a Tag

We also might want to provide these values to Tags that fire in our container. If the Tag is a “pre-built” templated Tag, we typically just need to add the Variable to a field in the template. To pass along our data to a Custom HTML Tag, we need to review the vendor’s documentation and determine where the value belongs.

Caveats

Hopefully you’re excited about some of the things you might try with this recipe, but before you get too jazzed, there are some caveats. Because the processing happens server-side, there are some situations our recipe can’t account for:

  • If you’re using cross-domain tracking, our recipe will probably say the session was a self-referral from that sister domain.
  • If you’re using the referral exclusion list extensively, our recipe will attribute sessions to domains that the exclusion list would have set to (direct) / (none) instead.
  • If you have any kind of filter black magic fiddling with the source / medium values, our script will miss those, too.
  • If you’re using Google AdWords or Google DoubleClick and autotagging is turned on, you will see the source/medium, but you won’t receive campaign information as that’s joined inside the interface.
  • If the user is coming directly and has an existing campaign from before the recipe was added, the recipe will show them as (direct) / (none) whereas the reports will show the previous campaign.

Questions? Issues? Let us know in the comments.

The post Reusing Google Analytics Campaign Information with Google Tag Manager appeared first on LunaMetrics.

