If you manage a lot of web properties, you have surely run into the problem of replicating the same report across all of them. Aggregating web properties into a roll-up account is the most convenient option, but there are cases where you cannot actually implement one, or where doing so would require a significant cut-back on the data points sent to the roll-up, since Google Analytics comes with quota limitations. It is always advisable to check your current utilisation; you can do it using this tool. As always, reality is not ideal, so a roll-up account might not be available. In that case our fallback data retrieval mechanism is the Google Analytics API. We have described ways to access the API using R in two posts: Google Analytics + R = FUN! & Basic A/B Testing plots and stats with R. In the context of this post we need to create a simple dashboard that tracks the conversion rate of our handful of web properties across channels, and this is something we will pursue using Google Apps Script, under the well-known name Magic Script.
Getting the Magic Script
Open the Script Editor in a Google Spreadsheet, clear the code that appears and paste the Magic Script code found in this gist. Note: use this gist because it contains some modifications compared to the initial Magic Script from the GA team.
If you haven’t used the Google APIs in apps scripts before go through the following steps to enable the Google Analytics API:
- In the script editor, go to: Resources > Use Google APIs
- Turn ‘Google Analytics API’ to ON
- Click the link to the Google APIs Console
- In the console, turn ‘Google Analytics API’ to ON
- Accept the terms of service
- Close the ‘Google APIs Console’ window
- Click OK in the script editor window and close that window
- Back in the spreadsheet, the ‘Google Analytics’ menu should now be working. (You may need to re-authenticate.)
Now, a new menu should appear next to ‘Help’, named Google Analytics. If you click it you will see that you can create not only basic report configurations but Multi-Channel Funnel reports as well.
Define a report
We will track the conversion rate for the last 30 days. This means that we will not store data; we will only rely on the fact that the API can query based on relative dates. Let’s create a core report.
The report you define here will be replicated for all of your web properties. To replicate it, simply drag the first row down until the auto-increment of queryX/valueX reaches the number of web properties, and name each target sheet accordingly to reflect its web property. Then you need to create an intermediate sheet to aggregate the data (the data on a target sheet is refreshed on every run, which clears all of its contents; also, in the future you might add more dimensions to the query, which results in more columns). Essentially, in a new sheet we aggregate data from the sheets fed by the queries, like below.
Tip: create the first formula and then select Show all formulas from the View menu. Then simply Find & replace the sheet name in the respective web property column and you are done!
Create blocks like the one defined above for each metric in your report. This means you will make another table as above with source data sheet-name!C13, which will give you a table with the visits. Row totals will then give daily totals across all reported web properties, and column totals will give period-wide totals for each web property. Finally, the sum of all cells in a block gives the total visits/conversions/dimensionX for the period across web properties. Dividing the totals gives the period-wide conversion rate for all web properties, and a cell-by-cell division gives the conversion rate of a specific web property on a specific day.
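The arithmetic behind these blocks is simple; as a sanity check, here is the same logic sketched in R (the numbers are made up for illustration):

```r
# Toy data: rows are days, columns are web properties.
visits      <- matrix(c(100, 120, 90,
                        110, 130, 80), nrow = 2, byrow = TRUE)
conversions <- matrix(c(  5,   6,  9,
                         11,  13,  8), nrow = 2, byrow = TRUE)

daily.totals    <- rowSums(visits)                 # row totals: all properties, per day
property.totals <- colSums(visits)                 # column totals: per property, whole period
overall.cr      <- sum(conversions) / sum(visits)  # period-wide conversion rate
daily.cr        <- conversions / visits            # cell by cell: per property, per day
```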
How to fight Execution Errors
Google Spreadsheets have a time limit of 5 minutes when running scripts. So, if you have a query-heavy spreadsheet you might get errors like that. The solution is to break the queries down into more spreadsheets and then import the data into your dashboard (the master spreadsheet) using the importrange() function, which is magical by all means! However, in our case here, if an error occurs (let’s assume it occurs in the last query) you can let viewers of the dashboard know with a message in a cell, using the following if() statement (assuming cell B2 holds the timestamp of the last run; 0.019 days is roughly 27 minutes):
=if(now()-B2>0.019,"Beware : Execution Error in the last run!","")
Dashboard without sparklines?
No sir! In case you didn’t know, Google Spreadsheets have a sparkline() function to use in the dashboards you are designing; sparklines are a great source of eye-catching information for spotting ups and downs in the time trends of your metrics and KPIs. Make sure that you don’t exaggerate with sparklines like I do…
Sparkline of Type I : Line
A line sparkline will show you the time evolution of your conversion rates, traffic, qualified visits, etc. Look for peaks and lows, or relatively flat lines.
Sparkline of Type II : Bar
Really nice for showing relative performance, say today vs yesterday (or vs this day last week). To use this, the formula is simply a sparkline() call with “charttype” set to “bar”, for example
=sparkline(B2:C2,{"charttype","bar"})
In the end
The following is a sample dashboard with a few runs on my personal site and some other blogs I have access to. It’s not really an e-commerce conversion rate but rather the goal conversion rate for a contact form submission. Take a look and get some ideas on how to create more for your personal dashboard.
There is no end, if there is no tracking code added…
The Magic Script you got above is modified so that it enables you to track the usage of the spreadsheet (using pageviews) and the execution of the getData function (with event tracking). Simply open the Script Editor and replace your UA number in the script. I suggest having a separate Google Apps web property and using a content grouping per Google App (spreadsheets, docs, forms etc.) to track the utilisation of your Drive creations. It is really exciting to see how (and if!) people are actually using the docs you shared, like the screenshot below.
Have you used Google Analytics Content Experiments?
I’ve been using it lately and the most useful thing is that all of the data is integrated into the API, so you can do what you want beyond the standard reports. You can pull the data into your favourite data analysis tool and get more things done. In the following I simulate an example for an e-commerce site, where I am interested in checking (among others):
- did the proposed variation increase the funnel entrances?
- did the feature uplift the Average Order Value (AOV)?
- did the conversion rate (sales-wise) improve?
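These three questions boil down to simple ratios; a quick sketch in R with made-up numbers (the column names here are mine, not the API’s):

```r
# Hypothetical experiment summary: one row per variation.
exp.data <- data.frame(variation    = c("control", "test"),
                       visits       = c(1000, 1000),
                       entrances    = c(300, 360),   # funnel entrances
                       transactions = c(30, 42),
                       revenue      = c(1500, 2310))

exp.data$entrance.rate   <- exp.data$entrances / exp.data$visits
exp.data$conversion.rate <- exp.data$transactions / exp.data$visits
exp.data$AOV             <- exp.data$revenue / exp.data$transactions
```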
We can easily accomplish this using the Google Analytics API and R. We have described a way to get data out of Google Analytics using R previously (see: Google Analytics + R = FUN!).
The code that produces the above chart for a presentation of yours is hosted over at GitHub. We source the multiplot.r file in order to be able to add multiple ggplot-style plots within a loop (see this).
The scope of this post is to show how simple it is to get data out of Google Analytics and create your own reports (which you hope can be at least semi-automated) and your favourite statistical graphs (those that GA is currently missing). As you already know, R is a favourite tool of mine, so it will be the main tool to get the data, reshape it and plot it. You will need elementary knowledge of the R language, and you’ll soon realize that Google search covers more than 40% of the code-polishing your code will ever need…
There are two packages (or libraries) that connect to the Google Analytics API and return data to you: the older one is RGoogleAnalytics and the new champion is rga. Both are excellent and I’ve used both on many occasions. rga seems a bit nicer, but RGoogleAnalytics is certainly more robust and works under all occasions. Apart from the core, I will use ProjectTemplate for my personal organisation (you won’t see it, however) and ggplot2 for graphics.
First of all go to Google’s API Console and create a new API application after you make sure that you have Google Analytics Service enabled.
Because of the lengthy script that would ruin the flow of the post I have created a Github repo where all scripts reside [zip]
Now that API access is set up on the one side, we should make the connection from R. You should already know that RCurl can be a bit tricky, as I have outlined in the A tiny RCurl headache note. The solution proposed there is applied here as well; note that this issue will be solved in the next release of rga. On the other hand, RGoogleAnalytics seems to be already on the spot. Have in mind that using
ssl.verifypeer = FALSE
isn’t the most secure way to handle network communication in R. You can use the following to create a connection to the API [rga_initiate_API_connection.R]. This is heavily copy-pasted from Randy Zwitch’s (not provided): Using R and the Google Analytics API post.
One issue is how to get the Profile IDs. The hard way would be to go to the Query Explorer, cycle through all profiles and write down the IDs that you are interested in. Luckily, there is a function that will return all of the profiles visible to the account tied to the API access you created. (BTW, it is excellent that the rga package provides access to the Management API.)
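For reference, both steps fit in a few lines; a minimal sketch, assuming the rga package — the client id/secret are placeholders for your own API Console credentials, and the profile column names may vary slightly across rga versions:

```r
library(rga)

# Authorise once; rga caches the token in the "ga" instance.
rga.open(instance = "ga",
         client.id     = "XXXXXXXX.apps.googleusercontent.com",  # placeholder
         client.secret = "XXXXXXXXXXXXXXXX")                     # placeholder

# Management API: a data frame of every profile this account can see.
profiles <- ga$getProfiles()
head(profiles[, c("id", "name")])   # note down the ids you care about
```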
In what follows I will assume that you have defined the ids that you are interested in.
The main hypothesis that I want to get a taste of is whether the different post categories (e.g. measure, statistics, music etc.) have different load times. This will be interesting, given that not all categories carry the same burden when loading (images vary, as do YouTube videos and scripts). To achieve this you will need to use a filters vector and loop over it. Give appropriate names in the page.group vector and you will be done. Note that we have created extra metrics:
- e-commerce rate: this is not meaningful in the case of a blog, but if you are advanced in analytics you might have implemented goals as e-commerce events, as B. Clifton suggests.
- bounce rate: the bounce rate should be correlated to the page load time
- buckets of page load time: we use a 4-second range for each bucket to be consistent with the Apdex standard.
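The bucketing itself is a one-liner in base R; a sketch with made-up load times (the variable names are mine):

```r
# Page load times in seconds, bucketed into 4-second Apdex-style ranges.
load.time <- c(1.2, 3.9, 5.5, 13.0, 15.9)
bucket <- cut(load.time,
              breaks = seq(0, 16, by = 4),
              labels = c("0-4", "4-8", "8-12", "12-16"))
table(bucket)   # counts per bucket, ready for a histogram
```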
Because I want more metrics than a single query allows (11), I use another query in the loop to get the rest and then merge them. Now, if you run all these scripts you will have a data frame like the one extracted at the end of the script using the head() function.
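The merge step is plain base R; a sketch with mock results standing in for the two API pulls (the column names are illustrative):

```r
# Two pulls over the same dates, each limited to part of the metrics.
q1 <- data.frame(date = c("2013-05-01", "2013-05-02"),
                 visits = c(120, 95),
                 transactions = c(3, 1))
q2 <- data.frame(date = c("2013-05-01", "2013-05-02"),
                 avgPageLoadTime = c(4.2, 6.8))

full <- merge(q1, q2, by = "date")   # join on the shared dimension(s)
```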
Enough with scripting!
Now that the data are on our console, we can finally get some graphs. The following histogram is the aggregated page load speed histogram of this blog. You should note that a significant share of the sample units belongs to the 12-16 bucket. I suspect that they also belong to a specific country group, as the host provides good page load timings in the US and Western Europe. (Note to myself: I should add the ga:country dimension to the second query run.)
OK, this is not a nice picture at all! I know that I have experimented with various analytics scripts over the last months, plus in the first 3 months of 2013 I was using a significantly heavier WordPress theme, but I still think the sample is skewed by the geographic distribution of the readers (a new post on this will come soon!).
Extend the script to your needs
By modifying the script above I can loop over the web properties I have access to, using R to store the data and create a roll-up report in a fast way. If you look at the comments section of the scripts you will notice the following.
# In the future we should only get data for increment dates. Don't we?
get.start.date<-min(final_dataset$date)
I use this to incrementally query and store data in the final_dataset data frame (this will help with the sampling that I will run into the first time I run the script over a long period of time). I am pretty sure a cron job could be streamlined here, however I have no idea about cron jobs…
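Here is a minimal sketch of the incremental idea on a mock final_dataset; deriving the next start date with max() is my own choice for the sketch, not necessarily what the repo script does:

```r
# Mock of the stored data frame.
final_dataset <- data.frame(date = as.Date(c("2013-05-01", "2013-05-10")),
                            visits = c(100, 80))

# Query only from the day after the last stored date onwards.
get.start.date <- max(final_dataset$date) + 1
```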
Head now to https://github.com/IronistM/R_Google_Analytics !
One of the most useful, but rarely accomplished, data integrations for your web analytics tool is demographic data (in this case the tool is Google Analytics). There are a lot of approaches, but it turns out that one of them could have been in front of your eyes from the very beginning…
The data provider
We will get data from the Google Display Network, so we will need a campaign running on the GDN. If you don’t know how to set one up, it’s pretty straightforward: just set your ad campaign to “Display Network only” or “Search & Display Networks – All features.”
Add demographic targeting to an ad group
Now that you have a campaign set up, you can add an Audience to run on. There is more than one type of Audience, as you can see in the AdWords help. We are interested in utilising the Demographic option, so we will create an Ad Group that specifically serves us traffic by age group. I will copy-paste the AdWords instructions below:
- Select the ad group to which you’d like to add demographic categories.
- Click the +Change display targeting button. Click Age to update your demographic targeting by age.
- In the “Age” section, click Edit.
- Select an option below:
- All ages
- Specific ages
- 65 or more
- Click Done.
- Click Save when you’ve finished updating all of your targeting settings.
After this, we have an Ad Group to start with. Now, the tricky part: you need a budget… ;x
Spot data in Google Analytics
Where can you find the data in Google Analytics? You have likely noticed that GDN campaigns send information to Google Analytics in the Keyword dimension, such as
Obviously these are not keywords… Among them you will find keywords that are exactly the age groups you selected in your Ad Group targeting. To check, you can use the following regex filter in the Keyword report, et voila!
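The same check can be done offline; a sketch in R (keeping with this blog’s tooling) on made-up keyword values, assuming AdWords reports the age groups as 18-24 … 55-64 plus “65 or more”:

```r
# Mock Keyword dimension values pulled from a GDN campaign.
keywords <- c("18-24", "35-44", "65 or more", "some site placement")

# Flag the values that are age groups rather than real keywords.
is.age.group <- grepl("^\\d{2}-\\d{2}$|^65 or more$", keywords)
keywords[is.age.group]
```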
Segment and slice at will
Obviously the above is only of descriptive value; the real thing starts when you apply this as a segment (remember, segmentation provides insight) or apply a segment to this report. Imagine how insightful this will be when you are testing a new feature that you think will change the way visitors interact with your website or engage with the upper steps of your conversion funnel. I would be happy to have a budget set for traffic of this structure to evaluate my landing pages…
So, I got lured into setting up a back-up strategy for the present blog, mostly because I change a lot of things at times while trying to learn about WordPress. I could probably do this via a scheduled job on my hosting provider, but I am not that techy, so I searched for relevant plug-ins, and there are a ton (see this list). I would like to stress that I am getting extremely fond of Google Drive at the moment, so I narrowed the list down to those plug-ins that are capable of storing the back-up on my Drive.
Confession : I am not sure if this is optimal, though..
From my shortlist I preferred UpdraftPlus. It claims to simplify backups and restoration. You can back up copies of the WordPress databases to the cloud (Amazon S3, Dropbox, Google Drive), to FTP, or to email (the latter would not be of practical use if you have a large uploads folder; anything over roughly 20Mb will probably get cut by the email provider).
The main options are below. You can configure the frequency of the backup (the shortest is 4 hours) and the number of backups that will be retained (1 is enough for me). Next, check what should be backed up; I back up everything. You can set an email address to alert you of possible issues (if you don’t, you can always access the log of the process). Last, you can set an encryption password for use when you need to restore your WordPress database.
Configure Backup Contents And Schedule
A common issue when you have a lot of interlinking in your content is that at some point you will either delete or mistype a link URL (you rush so much to push the Publish button). There are two ways to track the page that originated a 404: the first is to use the Navigation Summary, the second is to use Event Tracking. If you are not familiar with event tracking, it is a Google Analytics gem that you can use to measure interactions with elements of your site and much more. To implement event tracking, add a code snippet to the 404 page of your site that records the previous page, i.e. the one that led to the 404.
Then you will see in your Event Reports the Event Category 404.
Drilling down into the Event Action will reveal the page that led to the 404 page. This way you can spot the page and possibly the broken link (most 404 hits stem from broken links or users typing a wrong URL).
Now I should go and fix the broken link in the Software page of mine : )
NOTE : If you are using WordPress you will find the previous page appended to the 404 URL, like this, where the homepage is generating the error
This is mostly a measurement blog, so I can safely presume that if you are a blogger yourself you are measuring the performance of your efforts using a web analytics solution. If you aren’t, then you are missing some of the exciting stuff going on with your online ramblings, I can assure you…
Now, there is sufficient data to assume that you are using Google Analytics (see the usage statistics and market share of Google Analytics for websites), but there are many other tools out there worth a look. I use more than one tool on this blog (none of which messes with your privacy, however!). One alternative is Mixpanel. I prefer Mixpanel when it comes to funnels, plus you can define formulas, making it really attractive. You can get some details from this Zippy Kid post. Now, how can you use it on your blog?
Install the basic code
Now paste this into your footer (footer.php, located under Appearance > Editor in the case of a WordPress blog) and you will have tracking on all your pages. Let’s not forget, however, that the reason we implement Mixpanel tracking on our blog, site or whatever isn’t pageviews but events, with attributes that are of interest to us.
Code for special events
I really want to have a segmentation based on the posts that visitors view on my blog prior to reaching specific goals that I have defined. Say that one goal is for someone to reach my CV page; when they do, I feel so much richer that I credit myself with 1.5 EUR (this buys me a Coca-Cola in Greece).
We just used the most basic function of Mixpanel,
which is a mix of the trackEvent and trackPageview calls that Google Analytics uses for its own tracking. We can use it plainly, to send a simple event to Mixpanel, but we can put it to better use by attaching more data. So, say that I need to monitor posts per Category, and whether they supply code to readers. Then I can break this down to individual posts as well.
Make sure you tag all your posts in a consistent way. You can get organised with a document where you keep all your tracking codes so you can simply paste them over your new blog theme when the time comes :).
Another interesting thing to track is whether visitors that use your search box get results or not. Paste the following inside the section of SearchResults.php that determines that results are delivered. You will identify it by something like the following, which essentially marks a code area that has posts (i.e. results for the corresponding search query).
Now insert your Mixpanel tag
Of course you can assign a Value to this; in fact you should definitely do so…
Next go to the comments.php page and add a tag that fires an event when a comment is made, and assign a value (I credit myself with 11 euros for each comment).