As we visit with customers to prepare their analytics and marketing optimization strategies for 2015, a few key questions keep emerging. These all relate to the evolution of the analytics and optimization markets, not just with respect to tools and capabilities but to governance as well:
- How does my practice become more proactive and less reactive?
- How do we make web analytics data part of a “Big Data” initiative?
- How do we get a better understanding of customer behavior across web, mobile, and social channels?
Here is some practical advice on how to address these questions.
How does my practice become more proactive and less reactive?
Does your team behave more like mechanics, waiting for warning lights to come on the dashboard and then reacting, or more like an R&D team building a race car, convinced it can always be faster and more efficient? Mechanics fix problems; R&D teams try to get every last drop of performance out of the machine.
Whether we’re talking analytics or optimization, getting out of a mostly reactive cycle is key to regaining control of your daily work (and your sanity). This is a governance problem, and it starts with setting clear analytics and optimization objectives. For example, in a previous life (most of our consultants have been practitioners), one of our consultants used the table below to prioritize his team’s work.
The Key Performance Indicator (KPI) is Revenue Per Visit (RPV), and the idea is to find the product pages that have the greatest potential upside based on current traffic and conversion rates. With the upside possibilities identified, two things happen:
- First, the team can focus their efforts on the pages that matter for RPV. Test plans, design exercises, and promotions can be prioritized accordingly.
- Second, it gives them an easy-to-understand tool for communicating what they’re working on and why. Executives see the value of the work at a glance, and the team can weigh ad hoc requests against the RPV KPI.
Of course, ad hoc requests don’t go away; they’ll always be part of the job. But a tool like this frames the business conversation, helping you set priorities and realistic expectations about when an analysis or test can be completed.
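To make the prioritization concrete, here is a minimal Python sketch of the upside calculation. The pages, visit counts, and RPV figures are entirely hypothetical; in practice these inputs come straight from your analytics tool.

```python
# Minimal sketch of RPV-upside prioritization. All figures below are
# hypothetical, purely for illustration.

pages = [
    # (page, monthly visits, current RPV ($), site-benchmark RPV ($))
    ("/product/widget-a", 50_000, 1.20, 2.10),
    ("/product/widget-b", 120_000, 1.85, 2.10),
    ("/product/widget-c", 8_000, 0.90, 2.10),
]

# Upside = visits * (benchmark RPV - current RPV): roughly the monthly
# revenue left on the table if the page performed at the site benchmark.
ranked = sorted(
    ((page, visits * max(benchmark - rpv, 0.0))
     for page, visits, rpv, benchmark in pages),
    key=lambda row: row[1],
    reverse=True,
)

for page, upside in ranked:
    print(f"{page}: ${upside:,.0f} potential monthly upside")
```

Pages with high traffic but below-benchmark RPV float to the top of the list, which is exactly where test plans and design work should go first.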
How do we make web analytics data part of a “Big Data” initiative?
This question is coming up more and more as we head into next year. Web analytics data is notoriously difficult to make sense of outside of the native tools – Adobe Analytics, Google Analytics, or whatever you use. The problems are not small:
- Non-human traffic clutters the data set. Figuring out what is a “robot” and what is a human isn’t exactly straightforward, and once you have, you still need to scrub the robot-generated traffic from the data (see the sketch after this list).
- Derived metrics, like conversions and other events, are often lost in extraction. You have to put them back together by recreating the original calculations and logic.
- The APIs often require a great deal of knowledge of the data set, and even if you have that knowledge, it can be difficult to confirm that the data you’re getting back is the complete set of what you want.
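As a concrete example of the scrubbing problem, here is a minimal Python sketch of user-agent-based bot filtering. The regex is a crude stand-in; production implementations typically match against a maintained list such as the IAB/ABC International Spiders & Bots List, and the sample hits are hypothetical.

```python
import re

# Crude user-agent check; a stand-in for matching against a maintained
# bot list. The sample hits below are hypothetical.
BOT_PATTERN = re.compile(r"bot|crawl|spider|slurp|wget|curl", re.IGNORECASE)

def is_probable_robot(user_agent: str) -> bool:
    """Flag hits whose user agent looks like a known crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))

hits = [
    {"user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X)"},
    {"user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
]

human_hits = [h for h in hits if not is_probable_robot(h["user_agent"])]
print(f"{len(human_hits)} of {len(hits)} hits kept after scrubbing")
```

User agents are only one signal; IP ranges and behavioral patterns (impossibly fast page sequences, no JavaScript execution) usually get factored in as well.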
Numeric sees two strategic approaches to this challenge.
The first approach involves extracting, transforming, and loading data from the native tool. This is the more difficult path: it will likely involve custom IT work and a data scientist who really knows what is coming across in the data feed and how to clean it. It’s also not a one-time deal. New robots, new metrics, and other changes to the native system will require frequent updates to the custom solution.
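As an illustration of the derived-metrics problem in this approach, here is a minimal sketch of rebuilding an orders count from a hit-level feed. The column names (`excluded`, `event_list`) and the purchase event id are assumptions; check your vendor’s feed documentation for the real equivalents.

```python
import csv

# Rebuild a derived "orders" metric from a raw hit-level export.
# Column names and the purchase event id are hypothetical.
PURCHASE_EVENT_ID = "1"

def count_orders(path: str) -> int:
    orders = 0
    with open(path, newline="") as f:
        for hit in csv.DictReader(f, delimiter="\t"):
            if hit.get("excluded") == "1":
                continue  # skip hits already flagged as bots/internal traffic
            events = (hit.get("event_list") or "").split(",")
            orders += events.count(PURCHASE_EVENT_ID)
    return orders

print(count_orders("hit_data.tsv"))
```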
The second approach is more of a “direct injection” approach that uses something like Numeric’s Framework (a JavaScript file) to provide a consistent, reliable data feed directly to a big data repository. If you have a tag management system (TMS), check whether the vendor has a connector to whatever repository you’re using (Hadoop, MongoDB, Amazon Redshift, etc.).
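The shape of that direct feed matters more than the transport. A minimal Python sketch of the idea, with a hypothetical collector endpoint and payload, might look like this:

```python
import json
import urllib.request

# Each page event is sent as one consistent JSON payload to a collector
# that writes straight into the repository. The URL and fields below are
# hypothetical, purely to show the shape of a "direct injection" feed.
COLLECTOR_URL = "https://collector.example.com/events"

event = {
    "visitor_id": "abc-123",
    "page": "/product/widget-a",
    "event": "add_to_cart",
    "channel": "web",
}

req = urllib.request.Request(
    COLLECTOR_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 2xx means the collector accepted the event
```

Because the payload schema is fixed at collection time, the repository receives clean, queryable events instead of a vendor export that has to be reverse-engineered.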
How do we get a better understanding of customer behavior across web, mobile, and social channels?
This is often the question that drives the need to use web analytics data outside the native tools. An easy first step is to undertake a pilot Data Visualization project using a tool like Tableau or Qlik. These tools allow you to build a dashboard that you can use to quickly compare activity across channels. Here are the basic steps:
- Set your objectives and keep them simple. Maybe you want regional comparisons of visits to your Facebook page, YouTube video views, and purchases from your ecommerce site.
- Define your data sources. In this example there are three: Facebook, YouTube, and whatever you’re using for web analytics. You may also have a campaign management tool like Pardot, HubSpot, Marketo, Eloqua, or similar that can serve as a source.
- Define your data mapping. Decide which fields from each source actually matter; it’s usually not all of them. You’ll very likely go from many tables to just a few. This mapping should take place in an ETL tool like Pentaho or Talend (see the sketch after these steps).
- Run your first transformations and load the results into your DataViz tool to see what you have.
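As a rough illustration of the mapping and first-transformation steps, here is a minimal sketch using Python with pandas. The file names and columns are assumptions standing in for whatever extracts your sources produce.

```python
import pandas as pd

# Reduce three hypothetical regional extracts to one tidy table that a
# DataViz tool can read. File names and columns are assumptions.
facebook = pd.read_csv("facebook_regional.csv")    # region, page_visits
youtube = pd.read_csv("youtube_regional.csv")      # region, video_views
purchases = pd.read_csv("ecommerce_regional.csv")  # region, purchases

combined = (
    facebook.merge(youtube, on="region", how="outer")
            .merge(purchases, on="region", how="outer")
            .fillna(0)
)

# One table, ready to load into Tableau or Qlik.
combined.to_csv("regional_activity.csv", index=False)
```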
From here, it’s a matter of iterating with new visualizations and maybe adding some new data sources until you’ve met your objectives. Don’t worry about automating data integrations yet. That comes later once you are happy with what you have.
A typical pilot DataViz project like the one described above doesn’t have to be a massive undertaking; we’re averaging around 150-225 hours to complete DataViz pilots.
Conclusion
As the analytics and optimization disciplines continue to evolve, these kinds of questions become more pressing every day. Getting a handle on how you’ll address them can be a key strategic driver for 2015.