Everyone Take A Deep Breath

August 10, 2011

I love the web analytics industry for many reasons, but one of them is the pure passion that so many folks bring to this little niche that none of us grew up anticipating we would work in someday.  I’m proud to be able to say I work among peers who have defined what our industry does, come up with a code of ethics, and continuously think about, and challenge, what it means to work in our industry.  I also appreciate the passion, the thoughtfulness, and the sense of humor I have found daily in the #measure community.

Yesterday’s conversations around the strengths and weaknesses of various tools, and the ongoing conversations around why we still see broken implementations in 2011, have me thinking about how we find ourselves in this situation, and how we move forward.  So here is my first (ever) blog post, and my thoughts on the subject:

About me: I work on the client side.  I have experience analyzing data in three different web analytics tools, and that experience goes back several years, but my implementation experience is limited to one tool.  I fall into the “stronger on the business side than the implementation side” camp.  My implementation experience started about 3 years ago, and continues to this day.  I now have our web analytics tool running across 40 websites.  I am still responsible for those original implementations and the data that gets collected and reported on across all 40 sites.  I don’t take that responsibility lightly.  Also, I am not perfect.  Far from it.  But I do consider myself *reasonably* intelligent.  And motivated to put my best effort into my job every day, and appreciative of the tools I use, which help me analyze data and get information out to my stakeholders on a daily basis.

Since I have experience “implementing an analytics tool with no prior experience implementing an analytics tool” (wait, don’t we all fall under this category at some point?), I think I’m qualified to point out a few of the pitfalls I experienced, and why I think there is still opportunity for improvement.

When it came time to implement our new web analytics tool, there were three groups involved.  Myself from the client side, some analytics consultants from an agency building some new websites, and an implementation consultant from the tool vendor.  We all had good intentions.  We all wanted to succeed.  We all wanted to do a good job.  I truly believe that the vendor and our consultants had my organization’s best interests in mind.

We discussed business requirements at length: we talked about which metrics were important, and how we wanted to measure our success.  That portion of the project was relatively smooth for me, as it is the part of the process where I feel more comfortable, but I also know we missed a few things (and I take responsibility for that issue contributing to a solution that is not 100% perfect today).  What was not as easy for me was figuring out how to capture this data in a manner that would allow our organization to report on it in the exact manner we envisioned, using this brand new tool that I knew little about.  I felt bombarded by questions and conversations that revolved around whether to pass data into both an eVar and a prop, or just one of them, and when to do this… as well as… should we set a regular event, or a custom event?  Do we need a VISTA rule, or is there a plug-in to support the logic we needed?  Or maybe we need a VISTA rule AND a plug-in (oh, did I just disclose who my vendor is?).  Oh, and do I want to correlate or sub-relate each of these pieces of data?  Dizzying, from my perspective.
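To give a sense of what those conversations eventually turn into on the page, here is a rough sketch of classic SiteCatalyst-style page code.  The report suite ID, variable slots, and values below are made up for illustration; they are not our actual implementation.

```javascript
// Rough sketch of classic SiteCatalyst page code.
// Report suite ID, variable numbers, and values are hypothetical.
var s = s_gi("myreportsuite");

s.pageName = "products:widget-detail";
s.channel  = "products";

// The same value often gets set in both a traffic variable (prop) and a
// conversion variable (eVar), because each one surfaces in different reports.
s.prop1 = "widget-detail";   // prop: pathing, correlations
s.eVar1 = "widget-detail";   // eVar: persistence, sub-relations

// And then: a standard event (purchase, cart add) or a custom one?
s.events = "event1";         // e.g. a custom "lead form started" event

s.t();                       // fire the page-view image request
```

Multiply that handful of decisions by every data point in your requirements, and the “dizzying” part starts to make sense.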

I’d be fine if, in the end, we got exactly what we set out to get on Day 1, even if a missed requirement resulted in a missing report.  But something went wrong.  Reports that were supposed to show consistent results weren’t showing consistent results.  Data we expected in one report was showing up in a different report.  We couldn’t find a logical way to run paid and natural search reports side by side.  I didn’t know yet that we were getting burned by the form analysis plug-in in a really big, bad way (that learning was still months away).  What was difficult to digest then, and still is today, was that the vendor’s consultant was not able to anticipate and mitigate these issues before they occurred.  He had our best interests in mind, but he also struggled. A lot.

Thankfully I have one of the most wonderful account managers in the world, and she helped advocate on our behalf and extra resources were corralled to help fix our problems.  Ultimately the vendor also owned up to those problems, and for that, Omniture, I am grateful and I have a lot of respect.

Since then, we have sought to collect additional data using our tool.  We’ve launched new websites, new campaigns, and new initiatives, which have involved thinking about how to scale our implementation to grow and collect more data.  I have reached out and asked how to do something new, something different, and I have literally received three different answers from three different resources at said vendor about how to track the same thing.  That troubles me.  A lot.

My issue is not necessarily whether the tool is engineered the best way, or whether it has kept pace.  It certainly sounds like there are opportunities for improvement.  It also sounds like many tools out there face similar challenges.

My issue is this: If the people who are supposed to have the most knowledge about a tool are struggling to advise me how to implement their tool, what am I supposed to think?  And what am I supposed to do?  Is all the complexity of the tool worthwhile if no one can tell me how to leverage it?  Today, it makes me think twice, and I am loath to make major changes.  Ultimately, is this what a vendor wants to hear from their customer?  That their customer is hesitant and a little scared to adopt more functionality or make changes?  I don’t think so.

Improvements to the engineering of analytics tools will hopefully make some of these issues go away, but I believe tool vendors also need to focus on how well their own resources are able to advise clients on the usage of their own tools. 

Ultimately:

  • I believe the folks on the client side own the responsibility of gathering business requirements and bringing those to the table at the beginning of an implementation.  The vendor cannot anticipate which metrics are most valuable, or even what those metrics are without client input.
  • I believe the vendor, who has programmed and coded their tool to function a specific way, is responsible for helping the client translate their business requirements into technical requirements for implementation.   The client cannot anticipate data processing rules which impact how the data gets from their website into the correct variable. 
  • I believe this is not easy, and it requires both sides to work closely together. 
  • And I believe there are still improvements to this process which both sides can and should work on together.