Nov 23 2010

Competing On Customer Experience

Category: web analytics
By Tom Shivers

 


Jenny: how do text ads vs. image ads affect user experience?
Eric Feige: Jenny, your question is one that Usability Sciences has addressed. It really depends on the persona, the user goals and the scenario that they are completing. Image ads appeal more to customers/users who need a great deal of information and context regarding the product that they are evaluating. Other user profiles, on the other hand, purely want to get through the task at hand as quickly as possible and just need the basics of what is in the advertisement / promotion. Online coupons fit into this category – though our experience and research findings indicate that a great deal of instruction (e.g. print the coupon and take it with you) is required.

Paula: how can one measure customer experience?
Eric Feige: Paula, measuring customer experience requires a holistic approach that typically requires a variety of tools and service providers. Examples…. Web Analytics, Attitudinal Analytics, Usability, Persona Research…

JP: Not clear as to where cust. experience and online marketing intersect…
Eric: We are finding that more and more businesses look at customer experience as four integrated, inter-related phases: (1) Attract/Differentiate, (2) Convert, (3) Service/Fulfill, (4) Extend/Loyalty. For some marketers, online marketing is really just phase (1). But companies that compete on customer experience align online marketing with all four phases… We are beginning to see interactive marketing morph into other functional areas focused on the customer lifecycle.

Tom: how are attitudinal analytics measured?
Eric: Tom, Attitudinal Analytics is a customer experience solution that we offer, so I’ll define AA from a Usability Sciences point of view… and offer some nuances. AA is a quantitative customer experience solution that “intercepts” several thousand user sessions – gathering intent and attitudes along with tracking actual user behavior across multiple domains. So out of 3000 users, AA tells us which personas are succeeding and failing by intent and a variety of other measures. This large sample can be integrated with Web Analytics such as Omniture to provide a more compelling and tangible story of what is happening and why online:
http://www.usabilitysciences.com/services/online-user-experience-research/attitudinal-analysis-with-clickstream-visualization/
Plug for our definition of Attitudinal Analytics in the URL above…

Paula: Does AA provide a single metric for measuring customer experience…. like customer satisfaction?
Eric: Paula, I think the days of a single metric died a few years ago. This Forrester report has some great “starter metrics” that cross company functions (e.g. marketing, customer service, etc.).

Jenny: I’m sorry, I didn’t frame my question properly. I was thinking from the publisher’s point of view and its users. I’m wondering if showing image ads will make the website look more “commercialized” and therefore less pleasing to the user.
Eric: Jenny, we work with many publishers who constantly balance between increasing ad revenue through online ads and improving usability. We routinely test different types of ads online and in our labs. Not surprisingly, some ads distract users and they end up not effectively completing their intent (why they came to the publisher in the first place). So, conducting user experience projects beyond redesigns is a key reason why businesses opt to continuously “listen” to the customer and monitor their experience. We all want to increase conversion rates, but optimization of the conversion process is a continuous process and not a one-time project. Using Attitudinal Analytics as a solution example, companies will learn something new every quarter about improving conversions by listening to their customers and obtaining an objective understanding of what is working and what is not working…

An example might help…
We worked with a very, very large retailer whose complete site was designed primarily for male personas. When looking at success and failure by intent and by key demographics such as gender, we discovered that women were failing miserably while men were satisfied with the experience. The data (roughly 10,000 user sessions) clearly indicated a need to revamp key sections of the site (editorially, with context rich images, colors, etc.). When the actionable recommendations were implemented on the sections in which women were failing, conversion rates for this particular user segment went through the roof. Traditional web analytics data would not have been able to pinpoint failures with certain demographic and psychographic profiles.
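The kind of cut described here – success and failure sliced by a demographic field across thousands of sessions – can be pictured in a few lines. This is illustrative only: the record shape (`gender`, `succeeded`) and the `successRateBySegment` name are invented for the sketch, not part of the actual AA tooling.

```javascript
// Illustrative sketch: compute success rate per segment from a set of
// intercepted session records. Field names are hypothetical.
function successRateBySegment(sessions, segmentKey) {
  var totals = {};
  sessions.forEach(function (s) {
    var seg = s[segmentKey];
    if (!totals[seg]) {
      totals[seg] = { success: 0, total: 0 };
    }
    totals[seg].total += 1;
    if (s.succeeded) {
      totals[seg].success += 1;
    }
  });
  // Convert counts into a rate per segment value.
  var rates = {};
  Object.keys(totals).forEach(function (seg) {
    rates[seg] = totals[seg].success / totals[seg].total;
  });
  return rates;
}
```

A cut like this is what page-view analytics alone can’t give you: the same conversion funnel, split by who the visitor is and what they came to do.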

Paula: I use Google and it doesn’t help me understand what is not working right for my site visitors… only page views and such. I need something that tells me where my customers are failing on my site. How do I capture attitudes and behavioral data?
Eric: Joining attitudinal and behavioral data with Google Analytics (or any other analytics solution) is powerful, and it takes just a couple of lines of JavaScript at the entry point of the customer experience (generally the home page). Yes, it is software. But it is a hosted software model where the end client implements the JavaScript as they do with Google Analytics. The software is called WebIQ (Usability Sciences software). The end-to-end solution, based on the WebIQ software, is Attitudinal Analytics.
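The “couple of lines of JavaScript” deployment can be pictured with a sketch in the classic hosted-analytics style. This is purely hypothetical – WebIQ’s actual snippet is not public – and the `_wiq` queue name, the `drainQueue` helper, and the account ID are all invented for illustration.

```javascript
// Hypothetical sketch only -- not WebIQ's real tag. The pattern is the
// one hosted analytics tools use: the page fills a global command queue
// before the vendor script loads, and the script drains it on arrival.
var _wiq = _wiq || [];                    // shared command queue
_wiq.push(['setAccount', 'EXAMPLE-ID']);  // placeholder account ID
_wiq.push(['trackEntry']);                // record the entry point

// What the vendor script would do once loaded: replay every queued
// command against real handler functions.
function drainQueue(queue, handlers) {
  queue.forEach(function (cmd) {
    var name = cmd[0];
    var args = cmd.slice(1);
    if (handlers[name]) {
      handlers[name].apply(null, args);
    }
  });
}
```

The design choice behind the queue is that commands recorded before the vendor script finishes loading are not lost; the script simply replays them when it arrives, which is why a tag this small can sit on a busy home page safely.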

Eric: Do any audience members operate internationally?

Paula: how does webIQ capture attitudinal & behavioral data?
Eric: The deployment is an entry survey. We try to capture just the basics upon entry and quite a bit more as we intelligently track behavior through the site. When you get to womenswear, for example, and we know that your intent is to purchase for the holidays and that you are a woman, we’ll ask questions about the experience once you get to this point. We’ll track your path to this point and query you at the exit point. Keep in mind the exit point can be another site. All of the pathing behavior is analyzed by our team along with the user’s open text feedback. We’ll correlate what happened offline. And, after several thousand user sessions, you have a quantitative understanding of a multitude of customer experiences.

JP: The survey can generate resistance and thus throw the results off, no?
Eric: JP, the resistance question is a good one. The invitation survey is set to N in X – that is, whatever frequency the client or the interactive agency desires. N=20 is typical for heavily trafficked sites with strong brand awareness. Users opt to participate because they want the experience to improve. The invitation survey is branded with the client’s logo and “look and feel,” so our acceptance is directionally good. Our ecommerce clients understand the trade-offs between gathering quantitative customer experience insights and completing the potential sale. Those that rely on the insights have no problem asking their customers for some of their time.
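The “N in X” invitation frequency can be sketched as a simple sampling gate. This is an assumed interpretation (invite roughly one in every N sessions), not WebIQ’s actual logic, and `shouldInvite` is a made-up name.

```javascript
// Assumed interpretation of the "N in X" invitation gate: invite
// roughly one visitor in every n sessions. Not WebIQ's actual code.
function shouldInvite(n, random) {
  random = random || Math.random;  // injectable for deterministic tests
  return random() < 1 / n;         // n = 20 -> about 5% of sessions
}
```

Keeping the rate configurable is the point: a heavily trafficked brand can still collect thousands of sessions at 1-in-20, while a smaller site would raise the frequency to reach a usable sample.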

Tom: does this quantitative understanding of customer experiences also need an interpreter?
Eric: Tom, yes. I would call them analysts. We employ analysts to go through the data, behavior, open text, etc.. to provide the findings and recommendations. We also work very frequently with interactive agencies that also analyze the customer experience data. And, of course, our clients consume the final deliverable and they take it to another level of presentation and analysis – generally to executive management.

I asked about international operations, because this type of customer experience measurement and improvement solution has a “sweet spot” when your customers come from multiple countries or from diverse communities. In that case, the customer experience data (behaviors, attitudes, intents, etc.) will arrive in Chinese, Spanish, French, English (US), English (UK), and so on. Customer experiences tend to vary by national origin and language spoken. We translate the survey, and the incoming data is translated as well – 25 different languages so far…

Eric: I’ll send out a couple of other URL’s that I think are interesting..
Forrester’s customer experience research helps customer experience professionals and interactive marketing professionals compete effectively in a world where empowered consumers are getting harder than ever to win and keep. Key topic areas include benchmarking customer experience, building the business case for change, and transforming organization, culture, and process. Here’s some bad news for old school marketers: Today’s empowered consumers are getting harder to win and keep. Over the past three years, they’ve become more likely to research products online and less likely to be influenced by advertising. Even worse, during the past four years, the percentage of consumers who think price is more important than brand name has steadily increased. So how can companies attract and retain customers who have access to infinite information across their choice of traditional and digital channels?

The above is from the summary from the Forrester document. It is a good read.

Paula: do I run this software occasionally or all the time?
Eric: The software can be implemented to run continuously – but configured every couple of months to assess different parts of the customers’ experience. E.g., if you just want to understand how users came to find out about your site – which online ads resonated with which user profiles – that may be one “deep dive” into the experience. Once you understand that, you may explore how they browse / search for products. Then you might focus on cart abandonment issues, etc. Going back to the question of which type of ad or widget is better: given the complex nature of customer experience, simple rules of thumb are no longer applicable.

Eric Feige is no longer with Usability Sciences, but can be reached via his LinkedIn profile.

If you enjoyed this post, make sure you subscribe to my RSS feed! You can also follow me on Twitter here.