Note: This blog post was originally published on the Clarabridge Blog - you can find the original post here.
I visit customers, prospects, and partners fairly regularly, and a number of common topics come up during those visits: updates on our product releases, new partner activity, best practices, and project reviews. We also discuss the business impact of the Clarabridge solution: what customer insights our customers find, and how they use text analytics to improve customer value, customer experiences, and customer loyalty.
On two recent occasions, the conversations veered into provocative territory as we considered some interesting ethical implications of mining customer experience data.
Obligations as a listener
In one meeting at a healthcare company (attended by business, technical, and legal representatives), the attendees wanted to know how much they could mine, sort, and even “filter” text content before delivering it to business analysts. The company has strict protocols for identifying and communicating safety and quality issues reported by patients and providers, but it wasn’t sure what to do with unsubstantiated information found on a social media site, or whether all insights should go to all analysts. Generally, healthcare companies are obligated to reach out to patients and doctors to provide guidance and support when there is a problem and, depending on the problem, to report their findings to the appropriate federal agency.
When it came to text mining of customer content, however, the company had questions. What is the reporting requirement if the feedback comes from an anonymous survey? Or from a consumer posting on Twitter? How much obligation does a company have to monitor, mine, and intervene in the social media world? At present there are no rules, federal guidelines, or even well-defined best practices for using social media monitoring to identify and counsel customers when safety or quality issues surface. Should companies get out ahead of the government and develop progressive practices? Or should they wait?
More importantly, once you start “listening” to the voice of your customers using text analytics and monitoring technology, are you now obligated to act on the insight? And if you don’t listen, are you free of any obligation to act? What is the moral or ethical imperative of using text mining to understand your customers better when they are talking about product quality, safety, side effects, and even morbidity? Should a company seek to quantify this feedback?
These questions arise in many industries, not just the healthcare sector. If a client in the financial services sector believes a contract has been broken and threatens to litigate in a blog post, an angry call to a call center, or a survey rant, how seriously should the company take the threat? If a patron walks into a retail franchise location and spies someone they believe to be a violent felon or sex offender working behind the cash register, what is the parent company’s obligation: to notify the franchise, or to enforce an HR action on it? Insights like these are often latent in “voice of the customer” feedback; our customers have surfaced them using Clarabridge, where previously the information was largely invisible.
Ultimately, text mining helps a company listen, analyze, and measure the extent of a problem or the outcome produced by an action. But taking action, whether it is a sales, marketing, or support decision or, in the cases outlined above, a safety, risk management, or criminal prosecution decision, ultimately depends, to me, on the organization’s will and commitment to act on the insight. What do you think? What policies has your company put in place?