Monday, November 25, 2013

Focus on the key moments of the customer process

I was interacting with a person who introduced himself as a "customer experience expert". I am seeing this title more and more often. I asked him what he does and what his deliverable would be. He described it as designing the customer interface along the complete process of customer interaction. For example, on a website he would help design the page layout, the forms (single page or multi-page) and the menu items. For an in-store process, it would be the store layout and the checkout process.

An increasing number of companies are now "walking the process in the customer's shoes". Their aim is to make every step of the process as customer-friendly and easy as possible. And this is exactly where they falter.

The Nobel Prize-winning psychologist Daniel Kahneman once observed:

"Human Beings only remember the peak and the end moments during an experience process."

And this is very true. Let's consider a queue for submitting, say, college admission forms. A typical process: queue up early, then await your turn. The person accepting the forms checks your documents, and you hope that everything is in order. If it is, the form is accepted; if not, you need to get additional documents or information and perhaps rejoin the queue.

Now, let us evaluate this process. The peak moment is the relief from the anxiety about whether the application docket is complete. The end moment is receiving the acknowledgement of submission. The college cannot do much about the number of people queueing up, but it could address the peak moment early: a staff member could walk down the queue, check each applicant's documents and give advice. The anxiety is thus eliminated much earlier, and the wait is only for the queue to move so the documents can be submitted. The end moment is receiving the acknowledgement receipt; the college could hand the applicant a bottle of water along with it. After all, he has been in the queue for over an hour.

But what we find in reality is the college trying to rush the queue by adding more acceptance desks, security trying to bring some sanity to the multiple queues that form, and the crowd experimenting with unruly behaviour in the hope of jumping spots in the queue.

I, for one, have been through this scenario. The only thing I remembered was happily holding the submission receipt that confirmed my admission to the college. The two-hour wait was forgotten. The anxiety was forgotten.

Businesses would do well to apply this analogy to their customer-facing processes. The first step is to identify the peak moment of the process and address it as early as possible. The next is to make the end moment, or exit, more pleasant.

For an online store, the peak moment would be creating the shopping basket. I have experienced websites where, once I click the "buy" button, I am taken straight to the shopping basket for checkout. If I want additional items, I am lost on this page; I need to go back or hit the home page. Other websites allow adding to the "shopping intention basket" from any page, in the background. The site stays on the same page and provides an uninterrupted experience of searching for additional buys. Finally, when I am ready, I get to review my basket and add or modify items in it. The checkout itself can often be a one-page activity. With techniques like AJAX this is a reality, and I don't understand why more websites don't adopt this approach. Too often, checkout means at least four pages of activity.
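To make the idea concrete, here is a minimal sketch, in Python with Flask, of what such a background basket service could look like. The endpoints, the in-memory cart store and the session handling are my own assumptions for illustration, not any particular retailer's implementation; the point is simply that the page can call the service asynchronously and stay where it is.

```python
# A minimal sketch (assumed endpoints, in-memory store) of a cart service that
# returns JSON, so the product page can add an item asynchronously and stay put.
from flask import Flask, jsonify, request

app = Flask(__name__)
carts = {}  # session_id -> {sku: quantity}; a stand-in for a real basket store


@app.route("/cart/<session_id>/items", methods=["POST"])
def add_item(session_id):
    payload = request.get_json(force=True)
    sku, qty = payload["sku"], int(payload.get("quantity", 1))
    cart = carts.setdefault(session_id, {})
    cart[sku] = cart.get(sku, 0) + qty
    # The page only needs the updated count to refresh its basket badge.
    return jsonify({"items_in_cart": sum(cart.values())})


@app.route("/cart/<session_id>", methods=["GET"])
def review_cart(session_id):
    # Called once, when the customer is ready to review the basket and check out.
    return jsonify(carts.get(session_id, {}))
```

The design choice the sketch illustrates is that "add to basket" is a background call, while "review basket" is the single page the customer visits only when ready to check out.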

Focusing on the peak and end moments simplifies life and helps the business concentrate on the key aspects of customer experience. The critical question is: "Do we know the peak moment of our customer process?"

Tuesday, October 22, 2013

Campaigning is not for the Wild-Hearted

An amazing fact came to the fore while watching a National Geographic documentary on the hunting and defensive skills of wild animals. The key to survival was not strength or size or venom ... it was patience. In one episode, a group of three lionesses laid siege to a watering hole for over three hours before the first zebra showed up. Even then, thanks to the impatience of one young lioness, they lost the hunt. If only the young lioness had waited a few more minutes, they would have had the zebra trapped in the vicious triangle they had created. Here the strength of the lions was of no use in achieving success. Another episode showed a fish lying still under the sand until its prey came close enough. Time to kill ... over two hours.

This was an amusing fact. The law of the wild rewards the one with the most patience. But then nature has one resource that is unlimited ... TIME. Alas, we who live in the concrete jungle do not have access to unlimited stocks of time. There is always someone practising to run faster, jump higher, become stronger.

Analytics was brought in to make organizations more nimble by using foresight, or predictive insights. Knowing what is likely to happen in the future gave businesses more time to adjust their plans and approaches. But as more and more organizations adopt analytics, the law of faster, higher and stronger is taking over. Analytics vendors have already started talking of automation and analytics factories. In-memory analytics is another subject area gaining popularity. These approaches are aimed at operationalizing analytics much faster.

In light of this scenario, one cannot have a three-month project plan for an analytical exercise. The secret mantra is to fail fast. This is especially true in marketing. Campaign managers still make project plans that run into weeks for each campaign. By the time the campaign is launched, a lot of time, effort and money has gone into the preparation. To justify this investment, the campaign managers then try to keep the campaign above the red line. This may involve additional effort, more money or more precise analysis.

The catch is that while the campaign was in the planning phase, the world around the business kept moving. Things change very rapidly in the consumer business. So when the campaign eventually gets launched, it is launched into a different world than the one referenced during the planning and analysis phase.

I was surprised while discussing this with an ex-team member who is currently implementing a "multi channel campaign management" product (I will refrain from naming the product). He had run into some issues and had called to check on some configuration. He told me he was too busy because it was the "go live" weekend for a campaign. It turned out this campaign had been in planning for over a month, and the customer had a one-week UAT (user acceptance test). I was shocked and amused. I am fairly confident this customer had no idea of BTL campaigns. The best UAT is out in the field: he should have run a quick test campaign to maybe 100 or 500 customers, or 2% of the customer base, and checked the results as quickly as possible, depending on the industry even within a couple of hours. If it worked, he could have rolled it out across the customer base; if not, he could have looked for something else. A colleague once shared a nice scenario with me: a typical day in the life of a campaign manager should be "a new idea by 0800 hrs, a new campaign by 1000 hrs, a test campaign by 1200 hrs, evaluate results by 1400 hrs, reject or deploy the campaign by 1600 hrs, track the campaign by 1800 hrs".
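To make the "test campaign by 1200 hrs, evaluate results by 1400 hrs" step concrete, here is a minimal sketch in Python of how a quick test cell could be evaluated against a control group. The response counts and the significance level are invented for illustration; it is a sketch of the principle, not a prescription.

```python
# A minimal sketch: compare the response rate of a small test cell against a
# control cell with a two-proportion z-test, then decide to deploy or reject.
import numpy as np
from scipy.stats import norm


def evaluate_test_campaign(test_responses, test_size, control_responses, control_size):
    """Return the observed lift and a one-sided p-value for the test cell."""
    p_test = test_responses / test_size
    p_ctrl = control_responses / control_size
    p_pool = (test_responses + control_responses) / (test_size + control_size)
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / test_size + 1 / control_size))
    z = (p_test - p_ctrl) / se
    p_value = 1 - norm.cdf(z)  # one-sided: is the test cell responding better?
    return p_test - p_ctrl, p_value


# Invented numbers: 500 customers in each cell, counts of responders.
lift, p_value = evaluate_test_campaign(test_responses=38, test_size=500,
                                       control_responses=21, control_size=500)
print(f"lift={lift:.3f}, p-value={p_value:.3f}")
# Deploy across the base if the lift is positive and the p-value is small;
# otherwise reject the idea and move on to the next one.
```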

But every time I present this case, the idea does not find acceptance. Maybe it puts stress on the campaign manager: now he has to come up with at least 10 new ideas to test during the day, alongside the campaigns of days past. Most probably 9 out of the 10 will get rejected in the test phase, and one campaign gets rolled out along with the others. The next day starts with another 10 ideas. Compare this with the scenario where he takes a month to plan and launch one campaign, and his rejection of the idea is understandable.

I have seen marketing departments with 7 to 10 campaign managers running maybe five times that number of campaigns. Some of these campaigns have been in force for over three or six months. On the other side, I have met companies that claim to run over 1,000 campaigns daily. I seriously doubt how they calculate the contribution from these campaigns. The world outside has changed a lot over the past three months, so how can a campaign perform uniformly over that period, let alone thousands of campaigns?

Somewhere, somehow, complacency has set into the process. This is where a nimbler competitor can overrun the business. Get your campaign department to run finer, more numerous campaigns with a shorter turnaround, if possible a turnaround of a few hours. That is a sure-shot recipe to beat your competition. For a man of patience belongs to the wild world, not the business world.

Tuesday, September 03, 2013

Article: Relationship Based Pricing

Today I take a shortcut: an article, authored by me, that was published in the periodical issued by Tata Consultancy Services Limited's BaNCS team. The article covers the steps towards making relationship-based pricing a reality in the banking environment. Similar steps will apply to other industries where such pricing can bring a unique customer relationship definition and enhanced commercial benefits, such as hotels, travel, high-end household goods and e-commerce. If anyone is interested in adopting the approach for their industry, get in touch with me at michaeldsilva@gmail.com.


TCS-BaNCS-Whitepaper-Making-Relationship-based-Pricing-a-Reality-in-Financial-Services

Do not forget to provide me your views / feedback / comments / critiques.

 

Tuesday, August 06, 2013

Reverse Deduction ... It is Elementary My Dear Readers

I have started re-reading the classics over the past six months. The last time I read books such as David Copperfield and Jekyll & Hyde was during my school days, when they read more like fantasy or science fiction. Today, after almost three decades of living, the same books take on a more philosophical tint. I recently finished reading the complete Sherlock Holmes stories, so expect a few articles influenced by Sherlock Holmes and Sir Arthur Conan Doyle.

As an analytical person, my first curiosity was about how Holmes solves seemingly complex crimes. I managed to string together the theory from the hints Holmes drops during his discussions with Watson, and to the statistician in me it connected very well. Today's post is on how Holmes solves his crimes and its relevance to modern-day business.

A typical approach to analytics is to identify a particular event ... say customer attrition. The analytics consultant goes about collecting the data elements that he believes may influence customer attrition. Then he builds a predictive model to identify the data elements that are most significant to customer attrition. Once they are identified, the next step is to contextualize the significance of each data element; refer to my post on micro modelling <<click here to read>>. Often the analyst will find clusters of customers behaving differently, so he starts segmenting and building a different model for each segment. Eventually there may be many segments, each with its own set of data elements defining the influences on attrition within that segment. Thus, though the end state is customer attrition, there are different sets of data elements, or paths, to that end state.
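As a rough illustration of this segment-then-model approach, here is a minimal sketch using scikit-learn. The column names (segment, attrited, tenure, complaints, spend) and the segment label are my own assumptions for illustration, not from any real engagement.

```python
# A minimal sketch: fit one attrition model per customer segment, then score a
# new customer with the model of the segment he falls into.
import pandas as pd
from sklearn.linear_model import LogisticRegression


def fit_segment_models(df, features, target="attrited", segment_col="segment"):
    """Fit one logistic regression per segment and return them in a dict."""
    models = {}
    for segment, rows in df.groupby(segment_col):
        model = LogisticRegression(max_iter=1000)
        model.fit(rows[features], rows[target])
        models[segment] = model
    return models


# Hypothetical usage (file and column names assumed):
# df = pd.read_csv("customers.csv")
# models = fit_segment_models(df, features=["tenure", "complaints", "spend"])
# p_attrite = models["mass_affluent"].predict_proba(
#     new_customer[["tenure", "complaints", "spend"]])[:, 1]
```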

Sherlock Holmes' art lies in starting from the end state and knowing all possible paths to that state. For example, examining the state of a corpse, he would deduce the possible sequences of events that could have led to that state. It is like knowing all possible behaviours, across customer segments, that would lead to attrition within each segment. The next step is to walk back along the path and arrive at the source, that is, at the segment the customer belongs to. Apparently, in an earlier period of Holmes' life not covered in the books, he is supposed to have conducted a number of experiments to deduce the various paths to the states of interest, whether the state of a corpse, footprints on the ground, handwriting analysis, and so on.

Once Holmes has deduced the various paths, he proceeds by elimination to arrive at the surviving path. This logic is not hard to understand; we are exposed to it every time we visit the doctor. Medical practice uses the same logic of elimination. The doctor observes the symptoms and asks questions such as "did you have fever", "do you feel nausea", "how is your stool". Based on the answers, he starts eliminating illnesses until he has narrowed the possibilities down to a few. Then he may prescribe some generic drugs to address them. If the illness does not respond, he recommends detailed observation (read: tests) to further target specific illnesses (such as malaria). Eventually there is only one path left, and Holmes has the sequence of events leading to the crime. The same elimination logic has been adopted in text mining and document tagging. The author, Doyle, was a doctor by profession; he applied the medical approach to crime fighting. It sounds so obvious after a century.
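Here is a minimal sketch, in Python, of the elimination logic described above: start with every known path to the end state and strike out the ones that contradict each new observation until one survives. The path names and evidence keys are invented for illustration.

```python
# A minimal sketch of deduction by elimination over known paths to an end state.
def eliminate(candidate_paths, observations):
    """Keep only the paths consistent with every observation seen so far."""
    surviving = list(candidate_paths)
    for key, value in observations.items():
        surviving = [p for p in surviving if p["evidence"].get(key) == value]
    return surviving


# Hypothetical paths to customer attrition and the evidence each implies.
paths = [
    {"name": "price-driven attrition",   "evidence": {"complained": False, "competitor_offer": True}},
    {"name": "service-driven attrition", "evidence": {"complained": True,  "competitor_offer": False}},
    {"name": "life-event attrition",     "evidence": {"complained": False, "competitor_offer": False}},
]

# Observing no complaints but a competitor offer leaves only the price-driven path.
print(eliminate(paths, {"complained": False, "competitor_offer": True}))
```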

If one is analysing customer attrition, and assuming we know all the behaviours that lead to attrition for each segment, we can analyse each customer and narrow him down to a specific segment and a specific path to attrition. Once this is done, we can plan our interventions in the customer's life to prevent the end state, i.e. customer attrition.

But, alas, we seldom know every path to attrition, and so we will keep building predictive models and testing them. Till then, we will always be impressed by Mr. Sherlock Holmes.
 

Friday, April 26, 2013

Statistics cannot play God


One of my earlier posts, titled "Statistics hints at Existence of God" <<click here for the post>>, detailed how the error component of any model hints at a philosophy analogous to concluding the existence of God. While that post explicitly states that God exists, today's post is about why this God should not be omnipresent and should not be implicit in the statistical model.

A key element of the modern-day God and his relation with humans is that God endowed humans with the capability of free will. One definition of God is an entity that is "all knowing": this entity knows the past, the present and the future; in fact, he defines the future. Yet God gave humans the capacity of free will, arguably out of his love for mankind. This also represents a paradox in the way we define and understand God. The free will of humans gave them the option of wrong choices. Thus emerged the possibility that God cannot know the future, since he does not dictate the choices man makes, which in turn define the future. God, hence, took a big risk in creating humans as free, thereby allowing the possibility of wrongful choices.

Can a statistical model embody that spirit of granting free will? Let's look at the not-so-distant sub-prime crisis. Every financial institution that went bust or lost heavily in the crisis was a big user of statistics. Numerous case studies had been published on how these institutions used statistical scores to take decisions and thereby improve business parameters. While the employees, or humankind, followed the dictates of the statistical decisions, business was thriving.

Then some mortals realised that asset prices were ballooning. Even if a creditor defaulted, the bank could recover its money by forcing the creditor to liquidate the asset. Or the bank could attach the asset of a lazy creditor and auction it at a much higher price than the outstanding loan amount. The default scores, or credit scores, were rendered powerless. Though the model scores showed that a customer had questionable ability to earn and service the loan, the institutions overrode this decision and went ahead with granting the loans. History knows the derivatives and the leveraging that were built on these loans, but our debate lies with the initial event, not the derivatives.

The statisticians or the analytics sponsors in these institutions, be they the Business Intelligence heads or the CxOs, decided to play God. They granted free will to the consumers of the statistical models. This was probably done out of love for the employees' capabilities and for the potential profitability. But they did not account for the risk of wrongful decisions. Eventually, when the risk materialised, the Garden of Eden was lost. For some, it cost their very existence.

If only the sponsors had played not a loving God but a tyrannical one, forcing the business to follow the scores of the statistical models, they could have come through the crisis much healthier. And there were some institutions, albeit few in number, that stayed with the statistical models and refused to permit free will to the business consumers, or employees.

This holds a key lesson for companies adopting analytical expertise. A vast store of historical knowledge is accumulated in the final, adopted statistical model, greater than that of any individual employee. Hence, it is critical that business processes are designed such that exceptions or overrides are minimized, if not eliminated. Every time employees or processes have been allowed to override the analytical models, the business has faltered and eventually the statistical approach gets blamed. In such scenarios, it is not uncommon for the enterprise to abandon its analytical approach completely, and we hear comments such as "we tried analytics in the past and it does not work", at a time when there are numerous examples of successful statistical applications in the very scenario being referred to.

So note this well: if you are deploying an analytical approach in your enterprise, do not play the loving God and grant free will to the consumers of analytics.

Thursday, February 07, 2013

Statistics: Is it "Guilty" or "not Guilty"


In college, my statistics professor had an uncanny ability to link statistical concepts to mythology or philosophy. It made his lectures fun to attend and often sent us on a parallel track to read more about the event he mentioned along with the statistical theory of the day. One of the topics that often confuses statisticians-in-the-making is the right formulation of the problem, or in statistical terms, the right formulation of the hypothesis. To introduce the topic, he asked us to recall the court scene in every movie. The premise of every case is that the accused is innocent unless proven guilty, and it is the accuser's responsibility to prove guilt. The analogy in statistics is that a given series of observations is assumed uniform unless proven otherwise. Thus the null hypothesis states that the series is close to the normal curve, and the exercise is to prove that it is not. What is more interesting is that the conclusion in legal proceedings is "given the circumstances, the accused is not guilty"; the law does not state that the accused is innocent. Again the analogy holds in statistics: applying the various tests eventually brings us to the conclusion that the series does not deviate significantly. It does not state that the series is aligned to the normal distribution or a derivation thereof.
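As a small illustration of this "reject / fail to reject" wording, here is a sketch in Python using a standard normality test from scipy. The sample data is simulated; note that even when the test passes, the conclusion is only that we fail to reject the assumption of normality, never that the series is normal.

```python
# A minimal sketch: a normality test never declares a series normal, it only
# rejects, or fails to reject, the hypothesis of normality.
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(42)
series = rng.normal(loc=100, scale=15, size=200)  # simulated observations

stat, p_value = shapiro(series)
if p_value < 0.05:
    print("Reject the hypothesis: the series deviates significantly from normality.")
else:
    # Note the wording: we do not conclude that the series *is* normal.
    print("Fail to reject: no significant evidence of deviation from normality.")
```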

It is important to understand this concept when applying predictive analytics to business scenarios. Let us consider the churn, or retention, model. The premise of the entire engagement is to find customers who are likely to attrite. The data set is prepared accordingly, so that one can split the population into those who attrited in a given period and those who continued. The predictive model is then built and the scoring rule applied to the target population.

The score represents the likelihood that the customer will attrite or not. It is not a measure of how long the customer will remain on the books with the company.

The law says "in light of the known or presented evidence". Similarly, the statistical model is built on the data variables (or information) fed as inputs to the model-building process. The model only concludes whether the known variables indicate attrition on the part of the customer; it does not indicate that a customer will continue the relationship. This understanding is very important when analysing and drawing inferences from statistical models. There are other factors which may not be known or could not be quantified; they constitute the missing information, and some of it may influence some customers to attrite. Hence, we cannot say that a customer who will not attrite as per the predictive model will continue the relationship. In light of the known, or input, variables, the customer will not attrite: that is the verdict.
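Here is a small sketch of how that verdict could be worded when reporting a churn score. The threshold and the phrasing are my own assumptions for illustration; the point is that the second branch makes no claim about the customer continuing the relationship.

```python
# A minimal sketch: report the model's verdict only in terms of what the known
# variables say, never as a promise of continuity.
def interpret_attrition_score(p_attrite, threshold=0.6):
    """Translate a churn probability into a carefully worded verdict."""
    if p_attrite >= threshold:
        return "Likely to attrite, given the known variables."
    # Deliberately "not predicted to attrite", not "will continue the relationship".
    return "Not predicted to attrite on the known variables; no claim about continuity."


print(interpret_attrition_score(0.72))
print(interpret_attrition_score(0.18))
```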

One area of financial impact is default modelling in the credit lending business. The default model predicts whether the customer will default on the loan taken. Businesses tend to wrongfully draw the corollary that a customer who will not default is a good customer. As such, these customers often enjoy high ratings, and the business tends to take additional exposure to them. Such is the belief that even the variable in the database is labelled "good" and "bad" customers. The predictive model only states that, given the variables analysed, a customer will default (that is, be a "bad" customer). It should not be construed that the others will be "good" customers.

This difference is not subtle, and it is critical that businesses deploying predictive analytics understand it. As long as there are uncertainties in the business arena, the decision will be to segregate the target base into "guilty" and "not guilty". The statistics deployed aim to either "reject" or "not reject" the hypothesis. Maybe the default status variable should be labelled "bad" or "not bad" customers.

This has probably been one of the toughest posts for me: I had to explain a hard-core statistical thought in layman's language. It took some time to edit this post, and I believe it is a job well done. If you believe so, kindly let me know. If not, I will be happy to get into a discussion to explain or further simplify the matter.
 