History: is it important?

Edmund Burke (British statesman and philosopher, 1729-1797) stated, “Those who don’t know history are destined to repeat it.” In his recent book Decision Traps: The Ten Barriers to Decision-Making and How to Overcome Them, Edward Russo identifies “Not Keeping Track” as one of the ten traps that contribute to poor decision making in today’s society and business.

Call and contact centers are subject to this problem in spite of the fact that they generate lots of data. We assume that experience makes its lessons available automatically. We believe that because we know ‘this’ or ‘that’ today, we will remember it tomorrow. So we fail to keep systematic records to track the results of decisions, and we often fail to analyze those records and results in ways that reveal their lessons.

All too often in the day-to-day running of a center we adjust staffing and resources, focus on the issues or concerns of the day, and make decisions on the fly. “Fighting fires” is a concept every call center manager is very familiar with. Few if any notes are taken about why those decisions were made or by whom. Often even management meeting minutes fail to record key decisions, how or why they were made and, most importantly, what resulted.

It is this last item, what resulted, that is a stumbling block for many. Success has many parents; failure is an orphan. Too many people want some credit when something succeeds, and too few take responsibility when decisions don’t work as expected. (Decisions often don’t work out as expected, but more on that later.)

A classic case of this is dealing with forecasts and expectations of the number of calls or volume of work for a period. Forecast and actual calls differ, sometimes by a little, sometimes by a lot. The small differences are of no interest to most of us: they are noise, volumes well within the expected variation of the forecast. The large differences cause consternation for all involved: those who developed the forecast, those who used the forecast, and anyone with an active interest in the outcomes.
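As a back-of-the-envelope illustration, the distinction between “noise” and a large difference worth investigating can be expressed as a simple tolerance check. The tolerance value and the sample volumes below are hypothetical assumptions, not figures from this article; each center would set its own threshold based on its forecast’s expected variation.

```python
# Sketch: flagging forecast misses that fall outside expected variation.
# The 10% tolerance and the daily volumes are illustrative assumptions.

def pct_error(forecast, actual):
    """Percentage error of the actual call volume against the forecast."""
    return (actual - forecast) / forecast * 100

def classify(forecast, actual, tolerance_pct=10.0):
    """Label a period 'noise' (within tolerance) or 'investigate'."""
    return "noise" if abs(pct_error(forecast, actual)) <= tolerance_pct else "investigate"

# Hypothetical week of forecast vs. actual daily call volumes
periods = [
    ("Mon", 1200, 1185),
    ("Tue", 1100, 1420),  # large miss -- worth a note in the decision log
    ("Wed", 1150, 1098),
]

for day, forecast, actual in periods:
    print(day, round(pct_error(forecast, actual), 1), classify(forecast, actual))
```

Logging each “investigate” result, together with what caused it, is exactly the kind of systematic record-keeping the article argues for.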

A common reaction is to assign blame for an inaccurate forecast to the person or people who developed it. Or to say that the people using it don’t understand its use. Or to say it was only an anomaly and not to worry. All of these approaches fail, both now and, most importantly, in the future when this learning is most needed.

A better approach is to ensure that everyone understands how the forecast was developed. Was it based upon organic growth, acquisitions, etc.? Are or were there anomalies affecting the data, e.g. a major blackout, a system crash, a significant price increase? Then you must determine what, not who, caused the difference. Was there an event, or the absence of an event, that generated more or fewer calls than anticipated? Were there events unaccounted for in the forecast? If so, will they recur, and how will they be accounted for in the future? In other words, was the methodology rigorous, and did it account for everything that was known or understood at the time? Did it surface all the assumptions?

Second, presuming that the event was unexpected, the forecast was rigorous and everyone agreed to it in the first place: what can be learned or noted for the next round of forecasting, and for management to consider going forward from here? Constantly challenge the assumptions that form the basis of the forecast, or risk quickly losing touch with reality.

Making notes on events that impact the forecast and raising them in management meetings ensures that this corporate learning gets embedded with everyone involved. The fact that an error of judgement or analysis happened is not important. What is important is that the group, the people and the corporation learn from the experience and record it in a way that enables everyone to get better.

Let us know what you think of this article or any suggestions you have for future issues by email at [email protected].
