Ask Our Business Intelligence Expert: Lessons Learned

By Scott Yates, HGS Senior Operations Director

Are you getting a return on your business intelligence (BI) investment? According to Gartner research, organizations have spent more than $60 billion on business intelligence software and services to improve business performance; many, however, have failed to achieve significant benefits, or have at least found the benefit hard to quantify. These are still early days for next-generation innovations such as automation and analytics, and many organizations are leaning on experts for guidance. To that end, we share here some lessons learned from our in-house expert, HGS Senior Operations Director Scott Yates.

Q: What is a business intelligence mistake you have personally made and learned from?

Scott: When I first started in analytics, I clearly remember making the mistake of putting implementation before sound reporting; in other words, not applying my knowledge of what winning looks like. For example, when I first led the implementation of new contact channels, there was, in one case, a rush to get the software up and running with a team in place to support the new program. We didn't take the extra time to integrate the new and existing solutions; in fact, plugging the new solution into our CRM and BI tools was treated as secondary. What we learned is that this planning and integration needs to happen before a solution goes live. After that experience, our BI process matured, and we understood not to rush in. You can't build a great airplane while you fly it. It's better to get the airplane sound and tested before it's in flight.

Q: How does business intelligence (BI) depreciation hurt the business?

Scott: Take into account how fast insights depreciate. What happened yesterday isn't as valuable a week from now. Data loses value over time, and bad data leads to bad decisions, such as how much to lean on certain channels for customer care. The key takeaway is more immediate and thorough analysis. Today, I make sure our BI isn't affected as much by the depreciation factor. To do this, make sure your tools fully integrate into your technology portfolio. Relying on tools like Excel and VLOOKUP is a key indicator that your business is not integrated; instead, your BI is held together with manual effort. In that case, the BI machine is slower, less accurate, and less stable. At HGS, we conduct user acceptance testing to make sure the engagement tools work, and we make sure the production systems integrate where they can. Why is reporting always the afterthought when it is a requirement for making the correct adjustments to optimize the machine?
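To illustrate the manual-effort pattern described above, the sketch below shows a VLOOKUP-style lookup expressed as a small, repeatable script rather than a hand-maintained spreadsheet formula. This is a minimal illustration only; the data, field names, and channels are hypothetical and not drawn from HGS systems.

```python
# Illustrative sketch: a VLOOKUP-style join done in code, so the step is
# repeatable and auditable instead of held together by manual effort.
# All rows, keys, and channel names below are hypothetical examples.

def join_on_key(fact_rows, lookup_rows, key):
    """Attach lookup attributes to each fact row by a shared key,
    the same job a spreadsheet VLOOKUP typically performs by hand."""
    lookup = {row[key]: row for row in lookup_rows}
    joined = []
    for row in fact_rows:
        match = lookup.get(row[key], {})
        # Merge fact fields with the matched lookup fields (minus the key).
        merged = {**row, **{k: v for k, v in match.items() if k != key}}
        joined.append(merged)
    return joined

# Hypothetical example: chat volumes joined to channel metadata.
facts = [{"channel_id": 1, "chats": 420}, {"channel_id": 2, "chats": 130}]
dims = [{"channel_id": 1, "name": "web chat"}, {"channel_id": 2, "name": "email"}]
result = join_on_key(facts, dims, "channel_id")
```

Once a join like this lives in code, it can run on a schedule inside the BI pipeline, which is the kind of integration Scott contrasts with spreadsheet-driven reporting.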

Q: What mistake do you see BI novices make?

Scott: I've observed that those new to analytics often bring personal bias and experiential assumptions. BI professionals sometimes believe they are looking for something they suspect is already there. Junior BI professionals bring this bias but then learn to set past experiences aside for a better, truer read of the data. Case in point: when launching a web chat initiative and trying to determine a workforce requirement, a novice might say he or she has implemented three different programs and each time assigned one person to every 400 chats. However, new Google Analytics data may reveal something different: the number of visits per page, the location of links, and a number of other factors can change that assumption.

Another mistake novices make is stopping the investigation at the first anomaly rather than digging into secondary and tertiary elements. The lesson here is that when you're trying to find interdependencies between the numbers, make sure to define additional data elements to explain the why. The quality of your BI will be much better if you don't stop at the first outlier. Assign the same analysis to two or three team members at once, without telling them others are working on it, then bring them together to compare the results of the individual analyses. Debate the different findings to round out the final insights. This way, you get sounder BI, experiential learning, and a better training and development benefit for your BI professionals. I often participate in our HGS group reporting comparisons, which build stronger team collaboration and accountability. Quality of analysis relates directly to the amount of effort, and you'll find limited effort when those delegating the analysis don't know how to do it themselves.
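The staffing heuristic in the web chat example, one person per 400 chats, revised by what the analytics actually show, can be sketched as a simple calculation. The adjustment factor and the chat volumes below are hypothetical, included only to show how new data shifts the old assumption.

```python
import math

# Illustrative staffing sketch based on the example above: start from the
# novice's prior of one agent per 400 chats, then apply an adjustment
# derived from analytics (e.g. visits per page, link placement).
# All numbers here are hypothetical.

def estimate_agents(expected_chats, chats_per_agent=400, analytics_adjustment=1.0):
    """Return the whole number of agents needed, rounding up so the
    forecast never understaffs a fractional requirement."""
    adjusted_chats = expected_chats * analytics_adjustment
    return math.ceil(adjusted_chats / chats_per_agent)

# Prior assumption alone: 10,000 chats at 400 per agent.
baseline = estimate_agents(10_000)
# Hypothetical analytics finding: link placement drives 30% more chats
# than the prior programs saw, so the requirement rises.
revised = estimate_agents(10_000, analytics_adjustment=1.3)
```

The point of the sketch is the gap between `baseline` and `revised`: the prior experience alone would understaff the program once the analytics are taken into account.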
