Analytics is, at its core, the act of gaining insights from data to drive process improvements. However, many people fail to embrace the innovation aspect of analytics because they don’t exhibit what I call “intellectual curiosity.”
Intellectual curiosity is the trait that drives people to look for new and innovative ways to accomplish something. People with this trait continually question the status quo and take an active and passionate approach to finding solutions. They may have preconceived notions (we all do), but they do not presume to know the definitive answer to a question without relying on real-world analysis.
For example, I was once in a meeting to review the sales for a new product launch. There was a tremendous amount of excitement in this meeting because the results were rumored to be very impressive. The product manager wasted no time getting straight to the point by showing the incredible results. After a cursory review, I raised my hand to ask a question. I pointed out that for his numbers to be correct, the product would have had to capture an incredible 90 percent market share—the average was 4 percent to 6 percent—and this seemed more than a little suspect to me. Talk about taking the air out of the balloon!
What commenced was a lot of hemming and hawing, and later a lot of finger pointing, about who was to blame for not getting these numbers right. The problem wasn’t simply that the presented numbers were out of whack. The real issue was that this company had a process by which a business associate documented the data requests and the IT resource pulled the numbers as documented. Neither the presenter, because he “liked the numbers,” nor the IT resource who helped pull them displayed any intellectual curiosity to challenge the findings and make sure they made sense. Intellectual curiosity was not part of that company’s culture, and it resulted in a very embarrassing situation.
These people failed to embrace analytics to see what was truly happening with the product. They had the answers they wanted regarding the product launch and were content to go no further to challenge their initial premises. People who lack the desire to challenge their premises using analytics do not foster a culture of continuous improvement—a problem we see all too often in businesses today.
Part of being intellectually curious is getting to a point where you not only want to influence change, but you want to see that change implemented so you can verify actual results—and learn from them. We’ve all heard people tell us why something won’t work or describe why something else might be better. This is not intellectual curiosity. Exhibiting intellectual curiosity requires tangible experimentation versus simply proselytizing analytics.
Don’t Let Perfect Be the Enemy of Good
Coming up with an ideal solution without any background information is highly unlikely. Yet I’m always amazed by how many people think they have the right answer on their first try—especially when it comes to a complex business problem. One of my early mentors was correct when he once told me, “It’s better to do something four times and improve along the way than to take four times as long thinking you can perfect it on your first try.”
I’ve noticed that companies tend to paralyze themselves by coming up with a million reasons not to do something because it might not be perfect right out of the gate. Instead of gaining incremental value that could be developed in a short period of time, they eschew any benefit at all because the results might only be good. Living by the edict that “if you give analysts six days, six weeks, or six months, they’ll take it,” I subscribe to the methodology of execute, learn, then re-execute. This allows for many failures along the way, because as Henry Ford said, “Failure is only the opportunity to more intelligently begin again.”
The Key to Analytical Success—Failing Fast
Innovation through business discovery is the key to analytic success. The objective, however, is not only to spend time discovering improvement opportunities, but also to define their root causes and then come up with a method to address them.
The method will almost certainly not produce a perfect solution, but it can look promising and be at least partially implemented. Once you reach this stage, the idea is to quickly assess how well your solution does or does not address the issue. Even if you don’t see the desired results, your ability to “fail fast” means you can build upon these learnings in a quick and agile way rather than having to wait for a separate production deployment before seeing any results.
That said, once an analytical solution does show enough promise to become a permanent deployment, the process by which it was derived will inevitably make for a better overall solution, because you have built a tangible prototype and defined an explicit business case based on the results. These are key inputs for prioritizing your analytical initiatives.
Better Deployments Through Actionable Governance
Governance has earned a bad reputation in many corporate environments through its association with bureaucracy, rigidity, and increased costs. Instead of focusing on innovative ways to deliver business value quickly, too many adherents of governance focus on process and control.
Much like the individuals in my product-launch meeting, many people see themselves as gatekeepers for whom adherence to protocol is paramount. Instead, they should be embracing innovation through intellectual curiosity, with governance teams looking for creative and collaborative ways to deliver value.
Users need to trust their analytical solutions. This requires a thorough and reliable deployment. Governance can help, but the focus must be on reliability, cost-effectiveness, and speed to market—not dogmatically sticking to preordained processes.
My latest white paper provides a methodology for enacting actionable governance in analytics. By considering two seemingly incompatible organizational partners, business and IT, the paper lays out strategies that address the business need for innovation and the ability to “fail fast,” while also recognizing that IT excels at ensuring that what’s deployed is reliable and well maintained. In both instances, the key is to develop collaboration among the various stakeholders and ensure there are ample rewards for those who exhibit intellectual curiosity.
Tom Casey is Executive Account Director for Teradata. He has nearly 25 years of experience working with, designing solutions around, and helping global customers make analytics actionable. As a data analyst, Tom has successfully implemented the use of statistics to better segment and target customers in support of major corporate programs. He’s a featured speaker at conferences, author of several papers, and has a solid track record delivering enterprise-scale analytical solutions.