Navigating tech in private equity

Today’s business intelligence technology offers real value for private equity firms but can be difficult to source and implement. pfm spoke with Alex Rodriguez of Crowe Horwath about how GPs can best tap these powerful tools to get what they need most.

Broadly speaking, what is business intelligence technology?

AR: BI is an umbrella term for software and analysis designed to help organizations make decisions based on data rather than anecdotes. Historically, BI software was deployed only on-site, and it typically was designed to generate preconfigured reports intended to be broadly useful. Anyone familiar with the tedious process of aggregating multiple general reports to create an analysis will understand the shortcomings of those systems.

What’s the most common misconception private equity firms have about BI technology?

AR: Investors may not know that BI software can be used as part of the due diligence process to provide deep insights into the earnings power and potential of a business. The analytics developed during due diligence can then be expanded to monitor the significant business drivers over time. During the past several years, a significant paradigm shift in business intelligence has occurred due to the emergence of new technologies. Modern BI software allows us to bypass prebuilt reports and build any report we need from the raw data.

Are most of the software’s capabilities available out of the box?

AR: No. The features that make these software packages so powerful are their sophisticated coding languages and in-memory analytic engines that allow real-time calculations with voluminous, multifaceted data sets. In contrast, many of the out-of-the-box capabilities are designed to appeal to nontechnical end users; that’s why software vendor websites showcase novel visualizations with bright colors and animated graphics.

Real, meaningful analysis should be curated and presented simply to avoid obfuscating the message. While certain exotic visualizations can be useful, many are not. Today’s best practices for the visualization and presentation of data were developed decades ago and espouse elegant design that captures the essence of a data set. These principles are commonly broken by many of the elaborate visualizations now available. Software vendors know this, but specifications for in-memory analytic engines don’t always make for the most scintillating sales pitches.

When selecting a provider of BI tech, what questions should GPs be asking to ensure a provider is the right fit for their needs? That can be difficult when firms may not even know the capabilities of today’s offerings.

AR: Providers should – at a minimum – have strong technology skills. They should also have strong accounting, finance, and operational experience. High-caliber providers should have a deep appreciation of the importance of visualizations for understanding the characteristics of data – not just for aesthetics. Finally, providers need to be adept at working with real-world data sets, which are inherently messy and never include every ideal metric.

What are the limits of today’s BI offerings?

AR: If limitations are defined in terms of speeds and feeds, certainly constraints exist. These constraints are not easily defined by rules of thumb, however, because they are highly dependent on the structure of the data, the types of calculations, and the efficiency of the code. Fortunately, a creative, flexible approach can help circumvent many limitations.