Organizations are accelerating their ability to make data-driven decisions by offering analytics capabilities directly to business users. Here’s how to do it right.
Data-informed decision-making is a key attribute of the modern digital business. But experienced data analysts and data scientists can be expensive and difficult to find and retain.
One potential solution to this challenge is to deploy self-service analytics, a type of business intelligence (BI) that enables business users to perform queries and generate reports on their own with little or no help from IT or data specialists.
Self-service analytics typically involves tools that are easy to use and have basic data analytics capabilities. Business professionals and leaders can leverage these to manipulate data so they can identify market trends and opportunities, for example. They’re not required to have any experience with analytics or background in statistics or other related disciplines.
Given the ongoing gap between the demand for experienced data analysts and the supply of these professionals — and the desire to quickly get valuable business insights into the hands of the users who need them most — it’s easy to see why enterprises would find self-service analytics appealing.
But there are right and wrong ways to deploy and use self-service analytics. Here are some tips for IT leaders looking to make good on the promise of self-service analytics strategies.
Data analytics and analytics tools have gained such a high profile within many businesses that it’s easy to see how they can be overused or inappropriately applied. This is even more of an issue with self-service analytics, because it enables a much larger range and base of people to analyze data.
That’s why it’s important to establish a plan for where and when it makes sense to use analytics, and to have reasonable controls to keep your analytics strategy from becoming a free-for-all.
“Determine your mission, vision, and questions you need to answer around analytics before even starting,” says Brittany Meiklejohn, a business and sales process analyst at Swagelok, a developer of fluid system products and services for the oil, gas, chemical, and clean energy industries.
“It is extremely easy to get caught up on all the charts and graphs you can create, but that gets overwhelming very quickly,” Meiklejohn says. “Having that roadmap from the start helps to trim down and focus on the actual metrics to create. Have a data governance plan as well to validate and keep the metrics clean. As soon as one metric is not accurate it is hard to get the buy-in again, so routinely confirming accuracy on all analytics is extremely important.”
The analytics plan should emphasize the use of proactive data as much as possible, Meiklejohn says. “Focus [on] data that is actionable and can be implemented back into the business,” she says. “Incorporate learnings to transform processes and decision-making at an organizational scale. It is great to understand the historical side of the business, but it is hard to change if you are only looking at the past.”
At Swagelok, departments are using self-service analytics tools from Domo to determine whether customer orders will be late, schedule production runs, analyze sales performance, and make supply chain decisions.
“We have seen an increase in efficiency; everyone is able to get the data they need to drive decisions much faster than before,” Meiklejohn says. “We are making more responsible data-driven decisions, since each department is using the data for decision-making.”
While it’s important to have a long-range analytics strategy in place, that doesn’t mean organizations should move at a plodding pace with self-service analytics.
“In my previous company, our advanced material business had a saying, ‘Go fast, take risks, and learn,’” says Keith Carey, CIO at Hemlock Semiconductor, a maker of products for the electronic and solar power industries. “That would be my advice for those just getting started [with self-service analytics]. Don’t get me wrong, governance is very important and can come along a little later so as not to stifle creativity.”
It’s a good idea to find a small work group “and assign a moonshot mission to demonstrate the art of possible,” Carey says. He suggests teams focus “on the data pipelines that drive consistent business logic and metrics across the enterprise. Understand the importance of timeliness and quality of the data on which important decisions are being made. That’s a great place to start.”
Hemlock launched a self-service analytics initiative in 2018 using Tibco’s Spotfire platform, which is currently being used by all functions of the business. “Prior to that, IT would develop custom .NET applications that wrangled data and provided initial charting capability,” Carey says. “The most popular feature of these apps was an ‘export to Excel’ button, where [the Microsoft spreadsheet] became the analytics platform of choice.”
A handful of the company’s brightest engineers also created macros that would mash up new data sets, “which took overnight to run on someone’s PC,” Carey says. “And hopefully, if it didn’t crash, the data set was shared out amongst the engineering professionals.”
With self-service analytics capabilities, Hemlock has seen benefits such as faster decision-making and quicker results. Self-service enables all functions, including operations, finance, procurement, supply chain, and continuous improvement teams, to perform data discovery and create powerful visualizations.
“We shortened the learning curve, delivered results faster, and accelerated our understanding of our manufacturing processes, which led to improving our products and reducing cost,” Carey says. “Within a very short time, we saved millions of dollars by improving existing reporting methods and discovering new insights.”
Natural language processing (NLP) makes analytics more accessible to greater numbers of people by eliminating the need to understand SQL, database structures, and the concept of joining tables together, says Dave Menninger, senior vice president and research director at Ventana Research.
There are two main aspects of NLP as it relates to analytics, Menninger says: natural language search (also known as natural language query) and natural language presentation (also known as natural language generation).
“Natural language search allows people to ask questions and get responses without [any] special syntax,” Menninger says. “Just like typing a search into a Google search bar, you can type, or in some cases speak, a query using everyday language.”
For example, a user could ask to see the products that had the biggest increase or decrease in sales for that month. The results would be displayed and then the user could refine the search, for instance, to determine the inventory on hand for certain products.
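To make the idea concrete, here is a toy sketch of how a natural language query might be routed to a data operation. The dataset, question phrasing, and `answer` function are invented for illustration; production BI tools use far more sophisticated language understanding than simple keyword matching.

```python
# Toy natural-language-query sketch: map a plain-English question
# onto an aggregation over a small sales dataset. All names and
# data here are hypothetical.

monthly_sales = [
    {"product": "Widget A", "change_pct": 12.5},
    {"product": "Widget B", "change_pct": -8.3},
    {"product": "Widget C", "change_pct": 21.0},
]

def answer(question: str) -> dict:
    """Route a simple English question to the matching aggregation."""
    q = question.lower()
    if "biggest increase" in q:
        return max(monthly_sales, key=lambda r: r["change_pct"])
    if "biggest decrease" in q:
        return min(monthly_sales, key=lambda r: r["change_pct"])
    raise ValueError("question not understood")

print(answer("Which product had the biggest increase in sales this month?"))
# {'product': 'Widget C', 'change_pct': 21.0}
```

The user could then refine the question — say, asking about inventory on hand — and the tool would route the follow-up the same way, without the user ever writing SQL.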
Natural language presentation deals with the results of analyses rather than the query portion, Menninger says. “Once a query has been formulated, using NLP or otherwise, the results are displayed as narratives explaining what was found,” he says.
In the product example, instead of displaying a chart of products showing the sales increases or decreases, natural language presentation would generate a few sentences or a paragraph describing specific details about the products.
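A minimal sketch of that natural language presentation step might look like the following. The template and figures are invented for illustration; real natural language generation engines select and phrase the key findings automatically rather than from a fixed template.

```python
# Minimal natural-language-presentation sketch: render an analysis
# result as a sentence instead of a chart. Data and wording are
# hypothetical.

def narrate(product: str, change_pct: float, month: str) -> str:
    direction = "rose" if change_pct >= 0 else "fell"
    return (f"Sales of {product} {direction} {abs(change_pct):.1f}% in "
            f"{month}, the largest change of any product.")

print(narrate("Widget C", 21.0, "March"))
# Sales of Widget C rose 21.0% in March, the largest change of any product.
```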
“People have different learning styles,” Menninger says. “Some like tables of numbers. Some prefer charts. Others don’t know how to interpret tables or charts and prefer narratives. Natural language presentation makes it easier to know what to look for in an analysis. It also removes the inconsistency in the way data is interpreted by spelling out exactly what should be taken away from the analysis.”
Embedded analytics involves the integration of analytical capabilities and data visualizations into business applications. Embedding real-time reports and dashboards into these applications enables business users to analyze the underlying data without leaving their day-to-day tools.
“Embedded analytics brings the analytics to the applications that individuals are using in [their] day-to-day activities,” Menninger says. This might include line-of-business applications such as enterprise resource planning (ERP), customer relationship management (CRM), or human resources information systems (HRIS), as well as productivity tools such as collaboration, email, spreadsheets, presentations, and documents.
“In the context of business applications, pre-built analyses make it much easier for line-of-business personnel to access and utilize analytics,” Menninger says. “It also provides good governance, since the data is managed by the underlying application where access rights are already maintained.”
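The governance point can be sketched in a few lines: an embedded analytics view reuses the host application’s existing access rights, so users only ever see figures they are already entitled to. Everything below — the access list, the data, and the function name — is hypothetical.

```python
# Hedged sketch of governance in embedded analytics: the report
# filters results through the host application's existing access
# control list (ACL). All names and figures are hypothetical.

REGION_SALES = {"EMEA": 1_200_000, "APAC": 950_000, "AMER": 1_750_000}
USER_REGIONS = {"alice": {"EMEA", "APAC"}, "bob": {"AMER"}}  # host app's ACL

def embedded_sales_report(user: str) -> dict:
    """Return only the regions the host application already grants."""
    allowed = USER_REGIONS.get(user, set())
    return {region: REGION_SALES[region] for region in allowed}

print(embedded_sales_report("alice"))  # only EMEA and APAC figures
```

The design choice is that the analytics layer never maintains its own permissions; it inherits them from the application it is embedded in, which is what keeps the governance consistent.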
The difference between success and failure with self-service analytics can come down to the technology tools companies choose to deploy. Business executives need to work closely with IT leadership to evaluate tools and determine which ones best meet the needs of the organization and fit with its infrastructure.
Among the requirements financial services firm Western Union had when selecting a self-service analytics platform were that it integrate easily with multiple disparate data sources, be flexible and easy to use, offer powerful analytical capabilities, and have minimal infrastructure requirements.
The company deployed a platform from Tableau to enable business users to make decisions based on their own queries and analyses in a governed environment, says Harveer Singh, chief data architect and head of data engineering and architecture at Western Union.
Business departments can create their own queries and reports and collaborate without the need for support from IT, Singh says. “Users have freedom to slice and dice the data without technical know-how,” he says. “Data can be derived from multiple sources in various formats.”
When organizations select the right analytics tools, self-service analytics “empowers business users to retrieve and analyze the data without the need for IT expertise/product specialists for report development and analysis,” Singh says. It’s an asset “that responds to dynamic business requirements.”