How businesses are building futures with data analytics

There’s nothing quite like a bit of optimism. It can be infectious, at least in the hands of the right people. It was interesting to see some recent stats from McKinsey, showing how global executives currently have a fairly optimistic view of their local economies, despite the ongoing struggle with Covid-19.

More than half of all executives surveyed said economic conditions in their own countries will be better six months from now, and they were also upbeat on their own companies’ profitability. Clearly they must know something, or maybe they are just being bullish, talking it up and hoping for the best.

For any business, building forecasts demands something a little more than a hunch. Whether it’s an understanding of market trends and product performance, or building intelligence on internal performance, businesses need facts, not subjectivity. This is where business intelligence (BI) comes in, and it is also where things can get a little complicated. Not that this has stopped CIOs from investing in new tools. According to the Harvey Nash/KPMG CIO Survey 2020, BI is still a top strategic investment for businesses, with a quarter of CIOs surveyed claiming it sits in their top three most important tech investments.

There are multiple challenges facing any organisation that wants to embark on a BI strategy, as well as those that already have a strategy and want to keep up with the pace of change. The fact is that the market has shifted quite dramatically since 2014, and this can impact any firm that backed one of the major players six years ago.

According to Gartner’s Magic Quadrant for analytics and business intelligence platforms, the top four names today – Microsoft, Tableau, Qlik and ThoughtSpot – are out on their own. In 2014, the “leaders’ quadrant” was a much more crowded space, with IBM, SAS, Oracle and Tibco also pushing for market share.

Today, the likes of IBM and Oracle have become more niche, although Tibco is still regarded as a challenger, while Tableau, through its acquisition by Salesforce in 2019, now has more clout. It’s a less congested space at the top end, but that doesn’t necessarily mean the choice is easier. Talking to CIOs, it’s clear that there is no one-size-fits-all brand and that, despite huge progress in recent years, there are still notable gaps in capabilities, specifically around automation and AI/ML.

Expectations are also shifting. Qlik’s senior director of market intelligence Dan Sommer likens it to 2008-2009, when the company saw “a generational shift from reporting centric BI to analysis centric BI, with new requirements for agile, get-started-fast BI that sat closer to the business”. Today, Covid-19 is creating a similar shift, increasingly away from passive analytics towards more active analytics, where there is a need for a real-time data pipeline. Sommer reckons organisations will now be looking for analytics that can embed tactical actions into workflows, processes and moments.

It makes sense given what most businesses have gone through recently with Covid-19. The need for increased agility and resilience will force the hand somewhat, pushing organisations to think strategically and identify how and where they can make improvements to enable pre-emptive decision making. It’s also about speed and accuracy, and building on a sound footing, in terms of BI tools but also data quality and employee literacy.

For Tableau customer Rob Parker, head of data and analytics at Guy’s and St Thomas’ Charity, speed of data insight was a big challenge, and a key reason behind its shift from using Excel spreadsheets, something which will warm the hearts of any data analyst.

“Until two years ago, my team relied on traditional means, like building scatter points in Excel, to identify the areas in Lambeth and Southwark most in need of attention,” says Parker. “However, the drawbacks to this reliance on Excel quickly became apparent. For example, the lack of interactivity introduced a lag between requests from decision makers about the data and a response from my team.

“It was also challenging to understand what the data meant in practice. Lack of mapping capability and other data visualisation tools meant we couldn’t see spatial patterns.”

The shift to Tableau has clearly solved that issue. Parker claims building a strategic system that delivered timely access to trusted data played a “pivotal role” in allowing the organisation to make smarter, actionable decisions with data. It also changed the way the organisation works. The ability to collaborate more quickly and efficiently has been a key factor, including with individuals and organisations outside the charity, and the goal, adds Parker, is to continually improve alignment and strategic collaboration across the trust.

“We heavily rely on Tableau’s geospatial analytics and authoring functionalities to work with and see more detailed location data and perform analysis with more contextual background layers,” says Parker. “Using the inbuilt maps and the new hyper files, Tableau gives us answers to our questions within 20 minutes, which used to take days or even weeks using more traditional software, such as Excel.”

Automation

One area where this could improve more dramatically is through the use of AI/ML technologies. Interestingly, Parker believes that data science and machine learning capabilities “are still very limited in most BI tools.” At the moment, at least, Parker’s determination to tackle more complex problems that require some form of machine learning approach, such as clustering, classification or regression, means he has to go outside of Tableau.
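Parker doesn’t name the tools he turns to outside Tableau, but the kind of work he describes – clustering areas by shared characteristics, say – is typically a few lines of Python. The sketch below is purely illustrative: the area names and indicator values are invented, and the hand-rolled k-means stands in for whatever library (scikit-learn being a common choice) an analytics team might actually use.

```python
# A minimal sketch of clustering taken outside a BI tool: grouping areas by
# need indicators with a basic k-means (Lloyd's algorithm). All data here is
# hypothetical, standardised to put both indicators on a comparable scale.
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Assign points to the nearest centroid, recompute centroids, repeat."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Distance of every point to every centroid
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid; keep the old one if its cluster emptied
        centroids = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
    return labels

# Hypothetical standardised indicators per area: [deprivation, density]
areas = ["A", "B", "C", "D", "E", "F"]
points = np.array([[1.0, 1.1], [0.9, 1.0],     # higher-need pair
                   [-1.0, -1.1], [-0.9, -1.0], # lower-need pair
                   [0.0, 0.1], [0.1, 0.0]])    # mid-need pair
labels = kmeans(points, k=3)
for area, label in zip(areas, labels):
    print(f"Area {area}: cluster {label}")
```

The point is less the algorithm than the workflow: results like these are usually computed outside the BI tool and then fed back into it for visualisation.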

It’s this idea of no one tool being the comprehensive solution that is driving the industry forward fast. Relative newcomer ThoughtSpot has had the luxury of approaching the market focussed on new technologies, without having to contend with legacy users. This has led to a more free-form approach, as opposed to the guided approach of other tools. This is both an advantage and a disadvantage. It demands data literacy but, as Paul French, director of business intelligence, visualisation and reporting at Nationwide believes, it also challenges knowledge and understanding, by spotting patterns and trends in data that reveal things the firm doesn’t already know.

It’s interesting because Nationwide has a multi-vendor BI tool strategy, matching technology to relevant use cases. This is of course a challenge for most large organisations with multiple products and a vastly diverse customer base.

“If we want to enable business users to interact with data using near natural language, ThoughtSpot would be our ‘go to’,” says French. “If there is a need for a very structured, pixel perfect dashboard, we’d use an alternative in our stack.”

Nationwide uses three strategic BI tools but French claims the focus at the moment is with ThoughtSpot. “A particular use case where we have been applying this AI technology is in improving Flow efficiency across our business change and IT estate,” says French. “Essentially, improving the time it takes from the initial concept to implementing it in a system change, and where are the impediments and issues along the way. We refer to this as delivering Better Value Sooner Safer Happier.”

While there’s nothing quite like a corporate acronym to get the data juices flowing, you can see the sense in it, at least in terms of improving flow and giving data teams access to what they need, when and where they need it. While ThoughtSpot may not be the final destination (that’s almost certainly going to be Tableau) it’s clearly finding a valuable home within Nationwide’s BI tools family, at a time when remote working and financial pressures are dictating working patterns.

“Consolidating data from our enterprise IT application register, ServiceNow and JIRA, we have been able to start to measure Flow, and identify areas in the delivery process where there are wait times and repetitions happening,” adds French. “We have been able to surface this information to delivery and IT teams across the Society, who are embracing this consolidated data and creating experiments in their own teams to address the impediments that the AI solution has identified.”
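French doesn’t detail how Nationwide’s pipeline is built, but the underlying idea – consolidate stage-transition events from multiple trackers, then total up time spent in each stage to find where work waits – can be sketched simply. The records, stage names and item IDs below are invented for illustration.

```python
# A hedged sketch of flow measurement: merge work-item transitions (as if
# exported from systems such as ServiceNow and JIRA), then rank stages by
# accumulated time to surface likely impediments. All records are invented.
from datetime import datetime
from collections import defaultdict

# (source system, item id, stage entered, timestamp)
events = [
    ("jira", "CHG-1", "concept", "2020-09-01"),
    ("jira", "CHG-1", "in_dev", "2020-09-10"),
    ("jira", "CHG-1", "awaiting_review", "2020-09-15"),
    ("jira", "CHG-1", "deployed", "2020-09-28"),
    ("servicenow", "RITM-7", "concept", "2020-09-03"),
    ("servicenow", "RITM-7", "in_dev", "2020-09-05"),
    ("servicenow", "RITM-7", "awaiting_review", "2020-09-07"),
    ("servicenow", "RITM-7", "deployed", "2020-09-25"),
]

# Group transitions per work item, regardless of which system they came from
by_item = defaultdict(list)
for _, item, stage, ts in events:
    by_item[item].append((datetime.fromisoformat(ts), stage))

# Time spent in a stage = gap until that item's next transition
time_in_stage = defaultdict(float)
for transitions in by_item.values():
    transitions.sort()
    for (t0, stage), (t1, _) in zip(transitions, transitions[1:]):
        time_in_stage[stage] += (t1 - t0).days

# Rank stages by total wait: the biggest numbers are the impediments
for stage, days in sorted(time_in_stage.items(), key=lambda kv: -kv[1]):
    print(f"{stage}: {days:.0f} item-days")
```

In this toy data, review is where items sit longest, which is exactly the kind of finding teams would then design experiments around.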

Data literacy

French says that the organisation is ultimately looking to enable continual improvement, but to maximise these improvements, as many people as possible need access to the relevant data. Of course, this raises the question of data literacy. It’s all very well giving people access to data, but there is a danger that, without knowledge, this data can be misconstrued. No one wants untrained eyes seeing patterns and correlations that don’t really exist.

It’s a challenge facing most businesses. Improving data literacy is fundamental to growing BI and making regular storytelling more relevant to each departmental need. Augmented analytics is very much a part of this. According to Gartner’s Magic Quadrant report, by 2025, data stories will be the most widespread way of consuming analytics, and 75% of stories will be automatically generated using augmented analytics techniques.

According to French at Nationwide, the business has on-going training to improve data literacy internally, while he also points to ThoughtSpot’s self-service system, enabling users to “explore and drill down into live data helping them answer questions as they arise.” For Parker at Guy’s and St Thomas’ Charity, it’s a similar story, although he says it’s now a priority this year to improve data literacy and knowledge of Tableau.

It’s understandable, and we will see a shift as self-service capabilities start to enable business users, aided by augmented intelligence, to ask questions using natural language and get conversational responses in return. But this still seems a long way off, although the foundations are being put in place in a number of organisations.

At SAP and Tibco customer Vestas, for example, individual departments have designated, isolated “data spaces” to work from, allowing business users to explore data without disrupting the source system. Each department can, therefore, have data connections relevant to their business, whether they are SQL sources, on-premise, or in the cloud. Employees in that department only see the data they need.
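The article doesn’t describe how Vestas implements these data spaces, but the general pattern – scoped views over a shared source, so business users explore a slice without touching the underlying system – is easy to illustrate. The table, department names and figures below are hypothetical, using SQLite purely as a stand-in for whatever SQL sources are actually in play.

```python
# An illustrative "data space": a per-department view over a shared table,
# so each team queries only its own slice and never the source directly.
# Schema, departments and values are invented for this sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE turbine_readings (
        turbine_id TEXT, department TEXT, output_mwh REAL
    );
    INSERT INTO turbine_readings VALUES
        ('T1', 'service', 2.4),
        ('T2', 'service', 2.1),
        ('T3', 'sales',   1.9);

    -- The department's "data space": a view scoped to its own rows.
    -- Business users query the view, not the table, so their exploration
    -- cannot disrupt the source system or expose other teams' data.
    CREATE VIEW service_space AS
        SELECT turbine_id, output_mwh
        FROM turbine_readings
        WHERE department = 'service';
""")

rows = conn.execute("SELECT turbine_id, output_mwh FROM service_space").fetchall()
print(rows)  # only the service department's rows are visible
```

In a production setup the same scoping would more likely be done with database roles and grants, or the BI tool’s own row-level security, but the principle is the same.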

And herein lies another problem: data quality and data trust. The more businesses build their future decision-making around data, the more they will need to trust that data. This is an ongoing challenge as data sources vary and continue to grow, and as data grows, so must the understanding of that data.

As firms continue to work from home and try to second guess rapidly changing markets, so the need for more nuanced intelligence increases. BI tools are just a part of this solution and no one tool seems to fit every job. For the moment at least, a BI strategy requires a mix, but the business case is no longer in question.
