Like everything else at the moment, even the world of data and analytics is being shaped by the COVID-19 pandemic.
That’s the view of Gartner, at least. The market analyst recently published its latest Top 10 Trends in Data and Analytics outlook, framing this year’s selections very much in the context of the pandemic.
Introducing the report, the authors make the point that global efforts to combat COVID-19 are already pioneering cutting-edge data science practices due to the pressing need to find solutions as fast as possible. Necessity so often speeds up innovation, and much of the current effort to tackle the virus is based on trying to understand its biology and transmission by modelling data at an unprecedented scale.
But the influence COVID-19 is having on data handling and analysis is not just restricted to public health and medicine. Distinguished VP analyst and Fellow in Gartner’s Data and Analytics team, Rita Sallam, recently told an event: “In the face of unprecedented market shifts, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to accelerate innovation and forge new paths to a post-COVID-19 world.”
So what do the forecasters at Gartner see as the most important trends in analytics taking shape in 2020? Some are pretty obvious, and can be viewed as accelerations of shifts already well in motion before the pandemic, such as increasing reliance on cloud-based analytics platforms and the progressive convergence of Artificial Intelligence (AI) and analytics, including so-called Augmented Analytics.
Other tips, however, represent much more novel and potentially disruptive developments in the world of data science and analytics. Here are three that might be especially interesting to watch out for.
X Analytics
X Analytics is a phrase recently coined by, you guessed it, Gartner, and there’s no saying whether the name will ever catch on. But the trend it describes is worthy of attention.
One of the challenges data science has long had is how to make efficient and robust comparisons between different types of data, or else use disparate data sets as part of the same analysis. Traditionally, each type of data would have to be treated separately, either to create results of a single data type that could then be subject to further analysis, or to be interpreted and compared by people. Either way, this led to time-consuming processes that posed a barrier to analysis at scale across multiple data classes.
With recent advances in AI, rapid and reliable comparisons across different categories of data, even unstructured content like text, no longer pose a problem. That, according to Gartner, has triggered the emergence of X Analytics – a robust, structured approach capable of tackling macro-level problems through data at a massive scale.
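The core idea – reducing disparate data types to a single comparable representation – can be sketched in a few lines. The example below is purely illustrative (a crude bag-of-words vector and cosine similarity, not any specific X Analytics product): two records from very different sources, a search query and a clinical note, become directly comparable once mapped into the same representation.

```python
import math
from collections import Counter

def text_to_vector(text):
    """Crude bag-of-words vector (token -> count); illustrative only."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two sparse dict vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Two different data sources reduced to one comparable representation:
search_query = text_to_vector("loss of taste and smell symptoms")
health_note  = text_to_vector("patient reports loss of smell and taste")

print(round(cosine_similarity(search_query, health_note), 2))  # → 0.77
```

In practice the representations would come from trained AI models (embeddings) rather than word counts, but the principle – one common vector space across data classes – is the same.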
Again, Gartner points to the way data specialists from across a huge range of disciplines are pooling their resources and expertise to interrogate vastly diverse data sets – everything from personal health records and test results to mobile phone location tracking, internet searches and social media posts – to come up with actionable insights useful in tackling COVID-19. The lessons learnt from this big-picture approach to Big Data can then be applied to other big issues like climate change, disaster response, corporate risk – whatever ‘X’ you want to input.
Decision Intelligence
We’ve all heard about Business Intelligence, or BI – the product you get when the results of data analysis are translated into the language of meaningful business insights. Decision Intelligence might be viewed as going one step further – not just converting the numbers into an informative snapshot of business performance, but also making robust recommendations about the best actions to take.
Clearly, decision making involves an element of predicting which actions will achieve the best outcomes, so Decision Intelligence can be seen as building on the evolution of predictive analytics. A little like X Analytics, Gartner sees Decision Intelligence as evolving as a structured model for how to predict optimum outcomes repeatedly, with AI again at the heart of it.
Ledger Database Management Systems
Based on Blockchain technology, Ledger Database Management Systems (LDMS) have a few unique characteristics that set them apart from conventional data management systems, the most important of which is arguably immutability.
Like anything built on Blockchain, you cannot edit a data field in an LDMS. If you want to change a record, you instead append a new version of it to the ledger, leaving every previous version untouched. Any data record built on an LDMS is therefore simultaneously extremely secure and transparent, as all historical actions are by design part of the record and always visible.
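The append-only, hash-chained behaviour can be sketched in miniature. The class below is illustrative only – not a real LDMS – but it shows the two defining properties: updates append rather than overwrite, and each entry is hash-linked to its predecessor so history cannot be silently altered.

```python
# Minimal sketch of a ledger-style, append-only record store (illustrative).
import hashlib
import json

class LedgerStore:
    def __init__(self):
        self._log = []  # every version of every record, in write order

    def put(self, key, value):
        """Append a new version; nothing is ever overwritten."""
        prev = self._log[-1]["hash"] if self._log else ""
        entry = {"key": key, "value": value, "prev": prev}
        # Hash-chain each entry to its predecessor, Blockchain-style.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self._log.append(entry)

    def current(self, key):
        """Latest version of a record."""
        for entry in reversed(self._log):
            if entry["key"] == key:
                return entry["value"]
        return None

    def history(self, key):
        """All versions ever written for a key, oldest first."""
        return [e["value"] for e in self._log if e["key"] == key]

store = LedgerStore()
store.put("patient:42", {"status": "tested"})
store.put("patient:42", {"status": "recovered"})
print(store.current("patient:42"))   # latest version only
print(store.history("patient:42"))   # full audit trail remains visible
```

A conventional database would have overwritten the first status; here both versions survive, and tampering with an old entry would break the hash chain of everything after it.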
One of the most interesting arguments for why LDMS is about to take off is that it allows for a highly efficient and distributed approach to ‘data versioning’, much like the source code versioning exemplified by Git. Git has been credited as a key factor in the worldwide explosion of collaborative, open source software development. It makes it easy and intuitive for large groups of collaborators working on a single program, in complex and non-linear workflows, to trial and test ideas without endangering previous versions – which supports and even encourages innovation.
With the transparency and protection LDMS offers, an interface that matches Git for simplicity and ease of use could see data science shift to similarly collaborative, innovation-positive ground.