Top 10 Data & Analytics Technology Trends For 2019 According To Gartner Report

By Kimberly Cook | Feb 21, 2019

Augmented Analytics and Artificial Intelligence in the spotlight at Gartner Data & Analytics Summit.

Augmented analytics, continuous intelligence, and explainable AI are the top trends in data and analytics technology that have significant disruptive potential over the next three to five years, says Gartner.

Speaking at the Gartner Data & Analytics Summit in Sydney today, Rita Sallam, research vice president at Gartner, said data and analytics leaders must examine the potential business impact of these trends and adjust business models and operations accordingly, or risk losing competitive advantage to those who do.

"The story of data and analytics keeps evolving, from supporting internal decision making to continuous intelligence, information products and appointing chief data officers," she said.
"It's critical to gain a deeper understanding of the technology trends fueling that evolving story and prioritize them based on business value," she added.

Gartner recommends that data and analytics leaders talk with senior business leaders about their critical business priorities and explore how the following top trends can enable them.

1. Augmented Analytics
Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning (ML) and AI techniques to transform how analytics content is developed, consumed and shared. By 2020, augmented analytics will be a dominant driver of new purchases of analytics and business intelligence (BI) platforms, data science and ML platforms, and embedded analytics.

2. Augmented Data Management
Augmented data management converts metadata from being used only for audit, lineage and reporting to powering dynamic systems. Metadata is changing from passive to active and is becoming the primary driver for all AI/ML.

3. Continuous Intelligence
Continuous intelligence is a design pattern in which real-time analytics are integrated within a business operation, processing current and historical data to prescribe actions in response to events. By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.
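The pattern can be illustrated with a minimal sketch: a loop that consumes a stream of events, keeps a rolling window of recent values, and prescribes an action the moment the window average crosses a threshold. The function name, event shape, and threshold below are all illustrative assumptions, not a Gartner-defined design.

```python
from collections import deque

def continuous_intelligence(events, window=3, threshold=100.0):
    """Toy continuous-intelligence loop: analytics run inside the
    operational stream, not in a separate batch job."""
    recent = deque(maxlen=window)   # rolling window of current data
    actions = []
    for event in events:            # events arrive as a real-time stream
        recent.append(event["value"])
        avg = sum(recent) / len(recent)
        if avg > threshold:         # decision prescribed in response to the event
            actions.append(("throttle", event["id"], round(avg, 1)))
    return actions

# Example: sensor readings arriving one at a time
stream = [{"id": i, "value": v} for i, v in enumerate([90, 95, 120, 130, 80])]
print(continuous_intelligence(stream))
```

The key property is that historical context (the window) and the current event are combined at decision time, which is what distinguishes this from after-the-fact reporting.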

4. Explainable AI
AI models are increasingly deployed to augment and replace human decision making. However, in some scenarios, businesses must justify how these models arrive at their decisions. To build trust with users and stakeholders, application leaders must make these models more interpretable and explainable.
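For a linear model, one simple form of explanation is to decompose a prediction into per-feature contributions, so a stakeholder can see which inputs pushed the score up or down. The sketch below is a hypothetical credit-scoring example, not a specific vendor's explainability API; production systems use richer techniques (e.g. SHAP-style attributions) for non-linear models.

```python
def explain_linear(weights, baseline, x):
    """Decompose a linear model's score into per-feature
    contributions, a minimal form of model explanation."""
    contributions = {f: w * x[f] for f, w in weights.items()}
    score = baseline + sum(contributions.values())
    # Rank features by how strongly they moved the score
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return score, ranked

# Illustrative weights and applicant features (assumed values)
weights = {"income": 0.5, "debt": -0.8, "late_payments": -2.0}
score, ranked = explain_linear(weights, baseline=10.0,
                               x={"income": 6.0, "debt": 3.0, "late_payments": 1.0})
print(round(score, 2))
print(ranked[0][0])   # most influential feature for this decision
```

An explanation like "income contributed +3.0, debt -2.4" is something a loan officer or regulator can interrogate, which is the trust-building step the trend describes.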

5. Graph
Graph analytics is a set of analytic techniques that allow for the exploration of relationships between entities of interest such as organizations, people and transactions.
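A basic graph-analytics query is finding the shortest chain of relationships linking two entities. The toy graph below (people, organizations and transactions as nodes, with purely made-up data) answers that question with a breadth-first search; dedicated graph databases and libraries provide far richer algorithms on the same idea.

```python
from collections import deque

# Toy entity graph: relationships as edges (illustrative data only)
edges = {
    "alice": ["txn_1", "acme_corp"],
    "txn_1": ["alice", "bob"],
    "bob": ["txn_1", "globex"],
    "acme_corp": ["alice"],
    "globex": ["bob"],
}

def connection_path(graph, start, goal):
    """Breadth-first search for the shortest relationship chain
    between two entities of interest."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbor in graph.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None   # no connection exists

print(connection_path(edges, "alice", "globex"))
```

Here the path runs through a shared transaction, the kind of indirect relationship (e.g. in fraud detection) that is awkward to express in tabular joins but natural in a graph.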

6. Data Fabric
Data fabric enables frictionless access and sharing of data in a distributed data environment. It provides a single, consistent data management framework that allows seamless data access and processing by design across otherwise siloed storage.

7. NLP and Conversational Analytics
By 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be automatically generated. The need to analyze complex combinations of data and to make analytics accessible to everyone in the organization will drive broader adoption, allowing analytics tools to be as easy as a search interface or a conversation with a virtual assistant.
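The idea can be hinted at with a deliberately tiny sketch: mapping a few keywords in a question to an aggregation over records. Real conversational analytics uses full natural language understanding; this keyword matcher, with its assumed field names and data, is only an illustration of the "query as a question" interface.

```python
def answer_query(question, records):
    """Toy natural-language query: match keywords to a metric and
    an aggregation (illustration only, not a real NLP pipeline)."""
    q = question.lower()
    field = "sales" if "sales" in q else "units"
    values = [r[field] for r in records]
    if "average" in q or "mean" in q:
        return sum(values) / len(values)
    return sum(values)          # default aggregation: total

# Illustrative dataset
data = [{"region": "east", "sales": 120, "units": 10},
        {"region": "west", "sales": 80, "units": 7}]
print(answer_query("What were total sales?", data))
print(answer_query("Average sales per region?", data))
```

The point of the trend is that a business user types the question and never sees the aggregation code at all.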

8. Commercial AI and Machine Learning
Gartner predicts that by 2022, 75 percent of new end-user solutions leveraging AI and ML techniques will be built with commercial solutions rather than open source platforms.

9. Blockchain
The core value proposition of blockchain and distributed ledger technologies is providing decentralized trust across a network of untrusted participants. The potential ramifications for analytics use cases are significant, especially those leveraging participant relationships and interactions.

10. Persistent Memory Servers
New persistent-memory technologies will help reduce the cost and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads. It has the potential to improve application performance, availability, boot times, clustering methods and security practices while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.

Source: HOB