Jun 25 2019
Tech trend #5 in our series on corporate data management and digital transformation is augmented analytics. Gartner describes augmented analytics as “a next-generation data and analytics paradigm that uses machine learning to automate data preparation, insight discovery and insight sharing for a broad range of business users, operational workers and citizen data scientists.” The real point is that data, in and of itself, is meaningless. Using tools such as machine learning and artificial intelligence algorithms, the relevant data can be separated from the rest, and the data itself can be interpreted.
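To make the idea of automated insight discovery concrete, here is a minimal illustrative sketch (not taken from any vendor's product): a function that flags unusual values in a dataset automatically, with no human writing per-column rules. The data and threshold are hypothetical.

```python
# Illustrative sketch of automated "insight discovery": flag values
# that deviate strongly from the rest of the data using a z-score.
from statistics import mean, stdev

def find_outliers(values, threshold=2.0):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mu = mean(values)
    sigma = stdev(values)
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Hypothetical daily sales figures; the spike stands out automatically.
daily_sales = [102, 98, 105, 97, 310, 101, 99, 103]
print(find_outliers(daily_sales))  # → [(4, 310)]
```

Real augmented-analytics platforms apply far richer models than a z-score, but the principle is the same: the software, not the analyst, decides which data points deserve attention.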
A good analogy is driving a car. Many people can drive using little more than the wheel, the pedals, and the occasional turn signal, yet most drivers know nothing about how the car actually runs: the inner workings of the engine or its many parts. We simply know enough relevant data to drive the car. As we move further into data management and digital transformation, tools that simplify technology to the point where we can “drive” it without understanding much of it are still decades away. We are still designing that automated technology.
By developing augmented analytics strategies, businesses will be able to focus on relevant, accurate data to make better business decisions. The global augmented analytics market accounted for $4.12 billion in 2017 and is expected to reach $52.41 billion by 2026. Key players in the augmented analytics market include: TIBCO Software, ThoughtSpot, Tableau Software, SAP, Salesforce, Information Builders, MicroStrategy, Qlik, SAS, Microsoft, IBM, Oracle, Domo, Sisense, and Yellowfin. Gartner explains that, “By 2020, augmented analytics will be a dominant driver of new purchases of analytics and business intelligence, data science and machine learning platforms and embedded analytics.” Further, half of analytics queries “either will be generated via search, natural language processing or voice, or will be automatically generated.”
At NGD Systems, we leverage in-situ processing to bring computation directly to storage, eliminating the need to move data into main memory before processing. This removes a major bottleneck for content delivery networks (CDNs), saving time, reducing power consumption, and shrinking infrastructure footprint. Relevant data is sorted quickly and efficiently, enabling real-time analytics on petabyte-scale datasets. If you would like more information, contact me.
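The benefit of in-situ processing can be sketched in a few lines. This is a conceptual toy model, not the NGD Systems API: it compares shipping an entire dataset to the host for filtering against running the filter at the storage device and shipping only the matches. The record layout and predicate are hypothetical.

```python
# Toy model: count how many records cross the storage-to-host bus
# under a conventional path versus an in-situ (compute-on-drive) path.

# Hypothetical sensor records stored on the drive.
records = [{"id": i, "temp": 20 + (i % 50)} for i in range(10_000)]

def host_side_filter(data):
    # Conventional path: every record is moved to main memory,
    # then the host CPU applies the predicate.
    transferred = len(data)
    matches = [r for r in data if r["temp"] > 65]
    return matches, transferred

def in_situ_filter(data):
    # In-situ path: the predicate runs on the storage device,
    # so only matching records ever cross the bus.
    matches = [r for r in data if r["temp"] > 65]
    transferred = len(matches)
    return matches, transferred

m1, t1 = host_side_filter(records)
m2, t2 = in_situ_filter(records)
assert m1 == m2          # identical results either way
print(t1, t2)            # → 10000 800: in-situ moves ~12x fewer records
```

The answer is the same in both cases; what changes is how much data moves, which is where the time, power, and footprint savings come from at petabyte scale.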