"Next year, we predict that data gravity, in which all of the data that needs to be correlated for analysis moves to the location of the largest data set, will push businesses to deploy their analytics wherever their data lives. Cloud data warehouses such as Amazon Redshift will continue to be a popular data destination and cloud analytics will become more prevalent as a result."
Streaming analytics is the practice of analysing data as it streams into the organisation, rather than processing it later in batches as traditional analytics does. This is particularly useful for monitoring the health of key infrastructure or machinery, which is why streaming analytics should continue to gain traction in 2017 as more organisations move towards Internet-of-Things (IoT) deployments that demand it.
Ovum's Baer notes that streaming analytics is decades-old, but open source technology has lowered barriers to entry. Now, with the proliferation of connected devices and IoT in the enterprise, especially in manufacturing and healthcare, streaming analytics could have its day in 2017.
He said: "The reason for all this activity is the demand created by emerging IoT use cases; this is where real-time sense, analyse, and respond has spurred technology vendors to pick up where niche CEP (Complex Event Processing) left off."
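As a toy illustration only (not any vendor's implementation), the "sense, analyse, respond" loop behind streaming analytics can be sketched as a sliding-window monitor that acts on each reading as it arrives; the class name, window size, and threshold below are invented for the example:

```python
from collections import deque

class StreamMonitor:
    """Minimal sliding-window monitor: flags readings whose rolling
    average exceeds a threshold, mimicking the sense/analyse/respond
    loop of streaming analytics on, say, machine temperature data."""

    def __init__(self, window_size=5, threshold=80.0):
        self.window = deque(maxlen=window_size)  # keeps only recent readings
        self.threshold = threshold

    def ingest(self, reading):
        # "Sense": accept one reading as it arrives, not in a batch.
        self.window.append(reading)
        # "Analyse": rolling average over the most recent window.
        avg = sum(self.window) / len(self.window)
        # "Respond": raise an alert immediately if the average is high.
        return {"avg": avg, "alert": avg > self.threshold}

# Readings processed one at a time, as a stream would deliver them.
monitor = StreamMonitor(window_size=3, threshold=75.0)
results = [monitor.ingest(t) for t in [70, 72, 74, 80, 85, 90]]
```

The point of the sketch is that each reading is evaluated the moment it arrives; a batch system would only spot the rising temperatures after the fact.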
Conclusion: 2017 data trends
Big data remains a thorny issue for the enterprise, but the cloud is making it cheaper and simpler for businesses to do more with their data without having to hire an army of data scientists.
With major cloud providers like AWS and Microsoft releasing machine learning APIs, and Google releasing its open source TensorFlow library, 2017 should see what were previously considered advanced data processing techniques go mainstream.