[Image: PostgreSQL database query monitoring graph in the SelectStar database monitoring platform]

5 Things to Expect in Database Management in 2017

2016 has been an exciting year for the database management industry. As we mentioned in a previous blog post, the year brought a number of changes — from rising demand for the hyperscale cloud to a greater focus on machine learning — and we anticipate that many of these trends will continue to shape the industry in 2017 and beyond.

With that in mind, today I’ll highlight five of the biggest trends to expect in database management in 2017.


The DBA role has been shifting for a number of years, and this trend will only continue in 2017. The rise of new database technologies and platforms has forced DBAs to develop skills that go beyond their traditional focus. The ability to synthesize big data has already become a key element of the role, adding an element of data science that was never part of it before.

In addition, technologies like Hadoop and Cassandra now require DBAs to implement systems with no upfront schema — requiring them to learn how to store data in an entirely new way.
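The "no upfront schema" idea is often called schema-on-read: records are written as-is, and structure is imposed only when the data is queried. Here is a minimal, vendor-neutral sketch of that pattern in Python; the record fields are invented for illustration and are not tied to any particular platform.

```python
import json

# Schema-on-write (traditional RDBMS): every row must match a fixed schema.
# Schema-on-read (Hadoop/Cassandra-style stores): write records as-is,
# then impose structure only when reading.

raw_store = []  # stand-in for a schemaless store

# Heterogeneous records are accepted without any upfront schema definition.
raw_store.append(json.dumps({"user": "ana", "clicks": 42}))
raw_store.append(json.dumps({"user": "bo", "clicks": 7, "referrer": "ad"}))
raw_store.append(json.dumps({"user": "cy"}))  # missing fields are fine too

def read_with_schema(store, fields):
    """Apply a schema at read time, filling gaps with None."""
    for line in store:
        rec = json.loads(line)
        yield {f: rec.get(f) for f in fields}

# Every row comes back with the same shape, regardless of what was written.
rows = list(read_with_schema(raw_store, ["user", "clicks"]))
```

The design tradeoff is exactly what the paragraph describes: writes become cheap and flexible, while the burden of enforcing structure moves to read time, which is a new discipline for DBAs used to rigid schemas.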


Before you roll your eyes at predictive analytics appearing on yet another "trend" list, read on. For years, we have been saying that predictive analytics is the next big thing in technology — particularly in database management. Now we've started to see a real shift: it's actually happening.

Early adopters have known this for a while, but for those slower to adopt new technologies, predictive analytics is the hottest thing they've found in years. On top of that, technology has finally caught up to enable predictive analytics across platforms. Whether your databases run on premises or in the cloud, and whether they're NoSQL or RDBMS, database monitoring solutions now work across platforms and providers — digging deeper into your data and delivering predictive insight from it.
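As a toy illustration of what "predictive insight" means at the metric level, even a simple least-squares trend fit can project when a resource will hit a limit. The samples and threshold below are made up for the example; real monitoring platforms use far richer models than a straight line.

```python
# Fit a straight line to historical disk-usage samples and project forward
# to estimate when usage will cross a capacity threshold.
# Hypothetical data: (day, percent of disk used).
samples = [(0, 40.0), (1, 42.5), (2, 44.8), (3, 47.6), (4, 50.1)]

n = len(samples)
mean_x = sum(x for x, _ in samples) / n
mean_y = sum(y for _, y in samples) / n

# Ordinary least-squares slope and intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in samples) / \
        sum((x - mean_x) ** 2 for x, _ in samples)
intercept = mean_y - slope * mean_x

def days_until(threshold):
    """Project the day on which the fitted trend reaches `threshold` % used."""
    return (threshold - intercept) / slope

print(f"Usage grows ~{slope:.2f}%/day; 90% full around day {days_until(90):.0f}")
```

The point is the shift in posture: instead of alerting after the disk fills, the monitoring layer extrapolates from history and warns while there is still time to act.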


Cloud continues to gain traction in the market, and numbers from 451 Research show that the cloud-as-a-service market will only continue to grow — estimates suggest it will double in the next five years.

Even as cloud grows in popularity, it adds complexity to our database environments. We are seeing more cloud adoption, sure, but there is still a dependency on older technologies that exist only in on-premises environments.

Merging old technology with new can be challenging on every front: staffing requires talent that can straddle both sides of the technology world; data security becomes more complex with more platforms in play, some running outdated technology; and troubleshooting becomes more time-consuming without the right monitoring technology in place.


The database landscape will continue to grow in complexity as platform technologies evolve to meet the dynamic, changing needs of organizations. As we've seen with cloud migrations, companies struggle to move away from technologies they've relied on heavily for decades, even as they favor newer technologies for new implementations. In 2017, distributed computing environments will gain traction as end-user companies transition from complementing their relational databases with distributed platforms to replacing them completely.

In its 2017 Trends in Data Platforms and Analytics report, 451 Research predicts that this shift will have a significant impact on data grids, transforming them from caches into bona fide data processing layers. Without the right pieces in place to optimize performance across database platforms, that only adds complexity to database management.


With the Internet of Things (IoT), databases have moved from being an IT function to the core of nearly every element of an organization's infrastructure. As a result, more attention is being paid to the performance of these databases — after all, it directly impacts customer satisfaction and even employee productivity.

More attention means a greater need than ever for insight and analysis into key metrics. A database performance monitoring solution helps you not only uncover issues as they arise, but also dig deep into key data points — like queries, CPU balance, network traffic, wait time and memory usage — and share that information with other stakeholders in the organization.
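To make "key data points" concrete, here is a minimal sketch of turning raw metric samples into findings you could hand to a stakeholder. The metric names, hostnames, and thresholds are invented for illustration; they are not SelectStar's actual data model.

```python
# Evaluate a batch of database metric samples against alert thresholds
# and collect the breaches. All names and limits below are hypothetical.
THRESHOLDS = {
    "cpu_pct": 85.0,        # sustained CPU saturation
    "wait_time_ms": 500.0,  # average query wait time
    "memory_pct": 90.0,     # memory pressure
}

samples = [
    {"host": "db-primary", "cpu_pct": 91.0, "wait_time_ms": 120.0, "memory_pct": 70.0},
    {"host": "db-replica", "cpu_pct": 45.0, "wait_time_ms": 640.0, "memory_pct": 88.0},
]

def find_breaches(samples, thresholds):
    """Return (host, metric, value) for every sample exceeding its limit."""
    breaches = []
    for s in samples:
        for metric, limit in thresholds.items():
            if s.get(metric, 0.0) > limit:
                breaches.append((s["host"], metric, s[metric]))
    return breaches

for host, metric, value in find_breaches(samples, THRESHOLDS):
    print(f"{host}: {metric} = {value} exceeds threshold")
```

In practice a monitoring platform correlates these signals over time rather than checking static limits, but even this simple pass shows how raw metrics become the kind of report other teams can act on.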

Interested in learning more? Try a free trial of SelectStar to see how its database and infrastructure monitoring can ready you for the challenges 2017 brings your way.


Try SelectStar for 14 days. No credit card required.