Posts

Showing posts with the label automation

AIOps Platforms (Gartner)

AIOps is an emerging technology and addresses something I’m a big fan of – improving IT Operations. So I asked fellow Gartner analyst Colin Fletcher for a guest blog on the topic… Roughly three years ago, it was looking like we were going to see many enterprise IT operations leaders put themselves in the precarious role of “the cobbler’s children” by forgoing investment in Artificial Intelligence (AI) to help them do their work better, faster, and cheaper. We were hearing from many IT ops leaders who were building incredibly sophisticated Big Data and Advanced Analytics systems for business stakeholders, but were themselves using rudimentary, reactive red/yellow/green lights and manual steps to help run the infrastructure required to keep those same systems up and running. Further, we’re all now familiar in our personal lives with dynamic recommendations from online retailers, search providers, virtual personal assistants, and entertainment services. Talk about a paradox! Now I...

The Future of Data Engineering

Data engineering’s job is to help an organization move and process data. Broadly speaking, this requires two different systems: a data pipeline and a data warehouse. The data pipeline is responsible for moving the data, and the data warehouse is responsible for processing it. I acknowledge that this is a bit overly simplistic. You can do processing in the pipeline itself by doing transformations between extraction and loading with batch and stream processing. The “data warehouse” now includes many storage and processing systems (Flink, Spark, Presto, Hive, BigQuery, Redshift, etc.), as well as auxiliary systems such as data catalogs, job schedulers, and so on. Still, I believe the paradigm holds. The industry is working through changes in how these systems are built and managed. There are four areas, in particular, where I expect to see shifts over the next few years. Timeliness: from batch to realtime; Connectivity: from one:one bespoke integrations to many:many; Cen...
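The pipeline/warehouse split described above can be sketched in a few lines: extraction and loading form the pipeline, an optional transform sits between them, and aggregation happens warehouse-side after load. This is a minimal illustration, not any particular product; the source records, table schema, and function names are all hypothetical, and an in-memory SQLite database stands in for the warehouse.

```python
import sqlite3

# Hypothetical source records; stands in for an upstream system.
SOURCE = [
    {"user": "alice", "event": "click", "ts": 1},
    {"user": "bob", "event": "view", "ts": 2},
    {"user": "alice", "event": "view", "ts": 3},
]

def extract():
    """Pipeline: pull raw records from the source."""
    yield from SOURCE

def transform(records):
    """Optional in-pipeline processing between extraction and loading."""
    for r in records:
        yield (r["user"].upper(), r["event"], r["ts"])

def load(rows, conn):
    """Pipeline: land the (possibly transformed) rows in the warehouse."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (user TEXT, event TEXT, ts INTEGER)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

def events_per_user(conn):
    """Warehouse: processing happens after the data has landed."""
    return conn.execute(
        "SELECT user, COUNT(*) FROM events GROUP BY user ORDER BY user"
    ).fetchall()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(events_per_user(conn))  # [('ALICE', 2), ('BOB', 1)]
```

Moving the `transform` step into the pipeline (as here) versus running it as SQL inside the warehouse is exactly the ELT-vs-ETL trade-off the post is gesturing at.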

DataOps Principles: How Startups Do Data The Right Way

If you have been trying to harness the power of data science and machine learning — but, like many teams, struggling to produce results — there’s a secret you are missing out on. All of those models and sophisticated insights require lots of good data, and the best way to get good data quickly is by using DataOps. What is DataOps? It’s a way of thinking about how an organization deals with data. It’s a set of tools to automate processes and empower individuals. And it’s a new DataOps Engineer role designed to make that thinking real by managing and building those tools. DataOps Principles: DataOps was inspired by DevOps, which brought the power of agile development to operations (infrastructure management and production deployment). DevOps transformed the way that software development is done, and now DataOps is transforming the way that data management is done. For larger enterprises with a dedicated data engineering team, DataOps is about breaking down barriers and re-...