StreamSets Adds New Features to DataOps Platform
September 11, 2018

StreamSets announced innovations that help companies efficiently build and continuously operate dataflows that span their data center and leading cloud platforms — AWS, Microsoft Azure and Google Cloud Platform.

New capabilities include data drift handling for cloud data stores for improved pipeline resiliency, continuous integration and delivery (CI/CD) automation that brings DevOps-style agility to dataflow pipelines, and the ability to centrally manage in-stream data protection policies for security and compliance.

These features build on StreamSets DataOps Platform’s rich catalog of cloud connectors, its cloud-native architecture for easy cross-platform deployment, and its ability to elastically scale dataflows via Kubernetes.

Features such as data drift handling and in-stream data protection are powered by StreamSets’ unique Intelligent Pipelines capability, which inspects and analyzes data as it flows, overcoming the lack of visibility common in traditional data integration and big data ingestion approaches.

A majority of StreamSets customers already use the StreamSets DataOps Platform for cloud dataflows, executing both “lift and shift” cloud migration projects that require peak throughput, and continuous real-time streaming of data.

“As our customers embark on their hybrid cloud journey, we see first-hand their struggle to orchestrate end-to-end management of data movement across a growing range of on-premises and cloud platforms,” said Arvind Prabhakar, CTO, StreamSets. “Our DataOps platform was architected as cloud-native from the start, allowing us to easily evolve with the market. Cloud drift-handling and CI/CD for dataflows are unique enhancements that help our customers on their journey from traditional to modern data integration based on DataOps.”

The expansion of data architectures into the cloud creates challenges for enterprises that still rely on traditional data integration software or single-purpose big data ingestion tools. Using these methods, pipelines take too long to build and deploy, and often rely on valuable, specialized developers. They are opaque, denying the end-to-end visibility into pipeline performance needed to prevent failures or to detect sensitive personal data in the dataflow. Finally, they are rigid, breaking whenever data drift occurs, such as when fields are added or changed or data platforms are upgraded.
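To make the drift problem concrete, here is a minimal Python sketch (hypothetical, not StreamSets code) contrasting a hard-coded mapping, which drops or breaks on changed fields, with a drift-tolerant one that lets new fields flow through:

```python
# Hypothetical illustration of data drift; not StreamSets code.

def rigid_transform(record: dict) -> dict:
    # Hard-codes the expected fields: a renamed or missing field raises
    # KeyError, and any newly added field is silently dropped.
    return {"id": record["id"], "amount": float(record["amount"])}

def drift_tolerant_transform(record: dict) -> dict:
    # Passes every field through, so new source fields keep flowing
    # downstream instead of breaking the pipeline.
    out = dict(record)
    out["amount"] = float(out.get("amount", 0.0))
    return out

# Overnight, the source starts emitting a "currency" field (drift):
drifted = {"id": 1, "amount": "9.99", "currency": "EUR"}
print(rigid_transform(drifted))           # {'id': 1, 'amount': 9.99} -- "currency" lost
print(drift_tolerant_transform(drifted))  # keeps "currency" alongside the parsed amount
```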

With these new features, which began rolling out in late August, StreamSets DataOps Platform now offers:

- Development automation through a full-featured dataflow designer that includes “easy button” connectors for Amazon S3, Elastic MapReduce (EMR) and Redshift; Azure Data Lake Storage, HDInsight and Azure Databricks; Google Cloud Dataproc; and Snowflake

- Elastic scaling of cloud, multi-cloud and reverse hybrid cloud dataflows via Kubernetes

- New data drift handling, which automatically reflects updates to source schemas in the Amazon Athena, Azure SQL and Google BigQuery cloud data services (illustrated in the first sketch after this list)

- A new CI/CD framework that automates frequent changes to dataflows through iterative design, test, validation and deployment steps

- New central governance of StreamSets Data Protector policies that detect and deal with sensitive data such as PII and PHI (see the second sketch after this list)
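For the data drift bullet above, this sketch gives a sense of what drift handling must do against a cloud data store: widen a BigQuery table when new source fields appear. It assumes the google-cloud-bigquery client library; the project, dataset, table and field names are illustrative, and this is not StreamSets’ actual implementation.

```python
# Minimal sketch of propagating schema drift to BigQuery.
# Assumes the google-cloud-bigquery library and illustrative names;
# this is not StreamSets' actual implementation.
from google.cloud import bigquery

def add_missing_columns(client: bigquery.Client, table_id: str, record: dict) -> None:
    """Append any record fields not yet present as nullable STRING columns."""
    table = client.get_table(table_id)  # e.g. "my_project.my_dataset.events"
    existing = {field.name for field in table.schema}
    new_fields = [
        bigquery.SchemaField(name, "STRING", mode="NULLABLE")
        for name in record
        if name not in existing
    ]
    if new_fields:
        table.schema = list(table.schema) + new_fields  # adding columns is allowed
        client.update_table(table, ["schema"])          # push the widened schema

client = bigquery.Client()
add_missing_columns(client, "my_project.my_dataset.events",
                    {"id": "1", "amount": "9.99", "currency": "EUR"})
```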
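And for the data protection bullet, here is a generic, hypothetical pattern-based redaction step of the kind a policy might apply to records in flight; the patterns and replacement tokens are illustrative and this is not Data Protector’s engine.

```python
# Generic illustration of in-stream PII redaction with regex patterns;
# StreamSets Data Protector's policy engine is not shown here.
import re

# Hypothetical policy: pattern -> replacement applied to every field value.
PII_PATTERNS = {
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"): "<SSN>",           # US Social Security number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"): "<EMAIL>",   # email address
}

def redact(record: dict) -> dict:
    """Return a copy of the record with matching values masked in flight."""
    clean = {}
    for key, value in record.items():
        text = str(value)
        for pattern, token in PII_PATTERNS.items():
            text = pattern.sub(token, text)
        clean[key] = text
    return clean

print(redact({"id": 7, "note": "reach me at jane@example.com, SSN 123-45-6789"}))
# {'id': '7', 'note': 'reach me at <EMAIL>, SSN <SSN>'}
```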
