At Confluent Current, Decodable announced the general availability of Bring Your Own Cloud (BYOC), which allows users to run a private instance of the Decodable data plane—the part that handles connectivity and stream processing—in their own AWS account. In addition, Decodable has opened a technical preview of support for custom jobs written in Java using the standard Apache Flink DataStream and Table APIs.
“With all of the excitement around real-time data and what it helps us do, teams still get tripped up on practical issues. We’re fixing that,” said Eric Sammer, founder and CEO of Decodable. “We’ve taken the best-of-breed open source projects—Apache Flink and Debezium, among others—and built a powerful, fully-managed stream processing platform enterprises can run at scale in production. This goes beyond just spinning up Flink clusters; we provide a simple, easy-to-use developer experience using SQL or Java. BYOC is the next iteration in Decodable’s ongoing mission to support enterprise streaming data stacks at global scale.”
Additionally, Decodable now offers developer-focused support for running custom Apache Flink jobs, written in any JVM-based programming language such as Java or Scala, alongside its SQL-based data streaming pipelines. Custom pipelines let users choose freely between declarative and imperative job implementations, depending on their requirements. Both custom and SQL-based pipelines integrate with fully-managed connectors, so users move only those parts that require a bespoke imperative implementation into a custom job.
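As a rough illustration of what such an imperative job looks like, here is a minimal Apache Flink DataStream sketch in Java. The class name, job name, and inline sample data are invented for this example; in an actual Decodable custom job, the source and sink would be the platform's managed connections rather than hard-coded elements.

```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Minimal illustrative Flink DataStream job (hypothetical example).
public class UppercaseJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        // Stand-in source; a real custom job would read from a
        // Decodable-managed connection instead of fromElements.
        env.fromElements("hello", "streaming", "world")
            .map(new MapFunction<String, String>() {
                @Override
                public String map(String value) {
                    return value.toUpperCase();
                }
            })
            .print();

        env.execute("uppercase-example");
    }
}
```

The same transformation could be expressed declaratively in SQL; the imperative form becomes worthwhile when the logic outgrows what SQL can express.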
This flexibility covers a broad range of use cases, from simple data transformations expressed in SQL to complex processing logic that calls for custom code.