The Big Picture: Security teams are scaling their infrastructure faster than their data pipelines can keep pace—resulting in ballooning SIEM ingestion costs, months-long onboarding timelines, and SOC analysts trapped doing data engineering instead of threat hunting. The telemetry explosion has made a new architectural layer unavoidable.
The Guests:
- Aditya Sundararam, Chief Product Officer at DataBahn
- Nithya Nareshkumar, Co-Founder & President at DataBahn
Key Takeaways
- DataBahn acts as the intelligent control plane for security telemetry—normalizing, governing, optimizing, and routing data before it reaches any downstream SIEM, data lake, or agentic AI system
- The expanded Microsoft Sentinel integration compresses onboarding from months to hours, with customers achieving up to 60% reduction in ingestion costs
- AI-readiness starts at the pipeline: the organizations that win the next decade of cybersecurity will be those that operationalize clean, structured data quickly—not those that collect the most logs
In a recent TFiR interview, Swapnil Bhartiya spoke with Aditya Sundararam, Chief Product Officer, and Nithya Nareshkumar, Co-Founder & President at DataBahn, about the structural collapse of legacy security data architectures—and why an intelligent pipeline layer has become the foundation for modern SIEM deployments, Microsoft Sentinel integrations, and agentic AI-driven security operations centers.
THE SECURITY TELEMETRY CRISIS
What once involved collecting logs from a small number of systems now spans hundreds of sources—SaaS platforms, cloud-native services, on-premises infrastructure, IoT and OT environments, and third-party applications. The result is not just a volume problem. It is a complexity, cost, and data quality crisis.
Redundant telemetry inflates SIEM ingestion bills. Overlapping tool coverage produces noisy, low-signal logs. Sensitive data flows into downstream systems—including LLMs and agentic platforms—that were never intended to receive it. And when vendor formats change, the detection stack that was generating value quietly stops doing so.
Aditya Sundararam described the structural gap: “Many organizations started simply assuming that the main cost of security data is just SIEM licensing and compute. But often the real challenge turned out to be much deeper than that. Teams were collecting the wrong data, or not collecting data they should be—and too much noisy telemetry was surfacing because of this explosion in architecture and components.”
DATABAHN: THE INTELLIGENT CONTROL PLANE FOR SECURITY TELEMETRY
What DataBahn Does
DataBahn positions itself as the control plane for enterprise security telemetry—sitting between data sources and downstream consumers such as SIEMs, data lakes, compliance archives, and agentic AI workflows. The platform takes on four core functions in sequence.
First, it collects and normalizes telemetry from any source—SaaS, on-premises, IoT, OT, cloud-native—into a structure that every downstream consumer can use without additional data engineering. Second, it enforces a governance layer: validating data quality, detecting collection blind spots where a source has stopped sending, and identifying sensitive data that should never reach third-party systems. Third, it optimizes the pipeline by identifying and removing redundant, overlapping, or zero-value telemetry before ingestion. Fourth, it routes the refined data intelligently—to analytics tiers, cold storage, compliance archives, or agentic AI systems—based on use case.
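To make the four stages concrete, here is a minimal Python sketch of a pipeline with this shape. It is an illustration under stated assumptions: the function names, event fields, and destination labels are hypothetical and do not reflect DataBahn's actual API or schema.

```python
# Illustrative normalize -> govern -> optimize -> route pipeline.
# All names, fields, and labels below are hypothetical, not DataBahn's API.
import hashlib

SENSITIVE_FIELDS = {"ssn", "password", "api_key"}  # assumed examples

def normalize(raw: dict) -> dict:
    """Map vendor-specific fields onto one common schema."""
    vendor_keys = {"ts", "eventTime", "src", "device", "action", "eventType"}
    return {
        "timestamp": raw.get("ts") or raw.get("eventTime"),
        "source": raw.get("src") or raw.get("device", "unknown"),
        "action": (raw.get("action") or raw.get("eventType", "")).lower(),
        "fields": {k: v for k, v in raw.items() if k not in vendor_keys},
    }

def govern(event: dict) -> dict:
    """Redact sensitive values so they never reach third-party systems."""
    event["fields"] = {
        k: ("[REDACTED]" if k.lower() in SENSITIVE_FIELDS else v)
        for k, v in event["fields"].items()
    }
    return event

_seen: set = set()

def optimize(event: dict):
    """Drop exact duplicates (a stand-in for redundancy removal)."""
    digest = hashlib.sha256(repr(sorted(event["fields"].items())).encode()).hexdigest()
    key = f'{event["source"]}|{event["action"]}|{digest}'
    if key in _seen:
        return None  # redundant telemetry: never ingested downstream
    _seen.add(key)
    return event

def route(event: dict) -> str:
    """Pick a downstream destination by use case."""
    if event["action"] in {"alert", "detection"}:
        return "siem_analytics"       # high-value, analytics-tier
    if event["action"] == "audit":
        return "compliance_archive"   # retained cheaply for audits
    return "data_lake"                # everything else

if __name__ == "__main__":
    raw_events = [
        {"ts": "2024-06-01T12:00:00Z", "src": "fw-1", "eventType": "ALERT", "password": "hunter2"},
        {"ts": "2024-06-01T12:00:00Z", "src": "fw-1", "eventType": "ALERT", "password": "hunter2"},  # duplicate
    ]
    for raw in raw_events:
        event = optimize(govern(normalize(raw)))
        if event:
            print(route(event), event)
```

A production pipeline would be schema-driven, stateful, and far richer, but the ordering is the point: telemetry is normalized, governed, and pruned before any downstream system pays to ingest it.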
“We become the control plane of security telemetry for organizations. Instead of every tool talking to every other system, you’re introducing an intelligent data foundation that acts as the conduit—analyzing and getting the data ready for SIEMs, data lakes, and AI-based systems.”
The impact on security operations is immediate. Rather than having SOC analysts double as data engineers—writing parsers, managing transformations, maintaining connectors—the platform acts as a continuous, automated data engineering function.
Nithya Nareshkumar framed the operational shift plainly: “The security teams are involved in data plumbing grunt work. When you bring in an intelligent pipeline like DataBahn, security teams can focus on threat detection and investigation—which is what they’re meant to do—instead of building data engineering infrastructure.”
SIEM MIGRATION AND COST REDUCTION AT SCALE
Customer validation of the DataBahn platform clusters around three consistent outcomes:
- Analyst focus: security teams report reclaiming time previously absorbed by data operations, redirecting it to detection engineering, threat hunting, and analytics.
- SIEM migration acceleration: what previously required three to twelve months of parallel platform operation and manual validation of every data feed now completes in days. DataBahn customers have migrated close to 1,000 data feeds between platforms without prolonged co-licensing periods or extensive manual verification cycles.
- Security data economics: pipeline-level telemetry optimization removes the recurring cost pressure of ingestion-based SIEM pricing. Customers consistently achieve 40–50% telemetry optimization within week one of deployment, as the back-of-the-envelope sketch below illustrates.
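To make that pricing dynamic concrete, here is a minimal arithmetic sketch. The per-GB price and daily volume are assumed, purely illustrative numbers, not customer figures; only the 40–50% reduction range comes from the reporting above.

```python
# Back-of-the-envelope: what a 45% telemetry reduction does to ingestion spend.
# The price and volume below are assumed for illustration, not real figures.
PRICE_PER_GB = 4.30          # assumed ingestion price, USD per GB
DAILY_INGEST_GB = 2_000      # assumed pre-optimization volume, GB per day
REDUCTION = 0.45             # midpoint of the 40-50% range cited above

before = DAILY_INGEST_GB * PRICE_PER_GB * 365
after = before * (1 - REDUCTION)
print(f"Annual ingestion cost before: ${before:,.0f}")
print(f"Annual ingestion cost after:  ${after:,.0f}")
print(f"Annual savings:               ${before - after:,.0f}")
```

Because ingestion-priced SIEMs bill on every gigabyte, any reduction applied at the pipeline layer flows straight through to the bill, which is why the optimization percentage and the cost reduction track each other so closely.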
Aditya Sundararam explained the shift in how organizations now think about data decisions: “The trade-off of ‘should I bring this data in—is it worth the cost?’ is no longer a decision factor. At the pipeline layer, customers consolidate and refine telemetry, then take what they need to downstream systems—without having to open their checkbooks yet again.”
MICROSOFT SENTINEL INTEGRATION
DataBahn and Microsoft Sentinel: The Better Together Story
Microsoft Sentinel has become one of the fastest-growing platforms in the SIEM market, now expanding into Log Analytics Workspace tiering, Microsoft Graph, and Copilot-based agentic security workflows. DataBahn’s expanded partnership directly addresses the operational gap between Sentinel’s capabilities and how long it takes enterprises to actually activate them.
With DataBahn deployed in front of Sentinel, organizations normalize and enrich telemetry before it lands in the platform. Intelligent tiering directs data to the appropriate Sentinel layer—analytics, workspace, or long-term storage—eliminating the default behavior of routing everything into expensive analytics-tier ingestion. The result is faster onboarding, reduced cost, and data that is immediately analytics-ready and agent-compatible from day one.
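Conceptually, the tiering decision reduces to a small rule table evaluated per log source. The sketch below is a hypothetical illustration: the table names and tier labels are assumptions loosely echoing Sentinel's analytics, auxiliary, and long-term distinctions, not an actual Microsoft or DataBahn configuration.

```python
# Hypothetical tier-routing rules in the spirit of Sentinel's tiered ingestion.
# Table names and tier labels are assumptions, not a real Microsoft or DataBahn API.
TIER_RULES = [
    # (predicate on the log's table name, destination tier)
    (lambda t: t in {"SecurityAlert", "SigninLogs"}, "analytics"),          # detection-grade data
    (lambda t: t.endswith("AuditLogs"),              "auxiliary"),          # searchable, cheaper tier
    (lambda t: True,                                 "long_term_storage"),  # default: archive
]

def route_to_tier(table_name: str) -> str:
    """Return the first matching tier for a given log table."""
    for predicate, tier in TIER_RULES:
        if predicate(table_name):
            return tier
    return "long_term_storage"

for table in ["SecurityAlert", "AADAuditLogs", "NetworkFlowLogs"]:
    print(table, "->", route_to_tier(table))
```

The design choice worth noting is that the rules live in front of Sentinel rather than inside it: only data matching the analytics predicates ever incurs analytics-tier pricing, instead of everything defaulting there.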
Nithya Nareshkumar summarized the partnership value: “When you combine Sentinel’s analytics and investigation capabilities with an intelligent pipeline in front of it, organizations can move from months of onboarding and deployment into operationalization in a matter of days. That is what this partnership unlocks—and it’s a better together story.”
AI-READY DATA INFRASTRUCTURE FOR THE AGENTIC SOC
The emerging model of agentic security operations—where AI systems perform detection, investigation, and remediation workflows autonomously—creates a foundational data requirement that most existing architectures do not meet. AI and LLM-based systems cannot reliably reason on raw, noisy, or inconsistently formatted telemetry. DataBahn’s pipeline layer is designed to close that gap.
By ensuring data is clean, enriched, and contextually structured before reaching any agentic system, DataBahn removes the need for organizations to re-engineer their data foundation each time a new AI-based tool is introduced. Customers can direct precisely the data a given agentic workflow requires—without opening the floodgates to every system and introducing new privacy or security exposure.
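A rough sketch of that scoping idea, under assumed names: each agentic workflow declares the fields it needs, and the pipeline forwards only those. The workflow names and field allowlists here are invented for illustration.

```python
# Illustrative per-workflow data scoping: each agent sees only declared fields.
# Workflow names and field allowlists are hypothetical examples.
WORKFLOW_SCOPES = {
    "phishing_triage_agent": {"timestamp", "sender", "subject", "verdict"},
    "lateral_movement_agent": {"timestamp", "src_host", "dst_host", "action"},
}

def scope_event(workflow: str, event: dict) -> dict:
    """Forward only the fields this workflow is allowed to see."""
    allowed = WORKFLOW_SCOPES.get(workflow, set())
    return {k: v for k, v in event.items() if k in allowed}

event = {
    "timestamp": "2024-06-01T12:00:00Z",
    "sender": "attacker@example.com",
    "subject": "Invoice",
    "verdict": "suspicious",
    "employee_ssn": "000-00-0000",  # sensitive: never forwarded to an agent
}
print(scope_event("phishing_triage_agent", event))
```

The sensitive field never leaves the pipeline, which is the privacy-exposure point: onboarding a new agentic tool means adding one scope entry, not rebuilding the data foundation.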
Aditya Sundararam drew the distinction between surface-level AI adoption and genuine AI readiness: “It’s no longer just putting a sticker of AI or LLM on top of raw datasets. You now have a layer that intelligently gives context to these systems—so customers can very quickly adopt or build new systems without having to go back to data engineering or fix data plumbing.”
Nithya Nareshkumar framed the competitive divide that is opening across enterprise security: “The last decade of cybersecurity was probably about building detection platforms. From this point onward, it’s going to be about preparing the data that these platforms are going to rely on—and that includes AI-based platforms. The organizations that come out as winners will not be those that collect the most logs. It will be those that are able to operationalize their data sets meaningfully, and very quickly.”