Cloud Native

Edge computing should be effortless and secure: ZEDEDA’s Erik Nordmark


At KubeCon + CloudNativeCon Europe in London, Erik Nordmark, Co-Founder and CTO of ZEDEDA, sat down with Swapnil Bhartiya to discuss the rapidly evolving state of edge computing. What followed was a deep dive into the real-world challenges of deploying cloud-native systems at the edge — and how ZEDEDA is tackling them head-on.

Why Edge Computing Now?

Edge computing has moved from hype to necessity, particularly across verticals like retail, energy, and industrial automation. Unlike cloud or data center environments, edge deployments must operate in disconnected, often harsh conditions: solar farms with no nearby technicians, trucks moving between connectivity zones, or factory floors where systems shut down abruptly with the lights.

ZEDEDA is targeting this space with one primary mission: make edge computing secure, resilient, and effortless. “We’ve been working on making edge computing effortless and secure — supporting customers moving from legacy systems toward containers and Kubernetes,” said Nordmark. “We operate in environments where the hardware may sit untouched for 7 to 10 years.”

Immutable Infrastructure at the Edge

ZEDEDA’s platform is built on EVE-OS, an open-source, LF Edge–hosted project designed as a minimal, immutable operating system for edge devices. This mirrors container image immutability and brings that philosophy to the OS layer. “The OS doesn’t self-modify. It’s either running version one, two, or three,” explained Nordmark. “You roll forward deliberately — no surprise patching, no silent failures.”

This model is especially powerful in edge environments, where the network may be unreliable, physical access is limited, and graceful shutdowns are often impossible. With EVE-OS, operators maintain control and consistency without needing hands-on maintenance.
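The roll-forward model Nordmark describes can be sketched as an A/B partition scheme, a common way to implement immutable OS updates. This is an illustrative sketch, not EVE-OS code; the `Slot` and `ImmutableOS` names are hypothetical:

```python
# Hypothetical A/B update sketch: the OS is never patched in place.
# A new image is written whole to the inactive slot; the device switches
# only after the new image boots successfully, otherwise it keeps the
# known-good version.
from dataclasses import dataclass

@dataclass
class Slot:
    version: str
    healthy: bool = True

class ImmutableOS:
    """Two read-only slots; exactly one is active at a time."""
    def __init__(self, initial: str):
        self.slots = [Slot(initial), Slot("", healthy=False)]
        self.active = 0

    @property
    def running(self) -> str:
        return self.slots[self.active].version

    def roll_forward(self, new_version: str, boot_ok: bool) -> str:
        inactive = 1 - self.active
        self.slots[inactive] = Slot(new_version, healthy=boot_ok)
        if boot_ok:
            self.active = inactive   # deliberate, explicit switch
        # on boot failure, the device silently keeps the known-good slot
        return self.running

device = ImmutableOS("v1")
device.roll_forward("v2", boot_ok=True)   # now running v2
device.roll_forward("v3", boot_ok=False)  # bad image: still running v2
```

Because each version is a complete, unmodified image, the device is always in one of a small number of known states — the property that makes unattended ten-year deployments tractable.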

Secure Boot, Remote Attestation, and Unattended Infrastructure

The edge introduces a unique set of security challenges. Physical access is easier for adversaries, but remote response is harder. Theft, tampering, and unintentional disruptions are all on the table.

ZEDEDA brings a hardened security posture to the edge by combining:

  • Secure Boot and BIOS password protections
  • Immutable file systems
  • Measured boot and remote attestation, adapted from hyperscaler data center practices

These measures ensure that edge workloads can be deployed safely — even in locations with zero on-site IT presence.
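The measured-boot idea behind these protections can be sketched in a few lines. This is a simplified illustration of the general TPM-style pattern, not ZEDEDA's implementation: each boot stage extends a running hash, and a remote verifier compares the reported value against a known-good measurement before trusting the device. The stage names and `GOLDEN` value are hypothetical:

```python
# Simplified measured-boot / remote-attestation sketch (illustrative only).
import hashlib

def extend(measurement: bytes, component: bytes) -> bytes:
    """PCR-style extend: new = H(old || H(component)).
    Order matters, so any tampered stage changes the final value."""
    return hashlib.sha256(
        measurement + hashlib.sha256(component).digest()
    ).digest()

def measure_boot(stages: list[bytes]) -> bytes:
    m = b"\x00" * 32  # register starts zeroed at power-on
    for stage in stages:
        m = extend(m, stage)
    return m

# Known-good measurement, recorded when the image was built (hypothetical).
GOLDEN = measure_boot([b"firmware-v1", b"bootloader-v1", b"eve-os-v1"])

def attest(reported: bytes) -> bool:
    # A real verifier would check a signed TPM quote; plain compare here.
    return reported == GOLDEN

assert attest(measure_boot([b"firmware-v1", b"bootloader-v1", b"eve-os-v1"]))
assert not attest(measure_boot([b"firmware-v1", b"tampered!", b"eve-os-v1"]))
```

The point of the pattern is that the device proves what it booted rather than merely asserting it, which is what makes zero-touch trust decisions possible at remote sites.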

AI at the Edge: Opportunity Meets Reality

Edge AI is a rapidly expanding use case, but it’s not without friction. Nordmark outlined a range of technical and operational realities that must be addressed:

  • Hardware lifecycles: Edge devices may last a decade, while AI accelerators are on 12-month obsolescence cycles.
  • Model consistency: Running a uniform model across a heterogeneous, aging fleet requires careful tuning.
  • Data privacy and trust: Inference data from third-party-owned edge environments introduces challenges around retraining, consent, and validation.

ZEDEDA is actively exploring approaches to these challenges — work that will be critical as edge AI scales and moves from proof of concept to production in sectors like manufacturing and energy.

Introducing the Three-Node Edge Kubernetes Cluster

ZEDEDA is also rolling out a new feature: a resilient, autonomous three-node Kubernetes cluster for the edge. This cluster enables high availability without requiring an active internet connection or central orchestrator. “If a node fails, workloads fail over. When it returns, it rejoins the cluster—all without human intervention or external connectivity,” Nordmark said.

This is a major step forward for edge-native resiliency, especially for deployments where uptime is mission-critical but cloud access is unreliable or nonexistent.
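The failover behavior Nordmark describes rests on majority quorum, which is also why three nodes is the minimum for autonomous high availability. The sketch below is hypothetical (not ZEDEDA code) and the node names are invented; it shows why a three-node cluster tolerates one failure without any external coordinator:

```python
# Hypothetical quorum sketch: a cluster can safely reschedule workloads
# only while a strict majority of nodes can agree on state.
class Cluster:
    def __init__(self, nodes: list[str]):
        self.nodes = {n: True for n in nodes}  # node name -> is it up?

    def has_quorum(self) -> bool:
        up = sum(self.nodes.values())
        return up > len(self.nodes) // 2  # strict majority

    def fail(self, node: str) -> None:
        self.nodes[node] = False

    def rejoin(self, node: str) -> None:
        self.nodes[node] = True

    def can_reschedule(self) -> bool:
        # Workloads fail over only while quorum holds; otherwise the
        # cluster waits rather than risk a split-brain decision.
        return self.has_quorum()

c = Cluster(["edge-a", "edge-b", "edge-c"])
c.fail("edge-b")
assert c.can_reschedule()       # 2 of 3 up: workloads fail over
c.fail("edge-c")
assert not c.can_reschedule()   # no quorum: cluster holds state and waits
c.rejoin("edge-b")
assert c.can_reschedule()       # node rejoins, availability restored
```

A two-node cluster cannot distinguish a failed peer from a network partition, so three nodes is the smallest configuration where this "fail over, then rejoin" cycle can run without human intervention or a cloud tiebreaker.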

What’s Next?

ZEDEDA isn’t trying to reinvent Kubernetes or AI frameworks. Instead, the company focuses on integrating open source tools into a cohesive, opinionated platform that works in the real-world edge environments most software teams rarely see.

As edge computing matures, the need for secure, consistent, and autonomous infrastructure will only grow. ZEDEDA’s work underscores that the edge isn’t just a mini data center—it’s a fundamentally different operating environment that demands a rethinking of deployment, security, and lifecycle strategies.

Whether you’re running AI inference in a warehouse or maintaining remote sensors on an oil rig, one thing is clear: edge-native infrastructure is the new frontier—and it’s arriving fast.

Guest: Erik Nordmark
Company: ZEDEDA
Show: KubeStruck
