Kove

Kove:SDM™ is the first patented software-defined memory solution, enabling servers to access a shared, scalable memory pool—far larger than local capacity—so workloads get exactly the memory they need, when they need it, with no new hardware.

Overview

Meet the world’s first fully mature, validated, software-defined memory solution, Kove:SDM™



Founded in 2003, Kove has a long history of solving technology’s most complex infrastructure challenges. From pioneering high-speed backups for large databases and setting sustained storage speed records, to inventing technology that allows cloud storage to scale without limits, Kove has consistently redefined what’s possible in enterprise IT. After years of research, development, and rigorous validation, Kove launched its most transformative innovation yet: Kove:SDM™, the world’s first patented and fully mature software-defined memory solution.

Kove:SDM™ breaks the rigid link between compute and memory by enabling individual servers to draw from a centralized memory pool—including amounts far beyond what could be physically installed in a server. This software-based memory allocation happens dynamically and precisely, ensuring every job receives exactly the memory it needs, when it needs it. The result: continuous computation without bottlenecks, even as models scale and workloads fluctuate.

With Kove:SDM™, enterprises can dramatically boost performance and efficiency across AI/ML, analytics, virtualization, and edge workloads. By keeping GPUs and CPUs fully utilized, without requiring hardware overhauls, customers accelerate training and inference times, reduce iteration cycles, and bring innovations to market faster. Most Kove:SDM™ deployments achieve annual ROI in the 50–80% range, with some exceeding 200%, thanks to reduced over-provisioning, minimized stranded capacity, and a significant drop in infrastructure sprawl.

Kove:SDM™ is tightly integrated with Red Hat OpenShift, OpenShift AI, and Red Hat Enterprise Linux, and operates seamlessly across any supported hardware. Within OpenShift clusters, Kove:SDM™ can increase container density by up to 100x, enabling higher throughput and dramatically greater workload consolidation.

Beyond performance, Kove:SDM™ also delivers on sustainability. By increasing utilization per server, it reduces power and cooling requirements by up to 54%, supporting energy-conscious enterprises aiming to shrink their environmental footprint. Whether deployed in core data centers, hybrid clouds, or low-power edge environments, Kove:SDM™ provides consistent, scalable, and cost-effective memory performance, without compromising on security or control.

Security is foundational to Kove:SDM™. It features secure client masking, memory zeroing, fabric partitioning, and 64-bit key encryption—ensuring safe multi-tenancy and robust data isolation. And with local-memory-level latency delivered from 150 meters away or more, it eliminates one of the most persistent barriers to performance in modern distributed computing.

A unique product from Kove that provides software-defined memory [means] that we’re going to be able to train the models, and monitor the network, really, without any computer or hardware limitations.

Tom Zschach, Chief Innovation Officer, Swift

Benefits to working with Kove

More Performance, No New Hardware

Run AI/ML and real-time workloads without expensive infrastructure upgrades. Data scientists can work with datasets of any size, iterating faster and improving time-to-solution.



Lower TCO, Higher Efficiency

Reduce over-provisioning and stranded capacity while scaling seamlessly. Most customers achieve an annual ROI in the 50–80% range on their Kove:SDM™ investment, and some exceed 200%.



Lower Power & Cooling Costs

With increased utilization of each server, you need fewer servers, reducing your energy needs by up to 54%. Imagine less cooling, a smaller footprint, and more processing power for your teams.



Edge & Cloud Memory Flexibility

Extend memory to edge nodes and hybrid cloud environments without vendor lock-in.



Interested in working with this partner?

Contact a Red Hatter
FAQs

Is Kove® Software-Defined Memory secure?

Kove:SDM™ provides strong protection against attacks targeting memory. All memory is zeroed prior to use or re-use. Similar to LUN (logical unit number) masking in storage, Kove:SDM™ provides client masking to enforce customer isolation and support multi-tenancy. In addition, fabric partitioning is enforced, and 64-bit keys secure the host-fabric adapters.

What is Software-Defined Memory?

Software-Defined Memory (SDM) is a subset of Software-Defined Technologies (SDT), the management of virtualized resources; other areas of SDT include storage, computing, and networking. With Kove® Software-Defined Memory, virtualized memory enables individual servers to draw from a common memory pool, receiving exactly the amount of memory needed, including amounts far larger than can be contained within a physical server. When your current job completes, memory returns to the pool and becomes available to other servers and jobs, increasing your memory utilization. In this way, enterprises require less memory in individual servers and less aggregate memory, because memory is used more strategically on demand, where and when it is needed.
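The allocate-and-return model described above can be sketched in a few lines of Python. This is a conceptual illustration only; the class and method names are hypothetical and do not represent Kove's actual interface:

```python
# Conceptual sketch of a shared memory pool: jobs borrow exactly what
# they need and return it on completion. All names are illustrative,
# not Kove's API.

class MemoryPool:
    def __init__(self, total_gib: int):
        self.total_gib = total_gib
        self.allocated = {}  # job_id -> GiB currently borrowed

    def available(self) -> int:
        """GiB remaining in the pool for new jobs."""
        return self.total_gib - sum(self.allocated.values())

    def allocate(self, job_id: str, gib: int) -> bool:
        """Grant a job exactly the memory it asks for, if the pool has it."""
        if gib > self.available():
            return False
        self.allocated[job_id] = gib
        return True

    def release(self, job_id: str) -> None:
        """Return a finished job's memory to the pool for other jobs."""
        self.allocated.pop(job_id, None)


# A single server can borrow far more than it physically holds,
# then hand the memory back when the job finishes.
pool = MemoryPool(total_gib=4096)
pool.allocate("training-job", 512)
pool.release("training-job")
```

The key property the sketch captures is that memory is a fungible, time-shared resource: utilization rises because idle capacity in one server is not stranded but returned to the pool.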

Does Kove:SDM™ work for Edge environments?

Yes. For companies concerned with low-latency performance, scalability, energy efficiency, security, and cost as they roll out more edge computing and 5G infrastructure, Kove:SDM™ represents the definitive memory technology for growing demands. You can achieve data center performance in remote locations with limited power, because Kove:SDM™ delivers improved performance with fewer resources in demanding edge environments. With Kove:SDM™ on the edge you can:

- Achieve greater utilization, making edge computing financially viable.
- Turn 64 GiB servers into multi-TiB servers on demand.
- Save critical energy by reducing your power needs by as much as 54%.
- Increase your maximum memory per process by 10x.
- Run jobs vastly larger than your internal memory.

What are the key benefits of using Kove:SDM™ for AI workloads?

Kove:SDM™ dynamically expands memory based on demand, allowing hardware to adapt to the workload — instead of forcing workloads to conform to hardware limits. This flexibility ensures continuous computation, even as AI models grow and resize. By eliminating memory bottlenecks through dynamic allocation, Kove:SDM™ keeps GPUs and CPUs fully utilized, significantly accelerating training and inference times. This means faster insights, quicker customer delivery, and the ability to run larger models entirely in memory without delay. Instead of relying on costly hardware upgrades, performance scales efficiently through software — avoiding the need for additional GPUs or servers. As a result, organizations can handle more AI jobs in parallel, enabling higher-throughput pipelines and maximizing infrastructure efficiency.

What’s the latency impact?

There is none. Kove:SDM™ eliminates the latency problem across the data center, delivering local memory performance from 150 meters or farther away.