Real-World Use Cases with the Lanner MGX Edge AI Server

The Lanner MGX Edge AI Server is a high-performance, 2U short-depth, front-access platform built on NVIDIA’s modular MGX architecture. Designed to meet the demands of modern AI workloads, it combines CPU, GPU, and DPU capabilities in a flexible and scalable architecture. With modular PCIe support for a variety of NVIDIA GPUs, DPUs, and Smart NICs, the server enables tailored configurations that adapt to evolving use cases across enterprise data centers and telecom edge networks.

By leveraging this flexibility, Lanner collaborates with ecosystem partners to enable real-world AI innovations—from optimizing radio access networks and enhancing network traffic management to powering private, on-premises large language models. Explore how Lanner’s MGX Edge AI Server delivers the performance and adaptability needed to deploy next-generation AI applications at the edge.

RAN Intelligence

Simplifying RAN with Integrated DU, CU, and AI Acceleration

In partnership with Arrcus, Lanner enables intelligent RAN by deploying ArcOS on the ECA-6051, a powerful edge platform equipped with 400G NVIDIA BlueField-3 DPUs. By consolidating the Distributed Unit (DU), Centralized Unit (CU), and AI-driven applications running on SR-IOV virtual functions (VFs) into an MGX Edge Server, this joint solution simplifies RAN deployment, boosts network performance, and delivers scalable AI acceleration at the 5G edge.

Key Benefits:

  • Intelligent routing and consolidated functions optimize data flow between users, network, and AI.
  • Centralized control streamlines orchestration and dynamic resource allocation.
  • Virtualized functions and DPU-based routing enhance server and application performance.
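To illustrate the SR-IOV approach mentioned above: on a standard Linux host, virtual functions are typically created through the kernel's sysfs interface. This is a generic sketch, not a Lanner- or Arrcus-specific procedure, and the interface name and VF count below are hypothetical placeholders that vary by system.

```shell
# Hypothetical example: create 4 SR-IOV virtual functions on a
# BlueField-3 network port (replace enp3s0f0np0 with the actual
# interface name on your system).
echo 4 > /sys/class/net/enp3s0f0np0/device/sriov_numvfs

# Verify the VFs were created; they can then be assigned to DU/CU
# or AI application containers and VMs.
ip link show enp3s0f0np0
```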

AI Networking

Securing and Optimizing Traffic for AI Factories

In collaboration with F5, Lanner has deployed BIG-IP Next on the ECA-6051, leveraging both an NVIDIA BlueField-3 DPU and an NVIDIA L40S GPU.

This high-performance solution is designed for AI data center environments, providing real-time traffic steering, application-layer security, and load balancing.

Key Benefits:

  • Dynamic scaling of network functions at the edge
  • Improved network security and latency reduction
  • Optimized resource utilization through AI-driven orchestration

Private LLM

Offline AI for Secure, On-Premises Intelligence

Lanner partners with Alplux to power Patent Cube, a private LLM solution for enterprise-grade intellectual property management.

Running on the ECA-6051 equipped with dual NVIDIA L40S GPUs, Patent Cube brings semantic search, context-aware drafting, and personalized AI assistance to legal and patent professionals—all offline, ensuring full data sovereignty.

Key Benefits:

  • Secure, on-prem deployment with no cloud dependency
  • Advanced LLM features like semantic understanding and AI drafting
  • Ideal for industries requiring strict data privacy and IP compliance

ECA-6051 MGX Edge AI Server

  • Intel® Xeon® 6 processor (P-core and E-core)
  • 8x DDR5 6400 MHz RDIMM, max. 1024 GB system memory
  • 2x PCIe x16 FHFL, 1x PCIe x16 FHHL or 1x PCIe x16 LP (by SKU)
  • 1x GbE RJ45, 1x USB 3.0