Deep Learning Inference Server
- Intel® Xeon® D-2100
- Short Depth Chassis
- Wide Temperature
NEW
1U 19” Rackmount Open RAN Appliance with Intel® Xeon® D-2100 Multi-core Processor (Codenamed Skylake-DE)
- Intel® Xeon® D-2100 8/12/14/16-Core Processor
- Short Depth Chassis and Wide Operating Temperature -40~65°C
- 2x DDR4 2667 MHz REG, ECC RDIMM, Max. 64GB
- Front Access I/O with 1x GbE RJ45 for IPMI, 8x 10G SFP+, 1x RJ45 Console, 1x USB 3.0 and Screw-less Fan Replacement
- 4x 2.5” Internal HDD/SSD Bays, 1x M.2 NVMe 2280 M key
- 1x PCI-E x16 slot for FPGA or GPU cards (Max. 75W)
- Intel® QuickAssist Technology
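Several models in this section list Intel® QuickAssist Technology (QAT) for crypto and compression offload. As a minimal sketch, assuming a Linux host where the upstream qat_* kernel drivers are loaded, one might confirm that the QAT endpoints enumerated on the PCI bus; matching on the bound driver name rather than hard-coded device IDs keeps the check SKU-agnostic.

```python
#!/usr/bin/env python3
"""Minimal sketch: list PCI devices bound to an Intel QAT kernel driver.

Assumes a Linux host; the upstream drivers (qat_dh895xcc, qat_c62x,
qat_4xxx, ...) all carry "qat" in their names, so we match on that
instead of hard-coding SKU-specific PCI device IDs.
"""
from pathlib import Path


def find_qat_devices(pci_root: str = "/sys/bus/pci/devices") -> list:
    found = []
    for dev in Path(pci_root).iterdir():
        driver = dev / "driver"        # symlink exists only when a driver is bound
        if driver.is_symlink() and "qat" in driver.resolve().name:
            found.append(dev.name)     # e.g. "0000:b1:00.0"
    return found


if __name__ == "__main__":
    devices = find_qat_devices()
    print(f"QAT endpoints: {devices or 'none found'}")
```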
Deep Learning Inference Server
- Intel® Xeon® D-2100
- Short Depth Chassis
- Wide Temperature
NEW
Short Depth Chassis Edge Computing Appliance with Intel® Xeon® D-2100 Multi-core Processor (Codenamed Skylake-DE)
- Intel® Xeon® D-2100 12/16-Core Processor
- Wide Operating Temperature -40~65°C
- 2x DDR4 2667 MHz REG, ECC RDIMM, Max. 64GB
- Front Access I/O: 1x GbE RJ45 IPMI, 8x 10G SFP+, 2x 40G QSFP+, 1x RJ45 Console, 1x USB 3.0 and Front Fan Replacement
- 2x 2.5” Internal HDD/SSD Bays, 2x M.2 NVMe 2280 M key
- 1x PCI-E x16 FH/HL slot for FPGA or GPU cards
- Optional G.8272 T-GM Compliant IEEE 1588v2 SyncE, Onboard GPS
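The optional IEEE 1588v2 / SyncE timing above relies on a PTP hardware clock (PHC) exposed by the NIC. Before pointing a PTP daemon such as linuxptp's ptp4l at a port, one can check which /dev/ptpN device it exposes; a minimal sketch, assuming a Linux host with ethtool installed and "eth0" as a placeholder port name:

```python
#!/usr/bin/env python3
"""Minimal sketch: find the PTP hardware clock (PHC) behind a NIC port.

Assumes Linux with ethtool installed; "eth0" is a placeholder for the
actual SFP+/QSFP+ port name.
"""
import re
import subprocess
from typing import Optional


def phc_index(iface: str = "eth0") -> Optional[int]:
    # "ethtool -T" prints time-stamping capabilities, including a line
    # such as "PTP Hardware Clock: 0" (or "none" when absent).
    out = subprocess.run(["ethtool", "-T", iface],
                         capture_output=True, text=True, check=True).stdout
    match = re.search(r"PTP Hardware Clock:\s*(\d+)", out)
    return int(match.group(1)) if match else None


if __name__ == "__main__":
    idx = phc_index()
    print(f"PHC: /dev/ptp{idx}" if idx is not None else "no PTP hardware clock")
```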
Deep Learning Inference Server
- Intel® Xeon® D-2700
- 10G SFP+, 25G SFP28
- -40~65°C Temp. Range
NEW
High Performance Edge Computing Appliance with Intel® Xeon® D-2700 Multi-core Processor
- Intel® Xeon® D-2700 8~20-Core Processor
- 4x DDR4 3200MHz REG RDIMM, Max. 256GB
- 8x 10G SFP+, 2x 25G SFP28, 1x RJ45, 2x USB 3.0
- 1x RJ45 Console, 2x M.2 NVMe 2280, 1x PCIe x16 FH 3/4L, 1x OCP 3.0 Slot
- -40~65°C Operating Temperature (SKU B)
- Intel® QuickAssist Technology
Deep Learning Inference Server
- Intel® Sapphire Rapids EE
- OCP 3.0 NIC, TPM
- 3x PCIe Slots
NEW
5G Edge Server with Intel® Xeon® Scalable Processor (Sapphire Rapids SP/EE)
- Intel® Sapphire Rapids SP/EE Processor
- Intel vRAN Boost Support (Sapphire Rapids SP/EE)
- Short Depth Chassis and Front I/O Design
- 16x DDR5 4400MHz RDIMM, Max. 1024GB
- 1x OCP 3.0 NIC Module
- 0~50°C Operating Temperature (By CPU SKU)
- 2x M.2 NVMe 2280, 2x 2.5” SATA/U.2
- 1x FHFL PCIe x16 slot, 2x LP or 1x FHHL slot (PCIe x8)
- Secure BMC / TPM 2.0 (see the BMC query sketch below)
- An NVIDIA-Certified System for industrial edge
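The secure BMC noted above is normally reached out-of-band over IPMI. A minimal sketch of querying chassis power state remotely, assuming ipmitool is installed on the management station; the address and credentials are placeholders.

```python
#!/usr/bin/env python3
"""Minimal sketch: read chassis power state from the appliance's BMC
over IPMI-over-LAN using ipmitool.

The BMC address and credentials below are placeholders.
"""
import subprocess

BMC_HOST = "192.0.2.10"   # placeholder management IP
BMC_USER = "admin"        # placeholder user
BMC_PASS = "password"     # placeholder password


def chassis_status() -> str:
    cmd = ["ipmitool", "-I", "lanplus", "-H", BMC_HOST,
           "-U", BMC_USER, "-P", BMC_PASS, "chassis", "status"]
    return subprocess.run(cmd, capture_output=True, text=True,
                          check=True).stdout


if __name__ == "__main__":
    print(chassis_status())
```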
Deep Learning Inference Server
- 2x Intel® Xeon® Cascade Lake
- 12x 3.5” Drive Bays
- Max. 768GB Memory
NEW
2U High Performance x86 Hyper-converged Appliance with 12x 3.5” Storage Bays
- 2U High Performance Hyper-converged Appliance
- Dual 2nd Gen Intel® Xeon® Scalable Processors up to 205W, Max. 24x DDR4 R-DIMM
- Front: 12x 3.5” SATA 6G HDD (Default), SAS / 2x NVMe (Optional)
- Rear: 2x 2.5” SATA 6G
- Console, LOM port, MGT port, 2x USB 3.0 ports, RJ45, SFP+ ports
- 2x PCI-E x16 FH 10.5”L (Max. 266.7mm) + 1x PCI-E x8 HH/HL
- 1+1 Redundant Power Supply
- An NVIDIA-Certified System for enterprise edge
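As an NVIDIA-Certified System, this model is meant to host GPU cards in its two full-height PCI-E x16 slots. A minimal sketch of enumerating the installed GPUs, assuming the NVIDIA driver stack (which ships nvidia-smi) is present:

```python
#!/usr/bin/env python3
"""Minimal sketch: enumerate installed NVIDIA GPUs via nvidia-smi.

Assumes the NVIDIA driver stack is installed; nvidia-smi ships with it.
"""
import subprocess


def list_gpus() -> list:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True).stdout
    return [line.strip() for line in out.splitlines() if line.strip()]


if __name__ == "__main__":
    for gpu in list_gpus():
        print(gpu)   # e.g. "NVIDIA A2, 15356 MiB"
```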
Deep Learning Inference Server
- 2x Intel® Xeon® Ice Lake SP
- Max. 1536GB Memory
- 8x NIC, 2x RJ45, 1x Console
NEW
2U 19" Rackmount Network Appliance Built with 3rd Gen Intel® Xeon® Scalable Processor (Codenamed Ice Lake SP)
- Dual 3rd Gen Intel® Xeon® Scalable Processor
- 24x DDR4 2133/2400/2666/2933/3200 MHz, Max. 1536GB
- 8x NIC Slots, 2x GbE RJ45, 1x RJ45 Console, 1x LOM, 2x USB 3.0
- PCIe x16 Gen 4 Expansion (Optional), 3x M.2 2280 (NVMe & SATA)
- 100G Intel® QAT, Intel® SGX, Intel® Boot Guard, TPM 2.0 (see the TPM check sketch below)
- 4x Individual Hot-swappable Fans, 1300W/2000W 1+1 ATX Redundant PSUs
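For the TPM 2.0 listed above, a quick functional check is to pull bytes from the TPM's random number generator. A minimal sketch, assuming a Linux host with tpm2-tools installed and access to the kernel's TPM resource manager (/dev/tpmrm0):

```python
#!/usr/bin/env python3
"""Minimal sketch: smoke-test the onboard TPM 2.0 by reading bytes from
its random number generator.

Assumes Linux with tpm2-tools installed; recent tpm2_getrandom versions
write the raw bytes to stdout.
"""
import subprocess


def tpm_random(n: int = 16) -> bytes:
    out = subprocess.run(["tpm2_getrandom", str(n)],
                         capture_output=True, check=True)
    return out.stdout


if __name__ == "__main__":
    print(tpm_random().hex())   # 16 random bytes as hex
```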
Deep Learning Inference Server
- Intel® Sapphire Rapids-SP
- Max. 8x NIC
- Max. 12x NVMe HDD
NEW
2U 19” Rackmount Network Appliance Built with Intel® Xeon® Processor Scalable Family (Codenamed Sapphire Rapids-SP)
- Intel® Xeon® Processor Scalable Family (Sapphire Rapids-SP)
- 24x 288-pin DDR5 4800MHz R-DIMM, Max. 1536GB
- 8x NIC Slots, 2x GbE RJ45, 1x RJ45 Console, 1x LOM, 2x USB 3.0
- 2x 2.5” HDD/SSD (SKU A & C)
- 6x Individual Hot-swappable Fans, 1600W/2000W 1+1 ATX Redundant PSUs
- Intel® QuickAssist Technology
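Each entry in this section is positioned as a deep learning inference server. As an illustration of the target workload, a minimal sketch of CPU-side inference with ONNX Runtime; "model.onnx" and the 1x3x224x224 input are placeholders for an actual exported model, and GPU-equipped SKUs would select a different execution provider.

```python
#!/usr/bin/env python3
"""Minimal sketch: CPU inference with ONNX Runtime.

"model.onnx" and the 1x3x224x224 input below are placeholders for a
real exported model and its expected input shape.
"""
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx",
                               providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder input

outputs = session.run(None, {input_name: batch})
print("output shapes:", [o.shape for o in outputs])
```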