On May 27th, 2019, NVIDIA announced NVIDIA EGX, a platform that allows companies to run low-latency AI on edge devices. NVIDIA predicted that by 2025 around 150 billion devices will be streaming continuous data that needs processing, and it developed EGX to handle this traffic and process the data in real time. EGX is an edge machine learning platform based on Kubernetes.
NVIDIA EGX can process real-time data from these devices without first sending it to cloud data centres. This matters because AI algorithms have traditionally needed powerful machines to process complex data, which meant high latency and huge bandwidth requirements for transferring that data to the cloud or to data centres.
With this platform, AI has been brought to the edge of the network. NVIDIA, a tech giant known for GPUs (graphics processing units) that excel at AI workloads, joined hands with various hardware companies to achieve this feat.
Some examples of that GPU pedigree include the Tesla V100 for deep learning and the Quadro GV100, which uses ray tracing to create realistic images in real time. With exposure to such technology and first-hand experience developing it, it was only a matter of time before NVIDIA entered the AI edge market.
Various characteristics of NVIDIA EGX include:
EGX is highly scalable. At the low end it runs on the NVIDIA Jetson Nano, which delivers about one-half a trillion operations per second (TOPS) in just a few watts. At the high end, a full rack of NVIDIA T4 servers delivers more than 10,000 TOPS for real-time speech recognition and other real-time AI tasks.
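To put that scalability range in perspective, the two throughput figures above can be compared directly; this is a minimal sketch using only the TOPS numbers quoted in the text:

```python
# Scalability range of EGX, using the figures quoted above.
jetson_nano_tops = 0.5    # Jetson Nano: ~one-half trillion ops/sec, a few watts
t4_rack_tops = 10_000     # full rack of T4 servers: more than 10,000 TOPS

# How many Jetson Nano-class devices would match one T4 rack's throughput?
ratio = t4_rack_tops / jetson_nano_tops
print(f"A T4 rack delivers roughly {ratio:,.0f}x the TOPS of one Jetson Nano")
```

The point is the breadth of the range: the same platform spans battery-class devices and rack-scale servers, a factor of about 20,000 in raw throughput.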
NVIDIA partnered with Red Hat for the EGX platform, integrating it with OpenShift.
Chris Wright, CTO of Red Hat, claimed that the combination of EGX and OpenShift will let users better optimize their operations, calling it a consistent, high-performance, container-centric platform.
NVIDIA also joined hands with Cisco and Mellanox. This means the EGX platform will combine high-end AI computing, the leading enterprise-grade Kubernetes container orchestration platform, and networking, security, and storage technologies. These capabilities broaden the range of uses for EGX and also make it safer to use. The collaborations with these companies make NVIDIA EGX promising.
EGX also enables hybrid-cloud and multi-cloud IoT. NVIDIA AI computing can be used on EGX, AI functionality from the cloud can run on EGX, and the reverse is also true. This lets EGX connect to major cloud IoT services so that customers can remotely manage those services. EGX is compatible with AI applications running on AWS IoT Greengrass and Microsoft Azure IoT Edge, and the platform can connect to IoT services from both providers.
NVIDIA Edge Stack is optimized software that includes NVIDIA drivers, a CUDA Kubernetes plugin, a CUDA container runtime, CUDA-X libraries, and containerized AI frameworks and applications, including TensorRT, TensorRT Inference Server and DeepStream. NVIDIA Edge Stack is optimized for certified servers and can be downloaded from the NVIDIA NGC registry.
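As a rough sketch of how these pieces fit together on a Kubernetes-based platform like EGX, a pod can request a GPU that the NVIDIA Kubernetes device plugin exposes and run a containerized framework pulled from the NGC registry. The pod name and image tag below are placeholders for illustration, not details from NVIDIA:

```yaml
# Hypothetical pod spec: run an NGC container on a GPU-equipped node.
# The nvidia.com/gpu resource is advertised by NVIDIA's Kubernetes device
# plugin; the CUDA container runtime gives the container GPU access.
apiVersion: v1
kind: Pod
metadata:
  name: tensorrt-inference            # placeholder name
spec:
  containers:
    - name: tensorrt
      image: nvcr.io/nvidia/tensorrt:latest   # from the NGC registry; pin a specific tag in practice
      resources:
        limits:
          nvidia.com/gpu: 1           # request one GPU via the device plugin
```

The key idea is that GPU scheduling is expressed as an ordinary Kubernetes resource limit, so containerized frameworks such as TensorRT deploy the same way as any other workload.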
It is safe to say that NVIDIA EGX has huge and widespread support from developers, who have praised the platform.
NVIDIA EGX servers are tuned for NVIDIA Edge Stack and NGC-Ready validated for CUDA-accelerated containers.
NVIDIA is currently working with 13 different server manufacturers to sell its EGX platform, including Cisco, Dell EMC, HPE, and Lenovo.