"The bulk of attention in AI networking over the past two years has been focused on training — scale-up fabrics, lossless scale-out Ethernet, GPU-to-GPU interconnects," AvidThink principal Roy Chua told Fierce Network. "That's understandable; that's where the money has been. But inferencing is rapidly becoming the dominant AI workload, and the networking requirements for inference are different from training, so there's a market need for networking fabrics to optimize inferencing workloads."
