After raising $750 million in new funding, Groq Inc. is carving out a space for itself in the artificial intelligence inference ecosystem. Groq started out developing AI inference chips and has ...
AMD is strategically positioned to dominate the rapidly growing AI inference market, which could be 10x larger than training by 2030. The MI300X's memory advantage and ROCm's ecosystem progress make ...
Cloudflare's (NET) AI inference strategy differs from that of the hyperscalers: instead of renting out server capacity and aiming to earn multiples on hardware costs, as hyperscalers do, Cloudflare ...
This episode is available to stream on-demand. As data centers adapt to manage huge volumes of data from AI applications, new opportunities are appearing outside of major facilities. In the move from ...
Probabilistic programming languages (PPLs) have emerged as a transformative tool for expressing complex statistical models and automating inference procedures. By integrating probability theory into ...
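As an illustration of the pattern this excerpt describes, the following is a minimal sketch of a probabilistic program, using NumPyro as an assumed example library (the excerpt does not name one): the statistical model is declared as a generative program, and a generic MCMC routine carries out inference automatically.

```python
# Minimal probabilistic-programming sketch (illustrative; NumPyro is chosen here
# as an example PPL and is not named in the excerpt). The model is written as a
# generative program; inference is delegated to a generic MCMC engine.
import jax
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS


def model(data):
    # Prior over the unknown mean of the data.
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    # Likelihood: observations are Gaussian around mu with unit variance.
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=data)


data = jnp.array([2.1, 1.9, 2.3, 2.0, 2.2])  # toy observations

# Automated inference: the NUTS sampler is derived from the model itself,
# with no hand-written inference code.
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(jax.random.PRNGKey(0), data)
print(mcmc.get_samples()["mu"].mean())  # posterior mean of mu
```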
This episode is available to stream on-demand. It discusses the technical nuances of GPU performance and system design for AI and HPC. Expert speakers compare hosted cloud and on-prem ...