Building the current crop of artificial intelligence chatbots has relied on specialized computer chips pioneered by Nvidia, which cornered the market and made itself the poster child of the AI boom. But the same qualities that make those graphics processing units, or GPUs, so effective at creating powerful AI systems from scratch make them less efficient at putting AI products to work. That has opened the AI chip industry to rivals who think they can compete with Nvidia by selling so-called AI inference chips, which are better attuned to the day-to-day running of AI tools and designed to reduce some of the huge computing costs of generative AI.