In a recent panel session at the Digital Workers Forum, Sergey Edunov, Meta’s director of engineering for Generative AI, shared a surprising insight into the power needs of AI applications. He stated that just two new nuclear power plants would be sufficient to meet the growing demand for AI inference over the next year, where “inference” refers to deploying AI to answer questions or make recommendations. Edunov arrived at this ballpark figure by considering the number of H100 GPUs expected to be available next year and the electricity required to run them; two nuclear reactors, he concluded, would be enough to power those GPUs, suggesting that the power needs for AI inference remain relatively manageable at humanity’s current scale.
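To make the arithmetic concrete, here is a minimal back-of-envelope sketch of that kind of estimate. The figures used, roughly 1.5 million H100s available next year, about 700 W per GPU plus a datacenter overhead factor, and roughly 1 GW of electrical output per reactor, are illustrative assumptions, not numbers Edunov quoted.

```python
# Back-of-envelope estimate of the power needed to run next year's H100 fleet.
# All figures below are illustrative assumptions, not Edunov's quoted numbers.

h100_units = 1_500_000             # assumed H100 GPUs available next year
watts_per_gpu = 700                # H100 board power (TDP), in watts
overhead_factor = 1.5              # assumed overhead for cooling, networking, etc.
reactor_output_w = 1_000_000_000   # ~1 GW electrical output for a typical reactor

total_demand_w = h100_units * watts_per_gpu * overhead_factor
reactors_needed = total_demand_w / reactor_output_w

print(f"Estimated fleet demand: {total_demand_w / 1e9:.2f} GW")
print(f"Reactors needed:        {reactors_needed:.1f}")
# Under these assumptions the fleet draws on the order of 1.5 GW,
# i.e. roughly the output of two reactor-scale power plants.
```

Under these assumed inputs, the estimate lands in the same ballpark as Edunov’s conclusion: a couple of reactor-scale plants would cover the inference load.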