Apr 22 (BNP): Global AI development is now being limited less by model capability and more by the availability of computing infrastructure, according to Goldman Sachs.
While AI models continue to improve rapidly in performance and efficiency, the binding constraint has shifted to the computing resources required to train and deploy them at scale: access to advanced semiconductors, large-scale data centers, and sufficient energy supply.
The report highlights a widening gap between fast-moving AI innovation and the physical infrastructure needed to support it. Rising demand for computing power is being driven by the expansion of generative AI, large language models, and enterprise-level automation systems.
Experts note that the next phase of AI growth will depend heavily on investments in chip manufacturing, cloud infrastructure, and energy-efficient computing systems. Without corresponding expansion in these areas, the pace of AI adoption could face structural limitations.
The findings underscore a broader shift in the AI ecosystem—from software-led progress to infrastructure-led scaling—where computing capacity is becoming a critical strategic asset in the global technology race.
