The Spiraling Costs of Developing AI
18 Aug 2024

In the relentless race to unlock AI’s full potential, one thing is clear: bigger is often considered better. Tech giants like Microsoft, Google, and others have wholeheartedly embraced this notion, pouring colossal resources into training ever-larger AI models. As Chair of the Consortium for Sustainable Urbanization in New York (CSU), and having led the United Nations Global Alliance for ICT and Development (UN-GAID), I find myself questioning the cost of this pursuit.
Last year, OpenAI’s GPT-4, a language model with impressive capabilities, came with a jaw-dropping estimated training price tag of $78.4 million. Other tech titans cite similar figures for building their AI systems, leaving me astounded. These astronomical costs stem from an insatiable hunger for computational horsepower, as AI behemoths demand immense resources to train models faster and on ever-increasing amounts of data.
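To see where such figures come from, a rough back-of-envelope sketch helps. The GPU count, training duration, and hourly price below are illustrative assumptions, not OpenAI’s actual numbers; the point is simply that tens of thousands of accelerators running for months add up to tens of millions of dollars.

```python
# Back-of-envelope estimate of large-model training cost.
# All figures are illustrative assumptions, not OpenAI's actual numbers.

gpu_count = 25_000         # assumed number of accelerators running in parallel
training_days = 90         # assumed wall-clock training duration
usd_per_gpu_hour = 1.50    # assumed amortized cost per GPU-hour

gpu_hours = gpu_count * training_days * 24
total_cost = gpu_hours * usd_per_gpu_hour

print(f"GPU-hours: {gpu_hours:,}")            # 54,000,000
print(f"Estimated cost: ${total_cost:,.0f}")  # about $81 million, the same
                                              # order of magnitude as $78.4M
```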
Enter the Memphis Supercluster, built by Elon Musk’s xAI and poised to become the world’s most powerful AI supercluster by December 2024. Its resource appetite is truly colossal: 100,000 liquid-cooled Nvidia H100 cards drawing 150 megawatts of electricity (enough to power 100,000 homes) and requiring one million gallons of water daily for cooling. This approach aligns with the American philosophy that more horsepower equals better performance, often at the expense of sustainability and green practices.
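Those numbers are worth sanity-checking. The short sketch below uses the figures cited above, plus an assumed average household draw of roughly 1.2 kW (a commonly used U.S. approximation), to show they are internally consistent:

```python
# Sanity check of the Memphis Supercluster resource figures cited above.
# site_power_mw and homes_claimed come from the article; the ~1.2 kW
# average U.S. household draw is an assumption used for comparison.

site_power_mw = 150        # continuous electrical draw
homes_claimed = 100_000    # homes the cluster could reportedly power

daily_energy_mwh = site_power_mw * 24                # 3,600 MWh per day
kw_per_home = site_power_mw * 1_000 / homes_claimed  # 1.5 kW per home

print(f"Energy per day: {daily_energy_mwh:,} MWh")
print(f"Implied average draw per home: {kw_per_home} kW")
# ~1.5 kW per home is close to the ~1.2 kW U.S. household average,
# so the "100,000 homes" comparison holds up roughly.
```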
However, I view this horsepower-driven approach as only part of the AI story; efficiency matters just as much. Big tech must move toward developing algorithms that achieve more with less. The quest for ever-larger models should be balanced by creating leaner, smarter algorithms that don’t consume massive amounts of resources. As the world grapples with climate change, we must confront the sustainability crisis posed by AI: the energy-intensive process of training these models demands urgent attention. Ignoring the carbon footprint of our digital future in pursuit of bigger and better AI systems is shortsighted, and is done largely to line the pockets of corporate masters.
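To make “more with less” concrete: a model’s memory footprint shrinks roughly linearly with the number of bytes used per parameter, which is why lower-precision techniques such as quantization are one path to leaner systems. The 70-billion-parameter size below is an assumed example, not any specific model:

```python
# Illustrative arithmetic: serving memory scales with numeric precision.
# The 70B parameter count is an assumption chosen for illustration.

params = 70e9  # assumed model size: 70 billion parameters

for name, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gigabytes = params * bytes_per_param / 1e9
    print(f"{name}: ~{gigabytes:,.0f} GB of weights")

# fp32: ~280 GB, fp16: ~140 GB, int8: ~70 GB, int4: ~35 GB.
# The same model can be served on a fraction of the hardware.
```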
Data quality is another issue that must be addressed when building AI systems. Data is the lifeblood of AI, yet the internet’s vast sea of information is both a blessing and a curse. Publicly available data, scraped from the web, often lacks rigor, introducing biases that skew AI models. We must recalibrate our focus, prioritizing quality over quantity: rigorous data selection, cleaner labeling, and robust oversight are non-negotiable.
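As a minimal sketch of what “quality over quantity” can look like at the lowest level, here is an illustrative curation pass; the thresholds and heuristics are assumptions, and real pipelines add fuzzy deduplication, bias and toxicity screening, and human review on top:

```python
# Minimal illustrative data-curation pass: exact deduplication plus
# simple quality heuristics. Thresholds are assumptions for illustration.

def curate(documents: list[str]) -> list[str]:
    seen: set[str] = set()
    kept: list[str] = []
    for doc in documents:
        text = doc.strip()
        if len(text.split()) < 20:   # drop fragments too short to be useful
            continue
        if text in seen:             # drop exact duplicates
            continue
        alpha_ratio = sum(c.isalpha() for c in text) / len(text)
        if alpha_ratio < 0.6:        # drop markup- or boilerplate-heavy scrapes
            continue
        seen.add(text)
        kept.append(text)
    return kept
```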
As AI becomes woven into our lives, vigilance is paramount. These massive AI goliaths must be trained under strict conditions, with oversight as a fundamental necessity. The safeguards we implement today will shape our AI-driven world tomorrow.
China provides an alternative perspective. Its focus on developing AI systems with efficient algorithms, better-quality data, and greater accountability and transparency sets a sustainable example. I suggest we look more closely at what China is doing, as it is clearly developing AI with sustainability in mind. We must tread carefully, balancing power with responsibility, because our AI future hinges on the choices we make today.