There is a new Artificial Intelligence arms race


DeepSeek: How much should you spend to train an AI? A Decoder discussion on the questions now hanging over the industry

On today’s episode of Decoder, we’re talking about the only thing the AI industry — and pretty much the entire tech world — has been able to talk about for the last week: that is, of course, DeepSeek, and how the open-source AI model built by a Chinese startup has completely upended the conventional wisdom around chatbots, what they can do, and how much they should cost to develop.

The company claims it cost less than $6 million to train its R1 model, while GPT-4 reportedly cost as much as $100 million to train. In a matter of days, DeepSeek went viral, becoming the No. 1 app in the US, and on Monday morning, it punched a hole in the stock market.
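For a sense of where numbers like these come from, here is a minimal back-of-envelope sketch: training cost is roughly rented GPU hours multiplied by an hourly rate. The GPU-hour counts and rates below are illustrative assumptions, not figures reported by DeepSeek or OpenAI.

```python
# Back-of-envelope training cost: GPU hours x hourly rental rate.
# The figures below are illustrative assumptions, not reported accounting
# from DeepSeek or OpenAI.

def training_cost_usd(gpu_hours: float, usd_per_gpu_hour: float) -> float:
    """Estimate training cost as rented GPU time at a flat hourly rate."""
    return gpu_hours * usd_per_gpu_hour

# Hypothetical scenarios for comparison.
scenarios = {
    "smaller cluster, cheaper GPUs": training_cost_usd(2.8e6, 2.0),  # ~$5.6M
    "larger cluster, pricier GPUs": training_cost_usd(25e6, 4.0),    # ~$100M
}

for label, cost in scenarios.items():
    print(f"{label}: ~${cost / 1e6:.1f}M")
```

The point of the sketch is simply that wildly different headline costs can fall out of a few assumptions about cluster size, training time, and hardware pricing.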

Panicked investors wiped more than $1 trillion off tech stocks in a frenzied selloff earlier this week. Nvidia, in particular, suffered a record single-day market value decline of nearly $600 billion when its stock dropped 17 percent on Monday.

If raw computing power is not the path to AGI, then a more crucial question emerges. "We need a fundamentally new learning paradigm," argues Databricks AI VP Naveen Rao; more computation alone won't get us there. His skepticism gained weight recently when DeepSeek, a much smaller China-based AI lab, achieved o1-level performance while allegedly using far less computing power, thanks to more efficient training methods.

As skepticism grows, the promises are getting more grandiose, as the Stargate announcement demonstrated: AI-powered cancer vaccines designed in 48 hours, over 100,000 new American jobs, and US dominance in artificial intelligence. The venture pitches itself as transforming medical research by combining genetics with AI and advancing humanity along the way. AGI is just around the corner, the story goes; it will cure cancer and transform science, if investors just hand over a few more billion dollars.

Ego plays a part, too. Mark Zuckerberg, Elon Musk, and Sam Altman are seemingly locked in a data center measuring contest, constantly one-upping one another through social media posts and press releases. Among the tech elite, the biggest data center empire has become a status symbol, a signal that its owner can reshape the physical world as much as the digital one.

That is, in short, the massive bet AI leaders are making right now. It isn't totally stupid to bet on scale, though. The scaling hypothesis, the idea that if you make an AI bigger and feed it more data and computing power, it gets smarter, is largely what has given us the best models we have today. A lot of this compute will be used for inference, too, so the models can handle millions of user requests simultaneously without the chatbots crashing; keeping those services up and running is, as one infrastructure lead has put it, a very big feat in itself. In theory, Stargate would help with that.
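As a rough illustration of what the scaling hypothesis claims, here is a toy curve in which a stand-in "loss" score improves as training compute grows, but with diminishing returns. The constants are invented for illustration and are not fitted to any real model.

```python
# Toy illustration of the scaling hypothesis: a stand-in "loss" score falls
# as training compute grows, but with diminishing returns toward a floor.
# All constants here are invented for illustration, not fitted to real models.

def toy_loss(compute_flops: float, a: float = 300.0, alpha: float = 0.1,
             floor: float = 1.7) -> float:
    """Power-law-style curve: more compute -> lower loss, approaching a floor."""
    return floor + a * compute_flops ** -alpha

for exponent in (21, 23, 25, 27):  # 1e21 ... 1e27 FLOPs of training compute
    compute = 10.0 ** exponent
    print(f"compute = 1e{exponent} FLOPs -> toy loss ~ {toy_loss(compute):.2f}")
```

Each extra order of magnitude of compute buys a smaller improvement than the last, which is exactly why the bet on ever-bigger clusters is both plausible and expensive.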

According to Kent Draper of data center operator IREN, Stargate's financial targets hold up only in a best-case scenario. Its first-year budget of $100 billion can buy a lot of data centers. The problem is, there's no guarantee that money can be spent quickly and efficiently.

Sam Altman's Stargate echoes another Trump-backed megaproject: a promised $10 billion factory that fizzled

You may remember the last time Trump got to take a victory lap over a promised $10 billion factory: it was later scaled down, most of the promised jobs never materialized, and the project largely fizzled after the initial publicity.

That would be far from the original vision of building enough infrastructure to democratize access to AI computing power. Rather than transforming the industry, Stargate would just be patching its most urgent holes.

Stargate is breaking ground with 10 data centers in Abilene, Texas, a location likely chosen for its untapped renewable energy potential, Draper said. Trump has pledged to fast-track construction through executive orders as the venture eyes expansion beyond Texas. Conventional data centers need to sit close to the cities they serve to keep latency low, but AI training facilities can prioritize power access over location. (When the goal is to use the compute for inference, though, the siting of the facilities gets trickier.)

Older data centers simply can't handle that sort of power density, and it isn't just a matter of adding more electricity. They lack the infrastructure to deliver that much power, and they need cooling systems that can handle the intense heat.

Source: Sam Altman’s Stargate is science fiction

OpenAI is losing money, and Elon Musk, the White House's "first buddy," says the backers don't have the money to build Stargate

This doesn't look right, and it doesn't feel right. Despite its rapid growth, OpenAI's ballooning compute costs mean it is losing money even on customers paying $200 per month. At this burn rate, even the deepest pockets will eventually run dry. SoftBank CEO Masayoshi Son is an investor known for throwing money at moonshot ideas, only to yank the funding when reality catches up; maybe an investor like Son is the last resort. Just look at the latest headline: OpenAI is reportedly looking to raise another $25 billion from SoftBank in a new funding round that would value the startup at a jaw-dropping $340 billion.

The numbers are odd, and Elon Musk, the White House's "first buddy," wasted no time saying out loud what everyone else was whispering. "They don't actually have the money," Musk posted soon after the announcement, adding that SoftBank had well under $10 billion secured, a claim he said he had "on good authority."

On top of that, SoftBank has a complicated track record as an investor; just look at the Vision Fund, which produced the company's disastrous backing of WeWork.

Altman replied that Musk's post was wrong and that the first site was already underway, but doubts still rippled through tech circles. In a CNBC interview, Microsoft CEO Satya Nadella dodged questions about the money, insisting he wasn't privy to the details of the investment. When pressed about Stargate's funding, he would only confirm Microsoft's own annual $80 billion Azure investment. "All I know is I'm good for my $80 billion," he said with a chuckle. Arm CEO Rene Haas also gave the venture's financial foundation a vote of confidence.

The rest of the money is supposed to come from outside investors and debt financing. SoftBank is no stranger to debt; it already carries $150 billion on its books. And Altman will likely tap the United Arab Emirates for additional capital. Even so, there's a massive gap between $45 billion and $500 billion, and even the most aggressive capital-raising efforts have limits.

Source: Sam Altman’s Stargate is science fiction

Musk has his own axe to grind, Trump is brushing him off, and the Microsoft-OpenAI partnership has frayed

Musk's ongoing lawsuit against OpenAI means he is hardly a neutral observer, and at this point the conversation has devolved into a mix of petty sniping and bloviation. Trump brushed off questions about Musk's funding concerns: "I don't know if they do, but you know, they're putting up the money," he said, adding that the government isn't putting money into the project. He said he hoped the backers would come through, and suggested that Musk simply doesn't like one of the people involved.

Plans for a massive Microsoft-OpenAI data center surfaced last year, long before Stargate was billed as a landmark Trump-era project. But Microsoft reportedly grew wary of OpenAI's ballooning computing costs, and those plans fell apart. So Altman traded Microsoft's deep pockets for a more complex web of funding partners.