Scale is only part of the equation, according to the head of the artificial intelligence company


Does Computing Power Really Matter for Artificial General Intelligence? Demis Hassabis on Gemini’s Advances and OpenAI

The past few years have seen amazing advances driven by the increased amount of computing power used to train artificial intelligence models. OpenAI CEO Sam Altman is reportedly looking to raise as much as $7 trillion for more chips. Is computing power what will get us to artificial general intelligence?

We invented mixture of experts—[Google DeepMind chief scientist] Jeff Dean did that—and we developed a new version of it. The newest version of Gemini has almost the same performance as the largest model of the previous generation of the architecture. We’re working on an Ultra-sized model with these innovations, and we don’t have to stop there.
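For context, a mixture-of-experts layer routes each token through only a few of many specialized “expert” sub-networks, so a model’s capacity can grow without every token paying the cost of the full network. The following is a minimal illustrative sketch of top-k expert routing in plain NumPy; the class name, dimensions, and two-layer experts are assumptions for illustration, not Gemini’s actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    """Minimal top-k mixture-of-experts layer (illustrative only)."""

    def __init__(self, d_model=64, d_hidden=128, num_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.w_router = rng.normal(0, 0.02, (d_model, num_experts))
        # Each expert is a small two-layer MLP.
        self.w1 = rng.normal(0, 0.02, (num_experts, d_model, d_hidden))
        self.w2 = rng.normal(0, 0.02, (num_experts, d_hidden, d_model))

    def __call__(self, tokens):
        # tokens: (num_tokens, d_model)
        gate_probs = softmax(tokens @ self.w_router)            # (T, num_experts)
        top_experts = np.argsort(-gate_probs, axis=-1)[:, :self.top_k]
        out = np.zeros_like(tokens)
        for t, x in enumerate(tokens):
            # Only the top-k experts run for this token; their outputs are
            # blended using the router's probabilities as weights.
            for e in top_experts[t]:
                hidden = np.maximum(x @ self.w1[e], 0.0)        # ReLU expert MLP
                out[t] += gate_probs[t, e] * (hidden @ self.w2[e])
        return out

# Example: 4 tokens, each routed to the 2 highest-scoring of 8 experts.
layer = MoELayer()
print(layer(np.random.default_rng(1).normal(size=(4, 64))).shape)  # (4, 64)
```

The key design point is sparsity: adding more experts increases the number of parameters, but each token still activates only `top_k` of them, so per-token compute stays roughly constant.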

Demis Hassabis: You can now ingest a reasonable-sized short film. If you’re learning about a topic covered in an hour-long lecture, you could use the model to look up a fact or find the moment when something was said. I think there are going to be a lot of really cool use cases for that.

Several ideas the company researched were left out because of concerns about how they would be used. Hassabis has overseen a dramatic acceleration in the pace of research and releases in recent months, driven by the rapid development of multimodal AI models. Last week the company announced Gemini Pro 1.5, a major upgrade to the model behind the free version of Gemini that is more powerful and can analyze vast amounts of text, video, and audio at a time.

Ever since Alphabet formed Google DeepMind by merging two of its AI-focused divisions last April, Hassabis has been responsible for corralling its scientists and engineers to counter both OpenAI’s remarkable rise and its collaboration with Microsoft, which is seen as a potential threat to Alphabet’s cash-cow search business.

OpenAI, riding a wave of excitement generated by a program called ChatGPT, had looked all but impossible to knock off its perch atop the tech industry.

Willow Primack, VP of data operations at Scale AI, says that Remotasks and others are turning to subject-matter experts for data labor in response to a major shift in how AI systems are applied, as these systems start to produce knowledge and content. As the tech industry has rushed to embrace generative AI over the past year and applied it to more sophisticated tasks, data providers have needed a new intake of contractors capable of what Primack calls “expert fact-checking.”

Jay is thoughtful about his role in the future of work. “It’s true,” he says. “I am passing on knowledge that I have and that the machine does not have.” He is aware that artificial intelligence models cannot yet reproduce human ingenuity on math problems. He hopes his work will help create an artificial intelligence that does not replace him but instead helps him practice his chosen subject. “I imagined that when I started training these models,” he says.

Jay says he knew he was training algorithms for the company overseen by Sam Altman because he was invited to join OpenAI’s Slack workspace. A screenshot he shared with WIRED shows he was part of a group called “math trainers” that was set up by the OpenAI researcher Yuri Burda. But Jay was not working directly for the famous AI company. He was being paid by a subsidiary of a US startup called Scale AI, which is valued at over $7 billion and counts Microsoft, Meta, and the US Army among its clients.