Watch a Musician Demo an AI DJ


A mixer-style interface for real-time, prompt-based music mixing: DJ Mode in MusicFX, demoed at Google's I/O keynote

Developer conferences aren’t exactly known for having an energetic, party-like atmosphere, but thankfully, that didn’t stop Google’s latest hype man. The company opened its I/O event this year with an artist known in online spaces for combining improvised electronic tracks with amusing (and typically loud) vocals. He also wears a lot of robes.

Google also showed off its new DJ Mode in MusicFX, an AI music generator that lets musicians create song loops and samples based on prompts. DJ Mode was demonstrated during Marc Rebillet’s performance leading into the I/O keynote.

The interface works like this: users type text prompts into a mixer-style interface, which spits out music loops based on each prompt and syncs them into a single track. Adding more prompts to the mix changes the music in real time. You can try it out yourself now over on Google’s AI Test Kitchen. MusicFX, introduced last year, is still in development.

AI Updates for Google Search, a Turntable Repair Demo, and Scam Detection for Android

Rebillet has over 2 million followers on both YouTube and TikTok, where he’s best known for his viral “Night Time Bitch” sound clip and songs in which he screams at people to get out of bed while wearing a bathrobe. He opened I/O by clambering out of a giant coffee mug, yelling at all the silly little nerds to wake up, and firing rainbow-colored robes into the crowd.

WIRED’s Lauren Goode talked with Google’s head of Search, Liz Reid, about all the AI updates coming to Google Search and what they mean for the internet as a whole.

Among the new capabilities: more tightly organized, easier-to-read search results, as well as better responses to longer queries and to searches that include photos.

We also saw AI Overviews, short summaries that pool information from multiple sources to answer the question you entered in the search box. These summaries appear at the top of the results, so you don’t even need to visit a website to get the answers you’re seeking. The overviews are already controversial: publishers and websites fear that a Google search that answers questions without requiring users to click any links may spell doom for sites that already go to extreme lengths to show up in Google’s search results in the first place. AI Overviews will be available to everyone in the US starting today.

Lastly, we saw a quick demo of how users can rely on Google Lens to answer questions about whatever they’re pointing their camera at. These abilities, which sound similar to Project Astra, are being built into Lens in a slightly different way. The demo showed a woman trying to get a “broken” turntable to work: through the camera, Google identified the make and model of the turntable, determined that the tonearm simply needed adjusting, and presented her with a few options for video and text-based instructions on how to do just that.

The keynote showed off a new scam detection feature for Android that can listen in on your phone calls and detect when a scam may be taking place, such as a caller asking you to move money into a different account. If it hears you getting duped, it’ll interrupt the call and give you an on-screen prompt suggesting that you hang up. Google says the feature works on the device, so your phone calls don’t go to the cloud for analysis, making it more private. Check out the guide to protecting yourself and your loved ones from AI scam calls.

Google has also expanded SynthID, its watermarking tool meant to identify media made with AI, which can help you detect misinformation, deepfakes, or phishing spam. The tool leaves a watermark that’s invisible to the eye but can be detected by software that analyzes the data in the image. The new updates extend SynthID to content generated in the Gemini app and on the web, as well as to Veo-generated videos. Google says it plans to release SynthID as an open-source tool later this summer.