What Will Reimagine Do? The Google Pixel 9 and Google’s Promises for AI Photo Editing
Google is the latest phone company this year to announce AI photo editing tools, following Samsung’s somewhat troubling, mostly delightful sketch-to-image feature and Apple’s seemingly much tamer Image Playground coming this fall. Google’s entry is a new tool called Reimagine, which I used for a week with a few of my colleagues, and I am more convinced than ever that none of us is ready for what’s coming.
A couple of my colleagues helped me test the boundaries of Reimagine with their Pixel 9 and 9 Pro review units, and we got it to generate some very disturbing things. Some of it required creative prompting to work around the obvious guardrails, but if you choose your words carefully, you can get it to create a reasonably convincing body under a blood-stained sheet.
We added car wrecks, smoking bombs in public places, and sheets that appear to cover dead bodies to images during our week of testing. That seems bad. This isn’t some piece of specialized software we went out of our way to use; it’s all built into a phone that my dad could buy.
Google claims the Pixel 9 will not be an unfettered bullshit factory, but it is thin on substantive assurances. According to a communications manager for the search giant, the tools are designed to respect the intent of user prompts, which means they may create content that offends when users instruct them to do so. “That said, it’s not anything goes. We have clear policies and terms of service on what types of content we allow and don’t, and we build guardrails to prevent abuse. At times, some prompts can challenge these tools’ guardrails, and we remain committed to continually enhancing and refining the safeguards we have in place.”
Our prompting to work around filters is a clear violation of these policies. It’s also a violation of Safeway’s policies to ring up your organic peaches as conventionally grown at the self-checkout, not that I know anyone who would do that. And someone with the worst intentions isn’t concerned with Google’s terms and conditions, either. What’s most troubling about all of this is the lack of robust tools to identify this kind of content on the web. Our ability to make problematic images is running way ahead of our ability to identify them.
How Can You Tell an Image Was Edited with Reimagine? Metadata, Watermarks, and the Limits of AI Detection
When you edit an image with Reimagine, there’s no watermark or any other obvious way to tell that the image is AI-generated; there’s just a tag in the metadata. That’s all well and good, but standard metadata is easily stripped from an image simply by taking a screenshot. Google told us it uses a more robust system to tag images created with its Pixel Studio app, but images edited with Magic Editor don’t get those tags.
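To make the fragility concrete, here is a minimal sketch in plain Python of why a metadata-based label is so easy to lose. The byte layout is deliberately simplified, and the “AI-edited” tag and file contents are hypothetical, but the principle matches real JPEGs: provenance metadata lives in its own segment (an APP1/EXIF segment), and anything that copies only pixel data, like a screenshot, drops it entirely.

```python
# Sketch: a provenance label stored in a JPEG-style APP1 metadata
# segment vanishes when only the image data is copied.
# Simplified layout; real JPEGs have more segments than this.
import struct

SOI = b"\xff\xd8"   # JPEG start-of-image marker
APP1 = b"\xff\xe1"  # marker for the (EXIF) metadata segment

def add_label(jpeg: bytes, label: bytes) -> bytes:
    """Insert a simplified APP1 metadata segment right after SOI."""
    payload = b"Exif\x00\x00" + label
    # Per the JPEG convention, the length field counts itself + payload.
    segment = APP1 + struct.pack(">H", len(payload) + 2) + payload
    return jpeg[:2] + segment + jpeg[2:]

def strip_metadata(jpeg: bytes) -> bytes:
    """Drop every APP1 segment -- roughly what a screenshot does."""
    out, i = bytearray(jpeg[:2]), 2
    while i < len(jpeg):
        if jpeg[i:i + 2] == APP1:
            seg_len = struct.unpack(">H", jpeg[i + 2:i + 4])[0]
            i += 2 + seg_len  # skip marker plus the whole segment
        else:
            out.append(jpeg[i])
            i += 1
    return bytes(out)

fake_jpeg = SOI + b"...pixel data..."
labeled = add_label(fake_jpeg, b"AI-edited (hypothetical tag)")
stripped = strip_metadata(labeled)  # label gone; bytes match the original
```

This is also why more robust schemes embed an invisible watermark in the pixels themselves rather than in a metadata sidecar: the pixels are the one thing a copy has to preserve.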
To be sure, tampering with photos is nothing new; people have been adding weird and deceptive things to images since the beginning of photography. But it has never been this easy. Convincingly adding a car crash to an image used to take time, expertise, and access to expensive software. Those barriers are gone; all it takes now is a bit of text, a few moments, and a new Pixel phone.
You can already see the shape of what’s to come. In the Kyle Rittenhouse trial, the defense claimed that Apple’s pinch-to-zoom manipulates photos, successfully persuading the judge to put the burden of proof on the prosecution to show that zoomed-in iPhone footage was not manipulated. Donald Trump claimed that a photo of a well-attended rally was generated by an artificial intelligence program, and people were able to believe him.
It is possible that everyone will read and abide by Google’s policies and use Reimagine only to put wildflowers and rainbows in their photos. That would be wonderful! But just in case they don’t, you might want to start applying extra skepticism to the photos you see online.
We briefly lived in a time in which the photograph was a shortcut to knowing things, to having a smoking gun. It was an extraordinarily useful tool for navigating the world around us. We are now leaping into a future in which reality is simply less knowable. The Library of Alexandria could fit on the microSD card in my Nintendo Switch, and yet the cutting edge of technology is a handheld telephone that spews lies as a fun bonus feature.
Google describes the Pixel’s editing tools as a way to help you create an image the way you remember it, something that is authentic to your memory. In this world, a photo is not a supplement to fallible human recollection but a mirror of it. And as photographs become less and less believable, the dumbest shit will devolve into courtroom fights over witnesses’ reputations and corroborating evidence.
Even before the advent of artificial intelligence, the media was in a defensive crouch, scrutinizing the details and provenance of every image and vetting for misleading context or photo manipulation. After all, every major news event comes with an onslaught of misinformation. But the incoming paradigm shift implicates something more fundamental than the grind of suspicion that is sometimes called digital literacy.
The persistent cry of “Fake News!” from Trumpist quarters presaged the beginning of this era of bullshit, in which the impact of the truth is deadened rather than sharpened. The next Abu Ghraib will be buried under a sea of AI-generated war-crime snuff. The next George Floyd will go unnoticed and unvindicated.
For the most part, the average image created by these AI tools will, in and of itself, be pretty harmless — an extra tree in a backdrop, an alligator in a pizzeria, a silly costume interposed over a cat. In aggregate, the deluge upends how we treat the concept of the photo entirely, and that in itself has tremendous repercussions. The last decade saw a lot of social upheaval in the United States caused by videos of police brutality. Where the authorities obscured or concealed reality, these videos told the truth.
Why Did We Trust Photos? Reality and Social Consensus in the AI Photo Era
The onus has been on those denying the truth of a photo to prove their claims. The flat-earther is out of step with the social consensus not because they do not understand astrophysics — how many of us actually understand astrophysics, after all? — but because they must engage in a series of increasingly elaborate justifications for why certain photographs and videos are not real. They must invent a vast state conspiracy to explain the steady output of satellite photographs that capture the curvature of the Earth. They must conjure up a soundstage for the 1969 Moon landing.
No one on Earth today has ever lived in a world where photographs were not the linchpin of social consensus — for as long as any of us has been here, photographs proved something happened. Consider all the ways the assumed veracity of a photograph has validated the truth of your own experiences: the damaged fender on your rental car, the leak in your ceiling, the arrival of a package, an actual, non-AI-generated cockroach in your takeout. When wildfires encroach upon your neighborhood, how do you convey to friends and acquaintances the thickness of the smoke outside?
If I say Tiananmen Square, you will, most likely, envision the same photograph I do. The same goes for Abu Ghraib or napalm girl. These images have defined wars and revolutions; they have encapsulated truths that are difficult to fully express. There was never a need to articulate why these photos matter or why we place so much value in them. Our trust in photography was so deep that when we did spend time discussing veracity in images, it was to belabor the point that photographs could, sometimes, be fake.
Source: No one’s ready for this
An Explosion on the Side of an Old Building, a Cockroach in a Box of Takeout, and Other Extremely Fucking Fake Photos from the Pixel 9’s Magic Editor
Anyone who purchases this phone from an authorized retailer will have the simplest user interface for top-tier lies built right into their mobile device. This is all but certain to become the norm, with similar features already available on competing devices and rolling out on others in the near future. When a device works well, that is usually a good thing, but here, the fact that it works so well is the whole problem.
An explosion on the side of an old brick building. A crashed bicycle in a city intersection. A cockroach in a box of takeout. It took less than 10 seconds to create each of these images with the Reimagine tool in the Pixel 9’s Magic Editor. They are crisp. They are in full color. They are high-fidelity. There is no suspicious background blur, no tell-tale sixth finger. These photographs are extraordinarily convincing, and they are all extremely fucking fake.