The Washington Times reports that Paul Pelosi, husband of House Speaker Nancy Pelosi, was attacked with a hammer by an intruder.
And the effects of online harm against women are chilling. We can look to research done in societies where women face more social restrictions to see the impact. One study found that women in Azerbaijan chose to opt out of online socializing because of the risk of real-world repercussions from online harassment. In other words, women faced an impossible double standard: unable to control their image on social media but punished severely for it.
An intruder broke into the California home of House Speaker Nancy Pelosi early Friday and attacked her husband Paul Pelosi with a hammer while she was in Washington, DC. The speaker’s office said that Pelosi would make a full recovery after undergoing surgery to repair a skull fracture and other injuries.
The suspected attacker, David DePape, who has a history of sharing conspiracy theories on social media, said he would wait “until Nancy got home,” according to a source briefed on the attack. Chief William Scott of the San Francisco Police Department said that DePape was taken into custody on suspicion of attempted homicide, assault with a deadly weapon, elder abuse and several other felonies.
The attack on the Pelosi home shows just how real the threat of violence against members of Congress has become. Social media companies should stop hosting content that incites violence, and the FBI should investigate whether such content can be prosecuted. They should treat this horrific episode as the wake-up call that it is, and not wait for Sen. Collins’ prediction that a member of Congress may end up dead to come true.
From 2017 to 2021, threats against members of Congress investigated by US Capitol Police increased by 144%, Axios reported. Many of the lawmakers on the receiving end of those threats are women and people of color.
After a window of Sen. Susan Collins’ home was smashed, she told The New York Times that threats of violence against lawmakers had been escalating. She said she would not be surprised if a lawmaker were killed.
Democratic Rep. Pramila Jayapal has been harassed by a man who showed up repeatedly outside her home, armed with a handgun. Jayapal’s husband said he heard the voices of two men shouting obscenities and suggesting they would stop coming to her neighborhood if she killed herself.
Lawmakers sign up for a lot when they take the job, Jayapal said, but it is difficult to describe having someone show up at your door with a gun, scaring your neighbors, scaring your staff and clearly trying to intimidate you.
And Democratic Rep. Alexandria Ocasio-Cortez of New York receives so many threats that she has a round-the-clock security team and, at times, sleeps in different locations. Her Republican colleague Paul Gosar posted an altered video to his account that depicted him killing her. (Gosar deleted the video and did not apologize. He was censured by the House and removed from two committee assignments, but only hours later he shared a post containing the video again.)
A rogue employee at the New York Post was fired this week after changing the headline of an online editorial to call for the killing of an elected official.
The Attack on Pelosi: The Role of Social Media in Online Misogyny, Sexualized Hate, and Image Abuse
Pelosi is a particular target of hatred among the right. The House Speaker, who has famously clashed with former President Donald Trump, was the subject of manipulated videos that made her appear to slur her words. Those videos were then amplified by both Trump and his personal lawyer Rudy Giuliani on social media, where they went viral. During the attack on the Capitol last January, Trump supporters went to her office and yelled, “Where are you, Nancy?”, which was eerily similar to the words DePape uttered on Friday: “Where is Nancy?”
Social media companies claim they don’t tolerate this kind of hate, yet the reality is that it remains on their platforms. They need to get serious about taking down the abuse, and act immediately when users report online hate, as with the Gosar video.
It is sobering that the attack on Pelosi happened just as Elon Musk finalized his purchase of Twitter, given that he has said he favors looser content moderation. If sexism and abuse become a bigger problem on any platform, users should vote with their feet and leave.
The FBI should investigate the abuse of women on the internet. If the agency needs more funding to do it, Congress should levy a tax on social networks to fund an expansion of resources. Given the threats they have received, many lawmakers would likely be happy to cast a vote in favor of such a bill.
Lensa requires users to submit only appropriate content: no nudity, no kids, adults only. Yet women, who make up the majority of users, have noticed that the app not only generates nudes but also ascribes sexualized features to their images. I received several fully nude results despite uploading only headshots. Some women of color told me that Lensa whitened their skin and anglicized their features to make them appear more Caucasian, and one woman of Asian descent told me that in the photos where she didn’t look white, the app gave her an ahegao face. When I compared the fully clothed images she uploaded with the topless results the app produced, I was shocked; she said she felt very violated after seeing them.
AI-enabled intimate image abuse, in which images are combined or generated to create new, often realistic ones known as deepfakes, is another weapon of online abuse that disproportionately impacts women. Estimates from Sensity AI suggest that 90 to 95 percent of all online deepfake videos are nonconsensual porn, and around 90 percent of those feature women. The technology to create realistic deepfakes is now outpacing our ability and efforts to combat it. The barriers to entry for creating harmful fakes are low, the fakes are getting more realistic by the day, and the tools for detecting and combating abuse aren’t keeping up.
I used to feel violated by the internet. I have been the victim of harassment campaigns and have seen my image altered, distorted, and distributed without my consent. Because I am a sex worker, hunting down and distributing my likeness is, for some, a sport. Sex workers are not viewed by the general public as deserving of fundamental rights, so this behavior is celebrated rather than condemned. Because sex work is so often presumed to be a moral failing rather than a job, our dehumanization is redundant. I’ve logged on to Twitter to see my face photoshopped onto other women’s bodies, pictures of myself and unclothed clients in session, and once even a word search composed of my face, personal details, and research interests. I don’t fear Lensa.
Desensitized as I am, I decided to be my own lab rat. I ran a few experiments: first, only BDSM and dungeon photos; next, my most feminine photos under the “male” gender option; later, selfies from academic conferences. All of these produced spectacularly sized breasts and full nudity.
I have pictures of myself from childhood through my late teens, and between my unruly hair, uneven teeth, and the bifocals I started wearing at age seven, my appearance could most generously be described as “mousy.” Taken as they were when I was a child, most of those pictures survive only in distant relatives’ photo albums. I was eager to see how the app would transform me from an awkward six-year-old into a fairy princess.
In some instances, the AI seemed to recognize my child’s body and mercifully neglected to add breasts; it apparently read my flat chest as belonging to an adult man. In other photos, the AI attached orbs to my chest that were distinct from clothing but also unlike the nude images my other tests had produced.
This time, I used a mixture of childhood photos and selfies. What resulted were fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body. Similar to my earlier tests that generated seductive looks and poses, this set produced a kind of coyness: a bare back, tousled hair, an avatar with my childlike face holding a leaf between her naked adult’s breasts. Many were eerily reminiscent of Miley Cyrus’ 2008 photoshoot with Annie Leibovitz for Vanity Fair, which featured a 15-year-old Cyrus clutching a satin sheet around her bare body. There was a disturbing juxtaposition of her makeup-free, almost cherubic face with the body of someone who was implied to have just had sex.
There are answers. Better safety-by-design measures can help people control their images and their messages: platforms now let people control how they are tagged in photos, and Bumble’s Private Detector blurs unsolicited nude images so dating-app users can choose whether to view them. Legislation, such as the UK’s proposed Online Safety Bill, can push social media companies to address these risks. The bill is far from perfect, but it takes a systems-based approach to regulation, requiring platform companies to assess risks and develop upstream solutions such as improving human content moderation, handling user complaints better, and building stronger systems to protect users.
Without this kind of regulatory approach, women are likely to keep logging off in large numbers. If they do, they will miss out on the benefits of being online, and our online communities will suffer.