TikTok has been banned from devices issued by the US House of Representatives.


Privacy and Security Concerns: ByteDance Reportedly Planned to Track Americans

State and federal officials are looking at ways to crack down on TikTok over privacy and security concerns, as well as determining whether the app is appropriate for teens. Executives from social media platforms, including TikTok, faced tough questions from lawmakers during a series of Congressional hearings over how their platforms can direct younger users to harmful content.

The report said that ByteDance's Internal Audit team planned to surveil at least two Americans who had never worked for the company. Forbes said its reporting was based on materials it reviewed, but it did not reveal why ByteDance was planning to track them or who they might be.

Just as our relationship with media shifted when it entered our homes, it has continued shifting as it invades our smartphones. These devices, which are tightly integrated into the ways that we think and process information, have allowed TikTok to position itself as an extension of our minds. If we want to extricate ourselves from the app’s grasp, we must first understand how the mind works in the age of the technologized self.

Take, for example, the transition from cinema to TV that occurred in the mid-20th century and allowed moving images to enter our homes. Once confined to the theater, this content began to live alongside us: we watched it as we got ready in the mornings, ate dinner, hosted guests, and spent time with family. Marshall McLuhan observed that the mechanics of how we received, processed, and related to moving pictures changed when they were taken out of the dark, anonymous spaces of the theater and placed in our domestic ones. As the newer features of our dwellings become ingrained in our perception of being in the world, they take on a familiar casualness. Donald Horton and R. Richard Wohl wrote that viewers developed "parasocial" relationships with the people they saw through their screens. Home audiences grew to see these mass media personas as confidants and friends, giving broadcasters the means to manipulate audiences on a more personal level.

Once, platforms sought to be device-agnostic, universal purveyors of content that would be accessible to anyone who might want it. As Kyle Chayka notes, this allowed companies to promise users that they could use any device to transcend particularities like nationality, identity, or class and “follow anything or anyone” they wanted when on the site. Google’s mission to “organize the world’s information and make it universally accessible” is in many ways emblematic of this logic. Discussions have rarely focused on the specifics of our encounter with these platforms—the instruments used, context, or materiality.

The companies promised to change after the hearings, which followed disclosures from a Facebook whistleblower. But findings from the Center for Countering Digital Hate (CCDH) suggest more work may need to be done.

A research director at a market research firm said that social media platforms' responses to the ills on their platforms lacked substance. Since their solutions largely amount to blocking access, she said, the onus falls on guardians to ensure that all parental controls are actually activated.

In response, Discord recently refreshed its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk with teens about online safety. An option is available to prohibit a minor from receiving a friend request or a direct message from someone they don't know.

After the fallout from the leaked documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under age 13 and focused on making its main service safer for young users.

Meta's Family Center offers a wealth of resources, including articles and advice from leading experts. "Our vision for Family Center is to eventually allow parents and guardians to help their teens manage experiences across Meta technologies, all from one place," Liza Crenshaw, a Meta spokesperson, told CNN Business.

One feature nudges users to take a break from the app, suggesting that they take a deep breath, write something down, check a to-do list, or listen to a song. Instagram also said it's taking a "stricter approach" to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they've been dwelling on any one type of content for too long.

While this was Snapchat's first formal foray into parental controls, it did previously have a few safety measures for young users, such as requiring teens to be mutual friends before they can start communicating and prohibiting them from having public profiles. Teen users have the Snap Map location-sharing tool off by default, but they can use it to share their real-time location with a friend or family member, even while the app is closed, as a safety measure. The Friend Check Up tool encourages users to review their friend lists and make sure they still want to be in touch with certain people.

TikTok's Safety Tools: Maturity Ratings, Screen-Time Limits, and Restrictions for Minors

The company told CNN Business it will continue to build on its safety features and consider feedback from the community, policymakers, safety and mental health advocates, and other experts to improve the tools over time.

In July, TikTok announced new ways to remove mature and potentially problematic videos. The new safeguards assign a maturity score to videos that may contain mature or complex themes. The company also rolled out a tool that aims to help people decide how much time they want to spend on the app; it gives users a dashboard showing a breakdown of their daily usage and how many times they have opened the app that day.

In addition to parental controls, the app restricts access to some features to younger users, such as Live and direct messaging. A pop-up also surfaces when teens under the age of 16 are ready to publish their first video, asking them to choose who can watch the video. Push notifications are curbed after 9 p.m. for account users ages 13 to 15, and 10 p.m. for users ages 16 to 17.

Discord did not appear before the Senate last year but the popular messaging platform has faced criticism over difficulty reporting problematic content and the ability of strangers to get in touch with young users.

Still, it’s possible for minors to connect with strangers on public servers or in private chats if the person was invited by someone else in the room or if the channel link is dropped into a public group that the user accessed. By default, all users — including users ages 13 to 17 — can receive friend invitations from anyone in the same server, which then opens up the ability for them to send private messages.

Lawmakers Move Against TikTok Over Data Security Concerns

The bipartisan bill is sponsored by a leader of the Senate Intelligence Committee and two other members of Congress. TikTok has long faced doubts about its ability to safeguard US user data from the Chinese government.

The proposed legislation would “block and prohibit all transactions” in the United States by social media companies with at least one million monthly users that are based in, or under the “substantial influence” of, countries that are considered foreign adversaries, including China, Russia, Iran, North Korea, Cuba and Venezuela.

The US House of Representatives has ordered staff to delete TikTok from any House-issued mobile phones, according to an internal memo obtained by NBC News. Catherine L. Szpindor, the chief administrative officer of the House, has banned the popular social media app from being downloaded on House-issued devices.

The posturing comes at a pivotal moment in the years-long negotiations between TikTok and the US government on a potential deal that aims to address national security concerns and allow the app’s continued use in the US.

McQuaide said that he would continue to brief Congress on the plans that were developed under the oversight of the country’s top national security agencies.

A version of this article appears in the "Reliable Sources" newsletter, a daily digest chronicling the evolving media landscape.

But its widespread usage across the U.S. is alarming government officials. In November, FBI Director Christopher Wray raised eyebrows after he told lawmakers that the app could be used to control users’ devices.

The Senate-passed bill would provide exceptions for “law enforcement activities, national security interests and activities, and security researchers.”

Study Finds TikTok Quickly Serves Teens Eating Disorder and Self-Harm Content

TikTok has reached one billion monthly users. In the US, two-thirds of teens say they use it, according to the Pew Research Center.

In a report published Wednesday, the non-profit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see content related to suicide and about five more minutes to find a community promoting eating disorder content.

The results are every parent’s nightmare: young people are bombarded with harmful, harrowing content that can have a significant impact on their understanding of the world around them, and their physical and mental health.

TikTok argues that the study misrepresents the actual viewing experience on the platform, pointing to the small sample size, the limited 30-minute testing window, and the way the accounts scrolled past a series of unrelated topics.

“This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people,” the TikTok spokesperson told CNN. “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”

The spokesperson said the CCDH does not distinguish between positive and negative videos on given topics, adding that people often share empowering stories about eating disorder recovery.

After CNN sent a sample of five accounts to TikTok for comment, the company removed them, saying they all broke its policies against encouraging eating disorders.

The spokesperson told CNN that when someone searches for banned words or phrases such as #selfharm, they will see no results and will instead be redirected to local support resources.

Washington Courts TikTok Creators Even as It Struggles to Rein In Big Tech

The White House held a call with TikTok creators on March 10, after Russia invaded Ukraine. Jen Psaki, then the White House press secretary, and members of the National Security Council staff briefed the creators, who together had tens of millions of followers, on the latest news from the conflict and the White House's goals and priorities. The meeting followed a similar effort the previous summer, in which the White House recruited dozens of TikTokers to help encourage young people to get vaccinated against Covid.

But it isn’t just lobbying that has made some of these bills difficult to pass. It’s much more challenging to impose sweeping regulations on an entire industry than it is to pass a bill governing how the US government handles its own technology.

The tech industry's largest players have faced a kitchen sink of allegations in recent years: kneecapping nascent rivals, harming children's mental health, undermining democracy, spreading hate speech and harassment, censoring conservative viewpoints, and bankrupting local news outlets. Big Tech has been made out as one of Washington's biggest villains.

There's no evidence yet that TikTok has handed US user data to the Chinese government, but critics say China's national security laws make it possible, and broader tensions over trade, human rights and authoritarianism have sharpened the worry. Those concerns were renewed after a report this year suggested US user data had been repeatedly accessed by China-based employees. TikTok has disputed the report.

Beckerman told CNN on Tuesday that he thinks the concerns are overblown and that they can be solved through the ongoing government negotiations.

Source: https://www.cnn.com/2022/12/22/tech/washington-tiktok-big-tech/index.html

Big Tech Outspends ByteDance in the Lobbying Game

ByteDance spent over $300,000 on lobbying in 2015, according to records obtained by OpenSecrets. By the end of last year, its lobbyist count had more than doubled and the company had spent nearly $5.2 million on lobbying.

Meta was the biggest internet industry lobbying giant last year, spending upward of $20 million. Next was Amazon at $19 million, then Google at almost $10 million. Combined, that’s roughly $49 million in lobbying — almost 10 times what was spent by TikTok’s parent, which nevertheless clocked in at number four on the list.

One of those bills, the American Innovation and Choice Online Act (AICOA), would erect new barriers between tech platforms’ various lines of business, preventing Amazon, for example, from being able to compete with third-party sellers on its own marketplace. That legislation was a product of a 16-month House antitrust investigation into the tech industry that concluded, in 2020, that many of the biggest tech companies were effectively monopolies.

For a brief moment this month, lawmakers seemed poised to pass a bill that could force Meta, Google and other platforms to pay news organizations a larger share of ad revenues. The legislation collapsed after Meta warned it might have to remove news content from its platforms if the bill passed.

Silicon Valley’s top players have shown they can defend their turf in Washington when Congress wants to knock them down.

By contrast, debates over the rules the government might impose on tech platforms have raised questions about how those rules would affect different parts of the economy, from small businesses to individual users.

In some cases, as with proposals to revise the tech industry’s decades-old content moderation liability shield, Section 230 of the Communications Decency Act, legislation may raise First Amendment issues as well as partisan divisions. Democrats have said Section 230 should be changed because it gives social media companies a pass to leave some hate speech and offensive content unaddressed, while Republicans have called for changes to the law so that platforms can be pressured to remove less content.

The cross-cutting politics and the technical challenges of regulating an entire sector of technology, not to mention the potential consequences for the economy of screwing it up, have combined to make it genuinely difficult for lawmakers to reach an accord.

State TikTok Bans and Their Consequences for Education and Research

Nebraska has banned TikTok on state-issued devices since 2020, and the Florida Department of Financial Services has imposed a similar ban. West Virginia and Louisiana both announced partial bans.

“Establishing a Republican brand is important. A central tenet of what unites Republicans now is taking a strong stance [and] standing up to China,” says Thad Kousser, professor of political science at UC San Diego.

Brooke Oberwetter, a spokeswoman for TikTok, told The Wall Street Journal that the move was a political signal rather than a practical solution for security concerns, and claimed that the ban would have minimal impact because very few House-managed phones have TikTok installed.

Social media teaching and research have become standard in higher education. TikTok has fundamentally changed the nature of modern communication with its aesthetics, practices, storytelling, and information-sharing.

From an educational standpoint, how are media and communications professors supposed to train students to be savvy content creators and consumers if we can't teach a pillar of the modern media landscape? While students can certainly still access TikTok in the privacy of their own homes, professors can no longer embed TikToks in PowerPoint slides or show TikTok links in a classroom web browser. Professors will lose the ability to teach students how to use TikTok well, even as businesses increasingly rely on it. Additionally, TikTok makes parts of the world more accessible, letting students see the things they are learning about in real time.

As these states enforce their bans, their citizens are left behind in a fast-paced media world. Media and communications students in those states will also be at a disadvantage when applying for jobs and showcasing communicative and technical mastery, branding, and storytelling skills, because their peers in other states will receive education and training that they cannot.

Professors also conduct research of their own. Social media scholars in these states quite literally cannot do the work they were hired to do, and to be experts in, if the bans persist. University compliance offices have said the bans may apply only to campus Wi-Fi, with mobile data still allowed, but who will foot the bill for a more expensive data plan? No one. Working from home remains a possibility, but professors and other employees are expected to be at work on a regular basis. That means any social media scholar attempting to research TikTok on campus will have to rely on video streaming over mobile data, which can be quite expensive, whether by paying for an unlimited plan or by accidentally exceeding their data limits.