TikTok, the popular social media platform where users upload short, often silly or lighthearted videos, is coming under fire this week. Internal moderation documents acquired by the German digital rights blog Netzpolitik.org show that TikTok has been discriminating against users who are disabled, queer, or fat.
According to these documents, TikTok instructed moderators to tag any content created by so-called “special users,” defined as users who are “susceptible to harassment or cyberbullying based on their physical or mental condition.”
The idea behind the tag was to protect these “special users” from cyberbullying and online harassment by limiting the visibility of their content. Videos with this tag had their viewership limited to the user’s country of origin and were barred from the “For You” section of the app.
To make matters worse, moderators had only about 30 seconds to decide whether to flag a video. Imagine looking at a complete stranger for less than a minute and having to decide whether they fall somewhere on the autism spectrum. Now imagine doing that with only a 15-second video for reference.
Sources inside TikTok say that moderators complained about this policy multiple times, but their concerns were ignored. According to a TikTok spokesperson, the tag system was meant to be a temporary solution.
“This was never designed to be a long-term solution, but rather a way to help manage a troubling trend until our teams and user-facing controls could keep up.”
Point blank, TikTok discriminated against users based on their physical appearance and perceived disabilities. It denied these users a fair opportunity on the app by limiting the visibility of their content, thereby preventing them from growing their audiences.
In their statement about the moderation policy, TikTok’s spokesperson asserts that the policy is no longer in effect.
“While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections.”
Owning up to the mistake is a good start, but a simple ‘our bad, y’all’ is not good enough. When a company currently estimated to be worth $75 billion admits to blatant discrimination against its users, there need to be some reparations.
Grindr got busted for selling users’ location data to advertisers
(SOCIAL MEDIA) User data has been a hot topic in the tech world. It’s often shared haphazardly or left unprotected, and the app Grindr follows suit.
If you’re like me, you probably get a lot of spam calls. Information is no longer private in this day and age; companies will buy and sell whatever information they can get their hands on for a quick buck. That’s annoying, but not necessarily outright dangerous, right?
Grindr has admitted to selling its users’ data; specifically, it has been selling users’ locations without regard for liability concerns. Grindr is a gay hook-up app, a place where members of a marginalized community reveal their location in order to find someone to connect with. Sure, Grindr claims it has been doing this less and less since 2020, but the issue remains: it has been selling the locations of people in a marginalized community, one that has faced enormous oppression in the past and still faces it today.
Who in their right mind thought this was okay? Grindr initially did so to create “real-time ad exchanges” for its users, serving ads for places close to their location. Which makes sense, sort of. The root of the issue is that the LGBTQIA+ community is a community at risk. How does Grindr know whether all of its users are out? Does it know exactly who it’s selling this information to? How does it know that those who bought the information will use it responsibly?
It doesn’t have any way of knowing, and it put all of its users at risk by selling their location data. And the data is still commercially available! Historical data can still be obtained, and the information was available for purchase as far back as 2017. Even if somebody stopped using Grindr in, say, 2019, the fact that they used Grindr is still out there. And yes, Grindr claims the released data has been anonymized, but it’s surprisingly easy to reverse that and pin a specific person to a specific location and time.
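That re-identification risk is well documented: even with names stripped out, a handful of timestamped location points usually matches only one person. A minimal, purely illustrative sketch (all data invented, no real Grindr dataset or API involved) of how an “anonymized” location trace can be linked back to an individual:

```python
# Illustrative only: why "anonymized" location traces are easy to re-identify.
# All IDs, timestamps, and coordinates below are made up for this example.

# "Anonymized" dataset: user names replaced with random tokens,
# but the (timestamp, lat, lon) points themselves are kept intact.
anonymized_traces = {
    "user_a1f3": [("2019-06-01 08:02", 40.7128, -74.0060),
                  ("2019-06-01 18:45", 40.7306, -73.9866)],
    "user_9c2e": [("2019-06-01 08:05", 34.0522, -118.2437),
                  ("2019-06-01 19:10", 34.0407, -118.2468)],
}

# Auxiliary knowledge an attacker might have: the target was observed
# at these two places at these two times (e.g., their home and workplace).
known_points = [("2019-06-01 08:02", 40.7128, -74.0060),
                ("2019-06-01 18:45", 40.7306, -73.9866)]

def matches(trace, known, tolerance=0.001):
    """True if every known (time, lat, lon) point appears in the trace."""
    for t, lat, lon in known:
        if not any(t == t2
                   and abs(lat - lat2) < tolerance
                   and abs(lon - lon2) < tolerance
                   for t2, lat2, lon2 in trace):
            return False
    return True

# Two observed points are enough to single out one pseudonymous ID,
# tying the whole "anonymous" trace to a real, identifiable person.
hits = [uid for uid, trace in anonymized_traces.items()
        if matches(trace, known_points)]
print(hits)  # ['user_a1f3']
```

The sketch is tiny, but the mechanism scales: research on mobility data has repeatedly shown that a few spatio-temporal points suffice to uniquely identify most people in large datasets, which is why pseudonymizing IDs alone does not make location data safe to sell.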
This is a huge violation of privacy, and it puts people in real danger. It would be easy for bigots to get that information and use it for something other than ads. It would be easy for people to out others who aren’t ready to come out. It’s ridiculous, and yes, Grindr claims it’s doing this less, but the record of what it has done is still out there. There’s still the question of “what if they do it again?” and, with the state of the world right now, that’s deeply troubling.
If somebody is attacked because of the data that Grindr sold, is Grindr complicit in that hate crime, legally or otherwise?
So, moral of the story?
Yeah, selling data can get you a quick buck, but don’t do it.
You have no idea who you’re putting at risk by selling that data, and if people find out you’ve done it, chances are your customers (and employees) will lose trust in you and may leave for something else. Don’t risk it!
BeReal: Youngsters are flocking in droves to this Instagram competitor app
(SOCIAL MEDIA) As Instagram loses steam due to its standards of “perfection posting,” users are drawn to a similar app with a different approach, BeReal.
BeReal is one of several “Real” apps exploding in growth with young users who crave real connections with people they know in real life.
According to data.ai, BeReal ranks 4th by downloads in the US, the UK, and France for Q1 2022 to date, behind only Instagram, Snapchat, and Pinterest.
BeReal flies in the face of what social media has become. Instead of curated looks that focus on the beautiful parts of life, BeReal users showcase what they’re doing at the moment and share those real photos with their friends. Their real friends.
It’s real. And real is different for a generation of social media users who have been raised on influencers and filters.
As the app says when you go to its page:
Every day at a different time, BeReal users are notified simultaneously to capture and share a Photo in 2 Minutes.
A new and unique way to discover who your friends really are in their daily life.
The app has seen monthly users increase by more than 315% according to Apptopia, which tracks and analyzes app performance.
“Push notifications are sent around the world simultaneously at different times each day,” the company said in a statement. “It’s a secret on how the time is chosen every day, it’s not random.”
The app allows no edits and no filters. They want users to show a “slice of their lives.”
Today’s social media users have seen their lives online inundated with ultra-curated social media. The pandemic led to more time spent online than ever. Social media became a way to escape. Reality was ugly. Social media was funny, pretty, and exciting.
Enter BeReal, where users are asked to share two moments of real life on a surprise schedule. New apps are often fun simply because they’re new. However, the huge growth in BeReal use among college-aged users points to something more than novelty.
For the past several years, experts have warned that social media was dangerous to our mental health. The dopamine hits of likes and shares are based on photos and videos filled with second and third takes, lens changes, lighting improvements, and filters. Constant comparisons are the norm. And even though we know the world we present on our social pages isn’t exactly an honest portrayal of life, we can’t help but experience FOMO when we see our friends and followers and those we follow having the times of their lives, buying their new it thing, trying the new perfect product, playing in their Pinterest-worthy decorated spaces we wish we could have.
None of what we see on our apps is actually real. We delete media that isn’t what we want to portray, try again from a different angle, and shoot second, third, and fourth takes that make us look just a little better.
We spend hours flipping through videos on our For You walls and Instagram stories picked by algorithms that know us better than we know ourselves.
BeReal is the opposite of that. It’s simple, fast, and real. It’s community and fun, but it’s a moment instead of turning into the time-sink of our usual social media that, while fun, is also meant to ultimately sell stuff, including all our data.
It will be interesting to watch BeReal and see if it continues down its promised path and whether the growth continues. People are looking for something. Maybe reality is that answer.
Team of deaf engineers at Snap create feature to help users learn ASL
(SOCIAL MEDIA) Snapchat engineers known as the “Deafengers” have created an ASL Alphabet Lens to help users learn the basics of ASL.
A team of Deaf and hard-of-hearing Snapchat engineers, known as the “Deafengers,” has created an ASL Alphabet Lens to help users learn the basics of American Sign Language.
Using AR technology, the Lens teaches users to fingerspell their names, practice the ASL alphabet, and play games to “put their new skills to the test.”
The Lens, launched last month, is the first of its kind and encourages users to learn American Sign Language.
In a statement, Snapchat said, “For native signers, in a world where linguistic inequity is prevalent, we believe AR can help evolve the way we communicate. We look forward to learning more from our community as we strive to continuously improve experiences for everyone on Snapchat.”
Austin Vaday, one of the deaf engineers who helped develop the Lens, said helping the world understand sign language is important. He shared his story with NBC correspondent Erin McLaughlin on TODAY after the Lens was released.
Vaday didn’t learn American Sign Language until he was 12. Before then he relied mostly on lip-reading to communicate. ASL changed his life. That life-changing moment helped inspire the ASL Alphabet Lens.
The ASL Alphabet Lens was designed and developed over six months in partnership with SignAll.
There are approximately 48 million deaf and hard of hearing people in the United States, according to the National Association of the Deaf.
Vaday said the ASL Alphabet Lens came from the desire to find a way to appropriately and properly educate people so they can communicate with those who are deaf or hard of hearing.
Vaday said the team focused on the core values of intelligence, creativity, and empathy while working on the project, and that it’s a step toward opening communication between all Snap users and the deaf and hard-of-hearing community.
The ASL Alphabet Lens is available to all Snapchat users.