Twitter thought it had found a new way to make money. The timing seemed right: the company is facing an acquisition and has gone through several high-level staffing changes, which understandably left investors concerned about the platform's stability and pushed Twitter executives to think outside the box. The new idea promised additional revenue and possibly new users. Sounds like a win-win, right? Unfortunately for the company, that was not the case.
According to reporting from The Verge, Twitter decided earlier in 2022 to monetize adult content. Since the platform had already been passively allowing adult content creators to advertise their work, monetizing it seemed like a logical next step. The idea was to let adult content creators offer paid subscriptions to their followers, with Twitter of course taking a cut of the profits, much like the model the popular website OnlyFans uses.
Since Twitter was already allowing this content, and its community guidelines do not prohibit pornographic material as long as certain stipulations are followed, the company saw no immediate problem with profiting from paid subscriptions to adult, consensual content. Notice those keywords, adult and consensual: herein lies the issue.
Luckily, before a full launch, Twitter assembled a group of 84 people, called the Red Team, to test the product for problems before release. The Red Team's findings immediately halted the project: Twitter's systems could not reliably keep child pornography from making it through its filters. This was not news to the platform. The inability to keep Child Sexual Abuse Material (CSAM) and non-consensual nudity off the service was something the Health team had been warning high-level executives about since early 2021.
The platform has been using PhotoDNA, a tool developed by Microsoft, to detect and remove non-consensual and illegal sexual content. PhotoDNA works by matching images against digital fingerprints of material that has already been identified, which means CSAM that is new or sufficiently altered can slip past the filters undetected. That is obviously quite problematic.
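To make that limitation concrete, here is a minimal, hypothetical Python sketch of fingerprint-based screening. The database, function names, and the use of a plain cryptographic hash are all illustrative assumptions, not Twitter's or Microsoft's actual code; PhotoDNA uses a more robust perceptual signature that tolerates minor edits, but it can still only flag material that has already been catalogued.

import hashlib

# Hypothetical "database" of fingerprints for images that moderators have
# already identified and catalogued. Real systems such as PhotoDNA store
# robust perceptual signatures rather than plain SHA-256 digests, but the
# matching step is the same idea: look the fingerprint up in a known list.
known_images = [b"previously identified image bytes"]
KNOWN_SIGNATURES = {hashlib.sha256(img).hexdigest() for img in known_images}

def signature(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash of the image.
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_illegal(image_bytes: bytes) -> bool:
    # An upload is flagged only if its fingerprint matches catalogued material.
    return signature(image_bytes) in KNOWN_SIGNATURES

# A catalogued image is caught...
print(is_known_illegal(b"previously identified image bytes"))  # True
# ...but brand-new material, or an image altered enough to change its
# fingerprint, produces an unknown signature and passes the filter.
print(is_known_illegal(b"never-before-seen image bytes"))  # False

The point of the sketch is simply that this kind of matching is reactive: it can only catch what someone has already found and fingerprinted.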
Twitter has been sued more than once by victims of teenage sex trafficking who alleged that the company profited from videos taken during these crimes and did not remove the content when notified. The company, which has been around since 2006, did not give users a dedicated way to report content containing CSAM until earlier this year, over a decade and a half later.
Overall, we can see why Twitter would want to monetize adult content. Other popular websites have already proven that, when done right, paid subscription services can be lucrative. However, Twitter does not yet have the capacity to properly monitor this material.
There have been multiple occasions when the company allegedly did not properly monitor content, and as a result non-consensual material was distributed freely on the platform. The company’s first obligation is to its users, to ensure that all content on the website is distributed safely and fairly.
If Twitter cannot fulfill its most important obligation while still making a profit, then it may be time to go back to the drawing board.
Allison Snider is a freelance writer and owner of AllieWritesCreatively. She is passionate about raising awareness for chronic illnesses and ending the stigma around mental health. In her free time, you can find Allison hiking with her wife, Sara, and two dogs, Stella and Sophie.
