Oops! Pornhub’s Instagram Account Accidentally Back Up

Instagram permanently banned Pornhub’s account in September after repeated violations of the platform’s content guidelines, which prohibit nudity and sexual solicitation. The account was temporarily reinstated over the weekend, only to be permanently banned again on Tuesday. An Instagram spokesperson acknowledged that the reinstatement was an error and reiterated that the ban was due to policy violations. Pornhub has disputed the ban, claiming that its account did not violate any guidelines. The incident highlights the ongoing challenges faced by creators such as sex educators, pole dancers, and sex workers, whose accounts are frequently suspended or disabled even when their content complies with the platform’s rules. Although they cannot share NSFW content on Instagram, adult performers rely on the platform to communicate with their followers.

SESTA/FOSTA, a US law passed in 2018, has made it harder for online sex workers to earn a living safely and legally. The legislation aimed to combat sex trafficking, but in practice it has made sex work less secure. Because the law carves out an exception to Section 230, holding online platforms liable for facilitating prostitution and trafficking, social networks and credit card processors have become wary of running afoul of it. Yet according to a 2021 government report, federal prosecutors have used the legislation only once. In response to the permanent suspension of its Instagram account, Pornhub criticized the “arbitrarily and selectively-enforced ‘standards’” of Instagram and its parent company, Meta.

A spokesperson for Pornhub argued that Meta’s erratic enforcement of its policies has placed an undue burden on those in the adult industry, a marginalized group. Meanwhile, MindGeek, Pornhub’s parent company, is facing multiple lawsuits alleging that it knowingly profited from child sexual abuse material (CSAM). The company has removed all non-verified content and now requires uploaders to verify their identity. While online platforms are required to report CSAM when they encounter it, they are not obligated to proactively seek out and remove such content. According to a 2021 report, Meta reported 22 million instances of CSAM on Facebook and over 3 million on Instagram, while MindGeek made 13,229 reports.
