Once again, Facebook finds itself marinating in a stew of controversy. This time, the social media giant is under fire for running ads promoting phenibut, a psychoactive substance notorious for its addictive properties. Developed in the Soviet Union in the 1960s to combat anxiety and insomnia, phenibut has since been outlawed in several countries, including Germany and Australia. Although it’s not outright banned in the United States, it certainly doesn’t enjoy a free pass: it can be bought and sold, but not as a medication or a dietary supplement. Despite this, Bio, a company selling the substance, has been running Facebook ads whose fine print slyly states that phenibut is “Sold for laboratory research use only” and “Not for human consumption, nor medical, veterinary, or household uses.”
One might imagine that Facebook or its parent company Meta would be all over this, ensuring that such ads never see the light of day. But alas, the platform’s content moderation appears to be more sieve than filter. Emails sent to Meta and Bio have, predictably, gone unanswered. Nor is this an isolated incident: according to a report by Canada’s National Post, Facebook has been running ads for an array of other illicit substances, including LSD and psilocybin mushrooms.
The platform seems to be a veritable bazaar for everything from controlled substances to ethically questionable AI-generated art. Even more alarming is the persistent problem of pedophiles exchanging images of minors, exposing the disturbing underbelly of what is supposed to be a community-driven social platform.
Meta’s response to the issue has been a masterclass in corporate deflection. A spokesperson told the National Post that such ads are not permitted and that the flagged ads had been removed. However, these statements appear to be more of a PR exercise than a commitment to change. Ads for controlled substances continue to pop up like weeds in a neglected garden, even after Meta’s assurances.
This double whammy of ineffective moderation and questionable business ethics paints Facebook as the online equivalent of a dark alleyway: a place where you can find almost anything if you know where to look, but where you might not like what you discover. The platform’s laissez-faire approach to content moderation not only undermines its credibility but also endangers its users.
As Facebook continues to grapple with these issues, one has to wonder: How many more scandals can the platform endure before users start jumping ship? The sheer volume of questionable content suggests a systemic problem, one that requires more than just piecemeal solutions. Until Facebook takes substantive action, it will continue to be a breeding ground for everything from dangerous drugs to ethically dubious content. And if Meta’s past actions are anything to go by, we might be waiting a while for those changes.