Meta has run hundreds of ads for cocaine, opioids and other drugs

Meta uses artificial-intelligence tools to moderate content, but the company’s tools haven’t managed to stop such drug ads. (AP Photo/Godofredo A. Vásquez, File) (AP)

Summary

Instagram and Facebook are still running ads for illegal drugs, months after the WSJ revealed they were under federal investigation for the practice.

Meta Platforms is running ads on Facebook and Instagram that steer users to online marketplaces for illegal drugs, months after The Wall Street Journal first reported that the social-media giant was facing a federal investigation over the practice.

The company has continued to collect revenue from ads that violate its policies, which ban promoting the sale of illicit or recreational drugs. A review by The Wall Street Journal in July found dozens of ads marketing illegal substances such as cocaine and prescription opioids, including as recently as Friday. A separate analysis over recent months by an industry watchdog group found hundreds of such ads.

The ads show photos of prescription drug bottles, piles of pills or bricks of cocaine. “Place your orders,” said one of the ads the Journal found in July. It also included a photo of a razor blade and a yellow powder arranged to spell out “DMT,” a psychedelic drug.

The Journal reported in March that federal authorities are investigating Meta for its role in the illicit sale of drugs. The nonprofit Tech Transparency Project, which investigates online platforms, reviewed Meta’s ad library from March to June and found more than 450 illicit drug ads on Facebook and Instagram.

“You don’t need the dark web anymore when you can just buy a Facebook ad to sell dangerous drugs or even scam people at a scale that wouldn’t have been possible through the dark web,” said Katie Paul, director of the Tech Transparency Project.

Meta uses artificial-intelligence tools to moderate content, but those tools haven’t managed to stop such drug ads, which often redirect users to other platforms where they can make purchases. Because the ads showcase the drugs in photos rather than text, they appear able to bypass Meta’s content-moderation systems. Layoffs at Meta have also thinned its content-moderation teams.

Meta works with law enforcement to combat this type of activity, a spokesman for the company said.

“Our systems are designed to proactively detect and enforce against violating content, and we reject hundreds of thousands of ads for violating our drug policies,” the spokesman said. “We continue to invest resources and further improve our enforcement on this kind of content. Our hearts go out to those suffering from the tragic consequences of this epidemic—it requires all of us to work together to stop it.”

When users click on the Facebook pages or Instagram accounts associated with the ads, those pages often include additional, nonsponsored photos or posts of drug-related content. Some of the accounts use names that make clear they exist to sell drugs; the DMT ad, for example, was posted by an account called “DMT Vapes and Notes.”

Users who click the links in the ads are typically taken to private group chats on the app Telegram, which isn’t a Meta property. When accessed, these group chats typically show a stream of posts from the dealers that include photos of the drugs they offer, menus with prices and instructions for placing orders, according to the Journal’s review and the Tech Transparency Project analysis.

Telegram representatives didn’t respond to messages seeking comment about the practice.

Some of the private chats include posts that say “TD” or “Touchdown” to indicate a successful delivery to a customer via a shipping service. In some cases, the ads link to private group chats on Meta’s WhatsApp encrypted messaging service, according to the Tech Transparency Project report.

Meta disabled many of the drug ads spotted by the Journal within 48 hours of their going live, the company spokesman said.

All of the ads have since been removed for violating Meta’s policies, and after being contacted by the Journal, the company also banned the users who created them from its platforms. The company said it is using insights about new adversarial tactics gleaned from investigating these ads to conduct additional sweeps.

Section 230

Lawmakers have been discussing the need to hold technology companies responsible for what third parties post on their platforms. Efforts to do so have been complicated by Section 230 of the Communications Decency Act, which says online platforms aren’t liable for what third parties post, with a few exceptions. The Supreme Court left core elements of Section 230 unchanged after deciding on two cases involving the law in 2023.

The Justice Department has in the past tried to extend the reach of federal drug laws to hold internet platforms culpable when their services are used to break the law. In 2011, Google agreed to pay $500 million for allowing online Canadian pharmacies to place ads targeting U.S. consumers, resulting in the unlawful importation of prescription drugs into the U.S.

At a Senate hearing in January, a number of parents said they think Meta and other social-media companies are responsible for the deaths of their children, in part because of illegal drug sales on social platforms that led to overdoses from fentanyl, a synthetic opioid.

Chief Executive Mark Zuckerberg apologized at the hearing.

“I’m sorry for everything you have all been through,” Zuckerberg said. “No one should go through the things that your families have suffered, and this is why we invest so much…to make sure no one has to go through the things your families have had to suffer.”

Elijah’s death

Mikayla Brown, 34 years old, is among the parents who think Meta is responsible for the drug-overdose death of her child.

Her son Elijah Ott, a 15-year-old sophomore in California who loved to skateboard and cook Cajun pasta with his mother, died last September. After his death, Brown said she found messages on his phone showing how he connected with an Instagram account selling illegal drugs and sought to purchase marijuana oil and a pharmaceutical similar to Xanax.

In his autopsy, Ott tested positive for the Xanax-like pharmaceutical and a larger amount of fentanyl, which was determined to be the cause of his death. Brown said she thinks the drug purchased by her son was laced with fentanyl.

A review by the Journal showed the accounts associated with the dealer remained live on Instagram months later. Meta disabled them after being contacted by the Journal.

“Because of this app, my child does not get to live,” Brown said.

Write to Salvador Rodriguez at salvador.rodriguez@wsj.com
