
Instagram has a drug problem. Its algorithms make it worse

 

Source: LA Times, published on 28 Sep 2021

 

Instagram is known for its celebrity posts and photos of enviable vacations. But it has also become a sizable open marketplace for advertising illegal drugs. The company has pledged a crackdown in recent weeks, but it is struggling to keep pace with its own algorithms and systems, which serve up an array of personalized drug-related content aimed directly at people who show an interest in buying substances illicitly.

Recent searches on Instagram, which is owned by Facebook Inc., for hashtags of drug names — such as #oxy, #percocet, #painkillers, #painpills, #oxycontin, #adderall and #painrelief — revealed thousands of posts: a mash-up of people grappling with addiction, others bragging about their party-going lifestyles, and enticements from drug dealers.

Following the dealer accounts, or even “liking” one of the dealer posts, prompted Instagram’s algorithms to work as designed — in this case, by filling up a person’s feed with posts for drugs, suggesting other sellers to follow and introducing new hashtags, such as #xansforsale. Ads from some of the country’s largest brands, including Target, Chase and Procter & Gamble, as well as Facebook’s own video streaming service, appeared next to posts illegally selling pills.

Even as top executives from Facebook and Twitter Inc., which has also long struggled with posts offering drugs illegally, promised this month during a congressional hearing that they were cracking down on sales of opioids and other drugs, their services appeared to be open marketplaces for advertising such content. Facebook’s chief operating officer, Sheryl Sandberg, said her company was “firmly against” such activity. Twitter Chief Executive Jack Dorsey said he was “looking deeply” at how drug-selling spreads on his site.

But activists and other groups have warned tech companies about illegal drug sales on their platforms for years. In recent months, lawmakers, the Food and Drug Administration and some advertisers have stepped into the fray. In April, FDA Commissioner Scott Gottlieb charged internet companies with not “taking practical steps to find and remove opioid listings.” Sen. Joe Manchin III (D-W.Va.) called social media companies “reckless,” saying: “It is past time they put human life above profit and finally institute measures that crack down on these harmful practices, preventing the sale of illegal narcotics on or through their platforms.”

The prevalence of drug posts on social media — which the FDA says has helped fuel the opioid epidemic that claimed more than 40,000 lives in the United States last year — shows how tech companies are often outsmarted by the software they created. The algorithms that power social media spread illicit content — including illegal drug ads, misinformation and hate speech — faster than the companies know how to take it down. The most common features of social platforms, such as hashtags and algorithms that deliver personalized feeds, drive drug-sale posts directly to users who have expressed interest in them — potentially exposing the most vulnerable people to addictive drugs.

“Just as drug use rewires the brain to crave more of the substance, social media platforms have designed their sites in such a way that after a single search for an illicit drug, the algorithm gets rewired to advertise drugs to the already vulnerable user,” said Rick Lane, a longtime technology policy advisor who helped push legislation known as FOSTA-SESTA through Congress this year. That law holds technology companies liable for prostitution and sex-trafficking ads on their platforms. He is now pushing for similar legislation for drug ads.

Facebook’s vice president of global marketing solutions, Carolyn Everson, said Instagram was paying more attention to illegal sales of drugs because of a growing focus on safety and on preventing abuses of the company’s platform. “We’re not yet sophisticated enough to tease apart every post to see if it’s trying to sell someone illegal drugs or they are taking Xanax ’cause they are stressed out,” said Everson, referring to the company’s artificial intelligence technology. “Obviously, there is some stuff that gets through that is totally against our policy, and we’re getting better at it.”

Instagram co-founders Kevin Systrom and Mike Krieger said late Monday that they were exiting the company. Adam Mosseri, a longtime deputy to Facebook CEO Mark Zuckerberg, is likely to become the photo-sharing app’s next leader, according to a person familiar with the matter.

“Some of the emergent behaviors we’ve seen have presented a new challenge, and we’re focused on tackling them alongside law enforcement, our peers and the FDA,” Twitter spokesman Ian Plunkett said.

Pharmaceutical companies are allowed to promote their brands on social media, but the process is highly regulated by the FDA, and companies and individuals are not allowed to sell drugs through social media.

Technology companies, which are lightly regulated compared with other industries, face the prospect of stricter rules if they cannot control the problems. During this month’s technology hearings, Manchin told the executives that his state had been hit hard by opioid addiction and that he was interested in launching a bill modeled after the sex-trafficking law that would hold companies liable for drug dealing taking place on their services.

John Montgomery, executive vice president for global brand safety at the ad-buying giant GroupM, whose agencies work with companies such as Procter & Gamble and Target, said Instagram was moving too slowly. “With illegal pharma content, there is little nuance. So it should be possible to identify and block faster than we’ve seen,” he said.

Instagram has become one of the most potent platforms for drug marketing, said Libby Baney, senior advisor for the advocacy group Alliance for Safe Online Pharmacies. Its growing use among teenagers, its emphasis on visuals, its sophistication at personalizing content and its allowance of anonymous accounts have turned it into a hotbed of illegal drug promotion, she said.

Eric Feinberg, a researcher and the chief executive of GIPEC, a New York City cyberintelligence startup that tracks illegal activity such as counterfeit goods and terrorist content on technology platforms, began hunting for drug posts in June by searching for obvious hashtags. He found hundreds of Instagram posts appearing alongside content from 60 advertisers. Some of the Instagram dealers touted corresponding Twitter accounts, and Feinberg began tracking those accounts too. Some of the Twitter accounts were even more brazen and had been up for years.

Once he followed the sellers’ accounts, Instagram’s algorithms began delivering posts marketing drugs directly into his feed, suggesting other drug sellers for him to follow and introducing him to additional hashtags that he used as clues. At one point, he said, posts from sellers constituted about 40% of his feed. Feinberg said he plans to sell his monitoring software, but his company doesn’t yet earn any revenue.

Facebook said Feinberg’s feed was not a real representation of what the vast majority of people see in their feeds because he exclusively followed bad actors and some brands, prompting the company’s algorithm to cluster the two types of content together. “That being said, even a single piece of bad content on our platforms is one too many, and we’re working hard to improve our detection and enforcement,” Facebook spokesman Joe Benarroch said.

Most of the posts that appeared to be from dealers had a similar format: Photos of different types of drugs captioned with a string of hashtags and instructions to contact the account holder through a channel outside Instagram, such as email or messaging platforms Wickr, Kik or Facebook-owned WhatsApp. (Most drug posts included explicit instructions to avoid “DMs,” or direct messages, on Instagram itself. Such messages could be more easily traced.)

In recent months, Instagram took what it described as an extreme step by blocking search results for certain hashtags, such as #fentanyl, #cocaine and #heroin, even though that had the unwanted side effect of limiting people’s ability to seek support for substance abuse issues, Facebook’s Everson said. The hashtags can still be used and the posts can still be found through a person’s network, even though they are unavailable through a public search.

To get around blocked hashtags, sellers now market opioids under Xanax- and Adderall-related hashtags, many of which are searchable, Feinberg said. They also slightly tweak the spelling of drug names and include their contact details in the photos themselves — for example, by writing them on a piece of paper and then photographing it — to avoid software tools that can identify problematic keywords in a caption. Instagram appeared to suspend some hashtags when asked about them by the Washington Post.
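To illustrate the cat-and-mouse dynamic Feinberg describes, here is a minimal Python sketch of a naive caption-keyword filter (not Instagram's actual system; every blocked term and caption in it is invented for illustration). It shows why a one-letter spelling tweak, or moving contact details into the photo itself, slips past software that only scans caption text.

# A minimal sketch, not Instagram's real system: flag a caption only if
# it contains an exact blocked term. All terms and captions are invented.

BLOCKED_TERMS = {"oxycontin", "percocet", "xanax", "adderall", "fentanyl"}

def naive_caption_flag(caption: str) -> bool:
    """Return True if the caption contains an exact blocked keyword."""
    words = caption.lower().replace("#", " ").split()
    return any(word in BLOCKED_TERMS for word in words)

print(naive_caption_flag("#oxycontin for sale, no DMs"))    # True: exact match
print(naive_caption_flag("#0xyc0ntin for sale, no DMs"))    # False: tweaked spelling
print(naive_caption_flag("see photo for contact details"))  # False: a text filter
                                                            # never sees the image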

Dealers appear to employ a “spray and pray” strategy designed to get around Instagram’s monitoring, Feinberg said. They post frequently, with individual posts getting a small number of likes before many are taken down by Instagram’s systems within 48 to 72 hours, often because users flag them as problematic. As soon as one account is taken down, dealers create multiple Instagram accounts with similar names, such as a cluster including FoxPharm, FoxPharm12, FoxPharm69 and FoxPharm90, all featuring the same contact information.
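As a rough illustration of how a tracker might tie those look-alike accounts together, the sketch below groups account names by their stem (trailing digits stripped) and a shared contact detail. The account names come from the article; the contact string is a made-up placeholder, and this is not GIPEC's actual method.

from collections import defaultdict
import re

# Account names from the article; the contact value is a placeholder.
accounts = [
    {"name": "FoxPharm",   "contact": "wickr: example_handle"},
    {"name": "FoxPharm12", "contact": "wickr: example_handle"},
    {"name": "FoxPharm69", "contact": "wickr: example_handle"},
    {"name": "FoxPharm90", "contact": "wickr: example_handle"},
]

clusters = defaultdict(list)
for acct in accounts:
    stem = re.sub(r"\d+$", "", acct["name"]).lower()  # "FoxPharm12" -> "foxpharm"
    clusters[(stem, acct["contact"])].append(acct["name"])

for (stem, contact), names in clusters.items():
    print(stem, contact, names)
# foxpharm wickr: example_handle ['FoxPharm', 'FoxPharm12', 'FoxPharm69', 'FoxPharm90']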

Some appeared to have been created automatically, Feinberg said, suggesting the sellers are using bots. The different iterations of the name reminded him of Islamic State terrorist accounts and disinformation groups, which use a similar tactic.

“They are playing whack-a-mole here. They take them down, and then they come back again,” Feinberg said. “If [the tech companies] were really doing a full-court press, we wouldn’t keep finding what we are finding.”

Everson said Facebook and Instagram were in the early stages of developing artificial-intelligence tools that could flag drug content. She compared the effort to Facebook’s work, begun two years ago, on AI software that the company says can detect the majority of Islamic State accounts before people see them. The company is now building visual classifiers that can recognize photographs of particular pills and detect common patterns, such as the inclusion of a phone number meant to move the transaction onto an encrypted messaging platform.
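The text-side half of that pattern detection can be sketched in a few lines. The example below is an assumption about the general approach, not Facebook's code: it looks for a phone number or a pointer to an outside messaging app in a caption. Recognizing the pill photos themselves would require an image classifier and is left out here.

import re

# Illustrative patterns only: a loose phone-number match and mentions of
# the off-platform messengers named in the article (Wickr, Kik, WhatsApp).
PHONE_RE = re.compile(r"\+?\d[\d\-\s().]{7,}\d")
OFF_PLATFORM_RE = re.compile(r"\b(wickr|kik|whatsapp)\b", re.IGNORECASE)

def suspicious_signals(caption: str) -> list[str]:
    """Return the off-platform contact signals found in a post caption."""
    signals = []
    if PHONE_RE.search(caption):
        signals.append("phone number in caption")
    match = OFF_PLATFORM_RE.search(caption)
    if match:
        signals.append("points buyers to " + match.group(1))
    return signals

print(suspicious_signals("No DMs. Wickr: foxpharm12 or text 555-014-2267"))
# ['phone number in caption', 'points buyers to Wickr']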

Facebook also gives advertisers tools to block their messages from appearing alongside certain publishers or categories of content, including tragedy or controversial social issues.

This month, Instagram also launched a pop-up notification that appears when someone searches hashtags for opioids, prescription medications or illegal drugs. The pop-up offers to connect people with free and confidential treatment referrals and information about substance use, prevention and recovery.

Dwoskin writes for the Washington Post.

 
