Last week, I posted about my breast cancer diagnosis on Facebook. Since then, my Facebook feed has featured ads for “alternative cancer care.” The ads, which were new to my timeline, promote everything from cumin seeds to colloidal silver as cancer treatments. Some ads promise luxury clinics — or even “nontoxic cancer therapies” on a beach in Mexico.
There’s a reason I’ll never fall for these ads: I’m an advocate against pseudoscience. As a consultant for the watchdog group Bad Science Watch and the founder of the Campaign Against Phony Autism Cures, I’ve learned to recognize the hallmarks of pseudoscience marketing: unproven and sometimes dangerous treatments that promise simplistic solutions and a sense of support. Things like “bleach cures” that claim to treat everything from Covid-19 to autism.
When I saw the ads, I knew that Facebook had probably tagged me to receive them. Interestingly, I haven’t seen any legitimate cancer care ads in my newsfeed, just pseudoscience. This may be because pseudoscience companies rely on social media in a way that other forms of health care don’t. Pseudoscience companies leverage Facebook’s social and supportive environment to connect their products with identities and to build communities around their products. They use influencers and patient testimonials. Some companies also recruit members through Facebook “support groups” to sell their products in pyramid schemes.
Through all this social media, patients begin to feel a sense of belonging, which makes it harder for them to question a product. Cancer patients are especially vulnerable to this stealth marketing. It’s hard to accept the loss of control that comes with a cancer diagnosis. As cancer patients, we are told where to go, how to sit and what to take. It can be painful and scary and tiring — and then all our hair falls out. During the pandemic, many of us are also isolated. Our loved ones can’t come to our appointments or even visit us in the hospital. Now, more than ever, who is there to hold our hand?
Pseudoscience companies tap directly into our fears and isolation, offering us a sense of control, while claiming their products can end our pain. They exploit our emotions to offer phony alternatives, like the “cell quickening” company that proclaims on Facebook: “Battering and bruising the body just to treat the symptoms [of] breast cancer is not necessarily the best or only option available to you. You have choices!”
When I looked at my body after my recent surgery, I wished there were another choice. I would have given just about anything to be on a beach in Mexico. But I’ve witnessed the false promises of these companies. I’ve spoken to someone who flew to that beach clinic, only to return home and discover that her tumor was inoperable. The evidence is clear: Death rates are much higher for people with cancer who choose alternative therapies instead of standard care.
Facebook is ubiquitous in many of our lives, and people use the platform to search for health-related support groups and information. So we might assume that Facebook has an ethical stake in keeping its content free of scams and misinformation. But Facebook has an odd history with the term “pseudoscience.” It was only last April that Facebook removed “pseudoscience” as a keyword from its categories for targeted advertising, and only after the tech publication The Markup reported that 78 million users were listed in Facebook’s ad portal as having an “interest” in the category.
Since the pandemic started, there has been increasing pressure on Facebook to remove coronavirus-related misinformation. Facebook pledged that it would add a warning label to Covid-19-related ads and would remove pseudoscience ads that were reported by its users. The problem, which even Facebook acknowledged, is that pseudoscience content can run for months before being flagged by readers. Facebook’s main ad-screening system is automated. While we wait for its artificial intelligence system to catch up with the discernment abilities of human reviewers, a steady flow of pseudoscience advertising has already slipped through on a platform with billions of users.
Could it be that Facebook has gotten too big to adequately regulate its content? That maybe there’s no hope for the change we need? Some advertisers seem to be suggesting this is the case. They are voting with their feet and leaving the platform altogether. Responding to a call by the advocacy group Stop Hate for Profit, advertisers such as Starbucks, Honda, Diageo and Patagonia have paused their advertising on Facebook as part of a broad boycott over how the platform is handling hate speech and misinformation. This week, Facebook met with representatives from Stop Hate for Profit. In the view of the organizers, the meeting did not go well. “Facebook approached our meeting today like it was nothing more than a PR exercise,” said one of the organizers, Jessica J. González of Free Press, a nonprofit media advocacy group.
If Facebook won’t change, what can individuals do? The quickest action is to suspend, delete or even just spend less time on Facebook (and on Instagram, which is owned by Facebook). I’ve decided to join that trend.
My retreat from Facebook may mean fewer online connections, perhaps at a time when I need them the most. But I’d rather leave than see what another friend with cancer calls the “slap in the face” ads.
My surgery team didn’t deliver false hope or send me to the beach. They stood under bright lights in a gritty urban hospital to open me up and repair me so I could breathe again. The solutions they offer aren’t simple; I have months of challenges ahead. But throughout this journey, I’ll find support from the people closest to me. Not from Facebook.