Before we discuss Facebook, here’s something important for you, the customers. The RBI doesn’t have a system to inform victims of a data breach. Now, the ‘ordinary prudent man’ in you would say that the RBI doesn’t need to; the companies storing the data are responsible. And that’s exactly the point we are trying to bring home.
Companies are not responsible, and they won’t be until the consequences of a data protection law are lurking over their heads. Our present rules do have a reporting mandate, but you, the customers, have no right to demand a disclosure from a company. Public memory is ‘Ghajiniesque’, but I sincerely hope you have not forgotten the Mobikwik saga. Let’s demand that Parliament act, now!
The data is out, and it’s terrifying.
Facebook. The company recently suffered the world’s largest-ever outage due to a router misconfiguration. And right after that, a whistle-blower testified against it before members of the US Senate. But why are we discussing this?
I have always been curious to read about the impact that technology exerts on people and societies. And it’s not to be presumed that the impact is always negative. Not even in Facebook’s case. It has definitely had a positive impact on societies. People are creating good content on the platform. It’s connecting families and even helping find missing persons. Patients are able to find blood donors. Businesses are able to find buyers on the marketplace, and so on.
However, the negative impact seems to be outweighing the positive experiences without breaking a sweat. It’s breaking democracies, people’s minds, and the very fabric that holds our society together. Everyone knew it was happening. Misinformation and lynchings, addiction to the screen, ignoring parental advice, the craving for likes, political polarization: it was all unraveling before our eyes. People were divided into different camps. But no one knew why social media was still pulling people towards it. There was no data (Facebook keeps all of it to itself) to substantiate what we were feeling. Well, until now.
The whistle-blower, who happens to be a former Facebook employee, has emerged as the catalyst of change. She comes armed with troves of internal research. Apparently, the social media giant made a few changes to its ‘news feed’ algorithm in 2018, and that’s causing a lot of trouble. It’s making people angry and hateful, eroding trust in our society, driving teenagers towards eating disorders and suicide, and the company knows about it.
What’s more shocking is that the algorithm is forcing political parties to take policy positions that they know are bad for society, because they would lose the social media game if they didn’t.
“And Facebook, over and over again, chose to optimize for its own interests, like making more money.”
-Frances Haugen, Facebook Whistleblower
But what is it that the algorithm is doing? The new algorithm curates the news feed for you. For personalization, it tracks your interests and shows you content that you like. It also tracks what you like and comment on. The effect of that? Facebook keeps serving what you like, and you keep watching it. And that means you stay on the app longer. More time, more ads, more revenue. Seems harmless, doesn’t it?
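The engagement-optimization logic described above can be sketched in a few lines of toy code. To be clear, this is a hypothetical illustration, not Facebook’s actual algorithm: the post fields, weights, and function names are all my own assumptions, chosen only to show why a feed ranked by engagement tends to surface provocative content.

```python
# Toy sketch of an engagement-ranked feed (hypothetical weights, not
# Facebook's real algorithm).
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> int:
    # Higher-effort reactions (comments, shares) are weighted more heavily,
    # which is exactly why content that provokes strong reactions rises.
    return post.likes + 5 * post.comments + 10 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Show the most "engaging" posts first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm policy explainer", likes=120, comments=4, shares=2),
    Post("Outrage bait", likes=80, comments=60, shares=40),
])
print([p.title for p in feed])  # the divisive post ranks first
```

Even in this toy version, the post that provokes comments and shares beats the quietly informative one, despite having fewer likes. Scale that to billions of users and you get the dynamics the internal research describes.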
Except that it isn’t.
1. To know more about you and always show you what you like, the company takes more intrusive measures that affect your privacy.
2. The engagement-driven feed allows Facebook to show you the content that elicits an extreme reaction from you.
3. You stay in a bubble, giving away extreme reactions, getting angrier, more frustrated, and more depressed in the process.
Similarly, the craving for likes on Instagram is causing teens to get addicted to the app, isolate themselves from society, hate their own bodies, question their self-worth, get trapped in a feedback cycle, and ultimately struggle with real relationships. This damage to mental health is likely to haunt an entire generation. You can watch Frances Haugen’s full testimony.
I’m ending this discussion with a few excerpts from the internal documents and The Wall Street Journal’s reports on the issue (behind a paywall).
On political polarization
“Engagement on positive and policy posts has been severely reduced, leaving parties increasingly reliant on inflammatory posts and direct attacks on their competitors.”
On groups and content that the algorithm suggests
Company researchers in 2019 set up a test account as a female Indian user and said they encountered a “nightmare” by merely following pages and groups recommended by Facebook’s algorithms.
“The test user’s News Feed has become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore,” they wrote. The video service Facebook Watch “seems to recommend a bunch of softcore porn.”
After a suicide bombing killed dozens of Indian paramilitary officers (Pulwama), which India blamed on rival Pakistan, the account displayed drawings depicting beheadings and photos purporting to show a Muslim man’s severed torso. “I’ve seen more images of dead people in the past 3 weeks than I’ve seen in my entire life total,” one researcher wrote.
On appearance-based comparison
In most countries, appearance-based comparison is worse among women. In India, however, it is higher among men than among women. The same report later referenced India, stating that though general body image concerns are low in India, “issues with appearance comparison on IG [sic] is substantially higher... suggesting the need to better understand appearance-based content shared in India and its impact on viewers”.
There's nobody to hold Facebook accountable. It has moved fast, and broken people. We need the government to intervene, and that's what Frances Haugen is demanding.
I hope I've been able to put my points across. This is Rohit, signing off!