India has to put in place rules that seek more transparency from technology companies. Domestic and global companies that use consumer behaviour data to deepen addictive behaviour must be scrutinised.
Privacy and data protection are often the key issues in debates over the regulation of social media giants.
However, an important dimension that needs more attention is the algorithms that decide, define, and drive online user behaviour.
Even as countries across the world battle social media giants over their lack of transparency and accountability, some governments have begun to question the algorithms too.
The United States Senate Judiciary Committee recently held hearings on “Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds.”
Like many countries, the US is concerned about algorithms that are designed to addict.
“... This advanced technology is harnessed into algorithms designed to attract our time and attention on social media, and the results can be harmful to our kids’ attention spans, to the quality of our public discourse, to our public health, and even to our democracy itself,” said Sen. Chris Coons (D-DE), chair of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law.
In the same way that India has its social media intermediary rules, the US has Section 230 of the Communications Decency Act, which offers platforms some immunity from liability for third-party content.
The Senate hearings could lead to amendments in Section 230.
Another senator at the hearing said that the business model of these companies “is addiction.”
A bill called the Don’t Push My Buttons Act has been introduced in the Senate, with a companion bill in the House that then Congresswoman Tulsi Gabbard co-sponsored.
The bill would require platforms with more than 10 million users to obtain users’ permission before serving them content selected on the basis of their past behaviour.
In essence, companies could no longer mine our behaviour to push us further into similar content.
Such algorithmic steering was believed to have been particularly harmful during the Brexit debate.
Rather than allowing people to explore and stumble upon new content and alternative views on a subject, the algorithms drove users into more of the same.
Effectively, this created online echo chambers and prevented people from absorbing other ideas.
The same principle can apply to consumer products or services.
Algorithms can drive consumers to certain brands and categories while reducing choice, thereby hurting competition.
Another piece of legislation, the Protecting Americans from Dangerous Algorithms Act, was reintroduced in March 2021 by Congresswoman Anna G. Eshoo and Congressman Tom Malinowski to “hold large social media platforms accountable for their algorithmic amplification of harmful, radicalising content that leads to offline violence.”
These bills seek changes to Section 230 that would remove the protection it offers to the giants if they persist with such algorithms.
Representatives of companies including Facebook, Google, and Twitter have testified at the Senate hearings on addictive algorithms.
While the hearings are focused on US citizens, governments in other countries should also be alert to the consequences of addictive algorithms.
As the Government of India establishes the rules of play for social media giants, it will be important to scrutinise and question addictive algorithms.
With an addressable market of over a billion users, tech giants will invest substantial resources in growing their user base in India.
The country’s variety of languages and users lends itself to algorithms that use personal data to greater effect.
India has to put in place legislation and rules that seek more clarity and transparency from technology companies.
Domestic and global companies that use consumer behaviour data to deepen addictive behaviour must be scrutinised and controlled.
Currently, the intermediary guidelines focus mostly on content management and grievance redressal.
However, the underlying software engines that influence online consumer behaviour need oversight, too.