The Model Code of Conduct (MCC) is a set of guidelines intended to regulate political parties and candidates during elections so that the Election Commission of India (ECI) can conduct polls freely and fairly. Political parties and candidates voluntarily agree to abide by it. Divided into eight parts, the guidelines cover dos and don'ts for electioneering as well as for the publication of advertisements at the cost of the exchequer, among other things. "The MCC started its journey from Kerala in 1960 and has now come of age," says former chief election commissioner (CEC) OP Rawat. The code comes into effect from the time the EC announces the schedule of elections and remains in force till the results are announced.
IS MCC LEGALLY ENFORCEABLE?
The MCC by itself is not legally enforceable, but the EC draws power from other legislation such as the Representation of the People Act, 1951, and the Indian Penal Code (IPC). There have been discussions on whether the code should have legal backing, but the commission was not in favour of it. "The idea was that legal backing would make it stronger. But a law would mean it would be subject to judicial processes. Under the MCC, we expect responses in 24 to 48 hours, but that would not happen with an FIR," says former CEC SY Quraishi. A Supreme Court ruling of 1977 clarified that the commission can regulate any grey area not covered by legislation. Rawat says the commission took shelter under this ruling when it annulled an assembly by-election in Tamil Nadu in 2017 after finding that a large amount of cash had been distributed in the constituency, a decision upheld by the SC.
SHOULD MCC BE UPDATED TO COVER SOCIAL MEDIA PLATFORMS?
Opinion is somewhat divided on whether specific rules targeting violations online are necessary. One reason is that violations like the use of hate speech in campaigns are already covered under existing legislation. “To my mind, the Model Code of Conduct is perfect. Causing enmity or creating disturbances, which will be the consequence of social media posts as well, is covered under the MCC. The problem is with enforcement. It has to be enforced equally and forcefully,” says Quraishi.
Rawat says there should be legislation to deal with challenges from artificial intelligence (AI), since the MCC does not have legal backing. "There is a total void in our electoral rules as far as futuristic challenges such as the threat from AI are concerned. So far, the EC has only taken executive action, like in 2018 and 2019, when social media platforms agreed to take down anything objectionable within 24 hours of election officials raising it," says Rawat. Anupama Roy, professor at the Centre for Political Studies, JNU, and coauthor of The Election Commission of India: Institutionalising Democratic Uncertainties, suggests there could be a voluntary commitment from social media platforms and intermediaries, as there was in 2019, which could be reinforced in the context of new developments. "A commitment would have to be made and reiterated by social media platforms and, of course, the EC would have to play an active role in ensuring it happens," says Roy.
However, other experts point to several loopholes in the current system. For one, there is very little adherence to the "silence period", the 48 hours before the end of polling during which no public meetings or announcements should be made, or to the code-of-conduct provisions specific to hate speech, says Joyojeet Pal, associate professor, School of Information, University of Michigan. "The idea of the campaign period ending makes little sense at this point because social media is constantly active, and campaigning is not something that happens only from a podium. We need to adapt to that reality. Also, the idea of politicians or known party members being the only people responsible for a political campaign is largely meaningless, because modern campaigns have evolved to be much more decentralised."
Nikhil Pahwa, founder of Medianama, a digital platform for information and analysis on technology policy in India, says some candidates use the silence period to promote content published before it began. He says, "In recent elections, we have seen that political content remains active even after the MCC prohibits further campaigning. So, what happens to a video of a politician's speech that is promoted after the campaign has officially ended? Essentially, the campaign continues, albeit not in real time. This method of time-shifted campaigning is something the EC hasn't addressed yet, and I'm aware that some political parties have employed this tactic."
HOW DID THE EC AND SOCIAL MEDIA PLATFORMS TRACK VIOLATIONS IN 2019?
To combat the misuse of social media during the 2019 elections, social media platforms and the Internet and Mobile Association of India (IAMAI) agreed to adhere to a voluntary code of ethics, which included guidelines such as processing complaints of electoral law violations within three hours of their being flagged and creating a dedicated team to interact with the EC. The Media Certification and Monitoring Committee (MCMC), which is supposed to monitor and regulate political advertisements in the electronic media, was reconstituted that year to include social media experts at both district and state levels.
In 2019, the EC reported 154 instances of fake news and misinformation on social media platforms. However, some say substantial enforcement by the EC, and real consequences for online violations of the MCC, are still awaited. "One of the most difficult aspects of the internet is attributing wrongdoing. Even when you identify misconduct, it often comes with built-in plausible deniability," says Pahwa.
ARE THERE CONCERNS OVER THE POTENTIAL USE OF AI/DEEPFAKES IN GENERAL ELECTIONS?
A new report by the World Economic Forum warns that AI-enabled misinformation and disinformation may radically disrupt electoral processes in several economies, including India. Tools to create these videos are becoming easily available and are often free. The EC is cognisant of the threat deepfakes pose, with CEC Rajiv Kumar speaking about “the disturbing trend of deepfake narratives” becoming a common feature in elections worldwide at a conference in January last year. (The EC did not respond to emailed questions from ET.)
Former CEC Rawat warns that misuse of AI will pose a major threat in the coming general elections. In December 2023, the Ministry of Electronics and Information Technology issued an advisory mandating all intermediaries to communicate to users clearly about prohibited content, particularly those specified under Rule 3(1)(b) of IT Rules. This rule aims to ensure that platforms identify and promptly remove misinformation, false or misleading content, and material impersonating others, including deepfakes, a government statement said.
HOW ARE SOCIAL MEDIA PLATFORMS GEARING UP FOR ELECTIONS?
With much of the campaigning set to unfold on social media platforms like Meta’s WhatsApp and Instagram, and Google’s YouTube, all eyes will be on how these platforms tackle violations. YouTube says it will roll out updates that will alert viewers to synthetic content. Content creators on YouTube will be mandated to disclose if their content is AI-generated or significantly altered. Noncompliance could lead to content removal or other penalties, including suspension from the YouTube Partner Program. YouTube also plans to introduce new labels on the video description panel and more noticeable labels for sensitive content to inform viewers about potential alterations.
A spokesperson from the company emphasised that dedicated, election-focused teams are working to ensure compliance with local laws and policies, including the MCC. Meta, in a blog post from November last year, outlined its plans for the 2024 elections globally. The company will require advertisers to disclose the use of AI or digital methods in creating or altering political or social issue ads in certain cases, a policy that is also expected to be implemented in India.
However, the spread of deepfakes on end-to-end encrypted platforms such as WhatsApp and Signal is impossible to prevent, and detection and removal of deepfakes can never be foolproof, says Pahwa. "If there's watermarking on videos, there are tools that allow you to remove that." He adds that it is difficult for platforms to detect deepfake videos with 100% accuracy across billions of pieces of content. "I think it's an impossible task. A fact-checking video that contains a deepfake clip might also get impacted in the process," he says.
Pahwa says the responsibility of enforcing rules falls disproportionately on platforms. "The EC should be responsible for this enforcement but, frankly, its track record in enforcement has been quite weak. Essentially, the (current) approach burdens platforms with all the responsibility, while individuals who create disinformation aren't held accountable," he says.
WHAT MORE CAN BE DONE?
When it comes to regulating hate speech and misinformation online, a combination of greater public awareness and increased transparency from social media platforms could help address some of these issues. Pal of the University of Michigan emphasises the role social media companies can play by providing public application programming interfaces (APIs) to researchers. This would allow independent entities to oversee issues like hate speech and misinformation. "The only thing that social media companies can do for citizens is to be transparent about what politicians and political advertisers are doing on their platforms, and the best way to do this is to institute requirements for making advertising data public alongside building open APIs, because all social media campaigns cannot be traced to explicit advertising," he says.
On the issue of modifying MCC, analysts stress the need to enforce the existing code fairly, to instil public confidence. “How does the body deal with complaints that come before it? Is there a feeling that there is swift response to complaints about certain types of practices and parties and people, whereas similar complaints of others are ignored? Such a perception would hinder fostering confidence in the Model Code of Conduct,” says Sandeep Shastri, national coordinator, Lokniti Network, and director–academics, Nitte Education Trust.
Jagdeep Chhokar, founding member of the nonprofit Association for Democratic Reforms, concurs: "The biggest concern with the MCC is that the EC has of late not been consistent in its application. It must be implemented in the right spirit and evenhandedly. Otherwise, modifying the MCC will be an infructuous exercise till the whole system is corrected."