As Bangladesh approaches its 13th national parliamentary elections, social media is likely to play an increasingly influential role in shaping public opinion. Alongside legitimate political debate, however, the pre-election period often brings a surge in false or misleading content designed to confuse voters, inflame tensions or undermine confidence in the electoral process.

In this environment, the ability to recognise disinformation is no longer optional. It becomes a basic civic skill. False claims, manipulated videos and misleading narratives can spread rapidly online, often faster than corrections or fact checks. Learning how to assess what you see before sharing or reacting can help protect both individual users and the wider democratic process.

Understanding what you are looking at

Before evaluating content, it helps to understand the key terms used by media experts. The BBC Media Action Bangladesh Election Reporting Handbook 2026 categorises false information into three main types:

  • Misinformation: false or inaccurate information shared without the intention to cause harm, often due to misunderstanding or error.

  • Disinformation: deliberately created and spread false information intended to deceive, manipulate or damage trust. This is the most serious threat during elections.

  • Malinformation: genuine information shared out of context or selectively, in order to mislead or harm an individual, group or institution.

A step-by-step guide to spotting misinformation

Before liking, sharing or acting on a sensational post, take a moment to pause and work through the following checks.

1. Interrogate the source
Who is sharing the information? Examine the account or page closely. Is it a recognised news organisation, an official institution or a personal account? Newly created profiles, incomplete biographies or pages with no posting history are warning signs. If the content claims to come from a media outlet, visit the outlet’s official website directly rather than trusting a social media page that could be an impersonation.

2. Inspect the content itself
Look for internal red flags. Does the headline or caption rely on extreme language designed to provoke fear, anger or outrage? Be cautious of claims that seem too shocking, too simple or too perfectly aligned with a particular political narrative. For images and videos, check carefully for poor editing, mismatched lighting, unnatural cropping or blurring around faces, all of which may indicate manipulation.

3. Check the date and context
Old content is frequently recycled during elections. An image or video from a previous poll, protest or incident may be reposted as if it were current. Always check when the content was first published. For videos, search key phrases online to see whether the same footage has appeared before under a different description.

4. Cross-check with reliable sources
This is the most important step. Do not rely on a single post. Search for the same claim using trusted national or international news outlets, official election bodies or established fact-checking organisations. If no credible source is reporting the story, it is likely false, misleading or significantly exaggerated.

Spotting AI-generated and manipulated media

Examples of AI-generated videos. Image: Dismislab

Synthetic media, including deepfakes, poses a growing challenge. Although the technology behind such fakes keeps improving, several telltale indicators remain useful:

  • Unnatural facial movement: irregular blinking, stiff expressions or lip movements that do not align with speech.

  • Audio inconsistencies: voices that sound robotic, lack natural pauses or do not synchronise properly with mouth movements.

  • Visual glitches: distortion around hairlines, glasses, jewellery or where the face meets the neck, as well as lighting that does not match the rest of the scene.


Understanding bots and coordinated networks

Not all misleading content spreads organically. Automated accounts and coordinated networks can amplify false narratives, especially during politically sensitive periods. Watch for patterns such as large numbers of identical comments posted simultaneously or multiple accounts sharing the same text word for word.
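For readers with a technical background, the short Python sketch below illustrates that last pattern: it groups comments by their exact wording and flags any text posted word for word by several different accounts within a short time window. The sample comments, account names and thresholds are invented for illustration only; they are not drawn from any real platform or dataset.

# Illustrative sketch only: flagging "copy-paste" comment bursts of the kind
# described above. The data and thresholds below are made-up examples.
from collections import defaultdict
from datetime import datetime

# Each comment: (account_name, text, timestamp) -- hypothetical sample data
comments = [
    ("user_a81", "Great leader, vote for him!", datetime(2026, 1, 5, 10, 0, 12)),
    ("rx_fan_22", "Great leader, vote for him!", datetime(2026, 1, 5, 10, 0, 15)),
    ("newsfan123", "Great leader, vote for him!", datetime(2026, 1, 5, 10, 0, 18)),
    ("rita.k", "Interesting article, thanks.", datetime(2026, 1, 5, 10, 3, 2)),
]

def flag_copy_paste(comments, min_accounts=3, window_seconds=60):
    """Flag texts posted word for word by many accounts within a short window."""
    by_text = defaultdict(list)
    for account, text, ts in comments:
        by_text[text.strip().lower()].append((account, ts))

    flagged = []
    for text, posts in by_text.items():
        accounts = {account for account, _ in posts}
        times = sorted(ts for _, ts in posts)
        posted_in_burst = (times[-1] - times[0]).total_seconds() <= window_seconds
        if len(accounts) >= min_accounts and posted_in_burst:
            flagged.append(text)
    return flagged

print(flag_copy_paste(comments))
# ['great leader, vote for him!']

Real coordination analysis is far more sophisticated, but the underlying idea is the same: identical wording from many unrelated accounts within a narrow time window is rarely organic.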

Suspicious accounts often have generic or misspelled names, stock or low-resolution profile images and little personal content. Watch out for suspicious activity, such as an unusually large number of 'haha' reactions on a post, strings of unrelated comments, or interactions from profiles with foreign-sounding names.

Examples of coordinated interactions from bot accounts.

Example of a bot account.


While bots are not the only drivers of disinformation, they can significantly increase its reach and impact.

Election-specific risks in Bangladesh

The upcoming election presents particular challenges. With mobile phones now permitted inside polling centres under a new directive, there is concern that old images or videos from previous elections showing irregularities or alleged rigging could resurface online. Shared without context, such material could mislead voters or erode trust in the current process.

At the same time, there is a risk that genuine evidence of irregularities from the current election could be dismissed as outdated or fabricated. To reduce confusion, citizens who document incidents are encouraged to include visible timestamps on photos or clearly state the date and time at the start of videos.

Coordinated online activity, particularly on platforms such as Facebook, may amplify misleading material to generate fear or cast doubt on the legitimacy of the vote. Awareness and careful verification are therefore essential.

Final thoughts

Disinformation thrives on speed and emotion. Its aim is often to provoke a reaction before reflection. A simple but effective habit is to pause and ask two questions: who benefits if I believe this, and why is it being shared now?

If you cannot verify a claim, the safest option is not to share it. Even sharing content to criticise or debunk it can unintentionally spread falsehoods further.

By applying these checks, you move from being a passive consumer of information to an active, responsible participant in the digital public sphere. In an election period, that responsibility matters more than ever.

This guide draws on verification frameworks and disinformation analysis adapted from the BBC Media Action Bangladesh “Election Reporting Handbook 2026” by Dr Md. Saiful Alam Chowdhury, associate professor, Department of Mass Communication and Journalism, University of Dhaka.


