There was a time when power waited patiently for answers.

Politicians waited for opinion polls. CEOs waited for focus groups. You picked up the phone or opened the door to a stranger with a clipboard, and they asked you what you thought. Your answer was stored, counted, and months later a decision was made. That was the world of the 20th century. Slow-moving. Ordered. Predictable.

But that world is gone.

Today, nobody waits to be asked anything. We do not whisper our opinions into small surveys. We shout them into the internet. We tweet, post, react, rant, joke, and argue in real time.

And as we do, machines listen.

This listening has a name: social media sentiment analysis. It is the quiet engine underneath politics, elections, business, news, and public life.

And whether we know it or not, it is shaping the world around us, sometimes reflecting our feelings, sometimes steering them.

From counting words to reading emotions

In the early days, sentiment analysis was very simple. Computers just counted words.

“Good,” “love,” “amazing” = positive.

“Bad,” “hate,” “awful” = negative.

But humans are not simple. We use sarcasm. We joke. We code-switch. Sometimes “sick” means “cool.” Sometimes “great job” means “you ruined everything.” The machines failed again and again.
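
To see why, here is a minimal sketch of that word-counting approach in Python. The word lists and example sentences are made up for illustration; they are not any real product's lexicon.

```python
# A minimal lexicon-based scorer: count positive words, subtract negative ones.
# Word lists and sample sentences are illustrative only.
POSITIVE = {"good", "love", "amazing", "great"}
NEGATIVE = {"bad", "hate", "awful", "terrible"}

def naive_sentiment(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(naive_sentiment("I love this amazing phone"))         # 2  -> "positive"
print(naive_sentiment("That concert was sick"))             # 0  -> misses slang for "great"
print(naive_sentiment("Great job, you ruined everything"))  # 1  -> misses the sarcasm
```

The last two lines are exactly where the early systems broke: slang and sarcasm carry feeling that no word list can see.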

So the technology evolved.

Newer AI systems, like BERT and today's large language models, began to read context. They no longer saw words floating alone. They understood tone. They recognized slang. They learned political language. They even began to understand when different groups used the same word in different emotional ways.

And now, AI doesn’t only judge whether we are positive or negative. It tries to detect anger, fear, joy, disgust, shock, love, and even sarcasm. It reads video and listens to audio. It can analyze millions of posts in seconds.
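
For contrast, here is a rough sketch of how the same sentences might be scored by a context-aware model, assuming the open-source Hugging Face transformers library is installed; the exact labels and scores depend on which pretrained checkpoint is loaded, and even these models can still stumble on heavy sarcasm.

```python
# A sketch of context-aware scoring with a pretrained transformer.
# Requires: pip install transformers torch
from transformers import pipeline

# Loads a default BERT-family model fine-tuned for binary sentiment.
classifier = pipeline("sentiment-analysis")

for text in ["That concert was sick",
             "Great job, you ruined everything"]:
    result = classifier(text)[0]   # e.g. {"label": "NEGATIVE", "score": 0.98}
    print(f"{text!r} -> {result['label']} ({result['score']:.2f})")

# Emotion detection (anger, fear, joy, disgust, ...) works the same way,
# by loading a checkpoint fine-tuned on emotion labels instead of sentiment.
```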

Most importantly, it doesn’t wait for permission.

We don’t give answers. We leak them.

India: The ground war of encrypted politics

If the United States is the loudspeaker of global politics, India is the whisper network.

In India, the heart of political life does not live on Twitter or TikTok. It lives on WhatsApp, a closed, encrypted world of group chats among neighbours, families, priests, activists, and party workers.

Political parties organize vast networks of volunteers who run local WhatsApp groups like micro-news channels. Messages move from the national level to states, districts, and finally neighborhoods.

Sometimes these messages are news.

Sometimes they are opinions.

Sometimes they are rumours.

Sentiment analysis tools cannot directly see inside WhatsApp. So analysts track what leaks out: hashtags, keywords, timing patterns.

A single instruction sent through thousands of groups can cause a flood of identical tweets at the same second, creating a “trend.”

From the outside it looks like public opinion.

Inside, it is often organized choreography.
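
One rough way analysts probe for that choreography is to group posts by their text and check how tightly the timestamps bunch together. The sketch below uses invented posts, field names, and a 60-second window purely for illustration; it is not any platform's real API or any party's real data.

```python
# A minimal sketch of spotting a coordinated burst: many near-identical posts
# landing within seconds of each other. All data and thresholds are invented.
from collections import defaultdict
from datetime import datetime

posts = [
    {"text": "Vote for change #OurTime", "time": "2024-01-05 10:00:01"},
    {"text": "Vote for change #OurTime", "time": "2024-01-05 10:00:02"},
    {"text": "Vote for change #OurTime", "time": "2024-01-05 10:00:03"},
    {"text": "Lovely weather today",      "time": "2024-01-05 10:07:45"},
]

groups = defaultdict(list)
for post in posts:
    key = " ".join(post["text"].lower().split())   # normalize case and whitespace
    groups[key].append(datetime.fromisoformat(post["time"]))

for text, times in groups.items():
    spread = (max(times) - min(times)).total_seconds()
    if len(times) >= 3 and spread <= 60:            # 3+ copies within a minute
        print(f"Possible coordinated burst: {len(times)} copies of '{text}' in {spread:.0f}s")
```

Organic opinion tends to trickle in over hours; choreographed opinion tends to arrive all at once.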

But this machine does not only help parties win elections. It also shapes policy. When the Indian government proposes a law and social media turns sharply negative, leaders sometimes withdraw the policy within days.

Is that democracy working? Or democracy panicking?

The line grows thinner every year.

Bangladesh: Sentiment, rumours, and political pressure

In Bangladesh today, this digital ocean of emotions is not just chatter; it is politically consequential and measurable. Well before the upcoming election, social media has already become a live arena where public mood, party narratives, and voter expectations shift in real time.

Online sentiment patterns reveal a clear emotional tilt toward opposition politics. Across digital conversations, the Bangladesh Nationalist Party (BNP) dominates positive and mobilizing sentiment, accounting for roughly four out of every 10 expressions of electoral preference.

Jamaat-e-Islami follows with a substantial share of approval-driven discourse, particularly in conversations framed around governance reform, moral authority, and protest politics. By contrast, sentiment surrounding the ruling Awami League skews markedly more negative, with approval signals lagging in the mid-teens and discussion often shaped by frustration, fatigue, and distrust.

Just as powerful is expectation sentiment. A strong majority of online political discussion reflects the belief that the opposition is more likely to win the next election. That belief matters.

When people think political change is coming, they speak louder, share more aggressively, and persuade undecided voters with confidence. Momentum becomes emotional before it becomes electoral.

Yet this landscape is volatile. Political propaganda and poll-related misinformation surge across Facebook, X, and TikTok, spreading faster than verified information.

Fabricated quotes, selectively edited videos, and exaggerated claims spike around key moments such as court rulings, arrests, protests, and speeches, intensifying fear, anger, and moral outrage.

Fact-checking patterns consistently show that politics dominates misinformation ecosystems, and that emotionally charged content travels farther than corrections. False narratives often land first. Clarifications arrive late.

The consequences are visible. Rumours move so fast that leaders are frequently forced to deny events that never happened. Silence is read as guilt. Delay as weakness. Nuance as evasion.

What makes Bangladesh particularly sensitive to this dynamic is scale. For millions, especially outside elite urban circles, Facebook functions as a primary source of political information. Algorithms reward engagement, not deliberation. Sensational and emotional content outruns policy discussion every time.

As a result, the emotional tone of political discourse online is no longer just reflected in sentiment dashboards; it is shaped by them. Shifts in online mood increasingly function as proxies for political momentum. For parties racing toward the election, understanding these emotional undercurrents is no longer optional. It is now part of the terrain of power itself.

The US: The open-air sentiment battlefield

In the United States, nothing is quiet. Politics lives in the open, feeding algorithms that love conflict. On TikTok, Instagram, and X, political influencers now outrank traditional news outlets in reach.

And here is the secret: Anger travels faster than facts.

Sentiment skews negative by design. Outrage gets likes. Sarcasm gets clicks. Fear spreads. Campaigns monitor this river of emotion not just to react but to target.

Who is anxious? Who feels ignored? Who is unsure?

Each group receives different messages, carefully crafted to press emotional buttons. This is where sentiment analysis meets psychology. The tools no longer study “the public.” They study you.

Not to learn. But to persuade. Or sometimes, simply to discourage you from voting at all.

Business learns the hard way

If politics uses sentiment to chase votes, companies use it to protect profits.

Sometimes they fail.

When Bud Light partnered with a transgender influencer, sentiment algorithms detected a wave of negative emotion that didn’t fade. It hardened. Sales collapsed. The brand lost billions in market value. Silence made it worse. Online outrage escaped the internet and entered bars, grocery stores, and homes.

The lesson was brutal: In the age of sentiment, delay is death.

But sentiment can also save.

When the Sonic the Hedgehog movie trailer came out, audiences hated the character design. The data was overwhelming. Instead of ignoring it, the studio changed course. They redesigned the character. The movie became a hit.

This wasn’t a focus group. This was the internet as a co-creator.

Moments of joy

When Taylor Swift began appearing at NFL games, the emotional map of American football changed overnight. New fans flooded the conversation. Sentiment soared. The NFL embraced it. The result was a surge in viewership and hundreds of millions in brand value.

Emotion, when harnessed, can print money.

So what happens when leaders check sentiment dashboards like weather apps?

On one hand, this feels democratic. Power hears the public instantly. People once invisible now appear on the radar.

But there is a darker side. Bots can fake outrage. Influencers can engineer trends. Campaigns can plant emotions like seeds.

And because leaders fear negative waves, the loudest emotions often win -- not the wisest ideas. Long-term policies die because short-term anger spikes.

Are we listening to the people? Or are we being trained by a machine that rewards outrage?

Here is the most important shift: We once measured public opinion. Now we create it.

Platforms amplify posts not because they are true, but because they trigger emotion. Those emotions feed sentiment dashboards.

Leaders react. Their reaction fuels more posts. The machine loops.

The observer and the observed now shape each other.

The machine does not just listen to the crowd. It becomes part of the crowd.

The question we must ask

Sentiment analysis can make leaders more accountable. It can help companies correct mistakes. It can warn us of crises before they happen.

But it can also manipulate feelings. It can silence nuance. It can turn democracy into a mood ring.

So we must decide. Do we want technology that listens? Yes.

But we must also demand technology that respects truth, privacy, and human dignity. We must remember that behind every data point is a real person -- with fears, dreams, bias, humour, sarcasm, and contradiction.

The machine may read emotion. But it does not feel it.

And if we forget that, the real danger is not that algorithms will replace opinion polls.

The real danger is that one day, we may mistake the algorithm for the people themselves.

Nawrin Sultana is a Bangladeshi-Canadian marketing consultant, blending her cultural roots with a global perspective. [email protected].


