Image by Chatgpt.com

“When algorithms decide who gets seen, do some stories vanish before they even start?” As Instagram comes to define the online self, visibility has become currency. Who gets seen, and who remains in the algorithm's shadows? Recent studies point to an uncomfortable pattern: reach that is gendered and gated by location, with some creators super-boosted and others quietly stifled by Instagram. The effects show up in creators' reach, relevance, and revenue. This is not merely a technical quirk; it is a matter of design. And it may not be just.

What Is Algorithm Bias?

Instagram's algorithm is not accidental. It decides what appears in your feed based on engagement, viewing history, and content type. But researchers are increasingly finding that bias in the model's inputs or design produces unfair patterns of visibility.
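To make that concrete, here is a deliberately simplified sketch of an engagement-weighted ranker, written in Python. Instagram's real system is proprietary, so every feature name, weight, and number below is a hypothetical stand-in; the point is only to show how skewed engagement signals translate directly into skewed visibility.

```python
# Hypothetical toy ranker: NOT Instagram's actual algorithm.
# Feature names, weights, and numbers are invented for illustration.

WEIGHTS = {"like_rate": 0.4, "watch_time": 0.4, "topic_affinity": 0.2}

def rank_score(post: dict, viewer_affinity: dict) -> float:
    """Score one candidate post for one viewer."""
    # Engagement signals: if historical engagement is skewed, so is this score.
    engagement = (WEIGHTS["like_rate"] * post["like_rate"]
                  + WEIGHTS["watch_time"] * post["watch_time"])
    # Affinity rewards topics the viewer already engages with,
    # which is how early skew in a feed gets reinforced.
    affinity = WEIGHTS["topic_affinity"] * viewer_affinity.get(post["topic"], 0.0)
    return engagement + affinity

# Two invented posts of comparable quality but unequal starting engagement.
candidates = [
    {"id": "english_beauty_reel", "topic": "beauty", "like_rate": 0.9, "watch_time": 0.8},
    {"id": "tamil_poetry_reel",   "topic": "poetry", "like_rate": 0.3, "watch_time": 0.7},
]
viewer_affinity = {"beauty": 0.9, "poetry": 0.1}  # learned from past behaviour

feed = sorted(candidates, key=lambda p: rank_score(p, viewer_affinity), reverse=True)
print([p["id"] for p in feed])  # the already-popular post always ranks first
```

Nothing in this toy model mentions gender or region, yet because its inputs encode past behaviour, any bias in that behaviour is reproduced and amplified.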

A 2024 investigation by Guardian Australia found that brand-new Instagram accounts, set up to be as neutral as possible, were dropped into echo chambers almost immediately: within a few hours, the feeds shown to women were overwhelmed by beauty, weight-loss, and sexualised commercial imagery.

This suggests the algorithm does not merely inherit bias; the bias is built in.

For Indian users, the picture is even more layered. With regional-language creators and marginalised gender identities already at a disadvantage, Instagram's algorithm may be reinforcing societal hierarchies rather than disrupting them.

Gendered Engagement vs Gendered Suppression

A 2023 study published in Nature indicates that the visual algorithms used by social platforms are more biased against women when ranking images than when ranking text. Women stand a better chance of being seen only if they meet a narrow standard: slender, pale, and fashionable.

Indian influencers have reported constant pressure to "look a certain way" to be favoured by the algorithm. “You’re rewarded for being pretty, not for being real,” said a Bengaluru-based female creator in a recent interview.

Male creators posting fitness, comedy, or business content, on the other hand, reportedly get more baseline reach with less-curated output.

And it is not just gender; it is also where you post from.

Regional Bias: The Metro Bubble

Creators based in Delhi, Mumbai, or Bangalore report faster follower growth and better discoverability. Creators from Tier-2 and Tier-3 towns, meanwhile, tend to disappear behind the algorithm unless they switch to English or trending audio.

A paper presented at the ACM CHI Conference (2024) found that LGBTQ+ and tribal-related content from smaller Indian towns was far less likely to be surfaced by Instagram's feed mechanisms, owing to lower initial engagement and the algorithm's assumption of low audience relevance.
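That feedback loop can be sketched as a toy simulation. The numbers and the exposure rule (each round's impressions proportional to the previous round's interactions) are assumptions for illustration, not Instagram's actual mechanics.

```python
# Toy model of a "rich-get-richer" ranking loop; all numbers are invented.

def simulate_reach(appeal: float, boost: float = 2.0,
                   impressions: float = 100.0, rounds: int = 8) -> list[int]:
    """Track impressions when each round's exposure scales with the last round's interactions."""
    history = []
    for _ in range(rounds):
        interactions = impressions * appeal   # per-view appeal stays constant
        impressions = boost * interactions    # ranking rewards prior engagement
        history.append(round(impressions))
    return history

print(simulate_reach(appeal=0.6))  # above the break-even point: reach compounds
print(simulate_reach(appeal=0.4))  # below it: reach decays toward invisibility
```

Two posts of similar quality diverge sharply simply because one starts above the loop's break-even point and the other below it, which is roughly the pattern creators from smaller towns describe.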

As a result, many local creators end up code-switching: adopting metro slang, changing their aesthetics, or dropping their native dialects simply to fit in.

The Algorithm is Watching—and Shaping Us

The hashtag #AlgorithmBias has been gaining momentum on Twitter (now X), where creators describe how their reach takes a nosedive the moment they post about politics, caste, sexuality, or region.

Some users reflected:

  • “As a plus-size model from Bihar, I can tell when I’m being shadow-banned.”
  • “My poetry in Tamil barely gets 1/4th the reach compared to English reels I don’t even enjoy making.”
  • “My comedy went viral once I changed my bio from ‘small-town creator’ to ‘digital entertainer.’ Coincidence? I don’t think so.”

Meanwhile, subreddits such as r/InstagramMarketing host threads dissecting suspected shadow-ban patterns, in which post rankings seem to hinge on specific keywords, the clothes a person wears, or even their skin colour.

Even prominent feminist groups have begun documenting algorithmic suppression when they promote activism or minority stories.

What This Means for Creators—and the Internet

  1. Visibility Isn’t Neutral
    The algorithm is not a mirror; it amplifies preferences. And when those preferences are already biased, whole groups of voices are silenced, particularly women, queer creators, and regional artists.
  2. Creators Are Adapting, But at What Cost? 
    The pressure to post in English, chase viral trends, or look like everyone else is real. To survive the algorithmic game, many creators are changing their style, their content, or even their identity.
  3. Representation Begins with Recognition
    If an algorithm only surfaces a woman in a hijab or a boy performing in a dhoti when they fit a trend, diversity never reaches the feed. Visibility algorithms need to be audited and diversified.
  4. We Need Transparency
    Social media is no longer just entertainment; it is infrastructure. Platforms such as Instagram must be accountable for how their AI systems rank people's content, and with it their livelihoods; the sketch after this list shows the kind of reach disparity an outside audit could look for.
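As a loose illustration of that accountability point, here is a minimal, hypothetical audit sketch in Python. The group labels and impression counts are invented; a real audit would need data that only the platform, or a large-scale independent study, can provide.

```python
# Hypothetical audit sketch: compare average impressions across creator groups.
# All data below is made up for illustration.

from collections import defaultdict

posts = [
    {"group": "metro_english",  "impressions": 12000},
    {"group": "metro_english",  "impressions": 9000},
    {"group": "tier3_regional", "impressions": 1500},
    {"group": "tier3_regional", "impressions": 2200},
]

totals, counts = defaultdict(float), defaultdict(int)
for p in posts:
    totals[p["group"]] += p["impressions"]
    counts[p["group"]] += 1

averages = {group: totals[group] / counts[group] for group in totals}
print(averages)
# A large, persistent gap for content of comparable quality is the kind of
# disparity an external audit would flag and ask the platform to explain.
```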

FINAL THOUGHT

Instagram can feel like a digital sandbox, but the algorithm is not just math; it is policy. It decides whose star rises and whose fades. And right now, one thing is clear: not all creators are seen equally.

And if you are a woman, a nonbinary artist, or a poet from Jharkhand, you might have to fight harder to be seen. Not because your content isn't good enough, but because the system was never built with you in mind.

"The invisible censorship is the new one. Because it puts on the mask of code.”

.    .    .
