We thought that progress would arrive with glowing screens, data charts, and auto-graded papers.
It did.
We thought it would come with personalised modules and applications that whispered lessons in regional accents.
It did.
We celebrated when chalkboards turned into tablets, when questions became clickable, and when a child in rural Jharkhand could finally “access” the same resources as one in Delhi.
Yes, we call it a revolution. We call it equity. We call it SMART.
But, amid all the applause, did anyone hear the silence?
The silence of a teacher who now watches her students through video calls that glitch, unsure if they’re confused or just muted. The silence of a child who can’t speak up when a bot gets it all wrong. The silence between personalised content and the un-personalised loneliness that followed. We replaced hands-on guidance with algorithms. We outsourced encouragement to notifications. And somewhere in this automation, we forgot that learning was always meant to be a human experience.
Yes, the transformation is real, and India did need scale. It did need reach and innovation. With 260 million students and not enough teachers, AI truly felt like a miracle wrapped in code. And maybe, in part, it even is.
Adaptive learning platforms like Byju’s, Toppr, and even government efforts under NEP 2020 and PM e-Vidya are bold attempts to democratise learning. A child in a village can now answer the same quiz as one in a city. A teacher, overburdened and underpaid, finally has time to breathe.
But here’s the truth we skip in headlines:
AI can grade an answer sheet. It cannot sense the tears behind a wrong answer. It can track performance. But it cannot track confidence. It can teach. But it cannot care. The emotional scaffolding that held up our classrooms is cracking under the weight of automation. And then there’s the other elephant in the room: the digital divide. For every child learning with AI, there’s one without a device. Without signal. Without help. How do we call it inclusion when some children still stand outside locked screens?
This is not a rant against machines. This is a plea—a plea for balance.
AI can assist, guide, and even inspire. But it must never become the teacher, because teaching isn’t just about transferring knowledge; it is about holding space for curiosity, for fear, and for joy. And as India marches towards an AI-powered future, let us not forget what made our past so powerful: the human presence in every lesson, the voice that said ‘it’s okay’, and the eyes that noticed when a child was struggling quietly.
Let us build a future that remembers how to FEEL.
Screens Don’t See Fear:
They say the algorithm knows us. Knows what we need to revise. Knows which concept we haven’t mastered. Knows how fast we read.
How long we stare at the screen.
How many mistakes we make.
But there’s also something that it doesn’t know. It doesn’t know that I freeze every time I open my math module, not because I’m weak, but because the last time I had a doubt, all I got from the app was silence instead of comfort. It doesn’t know that my friend pretends to click through her lessons just to hide the fact that she doesn’t understand anything anymore. It doesn’t know that fear doesn’t show up in analytics. That shame doesn’t appear on dashboards. That loneliness can’t be predicted by an AI. We have built machines to measure performance, but what they can’t measure is the pain we hold inside. In a real classroom, a teacher might have noticed a flicker in the eyes, a lowered head, the way a hand doesn’t rise even when the answer is known. A teacher might have paused the lesson, asked, “Are you okay?”, and offered a smile. She might have given a second chance, a moment of care.
The app won’t. It will recommend a harder quiz. Push a new video. Award a badge. To the system, everything looks fine, if not better. We confuse performance with progress. But if children are learning to stay silent in their confusion, are we truly educating them? If fear becomes invisible, is it still harmless? In our rush to digitise education, what we forgot is that understanding doesn’t come from speed or from scores. It comes from safety. And no screen has ever been soft enough to offer that.
The Teachers We Tried to Replace:
They told us AI would do what humans couldn’t. Correct papers in seconds. Track student data flawlessly. Work without rest. Teach without bias. They said it like it was a good thing. But when we handed over the chalk, the warmth went with it. Because teachers were never just content delivery machines. They were mirrors. Witnesses. Sometimes, the only adult in a child’s life who believed they could be more. Who noticed when a normally chatty student went quiet. Who stayed back after class to ask if everything was okay. Who stood between a student and their worst day with nothing but a voice and a presence.
And guess what? We tried to replace that with AI.
It’s not that technology is wrong. It’s that we believed it could do everything. We gave it tasks beyond its reach and expected it to understand hesitation, offer kindness, and inspire confidence. But empathy has no algorithm. We can’t code the patience of a teacher who explains the same thing five times and still says, “It’s okay, let’s try again.” We can’t simulate the moment when a teacher locks eyes with a student and both of them just know: this time, it clicked. And what happens to students who never get that moment? Do they keep learning? Or just keep clicking?
A screen might show progress. But only a teacher can show belief. So maybe we never needed to replace teachers. Maybe we just needed to support them. Give them tools, not substitutes. Assistants, not replacements. Data, not decisions. Because when a student remembers who changed their life,
They rarely say “the app.”
They say, “My teacher.”
And some things are too sacred to automate.
When Learning Became a Login:
Once upon a time, learning was messy. Voices clashed. Desks creaked. Laughter spilt between the lessons. You raised your hand not just to answer, but to be seen.
To say: I’m here, ma’am. I’m trying.
Now, it’s just… a login.
Click.
Start module.
Finish the quiz.
Repeat.
No noise. No chaos. No eye contact.
The classroom has shrunk into a rectangle on a screen, and the students have shrunk with it. They don’t interrupt anymore. They don’t ask why. They just scroll. We mistook silence for focus. Obedience for understanding. And forgot that real learning is loud, uncomfortable, even a little wild. Now, when a student struggles, the system suggests “revision mode.” But what if what they need isn’t another video? What if they need someone to look them in the eye and say, “You’re not stupid. This is just hard.” Because sometimes, that’s all it takes to stop a child from giving up. But AI doesn’t say that. It never will. We’ve created perfect user journeys, but we forgot that real education is rarely linear. It zigzags through confusion and breakthroughs. It stumbles, cries, laughs, and then restarts. And when we remove that mess, we risk removing the magic.
Yes, AI offers access.
Yes, AI gives structure.
But if education becomes just a system to be navigated, we lose the part that truly has the potential to transform people.
We need to ask ourselves:
Are we teaching children how to answer questions or how to ask them? Because curiosity doesn’t live in dashboards. It lives in messy moments. Moments that AI doesn’t know how to hold.
Not JUST a Screen Problem:
In the grand conversation about artificial intelligence in education, most of the attention tends to circle around academic performance, efficiency, or access. We discuss how AI can personalise learning, improve student outcomes, and even transform traditional education models. But rarely do we pause to ask a simpler, more physical question—how does all of this feel in a student's body and mind?
Behind the glossy user interfaces and interactive dashboards lies a quieter, less glamorous reality: our increasing dependence on screens is beginning to hurt more than help. And while this isn’t exclusive to AI-based education, it’s certainly accelerated by it.
Let’s begin with the most obvious: fatigue. In traditional classrooms, students are encouraged to engage with their surroundings, ask questions, move around, and interact with others. There are breaks—natural pauses in attention when a teacher looks away, walks around the room, or answers another student. These pauses, while subtle, are necessary resets for the brain.
In contrast, AI-based platforms are designed for non-stop engagement. Lessons flow from one to the next, questions pop up as soon as the previous one is answered, and there’s constant visual and auditory stimulation. The screen never blinks. For a young mind, this is like trying to keep running without ever catching your breath.
Over time, this creates a sense of mental dullness. Students begin to feel like they’re absorbing information not because they’re curious, but because the system keeps feeding it to them. Learning becomes passive, mechanical, and—most dangerously—emotionally disconnected.
Then there’s the issue of attention. AI tools are built to keep students “hooked,” often using gamified methods: badges, streaks, scores, and ranks. While these may seem motivating at first, they condition the mind to expect rewards constantly. The moment the screen goes away, real-life learning—reading a textbook, thinking through a problem on paper, even listening to a teacher—feels less exciting. Boring, even.
The result? Shortened attention spans. Students find it harder to stay focused in offline environments because their brains are being trained for rapid-fire interaction and instant gratification. It’s like switching from fast food to a slow-cooked meal—you might not have the patience anymore.
And we haven’t even talked about the body. Eyes burn. Necks ache. Backs slump. The human body wasn’t built to sit still and stare into blue light for five to six hours a day. Yet, this is what AI-enhanced digital learning often demands. Unlike classrooms, where there's movement—turning to a peer, walking to the board, standing up to ask something—AI learning environments often chain students to their seats.
Even subtle physical movements, like flipping a page or sketching in the margins of a notebook, are lost in the digital world. These small things might seem insignificant, but they’re vital sensory experiences. They ground us. They keep learning tactile and alive. Take them away, and education becomes abstract, sterile, and remote.
Let’s also consider the emotional climate. In a classroom, there’s laughter, banter, shared anxiety before a test, relief after one. These are not just moments of social interaction—they’re moments of human regulation. Being around others helps students calibrate their emotions, share their stress, and feel seen.
AI, for all its intelligence, cannot replicate this. A bot that congratulates you for completing a level doesn’t carry the same emotional weight as a teacher who genuinely praises you. And when a student fails or feels stuck, the algorithm offers a solution, but not comfort. Not reassurance. Not the presence of a fellow human who says, “It’s okay. Try again.”
Over time, this lack of emotional texture can make learning feel isolating. Students start to internalise every mistake. There’s no outlet, no real-time community, no shoulder to lean on. The experience becomes solitary, even when it's dressed up with digital avatars and happy sound effects.
So no, this is not just a screen problem. It’s a human experience problem. It’s about how our bodies, minds, and hearts are reacting to an education system that is slowly removing the things that made learning a fully lived experience.
If we want AI in education to succeed—not just statistically, but soulfully—we need to rethink how it's designed, implemented, and timed. We need learning to remain grounded in movement, rest, connection, and care.
Because at the end of the day, education is not just about content. It’s about how we feel while learning. And no machine, no matter how advanced, should ever make us forget that.
The Myth of Objectivity:
In the rush to embrace artificial intelligence in education, one of the most dangerous assumptions we've made is that AI is objective. That it is neutral. That, unlike a tired teacher or a flawed curriculum, it will treat every student fairly. We imagine AI as this grand, emotionless machine, unaffected by mood, fatigue, or bias. But here's the unsettling truth: AI doesn’t eliminate bias. It just hides it behind code.
When we say a system is “objective,” we assume it has no opinion. But AI is not born in a vacuum. It is trained on data. And that data? It comes from humans. It’s scraped from textbooks, student behaviour patterns, academic assessments, online interactions, and sometimes even social media. All of it is tangled in human assumptions about intelligence, success, language, and ability.
Let’s take a small example. Imagine an AI tool that predicts which students are likely to fall behind in math. Sounds useful, right? But if the model was trained on students from urban English-medium schools, it might automatically flag a rural or regional-language student as “weak” simply because their performance doesn’t align with the original training data. That’s not objectivity. That’s prejudice in disguise.
It gets more complicated when you consider language bias. English is often seen as the gold standard in Indian education technology. Most AI-driven learning platforms use English as the primary language, or offer Hindi as a distant second. But what happens to students whose first language is Tamil? Or Bengali? Or Marathi? They are left learning in a language that may never fully reflect their thinking patterns. As a result, they often score lower, not because they don’t understand the subject, but because the AI doesn’t understand them.
Now imagine this happening silently. A student keeps getting questions that feel unnatural. She doesn’t know that the system was trained on a different linguistic base. All she knows is that she’s not doing well. She begins to think she’s not smart. Confidence cracks. And the algorithm never explains why.
The bias isn’t always just academic, either. Cultural context matters, and AI doesn’t get culture. A student might give an answer that makes perfect sense in their community or region, but is marked wrong because it doesn’t align with the “correct” version programmed in. AI doesn’t have the flexibility to say, “Maybe there’s another valid way to look at this.”
This becomes especially dangerous when these tools are used for assessment and tracking. If AI grades assignments, suggests remedial content, or ranks students, then it is quietly shaping how we value them. And if the AI is flawed, we’re creating an entire generation of students being judged by systems that don’t fully see them.
But here’s the trickiest part: We rarely question AI's decisions. When a teacher gives you a bad grade, you might ask, “Why?” When an app does it, we accept it. Why? Because it feels scientific. It feels logical. The interface is sleek. The progress bar looks precise. It tells you, “You’re 43% proficient in algebra,” and you believe it, even if it got your context completely wrong.
This blind trust is what makes AI’s bias more dangerous than a human's. A teacher’s bias can be noticed, corrected, or even apologised for. But with AI, the bias is built-in and invisible. You can’t argue with it. You can’t explain yourself. It doesn’t listen. It just outputs.
And in a country like India, with its diverse educational backgrounds, this is deeply unfair. What’s considered a “normal” learning curve in Delhi might look completely different in a tribal school in Jharkhand. An algorithm trained on one cannot possibly understand the other—yet it’s being used for both.
So, what can be done? First, we need to stop worshipping AI as an unbiased god. It’s a tool, not a judge. Developers must actively design for diversity—linguistic, regional, and cultural. Systems must be trained on more inclusive data. And schools must be trained to interpret AI’s outputs, not blindly follow them.
Second, students need to be taught to question the technology they use. Just like we teach them to think critically about media or history, we must teach them to ask: “Who made this tool? Whose data does it learn from? Whose voice is missing?”
Objectivity is not about removing all opinions. It’s about making space for all voices and letting no single data set define intelligence.
Until we do that, AI in education will not be a revolution. It’ll just be a prettier version of the same old story, where some students thrive, and others are quietly left behind.
The App Didn’t Know Her Story:
Meena sits cross-legged on the mud floor of her one-room home in Dumka, Jharkhand. Her school shut down during the lockdown. Her teacher never came back. Now, she learns from an AI app on her uncle’s second-hand smartphone, borrowed for exactly 45 minutes every evening — after the buffaloes are fed, before her uncle returns from the fields. The app greets her with cheerful colours and a chirpy “Let’s Learn Fractions Today!” It does not notice the bruises on her wrist from helping carry firewood. It does not ask if she’s eaten. Meena tries to focus. She doesn’t understand the question, but there’s no one to ask. The screen offers “Hint Mode.” It helps a little. She selects an answer.
Wrong.
The bot says:
“Incorrect. Keep practising!”
But she won’t.
The battery is dying. The lantern is flickering.
And somewhere in the next room, her little brother is crying.
Tomorrow, the app will start fresh.
It will not remember she struggled.
It will not matter that she cried.
Because AI doesn’t track pain.
It only tracks progress.
The report will show “Needs Improvement.”
But Meena? She needed someone to say:
“You did your best. That’s enough today.”
But no one did.
Not the app.
Not the system.
No one.
And quietly, Meena begins to believe
that she is only as smart as the percentage on the screen.
More Data, Less Dialogue: The Silent Crisis of Mental Health:
In the pursuit of faster, smarter, and more scalable education, one thing is slowly fading from sight: the emotional well-being of students. As artificial intelligence enters classrooms in the form of automated grading, adaptive learning platforms, and performance-tracking dashboards, it brings efficiency—but also silence. Silence where there used to be encouragement. Silence where there used to be conversation. In this silence, many young minds are left alone with their doubts, failures, and fears.
Traditional teaching was never perfect, but it was personal. A teacher could notice a student's hesitation, pause to explain again, or offer a warm smile that said, “I believe in you.” But AI doesn’t pause. It doesn’t soften its tone. It doesn’t ask how you’re feeling today. It simply assesses, calculates, and responds with clinical precision.
For a student already struggling with low confidence, an AI telling them they are “below average” is more than just feedback—it’s a label. And once labelled, students often begin to internalise those identities. “I am slow.” “I’m not good enough.” “I’ll never understand this.” The machine doesn’t mean harm. But it also doesn’t know how to heal.
Mental health in students is not just about anxiety or depression—it’s also about the quiet erosion of self-worth. About feeling seen. Heard. Valued. These are things that come not from metrics, but from human connection. From the teacher who notices that a student hasn’t spoken in days. From the peer who explains a math problem with patience, not irritation. From the environment that whispers, “It’s okay to try again.”
AI, for all its intelligence, cannot yet replicate these subtleties. It cannot detect when a student is distracted because of family stress, or when they’re falling behind, not due to ability but due to emotional exhaustion. It does not see the child sitting in front of the screen, just the scores behind it.
And as schools grow more data-driven, the focus often shifts to performance dashboards, predictive models, and learning analytics. But no chart shows heartbreak. No algorithm flags loneliness. No heatmap reveals shame.
We are raising a generation that may be hyper-informed, but under-supported. Students who may finish all modules but carry invisible wounds. We must ask: Is the goal to create efficient learners or fulfilled humans?
If we want to prepare students for the real world, we must remember: the real world is messy, emotional, and deeply human. Education must reflect that. It must leave space for struggle. For confusion. For the soft things that machines do not yet understand.
In this era of AI-powered learning, mental health cannot be an afterthought. It must be interwoven into the very fabric of how we teach, assess, and connect. Otherwise, we may end up with high-performing systems populated by low-belief students.
And no machine can fix that.
What the Numbers Have to Say:
As artificial intelligence reshapes Indian education, its presence is no longer just a concept—it’s measurable, trackable, and very real. But while we talk about AI as the future of learning, what do the actual numbers say about its impact?
Let’s begin with adoption. According to a 2023 report by NASSCOM and BCG, over 65% of Indian edtech startups now incorporate AI-driven features such as adaptive testing, predictive analytics, and automated content delivery. Platforms like Byju’s, Vedantu, and Toppr collectively serve millions of students using such tools. The government, too, is encouraging AI integration—PM e-Vidya and DIKSHA are embedding intelligent systems to personalise content delivery for school children.
The reason is clear: scale and shortage. India has over 260 million school students, but struggles with a shortage of nearly 1 million qualified teachers. In rural areas, AI promises to fill this gap, offering instant explanations, digital assessments, and 24/7 learning support. In theory, it levels the playing field.
However, this narrative of technological progress hides some uncomfortable truths.
Start with the digital divide. A 2022 survey by the Ministry of Education revealed that only 27% of Indian students had consistent access to digital devices and internet connectivity suitable for AI-enabled platforms. In rural areas, that number drops to under 15%. So, while urban students benefit from smart apps and video tutors, millions are left staring at blank screens—or worse, excluded altogether.
Now, look at mental health—a growing area of concern. In a 2021 NCERT survey of 1.8 lakh students, 81% reported feeling anxious, frustrated, or emotionally unmotivated during AI-driven or online learning. The lack of teacher interaction, increased screen time, and performance-focused feedback mechanisms were cited as major contributors.
Another study by UNESCO in 2023 noted that only 9% of India’s leading AI edtech tools had built-in support for emotional well-being or mental health tracking. This points to a structural oversight: while AI can analyse test scores in milliseconds, it still can’t read the mental state of the learner behind those scores.
Moreover, a 2024 ASER (Annual Status of Education Report) survey showed that although AI usage increased performance by 12–15% in standardised math and reading scores, student satisfaction remained flat, and in some segments, declined. Students reported feeling more like “users” than learners, more like “targets” than individuals.
Finally, let’s not forget about teachers. In a study by the Observer Research Foundation (ORF), 62% of Indian teachers surveyed expressed concern that increasing AI use was reducing their creative freedom and turning them into “system managers” rather than educators.
Together, these statistics reveal a complex picture. AI in education is no longer an experiment—it is a reality. But data shows that equity, emotional health, and teacher autonomy are still being left behind in the rush toward digitisation.
If we truly want a smarter education system, we need to look beyond performance scores and platform features. We must ask: What kind of learners—and citizens—are we creating?
Build the Future, but Don’t Unlearn the Past:
We are not against the future. We want progress. We want reach. We want every child to have the world at their fingertips. But we forget: the best classrooms were never just about content. They were about connection. A teacher’s encouragement after a low grade. The whispered doubts shared between friends during a break. The moment someone believed in you before you did. AI can assist. It can organise, analyse, and personalise. But it cannot replace the quiet magic of a teacher’s presence. It cannot replace the awkward courage of raising your hand. It cannot replace being seen, not just tracked.
As India builds its digital education empire, we must ask:
Who are we building it for?
If it leaves the child with no device,
If it silences the student who learns differently,
If it forgets that some lessons are not measured in marks,
Then we are building glass palaces with hollow hearts.
Let AI be the tool.
Let humans remain the soul.
Because education is not just about knowing. It’s about becoming.
And no algorithm should decide what kind of human a child becomes.
We must not trade empathy for efficiency.
We must not confuse access with equity.
And we must never, ever forget:
A good teacher can change your marks.
A great one can change your life.
So build the future.
YES.
But don’t unlearn the past.