As artificial intelligence becomes deeply embedded in the daily lives of students, schools are grappling with a new challenge: not whether students should use AI, but how they should use it responsibly. This question took centre stage at the panel discussion “Raising AI-Aware Students: Where Do Schools Draw the Ethical Line?” at the TechEdu India Summit 2026, organised by ETEducation.
Moderated by Sheeba Chauhan, Senior Associate – Content & Community, ETEducation, the discussion brought together leading school educators to reflect on how institutions can cultivate AI-aware learners without compromising empathy, originality and ethical judgment. As classrooms increasingly integrate AI tools, the panel emphasised that the real responsibility of schools lies in guiding students to become thoughtful users of technology rather than passive consumers of automated answers.
From answer machines to thinking catalysts
Dr Nicholas Correa, Director Principal of Finland International School, argued that the framing of AI in classrooms needs to change fundamentally. According to him, AI should not be treated as a solution generator but as a thinking partner that prompts deeper inquiry.
“We should take AI as thinking—not as an answer or solution provider,” he noted.
Correa warned that when students rely on AI to generate ready-made responses, they risk losing essential human abilities such as empathy, reflective judgment and intellectual curiosity. Instead, he suggested that AI should be used to generate hypotheses, challenge assumptions and encourage exploration.
A key shift, he argued, must occur in assessment practices. Schools need to evaluate how students arrive at conclusions rather than merely judging the final answer. “The moment we focus on process, the problem reduces,” he said, advocating for process-driven learning that values inquiry and reasoning.
Experience and emotion still drive originality
Adding another dimension to the conversation, Arwa Baldiwala, Head of School at Fazlani L’Academie Globale, highlighted that technology cannot replace lived experiences that shape authentic learning.
She described AI as a supportive tool that must be used with intention and discernment.
“You need to work with it. You don’t ask AI to work for you,” she explained.
Baldiwala emphasised that originality is rooted in personal journeys—failures, achievements, emotions and human interactions. When students rely excessively on AI-generated outputs, they risk losing the authenticity that makes their ideas unique. Drawing the ethical boundary, she suggested, lies in helping students recognise the difference between responsible assistance and unhealthy dependency.
Accountability as the new academic integrity
For Dr Shilpa Hiwale, Principal of Rejoice International School, the solution lies in reinforcing accountability. Instead of policing AI usage, educators should ensure students can demonstrate genuine understanding of the work they submit.
“After you submit your work, can you explain and analyse what you have submitted?” she often asks students.
If learners can articulate their reasoning, evaluate their arguments and defend their conclusions, AI remains a supportive learning tool. If they cannot, it signals overdependence.
Hiwale stressed that the broader mission of schools should not be to produce flawless assignments but to nurture thoughtful individuals. “Our focus should be on developing conscious, good human beings,” she said, underscoring the importance of ethical awareness alongside academic achievement.
Teaching students to question AI
Another crucial concern raised during the discussion was the reliability of AI-generated information. Dr Sharda Sharma, Deputy CEO and Director at Dr Pillai Global Academy, warned that excessive reliance on AI tools can weaken students’ attention spans and critical thinking abilities.
She emphasised that students must understand that AI outputs are not always accurate. Algorithms can produce biased responses, fabricate facts or generate misleading information—failures commonly known as hallucinations.
To counter this, Sharma advocated for a culture of transparency and verification in classrooms. Students should be encouraged to acknowledge when AI has assisted their work and develop the habit of cross-checking information rather than accepting it blindly.
Beyond restrictions: Cultivating digital discernment
Despite differing perspectives, the panel shared a clear consensus: banning AI from classrooms is neither practical nor beneficial. Instead, the role of schools is to build digital discernment—the ability to evaluate, question and responsibly engage with technology.
Educators must teach students how to critically interpret AI outputs, validate information sources and maintain curiosity-driven learning. At the same time, schools must continue to nurture qualities that machines cannot replicate—empathy, creativity, ethical judgment and human connection.
The classroom of the future
As AI continues to reshape educational environments, the ethical line will not be drawn solely through policy frameworks. It will emerge through pedagogy—how teachers guide students to interact with technology meaningfully.
The discussion at the TechEdu India Summit 2026 ultimately reinforced a powerful message: the classrooms of the future must produce reflective thinkers who collaborate with technology without surrendering their originality or humanity.
In an age where algorithms can generate answers in seconds, the true value of education may lie not in finding the fastest solution—but in nurturing the wisdom to ask the right questions.