Ethics of AI in Education: Collective Reflections from the Italian Awareness Raising Session
By Juliana Raffaghelli, Francesca Crudele, Martina Zanchin
🇮🇹 Italian version: read here
The “ARS”, or “Awareness Raising Sessions”, are part of a participatory strategy focusing on aspects of AI ethics that will later become a key element of the Open Educational Resources developed within our project (see Activities).
In this post, we tell you how the session went at our UNIPD campus in Rovigo, which offered us wonderful hospitality!
We shaped our meeting as a “Lunchtime Seminar”: sharing lunch and sharing ideas, to nourish ourselves in every sense.
Session Overview
The session began with participants working in small groups on fictional but realistic ethical dilemmas related to AI use in education. Despite being offered ten different cases, ranging from domestic use of Alexa, to school surveillance, to institutional governance, multiple groups coincidentally selected the same one: a university computer science class where students used AI collaboratively, without resistance or guidance from their professor. The case sparked immediate recognition and identification.

“We actually found that we live a similar situation,” one participant shared. “As students, when we’re given group tasks, we organize, we use AI, because it’s available. So you use it.”
What began as a reflection on a single classroom example soon unfolded into a broader collective inquiry into what it means to teach, learn, and act ethically in increasingly digitized and datafied institutions.
“It’s like a simulation”: Teaching and Learning on Autopilot
At the heart of the case was a sense of disconnection: between students and teachers, between intentions and actions. One participant described the dynamic in stark terms:
“The teacher was literally indifferent. He didn’t care that students used AI and didn’t think critically on their own… He just wanted to secure his job.”

The situation resonated widely. Another participant offered a chilling metaphor:
“It’s like a simulation of the educational environment. The students are simulating learning, and the teacher is simulating teaching. Everyone is just playing along.”
This idea of “simulation” echoed Gert Biesta’s critique of performative education: systems driven more by appearance and compliance than by purpose and meaning. The passivity of the teacher, paired with the tactical pragmatism of the students, exposed how AI was being quietly normalized, not as an object of inquiry, but as a background tool in a broader educational performance.
“You feel guilty, but you still use it”: Emotional Ambivalence and Ethical Drift
The conversation moved into the emotional terrain of the issue: what it feels like to use AI in these ambiguous spaces. A student participant reflected:
“We talked about emotions: feelings of guilt, disappointment, bitterness… that Italian word amarezza. You know it’s not really right to use it, but you do it anyway.”

Others spoke about the emotional burden teachers carry:
“They’re overwhelmed. There’s the program to follow, society’s expectations, institutional deadlines… and maybe also this fear of being replaced by AI.”
Such reflections revealed a profound emotional complexity: students caught between capability and conscience; teachers torn between care and exhaustion. The session made clear that AI ethics isn’t just about abstract principles; it is saturated with affect, anxiety, and tacit compromise.
“We need to train both students and teachers”: Ethics as Practice, Not Policy
Participants converged on one practical insight: ethical AI use must be taught, but not just in a technical sense.
“It’s not just about knowing how to use it. It’s about how to think with it, or not think with it. That’s where critical training comes in.”
Another added:
“Evaluation is a big issue. If we don’t rethink assessment, we’ll just keep rewarding surface-level success.”

This led to an exploration of how educational systems, through grading structures, workloads, and funding models, shape what gets taught, what gets valued, and what gets ignored. Ethics, in this framing, couldn’t be reduced to a module or a checkbox. It had to be embedded into pedagogy, relationships, and institutional cultures.
“We can’t put everything on the shoulders of the user”: Supererogation and Systemic Responsibility
The facilitator introduced the philosophical term “moral supererogation” to describe what many participants were circling around:
“We’re asking teachers and students to be ethical… while offering them no real alternatives. That’s unfair. You can’t be a hero in a broken system.”
One participant illustrated this tension perfectly:
“I haven’t used Facebook in ten years. I don’t use WhatsApp. But when I got here, the instructor said the group is on WhatsApp. What am I supposed to do: opt out and look like a troublemaker? I got a second phone, just to fit in.”
These reflections made clear that ethical choices are rarely made in a vacuum. They are made in constrained environments, often shaped by default platforms, peer pressure, and lack of institutional support. The discussion pointed to the need for collective agreements, alternative infrastructures, and shared accountability.
“WhatsApp is easy, but at what cost?”: Platforms, Infrastructure, and Sovereignty
The group began to interrogate the technologies themselves. Why are certain tools used so pervasively in education? And what are the implications?
“We talked about WhatsApp. It’s part of Meta. And Meta pushes you to use it: it’s easy, but you don’t know what it takes to use it.”
The facilitator expanded:
“Ethics isn’t only about the human side. It’s about the ecosystem: who builds the tools, who owns the data, where the servers are. We can’t separate use from infrastructure.”
This opened up a conversation about digital sovereignty. One participant asked:
“Lucrezia is great, but it’s only for faculty. Why don’t students have access? And what happens when every university builds its own AI and server infrastructure? Who controls that?”
The ethical debate became geopolitical: energy policies, data hosting, public vs. private platforms. The boundaries between educational ethics and political economy blurred, intentionally.
“There are no checklists for this”: Ethics as Critical Inquiry and Imagination
The session closed with a call to rethink ethics itself: not as a fixed doctrine, but as a process:
“Ethics isn’t a checklist. It’s ongoing. It’s about asking, deciding, imagining… together.”
Participants spoke about building co-designed agreements with students, embedding ethics in the curriculum not as surveillance, but as dialogue.
“We could have a contract at the beginning of the course,” one educator suggested, “where we decide with students what tools we’ll use, and how.”
The facilitator echoed this direction:
“We’re in a postdigital era. Platformization, datafication, and now AI: they’re not neutral. But we can respond, not just by resisting, but by imagining different futures.”
As participants packed up, the conversation lingered: about care, about power, about what might still be possible in education. Someone joked about not having opened the good Prosecco offered to conclude the lunch. But the reflection on the ethics of AI and data had already been uncorked.