Facial emotion recognition technology uses artificial intelligence to analyse facial expressions and infer emotions such as happiness, sadness, and anger. Although its adoption in schools has raised concerns about privacy, data security, and ethics, such functionality often enters the classroom through hardware/software creep: the features frequently become available as part of a broader hardware or software update. A case in point is the environmental control feature of ViewSonic Sense, built into a smart whiteboard. The product measures, for instance, the distance between students and assesses environmental attributes such as room temperature and humidity. Yet the technology also evaluates student participation, detecting when students raise their hands and estimating their concentration levels through emotion recognition technology (ViewSonic, 2023).
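To make concrete what such a participation feature involves, the sketch below shows a typical pipeline in broad strokes: detect faces in a camera frame, then pass each face to a classifier that outputs scores for a set of emotion labels. It is illustrative only and not ViewSonic's implementation; the face detector uses OpenCV's bundled Haar cascade, while the emotion classifier is a hypothetical stand-in that returns mock scores, since commercial systems rely on proprietary models.

```python
# Illustrative sketch of an emotion-recognition pipeline, not any vendor's
# actual implementation. The classifier below is a placeholder that returns
# mock probabilities; a deployed product would run a trained model here.
import cv2
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "neutral"]

# OpenCV's bundled Haar cascade locates face regions in a frame.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_crop: np.ndarray) -> dict:
    """Placeholder classifier: returns mock per-emotion scores.
    This only shows the shape of the output such features act on."""
    scores = np.random.dirichlet(np.ones(len(EMOTIONS)))
    return dict(zip(EMOTIONS, scores.round(3)))

def analyse_frame(frame: np.ndarray) -> list:
    """Detect faces and attach an emotion estimate to each one."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        crop = grey[y:y + h, x:x + w]
        results.append({"box": (int(x), int(y), int(w), int(h)),
                        "emotions": classify_emotion(crop)})
    return results

if __name__ == "__main__":
    # A blank test frame; a real system would read frames from a camera.
    dummy = np.zeros((480, 640, 3), dtype=np.uint8)
    print(analyse_frame(dummy))
```

The point of the sketch is that the contested step is the final one: mapping a cropped face to an emotion label, which is precisely where the accuracy and validity questions discussed below arise.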
The concerns raised by facial emotion recognition echo those associated with facial recognition (see Attendance Monitoring). However, facial emotion recognition goes further by inferring users' affective states from their facial expressions. Some scholars argue that the practice of measuring emotions to guide future decisions has its roots in consumer marketing and the development of personalised advertising. Transferred to classrooms, they contend, it carries several limiting assumptions about education: learning is reduced to the absorption of information, and teaching to the capture of student attention, rather than the stimulation of discussion or the encouragement of inquiry and experimentation (Saltman, 2016). Furthermore, there is little empirical evidence supporting the accuracy of facial emotion recognition (Andrejevic & Selwyn, 2020; Article-19, 2023).