After my initial attempt at using face recognition to capture users’ emotions for the AI conversational agent (chatbot) project, I opted for a simpler solution. I created a self-reporting system where users can directly share their emotions. This data will be analysed to make the AI conversational agent emotionally intelligent, a crucial aspect of my doctoral research focused on building an emotionally aware AI for educational assessments. The development process involved multiple iterations, as described below.
Iteration 1:
The concept involved designing a straightforward web app featuring 5 emoji buttons representing varying emotional states, ranging from Very Anxious to Very Calm, with a neutral option in between. Users would click on the button that matches their current emotion, and the system would prompt for input every 60 seconds. After the session, users would receive a CSV log detailing their emotional states, along with basic data analysis such as identifying the most dominant emotion during the session. Below is an overview of the developed system:

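As a rough illustration of the basic analysis mentioned above, a short Python sketch along the following lines could identify the dominant emotion from the session's CSV log (the file name and column headers here are assumptions, not the tool's actual format):

import csv
from collections import Counter

def dominant_emotion(log_path="session_log.csv"):
    """Return the most frequently reported emotion in a session log.

    Assumed log format: one row per 60-second prompt, with columns
    'timestamp' and 'emotion' (e.g. Very Anxious ... Very Calm).
    """
    with open(log_path, newline="") as f:
        emotions = [row["emotion"] for row in csv.DictReader(f)]
    counts = Counter(emotions)
    return counts.most_common(1)[0] if counts else None

if __name__ == "__main__":
    print(dominant_emotion())  # e.g. ('Neutral', 12)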
The initial version faced several issues:
1) Users had to wait 60 seconds before reporting their emotional state again, which was too long considering emotions can change rapidly.
2) The system lacked granularity, only allowing selection from 5 predefined emotional states without capturing the intensity of emotions.
3) There was no real-time reporting feature (i.e. no API), making it impossible to feed the emotional data into the AI conversational agent and so to make it emotionally intelligent.
To address these issues, I proceeded to the second iteration.
Iteration 2:
The updated concept involved replacing the 5 emotional state buttons with a slider scale, allowing users to report the intensity of their emotional state (stress level). This change would provide more detailed insight into their anxiety level at any given moment. The system was also modified to allow instant, real-time reporting of emotional states without the previous 60-second timer constraint, with these reports displayed on a separate page. Finally, the log was enhanced to provide a timestamped breakdown of the user's emotional state, offering more detail than the higher-level overview in the previous CSV file. Below is an overview of the updated system:

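To give a flavour of the more detailed breakdown the enhanced log aims for, the sketch below groups timestamped slider readings into coarse bands; the 0–10 scale, the band boundaries and the sample readings are illustrative assumptions only:

from collections import Counter

# Hypothetical real-time reports: (timestamp, stress_level) pairs, where the
# stress level is the slider value (assumed 0 = very calm, 10 = very anxious).
reports = [
    ("10:00:05", 2),
    ("10:00:41", 6),
    ("10:02:13", 9),
    ("10:03:37", 5),
]

def breakdown(readings):
    """Summarise timestamped readings into counts per coarse stress band."""
    def band(level):
        if level <= 3:
            return "calm"
        if level <= 6:
            return "moderate"
        return "anxious"
    return Counter(band(level) for _, level in readings)

print(breakdown(reports))  # Counter({'moderate': 2, 'calm': 1, 'anxious': 1})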
The second iteration also encountered the following issues:
1) The front-end GUI lacked clarity, making it difficult for users to identify the emotional states to choose from.
2) While there was an assessor view for displaying real-time emotional states, there was no live-reporting API feature. Consequently, real-time emotional states couldn’t be exported for integration into other systems (the AI conversational agent), rendering the goal of enabling emotional intelligence unattainable.
To address these challenges, I proceeded to the third iteration.
Iteration 3:
The goal was to enhance the front-end GUI and implement an API to provide real-time output of users’ emotional states in JSON format for integration into an external AI conversational agent. Below is the updated and final version of the system:

The initial two iterations primarily utilised HTML/CSS for the presentation layer and JavaScript for implementing the system's behaviour. For the third iteration described above, however, significant development involved server-side work to export real-time emotional states as JSON, so PHP was used. This allowed the creation of an API that could be integrated into the external AI conversational agent, making the agent emotionally intelligent by allowing the user's emotional state to serve as input to the prompts of the LLM-augmented conversational agent.
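For context, the integration assumes each call to this API returns a small JSON payload describing the latest report. The shape below (shown being parsed in Python) is an illustrative assumption rather than the actual schema produced by the PHP endpoint:

import json

# Hypothetical response body from the reporting tool's API; the field names
# and the 0-10 scale are assumptions for illustration only.
raw = '{"timestamp": "2024-05-01T10:02:13Z", "stress_level": 7}'

latest_report = json.loads(raw)
print(latest_report["stress_level"])  # 7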
Reflections:
The third iteration lays the groundwork for emotional state reporting in the AI conversational agent system. The integration plan involves embedding the client-side emotional input slider within an iframe in the user interface of the AI conversational agent. The results page will be integrated into the assessor side of the agent system. Additionally, the API will be used to push the emotional state into the conversational agent's system as a Python variable. This enables the emotional state to be utilised when prompting queries to the chosen LLM, facilitating emotionally intelligent generative responses.
An example of the API call in Python (Flask) to get the real-time emotional state is provided below:
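The sketch below is indicative only: the endpoint URL, the response fields and the prompt wording are assumptions rather than the actual implementation, but it shows the intended pattern of fetching the latest self-reported state and feeding it into the LLM prompt.

import requests
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical URL of the PHP reporting API; the real endpoint will differ.
EMOTION_API_URL = "https://example.com/emotion-tracker/api/latest.php"

def get_emotional_state():
    """Fetch the latest self-reported emotional state from the reporting tool."""
    response = requests.get(EMOTION_API_URL, timeout=5)
    response.raise_for_status()
    return response.json()  # assumed shape: {"timestamp": "...", "stress_level": 7}

@app.route("/ask")
def ask():
    state = get_emotional_state()
    # Illustrative only: prepend the reported stress level to the query sent to
    # the LLM so the agent can adapt its tone to the user's current state.
    prompt = (
        f"The user currently reports a stress level of {state['stress_level']}/10. "
        "Respond to their next assessment question in a calm, reassuring way."
    )
    return jsonify({"prompt": prompt})

if __name__ == "__main__":
    app.run(debug=True)

In the actual agent, the composed prompt would be passed to the chosen LLM rather than returned as JSON; the route above simply exposes the result so the pattern is easy to test.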
Stay tuned for further updates on the progress of the emotionally intelligent AI conversational agent and how this emotional state reporting tool is integrated therein.
This emotional tracking system is now freely available as an innovative app that allows anyone (e.g. teachers, healthcare workers, assessors, managers, employers, HR team members) to monitor a person's stress level in real time during a session (meeting, lesson, assessment, task, activity, etc.). Test it out here!