Acting Intelligence: Reimagining AI

Neurotech@Berkeley
9 min read · Mar 20, 2022

At the moment, there is no emotionally intelligent AI. AI machines cannot feel happiness, sadness, fear, joy, or the rest of the plethora of emotions in the way humans do. AI can mimic emotion, but only to a very limited degree. The experiences human beings go through are what make their emotions as rich, vivid, and unique as they are, and these are experiences AI machines cannot have. Still, there are essential benefits to reimagining AI: empathetic AI could benefit society by increasing access to mental health care and diagnosis, especially in areas with limited access to such services. The important consideration is to strike the right balance in the emotions we ask an AI to express. What if I told you that there is a way to train AI to be emotionally intelligent using theater?

What if AI does not need emotion but can perform instead?

I invite you to reimagine artificial intelligence. Like athletes, doctors, and mathematicians, who all use practice to synthesize information, AI machines require programming and data input to perform functions. If we want an AI machine to learn how to catch a ball, for example, we can study a person catching a ball. Similarly, if we want an AI machine to empathize with others, we can observe humans interacting with each other. But because studying humans carries many ethical complications, we do not need to limit ourselves to observing how people candidly interact. Instead, we can use acting as a medium of human interaction. Perhaps, if AI machines are trained similarly to how actors are classically trained, they will observe and understand human emotions just as well. Simulating an artificial empathetic system in a controlled and scripted environment (i.e., acting) could overcome the obstacles of bias and data availability that current efforts to program AI with empathy face.

Obstacles

“In empathy analysis, data is currently the primary limiting factor in both quantity and variety” (Xiao et al., 2016).

Empathy is difficult to study in humans: observing its effects takes a long time, and doing so during intimate or traumatic real-life moments raises serious ethical problems. One proposed solution is to examine empathy on a physiological level, through vocal qualities, facial expressions, and other physical characteristics. However, these signals are culturally dependent; speaking in a louder voice may signal assertion in one culture and excitement in another. Unfortunately, many AI systems have been trained in a cookie-cutter manner, taking into account the characteristics and culture of only a portion of society, often the majority in that environment.

In a study performed by Rachael Tatman at the University of Washington, YouTube’s auto-captions were evaluated based on the speaker’s sex and the dialect of English spoken (namely California, Georgia [the state], New England, New Zealand, and Scotland) (Tatman, 2017). The research found that the Word Error Rate for women was about 13% higher than for men. Additionally, there was almost a 20% difference between the dialect with the lowest Word Error Rate (California) and the dialect with the highest (Scotland) (Tatman, 2017). Isn’t it surprising that different dialects of the very language the AI is trained on can show such disparities? What is even more surprising is that the gap appeared even within the United States. Data like these are bound to be biased, even unintentionally, and can lead AI to misinterpret humans.
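Concretely, Word Error Rate is the number of word-level substitutions, deletions, and insertions needed to turn the automatic caption into the reference transcript, divided by the number of words in the reference. Below is a minimal Python sketch of that computation; the example sentences are invented for illustration and are not from Tatman’s data.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution / match
    return dp[len(ref)][len(hyp)] / len(ref)

# Invented example: the caption drops one word and mishears another.
print(word_error_rate("she sells sea shells by the shore",
                      "she sells the shells by shore"))  # 2 errors / 7 words ≈ 0.29
```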

The implicit or explicit biases of an individual can reveal themselves when that individual programs the AI. Biases can affect data collection, analysis, and framing, and the limited variety of data “with respect to modalities and scenarios” can introduce bias from the get-go (Xiao et al., 2016). There are countless interactions between individuals of different backgrounds that have not yet been analyzed; the vast amount of uncollected data on human behavior and empathy leaves a massive gap in the information needed to program AI to be empathetic effectively. “Manual annotations of behavioral cues have been needed for empathy analysis in varying degrees,” and researchers have been looking for ways to collect information on empathy through “automation and integration of behavioral signal acquisition, processing, and assessment within a unified system” (Xiao et al., 2016). When faced with an obstacle like this, it can help to look for solutions beyond strictly science-related fields.

Theater as a Solution

Imagine you are sitting in the front row of a beautiful theater, watching a tragic romance and tearing up as a character mourns the loss of a loved one. You are pulled into the character’s world and empathize with them. For this response to occur, the actor on stage must be well-rehearsed and have exquisite technique. The hours of rehearsal dedicated to that scene will ultimately determine the depth and genuineness of your emotional response as an audience member. The immature actor will step onto the stage convinced that they must be sad, happy, angry, and so on. The classically trained actor will know that they must not be anything, because one cannot simply be on stage. Instead, they will know what to do and will actualize every line of text. For instance, if the actor needs to play a scene in which they are angry, the actor does not need to be angry to do the scene justice. Instead, the actor must take each line of the text, understand the objective or motive of the character, and plan how to physicalize the necessary action. The actor will therefore no longer try to force a state of emotion in an unhealthy manner. Instead, the actor will focus on physical actions: how they sit, how their hands are positioned, how they move, their rhythm and volume, and so on. By sculpting such a meticulous and thought-out performance, the actor allows the emotions to arise in the performance as a natural and believable byproduct.

This method of acting challenges any notions that acting is untruthful or mentally damaging to the actor; rather, it allows humanity to reveal itself in beautiful ways. Stories told through acting change people’s minds, broaden their perspectives, and reveal human conditions (both unfathomable and wonderful) that many audience members have not experienced.

The actions of an actor can inspire emotion not only in the actor but also in their scene partner(s) and audience members. Understanding an actor’s technique and rehearsal process can therefore provide great insight into how AI could be trained to perform empathy. AI could be programmed to take specific actions that evoke authentic emotions in the individuals it interacts with. Giving AI the discipline of an actor, and programming it to approach every situation, no matter how emotional or complex, with that discipline, could allow for a harmless yet empathetic responsive system. More importantly, an actor’s training relies heavily on the given text. The script is what gives the actor the cues for certain actions and emotional responses. Just as AI improves its algorithms by processing large data sets, AI could expand its emotional intelligence by analyzing plays and other theatrical works; a sketch of what that training data might look like follows below. Now imagine feeding an AI hundreds of thousands of scripts written throughout history, grounded in real experiences and complex emotions. These are the reasons theater, and acting overall, should be incorporated into machine learning and AI systems. Theater could swoop in and play the role of a hero, allowing AI to study trauma and complex human interactions that occur authentically on stage, without the ethical implications that would otherwise be present in such real-life moments.
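To make this concrete, here is a minimal, hypothetical sketch of how annotated script lines could serve as training data for an emotion classifier. The tiny dataset, the emotion labels, and the bag-of-words model are all illustrative assumptions, not an actual system described in this article; real training would require far larger corpora and far richer models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# (line of dialogue, emotion suggested by the surrounding stage directions)
# Labels here are invented for illustration.
script_lines = [
    ("O happy dagger! This is thy sheath.", "grief"),
    ("I will not budge for no man's pleasure, I.", "anger"),
    ("My bounty is as boundless as the sea.", "love"),
    ("A plague o' both your houses!", "anger"),
    ("These violent delights have violent ends.", "grief"),
    ("Did my heart love till now? Forswear it, sight!", "love"),
]
texts, labels = zip(*script_lines)

# Bag-of-words baseline: the training loop is the same shape a larger,
# subtler model would use, just on a toy scale.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

# Prediction from a toy model; with so little data the output is only a demo.
print(model.predict(["I defy you, stars!"]))
```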

Benefits of an AI that Acts Intelligently

Using the benefits that theater could provide to increase the emotional intelligence of AI, health care could be changed drastically for the better. Right now, AI systems are being used to detect diseases at earlier stages thanks to accurate trend observation, enabling both a better understanding of disease development and more personalized, more effective treatment. The use of AI helps clinicians make decisions and catch details that are easily missed.

Currently, AI can also aid mental healthcare practice. Though AI does not embody the emotional warmth a psychologist can provide, it can take in data about a patient and use a holistic approach to understanding that person’s psychological state (Graham et al., 2019). AI can help account for biological and social factors, for example, to help identify potential causes and diagnoses. In a future where AI can analyze theater, it might gain something like compassion while weighing all potential causes when diagnosing conditions and even suggesting how to improve the patient’s mental state.

In Conclusion

The crossover between artificial intelligence and theater could revolutionize healthcare and potentially deconstruct the stigmas surrounding AI. Many argue that, aside from the task’s difficulty, AI should never be allowed to reproduce human emotion because it could pose a threat to humanity. This article rests on a different premise: the empathy given to AI is not expected to be equivalent in nature or impact to human empathy. Giving an AI machine data and programming it to complete a function is much less daunting than giving AI a sense of self and consciousness. Exploring artificial empathy and its connection to theater can allow AI to perform empathy that is credible and authentic by observing and mimicking the rehearsal process of an actor. A patient speaking to an intelligent system during an online chat will have a much better and more helpful experience if the system can detect emotion through language and respond to it. As AI machines continue to advance, their need for artificial empathy will most likely grow as well. Theater could broaden the “perspective” of AI systems just as profoundly as it broadens human perspectives. Giving AI the ability to detect emotion, understand it, and respond effectively could greatly increase the availability and quality of healthcare services.
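As a rough illustration of that chat scenario, the sketch below routes a patient’s message through an emotion detector and leads with an acknowledging opener before continuing the conversation. Everything here is a hypothetical stand-in: detect_emotion is a placeholder for a real trained model, and the keyword rule and canned openers stand in for a clinically validated response policy.

```python
def detect_emotion(message: str) -> str:
    # Placeholder: a real system would call a trained classifier here,
    # such as the script-trained model sketched earlier.
    sad_words = {"lost", "alone", "hopeless", "tired"}
    return "sadness" if sad_words & set(message.lower().split()) else "neutral"

# Hypothetical, hand-written openers keyed by detected emotion.
EMPATHETIC_OPENERS = {
    "sadness": "That sounds really heavy. I'm here with you.",
    "neutral": "Thanks for sharing that with me.",
}

def respond(message: str) -> str:
    emotion = detect_emotion(message)
    # Acknowledge the detected emotion first, then continue the intake.
    return f"{EMPATHETIC_OPENERS[emotion]} Can you tell me more about what's been going on?"

print(respond("I feel so alone lately and I'm tired all the time."))
```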

Whether an intelligent system should learn by observing actors or through a digitized form of acting itself needs further study; the fact remains, however, that acting can positively influence AI. In essence, AI can act. Just as empathy is instilled in humans through immersion in emotional environments, AI can be immersed in one as well, such as a stage full of actors. Through this form of ‘acting,’ AI can learn empathy, taking on a computerized version of emotion.

Works Cited:

Tatman, R. (2017). Gender and dialect bias in YouTube’s automatic captions. Proceedings of the First ACL Workshop on Ethics in Natural Language Processing. https://doi.org/10.18653/v1/w17-1606

Graham, S., Depp, C., Lee, E. E., Nebeker, C., Tu, X., Kim, H.-C., & Jeste, D. V. (2019). Artificial intelligence for mental health and mental illnesses: An overview. Current Psychiatry Reports. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7274446/

Xiao, B., Imel, Z. E., Georgiou, P., Atkins, D. C., & Narayanan, S. S. (2016). Computational analysis and simulation of empathic behaviors: A survey of empathy modeling with a behavioral signal processing framework. Current Psychiatry Reports, 18(5), 49. https://doi.org/10.1007/s11920-016-0682-5

This article was written by Mary Shahinyan, who is a junior undergraduate student at UC Berkeley studying Molecular and Cell Biology and Theater and Performance Studies, and Meltem Su, who is a sophomore undergraduate student at UC Berkeley studying Molecular and Cell Biology.

This article was edited by Jacob Marks, a junior undergraduate pre-medical student at UC Berkeley studying Cognitive Science, and Annabel Davis, a senior undergraduate student at UC Berkeley studying Cognitive Science.

