
Can Empathy Grow in the Metaverse and Virtual Reality?

by admin

KEY POINTS

  • Research suggests “virtual empathy” is possible if creators of virtual worlds keep empathic design principles in mind. 
  • In virtual worlds, there may be a heightened merging of the “self” and “other.” 
  • How virtual humans are designed can directly influence people’s capacity for empathy in virtual worlds.

Can virtual reality, the Metaverse, web 3.0, and virtual humans that populate these worlds be designed in a way that enhances empathy and social connection? Current and new research suggests “virtual empathy” is possible—if creators of these virtual worlds keep empathic design principles in mind.

Designers of the Metaverse and web 3.0 can build virtual reality worlds that enhance empathy and cultivate connection, but doing so requires close attention to emerging research on empathy and social connection in virtual reality (VR), augmented reality (AR), and human-computer interaction.

For the most part, research suggests that interactions with a virtual human closely resemble interactions with a real one. Interestingly, even the presence of a virtual human as a bystander can make people behave as if they were being observed (i.e., the Hawthorne effect), even though, when asked directly, people deny that the virtual human’s presence affects them.

Social interaction rules also seem generally well preserved, such as communication through gaze and other social cues, even though the underlying neurobiology may differ. Even the controversial Milgram electric shock obedience experiment from the 1960s was replicated with virtual humans in 2006 (though participants knew full well that the virtual humans were not, in fact, being shocked).

While many social interactions stay the same, two important phenomena set virtual worlds apart. One is a heightened merging of the “self” and “other.” This “self-other merging” means more blurred boundaries between oneself and other virtual humans. One theory for why this happens is that people in virtual worlds ascribe or “project” more positive traits onto others and, as a consequence, are more willing to help them. This may not be a bad thing; in fact, it could mean more empathy in virtual worlds. However, self-other merging could also make people more vulnerable to bad actors in the Metaverse, and relationships potentially less authentic if people are not really “seeing” each other.

A second notable difference in virtual worlds is the Proteus effect. Can the design of avatars and virtual humans influence “virtual empathy”? Yes. The Proteus effect, first described in 2007 by Stanford researchers, is the phenomenon in which people change their behavior in virtual worlds based on the characteristics of their avatar, such as its visual features. In other words, people behave differently than they normally would, conforming to what is expected or stereotypical given their avatar’s appearance. As a result, how virtual humans are designed can directly influence people’s capacity for empathy in virtual worlds.

Recent research in the journal Computers in Human Behavior found that specific ways of visually representing and expressing pain can heighten awareness and emotional perception of virtual humans. One study found that adding more specific trunk movement and facial expression to an avatar made people much more aware of the avatar’s pain. Researchers measured observers’ pupil size and found a much greater reaction when the avatar moved its body and face in certain ways (likely mediated through a mirror-neuron process). Another study revealed that specific facial expressions of pain (e.g., brow lowering, nose wrinkling, upper lip raising, orbit tightening, and eyelid closure), and certain combinations and orderings of these expressions, were most effective at conveying pain to others. The nuance of avatar and virtual human design will play a large role in the cultivation of, and capacity for, virtual empathy.
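The studies above do not publish code, but as a loose illustration of the design idea, here is a minimal Python sketch that models an avatar’s pain display as a bundle of the facial cues and trunk movement the research highlights. The parameter names and the scoring formula are assumptions invented for this sketch, not anything taken from the cited studies.

```python
# Illustrative sketch only: parameter names and the scoring formula are
# hypothetical and not taken from the cited studies.
from dataclasses import dataclass, field

# Facial pain cues mentioned in the article, each with an intensity in [0, 1].
DEFAULT_ACTION_UNITS = {
    "brow_lowering": 0.0,
    "nose_wrinkling": 0.0,
    "upper_lip_raising": 0.0,
    "orbit_tightening": 0.0,
    "eyelid_closure": 0.0,
}

@dataclass
class AvatarPainExpression:
    """Bundles facial action units with trunk movement for an avatar."""
    action_units: dict = field(default_factory=lambda: dict(DEFAULT_ACTION_UNITS))
    trunk_movement: float = 0.0  # amplitude of protective trunk motion, [0, 1]

    def legibility(self) -> float:
        """Crude score: stronger, combined cues -> pain observers notice more."""
        facial = sum(self.action_units.values()) / len(self.action_units)
        return 0.5 * facial + 0.5 * self.trunk_movement

# Example: combining several facial cues with visible trunk movement.
expression = AvatarPainExpression(trunk_movement=0.7)
expression.action_units["brow_lowering"] = 0.8
expression.action_units["orbit_tightening"] = 0.6
print(f"pain legibility: {expression.legibility():.2f}")
```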

Experiences in virtual worlds have also translated into real-world behavior. Both VR and AR can be leveraged as tools to enhance empathy and perspective-taking, and even to work toward violence prevention. Studies in immersive virtual reality have examined whether VR and AR can be used to improve empathy, enhance bystander willingness to help others, and even reduce real-world violence. In a study published in Nature, researchers worked with 20 men who had been aggressors in domestic violence and placed them in an immersive virtual scene in which they were embodied in virtual women’s bodies. After being embodied as virtual women who experienced a virtual domestic abuse scenario, the men were better able to recognize fear and unhappiness in women’s faces, an important finding since difficulty recognizing others’ emotions is considered an important underlying problem in aggressive behavior.

As we enter web 3.0, understanding these psychological principles of empathic design and virtual human-mediated communication is more important than ever. We are continuing to understand how empathy will exist in the Metaverse and how it can be mediated and encouraged. While many have cautioned against the potential harmful psychological effects of the Metaverse, technological advances are rarely—if ever—reductionistic enough to be either fully dystopian or utopian. The important conversation to have now is how we and the trailblazers in this industry can all work together to build an ethical and moral framework for the Metaverse grounded in empathic design and social connection.

New research illustrates that virtual reality can improve certain types of empathy.

KEY POINTS

  • New research shows that socially responsible virtual reality programs can successfully promote certain types of empathy.
  • Cognitive empathy is the ability to appreciate how others feel; affective or emotional empathy is the ability to feel what others are feeling. 
  • Cognitive empathy and perspective-taking require more mental effort and focus than emotional or affective empathy.

Research illustrates that empathy can exist in virtual worlds like the metaverse and web 3.0, and that virtual interactions often mirror those in real life. As virtual worlds become accessible to larger audiences, could metaverse and virtual reality activities be used to enhance empathy? Could the metaverse and virtual worlds promote social good?

Empathy has been referred to as a muscle that can be improved, strengthened, and cultivated through practice. In a 2015 TED talk, virtual reality (VR) developer Chris Milk called VR “the ultimate empathy machine.” Multiple VR companies have invested in programs to incentivize designers to create content for social good, including Oculus’s 2016 “VR for Good” initiative and the 2017 HTC VIVE “VR for Impact” program. Have such socially responsible VR programs been able to encourage empathy?

A recent meta-analysis and review of VR programs published in Technology, Mind, and Behavior suggests that certain virtual reality experiences can be effective at enhancing emotional empathy, but that impacting cognitive empathy needs more work.

What is the difference between these types of empathy? How can cognitive empathy be better targeted through VR programs? These distinctions are helpful to consider as VR designers develop experiences and virtual worlds like the metaverse.

The concept of multiple dimensions and types of empathy is longstanding. In 1759, Adam Smith described two types of empathy: 1) an emotional reaction to others, and 2) the ability to recognize another person’s emotional state without necessarily experiencing an emotional reaction to it.

Modern-day empathy theories propose that empathy arises from a “dual process” involving both automatic, unconscious processing and deliberate, conscious processing.

Empathy is multidimensional. There are different types of empathy (the third is less commonly raised; a brief code sketch of this taxonomy follows the list):

1. Emotional empathy or “affective empathy” is the ability to share another person’s feelings. This is considered an immediate automatic emotional response and is achieved through emotional connection.

“I can feel in my body what others are feeling.” (bottom-up processing)

2. Cognitive empathy or “perspective-taking empathy” is the ability to understand how a person feels and what they might be thinking. This type of empathy may be improved through “perspective-taking” exercises and better communication. To cultivate cognitive empathy, the participant has to put in more effort.

“I appreciate and get how others are feeling, but I don’t feel it.” (top-down processing)

3. Compassionate empathy or “empathic concern” is the third type of empathy; it moves one to take action and help others who are suffering. Paul Ekman described this type of empathy as a subset of both cognitive and emotional empathy, a “narrower slice” of empathy focused on another person’s suffering. This type of empathy includes a wish to relieve others’ suffering and a willingness to take action.

“I feel and recognize the suffering of others and want to do something to relieve their suffering.”
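For readers who think in code, here is a purely illustrative Python sketch of this three-part taxonomy, of the kind a design team might use to tag which type of empathy a VR experience targets. The enum labels, example experience names, and the targets helper are assumptions made for illustration, not anything drawn from the research discussed here.

```python
# Hypothetical tagging scheme for VR experiences; names are invented for illustration.
from enum import Enum

class EmpathyType(Enum):
    EMOTIONAL = "affective"             # bottom-up: feel what the other person feels
    COGNITIVE = "perspective-taking"    # top-down: understand what the other person feels
    COMPASSIONATE = "empathic concern"  # moved to act and relieve suffering

# Example catalog mapping (invented) experience names to the empathy they target.
EXPERIENCE_TAGS = {
    "refugee_camp_walkthrough": {EmpathyType.EMOTIONAL},
    "domestic_abuse_embodiment": {EmpathyType.EMOTIONAL, EmpathyType.COGNITIVE},
}

def targets(experience: str, kind: EmpathyType) -> bool:
    """Return True if the experience is tagged with the given empathy type."""
    return kind in EXPERIENCE_TAGS.get(experience, set())

print(targets("domestic_abuse_embodiment", EmpathyType.COGNITIVE))  # True
```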

Research has found that virtual reality experiences can be designed to promote social good and emotional empathy, including virtual experiences that show people what it is like to become homeless, encounter racism, or live in a refugee camp. There are even virtual experiences to help one imagine being a cow in a slaughterhouse or transforming into a rainforest tree during an approaching fire.

But can these virtual experiences guide people to truly grasp the feelings, thoughts, and decisions involved in those situations? Can they inspire participants to change their behavior and take action? What elements of virtual experiences are most likely to affect the types of empathy that are harder to reach (cognitive and compassionate)? These more challenging questions require a deeper investigation into empathy, an umbrella term for several different psychological processes.

Emotional or affective empathy is a product of the fight-or-flight response system, operating through a neurobiological mode of processing called “bottom-up processing,” in which the body feeds information back to the brain. Emotional empathy is a visceral response produced by, among other things, the release of complex neurohormones such as adrenaline.

Cognitive empathy, on the other hand, is the more complex product of “mentalizing,” the act of putting oneself in another’s shoes and working out that person’s thoughts and feelings. This active “top-down processing” mode takes more effort and concentration than emotional empathy, since it draws on more advanced resources of the mind. Distractions and being flooded with emotions can make cognitive empathy harder to access.

This means that the design of a virtual experience, and how it engages the user, can produce different amounts and types of empathy. Media scholar Marshall McLuhan coined the terms “hot media” and “cool media.” “Hot media” are works that “spoonfeed” sensations to the user without requiring much active engagement.

“Cool media” are works that require high participation from users. Cognitive empathy requires that the user experience be more like cool media and not just serve up sensations. In other words, virtual experiences that ask the user to imagine, engage actively, and participate in a focused environment free from distraction are most likely to help the user develop cognitive empathy.

Some scholars and philosophers have argued that trying to improve a user’s cognitive empathy is not only impossible but also unethical, because one can never truly imagine another person’s subjective experience. That point, however, is very much debated. Virtual experiences can make it easier to imagine oneself in someone else’s situation, and one can do so while acknowledging that no one can ever fully grasp another person’s complex and unique perspective. There is a nuanced but key distinction between two types of cognitive empathy: “imagine-self” versus “imagine-other” perspective-taking.

There are two types of “perspective-taking,” also known as “projective empathy” or “simulation”:

1. Imagine-other perspective-taking (“being someone else”): imagining the perspective of another subject and grasping their thoughts, feelings, decisions, and psychological traits.

2. Imagine-self perspective-taking (“being in someone else’s shoes”): imagining the thoughts, feelings, and decisions you would have in the other person’s circumstances.

These two types of projective empathy are neurologically and psychologically different. Imagine-other perspective-taking requires even more mental flexibility and the ability to set aside one’s immediate reactions, emotions, and feelings (“emotional regulation”).

Cultivating empathy through VR programs will continue to advance through the efforts of creative and interactive VR designers and the expansion of technological capabilities. These questions are worth pursuing, and they require precision and awareness in how we define and encourage empathy.

Source:

https://www.psychologytoday.com/us/blog/urban-survival/202112/can-empathy-exist-in-the-metaverse-and-virtual-reality

https://www-psychologytoday-com.cdn.ampproject.org/c/s/www.psychologytoday.com/us/blog/urban-survival/202202/can-empathy-grow-in-the-metaverse-and-virtual-reality?amp

Photo: Andrea Piacquadio / Pexels
