
Two University of Waterloo researchers receive grants to help build metaverse


Two Waterloo research labs are each receiving $30,000 from Meta for research that may not be practical for another 20 years, one grant recipient says

Waking up in the morning to find your sleep cycle organized into a graph on your bed’s headboard, then having buttons appear on your coffee cup that connect you to your social media accounts: all of this could be possible in a virtual world known as the metaverse.

These concepts would be achieved using augmented reality glasses, and a University of Waterloo lab is a small step closer to realizing them, thanks to a $30,000 grant from Meta’s Reality Labs Research in Toronto.

Meta, the parent company of Facebook, announced that 17 researchers across Canada — two at Waterloo — are receiving a total of $510,000 in unrestricted grants to continue their work. 

Meta refers to the money as gifts because it is unrestricted: the labs can put it toward whatever they choose, there is no time frame for when the research should be completed, and Meta will not own the intellectual property over the research when it is done. 

Meta will have access to each team’s research when the public does. 

Jian Zhao’s WaterlooHCI Lab is focusing its research on the interaction between the user and data, such as being able to see your sleep cycle as soon as you open your eyes. 

One focus of Zhao’s lab is the interaction between a livestreamer and their audience. 

The audience’s experience is limited when the only communication is through text, said Zhao. This is what he is working to enhance.

“For example, the streamer can make a face to compose this emoji, like a happy or sad face or some gestures and this can be captured by the system to compose the appropriate emoji and insert it into the conversation to reply back to the audience,” said Zhao, adding that artificial intelligence would be a necessary part of that research. 
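
In rough pseudocode terms, the pipeline Zhao describes could look something like the sketch below. The expression labels, emoji mapping and upstream recognizer are illustrative assumptions, not the lab's actual system.

```python
# A minimal, hypothetical sketch of the idea Zhao describes: an upstream model
# (not shown) classifies the streamer's facial expression or gesture, and the
# system inserts the matching emoji into the chat as a reply to the audience.
# The labels and mapping below are illustrative assumptions, not the lab's code.

EXPRESSION_TO_EMOJI = {
    "happy": "😊",
    "sad": "😢",
    "surprised": "😮",
    "wave": "👋",  # example gesture label
}

def reply_with_emoji(detected_label: str, chat_log: list) -> None:
    """Append an emoji reply to the chat if the detected label is recognized."""
    emoji = EXPRESSION_TO_EMOJI.get(detected_label)
    if emoji is not None:
        chat_log.append(f"streamer: {emoji}")

# Usage: suppose the (hypothetical) recognizer saw a happy face.
chat = ["viewer1: nice play!"]
reply_with_emoji("happy", chat)
print(chat)  # ['viewer1: nice play!', 'streamer: 😊']
```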

Zhao will also be studying how to visualize the streamer’s heartbeat to best improve the audience’s experience, which would be fitting for gamers who play horror games. 

“The past 10 years, the most interactions that happened are on mobile devices, but in the future, there will be more interaction in the metaverse, so how can we enable this interaction and give you the best experience,” said Zhao.

Human-Computer Interaction Lab

The other local grant recipient is Daniel Vogel’s Human-Computer Interaction Lab.

Vogel’s research is focusing on augmented reality (AR) and spatial augmented reality (SAR) uses in office and museum settings. 

In an office, a person would be able to pass a desktop window to the person beside them. At a museum, more information about a certain exhibit would be shown digitally, instead of using a sign or display.

Augmented reality places a display between the user and the environment and changes how that environment appears: think Snapchat filters or AR glasses. 

Spatial augmented reality takes this a step further. Projectors, the same as those seen in a classroom, are used for projection mapping. This is where light containing digital information is projected to make it appear as though the pixels are on top of real-world objects. 

“The end result (between AR and SAR) is similar in that you can lay content on top of the real world,” said Vogel. 

“The difference is that in spatial augmented reality, it’s literally the world that you’re in. Essentially, you have small high-resolution screens on everything, in theory.”

If the concept is difficult to understand or visualize, it’s because this is research that may not be practical for another 20 years, said Vogel.
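
To make the projection-mapping idea a little more concrete, here is a minimal sketch of the underlying geometry. It assumes the simplest case, a flat target surface, and the homography matrix is an arbitrary example rather than calibration data from Vogel's lab.

```python
# A rough, self-contained sketch of the geometry behind projection mapping:
# for a flat target surface, a 3x3 homography maps projector pixels to
# positions on that surface, so the projected pixels appear to sit directly
# on the real-world object. The matrix values are arbitrary examples.
import numpy as np

H = np.array([
    [1.02, 0.01, 5.0],
    [0.00, 0.98, 3.0],
    [1e-4, 0.00, 1.0],
])  # example projector-to-surface homography

def project_pixel(x: float, y: float):
    """Map a projector pixel (x, y) to its landing point on the planar surface."""
    px, py, w = H @ np.array([x, y, 1.0])
    return px / w, py / w  # divide by w to leave homogeneous coordinates

print(project_pixel(100.0, 50.0))  # approx (106.4, 51.5)
```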

Vogel’s team will also look into mid-air hand gestures, figuring out how people in an augmented reality would select a menu and look up directions without a mouse, keyboard or touch screen. 

Instead, the user would do a hand gesture to signal a choice. 
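
As a rough illustration only, a gesture-driven menu could be wired up along these lines, assuming a hypothetical recognizer that turns hand movements into labels such as "swipe_left" or "pinch":

```python
# A minimal sketch: swipes move the highlight through a menu and a pinch
# confirms the highlighted item, with no mouse, keyboard or touch screen.
# The gesture labels and menu entries are assumptions for illustration.

MENU = ["Directions", "Messages", "Settings"]

def handle_gesture(label: str, highlighted: int):
    """Return the new highlight index and the confirmed choice (or None)."""
    if label == "swipe_left":
        return (highlighted - 1) % len(MENU), None
    if label == "swipe_right":
        return (highlighted + 1) % len(MENU), None
    if label == "pinch":  # pinch acts as the "click"
        return highlighted, MENU[highlighted]
    return highlighted, None  # ignore unrecognized gestures

# Usage: swipe right once, then pinch to select "Messages".
index, choice = handle_gesture("swipe_right", 0)
index, choice = handle_gesture("pinch", index)
print(choice)  # Messages
```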

“I think (this) presents a logical continuation of the path we’re already on from carrying around smartphones to now, or in the near future, it could be more common to be wearing AR headsets,” said Vogel. 

“AR headsets could be happening in the next few years to, eventually, the world is just augmented directly, like digital information is just everywhere.”

Source:

https://www.therecord.com/business/technology/2022/03/30/two-university-of-waterloo-researchers-receive-grants-to-help-build-metaverse.html
