THESIS
2023
1 online resource (xii, 95 pages) : illustrations (chiefly color)
Abstract
The Metaverse is a realm that combines the real and virtual worlds. It enables individuals
to work, play, and interact in a blended post-reality world: a network of social,
immersive environments hosted on persistent multi-user platforms.
By representing themselves with avatars in this blended space, users extend their
physical presence into the Metaverse. Through these avatars, they can converse and
interact with other people and objects in the virtual environment, just as they would in
the real world.
A natural user interface (NUI) lets users naturally engage with other users and digital
objects during Metaverse events (e.g., social gatherings and online meetings). Several
studies on NUI techniques (such as body and hand motions, speech, and biometrics) have
been published, but the integration of NUIs into multi-user Metaverse platforms has not
been thoroughly investigated.
This thesis presents four research contributions that facilitate human-to-human and
human-to-object interaction in the Metaverse.
In the first contribution, we develop an adaptive framework for human-avatar interaction.
This framework applies Octree and Inverse-Kinematics algorithms to skeletal data so that
it can be transmitted efficiently to other users and scale with the number of concurrent
Metaverse users.
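To make the idea concrete, the sketch below shows one way such a pipeline could look; it is not the thesis's implementation. A spatial index (SciPy's k-d tree standing in for an octree) selects nearby recipients, and a planar two-bone inverse-kinematics solve reconstructs an arm pose from a compact end-effector target. All function names and parameters are illustrative assumptions.

```python
# Illustrative sketch only: proximity-based skeletal-data distribution plus a
# simple analytic IK reconstruction on the receiving side.
import math
import numpy as np
from scipy.spatial import cKDTree  # k-d tree used here as a stand-in for an octree


def nearby_users(positions, sender, radius):
    """Indices of avatars within `radius` of the sender (candidate recipients)."""
    tree = cKDTree(positions)
    idx = tree.query_ball_point(positions[sender], r=radius)
    return [i for i in idx if i != sender]


def two_bone_ik(upper, lower, tx, ty):
    """Planar two-bone IK: shoulder and elbow angles that reach target (tx, ty)."""
    d = min(math.hypot(tx, ty), upper + lower - 1e-6)        # clamp to reachable range
    cos_elbow = (d * d - upper * upper - lower * lower) / (2 * upper * lower)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(ty, tx) - math.atan2(
        lower * math.sin(elbow), upper + lower * math.cos(elbow)
    )
    return shoulder, elbow


# Toy usage: 100 avatars in a 50 m x 50 m x 5 m space; only users within 10 m
# receive the hand target, from which they reconstruct the arm joint angles.
positions = np.random.rand(100, 3) * [50, 50, 5]
recipients = nearby_users(positions, sender=0, radius=10.0)
shoulder, elbow = two_bone_ik(upper=0.3, lower=0.25, tx=0.4, ty=0.2)
print(len(recipients), math.degrees(shoulder), math.degrees(elbow))
```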
In the second contribution, we develop a Metaverse-based 3D model editing tool (3DeformR)
that allows freehand manipulation. 3DeformR combines three optimized hand motions with
biharmonic deformation algorithms to enable the selection and modification of
fine-grained 3D models.
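As a rough illustration of biharmonic handle deformation (not 3DeformR's actual code), the sketch below moves a gesture-selected handle vertex and propagates the displacement by solving a bi-Laplacian system; a uniform graph Laplacian replaces the cotangent Laplacian for brevity, and all names are hypothetical.

```python
# Illustrative sketch: biharmonic deformation with a handle and an anchor.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def uniform_laplacian(n, edges):
    """Graph Laplacian L = D - A built from an edge list."""
    i, j = np.array(edges).T
    rows, cols = np.concatenate([i, j]), np.concatenate([j, i])
    A = sp.coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n)).tocsr()
    return sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A


def biharmonic_deform(V, edges, handle_idx, handle_disp, anchor_idx):
    """Move handle vertices and smoothly propagate the displacement (L^2 d = 0)."""
    n = len(V)
    L = uniform_laplacian(n, edges)
    Q = (L @ L).tocsr()                                   # bi-Laplacian energy matrix
    known = np.concatenate([handle_idx, anchor_idx])
    unknown = np.setdiff1d(np.arange(n), known)
    d_known = np.zeros((len(known), 3))
    d_known[: len(handle_idx)] = handle_disp              # anchors keep zero displacement
    rhs = -Q[unknown][:, known] @ d_known                 # move knowns to the right-hand side
    d_unknown = spla.splu(Q[unknown][:, unknown].tocsc()).solve(rhs)
    d = np.zeros((n, 3))
    d[known], d[unknown] = d_known, d_unknown
    return V + d


# Toy usage: a strip of 10 vertices; pinch-and-drag the last vertex up by 1 unit
# while the first vertex stays anchored.
V = np.column_stack([np.arange(10.0), np.zeros(10), np.zeros(10)])
edges = [(k, k + 1) for k in range(9)]
V_new = biharmonic_deform(V, edges,
                          handle_idx=np.array([9]),
                          handle_disp=np.array([[0.0, 1.0, 0.0]]),
                          anchor_idx=np.array([0]))
print(np.round(V_new[:, 1], 3))
```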
In the third contribution, we develop Mobile to AR (M2A), a framework for displaying
context-aware online content. M2A uses the visual context to display more content while
allowing users to quickly identify essential data with minimal website modifications.
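One hypothetical way the "minimal website modifications" idea could look in code is sketched below: a page tags its essential elements with a data-ar-label attribute (an assumed convention, not M2A's documented one), and the framework extracts only those elements for AR display.

```python
# Illustrative sketch: pull out only the annotated, essential page elements.
from html.parser import HTMLParser


class ARContentExtractor(HTMLParser):
    """Collect the text of elements annotated with a data-ar-label attribute."""

    def __init__(self):
        super().__init__()
        self._label = None
        self.items = []          # (label, text) pairs to anchor in the AR view

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "data-ar-label" in attrs:
            self._label = attrs["data-ar-label"]

    def handle_data(self, data):
        if self._label and data.strip():
            self.items.append((self._label, data.strip()))
            self._label = None


page = ('<h1 data-ar-label="title">Daily Specials</h1><p>Long intro text...</p>'
        '<span data-ar-label="price">$12.50</span>')
parser = ARContentExtractor()
parser.feed(page)
print(parser.items)   # [('title', 'Daily Specials'), ('price', '$12.50')]
```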
In the fourth contribution, we combine the human-avatar interaction framework with M2A to
create a context-aware recommendation system (A2W). A2W is a browser that delivers
AR-driven web content to Metaverse users based on a content-based filtering
recommendation system.
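A minimal sketch of content-based filtering of the kind described here (not the thesis's implementation) is shown below: items a user has engaged with are profiled with TF-IDF, and candidate web content is ranked by cosine similarity to that profile. The example texts are invented.

```python
# Illustrative sketch: rank candidate web content against a TF-IDF user profile.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

catalog = [
    "3d printed furniture showcase with interactive ar previews",
    "live concert stream schedule and virtual venue map",
    "tutorial on sculpting low-poly avatars for social vr",
    "restaurant menu with nutrition facts and table booking",
]
user_history = ["sculpting stylized avatars in vr", "ar furniture preview demo"]

vectorizer = TfidfVectorizer()
item_vecs = vectorizer.fit_transform(catalog)                 # item feature vectors
profile = vectorizer.transform([" ".join(user_history)])      # aggregate user profile
scores = cosine_similarity(profile, item_vecs).ravel()        # relevance of each item

for score, text in sorted(zip(scores, catalog), reverse=True):
    print(f"{score:.2f}  {text}")
```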
These contributions cover a range of Metaverse interactions and are assessed through
rigorous experiments and user studies. Our results demonstrate that the frameworks
enhance the quality and immersion of the user experience in the Metaverse.