In order to produce realistic simulations and enhance immersion in augmented reality systems, solutions must not only present a realistic visual rendering of virtual objects, but also allow natural hand interactions. Existing approaches to understanding user interaction with virtual content are often restrictive or computationally expensive. To cope with these problems, we demonstrate a method that employs the user's thumb and forefinger to interact with the virtual content in a natural way, utilizing a single RGB-D camera. Based on this method, we develop an augmented reality chess game focused on providing an immersive experience to users, so that they are able to manipulate virtual chess pieces seamlessly over a board of markers and play against a chess engine.
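As an illustration of the thumb–forefinger interaction described above, the sketch below shows one way the pinch state could be tracked once fingertip positions have been extracted from the RGB-D stream; the fingertip tracking itself and the distance thresholds are assumptions for illustration, not details taken from the paper.

```cpp
#include <cmath>

// Minimal pinch-state tracker: assumes thumb/index fingertip positions (in
// metres, camera space) are already provided by an external hand tracker
// running on the RGB-D stream. Hysteresis avoids flicker around the threshold.
struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

class PinchDetector {
public:
    // Threshold values are illustrative, not taken from the paper.
    explicit PinchDetector(float closeThreshold = 0.025f,   // 2.5 cm -> pinch starts
                           float openThreshold  = 0.040f)   // 4.0 cm -> pinch ends
        : closeThreshold_(closeThreshold), openThreshold_(openThreshold) {}

    // Call once per frame; returns true while the user is pinching,
    // e.g. while a chess piece is being held.
    bool Update(const Vec3& thumbTip, const Vec3& indexTip) {
        const float d = Distance(thumbTip, indexTip);
        if (!pinching_ && d < closeThreshold_)      pinching_ = true;   // grab
        else if (pinching_ && d > openThreshold_)   pinching_ = false;  // release
        return pinching_;
    }

private:
    float closeThreshold_;
    float openThreshold_;
    bool pinching_ = false;
};
```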
This paper presents a method for sleep monitoring at home, aiming to support the diagnosis of sleep apnea. Since sleep problems affect 24% of men and 9% of women worldwide, it is essential that the proposed system is convenient and low-cost. Unlike most sleep measurement techniques, the suggested one is also suited for long-term monitoring. In this study, an efficient application was developed in order to detect snoring and apneic events. Snoring is highly prevalent in the general population and is also a common symptom of obstructive sleep apnea syndrome (OSAS). Thus, an unattended recording system was built using a Raspberry Pi computer, a simple microphone and an Internet connection. A Java application was also developed to analyse the recorded sound samples and estimate whether the user may be suffering from OSAS.
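The Java application's actual detection logic is not reproduced here; the sketch below (written in C++ for consistency with the other examples on this page) only illustrates the general idea of short-time energy thresholding on the microphone signal, flagging loud snore bursts and long quiet gaps as candidate apneic pauses. Frame length, thresholds and durations are placeholders.

```cpp
#include <cstddef>
#include <cstdint>
#include <cmath>
#include <vector>

// RMS energy of one frame of 16-bit PCM samples.
double FrameRms(const int16_t* samples, std::size_t n) {
    double sum = 0.0;
    for (std::size_t i = 0; i < n; ++i) sum += double(samples[i]) * samples[i];
    return std::sqrt(sum / double(n));
}

struct NightSummary {
    int snoreFrames = 0;       // frames classified as snoring
    int apneaCandidates = 0;   // quiet gaps longer than minPauseFrames
};

// Walks over the whole night's recording frame by frame.
NightSummary AnalyseRecording(const std::vector<int16_t>& pcm,
                              std::size_t frameLen = 8000,     // e.g. 0.5 s at 16 kHz
                              double snoreRms = 2000.0,        // "loud" threshold
                              double quietRms = 300.0,         // "silent" threshold
                              int minPauseFrames = 20) {       // e.g. >= 10 s of quiet
    NightSummary s;
    int quietRun = 0;
    for (std::size_t off = 0; off + frameLen <= pcm.size(); off += frameLen) {
        const double rms = FrameRms(pcm.data() + off, frameLen);
        if (rms > snoreRms) {
            ++s.snoreFrames;       // snore burst
            quietRun = 0;
        } else if (rms < quietRms) {
            if (++quietRun == minPauseFrames) ++s.apneaCandidates;  // long pause
        } else {
            quietRun = 0;
        }
    }
    return s;
}
```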
In order to enhance immersion in augmented reality systems, solutions must not only present a realistic visual rendering of the virtual objects, but also allow natural hand interactions. The main goal of this project was to introduce and utilize advanced techniques for the superimposition and manipulation of virtual objects over the view of the real world for mixed reality simulations. In this work, a board of markers was used to compute the camera pose seamlessly, and a pinch gesture detection algorithm was implemented, employing the user's thumb and forefinger to interact with the virtual content through an RGB-D camera. Ultimately, a Mixed Reality Chess game was developed, focused on providing an immersive experience to users, so that they are able to manipulate virtual chess pieces in front of a real table and play against a chess engine.
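The marker-board pose estimation could look roughly like the sketch below, which uses OpenCV's ArUco module; the thesis does not name a specific library, so the API choice and the board layout here are assumptions. The camera matrix and distortion coefficients are expected to come from a prior calibration of the RGB-D camera.

```cpp
#include <opencv2/aruco.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Estimates the pose of a planar marker board relative to the camera.
// Returns true and fills rvec/tvec (board-to-camera transform) on success.
bool EstimateBoardPose(const cv::Mat& frame,
                       const cv::Mat& cameraMatrix,
                       const cv::Mat& distCoeffs,
                       cv::Vec3d& rvec, cv::Vec3d& tvec) {
    // Board layout (marker count, size, separation) is illustrative.
    auto dictionary = cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);
    auto board = cv::aruco::GridBoard::create(
        /*markersX=*/5, /*markersY=*/5,
        /*markerLength=*/0.04f, /*markerSeparation=*/0.01f, dictionary);

    std::vector<int> ids;
    std::vector<std::vector<cv::Point2f>> corners;
    cv::aruco::detectMarkers(frame, dictionary, corners, ids);
    if (ids.empty()) return false;

    // Returns the number of markers used for the estimate; > 0 means success.
    const int used = cv::aruco::estimatePoseBoard(corners, ids, board,
                                                  cameraMatrix, distCoeffs,
                                                  rvec, tvec);
    return used > 0;
}
```

With the board pose known every frame, the virtual chessboard and pieces can be rendered in a fixed position on the real table regardless of how the camera moves.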
The pressing need for a future with fewer road accidents has rapidly increased interest in the technological advancement of driverless cars, which have now reached a point where they could soon be used by civilians for their everyday transportation. However, since a human driver is no longer required, questions arise about how these vehicles should behave, and issues concerning autonomous navigation in moral dilemma situations have increasingly come into focus. Several data-driven approaches using text-based surveys and online platforms have tried to tackle this challenge; however, we argue that this is not a realistic enough way to assess people's responses towards moral dilemmas involving driverless cars. Exploiting the fact that people tend to respond realistically to situations recreated with immersive virtual reality technology, in this thesis a pilot study was conducted in order to gain insight into how people evaluate the morality of decisions a driverless car must inevitably make in moral dilemma scenarios. A virtual reality simulation was designed and developed, through which users were able to fully experience a moral dilemma leading to an accident, as the passengers of a virtual autonomous vehicle. The participants' responses were recorded using a questionnaire to assess their sense of presence and identify their moral preferences for the decisions made by the driverless car. Despite the small sample used, results showed that participants were both surprised and absorbed during the virtual reality simulation, with their answers regarding how driverless cars should behave being largely consistent with the literature. Nevertheless, their intense emotional reactions during the presented moral dilemmas and small deviations from past surveys provide encouraging evidence for the use of virtual reality as a tool for future experiments with multiple scenarios, which should be further investigated.
I wrote an article on how to integrate Variable Rate Shading (VRS) with your Unreal Engine project in order to enable Foveated Rendering using the HTC VIVE Pro Eye headset.
The HTC VIVE Tracker allows you not only to track objects in VR (retrieving the tracker pose in real time) but also to use its POGO pins to simulate input buttons (Grip/Trigger/Trackpad/Menu) as if you were using a Vive controller. In this post I am going to show you how to do both using Unreal Engine 4.24.
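As a minimal taste of the pose-tracking half (not the full tutorial from the post), the sketch below treats the tracker as an extra motion source on a UMotionControllerComponent; the "Special_1" source name and the file name are assumptions that depend on how SteamVR assigns tracker roles on your system, so verify the slot your tracker actually gets.

```cpp
// Assumed file: TrackerActor.h (UE 4.24 with the SteamVR plugin enabled).
// Standalone VIVE Trackers are typically exposed as the motion sources
// "Special_1" .. "Special_8".
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "TrackerActor.generated.h"

UCLASS()
class ATrackerActor : public AActor
{
    GENERATED_BODY()

public:
    ATrackerActor()
    {
        // The component keeps its relative transform in sync with the tracker
        // pose every frame; attach a mesh to it to visualise the real object.
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->MotionSource = FName(TEXT("Special_1")); // assumed tracker slot
        RootComponent = Tracker;
        // At runtime, Tracker->GetComponentTransform() gives the world-space pose.
    }

private:
    UPROPERTY(VisibleAnywhere)
    UMotionControllerComponent* Tracker;
};
```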
We realise how important it is for the development community to have easy-to-use tools that improve content performance. We introduce the Adaptive Quality feature, which automatically adjusts the rendering quality of your VR application according to the system workload in order to achieve better performance and improve battery life by up to 15%. This blog post explains what Adaptive Quality is, why it is important and how to apply it to your own projects. We provide an overview of the solution and its design and implementation, and share a few tips on how developers can get started using it. We also describe how it works in synergy with other Wave SDK features, such as Dynamic Fixed Foveated Rendering and Dynamic Resolution, for better results.
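For a rough idea of what enabling the feature looks like at the native level, the sketch below is written from memory of the Wave native C API (WVR_EnableAdaptiveQuality, the WVR_QualityStrategy flags and the recommended-quality events); verify the exact names, headers and signatures against your Wave SDK version. The Unity and Unreal plugins expose the same feature through their own project settings instead.

```cpp
// Hedged sketch of native-level Adaptive Quality setup; names are from memory
// of the Wave native API and should be checked against the SDK documentation.
#include <wvr/wvr.h>
#include <wvr/wvr_system.h>
#include <wvr/wvr_events.h>

void SetupAdaptiveQuality()
{
    // Ask the runtime to manage quality: send recommended-quality events and
    // drive foveated rendering automatically when the system is under load.
    uint32_t strategy = WVR_QualityStrategy_Default |
                        WVR_QualityStrategy_SendQualityEvent |
                        WVR_QualityStrategy_AutoFoveation;
    WVR_EnableAdaptiveQuality(true, strategy);
}

void PollAdaptiveQualityEvents()
{
    // In the app's frame loop: react to the runtime's quality recommendations,
    // e.g. by scaling the eye-buffer resolution up or down one step.
    WVR_Event_t event;
    while (WVR_PollEventQueue(&event)) {
        switch (event.common.type) {
        case WVR_EventType_RecommendedQuality_Lower:
            // Reduce render target scale to save GPU time and battery.
            break;
        case WVR_EventType_RecommendedQuality_Higher:
            // Headroom available: restore a higher render target scale.
            break;
        default:
            break;
        }
    }
}
```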