Below you will find an overview of current projects being carried out in the group - details of previous projects can be found here

The Oculus Rift virtual reality headset

 
Image of Oculus Rift

The Oculus Rift virtual reality headset is one of the most talked about technologies of 2013, with many proclaiming that it and similar devices could revolutionise the way people experience interactive content in the coming years.

Dr. Llyr ap Cenydd from the School of Computer Science has sought to explore what could be possible with this new immersive technology by developing prototype experiences in his spare time over the summer, using an Oculus Rift development kit he funded through Kickstarter.

Ocean Rift is a virtual deep sea diving experience specifically designed for the Oculus Rift. In the tech demo users can explore the sea bed and meet a number of different shark species, schooling fish and humpback whales.

The most infamous inhabitant of Ocean Rift is the hungry prehistoric Megalodon shark that lives in the deeper parts of the simulation. While Ocean Rift is mostly a benign experience, players are warned not to venture too far into the deep, and numerous anecdotal blog posts and YouTube reaction videos are testament to how realistic virtual reality can be when this advice is ignored!


Medical Virtual Environments

 




More details can be found on our Publications page

Two key projects in this area have been in collaboration with the CRaIVE consortium, which comprises interventional radiologists, computer scientists, physicists, clinical engineers and psychologists in the UK and is driving new paradigms in interventional training within virtual environments, relevant to the specialty of Interventional Radiology. The VMG group is working within CRaIVE on the development and validation of two training simulators for interventional radiology:

Image Guided Interventional Needle Simulation (ImaGINe-S), funded by the UK Department of Health
A simulator for teaching the Seldinger Technique, funded by the EPSRC.

ImaGINe-S supports training of ultrasound guided needle puncture and was awarded second place in the Eurographics 2009 Medical Prize for its innovative use of computer graphics in a complex system that is already far advanced towards clinical use. The Bangor Image Guided Needle Puncture Simulator (BIGNePSi) provided the starting point for several of the features used by ImaGINe-S.

The Seldinger technique requires a needle puncture into the vascular system, which then acts as the portal for the insertion of a guidewire followed by a catheter into the vessel. At VMG we are currently building a novel interface for the entry of the needle, guidewire and catheter into our virtual patient.

 

 


Simulating and compensating changes in appearance between day and night vision
ACM SIGGRAPH 2014 / ACM Transactions on Graphics
Modern displays, such as those used in phones and tablets, are designed to work in both bright and dark environments. When used in bright conditions, these devices increase screen brightness. When used in dim conditions, such as indoors with the lights dimmed, it is beneficial to reduce screen brightness: doing so reduces eye strain and lowers power consumption. Below a certain brightness level (luminance), however, image quality suffers, as the human visual system can no longer retain normal colour and detail perception; small details become invisible and colours appear washed out. The method we propose compensates for these appearance changes, making it possible to reduce screen brightness, and thus both eye strain and power consumption, while preserving high image quality.
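As a rough illustration of the kind of compensation involved (not the method from the paper), the sketch below boosts contrast and saturation in an image before it is shown on a dimmed screen, counteracting the loss of detail and colour described above. The gain curves and parameter names are invented for this example.

```python
import numpy as np

def compensate(image, dim_factor):
    """Toy appearance compensation for a dimmed display.

    `image` is an sRGB array of floats in [0, 1], shape (H, W, 3);
    `dim_factor` is the screen brightness scale (1.0 = full brightness).
    The dimmer the screen, the stronger the contrast and saturation boost.
    Illustrative only - not the model from the SIGGRAPH 2014 paper.
    """
    # Luma (Rec. 709 weights) as a stand-in for perceived lightness.
    luma = image @ np.array([0.2126, 0.7152, 0.0722])
    # Boost contrast around mid-grey; stronger for dimmer screens.
    contrast_gain = 1.0 + 0.5 * (1.0 - dim_factor)
    boosted_luma = 0.5 + contrast_gain * (luma - 0.5)
    # Boost saturation: push chroma away from the luma axis.
    sat_gain = 1.0 + 0.8 * (1.0 - dim_factor)
    chroma = image - luma[..., None]
    out = boosted_luma[..., None] + sat_gain * chroma
    return np.clip(out, 0.0, 1.0)
```

At full brightness (`dim_factor=1.0`) both gains are 1 and the image passes through unchanged; as the screen dims, the per-pixel spread of channel values grows to fight the perceived wash-out.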

     
 

gVirtualXRay: Virtual X-Ray Imaging Library on GPU
This project is focused on developing new software technologies for simulating X-ray images on the graphics processing unit (GPU) using OpenGL. It supports both the legacy OpenGL implementation and the modern OpenGL core profile (OpenGL 3.2+); no deprecated OpenGL functions are used. The library takes care of matrix transformations, matrix stacks, etc.
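Simulation of this kind comes down to evaluating the Beer-Lambert attenuation law along each ray from the X-ray source to a detector pixel. The sketch below shows that physics on the CPU; it is illustrative only and is not gVirtualXRay's API (which evaluates the attenuation per pixel on the GPU).

```python
import math

def xray_intensity(incident, segments):
    """Beer-Lambert attenuation along one ray.

    `segments` is a list of (mu, length) pairs, one per material the ray
    crosses: mu is the linear attenuation coefficient (cm^-1) and length
    the path length through that material (cm). Returns the transmitted
    intensity reaching the detector pixel.
    """
    total_attenuation = sum(mu * length for mu, length in segments)
    return incident * math.exp(-total_attenuation)
```

For example, a ray crossing 5 cm of tissue with mu = 0.2 cm^-1 is attenuated by a factor of e^-1; a ray crossing nothing arrives unattenuated.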

The source code of gVirtualXRay is available under the BSD 3-Clause License. For details on use and redistribution please refer to http://opensource.org/licenses/BSD-3-Clause.
LINK: http://gvirtualxray.sourceforge.net/

 

UltraSendo

UltraSendo is a new tactile system designed to utilise focussed airborne ultrasound to mimic a pulsation effect such as that of a human arterial pulse or thrill. In this project, we used focussed airborne ultrasound to create a novel haptics component, which can later be integrated into a variety of medical procedure training simulators.

Link to paper

     
 



Virtual Environment for Rugby Skills Training (VERST)

There is growing interest in utilising Virtual Environments (VEs) in the context of sports. In particular, there is a desire to improve sensorimotor skills rather than just using a VE as a tool for strategy analysis or entertainment. The skills required across different sports are many and varied. This project focusses on ball sports, in particular a VE for developing training skills in rugby. Such a VE needs to provide realistic rendering of the sports scene to achieve good perceptual fidelity. More important for a sport-themed VE is high functional fidelity, which requires an accurate physics model of a complex environment, real-time response, and a natural user interface. The goal is to present the player with multiple scenarios at different levels of difficulty, building skills that can be applied directly in the real sports arena.

Fly4PET: Fly Algorithm in PET Reconstruction for Radiotherapy Treatment Planning

 

This project is focused on developing new software technologies for lung cancer treatment, based on accurate physical models implemented using high-performance computing. Four research themes have been identified: improvement of our original reconstruction algorithm for Positron Emission Tomography (PET) imaging, fast respiration simulation, tumour segmentation and extraction, and interactive multi-modal visualisation. The expected clinical outputs are improved quantitative results in PET, assistance for doctors in elaborating treatment plans using both anatomical (CT) and biological (PET) information, help in assessing the response of tumours to treatment, and improved validation of treatment plans by radiation oncologists.
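The Fly Algorithm in the project's name is an evolutionary approach: a population of point-like "flies" is evolved so that their simulated projections match the measured PET data, with each fly's fitness measured by how much the match degrades when that fly is removed. The toy below sketches the idea in 1-D, using a histogram as the "projection"; all names and parameters are invented for illustration and bear no relation to Fly4PET's actual implementation.

```python
import random

def fitness(flies, target_hist, bins):
    """Squared error between the flies' projection (a histogram of their
    positions) and the measured data `target_hist`."""
    sim = [0] * bins
    for f in flies:
        sim[f] += 1
    return sum((s - t) ** 2 for s, t in zip(sim, target_hist))

def evolve(flies, target_hist, bins, steps=1000, seed=0):
    """Toy 1-D Fly Algorithm: each fly is a candidate emission position;
    its marginal fitness is how much the projection error grows when it
    is removed. Bad flies are replaced by mutated copies of good ones."""
    rng = random.Random(seed)
    for _ in range(steps):
        i, j = rng.randrange(len(flies)), rng.randrange(len(flies))
        base = fitness(flies, target_hist, bins)
        def marginal(k):
            # Error without fly k minus error with it: positive = useful fly.
            rest = flies[:k] + flies[k + 1:]
            return fitness(rest, target_hist, bins) - base
        # The fly whose removal helps most (or hurts least) is the worse one.
        worse, better = (i, j) if marginal(i) < marginal(j) else (j, i)
        # Replace it with a mutated copy of the better fly.
        flies[worse] = min(bins - 1, max(0, flies[better] + rng.choice([-1, 0, 1])))
    return flies
```

Flies that sit where the measured data has activity survive and are copied; flies elsewhere are culled, so the population gradually concentrates into an estimate of the emission distribution.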

This multi-disciplinary project will require an alliance with external partners in computer science, medicine and medical physics from France, Belgium, California and Wales.

 


The Lost Heritage of Gwynedd in 3D
Dr Jonathan C Roberts, Senior Lecturer at the School of Computer Science and Ray Karl, Professor of Archaeology and Heritage are joint leaders of a project entitled 'Co-production of alternative views of lost heritage' which has secured a grant worth c. £575,000 under the Arts and Humanities Research Council’s ‘Connected Communities’ call. The project will be in collaboration with the schools of Computer Science at Aberystwyth University, Archaeology at Manchester Metropolitan University and Gwynedd Archaeological Trust. This builds on an ongoing AHRC £100,000 project led from Bangor: 'Alternative views of the lost heritage of Gwynedd'.

The focus of the project will be on producing heritage data in conjunction with local communities. Photographs of heritage artefacts and environments will be uploaded to our website (www.heritageTogether.org). By collaborating with our local Historic Environment Record at Gwynedd Archaeological Trust, the data thus generated will be linked directly to the archaeological information held there. Dr Jonathan Roberts said: "It is fantastic to be working with the community on this project; it will allow the community to learn more about their heritage, while improving heritage management and allowing damage to, or loss of, monuments to be monitored much more effectively. It also provides a step change in research collaboration between academia and the wider public, and gives the public a new way of meaningfully engaging with their heritage."

 


Project IVY
Project IVY is a European-funded project with 6 partners (EU Lifelong Learning Programme - Project 511862-2010-LLP-UK-KA-KA3MP). The aim is to develop a virtual collaborative training environment for interpreters. The project is currently using Second Life to create a prototype environment. Users can teleport to a chosen scenario, where the room is modelled appropriately for that scenario; they can then hear the scenario and interact with the world to learn how to act as an interpreter.

Project EVIVA

Project EVIVA is a follow-up project to IVY, with 5 partners (EU Lifelong Learning Programme - Project 511862-2010-LLP-UK-KA-KA3MP). The project’s aims are (a) to investigate the efficiency of VLEs, how they support different learning activities, and how learners from diverse backgrounds learn by using VLEs; (b) to develop innovative evaluation methods for interpreter-mediated communications by combining traditional methods of assessment with methods such as introspection, corpus analysis and visual analytics; and (c) to formulate design recommendations for building VLEs for interpreting.

Bangor Partner site URL: http://www.vmg.cs.bangor.ac.uk/IVY/
Consortium URL: http://www.virtual-interpreting.net




     

Visualization and Medical Graphics Group
School of Computer Science,
Bangor University, Dean Street, Bangor, Gwynedd, UK. LL57 1UT
Email: vmg@bangor.ac.uk - Tel +44 (0)1248 388244
Privacy and Cookies