VR. ready, steady…

Transforming online learning experience using virtual reality and gamification


Nothing cheers you up more than a new gadget in the middle of a term: an Oculus Rift, controllers and earphones (thank you, Nick!). The 60+ mph wind gusts (Storm Doris) nearly took them off my arms in the car park, but I managed.

We are still expecting a Google Pixel + Daydream and a FOVE (with eye-tracking capabilities) to arrive.


Packet probing for media synchronisation

[A piece of work with Hans Stokking from TNO]

Cross-device immersive media has been one of my main research topics for many years. To achieve the "immersion", we inevitably need a mechanism to orchestrate media playback across different devices. This sounds easy, but it is often very hard in practice, especially across heterogeneous devices (iPad, Smart TV, Raspberry Pi…). One challenge is delay. When we send a command (such as "START PLAYING") to another device, the application takes some time to encode and pack the information into manageable chunks, which then sit in a queue waiting to be serialised as packets for network distribution. Will the packets then travel at the speed of light? Not quite, but still at around 177,000 km/s (59% of the speed of light) in Ethernet cables.

The Internet is not an empty highway, though: your packets will "bump into" others at network nodes such as routers and switches, which simply means they will probably sit in a queue again and wait for their turn. When the packets finally arrive at the receiver, they must pass through the network interface card, the driver, the TCP/IP stack, and whatever application controls the media playback before your command can be executed. The whole process, after all the waiting, may only take a few hundred milliseconds. Not bad. OK, let's make things easier for ourselves and put all the devices close together on the same network; the overall delay is then probably just under 100 milliseconds (1/10 of a second). Sounds good, right? Surely it's OK for one speaker to lag the other by 100 milliseconds???
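To make the arithmetic concrete, here is a back-of-envelope delay budget in Python. All the per-stage figures below are illustrative assumptions for the sake of the example, not measurements; only the 177,000 km/s propagation speed comes from the discussion above.

```python
# Back-of-envelope delay budget for a "START PLAYING" command.
# Per-stage figures are illustrative assumptions, not measurements.

SPEED_IN_COPPER_KM_S = 177_000  # ~59% of the speed of light, in Ethernet cable


def propagation_delay_ms(distance_km: float) -> float:
    """Time for a signal to travel distance_km through copper, in milliseconds."""
    return distance_km / SPEED_IN_COPPER_KM_S * 1000


# Hypothetical budget for one command, in milliseconds:
budget = {
    "application packing": 5.0,                  # encode + packetise the command
    "send queueing": 2.0,                        # waiting for the network card
    "propagation (500 km)": propagation_delay_ms(500),
    "router queueing (5 hops x 3 ms)": 15.0,     # waiting at intermediate nodes
    "receiver NIC/driver/stack/app": 10.0,       # before the command executes
}

total = sum(budget.values())
for stage, ms in budget.items():
    print(f"{stage:32s} {ms:6.2f} ms")
print(f"{'total':32s} {total:6.2f} ms")
```

Even with generous guesses, the propagation itself (under 3 ms for 500 km of cable) is dwarfed by the queueing and end-system stages, which is exactly why local setups still accumulate tens of milliseconds.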

What did our recent research say about the human perception of latency in an inter-destination audio-visual test?

20–40 milliseconds: perceivable.
60–100 milliseconds: annoying.

Oops….

If you are thinking, "No worries, let's actively measure all the delays our messages encounter (serialisation delay, queuing delay, propagation delay, processing delay, reply delay, etc.) and compensate for them when we execute the command", then you are with the group of "mad" researchers in the MediaSync community, who have spent many years investigating the sources of Internet delay and feasible ways to measure it.
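One classic building block for this kind of compensation is the NTP-style four-timestamp probe, which estimates the clock offset and round-trip delay between two devices from a single request/reply exchange. The sketch below is a minimal illustration under the usual assumption of symmetric paths; the function name and the example numbers are mine, not from any particular MediaSync system.

```python
# NTP-style probing: sender stamps t1 on the probe and t4 on the reply;
# receiver stamps t2 on arrival and t3 on departure (its own clock).

def estimate_offset_and_delay(t1, t2, t3, t4):
    """
    t1: probe sent (sender's clock)
    t2: probe received (receiver's clock)
    t3: reply sent (receiver's clock)
    t4: reply received (sender's clock)
    Returns (clock_offset, round_trip_delay) in the same time unit,
    assuming the forward and return paths have equal delay.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2   # receiver clock minus sender clock
    rtt = (t4 - t1) - (t3 - t2)            # total time minus receiver processing
    return offset, rtt


# Example: the receiver's clock runs 50 ms ahead, each one-way path takes
# 20 ms, and the receiver spends 5 ms before replying (seconds throughout).
offset, rtt = estimate_offset_and_delay(t1=0.000, t2=0.070, t3=0.075, t4=0.045)
print(offset, rtt)  # ~0.05 s offset, ~0.04 s round trip
```

With the offset in hand, a "play at time T" command can be translated into the receiver's own clock, which is far more robust than trying to time the command's arrival.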

To give an overview of the different packet probing methods for estimating delay and available bandwidth (and to show how such measurements can really make or break a cool media application), Hans and I are now finalising a manuscript for a Springer book chapter.

It looks like we are going over the 25-page limit though…

UoN Waterside campus in VR?

Creative Hub, Waterside campus, The University of Northampton (All rights reserved)

In the past few months, I have been working with a few colleagues in Computing, including Dr Anastasios Bakaoukas and Ewan Armstrong, to initiate a VR project. The idea is simple: build our new Waterside Campus (due to open in 2018) in virtual reality using a game development engine, so we can all (virtually) walk around our new site before it is fully completed in the physical world. So why have we volunteered to do this?

  1. Because we can! Our Game Development/Arts/Design programmes are strong and fast-growing. We have expertise in modelling, artistic design and artificial intelligence for developing immersive games.
  2. It can potentially help the University promote the infrastructure and facilities at the new site to prospective students. To this end, we have worked with the marketing team to understand their needs. The tool could also help improve the visitor/student experience once we move to the new campus.
  3. Using the new campus as a case for teaching. Students can drop their designs or game logic directly into this VR environment and test their work in a unique context.
  4. It will be a great platform for media, AI, and traffic analysis research.

Despite an enormous teaching and marking workload, the team has worked with an external media company and committed many, many hours to transforming 3D models for the gaming environment (special thanks to Ewan!). We hope to deliver some interesting results very soon!

BTW, if you like those "cut-in-half" renders, here are some of my takes on the Creative Hub:

Take a small slice off and you can see some networking space on the left and entrances to the blocks of lecture theatres on the right:

Creative Hub, Waterside campus, The University of Northampton (All rights reserved)

 

Cutting it further:

Creative Hub, Waterside campus, The University of Northampton (All rights reserved)

 

It looks like there is a standalone building wrapped inside…

Creative Hub, Waterside campus, The University of Northampton (All rights reserved)

And here is the Engine Shed, a Grade II listed building currently being restored; it will be the home of the Student Union.

 

The Student Union, Waterside campus, The University of Northampton (All rights reserved)