[This is a “work-in-progress” page which I update every now and then. For more details, email me (firstname.lastname@example.org).]
A middleware API to enable the orchestration of multi-device experiences
Link multiple devices and get them to work together to deliver a more immersive TV experience. The project will be part of a collaboration with BBC R&D.
A Domain Specific Language to express multi-device experiences
Build a tool to help programme directors express an experience (e.g., an explosion on the TV screen triggers fans, lights, etc. in your home). This could be a game changer for broadcasting and post-production studios. The project will be part of a collaboration with BBC R&D.
Build a synchronisation accuracy toolkit
This project involves using an Arduino (www.arduino.cc), light sensors, and audio inputs to measure the asynchrony between two devices playing the same stream. The toolkit could become an essential tool for enabling new TV viewing experiences across multiple linked devices. The project will be part of a collaboration with BBC R&D.
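To give a flavour of the analysis step of such a toolkit: given two recordings of the same audio cue (one per device), the playback offset is the lag that maximises their cross-correlation. The sketch below is purely illustrative, not an existing toolkit; the synthetic click-train signal and 1 kHz sample rate are assumptions for the demo.

```python
import numpy as np

def estimate_offset(sig_a, sig_b, sample_rate):
    """Estimate the playback offset (in seconds) of sig_a relative to
    sig_b by locating the peak of their cross-correlation."""
    sig_a = (sig_a - np.mean(sig_a)) / np.std(sig_a)
    sig_b = (sig_b - np.mean(sig_b)) / np.std(sig_b)
    corr = np.correlate(sig_a, sig_b, mode="full")
    # Re-centre the peak index so that 0 means "perfectly in sync".
    lag = np.argmax(corr) - (len(sig_b) - 1)
    return lag / sample_rate

# Synthetic check: a click train delayed by 50 samples at 1 kHz sampling.
rate = 1000
t = np.arange(2000)
clean = np.sin(2 * np.pi * 5 * t / rate) * (t % 400 < 20)
delayed = np.roll(clean, 50)  # device B lags by 50 ms
print(estimate_offset(delayed, clean, rate))  # ≈ 0.05 s
```

In the real toolkit, the two signals would come from the Arduino's light sensor or microphone rather than being synthesised.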
Integrating headless Internet of Things devices in TV experiences
Identify potential IoT devices that could be used for immersive, TV-led experiences. The students could then build a generic infrastructure to discover IoT devices and their capabilities. The project will be part of a collaboration with BBC R&D.
Eye-tracking in virtual reality
Use state-of-the-art kit such as FOVE to systematically track users' gaze points in virtual reality games and documentaries. The results will help VR designers improve user engagement and user experience.
Waterside campus in virtual reality
This project is part of a new initiative to create the entire Waterside campus in virtual reality and develop novel applications. You will be working on mesh objects produced by a professional media company (and approved by the architects) using Waterside site plans.
Using multi-sensory media (Mulsemedia) to deliver immersive experiences
Traditional multimedia applications have primarily engaged only two of the human senses – the aural and the visual. What about the tactile, olfactory, and gustatory senses? The result, mulsemedia, could be a key to the world of home entertainment and online retailing. Ref1 Ref2
Conventional network nodes are programmed to conduct only low-level tasks such as packet forwarding. A more intelligent network can greatly improve network efficiency, energy consumption, and user experience. You are welcome to use tools such as NS2/NS3, Mininet, OpenNet, etc. to evaluate your work. Ref1
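To make the idea of a more intelligent network concrete: a programmable node can be thought of as a match-action table that a controller populates with forwarding rules, loosely in the spirit of software-defined networking. The sketch below is a minimal illustration only; the class, field names, and actions are hypothetical, not a real OpenFlow or simulator API.

```python
class FlowTable:
    """A tiny match-action table: each rule maps packet header fields
    to a forwarding action; unmatched packets fall back to a default."""

    def __init__(self, default="flood"):
        self.rules = []  # (match_fields, action), in insertion order
        self.default = default

    def add_rule(self, match, action):
        self.rules.append((match, action))

    def lookup(self, packet):
        # First rule whose fields all match the packet wins.
        for match, action in self.rules:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return self.default

table = FlowTable()
table.add_rule({"dst": "10.0.0.2"}, "port 2")
table.add_rule({"dst": "10.0.0.3"}, "port 3")
print(table.lookup({"src": "10.0.0.1", "dst": "10.0.0.2"}))  # port 2
print(table.lookup({"src": "10.0.0.1", "dst": "10.0.0.9"}))  # flood
```

A project in this area would replace the toy lookup with controller logic that optimises for efficiency or energy, evaluated in a tool such as Mininet or NS3.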
Machine Learning – Understanding user behaviours in online TV
Understanding user interactions and user behaviours is becoming as fundamental as streaming the content itself for any online media service. For instance, Amazon, Netflix, and Google all track our activities and use machine learning tools to improve their recommendation engines and service designs. In this project, you will have access to a large dataset of user activities captured on an online IPTV platform. Your role is to (study and) use machine learning tools to model how users interact with the service, and to propose solutions that better adapt the dynamic elements of a website to (latent) user preferences.
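A natural first step in such a study, before reaching for heavier machine learning tools, is a simple transition model of how users navigate the service. The sketch below fits a first-order Markov model from session logs; the page names and toy sessions are hypothetical, not taken from the actual dataset.

```python
from collections import Counter, defaultdict

def fit_markov(sessions):
    """Fit a first-order Markov model of page-to-page transitions
    from a list of user sessions (each an ordered list of page names)."""
    counts = defaultdict(Counter)
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    # Normalise raw counts into transition probabilities per source page.
    return {src: {dst: c / sum(nxt.values()) for dst, c in nxt.items()}
            for src, nxt in counts.items()}

# Toy sessions from a hypothetical IPTV portal.
sessions = [
    ["home", "live", "player"],
    ["home", "catchup", "player"],
    ["home", "live", "player"],
]
model = fit_markov(sessions)
print(model["home"])  # from "home": live 2/3 of the time, catchup 1/3
```

The fitted probabilities can then feed into the adaptation step, e.g. promoting the most likely next page for a given (latent) user preference.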
Game with a Purpose (GWAP) for health
Wouldn’t it be great to have fun playing a game while helping you become healthier? See Ref1 for some inspiration.
Interested in proposing something new? In general, my areas of interest include, but are not limited to: media processing, media retrieval, media orchestration, online video, social network analysis, context-aware recommendation, affective computing, quality of service/quality of user experience, Internet of multimedia things, software-defined networking, network function virtualisation, security, anomaly detection, wearables, augmented reality, virtual reality, human-computer interaction, and interaction design. I am also interested in cross-disciplinary research and development.