Moving into the Portfolio Innovation Centre

SDCN: Software Defined Cognitive Networking

While most University staff members are settling into our new and modern Waterside Campus, we have also welcomed a new workshop space at the Portfolio Innovation Centre on the University's Avenue Campus. The workshop has just had its furniture moved in, but it will soon see lots of research, development, testing, and experimentation. It will support the SDCN project as well as any of my research students who require short-term R&D space (for final-year projects, MSc projects, etc.). It will be an interesting place of oscilloscopes, VR goggles, Raspberry Pis, drones, network switches, media streamers and, most importantly, talented young people. I'll encourage them to post stories of their research here.

(and yes, a coffee machine is ready and whiteboards are coming soon.)


Westminster Higher Education Forum and IEEE TALE 2018

I have been invited as a speaker at a Westminster Higher Education Forum: Technology in higher education: the future of learning environments, the use of Artificial Intelligence and the impact of online courses. A short intro to this event (scheduled for Thursday, 28th March 2019 in London) is copied below. The forum will discuss the impact of technologies and data science on learning and teaching by looking at different theories and practices. I'll contribute our experiences with Active Blended Learning at Northampton but, more importantly, the on-site experiments our staff and students have carried out with primary and secondary schools in recent years.

And on that same note, I have also become part of the programme committee for the IEEE International Conference on Teaching, Assessment and Learning for Engineering (TALE 2018). IEEE TALE is “the IEEE Education Society’s flagship Asia-Pacific conference series, catering to researchers and practitioners with an interest in engineering and computing education as well as those interested in the innovative use of digital technologies for learning, teaching, and assessment in any discipline”. The conference theme this year is “Engineering Next-Generation Learning”. Anything with engineering in it immediately sounds exciting, doesn't it? My understanding is that it is engineering in very broad terms, including human factors.

Westminster Higher Education Forum: Technology in higher education: the future of learning environments, the use of Artificial Intelligence and the impact of online courses.

This seminar focuses on the use of technology in the higher education sector, assessing its effectiveness and discussing ways forward for maximising its potential in both learning and teaching.

Delegates will consider the impact of technology on the learning experience and what more might need to be done to meet students’ expectations by further developing personalised teaching. They will assess the benefits and challenges of flipped and blended learning practices in delivering instructional content and improving students’ engagement.

Further sessions will discuss how technology can impact the quality of marking by supporting better standardisation processes, while alleviating pressure on lecturers by reducing academic workloads. Delegates will look at improvements already delivered in the quality of feedback provided to students and at practical issues still to be addressed.

Those attending will also discuss the latest findings on the use of virtual reality teaching, including concerns about its potentially negative effect on students’ ability to memorise quantitative data and its successful application in subjects such as science and geography.

The seminar will also look at the development of Massive Open Online Courses (MOOCs), including key lessons that can be learnt from examples of best practice in marketing and student recruitment. Delegates will examine how MOOCs have fostered co-operation between hi-tech industry and the higher education sector, while also assessing their impact on lifelong learning and social mobility.

Orchestrated media demo on BBC Taster

The Audio Team at BBC R&D North Lab has recently published an orchestrated media demo (the Vostok-K Incident), which is now available on BBC Taster. It shows how we can orchestrate media playback across multiple user devices to deliver a more immersive experience. The demo uses a cloud-based media synchronisation service that Dr. Rajiv Ramdhany (thank you for sharing the news, Rajiv!) built for the 2Immerse project. The demo is very similar to the one we shared a couple of years ago in an IEEE J-STSP journal article (open access). I believe a version of the underlying system (I'm not sure if it's the one used in the Taster demo) has borrowed our idea of a perceptual model to adjust playback for different “catch-up” scenarios (all the complex equations can be found in the paper, if you are interested). Technically, it is quite difficult to achieve this level of synchronicity without using special chips and network protocols (read this paper to see how many things can go wrong). What's also fantastic about this demo is that it uses content specifically created for the technology. Instead of using any off-the-shelf 5.1/7.1 movie soundtracks, the demo splits sound sources and merges them on the fly based on which user device(s) are available.
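For readers curious about what such a “catch-up” adjustment might look like, here is a minimal, hypothetical sketch. The thresholds, rate bound and correction window below are my own illustrative assumptions, not the values used in the 2Immerse service or our paper: when a companion device drifts from the master timeline, its playback rate is nudged within a perceptually tolerable bound, and a hard seek is used only when the drift is too large to mask.

```python
# Hedged sketch of perceptual "catch-up" for media sync across devices.
# All constants are illustrative assumptions, not values from any real system.

SEEK_THRESHOLD = 1.0   # seconds; drift beyond this is too large to mask, so hard-seek
RATE_BOUND = 0.03      # clip playback-rate change to +/-3%, assumed barely noticeable
DEADBAND = 0.020       # 20 ms; drift below this is left alone
CATCH_UP_WINDOW = 5.0  # spread the correction over ~5 seconds of playback

def catch_up_action(device_pos: float, master_pos: float) -> tuple[str, float]:
    """Decide how a companion device should realign with the master timeline.

    Positions are media timestamps in seconds. Returns one of:
      ("none", 1.0)        -- drift is imperceptible, do nothing
      ("adjust", rate)     -- play at `rate` until realigned
      ("seek", master_pos) -- drift too large, jump to the master position
    """
    drift = master_pos - device_pos  # positive means the device is behind
    if abs(drift) < DEADBAND:
        return ("none", 1.0)
    if abs(drift) > SEEK_THRESHOLD:
        return ("seek", master_pos)
    # Smooth correction: speed up (or slow down) slightly, clipped to the
    # perceptual bound so the tempo change stays hard to notice.
    rate = 1.0 + max(-RATE_BOUND, min(RATE_BOUND, drift / CATCH_UP_WINDOW))
    return ("adjust", rate)
```

For example, a device 100 ms behind the master would be asked to play at roughly 1.02x until it catches up, while a device 2.5 s adrift would simply seek. The real perceptual model in the paper is considerably more nuanced than this fixed-bound sketch.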

I think the biggest challenge is the level of human intervention still required for the demo to work, and work well, in the wild over mobile devices. Device discovery is an obvious topic and we can throw some crazy ideas at it easily (e.g., ultrasonic piggy-backing). I am also interested in how devices' capabilities play a key role in the experience (also observed in my previous experiments). While listening to the Vostok demo, I was subconsciously trying to work out my own location in the scene, and that is often dictated by which device has the higher volume. So my personal experience might not match what the directors intended. Is this a bad thing? Not necessarily. Like user-controlled 360° video, allowing the audience to choose where they want to be (e.g., deciding which character to stand next to in a scene) can be a good pathway to content customisation and interactive media. Perhaps I should get my 3rd-year undergrads to try out the demo in the classroom. Wouldn't it be cool to have 40 mobile phones going crazy at the same time? Maybe they'll also come up with some nice projects to work on.

Post-doctoral Research Assistant position open for applications

The vacancy can be accessed directly via and will remain active until 11.59pm on 12 September 2018.

We wish to appoint a Research Associate to work on the EPSRC-funded project “Software Defined Cognitive Networking (SDCN)”. The SDCN project aims at enhancing online video distribution, which accounts for the vast majority of internet traffic. We seek to develop new context-aware network models to improve user experience and network efficiency using software-defined networks.

We invite applications from enthusiastic individuals with experience in relevant aspects of communication networks. The ideal candidate will be a proficient programmer in languages appropriate for scientific computing (e.g. Python, Java) and have knowledge of statistical modelling. The role will require you to engage with external partners and stakeholders, and therefore you should possess excellent communication skills with evidence of working effectively both as an individual and within a team. You will have strong project management skills and the ability to write up research for peer-reviewed conferences and journals.

You will work at the University of Northampton’s new Waterside Campus. A dedicated lab at the university’s Avenue Campus will also accommodate research and experimentation activities. The University promotes continuing professional development (CPD) and provides access to development for staff at all levels.

The candidate should possess a PhD in computer science or a closely related field. This is a fixed-term appointment of 12 months.

For informal enquiries, please contact Dr Mu Mu, email:



Does WOW! translate to an A? A comparative study of VR-based learning in higher education

Congratulations to Yoana (previously a BSc Business Computing student and now on our MSc programme) for having a paper accepted at IEEE VR 2018, a leading conference on 3D and virtual reality research!

Yoana’s research is centred around a simple but fundamental question: does the “WOW!” effect of VR contribute much, if anything, to students’ learning outcomes in a higher education setting? While VR is increasingly adopted by primary/secondary schools in the UK to improve pupils’ engagement with learning materials as part of the STE(A)M initiatives, it is unclear how the technology would and could impact the learning of hard sciences at university. Yoana conducted a comparative study of students’ performance in a standardised assessment when course content is delivered using VR versus conventional lecture slides. Interestingly, students see VR as a great platform for isolating them from real-world distractions, but the extra cognitive load brought by VR content has a detrimental impact on how learners recognise and memorise important quantitative data. We also concluded that social interaction and tailored productivity tools are two main factors that underpin the exploitation of VR in HE.

Slavova, Y. and Mu, M., A Comparative Study of the Learning Outcomes and Experience of VR in Education, to appear in Proceedings of the 25th IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR 2018), Germany, 05/2018


PhD Studentship open to applicants!

SDCN: Software Defined Cognitive Networking


UPDATE: The PhD advert is now online at and is also being circulated to relevant mailing lists and social media.

A 3.5-year PhD studentship has now been approved by the University as part of its support for the project. It offers a great opportunity for a recent graduate, or someone with related work experience, to work on an emerging research topic alongside elite industrial and academic partners. The job advert will be online soon on and the University’s website, open to applicants of any nationality. The application deadline is early January (for a March start).


SDCN goes to Ireland for a project review

SDCN: Software Defined Cognitive Networking

[Author: Mu]

I accepted an invitation from Science Foundation Ireland to conduct a project review for one of its prestigious research programmes. It was a fantastic experience to visit Ireland and to see how resilient people are in the wake of Hurricane Ophelia. The work at the host institution is forward-thinking, and I was glad to witness all the positive changes made by a single research project over two years.

And yes, I did have a couple of hours to spend in Dublin before my late morning flight back home:

