Research

A brief overview of the projects I am currently working on with my peers and postgraduate students:

1. Synchronous Cross Device Interaction for Multi-Mobile Collaboration System


This project investigates the user experience issues that arise when multiple users calibrate and align several mobile devices during collaborative interaction. Preliminary observations reveal that the existing calibration setup is too complicated for users to initiate connectivity across multiple devices. Following this, several connectivity methods are explored to minimize connection time and promote seamless integration between the users and the system.

Project in Collaboration with: Teo Rhun Ming


2. Solving the Effects of Bezel and Screen Orientation on Multi-Mobile Device System through Continuous and Rotatable Spatial Mapping



Mobile devices are increasingly becoming an important part of everyday technology. As the demand for handling information grows, so does the need for collaboration in integrating and managing multiple forms of information. However, the small screen of a single device makes it relatively uncomfortable for a group of people to perform tasks collaboratively. This project aims to address interactivity limitations, such as bezels and screen orientation, which have been found to restrict users from performing collaborative tasks.


Project in Collaboration with: Ong Beng Liang


3. Using Spatial Conceptual Metaphor on Musical Objects for Music Composition in Immersive Virtual Environment





The project intends to identify the musical parameters that are significant in the music-making process. The proposed system intends to provide users with three-dimensional interaction in a virtual environment; hence, exploring human interaction through a natural user interface is one of the research objectives highlighted in this study. In addition, the conceptual framework is to enable the sound synthesis process through direct manipulation of geometric objects in a VR environment. Ultimately, the goal is to develop a VR system that provides a music learning experience for novice users to learn musical terminology through direct interaction with the immersive environment.
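
As a purely illustrative sketch of this spatial-metaphor idea (the actual mappings and parameters are exactly what the study sets out to identify), the short Python example below maps two hypothetical object properties, height and size, to the pitch and loudness of a plain sine oscillator:

```python
# Hypothetical sketch: properties of a geometric object in the virtual
# environment are mapped to parameters of a simple sine oscillator.
# The mapping ranges below are invented for illustration only and are
# not the project's actual design.
import numpy as np

SR = 44100  # sample rate in Hz

def object_to_tone(height, size, duration=1.0):
    """Map a virtual object's height to pitch ("higher is higher") and
    its size to loudness ("bigger is louder"), returning audio samples."""
    # height in [0, 1] -> pitch between 220 Hz and 880 Hz
    freq = 220.0 + height * (880.0 - 220.0)
    # size in [0, 1] -> amplitude between 0.1 and 1.0
    amp = 0.1 + size * 0.9
    t = np.linspace(0.0, duration, int(SR * duration), endpoint=False)
    return amp * np.sin(2.0 * np.pi * freq * t)

# e.g. a tall, small object produces a high-pitched, quiet tone
samples = object_to_tone(height=0.9, size=0.2)
```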


Project in Collaboration with: Hoo Yong Leng



4. Identifying the Basis of Auditory Similarity in Concatenative Sound Synthesis for Automatic Music Composition


This research aims to identify the basis of auditory similarity as judged by users of concatenative sound synthesis. Concatenative sound synthesis (CSS) is an existing approach for creating new sounds from a user-supplied audio query. Typically, the audio is synthesised based on the least distance between the query sound unit and the available sound units in the database. However, sounds synthesised through this approach often result in only a mediocre level of user satisfaction, as confusion between various audio perception attributes during the CSS system's matching process causes mismatches to occur. This study will determine the dominant perceptual attribute, e.g. melody, timbre, tempo or loudness, on which humans base their judgment of sound similarity. The study also looks at two categories of CSS users, musicians and non-musicians, and observes whether there is a significant difference in the subjective judgments between the two groups with regard to sound similarity.
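
As a simplified, hypothetical sketch of this least-distance matching step (not the project's actual implementation), the Python example below assumes each sound unit has already been summarised as a fixed-length feature vector and selects, for every query unit, the corpus unit with the smallest weighted Euclidean distance; the per-attribute weights stand in for the perceptual attributes (melody, timbre, tempo, loudness) whose relative importance the study seeks to determine:

```python
# Minimal, hypothetical sketch of the least-distance matching step in a
# concatenative sound synthesis (CSS) system. Feature vectors are assumed
# to be precomputed perceptual descriptors for each sound unit; the real
# system's features, weights and distance measure may differ.
import numpy as np

def select_units(query_features, corpus_features, weights=None):
    """For each query unit, return the index of the corpus unit with the
    smallest (optionally weighted) Euclidean distance in feature space.

    query_features  : array of shape (n_query_units, n_features)
    corpus_features : array of shape (n_corpus_units, n_features)
    weights         : optional per-feature weights, e.g. to favour melody
                      over loudness when judging similarity
    """
    if weights is None:
        weights = np.ones(query_features.shape[1])
    # Pairwise weighted distances: shape (n_query_units, n_corpus_units)
    diff = query_features[:, None, :] - corpus_features[None, :, :]
    dists = np.sqrt(((diff ** 2) * weights).sum(axis=-1))
    return dists.argmin(axis=1)

# Hypothetical usage: 4 query units, a corpus of 100 units, 13-dim features.
rng = np.random.default_rng(0)
query = rng.normal(size=(4, 13))
corpus = rng.normal(size=(100, 13))
print(select_units(query, corpus))  # indices of the best-matching corpus units
```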




5. Effective Gestures in Immersive Pretend Play Environment for Children



This project addresses the question: "How can pretend play create effective gestures in an immersive pretend play environment for children?". It emphasizes measuring the social interaction of children in an immersive pretend play environment, with the purpose of creating effective gestures for such an environment. The results are expected to lead towards an immersive pretend play theory, a set of effective gestures, a proof of concept (prototype), and an application.

Project in Collaboration with: Nur Syabila Zabidi




6. Effective Multi-Touch Gestures with Auditory Cues for Kindergarten Children's Learning Apps

The project intends to improve kindergarten children's interaction by implementing effective multi-touch gestures and auditory cues in their virtual learning environment. The children's behaviours while using the learning apps will be studied, and appropriate gestures for these children will be identified. In addition, sound cues will be synchronized with the gesture movements to enhance the children's learning experience.


Project in Collaboration with: Amelia Mazalan


7. Designing a User Centered Approach to the Development of a Virtual Mobile Kompang for Primary School Children




Children who learn music and can play an instrument show enhanced skills in many areas. A variety of musical instruments is available for children to take up, and one of the most common instruments taught in Malaysian primary schools is the Kompang. However, the instrument has several limitations. For example, the Kompang can be awkward and heavy to carry around. Moreover, as a percussive instrument it is very loud, which can irritate people around the player and makes it difficult for students to practise at home. A virtual mobile application can hopefully overcome these limitations: it is portable, smaller in size and allows more parameter controls, while also extending the existing applications, which focus mainly on Western instruments such as the drum, guitar and piano. This research proposes a User Centred Approach to the development of a virtual mobile application from the users' perspective and will analyse the requirements of a spatial-based virtual mobile Kompang. A prototype of the application will also be developed and its effectiveness evaluated.


Project in Collaboration with: Siti Aisah Atan



8. PharmacoApp: An Interactive Pharmacology Mobile App for Undergraduate UPM Medical Students



This research will contribute a new design for an interactive pharmacology mobile app, and may influence the way M-learning is used to increase the retention rate of undergraduate students when learning the material taught in their Pharmacology course.


Project in Collaboration with: Dr. Siti Khadijah Ali, Dr. Muhamad Zulfadhli Mehat; Faculty of Medicine and Health Sciences, UPM





9. Incorporating Haptic Technology to Re-Engineer the Psychological Empowerment of Youth with Disabilities via Inclusive Virtual Training



The general objective of the study is to develop virtual training for re-engineering the psychological empowerment of people with disabilities. The project will determine the inclusive features of virtual training that are user-friendly for youth with disabilities, which will then be used to design and develop inclusive virtual training for this group. The effectiveness of the training and the level of psychological empowerment of the youth with disabilities who receive it will then be evaluated through pre- and post-tests and a usability study of the mobile application.

Project in Collaboration with: Dr. Nor Wahiza Abdul Wahat (IPSAS, UPM), Dr. Siti Zobaidah Omar (Faculty of Modern Languages and Communication, UPM)



10. Myself in Artcodes


In an experiment with a split-brain patient, when the patient's right brain received a written instruction to stand up, the patient would stand up. But when asked why he stood up, the patient's left side of the brain (which controls speech) would make up a story that only seems logical to the patient, such as "I'm standing up to get some water". This shows that the patient's left side of the brain receives information from the right side unconsciously. Much evidence in neuroscience suggests that consciousness and free will are only illusions. We explore the notion that consciousness is only a story that the brain creates to justify human actions, which are triggered by the unconscious.

In this work, the audience will be asked to download a mobile app called Artcodes. They will arrange magnets on any of the five human figures on the wall. We argue that the positions of these magnets are reflections of their unconscious minds about their own bodies. They will then scan their chosen images using Artcodes, which will take them to a webpage that provides an interpretation of the positions of the magnets. In actuality, the interpretations are made up by the artists to symbolise the conscious mind's ability to create stories that justify unconscious behaviour. The audience can then click either the 'Agree' or 'Disagree' button on the webpage. We hope the audience will reflect on the absurdity of the work and of life in general.

Project in Collaboration with: Dr. Hanif Baharin (UKM), Dr. Afdallyna Harun (UiTM), Khairul Anwar Mt Nawi (UNITEN)





11. Gesture Recognition for Music-Computer Interaction: the Virtual Piano


This short-term project aims to develop an algorithm for recognizing hand gestures suitable for piano playing, with reasonable speed and accuracy, while also enhancing the music learning experience.

Project in Collaboration with: Nametso Motswagodisa