
My Research

Visual Awareness and Sensory Competition


The human visual system combines input from both eyes into a single, coherent experience, even though the view from each eye is never exactly identical. One way the brain achieves this is through interocular competition, where signals from one eye can suppress those from the other. This competitive process is most clearly seen in binocular rivalry (BR), where perception alternates between the two different images presented to each eye.

I study how this process works, and how the properties of natural and artificial stimuli influence what we become aware of. By understanding how visual experience is constructed from the two eyes, this work helps inform the design of interocular competition experiments as well as more user-friendly assistive technologies, such as smart glasses and low-vision prosthetics. To illustrate the importance of interocular competition in design, the following simulates the perceptual alternations in BR (left) and how they may affect the visibility of content in smart displays during mobility (right).
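The alternations such a demo displays can be sketched in a few lines of code. The toy model below assumes dominance durations drawn from a gamma distribution, a common descriptive model of rivalry dynamics; the shape and scale values are illustrative, not fitted to data.

```python
import random

def simulate_rivalry(n_switches=10, shape=3.5, scale=0.6, seed=0):
    """Simulate a binocular rivalry time course as alternating
    eye-dominance periods. Each dominance duration is drawn from a
    gamma distribution (parameters here are illustrative only).
    Returns a list of (onset_seconds, dominant_eye, duration_seconds)."""
    rng = random.Random(seed)
    t = 0.0
    eye = "left"
    timeline = []
    for _ in range(n_switches):
        duration = rng.gammavariate(shape, scale)
        timeline.append((round(t, 2), eye, round(duration, 2)))
        t += duration
        eye = "right" if eye == "left" else "left"  # perceptual switch
    return timeline

for onset, eye, dur in simulate_rivalry():
    print(f"t={onset:6.2f}s  {eye}-eye dominant for {dur:.2f}s")
```

In a display simulation, the same timeline can gate which eye's content is rendered at each moment, which is what makes periods of suppressed visibility apparent.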

Sensory plasticity

Our sensory systems adapt to the environments and experiences we are exposed to.
I study how the patterns in different types of sensory experience (e.g., indoor and outdoor environments, changes in binocular input) may influence perception. This includes adaptations shaped by long-term prior exposure and those driven by more recent sensory input. The goal is to understand how experience shapes perception, and how these changes can be accounted for when designing experiments and human-centered technologies.


A current focus of my ongoing work is how visual experience shapes higher-order perceptual abilities, such as social vision, and how visual features in digital media may influence perception and development. For example, the following is a VR model of the real-life Kodak Hall constructed from LiDAR scans, and one goal is to study how naturalistic cues affect audiovisual perception in both neurotypical and autistic individuals.

Special population work

In a more applied arm of my research, I study the link between sensory experience and perception in special populations, and derive and pursue translational solutions for the affected populations.

A primary ongoing effort is to develop accessible solutions for early autism detection, pursued under the NeuroBeA and MIMOSA projects, both cross-institutional efforts that include local universities (e.g., NTU, Duke-NUS), KKH, and A*STAR.


I have also contributed to a completed initiative (ConsciousVR) with NNI, where we explored the use of familiar stimuli (given the human tendency to recognise and become aware of them more readily) in a VR-based stimulation programme for patients with disorders of consciousness.

Selected articles that illustrate some of my published work:
