yan tat wong


about me

I am a post-doctoral research fellow at the Center for Neural Science at New York University (NYU). I work in the lab of Bijan Pesaran, studying how we make decisions and developing brain-machine interfaces. Before joining NYU, I completed a PhD in biomedical engineering with the Australian Vision Prosthesis Group at the University of New South Wales, where I worked on developing visual prosthetics.

contact: yan (dot) wong (at) nyu dot edu

cv: pdf

research interests

My research interests are in building neural prosthetics to help those suffering from sensory or motor deficits. One of the major limitations holding back current neural prosthetic devices is our lack of understanding of the basic science of how the brain works. For example, a greater understanding of the neural mechanisms of motor control can help us build brain-machine interfaces that give patients better control over artificial robotic limbs.

In the Pesaran lab, I have studied effector-based decision making in the parietal cortex as well as how different types of neural signals can be used to control prosthetic devices. In our natural behavior, we are constantly making decisions about where to move our eyes and hands. The decision of where to look or where to reach can depend on a wide variety of factors, including the value or reward associated with making that movement. One of the projects I have worked on in the Pesaran lab addressed how the brain makes such effector-specific valuations. We made simultaneous multi-electrode recordings from eye-movement and arm-movement areas in the posterior parietal cortex (PPC) to study how these two areas respond during coordinated look-and-reach decisions. This project demonstrates that multiple areas of the PPC encode the values of movement decisions and that neurons in each area work together to make coordinated movement decisions.

I am now working on translating this work into decoding neural activity for the control of prosthetic devices. We record neural activity from multi-electrode arrays in both parietal and motor cortex while subjects make a variety of reaching movements. Using a motion capture system, we track arm and hand movements in three dimensions, and using our understanding of the basic neurophysiology of these brain regions, we can decode these reaching movements in space.
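To give a flavor of what this kind of decoding can involve, here is a minimal illustrative sketch (not our actual analysis pipeline): a linear decoder mapping binned spike counts from the electrode arrays to 3D hand position from motion capture. The array sizes, simulated data, and ridge-regression choice are all hypothetical stand-ins.

    # Illustrative sketch only: decode 3D hand position from binned spike counts.
    # Data here are simulated placeholders, not recordings.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # 5000 time bins, 96 electrodes, 3D hand position (x, y, z).
    spike_counts = rng.poisson(2.0, size=(5000, 96))
    hand_position = rng.normal(size=(5000, 3))

    # Hold out the last 20% of bins for evaluation.
    X_train, X_test, y_train, y_test = train_test_split(
        spike_counts, hand_position, test_size=0.2, shuffle=False)

    # Ridge regression maps firing rates to hand kinematics.
    decoder = Ridge(alpha=1.0).fit(X_train, y_train)
    print("held-out R^2:", decoder.score(X_test, y_test))

In practice a decoder of this kind would be fit per recording session and evaluated on held-out trials rather than random data.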

selected publications

Focal activation of the feline retina via a suprachoroidal electrode array   paper
Wong, Y. T., Chen, S. C., Seo, J. M., Morley, J. W., Lovell, N. H., Suaning, G. J. Vision Research, 49(8):825-33, 2009.

Retinal Neurostimulator for a Multi-Focal Vision Prosthesis   paper
Wong, Y. T., Dommel, N. B., Byrnes-Preston, P. J., Lehmann, T., Hallum, L. E., Lovell, N. H., Suaning, G. J. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 15:425-434, Sept 2007.

A CMOS retinal neurostimulator capable of focussed, simultaneous stimulation   paper
Dommel, N. B., Wong, Y. T., Lehmann, T., Dodds, C. W., Lovell, N. H., Suaning, G. J. Journal of Neural Engineering, 6:035006, 2009.

Optimizing the decoding of movement goals from local field potentials in macaque cortex.   paper
Markowitz, D. A., Wong, Y. T., Gray, C. M., Pesaran, B. Journal of Neuroscience, 31(50):18412-22, Dec 2011.

Competition for visual selection in the oculomotor system.   paper
Markowitz, D. A., Shewcraft, R. A., Wong, Y. T., Pesaran, B. Journal of Neuroscience, 31(25):9298-306, Jun 2011.

Artificial Vision   book chapter
Chen, S. C., Wong, Y. T., Hallum, L. E., Dommel, N. B., Suaning, G. J., and Lovell, N. H. Encyclopedia on Biomedical Engineering, P. Bonato (ed), Wiley Press.