Xbox 360 and Kinect-based motion sensing for in-home rehabilitation
The goal of this project is to research new motion sensing algorithms and systems for in-home rehabilitation
using new gaming motion tracking technologies, such as the Microsoft Xbox 360 and Kinect. The home environment is
the most convenient place to help patients recover by conducting intensive, repetitive practice of functional movements.
Recent developments in the gaming industry and gaming motion tracking devices, such as the Microsoft Xbox 360 and Kinect,
have great potential to transform in-home rehabilitation. In this research, we will explore the possibility of
utilizing the Kinect for in-home rehabilitation. Specific research tasks in this project include the research and
development of Kinect-based software to capture, track, analyze, and interpret 3D motion images to facilitate
in-home rehabilitation, as well as GPU-enhanced image and video analysis.
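As an illustration of the kind of motion analysis involved, the sketch below computes an elbow flexion angle from three 3D joint positions. The joint names and coordinates here are illustrative assumptions; a tracker such as the Kinect exposes skeleton joints in a similar (x, y, z) form:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by the segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(v1[i] * v2[i] for i in range(3))
    n1 = math.sqrt(sum(v * v for v in v1))
    n2 = math.sqrt(sum(v * v for v in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical shoulder, elbow, and wrist positions (meters, camera frame)
shoulder = (0.0, 1.4, 2.0)
elbow = (0.0, 1.1, 2.0)
wrist = (0.3, 1.1, 2.0)
print(joint_angle(shoulder, elbow, wrist))  # elbow flexion angle, ~90 degrees
```

Tracking such angles over repeated exercise repetitions is one simple way software could quantify a patient's range of motion at home.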
Smartphone-based activity monitoring
The objective of this project is to research new algorithms and software to detect and classify physical
activity for long-term lifestyle and healthcare monitoring using smartphone platforms, such as the iPhone and Android devices.
Physical activity is one of the leading health indicators for measuring mobility level, latent chronic diseases, and the
aging process. A smartphone is an ideal platform for monitoring physical activity since it is usually carried by the user
most of the time and can measure activity passively and automatically in a non-intrusive fashion. Specific
research tasks in this project include feature extraction from inertial sensors (e.g., accelerometers and gyroscopes)
and new machine learning algorithms for activity classification.
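As a minimal sketch of the feature extraction step (the window size and feature set here are illustrative assumptions, not a proposed design), the code below turns a stream of 3-axis accelerometer samples into per-window features that an activity classifier could consume:

```python
import math
import statistics

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer sample."""
    return math.sqrt(sum(v * v for v in sample))

def window_features(samples, window=50):
    """Mean and standard deviation of acceleration magnitude per window."""
    feats = []
    for start in range(0, len(samples) - window + 1, window):
        mags = [magnitude(s) for s in samples[start:start + window]]
        feats.append((statistics.mean(mags), statistics.pstdev(mags)))
    return feats

# Toy stream: 50 "resting" samples (~1 g) followed by 50 "active" samples
rest = [(0.0, 0.0, 1.0)] * 50
active = [(0.0, 0.0, 1.0 + 0.5 * (-1) ** i) for i in range(50)]
features = window_features(rest + active)
print(features)  # the active window shows a much larger standard deviation
```

Features like these, computed over short sliding windows, are typically what gets fed into a classifier to distinguish, say, sitting from walking.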
Ubiquitous sensing based on smart vision techniques for managed homecare
This project aims to investigate new software systems for pervasive home monitoring using smart vision techniques.
Pervasive home monitoring plays a vital role in maintaining independence and improving quality of life for the aging
population. Low-cost vision-based systems can be used to monitor and evaluate the daily activities of occupants.
Specific research tasks in this project include: image sensing device-level visual appearance filtering techniques
to ensure the privacy of occupants, and shape and motion vector analysis for event detection (e.g., vital sign analysis,
adverse event detection).
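One standard building block for vision-based event detection is frame differencing. The sketch below (with invented toy frames and an illustrative threshold, not a proposed detector) scores how much of the scene changed between two grayscale frames:

```python
def motion_score(prev_frame, frame, threshold=10):
    """Fraction of pixels whose intensity changed by more than `threshold`."""
    changed = sum(
        1 for p, q in zip(prev_frame, frame) if abs(p - q) > threshold
    )
    return changed / len(frame)

# Toy 8x8 grayscale frames as flat pixel lists: a static scene, then a
# scene where a 16-pixel region brightens (e.g., an occupant moving)
static = [50] * 64
moved = [50] * 64
for i in range(16):
    moved[i] = 200

print(motion_score(static, static))  # 0.0 -> no motion
print(motion_score(static, moved))   # 0.25 -> motion worth analyzing
```

A real system would run shape and motion-vector analysis on top of such change maps, and could apply the privacy-preserving appearance filtering at the sensor before any frame leaves the device.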
Granger Causality Based Brain-Computer Interface
A brain-computer interface (BCI) (a.k.a. mind-machine interface (MMI), direct neural interface, or brain-machine
interface (BMI)) is a direct communication pathway between the brain and an external device. Granger causality (GC),
one of the key enabling techniques, is one of the most popular measures for revealing the causal influence between time
series and has been widely applied in economics and neuroscience. This project focuses on developing new causality
measures (in the time and frequency domains) for the linear regression model. We envision that the new causality
measures will be more reasonable and interpretable than GC or Granger-like measures.
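For reference, time-domain GC compares the residual variance of an autoregressive model of y with and without the history of x. The deliberately simplified order-1 sketch below, with illustrative toy data, shows the core idea:

```python
import math
import random

def gc_order1(x, y):
    """Granger causality of x -> y using lag-1 linear regression models.

    Compares the residual variance of the restricted model
    y_t ~ y_{t-1} against the full model y_t ~ y_{t-1} + x_{t-1};
    GC = ln(var_restricted / var_full), which is >= 0.
    """
    Y = y[1:]     # targets y_t
    Yl = y[:-1]   # lagged y_{t-1}
    Xl = x[:-1]   # lagged x_{t-1}
    n = len(Y)

    # Restricted model: single-regressor least squares
    a = sum(t * l for t, l in zip(Y, Yl)) / sum(l * l for l in Yl)
    var_r = sum((t - a * l) ** 2 for t, l in zip(Y, Yl)) / n

    # Full model: solve the 2x2 normal equations for coefficients (a2, b)
    s11 = sum(l * l for l in Yl)
    s12 = sum(l * m for l, m in zip(Yl, Xl))
    s22 = sum(m * m for m in Xl)
    r1 = sum(t * l for t, l in zip(Y, Yl))
    r2 = sum(t * m for t, m in zip(Y, Xl))
    det = s11 * s22 - s12 * s12
    a2 = (r1 * s22 - r2 * s12) / det
    b = (r2 * s11 - r1 * s12) / det
    var_f = sum((t - a2 * l - b * m) ** 2
                for t, l, m in zip(Y, Yl, Xl)) / n
    return math.log(var_r / var_f)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(500)]
# y is driven by the previous value of x plus a little noise
y = [0.0] + [0.8 * x[t - 1] + 0.1 * random.gauss(0, 1) for t in range(1, 500)]
print(gc_order1(x, y))  # clearly positive: x Granger-causes y here
```

Practical GC analyses use higher model orders and significance testing, and the frequency-domain decomposition of this same quantity is what the project aims to improve upon.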
Data Visualization for Clinical Decision Support
The objective of this project is to discover the most effective and efficient way to visualize our specific,
multimodal, biomedical data. The ultimate goal is to integrate this visualization methodology into clinical
decision support software. This software would be used by doctors and therapists to more easily make decisions
regarding their patients, based on these visualizations. The data I will be working with comes from several
different sources in a variety of forms. These include Kinect motion data, video and/or image
data, accelerometer and gyroscope data, and ECG/EEG data. The Kinect provides data about the position and
movement of various points on the patient's body. The video data will come from cameras monitoring the
patient. Mobile phones will provide the gyroscope and accelerometer data, which track the direction the
phone is pointing and its movement in three dimensions. The EEG and ECG data measure the electrical
activity of the patient's brain and heart, respectively. All the data will also have a timestamp component.
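Because every modality carries a timestamp, a common preprocessing step before joint visualization is aligning streams sampled at different rates. The sketch below (the sample streams are invented for illustration) pairs each sample from one stream with the nearest-in-time sample from another:

```python
import bisect

def align_nearest(times_a, values_a, times_b, values_b):
    """For each sample of stream A, attach the nearest-in-time sample of B.

    Both time lists must be sorted ascending. Returns (t, a, b) triples.
    """
    aligned = []
    for t, a in zip(times_a, values_a):
        i = bisect.bisect_left(times_b, t)
        # Candidate neighbors: the sample at/after t and the one before it
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times_b)]
        j = min(candidates, key=lambda k: abs(times_b[k] - t))
        aligned.append((t, a, values_b[j]))
    return aligned

# Toy streams: 10 Hz accelerometer magnitudes vs. slower heart-rate readings
acc_t = [0.0, 0.1, 0.2, 0.3]
acc_v = [1.0, 1.2, 0.9, 1.1]
hr_t = [0.0, 0.25]
hr_v = [72, 74]
aligned = align_nearest(acc_t, acc_v, hr_t, hr_v)
print(aligned)
```

Once the streams share a common time axis like this, a visualization can stack the modalities against one clock so clinicians can see, for example, motion and ECG events together.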
Medical imaging indexing and search
Images are ubiquitous in biomedicine, as they play a vital role in medical diagnosis, clinical treatment, and biomedical
research/education. The ultimate goal of this project is to research, develop, evaluate, and demonstrate a data-intensive
and scalable intelligent medical image modeling and retrieval system with the capacity to find the most clinically relevant
images to support clinical decision making during diagnosis and treatment. The main challenges of this project take root in
the unique characteristics of medical image data, which are large volume, heterogeneous, and semantically rich. Specific
research tasks in this project include Hadoop and MapReduce-based massive image indexing and retrieval, new multi-modal
(e.g., text and image) feature extraction techniques that expand Bag of Words (BoW) visual feature extraction methods and
Latent Dirichlet Allocation-based text feature extraction methods, new image and textual modeling and parsing techniques
(e.g., cross-modal correlations using canonical correlation analysis (CCA)), and GPU-based feature extraction.
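To illustrate the MapReduce-based indexing idea at a small scale (the "visual word" names are invented, and a real deployment would run the phases on a Hadoop cluster rather than in-process), the sketch below builds an inverted index from quantized BoW visual words to image IDs using explicit map and reduce phases:

```python
from collections import defaultdict

def map_phase(image_id, visual_words):
    """Map: emit (visual_word, image_id) pairs for one image."""
    return [(w, image_id) for w in set(visual_words)]

def reduce_phase(pairs):
    """Reduce: group image IDs by visual word into an inverted index."""
    index = defaultdict(set)
    for word, image_id in pairs:
        index[word].add(image_id)
    return dict(index)

# Toy corpus: each image described by its quantized BoW visual words
corpus = {
    "img_001": ["lesion", "rib", "rib"],
    "img_002": ["lesion", "clavicle"],
    "img_003": ["rib"],
}
pairs = [p for iid, words in corpus.items() for p in map_phase(iid, words)]
index = reduce_phase(pairs)
print(sorted(index["lesion"]))  # images containing the "lesion" visual word
```

Because the map phase is independent per image and the reduce phase groups by key, both parallelize naturally across a cluster, which is what makes the approach scale to massive image collections.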
Risk Analysis for Acute Coronary Syndromes in Chest Pain Patients
The use of nuclear cardiac stress testing has been incorporated into chest pain unit (CPU) evaluation
protocols for the evaluation of patients deemed at low to intermediate risk of acute coronary syndromes
(ACS), defined as unstable angina or acute myocardial infarction (AMI). The objective of this project is
to develop a computer-aided predictive model to investigate the effect of risk factors (e.g., age, sex,
cardiac risk factors) on the incidence of ACS, for the purpose of developing a tool that may assist
physicians in predicting ACS in chest pain patients.
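A predictive model of this kind is often a logistic regression over the risk factors. The minimal pure-Python sketch below, fitted by gradient descent on fabricated toy data (purely to show the mechanics, not to suggest clinical results), shows how such a model maps a risk-factor profile to an event probability:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Stochastic gradient-descent logistic regression; returns (weights, bias)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Fabricated toy data: [age_normalized, num_cardiac_risk_factors]; 1 = ACS
X = [[0.2, 0], [0.3, 1], [0.8, 2], [0.9, 3], [0.4, 0], [0.7, 3]]
y = [0, 0, 1, 1, 0, 1]
w, b = fit_logistic(X, y)

risk = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.85, 2])) + b)
print(round(risk, 2))  # predicted ACS probability for a high-risk profile
```

In practice such a model would be trained on real CPU patient data with proper validation and calibration before it could assist physicians.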