NuiCapture Captures, Collects Data On Body Landmarks
May 9, 2012


Connie K. Ho for

Imagine a tool that can record, export, visualize, and analyze Microsoft Kinect sensor data. NuiCapture, whose name derives from "natural user interface" and which was recently released by Cadavid Concepts, can do just that. It's able to efficiently capture and process Kinect data for analysis, assisting those working in research and industry-related jobs.

Cadavid Concepts, a Miami-based research and development company, works in the areas of computer vision, pattern recognition, and image processing.

“NuiCapture was inspired by the need of both my colleagues and I for software capable of synchronously recording from multiple Kinect sensors, and then exporting that data automatically,” remarked Dr. Steven Cadavid, Chief Executive Officer and Director of Computer Vision at Cadavid Concepts.

Kinect was developed by Microsoft as a gaming console. It's a system that learns the shape of the body and extracts different landmarks such as the head, torso, hand, and wrist. Cadavid and his team have been developing nuiCapture for over a year and have gone through several stages of development.

“There are several reasons why nuiCapture is important for psychology and other applications,” said Cadavid. “For one, it can extract information about body control.”

Microsoft Kinect is groundbreaking technology. Later this month, according to Cadavid, Microsoft is scheduled to release a new set of tools that can extract not only body landmarks but also facial landmarks and individual fingers.

“In terms of tracking people in uncontrollable environments, it's very new technology,” said Cadavid in reference to Microsoft's Kinect program.

The current version of nuiCapture focuses on data capture, exportation, and visualization. First, the user can record the depth, color, skeleton, and audio streams of multiple Kinect sensors concurrently. Before and during a capture session, the user can also preview and adjust the viewing angle of each Kinect sensor to better record data. Recorded capture sessions can then be exported as image sequences/compressed AVI files, Matlab (MAT) files, and XML files. NuiCapture automatically exports the MAT and XML files, seamlessly importing the raw Kinect sensor data into third-party analysis software tools. Multiple capture sessions can be batch processed.
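The XML export path can be illustrated with a short sketch. The element and attribute names below are hypothetical stand-ins, since the article does not document nuiCapture's actual export schema; the point is how third-party analysis code might consume such a file using only Python's standard library.

```python
# Hypothetical sketch of consuming an XML skeleton export in analysis code.
# The element/attribute names ("session", "frame", "joint") are illustrative
# assumptions, not nuiCapture's documented schema.
import xml.etree.ElementTree as ET

# A minimal stand-in for one exported frame of skeleton data.
xml_data = """
<session>
  <frame timestamp="0.033">
    <joint name="Head" x="0.01" y="0.62" z="2.10"/>
    <joint name="HandRight" x="0.25" y="0.10" z="1.95"/>
  </frame>
</session>
"""

root = ET.fromstring(xml_data)
for frame in root.iter("frame"):
    t = float(frame.get("timestamp"))
    # Map each joint name to its (x, y, z) position in meters.
    joints = {j.get("name"): (float(j.get("x")),
                              float(j.get("y")),
                              float(j.get("z")))
              for j in frame.iter("joint")}
    print(t, joints["Head"])
```

Because the export is plain XML, the same data can be read just as easily from Matlab, R, or any other analysis environment.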

“Before nuiCapture, there was no streamlined way of collecting that raw data and converting it into a format for researchers,” explained Cadavid on the novelty of nuiCapture.

In terms of collecting data, four modalities are extracted from the Kinect sensor: color imagery, depth imagery (the 3-D structure of the scene), the skeletal component (body landmarks of the subject), and audio.
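The relationship between the depth modality and 3-D structure can be sketched with a standard pinhole back-projection: each depth pixel is lifted into a 3-D point using the camera's focal length and principal point. The intrinsic parameters below are commonly cited approximations for the Kinect v1 depth camera, not values taken from nuiCapture.

```python
# Sketch: back-projecting a depth image into a 3-D point cloud with a
# pinhole camera model. Intrinsics are approximate Kinect v1 values.
import numpy as np

FX = FY = 571.0        # focal length in pixels (approximate)
CX, CY = 320.0, 240.0  # principal point of the 640x480 depth image

def depth_to_points(depth_m):
    """Convert a (480, 640) depth image in meters to an (N, 3) point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX  # horizontal offset scaled by depth
    y = (v - CY) * z / FY  # vertical offset scaled by depth
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth reading

depth = np.full((480, 640), 2.0)  # flat wall two meters away
points = depth_to_points(depth)
print(points.shape)  # (307200, 3)
```

This is the kind of transformation that lets a viewer render the depth stream as a rotatable 3-D scene rather than a flat grayscale image.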

“With the built-in 3D media player, you can view any of the Kinect sensor modalities independently or merge them into a single view,” noted Cadavid. “With the 3-D rotation and scaling capabilities, you can look at the different details of the human model, capture the entire background and room or just the human in the scene. There's a lot of flexibility with what you want to see in detail.”

Lastly, the visualization component allows the user to view specific locations within the sessions and play around with the images in 3-D to see the data from the different viewpoints.

“The skeleton is overlaid on the color and depth images, giving you a sense of where the landmarks are extracted from the body,” described Cadavid.

Past studies, including one done by the University of Minnesota and Ohio Northern University, highlight how computer vision systems incorporate cameras and sensors to track children from multiple viewpoints over a period of time. Researchers in that particular study hoped that they would be able to observe children and detect signs of developmental disorders. The group used Microsoft's Kinect sensors due to their low cost and ease of use.

“Most often, the behavior of such at-risk children deviate in very subtle ways from that of a normal child; correct diagnosis of which requires prolonged and continuous monitoring of their activities by a clinician, which is a difficult and time intensive task,” wrote the researchers in the report. “As a result, the development of automation tools for assisting in such monitoring activities will be an important step towards effective utilization of the diagnostic resources.”

Each session involved a group of ten children between the ages of three and five. The kids were allowed to move freely around the classroom while the Kinect sensors recorded their movement. With the study, the researchers could better understand the hyperactivity of the participants.

“Tracking the motion of the subjects is an important step in behavior analysis,” noted the researchers in the article. “For example, one of the behavioral cues that medical personnel will be interested in looking at in a diagnostic session will be the hyperactivity of children. Thus robust motion tracking is essential.”

With nuiCapture, University of Miami Professor Dr. Daniel Messinger has also been able to capture the interactions between mothers and their children to assist his research on autism.

“One way in which kids express emotion is how they move their bodies with their parents,” remarked Messinger. “NuiCapture lets us hone in on our activities. It's been a terrific collaboration for us.”

Messinger, whose five strands of research focus on early social and emotional development, has studied kids at risk for being on the autism spectrum.

“Autism involves interaction, problems with interaction, non-verbal behavior,” said Messinger. “NuiCapture may help us understand that better.”

Messinger believes that the program has helped him streamline his research on autism and has been beneficial in the coding process.

“It offers a lot of opportunity to better understand interactions,” explained Messinger, who has worked with the program over the past six months. “It can take a long time to code interactions and this allows us more time to think about the data.”

Apart from studies of autism and psychology-related projects, Cadavid believes the technology can be used in other areas such as biometrics, human-computer interaction, medicine, and robotics. NuiCapture can determine whether a person is alert from skeletal data. It can also be useful in home-care medical facilities, capturing abnormalities in the skeleton and alerting medical professionals if a patient has fallen down.
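A fall alert of the kind described could, in principle, be built on top of an exported skeleton stream. The heuristic below is purely illustrative — the joint choice, threshold, and window are assumptions for the sketch, not nuiCapture features: flag a rapid drop in the tracked head joint's height.

```python
# Illustrative fall-detection heuristic over a skeleton stream: alert when
# the head joint's height drops sharply within a short window. Threshold
# and window values are assumptions, not nuiCapture parameters.
def detect_fall(head_heights_m, drop_threshold_m=0.8, window=15):
    """Return the frame index where the head height has dropped by more
    than drop_threshold_m over `window` frames (0.5 s at 30 fps), or None."""
    for i in range(len(head_heights_m) - window):
        if head_heights_m[i] - head_heights_m[i + window] > drop_threshold_m:
            return i + window
    return None

# One second of standing (head ~1.7 m), then head near floor level.
heights = [1.7] * 30 + [0.2] * 30
print(detect_fall(heights))  # → 30
```

In a real deployment the alert logic would need smoothing and a check that the person stays down, but the skeleton stream provides exactly the per-frame landmark positions such a monitor requires.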

“There's a diverse set of applications,” said Cadavid. “It's applicable for any kind of domain where you have monitoring of people and the need to understand the actions of different people.”