AI Can Help Us Understand Animal Body Language

Dec. 22, 2018 | 9:20:00
Artificial Intelligence
AI is remarkable: it helps people make sense of vast amounts of information, image quality is improving by the day, and combining these new technologies opens up new possibilities.
As we've all seen, Hollywood films use motion-capture sensors to record actors' movements and turn them into giants or beasts. In real life, the Princeton Neuroscience Institute (PNI) has developed a set of AI tools that, after just a few minutes of training, can automatically track an animal's movements in video and help researchers study animal body language.
The tool, called LEAP, needs only a few minutes of training to track an animal's body movements in video with high precision, and it requires no physical markers or labels.
Mala Murthy, Professor of Molecular Biology, said, "This method can be used broadly across animal model systems to measure the behavior of animals with genetic mutations or after drug treatments."
Although the paper describing it is not scheduled to appear in the journal Nature Methods until January 21st, the software has already been adopted by many laboratories since an open version was released in May.
PNI graduate student Talmo Pereira said the tool is very flexible and can, in principle, be used on any video data. "It works by labeling a few points in a handful of videos, after which the neural network does the rest. We provide an easy-to-use interface that lets anyone apply LEAP to their own videos without any prior programming knowledge."
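The workflow Pereira describes can be sketched in a toy form. The snippet below is a hypothetical illustration, not LEAP's actual API: a user "labels" keypoints on a few frames, and a nearest-neighbor lookup stands in for the trained neural network that predicts keypoints on new frames.

```python
import numpy as np

def train(frames, keypoints):
    """Store the labelled frames; a real tool would fit a CNN here."""
    return {"frames": np.asarray(frames, dtype=float),
            "keypoints": np.asarray(keypoints, dtype=float)}

def predict(model, frame):
    """Return the keypoints of the most similar labelled frame."""
    diffs = model["frames"] - np.asarray(frame, dtype=float)
    dists = (diffs ** 2).sum(axis=(1, 2))   # pixel-wise distance per frame
    return model["keypoints"][dists.argmin()]

# Three tiny 4x4 "frames", each with two labelled (row, col) body parts.
frames = [np.eye(4) * i for i in (1, 2, 3)]
keypoints = [[(0, 0), (3, 3)], [(1, 1), (2, 2)], [(0, 3), (3, 0)]]

model = train(frames, keypoints)
pred = predict(model, np.eye(4) * 2.1)   # most similar to the second frame
print(pred.tolist())                     # → [[1.0, 1.0], [2.0, 2.0]]
```

The point of the sketch is the division of labor: the human supplies a handful of annotated frames, and the model generalizes to the rest of the footage.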
When asked whether LEAP works as well on large mammals as on flies and mice, Pereira immediately pointed to footage of a walking giraffe taken at the Mpala Research Centre in Kenya. It took less than an hour to label 30 points in the video, after which LEAP could track the giraffe's posture in the remaining footage.

In the past, similar AI tools relied on large amounts of manually annotated training data in order to cope with varied backgrounds and lighting. The development team therefore built LEAP as a system in which users choose training data suited to their own neural networks, rather than being limited to datasets curated by other researchers or companies.
Monica Daley of the Royal Veterinary College (RVC) sees great potential for this work in other fields as well. Much of Daley's research concerns how animals move efficiently across different terrains, and a major challenge is extracting meaningful information from video footage. LEAP will make researchers' work more efficient and more automated than before, letting them focus on studying a wider range of animal behaviors.
By combining LEAP with other quantitative tools, researchers can study so-called body language by observing an animal's physical movements, and the neuroscientists on the team can then link those movements to the underlying neural processes.
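One simple form such a quantitative follow-up could take is deriving behavior labels from the tracked positions. The sketch below is an assumption-laden illustration (the coordinate format and speed thresholds are invented, not LEAP's output): frame-to-frame speed of a tracked point is binned into coarse behavior categories.

```python
import numpy as np

def label_behavior(track, still_thresh=0.5, walk_thresh=3.0):
    """Classify each frame transition by movement speed.

    `track` is a list of (x, y) positions of one tracked body part,
    one entry per frame. Thresholds are illustrative, not from LEAP.
    """
    track = np.asarray(track, dtype=float)
    speeds = np.linalg.norm(np.diff(track, axis=0), axis=1)  # per-frame displacement
    labels = np.where(speeds < still_thresh, "resting",
             np.where(speeds < walk_thresh, "walking", "running"))
    return labels.tolist()

# Positions of one tracked animal over five frames.
track = [(0, 0), (0.1, 0), (1, 1), (2, 2), (10, 2)]
print(label_behavior(track))  # → ['resting', 'walking', 'walking', 'running']
```

Real pipelines would use richer features (joint angles, limb coordination) and learned classifiers, but the structure is the same: pose tracks in, behavior labels out.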
Over the past five years, neuroscience has made great strides in observing and manipulating brain activity, and with LEAP's help, the classification of animal behavior can become even more automated. Joshua Shaevitz, one of the authors, believes this will not only let researchers better understand how the brain produces behavior, but also help explore future diagnostics and treatments.