
Leafsnap Builds Database Of Trees With Smartphone Tech

June 8, 2011

A new smartphone application is now available that allows any budding botanist to instantly identify a tree's species by snapping a picture of one of its leaves, reports the Associated Press (AP).

The free iPhone and iPad app, called Leafsnap, instantly searches a growing library of leaf images amassed by the Smithsonian Institution.

Seconds after taking the photo, the program returns a likely species name, high-resolution photographs and information on the tree's flowers, fruit, seeds and bark. Users confirm the identification and share their findings with the app's growing database, helping map the tree population one mobile phone at a time.

Leafsnap debuted last month with coverage of all the trees in New York's Central Park and Washington's Rock Creek Park.

It was downloaded more than 150,000 times in its first month, and its creators expect use to keep growing as the app expands to Android phones. By this summer it will include all the tree species of the Northeast, and it will eventually cover all the trees of North America.

John Kress, a Smithsonian research botanist who created the app with engineers from Columbia University and the University of Maryland, originally conceived of the idea in 2003 as a portable database aid for scientists to discover new species in unknown habitats.

The project evolved, though, with the emergence of smartphones as a new way for the average enthusiast to contribute to research. “This is going to be able to populate a database of every tree in the United States,” Kress explained. “I mean that’s millions and millions and millions of trees, so that would be really neat.”

To identify a tree, the app works best when users photograph a leaf placed against a white background. Engineers adapted facial-recognition technology to devise an algorithm that identifies a leaf by its shape and features.

After the photo is uploaded to a server, the app returns a ranked list of the most likely tree species, along with other characteristics to help confirm the tree's identity.
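To make the general idea concrete, here is a minimal sketch of that kind of pipeline: segment a leaf photographed against a white background, summarize its silhouette with a simple shape descriptor, and rank reference species by nearest-neighbor distance. This is not Leafsnap's actual code; the function names and the descriptor choice (Hu moments) are illustrative assumptions.

    # Illustrative sketch only; not Leafsnap's algorithm.
    import cv2
    import numpy as np

    def leaf_descriptor(image_path: str) -> np.ndarray:
        """Return a shape descriptor for a leaf shot against a white background."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        # Otsu thresholding separates the dark leaf from the bright background.
        _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        # Hu moments give a compact, rotation- and scale-invariant silhouette summary.
        hu = cv2.HuMoments(cv2.moments(mask)).flatten()
        # Log-scale the moments so their magnitudes are comparable.
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

    def rank_species(query: np.ndarray,
                     library: dict[str, np.ndarray],
                     top_k: int = 5) -> list[tuple[str, float]]:
        """Rank labeled reference leaves by distance to the query descriptor."""
        scored = [(name, float(np.linalg.norm(query - desc)))
                  for name, desc in library.items()]
        return sorted(scored, key=lambda pair: pair[1])[:top_k]

    # Usage sketch: compare one photo against a small labeled reference library.
    # library = {"Quercus rubra": leaf_descriptor("red_oak.jpg"),
    #            "Acer saccharum": leaf_descriptor("sugar_maple.jpg")}
    # print(rank_species(leaf_descriptor("unknown_leaf.jpg"), library))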

The whole concept relies on a complete database to power the application. The software engineers started by photographing leaves from the Smithsonian’s vast collection of specimens. It became clear, though, that they would need images of living specimens for the application to work correctly.

A nonprofit group called Finding Species was asked to capture the initial thousands of leaf images for the app, The Washington Post reports. Beyond finding answers about the world of trees, even casual users can contribute to scientific research: images and tree identifications, along with mapping information from the phone, are automatically sent to Leafsnap's database.

An iPad version of Leafsnap also includes a feature called “Nearby Species” to show all the trees that have been labeled by others near a user’s location. Scientists explain that the data could eventually be used to map and monitor the growth and decline of tree populations.
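A feature like "Nearby Species" boils down to a simple geospatial query over the crowd-sourced records. The sketch below, a hypothetical illustration rather than the app's real implementation, filters labeled trees by great-circle distance from the user's location; the data layout and radius are assumptions.

    # Hypothetical "Nearby Species" style query; not the app's real code.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometers."""
        r = 6371.0  # mean Earth radius in km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def nearby_species(user_lat, user_lon, labeled_trees, radius_km=2.0):
        """labeled_trees: iterable of (species_name, lat, lon) tuples."""
        return sorted({species for species, lat, lon in labeled_trees
                       if haversine_km(user_lat, user_lon, lat, lon) <= radius_km})

    # Example: trees labeled around Central Park, queried from a nearby street corner.
    # trees = [("Ulmus americana", 40.7712, -73.9742), ("Quercus alba", 40.7794, -73.9632)]
    # print(nearby_species(40.7676, -73.9718, trees))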

More science apps could be on the way as well. Professor Peter Belhumeur, who directs Columbia University's Laboratory for the Study of Visual Appearance, and his son, William, are already thinking about similar apps that could identify butterflies and other critters, Kress said.

Leafsnap cost about $2.5 million to develop, funded primarily by a grant from the National Science Foundation. It will cost another $1 million to expand it within the next 18 months to cover all the trees of the United States, involving about 800 species.

Scientists also are getting requests to expand the app’s capabilities to cover trees in France, Morocco, Thailand and elsewhere. “We want to spread this, not across the United States, but across the world,” Belhumeur said. It’s just a matter of collecting and photographing all the tree species native to a region.
