
ROC GRANTS

PROJECT SPOTLIGHT

Wild Me: Wildbook-Computer Vision for Dolphins

Wild Me specializes in data management and computer vision solutions to modernize wildlife research, especially where citizen science can be integrated to increase data collection and improve curation. We seek to advance the mark-recapture monitoring of select dolphin populations by developing software that identifies individuals from fin shapes extracted from large volumes of photos and tracks those populations over time. This work builds on two completed NSF grants (see IBEIS.org) to integrate data management (see wildbook.org) and computer vision in a reusable, open source platform for wildlife monitoring.

Under Waitt Foundation funding, we will expand our existing computer vision infrastructure in conjunction with the Department of Computer Science at Rensselaer Polytechnic Institute (RPI/Chuck Stewart) and jointly create a prototype algorithm for automatically identifying dolphins from pictures of their dorsal fins. Existing data from the Chicago Zoological Society (PI Reny Tyson), Duke University (PI Kim Urian), and Massey University (PI Krista Rankmore) will be used to train the computer vision system for U.S. bottlenose dolphins and New Zealand common dolphins. These fins are distinguishable “by eye” by the curves and notches of their trailing edges.

In summer 2016, we began experimental algorithm analysis by considering two different approaches. The first, used successfully on white shark fins, is based on extracting and matching “descriptor” vectors that characterize the appearance of small segments of the fin. The other, which we developed recently for matching humpback whale flukes, is based on matching a long sequence of curvature measures extracted along the edge of the fluke (see https://www.youtube.com/watch?v=JhIcP4K-M6c). Based on initial results, we will pursue the latter approach first.
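As a rough illustration of the curvature-based approach (a minimal sketch, not the project's actual implementation), the Python snippet below resamples a trailing-edge contour by arc length, computes a windowed turning-angle curvature, and compares two fins by the distance between their curvature sequences. The function names, window size, and the simple L2 comparison are illustrative assumptions; the production algorithm, like the humpback fluke matcher, would likely use more refined curvature measures and sequence alignment.

```python
import numpy as np

def resample_edge(points, n=128):
    """Resample a trailing-edge polyline (Nx2 array) to n points evenly
    spaced by arc length, so fins photographed at different scales
    become directly comparable."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, arc[-1], n)
    x = np.interp(t, arc, points[:, 0])
    y = np.interp(t, arc, points[:, 1])
    return np.stack([x, y], axis=1)

def turning_angle_curvature(points, window=5):
    """Curvature proxy at each point: the turning angle between vectors
    to neighbors `window` samples away.  Measuring over a window makes
    this less sensitive to pixel noise than a differential
    (second-derivative) curvature."""
    n = len(points)
    curv = np.zeros(n)
    for i in range(window, n - window):
        v1 = points[i] - points[i - window]
        v2 = points[i + window] - points[i]
        denom = np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9
        curv[i] = np.arccos(np.clip(np.dot(v1, v2) / denom, -1.0, 1.0))
    return curv

def fin_distance(edge_a, edge_b, n=128, window=5):
    """Compare two trailing edges by the L2 distance between their
    curvature sequences; smaller means more similar."""
    ca = turning_angle_curvature(resample_edge(edge_a, n), window)
    cb = turning_angle_curvature(resample_edge(edge_b, n), window)
    return float(np.linalg.norm(ca - cb))
```

Ranking every catalogued fin by this kind of distance for a query is what ultimately produces the ranked list of candidate matches described below.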

Our work on the prototype will proceed in the following steps:
Import existing data sets from collaborators into a Wildbook to create a baseline of individual dolphins and exemplar fin images.

Perform automated image segmentation of the dorsal fin (a rough segmentation sketch appears after this list).

Create novel digital curvature measures that are less affected by noise than traditional differential measures.

Design a curvature matching algorithm, similar to our work on humpback flukes.

Implement encounter-based matching: from a time sequence of fin images, extract the most distinguishable views as the basis for fin representation and matching.
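For the segmentation step above, a very rough starting point is sketched below, assuming a clean, high-contrast photograph and OpenCV 4.x; the thresholding strategy and parameters are placeholders rather than the project's actual pipeline, and cluttered real-world backgrounds would call for a learned segmentation model.

```python
import cv2

def extract_fin_contour(image_path):
    """Very rough dorsal-fin segmentation: Otsu-threshold the grayscale
    image and return the largest external contour as the candidate fin
    outline, an Nx2 array of (x, y) points."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        raise ValueError("no fin contour found in " + image_path)
    fin = max(contours, key=cv2.contourArea)
    return fin.reshape(-1, 2)
```

Isolating the trailing edge (fin tip to fin base) from this outline would be a further step before the curvature measures are computed.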

The resulting algorithm will produce a ranked list of the most similar fins for each query image sequence. Our accuracy goal for this prototype is for a match from the correct dolphin to be top-ranked 60% of the time, in the top five 75% of the time, and in the top ten 90% of the time. The prototype should complete the ranking for each query within 5 minutes.
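To make these accuracy targets concrete, the hypothetical helper below scores a batch of queries against their ranked candidate lists and reports top-1, top-5, and top-10 hit rates; it assumes each query encounter has a known ground-truth individual ID.

```python
def top_k_hit_rates(results, ks=(1, 5, 10)):
    """`results` is a list of (true_id, ranked_ids) pairs, where
    `ranked_ids` is the algorithm's ranked list of candidate individual
    IDs for one query encounter.  Returns, for each k, the fraction of
    queries whose true individual appears within the top k."""
    rates = {}
    for k in ks:
        hits = sum(1 for true_id, ranked in results if true_id in ranked[:k])
        rates[k] = hits / len(results) if results else 0.0
    return rates

# The prototype's goals correspond to rates[1] >= 0.60,
# rates[5] >= 0.75, and rates[10] >= 0.90.
```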

Collaborating PIs will receive a Wildbook with their data integrated and a computer vision system that allows them to add new data and rapidly receive accurate computer vision matches of their marked individuals based on fin photographs. All work will be delivered as open source and distributed freely for broader use. RPI is providing tuition matching to magnify the student time available for the research phase of this grant.
