{"id":784627,"date":"2021-12-17T10:04:48","date_gmt":"2021-12-17T18:04:48","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=784627"},"modified":"2024-06-06T18:56:29","modified_gmt":"2024-06-07T01:56:29","slug":"bioacoustics","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/bioacoustics\/","title":{"rendered":"Bioacoustics"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background- card-background--full-bleed\">\n\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"720\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720.jpg\" class=\"attachment-full size-full\" alt=\"bioacoustics spectrogram\" style=\"\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720.jpg 1920w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720-300x113.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720-1024x384.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720-768x288.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720-1536x576.jpg 1536w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720-1600x600.jpg 1600w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_spectrogram_header_1920x720-240x90.jpg 240w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/>\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div class=\"card-foreground d-flex 
mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div class=\"container d-flex mt-md-n5 my-lg-5 align-self-center\">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div class=\"card-body \">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/group\/ai-for-good-research-lab\/\" class=\"icon-link icon-link--reverse mb-2\" data-bi-cN=\"AI For Good Lab\">\n\t\t\t\t\t\t\t\t\t<span class=\"c-glyph glyph-chevron-left\" aria-hidden=\"true\"><\/span>\n\t\t\t\t\t\t\t\t\tAI For Good Lab\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 class=\"wp-block-heading\" id=\"bioacoustics\">Bioacoustics<\/h1>\n\n\n\n<p>Leveraging AI to process audio recordings of diverse species<\/p>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<p>Bioacoustics is a cross-disciplinary science that combines biology and acoustics. Usually, it refers to the investigation of sound production, dispersion and reception in animals (including humans). In our research lab, we collaborate with conservation organizations and research labs to leverage machine learning and deep learning models to automatically process and analyze large volumes of audio recordings. 
Here are a few of the bioacoustics research projects we are working on:<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-1024x576.jpg\" alt=\"close-up of a macaw\" class=\"wp-image-1026315\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/McCaw-Amazon_Correct-size.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 
class=\"wp-block-heading\" id=\"exploring-call-clusters-for-multiple-animal-species\">The sounds of the rainforest<\/h4>\n\n\n\n<p>Listen and learn about the Amazon\u2019s call for conservation through bioacoustics. In the heart of South America lies the Amazon Rainforest, a crucial component of the Earth\u2019s ecological balance. Spanning nine countries and covering over 5.5 million square kilometers, the Amazon is a critical indicator of the planet\u2019s health.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/unlocked.microsoft.com\/bioacoustics\/\">Listen to our story<\/a><\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788.jpg\" alt=\"AI for Good - a green frog peering out from behind a dark green leaf\" class=\"wp-image-1016463\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-655x368.jpg 655w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/03\/AI4Good_I-6-Amazon_1400x788-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"new-automated-bioacoustics-analysis-platform\">AI-powered conservation for the Amazon<\/h4>\n\n\n\n<p>Deforestation threatens our planet&#8217;s health, with the Amazon losing nearly 2 million hectares in 2022 alone. Project Guacamaya, in partnership with the Humboldt Institute, uses AI to combat this crisis: satellite analysis detects illegal deforestation activities, AI-enhanced camera traps streamline wildlife monitoring, and <strong>bioacoustics aids species identification.<\/strong> This approach offers efficient monitoring to prioritize action where it&#8217;s most needed.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/news.microsoft.com\/source\/latam\/features\/ai\/amazon-ai-rainforest-deforestation\/?lang=en\">AI and rainforest preservation article<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.linkedin.com\/pulse\/using-ai-protect-amazon-juan-m-lavista-ferres\/\">Blog<\/a><\/div>\n<\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-top is-layout-flex 
wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-1024x576.jpg\" alt=\"animals with acoustic signals\" class=\"wp-image-1026207\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/Bioacustics_Correct-size.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"exploring-call-clusters-for-multiple-animal-species\">Zero-shot transfer for wildlife 
bioacoustics detection<\/h4>\n\n\n\n<p>Building supervised learning models requires manual data labeling, which is slow, expensive, and error-prone. With self-supervised learning models, we can learn effective visual representations without human supervision, allowing us to explore clusters of animal calls from different species, or different call types within the same species, and to discover new sounds of interest.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/zero-shot-transfer-for-wildlife-bioacoustics-detection\/\">Publication<\/a><\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-full is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1400\" height=\"788\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2.jpg\" alt=\"acoustics annotation tool for wildlife\" class=\"wp-image-1026210\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2.jpg 1400w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-655x368.jpg 
655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2024\/04\/ann_tool_Correct-size_op2-1280x720.jpg 1280w\" sizes=\"auto, (max-width: 1400px) 100vw, 1400px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"new-automated-bioacoustics-analysis-platform\">CLAP bioacoustics annotation tool<\/h4>\n\n\n\n<p>We&#8217;ve leveraged CLAP&#8217;s group-level generalization capabilities to develop an efficient bioacoustics annotation tool, akin to MegaDetector, for animal audio detection. By letting annotators focus solely on regions of interest rather than listening to entire recordings or deciphering spectrograms, our tool streamlines the annotation process, drastically reducing the time, effort, and expertise required. The user-friendly interface also makes it quick to identify and annotate positive and negative sounds.<\/p>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-top is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-1024x576.jpg\" alt=\"Beluga whales only live in the Arctic Ocean and its adjoining seas, such as the Bering Sea. 
(Photo by David Merron Photography\/Getty Images)\" class=\"wp-image-783874\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-768x432.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-343x193.jpg 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/10\/Beluga_GettyImages-547332670-scaled-1.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"helping-protect-beluga-whales-with-deep-learning\">Helping protect beluga whales with deep learning<\/h4>\n\n\n\n<p>It&#8217;s been over a decade since the Cook Inlet beluga was listed as endangered in 2008, and the population continues to decline with 
a current population estimate of 328 whales. To help understand and mitigate this problem, our partners put together an acoustic research program to continuously monitor the beluga whale habitat. Using machine learning, our researchers are helping to reduce the labor and time required for annotation and to increase the accuracy of classification results.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/beluga-whale-acoustic-signal-classification-using-deep-learning-neural-network-models\/\">Publication<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/beluga-whale-acoustic-signal-classification-using-deep-learning-neural-network-models\/\">Get beluga sounds code<\/a><\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-1024x576.jpg\" alt=\"three colorful birds perched on tree branches\" class=\"wp-image-806221\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-1024x576.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-300x169.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-768x432.jpg 768w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-1066x600.jpg 1066w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-655x368.jpg 655w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-343x193.jpg 343w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-240x135.jpg 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-640x360.jpg 640w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-960x540.jpg 960w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788-1280x720.jpg 1280w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2021\/11\/Bioacoustics_birds-cluster-calls_1400x788.jpg 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"using-bioacoustics-for-multi-species-classification\">Using bioacoustics for multi-species classification<\/h4>\n\n\n\n<p>Acoustic monitoring has gained widespread interest as an ecological tool for wildlife population assessment, conservation, and biodiversity research. Acoustic analyses are often done manually, which can be limiting. To enable analysis of entire datasets, accurate, automated sound recognition methods are paramount. In our research, we evaluate deep convolutional neural networks for classifying the calls of 24 bird and amphibian species. 
This multi-label multi-species classification methodology and its framework can be easily adapted to other acoustic classification problems.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.microsoft.com\/en-us\/research\/publication\/multispecies-bioacoustic-classification-using-\">Publication<\/a><\/div>\n\n\n\n<div class=\"wp-block-button is-style-pill\"><a data-bi-type=\"button\" class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/github.com\/microsoft\/Multi_Species_Bioacoustic_Classification\">Get bioacoustic classification code<\/a><\/div>\n<\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flow wp-block-group-is-layout-flow\">\n<h4 class=\"wp-block-heading\" id=\"explore-more\">Explore More<\/h4>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Project<\/span>\n\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/project\/accelerating-biodiversity-surveys\/\" data-bi-cN=\"Accelerating Biodiversity Surveys with AI\" data-external-link=\"false\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>Accelerating Biodiversity Surveys with AI<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-chevron-right\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n<\/div>\n\n\n\n<div 
class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"annotations \" data-bi-aN=\"citation\">\n\t<article class=\"annotations__list card depth-16 bg-body p-4 \">\n\t\t<div class=\"annotations__list-item\">\n\t\t\t\t\t\t<span class=\"annotations__type d-block text-uppercase font-weight-semibold text-neutral-300 small\">Collection<\/span>\n\t\t\t<a href=\"https:\/\/www.microsoft.com\/en-us\/ai\/ai-for-earth-tech-resources\" data-bi-cN=\"Open-source code resources\" data-external-link=\"false\" data-bi-aN=\"citation\" data-bi-type=\"annotated-link\" class=\"annotations__link font-weight-semibold text-decoration-none\"><span>Open-source code resources<\/span>&nbsp;<span class=\"glyph-in-link glyph-append glyph-append-chevron-right\" aria-hidden=\"true\"><\/span><\/a>\t\t\t\t\t<\/div>\n\t<\/article>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n\n\n","protected":false},"excerpt":{"rendered":"<p>Bioacoustics is a cross-disciplinary science that combines biology and acoustics. Usually, it refers to the investigation of sound production, dispersion and reception in animals (including humans). 
In our research lab, we collaborate with conservation organizations and research labs to leverage machine learning and deep learning models to automatically process and analyze large volumes of audio recordings.<\/p>\n","protected":false},"featured_media":798307,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13556,243062,198583],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-784627","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-audio-acoustics","msr-research-area-ecology-environment","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[696730,696769,696796,757309,757414,769912,787663,967266],"related-downloads":[],"related-videos":[908409],"related-groups":[696544],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Zhongqi Miao","user_id":42462,"people_section":"Section name 0","alias":"zhongqimiao"},{"type":"user_nicename","display_name":"Meghana Kshirsagar","user_id":39736,"people_section":"Section name 0","alias":"mekshirs"},{"type":"user_nicename","display_name":"Shahrzad Gholami","user_id":39757,"people_section":"Section name 0","alias":"sgholami"},{"type":"guest","display_name":"Jane Wang","user_id":798076,"people_section":"Section name 0","alias":""},{"type":"user_nicename","display_name":"Juan M. 
Lavista Ferres","user_id":39552,"people_section":"Section name 0","alias":"jlavista"},{"type":"user_nicename","display_name":"Rahul Dodhia","user_id":41401,"people_section":"Section name 0","alias":"radodhia"},{"type":"user_nicename","display_name":"Thomas Roca","user_id":41416,"people_section":"Section name 0","alias":"throca"},{"type":"user_nicename","display_name":"Darren Tanner","user_id":41404,"people_section":"Section name 0","alias":"datanner"},{"type":"guest","display_name":"Andres Hernandez Celis","user_id":1026321,"people_section":"Section name 0","alias":""},{"type":"guest","display_name":"Luisa  Vargas Daza","user_id":1026324,"people_section":"Section name 0","alias":""}],"msr_research_lab":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/784627","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":42,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/784627\/revisions"}],"predecessor-version":[{"id":1121169,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/784627\/revisions\/1121169"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/798307"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=784627"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=784627"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=784627"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-
us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=784627"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=784627"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}