Machines vs the human eye: UH Hilo students learn and apply data science skills to reef study

A team of UH Hilo faculty and undergraduate students investigated whether computer vision tools can detect disease on coral reefs as well as the human eye can. The findings? Machines can complement human evaluation.

By Leah Sherwood.

Group photo of class in front of PowerPoint slide with image of underwater coral reef and the words: UH Hilo 'Ike Wai Summer Research Experience.
Group photo of faculty and students after presenting their results at UH Hilo. Front from left: Brittany Wells, Sofia Ferreira, Danielle Wilde, Micah Marshall, Chad Kinoshita, and Travis Mandel. Back: John Burns, Shane Murphy, Alexander Spengler, Drew Gotshalk, Grady Weyenberg, and Nicholas Del Moral. Photo by Robert Pelayo.

As part of a five-week summer research experience, a team of University of Hawai‘i at Hilo faculty and undergraduate students investigated whether computer vision tools can detect disease on coral reefs as well as the human eye can. The team compared the accuracy of visual surveys conducted in the water with that of surveys based on photographs of the reef.

The main goal of the project was to develop advanced data science tools to answer cross-disciplinary environmental questions. Students studying data science at UH Hilo learn to acquire, archive, and extract knowledge from data in its many forms in order to solve problems. UH Hilo’s Certificate in Data Science program also focuses on teaching students how to communicate effectively with peers and the public about the underlying structure and patterns found in the data they generate.

UH Hilo undergraduate students with a background in marine science and computer science were encouraged to apply for the summer program. The research course had both field and classroom components and included a blend of marine science, statistics, and computer science activities. Students were challenged to apply advanced statistics, master a variety of deep learning methods, and learn about the health and structure of underwater coral communities.

The summer program was co-led by John Burns, assistant professor of marine science, Travis Mandel, assistant professor of computer science and engineering, and Grady Weyenberg, assistant professor of mathematics. “Combining professors and students from multiple disciplines provided creative insight and a much greater capacity to solve problems, which ultimately enabled us to develop a unique strategy to test if computers have the capacity to assist with detecting diseases and anomalies on coral reefs,” explains Burns, director of the Multiscale Environmental Graphical Analysis (MEGA) Laboratory at UH Hilo.

Machines vs the human eye

Burns explains the need to understand the complementary differences between computer vision tools and the human eye in this type of scientific inquiry. “Human visual surveys are seven percent more sensitive than computers at detecting coral diseases, but there is a lot of overlap. The moral of the story is that your eyes are a really great detection system but it doesn’t rule out the utility of a digital assessment.”

Diver underwater with snorkel gear, writing down data on a clipboard.
A student conducts a reef survey, June 2019. Photo by John Burns.

Weyenberg explains that the research conducted over the summer addressed the question of how researchers should utilize their time in the field: surveying corals for disease with the naked eye or collecting photographs to analyze later.

“There was no literature and no quantitative comparison of the two methods on this question, and people do argue both ways,” says Weyenberg, who led the statistical component of the class. “We classified each coral colony as either having or not having a particular disease diagnosis using both in-water surveys and by looking at photographs of the coral. This allowed us to estimate the ability of both methods to make accurate diagnoses, without having to know with certainty if the coral is healthy or not.”
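To make the comparison concrete, consider paired yes/no calls for the same colonies, one made in the water and one made from a photograph. The short Python sketch below cross-tabulates such calls and computes raw agreement and Cohen’s kappa on invented data; it illustrates the kind of paired comparison involved, not the accuracy-estimation procedure Weyenberg describes, and the numbers are placeholders rather than the team’s survey results.

```python
import numpy as np

# Paired yes/no disease calls for the same colonies (invented data,
# not the team's actual survey results).
in_water = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 0])    # 1 = disease seen in the water
from_photo = np.array([1, 0, 0, 0, 1, 0, 0, 1, 0, 1])  # 1 = disease seen in the photograph

# 2x2 cross-tabulation of the two methods.
both = np.sum((in_water == 1) & (from_photo == 1))
only_water = np.sum((in_water == 1) & (from_photo == 0))
only_photo = np.sum((in_water == 0) & (from_photo == 1))
neither = np.sum((in_water == 0) & (from_photo == 0))
n = len(in_water)

# Raw agreement and Cohen's kappa (agreement corrected for chance).
observed = (both + neither) / n
p_water = (both + only_water) / n
p_photo = (both + only_photo) / n
expected = p_water * p_photo + (1 - p_water) * (1 - p_photo)
kappa = (observed - expected) / (1 - expected)

print(f"methods agree on {observed:.0%} of colonies, kappa = {kappa:.2f}")
```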

The marine science and computer science students also paired up to train deep learning systems to perform two different computer vision tasks: object detection (locating regions where disease is present) and image regression (predicting what percentage of coral cover is affected by disease). Comparing the results let the team gauge how accurately and quickly humans performed these tasks relative to machines.
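For readers curious what image regression looks like in practice, here is a minimal sketch of a model that takes a reef photo and outputs a fraction between 0 and 1 for disease cover. It is a generic example built on a standard ResNet backbone with made-up training data, not the architecture or dataset the class actually used.

```python
import torch
import torch.nn as nn
from torchvision import models

# Image regression: given a reef photo, predict the fraction of coral cover
# affected by disease (a number between 0 and 1). Generic sketch only.
backbone = models.resnet18(weights=None)          # pretrained weights could be used instead
backbone.fc = nn.Sequential(
    nn.Linear(backbone.fc.in_features, 1),
    nn.Sigmoid(),                                 # squash the output into [0, 1]
)

criterion = nn.MSELoss()                          # penalize squared error in the predicted fraction
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)

# One training step on a dummy batch of four 224x224 RGB images with
# invented disease fractions standing in for annotated photos.
images = torch.randn(4, 3, 224, 224)
targets = torch.tensor([[0.05], [0.30], [0.00], [0.12]])

predictions = backbone(images)
loss = criterion(predictions, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"batch loss: {loss.item():.4f}")
```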

Teaching machines to recognize coral disease is time-consuming work, especially when the goal is to compare different learning strategies and different methods of annotating the images of coral in the water. Mandel, a machine learning expert, says that was the motivation behind creating small groups to test how well the algorithms learned to correctly diagnose the coral colonies when different annotation methods were used.

“Everyone says ‘let’s automate it!’” Mandel notes. “But they forget that there is a person behind the automation process who has to trace the coral colonies or draw boxes around objects. We also want to know which annotation strategy minimizes human effort while still getting good results on the machine learning side. In marine science most of these annotation methods haven’t been explored, so they don’t know their trade-offs in terms of how much time it takes for humans, how well humans do it, and how much the machine learning can learn from it. That is what we are trying to learn here.”
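Mandel’s point about annotation effort can be made concrete: drawing a box around a colony is fast but coarse, while tracing its outline is slow but tells the learning algorithm exactly which pixels matter. The sketch below writes out both kinds of labels for the same hypothetical colony in COCO-style dictionaries; the coordinates and category are invented for illustration.

```python
# Two ways to annotate the same (hypothetical) coral colony in one image,
# written as COCO-style dictionaries. All coordinates are invented.

# 1) Bounding box: quick to draw, but only says "the disease is somewhere in here".
box_annotation = {
    "image_id": 1,
    "category_id": 1,                      # e.g., a growth anomaly class
    "bbox": [412.0, 230.0, 180.0, 145.0],  # [x, y, width, height] in pixels
}

# 2) Polygon outline: slower to trace, but marks exactly which pixels
#    belong to the affected colony.
polygon_annotation = {
    "image_id": 1,
    "category_id": 1,
    "segmentation": [[412.0, 300.0, 450.0, 232.0, 530.0, 230.0,
                      590.0, 290.0, 560.0, 370.0, 470.0, 372.0]],
}

# The trade-off the class was probing: boxes minimize human effort per image,
# while traced outlines give the model more information to learn from.
```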

Experiential learning

Mandel is pleased that some of the students from his spring 2019 class, Machine Learning for Data Scientists (CS 272), had the chance to see how the material they learned in class is applied in real-world situations. “They are learning these concepts within the context of the research, rather than me lecturing at them. That is what is great about research experiences; you are not just learning an abstract concept that may be useful someday, you are learning it because you need to use it right now.”

Sofia Ferreira, a UH Hilo marine science major, says she enjoyed interacting with students from other majors. “We had to work together because the marine science people know how to survey, but when it comes to incorporating these machine learning methods, we need the computer science students to help with it. This internship made me want to pursue the data science certificate.”

During the field portion of the research, the student divers conducted visual surveys of the coral reefs and then collected imagery of the same locations. This work was completed in Kona and Laehala on Hawai‘i Island, and the resulting data were recorded as binary categories (yes or no) indicating whether each surveyed colony was affected by a disease.

Back in the classroom, students processed imagery of the coral reef study plots into 3D models using computer vision algorithms. The divers then examined the 3D models to verify the accuracy of the digital renderings compared with their observations of the conditions of the coral colonies in the field.
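As a taste of the computer vision behind those 3D models, the sketch below reconstructs 3D points from just two overlapping reef photos using OpenCV. The class would have relied on dedicated photogrammetry workflows over many images; the file names and camera matrix K here are placeholder assumptions, and this shows only the core idea.

```python
import cv2
import numpy as np

# Minimal two-view 3D reconstruction. File names and the camera matrix K
# are placeholder values, not the class's actual setup.
img1 = cv2.imread("reef_photo_1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("reef_photo_2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1200.0, 0.0, 960.0],    # assumed focal length and image center
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])

# Find and match distinctive features that appear in both photos.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Recover the relative camera motion, then triangulate matched points into 3D.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
points_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points_3d = (points_h[:3] / points_h[3]).T   # homogeneous -> x, y, z coordinates

print(f"reconstructed {len(points_3d)} 3D points from {len(good)} matches")
```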

Students sit at computer banks. A trio of large computer screens stands at the front of the class. The professor works at a computer to the left of the students.
During the classroom portion of the course, students processed imagery of the coral reef study plots into 3D models using computer vision algorithms. Photo by John Burns.

Findings

Burns says the results demonstrate that machines can complement the evaluation of coral health by human experts, although the in-water diagnoses have a small advantage.

“The results show that there are cases where photography-based analysis is useful,” says Burns. “For example, if you are trying to go to a deep site, or a site where SCUBA is not feasible, or you want to cover a large spatial area, I think this really warrants utilizing digital tools, knowing that you are going to be a little less sensitive detecting diseases but you are not going to completely miss the identification of any diseases.”


UPDATE MAY 8, 2020: The findings of this study are now published: A Comparison of the Diagnostic Accuracy of in-situ and Digital Image-Based Assessments of Coral Health and Disease.


Funding

The UH Hilo Data Science Summer Research Experience was funded by the National Science Foundation through the statewide Experimental Program to Stimulate Competitive Research (EPSCoR) initiative. Under this EPSCoR program, in 2016 the NSF awarded $20 million to the UH System for a five-year statewide study of water sustainability issues through a collaborative project called ‘Ike Wai (Knowledge, Water). UH Hilo is an important partner in the program, helping address growing concerns over water resources in the state of Hawai‘i. At UH Hilo, the grant also funds other research initiatives and curricular programs, including the Data Science Summer Bridge Program and the ‘Ike Wai Scholars program.

Related stories

UH Hilo developing new data science program

New computer scientist at UH Hilo specializes in artificial intelligence

Researcher returning to Hilo to launch new UH Hilo data science program

Story by Leah Sherwood, a graduate student in the tropical conservation biology and environmental science program at UH Hilo. She received her bachelor of science in biology and bachelor of arts in English from Boise State University.