Students work on improving interfaces for artificial intelligence

The class was led by Travis Mandel, assistant professor of computer science, and was sponsored by the National Science Foundation.

Assistant Professor of Computer Science Travis Mandel leads summer class on artificial intelligence. Photo by Ilya Kravchik, may not be used without permission.

By Susan Enright

Travis Mandel

This summer, five students from the University of Hawai‘i at Hilo worked with a computer scientist to investigate various research problems in human-in-the-loop artificial intelligence, or AI. The class was led by Travis Mandel, assistant professor of computer science, and was sponsored by the National Science Foundation. Students were paid a stipend to participate.

“The students have been working really hard and have some interesting work to present,” says Mandel. The students presented their two group projects virtually on July 30.

Human-AI Collaboration in Fish Tracking

In an email sent out to the UH Hilo community last month, Mandel shares that students Chris Hanley and Meynard Ballesteros worked on a project bringing a system for detecting and tracking fish in the open ocean from the desktop to mobile devices.

Their abstract explains that existing pipelines can accurately detect and track fish in real-world video collected by divers off the coast of Hawai‘i Island. Although useful for post-hoc analysis, these systems are meant to run offline on a powerful server and cannot be used to guide data collection in the field.

Real-world diving is a challenging scenario for real-time machine learning: with no reliable internet connection underwater, all processing must happen on the local device. However, a system that could guide divers in real time would have a huge impact on the process of scientific data collection in marine science. It could also serve as a testbed for exploring new methods of human-AI collaboration in a highly unusual and challenging scenario, which could generalize to other data collection settings.

Over the course of the summer, the students trained a wide variety of object detection network architectures on publicly available fish data.

“We evaluated their performance on real-world dive video collected off Hilo Bay to see whether it was possible to improve over the original RetinaNet architecture using more recent approaches,” the students explain in their abstract. “We experimented with different input sizes, training platforms, and training times. We found unexpected results regarding how different models tradeoff accuracy, the speed at which they train, and the speed at which they process images.”
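To give a sense of how detection accuracy is scored in comparisons like the one the students describe, here is a short sketch (illustrative only, not the students' code) of intersection-over-union, the standard measure of how well a predicted bounding box matches a ground-truth box:

```python
# Illustrative sketch (not the students' code): intersection-over-union (IoU),
# the standard metric for scoring object-detection boxes like those produced
# by RetinaNet and the newer architectures the students compared.

def iou(box_a, box_b):
    """Boxes are (x1, y1, x2, y2); returns overlap area / union area in [0, 1]."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (empty if the boxes do not intersect).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A predicted fish box is commonly counted as correct when IoU >= 0.5.
print(iou((0, 0, 10, 10), (0, 0, 10, 5)))
```

Averaging such per-box scores over many frames is what makes it possible to compare architectures quantitatively, alongside their training and inference speeds.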

Screenshot of underwater image with two fish identified.
Screenshot of the phone app developed by the fish detection group. Image courtesy of Travis Mandel.

The budding computer scientists explain that in order to be useful to divers in the field, object detection and tracking need to occur on mobile devices rather than a desktop computer or server. However, the existing pipelines were written in Python, which is not ideal for cross-platform mobile development.

To this end, they compared numerous different libraries, frameworks, and formats for speed and compatibility with mobile platforms. They developed a cross-platform app which supports Android, iPhone, and desktop simultaneously and allows anyone to run fish detectors on live video from their webcam as well as on pre-collected dive videos.

“We instrumented our app to report key accuracy and timing metrics, which we compared to desktop performance, finding unexpected dissimilarities,” they say. “Finally, we explore avenues for running tracking in real-time alongside detections.”
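The timing metrics the students mention can be gathered with a simple benchmarking harness. The sketch below (an illustration, not the students' app; `run_detector` is a hypothetical stand-in for a real model's forward pass) shows one way to measure per-frame latency and frames per second:

```python
# Illustrative sketch (not the students' app): measuring per-frame latency
# and frames per second for a detector. `run_detector` is a hypothetical
# stand-in for a real model's forward pass.

import time

def run_detector(frame):
    # Placeholder: a real detector would return boxes found in the frame.
    return [("fish", 0.9, (12, 40, 88, 120))]

def benchmark(frames):
    """Return (mean latency in milliseconds, frames per second) over `frames`."""
    start = time.perf_counter()
    for frame in frames:
        run_detector(frame)
    elapsed = time.perf_counter() - start
    mean_ms = 1000.0 * elapsed / len(frames)
    fps = len(frames) / elapsed
    return mean_ms, fps

mean_ms, fps = benchmark(frames=[None] * 100)
print(f"{mean_ms:.3f} ms/frame, {fps:.0f} FPS")
```

Running the same harness on a phone and on a desktop is how dissimilarities like the ones the students found would show up in the numbers.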

“This work lays the foundation for a new paradigm for how humans and computer vision systems should interact during real-time data collection.”

Assistant Professor Travis Mandel (center front) stands with students from a UH Hilo summer class on artificial intelligence. (From left in front) Chris Hanley, Prof. Mandel, and Jaden Kapali. (From left in back) Meynard Ballesteros, Ka‘imi Beatty, and Keane Nakatsu. Photo by Ilya Kravchik, may not be used without permission.

Improving Human-in-the-Loop AI

Students Keane Nakatsu, Ka‘imi Beatty, and Jaden Kapali worked on a project evaluating a human-in-the-loop annotation interface.

The abstract explains that applying deep learning methods to detect novel objects typically requires a large dataset of annotated training data. However, the process of manually annotating this data is time-consuming, inefficient, and expensive. The students posit that a better approach is to have AI work together in real time with human annotators to improve the efficiency of the data annotation process.

In previous work, a preliminary annotation interface was developed, but while it showed promise, its effectiveness was unclear. In order to evaluate and further improve the interface, the investigators worked from the premise that it is vital to be able to simulate the behavior of many different types of users.

“Additionally, we need to quickly and easily produce clear metrics that evaluate interface effectiveness,” explain the students. “Finally, it is critical to be able to distinguish the impact of the interface’s different features and measure their contribution to the overall performance.”

A slide from the presentation of the interface group, comparing the old simulator (left) with the improved one (right), which picks the next box to work on the way a person would. The improved simulator more realistically emulates human behavior compared to previous approaches. Image courtesy of Travis Mandel.

Over the course of the summer, the students made progress on all of these problems, improving the annotation interface and making it easier to iterate on. For instance, prior to this summer, evaluating the interface required a direct connection to a graphics processing unit (GPU) server.

“We developed a novel system to generate realistic deep detections without the need to run deep learning at all, allowing us to iterate faster and more efficiently by augmenting the passage of time in the interface,” the students say in their abstract. “To further assist in evaluation, we created an improved simulator which more realistically emulates human behavior compared to previous approaches and is easily extensible.”
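One way to generate plausible detections without running a deep network, in the spirit of what the students describe (though this is only an illustrative sketch, and the noise parameters here are made-up values, not the project's), is to perturb ground-truth boxes with random jitter, misses, and fake confidence scores:

```python
# Illustrative sketch (not the students' system): synthesizing "realistic"
# detections from ground-truth boxes without running deep learning, by
# jittering coordinates and randomly dropping boxes. The jitter, miss-rate,
# and confidence parameters are hypothetical, not values from the project.

import random

def synthesize_detections(ground_truth, jitter=3.0, miss_rate=0.1, rng=None):
    """Simulate detector output: noisy copies of true boxes, some missed."""
    rng = rng or random.Random()
    detections = []
    for (x1, y1, x2, y2) in ground_truth:
        if rng.random() < miss_rate:
            continue  # the simulated "detector" failed to find this object
        noisy = tuple(c + rng.gauss(0, jitter) for c in (x1, y1, x2, y2))
        score = min(1.0, max(0.0, rng.gauss(0.8, 0.1)))  # fake confidence
        detections.append((noisy, score))
    return detections

boxes = [(10, 10, 50, 50), (60, 20, 90, 70)]
print(synthesize_detections(boxes, rng=random.Random(0)))
```

Because no GPU is involved, an interface fed with such synthetic detections can be exercised as fast as the evaluation loop allows.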

To demonstrate this, the students created a wide variety of new simulators that better reflect the great diversity of human behavior during the annotation process.
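A family of simulated users can be built from a small number of behavioral knobs. The sketch below is a hypothetical illustration of the idea, not the students' simulators; the speed and error-rate parameters are invented for the example:

```python
# Illustrative sketch (not the students' simulators): a parameterized model
# of annotator behavior. Varying the hypothetical speed and error-rate knobs
# yields many different simulated users.

import random

class SimulatedAnnotator:
    def __init__(self, seconds_per_box, error_rate, rng=None):
        self.seconds_per_box = seconds_per_box  # time to draw one box
        self.error_rate = error_rate            # chance a box is drawn wrong
        self.rng = rng or random.Random()

    def annotate(self, n_objects):
        """Return (total time spent, number of boxes drawn correctly)."""
        correct = sum(1 for _ in range(n_objects)
                      if self.rng.random() >= self.error_rate)
        return n_objects * self.seconds_per_box, correct

# A careful-but-slow user versus a fast-but-sloppy one.
careful = SimulatedAnnotator(seconds_per_box=8.0, error_rate=0.02,
                             rng=random.Random(0))
sloppy = SimulatedAnnotator(seconds_per_box=2.0, error_rate=0.25,
                            rng=random.Random(0))
print(careful.annotate(100), sloppy.annotate(100))
```

Running the interface against many such profiles is what allows its features to be compared quantitatively rather than anecdotally.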

“Our streamlined evaluation approach allowed us to easily evaluate how various different features, including how the AI reacts to user behavior, animations, etc. affect annotation performance in a rigorous and quantitative way,” explain the students in their summary. “We are also able to easily create and evaluate newer and more complex algorithms to assist humans during the annotation process.”

“These types of improvements pave the way for a future in which humans and AI work together, learning from each other to solve challenging and important real-world problems.”

Related Post

Summer 2020: AI student research conducted while adhering to COVID-19 protocols


Story by Susan Enright, a public information specialist for the Office of the Chancellor and editor of UH Hilo Stories. She received her bachelor of arts in English and certificate in women’s studies from UH Hilo.