There are many theories about why people are seeing great whites more often in places like California, and while shark populations may genuinely be rebounding, it also makes sense that technology has simply made us more aware of them.
When it comes to tracking animals like great whites, researchers have employed different methods but mainly rely on tagging specific individuals and following their movements. Of course, this limits the data to animals that have already interacted with research teams and been tagged. With new artificial intelligence developed by Salesforce Einstein, researchers in Santa Barbara are operating without that limitation.
Known as Einstein Object Detection, or “Einstein Vision,” the tool was originally created to let developers train the AI to recognize multiple distinct objects within an image and then analyze details like size and location. According to Salesforce Einstein, the AI is “ordinarily used for visual search, brand detection, and product identification,” but when the company teamed up with UC Santa Barbara’s Benioff Ocean Initiative, they had the idea to write an algorithm that would allow Einstein Vision to analyze drone footage in near-real-time and recognize great whites. They call this specific application of their tech “Sharkeye.”
To start, UC Santa Barbara students were sent out in research teams to fly drones and film a 10-mile stretch of coast in the area. After the flights yielded enough footage and images of sharks, the AI built models for recognizing the distinct shape of a great white. The program is precise enough that researchers could train it to differentiate between, say, a harmless leopard shark and a juvenile great white. Even more impressive, it can recognize whether it is spotting a shark for a second time or analyzing an entirely different individual.
“The way we explain it is that it’s almost exactly like how kids learn,” says Einstein Project Manager Zineb Laraki. “You tell them, ‘That’s a cat.’ Eventually, kids can generalize and start to understand the concept of a cat — what it looks like, how it moves, predicting what it will do based on watching it. It’s the same thing with the algorithm. You show it examples. You let it make a determination. Right or wrong, you give it feedback.”
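The example-then-feedback loop Laraki describes can be sketched in miniature. The snippet below is a toy nearest-centroid classifier, not Salesforce's deep-learning system: the feature values, species profiles, and the `train`/`classify` helpers are all invented for illustration, standing in for a model that actually learns from labeled drone imagery.

```python
from math import dist

# Hypothetical training examples: (estimated length in m, fin-to-body ratio).
# These numbers are invented for illustration; the real system learns from
# labeled drone footage, not hand-picked measurements.
TRAINING_DATA = [
    ((1.2, 0.18), "leopard shark"),
    ((1.4, 0.20), "leopard shark"),
    ((2.3, 0.30), "juvenile great white"),
    ((2.6, 0.33), "juvenile great white"),
]

def train(examples):
    """Average each label's feature vectors into a centroid."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def classify(model, features):
    """Predict the label whose centroid lies closest to the features."""
    return min(model, key=lambda label: dist(model[label], features))

# Show it examples, then let it make a determination...
model = train(TRAINING_DATA)
sighting = (1.6, 0.22)
guess = classify(model, sighting)  # → "leopard shark"

# ...and, right or wrong, give it feedback: a researcher confirms the
# label, the example joins the training set, and the model is rebuilt.
TRAINING_DATA.append((sighting, "leopard shark"))
model = train(TRAINING_DATA)
```

The centroid averaging is a stand-in for the far richer internal representations a vision model builds, but the workflow is the same shape: labeled examples in, a determination out, and corrections folded back into training.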