Capturing underwater video footage in marine environments to measure and identify fish and sample populations has traditionally been technically and logistically challenging. With the Lagoon AI Camera Orb, Firetail Robotics has set out to revolutionise this process. We have partnered with leading Australian research and commercial organisations, including Griffith University, to develop what we believe is a world-leading approach to the accuracy and automation of underwater fish identification and measurement: a flexible, full-featured and cost-effective marine sensing platform.
Current approaches use a BRUV (baited remote underwater video) rig, which is lowered into a fixed position; the video is later retrieved and fish species are manually counted by an end user watching the footage. These processes consume hours of staff time, and the time and cost involved limit how the information can be used for positive environmental and commercial outcomes.

Lagoon AI can fundamentally change marine surveying. Our Lagoon AI Camera Orb solution drastically reduces the time and resources needed to obtain actionable data on fish species, behaviour, size and abundance by using 'AI at the edge': capturing and processing ocean imagery in real time, supported by AI-assisted annotation in the cloud. Lagoon AI Fish ID delivers fish count and sizing results at the edge (real time, on site) and/or in the cloud (real time or delayed), and the platform presents the data to the end user through easy-to-use software. The Lagoon AI Camera is delivered to location by the Surfbee, our proprietary Unmanned Surface Vessel (USV), and is complemented by a suite of proprietary products including a ground control station, hand controller unit, proprietary autopilot, and a long-range radio, cellular or satellite link, so data can be actioned in real time anywhere in the world via the cloud.
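To make the 'AI at the edge' data flow concrete, here is a minimal sketch of what an on-device capture, count and sizing loop could look like. The `FishDetector` class, confidence threshold and pixel-to-centimetre calibration are illustrative assumptions for this sketch, not Firetail's actual implementation or API.

```python
"""Illustrative edge survey loop: capture frames, detect fish, count and size them.
All model and calibration details below are assumptions, not the Lagoon AI code."""
from dataclasses import dataclass
import cv2  # assumed available on the edge device


@dataclass
class Detection:
    species: str        # predicted species label
    confidence: float   # model confidence, 0..1
    length_px: float    # apparent fish length in pixels


class FishDetector:
    """Placeholder for the on-device model; a deployed system would run real inference here."""
    def detect(self, frame) -> list[Detection]:
        return []  # stub: a trained model would return per-fish detections


def pixels_to_cm(length_px: float, cm_per_pixel: float) -> float:
    # Assumes a calibrated scale factor (e.g. from stereo geometry or a reference target).
    return length_px * cm_per_pixel


def survey_loop(camera_index: int = 0, cm_per_pixel: float = 0.12) -> dict[str, int]:
    detector = FishDetector()
    cap = cv2.VideoCapture(camera_index)
    counts: dict[str, int] = {}
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            for det in detector.detect(frame):
                if det.confidence < 0.5:   # illustrative acceptance threshold
                    continue
                counts[det.species] = counts.get(det.species, 0) + 1
                size_cm = pixels_to_cm(det.length_px, cm_per_pixel)
                print(f"{det.species}: ~{size_cm:.1f} cm")
            # Counts and sizes could be pushed to the cloud here for delayed review.
    finally:
        cap.release()
    return counts
```

The key point of the sketch is that counting and sizing happen on the device as frames arrive, so only lightweight summary data (not hours of raw video) needs to travel over the radio, cellular or satellite link.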
Why is innovation to transform ocean health important to you?
“We believe that innovation to transform the ocean is critical because our oceans have traditionally suffered from a case of ‘out of sight, out of mind.’ As a team dedicated to creating drones, autonomous vehicles and robots, part of our core belief is that these tools can help people see the world around them in new ways and perspectives. From our regional base, we see firsthand the impact of increasing climate variability on the agriculture sectors around our region – we know firsthand how important healthy oceans are to the health of the globe. We think that the ocean is the next frontier of creating greater awareness of the scientific, environmental and climatic impacts of the years of misuse and abuse of our maritime environment – in effect, removing the ignorance of the plight of the oceans by making it easier for people to understand them. We want to be front and centre in that effort and help tell the story of the importance of oceans.” – Jack Hurley, Founder.
Currently:
Capturing underwater video footage in marine environments to sense and identify marine life, and to react in real time to environmental anomalies from deployed sensors, has traditionally been technically and logistically challenging. Current approaches either use a BRUV (baited remote underwater video) rig, which is lowered into a fixed position with the video later retrieved and fish species manually counted by an end user watching the footage, or lower traditional environmental sensors such as the Xylem YSI EXO Multiparameter Sonde into the water with the data recorded and reviewed afterwards. Each of these processes consumes hours of staff time and presents various OH&S considerations. These time and cost factors limit how the information can be utilised for positive environmental and commercial outcomes.
Our Response: Firetail has partnered with leading Australian research and commercial organisations, including Griffith University, to develop what we believe is a world-leading approach to the accuracy and automation of underwater AI visual and sensing classification: a flexible, full-featured and cost-effective marine sensing platform named Lagoon AI. Lagoon AI features an 'at the edge' artificial intelligence capability, underpinned by the Lagoon AI edge device. Impressive as this system is, it is only as good as the underpinning robotic functionality – and this is where the Surfbee excels, delivering these sensing and survey technologies to the right spots.
The advantages of deploying the Lagoon AI sensing solution via the Surfbee platform quickly became evident from the very first on-water trials. Having a platform like the Surfbee, with automated routing and pre-drop scanning of underwater depth and structure, drastically reduced the time and resources needed to obtain actionable data, because the team could identify the key survey areas in advance and ensure accurate on-water coverage. And because the Surfbee has best-of-breed 'AI at the edge' processing and communications capability, the time lag between gathering data and being able to act on it is removed entirely. The Surfbee's winch came into its own in variable-depth environments, with its ability to automatically adjust sensor depth proving key to maintaining survey data integrity.
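As a rough illustration of the depth-holding idea, the sketch below shows a simple proportional rule for paying winch line in or out so the sensor tracks a target depth while keeping clearance above a shoaling seafloor. The function name, gains and clearance limit are assumptions for illustration, not the Surfbee's actual winch controller.

```python
"""Illustrative winch depth-hold rule; tuning values are hypothetical."""

def winch_speed_command(target_depth_m: float,
                        sensor_depth_m: float,
                        seafloor_depth_m: float,
                        min_clearance_m: float = 1.0,
                        gain: float = 0.5,
                        max_speed_m_s: float = 0.3) -> float:
    """Return a winch pay-out speed in m/s (positive lowers the sensor)."""
    # Never command the sensor below the seafloor clearance limit.
    safe_target = min(target_depth_m, seafloor_depth_m - min_clearance_m)
    error = safe_target - sensor_depth_m          # metres still to descend
    speed = gain * error                          # simple proportional control
    return max(-max_speed_m_s, min(max_speed_m_s, speed))


if __name__ == "__main__":
    # Example: hold the sonde at 5 m while the water shoals from 12 m to 4 m.
    for floor in (12.0, 8.0, 4.0):
        cmd = winch_speed_command(target_depth_m=5.0,
                                  sensor_depth_m=5.0,
                                  seafloor_depth_m=floor)
        print(f"seafloor {floor:4.1f} m -> winch command {cmd:+.2f} m/s")
```

In the shallow case the rule winds the sensor back up to preserve clearance, which is the behaviour that keeps survey data consistent as depth varies along a transect.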