This repository is the central hub for the Intelligent Systems Laboratory (ISL) research team's efforts in advancing the field of autonomous navigation and exploration. Our mission is to adapt and enhance object detection and semantic segmentation algorithms specifically tailored for integration with cutting-edge hardware platforms.
We are dedicated to developing optimized AI algorithms that are compatible with the NVIDIA Jetson Nano paired with the ZED camera on the Turtlebot, as well as the Intel NUC coupled with the Intel RealSense camera on our drone systems. A key focus of our work is to establish robust protocols for data sharing between these platforms, enabling real-time processing and ensuring synchronized operations across different systems.
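The repository does not yet pin down a message schema or transport for that data sharing, so the following is only a minimal sketch of the idea: each platform serializes its detection results as timestamped JSON so that the Turtlebot (Jetson Nano + ZED) and the drone (Intel NUC + RealSense) can align observations in time. The field names, the `"turtlebot"` platform identifier, and the UDP broadcast shown in the comments are illustrative assumptions, not a fixed protocol.

```python
import json
import time

def encode_detection(platform_id, label, bbox, stamp=None):
    """Serialize one detection into a JSON payload.

    The timestamp lets a receiver on another platform order and
    align detections coming from both ground and air systems.
    bbox is assumed to be [x, y, w, h] in pixels (illustrative choice).
    """
    return json.dumps({
        "platform": platform_id,
        "label": label,
        "bbox": bbox,
        "stamp": stamp if stamp is not None else time.time(),
    }).encode("utf-8")

def decode_detection(payload):
    """Parse a payload produced by encode_detection back into a dict."""
    return json.loads(payload.decode("utf-8"))

# Hypothetical usage: the Turtlebot broadcasts a detection; the drone's
# Intel NUC would listen on the same port of the shared network, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(msg, ("255.255.255.255", 5005))
msg = encode_detection("turtlebot", "door", [120, 80, 64, 128], stamp=1700000000.0)
decoded = decode_detection(msg)
```

In practice a ROS topic or a purpose-built protocol may replace the raw UDP shown in the comments; the key point is the shared, timestamped message format that keeps the two platforms synchronized.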
Our project aims to significantly improve the autonomous capabilities of robotic and drone systems. By pushing the boundaries of what is possible within existing hardware configurations and computational constraints, we contribute to the broader effort behind unmanned exploration missions. Our work is set to enhance AI-controlled autonomous navigation technologies, paving the way for innovative applications and research in intelligent systems.
We encourage collaboration and contributions from fellow researchers, developers, and enthusiasts in the field. Whether it's through improving code, suggesting optimizations, or providing feedback on our protocols, your input is invaluable in driving this project forward.
Keep an eye on this space for updates, progress reports, and insights into our development process. Together, we're charting a course for smarter, more capable autonomous systems.
| Ground | Air | Communication | Efficiency |
|---|---|---|---|
| Jetson Nano + ZED camera (Turtlebot) | Intel NUC + RealSense camera (drone) | Cross-platform data-sharing protocols | AI algorithms optimized for embedded hardware |