Try out our application: Birds vs Drones Detection System Demo
A comprehensive computer vision application for detecting and tracking birds and drones using YOLOv8. This system employs advanced deep learning techniques with a focus on modularity, error handling, and robust logging capabilities.
- Real-time bird and drone detection
- Comprehensive data augmentation pipeline
- Advanced model evaluation metrics
- Object tracking capabilities
- Robust error handling and logging
- Streamlit-based user interface
- Support for both image and video processing
birds_vs_drones_detection_and_tracking/
├── .env                      # Environment variables configuration
├── .gitignore                # Git ignore rules
├── app.py                    # Streamlit web application
├── requirements.txt          # Project dependencies
├── setup.py                  # Package installation configuration
├── yolov8n.pt                # YOLOv8 nano pre-trained weights
├── augmented_data/           # Directory for augmented training data
├── data/                     # Dataset directory
│   ├── README.dataset.txt    # Dataset documentation
│   ├── README.roboflow.txt   # Roboflow dataset information
│   ├── data.yaml             # Dataset configuration
│   ├── train/                # Training dataset
│   ├── valid/                # Validation dataset
│   └── test/                 # Test dataset
├── research/                 # Research and development notebooks
│   ├── edith-defence-system-v-0.0.1.ipynb
│   └── edith-defence-system-v-0.0.2.ipynb
├── runs/                     # Training runs and model artifacts
└── src/                      # Source code
    ├── __init__.py
    ├── components/                # Core components
    │   ├── __init__.py
    │   ├── data_augmentation.py   # Data augmentation pipeline
    │   └── download_dataset.py    # Dataset download utilities
    ├── pipeline/                  # Training and inference pipelines
    │   ├── __init__.py
    │   ├── evaluation.py          # Model evaluation scripts
    │   ├── prediction.py          # Prediction pipeline
    │   └── training-v-0.0.1.py    # Training pipeline
    ├── custom_exception.py        # Custom exception handling
    └── logger.py                  # Logging infrastructure
- src/: Contains all source code and core functionality
  - components/: Core processing modules
  - pipeline/: Training and inference pipelines
  - custom_exception.py: Exception handling system
  - logger.py: Logging infrastructure
- data/: Contains dataset files and configuration
  - Organized into train, validation, and test sets
  - Includes dataset documentation and configuration
- augmented_data/: Stores augmented training data
  - Generated by data_augmentation.py
  - Used for model training enhancement
- runs/: Contains training outputs
  - Model weights
  - Training logs
  - Evaluation metrics
- yolov8n.pt: Pre-trained YOLOv8 nano weights
- research/: Jupyter notebooks for R&D
  - Version 0.0.1 and 0.0.2 development notebooks
  - Experimental features and analysis
- .env: Environment variables
- requirements.txt: Python dependencies
- .gitignore: Git ignore patterns
- Clone the repository:
git clone https://github.com/Aman-Vishwakarma1729/Birds-vs-Drones-Detection-and-Tracking-System.git
cd Birds-vs-Drones-Detection-and-Tracking-System
- Create and activate a virtual environment:
python -m venv .venv
.venv\Scripts\activate # Windows
source .venv/bin/activate # Linux/Mac
- Install the package in development mode:
pip install -e .
- Set up environment variables:
  Create a `.env` file in the project root with your Roboflow API key:
  ROBOFLOW_API_KEY=your_api_key_here
- Download and set up the dataset:
python -m src.components.download_dataset
This will:
- Download the dataset from Roboflow
- Rename the downloaded folder to 'data'
- Update data.yaml with correct paths
- Create necessary directory structure:
data/
├── data.yaml        # Dataset configuration
├── train/
│   ├── images/      # Training images
│   └── labels/      # Training labels
├── valid/
│   ├── images/      # Validation images
│   └── labels/      # Validation labels
└── test/
    ├── images/      # Test images
    └── labels/      # Test labels
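Under the hood, this step combines python-dotenv (to read the API key) with the Roboflow download API. The actual logic lives in `src/components/download_dataset.py` and is not reproduced here; the sketch below is only a minimal illustration of that flow, assuming the standard `roboflow` Python package (not listed in the dependencies above), and the workspace name, project name, and version number are placeholders rather than the project's real identifiers.

```python
# Minimal illustration of a Roboflow download step (not the project's actual code).
import os

from dotenv import load_dotenv   # python-dotenv
from roboflow import Roboflow    # roboflow SDK (assumed here)

load_dotenv()  # reads ROBOFLOW_API_KEY from the .env file in the project root
api_key = os.getenv("ROBOFLOW_API_KEY")
if not api_key:
    raise RuntimeError("ROBOFLOW_API_KEY is not set in .env")

rf = Roboflow(api_key=api_key)
# "your-workspace", "birds-vs-drones", and version 1 are placeholders.
project = rf.workspace("your-workspace").project("birds-vs-drones")
dataset = project.version(1).download("yolov8")  # downloads in YOLOv8 format
print("Dataset downloaded to:", dataset.location)
```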
- Data Augmentation (`components/data_augmentation.py`)
  - Implements advanced augmentation techniques
  - Supports various transformation strategies
  - Handles batch processing of images
  - An illustrative augmentation sketch appears after this list
- Dataset Management (`components/download_dataset.py`)
  - Manages dataset download from Roboflow
  - Handles data validation and verification
  - Implements error handling for the download process
- Model Pipeline
  - Evaluation (`pipeline/evaluation.py`): Comprehensive model evaluation metrics
  - Prediction (`pipeline/prediction.py`): Real-time inference pipeline
  - Training (`pipeline/training-v-0.0.1.py`): YOLOv8 training workflow
- Exception Handling (`custom_exception.py`)
  - Custom exception classes for different error types
  - Detailed error tracking with file and line information
  - Specialized exceptions for model, data, and prediction errors
- Logging System (`logger.py`)
  - Comprehensive logging infrastructure
  - Multiple logger categories (model, data, prediction)
  - Timestamped log files and console output
  - A combined exception-handling and logging sketch appears after this list
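Albumentations is listed among the dependencies, so the augmentation component presumably builds its transforms on it. The exact transforms live in `components/data_augmentation.py`; the block below is a minimal sketch of a bounding-box-aware Albumentations pipeline, with the specific transforms, probabilities, file paths, and class labels chosen purely for illustration.

```python
# Illustrative bounding-box-aware augmentation pipeline (not the project's actual transforms).
import cv2
import albumentations as A

# YOLO-format bboxes: [x_center, y_center, width, height], normalized to [0, 1]
transform = A.Compose(
    [
        A.HorizontalFlip(p=0.5),
        A.RandomBrightnessContrast(p=0.3),
        A.Rotate(limit=15, p=0.3),
        A.Blur(blur_limit=3, p=0.1),
    ],
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

image = cv2.imread("data/train/images/example.jpg")  # placeholder path
bboxes = [[0.5, 0.5, 0.2, 0.1]]                      # placeholder box
class_labels = [1]                                   # placeholder class id

augmented = transform(image=image, bboxes=bboxes, class_labels=class_labels)
aug_image, aug_bboxes = augmented["image"], augmented["bboxes"]
```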
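The concrete classes in `custom_exception.py` and `logger.py` are not shown in this README. The sketch below only illustrates the general pattern the two descriptions above suggest: an exception that records file and line information, and per-category loggers that write timestamped files plus console output. Names such as `CustomException` and `get_logger` are illustrative, not necessarily the project's actual identifiers.

```python
# Illustrative error-handling and logging pattern (names are assumptions).
import logging
import sys
from datetime import datetime

def get_logger(category: str) -> logging.Logger:
    """Return a logger for a category (e.g. 'model', 'data', 'prediction')
    that writes to a timestamped file and to the console."""
    logger = logging.getLogger(category)
    if not logger.handlers:  # avoid adding duplicate handlers
        logger.setLevel(logging.INFO)
        fmt = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
        file_handler = logging.FileHandler(
            f"{category}_{datetime.now():%Y%m%d_%H%M%S}.log")
        file_handler.setFormatter(fmt)
        stream_handler = logging.StreamHandler()
        stream_handler.setFormatter(fmt)
        logger.addHandler(file_handler)
        logger.addHandler(stream_handler)
    return logger

class CustomException(Exception):
    """Wraps the original error with the file name and line number it came from."""
    def __init__(self, message: str, error: Exception):
        _, _, tb = sys.exc_info()
        detail = f" [{tb.tb_frame.f_code.co_filename}:{tb.tb_lineno}]" if tb else ""
        super().__init__(f"{message}: {error}{detail}")

# Usage
logger = get_logger("model")
try:
    raise ValueError("weights file not found")
except ValueError as e:
    logger.error("Model loading failed: %s", e)
    raise CustomException("Model loading failed", e)
```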
- ultralytics==8.0.0
- torch>=2.0.0
- opencv-python>=4.7.0
- streamlit>=1.24.0
- albumentations>=1.3.1
- python-dotenv>=1.0.0
- numpy>=1.24.0
- matplotlib>=3.7.1
- pandas>=2.0.0
- Environment Setup
  # Create and activate virtual environment
  python -m venv .venv
  .venv\Scripts\activate      # Windows
  source .venv/bin/activate   # Linux/Mac
- Installation
  There are two ways to install the project dependencies:
  a) Using requirements.txt:
     pip install -r requirements.txt
  b) Using setup.py (recommended for development):
     pip install -e .
This will install the project in development mode, making it easier to modify the code without reinstalling.
The setup.py file includes the following dependencies:
- ultralytics (YOLOv8)
- numpy
- matplotlib
- pandas
- opencv-python
- streamlit
- albumentations
- python-dotenv
- Configure Environment Variables
  - Create a `.env` file in the project root
  - Add your Roboflow API key:
    ROBOFLOW_API_KEY=your_api_key_here
- Download Dataset
  python src/components/download_dataset.py
- Run Training (see the training sketch below)
  python src/pipeline/training-v-0.0.1.py
- Launch Web Interface (see the Streamlit sketch below)
  streamlit run app.py
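The training and evaluation pipelines wrap the Ultralytics YOLOv8 API. Their actual hyperparameters are defined in `src/pipeline/training-v-0.0.1.py` and `src/pipeline/evaluation.py`; the sketch below only shows the general shape of such a workflow, with the epoch count, image size, and run name picked as example values.

```python
# Illustrative YOLOv8 training and evaluation workflow (example settings only).
from ultralytics import YOLO

# Start from the pre-trained nano weights shipped with the repository
model = YOLO("yolov8n.pt")

# Train on the dataset described by data/data.yaml
# (epochs, imgsz, and the run name are example values, not the project's settings)
model.train(data="data/data.yaml", epochs=50, imgsz=640, name="birds_vs_drones")

# Evaluate on the validation split; metrics include mAP50, mAP50-95,
# precision, and recall
metrics = model.val(data="data/data.yaml")
print("mAP50:   ", metrics.box.map50)
print("mAP50-95:", metrics.box.map)
```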
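Similarly, `app.py` contains the real Streamlit interface launched above. The following is a minimal sketch of how an image-upload detection page can be wired together with Streamlit and YOLOv8; the widgets, weights path, and layout are assumptions, not a copy of the project's UI.

```python
# Minimal illustrative Streamlit detection page (not the project's app.py).
import numpy as np
import cv2
import streamlit as st
from ultralytics import YOLO

st.title("Birds vs Drones Detection")

model = YOLO("yolov8n.pt")  # placeholder; a fine-tuned checkpoint would live under runs/
conf = st.slider("Confidence threshold", 0.05, 0.95, 0.25)

uploaded = st.file_uploader("Upload an image", type=["jpg", "jpeg", "png"])
if uploaded is not None:
    # Decode the uploaded bytes into a BGR image for inference
    image = cv2.imdecode(np.frombuffer(uploaded.read(), np.uint8), cv2.IMREAD_COLOR)
    results = model.predict(image, conf=conf)
    annotated = results[0].plot()  # draw boxes and labels
    st.image(cv2.cvtColor(annotated, cv2.COLOR_BGR2RGB), caption="Detections")
```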
- Base Model: YOLOv8 Nano
- Classes: Bird, Drone
- Default Confidence Threshold: 0.25
- IoU Threshold Range: 0.45-0.5
- mAP50
- mAP50-95
- Precision
- Recall
- F1-score
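For reference, this is roughly how the classes and thresholds listed above map onto the Ultralytics inference and tracking calls; the checkpoint path and media files below are placeholders, not paths from this repository.

```python
# Illustrative detection and tracking calls using the configured thresholds.
from ultralytics import YOLO

# Placeholder path to a trained checkpoint under runs/
model = YOLO("runs/detect/train/weights/best.pt")

# Single-image detection with the default confidence and IoU thresholds
results = model.predict("example.jpg", conf=0.25, iou=0.45)
for box in results[0].boxes:
    cls_name = results[0].names[int(box.cls)]  # class name, e.g. "bird" or "drone"
    print(cls_name, float(box.conf))

# Video tracking: persist=True keeps track IDs across frames
tracked = model.track("example_video.mp4", conf=0.25, iou=0.5, persist=True)
```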
- API keys stored in `.env` file
- Secure file handling implementation
- Input validation for all user inputs
- Comprehensive error logging
- Enhanced small object detection capabilities
- Implementation of advanced tracking algorithms
- Real-time camera input support
- Extended dataset diversity
- Edge device deployment optimizations
- Comprehensive unit test coverage
Contributions are welcome! Please feel free to submit a Pull Request.