iOS app created for Pixery Hackathon.
The app aims to help visually impaired people by describing their surroundings aloud.
Technologies used:
- ARKit to capture frames from the camera.
- CoreML with the pre-trained InceptionV3 model to classify objects in those frames.
- SFSpeechRecognizer to receive voice commands from the user.
- AVSpeechSynthesizer to give audio feedback to the user.
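A minimal sketch of how these pieces can fit together: classify the current ARKit camera frame with InceptionV3 via Vision, then speak the top result with AVSpeechSynthesizer. The `SceneDescriber` type, the confidence threshold, and the spoken phrasing are illustrative assumptions, not the project's actual implementation; `Inceptionv3` is the class Xcode generates from the downloaded `.mlmodel` file.

```swift
import ARKit
import AVFoundation
import CoreML
import Vision

/// Illustrative helper (name is hypothetical): classifies an ARKit camera
/// frame and speaks the top label so the user hears what is around them.
final class SceneDescriber {
    private let synthesizer = AVSpeechSynthesizer()

    private lazy var classificationRequest: VNCoreMLRequest = {
        // Inceptionv3 is the class Xcode generates from the .mlmodel file.
        let mlModel = try! Inceptionv3(configuration: MLModelConfiguration()).model
        let visionModel = try! VNCoreMLModel(for: mlModel)
        return VNCoreMLRequest(model: visionModel) { [weak self] request, _ in
            // Speak only reasonably confident classifications (0.5 is an
            // arbitrary threshold chosen for this sketch).
            guard let best = (request.results as? [VNClassificationObservation])?.first,
                  best.confidence > 0.5 else { return }
            self?.speak(best.identifier)
        }
    }()

    /// Call from ARSessionDelegate's session(_:didUpdate:) with each new frame.
    func describe(frame: ARFrame) {
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([classificationRequest])
    }

    private func speak(_ text: String) {
        guard !synthesizer.isSpeaking else { return } // avoid overlapping speech
        let utterance = AVSpeechUtterance(string: "I see \(text)")
        synthesizer.speak(utterance)
    }
}
```

In a real session you would throttle calls to `describe(frame:)` (e.g. a few times per second) rather than classifying every frame, since Vision requests are comparatively expensive.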