SmartGuard is an iOS application that helps users take care of their homes and allows them to video chat with visitors. This IoT (Internet of Things) project consists of three components: a Raspberry Pi with a PIR motion sensor, another Raspberry Pi with a camera, microphone and speaker, and an iOS client based on Swift.

Michael0770/Smartguard

Smartguard

Introduction

This project consists of three components: a Raspberry Pi with a PIR motion sensor, another Raspberry Pi with a camera, a microphone and a speaker, and an iOS client based on Swift.

1. IoT Project architecture

Specifically, the first component is a Raspberry Pi connected to a PIR motion sensor, which performs the motion-detection function. Since one of the project's functions is to monitor the area in front of the door and notify the user when motion is detected, the Raspberry Pi acts as a server based on Node.js and the MQTT protocol. The PIR motion sensor was chosen because it is easy to use and can detect the motion of a warm body within a range of five to six meters. On the Raspberry Pi, an MQTT broker named Mosquitto is set up and starts automatically when the Raspberry Pi boots. A JavaScript file containing the code that controls the motion sensor, together with a Mosquitto client that publishes notifications to the broker, is executed manually. Once this file is running, whenever the motion sensor is triggered the client publishes a message on a specific topic, and the Mosquitto broker broadcasts that message to every client that has subscribed to the same topic.
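The publishing side of this flow can be sketched in Node.js. This is a minimal sketch, not the project's actual code: the `mqtt` and `onoff` npm packages, the topic name `smartguard/motion`, and GPIO pin 17 are all assumptions for illustration.

```javascript
// Sketch of the motion-detection publisher. Assumptions: the `mqtt` and
// `onoff` npm packages, topic `smartguard/motion`, PIR output on GPIO 17.

// Pure helper: the message payload carries a timestamp the app can display.
function motionMessage(now = new Date()) {
  return JSON.stringify({ event: 'motion', at: now.toISOString() });
}

// Wiring (not executed here): watch the PIR pin and publish on each trigger.
function startMotionPublisher(brokerUrl = 'mqtt://localhost:1883') {
  const mqtt = require('mqtt');            // MQTT client for Node.js
  const { Gpio } = require('onoff');       // GPIO access on the Raspberry Pi
  const client = mqtt.connect(brokerUrl);  // Mosquitto runs on the same Pi
  const pir = new Gpio(17, 'in', 'rising');
  pir.watch((err) => {
    if (err) throw err;
    client.publish('smartguard/motion', motionMessage());
  });
}
```

The hardware and network wiring is kept inside `startMotionPublisher` so that the payload helper can be read on its own.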

The second component is another Raspberry Pi connected to a camera, a microphone and a speaker, which enable the user to communicate with visitors through our iOS application. The video stream is handled by MJPG-streamer on the server (Raspberry Pi), and the voice stream is handled by the Twilio framework on both the client and the server side. The frameworks used here are introduced in part 3.

See image below:

2. iOS architecture design

2.1 Project story architecture

The AppDelegate contains the app's startup code and responds to key changes in the app's state. All of these controllers are declared in the AppDelegate, which works alongside the app object to ensure the app interacts properly with the system and with other apps. It responds to temporary interruptions, to changes in the app's execution state, and to notifications originating from outside the app, such as remote notifications (also known as push notifications), low-memory warnings, download-completion notifications, and more.

The tab bar controller organizes the app into distinct modes of operation. The view hierarchy of a tab bar controller is self-contained. It is composed of the map view controller, the property list table view controller and the settings view controller. Each content view controller manages a distinct view hierarchy, and the tab bar controller coordinates navigation between these hierarchies.

The view controllers and table view controllers are the foundation of the app's internal structure. Each one manages a portion of the app's user interface as well as the interactions between that interface and the underlying data. The UIViewController class defines the methods and properties for managing views, handling events, transitioning from one view controller to another, and coordinating with other parts of the app.

See image below:

2.2 Data Model

Core Data's functionality depends on the schema created to describe the application's entities, their properties, and the relationships between them. The managed object model is a set of objects that together form a blueprint describing the managed objects used in the application; it allows Core Data to map records in a persistent store to those managed objects.

As shown in the graph above, our app has two tables in Core Data, Location and Record. Their relationship is one-to-many: one Location can have multiple Records as its history. The Location table has the String attributes "title", "lat" and "long", which hold the geo-information; a Boolean "beingVisited", which indicates whether a visitor is currently in front of the property; and a String "url1", which holds the IP address of the Raspberry Pi installed on that property.

The Record table has a String attribute "datetime", the exact date and time at which the data was recorded, namely the moment the motion sensor detects a stranger approaching and the camera takes a snapshot. It also contains a String "thumbnail", generated by appending the suffix ".jpg" to "datetime"; it is therefore the filename of the snapshot the camera takes when motion is detected. The snapshot is captured on the server when a visitor arrives at the location. The app downloads it by sending a request to the server, moves the downloaded image from the temp directory to a permanent documents directory, and stores the file path in the thumbnail field of the Record data model. When the record table view is opened, the contents are retrieved from the matching file path and presented in an image view showing the snapshot.

See image below:
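The naming convention above can be illustrated with a minimal sketch, written in Node.js for brevity (the Swift client stores the same strings). The exact datetime format is an assumption; the documentation only states that the thumbnail name is the record's datetime with ".jpg" appended.

```javascript
// Sketch of the snapshot naming convention. The datetime format below is an
// illustrative assumption, chosen to be filesystem-safe.

// Datetime string recorded at the moment of detection.
function recordDatetime(now = new Date()) {
  return now.toISOString().replace(/[:.]/g, '-'); // e.g. 2020-01-02T03-04-05-000Z
}

// Thumbnail filename stored in the Record's "thumbnail" field: the datetime
// with the ".jpg" suffix appended, matching the snapshot saved on the server.
function thumbnailName(datetime) {
  return datetime + '.jpg';
}
```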

2.3 System architecture design

See image below:

3. Frameworks

3.1 MapKit

The MapKit framework provides an interface for embedding Apple Maps directly into your own screens, giving the user a direct and convenient view of the app's location-based data. The framework also provides support for indicating the user's current location, annotating the map, adding overlays, and performing reverse-geocoding lookups to determine placemark information for a given map coordinate. In our case, the user can add a new placemark on the map to register a customized location, view all existing locations, and view the details of a location and be redirected to the Monitor screen for that specific location.

3.2 MJPG Streamer

MJPG-streamer handles the video stream from the server side (Raspberry Pi) to the client side (iOS application). We chose MJPG-streamer because it is compatible with embedded devices such as the Raspberry Pi, which have limited RAM and CPU, and because it is driven by command-line instructions. This makes configuring the server easy: we only need to configure MJPG-streamer in the terminal and then use node-cmd to run the command-line instructions that create the stream while the server is running. We use the "raspistill" command to capture a picture from the camera module every 50 milliseconds, and configure the input and output of MJPG-streamer through the "input_raspi.so" and "output_http.so" plugins, which let MJPG-streamer serve the JPEG frames on an HTML page. The user starts the stream by sending a request to the server to begin capturing pictures from the camera module, and obtains the stream web page by sending another request. The workflow is shown in the following figure.

See image below:
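Launching the streamer from the Node.js server can be sketched as follows. node-cmd is named in the text; the specific plugin flags below (frame rate, HTTP port, www directory) are illustrative assumptions, not the project's actual configuration.

```javascript
// Sketch of starting MJPG-streamer from the Node.js server via node-cmd.
// The plugin flags are assumptions for illustration.

// Pure helper: assemble the mjpg_streamer command line with the
// input_raspi.so and output_http.so plugins mentioned in the text.
function mjpgCommand({ fps = 20, port = 8080 } = {}) {
  return `mjpg_streamer -i "input_raspi.so -fps ${fps}" ` +
         `-o "output_http.so -p ${port} -w /usr/local/share/mjpg-streamer/www"`;
}

// Wiring (not executed here): run the command when a client requests the stream.
function startStream(options) {
  const cmd = require('node-cmd');
  cmd.run(mjpgCommand(options), (err, data, stderr) => {
    if (err) console.error(stderr);
  });
}
```

Keeping the command assembly separate from the node-cmd call makes the server's one configuration point easy to inspect.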

3.3 Twilio Framework

The Twilio framework streams the voice between the server (Raspberry Pi) and the client application. Since audio is essentially a temporal signal, in contrast to the spatial character of video, it takes more algorithmic work to sample the voice at a high enough frequency for a real-time, high-QoS audio transfer. We therefore tried many third-party frameworks to handle the voice stream between the iOS application (written in Swift) and the Raspberry Pi server (written in Node.js), such as OpenWebRTC, EasyRTC, PeerJS, QuickBlox and Skylink. Unfortunately, they were all written in Objective-C, which was difficult for us to bridge to Swift since we are not Objective-C experts; a framework that provides a Swift version and is easy to install would save us from struggling to build a real-time voice stream between Swift and Node.js. We finally chose Twilio.

The Twilio framework is a cross-platform framework that provides sufficient documentation for building an iOS client and a Node.js client that communicate in real time with low latency. After struggling for several days with many frameworks to let our application communicate with the server (Raspberry Pi) in real time, we decided to use Twilio based on the above considerations.

First, we create a video client with the generated access token on both the client and the server side. We then set up the local media object that collects the audio captured from the local device's microphone. On the server side, the local media setup is done with the getUserMedia() function; on the iOS client side, it is done by adding an AudioTrack to the Twilio client created before. We also need to specify the room name on both clients so that the iOS client and the Raspberry Pi client enter the same room when the user chooses to call the visitor. After the connection is established, the user can communicate with the visitor in real time, and when the conversation is over the user can disconnect from the room by tapping the hang-up button.
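The access-token step can be sketched on the server with the official `twilio` Node helper library; both clients then connect to the same room with tokens minted this way. A minimal sketch, assuming credentials are kept in environment variables and using the placeholder room name "smartguard":

```javascript
// Sketch of minting a Twilio access token with a video grant on the server.
// The environment variable names and the room name are placeholders, not the
// project's actual configuration.
function videoToken(identity, roomName = 'smartguard') {
  const { jwt } = require('twilio');
  const token = new jwt.AccessToken(
    process.env.TWILIO_ACCOUNT_SID,  // account SID
    process.env.TWILIO_API_KEY,      // API key SID
    process.env.TWILIO_API_SECRET,   // API key secret
    { identity }                     // who this token identifies
  );
  // Grant access to Twilio Video, restricted to the shared room so the iOS
  // client and the Raspberry Pi client end up in the same conversation.
  token.addGrant(new jwt.AccessToken.VideoGrant({ room: roomName }));
  return token.toJwt();
}
```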

3.4 MQTT Framework

To connect the motion sensor to the iOS application, we chose the MQTT protocol as the medium because it can transfer messages among clients that subscribe to the same topic. On the Raspberry Pi side, we use Mosquitto as our MQTT broker. Mosquitto is an open source message broker that implements the MQ Telemetry Transport protocol versions 3.1 and 3.1.1. MQTT provides a lightweight way of carrying out messaging using a publish/subscribe model. We also install an MQTT client that publishes a message to the Mosquitto broker on a given topic when motion is detected. The MQTT client used on the iOS side is Moscapsule. Moscapsule is written in Swift and is implemented as a wrapper around the Mosquitto library, covering almost all Mosquitto features. We can therefore use it to connect to the Mosquitto broker on the Raspberry Pi and subscribe to the same topic, receiving the message and sending a notification to the user when the motion sensor is triggered.

See image below:
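The subscribing side of this flow can be sketched with the Node.js `mqtt` package for brevity; the iOS Moscapsule client performs the same connect/subscribe steps. The topic name `smartguard/motion` is an illustrative assumption.

```javascript
// An MQTT broker delivers a published message to every client whose
// subscription filter matches the topic: '+' matches exactly one level,
// '#' matches the rest of the topic.
function topicMatches(filter, topic) {
  const f = filter.split('/');
  const t = topic.split('/');
  for (let i = 0; i < f.length; i++) {
    if (f[i] === '#') return true;       // multi-level wildcard
    if (i >= t.length) return false;     // filter longer than topic
    if (f[i] !== '+' && f[i] !== t[i]) return false;
  }
  return f.length === t.length;
}

// Wiring (not executed here): subscribe and raise a callback on each message.
function startMotionSubscriber(brokerUrl, onMotion) {
  const mqtt = require('mqtt');
  const client = mqtt.connect(brokerUrl);  // the Mosquitto broker on the Pi
  client.on('connect', () => client.subscribe('smartguard/motion'));
  client.on('message', (topic, payload) => {
    if (topicMatches('smartguard/motion', topic)) onMotion(payload.toString());
  });
  return client;
}
```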
