AWS IoT Home Sensor Project

January 14, 2017

I started working on an AWS IoT home sensor project. The main purpose is to learn more about the different AWS services and to store and analyse some home temperature data. I will probably post most of the code on GitHub; at the time of writing I have already published two modules.

Getting an IoT starter kit

Some time ago I bought one of the IoT starter kits. Amazon offers several starter kits and provides code samples. Instead of buying a complete kit I bought the individual components, as that was easier and faster while I was still in Switzerland. I went with the Intel Edison for Arduino, which is available on Amazon as well, and the predecessor of the Grove Starter Kit Plus IoT Edition. The Grove kit contains lots of sensors that can be connected to the board without soldering and comes with code samples. The main selling points were a powerful, low-energy platform with lots of code samples, Wi-Fi and Bluetooth. As the device is powerful enough, it runs Yocto Linux and your programming language of choice.

Since then a new Raspberry Pi has been released which features Wi-Fi as well. In comparison, the Pi is more like a mini-computer, whereas the Edison is more of a dedicated IoT test device. Other boards like the Arduino have very little memory and computing power and might require programming in C or other low-level languages.
If I had to buy it now, I would probably go for the Pi, as it gets more support and has a larger community, both of which result in more updates. Probably most of the starter kits will do; many feature the Grove modules. You could even just buy a Pi and a bare sensor, but then you have to figure out how to read the raw values and convert them yourself, as sketched below.
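For illustration, here is a minimal sketch of that conversion in Java, assuming an NTC thermistor read through a 10-bit ADC and the constants commonly quoted for the Grove temperature sensor (B = 4275, R0 = 100 kΩ); verify these against the datasheet of your actual sensor revision.

```java
public final class GroveTemperature {

    // Constants as quoted for the Grove temperature sensor (v1.2);
    // check them against your own sensor's datasheet.
    private static final double B = 4275.0;     // thermistor B constant
    private static final double R0 = 100_000.0; // resistance at 25 °C in ohms

    /** Converts a raw 10-bit ADC reading (0-1023) into degrees Celsius. */
    public static double toCelsius(int rawAdcValue) {
        // Thermistor resistance derived from the voltage divider reading.
        double r = (1023.0 / rawAdcValue - 1.0) * R0;
        // Simplified Steinhart-Hart equation, referenced to 25 °C (298.15 K).
        return 1.0 / (Math.log(r / R0) / B + 1.0 / 298.15) - 273.15;
    }

    public static void main(String[] args) {
        // A mid-range reading should come out close to room temperature.
        System.out.printf("%.1f °C%n", toCelsius(512));
    }
}
```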

General idea

As mentioned, the local IoT device should capture the data and push it to AWS.

The Edison reads the data from the connected Grove temperature sensor (from the kit) using Node.js and publishes it to AWS IoT via the MQTT protocol. The transferred messages are stored in Amazon Simple Queue Service (SQS). From there they are currently polled by an AWS Lambda function triggered by Amazon CloudWatch and backed up to Amazon Simple Storage Service (Amazon S3). Later on these messages should be pushed to datastores like Amazon DynamoDB and made available through Amazon API Gateway backed by Lambda REST microservices. This data should then be visualised by a JavaScript application. I may provide several different backends for the same data.
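To sketch the backup step, the snippet below shows what such a scheduled Lambda function could look like in Java: it drains the SQS queue and writes each message as an object to an S3 bucket. This is an illustrative sketch rather than the project's actual code; the queue URL, bucket name, and key scheme are made-up placeholders, and it assumes an AWS IoT topic rule already forwards the incoming MQTT messages into the queue.

```java
import java.util.List;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

/**
 * Invoked on a CloudWatch Events schedule: drains the sensor queue and
 * writes each message as an object to an S3 backup bucket.
 */
public class SensorBackupHandler implements RequestHandler<Object, Integer> {

    // Placeholder resource names -- replace with your own queue URL and bucket.
    private static final String QUEUE_URL =
            "https://sqs.eu-west-1.amazonaws.com/123456789012/home-sensor-queue";
    private static final String BUCKET = "home-sensor-backup";

    private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public Integer handleRequest(Object input, Context context) {
        int backedUp = 0;
        while (true) {
            // Long-poll up to 10 messages at a time.
            List<Message> messages = sqs.receiveMessage(
                    new ReceiveMessageRequest(QUEUE_URL)
                            .withMaxNumberOfMessages(10)
                            .withWaitTimeSeconds(5))
                    .getMessages();
            if (messages.isEmpty()) {
                break;
            }
            for (Message message : messages) {
                // One object per message, keyed by the SQS message id.
                String key = "readings/" + message.getMessageId() + ".json";
                s3.putObject(BUCKET, key, message.getBody());
                sqs.deleteMessage(QUEUE_URL, message.getReceiptHandle());
                backedUp++;
            }
        }
        context.getLogger().log("Backed up " + backedUp + " messages");
        return backedUp;
    }
}
```

A CloudWatch Events schedule rule (for example, every few minutes) would invoke this handler; the same messages could later be fed into DynamoDB instead of, or alongside, the S3 backup.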

As I mainly developed in Java in the past, most pieces of code will probably be written in Java; however, other technologies may be used where they fit better.

The choice of AWS services is driven by keeping expenses low. For production use, other services might be a better choice: for example, Lambda functions serving requests need pre-warming while you probably want consistently low-latency responses, and Amazon Kinesis Streams might be more suitable for processing incoming messages.
