Starting on lesson 18

pull/67/head
Jim Bennett 4 years ago
parent 94a3644037
commit ba6ca24fcc

@@ -2,7 +2,7 @@
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-1.png)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz

@@ -24,20 +24,88 @@ In this lesson we'll cover:
## Architect complex IoT applications
IoT applications are made up of many components, including a variety of things and a variety of internet services.
IoT applications can be described as *things* (devices) sending data that generates *insights*. These *insights* generate *actions* to improve a business or process. An example is an engine (the thing) sending temperature data. This data is used to evaluate whether the engine is performing as expected (the insight). The insight is used to proactively prioritize the maintenance schedule for the engine (the action).
* Different things gather different pieces of data.
* IoT services give insights over that data, sometimes augmenting it with data from additional sources.
* These insights drive actions, including controlling actuators in devices, or visualizing data.
![A reference IoT architecture](../../../images/iot-reference-architecture.png)
***A reference IoT architecture. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The diagram above shows some of the components and services covered so far in these lessons, and how they link together.
* **Things** are devices that gather data from sensors, maybe interacting with edge services to interpret that data, such as image classifiers to interpret image data. You've written device code to capture data from sensors, and analyze images using Custom Vision running both in the cloud and on an edge device.
* The data from the devices is sent to an IoT service, and from there on to other services to generate insights. You've sent IoT data to Azure IoT Hub.
* **Insights** come from serverless applications, or from analytics run on stored data. So far you've used Azure Functions to respond to messages sent to an IoT Hub, and stored data for later analysis in Azure Storage.
* **Actions** can be commands sent to devices, or visualizations of data allowing humans to make decisions. You've controlled actuators based on decisions made in the cloud and commands sent to the devices, and you've visualized data using Azure Maps.
✅ Think about other IoT devices you have used, such as smart home appliances. What are the things, insights, and actions involved in that device and its software?
This pattern can be scaled out as large or small as you need, adding more devices and more services.
## Design a fruit quality control system
Let's now take this idea of things, insights, and actions and apply it to our fruit quality detector to design a larger end-to-end application.
Imagine you have been given the task of building a fruit quality detector to be used in a processing plant. Fruit travels on a conveyor belt system where employees currently spend time checking the fruit by hand, removing any unripe fruit as it arrives. To reduce costs, the plant owner wants an automated system.
✅ One of the trends with the rise of IoT (and technology in general) is that low-skill jobs are being replaced by machines. Do some research: How many jobs are estimated to be lost to IoT? How many new jobs will be created building IoT devices?
You need to build a system where fruit is detected as it arrives on the conveyor belt. It is then photographed and checked using an AI model running on the edge. The results are sent to the cloud to be stored, and if the fruit is unripe, a notification is raised so the unripe fruit can be removed.
| | |
| - | - |
| **Things** | Detector for fruit arriving on the conveyor belt<br>Camera to photograph and classify the fruit<br>Edge device running the classifier<br>Device to notify of unripe fruit |
| **Insights** | Decide to check the ripeness of the fruit<br>Store the results of the ripeness classification<br>Determine if there is a need to alert about unripe fruit |
| **Actions** | Send a command to a device to photograph the fruit and check it with an image classifier<br>Send a command to a device to alert that the fruit is unripe |
### Prototyping your application
![A reference IoT architecture for fruit quality checking](../../../images/iot-reference-architecture-fruit-quality.png)
***A reference IoT architecture for fruit quality checking. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The diagram above shows a reference architecture for this prototype application.
* An IoT device with a proximity sensor detects the arrival of fruit. This sends a message to the cloud to say fruit has been detected.
* A serverless application in the cloud sends a command to another device to take a photograph and classify the image (a sketch of this step follows this list).
* An IoT device with a camera takes a picture and sends it to an image classifier running on the edge. The results are then sent to the cloud.
* A serverless application in the cloud stores this information to be analyzed later to see what percentage of fruit is unripe. If the fruit is unripe, it sends a command to another IoT device to alert factory workers, via an LED, that there is unripe fruit.
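To make the serverless step concrete, below is a minimal sketch of an Azure Function that could send the 'take a photograph' command as an IoT Hub direct method when the fruit detection message arrives. This is illustrative only - the `take_photo` method name, the `camera-device` device ID, and the `REGISTRY_MANAGER_CONNECTION_STRING` app setting are assumptions for this sketch, not names defined in this lesson:
```python
import json
import logging
import os

import azure.functions as func
from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import CloudToDeviceMethod

def main(event: func.EventHubEvent):
    # Triggered when the proximity device sends its 'fruit detected' message
    logging.info('Fruit detected: %s', event.get_body().decode('utf-8'))

    # Connect to the IoT Hub using a service connection string from the app settings
    registry_manager = IoTHubRegistryManager(os.environ['REGISTRY_MANAGER_CONNECTION_STRING'])

    # Ask the camera device to photograph and classify the fruit via a direct method
    direct_method = CloudToDeviceMethod(method_name='take_photo', payload=json.dumps({}))
    registry_manager.invoke_device_method('camera-device', direct_method)
```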
For the prototype, you will implement all of this on a single device. If you are using a microcontroller, you will use a separate edge device to run the image classifier. You have already learned most of what you need to build this.
### Moving to production
The prototype will form the basis of your final production system. The differences when you move to production would be:
* Ruggedized components - using hardware designed to withstand the noise, heat, vibration and stress of a factory.
* Using internal communications - some of the components would communicate directly, avoiding the hop to the cloud and only sending data to the cloud to be stored. How this is done depends on the factory setup.
* Automated fruit removal - instead of an LED to alert that fruit is unripe, automated devices would remove it.
## Trigger fruit quality checking from a sensor
The IoT device needs some kind of trigger to indicate when fruit is ready to be classified. One trigger would be to detect when the fruit is at the right location on the conveyor belt by measuring the distance from a sensor to the fruit.
![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png)
***Proximity sensors send laser beams to objects like bananas and time how long until the beam is bounced back. Bananas by abderraouf omara / stop watch by Ziyad Al junaidi - all from the [Noun Project](https://thenounproject.com)***
Proximity sensors can be used to measure the distance from the sensor to an object. They usually transmit a beam of electromagnetic radiation, such as a laser beam or infrared light, then detect the radiation bouncing off an object. The time between the beam being sent and the signal bouncing back can be used to calculate the distance from the sensor to the object.
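To make the calculation concrete: the beam travels to the object and back, so the distance is half the round-trip time multiplied by the speed of the beam (the speed of light, for a laser). Here is a quick illustrative calculation in Python; this is a sketch of the math only, not part of the sensor code you will write:
```python
# Speed of light, converted from meters per second to millimeters per second
SPEED_OF_LIGHT_MM_PER_S = 299_792_458 * 1_000

def distance_mm(round_trip_seconds: float) -> float:
    # The beam travels to the object and back, so halve the round trip
    return SPEED_OF_LIGHT_MM_PER_S * round_trip_seconds / 2

# A round trip of 0.2 nanoseconds corresponds to an object roughly 30mm away
print(f'{distance_mm(0.2e-9):.2f} mm')
```
This also shows why these sensors need very precise timing hardware - at close range the round trip takes fractions of a nanosecond.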
> 💁 You have probably used proximity sensors without even knowing about it. Most smartphones turn the screen off when you hold them to your ear, to stop you accidentally ending a call with your earlobe. This works using a proximity sensor that detects an object close to the screen during a call, and disables the touch capabilities until the phone is a certain distance away.
### Task - trigger fruit quality detection from a distance sensor
Work through the relevant guide to use a proximity sensor to detect an object using your IoT device:
* [Arduino - Wio Terminal](wio-terminal-proximity.md)
* [Single-board computer - Raspberry Pi](pi-proximity.md)
* [Single-board computer - Virtual device](virtual-device-proximity.md)
## Store fruit quality data
## Control feedback via an actuator
@@ -52,6 +120,9 @@ In this lesson we'll cover:
## Review & Self Study
* Read more about IoT architecture on the [Azure IoT reference architecture documentation on Microsoft docs](https://docs.microsoft.com/azure/architecture/reference-architectures/iot?WT.mc_id=academic-17441-jabenn)
* Read about OPC-UA, a machine to machine communication protocol used in industrial automation on the [OPC-UA page on Wikipedia](https://wikipedia.org/wiki/OPC_Unified_Architecture)
## Assignment
[](assignment.md)

@@ -0,0 +1,36 @@
import io
import time

from picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

# Configure the Raspberry Pi camera, then pause to let the sensor warm up
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

time.sleep(2)

# Capture a JPEG image into an in-memory stream
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

# Save a copy of the image to disk for reference
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'

# The endpoint, project ID and iteration name are all parts of the prediction URL
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]

# Create a Custom Vision prediction client authenticated with the prediction key
prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)

# Rewind the image stream, classify the image, and print each tag with its probability
image.seek(0)
results = predictor.classify_image(project_id, iteration_name, image)

for prediction in results.predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

@@ -0,0 +1,11 @@
import time

from grove.i2c import Bus
from rpi_vl53l0x.vl53l0x import VL53L0X

# Create the VL53L0X time of flight sensor on the Grove I2C bus and start it
distance_sensor = VL53L0X(bus = Bus().bus)
distance_sensor.begin()

while True:
    # Wait for a reading, then print the distance in millimeters
    distance_sensor.wait_ready()
    print(f'Distance = {distance_sensor.get_distance()} mm')
    time.sleep(1)

@@ -0,0 +1,98 @@
# Detect proximity - Raspberry Pi
In this part of the lesson, you will add a proximity sensor to your Raspberry Pi, and read distance from it.
## Hardware
The Raspberry Pi needs a proximity sensor.
The sensor you'll use is a [Grove Time of Flight distance sensor](https://www.seeedstudio.com/Grove-Time-of-Flight-Distance-Sensor-VL53L0X.html). This sensor uses a laser ranging module to detect distance. This sensor has a range of 10mm to 2000mm (1cm - 2m), and will report values in that range pretty accurately, with distances above 1000mm reported as 8109mm.
The laser rangefinder is on the back of the sensor, the opposite side to the Grove socket.
This is an I<sup>2</sup>C sensor.
### Connect the time of flight sensor
The Grove time of flight sensor can be connected to the Raspberry Pi.
#### Task - connect the time of flight sensor
Connect the time of flight sensor.
![A grove time of flight sensor](../../../images/grove-time-of-flight-sensor.png)
1. Insert one end of a Grove cable into the socket on the time of flight sensor. It will only go in one way round.
1. With the Raspberry Pi powered off, connect the other end of the Grove cable to one of the I<sup>2</sup>C sockets marked **I<sup>2</sup>C** on the Grove Base hat attached to the Pi. These sockets are on the bottom row, at the opposite end to the GPIO pins and next to the camera cable slot.
![The grove time of flight sensor connected to the I squared C socket](../../../images/pi-time-of-flight-sensor.png)
## Program the time of flight sensor
The Raspberry Pi can now be programmed to use the attached time of flight sensor.
### Task - program the time of flight sensor
Program the device.
1. Power up the Pi and wait for it to boot.
1. Open the `fruit-quality-detector` code in VS Code, either running directly on the Pi, or connected via the Remote SSH extension.
1. Create a new file in this project called `distance-sensor.py`.
> 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time.
1. Add the following code to this file:
```python
import time
from grove.i2c import Bus
from rpi_vl53l0x.vl53l0x import VL53L0X
```
This imports the Grove I<sup>2</sup>C bus library, and a sensor library for the core sensor hardware built into the Grove time of flight sensor.
1. Below this, add the following code to access the sensor:
```python
distance_sensor = VL53L0X(bus = Bus().bus)
distance_sensor.begin()
```
This code declares a distance sensor using the Grove I<sup>2</sup>C bus, then starts the sensor.
1. Finally, add an infinite loop to read distances:
```python
while True:
    # Block until the sensor has a new reading available
    distance_sensor.wait_ready()

    # Read the distance in millimeters and print it
    print(f'Distance = {distance_sensor.get_distance()} mm')
    time.sleep(1)
```
This code waits for a value to be ready to read from the sensor, then prints it to the console.
1. Run this code.
> 💁 Don't forget this file is called `distance-sensor.py`! Make sure to run this via Python, not `app.py`.
1. You will see distance measurements appear in the console. Position objects near the sensor and you will see the distance measurement:
```output
pi@raspberrypi:~/fruit-quality-detector $ python3 distance-sensor.py
Distance = 29 mm
Distance = 28 mm
Distance = 30 mm
Distance = 151 mm
```
The rangefinder is on the back of the sensor, so make sure you use the correct side when measuring distance.
![The rangefinder on the back of the time of flight sensor pointing at a banana](../../../images/time-of-flight-banana.png)
> 💁 You can find this code in the [code-proximity/pi](code-proximity/pi) folder.
😀 Your proximity sensor program was a success!
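If you want to experiment further, the same loop can become the fruit detection trigger described earlier in the lesson. Below is a minimal sketch, assuming a hypothetical 100mm detection threshold - the threshold value and the printed message are illustrations, not part of the lesson code:
```python
import time

from grove.i2c import Bus
from rpi_vl53l0x.vl53l0x import VL53L0X

# Hypothetical threshold - fruit closer than this counts as detected
DETECTION_THRESHOLD_MM = 100

distance_sensor = VL53L0X(bus = Bus().bus)
distance_sensor.begin()

while True:
    distance_sensor.wait_ready()
    distance = distance_sensor.get_distance()

    if distance < DETECTION_THRESHOLD_MM:
        # In the full application, this is where you would send a message
        # to the cloud to trigger the camera and the image classifier
        print(f'Fruit detected at {distance} mm')

    time.sleep(1)
```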

@@ -0,0 +1,39 @@
# Detect proximity - Wio Terminal
In this part of the lesson, you will add a proximity sensor to your Wio Terminal, and read distance from it.
## Hardware
The Wio Terminal needs a proximity sensor.
The sensor you'll use is a [Grove Time of Flight distance sensor](https://www.seeedstudio.com/Grove-Time-of-Flight-Distance-Sensor-VL53L0X.html). This sensor uses a laser ranging module to detect distance. This sensor has a range of 10mm to 2000mm (1cm - 2m), and will report values in that range pretty accurately, with distances above 1000mm reported as 8109mm.
The laser rangefinder is on the back of the sensor, the opposite side to the Grove socket.
This is an I<sup>2</sup>C sensor.
### Connect the time of flight sensor
The Grove time of flight sensor can be connected to the Wio Terminal.
#### Task - connect the time of flight sensor
Connect the time of flight sensor.
![A grove time of flight sensor](../../../images/grove-time-of-flight-sensor.png)
1. Insert one end of a Grove cable into the socket on the time of flight sensor. It will only go in one way round.
1. With the Wio Terminal disconnected from your computer or other power supply, connect the other end of the Grove cable to the left-hand side Grove socket on the Wio Terminal as you look at the screen. This is the socket closest to the power button. This is a combined digital and I<sup>2</sup>C socket.
![The grove time of flight sensor connected to the left hand socket](../../../images/wio-time-of-flight-sensor.png)
1. You can now connect the Wio Terminal to your computer.
## Program the time of flight sensor
The Wio Terminal can now be programmed to use the attached time of flight sensor.
### Task - program the time of flight sensor
1. Open the `fruit-quality-detector` application in VS Code

@@ -20,7 +20,7 @@ The projects cover the journey of food from farm to table. This includes farming
![A road map for the course showing 24 lessons covering intro, farming, transport, processing, retail and cooking](sketchnotes/Roadmap.png)
**Hearty thanks to our authors [Jen Fox](https://github.com/jenfoxbot), [Jen Looper](https://github.com/jlooper), [Jim Bennett](https://github.com/jimbobbennett), and sketchnote artist [Nitya Narasimhan](https://github.com/nitya)**
> **Teachers**, we have [included some suggestions](for-teachers.md) on how to use this curriculum. If you would like to create your own lessons, we have also included a [lesson template](lesson-template/README.md).

@@ -46,7 +46,7 @@ These are specific to using the Raspberry Pi, and are not relevant to using the
* [Raspberry Pi Camera module](https://www.raspberrypi.org/products/camera-module-v2/)
* Microphone and speaker:
  * Any USB Microphone
  * Any USB speaker, or speaker with a 3.5mm cable, or using HDMI audio if your Raspberry Pi is connected to a monitor with speakers
  or
  * [USB Speakerphone](https://www.amazon.com/USB-Speakerphone-Conference-Business-Microphones/dp/B07Q3D7F8S/ref=sr_1_1?dchild=1&keywords=m0&qid=1614647389&sr=8-1)
* [Grove Sunlight sensor](https://www.seeedstudio.com/Grove-Sunlight-Sensor.html)
@@ -60,7 +60,7 @@ Most of the sensors and actuators needed are used by both the Arduino and Raspbe
* [Grove capacitive soil moisture sensor](https://www.seeedstudio.com/Grove-Capacitive-Moisture-Sensor-Corrosion-Resistant.html)
* [Grove relay](https://www.seeedstudio.com/Grove-Relay.html)
* [Grove GPS (Air530)](https://www.seeedstudio.com/Grove-GPS-Air530-p-4584.html)
* [Grove - Time of flight Distance Sensor](https://www.seeedstudio.com/Grove-Time-of-Flight-Distance-Sensor-VL53L0X.html)
## Optional hardware
