diff --git a/.vscode/settings.json b/.vscode/settings.json
new file mode 100644
index 00000000..39aa6820
--- /dev/null
+++ b/.vscode/settings.json
@@ -0,0 +1,20 @@
+{
+ "cSpell.words": [
+ "ADCs",
+ "Geospatial",
+ "Kbps",
+ "Mbps",
+ "Seeed",
+ "Twilio",
+ "UART",
+ "UDID",
+ "Zigbee",
+ "antimeridian",
+ "geofence",
+ "geofences",
+ "geofencing",
+ "microcontrollers",
+ "mosquitto",
+ "sketchnote"
+ ]
+}
\ No newline at end of file
diff --git a/1-getting-started/lessons/1-introduction-to-iot/README.md b/1-getting-started/lessons/1-introduction-to-iot/README.md
index f2bd8dda..d601b69f 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/README.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/README.md
@@ -22,11 +22,11 @@ In this lesson we'll cover:
## What is the 'Internet of Things'?
-The term 'Internet of Things' was coined by [Kevin Ashton](https://wikipedia.org/wiki/Kevin_Ashton) in 1999 to refer to connecting the Internet to the physical world via sensors. Since then the term has been used to describe any device that interacts with the physical world around it either by gathering data from sensors, or providing real-world interactions via actuators (devices that do something like turn on a switch or light an LED), generally connected to other devices or the Internet.
+The term 'Internet of Things' was coined by [Kevin Ashton](https://wikipedia.org/wiki/Kevin_Ashton) in 1999, to refer to connecting the Internet to the physical world via sensors. Since then, the term has been used to describe any device that interacts with the physical world around it, either by gathering data from sensors, or providing real-world interactions via actuators (devices that do something like turn on a switch or light an LED), generally connected to other devices or the Internet.
> **Sensors** gather information from the world, such as measuring speed, temperature or location.
>
-> **Actuators** convert electrical signals into real-world interactions such as levers, turning on lights, making sounds, or sending control signals to other hardware such as to turn on a power socket
+> **Actuators** convert electrical signals into real-world interactions such as triggering a switch, turning on lights, making sounds, or sending control signals to other hardware, for example to turn on a power socket.
IoT as a technology area is more than just devices - it includes cloud based services that can process the sensor data, or send requests to actuators connected to IoT devices. It also includes devices that don't have or don't need Internet connectivity, often referred to as edge devices. These are devices that can process and respond to sensor data themselves, usually using AI models trained in the cloud.
@@ -34,19 +34,19 @@ IoT is a fast growing technology field. It is estimated that by the end of 2020,

-✅ Do a little research: how much of the data generated by IoT devices is actually used, and how much is wasted? Why is so much data ignored?
+✅ Do a little research: How much of the data generated by IoT devices is actually used, and how much is wasted? Why is so much data ignored?
-This data is the key to IoT's success. To be a successful IoT developer, you need to understand the data you need to gather, how to gather it, how to make decisions based off it, and how to use those decisions to interact back with the physical world if needed.
+This data is the key to IoT's success. To be a successful IoT developer, you need to understand the data you need to gather, how to gather it, how to make decisions based on it, and how to use those decisions to interact with the physical world if needed.
## IoT devices
The **T** in IoT stands for **Things** - devices that interact with the physical world around them either by gathering data from sensors, or providing real-world interactions via actuators.
-Devices for production or commercial use, such as the consumer fitness trackers, or industrial machine controllers, are usually custom made. They use custom circuit boards, maybe even custom processors, designed to meet the needs of a particular task, whether that's being small enough to fit on a wrist, or rugged enough to work in a high temperature, high stress, high vibration factory environment.
+Devices for production or commercial use, such as consumer fitness trackers, or industrial machine controllers, are usually custom-made. They use custom circuit boards, maybe even custom processors, designed to meet the needs of a particular task, whether that's being small enough to fit on a wrist, or rugged enough to work in a high temperature, high stress or high vibration factory environment.
-As a developer, either learning about IoT or creating a prototype device, you'll need to start with a developer kit. These are general purpose IoT devices designed for developers to use, often with features that you wouldn't see on a production device, such as a set of external pins to connect sensors or actuators to, hardware to support debugging, or additional resources that would add unnecessary cost when doing a large manufacturing run.
+As a developer either learning about IoT or creating a device prototype, you'll need to start with a developer kit. These are general-purpose IoT devices designed for developers to use, often with features that you wouldn't see on a production device, such as a set of external pins to connect sensors or actuators to, hardware to support debugging, or additional resources that would add unnecessary cost when doing a large manufacturing run.
-These developer kits usually fall into two categories - microcontrollers and single-board computers. These will be introduced here, and we'll go into them in more detail in the next lesson.
+These developer kits usually fall into two categories - microcontrollers and single-board computers. These will be introduced here, and we'll go into more detail in the next lesson.
> 💁 Your phone can also be considered to be a general-purpose IoT device, with sensors and actuators built in, with different apps using the sensors and actuators in different ways with different cloud services. You can even find some IoT tutorials that use a phone app as an IoT device.
@@ -56,19 +56,19 @@ A microcontroller (also referred to as an MCU, short for microcontroller unit) i
🧠 One or more central processing units (CPUs) - the 'brain' of the microcontroller that runs your program
-💾 Memory (RAM and program memory) - where your program, data, and variables are stored
+💾 Memory (RAM and program memory) - where your program, data and variables are stored
-🔌 Programmable input/output (I/O) connections - to talk to external peripherals (connected devices) such as sensors or actuators
+🔌 Programmable input/output (I/O) connections - to talk to external peripherals (connected devices) such as sensors and actuators
-Microcontrollers are typically low cost computing devices, with average prices for the ones used in custom hardware dropping to around US$0.50, with some devices as cheap as US$0.03. Developer kits can start as low as US$4, with costs rising as you add more features. The [Wio Terminal](https://www.seeedstudio.com/Wio-Terminal-p-4509.html), a microcontroller developer kit from [Seeed studios](https://www.seeedstudio.com) that has sensors, actuators, WiFi and a screen costs around US$30.
+Microcontrollers are typically low-cost computing devices, with average prices for the ones used in custom hardware dropping to around US$0.50, and some devices as cheap as US$0.03. Developer kits can start as low as US$4, with costs rising as you add more features. The [Wio Terminal](https://www.seeedstudio.com/Wio-Terminal-p-4509.html), a microcontroller developer kit from [Seeed studios](https://www.seeedstudio.com) that has sensors, actuators, WiFi and a screen, costs around US$30.

-> 💁 When searching the Internet for microcontrollers be observant of searching for the term **MCU** as this will bring back a lot of results for the Marvel Cinematic Universe, not microcontrollers.
+> 💁 When searching the Internet for microcontrollers, be wary of searching for the term **MCU** as this will bring back a lot of results for the Marvel Cinematic Universe, not microcontrollers.
Microcontrollers are designed to be programmed to do a limited number of very specific tasks, rather than being general-purpose computers like PCs or Macs. Except for very specific scenarios, you can't connect a monitor, keyboard and mouse and use them for general purpose tasks.
-Microcontroller developer kits usually come with additional sensors and actuators on board. Most boards will have one or more LEDs you can program, along with other devices such as standard plugs for adding more sensors or actuators using various manufacturers ecosystems or built in sensors (usually the most popular ones such as temperature). Some microcontrollers have built in wireless connectivity such as Bluetooth or WiFi, or have additional microcontrollers on the board to add this connectivity.
+Microcontroller developer kits usually come with additional sensors and actuators on board. Most boards will have one or more LEDs you can program, along with other devices such as standard plugs for adding more sensors or actuators using various manufacturers' ecosystems, or built-in sensors (usually the most popular ones such as temperature sensors). Some microcontrollers have built-in wireless connectivity such as Bluetooth or WiFi, or have additional microcontrollers on the board to add this connectivity.
> 💁 Microcontrollers are usually programmed in C/C++.
diff --git a/1-getting-started/lessons/1-introduction-to-iot/code/wio-terminal/nightlight/app.py b/1-getting-started/lessons/1-introduction-to-iot/code/wio-terminal/nightlight/app.py
deleted file mode 100644
index f33ca45e..00000000
--- a/1-getting-started/lessons/1-introduction-to-iot/code/wio-terminal/nightlight/app.py
+++ /dev/null
@@ -1,20 +0,0 @@
-from counterfit_shims_grove.counterfit_connection import CounterFitConnection
-import time
-from counterfit_shims_grove.grove_light_sensor_v1_2 import GroveLightSensor
-from counterfit_shims_grove.grove_led import GroveLed
-
-CounterFitConnection.init('127.0.0.1', 5000)
-
-light_sensor = GroveLightSensor(0)
-led = GroveLed(5)
-
-while True:
- light = light_sensor.light
- print('Light level:', light)
-
- if light < 200:
- led.on()
- else:
- led.off()
-
- time.sleep(1)
\ No newline at end of file
diff --git a/1-getting-started/lessons/1-introduction-to-iot/pi.md b/1-getting-started/lessons/1-introduction-to-iot/pi.md
index 119279c5..583d7a8d 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/pi.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/pi.md
@@ -12,7 +12,7 @@ If you are using a Raspberry Pi as your IoT hardware, you have two choices - you
Before you begin, you also need to connect the Grove Base Hat to your Pi.
-### Task
+### Task - setup
Install the Grove base hat on your Pi and configure the Pi
@@ -29,7 +29,7 @@ Install the Grove base hat on your Pi and configure the Pi
If you want to work directly on your Pi, you can use the desktop version of Raspberry Pi OS and install all the tools you need.
-#### Task
+#### Task - work directly on your Pi
Set up your Pi for development.
@@ -79,7 +79,7 @@ Rather than coding directly on the Pi, it can run 'headless', that is not connec
To code remotely, the Pi OS needs to be installed on an SD Card.
-##### Task
+##### Task - set up the Pi OS
Set up the headless Pi OS.
@@ -115,7 +115,7 @@ The OS will be written to the SD card, and once compete the card will be ejected
The next step is to remotely access the Pi. You can do this using `ssh`, which is available on macOS, Linux and recent versions of Windows.
-##### Task
+##### Task - connect to the Pi
Remotely access the Pi.
@@ -147,7 +147,7 @@ Remotely access the Pi.
Once you are connected to the Pi, you need to ensure the OS is up to date, and install various libraries and tools that interact with the Grove hardware.
-##### Task
+##### Task - configure software on the Pi
Configure the installed Pi software and install the Grove libraries.
@@ -179,7 +179,7 @@ Configure the installed Pi software and install the Grove libraries.
Once the Pi is configured, you can connect to it using Visual Studio Code (VS Code) from your computer - this is a free developer text editor you will be using to write your device code in Python.
-##### Task
+##### Task - configure VS Code for remote access
Install the required software and connect remotely to your Pi.
@@ -199,7 +199,7 @@ The Hello World app for the Pi will ensure that you have Python and Visual Studi
This app will be in a folder called `nightlight`, and it will be re-used with different code in later parts of this assignment to build the nightlight application.
-### Task
+### Task - Hello World
Create the Hello World app.
diff --git a/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md b/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md
index bc37888d..26e94915 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md
@@ -34,7 +34,7 @@ One of the powerful features of Python is the ability to install [pip packages](
By default when you install a package it is available everywhere on your computer, and this can lead to problems with package versions - such as one application depending on one version of a package that breaks when you install a new version for a different application. To work around this problem, you can use a [Python virtual environment](https://docs.python.org/3/library/venv.html), essentially a copy of Python in a dedicated folder, and when you install pip packages they get installed just to that folder.
-#### Task
+#### Task - configure a Python virtual environment
Configure a Python virtual environment and install the pip packages for CounterFit.
@@ -96,7 +96,7 @@ Configure a Python virtual environment and install the pip packages for CounterF
Once the Python virtual environment is ready, you can write the code for the 'Hello World' application
-#### Task
+#### Task - write the code
Create a Python application to print `"Hello World"` to the console.
@@ -119,7 +119,8 @@ Create a Python application to print `"Hello World"` to the console.
```sh
code .
```
- > 💁 If your terminal returns `command not found` on macOS it means VS Code has not been added to PATH, you can [add VS Code to PATH](https://code.visualstudio.com/docs/setup/mac#_launching-from-the-command-line) and run the command afterwards. VS Code is installed to PATH by default on Windows and Linux.
+
+ > 💁 If your terminal returns `command not found` on macOS, it means VS Code has not been added to your PATH. You can add VS Code to your PATH by following the instructions in the [Launching from the command line section of the VS Code documentation](https://code.visualstudio.com/docs/setup/mac?WT.mc_id=academic-17441-jabenn#_launching-from-the-command-line) and run the command afterwards. VS Code is installed to your PATH by default on Windows and Linux.
1. When VS Code launches, it will activate the Python virtual environment. You will see this in the bottom status bar:
@@ -171,7 +172,7 @@ Create a Python application to print `"Hello World"` to the console.
As a second 'Hello World' step, you will run the CounterFit app and connect your code to it. This is the virtual equivalent of plugging in some IoT hardware to a dev kit.
-#### Task
+#### Task - connect the 'hardware'
1. From the VS Code terminal, launch the CounterFit app with the following command:
diff --git a/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md b/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md
index 6692949b..ea676572 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md
@@ -8,7 +8,7 @@ The [Wio Terminal from Seeed Studios](https://www.seeedstudio.com/Wio-Terminal-p
To use your Wio Terminal, you will need to install some free software on your computer. You will also need to update the Wio Terminal firmware before you can connect it to WiFi.
-### Task
+### Task - setup
Install the required software and update the firmware.
@@ -32,7 +32,7 @@ The Hello World app for the Wio Terminal will ensure that you have Visual Studio
The first step is to create a new project using PlatformIO configured for the Wio Terminal.
-#### Task
+#### Task - create a PlatformIO project
Create the PlatformIO project.
@@ -122,7 +122,7 @@ The VS Code explorer will show a number of files and folders created by the Plat
You're now ready to write the Hello World app.
-#### Task
+#### Task - write the Hello World app
Write the Hello World app.
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/README.md b/1-getting-started/lessons/3-sensors-and-actuators/README.md
index 436d5ebd..3f4926a9 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/README.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/README.md
@@ -28,15 +28,15 @@ Sensors are hardware devices that sense the physical world - that is they measur
Some common sensors include:
* Temperature sensors - these sense the air temperature, or the temperature of what they are immersed in. For hobbyists and developers, these are often combined with air pressure and humidity in a single sensor.
-* Buttons - they sense when they have been pressed
-* Light sensors - these detect light levels, and can be for specific colors, UV light, IR light, or general visible light
-* Cameras - these sense a visual representation of the world by taking a photograph or streaming video
-* Accelerometers - these sense movement in multiple directions
-* Microphones - these sense sound, either general sound levels, or directional sound
+* Buttons - they sense when they have been pressed.
+* Light sensors - these detect light levels, and can be for specific colors, UV light, IR light, or general visible light.
+* Cameras - these sense a visual representation of the world by taking a photograph or streaming video.
+* Accelerometers - these sense movement in multiple directions.
+* Microphones - these sense sound, either general sound levels, or directional sound.
✅ Do some research. What sensors does your phone have?
-All sensors have one thing in common - the convert whatever they sense into an electrical signal that can be interpreted by an IoT device. How this electrical signal is interpreted depends on the sensor, as well as the communication protocol used to communicate with the IoT device.
+All sensors have one thing in common - they convert whatever they sense into an electrical signal that can be interpreted by an IoT device. How this electrical signal is interpreted depends on the sensor, as well as the communication protocol used to communicate with the IoT device.
## Use a sensor
@@ -62,7 +62,7 @@ One example of this is a potentiometer. This is a dial that you can rotate betwe
***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
-The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0v (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.
+The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.
> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
@@ -74,9 +74,9 @@ The voltage that comes out the sensor is then read by the IoT device, and the de
IoT devices are digital - they can't work with analog values, they only work with 0s and 1s. This means that analog sensor values need to be converted to a digital signal before they can be processed. Many IoT devices have analog-to-digital converters (ADCs) to convert analog inputs to digital representations of their value. Sensors can also work with ADCs via a connector board. For example, in the Seeed Grove ecosystem with a Raspberry Pi, analog sensors connect to specific ports on a 'hat' that sits on the Pi connected to the Pi's GPIO pins, and this hat has an ADC to convert the voltage into a digital signal that can be sent off the Pi's GPIO pins.
-Imagine you have an analog light sensor connected to an IoT device that uses 3.3V, and is returning a value of 1v. This 1v doesn't mean anything in the digital world, so needs to be converted. The voltage will be converted to an analog value using a scale depending on the device and sensor. One example is the Seeed Grove light sensor which outputs values from 0 to 1,023. For this sensor running at 3.3V, a 1v output would be a value of 300. An IoT device can't handle 300 as an analog value, so the value would be converted to `0000000100101100`, the binary representation of 300 by the Grove hat. This would then be processed by the IoT device.
+Imagine you have an analog light sensor connected to an IoT device that uses 3.3V, and is returning a value of 1V. This 1V doesn't mean anything in the digital world, so needs to be converted. The voltage will be converted to an analog value using a scale depending on the device and sensor. One example is the Seeed Grove light sensor which outputs values from 0 to 1,023. For this sensor running at 3.3V, a 1V output would be a value of 300. An IoT device can't handle 300 as an analog value, so the value would be converted to `0000000100101100`, the binary representation of 300 by the Grove hat. This would then be processed by the IoT device.
-✅ If you don't know binary then do a small amount of research to learn how numbers are represented by 0s and 1s. The [BBC Bitesize introduction to binary lesson](https://www.bbc.co.uk/bitesize/guides/zwsbwmn/revision/1) is a great place to start.
+✅ If you don't know binary, then do a small amount of research to learn how numbers are represented by 0s and 1s. The [BBC Bitesize introduction to binary lesson](https://www.bbc.co.uk/bitesize/guides/zwsbwmn/revision/1) is a great place to start.
From a coding perspective, all this is usually handled by libraries that come with the sensors, so you don't need to worry about this conversion yourself. For the Grove light sensor you would use the Python library and call the `light` property, or use the Arduino library and call `analogRead` to get a value of 300.
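The scaling described above is simple arithmetic, and can be sketched in a few lines of Python. This is an illustrative sketch, not part of any Grove library: the function name `adc_reading` is made up, and note that the exact result for 1V on a 3.3V, 10-bit scale is 310, which the example above rounds to roughly 300.

```python
def adc_reading(voltage, reference=3.3, max_value=1023):
    """Scale an analog voltage to the 0-1023 range used by a 10-bit ADC."""
    return round(voltage / reference * max_value)

print(adc_reading(1.0))     # 310 - roughly the 300 used in the example above
print(format(300, '016b'))  # 0000000100101100 - 300 as a 16-bit binary value
```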
@@ -92,9 +92,9 @@ The simplest digital sensor is a button or switch. This is a sensor with two sta
Pins on IoT devices such as GPIO pins can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, it can only be 1 or 0.
-> 💁 Voltages are never exact especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8v as a 1, below 1.8v as 0.
+> 💁 Voltages are never exact, especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8V as a 1, and below 1.8V as a 0.
-* 3.3V goes into the button. The button is off so 0v comes out, giving a value of 0
+* 3.3V goes into the button. The button is off so 0V comes out, giving a value of 0.
* 3.3V goes into the button. The button is on so 3.3V comes out, giving a value of 1.
More advanced digital sensors read analog values, then convert them using on-board ADCs to digital signals. For example, a digital temperature sensor will still use a thermocouple in the same way as an analog sensor, and will still measure the change in voltage caused by the resistance of the thermocouple at the current temperature. Instead of returning an analog value and relying on the device or connector board to convert to a digital signal, an ADC built into the sensor will convert the value and send it as a series of 0s and 1s to the IoT device. These 0s and 1s are sent in the same way as the digital signal for a button, with 1 being full voltage and 0 being 0V.
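The thresholding described above can be modeled in a few lines of pure Python, with no hardware needed. The function name `read_digital` is illustrative, not a real GPIO API:

```python
def read_digital(voltage, threshold=1.8):
    """Model of how a 3.3V GPIO pin interprets a return voltage as 0 or 1."""
    return 1 if voltage > threshold else 0

print(read_digital(0.0))  # 0 - button off
print(read_digital(3.3))  # 1 - button on
```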
@@ -155,13 +155,13 @@ Another option for converting digital signals from an IoT device to an analog si
For example, you can use PWM to control the speed of a motor.
-imagine you are controlling a motor with a 5V supply. You send a short pulse to your motor, switching the voltage to high (5V) for two hundredths of a second (0.02s). In that time your motor can rotate one tenth of a rotation, or 36°. The signal then pauses for two hundredths of a second (0.02s), sending a low signal (0v). Each cycle of on then off lasts 0.04s. The cycle then repeats.
+Imagine you are controlling a motor with a 5V supply. You send a short pulse to your motor, switching the voltage to high (5V) for two hundredths of a second (0.02s). In that time your motor can rotate one tenth of a rotation, or 36°. The signal then pauses for two hundredths of a second (0.02s), sending a low signal (0V). Each cycle of on then off lasts 0.04s. The cycle then repeats.

***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
-This means in one second you have 25 5V pulses of 0.02s that rotate the motor, each followed by 0.02s pause of 0v not rotating the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 ([revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute), a non-standard measure of rotational velocity).
+This means that in one second you have twenty-five 5V pulses of 0.02s that rotate the motor, each followed by a 0.02s pause at 0V that doesn't rotate the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 [revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute) (RPM, a non-standard measure of rotational velocity).
```output
25 pulses per second x 0.1 rotations per pulse = 2.5 rotations per second
@@ -191,7 +191,7 @@ You can change the motor speed by changing the size of the pulses. For example,
Digital actuators, like digital sensors, either have two states controlled by a high or low voltage, or have a DAC built in so can convert a digital signal to an analog one.
-One simple digital actuator is an LED. When a device sends a digital signal of 1, a high voltage is sent that lights the LED. When a digital signal of 0 is sent, the voltage drops to 0v and the LED turns off.
+One simple digital actuator is an LED. When a device sends a digital signal of 1, a high voltage is sent that lights the LED. When a digital signal of 0 is sent, the voltage drops to 0V and the LED turns off.

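The PWM arithmetic above, including the 150 RPM figure, can be checked with a short Python calculation:

```python
# Worked numbers from the PWM motor example above.
pulse_on = 0.02            # seconds at 5V (high) per pulse
pulse_off = 0.02           # seconds at 0V (low) between pulses
rotation_per_pulse = 0.1   # one tenth of a rotation (36 degrees) per pulse

pulses_per_second = 1 / (pulse_on + pulse_off)
rotations_per_second = pulses_per_second * rotation_per_pulse
rpm = rotations_per_second * 60

print(pulses_per_second)     # 25.0
print(rotations_per_second)  # 2.5
print(rpm)                   # 150.0
```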
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py
index beb55666..0e1ccc45 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py
@@ -1,15 +1,15 @@
import time
-from grove.grove_light_sensor_v1_2 import GroveLightSensor
+import seeed_si114x
from grove.grove_led import GroveLed
-light_sensor = GroveLightSensor(0)
+light_sensor = seeed_si114x.grove_si114x()
led = GroveLed(5)
while True:
- light = light_sensor.light
+ light = light_sensor.ReadVisible
print('Light level:', light)
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/virtual-device/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/virtual-device/nightlight/app.py
index b7bf75b6..5aca9a1f 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/virtual-device/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/virtual-device/nightlight/app.py
@@ -12,7 +12,7 @@ while True:
light = light_sensor.light
print('Light level:', light)
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/app.py
index f33ca45e..d45b4dda 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/app.py
@@ -12,7 +12,7 @@ while True:
light = light_sensor.light
print('Light level:', light)
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/src/main.cpp b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/src/main.cpp
index ae2299f0..6f948db0 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/src/main.cpp
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/wio-terminal/nightlight/src/main.cpp
@@ -19,7 +19,7 @@ void loop()
Serial.print("Light value: ");
Serial.println(light);
- if (light < 200)
+ if (light < 300)
{
digitalWrite(D0, HIGH);
}
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py
index 54f58874..5d44a7a3 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py
@@ -1,10 +1,9 @@
import time
-from grove.grove_light_sensor_v1_2 import GroveLightSensor
+import seeed_si114x
-light_sensor = GroveLightSensor(0)
+light_sensor = seeed_si114x.grove_si114x()
while True:
- light = light_sensor.light
+ light = light_sensor.ReadVisible
print('Light level:', light)
-
time.sleep(1)
\ No newline at end of file
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/wio-terminal/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/wio-terminal/nightlight/app.py
index f33ca45e..d45b4dda 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/wio-terminal/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/wio-terminal/nightlight/app.py
@@ -12,7 +12,7 @@ while True:
light = light_sensor.light
print('Light level:', light)
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md b/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md
index 962aa010..adb8aec6 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md
@@ -12,7 +12,7 @@ The nightlight logic in pseudo-code is:
```output
Check the light level.
-If the light is less than 200
+If the light is less than 300
Turn the LED on
Otherwise
Turn the LED off
@@ -22,7 +22,7 @@ Otherwise
The Grove LED comes as a module with a selection of LEDs, allowing you to choose the color.
-#### Task
+#### Task - connect the LED
Connect the LED.
@@ -44,9 +44,9 @@ Connect the LED.
## Program the nightlight
-The nightlight can now be programmed using the Grove light sensor and the Grove LED.
+The nightlight can now be programmed using the Grove sunlight sensor and the Grove LED.
-### Task
+### Task - program the nightlight
Program the nightlight.
@@ -70,16 +70,18 @@ Program the nightlight.
The line `led = GroveLed(5)` creates an instance of the `GroveLed` class connecting to pin **D5** - the digital Grove pin that the LED is connected to.
+ > 💁 All the sockets have unique pin numbers. Pins 0, 2, 4, and 6 are analog pins, pins 5, 16, 18, 22, 24, and 26 are digital pins.
+
1. Add a check inside the `while` loop, and before the `time.sleep` to check the light levels and turn the LED on or off:
```python
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
```
- This code checks the `light` value. If this is less than 200 it calls the `on` method of the `GroveLed` class which sends a digital value of 1 to the LED, turning it on. If the light value is greater than or equal to 200 it calls the `off` method, sending a digital value of 0 to the LED, turning it off.
+ This code checks the `light` value. If this is less than 300 it calls the `on` method of the `GroveLed` class which sends a digital value of 1 to the LED, turning it on. If the light value is greater than or equal to 300 it calls the `off` method, sending a digital value of 0 to the LED, turning it off.
> 💁 This code should be indented to the same level as the `print('Light level:', light)` line to be inside the while loop!
@@ -103,7 +105,7 @@ Program the nightlight.
Light level: 290
```
-1. Cover and uncover the light sensor. Notice how the LED will light up if the light level is 200 or less, and turn off when the light level is greater than 200.
+1. Cover and uncover the sunlight sensor. Notice how the LED will light up if the light level is 300 or less, and turn off when the light level is greater than 300.
> 💁 If the LED doesn't turn on, make sure it is connected the right way round, and the spin button is set to full on.
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md b/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md
index 45775161..7c968115 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md
@@ -4,31 +4,33 @@ In this part of the lesson, you will add a light sensor to your Raspberry Pi.
## Hardware
-The sensor for this lesson is a **light sensor** that uses a [photodiode](https://wikipedia.org/wiki/Photodiode) to convert light to an electrical signal. This is an analog sensor that sends an integer value from 0 to 1,023 indicating a relative amount of light that doesn't map to any standard unit of measurement such as [lux](https://wikipedia.org/wiki/Lux).
+The sensor for this lesson is a **sunlight sensor** that uses [photodiodes](https://wikipedia.org/wiki/Photodiode) to convert visible and infrared light to an electrical signal. This is an analog sensor that sends an integer value from 0 to 1,023 indicating a relative amount of light, but this can be used to calculate exact values in [lux](https://wikipedia.org/wiki/Lux) by taking data from the separate infrared and visible light sensors.
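
The visible + infrared combination can, in principle, be converted into a lux estimate. The sketch below only illustrates the idea — the `0.22` and `0.46` coefficients are invented for this example, not the calibration constants from the sensor's datasheet:

```python
def estimate_lux(visible, infrared):
    """Illustrative lux estimate from raw visible and IR readings.

    The 0.22 and 0.46 coefficients are made up for this sketch -
    a real conversion uses calibration constants from the sensor datasheet.
    """
    # Subtract a scaled IR component from the visible reading, then scale to lux
    corrected = visible - 0.22 * infrared
    return max(0.0, corrected * 0.46)

print(estimate_lux(500, 200))  # prints 209.76
```
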
-The light sensor is an eternal Grove sensor and needs to be connected to the Grove Base hat on the Raspberry Pi.
+The sunlight sensor is an external Grove sensor and needs to be connected to the Grove Base hat on the Raspberry Pi.
-### Connect the light sensor
+### Connect the sunlight sensor
-The Grove light sensor that is used to detect the light levels needs to be connected to the Raspberry Pi.
+The Grove sunlight sensor that is used to detect the light levels needs to be connected to the Raspberry Pi.
-#### Task
+#### Task - connect the sunlight sensor
-Connect the light sensor
+Connect the sunlight sensor
-
+
-1. Insert one end of a Grove cable into the socket on the light sensor module. It will only go in one way round.
+1. Insert one end of a Grove cable into the socket on the sunlight sensor module. It will only go in one way round.
-1. With the Raspberry Pi powered off, connect the other end of the Grove cable to the analog socket marked **A0** on the Grove Base hat attached to the Pi. This socket is the second from the right, on the row of sockets next to the GPIO pins.
+1. With the Raspberry Pi powered off, connect the other end of the Grove cable to one of the three I2C sockets marked **I2C** on the Grove Base hat attached to the Pi. These sockets are on the row of sockets next to the GPIO pins.
-
+ > 💁 I2C is a way for sensors and actuators to communicate with an IoT device. It will be covered in more detail in a later lesson.
-## Program the light sensor
+
-The device can now be programmed using the Grove light sensor.
+## Program the sunlight sensor
-### Task
+The device can now be programmed using the Grove sunlight sensor.
+
+### Task - program the sunlight sensor
Program the device.
@@ -36,38 +38,44 @@ Program the device.
1. Open the nightlight project in VS Code that you created in the previous part of this assignment, either running directly on the Pi or connected using the Remote SSH extension.
+1. Run the following command to install a pip package for working with the sunlight sensor:
+
+ ```sh
+ pip3 install seeed-python-si114x
+ ```
+
+ Not all the libraries for the Grove sensors are installed with the Grove install script you used in an earlier lesson. Some need additional packages.
+
1. Open the `app.py` file and remove all code from it
1. Add the following code to the `app.py` file to import some required libraries:
```python
import time
- from grove.grove_light_sensor_v1_2 import GroveLightSensor
+ import seeed_si114x
```
The `import time` statement imports the `time` module that will be used later in this assignment.
- The `from grove.grove_light_sensor_v1_2 import GroveLightSensor` statement imports the `GroveLightSensor` from the Grove Python libraries. This library has code to interact with a Grove light sensor, and was installed globally during the Pi setup.
+ The `import seeed_si114x` statement imports the `seeed_si114x` module that has code to interact with the Grove sunlight sensor.
1. Add the following code after the code above to create an instance of the class that manages the light sensor:
```python
- light_sensor = GroveLightSensor(0)
+ light_sensor = seeed_si114x.grove_si114x()
```
- The line `light_sensor = GroveLightSensor(0)` creates an instance of the `GroveLightSensor` class connecting to pin **A0** - the analog Grove pin that the light sensor is connected to.
-
- > 💁 All the sockets have unique pin numbers. Pins 0, 2, 4, and 6 are analog pins, pins 5, 16, 18, 22, 24, and 26 are digital pins.
+ The line `light_sensor = seeed_si114x.grove_si114x()` creates an instance of the `grove_si114x` sunlight sensor class.
1. Add an infinite loop after the code above to poll the light sensor value and print it to the console:
```python
while True:
- light = light_sensor.light
+ light = light_sensor.ReadVisible
print('Light level:', light)
```
- This will read the current light level on a scale of 0-1,023 using the `light` property of the `GroveLightSensor` class. This property reads the analog value from the pin. This value is then printed to the console.
+ This will read the current sunlight level on a scale of 0-1,023 using the `ReadVisible` property of the `grove_si114x` class. This value is then printed to the console.
1. Add a small sleep of one second at the end of the `loop` as the light levels don't need to be checked continuously. A sleep reduces the power consumption of the device.
@@ -81,16 +89,16 @@ Program the device.
python3 app.py
```
- You should see light values being output to the console. Cover and uncover the light sensor to see the values change:
+ You should see sunlight values being output to the console. Cover and uncover the sunlight sensor to see the values change:
```output
pi@raspberrypi:~/nightlight $ python3 app.py
- Light level: 634
- Light level: 634
- Light level: 634
- Light level: 230
- Light level: 104
- Light level: 290
+ Light level: 259
+ Light level: 265
+ Light level: 265
+ Light level: 584
+ Light level: 550
+ Light level: 497
```
> 💁 You can find this code in the [code-sensor/pi](code-sensor/pi) folder.
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md
index 403655f8..74895af1 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md
@@ -12,17 +12,17 @@ The nightlight logic in pseudo-code is:
```output
Check the light level.
-If the light is less than 200
+If the light is less than 300
Turn the LED on
Otherwise
Turn the LED off
```
-### Add the sensors to CounterFit
+### Add the actuator to CounterFit
To use a virtual LED, you need to add it to the CounterFit app
-#### Task
+#### Task - add the actuator to CounterFit
Add the LED to the CounterFit app.
@@ -48,7 +48,7 @@ Add the LED to the CounterFit app.
The nightlight can now be programmed using the CounterFit light sensor and LED.
-#### Task
+#### Task - program the nightlight
Program the nightlight.
@@ -75,13 +75,13 @@ Program the nightlight.
1. Add a check inside the `while` loop, and before the `time.sleep` to check the light levels and turn the LED on or off:
```python
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
```
- This code checks the `light` value. If this is less than 200 it calls the `on` method of the `GroveLed` class which sends a digital value of 1 to the LED, turning it on. If the light value is greater than or equal to 200 it calls the `off` method, sending a digital value of 0 to the LED, turning it off.
+ This code checks the `light` value. If this is less than 300 it calls the `on` method of the `GroveLed` class which sends a digital value of 1 to the LED, turning it on. If the light value is greater than or equal to 300 it calls the `off` method, sending a digital value of 0 to the LED, turning it off.
> 💁 This code should be indented to the same level as the `print('Light level:', light)` line to be inside the while loop!
@@ -101,7 +101,7 @@ Program the nightlight.
Light level: 253
```
-1. Change the *Value* or the *Random* settings to vary the light level above and below 200. You will see the LED turn on and off.
+1. Change the *Value* or the *Random* settings to vary the light level above and below 300. You will see the LED turn on and off.

diff --git a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md
index 2f740bfe..e6d961de 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md
@@ -12,7 +12,7 @@ The sensor is a **light sensor**. In a physical IoT device, it would be a [photo
To use a virtual light sensor, you need to add it to the CounterFit app
-#### Task
+#### Task - add the sensors to CounterFit
Add the light sensor to the CounterFit app.
@@ -38,11 +38,10 @@ Add the light sensor to the CounterFit app.
The device can now be programmed to use the built in light sensor.
-### Task
+### Task - program the light sensor
Program the device.
-
1. Open the nightlight project in VS Code that you created in the previous part of this assignment. Kill and re-create the terminal to ensure it is running using the virtual environment if necessary.
1. Open the `app.py` file
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md b/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md
index d4dfd1db..513bc9f4 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md
@@ -12,7 +12,7 @@ The nightlight logic in pseudo-code is:
```output
Check the light level.
-If the light is less than 200
+If the light is less than 300
Turn the LED on
Otherwise
Turn the LED off
@@ -22,7 +22,7 @@ Otherwise
The Grove LED comes as a module with a selection of LEDs, allowing you to choose the color.
-#### Task
+#### Task - connect the LED
Connect the LED.
@@ -48,7 +48,7 @@ Connect the LED.
The nightlight can now be programmed using the built in light sensor and the Grove LED.
-### Task
+### Task - program the nightlight
Program the nightlight.
@@ -67,7 +67,7 @@ Program the nightlight.
1. Add the following code immediately before the `delay` in the loop function:
```cpp
- if (light < 200)
+ if (light < 300)
{
digitalWrite(D0, HIGH);
}
@@ -77,7 +77,7 @@ Program the nightlight.
}
```
- This code checks the `light` value. If this is less than 200 it sends a `HIGH` value to the `D0` digital pin. This `HIGH` is a value of 1, turning on the LED. If the light is greater than or equal to 200, a `LOW` value of 0 is sent to the pin, turning the LED off.
+ This code checks the `light` value. If this is less than 300 it sends a `HIGH` value to the `D0` digital pin. This `HIGH` is a value of 1, turning on the LED. If the light is greater than or equal to 300, a `LOW` value of 0 is sent to the pin, turning the LED off.
> 💁 When sending digital values to actuators, a LOW value is 0v, and a HIGH value is the max voltage for the device. For the Wio Terminal, the HIGH voltage is 3.3V.
@@ -101,7 +101,7 @@ Program the nightlight.
Light value: 344
```
-1. Cover and uncover the light sensor. Notice how the LED will light up if the light level is 200 or less, and turn off when the light level is greater than 200.
+1. Cover and uncover the light sensor. Notice how the LED will light up if the light level is 300 or less, and turn off when the light level is greater than 300.

diff --git a/1-getting-started/lessons/4-connect-internet/README.md b/1-getting-started/lessons/4-connect-internet/README.md
index 80d9e811..e9d776aa 100644
--- a/1-getting-started/lessons/4-connect-internet/README.md
+++ b/1-getting-started/lessons/4-connect-internet/README.md
@@ -27,7 +27,7 @@ In this lesson we'll cover:
## Communication protocols
-There are a number of popular communication protocols used by IoT devices to communicate with the Internet. The most popular are based around publish/subscribe messaging via some kind of broker. The IoT devices connect to the broker and publish telemetry and subscribe to commands, the cloud services also connect to the broker and subscribe to all the telemetry messages and publishes commands either to specific devices, or to groups of devices.
+There are a number of popular communication protocols used by IoT devices to communicate with the Internet. The most popular are based around publish/subscribe messaging via some kind of broker. The IoT devices connect to the broker and publish telemetry and subscribe to commands, the cloud services also connect to the broker and subscribe to all the telemetry messages and publish commands either to specific devices, or to groups of devices.

@@ -39,7 +39,7 @@ MQTT is the most popular, and is covered in this lesson. Others include AMQP and
[MQTT](http://mqtt.org) is a lightweight, open standard messaging protocol that can send messages between devices. It was designed in 1999 to monitor oil pipelines, before being released as an open standard 15 years later by IBM.
-MQTT has a single broker and multiple clients. All clients connect to the broker, and the broker routes messages to the relevant clients. Messages are routed using named topics, rather than being sent direct to an individual client. A client can publish to a topic, and any clients that subscribe to that topic will receive the message.
+MQTT has a single broker and multiple clients. All clients connect to the broker, and the broker routes messages to the relevant clients. Messages are routed using named topics, rather than being sent directly to an individual client. A client can publish to a topic, and any clients that subscribe to that topic will receive the message.

@@ -49,19 +49,19 @@ MQTT has a single broker and multiple clients. All clients connect to the broker
### Connect your IoT device to MQTT
-The first part in adding Internet control to your nightlight is connecting it to an MQTT broker.
+The first part of adding Internet control to your nightlight is connecting it to an MQTT broker.
#### Task
Connect your device to an MQTT broker.
-In this part of the lesson, you will connect your IoT nightlight to the Internet to allow it to be remotely controlled. Later in this lesson your IoT device will send a telemetry message over MQTT to a public MQTT broker with the light level, where it will be picked up by some server code that you will write. This code will check the light level and send a command message back to the device telling it to turn the LED on or off.
+In this part of the lesson, you will connect your IoT nightlight to the Internet to allow it to be remotely controlled. Later in this lesson, your IoT device will send a telemetry message over MQTT to a public MQTT broker with the light level, where it will be picked up by some server code that you will write. This code will check the light level and send a command message back to the device telling it to turn the LED on or off.
-The real-world use case for such a setup could be to gather data from multiple light sensors before deciding to turn on lights, in a location that has a lot of lights, such as a stadium. This could stop the lights being turned on if only one sensor was covered by cloud or a bird, but the other sensors detected enough light.
+The real-world use case for such a setup could be to gather data from multiple light sensors before deciding to turn on lights, in a location that has a lot of lights, such as a stadium. This could stop the lights from being turned on if only one sensor was covered by cloud or a bird, but the other sensors detected enough light.
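
The stadium scenario above — only switching the lights on when most sensors agree it is dark — can be sketched as a simple voting rule. The threshold of 300 matches the nightlight code in earlier lessons; the majority rule is an assumption chosen for the example:

```python
def lights_should_turn_on(light_levels, threshold=300):
    """Turn lights on only if a majority of sensors read below the threshold.

    This stops a single covered sensor (cloud, bird) from triggering the lights.
    """
    dark_votes = sum(1 for level in light_levels if level < threshold)
    return dark_votes > len(light_levels) / 2

# One shaded sensor out of four is not enough to turn the lights on
print(lights_should_turn_on([450, 420, 120, 480]))  # prints False
```
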
✅ What other situations would require data from multiple sensors to be evaluated before sending commands?
-Rather than dealing with the complexities of setting up an MQTT broker as part of this assignment, you can use a public test server that runs [Eclipse Mosquitto](https://www.mosquitto.org), an open-source MQTT broker. This test broker is publicly available at [test.mosquitto.org](https://test.mosquitto.org), and doesn't require an accounts to be set up, making it a great tool for testing MQTT clients and servers.
+Rather than dealing with the complexities of setting up an MQTT broker as part of this assignment, you can use a public test server that runs [Eclipse Mosquitto](https://www.mosquitto.org), an open-source MQTT broker. This test broker is publicly available at [test.mosquitto.org](https://test.mosquitto.org), and doesn't require an account to be set up, making it a great tool for testing MQTT clients and servers.
> 💁 This test broker is public and not secure. Anyone could be listening to what you publish, so should not be used with any data that needs to be kept private
@@ -76,7 +76,7 @@ Follow the relevant step below to connect your device to the MQTT broker:
### A deeper dive into MQTT
-Topics can have a hierarchy, and clients can subscribe to different levels of the hierarchy using wildcards. For example you can send temperature telemetry messages to `/telemetry/temperature` and humidity messages to `/telemetry/humidity`, then in your cloud app subscribe to `/telemetry/*` to receive both the temperature and humidity telemetry messages.
+Topics can have a hierarchy, and clients can subscribe to different levels of the hierarchy using wildcards. For example, you can send temperature telemetry messages to `/telemetry/temperature` and humidity messages to `/telemetry/humidity`, then in your cloud app subscribe to `/telemetry/+` (the MQTT single-level wildcard) to receive both the temperature and humidity telemetry messages.
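
MQTT's wildcard characters are `+` for a single topic level and `#` for that level and everything below it. A simplified sketch of how a broker matches a topic filter against a published topic (the real rules in the MQTT specification have more edge cases, for example around `$`-prefixed topics):

```python
def topic_matches(filter_topic, topic):
    """Simplified MQTT topic filter matching.

    '+' matches exactly one topic level, '#' matches all remaining levels.
    """
    filter_levels = filter_topic.split('/')
    topic_levels = topic.split('/')
    for i, part in enumerate(filter_levels):
        if part == '#':
            return True  # matches this level and everything below it
        if i >= len(topic_levels):
            return False
        if part != '+' and part != topic_levels[i]:
            return False
    return len(filter_levels) == len(topic_levels)

print(topic_matches('/telemetry/+', '/telemetry/temperature'))  # prints True
print(topic_matches('/telemetry/+', '/telemetry/humidity'))     # prints True
print(topic_matches('/telemetry/+', '/commands/led'))           # prints False
```
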
Messages can be sent with a quality of service (QoS), which determines the guarantees of the message being received.
@@ -327,9 +327,9 @@ Write the server code.
One important consideration with telemetry is how often to measure and send the data? The answer is - it depends. If you measure often you can respond faster to changes in measurements, but you use more power, more bandwidth, generate more data and need more cloud resources to process. You need to measure often enough, but not too often.
-For a thermostat, measuring every few minutes is probably more than enough as temperatures don't change that often. If you only measure once a day then you could end up heating your house for nighttime temperatures in the middle of a sunny day, whereas if you measure every second you will have thousands of unnecessary duplicated temperature measurements that will eat into the users Internet speed and bandwidth (a problem for people with limited bandwidth plans), use more power which can be a problem for battery powered devices like remote sensors, and increase the cost of the providers cloud computing resources processing and storing them.
+For a thermostat, measuring every few minutes is probably more than enough as temperatures don't change that often. If you only measure once a day then you could end up heating your house for nighttime temperatures in the middle of a sunny day, whereas if you measure every second you will have thousands of unnecessarily duplicated temperature measurements that will eat into the users' Internet speed and bandwidth (a problem for people with limited bandwidth plans), use more power which can be a problem for battery powered devices like remote sensors, and increase the cost of the provider's cloud computing resources for processing and storing them.
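
The trade-off above is easy to put rough numbers on. A back-of-the-envelope sketch, assuming a 100-byte payload per telemetry message (the payload size is an assumption for the example):

```python
MESSAGE_BYTES = 100  # assumed payload size per telemetry message
SECONDS_PER_DAY = 24 * 60 * 60

def daily_messages(interval_seconds):
    """Number of telemetry messages sent per day at a given interval."""
    return SECONDS_PER_DAY // interval_seconds

per_second = daily_messages(1)       # once per second: 86,400 messages/day
per_5_min = daily_messages(5 * 60)   # once every 5 minutes: 288 messages/day

print(per_second, per_second * MESSAGE_BYTES / 1_000_000, 'MB/day')  # 86400 8.64 MB/day
print(per_5_min, per_5_min * MESSAGE_BYTES / 1_000, 'KB/day')        # 288 28.8 KB/day
```
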
-If you are monitoring a data around a piece of machinery in a factory that if it fails could cause catastrophic damage and millions of dollars in lost revenue, thn measuring multiple times a second might be necessary. It's better to waste bandwidth than miss telemetry that indicates that a machine needs to be stopped and fixed before it breaks.
+If you are monitoring data from a piece of machinery in a factory that could cause catastrophic damage and millions of dollars in lost revenue if it fails, then measuring multiple times a second might be necessary. It's better to waste bandwidth than miss telemetry that indicates that a machine needs to be stopped and fixed before it breaks.
> 💁 In this situation, you might consider having an edge device to process the telemetry first to reduce reliance on the Internet.
@@ -339,7 +339,7 @@ Internet connections can be unreliable, with outages common. What should an IoT
For a thermostat the data can probably be lost as soon as a new temperature measurement has been taken. The heating system doesn't care that 20 minutes ago it was 20.5°C if the temperature is now 19°C, it's the temperature now that determines if the heating should be on or off.
-For machinery you might want to keep the data, especially if it is used to look for trends. There are machine learning models that can detect anomalies in streams of data by looking over data from defined period of time (such as the last hour) and spotting anomalous data. This is often used for predictive maintenance, looking for indications that something might break soon so you can repair or replace before it happens. You might want every bit of telemetry for a machine sent so it can be processed for anomaly detection, so once the IoT device can reconnect it will send all the telemetry generated during the Internet outage.
+For machinery you might want to keep the data, especially if it is used to look for trends. There are machine learning models that can detect anomalies in streams of data by looking over data from a defined period of time (such as the last hour) and spotting anomalous data. This is often used for predictive maintenance, looking for indications that something might break soon so you can repair or replace it before that happens. You might want every bit of telemetry for a machine sent so it can be processed for anomaly detection, so once the IoT device can reconnect it will send all the telemetry generated during the Internet outage.
IoT device designers should also consider if the IoT device can be used during an Internet outage or loss of signal caused by location. A smart thermostat should be able to make some limited decisions to control heating if it can't send telemetry to the cloud due to an outage.
@@ -372,13 +372,13 @@ The next step for our Internet controlled nightlight is for the server code to s
1. Add the following code to the end of the `handle_telemetry` function:
```python
- command = { 'led_on' : payload['light'] < 200 }
+ command = { 'led_on' : payload['light'] < 300 }
print("Sending message:", command)
client.publish(server_command_topic, json.dumps(command))
```
- This sends a JSON message to the command topic with the value of `led_on` set to true or false depending on if the light is less than 200 or not. If the light is less than 200, true is sent to instruct the device to turn the LED on.
+ This sends a JSON message to the command topic with the value of `led_on` set to true or false depending on if the light is less than 300 or not. If the light is less than 300, true is sent to instruct the device to turn the LED on.
1. Run the code as before
@@ -421,7 +421,7 @@ If the commands need to be processed in sequence, such as move a robot arm up, t
## 🚀 Challenge
-The challenge in the last three lessons was to list as many IoT devices as you can that are in your home, school or workplace and decide if they are built around microcontrollers or single-board computers, or even a mixture of both, and thing about what sensors and actuators they are using.
+The challenge in the last three lessons was to list as many IoT devices as you can that are in your home, school or workplace and decide if they are built around microcontrollers or single-board computers, or even a mixture of both, and think about what sensors and actuators they are using.
For these devices, think about what messages they might be sending or receiving. What telemetry do they send? What messages or commands might they receive? Do you think they are secure?
diff --git a/1-getting-started/lessons/4-connect-internet/code-commands/server/app.py b/1-getting-started/lessons/4-connect-internet/code-commands/server/app.py
index 01b6a0f2..41097692 100644
--- a/1-getting-started/lessons/4-connect-internet/code-commands/server/app.py
+++ b/1-getting-started/lessons/4-connect-internet/code-commands/server/app.py
@@ -18,7 +18,7 @@ def handle_telemetry(client, userdata, message):
payload = json.loads(message.payload.decode())
print("Message received:", payload)
- command = { 'led_on' : payload['light'] < 200 }
+ command = { 'led_on' : payload['light'] < 300 }
print("Sending message:", command)
client.publish(server_command_topic, json.dumps(command))
diff --git a/1-getting-started/lessons/4-connect-internet/code-mqtt/pi/nightlight/app.py b/1-getting-started/lessons/4-connect-internet/code-mqtt/pi/nightlight/app.py
index 5186de17..cadcd7ae 100644
--- a/1-getting-started/lessons/4-connect-internet/code-mqtt/pi/nightlight/app.py
+++ b/1-getting-started/lessons/4-connect-internet/code-mqtt/pi/nightlight/app.py
@@ -21,7 +21,7 @@ while True:
light = light_sensor.light
print('Light level:', light)
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
diff --git a/1-getting-started/lessons/4-connect-internet/code-mqtt/virtual-device/nightlight/app.py b/1-getting-started/lessons/4-connect-internet/code-mqtt/virtual-device/nightlight/app.py
index 6c68b5de..75dad840 100644
--- a/1-getting-started/lessons/4-connect-internet/code-mqtt/virtual-device/nightlight/app.py
+++ b/1-getting-started/lessons/4-connect-internet/code-mqtt/virtual-device/nightlight/app.py
@@ -24,7 +24,7 @@ while True:
light = light_sensor.light
print('Light level:', light)
- if light < 200:
+ if light < 300:
led.on()
else:
led.off()
diff --git a/3-transport/README.md b/3-transport/README.md
index 40d8b976..061f44d1 100644
--- a/3-transport/README.md
+++ b/3-transport/README.md
@@ -10,7 +10,7 @@ IoT can help with this supply chain by tracking the food in transit - ensuring d
In these 4 lessons, you'll learn how to apply the Internet of Things to improve the supply chain by monitoring food as it is loaded onto a (virtual) truck, which is tracked as it moves to its destination. You will learn about GPS tracking, how to store and visualize GPS data, and how to be alerted when a truck arrives at its destination.
-> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you follow the [Clean up your project](lessons/4-keep-your-plant-secure/README.md#clean-up-your-project) step in [lesson 4](lessons/6-keep-your-plant-secure/README.md).
+> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
## Topics
diff --git a/4-manufacturing/README.md b/4-manufacturing/README.md
new file mode 100644
index 00000000..3d285074
--- /dev/null
+++ b/4-manufacturing/README.md
@@ -0,0 +1,24 @@
+# Manufacturing and processing - using IoT to improve the processing of food
+
+Once food reaches a central hub or processing plant, it isn't always just shipped out to supermarkets. A lot of the time the food goes through a number of processing steps, such as sorting by quality. This is a process that used to be manual - it would start in the field when pickers would only pick ripe fruit, then at the factory the fruit would ride a conveyor belt and employees would manually remove any bruised or rotten fruit. Having picked and sorted strawberries myself as a summer job during school, I can testify that this isn't a fun job.
+
+More modern setups rely on IoT for sorting. Some of the earliest devices like the sorters from [Weco](https://wecotek.com) use optical sensors to detect the quality of produce, rejecting green tomatoes for example. These can be deployed in harvesters on the farm itself, or in processing plants.
+
+As advances happen in Artificial Intelligence (AI) and Machine Learning (ML), these machines can become more advanced, using ML models trained to distinguish between fruit and foreign objects such as rocks, dirt or insects. These models can also be trained to detect fruit quality, not just bruised fruit but early detection of disease or other crop problems.
+
+> 🎓 The term *ML model* refers to the output of training machine learning software on a set of data. For example, you can train an ML model to distinguish between ripe and unripe tomatoes, then use the model on new images to see if the tomatoes are ripe or not.
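
To make the "ML model" idea concrete, here is a deliberately tiny classifier - a nearest-centroid model trained on made-up average-color values for ripe (red) and unripe (green) tomatoes. It is a sketch of the train-then-predict concept only, not the Custom Vision service used in these lessons:

```python
def train(samples):
    """Compute one centroid (mean RGB) per label from (rgb, label) pairs."""
    sums, counts = {}, {}
    for rgb, label in samples:
        s = sums.setdefault(label, [0, 0, 0])
        for i in range(3):
            s[i] += rgb[i]
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in s) for label, s in sums.items()}

def classify(model, rgb):
    """Predict the label whose centroid is closest to the given RGB value."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, rgb))
    return min(model, key=lambda label: dist(model[label]))

# Made-up training data: average image colors for ripe and unripe tomatoes
training_data = [
    ((200, 40, 30), 'ripe'), ((180, 60, 40), 'ripe'),
    ((80, 160, 60), 'unripe'), ((70, 180, 50), 'unripe'),
]
model = train(training_data)
print(classify(model, (190, 50, 35)))  # a new, red-ish image: prints 'ripe'
```

Real image classifiers work on far richer features than a single average color, but the workflow - train on labeled examples, then classify new inputs - is the same.
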
+
+In these 4 lessons you'll learn how to train image-based AI models to detect fruit quality, how to use these from an IoT device, and how to run these on the edge - that is on an IoT device rather than in the cloud.
+
+> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
+
+## Topics
+
+1. [Train a fruit quality detector](./lessons/1-train-fruit-detector/README.md)
+1. [Check fruit quality from an IoT device](./lessons/2-check-fruit-from-device/README.md)
+1. [Run your fruit detector on the edge](./lessons/3-run-fruit-detector-edge/README.md)
+1. [Trigger fruit quality detection from a sensor](./lessons/4-trigger-fruit-detector/README.md)
+
+## Credits
+
+All the lessons were written with ♥️ by [Jim Bennett](https://GitHub.com/JimBobBennett)
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/README.md b/4-manufacturing/lessons/1-train-fruit-detector/README.md
new file mode 100644
index 00000000..2ad0fcd9
--- /dev/null
+++ b/4-manufacturing/lessons/1-train-fruit-detector/README.md
@@ -0,0 +1,211 @@
+# Train a fruit quality detector
+
+Add a sketchnote if possible/appropriate
+
+This video gives an overview of the Azure Custom Vision service, a service that will be covered in this lesson.
+
+[](https://www.youtube.com/watch?v=TETcDLJlWR4)
+
+## Pre-lecture quiz
+
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/29)
+
+## Introduction
+
+The recent rise in Artificial Intelligence (AI) and Machine Learning (ML) is providing a wide range of capabilities to today's developers. ML models can be trained to recognize different things in images, including unripe fruit, and this can be used in IoT devices to help sort produce either as it is being harvested, or during processing in factories or warehouses.
+
+In this lesson you will learn about image classification - using ML models to distinguish between images of different things. You will learn how to train an image classifier to distinguish between fruit that is good and fruit that is bad, whether under- or over-ripe, bruised, or rotten.
+
+In this lesson we'll cover:
+
+* [Using AI and ML to sort food](#using-ai-and-ml-to-sort-food)
+* [Image classification via Machine Learning](#image-classification-via-machine-learning)
+* [Train an image classifier](#train-an-image-classifier)
+* [Test your image classifier](#test-your-image-classifier)
+
+## Using AI and ML to sort food
+
+Feeding the global population is hard, especially at a price that makes food affordable for all. One of the largest costs is labor, so farmers are increasingly turning to automation and tools like IoT to reduce their labor costs. Harvesting by hand is labor intensive (and often backbreaking work), and is being replaced by machinery, especially in richer nations. Despite the savings in cost of using machinery to harvest, there is a downside - the ability to sort food as it is being harvested.
+
+Not all crops ripen evenly. Tomatoes, for example, can still have some green fruits on the vine when the majority is ready for harvest. Although it is a waste to harvest these early, it is cheaper and easier for the farmer to harvest everything using machinery and dispose of the unripe produce later.
+
+✅ Have a look at different fruits or vegetables, either growing near you in farms or in your garden, or in shops. Are they all at the same ripeness, or do you see variation?
+
+The rise of automated harvesting moved the sorting of produce from the harvest to the factory. Food would travel on long conveyor belts with teams of people picking over the produce, removing anything that wasn't up to the required quality standard. Harvesting was cheaper thanks to machinery, but there was still a cost to manually sort food.
+
+
+
+***If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever. tomato by parkjisun - from the [Noun Project](https://thenounproject.com)***
+
+The next evolution was to use machines to sort, either built into the harvester, or in the processing plants. The first generation of these machines used optical sensors to detect colors, controlling actuators to push green tomatoes into a waste bin using levers or puffs of air, leaving red tomatoes to continue on a network of conveyor belts.
+
+The video below shows one of these machines in action.
+
+[](https://www.youtube.com/watch?v=AcRL91DouAU)
+
+In this video, as tomatoes fall from one conveyor belt to another, green tomatoes are detected and flicked into a bin using levers.
+
+✅ What conditions would you need in a factory or in a field for these optical sensors to work correctly?
+
+The latest evolutions of these sorting machines take advantage of AI and ML, using models trained to distinguish good produce from bad, not just by obvious color differences such as green tomatoes vs red, but by more subtle differences in appearance that can indicate disease or bruising.
+
+## Image classification via Machine Learning
+
+Traditional programming is where you take data, apply an algorithm to the data, and get output. For example, in the last project you took GPS coordinates and a geofence, applied an algorithm provided by Azure Maps, and got back a result indicating whether the point was inside or outside the geofence. You input more data, you get more output.
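As a concrete illustration of the traditional approach, a point-in-geofence check can be written as a fixed algorithm. The sketch below is a minimal stand-in, assuming a circular geofence defined by a center and radius - it is not the actual Azure Maps implementation:

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in meters

def inside_geofence(point, center, radius_m):
    """Traditional programming: data in (two coordinates), a fixed algorithm, an answer out."""
    # Haversine formula for the great-circle distance between two (lat, lon) pairs in degrees
    lat1, lon1 = map(math.radians, point)
    lat2, lon2 = map(math.radians, center)
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    distance_m = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return distance_m <= radius_m

# A point roughly 1.1km north of the center is inside a 2km geofence
print(inside_geofence((47.01, -122.0), (47.0, -122.0), 2000))
```

The algorithm never changes: give it different coordinates and it applies the same fixed steps to each one.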
+
+
+
+Machine learning turns this around - you start with data and known outputs, and the machine learning tools work out the algorithm. You can then take that algorithm, called a *machine learning model*, and input new data and get new output.
+
+> 🎓 The process of a machine learning tool generating a model is called *training*. The inputs and known outputs are called *training data*.
+
+For example, you could give a model millions of pictures of unripe bananas as input training data, with the training output set as `unripe`, and millions of ripe banana pictures as training data with the output set as `ripe`. The ML tools will then generate a model. You then give this model a new picture of a banana and it will predict if the new picture is a ripe or an unripe banana.
+
+> 🎓 The results of ML models are called *predictions*
+
+
+
+ML models don't give a binary answer, instead they give probabilities. For example, a model may be given a picture of a banana and predict `ripe` at 99.7% and `unripe` at 0.3%. Your code would then pick the best prediction and decide the banana is ripe.
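Code consuming these predictions typically picks the label with the highest probability. A minimal sketch of that step (the dictionary of probabilities here is hypothetical model output, not a real Custom Vision response):

```python
# Hypothetical model output: label -> probability
predictions = {"ripe": 0.997, "unripe": 0.003}

def best_prediction(predictions):
    # Pick the (label, probability) pair with the highest probability
    return max(predictions.items(), key=lambda item: item[1])

label, probability = best_prediction(predictions)
print(f"The banana is {label} ({probability:.1%} confidence)")
```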
+
+The ML model used to detect images like this is called an *image classifier* - it is given labelled images, and then classifies new images based on those labels.
+
+## Train an image classifier
+
+To successfully train an image classifier you need millions of images. As it turns out, once you have an image classifier trained on millions or billions of assorted images, you can re-use it and re-train it using a small set of images and get great results, using a process called *transfer learning*.
+
+> 🎓 Transfer learning is where you transfer the learning from an existing ML model to a new model based on new data.
+
+Once an image classifier has been trained on a wide variety of images, its internals are great at recognizing shapes, colors and patterns. Transfer learning allows the model to take what it has already learned about recognizing image parts, and use that to recognize new images.
+
+
+
+You can think of this as a bit like children's shape books, where once you can recognize a semi-circle, a rectangle and a triangle, you can recognize a sailboat or a cat depending on the configuration of these shapes. The image classifier can recognize the shapes, and the transfer learning teaches it what combination makes a boat or a cat - or a ripe banana.
+
+There are a wide range of tools that can help you do this, including cloud-based services that can help you train your model, then use it via web APIs.
+
+> 💁 Training these models takes a lot of computer power, usually via Graphics Processing Units, or GPUs. The same specialized hardware that makes games on your Xbox look amazing can also be used to train machine learning models. By using the cloud you can rent time on powerful computers with GPUs to train these models, getting access to the computing power you need, just for the time you need it.
+
+## Custom Vision
+
+Custom Vision is a cloud-based tool for training image classifiers. It allows you to train a classifier using only a small number of images. You can upload images through a web portal, web API or an SDK, giving each image a *tag* that has the classification of that image. You then train the model, and test it out to see how well it performs. Once you are happy with the model, you can publish versions of it that can be accessed through a web API or an SDK.
+
+> 💁 You can train a custom vision model with as few as 5 images per classification, but more is better. You can get better results with at least 30 images.
+
+Custom Vision is part of a range of AI tools from Microsoft called Cognitive Services. These are AI tools that can be used either without any training, or with a small amount of training. They include speech recognition and translation, language understanding and image analysis. These are available with a free tier as services in Azure.
+
+> 💁 The free tier is more than enough to create a model, train it, then use it for development work. You can read about the limits of the free tier on the [Custom Vision Limits and quotas page on Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/limits-and-quotas?WT.mc_id=academic-17441-jabenn).
+
+### Task - create a cognitive services resource
+
+To use Custom Vision, you first need to create two cognitive services resources in Azure using the Azure CLI, one for Custom Vision training and one for Custom Vision prediction.
+
+1. Create a Resource Group for this project called `fruit-quality-detector`
+
+1. Use the following command to create a free Custom Vision training resource:
+
+ ```sh
+ az cognitiveservices account create --name fruit-quality-detector-training \
+ --resource-group fruit-quality-detector \
+ --kind CustomVision.Training \
+ --sku F0 \
+ --yes \
+ --location <location>
+ ```
+
+ Replace `<location>` with the location you used when creating the Resource Group.
+
+ This will create a Custom Vision training resource in your Resource Group. It will be called `fruit-quality-detector-training` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services.
+
+1. Use the following command to create a free Custom Vision prediction resource:
+
+ ```sh
+ az cognitiveservices account create --name fruit-quality-detector-prediction \
+ --resource-group fruit-quality-detector \
+ --kind CustomVision.Prediction \
+ --sku F0 \
+ --yes \
+ --location <location>
+ ```
+
+ Replace `<location>` with the location you used when creating the Resource Group.
+
+ This will create a Custom Vision prediction resource in your Resource Group. It will be called `fruit-quality-detector-prediction` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services.
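If you later need the keys or endpoint for these resources (for example, when calling them from an SDK), they can be retrieved with the Azure CLI. These commands are optional at this point - the portal steps below will show the resources directly. The resource names match those created above:

```shell
# List the API keys for the training resource
az cognitiveservices account keys list \
    --name fruit-quality-detector-training \
    --resource-group fruit-quality-detector

# Show the endpoint URL for the training resource
az cognitiveservices account show \
    --name fruit-quality-detector-training \
    --resource-group fruit-quality-detector \
    --query properties.endpoint \
    --output tsv
```

The same commands work for the prediction resource by swapping in `fruit-quality-detector-prediction`.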
+
+### Task - create an image classifier project
+
+1. Follow the [Create a new Project section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#create-a-new-project) to create a new Custom Vision project. The UI may change and these docs are always the most up to date reference.
+
+ Call your project `fruit-quality-detector`.
+
+ When you create your project, make sure to use the `fruit-quality-detector-training` resource you created earlier. Use a *Classification* project type, a *Multiclass* classification type, and the *Food* domain.
+
+ 
+
+### Task - train your image classifier project
+
+To train an image classifier, you will need multiple pictures of fruit, both good and bad quality, to tag as good and bad, such as a ripe and an overripe banana.
+
+> 💁 These classifiers can classify images of anything, so if you don't have fruit to hand of differing quality, you can use two different types of fruit, or cats and dogs!
+
+Ideally each picture should be just the fruit, with either a consistent background, or a wide variety of backgrounds. Ensure there's nothing in the background that is specific to ripe vs unripe fruit.
+
+> 💁 It's important not to have specific backgrounds, or specific items that are not related to the thing being classified for each tag, otherwise the classifier may just classify based on the background. There was a classifier for skin cancer that was trained on moles both normal and cancerous, and the cancerous ones all had rulers against them to measure the size. It turned out the classifier was almost 100% accurate at identifying rulers in pictures, not cancerous moles.
+
+1. Gather pictures for your classifier. You will need at least 5 pictures for each label to train the classifier, but the more the better. You will also need a few additional images to test the classifier. These images should all be different images of the same thing. For example:
+
+ * Using 2 ripe bananas, take some pictures of each one from a few different angles, taking at least 7 pictures (5 to train, 2 to test), but ideally more.
+
+ 
+
+ * Repeat the same process using 2 unripe bananas
+
+ You should have at least 10 training images, with at least 5 ripe and 5 unripe, and 4 testing images, 2 ripe and 2 unripe. Your images should be PNGs or JPEGs, smaller than 6MB. If you create them with an iPhone, for example, they may be high-resolution HEIC images, so will need to be converted and possibly shrunk. The more images the better, and you should have a similar number of ripe and unripe.
+
+ If you don't have both ripe and unripe fruit, you can use different fruits, or any two objects you have available. You can also find some example images in the [images](./images) folder of ripe and unripe bananas that you can use.
+
+1. Follow the [Upload and tag images section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#upload-and-tag-images) to upload your training images. Tag the ripe fruit as `ripe`, and the unripe fruit as `unripe`.
+
+ 
+
+1. Follow the [Train the classifier section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#train-the-classifier) to train the image classifier on your uploaded images.
+
+ You will be given a choice of training type. Select **Quick Training**.
+
+The classifier will then train. It will take a few minutes for the training to complete.
+
+> 🍌 If you decide to eat your fruit whilst the classifier is training, make sure you have enough images to test with first!
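The gathering step above holds back a few images for testing by hand. If you have a list of image file names, a small helper like this hypothetical sketch can do the split for you (the file names and counts are assumptions matching the banana example, not files this lesson provides):

```python
import random

def split_images(image_names, test_count=2, seed=42):
    """Shuffle a copy of the list and hold out test_count images for testing."""
    shuffled = list(image_names)
    random.Random(seed).shuffle(shuffled)  # fixed seed so the split is repeatable
    return shuffled[test_count:], shuffled[:test_count]

ripe_images = [f"banana-ripe-{i}.png" for i in range(1, 8)]  # 7 pictures of ripe bananas
training, testing = split_images(ripe_images)
print(len(training), "training,", len(testing), "testing")  # 5 training, 2 testing
```

Shuffling before splitting avoids accidentally putting all the pictures of one banana, or one angle, into the test set.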
+
+## Test your image classifier
+
+Once your classifier is trained, you can test it by giving it a new image to classify.
+
+### Task - test your image classifier
+
+1. Follow the [Test and retrain a model with Custom Vision Service documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#test-your-model) to test your image classifier. Use the testing images you created earlier, not any of the images you used for training.
+
+ 
+
+1. Try all the testing images you have access to and observe the probabilities.
+
+---
+
+## 🚀 Challenge
+
+Image classifiers use machine learning to make predictions about what is in an image, based on the probabilities that particular features of an image mean it matches a particular label. They don't understand what is in the image - a classifier doesn't know what a banana is, or understand what makes a banana a banana instead of a boat.
+
+What do you think would happen if you used a picture of a strawberry with a model trained on bananas, or a picture of an inflatable banana, or a person in a banana suit, or even a yellow cartoon character like someone from the Simpsons?
+
+Try it out and see what the predictions are. You can find images to try with using [Bing Image search](https://www.bing.com/images/trending).
+
+## Post-lecture quiz
+
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/30)
+
+## Review & Self Study
+
+* When you trained your classifier, you would have seen values for *Precision*, *Recall*, and *AP* that rate the model that was created. Read up on what these values are using [the Evaluate the classifier section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#evaluate-the-classifier)
+* Read up on how to improve your classifier from the [How to improve your Custom Vision model on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-improving-your-classifier?WT.mc_id=academic-17441-jabenn)
+
+## Assignment
+
+[Train your classifier for multiple fruits and vegetables](assignment.md)
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/assignment.md b/4-manufacturing/lessons/1-train-fruit-detector/assignment.md
new file mode 100644
index 00000000..a6e9536f
--- /dev/null
+++ b/4-manufacturing/lessons/1-train-fruit-detector/assignment.md
@@ -0,0 +1,16 @@
+# Train your classifier for multiple fruits and vegetables
+
+## Instructions
+
+In this lesson you trained an image classifier to be able to distinguish between ripe and unripe fruits, but only using one type of fruit. A classifier can be trained to recognize multiple fruits, with varying rates of success depending on the type of fruit and the difference between ripe and unripe.
+
+For example, with fruits that change color when they ripen, image classifiers might be less effective than a color sensor as they usually work on grey scale images instead of full color.
+
+Train your classifier with other fruits to see how well it works, especially when fruits look similar. For example, apples and tomatoes.
+
+## Rubric
+
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------- | -------- | ----------------- |
+| Train the classifier for multiple fruits | Was able to train the classifier for multiple fruits | Was able to train the classifier for one additional fruit | Was unable to train the classifier for more fruits |
+| Determine how well the classifier works | Was able to comment correctly on how well the classifier worked with different fruits | Was able to observe and offer suggestions as to how well it was working | Was unable to comment on how well the classifier worked |
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/testing/ripe/banana-ripe-1.png b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/ripe/banana-ripe-1.png
new file mode 100644
index 00000000..aca00483
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/ripe/banana-ripe-1.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/testing/ripe/banana-ripe-2.png b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/ripe/banana-ripe-2.png
new file mode 100644
index 00000000..82c4964d
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/ripe/banana-ripe-2.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/testing/unripe/banana-unripe-1.png b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/unripe/banana-unripe-1.png
new file mode 100644
index 00000000..8801be6e
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/unripe/banana-unripe-1.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/testing/unripe/banana-unripe-2.png b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/unripe/banana-unripe-2.png
new file mode 100644
index 00000000..48bc2237
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/testing/unripe/banana-unripe-2.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-1.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-1.png
new file mode 100644
index 00000000..f7fb01d2
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-1.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-10.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-10.png
new file mode 100644
index 00000000..ec4954a3
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-10.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-11.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-11.png
new file mode 100644
index 00000000..5b211ea3
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-11.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-12.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-12.png
new file mode 100644
index 00000000..d0894e89
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-12.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-13.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-13.png
new file mode 100644
index 00000000..0140d21c
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-13.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-14.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-14.png
new file mode 100644
index 00000000..49dd0338
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-14.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-15.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-15.png
new file mode 100644
index 00000000..938baa0e
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-15.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-16.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-16.png
new file mode 100644
index 00000000..f2566293
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-16.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-17.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-17.png
new file mode 100644
index 00000000..ea7080c9
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-17.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-18.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-18.png
new file mode 100644
index 00000000..04cc12e8
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-18.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-19.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-19.png
new file mode 100644
index 00000000..ca5552c6
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-19.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-2.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-2.png
new file mode 100644
index 00000000..48e318a5
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-2.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-20.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-20.png
new file mode 100644
index 00000000..4bc1da13
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-20.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-21.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-21.png
new file mode 100644
index 00000000..02df8c27
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-21.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-22.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-22.png
new file mode 100644
index 00000000..2a4a0506
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-22.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-23.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-23.png
new file mode 100644
index 00000000..18f9a574
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-23.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-24.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-24.png
new file mode 100644
index 00000000..c74e01a6
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-24.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-25.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-25.png
new file mode 100644
index 00000000..caee5201
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-25.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-3.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-3.png
new file mode 100644
index 00000000..c3f1f772
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-3.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-5.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-5.png
new file mode 100644
index 00000000..906e677b
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-5.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-6.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-6.png
new file mode 100644
index 00000000..ef844dc0
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-6.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-7.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-7.png
new file mode 100644
index 00000000..af79b3a5
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-7.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-8.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-8.png
new file mode 100644
index 00000000..d92c0bb7
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-8.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-9.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-9.png
new file mode 100644
index 00000000..acda7402
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/ripe/banana-ripe-9.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-1.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-1.png
new file mode 100644
index 00000000..7406226f
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-1.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-10.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-10.png
new file mode 100644
index 00000000..5c6aeb4a
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-10.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-11.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-11.png
new file mode 100644
index 00000000..7f9a15fe
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-11.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-12.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-12.png
new file mode 100644
index 00000000..01397d31
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-12.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-13.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-13.png
new file mode 100644
index 00000000..bd327d57
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-13.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-14.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-14.png
new file mode 100644
index 00000000..266b79df
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-14.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-15.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-15.png
new file mode 100644
index 00000000..acfb0be0
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-15.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-16.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-16.png
new file mode 100644
index 00000000..d23f9daf
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-16.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-17.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-17.png
new file mode 100644
index 00000000..2e7f86d6
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-17.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-18.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-18.png
new file mode 100644
index 00000000..552ef62e
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-18.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-19.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-19.png
new file mode 100644
index 00000000..d69e164e
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-19.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-2.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-2.png
new file mode 100644
index 00000000..cfeaed93
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-2.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-20.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-20.png
new file mode 100644
index 00000000..aac758ef
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-20.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-21.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-21.png
new file mode 100644
index 00000000..75438f17
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-21.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-22.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-22.png
new file mode 100644
index 00000000..0c7e1994
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-22.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-23.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-23.png
new file mode 100644
index 00000000..5d30f2d9
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-23.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-24.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-24.png
new file mode 100644
index 00000000..a8e422a8
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-24.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-25.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-25.png
new file mode 100644
index 00000000..fae3c999
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-25.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-26.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-26.png
new file mode 100644
index 00000000..59ee7619
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-26.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-27.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-27.png
new file mode 100644
index 00000000..382a029f
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-27.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-28.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-28.png
new file mode 100644
index 00000000..bcc033c6
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-28.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-29.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-29.png
new file mode 100644
index 00000000..0b6452bc
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-29.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-3.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-3.png
new file mode 100644
index 00000000..6d8d882b
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-3.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-4.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-4.png
new file mode 100644
index 00000000..c91423a7
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-4.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-5.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-5.png
new file mode 100644
index 00000000..cba41dd2
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-5.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-6.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-6.png
new file mode 100644
index 00000000..a1137eb4
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-6.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-7.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-7.png
new file mode 100644
index 00000000..d8ad18b6
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-7.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-8.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-8.png
new file mode 100644
index 00000000..599a797b
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-8.png differ
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-9.png b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-9.png
new file mode 100644
index 00000000..73d48198
Binary files /dev/null and b/4-manufacturing/lessons/1-train-fruit-detector/images/training/unripe/banana-unripe-9.png differ
diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/README.md b/4-manufacturing/lessons/2-check-fruit-from-device/README.md
new file mode 100644
index 00000000..dd8e3726
--- /dev/null
+++ b/4-manufacturing/lessons/2-check-fruit-from-device/README.md
@@ -0,0 +1,33 @@
+# Check fruit quality from an IoT device
+
+Add a sketchnote if possible/appropriate
+
+
+
+## Pre-lecture quiz
+
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/31)
+
+## Introduction
+
+In this lesson you will learn about using your fruit quality detector from an IoT device.
+
+In this lesson we'll cover:
+
+* [Thing 1](#thing-1)
+
+## Thing 1
+
+---
+
+## 🚀 Challenge
+
+## Post-lecture quiz
+
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/32)
+
+## Review & Self Study
+
+## Assignment
+
+[](assignment.md)
diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/assignment.md b/4-manufacturing/lessons/2-check-fruit-from-device/assignment.md
new file mode 100644
index 00000000..da157d5c
--- /dev/null
+++ b/4-manufacturing/lessons/2-check-fruit-from-device/assignment.md
@@ -0,0 +1,9 @@
+#
+
+## Instructions
+
+## Rubric
+
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------- | -------- | ----------------- |
+| | | | |
diff --git a/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md b/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md
new file mode 100644
index 00000000..ccebe9ac
--- /dev/null
+++ b/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md
@@ -0,0 +1,33 @@
+# Run your fruit detector on the edge
+
+Add a sketchnote if possible/appropriate
+
+
+
+## Pre-lecture quiz
+
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/33)
+
+## Introduction
+
+In this lesson you will learn about running your fruit detector on an IoT device on the edge.
+
+In this lesson we'll cover:
+
+* [Thing 1](#thing-1)
+
+## Thing 1
+
+---
+
+## 🚀 Challenge
+
+## Post-lecture quiz
+
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)
+
+## Review & Self Study
+
+## Assignment
+
+[](assignment.md)
diff --git a/4-manufacturing/lessons/3-run-fruit-detector-edge/assignment.md b/4-manufacturing/lessons/3-run-fruit-detector-edge/assignment.md
new file mode 100644
index 00000000..da157d5c
--- /dev/null
+++ b/4-manufacturing/lessons/3-run-fruit-detector-edge/assignment.md
@@ -0,0 +1,9 @@
+#
+
+## Instructions
+
+## Rubric
+
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------- | -------- | ----------------- |
+| | | | |
diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/README.md b/4-manufacturing/lessons/4-trigger-fruit-detector/README.md
new file mode 100644
index 00000000..bbf413f7
--- /dev/null
+++ b/4-manufacturing/lessons/4-trigger-fruit-detector/README.md
@@ -0,0 +1,33 @@
+# Trigger fruit quality detection from a sensor
+
+Add a sketchnote if possible/appropriate
+
+
+
+## Pre-lecture quiz
+
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/35)
+
+## Introduction
+
+In this lesson you will learn about triggering fruit quality detection from a sensor.
+
+In this lesson we'll cover:
+
+* [Thing 1](#thing-1)
+
+## Thing 1
+
+---
+
+## 🚀 Challenge
+
+## Post-lecture quiz
+
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/36)
+
+## Review & Self Study
+
+## Assignment
+
+[](assignment.md)
diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md b/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md
new file mode 100644
index 00000000..da157d5c
--- /dev/null
+++ b/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md
@@ -0,0 +1,9 @@
+#
+
+## Instructions
+
+## Rubric
+
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------- | -------- | ----------------- |
+| | | | |
diff --git a/README.md b/README.md
index 9407fced..fcb49e9e 100644
--- a/README.md
+++ b/README.md
@@ -79,6 +79,10 @@ We have two choices of IoT hardware to use for the projects depending on persona
| 12 | [Transport](./3-transport) | Store location data | Learn how to store IoT data to be visualized or analysed later | [Store location data](./3-transport/lessons/2-store-location-data/README.md) |
| 13 | [Transport](./3-transport) | Visualize location data | Learn about visualizing location data on a map, and how maps represent the real 3d world in 2 dimensions | [Visualize location data](./3-transport/lessons/3-visualize-location-data/README.md) |
| 14 | [Transport](./3-transport) | Geofences | Learn about geofences, and how they can be used to alert when vehicles in the supply chain are close to their destination | [Geofences](./3-transport/lessons/4-geofences/README.md) |
+| 15 | [Manufacturing](./4-manufacturing) | Train a fruit quality detector | Learn about training an image classifier in the cloud to detect fruit quality | [Train a fruit quality detector](./4-manufacturing/lessons/1-train-fruit-detector/README.md) |
+| 16 | [Manufacturing](./4-manufacturing) | Check fruit quality from an IoT device | Learn about using your fruit quality detector from an IoT device | [Check fruit quality from an IoT device](./4-manufacturing/lessons/2-check-fruit-from-device/README.md) |
+| 17 | [Manufacturing](./4-manufacturing) | Run your fruit detector on the edge | Learn about running your fruit detector on an IoT device on the edge | [Run your fruit detector on the edge](./4-manufacturing/lessons/3-run-fruit-detector-edge/README.md) |
+| 18 | [Manufacturing](./4-manufacturing) | Trigger fruit quality detection from a sensor | Learn about triggering fruit quality detection from a sensor | [Trigger fruit quality detection from a sensor](./4-manufacturing/lessons/4-trigger-fruit-detector/README.md) |
## Offline access
diff --git a/hardware.md b/hardware.md
index c3a28bff..dfb527ed 100644
--- a/hardware.md
+++ b/hardware.md
@@ -23,7 +23,7 @@ All the device code for Arduino is in C++. To complete all the assignments you w
These are specific to using the Wio terminal Arduino device, and are not relevant to using the Raspberry Pi.
-* [Grove camera kit](https://www.seeedstudio.com/Grove-Serial-Camera-Kit.html)
+* [ArduCam Mini 2MP Plus - OV2640](https://www.arducam.com/product/arducam-2mp-spi-camera-b0067-arduino/)
* [Grove speaker plus](https://www.seeedstudio.com/Grove-Speaker-Plus-p-4592.html)
## Raspberry Pi
@@ -48,7 +48,7 @@ These are specific to using the Raspberry Pi, and are not relevant to using the
* Any USB speaker, or speaker with a 3.5mm cable
or
* [USB Speakerphone](https://www.amazon.com/USB-Speakerphone-Conference-Business-Microphones/dp/B07Q3D7F8S/ref=sr_1_1?dchild=1&keywords=m0&qid=1614647389&sr=8-1)
-* [Grove Light sensor](https://www.seeedstudio.com/Grove-Light-Sensor-v1-2-LS06-S-phototransistor.html)
+* [Grove Sunlight sensor](https://www.seeedstudio.com/Grove-Sunlight-Sensor.html)
## Sensors and actuators
@@ -58,8 +58,6 @@ Most of the sensors and actuators needed are used by both the Arduino and Raspbe
* [Grove humidity and temperature sensor](https://www.seeedstudio.com/Grove-Temperature-Humidity-Sensor-DHT11.html)
* [Grove capacitive soil moisture sensor](https://www.seeedstudio.com/Grove-Capacitive-Moisture-Sensor-Corrosion-Resistant.html)
* [Grove relay](https://www.seeedstudio.com/Grove-Relay.html)
-* [Grove 125KHz RFID reader](https://www.seeedstudio.com/Grove-125KHz-RFID-Reader.html)
-* [RFID tags (125KHz)](https://www.seeedstudio.com/RFID-tag-combo-125khz-5-pcs-p-700.html)
* [Grove GPS (Air530)](https://www.seeedstudio.com/Grove-GPS-Air530-p-4584.html)
* [Grove - Ultrasonic Distance Sensor](https://www.seeedstudio.com/Grove-Ultrasonic-Distance-Sensor.html)
diff --git a/images/Diagrams.sketch b/images/Diagrams.sketch
index d4241771..322ab07c 100644
Binary files a/images/Diagrams.sketch and b/images/Diagrams.sketch differ
diff --git a/images/banana-training-images.png b/images/banana-training-images.png
new file mode 100644
index 00000000..46adc8f3
Binary files /dev/null and b/images/banana-training-images.png differ
diff --git a/images/banana-unripe-quick-test-prediction.png b/images/banana-unripe-quick-test-prediction.png
new file mode 100644
index 00000000..000904f9
Binary files /dev/null and b/images/banana-unripe-quick-test-prediction.png differ
diff --git a/images/bananas-ripe-vs-unripe-predictions.png b/images/bananas-ripe-vs-unripe-predictions.png
new file mode 100644
index 00000000..fd2e67ec
Binary files /dev/null and b/images/bananas-ripe-vs-unripe-predictions.png differ
diff --git a/images/custom-vision-create-project.png b/images/custom-vision-create-project.png
new file mode 100644
index 00000000..8554bc7c
Binary files /dev/null and b/images/custom-vision-create-project.png differ
diff --git a/images/grove-sunlight-sensor.png b/images/grove-sunlight-sensor.png
new file mode 100644
index 00000000..554845b5
Binary files /dev/null and b/images/grove-sunlight-sensor.png differ
diff --git a/images/icons/noun_tomato_1285672.svg b/images/icons/noun_tomato_1285672.svg
new file mode 100644
index 00000000..54f2cf27
--- /dev/null
+++ b/images/icons/noun_tomato_1285672.svg
@@ -0,0 +1 @@
+
\ No newline at end of file
diff --git a/images/image-upload-bananas.png b/images/image-upload-bananas.png
new file mode 100644
index 00000000..5940c2e3
Binary files /dev/null and b/images/image-upload-bananas.png differ
diff --git a/images/optical-tomato-sorting.png b/images/optical-tomato-sorting.png
new file mode 100644
index 00000000..5c0dcb1f
Binary files /dev/null and b/images/optical-tomato-sorting.png differ
diff --git a/images/pi-sunlight-sensor.png b/images/pi-sunlight-sensor.png
new file mode 100644
index 00000000..8e373f3c
Binary files /dev/null and b/images/pi-sunlight-sensor.png differ
diff --git a/images/shapes-to-images.png b/images/shapes-to-images.png
new file mode 100644
index 00000000..c9f82313
Binary files /dev/null and b/images/shapes-to-images.png differ
diff --git a/images/traditional-vs-ml.png b/images/traditional-vs-ml.png
new file mode 100644
index 00000000..c56c011f
Binary files /dev/null and b/images/traditional-vs-ml.png differ