diff --git a/1-getting-started/Translations/README.ar.md b/1-getting-started/Translations/README.ar.md new file mode 100644 index 00000000..a0696388 --- /dev/null +++ b/1-getting-started/Translations/README.ar.md @@ -0,0 +1,16 @@ +#
البدء مع IoT (إنترنت الأشياء)
+ +
في هذا القسم من المنهج، سيتم تقديمك إلى إنترنت الأشياء، وستتعلم المفاهيم الأساسية بما في ذلك بناء أول مشروع إنترنت الأشياء الخاص بك "Hello World" المتصل بالسحابة. هذا المشروع عبارة عن ضوء ليلي يضيء عندما تكون مستويات الضوء التي يقيسها المستشعر منخفضة.
+ +![
يتم تشغيل وإيقاف مؤشر LED المتصل بـ WIO مع تغير مستوى الضوء
](https://github.com/microsoft/IoT-For-Beginners/blob/main/images/wio-running-assignment-1-1.gif?raw=true) + +##
المواضيع
+ +1. [مقدمة لإنترنت الأشياء](../lessons/1-introduction-to-iot/README.md) +1. [التعمق أكثر بإنترنت الأشياء](../lessons/2-deeper-dive/README.md) +1. [تفاعل مع العالم باستخدام المستشعرات والمحركات](../lessons/3-sensors-and-actuators/README.md) +1. [قم بتوصيل جهازك بالإنترنت](../lessons/4-connect-internet/README.md) + +##
الاعتمادات
+ + كُتبت جميع الدروس مع ♥️ من قبل [Jim Bennett](https://GitHub.com/JimBobBennett) \ No newline at end of file diff --git a/1-getting-started/Translations/README.hi.md b/1-getting-started/Translations/README.hi.md new file mode 100644 index 00000000..b0bf8623 --- /dev/null +++ b/1-getting-started/Translations/README.hi.md @@ -0,0 +1,16 @@ +# IoT के साथ शुरुआत करना + +पाठ्यक्रम के इस भाग में, आपको इंटरनेट ऑफ थिंग्स से परिचित कराया जाएगा, और क्लाउड से कनेक्ट होने वाले अपने पहले 'हैलो वर्ल्ड' IoT प्रोजेक्ट के निर्माण सहित बुनियादी अवधारणाओं को सीखेंगे। यह परियोजना एक रात की रोशनी है जो तब रोशनी करती है जब सेंसर द्वारा मापा गया प्रकाश स्तर कम हो जाता है। + +![WIO से जुड़ी LED प्रकाश के स्तर में परिवर्तन के साथ चालू और बंद होती है](https://github.com/microsoft/IoT-For-Beginners/blob/main/images/wio-running-assignment-1-1.gif?raw=true) + +## विषय + +1. [IoT का परिचय](../lessons/1-introduction-to-iot/README.md) +1. [IoT में एक गहरा गोता](../lessons/2-deeper-dive/README.md) +1. [सेंसर और एक्चुएटर्स के साथ भौतिक दुनिया के साथ बातचीत करें](../lessons/3-sensors-and-actuators/README.md) +1. [अपने डिवाइस को इंटरनेट से कनेक्ट करें](../lessons/4-connect-internet/README.md) + +## क्रेडिट + +सभी पाठ [जिम बेनेट](https://GitHub.com/JimBobBennett) द्वारा ♥️ के साथ लिखे गए थे diff --git a/1-getting-started/lessons/1-introduction-to-iot/README.md b/1-getting-started/lessons/1-introduction-to-iot/README.md index d601b69f..619a6fde 100644 --- a/1-getting-started/lessons/1-introduction-to-iot/README.md +++ b/1-getting-started/lessons/1-introduction-to-iot/README.md @@ -1,8 +1,8 @@ # Introduction to IoT -Add a sketchnote if possible/appropriate +![A sketchnote overview of this lesson](../../../sketchnotes/lesson-1.png) -![Embed a video here if available](video-url) +> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version. 
## Pre-lecture quiz diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/.dummy.md b/1-getting-started/lessons/1-introduction-to-iot/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/1-getting-started/lessons/1-introduction-to-iot/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/1-getting-started/lessons/2-deeper-dive/translations/.dummy.md b/1-getting-started/lessons/2-deeper-dive/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/1-getting-started/lessons/2-deeper-dive/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/1-getting-started/lessons/3-sensors-and-actuators/translations/.dummy.md b/1-getting-started/lessons/3-sensors-and-actuators/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/1-getting-started/lessons/3-sensors-and-actuators/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/1-getting-started/lessons/4-connect-internet/README.md b/1-getting-started/lessons/4-connect-internet/README.md index c6dcdf4e..49ae2dbe 100644 --- a/1-getting-started/lessons/4-connect-internet/README.md +++ b/1-getting-started/lessons/4-connect-internet/README.md @@ -76,9 +76,9 @@ Follow the relevant step below to connect your device to the MQTT broker: ### A deeper dive into MQTT -Topics can have a hierarchy, and clients can subscribe to different levels of the hierarchy using wildcards. For example, you can send temperature telemetry messages to `/telemetry/temperature` and humidity messages to `/telemetry/humidity`, then in your cloud app subscribe to `/telemetry/*` to receive both the temperature and humidity telemetry messages. +Topics can have a hierarchy, and clients can subscribe to different levels of the hierarchy using wildcards. For example, you can send temperature telemetry messages to the `/telemetry/temperature` topic and humidity messages to the `/telemetry/humidity` topic, then in your cloud app subscribe to the `/telemetry/*` topic to receive both the temperature and humidity telemetry messages. -Messages can be sent with a quality of service (QoS), which determines the guarantees of the message being received. +Messages can be sent with a quality of service (QoS), which determines the guarantee of the message being received. * At most once - the message is sent only once and the client and broker take no additional steps to acknowledge delivery (fire and forget). * At least once - the message is re-tried by the sender multiple times until acknowledgement is received (acknowledged delivery). 
@@ -86,11 +86,11 @@ Messages can be sent with a quality of service (QoS), which determines the guara ✅ What situations might require an assured delivery message over a fire and forget message? -Although the name is Message Queueing, it doesn't actually support message queues. This means that if a client disconnects, then reconnects it won't receive messages sent during the disconnection except for those messages that it had already started to process using the QoS process. Messages can have a retained flag set on them. If this is set, the MQTT broker will store the last message sent on a topic with this flag, and send this to any clients who later subscribe to the topic. This way the clients will always get the latest message. +Although the name is Message Queueing (the 'MQ' in MQTT), it doesn't actually support message queues. This means that if a client disconnects, then reconnects, it won't receive messages sent during the disconnection, except for those messages that it had already started to process using the QoS process. Messages can have a retained flag set on them. If this is set, the MQTT broker will store the last message sent on a topic with this flag, and send this to any clients who later subscribe to the topic. This way, the clients will always get the latest message. MQTT also supports a keep alive function that checks to see if the connection is still alive during long gaps between messages. -> 🦟 [Mosquitto from the Eclipse Foundation](https://mosquitto.org) has a free MQTT broker you can run yourself to experiment with MQTT, along with a public MQTT broker you can use to test your code hosted at [test.mosquitto.org](https://test.mosquitto.org). +> 🦟 [Mosquitto from the Eclipse Foundation](https://mosquitto.org) has a free MQTT broker you can run yourself to experiment with MQTT, along with a public MQTT broker you can use to test your code, hosted at [test.mosquitto.org](https://test.mosquitto.org). 
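The topic hierarchy and wildcard subscriptions described in this section can be sketched in a few lines. Note that MQTT's actual wildcard characters are `+` (matches one topic level) and `#` (matches all remaining levels); the matcher below is a minimal illustration of that matching rule, not part of the lesson code.

```python
def topic_matches(pattern: str, topic: str) -> bool:
    """Check an MQTT-style subscription pattern against a concrete topic.

    '+' matches exactly one level, '#' matches the rest of the topic.
    Minimal sketch -- real brokers also handle edge cases such as '$' topics.
    """
    p_parts = pattern.split("/")
    t_parts = topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":                      # multi-level wildcard: matches everything after
            return True
        if i >= len(t_parts):             # pattern has more levels than the topic
            return False
        if p != "+" and p != t_parts[i]:  # a literal level must match exactly
            return False
    return len(p_parts) == len(t_parts)   # no leftover topic levels allowed

# A cloud app subscribed to 'telemetry/#' sees both kinds of telemetry message:
assert topic_matches("telemetry/#", "telemetry/temperature")
assert topic_matches("telemetry/#", "telemetry/humidity")
# '+' matches a single level only:
assert topic_matches("telemetry/+", "telemetry/temperature")
assert not topic_matches("telemetry/+", "telemetry/env/temperature")
```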
MQTT connections can be public and open, or encrypted and secured using usernames and passwords, or certificates. diff --git a/1-getting-started/lessons/4-connect-internet/translations/.dummy.md b/1-getting-started/lessons/4-connect-internet/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/1-getting-started/lessons/4-connect-internet/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/1-getting-started/lessons/4-connect-internet/wio-terminal-telemetry.md b/1-getting-started/lessons/4-connect-internet/wio-terminal-telemetry.md index eb728a21..12897cf1 100644 --- a/1-getting-started/lessons/4-connect-internet/wio-terminal-telemetry.md +++ b/1-getting-started/lessons/4-connect-internet/wio-terminal-telemetry.md @@ -24,7 +24,7 @@ Install the Arduino JSON library. The next step is to create a JSON document with telemetry and send it to the MQTT broker. -### Task +### Task - publish telemetry Publish telemetry to the MQTT broker. diff --git a/2-farm/lessons/1-predict-plant-growth/translations/.dummy.md b/2-farm/lessons/1-predict-plant-growth/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/lessons/1-predict-plant-growth/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/2-farm/lessons/2-detect-soil-moisture/translations/.dummy.md b/2-farm/lessons/2-detect-soil-moisture/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/lessons/2-detect-soil-moisture/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/2-farm/lessons/2-detect-soil-moisture/virtual-device-soil-moisture.md b/2-farm/lessons/2-detect-soil-moisture/virtual-device-soil-moisture.md index a0a1494a..ddceaead 100644 --- a/2-farm/lessons/2-detect-soil-moisture/virtual-device-soil-moisture.md +++ b/2-farm/lessons/2-detect-soil-moisture/virtual-device-soil-moisture.md @@ -14,7 +14,7 @@ This is an analog sensor, so uses a simulated 10-bit ADC to report a value from To use a virtual soil moisture sensor, you need to add it to the CounterFit app -#### Task +#### Task - add the soil moisture sensor to CounterFit Add the soil moisture sensor to the CounterFit app. @@ -44,7 +44,7 @@ Add the soil moisture sensor to the CounterFit app. The soil moisture sensor app can now be programmed using the CounterFit sensors. -### Task +### Task - program the soil moisture sensor app Program the soil moisture sensor app. diff --git a/2-farm/lessons/3-automated-plant-watering/translations/.dummy.md b/2-farm/lessons/3-automated-plant-watering/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/lessons/3-automated-plant-watering/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md index 107eb6a8..900a3f24 100644 --- a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md +++ b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md @@ -25,33 +25,33 @@ In this lesson we'll cover: ## What is the cloud? -Before the cloud, when a company wanted to provide services to their employees (such as databases or file storage), or to the public (such as web sites), they would build and run a data center. This ranged from a room with a small number of computers in it, to a building with many computers. The company would manage everything, including: +Before the cloud, when a company wanted to provide services to their employees (such as databases or file storage), or to the public (such as websites), they would build and run a data center. This ranged from a room with a small number of computers, to a building with many computers. The company would manage everything, including: * Buying computers * Hardware maintenance * Power and cooling * Networking * Security, including securing the building and securing the software on the computers -* Software installation and updates +* Software installation and updates This could be very expensive, require a wide range of skilled employees, and be very slow to change when needed. For example, if an online store needed to plan for a busy holiday season, they would need to plan months in advance to buy more hardware, configure it, install it and install the software to run their sales process. 
After the holiday season was over and sales dropped back down, they would be left with computers they've paid for sitting idle till the next busy season. ✅ Do you think this would allow companies to move quickly? If an online clothing retailer suddenly got popular due to a celebrity being seen in their clothes, would they be able to increase their computing power quickly enough to support the sudden influx of orders? -### Somebody else's computer +### Someone else's computer -The cloud is often jokingly referred to as 'somebody else's computer'. The initial idea was simple - instead of buying computers, you rent somebody else's computer. Someone else, a cloud computing provider, would manage huge data centers. They would be responsible for buying and installing the hardware, managing power and cooling, networking, building security, hardware and software updates, everything. As a customer, you would rent the computers you need, renting more as demand spikes, then reducing the number you rent if demand drops. These cloud data centers are all around the world. +The cloud is often jokingly referred to as 'someone else's computer'. The initial idea was simple - instead of buying computers, you rent someone else's computer. Someone else, a cloud computing provider, would manage huge data centers. They would be responsible for buying and installing the hardware, managing power and cooling, networking, building security, hardware and software updates, everything. As a customer, you would rent the computers you need, renting more as demand spikes, then reducing the number you rent if demand drops. These cloud data centers are all around the world. ![A Microsoft cloud data center](../../../images/azure-region-existing.png) ![A Microsoft cloud data center planned expansion](../../../images/azure-region-planned-expansion.png) These data centers can be multiple square kilometers in size. 
The images above were taken a few years ago at a Microsoft cloud data center, and show the initial size, along with a planned expansion. The area cleared for the expansion is over 5 square kilometers. -> 💁 These data centers require such large amounts of power that some have their own power stations. Because of their size and the level of investment from the cloud providers, they are usually very environmentally friendly. They are more efficient than huge numbers of small data centers, they run mostly on renewable energy, and cloud providers work hard to reduce waste, cut water usage, and replant forests to make up for those cut down to provide space to build data centers. You can read mode about how one cloud provider is working on sustainability on the [Azure sustainability site](https://azure.microsoft.com/global-infrastructure/sustainability/?WT.mc_id=academic-17441-jabenn). +> 💁 These data centers require such large amounts of power that some have their own power stations. Because of their size and the level of investment from the cloud providers, they are usually very environmentally friendly. They are more efficient than huge numbers of small data centers, they run mostly on renewable energy, and cloud providers work hard to reduce waste, cut water usage, and replant forests to make up for those cut down to provide space to build data centers. You can read more about how one cloud provider is working on sustainability on the [Azure sustainability site](https://azure.microsoft.com/global-infrastructure/sustainability/?WT.mc_id=academic-17441-jabenn). ✅ Do some research: Read up on the major clouds such as [Azure from Microsoft](https://azure.microsoft.com/?WT.mc_id=academic-17441-jabenn) or [GCP from Google](https://cloud.google.com). How many data centers do they have, and where are they in the world? 
-Using the cloud keeps costs down for companies, and allows them to focus on what they do best, leaving the cloud computing expertise in the hands of the provider. Companies no longer need to rent or buy data center space or pay different providers for connectivity and power and employee experts. Instead they can pay one monthly bill to the cloud provider to have everything taken care off. +Using the cloud keeps costs down for companies, and allows them to focus on what they do best, leaving the cloud computing expertise in the hands of the provider. Companies no longer need to rent or buy data center space or pay different providers for connectivity, power and expert employees. Instead, they can pay one monthly bill to the cloud provider to have everything taken care of. The cloud provider can then use economies of scale to drive costs down, buying computers in bulk at lower costs, investing in tooling to reduce their workload for maintenance, even designing and building their own hardware to improve their cloud offering. @@ -178,7 +178,7 @@ To use the Azure CLI, first it must be installed on your PC or Mac. az account set --subscription ``` - Replace `` with the Id of hte subscription you want to use. After running this command, re-run the command to list your accounts. You will see the `IsDefault` column will be marked as `True` for the subscription you have just set. + Replace `` with the Id of the subscription you want to use. After running this command, re-run the command to list your accounts. You will see the `IsDefault` column will be marked as `True` for the subscription you have just set. ### Task - create a resource group @@ -246,11 +246,11 @@ The IoT Hub will be created. It may take a minute or so for this to complete. ## Communicate with IoT Hub -In the previous lesson, you used MQTT and sent messages back and forward on different topics, with the different topics having different purposes. 
Rather than send messages over different topics, IoT Hub has a number of defined ways for the device to communicate with the Hub, or the Hub to communicate with the device. +In the previous lesson, you used MQTT and sent messages back and forth on different topics, with the different topics having different purposes. Rather than send messages over different topics, IoT Hub has a number of defined ways for the device to communicate with the Hub, or for the Hub to communicate with the device. > 💁 Under the hood this communication between IoT Hub and your device can use MQTT, HTTPS or AMQP. -* Device to cloud (D2C) messages - these are messages sent from a device to IoT Hub, such as telemetry. They can then ber read off the IoT Hub by your application code +* Device to cloud (D2C) messages - these are messages sent from a device to IoT Hub, such as telemetry. They can then be read off the IoT Hub by your application code. > 🎓 Under the hood, IoT Hub uses an Azure service called [Event Hubs](https://docs.microsoft.com/azure/event-hubs/?WT.mc_id=academic-17441-jabenn). When you write code to read messages sent to the hub, these are often called events. @@ -262,7 +262,7 @@ In the previous lesson, you used MQTT and sent messages back and forward on diff IoT Hub can store messages and direct method requests for a configurable period of time (defaulting to one day), so if a device or application code loses connection, it can still retrieve messages sent whilst it was offline after it reconnects. Device twins are kept permanently in the IoT Hub, so at any time a device can reconnect and get the latest device twin. 
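A device-to-cloud telemetry message as described above is typically just a small JSON document. The sketch below builds such a payload; the field name `soil_moisture` and the connection string are illustrative assumptions, not the lesson's own names, and the azure-iot-device call is shown commented out for shape only since it needs a real IoT Hub device connection string.

```python
import json

def build_telemetry(soil_moisture: int) -> str:
    """Serialize a sensor reading as the JSON body of a D2C message."""
    return json.dumps({"soil_moisture": soil_moisture})

payload = build_telemetry(450)
print(payload)  # {"soil_moisture": 450}

# Sending it to IoT Hub with the azure-iot-device SDK would look like this
# (commented out here because it needs a real device connection string):
#
# from azure.iot.device import IoTHubDeviceClient, Message
# client = IoTHubDeviceClient.create_from_connection_string(conn_str)
# client.send_message(Message(payload))
```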
-✅ Do some research: Read more on these message types on the [Device-to-cloud communications guidance](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-d2c-guidance?WT.mc_id=academic-17441-jabenn), an the [Cloud-to-device communications guidance](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-c2d-guidance?WT.mc_id=academic-17441-jabenn) in the IoT Hub documentation. +✅ Do some research: Read more on these message types on the [Device-to-cloud communications guidance](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-d2c-guidance?WT.mc_id=academic-17441-jabenn), and the [Cloud-to-device communications guidance](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-c2d-guidance?WT.mc_id=academic-17441-jabenn) in the IoT Hub documentation. ## Connect your device to the IoT service diff --git a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/translations/.dummy.md b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/wio-terminal-connect-hub.md b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/wio-terminal-connect-hub.md index abff3ed8..a7380048 100644 --- a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/wio-terminal-connect-hub.md +++ b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/wio-terminal-connect-hub.md @@ -110,7 +110,7 @@ The next step is to connect your device to IoT Hub. initTime(); ``` -1. Add the following variable declaration to the top of the file, just below the include directived: +1. Add the following variable declaration to the top of the file, just below the include directives: ```cpp IOTHUB_DEVICE_CLIENT_LL_HANDLE _device_ll_handle; diff --git a/2-farm/lessons/5-migrate-application-to-the-cloud/translations/.dummy.md b/2-farm/lessons/5-migrate-application-to-the-cloud/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/lessons/5-migrate-application-to-the-cloud/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/2-farm/lessons/6-keep-your-plant-secure/translations/.dummy.md b/2-farm/lessons/6-keep-your-plant-secure/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/lessons/6-keep-your-plant-secure/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/2-farm/translations/.dummy.md b/2-farm/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/2-farm/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/3-transport/lessons/1-location-tracking/translations/.dummy.md b/3-transport/lessons/1-location-tracking/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/3-transport/lessons/1-location-tracking/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/3-transport/lessons/2-store-location-data/translations/.dummy.md b/3-transport/lessons/2-store-location-data/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/3-transport/lessons/2-store-location-data/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/3-transport/lessons/3-visualize-location-data/translations/.dummy.md b/3-transport/lessons/3-visualize-location-data/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/3-transport/lessons/3-visualize-location-data/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/3-transport/lessons/4-geofences/translations/.dummy.md b/3-transport/lessons/4-geofences/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/3-transport/lessons/4-geofences/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/3-transport/translations/.dummy.md b/3-transport/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/3-transport/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/4-manufacturing/README.md b/4-manufacturing/README.md index 3d285074..6158dc51 100644 --- a/4-manufacturing/README.md +++ b/4-manufacturing/README.md @@ -14,10 +14,10 @@ In these 4 lessons you'll learn how to train image-based AI models to detect fru ## Topics -1. [Train a fruit quality detector](./4-manufacturing/lessons/1-train-fruit-detector/README.md) -1. [Check fruit quality from an IoT device](./4-manufacturing/lessons/2-check-fruit-from-device/README.md) -1. [Run your fruit detector on the edge](./4-manufacturing/lessons/3-run-fruit-detector-edge/README.md) -1. [Trigger fruit quality detection from a sensor](./4-manufacturing/lessons/4-trigger-fruit-detector/README.md) +1. [Train a fruit quality detector](./lessons/1-train-fruit-detector/README.md) +1. [Check fruit quality from an IoT device](./lessons/2-check-fruit-from-device/README.md) +1. [Run your fruit detector on the edge](./lessons/3-run-fruit-detector-edge/README.md) +1. [Trigger fruit quality detection from a sensor](./lessons/4-trigger-fruit-detector/README.md) ## Credits diff --git a/4-manufacturing/lessons/1-train-fruit-detector/translations/.dummy.md b/4-manufacturing/lessons/1-train-fruit-detector/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/4-manufacturing/lessons/1-train-fruit-detector/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/README.md b/4-manufacturing/lessons/2-check-fruit-from-device/README.md index 72329511..1de30bf0 100644 --- a/4-manufacturing/lessons/2-check-fruit-from-device/README.md +++ b/4-manufacturing/lessons/2-check-fruit-from-device/README.md @@ -117,7 +117,7 @@ In the image above, the banana picture on the left was taken using a Raspberry P To improve the model, you can retrain it using the images captured from the IoT device. -### Task -improve the model +### Task - improve the model 1. Classify multiple images of both ripe and unripe fruit using your IoT device. diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/.gitignore b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/.gitignore new file mode 100644 index 00000000..89cc49cb --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/.gitignore @@ -0,0 +1,5 @@ +.pio +.vscode/.browse.c_cpp.db* +.vscode/c_cpp_properties.json +.vscode/launch.json +.vscode/ipch diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/.vscode/extensions.json b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/.vscode/extensions.json new file mode 100644 index 00000000..0f0d7401 --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/.vscode/extensions.json @@ -0,0 +1,7 @@ +{ + // See http://go.microsoft.com/fwlink/?LinkId=827846 + // for the documentation about the 
extensions.json format + "recommendations": [ + "platformio.platformio-ide" + ] +} diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/include/README b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/include/README new file mode 100644 index 00000000..194dcd43 --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/include/README @@ -0,0 +1,39 @@ + +This directory is intended for project header files. + +A header file is a file containing C declarations and macro definitions +to be shared between several project source files. You request the use of a +header file in your project source file (C, C++, etc) located in `src` folder +by including it, with the C preprocessing directive `#include'. + +```src/main.c + +#include "header.h" + +int main (void) +{ + ... +} +``` + +Including a header file produces the same results as copying the header file +into each source file that needs it. Such copying would be time-consuming +and error-prone. With a header file, the related declarations appear +in only one place. If they need to be changed, they can be changed in one +place, and programs that include the header file will automatically use the +new version when next recompiled. The header file eliminates the labor of +finding and changing all the copies as well as the risk that a failure to +find one copy will result in inconsistencies within a program. + +In C, the usual convention is to give header files names that end with `.h'. +It is most portable to use only letters, digits, dashes, and underscores in +header file names, and at most one dot. 
+ +Read more about using header files in official GCC documentation: + +* Include Syntax +* Include Operation +* Once-Only Headers +* Computed Includes + +https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/lib/README b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/lib/README new file mode 100644 index 00000000..6debab1e --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/lib/README @@ -0,0 +1,46 @@ + +This directory is intended for project-specific (private) libraries. +PlatformIO will compile them to static libraries and link them into the executable file. + +The source code of each library should be placed in its own separate directory +("lib/your_library_name/[here are source files]"). + +For example, see the structure of the following two libraries `Foo` and `Bar`: + +|--lib | | | |--Bar | | |--docs | | |--examples | | |--src | | |- Bar.c | | |- Bar.h | | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html | | | |--Foo | | |- Foo.c | | |- Foo.h | | | |- README --> THIS FILE | |- platformio.ini |--src |- main.c + +and the contents of `src/main.c`: +``` +#include <Foo.h> +#include <Bar.h> + +int main (void) +{ + ... +} + +``` + +PlatformIO Library Dependency Finder will automatically find dependent +libraries by scanning project source files. 
+ +More information about PlatformIO Library Dependency Finder +- https://docs.platformio.org/page/librarymanager/ldf.html diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini new file mode 100644 index 00000000..1e0cd574 --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini @@ -0,0 +1,26 @@ +; PlatformIO Project Configuration File +; +; Build options: build flags, source filter +; Upload options: custom upload port, speed and extra flags +; Library options: dependencies, extra library storages +; Advanced options: extra scripting +; +; Please visit documentation for the other options and examples +; https://docs.platformio.org/page/projectconf.html + +[env:seeed_wio_terminal] +platform = atmelsam +board = seeed_wio_terminal +framework = arduino +lib_deps = + seeed-studio/Seeed Arduino rpcWiFi + seeed-studio/Seeed Arduino FS + seeed-studio/Seeed Arduino SFUD + seeed-studio/Seeed Arduino rpcUnified + seeed-studio/Seeed_Arduino_mbedtls + seeed-studio/Seeed Arduino RTC + bblanchon/ArduinoJson @ 6.17.3 +build_flags = + -w + -DARDUCAM_SHIELD_V2 + -DOV2640_CAM \ No newline at end of file diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/camera.h b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/camera.h new file mode 100644 index 00000000..2028039f --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/camera.h @@ -0,0 +1,160 @@ +#pragma once + +#include <ArduCAM.h> +#include <SPI.h> + +class Camera +{ +public: + Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS) + { + _format = format; + _image_size = image_size; + } + 
+ bool init() + { + // Reset the CPLD + _arducam.write_reg(0x07, 0x80); + delay(100); + + _arducam.write_reg(0x07, 0x00); + delay(100); + + // Check if the ArduCAM SPI bus is OK + _arducam.write_reg(ARDUCHIP_TEST1, 0x55); + if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55) + { + return false; + } + + // Change MCU mode + _arducam.set_mode(MCU2LCD_MODE); + + uint8_t vid, pid; + + // Check if the camera module type is OV2640 + _arducam.wrSensorReg8_8(0xff, 0x01); + _arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid); + _arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid); + if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42))) + { + return false; + } + + _arducam.set_format(_format); + _arducam.InitCAM(); + _arducam.OV2640_set_JPEG_size(_image_size); + _arducam.OV2640_set_Light_Mode(Auto); + _arducam.OV2640_set_Special_effects(Normal); + delay(1000); + + return true; + } + + void startCapture() + { + _arducam.flush_fifo(); + _arducam.clear_fifo_flag(); + _arducam.start_capture(); + } + + bool captureReady() + { + return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK); + } + + bool readImageToBuffer(byte **buffer, uint32_t &buffer_length) + { + if (!captureReady()) return false; + + // Get the image file length + uint32_t length = _arducam.read_fifo_length(); + buffer_length = length; + + if (length >= MAX_FIFO_SIZE) + { + return false; + } + if (length == 0) + { + return false; + } + + // create the buffer + byte *buf = new byte[length]; + + uint8_t temp = 0, temp_last = 0; + int i = 0; + uint32_t buffer_pos = 0; + bool is_header = false; + + _arducam.CS_LOW(); + _arducam.set_fifo_burst(); + + while (length--) + { + temp_last = temp; + temp = SPI.transfer(0x00); + //Read JPEG data from FIFO + if ((temp == 0xD9) && (temp_last == 0xFF)) // Found the JPEG end marker (0xFF 0xD9) + { + buf[buffer_pos] = temp; + + buffer_pos++; + i++; + + _arducam.CS_HIGH(); + } + if (is_header == true) + { + //Write image data to buffer if not full + if (i < 256) + { + buf[buffer_pos] = temp; + 
buffer_pos++; + i++; + } + else + { + _arducam.CS_HIGH(); + + i = 0; + buf[buffer_pos] = temp; + + buffer_pos++; + i++; + + _arducam.CS_LOW(); + _arducam.set_fifo_burst(); + } + } + else if ((temp == 0xD8) && (temp_last == 0xFF)) + { + is_header = true; + + buf[buffer_pos] = temp_last; + buffer_pos++; + i++; + + buf[buffer_pos] = temp; + buffer_pos++; + i++; + } + } + + _arducam.clear_fifo_flag(); + + _arducam.set_format(_format); + _arducam.InitCAM(); + _arducam.OV2640_set_JPEG_size(_image_size); + + // return the buffer + *buffer = buf; + return true; + } + +private: + ArduCAM _arducam; + int _format; + int _image_size; +}; diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/config.h b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/config.h new file mode 100644 index 00000000..ef40b4fa --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/config.h @@ -0,0 +1,49 @@ +#pragma once + +#include <string> + +using namespace std; + +// WiFi credentials +const char *SSID = ""; +const char *PASSWORD = ""; + +const char *PREDICTION_URL = ""; +const char *PREDICTION_KEY = ""; + +// Microsoft Azure DigiCert Global Root G2 global certificate +const char *CERTIFICATE = + "-----BEGIN CERTIFICATE-----\r\n" + "MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n" + "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n" + "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n" + "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n" + "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n" + "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n" + "ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n" + "wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n" + 
"iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n" + "ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n" + "aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n" + "0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n" + "gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n" + "sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n" + "lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n" + "N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n" + "Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n" + "AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n" + "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n" + "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n" + "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n" + "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n" + "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n" + "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n" + "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n" + "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n" + "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n" + "+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n" + "cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n" + "kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n" + "trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n" + "8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n" + "-----END CERTIFICATE-----\r\n"; \ No newline at end of file diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/main.cpp 
b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/main.cpp new file mode 100644 index 00000000..19af3bd2 --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/src/main.cpp @@ -0,0 +1,125 @@ +#include <Arduino.h> +#include <ArduinoJson.h> +#include <HTTPClient.h> +#include <rpcWiFi.h> +#include "SD/Seeed_SD.h" +#include <Seeed_FS.h> +#include <SPI.h> +#include <WiFiClientSecure.h> + +#include "config.h" +#include "camera.h" + +Camera camera = Camera(JPEG, OV2640_640x480); + +WiFiClientSecure client; + +void setupCamera() +{ + pinMode(PIN_SPI_SS, OUTPUT); + digitalWrite(PIN_SPI_SS, HIGH); + + Wire.begin(); + SPI.begin(); + + if (!camera.init()) + { + Serial.println("Error setting up the camera!"); + } +} + +void connectWiFi() +{ + while (WiFi.status() != WL_CONNECTED) + { + Serial.println("Connecting to WiFi.."); + WiFi.begin(SSID, PASSWORD); + delay(500); + } + + client.setCACert(CERTIFICATE); + Serial.println("Connected!"); +} + +void setup() +{ + Serial.begin(9600); + + while (!Serial) + ; // Wait for Serial to be ready + + delay(1000); + + connectWiFi(); + + setupCamera(); + + pinMode(WIO_KEY_C, INPUT_PULLUP); +} + +void classifyImage(byte *buffer, uint32_t length) +{ + HTTPClient httpClient; + httpClient.begin(client, PREDICTION_URL); + httpClient.addHeader("Content-Type", "application/octet-stream"); + httpClient.addHeader("Prediction-Key", PREDICTION_KEY); + + int httpResponseCode = httpClient.POST(buffer, length); + + if (httpResponseCode == 200) + { + String result = httpClient.getString(); + + DynamicJsonDocument doc(1024); + deserializeJson(doc, result.c_str()); + + JsonObject obj = doc.as<JsonObject>(); + JsonArray predictions = obj["predictions"].as<JsonArray>(); + + for(JsonVariant prediction : predictions) + { + String tag = prediction["tagName"].as<String>(); + float probability = prediction["probability"].as<float>(); + + char buff[32]; + sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0); + Serial.println(buff); + } + } + + httpClient.end(); +} + 
+void buttonPressed() +{ + camera.startCapture(); + + while (!camera.captureReady()) + delay(100); + + Serial.println("Image captured"); + + byte *buffer; + uint32_t length; + + if (camera.readImageToBuffer(&buffer, length)) + { + Serial.print("Image read to buffer with length "); + Serial.println(length); + + classifyImage(buffer, length); + + delete[] buffer; + } +} + +void loop() +{ + if (digitalRead(WIO_KEY_C) == LOW) + { + buttonPressed(); + delay(2000); + } + + delay(200); +} \ No newline at end of file diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/test/README b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/test/README new file mode 100644 index 00000000..b94d0890 --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/test/README @@ -0,0 +1,11 @@ + +This directory is intended for PlatformIO Unit Testing and project tests. + +Unit Testing is a software testing method by which individual units of +source code, sets of one or more MCU program modules together with associated +control data, usage procedures, and operating procedures, are tested to +determine whether they are fit for use. Unit testing finds problems early +in the development cycle. + +More information about PlatformIO Unit Testing: +- https://docs.platformio.org/page/plus/unit-testing.html diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/single-board-computer-classify-image.md b/4-manufacturing/lessons/2-check-fruit-from-device/single-board-computer-classify-image.md index 5e5e2f74..41b6994c 100644 --- a/4-manufacturing/lessons/2-check-fruit-from-device/single-board-computer-classify-image.md +++ b/4-manufacturing/lessons/2-check-fruit-from-device/single-board-computer-classify-image.md @@ -88,4 +88,4 @@ The Custom Vision service has a Python SDK you can use to classify images. 
> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-device](code-classify/virtual-device) folder. -😀 Your camera program was a success! +😀 Your fruit quality classifier program was a success! diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/translations/.dummy.md b/4-manufacturing/lessons/2-check-fruit-from-device/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/4-manufacturing/lessons/2-check-fruit-from-device/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md b/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md index 703c0eb0..8323de3b 100644 --- a/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md +++ b/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md @@ -286,7 +286,9 @@ The Wio Terminal can now be programmed to capture an image when a button is pres ### Task - capture an image -1. Microcontrollers run your code continuously, so it's not easy to trigger something like taking a photo without reacting to a sensor. The Wio Terminal has buttons, so the camera can be set up to be triggered by one of the buttons. Add the following code to the end of the `setup` function to configure the C button (one of the three buttons on the top, the one closest to the power switch): +1. Microcontrollers run your code continuously, so it's not easy to trigger something like taking a photo without reacting to a sensor. The Wio Terminal has buttons, so the camera can be set up to be triggered by one of the buttons. Add the following code to the end of the `setup` function to configure the C button (one of the three buttons on the top, the one closest to the power switch). + + ![The C button on the top closest to the power switch](../../../images/wio-terminal-c-button.png) ```cpp pinMode(WIO_KEY_C, INPUT_PULLUP); @@ -339,6 +341,7 @@ The Wio Terminal can now be programmed to capture an image when a button is pres { Serial.print("Image read to buffer with length "); Serial.println(length); + delete(buffer); } ``` @@ -455,4 +458,4 @@ The Wio Terminal only supports microSD cards of up to 16GB in size. 
If you have ![A picture of a banana captured using the ArduCam](../../../images/banana-arducam.jpg) - > 💁 It may take a few images for the white balance of the camera to adjust itself. You will notice this based on the color of the images captured, the first few may look off color. You can always work around this by changing the code to capture a few images that are ignored during the setup. + > 💁 It may take a few images for the white balance of the camera to adjust itself. You will notice this based on the color of the images captured, the first few may look off color. You can always work around this by changing the code to capture a few images that are ignored in the `setup` function. diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-classify-image.md b/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-classify-image.md index 7ab55f6a..6a646ef4 100644 --- a/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-classify-image.md +++ b/4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-classify-image.md @@ -1,3 +1,215 @@ # Classify an image - Wio Terminal -Coming soon! +In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it. + +## Classify an image + +The Custom Vision service has a REST API you can call from the Wio Terminal to classify images. This REST API is accessed over an HTTPS connection - a secure HTTP connection. + +When interacting with HTTPS endpoints, the client code needs to request the public key certificate from the server being accessed, and use that to encrypt the traffic it sends. Your web browser does this automatically, but microcontrollers do not. You will need to request this certificate manually and use it to create a secure connection to the REST API. These certificates don't change, so once you have a certificate, it can be hard coded in your application. 
+ +These certificates contain public keys, and don't need to be kept secure. You can use them in your source code and share them in public on places like GitHub. + +### Task - set up a SSL client + +1. Open the `fruit-quality-detector` app project if it's not already open + +1. Open the `config.h` header file, and add the following: + + ```cpp + const char *CERTIFICATE = + "-----BEGIN CERTIFICATE-----\r\n" + "MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n" + "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n" + "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n" + "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n" + "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n" + "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n" + "ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n" + "wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n" + "iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n" + "ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n" + "aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n" + "0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n" + "gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n" + "sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n" + "lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n" + "N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n" + "Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n" + "AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n" + "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n" + "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n" + "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n" + 
"Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n" + "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n" + "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n" + "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n" + "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n" + "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n" + "+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n" + "cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n" + "kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n" + "trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n" + "8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n" + "-----END CERTIFICATE-----\r\n"; + ``` + + This is the *Microsoft Azure DigiCert Global Root G2 certificate* - it's one of the certificates used by many Azure services globally. + + > 💁 To see that this is the certificate to use, run the following command on macOS or Linux. If you are using Windows, you can run this command using the [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/windows/wsl/?WT.mc_id=academic-17441-jabenn): + > + > ```sh + > openssl s_client -showcerts -verify 5 -connect api.cognitive.microsoft.com:443 + > ``` + > + > The output will list the DigiCert Global Root G2 certificate. + +1. Open `main.cpp` and add the following include directive: + + ```cpp + #include <WiFiClientSecure.h> + ``` + +1. Below the include directives, declare an instance of `WiFiClientSecure`: + + ```cpp + WiFiClientSecure client; + ``` + + This class contains code to communicate with web endpoints over HTTPS. + +1. In the `connectWiFi` method, set the WiFiClientSecure to use the DigiCert Global Root G2 certificate: + + ```cpp + client.setCACert(CERTIFICATE); + ``` + +### Task - classify an image + +1. 
Add the following as an additional line to the `lib_deps` list in the `platformio.ini` file: + + ```ini + bblanchon/ArduinoJson @ 6.17.3 + ``` + + This imports [ArduinoJson](https://arduinojson.org), an Arduino JSON library, and will be used to decode the JSON response from the REST API. + +1. In `config.h`, add constants for the prediction URL and key from the Custom Vision service: + + ```cpp + const char *PREDICTION_URL = ""; + const char *PREDICTION_KEY = ""; + ``` + + Replace `` with the prediction URL from Custom Vision. Replace `` with the prediction key. + +1. In `main.cpp`, add an include directive for the ArduinoJson library: + + ```cpp + #include <ArduinoJson.h> + ``` + +1. Add the following function to `main.cpp`, above the `buttonPressed` function: + + ```cpp + void classifyImage(byte *buffer, uint32_t length) + { + HTTPClient httpClient; + httpClient.begin(client, PREDICTION_URL); + httpClient.addHeader("Content-Type", "application/octet-stream"); + httpClient.addHeader("Prediction-Key", PREDICTION_KEY); + + int httpResponseCode = httpClient.POST(buffer, length); + + if (httpResponseCode == 200) + { + String result = httpClient.getString(); + + DynamicJsonDocument doc(1024); + deserializeJson(doc, result.c_str()); + + JsonObject obj = doc.as<JsonObject>(); + JsonArray predictions = obj["predictions"].as<JsonArray>(); + + for(JsonVariant prediction : predictions) + { + String tag = prediction["tagName"].as<String>(); + float probability = prediction["probability"].as<float>(); + + char buff[32]; + sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0); + Serial.println(buff); + } + } + + httpClient.end(); + } + ``` + + This code starts by declaring an `HTTPClient` - a class that contains methods to interact with REST APIs. It then connects the client to the prediction URL using the `WiFiClientSecure` instance that was set up with the Azure public key. + + Once connected, it sends headers - information about the upcoming request that will be made against the REST API. 
The `Content-Type` header indicates the API call will send raw binary data, and the `Prediction-Key` header passes the Custom Vision prediction key. + + Next a POST request is made using the HTTP client, uploading a byte array. This will contain the JPEG image captured from the camera when this function is called. + + > 💁 POST requests are meant for sending data and getting a response. There are other request types such as GET requests that retrieve data. GET requests are used by your web browser to load web pages. + + The POST request returns a response status code. These are well-defined values, with 200 meaning **OK** - the POST request was successful. + + > 💁 You can see all the response status codes in the [List of HTTP status codes page on Wikipedia](https://wikipedia.org/wiki/List_of_HTTP_status_codes) + + If a 200 is returned, the result is read from the HTTP client. This is a text response from the REST API with the results of the prediction as a JSON document. The JSON is in the following format: + + ```json + { + "id":"45d614d3-7d6f-47e9-8fa2-04f237366a16", + "project":"135607e5-efac-4855-8afb-c93af3380531", + "iteration":"04f1c1fa-11ec-4e59-bb23-4c7aca353665", + "created":"2021-06-10T17:58:58.959Z", + "predictions":[ + { + "probability":0.5582016, + "tagId":"05a432ea-9718-4098-b14f-5f0688149d64", + "tagName":"ripe" + }, + { + "probability":0.44179836, + "tagId":"bb091037-16e5-418e-a9ea-31c6a2920f17", + "tagName":"unripe" + } + ] + } + ``` + + The important part here is the `predictions` array. This contains the predictions, with one entry for each tag containing the tag name and the probability. The probabilities returned are floating point numbers from 0-1, with 0 being a 0% chance of matching the tag, and 1 being a 100% chance. + + > 💁 Image classifiers will return the percentages for all tags that have been used. Each tag will have a probability that the image matches that tag. 
+ + This JSON is decoded, and the probabilities for each tag are sent to the serial monitor. + +1. In the `buttonPressed` function, either replace the code that saves to the SD card with a call to `classifyImage`, or add it after the image is written, but **before** the buffer is deleted: + + ```cpp + classifyImage(buffer, length); + ``` + + > 💁 If you replace the code that saves to the SD card, you can clean up your code by removing the `setupSDCard` and `saveToSDCard` functions. + +1. Upload and run your code. Point the camera at some fruit and press the C button. You will see the output in the serial monitor: + + ```output + Connecting to WiFi.. + Connected! + Image captured + Image read to buffer with length 8200 + ripe: 56.84% + unripe: 43.16% + ``` + + You will be able to see the image that was taken, and these values in the **Predictions** tab in Custom Vision. + + ![A banana in custom vision predicted ripe at 56.8% and unripe at 43.1%](../../../images/custom-vision-banana-prediction.png) + +> 💁 You can find this code in the [code-classify/wio-terminal](code-classify/wio-terminal) folder. + +😀 Your fruit quality classifier program was a success! diff --git a/4-manufacturing/lessons/3-run-fruit-detector-edge/translations/.dummy.md b/4-manufacturing/lessons/3-run-fruit-detector-edge/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/4-manufacturing/lessons/3-run-fruit-detector-edge/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/README.md b/4-manufacturing/lessons/4-trigger-fruit-detector/README.md index a92ce100..8b1a9350 100644 --- a/4-manufacturing/lessons/4-trigger-fruit-detector/README.md +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/README.md @@ -12,46 +12,215 @@ Add a sketchnote if possible/appropriate An IoT application is not just a single device capturing data and sending it to the cloud, it is more often than not multiple devices all working together to capture data from the physical world using sensors, make decisions based off that data, and interacting back with the physical world via actuators or visualizations. -In this lesson you will learn more about architecting complex IoT applications, incorporating multiple sensors, multiple cloud services to analyze and store data, and showing a response via an actuator. You will piece together a more advanced fruit quality tracking system. +In this lesson you will learn more about architecting complex IoT applications, incorporating multiple sensors, multiple cloud services to analyze and store data, and showing a response via an actuator. You will learn how to architect a fruit quality control system prototype, including using proximity sensors to trigger the IoT application, and what the architecture of this prototype would be. 
In this lesson we'll cover: * [Architect complex IoT applications](#architect-complex-iot-applications) * [Design a fruit quality control system](#design-a-fruit-quality-control-system) * [Trigger fruit quality checking from a sensor](#trigger-fruit-quality-checking-from-a-sensor) -* [Store fruit quality data](#store-fruit-quality-data) -* [Control feedback via an actuator](#control-feedback-via-an-actuator) +* [Data used for a fruit quality detector](#data-used-for-a-fruit-quality-detector) +* [Using developer devices to simulate multiple IoT devices](#using-developer-devices-to-simulate-multiple-iot-devices) +* [Moving to production](#moving-to-production) ## Architect complex IoT applications +IoT applications are made up of many components. This includes a variety of things, and a variety of internet services. + +IoT applications can be described as *things* (devices) sending data that generates *insights*. These *insights* generate *actions* to improve a business or process. An example is an engine (the thing) sending temperature data. This data is used to evaluate whether the engine is performing as expected (the insight). The insight is used to proactively prioritize the maintenance schedule for the engine (the action). + +* Different things gather different pieces of data. +* IoT services give insights over that data, sometimes augmenting it with data from additional sources. +* These insights drive actions, including controlling actuators in devices, or visualizing data. + +### Reference IoT architecture + ![A reference iot architecture](../../../images/iot-reference-architecture.png) -***A reference iot architecture. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)*** +***A reference iot architecture. Microcontroller by Template / IoT by Adrien Coquet / Brain by Icon Market - all from the [Noun Project](https://thenounproject.com)*** + +The diagram above shows a reference IoT architecture. 
+ +> 🎓 A *reference architecture* is an example architecture you can use as a reference when designing new systems. In this case, if you were building a new IoT system you can follow the reference architecture, substituting your own devices and services where appropriate. + +* **Things** are devices that gather data from sensors, maybe interacting with edge services to interpret that data, such as image classifiers to interpret image data. The data from the devices is sent to an IoT service. +* **Insights** come from serverless applications, or from analytics run on stored data. +* **Actions** can be commands sent to devices, or visualization of data allowing humans to make decisions. + +![A reference iot architecture](../../../images/iot-reference-architecture-azure.png) + +***A reference iot architecture. Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)*** + +The diagram above shows some of the components and services covered so far in these lessons and how they link together in a reference IoT architecture. + +* **Things** - you've written device code to capture data from sensors, and analyze images using Custom Vision running both in the cloud and on an edge device. This data was sent to IoT Hub. +* **Insights** - you've used Azure Functions to respond to messages sent to an IoT Hub, and stored data for later analysis in Azure Storage. +* **Actions** - you've controlled actuators based on decisions made in the cloud and commands sent to the devices, and you've visualized data using Azure Maps. + + +✅ Think about other IoT devices you have used, such as smart home appliances. What are the things, insights and actions involved in that device and its software? + +This pattern can be scaled out as large or small as you need, adding more devices and more services. + +### Data and security + +As you define the architecture of your system, you need to constantly consider data and security. 
+ +* What data does your device send and receive? +* How should that data be secured and protected? +* How should access to the device and cloud service be controlled? + +✅ Think about the data security of any IoT devices you own. How much of that data is personal and should be kept private, both in transit and when stored? What data should not be stored? ## Design a fruit quality control system +Let's now take this idea of things, insights, and actions and apply it to our fruit quality detector to design a larger end-to-end application. + +Imagine you have been given the task of building a fruit quality detector to be used in a processing plant. Fruit travels on a conveyor belt system where currently employees spend time checking the fruit by hand and removing any unripe fruit as it arrives. To reduce costs, the plant owner wants an automated system. + +✅ One of the trends with the rise of IoT (and technology in general) is that manual jobs are being replaced by machines. Do some research: How many jobs are estimated to be lost to IoT? How many new jobs will be created building IoT devices? + +You need to build a system where fruit is detected as it arrives on the conveyor belt, then photographed and checked using an AI model running on the edge. The results are then sent to the cloud to be stored, and if the fruit is unripe a notification is given so the unripe fruit can be removed. + +| | | +| - | - | +| **Things** | Detector for fruit arriving on the conveyor belt
Camera to photograph and classify the fruit
Edge device running the classifier
Device to notify of unripe fruit | +| **Insights** | Decide to check the ripeness of the fruit
Store the results of the ripeness classification
Determine if there is a need to alert about unripe fruit | +| **Actions** | Send a command to a device to photograph the fruit and check it with an image classifier
Send a command to a device to alert that the fruit is unripe | + +### Prototyping your application + ![A reference iot architecture for fruit quality checking](../../../images/iot-reference-architecture-fruit-quality.png) ***A reference iot architecture for fruit quality checking. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)*** +The diagram above shows a reference architecture for this prototype application. + +* An IoT device with a proximity sensor detects the arrival of fruit. This sends a message to the cloud to say fruit has been detected. +* A serverless application in the cloud sends a command to another device to take a photograph and classify the image. +* An IoT device with a camera takes a picture and sends it to an image classifier running on the edge. The results are then sent to the cloud. +* A serverless application in the cloud stores this information to be analyzed later to see what percentage of fruit is unripe. If the fruit is unripe it sends a command to another IoT device to alert factory workers there is unripe fruit via an LED. + +> 💁 This entire IoT application could be implemented as a single device, with all the logic to start the image classification and control the LED built in. It could use an IoT Hub just to track the number of unripe fruits detected and configure the device. In this lesson it is expanded to demonstrate the concepts for large scale IoT applications. + +For the prototype, you will implement all of this on a single device. If you are using a microcontroller then you will use a separate edge device to run the image classifier. You have already learned most of the things you will need to be able to build this. + ## Trigger fruit quality checking from a sensor +The IoT device needs some kind of trigger to indicate when fruit is ready to be classified. 
One trigger for this would be to measure when the fruit is at the right location on the conveyor belt by measuring the distance to a sensor. + +![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png) + +***Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back. Bananas by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)*** + +Proximity sensors can be used to measure the distance from the sensor to an object. They usually transmit a beam of electromagnetic radiation such as a laser beam or infra-red light, then detect the radiation bouncing off an object. The time between the laser beam being sent and the signal bouncing back can be used to calculate the distance from the sensor to the object. + +> 💁 You have probably used proximity sensors without even knowing about it. Most smartphones will turn the screen off when you hold them to your ear to stop you accidentally ending a call with your earlobe, and this works using a proximity sensor, detecting an object close to the screen during a call and disabling the touch capabilities until the phone is a certain distance away. + ### Task - trigger fruit quality detection from a distance sensor -## Store fruit quality data +Work through the relevant guide to use a proximity sensor to detect an object using your IoT device: -## Control feedback via an actuator +* [Arduino - Wio Terminal](wio-terminal-proximity.md) +* [Single-board computer - Raspberry Pi](pi-proximity.md) +* [Single-board computer - Virtual device](virtual-device-proximity.md) + +## Data used for a fruit quality detector + +The prototype fruit detector has multiple components communicating with each other. 
+ +![The components communicating with each other](../../../images/fruit-quality-detector-message-flow.png) + +* A proximity sensor measuring the distance to a piece of fruit and sending this to IoT Hub +* The command to control the camera coming from IoT Hub to the camera device +* The results of the image classification being sent to IoT Hub +* The command to control an LED to alert when the fruit is unripe being sent from IoT Hub to the device with the LED + +It is good to define the structure of these messages up front, before you build out the application. + +> 💁 Pretty much every experienced developer has at some point in their career spent hours, days or even weeks chasing down bugs caused by differences in the data being sent compared to what is expected. + +For example - if you are sending temperature information, how would you define the JSON? You could have a field called `temperature`, or you could use the common abbreviation `temp`. + +```json +{ + "temperature": 20.7 +} +``` + +compared to: + +```json +{ + "temp": 20.7 +} +``` + +You also have to consider units - is the temperature in °C or °F? If you are measuring temperature using a consumer device and they change the display units, you need to make sure the units sent to the cloud remain consistent. + +✅ Do some research: How did unit problems cause the $125 million Mars Climate Orbiter to crash? + +Think about the data being sent for the fruit quality detector. How would you define each message? Where would you analyze the data and make decisions about what data to send? + +For example - triggering the image classification using the proximity sensor. The IoT device measures the distance, but where is the decision made? Does the device decide that the fruit is close enough and sends a message to tell the IoT Hub to trigger the classification? Or does it send proximity measurements and let the IoT Hub decide? + +The answer to questions like this is - it depends. 
Each use case is different, which is why as an IoT developer you need to understand the system you are building, how it is used, and the data being detected. + +* If the decision is made by the IoT Hub, you need to send multiple distance measurements. +* If you send too many messages, it increases the cost of the IoT Hub, and the amount of bandwidth needed by your IoT devices (especially in a factory with millions of devices). It can also slow down your device. +* If you make the decision on the device, you will need to provide a way to configure the device to fine tune the machine. + +## Using developer devices to simulate multiple IoT devices + +To build your prototype, you will need your IoT dev kit to act like multiple devices, sending telemetry and responding to commands. + +### Simulating multiple IoT devices on a Raspberry Pi or virtual IoT hardware + +When using a single board computer like a Raspberry Pi, you are able to run multiple applications at once. This means you can simulate multiple IoT devices by creating multiple applications, one per 'IoT device'. For example, you can implement each device as a separate Python file and run them in different terminal sessions. + +> 💁 Be aware that some hardware won't work when being accessed by multiple applications running simultaneously. + +### Simulating multiple devices on a microcontroller + +Simulating multiple devices is more complicated on microcontrollers. Unlike single board computers, you cannot run multiple applications at once, so you have to include all the logic for all the separate IoT devices in a single application. + +Some suggestions to make this process easier are: + +* Create one or more classes per IoT device - for example classes called `DistanceSensor`, `ClassifierCamera`, `LEDController`. Each one can have its own `setup` and `loop` methods called by the main `setup` and `loop` functions. +* Handle commands in a single place, and direct them to the relevant device class as required. 
+* In the main `loop` function, you will need to consider the timing for each different device. For example, if you have one device class that needs to process every 10 seconds, and another that needs to process every 1 second, then in your main `loop` function use a 1 second delay. On every `loop` call, run the code for the device that needs to process every second, and use a counter to count each loop, processing the other device when the counter reaches 10 (resetting the counter afterwards). + +## Moving to production + +The prototype will form the basis of a final production system. Some of the differences when you move to production would be: + +* Ruggedized components - using hardware designed to withstand the noise, heat, vibration and stress of a factory. +* Using internal communications - some of the components would communicate directly avoiding the hop to the cloud, only sending data to the cloud to be stored. How this is done depends on the factory setup, with either direct communications, or by running part of the IoT service on the edge using a gateway device. +* Configuration options - each factory and use case is different, so the hardware would need to be configurable. For example, the proximity sensor may need to detect different fruit at different distances. Rather than hard code the distance to trigger the classification, you would want this to be configurable via the cloud, for example using a device twin. +* Automated fruit removal - instead of an LED to alert that fruit is unripe, automated devices would remove it. + +✅ Do some research: In what other ways would production devices differ from developer kits? --- ## 🚀 Challenge +In this lesson you have learned some of the concepts you need to know to architect an IoT system. Think back to the previous projects. How do they fit into the reference architecture shown above? 
+ +Pick one of the projects so far and think of the design of a more complicated solution bringing together multiple capabilities beyond what was covered in the projects. Draw the architecture and think of all the devices and services you would need. + +For example - a vehicle tracking device that combines GPS with sensors to monitor things like temperatures in a refrigerated truck, the engine on and off times, and the identity of the driver. What are the devices involved, the services involved, the data being transmitted and the security and privacy considerations? + ## Post-lecture quiz [Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/36) ## Review & Self Study +* Read more about IoT architecture on the [Azure IoT reference architecture documentation on Microsoft docs](https://docs.microsoft.com/azure/architecture/reference-architectures/iot?WT.mc_id=academic-17441-jabenn) +* Read more about device twins in the [Understand and use device twins in IoT Hub documentation on Microsoft docs](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-device-twins?WT.mc_id=academic-17441-jabenn) +* Read about OPC-UA, a machine to machine communication protocol used in industrial automation on the [OPC-UA page on Wikipedia](https://wikipedia.org/wiki/OPC_Unified_Architecture) + ## Assignment -[](assignment.md) +[Build a fruit quality detector](assignment.md) diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md b/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md index da157d5c..6063c695 100644 --- a/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/assignment.md @@ -1,9 +1,18 @@ -# +# Build a fruit quality detector ## Instructions +Build the fruit quality detector! + +Take everything you have learned so far and build the prototype fruit quality detector. 
Trigger image classification based on proximity using an AI model running on the edge, store the results of the classification in storage, and control an LED based on the ripeness of the fruit. + +You should be able to piece this together using code you have previously written in all the lessons so far. + ## Rubric | Criteria | Exemplary | Adequate | Needs Improvement | | -------- | --------- | -------- | ----------------- | -| | | | | +| Configure all the services | Was able to set up an IoT Hub, Azure Functions application and Azure Storage | Was able to set up the IoT Hub, but not either the Azure Functions app or Azure Storage | Was unable to set up any of the IoT services | +| Monitor proximity and send the data to IoT Hub if an object is closer than a pre-defined distance and trigger the camera via a command | Was able to measure distance and send a message to an IoT Hub when an object is close enough, and have a command sent to trigger the camera | Was able to measure proximity and send to IoT Hub, but unable to get a command sent to the camera | Was unable to measure distance and send a message to IoT Hub, or trigger a command | +| Capture an image, classify it and send the results to IoT Hub | Was able to capture an image, classify it using an edge device and send the results to IoT Hub | Was able to classify the image but not using an edge device, or was unable to send the results to IoT Hub | Was unable to classify an image | +| Turn the LED on or off depending on the results of the classification using a command sent to a device | Was able to turn an LED on via a command if the fruit was unripe | Was able to send the command to the device but not control the LED | Was unable to send a command to control the LED | diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/code-proximity/pi/fruit-quality-detector/distance_sensor.py b/4-manufacturing/lessons/4-trigger-fruit-detector/code-proximity/pi/fruit-quality-detector/distance_sensor.py new 
file mode 100644 index 00000000..f3800461 --- /dev/null +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/code-proximity/pi/fruit-quality-detector/distance_sensor.py @@ -0,0 +1,11 @@ +import time +from grove.i2c import Bus +from rpi_vl53l0x.vl53l0x import VL53L0X + +distance_sensor = VL53L0X(bus = Bus().bus) +distance_sensor.begin() + +while True: + distance_sensor.wait_ready() + print(f'Distance = {distance_sensor.get_distance()} mm') + time.sleep(1) \ No newline at end of file diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/code-proximity/virtual-iot-device/fruit-quality-detector/distance_sensor.py b/4-manufacturing/lessons/4-trigger-fruit-detector/code-proximity/virtual-iot-device/fruit-quality-detector/distance_sensor.py new file mode 100644 index 00000000..8db3d0b0 --- /dev/null +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/code-proximity/virtual-iot-device/fruit-quality-detector/distance_sensor.py @@ -0,0 +1,14 @@ +from counterfit_connection import CounterFitConnection +CounterFitConnection.init('127.0.0.1', 5000) + +import time + +from counterfit_shims_rpi_vl53l0x.vl53l0x import VL53L0X + +distance_sensor = VL53L0X() +distance_sensor.begin() + +while True: + distance_sensor.wait_ready() + print(f'Distance = {distance_sensor.get_distance()} mm') + time.sleep(1) \ No newline at end of file diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/pi-proximity.md b/4-manufacturing/lessons/4-trigger-fruit-detector/pi-proximity.md new file mode 100644 index 00000000..bf887291 --- /dev/null +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/pi-proximity.md @@ -0,0 +1,98 @@ +# Detect proximity - Raspberry Pi + +In this part of the lesson, you will add a proximity sensor to your Raspberry Pi, and read distance from it. + +## Hardware + +The Raspberry Pi needs a proximity sensor. + +The sensor you'll use is a [Grove Time of Flight distance sensor](https://www.seeedstudio.com/Grove-Time-of-Flight-Distance-Sensor-VL53L0X.html). 
This sensor uses a laser ranging module to detect distance. This sensor has a range of 10mm to 2000mm (1cm - 2m), and will report values in that range pretty accurately, with distances above 1000mm reported as 8109mm. + +The laser rangefinder is on the back of the sensor, the opposite side to the Grove socket. + +This is an I2C sensor. + +### Connect the time of flight sensor + +The Grove time of flight sensor can be connected to the Raspberry Pi. + +#### Task - connect the time of flight sensor + +Connect the time of flight sensor. + +![A grove time of flight sensor](../../../images/grove-time-of-flight-sensor.png) + +1. Insert one end of a Grove cable into the socket on the time of flight sensor. It will only go in one way round. + +1. With the Raspberry Pi powered off, connect the other end of the Grove cable to one of the I2C sockets marked **I2C** on the Grove Base hat attached to the Pi. These sockets are on the bottom row, the opposite end to the GPIO pins and next to the camera cable slot. + +![The grove time of flight sensor connected to the I squared C socket](../../../images/pi-time-of-flight-sensor.png) + +## Program the time of flight sensor + +The Raspberry Pi can now be programmed to use the attached time of flight sensor. + +### Task - program the time of flight sensor + +Program the device. + +1. Power up the Pi and wait for it to boot. + +1. Open the `fruit-quality-detector` code in VS Code, either directly on the Pi, or connect via the Remote SSH extension. + +1. Create a new file in this project called `distance_sensor.py`. + + > 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time. + +1. Add the following code to this file: + + ```python + import time + + from grove.i2c import Bus + from rpi_vl53l0x.vl53l0x import VL53L0X + ``` + + This imports the Grove I2C bus library, and a sensor library for the core sensor hardware built into the Grove time of flight sensor. + +1. 
Below this, add the following code to access the sensor: + + ```python + distance_sensor = VL53L0X(bus = Bus().bus) + distance_sensor.begin() + ``` + + This code declares a distance sensor using the Grove I2C bus, then starts the sensor. + +1. Finally, add an infinite loop to read distances: + + ```python + while True: + distance_sensor.wait_ready() + print(f'Distance = {distance_sensor.get_distance()} mm') + time.sleep(1) + ``` + + This code waits for a value to be ready to read from the sensor, then prints it to the console. + +1. Run this code. + + > 💁 Don't forget this file is called `distance_sensor.py`! Make sure to run this via Python, not `app.py`. + +1. You will see distance measurements appear in the console. Position objects near the sensor and you will see the distance measurement: + + ```output + pi@raspberrypi:~/fruit-quality-detector $ python3 distance_sensor.py + Distance = 29 mm + Distance = 28 mm + Distance = 30 mm + Distance = 151 mm + ``` + + The rangefinder is on the back of the sensor, so make sure you use the correct side when measuring distance. + + ![The rangefinder on the back of the time of flight sensor pointing at a banana](../../../images/time-of-flight-banana.png) + +> 💁 You can find this code in the [code-proximity/pi](code-proximity/pi) folder. + +😀 Your proximity sensor program was a success! \ No newline at end of file diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/translations/.dummy.md b/4-manufacturing/lessons/4-trigger-fruit-detector/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/virtual-device-proximity.md b/4-manufacturing/lessons/4-trigger-fruit-detector/virtual-device-proximity.md new file mode 100644 index 00000000..dd999f05 --- /dev/null +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/virtual-device-proximity.md @@ -0,0 +1,107 @@ +# Detect proximity - Virtual IoT Hardware + +In this part of the lesson, you will add a proximity sensor to your virtual IoT device, and read distance from it. + +## Hardware + +The virtual IoT device will use a simulated distance sensor. + +In a physical IoT device you would use a sensor with a laser ranging module to detect distance. + +### Add the distance sensor to CounterFit + +To use a virtual distance sensor, you need to add one to the CounterFit app. + +#### Task - add the distance sensor to CounterFit + +Add the distance sensor to the CounterFit app. + +1. Open the `fruit-quality-detector` code in VS Code, and make sure the virtual environment is activated. + +1. Install an additional Pip package containing a CounterFit shim that can talk to distance sensors by simulating the [rpi-vl53l0x Pip package](https://pypi.org/project/rpi-vl53l0x/), a Python package that interacts with [a VL53L0X time-of-flight distance sensor](https://wiki.seeedstudio.com/Grove-Time_of_Flight_Distance_Sensor-VL53L0X/). Make sure you are installing this from a terminal with the virtual environment activated. + + ```sh + pip install counterfit-shims-rpi-vl53l0x + ``` + +1. Make sure the CounterFit web app is running. + +1. Create a distance sensor: + + 1. In the *Create sensor* box in the *Sensors* pane, drop down the *Sensor type* box and select *Distance*. + + 1. 
Leave the *Units* as `Millimeter`. + + 1. This sensor is an I2C sensor, so set the address to `0x29`. If you used a physical VL53L0X sensor it would be hardcoded to this address. + + 1. Select the **Add** button to create the distance sensor. + + ![The distance sensor settings](../../../images/counterfit-create-distance-sensor.png) + + The distance sensor will be created and appear in the sensors list. + + ![The distance sensor created](../../../images/counterfit-distance-sensor.png) + +## Program the distance sensor + +The virtual IoT device can now be programmed to use the simulated distance sensor. + +### Task - program the distance sensor + +1. Create a new file in the `fruit-quality-detector` project called `distance-sensor.py`. + + > 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time. + +1. Start a connection to CounterFit with the following code: + + ```python + from counterfit_connection import CounterFitConnection + CounterFitConnection.init('127.0.0.1', 5000) + ``` + +1. Add the following code below this: + + ```python + import time + + from counterfit_shims_rpi_vl53l0x.vl53l0x import VL53L0X + ``` + + This imports the sensor library shim for the VL53L0X time of flight sensor. + +1. Below this, add the following code to access the sensor: + + ```python + distance_sensor = VL53L0X() + distance_sensor.begin() + ``` + + This code declares a distance sensor, then starts the sensor. + +1. Finally, add an infinite loop to read distances: + + ```python + while True: + distance_sensor.wait_ready() + print(f'Distance = {distance_sensor.get_distance()} mm') + time.sleep(1) + ``` + + This code waits for a value to be ready to read from the sensor, then prints it to the console. + +1. Run this code. + + > 💁 Don't forget this file is called `distance-sensor.py`! Make sure to run this via Python, not `app.py`. + +1. You will see distance measurements appear in the console. 
Change the value in CounterFit to see this value change, or use random values. + + ```output + (.venv) ➜ fruit-quality-detector python distance-sensor.py + Distance = 37 mm + Distance = 42 mm + Distance = 29 mm + ``` + +> 💁 You can find this code in the [code-proximity/virtual-iot-device](code-proximity/virtual-iot-device) folder. + +😀 Your proximity sensor program was a success! \ No newline at end of file diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/wio-terminal-proximity.md b/4-manufacturing/lessons/4-trigger-fruit-detector/wio-terminal-proximity.md new file mode 100644 index 00000000..f6e5a764 --- /dev/null +++ b/4-manufacturing/lessons/4-trigger-fruit-detector/wio-terminal-proximity.md @@ -0,0 +1,40 @@ +# Detect proximity - Wio Terminal + +In this part of the lesson, you will add a proximity sensor to your Wio Terminal, and read distance from it. + +## Hardware + +The Wio Terminal needs a proximity sensor. + +The sensor you'll use is a [Grove Time of Flight distance sensor](https://www.seeedstudio.com/Grove-Time-of-Flight-Distance-Sensor-VL53L0X.html). This sensor uses a laser ranging module to detect distance. This sensor has a range of 10mm to 2000mm (1cm - 2m), and will report values in that range pretty accurately, with distances above 1000mm reported as 8109mm. + +The laser rangefinder is on the back of the sensor, the opposite side to the Grove socket. + +This is an I2C sensor. + +### Connect the time of flight sensor + +The Grove time of flight sensor can be connected to the Wio Terminal. + +#### Task - connect the time of flight sensor + +Connect the time of flight sensor. + +![A grove time of flight sensor](../../../images/grove-time-of-flight-sensor.png) + +1. Insert one end of a Grove cable into the socket on the time of flight sensor. It will only go in one way round. + +1. 
With the Wio Terminal disconnected from your computer or other power supply, connect the other end of the Grove cable to the left-hand side Grove socket on the Wio Terminal as you look at the screen. This is the socket closest to the power button. This is a combined digital and I2C socket. + +![The grove time of flight sensor connected to the left hand socket](../../../images/wio-time-of-flight-sensor.png) + +1. You can now connect the Wio Terminal to your computer. + ## Program the time of flight sensor + +The Wio Terminal can now be programmed to use the attached time of flight sensor. + +### Task - program the time of flight sensor + +1. Create a brand new Wio Terminal project using PlatformIO. Call this project `distance-sensor`. Add code in the `setup` function to configure the serial port. + diff --git a/4-manufacturing/translations/.dummy.md b/4-manufacturing/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/4-manufacturing/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/5-retail/translations/.dummy.md b/5-retail/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/5-retail/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/6-consumer/translations/.dummy.md b/6-consumer/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/6-consumer/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) . + +## THANK YOU +We truly appreciate your efforts! diff --git a/README.md b/README.md index 8b42fd59..2dc2b3bb 100644 --- a/README.md +++ b/README.md @@ -20,7 +20,7 @@ The projects cover the journey of food from farm to table. This includes farming ![A road map for the course showing 24 lessons covering intro, farming, transport, processing, retail and cooking](sketchnotes/Roadmap.png) -**Hearty thanks to our authors [Jen Looper](https://github.com/jlooper), [Jim Bennett](https://github.com/jimbobbennett), and sketchnote artist [Nitya Narasimhan](https://github.com/nitya)** +**Hearty thanks to our authors [Jen Fox](https://github.com/jenfoxbot), [Jen Looper](https://github.com/jlooper), [Jim Bennett](https://github.com/jimbobbennett), and sketchnote artist [Nitya Narasimhan](https://github.com/nitya)** > **Teachers**, we have [included some suggestions](for-teachers.md) on how to use this curriculum. If you would like to create your own lessons, we have also included a [lesson template](lesson-template/README.md). 
diff --git a/hardware.md b/hardware.md index b186c9b2..a66999e3 100644 --- a/hardware.md +++ b/hardware.md @@ -46,7 +46,7 @@ These are specific to using the Raspberry Pi, and are not relevant to using the * [Raspberry Pi Camera module](https://www.raspberrypi.org/products/camera-module-v2/) * Microphone and speaker: * Any USB Microphone - * Any USB speaker, or speaker with a 3.5mm cable + * Any USB speaker, or speaker with a 3.5mm cable, or using HDMI audio if your Raspberry Pi is connected to a monitor with speakers or * [USB Speakerphone](https://www.amazon.com/USB-Speakerphone-Conference-Business-Microphones/dp/B07Q3D7F8S/ref=sr_1_1?dchild=1&keywords=m0&qid=1614647389&sr=8-1) * [Grove Sunlight sensor](https://www.seeedstudio.com/Grove-Sunlight-Sensor.html) @@ -60,7 +60,7 @@ Most of the sensors and actuators needed are used by both the Arduino and Raspbe * [Grove capacitive soil moisture sensor](https://www.seeedstudio.com/Grove-Capacitive-Moisture-Sensor-Corrosion-Resistant.html) * [Grove relay](https://www.seeedstudio.com/Grove-Relay.html) * [Grove GPS (Air530)](https://www.seeedstudio.com/Grove-GPS-Air530-p-4584.html) -* [Grove - Ultrasonic Distance Sensor](https://www.seeedstudio.com/Grove-Ultrasonic-Distance-Sensor.html) +* [Grove - Time of flight Distance Sensor](https://www.seeedstudio.com/Grove-Time-of-Flight-Distance-Sensor-VL53L0X.html) ## Optional hardware diff --git a/images/Diagrams.sketch b/images/Diagrams.sketch index 1f53a291..ad785858 100644 Binary files a/images/Diagrams.sketch and b/images/Diagrams.sketch differ diff --git a/images/counterfit-create-distance-sensor.png b/images/counterfit-create-distance-sensor.png new file mode 100644 index 00000000..6cf60a61 Binary files /dev/null and b/images/counterfit-create-distance-sensor.png differ diff --git a/images/counterfit-distance-sensor.png b/images/counterfit-distance-sensor.png new file mode 100644 index 00000000..63e29c8d Binary files /dev/null and b/images/counterfit-distance-sensor.png 
differ diff --git a/images/fruit-quality-detector-message-flow.png b/images/fruit-quality-detector-message-flow.png new file mode 100644 index 00000000..311a527f Binary files /dev/null and b/images/fruit-quality-detector-message-flow.png differ diff --git a/images/grove-time-of-flight-sensor.png b/images/grove-time-of-flight-sensor.png new file mode 100644 index 00000000..dc6f2562 Binary files /dev/null and b/images/grove-time-of-flight-sensor.png differ diff --git a/images/icons/noun_Button_23781.svg b/images/icons/noun_Button_23781.svg deleted file mode 100644 index 28687cc8..00000000 --- a/images/icons/noun_Button_23781.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Dan Hetteix from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_CPU_3204712.svg b/images/icons/noun_CPU_3204712.svg deleted file mode 100644 index e832c565..00000000 --- a/images/icons/noun_CPU_3204712.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Icon Lauk from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Calendar_3743023.svg b/images/icons/noun_Calendar_3743023.svg deleted file mode 100644 index d47de4d6..00000000 --- a/images/icons/noun_Calendar_3743023.svg +++ /dev/null @@ -1,6 +0,0 @@ -Created by Alice-vector from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Certificate_3938014.svg b/images/icons/noun_Certificate_3938014.svg deleted file mode 100644 index b89752a8..00000000 --- a/images/icons/noun_Certificate_3938014.svg +++ /dev/null @@ -1 +0,0 @@ -Created by alimasykurm from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Cloud_3750184.svg b/images/icons/noun_Cloud_3750184.svg deleted file mode 100644 index 9a57dc12..00000000 --- a/images/icons/noun_Cloud_3750184.svg +++ /dev/null @@ -1 +0,0 @@ -Artboard 26 Created by Debi Alpa Nugraha from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_GPS_3949932.svg b/images/icons/noun_GPS_3949932.svg deleted file mode 100644 index 931adaa9..00000000 --- a/images/icons/noun_GPS_3949932.svg +++ /dev/null @@ -1 +0,0 @@ -1 Created by mim studio from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Idea_1422926.svg b/images/icons/noun_Idea_1422926.svg deleted file mode 100644 index 8b3e0b10..00000000 --- a/images/icons/noun_Idea_1422926.svg +++ /dev/null @@ -1 +0,0 @@ -Bulb-idea-light-light bulb Created by Pause08 from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_IoT_2696195.svg b/images/icons/noun_IoT_2696195.svg deleted file mode 100644 index ab61a8a4..00000000 --- a/images/icons/noun_IoT_2696195.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Adrien Coquet from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_LED_705744.svg b/images/icons/noun_LED_705744.svg deleted file mode 100644 index 30c8cfb1..00000000 --- a/images/icons/noun_LED_705744.svg +++ /dev/null @@ -1 +0,0 @@ -Created by abderraouf omara from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Microcontroller_2426329.svg b/images/icons/noun_Microcontroller_2426329.svg deleted file mode 100644 index 430eb7e8..00000000 --- a/images/icons/noun_Microcontroller_2426329.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Template from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Plant Cell_3424143.svg b/images/icons/noun_Plant Cell_3424143.svg deleted file mode 100644 index 1ff3f2b6..00000000 --- a/images/icons/noun_Plant Cell_3424143.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Léa Lortal from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Plant_2743061.svg b/images/icons/noun_Plant_2743061.svg deleted file mode 100644 index 6f06ce08..00000000 --- a/images/icons/noun_Plant_2743061.svg +++ /dev/null @@ -1 +0,0 @@ -plant Created by Alex Muravev from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Satellite_3880466.svg b/images/icons/noun_Satellite_3880466.svg deleted file mode 100644 index 446139d6..00000000 --- a/images/icons/noun_Satellite_3880466.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Noura Mbarki from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Temperature_1979336.svg b/images/icons/noun_Temperature_1979336.svg deleted file mode 100644 index 515db33c..00000000 --- a/images/icons/noun_Temperature_1979336.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Vectors Market from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_Watering Can_1091274.svg b/images/icons/noun_Watering Can_1091274.svg deleted file mode 100644 index 86ad5f5d..00000000 --- a/images/icons/noun_Watering Can_1091274.svg +++ /dev/null @@ -1,4 +0,0 @@ -Created by Daria Moskvina from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_broadcast_3541650.svg b/images/icons/noun_broadcast_3541650.svg deleted file mode 100644 index b8b2e8cd..00000000 --- a/images/icons/noun_broadcast_3541650.svg +++ /dev/null @@ -1 +0,0 @@ -Created by RomStu from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_chip_3661040.svg b/images/icons/noun_chip_3661040.svg deleted file mode 100644 index bc86ceed..00000000 --- a/images/icons/noun_chip_3661040.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Astatine Lab from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_database_2231958.svg b/images/icons/noun_database_2231958.svg deleted file mode 100644 index 11bff791..00000000 --- a/images/icons/noun_database_2231958.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Icons Bazaar from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_dial_311733.svg b/images/icons/noun_dial_311733.svg deleted file mode 100644 index a0d0ac3d..00000000 --- a/images/icons/noun_dial_311733.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Jamie Dickinson from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_heater_3462184.svg b/images/icons/noun_heater_3462184.svg deleted file mode 100644 index df7b6131..00000000 --- a/images/icons/noun_heater_3462184.svg +++ /dev/null @@ -1 +0,0 @@ -100. Heater Created by Pascal Heß from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_ldr_3160537.svg b/images/icons/noun_ldr_3160537.svg deleted file mode 100644 index 9cef5815..00000000 --- a/images/icons/noun_ldr_3160537.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Eucalyp from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_mobile phone_3743108.svg b/images/icons/noun_mobile phone_3743108.svg deleted file mode 100644 index af4894ab..00000000 --- a/images/icons/noun_mobile phone_3743108.svg +++ /dev/null @@ -1,5 +0,0 @@ -Created by Alice-vector from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_motor_1601857.svg b/images/icons/noun_motor_1601857.svg deleted file mode 100644 index 954d5cdf..00000000 --- a/images/icons/noun_motor_1601857.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Bakunetsu Kaito from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_probe_1368437.svg b/images/icons/noun_probe_1368437.svg deleted file mode 100644 index 87069026..00000000 --- a/images/icons/noun_probe_1368437.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Adnen Kadri from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_ram_1442357.svg b/images/icons/noun_ram_1442357.svg deleted file mode 100644 index a384bfff..00000000 --- a/images/icons/noun_ram_1442357.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Atif Arshad from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_smart sensor_3340135.svg b/images/icons/noun_smart sensor_3340135.svg deleted file mode 100644 index 14b05e41..00000000 --- a/images/icons/noun_smart sensor_3340135.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Andrei Yushchenko from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_tomato_1285672.svg b/images/icons/noun_tomato_1285672.svg deleted file mode 100644 index 54f2cf27..00000000 --- a/images/icons/noun_tomato_1285672.svg +++ /dev/null @@ -1 +0,0 @@ -18 Created by parkjisun from the Noun Project \ No newline at end of file diff --git a/images/icons/noun_weather_1174141.svg b/images/icons/noun_weather_1174141.svg deleted file mode 100644 index 2a5ffff0..00000000 --- a/images/icons/noun_weather_1174141.svg +++ /dev/null @@ -1 +0,0 @@ -Created by Adrien Coquet from the Noun Project \ No newline at end of file diff --git a/images/iot-reference-architecture-azure.png b/images/iot-reference-architecture-azure.png new file mode 100644 index 00000000..14aee8da Binary files /dev/null and b/images/iot-reference-architecture-azure.png differ diff --git a/images/iot-reference-architecture-fruit-quality.png b/images/iot-reference-architecture-fruit-quality.png index 865f06f8..284ee435 100644 Binary files a/images/iot-reference-architecture-fruit-quality.png and b/images/iot-reference-architecture-fruit-quality.png differ diff --git a/images/iot-reference-architecture.png b/images/iot-reference-architecture.png index 5615b63e..bbdb91e7 100644 Binary files a/images/iot-reference-architecture.png and b/images/iot-reference-architecture.png differ diff --git a/images/pi-time-of-flight-sensor.png b/images/pi-time-of-flight-sensor.png new file mode 100644 index 00000000..51247310 Binary files /dev/null and b/images/pi-time-of-flight-sensor.png differ diff --git a/images/proximity-sensor.png b/images/proximity-sensor.png new file mode 100644 index 00000000..0b0e5bde Binary files /dev/null and b/images/proximity-sensor.png differ diff --git a/images/time-of-flight-banana.png b/images/time-of-flight-banana.png new file mode 100644 index 00000000..d521f27e Binary files /dev/null and b/images/time-of-flight-banana.png differ diff --git a/images/wio-terminal-c-button.png b/images/wio-terminal-c-button.png new file mode 100644
index 00000000..438f6b84 Binary files /dev/null and b/images/wio-terminal-c-button.png differ diff --git a/images/wio-time-of-flight-sensor.png b/images/wio-time-of-flight-sensor.png new file mode 100644 index 00000000..1080ba2b Binary files /dev/null and b/images/wio-time-of-flight-sensor.png differ diff --git a/sketchnotes/lesson-1.png b/sketchnotes/lesson-1.png new file mode 100644 index 00000000..624bfba1 Binary files /dev/null and b/sketchnotes/lesson-1.png differ diff --git a/translations/.dummy.md b/translations/.dummy.md new file mode 100644 index 00000000..6e7db247 --- /dev/null +++ b/translations/.dummy.md @@ -0,0 +1,9 @@ +# Dummy File + +This file acts as a placeholder for the `translations` folder.
+**Please remove this file after adding the first translation** + +For instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md). + +## THANK YOU
+We truly appreciate your efforts!
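
The hardware.md change above swaps the Grove ultrasonic distance sensor for a VL53L0X time-of-flight sensor, which measures distance from the round-trip time of an infrared laser pulse rather than a sound echo. As an illustration of the principle only (the sensor's own driver reports distance in millimeters directly, so this function and its names are hypothetical, not part of any VL53L0X library), the one-way distance is half the round-trip time multiplied by the speed of light:

```python
# Speed of light, expressed in millimeters per nanosecond
SPEED_OF_LIGHT_MM_PER_NS = 0.299792458

def distance_mm(round_trip_ns: float) -> float:
    """Distance to a target from a time-of-flight round trip.

    The laser pulse travels to the target and back, so the one-way
    distance is half the round trip multiplied by the speed of light.
    """
    return round_trip_ns * SPEED_OF_LIGHT_MM_PER_NS / 2

# A target 100 mm away produces a round trip of roughly 667 ns
print(round(distance_mm(667), 1))
```

The nanosecond-scale timing this implies is why time-of-flight sensors use dedicated hardware rather than general-purpose GPIO timing, and why they are more precise at short range than the ultrasonic sensor they replace here.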