Merge branch 'main' into arabic

Jim Bennett 4 years ago committed by GitHub
commit c820318a41

@@ -21,6 +21,7 @@
"mosquitto",
"photodiode",
"photodiodes",
"quickstart",
"sketchnote" "sketchnote"
] ]
} }

@@ -78,8 +78,6 @@ A single-board computer is a small computing device that has all the elements of
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of a memory chip built into the board.

@@ -4,8 +4,6 @@ The [Raspberry Pi](https://raspberrypi.org) is a single-board computer. You can
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
## Setup
If you are using a Raspberry Pi as your IoT hardware, you have two choices - you can work through all these lessons and code directly on the Pi, or you can connect remotely to a 'headless' Pi and code from your computer.

@@ -94,10 +94,6 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
<div dir="rtl">
The Raspberry Pi is one of the most popular single-board computers.

@@ -79,8 +79,6 @@ The **T** in the word IoT is **Things** - 'things' or
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of a memory chip built into the board.

@@ -81,10 +81,6 @@ The **T** in IoT means **Things** - such
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of a memory chip built into the board.

@@ -78,8 +78,6 @@ A single-board computer is a small computing device that has all the elem
![Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of a memory chip built into the board.

@@ -78,8 +78,6 @@ The **T** in IoT stands for **Things** - devices that can interact with the physical
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of a memory chip built into the board.

@@ -4,8 +4,6 @@
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi - Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
## Setup
If you are using a Raspberry Pi as your IoT hardware, you have two choices - you can work through all these lessons and code directly on the Pi, or you can connect remotely to a 'headless' Pi and code from your computer.

@@ -26,16 +26,12 @@ The two components of an IoT application are the *Internet* and the *thing*. Let
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but consuming sometimes so little power they can run for weeks, months or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example of this is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the temperature detected is outside the desired range. The temperature sensor detects that the room is too cold and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There is a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it, and actuators to interact with the world - for example using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
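The control rule in the thermostat example boils down to a simple read-and-compare loop. Here is a minimal sketch of that logic; `read_temperature` and `set_heating` are hypothetical placeholders for whatever sensor and actuator interface a real device provides.

```python
# Minimal sketch of the thermostat rule described above.
# read_temperature() and set_heating() are hypothetical placeholders for the
# real sensor and actuator interfaces on a device.
DESIRED_TEMPERATURE = 20.0  # target set by the dial or touchscreen
TOLERANCE = 0.5             # allowed drift either side of the target

def control_heating(read_temperature, set_heating):
    current = read_temperature()
    if current < DESIRED_TEMPERATURE - TOLERANCE:
        set_heating(True)    # room too cold - the actuator turns the heating on
    elif current > DESIRED_TEMPERATURE + TOLERANCE:
        set_heating(False)   # warm enough - heating off
```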
@@ -52,14 +48,10 @@ With the example of a smart thermostat, the thermostat would connect using home
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices such as occupancy sensors that detect what rooms are in use, as well as data such as weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar you are on vacation, or turn off the heating on a room by room basis depending on what rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other data could help make an Internet connected thermostat smarter?
### IoT on the Edge
@@ -96,8 +88,6 @@ The faster the clock cycle, the more instructions that can be processed each sec
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal for example has a CPU that runs at 120MHz or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple gigahertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio Terminal.
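As a rough worked example, here is the comparison for a hypothetical 3GHz PC against the Wio Terminal's 120MHz; swap in your own machine's clock speed.

```python
wio_terminal_hz = 120_000_000   # 120MHz, from the Wio Terminal spec above
pc_hz = 3_000_000_000           # assumed 3GHz PC - replace with your own clock speed

# A single 3GHz core ticks 25 times for every tick of the Wio Terminal's CPU
print(pc_hz / wio_terminal_hz)  # 25.0
```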
@@ -212,8 +202,6 @@ The [Raspberry Pi Foundation](https://www.raspberrypi.org) is a charity from the
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz, 2, 4, or 8GB of RAM, gigabit ethernet, WiFi, 2 HDMI ports supporting 4k screens, an audio and composite video output port, USB ports (2 USB 2.0, 2 USB 3.0), 40 GPIO pins, a camera connector for a Raspberry Pi camera module, and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm and is powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also a Pi400 all-in-one computer with a Pi4 built into a keyboard.

@@ -32,16 +32,12 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but consuming sometimes so little power they can run for weeks, months or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example of this is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the temperature detected is outside the desired range. The temperature sensor detects that the room is too cold and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There is a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it, and actuators to interact with the world - for example using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
@@ -59,15 +55,11 @@
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices such as occupancy sensors that detect what rooms are in use, as well as data such as weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar you are on vacation, or turn off the heating on a room by room basis depending on what rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other data could help make an Internet connected thermostat smarter?
@@ -114,8 +106,6 @@
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal for example has a CPU that runs at 120MHz or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple gigahertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio Terminal.
@@ -237,8 +227,6 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz, 2, 4, or 8GB of RAM, gigabit ethernet, WiFi, 2 HDMI ports supporting 4k screens, an audio and composite video output port, USB ports (2 USB 2.0, 2 USB 3.0), 40 GPIO pins, a camera connector for a Raspberry Pi camera module, and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm and is powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also a Pi400 all-in-one computer with a Pi4 built into a keyboard.

@@ -1,7 +1,5 @@
# A deeper dive into IoT
![Embed a video here if available](video-url)
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/3)
@@ -24,16 +22,12 @@
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but consuming sometimes so little power they can run for weeks, months or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example of this is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the temperature detected is outside the desired range. The temperature sensor detects that the room is too cold and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There is a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it, and actuators to interact with the world - for example using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
@@ -50,14 +44,10 @@
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices such as occupancy sensors that detect what rooms are in use, as well as data such as weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar you are on vacation, or turn off the heating on a room by room basis depending on what rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other data could help make an Internet connected thermostat smarter?
### IoT on the Edge
@@ -91,8 +81,6 @@
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal for example has a CPU that runs at 120MHz or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple gigahertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio Terminal.
@@ -207,8 +195,6 @@ Let's review the Wio Terminal.
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz, 2, 4, or 8GB of RAM, gigabit ethernet, WiFi, 2 HDMI ports supporting 4k screens, an audio and composite video output port, USB ports (2 USB 2.0, 2 USB 3.0), 40 GPIO pins, a camera connector for a Raspberry Pi camera module, and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm and is powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also a Pi400 all-in-one computer with a Pi4 built into a keyboard.

@@ -60,8 +60,6 @@ One example of this is a potentiometer. This is a dial that you can rotate betwe
![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png)
***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.
> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
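As a concrete sketch of reading an analog value: a Raspberry Pi's GPIO pins cannot measure analog voltages themselves, so a common approach is an external ADC chip such as an MCP3008, which gpiozero supports. This assumes the potentiometer's wiper is wired to channel 0 of that chip; the Wio Terminal and virtual device guides use their own APIs.

```python
from gpiozero import MCP3008   # helper class for the MCP3008 ADC chip
from time import sleep

# Assumes the potentiometer's middle (wiper) pin is wired to channel 0 of an
# MCP3008 ADC, since the Pi's GPIO pins cannot read analog voltages directly.
pot = MCP3008(channel=0)

while True:
    # .value is the fraction of the reference voltage measured, from 0.0 to 1.0
    print(f"Potentiometer: {pot.value:.2f}")
    sleep(1)
```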
@@ -88,8 +86,6 @@ The simplest digital sensor is a button or switch. This is a sensor with two sta
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png)
***A button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices such as GPIO pins can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, it can only be 1 or 0.
> 💁 Voltages are never exact especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8V as a 1, below 1.8V as 0.
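A minimal sketch of reading such a digital signal on a Raspberry Pi with gpiozero, assuming a button wired to GPIO pin 17:

```python
from gpiozero import Button   # reads the GPIO pin as a digital 0 or 1
from signal import pause

button = Button(17)           # assumes the button is wired to GPIO pin 17

# The pin only ever reads pressed (1) or released (0) - no conversion needed
button.when_pressed = lambda: print("Button pressed")
button.when_released = lambda: print("Button released")

pause()                       # keep the program running and waiting for events
```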
@@ -101,8 +97,6 @@ More advanced digital sensors read analog values, then convert them using on-boa
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera. This is a sensor that captures an image and sends it as digital data containing that image, usually in a compressed format such as JPEG, to be read by the IoT device. It can even stream video by capturing images and sending either the complete image frame by frame or a compressed video stream.
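For example, capturing a frame from a Raspberry Pi camera module as compressed JPEG data might look like this sketch using the picamera library; other cameras and boards have their own APIs.

```python
from picamera import PiCamera   # library for the Raspberry Pi camera module
from time import sleep

camera = PiCamera()
camera.resolution = (640, 480)  # a small frame keeps the JPEG data compact
sleep(2)                        # give the sensor a moment to adjust exposure

# The image arrives at the IoT device as compressed digital data (a JPEG file)
camera.capture('image.jpg', format='jpeg')
```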
## What are actuators?
@@ -125,8 +119,6 @@ Follow the relevant guide below to add an actuator to your IoT device, controlle
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
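The flow in the chart is a read-check-act loop. As a rough sketch of the Raspberry Pi path, assuming a light sensor on channel 0 of an MCP3008 ADC and an LED on GPIO pin 17 (the guides above cover the exact wiring and libraries for each device):

```python
from gpiozero import MCP3008, LED   # assumed wiring: light sensor on ADC channel 0, LED on GPIO 17
from time import sleep

light_sensor = MCP3008(channel=0)
led = LED(17)

while True:
    level = light_sensor.value      # 0.0 (dark) to 1.0 (bright)
    if level < 0.3:                 # threshold is arbitrary - tune it for your room
        led.on()                    # too dark, so turn the LED on
    else:
        led.off()
    sleep(1)
```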
@@ -143,8 +135,6 @@ One example is a dimmable light, such as the ones you might have in your house.
![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Like with sensors, the actual IoT device works on digital signals, not analog. This means to send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly, or on a connector board. This will convert the 0s and 1s from the IoT device to an analog voltage that the actuator can use.
✅ What do you think happens if the IoT device sends a higher voltage than the actuator can handle? ⛔️ DO NOT test this out.
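Hobbyist boards rarely have a true DAC, so a variable output is often approximated with PWM, which is covered next. As a sketch, gpiozero's PWMLED sets a fractional brightness for an LED assumed to be on GPIO pin 17:

```python
from gpiozero import PWMLED   # approximates a variable output using PWM (see next section)
from time import sleep

led = PWMLED(17)              # assumes a dimmable LED wired to GPIO pin 17

# Sweep the brightness from fully off to fully on in 10% steps
for step in range(11):
    led.value = step / 10     # 0.0 = off, 1.0 = full brightness
    sleep(0.5)
```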
@@ -159,8 +149,6 @@ Imagine you are controlling a motor with a 5V supply. You send a short pulse to
![Pulse width modulation rotation of a motor at 150 RPM](../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means in one second you have 25 5V pulses of 0.02s that rotate the motor, each followed by 0.02s pause of 0V not rotating the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 ([revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute), a non-standard measure of rotational velocity).
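The arithmetic in that paragraph can be checked directly; this sketch just reproduces the numbers from the example (5V pulses, one tenth of a rotation per pulse):

```python
on_time = 0.02            # seconds the pulse is at 5V
off_time = 0.02           # seconds the signal is at 0V
rotation_per_pulse = 0.1  # each pulse turns the motor one tenth of a rotation

pulses_per_second = 1 / (on_time + off_time)                   # 25 pulses
rotations_per_second = pulses_per_second * rotation_per_pulse  # 2.5 rotations
print(rotations_per_second * 60)                               # 150 RPM

# Halving the on pulse to 0.01s (so each pulse only turns the motor one
# twentieth of a rotation) gives the 75 RPM case described below.
```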
```output
@@ -172,8 +160,6 @@ This means in one second you have 25 5V pulses of 0.02s that rotate the motor, e
![Pulse width modulation rotation of a motor at 75 RPM](../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
You can change the motor speed by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04s, with the on pulse halved to 0.01s, and the off pulse increased to 0.03s. You have the same number of pulses per second (25), but each on pulse is half the length. A half length pulse only turns the motor one twentieth of a rotation, and at 25 pulses a second will complete 1.25 rotations per second or 75rpm. By changing the pulse width of a digital signal you've halved the speed of an analog motor.
```output
@@ -195,8 +181,6 @@ One simple digital actuator is an LED. When a device sends a digital signal of 1
![A LED is off at 0 volts and on at 5V](../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other simple 2-state actuators can you think of? One example is a solenoid, which is an electromagnet that can be activated to do things like move a door bolt locking/unlocking a door.
More advanced digital actuators, such as screens, require the digital data to be sent in certain formats. They usually come with libraries that make it easier to send the correct data to control them.
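A two-state actuator like an LED needs nothing more than a digital 1 or 0 on a pin. A minimal gpiozero sketch, assuming an LED (with a suitable resistor) wired to GPIO pin 17:

```python
from gpiozero import LED
from time import sleep

led = LED(17)        # assumes an LED and resistor wired to GPIO pin 17

while True:
    led.on()         # digital 1 - voltage on the pin, LED lit
    sleep(1)
    led.off()        # digital 0 - no voltage, LED off
    sleep(1)
```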

@@ -62,8 +62,6 @@
![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../../images/potentiometer.png)
***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to <a href="https://wikipedia.org/wiki/Up_to_eleven">11</a>, such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.
> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the <a href="https://wikipedia.org/wiki/Potentiometer">potentiometer Wikipedia page</a>
@@ -89,8 +87,6 @@
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../../images/button.png)
***A button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices such as GPIO pins can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, it can only be 1 or 0.
> 💁 Voltages are never exact especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8V as a 1, below 1.8V as 0.
@@ -102,8 +98,6 @@
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera. This is a sensor that captures an image and sends it as digital data containing that image, usually in a compressed format such as JPEG, to be read by the IoT device. It can even stream video by capturing images and sending either the complete image frame by frame or a compressed video stream.
## What are actuators?
@@ -126,8 +120,6 @@
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../../images/assignment-1-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
@@ -144,8 +136,6 @@
![A light dimmed at a low voltage and brighter at a higher voltage](../../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
As with sensors, the actual IoT device works on digital signals, not analog. This means that to send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly or on a connector board. This converts the 0s and 1s from the IoT device into an analog voltage that the actuator can use.
✅ What do you think would happen if the IoT device sent a higher voltage than the actuator can handle? ⛔️ Don't test this.
@ -160,8 +150,6 @@
![Pulse width modulation rotation of a motor at 150 RPM](../../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means in one second you have 25 pulses of 5V lasting 0.02s that rotate the motor, each followed by a 0.02s pause at 0V that does not rotate the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You have used a digital signal to rotate the motor at 2.5 rotations per second, or 150 <a href="https://wikipedia.org/wiki/Revolutions_per_minute">revolutions per minute</a> (a non-standard measure of rotational speed).
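A sketch of generating such a pulsed signal from Python with the gpiozero library (the pin number and the duty-cycle-to-speed mapping are assumptions):

```python
# A minimal PWM sketch using gpiozero (the GPIO pin number is an assumption).
# 25 pulses per second at a 50% duty cycle gives 0.02s on / 0.02s off,
# matching the 150 RPM example above; 25% duty gives 0.01s on / 0.03s off.
from gpiozero import PWMOutputDevice
from time import sleep

motor = PWMOutputDevice(17, frequency=25)  # hypothetical GPIO pin 17, 25 pulses per second

motor.value = 0.5   # 50% duty cycle - roughly 150 RPM for the motor described here
sleep(5)
motor.value = 0.25  # 25% duty cycle - roughly half the speed
sleep(5)
motor.off()
```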
```output ```output
@ -172,8 +160,6 @@
![Pulse width modulation rotation of a motor at 75 RPM](../../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
You can change the speed of the motor by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04s, with the on pulse halved to 0.01s and the off pulse increased to 0.03s. You have the same number of pulses per second (25), but each pulse is half the length. A half-length pulse only rotates the motor one twentieth of a rotation, and at 25 pulses per second it completes 1.25 rotations per second, or 75 RPM. By changing the pulse length of a digital signal, you have halved the speed of the analog motor.
```output ```output
@ -195,8 +181,6 @@
![An LED is off at 0 volts and on at 5V](../../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other simple two-state actuators can you think of? One example is a solenoid, an electromagnet that can be activated to do things like move a door bolt to lock or unlock a door.
More advanced digital actuators, such as screens, require digital data to be sent in specific formats. They usually come with libraries that make it easier to send the correct data to control them.

@ -57,8 +57,6 @@
![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png) ![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png)
***Potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
The IoT device sends an electrical signal to the potentiometer at a set voltage, such as 5V. As the potentiometer is adjusted, it changes the voltage that comes out of the other side. Imagine a potentiometer labelled from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven) as a dial, like a volume knob. When the potentiometer is in the fully off position (0) it returns 0V (0 volts), and when it is in the fully on position (11) it returns 5V (5 volts).
> 🎓 This is a greatly simplified explanation. The [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer) has a detailed explanation of potentiometers and variable resistors.
@ -85,8 +83,6 @@
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png) ![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png)
***Button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices, such as GPIO pins, can measure this signal directly as a 0 or 1. If the sent and received voltages are the same, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, as the value can only be a 1 or a 0.
> 💁 Voltages never match exactly, especially since the components in a sensor have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work at 3.3V, and treat a return signal above 1.8V as a 1 and below 1.8V as a 0.
@ -98,8 +94,6 @@
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png) ![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera: a sensor that captures an image and sends it as digital data, usually in a compressed format such as JPEG, for the IoT device to read. By capturing images, the camera can even stream video, either by sending the complete image frame by frame or by sending a compressed video stream.
## What are actuators?
@ -122,8 +116,6 @@
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED begin controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md) * [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md) * [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md) * [Single-board computer - Virtual device](virtual-device-actuator.md)
@ -138,8 +130,6 @@
![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png) ![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
As with sensors, real IoT devices work on digital signals, not analog. To send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly or on a connector board. This converts the 0s and 1s from the IoT device into an analog voltage that the actuator can use.
✅ What do you think would happen if the IoT device sent a voltage higher than the actuator can handle? ⛔️ Don't actually test this.
@ -152,8 +142,6 @@
![Pulse width modulation rotation of a motor at 150 RPM](../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means in one second there are 25 pulses, each with 0.02 seconds of 5V that rotates the motor followed by 0.02 seconds of 0V that pauses it. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. This digital signal has rotated the motor at 2.5 rotations per second, or 150 [revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute) (RPM).
```output ```output
@ -165,8 +153,6 @@
![Pulse width modulation rotation of a motor at 75 RPM](../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
You can change the speed of the motor by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04 seconds, with the ON pulse halved to 0.01 seconds and the OFF pulse increased to 0.03 seconds. The number of pulses per second stays the same (25), but each ON pulse is now half the length. A half-length pulse only rotates the motor one twentieth of a rotation, and at 25 pulses per second it completes 1.25 rotations per second, or 75 RPM. By changing the pulse length of a digital signal, the speed of the analog motor has been halved.
```output ```output
@ -188,8 +174,6 @@
![An LED is off at 0 volts and on at 5V](../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other two-state actuators can you see around you? One example is a solenoid, an electromagnet that can control a door bolt to lock and unlock a door.
More advanced digital actuators, such as screens, require digital data to be sent in specific formats. They usually come with program libraries that make it easier to send the correct data to control them.

@ -31,8 +31,6 @@ There are a number of popular communication protocols used by IoT devices to com
![IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices.](../../../images/pub-sub.png) ![IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices.](../../../images/pub-sub.png)
***IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices. Broadcast by RomStu / Microcontroller by Template / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
MQTT is the most popular communication protocol for IoT devices and is covered in this lesson. Other protocols include AMQP and HTTP/HTTPS.
## Message Queueing Telemetry Transport (MQTT) ## Message Queueing Telemetry Transport (MQTT)
@ -43,8 +41,6 @@ MQTT has a single broker and multiple clients. All clients connect to the broker
![IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic](../../../images/mqtt.png) ![IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic](../../../images/mqtt.png)
***IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic. Microcontroller by Template / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
✅ Do some research. If you have a lot of IoT devices, how can you ensure your MQTT broker can handle all the messages? ✅ Do some research. If you have a lot of IoT devices, how can you ensure your MQTT broker can handle all the messages?
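A sketch of a client that publishes telemetry and listens for commands using the paho-mqtt library (1.x API) and the public Mosquitto test broker; the client id, topic names, and payload values are assumptions:

```python
# A minimal MQTT sketch using paho-mqtt (client id, topics and payload are assumptions).
import json
import time

import paho.mqtt.client as mqtt

client_name = "soil-moisture-sensor-demo"   # hypothetical client id
client = mqtt.Client(client_name)
client.connect("test.mosquitto.org")        # public test broker, no authentication
client.loop_start()

def handle_command(client, userdata, message):
    # Called whenever a message arrives on a subscribed topic
    print("Command received:", message.payload.decode())

client.on_message = handle_command
client.subscribe(client_name + "/commands")

while True:
    telemetry = json.dumps({"soil_moisture": 450})   # illustrative reading
    client.publish(client_name + "/telemetry", telemetry)
    time.sleep(10)
```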
### Connect your IoT device to MQTT ### Connect your IoT device to MQTT
@ -67,8 +63,6 @@ Rather than dealing with the complexities of setting up an MQTT broker as part o
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-internet-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED begin controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
Follow the relevant step below to connect your device to the MQTT broker: Follow the relevant step below to connect your device to the MQTT broker:
* [Arduino - Wio Terminal](wio-terminal-mqtt.md) * [Arduino - Wio Terminal](wio-terminal-mqtt.md)
@ -106,8 +100,6 @@ Let's look back at the example of the smart thermostat from Lesson 1.
![An Internet connected thermostat using multiple room sensors](../../../images/telemetry.png) ![An Internet connected thermostat using multiple room sensors](../../../images/telemetry.png)
***An Internet connected thermostat using multiple room sensors. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The thermostat has temperature sensors to gather telemetry. It would most likely have one temperature sensor built in, and it might connect to multiple external temperature sensors over a wireless protocol such as [Bluetooth Low Energy](https://wikipedia.org/wiki/Bluetooth_Low_Energy) (BLE). The thermostat has temperature sensors to gather telemetry. It would most likely have one temperature sensor built in, and it might connect to multiple external temperature sensors over a wireless protocol such as [Bluetooth Low Energy](https://wikipedia.org/wiki/Bluetooth_Low_Energy) (BLE).
An example of the telemetry data it would send could be: An example of the telemetry data it would send could be:
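For instance, a single serialized reading might look like this sketch (the field name and value are illustrative assumptions):

```python
# An illustrative telemetry payload for the thermostat example (field name and value are assumptions).
import json

telemetry = {"temperature": 20.4}   # reading from the built-in sensor, in °C

print(json.dumps(telemetry))        # {"temperature": 20.4}
```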
@ -353,8 +345,6 @@ Commands are messages sent by the cloud to a device, instructing it to do someth
![An Internet connected thermostat receiving a command to turn on the heating](../../../images/commands.png) ![An Internet connected thermostat receiving a command to turn on the heating](../../../images/commands.png)
***An Internet connected thermostat receiving a command to turn on the heating. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
A thermostat could receive a command from the cloud to turn the heating on. If, based on the telemetry data from all the sensors, the cloud service decides that the heating should be on, it sends the relevant command.
### Send commands to the MQTT broker ### Send commands to the MQTT broker

@ -4,9 +4,7 @@ As the population grows, so does the demand on agriculture. The amount of land a
In these 6 lessons you'll learn how to apply the Internet of Things to improve and automate farming. In these 6 lessons you'll learn how to apply the Internet of Things to improve and automate farming.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you follow the [Clean up your project](lessons/6-keep-your-plant-secure/README.md#clean-up-your-project) step in [lesson 6](lessons/6-keep-your-plant-secure/README.md). > 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
**Add video of automated plant**
## Topics ## Topics

@ -1,8 +1,8 @@
# Predict plant growth with IoT # Predict plant growth with IoT
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-5.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -134,8 +134,6 @@ By gathering temperature data using an IoT device, a farmer can automatically be
![Telemetry data is sent to a server and then saved to a database](../../../images/save-telemetry-database.png) ![Telemetry data is sent to a server and then saved to a database](../../../images/save-telemetry-database.png)
***Telemetry data is sent to a server and then saved to a database. database by Icons Bazaar - from the [Noun Project](https://thenounproject.com)***
The server code can also augment the data by adding extra information. For example, the IoT device can publish an identifier to indicate which device it is, and the server code can use this to look up the location of the device and what crops it is monitoring. It can also add basic data like the current time, as some IoT devices don't have the necessary hardware to keep track of an accurate time, or require additional code to read the current time over the Internet.
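A sketch of that augmentation, assuming telemetry arrives as JSON and is saved to a local SQLite database (the schema, device lookup, and field names are assumptions):

```python
# A sketch of server code augmenting telemetry before saving it (schema and names are assumptions).
import json
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect("telemetry.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (device_id TEXT, field TEXT, temperature REAL, recorded_at TEXT)")

# Hypothetical lookup table mapping device identifiers to the field they monitor
DEVICE_LOCATIONS = {"soil-sensor-1": "north field"}

def handle_telemetry(payload: str, device_id: str):
    data = json.loads(payload)
    db.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        (
            device_id,
            DEVICE_LOCATIONS.get(device_id, "unknown"),   # augment with the device's location
            data["temperature"],
            datetime.now(timezone.utc).isoformat(),       # augment with a server-side timestamp
        ),
    )
    db.commit()

handle_telemetry('{"temperature": 22.5}', "soil-sensor-1")
```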
✅ Why do you think different fields might have different temperatures? ✅ Why do you think different fields might have different temperatures?

@ -1,8 +1,8 @@
# Detect soil moisture # Detect soil moisture
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-6.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -33,8 +33,6 @@ Plants require water to grow. They absorb water throughout the entire plant, wit
![Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure](../../../images/transpiration.png) ![Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure](../../../images/transpiration.png)
***Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure. Plant by Alex Muravev / Plant Cell by Léa Lortal - all from the [Noun Project](https://thenounproject.com)***
✅ Do some research: how much water is lost through transpiration? ✅ Do some research: how much water is lost through transpiration?
The root system provides water from moisture in the soil where the plant grows. Too little water in the soil and the plant cannot absorb enough to grow, too much water and roots cannot absorb enough oxygen needed to function. This leads to roots dying and the plant unable to get enough nutrients to survive. The root system provides water from moisture in the soil where the plant grows. Too little water in the soil and the plant cannot absorb enough to grow, too much water and roots cannot absorb enough oxygen needed to function. This leads to roots dying and the plant unable to get enough nutrients to survive.
@ -83,14 +81,10 @@ You can use GPIO pins directly with some digital sensors and actuators when you
![A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1](../../../images/button-with-digital.png) ![A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1](../../../images/button-with-digital.png)
***A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
* LED. You can connect an LED between an output pin and a ground pin (using a resistor, otherwise you'll burn out the LED). From code you can set the output pin to high and it will send 3.3V, making a circuit from the 3.3V pin, through the LED, to the ground pin. This will light the LED (a code sketch follows below).
![An LED is sent a signal of 1 (3.3V), which lights the LED. If it is sent 0 (0V), the LED is not lit.](../../../images/led-digital-control.png)
***An LED is sent a signal of 0 (3.3V), which lights the LED. If it is sent 0 (0v), the LED is not lit. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
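For instance, driving an LED like this from Python with the gpiozero library might look like the following sketch (the pin number is an assumption):

```python
# A minimal sketch of lighting an LED from a GPIO output pin (pin number is an assumption).
# Remember to wire a current-limiting resistor in series with the LED.
from gpiozero import LED
from time import sleep

led = LED(17)   # hypothetical GPIO pin 17

led.on()    # sets the pin high (3.3V), lighting the LED
sleep(2)
led.off()   # sets the pin low (0V), turning the LED off
```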
For more advanced sensors, you can use GPIO pins to send and receive digital data directly with digital sensors and actuators, or via controller boards with ADCs and DACs to talk to analog sensors and actuators. For more advanced sensors, you can use GPIO pins to send and receive digital data directly with digital sensors and actuators, or via controller boards with ADCs and DACs to talk to analog sensors and actuators.
> 💁 If you are using a Raspberry Pi for these labs, the Grove Base Hat has hardware to convert analog sensor signals to digital to send over GPIO.
@ -105,8 +99,6 @@ For example, on a 3.3V board, if the sensor returns 3.3V, the value returned wou
![A soil moisture sensor sent 3.3V and returning 1.65v, or a reading of 511](../../../images/analog-sensor-voltage.png) ![A soil moisture sensor sent 3.3V and returning 1.65v, or a reading of 511](../../../images/analog-sensor-voltage.png)
***A soil moisture sensor sent 3.3V and returning 1.65v, or a reading of 511. probe by Adnen Kadri / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
> 💁 Back in nightlight - lesson 3, the light sensor returned a value from 0-1,023. If you are using a Wio Terminal, the sensor was connected to an analog pin. If you are using a Raspberry Pi, then the sensor was connected to an analog pin on the base hat that has an integrated ADC to communicate over the GPIO pins. The virtual device was set to send a value from 0-1,023 to simulate an analog pin. > 💁 Back in nightlight - lesson 3, the light sensor returned a value from 0-1,023. If you are using a Wio Terminal, the sensor was connected to an analog pin. If you are using a Raspberry Pi, then the sensor was connected to an analog pin on the base hat that has an integrated ADC to communicate over the GPIO pins. The virtual device was set to send a value from 0-1,023 to simulate an analog pin.
Soil moisture sensors rely on voltages, so they will use analog pins and give values from 0-1,023.
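On a Raspberry Pi with a Grove Base Hat, such an analog reading can be taken from Python with the grove.py library; a sketch, assuming the soil moisture sensor is plugged into analog port A0:

```python
# A sketch of reading an analog soil moisture value via the Grove Base Hat ADC.
# Assumes the sensor is on analog port A0; the port number is an assumption.
import time
from grove.adc import ADC

adc = ADC()

while True:
    soil_moisture = adc.read(0)   # returns a value from 0-1,023
    print("Soil moisture:", soil_moisture)
    time.sleep(10)
```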
@ -130,8 +122,6 @@ I<sup>2</sup>C has a bus made of 2 main wires, along with 2 power wires:
![I2C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire](../../../images/i2c.png) ![I2C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire](../../../images/i2c.png)
***I<sup>2</sup>C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire. Microcontroller by Template / LED by abderraouf omara / ldr by Eucalyp - all from the [Noun Project](https://thenounproject.com)***
To send data, one device will issue a start condition to show it is ready to send data. It will then become the controller. The controller then sends the address of the device that it wants to communicate with, along with whether it wants to read or write data. After the data has been transmitted, the controller sends a stop condition to indicate that it has finished. After this, another device can become the controller and send or receive data.
I<sup>2</sup>C has speed limits, with 3 different modes running at fixed speeds. The fastest is High Speed mode with a maximum speed of 3.4Mbps (megabits per second), though very few devices support that speed. The Raspberry Pi for example, is limited to fast mode at 400Kbps (kilobits per second). Standard mode runs at 100Kbps. I<sup>2</sup>C has speed limits, with 3 different modes running at fixed speeds. The fastest is High Speed mode with a maximum speed of 3.4Mbps (megabits per second), though very few devices support that speed. The Raspberry Pi for example, is limited to fast mode at 400Kbps (kilobits per second). Standard mode runs at 100Kbps.
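From code, an I<sup>2</sup>C read is typically a short transaction against a device address and register; a Python sketch using the smbus2 library (the bus number, address, and register are assumptions):

```python
# A sketch of reading one byte from an I2C peripheral (address and register are assumptions).
from smbus2 import SMBus

I2C_BUS = 1            # bus 1 is the user-accessible I2C bus on a Raspberry Pi
DEVICE_ADDRESS = 0x40  # hypothetical peripheral address
REGISTER = 0x00        # hypothetical register to read

with SMBus(I2C_BUS) as bus:
    value = bus.read_byte_data(DEVICE_ADDRESS, REGISTER)
    print("Read value:", value)
```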
@ -147,8 +137,6 @@ UART involves physical circuitry that allows two devices to communicate. Each de
![UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa](../../../images/uart.png) ![UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa](../../../images/uart.png)
***UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa. chip by Astatine Lab - all from the [Noun Project](https://thenounproject.com)***
> 🎓 The data is sent one bit at a time, and this is known as *serial* communication. Most operating systems and microcontrollers have *serial ports*, that is connections that can send and receive serial data that are available to your code. > 🎓 The data is sent one bit at a time, and this is known as *serial* communication. Most operating systems and microcontrollers have *serial ports*, that is connections that can send and receive serial data that are available to your code.
UART devices have a [baud rate](https://wikipedia.org/wiki/Symbol_rate) (also known as Symbol rate), which is the speed that data will be sent and received in bits per second. A common baud rate is 9,600, meaning 9,600 bits (0s and 1s) of data are sent each second. UART devices have a [baud rate](https://wikipedia.org/wiki/Symbol_rate) (also known as Symbol rate), which is the speed that data will be sent and received in bits per second. A common baud rate is 9,600, meaning 9,600 bits (0s and 1s) of data are sent each second.
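From Python, a serial port can be read with the pySerial library; a sketch, assuming a device attached to `/dev/ttyAMA0` at 9,600 baud:

```python
# A sketch of reading lines from a UART serial port with pySerial (port name is an assumption).
import serial

ser = serial.Serial("/dev/ttyAMA0", baudrate=9600, timeout=1)

while True:
    line = ser.readline()   # reads bytes up to a newline, or until the timeout
    if line:
        print(line.decode(errors="replace").strip())
```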
@ -178,8 +166,6 @@ SPI controllers use 3 wires, along with 1 extra wire per peripheral. Peripherals
![SPI with one controller and two peripherals](../../../images/spi.png)
***SPI with on controller and two peripherals. chip by Astatine Lab - all from the [Noun Project](https://thenounproject.com)***
The CS wire is used to activate one peripheral at a time, communicating over the COPI and CIPO wires. When the controller needs to change peripheral, it deactivates the CS wire connected to the currently active peripheral, then activates the wire connected to the peripheral it wants to communicate with next. The CS wire is used to activate one peripheral at a time, communicating over the COPI and CIPO wires. When the controller needs to change peripheral, it deactivates the CS wire connected to the currently active peripheral, then activates the wire connected to the peripheral it wants to communicate with next.
SPI is *full-duplex*, meaning the controller can send and receive data at the same time from the same peripheral using the COPI and CIPO wires. SPI uses a clock signal on the SCLK wire to keep the devices in sync, so unlike sending directly over UART it doesn't need start and stop bits. SPI is *full-duplex*, meaning the controller can send and receive data at the same time from the same peripheral using the COPI and CIPO wires. SPI uses a clock signal on the SCLK wire to keep the devices in sync, so unlike sending directly over UART it doesn't need start and stop bits.
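A Python sketch of a full-duplex SPI transfer using the spidev library (the bus, chip select, and bytes exchanged are placeholders):

```python
# A sketch of a full-duplex SPI transfer using spidev (bus, device and bytes are assumptions).
import spidev

spi = spidev.SpiDev()
spi.open(0, 0)            # bus 0, chip select 0
spi.max_speed_hz = 500000

# xfer2 sends the given bytes and returns the bytes received at the same time
response = spi.xfer2([0x01, 0x80, 0x00])
print("Received:", response)

spi.close()
```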

@ -1,8 +1,8 @@
# Automated plant watering # Automated plant watering
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-7.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -30,8 +30,6 @@ The solution to this is to have a pump connected to an external power supply, an
![A light switch turns power on to a light](../../../images/light-switch.png) ![A light switch turns power on to a light](../../../images/light-switch.png)
***A light switch turns power on to a light. switch by Chattapat / lightbulb by Maxim Kulikov - all from the [Noun Project](https://thenounproject.com)***
> 🎓 [Mains electricity](https://wikipedia.org/wiki/Mains_electricity) refers to the electricity delivered to homes and businesses through national infrastructure in many parts of the world. > 🎓 [Mains electricity](https://wikipedia.org/wiki/Mains_electricity) refers to the electricity delivered to homes and businesses through national infrastructure in many parts of the world.
✅ IoT devices can usually provide 3.3V or 5V, at less than 1 amp (1A) of current. Compare this to mains electricity which is most often at 230V (120V in North America and 100V in Japan), and can provide power for devices that draw 30A. ✅ IoT devices can usually provide 3.3V or 5V, at less than 1 amp (1A) of current. Compare this to mains electricity which is most often at 230V (120V in North America and 100V in Japan), and can provide power for devices that draw 30A.
@ -46,14 +44,10 @@ A relay is an electromechanical switch that converts an electrical signal into a
![When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit](../../../images/relay-on.png) ![When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit](../../../images/relay-on.png)
***When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit. lightbulb by Maxim Kulikov - from the [Noun Project](https://thenounproject.com)***
In a relay, a control circuit powers the electromagnet. When the electromagnet is on, it pulls a lever that moves a switch, closing a pair of contacts and completing an output circuit. In a relay, a control circuit powers the electromagnet. When the electromagnet is on, it pulls a lever that moves a switch, closing a pair of contacts and completing an output circuit.
![When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit](../../../images/relay-off.png) ![When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit](../../../images/relay-off.png)
***When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit. lightbulb by Maxim Kulikov - from the [Noun Project](https://thenounproject.com)***
When the control circuit is off, the electromagnet turns off, releasing the lever and opening the contacts, turning off the output circuit. Relays are digital actuators - a high signal to the relay turns it on, a low signal turns it off. When the control circuit is off, the electromagnet turns off, releasing the lever and opening the contacts, turning off the output circuit. Relays are digital actuators - a high signal to the relay turns it on, a low signal turns it off.
The output circuit can be used to power additional hardware, like an irrigation system. The IoT device can turn the relay on, completing the output circuit that powers the irrigation system, and plants get watered. The IoT device can then turn the relay off, cutting the power to the irrigation system, turning the water off. The output circuit can be used to power additional hardware, like an irrigation system. The IoT device can turn the relay on, completing the output circuit that powers the irrigation system, and plants get watered. The IoT device can then turn the relay off, cutting the power to the irrigation system, turning the water off.
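A sketch of driving such a relay from Python with the grove.py library, assuming a Grove relay connected to digital port D5:

```python
# A sketch of switching a relay on and off (the port number is an assumption).
import time
from grove.grove_relay import GroveRelay

relay = GroveRelay(5)   # hypothetical digital port D5

relay.on()    # high signal - electromagnet energized, output circuit completed
time.sleep(5)
relay.off()   # low signal - output circuit broken
```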
@ -132,8 +126,6 @@ If you did the last lesson on soil moisture using a physical sensor, you would h
![A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil](../../../images/soil-moisture-travel.png) ![A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil](../../../images/soil-moisture-travel.png)
***A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil. Plant by Alex Muravev / Watering Can by Daria Moskvina - all from the [Noun Project](https://thenounproject.com)***
In the diagram above, a soil moisture reading shows 658. The plant is watered, but this reading doesn't change immediately, as the water has yet to reach the sensor. Watering can even finish before the water reaches the sensor and the value drops to reflect the new moisture level. In the diagram above, a soil moisture reading shows 658. The plant is watered, but this reading doesn't change immediately, as the water has yet to reach the sensor. Watering can even finish before the water reaches the sensor and the value drops to reflect the new moisture level.
If you were writing code to control an irrigation system via a relay based off soil moisture levels, you would need to take this delay into consideration and build smarter timing into your IoT device. If you were writing code to control an irrigation system via a relay based off soil moisture levels, you would need to take this delay into consideration and build smarter timing into your IoT device.
@ -160,8 +152,6 @@ For example, I have a strawberry plant with a soil moisture sensor and a pump co
![Step 1, take measurement. Step 2, add water. Step 3, wait for water to soak through the soil. Step 4, retake measurement](../../../images/soil-moisture-delay.png) ![Step 1, take measurement. Step 2, add water. Step 3, wait for water to soak through the soil. Step 4, retake measurement](../../../images/soil-moisture-delay.png)
***Measure, add water, wait, remeasure. Plant by Alex Muravev / Watering Can by Daria Moskvina - all from the [Noun Project](https://thenounproject.com)***
This means the best process would be a watering cycle that is something like: This means the best process would be a watering cycle that is something like:
* Turn on the pump for 5 seconds * Turn on the pump for 5 seconds
@ -207,10 +197,11 @@ Update your server code to run the relay for 5 seconds, then wait 20 seconds.
1. Open the `app.py` file 1. Open the `app.py` file
1. Add the following code to the `app.py` file below the existing imports: 1. Add the following code to the `app.py` file below the existing imports:
    ```python
    import threading
    ```
This statement imports the `threading` module from the Python standard library. Threading allows Python to execute other code while waiting.
1. Add the following code before the `handle_telemetry` function that handles telemetry messages received by the server code: 1. Add the following code before the `handle_telemetry` function that handles telemetry messages received by the server code:
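    A sketch of the idea, using `threading.Timer` to run the relay for 5 seconds and then ignore soil moisture readings for a further 20 seconds (the function and variable names are assumptions, not the exact listing):

    ```python
    # A sketch of timed relay control using threading.Timer (names are assumptions).
    water_time = 5    # seconds to run the pump
    wait_time = 20    # seconds to let water soak through before trusting readings again

    watering = False  # set while a watering cycle is in progress

    def finish_cycle():
        global watering
        watering = False                                  # readings can trigger watering again

    def stop_watering():
        relay.off()                                       # assumes a relay object defined elsewhere
        threading.Timer(wait_time, finish_cycle).start()

    def start_watering():
        global watering
        watering = True
        relay.on()
        threading.Timer(water_time, stop_watering).start()
    ```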
@ -278,7 +269,7 @@ Update your server code to run the relay for 5 seconds, then wait 20 seconds.
``` ```
A good way to test this in a simulated irrigation system is to use dry soil, then pour water in manually whilst the relay is on, stopping pouring when the relay turns off. A good way to test this in a simulated irrigation system is to use dry soil, then pour water in manually whilst the relay is on, stopping pouring when the relay turns off.
> 💁 You can find this code in the [code-timing](./code-timing) folder. > 💁 You can find this code in the [code-timing](./code-timing) folder.
> 💁 If you want to use a pump to build a real irrigation system, then you can use a [6V water pump](https://www.seeedstudio.com/6V-Mini-Water-Pump-p-1945.html) with a [USB terminal power supply](https://www.adafruit.com/product/3628). Make sure the power to or from the pump is connected via the relay. > 💁 If you want to use a pump to build a real irrigation system, then you can use a [6V water pump](https://www.seeedstudio.com/6V-Mini-Water-Pump-p-1945.html) with a [USB terminal power supply](https://www.adafruit.com/product/3628). Make sure the power to or from the pump is connected via the relay.

@ -1,8 +1,8 @@
# Migrate your plant to the cloud # Migrate your plant to the cloud
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-8.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -106,14 +106,10 @@ IoT devices connect to a cloud service either using a device SDK (a library that
![Devices connect to a service using a device SDK. Server code also connects to the service via an SDK](../../../images/iot-service-connectivity.png) ![Devices connect to a service using a device SDK. Server code also connects to the service via an SDK](../../../images/iot-service-connectivity.png)
***Devices connect to a service using a device SDK. Server code also connects to the service via an SDK. Microcontroller by Template / Cloud by Debi Alpa Nugraha / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
Your device then communicates with other parts of your application over this service - similar to how you sent telemetry and received commands over MQTT. This is usually using a service SDK or a similar library. Messages come from your device to the service where other components of your application can then read them, and messages can then be sent back to your device. Your device then communicates with other parts of your application over this service - similar to how you sent telemetry and received commands over MQTT. This is usually using a service SDK or a similar library. Messages come from your device to the service where other components of your application can then read them, and messages can then be sent back to your device.
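For example, with Azure IoT Hub the Python device SDK handles the connection and messaging; a sketch of sending one telemetry message (the connection string and payload are placeholders):

```python
# A sketch of sending telemetry through a device SDK (connection string is a placeholder).
import json
from azure.iot.device import IoTHubDeviceClient, Message

connection_string = "<device connection string from your IoT service>"

device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
device_client.connect()

telemetry = Message(json.dumps({"soil_moisture": 450}))   # illustrative payload
device_client.send_message(telemetry)

device_client.disconnect()
```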
![Devices without a valid secret key cannot connect to the IoT service](../../../images/iot-service-allowed-denied-connection.png) ![Devices without a valid secret key cannot connect to the IoT service](../../../images/iot-service-allowed-denied-connection.png)
***Devices without a valid secret key cannot connect to the IoT service. Microcontroller by Template / Cloud by Debi Alpa Nugraha / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
These services implement security by knowing about all the devices that can connect and send data, either by having the devices pre-registered with the service, or by giving the devices secret keys or certificates they can use to register themselves with the service the first time they connect. Unknown devices are unable to connect; if they try, the service rejects the connection and ignores messages sent by them.
✅ Do some research: What is the downside of having an open IoT service where any device or code can connect? Can you find specific examples of hackers taking advantage of this? ✅ Do some research: What is the downside of having an open IoT service where any device or code can connect? Can you find specific examples of hackers taking advantage of this?

@ -1,8 +1,8 @@
# Migrate your application logic to the cloud # Migrate your application logic to the cloud
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-9.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -26,14 +26,10 @@ Serverless, or serverless computing, involves creating small blocks of code that
![Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run](../../../images/iot-messages-to-serverless.png) ![Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run](../../../images/iot-messages-to-serverless.png)
***Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run. IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)***
> 💁 If you've used database triggers before, you can think of this as the same thing, code being triggered by an event such as inserting a row. > 💁 If you've used database triggers before, you can think of this as the same thing, code being triggered by an event such as inserting a row.
![When many events are sent at the same time, the serverless service scales up to run them all at the same time](../../../images/serverless-scaling.png) ![When many events are sent at the same time, the serverless service scales up to run them all at the same time](../../../images/serverless-scaling.png)
***When many events are sent at the same time, the serverless service scales up to run them all at the same time. IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)***
Your code is only run when the event happens; there is nothing keeping your code alive at other times. The event happens, your code is loaded and run. This makes serverless very scalable - if many events happen at the same time, the cloud provider can run your function as many times as you need at the same time across whatever servers they have available. The downside to this is that if you need to share information between events, you need to save it somewhere like a database rather than storing it in memory.
Your code is written as a function that takes details about the event as a parameter. You can use a wide range of programming languages to write these serverless functions. Your code is written as a function that takes details about the event as a parameter. You can use a wide range of programming languages to write these serverless functions.
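For example, an Azure Functions function written in Python and triggered by IoT Hub events receives each event as a parameter; a minimal sketch (the binding name comes from the function's configuration and is an assumption here):

```python
# A sketch of a serverless function triggered by IoT events (binding name is an assumption).
import logging
import azure.functions as func

def main(event: func.EventHubEvent):
    # The event parameter carries the details of the message that triggered this run
    body = event.get_body().decode("utf-8")
    logging.info("Telemetry received: %s", body)
```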

@ -1,8 +1,8 @@
# Keep your plant secure # Keep your plant secure
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-10.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -54,14 +54,10 @@ When a device connects to an IoT service, it uses an ID to identify itself. The
![Both valid and malicious devices could use the same ID to send telemetry](../../../images/iot-device-and-hacked-device-connecting.png) ![Both valid and malicious devices could use the same ID to send telemetry](../../../images/iot-device-and-hacked-device-connecting.png)
***Both valid and malicious devices could use the same ID to send telemetry. Microcontroller by Template / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The way round this is to convert the data being sent into a scrambled format, using some kind of value to scramble the data known to the device and the cloud only. This process is called *encryption*, and the value used to encrypt the data is called an *encryption key*. The way round this is to convert the data being sent into a scrambled format, using some kind of value to scramble the data known to the device and the cloud only. This process is called *encryption*, and the value used to encrypt the data is called an *encryption key*.
![If encryption is used, then only encrypted messages will be accepted, others will be rejected](../../../images/iot-device-and-hacked-device-connecting-encryption.png) ![If encryption is used, then only encrypted messages will be accepted, others will be rejected](../../../images/iot-device-and-hacked-device-connecting-encryption.png)
***If encryption is used, then only encrypted messages will be accepted, others will be rejected. Microcontroller by Template / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The cloud service can then convert the data back to a readable format, using a process called *decryption*, using either the same encryption key, or a *decryption key*. If the encrypted message cannot be decrypted by the key, the device has been hacked and the message is rejected. The cloud service can then convert the data back to a readable format, using a process called *decryption*, using either the same encryption key, or a *decryption key*. If the encrypted message cannot be decrypted by the key, the device has been hacked and the message is rejected.
The technique for doing encryption and decryption is called *cryptography*. The technique for doing encryption and decryption is called *cryptography*.
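As an illustration of the idea (not the specific scheme an IoT service uses), the Python `cryptography` package can encrypt and decrypt a message with a shared key:

```python
# An illustration of encrypting and decrypting with a shared key (not the IoT service's scheme).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the encryption key, known only to the device and the cloud
cipher = Fernet(key)

encrypted = cipher.encrypt(b'{"soil_moisture": 450}')
print("Scrambled message:", encrypted)

decrypted = cipher.decrypt(encrypted)   # raises an exception if the key doesn't match
print("Readable again:", decrypted.decode())
```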
@ -164,8 +160,6 @@ When using X.509 certificates, both the sender and the recipient will have their
![Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it.](../../../images/send-message-certificate.png) ![Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it.](../../../images/send-message-certificate.png)
***Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it. Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)***
One big advantage of using X.509 certificates is that they can be shared between devices. You can create one certificate, upload it to IoT Hub, and use this for all your devices. Each device then just needs to know the private key to decrypt the messages it receives from IoT Hub. One big advantage of using X.509 certificates is that they can be shared between devices. You can create one certificate, upload it to IoT Hub, and use this for all your devices. Each device then just needs to know the private key to decrypt the messages it receives from IoT Hub.
The certificate used by your device to encrypt messages it sends to the IoT Hub is published by Microsoft. It is the same certificate that a lot of Azure services use, and is sometimes built into the SDKs.

@ -1,9 +0,0 @@
# Dummy File
This file acts as a placeholder for the `translations` folder. <br>
**Please remove this file after adding the first translation**
For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) .
## THANK YOU
We truly appreciate your efforts!

@ -0,0 +1,20 @@
# IoT in farming
As the population grows, so does the demand on agriculture. The amount of farmland doesn't change much, but the climate does, giving farmers even more problems, especially the 2 billion [subsistence farmers](https://wikipedia.org/wiki/Subsistence_agriculture) whose families depend on the crops they grow for their food. IoT can help farmers in many ways, from deciding which crops to grow and when to start work, to increasing yields, reducing manual labour, and detecting and dealing with pests.
In these 6 lessons you will learn how to apply the Internet of Things to improve and automate farming.
> 💁 These lessons use some cloud resources. Even if you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
## Topics
1. [Predict plant growth with IoT](lessons/1-predict-plant-growth/README.md)
1. [Detect soil moisture](lessons/2-detect-soil-moisture/README.md)
1. [Automated plant watering](lessons/3-automated-plant-watering/README.md)
1. [Control your plant from the cloud](lessons/4-migrate-your-plant-to-the-cloud/README.md)
1. [Control your application from the cloud](lessons/5-migrate-application-to-the-cloud/README.md)
1. [Keep your plant secure](lessons/6-keep-your-plant-secure/README.md)
## Credits
♥️ Every lesson was written with love by [Jim Bennett](https://GitHub.com/JimBobBennett)

@ -1,8 +1,8 @@
# Location tracking # Location tracking
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-11.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -114,15 +114,13 @@ GPS systems work by having a number of satellites that send a signal with each s
![By knowing the distance from the sensor to multiple satellites, the location can be calculated](../../../images/gps-satellites.png) ![By knowing the distance from the sensor to multiple satellites, the location can be calculated](../../../images/gps-satellites.png)
***By knowing the distance from the sensor to multiple satellites, the location be calculated. Satellite by Noura Mbarki from the [Noun Project](https://thenounproject.com)***
GPS satellites are circling the Earth, not at a fixed point above the sensor, so location data includes altitude above sea level as well as latitude and longitude. GPS satellites are circling the Earth, not at a fixed point above the sensor, so location data includes altitude above sea level as well as latitude and longitude.
GPS used to have limitations on accuracy enforced by the US military, limiting accuracy to around 5 meters. This limitation was removed in 2000, allowing an accuracy of 30 centimeters. Getting this accuracy is not always possible due to interference with the signals. GPS used to have limitations on accuracy enforced by the US military, limiting accuracy to around 5 meters. This limitation was removed in 2000, allowing an accuracy of 30 centimeters. Getting this accuracy is not always possible due to interference with the signals.
✅ If you have a smart phone, launch the mapping app and see how accurate your location is. It may take a short period of time for your phone to detect multiple satellites to get a more accurate location. ✅ If you have a smart phone, launch the mapping app and see how accurate your location is. It may take a short period of time for your phone to detect multiple satellites to get a more accurate location.
> 💁 The satellites contain atomic clocks that are incredibly accurate, but they drift by 38 microseconds (0.0000038 seconds) a day compared to atomic clocks, due to time slowing down as speed increases as predicted by Einstein's theories of special and general relativity - the satellites travel faster than the Earth's rotation. This drift has been used to prove the predictions of special and general relativity, and has to be adjusted for in the design of GPS systems. Literally time runs slower on a GPS satellite. > 💁 The satellites contain atomic clocks that are incredibly accurate, but they drift by 38 microseconds (0.0000038 seconds) a day compared to atomic clocks on Earth, due to time slowing down as speed increases as predicted by Einstein's theories of special and general relativity - the satellites travel faster than the Earth's rotation. This drift has been used to prove the predictions of special and general relativity, and has to be adjusted for in the design of GPS systems. Literally time runs slower on a GPS satellite.
GPS systems have been developed and deployed by a number of countries and political unions including the US, Russia, Japan, India, the EU, and China. Modern GPS sensors can connect to most of these systems to get faster and more accurate fixes. GPS systems have been developed and deployed by a number of countries and political unions including the US, Russia, Japan, India, the EU, and China. Modern GPS sensors can connect to most of these systems to get faster and more accurate fixes.
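To make the idea of calculating position from signal timings concrete, here is a small illustrative Python calculation - it is not part of the lesson code, and the satellite altitude and travel time are rough, rounded figures. It also shows why the 38 microsecond relativistic drift would matter if it were left uncorrected.

```python
# Radio signals travel at roughly the speed of light
SPEED_OF_LIGHT_M_S = 299_792_458

# A GPS satellite orbits at around 20,200 km, so a signal takes roughly 67 ms to arrive
travel_time_s = 0.067
distance_m = SPEED_OF_LIGHT_M_S * travel_time_s
print(f'Distance to satellite: {distance_m / 1000:,.0f} km')  # about 20,000 km

# If the 38 microsecond per day clock drift were never corrected, the position
# error would grow by roughly this much every day
drift_s_per_day = 38e-6
error_m_per_day = SPEED_OF_LIGHT_M_S * drift_s_per_day
print(f'Uncorrected drift error: {error_m_per_day / 1000:.1f} km per day')  # about 11 km
```

Repeating the distance calculation for at least four satellites gives enough information to solve for latitude, longitude, altitude, and the receiver's clock error.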
@ -1,8 +1,8 @@
# Store location data # Store location data
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-12.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -18,6 +18,7 @@ In this lesson we'll cover:
* [Structured and unstructured data](#structured-and-unstructured-data) * [Structured and unstructured data](#structured-and-unstructured-data)
* [Send GPS data to an IoT Hub](#send-gps-data-to-an-iot-hub) * [Send GPS data to an IoT Hub](#send-gps-data-to-an-iot-hub)
* [Hot, warm, and cold paths](#hot-warm-and-cold-paths)
* [Handle GPS events using serverless code](#handle-gps-events-using-serverless-code) * [Handle GPS events using serverless code](#handle-gps-events-using-serverless-code)
* [Azure Storage Accounts](#azure-storage-accounts) * [Azure Storage Accounts](#azure-storage-accounts)
* [Connect your serverless code to storage](#connect-your-serverless-code-to-storage) * [Connect your serverless code to storage](#connect-your-serverless-code-to-storage)
@ -44,6 +45,8 @@ Imagine you were adding IoT devices to a fleet of vehicles for a large commercia
This data can change constantly. For example, if the IoT device is in a truck cab, then the data it sends may change as the trailer changes, such as only sending temperature data when a refrigerated trailer is used. This data can change constantly. For example, if the IoT device is in a truck cab, then the data it sends may change as the trailer changes, such as only sending temperature data when a refrigerated trailer is used.
✅ What other IoT data might be captured? Think about the kinds of loads trucks can carry, as well as maintenance data.
This data varies from vehicle to vehicle, but it all gets sent to the same IoT service for processing. The IoT service needs to be able to process this unstructured data, storing it in a way that allows it to be searched or analyzed, even though the structure varies from message to message. This data varies from vehicle to vehicle, but it all gets sent to the same IoT service for processing. The IoT service needs to be able to process this unstructured data, storing it in a way that allows it to be searched or analyzed, even though the structure varies from message to message.
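As a sketch of what this unstructured data can look like, here are two hypothetical telemetry messages from different vehicles in the same fleet. The field names are invented for illustration, but they show how documents with different shapes can flow into the same IoT service:

```python
# Telemetry from a truck pulling a refrigerated trailer - includes a temperature reading
refrigerated_truck_message = {
    'vehicle_id': 'truck-17',
    'location': {'lat': 47.6062, 'lon': -122.3321},
    'trailer': {'type': 'refrigerated', 'temperature_c': 3.2}
}

# Telemetry from a truck pulling a flatbed trailer - no temperature, but a load weight instead
flatbed_truck_message = {
    'vehicle_id': 'truck-42',
    'location': {'lat': 51.5072, 'lon': -0.1276},
    'load_weight_kg': 18000
}
```

A rigid table schema would struggle with the differing fields, which is the problem the NoSQL document storage described below is designed to solve.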
### SQL vs NoSQL storage ### SQL vs NoSQL storage
@ -58,10 +61,14 @@ The first databases were Relational Database Management Systems (RDBMS), or rela
For example, if you stored a user's personal details in a table, you would have some kind of internal unique ID per user that is used in a row in a table that contains the user's name and address. If you then wanted to store other details about that user, such as their purchases, in another table, you would have one column in the new table for that user's ID. When you look up a user, you can use their ID to get their personal details from one table, and their purchases from another. For example, if you stored a user's personal details in a table, you would have some kind of internal unique ID per user that is used in a row in a table that contains the user's name and address. If you then wanted to store other details about that user, such as their purchases, in another table, you would have one column in the new table for that user's ID. When you look up a user, you can use their ID to get their personal details from one table, and their purchases from another.
SQL databases are ideal for storing structured data, and for when you want to ensure the data matches your schema. Some well known SQL databases are Microsoft SQL Server, MySQL, and PostgreSQL. SQL databases are ideal for storing structured data, and for when you want to ensure the data matches your schema.
✅ If you haven't used SQL before, take a moment to read up on it on the [SQL page on Wikipedia](https://wikipedia.org/wiki/SQL). ✅ If you haven't used SQL before, take a moment to read up on it on the [SQL page on Wikipedia](https://wikipedia.org/wiki/SQL).
Some well known SQL databases are Microsoft SQL Server, MySQL, and PostgreSQL.
✅ Do some research: Read up on some of these SQL databases and their capabilities.
#### NoSQL database #### NoSQL database
NoSQL databases are called NoSQL because they don't have the same rigid structure of SQL databases. They are also known as document databases as they can store unstructured data such as documents. NoSQL databases are called NoSQL because they don't have the same rigid structure of SQL databases. They are also known as document databases as they can store unstructured data such as documents.
@ -74,6 +81,8 @@ NoSQL database do not have a pre-defined schema that limits how data is stored,
Some well known NoSQL databases include Azure CosmosDB, MongoDB, and CouchDB. Some well known NoSQL databases include Azure CosmosDB, MongoDB, and CouchDB.
✅ Do some research: Read up on some of these NoSQL databases and their capabilities.
In this lesson, you will be using NoSQL storage to store IoT data. In this lesson, you will be using NoSQL storage to store IoT data.
## Send GPS data to an IoT Hub ## Send GPS data to an IoT Hub
@ -82,8 +91,6 @@ In the last lesson you captured GPS data from a GPS sensor connected to your IoT
![Sending GPS telemetry from an IoT device to IoT Hub](../../../images/gps-telemetry-iot-hub.png) ![Sending GPS telemetry from an IoT device to IoT Hub](../../../images/gps-telemetry-iot-hub.png)
***Sending GPS telemetry from an IoT device to IoT Hub. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
### Task - send GPS data to an IoT Hub ### Task - send GPS data to an IoT Hub
1. Create a new IoT Hub using the free tier. 1. Create a new IoT Hub using the free tier.
@ -136,14 +143,36 @@ message = Message(json.dumps(message_json))
Run your device code and ensure messages are flowing into IoT Hub using the `az iot hub monitor-events` CLI command. Run your device code and ensure messages are flowing into IoT Hub using the `az iot hub monitor-events` CLI command.
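As a reminder of what the device code boils down to, here is a minimal sketch of sending a GPS reading as JSON telemetry using the `azure-iot-device` Python SDK. The connection string and coordinates are placeholders - the lesson code reads the position from the GPS sensor instead of hard-coding it:

```python
import json
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder - use the connection string for the device you registered in IoT Hub
connection_string = '<your device connection string>'

device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
device_client.connect()

# Build the telemetry payload from a (hypothetical) GPS reading and send it
telemetry = {'gps': {'lat': 47.6062, 'lon': -122.3321}}
device_client.send_message(Message(json.dumps(telemetry)))
```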
## Hot, warm, and cold paths
Data that flows from an IoT device to the cloud is not always processed in real time. Some data needs real-time processing, other data can be processed a short time later, and other data can be processed much later. The flow of data to different services that process the data at different times is referred to as the hot, warm, and cold paths.
### Hot path
The hot path refers to data that needs to be processed in real time or near real time. You would use hot path data for alerts, such as being notified when a vehicle is approaching a depot, or when the temperature in a refrigerated truck is too high.
To use hot path data, your code would respond to events as soon as they are received by your cloud services.
### Warm path
The warm path refers to data that can be processed a short while after being received, for example for reporting or short term analytics. You would use warm path data for daily reports on vehicle mileage, using data gathered the previous day.
Warm path data is stored once it is received by the cloud service inside some kind of storage that can be quickly accessed.
### Cold path
The cold path refers to historic data, storing data for the long term to be processed whenever needed. For example, you could use the cold path to get annual mileage reports for vehicles, or run analytics on routes to find the optimal route to reduce fuel costs.
Cold path data is stored in data warehouses - databases designed for storing large amounts of data that will never change and can be queried quickly and easily. You would normally have a scheduled job in your cloud application that runs each day, week, or month to move data from warm path storage into the data warehouse.
✅ Think about the data you have captured so far in these lessons. Is it hot, warm or cold path data?
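One way to picture the three paths in code is a small dispatcher that handles each incoming event differently. This is only a sketch with made-up field names and print statements standing in for real services - in practice each path is usually handled by a separate cloud service:

```python
REFRIGERATED_TEMPERATURE_LIMIT_C = 5.0

def send_alert(message):
    # Hot path stand-in - a real system might send an SMS or push notification
    print(f'ALERT: {message}')

def save_to_storage(event):
    # Warm path stand-in - a real system would write to quickly accessible storage
    print(f'Saved for reporting: {event}')

def handle_telemetry(event):
    # Hot path - react immediately to time-sensitive readings
    temperature = event.get('trailer_temperature_c')
    if temperature is not None and temperature > REFRIGERATED_TEMPERATURE_LIMIT_C:
        send_alert(f'Trailer too warm: {temperature}C')

    # Warm path - store every event so it can be used for daily reports
    save_to_storage(event)

    # Cold path - nothing happens per event; a scheduled job later moves the
    # warm path storage into a data warehouse for long-term analytics

handle_telemetry({'vehicle_id': 'truck-17', 'trailer_temperature_c': 7.5})
```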
## Handle GPS events using serverless code ## Handle GPS events using serverless code
Once data is flowing into your IoT Hub, you can write some serverless code to listen for events published to the Event-Hub compatible endpoint. Once data is flowing into your IoT Hub, you can write some serverless code to listen for events published to the Event-Hub compatible endpoint. This is the warm path - this data will be stored and used in the next lesson for reporting on the journey.
![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger](../../../images/gps-telemetry-iot-hub-functions.png) ![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger](../../../images/gps-telemetry-iot-hub-functions.png)
***Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
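The shape of that serverless code is an Azure Functions event hub trigger. A minimal Python sketch is below - the function name is a placeholder, and the trigger's binding (configured in `function.json`) points at the Event Hub compatible endpoint of your IoT Hub:

```python
import json
import logging

import azure.functions as func

def main(event: func.EventHubEvent):
    # The event body is the raw telemetry the device sent, as bytes
    telemetry = json.loads(event.get_body().decode('utf-8'))
    logging.info('GPS telemetry received: %s', telemetry)
```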
### Task - handle GPS events using serverless code ### Task - handle GPS events using serverless code
1. Create an Azure Functions app using the Azure Functions CLI. Use the Python runtime, and create it in a folder called `gps-trigger`, and use the same name for the Functions App project name. Make sure you create a virtual environment to use for this. 1. Create an Azure Functions app using the Azure Functions CLI. Use the Python runtime, and create it in a folder called `gps-trigger`, and use the same name for the Functions App project name. Make sure you create a virtual environment to use for this.
@ -207,8 +236,6 @@ In this lesson, you will use the Python SDK to see how to interact with blob sto
![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage](../../../images/save-telemetry-to-storage-from-functions.png) ![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage](../../../images/save-telemetry-to-storage-from-functions.png)
***Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The data will be saved as a JSON blob with the following format: The data will be saved as a JSON blob with the following format:
```json ```json
@ -343,7 +370,6 @@ The data will be saved as a JSON blob with the following format:
> 💁 Make sure you are not running the IoT Hub event monitor at the same time. > 💁 Make sure you are not running the IoT Hub event monitor at the same time.
> 💁 You can find this code in the [code/functions](code/functions) folder. > 💁 You can find this code in the [code/functions](code/functions) folder.
### Task - verify the uploaded blobs ### Task - verify the uploaded blobs
@ -1,8 +1,10 @@
# Visualize location data # Visualize location data
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-13.png)
This video gives an overview of OAzure Maps with IoT, a service that will be covered in this lesson. > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of Azure Maps with IoT, a service that will be covered in this lesson.
[![Azure Maps - The Microsoft Azure Enterprise Location Platform](https://img.youtube.com/vi/P5i2GFTtb2s/0.jpg)](https://www.youtube.com/watch?v=P5i2GFTtb2s) [![Azure Maps - The Microsoft Azure Enterprise Location Platform](https://img.youtube.com/vi/P5i2GFTtb2s/0.jpg)](https://www.youtube.com/watch?v=P5i2GFTtb2s)
@ -1,6 +1,8 @@
# Geofences # Geofences
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-14.png)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of geofences and how to use them in Azure Maps, topics that will be covered in this lesson: This video gives an overview of geofences and how to use them in Azure Maps, topics that will be covered in this lesson:
@ -1,6 +1,8 @@
# Train a fruit quality detector # Train a fruit quality detector
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-15.png)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of the Azure Custom Vision service, a service that will be covered in this lesson. This video gives an overview of the Azure Custom Vision service, a service that will be covered in this lesson.
@ -38,8 +40,6 @@ The rise of automated harvesting moved the sorting of produce from the harvest t
![If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever](../../../images/optical-tomato-sorting.png) ![If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever](../../../images/optical-tomato-sorting.png)
***If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever. tomato by parkjisun from the Noun Project - from the [Noun Project](https://thenounproject.com)***
The next evolution was to use machines to sort, either built into the harvester, or in the processing plants. The first generation of these machines used optical sensors to detect colors, controlling actuators to push green tomatoes into a waste bin using levers or puffs of air, leaving red tomatoes to continue on a network of conveyor belts. The next evolution was to use machines to sort, either built into the harvester, or in the processing plants. The first generation of these machines used optical sensors to detect colors, controlling actuators to push green tomatoes into a waste bin using levers or puffs of air, leaving red tomatoes to continue on a network of conveyor belts.
The video below shows one of these machines in action. The video below shows one of these machines in action.
@ -74,6 +74,8 @@ ML models don't give a binary answer, instead they give probabilities. For examp
The ML model used to detect images like this is called an *image classifier* - it is given labelled images, and then classifies new images based off these labels. The ML model used to detect images like this is called an *image classifier* - it is given labelled images, and then classifies new images based off these labels.
> 💁 This is an over-simplification, and there are many other ways to train models that don't always need labelled outputs, such as unsupervised learning. If you want to learn more about ML, check out [ML for beginners, a 24 lesson curriculum on Machine Learning](https://aka.ms/ML-beginners).
## Train an image classifier ## Train an image classifier
To successfully train an image classifier you need millions of images. As it turns out, once you have an image classifier trained on millions or billions of assorted images, you can re-use it and re-train it using a small set of images and get great results, using a process called *transfer learning*. To successfully train an image classifier you need millions of images. As it turns out, once you have an image classifier trained on millions or billions of assorted images, you can re-use it and re-train it using a small set of images and get great results, using a process called *transfer learning*.
@ -122,7 +124,7 @@ To use Custom Vision, you first need to create two cognitive services resources
Replace `<location>` with the location you used when creating the Resource Group. Replace `<location>` with the location you used when creating the Resource Group.
This will create a Custom Vision training resource in your Resource Group. It will be called `fruit-quality-detector-training` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services. This will create a Custom Vision training resource in your Resource Group. It will be called `fruit-quality-detector-training` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services.
> 💁 Use `S0` sku if you already have a free account using any of the Cognitive Services. > 💁 Use `S0` sku if you already have a free account using any of the Cognitive Services.
1. Use the following command to create a free Custom Vision prediction resource: 1. Use the following command to create a free Custom Vision prediction resource:
@ -144,7 +146,7 @@ To use Custom Vision, you first need to create two cognitive services resources
1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai), and sign in with the Microsoft account you used for your Azure account. 1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai), and sign in with the Microsoft account you used for your Azure account.
1. Follow the [Create a new Project section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#create-a-new-project) to create a new Custom Vision project. The UI may change and these docs are always the most up to date reference. 1. Follow the [create a new Project section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#create-a-new-project) to create a new Custom Vision project. The UI may change and these docs are always the most up to date reference.
Call your project `fruit-quality-detector`. Call your project `fruit-quality-detector`.
@ -178,11 +180,11 @@ Image classifiers run at very low resolution. For example Custom Vision can take
If you don't have both ripe and unripe fruit, you can use different fruits, or any two objects you have available. You can also find some example images in the [images](./images) folder of ripe and unripe bananas that you can use. If you don't have both ripe and unripe fruit, you can use different fruits, or any two objects you have available. You can also find some example images in the [images](./images) folder of ripe and unripe bananas that you can use.
1. Follow the [Upload and tag images section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#upload-and-tag-images) to upload your training images. Tag the ripe fruit as `ripe`, and the unripe fruit as `unripe`. 1. Follow the [upload and tag images section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#upload-and-tag-images) to upload your training images. Tag the ripe fruit as `ripe`, and the unripe fruit as `unripe`.
![The upload dialogs showing the upload of ripe and unripe banana pictures](../../../images/image-upload-bananas.png) ![The upload dialogs showing the upload of ripe and unripe banana pictures](../../../images/image-upload-bananas.png)
1. Follow the [Train the classifier section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#train-the-classifier) to train the image classifier on your uploaded images. 1. Follow the [train the classifier section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#train-the-classifier) to train the image classifier on your uploaded images.
You will be given a choice of training type. Select **Quick Training**. You will be given a choice of training type. Select **Quick Training**.
@ -196,7 +198,7 @@ Once your classifier is trained, you can test it by giving it a new image to cla
### Task - test your image classifier ### Task - test your image classifier
1. Follow the [Test your model documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#test-your-model) to test your image classifier. Use the testing images you created earlier, not any of the images you used for training. 1. Follow the [test your model documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#test-your-model) to test your image classifier. Use the testing images you created earlier, not any of the images you used for training.
![A unripe banana predicted as unripe with a 98.9% probability, ripe with a 1.1% probability](../../../images/banana-unripe-quick-test-prediction.png) ![A unripe banana predicted as unripe with a 98.9% probability, ripe with a 1.1% probability](../../../images/banana-unripe-quick-test-prediction.png)
@ -210,7 +212,7 @@ Every time you make a prediction using the quick test option, the image and resu
### Task - retrain your image classifier ### Task - retrain your image classifier
1. Follow the [Use the predicted image for training documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#use-the-predicted-image-for-training) to retrain your model, using the correct tag for each image. 1. Follow the [use the predicted image for training documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#use-the-predicted-image-for-training) to retrain your model, using the correct tag for each image.
1. Once your model has been retrained, test it on new images. 1. Once your model has been retrained, test it on new images.
@ -228,8 +230,8 @@ Try it out and see what the predictions are. You can find images to try with usi
## Review & Self Study ## Review & Self Study
* When you trained your classifier, you would have seen values for *Precision*, *Recall*, and *AP* that rate the model that was created. Read up on what these values are using [the Evaluate the classifier section of the Build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#evaluate-the-classifier) * When you trained your classifier, you would have seen values for *Precision*, *Recall*, and *AP* that rate the model that was created. Read up on what these values are using [the evaluate the classifier section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#evaluate-the-classifier)
* Read up on how to improve your classifier from the [How to improve your Custom Vision model on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-improving-your-classifier?WT.mc_id=academic-17441-jabenn) * Read up on how to improve your classifier from the [how to improve your Custom Vision model on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-improving-your-classifier?WT.mc_id=academic-17441-jabenn)
## Assignment ## Assignment
@ -1,8 +1,8 @@
# Check fruit quality from an IoT device # Check fruit quality from an IoT device
Add a sketchnote if possible/appropriate ![A sketchnote overview of this lesson](../../../sketchnotes/lesson-16.png)
![Embed a video here if available](video-url) > Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz ## Pre-lecture quiz
@ -151,7 +151,7 @@ If you were to create a production device to sell to farms or factories, how wou
You trained your custom vision model using the portal. This relies on having images available - and in the real world you may not be able to get training data that matches what the camera on your device captures. You can work round this by training directly from your device using the training API, to train a model using images captured from your IoT device. You trained your custom vision model using the portal. This relies on having images available - and in the real world you may not be able to get training data that matches what the camera on your device captures. You can work round this by training directly from your device using the training API, to train a model using images captured from your IoT device.
* Read up on the training API in the [Using the Custom Vision SDK quick start](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/quickstarts/image-classification?tabs=visual-studio&pivots=programming-language-python&WT.mc_id=academic-17441-jabenn) * Read up on the training API in the [using the Custom Vision SDK quick start](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/quickstarts/image-classification?tabs=visual-studio&pivots=programming-language-python&WT.mc_id=academic-17441-jabenn)
## Assignment ## Assignment
@ -1,6 +1,6 @@
# Classify an image - Virtual IoT Hardware and Raspberry Pi # Classify an image - Virtual IoT Hardware and Raspberry Pi
In this part of the lesson, you will add send the image captured by the camera to the Custom Vision service to classify it. In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it.
## Send images to Custom Vision ## Send images to Custom Vision
@ -25,7 +25,7 @@ The Custom Vision service has a Python SDK you can use to classify images.
This brings in some modules from the Custom Vision libraries, one to authenticate with the prediction key, and one to provide a prediction client class that can call Custom Vision. This brings in some modules from the Custom Vision libraries, one to authenticate with the prediction key, and one to provide a prediction client class that can call Custom Vision.
1. Add the following code to to the end of the file: 1. Add the following code to the end of the file:
```python ```python
prediction_url = '<prediction_url>' prediction_url = '<prediction_url>'
@ -86,6 +86,6 @@ The Custom Vision service has a Python SDK you can use to classify images.
![A banana in custom vision predicted ripe at 56.8% and unripe at 43.1%](../../../images/custom-vision-banana-prediction.png) ![A banana in custom vision predicted ripe at 56.8% and unripe at 43.1%](../../../images/custom-vision-banana-prediction.png)
> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-device](code-classify/virtual-device) folder. > 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-iot-device](code-classify/virtual-iot-device) folder.
😀 Your fruit quality classifier program was a success! 😀 Your fruit quality classifier program was a success!
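For quick reference, the essence of the classification call from the steps above, condensed into one sketch - the key, endpoint, project ID, iteration name, and image file are all placeholders to fill in from your own project:

```python
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

# Placeholders - take these values from the Custom Vision portal
prediction_key = '<prediction key>'
endpoint = 'https://<region>.api.cognitive.microsoft.com/'
project_id = '<project id>'
iteration_name = '<published iteration name>'

credentials = ApiKeyCredentials(in_headers={'Prediction-key': prediction_key})
predictor = CustomVisionPredictionClient(endpoint, credentials)

# Classify an image from disk - the lesson code passes the BytesIO object from the camera instead
with open('fruit.jpg', 'rb') as image_file:
    results = predictor.classify_image(project_id, iteration_name, image_file)

for prediction in results.predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
```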
@ -101,7 +101,7 @@ Program the device.
> 💁 You can capture the image directly to a file instead of a `BytesIO` object by passing the file name to the `camera.capture` call. The reason for using the `BytesIO` object is so that later in this lesson you can send the image to your image classifier. > 💁 You can capture the image directly to a file instead of a `BytesIO` object by passing the file name to the `camera.capture` call. The reason for using the `BytesIO` object is so that later in this lesson you can send the image to your image classifier.
1. Configure the image that the camera in CounterFit will capture. You can either set the *Source* to *File*, then upload an image file, or set the *Source* to *WebCam*, and images will be captures from your web cam. Make sure you select the **Set** button after selecting a picture or selecting your webcam. 1. Configure the image that the camera in CounterFit will capture. You can either set the *Source* to *File*, then upload an image file, or set the *Source* to *WebCam*, and images will be captured from your web cam. Make sure you select the **Set** button after selecting a picture or selecting your webcam.
![CounterFit with a file set as the image source, and a web cam set showing a person holding a banana in a preview of the webcam](../../../images/counterfit-camera-options.png) ![CounterFit with a file set as the image source, and a web cam set showing a person holding a banana in a preview of the webcam](../../../images/counterfit-camera-options.png)
@ -10,7 +10,7 @@ The camera you'll use is an [ArduCam Mini 2MP Plus](https://www.arducam.com/prod
## Connect the camera ## Connect the camera
The ArduCam doesn't have a Grove socket, instead it connects to both the SPI and I<sup>2</sup>C busses via the GPIO pins on the Wio Terminal. The ArduCam doesn't have a Grove socket, instead it connects to both the SPI and I<sup>2</sup>C buses via the GPIO pins on the Wio Terminal.
### Task - connect the camera ### Task - connect the camera
@ -1,10 +1,10 @@
# Classify an image - Wio Terminal # Classify an image - Wio Terminal
In this part of the lesson, you will add send the image captured by the camera to the Custom Vision service to classify it. In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it.
## Classify an image ## Classify an image
The Custom Vision service has a REST API you can call from the Wio Terminal use to classify images. THis REST API is accessed over an HTTPS connection - a secure HTTP connection. The Custom Vision service has a REST API you can call from the Wio Terminal to classify images. This REST API is accessed over an HTTPS connection - a secure HTTP connection.
When interacting with HTTPS endpoints, the client code needs to request the public key certificate from the server being accessed, and use that to encrypt the traffic it sends. Your web browser does this automatically, but microcontrollers do not. You will need to request this certificate manually and use it to create a secure connection to the REST API. These certificates don't change, so once you have a certificate, it can be hard coded in your application. When interacting with HTTPS endpoints, the client code needs to request the public key certificate from the server being accessed, and use that to encrypt the traffic it sends. Your web browser does this automatically, but microcontrollers do not. You will need to request this certificate manually and use it to create a secure connection to the REST API. These certificates don't change, so once you have a certificate, it can be hard coded in your application.
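The device code in this lesson is Arduino C++ and the certificate is provided for you to hard code, but if you ever need to fetch a server's public certificate yourself, Python's standard library can do it - a small sketch, with the host name as a placeholder:

```python
import ssl

# Placeholder - replace with the host name of the HTTPS endpoint you need to call
host = 'example.com'

# Download the server's public certificate in PEM format, ready to be hard coded
certificate_pem = ssl.get_server_certificate((host, 443))
print(certificate_pem)
```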
@ -12,7 +12,7 @@ These certificates contain public keys, and don't need to be kept secure. You ca
### Task - set up a SSL client ### Task - set up a SSL client
1. Open the `fruit-quality-detector` app project if it's not already open 1. Open the `fruit-quality-detector` app project if it's not already open.
1. Open the `config.h` header file, and add the following: 1. Open the `config.h` header file, and add the following:
@ -1,14 +1,8 @@
# Run your fruit detector on the edge # Run your fruit detector on the edge
<!-- This lesson is still under development -->
Add a sketchnote if possible/appropriate
This video gives an overview of running image classifiers on IoT devices, the topic that is covered in this lesson. This video gives an overview of running image classifiers on IoT devices, the topic that is covered in this lesson.
[![Custom Vison AI on Azure IoT Edge](https://img.youtube.com/vi/_K5fqGLO8us/0.jpg)](https://www.youtube.com/watch?v=_K5fqGLO8us) [![Custom Vision AI on Azure IoT Edge](https://img.youtube.com/vi/_K5fqGLO8us/0.jpg)](https://www.youtube.com/watch?v=_K5fqGLO8us)
> 🎥 Click the image above to watch a video
## Pre-lecture quiz ## Pre-lecture quiz
@ -16,23 +10,98 @@ This video gives an overview of running image classifiers on IoT devices, the to
## Introduction ## Introduction
In this lesson you will learn about In the last lesson you used your image classifier to classify ripe and unripe fruit, sending an image captured by the camera on your IoT device over the internet to a cloud service. These calls take time, cost money, and depending on the kind of image data you are using, could have privacy implications.
In this lesson you will learn about how to run machine learning (ML) models on the edge - on IoT devices running on your own network rather than in the cloud. You will learn the benefits and drawbacks of edge computing versus cloud computing, how to deploy your AI model to the edge, and how to access it from your IoT device.
In this lesson we'll cover: In this lesson we'll cover:
* [Edge computing](#edge-computing) * [Edge computing](#edge-computing)
* [Azure IoT Edge](#azure-iot-edge) * [Azure IoT Edge](#azure-iot-edge)
* [Register an IoT Edge device](#registeran-iot-edge-device) * [Register an IoT Edge device](#register-an-iot-edge-device)
* [Set up an IoT Edge device](#set-up-an-iot-dge-device) * [Set up an IoT Edge device](#set-up-an-iot-edge-device)
* [Run your classifier on the edge](run-your-classifier-on-the-edge) * [Export your model](#export-your-model)
* [Prepare your container for deployment](#prepare-your-container-for-deployment)
* [Deploy your container](#deploy-your-container)
* [Use your IoT Edge device](#use-your-iot-edge-device)
## Edge computing ## Edge computing
Edge computing involves having computers that process IoT data as close as possible to where the data is generated. Instead of having this processing in the cloud, it is moved to the edge of the cloud - your internal network.
![An architecture diagram showing internet services in the cloud and IoT devices on a local network](../../../images/cloud-without-edge.png)
In the lessons so far, you have had devices gathering data and sending data to the cloud to be analyzed, running serverless functions or AI models in the cloud.
![An architecture diagram showing IoT devices on a local network connecting to edge devices, and those edge devices connect to the cloud](../../../images/cloud-with-edge.png)
Edge computing involves moving some of the cloud services off the cloud and onto computers running on the same network as the IoT devices, only communicating with the cloud if needed. For example, you can run AI models on edge devices to analyse fruit for ripeness, and only send analytics back to the cloud, such as the number of ripe pieces of fruit vs unripe.
✅ Think about the IoT applications you have built so far. Which parts of them could be moved to the edge?
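As a sketch of the kind of workload that moves nicely to the edge, the snippet below classifies images locally and only sends an aggregated summary upstream. The classifier call is a hypothetical stand-in - a real implementation would call a model running on the edge device:

```python
from collections import Counter

def classify_locally(image_path):
    # Hypothetical stand-in for calling an image classifier hosted on the edge device
    return 'ripe'

# Hypothetical images captured by the IoT device during one shift
captured_images = ['fruit-001.jpg', 'fruit-002.jpg', 'fruit-003.jpg']

counts = Counter(classify_locally(image) for image in captured_images)

# Only the aggregated counts leave the local network, not the raw images
summary = {'ripe': counts['ripe'], 'unripe': counts['unripe']}
print(f'Sending summary to the cloud: {summary}')
```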
### Upsides
The upsides of edge computing are:
1. **Speed** - edge computing is ideal for time-sensitive data, as actions are taken on the same network as the device rather than making calls across the internet. This reduces latency, as internal networks can run substantially faster than internet connections and the data travels a much shorter distance.
> 💁 Even though internet connections use optical cables that carry data at close to the speed of light, data can still take time to travel around the world to cloud providers. For example, if you are sending data from Europe to cloud services in the US, it takes at least 28ms for the data to cross the Atlantic in an optical cable - and that ignores the time taken to get the data to the transatlantic cable, convert it between electrical and light signals at each end, and then get it from the optical cable to the cloud provider.
Edge computing also requires less network traffic, reducing the risk of your data slowing down due to congestion on the limited bandwidth available for an internet connection.
1. **Remote accessibility** - edge compute works when you have limited or no connectivity, or connectivity is too expensive to use continually. For example when working in humanitarian disaster areas where infrastructure is limited, or in developing nations.
1. **Lower costs** - performing data collection, storage, analysis, and triggering actions on edge devices reduces usage of cloud services, which can reduce the overall cost of your IoT application. There has been a recent rise in devices designed for edge computing, such as AI accelerator boards like the [Jetson Nano from NVIDIA](https://developer.nvidia.com/embedded/jetson-nano-developer-kit), which can run AI workloads using GPU-based hardware on devices that cost less than US$100.
1. **Privacy and security** - with edge compute, data stays on your network and is not uploaded to the cloud. This is often preferred for sensitive and personally identifiable information, especially because data does not need to be stored after it has been analyzed, which greatly reduces the risk of data leaks. Examples include medical data and security camera footage.
1. **Handling insecure devices** - if you have devices with known security flaws that you don't want to connect directly to your network or the internet, then you can connect them to a separate network containing a gateway IoT Edge device. This edge device can then also have a connection to your wider network or the internet, and manage the data flowing back and forth.
1. **Support for incompatible devices** - if you have devices that cannot connect to IoT Hub, for example devices that can only connect using HTTP connections or devices that only have Bluetooth to connect, you can use an IoT edge device as a gateway device, forwarding on messages to IoT Hub.
✅ Do some research: What other upsides might there be to edge computing?
### Downsides
There are downsides to edge computing, where the cloud may be a preferred option:
1. **Scale and flexibility** - cloud computing can adjust to network and data needs in real time by adding or removing servers and other resources. Adding more edge compute capacity requires manually installing and configuring more devices.
1. **Reliability and resiliency** - cloud computing provides multiple servers, often in multiple locations, for redundancy and disaster recovery. To have the same level of redundancy on the edge requires large investments and a lot of configuration work.
1. **Maintenance** - cloud service providers provide system maintenance and updates.
✅ Do some research: What other downsides might there be to edge computing?
The downsides are really the opposite of the upsides of using the cloud - you have to build and manage these devices yourself, rather than relying on the expertise and scale of cloud providers.
Some of the risks are mitigated by the very nature of edge computing. For example, if you have an edge device running in a factory gathering data from machinery, you don't need to think about some disaster recovery scenarios. If the power to the factory goes out then you don't need a backup edge device as the machines that generate the data the edge device processes will also be without power.
For IoT systems, you'll often want a blend of cloud and edge computing, leveraging each service based on the needs of the system, its customers, and its maintainers.
## Azure IoT Edge ## Azure IoT Edge
![The Azure IoT Edge logo](../../../images/azure-iot-edge-logo.png) ![The Azure IoT Edge logo](../../../images/azure-iot-edge-logo.png)
IoT Edge runs code from containers. Azure IoT Edge is a service that can help you to move workloads out of the cloud and to the edge. You set up a device as an edge device, and from the cloud you can deploy code to that edge device. This allows you to mix the capabilities of the cloud and the edge.
> 🎓 *Workloads* is a term for any service that does some kind of work, such as AI models, applications, or serverless functions.
For example, you can train an image classifier in the cloud, then from the cloud deploy it to an edge device. Your IoT device then sends images to the edge device for classification, rather than sending the images over the internet. If you need to deploy a new iteration of the model, you can train it in the cloud and use IoT Edge to update the model on the edge device to your new iteration.
> 🎓 Software that is deployed to IoT Edge is known as *modules*. By default IoT Edge runs modules that communicate with IoT Hub, such as the `edgeAgent` and `edgeHub` modules. When you deploy an image classifier, this is deployed as an additional module.
IoT Edge is built into IoT Hub, so you can manage edge devices using the same service you would use to manage IoT devices, with the same level of security.
IoT Edge runs code from *containers* - self-contained applications that are run in isolation from the rest of the applications on your computer. When you run a container, it acts like a separate computer running inside your computer, with its own software, services and applications running. Most of the time containers cannot access anything on your computer unless you choose to share things like a folder with the container. The container then exposes services via an open port that you can connect to or expose to your network.
![A web request redirected to a container](../../../images/container-web-browser.png)
For example, you can have a container with a web site running on port 80, the default HTTP port, and you can then expose it from your computer also on port 80.
✅ Do some research: Read up on containers and services such as Docker or Moby.
You can use Custom Vision to download image classifiers and deploy them as containers, either running direct to a device or deployed via IoT Edge. Once they are running in a container, they can be accessed using the same REST API as the cloud version, but with the endpoint pointing to the Edge device running the container.
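As a sketch of what that looks like from device code, the snippet below posts an image to a classifier container running on the local network. The IP address and file name are placeholders, and it assumes the exported container exposes the `/image` prediction route described in the Custom Vision export documentation:

```python
import requests

# Placeholder - the address of the device running the classifier container
classifier_url = 'http://192.168.1.100/image'

# Send the raw image bytes to the container's prediction endpoint
with open('fruit.jpg', 'rb') as image_file:
    response = requests.post(classifier_url,
                             headers={'Content-Type': 'application/octet-stream'},
                             data=image_file.read())

print(response.json())  # tag names and probabilities, same shape as the cloud REST API
```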
## Register an IoT Edge device ## Register an IoT Edge device
@ -66,9 +135,11 @@ To use an IoT Edge device, it needs to be registered in IoT Hub.
## Set up an IoT Edge device ## Set up an IoT Edge device
### Task - set up an IoT Edge device Once you have created the edge device registration in your IoT Hub, you can set up the edge device.
### Task - Install and start the IoT Edge Runtime
The IoT Edge runtime only runs Linux containers. It can be run on Linux, or on Windows using Linux Virtual Machines. **The IoT Edge runtime only runs Linux containers.** It can be run on Linux, or on Windows using Linux Virtual Machines.
* If you are using a Raspberry Pi as your IoT device, then this runs a supported version of Linux and can host the IoT Edge runtime. Follow the [Install Azure IoT Edge for Linux guide on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge?WT.mc_id=academic-17441-jabenn) to install IoT Edge and set the connection string. * If you are using a Raspberry Pi as your IoT device, then this runs a supported version of Linux and can host the IoT Edge runtime. Follow the [Install Azure IoT Edge for Linux guide on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge?WT.mc_id=academic-17441-jabenn) to install IoT Edge and set the connection string.
@ -80,24 +151,457 @@ The IoT Edge runtime only runs Linux containers. It can be run on Linux, or on W
* If you are using macOS, you can create a virtual machine (VM) in the cloud to use for your IoT Edge device. These are computers you can create in the cloud and access over the internet. You can create a Linux VM that has IoT Edge installed. Follow the [Create a virtual machine running IoT Edge guide](vm-iotedge.md) for instructions on how to do this. * If you are using macOS, you can create a virtual machine (VM) in the cloud to use for your IoT Edge device. These are computers you can create in the cloud and access over the internet. You can create a Linux VM that has IoT Edge installed. Follow the [Create a virtual machine running IoT Edge guide](vm-iotedge.md) for instructions on how to do this.
## Create a classifier that can run on the edge ## Export your model
To run the classifier at the edge, it needs to be exported from Custom Vision. Custom Vision can generate two types of models - standard models and compact models. Compact models use various techniques to reduce the size of the model, making it small enough to be downloaded and deployed on IoT devices.
When you created the image classifier, you used the *Food* domain, a version of the model that is optimized for training on food images. In Custom Vision, you can change the domain of your project, and your training data will be used to train a new model with the new domain. All of the domains supported by Custom Vision are available as both standard and compact versions.
## Run your classifier on the edge ### Task - train your model using the Food (compact) domain
### Task - deploy your classifier using IoT Edge 1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `fruit-quality-detector` project.
### Task - use the edge classifier from your IoT device 1. Select the **Settings** button (the ⚙ icon)
1. In the *Domains* list, select *Food (compact)*
1. Under *Export Capabilities*, make sure *Basic platforms (Tensorflow, CoreML, ONNX, ...)* is selected.
1. At the bottom of the Settings page, select **Save Changes**.
1. Retrain the model with the **Train** button, selecting *Quick training*.
### Task - export your model
Once the model has been trained, it needs to be exported as a container.
1. Select the **Performance** tab, and find your latest iteration that was trained using the compact domain.
1. Select the **Export** button at the top.
1. Select **DockerFile**, then choose a version that matches your edge device:
* If you are running IoT Edge on a Linux computer, a Windows computer or a Virtual Machine, select the *Linux* version.
* If you are running IoT Edge on a Raspberry Pi, select the *ARM (Raspberry Pi 3)* version.
> 🎓 Docker is one of the most popular tools for managing containers, and a DockerFile is a set of instructions on how to set up the container.
1. Select **Export** to get Custom Vision to create the relevant files, then **Download** to download them in a zip file.
1. Save the files to your computer, then unzip the folder.
## Prepare your container for deployment
![Containers are built then pushed to a container registry, then deployed from the container registry to an edge device using IoT Edge](../../../images/container-edge-flow.png)
Once you have downloaded your model, it needs to be built into a container, then pushed to a container registry - an online location where you can store containers. IoT Edge can then download the container from the registry and push it to your device.
![The Azure Container Registry logo](../../../images/azure-container-registry-logo.png)
The container registry you will use for this lesson is Azure Container Registry. This is not a free service, so to save money make sure you [clean up your project](../../../clean-up.md) once you are finished.
> 💁 You can see the costs of using an Azure Container Registry in the [Azure Container Registry pricing page](https://azure.microsoft.com/pricing/details/container-registry/?WT.mc_id=academic-17441-jabenn)
### Task - install Docker
To build and deploy the classifier, you'll need to install [Docker](https://www.docker.com/).
1. Follow the Docker installation instructions on the [Docker install page](https://www.docker.com/products/docker-desktop) to install Docker Desktop or the Docker engine. Ensure it is running after installation.
### Task - create a container registry resource
1. Run the following command from your Terminal or command prompt to create an Azure Container Registry resource:
```sh
az acr create --resource-group fruit-quality-detector \
--sku Basic \
--name <Container registry name>
```
Replace `<Container registry name>` with a unique name for your container registry, using letters and numbers only. Base this around `fruitqualitydetector`. This name becomes part of the URL to access the container registry, so needs to be globally unique.
1. Log in to the Azure Container Registry with the following command:
```sh
az acr login --name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
1. Set the container registry into admin mode so you can generate a password with the following command:
```sh
az acr update --admin-enabled true \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
1. Generate passwords for your container registry with the following command:
```sh
az acr credential renew --password-name password \
--output table \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
Take a copy of the value of `PASSWORD`, as you will need this later.
### Task - build your container
What you downloaded from Custom Vision was a DockerFile containing instructions on how the container should be built, together with application code that will run inside the container to host your Custom Vision model and expose a REST API to call it. You can use Docker to build a tagged container from the DockerFile, then push it to your container registry.
> 🎓 Containers are given a tag that defines a name and version for them. When you need to update a container you can build it with the same tag but a newer version.
1. Open your terminal or command prompt and navigate to the unzipped model that you downloaded from Custom Vision.
1. Run the following command to build and tag the image:
```sh
docker build --platform <platform> -t <Container registry name>.azurecr.io/classifier:v1 .
```
Replace `<platform>` with the platform that this container will run on. If you are running IoT Edge on a Raspberry Pi, set this to `linux/arm64`, otherwise set this to `linux/amd64`.
> 💁 If you are running this command on the device that runs IoT Edge, such as your Raspberry Pi, you can omit the `--platform <platform>` part as it defaults to the current platform.
Replace `<Container registry name>` with the name you used for your container registry.
> 💁 If you are running Linux you may need to use `sudo` to run this command.
Docker will build the image, configuring all the software needed. The image will then be tagged as `classifier:v1`.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux docker build --platform linux/amd64 -t fruitqualitydetectorjimb.azurecr.io/classifier:v1 .
[+] Building 102.4s (11/11) FINISHED
=> [internal] load build definition from Dockerfile
=> => transferring dockerfile: 131B
=> [internal] load .dockerignore
=> => transferring context: 2B
=> [internal] load metadata for docker.io/library/python:3.7-slim
=> [internal] load build context
=> => transferring context: 905B
=> [1/6] FROM docker.io/library/python:3.7-slim@sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6
=> => resolve docker.io/library/python:3.7-slim@sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6
=> => sha256:b4d181a07f8025e00e0cb28f1cc14613da2ce26450b80c54aea537fa93cf3bda 27.15MB / 27.15MB
=> => sha256:de8ecf497b753094723ccf9cea8a46076e7cb845f333df99a6f4f397c93c6ea9 2.77MB / 2.77MB
=> => sha256:707b80804672b7c5d8f21e37c8396f319151e1298d976186b4f3b76ead9f10c8 10.06MB / 10.06MB
=> => sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6 1.86kB / 1.86kB
=> => sha256:44073386687709c437586676b572ff45128ff1f1570153c2f727140d4a9accad 1.37kB / 1.37kB
=> => sha256:3d94f0f2ca798607808b771a7766f47ae62a26f820e871dd488baeccc69838d1 8.31kB / 8.31kB
=> => sha256:283715715396fd56d0e90355125fd4ec57b4f0773f306fcd5fa353b998beeb41 233B / 233B
=> => sha256:8353afd48f6b84c3603ea49d204bdcf2a1daada15f5d6cad9cc916e186610a9f 2.64MB / 2.64MB
=> => extracting sha256:b4d181a07f8025e00e0cb28f1cc14613da2ce26450b80c54aea537fa93cf3bda
=> => extracting sha256:de8ecf497b753094723ccf9cea8a46076e7cb845f333df99a6f4f397c93c6ea9
=> => extracting sha256:707b80804672b7c5d8f21e37c8396f319151e1298d976186b4f3b76ead9f10c8
=> => extracting sha256:283715715396fd56d0e90355125fd4ec57b4f0773f306fcd5fa353b998beeb41
=> => extracting sha256:8353afd48f6b84c3603ea49d204bdcf2a1daada15f5d6cad9cc916e186610a9f
=> [2/6] RUN pip install -U pip
=> [3/6] RUN pip install --no-cache-dir numpy~=1.17.5 tensorflow~=2.0.2 flask~=1.1.2 pillow~=7.2.0
=> [4/6] RUN pip install --no-cache-dir mscviplib==2.200731.16
=> [5/6] COPY app /app
=> [6/6] WORKDIR /app
=> exporting to image
=> => exporting layers
=> => writing image sha256:1846b6f134431f78507ba7c079358ed66d944c0e185ab53428276bd822400386
=> => naming to fruitqualitydetectorjimb.azurecr.io/classifier:v1
```
### Task - push your container to your container registry
1. Use the following command to push your container to your container registry:
```sh
docker push <Container registry name>.azurecr.io/classifier:v1
```
Replace `<Container registry name>` with the name you used for your container registry.
> 💁 If you are running Linux you may need to use `sudo` to run this command.
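> 💁 If the push fails with an authentication error, you may need to sign in to your registry first. One way to do this (a sketch, assuming you have the Azure CLI installed and are signed in to Azure) is:
```sh
az acr login --name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry, then re-run the push.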
The container will be pushed to the container registry.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux docker push fruitqualitydetectorjimb.azurecr.io/classifier:v1
The push refers to repository [fruitqualitydetectorjimb.azurecr.io/classifier]
5f70bf18a086: Pushed
8a1ba9294a22: Pushed
56cf27184a76: Pushed
b32154f3f5dd: Pushed
36103e9a3104: Pushed
e2abb3cacca0: Pushed
4213fd357bbe: Pushed
7ea163ba4dce: Pushed
537313a13d90: Pushed
764055ebc9a7: Pushed
v1: digest: sha256:ea7894652e610de83a5a9e429618e763b8904284253f4fa0c9f65f0df3a5ded8 size: 2423
```
1. To verify the push, you can list the containers in your registry with the following command:
```sh
az acr repository list --output table \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux az acr repository list --name fruitqualitydetectorjimb --output table
Result
----------
classifier
```
You will see your classifier listed in the output.
## Deploy your container
Your container can now be deployed to your IoT Edge device. To deploy you need to define a deployment manifest - a JSON document that lists the modules that will be deployed to the edge device.
### Task - create the deployment manifest
1. Create a new file called `deployment.json` somewhere on your computer.
1. Add the following to this file:
```json
{
"content": {
"modulesContent": {
"$edgeAgent": {
"properties.desired": {
"schemaVersion": "1.1",
"runtime": {
"type": "docker",
"settings": {
"minDockerVersion": "v1.25",
"loggingOptions": "",
"registryCredentials": {
"ClassifierRegistry": {
"username": "<Container registry name>",
"password": "<Container registry password>",
"address": "<Container registry name>.azurecr.io"
}
}
}
},
"systemModules": {
"edgeAgent": {
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-agent:1.1",
"createOptions": "{}"
}
},
"edgeHub": {
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-hub:1.1",
"createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"
}
}
},
"modules": {
"ImageClassifier": {
"version": "1.0",
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "<Container registry name>.azurecr.io/classifier:v1",
"createOptions": "{\"ExposedPorts\": {\"80/tcp\": {}},\"HostConfig\": {\"PortBindings\": {\"80/tcp\": [{\"HostPort\": \"80\"}]}}}"
}
}
}
}
},
"$edgeHub": {
"properties.desired": {
"schemaVersion": "1.1",
"routes": {
"upstream": "FROM /messages/* INTO $upstream"
},
"storeAndForwardConfiguration": {
"timeToLiveSecs": 7200
}
}
}
}
}
}
```
> 💁 You can find this file in the [code-deployment/deployment](code-deployment/deployment) folder.
Replace the three instances of `<Container registry name>` with the name you used for your container registry. One is in the `ImageClassifier` module section, the other two are in the `registryCredentials` section.
Replace `<Container registry password>` in the `registryCredentials` section with your container registry password.
1. From the folder containing your deployment manifest, run the following command:
```sh
az iot edge set-modules --device-id fruit-quality-detector-edge \
--content deployment.json \
--hub-name <hub_name>
```
Replace `<hub_name>` with the name of your IoT Hub.
The image classifier module will be deployed to your edge device.
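You can optionally confirm the deployment from the CLI. This is a sketch, assuming you have the Azure IoT extension for the Azure CLI installed:
```sh
az iot hub module-identity list --device-id fruit-quality-detector-edge \
                                --hub-name <hub_name> \
                                --output table
```
Replace `<hub_name>` with the name of your IoT Hub. Once the device has received the deployment, you should see module identities for `ImageClassifier`, `$edgeAgent` and `$edgeHub`.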
### Task - verify the classifier is running
1. Connect to the IoT edge device:
* If you are using a Raspberry Pi to run IoT Edge, connect using ssh either from your terminal, or via a remote SSH session in VS Code
* If you are running IoT Edge in a Linux container on Windows, follow the steps in the [Verify successful configuration guide](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge-on-windows?view=iotedge-2018-06&tabs=powershell&WT.mc_id=academic-17441-jabenn#verify-successful-configuration) to connect to the IoT Edge device.
* If you are running IoT Edge on a virtual machine, you can SSH into the machine using the `adminUsername` and `password` you set when creating the VM, and using either the IP address or DNS name:
```sh
ssh <adminUsername>@<IP address>
```
Or:
```sh
ssh <adminUsername>@<DNS Name>
```
Enter your password when prompted.
1. Once you are connected, run the following command to get the list of IoT Edge modules:
```sh
iotedge list
```
> 💁 You may need to run this command with `sudo`.
You will see the running modules:
```output
jim@fruit-quality-detector-jimb:~$ iotedge list
NAME STATUS DESCRIPTION CONFIG
ImageClassifier running Up 42 minutes fruitqualitydetectorjimb.azurecr.io/classifier:v1
edgeAgent running Up 42 minutes mcr.microsoft.com/azureiotedge-agent:1.1
edgeHub running Up 42 minutes mcr.microsoft.com/azureiotedge-hub:1.1
```
1. Check the logs for the Image classifier module with the following command:
```sh
iotedge logs ImageClassifier
```
> 💁 You may need to run this command with `sudo`.
```output
jim@fruit-quality-detector-jimb:~$ iotedge logs ImageClassifier
2021-07-05 20:30:15.387144: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-07-05 20:30:15.392185: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2394450000 Hz
2021-07-05 20:30:15.392712: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55ed9ac83470 executing computations on platform Host. Devices:
2021-07-05 20:30:15.392806: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): Host, Default Version
Loading model...Success!
Loading labels...2 found. Success!
* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
```
### Task - test the image classifier
1. You can use curl to test the image classifier using the IP address or host name of the computer that is running the IoT Edge agent. Find the IP address:
* If you are on the same machine that IoT Edge is running on, you can use `localhost` as the host name.
* If you are using a VM, you can use either the IP address or the DNS name of the VM.
* Otherwise you can obtain the IP address of the machine running IoT Edge:
* On Windows 10, follow the [Find your IP address guide](https://support.microsoft.com/windows/find-your-ip-address-f21a9bbc-c582-55cd-35e0-73431160a1b9?WT.mc_id=academic-17441-jabenn)
* On macOS, follow the [How to find your IP address on a Mac guide](https://www.hellotech.com/guide/for/how-to-find-ip-address-on-mac)
* On Linux, follow the section on finding your private IP address in the [How to find your IP address in Linux guide](https://opensource.com/article/18/5/how-find-ip-address-linux), or use the quick command shown below
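As a quick alternative on the device itself (a sketch, assuming a Debian-based Linux such as Raspberry Pi OS, where the `hostname` command supports the `-I` flag), you can print the private IP address directly:
```sh
hostname -I
```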
1. You can test the container with a local file by running the following curl command:
```sh
curl --location \
--request POST 'http://<IP address or name>/image' \
--header 'Content-Type: image/png' \
--data-binary '@<file_Name>'
```
Replace `<IP address or name>` with the IP address or host name of the computer running IoT Edge. Replace `<file_Name>` with the name of the file to test.
You will see the prediction results in the output:
```output
{
"created": "2021-07-05T21:44:39.573181",
"id": "",
"iteration": "",
"predictions": [
{
"boundingBox": null,
"probability": 0.9995615482330322,
"tagId": "",
"tagName": "ripe"
},
{
"boundingBox": null,
"probability": 0.0004384400090202689,
"tagId": "",
"tagName": "unripe"
}
],
"project": ""
}
```
> 💁 There is no need to provide a prediction key here, as this is not using an Azure resource. Instead security would be configured on the internal network based on internal security needs, rather than relying on a public endpoint and an API key.
## Use your IoT Edge device
Now that your Image Classifier has been deployed to an IoT Edge device, you can use it from your IoT device.
### Task - use your IoT Edge device
Work through the relevant guide to classify images using the IoT Edge classifier:
* [Arduino - Wio Terminal](wio-terminal.md)
* [Single-board computer - Raspberry Pi/Virtual IoT device](single-board-computer.md)
### Model retraining
One of the downsides to running image classifiers on IoT Edge is that they are not connected to your Custom Vision project. If you look at the **Predictions** tab in Custom Vision you won't see the images that were classified using the Edge-based classifier.
This is the expected behavior - images are not sent to the cloud for classification, so they won't be available in the cloud. One of the upsides of using IoT Edge is privacy: images don't leave your network. Another is the ability to work offline, with no reliance on uploading images when the device has no internet connection. The downside is improving your model - you would need to implement another way of storing images so they can be manually re-classified to improve and re-train the image classifier.
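For example, one hypothetical approach (a minimal sketch, not part of the lesson code) is for the device to save every captured image locally with a timestamp, so the images can later be reviewed, uploaded to Custom Vision and tagged by hand:
```python
from datetime import datetime
from pathlib import Path

def save_for_retraining(image_bytes: bytes, folder: str = 'retraining-images') -> Path:
    """Save a captured image locally so it can be reviewed and uploaded for retraining later."""
    Path(folder).mkdir(exist_ok=True)
    file_path = Path(folder) / f"capture-{datetime.now():%Y%m%d-%H%M%S}.jpg"
    file_path.write_bytes(image_bytes)
    return file_path
```
A scheduled job or a manual step could then copy this folder somewhere it can be used for retraining when connectivity allows.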
✅ Think about ways to upload images to retrain the classifier.
--- ---
## 🚀 Challenge ## 🚀 Challenge
Running AI models on edge devices can be faster than in the cloud - the network hop is shorter. They can also be slower, as the hardware that runs the model may not be as powerful as the cloud.
Do some timings and compare whether the call to your edge device is faster or slower than the call to the cloud. Think about reasons that explain the difference, or the lack of difference. Research ways to run AI models faster on the edge using specialized hardware.
## Post-lecture quiz ## Post-lecture quiz
[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34) [Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)
## Review & Self Study ## Review & Self Study
* Read more about containers on the [OS-level virtualization page on Wikipedia](https://wikipedia.org/wiki/OS-level_virtualization)
* Read more on edge computing, with an emphasis on how 5G can help expand edge computing in the [What is edge computing and why does it matter? article on NetworkWorld](https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html)
* Learn more about running AI services in IoT Edge by watching the [Learn How to Use Azure IoT Edge on a Pre-Built AI Service on the Edge to do Language Detection episode of Learn Live on Microsoft Channel9](https://channel9.msdn.com/Shows/Learn-Live/Sharpen-Your-AI-Edge-Skills-Episode-4-Learn-How-to-Use-Azure-IoT-Edge-on-a-Pre-Built-AI-Service-on-t?WT.mc_id=academic-17441-jabenn)
## Assignment ## Assignment
[](assignment.md) [Run other services on the edge](assignment.md)

@ -1,9 +1,13 @@
# # Run other services on the edge
## Instructions ## Instructions
It's not just image classifiers that can be run on the edge - anything that can be packaged up into a container can be deployed to an IoT Edge device. Serverless code running as Azure Functions, such as the triggers you've created in earlier lessons, can be run in containers, and therefore on IoT Edge.
Pick one of the previous lessons and try to run the Azure Functions app in an IoT Edge container. You can find a guide that shows how to do this using a different Functions app project in the [Tutorial: Deploy Azure Functions as IoT Edge modules on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/tutorial-deploy-function?view=iotedge-2020-11&WT.mc_id=academic-17441-jabenn).
## Rubric ## Rubric
| Criteria | Exemplary | Adequate | Needs Improvement | | Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- | | -------- | --------- | -------- | ----------------- |
| | | | | | Deploy an Azure Functions app to IoT Edge | Was able to deploy an Azure Functions app to IoT Edge and use it with an IoT device to run a trigger from IoT data | Was able to deploy a Functions App to IoT Edge, but was unable to get the trigger to fire | Was unable to deploy a Functions App to IoT Edge |

@ -0,0 +1,28 @@
import io
import requests
import time
from picamera import PiCamera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0
time.sleep(2)
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)
with open('image.jpg', 'wb') as image_file:
image_file.write(image.read())
prediction_url = '<URL>'
headers = {
'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()
for prediction in results['predictions']:
print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')

@ -0,0 +1,28 @@
from counterfit_connection import CounterFitConnection
CounterFitConnection.init('127.0.0.1', 5000)
import io
import requests
from counterfit_shims_picamera import PiCamera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)
with open('image.jpg', 'wb') as image_file:
image_file.write(image.read())
prediction_url = '<URL>'
headers = {
'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()
for prediction in results['predictions']:
print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will find automatically dependent
libraries scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
if ((vid != 0x26) && ((pid != 0x41) || (pid != 0x42)))
{
return false;
}
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
else if ((temp == 0xD8) & (temp_last == 0xFF))
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
// return the buffer
*buffer = buf;
return true;
}
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,11 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";

@ -0,0 +1,123 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <WiFiClient.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClient client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
void classifyImage(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
char buff[32];
sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
classifyImage(buffer, length);
delete[] buffer; // the buffer was allocated with new[], so release it with delete[]
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,66 @@
{
"content": {
"modulesContent": {
"$edgeAgent": {
"properties.desired": {
"schemaVersion": "1.1",
"runtime": {
"type": "docker",
"settings": {
"minDockerVersion": "v1.25",
"loggingOptions": "",
"registryCredentials": {
"ClassifierRegistry": {
"username": "<Container registry name>",
"password": "<Container Password>",
"address": "<Container registry name>.azurecr.io"
}
}
}
},
"systemModules": {
"edgeAgent": {
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-agent:1.1",
"createOptions": "{}"
}
},
"edgeHub": {
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-hub:1.1",
"createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"
}
}
},
"modules": {
"ImageClassifier": {
"version": "1.0",
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "<Container registry name>.azurecr.io/classifier:v1",
"createOptions": "{\"ExposedPorts\": {\"80/tcp\": {}},\"HostConfig\": {\"PortBindings\": {\"80/tcp\": [{\"HostPort\": \"80\"}]}}}"
}
}
}
}
},
"$edgeHub": {
"properties.desired": {
"schemaVersion": "1.1",
"routes": {
"upstream": "FROM /messages/* INTO $upstream"
},
"storeAndForwardConfiguration": {
"timeToLiveSecs": 7200
}
}
}
}
}
}

@ -0,0 +1,54 @@
# Classify an image using an IoT Edge based image classifier - Virtual IoT Hardware and Raspberry Pi
In this part of the lesson, you will use the Image Classifier running on the IoT Edge device.
## Use the IoT Edge classifier
The IoT device can be re-directed to use the IoT Edge image classifier. The URL for the Image Classifier is `http://<IP address or name>/image`, replacing `<IP address or name>` with the IP address or host name of the computer running IoT Edge.
The Python library for Custom Vision only works with cloud-hosted models, not models hosted on IoT Edge. This means you will need to use the REST API to call the classifier.
### Task - use the IoT Edge classifier
1. Open the `fruit-quality-detector` project in VS Code if it is not already open. If you are using a virtual IoT device, then make sure the virtual environment is activated.
1. Open the `app.py` file, and remove the import statements from `azure.cognitiveservices.vision.customvision.prediction` and `msrest.authentication`.
1. Add the following import at the top of the file:
```python
import requests
```
1. Delete all the code after the image is saved to a file, from `image_file.write(image.read())` to the end of the file.
1. Add the following code to the end of the file:
```python
prediction_url = '<URL>'
headers = {
'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()
for prediction in results['predictions']:
print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')
```
Replace `<URL>` with the URL for your classifier.
This code makes a REST POST request to the classifier, sending the image as the body of the request. The results come back as JSON, and this is decoded to print out the probabilities.
1. Run your code, with your camera pointing at some fruit, or an appropriate image set, or fruit visible on your webcam if using virtual IoT hardware. You will see the output in the console:
```output
(.venv) ➜ fruit-quality-detector python app.py
ripe: 56.84%
unripe: 43.16%
```
> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-iot-device](code-classify/virtual-iot-device) folder.
😀 Your fruit quality classifier program was a success!

@ -31,36 +31,71 @@ In Azure, you can create a virtual machine - a computer in the cloud that you ca
Once the VM has been created, the IoT Edge runtime will be installed automatically, and configured to connect to your IoT Hub as your `fruit-quality-detector-edge` device. Once the VM has been created, the IoT Edge runtime will be installed automatically, and configured to connect to your IoT Hub as your `fruit-quality-detector-edge` device.
1. You will need either the IP address or the DNS name of the VM to call the image classifier from it. Run the following command to get this:
```sh
az vm list --resource-group fruit-quality-detector \
--output table \
--show-details
```
Take a copy of either the `PublicIps` field, or the `Fqdns` field.
1. VMs cost money. At the time of writing, a DS1 VM costs about $0.06 per hour. To keep costs down, you should shut down the VM when you are not using it, and delete it when you are finished with this project. 1. VMs cost money. At the time of writing, a DS1 VM costs about $0.06 per hour. To keep costs down, you should shut down the VM when you are not using it, and delete it when you are finished with this project.
To shut down the VM, use the following command: You can configure your VM to automatically shut down at a certain time each day. This means if you forget to shut it down, you won't be billed for more than the time till the automatic shutdown. Use the following command to set this:
```sh ```sh
az vm deallocate --resource-group fruit-quality-detector \ az vm auto-shutdown --resource-group fruit-quality-detector \
--name <vm_name> --name <vm_name> \
--time <shutdown_time_utc>
``` ```
Replace `<vm_name>` with the name of your virtual machine. Replace `<vm_name>` with the name of your virtual machine.
> 💁 There is an `az vm stop` command which will stop the VM, but it keeps the computer allocated to you, so you still pay as if it was still running. Replace `<shutdown_time_utc>` with the UTC time that you want the VM to shut down using 4 digits as HHMM. For example, if you want to shutdown at midnight UTC, you would set this to `0000`. For 7:30PM on the west coast of the USA, you would use 0230 (7:30PM on the US west coast is 2:30AM UTC).
To restart the VM, use the following command: 1. Your image classifier will be running on this edge device, listening on port 80 (the standard HTTP port). By default, virtual machines have inbound ports blocked, so you will need to enable port 80. Ports are enabled on network security groups, so first you need to know the name of the network security group for your VM, which you can find with the following command:
```sh ```sh
az vm start --resource-group fruit-quality-detector \ az network nsg list --resource-group fruit-quality-detector \
--name <vm_name> --output table
``` ```
Replace `<vm_name>` with the name of your virtual machine. Copy the value of the `Name` field.
You can also configure your VM to automatically shut down at a certain time each day. This means if you forget to shut it down, you won't be billed for more than the time till the automatic shutdown. Use the following command to set this: 1. Run the following command to add a rule to open port 80 to the network security group:
```sh ```sh
az vm auto-shutdown --resource-group fruit-quality-detector \ az network nsg rule create \
--name <vm_name> \ --resource-group fruit-quality-detector \
--time <shutdown_time_utc> --name Port_80 \
--protocol tcp \
--priority 1010 \
--destination-port-range 80 \
--nsg-name <nsg name>
```
Replace `<nsg name>` with the network security group name from the previous step.
### Task - manage your VM to reduce costs
1. When you are not using your VM, you should shut it down. To shut down the VM, use the following command:
```sh
az vm deallocate --resource-group fruit-quality-detector \
--name <vm_name>
``` ```
Replace `<vm_name>` with the name of your virtual machine. Replace `<vm_name>` with the name of your virtual machine.
Replace `<shutdown_time_utc>` with the UTC time that you want the VM to shut down using 4 digits as HHMM. For example, if you want to shutdown at midnight UTC, you would set this to `0000`. For 7:30PM on the west coast of the USA, you would use 0230 (7:30PM on the US west coast is 2:30AM UTC). > 💁 There is an `az vm stop` command which will stop the VM, but it keeps the computer allocated to you, so you still pay as if it was still running.
1. To restart the VM, use the following command:
```sh
az vm start --resource-group fruit-quality-detector \
--name <vm_name>
```
Replace `<vm_name>` with the name of your virtual machine.

@ -0,0 +1,52 @@
# Classify an image using an IoT Edge based image classifier - Wio Terminal
In this part of the lesson, you will use the Image Classifier running on the IoT Edge device.
## Use the IoT Edge classifier
The IoT device can be re-directed to use the IoT Edge image classifier. The URL for the Image Classifier is `http://<IP address or name>/image`, replacing `<IP address or name>` with the IP address or host name of the computer running IoT Edge.
### Task - use the IoT Edge classifier
1. Open the `fruit-quality-detector` app project if it's not already open.
1. The image classifier is running as a REST API using HTTP, not HTTPS, so the call needs to use a WiFi client that works with HTTP calls only. This means the certificate is not needed. Delete the `CERTIFICATE` from the `config.h` file.
1. The prediction URL in the `config.h` file needs to be updated to the new URL. You can also delete the `PREDICTION_KEY` as this is not needed.
```cpp
const char *PREDICTION_URL = "<URL>";
```
Replace `<URL>` with the URL for your classifier.
1. In `main.cpp`, change the include directive for the WiFi Client Secure to import the standard HTTP version:
```cpp
#include <WiFiClient.h>
```
1. Change the declaration of `WiFiClient` to be the HTTP version:
```cpp
WiFiClient client;
```
1. Select the line that sets the certificate on the WiFi client. Remove the line `client.setCACert(CERTIFICATE);` from the `connectWiFi` function.
1. In the `classifyImage` function, remove the `httpClient.addHeader("Prediction-Key", PREDICTION_KEY);` line that sets the prediction key in the header.
1. Upload and run your code. Point the camera at some fruit and press the C button. You will see the output in the serial monitor:
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 8200
ripe: 56.84%
unripe: 43.16%
```
> 💁 You can find this code in the [code-classify/wio-terminal](code-classify/wio-terminal) folder.
😀 Your fruit quality classifier program was a success!

@ -1,9 +1,5 @@
# Trigger fruit quality detection from a sensor # Trigger fruit quality detection from a sensor
Add a sketchnote if possible/appropriate
![Embed a video here if available](video-url)
## Pre-lecture quiz ## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/35) [Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/35)
@ -41,8 +37,6 @@ IoT applications can be described as *things* (devices) sending data that genera
![A reference iot architecture](../../../images/iot-reference-architecture.png) ![A reference iot architecture](../../../images/iot-reference-architecture.png)
***A reference iot architecture. Microcontroller by Template / IoT by Adrien Coquet / Brain by Icon Market - all from the [Noun Project](https://thenounproject.com)***
The diagram above shows a reference IoT architecture. The diagram above shows a reference IoT architecture.
> 🎓 A *reference architecture* is an example architecture you can use as a reference when designing new systems. In this case, if you were building a new IoT system you can follow the reference architecture, substituting your own devices and services where appropriate. > 🎓 A *reference architecture* is an example architecture you can use as a reference when designing new systems. In this case, if you were building a new IoT system you can follow the reference architecture, substituting your own devices and services where appropriate.
@ -53,8 +47,6 @@ The diagram above shows a reference IoT architecture.
![A reference iot architecture](../../../images/iot-reference-architecture-azure.png) ![A reference iot architecture](../../../images/iot-reference-architecture-azure.png)
***A reference iot architecture. Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The diagram above shows some of the components and services covered so far in these lessons and how they link together in a reference IoT architecture. The diagram above shows some of the components and services covered so far in these lessons and how they link together in a reference IoT architecture.
* **Things** - you've written device code to capture data from sensors, and analyse images using Custom Vision running both in the cloud and on an edge device. This data was sent to IoT Hub. * **Things** - you've written device code to capture data from sensors, and analyse images using Custom Vision running both in the cloud and on an edge device. This data was sent to IoT Hub.
@ -78,7 +70,7 @@ As you define the architecture of your system, you need to constantly consider d
## Design a fruit quality control system ## Design a fruit quality control system
Lets now take this idea of things, insights, and actions and apply it to our fruit quality detector to design a larger end-to-end application. Let's now take this idea of things, insights, and actions and apply it to our fruit quality detector to design a larger end-to-end application.
Imagine you have been given the task of building a fruit quality detector to be used in a processing plant. Fruit travels on a conveyer belt system where currently employees spend time checking the fruit by hand and removing any unripe fruit as it arrives. To reduce costs, the plant owner wants an automated system. Imagine you have been given the task of building a fruit quality detector to be used in a processing plant. Fruit travels on a conveyer belt system where currently employees spend time checking the fruit by hand and removing any unripe fruit as it arrives. To reduce costs, the plant owner wants an automated system.
@ -96,8 +88,6 @@ You need to build a system where fruit is detected as it arrives on the conveyer
![A reference iot architecture for fruit quality checking](../../../images/iot-reference-architecture-fruit-quality.png) ![A reference iot architecture for fruit quality checking](../../../images/iot-reference-architecture-fruit-quality.png)
***A reference iot architecture for fruit quality checking. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The diagram above shows a reference architecture for this prototype application. The diagram above shows a reference architecture for this prototype application.
* An IoT device with a proximity sensor detects the arrival of fruit. This sends a message to the cloud to say fruit has been detected. * An IoT device with a proximity sensor detects the arrival of fruit. This sends a message to the cloud to say fruit has been detected.
@ -115,8 +105,6 @@ The IoT device needs some kind of trigger to indicate when fruit is ready to be
![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png) ![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png)
***Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back. Bananas by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Proximity sensors can be used to measure the distance from the sensor to an object. They usually transmit a beam of electromagnetic radiation such as a laser beam or infra-red light, then detect the radiation bouncing off an object. The time between the laser beam being sent and the signal bouncing back can be used to calculate the distance to the sensor. Proximity sensors can be used to measure the distance from the sensor to an object. They usually transmit a beam of electromagnetic radiation such as a laser beam or infra-red light, then detect the radiation bouncing off an object. The time between the laser beam being sent and the signal bouncing back can be used to calculate the distance to the sensor.
> 💁 You have probably used proximity sensors without even knowing about it. Most smartphone will turn the screen off when you hold them to your ear to stop you accidentally ending a call with your earlobe, and this works using a proximity sensor, detecting an object close to the screen during a call and disabling the touch capabilities until the phone is a certain distance away. > 💁 You have probably used proximity sensors without even knowing about it. Most smartphone will turn the screen off when you hold them to your ear to stop you accidentally ending a call with your earlobe, and this works using a proximity sensor, detecting an object close to the screen during a call and disabling the touch capabilities until the phone is a certain distance away.
@ -209,7 +197,7 @@ The prototype will form the basis of a final production system. Some of the diff
## 🚀 Challenge ## 🚀 Challenge
In this lesson you have learned some of the concepts you need to know to architect an IoT system. Think back to the previous projects. How would do they fit into the reference architecture shown above? In this lesson you have learned some of the concepts you need to know on how to architect an IoT system. Think back to the previous projects. How would do they fit into the reference architecture shown above?
Pick one of the projects so far and think of the design of a more complicated solution bringing together multiple capabilities beyond what was covered in the projects. Draw the architecture and think of all the devices and services you would need. Pick one of the projects so far and think of the design of a more complicated solution bringing together multiple capabilities beyond what was covered in the projects. Draw the architecture and think of all the devices and services you would need.
@ -222,7 +210,7 @@ For example - a vehicle tracking device that combines GPS with sensors to monito
## Review & Self Study ## Review & Self Study
* Read more about IoT architecture on the [Azure IoT reference architecture documentation on Microsoft docs](https://docs.microsoft.com/azure/architecture/reference-architectures/iot?WT.mc_id=academic-17441-jabenn) * Read more about IoT architecture on the [Azure IoT reference architecture documentation on Microsoft docs](https://docs.microsoft.com/azure/architecture/reference-architectures/iot?WT.mc_id=academic-17441-jabenn)
* Read more about device twins in the [Understand and use device twins in IoT Hub documentation on Microsoft docs](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-device-twins?WT.mc_id=academic-17441-jabenn) * Read more about device twins in the [understand and use device twins in IoT Hub documentation on Microsoft docs](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-device-twins?WT.mc_id=academic-17441-jabenn)
* Read about OPC-UA, a machine to machine communication protocol used in industrial automation on the [OPC-UA page on Wikipedia](https://wikipedia.org/wiki/OPC_Unified_Architecture) * Read about OPC-UA, a machine to machine communication protocol used in industrial automation on the [OPC-UA page on Wikipedia](https://wikipedia.org/wiki/OPC_Unified_Architecture)
## Assignment ## Assignment

@ -40,6 +40,12 @@ Program the device.
1. Open the `fruit-quality-detector` code in VS Code, either directly on the Pi, or connect via the Remote SSH extension. 1. Open the `fruit-quality-detector` code in VS Code, either directly on the Pi, or connect via the Remote SSH extension.
1. Install the rpi-vl53l0x pip package, a Python package that interacts with a VL53L0X time-of-flight distance sensor. Install it using the following pip command:
```sh
pip install rpi-vl53l0x
```
1. Create a new file in this project called `distance-sensor.py`. 1. Create a new file in this project called `distance-sensor.py`.
> 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time. > 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time.
@ -95,4 +101,4 @@ Program the device.
> 💁 You can find this code in the [code-proximity/pi](code-proximity/pi) folder. > 💁 You can find this code in the [code-proximity/pi](code-proximity/pi) folder.
😀 Your proximity sensor program was a success! 😀 Your proximity sensor program was a success!

@ -8,7 +8,7 @@ IoT can help with this, using AI models running on IoT devices to count stock, u
In these 2 lessons you'll learn how to train image-based AI models to count stock, and run these models on IoT devices. In these 2 lessons you'll learn how to train image-based AI models to count stock, and run these models on IoT devices.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md). > 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
## Topics ## Topics

@ -1,7 +1,5 @@
# Train a stock detector # Train a stock detector
Add a sketchnote if possible/appropriate
This video gives an overview of Object Detection the Azure Custom Vision service, a service that will be covered in this lesson. This video gives an overview of Object Detection the Azure Custom Vision service, a service that will be covered in this lesson.
[![Custom Vision 2 - Object Detection Made Easy | The Xamarin Show](https://img.youtube.com/vi/wtTYSyBUpFc/0.jpg)](https://www.youtube.com/watch?v=wtTYSyBUpFc) [![Custom Vision 2 - Object Detection Made Easy | The Xamarin Show](https://img.youtube.com/vi/wtTYSyBUpFc/0.jpg)](https://www.youtube.com/watch?v=wtTYSyBUpFc)
@ -56,7 +54,7 @@ Object detection involves training a model to recognize objects. Instead of givi
When you then use it to predict images, instead of getting back a list of tags and percentages, you get back a list of detected objects, with their bounding box and the probability that the object matches the assigned tag. When you then use it to predict images, instead of getting back a list of tags and percentages, you get back a list of detected objects, with their bounding box and the probability that the object matches the assigned tag.
> 🎓 *Bounding boxes* are the boxes around an object. They are given using coordinates relative to the image as a whole on a scale of 0-1. For example, if the image is 800 pixels wide, by 600 tall and the object it detected between 400 and 600 pixels along, and 150 and 300 pixels down, the bounding box would have a top/left coordinate of 0.5,0.25, with a width of 0.25 and a height of 0.25. That way no matter what size the image is scaled to, the bounding box starts half way along, and a quarter of the way down, and is a quarter of the width and the height. > 🎓 *Bounding boxes* are the boxes around an object.
![Object detection of cashew nuts and tomato paste](../../../images/object-detector-cashews-tomato.png) ![Object detection of cashew nuts and tomato paste](../../../images/object-detector-cashews-tomato.png)
@ -107,7 +105,7 @@ You can train an object detector using Custom Vision, in a similar way to how yo
Call your project `stock-detector`. Call your project `stock-detector`.
When you create your project, make sure to use the `stock-detector-training` resource you created earlier. Use a n*Object Detection* project type, and the *Products on Shelves* domain. When you create your project, make sure to use the `stock-detector-training` resource you created earlier. Use the *Object Detection* project type, and the *Products on Shelves* domain.
![The settings for the custom vision project with the name set to fruit-quality-detector, no description, the resource set to fruit-quality-detector-training, the project type set to classification, the classification types set to multi class and the domains set to food](../../../images/custom-vision-create-object-detector-project.png) ![The settings for the custom vision project with the name set to fruit-quality-detector, no description, the resource set to fruit-quality-detector-training, the project type set to classification, the classification types set to multi class and the domains set to food](../../../images/custom-vision-create-object-detector-project.png)
@ -137,7 +135,7 @@ To train your model you will need a set of images containing the objects you wan
![Tagging some tomato paste](../../../images/object-detector-tag-tomato-paste.png) ![Tagging some tomato paste](../../../images/object-detector-tag-tomato-paste.png)
> 💁 If you have more than 15 images for each object, you can train after 15 then use the **Suggested tags** feature. This will use the trained model to detect the objecs in the untagged image. You can then confirm the detected objects, or reject and re-draw the bounding boxes. This can save a *lot* of time. > 💁 If you have more than 15 images for each object, you can train after 15 then use the **Suggested tags** feature. This will use the trained model to detect the objects in the untagged image. You can then confirm the detected objects, or reject and re-draw the bounding boxes. This can save a *lot* of time.
1. Follow the [Train the detector section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#train-the-detector) to train the object detector on your tagged images. 1. Follow the [Train the detector section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#train-the-detector) to train the object detector on your tagged images.

@ -1,33 +1,173 @@
# Check stock from an IoT device # Check stock from an IoT device
Add a sketchnote if possible/appropriate
![Embed a video here if available](video-url)
## Pre-lecture quiz ## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/39) [Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/39)
## Introduction ## Introduction
In this lesson you will learn about In the previous lesson you learned about the different uses of object detection in retail. You also learned how to train an object detector to identify stock. In this lesson you will learn how to use your object detector from your IoT device to count stock.
In this lesson we'll cover: In this lesson we'll cover:
* [Thing 1](#thing-1) * [Stock counting](#stock-counting)
* [Call your object detector from your IoT device](#call-your-object-detector-from-your-iot-device)
* [Bounding boxes](#bounding-boxes)
* [Retrain the model](#retrain-the-model)
* [Count stock](#count-stock)
> 🗑 This is the last lesson in this project, so after completing this lesson and the assignment, don't forget to clean up your cloud services. You will need the services to complete the assignment, so make sure to complete that first.
>
> Refer to [the clean up your project guide](../../../clean-up.md) if necessary for instructions on how to do this.
## Stock counting
Object detectors can be used for stock checking, either counting stock or ensuring stock is where it should be. IoT devices with cameras can be deployed all around the store to monitor stock, starting with hot spots where having items restocked is important, such as areas where small numbers of high value items are stocked.
For example, if a camera is pointing at a set of shelves that can hold 8 cans of tomato paste, and an object detector only detects 7 cans, then one is missing and needs to be restocked.
![7 cans of tomato paste on a shelf, 4 on the top row, 3 on the bottom](../../../images/stock-7-cans-tomato-paste.png)
In the above image, an object detector has detected 7 cans of tomato paste on a shelf that can hold 8 cans. Not only can the IoT device send a notification of the need to restock, but it can even give an indication of the location of the missing item, important data if you are using robots to restock shelves.
> 💁 Depending on the store and popularity of the item, restocking probably wouldn't happen if only 1 can was missing. You would need to build an algorithm that determines when to restock based on your produce, customers and other criteria.
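As an illustration only (a sketch with made-up numbers, not the lesson's code), such a rule might compare the detected count against the shelf capacity and a restock threshold:
```python
def needs_restocking(detected_count: int, shelf_capacity: int, restock_threshold: float = 0.75) -> bool:
    """Flag the shelf for restocking when the stock level drops below the threshold."""
    return detected_count < shelf_capacity * restock_threshold

# 7 cans detected on a shelf that holds 8 is still above a 75% threshold, so no restock yet
print(needs_restocking(7, 8))   # False
print(needs_restocking(5, 8))   # True
```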
✅ In what other scenarios could you combine object detection and robots?
Sometimes the wrong stock can be on the shelves. This could be human error when restocking, or customers changing their mind on a purchase and putting an item back in the first available space. When this is a non-perishable item such as canned goods, this is an annoyance. If it is a perishable item such as frozen or chilled goods, this can mean that the product can no longer be sold as it might be impossible to tell how long the item was out of the freezer.
Object detection can be used to detect unexpected items, again alerting a human or robot to return the item as soon as it is detected.
![A rogue can of baby corn on the tomato paste shelf](../../../images/stock-rogue-corn.png)
In the above image, a can of baby corn has been put on the shelf next to the tomato paste. The object detector has detected this, allowing the IoT device to notify a human or robot to return the can to its correct location.
## Call your object detector from your IoT device
The object detector you trained in the last lesson can be called from your IoT device.
### Task - publish an iteration of your object detector
Iterations are published from the Custom Vision portal.
1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `stock-detector` project.
1. Select the **Performance** tab from the options at the top
1. Select the latest iteration from the *Iterations* list on the side
1. Select the **Publish** button for the iteration
![The publish button](../../../images/custom-vision-object-detector-publish-button.png)
1. In the *Publish Model* dialog, set the *Prediction resource* to the `stock-detector-prediction` resource you created in the last lesson. Leave the name as `Iteration2`, and select the **Publish** button.
1. Once published, select the **Prediction URL** button. This will show details of the prediction API, and you will need these to call the model from your IoT device. The lower section is labelled *If you have an image file*, and these are the details you want. Take a copy of the URL that is shown, which will be something like:
```output
https://<location>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/<id>/detect/iterations/Iteration2/image
```
Where `<location>` will be the location you used when creating your custom vision resource, and `<id>` will be a long ID made up of letters and numbers.
Also take a copy of the *Prediction-Key* value. This is a secure key that you have to pass when you call the model. Only applications that pass this key are allowed to use the model, any other applications are rejected.
![The prediction API dialog showing the URL and key](../../../images/custom-vision-prediction-key-endpoint.png)
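As a quick sanity check of the published iteration (a sketch, assuming you have a test image such as `shelf.jpg` on disk - any image file will do), you can call the prediction URL directly with curl, passing the key in the `Prediction-Key` header:
```sh
curl --location \
     --request POST '<prediction URL>' \
     --header 'Prediction-Key: <prediction key>' \
     --header 'Content-Type: application/octet-stream' \
     --data-binary '@shelf.jpg'
```
Replace `<prediction URL>` and `<prediction key>` with the values you copied above.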
✅ When a new iteration is published, it will have a different name. How do you think you would change the iteration an IoT device is using?
### Task - call your object detector from your IoT device
Follow the relevant guide below to use the object detector from your IoT device:
* [Arduino - Wio Terminal](wio-terminal-object-detector.md)
* [Single-board computer - Raspberry Pi/Virtual device](single-board-computer-object-detector.md)
## Thing 1 ## Bounding boxes
When you use the object detector, you not only get back the detected objects with their tags and probabilities, but you also get the bounding boxes of the objects. These define where the object detector detected the object with the given probability.
> 💁 A bounding box is a box that defines the area that contains the object detected, a box that defines the boundary for the object.
The results of a prediction in the **Predictions** tab in Custom Vision have the bounding boxes drawn on the image that was sent for prediction.
![4 cans of tomato paste on a shelf with predictions for the 4 detections of 35.8%, 33.5%, 25.7% and 16.6%](../../../images/custom-vision-stock-prediction.png)
In the image above, 4 cans of tomato paste were detected. In the results a red square is overlaid for each object that was detected in the image, indicating the bounding box for that object.
✅ Open the predictions in Custom Vision and check out the bounding boxes.
Bounding boxes are defined with 4 values - top, left, height and width. These values are on a scale of 0-1, representing the positions as a percentage of the size of the image. The origin (the 0,0 position) is the top left of the image, so the top value is the distance from the top, and the bottom of the bounding box is the top plus the height.
![A bounding box around a can of tomato paste](../../../images/bounding-box.png)
The above image is 600 pixels wide and 800 pixels tall. The bounding box starts at 320 pixels down, giving a top coordinate of 0.4 (800 x 0.4 = 320). From the left, the bounding box starts at 240 pixels across, giving a left coordinate of 0.4 (600 x 0.4 = 240). The height of the bounding box is 240 pixels, giving a height value of 0.3 (800 x 0.3 = 240). The width of the bounding box is 120 pixels, giving a width value of 0.2 (600 x 0.2 = 120).
| Coordinate | Value |
| ---------- | ----: |
| Top | 0.4 |
| Left | 0.4 |
| Height | 0.3 |
| Width | 0.2 |
Using percentage values from 0-1 means no matter what size the image is scaled to, the bounding box starts 0.4 of the way along and down, and is 0.3 of the height and 0.2 of the width.
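As a small sketch (using the example values above, not code from the lesson), converting a relative bounding box to pixel coordinates is a simple multiplication:
```python
def to_pixels(box: dict, image_width: int, image_height: int) -> dict:
    """Convert a 0-1 relative bounding box to pixel coordinates."""
    return {
        'left': int(box['left'] * image_width),
        'top': int(box['top'] * image_height),
        'width': int(box['width'] * image_width),
        'height': int(box['height'] * image_height),
    }

# The example above: a 600 x 800 pixel image with a box at top 0.4, left 0.4, height 0.3, width 0.2
print(to_pixels({'top': 0.4, 'left': 0.4, 'height': 0.3, 'width': 0.2}, 600, 800))
# {'left': 240, 'top': 320, 'width': 120, 'height': 240}
```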
You can use bounding boxes combined with probabilities to evaluate how accurate a detection is. For example, an object detector can detect multiple objects that overlap, for example detecting one can inside another. Your code could look at the bounding boxes, understand that this is impossible, and ignore any objects that have a significant overlap with other objects.
![Two bounding boxes overlapping a can of tomato paste](../../../images/overlap-object-detection.png)
In the example above, one bounding box indicated a predicted can of tomato paste at 78.3%. A second bounding box is slightly smaller, and sits inside the first bounding box with a probability of 64.3%. Your code can check the bounding boxes, see that they overlap completely, and ignore the lower probability detection, as one can cannot be inside another.
✅ Can you think of a situation where it is valid to detect one object inside another?
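The device guides later in this lesson implement this overlap check on real detections. As an illustration of the idea, here is a short Python sketch (not part of the lesson code) that ignores the lower probability detection of a pair whose boxes overlap by more than 20% of the smaller box - the coordinates are made up, but the probabilities match the example above:

```python
# Illustrative sketch: reject the lower-probability detection of a pair whose
# bounding boxes overlap by more than 20% of the smaller box.
def overlap_area(box1, box2):
    # Boxes are dicts of left, top, width, height on a 0-1 scale
    left = max(box1['left'], box2['left'])
    top = max(box1['top'], box2['top'])
    right = min(box1['left'] + box1['width'], box2['left'] + box2['width'])
    bottom = min(box1['top'] + box1['height'], box2['top'] + box2['height'])
    if right <= left or bottom <= top:
        return 0.0
    return (right - left) * (bottom - top)

# Made-up coordinates; the second box sits inside the first
detection_1 = {'probability': 0.783, 'box': {'left': 0.2, 'top': 0.2, 'width': 0.3, 'height': 0.4}}
detection_2 = {'probability': 0.643, 'box': {'left': 0.25, 'top': 0.25, 'width': 0.2, 'height': 0.3}}

smallest = min(d['box']['width'] * d['box']['height'] for d in (detection_1, detection_2))

if overlap_area(detection_1['box'], detection_2['box']) > 0.2 * smallest:
    # Keep only the higher probability detection of the overlapping pair
    kept = max(detection_1, detection_2, key=lambda d: d['probability'])
    print(f"Keeping the {kept['probability'] * 100:.1f}% detection, ignoring the other")
```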
## Retrain the model
Like with the image classifier, you can retrain your model using data captured by your IoT device. Using this real-world data will ensure your model works well when used from your IoT device.
Unlike with the image classifier, you can't just tag an image. Instead you need to review every bounding box detected by the model. If a box is around the wrong thing, it needs to be deleted; if it is in the wrong location, it needs to be adjusted.
### Task - retrain the model
1. Make sure you have captured a range of images using your IoT device.
1. From the **Predictions** tab, select an image. You will see red boxes indicating the bounding boxes of the detected objects.
1. Work through each bounding box. Select it first and you will see a pop-up showing the tag. Use the handles on the corners of the bounding box to adjust the size if necessary. If the tag is wrong, remove it with the **X** button and add the correct tag. If the bounding box doesn't contain an object, delete it with the trashcan button.
1. Close the editor when done and the image will move from the **Predictions** tab to the **Training Images** tab. Repeat the process for all the predictions.
1. Use the **Train** button to re-train your model. Once it has trained, publish the iteration and update your IoT device to use the URL of the new iteration, as shown in the sketch after this task.
1. Re-deploy your code and test your IoT device.
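The iteration name is part of the prediction URL, so pointing your device at a new iteration is just a matter of updating that URL (or the part of it your code extracts). A minimal sketch, assuming the URL format shown earlier in this lesson and a hypothetical new iteration called `Iteration3`:

```python
# Illustrative sketch: the published iteration name is embedded in the prediction URL,
# so switching to a new iteration only means updating this URL.
# 'Iteration3' stands in for whatever name your newly published iteration gets.
prediction_url = 'https://<location>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/<id>/detect/iterations/Iteration3/image'

# The lesson's device code splits the URL to recover the endpoint, project ID and iteration name
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]

print(iteration_name)  # Iteration3
```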
## Count stock
Using a combination of the number of objects detected and the bounding boxes, you can count the stock on a shelf.
### Task - count stock
Follow the relevant guide below to count stock using the results from the object detector from your IoT device:
* [Arduino - Wio Terminal](wio-terminal-count-stock.md)
* [Single-board computer - Raspberry Pi/Virtual device](single-board-computer-count-stock.md)
---
## 🚀 Challenge
Can you detect incorrect stock? Train your model on multiple objects, then update your app to alert you if the wrong stock is detected.
Maybe even take this further and detect stock side by side on the same shelf, and see if something has been put in the wrong place by defining limits on the bounding boxes.
## Post-lecture quiz
[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/40)
## Review & Self Study
* Learn more about how to architect an end-to-end stock detection system from the [Out of stock detection at the edge pattern guide on Microsoft Docs](https://docs.microsoft.com/hybrid/app-solutions/pattern-out-of-stock-at-edge?WT.mc_id=academic-17441-jabenn)
* Learn other ways to build end-to-end retail solutions combining a range of IoT and cloud services by watching this [Behind the scenes of a retail solution - Hands On! video on YouTube](https://www.youtube.com/watch?v=m3Pc300x2Mw).
## Assignment
[Use your object detector on the edge](assignment.md)

@ -1,9 +1,11 @@
# Use your object detector on the edge
## Instructions
In the last project, you deployed your image classifier to the edge. Do the same with your object detector, exporting it as a compact model and running it on the edge, accessing the edge version from your IoT device.
## Rubric
| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Deploy your object detector to the edge | Was able to use the correct compact domain, export the object detector and run it on the edge | Was able to use the correct compact domain, and export the object detector, but was unable to run it on the edge | Was unable to use the correct compact domain, export the object detector, and run it on the edge |

@ -0,0 +1,92 @@
import io
import time
from picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials
from PIL import Image, ImageDraw, ImageColor
from shapely.geometry import Polygon
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0
time.sleep(2)
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]
prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)
threshold = 0.3
predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
overlap_threshold = 0.002
def create_polygon(prediction):
    scale_left = prediction.bounding_box.left
    scale_top = prediction.bounding_box.top
    scale_right = prediction.bounding_box.left + prediction.bounding_box.width
    scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

    return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])

to_delete = []

for i in range(0, len(predictions)):
    polygon_1 = create_polygon(predictions[i])

    for j in range(i+1, len(predictions)):
        polygon_2 = create_polygon(predictions[j])
        overlap = polygon_1.intersection(polygon_2).area
        smallest_area = min(polygon_1.area, polygon_2.area)

        if overlap > (overlap_threshold * smallest_area):
            to_delete.append(predictions[i])
            break

for d in to_delete:
    predictions.remove(d)
print(f'Counted {len(predictions)} stock items')
with Image.open('image.jpg') as im:
    draw = ImageDraw.Draw(im)

    for prediction in predictions:
        scale_left = prediction.bounding_box.left
        scale_top = prediction.bounding_box.top
        scale_right = prediction.bounding_box.left + prediction.bounding_box.width
        scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

        left = scale_left * im.width
        top = scale_top * im.height
        right = scale_right * im.width
        bottom = scale_bottom * im.height

        draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)

    im.save('image.jpg')

@ -0,0 +1,92 @@
from counterfit_connection import CounterFitConnection
CounterFitConnection.init('127.0.0.1', 5000)
import io
from counterfit_shims_picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials
from PIL import Image, ImageDraw, ImageColor
from shapely.geometry import Polygon
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]
prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)
threshold = 0.3
predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
overlap_threshold = 0.002
def create_polygon(prediction):
    scale_left = prediction.bounding_box.left
    scale_top = prediction.bounding_box.top
    scale_right = prediction.bounding_box.left + prediction.bounding_box.width
    scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

    return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])

to_delete = []

for i in range(0, len(predictions)):
    polygon_1 = create_polygon(predictions[i])

    for j in range(i+1, len(predictions)):
        polygon_2 = create_polygon(predictions[j])
        overlap = polygon_1.intersection(polygon_2).area
        smallest_area = min(polygon_1.area, polygon_2.area)

        if overlap > (overlap_threshold * smallest_area):
            to_delete.append(predictions[i])
            break

for d in to_delete:
    predictions.remove(d)
print(f'Counted {len(predictions)} stock items')
with Image.open('image.jpg') as im:
    draw = ImageDraw.Draw(im)

    for prediction in predictions:
        scale_left = prediction.bounding_box.left
        scale_top = prediction.bounding_box.top
        scale_right = prediction.bounding_box.left + prediction.bounding_box.width
        scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

        left = scale_left * im.width
        top = scale_top * im.height
        right = scale_right * im.width
        bottom = scale_bottom * im.height

        draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)

    im.save('image.jpg')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
The source code of each library should be placed in a an own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will find automatically dependent
libraries scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42)))
{
return false;
}
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
else if ((temp == 0xD8) & (temp_last == 0xFF))
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
// return the buffer
*buffer = buf;
return true;
}
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,49 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";
const char *PREDICTION_KEY = "<PREDICTION_KEY>";
// Microsoft Azure DigiCert Global Root G2 global certificate
const char *CERTIFICATE =
"-----BEGIN CERTIFICATE-----\r\n"
"MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
"MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
"d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
"MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
"MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
"c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
"ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
"wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
"iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
"ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
"aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
"0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
"gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
"sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
"lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
"N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
"Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
"AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
"BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
"JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
"CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
"Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
"aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
"cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
"MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
"cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
"AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
"+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
"cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
"kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
"trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
"8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
"-----END CERTIFICATE-----\r\n";

@ -0,0 +1,223 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <vector>
#include <WiFiClientSecure.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClientSecure client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
client.setCACert(CERTIFICATE);
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
const float threshold = 0.0f;
const float overlap_threshold = 0.20f;
struct Point {
float x, y;
};
struct Rect {
Point topLeft, bottomRight;
};
float area(Rect rect)
{
return abs(rect.bottomRight.x - rect.topLeft.x) * abs(rect.bottomRight.y - rect.topLeft.y);
}
float overlappingArea(Rect rect1, Rect rect2)
{
float left = max(rect1.topLeft.x, rect2.topLeft.x);
float right = min(rect1.bottomRight.x, rect2.bottomRight.x);
float top = max(rect1.topLeft.y, rect2.topLeft.y);
float bottom = min(rect1.bottomRight.y, rect2.bottomRight.y);
if ( right > left && bottom > top )
{
return (right-left)*(bottom-top);
}
return 0.0f;
}
Rect rectFromBoundingBox(JsonVariant prediction)
{
JsonObject bounding_box = prediction["boundingBox"].as<JsonObject>();
float left = bounding_box["left"].as<float>();
float top = bounding_box["top"].as<float>();
float width = bounding_box["width"].as<float>();
float height = bounding_box["height"].as<float>();
Point topLeft = {left, top};
Point bottomRight = {left + width, top + height};
return {topLeft, bottomRight};
}
void processPredictions(std::vector<JsonVariant> &predictions)
{
std::vector<JsonVariant> passed_predictions;
for (int i = 0; i < predictions.size(); ++i)
{
Rect prediction_1_rect = rectFromBoundingBox(predictions[i]);
float prediction_1_area = area(prediction_1_rect);
bool passed = true;
for (int j = i + 1; j < predictions.size(); ++j)
{
Rect prediction_2_rect = rectFromBoundingBox(predictions[j]);
float prediction_2_area = area(prediction_2_rect);
float overlap = overlappingArea(prediction_1_rect, prediction_2_rect);
float smallest_area = min(prediction_1_area, prediction_2_area);
if (overlap > (overlap_threshold * smallest_area))
{
passed = false;
break;
}
}
if (passed)
{
passed_predictions.push_back(predictions[i]);
}
}
for(JsonVariant prediction : passed_predictions)
{
String boundingBox = prediction["boundingBox"].as<String>();
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
char buff[128]; // Large enough for the tag, probability and bounding box JSON
sprintf(buff, "%s:\t%.2f%%\t%s", tag.c_str(), probability * 100.0, boundingBox.c_str());
Serial.println(buff);
}
Serial.print("Counted ");
Serial.print(passed_predictions.size());
Serial.println(" stock items.");
}
void detectStock(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
httpClient.addHeader("Prediction-Key", PREDICTION_KEY);
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
std::vector<JsonVariant> passed_predictions;
for(JsonVariant prediction : predictions)
{
float probability = prediction["probability"].as<float>();
if (probability > threshold)
{
passed_predictions.push_back(prediction);
}
}
processPredictions(passed_predictions);
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
detectStock(buffer, length);
delete (buffer);
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,40 @@
import io
import time
from picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0
time.sleep(2)
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]
prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)
threshold = 0.3
predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

@ -0,0 +1,40 @@
from counterfit_connection import CounterFitConnection
CounterFitConnection.init('127.0.0.1', 5000)
import io
from counterfit_shims_picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]
prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)
threshold = 0.3
predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
The source code of each library should be placed in a an own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will find automatically dependent
libraries scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42)))
{
return false;
}
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
else if ((temp == 0xD8) & (temp_last == 0xFF))
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
// return the buffer
*buffer = buf;
return true;
}
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,49 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";
const char *PREDICTION_KEY = "<PREDICTION_KEY>";
// Microsoft Azure DigiCert Global Root G2 global certificate
const char *CERTIFICATE =
"-----BEGIN CERTIFICATE-----\r\n"
"MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
"MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
"d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
"MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
"MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
"c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
"ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
"wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
"iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
"ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
"aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
"0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
"gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
"sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
"lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
"N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
"Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
"AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
"BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
"JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
"CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
"Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
"aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
"cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
"MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
"cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
"AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
"+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
"cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
"kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
"trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
"8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
"-----END CERTIFICATE-----\r\n";

@ -0,0 +1,145 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <list>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <vector>
#include <WiFiClientSecure.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClientSecure client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
client.setCACert(CERTIFICATE);
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
const float threshold = 0.3f;
void processPredictions(std::vector<JsonVariant> &predictions)
{
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
char buff[32];
sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
void detectStock(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
httpClient.addHeader("Prediction-Key", PREDICTION_KEY);
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
std::vector<JsonVariant> passed_predictions;
for(JsonVariant prediction : predictions)
{
float probability = prediction["probability"].as<float>();
if (probability > threshold)
{
passed_predictions.push_back(prediction);
}
}
processPredictions(passed_predictions);
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
detectStock(buffer, length);
delete (buffer);
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,163 @@
# Count stock from your IoT device - Virtual IoT Hardware and Raspberry Pi
A combination of the predictions and their bounding boxes can be used to count stock in an image.
## Show bounding boxes
As a helpful debugging step you can not only print out the bounding boxes, but you can also draw them on the image that was written to disk when an image was captured.
### Task - print the bounding boxes
1. Ensure the `stock-counter` project is open in VS Code, and the virtual environment is activated if you are using a virtual IoT device.
1. Change the `print` statement in the `for` loop to the following to print the bounding boxes to the console:
```python
print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%\t{prediction.bounding_box}')
```
1. Run the app with the camera pointing at some stock on a shelf. The bounding boxes will be printed to the console, with left, top, width and height values from 0-1.
```output
pi@raspberrypi:~/stock-counter $ python3 app.py
tomato paste: 33.42% {'additional_properties': {}, 'left': 0.3455171, 'top': 0.09916268, 'width': 0.14175442, 'height': 0.29405564}
tomato paste: 34.41% {'additional_properties': {}, 'left': 0.48283678, 'top': 0.10242918, 'width': 0.11782813, 'height': 0.27467814}
tomato paste: 31.25% {'additional_properties': {}, 'left': 0.4923783, 'top': 0.35007596, 'width': 0.13668466, 'height': 0.28304994}
tomato paste: 31.05% {'additional_properties': {}, 'left': 0.36416405, 'top': 0.37494493, 'width': 0.14024884, 'height': 0.26880276}
```
### Task - draw bounding boxes on the image
1. The Pip package [Pillow](https://pypi.org/project/Pillow/) can be used to draw on images. Install this with the following command:
```sh
pip3 install pillow
```
If you are using a virtual IoT device, make sure to run this from inside the activated virtual environment.
1. Add the following import statement to the top of the `app.py` file:
```python
from PIL import Image, ImageDraw, ImageColor
```
This imports code needed to edit the image.
1. Add the following code to the end of the `app.py` file:
```python
with Image.open('image.jpg') as im:
    draw = ImageDraw.Draw(im)

    for prediction in predictions:
        scale_left = prediction.bounding_box.left
        scale_top = prediction.bounding_box.top
        scale_right = prediction.bounding_box.left + prediction.bounding_box.width
        scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

        left = scale_left * im.width
        top = scale_top * im.height
        right = scale_right * im.width
        bottom = scale_bottom * im.height

        draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)

    im.save('image.jpg')
```
This code opens the image that was saved earlier for editing. It then loops through the predictions getting the bounding boxes, and calculates the bottom right coordinate using the bounding box values from 0-1. These are then converted to image coordinates by multiplying by the relevant dimension of the image. For example, if the left value was 0.5 on an image that was 600 pixels wide, this would convert it to 300 (0.5 x 600 = 300).
Each bounding box is drawn on the image using a red line. Finally the edited image is saved, overwriting the original image.
1. Run the app with the camera pointing at some stock on a shelf. You will see the `image.jpg` file in the VS Code explorer, and you will be able to select it to see the bounding boxes.
![4 cans of tomato paste with bounding boxes around each can](../../../images/rpi-stock-with-bounding-boxes.jpg)
## Count stock
In the image shown above, the bounding boxes have a small overlap. If this overlap was much larger, then the bounding boxes may indicate the same object. To count the objects correctly, you need to ignore boxes with a significant overlap.
### Task - count stock ignoring overlap
1. The Pip package [Shapely](https://pypi.org/project/Shapely/) can be used to calculate the intersection. If you are using a Raspberry Pi, you will need to install a library dependency first:
```sh
sudo apt install libgeos-dev
```
1. Install the Shapely Pip package:
```sh
pip3 install shapely
```
If you are using a virtual IoT device, make sure to run this from inside the activated virtual environment.
1. Add the following import statement to the top of the `app.py` file:
```python
from shapely.geometry import Polygon
```
This imports code needed to create polygons to calculate overlap.
1. Above the code that draws the bounding boxes, add the following code:
```python
overlap_threshold = 0.20
```
This defines the percentage overlap allowed before the bounding boxes are considered to be the same object. 0.20 defines a 20% overlap.
1. To calculate overlap using Shapely, the bounding boxes need to be converted into Shapely polygons. Add the following function to do this:
```python
def create_polygon(prediction):
    scale_left = prediction.bounding_box.left
    scale_top = prediction.bounding_box.top
    scale_right = prediction.bounding_box.left + prediction.bounding_box.width
    scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
    return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])
```
This creates a polygon using the bounding box of a prediction.
1. The logic for removing overlapping objects involves comparing all the bounding boxes; if any pair of predictions has bounding boxes that overlap by more than the threshold, one of the predictions is deleted. To compare all the predictions, you compare prediction 1 with 2, 3, 4, etc., then 2 with 3, 4, etc. The following code does this:
```python
to_delete = []

for i in range(0, len(predictions)):
    polygon_1 = create_polygon(predictions[i])

    for j in range(i+1, len(predictions)):
        polygon_2 = create_polygon(predictions[j])
        overlap = polygon_1.intersection(polygon_2).area
        smallest_area = min(polygon_1.area, polygon_2.area)

        if overlap > (overlap_threshold * smallest_area):
            to_delete.append(predictions[i])
            break

for d in to_delete:
    predictions.remove(d)

print(f'Counted {len(predictions)} stock items')
```
The overlap is calculated using the Shapely `Polygon.intersection` method, which returns a polygon representing the overlap; the area is then calculated from that polygon. The overlap threshold is not an absolute value, but a percentage of a bounding box, so the smallest bounding box is found and the threshold is used to work out how large the overlap can be before it exceeds that percentage of the smallest bounding box. If the overlap exceeds this, the prediction is marked for deletion.
Once a prediction has been marked for deletion it doesn't need to be checked again, so the inner loop breaks out to check the next prediction. You can't delete items from a list whilst iterating through it, so the bounding boxes that overlap more than the threshold are added to the `to_delete` list, then deleted at the end.
Finally the stock count is printed to the console. This could then be sent to an IoT service to alert if the stock levels are low. All of this code is before the bounding boxes are drawn, so you will see the stock predictions without overlaps on the generated images.
> 💁 This is a very simplistic way to remove overlaps, just removing the first one in an overlapping pair. For production code, you would want to add more logic here, such as considering the overlaps between multiple objects, or whether one bounding box is contained by another.
1. Run the app with the camera pointing at some stock on a shelf. The output will indicate the number of bounding boxes without overlaps that exceed the threshold. Try adjusting the `overlap_threshold` value to see predictions being ignored.
> 💁 You can find this code in the [code-count/pi](code-count/pi) or [code-count/virtual-iot-device](code-count/virtual-iot-device) folder.
😀 Your stock counter program was a success!

@ -0,0 +1,74 @@
# Call your object detector from your IoT device - Virtual IoT Hardware and Raspberry Pi
Once your object detector has been published, it can be used from your IoT device.
## Copy the image classifier project
The majority of your stock detector is the same as the image classifier you created in a previous lesson.
### Task - copy the image classifier project
1. Create a folder called `stock-counter` either on your computer if you are using a virtual IoT device, or on your Raspberry Pi. If you are using a virtual IoT device make sure you set up a virtual environment.
1. Set up the camera hardware.
* If you are using a Raspberry Pi you will need to fit the PiCamera. You might also want to fix the camera in a single position, for example, by hanging the cable over a box or can, or fixing the camera to a box with double-sided tape.
* If you are using a virtual IoT device then you will need to install CounterFit and the CounterFit PyCamera shim. If you are going to use still images, capture some images that your object detector hasn't seen yet; if you are going to use your web cam, make sure it is positioned so it can see the stock you are detecting.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---capture-an-image-using-an-iot-device) to capture images from the camera.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---classify-images-from-your-iot-device) to call the image classifier. The majority of this code will be re-used to detect objects.
## Change the code from a classifier to an image detector
The code you used to classify images is very similar to the code to detect objects. The main difference is the method called on the Custom Vision SDK, and the results of the call.
### Task - change the code from a classifier to an image detector
1. Delete the three lines of code that classify the image and process the predictions:
```python
results = predictor.classify_image(project_id, iteration_name, image)
for prediction in results.predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
```
Remove these three lines.
1. Add the following code to detect objects in the image:
```python
results = predictor.detect_image(project_id, iteration_name, image)
threshold = 0.3
predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
```
This code calls the `detect_image` method on the predictor to run the object detector. It then gathers all the predictions with a probability above a threshold, printing them to the console.
Unlike an image classifier that only returns one result per tag, the object detector will return multiple results, so any with a low probability need to be filtered out.
1. Run this code. It will capture an image, send it to the object detector, and print out the detected objects. If you are using a virtual IoT device, ensure you have an appropriate image set in CounterFit, or your web cam is selected. If you are using a Raspberry Pi, make sure your camera is pointing at objects on a shelf.
```output
pi@raspberrypi:~/stock-counter $ python3 app.py
tomato paste: 34.13%
tomato paste: 33.95%
tomato paste: 35.05%
tomato paste: 32.80%
```
> 💁 You may need to adjust the `threshold` to an appropriate value for your images.
You will be able to see the image that was taken, and these values in the **Predictions** tab in Custom Vision.
![4 cans of tomato paste on a shelf with predictions for the 4 detections of 35.8%, 33.5%, 25.7% and 16.6%](../../../images/custom-vision-stock-prediction.png)
> 💁 You can find this code in the [code-detect/pi](code-detect/pi) or [code-detect/virtual-iot-device](code-detect/virtual-iot-device) folder.
😀 Your stock counter program was a success!

@ -0,0 +1,167 @@
# Count stock from your IoT device - Wio Terminal
A combination of the predictions and their bounding boxes can be used to count stock in an image.
## Count stock
![4 cans of tomato paste with bounding boxes around each can](../../../images/rpi-stock-with-bounding-boxes.jpg)
In the image shown above, the bounding boxes have a small overlap. If this overlap was much larger, then the bounding boxes may indicate the same object. To count the objects correctly, you need to ignore boxes with a significant overlap.
### Task - count stock ignoring overlap
1. Open your `stock-counter` project if it is not already open.
1. Above the `processPredictions` function, add the following code:
```cpp
const float overlap_threshold = 0.20f;
```
This defines the percentage overlap allowed before the bounding boxes are considered to be the same object. 0.20 defines a 20% overlap.
1. Below this, and above the `processPredictions` function, add the following code to calculate the overlap between two rectangles:
```cpp
struct Point {
float x, y;
};
struct Rect {
Point topLeft, bottomRight;
};
float area(Rect rect)
{
return abs(rect.bottomRight.x - rect.topLeft.x) * abs(rect.bottomRight.y - rect.topLeft.y);
}
float overlappingArea(Rect rect1, Rect rect2)
{
float left = max(rect1.topLeft.x, rect2.topLeft.x);
float right = min(rect1.bottomRight.x, rect2.bottomRight.x);
float top = max(rect1.topLeft.y, rect2.topLeft.y);
float bottom = min(rect1.bottomRight.y, rect2.bottomRight.y);
if ( right > left && bottom > top )
{
return (right-left)*(bottom-top);
}
return 0.0f;
}
```
This code defines a `Point` struct to store points on the image, and a `Rect` struct to define a rectangle using a top left and bottom right coordinate. It then defines an `area` function that calculates the area of a rectangle from a top left and bottom right coordinate.
Next it defines an `overlappingArea` function that calculates the overlapping area of 2 rectangles. If they don't overlap, it returns 0.
1. Below the `overlappingArea` function, declare a function to convert a bounding box to a `Rect`:
```cpp
Rect rectFromBoundingBox(JsonVariant prediction)
{
JsonObject bounding_box = prediction["boundingBox"].as<JsonObject>();
float left = bounding_box["left"].as<float>();
float top = bounding_box["top"].as<float>();
float width = bounding_box["width"].as<float>();
float height = bounding_box["height"].as<float>();
Point topLeft = {left, top};
Point bottomRight = {left + width, top + height};
return {topLeft, bottomRight};
}
```
This takes a prediction from the object detector, extracts the bounding box and uses the values on the bounding box to define a rectangle. The right side is calculated from the left plus the width. The bottom is calculated as the top plus the height.
1. The predictions need to be compared to each other, and if 2 predictions have an overlap of more than the threshold, one of them needs to be deleted. The overlap threshold is a percentage, so it needs to be multiplied by the size of the smallest bounding box to check that the overlap exceeds the given percentage of that bounding box, not the given percentage of the whole image. Start by deleting the contents of the `processPredictions` function.
1. Add the following to the empty `processPredictions` function:
```cpp
std::vector<JsonVariant> passed_predictions;
for (int i = 0; i < predictions.size(); ++i)
{
Rect prediction_1_rect = rectFromBoundingBox(predictions[i]);
float prediction_1_area = area(prediction_1_rect);
bool passed = true;
for (int j = i + 1; j < predictions.size(); ++j)
{
Rect prediction_2_rect = rectFromBoundingBox(predictions[j]);
float prediction_2_area = area(prediction_2_rect);
float overlap = overlappingArea(prediction_1_rect, prediction_2_rect);
float smallest_area = min(prediction_1_area, prediction_2_area);
if (overlap > (overlap_threshold * smallest_area))
{
passed = false;
break;
}
}
if (passed)
{
passed_predictions.push_back(predictions[i]);
}
}
```
This code declares a vector to store the predictions that don't overlap. It then loops through all the predictions, creating a `Rect` from the bounding box.
Next this code loops through the remaining predictions, starting at the one after the current prediction. This stops predictions being compared more than once - once 1 and 2 have been compared, there's no need to compare 2 with 1, only with 3, 4, etc.
For each pair of predictions, the overlapping area is calculated. This is then compared to the area of the smallest bounding box - if the overlap exceeds the threshold percentage of the smallest bounding box, the prediction is marked as not passed. If, after comparing all the overlaps, the prediction passes the checks, it is added to the `passed_predictions` collection.
> 💁 This is a very simplistic way to remove overlaps, just removing the first one in an overlapping pair. For production code, you would want to add more logic here, such as considering the overlaps between multiple objects, or whether one bounding box is contained by another.
1. After this, add the following code to send details of the passed predictions to the serial monitor:
```cpp
for(JsonVariant prediction : passed_predictions)
{
String boundingBox = prediction["boundingBox"].as<String>();
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
char buff[128]; // Large enough for the tag, probability and bounding box JSON
sprintf(buff, "%s:\t%.2f%%\t%s", tag.c_str(), probability * 100.0, boundingBox.c_str());
Serial.println(buff);
}
```
This code loops through the passed predictions and prints their details to the serial monitor.
1. Below this, add code to print the number of counted items to the serial monitor:
```cpp
Serial.print("Counted ");
Serial.print(passed_predictions.size());
Serial.println(" stock items.");
```
This could then be sent to an IoT service to alert if the stock levels are low.
1. Upload and run your code. Point the camera at objects on a shelf and press the C button. Try adjusting the `overlap_threshold` value to see predictions being ignored.
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 17416
tomato paste: 35.84% {"left":0.395631,"top":0.215897,"width":0.180768,"height":0.359364}
tomato paste: 35.87% {"left":0.378554,"top":0.583012,"width":0.14824,"height":0.359382}
tomato paste: 34.11% {"left":0.699024,"top":0.592617,"width":0.124411,"height":0.350456}
tomato paste: 35.16% {"left":0.513006,"top":0.647853,"width":0.187472,"height":0.325817}
Counted 4 stock items.
```
> 💁 You can find this code in the [code-count/wio-terminal](code-count/wio-terminal) folder.
😀 Your stock counter program was a success!

@ -0,0 +1,102 @@
# Call your object detector from your IoT device - Wio Terminal
Once your object detector has been published, it can be used from your IoT device.
## Copy the image classifier project
The majority of your stock detector is the same as the image classifier you created in a previous lesson.
### Task - copy the image classifier project
1. Connect your ArduCam your Wio Terminal, following the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md#task---connect-the-camera).
You might also want to fix the camera in a single position, for example, by hanging the cable over a box or can, or fixing the camera to a box with double-sided tape.
1. Create a brand new Wio Terminal project using PlatformIO. Call this project `stock-counter`.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---capture-an-image-using-an-iot-device) to capture images from the camera.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---classify-images-from-your-iot-device) to call the image classifier. The majority of this code will be re-used to detect objects.
## Change the code from a classifier to an image detector
The code you used to classify images is very similar to the code to detect objects. The main differences are the URL that is called, which is the one you obtained from Custom Vision, and the results of the call.
### Task - change the code from a classifier to an image detector
1. Add the following include directive to the top of the `main.cpp` file:
```cpp
#include <vector>
```
1. Rename the `classifyImage` function to `detectStock`, both the name of the function and the call in the `buttonPressed` function.
1. Above the `detectStock` function, declare a threshold to filter out any detections that have a low probability:
```cpp
const float threshold = 0.3f;
```
Unlike an image classifier that only returns one result per tag, the object detector will return multiple results, so any with a low probability need to be filtered out.
1. Above the `detectStock` function, declare a function to process the predictions:
```cpp
void processPredictions(std::vector<JsonVariant> &predictions)
{
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
char buff[32];
sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
```
This takes a list of predictions and prints them to the serial monitor.
1. In the `detectStock` function, replace the contents of the `for` loop that loops through the predictions with the following:
```cpp
std::vector<JsonVariant> passed_predictions;
for(JsonVariant prediction : predictions)
{
float probability = prediction["probability"].as<float>();
if (probability > threshold)
{
passed_predictions.push_back(prediction);
}
}
processPredictions(passed_predictions);
```
This loops through the predictions, comparing the probability to the threshold. All predictions with a probability higher than the threshold are added to a `std::vector` and passed to the `processPredictions` function.
1. Upload and run your code. Point the camera at objects on a shelf and press the C button. You will see the output in the serial monitor:
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 17416
tomato paste: 35.84%
tomato paste: 35.87%
tomato paste: 34.11%
tomato paste: 35.16%
```
> 💁 You may need to adjust the `threshold` to an appropriate value for your images.
You will be able to see the image that was taken, and these values, in the **Predictions** tab in Custom Vision.
![4 cans of tomato paste on a shelf with predictions for the 4 detections of 35.8%, 33.5%, 25.7% and 16.6%](../../../images/custom-vision-stock-prediction.png)
> 💁 You can find this code in the [code-detect/wio-terminal](code-detect/wio-terminal) folder.
😀 Your stock counter program was a success!

@ -1,12 +1,12 @@
# Consumer IoT - build a smart voice assistant
The food has been grown, driven to a processing plant, sorted for quality, sold in the store and now it's time to cook! One of the core pieces of any kitchen is a timer. Initially these started as hour glasses - your food was cooked when all the sand trickled down into the bottom bulb. They then went clockwork, then electric.
The latest iterations are now part of our smart devices. In kitchens in homes throughout the world you'll hear cooks shouting "Hey Siri - set a 10 minute timer", or "Alexa - cancel my bread timer". No longer do you have to walk back to the kitchen to check on a timer, you can do it from your phone, or by calling out across the room.
In these 4 lessons you'll learn how to build a smart timer, using AI to recognize your voice, understand what you are asking for, and reply with information about your timer. You'll also add support for multiple languages.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
## Topics

@ -1,7 +1,5 @@
# Recognize speech with an IoT device
This video gives an overview of the Azure speech service, a topic that will be covered in this lesson:
[![How to get started using your Cognitive Services Speech resource from the Microsoft Azure YouTube channel](https://img.youtube.com/vi/iW0Fw0l3mrA/0.jpg)](https://www.youtube.com/watch?v=iW0Fw0l3mrA)
@ -18,7 +16,7 @@ This video gives an overview of the Azure speech service, a topic that will be c
'Alexa, timer status'
'Alexa, set an 8 minute timer called steam broccoli'
Smart devices are becoming more and more pervasive. Not just as smart speakers like HomePods, Echos and Google Homes, but embedded in our phones, watches, and even light fittings and thermostats.
@ -51,8 +49,6 @@ Microphones come in a variety of types:
![Patti Smith singing into a Shure SM58 (dynamic cardioid type) microphone](../../../images/dynamic-mic.jpg)
* Ribbon - Ribbon microphones are similar to dynamic microphones, except they have a metal ribbon instead of a diaphragm. This ribbon moves in a magnetic field generating an electrical current. Like dynamic microphones, ribbon microphones don't need power to work.
![Edmund Lowe, American actor, standing at radio microphone (labeled for (NBC) Blue Network), holding script, 1942](../../../images/ribbon-mic.jpg)
@ -61,8 +57,6 @@ Microphones come in a variety of types:
![C451B small-diaphragm condenser microphone by AKG Acoustics](../../../images/condenser-mic.jpg)
* MEMS - Microelectromechanical systems microphones, or MEMS, are microphones on a chip. They have a pressure sensitive diaphragm etched onto a silicon chip, and work similarly to a condenser microphone. These microphones can be tiny, and integrated into circuitry.
![A MEMS microphone on a circuit board](../../../images/mems-microphone.png)
@ -87,7 +81,7 @@ For example most streaming music services offer 16-bit or 24-bit audio. This mea
> 💁 You may have heard of 8-bit audio, often referred to as LoFi. This is audio sampled using only 8 bits, so -128 to 127. The first computer audio was limited to 8 bits due to hardware limitations, so this is often seen in retro gaming.
These samples are taken many thousands of times per second, using well-defined sample rates measured in KHz (thousands of readings per second). Streaming music services use 48KHz for most audio, but some 'lossless' audio uses up to 96KHz or even 192KHz. The higher the sample rate, the closer to the original the audio will be, up to a point. There is debate whether humans can tell the difference above 48KHz.
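To put those numbers in perspective, here is a small standalone sketch (not part of any lesson code) that works out the raw data rate of uncompressed 16-bit mono audio at 48KHz.
```cpp
#include <stdint.h>
#include <stdio.h>

int main()
{
    const uint32_t sampleRateHz   = 48000; // samples per second
    const uint32_t bytesPerSample = 2;     // 16-bit audio = 2 bytes per sample
    const uint32_t channels       = 1;     // mono

    // Raw, uncompressed data rate
    uint32_t bytesPerSecond = sampleRateHz * bytesPerSample * channels;
    uint32_t bytesPerMinute = bytesPerSecond * 60;

    // Prints: 96000 bytes/second, about 5.8 MB per minute
    printf("16-bit mono at 48KHz: %u bytes/second, about %.1f MB per minute\n",
           bytesPerSecond, bytesPerMinute / 1000000.0);
    return 0;
}
```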
✅ Do some research: If you use a streaming music service, what sample rate and size does it use? If you use CDs, what is the sample rate and size of CD audio?

@ -1,9 +1,5 @@
# Understand language
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/43)

@ -1,9 +1,5 @@
# Set a timer and provide spoken feedback
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/45)

@ -1,7 +1,5 @@
# Support multiple languages
This video gives an overview of the Azure speech services, covering speech to text and text to speech from earlier lessons, as well as translating speech, a topic covered in this lesson:
[![Recognizing speech with a few lines of Python from Microsoft Build 2020](https://img.youtube.com/vi/h6xbpMPSGEA/0.jpg)](https://www.youtube.com/watch?v=h6xbpMPSGEA)
@ -26,6 +24,10 @@ In this lesson we'll cover:
* [Support multiple languages in applications with translations](#support-multiple-languages-in-applications-with-translations)
* [Translate text using an AI service](#translate-text-using-an-ai-service)
> 🗑 This is the last lesson in this project, so after completing this lesson and the assignment, don't forget to clean up your cloud services. You will need the services to complete the assignment, so make sure to complete that first.
>
> Refer to [the clean up your project guide](../../../clean-up.md) if necessary for instructions on how to do this.
## Translate text
Text translation is a computer science problem that has been researched for over 70 years, and only now, thanks to advances in AI and computing power, is it close to being solved to the point where it is almost as good as human translators.
@ -42,7 +44,7 @@ For example, translating "Hello world" from English into French can be performed
Substitutions don't work when different languages use different ways of saying the same thing. For example, the English sentence "My name is Jim" translates into "Je m'appelle Jim" in French - literally "I call myself Jim". "Je" is French for "I", "moi" is me, but is concatenated with the verb as it starts with a vowel, so becomes "m'", "appelle" is to call, and "Jim" isn't translated as it's a name, and not a word that can be translated. Word ordering also becomes an issue - a simple substitution of "Je m'appelle Jim" becomes "I myself call Jim", with a different word order to English.
> 💁 Some words are never translated - my name is Jim regardless of which language is used to introduce me. When translating to languages that use different alphabets, or use different letters for different sounds, words can be *transliterated* - that is, letters or characters are selected that sound like the original word.
Idioms are also a problem for translation. These are phrases that have an understood meaning that is different from a direct interpretation of the words. For example, in English the idiom "I've got ants in my pants" does not literally refer to having ants in your clothing, but to being restless. If you translated this to German, you would end up confusing the listener, as the German version is "I have bumble bees in the bottom".
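To make the limitation concrete, here is a small standalone sketch (illustrative only, not lesson code) of word-by-word substitution using the "Je m'appelle Jim" breakdown from above.
```cpp
#include <iostream>
#include <map>
#include <sstream>
#include <string>

int main()
{
    // A tiny French-to-English substitution dictionary, using the breakdown above
    std::map<std::string, std::string> dictionary = {
        {"Je", "I"},
        {"m'appelle", "myself call"},
        {"Jim", "Jim"}
    };

    std::string french = "Je m'appelle Jim";
    std::stringstream words(french);
    std::string word, english;

    // Substitute each word independently - no reordering, no grammar
    while (words >> word)
    {
        english += dictionary[word] + " ";
    }

    // Prints "I myself call Jim" - understandable, but not natural English
    std::cout << english << std::endl;
    return 0;
}
```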
@ -117,8 +119,6 @@ In an ideal world, your whole application should understand as many different la
![A smart timer architecture translating Japanese to English, processing in English then translating back to Japanese](../../../images/translated-smart-timer.png)
Imagine you are building a smart timer that uses English end-to-end, understanding spoken English and converting that to text, running the language understanding in English, building up responses in English and replying with English speech. If you wanted to add support for Japanese, you could start with translating spoken Japanese to English text, then keep the core of the application the same, then translate the response text to Japanese before speaking the response. This would allow you to quickly add Japanese support, and you can expand to providing full end-to-end Japanese support later.
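A minimal sketch of that wrapper approach is shown below - `translateText` and `processCommand` are hypothetical placeholders for the translator service calls and the existing English-only logic, not real APIs from this project.
```cpp
#include <string>

// Hypothetical helpers - stand-ins for the translator service calls and the
// existing English-only language understanding from the earlier lessons.
std::string translateText(const std::string &text, const std::string &fromLanguage, const std::string &toLanguage);
std::string processCommand(const std::string &englishText);

// Wrap the English-only core with translation on the way in and out
std::string handleSpokenCommand(const std::string &userText, const std::string &userLanguage)
{
    // 1. Translate what the user said into English, the only language the core understands
    std::string englishText = translateText(userText, userLanguage, "en");

    // 2. Run the existing language understanding and build an English response
    std::string englishResponse = processCommand(englishText);

    // 3. Translate the response back into the user's language before it is spoken
    return translateText(englishResponse, "en", userLanguage);
}
```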
> 💁 The downside to relying on machine translation is that different languages and cultures have different ways of saying the same things, so the translation may not match the expression you are expecting.

@ -22,13 +22,13 @@ The projects cover the journey of food from farm to table. This includes farming
> **Teachers**, we have [included some suggestions](for-teachers.md) on how to use this curriculum. If you would like to create your own lessons, we have also included a [lesson template](lesson-template/README.md).
> **Students**, to use this curriculum on your own, fork the entire repo and complete the exercises on your own, starting with a pre-lecture quiz, then reading the lecture and completing the rest of the activities. Try to create the projects by comprehending the lessons rather than copying the solution code; however that code is available in the /solutions folders in each project-oriented lesson. Another idea would be to form a study group with friends and go through the content together. For further study, we recommend [Microsoft Learn](https://docs.microsoft.com/users/jimbobbennett/collections/ke2ehd351jopwr?WT.mc_id=academic-17441-jabenn).
[![Promo video](./images/iot-for-beginners.png)](https://youtube.com/watch?v=R1wrdtmBSII "Promo video")
> 💁 Click the image above for a video about the project!
## Pedagogy
@ -64,7 +64,7 @@ We have two choices of IoT hardware to use for the projects depending on persona
| | Project Name | Concepts Taught | Learning Objectives | Linked Lesson |
| :-: | :----------: | :-------------: | ------------------- | :-----------: |
| 01 | [Getting started](./1-getting-started) | Introduction to IoT | Learn the basic principles of IoT and the basic building blocks of IoT solutions such as sensors and cloud services whilst you are setting up your first IoT device | [Introduction to IoT](./1-getting-started/lessons/1-introduction-to-iot/README.md) |
| 02 | [Getting started](./1-getting-started) | A deeper dive into IoT | Learn more about the components of an IoT system, as well as microcontrollers and single-board computers | [A deeper dive into IoT](./1-getting-started/lessons/2-deeper-dive/README.md) |
| 03 | [Getting started](./1-getting-started) | Interact with the physical world with sensors and actuators | Learn about sensors to gather data from the physical world, and actuators to send feedback, whilst you build a nightlight | [Interact with the physical world with sensors and actuators](./1-getting-started/lessons/3-sensors-and-actuators/README.md) |
| 04 | [Getting started](./1-getting-started) | Connect your device to the Internet | Learn about how to connect an IoT device to the Internet to send and receive messages by connecting your nightlight to an MQTT broker | [Connect your device to the Internet](./1-getting-started/lessons/4-connect-internet/README.md) |
| 05 | [Farm](./2-farm) | Predict plant growth | Learn how to predict plant growth using temperature data captured by an IoT device | [Predict plant growth](./2-farm/lessons/1-predict-plant-growth/README.md) |
@ -91,3 +91,7 @@ We have two choices of IoT hardware to use for the projects depending on persona
## Offline access
You can run this documentation offline by using [Docsify](https://docsify.js.org/#/). Fork this repo, [install Docsify](https://docsify.js.org/#/quickstart) on your local machine, and then in the root folder of this repo, type `docsify serve`. The website will be served on port 3000 on your localhost: `localhost:3000`.
## Image attributions
You can find all the attributions for the images used in this curriculum, where required, in the [Attributions](./attributions.md) page.

@ -1,24 +1,10 @@
# Support
## How to file issues and get help
This project uses GitHub Issues to track bugs and feature requests. Please search the existing issues before filing new issues to avoid duplicates. For new issues, file your bug or feature request as a new Issue.
For help and questions about using this project, please contact us by raising an issue in this repo.
## Microsoft Support Policy

@ -0,0 +1,40 @@
# Image attributions
* Bananas by abderraouf omara from the [Noun Project](https://thenounproject.com)
* Brain by Icon Market from the [Noun Project](https://thenounproject.com)
* Broadcast by RomStu from the [Noun Project](https://thenounproject.com)
* Button by Dan Hetteix from the [Noun Project](https://thenounproject.com)
* C451B small-diaphragm condenser microphone by AKG Acoustics. [Harumphy](https://en.wikipedia.org/wiki/User:Harumphy) at [en.wikipedia](https://en.wikipedia.org/) / [Creative Commons Attribution-Share Alike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/deed.en)
* Calendar by Alice-vector from the [Noun Project](https://thenounproject.com)
* Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)
* chip by Astatine Lab from the [Noun Project](https://thenounproject.com)
* Cloud by Debi Alpa Nugraha from the [Noun Project](https://thenounproject.com)
* container by ProSymbols from the [Noun Project](https://thenounproject.com)
* CPU by Icon Lauk from the [Noun Project](https://thenounproject.com)
* database by Icons Bazaar from the [Noun Project](https://thenounproject.com)
* dial by Jamie Dickinson from the [Noun Project](https://thenounproject.com)
* GPS by mim studio from the [Noun Project](https://thenounproject.com)
* heater by Pascal Heß from the [Noun Project](https://thenounproject.com)
* Idea by Pause08 from the [Noun Project](https://thenounproject.com)
* IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)
* LED by abderraouf omara from the [Noun Project](https://thenounproject.com)
* ldr by Eucalyp from the [Noun Project](https://thenounproject.com)
* lightbulb by Maxim Kulikov from the [Noun Project](https://thenounproject.com)
* Microcontroller by Template from the [Noun Project](https://thenounproject.com)
* mobile phone by Alice-vector from the [Noun Project](https://thenounproject.com)
* motor by Bakunetsu Kaito from the [Noun Project](https://thenounproject.com)
* Patti Smith singing into a Shure SM58 (dynamic cardioid type) microphone. Beni Köhler / [Creative Commons Attribution-Share Alike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/deed.en)
* Plant by Alex Muravev from the [Noun Project](https://thenounproject.com)
* Plant Cell by Léa Lortal from the [Noun Project](https://thenounproject.com)
* probe by Adnen Kadri from the [Noun Project](https://thenounproject.com)
* ram by Atif Arshad from the [Noun Project](https://thenounproject.com)
* Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
* recording by Aybige Speaker from the [Noun Project](https://thenounproject.com)
* Satellite by Noura Mbarki from the [Noun Project](https://thenounproject.com)
* smart sensor by Andrei Yushchenko from the [Noun Project](https://thenounproject.com)
* Speaker by Gregor Cresnar from the [Noun Project](https://thenounproject.com)
* switch by Chattapat from the [Noun Project](https://thenounproject.com)
* Temperature by Vectors Market from the [Noun Project](https://thenounproject.com)
* tomato by parkjisun from the [Noun Project](https://thenounproject.com)
* Watering Can by Daria Moskvina from the [Noun Project](https://thenounproject.com)
* weather by Adrien Coquet from the [Noun Project](https://thenounproject.com)

@ -6,11 +6,12 @@ In the lessons for each project, you may have created some of the following:
* A Resource Group
* An IoT Hub
* IoT device registrations
* A Storage Account
* A Functions App
* An Azure Maps account
* A custom vision project
* An Azure Container Registry
* A cognitive services resource
Most of these resources will have no cost - either they are completely free, or you are using a free tier. For services that require a paid tier, you would have been using them at a level that is included in the free allowance, or will only cost a few cents.

@ -10,6 +10,15 @@ The specific hardware was chosen to reduce the complexity of the lessons and ass
You will also need a few non-technical items, such as soil or a pot plant, and fruit or vegetables.
## Buy the kits
![The Seeed studios logo](./images/seeed-logo.png)
Seeed Studios have very kindly made all the hardware available as easy-to-purchase kits:
* [IoT for beginners with Seeed and Microsoft - Wio Terminal Starter Kit](https://www.seeedstudio.com/IoT-for-beginners-with-Seeed-and-Microsoft-Wio-Terminal-Starter-Kit-p-5006.html)
* [IoT for beginners with Seeed and Microsoft - Raspberry Pi 4 Starter Kit](https://www.seeedstudio.com/IoT-for-beginners-with-Seeed-and-Microsoft-Raspberry-Pi-Starter-Kit.html)
## Arduino
All the device code for Arduino is in C++. To complete all the assignments you will need the following:
