Merge branch 'microsoft:main' into Comp-Files

pull/163/head
Mohammad Iftekher (Iftu) Ebne Jalal 4 years ago committed by GitHub
commit d5ec1dc998

@ -21,6 +21,7 @@
"mosquitto",
"photodiode",
"photodiodes",
"quickstart",
"sketchnote"
]
}

@ -78,8 +78,6 @@ A single-board computer is a small computing device that has all the elements of
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of on a memory chip built into the board.

@ -4,8 +4,6 @@ The [Raspberry Pi](https://raspberrypi.org) is a single-board computer. You can
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
## Setup
If you are using a Raspberry Pi as your IoT hardware, you have two choices - you can work through all these lessons and code directly on the Pi, or you can connect remotely to a 'headless' Pi and code from your computer.

@ -94,10 +94,6 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
<div dir="rtl">
The Raspberry Pi is one of the most popular single-board computers.

@ -79,8 +79,6 @@ IoT শব্দে **T** হলো **Things** - ‘থিংস’ বা জ
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of on a memory chip built into the board.

@ -81,10 +81,6 @@ IoT में **T** का अर्थ है **चीजें** - ऐसे
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of on a memory chip built into the board.

@ -78,8 +78,6 @@ Komputer papan tunggal adalah perangkat komputasi kecil yang memiliki semua elem
![Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of on a memory chip built into the board.

@ -78,8 +78,6 @@ IoT 的 **T** 代表 **Things**(物)—— 可以跟物质世界交互的设
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of on a memory chip built into the board.

@ -4,8 +4,6 @@
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
## Setup
If you are using a Raspberry Pi as your IoT hardware, you have two choices - you can work through all these lessons and code directly on the Pi, or you can connect remotely to a 'headless' Pi and code from your computer.

@ -26,16 +26,12 @@ The two components of an IoT application are the *Internet* and the *thing*. Let
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but sometimes consuming so little power that they can run for weeks, months or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example of this is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the temperature detected is outside the desired range. The temperature sensor detects that the room is too cold and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
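The sensor-to-actuator loop described above can be sketched in a few lines of Python. This is a minimal illustration only - the function name, temperatures, and hysteresis band are hypothetical, not part of the lessons:

```python
def thermostat_step(current_temp, target_temp, hysteresis=0.5):
    """Decide what the heater actuator should do for one sensor reading.

    The hysteresis band stops the heater from rapidly toggling on and
    off when the temperature hovers right around the target.
    """
    if current_temp < target_temp - hysteresis:
        return "heater_on"    # room too cold - actuator turns heating on
    if current_temp > target_temp + hysteresis:
        return "heater_off"   # warm enough - actuator turns heating off
    return "no_change"        # inside the comfort band - leave heater as-is

# Example: the sensor reads 17.5°C while the dial is set to 20°C
print(thermostat_step(17.5, 20.0))  # heater_on
```

The same shape of loop - read sensor, decide, drive actuator - recurs in most of the IoT devices discussed in this lesson.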
There is a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it, and actuators to interact with the world - for example using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
@ -52,14 +48,10 @@ With the example of a smart thermostat, the thermostat would connect using home
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices such as occupancy sensors that detect what rooms are in use, as well as data such as weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar you are on vacation, or turn off the heating on a room-by-room basis depending on what rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
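One way such cloud-side rules might combine calendar and occupancy data can be sketched as below. Everything here is an illustrative assumption - the function, its parameters, and the 7°C frost-guard setback are not from the lessons:

```python
def choose_target(base_target, on_vacation, room_occupied, frost_guard=7.0):
    """Pick a room's target temperature from cloud data.

    Calendar data (vacation) and occupancy sensors override the user's
    base target; the frost-guard setback keeps pipes from freezing
    rather than heating an empty room to full comfort.
    """
    if on_vacation or not room_occupied:
        return frost_guard   # nobody home - only prevent freezing
    return base_target       # room in use - heat to the dial setting

print(choose_target(20.0, on_vacation=False, room_occupied=True))  # 20.0
print(choose_target(20.0, on_vacation=True, room_occupied=True))   # 7.0
```

A real system would learn these rules from data over time rather than hard-coding them, but the decision structure is the same.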
✅ What other data could help make an Internet connected thermostat smarter?
### IoT on the Edge
@ -96,8 +88,6 @@ The faster the clock cycle, the more instructions that can be processed each sec
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal, for example, has a CPU that runs at 120MHz, or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple gigahertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio Terminal.
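The comparison is simple arithmetic. The 120MHz figure is from the text; the 3GHz desktop CPU is an assumed example value:

```python
WIO_TERMINAL_HZ = 120_000_000  # 120 MHz, per the lesson
PC_HZ = 3_000_000_000          # assume a 3 GHz desktop CPU core

# How many Wio Terminal clock ticks fit into one PC core's ticks per second
ratio = PC_HZ / WIO_TERMINAL_HZ
print(f"One 3 GHz core ticks {ratio:.0f}x faster than the Wio Terminal")
```

So a single core of a typical desktop CPU ticks roughly 25 times faster, before even counting multiple cores.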
@ -212,8 +202,6 @@ The [Raspberry Pi Foundation](https://www.raspberrypi.org) is a charity from the
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full-size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz, 2, 4, or 8GB of RAM, gigabit ethernet, WiFi, 2 HDMI ports supporting 4K screens, an audio and composite video output port, USB ports (2 USB 2.0, 2 USB 3.0), 40 GPIO pins, a camera connector for a Raspberry Pi camera module, and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm and is powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also a Pi400 all-in-one computer with a Pi4 built into a keyboard.

@ -32,16 +32,12 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but sometimes consuming so little power that they can run for weeks, months or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example of this is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the temperature detected is outside the desired range. The temperature sensor detects that the room is too cold and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There is a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it, and actuators to interact with the world - for example using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
@ -59,15 +55,11 @@
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices such as occupancy sensors that detect what rooms are in use, as well as data such as weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar you are on vacation, or turn off the heating on a room-by-room basis depending on what rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other data could help make an Internet connected thermostat smarter?
@ -114,8 +106,6 @@
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal, for example, has a CPU that runs at 120MHz, or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple gigahertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio Terminal.
@ -237,8 +227,6 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full-size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz, 2, 4, or 8GB of RAM, gigabit ethernet, WiFi, 2 HDMI ports supporting 4K screens, an audio and composite video output port, USB ports (2 USB 2.0, 2 USB 3.0), 40 GPIO pins, a camera connector for a Raspberry Pi camera module, and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm and is powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also a Pi400 all-in-one computer with a Pi4 built into a keyboard.

@ -1,7 +1,5 @@
# A deeper dive into IoT
![Embed a video here if available](video-url)
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/3)
@ -24,16 +22,12 @@
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg) ![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
**Things** in IoT refers to devices that can interact with the world around them. These devices are usually small, low-cost computers, running at low speeds and using little power. For example, simple microcontrollers run with kilobytes of RAM (where a PC has gigabytes) at just a few hundred megahertz (where a PC runs at gigahertz). By using so little power, these devices can sometimes run for weeks, months, or even years on batteries.

These devices interact with the world around us, either by gathering data using sensors, or by doing work through actuators that control outputs. A common example is a smart thermostat: a device that has a temperature sensor, a way to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system. The system switches on only when the temperature moves outside the user-set range. For example, the temperature sensor detects that the room is too cold, and an actuator then turns the heating on.

![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../images/basic-thermostat.png)

***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***

A wide range of things can act as IoT devices, from dedicated sensing hardware to general-purpose devices, even your smartphone! A smartphone uses sensors to gather data about the world around it, and actuators to interact with the real world: for example, a GPS sensor to detect your location, and a speaker to give you directions to a destination.

✅ Think of other systems around you that gather data from sensors and use it to make decisions. One example is the thermostat on an oven. Can you find more?
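The thermostat's decision logic described above can be sketched in a few lines. This is a minimal, illustrative sketch assuming a simple on/off heater with a small dead band; the function name and threshold values are hypothetical, not from any real device API.

```python
def thermostat_step(current_temp_c, target_temp_c, hysteresis_c=0.5):
    """Decide what the heater actuator should do for one sensor reading.

    Returns True to switch the heater on, False to switch it off, and
    None to leave it unchanged (reading is inside the dead band).
    """
    if current_temp_c < target_temp_c - hysteresis_c:
        return True   # room too cold: actuator switches the heating on
    if current_temp_c > target_temp_c + hysteresis_c:
        return False  # warm enough: heating off
    return None       # within tolerance: no change

print(thermostat_step(18.0, 21.0))  # → True  (too cold, heat on)
print(thermostat_step(23.0, 21.0))  # → False (too warm, heat off)
```

The dead band (`hysteresis_c`) keeps the heater from rapidly toggling when the temperature hovers right at the target.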
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../images/mobile-controlled-thermostat.png)

***An internet-connected thermostat controlled by a mobile app. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***

A more advanced version could decide what the temperature should be using data from sensors on other connected devices, such as occupancy sensors, together with other data such as the current weather or what is in your personal calendar. For example, if your calendar says you are away travelling, there is no need to heat your room in winter, and the IoT system can make that smart decision. Artificial intelligence models can also make decisions based on when and how you use each room, and these decisions become more accurate as data is gathered over time.

![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../images/smarter-thermostat.png)

***An internet-connected thermostat using sensors in multiple rooms, controlled by a mobile app and able to take intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***

✅ What other kinds of data could help make an internet-connected thermostat smarter?

### Edge IoT
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../images/fetch-decode-execute.png)

***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***

Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. For example, the Wio Terminal has a CPU that runs at 120MHz, or 120,000,000 cycles per second.

✅ An average PC or Mac has multiple cores running at gigahertz speeds, meaning they tick, or complete cycles, billions of times per second. Find out the clock speed of your computer and work out how many times faster it is than the Wio Terminal.
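The exercise above takes only a couple of lines to work through. The 3GHz figure for a desktop core is an assumption for illustration; substitute your own machine's clock speed.

```python
# Comparing an assumed 3 GHz desktop core with the Wio Terminal's 120 MHz CPU.
wio_terminal_hz = 120_000_000      # Wio Terminal: 120 MHz
desktop_core_hz = 3_000_000_000    # assumed typical desktop core: 3 GHz

speedup = desktop_core_hz / wio_terminal_hz
print(speedup)  # → 25.0: one 3 GHz core completes 25x more cycles per second
```

And that is per core: a quad-core 3GHz machine completes roughly 100 times as many cycles per second as the Wio Terminal.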
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)

***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***

The latest version of the full-sized Raspberry Pi is the Raspberry Pi 4B. It has a quad-core (4-core) CPU running at 1.5GHz; 2, 4, or 8GB of RAM; Gigabit Ethernet; WiFi; 2 HDMI ports supporting 4K screens; an audio and composite video output port; USB ports (2 USB 2.0, 2 USB 3.0); 40 GPIO pins; a camera connector for the Raspberry Pi camera module; and an SD card slot. All this on a board measuring 88mm x 58mm x 19.5mm, powered by a 3A USB-C power supply. Raspberry Pis start at US$35, far cheaper than a PC or Mac.

> 💁 There is an "all-in-one" computer called the Pi 400, with a Pi 4 built into a keyboard.

![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png)
***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.

> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
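The dial-to-voltage relationship described above is a simple linear mapping, which can be sketched as follows. This is illustrative arithmetic only, assuming a 5V supply and the 0-to-11 dial from the text; it is not a real hardware API.

```python
SUPPLY_VOLTS = 5.0  # voltage the IoT device sends into the potentiometer
DIAL_MAX = 11       # the dial "goes up to eleven"

def pot_output_volts(dial_position):
    """Linear mapping from dial position to output voltage: 0 -> 0V, 11 -> 5V."""
    return SUPPLY_VOLTS * (dial_position / DIAL_MAX)

print(pot_output_volts(0))   # → 0.0 (full off)
print(pot_output_volts(11))  # → 5.0 (full on)
```

An ADC would then sample this analog output voltage and convert it to a number the microcontroller can work with.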
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png)
***A button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices such as GPIO pins can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, it can only be 1 or 0.

> 💁 Voltages are never exact especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8V as a 1, below 1.8V as 0.
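The threshold behaviour in the note above can be sketched as follows, using the Raspberry Pi figures (3.3V logic, roughly a 1.8V threshold). `digital_read` here is a hypothetical helper for illustration, not the actual GPIO library call.

```python
THRESHOLD_VOLTS = 1.8  # approximate logic threshold on a 3.3V Raspberry Pi pin

def digital_read(measured_volts):
    """Turn a measured pin voltage into a digital 0 or 1, with tolerance."""
    return 1 if measured_volts > THRESHOLD_VOLTS else 0

print(digital_read(3.3))  # → 1 (button pressed: full voltage returned)
print(digital_read(0.2))  # → 0 (not pressed: near 0V, within tolerance)
```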
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera. This is a sensor that captures an image and sends it as digital data containing that image, usually in a compressed format such as JPEG, to be read by the IoT device. It can even stream video by capturing images and sending either the complete image frame by frame or a compressed video stream.

## What are actuators?
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-flow.png)

***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Like with sensors, the actual IoT device works on digital signals, not analog. This means to send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly, or on a connector board. This will convert the 0s and 1s from the IoT device to an analog voltage that the actuator can use.

✅ What do you think happens if the IoT device sends a higher voltage than the actuator can handle? ⛔️ DO NOT test this out.
![Pulse width modulation rotation of a motor at 150 RPM](../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means in one second you have 25 5V pulses of 0.02s that rotate the motor, each followed by 0.02s pause of 0V not rotating the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 ([revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute), a non-standard measure of rotational velocity).
![Pulse width modulation rotation of a motor at 75 RPM](../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
You can change the motor speed by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04s, with the on pulse halved to 0.01s, and the off pulse increasing to 0.03s. You have the same number of pulses per second (25), but each on pulse is half the length. A half length pulse only turns the motor one twentieth of a rotation, and at 25 pulses a second will complete 1.25 rotations per second or 75rpm. By changing the pulse speed of a digital signal you've halved the speed of an analog motor.
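The pulse arithmetic from the last two paragraphs can be checked with a short calculation. This assumes the same hypothetical motor as the text: a 40ms (0.04s) cycle, where a 20ms on-pulse turns the motor one tenth of a rotation (36 degrees).

```python
CYCLE_MS = 40            # one PWM cycle: 40 ms (0.04 s)
DEG_PER_ON_MS = 36 / 20  # 36 degrees (1/10 rotation) per 20 ms of on-time

def rpm(on_pulse_ms):
    """Motor speed in RPM for a given on-pulse length within the fixed cycle."""
    pulses_per_second = 1000 / CYCLE_MS                 # 25 pulses each second
    degrees_per_pulse = on_pulse_ms * DEG_PER_ON_MS     # rotation per pulse
    rotations_per_second = pulses_per_second * degrees_per_pulse / 360
    return rotations_per_second * 60

print(rpm(20))  # → 150.0 (full 20 ms on-pulse, as in the first example)
print(rpm(10))  # → 75.0  (on-pulse halved to 10 ms, half the speed)
```

Halving the on-pulse halves the rotation per pulse, and with the pulse count unchanged the speed halves too, matching the 150rpm and 75rpm figures above.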
![A LED is off at 0 volts and on at 5V](../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other simple 2-state actuators can you think of? One example is a solenoid, which is an electromagnet that can be activated to do things like move a door bolt locking/unlocking a door.

More advanced digital actuators, such as screens, require the digital data to be sent in certain formats. They usually come with libraries that make it easier to send the correct data to control them.

![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../../images/potentiometer.png)
***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to <a href="https://wikipedia.org/wiki/Up_to_eleven">11</a>, such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.

> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the <a href="https://wikipedia.org/wiki/Potentiometer">potentiometer Wikipedia page</a>
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../../images/button.png)
***A button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices such as GPIO pins can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, it can only be 1 or 0.

> 💁 Voltages are never exact, especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8V as a 1, and below 1.8V as a 0.
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera. This is a sensor that captures an image and sends it as digital data containing that image, usually in a compressed format such as JPEG, to be read by the IoT device. It can even stream video by capturing images and sending either the complete image frame by frame or a compressed video stream.

## What are actuators?
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../../images/assignment-1-flow.png)

***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***

* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
![A light dimmed at a low voltage and brighter at a higher voltage](../../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
As with sensors, the actual IoT device works on digital signals, not analog. This means that to send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly or on a connector board. This will convert the 0s and 1s from the IoT device to an analog voltage that the actuator can use.

✅ What do you think happens if the IoT device sends a higher voltage than the actuator can handle? ⛔️ DO NOT test this out.
![Pulse width modulation rotation of a motor at 150 RPM](../../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means in one second you have 25 5V pulses of 0.02s that rotate the motor, each followed by a 0.02s pause of 0V not rotating the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 <a href="https://wikipedia.org/wiki/Revolutions_per_minute">revolutions per minute</a>, a non-standard measure of rotational velocity.
![Pulse width modulation rotation of a motor at 75 RPM](../../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
You can change the motor speed by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04s, with the on pulse halved to 0.01s and the off pulse increased to 0.03s. You have the same number of pulses per second (25), but each on pulse is half the length. A half-length pulse only turns the motor one twentieth of a rotation, and at 25 pulses a second it will complete 1.25 rotations per second, or 75rpm. By changing the pulse width of a digital signal you've halved the speed of an analog motor.
![A LED is off at 0 volts and on at 5V](../../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other simple 2-state actuators can you think of? One example is a solenoid, which is an electromagnet that can be activated to do things like move a door bolt, locking or unlocking a door.

More advanced digital actuators, such as screens, require the digital data to be sent in certain formats. They usually come with libraries that make it easier to send the correct data to control them.

![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png)

***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***

The IoT device will send an electrical signal to the potentiometer at a certain voltage, such as 5V. As the potentiometer is adjusted, it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), like a volume knob. When the potentiometer is in the full off position (0) it will output 0V (0 volts), and when it is in the full on position (11) it will output 5V (5 volts).

> 🎓 This is a greatly simplified explanation; there is a detailed discussion of potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png) ![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png)
***বাটন । Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices, such as GPIO pins, can measure this signal directly as a 0 or 1. If the voltage sent and received is the same, the value is 1, otherwise the value is 0. There is no need to convert the signal, as the value can only be 1 or 0.
> 💁 The voltages never match exactly, especially since the components in a sensor have resistance, so there is some voltage variation. For example, the GPIO pins on a Raspberry Pi work at 3.3V, and in the return signal they treat a voltage above 1.8V as a 1, and below 1.8V as a 0.
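That thresholding behaviour can be sketched in a couple of lines. The 1.8V figure is the Raspberry Pi value mentioned above; other boards use different thresholds:

```python
def read_digital(voltage, threshold=1.8):
    """Interpret an analog voltage on a GPIO pin as a digital 0 or 1.

    Any voltage above the threshold reads as 1, anything below as 0.
    """
    return 1 if voltage > threshold else 0
```

For example, a 3.3V signal reads as 1, while a noisy 1.7V signal still reads as 0.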
@ -98,8 +94,6 @@
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png)
***ডিজিটাল তাপমাত্রা সেন্সর । Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sensors that send digital data are becoming more complex. Much more detailed data can be sent, and for secure sensors the data can even be encrypted. One example is the camera - a sensor that captures an image and sends it to the IoT device as digital data, usually in a compressed format such as JPEG. A camera can even stream video, either by sending the whole image frame by frame, or by sending a compressed video stream.
## What is an actuator?
@ -122,8 +116,6 @@
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED begin controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
@ -138,8 +130,6 @@
![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Like sensors, IoT devices work on digital signals, not analog. To send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly or on a connector board. The DAC converts the 0s and 1s from the IoT device to an analog voltage that the actuator can use.
✅ What do you think would happen if the IoT device sent a higher voltage than the actuator can tolerate? ⛔️ You should always avoid testing this in practice.
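A DAC's job can be modelled with simple arithmetic. This sketch assumes a hypothetical 10-bit DAC with a 3.3V reference; real parts vary in resolution and reference voltage:

```python
def dac_output(value, bits=10, reference_voltage=3.3):
    """Convert a digital value to the analog voltage a DAC would produce.

    A 10-bit DAC has 1,024 steps (0-1023), so each step is
    reference_voltage / 1023 volts.
    """
    max_value = (1 << bits) - 1  # 1023 for 10 bits
    if not 0 <= value <= max_value:
        raise ValueError("value out of range for this DAC")
    return reference_voltage * value / max_value
```

So the digital value 1023 would produce the full 3.3V, and 0 would produce 0V.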
@ -152,8 +142,6 @@
![Pulse width modulation rotation of a motor at 150 RPM](../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means 25 pulses are sent every second, with the motor turning for 0.02 seconds on each 5-volt signal and pausing for 0.02 seconds on each 0-volt signal. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. Using a digital signal, the motor has been rotated at 2.5 rotations per second, or 150 [revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute) (RPM).
```output
@ -165,8 +153,6 @@
![Pulse width modulation rotation of a motor at 75 RPM](../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The speed of the motor can be changed by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04 seconds, with the ON pulse lasting 0.01 seconds and the OFF pulse lasting 0.03 seconds. The number of pulses per second is the same (25), but the ON part of each pulse is now half the length. A half-length pulse only turns the motor one twentieth of a rotation, so at 25 pulses per second it completes 1.25 rotations per second, or 75 RPM. By changing the pulse timing of a digital signal, the speed of an analog motor has been halved.
```output
@ -188,8 +174,6 @@
![A LED is off at 0 volts and on at 5V](../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ Can you think of other actuators around you that have only 2 states? One example is a solenoid, an electromagnet that can be used to move a door bolt, locking or unlocking the door.
More advanced digital actuators, such as screens, need digital data to be sent in a specific format. They usually come with program libraries that make it easy to send the correct data to control them.

@ -31,8 +31,6 @@ There are a number of popular communication protocols used by IoT devices to com
![IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices.](../../../images/pub-sub.png)
***IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices. Broadcast by RomStu / Microcontroller by Template / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
MQTT is the most popular communication protocol for IoT devices and is covered in this lesson. Other protocols include AMQP and HTTP/HTTPS.
## Message Queueing Telemetry Transport (MQTT)
@ -43,8 +41,6 @@ MQTT has a single broker and multiple clients. All clients connect to the broker
![IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic](../../../images/mqtt.png)
***IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic. Microcontroller by Template / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
✅ Do some research. If you have a lot of IoT devices, how can you ensure your MQTT broker can handle all the messages?
### Connect your IoT device to MQTT
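MQTT routes messages by topic name, and subscriptions can use the `+` (one level) and `#` (all remaining levels) wildcards to match many topics at once. A simplified matcher, ignoring shared subscriptions and `$` system topics, might look like this:

```python
def topic_matches(subscription, topic):
    """Check whether an MQTT topic matches a subscription filter.

    '+' matches exactly one topic level, '#' matches this level and
    everything below it. Simplified sketch of the matching rules.
    """
    sub_parts = subscription.split("/")
    topic_parts = topic.split("/")
    for i, part in enumerate(sub_parts):
        if part == "#":
            return True  # matches all remaining levels
        if i >= len(topic_parts):
            return False  # filter is longer than the topic
        if part != "+" and part != topic_parts[i]:
            return False
    return len(sub_parts) == len(topic_parts)
```

A cloud service subscribed to `devices/+/telemetry` would receive messages published to `devices/d1/telemetry` and `devices/d2/telemetry`, but not `devices/d1/commands`.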
@ -67,8 +63,6 @@ Rather than dealing with the complexities of setting up an MQTT broker as part o
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-internet-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED begin controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
Follow the relevant step below to connect your device to the MQTT broker:
* [Arduino - Wio Terminal](wio-terminal-mqtt.md)
@ -106,8 +100,6 @@ Let's look back at the example of the smart thermostat from Lesson 1.
![An Internet connected thermostat using multiple room sensors](../../../images/telemetry.png)
***An Internet connected thermostat using multiple room sensors. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The thermostat has temperature sensors to gather telemetry. It would most likely have one temperature sensor built in, and it might connect to multiple external temperature sensors over a wireless protocol such as [Bluetooth Low Energy](https://wikipedia.org/wiki/Bluetooth_Low_Energy) (BLE).
An example of the telemetry data it would send could be:
@ -353,8 +345,6 @@ Commands are messages sent by the cloud to a device, instructing it to do someth
![An Internet connected thermostat receiving a command to turn on the heating](../../../images/commands.png)
***An Internet connected thermostat receiving a command to turn on the heating. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
A thermostat could receive a command from the cloud to turn the heating on. If, based on the telemetry data from all the sensors, the cloud service decides that the heating should be on, it sends the relevant command.
### Send commands to the MQTT broker
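The cloud-side decision could be as simple as comparing the average of the room sensor readings to the target temperature. This is purely illustrative logic, not the actual algorithm any particular service uses:

```python
def decide_command(sensor_temperatures, target_temperature):
    """Decide which command to send based on telemetry.

    Hypothetical cloud-side logic: average all room sensor readings
    and turn the heating on if the average is below the target.
    """
    average = sum(sensor_temperatures) / len(sensor_temperatures)
    return "heating_on" if average < target_temperature else "heating_off"
```

With readings of 18°C and 19°C and a target of 20°C, this would send a `heating_on` command.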

@ -4,9 +4,7 @@ As the population grows, so does the demand on agriculture. The amount of land a
In these 6 lessons you'll learn how to apply the Internet of Things to improve and automate farming.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
**Add video of automated plant**
## Topics

@ -130,8 +130,6 @@ By gathering temperature data using an IoT device, a farmer can automatically be
![Telemetry data is sent to a server and then saved to a database](../../../images/save-telemetry-database.png)
***Telemetry data is sent to a server and then saved to a database. database by Icons Bazaar - from the [Noun Project](https://thenounproject.com)***
The server code can also augment the data by adding extra information. For example, the IoT device can publish an identifier to indicate which device it is, and the server code can use this to look up the location of the device and what crops it is monitoring. It can also add basic data like the current time, as some IoT devices don't have the necessary hardware to keep track of an accurate time, or require additional code to read the current time over the Internet.
✅ Why do you think different fields might have different temperatures?
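A sketch of that augmentation step, using a hypothetical in-memory device registry in place of the database lookup a real server would do:

```python
from datetime import datetime, timezone

# Hypothetical lookup table mapping device identifiers to metadata;
# a real server would query a database or device registry service.
DEVICE_REGISTRY = {
    "device-001": {"location": "north field", "crop": "strawberries"},
}

def augment_telemetry(message):
    """Add server-side information to a telemetry message.

    Looks up the device's location and crop from its identifier and
    stamps the message with the current UTC time.
    """
    enriched = dict(message)
    enriched.update(DEVICE_REGISTRY.get(message["device_id"], {}))
    enriched["received_at"] = datetime.now(timezone.utc).isoformat()
    return enriched
```

The device only needs to publish its identifier and the raw reading; everything else is filled in server-side.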

@ -29,8 +29,6 @@ Plants require water to grow. They absorb water throughout the entire plant, wit
![Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure](../../../images/transpiration.png)
***Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure. Plant by Alex Muravev / Plant Cell by Léa Lortal - all from the [Noun Project](https://thenounproject.com)***
✅ Do some research: how much water is lost through transpiration?
The root system provides water from moisture in the soil where the plant grows. Too little water in the soil and the plant cannot absorb enough to grow, too much water and roots cannot absorb enough oxygen needed to function. This leads to roots dying and the plant unable to get enough nutrients to survive.
@ -79,14 +77,10 @@ You can use GPIO pins directly with some digital sensors and actuators when you
![A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1](../../../images/button-with-digital.png)
***A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
* LED. You can connect an LED between an output pin and a ground pin (using a resistor, otherwise you'll burn out the LED). From code you can set the output pin to high and it will send 3.3V, making a circuit from the 3.3V pin, through the LED, to the ground pin. This will light the LED.
![An LED is sent a signal of 1 (3.3V), which lights the LED. If it is sent 0 (0V), the LED is not lit.](../../../images/led-digital-control.png)
***An LED is sent a signal of 0 (3.3V), which lights the LED. If it is sent 0 (0v), the LED is not lit. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
For more advanced sensors, you can use GPIO pins to send and receive digital data directly with digital sensors and actuators, or via controller boards with ADCs and DACs to talk to analog sensors and actuators.
> 💁 If you are using a Raspberry Pi for these labs, the Grove Base Hat has hardware to convert analog sensor signals to digital to send over GPIO.
@ -101,8 +95,6 @@ For example, on a 3.3V board, if the sensor returns 3.3V, the value returned wou
![A soil moisture sensor sent 3.3V and returning 1.65V, or a reading of 511](../../../images/analog-sensor-voltage.png)
***A soil moisture sensor sent 3.3V and returning 1.65v, or a reading of 511. probe by Adnen Kadri / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
> 💁 Back in nightlight - lesson 3, the light sensor returned a value from 0-1,023. If you are using a Wio Terminal, the sensor was connected to an analog pin. If you are using a Raspberry Pi, then the sensor was connected to an analog pin on the base hat that has an integrated ADC to communicate over the GPIO pins. The virtual device was set to send a value from 0-1,023 to simulate an analog pin.
Soil moisture sensors rely on voltages, so will use analog pins and give values from 0-1,023.
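Converting one of those 0-1,023 readings back to a voltage is simple proportional arithmetic. This assumes a 3.3V board and a 10-bit ADC, as in the example above:

```python
def reading_to_voltage(reading, reference_voltage=3.3, max_reading=1023):
    """Convert a 10-bit ADC reading to the voltage the sensor returned.

    The reading is a fraction of the full scale: 0 is 0V, 1023 is the
    full reference voltage.
    """
    return reference_voltage * reading / max_reading
```

A reading of 511 works out to roughly 1.65V, matching the soil moisture example in the image above.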
@ -126,8 +118,6 @@ I<sup>2</sup>C has a bus made of 2 main wires, along with 2 power wires:
![I2C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire](../../../images/i2c.png)
***I<sup>2</sup>C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire. Microcontroller by Template / LED by abderraouf omara / ldr by Eucalyp - all from the [Noun Project](https://thenounproject.com)***
To send data, one device will issue a start condition to show it is ready to send data. It will then become the controller. The controller then sends the address of the device that it wants to communicate with, along with if it wants to read or write data. After the data has been transmitted, the controller sends a stop condition to indicate that it has finished. After this another device can become the controller and send or receive data.
I<sup>2</sup>C has speed limits, with 3 different modes running at fixed speeds. The fastest is High Speed mode with a maximum speed of 3.4Mbps (megabits per second), though very few devices support that speed. The Raspberry Pi for example, is limited to fast mode at 400Kbps (kilobits per second). Standard mode runs at 100Kbps.
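Those speed limits translate directly into transfer times. A rough estimate, ignoring addressing, ACK bits and start/stop conditions (so real transfers take somewhat longer):

```python
def transfer_time_seconds(num_bytes, bus_speed_bps):
    """Estimate the time to move a payload over an I2C bus.

    Rough sketch: payload bits divided by bus speed in bits per second,
    ignoring protocol overhead.
    """
    return (num_bytes * 8) / bus_speed_bps

# 1KB payload: standard mode (100Kbps) vs fast mode (400Kbps)
standard = transfer_time_seconds(1024, 100_000)
fast = transfer_time_seconds(1024, 400_000)
```

Fast mode moves the same payload in a quarter of the time standard mode takes.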
@ -143,8 +133,6 @@ UART involves physical circuitry that allows two devices to communicate. Each de
![UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa](../../../images/uart.png)
***UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa. chip by Astatine Lab - all from the [Noun Project](https://thenounproject.com)***
> 🎓 The data is sent one bit at a time, and this is known as *serial* communication. Most operating systems and microcontrollers have *serial ports*, that is, connections available to your code that can send and receive serial data.
UART devices have a [baud rate](https://wikipedia.org/wiki/Symbol_rate) (also known as Symbol rate), which is the speed that data will be sent and received in bits per second. A common baud rate is 9,600, meaning 9,600 bits (0s and 1s) of data are sent each second.
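Baud rate counts every bit on the wire, including framing overhead, so byte throughput is lower. Assuming the common 8N1 framing (1 start bit, 8 data bits, 1 stop bit, so 10 bits per byte):

```python
def bytes_per_second(baud_rate, data_bits=8, start_bits=1, stop_bits=1):
    """Effective byte throughput of a UART link.

    Assumes 8N1-style framing: each byte costs
    start + data + stop bits on the wire.
    """
    bits_per_byte = start_bits + data_bits + stop_bits
    return baud_rate / bits_per_byte
```

At 9,600 baud with 8N1 framing that is 960 bytes per second, not 1,200.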
@ -174,8 +162,6 @@ SPI controllers use 3 wires, along with 1 extra wire per peripheral. Peripherals
![SPI with one controller and two peripherals](../../../images/spi.png)
***SPI with on controller and two peripherals. chip by Astatine Lab - all from the [Noun Project](https://thenounproject.com)***
The CS wire is used to activate one peripheral at a time, communicating over the COPI and CIPO wires. When the controller needs to change peripheral, it deactivates the CS wire connected to the currently active peripheral, then activates the wire connected to the peripheral it wants to communicate with next.
SPI is *full-duplex*, meaning the controller can send and receive data at the same time from the same peripheral using the COPI and CIPO wires. SPI uses a clock signal on the SCLK wire to keep the devices in sync, so unlike sending directly over UART it doesn't need start and stop bits.
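The full-duplex exchange can be pictured as two shift registers swapping their contents, one bit per clock tick. A simplified simulation, assuming mode 0, most significant bit first, and no chip-select handling:

```python
def spi_exchange(controller_byte, peripheral_byte, bits=8):
    """Simulate one full-duplex SPI transfer.

    On each clock tick both ends shift one bit out and one bit in,
    so after 8 ticks the two bytes have swapped places.
    """
    mask = (1 << bits) - 1
    c, p = controller_byte, peripheral_byte
    for _ in range(bits):
        c_out = (c >> (bits - 1)) & 1  # bit leaving the controller (COPI)
        p_out = (p >> (bits - 1)) & 1  # bit leaving the peripheral (CIPO)
        c = ((c << 1) & mask) | p_out
        p = ((p << 1) & mask) | c_out
    return c, p  # each side now holds the other's original byte
```

This is why SPI reads often work by sending dummy bytes: clocking data out is the only way to clock data in.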

@ -26,8 +26,6 @@ The solution to this is to have a pump connected to an external power supply, an
![A light switch turns power on to a light](../../../images/light-switch.png)
***A light switch turns power on to a light. switch by Chattapat / lightbulb by Maxim Kulikov - all from the [Noun Project](https://thenounproject.com)***
> 🎓 [Mains electricity](https://wikipedia.org/wiki/Mains_electricity) refers to the electricity delivered to homes and businesses through national infrastructure in many parts of the world.
✅ IoT devices can usually provide 3.3V or 5V, at less than 1 amp (1A) of current. Compare this to mains electricity which is most often at 230V (120V in North America and 100V in Japan), and can provide power for devices that draw 30A.
@ -42,14 +40,10 @@ A relay is an electromechanical switch that converts an electrical signal into a
![When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit](../../../images/relay-on.png)
***When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit. lightbulb by Maxim Kulikov - from the [Noun Project](https://thenounproject.com)***
In a relay, a control circuit powers the electromagnet. When the electromagnet is on, it pulls a lever that moves a switch, closing a pair of contacts and completing an output circuit.
![When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit](../../../images/relay-off.png)
***When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit. lightbulb by Maxim Kulikov - from the [Noun Project](https://thenounproject.com)***
When the control circuit is off, the electromagnet turns off, releasing the lever and opening the contacts, turning off the output circuit. Relays are digital actuators - a high signal to the relay turns it on, a low signal turns it off.
The output circuit can be used to power additional hardware, like an irrigation system. The IoT device can turn the relay on, completing the output circuit that powers the irrigation system, and plants get watered. The IoT device can then turn the relay off, cutting the power to the irrigation system, turning the water off.
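That on/off behaviour makes a relay easy to model in code. A minimal sketch, with a plain Python object standing in for the GPIO write that real hardware would need:

```python
class Relay:
    """Minimal model of a relay as a digital actuator.

    A high signal (1) closes the output circuit, a low signal (0)
    opens it. On real hardware set_signal would write to a GPIO pin.
    """
    def __init__(self):
        self.output_circuit_on = False

    def set_signal(self, signal):
        self.output_circuit_on = signal == 1

relay = Relay()
relay.set_signal(1)  # IoT device turns the relay on: irrigation powered
watering = relay.output_circuit_on
relay.set_signal(0)  # relay off: power to the irrigation system is cut
```

The IoT device only ever deals with the low-power control signal; the relay's output circuit carries the pump's power.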
@ -128,8 +122,6 @@ If you did the last lesson on soil moisture using a physical sensor, you would h
![A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil](../../../images/soil-moisture-travel.png)
***A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil. Plant by Alex Muravev / Watering Can by Daria Moskvina - all from the [Noun Project](https://thenounproject.com)***
In the diagram above, a soil moisture reading shows 658. The plant is watered, but this reading doesn't change immediately, as the water has yet to reach the sensor. Watering can even finish before the water reaches the sensor and the value drops to reflect the new moisture level.
If you were writing code to control an irrigation system via a relay based on soil moisture levels, you would need to take this delay into consideration and build smarter timing into your IoT device.
@ -156,8 +148,6 @@ For example, I have a strawberry plant with a soil moisture sensor and a pump co
![Step 1, take measurement. Step 2, add water. Step 3, wait for water to soak through the soil. Step 4, retake measurement](../../../images/soil-moisture-delay.png) ![Step 1, take measurement. Step 2, add water. Step 3, wait for water to soak through the soil. Step 4, retake measurement](../../../images/soil-moisture-delay.png)
***Measure, add water, wait, remeasure. Plant by Alex Muravev / Watering Can by Daria Moskvina - all from the [Noun Project](https://thenounproject.com)***
This means the best process would be a watering cycle that is something like: This means the best process would be a watering cycle that is something like:
* Turn on the pump for 5 seconds * Turn on the pump for 5 seconds
@ -203,10 +193,11 @@ Update your server code to run the relay for 5 seconds, then wait 20 seconds.
1. Open the `app.py` file 1. Open the `app.py` file
1. Add the following code to the `app.py` file below the existing imports: 1. Add the following code to the `app.py` file below the existing imports:
```python ```python
import threading import threading
``` ```
This statement imports `threading` from Python libraries, threading allows python to execute other code while waiting. This statement imports `threading` from Python libraries, threading allows python to execute other code while waiting.
1. Add the following code before the `handle_telemetry` function that handles telemetry messages received by the server code: 1. Add the following code before the `handle_telemetry` function that handles telemetry messages received by the server code:
@ -274,7 +265,7 @@ Update your server code to run the relay for 5 seconds, then wait 20 seconds.
``` ```
A good way to test this in a simulated irrigation system is to use dry soil, then pour water in manually whilst the relay is on, stopping pouring when the relay turns off. A good way to test this in a simulated irrigation system is to use dry soil, then pour water in manually whilst the relay is on, stopping pouring when the relay turns off.
> 💁 You can find this code in the [code-timing](./code-timing) folder. > 💁 You can find this code in the [code-timing](./code-timing) folder.
> 💁 If you want to use a pump to build a real irrigation system, then you can use a [6V water pump](https://www.seeedstudio.com/6V-Mini-Water-Pump-p-1945.html) with a [USB terminal power supply](https://www.adafruit.com/product/3628). Make sure the power to or from the pump is connected via the relay. > 💁 If you want to use a pump to build a real irrigation system, then you can use a [6V water pump](https://www.seeedstudio.com/6V-Mini-Water-Pump-p-1945.html) with a [USB terminal power supply](https://www.adafruit.com/product/3628). Make sure the power to or from the pump is connected via the relay.
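The timed cycle described above — run the relay for 5 seconds, then wait before trusting the sensor again — can be sketched with `threading.Timer`. This is a minimal illustration, not the lesson's actual server code; the `relay_state` dictionary is a hypothetical stand-in for your real relay hardware calls.

```python
import threading

# Hypothetical stand-in for the real relay hardware
relay_state = {"on": False}

def relay_on():
    relay_state["on"] = True   # replace with your actual relay call

def relay_off():
    relay_state["on"] = False  # replace with your actual relay call

def start_watering_cycle(run_time=5, soak_time=20):
    """Turn the relay on and schedule it off after run_time seconds.

    Returns the total delay (run + soak) after which soil moisture
    readings can be trusted again.
    """
    relay_on()
    # Timer fires relay_off on a background thread, so the server
    # keeps handling telemetry while the pump runs
    threading.Timer(run_time, relay_off).start()
    return run_time + soak_time
```

Because the timer runs on a background thread, the MQTT client's message loop is not blocked while the pump is on.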

@@ -102,14 +102,10 @@ IoT devices connect to a cloud service either using a device SDK (a library that

![Devices connect to a service using a device SDK. Server code also connects to the service via an SDK](../../../images/iot-service-connectivity.png)

***Devices connect to a service using a device SDK. Server code also connects to the service via an SDK. Microcontroller by Template / Cloud by Debi Alpa Nugraha / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***

Your device then communicates with other parts of your application over this service - similar to how you sent telemetry and received commands over MQTT. This usually uses a service SDK or a similar library. Messages come from your device to the service where other components of your application can then read them, and messages can then be sent back to your device.

![Devices without a valid secret key cannot connect to the IoT service](../../../images/iot-service-allowed-denied-connection.png)

***Devices without a valid secret key cannot connect to the IoT service. Microcontroller by Template / Cloud by Debi Alpa Nugraha / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***

These services implement security by knowing about all the devices that can connect and send data, either by having the devices pre-registered with the service, or by giving the devices secret keys or certificates they can use to register themselves with the service the first time they connect. Unknown devices are unable to connect; if they try, the service rejects the connection and ignores messages sent by them.

✅ Do some research: What is the downside of having an open IoT service where any device or code can connect? Can you find specific examples of hackers taking advantage of this?
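The pre-registration idea above can be sketched as a key check. This is a toy in-memory registry for illustration only — a real service like IoT Hub stores and manages device identities and keys for you:

```python
import hmac

# Toy registry of pre-registered devices and their secret keys.
# A real IoT service manages these for you.
REGISTERED_DEVICES = {
    "soil-sensor-01": b"a-very-secret-key",
}

def accept_connection(device_id: str, presented_key: bytes) -> bool:
    """Reject any device that is unknown or presents the wrong key."""
    known_key = REGISTERED_DEVICES.get(device_id)
    if known_key is None:
        return False  # unknown device: connection rejected
    # compare_digest avoids leaking key contents via timing differences
    return hmac.compare_digest(known_key, presented_key)
```

A device with the right ID but the wrong key is rejected just like a completely unknown device.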

@@ -22,14 +22,10 @@ Serverless, or serverless computing, involves creating small blocks of code that

![Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run](../../../images/iot-messages-to-serverless.png)

***Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run. IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)***

> 💁 If you've used database triggers before, you can think of this as the same thing: code being triggered by an event such as inserting a row.

![When many events are sent at the same time, the serverless service scales up to run them all at the same time](../../../images/serverless-scaling.png)

***When many events are sent at the same time, the serverless service scales up to run them all at the same time. IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)***

Your code is only run when the event happens; there is nothing keeping your code alive at other times. The event happens, your code is loaded and run. This makes serverless very scalable - if many events happen at the same time, the cloud provider can run your function as many times as you need at the same time across whatever servers they have available. The downside to this is that if you need to share information between events, you need to save it somewhere like a database rather than storing it in memory.

Your code is written as a function that takes details about the event as a parameter. You can use a wide range of programming languages to write these serverless functions.
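The "function that takes the event as a parameter" shape can be sketched as below. This is a simplified stand-in, not the real Azure Functions binding (which passes a richer event object); `handle_telemetry_event` is a hypothetical name:

```python
import json

def handle_telemetry_event(event_body: bytes) -> dict:
    """Hypothetical serverless handler: invoked once per event,
    with the event's details passed in as a parameter.

    No state survives between invocations - anything worth keeping
    must be written to external storage such as a database.
    """
    telemetry = json.loads(event_body)
    # Minimal processing: pull out the reading this event carries
    return {"soil_moisture": telemetry["soil_moisture"]}
```

If a thousand events arrive at once, the platform can run a thousand copies of this function in parallel, which is why it must not rely on in-memory state.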

@@ -50,14 +50,10 @@ When a device connects to an IoT service, it uses an ID to identify itself. The

![Both valid and malicious devices could use the same ID to send telemetry](../../../images/iot-device-and-hacked-device-connecting.png)

***Both valid and malicious devices could use the same ID to send telemetry. Microcontroller by Template / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***

The way around this is to convert the data being sent into a scrambled format, using some kind of value to scramble the data that is known only to the device and the cloud. This process is called *encryption*, and the value used to encrypt the data is called an *encryption key*.

![If encryption is used, then only encrypted messages will be accepted, others will be rejected](../../../images/iot-device-and-hacked-device-connecting-encryption.png)

***If encryption is used, then only encrypted messages will be accepted, others will be rejected. Microcontroller by Template / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***

The cloud service can then convert the data back to a readable format, using a process called *decryption*, using either the same encryption key, or a *decryption key*. If the encrypted message cannot be decrypted by the key, the device has been hacked and the message is rejected.

The technique for doing encryption and decryption is called *cryptography*.
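The encrypt/decrypt round trip can be illustrated with a toy XOR cipher. This is strictly illustrative — XOR with a repeating key is not secure, and real IoT services use vetted cryptography such as TLS. The point is only that the same shared key both scrambles and unscrambles the data, and a wrong key yields garbage:

```python
from itertools import cycle

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.
    Applying it twice with the same key restores the original.
    NOT secure - for illustration only."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

message = b'{"soil_moisture": 658}'
key = b"shared-secret"

scrambled = xor_crypt(message, key)           # what travels over the network
restored = xor_crypt(scrambled, key)          # cloud side, same shared key
garbage = xor_crypt(scrambled, b"wrong-key")  # attacker without the key
```

Because encryption and decryption here use the same key, this is an example of *symmetric-key* cryptography, the simpler of the two families discussed in this lesson.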
@@ -160,8 +156,6 @@ When using X.509 certificates, both the sender and the recipient will have their

![Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it.](../../../images/send-message-certificate.png)

***Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it. Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)***

One big advantage of using X.509 certificates is that they can be shared between devices. You can create one certificate, upload it to IoT Hub, and use this for all your devices. Each device then just needs to know the private key to decrypt the messages it receives from IoT Hub.

The certificate used by your device to encrypt messages it sends to the IoT Hub is published by Microsoft. It is the same certificate that a lot of Azure services use, and is sometimes built into the SDKs.

@@ -110,15 +110,13 @@ GPS systems work by having a number of satellites that send a signal with each s

![By knowing the distance from the sensor to multiple satellites, the location can be calculated](../../../images/gps-satellites.png)

***By knowing the distance from the sensor to multiple satellites, the location be calculated. Satellite by Noura Mbarki from the [Noun Project](https://thenounproject.com)***

GPS satellites are circling the Earth, not at a fixed point above the sensor, so location data includes altitude above sea level as well as latitude and longitude.

GPS used to have limitations on accuracy enforced by the US military, limiting accuracy to around 5 meters. This limitation was removed in 2000, allowing an accuracy of 30 centimeters. Getting this accuracy is not always possible due to interference with the signals.

✅ If you have a smartphone, launch the mapping app and see how accurate your location is. It may take a short period of time for your phone to detect multiple satellites to get a more accurate location.

> 💁 The satellites contain atomic clocks that are incredibly accurate, but they drift by 38 microseconds (0.0000038 seconds) a day compared to atomic clocks on Earth, as predicted by Einstein's theories of special and general relativity: their speed slows their clocks by about 7 microseconds a day, while the weaker gravity they experience speeds their clocks up by about 45 microseconds a day, for a net gain of 38 microseconds. This drift has been used to prove the predictions of special and general relativity, and has to be adjusted for in the design of GPS systems. Time literally runs at a different rate on a GPS satellite.

GPS systems have been developed and deployed by a number of countries and political unions including the US, Russia, Japan, India, the EU, and China. Modern GPS sensors can connect to most of these systems to get faster and more accurate fixes.
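The distance-from-timing idea above can be sketched numerically: the receiver multiplies the signal's travel time by the speed of light. A simplified example (real receivers also solve for their own clock error, which is why at least four satellites are needed):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_to_satellite(travel_time_seconds: float) -> float:
    """Distance implied by how long the satellite's signal took to arrive."""
    return SPEED_OF_LIGHT * travel_time_seconds

# A signal from a GPS satellite roughly 20,200 km overhead takes
# about 67 milliseconds to reach the receiver
d = distance_to_satellite(0.0674)  # about 20,200 km
```

Because a microsecond of timing error corresponds to roughly 300 meters of distance error, this is why the satellites' atomic-clock drift mentioned above has to be corrected for.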

@@ -87,8 +87,6 @@ In the last lesson you captured GPS data from a GPS sensor connected to your IoT

![Sending GPS telemetry from an IoT device to IoT Hub](../../../images/gps-telemetry-iot-hub.png)

***Sending GPS telemetry from an IoT device to IoT Hub. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***

### Task - send GPS data to an IoT Hub

1. Create a new IoT Hub using the free tier.
@@ -171,8 +169,6 @@ Once data is flowing into your IoT Hub, you can write some serverless code to li

![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger](../../../images/gps-telemetry-iot-hub-functions.png)

***Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***

### Task - handle GPS events using serverless code

1. Create an Azure Functions app using the Azure Functions CLI. Use the Python runtime, and create it in a folder called `gps-trigger`, and use the same name for the Functions App project name. Make sure you create a virtual environment to use for this.
@@ -236,8 +232,6 @@ In this lesson, you will use the Python SDK to see how to interact with blob sto

![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage](../../../images/save-telemetry-to-storage-from-functions.png)

***Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***

The data will be saved as a JSON blob with the following format:

```json
@@ -372,7 +366,6 @@ The data will be saved as a JSON blob with the following format:

> 💁 Make sure you are not running the IoT Hub event monitor at the same time.

> 💁 You can find this code in the [code/functions](code/functions) folder.

### Task - verify the uploaded blobs

@@ -36,8 +36,6 @@ The rise of automated harvesting moved the sorting of produce from the harvest t

![If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever](../../../images/optical-tomato-sorting.png)

***If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever. tomato by parkjisun from the Noun Project - from the [Noun Project](https://thenounproject.com)***

The next evolution was to use machines to sort, either built into the harvester, or in the processing plants. The first generation of these machines used optical sensors to detect colors, controlling actuators to push green tomatoes into a waste bin using levers or puffs of air, leaving red tomatoes to continue on a network of conveyor belts.

The video below shows one of these machines in action.
@@ -122,7 +120,7 @@ To use Custom Vision, you first need to create two cognitive services resources

    Replace `<location>` with the location you used when creating the Resource Group.

    This will create a Custom Vision training resource in your Resource Group. It will be called `fruit-quality-detector-training` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services.

    > 💁 Use `S0` sku if you already have a free account using any of the Cognitive Services.

1. Use the following command to create a free Custom Vision prediction resource:
@@ -144,7 +142,7 @@ To use Custom Vision, you first need to create two cognitive services resources

1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai), and sign in with the Microsoft account you used for your Azure account.

1. Follow the [create a new Project section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#create-a-new-project) to create a new Custom Vision project. The UI may change and these docs are always the most up to date reference.

    Call your project `fruit-quality-detector`.
@@ -178,11 +176,11 @@ Image classifiers run at very low resolution. For example Custom Vision can take

    If you don't have both ripe and unripe fruit, you can use different fruits, or any two objects you have available. You can also find some example images in the [images](./images) folder of ripe and unripe bananas that you can use.

1. Follow the [upload and tag images section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#upload-and-tag-images) to upload your training images. Tag the ripe fruit as `ripe`, and the unripe fruit as `unripe`.

    ![The upload dialogs showing the upload of ripe and unripe banana pictures](../../../images/image-upload-bananas.png)

1. Follow the [train the classifier section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#train-the-classifier) to train the image classifier on your uploaded images.

    You will be given a choice of training type. Select **Quick Training**.
@@ -196,7 +194,7 @@ Once your classifier is trained, you can test it by giving it a new image to cla

### Task - test your image classifier

1. Follow the [test your model documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#test-your-model) to test your image classifier. Use the testing images you created earlier, not any of the images you used for training.

    ![An unripe banana predicted as unripe with a 98.9% probability, ripe with a 1.1% probability](../../../images/banana-unripe-quick-test-prediction.png)
@@ -210,7 +208,7 @@ Every time you make a prediction using the quick test option, the image and resu

### Task - retrain your image classifier

1. Follow the [use the predicted image for training documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#use-the-predicted-image-for-training) to retrain your model, using the correct tag for each image.

1. Once your model has been retrained, test on new images.
@@ -228,8 +226,8 @@ Try it out and see what the predictions are. You can find images to try with usi

## Review & Self Study

* When you trained your classifier, you would have seen values for *Precision*, *Recall*, and *AP* that rate the model that was created. Read up on what these values are using [the evaluate the classifier section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#evaluate-the-classifier)
* Read up on how to improve your classifier from the [how to improve your Custom Vision model on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-improving-your-classifier?WT.mc_id=academic-17441-jabenn)

## Assignment

@@ -1,7 +1,5 @@

# Check fruit quality from an IoT device

![Embed a video here if available](video-url)

## Pre-lecture quiz

[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/31)
@@ -149,7 +147,7 @@ If you were to create a production device to sell to farms or factories, how wou

You trained your custom vision model using the portal. This relies on having images available - and in the real world you may not be able to get training data that matches what the camera on your device captures. You can work around this by training directly from your device using the training API, to train a model using images captured from your IoT device.

* Read up on the training API in the [using the Custom Vision SDK quick start](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/quickstarts/image-classification?tabs=visual-studio&pivots=programming-language-python&WT.mc_id=academic-17441-jabenn)

## Assignment

@@ -1,6 +1,6 @@

# Classify an image - Virtual IoT Hardware and Raspberry Pi

In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it.

## Send images to Custom Vision
@@ -25,7 +25,7 @@ The Custom Vision service has a Python SDK you can use to classify images.

    This brings in some modules from the Custom Vision libraries, one to authenticate with the prediction key, and one to provide a prediction client class that can call Custom Vision.

1. Add the following code to the end of the file:

    ```python
    prediction_url = '<prediction_url>'
![A banana in custom vision predicted ripe at 56.8% and unripe at 43.1%](../../../images/custom-vision-banana-prediction.png)

> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-iot-device](code-classify/virtual-iot-device) folder.

😀 Your fruit quality classifier program was a success!
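The prediction URL you copy from the Custom Vision portal packs several pieces of information into one string: the endpoint, the project ID, and the published iteration name. As a rough sketch (the URL layout here is an assumption based on the standard Custom Vision prediction URL format, and the resource name and GUID below are made up), you can pull those parts out in Python:

```python
# Split a Custom Vision prediction URL into its parts.
# Assumed layout:
#   https://<endpoint>/customvision/v3.0/Prediction/<project-id>/classify/iterations/<iteration>/image
def parse_prediction_url(url):
    parts = url.split('/')
    return {
        'endpoint': '/'.join(parts[:3]),                     # scheme + host
        'project_id': parts[parts.index('Prediction') + 1],  # project GUID
        'iteration': parts[parts.index('iterations') + 1],   # published iteration name
    }

# Hypothetical example URL - your real one comes from the Custom Vision portal
url = ('https://myresource.cognitiveservices.azure.com/customvision/v3.0/'
       'Prediction/00000000-0000-0000-0000-000000000000/classify/iterations/'
       'Iteration2/image')
print(parse_prediction_url(url)['iteration'])
```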

> 💁 You can capture the image directly to a file instead of a `BytesIO` object by passing the file name to the `camera.capture` call. The reason for using the `BytesIO` object is so that later in this lesson you can send the image to your image classifier.

1. Configure the image that the camera in CounterFit will capture. You can either set the *Source* to *File*, then upload an image file, or set the *Source* to *WebCam*, and images will be captured from your web cam. Make sure you select the **Set** button after selecting a picture or selecting your webcam.

![CounterFit with a file set as the image source, and a web cam set showing a person holding a banana in a preview of the webcam](../../../images/counterfit-camera-options.png)

The camera you'll use is an ArduCam Mini 2MP Plus.
## Connect the camera

The ArduCam doesn't have a Grove socket, instead it connects to both the SPI and I<sup>2</sup>C buses via the GPIO pins on the Wio Terminal.

### Task - connect the camera

# Classify an image - Wio Terminal

In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it.

## Classify an image

The Custom Vision service has a REST API you can call from the Wio Terminal to classify images. This REST API is accessed over an HTTPS connection - a secure HTTP connection.

When interacting with HTTPS endpoints, the client code needs to request the public key certificate from the server being accessed, and use that to encrypt the traffic it sends. Your web browser does this automatically, but microcontrollers do not. You will need to request this certificate manually and use it to create a secure connection to the REST API. These certificates don't change, so once you have a certificate, it can be hard coded in your application.
These certificates contain public keys, and don't need to be kept secure.
### Task - set up an SSL client

1. Open the `fruit-quality-detector` app project if it's not already open.

1. Open the `config.h` header file, and add the following:

# Run your fruit detector on the edge # Run your fruit detector on the edge
This video gives an overview of running image classifiers on IoT devices, the topic that is covered in this lesson.

[![Custom Vision AI on Azure IoT Edge](https://img.youtube.com/vi/_K5fqGLO8us/0.jpg)](https://www.youtube.com/watch?v=_K5fqGLO8us)
> 🎥 Click the image above to watch a video
## Pre-lecture quiz
@ -14,23 +10,98 @@ This video gives an overview of running image classifiers on IoT devices, the to
## Introduction

In the last lesson you used your image classifier to classify ripe and unripe fruit, sending an image captured by the camera on your IoT device over the internet to a cloud service. These calls take time, cost money, and depending on the kind of image data you are using, could have privacy implications.
In this lesson you will learn about how to run machine learning (ML) models on the edge - on IoT devices running on your own network rather than in the cloud. You will learn the benefits and drawbacks of edge computing versus cloud computing, how to deploy your AI model to the edge, and how to access it from your IoT device.
In this lesson we'll cover:

* [Edge computing](#edge-computing)
* [Azure IoT Edge](#azure-iot-edge)
* [Register an IoT Edge device](#register-an-iot-edge-device)
* [Set up an IoT Edge device](#set-up-an-iot-edge-device)
* [Export your model](#export-your-model)
* [Prepare your container for deployment](#prepare-your-container-for-deployment)
* [Deploy your container](#deploy-your-container)
* [Use your IoT Edge device](#use-your-iot-edge-device)
## Edge computing
Edge computing involves having computers that process IoT data as close as possible to where the data is generated. Instead of having this processing in the cloud, it is moved to the edge of the cloud - your internal network.
![An architecture diagram showing internet services in the cloud and IoT devices on a local network](../../../images/cloud-without-edge.png)
In the lessons so far, you have had devices gathering data and sending data to the cloud to be analyzed, running serverless functions or AI models in the cloud.
![An architecture diagram showing IoT devices on a local network connecting to edge devices, and those edge devices connect to the cloud](../../../images/cloud-with-edge.png)
Edge computing involves moving some of the cloud services off the cloud and onto computers running on the same network as the IoT devices, only communicating with the cloud if needed. For example, you can run AI models on edge devices to analyse fruit for ripeness, and only send analytics back to the cloud, such as the number of ripe pieces of fruit vs unripe.
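The "only send analytics back" pattern from the fruit example can be sketched in a few lines of Python. Here `classify` is a hypothetical stand-in for the image classifier running on the edge device; only the small summary dictionary would ever leave the local network:

```python
# Sketch: classify images locally on the edge device, send only counts to the cloud.
def classify(image):
    # Hypothetical stand-in for the locally hosted image classifier
    return 'ripe' if image['brightness'] > 0.5 else 'unripe'

def summarize(images):
    counts = {'ripe': 0, 'unripe': 0}
    for image in images:
        counts[classify(image)] += 1   # stays on the local network
    return counts                      # only this tiny payload goes to the cloud

images = [{'brightness': 0.8}, {'brightness': 0.3}, {'brightness': 0.9}]
print(summarize(images))
```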
✅ Think about the IoT applications you have built so far. Which parts of them could be moved to the edge?
### Upsides
The upsides of edge computing are:
1. **Speed** - edge computing is ideal for time-sensitive data as actions are done on the same network as the device, rather than making calls across the internet. This enables higher speeds as internal networks can run at substantially faster speeds than internet connections, with the data travelling a much shorter distance.
> 💁 Despite optical cables being used for internet connections allowing data to travel at the speed of light, data can take time to travel around the world to cloud providers. For example, if you are sending data from Europe to cloud services in the US it takes at least 28ms for the data to cross the Atlantic in an optical cable, and that is ignoring the time taken to get the data to the transatlantic cable, convert from electrical to light signals and back again on the other side, then from the optical cable to the cloud provider.
Edge computing also requires less network traffic, reducing the risk of your data slowing down due to congestion on the limited bandwidth available for an internet connection.
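You can sanity-check that 28ms figure with a back-of-the-envelope calculation: light in optical fiber travels at roughly two thirds of its vacuum speed, and the transatlantic route is on the order of 5,600km (both numbers are rough assumptions for illustration):

```python
# Rough lower bound on one-way transatlantic latency through optical fiber
SPEED_OF_LIGHT_KM_S = 300_000   # vacuum speed, approximate
FIBER_FACTOR = 2 / 3            # light in glass travels at ~2/3 vacuum speed
ROUTE_KM = 5_600                # approximate Europe-to-US cable distance

latency_ms = ROUTE_KM / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000
print(f'{latency_ms:.0f} ms')   # best case, one way, ignoring routing overhead
```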
1. **Remote accessibility** - edge compute works when you have limited or no connectivity, or connectivity is too expensive to use continually. For example when working in humanitarian disaster areas where infrastructure is limited, or in developing nations.
1. **Lower costs** - performing data collection, storage, analysis, and triggering actions on the edge device reduces usage of cloud services, which can reduce the overall cost of your IoT application. There has been a recent rise in devices designed for edge computing, such as AI accelerator boards like the [Jetson Nano from NVIDIA](https://developer.nvidia.com/embedded/jetson-nano-developer-kit), which can run AI workloads using GPU-based hardware on devices that cost less than US$100.
1. **Privacy and security** - with edge compute, data stays on your network and is not uploaded to the cloud. This is often preferred for sensitive and personally identifiable information, especially because data does not need to be stored after it has been analyzed, which greatly reduces the risk of data leaks. Examples include medical data and security camera footage.
1. **Handling insecure devices** - if you have devices with known security flaws that you don't want to connect directly to your network or the internet, then you can connect them to a separate network containing a gateway IoT Edge device. This edge device can then also have a connection to your wider network or the internet, and manage the data flows back and forth.
1. **Support for incompatible devices** - if you have devices that cannot connect to IoT Hub, for example devices that can only connect using HTTP connections or devices that only have Bluetooth to connect, you can use an IoT Edge device as a gateway device, forwarding on messages to IoT Hub.
✅ Do some research: What other upsides might there be to edge computing?
### Downsides
There are downsides to edge computing, where the cloud may be a preferred option:
1. **Scale and flexibility** - cloud computing can adjust to network and data needs in real-time by adding or reducing servers and other resources. To add more edge computers requires manually adding more devices.
1. **Reliability and resiliency** - cloud computing provides multiple servers often in multiple locations for redundancy and disaster recovery. To have the same level of redundancy on the edge requires large investments and a lot of configuration work.
1. **Maintenance** - cloud service providers provide system maintenance and updates.
✅ Do some research: What other downsides might there be to edge computing?
The downsides are really the opposite of the upsides of using the cloud - you have to build and manage these devices yourself, rather than relying on the expertise and scale of cloud providers.
Some of the risks are mitigated by the very nature of edge computing. For example, if you have an edge device running in a factory gathering data from machinery, you don't need to think about some disaster recovery scenarios. If the power to the factory goes out then you don't need a backup edge device as the machines that generate the data the edge device processes will also be without power.
For IoT systems, you'll often want a blend of cloud and edge computing, leveraging each service based on the needs of the system, its customers, and its maintainers.
## Azure IoT Edge

![The Azure IoT Edge logo](../../../images/azure-iot-edge-logo.png)

Azure IoT Edge is a service that can help you to move workloads out of the cloud and to the edge. You set up a device as an edge device, and from the cloud you can deploy code to that edge device. This allows you to mix the capabilities of the cloud and the edge.
> 🎓 *Workloads* is a term for any service that does some kind of work, such as AI models, applications, or serverless functions.
For example, you can train an image classifier in the cloud, then from the cloud deploy it to an edge device. Your IoT device then sends images to the edge device for classification, rather than sending the images over the internet. If you need to deploy a new iteration of the model, you can train it in the cloud and use IoT Edge to update the model on the edge device to your new iteration.
> 🎓 Software that is deployed to IoT Edge is known as *modules*. By default IoT Edge runs modules that communicate with IoT Hub, such as the `edgeAgent` and `edgeHub` modules. When you deploy an image classifier, this is deployed as an additional module.
IoT Edge is built into IoT Hub, so you can manage edge devices using the same service you would use to manage IoT devices, with the same level of security.
IoT Edge runs code from *containers* - self-contained applications that are run in isolation from the rest of the applications on your computer. When you run a container it acts like a separate computer running inside your computer, with its own software, services and applications running. Most of the time containers cannot access anything on your computer unless you choose to share things like a folder with the container. The container then exposes services via an open port that you can connect to or expose to your network.
![A web request redirected to a container](../../../images/container-web-browser.png)
For example, you can have a container with a web site running on port 80, the default HTTP port, and you can then expose it from your computer also on port 80.
✅ Do some research: Read up on containers and services such as Docker or Moby.
You can use Custom Vision to download image classifiers and deploy them as containers, either running directly on a device or deployed via IoT Edge. Once they are running in a container, they can be accessed using the same REST API as the cloud version, but with the endpoint pointing to the edge device running the container.
## Register an IoT Edge device

To use an IoT Edge device, it needs to be registered in IoT Hub.

## Set up an IoT Edge device

Once you have created the edge device registration in your IoT Hub, you can set up the edge device.
### Task - Install and start the IoT Edge Runtime
**The IoT Edge runtime only runs Linux containers.** It can be run on Linux, or on Windows using Linux Virtual Machines.

* If you are using a Raspberry Pi as your IoT device, then this runs a supported version of Linux and can host the IoT Edge runtime. Follow the [Install Azure IoT Edge for Linux guide on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge?WT.mc_id=academic-17441-jabenn) to install IoT Edge and set the connection string.

* If you are using macOS, you can create a virtual machine (VM) in the cloud to use for your IoT Edge device. These are computers you can create in the cloud and access over the internet. You can create a Linux VM that has IoT Edge installed. Follow the [Create a virtual machine running IoT Edge guide](vm-iotedge.md) for instructions on how to do this.

## Export your model
To run the classifier at the edge, it needs to be exported from Custom Vision. Custom Vision can generate two types of models - standard models and compact models. Compact models use various techniques to reduce the size of the model, making it small enough to be downloaded and deployed on IoT devices.
When you created the image classifier, you used the *Food* domain, a version of the model that is optimized for training on food images. In Custom Vision, you can change the domain of your project and use your training data to train a new model with the new domain. All of the domains supported by Custom Vision are available as standard and compact.
### Task - train your model using the Food (compact) domain
1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `fruit-quality-detector` project.
1. Select the **Settings** button (the ⚙ icon)
1. In the *Domains* list, select *Food (compact)*
1. Under *Export Capabilities*, make sure *Basic platforms (Tensorflow, CoreML, ONNX, ...)* is selected.
1. At the bottom of the Settings page, select **Save Changes**.
1. Retrain the model with the **Train** button, selecting *Quick training*.
### Task - export your model
Once the model has been trained, it needs to be exported as a container.
1. Select the **Performance** tab, and find your latest iteration that was trained using the compact domain.
1. Select the **Export** button at the top.
1. Select **DockerFile**, then choose a version that matches your edge device:
* If you are running IoT Edge on a Linux computer, a Windows computer or a Virtual Machine, select the *Linux* version.
* If you are running IoT Edge on a Raspberry Pi, select the *ARM (Raspberry Pi 3)* version.
> 🎓 Docker is one of the most popular tools for managing containers, and a DockerFile is a set of instructions on how to set up the container.
1. Select **Export** to get Custom Vision to create the relevant files, then **Download** to download them in a zip file.
1. Save the files to your computer, then unzip the folder.
## Prepare your container for deployment
![Containers are built then pushed to a container registry, then deployed from the container registry to an edge device using IoT Edge](../../../images/container-edge-flow.png)
Once you have downloaded your model, it needs to be built into a container, then pushed to a container registry - an online location where you can store containers. IoT Edge can then download the container from the registry and deploy it to your device.
![The Azure Container Registry logo](../../../images/azure-container-registry-logo.png)
The container registry you will use for this lesson is Azure Container Registry. This is not a free service, so to save money make sure you [clean up your project](../../../clean-up.md) once you are finished.
> 💁 You can see the costs of using an Azure Container Registry in the [Azure Container Registry pricing page](https://azure.microsoft.com/pricing/details/container-registry/?WT.mc_id=academic-17441-jabenn)
### Task - install Docker
To build and deploy the classifier, you'll need to install [Docker](https://www.docker.com/).
1. Follow the Docker installation instructions on the [Docker install page](https://www.docker.com/products/docker-desktop) to install Docker Desktop or the Docker engine. Ensure it is running after installation.
### Task - create a container registry resource
1. Run the following command from your Terminal or command prompt to create an Azure Container Registry resource:
```sh
az acr create --resource-group fruit-quality-detector \
--sku Basic \
--name <Container registry name>
```
Replace `<Container registry name>` with a unique name for your container registry, using letters and numbers only. Base this around `fruitqualitydetector`. This name becomes part of the URL to access the container registry, so needs to be globally unique.
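If you want to check a candidate name before calling `az acr create`, the published naming rules (letters and digits only, 5 to 50 characters) are easy to encode; global uniqueness can only be verified by Azure itself when you create the resource:

```python
import re

# Validate a candidate Azure Container Registry name against the
# published naming rules: alphanumeric only, 5-50 characters.
# Global uniqueness still has to be checked by Azure on creation.
def valid_acr_name(name):
    return re.fullmatch(r'[a-zA-Z0-9]{5,50}', name) is not None

print(valid_acr_name('fruitqualitydetectorjimb'))  # True
print(valid_acr_name('fruit-quality-detector'))    # False - hyphens not allowed
```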
1. Log in to the Azure Container Registry with the following command:
```sh
az acr login --name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
1. Set the container registry into admin mode so you can generate a password with the following command:
```sh
az acr update --admin-enabled true \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
1. Generate passwords for your container registry with the following command:

```sh
az acr credential renew --password-name password \
--output table \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
Take a copy of the value of `PASSWORD`, as you will need this later.
### Task - build your container
What you downloaded from Custom Vision was a DockerFile containing instructions on how the container should be built, along with application code that will be run inside the container to host your Custom Vision model and a REST API to call it. You can use Docker to build a tagged container from the DockerFile, then push it to your container registry.
> 🎓 Containers are given a tag that defines a name and version for them. When you need to update a container you can build it with the same tag but a newer version.
1. Open your terminal or command prompt and navigate to the unzipped model that you downloaded from Custom Vision.
1. Run the following command to build and tag the image:
```sh
docker build --platform <platform> -t <Container registry name>.azurecr.io/classifier:v1 .
```
Replace `<platform>` with the platform that this container will run on. If you are running IoT Edge on a Raspberry Pi, set this to `linux/arm64`, otherwise set this to `linux/amd64`.
> 💁 If you are running this command from the device you are running IoT Edge from, such as running this from your Raspberry Pi, you can omit the `--platform <platform>` part as it defaults to the current platform.
Replace `<Container registry name>` with the name you used for your container registry.
> 💁 If you are running Linux you may need to use `sudo` to run this command.
Docker will build the image, configuring all the software needed. The image will then be tagged as `classifier:v1`.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux docker build --platform linux/amd64 -t fruitqualitydetectorjimb.azurecr.io/classifier:v1 .
[+] Building 102.4s (11/11) FINISHED
=> [internal] load build definition from Dockerfile
=> => transferring dockerfile: 131B
=> [internal] load .dockerignore
=> => transferring context: 2B
=> [internal] load metadata for docker.io/library/python:3.7-slim
=> [internal] load build context
=> => transferring context: 905B
=> [1/6] FROM docker.io/library/python:3.7-slim@sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6
=> => resolve docker.io/library/python:3.7-slim@sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6
=> => sha256:b4d181a07f8025e00e0cb28f1cc14613da2ce26450b80c54aea537fa93cf3bda 27.15MB / 27.15MB
=> => sha256:de8ecf497b753094723ccf9cea8a46076e7cb845f333df99a6f4f397c93c6ea9 2.77MB / 2.77MB
=> => sha256:707b80804672b7c5d8f21e37c8396f319151e1298d976186b4f3b76ead9f10c8 10.06MB / 10.06MB
=> => sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6 1.86kB / 1.86kB
=> => sha256:44073386687709c437586676b572ff45128ff1f1570153c2f727140d4a9accad 1.37kB / 1.37kB
=> => sha256:3d94f0f2ca798607808b771a7766f47ae62a26f820e871dd488baeccc69838d1 8.31kB / 8.31kB
=> => sha256:283715715396fd56d0e90355125fd4ec57b4f0773f306fcd5fa353b998beeb41 233B / 233B
=> => sha256:8353afd48f6b84c3603ea49d204bdcf2a1daada15f5d6cad9cc916e186610a9f 2.64MB / 2.64MB
=> => extracting sha256:b4d181a07f8025e00e0cb28f1cc14613da2ce26450b80c54aea537fa93cf3bda
=> => extracting sha256:de8ecf497b753094723ccf9cea8a46076e7cb845f333df99a6f4f397c93c6ea9
=> => extracting sha256:707b80804672b7c5d8f21e37c8396f319151e1298d976186b4f3b76ead9f10c8
=> => extracting sha256:283715715396fd56d0e90355125fd4ec57b4f0773f306fcd5fa353b998beeb41
=> => extracting sha256:8353afd48f6b84c3603ea49d204bdcf2a1daada15f5d6cad9cc916e186610a9f
=> [2/6] RUN pip install -U pip
=> [3/6] RUN pip install --no-cache-dir numpy~=1.17.5 tensorflow~=2.0.2 flask~=1.1.2 pillow~=7.2.0
=> [4/6] RUN pip install --no-cache-dir mscviplib==2.200731.16
=> [5/6] COPY app /app
=> [6/6] WORKDIR /app
=> exporting to image
=> => exporting layers
=> => writing image sha256:1846b6f134431f78507ba7c079358ed66d944c0e185ab53428276bd822400386
=> => naming to fruitqualitydetectorjimb.azurecr.io/classifier:v1
```
### Task - push your container to your container registry
1. Use the following command to push your container to your container registry:
```sh
docker push <Container registry name>.azurecr.io/classifier:v1
```
Replace `<Container registry name>` with the name you used for your container registry.
> 💁 If you are running Linux you may need to use `sudo` to run this command.
The container will be pushed to the container registry.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux docker push fruitqualitydetectorjimb.azurecr.io/classifier:v1
The push refers to repository [fruitqualitydetectorjimb.azurecr.io/classifier]
5f70bf18a086: Pushed
8a1ba9294a22: Pushed
56cf27184a76: Pushed
b32154f3f5dd: Pushed
36103e9a3104: Pushed
e2abb3cacca0: Pushed
4213fd357bbe: Pushed
7ea163ba4dce: Pushed
537313a13d90: Pushed
764055ebc9a7: Pushed
v1: digest: sha256:ea7894652e610de83a5a9e429618e763b8904284253f4fa0c9f65f0df3a5ded8 size: 2423
```
1. To verify the push, you can list the containers in your registry with the following command:
```sh
az acr repository list --output table \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux az acr repository list --name fruitqualitydetectorjimb --output table
Result
----------
classifier
```
You will see your classifier listed in the output.
## Deploy your container
Your container can now be deployed to your IoT Edge device. To deploy you need to define a deployment manifest - a JSON document that lists the modules that will be deployed to the edge device.
### Task - create the deployment manifest
1. Create a new file called `deployment.json` somewhere on your computer.
1. Add the following to this file:
```json
{
"content": {
"modulesContent": {
"$edgeAgent": {
"properties.desired": {
"schemaVersion": "1.1",
"runtime": {
"type": "docker",
"settings": {
"minDockerVersion": "v1.25",
"loggingOptions": "",
"registryCredentials": {
"ClassifierRegistry": {
"username": "<Container registry name>",
"password": "<Container registry password>",
"address": "<Container registry name>.azurecr.io"
}
}
}
},
"systemModules": {
"edgeAgent": {
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-agent:1.1",
"createOptions": "{}"
}
},
"edgeHub": {
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-hub:1.1",
"createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"
}
}
},
"modules": {
"ImageClassifier": {
"version": "1.0",
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "<Container registry name>.azurecr.io/classifier:v1",
"createOptions": "{\"ExposedPorts\": {\"80/tcp\": {}},\"HostConfig\": {\"PortBindings\": {\"80/tcp\": [{\"HostPort\": \"80\"}]}}}"
}
}
}
}
},
"$edgeHub": {
"properties.desired": {
"schemaVersion": "1.1",
"routes": {
"upstream": "FROM /messages/* INTO $upstream"
},
"storeAndForwardConfiguration": {
"timeToLiveSecs": 7200
}
}
}
}
}
}
```
> 💁 You can find this file in the [code-deployment/deployment](code-deployment/deployment) folder.
Replace the three instances of `<Container registry name>` with the name you used for your container registry. One is in the `ImageClassifier` module section, the other two are in the `registryCredentials` section.
Replace `<Container registry password>` in the `registryCredentials` section with your container registry password.
1. From the folder containing your deployment manifest, run the following command:
```sh
az iot edge set-modules --device-id fruit-quality-detector-edge \
--content deployment.json \
--hub-name <hub_name>
```
Replace `<hub_name>` with the name of your IoT Hub.
The image classifier module will be deployed to your edge device.
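Hand-editing the placeholders in the manifest is error-prone, so the substitution can also be scripted. This sketch assumes the placeholders appear in `deployment.json` exactly as `<Container registry name>` and `<Container registry password>`:

```python
import json

# Fill the registry placeholders in a deployment manifest template.
# Placeholder text is assumed to match the manifest exactly.
def fill_manifest(template_text, registry, password):
    filled = (template_text
              .replace('<Container registry name>', registry)
              .replace('<Container registry password>', password))
    return json.loads(filled)   # raises if the result is not valid JSON

# Minimal stand-in for the real manifest, for illustration only
template = '{"address": "<Container registry name>.azurecr.io"}'
manifest = fill_manifest(template, 'fruitqualitydetectorjimb', 'unused')
print(manifest['address'])
```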
### Task - verify the classifier is running
1. Connect to the IoT edge device:
* If you are using a Raspberry Pi to run IoT Edge, connect using ssh either from your terminal, or via a remote SSH session in VS Code
* If you are running IoT Edge in a Linux container on Windows, follow the steps in the [Verify successful configuration guide](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge-on-windows?view=iotedge-2018-06&tabs=powershell&WT.mc_id=academic-17441-jabenn#verify-successful-configuration) to connect to the IoT Edge device.
* If you are running IoT Edge on a virtual machine, you can SSH into the machine using the `adminUsername` and `password` you set when creating the VM, and using either the IP address or DNS name:
```sh
ssh <adminUsername>@<IP address>
```
Or:
```sh
ssh <adminUsername>@<DNS Name>
```
Enter your password when prompted.
1. Once you are connected, run the following command to get the list of IoT Edge modules:
```sh
iotedge list
```
> 💁 You may need to run this command with `sudo`.
You will see the running modules:
```output
jim@fruit-quality-detector-jimb:~$ iotedge list
NAME STATUS DESCRIPTION CONFIG
ImageClassifier running Up 42 minutes fruitqualitydetectorjimb.azurecr.io/classifier:v1
edgeAgent running Up 42 minutes mcr.microsoft.com/azureiotedge-agent:1.1
edgeHub running Up 42 minutes mcr.microsoft.com/azureiotedge-hub:1.1
```
1. Check the logs for the Image classifier module with the following command:
```sh
iotedge logs ImageClassifier
```
> 💁 You may need to run this command with `sudo`.
```output
jim@fruit-quality-detector-jimb:~$ iotedge logs ImageClassifier
2021-07-05 20:30:15.387144: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-07-05 20:30:15.392185: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2394450000 Hz
2021-07-05 20:30:15.392712: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55ed9ac83470 executing computations on platform Host. Devices:
2021-07-05 20:30:15.392806: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): Host, Default Version
Loading model...Success!
Loading labels...2 found. Success!
* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
```
### Task - test the image classifier
1. You can use curl to test the image classifier using the IP address or host name of the computer that is running the IoT Edge agent. Find the IP address:
* If you are on the same machine that IoT Edge is running, you can use `localhost` as the host name.
* If you are using a VM, you can use either the IP address or the DNS name of the VM
* Otherwise you can obtain the IP address of the machine running IoT Edge:
* On Windows 10, follow the [Find your IP address guide](https://support.microsoft.com/windows/find-your-ip-address-f21a9bbc-c582-55cd-35e0-73431160a1b9?WT.mc_id=academic-17441-jabenn)
* On macOS, follow the [How to find your IP address on a Mac guide](https://www.hellotech.com/guide/for/how-to-find-ip-address-on-mac)
* On Linux, follow the section on finding your private IP address in the [How to find your IP address in Linux guide](https://opensource.com/article/18/5/how-find-ip-address-linux)
1. You can test the container with a local file by running the following curl command:
```sh
curl --location \
--request POST 'http://<IP address or name>/image' \
--header 'Content-Type: image/png' \
--data-binary '@<file_Name>'
```
Replace `<IP address or name>` with the IP address or host name of the computer running IoT Edge. Replace `<file_Name>` with the name of the file to test.
You will see the prediction results in the output:
```output
{
"created": "2021-07-05T21:44:39.573181",
"id": "",
"iteration": "",
"predictions": [
{
"boundingBox": null,
"probability": 0.9995615482330322,
"tagId": "",
"tagName": "ripe"
},
{
"boundingBox": null,
"probability": 0.0004384400090202689,
"tagId": "",
"tagName": "unripe"
}
],
"project": ""
}
```
> 💁 There is no need to provide a prediction key here, as this is not using an Azure resource. Instead, security would be configured on the internal network based on internal security needs, rather than relying on a public endpoint and an API key.
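For example, one common pattern is to put a reverse proxy in front of the classifier on the internal network and require a shared key there. The fragment below is a hypothetical nginx sketch - the header name, key value, and ports are illustrative assumptions, not part of this lesson's setup:

```nginx
# Hypothetical nginx reverse proxy fragment: reject requests that don't carry
# a shared key header. Header name, key, and ports are illustrative only.
server {
    listen 8080;

    location /image {
        if ($http_prediction_key != "replace-with-an-internal-key") {
            return 401;
        }
        proxy_pass http://localhost:80;
    }
}
```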
## Use your IoT Edge device
Now that your Image Classifier has been deployed to an IoT Edge device, you can use it from your IoT device.
### Task - use your IoT Edge device
Work through the relevant guide to classify images using the IoT Edge classifier:
* [Arduino - Wio Terminal](wio-terminal.md)
* [Single-board computer - Raspberry Pi/Virtual IoT device](single-board-computer.md)
### Model retraining
One of the downsides to running image classifiers on IoT Edge is that they are not connected to your Custom Vision project. If you look at the **Predictions** tab in Custom Vision you won't see the images that were classified using the Edge-based classifier.
This is the expected behavior - images are not sent to the cloud for classification, so they won't be available in the cloud. One of the upsides of using IoT Edge is privacy - images never leave your network. Another is the ability to work offline, with no reliance on uploading images when the device has no internet connection. The downside is improving your model - you would need to implement another way of storing images so that they can be manually re-classified and used to re-train the image classifier.
✅ Think about ways to upload images to retrain the classifier.
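As one starting point (a sketch, not part of this lesson's code), the device could keep a local copy of every classified image, organized by predicted tag, ready to be reviewed, re-tagged, and uploaded for retraining later. The folder layout and naming scheme here are assumptions:

```python
import pathlib
import shutil
import time

def save_for_review(image_path, tag, review_dir='review'):
    """Copy a classified image into a per-tag folder for later re-tagging.

    This is a sketch: the folder layout and naming scheme are assumptions.
    """
    dest_dir = pathlib.Path(review_dir) / tag
    dest_dir.mkdir(parents=True, exist_ok=True)
    # Prefix with a millisecond timestamp so repeated captures don't collide
    dest = dest_dir / f"{int(time.time() * 1000)}_{pathlib.Path(image_path).name}"
    shutil.copy(image_path, dest)
    return dest
```

A periodic job could then push the contents of the review folder to the Custom Vision training endpoint, or to blob storage, whenever the device is online.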
---

## 🚀 Challenge
Running AI models on edge devices can be faster than in the cloud - the network hop is shorter. They can also be slower, as the hardware that runs the model may not be as powerful as the cloud.

Do some timings and compare whether the call to your edge device is faster or slower than the call to the cloud. Think about reasons to explain the difference, or lack of difference. Research ways to run AI models faster on the edge using specialized hardware.
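As a starting point for the timings, here is a sketch (the endpoint URLs are placeholders you would replace with your own) that averages repeated calls to each endpoint:

```python
import time

import requests

def average_seconds(fn, repeats=5):
    """Return the average elapsed time in seconds over several calls to fn()."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - start) / repeats

def classify(url, image_bytes):
    """POST an image to a prediction endpoint, as elsewhere in this lesson."""
    return requests.post(url, headers={'Content-Type': 'image/png'}, data=image_bytes)

# Placeholder URLs - replace with your edge and cloud prediction endpoints
EDGE_URL = 'http://<IP address or name>/image'
CLOUD_URL = '<cloud prediction URL>'

# with open('image.jpg', 'rb') as image_file:
#     image_bytes = image_file.read()
# print('edge :', average_seconds(lambda: classify(EDGE_URL, image_bytes)))
# print('cloud:', average_seconds(lambda: classify(CLOUD_URL, image_bytes)))
```

Averaging over several calls smooths out one-off network spikes; increase `repeats` for more stable numbers.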
## Post-lecture quiz

[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)

## Review & Self Study
* Read more about containers on the [OS-level virtualization page on Wikipedia](https://wikipedia.org/wiki/OS-level_virtualization)
* Read more on edge computing, with an emphasis on how 5G can help expand edge computing in the [What is edge computing and why does it matter? article on NetworkWorld](https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html)
* Learn more about running AI services in IoT Edge by watching the [Learn How to Use Azure IoT Edge on a Pre-Built AI Service on the Edge to do Language Detection episode of Learn Live on Microsoft Channel9](https://channel9.msdn.com/Shows/Learn-Live/Sharpen-Your-AI-Edge-Skills-Episode-4-Learn-How-to-Use-Azure-IoT-Edge-on-a-Pre-Built-AI-Service-on-t?WT.mc_id=academic-17441-jabenn)
## Assignment

[Run other services on the edge](assignment.md)

@ -1,9 +1,13 @@
# Run other services on the edge

## Instructions
It's not just image classifiers that can be run on the edge - anything that can be packaged up into a container can be deployed to an IoT Edge device. Serverless code running as Azure Functions, such as the triggers you've created in earlier lessons, can be run in containers, and therefore on IoT Edge.
Pick one of the previous lessons and try to run the Azure Functions app in an IoT Edge container. You can find a guide that shows how to do this using a different Functions app project in the [Tutorial: Deploy Azure Functions as IoT Edge modules on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/tutorial-deploy-function?view=iotedge-2020-11&WT.mc_id=academic-17441-jabenn).
## Rubric

| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Deploy an Azure Functions app to IoT Edge | Was able to deploy an Azure Functions app to IoT Edge and use it with an IoT device to run a trigger from IoT data | Was able to deploy a Functions App to IoT Edge, but was unable to get the trigger to fire | Was unable to deploy a Functions App to IoT Edge |

@ -0,0 +1,28 @@
import io
import requests
import time

from picamera import PiCamera

# Configure the camera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

# Give the camera time to warm up
time.sleep(2)

# Capture an image into an in-memory stream
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

# Save a copy of the image to disk
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# Send the image to the classifier and print the predictions
prediction_url = '<URL>'
headers = {
    'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()

for prediction in results['predictions']:
    print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')

@ -0,0 +1,28 @@
from counterfit_connection import CounterFitConnection

# Connect to the CounterFit virtual hardware app before importing the shims
CounterFitConnection.init('127.0.0.1', 5000)

import io
import requests

from counterfit_shims_picamera import PiCamera

# Configure the virtual camera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

# Capture an image into an in-memory stream
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

# Save a copy of the image to disk
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# Send the image to the classifier and print the predictions
prediction_url = '<URL>'
headers = {
    'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()

for prediction in results['predictions']:
    print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will find automatically dependent
libraries scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
// The OV2640 reports vid 0x26 and pid 0x41 or 0x42 - anything else is not supported
if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42)))
{
return false;
}
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
else if ((temp == 0xD8) && (temp_last == 0xFF))
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
// return the buffer
*buffer = buf;
return true;
}
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,11 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";

@ -0,0 +1,123 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <WiFiClient.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClient client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
void classifyImage(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
char buff[32];
sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
classifyImage(buffer, length);
delete (buffer);
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,66 @@
{
"content": {
"modulesContent": {
"$edgeAgent": {
"properties.desired": {
"schemaVersion": "1.1",
"runtime": {
"type": "docker",
"settings": {
"minDockerVersion": "v1.25",
"loggingOptions": "",
"registryCredentials": {
"ClassifierRegistry": {
"username": "<Container registry name>",
"password": "<Container Password>",
"address": "<Container registry name>.azurecr.io"
}
}
}
},
"systemModules": {
"edgeAgent": {
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-agent:1.1",
"createOptions": "{}"
}
},
"edgeHub": {
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-hub:1.1",
"createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"
}
}
},
"modules": {
"ImageClassifier": {
"version": "1.0",
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "<Container registry name>.azurecr.io/classifier:v1",
"createOptions": "{\"ExposedPorts\": {\"80/tcp\": {}},\"HostConfig\": {\"PortBindings\": {\"80/tcp\": [{\"HostPort\": \"80\"}]}}}"
}
}
}
}
},
"$edgeHub": {
"properties.desired": {
"schemaVersion": "1.1",
"routes": {
"upstream": "FROM /messages/* INTO $upstream"
},
"storeAndForwardConfiguration": {
"timeToLiveSecs": 7200
}
}
}
}
}
}

@ -0,0 +1,54 @@
# Classify an image using an IoT Edge based image classifier - Virtual IoT Hardware and Raspberry Pi
In this part of the lesson, you will use the Image Classifier running on the IoT Edge device.
## Use the IoT Edge classifier
The IoT device can be re-directed to use the IoT Edge image classifier. The URL for the Image Classifier is `http://<IP address or name>/image`, replacing `<IP address or name>` with the IP address or host name of the computer running IoT Edge.
The Python library for Custom Vision only works with cloud-hosted models, not models hosted on IoT Edge. This means you will need to use the REST API to call the classifier.
### Task - use the IoT Edge classifier
1. Open the `fruit-quality-detector` project in VS Code if it is not already open. If you are using a virtual IoT device, then make sure the virtual environment is activated.
1. Open the `app.py` file, and remove the import statements from `azure.cognitiveservices.vision.customvision.prediction` and `msrest.authentication`.
1. Add the following import at the top of the file:
```python
import requests
```
1. Delete all the code after the image is saved to a file, from `image_file.write(image.read())` to the end of the file.
1. Add the following code to the end of the file:
```python
prediction_url = '<URL>'
headers = {
'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()
for prediction in results['predictions']:
print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')
```
Replace `<URL>` with the URL for your classifier.
This code makes a REST POST request to the classifier, sending the image as the body of the request. The results come back as JSON, and this is decoded to print out the probabilities.
1. Run your code, with your camera pointing at some fruit, or an appropriate image set, or fruit visible on your webcam if using virtual IoT hardware. You will see the output in the console:
```output
(.venv) ➜ fruit-quality-detector python app.py
ripe: 56.84%
unripe: 43.16%
```
> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-iot-device](code-classify/virtual-iot-device) folder.
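If you only want the single most likely tag rather than the full list, you can take the maximum over the decoded predictions. A small sketch using the same `results` structure, shown here with hard-coded example values:

```python
# A decoded response in the same shape the classifier returns
results = {
    'predictions': [
        {'tagName': 'ripe', 'probability': 0.5684},
        {'tagName': 'unripe', 'probability': 0.4316},
    ]
}

# Pick the prediction with the highest probability
top = max(results['predictions'], key=lambda p: p['probability'])
print(f"{top['tagName']}: {top['probability'] * 100:.2f}%")
```

With the example values above this prints the `ripe` tag with its percentage.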
😀 Your fruit quality classifier program was a success!

@ -31,36 +31,71 @@ In Azure, you can create a virtual machine - a computer in the cloud that you ca
Once the VM has been created, the IoT Edge runtime will be installed automatically, and configured to connect to your IoT Hub as your `fruit-quality-detector-edge` device.
1. You will need either the IP address or the DNS name of the VM to call the image classifier from it. Run the following command to get this:
```sh
az vm list --resource-group fruit-quality-detector \
--output table \
--show-details
```
Take a copy of either the `PublicIps` field, or the `Fqdns` field.
1. VMs cost money. At the time of writing, a DS1 VM costs about $0.06 per hour. To keep costs down, you should shut down the VM when you are not using it, and delete it when you are finished with this project.
You can configure your VM to automatically shut down at a certain time each day. This means if you forget to shut it down, you won't be billed for more than the time till the automatic shutdown. Use the following command to set this:

```sh
az vm auto-shutdown --resource-group fruit-quality-detector \
--name <vm_name> \
--time <shutdown_time_utc>
```

Replace `<vm_name>` with the name of your virtual machine.

Replace `<shutdown_time_utc>` with the UTC time that you want the VM to shut down using 4 digits as HHMM. For example, if you want to shutdown at midnight UTC, you would set this to `0000`. For 7:30PM on the west coast of the USA, you would use 0230 (7:30PM on the US west coast is 2:30AM UTC).

1. Your image classifier will be running on this edge device, listening on port 80 (the standard HTTP port). By default, virtual machines have inbound ports blocked, so you will need to enable port 80. Ports are enabled on network security groups, so first you need to know the name of the network security group for your VM, which you can find with the following command:

```sh
az network nsg list --resource-group fruit-quality-detector \
--output table
```

Copy the value of the `Name` field.

1. Run the following command to add a rule to open port 80 to the network security group:

```sh
az network nsg rule create \
--resource-group fruit-quality-detector \
--name Port_80 \
--protocol tcp \
--priority 1010 \
--destination-port-range 80 \
--nsg-name <nsg name>
```

Replace `<nsg name>` with the network security group name from the previous step.

### Task - manage your VM to reduce costs

1. When you are not using your VM, you should shut it down. To shut down the VM, use the following command:

```sh
az vm deallocate --resource-group fruit-quality-detector \
--name <vm_name>
```

Replace `<vm_name>` with the name of your virtual machine.

> 💁 There is an `az vm stop` command which will stop the VM, but it keeps the computer allocated to you, so you still pay as if it was still running.
1. To restart the VM, use the following command:
```sh
az vm start --resource-group fruit-quality-detector \
--name <vm_name>
```
Replace `<vm_name>` with the name of your virtual machine.

@ -0,0 +1,52 @@
# Classify an image using an IoT Edge based image classifier - Wio Terminal
In this part of the lesson, you will use the Image Classifier running on the IoT Edge device.
## Use the IoT Edge classifier
The IoT device can be re-directed to use the IoT Edge image classifier. The URL for the Image Classifier is `http://<IP address or name>/image`, replacing `<IP address or name>` with the IP address or host name of the computer running IoT Edge.
### Task - use the IoT Edge classifier
1. Open the `fruit-quality-detector` app project if it's not already open.
1. The image classifier is running as a REST API using HTTP, not HTTPS, so the call needs to use a WiFi client that works with HTTP calls only. This means the certificate is not needed. Delete the `CERTIFICATE` from the `config.h` file.
1. The prediction URL in the `config.h` file needs to be updated to the new URL. You can also delete the `PREDICTION_KEY` as this is not needed.
```cpp
const char *PREDICTION_URL = "<URL>";
```
Replace `<URL>` with the URL for your classifier.
1. In `main.cpp`, change the include directive for the WiFi Client Secure to import the standard HTTP version:
```cpp
#include <WiFiClient.h>
```
1. Change the declaration of `WiFiClient` to be the HTTP version:
```cpp
WiFiClient client;
```
1. Remove the line `client.setCACert(CERTIFICATE);`, which sets the certificate on the WiFi client, from the `connectWiFi` function.
1. In the `classifyImage` function, remove the `httpClient.addHeader("Prediction-Key", PREDICTION_KEY);` line that sets the prediction key in the header.
1. Upload and run your code. Point the camera at some fruit and press the C button. You will see the output in the serial monitor:
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 8200
ripe: 56.84%
unripe: 43.16%
```
> 💁 You can find this code in the [code-classify/wio-terminal](code-classify/wio-terminal) folder.
😀 Your fruit quality classifier program was a success!

@ -37,8 +37,6 @@ IoT applications can be described as *things* (devices) sending data that genera
![A reference iot architecture](../../../images/iot-reference-architecture.png)

The diagram above shows a reference IoT architecture.

> 🎓 A *reference architecture* is an example architecture you can use as a reference when designing new systems. In this case, if you were building a new IoT system you can follow the reference architecture, substituting your own devices and services where appropriate.
@ -49,8 +47,6 @@ The diagram above shows a reference IoT architecture.
![A reference iot architecture](../../../images/iot-reference-architecture-azure.png)

The diagram above shows some of the components and services covered so far in these lessons and how they link together in a reference IoT architecture.
* **Things** - you've written device code to capture data from sensors, and analyse images using Custom Vision running both in the cloud and on an edge device. This data was sent to IoT Hub. * **Things** - you've written device code to capture data from sensors, and analyse images using Custom Vision running both in the cloud and on an edge device. This data was sent to IoT Hub.
@ -74,7 +70,7 @@ As you define the architecture of your system, you need to constantly consider d
## Design a fruit quality control system

Let's now take this idea of things, insights, and actions and apply it to our fruit quality detector to design a larger end-to-end application.

Imagine you have been given the task of building a fruit quality detector to be used in a processing plant. Fruit travels on a conveyer belt system where currently employees spend time checking the fruit by hand and removing any unripe fruit as it arrives. To reduce costs, the plant owner wants an automated system.
@ -92,8 +88,6 @@ You need to build a system where fruit is detected as it arrives on the conveyer
![A reference iot architecture for fruit quality checking](../../../images/iot-reference-architecture-fruit-quality.png)

The diagram above shows a reference architecture for this prototype application.

* An IoT device with a proximity sensor detects the arrival of fruit. This sends a message to the cloud to say fruit has been detected.
@ -111,8 +105,6 @@ The IoT device needs some kind of trigger to indicate when fruit is ready to be
![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png) ![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png)
***Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back. Bananas by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Proximity sensors can be used to measure the distance from the sensor to an object. They usually transmit a beam of electromagnetic radiation such as a laser beam or infra-red light, then detect the radiation bouncing off an object. The time between the laser beam being sent and the signal bouncing back can be used to calculate the distance from the sensor to the object.
> 💁 You have probably used proximity sensors without even knowing about it. Most smartphones will turn the screen off when you hold them to your ear to stop you accidentally ending a call with your earlobe. This works using a proximity sensor that detects an object close to the screen during a call and disables the touch capabilities until the phone is a certain distance away.
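The arithmetic behind this is worth seeing: the beam travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch of the calculation (the timing value below is illustrative, not a reading from a real sensor):

```python
# Convert a time-of-flight reading into a distance.
# The beam travels out and back, so divide the round trip time by 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_mm(round_trip_seconds):
    """Distance to the object in millimetres, from a round-trip time in seconds."""
    one_way_seconds = round_trip_seconds / 2
    return one_way_seconds * SPEED_OF_LIGHT_M_PER_S * 1000

# A round trip of ~1.3 nanoseconds corresponds to an object about 20 cm away
print(round(distance_mm(1.334e-9)))  # → 200
```

The tiny times involved are why these sensors need dedicated timing hardware rather than measuring in software.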
@ -205,7 +197,7 @@ The prototype will form the basis of a final production system. Some of the diff
## 🚀 Challenge
In this lesson you have learned some of the concepts you need to know to architect an IoT system. Think back to the previous projects. How do they fit into the reference architecture shown above?
Pick one of the projects so far and think about the design of a more complicated solution bringing together multiple capabilities beyond what was covered in the projects. Draw the architecture and think of all the devices and services you would need.
@ -218,7 +210,7 @@ For example - a vehicle tracking device that combines GPS with sensors to monito
## Review & Self Study
* Read more about IoT architecture on the [Azure IoT reference architecture documentation on Microsoft docs](https://docs.microsoft.com/azure/architecture/reference-architectures/iot?WT.mc_id=academic-17441-jabenn)
* Read more about device twins in the [Understand and use device twins in IoT Hub documentation on Microsoft docs](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-device-twins?WT.mc_id=academic-17441-jabenn)
* Read about OPC-UA, a machine-to-machine communication protocol used in industrial automation, on the [OPC-UA page on Wikipedia](https://wikipedia.org/wiki/OPC_Unified_Architecture)
## Assignment

@ -40,6 +40,12 @@ Program the device.
1. Open the `fruit-quality-detector` code in VS Code, either directly on the Pi, or connect via the Remote SSH extension.
1. Install the rpi-vl53l0x pip package, a Python package that interacts with a VL53L0X time-of-flight distance sensor, using the following command:
```sh
pip install rpi-vl53l0x
```
1. Create a new file in this project called `distance-sensor.py`.
> 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time.
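One way to follow that pattern without extra hardware is to wrap each pretend device in a small class that produces sensor-like readings. The sketch below is purely illustrative (the class name and values are invented for this example, not part of the lesson code):

```python
import random

class SimulatedDistanceSensor:
    """Stands in for a real time-of-flight distance sensor when no hardware is attached."""

    def __init__(self, base_mm=200, jitter_mm=5, seed=None):
        self._base = base_mm
        self._jitter = jitter_mm
        self._rand = random.Random(seed)

    @property
    def distance(self):
        # Real sensors report noisy values, so add a little jitter around the base distance
        return self._base + self._rand.randint(-self._jitter, self._jitter)

sensor = SimulatedDistanceSensor(seed=42)
for _ in range(3):
    print(f'Distance = {sensor.distance} mm')
```

Run one such file per simulated device, each in its own terminal, just as you would run the real device code.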
@ -95,4 +101,4 @@ Program the device.
> 💁 You can find this code in the [code-proximity/pi](code-proximity/pi) folder.
😀 Your proximity sensor program was a success!

@ -16,6 +16,10 @@ In this lesson we'll cover:
* [Retrain the model](#retrain-the-model)
* [Count stock](#count-stock)
> 🗑 This is the last lesson in this project, so after completing this lesson and the assignment, don't forget to clean up your cloud services. You will need the services to complete the assignment, so make sure to complete that first.
>
> Refer to [the clean up your project guide](../../../clean-up.md) if necessary for instructions on how to do this.
## Stock counting
Object detectors can be used for stock checking, either counting stock or ensuring stock is where it should be. IoT devices with cameras can be deployed all around the store to monitor stock, starting with hot spots where having items restocked is important, such as areas where small numbers of high value items are stocked.
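Once an object detector returns its predictions, counting stock comes down to discarding low-probability detections and tallying the rest by tag. A minimal sketch (the prediction dictionaries here are a simplified stand-in for what a real object detection API returns):

```python
from collections import Counter

def count_stock(predictions, threshold=0.5):
    """Count detected items per tag, ignoring low-probability predictions."""
    counts = Counter()
    for prediction in predictions:
        if prediction['probability'] >= threshold:
            counts[prediction['tag']] += 1
    return dict(counts)

predictions = [
    {'tag': 'tomato paste', 'probability': 0.95},
    {'tag': 'tomato paste', 'probability': 0.91},
    {'tag': 'tomato paste', 'probability': 0.20},  # too low - ignored
    {'tag': 'ketchup', 'probability': 0.88},
]
print(count_stock(predictions))  # → {'tomato paste': 2, 'ketchup': 1}
```

Tuning the threshold trades off missed items against double-counting spurious detections.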

@ -49,8 +49,6 @@ Microphones come in a variety of types:
![Patti Smith singing into a Shure SM58 (dynamic cardioid type) microphone](../../../images/dynamic-mic.jpg)
***Beni Köhler / [Creative Commons Attribution-Share Alike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/deed.en)***
* Ribbon - Ribbon microphones are similar to dynamic microphones, except they have a metal ribbon instead of a diaphragm. This ribbon moves in a magnetic field generating an electrical current. Like dynamic microphones, ribbon microphones don't need power to work.
![Edmund Lowe, American actor, standing at radio microphone (labeled for (NBC) Blue Network), holding script, 1942](../../../images/ribbon-mic.jpg)
@ -59,8 +57,6 @@ Microphones come in a variety of types:
![C451B small-diaphragm condenser microphone by AKG Acoustics](../../../images/condenser-mic.jpg)
***[Harumphy](https://en.wikipedia.org/wiki/User:Harumphy) at [en.wikipedia](https://en.wikipedia.org/) / [Creative Commons Attribution-Share Alike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/deed.en)***
* MEMS - Microelectromechanical systems microphones, or MEMS, are microphones on a chip. They have a pressure sensitive diaphragm etched onto a silicon chip, and work similarly to a condenser microphone. These microphones can be tiny, and integrated into circuitry.
![A MEMS microphone on a circuit board](../../../images/mems-microphone.png)
@ -85,7 +81,7 @@ For example most streaming music services offer 16-bit or 24-bit audio. This mea
> 💁 You may have heard of 8-bit audio, often referred to as LoFi. This is audio sampled using only 8 bits, so -128 to 127. The first computer audio was limited to 8 bits due to hardware limitations, so this is often seen in retro gaming.
These samples are taken many thousands of times per second, using well-defined sample rates measured in KHz (thousands of readings per second). Streaming music services use 48KHz for most audio, but some 'lossless' audio uses up to 96KHz or even 192KHz. The higher the sample rate, the closer to the original the audio will be, up to a point. There is debate whether humans can tell the difference above 48KHz.
✅ Do some research: If you use a streaming music service, what sample rate and size does it use? If you use CDs, what is the sample rate and size of CD audio?
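Sample size and sample rate together determine how much data raw audio occupies, which matters on IoT devices with limited memory and bandwidth. A quick sketch of the arithmetic:

```python
def audio_bytes_per_second(bits_per_sample, sample_rate_hz, channels=2):
    """Raw (uncompressed) audio data rate in bytes per second."""
    return bits_per_sample // 8 * sample_rate_hz * channels

# CD audio: 16-bit samples at 44.1KHz, in stereo
cd_rate = audio_bytes_per_second(16, 44_100)
print(cd_rate)                    # → 176400 bytes per second
print(cd_rate * 60 / 1_000_000)   # → 10.584 MB per minute
```

This is why audio captured on a device is usually compressed, or streamed in small buffers, before being sent to the cloud.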

@ -24,6 +24,10 @@ In this lesson we'll cover:
* [Support multiple languages in applications with translations](#support-multiple-languages-in-applications-with-translations)
* [Translate text using an AI service](#translate-text-using-an-ai-service)
> 🗑 This is the last lesson in this project, so after completing this lesson and the assignment, don't forget to clean up your cloud services. You will need the services to complete the assignment, so make sure to complete that first.
>
> Refer to [the clean up your project guide](../../../clean-up.md) if necessary for instructions on how to do this.
## Translate text
Text translation is a computer science problem that has been researched for over 70 years, and only now, thanks to advances in AI and computing power, is it close to being solved to a point where it is almost as good as human translators.
@ -115,8 +119,6 @@ In an ideal world, your whole application should understand as many different la
![A smart timer architecture translating Japanese to English, processing in English then translating back to Japanese](../../../images/translated-smart-timer.png)
***A smart timer architecture translating Japanese to English, processing in English then translating back to Japanese. Microcontroller by Template / recording by Aybige Speaker / Speaker by Gregor Cresnar - all from the [Noun Project](https://thenounproject.com)***
Imagine you are building a smart timer that uses English end-to-end, understanding spoken English and converting that to text, running the language understanding in English, building up responses in English and replying with English speech. If you wanted to add support for Japanese, you could start with translating spoken Japanese to English text, then keep the core of the application the same, then translate the response text to Japanese before speaking the response. This would allow you to quickly add Japanese support, and you can expand to providing full end-to-end Japanese support later.
> 💁 The downside to relying on machine translation is that different languages and cultures have different ways of saying the same things, so the translation may not match the expression you are expecting.
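The "translate in, process in English, translate out" pattern above can be sketched as a thin wrapper around the English pipeline. The translations below are hard-coded stand-ins for illustration; a real application would call a translation service instead:

```python
# Illustrative only: the lookup tables stand in for real translation service calls.
JA_TO_EN = {'3分のタイマーをセットして': 'set a 3 minute timer'}
EN_TO_JA = {'3 minute timer started': '3分タイマーが開始されました'}

def translate_ja_to_en(text):
    return JA_TO_EN[text]

def translate_en_to_ja(text):
    return EN_TO_JA[text]

def process_command_in_english(text):
    # The core of the application stays English-only
    if 'timer' in text:
        return '3 minute timer started'
    return "sorry, I didn't understand"

def handle_japanese_command(text):
    english = translate_ja_to_en(text)       # Japanese -> English
    response = process_command_in_english(english)
    return translate_en_to_ja(response)      # English -> Japanese

print(handle_japanese_command('3分のタイマーをセットして'))
```

Only the two translation calls change when you add another language; the language understanding and response-building code stay the same.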

@ -91,3 +91,7 @@ We have two choices of IoT hardware to use for the projects depending on persona
## Offline access
You can run this documentation offline by using [Docsify](https://docsify.js.org/#/). Fork this repo, [install Docsify](https://docsify.js.org/#/quickstart) on your local machine, and then in the root folder of this repo, type `docsify serve`. The website will be served on port 3000 on your localhost: `localhost:3000`.
## Image attributions
You can find all the attributions for the images used in this curriculum, where required, in the [Attributions](./attributions.md) page.

@ -0,0 +1,40 @@
# Image attributions
* Bananas by abderraouf omara from the [Noun Project](https://thenounproject.com)
* Brain by Icon Market from the [Noun Project](https://thenounproject.com)
* Broadcast by RomStu from the [Noun Project](https://thenounproject.com)
* Button by Dan Hetteix from the [Noun Project](https://thenounproject.com)
* C451B small-diaphragm condenser microphone by AKG Acoustics. [Harumphy](https://en.wikipedia.org/wiki/User:Harumphy) at [en.wikipedia](https://en.wikipedia.org/) / [Creative Commons Attribution-Share Alike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/deed.en)
* Calendar by Alice-vector from the [Noun Project](https://thenounproject.com)
* Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)
* chip by Astatine Lab from the [Noun Project](https://thenounproject.com)
* Cloud by Debi Alpa Nugraha from the [Noun Project](https://thenounproject.com)
* container by ProSymbols from the [Noun Project](https://thenounproject.com)
* CPU by Icon Lauk from the [Noun Project](https://thenounproject.com)
* database by Icons Bazaar from the [Noun Project](https://thenounproject.com)
* dial by Jamie Dickinson from the [Noun Project](https://thenounproject.com)
* GPS by mim studio from the [Noun Project](https://thenounproject.com)
* heater by Pascal Heß from the [Noun Project](https://thenounproject.com)
* Idea by Pause08 from the [Noun Project](https://thenounproject.com)
* IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)
* LED by abderraouf omara from the [Noun Project](https://thenounproject.com)
* ldr by Eucalyp from the [Noun Project](https://thenounproject.com)
* lightbulb by Maxim Kulikov from the [Noun Project](https://thenounproject.com)
* Microcontroller by Template from the [Noun Project](https://thenounproject.com)
* mobile phone by Alice-vector from the [Noun Project](https://thenounproject.com)
* motor by Bakunetsu Kaito from the [Noun Project](https://thenounproject.com)
* Patti Smith singing into a Shure SM58 (dynamic cardioid type) microphone. Beni Köhler / [Creative Commons Attribution-Share Alike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0/deed.en)
* Plant by Alex Muravev from the [Noun Project](https://thenounproject.com)
* Plant Cell by Léa Lortal from the [Noun Project](https://thenounproject.com)
* probe by Adnen Kadri from the [Noun Project](https://thenounproject.com)
* ram by Atif Arshad from the [Noun Project](https://thenounproject.com)
* Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
* recording by Aybige Speaker from the [Noun Project](https://thenounproject.com)
* Satellite by Noura Mbarki from the [Noun Project](https://thenounproject.com)
* smart sensor by Andrei Yushchenko from the [Noun Project](https://thenounproject.com)
* Speaker by Gregor Cresnar from the [Noun Project](https://thenounproject.com)
* switch by Chattapat from the [Noun Project](https://thenounproject.com)
* Temperature by Vectors Market from the [Noun Project](https://thenounproject.com)
* tomato by parkjisun from the [Noun Project](https://thenounproject.com)
* Watering Can by Daria Moskvina from the [Noun Project](https://thenounproject.com)
* weather by Adrien Coquet from the [Noun Project](https://thenounproject.com)

@ -6,11 +6,12 @@ In the lessons for each project, you may have created some of the following:
* A Resource Group
* An IoT Hub
* IoT device registrations
* A Storage Account
* A Functions App
* An Azure Maps account
* A custom vision project
* An Azure Container Registry
* A cognitive services resource
Most of these resources will have no cost - either they are completely free, or you are using a free tier. For services that require a paid tier, you would have been using them at a level that is included in the free allowance, or will only cost a few cents.

Binary image files changed (contents not shown).
@ -1011,6 +1011,614 @@
]
}
]
},
{
"id": 21,
"title": "Lesson 11 - Location tracking: Pre-Lecture Quiz",
"quiz": [
{
"questionText": "Your location can be defined using",
"answerOptions": [
{
"answerText": "Latitude only",
"isCorrect": "false"
},
{
"answerText": "Longitude only",
"isCorrect": "false"
},
{
"answerText": "Latitude and Longitude",
"isCorrect": "true"
}
]
},
{
"questionText": "The types of sensors that can track your location are called:",
"answerOptions": [
{
"answerText": "GPS",
"isCorrect": "true"
},
{
"answerText": "PGP",
"isCorrect": "false"
},
{
"answerText": "GIF",
"isCorrect": "false"
}
]
},
{
"questionText": "There is no value in tracking vehicle locations",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
}
]
},
{
"id": 22,
"title": "Lesson 11 - Location tracking: Post-Lecture Quiz",
"quiz": [
{
"questionText": "GPS data is sent from sensors using:",
"answerOptions": [
{
"answerText": "Latitude and Longitude coordinates",
"isCorrect": "false"
},
{
"answerText": "Addresses",
"isCorrect": "false"
},
{
"answerText": "NMEA sentences",
"isCorrect": "true"
}
]
},
{
"questionText": "To get a good GPS fix you need to receive signals from at least how many satellites",
"answerOptions": [
{
"answerText": "1",
"isCorrect": "false"
},
{
"answerText": "2",
"isCorrect": "false"
},
{
"answerText": "3",
"isCorrect": "true"
}
]
},
{
"questionText": "GPS sensors send data over:",
"answerOptions": [
{
"answerText": "SPI",
"isCorrect": "false"
},
{
"answerText": "UART",
"isCorrect": "true"
},
{
"answerText": "Email",
"isCorrect": "false"
}
]
}
]
},
{
"id": 23,
"title": "Lesson 12 - Store location data: Pre-Lecture Quiz",
"quiz": [
{
"questionText": "IoT data is stored by IoT Hub",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
},
{
"questionText": "Data can be divided into the following two types",
"answerOptions": [
{
"answerText": "Blob and table",
"isCorrect": "false"
},
{
"answerText": "Structured and unstructured",
"isCorrect": "true"
},
{
"answerText": "Red and blue",
"isCorrect": "false"
}
]
},
{
"questionText": "Serverless code can be used to write IoT data to a database",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
}
]
},
{
"id": 24,
"title": "Lesson 12 - Store location data: Post-Lecture Quiz",
"quiz": [
{
"questionText": "IoT data that needs to be processed immediately is on which path:",
"answerOptions": [
{
"answerText": "Hot",
"isCorrect": "true"
},
{
"answerText": "Warm",
"isCorrect": "false"
},
{
"answerText": "Cold",
"isCorrect": "false"
}
]
},
{
"questionText": "Azure storage has the following storage types:",
"answerOptions": [
{
"answerText": "Boxes, tubs, bins",
"isCorrect": "false"
},
{
"answerText": "Blob, table, queue and file",
"isCorrect": "true"
},
{
"answerText": "Hot, warm, cold",
"isCorrect": "false"
}
]
},
{
"questionText": "Azure Functions can be bound to a database to write return values to the database",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
}
]
},
{
"id": 25,
"title": "Lesson 13 - Visualize location data: Pre-Lecture Quiz",
"quiz": [
{
"questionText": "Very large tables of data are an easy way to quickly look up data",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
},
{
"questionText": "GPS data can be visualized on maps",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
},
{
"questionText": "On maps of large areas, the same distance drawn on the map represents the same distance in the real world no matter where on the map the measurement is taken",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
}
]
},
{
"id": 26,
"title": "Lesson 13 - Visualize location data: Post-Lecture Quiz",
"quiz": [
{
"questionText": "The service to draw maps on a web page is called:",
"answerOptions": [
{
"answerText": "Azure Maps",
"isCorrect": "true"
},
{
"answerText": "Azure Atlas",
"isCorrect": "false"
},
{
"answerText": "Azure World Visualizer",
"isCorrect": "false"
}
]
},
{
"questionText": "Azure Maps plots data using:",
"answerOptions": [
{
"answerText": "GeoJSON",
"isCorrect": "true"
},
{
"answerText": "Lists of latitude and longitude",
"isCorrect": "false"
},
{
"answerText": "Lists of addresses",
"isCorrect": "false"
}
]
},
{
"questionText": "Blobs can be retrieved via a URL",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
}
]
},
{
"id": 27,
"title": "Lesson 14 - Geofences: Pre-Lecture Quiz",
"quiz": [
{
"questionText": "GPS coordinates can be used to check if something is in a defined area",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
},
{
"questionText": "GPS is incredibly accurate so can indicate with precision of less than 1 m when a device enters a given area",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
},
{
"questionText": "Geofences are useful when tracking vehicles to:",
"answerOptions": [
{
"answerText": "Determine when a vehicle enters a given area only",
"isCorrect": "false"
},
{
"answerText": "Determine when a vehicle leaves a given area only",
"isCorrect": "false"
},
{
"answerText": "Determine when a vehicle enters or leaves a given area",
"isCorrect": "true"
}
]
}
]
},
{
"id": 28,
"title": "Lesson 14 - Geofences: Post-Lecture Quiz",
"quiz": [
{
"questionText": "To have multiple services pull data from an IoT Hub, you need to create multiple:",
"answerOptions": [
{
"answerText": "Consumer groups",
"isCorrect": "true"
},
{
"answerText": "Pipes",
"isCorrect": "false"
},
{
"answerText": "IoT Hubs",
"isCorrect": "false"
}
]
},
{
"questionText": "The default search buffer for a geofence call is:",
"answerOptions": [
{
"answerText": "5m",
"isCorrect": "false"
},
{
"answerText": "50m",
"isCorrect": "true"
},
{
"answerText": "500m",
"isCorrect": "false"
}
]
},
{
"questionText": "Points inside the geofence have a distance:",
"answerOptions": [
{
"answerText": "Less than 0 (a negative value)",
"isCorrect": "true"
},
{
"answerText": "Greater than 0 (a positive value)",
"isCorrect": "false"
}
]
}
]
},
{
"id": 29,
"title": "Lesson 15 - Train a fruit quality detector: Pre-Lecture Quiz",
"quiz": [
{
"questionText": "Cameras can be used as IoT sensors",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
},
{
"questionText": "Fruit can be sorted using cameras",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
},
{
"questionText": "Image-based AI models are incredibly complex and time-consuming to train, requiring hundreds of thousands of images:",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
}
]
},
{
"id": 30,
"title": "Lesson 15 - Train a fruit quality detector: Post-Lecture Quiz",
"quiz": [
{
"questionText": "The technique Custom Vision uses to train a model with only a few images is called:",
"answerOptions": [
{
"answerText": "Transformational learning",
"isCorrect": "false"
},
{
"answerText": "Transaction learning",
"isCorrect": "false"
},
{
"answerText": "Transfer learning",
"isCorrect": "true"
}
]
},
{
"questionText": "Image classifiers can be trained using:",
"answerOptions": [
{
"answerText": "Only 1 image per tag",
"isCorrect": "false"
},
{
"answerText": "At least 5 images per tag",
"isCorrect": "true"
},
{
"answerText": "At least 50 images per tag",
"isCorrect": "false"
}
]
},
{
"questionText": "The hardware that allows ML models to be trained quickly, as well as making the graphics on your Xbox look amazing, is called:",
"answerOptions": [
{
"answerText": "PGU",
"isCorrect": "false"
},
{
"answerText": "GPU",
"isCorrect": "true"
},
{
"answerText": "PUG",
"isCorrect": "false"
}
]
}
]
},
{
"id": 31,
"title": "Lesson 16 - Check fruit quality from an IoT device: Pre-Lecture Quiz",
"quiz": [
{
"questionText": "IoT devices are not powerful enough to use cameras:",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
},
{
"questionText": "Camera sensors use film to capture images",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
},
{
"questionText": "Camera sensors send which type of data",
"answerOptions": [
{
"answerText": "Digital",
"isCorrect": "true"
},
{
"answerText": "Analog",
"isCorrect": "false"
}
]
}
]
},
{
"id": 32,
"title": "Lesson 16 - Check fruit quality from an IoT device: Post-Lecture Quiz",
"quiz": [
{
"questionText": "A published version of a custom vision model is called an:",
"answerOptions": [
{
"answerText": "Iteration",
"isCorrect": "true"
},
{
"answerText": "Instance",
"isCorrect": "false"
},
{
"answerText": "Iguana",
"isCorrect": "false"
}
]
},
{
"questionText": "When images are sent for classification, they then become available to retrain the model:",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "true"
},
{
"answerText": "False",
"isCorrect": "false"
}
]
},
{
"questionText": "You don't need to use images captured from an IoT device to train the model as the cameras are as good quality as phone cameras:",
"answerOptions": [
{
"answerText": "True",
"isCorrect": "false"
},
{
"answerText": "False",
"isCorrect": "true"
}
]
}
]
}
]
}
