Merge branch 'microsoft:main' into main

Ricardo Zamudio 4 years ago committed by GitHub
commit 3443836963

@ -21,6 +21,7 @@
"mosquitto",
"photodiode",
"photodiodes",
"quickstart",
"sketchnote"
]
}

@ -0,0 +1,16 @@
# Getting started with IoT
In this section, you will be introduced to the Internet of Things, and learn basic concepts including building your first 'Hello World' IoT project that connects to the cloud. This project is a nightlight that lights up as light levels measured by a sensor drop.
![An LED connected to a WIO turning on and off as the light level changes](../../images/wio-running-assignment-1-1.gif)
## Topics
1. [Introduction to IoT](lessons/1-introduction-to-iot/README.md)
2. [A deeper dive into IoT](lessons/2-deeper-dive/README.md)
3. [Interact with the world with sensors and actuators](lessons/3-sensors-and-actuators/README.md)
4. [Connect your device to the Internet](lessons/4-connect-internet/README.md)
## Credits
All lessons were written with ♥️ by [Jim Bennett](https://GitHub.com/JimBobBennett)

@ -1,6 +1,6 @@
# Introduction to IoT
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-1.png)
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-1.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
@ -78,8 +78,6 @@ A single-board computer is a small computing device that has all the elements of
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory and input/output pins, but they have additional features such as a graphics chip to allow you to connect monitors, audio outputs, and USB ports to connect keyboards, mice, and other standard USB devices like webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of a memory chip built into the board.

@ -4,8 +4,6 @@ The [Raspberry Pi](https://raspberrypi.org) is a single-board computer. You can
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
## Setup
If you are using a Raspberry Pi as your IoT hardware, you have two choices - you can work through all these lessons and code directly on the Pi, or you can connect remotely to a 'headless' Pi and code from your computer.

@ -1,6 +1,6 @@
# <div dir="rtl"> مقدمة لإنترنت الأشياء </div>
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.png)
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.jpg)
> <div dir="rtl"> خريطة من <a href="https://github.com/nitya">Nitya Narasimhan</a> </div>
> <div dir="rtl"> اضغط على الصورة لتكبيرها </div>
@ -94,10 +94,6 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
<div dir="rtl">
يعد Raspberry Pi أحد أشهر أجهزة الكمبيوتر أحادية اللوحة.

@ -1,6 +1,6 @@
# Introduction to IoT
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.png)
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.jpg)
> Sketchnote by Nitya Narasimhan. Click the image for a larger version.
@ -79,8 +79,6 @@ The **T** in IoT stands for **Things** -
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers also have a CPU, memory, and input/output pins, but they have additional features such as a graphics chip that lets you connect monitors, audio outputs, and USB ports for connecting keyboards, mice, and other standard USB devices such as webcams or external storage. Programs are stored on an SD card or hard drive along with an operating system, rather than on a memory chip built into the board.

@ -1,6 +1,6 @@
# Introduction to the Internet of Things (IoT)
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.png)
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
@ -81,10 +81,6 @@ The **T** in IoT stands for **Things** - devices
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory, and input/output pins, but they have additional features such as a graphics chip that lets you connect monitors, audio outputs, and USB ports to connect keyboards, mice, and other standard USB devices such as webcams or external storage. Programs are stored with an operating system on an SD card or hard drive, rather than on a memory chip built into the board.

@ -0,0 +1,97 @@
# Introduction to IoT
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/1)
## Introduction
This lesson covers some of the introductory topics around the Internet of Things, and gets you set up with your hardware.
In this lesson we'll cover:
* [What is the 'Internet of Things'?](#what-is-the-internet-of-things)
* [IoT devices](#iot-devices)
* [Set up your device](#set-up-your-device)
* [Applications of IoT](#applications-of-iot)
* [Examples of IoT devices you may have around you](#examples-of-iot-devices-you-may-have-around-you)
## What is the 'Internet of Things'?
The term 'Internet of Things' was coined by [Kevin Ashton](https://wikipedia.org/wiki/Kevin_Ashton) in 1999, to refer to connecting the Internet to the physical world via sensors. Since then, the term has been used to describe any device that interacts with the physical world around it, either by gathering data from sensors or providing real-world interactions via actuators (devices that do something, such as turning on a switch or lighting an LED), and that is connected to other devices or the Internet.
> **Sensors** gather information from the world, such as measuring speed, temperature, or location.
>
> **Actuators** convert electrical signals into real-world interactions, such as triggering a switch, turning on a light, making a sound, or sending a control signal to other hardware, for example to turn on an electrical socket.
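To make sensors and actuators concrete, here is a minimal sketch of the nightlight described in the section README - an illustrative example in Arduino-style C++, where the pin numbers and threshold value are assumptions, not the wiring used in the assignments:

```cpp
#include <Arduino.h>

const int LIGHT_SENSOR_PIN = A0; // the sensor: an analog light sensor (assumed on pin A0)
const int LED_PIN = 5;           // the actuator: an LED (assumed on pin 5)

void setup() {
    pinMode(LED_PIN, OUTPUT);
}

void loop() {
    // Gather data from the environment using the sensor
    int lightLevel = analogRead(LIGHT_SENSOR_PIN);

    // Interact with the environment using the actuator:
    // turn the LED on when the light level drops below an arbitrary threshold
    if (lightLevel < 300) {
        digitalWrite(LED_PIN, HIGH);
    } else {
        digitalWrite(LED_PIN, LOW);
    }

    delay(100);
}
```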
IoT as a technology area is more than just devices - it includes cloud services that can process sensor data, or send requests to actuators connected to IoT devices. It also includes devices that don't have, or don't need, Internet connectivity, often referred to as *edge devices* - devices that can process and respond to sensor data themselves, usually using AI models trained in the cloud.
IoT is a fast-growing technology field. It is estimated that by the end of 2020, 30 billion IoT devices were deployed and connected to the Internet. Looking to the future, it is estimated that by 2025 IoT devices will be gathering almost 80 zettabytes of data, or 80 trillion gigabytes. That's a lot of data!
![A graph showing active IoT devices over time, with an upward trend from under 5 billion in 2015 to over 30 billion in 2025](../../../../images/connected-iot-devices.svg)
✅ Do some research: How much of the data generated by IoT devices is actually used, and how much is wasted? Why is so much data ignored?
This data is the key to the success of IoT. To be a successful IoT developer, you need to understand the data you need to gather, how to gather it, how to make decisions based on it, and how to use those decisions to interact with the physical world if needed.
## IoT devices
The **T** in IoT stands for **Things** - devices that interact with the physical world around them, either by gathering data from sensors or providing real-world interactions via actuators.
Devices for production or commercial use, such as consumer fitness trackers or industrial machine controllers, are usually custom made. They use custom circuit boards, maybe even custom processors, designed to meet the needs of a particular task, whether that's being small enough to fit on a wrist, or rugged enough to work in a high-temperature, high-stress, or high-vibration factory environment.
As a developer either learning about IoT or prototyping a device, you should start with a developer kit. These are general-purpose IoT devices designed for developers to use, often with features you wouldn't have on a production device, such as a set of external pins to connect sensors or actuators, hardware to support debugging, or additional resources that would add unnecessary cost when manufacturing at scale.
These developer kits usually fall into two categories - microcontrollers and single-board computers. These will be introduced here, and we'll go into more detail in the next lesson.
> 💁 Your phone can also be considered a general-purpose IoT device, with sensors and actuators built in, and different apps using the sensors and actuators in different ways with different cloud services. You can even find some IoT tutorials that use a phone app as an IoT device.
### Microcontrollers
A microcontroller (also referred to as an MCU, short for microcontroller unit) is a small computer consisting of:
🧠 One or more central processing units (CPUs) - the 'brain' of the microcontroller that runs your programs
💾 Memory (RAM and program memory) - where your programs, data, and variables are stored
🔌 Programmable input/output (I/O) connections - to talk to external peripherals (connected devices) such as sensors and actuators
Microcontrollers are usually low-cost computing devices, with average prices for those used in custom hardware dropping to around US$0.50, and some devices as cheap as US$0.03. Developer kits can start as low as US$4, with costs rising as you add more features. The [Wio Terminal](https://www.seeedstudio.com/Wio-Terminal-p-4509.html), a microcontroller developer kit from [Seeed studios](https://www.seeedstudio.com) that has sensors, actuators, WiFi, and a screen, costs around US$30.
![A Wio Terminal](../../../../images/wio-terminal.png)
> 💁 When searching the Internet for microcontrollers, be careful searching for the term **MCU**, as this will return lots of results for the Marvel Cinematic Universe, not microcontrollers.
Microcontrollers are designed to be programmed to do a limited number of very specific tasks, rather than being general-purpose computers like PCs or Macs. Except for very specific scenarios, you can't connect a monitor, keyboard, and mouse and use them for general-purpose tasks.
Microcontroller developer kits usually come with additional sensors and actuators on board. Most boards will have one or more LEDs you can program, along with other devices such as standard plugs to add more sensors or actuators using various manufacturer ecosystems, or built-in sensors (usually the most popular ones, such as temperature sensors). Some microcontrollers have built-in wireless connectivity such as Bluetooth or WiFi, or have additional microcontrollers on the board to add this connectivity.
> 💁 Microcontrollers are usually programmed in C/C++.
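As an illustration of what that C/C++ code looks like, here is the classic 'blink' program using the Arduino framework - a minimal sketch, assuming a board that defines the `LED_BUILTIN` constant:

```cpp
#include <Arduino.h>

void setup() {
    pinMode(LED_BUILTIN, OUTPUT); // configure the built-in LED pin as an output
}

void loop() {
    digitalWrite(LED_BUILTIN, HIGH); // LED on
    delay(500);
    digitalWrite(LED_BUILTIN, LOW);  // LED off
    delay(500);
}
```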
### Single-board computers
A single-board computer is a small computing device that has all the elements of a complete computer contained on a single small board. These are devices with specifications close to a desktop or laptop PC or Mac, running a full operating system, but small, using less power, and substantially cheaper.
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory, and input/output pins, but they have additional features such as a graphics chip to let you connect monitors, audio outputs, and USB ports to connect keyboards, mice, and other standard USB devices such as webcams or external storage. Programs are stored on SD cards or hard drives along with an operating system, instead of on a memory chip built into the board.
> 🎓 You can think of single-board computers as smaller, cheaper versions of a PC or Mac, with the addition of GPIO (general-purpose input/output) pins to interact with sensors and actuators.
Single-board computers are full-featured computers, so they can be programmed in any language. IoT devices are usually programmed in Python.
### Hardware choices for the rest of the lessons
All the subsequent lessons include assignments using an IoT device to interact with the physical world and communicate with the cloud. Each lesson supports 3 device choices - Arduino (using a Seeed Studios Wio Terminal), or a single-board computer, either a physical device (a Raspberry Pi 4) or a virtual single-board computer running on your PC or Mac.
You can read about the hardware needed to complete all the assignments in the [hardware guide](../../../hardware.md).
> 💁 You don't need to buy any IoT hardware to complete the assignments - you can do everything using a virtual single-board computer.
Which hardware you choose is up to you - it depends on what you have available at home or at school, and which programming language you know or plan to learn. Both hardware variants use the same sensor ecosystem, so if you start down one path you can easily switch to the other without having to replace most of the kit. The virtual single-board computer is equivalent to learning on a Raspberry Pi, with most of the code transferable to the Pi if you eventually get a device and sensors.

@ -1,6 +1,6 @@
# Introduction to the Internet of Things (IoT)
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-1.png)
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-1.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
@ -78,8 +78,6 @@ The **T** in IoT stands for **Things** - devices that can interact with the physical
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The Raspberry Pi is one of the most popular single-board computers.
Like a microcontroller, single-board computers have a CPU, memory, and input/output pins, but they have additional features such as a graphics chip to let you connect displays, audio outputs, and USB ports to connect keyboards, mice, and other standard USB devices such as webcams or external storage. Programs are stored on an SD card or hard drive along with an operating system, instead of on a memory chip built into the board.

@ -4,8 +4,6 @@
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi - Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
## Setup
If you are using a Raspberry Pi as your IoT hardware, you have two choices - you can work through all these lessons and code directly on the Pi, or you can connect remotely to a 'headless' Pi and code from your computer.

@ -0,0 +1,201 @@
# Wio Terminal
The Wio Terminal from [Seeed Studios](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) is an Arduino-compatible microcontroller with WiFi connectivity and some built-in sensors and actuators, along with ports to add more sensors and actuators, using a hardware ecosystem called [Grove](https://www.seeedstudio.com/category/Grove-c-1003.html).
![A Seeed studios Wio Terminal](../../../images/wio-terminal.png)
## Setup
To use the Wio Terminal, you need to install some free software on your computer. You also need to update the Wio Terminal's firmware before it can connect to WiFi.
### Task - setup
First, install the required software and update the firmware.
1. Install Visual Studio Code (VS Code). This is the editor you will use to write your device code in C/C++. Refer to the [VS Code documentation](https://code.visualstudio.com?WT.mc_id=academic-17441-jabenn) for more details.
> 💁 Another good IDE for Arduino development is the [Arduino IDE](https://www.arduino.cc/en/software). If you already have experience with this IDE, you can use it instead of VS Code and PlatformIO. However, these lessons are based on VS Code.
1. Install the VS Code PlatformIO extension. Refer to the [PlatformIO extension documentation](https://marketplace.visualstudio.com/items?itemName=platformio.platformio-ide&WT.mc_id=academic-17441-jabenn) for instructions on installing it in VS Code. This is a VS Code extension that supports programming microcontrollers in C/C++. It relies on the Microsoft C/C++ extension for working with C and C++ code; note that the C/C++ extension is installed automatically when you install PlatformIO.
1. Connect your Wio Terminal to your computer. It has a USB-C port on the bottom, which needs to be connected to a USB port on your computer. The Wio Terminal comes with a USB-C to USB-A cable, so if your computer only has USB-C ports, you will need either a USB-C cable or a USB-A to USB-C adapter.
1. Follow the instructions in the [Wio Terminal Wiki WiFi Overview documentation](https://wiki.seeedstudio.com/Wio-Terminal-Network-Overview/) to set up your Wio Terminal and update the firmware.
## Hello World
Traditionally, when starting out with a new programming language or technology, you write a 'Hello World' application - a small application that outputs the text `"Hello World"` - to show that all your tools are working correctly.
Your Hello World app for the Wio Terminal will confirm that Visual Studio Code is installed correctly with PlatformIO, and is ready for microcontroller development.
### Create a PlatformIO project
The first step is to create a new project using PlatformIO, configured for the Wio Terminal.
#### Task - create a PlatformIO project
Create a PlatformIO project.
1. Connect the Wio Terminal to your computer.
1. Launch VS Code.
1. The PlatformIO icon will appear in the side menu bar:
![The Platform IO menu option](../../../images/vscode-platformio-menu.png)
Select this menu item, then select *PIO Home -> Open*
![The Platform IO open option](../../../images/vscode-platformio-home-open.png)
1. From the welcome screen, select the **+ New Project** button.
![The new project button](../../../images/vscode-platformio-welcome-new-button.png)
1. Configure the project in the *Project Wizard*:
    1. Name the project `nightlight`.
    1. From the *Board* dropdown, type `WIO` to filter the boards, and select *Seeeduino Wio Terminal*.
    1. Leave the *Framework* as *Arduino*.
    1. Either leave *Use default location* checked, or uncheck it and select a location for your project.
    1. Select the **Finish** button.
![The completed project wizard](../../../images/vscode-platformio-nightlight-project-wizard.png)
PlatformIO will then download the components it needs to compile code for the Wio Terminal, and create your project. This can take a few minutes.
### Investigate the PlatformIO project
The VS Code explorer will show a number of files and folders created by the PlatformIO wizard.
#### Folders
* `.pio` - this folder contains temporary data that PlatformIO needs, such as libraries or compiled code. It is recreated automatically if deleted, and you don't need to include it in source code control if you share your project on a site such as GitHub.
* `.vscode` - this folder contains the configuration used by PlatformIO and VS Code. It is recreated automatically if deleted, and you don't need to include it in source code control if you share your project on a site such as GitHub.
* `include` - this folder is for external header files needed when adding additional libraries to your code. You won't be using this folder in any of these lessons.
* `lib` - this folder is for external libraries that you want to call from your code. You won't be using this folder in any of these lessons.
* `src` - this folder contains the main source code for your application. Initially it contains a single file - `main.cpp`.
* `test` - this folder is where you would put any unit tests for your code.
#### Files
* `main.cpp` - this file in the `src` folder is the entry point for your application. Open this file, and it will contain:
```cpp
#include <Arduino.h>
void setup() {
    // put your setup code here, to run once:
}

void loop() {
    // put your main code here, to run repeatedly:
}
```
When the device starts up, the Arduino framework runs the `setup` function once, then runs the `loop` function over and over again until the device is powered off.
* `.gitignore` - this file lists the files and directories to be ignored when adding your code to git source code control, such as when uploading to a repository on GitHub.
* `platformio.ini` - this file contains the configuration for your device and app. Open this file, and it will contain:
```ini
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
```
The `[env:seeed_wio_terminal]` section has the configuration for the Wio Terminal. You can have multiple `env` sections so your code can be compiled for more than one board (see the example after the list below).
The other values match the configuration from the project wizard:
* `platform = atmelsam` defines the hardware that the Wio Terminal uses (an ATSAMD51-based microcontroller)
* `board = seeed_wio_terminal` defines the type of microcontroller board (the Wio Terminal)
* `framework = arduino` defines that the project uses the Arduino framework.
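As an illustration of the multi-board idea, here is a sketch of what a `platformio.ini` with two environments could look like. The second environment is a hypothetical addition, using PlatformIO's standard ids for an Arduino Uno; the lesson itself only needs the Wio Terminal entry.

```ini
; The environment created by the Project Wizard
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino

; A hypothetical second environment, so the same code
; could also be built for an Arduino Uno
[env:uno]
platform = atmelavr
board = uno
framework = arduino
```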
### Write the Hello World app
You are now ready to write the Hello World app.
#### Task - write the Hello World app
Write the Hello World app.
1. Open the `main.cpp` file in VS Code.
1. Change the code to match the following:
```cpp
#include <Arduino.h>
void setup()
{
    // Start the serial connection to the computer at 9600 baud
    Serial.begin(9600);

    while (!Serial)
        ; // Wait for Serial to be ready

    delay(1000);
}

void loop()
{
    // Send a line of text over the serial connection, then wait 5 seconds
    Serial.println("Hello World");
    delay(5000);
}
```
The `setup` function initializes a connection to the serial port - in this case, the USB port that connects the Wio Terminal to your computer. The `9600` parameter is the [baud rate](https://wikipedia.org/wiki/Symbol_rate) (also known as the symbol rate), the speed at which data is sent over the serial port in bits per second. This setting means 9,600 bits (0s and 1s) of data are sent each second. The function then waits for the serial port to be ready.
The `loop` function sends the characters of the line `Hello World`, plus a new line character, to the serial port. It then sleeps for 5,000 milliseconds, or 5 seconds. After the loop ends, it runs again, and keeps running as long as the microcontroller is powered on.
1. Build and upload the code to the Wio Terminal.
    1. Open the VS Code command palette.
    1. Type `PlatformIO Upload` to find the upload option, then select *PlatformIO: Upload*.
![The PlatformIO upload option in the command palette](../../../images/vscode-platformio-upload-command-palette.png)
PlatformIO will automatically build the code if needed, before uploading it.
1. The code will be compiled and uploaded to the Wio Terminal.
> 💁 If you are using macOS, a *DISK NOT EJECTED PROPERLY* notification will appear. This is because the Wio Terminal gets mounted as a drive as part of the flashing process, and it is disconnected when the compiled code is written to the device. You can ignore this notification.
⚠️ If you get an error saying the upload port is unavailable, first check that the Wio Terminal is connected to your computer and the switch on the left-hand side of the screen is turned on - the green light on the bottom should be on. If you still get the error, pull the on/off switch down twice in quick succession to force the Wio Terminal into bootloader mode, then try the upload again.
PlatformIO has a serial monitor that can watch the data sent over the USB port from the Wio Terminal. This lets you monitor the data sent by the `Serial.println("Hello World");` command.
1. Open the VS Code command palette.
1. Type `PlatformIO Serial` to find the serial monitor option, then select *PlatformIO: Serial Monitor*.
![The PlatformIO Serial Monitor option in the command palette](../../../images/vscode-platformio-serial-monitor-command-palette.png)
A new terminal will open, showing the data being sent over the serial port:
```output
> Executing task: platformio device monitor <
--- Available filters and text transformations: colorize, debug, default, direct, hexlify, log2file, nocontrol, printable, send_on_enter, time
--- More details at http://bit.ly/pio-monitor-filters
--- Miniterm on /dev/cu.usbmodem101 9600,8,N,1 ---
--- Quit: Ctrl+C | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
Hello World
Hello World
```
`Hello World` will be printed to the serial monitor every 5 seconds.
> 💁 You can find this code in the [code/wio-terminal](code/wio-terminal) folder.
😀 Your 'Hello World' program was a success!

@ -1,6 +1,6 @@
# A deeper dive into IoT
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-2.png)
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-2.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
@ -26,16 +26,12 @@ The two components of an IoT application are the *Internet* and the *thing*. Let
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but consuming sometimes so little power they can run for weeks, months or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example of this is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the temperature detected is outside the desired range. The temperature sensor detects that the room is too cold and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There are a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it, and actuators to interact with the world - for example using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
@ -52,14 +48,10 @@ With the example of a smart thermostat, the thermostat would connect using home
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices such as occupancy sensors that detect what rooms are in use, as well as data such as weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar you are on vacation, or turn off the heating on a room by room basis depending on what rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other data could help make an Internet connected thermostat smarter?
### IoT on the Edge
@ -96,8 +88,6 @@ The faster the clock cycle, the more instructions that can be processed each sec
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal for example has a CPU that runs at 120MHz or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple GigaHertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio terminal.
@ -212,8 +202,6 @@ The [Raspberry Pi Foundation](https://www.raspberrypi.org) is a charity from the
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz, 2, 4, or 8GB of RAM, gigabit ethernet, WiFi, 2 HDMI ports supporting 4k screens, an audio and composite video output port, USB ports (2 USB 2.0, 2 USB 3.0), 40 GPIO pins, a camera connector for a Raspberry Pi camera module, and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm and is powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also a Pi400 all in one computer with a Pi4 built into a keyboard.

@ -1,6 +1,6 @@
# <div dir="rtl">التعمق أكثر بإنترنت الأشياء</div>
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-2.png)
![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-2.jpg)
> <div dir="rtl"> خريطة من <a href="https://github.com/nitya">Nitya Narasimhan</a> </div>
> <div dir="rtl"> اضغط على الصورة لتكبيرها </div>
@ -32,16 +32,12 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Thing** part of IoT refers to a device that can interact with the physical world. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC), but sometimes consuming so little power that they can run for weeks, months, or even years on batteries.
These devices interact with the physical world, either by using sensors to gather data from their surroundings or by controlling outputs or actuators to make physical changes. The typical example is a smart thermostat - a device that has a temperature sensor, a means to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that can be turned on when the detected temperature is outside the desired range. The temperature sensor detects that the room is too cold, and an actuator turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There is a huge range of different things that can act as IoT devices, from dedicated hardware that senses one thing, to general-purpose devices, even your smartphone! A smartphone can use sensors to detect the world around it and actuators to interact with the world - for example, using a GPS sensor to detect your location and a speaker to give you navigation instructions to a destination.
✅ Think of other systems you have around you that read data from a sensor and use that to make decisions. One example would be the thermostat on an oven. Can you find more?
@ -59,15 +55,11 @@
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../../images/mobile-controlled-thermostat.png)
***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud with data from other sensors connected to other IoT devices, such as occupancy sensors that detect which rooms are in use, as well as data such as the weather and even your calendar, to make decisions on how to set the temperature in a smart fashion. For example, it could turn your heating off if it reads from your calendar that you are on vacation, or turn off the heating on a room-by-room basis depending on which rooms you use, learning from the data to be more and more accurate over time.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../../images/smarter-thermostat.png)
***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other data could help make an Internet-connected thermostat smarter?
@ -114,8 +106,6 @@
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal, for example, has a CPU that runs at 120MHz, or 120,000,000 cycles per second.
✅ An average PC or Mac has a CPU with multiple cores running at multiple gigahertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio Terminal.
@ -237,8 +227,6 @@
![A Raspberry Pi 4](../../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest iteration of the full-size Raspberry Pi is the Raspberry Pi 4B. This has a quad-core (4 core) CPU running at 1.5GHz; 2, 4, or 8GB of RAM; gigabit ethernet; WiFi; 2 HDMI ports supporting 4K screens; an audio and composite video output port; USB ports (2 USB 2.0, 2 USB 3.0); 40 GPIO pins; a camera connector for a Raspberry Pi camera module; and an SD card slot. All this on a board that is 88mm x 58mm x 19.5mm, powered by a 3A USB-C power supply. These start at US$35, much cheaper than a PC or Mac.
> 💁 There is also the Pi 400, an all-in-one computer with a Pi 4 built into a keyboard.

@ -1,7 +1,5 @@
# A deeper dive into IoT
![Embed a video here if available](video-url)
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/3)
@ -24,16 +22,12 @@
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The **Things** part of IoT refers to devices that can interact with the world around them. These devices are usually small, low-priced computers, running at low speeds and using low power - for example, simple microcontrollers with kilobytes of RAM (as opposed to gigabytes in a PC) running at only a few hundred megahertz (as opposed to gigahertz in a PC). Sometimes they use so little power that they can run for weeks, months, or even years on batteries.
These devices interact with the world around them, either by using sensors to gather data or by controlling outputs or actuators to make something happen. A common example is a smart thermostat - a device that has a temperature sensor, a way to set a desired temperature such as a dial or touchscreen, and a connection to a heating or cooling system that turns on when the temperature goes outside the range the user has set. For example, the temperature sensor detects that the room is too cold, and an actuator then turns the heating on.
![A diagram showing temperature and a dial as inputs to an IoT device, and control of a heater as an output](../../../images/basic-thermostat.png)
***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
There is a huge range of things that can act as IoT devices, from dedicated hardware that senses one thing, to general-purpose devices, even your smartphone! A smartphone uses sensors to gather information about the world around it, and actuators to interact with the world - for example, a GPS sensor to detect your location and a speaker to give you directions to a destination.
✅ Think of other systems around you that gather data from a sensor and use it to make decisions. One example would be the thermostat on an oven. Can you find more?
@ -50,14 +44,10 @@
![A diagram showing temperature and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, and control of a heater as an output from the IoT device](../../../images/mobile-controlled-thermostat.png)
***An Internet-connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
An even smarter version could use AI in the cloud, with data from sensors connected to other IoT devices - such as occupancy sensors - along with data such as the current weather or what is on your personal calendar, to decide what the temperature should be. For example, if your calendar says you are travelling today, there is no need to run the heater in your room in winter, and the IoT system can make that smart decision. AI models can also decide based on which rooms you use and when, and over time, as more data is gathered, these decisions become more and more accurate.
![A diagram showing multiple temperature sensors and a dial as inputs to an IoT device, the IoT device with 2 way communication to the cloud, which in turn has 2 way communication to a phone, a calendar and a weather service, and control of a heater as an output from the IoT device](../../../images/smarter-thermostat.png)
***An Internet-connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
✅ What other kinds of data could help make an Internet-connected thermostat smarter?
### IoT on the Edge
@ -91,8 +81,6 @@
![The fetch decode execute cycles showing the fetch taking an instruction from the program stored in RAM, then decoding and executing it on a CPU](../../../images/fetch-decode-execute.png)
***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal, for example, has a CPU that runs at 120MHz, or 120,000,000 cycles per second.
✅ An average PC or Mac has multiple cores running at gigahertz speeds, meaning the clock ticks billions of times a second. Find the clock speed of your computer and work out how many times faster it is than the Wio Terminal.
@ -207,8 +195,6 @@ Review the Wio Terminal.
![A Raspberry Pi 4](../../../images/raspberry-pi-4.jpg)
***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
The latest version of the full-size Raspberry Pi is the Raspberry Pi 4B. It has a quad-core (4 core) CPU running at 1.5GHz; 2, 4, or 8GB of RAM; gigabit ethernet; WiFi; 2 HDMI ports supporting 4K screens; an audio and composite video output port; USB ports (2 USB 2.0, 2 USB 3.0); 40 GPIO pins; a camera connector for the Raspberry Pi camera module; and an SD card slot. All of this on a board measuring 88mm x 58mm x 19.5mm, powered by a 3A USB-C power supply. Raspberry Pis start at US$35, much cheaper than a PC or Mac.
> 💁 There is also an "all-in-one" computer called the Pi 400, with a Pi 4 built into its keyboard.

@ -1,8 +1,8 @@
# Interact with the physical world with sensors and actuators
Add a sketchnote if possible/appropriate
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-3.jpg)
![Embed a video here if available](video-url)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -60,8 +60,6 @@ One example of this is a potentiometer. This is a dial that you can rotate betwe
![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png)
***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), such as a volume knob on an amplifier. When the potentiometer is in the full off position (0) then 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.
> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
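As a sketch of how this reads from code, here is an illustrative Arduino-style example, assuming a potentiometer wired to analog pin `A0` of a 5V board with a 10-bit analog to digital converter (the pin and board are assumptions, not the lesson's wiring):

```cpp
#include <Arduino.h>

const int POT_PIN = A0; // the potentiometer's output (assumed wiring)

void setup() {
    Serial.begin(9600);
}

void loop() {
    int raw = analogRead(POT_PIN);       // 0-1023 for 0V-5V on a 10-bit ADC
    float volts = raw * (5.0 / 1023.0);  // convert the reading back to a voltage
    int dial = map(raw, 0, 1023, 0, 11); // scale it to the 0-11 'volume knob' range

    Serial.print(volts);
    Serial.print("V -> dial position ");
    Serial.println(dial);
    delay(500);
}
```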
@ -88,8 +86,6 @@ The simplest digital sensor is a button or switch. This is a sensor with two sta
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png)
***A button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
Pins on IoT devices such as GPIO pins can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal, it can only be 1 or 0.
> 💁 Voltages are never exact especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work on 3.3V, and read a return signal above 1.8V as a 1, below 1.8V as 0.
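A sketch of reading such a two-state sensor, assuming a push button wired between digital pin 2 and ground, using the board's internal pull-up resistor (an illustrative setup, so the pin reads 1 when released and 0 when pressed):

```cpp
#include <Arduino.h>

const int BUTTON_PIN = 2; // the button (assumed wiring: pin 2 to ground)

void setup() {
    Serial.begin(9600);
    pinMode(BUTTON_PIN, INPUT_PULLUP); // internal pull-up: reads 1 released, 0 pressed
}

void loop() {
    if (digitalRead(BUTTON_PIN) == LOW) { // pressed: the pin is pulled to 0V
        Serial.println("Button pressed");
    }
    delay(100);
}
```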
@ -101,8 +97,6 @@ More advanced digital sensors read analog values, then convert them using on-boa
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera. This is a sensor that captures an image and sends it as digital data containing that image, usually in a compressed format such as JPEG, to be read by the IoT device. It can even stream video by capturing images and sending either the complete image frame by frame or a compressed video stream.
## What are actuators?
@ -125,8 +119,6 @@ Follow the relevant guide below to add an actuator to your IoT device, controlle
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-flow.png)
***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
@ -143,8 +135,6 @@ One example is a dimmable light, such as the ones you might have in your house.
![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Like with sensors, the actual IoT device works on digital signals, not analog. This means to send an analog signal, the IoT device needs a digital to analog converter (DAC), either on the IoT device directly, or on a connector board. This will convert the 0s and 1s from the IoT device to an analog voltage that the actuator can use.
✅ What do you think happens if the IoT device sends a higher voltage than the actuator can handle? ⛔️ DO NOT test this out.
@ -159,8 +149,6 @@ Imagine you are controlling a motor with a 5V supply. You send a short pulse to
![Pulse width modulation rotation of a motor at 150 RPM](../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means in one second you have 25 5V pulses of 0.02s that rotate the motor, each followed by 0.02s pause of 0V not rotating the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 ([revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute), a non-standard measure of rotational velocity).
```output
@ -172,8 +160,6 @@ This means in one second you have 25 5V pulses of 0.02s that rotate the motor, e
![Pulse width modulation rotation of a motor at 75 RPM](../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
You can change the motor speed by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04s, with the on pulse halved to 0.01s, and the off pulse increasing to 0.03s. You have the same number of pulses per second (25), but each on pulse is half the length. A half length pulse only turns the motor one twentieth of a rotation, and at 25 pulses a second will complete 1.25 rotations per second or 75rpm. By changing the pulse speed of a digital signal you've halved the speed of an analog motor.
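A sketch of what this looks like with the Arduino `analogWrite` call, which sets a PWM duty cycle as a value from 0 to 255. The pin number is an assumption, and a board's PWM frequency is fixed rather than the 0.04s cycle used in the example above, but the duty-cycle idea is the same:

```cpp
#include <Arduino.h>

const int MOTOR_PIN = 5; // PWM-capable pin driving a motor controller (assumed)

void setup() {
    pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
    analogWrite(MOTOR_PIN, 128); // 128/255 ≈ 50% duty cycle - on for half of each cycle
    delay(5000);
    analogWrite(MOTOR_PIN, 64);  // 64/255 ≈ 25% duty cycle - halving the on pulse halves the speed
    delay(5000);
}
```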
```output
@ -195,8 +181,6 @@ One simple digital actuator is an LED. When a device sends a digital signal of 1
![A LED is off at 0 volts and on at 5V](../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other simple 2-state actuators can you think of? One example is a solenoid, which is an electromagnet that can be activated to do things like move a door bolt, locking or unlocking a door.
More advanced digital actuators, such as screens, require the digital data to be sent in certain formats. They usually come with libraries that make it easier to send the correct data to control them.

@ -0,0 +1,211 @@
# <div dir="rtl">تفاعل مع العالم المادي باستخدام المستشعرات والمحركات</div>
## <div dir="rtl"> اختبار ما قبل المحاضرة </div>
[<div dir="rtl"> اختبار ما قبل المحاضرة </div>](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/5)
## <div dir="rtl"> المقدمة </div>
<div dir="rtl">
يقدم هذا الدرس اثنين من المفاهيم الهامة لجهاز إنترنت الأشياء الخاص بك - أجهزة الاستشعار والمشغلات. ستحصل أيضًا على تدريب عملي مع كليهما ، بإضافة مستشعر الضوء إلى مشروع إنترنت الأشياء الخاص بك ، ثم إضافة مؤشر LED يتم التحكم فيه بواسطة مستويات الإضاءة ، مما يؤدي إلى بناء ضوء ليلي بشكل فعال.
</br>
سنغطي في هذا الدرس:
* [ما هي المستشعرات؟](#what-are-sensors)
* [استخدم جهاز استشعار](#use-a-sensor)
* [أنواع أجهزة الاستشعار](#sensor-types)
* [ما هي المحركات؟](#what-are-actuators)
* [استخدم مشغل](#use-an-actuator)
* [أنواع المحركات](#actuator-types)
## What are sensors?
Sensors are hardware devices that sense the physical world - that is, they measure one or more properties around them and send the information to an IoT device. Sensors cover a huge range of devices, as there are so many things that can be measured, from natural properties such as air temperature to physical interactions such as movement.
Some common sensors include:
* Temperature sensors - these sense the air temperature, or the temperature of whatever they are immersed in. For hobbyists and developers, these are often combined with air pressure and humidity in a single sensor.
* Buttons - these sense when they have been pressed.
* Light sensors - these detect light levels, and can be for specific colors, UV light, IR light, or general visible light.
* Cameras - these sense a visual representation of the world by taking a photograph or streaming video.
* Accelerometers - these sense movement in multiple directions.
* Microphones - these sense sound, either general sound levels or directional sound.
✅ Do some research. What sensors does your phone have?
All sensors have one thing in common - they convert whatever they sense into an electrical signal that can be interpreted by an IoT device. How this electrical signal is interpreted depends on the sensor, as well as the communication protocol used to talk to the IoT device.
## Use a sensor
Follow the relevant guide below to add a sensor to your IoT device:
* [Arduino - Wio Terminal](wio-terminal-sensor.md)
* [كمبيوتر ذو لوحة واحدة - Raspberry Pi](pi-sensor.md)
* [كمبيوتر ذو لوحة واحدة - جهاز افتراضي](virtual-device-sensor.md)
## Sensor types
Sensors are either analog or digital.
### Analog sensors
Some of the most basic sensors are analog sensors. These sensors receive a voltage from the IoT device, the sensor components adjust this voltage, and the voltage that is returned from the sensor is measured to give the sensor value.
> 🎓 Voltage is a measure of how much push there is to move electricity from one place to another, such as from the positive terminal of a battery to the negative terminal. For example, a standard AA battery is 1.5V (V is the symbol for volts) and can push electricity with the force of 1.5V from its positive terminal to its negative terminal. Different electrical hardware requires different voltages to work; for example, an LED can light with between 2 and 3V, but a 100W filament bulb would need 240V. You can read more about voltage on the [voltage Wikipedia page](https://wikipedia.org/wiki/Voltage).
One example of this is a potentiometer. This is a dial that you can rotate between two positions, and the sensor measures the rotation.
![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../../images/potentiometer.png)
The IoT device will send an electrical signal to the potentiometer at a voltage, such as 5 volts (5V). As the potentiometer is adjusted, it changes the voltage that comes out of the other side. Imagine you have a potentiometer labelled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), such as a volume knob on an amplifier. When the potentiometer is in the full off position (0), 0V (0 volts) will come out. When it is in the full on position (11), 5V (5 volts) will come out.
> 🎓 This is an oversimplification, and you can read more on potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
The voltage that comes out of the sensor is then read by the IoT device, and the device can respond to it. Depending on the sensor, this voltage can be an arbitrary value, or can map to a standard unit. For example, an analog temperature sensor based on a [thermistor](https://wikipedia.org/wiki/Thermistor) changes its resistance depending on the temperature. The output voltage can then be converted to a temperature in Kelvin, and from there to °C or °F, by calculations in code.
✅ What do you think happens if the sensor returns a higher voltage than was sent (for example, coming from an external power supply)? ⛔️ DO NOT test this out.
#### Analog to digital conversion

IoT devices are digital - they can't work with analog values, only with 0s and 1s. This means analog sensor values need to be converted to a digital signal before they can be processed. Many IoT devices have analog-to-digital converters (ADCs) to convert analog inputs to digital representations of their value. Sensors can also work with an ADC via a connector board. For example, in the Seeed Grove ecosystem with a Raspberry Pi, analog sensors connect to specific ports on a 'hat' that sits on the Pi and connects to the Pi's GPIO pins. This hat has an ADC to convert the voltage into a digital signal that can be sent over the Pi's GPIO pins.

Imagine you have an analog light sensor connected to an IoT device that uses 3.3V, and it returns a value of 1V. This 1V doesn't mean anything in the digital world, so it needs to be converted. The voltage will be converted to a value using a scale that depends on the device and the sensor. One example is the Seeed Grove light sensor, which outputs values from 0 to 1,023. For this sensor running at 3.3V, a 1V output would be a value of 300. An IoT device can't handle 300 as an analog value, so the value will be converted to `0000000100101100`, the binary representation of 300, by the Grove hat. This is then processed by the IoT device.

✅ If you don't know binary, then do a small amount of research to learn how numbers are represented by 0s and 1s. The [BBC Bitesize introduction to binary lesson](https://www.bbc.co.uk/bitesize/guides/zwsbwmn/revision/1) is a great place to start.

From a coding perspective, all this is usually handled by libraries that come with the sensors, so you don't need to worry about this conversion yourself. For the Grove light sensor you would use the Python library and call the `light` property, or use the Arduino library and call `analogRead`, to get a value of 300.
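As a rough illustration of how little of this conversion you see in code, here is a minimal Python sketch of reading that value on a Raspberry Pi (an assumption for illustration: the Seeed Grove Python libraries are installed, with a light sensor on analog port A0):

```python
# A minimal sketch, assuming the Seeed Grove Python libraries are installed
# and a Grove light sensor is connected to analog port A0 on the Base Hat.
from grove.grove_light_sensor_v1_2 import GroveLightSensor

# The Base Hat's ADC converts the sensor voltage to a value from 0-1,023
light_sensor = GroveLightSensor(0)

# The 'light' property returns the converted digital value, e.g. 300 for 1V at 3.3V
print('Light level:', light_sensor.light)
```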
### Digital sensors

Digital sensors, like analog sensors, detect the world around them using changes in electrical voltage. The difference is they output a digital signal, either by only measuring two states, or by using a built-in ADC. Digital sensors are becoming more and more common to avoid the need for an ADC either on a connector board or on the IoT device itself.

The simplest digital sensor is a button or switch. This is a sensor with two states, on or off.

![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../../images/button.png)

Pins on IoT devices, such as GPIO pins, can measure this signal directly as a 0 or 1. If the voltage sent is the same as the voltage returned, the value read is 1, otherwise the value read is 0. There is no need to convert the signal - it can only be 1 or 0.

> 💁 Voltages are never exact, especially as the components in a sensor will have some resistance, so there is usually a tolerance. For example, the GPIO pins on a Raspberry Pi work at 3.3V, and read a return signal above 1.8V as a 1, and below 1.8V as a 0.

* 3.3V goes into the button. The button is off, so 0V comes out, giving a value of 0
* 3.3V goes into the button. The button is on, so 3.3V comes out, giving a value of 1
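Here is a minimal sketch of reading a two-state digital sensor like this in Python on a Raspberry Pi (an assumption for illustration: the `gpiozero` library, with a button wired to GPIO pin 5):

```python
# A minimal sketch, assuming the gpiozero library (preinstalled on Raspberry Pi OS)
# and a button wired between GPIO pin 5 and ground.
from gpiozero import Button
from time import sleep

button = Button(5)  # gpiozero configures the pin and pull-up resistor for you

while True:
    # is_pressed is True (1) when the voltage threshold is crossed, False (0) otherwise
    print('Pressed' if button.is_pressed else 'Not pressed')
    sleep(1)
```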
More advanced digital sensors read analog values, then convert them using built-in ADCs to digital signals. For example, a digital temperature sensor will still use a thermocouple in the same way as an analog sensor, and will still measure the change in voltage caused by the resistance of the thermocouple at the current temperature. Instead of returning an analog value and relying on the device or a connector board to convert it to a digital signal, an ADC built into the sensor will convert the value and send it as a series of 0s and 1s to the IoT device. These 0s and 1s are sent in the same way as the digital signal for a button, with 1 being full voltage and 0 being 0V.

![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../../images/temperature-as-digital.png)

Sending digital data allows sensors to become more complex and send more detailed data, even encrypted data for secure sensors. One example is a camera. This is a sensor that takes a picture and sends it as digital data containing that image, usually in a compressed format such as JPEG, to be read by the IoT device. It can even stream video by taking pictures and sending either the complete image frame by frame, or a compressed video stream.
## What are actuators?

Actuators are the opposite of sensors - they convert an electrical signal from your IoT device into an interaction with the physical world, such as emitting light or sound, or moving a motor.

Some common actuators include:

* LED - these emit light when turned on
* Speaker - these emit sound based on the signal sent to them, from a basic buzzer to an audio speaker that can play music
* Stepper motor - these convert a signal into a defined amount of rotation, such as turning a dial 90°
* Relay - these are switches that can be turned on or off by an electrical signal. They allow a small voltage from an IoT device to switch larger voltages.
* Screens - these are more complex actuators and show information on a multi-segment display. Screens vary from simple LED displays to high-resolution video monitors.

✅ Do some research. What actuators does your phone have?
## Use an actuator

Follow the relevant guide below to add an actuator to your IoT device, controlled by the sensor, to build an IoT nightlight. It will gather light levels from the light sensor, and use an actuator in the form of an LED to emit light when the detected light level is too low.

![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../../images/assignment-1-flow.png)

* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
## Actuator types

Like sensors, actuators are either analog or digital.

### Analog actuators

Analog actuators take an analog signal and convert it into some kind of interaction, where the interaction changes based on the voltage supplied.

One example is a dimmable light, such as the ones you might have in your house. The amount of voltage supplied to the light determines how bright it is.

![A light dimmed at a low voltage and brighter at a higher voltage](../../../../images/dimmable-light.png)

Like with sensors, the actual IoT device works on digital signals, not analog. This means that to send an analog signal, the IoT device needs a digital-to-analog converter (DAC), either on the IoT device directly, or on a connector board. This will convert the 0s and 1s from the IoT device to an analog voltage that the actuator can use.

✅ What do you think happens if the IoT device sends a higher voltage than the actuator can handle? ⛔️ DO NOT test this out.
#### Pulse width modulation

Another option for converting digital signals from an IoT device to an analog signal is pulse width modulation (PWM). This involves sending lots of short digital pulses that act as if they were an analog signal.

For example, you can use PWM to control the speed of a motor.

Imagine you are controlling a motor with a 5V supply. You send a short pulse to your motor, switching the voltage to high (5V) for two hundredths of a second (0.02s). In that time your motor can rotate one tenth of a rotation, or 36°. The signal then pauses for two hundredths of a second (0.02s), sending a low signal (0V). Each cycle of on then off lasts 0.04s. The cycle then repeats.

![Pulse width modulation rotation of a motor at 150 RPM](../../../../images/pwm-motor-150rpm.png)

This means in one second you have 25 pulses of 5V lasting 0.02s each that rotate the motor, each followed by a 0.02s pause at 0V that doesn't rotate the motor. Each pulse rotates the motor one tenth of a rotation, meaning the motor completes 2.5 rotations per second. You've used a digital signal to rotate the motor at 2.5 rotations per second, or 150 [revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute) (rpm, a non-standard measure of rotational speed).

```output
25 pulses per second x 0.1 rotations per pulse = 2.5 rotations per second
2.5 rotations per second x 60 seconds in a minute = 150 rpm
```

> 🎓 When a PWM signal is on for half the time and off for half the time, it is referred to as a [50% duty cycle](https://wikipedia.org/wiki/Duty_cycle). Duty cycles are measured as the percentage of time the signal is in the on state compared to the off state.
![Pulse width modulation rotation of a motor at 75 RPM](../../../../images/pwm-motor-75rpm.png)

You can change the motor speed by changing the size of the pulses. For example, with the same motor you can keep the same cycle time of 0.04s, with the on pulse halved to 0.01s, and the off pulse increased to 0.03s. You have the same number of pulses per second (25), but each pulse is half the length. A half-length pulse only rotates the motor one twentieth of a rotation, and at 25 pulses a second it will complete 1.25 rotations per second, or 75rpm. By changing the pulse speed of a digital signal, you've halved the speed of an analog motor.

```output
25 pulses per second x 0.05 rotations per pulse = 1.25 rotations per second
1.25 rotations per second x 60 seconds in a minute = 75 rpm
```

✅ How would you keep the motor rotation smooth, especially at low speeds? Would you use a small number of long pulses with long pauses, or lots of very short pulses with very short pauses?

> 💁 Some sensors also use PWM to convert analog signals to digital.

> 🎓 You can read more about pulse width modulation on the [pulse width modulation Wikipedia page](https://wikipedia.org/wiki/Pulse-width_modulation).
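The arithmetic above is easy to check in code. This small sketch (plain Python, using the simplified motor model from this section) computes the motor speed for a given PWM configuration:

```python
# Illustrative sketch: compute motor speed from PWM parameters.
# Assumes the simplified model above: a fixed cycle time, and rotation
# proportional to how long the pulse is on within each cycle.

def motor_rpm(cycle_time_s, on_time_s, rotations_per_full_pulse=0.1, full_pulse_s=0.02):
    pulses_per_second = 1 / cycle_time_s
    # Rotation per pulse scales with how long the pulse stays on
    rotations_per_pulse = rotations_per_full_pulse * (on_time_s / full_pulse_s)
    rotations_per_second = pulses_per_second * rotations_per_pulse
    return rotations_per_second * 60

print(motor_rpm(0.04, 0.02))  # 150.0 - the 50% duty cycle example
print(motor_rpm(0.04, 0.01))  # 75.0 - half-length pulses, half the speed
```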
### Digital actuators

Digital actuators, like digital sensors, either have two states controlled by a high or low voltage, or have a built-in DAC so they can convert a digital signal to an analog one.

One simple digital actuator is an LED. When a device sends a digital signal of 1, a high voltage is sent that lights the LED. When a digital signal of 0 is sent, the voltage drops to 0V and the LED turns off.

![A LED is off at 0 volts and on at 5V](../../../../images/led.png)

✅ What other simple 2-state actuators can you think of? One example is a solenoid, an electromagnet that can be activated to do things like move a door bolt, locking or unlocking a door.

More advanced digital actuators, such as screens, require the digital data to be sent in certain formats. They usually come with libraries that make it easy to send the correct data to control them.
---

## 🚀 Challenge

The challenge in the last two lessons was to list as many IoT devices as you can that are in your home, school or workplace, and decide if they are built around microcontrollers or single-board computers, or even a mixture of both.

For every device you listed, what sensors and actuators are they connected to? What is the purpose of each sensor and actuator connected to these devices?

## Post-lecture quiz

[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/6)

## Review & Self Study

* Read about electricity and circuits on [ThingLearn](http://www.thinglearn.com/essentials/).
* Read about the different types of temperature sensors in the [Seeed Studios temperature sensors guide](https://www.seeedstudio.com/blog/2019/10/14/temperature-sensors-for-arduino-projects/).
* Read about LEDs on the [LED Wikipedia page](https://wikipedia.org/wiki/Light-emitting_diode).

## Assignment

[Research sensors and actuators](assignment.md)
</div>

@ -57,8 +57,6 @@
![A potentiometer set to a mid point being sent 5 volts returning 3.8 volts](../../../images/potentiometer.png)
***Potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***

The IoT device will send an electrical signal to the potentiometer at a particular voltage (such as 5V). As the potentiometer is adjusted, it changes the voltage that comes out of the other side. Imagine you have a potentiometer labeled as a dial that goes from 0 to [11](https://wikipedia.org/wiki/Up_to_eleven), like a volume knob. When the potentiometer is in the full off position (0) it will output 0V (0 volts), and when it is in the full on position (11) it will output 5V (5 volts).

> 🎓 This is an oversimplification. You can find a detailed explanation of potentiometers and variable resistors on the [potentiometer Wikipedia page](https://wikipedia.org/wiki/Potentiometer).
@ -85,8 +83,6 @@
![A button is sent 5 volts. When not pressed it returns 0 volts, when pressed it returns 5 volts](../../../images/button.png)
***Button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***

Pins on an IoT device, such as GPIO pins, can measure this signal directly as a 0 or 1. If the voltage sent and received are the same, the value read is 1, otherwise the value is 0. There is no need to convert the signal, as the value can only be 1 or 0.

> 💁 Voltages never match exactly, especially since the components in a sensor have some resistance, so there is some variation. For example, the GPIO pins on a Raspberry Pi work at 3.3V and treat a return signal above 1.8V as a 1, and below 1.8V as a 0.
@ -98,8 +94,6 @@
![A digital temperature sensor converting an analog reading to binary data with 0 as 0 volts and 1 as 5 volts before sending it to an IoT device](../../../images/temperature-as-digital.png)
***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***

Sending digital data lets sensors become more complex. Far more detailed information can be transmitted, even encrypted data for secure sensors. One example is a camera - a sensor that captures an image and sends it as digital data, usually in a compressed format such as JPEG, for the IoT device to read. By capturing images, a camera can even stream video, either by sending the complete image frame by frame or by sending a compressed video stream.
## What are actuators?
@ -122,8 +116,6 @@
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-flow.png)

***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
* [Arduino - Wio Terminal](wio-terminal-actuator.md)
* [Single-board computer - Raspberry Pi](pi-actuator.md)
* [Single-board computer - Virtual device](virtual-device-actuator.md)
@ -138,8 +130,6 @@
![A light dimmed at a low voltage and brighter at a higher voltage](../../../images/dimmable-light.png)
***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
Like with sensors, the actual IoT device works on digital signals, not analog. To send an analog signal, the IoT device needs a digital-to-analog converter (DAC), either on the IoT device directly or on a connector board. The DAC converts the 0s and 1s from the IoT device into an analog voltage that the actuator can use.

✅ What do you think happens if the IoT device sends a higher voltage than the actuator can tolerate? ⛔️ You should always refrain from actually testing this.
@ -152,8 +142,6 @@
![Pulse width modulation rotation of a motor at 150 RPM](../../../images/pwm-motor-150rpm.png)
***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
This means that in one second there are 25 pulses, where each 5V signal of 0.02s rotates the motor and each 0V pause of 0.02s lets it rest. Each pulse rotates the motor one tenth of a rotation, which means the motor completes 2.5 rotations per second. Using a digital signal, the motor has been rotated at 2.5 rotations per second, or 150 rpm ([revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute)).
```output
@ -165,8 +153,6 @@
![Pulse width modulation rotation of a motor at 75 RPM](../../../images/pwm-motor-75rpm.png)
***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The speed of the motor can be changed by changing the size of the pulses. For example, with the same motor you can keep the same 0.04s cycle, with the ON pulse lasting 0.01s and the OFF pulse lasting 0.03s. The number of pulses per second stays the same (25), but the length of the ON state is now half. A half-length pulse turns the motor only one twentieth of a rotation, so at 25 pulses per second it completes 1.25 rotations per second, or 75 rpm. By changing the pulse speed of a digital signal, the speed of an analog motor has been halved.
```output
@ -188,8 +174,6 @@
![A LED is off at 0 volts and on at 5V](../../../images/led.png)
***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
✅ What other 2-state actuators can you see around you? One example is a solenoid, an electromagnet that can control a door bolt to lock and unlock a door.

More advanced digital actuators such as screens need the digital data sent in specific formats. They usually come with program libraries that make it easy to send the correct data to control them.

@ -0,0 +1,21 @@
# Research sensors and actuators

## Instructions

This lesson covered sensors and actuators. Research and describe one sensor and one actuator that could be used with an IoT development kit, including:

* What it does
* The electronics/hardware used inside
* Whether it is analog or digital
* The units and range of its inputs or measurements

## Rubric

| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Describe a sensor | Described a sensor including details for all 4 sections listed above | Described a sensor, but was only able to provide 2-3 of the sections above | Described a sensor, but was only able to provide 1 of the sections above |
| Describe an actuator | Described an actuator including details for all 4 sections listed above | Described an actuator, but was only able to provide 2-3 of the sections above | Described an actuator, but was only able to provide 1 of the sections above |

@ -0,0 +1,17 @@
# Research sensors and actuators

## Instructions

This lesson covered sensors and actuators. Describe one sensor and one actuator that could be used with an IoT developer kit, including:

* What it does
* The electronics/hardware used inside
* Whether it is analog or digital
* What the units of its inputs or measurements are, and what the usable range of the device is

## Rubric

| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Describe a sensor | Described a sensor with detailed explanations of all 4 of the sections listed above | Described a sensor, but was only able to explain 2-3 of the items listed above | Described a sensor, but was only able to explain 1 of the items listed above |
| Describe an actuator | Described an actuator with detailed explanations of all 4 of the sections listed above | Described an actuator, but was only able to explain 2-3 of the items listed above | Described an actuator, but was only able to explain 1 of the items listed above |

@ -0,0 +1,128 @@
# Build a nightlight - Raspberry Pi

In this part of the lesson, you will add an LED to your Raspberry Pi and use it to create a nightlight.

## Hardware

The nightlight now needs an actuator.

The actuator is an **LED**, a [light-emitting diode](https://wikipedia.org/wiki/Light-emitting_diode) that emits light when current flows through it. This is a digital actuator that has two states, on and off. Sending a value of 1 turns the LED on, and 0 turns it off. The LED is an external Grove actuator and needs to be connected to the Grove Base hat on the Raspberry Pi.

The nightlight logic in pseudo-code is:

```output
Check the light level.
If the light is less than 300
    Turn the LED on
Otherwise
    Turn the LED off
```

### Connect the LED

The Grove LED comes as a module with a selection of LEDs, allowing you to choose the color.

#### Task - connect the LED

Connect the LED.

![A grove LED](../../../../images/grove-led.png)

1. Pick your favorite LED and insert the legs into the two holes on the LED module.

LEDs are light-emitting diodes, and diodes are electronic devices that can only carry current in one direction. This means the LED needs to be connected the right way round, otherwise it won't work.

One of the legs of the LED is the positive pin, the other is the negative pin. The LED is not perfectly round, and is slightly flat on one side. The slightly flat side is the negative pin. When you connect the LED to the module, make sure the pin on the rounded side is connected to the socket marked **+** on the outside of the module, and the flat side is connected to the socket closer to the middle of the module.

1. The LED module has a spin button that allows you to control the brightness. Turn this all the way up to start with, rotating it counter-clockwise as far as it will go using a small Phillips screwdriver.

1. Insert one end of a Grove cable into the socket on the LED module. It will only go in one way round.

1. With the Raspberry Pi powered off, connect the other end of the Grove cable to the digital socket marked **D5** on the Grove Base hat attached to the Pi. This socket is the second from the left, on the row of sockets next to the GPIO pins.

![The grove LED connected to socket D5](../../../../images/pi-led.png)
## Program the nightlight

The nightlight can now be programmed using the Grove light sensor and the Grove LED.

### Task - program the nightlight

Program the nightlight.

1. Power up the Pi and wait for it to boot.

1. Open the nightlight project in VS Code that you created in the previous part of this assignment, either running directly on the Pi or connected using the Remote SSH extension.

1. Add the following code to the `app.py` file to import a required library. It should be added to the top, below the other `import` lines.

```python
from grove.grove_led import GroveLed
```

The `from grove.grove_led import GroveLed` statement imports the `GroveLed` class from the Grove Python libraries. This library has code to interact with a Grove LED.

1. Add the following code after the `light_sensor` declaration to create an instance of the class that manages the LED:

```python
led = GroveLed(5)
```

The line `led = GroveLed(5)` creates an instance of the `GroveLed` class connecting to pin **D5** - the digital Grove pin that the LED is connected to.

> 💁 All the sockets have unique pin numbers. Pins 0, 2, 4, and 6 are analog pins; pins 5, 16, 18, 22, 24 and 26 are digital pins.

1. Add a check inside the `while` loop, and before the `time.sleep`, to check the light level and turn the LED on or off:
```python
if light < 300:
led.on()
else:
led.off()
```
This code checks the `light` value. If this is less than 300, it calls the `on` method of the `GroveLed` class, which sends a digital value of 1 to the LED, turning it on. If the light value is greater than or equal to 300, it calls the `off` method, sending a digital value of 0 to the LED, turning it off.

> 💁 This code should be indented to the same level as the `print('Light level:', light)` line to be inside the while loop!

> 💁 When sending digital values to actuators, a 0 value is 0V, and a 1 value is the maximum voltage for the device. For the Raspberry Pi with Grove sensors and actuators, the 1 voltage is 3.3V.

1. From the VS Code terminal, run the following to start your Python app:
```sh
python3 app.py
```
Light values will be output to the console.
```output
pi@raspberrypi:~/nightlight $ python3 app.py
Light level: 634
Light level: 634
Light level: 634
Light level: 230
Light level: 104
Light level: 290
```
1. Cover and uncover the light sensor. Notice how the LED will light up if the light level is 300 or less, and turn off when the light level is greater than 300.

> 💁 If the LED doesn't turn on, make sure it is connected the right way round, and the spin button is set to full on.

![The LED connected to the Pi turning on and off as the light level changes](../../../../images/pi-running-assignment-1-1.gif)

> 💁 You can find this code in the [code-actuator/pi](../code-actuator/pi) folder.
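For reference, here is a sketch of what the complete `app.py` might look like at this point, assuming the sensor part of this assignment created `light_sensor` using the `GroveLightSensor` class on analog port A0:

```python
# A sketch of the finished app.py, assuming the sensor part of this assignment
# created light_sensor with the GroveLightSensor class on analog port A0.
import time
from grove.grove_light_sensor_v1_2 import GroveLightSensor
from grove.grove_led import GroveLed

light_sensor = GroveLightSensor(0)
led = GroveLed(5)

while True:
    light = light_sensor.light
    print('Light level:', light)

    # Turn the LED on when it is dark, off when it is light
    if light < 300:
        led.on()
    else:
        led.off()

    time.sleep(1)
```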
😀 Your nightlight program was a success!

@ -1,8 +1,8 @@
# Connect your device to the Internet
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-4.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -31,8 +31,6 @@ There are a number of popular communication protocols used by IoT devices to com
![IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices.](../../../images/pub-sub.png)
***IoT devices connect to a broker and publish telemetry and subscribe to commands. Cloud services connect to the broker and subscribe to all telemetry and send commands to specific devices. Broadcast by RomStu / Microcontroller by Template / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
MQTT is the most popular communication protocol for IoT devices and is covered in this lesson. Other protocols include AMQP and HTTP/HTTPS.
## Message Queueing Telemetry Transport (MQTT)
@ -43,8 +41,6 @@ MQTT has a single broker and multiple clients. All clients connect to the broker
![IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic](../../../images/mqtt.png)
***IoT device publishing telemetry on the /telemetry topic, and the cloud service subscribing to that topic. Microcontroller by Template / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
✅ Do some research. If you have a lot of IoT devices, how can you ensure your MQTT broker can handle all the messages?
### Connect your IoT device to MQTT
@ -67,8 +63,6 @@ Rather than dealing with the complexities of setting up an MQTT broker as part o
![A flow chart of the assignment showing light levels being read and checked, and the LED being controlled](../../../images/assignment-1-internet-flow.png)

***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
Follow the relevant step below to connect your device to the MQTT broker:
* [Arduino - Wio Terminal](wio-terminal-mqtt.md)
@ -106,8 +100,6 @@ Let's look back at the example of the smart thermostat from Lesson 1.
![An Internet connected thermostat using multiple room sensors](../../../images/telemetry.png)
***An Internet connected thermostat using multiple room sensors. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The thermostat has temperature sensors to gather telemetry. It would most likely have one temperature sensor built in, and it might connect to multiple external temperature sensors over a wireless protocol such as [Bluetooth Low Energy](https://wikipedia.org/wiki/Bluetooth_Low_Energy) (BLE).
An example of the telemetry data it would send could be:
@ -353,8 +345,6 @@ Commands are messages sent by the cloud to a device, instructing it to do someth
![An Internet connected thermostat receiving a command to turn on the heating](../../../images/commands.png)
***An Internet connected thermostat receiving a command to turn on the heating. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
A thermostat could receive a command from the cloud to turn the heating on. Based on the telemetry data from all the sensors, if the cloud service decides that the heating should be on, it sends the relevant command.
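As a sketch of how server code might close this loop using the paho-mqtt library used in these lessons (the topic names and the temperature threshold here are hypothetical, for illustration only):

```python
# A minimal sketch, assuming the public test.mosquitto.org broker used in
# this lesson, with hypothetical topic names and command format.
import json
import paho.mqtt.client as mqtt

client_telemetry_topic = 'example/telemetry'  # hypothetical topic names
server_command_topic = 'example/commands'

def handle_telemetry(client, userdata, message):
    payload = json.loads(message.payload.decode())
    print('Telemetry received:', payload)
    # Decide whether heating should be on, then send a command back
    command = {'heating_on': payload['temperature'] < 18}
    client.publish(server_command_topic, json.dumps(command))

mqtt_client = mqtt.Client('server-example')
mqtt_client.connect('test.mosquitto.org')
mqtt_client.subscribe(client_telemetry_topic)
mqtt_client.on_message = handle_telemetry
mqtt_client.loop_forever()
```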
### Send commands to the MQTT broker

@ -4,9 +4,7 @@ As the population grows, so does the demand on agriculture. The amount of land a
In these 6 lessons you'll learn how to apply the Internet of Things to improve and automate farming.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
## Topics

@ -1,8 +1,8 @@
# Predict plant growth with IoT
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-5.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -134,8 +134,6 @@ By gathering temperature data using an IoT device, a farmer can automatically be
![Telemetry data is sent to a server and then saved to a database](../../../images/save-telemetry-database.png)
***Telemetry data is sent to a server and then saved to a database. database by Icons Bazaar - from the [Noun Project](https://thenounproject.com)***
The server code can also augment the data by adding extra information. For example, the IoT device can publish an identifier to indicate which device it is, and the server code can use this to look up the location of the device, and what crops it is monitoring. It can also add basic data like the current time, as some IoT devices don't have the necessary hardware to keep track of an accurate time, or require additional code to read the current time over the Internet.
✅ Why do you think different fields might have different temperatures?

@ -1,9 +0,0 @@
# Dummy File
This file acts as a placeholder for the `translations` folder. <br>
**Please remove this file after adding the first translation**
For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) .
## THANK YOU
We truly appreciate your efforts!

@ -0,0 +1,265 @@
# Predict plant growth with IoT

![A sketchnote overview of this lesson](../../../../sketchnotes/lesson-5.jpg)

> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.

## Pre-lecture quiz

[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/9)

## Introduction

Plants need certain things to grow - water, carbon dioxide, nutrients, light, and warmth. In this lesson you will learn how to calculate plant growth and maturity rates by measuring the air temperature.

In this lesson we'll cover:

* [Digital agriculture](#digital-agriculture)
* [Why is temperature important in farming?](#why-is-temperature-important-in-farming)
* [Measuring ambient temperature](#measuring-ambient-temperature)
* [Growing degree days (GDD)](#growing-degree-days)
* [Calculating GDD using temperature sensor data](#calculating-gdd-using-temperature-sensor-data)

## Digital agriculture

The digital revolution is fundamentally changing agriculture. By gathering data, storing it, and analyzing that information, farming is taking a new shape. The World Economic Forum says we are currently in the 'fourth industrial revolution', and the rise of digital agriculture has been labeled the 'fourth agricultural revolution', or 'agriculture 4.0'.

> 🎓 'Digital agriculture' refers to the whole value chain of agriculture, the complete journey from farm to table. It includes tracking the quality of food during transport and processing, managing warehouses and e-commerce systems, even renting tractors - all of this can now be done digitally.

Thanks to these revolutionary changes, farmers are now getting higher yields while using less fertilizer and pesticide and irrigating more efficiently. Although sensors and other modern equipment were initially used only in developed countries, their prices are falling and they are becoming much more accessible in developing countries as well.

Some examples of digital agriculture:

* Temperature measurement - accurate temperature measurement gives farmers predictions about plant growth and maturity.
* Automated irrigation - rather than watering at fixed intervals, it is more effective to measure the soil moisture and turn on the irrigation system when it is too dry. Time-based watering can mean plants get too little water during hot, dry spells, or too much water during rain. By watering only when needed, farmers can make their water use much more efficient.
* Pest control - farmers can use cameras on automated robots or drones to monitor for pests, and apply pesticide only where it is needed. This reduces the amount of pesticide used, and reduces how much pesticide runs off into local water supplies.

✅ Think for a moment - what other techniques are used to improve farm yields?

> 🎓 The term 'precision agriculture' is used for observing, measuring, and responding to crops on a per-field basis. This includes verifying irrigation, keeping nutrient levels right, measuring pest levels, and responding with pinpoint accuracy, such as irrigating only the small part of a field that needs it.

## Why is temperature important in farming?

When learning about plants, most students are taught about the need for water, light, carbon dioxide, and nutrients. Plants also need warmth to grow - this is why plants bloom in spring as temperatures rise, why warmth can make daffodils flower early, and why hothouses and greenhouses are so effective at helping plants grow.

> 🎓 Hothouses and greenhouses do similar things, but with an important difference. Hothouses are artificially heated and allow farmers to control the temperature more precisely, whereas greenhouses rely on the sun for warmth and usually depend on windows or vents to control the temperature.

Plants have a base or minimum temperature, an optimal temperature, and a maximum temperature, all based on daily average temperatures.

* Base (minimum) temperature - this is the minimum daily average temperature needed for a plant to grow.
* Optimal temperature - this is the best daily average temperature for maximum growth.
* Maximum temperature - this is the highest temperature a plant can tolerate. Above this, the plant will shut down its growth in an effort to conserve water and stay alive.

> 💁 These are averages over day and night. Plants need different temperatures in the day and at night to help them photosynthesize more efficiently during the day and save energy at night.

Each species of plant has different values for its minimum, optimal, and maximum temperatures. This is why some plants thrive in warm countries and others in cooler ones.

✅ Do a little research. For any plants in your garden, school, or a local park, see if you can find the base temperature.

![A graph showing growth rate rising as temperature rises, then dropping as the temperature goes too high](../../../../images/plant-growth-temp-graph.png)

The graph above shows growth rate against temperature. There is no growth until the base temperature. The growth rate increases up to the optimal temperature, then drops off after this peak, stopping at the maximum temperature.

The shape of this graph varies from plant species to plant species. Some drop off very steeply after the optimal temperature, some rise very slowly from the base temperature to the optimal.

> 💁 To get the highest yields, a farmer needs to know all three temperature values for their plants, and to understand the shape of their growth graph.

If a farmer has a temperature-controlled environment, for example a commercial hothouse, they can ensure the optimal temperature for their plants. For example, for the fastest growth, tomatoes in a commercial hothouse are kept at 25°C during the day and 20°C at night.

> 🍅 Combining temperature control with artificial lights and fertilizer means commercial growers can keep up production all year round.

## Measuring ambient temperature

A temperature sensor with an IoT device can be used to measure the ambient temperature.

### Task - measure the temperature

Work through the relevant guide below to monitor the temperature using your IoT device:

* [Arduino - Wio Terminal](wio-terminal-temp.md)
* [Single-board computer - Raspberry Pi](pi-temp.md)
* [Single-board computer - Virtual device](virtual-device-temp.md)

## Growing degree days

Growing degree days (also called growing degree units) are a way of measuring plant growth based on temperature. Assuming a plant has enough water, nutrients, and CO<sub>2</sub>, temperature determines the rate of growth.

Growing degree days (GDD) are calculated per day as the average daily temperature above the plant's base temperature. Each plant needs a certain number of GDD to grow, flower, or ripen and mature. The more GDD per day, the faster the plant will grow.

The full formula for GDD is a little complicated, but there is a simplified equation that is often used as a good approximation:

![GDD = T max + T min divided by 2, all minus T base](../../../../images/gdd-calculation.png)

* **GDD** - the number of growing degree days
* **T<sub>max</sub>** - the daily maximum temperature in degrees Celsius
* **T<sub>min</sub>** - the daily minimum temperature in degrees Celsius
* **T<sub>base</sub>** - the plant's base temperature in degrees Celsius

> 💁 There are variations that deal with T<sub>max</sub> above 30°C or T<sub>min</sub> below T<sub>base</sub>, but we will ignore those for now.
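The simplified equation translates directly into code. Here is a small Python sketch of it (plain arithmetic, no libraries needed):

```python
# A sketch of the simplified GDD equation above.
# Variations for t_max above 30°C or t_min below t_base are ignored here,
# just as in the simplified formula.

def growing_degree_days(t_max, t_min, t_base):
    return ((t_max + t_min) / 2) - t_base

# The corn example from the next section: max 16°C, min 12°C, base 10°C
print(growing_degree_days(16, 12, 10))  # 4.0
```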
### Example - corn 🌽

Depending on the variety, corn needs between 800 and 2,700 GDD to mature, with a base temperature of 10°C.

On the first day above the base temperature, the following temperatures were measured:

| Measurement | Temp °C |
| :---------- | :-----: |
| Maximum     | 16      |
| Minimum     | 12      |

Plugging these numbers in:

* T<sub>max</sub> = 16
* T<sub>min</sub> = 12
* T<sub>base</sub> = 10

This gives a calculation of:

![GDD = 16 + 12 divided by 2, all minus 10, giving an answer of 4](../../../../images/gdd-calculation-corn.png)

The corn received 4 GDD that day. Assuming a corn variety that needs 800 GDD, it needs another 796 GDD to reach maturity.

✅ Do a little research. For any plants in your garden, school, or a local park, see if you can find the number of GDD needed to reach maturity or produce a crop.

## Calculating GDD using temperature sensor data

Plants don't grow on fixed dates - for example, you can't plant a seed and know the plant will produce a crop exactly 100 days later. Instead, as a farmer you can have a rough idea of how long a plant takes to grow, then check the crops every day around that time to see if they are ready.

Checking plants constantly like this is a major problem on a large farm, as it requires a lot of labor, and there is a risk that a farmer misses a crop that is unexpectedly ready early. By measuring the temperature, a farmer can calculate the GDD a plant has received, and only needs to check frequently when the plant is close to its expected maturity, greatly reducing the extra labor.

By using an IoT device to gather temperature data, a farmer can be automatically notified when plants are close to maturity. A typical architecture for this is to have IoT devices measure the temperature, then publish this telemetry data over the Internet using something like MQTT. Server code then listens for this data and saves it somewhere, such as a database. The data can then be analyzed later - for example, calculating the GDD, keeping a running total of GDD for each crop, and alerting the farmer when a plant is close to maturity.

![Telemetry data is sent to a server and then saved to a database](../../../../images/save-telemetry-database.png)

The server code can also augment the data by adding extra information. For example, the IoT device can publish an identifier to indicate which device it is, and the server code can use this to look up the location of the device and which crops it is monitoring. It can also add basic data like the current time, as some IoT devices don't have the necessary hardware to keep track of an accurate time, or require additional code to read the current time over the Internet.

✅ Why do you think different fields might have different temperatures?
### Task - publish temperature information

Work through the relevant guide below to publish temperature data over MQTT using your IoT device:

* [Arduino - Wio Terminal](wio-terminal-temp-publish.md)
* [Single-board computer - Raspberry Pi/Virtual IoT device](single-board-computer-temp-publish.md)

### Task - capture and store temperature information

Once the IoT device publishes telemetry, server code can be written to subscribe to this data and store it. Rather than saving it to a database, the server code will save it to a Comma Separated Values (CSV) file. A CSV file stores data as rows of values, with each value separated by a comma and each record on a new line. It is a convenient, human-readable way to save data as a file.

The CSV file will have two columns - *date* and *temperature*. The *date* column holds the date and time the server received the message, and the *temperature* comes from the telemetry.

1. Repeat the steps from lesson 4 to create server code to subscribe to telemetry. You don't need to add the code to publish commands.

    The steps are:

    * Configure and activate a Python virtual environment
    * Install the paho-mqtt pip package
    * Write code to listen for the MQTT messages published to the telemetry topic

    > ⚠️ You can refer to [the Python code for receiving telemetry data given in lesson 4](../../../../1-getting-started/lessons/4-connect-internet/README.md#receive-telemetry-from-the-mqtt-broker) if needed.

    Name the folder for this project `temperature-sensor-server`.

1. Make sure the `client_name` reflects this project:

    ```python
    client_name = id + 'temperature_sensor_server'
    ```

1. Add the following imports below the existing ones:

    ```python
    from os import path
    import csv
    from datetime import datetime
    ```

    This imports a library to read files, a library for interacting with CSV files, and a library for handling dates and times.

1. Add the following code before the `handle_telemetry` function:

    ```python
    temperature_file_name = 'temperature.csv'
    fieldnames = ['date', 'temperature']

    if not path.exists(temperature_file_name):
        with open(temperature_file_name, mode='w') as csv_file:
            writer = csv.DictWriter(csv_file, fieldnames=fieldnames)
            writer.writeheader()
    ```

    This code declares some constants - the name of the CSV file the data will be written to, and the names of its column headers. The first row of a CSV file traditionally contains the column headers, separated by commas.

    The code then checks whether the CSV file already exists. If it doesn't, it is created with the column headers on the first row.

1. Add the following code to the end of the `handle_telemetry` function:

    ```python
    with open(temperature_file_name, mode='a') as temperature_file:
        temperature_writer = csv.DictWriter(temperature_file, fieldnames=fieldnames)
        temperature_writer.writerow({'date' : datetime.now().astimezone().replace(microsecond=0).isoformat(), 'temperature' : payload['temperature']})
    ```

    This code opens the CSV file and appends a new row. The row contains the temperature received from the IoT device, along with the current date and time in a readable format - [ISO 8601 format](https://wikipedia.org/wiki/ISO_8601) with the timezone, but with the microseconds removed.

1. Run this code as before, making sure your IoT device is sending data. A file called `temperature.csv` will be created in the same folder. If you open it, you will see date/time stamps and temperatures:

    ```output
    date,temperature
    2021-04-19T17:21:36-07:00,25
    2021-04-19T17:31:36-07:00,24
    2021-04-19T17:41:36-07:00,25
    ```

1. Run this code to gather data. You should run it for a full day to gather enough data for the GDD calculation.

    > 💁 If you are using the virtual IoT device, check the random checkbox and set a range, to avoid getting the same temperature value every time.

    ![Select the random checkbox and set a range](../../../../images/select-the-random-checkbox-and-set-a-range.png)

    > 💁 If you want to run this for a full day, you will need to make sure the computer the server code is running on doesn't go to sleep, either by adjusting the power settings or by running something like this [keep system active Python script](https://github.com/jaqsparow/keep-system-active).

> 💁 You can find this code in the [code-server/temperature-sensor-server](code-server/temperature-sensor-server) folder.
### Task - calculate GDD using the stored data

Once the server has captured the temperature data, the GDD for a plant can be calculated.

The steps to do this manually are:

1. Find the base temperature for the plant. For example, for strawberries this is 10°C.

1. From `temperature.csv`, find the highest and lowest temperatures for the day.

1. Calculate the GDD using the formula given earlier.

For example, if the highest temperature for the day is 25°C and the lowest is 12°C:

![GDD = 25 + 12 divided by 2, then subtract 10 from the result giving 8.5](../../../../images/gdd-calculation-strawberries.png)

* 25 + 12 = 37
* 37 / 2 = 18.5
* 18.5 - 10 = 8.5

This gives **8.5** GDD for the strawberries that day. As strawberries need around 250 GDD to produce fruit, there are still many days to go.
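These manual steps can be automated. Here is a sketch of the same calculation in Python, reading the `temperature.csv` file produced above and assuming it contains readings for a single day:

```python
# A sketch that reads temperature.csv (as produced above) and calculates
# the GDD for the data in the file, assuming it covers a single day.
import csv

t_base = 10  # base temperature for strawberries

temperatures = []
with open('temperature.csv') as temperature_file:
    reader = csv.DictReader(temperature_file)
    for row in reader:
        temperatures.append(float(row['temperature']))

# The simplified GDD formula: average of the day's max and min, minus the base
gdd = ((max(temperatures) + min(temperatures)) / 2) - t_base
print('GDD for the day:', gdd)
```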
---

## 🚀 Challenge

Plants need more than warmth to grow. What other things do they need?

For these additional factors, are there sensors that could measure them? And actuators that could control their levels? How might you combine one or more IoT devices to optimize plant growth?

## Post-lecture quiz

[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/10)

## Review & Self Study

Read more about digital agriculture on the [digital agriculture Wikipedia page](https://wikipedia.org/wiki/Digital_agriculture). Also read more about [precision agriculture](https://wikipedia.org/wiki/Precision_agriculture).

The GDD calculation is a little more complicated than the simplified formula given here. Read more about the complications, and how to handle days where the temperature drops below the base temperature, on the [growing degree day Wikipedia page](https://wikipedia.org/wiki/Growing_degree-day).

## Assignment

[Visualize GDD data using a Jupyter Notebook](assignment.md)

@ -0,0 +1,43 @@
# Visualize GDD data using a Jupyter Notebook

## Instructions

In this lesson you gathered GDD data using IoT sensors. To get good GDD data, you need to gather data for multiple days. A tool like [Jupyter Notebooks](https://jupyter.org) can be used to visualize the temperature data, then calculate the GDD and analyze the data.

Start by gathering data for a couple of days. Make sure your server code is running the whole time the IoT device is running, either by adjusting your power management settings or by running something like this [keep system active Python script](https://github.com/jaqsparow/keep-system-active).

Once you have the temperature data, you can use the Jupyter Notebook from this repo to visualize it and calculate the GDD. Jupyter Notebooks mix code and instructions in blocks called *cells*, usually coded in Python. Read the instructions, then run each block of code, block by block. You can also edit the code. For example, in this notebook you can edit the base temperature used to calculate the GDD for your plants, as it varies from plant to plant.

1. Create a folder called `gdd-calculation`.

1. Download [gdd.ipynb](./code-notebook/gdd.ipynb) and copy it into the `gdd-calculation` folder.

1. Copy in the `temperature.csv` file created by the MQTT server.

1. Create a new Python virtual environment in the `gdd-calculation` folder.

1. Install some pip packages for Jupyter Notebooks:

    ```sh
    pip install --upgrade pip
    pip install pandas
    pip install matplotlib
    pip install jupyter
    ```

1. Run the notebook in Jupyter:

    ```sh
    jupyter notebook gdd.ipynb
    ```

    Jupyter will start and open the notebook in your browser. Work through the instructions in the notebook to visualize the temperatures measured and calculate the GDD.
![The jupyter notebook](../../../images/gdd-jupyter-notebook.png)
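To give a sense of the kind of analysis the notebook performs, a cell might look something like this sketch (assuming the `temperature.csv` format from the lesson; the actual notebook may differ):

```python
# A sketch of the kind of analysis the notebook performs, assuming the
# temperature.csv format from the lesson. The actual notebook may differ.
import pandas as pd
import matplotlib.pyplot as plt

base_temperature = 10  # edit this for your plant

df = pd.read_csv('temperature.csv', parse_dates=['date'], index_col='date')

# Group readings by calendar day to get each day's min and max
daily = df['temperature'].resample('D').agg(['min', 'max'])

# Apply the simplified GDD formula to each day
daily['gdd'] = ((daily['max'] + daily['min']) / 2) - base_temperature
print(daily)

df['temperature'].plot()
plt.show()
```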
## Rubric

| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Capture data | Captured at least 2 full days of data | Captured at least 1 full day of data | Captured some data |
| Calculate GDD | Successfully ran the notebook and calculated the GDD | Successfully ran the notebook | Unable to run the notebook |

@ -1,8 +1,8 @@
# Detect soil moisture
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-6.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -33,8 +33,6 @@ Plants require water to grow. They absorb water throughout the entire plant, wit
![Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure](../../../images/transpiration.png)
***Water is absorbed through plant roots then carried around the plant, being used for photosynthesis and plant structure. Plant by Alex Muravev / Plant Cell by Léa Lortal - all from the [Noun Project](https://thenounproject.com)***
✅ Do some research: how much water is lost through transpiration?
The root system provides water from moisture in the soil where the plant grows. Too little water in the soil and the plant cannot absorb enough to grow, too much water and roots cannot absorb enough oxygen needed to function. This leads to roots dying and the plant unable to get enough nutrients to survive.
@ -83,14 +81,10 @@ You can use GPIO pins directly with some digital sensors and actuators when you
![A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1](../../../images/button-with-digital.png)
***A button is sent 5 volts. When not pressed it returns 0 volts, or 0, when pressed it returns 5 volts, or 1. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
* LED. You can connect an LED between an output pin and a ground pin (using a resistor otherwise you'll burn out the LED). From code you can set the output pin to high and it will send 3.3V, making a circuit from the 3.3V pin, through the LED, to the ground pin. This will light the LED.
![An LED is sent a signal of 1 (3.3V), which lights the LED. If it is sent 0 (0V), the LED is not lit.](../../../images/led-digital-control.png)

***An LED is sent a signal of 1 (3.3V), which lights the LED. If it is sent 0 (0V), the LED is not lit. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
For more advanced sensors, you can use GPIO pins to send and receive digital data directly with digital sensors and actuators, or via controller boards with ADCs and DACs to talk to analog sensors and actuators.
> 💁 If you are using a Raspberry Pi for these labs, the Grove Base Hat has hardware to convert analog sensor signals to digital to send over GPIO.
@ -105,8 +99,6 @@ For example, on a 3.3V board, if the sensor returns 3.3V, the value returned wou
![A soil moisture sensor sent 3.3V and returning 1.65v, or a reading of 511](../../../images/analog-sensor-voltage.png)
***A soil moisture sensor sent 3.3V and returning 1.65v, or a reading of 511. probe by Adnen Kadri / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
> 💁 Back in nightlight - lesson 3, the light sensor returned a value from 0-1,023. If you are using a Wio Terminal, the sensor was connected to an analog pin. If you are using a Raspberry Pi, then the sensor was connected to an analog pin on the base hat that has an integrated ADC to communicate over the GPIO pins. The virtual device was set to send a value from 0-1,023 to simulate an analog pin.
Soil moisture sensors rely on voltages, so will use analog pins and give values from 0-1,023.
@ -130,8 +122,6 @@ I<sup>2</sup>C has a bus made of 2 main wires, along with 2 power wires:
![I2C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire](../../../images/i2c.png)
***I<sup>2</sup>C bus with 3 devices connected to the SDA and SCL wires, sharing a common ground wire. Microcontroller by Template / LED by abderraouf omara / ldr by Eucalyp - all from the [Noun Project](https://thenounproject.com)***
To send data, one device will issue a start condition to show it is ready to send data. It will then become the controller. The controller then sends the address of the device that it wants to communicate with, along with if it wants to read or write data. After the data has been transmitted, the controller sends a stop condition to indicate that it has finished. After this another device can become the controller and send or receive data.
I<sup>2</sup>C has speed limits, with 3 different modes running at fixed speeds. The fastest is High Speed mode with a maximum speed of 3.4Mbps (megabits per second), though very few devices support that speed. The Raspberry Pi for example, is limited to fast mode at 400Kbps (kilobits per second). Standard mode runs at 100Kbps.
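As a rough sketch of what this looks like from code (an illustration, assuming the `smbus2` Python package and a hypothetical peripheral at address `0x44`):

```python
# A sketch of reading from an I2C peripheral, assuming the smbus2 package
# (pip install smbus2) and a hypothetical device at address 0x44.
from smbus2 import SMBus

DEVICE_ADDRESS = 0x44   # hypothetical peripheral address
REGISTER = 0x00         # hypothetical register to read

with SMBus(1) as bus:   # bus 1 is the standard I2C bus on a Raspberry Pi
    # The library handles the start condition, addressing, and stop condition
    value = bus.read_byte_data(DEVICE_ADDRESS, REGISTER)
    print('Read value:', value)
```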
@ -147,8 +137,6 @@ UART involves physical circuitry that allows two devices to communicate. Each de
![UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa](../../../images/uart.png)
***UART with the Tx pin on one chip connected to the Rx pin on another, and vice versa. chip by Astatine Lab - all from the [Noun Project](https://thenounproject.com)***
> 🎓 The data is sent one bit at a time, and this is known as *serial* communication. Most operating systems and microcontrollers have *serial ports*, that is connections that can send and receive serial data that are available to your code.
UART devices have a [baud rate](https://wikipedia.org/wiki/Symbol_rate) (also known as Symbol rate), which is the speed that data will be sent and received in bits per second. A common baud rate is 9,600, meaning 9,600 bits (0s and 1s) of data are sent each second.
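Here is a minimal sketch of serial communication from code (an illustration, assuming the `pyserial` package and a hypothetical serial port):

```python
# A sketch of serial communication, assuming the pyserial package
# (pip install pyserial) and a device on a hypothetical port.
import serial

# 9,600 baud - 9,600 bits of data sent each second
ser = serial.Serial('/dev/ttyAMA0', baudrate=9600, timeout=1)

ser.write(b'hello\n')       # send bytes one bit at a time over Tx
response = ser.readline()   # read bytes arriving on Rx, up to a newline
print(response)
ser.close()
```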
@ -178,8 +166,6 @@ SPI controllers use 3 wires, along with 1 extra wire per peripheral. Peripherals
![SPI with one controller and two peripherals](../../../images/spi.png)

***SPI with one controller and two peripherals. chip by Astatine Lab - all from the [Noun Project](https://thenounproject.com)***
The CS wire is used to activate one peripheral at a time, communicating over the COPI and CIPO wires. When the controller needs to change peripheral, it deactivates the CS wire connected to the currently active peripheral, then activates the wire connected to the peripheral it wants to communicate with next.
SPI is *full-duplex*, meaning the controller can send and receive data at the same time from the same peripheral using the COPI and CIPO wires. SPI uses a clock signal on the SCLK wire to keep the devices in sync, so unlike sending directly over UART it doesn't need start and stop bits.
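Here is a minimal sketch of what SPI communication looks like from code (an illustration, assuming the `spidev` Python package and a peripheral on bus 0, chip select 0):

```python
# A sketch of full-duplex SPI communication, assuming the spidev package
# (pip install spidev) and a peripheral on bus 0, chip select 0.
import spidev

spi = spidev.SpiDev()
spi.open(0, 0)              # bus 0, CS pin 0 - activates that peripheral
spi.max_speed_hz = 1000000  # clock signal on SCLK at 1MHz

# xfer2 sends bytes on COPI and returns the bytes received on CIPO
# at the same time - the full-duplex behavior described above
received = spi.xfer2([0x01, 0x02])
print('Received:', received)
spi.close()
```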

@ -1,8 +1,8 @@
# Automated plant watering
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-7.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -30,8 +30,6 @@ The solution to this is to have a pump connected to an external power supply, an
![A light switch turns power on to a light](../../../images/light-switch.png)
***A light switch turns power on to a light. switch by Chattapat / lightbulb by Maxim Kulikov - all from the [Noun Project](https://thenounproject.com)***
> 🎓 [Mains electricity](https://wikipedia.org/wiki/Mains_electricity) refers to the electricity delivered to homes and businesses through national infrastructure in many parts of the world.
✅ IoT devices can usually provide 3.3V or 5V, at less than 1 amp (1A) of current. Compare this to mains electricity which is most often at 230V (120V in North America and 100V in Japan), and can provide power for devices that draw 30A.
@ -46,14 +44,10 @@ A relay is an electromechanical switch that converts an electrical signal into a
![When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit](../../../images/relay-on.png)
***When on, the electromagnet creates a magnetic field, turning on the switch for the output circuit. lightbulb by Maxim Kulikov - from the [Noun Project](https://thenounproject.com)***
In a relay, a control circuit powers the electromagnet. When the electromagnet is on, it pulls a lever that moves a switch, closing a pair of contacts and completing an output circuit.
![When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit](../../../images/relay-off.png)
***When off, the electromagnet doesn't create a magnetic field, turning off the switch for the output circuit. lightbulb by Maxim Kulikov - from the [Noun Project](https://thenounproject.com)***
When the control circuit is off, the electromagnet turns off, releasing the lever and opening the contacts, turning off the output circuit. Relays are digital actuators - a high signal to the relay turns it on, a low signal turns it off.
The output circuit can be used to power additional hardware, like an irrigation system. The IoT device can turn the relay on, completing the output circuit that powers the irrigation system, and plants get watered. The IoT device can then turn the relay off, cutting the power to the irrigation system, turning the water off.
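Because a relay is a digital actuator, driving one from code is just a matter of setting a pin high or low. Here is a minimal sketch using the `gpiozero` library on a Raspberry Pi, assuming the relay's control circuit is wired to GPIO pin 5:

```python
# A minimal sketch of driving a relay as a digital actuator with gpiozero.
# GPIO pin 5 is an assumption - use the pin your relay is wired to.
from time import sleep
from gpiozero import OutputDevice

relay = OutputDevice(5)

relay.on()    # high signal - the electromagnet closes the output circuit
sleep(5)      # the output circuit powers the pump for 5 seconds
relay.off()   # low signal - the contacts open, cutting power to the pump
```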
@ -132,8 +126,6 @@ If you did the last lesson on soil moisture using a physical sensor, you would h
![A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil](../../../images/soil-moisture-travel.png)
***A soil moisture measurement of 658 doesn't change during watering, it only drops to 320 after watering when water has soaked through the soil. Plant by Alex Muravev / Watering Can by Daria Moskvina - all from the [Noun Project](https://thenounproject.com)***
In the diagram above, a soil moisture reading shows 658. The plant is watered, but this reading doesn't change immediately, as the water has yet to reach the sensor. Watering can even finish before the water reaches the sensor and the value drops to reflect the new moisture level.
If you were writing code to control an irrigation system via a relay based off soil moisture levels, you would need to take this delay into consideration and build smarter timing into your IoT device.
@ -160,8 +152,6 @@ For example, I have a strawberry plant with a soil moisture sensor and a pump co
![Step 1, take measurement. Step 2, add water. Step 3, wait for water to soak through the soil. Step 4, retake measurement](../../../images/soil-moisture-delay.png)
***Measure, add water, wait, remeasure. Plant by Alex Muravev / Watering Can by Daria Moskvina - all from the [Noun Project](https://thenounproject.com)***
This means the best process would be a watering cycle that is something like:
* Turn on the pump for 5 seconds
@ -207,10 +197,11 @@ Update your server code to run the relay for 5 seconds, then wait 20 seconds.
1. Open the `app.py` file
1. Add the following code to the `app.py` file below the existing imports:
```python
import threading
```
This statement imports the `threading` module from the Python standard library. Threading allows Python to execute other code while waiting.
1. Add the following code before the `handle_telemetry` function that handles telemetry messages received by the server code:
@ -278,7 +269,7 @@ Update your server code to run the relay for 5 seconds, then wait 20 seconds.
```
A good way to test this in a simulated irrigation system is to use dry soil, then pour water in manually whilst the relay is on, stopping pouring when the relay turns off.
> 💁 You can find this code in the [code-timing](./code-timing) folder.
> 💁 If you want to use a pump to build a real irrigation system, then you can use a [6V water pump](https://www.seeedstudio.com/6V-Mini-Water-Pump-p-1945.html) with a [USB terminal power supply](https://www.adafruit.com/product/3628). Make sure the power to or from the pump is connected via the relay.
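If you are adapting this pattern to your own project, the core timing logic is small. Here is a minimal sketch (not necessarily the lesson's exact code) that runs the relay for 5 seconds without blocking the server, assuming a `relay` object with `on` and `off` methods:

```python
import threading

def run_relay_cycle(relay):
    # Turn the relay on, then schedule it to turn off 5 seconds later on a
    # background timer, so telemetry handling isn't blocked while waiting
    relay.on()
    threading.Timer(5, relay.off).start()
```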

@ -1,8 +1,8 @@
# Migrate your plant to the cloud
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-8.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -106,14 +106,10 @@ IoT devices connect to a cloud service either using a device SDK (a library that
![Devices connect to a service using a device SDK. Server code also connects to the service via an SDK](../../../images/iot-service-connectivity.png)
***Devices connect to a service using a device SDK. Server code also connects to the service via an SDK. Microcontroller by Template / Cloud by Debi Alpa Nugraha / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
Your device then communicates with other parts of your application over this service - similar to how you sent telemetry and received commands over MQTT. This usually happens via a service SDK or a similar library. Messages come from your device to the service where other components of your application can then read them, and messages can then be sent back to your device.
![Devices without a valid secret key cannot connect to the IoT service](../../../images/iot-service-allowed-denied-connection.png)
***Devices without a valid secret key cannot connect to the IoT service. Microcontroller by Template / Cloud by Debi Alpa Nugraha / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
These services implement security by knowing about all the devices that can connect and send data, either by having the devices pre-registered with the service, or by giving the devices secret keys or certificates they can use to register themselves with the service the first time they connect. Unknown devices are unable to connect - if they try, the service rejects the connection and ignores messages sent by them.
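As a sketch of what this looks like from device code, here is a device authenticating to Azure IoT Hub with the `azure-iot-device` Python SDK, assuming a connection string that contains the device's secret key (the string and the telemetry payload are placeholders):

```python
# A minimal sketch of a device authenticating to an IoT service using a
# secret key embedded in a connection string. The string is a placeholder.
from azure.iot.device import IoTHubDeviceClient, Message

connection_string = '<your device connection string>'

client = IoTHubDeviceClient.create_from_connection_string(connection_string)
client.connect()   # rejected if the device or its key is not known
client.send_message(Message('{"soil_moisture": 450}'))
client.disconnect()
```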
✅ Do some research: What is the downside of having an open IoT service where any device or code can connect? Can you find specific examples of hackers taking advantage of this?

@ -1,8 +1,8 @@
# Migrate your application logic to the cloud
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-9.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -26,14 +26,10 @@ Serverless, or serverless computing, involves creating small blocks of code that
![Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run](../../../images/iot-messages-to-serverless.png)
***Events being sent from an IoT service to a serverless service, all being processed at the same time by multiple functions being run. IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)***
> 💁 If you've used database triggers before, you can think of this as the same thing: code being triggered by an event such as inserting a row.
![When many events are sent at the same time, the serverless service scales up to run them all at the same time](../../../images/serverless-scaling.png)
***When many events are sent at the same time, the serverless service scales up to run them all at the same time. IoT by Adrien Coquet from the [Noun Project](https://thenounproject.com)***
Your code is only run when the event happens - there is nothing keeping your code alive at other times. The event happens, your code is loaded and run. This makes serverless very scalable - if many events happen at the same time, the cloud provider can run your function as many times as you need at the same time across whatever servers they have available. The downside to this is if you need to share information between events, you need to save it somewhere like a database rather than storing it in memory.
Your code is written as a function that takes details about the event as a parameter. You can use a wide range of programming languages to write these serverless functions.
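For example, here is a minimal sketch of a serverless function written in Python for the Azure Functions Event Hub trigger. The function only exists while an event is being processed; the event hub connection details would live in your project's configuration rather than in the code.

```python
# A minimal sketch of a serverless function triggered by events.
import logging
import azure.functions as func

def main(event: func.EventHubEvent):
    # This code is loaded and run only when an event arrives
    body = event.get_body().decode('utf-8')
    logging.info('Processed an event: %s', body)
```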

@ -1,8 +1,8 @@
# Keep your plant secure
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-10.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -54,14 +54,10 @@ When a device connects to an IoT service, it uses an ID to identify itself. The
![Both valid and malicious devices could use the same ID to send telemetry](../../../images/iot-device-and-hacked-device-connecting.png)
***Both valid and malicious devices could use the same ID to send telemetry. Microcontroller by Template / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The way round this is to convert the data being sent into a scrambled format, using some kind of value known only to the device and the cloud to scramble the data. This process is called *encryption*, and the value used to encrypt the data is called an *encryption key*.
![If encryption is used, then only encrypted messages will be accepted, others will be rejected](../../../images/iot-device-and-hacked-device-connecting-encryption.png)
***If encryption is used, then only encrypted messages will be accepted, others will be rejected. Microcontroller by Template / IoT by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
The cloud service can then convert the data back to a readable format, using a process called *decryption*, using either the same encryption key, or a *decryption key*. If the encrypted message cannot be decrypted by the key, the device has been hacked and the message is rejected.
The technique for doing encryption and decryption is called *cryptography*.
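Here is a minimal sketch of symmetric encryption - where the same key both encrypts and decrypts - using the `Fernet` recipe from the Python `cryptography` library. The telemetry payload is a placeholder.

```python
# A minimal sketch of symmetric encryption: the same secret key is used
# to encrypt and decrypt, so it must be known to the device and the cloud.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the shared encryption key
cipher = Fernet(key)

token = cipher.encrypt(b'{"temperature": 24}')  # scrambled telemetry
print(cipher.decrypt(token))                    # readable only with the key
```

A message encrypted with a different key fails to decrypt, which is how a service can reject messages from devices that don't hold the right key.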
@ -164,8 +160,6 @@ When using X.509 certificates, both the sender and the recipient will have their
![Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it.](../../../images/send-message-certificate.png)
***Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it. Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)***
One big advantage of using X.509 certificates is that they can be shared between devices. You can create one certificate, upload it to IoT Hub, and use this for all your devices. Each device then just needs to know the private key to decrypt the messages it receives from IoT Hub.
The certificate used by your device to encrypt messages it sends to the IoT Hub is published by Microsoft. It is the same certificate that a lot of Azure services use, and is sometimes built into the SDKs.

@ -1,9 +0,0 @@
# Dummy File
This file acts as a placeholder for the `translations` folder. <br>
**Please remove this file after adding the first translation**
For the instructions, follow the directives in the [translations guide](https://github.com/microsoft/IoT-For-Beginners/blob/main/TRANSLATIONS.md) .
## THANK YOU
We truly appreciate your efforts!

@ -0,0 +1,20 @@
# IoT in farming
As the population grows, so does the demand on agriculture. While the amount of farmland doesn't change much, the climate certainly does - putting farmers under ever more pressure, especially the 2 billion [subsistence farmers](https://wikipedia.org/wiki/Subsistence_agriculture) whose families depend on the crops they grow for food. IoT can greatly help farmers decide what crops to grow and when to start work, increase yields, reduce the amount of manual labor, and detect and deal with pests.
In these 6 lessons we'll learn how to apply the Internet of Things to improve and automate farming.
> 💁 These lessons will use some cloud resources. Even if you don't complete all the lessons in this section, make sure you check out [Clean up your project](../clean-up.md).
## Topics
1. [Predict plant growth with IoT](lessons/1-predict-plant-growth/README.md)
1. [Detect soil moisture](lessons/2-detect-soil-moisture/README.md)
1. [Automated plant watering](lessons/3-automated-plant-watering/README.md)
1. [Control your plant from the cloud](lessons/4-migrate-your-plant-to-the-cloud/README.md)
1. [Control your application from the cloud](lessons/5-migrate-application-to-the-cloud/README.md)
1. [Keep your plant secure](lessons/6-keep-your-plant-secure/README.md)
## Credits
All the lessons were written with ♥️ by [Jim Bennett](https://GitHub.com/JimBobBennett)

@ -1,8 +1,8 @@
# Location tracking
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-11.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -114,15 +114,13 @@ GPS systems work by having a number of satellites that send a signal with each s
![By knowing the distance from the sensor to multiple satellites, the location can be calculated](../../../images/gps-satellites.png)
***By knowing the distance from the sensor to multiple satellites, the location can be calculated. Satellite by Noura Mbarki from the [Noun Project](https://thenounproject.com)***
GPS satellites are circling the Earth, not at a fixed point above the sensor, so location data includes altitude above sea level as well as latitude and longitude.
GPS used to have limitations on accuracy enforced by the US military, limiting accuracy to around 5 meters. This limitation was removed in 2000, allowing an accuracy of 30 centimeters. Getting this accuracy is not always possible due to interference with the signals.
✅ If you have a smart phone, launch the mapping app and see how accurate your location is. It may take a short period of time for your phone to detect multiple satellites to get a more accurate location.
> 💁 The satellites contain atomic clocks that are incredibly accurate, but they drift by 38 microseconds (0.0000038 seconds) a day compared to atomic clocks on Earth, due to time slowing down as speed increases as predicted by Einstein's theories of special and general relativity - the satellites travel faster than the Earth's rotation. This drift has been used to prove the predictions of special and general relativity, and has to be adjusted for in the design of GPS systems. Literally time runs slower on a GPS satellite.
GPS systems have been developed and deployed by a number of countries and political unions including the US, Russia, Japan, India, the EU, and China. Modern GPS sensors can connect to most of these systems to get faster and more accurate fixes.

@ -1,8 +1,8 @@
# Store location data
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-12.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -18,6 +18,7 @@ In this lesson we'll cover:
* [Structured and unstructured data](#structured-and-unstructured-data)
* [Send GPS data to an IoT Hub](#send-gps-data-to-an-iot-hub)
* [Hot, warm, and cold paths](#hot-warm-and-cold-paths)
* [Handle GPS events using serverless code](#handle-gps-events-using-serverless-code)
* [Azure Storage Accounts](#azure-storage-accounts)
* [Connect your serverless code to storage](#connect-your-serverless-code-to-storage)
@ -44,6 +45,8 @@ Imagine you were adding IoT devices to a fleet of vehicles for a large commercia
This data can change constantly. For example, if the IoT device is in a truck cab, then the data it sends may change as the trailer changes, for example only sending temperature data when a refrigerated trailer is used.
✅ What other IoT data might be captured? Think about the kinds of loads trucks can carry, as well as maintenance data.
This data varies from vehicle to vehicle, but it all gets sent to the same IoT service for processing. The IoT service needs to be able to process this unstructured data, storing it in a way that allows it to be searched or analyzed, whilst handling the different structures the data can take.
### SQL vs NoSQL storage
@ -58,10 +61,14 @@ The first databases were Relational Database Management Systems (RDBMS), or rela
For example, if you stored a user's personal details in a table, you would have some kind of internal unique ID per user that is used in a row in a table that contains the user's name and address. If you then wanted to store other details about that user, such as their purchases, in another table, you would have one column in the new table for that user's ID. When you look up a user, you can use their ID to get their personal details from one table, and their purchases from another.
SQL databases are ideal for storing structured data, and for when you want to ensure the data matches your schema.
✅ If you haven't used SQL before, take a moment to read up on it on the [SQL page on Wikipedia](https://wikipedia.org/wiki/SQL).
Some well known SQL databases are Microsoft SQL Server, MySQL, and PostgreSQL.
✅ Do some research: Read up on some of these SQL databases and their capabilities.
#### NoSQL database
NoSQL databases are called NoSQL because they don't have the same rigid structure as SQL databases. They are also known as document databases as they can store unstructured data such as documents.
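For example, two telemetry documents with different shapes can live side by side in the same NoSQL collection. The field names below are made up for illustration:

```python
# Two documents with different shapes, both valid in the same collection.
# All field names here are hypothetical.
refrigerated_truck = {
    'vehicle_id': 'truck-17',
    'location': {'lat': 47.6, 'lon': -122.1},
    'trailer_temperature': 3.5,   # only sent by refrigerated trailers
}

flatbed_truck = {
    'vehicle_id': 'truck-23',
    'location': {'lat': 47.2, 'lon': -121.9},
    'load_weight_kg': 12000,      # a different trailer, different fields
}
```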
@ -74,6 +81,8 @@ NoSQL database do not have a pre-defined schema that limits how data is stored,
Some well known NoSQL databases include Azure CosmosDB, MongoDB, and CouchDB.
✅ Do some research: Read up on some of these NoSQL databases and their capabilities.
In this lesson, you will be using NoSQL storage to store IoT data.
## Send GPS data to an IoT Hub
@ -82,8 +91,6 @@ In the last lesson you captured GPS data from a GPS sensor connected to your IoT
![Sending GPS telemetry from an IoT device to IoT Hub](../../../images/gps-telemetry-iot-hub.png)
***Sending GPS telemetry from an IoT device to IoT Hub. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
### Task - send GPS data to an IoT Hub
1. Create a new IoT Hub using the free tier.
@ -136,14 +143,36 @@ message = Message(json.dumps(message_json))
Run your device code and ensure messages are flowing into IoT Hub using the `az iot hub monitor-events` CLI command.
## Hot, warm, and cold paths
Data that flows from an IoT device to the cloud is not always processed in real time. Some data needs real time processing, other data can be processed a short time later, and other data can be processed much later. The flow of data to different services that process the data at different times is referred to as hot, warm, and cold paths.
### Hot path
The hot path refers to data that needs to be processed in real time or near real time. You would use hot path data for alerts, such as getting alerts that a vehicle is approaching a depot, or that the temperature in a refrigerated truck is too high.
To use hot path data, your code would respond to events as soon as they are received by your cloud services.
### Warm path
The warm path refers to data that can be processed a short while after being received, for example for reporting or short term analytics. You would use warm path data for daily reports on vehicle mileage, using data gathered the previous day.
Warm path data is stored once it is received by the cloud service inside some kind of storage that can be quickly accessed.
### Cold path
The cold path refers to historic data, storing data for the long term to be processed whenever needed. For example, you could use the cold path to get annual mileage reports for vehicles, or run analytics on routes to find the optimal route to reduce fuel costs.
Cold path data is stored in data warehouses - databases designed for storing large amounts of data that will never change and can be queried quickly and easily. You would normally have a regular job in your cloud application that would run at a regular time each day, week, or month to move data from warm path storage into the data warehouse.
✅ Think about the data you have captured so far in these lessons. Is it hot, warm or cold path data?
## Handle GPS events using serverless code
Once data is flowing into your IoT Hub, you can write some serverless code to listen for events published to the Event-Hub compatible endpoint. This is the warm path - this data will be stored and used in the next lesson for reporting on the journey.
![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger](../../../images/gps-telemetry-iot-hub-functions.png)
***Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
### Task - handle GPS events using serverless code
1. Create an Azure Functions app using the Azure Functions CLI. Use the Python runtime, create it in a folder called `gps-trigger`, and use the same name for the Functions App project name. Make sure you create a virtual environment to use for this.
@ -207,8 +236,6 @@ In this lesson, you will use the Python SDK to see how to interact with blob sto
![Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage](../../../images/save-telemetry-to-storage-from-functions.png)
***Sending GPS telemetry from an IoT device to IoT Hub, then to Azure Functions via an event hub trigger, then saving it to blob storage. GPS by mim studio / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
The data will be saved as a JSON blob with the following format:
```json
@ -343,7 +370,6 @@ The data will be saved as a JSON blob with the following format:
> 💁 Make sure you are not running the IoT Hub event monitor at the same time.
> 💁 You can find this code in the [code/functions](code/functions) folder.
### Task - verify the uploaded blobs

@ -1,8 +1,10 @@
# Visualize location data
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-13.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of Azure Maps with IoT, a service that will be covered in this lesson.
[![Azure Maps - The Microsoft Azure Enterprise Location Platform](https://img.youtube.com/vi/P5i2GFTtb2s/0.jpg)](https://www.youtube.com/watch?v=P5i2GFTtb2s)

@ -1,6 +1,8 @@
# Geofences
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-14.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of geofences and how to use them in Azure Maps, topics that will be covered in this lesson:
@ -41,7 +43,7 @@ There are many reasons why you would want to know that a vehicle is inside or ou
* Preparation for unloading - getting a notification that a vehicle has arrived on-site allows a crew to be prepared to unload the vehicle, reducing vehicle waiting time. This can allow a driver to make more deliveries in a day.
* Tax compliance - some countries, such as New Zealand, charge road taxes for diesel vehicles based on the vehicle weight when driving on public roads only. Using geofences allows you to track the mileage driven on public roads as opposed to private roads on sites such as farms or logging areas.
* Monitoring theft - if a vehicle should only remain in a certain area such as on a farm, and it leaves the geofence, it might have been stolen.
* Location compliance - some parts of a work site, farm or factory may be off-limits to certain vehicles, such as keeping vehicles that carry artificial fertilizers and pesticides away from fields growing organic produce. If a geofence is entered, then a vehicle is outside of compliance and the driver can be notified.
✅ Can you think of other uses for geofences?
@ -212,7 +214,7 @@ For example, imagine GPS readings showing a vehicle was driving along a road tha
![A GPS trail showing a vehicle passing the Microsoft campus on the 520, with GPS readings along the road except for one on the campus, inside a geofence](../../../images/geofence-crossing-inaccurate-gps.png)
In the above image, there is a geofence over part of the Microsoft campus. The red line shows a truck driving along the 520, with circles to show the GPS readings. Most of these are accurate and along the 520, with one inaccurate reading inside the geofence. There is no way that reading can be correct - there are no roads for the truck to suddenly divert from the 520 onto campus, then back onto the 520. The code that checks this geofence will need to take the previous readings into consideration before acting on the results of the geofence test.
✅ What additional data would you need to check to see if a GPS reading could be considered correct?
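One approach is to compare each new reading against the previous one and reject readings that imply an impossible speed. Here is a minimal sketch - the 100 km/h threshold and the helper name are assumptions for illustration:

```python
import math

def is_plausible(prev, curr, seconds, max_speed_kmh=100):
    # prev and curr are (latitude, longitude) pairs, seconds is the time
    # between the two readings. The speed threshold is an assumption.
    lat1, lon1, lat2, lon2 = map(math.radians, (*prev, *curr))

    # Haversine formula for the distance between the two points, in km
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    km = 6371 * 2 * math.asin(math.sqrt(a))

    # Reject the reading if the implied speed is impossibly high
    return km / (seconds / 3600) <= max_speed_kmh
```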
@ -237,7 +239,7 @@ In the above image, there is a geofence over part of the Microsoft campus. The r
1. Use curl to make a GET request to this URL:
```sh
curl --request GET '<URL>'
```
> 💁 If you get a response code of `BadRequest`, with an error of:
@ -255,7 +257,7 @@ In the above image, there is a geofence over part of the Microsoft campus. The r
"geometries": [
{
"deviceId": "gps-sensor",
"udId": "1ffb2047-6757-8c29-2c3d-da44cec55ff9",
"udId": "7c3776eb-da87-4c52-ae83-caadf980323a",
"geometryId": "1",
"distance": 999.0,
"nearestLat": 47.645875,

@ -1,6 +1,6 @@
# Manufacturing and processing - using IoT to improve the processing of food
Once food reaches a central hub or processing plant, it isn't always just shipped out to supermarkets. A lot of the time the food goes through a number of processing steps, such as sorting by quality. This is a process that used to be manual - it would start in the field when pickers would only pick ripe fruit, then at the factory the fruit would ride a conveyor belt and employees would manually remove any bruised or rotten fruit. Having picked and sorted strawberries myself as a summer job during school, I can testify that this isn't a fun job.
More modern setups rely on IoT for sorting. Some of the earliest devices like the sorters from [Weco](https://wecotek.com) use optical sensors to detect the quality of produce, rejecting green tomatoes for example. These can be deployed in harvesters on the farm itself, or in processing plants.
@ -10,7 +10,7 @@ As advances happen in Artificial Intelligence (AI) and Machine Learning (ML), th
In these 4 lessons you'll learn how to train image-based AI models to detect fruit quality, how to use these from an IoT device, and how to run these on the edge - that is on an IoT device rather than in the cloud.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
## Topics
@ -21,4 +21,4 @@ In these 4 lessons you'll learn how to train image-based AI models to detect fru
## Credits
All the lessons were written with ♥️ by [Jen Fox](https://github.com/jenfoxbot) and [Jim Bennett](https://GitHub.com/JimBobBennett)

@ -1,6 +1,8 @@
# Train a fruit quality detector
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-15.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of the Azure Custom Vision service, a service that will be covered in this lesson.
@ -38,8 +40,6 @@ The rise of automated harvesting moved the sorting of produce from the harvest t
![If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever](../../../images/optical-tomato-sorting.png)
***If a red tomato is detected it continues its journey uninterrupted. If a green tomato is detected it is flicked into a waste bin by a lever. tomato by parkjisun from the Noun Project - from the [Noun Project](https://thenounproject.com)***
The next evolution was to use machines to sort, either built into the harvester, or in the processing plants. The first generation of these machines used optical sensors to detect colors, controlling actuators to push green tomatoes into a waste bin using levers or puffs of air, leaving red tomatoes to continue on a network of conveyor belts.
The video below shows one of these machines in action.
@ -60,11 +60,11 @@ Traditional programming is where you take data, apply an algorithm to the data,
![Traditional development takes input and an algorithm and gives output. Machine learning uses input and output data to train a model, and this model can take new input data to generate new output](../../../images/traditional-vs-ml.png)
Machine learning turns this around - you start with data and known outputs, and the machine learning algorithm learns from the data. You can then take that trained algorithm, called a *machine learning model* or *model*, and input new data and get new output.
> 🎓 The process of a machine learning algorithm learning from the data is called *training*. The inputs and known outputs are called *training data*.
For example, you could give a model millions of pictures of unripe bananas as input training data, with the training output set as `unripe`, and millions of ripe banana pictures as training data with the output set as `ripe`. The ML algorithm will then create a model based off this data. You then give this model a new picture of a banana and it will predict if the new picture is a ripe or an unripe banana.
> 🎓 The results of ML models are called *predictions*
@ -74,6 +74,8 @@ ML models don't give a binary answer, instead they give probabilities. For examp
The ML model used to detect images like this is called an *image classifier* - it is given labelled images, and then classifies new images based off these labels.
> 💁 This is an over-simplification, and there are many other ways to train models that don't always need labelled outputs, such as unsupervised learning. If you want to learn more about ML, check out [ML for beginners, a 24 lesson curriculum on Machine Learning](https://aka.ms/ML-beginners).
## Train an image classifier
To successfully train an image classifier you need millions of images. As it turns out, once you have an image classifier trained on millions or billions of assorted images, you can re-use it and re-train it using a small set of images and get great results, using a process called *transfer learning*.
@ -123,6 +125,8 @@ To use Custom Vision, you first need to create two cognitive services resources
This will create a Custom Vision training resource in your Resource Group. It will be called `fruit-quality-detector-training` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services.
> 💁 Use `S0` sku if you already have a free account using any of the Cognitive Services.
1. Use the following command to create a free Custom Vision prediction resource:
```sh
@ -142,7 +146,7 @@ To use Custom Vision, you first need to create two cognitive services resources
1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai), and sign in with the Microsoft account you used for your Azure account.
1. Follow the [create a new Project section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#create-a-new-project) to create a new Custom Vision project. The UI may change and these docs are always the most up to date reference.
Call your project `fruit-quality-detector`.
@ -176,11 +180,11 @@ Image classifiers run at very low resolution. For example Custom Vision can take
If you don't have both ripe and unripe fruit, you can use different fruits, or any two objects you have available. You can also find some example images in the [images](./images) folder of ripe and unripe bananas that you can use.
1. Follow the [upload and tag images section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#upload-and-tag-images) to upload your training images. Tag the ripe fruit as `ripe`, and the unripe fruit as `unripe`.
![The upload dialogs showing the upload of ripe and unripe banana pictures](../../../images/image-upload-bananas.png)
1. Follow the [train the classifier section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#train-the-classifier) to train the image classifier on your uploaded images.
You will be given a choice of training type. Select **Quick Training**.
@ -194,7 +198,7 @@ Once your classifier is trained, you can test it by giving it a new image to cla
### Task - test your image classifier
1. Follow the [test your model documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#test-your-model) to test your image classifier. Use the testing images you created earlier, not any of the images you used for training.
![An unripe banana predicted as unripe with a 98.9% probability, ripe with a 1.1% probability](../../../images/banana-unripe-quick-test-prediction.png)
@ -208,7 +212,7 @@ Every time you make a prediction using the quick test option, the image and resu
### Task - retrain your image classifier
1. Follow the [use the predicted image for training documentation on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/test-your-model?WT.mc_id=academic-17441-jabenn#use-the-predicted-image-for-training) to retrain your model, using the correct tag for each image.
1. Once your model has been retrained, test it on new images.
@ -226,8 +230,8 @@ Try it out and see what the predictions are. You can find images to try with usi
## Review & Self Study
* When you trained your classifier, you would have seen values for *Precision*, *Recall*, and *AP* that rate the model that was created. Read up on what these values are using [the evaluate the classifier section of the build a classifier quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-build-a-classifier?WT.mc_id=academic-17441-jabenn#evaluate-the-classifier)
* Read up on how to improve your classifier from the [how to improve your Custom Vision model on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/getting-started-improving-your-classifier?WT.mc_id=academic-17441-jabenn)
## Assignment

@ -1,8 +1,8 @@
# Check fruit quality from an IoT device
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-16.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -151,7 +151,7 @@ If you were to create a production device to sell to farms or factories, how wou
You trained your custom vision model using the portal. This relies on having images available - and in the real world you may not be able to get training data that matches what the camera on your device captures. You can work round this by training directly from your device using the training API, to train a model using images captured from your IoT device.
* Read up on the training API in the [using the Custom Vision SDK quick start](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/quickstarts/image-classification?tabs=visual-studio&pivots=programming-language-python&WT.mc_id=academic-17441-jabenn)
## Assignment

@ -1,6 +1,6 @@
# Classify an image - Virtual IoT Hardware and Raspberry Pi
In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it.
## Send images to Custom Vision
@ -25,7 +25,7 @@ The Custom Vision service has a Python SDK you can use to classify images.
This brings in some modules from the Custom Vision libraries: one to authenticate with the prediction key, and one to provide a prediction client class that can call Custom Vision.
1. Add the following code to the end of the file:
```python
prediction_url = '<prediction_url>'
@ -86,6 +86,6 @@ The Custom Vision service has a Python SDK you can use to classify images.
![A banana in custom vision predicted ripe at 56.8% and unripe at 43.1%](../../../images/custom-vision-banana-prediction.png)
> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-iot-device](code-classify/virtual-iot-device) folder.
😀 Your fruit quality classifier program was a success!

@ -101,7 +101,7 @@ Program the device.
> 💁 You can capture the image directly to a file instead of a `BytesIO` object by passing the file name to the `camera.capture` call. The reason for using the `BytesIO` object is so that later in this lesson you can send the image to your image classifier.
1. Configure the image that the camera in CounterFit will capture. You can either set the *Source* to *File*, then upload an image file, or set the *Source* to *WebCam*, and images will be captured from your web cam. Make sure you select the **Set** button after selecting a picture or selecting your webcam.
![CounterFit with a file set as the image source, and a web cam set showing a person holding a banana in a preview of the webcam](../../../images/counterfit-camera-options.png)

@ -10,7 +10,7 @@ The camera you'll use is an [ArduCam Mini 2MP Plus](https://www.arducam.com/prod
## Connect the camera
The ArduCam doesn't have a Grove socket; instead it connects to both the SPI and I<sup>2</sup>C buses via the GPIO pins on the Wio Terminal.
### Task - connect the camera

@ -1,10 +1,10 @@
# Classify an image - Wio Terminal
In this part of the lesson, you will send the image captured by the camera to the Custom Vision service to classify it.
## Classify an image
The Custom Vision service has a REST API you can call from the Wio Terminal to classify images. This REST API is accessed over an HTTPS connection - a secure HTTP connection.
When interacting with HTTPS endpoints, the client code needs to request the public key certificate from the server being accessed, and use that to encrypt the traffic it sends. Your web browser does this automatically, but microcontrollers do not. You will need to request this certificate manually and use it to create a secure connection to the REST API. These certificates don't change, so once you have a certificate, it can be hard coded in your application.
@ -12,7 +12,7 @@ These certificates contain public keys, and don't need to be kept secure. You ca
### Task - set up an SSL client
1. Open the `fruit-quality-detector` app project if it's not already open.
1. Open the `config.h` header file, and add the following:

@ -1,14 +1,12 @@
# Run your fruit detector on the edge
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-17.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of running image classifiers on IoT devices, the topic that is covered in this lesson.
[![Custom Vision AI on Azure IoT Edge](https://img.youtube.com/vi/_K5fqGLO8us/0.jpg)](https://www.youtube.com/watch?v=_K5fqGLO8us)

> 🎥 Click the image above to watch a video
## Pre-lecture quiz
@ -16,23 +14,98 @@ This video gives an overview of running image classifiers on IoT devices, the to
## Introduction
In the last lesson you used your image classifier to classify ripe and unripe fruit, sending an image captured by the camera on your IoT device over the internet to a cloud service. These calls take time, cost money, and depending on the kind of image data you are using, could have privacy implications.
In this lesson you will learn about how to run machine learning (ML) models on the edge - on IoT devices running on your own network rather than in the cloud. You will learn the benefits and drawbacks of edge computing versus cloud computing, how to deploy your AI model to the edge, and how to access it from your IoT device.
In this lesson we'll cover:
* [Edge computing](#edge-computing)
* [Azure IoT Edge](#azure-iot-edge)
* [Register an IoT Edge device](#register-an-iot-edge-device)
* [Set up an IoT Edge device](#set-up-an-iot-edge-device)
* [Export your model](#export-your-model)
* [Prepare your container for deployment](#prepare-your-container-for-deployment)
* [Deploy your container](#deploy-your-container)
* [Use your IoT Edge device](#use-your-iot-edge-device)
## Edge computing
Edge computing involves having computers that process IoT data as close as possible to where the data is generated. Instead of having this processing in the cloud, it is moved to the edge of the cloud - your internal network.
![An architecture diagram showing internet services in the cloud and IoT devices on a local network](../../../images/cloud-without-edge.png)
In the lessons so far, you have had devices gathering data and sending data to the cloud to be analyzed, running serverless functions or AI models in the cloud.
![An architecture diagram showing IoT devices on a local network connecting to edge devices, and those edge devices connect to the cloud](../../../images/cloud-with-edge.png)
Edge computing involves moving some of the cloud services off the cloud and onto computers running on the same network as the IoT devices, only communicating with the cloud if needed. For example, you can run AI models on edge devices to analyse fruit for ripeness, and only send analytics back to the cloud, such as the number of ripe pieces of fruit vs unripe.
✅ Think about the IoT applications you have built so far. Which parts of them could be moved to the edge?
### Upsides
The upsides of edge computing are:
1. **Speed** - edge computing is ideal for time-sensitive data as actions are done on the same network as the device, rather than making calls across the internet. This enables higher speeds as internal networks can run at substantially faster speeds than internet connections, with the data travelling a much shorter distance.
> 💁 Despite optical cables being used for internet connections allowing data to travel at the speed of light, data can take time to travel around the world to cloud providers. For example, if you are sending data from Europe to cloud services in the US it takes at least 28ms for the data to cross the Atlantic in an optical cable, and that is ignoring the time taken to get the data to the transatlantic cable, convert from electrical to light signals and back again on the other side, then from the optical cable to the cloud provider.
Edge computing also requires less network traffic, reducing the risk of your data slowing down due to congestion on the limited bandwidth available for an internet connection.
1. **Remote accessibility** - edge compute works when you have limited or no connectivity, or connectivity is too expensive to use continually. For example, when working in humanitarian disaster areas where infrastructure is limited, or in developing nations.
1. **Lower costs** - performing data collection, storage, analysis, and triggering actions on edge devices reduces usage of cloud services which can reduce the overall cost of your IoT application. There has been a recent rise in devices designed for edge computing, such as AI accelerator boards like the [Jetson Nano from NVIDIA](https://developer.nvidia.com/embedded/jetson-nano-developer-kit), which can run AI workloads using GPU-based hardware on devices that cost less than US$100.
1. **Privacy and security** - with edge compute, data stays on your network and is not uploaded to the cloud. This is often preferred for sensitive and personally identifiable information, especially because data does not need to be stored after it has been analyzed, which greatly reduces the risk of data leaks. Examples include medical data and security camera footage.
1. **Handling insecure devices** - if you have devices with known security flaws that you don't want to connect directly to your network or the internet, then you can connect them to a separate network with a gateway IoT Edge device. This edge device can then also have a connection to your wider network or the internet, and manage the data flows back and forth.
1. **Support for incompatible devices** - if you have devices that cannot connect to IoT Hub, for example devices that can only connect using HTTP connections or devices that only have Bluetooth to connect, you can use an IoT edge device as a gateway device, forwarding on messages to IoT Hub.
✅ Do some research: What other upsides might there be to edge computing?
### Downsides
There are downsides to edge computing, where the cloud may be a preferred option:
1. **Scale and flexibility** - cloud computing can adjust to network and data needs in real-time by adding or reducing servers and other resources. To add more edge computers requires manually adding more devices.
1. **Reliability and resiliency** - cloud computing provides multiple servers often in multiple locations for redundancy and disaster recovery. To have the same level of redundancy on the edge requires large investments and a lot of configuration work.
1. **Maintenance** - cloud service providers provide system maintenance and updates.
✅ Do some research: What other downsides might there be to edge computing?
The downsides are really the opposite of the upsides of using the cloud - you have to build and manage these devices yourself, rather than relying on the expertise and scale of cloud providers.
Some of the risks are mitigated by the very nature of edge computing. For example, if you have an edge device running in a factory gathering data from machinery, you don't need to think about some disaster recovery scenarios. If the power to the factory goes out then you don't need a backup edge device as the machines that generate the data the edge device processes will also be without power.
For IoT systems, you'll often want a blend of cloud and edge computing, leveraging each service based on the needs of the system, its customers, and its maintainers.
## Azure IoT Edge
![The Azure IoT Edge logo](../../../images/azure-iot-edge-logo.png)
Azure IoT Edge is a service that can help you to move workloads out of the cloud and to the edge. You set up a device as an edge device, and from the cloud you can deploy code to that edge device. This allows you to mix the capabilities of the cloud and the edge.
> 🎓 *Workloads* is a term for any service that does some kind of work, such as AI models, applications, or serverless functions.
For example, you can train an image classifier in the cloud, then from the cloud deploy it to an edge device. Your IoT device then sends images to the edge device for classification, rather than sending the images over the internet. If you need to deploy a new iteration of the model, you can train it in the cloud and use IoT Edge to update the model on the edge device to your new iteration.
> 🎓 Software that is deployed to IoT Edge is known as *modules*. By default IoT Edge runs modules that communicate with IoT Hub, such as the `edgeAgent` and `edgeHub` modules. When you deploy an image classifier, this is deployed as an additional module.
IoT Edge is built into IoT Hub, so you can manage edge devices using the same service you would use to manage IoT devices, with the same level of security.
IoT Edge runs code from *containers* - self contained applications that are run in isolation from the rest of the applications on your computer. When you run a container it acts like a separate computer running inside your computer, with its own software, services and applications running. Most of the time containers cannot access anything on your computer unless you choose to share things like a folder with the container. The container then exposes services via an open port that you can connect to or expose to your network.
![A web request redirected to a container](../../../images/container-web-browser.png)
For example, you can have a container with a website running on port 80, the default HTTP port, and you can then expose it from your computer, also on port 80.
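With Docker, this kind of port mapping is a single flag on the `docker run` command. A minimal sketch, assuming a hypothetical image called `my-web-site`:

```sh
# Run the hypothetical my-web-site image in the background, mapping
# port 80 inside the container to port 80 on this computer
docker run --detach --publish 80:80 my-web-site
```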
✅ Do some research: Read up on containers and services such as Docker or Moby.
You can use Custom Vision to download image classifiers and deploy them as containers, either running directly on a device or deployed via IoT Edge. Once they are running in a container, they can be accessed using the same REST API as the cloud version, but with the endpoint pointing to the Edge device running the container.
## Register an IoT Edge device
@ -66,9 +139,11 @@ To use an IoT Edge device, it needs to be registered in IoT Hub.
## Set up an IoT Edge device
Once you have created the edge device registration in your IoT Hub, you can set up the edge device.
### Task - Install and start the IoT Edge Runtime
**The IoT Edge runtime only runs Linux containers.** It can be run on Linux, or on Windows using Linux Virtual Machines.
* If you are using a Raspberry Pi as your IoT device, then this runs a supported version of Linux and can host the IoT Edge runtime. Follow the [Install Azure IoT Edge for Linux guide on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge?WT.mc_id=academic-17441-jabenn) to install IoT Edge and set the connection string.
@ -80,24 +155,457 @@ The IoT Edge runtime only runs Linux containers. It can be run on Linux, or on W
* If you are using macOS, you can create a virtual machine (VM) in the cloud to use for your IoT Edge device. These are computers you can create in the cloud and access over the internet. You can create a Linux VM that has IoT Edge installed. Follow the [Create a virtual machine running IoT Edge guide](vm-iotedge.md) for instructions on how to do this.
## Export your model
To run the classifier at the edge, it needs to be exported from Custom Vision. Custom Vision can generate two types of models - standard models and compact models. Compact models use various techniques to reduce the size of the model, making it small enough to be downloaded and deployed on IoT devices.
When you created the image classifier, you used the *Food* domain, a version of the model that is optimized for training on food images. In Custom Vision, you can change the domain of your project and use your existing training data to train a new model with the new domain. All of the domains supported by Custom Vision are available as both standard and compact.
### Task - train your model using the Food (compact) domain
1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `fruit-quality-detector` project.
1. Select the **Settings** button (the ⚙ icon)
1. In the *Domains* list, select *Food (compact)*
1. Under *Export Capabilities*, make sure *Basic platforms (Tensorflow, CoreML, ONNX, ...)* is selected.
1. At the bottom of the Settings page, select **Save Changes**.
1. Retrain the model with the **Train** button, selecting *Quick training*.
### Task - export your model
Once the model has been trained, it needs to be exported as a container.
1. Select the **Performance** tab, and find your latest iteration that was trained using the compact domain.
1. Select the **Export** button at the top.
1. Select **DockerFile**, then choose a version that matches your edge device:
* If you are running IoT Edge on a Linux computer, a Windows computer or a Virtual Machine, select the *Linux* version.
* If you are running IoT Edge on a Raspberry Pi, select the *ARM (Raspberry Pi 3)* version.
> 🎓 Docker is one of the most popular tools for managing containers, and a DockerFile is a set of instructions on how to set up the container.
1. Select **Export** to get Custom Vision to create the relevant files, then **Download** to download them in a zip file.
1. Save the files to your computer, then unzip the folder.
## Prepare your container for deployment
![Containers are built then pushed to a container registry, then deployed from the container registry to an edge device using IoT Edge](../../../images/container-edge-flow.png)
Once you have downloaded your model, it needs to be built into a container, then pushed to a container registry - an online location where you can store containers. IoT Edge can then download the container from the registry onto your device.
![The Azure Container Registry logo](../../../images/azure-container-registry-logo.png)
The container registry you will use for this lesson is Azure Container Registry. This is not a free service, so to save money make sure you [clean up your project](../../../clean-up.md) once you are finished.
> 💁 You can see the costs of using an Azure Container Registry in the [Azure Container Registry pricing page](https://azure.microsoft.com/pricing/details/container-registry/?WT.mc_id=academic-17441-jabenn)
### Task - install Docker
To build and deploy the classifier, you'll need to install [Docker](https://www.docker.com/).
1. Follow the Docker installation instructions on the [Docker install page](https://www.docker.com/products/docker-desktop) to install Docker Desktop or the Docker engine. Ensure it is running after installation.
### Task - create a container registry resource
1. Run the following command from your Terminal or command prompt to create an Azure Container Registry resource:
```sh
az acr create --resource-group fruit-quality-detector \
--sku Basic \
--name <Container registry name>
```
Replace `<Container registry name>` with a unique name for your container registry, using letters and numbers only. Base this around `fruitqualitydetector`. This name becomes part of the URL to access the container registry, so needs to be globally unique.
1. Log in to the Azure Container Registry with the following command:
```sh
az acr login --name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
1. Set the container registry into admin mode so you can generate a password with the following command:
```sh
az acr update --admin-enabled true \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
1. Generate passwords for your container registry with the following command:
```sh
az acr credential renew --password-name password \
--output table \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
Take a copy of the value of `PASSWORD`, as you will need this later.
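If you misplace the password, you can display the current credentials again at any time with the following command:

```sh
az acr credential show --output table \
    --name <Container registry name>
```

Replace `<Container registry name>` with the name you used for your container registry.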
### Task - build your container
What you downloaded from Custom Vision was a DockerFile containing instructions on how the container should be built, along with the application code that will run inside the container to host your Custom Vision model, and a REST API to call it. You can use Docker to build a tagged container from the DockerFile, then push it to your container registry.
> 🎓 Containers are given a tag that defines a name and version for them. When you need to update a container you can build it with the same tag but a newer version.
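For example, when you later improve your model you might build and push the image again as `v2`, leaving `v1` untouched in the registry. A sketch of what that would look like, using the same build and push commands you'll meet below:

```sh
# Build and push an updated classifier as v2, keeping v1 available
docker build --platform <platform> -t <Container registry name>.azurecr.io/classifier:v2 .
docker push <Container registry name>.azurecr.io/classifier:v2
```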
1. Open your terminal or command prompt and navigate to the unzipped model that you downloaded from Custom Vision.
1. Run the following command to build and tag the image:
```sh
docker build --platform <platform> -t <Container registry name>.azurecr.io/classifier:v1 .
```
Replace `<platform>` with the platform that this container will run on. If you are running IoT Edge on a Raspberry Pi, set this to `linux/arm64`, otherwise set this to `linux/amd64`.
> 💁 If you are running this command from the device you are running IoT Edge from, such as running this from your Raspberry Pi, you can omit the `--platform <platform>` part as it defaults to the current platform.
Replace `<Container registry name>` with the name you used for your container registry.
> 💁 If you are running Linux you may need to use `sudo` to run this command.
Docker will build the image, configuring all the software needed. The image will then be tagged as `classifier:v1`.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux docker build --platform linux/amd64 -t fruitqualitydetectorjimb.azurecr.io/classifier:v1 .
[+] Building 102.4s (11/11) FINISHED
=> [internal] load build definition from Dockerfile
=> => transferring dockerfile: 131B
=> [internal] load .dockerignore
=> => transferring context: 2B
=> [internal] load metadata for docker.io/library/python:3.7-slim
=> [internal] load build context
=> => transferring context: 905B
=> [1/6] FROM docker.io/library/python:3.7-slim@sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6
=> => resolve docker.io/library/python:3.7-slim@sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6
=> => sha256:b4d181a07f8025e00e0cb28f1cc14613da2ce26450b80c54aea537fa93cf3bda 27.15MB / 27.15MB
=> => sha256:de8ecf497b753094723ccf9cea8a46076e7cb845f333df99a6f4f397c93c6ea9 2.77MB / 2.77MB
=> => sha256:707b80804672b7c5d8f21e37c8396f319151e1298d976186b4f3b76ead9f10c8 10.06MB / 10.06MB
=> => sha256:b21b91c9618e951a8cbca5b696424fa5e820800a88b7e7afd66bba0441a764d6 1.86kB / 1.86kB
=> => sha256:44073386687709c437586676b572ff45128ff1f1570153c2f727140d4a9accad 1.37kB / 1.37kB
=> => sha256:3d94f0f2ca798607808b771a7766f47ae62a26f820e871dd488baeccc69838d1 8.31kB / 8.31kB
=> => sha256:283715715396fd56d0e90355125fd4ec57b4f0773f306fcd5fa353b998beeb41 233B / 233B
=> => sha256:8353afd48f6b84c3603ea49d204bdcf2a1daada15f5d6cad9cc916e186610a9f 2.64MB / 2.64MB
=> => extracting sha256:b4d181a07f8025e00e0cb28f1cc14613da2ce26450b80c54aea537fa93cf3bda
=> => extracting sha256:de8ecf497b753094723ccf9cea8a46076e7cb845f333df99a6f4f397c93c6ea9
=> => extracting sha256:707b80804672b7c5d8f21e37c8396f319151e1298d976186b4f3b76ead9f10c8
=> => extracting sha256:283715715396fd56d0e90355125fd4ec57b4f0773f306fcd5fa353b998beeb41
=> => extracting sha256:8353afd48f6b84c3603ea49d204bdcf2a1daada15f5d6cad9cc916e186610a9f
=> [2/6] RUN pip install -U pip
=> [3/6] RUN pip install --no-cache-dir numpy~=1.17.5 tensorflow~=2.0.2 flask~=1.1.2 pillow~=7.2.0
=> [4/6] RUN pip install --no-cache-dir mscviplib==2.200731.16
=> [5/6] COPY app /app
=> [6/6] WORKDIR /app
=> exporting to image
=> => exporting layers
=> => writing image sha256:1846b6f134431f78507ba7c079358ed66d944c0e185ab53428276bd822400386
=> => naming to fruitqualitydetectorjimb.azurecr.io/classifier:v1
```
### Task - push your container to your container registry
1. Use the following command to push your container to your container registry:
```sh
docker push <Container registry name>.azurecr.io/classifier:v1
```
Replace `<Container registry name>` with the name you used for your container registry.
> 💁 If you are running Linux you may need to use `sudo` to run this command.
The container will be pushed to the container registry.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux docker push fruitqualitydetectorjimb.azurecr.io/classifier:v1
The push refers to repository [fruitqualitydetectorjimb.azurecr.io/classifier]
5f70bf18a086: Pushed
8a1ba9294a22: Pushed
56cf27184a76: Pushed
b32154f3f5dd: Pushed
36103e9a3104: Pushed
e2abb3cacca0: Pushed
4213fd357bbe: Pushed
7ea163ba4dce: Pushed
537313a13d90: Pushed
764055ebc9a7: Pushed
v1: digest: sha256:ea7894652e610de83a5a9e429618e763b8904284253f4fa0c9f65f0df3a5ded8 size: 2423
```
1. To verify the push, you can list the containers in your registry with the following command:
```sh
az acr repository list --output table \
--name <Container registry name>
```
Replace `<Container registry name>` with the name you used for your container registry.
```output
➜ d4ccc45da0bb478bad287128e1274c3c.DockerFile.Linux az acr repository list --name fruitqualitydetectorjimb --output table
Result
----------
classifier
```
You will see your classifier listed in the output.
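You can also list the tags that have been pushed for the classifier repository, which is a quick way to see which versions are in the registry:

```sh
az acr repository show-tags --repository classifier \
    --output table \
    --name <Container registry name>
```

Replace `<Container registry name>` with the name you used for your container registry.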
## Deploy your container
Your container can now be deployed to your IoT Edge device. To deploy, you need to define a deployment manifest - a JSON document that lists the modules that will be deployed to the edge device.
### Task - create the deployment manifest
1. Create a new file called `deployment.json` somewhere on your computer.
1. Add the following to this file:
```json
{
"content": {
"modulesContent": {
"$edgeAgent": {
"properties.desired": {
"schemaVersion": "1.1",
"runtime": {
"type": "docker",
"settings": {
"minDockerVersion": "v1.25",
"loggingOptions": "",
"registryCredentials": {
"ClassifierRegistry": {
"username": "<Container registry name>",
"password": "<Container registry password>",
"address": "<Container registry name>.azurecr.io"
}
}
}
},
"systemModules": {
"edgeAgent": {
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-agent:1.1",
"createOptions": "{}"
}
},
"edgeHub": {
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-hub:1.1",
"createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"
}
}
},
"modules": {
"ImageClassifier": {
"version": "1.0",
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "<Container registry name>.azurecr.io/classifier:v1",
"createOptions": "{\"ExposedPorts\": {\"80/tcp\": {}},\"HostConfig\": {\"PortBindings\": {\"80/tcp\": [{\"HostPort\": \"80\"}]}}}"
}
}
}
}
},
"$edgeHub": {
"properties.desired": {
"schemaVersion": "1.1",
"routes": {
"upstream": "FROM /messages/* INTO $upstream"
},
"storeAndForwardConfiguration": {
"timeToLiveSecs": 7200
}
}
}
}
}
}
```
> 💁 You can find this file in the [code-deployment/deployment](code-deployment/deployment) folder.
Replace the three instances of `<Container registry name>` with the name you used for your container registry. One is in the `ImageClassifier` module section, the other two are in the `registryCredentials` section.
Replace `<Container registry password>` in the `registryCredentials` section with your container registry password.
1. From the folder containing your deployment manifest, run the following command:
```sh
az iot edge set-modules --device-id fruit-quality-detector-edge \
--content deployment.json \
--hub-name <hub_name>
```
Replace `<hub_name>` with the name of your IoT Hub.
The image classifier module will be deployed to your edge device.
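If you want to check the deployment from the cloud side, you can list the module identities that IoT Hub has registered for the edge device. A quick sanity check, assuming the same Azure CLI IoT extension used for `az iot edge set-modules` is installed:

```sh
az iot hub module-identity list --device-id fruit-quality-detector-edge \
    --output table \
    --hub-name <hub_name>
```

Replace `<hub_name>` with the name of your IoT Hub. You should see the `ImageClassifier` module listed along with the system modules.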
### Task - verify the classifier is running
1. Connect to the IoT edge device:
* If you are using a Raspberry Pi to run IoT Edge, connect using ssh either from your terminal, or via a remote SSH session in VS Code
* If you are running IoT Edge in a Linux container on Windows, follow the steps in the [Verify successful configuration guide](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge-on-windows?view=iotedge-2018-06&tabs=powershell&WT.mc_id=academic-17441-jabenn#verify-successful-configuration) to connect to the IoT Edge device.
* If you are running IoT Edge on a virtual machine, you can SSH into the machine using the `adminUsername` and `password` you set when creating the VM, and using either the IP address or DNS name:
```sh
ssh <adminUsername>@<IP address>
```
Or:
```sh
ssh <adminUsername>@<DNS Name>
```
Enter your password when prompted.
1. Once you are connected, run the following command to get the list of IoT Edge modules:
```sh
iotedge list
```
> 💁 You may need to run this command with `sudo`.
You will see the running modules:
```output
jim@fruit-quality-detector-jimb:~$ iotedge list
NAME STATUS DESCRIPTION CONFIG
ImageClassifier running Up 42 minutes fruitqualitydetectorjimb.azurecr.io/classifier:v1
edgeAgent running Up 42 minutes mcr.microsoft.com/azureiotedge-agent:1.1
edgeHub running Up 42 minutes mcr.microsoft.com/azureiotedge-hub:1.1
```
1. Check the logs for the Image classifier module with the following command:
```sh
iotedge logs ImageClassifier
```
> 💁 You may need to run this command with `sudo`.
```output
jim@fruit-quality-detector-jimb:~$ iotedge logs ImageClassifier
2021-07-05 20:30:15.387144: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-07-05 20:30:15.392185: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2394450000 Hz
2021-07-05 20:30:15.392712: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55ed9ac83470 executing computations on platform Host. Devices:
2021-07-05 20:30:15.392806: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): Host, Default Version
Loading model...Success!
Loading labels...2 found. Success!
* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
```
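If a module is missing or keeps restarting, the IoT Edge runtime includes a diagnostics command that checks connectivity and configuration:

```sh
iotedge check
```

> 💁 You may need to run this command with `sudo`.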
### Task - test the image classifier
1. You can use curl to test the image classifier using the IP address or host name of the computer that is running the IoT Edge agent. Find the IP address:
* If you are on the same machine that IoT Edge is running on, you can use `localhost` as the host name.
* If you are using a VM, you can use either the IP address or the DNS name of the VM
* Otherwise you can obtain the IP address of the machine running IoT Edge:
* On Windows 10, follow the [Find your IP address guide](https://support.microsoft.com/windows/find-your-ip-address-f21a9bbc-c582-55cd-35e0-73431160a1b9?WT.mc_id=academic-17441-jabenn)
* On macOS, follow the [How to find your IP address on a Mac guide](https://www.hellotech.com/guide/for/how-to-find-ip-address-on-mac)
* On Linux, follow the section on finding your private IP address in the [How to find your IP address in Linux guide](https://opensource.com/article/18/5/how-find-ip-address-linux), or try the quick command shown after this list
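On many Linux systems, including Raspberry Pi OS, a quick alternative is to print the addresses directly. A sketch, assuming the `hostname` tool on your distribution supports the `-I` flag:

```sh
# Print the IP addresses assigned to this machine
hostname -I
```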
1. You can test the container with a local file by running the following curl command:
```sh
curl --location \
--request POST 'http://<IP address or name>/image' \
--header 'Content-Type: image/png' \
--data-binary '@<file_Name>'
```
Replace `<IP address or name>` with the IP address or host name of the computer running IoT Edge. Replace `<file_Name>` with the name of the file to test.
You will see the prediction results in the output:
```output
{
"created": "2021-07-05T21:44:39.573181",
"id": "",
"iteration": "",
"predictions": [
{
"boundingBox": null,
"probability": 0.9995615482330322,
"tagId": "",
"tagName": "ripe"
},
{
"boundingBox": null,
"probability": 0.0004384400090202689,
"tagId": "",
"tagName": "unripe"
}
],
"project": ""
}
```
> 💁 There is no need to provide a prediction key here, as this is not using an Azure resource. Instead, security would be configured on the internal network based on internal security needs, rather than relying on a public endpoint and an API key.
## Use your IoT Edge device
Now that your Image Classifier has been deployed to an IoT Edge device, you can use it from your IoT device.
### Task - use your IoT Edge device
Work through the relevant guide to classify images using the IoT Edge classifier:
* [Arduino - Wio Terminal](wio-terminal.md)
* [Single-board computer - Raspberry Pi/Virtual IoT device](single-board-computer.md)
### Model retraining
One of the downsides to running image classifiers on IoT Edge is that they are not connected to your Custom Vision project. If you look at the **Predictions** tab in Custom Vision you won't see the images that were classified using the Edge-based classifier.
This is the expected behavior - images are not sent to the cloud for classification, so they won't be available in the cloud. One of the upsides of using IoT Edge is privacy, ensuring that images don't leave your network; another is being able to work offline, with no reliance on uploading images when the device has no internet connection. The downside is improving your model - you would need to implement another way of storing images that can be manually re-classified to improve and re-train the image classifier.
✅ Think about ways to upload images to retrain the classifier.
---
## 🚀 Challenge
Running AI models on edge devices can be faster than in the cloud - the network hop is shorter. They can also be slower, as the hardware that runs the model may not be as powerful as the cloud.

Do some timings and compare whether the call to your edge device is faster or slower than the call to the cloud. Think about reasons to explain the difference, or lack of difference. Research ways to run AI models faster on the edge using specialized hardware.
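One rough way to take these timings is to wrap the same curl request used earlier in the shell's `time` built-in. A sketch, with `test.png` as a placeholder for your own test image:

```sh
# Time a single prediction call to the edge device
time curl --location \
    --request POST 'http://<IP address or name>/image' \
    --header 'Content-Type: image/png' \
    --data-binary '@test.png'
```

Repeat the same call against your cloud prediction endpoint (with its prediction key) and compare the elapsed times.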
## Post-lecture quiz
[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)
## Review & Self Study
* Read more about containers on the [OS-level virtualization page on Wikipedia](https://wikipedia.org/wiki/OS-level_virtualization)
* Read more on edge computing, with an emphasis on how 5G can help expand edge computing in the [What is edge computing and why does it matter? article on NetworkWorld](https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html)
* Learn more about running AI services in IoT Edge by watching the [Learn How to Use Azure IoT Edge on a Pre-Built AI Service on the Edge to do Language Detection episode of Learn Live on Microsoft Channel9](https://channel9.msdn.com/Shows/Learn-Live/Sharpen-Your-AI-Edge-Skills-Episode-4-Learn-How-to-Use-Azure-IoT-Edge-on-a-Pre-Built-AI-Service-on-t?WT.mc_id=academic-17441-jabenn)
## Assignment
[Run other services on the edge](assignment.md)

@ -1,9 +1,13 @@
# Run other services on the edge
## Instructions
It's not just image classifiers that can be run on the edge - anything that can be packaged up into a container can be deployed to an IoT Edge device. Serverless code running as Azure Functions, such as the triggers you've created in earlier lessons, can be run in containers, and therefore on IoT Edge.
Pick one of the previous lessons and try to run the Azure Functions app in an IoT Edge container. You can find a guide that shows how to do this using a different Functions app project in the [Tutorial: Deploy Azure Functions as IoT Edge modules on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/tutorial-deploy-function?view=iotedge-2020-11&WT.mc_id=academic-17441-jabenn).
## Rubric
| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Deploy an Azure Functions app to IoT Edge | Was able to deploy an Azure Functions app to IoT Edge and use it with an IoT device to run a trigger from IoT data | Was able to deploy a Functions App to IoT Edge, but was unable to get the trigger to fire | Was unable to deploy a Functions App to IoT Edge |

@ -0,0 +1,28 @@
import io
import time

import requests
from picamera import PiCamera

# Set up the Raspberry Pi camera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

# Give the camera sensor time to warm up
time.sleep(2)

# Capture a JPEG image into an in-memory buffer
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

# Save a copy of the image to disk so you can check what was captured
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# Send the image to the IoT Edge classifier as the body of a REST POST request
prediction_url = '<URL>'
headers = {
    'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()

# Print each predicted tag with its probability as a percentage
for prediction in results['predictions']:
    print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')

@ -0,0 +1,28 @@
from counterfit_connection import CounterFitConnection

# Connect to the CounterFit app to access the virtual camera
CounterFitConnection.init('127.0.0.1', 5000)

import io

import requests
from counterfit_shims_picamera import PiCamera

# Set up the virtual camera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

# Capture a JPEG image into an in-memory buffer
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

# Save a copy of the image to disk so you can check what was captured
with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# Send the image to the IoT Edge classifier as the body of a REST POST request
prediction_url = '<URL>'
headers = {
    'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()

# Print each predicted tag with its probability as a percentage
for prediction in results['predictions']:
    print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include`.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h`.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will find automatically dependent
libraries scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
        // The OV2640 reports vendor ID 0x26 and product ID 0x41 or 0x42
        if (vid != 0x26 || (pid != 0x41 && pid != 0x42))
{
return false;
}
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
            if ((temp == 0xD9) && (temp_last == 0xFF)) // If the JPEG end marker (0xFF 0xD9) is found, finish reading
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
            else if ((temp == 0xD8) && (temp_last == 0xFF)) // The JPEG start marker (0xFF 0xD8) begins the image data
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
        // return the buffer and report success
        *buffer = buf;
        return true;
    }
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,11 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";

@ -0,0 +1,123 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <WiFiClient.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClient client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
void classifyImage(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
            char buff[32];
            // Use snprintf to avoid overflowing the buffer with long tag names
            snprintf(buff, sizeof(buff), "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
classifyImage(buffer, length);
        // The buffer was allocated with new byte[], so release it with delete[]
        delete[] buffer;
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,66 @@
{
"content": {
"modulesContent": {
"$edgeAgent": {
"properties.desired": {
"schemaVersion": "1.1",
"runtime": {
"type": "docker",
"settings": {
"minDockerVersion": "v1.25",
"loggingOptions": "",
"registryCredentials": {
"ClassifierRegistry": {
"username": "<Container registry name>",
"password": "<Container Password>",
"address": "<Container registry name>.azurecr.io"
}
}
}
},
"systemModules": {
"edgeAgent": {
"type": "docker",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-agent:1.1",
"createOptions": "{}"
}
},
"edgeHub": {
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "mcr.microsoft.com/azureiotedge-hub:1.1",
"createOptions": "{\"HostConfig\":{\"PortBindings\":{\"5671/tcp\":[{\"HostPort\":\"5671\"}],\"8883/tcp\":[{\"HostPort\":\"8883\"}],\"443/tcp\":[{\"HostPort\":\"443\"}]}}}"
}
}
},
"modules": {
"ImageClassifier": {
"version": "1.0",
"type": "docker",
"status": "running",
"restartPolicy": "always",
"settings": {
"image": "<Container registry name>.azurecr.io/classifier:v1",
"createOptions": "{\"ExposedPorts\": {\"80/tcp\": {}},\"HostConfig\": {\"PortBindings\": {\"80/tcp\": [{\"HostPort\": \"80\"}]}}}"
}
}
}
}
},
"$edgeHub": {
"properties.desired": {
"schemaVersion": "1.1",
"routes": {
"upstream": "FROM /messages/* INTO $upstream"
},
"storeAndForwardConfiguration": {
"timeToLiveSecs": 7200
}
}
}
}
}
}

@ -0,0 +1,54 @@
# Classify an image using an IoT Edge based image classifier - Virtual IoT Hardware and Raspberry Pi
In this part of the lesson, you will use the Image Classifier running on the IoT Edge device.
## Use the IoT Edge classifier
The IoT device can be redirected to use the IoT Edge image classifier. The URL for the Image Classifier is `http://<IP address or name>/image`, replacing `<IP address or name>` with the IP address or host name of the computer running IoT Edge.
The Python library for Custom Vision only works with cloud-hosted models, not models hosted on IoT Edge. This means you will need to use the REST API to call the classifier.
### Task - use the IoT Edge classifier
1. Open the `fruit-quality-detector` project in VS Code if it is not already open. If you are using a virtual IoT device, then make sure the virtual environment is activated.
1. Open the `app.py` file, and remove the import statements from `azure.cognitiveservices.vision.customvision.prediction` and `msrest.authentication`.
1. Add the following import at the top of the file:
```python
import requests
```
1. Delete all the code after the image is saved to a file, from `image_file.write(image.read())` to the end of the file.
1. Add the following code to the end of the file:
```python
prediction_url = '<URL>'
headers = {
'Content-Type' : 'application/octet-stream'
}
image.seek(0)
response = requests.post(prediction_url, headers=headers, data=image)
results = response.json()
for prediction in results['predictions']:
print(f'{prediction["tagName"]}:\t{prediction["probability"] * 100:.2f}%')
```
Replace `<URL>` with the URL for your classifier.
This code makes a REST POST request to the classifier, sending the image as the body of the request. The results come back as JSON, and this is decoded to print out the probabilities.
1. Run your code, with your camera pointing at some fruit, or an appropriate image set, or fruit visible on your webcam if using virtual IoT hardware. You will see the output in the console:
```output
(.venv) ➜ fruit-quality-detector python app.py
ripe: 56.84%
unripe: 43.16%
```
> 💁 You can find this code in the [code-classify/pi](code-classify/pi) or [code-classify/virtual-iot-device](code-classify/virtual-iot-device) folder.
😀 Your fruit quality classifier program was a success!

@ -31,36 +31,71 @@ In Azure, you can create a virtual machine - a computer in the cloud that you ca
Once the VM has been created, the IoT Edge runtime will be installed automatically, and configured to connect to your IoT Hub as your `fruit-quality-detector-edge` device.
1. You will need either the IP address or the DNS name of the VM to call the image classifier from it. Run the following command to get this:
```sh
az vm list --resource-group fruit-quality-detector \
--output table \
--show-details
```
Take a copy of either the `PublicIps` field, or the `Fqdns` field.
1. VMs cost money. At the time of writing, a DS1 VM costs about $0.06 per hour. To keep costs down, you should shut down the VM when you are not using it, and delete it when you are finished with this project.
You can configure your VM to automatically shut down at a certain time each day. This means if you forget to shut it down, you won't be billed for more than the time until the automatic shutdown. Use the following command to set this:

```sh
az vm auto-shutdown --resource-group fruit-quality-detector \
    --name <vm_name> \
    --time <shutdown_time_utc>
```

Replace `<vm_name>` with the name of your virtual machine.

Replace `<shutdown_time_utc>` with the UTC time that you want the VM to shut down, using 4 digits as HHMM. For example, if you want to shut down at midnight UTC, you would set this to `0000`. For 7:30PM on the west coast of the USA, you would use `0230` (7:30PM on the US west coast is 2:30AM UTC).

1. Your image classifier will be running on this edge device, listening on port 80 (the standard HTTP port). By default, virtual machines have inbound ports blocked, so you will need to enable port 80. Ports are enabled on network security groups, so first you need to know the name of the network security group for your VM, which you can find with the following command:

```sh
az network nsg list --resource-group fruit-quality-detector \
    --output table
```

Copy the value of the `Name` field.

1. Run the following command to add a rule to open port 80 to the network security group:

```sh
az network nsg rule create \
    --resource-group fruit-quality-detector \
    --name Port_80 \
    --protocol tcp \
    --priority 1010 \
    --destination-port-range 80 \
    --nsg-name <nsg name>
```

Replace `<nsg name>` with the network security group name from the previous step.
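To confirm the rule was created, you can list the rules on the network security group:

```sh
az network nsg rule list --resource-group fruit-quality-detector \
    --output table \
    --nsg-name <nsg name>
```

Replace `<nsg name>` with the network security group name from the previous step.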
### Task - manage your VM to reduce costs
1. When you are not using your VM, you should shut it down. To shut down the VM, use the following command:
```sh
az vm deallocate --resource-group fruit-quality-detector \
--name <vm_name>
```
Replace `<vm_name>` with the name of your virtual machine.
> 💁 There is an `az vm stop` command which will stop the VM, but it keeps the computer allocated to you, so you still pay as if it was still running.
1. To restart the VM, use the following command:
```sh
az vm start --resource-group fruit-quality-detector \
--name <vm_name>
```
Replace `<vm_name>` with the name of your virtual machine.

@ -0,0 +1,52 @@
# Classify an image using an IoT Edge based image classifier - Wio Terminal
In this part of the lesson, you will use the Image Classifier running on the IoT Edge device.
## Use the IoT Edge classifier
The IoT device can be redirected to use the IoT Edge image classifier. The URL for the Image Classifier is `http://<IP address or name>/image`, replacing `<IP address or name>` with the IP address or host name of the computer running IoT Edge.
### Task - use the IoT Edge classifier
1. Open the `fruit-quality-detector` app project if it's not already open.
1. The image classifier is running as a REST API using HTTP, not HTTPS, so the call needs to use a WiFi client that works with HTTP calls only. This means the certificate is not needed. Delete the `CERTIFICATE` from the `config.h` file.
1. The prediction URL in the `config.h` file needs to be updated to the new URL. You can also delete the `PREDICTION_KEY` as this is not needed.
```cpp
const char *PREDICTION_URL = "<URL>";
```
Replace `<URL>` with the URL for your classifier.
1. In `main.cpp`, change the include directive for the WiFi Client Secure to import the standard HTTP version:
```cpp
#include <WiFiClient.h>
```
1. Change the declaration of `WiFiClient` to be the HTTP version:
```cpp
WiFiClient client;
```
1. Remove the line `client.setCACert(CERTIFICATE);`, which sets the certificate on the WiFi client, from the `connectWiFi` function.
1. In the `classifyImage` function, remove the `httpClient.addHeader("Prediction-Key", PREDICTION_KEY);` line that sets the prediction key in the header.
1. Upload and run your code. Point the camera at some fruit and press the C button. You will see the output in the serial monitor:
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 8200
ripe: 56.84%
unripe: 43.16%
```
> 💁 You can find this code in the [code-classify/wio-terminal](code-classify/wio-terminal) folder.
😀 Your fruit quality classifier program was a success!

@ -1,8 +1,8 @@
# Trigger fruit quality detection from a sensor
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-18.jpg)
![Embed a video here if available](video-url)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -41,8 +41,6 @@ IoT applications can be described as *things* (devices) sending data that genera
![A reference iot architecture](../../../images/iot-reference-architecture.png)
The diagram above shows a reference IoT architecture.
> 🎓 A *reference architecture* is an example architecture you can use as a reference when designing new systems. In this case, if you were building a new IoT system you can follow the reference architecture, substituting your own devices and services where appropriate.
@ -53,8 +51,6 @@ The diagram above shows a reference IoT architecture.
![A reference iot architecture](../../../images/iot-reference-architecture-azure.png)
The diagram above shows some of the components and services covered so far in these lessons and how they link together in a reference IoT architecture.
* **Things** - you've written device code to capture data from sensors, and analyse images using Custom Vision running both in the cloud and on an edge device. This data was sent to IoT Hub.
@ -78,7 +74,7 @@ As you define the architecture of your system, you need to constantly consider d
## Design a fruit quality control system
Let's now take this idea of things, insights, and actions and apply it to our fruit quality detector to design a larger end-to-end application.
Imagine you have been given the task of building a fruit quality detector to be used in a processing plant. Fruit travels on a conveyor belt system where currently employees spend time checking the fruit by hand and removing any unripe fruit as it arrives. To reduce costs, the plant owner wants an automated system.
@ -96,8 +92,6 @@ You need to build a system where fruit is detected as it arrives on the conveyer
![A reference iot architecture for fruit quality checking](../../../images/iot-reference-architecture-fruit-quality.png)
The diagram above shows a reference architecture for this prototype application.
* An IoT device with a proximity sensor detects the arrival of fruit. This sends a message to the cloud to say fruit has been detected.
@ -115,8 +109,6 @@ The IoT device needs some kind of trigger to indicate when fruit is ready to be
![Proximity sensors send laser beams to objects like bananas and time how long till the beam is bounced back](../../../images/proximity-sensor.png)
Proximity sensors can be used to measure the distance from the sensor to an object. They usually transmit a beam of electromagnetic radiation such as a laser beam or infra-red light, then detect the radiation bouncing off an object. The time between the laser beam being sent and the signal bouncing back can be used to calculate the distance from the sensor to the object.

> 💁 You have probably used proximity sensors without even knowing about it. Most smartphones will turn the screen off when you hold them to your ear to stop you accidentally ending a call with your earlobe. This works using a proximity sensor, detecting an object close to the screen during a call and disabling the touch capabilities until the phone is a certain distance away.
@ -209,7 +201,7 @@ The prototype will form the basis of a final production system. Some of the diff
## 🚀 Challenge
In this lesson you have learned some of the concepts you need to know to architect an IoT system. Think back to the previous projects. How do they fit into the reference architecture shown above?
Pick one of the projects so far and think of the design of a more complicated solution bringing together multiple capabilities beyond what was covered in the projects. Draw the architecture and think of all the devices and services you would need.
@ -222,7 +214,7 @@ For example - a vehicle tracking device that combines GPS with sensors to monito
## Review & Self Study
* Read more about IoT architecture on the [Azure IoT reference architecture documentation on Microsoft docs](https://docs.microsoft.com/azure/architecture/reference-architectures/iot?WT.mc_id=academic-17441-jabenn)
* Read more about device twins in the [Understand and use device twins in IoT Hub documentation on Microsoft docs](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-device-twins?WT.mc_id=academic-17441-jabenn)
* Read about OPC-UA, a machine to machine communication protocol used in industrial automation on the [OPC-UA page on Wikipedia](https://wikipedia.org/wiki/OPC_Unified_Architecture)
## Assignment

@ -40,6 +40,12 @@ Program the device.
1. Open the `fruit-quality-detector` code in VS Code, either directly on the Pi, or connect via the Remote SSH extension.
1. Install the rpi-vl53l0x pip package, a Python package that interacts with a VL53L0X time-of-flight distance sensor. Install it using this pip command:
```sh
pip install rpi-vl53l0x
```
1. Create a new file in this project called `distance-sensor.py`.
> 💁 An easy way to simulate multiple IoT devices is to do each in a different Python file, then run them at the same time.
@ -95,4 +101,4 @@ Program the device.
> 💁 You can find this code in the [code-proximity/pi](code-proximity/pi) folder.
😀 Your proximity sensor program was a success!

@ -8,7 +8,7 @@ IoT can help with this, using AI models running on IoT devices to count stock, u
In these 2 lessons you'll learn how to train image-based AI models to count stock, and run these models on IoT devices.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
## Topics

@ -1,6 +1,8 @@
# Train a stock detector
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-19.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of object detection and the Azure Custom Vision service, which will be covered in this lesson.
@ -56,7 +58,7 @@ Object detection involves training a model to recognize objects. Instead of givi
When you then use it to predict images, instead of getting back a list of tags and percentages, you get back a list of detected objects, with their bounding box and the probability that the object matches the assigned tag.
> 🎓 *Bounding boxes* are the boxes around an object.
![Object detection of cashew nuts and tomato paste](../../../images/object-detector-cashews-tomato.png)
@ -107,7 +109,7 @@ You can train an object detector using Custom Vision, in a similar way to how yo
Call your project `stock-detector`.
When you create your project, make sure to use the `stock-detector-training` resource you created earlier. Use the *Object Detection* project type, and the *Products on Shelves* domain.
![The settings for the custom vision project with the name set to fruit-quality-detector, no description, the resource set to fruit-quality-detector-training, the project type set to classification, the classification types set to multi class and the domains set to food](../../../images/custom-vision-create-object-detector-project.png)
@ -137,7 +139,7 @@ To train your model you will need a set of images containing the objects you wan
![Tagging some tomato paste](../../../images/object-detector-tag-tomato-paste.png)
> 💁 If you have more than 15 images for each object, you can train after 15 then use the **Suggested tags** feature. This will use the trained model to detect the objects in the untagged image. You can then confirm the detected objects, or reject and re-draw the bounding boxes. This can save a *lot* of time.
1. Follow the [Train the detector section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#train-the-detector) to train the object detector on your tagged images.

@ -1,8 +1,8 @@
# Check stock from an IoT device
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-20.jpg)
![Embed a video here if available](video-url)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -10,24 +10,168 @@ Add a sketchnote if possible/appropriate
## Introduction
In the previous lesson you learned about the different uses of object detection in retail. You also learned how to train an object detector to identify stock. In this lesson you will learn how to use your object detector from your IoT device to count stock.
In this lesson we'll cover:
* [Stock counting](#stock-counting)
* [Call your object detector from your IoT device](#call-your-object-detector-from-your-iot-device)
* [Bounding boxes](#bounding-boxes)
* [Retrain the model](#retrain-the-model)
* [Count stock](#count-stock)
> 🗑 This is the last lesson in this project, so after completing this lesson and the assignment, don't forget to clean up your cloud services. You will need the services to complete the assignment, so make sure to complete that first.
>
> Refer to [the clean up your project guide](../../../clean-up.md) if necessary for instructions on how to do this.
## Stock counting
Object detectors can be used for stock checking, either counting stock or ensuring stock is where it should be. IoT devices with cameras can be deployed all around the store to monitor stock, starting with hot spots where having items restocked is important, such as areas where small numbers of high value items are stocked.
For example, if a camera is pointing at a set of shelves that can hold 8 cans of tomato paste, and an object detector only detects 7 cans, then one is missing and needs to be restocked.
![7 cans of tomato paste on a shelf, 4 on the top row, 3 on the bottom](../../../images/stock-7-cans-tomato-paste.png)
In the above image, an object detector has detected 7 cans of tomato paste on a shelf that can hold 8 cans. Not only can the IoT device send a notification of the need to restock, but it can even give an indication of the location of the missing item, important data if you are using robots to restock shelves.
> 💁 Depending on the store and popularity of the item, restocking probably wouldn't happen if only 1 can was missing. You would need to build an algorithm that determines when to restock based on your produce, customers and other criteria.
✅ In what other scenarios could you combine object detection and robots?
Sometimes the wrong stock can be on the shelves. This could be human error when restocking, or customers changing their mind on a purchase and putting an item back in the first available space. When this is a non-perishable item such as canned goods, this is an annoyance. If it is a perishable item such as frozen or chilled goods, this can mean that the product can no longer be sold as it might be impossible to tell how long the item was out of the freezer.
Object detection can be used to detect unexpected items, again alerting a human or robot to return the item as soon as it is detected.
![A rogue can of baby corn on the tomato paste shelf](../../../images/stock-rogue-corn.png)
In the above image, a can of baby corn has been put on the shelf next to the tomato paste. The object detector has detected this, allowing the IoT device to notify a human or robot to return the can to its correct location.
## Call your object detector from your IoT device
The object detector you trained in the last lesson can be called from your IoT device.
### Task - publish an iteration of your object detector
Iterations are published from the Custom Vision portal.
1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `stock-detector` project.
1. Select the **Performance** tab from the options at the top
1. Select the latest iteration from the *Iterations* list on the side
1. Select the **Publish** button for the iteration
![The publish button](../../../images/custom-vision-object-detector-publish-button.png)
1. In the *Publish Model* dialog, set the *Prediction resource* to the `stock-detector-prediction` resource you created in the last lesson. Leave the name as `Iteration2`, and select the **Publish** button.
1. Once published, select the **Prediction URL** button. This will show details of the prediction API, and you will need these to call the model from your IoT device. The lower section is labelled *If you have an image file*, and these are the details you want. Take a copy of the URL that is shown, which will be something like:
```output
https://<location>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/<id>/detect/iterations/Iteration2/image
```
Where `<location>` will be the location you used when creating your custom vision resource, and `<id>` will be a long ID made up of letters and numbers.
Also take a copy of the *Prediction-Key* value. This is a secure key that you have to pass when you call the model. Only applications that pass this key are allowed to use the model; any other applications are rejected.
![The prediction API dialog showing the URL and key](../../../images/custom-vision-prediction-key-endpoint.png)
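As a rough sketch of how the URL and key fit together, you can call the prediction API directly over HTTP. This example assumes the `requests` Pip package and an `image.jpg` file on disk; the placeholders are the values you just copied:

```python
# A minimal sketch of calling the published iteration directly over HTTP.
# Replace the placeholders with the URL and key copied from the dialog.
import requests

prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'

with open('image.jpg', 'rb') as image_file:
    response = requests.post(prediction_url,
                             headers={
                                 'Prediction-Key': prediction_key,
                                 'Content-Type': 'application/octet-stream'
                             },
                             data=image_file.read())

# The response body is JSON containing the predictions
print(response.json())
```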
✅ When a new iteration is published, it will have a different name. How do you think you would change the iteration an IoT device is using?
### Task - call your object detector from your IoT device
Follow the relevant guide below to use the object detector from your IoT device:
* [Arduino - Wio Terminal](wio-terminal-object-detector.md)
* [Single-board computer - Raspberry Pi/Virtual device](single-board-computer-object-detector.md)
## Bounding boxes
When you use the object detector, you not only get back the detected objects with their tags and probabilities, but you also get the bounding boxes of the objects. These define where the object detector detected the object with the given probability.
> 💁 A bounding box is a box that defines the area that contains the detected object - the boundary of the object.
The results of a prediction in the **Predictions** tab in Custom Vision have the bounding boxes drawn on the image that was sent for prediction.
![4 cans of tomato paste on a shelf with predictions for the 4 detections of 35.8%, 33.5%, 25.7% and 16.6%](../../../images/custom-vision-stock-prediction.png)
In the image above, 4 cans of tomato paste were detected. In the results a red square is overlaid for each object that was detected in the image, indicating the bounding box for that object.
✅ Open the predictions in Custom Vision and check out the bounding boxes.
Bounding boxes are defined with 4 values - top, left, height and width. These values are on a scale of 0-1, representing the positions as a percentage of the size of the image. The origin (the 0,0 position) is the top left of the image, so the top value is the distance from the top, and the bottom of the bounding box is the top plus the height.
![A bounding box around a can of tomato paste](../../../images/bounding-box.png)
The above image is 600 pixels wide and 800 pixels tall. The bounding box starts at 320 pixels down, giving a top coordinate of 0.4 (800 x 0.4 = 320). From the left, the bounding box starts at 240 pixels across, giving a left coordinate of 0.4 (600 x 0.4 = 240). The height of the bounding box is 240 pixels, giving a height value of 0.3 (800 x 0.3 = 240). The width of the bounding box is 120 pixels, giving a width value of 0.2 (600 x 0.2 = 120).
| Coordinate | Value |
| ---------- | ----: |
| Top | 0.4 |
| Left | 0.4 |
| Height | 0.3 |
| Width | 0.2 |
Using percentage values from 0-1 means no matter what size the image is scaled to, the bounding box starts 0.4 of the way along and down, and is 0.3 of the height and 0.2 of the width.
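Here is a minimal Python sketch of this conversion, using the image size and bounding box values from the worked example above:

```python
# Convert the 0-1 bounding box values from the example above
# into pixel coordinates for a 600 x 800 pixel image.
image_width, image_height = 600, 800

bounding_box = {'left': 0.4, 'top': 0.4, 'width': 0.2, 'height': 0.3}

left = bounding_box['left'] * image_width                             # 240
top = bounding_box['top'] * image_height                              # 320
right = (bounding_box['left'] + bounding_box['width']) * image_width  # 360
bottom = (bounding_box['top'] + bounding_box['height']) * image_height  # 560

print(f'Pixel coordinates: left={left}, top={top}, right={right}, bottom={bottom}')
```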
You can use bounding boxes combined with probabilities to evaluate how accurate a detection is. For example, an object detector can detect multiple overlapping objects, such as one can detected inside another. Your code could look at the bounding boxes, understand that this is impossible, and ignore any objects that have a significant overlap with other objects.
![Two bounding boxes overlapping a can of tomato paste](../../../images/overlap-object-detection.png)
In the example above, one bounding box indicated a predicted can of tomato paste at 78.3%. A second bounding box is slightly smaller, and is inside the first bounding box with a probability of 64.3%. Your code can check the bounding boxes, see that they overlap completely, and ignore the lower probability detection, as one can cannot be inside another.
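As a sketch of this kind of check, the following Python compares two bounding boxes given as left, top, width and height values on the 0-1 scale. The box values and the 20% threshold are assumptions for illustration:

```python
# A sketch of an overlap check between two bounding boxes, each given as
# (left, top, width, height) on the 0-1 scale. The box values and the 20%
# threshold are assumptions - tune the threshold for your own stock.
def overlap_area(box1, box2):
    # Find the intersection rectangle of the two boxes
    left = max(box1[0], box2[0])
    top = max(box1[1], box2[1])
    right = min(box1[0] + box1[2], box2[0] + box2[2])
    bottom = min(box1[1] + box1[3], box2[1] + box2[3])

    if right > left and bottom > top:
        return (right - left) * (bottom - top)
    return 0.0

box1 = (0.2, 0.2, 0.3, 0.4)    # hypothetical detection at 78.3%
box2 = (0.25, 0.25, 0.2, 0.3)  # hypothetical detection at 64.3%, inside box1

smallest_area = min(box1[2] * box1[3], box2[2] * box2[3])

if overlap_area(box1, box2) > 0.2 * smallest_area:
    print('Significant overlap - ignore the lower probability detection')
```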
✅ Can you think of a situation where it is valid to detect one object inside another?
## Retrain the model
Like with the image classifier, you can retrain your model using data captured by your IoT device. Using this real-world data will ensure your model works well when used from your IoT device.
Unlike with the image classifier, you can't just tag an image. Instead you need to review every bounding box detected by the model. If the box is around the wrong thing then it needs to be deleted; if it is in the wrong location it needs to be adjusted.
### Task - retrain the model
1. Make sure you have captured a range of images using your IoT device.
1. From the **Predictions** tab, select an image. You will see red boxes indicating the bounding boxes of the detected objects.
1. Work through each bounding box. Select it first and you will see a pop-up showing the tag. Use the handles on the corners of the bounding box to adjust the size if necessary. If the tag is wrong, remove it with the **X** button and add the correct tag. If the bounding box doesn't contain an object, delete it with the trashcan button.
1. Close the editor when done and the image will move from the **Predictions** tab to the **Training Images** tab. Repeat the process for all the predictions.
1. Use the **Train** button to re-train your model. Once it has trained, publish the iteration and update your IoT device to use the URL of the new iteration.
1. Re-deploy your code and test your IoT device.
## Count stock
Using a combination of the number of objects detected and the bounding boxes, you can count the stock on a shelf.
### Task - count stock
Follow the relevant guide below to count stock using the results from the object detector from your IoT device:
* [Arduino - Wio Terminal](wio-terminal-count-stock.md)
* [Single-board computer - Raspberry Pi/Virtual device](single-board-computer-count-stock.md)
---
## 🚀 Challenge
Can you detect incorrect stock? Train your model on multiple objects, then update your app to alert you if the wrong stock is detected.
Maybe even take this further and detect stock side by side on the same shelf, and see if something has been put in the wrong place by defining limits on the bounding boxes.
## Post-lecture quiz
[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/40)
## Review & Self Study
* Learn more about how to architect an end-to-end stock detection system from the [Out of stock detection at the edge pattern guide on Microsoft Docs](https://docs.microsoft.com/hybrid/app-solutions/pattern-out-of-stock-at-edge?WT.mc_id=academic-17441-jabenn)
* Learn other ways to build end-to-end retail solutions combining a range of IoT and cloud services by watching this [Behind the scenes of a retail solution - Hands On! video on YouTube](https://www.youtube.com/watch?v=m3Pc300x2Mw).
## Assignment
[Use your object detector on the edge](assignment.md)

@ -1,9 +1,11 @@
# Use your object detector on the edge
## Instructions
In the last project, you deployed your image classifier to the edge. Do the same with your object detector, exporting it as a compact model and running it on the edge, accessing the edge version from your IoT device.
## Rubric
| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
| Deploy your object detector to the edge | Was able to use the correct compact domain, export the object detector and run it on the edge | Was able to use the correct compact domain, and export the object detector, but was unable to run it on the edge | Was unable to use the correct compact domain, export the object detector, and run it on the edge |

@ -0,0 +1,92 @@
import io
import time
from picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials
from PIL import Image, ImageDraw, ImageColor
from shapely.geometry import Polygon

# Set up the camera and give it time to warm up
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

time.sleep(2)

# Capture an image into an in-memory stream, then save it to disk
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# The prediction URL and key copied from the Custom Vision portal
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'

# Extract the endpoint, project ID and iteration name from the prediction URL
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]

prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)

# Detect objects in the image, keeping only predictions above the threshold
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)

threshold = 0.3

predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)

for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

overlap_threshold = 0.002

# Convert a prediction's bounding box into a Shapely polygon
def create_polygon(prediction):
    scale_left = prediction.bounding_box.left
    scale_top = prediction.bounding_box.top
    scale_right = prediction.bounding_box.left + prediction.bounding_box.width
    scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

    return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])

# Compare each pair of predictions, marking the first of a pair for deletion
# if the bounding boxes overlap more than the threshold
to_delete = []

for i in range(0, len(predictions)):
    polygon_1 = create_polygon(predictions[i])

    for j in range(i + 1, len(predictions)):
        polygon_2 = create_polygon(predictions[j])
        overlap = polygon_1.intersection(polygon_2).area
        smallest_area = min(polygon_1.area, polygon_2.area)

        if overlap > (overlap_threshold * smallest_area):
            to_delete.append(predictions[i])
            break

for d in to_delete:
    predictions.remove(d)

print(f'Counted {len(predictions)} stock items')

# Draw the bounding boxes of the remaining predictions on the saved image
with Image.open('image.jpg') as im:
    draw = ImageDraw.Draw(im)

    for prediction in predictions:
        scale_left = prediction.bounding_box.left
        scale_top = prediction.bounding_box.top
        scale_right = prediction.bounding_box.left + prediction.bounding_box.width
        scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

        left = scale_left * im.width
        top = scale_top * im.height
        right = scale_right * im.width
        bottom = scale_bottom * im.height

        draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)

    im.save('image.jpg')

@ -0,0 +1,92 @@
# Connect to the CounterFit virtual hardware before importing the camera shim
from counterfit_connection import CounterFitConnection
CounterFitConnection.init('127.0.0.1', 5000)

import io
from counterfit_shims_picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials
from PIL import Image, ImageDraw, ImageColor
from shapely.geometry import Polygon

# Set up the virtual camera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

# Capture an image into an in-memory stream, then save it to disk
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# The prediction URL and key copied from the Custom Vision portal
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'

# Extract the endpoint, project ID and iteration name from the prediction URL
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]

prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)

# Detect objects in the image, keeping only predictions above the threshold
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)

threshold = 0.3

predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)

for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

overlap_threshold = 0.002

# Convert a prediction's bounding box into a Shapely polygon
def create_polygon(prediction):
    scale_left = prediction.bounding_box.left
    scale_top = prediction.bounding_box.top
    scale_right = prediction.bounding_box.left + prediction.bounding_box.width
    scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

    return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])

# Compare each pair of predictions, marking the first of a pair for deletion
# if the bounding boxes overlap more than the threshold
to_delete = []

for i in range(0, len(predictions)):
    polygon_1 = create_polygon(predictions[i])

    for j in range(i + 1, len(predictions)):
        polygon_2 = create_polygon(predictions[j])
        overlap = polygon_1.intersection(polygon_2).area
        smallest_area = min(polygon_1.area, polygon_2.area)

        if overlap > (overlap_threshold * smallest_area):
            to_delete.append(predictions[i])
            break

for d in to_delete:
    predictions.remove(d)

print(f'Counted {len(predictions)} stock items')

# Draw the bounding boxes of the remaining predictions on the saved image
with Image.open('image.jpg') as im:
    draw = ImageDraw.Draw(im)

    for prediction in predictions:
        scale_left = prediction.bounding_box.left
        scale_top = prediction.bounding_box.top
        scale_right = prediction.bounding_box.left + prediction.bounding_box.width
        scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height

        left = scale_left * im.width
        top = scale_top * im.height
        right = scale_right * im.width
        bottom = scale_bottom * im.height

        draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)

    im.save('image.jpg')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link them into the executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will automatically find dependent
libraries by scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
        // Fail unless the vendor ID is 0x26 and the product ID is one of the OV2640 variants
        if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42)))
        {
            return false;
        }
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
            else if ((temp == 0xD8) && (temp_last == 0xFF))
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
        // return the buffer and report success
        *buffer = buf;
        return true;
    }
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,49 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";
const char *PREDICTION_KEY = "<PREDICTION_KEY>";
// Microsoft Azure DigiCert Global Root G2 global certificate
const char *CERTIFICATE =
"-----BEGIN CERTIFICATE-----\r\n"
"MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
"MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
"d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
"MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
"MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
"c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
"ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
"wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
"iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
"ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
"aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
"0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
"gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
"sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
"lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
"N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
"Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
"AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
"BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
"JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
"CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
"Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
"aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
"cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
"MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
"cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
"AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
"+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
"cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
"kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
"trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
"8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
"-----END CERTIFICATE-----\r\n";

@ -0,0 +1,223 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <vector>
#include <WiFiClientSecure.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClientSecure client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
client.setCACert(CERTIFICATE);
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
const float threshold = 0.0f;
const float overlap_threshold = 0.20f;
struct Point {
float x, y;
};
struct Rect {
Point topLeft, bottomRight;
};
float area(Rect rect)
{
return abs(rect.bottomRight.x - rect.topLeft.x) * abs(rect.bottomRight.y - rect.topLeft.y);
}
float overlappingArea(Rect rect1, Rect rect2)
{
float left = max(rect1.topLeft.x, rect2.topLeft.x);
float right = min(rect1.bottomRight.x, rect2.bottomRight.x);
float top = max(rect1.topLeft.y, rect2.topLeft.y);
float bottom = min(rect1.bottomRight.y, rect2.bottomRight.y);
if ( right > left && bottom > top )
{
return (right-left)*(bottom-top);
}
return 0.0f;
}
Rect rectFromBoundingBox(JsonVariant prediction)
{
JsonObject bounding_box = prediction["boundingBox"].as<JsonObject>();
float left = bounding_box["left"].as<float>();
float top = bounding_box["top"].as<float>();
float width = bounding_box["width"].as<float>();
float height = bounding_box["height"].as<float>();
Point topLeft = {left, top};
Point bottomRight = {left + width, top + height};
return {topLeft, bottomRight};
}
void processPredictions(std::vector<JsonVariant> &predictions)
{
std::vector<JsonVariant> passed_predictions;
for (int i = 0; i < predictions.size(); ++i)
{
Rect prediction_1_rect = rectFromBoundingBox(predictions[i]);
float prediction_1_area = area(prediction_1_rect);
bool passed = true;
for (int j = i + 1; j < predictions.size(); ++j)
{
Rect prediction_2_rect = rectFromBoundingBox(predictions[j]);
float prediction_2_area = area(prediction_2_rect);
float overlap = overlappingArea(prediction_1_rect, prediction_2_rect);
float smallest_area = min(prediction_1_area, prediction_2_area);
if (overlap > (overlap_threshold * smallest_area))
{
passed = false;
break;
}
}
if (passed)
{
passed_predictions.push_back(predictions[i]);
}
}
for(JsonVariant prediction : passed_predictions)
{
String boundingBox = prediction["boundingBox"].as<String>();
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
        // Buffer large enough for the tag, probability and bounding box JSON
        char buff[256];
        snprintf(buff, sizeof(buff), "%s:\t%.2f%%\t%s", tag.c_str(), probability * 100.0, boundingBox.c_str());
Serial.println(buff);
}
Serial.print("Counted ");
Serial.print(passed_predictions.size());
Serial.println(" stock items.");
}
void detectStock(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
httpClient.addHeader("Prediction-Key", PREDICTION_KEY);
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
std::vector<JsonVariant> passed_predictions;
for(JsonVariant prediction : predictions)
{
float probability = prediction["probability"].as<float>();
if (probability > threshold)
{
passed_predictions.push_back(prediction);
}
}
processPredictions(passed_predictions);
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
detectStock(buffer, length);
        delete[] buffer;
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,40 @@
import io
import time
from picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

# Set up the camera and give it time to warm up
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

time.sleep(2)

# Capture an image into an in-memory stream, then save it to disk
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# The prediction URL and key copied from the Custom Vision portal
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'

# Extract the endpoint, project ID and iteration name from the prediction URL
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]

prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)

# Detect objects in the image, keeping only predictions above the threshold
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)

threshold = 0.3

predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)

for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

@ -0,0 +1,40 @@
# Connect to the CounterFit virtual hardware before importing the camera shim
from counterfit_connection import CounterFitConnection
CounterFitConnection.init('127.0.0.1', 5000)

import io
from counterfit_shims_picamera import PiCamera
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

# Set up the virtual camera
camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 0

# Capture an image into an in-memory stream, then save it to disk
image = io.BytesIO()
camera.capture(image, 'jpeg')
image.seek(0)

with open('image.jpg', 'wb') as image_file:
    image_file.write(image.read())

# The prediction URL and key copied from the Custom Vision portal
prediction_url = '<prediction_url>'
prediction_key = '<prediction key>'

# Extract the endpoint, project ID and iteration name from the prediction URL
parts = prediction_url.split('/')
endpoint = 'https://' + parts[2]
project_id = parts[6]
iteration_name = parts[9]

prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)

# Detect objects in the image, keeping only predictions above the threshold
image.seek(0)
results = predictor.detect_image(project_id, iteration_name, image)

threshold = 0.3

predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)

for prediction in predictions:
    print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')

@ -0,0 +1,5 @@
.pio
.vscode/.browse.c_cpp.db*
.vscode/c_cpp_properties.json
.vscode/launch.json
.vscode/ipch

@ -0,0 +1,7 @@
{
// See http://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"platformio.platformio-ide"
]
}

@ -0,0 +1,39 @@
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html

@ -0,0 +1,46 @@
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link them into the executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO Library Dependency Finder will automatically find dependent
libraries by scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

@ -0,0 +1,26 @@
; PlatformIO Project Configuration File
;
; Build options: build flags, source filter
; Upload options: custom upload port, speed and extra flags
; Library options: dependencies, extra library storages
; Advanced options: extra scripting
;
; Please visit documentation for the other options and examples
; https://docs.platformio.org/page/projectconf.html
[env:seeed_wio_terminal]
platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
-DARDUCAM_SHIELD_V2
-DOV2640_CAM

@ -0,0 +1,160 @@
#pragma once
#include <ArduCAM.h>
#include <Wire.h>
class Camera
{
public:
Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
{
_format = format;
_image_size = image_size;
}
bool init()
{
// Reset the CPLD
_arducam.write_reg(0x07, 0x80);
delay(100);
_arducam.write_reg(0x07, 0x00);
delay(100);
// Check if the ArduCAM SPI bus is OK
_arducam.write_reg(ARDUCHIP_TEST1, 0x55);
if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
{
return false;
}
// Change MCU mode
_arducam.set_mode(MCU2LCD_MODE);
uint8_t vid, pid;
// Check if the camera module type is OV2640
_arducam.wrSensorReg8_8(0xff, 0x01);
_arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
_arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
        // Fail unless the vendor ID is 0x26 and the product ID is one of the OV2640 variants
        if ((vid != 0x26) || ((pid != 0x41) && (pid != 0x42)))
        {
            return false;
        }
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
_arducam.OV2640_set_Light_Mode(Auto);
_arducam.OV2640_set_Special_effects(Normal);
delay(1000);
return true;
}
void startCapture()
{
_arducam.flush_fifo();
_arducam.clear_fifo_flag();
_arducam.start_capture();
}
bool captureReady()
{
return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
}
bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
{
if (!captureReady()) return false;
// Get the image file length
uint32_t length = _arducam.read_fifo_length();
buffer_length = length;
if (length >= MAX_FIFO_SIZE)
{
return false;
}
if (length == 0)
{
return false;
}
// create the buffer
byte *buf = new byte[length];
uint8_t temp = 0, temp_last = 0;
int i = 0;
uint32_t buffer_pos = 0;
bool is_header = false;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
while (length--)
{
temp_last = temp;
temp = SPI.transfer(0x00);
//Read JPEG data from FIFO
if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_HIGH();
}
if (is_header == true)
{
//Write image data to buffer if not full
if (i < 256)
{
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
else
{
_arducam.CS_HIGH();
i = 0;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
_arducam.CS_LOW();
_arducam.set_fifo_burst();
}
}
            else if ((temp == 0xD8) && (temp_last == 0xFF))
{
is_header = true;
buf[buffer_pos] = temp_last;
buffer_pos++;
i++;
buf[buffer_pos] = temp;
buffer_pos++;
i++;
}
}
_arducam.clear_fifo_flag();
_arducam.set_format(_format);
_arducam.InitCAM();
_arducam.OV2640_set_JPEG_size(_image_size);
        // return the buffer and report success
        *buffer = buf;
        return true;
    }
private:
ArduCAM _arducam;
int _format;
int _image_size;
};

@ -0,0 +1,49 @@
#pragma once
#include <string>
using namespace std;
// WiFi credentials
const char *SSID = "<SSID>";
const char *PASSWORD = "<PASSWORD>";
const char *PREDICTION_URL = "<PREDICTION_URL>";
const char *PREDICTION_KEY = "<PREDICTION_KEY>";
// Microsoft Azure DigiCert Global Root G2 global certificate
const char *CERTIFICATE =
"-----BEGIN CERTIFICATE-----\r\n"
"MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
"MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
"d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
"MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
"MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
"c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
"ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
"wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
"iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
"ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
"aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
"0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
"gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
"sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
"lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
"N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
"Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
"AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
"BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
"JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
"CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
"Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
"aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
"cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
"MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
"cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
"AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
"+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
"cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
"kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
"trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
"8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
"-----END CERTIFICATE-----\r\n";

@ -0,0 +1,145 @@
#include <Arduino.h>
#include <ArduinoJson.h>
#include <HTTPClient.h>
#include <list>
#include <rpcWiFi.h>
#include "SD/Seeed_SD.h"
#include <Seeed_FS.h>
#include <SPI.h>
#include <vector>
#include <WiFiClientSecure.h>
#include "config.h"
#include "camera.h"
Camera camera = Camera(JPEG, OV2640_640x480);
WiFiClientSecure client;
void setupCamera()
{
pinMode(PIN_SPI_SS, OUTPUT);
digitalWrite(PIN_SPI_SS, HIGH);
Wire.begin();
SPI.begin();
if (!camera.init())
{
Serial.println("Error setting up the camera!");
}
}
void connectWiFi()
{
while (WiFi.status() != WL_CONNECTED)
{
Serial.println("Connecting to WiFi..");
WiFi.begin(SSID, PASSWORD);
delay(500);
}
client.setCACert(CERTIFICATE);
Serial.println("Connected!");
}
void setup()
{
Serial.begin(9600);
while (!Serial)
; // Wait for Serial to be ready
delay(1000);
connectWiFi();
setupCamera();
pinMode(WIO_KEY_C, INPUT_PULLUP);
}
const float threshold = 0.3f;
void processPredictions(std::vector<JsonVariant> &predictions)
{
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
        // Buffer large enough for the tag and probability
        char buff[64];
        snprintf(buff, sizeof(buff), "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
void detectStock(byte *buffer, uint32_t length)
{
HTTPClient httpClient;
httpClient.begin(client, PREDICTION_URL);
httpClient.addHeader("Content-Type", "application/octet-stream");
httpClient.addHeader("Prediction-Key", PREDICTION_KEY);
int httpResponseCode = httpClient.POST(buffer, length);
if (httpResponseCode == 200)
{
String result = httpClient.getString();
DynamicJsonDocument doc(1024);
deserializeJson(doc, result.c_str());
JsonObject obj = doc.as<JsonObject>();
JsonArray predictions = obj["predictions"].as<JsonArray>();
std::vector<JsonVariant> passed_predictions;
for(JsonVariant prediction : predictions)
{
float probability = prediction["probability"].as<float>();
if (probability > threshold)
{
passed_predictions.push_back(prediction);
}
}
processPredictions(passed_predictions);
}
httpClient.end();
}
void buttonPressed()
{
camera.startCapture();
while (!camera.captureReady())
delay(100);
Serial.println("Image captured");
byte *buffer;
uint32_t length;
if (camera.readImageToBuffer(&buffer, length))
{
Serial.print("Image read to buffer with length ");
Serial.println(length);
detectStock(buffer, length);
        delete[] buffer;
}
}
void loop()
{
if (digitalRead(WIO_KEY_C) == LOW)
{
buttonPressed();
delay(2000);
}
delay(200);
}

@ -0,0 +1,11 @@
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html

@ -0,0 +1,163 @@
# Count stock from your IoT device - Virtual IoT Hardware and Raspberry Pi
A combination of the predictions and their bounding boxes can be used to count stock in an image.
## Show bounding boxes
As a helpful debugging step you can not only print out the bounding boxes, but you can also draw them on the image that was written to disk when an image was captured.
### Task - print the bounding boxes
1. Ensure the `stock-counter` project is open in VS Code, and the virtual environment is activated if you are using a virtual IoT device.
1. Change the `print` statement in the `for` loop to the following to print the bounding boxes to the console:
```python
print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%\t{prediction.bounding_box}')
```
1. Run the app with the camera pointing at some stock on a shelf. The bounding boxes will be printed to the console, with left, top, width and height values from 0-1.
```output
pi@raspberrypi:~/stock-counter $ python3 app.py
tomato paste: 33.42% {'additional_properties': {}, 'left': 0.3455171, 'top': 0.09916268, 'width': 0.14175442, 'height': 0.29405564}
tomato paste: 34.41% {'additional_properties': {}, 'left': 0.48283678, 'top': 0.10242918, 'width': 0.11782813, 'height': 0.27467814}
tomato paste: 31.25% {'additional_properties': {}, 'left': 0.4923783, 'top': 0.35007596, 'width': 0.13668466, 'height': 0.28304994}
tomato paste: 31.05% {'additional_properties': {}, 'left': 0.36416405, 'top': 0.37494493, 'width': 0.14024884, 'height': 0.26880276}
```
### Task - draw bounding boxes on the image
1. The Pip package [Pillow](https://pypi.org/project/Pillow/) can be used to draw on images. Install this with the following command:
```sh
pip3 install pillow
```
If you are using a virtual IoT device, make sure to run this from inside the activated virtual environment.
1. Add the following import statement to the top of the `app.py` file:
```python
from PIL import Image, ImageDraw, ImageColor
```
This imports code needed to edit the image.
1. Add the following code to the end of the `app.py` file:
```python
with Image.open('image.jpg') as im:
draw = ImageDraw.Draw(im)
for prediction in predictions:
scale_left = prediction.bounding_box.left
scale_top = prediction.bounding_box.top
scale_right = prediction.bounding_box.left + prediction.bounding_box.width
scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
left = scale_left * im.width
top = scale_top * im.height
right = scale_right * im.width
bottom = scale_bottom * im.height
draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)
im.save('image.jpg')
```
This code opens the image that was saved earlier for editing. It then loops through the predictions, getting the bounding boxes and calculating the bottom right coordinates using the bounding box values from 0-1. These are then converted to image coordinates by multiplying by the relevant dimension of the image. For example, if the left value was 0.5 on an image that was 600 pixels wide, this would convert it to 300 (0.5 x 600 = 300).
Each bounding box is drawn on the image using a red line. Finally the edited image is saved, overwriting the original image.
1. Run the app with the camera pointing at some stock on a shelf. You will see the `image.jpg` file in the VS Code explorer, and you will be able to select it to see the bounding boxes.
![4 cans of tomato paste with bounding boxes around each can](../../../images/rpi-stock-with-bounding-boxes.jpg)
## Count stock
In the image shown above, the bounding boxes have a small overlap. If this overlap was much larger, then the bounding boxes may indicate the same object. To count the objects correctly, you need to ignore boxes with a significant overlap.
### Task - count stock ignoring overlap
1. The Pip package [Shapely](https://pypi.org/project/Shapely/) can be used to calculate the intersection. If you are using a Raspberry Pi, you will need to install a library dependency first:
```sh
sudo apt install libgeos-dev
```
1. Install the Shapely Pip package:
```sh
pip3 install shapely
```
If you are using a virtual IoT device, make sure to run this from inside the activated virtual environment.
1. Add the following import statement to the top of the `app.py` file:
```python
from shapely.geometry import Polygon
```
This imports code needed to create polygons to calculate overlap.
1. Above the code that draws the bounding boxes, add the following code:
```python
overlap_threshold = 0.20
```
This defines the percentage overlap allowed before the bounding boxes are considered to be the same object. 0.20 defines a 20% overlap.
1. To calculate overlap using Shapely, the bounding boxes need to be converted into Shapely polygons. Add the following function to do this:
```python
def create_polygon(prediction):
scale_left = prediction.bounding_box.left
scale_top = prediction.bounding_box.top
scale_right = prediction.bounding_box.left + prediction.bounding_box.width
scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])
```
This creates a polygon using the bounding box of a prediction.
1. The logic for removing overlapping objects involves comparing all the bounding boxes; if any pair of predictions has bounding boxes that overlap more than the threshold, one of the predictions is deleted. To compare all the predictions, you compare prediction 1 with 2, 3, 4, etc., then 2 with 3, 4, etc. The following code does this:
```python
to_delete = []
for i in range(0, len(predictions)):
polygon_1 = create_polygon(predictions[i])
for j in range(i+1, len(predictions)):
polygon_2 = create_polygon(predictions[j])
overlap = polygon_1.intersection(polygon_2).area
smallest_area = min(polygon_1.area, polygon_2.area)
if overlap > (overlap_threshold * smallest_area):
to_delete.append(predictions[i])
break
for d in to_delete:
predictions.remove(d)
print(f'Counted {len(predictions)} stock items')
```
The overlap is calculated using the Shapely `Polygon.intersection` method, which returns a polygon covering the overlap. The area is then calculated from this polygon. The overlap threshold is not an absolute value, but a percentage of a bounding box, so the smallest bounding box is found and the threshold is applied to its area to work out the largest overlap allowed. If the overlap exceeds this, the prediction is marked for deletion.
Once a prediction has been marked for deletion it doesn't need to be checked again, so the inner loop breaks out to check the next prediction. You can't delete items from a list whilst iterating through it, so the bounding boxes that overlap more than the threshold are added to the `to_delete` list, then deleted at the end.
Finally the stock count is printed to the console. This could then be sent to an IoT service to alert if the stock levels are low. All of this code is before the bounding boxes are drawn, so you will see the stock predictions without overlaps on the generated images.
> 💁 This is a very simplistic way to remove overlaps, just removing the first one in an overlapping pair. For production code, you would want to put more logic in here, such as considering the overlaps between multiple objects, or whether one bounding box is contained by another.
1. Run the app with the camera pointing at some stock on a shelf. The output will indicate the number of bounding boxes without overlaps that exceed the threshold. Try adjusting the `overlap_threshold` value to see predictions being ignored.
> 💁 You can find this code in the [code-count/pi](code-count/pi) or [code-count/virtual-iot-device](code-count/virtual-iot-device) folder.
😀 Your stock counter program was a success!

@ -0,0 +1,74 @@
# Call your object detector from your IoT device - Virtual IoT Hardware and Raspberry Pi
Once your object detector has been published, it can be used from your IoT device.
## Copy the image classifier project
The majority of your stock detector is the same as the image classifier you created in a previous lesson.
### Task - copy the image classifier project
1. Create a folder called `stock-counter` either on your computer if you are using a virtual IoT device, or on your Raspberry Pi. If you are using a virtual IoT device make sure you set up a virtual environment.
1. Set up the camera hardware.
* If you are using a Raspberry Pi you will need to fit the PiCamera. You might also want to fix the camera in a single position, for example, by hanging the cable over a box or can, or fixing the camera to a box with double-sided tape.
* If you are using a virtual IoT device then you will need to install CounterFit and the CounterFit PiCamera shim. If you are going to use still images, capture some images that your object detector hasn't seen yet; if you are going to use your web cam, make sure it is positioned so that it can see the stock you are detecting.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---capture-an-image-using-an-iot-device) to capture images from the camera.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---classify-images-from-your-iot-device) to call the image classifier. The majority of this code will be re-used to detect objects.
## Change the code from a classifier to an image detector
The code you used to classify images is very similar to the code to detect objects. The main difference is the method called on the Custom Vision SDK, and the results of the call.
### Task - change the code from a classifier to an image detector
1. Delete the three lines of code that classify the image and process the predictions:
```python
results = predictor.classify_image(project_id, iteration_name, image)
for prediction in results.predictions:
print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
```
Remove these three lines.
1. Add the following code to detect objects in the image:
```python
results = predictor.detect_image(project_id, iteration_name, image)
threshold = 0.3
predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
for prediction in predictions:
print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
```
This code calls the `detect_image` method on the predictor to run the object detector. It then gathers all the predictions with a probability above a threshold, printing them to the console.
Unlike an image classifier that only returns one result per tag, the object detector will return multiple results, so any with a low probability need to be filtered out.
1. Run this code and it will capture an image, send it to the object detector, and print out the detected objects. If you are using a virtual IoT device, ensure you have an appropriate image set in CounterFit, or your web cam is selected. If you are using a Raspberry Pi, make sure your camera is pointing at objects on a shelf.
```output
pi@raspberrypi:~/stock-counter $ python3 app.py
tomato paste: 34.13%
tomato paste: 33.95%
tomato paste: 35.05%
tomato paste: 32.80%
```
> 💁 You may need to adjust the `threshold` to an appropriate value for your images.
You will be able to see the image that was taken, and these values in the **Predictions** tab in Custom Vision.
![4 cans of tomato paste on a shelf with predictions for the 4 detections of 35.8%, 33.5%, 25.7% and 16.6%](../../../images/custom-vision-stock-prediction.png)
> 💁 You can find this code in the [code-detect/pi](code-detect/pi) or [code-detect/virtual-iot-device](code-detect/virtual-iot-device) folder.
😀 Your stock counter program was a success!

@ -0,0 +1,167 @@
# Count stock from your IoT device - Wio Terminal
A combination of the predictions and their bounding boxes can be used to count stock in an image.
## Count stock
![4 cans of tomato paste with bounding boxes around each can](../../../images/rpi-stock-with-bounding-boxes.jpg)
In the image shown above, the bounding boxes have a small overlap. If this overlap was much larger, then the bounding boxes may indicate the same object. To count the objects correctly, you need to ignore boxes with a significant overlap.
### Task - count stock ignoring overlap
1. Open your `stock-counter` project if it is not already open.
1. Above the `processPredictions` function, add the following code:
```cpp
const float overlap_threshold = 0.20f;
```
This defines the percentage overlap allowed before the bounding boxes are considered to be the same object. 0.20 defines a 20% overlap.
1. Below this, and above the `processPredictions` function, add the following code to calculate the overlap between two rectangles:
```cpp
struct Point {
float x, y;
};
struct Rect {
Point topLeft, bottomRight;
};
float area(Rect rect)
{
return abs(rect.bottomRight.x - rect.topLeft.x) * abs(rect.bottomRight.y - rect.topLeft.y);
}
float overlappingArea(Rect rect1, Rect rect2)
{
float left = max(rect1.topLeft.x, rect2.topLeft.x);
float right = min(rect1.bottomRight.x, rect2.bottomRight.x);
float top = max(rect1.topLeft.y, rect2.topLeft.y);
float bottom = min(rect1.bottomRight.y, rect2.bottomRight.y);
if ( right > left && bottom > top )
{
return (right-left)*(bottom-top);
}
return 0.0f;
}
```
This code defines a `Point` struct to store points on the image, and a `Rect` struct to define a rectangle using a top left and bottom right coordinate. It then defines an `area` function that calculates the area of a rectangle from a top left and bottom right coordinate.
Next it defines an `overlappingArea` function that calculates the overlapping area of two rectangles. If they don't overlap, it returns 0.
1. Below the `overlappingArea` function, declare a function to convert a bounding box to a `Rect`:
```cpp
Rect rectFromBoundingBox(JsonVariant prediction)
{
JsonObject bounding_box = prediction["boundingBox"].as<JsonObject>();
float left = bounding_box["left"].as<float>();
float top = bounding_box["top"].as<float>();
float width = bounding_box["width"].as<float>();
float height = bounding_box["height"].as<float>();
Point topLeft = {left, top};
Point bottomRight = {left + width, top + height};
return {topLeft, bottomRight};
}
```
This takes a prediction from the object detector, extracts the bounding box and uses the values on the bounding box to define a rectangle. The right side is calculated from the left plus the width. The bottom is calculated as the top plus the height.
1. The predictions need to be compared to each other, and if two predictions have an overlap of more than the threshold, one of them needs to be deleted. The overlap threshold is a percentage, so it needs to be multiplied by the size of the smallest bounding box to check that the overlap exceeds the given percentage of the bounding box, not the given percentage of the whole image. Start by deleting the contents of the `processPredictions` function.
1. Add the following to the empty `processPredictions` function:
```cpp
std::vector<JsonVariant> passed_predictions;
for (int i = 0; i < predictions.size(); ++i)
{
Rect prediction_1_rect = rectFromBoundingBox(predictions[i]);
float prediction_1_area = area(prediction_1_rect);
bool passed = true;
for (int j = i + 1; j < predictions.size(); ++j)
{
Rect prediction_2_rect = rectFromBoundingBox(predictions[j]);
float prediction_2_area = area(prediction_2_rect);
float overlap = overlappingArea(prediction_1_rect, prediction_2_rect);
float smallest_area = min(prediction_1_area, prediction_2_area);
if (overlap > (overlap_threshold * smallest_area))
{
passed = false;
break;
}
}
if (passed)
{
passed_predictions.push_back(predictions[i]);
}
}
```
This code declares a vector to store the predictions that don't overlap. It then loops through all the predictions, creating a `Rect` from the bounding box.
Next this code loops through the remaining predictions, starting at the one after the current prediction. This stops predictions being compared more than once - once 1 and 2 have been compared, there's no need to compare 2 with 1, only with 3, 4, etc.
For each pair of predictions the overlapping area is calculated. This is then compared to the area of the smallest bounding box - if the overlap exceeds the threshold percentage of the smallest bounding box, the prediction is marked as not passed. If the prediction still passes after being compared against all the others, it is added to the `passed_predictions` collection.
> 💁 This is a very simplistic way to remove overlaps, just removing the first one in an overlapping pair. For production code, you would want to add more logic here, such as considering the overlaps between multiple objects, or whether one bounding box is contained within another.
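As an illustration of that kind of extra logic, a containment check might look like the following minimal sketch - the `contains` helper is not part of the lesson code, and reuses the `Rect` struct defined above:

```cpp
// A sketch only - returns true if 'inner' lies entirely within 'outer'.
// Production code might use a check like this to keep the box with the
// higher probability, rather than just the first one in the pair.
bool contains(Rect outer, Rect inner)
{
    return outer.topLeft.x <= inner.topLeft.x &&
           outer.topLeft.y <= inner.topLeft.y &&
           outer.bottomRight.x >= inner.bottomRight.x &&
           outer.bottomRight.y >= inner.bottomRight.y;
}
```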
1. After this, add the following code to send details of the passed predictions to the serial monitor:
```cpp
for(JsonVariant prediction : passed_predictions)
{
String boundingBox = prediction["boundingBox"].as<String>();
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
// The bounding box JSON alone can exceed 32 characters, so use a larger
// buffer and snprintf to guard against overflowing it
char buff[128];
snprintf(buff, sizeof(buff), "%s:\t%.2f%%\t%s", tag.c_str(), probability * 100.0, boundingBox.c_str());
Serial.println(buff);
}
```
This code loops through the passed predictions and prints their details to the serial monitor.
1. Below this, add code to print the number of counted items to the serial monitor:
```cpp
Serial.print("Counted ");
Serial.print(passed_predictions.size());
Serial.println(" stock items.");
```
This could then be sent to an IoT service to alert if the stock levels are low.
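As a sketch of that idea, assuming an MQTT connection set up as in the 'Connect your device to the Internet' lesson with a PubSubClient instance named `client`, the count could be published like this - the topic name is hypothetical and this is not part of the lesson's code:

```cpp
// A sketch only - 'client' and the 'stock/count' topic are assumptions
String payload = "{\"stock_count\": " + String(passed_predictions.size()) + "}";
client.publish("stock/count", payload.c_str());
```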
1. Upload and run your code. Point the camera at objects on a shelf and press the C button. Try adjusting the `overlap_threshold` value to see predictions being ignored.
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 17416
tomato paste: 35.84% {"left":0.395631,"top":0.215897,"width":0.180768,"height":0.359364}
tomato paste: 35.87% {"left":0.378554,"top":0.583012,"width":0.14824,"height":0.359382}
tomato paste: 34.11% {"left":0.699024,"top":0.592617,"width":0.124411,"height":0.350456}
tomato paste: 35.16% {"left":0.513006,"top":0.647853,"width":0.187472,"height":0.325817}
Counted 4 stock items.
```
> 💁 You can find this code in the [code-count/wio-terminal](code-count/wio-terminal) folder.
😀 Your stock counter program was a success!

@ -0,0 +1,102 @@
# Call your object detector from your IoT device - Wio Terminal
Once your object detector has been published, it can be used from your IoT device.
## Copy the image classifier project
The majority of your stock detector is the same as the image classifier you created in a previous lesson.
### Task - copy the image classifier project
1. Connect your ArduCam to your Wio Terminal, following the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md#task---connect-the-camera).
You might also want to fix the camera in a single position, for example, by hanging the cable over a box or can, or fixing the camera to a box with double-sided tape.
1. Create a brand new Wio Terminal project using PlatformIO. Call this project `stock-counter`.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---capture-an-image-using-an-iot-device) to capture images from the camera.
1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---classify-images-from-your-iot-device) to call the image classifier. The majority of this code will be re-used to detect objects.
## Change the code from a classifier to an image detector
The code you used to classify images is very similar to the code to detect objects. The main differences are the URL that is called, which you obtained from Custom Vision, and the results of the call.
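To make that difference concrete, each prediction returned by the object detector carries a `boundingBox` alongside the `tagName` and `probability`. An abbreviated response body might look like this (the values are illustrative):

```json
{
    "predictions": [
        {
            "probability": 0.358,
            "tagName": "tomato paste",
            "boundingBox": {
                "left": 0.395,
                "top": 0.215,
                "width": 0.180,
                "height": 0.359
            }
        }
    ]
}
```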
### Task - change the code from a classifier to an image detector
1. Add the following include directive to the top of the `main.cpp` file:
```cpp
#include <vector>
```
1. Rename the `classifyImage` function to `detectStock`, changing both the name of the function and the call in the `buttonPressed` function.
1. Above the `detectStock` function, declare a threshold to filter out any detections that have a low probability:
```cpp
const float threshold = 0.3f;
```
Unlike an image classifier that only returns one result per tag, the object detector will return multiple results, so any with a low probability need to be filtered out.
1. Above the `detectStock` function, declare a function to process the predictions:
```cpp
void processPredictions(std::vector<JsonVariant> &predictions)
{
for(JsonVariant prediction : predictions)
{
String tag = prediction["tagName"].as<String>();
float probability = prediction["probability"].as<float>();
// Use snprintf with a sized buffer so long tag names can't overflow it
char buff[64];
snprintf(buff, sizeof(buff), "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
Serial.println(buff);
}
}
```
This takes a list of predictions and prints them to the serial monitor.
1. In the `detectStock` function, replace the contents of the `for` loop that loops through the predictions with the following:
```cpp
std::vector<JsonVariant> passed_predictions;
for(JsonVariant prediction : predictions)
{
float probability = prediction["probability"].as<float>();
if (probability > threshold)
{
passed_predictions.push_back(prediction);
}
}
processPredictions(passed_predictions);
```
This loops through the predictions, comparing the probability to the threshold. All predictions that have a probability higher than the threshold are added to a vector and passed to the `processPredictions` function.
1. Upload and run your code. Point the camera at objects on a shelf and press the C button. You will see the output in the serial monitor:
```output
Connecting to WiFi..
Connected!
Image captured
Image read to buffer with length 17416
tomato paste: 35.84%
tomato paste: 35.87%
tomato paste: 34.11%
tomato paste: 35.16%
```
> 💁 You may need to adjust the `threshold` to an appropriate value for your images.
You will be able to see the image that was taken, and these values, in the **Predictions** tab in Custom Vision.
![4 cans of tomato paste on a shelf with predictions for the 4 detections of 35.8%, 33.5%, 25.7% and 16.6%](../../../images/custom-vision-stock-prediction.png)
> 💁 You can find this code in the [code-detect/wio-terminal](code-detect/wio-terminal) folder.
😀 Your stock counter program was a success!

@ -1,12 +1,12 @@
# Consumer IoT - build a smart voice assistant
The food has been grown, driven to a processing plant, sorted for quality, sold in the store and now it's time to cook! One of the core pieces of any kitchen is a timer. Initially these started as hourglasses - your food was cooked when all the sand trickled down into the bottom bulb. They then went clockwork, then electric.
The latest iterations are now part of our smart devices. In kitchens in homes all throughout the world you'll hear cooks shouting "Hey Siri - set a 10 minute timer", or "Alexa - cancel my bread timer". No longer do you have to walk back to the kitchen to check on a timer; you can do it from your phone, or with a call out across the room.
In these 4 lessons you'll learn how to build a smart timer, using AI to recognize your voice, understand what you are asking for, and reply with information about your timer. You'll also add support for multiple languages.
> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
## Topics

@ -1,6 +1,8 @@
# Recognize speech with an IoT device
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-21.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of the Azure speech service, a topic that will be covered in this lesson:
@ -18,7 +20,7 @@ This video gives an overview of the Azure speech service, a topic that will be c
'Alexa, timer status'
'Alexa, set a 8 minute timer called steam broccoli'
Smart devices are becoming more and more pervasive. Not just as smart speakers like HomePods, Echos and Google Homes, but embedded in our phones, watches, and even light fittings and thermostats.
@ -51,8 +53,6 @@ Microphones come in a variety of types:
![Patti Smith singing into a Shure SM58 (dynamic cardioid type) microphone](../../../images/dynamic-mic.jpg)
* Ribbon - Ribbon microphones are similar to dynamic microphones, except they have a metal ribbon instead of a diaphragm. This ribbon moves in a magnetic field generating an electrical current. Like dynamic microphones, ribbon microphones don't need power to work.
![Edmund Lowe, American actor, standing at radio microphone (labeled for (NBC) Blue Network), holding script, 1942](../../../images/ribbon-mic.jpg)
@ -61,8 +61,6 @@ Microphones come in a variety of types:
![C451B small-diaphragm condenser microphone by AKG Acoustics](../../../images/condenser-mic.jpg)
* MEMS - Microelectromechanical systems microphones, or MEMS, are microphones on a chip. They have a pressure-sensitive diaphragm etched onto a silicon chip, and work similarly to a condenser microphone. These microphones can be tiny, and integrated into circuitry.
![A MEMS microphone on a circuit board](../../../images/mems-microphone.png)
@ -87,7 +85,7 @@ For example most streaming music services offer 16-bit or 24-bit audio. This mea
> 💁 You may have heard of 8-bit audio, often referred to as LoFi. This is audio sampled using only 8 bits, so -128 to 127. The first computer audio was limited to 8 bits due to hardware limitations, so this is often seen in retro gaming.
These samples are taken many thousands of times per second, using well-defined sample rates measured in KHz (thousands of readings per second). Streaming music services use 48KHz for most audio, but some 'lossless' audio uses up to 96KHz or even 192KHz. The higher the sample rate, the closer to the original the audio will be, up to a point. There is debate whether humans can tell the difference above 48KHz.
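As a rough illustration of what these numbers mean for data size - the values below are a back-of-the-envelope sketch, not taken from any particular service:

```cpp
// Raw data rate for uncompressed mono audio at 48KHz, 16-bit
const int sampleRateHz   = 48000;          // samples per second
const int bytesPerSample = 16 / 8;         // 16-bit samples = 2 bytes each
const int bytesPerSecond = sampleRateHz * bytesPerSample;  // 96,000 bytes/sec

// One second of CD-quality stereo (44.1KHz, 16-bit, 2 channels) is
// 44100 * 2 * 2 = 176,400 bytes - one reason streamed audio is compressed
```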
✅ Do some research: If you use a streaming music service, what sample rate and size does it use? If you use CDs, what is the sample rate and size of CD audio?

@ -1,8 +1,8 @@
# Understand language
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-22.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz

@ -1,8 +1,8 @@
# Set a timer and provide spoken feedback
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-23.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@ -121,6 +121,7 @@ SSML has ways to change how words are spoken, such as adding emphasis to certain
* Read more on speech synthesis on the [Speech synthesis page on Wikipedia](https://wikipedia.org/wiki/Speech_synthesis)
* Read more on ways criminals are using speech synthesis to steal on the [Fake voices 'help cyber crooks steal cash' story on BBC news](https://www.bbc.com/news/technology-48908736)
* Learn more about the risks to voice actors from synthesized versions of their voices in the [This TikTok Lawsuit Is Highlighting How AI Is Screwing Over Voice Actors article on Vice](https://www.vice.com/en/article/z3xqwj/this-tiktok-lawsuit-is-highlighting-how-ai-is-screwing-over-voice-actors)
## Assignment

@ -1,6 +1,8 @@
# Support multiple languages
![A sketchnote overview of this lesson](../../../sketchnotes/lesson-24.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
This video gives an overview of the Azure speech services, covering speech to text and text to speech from earlier lessons, as well as translating speech, a topic covered in this lesson:
@ -26,6 +28,10 @@ In this lesson we'll cover:
* [Support multiple languages in applications with translations](#support-multiple-languages-in-applications-with-translations)
* [Translate text using an AI service](#translate-text-using-an-ai-service)
> 🗑 This is the last lesson in this project, so after completing this lesson and the assignment, don't forget to clean up your cloud services. You will need the services to complete the assignment, so make sure to complete that first.
>
> Refer to [the clean up your project guide](../../../clean-up.md) if necessary for instructions on how to do this.
## Translate text
Text translation is a computer science problem that has been researched for over 70 years, and only now, thanks to advances in AI and computing power, is it close to being solved to a point where it is almost as good as human translators.
@ -42,7 +48,7 @@ For example, translating "Hello world" from English into French can be performed
Substitutions don't work when different languages use different ways of saying the same thing. For example, the English sentence "My name is Jim" translates into "Je m'appelle Jim" in French - literally "I call myself Jim". "Je" is French for "I", and "m'" is a contraction of "me", used because "appelle" starts with a vowel; "appelle" is to call, and "Jim" isn't translated as it's a name, not a word that can be translated. Word ordering also becomes an issue - a simple word-for-word substitution of "Je m'appelle Jim" becomes "I myself call Jim", a different word order from English.
> 💁 Some words are never translated - my name is Jim regardless of which language is used to introduce me. When translating between languages that use different alphabets, or different letters for different sounds, words can be *transliterated* - that is, letters or characters are selected that approximate the sound of the given word.
Idioms are also a problem for translation. These are phrases that have an understood meaning that is different from a direct interpretation of the words. For example, in English the idiom "I've got ants in my pants" does not literally refer to having ants in your clothing, but to being restless. If you translated this to German, you would end up confusing the listener, as the German version is "I have bumble bees in the bottom".
@ -117,8 +123,6 @@ In an ideal world, your whole application should understand as many different la
![A smart timer architecture translating Japanese to English, processing in English then translating back to Japanese](../../../images/translated-smart-timer.png)
Imagine you are building a smart timer that uses English end-to-end, understanding spoken English and converting that to text, running the language understanding in English, building up responses in English and replying with English speech. If you wanted to add support for Japanese, you could start by translating spoken Japanese into English text, keep the core of the application the same, and then translate the response text into Japanese before speaking the response. This would allow you to quickly add Japanese support, and you could expand to full end-to-end Japanese support later.
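A hedged sketch of that flow is below - every function here is a hypothetical placeholder named only to illustrate the order of the calls, not any real API:

```cpp
// All helper functions are hypothetical placeholders for the speech,
// translation and language understanding services
String japaneseText  = speechToText(audioBuffer, "ja-JP");       // speech to text
String englishText   = translateText(japaneseText, "ja", "en");  // translate in
String englishReply  = processCommand(englishText);              // core app stays English
String japaneseReply = translateText(englishReply, "en", "ja");  // translate out
textToSpeech(japaneseReply, "ja-JP");                            // spoken response
```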
> 💁 The downside to relying on machine translation is that different languages and cultures have different ways of saying the same things, so the translation may not match the expression you are expecting.

@ -14,7 +14,9 @@ Azure Cloud Advocates at Microsoft are pleased to offer a 12-week, 24-lesson cur
The projects cover the journey of food from farm to table. This includes farming, logistics, manufacturing, retail and consumer - all popular industry areas for IoT devices.
![A road map for the course showing 24 lessons covering intro, farming, transport, processing, retail and cooking](sketchnotes/Roadmap.jpg)
> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
**Hearty thanks to our authors [Jen Fox](https://github.com/jenfoxbot), [Jen Looper](https://github.com/jlooper), [Jim Bennett](https://github.com/jimbobbennett), and our sketchnote artist [Nitya Narasimhan](https://github.com/nitya).**
@ -22,13 +24,15 @@ The projects cover the journey of food from farm to table. This includes farming
> **Teachers**, we have [included some suggestions](for-teachers.md) on how to use this curriculum. If you would like to create your own lessons, we have also included a [lesson template](lesson-template/README.md).
> **Students**, to use this curriculum on your own, fork the entire repo and complete the exercises on your own, starting with a pre-lecture quiz, then reading the lecture and completing the rest of the activities. Try to create the projects by comprehending the lessons rather than copying the solution code; however that code is available in the /solutions folders in each project-oriented lesson. Another idea would be to form a study group with friends and go through the content together. For further study, we recommend [Microsoft Learn](https://docs.microsoft.com/users/jimbobbennett/collections/ke2ehd351jopwr?WT.mc_id=academic-17441-jabenn).
<!--
> Your promo video here
[![Promo video](./images/iot-for-beginners.png)](https://youtube.com/watch?v=R1wrdtmBSII "Promo video")
> 💁 Click the image above for a video about the project!
-->
## Pedagogy
@ -40,13 +44,15 @@ In addition, a low-stakes quiz before a class sets the intention of the student
Each project is based around real-world hardware available to students and hobbyists. Each project looks into the specific project domain, providing relevant background knowledge. To be a successful developer it helps to understand the domain in which you are solving problems; providing this background knowledge allows students to think about their IoT solutions and learnings in the context of the kind of real-world problem that they might be asked to solve as an IoT developer. Students learn the 'why' of the solutions they are building, and get an appreciation of the end user.
## Hardware
We have two choices of IoT hardware to use for the projects depending on personal preference, programming language knowledge or preferences, learning goals and availability. We have also provided a 'virtual hardware' version for those who don't have access to hardware, or want to learn more before committing to a purchase. You can read more and find a 'shopping list' on the [hardware page](./hardware.md), including links to buy complete kits from our friends at Seeed Studio.
> 💁 Find our [Code of Conduct](CODE_OF_CONDUCT.md), [Contributing](CONTRIBUTING.md), and [Translation](TRANSLATIONS.md) guidelines. We welcome your constructive feedback!
## Each lesson includes:
- sketchnote
- optional supplemental video
- pre-lesson warmup quiz
- written lesson
@ -64,7 +70,7 @@ We have two choices of IoT hardware to use for the projects depending on persona
| | Project Name | Concepts Taught | Learning Objectives | Linked Lesson |
| :-: | :----------: | :-------------: | ------------------- | :-----------: |
| 01 | [Getting started](./1-getting-started) | Introduction to IoT | Learn the basic principles of IoT and the basic building blocks of IoT solutions such as sensors and cloud services whilst you are setting up your first IoT device | [Introduction to IoT](./1-getting-started/lessons/1-introduction-to-iot/README.md) |
| 02 | [Getting started](./1-getting-started) | A deeper dive into IoT | Learn more about the components of an IoT system, as well as microcontrollers and single-board computers | [A deeper dive into IoT](./1-getting-started/lessons/2-deeper-dive/README.md) |
| 03 | [Getting started](./1-getting-started) | Interact with the physical world with sensors and actuators | Learn about sensors to gather data from the physical world, and actuators to send feedback, whilst you build a nightlight | [Interact with the physical world with sensors and actuators](./1-getting-started/lessons/3-sensors-and-actuators/README.md) |
| 04 | [Getting started](./1-getting-started) | Connect your device to the Internet | Learn about how to connect an IoT device to the Internet to send and receive messages by connecting your nightlight to an MQTT broker | [Connect your device to the Internet](./1-getting-started/lessons/4-connect-internet/README.md) |
| 05 | [Farm](./2-farm) | Predict plant growth | Learn how to predict plant growth using temperature data captured by an IoT device | [Predict plant growth](./2-farm/lessons/1-predict-plant-growth/README.md) |
@ -91,3 +97,16 @@ We have two choices of IoT hardware to use for the projects depending on persona
## Offline access
You can run this documentation offline by using [Docsify](https://docsify.js.org/#/). Fork this repo, [install Docsify](https://docsify.js.org/#/quickstart) on your local machine, and then in the root folder of this repo, type `docsify serve`. The website will be served on port 3000 on your localhost: `localhost:3000`.
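For example, assuming you already have npm installed, the install and serve steps from the Docsify quickstart are:

```sh
npm i docsify-cli -g    # one-time install of the Docsify CLI
docsify serve           # serve the docs on localhost:3000
```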
### PDF
You can generate a PDF of this content for offline access if needed. To do this, make sure you have [npm installed](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) and run the following commands in the root folder of this repo:
```sh
npm i
npm run convert
```
## Image attributions
You can find all the attributions for the images used in this curriculum, where required, in the [Attributions](./attributions.md) file.

@ -1,24 +1,10 @@
# Support
## How to file issues and get help
This project uses GitHub Issues to track bugs and feature requests. Please search the existing issues before filing new issues to avoid duplicates. For new issues, file your bug or feature request as a new Issue.
For help and questions about using this project, please contact us by raising an issue in this repo.
## Microsoft Support Policy
