diff --git a/.github/workflows/azure-static-web-apps-brave-island-0b7c7f50f.yml b/.github/workflows/azure-static-web-apps-brave-island-0b7c7f50f.yml
index 9f8faffe..eb4fa343 100644
--- a/.github/workflows/azure-static-web-apps-brave-island-0b7c7f50f.yml
+++ b/.github/workflows/azure-static-web-apps-brave-island-0b7c7f50f.yml
@@ -4,10 +4,6 @@ on:
push:
branches:
- main
- pull_request:
- types: [opened, synchronize, reopened, closed]
- branches:
- - main
jobs:
build_and_deploy_job:
diff --git a/.gitignore b/.gitignore
index d432aa94..1217dc76 100644
--- a/.gitignore
+++ b/.gitignore
@@ -7,3 +7,4 @@
.vscode/launch.json
.vscode/ipch
.ipynb_checkpoints
+/node_modules
diff --git a/.vscode/settings.json b/.vscode/settings.json
index b3d8c5b5..dde853fc 100644
--- a/.vscode/settings.json
+++ b/.vscode/settings.json
@@ -1,10 +1,13 @@
{
"cSpell.words": [
"ADCs",
+ "Alexa",
"Geospatial",
"Kbps",
"Mbps",
+ "SSML",
"Seeed",
+ "Siri",
"Twilio",
"UART",
"UDID",
diff --git a/1-getting-started/Translations/README.id.md b/1-getting-started/Translations/README.id.md
new file mode 100644
index 00000000..438bdaad
--- /dev/null
+++ b/1-getting-started/Translations/README.id.md
@@ -0,0 +1,16 @@
+# Memulai dengan IoT
+
+Pada bagian ini, Anda akan diperkenalkan dengan Internet of Things, dan mempelajari konsep dasar termasuk membangun proyek IoT 'Hello World' pertama Anda yang terhubung ke *cloud*. Proyek ini merupakan lampu malam yang akan menyala saat tingkat cahaya yang diukur oleh sensor menurun.
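+
+Sebagai gambaran kasar saja, logika inti lampu malam tersebut kira-kira seperti sketsa Python berikut. Nama fungsi dan nilai ambang di sini hanyalah contoh hipotetis untuk ilustrasi, bukan kode final yang akan Anda tulis di pelajaran:
+
+```python
+import random
+import time
+
+AMBANG_CAHAYA = 300  # nilai ambang contoh; nilai sebenarnya tergantung sensor yang dipakai
+
+def baca_tingkat_cahaya():
+    # Placeholder: di perangkat sungguhan, nilai ini dibaca dari sensor cahaya
+    return random.randint(0, 1023)
+
+def atur_led(menyala):
+    # Placeholder: di perangkat sungguhan, ini menyalakan atau mematikan LED
+    print("LED menyala" if menyala else "LED mati")
+
+for _ in range(5):
+    cahaya = baca_tingkat_cahaya()
+    atur_led(cahaya < AMBANG_CAHAYA)  # LED menyala saat cahaya di bawah ambang
+    time.sleep(1)
+```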
+
+
+
+## Topik
+
+1. [Pengenalan IoT](../lessons/1-introduction-to-iot/README.md)
+2. [Lebih dalam dengan IoT](../lessons/2-deeper-dive/README.md)
+3. [Berinteraksi dengan dunia menggunakan sensor dan aktuator](../lessons/3-sensors-and-actuators/README.md)
+4. [Menghubungkan perangkat Anda ke Internet](../lessons/4-connect-internet/README.md)
+
+## Kredit
+
+Semua pelajaran ditulis dengan ♥️ oleh [Jim Bennett](https://GitHub.com/JimBobBennett)
diff --git a/1-getting-started/lessons/1-introduction-to-iot/README.md b/1-getting-started/lessons/1-introduction-to-iot/README.md
index 06901940..4af59048 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/README.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/README.md
@@ -44,7 +44,7 @@ The **T** in IoT stands for **Things** - devices that interact with the physical
Devices for production or commercial use, such as consumer fitness trackers, or industrial machine controllers, are usually custom-made. They use custom circuit boards, maybe even custom processors, designed to meet the needs of a particular task, whether that's being small enough to fit on a wrist, or rugged enough to work in a high temperature, high stress or high vibration factory environment.
-As a developer either learning about IoT or creating a device prototype, you'll need to start with a developer kit. These are general-purpose IoT devices designed for developers to use, often with features that you wouldn't see on a production device, such as a set of external pins to connect sensors or actuators to, hardware to support debugging, or additional resources that would add unnecessary cost when doing a large manufacturing run.
+As a developer either learning about IoT or creating a device prototype, you'll need to start with a developer kit. These are general-purpose IoT devices designed for developers to use, often with features that you wouldn't have on a production device, such as a set of external pins to connect sensors or actuators to, hardware to support debugging, or additional resources that would add unnecessary cost when doing a large manufacturing run.
These developer kits usually fall into two categories - microcontrollers and single-board computers. These will be introduced here, and we'll go into more detail in the next lesson.
@@ -135,6 +135,8 @@ Work through the relevant guide to set your device up and complete a 'Hello Worl
* [Single-board computer - Raspberry Pi](pi.md)
* [Single-board computer - Virtual device](virtual-device.md)
+✅ You will be using VS Code for both Arduino and Single-board computers. If you haven't used this before, read more about it on the [VS Code site](https://code.visualstudio.com?WT.mc_id=academic-17441-jabenn)
+
## Applications of IoT
IoT covers a huge range of use cases, across a few broad groups:
diff --git a/1-getting-started/lessons/1-introduction-to-iot/pi.md b/1-getting-started/lessons/1-introduction-to-iot/pi.md
index 08242b54..eaf4101e 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/pi.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/pi.md
@@ -235,7 +235,7 @@ Create the Hello World app.
> 💁 You need to explicitly call `python3` to run this code just in case you have Python 2 installed in addition to Python 3 (the latest version). If you have Python2 installed then calling `python` will use Python 2 instead of Python 3
- You should see the following output:
+ The following output will appear in the terminal:
```output
pi@raspberrypi:~/nightlight $ python3 app.py
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/README.ar.md b/1-getting-started/lessons/1-introduction-to-iot/translations/README.ar.md
new file mode 100644
index 00000000..b1e096c3
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/README.ar.md
@@ -0,0 +1,284 @@
+# مقدمة لإنترنت الأشياء
+
+
+
+> يغطي هذا الدرس بعض الموضوعات التمهيدية حول إنترنت الأشياء ، ويساعدك على إعداد أجهزتك.
+
+
+سنغطي في هذا الدرس:
+
+
+
+- [ ما هو انترنت الأشياء ](#what-is-the-internet-of-things)
+- [الأجهزة المتعلقة بانترنت الأشياء](#iot-devices)
+- [قم بإعداد جهازك](#set-up-your-device)
+- [تطبيقات انترنت الأشياء](#applications-of-iot)
+- [أمثلة على أجهزة إنترنت الأشياء التي قد تكون موجودة حولك](#examples-of-iot-devices-you-may-have-around-you)
+
+
+## ما هو "انترنت الأشياء"؟
+
+
+مصطلح "إنترنت الأشياء" ابتكره [Kevin Ashton](https://wikipedia.org/wiki/Kevin_Ashton) في عام 1999 ، للإشارة إلى توصيل الإنترنت بالعالم المادي عبر أجهزة الاستشعار. منذ ذلك الحين ، تم استخدام المصطلح لوصف أي جهاز يتفاعل مع العالم المادي من حوله ، إما عن طريق جمع البيانات من أجهزة الاستشعار ، أو توفير تفاعلات في العالم الحقيقي عبر المشغلات (الأجهزة التي تقوم بشيء مثل تشغيل مفتاح أو إضاءة LED) ، متصلة بشكل عام بأجهزة أخرى أو بالإنترنت.
+
+> المستشعرات تجمع المعلومات من العالم ، مثل قياس السرعة أو درجة الحرارة أو الموقع.
+>
+> المشغلات تحول الإشارات الكهربائية إلى تفاعلات في العالم الحقيقي مثل تشغيل مفتاح أو تشغيل الأضواء أو إصدار أصوات أو إرسال إشارات تحكم إلى أجهزة أخرى ، على سبيل المثال لتشغيل مقبس طاقة.
+
+
إن إنترنت الأشياء كمجال تقني هو أكثر من مجرد أجهزة - فهو يشمل الخدمات المستندة إلى السحابة التي يمكنها معالجة بيانات المستشعر ، أو إرسال طلبات إلى المشغلات المتصلة بأجهزة إنترنت الأشياء. ويشمل أيضًا الأجهزة التي لا تحتوي على اتصال بالإنترنت أو لا تحتاج إليه ، وغالبًا ما يشار إليها باسم الأجهزة المتطورة. هذه هي الأجهزة التي يمكنها معالجة بيانات الاستشعار والاستجابة لها بنفسها ، وعادةً ما تستخدم نماذج الذكاء الاصطناعي المدربة في السحابة.
+
+
إنترنت الأشياء هو مجال تكنولوجي سريع النمو. تشير التقديرات إلى أنه بحلول نهاية عام 2020 ، تم نشر 30 مليار جهاز إنترنت الأشياء وتوصيلها بالإنترنت. بالنظر إلى المستقبل ، تشير التقديرات إلى أنه بحلول عام 2025 ، ستجمع أجهزة إنترنت الأشياء ما يقرب من 80 زيتابايت من البيانات ، أو 80 تريليون جيجابايت. هذا كثير من البيانات!
+
+
+
+
✅ قم بإجراء القليل من البحث: ما مقدار البيانات التي تم إنشاؤها بواسطة أجهزة إنترنت الأشياء المستخدمة بالفعل ، وكم يتم إهدارها؟ لماذا يتم تجاهل الكثير من البيانات؟
+
+
+هذه البيانات هي مفتاح نجاح إنترنت الأشياء. لكي تكون مطورًا ناجحًا لإنترنت الأشياء ، فأنت بحاجة إلى فهم البيانات التي تحتاج إلى جمعها ، وكيفية جمعها ، وكيفية اتخاذ القرارات بناءً عليها ، وكيفية استخدام هذه القرارات للتفاعل مع العالم المادي إذا لزم الأمر.
+
+## الأجهزة المتعلقة بانترنت الأشياء
+
+
+يرمز T في إنترنت الأشياء إلى الأشياء - الأجهزة التي تتفاعل مع العالم المادي من حولها إما عن طريق جمع البيانات من أجهزة الاستشعار ، أو توفير تفاعلات واقعية عبر المشغلات.
+
+عادةً ما تكون الأجهزة المخصصة للإنتاج أو الاستخدام التجاري ، مثل أجهزة تتبع اللياقة البدنية للمستهلكين أو أجهزة التحكم في الآلات الصناعية ، مصنوعة خصيصًا. يستخدمون لوحات دوائر مخصصة ، وربما حتى معالجات مخصصة ، مصممة لتلبية احتياجات مهمة معينة ، سواء كانت صغيرة بما يكفي لتناسب معصمك ، أو متينة بما يكفي للعمل في درجات حرارة عالية ، أو ضغط مرتفع ، أو بيئة مصنع عالية الاهتزاز.
+
+بصفتك مطورًا إما يتعلم عن إنترنت الأشياء أو يصنع نموذجًا أوليًا للجهاز ، فستحتاج إلى البدء بمجموعة أدوات التطوير. هذه أجهزة إنترنت الأشياء للأغراض العامة مصممة للمطورين لاستخدامها ، غالبًا مع ميزات لن تراها على جهاز إنتاج ، مثل مجموعة من المسامير الخارجية لتوصيل المستشعرات أو المشغلات بها ، أو الأجهزة لدعم التصحيح ، أو الموارد الإضافية التي سيضيف تكلفة غير ضرورية عند إجراء عملية تصنيع كبيرة.
+
+تنقسم مجموعات المطورين هذه عادةً إلى فئتين - المتحكمات الدقيقة وأجهزة الكمبيوتر أحادية اللوحة. سيتم تقديم هذه هنا ، وسنتناول المزيد من التفاصيل في الدرس التالي.
+
+>💁يمكن أيضًا اعتبار هاتفك جهاز إنترنت الأشياء للأغراض العامة ، مع أجهزة استشعار ومحركات مدمجة ، مع تطبيقات مختلفة تستخدم المستشعرات والمشغلات بطرق مختلفة مع خدمات سحابية مختلفة. يمكنك أيضًا العثور على بعض البرامج التعليمية لإنترنت الأشياء التي تستخدم تطبيق الهاتف كجهاز إنترنت الأشياء.
+
+
+
+
+
+### المتحكم الدقيق
+
+
+المتحكم الدقيق (يشار إليه أيضًا باسم MCU ، اختصارًا لوحدة التحكم الدقيقة) هو جهاز كمبيوتر صغير يتكون من:
+
+🧠 واحدة أو أكثر من وحدات المعالجة المركزية (CPUs) - "عقل" المتحكم الدقيق الذي يدير برنامجك
+
+💾 الذاكرة (ذاكرة الوصول العشوائي وذاكرة البرنامج) - حيث يتم تخزين البرنامج والبيانات والمتغيرات الخاصة بك
+
+🔌 اتصالات الإدخال / الإخراج القابلة للبرمجة (I / O) - للتحدث إلى الأجهزة الطرفية الخارجية (الأجهزة المتصلة) مثل المستشعرات والمشغلات
+
+عادةً ما تكون وحدات التحكم الدقيقة أجهزة حوسبة منخفضة التكلفة ، حيث ينخفض متوسط أسعار الأجهزة المستخدمة في الأجهزة المخصصة إلى حوالي 0.50 دولار أمريكي ، وبعض الأجهزة رخيصة مثل 0.03 دولار أمريكي. يمكن أن تبدأ مجموعات المطورين بسعر منخفض يصل إلى 4 دولارات أمريكية ، مع ارتفاع التكاليف كلما أضفت المزيد من الميزات. محطة Wio, مجموعة مطور متحكم من Seeed studios تحتوي على أجهزة استشعار ومحركات واي فاي وشاشة تكلف حوالي 30 دولارًا أمريكيًا.
+
+> 💁 عند البحث في الإنترنت عن المتحكمات الدقيقة ، احذر من البحث عن المصطلح MCU لأن هذا سيعيد لك الكثير من النتائج ل Marvel السينمائي ، وليس للمتحكمات الدقيقة.
+
+
+تم تصميم وحدات التحكم الدقيقة بحيث تتم برمجتها للقيام بعدد محدود من المهام المحددة للغاية ، بدلاً من أن تكون أجهزة كمبيوتر للأغراض العامة مثل أجهزة الكمبيوتر الشخصية أو أجهزة Mac. باستثناء سيناريوهات محددة للغاية ، لا يمكنك توصيل الشاشة ولوحة المفاتيح والماوس واستخدامها في مهام الأغراض العامة.
+
+عادة ما تأتي مجموعات مطوري وحدات التحكم الدقيقة بأجهزة استشعار ومشغلات إضافية على متنها. تحتوي معظم اللوحات على واحد أو أكثر من مصابيح LED التي يمكنك برمجتها ، بالاضافة الى الأجهزة الأخرى مثل المقابس القياسية لإضافة المزيد من أجهزة الاستشعار أو المشغلات باستخدام أنظمة بيئية مختلفة للمصنعين ، أو أجهزة استشعار مدمجة (عادةً ما تكون الأكثر شيوعًا مثل مستشعرات درجة الحرارة). تحتوي بعض وحدات التحكم الدقيقة على اتصال لاسلكي مدمج مثل Bluetooth أو WiFi ، أو تحتوي على وحدات تحكم دقيقة إضافية على اللوحة لإضافة هذا الاتصال.
+
+
+> 💁 عادة ما يتم برمجة المتحكمات الدقيقة في C / C++.
+
+### أجهزة كمبيوتر أحادية اللوحة
+
+
+الكمبيوتر أحادي اللوحة هو جهاز حوسبة صغير يحتوي على جميع عناصر الكمبيوتر الكامل الموجودة على لوحة صغيرة واحدة. هذه هي الأجهزة التي لها مواصفات قريبة من سطح المكتب أو الكمبيوتر المحمول أو جهاز Mac ، وتعمل بنظام تشغيل كامل ، ولكنها صغيرة ، وتستخدم طاقة أقل ، وأرخص بكثير.
+
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+
+
+يعد Raspberry Pi أحد أشهر أجهزة الكمبيوتر أحادية اللوحة.
+
+مثل وحدة التحكم الدقيقة ، تحتوي أجهزة الكمبيوتر أحادية اللوحة على وحدة معالجة مركزية وذاكرة ودبابيس إدخال / إخراج ، ولكنها تحتوي على ميزات إضافية مثل شريحة رسومات للسماح لك بتوصيل الشاشات ومخرجات الصوت ومنافذ USB مثل كاميرات الويب أو وحدات التخزين الخارجية. يتم تخزين البرامج على بطاقات SD أو محركات أقراص ثابتة بالاضافة الى نظام التشغيل ، بدلاً من شريحة ذاكرة مدمجة في اللوحة.
+
+
+> 🎓 يمكنك التفكير في الكمبيوتر أحادي اللوحة كإصدار أصغر وأرخص من الكمبيوتر الشخصي أو جهاز Mac الذي تقرأ عليه ، مع إضافة دبابيس GPIO (إدخال / إخراج للأغراض العامة) للتفاعل مع المستشعرات والمحركات.
+
+
+أجهزة الكمبيوتر أحادية اللوحة هي أجهزة كمبيوتر كاملة الميزات ، لذا يمكن برمجتها بأي لغة. عادةً ما تتم برمجة أجهزة إنترنت الأشياء بلغة Python.
+
+
+### اختيارات الأجهزة لبقية الدروس
+
+
+تتضمن جميع الدروس اللاحقة مهام باستخدام جهاز إنترنت الأشياء للتفاعل مع العالم والتواصل مع السحابة. يدعم كل درس 3 خيارات للأجهزة - Arduino (باستخدام Seeed Studios Wio Terminal) ، أو كمبيوتر لوحة واحدة ، إما جهاز (Raspberry Pi 4) ، أو جهاز كمبيوتر افتراضي أحادي اللوحة يعمل على الكمبيوتر الشخصي أو جهاز Mac.
+
+يمكنك أن تقرأ عن الأجهزة اللازمة لإكمال جميع المهام في ملف [hardware guide](../../../../hardware.md)
+
+> 💁 لا تحتاج إلى شراء أي جهاز إنترنت الأشياء لإكمال المهام ، يمكنك القيام بكل شيء باستخدام جهاز كمبيوتر افتراضي أحادي اللوحة.
+
+تحديد الأجهزة التي تختارها متروك لك - يعتمد ذلك على ما لديك إما في المنزل أو في مدرستك ، ولغة البرمجة التي تعرفها أو تخطط لتعلمها. سيستخدم كلا النوعين من الأجهزة نفس نظام المستشعر البيئي ، لذلك إذا بدأت في مسار واحد ، يمكنك التغيير إلى الآخر دون الحاجة إلى استبدال معظم المجموعة. سيكون الكمبيوتر الافتراضي أحادي اللوحة مكافئًا للتعلم على Raspberry Pi ، حيث يمكن نقل معظم الشفرة إلى Pi إذا حصلت في النهاية على جهاز ومستشعرات.
+
+
+
+### مجموعة مطوري Arduino
+
+
+إذا كنت مهتمًا بتعلم تطوير وحدة التحكم الدقيقة ، فيمكنك إكمال المهام باستخدام جهاز Arduino. ستحتاج إلى فهم أساسي لبرمجة C / C++ ، حيث أن الدروس ستعلم فقط الكود ذي الصلة بإطار عمل Arduino ، وأجهزة الاستشعار والمشغلات المستخدمة ، والمكتبات التي تتفاعل مع السحابة.
+
+### مجموعة مطوري الكمبيوتر أحادي اللوحة
+
+إذا كنت مهتمًا بتعلم تطوير إنترنت الأشياء باستخدام أجهزة كمبيوتر أحادية اللوحة ، فيمكنك إكمال المهام باستخدام Raspberry Pi ، أو جهاز افتراضي يعمل على جهاز الكمبيوتر أو جهاز Mac.
+
+ستحتاج إلى فهم أساسي لبرمجة Python ، حيث ستعلم الدروس فقط التعليمات البرمجية ذات الصلة بالمستشعرات والمشغلات المستخدمة والمكتبات التي تتفاعل مع السحابة.
+
+
+> 💁 إذا كنت تريد تعلم البرمجة في Python ، فراجع سلسلتي الفيديو التاليتين:
+>
+> * [بايثون للمبتدئين](https://channel9.msdn.com/Series/Intro-to-Python-Development?WT.mc_id=academic-17441-jabenn)
+> * [المزيد من بايثون للمبتدئين](https://channel9.msdn.com/Series/More-Python-for-Beginners?WT.mc_id=academic-7372-jabenn)
+
+
+
+الواجبات ستستخدم Visual Studio Code
+
+إذا كنت تستخدم Raspberry Pi ، فيمكنك إما تشغيل Pi الخاص بك باستخدام إصدار سطح المكتب الكامل من Raspberry Pi OS ، والقيام بكل الترميز مباشرة على Pi باستخدام the Raspberry Pi OS version of VS Code ، أو قم بتشغيل Pi من جهاز الكمبيوتر أو جهاز Mac باستخدام VS Code مع Remote SSH extension التي يتيح لك الاتصال بـ Pi الخاص بك وتحرير التعليمات البرمجية وتصحيحها وتشغيلها كما لو كنت تقوم بالتشفير عليها مباشرةً.
+
+إذا كنت تستخدم خيار الجهاز الظاهري ، فستقوم بالتشفير مباشرة على جهاز الكمبيوتر الخاص بك. بدلاً من الوصول إلى المستشعرات والمشغلات ، ستستخدم أداة لمحاكاة هذا الجهاز لتوفير قيم أجهزة الاستشعار التي يمكنك تحديدها ، وإظهار نتائج المشغلات على الشاشة.
+
+
+## قم بإعداد جهازك
+
+
+قبل أن تبدأ في برمجة جهاز إنترنت الأشياء الخاص بك ، ستحتاج إلى إجراء قدر صغير من الإعداد. اتبع التعليمات ذات الصلة أدناه بناءً على الجهاز الذي ستستخدمه.
+
+> 💁 إذا لم يكن لديك جهاز بعد ، فارجع إلى [hardware guide](../../../../hardware.md) للمساعدة في تحديد الجهاز الذي ستستخدمه والأجهزة الإضافية التي تحتاج إلى شرائها. لا تحتاج إلى شراء أجهزة ، حيث يمكن تشغيل جميع المشاريع على أجهزة افتراضية.
+
+
+تتضمن هذه التعليمات ارتباطات إلى مواقع ويب تابعة لجهات خارجية من منشئي الأجهزة أو الأدوات التي ستستخدمها. هذا للتأكد من أنك تستخدم دائمًا أحدث الإرشادات للأدوات والأجهزة المختلفة.
+
+اعمل من خلال الدليل ذي الصلة لإعداد جهازك وإكمال مشروع "Hello World". ستكون هذه هي الخطوة الأولى في إنشاء ضوء ليلي لإنترنت الأشياء على الدروس الأربعة .
+
+* [ وحدة Arduino - Wio ](../wio-terminal.md)
+* [كمبيوتر ذو لوحة واحدة - Raspberry Pi](../pi.md)
+* [كمبيوتر ذو لوحة واحدة - جهاز افتراضي](../virtual-device.md)
+
+
+ ###
تطبيقات إنترنت الأشياء
+
+
يغطي إنترنت الأشياء مجموعة كبيرة من حالات الاستخدام عبر مجموعات قليلة:
+
+* انترنت الاشياء المستهلك
+* إنترنت الأشياء التجاري
+* إنترنت الأشياء الصناعي
+* انترنت الاشياء البنية التحتية
+
+
+
✅ قم ببعض البحث: لكل مجال من المجالات الموضحة أدناه ، ابحث عن مثال ملموس واحد لم يرد في النص.
+
+ ###
انترنت الاشياء المستهلك
+
+
+يشير IoT للمستهلكين إلى أجهزة IoT التي سيشتريها المستهلكون ويستخدمونها في المنزل. بعض هذه الأجهزة مفيدة بشكل لا يصدق ، مثل مكبرات الصوت الذكية وأنظمة التدفئة الذكية والمكانس الكهربائية الآلية. البعض الآخر مشكوك فيه في فائدته ، مثل الصنابير التي يتم التحكم فيها بالصوت والتي تعني بعد ذلك أنه لا يمكنك إيقاف تشغيلها لأن التحكم الصوتي لا يمكنه سماع صوت المياه الجارية.
+
+تعمل أجهزة إنترنت الأشياء للمستهلكين على تمكين الأشخاص من تحقيق المزيد في محيطهم ، وخاصة المليار شخص من ذوي الإعاقة. يمكن للمكانس الكهربائية الروبوتية توفير أرضيات نظيفة للأشخاص الذين يعانون من مشاكل في الحركة والذين لا يستطيعون تنظيف أنفسهم ، وتسمح الأفران التي يتم التحكم فيها بالصوت للأشخاص ذوي الرؤية المحدودة أو التحكم في المحرك بتسخين أفرانهم بصوتهم فقط ، ويمكن أن تسمح أجهزة المراقبة الصحية للمرضى بمراقبة الحالات المزمنة بأنفسهم بمزيد من الانتظام والمزيد من التحديثات التفصيلية عن ظروفهم. أصبحت هذه الأجهزة منتشرة في كل مكان حتى أن الأطفال الصغار يستخدمونها كجزء من حياتهم اليومية ، على سبيل المثال الطلاب الذين يقومون بالتعليم الافتراضي أثناء جائحة COVID يضبطون أجهزة ضبط الوقت على الأجهزة المنزلية الذكية لتتبع أعمالهم المدرسية أو أجهزة الإنذار لتذكيرهم باجتماعات الفصل القادمة.
+
+
+
+
✅ ما هي أجهزة IoT الاستهلاكية التي لديك أو في منزلك؟
+
+
+
+ ###
انترنت الأشياء التجاري
+
+
+يغطي إنترنت الأشياء التجاري استخدام إنترنت الأشياء في مكان العمل. في مكتب ما ، قد يكون هناك أجهزة استشعار إشغال وكاشفات حركة لإدارة الإضاءة والتدفئة لإبقاء الأضواء والتدفئة فقط عند عدم الحاجة إليها ، مما يقلل التكلفة وانبعاثات الكربون. في المصنع ، يمكن لأجهزة إنترنت الأشياء مراقبة مخاطر السلامة مثل عدم ارتداء العمال للقبعات الصلبة أو الضوضاء التي وصلت إلى مستويات خطيرة. في البيع بالتجزئة ، يمكن لأجهزة إنترنت الأشياء قياس درجة حرارة التخزين البارد ، وتنبيه صاحب المتجر إذا كانت الثلاجة أو الفريزر خارج نطاق درجة الحرارة المطلوبة ، أو يمكنهم مراقبة العناصر الموجودة على الأرفف لتوجيه الموظفين لإعادة تعبئة المنتجات التي تم بيعها. تعتمد صناعة النقل أكثر فأكثر على إنترنت الأشياء لمراقبة مواقع المركبات ، وتتبع الأميال على الطريق لشحن مستخدمي الطريق ، وتتبع ساعات السائق وانقطاع الامتثال ، أو إخطار الموظفين عند اقتراب مركبة من المستودع للاستعداد للتحميل أو التفريغ.
+
+
+
+
✅ ما هي أجهزة إنترنت الأشياء التجارية المتوفرة لديك في مدرستك أو مكان عملك؟
+
+
+
+ ###
انترنت الأشياء الصناعي
+
+
+إنترنت الأشياء الصناعي ، أو IIoT ، هو استخدام أجهزة إنترنت الأشياء للتحكم في الآلات وإدارتها على نطاق واسع. يغطي هذا مجموعة واسعة من حالات الاستخدام ، من المصانع إلى الزراعة الرقمية.
+
+تستخدم المصانع أجهزة إنترنت الأشياء بعدة طرق مختلفة. يمكن مراقبة الماكينات بأجهزة استشعار متعددة لتتبع أشياء مثل درجة الحرارة والاهتزاز وسرعة الدوران. يمكن بعد ذلك مراقبة هذه البيانات للسماح للجهاز بالتوقف إذا خرج عن تفاوتات معينة - يعمل على درجة حرارة عالية جدًا ويتم إيقاف تشغيله على سبيل المثال. يمكن أيضًا جمع هذه البيانات وتحليلها بمرور الوقت لإجراء الصيانة التنبؤية ، حيث ستنظر نماذج الذكاء الاصطناعي في البيانات المؤدية إلى الفشل ، وتستخدم ذلك للتنبؤ بحالات الفشل الأخرى قبل حدوثها.
+
+تعتبر الزراعة الرقمية مهمة إذا كان كوكب الأرض يريد إطعام العدد المتزايد من السكان ، خاصة بالنسبة لملياري شخص في 500 مليون أسرة تعيش على زراعة الكفاف يمكن أن تتراوح الزراعة الرقمية من عدد قليل من أجهزة الاستشعار بالدولار الواحد ، إلى الأجهزة التجارية الضخمة. يمكن للمزارع أن يبدأ بمراقبة درجات الحرارة واستخدام أيام الدرجات المتزايدة للتنبؤ بموعد جاهزية المحصول للحصاد. يمكنهم ربط مراقبة رطوبة التربة بأنظمة الري الآلية لمنح نباتاتهم القدر المطلوب من المياه ، ولكن ليس أكثر لضمان عدم جفاف محاصيلهم دون إهدار المياه. بل إن المزارعين يأخذون الأمر إلى أبعد من ذلك ويستخدمون الطائرات بدون طيار وبيانات الأقمار الصناعية والذكاء الاصطناعي لمراقبة نمو المحاصيل والأمراض وجودة التربة في مساحات شاسعة من الأراضي الزراعية
+
+
+
✅ ما هي أجهزة إنترنت الأشياء الأخرى التي يمكن أن تساعد المزارعين؟
+
+ ###
انترنت الاشياء البنية التحتية
+
+
يقوم إنترنت الأشياء للبنية التحتية بمراقبة والتحكم في البنية التحتية المحلية والعالمية التي يستخدمها الناس كل يوم.
+
+
المدن الذكية هي مناطق حضرية تستخدم أجهزة إنترنت الأشياء لجمع البيانات حول المدينة واستخدامها لتحسين كيفية إدارة المدينة. عادة ما يتم تشغيل هذه المدن بالتعاون بين الحكومات المحلية والأوساط الأكاديمية والشركات المحلية ، وتتبع وإدارة الأشياء التي تختلف من النقل إلى وقوف السيارات والتلوث. على سبيل المثال ، في كوبنهاغن ، الدنمارك ، يعد تلوث الهواء مهمًا للسكان المحليين ، لذلك يتم قياسه واستخدام البيانات لتوفير معلومات حول أنظف طرق ركوب الدراجات والركض.
+
+
+
شبكات الكهرباء الذكية تسمح بتحليلات أفضل للطلب على الطاقة من خلال جمع بيانات الاستخدام على مستوى المنازل الفردية. يمكن أن توجه هذه البيانات القرارات على مستوى الدولة بما في ذلك مكان بناء محطات طاقة جديدة ، وعلى المستوى الشخصي من خلال إعطاء المستخدمين رؤى حول مقدار الطاقة التي يستخدمونها ، وأوقات استخدامها ، وحتى اقتراحات حول كيفية تقليل التكاليف ، مثل لشحن السيارات الكهربائية في الليل.
+
+
✅ إذا كان بإمكانك إضافة أجهزة إنترنت الأشياء لقياس أي شيء تعيش فيه ، فماذا سيكون؟
+
+
+
+ ###
أمثلة على أجهزة إنترنت الأشياء التي قد تكون موجودة حولك
+
+
+
+ستندهش من عدد أجهزة إنترنت الأشياء الموجودة حولك. أكتب هذا من المنزل و لدي الأجهزة التالية متصلة بالإنترنت بميزات ذكية مثل التحكم في التطبيق أو التحكم الصوتي أو القدرة على إرسال البيانات إلي عبر هاتفي:
+
+* مكبرات صوت ذكية متعددة
+* ثلاجة وغسالة صحون وفرن وميكروويف
+* مرقاب كهرباء الألواح الشمسية
+* المقابس الذكية
+* جرس باب بالفيديو وكاميرات مراقبة
+* ترموستات ذكي مع عدة مستشعرات ذكية للغرفة
+* فتحت باب المرآب
+* أنظمة الترفيه المنزلي وأجهزة التلفزيون ذات التحكم الصوتي
+* أضواء
+* أجهزة تتبع اللياقة البدنية والصحة
+
+كل هذه الأنواع من الأجهزة لها مستشعرات و / أو مشغلات وتتحدث إلى الإنترنت. يمكنني معرفة ما إذا كان باب الكاراج الخاص بي مفتوحًا من هاتفي ، وأطلب من السماعة الذكية إغلاقها من أجلي. يمكنني حتى ضبطه على مؤقت ، لذلك إذا كان لا يزال مفتوحًا في الليل ، فسيتم إغلاقه تلقائيًا. عندما يرن جرس الباب ، يمكنني أن أرى من هاتفي من يوجد أينما كنت في العالم ، وأتحدث إليهم عبر مكبر صوت وميكروفون مدمجين في جرس الباب. يمكنني مراقبة الجلوكوز في الدم ومعدل ضربات القلب وأنماط النوم ، والبحث عن أنماط في البيانات لتحسين صحتي. يمكنني التحكم في الأضواء الخاصة بي عبر السحابة ، والجلوس في الظلام عندما ينقطع الاتصال بالإنترنت.
+
+
+---
+
+## تحدي
+
+
+ضع قائمة بأكبر عدد ممكن من أجهزة إنترنت الأشياء الموجودة في منزلك أو مدرستك أو مكان عملك - قد يكون هناك أكثر مما تعتقد!
+
+## المراجعة والدراسة الذاتية
+
+اقرأ عن مزايا وإخفاقات مشروعات إنترنت الأشياء للمستهلكين. تحقق من المواقع الإخبارية بحثًا عن مقالات عن الأوقات التي حدث فيها خطأ ، مثل مشكلات الخصوصية أو مشكلات الأجهزة أو المشكلات الناجمة عن نقص الاتصال.
+
+بعض الأمثلة:
+
+* قم بالاطلاع على حساب التويتر لأمثلة جيدة على انترنت الاشياء المستهلك **[Internet of Sh*t](https://twitter.com/internetofshit)** *(profanity warning)*
+* [c|net - My Apple Watch saved my life: 5 people share their stories](https://www.cnet.com/news/apple-watch-lifesaving-health-features-read-5-peoples-stories/)
+* [c|net - ADT technician pleads guilty to spying on customer camera feeds for years](https://www.cnet.com/news/adt-home-security-technician-pleads-guilty-to-spying-on-customer-camera-feeds-for-years/) *(trigger warning - non-consensual voyeurism)*
+
+### الواجب
+
+[التحقيق في مشروع إنترنت الأشياء](assignment.ar.md)
+
+
+
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/README.id.md b/1-getting-started/lessons/1-introduction-to-iot/translations/README.id.md
new file mode 100644
index 00000000..8dacf239
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/README.id.md
@@ -0,0 +1,99 @@
+# Pengenalan IoT
+
+
+
+> Sketsa dibuat oleh [Nitya Narasimhan](https://github.com/nitya). Klik gambar untuk versi yang lebih besar.
+
+## Kuis prakuliah
+
+[Kuis prakuliah](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/1)
+
+## Pengantar
+
+Pelajaran ini mencakup beberapa topik pengantar mengenai Internet of Things, dan membuat Anda dapat mempersiapkan dan mengatur perangkat keras Anda.
+
+Dalam pelajaran ini kita akan membahas:
+
+* [Apa itu 'Internet of Things'?](#apa-itu-internet-of-things)
+* [Perangkat IoT](#perangkat-iot)
+* [Mengatur Perangkat Anda](#set-up-your-device)
+* [Penerapan dari IoT](#applications-of-iot)
+* [Contoh Perangkat IoT yang Mungkin Anda Punya di Sekitar](#examples-of-iot-devices-you-may-have-around-you)
+
+## Apa itu 'Internet of Things'?
+
+Istilah 'Internet of Things' diciptakan oleh [Kevin Ashton](https://wikipedia.org/wiki/Kevin_Ashton) pada tahun 1999, yang merujuk pada menghubungkan Internet ke dunia fisik melalui sensor. Sejak saat itu, istilah IoT digunakan untuk menggambarkan perangkat apa pun yang berinteraksi dengan dunia fisik di sekitarnya, baik dengan mengumpulkan data dari sensor, atau menyediakan interaksi dunia nyata melalui aktuator (perangkat yang melakukan sesuatu seperti menyalakan sakelar atau menyalakan LED), dan terhubung ke perangkat lain atau Internet.
+
+> **Sensor** mengumpulkan informasi dari lingkungan, seperti mengukur kecepatan, suhu, atau lokasi.
+>
+> **Aktuator** mengubah sinyal listrik menjadi interaksi pada lingkungan seperti memicu sakelar, menyalakan lampu, membuat suara, atau mengirim *control signal* ke perangkat keras lain, misalnya untuk menyalakan soket listrik.
+
+IoT sebagai suatu bidang teknologi lebih dari sekadar perangkat. Hal ini mencakup layanan berbasis cloud yang dapat memproses data sensor, atau mengirim permintaan ke aktuator yang terhubung ke perangkat IoT. IoT juga mencakup perangkat yang tidak memiliki atau tidak memerlukan konektivitas Internet, sering disebut sebagai *edge devices* atau perangkat tepi. Perangkat tepi adalah perangkat yang dapat memproses dan merespons data sensor itu sendiri, biasanya menggunakan model AI yang dilatih di cloud.
+
+IoT merupakan bidang teknologi yang berkembang pesat. Diperkirakan pada akhir tahun 2020, 30 miliar perangkat IoT dikerahkan dan terhubung ke Internet. Jika melihat ke masa depan, diperkirakan pada tahun 2025, perangkat IoT akan mengumpulkan hampir 80 zettabytes data atau 80 triliun gigabyte. Banyak sekali bukan?
+
+
+
+✅ Lakukan sedikit riset: Berapa banyak data yang dihasilkan oleh perangkat IoT yang benar-benar digunakan, dan berapa banyak yang terbuang? Mengapa begitu banyak data yang diabaikan?
+
+Data ini adalah kunci kesuksesan IoT. Untuk menjadi pengembang IoT yang sukses, Anda perlu memahami data yang perlu Anda kumpulkan, cara mengumpulkannya, cara membuat keputusan berdasarkan data tersebut, dan cara menggunakan keputusan tersebut untuk berinteraksi dengan lingkungan fisik jika diperlukan.
+
+## Perangkat IoT
+
+Huruf **T** di IoT adalah singkatan dari **Things** - perangkat yang berinteraksi dengan lingkungan fisik di sekitarnya baik dengan mengumpulkan data dari sensor atau menyediakan interaksi dunia nyata melalui aktuator.
+
+Perangkat untuk produksi atau penggunaan komersial, seperti pelacak kebugaran konsumen, atau pengontrol mesin industri, biasanya dibuat khusus. Mereka menggunakan papan sirkuit khusus, bahkan mungkin prosesor khusus, yang dirancang untuk memenuhi kebutuhan tugas tertentu, apakah itu cukup kecil untuk muat di pergelangan tangan, atau cukup kuat untuk bekerja di lingkungan pabrik dengan suhu tinggi, stres tinggi, atau getaran tinggi.
+
+Sebagai pengembang yang belajar tentang IoT atau membuat prototipe perangkat, Anda harus mulai dengan *developer kit* atau perangkat pengembang. Perangkat tersebut adalah perangkat IoT untuk tujuan umum yang dirancang untuk digunakan pengembang, seringkali dengan fitur yang tidak akan Anda miliki di perangkat produksi, seperti satu set pin eksternal untuk menghubungkan sensor atau aktuator, perangkat keras untuk mendukung debugging, atau sumber daya tambahan yang akan menambah biaya yang tidak perlu saat melakukan produksi manufaktur.
+
+Perangkat pengembang ini biasanya terbagi dalam dua kategori - mikrokontroler dan komputer papan tunggal. Perangkat tersebut akan diperkenalkan di sini, dan kita akan membahas lebih detail di pelajaran berikutnya.
+
+> 💁 Ponsel Anda juga dapat dianggap sebagai perangkat IoT tujuan umum, dengan sensor dan aktuator bawaan, dengan berbagai aplikasi yang menggunakan sensor dan aktuator dengan cara yang berbeda dengan layanan cloud yang berbeda. Anda bahkan dapat menemukan beberapa tutorial IoT yang menggunakan aplikasi ponsel sebagai perangkat IoT.
+
+### Mikrokontroler
+
+Mikrokontroler atau Pengendali mikro (juga disebut sebagai MCU, kependekan dari microcontroller unit) adalah komputer kecil yang terdiri dari:
+
+🧠 Satu atau lebih unit pemrosesan pusat (CPU) - 'otak' mikrokontroler yang menjalankan program Anda
+
+💾 Memori (RAM dan memori program) - tempat program, data, dan variabel Anda disimpan
+
+🔌 Koneksi input/output (I/O) yang dapat diprogram - untuk berbicara dengan periferal eksternal (perangkat yang terhubung) seperti sensor dan aktuator
+
+Mikrokontroler biasanya merupakan perangkat komputasi berbiaya rendah, dengan harga rata-rata untuk yang digunakan dalam perangkat keras khusus turun menjadi sekitar US$0,50, dan beberapa perangkat bahkan semurah US$0,03. Perangkat pengembang dapat ditemukan mulai dari harga US$4, dengan biaya meningkat karena Anda menambahkan lebih banyak fitur. [Wio Terminal](https://www.seeedstudio.com/Wio-Terminal-p-4509.html), perangkat pengembang mikrokontroler dari [Seeed studios](https://www.seeedstudio.com) yang memiliki sensor , aktuator, WiFi, dan layar berharga sekitar US$30.
+
+
+
+> 💁 Saat mencari mikrokontroler di Internet, berhati-hatilah saat mencari istilah **MCU** karena ini akan mengembalikan banyak hasil untuk Marvel Cinematic Universe, bukan mikrokontroler.
+
+Mikrokontroler dirancang untuk diprogram untuk melakukan sejumlah tugas yang sangat spesifik, daripada menjadi komputer dengan tujuan umum seperti PC atau Mac. Kecuali untuk skenario yang sangat spesifik, Anda tidak dapat menghubungkan monitor, keyboard, dan mouse dan menggunakannya untuk tugas umum.
+
+Perangkat pengembang mikrokontroler biasanya dilengkapi dengan sensor dan aktuator tambahan. Sebagian besar papan (board) akan memiliki satu atau lebih LED yang dapat Anda program, bersama dengan perangkat lain seperti steker standar untuk menambahkan lebih banyak sensor atau aktuator menggunakan berbagai ekosistem pabrikan atau sensor bawaan (biasanya yang paling populer seperti sensor suhu). Beberapa mikrokontroler memiliki konektivitas nirkabel bawaan seperti Bluetooth atau WiFi atau memiliki mikrokontroler tambahan di papan untuk menambahkan konektivitas ini.
+
+> 💁 Mikrokontroler biasanya diprogram dalam bahasa C/C++.
+
+### Komputer papan tunggal
+
+Komputer papan tunggal adalah perangkat komputasi kecil yang memiliki semua elemen komputer lengkap yang terdapat pada satu papan kecil. Ini adalah perangkat yang memiliki spesifikasi yang mirip dengan desktop atau laptop PC atau Mac, menjalankan sistem operasi lengkap, tetapi berukuran kecil, menggunakan lebih sedikit daya, dan jauh lebih murah.
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+Raspberry Pi adalah salah satu komputer papan tunggal yang paling populer.
+
+Seperti mikrokontroler, komputer papan tunggal memiliki CPU, memori, dan pin input/output, tetapi mereka memiliki fitur tambahan seperti chip grafis untuk memungkinkan Anda menghubungkan monitor, output audio, dan port USB untuk menghubungkan keyboard, mouse, dan perangkat USB standar lainnya seperti webcam atau penyimpanan eksternal. Program disimpan di kartu SD atau hard drive bersama dengan sistem operasi, bukan di chip memori yang terpasang di papan.
+
+> 🎓 Anda dapat menganggap komputer papan tunggal sebagai versi PC atau Mac yang lebih kecil dan lebih murah, dengan tambahan pin GPIO (general-purpose input/output) untuk berinteraksi dengan sensor dan aktuator.
+
+Komputer papan tunggal adalah komputer berfitur lengkap, sehingga dapat diprogram dalam bahasa apa pun. Perangkat IoT biasanya diprogram dengan Python.
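+
+Sebagai gambaran betapa sederhananya memulai, berikut sketsa minimal program Python bergaya 'Hello World' yang serupa dengan yang akan Anda buat pada panduan pengaturan perangkat, hanya untuk memastikan Python dan editor Anda terpasang dengan benar:
+
+```python
+# Program minimal untuk memeriksa bahwa Python sudah terpasang dan berjalan
+print('Hello World!')
+```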
+
+### Pilihan perangkat keras untuk sisa pelajaran
+
+Semua pelajaran selanjutnya mencakup tugas menggunakan perangkat IoT untuk berinteraksi dengan dunia fisik dan berkomunikasi dengan cloud. Setiap pelajaran mendukung 3 pilihan perangkat - Arduino (menggunakan Terminal Seeed Studios Wio), atau komputer papan tunggal, baik perangkat fisik (Raspberry Pi 4) atau komputer papan tunggal virtual yang berjalan di PC atau Mac Anda.
+
+Anda dapat membaca tentang perangkat keras yang diperlukan untuk menyelesaikan semua tugas di [panduan perangkat keras](../../../../hardware.md).
+
+> 💁 Anda tidak perlu membeli perangkat keras IoT apa pun untuk menyelesaikan tugas, Anda dapat melakukan semuanya menggunakan komputer papan tunggal virtual.
+
+Perangkat keras mana yang Anda pilih terserah Anda - itu tergantung pada apa yang Anda miliki di rumah atau di sekolah Anda, dan bahasa pemrograman apa yang Anda ketahui atau rencanakan untuk dipelajari. Kedua varian perangkat keras akan menggunakan ekosistem sensor yang sama, jadi jika Anda memulai pada salah satu jalur, Anda dapat beralih ke jalur lain tanpa harus mengganti sebagian besar perangkat pengembang. Komputer papan tunggal virtual akan setara dengan pembelajaran di Raspberry Pi, dengan sebagian besar kode dapat ditransfer ke Pi jika Anda akhirnya mendapatkan perangkat dan sensor.
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/README.zh.md b/1-getting-started/lessons/1-introduction-to-iot/translations/README.zh.md
new file mode 100644
index 00000000..9ee0838c
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/README.zh.md
@@ -0,0 +1,222 @@
+# 物联网(IoT)简介
+
+
+
+> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). 如果你想看比较大的图片,请点击它。
+
+## 知识检查(初)
+
+[知识检查(初)](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/1)
+
+## 简介
+
+本课程涵盖一些介绍物联网(IoT)的主题,以及教你怎么开始设置你的硬件。
+
+本课程将涵盖:
+
+* [什么是 ‘物联网(IoT)’?](#what-is-the-internet-of-things)
+* [IoT 设备](#iot-devices)
+* [设置你的设备](#set-up-your-device)
+* [IoT 的应用场景](#applications-of-iot)
+* [在你的周围的IoT 设备例子](#examples-of-iot-devices-you-may-have-around-you)
+
+## 什么是 ‘物联网(IoT)’?
+
+为了形容运用感应器来链接网络与物质世界,1999年 [凯文·阿什顿(Kevin Ashton)](https://wikipedia.org/wiki/Kevin_Ashton) 生造了‘物联网(IoT)’这个词。自从那时起,这个生造词被用来形容任何能够跟周围的世界交互的设备。这些设备可以使用感应器收集数据或者使用执行器(会做事—例如打开开关、发光二极管等—的设备)在物质世界完成任务。通常执行器会连接到其它设备或者网络。
+
+> **感应器** 从世界中收集数据,例如:速度、温度或地点。
+>
+> **执行器** 将电信号转换成行动,例如:打开灯,发出声音或将控制信号传送到其它硬件。
+
+IoT 不仅是设备,还包含云服务;这些服务能处理数据,或者将请求传送给跟 IoT 设备有链接的执行器。它也包括没有链接的设备;它们通常被称为“边缘设备”,而且它们有能力用基于云的AI模型自己处理与回应感应器的数据。
+
+IoT 是一个快速发展的技术领域。专家预计2020年底,世界上有三百亿 IoT 设备跟网络有链接。专家也预计2025年,IoT 设备将收集大概 80 ZB(80万亿 GB)的数据。那是个非常大的数量!
+
+
+
+✅ 做一点儿研究: IoT 设备收集的数据,多少是有用的、多少是被浪费的?为什么那么多数据被忽略了?
+
+对于 IoT 的成功,这些数据是不可或缺的。想成为一名有成就的 IoT 开发者,就必须了解你需要收集的数据、怎么收集它,怎么利用它来作出决定以及如果有必要的话,怎么用那些决定来跟物质世界交互。
+
+## IoT 设备
+
+IoT 的 **T** 代表 **Things**(物)—— 可以跟物质世界交互的设备;它们使用感应器收集数据或者使用执行器在物质世界完成任务。
+
+为生产或商业用途的设备(例如消费者健身追踪器、工业机器控制器等)通常是定制的。它们使用定制的电路板,有时甚至是定制的处理器,设计来满足某项任务的需求,例如要小到可以戴在手腕上,或者要耐用到可以承受高温度、高压力、高振动的工厂环境。
+
+无论你正在学 IoT 还是在创建原型设备,作为一名 IoT 开发者,你都需要从一个开发者套件开始。这些是为 IoT 开发者设计的通用设备,通常带有生产设备上不会有的特点,例如用来链接感应器和执行器的外部引脚、帮助排除错误的硬件,或者在大规模生产时会增加不必要成本的额外资源。
+
+这些开发者套件通常有两种:微控制器和单板机。我们会在这儿介绍它们,而将在下一课更详细地解释它们。
+
+> 💁 你的手机也算是一个通用 IoT 设备;它拥有感应器与执行器,以及有不同应用程序用不同的方式来跟不同云服务利用它们。你甚至可以找到几个用手机的应用程序当作 IoT 设备的 IoT 教程。
+
+### 微控制器
+
+一个微控制器(MCU)是一个小电脑。它包含:
+
+🧠 至少一个中央处理器(CPU);它就是微控制器的“脑”——运行你的程序
+
+💾 内存(随机存取存储器(RAM)和程序存储器)——储存你的程序、数据和变量的地方
+
+🔌 可编程输入输出(I/O)连接——为了跟外围设备(如感应器或执行器)沟通
+
+微控制器通常是较便宜的计算设备;自定义生成硬件的平均成本下降到 US$0.50,而也有些设备到 US$0.03 那么便宜。开发者套件的价钱可以从 US$4 起,但你加上越多特点,价钱就越高。[Wio Terminal](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) 是个来自 [Seeed studios](https://www.seeedstudio.com) 的微控制器;它包含感应器、执行器、Wi-Fi和一个屏幕,总共算起来大约 US$30。
+
+
+
+> 💁 当你在网上寻找微控制器时,要小心用 **MCU** 这个词,因为这会带来许多关于漫威电影宇宙(Marvel Cinematic Universe)的搜索结果,而不是关于微控制器的。
+
+微控制器的设计允许它们被编程完成几个非常特定的任务,不像比较通用的电脑。除了一些很具体的情况,你无法连接显示器、键盘和鼠标并利用它完成通用任务。
+
+微控制器开发者套件平时包括额外的感应器和执行器。大多数的板会有至少一个能被编程的发光二极管(LED),还有其它设备如标准插口,用来通过不同厂商的生态系统链接更多感应器或执行器,或者带有内置感应器(平时最常见的如温度感应器)。有些微控制器有内置的无线连接如蓝牙或 Wi-Fi,或者有额外微控制器用来加这个连接性能。
+
+> 💁 我们通常用 C 或 C++ 来为微控制器写程序。
+
+### 单板机
+
+单板机指的是一个小型计算设备;它把一个电脑的所有要素装在单单一个小板上。这些设备的规格跟台式电脑或笔记本电脑比较相似,它们也运行完整的操作系统,但它们较小,用比较少电力以及便宜多了。
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+Raspberry Pi 是其中最流行的单板机。
+
+就像一个微控制器,单板机有个中央处理器、内存和输入输出引脚,但它们也有额外的特点如一个让你链接显示器的图形芯片、音频输出与 USB 端口,让你链接键盘、鼠标和其它普通 USB 设备如网络摄像头和外置储存。程序与操作系统一起被储存在 SD 卡或硬盘上,而不是储存在板上内置的存储芯片里。
+
+> 🎓 你可以把单板机当成一个较小、较便宜的电脑版本,就像你现在正在用来读这些的电脑。可是,单板机还加了通用输入/输出端口,让你和感应器、执行器交互。
+
+单板机有电脑的所有要素,所以你可以用任何编程语言来为它写程序。我们通常用 Python 为 IoT 设备写程序。
+
+### 为其余的课的硬件选择
+
+其余的课程都包括作业,而且你必须用一个 IoT 设备跟物质世界交互以及跟云沟通。每个课程会支持3种设备选择:Arduino(通过一个 Seeed Studios Wio Terminal),或者一个单板机——一个实体设备(一个 Raspberry Pi 4)或一个在你的电脑上运行的虚拟单板机。
+
+你能在[硬件手册](../../../../hardware.md)查到需要用来完成作业的硬件。
+
+> 💁 你不需要为了完成作业而买任何 IoT 硬件;所有东西可以使用一个虚拟单板机来做。
+
+要使用哪个硬件是你的选择,取决于你家里或学校里有什么,以及你知道或想学的编程语言。两种硬件都利用同样的感应器系统,所以万一你想途中改变你的选择,你也不需要替换大部分的套件。用虚拟单板机学跟用一个 Raspberry Pi 学差不多一模一样;如果你后来得到一个设备和感应器,大多数的程序都可以转移到你的 Pi 上。
+
+### Arduino 开发者套件
+
+如果你对微控制器的开发感兴趣,那你可以用一个 Arduino 设备完成作业。你需要对 C 或 C++ 的编程语言有基本的理解,因为将来的课程只会教关于 Arduino 框架的程序、需要用到的感应器和执行器以及跟云交互的库。
+
+作业将用 [Visual Studio Code](https://code.visualstudio.com/?WT.mc_id=academic-17441-jabenn) 跟[为微控制器开发的 PlatformIO 扩展](https://platformio.org)。如果你对 Arduino IDE 熟悉的话,你也能用它,但我们不会提供指示。
+
+### 单板机开发者套件
+
+如果你对使用单板机学 IoT 开发有兴趣,你可以用一个 Raspberry Pi 完成作业,或者在你的电脑运行的虚拟设备。
+
+你需要对 Python 有基本的理解,因为将来的课程只会教关于需要用到的感应器和执行器的程序以及跟云交互的库。
+
+> 💁 如果你想学怎么用 Python 写程序,看一看一下的两个视频系列:
+>
+> * [Python for beginners(为初学者的 Python)](https://channel9.msdn.com/Series/Intro-to-Python-Development?WT.mc_id=academic-17441-jabenn)
+> * [More Python for beginners(更多为初学者的 Python)](https://channel9.msdn.com/Series/More-Python-for-Beginners?WT.mc_id=academic-7372-jabenn)
+
+作业将用 [Visual Studio Code](https://code.visualstudio.com/?WT.mc_id=academic-17441-jabenn)。
+
+如果你在用一个 Raspberry Pi,为了运行你的 Pi,你可以通过完整的桌面 Raspberry Pi 操作系统以及用 [VS Code 的 Raspberry Pi 操作系统版本](https://code.visualstudio.com/docs/setup/raspberry-pi?WT.mc_id=academic-17441-jabenn)直接在你的 Pi 写程序,或者把它当成一个无头设备,从你的电脑用 VS Code 的 [Remote SSH 扩展](https://code.visualstudio.com/docs/remote/ssh?WT.mc_id=academic-17441-jabenn)写程序;这个扩展让你链接你的 Pi,并编辑你的程序、从程序排除错误和运行程序,就像你直接在 Pi 上写程序一样。
+
+如果你选择用虚拟设备,你会直接在你的电脑上写程序。你不会读取感应器和执行器,反而你会用模拟工具来定义传感器值以及在屏幕上查看执行器的结果。
+
+## 设置你的设备
+
+在你为你的 IoT 设备写程序前,你需要做点设置。请根据你将用到的设备,按照以下的指示。
+
+> 💁 如果你还缺少了一个设备,请用[硬件手册](../../../../hardware.md) 帮你决定你要用的是哪个设备,以及你需要买的额外硬件。你不必买硬件,因为你可以用虚拟硬件运行所有的项目。
+
+这些指示包括第三方网站的链接;这些网站由你将用到的硬件或工具的创造者。这是为了确保你会一直在按照各种工具和硬件的最新指示。
+
+按照相当的指南来设置你的设备,并完成一个“Hello World”项目。我们将在这个介绍部分用4个课程创造一个 IoT 夜灯,而这是第一步。
+
+* [Arduino:Wio Terminal](../wio-terminal.md)
+* [单板机:Raspberry Pi](../pi.md)
+* [单板机:虚拟设备](../virtual-device.md)
+
+## IoT 的应用场景
+
+IoT 有好多用例,跨过几组:
+
+* 消费者 IoT
+* 商业 IoT
+* 工业 IoT
+* 基础设施 IoT
+
+✅ 做一点儿研究:关于以下的每个范围,找一下一个不在内容里的详细例子。
+
+### 消费者 IoT
+
+消费者 IoT 指的是消费者将买以及在家里用的 IoT 设备。这些设备中有的非常有用,例如:智能音箱、智能供暖和机器人吸尘器。其它的有些用例比较可疑,好像声控水龙头;你无法把它们关掉因为有了流水的声音,声控就无法听到你的语音。
+
+消费者 IoT 设备使人能够在他们的周围做成更多东西,尤其是世界上的10亿个残障人士。机器人吸尘器能为移动有困难、无法自己清扫的人提供干净的地板、声控烤箱让视力或移动力较差的人用自己的语音来给烤箱加热、健康监测器使患者能够自己监测自己的慢性病情况并定期得到更加详细的信息。这些设备将变得普及到连小孩子也在天天用着它们,如学生们在冠状病毒疫情时进行居家学习、利用智能家居设备的计时器来记录他们的功课或者设置闹钟来提醒他们参与他们未来的课程。
+
+✅ 你人身上或家里有什么消费者 IoT 设备呢?
+
+### 商业 IoT
+
+商业 IoT 包含公司里的 IoT 用例。在办公室里,有可能会有空间占用传感器和移动探测器被用来管理灯光和供暖,在不需要的时候把它们关掉,以避免浪费钱和减少碳排放。在工厂里,IoT 设备可以监测安全隐患,例如没有戴安全帽的人员或者过大的噪音。在店里,IoT 设备可以量冷库的温度,如果某个冰箱的温度超出理想范围就通知店主,或者它们可以监测架子上的产品,提醒工作人员为卖完的产品补货。交通运输业也越来越依靠 IoT 设备来监测交通工具的地点、为道路使用者收费记录行驶里程、记录司机的工作时间和休息时间以符合规定,或者在有货车即将来到仓库时通知工作人员,为上货或下货做准备。
+
+✅ 你的学校或公司里有什么商业 IoT 设备呢?
+
+### 工业 IoT (IIoT)
+
+工业 IoT(也称为 “IIoT”)指的是使用 IoT 设备在大范围上来控制与管理机械。这包含很多用例,从工厂到数字农业。
+
+IoT 设备在工厂中有很多用例。它们能使用各种感应器(如:温度、振动、旋转速度等)来监测机械。我们将可以观察这些数据,而如果机器超出某些公差(如它的温度太高),我们可以把它停下来。我们也能收集并分析这些数据,让人工智能(AI)模型看故障前的数据,再利用它预报其它未来的故障;这就叫做“预测性维护”。
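+
+下面是一个极简的 Python 草图,只用来说明上文"超出公差就停机"的逻辑;其中的指标名称、读数和公差范围都是假设的示例值,并不是本课程后面会用到的真实代码:
+
+```python
+# 极简示例:当任何一项传感器读数超出公差时,标记需要停机(数值纯属假设)
+LIMITS = {
+    "temperature_c": (10, 85),    # 允许的温度范围(示例值)
+    "vibration_mm_s": (0, 4.5),   # 允许的振动范围(示例值)
+}
+
+def check_readings(readings):
+    """返回超出公差的指标列表;列表为空表示一切正常。"""
+    problems = []
+    for name, value in readings.items():
+        low, high = LIMITS[name]
+        if not (low <= value <= high):
+            problems.append(name)
+    return problems
+
+# 模拟一次读数:温度过高,应触发停机
+sample = {"temperature_c": 92.0, "vibration_mm_s": 2.1}
+out_of_tolerance = check_readings(sample)
+if out_of_tolerance:
+    print("停机并报警:", out_of_tolerance)
+```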
+
+为了养活不断增长的人口,数字农业必不可少,尤其是对于依靠[自给农业](https://wikipedia.org/wiki/Subsistence_agriculture)的5亿家户中的20亿人。数字农业的范围可以从几块钱的感应器,到大型的商用设备。首先,一位农民可以监测温度以及用[生长度日(GDD)](https://wikipedia.org/wiki/Growing_degree-day),预测农作物什么时候可以收割。再者,为了确保植物有充足的水量又不浪费太多水,他们可以把土壤水分监测连接到自动灌溉系统。最后,农民可以更进一步,用无人驾驶飞机、卫星数据和人工智能来监测大面积农田的作物生长、疾病和土壤质量。
+
+✅ 还有什么 IoT 设备可以用来帮助农民呢?
+
+### 基础设施 IoT
+
+基础设施 IoT 指的是监测与控制民众天天用的本地与全球基础设施。
+
+[智慧城市](https://wikipedia.org/wiki/Smart_city)是用 IoT 设备来收集关于城市的数据再利用它们来改善城市运行方式的城市地区。这些城市通常靠本地政府、学术界和本地企业之间的合作,监测和管理各种东西——从交通到污染。一个例子是在哥本哈根(丹麦王国首都),空气污染对人民来说非常重要,所以城市量它,再用它给人民提供关于最环保的骑自行车路线与步道的信息。
+
+[智能电网](https://wikipedia.org/wiki/Smart_grid)以收集各各家户使用电力的数据的方式来允许更好的电力需求分析。这些数据能影响国家的某些决定,包括在哪里建新发电厂。它们也能影响我们的个人决定;它们让我们明确地了解我们使用多少电力、我们在什么时候使用电力,还可以为我们提供减少浪费的意见,例如晚上为电动汽车充电。
+
+✅ 假如你可以在你住的地方加 IoT 设备来量任何东西,那会是什么?
+
+## 在你的周围的 IoT 设备例子
+
+你会惊讶于你身边有多少 IoT 设备。我正在家里写这个课程的内容,而却在我的周围通过智能特点(应用程式控制、语音控制、通过手机把数据寄给我的能力)跟互联网有连接有以下的设备:
+
+* 好几个智能音箱
+* 冰箱、洗碗机、烤箱和微波炉
+* 为太阳能电池板的电量监测器
+* 智能插座
+* 摄像门铃和监视器
+* 有好几个在房间里的智能传感器的智能恒温器
+* 车库开门器
+* 家庭娱乐系统和声控电视
+* 灯光
+* 健身和健康追踪器
+
+这些设备都有感应器和/或执行器与跟互联网沟通。从我的手机,我能看得出如果我的车库门还开着,再叫我的智能音箱替我把它关上。我甚至能用计时器,那万一它晚上还开着,它可以自动关上。每当我的门铃响着,无论我在世界的哪儿个地方,我都能从手机看到门前是谁,并通过门铃的音箱和麦克风跟他们沟通。我能监测我的血糖、心率和睡眠周期,再用数据中的趋势来改善自己的健康状况。我能通过云控制我的灯,而当我的网络连接出状况,我能在黑暗中坐着。
+
+---
+
+## 🚀 挑战
+
+将在你的家、学校或工作场所中的 IoT 设备列成单子——有可能比你的想象中还要多!
+
+## 知识检查(后)
+
+[知识检查(后)](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/2)
+
+## 复习和自学
+
+读一下关于消费者 IoT 项目的成功和失败。在新闻网站上找一找关于失败的文章,例如:隐私问题、硬件问题或者因缺少连接性能而发生的问题。
+
+几个例子:
+
+* 这个推特户口 **[Internet of Sh*t](https://twitter.com/internetofshit)** *(亵渎警告)* 有几个关于消费者 IoT 失败的好例子。
+* [c|net - My Apple Watch saved my life: 5 people share their stories](https://www.cnet.com/news/apple-watch-lifesaving-health-features-read-5-peoples-stories/)
+* [c|net - ADT technician pleads guilty to spying on customer camera feeds for years](https://www.cnet.com/news/adt-home-security-technician-pleads-guilty-to-spying-on-customer-camera-feeds-for-years/) *(触发警告:未经同意的偷窥)*
+
+## 作业
+
+[调查一个物联网(IoT)项目](assignment.md)
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.ar.md b/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.ar.md
new file mode 100644
index 00000000..36c331ab
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.ar.md
@@ -0,0 +1,20 @@
+
+
+
+# التحقيق في مشروع إنترنت الأشياء
+
+## تعليمات
+
+هناك العديد من مشاريع إنترنت الأشياء الكبيرة والصغيرة التي يتم طرحها على مستوى العالم ، من المزارع الذكية إلى المدن الذكية ، ومراقبة الرعاية الصحية ، والنقل ، أو استخدام الأماكن العامة.
+
+ابحث في الويب عن تفاصيل المشروع الذي يثير اهتمامك ، ومن الأفضل أن يكون مشروعًا قريبًا من المكان الذي تعيش فيه. اشرح الجوانب الإيجابية والسلبية للمشروع ، مثل الفائدة التي تأتي منه ، وأي مشاكل يسببها وكيف يتم أخذ الخصوصية في الاعتبار.
+
+## الموضوع
+
+
+| المعايير | نموذجي | كافية | يحتاج إلى تحسين |
+| -------- | --------- | -------- | ----------------- |
+| اشرح الإيجابيات والسلبيات | قدّم شرحاً واضحاً لأوجه الإيجابيات والسلبيات للمشروع | قدم شرحا مختصرا للجوانب الإيجابية والسلبية | لم يشرح الإيجابيات أو السلبيات |
+
+
+
\ No newline at end of file
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.bn.md b/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.bn.md
new file mode 100644
index 00000000..5eb0fa36
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.bn.md
@@ -0,0 +1,13 @@
+# একটি IoT প্রজেক্ট পর্যালোচনা
+
+## নির্দেশাবলী
+
+স্মার্ট ফার্ম থেকে শুরু করে স্মার্ট শহরগুলিতে, স্বাস্থ্যসেবা পর্যবেক্ষণ, পরিবহন এবং জনসাধারণের ব্যবহারের জন্য বিশ্বব্যাপী বড় এবং ছোট আকারের অনেক আইওটি প্রকল্প আসছে।
+
+আপনার বসবাসের জায়গার আশেপাশের এমন কোন প্রকল্প থাকলে, সেটি সম্পর্কে ইন্টারনেটে সার্চ করুন। প্রজেক্টটির ইতিবাচক এবং নেতিবাচক দিকগুলো (যেমন: এটির কারণে কী কী সুবিধা হচ্ছে, কোন সমস্যা তৈরী করছে কিনা বা তথ্যের গোপনীয়তা সংক্রান্ত বিষয়গুলি কীভাবে দেখা হচ্ছে) ব্যখ্যা করুন।
+
+## এসাইনমেন্ট মূল্যায়ন মানদন্ড
+
+| ক্রাইটেরিয়া | দৃষ্টান্তমূলক ব্যখ্যা | পর্যাপ্ত ব্যখ্যা | আরো উন্নতির প্রয়োজন |
+| -------- | --------- | -------- | -----------------|
+| ইতিবাচক এবং নেতিবাচক দিকগুলোর ব্যখ্যা করুন | বিশদভাব ব্যখ্যা করা হয়েছে | সংক্ষিপ্ত ব্যখ্যা করা হয়েছে | ভালোভাবে ব্যখ্যা করা হয়নি |
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.zh.md b/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.zh.md
new file mode 100644
index 00000000..fbcf4b2f
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/assignment.zh.md
@@ -0,0 +1,13 @@
+# 调查一个物联网(IoT)项目
+
+## 指示
+
+从智慧农场到智慧城市、医疗保健监测系统、交通或利用公共空间等,世界上有非常多IoT项目。
+
+在互联网寻找一个让你感兴趣的IoT项目的细节,最好是离你不太远的。解释一下项目的好处与坏处,例如:它带来的益处、它带来的任何麻烦以及他怎么顾及隐私。
+
+## 评分表
+
+| 条件 | 优秀 | 一般 | 需改进 |
+| -------- | --------- | -------- | ----------------- |
+| 解释项目的好处与坏处 | 把项目的好处与坏处解释得很清楚 |简要地解释项目的好处与坏处 | 没有解释项目的好处与坏处 |
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/pi.bn.md b/1-getting-started/lessons/1-introduction-to-iot/translations/pi.bn.md
new file mode 100644
index 00000000..8b23cffd
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/pi.bn.md
@@ -0,0 +1,244 @@
+# রাস্পবেরি পাই
+
+[রাস্পবেরি পাই](https://raspberrypi.org) হলো একটি সিংগেল বোর্ড কম্পিউটার । আমরা বিভিন্ন ইকোসিস্টেমের সেন্সর এবং অ্যাকচুয়েটর ব্যবহার করতে পারি, আর এই লেসনে আমরা [Grove](https://www.seeedstudio.com/category/Grove-c-1003.html) নামের বেশ সমৃদ্ধ একটি হার্ডওয়্যার ইকোসিস্টেম ব্যবহার করবো। আমাদের রাস্পবেরি পাই (সংক্ষেপে "পাই") এর কোডিং এবং Grove সেন্সরগুলো আমরা নিয়ন্ত্রণ করবো পাইথন ল্যাংগুয়েজে।
+
+
+
+***রাস্পবেরি পাই - ৪ Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+## সেটাপ
+
+যদি আমরা আমাদের আইওটি হার্ডওয়্যার হিসাবে রাস্পবেরি পাই ব্যবহার করি, তবে দুটি অপশন আছে - সবগুলো লেসন পড়ে সরাসরি রাসপবেরি পাই তে কোডের মাধ্যমে কাজ করা অথবা কম্পিউটার থেকে 'হেডলেস' পাই এবং কোডের সাথে দূরবর্তীভাবে সংযোগ করতে পারেন।
+
+কাজ শুরু করার আগে আমাদের গ্রোভ বেস হ্যাটটি আপনার পাইয়ের সাথে সংযুক্ত করতে হবে।
+
+### কাজ - সেটাপ
+
+Grove বেস হ্যাটটি রাস্পবেরি পাই এ ইন্সটল করা এবং পাই কে সেই অনুসারে কনফিগার করা।
+
+১. গ্রোভ বেস টুপিটি রাস্পবেরি পাই এর সাথে সংযুক্ত করতে হবে। নিচের ছবির মতো, জিপিআইও পিনগুলো বরাবর আমরা গ্রোভ হ্যাট বসাতে পারবো।
+
+ 
+
+২. কীভাবে রাস্পবেরি পাই তে কাজ করতে চাচ্ছি, সে সম্পর্কিত সিদ্ধান্ত নিয়ে - নিচের যেকোন একটি প্রাসঙ্গিক সেকশন এ যেতে হবে
+
+ * [সরাসরি রাস্পবেরি পাই তে কাজ করা](#সরাসরি-রাস্পবেরি-পাই-তে-কাজ-করা)
+ * [রাস্পবেরি পাই তে রিমোট একসেস নেয়া](#পাই-তে-রিমোট-একসেস)
+
+### সরাসরি রাস্পবেরি পাই তে কাজ করা
+
+আমরা যদি সরাসরি রাস্পবেরি পাই তে কাজ করতে চাই, সেক্ষত্রে আমাদেরকে Raspberry Pi OS এর ডেস্কটপ ভার্সন ব্যবহার করতে হবে এবং প্রয়োজনীয় সব উপাদান ইন্সটল করতে হবে।
+
+#### কাজ - সরাসরি রাস্পবেরি পাই
+
+আমাদেরকে রাস্পবেরি পাই সেটাপ করে নিতে হবে।
+
+1. [ রাস্পবেরি পাই সেটাপ গাইড](https://projects.raspberrypi.org/en/projects/raspberry-pi-setting-up) থেকে সব নির্দেশ অনুসরণ করে আমাদের পাই সেটাপ করে নিতে হবে। এটিকে এবার কীবোর্ড/মাউস/মনিটরের সাথে যুক্ত করি। তারপর ওয়াইফাই বা ইথারনেটে সংযুক্ত করে, সফটওয়্যর আপডেট করে নিতে হবে। এক্ষেত্রে যে অপারেটিং সিস্টেম আমরা ডাউনলোড করবো তা হলো **Raspberry Pi OS (32 bit)** , এটিই রেকমেন্ডেড হিসেবে মার্ক করা থাকে ।
+
+
+গ্রোভ সেন্সর ও একচুয়েটর ব্যবহার করে কাজ করার জন্য, আগেই একটি এডিটর ইন্সটল করতে হবে যাতে আমরা কোড লিখতে পারি এবং বিভিন্ন লাইব্রেরি ও ট্যুল ব্যবহার করতে পারি - এতে করে সহজেই আমরা গ্রোভে কাজ করতে পারবো।
+
+1. পাই রিব্যুট করার পরে, উপরের মেন্যু বার থেকে **Terminal** আইকনে ক্লিক করে তা চালু করতে হবে অথবা *Menu -> Accessories -> Terminal* এভাবে চালু করতে হবে।
+
+2. ওএস এবং সব সফটওয়্যর আপডেট করা আছে কিনা তার জন্য নীচের কমান্ড টা রান করতে হবে।
+
+ ```sh
+ sudo apt update && sudo apt full-upgrade --yes
+ ```
+3. গ্রোভে সকল লাইব্রেরি ইন্সটল করার জন্য নিচের কমান্ড রান দিই।
+
+ ```sh
+ curl -sL https://github.com/Seeed-Studio/grove.py/raw/master/install.sh | sudo bash -s -
+ ```
+ পাইথনের অন্যতম শক্তিশালী একটি সুবিধা হলো [pip packages](https://pypi.org) ইন্সটল করতে পারা - পিপ প্যাকেজ হলো অন্যদের তৈরী ও পাবলিশ করা কোডের প্যাকেজ। মাত্র ১টা কমান্ড দিয়েই পিপ ইন্সটল করে ব্যবহার করা যায়। এই গ্রুভ ইন্সটল স্ক্রিপ্ট টি রান করলে, তা আমাদের প্রয়োজনীয় সকল ট্যুল ইন্সটল করে নিবে।
+
+4. আমাদের রাস্পবেরি পাই টি মেন্যু থেকে অথবা নিচের স্ক্রিপ্ট রান করে রিব্যুট করে নিই।
+
+ ```sh
+ sudo reboot
+ ```
+
+5. পাই রিব্যুট হওয়ার পর, টার্মিনাল আবারো চালু করতে হবে আর [Visual Studio Code (VS Code)](https://code.visualstudio.com?WT.mc_id=academic-17441-jabenn) ইন্সটল করতে হবে। এই এডিটরের সাহায্যেই মূলত আমরা সব কোড লিখবো।
+
+ ```sh
+ sudo apt install code
+ ```
+
+ ইন্সটলেশনের পর টপ মেন্যু থেকেই ভিএস কোড পাওয়া যাবে।
+
+ > 💁 পছন্দানুসারে যেকোন পাইথন আইডিই বা এডিটর ব্যবহার করলেই হয়, কিন্তু আমরা এখানে সম্পূর্ণ টিউটোরিয়াল সাজিয়েছি ভিএস কোডের উপর ভিত্তি করে।
+
+6. Pylance ইন্সটল করতে হবে। পাইথনে কোড করার জন্য, এটি ভিএস কোডের একটি এক্সটেনশন। [Pylance extension documentation](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance&WT.mc_id=academic-17441-jabenn) থেকে এটি ইন্সটল করার সকল নির্দেশনা পাওয়া যাবে।
+
+### পাই তে রিমোট একসেস
+
+কম্পিউটারের কীবোর্ড/মাউস/মনিটরের সাথে যুক্ত না রেখে, সরাসরি এটিতে কোডিং না করে, আমরা 'হেডলেস' হিসেবে এটাকে রান করতে পারি। এক্ষেত্রেও ভিস্যুয়াল স্টুডিও কোড ব্যবহার করে আমরা কনফিগারেশন এবং কোডিং করবো।
+
+#### পাই অপারেটিং সিস্টেম সেটাপ করা
+
+পাই ওএস কে একটি এসডি কার্ডে ইন্সটল করে রাখতে হবে।
+
+##### কাজ- পাই ওএস সেটাপ
+
+হেডলেস পাই ওএস সেটাপ করা :
+
+1. [Raspberry Pi OS software page](https://www.raspberrypi.org/software/) থেকে **Raspberry Pi Imager** ডাউনলোড এবং ইন্সটল করতে হবে।
+
+1. কম্পিউটারে একটি এসডি কার্ড প্রবেশ করাতে হবে (এডাপটার প্রয়োজন হতে পারে) ।
+
+1. রাস্পবেরি পাই ইমেজার চালু করতে হবে ।
+1. রাস্পবেরি পাই ইমেজার থেকে **CHOOSE OS** সিলেক্ট করি। তারপর *Raspberry Pi OS (Other)* সিলেক্ট করতে হবে *Raspberry Pi OS Lite (32-bit)* এর পরে ।
+
+ 
+
+ > 💁 Raspberry Pi OS Lite হলো মূলত Raspberry Pi OS এরই একটি ভার্সন যার ডেস্কটপ ইন্টারফেস বা এই সংক্রান্ত ট্যুল নেই। হেডলেস পাই তে এসব দরকার নেই বলেই লাইট ভার্সন নেয়া হচ্ছে যাতে প্রক্রিয়াটি সংক্ষিপ্ত হয় এবং দ্রুত ব্যুট করা যায়।
+
+1. **CHOOSE STORAGE** এ ক্লিক করে, এসডি কার্ড সিলেক্ট করি ।
+
+1. **Advanced Options** চালু করতে হবে, `Ctrl+Shift+X` প্রেস করে। এখান থেকে আমরা পাই এর কিছু প্রি-কনফিগারেশন করতে পারবো।
+ 1. এখন **Enable SSH** বক্সে টিক দিতে হবে এবং পাই ইউজারের জন্য পাসওয়ার্ড সেট করতে হবে। এই পাসওয়ার্ডটি আমরা পরে পাইতে লগ ইন করতে ব্যবহার করবো।
+
+ 1. যদি আমাদেরকে পাই কে ওয়াইফাইয়ের সাথে সংযোগ স্থাপনের মাধ্যমে কাজ করার ইচ্ছে থাকে, তাহলে **Configure WiFi** চেক বাক্সটি চেক করুন এবং আমাদের ওয়াইফাই, এসএসআইডি এবং পাসওয়ার্ড লিখে, দেশ নির্বাচন করতে হবে। যদি ইথারনেট ক্যাবল ব্যবহার করি, তবে এটি করার দরকার নেই। এখানে অবশ্যই নিশ্চিত থাকতে হবে যে, পাই কে যে নেটওয়ার্কে সংযোগ করা হচ্ছে, সেই একই নেটওয়ার্কে কম্পিউটারটি যুক্ত রয়েছে।
+
+ 1. **Set locale settings** বক্সে টিক দিতে হবে এব দেশ এবং টাইমজোন দিতে হবে।
+
+ 1. **SAVE** সিলেক্ট করি।
+
+1. এবার **WRITE** ক্লিক করলে ওএস আর এসডি কার্ডের কাজ শুরু। যদি ম্যাক-ওএস ব্যবহার করলে এক্ষেত্রে পাসওয়ার্ডটি প্রবেশ করতে বলা হবে যা ডিস্ক ইমেজ এ কাজ করার একসেস দেয়।
+
+অপারেটিং সিস্টেমটি এসডি কার্ডে 'write' করা হবে এবং কার্ডটি সম্পূর্ণ হয়ে গেলে ওএস দ্বারা 'ইজেক্ট' করে দেওয়া হবে এবং ইউজারকে অবহিত করা হবে। এটি হয়ে গেলে, কম্পিউটার থেকে এসডি কার্ড সরিয়ে তা পাই তে প্রবেশ করিয়ে তা চালু করতে হবে।
+
+#### পাই এর সাথে সংযোগ
+
+পরবর্তী ধাপ হলো পাই তে রিমোট একসেস পাওয়া। এটি `ssh` ব্যবহার করে করা যায়, যা ম্যাক-ওএস, লিনাক্স এবং উইন্ডোজের সাম্প্রতিক ভার্সনগুলোতে রয়েছে।
+
+##### কাজ - পাই এর সাথে সংযোগ
+
+পাই এ রিমোট একসেস
+
+1. টার্মিনাল বা কমান্ড প্রম্পট চালু করতে হবে এবং পাই কে যুক্ত করার জন্য নীচের কমান্ডটি চালু করতে হবে।
+
+ ```sh
+ ssh pi@raspberrypi.local
+ ```
+
+ উইন্ডোজের পুরাতন ভার্সন, যেসবে `ssh` নেই সেখানে কী করা যায় ? খুব সহজ - OpenSSH ব্যবহার করা যাবে। ইন্সটল করার সব নির্দেশনা [OpenSSH installation documentation](https://docs.microsoft.com//windows-server/administration/openssh/openssh_install_firstuse?WT.mc_id=academic-17441-jabenn) এ পাওয়া যাবে।
+
+1. এটি পাইয়ের সাথে সংযুক্ত হয়ে এবং পাসওয়ার্ড চাইবে।
+
+ আমাদের কম্পিউটার নেটওয়ার্ক `.local` এই কমান্ডের মাধ্যমে জানতে পারাটা বেশ নতুন একটি ফীচার লিনাক্স এবং উইন্ডোজে। যদি লিনাক্স বা উইন্ডোজে এইক্ষেত্রে হোস্টনেম পাওয়া না গিয়ে বরং এরর আসে তাহলে, অতিরিক্ত সফটওয়্যার ইন্সটল করতে হবে যাতে করে 'ZeroConf networking' চালু করা যায় (এপল ডিভাইসের জন্য 'Bonjour'):
+
+ 1. লিনাক্স ব্যবহারকারী হলে, Avahi ইন্সটল করতে হবে নীচের কমান্ড ব্যবহার করে:
+
+ ```sh
+ sudo apt-get install avahi-daemon
+ ```
+
+ 1. উইন্ডোজে সবচেয়ে সহজে 'ZeroConf networking' চালু করার জন্য [Bonjour Print Services for Windows](http://support.apple.com/kb/DL999) ইন্সটল করলেই হবে। এছাড়াও [iTunes for Windows](https://www.apple.com/itunes/download/) ইন্সটল করলেও হবে, আর এতে কিছু নতুন সুবিধা রয়েছে যা স্ট্যান্ড-এলোন হিসেবে সাধারণত পাওয়া যায়না।
+
+ > 💁 যদি `raspberrypi.local` ব্যবহার করে কানেক্ট করা না যায় , তখন পাই এর আইপি এড্রেস ব্যবহার করতে হবে। এই সংক্রান্ত নির্দেশনা [Raspberry Pi IP address documentation](https://www.raspberrypi.org/documentation/remote-access/ip-address.md) এ বিস্তারিত দেয়া রয়েছে।
+
+1. রাস্পবেরি পাই ইমেজার এডভান্সড অপশনে যে পাসওয়ার্ডটি সেট করা হয়েছিল, তা প্রবেশ করাতে হবে।
+
+#### পাই এ সফ্টওয়্যার কনফিগার
+
+একবার পাইয়ের সাথে সংযুক্ত হয়ে গেলে, আমাদেরকে খেয়াল রাখতে হবে ওএস আপ টু ডেট রয়েছে কিনা এবং গ্রোভ হার্ডওয়ারের সাথে যুক্ত বিভিন্ন লাইব্রেরি এবং সরঞ্জাম ইনস্টল করতে হবে।
+
+##### কাজ - পাই সফ্টওয়্যার কনফিগার
+
+পাই সফ্টওয়্যার কনফিগার এবং গ্রোভ লাইব্রেরি ইন্সটল করা।
+
+1. `ssh` সেশন থেকে, নিচের কমান্ডগুলো রান করতে হবে এবং আপডেট করার পর , পাই রিব্যুট করতে হবে।
+
+ ```sh
+ sudo apt update && sudo apt full-upgrade --yes && sudo reboot
+ ```
+
+ আপডেট এবং রিব্যুট হয়ে যাবে আর তা শেষ হলে `ssh`সেশন শেষ হয়ে যাবে। তাই ৩০ সেকেন্ড পর পুনরায় কানেক্ট করতে হবে।
+
+1. রিকানেক্ট করা `ssh` সেশনে , নিচের কমান্ডগুলো রান করতে হবে গ্রোভ লাইব্রেরি ইন্সটল করার জন্য:
+
+ ```sh
+ curl -sL https://github.com/Seeed-Studio/grove.py/raw/master/install.sh | sudo bash -s -
+ ```
+
+ পাইথনের অন্যতম শক্তিশালী একটি সুবিধা হলো [pip packages](https://pypi.org) ইন্সটল করতে পারা - পিপ প্যাকেজ হলো অন্যদের তৈরী ও পাবলিশ করা কোডের প্যাকেজ। মাত্র ১টা কমান্ড দিয়েই পিপ ইন্সটল করে ব্যবহার করা যায়। এই গ্রুভ ইন্সটল স্ক্রিপ্ট টি রান করলে, তা আমাদের প্রয়োজনীয় সকল ট্যুল ইন্সটল করে নিবে।
+
+1. নিচের কমান্ডটি রান করে রিব্যুট করতে হবে:
+
+ ```sh
+ sudo reboot
+ ```
+
+ পাই রিব্যুট হওয়ার পর `ssh`সেশন শেষ হয়ে যাবে। রিকানেক্ট করার আর প্রয়োজন নেই।
+
+#### রিমোট একসেসের জন্য ভিএস কোড কনফিগার
+
+পাই কনফিগার করার পরে, এটাতে Visual Studio Code (অর্থাৎ VS Code) এর মাধ্যমে কানেক্ট করা যাবে।
+
+##### কাজ - রিমোট একসেসের জন্য ভিএস কোড কনফিগার
+
+প্রয়োজনীয় সফ্টওয়্যার ইনস্টল করে এবং পাই এর সাথে রিমোট বা দূরবর্তী সংযোগ স্থাপন করতে হবে।
+
+1. [VS Code documentation](https://code.visualstudio.com?WT.mc_id=academic-17441-jabenn) অনুসারে ভিসুয়াল স্টুডিও কোড ইন্সটল করতে হবে।
+
+1. তারপর [VS Code Remote Development using SSH documentation](https://code.visualstudio.com/docs/remote/ssh?WT.mc_id=academic-17441-jabenn) অনুসরণ করে প্রয়োজনীয় সব কম্পোনেন্ট ইন্সটল করতে হবে।
+
+1. একই গাইড ফলো করে রাস্পবেরি পাই কে ভিএস কোডের সাথে সংযুক্ত করতে হবে।
+
+1. কানেক্ট হয়ে যাওয়ার পরে [managing extensions](https://code.visualstudio.com/docs/remote/ssh#_managing-extensions?WT.mc_id=academic-17441-jabenn) অনুসারে [Pylance extension](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance&WT.mc_id=academic-17441-jabenn) রিমোট মাধ্যমে পাই তে ইন্সটল করতে হবে।
+
+## Hello world (হ্যালো ওয়ার্লড)
+
+কোন নতুন প্রোগ্রামিং ভাষা বা প্রযুক্তি শেখা শুরু করার সময় এটি প্রচলিত রয়েছে যে - একটি 'হ্যালো ওয়ার্ল্ড' অ্যাপ্লিকেশন তৈরির মাধ্যমে যাত্রা শুরু করা । 'হ্যালো ওয়ার্ল্ড'- একটি ছোট অ্যাপ্লিকেশন যা `"Hello World"` আউটপুট হিসেবে প্রদান করবে আর এতে আমরা বুঝতে পারি যে আমাদের সব কনফিগারেশন ঠিক আছে কিনা।
+
+এক্ষেত্রে 'হ্যালো ওয়ার্ল্ড' অ্যাপটি নিশ্চিত করবে যে আমাদের পাইথন এবং ভিজ্যুয়াল স্টুডিও কোডটি সঠিকভাবে ইনস্টল করা হয়েছে।
+
+এই অ্যাপ্লিকেশনটি `nightlight` নামে একটি ফোল্ডারে থাকবে এবং নাইটলাইট অ্যাপ্লিকেশনটি তৈরি করতে এই অ্যাসাইনমেন্টের পরবর্তী অংশগুলিতে এটি বিভিন্ন কোডের সাথে পুনরায় ব্যবহার করা হবে।
+
+### কাজ - হ্যালো ওয়ার্লড
+
+'হ্যালো ওয়ার্ল্ড' অ্যাপ তৈরী করা
+
+1. ভিএস কোড চালু করতে হবে। সরসারি রাস্পবেরি পাই অথবা কম্পিউটার থেকে এটি করা যাবে যা Remote SSH extension দিয়ে পাই এর সাথে যুক্ত ।
+
+1. ভিএস কোডের টার্মিনাল চালু করতে হবে, এজন্য আমাদেরকে এই ধারা অনুসরণ করতে হবে *Terminal -> New Terminal অথবা `` CTRL+` `` টাইপ করে। এটি `pi` ইউজারের হোম ডিরেক্টরি চালু করবে।
+
+1. নিচের কমান্ড রান করার মাধ্যমে কোড এর জন্য একটি ডিরেক্টরি ক্রিয়েট করা হবে এবং আমরা `app.py` নামের একটি পাইথন ফাইল সেই ডিরেক্টরি তে তৈরী করছি:
+
+ ```sh
+ mkdir nightlight
+ cd nightlight
+ touch app.py
+ ```
+
+1. এই ফোল্ডারটি ভিএস কোডের মাধ্যমে ওপেন করতে হবেঃ *File -> Open...* তারপর *nightlight* folder সিলেক্ট করে **OK** তে ক্লিক করতে হবে।
+
+ 
+
+1. `app.py` ফাইলটি ভিএস কোড এক্সপ্লোরারের মাধ্যমে ওপেন করে, নিম্নের কোডটি লিখি
+
+ ```python
+ print('Hello World!')
+ ```
+
+ এখানে `print` ফাংশনটি এর ভেতরে যা রাখা হবে, তাকে আউটপুট হিসেবে প্রদর্শন করবে।
+
+1. ভিএস কোড টার্মিনাল থেকে নিচের কমান্ডটি রান করানো হলে, পাইথন ফাইলটি রান করবে :
+
+ ```sh
+ python3 app.py
+ ```
+
+    > 💁 এই কোডটি চালানোর জন্য স্পষ্টভাবে `python3` কল করা হয়েছে, কারণ ডিভাইসে পাইথন 3 (সর্বশেষ সংস্করণ) এর পাশাপাশি পুরনো পাইথন 2ও ইনস্টল থাকতে পারে। সেক্ষেত্রে শুধু `python` কল করলে পাইথন 3 এর পরিবর্তে পাইথন 2 ব্যবহৃত হতে পারে, যা আমাদের অবশ্যই এড়িয়ে চলতে হবে। ভার্সন যাচাইয়ের একটি ছোট উদাহরণ এই ধাপের পরে দেয়া আছে।
+
+ টার্মিনালে নিম্নোক্ত আউটপুট দেখাবে :
+
+ ```output
+ pi@raspberrypi:~/nightlight $ python3 app.py
+ Hello World!
+ ```
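+
+উপরের নোটে বলা `python3` বনাম `python` বিষয়টি যাচাই করার জন্য নিচে একটি ছোট, অনুমানভিত্তিক পাইথন স্কেচ দেয়া হলো - এটি লেসনের মূল কোডের অংশ নয়, শুধু দেখায় কোন ভার্সনের ইন্টারপ্রেটার চলছে:
+
+```python
+# কোন পাইথন ইন্টারপ্রেটার দিয়ে স্ক্রিপ্টটি চলছে তা প্রিন্ট করা
+import sys
+
+print(sys.version)
+
+# পাইথন 2 দিয়ে চালানো হলে সতর্ক করা
+if sys.version_info.major < 3:
+    print("Warning: this is not Python 3!")
+```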
+
+> 💁 এই সম্পূর্ণ কোডটি পাওয়া যাবে [code/pi](code/pi) ফোল্ডারে ।
+
+😀 আমাদের 'Hello World'প্রোগ্রাম সফল হলো !
diff --git a/1-getting-started/lessons/1-introduction-to-iot/translations/wio-terminal.bn.md b/1-getting-started/lessons/1-introduction-to-iot/translations/wio-terminal.bn.md
new file mode 100644
index 00000000..5ff5b9f1
--- /dev/null
+++ b/1-getting-started/lessons/1-introduction-to-iot/translations/wio-terminal.bn.md
@@ -0,0 +1,201 @@
+# Wio Terminal
+
+[সীড স্টুডিও](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) এর Wio Terminal একটি আরডুইনো সাপোর্টেড মাইক্রোকন্ট্রোলার, যাতে ওয়াইফাই সংযোগ এবং কিছু সেন্সর ও অ্যাকচুয়েটর বিল্ট-ইন রয়েছে। এছাড়াও অতিরিক্ত সেন্সর ও অ্যাকচুয়েটর সংযোগের জন্য এতে কিছু পোর্ট রয়েছে, যা [Grove](https://www.seeedstudio.com/category/Grove-c-1003.html) নামের একটি হার্ডওয়্যার ইকোসিস্টেম ব্যবহার করে তৈরি।
+
+
+
+## সেটআপ
+
+Wio Terminal ব্যবহার করার জন্য, আমাদের কিছু ফ্রি সফটওয়্যার নিজেদের কম্পিউটারে ইনস্টল করতে হবে। ওয়াইফাই সংযোগ দেয়ার পূর্বে আমাদের অবশ্যই Wio Terminal এর ফার্মওয়্যারটি আপডেট করে নিতে হবে।
+
+### কাজের সেটআপ
+
+প্রথমেই আমরা আমাদের প্রয়োজনীয় সফটওয়্যারগুলো ইনস্টল করে নেব এবং ফার্মওয়্যারটি আপডেট করে নেব।
+
+1. ভিজুয়াল স্টুডিও কোড (ভি এস কোড) ইনস্টল করতে হবে। এটি একটি এডিটর যার সাহায্যে আমরা আমাদের ডিভাইস কোড লিখতে পারি সি/সি++ ভাষায়। বিস্তারিত জানতে [VS Code documentation](https://code.visualstudio.com?WT.mc_id=academic-17441-jabenn) টি পড়ে নেয়া যেতে পারে।
+
+ > 💁 আরডুইনো ডেভলপমেন্ট এর জন্য আর একটি ভালো আই.ডি.ই হলো [Arduino IDE](https://www.arduino.cc/en/software). এই IDE টির সাথে কাজ করার পূর্ব অভিজ্ঞতা থাকলে ভি এস কোড ও platformIO এর পরিবর্তে একেও ব্যাবহার করা যেতে পারে। তবে, এখানে আমরা ভি এস কোডের উপর ভিত্তি করেই কাজ করবো।
+
+1. এরপর ভি এস কোড platformIO এক্সটেনশনটি ইনস্টল করতে হবে। এই এক্সটেনশনটি ভি এস কোডে ইনস্টল করতে [PlatformIO extension documentation](https://marketplace.visualstudio.com/items?itemName=platformio.platformio-ide&WT.mc_id=academic-17441-jabenn) এ দেওয়া দিকনির্দেশনাগুলো পড়ে দেখতে পারেন। এটি একটি ভি এস কোড এক্সটেনশন যা সি/সি++ ভাষায় মাইক্রোকন্ট্রোলার প্রোগ্রামিংকে সাপোর্ট করে। সি অথবা সি++ ভাষা নিয়ে কাজ করার জন্য এই এক্সটেনশনটি মাইক্রোসফট সি/সি++ এক্সটেনশনের উপর নির্ভর করে। উল্লেখ্য, এই সি/সি++ এক্সটেনশনটি স্বয়ংক্রিয়ভাবে ইনস্টল হয়ে যায় যখন কেউ platformIO ইনস্টল করে।
+
+1. এখন, আমরা আমাদের Wio Terminal কে কম্পিউটারের সাথে সংযুক্ত করব। এটির নিচের দিকে একটি ইউএসবি-সি পোর্ট আছে, সেটিকে আমরা আমাদের কম্পিউটারের ইউএসবি পোর্টের সাথে সংযোগ দিব। উইও টার্মিনালের সাথে একটি ইউএসবি-সি থেকে ইউএসবি-এ ক্যাবল দেয়া থাকে। যদি আমাদের কম্পিউটারে শুধু ইউএসবি-সি পোর্ট থাকে, তাহলে হয় একটি ইউএসবি-সি থেকে ইউএসবি-সি ক্যাবল, অথবা একটি ইউএসবি-এ থেকে ইউএসবি-সি অ্যাডাপ্টার প্রয়োজন হবে।
+
+1. [Wio Terminal Wiki WiFi Overview documentation](https://wiki.seeedstudio.com/Wio-Terminal-Network-Overview/) এ উল্লেখিত দিকনির্দেশনা গুলোকে মেনে আমরা আমাদের উইও টার্মিনাল সেটআপ ও ফার্মওয়্যার আপডেট করে ফেলি।
+
+
+## হ্যালো ওয়ার্ল্ড
+
+ প্রথাগতভাবে, কোনো নতুন প্রোগ্রামিং ল্যাঙ্গুয়েজ অথবা টেকনোলজি নিয়ে কাজ শুরু করার সময় আমরা একটি "Hello World" application লিখি, একটি ছোট application যা আউটপুট হিসেবে `"Hello World"` লেখাটি দেখায়। এতে করে আমরা বুঝি যে আমাদের প্রোগ্রামটিতে সকল টুল সঠিকভাবে কাজ করছে।
+
+আমাদের Wio Terminal এর হেলো ওয়ার্ল্ড অ্যাপটি এটি নিশ্চিত করবে যে আমাদের ভিজুয়াল স্টুডিও কোড platformIO এর সাথে সঠিকভাবে ইনস্টল করা হয়েছে এবং এখন এটি microcontroller development এর জন্য প্রস্তুত।
+
+### platformIO প্রজেক্ট তৈরী
+
+আমাদের প্রথম কাজ হলো platformIO ব্যাবহার করে একটি নতুন প্রজেক্ট তৈরী করা যা Wio terminal এর জন্য কনফিগার করা।
+
+#### কাজ- platformIO প্রজেক্ট তৈরী
+
+একটি platformIO প্রজেক্ট তৈরী করি।
+
+1. Wio terminal কে কম্পিউটারের সাথে সংযোগ দেই।
+
+1. ভি এস কোড launch করি
+
+1. আমরা platformIO আইকনটি সাইড মেন্যু বারে দেখতে পাবো:
+
+ 
+
+ এই মেন্যু আইটেমটি সিলেক্ট করে, সিলেক্ট করি *PIO Home -> Open*
+
+ 
+
+1. Welcome স্ক্রীন থেকে **+ New Project** বাটনটিতে ক্লিক করি।
+
+ 
+
+1. প্রজেক্টটিকে *Project Wizard* এ configure করি
+
+ 1. প্রজেক্টটিকে `nightlight` নাম দেই।
+
+ 1. *Board* dropdown থেকে, `WIO` লিখে বোর্ডগুলোকে ফিল্টার করি, *Seeeduino Wio Terminal* সিলেক্ট করি।
+
+    1. *Framework* হিসেবে *Arduino* ই রেখে দেই।
+
+ 1. হয় *Use default location* কে টিক অবস্থায় ছেড়ে দেই অথবা সেটিকে টিক না দিয়ে আমাদের প্রজেক্টটির জন্য যেকোনো location সিলেক্ট করি।
+
+ 1. **Finish** বাটনটিতে ক্লিক করি।
+
+ 
+
+    platformIO এখন wio terminal এর কোডগুলোকে compile করার জন্য প্রয়োজনীয় কম্পোনেন্টস ডাউনলোড করে নেবে এবং আমাদের প্রজেক্টটি create করে নেবে। পুরো প্রক্রিয়াটি সম্পন্ন হতে কয়েক মিনিট সময় লাগতে পারে।
+
+### platformIO প্রজেক্টটি investigate করে দেখা
+
+ভি এস কোড এক্সপ্লোরার আমাদের কিছু ফাইল এবং ফোল্ডার দেখাবে যা platformIO wizard দ্বারা তৈরি হয়েছে।
+
+#### ফোল্ডারস
+
+* `.pio` - এই ফোল্ডারটি কিছু temporary ডাটা বহন করে যা platformIO এর প্রয়োজন হতে পারে, যেমন: libraries অথবা compiled code। এটি delete করলেও আবার পুনঃনির্মিত হয়, তাই প্রজেক্টটি GitHub এর মতো কোনো সাইটে share করার সময় এটিকে সোর্স কোড কন্ট্রোলে যুক্ত করার প্রয়োজন নেই।
+* `.vscode` - এই ফোল্ডারটি ভি এস কোড ও platformIO দ্বারা ব্যবহৃত configuration গুলোকে বহন করে। এটিও delete করলে আবার পুনঃনির্মিত হয় এবং সোর্স কোড কন্ট্রোলে যুক্ত করার প্রয়োজন নেই।
+* `include` - এই ফোল্ডারটি এক্সটার্নাল হেডার ফাইল বহনের জন্য রয়েছে যা আমাদের কোডে অতিরিক্ত library যোগের সময় দরকার হয়। আমাদের কাজগুলোতে আমরা এই ফোল্ডারটি ব্যবহার করব না।
+* `lib` - এই ফোল্ডারটি কিছু এক্সটার্নাল libraries বহন করবে যা আমরা আমাদের কোড থেকে কল করব। আমাদের কাজগুলোতে আমরা এই ফোল্ডারটিও ব্যবহার করব না।
+* `src` - এই ফোল্ডারটি আমাদের main সোর্স কোডটিকে বহন করবে, যা কিনা একটি সিংগেল ফাইল - main.cpp
+* `test` - এই ফোল্ডারটি সেই স্থান যেখানে আমরা আমাদের কোডের ইউনিট টেস্ট গুলোকে রাখবো।
+
+#### ফাইলস
+
+* `main.cpp` - src ফোল্ডারে অবস্থিত এই ফাইলটি আমাদের অ্যাপ্লিকেশন এর entry point হিসেবে কাজ করবে। আমরা ফাইলটি খুলে দেখব, এটি বহন করে:
+
+ ```cpp
+    #include <Arduino.h>
+
+ void setup() {
+ // put your setup code here, to run once:
+ }
+
+ void loop() {
+ // put your main code here, to run repeatedly:
+ }
+ ```
+
+    যখন ডিভাইসটি চালু হয়, Arduino framework টি `setup` ফাংশনটি একবার রান করে, এরপর `loop` ফাংশনটিকে বারবার রান করতে থাকে, যতক্ষণ পর্যন্ত ডিভাইসটি বন্ধ না হয়।
+
+* `.gitignore` - এটি সেই ফাইল ও ডিরেক্টরিগুলোকে লিস্ট করে রাখে, যেগুলোকে আমরা আমাদের কোড git source code control এ যুক্ত করার সময় ইগনোর করবো, যেমন: কোনো GitHub repository তে আপলোড করার সময়।
+
+* `platformio.ini` - এই ফাইলে আমাদের ডিভাইসের এবং অ্যাপের configuration গুলো রয়েছে । এটি খুললে দেখা যাবে:
+
+ ```ini
+ [env:seeed_wio_terminal]
+ platform = atmelsam
+ board = seeed_wio_terminal
+ framework = arduino
+ ```
+
+ `[env:seeed_wio_terminal]` সেকশনটিতে wio terminal এর configuration আছে। আমরা একের অধিক `env` সেকশন রাখতে পারি যেন আমাদের কোডকে একের অধিক board এর জন্য compile করা যায়।
+
+    বাকি value গুলো হলো Project wizard এ সিলেক্ট করা বোর্ড অনুযায়ী দেয়া configuration:
+
+    * `platform = atmelsam` Wio terminal যে হার্ডওয়্যারটি ব্যবহার করে তাকে ডিফাইন করে (an ATSAMD51-based microcontroller)
+    * `board = seeed_wio_terminal` মাইক্রোকন্ট্রোলার এর টাইপ কে ডিফাইন করে (the Wio Terminal)
+    * `framework = arduino` আমাদের প্রজেক্টটি Arduino framework ব্যবহার করে সেটি ডিফাইন করে।
+
+### হ্যালো ওয়ার্ল্ড অ্যাপটি লিখি
+
+এখন আমরা হ্যালো ওয়ার্ল্ড অ্যাপটি লিখার জন্য প্রস্তুত হয়েছি।
+
+#### কাজ - হ্যালো ওয়ার্ল্ড অ্যাপটি লিখা
+
+হ্যালো ওয়ার্ল্ড অ্যাপটি লিখি।
+
+1. `main.cpp` ফাইলটি ভি এস কোড থেকে ওপেন করি।
+
+1. কোডটি এমনভাবে লিখি যেনো এটি নিম্নোক্ত কোডটির সাথে মিলে যায়:
+
+ ```cpp
+    #include <Arduino.h>
+
+ void setup()
+ {
+ Serial.begin(9600);
+
+ while (!Serial)
+ ; // Wait for Serial to be ready
+
+ delay(1000);
+ }
+
+ void loop()
+ {
+ Serial.println("Hello World");
+ delay(5000);
+ }
+ ```
+
+ `setup` ফাংশনটি একটি connection কে initialize করে সিরিয়াল পোর্ট এর সাথে, সেই usb পোর্টটি যেটি আমাদের কম্পিউটারকে wio terminal এর সাথে সংযুক্ত করেছে। `9600` প্যারামিটারটি হলো [baud rate](https://wikipedia.org/wiki/Symbol_rate) (যা সিম্বল রেট হিসেবেও পরিচিত) সিরিয়াল পোর্ট এর মধ্য দিয়ে যাওয়া ডাটার speed (bits per second). এই সেটিং দ্বারা আমরা বোঝাই ৯৬০০ bits (০ এবং ১) ডাটা পাঠানো হচ্ছে প্রতি সেকেন্ডে। এরপর এটি সিরিয়াল পোর্টটি ready state এ যাওয়ার জন্য wait করে।
+
+    `loop` ফাংশনটি `Hello World` লাইনটির character গুলো এবং একটি new line character সিরিয়াল পোর্টে পাঠায়। এরপর, এটি ৫০০০ মিলিসেকেন্ড সময়ের জন্য sleep state এ যায়। Loop শেষ হওয়ার পর, এটি আবার রান করে এবং চলতে থাকে যতক্ষণ পর্যন্ত মাইক্রোকন্ট্রোলারটি ON থাকে।
+
+
+1. কোডটি বিল্ড করে wio terminal এ আপলোড করি
+
+ 1. ভি এস কোড command palette ওপেন করি।
+
+    1. টাইপ করি `PlatformIO Upload` আপলোড অপশনটি খুঁজে পাওয়ার জন্য, এরপর *PlatformIO: Upload* সিলেক্ট করি।
+
+ 
+
+ যদি দরকার হয়, platformIO এখন অটোমেটিক ভাবে কোডটিকে বিল্ড করবে, আপলোড করার পূর্বে।
+
+ 1. কোডটি কম্পাইল হয়ে wio terminal এ আপলোড হয়ে যাবে
+
+    > 💁 আমরা যদি MacOS ব্যবহার করে থাকি, একটি *DISK NOT EJECTED PROPERLY* notification দেখতে পাবো। এটি এজন্য দেখায় যে, ফ্লাশিং প্রসেসের অংশ হিসেবে wio terminal টি একটি ড্রাইভ হিসেবে মাউন্ট হয়, এবং compiled code টি আমাদের ডিভাইসে লেখা হলে তা বিচ্ছিন্ন হয়ে যায়। আমরা এই নোটিফিকেশনটি ইগনোর করতে পারি।
+
+    ⚠️ আমরা যদি error দেখতে পাই যে আপলোড পোর্ট unavailable, প্রথমত, আমাদের দেখতে হবে wio টার্মিনালটি আমাদের কম্পিউটারের সাথে সংযুক্ত আছে কিনা এবং স্ক্রীনের বামদিকের সুইচটি অন করা আছে কিনা। নিচের দিকের সবুজ লাইটটি অন থাকতে হবে। এরপরও যদি error আসে, আমরা on/off সুইচটিকে দ্রুত পরপর দুবার নিচের দিকে টানবো যেন wio terminal টি bootloader mode এ যায়। এরপর, আবার আপলোড করবো।
+
+wio terminal থেকে ইউএসবি ক্যাবলের মাধ্যমে পাঠানো ডাটা মনিটর করার জন্য platformIO তে একটি serial monitor রয়েছে। এর সাহায্যে আমরা `Serial.println("Hello World");` কমান্ডটি দিয়ে পাঠানো ডাটা দেখতে পারবো।
+
+1. ভি এস কোড command palette ওপেন করি
+
+1. `PlatformIO Serial` টাইপ করি serial monitor অপশনটি খুঁজে পাওয়ার জন্য, এরপর *PlatformIO: Serial Monitor* সিলেক্ট করি
+
+ 
+
+ এখন একটি নতুন টার্মিনাল ওপেন হবে যেখানে সিরিয়াল পোর্টের মাধ্যমে যত ডাটা পাঠানো হয়েছে তা দেখা যাবে:
+
+ ```output
+ > Executing task: platformio device monitor <
+
+ --- Available filters and text transformations: colorize, debug, default, direct, hexlify, log2file, nocontrol, printable, send_on_enter, time
+ --- More details at http://bit.ly/pio-monitor-filters
+ --- Miniterm on /dev/cu.usbmodem101 9600,8,N,1 ---
+ --- Quit: Ctrl+C | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
+ Hello World
+ Hello World
+ ```
+
+ serial monitor এ প্রতি ৫ সেকেন্ডে `Hello World` প্রিন্ট হবে।
+
+> 💁 আমরা উক্ত কোডটি [code/wio-terminal](code/wio-terminal) ফোল্ডারে খুঁজে পাবো।
+
+😀 আমাদের 'হ্যালো ওয়ার্ল্ড' লেখাটি সফল হলো!!
diff --git a/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md b/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md
index 1c457f01..dbda7c91 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/virtual-device.md
@@ -67,13 +67,13 @@ Configure a Python virtual environment and install the pip packages for CounterF
source ./.venv/bin/activate
```
-1. Once the virtual environment has been activated, the default `python` command will run the version of Python that was used to create the virtual environment. Run the following to see this:
+1. Once the virtual environment has been activated, the default `python` command will run the version of Python that was used to create the virtual environment. Run the following to get the version:
```sh
python --version
```
- You should see the following:
+ The output should contain the following:
```output
(.venv) ➜ nightlight python --version
@@ -122,7 +122,7 @@ Create a Python application to print `"Hello World"` to the console.
> 💁 If your terminal returns `command not found` on macOS it means VS Code has not been added to your PATH. You can add VS Code to your PATH by following the instructions in the [Launching from the command line section of the VS Code documentation](https://code.visualstudio.com/docs/setup/mac?WT.mc_id=academic-17441-jabenn#_launching-from-the-command-line) and run the command afterwards. VS Code is installed to your PATH by default on Windows and Linux.
-1. When VS Code launches, it will activate the Python virtual environment. You will see this in the bottom status bar:
+1. When VS Code launches, it will activate the Python virtual environment. The selected virtual environment will appear in the bottom status bar:

@@ -136,9 +136,9 @@ Create a Python application to print `"Hello World"` to the console.
(.venv) ➜ nightlight
```
- If you don't see `.venv` as a prefix on the prompt, the virtual environment is not active in the terminal.
+ If you don't have `.venv` as a prefix on the prompt, the virtual environment is not active in the terminal.
-1. Launch a new VS Code Terminal by selecting *Terminal -> New Terminal, or pressing `` CTRL+` ``. The new terminal will load the virtual environment, and you will see the call to activate this in the terminal, as well as having the name of the virtual environment (`.venv`) in the prompt:
+1. Launch a new VS Code Terminal by selecting *Terminal -> New Terminal, or pressing `` CTRL+` ``. The new terminal will load the virtual environment, and the call to activate this will appear in the terminal. The prompt will also have the name of the virtual environment (`.venv`):
```output
➜ nightlight source .venv/bin/activate
@@ -159,7 +159,7 @@ Create a Python application to print `"Hello World"` to the console.
python app.py
```
- You should see the following output:
+ The following will be in the output:
```output
(.venv) ➜ nightlight python app.py
@@ -184,7 +184,7 @@ As a second 'Hello World' step, you will run the CounterFit app and connect your

- You will see it marked as *Disconnected*, with the LED in the top-right corner turned off.
+ It will be marked as *Disconnected*, with the LED in the top-right corner turned off.
1. Add the following code to the top of `app.py`:
@@ -201,7 +201,7 @@ As a second 'Hello World' step, you will run the CounterFit app and connect your

-1. In this new terminal, run the `app.py` file as before. You will see the status of CounterFit change to **Connected** and the LED light up.
+1. In this new terminal, run the `app.py` file as before. The status of CounterFit will change to **Connected** and the LED will light up.

diff --git a/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md b/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md
index 7cfa5325..cd45180d 100644
--- a/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md
+++ b/1-getting-started/lessons/1-introduction-to-iot/wio-terminal.md
@@ -40,7 +40,7 @@ Create the PlatformIO project.
1. Launch VS Code
-1. You should see the PlatformIO icon on the side menu bar:
+1. The PlatformIO icon will be on the side menu bar:

@@ -83,7 +83,7 @@ The VS Code explorer will show a number of files and folders created by the Plat
#### Files
-* `main.cpp` - this file in the `src` folder contains the entry point for your application. If you open the file, you will see the following:
+* `main.cpp` - this file in the `src` folder contains the entry point for your application. Open this file, and it will contain the following code:
```cpp
#include <Arduino.h>
@@ -101,7 +101,7 @@ The VS Code explorer will show a number of files and folders created by the Plat
* `.gitignore` - this file lists the files and directories to be ignored when adding your code to git source code control, such as uploading to a repository on GitHub.
-* `platformio.ini` - this file contains configuration for your device and app. If you open this file, you will see the following:
+* `platformio.ini` - this file contains configuration for your device and app. Open this file, and it will contain the following code:
```ini
[env:seeed_wio_terminal]
@@ -166,7 +166,7 @@ Write the Hello World app.
1. The code will be compiled and uploaded to the Wio Terminal
- > 💁 If you are using macOS you will see a notification about a *DISK NOT EJECTED PROPERLY*. This is because the Wio Terminal gets mounted as a drive as part of the flashing process, and it is disconnected when the compiled code is written to the device. You can ignore this notification.
+ > 💁 If you are using macOS, a notification about a *DISK NOT EJECTED PROPERLY* will appear. This is because the Wio Terminal gets mounted as a drive as part of the flashing process, and it is disconnected when the compiled code is written to the device. You can ignore this notification.
⚠️ If you get errors about the upload port being unavailable, first make sure you have the Wio Terminal connected to your computer, and switched on using the switch on the left hand side of the screen. The green light on the bottom should be on. If you still get the error, pull the on/off switch down twice in quick succession to force the Wio Terminal into bootloader mode and try the upload again.
@@ -191,7 +191,7 @@ PlatformIO has a Serial Monitor that can monitor data sent over the USB cable fr
Hello World
```
- You will see `Hello World` appear every 5 seconds.
+ `Hello World` will print to the serial monitor every 5 seconds.
> 💁 You can find this code in the [code/wio-terminal](code/wio-terminal) folder.
diff --git a/1-getting-started/lessons/2-deeper-dive/README.md b/1-getting-started/lessons/2-deeper-dive/README.md
index 9d269cd1..c0447594 100644
--- a/1-getting-started/lessons/2-deeper-dive/README.md
+++ b/1-getting-started/lessons/2-deeper-dive/README.md
@@ -66,7 +66,7 @@ An even smarter version could use AI in the cloud with data from other sensors c
Although the I in IoT stands for Internet, these devices don't have to connect to the Internet. In some cases, devices can connect to 'edge' devices - gateway devices that run on your local network meaning you can process data without making a call over the Internet. This can be faster when you have a lot of data or a slow Internet connection, it allows you to run offline where Internet connectivity is not possible such as on a ship or in a disaster area when responding to a humanitarian crisis, and allows you to keep data private. Some devices will contain processing code created using cloud tools and run this locally to gather and respond to data without using an Internet connection to make a decision.
-One example of this is a smart home device such as an Apple HomePod, Amazon Alexa, or Google Home, which will listen to your voice using AI models trained in the cloud, and will 'wake up' when a certain word or phrase is spoken, and only then send your speech to the Internet for processing, keeping everything else you say private.
+One example of this is a smart home device such as an Apple HomePod, Amazon Alexa, or Google Home, which will listen to your voice using AI models trained in the cloud, but running locally on the device. These devices will 'wake up' when a certain word or phrase is spoken, and only then send your speech over the Internet for processing. The device will stop sending speech at an appropriate point such as when it detects a pause in your speech. Everything you say before waking up the device with the wake word, and everything you say after the device has stopped listening will not be sent over the internet to the device provider, and therefore will be private.
✅ Think of other scenarios where privacy is important so processing of data would be better done on the edge rather than in the cloud. As a hint - think about IoT devices with cameras or other imaging devices on them.
@@ -100,7 +100,7 @@ The faster the clock cycle, the more instructions that can be processed each sec
Microcontrollers have much lower clock speeds than desktop or laptop computers, or even most smartphones. The Wio Terminal for example has a CPU that runs at 120MHz or 120,000,000 cycles per second.
-✅ An average PC or Mac has a CPU with multiple cores running at multiple GigaHertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and see how many times faster it is than the Wio terminal.
+✅ An average PC or Mac has a CPU with multiple cores running at multiple GigaHertz, meaning the clock ticks billions of times a second. Research the clock speed of your computer and compare how many times faster it is than the Wio terminal.
Each clock cycle draws power and generates heat. The faster the ticks, the more power consumed and more heat generated. PC's have heat sinks and fans to remove heat, without which they would overheat and shut down within seconds. Microcontrollers often have neither as they run much cooler and therefore much slower. PC's run off mains power or large batteries for a few hours, microcontrollers can run for days, months, or even years off small batteries. Microcontrollers can also have cores that run at different speeds, switching to slower low power cores when the demand on the CPU is low to reduce power consumption.
@@ -112,7 +112,7 @@ Each clock cycle draws power and generates heat. The faster the ticks, the more
Investigate the Wio Terminal.
-If you are using a Wio Terminal for these lessons, see if you can find the CPU. Find the *Hardware Overview* section of the [Wio Terminal product page](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) for a picture of the internals, and see if you can see the CPU through the clear plastic window on the back.
+If you are using a Wio Terminal for these lessons, try to find the CPU. Find the *Hardware Overview* section of the [Wio Terminal product page](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) for a picture of the internals, and try to find the CPU through the clear plastic window on the back.
### Memory
@@ -150,7 +150,7 @@ Microcontrollers need input and output (I/O) connections to read data from senso
Investigate the Wio Terminal.
-If you are using a Wio Terminal for these lessons, find the GPIO pins. Find the *Pinout diagram* section of the [Wio Terminal product page](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) to see which pins are which. The Wio Terminal comes with a sticker you can mount on the back with pin numbers, so add this now if you haven't already.
+If you are using a Wio Terminal for these lessons, find the GPIO pins. Find the *Pinout diagram* section of the [Wio Terminal product page](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) to learn which pins are which. The Wio Terminal comes with a sticker you can mount on the back with pin numbers, so add this now if you haven't already.
### Physical size
@@ -198,7 +198,7 @@ There is a large ecosystem of third-party Arduino libraries that allow you to ad
Investigate the Wio Terminal.
-If you are using a Wio Terminal for these lessons, re-read the code you wrote in the last lesson. Find the `setup` and `loop` function. Monitor the serial output to see the loop function being called repeatedly. Try adding code to the `setup` function to write to the serial port and see this code is only called once each time you reboot. Try rebooting your device with the power switch on the side to see this called each time the device reboots.
+If you are using a Wio Terminal for these lessons, re-read the code you wrote in the last lesson. Find the `setup` and `loop` function. Monitor the serial output for the loop function being called repeatedly. Try adding code to the `setup` function to write to the serial port and observe that this code is only called once each time you reboot. Try rebooting your device with the power switch on the side to show this is called each time the device reboots.
## Deeper dive into single-board computers
@@ -261,6 +261,7 @@ The challenge in the last lesson was to list as many IoT devices as you can that
* Read the [Arduino getting started guide](https://www.arduino.cc/en/Guide/Introduction) to understand more about the Arduino platform.
* Read the [introduction to the Raspberry Pi 4](https://www.raspberrypi.org/products/raspberry-pi-4-model-b/) to learn more about Raspberry Pis.
+* Learn more on some of the concepts and acronyms in the [What the FAQ are CPUs, MPUs, MCUs, and GPUs article in the Electrical Engineering Journal](https://www.eejournal.com/article/what-the-faq-are-cpus-mpus-mcus-and-gpus/).
✅ Use these guides, along with the costs shown by following the links in the [hardware guide](../../../hardware.md) to decide on what hardware platform you want to use, or if you would rather use a virtual device.
diff --git a/1-getting-started/lessons/2-deeper-dive/translations/README.ar.md b/1-getting-started/lessons/2-deeper-dive/translations/README.ar.md
new file mode 100644
index 00000000..0ce7d3b1
--- /dev/null
+++ b/1-getting-started/lessons/2-deeper-dive/translations/README.ar.md
@@ -0,0 +1,295 @@
+# التعمق أكثر بإنترنت الأشياء
+
+
+
+>
+
+يتعمق هذا الدرس في بعض المفاهيم التي تم تناولها في الدرس الأخير.
+
+سنغطي في هذا الدرس:
+
+* [مكونات تطبيق إنترنت الأشياء](#components-of-an-iot-application)
+* [التعمق اكثر في المتحكم الدقيق](#deeper-dive-into-microcontrollers)
+* [التعمق اكثر في أجهزة الكمبيوتر ذات اللوحة الواحدة](#deeper-dive-into-single-board-computers)
+
+## مكونات تطبيقات إنترنت الأشياء
+
+المكونان لتطبيق إنترنت الأشياء هما الإنترنت و الشيء. لنلقِ نظرة على هذين المكونين بمزيد من التفصيل.
+
+### الشيء
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+يشير الشيء من إنترنت الأشياء إلى جهاز يمكنه التفاعل مع العالم المادي. عادةً ما تكون هذه الأجهزة أجهزة كمبيوتر صغيرة ومنخفضة السعر ، وتعمل بسرعات منخفضة وتستخدم طاقة منخفضة - على سبيل المثال ، وحدات تحكم دقيقة بسيطة بها كيلوبايت من ذاكرة الوصول العشوائي (على عكس الجيجابايت في جهاز الكمبيوتر) تعمل ببضع مئات من الميجاهرتز فقط (على عكس الجيجاهيرتز) في جهاز كمبيوتر) ، ولكن في بعض الأحيان يستهلك القليل من الطاقة بحيث يمكن تشغيلها لأسابيع أو شهور أو حتى سنوات على البطاريات.
+
+تتفاعل هذه الأجهزة مع العالم المادي ، إما باستخدام أجهزة استشعار لجمع البيانات من محيطها أو عن طريق التحكم في المخرجات أو المحركات لإجراء تغييرات فيزيائية. المثال النموذجي لذلك هو منظم الحرارة الذكي - جهاز يحتوي على مستشعر درجة الحرارة ، ووسيلة لتعيين درجة الحرارة المرغوبة مثل قرص أو شاشة تعمل باللمس ، ووصلة بنظام تدفئة أو تبريد يمكن تشغيله عند اكتشاف درجة الحرارة خارج النطاق المطلوب. يكتشف مستشعر درجة الحرارة أن الغرفة شديدة البرودة ويقوم المشغل بتشغيل التدفئة.
+
+
+
+***A simple thermostat. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
+
+هناك مجموعة كبيرة من الأشياء المختلفة التي يمكن أن تعمل كأجهزة إنترنت الأشياء ، من الأجهزة المخصصة التي تستشعر شيئًا واحدًا ، إلى الأجهزة ذات الأغراض العامة ، حتى هاتفك الذكي! يمكن للهاتف الذكي استخدام المستشعرات لاكتشاف العالم من حوله والمحركات للتفاعل مع العالم - على سبيل المثال ، باستخدام مستشعر GPS لاكتشاف موقعك ومكبر صوت لإعطائك إرشادات التنقل إلى وجهة.
+
+✅ فكر في الأنظمة الأخرى الموجودة حولك والتي تقرأ البيانات من جهاز استشعار وتستخدمها لاتخاذ القرارات. أحد الأمثلة على ذلك هو منظم الحرارة الموجود في الفرن. هل يمكنك إيجاد المزيد؟
+
+
+
+### الانترنت
+يتكون جانب الإنترنت من التطبيقات التي يمكن لجهاز إنترنت الأشياء توصيلها لإرسال البيانات واستقبالها ، بالإضافة إلى التطبيقات الأخرى التي يمكنها معالجة البيانات من جهاز إنترنت الأشياء والمساعدة في اتخاذ قرارات بشأن الطلبات التي سيتم إرسالها إلى مشغلات أجهزة إنترنت الأشياء.
+
+يتمثل أحد الإعدادات النموذجية في وجود نوع من الخدمة السحابية التي يتصل بها جهاز إنترنت الأشياء ، وتتولى هذه الخدمة السحابية أشياء مثل الأمان ، بالإضافة إلى تلقي الرسائل من جهاز إنترنت الأشياء ، وإرسال الرسائل مرة أخرى إلى الجهاز. ستتصل هذه الخدمة السحابية بعد ذلك بالتطبيقات الأخرى التي يمكنها معالجة بيانات المستشعر أو تخزينها ، أو استخدام بيانات المستشعر مع البيانات من الأنظمة الأخرى لاتخاذ القرارات.
+
+لا تتصل الأجهزة دائمًا مباشرة بالإنترنت عبر شبكة WiFi أو اتصالات سلكية. تستخدم بعض الأجهزة الشبكات المتداخلة للتحدث مع بعضها البعض عبر تقنيات مثل Bluetooth ، والاتصال عبر جهاز لوحة وصل متصل بالإنترنت.
+
+باستخدام مثال منظم الحرارة الذكي ، سيتصل منظم الحرارة باستخدام شبكة WiFi المنزلية بخدمة سحابية تعمل في السحابة. سيرسل بيانات درجة الحرارة إلى هذه الخدمة السحابية ، ومن هناك ستتم كتابتها في قاعدة بيانات بشكل يسمح لمالك المنزل بالتحقق من درجات الحرارة الحالية والسابقة باستخدام تطبيق الهاتف. ستعرف خدمة أخرى في السحابة درجة الحرارة التي يريدها صاحب المنزل ، وترسل الرسائل مرة أخرى إلى جهاز إنترنت الأشياء عبر الخدمة السحابية لإخبار نظام التدفئة بالتشغيل أو الإيقاف.
+
+
+
+***An Internet connected thermostat with mobile app control. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
+
+
+يمكن لنسخة أكثر ذكاءً استخدام الذكاء الاصطناعي في السحابة مع بيانات من مستشعرات أخرى متصلة بأجهزة إنترنت الأشياء الأخرى مثل مستشعرات الإشغال التي تكتشف الغرف المستخدمة ، بالاضافة الى البيانات مثل الطقس وحتى التقويم الخاص بك ، لاتخاذ قرارات بشأن كيفية ضبط درجة الحرارة بطريقة ذكية. على سبيل المثال ، يمكن أن يوقف التدفئة إذا كان يقرأ من التقويم الخاص بك أنك في إجازة ، أو أيقاف التدفئة على أساس كل غرفة على حدة اعتمادًا على الغرف التي تستخدمها ، والتعلم من البيانات لتكون أكثر دقة بمرور الوقت .
+
+
+
+***An Internet connected thermostat using multiple room sensors, with mobile app control, as well as intelligence from weather and calendar data. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
+
+
+✅ ما هي البيانات الأخرى التي يمكن أن تساعد في جعل منظم الحرارة المتصل بالإنترنت أكثر ذكاءً؟
+
+
+
+### إنترنت الأشياء على edge
+
+على الرغم من أن انترنت الاشياء تعني الإنترنت ، إلا أن هذه الأجهزة لا تحتاج إلى الاتصال بالإنترنت. في بعض الحالات ، يمكن للأجهزة الاتصال بأجهزة "edge" - أجهزة البوابة التي تعمل على شبكتك المحلية مما يعني أنه يمكنك معالجة البيانات دون إجراء مكالمة عبر الإنترنت. يمكن أن يكون هذا أسرع عندما يكون لديك الكثير من البيانات أو يكون اتصال الإنترنت بطيئًا ، فهو يسمح لك بالعمل دون اتصال بالإنترنت حيث يكون الاتصال بالإنترنت غير ممكن كما هو الحال على متن سفينة أو في منطقة كوارث عند الاستجابة لأزمة إنسانية ، ويسمح لك الحفاظ على خصوصية البيانات. ستحتوي بعض الأجهزة على كود معالجة تم إنشاؤه باستخدام أدوات السحابة وتشغيله محليًا لجمع البيانات والاستجابة لها دون استخدام اتصال بالإنترنت لاتخاذ قرار.
+
+أحد الأمثلة على ذلك هو جهاز منزلي ذكي مثل Apple HomePod أو Amazon Alexa أو Google Home ، والتي ستستمع إلى صوتك باستخدام نماذج AI المدربة في السحابة ، ولكنها تعمل محليًا على الجهاز. سوف "تستيقظ" هذه الأجهزة عند نطق كلمة أو عبارة معينة ، وعندها فقط ترسل كلامك عبر الإنترنت لمعالجته. سيتوقف الجهاز عن إرسال الكلام في نقطة مناسبة مثل عندما يكتشف توقفًا مؤقتًا في كلامك. كل ما تقوله قبل إيقاظ الجهاز بكلمة التنبيه ، وكل ما تقوله بعد توقف الجهاز عن الاستماع لن يتم إرساله عبر الإنترنت إلى مزود الجهاز ، وبالتالي سيكون خاصًا.
+
+✅ فكر في سيناريوهات أخرى حيث تكون الخصوصية مهمة ، لذا من الأفضل إجراء معالجة البيانات على edge بدلاً من السحابة. كتلميح - فكر في أجهزة إنترنت الأشياء المزودة بكاميرات أو أجهزة تصوير أخرى عليها.
+
+
+
+### أمن إنترنت الأشياء
+
+مع أي اتصال بالإنترنت ، يعد الأمان أحد الاعتبارات المهمة. هناك مزحة قديمة مفادها أن "S in IoT تعني الأمان" - لا يوجد حرف "S" في إنترنت الأشياء ، مما يعني أنه ليس آمنًا.
+
+تتصل أجهزة إنترنت الأشياء بالخدمة السحابية ، وبالتالي فهي آمنة فقط مثل تلك الخدمة السحابية - إذا كانت الخدمة السحابية الخاصة بك تسمح لأي جهاز بالاتصال ، فيمكن إرسال البيانات الضارة ، أو يمكن أن تحدث هجمات الفيروسات. يمكن أن يكون لهذا عواقب حقيقية للغاية حيث تتفاعل أجهزة إنترنت الأشياء وتتحكم في الأجهزة الأخرى. على سبيل المثال ، ملف Stuxnet worm التلاعب بالصمامات في أجهزة الطرد المركزي لإتلافها. استفاد القراصنة أيضًا من ضعف الأمن للوصول إلى أجهزة مراقبة الأطفال وأجهزة المراقبة المنزلية الأخرى.
+
+> 💁 في بعض الأحيان ، تعمل أجهزة إنترنت الأشياء والأجهزة الطرفية على شبكة معزولة تمامًا عن الإنترنت للحفاظ على خصوصية البيانات وأمانها. هذا هو المعروف باسم air-gapping.
+
+
+
+
+## التعمق أكثر في المتحكم الدقيق
+
+في الدرس الأخير ، قدمنا المتحكمات الدقيقة. دعونا الآن نلقي نظرة أعمق عليهم.
+
+
+
+### وحدة المعالجة المركزية
+
+وحدة المعالجة المركزية هي "عقل" المتحكم الدقيق. إنه المعالج الذي يقوم بتشغيل الكود الخاص بك ويمكنه إرسال البيانات واستقبال البيانات من أي أجهزة متصلة. يمكن أن تحتوي وحدات المعالجة المركزية (CPU) على نواة واحدة أو أكثر - وحدة معالجة مركزية واحدة أو أكثر يمكنها العمل معًا لتشغيل التعليمات البرمجية الخاصة بك.
+
+تعتمد وحدات المعالجة المركزية (CPU) على ساعة لتحديد عدة ملايين أو مليارات المرات في الثانية. تقوم كل علامة أو دورة بمزامنة الإجراءات التي يمكن أن تتخذها وحدة المعالجة المركزية. مع كل علامة ، يمكن لوحدة المعالجة المركزية تنفيذ تعليمات من أحد البرامج ، مثل استرداد البيانات من جهاز خارجي أو إجراء عملية حسابية. تسمح هذه الدورة المنتظمة بإكمال جميع الإجراءات قبل معالجة التعليمات التالية.
+
+كلما كانت دورة الساعة أسرع ، زادت التعليمات التي يمكن معالجتها كل ثانية ، وبالتالي زادت سرعة وحدة المعالجة المركزية. يتم قياس سرعات وحدة المعالجة المركزية بـHertz (Hz) ،وحدة قياسية حيث 1 هرتز يعني دورة واحدة أو علامة ساعة في الثانية.
+
+> 🎓 سرعات وحدة المعالجة المركزية تُعطى غالبًا بالميغاهرتز أو الجيجاهرتز. 1 ميجا هرتز هو 1 مليون هرتز ، 1 جيجا هرتز 1 مليار هرتز.
+
+> 💁 تنفذ وحدات المعالجة المركزية البرامج باستخدام امتداد fetch-decode-execute cycle) . لكل علامة ساعة ، ستقوم وحدة المعالجة المركزية بجلب التعليمات التالية من الذاكرة ، وفك تشفيرها ، ثم تنفيذها مثل استخدام وحدة المنطق الحسابي (ALU) لإضافة رقمين. ستستغرق بعض عمليات التنفيذ عدة علامات للتشغيل ، لذا ستعمل الدورة التالية عند العلامة التالية بعد اكتمال التعليمات.
+
+
+
+***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
+
+تتميز المتحكمات الدقيقة بسرعات ساعة أقل بكثير من أجهزة الكمبيوتر المكتبية أو المحمولة ، أو حتى معظم الهواتف الذكية. تحتوي Wio Terminal على سبيل المثال على وحدة معالجة مركزية تعمل بسرعة 120 ميجاهرتز أو 120.000.000 دورة في الثانية.
+
+✅ يحتوي الكمبيوتر الشخصي العادي أو جهاز Mac على وحدة معالجة مركزية متعددة النوى تعمل بسرعة جيجاهرتز متعددة ، مما يعني أن الساعة تدق مليارات المرات في الثانية. ابحث في سرعة الساعة لجهاز الكمبيوتر الخاص بك وقارن عدد المرات التي تكون فيها أسرع من محطة Wio.
+
+تقوم كل دورة على مدار الساعة بسحب الطاقة وتوليد الحرارة. كلما زادت سرعة ضربات الساعة ، زادت الطاقة المستهلكة والمزيد من الحرارة المتولدة. تحتوي أجهزة الكمبيوتر على أحواض حرارية ومراوح لإزالة الحرارة ، والتي بدونها سوف ترتفع درجة حرارتها وتغلق في غضون ثوانٍ. غالبًا ما لا تحتوي المتحكمات الدقيقة على أي منهما لأنها تعمل بشكل أكثر برودة وبالتالي فهي أبطأ بكثير. نفد التيار الكهربائي للكمبيوتر الشخصي أو البطاريات الكبيرة لبضع ساعات ، يمكن أن تعمل وحدات التحكم الدقيقة لأيام أو شهور أو حتى سنوات من البطاريات الصغيرة. يمكن أن تحتوي وحدات التحكم الدقيقة أيضًا على نوى تعمل بسرعات مختلفة ، وتتحول إلى نوى منخفضة الطاقة أبطأ عندما يكون الطلب على وحدة المعالجة المركزية منخفضًا لتقليل استهلاك الطاقة.
+
+> 💁 تتبنى بعض أجهزة الكمبيوتر الشخصية وأجهزة Mac نفس المزيج من النوى السريعة عالية الطاقة ونواتج الطاقة المنخفضة الأبطأ ، مع التبديل لتوفير البطارية. على سبيل المثال ، يمكن لشريحة M1 في أحدث أجهزة كمبيوتر Apple المحمولة التبديل بين 4 نوى أداء و 4 نوى كفاءة لتحسين عمر البطارية أو سرعتها اعتمادًا على المهمة التي يتم تشغيلها.
+
+✅ قم ببعض البحث: اقرأ عن وحدات المعالجة المركزية في Wikipedia CPU article
+
+#### مهمة
+
+تحقق من محطة Wio.
+
+إذا كنت تستخدم Wio Terminal لهذه الدروس ، فحاول العثور على وحدة المعالجة المركزية. ابحث عن قسم نظرة عامة على الأجهزة في Wio Terminal product page للحصول على صورة للأجزاء الداخلية ، وحاول العثور على وحدة المعالجة المركزية من خلال النافذة البلاستيكية الشفافة الموجودة في الخلف.
+
+### ذاكرة
+
+عادة ما تحتوي المتحكمات الدقيقة على نوعين من الذاكرة - ذاكرة البرنامج وذاكرة الوصول العشوائي (RAM).
+
+ذاكرة البرنامج غير متطايرة ، مما يعني أن كل ما هو مكتوب عليها يبقى عند عدم وجود طاقة للجهاز. هذه هي الذاكرة التي تخزن كود البرنامج الخاص بك.
+
+ذاكرة الوصول العشوائي هي الذاكرة التي يستخدمها البرنامج للتشغيل ، وتحتوي على متغيرات يخصصها برنامجك وبيانات تم جمعها من الأجهزة الطرفية. ذاكرة الوصول العشوائي متقلبة ، عندما تنقطع الطاقة تضيع المحتويات ، مما يؤدي إلى إعادة ضبط البرنامج بشكل فعال.
+
+> 🎓 ذاكرة البرنامج تخزن الكود الخاص بك وتبقى عند انقطاع التيار الكهربائي.
+
+> 🎓 تُستخدم ذاكرة الوصول العشوائي (RAM) لتشغيل برنامجك وتتم إعادة تعيينها عند انقطاع التيار الكهربائي
+
+كما هو الحال مع وحدة المعالجة المركزية ، فإن الذاكرة الموجودة على وحدة التحكم الدقيقة هي أصغر من أجهزة الكمبيوتر الشخصي أو جهاز Mac. قد يحتوي جهاز الكمبيوتر العادي على 8 جيجا بايت من ذاكرة الوصول العشوائي ، أو 8000 مليون بايت ، مع مساحة كافية لكل بايت لتخزين حرف واحد أو رقم من 0-255. سيكون للمتحكم الدقيق فقط كيلو بايت (KB) من ذاكرة الوصول العشوائي ، مع كيلو بايت يبلغ 1000 بايت. تحتوي محطة Wio المذكورة أعلاه على 192 كيلو بايت من ذاكرة الوصول العشوائي ، أو 192000 بايت - أي أكثر من 40000 مرة أقل من متوسط الكمبيوتر الشخصي!
+
+يوضح الرسم البياني أدناه اختلاف الحجم النسبي بين 192 كيلو بايت و 8 جيجابايت - تمثل النقطة الصغيرة في المنتصف 192 كيلو بايت.
+
+
+
+تخزين البرنامج أصغر أيضًا من جهاز الكمبيوتر. قد يحتوي جهاز الكمبيوتر العادي على محرك أقراص ثابت سعة 500 جيجابايت لتخزين البرنامج ، في حين أن وحدة التحكم الدقيقة قد تحتوي على كيلوبايت فقط أو ربما بضعة ميغا بايت (MB) من التخزين (1 ميجابايت تساوي 1،000 كيلو بايت ، أو 1000000 بايت). محطة Wio لديها 4 ميغا بايت من تخزين البرنامج.
+
+✅ قم بإجراء بحث بسيط: ما مقدار ذاكرة الوصول العشوائي ومساحة التخزين التي يمتلكها الكمبيوتر الذي تستخدمه لقراءة هذا؟ كيف يقارن هذا بالمتحكم الدقيق؟
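+
+كمثال توضيحي بسيط على هذه المقارنة (بافتراض جهاز كمبيوتر يحتوي على 8 جيجابايت من ذاكرة الوصول العشوائي، كما ورد أعلاه)، يمكن حساب الفرق بسطرين من Python:
+
+```python
+# افتراض: ذاكرة الكمبيوتر 8 جيجابايت، وذاكرة Wio Terminal هي 192 كيلوبايت
+pc_ram_bytes = 8_000_000_000
+wio_ram_bytes = 192_000
+
+# النتيجة حوالي 41666 - أي أكثر من 40000 مرة
+print(pc_ram_bytes / wio_ram_bytes)
+```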
+
+### الإدخال / الإخراج
+
+تحتاج المتحكمات الدقيقة إلى توصيلات الإدخال والإخراج (I / O) لقراءة البيانات من أجهزة الاستشعار وإرسال إشارات التحكم إلى المشغلات. عادة ما تحتوي على عدد من دبابيس الإدخال / الإخراج للأغراض العامة (GPIO). يمكن تكوين هذه المسامير في البرنامج بحيث يتم إدخالها (أي أنها تتلقى إشارة) ، أو إخراج (ترسل إشارة).
+
+🧠⬅️ تُستخدم دبابيس الإدخال لقراءة القيم من أجهزة الاستشعار
+
+🧠➡️ ترسل دبابيس الإخراج التعليمات إلى المشغلات
+
+✅ سوف تتعلم المزيد عن هذا في درس لاحق.
+
+#### مهمة
+
+تحقق من محطة Wio.
+
+إذا كنت تستخدم Wio Terminal لهذه الدروس ، فابحث عن دبابيس GPIO. ابحث عن قسم Pinout diagram في ملف Wio Terminal product page لمعرفة الدبابيس التي. تأتي Wio Terminal مع ملصق يمكنك تثبيته على ظهره بأرقام الدبوس ، لذا أضف هذا الآن إذا لم تكن قد قمت بذلك بالفعل.
+
+### حجم فيزيائي
+
+عادة ما تكون المتحكمات الدقيقة صغيرة الحجم ، وأصغرها Freescale Kinetis KL03 MCU صغير بما يكفي ليلائم غمازة كرة الجولف . يمكن لوحدة المعالجة المركزية في جهاز الكمبيوتر فقط قياس 40 مم × 40 مم ، وهذا لا يشمل المشتتات الحرارية والمراوح اللازمة لضمان تشغيل وحدة المعالجة المركزية لأكثر من بضع ثوانٍ دون ارتفاع درجة الحرارة ، وهي أكبر بكثير من وحدة تحكم دقيقة كاملة. إن مجموعة مطور Wio الطرفية المزودة بوحدة تحكم دقيقة وحالة وشاشة ومجموعة من التوصيلات والمكونات ليست أكبر بكثير من وحدة المعالجة المركزية Intel i9 ، وهي أصغر بكثير من وحدة المعالجة المركزية مع المشتت الحراري والمروحة!
+
+| الجهاز | الحجم |
+| ------------------------------- | --------------------- |
+| Freescale Kinetis KL03 | 1.6 مم × 2 مم × 1 مم |
+| محطة Wio | 72 مم × 57 مم × 12 مم |
+| وحدة المعالجة المركزية Intel i9 ، المشتت الحراري والمروحة | 136 مم × 145 مم × 103 مم |
+
+
+### أنظمة التشغيل
+
+نظرًا لانخفاض سرعتها وحجم الذاكرة ، لا تقوم وحدات التحكم الدقيقة بتشغيل نظام تشغيل (OS) بمعنى سطح المكتب . يحتاج نظام التشغيل الذي يعمل على تشغيل الكمبيوتر (Windows أو Linux أو macOS) إلى قدر كبير من الذاكرة وقوة المعالجة لتشغيل المهام غير الضرورية تمامًا لوحدة التحكم الدقيقة. تذكر أن المتحكمات الدقيقة عادة ما تكون مبرمجة لأداء مهمة واحدة أو أكثر من المهام المحددة للغاية ، على عكس أجهزة الكمبيوتر ذات الأغراض العامة مثل الكمبيوتر الشخصي أو جهاز Mac الذي يحتاج إلى دعم واجهة المستخدم أو تشغيل الموسيقى أو الأفلام أو توفير أدوات لكتابة المستندات أو التعليمات البرمجية أو ممارسة الألعاب أو تصفح الانترنت.
+
+لبرمجة متحكم دقيق بدون نظام تشغيل ، فأنت بحاجة إلى بعض الأدوات للسماح لك ببناء الكود الخاص بك بطريقة يمكن للميكروكونترولر تشغيلها ، باستخدام واجهات برمجة التطبيقات التي يمكنها التحدث إلى أي أجهزة طرفية. يختلف كل متحكم عن الآخر ، لذا فإن الشركات المصنعة تدعم عادةً الأطر القياسية التي تسمح لك باتباع "وصفة" قياسية لبناء الكود الخاص بك وتشغيله على أي متحكم يدعم هذا الإطار.
+
+يمكنك برمجة وحدات التحكم الدقيقة باستخدام نظام تشغيل - يشار إليه غالبًا باسم نظام التشغيل في الوقت الفعلي (RTOS) ، حيث تم تصميمه للتعامل مع إرسال البيانات من وإلى الأجهزة الطرفية في الوقت الفعلي. تتميز أنظمة التشغيل هذه بأنها خفيفة الوزن للغاية وتوفر ميزات مثل:
+
+* خيوط متعددة ، مما يسمح للكود الخاص بك بتشغيل أكثر من كتلة واحدة من التعليمات البرمجية في نفس الوقت ، إما على مراكز متعددة أو بالتناوب على نواة واحدة
+* الشبكات للسماح بالاتصال عبر الإنترنت بشكل آمن
+* مكونات واجهة المستخدم الرسومية (GUI) لبناء واجهات المستخدم (UI) على الأجهزة التي تحتوي على شاشات.
+
+
+✅ اقرأ عن بعض أنظمة RTOS المختلفة:
+Azure RTOS , FreeRTOS , Zephyr
+
+
+
+#### Arduino
+
+
+
+Arduino
+من المحتمل أن يكون إطار عمل وحدة التحكم الدقيقة الأكثر شيوعًا ، خاصة بين الطلاب والهواة والصناع. Arduino عبارة عن منصة إلكترونية مفتوحة المصدر تجمع بين البرامج والأجهزة. يمكنك شراء لوحات Arduino المتوافقة من Arduino نفسها أو من الشركات المصنعة الأخرى ، ثم كتابة التعليمات البرمجية باستخدام إطار عمل Arduino.
+
+يتم ترميز لوحات Arduino في C أو C ++. يتيح استخدام C / C ++ تجميع التعليمات البرمجية الخاصة بك بشكل صغير جدًا وتشغيلها بسرعة ، وهو شيء مطلوب على جهاز مقيد مثل متحكم دقيق. يُشار إلى جوهر تطبيق Arduino بالرسم وهو رمز C / C ++ مع وظيفتين - "الإعداد" و "الحلقة". عند بدء تشغيل اللوحة ، سيعمل كود إطار عمل Arduino على تشغيل وظيفة "الإعداد" مرة واحدة ، ثم يقوم بتشغيل وظيفة "الحلقة" مرارًا وتكرارًا ، وتشغيلها باستمرار حتى يتم إيقاف تشغيل الطاقة.
+
+ستكتب رمز الإعداد الخاص بك في وظيفة "الإعداد" ، مثل الاتصال بشبكة WiFi والخدمات السحابية أو تهيئة المسامير للإدخال والإخراج. سيحتوي رمز الحلقة الخاص بك بعد ذلك على رمز معالجة ، مثل القراءة من جهاز استشعار وإرسال القيمة إلى السحابة. يمكنك عادةً تضمين تأخير في كل حلقة ، على سبيل المثال ، إذا كنت تريد فقط إرسال بيانات المستشعر كل 10 ثوانٍ ، فستضيف تأخيرًا لمدة 10 ثوانٍ في نهاية الحلقة حتى يتمكن المتحكم الدقيق من السكون ، وتوفير الطاقة ، ثم التشغيل الحلقة مرة أخرى عند الحاجة بعد 10 ثوانٍ.
+
+
+
+✅ تُعرف بنية البرنامج هذه باسم حلقة الحدث أو حلقة الرسالة . تستخدم العديد من التطبيقات هذا تحت الغطاء وهو المعيار لمعظم تطبيقات سطح المكتب التي تعمل على أنظمة تشغيل مثل Windows أو macOS أو Linux. "الحلقة" تستمع إلى الرسائل الواردة من مكونات واجهة المستخدم مثل الأزرار أو الأجهزة مثل لوحة المفاتيح وتستجيب لها. يمكنك قراءة المزيد في هذا article on the event loop
+
+يوفر Arduino مكتبات قياسية للتفاعل مع وحدات التحكم الدقيقة ودبابيس الإدخال / الإخراج ، مع تطبيقات مختلفة أسفل الغطاء للتشغيل على وحدات تحكم دقيقة مختلفة. على سبيل المثال ، ملف `delay` function سيوقف البرنامج مؤقتًا لفترة معينة من الوقت ، فإن `digitalRead` function سيقرأ قيمة "HIGH" أو "LOW" من رقم التعريف الشخصي المحدد ، بغض النظر عن اللوحة التي يتم تشغيل الكود عليها. تعني هذه المكتبات القياسية أن كود Arduino المكتوب للوحة واحدة يمكن إعادة تجميعه لأي لوحة Arduino أخرى وسيتم تشغيله ، على افتراض أن الدبابيس هي نفسها وأن اللوحات تدعم نفس الميزات.
+
+يوجد نظام بيئي كبير من مكتبات Arduino الخارجية يسمح لك بإضافة ميزات إضافية إلى مشاريع Arduino الخاصة بك ، مثل استخدام المستشعرات والمشغلات أو الاتصال بخدمات إنترنت الأشياء السحابية.
+
+##### مهمة
+
+تحقق من محطة Wio.
+
+إذا كنت تستخدم Wio Terminal لهذه الدروس ، فأعد قراءة الكود الذي كتبته في الدرس الأخير. ابحث عن وظيفتي "الإعداد" و "الحلقة". راقب الإخراج التسلسلي لوظيفة الحلقة التي يتم استدعاؤها بشكل متكرر. حاول إضافة رمز إلى وظيفة "الإعداد" للكتابة إلى المنفذ التسلسلي ولاحظ أن هذا الرمز يتم استدعاؤه مرة واحدة فقط في كل مرة تقوم فيها بإعادة التشغيل. حاول إعادة تشغيل جهازك باستخدام مفتاح الطاقة الموجود على الجانب لإظهار أن هذا يسمى في كل مرة يتم فيها إعادة تشغيل الجهاز.
+
+
+## التعمق أكثر في أجهزة الكمبيوتر ذات اللوحة الواحدة
+
+في الدرس الأخير ، قدمنا أجهزة كمبيوتر أحادية اللوحة. دعونا الآن نلقي نظرة أعمق عليهم.
+
+### Raspberry Pi
+
+
+
+Raspberry Pi Foundation هي مؤسسة خيرية من المملكة المتحدة تأسست عام 2009 للترويج لدراسة علوم الكمبيوتر ، وخاصة على مستوى المدرسة. كجزء من هذه المهمة ، قاموا بتطوير جهاز كمبيوتر ذو لوحة واحدة ، يسمى Raspberry Pi. يتوفر Raspberry Pi حاليًا في 3 متغيرات - إصدار بالحجم الكامل ، و Pi Zero الأصغر ، ووحدة حسابية يمكن دمجها في جهاز IoT النهائي الخاص بك.
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+أحدث نسخة من Raspberry Pi بالحجم الكامل هي Raspberry Pi 4B. يحتوي هذا على وحدة معالجة مركزية رباعية النواة (4 نواة) تعمل بسرعة 1.5 جيجاهرتز أو 2 أو 4 أو 8 جيجابايت من ذاكرة الوصول العشوائي و Gigabit ethernet و WiFi ومنفذين HDMI يدعمان شاشات 4K ومنفذ إخراج الصوت والفيديو المركب ومنافذ USB (2 USB 2.0 ، 2 USB 3.0) ، 40 GPIO ، موصل كاميرا لوحدة كاميرا Raspberry Pi ، وفتحة بطاقة SD. كل هذا على لوحة مقاس 88 مم × 58 مم × 19.5 مم ويتم تشغيلها بواسطة مصدر طاقة 3 أمبير USB-C. يبدأ سعرها من 35 دولارًا أمريكيًا ، وهو أرخص بكثير من جهاز الكمبيوتر الشخصي أو جهاز Mac.
+
+> 💁 يوجد أيضًا جهاز كمبيوتر Pi400 الكل في واحد مع Pi4 المدمج في لوحة المفاتيح.
+
+
+
+Pi Zero أصغر بكثير ، مع طاقة أقل. يحتوي على وحدة معالجة مركزية أحادية النواة بسرعة 1 جيجاهرتز ، وذاكرة وصول عشوائي (RAM) سعة 512 ميجابايت ، و WiFi (في طراز Zero W) ، ومنفذ HDMI واحد ، ومنفذ micro-USB ، و 40 دبوس GPIO ، وموصل كاميرا لوحدة كاميرا Raspberry Pi ، وبطاقة SD فتحة. يقيس 65 مم × 30 مم × 5 مم ، ولا يستهلك سوى القليل من الطاقة. سعر Zero 5 دولارات أمريكية ، مع إصدار W مع شبكة WiFi بقيمة 10 دولارات أمريكية.
+
+> 🎓 وحدات المعالجة المركزية في كلاهما هي معالجات ARM ، على عكس معالجات Intel / AMD x86 أو x64 التي تجدها في معظم أجهزة الكمبيوتر الشخصية وأجهزة Mac. هذه تشبه وحدات المعالجة المركزية التي تجدها في بعض المتحكمات الدقيقة ، وكذلك جميع الهواتف المحمولة تقريبًا ، ومايكروسوفت سيرفس إكس ، وأجهزة آبل ماك الجديدة القائمة على السيليكون.
+
+تعمل جميع متغيرات Raspberry Pi على إصدار من Debian Linux يسمى Raspberry Pi OS. يتوفر هذا كإصدار خفيف بدون سطح مكتب ، وهو مثالي لمشروعات "بلا رأس" حيث لا تحتاج إلى شاشة ، أو نسخة كاملة مع بيئة سطح مكتب كاملة ، مع متصفح الويب ، والتطبيقات المكتبية ، وأدوات الترميز والألعاب. نظرًا لأن نظام التشغيل هو إصدار من Debian Linux ، يمكنك تثبيت أي تطبيق أو أداة تعمل على Debian ومصممة لمعالج ARM داخل Pi.
+
+#### مهمة
+
+تحقق من Raspberry Pi.
+
+إذا كنت تستخدم Raspberry Pi لهذه الدروس ، فاقرأ عن مكونات الأجهزة المختلفة الموجودة على اللوحة.
+
+* يمكنك العثور على تفاصيل حول المعالجات المستخدمة في Raspberry Pi hardware documentation page اقرأ عن المعالج المستخدم في Pi الذي تستخدمه.
+* حدد موقع دبابيس GPIO. اقرأ المزيد عنها في Raspberry Pi GPIO documentation . استخدم GPIO Pin Usage guide لتحديد الدبابيس المختلفة على Pi الخاص بك.
+
+### برمجة أجهزة الكمبيوتر ذات اللوحة الواحدة
+
+أجهزة الكمبيوتر أحادية اللوحة هي أجهزة كمبيوتر كاملة تعمل بنظام تشغيل كامل. هذا يعني أن هناك مجموعة واسعة من لغات البرمجة والأطر والأدوات التي يمكنك استخدامها لترميزها ، على عكس المتحكمات الدقيقة التي تعتمد على دعم اللوحة في أطر مثل Arduino. تحتوي معظم لغات البرمجة على مكتبات يمكنها الوصول إلى دبابيس GPIO لإرسال واستقبال البيانات من أجهزة الاستشعار والمشغلات.
+
+✅ ما هي لغات البرمجة التي تعرفها؟ هل هم مدعومون على لينكس؟
+
+لغة البرمجة الأكثر شيوعًا لبناء تطبيقات إنترنت الأشياء على Raspberry Pi هي Python. يوجد نظام بيئي ضخم للأجهزة المصممة لـ Pi ، وكلها تقريبًا تشتمل على الكود ذي الصلة اللازم لاستخدامها كمكتبات Python. تعتمد بعض هذه الأنظمة البيئية على "القبعات" - وهذا ما يسمى لأنها تجلس على قمة Pi مثل القبعة وتتصل بمقبس كبير إلى 40 دبوس GPIO. توفر هذه القبعات إمكانات إضافية ، مثل الشاشات أو المستشعرات أو السيارات التي يتم التحكم فيها عن بُعد أو المحولات للسماح لك بتوصيل المستشعرات بكابلات قياسية
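+
+كمثال توضيحي بسيط (بافتراض استخدام مكتبة gpiozero المتوفرة عادةً مع Raspberry Pi OS، ووجود LED موصول بالدبوس GPIO 17 - وهذا مجرد افتراض للتوضيح وليس جزءًا من الدرس)، يوضح المقتطف التالي مدى سهولة التحكم في دبابيس GPIO باستخدام Python:
+
+```python
+# افتراض: مكتبة gpiozero مثبتة، و LED موصول بالدبوس GPIO 17
+from time import sleep
+
+from gpiozero import LED
+
+led = LED(17)   # تهيئة الدبوس 17 كمخرج رقمي
+
+led.on()        # تشغيل الـ LED
+sleep(2)
+led.off()       # إيقافه بعد ثانيتين
+```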
+
+### استخدام أجهزة الكمبيوتر أحادية اللوحة في عمليات نشر إنترنت الأشياء الاحترافية
+
+تُستخدم أجهزة الكمبيوتر أحادية اللوحة في عمليات نشر إنترنت الأشياء الاحترافية ، وليس فقط كمجموعات للمطورين. يمكن أن توفر طريقة قوية للتحكم في الأجهزة وتشغيل المهام المعقدة مثل تشغيل نماذج التعلم الآلي. على سبيل المثال ، Raspberry Pi 4 compute module توفر كل قوة Raspberry Pi 4 ولكن في عامل شكل مدمج وأرخص بدون معظم المنافذ ، مصمم ليتم تثبيته في الأجهزة المخصصة.
+
+---
+
+## 🚀 تحدي
+
+كان التحدي في الدرس الأخير هو سرد أكبر عدد ممكن من أجهزة إنترنت الأشياء الموجودة في منزلك أو مدرستك أو مكان عملك. لكل جهاز في هذه القائمة ، هل تعتقد أنه مبني على وحدات تحكم دقيقة أو أجهزة كمبيوتر أحادية اللوحة ، أو حتى مزيج من الاثنين؟
+
+## مسابقة ما بعد المحاضرة
+
+[اختبار ما بعد المحاضرة](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/4)
+
+## مراجعة ودراسة ذاتية
+
+* اقرأ [Arduino getting started guide](https://www.arduino.cc/en/Guide/Introduction) لفهم المزيد حول نظام Arduino الأساسي.
+* اقرأ [introduction to the Raspberry Pi 4](https://www.raspberrypi.org/products/raspberry-pi-4-model-b/) لمعرفة المزيد عن Raspberry Pi.
+
+✅ استخدم هذه الأدلة ، جنبًا إلى جنب مع التكاليف الموضحة باتباع الروابط الموجودة في ملف [hardware guide](../../../hardware.md) لتحديد النظام الأساسي للأجهزة الذي تريد استخدامه ، أو إذا كنت تفضل استخدام جهاز افتراضي.
+
+## واجب
+
+[قارن بين المتحكمات الدقيقة وأجهزة الكمبيوتر أحادية اللوحة](assignment.ar.md)
+
\ No newline at end of file
diff --git a/1-getting-started/lessons/2-deeper-dive/translations/README.bn.md b/1-getting-started/lessons/2-deeper-dive/translations/README.bn.md
new file mode 100644
index 00000000..222036f2
--- /dev/null
+++ b/1-getting-started/lessons/2-deeper-dive/translations/README.bn.md
@@ -0,0 +1,264 @@
+# IoT এর আরো গভীরে
+
+
+
+## লেকচার পূর্ববর্তী কুইজ
+
+[লেকচার পূর্ববর্তী কুইজ](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/3)
+
+## সূচনা
+
+এই পাঠে আমরা আগের পাঠে আলোচিত কিছু বেসিক ধারণার আরও গভীরে যাব।
+
+এই পাঠে আমরা কভার করব:
+
+* [আইওটি উপাদানসমূহ](#আইওটি-উপাদানসমূহ)
+* [মাইক্রোকন্ট্রোলারের আরো গভীরে](#মাইক্রোকন্ট্রোলারের-আরো-গভীরে)
+* [সিংগেল বোর্ড কম্পিউটারের আরো গভীরে](#সিংগেল-বোর্ড-কম্পিউটারের-আরো-গভীরে)
+
+## আইওটি উপাদানসমূহ
+
+আইওটি অ্যাপ্লিকেশনের দুটি উপাদান হলো *ইন্টারনেট* এবং *থিংস* । এই দুটি উপাদানকে আরও কিছুটা বিস্তারিতভাবে দেখা যাক।
+
+### থিংস
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+ **থিংস** বলতে আইওটির এই অংশটি এমন একটি ডিভাইসকে বোঝায় যা চারপাশের জগতের সাথে যোগাযোগ করতে পারে। এই ডিভাইসগুলি সাধারণত ছোট, কম দামের কম্পিউটার, কম গতিতে চলমান এবং কম শক্তি ব্যবহার করে। উদাহরণস্বরূপ সাধারণ মাইক্রোকন্ট্রোলারগুলি কিলোবাইট র্যামের (অথচ একটি পিসিতে তা গিগাবাইটের) চালিত হয় মাত্র কয়েক শতাধিক মেগাহার্টজ (অথচ একটি পিসিতে তা গিগাহার্টজের)। তবে কখনও কখনও এত অল্প শক্তি ব্যবহার করে তারা ব্যাটারিতে সপ্তাহ, মাস বা কয়েক বছর ধরে চলতে পারে।
+
+এই যন্ত্রগুলো আমাদের চারপাশের পৃথিবীর সাথে সংযুক্ত থাকে; হয় সেন্সর ব্যবহার করে তথ্য সংগ্রহ করে অথবা একচুয়েটরের আউটপুট নিয়ন্ত্রণ করে কোন কাজ করার মাধ্যমে। এর সাধারণ একটি উদাহরণ হল স্মার্ট থার্মোস্ট্যাট -এমন একটি ডিভাইস যার মধ্যে তাপমাত্রা সেন্সর থাকে। এছাড়াও এতে থাকে একটি পছন্দসই তাপমাত্রা সেট করার উপায় যেমন ডায়াল বা টাচস্ক্রিন ব্যবহার করে এবং একটি তাপীকরণ বা শীতলকরণ ব্যবস্থার সাথে সংযুক্ত থাকে। ব্যবহারকারীর নির্ধারিত সীমার বাইরে গেলেই এই যন্ত্রগুলো চালু হয় । এখানে উদাহরণস্বরূপ, তাপমাত্রা সেন্সর সনাক্ত করে যে ঘরটি খুব শীতল এবং একটি একচুয়েটর তখন হিটিং চালু করে।
+
+
+
+***একটি সাধারণ থার্মোস্ট্যাট Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß - all from the [Noun Project](https://thenounproject.com)***
+
+বিভিন্ন জিনিস রয়েছে যা আইওটি ডিভাইস হিসাবে কাজ করতে পারে, সংবেদনশীল ডেডিকেটেড হার্ডওয়্যার থেকে শুরু করে , জেনারেল পারপাস ডিভাইস এমনকি আমাদের স্মার্টফোন পর্যন্ত! একটি স্মার্টফোন চারপাশের বিভিন্ন তথ্য সংগ্রহের জন্য সেন্সর ব্যবহার করে এবং বাস্তব জগতের সাথে যোগাযোগ করে একচুয়েটর - উদাহরণস্বরূপ আমাদের অবস্থান সনাক্ত করতে জিপিএস সেন্সর এবং কোন গন্তব্যে আমাদেরকে নির্দেশনা দেওয়ার জন্য স্পিকার রয়েছে।
+
+✅ আমাদের চারপাশের অন্যান্য সিস্টেমগুলির কথা চিন্তা করি যা সেন্সর থেকে ডেটা সংগ্রহ করে এবং সিদ্ধান্ত নিতে তা ব্যবহার করে। একটি উদাহরণ হতে পারে, ওভেনের উপর রাখা থার্মোস্ট্যাট। চারপাশে আরও কিছু কী খুঁজে পাওয়া যাবে ?
+
+### ইন্টারনেট
+
+আইওটি অ্যাপ্লিকেশনটির **ইন্টারনেট** অংশে বোঝান হয় এমন সব অ্যাপ্লিকেশন যার সাথে আইওটি ডিভাইসে সংযুক্ত থেকে ডেটা প্রেরণ এবং গ্রহণ করতে পারে। পাশাপাশি অন্যান্য অ্যাপ্লিকেশনগুলিও এর অংশ যা আইওটি ডিভাইস থেকে প্রাপ্ত ডেটা বিশ্লেষণ করতে পারে এবং আইওটি ডিভাইস একচুয়েটরকে কী কী নির্দেশ পাঠাতে হবে সেই সিদ্ধান্ত নেয়।
+
+একটি সাধারণ সেটআপে আইওটি ডিভাইসটি সংযুক্ত হওয়ার সাথে কিছু ধরণের ক্লাউড সেবা থাকবে এবং এই ক্লাউড পরিষেবাগুলো সুরক্ষা, আইওটি ডিভাইস থেকে বার্তা গ্রহণ এবং ডিভাইসে বার্তা প্রেরণের মতো বিষয়গুলি পরিচালনা করে। এই ক্লাউড সার্ভিসটি তখন এমন অন্যান্য অ্যাপ্লিকেশনগুলির সাথে সংযুক্ত হবে যা সেন্সর ডেটা প্রক্রিয়া করতে বা স্টোর করতে পারে। এছাড়াও সিদ্ধান্ত নিতে অন্যান্য যেকোন সিস্টেমের ডেটার সেন্সর থেকে প্রাপ্ত ডেটা ক্লাউড সার্ভিস ব্যবহার করে থাকে।
+
+ডিভাইসগুলো সবসময় যে ক্যাবল বা ওয়াইফাই দ্বারা সরাসরি ইন্টারনেটে সংযুক্ত থাকবে তাও কিন্তু নয়। কিছু যন্ত্র মেশ নেটওয়ার্ক ব্যবহার করে ব্লুটুথ বা এইধরণের কোন টেকনলজির সাহায্যে অন্য ডিভাইসের সাথে যুক্ত থাকে, আর এই সংযুক্তি ঘটায় হাব যা নিজে ইন্টারনেটের সাথে যুক্ত ।
+
+ইন্টারনেট সংযোগে কাজের উদাহরণস্বরূপ, একটি থার্মোস্ট্যাট নিই, যা কিনা ক্লাউডে হোম ওয়াইফাই ব্যবহার করে সংযুক্ত হয়েছে। এটি এই ক্লাউড পরিষেবায় তাপমাত্রার ডেটা প্রেরণ করে এবং সেখান থেক তা কোন ডাটাবেইস বা তথ্যভান্ডারে সংরক্ষিত থাকে এবং বাড়ির মালিককে কোন একটি মোবাইল অ্যাপ্লিকেশন ব্যবহার করে বর্তমান এবং অতীত তাপমাত্রা যাচাই করার সুযোগ দেয়। ক্লাউডের অন্য একটি আগে থেকেই জেনে নেয় যে বাড়ির মালিক কত তাপমাত্রা পছন্দ করেন এবং সেই পছন্দের ভিত্তিতে ক্লাউড সার্ভিসের মাধ্যমে আইওটি ডিভাইসে বার্তা প্রেরণ করে হিটিং সিস্টেমটি চালু বা বন্ধ করতে বলে।
+
+
+
+***একটি মোবাইল অ্যাপ্লিকেশন নিয়ন্ত্রিত, ইন্টারনেট সংযুক্ত থার্মোস্ট্যাট / Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone by Alice-vector / Cloud by Debi Alpa Nugraha - all from the [Noun Project](https://thenounproject.com)***
+
+আরও উন্নত কোন ভার্সন, যা আইওটি ডিভাইসে সংযুক্ত অন্যান্য ডিভাইসের সেন্সরগুলির সাথে যেমন অকুপেন্সি সেন্সর থেকে বিভিন্ন তথ্য ( যেমন সেই সময়ের আবহাওয়া বা আপনার ব্যক্তিগত ক্যালেন্ডারে কী কী তথ্য রয়েছে) এর ভিত্তিতে সিদ্ধান্ত নিতে পারে যে তাপমাত্রা কত হওয়া উচিত। উদাহরণস্বরূপ আপনার ক্যালেন্ডারে বলা রয়েছে আজ আপনি ভ্রমণে গিয়েছেন। সেক্ষেত্রে শীতকালে আপনার রুমে হিটার চালানোর কোন দরকার নেই আর, আইওটি এই স্মার্ট ডিসিশনটি নিতে পারবে। এছাড়াও আপনি কোন রুম কখন কীভাবে ব্যবহার করেন, তার ভিত্তিতেও আর্টিফিশিয়াল ইন্টেলিজেন্স মডেলগুলি সিদ্ধান্ত নিতে পারে আর সময়ের সাথে সাথে প্রাপ্ত ডেটার কারণে এই সিদ্ধান্তগুলি আরো বেশি সঠিক হতে থাকে।
+
+
+
+***একটি ইন্টারনেট সংযুক্ত থার্মোস্ট্যাট যা একাধিক রুমের সেন্সর ব্যবহার করে । এটি মোবাইল অ্যাপ্লিকেশন নিয়ন্ত্রিত এবং আবহাওয়া ও ক্যালেন্ডারের ডেটা থেকে বুদ্ধিমত্তা গ্রহণ করতে সক্ষম. Temperature by Vectors Market / Microcontroller by Template / dial by Jamie Dickinson / heater by Pascal Heß / mobile phone and Calendar by Alice-vector / Cloud by Debi Alpa Nugraha / smart sensor by Andrei Yushchenko / weather by Adrien Coquet - all from the [Noun Project](https://thenounproject.com)***
+
+✅ ইন্টারনেট সংযুক্ত থার্মোস্ট্যাটকে আরও স্মার্ট করে তুলতে অন্য কোন কোন ধরণের ডেটা সাহায্য করতে পারে?
+
+### Edge চালিত IoT
+
+যদিও আইওটিতে **I** বলতে ইন্টারনেট বোঝায়, এই ডিভাইসগুলি যে অবশ্যই ইন্টারনেটে সংযুক্ত থাকতে হবে - তা পুরোপুরি সত্য নয়। কিছু ক্ষেত্রে আইওটি যন্ত্রগুলো 'এজ' ডিভাইসগুলির সাথে সংযোগ স্থাপন করতে পারে - যেগুলো হলো লোকাল নেটওয়ার্কে চালিত গেটওয়ে ডিভাইস যেখানে ইন্টারনেটে কোন সংযোগ না করেই ডেটা প্রক্রিয়া করতে পারবো। আমাদের যখন প্রচুর ডেটা ট্রান্সফার বা ধীর ইন্টারনেট সংযোগ থাকে তখন এটির মাধমে সম্পূর্ণ কাজ আরও দ্রুততর হতে পারে যা অফলাইনে চালানো যাবে এমনকি যখন কোন মানবিক সংকটের সময় কোন জাহাজে বা দুর্যোগ অঞ্চলে ইন্টারনেট সংযোগ সম্ভব হয়না, তখন সেখানে কার্যক্রম পরিচালনা করা যায়; সাথে সাথে আমাদের ব্যক্তিগত তথ্যের গোপনীয়তা রক্ষা করাও সম্ভব । কিছু ডিভাইসে ক্লাউড সুবিধা ব্যবহার করে তৈরি প্রসেসিং কোড থাকে এবং কোনও সিদ্ধান্ত নেওয়ার জন্য কোনও ইন্টারনেট সংযোগ ব্যবহার না করেই ডেটা সংগ্রহ ও প্রতিক্রিয়া জানাতে লোকাল নেটওয়ার্কে এটি চালানো যাবে।
+
+উদাহরণস্বরূপ, আমাদের স্মার্ট হোম ডিভাইস যেমন অ্যাপল হোমপড, অ্যামাজন অ্যালেক্সা বা গুগল হোম যা প্রশিক্ষিত এআই মডেলগুলি ব্যবহার করে আমাদের ভয়েস শুনতে পাবে এবং নির্দিষ্ট শব্দ বা বাক্যাংশ বললে চালু হয় বা 'wake up' করে এবং তারপরই আমরা আমাদের কথাগুলো ইন্টারনেটে প্রসেসিং এর জন্য পাঠাই অথচ বাকি সময়ের নির্দেশগুলো প্রাইভেট থাকে। বিস্তারিত বলতে গেলে, ডিভাইসটি উপযুক্ত সময়ে আমাদের ভয়েস প্রেরণ বন্ধ করবে যেমন এটি যখন আমাদের কথায় কোন বিরতি সনাক্ত করে, তখন বন্ধ হয়ে যায়। এটিকে'wake up' করার আগে এবং ডিভাইসটি বন্ধ করার পরে আমরা যা কিছু বলছি, তা ইন্টারনেটের মাধ্যমে ডিভাইস সরবরাহকারীর বা প্রস্তুতকারকের কাছে প্রেরণ করা হবে না এবং তাই এটি ব্যক্তিগত গোপনীয়তা বজায় রাখবে।
+
+✅ এমন কিছু পরিস্থিতির কথা চিন্তা করি যেখানে গোপনীয়তা গুরুত্বপূর্ণ, তাই ডেটা প্রক্রিয়াকরণটি ক্লাউডের চেয়ে 'Edge' এ করা তুলনামূলকভাবে ভালো । ছোট্ট একটি ইঙ্গিত দিই - ক্যামেরা বা অন্যান্য ইমেজিং ডিভাইস সম্বলিত আইওটি সার্ভিসকে এক্ষেত্রে ভাবা যেতে পারে ।
+
+### IoT নিরাপত্তা
+
+যেকোন ইন্টারনেট সংযোগের সাথে, সুরক্ষা একটি গুরুত্বপূর্ণ বিষয়। বেশ পরিচিত একটি কৌতুক রয়েছে যেখানে বলা হয়, IoT তে S মানে হলো Security - কিন্তু আইওটির পূর্ণরূপে কোথাও S নেই, যার মানে Security বা নিরাপত্তা নেই। উদাহরণস্বরূপ, [Stuxnet worm](https://wikipedia.org/wiki/Stuxnet) নামক ক্ষতিকারক 'কীট' বা worm অনেকগুলো সেন্ট্রিফিউজের ভাল্ভকে ভুলভাবে নিয়ন্ত্রণ করে কাজে ব্যাঘাত ঘটায়। হ্যাকাররাও তখন [baby monitor গুলোর নিম্নমানের নিরাপত্তার](https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable) সুযোগ নেয়।
+
+> 💁 কখনও কখনও আইওটি ডিভাইস এবং Edge ডিভাইসগুলি ব্যক্তিগত তথ্য ও নিরাপত্তাকে সুরক্ষিত রাখতে ইন্টারনেট থেকে সম্পূর্ণ বিচ্ছিন্ন কোন নেটওয়ার্কে চালিত হয়। এটিকে বলা হয় [এয়ার গ্যাপিং](https://wikipedia.org/wiki/Air_gap_(networking))।
+
+## মাইক্রোকন্ট্রোলারের আরো গভীরে
+
+গত লেসনে আমরা মাইক্রোকন্ট্রোলারের সাথে পরিচিত হয়েছিলাম। এখন তাদেরকে আরও গভীরভাবে জানবো।
+
+### সিপিইউ
+
+সিপিইউ হল মাইক্রোকন্ট্রোলারের 'মস্তিষ্ক'। এটি মূলত প্রসেসর যা আপনার কোড রান করে এবং কোন সংযুক্ত ডিভাইসে ডেটা প্রেরণ এবং তা থেকে ডেটা গ্রহণ করতে পারে। সিপিইউতে এক বা একাধিক কোর থাকতে পারে - যা মূলত এক বা একাধিক সিপিইউ যা কোড রান করার জন্য একসাথে কাজ করতে পারে।
+
+সিপিইউগুলি একধরণের ঘড়ির উপর নির্ভর করে যা প্রতি সেকেন্ডে বহু মিলিয়ন বা বিলিয়ন বার টিক দেয়। প্রতিটি টিক বা সাইকেলে সিপিইউ তার ক্ষমতানুসারে কাজগুলো করে। প্রতিটি টিকের সাথেই সিপিইউ কোন প্রোগ্রামের একটি নির্দেশনা কার্যকর করতে পারে, যেমন কোন বাহ্যিক ডিভাইস থেকে ডেটা পুনরুদ্ধার করা বা গাণিতিক গণনা সম্পাদন করা। এই নিয়মিত চক্রের ফলে পরবর্তী নির্দেশনা প্রক্রিয়া করার আগে সবকিছু সমন্বিত (সিনক্রোনাইজড) থাকে।
+
+ক্লক সাইকেল যত দ্রুত হবে, প্রতি সেকেন্ডে সিপিইউ তত বেশি কাজও করতে পারবে অর্থাৎ দ্রুততর সিপিইউ হবে। এদের গতি পরিমাপ করা হয়ে থাকে [হার্টজ (Hz)](https://wikipedia.org/wiki/Hertz) এককে, যেখানে ১ হার্টজ বলতে বোঝান হয়, প্রতি সেকেন্ডে একটি চক্র বা টিক সম্পাদন করা।
+> 🎓 বেশিরভাগ সময় সিপিইউ স্পীড লেখা হয় MHz অথবা GHz দিয়ে। ১ মেগাহার্টজ হলো ১ মিলিয়ন হার্টজ এবং ১ গিগাহার্টজ হলো ১ বিলিয়ন হার্টজ ।
+
+> 💁 সিপিইউগুলো [fetch-decode-execute cycle](https://wikipedia.org/wiki/Instruction_cycle) ব্যবহার করে প্রোগ্রাম এক্সেকিউট করে। প্রতি টিকের সাথে সিপিইউ পরবর্তী নির্দেশনা গ্রহণ করবে (fetch), তা ডিকোড করবে এবং পরিশেষে এক্সেকিউট করবে, যেমনঃ Arithmetic Logic Unit (ALU) ব্যবহার করে ২টি সংখ্যা যোগ করা। কিছু কিছু এক্সেকিউশনের জন্য একাধিক টিক বা সাইকেল দরকার হয়। কাজ হয়ে যাওয়ার পর, পরবর্তী টিক আসলে তবেই পরের সাইকেলটি রান করবে।
+
+
+
+***CPU by Icon Lauk / ram by Atif Arshad - all from the [Noun Project](https://thenounproject.com)***
+
+মাইক্রোকন্ট্রোলারগুলির ক্লক স্পীড ডেস্কটপ বা ল্যাপটপ কম্পিউটার, এমনকি বেশিরভাগ স্মার্টফোনের চেয়ে অনেক কম। উদাহরণস্বরূপ, Wio টার্মিনালের একটি সিপিইউ রয়েছে যা 120MHz বা সেকেন্ডে 120,000,000 সাইকেল চালায়।
+
+✅ একটি গড়পড়তা পিসি বা ম্যাক এর গিগাহার্টজে চলমান একাধিক কোর থাকে অর্থাৎ সেকেন্ডে কয়েক বিলিয়ন বার টিক দেয় বা সাইকেল সম্পাদন করে। আমাদের কম্পিউটারের ক্লক স্পীড কত তা জেনে নিয়ে Wio টার্মিনালের চেয়ে তা কতগুণ দ্রুত সেই হিসেব করি।
+
+প্রতিটি সাইকেল রান করতে প্রয়োজন হয় শক্তি যা কিনা তাপ বা হিট তৈরী করে। যত দ্রুত ক্লকস্পীড, তত বেশি পাওয়ার প্রয়োজন হবে এবং তাপ উৎপন্ন হবে। পিসির তাপ অপসারণ করতে 'হিট সিংক' এবং ফ্যান ব্যবহৃত হয়, যা ছাড়া প্রচণ্ড উত্তাপের কারণে কয়েক সেকেন্ডের মধ্যেই কম্পিউটার বন্ধ হয়ে যাবে। মাইক্রোকন্ট্রোলারে এরকম তাপ অপসারণের কোন ব্যবস্থা দরকার হয় না কেননা এরা অনেক কম গতিসম্পন্ন ক্লকস্পীডে চলে এবং তাই ততটা তাপ তৈরী করে না।
+
+> 💁 কিছু পিসি বা ম্যাক দ্রুত গতির হাই-পাওয়ার কোর এবং ধীর গতির লো-পাওয়ার কোর ব্যবহার শুরু করছে যাতে ব্যাটারি পাওয়ার বাঁচানো যায়। উদাহরণস্বরূপ, লেটেস্ট অ্যাপল ল্যাপটপে M1 চিপ ব্যবহার করে হচ্ছে, যা ৪টি পার্ফম্যান্স কোর এবং ৪টি ইফিশিয়েন্ট কোর এর মধ্যে কাজ ভাগ করতে পারে,যাতে করে পরিমিত ব্যাটারি লাইফ পাওয়া যায় আবার কাজের গতিও ঠিক থাকে।
+
+✅ ছোট্ট একটা কাজ করিঃ সিপিইউ সম্পর্কে আরো একটু বিশদভাবে জানার চেষ্টা করি, এই [Wikipedia CPU article](https://wikipedia.org/wiki/Central_processing_unit) থেকে।
+
+#### কাজ
+
+Wio Terminal পর্যালোচনা করি।
+
+এই লেসনের জন্য আমরা Wio Terminal ব্যবহার করলে, এটার সিপিইউ কী খুঁজে দেখতে পারি ? [Wio Terminal প্রোডাক্ট পেইজ](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) এ গিয়ে, একটু নিচে গেলেই *Hardware Overview* নামে একটি অংশ পাওয়া যাবে যেখানে Wio Terminal এর ভেতরের সব দেখা যায় - সিপিইউ আমরা সেখান থেকেই দেখতে পাবো।
+
+### মেমোরি
+
+মাইক্রোকন্ট্রোলারে সাধারণত ২ ধরণের মেমোরি থাকে - প্রোগ্রাম মেমোরি এবং র্যান্ডম একসেস মেমোরি (র্যাম)
+
+প্রোগ্রাম মেমোরি অপরিবর্তনশীল (নন-ভোলাটাইল), যার অর্থ এটিতে যা লেখা থাকে, ডিভাইসে পাওয়ার (ইলেক্ট্রিক সংযোগ) না থাকলেও তা ডিভাইসে স্টোর করা থাকে। আমাদের প্রোগ্রাম কোড মূলত এই মেমোরিতেই সংরক্ষিত থাকে।
+
+র্যাম ব্যবহার করে প্রোগ্রামগুলো চালানো হয় যা সেই সময়কালীন ভ্যারিয়েবল এবং প্রাপ্ত ডেটা স্টোর করে। র্যাম পরিবর্তনশীল, পাওয়ার (ইলেক্ট্রিক সংযোগ) বিচ্ছিন্ন হয়ে গেলে, এতে থাকা সব তথ্যও হারিয়ে যায় - বলতে গেলে পুরো প্রোগ্রামটাই প্রাথমিক অবস্থায় চলে আসে।
+
+> 🎓 প্রোগ্রাম মেমোরি আমাদের কোড গুলো সংরক্ষণ করে থাকে যা পাওয়ার (ইলেক্ট্রিক সংযোগ) না থাকলেও, ডিভাইসে থেকে যায়।
+
+> 🎓 র্যাম মূলত প্রোগ্রামকে রান করায় এবং পাওয়ার না থাকলে সবকিছু একদম শুরুর অবস্থায় চলে আসে।
+
+মাইক্রোকন্ট্রোলারের মেমোরি, পিসি বা ম্যাকের তুলনায় নিতান্তই ক্ষুদ্র। একটা সাধারণ পিসিতে ৮ গিগাবাইট (জিবি) অর্থাৎ ৮,০০০,০০০,০০০ বাইট র্যাম থাকে, যার প্রতি বাইটে একটি অক্ষর বা ০ থেকে ২৫৫ এর মধ্যে কোন সংখ্যা রাখা যায়। সেই তুলনায়, মাইক্রোকন্ট্রোলারে কিলোবাইট (১ কিলোবাইট অর্থাৎ প্রায় ১,০০০ বাইট) পর্যায়ের র্যাম থাকে। এখানে আমরা যে Wio terminal ব্যবহার করছি, তার র্যাম ১৯২ কিলোবাইট অর্থাৎ ১৯২,০০০ বাইট - গড়পড়তা পিসির তুলনায় ৪০,০০০ গুণ কম।
+
+নীচের চিত্রটি 192KB এবং 8GB এর মধ্যে আপেক্ষিক আকারের পার্থক্য দেখায় - কেন্দ্রের ছোট ডটটি 192KB উপস্থাপন করে।
+
+
+
+প্রোগ্রাম মেমোরিও পিসির তুলনায় কম। একটি সাধারণ পিসিতে প্রোগ্রাম স্টোরেজের জন্য 500 গিগাবাইটের হার্ড ড্রাইভ থাকতে পারে, অন্যদিকে মাইক্রোকন্ট্রোলারের কাছে কেবল কিলোবাইট পর্যায়ের বা কয়েক মেগাবাইট (এমবি) স্টোরেজ থাকতে পারে (1 এমবি হলো 1000KB বা 1,000,000 বাইট এর সমান)। উইও টার্মিনালে 4MB প্রোগ্রাম স্টোরেজ রয়েছে।
+
+✅ ছোট্ট একটা গবেষণা করা যাক: এই লেখা পড়তে যে কম্পিউটারটি ব্যবহার করছি তার র্যাম এবং স্টোরেজ কত, তা জানার চেষ্টা করি। কোন মাইক্রোকন্ট্রোলারের সাথে এটিকে কীভাবে তুলনা করা যায়?
+
+### ইনপুট/আউটপুট
+
+সেন্সর থেকে ডেটা পড়তে এবং অ্যাকচুয়েটরে নিয়ন্ত্রণ সংকেত প্রেরণ করতে মাইক্রোকন্ট্রোলারের ইনপুট এবং আউটপুট সংযোগ প্রয়োজন হয়। মাইক্রোকন্ট্রোলারে সাধারণত বেশ কয়েকটি জেনারেল-পারপাস ইনপুট/আউটপুট (জিপিআইও) পিন থাকে। এই পিনগুলিকে ইনপুট (সংকেত গ্রহণ) বা আউটপুট (সংকেত প্রেরণ) পিন হিসেবে কাজ করার জন্য সফটওয়্যারে কনফিগার করতে হয়।
+
+🧠⬅️ ইনপুট পিন দিয়ে সেন্সর থেকে তথ্য নেয়া হয়।
+
+🧠➡️ আউটপুট পিন দিয়ে অ্যাকচুয়েটরে সংকেত পাঠানো হয়।
+
+✅ পরবর্তী পাঠে এসব নিয়ে আমরা আরো বিস্তারিত জানতে পারবো।
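+
+নিচে ধারণাটি বোঝার জন্য একটি ছোট্ট উদাহরণ স্কেচ দেওয়া হলো - এটি আরডুইনো ফ্রেমওয়ার্ক (এই লেসনেই পরে আলোচিত) ধরে নিয়ে লেখা একটি অনুমানভিত্তিক উদাহরণ মাত্র; পিন নম্বরগুলো (5 ও 2) কাল্পনিক, আসল নম্বর বোর্ডভেদে আলাদা হবে।
+
+```cpp
+// অনুমানভিত্তিক উদাহরণ: পিন 5 এ একটি LED (আউটপুট), পিন 2 এ একটি বাটন (ইনপুট)
+const int ledPin = 5;
+const int buttonPin = 2;
+
+void setup() {
+  pinMode(ledPin, OUTPUT);    // পিনটিকে আউটপুট হিসেবে কনফিগার করা
+  pinMode(buttonPin, INPUT);  // পিনটিকে ইনপুট হিসেবে কনফিগার করা
+}
+
+void loop() {
+  int pressed = digitalRead(buttonPin); // ইনপুট পিন থেকে সংকেত (0 বা 1) পড়া
+  digitalWrite(ledPin, pressed);        // সেই সংকেতই আউটপুট পিনে পাঠানো
+}
+```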
+
+#### কাজ
+
+Wio Terminal পর্যালোচনা করি।
+
+এই লেসনের জন্য আমরা Wio Terminal ব্যবহার করলে, এটার জিপিআইও পিনগুলো কী খুঁজে দেখতে পারি ? [Wio Terminal প্রোডাক্ট পেইজে](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) একটু নিচে গেলেই *Pinout diagram* নামে একটি অংশ পাওয়া যাবে যেখানে আমরা সহজেই পিনগুলো দেখতে পারি। Wio Terminal এ পিন নাম্বার দেয়ার জন্য, স্টিকার দেয়া হয়ে থাকে - এগুলো ব্যবহার না করে থাকলে, দেরি না করে এখনই করে ফেলা যাক।
+
+### বাস্তবিক আকার
+
+মাইক্রোকন্ট্রোলার আকারে বেশ ছোট হয়, যেমন [Freescale Kinetis KL03 MCU](https://www.edn.com/tiny-arm-cortex-m0-based-mcu-shrinks-package/) এর কথাই ধরা যাক - এতই ছোট যে গলফ বলের ডিম্পলের মধ্যেই এঁটে যায়। কোন পিসিতে থাকা সিপিইউ শুধু নিজেই প্রায় 40মিমি x 40মিমি আকারের, আর এর সাথে (অতিরিক্ত গরম হয়ে কয়েক সেকেন্ডের মধ্যে বন্ধ হয়ে যাওয়া ঠেকাতে প্রয়োজনীয়) হিট-সিংক ও ফ্যান তো আছেই - বোঝাই যাচ্ছে, মাইক্রোকন্ট্রোলারের চেয়ে সিপিইউ কত বড়! মাইক্রোকন্ট্রোলার, বাহ্যিক কেস, স্ক্রিন এবং বিভিন্ন সংযোগ ও উপাদানসহ Wio Terminal এর মতো একটি ডেভেলপার কিট শুধুমাত্র একটি Intel i9 সিপিইউয়ের চেয়ে খুব একটা বড় নয় - এবং হিট সিংক ও ফ্যানসহ হিসেব করলে সিপিইউটির চেয়ে অনেক ছোট!
+
+| যন্ত্র | আকার |
+| ------------------------------- | --------------------- |
+| Freescale Kinetis KL03 | 1.6mm x 2mm x 1mm |
+| Wio terminal | 72mm x 57mm x 12mm |
+| Intel i9 CPU, Heat sink and fan | 136mm x 145mm x 103mm |
+
+### ফ্রেমওয়ার্ক এবং অপারেটিং সিস্টেম
+
+গতি এবং মেমরির আকারের কারণে, মাইক্রোকন্ট্রোলার ডেস্কটপ অর্থে কোন অপারেটিং সিস্টেম (ওএস) চালায় না। আমাদের কম্পিউটারে চালিত অপারেটিং সিস্টেমগুলোর (উইন্ডোজ, লিনাক্স বা ম্যাক-ওএস) কাজ চালানোর জন্য প্রচুর মেমরি এবং প্রসেসিং পাওয়ার প্রয়োজন - মাইক্রোকন্ট্রোলারের জন্য যা সম্পূর্ণ অপ্রয়োজনীয়। মনে রাখতে হবে যে, মাইক্রোকন্ট্রোলার সাধারণত এক বা একাধিক নির্দিষ্ট কাজ সম্পাদনের জন্য প্রোগ্রাম করা হয়; এগুলো পিসি বা ম্যাকের মতো জেনারেল পারপাস যন্ত্র নয়। পিসি বা ম্যাকে এমন ইন্টারফেস রাখতে হয় যা সঙ্গীত বা সিনেমা চালাতে পারে, ডকুমেন্টেশন বা কোড লেখার সরঞ্জাম সরবরাহ করতে পারে, গেম খেলতে পারে বা ইন্টারনেট ব্রাউজ করতে পারে - যা কিনা মাইক্রোকন্ট্রোলারের কাজের ধরণের তুলনায় অনেক আলাদা।
+
+কোন ওএস ছাড়া একটি মাইক্রোকন্ট্রোলার প্রোগ্রাম করতে হলে কোডটি চালানোর জন্য কিছু টুল এবং পেরিফেরাল বা পার্শ্ববর্তী যন্ত্রের সাথে যোগাযোগের জন্য API দরকার হয়। প্রতিটি মাইক্রোকন্ট্রোলার আলাদা হয়, তাই নির্মাতারা সাধারণত স্ট্যান্ডার্ড ফ্রেমওয়ার্ক সমর্থন করে, যা আমাদের কোড তৈরির জন্য একটি স্ট্যান্ডার্ড 'রেসিপি' অনুসরণ করার সুযোগ দেয় এবং সেই ফ্রেমওয়ার্ক সাপোর্ট করে এমন যেকোন মাইক্রোকন্ট্রোলারে প্রোগ্রামটি চালানো যায়।
+
+আমরা কোন একটি ওএস ব্যবহার করে মাইক্রোকন্ট্রোলারগুলিকে প্রোগ্রাম করতে পারি - প্রায়শই এগুলোকে রিয়েল-টাইম অপারেটিং সিস্টেম (আরটিওএস) হিসাবে উল্লেখ করা হয়, কারণ এগুলি রিয়েল টাইমে পেরিফেরিয়ালগুলিতে এবং পাঠানো ডেটা হ্যান্ডেল করার জন্য তৈরি করা হয়। এই অপারেটিং সিস্টেমগুলি খুব লাইটওয়েট এবং এদের বৈশিষ্ট্যগুলি হলো-
+
+* মাল্টি থ্রেডিং, আপনার কোডগুলি একই সাথে একাধিক কোর বা একটি কোর পর্যায়ক্রমে ব্যবহার করে একাধিক ব্লক কোড চালানোর অনুমতি দেয়।
+* নেটওয়ার্কিং - নিরাপদে ইন্টারনেটের মাধ্যমে যোগাযোগের অনুমতি দেওয়ার জন্য
+* স্ক্রিন রয়েছে এমন ডিভাইসে ব্যবহারকারীর ইন্টারফেস (ইউআই) তৈরির জন্য গ্রাফিকাল ইউজার ইন্টারফেস (জিইউআই) এর উপস্থিতি।
+
+✅ বিভিন্ন RTOS এর ব্যপারে জানতে এসব পড়তে পারি : [Azure RTOS](https://azure.microsoft.com/services/rtos/?WT.mc_id=academic-17441-jabenn), [FreeRTOS](https://www.freertos.org), [Zephyr](https://www.zephyrproject.org)
+
+#### আরডুইনো
+
+
+
+[আরডুইনো](https://www.arduino.cc) খুব সম্ভবত সবচেয়ে জনপ্রিয় মাইক্রোকন্ট্রোলার ফ্রেমওয়ার্ক, বিশেষতঃ শিক্ষার্থী এবং শখের বশে আইওটিতে কাজ করতে আগ্রহীদের মাঝে। আরডুইনো একটি ওপেন সোর্স ইলেক্ট্রনিক্স প্ল্যাটফর্ম যা সফ্টওয়্যার এবং হার্ডওয়্যার সমন্বিত। আরডুইনো থেকে বা অন্য নির্মাতাদের কাছ থেকে আমরা আরডুইনো-কম্প্যাটিবল বোর্ড কিনে সেই ফ্রেমওয়ার্ক ব্যবহার করে কোড করতে পারবো।
+
+আরডুইনো বোর্ডগুলি সি বা সি ++ এ কোড করা হয়। এই ভাষায় আমরা কোডগুলো খুব ছোট করে সংকলন করতে পারি এবং দ্রুত রান করতে পারি, যা একটি সীমাবদ্ধ ডিভাইসে যেমন মাইক্রোকন্ট্রোলারের জন্য বেশ গুরুত্বপূর্ণ। আরডুইনো অ্যাপ্লিকেশনটির মূল বিষয়কে স্কেচ হিসাবে উল্লেখ করা হয় এবং তা সি/সি++ এ কোড করা হয় মূলত ২টি ফাংশনে - `setup` এবং `loop`। বোর্ড চালু হয়ে গেলে, আরডুইনো ফ্রেমওয়ার্ক কোডটি একবার `setup` ফাংশনটি পরিচালনা করবে, তারপরে এটি `loop` ফাংশনটি বারবার রান করবে, পাওয়ার বন্ধ না হওয়া অবধি এটি অবিচ্ছিন্নভাবে চালিত হবে।
+
+আমরা সেটআপ কোডটি `setup` ফাংশনে লিখবো, যেমন ওয়াইফাই এবং ক্লাউড সার্ভিসের সাথে সংযুক্ত হওয়া বা ইনপুট এবং আউটপুট জন্য পিন চালু হওয়া। আমাদের লুপ কোডটিতে তখন প্রসেসিং কোড থাকবে যেমন সেন্সর থেকে ডেটা নেয়া এবং ক্লাউডে তা পাঠানো । প্রতিটি লুপে সাধারণত একটি বিলম্ব (delay) অন্তর্ভুক্ত করতে হবে, উদাহরণস্বরূপ যদি আমরা কেবল 10 সেকেন্ড পরপর সেন্সর ডেটা প্রেরণ করতে চাই, তবে লুপের শেষে 10 সেকেন্ডের বিলম্ব যুক্ত করতে হবে, যাতে মাইক্রোকন্ট্রোলার তখন বিশ্রামে থাকে, শক্তি সঞ্চয় করে এবং তারপরে আবার 10 সেকেন্ড পরে যখন ডেটা প্রয়োজন হবে, তখন ল্যুপ চলবে।
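+
+নিচে (একটি অনুমানভিত্তিক উদাহরণ হিসেবে) এরকম একটি ন্যূনতম স্কেচের কাঠামো দেখানো হলো - এখানে আসল সেন্সর বা ক্লাউড কোডের জায়গায় শুধু একটি বার্তা প্রিন্ট করা হয়েছে:
+
+```cpp
+// একটি ন্যূনতম আরডুইনো স্কেচের কাঠামো (উদাহরণমাত্র)
+void setup() {
+  // একবারই চলে - যেমন সিরিয়াল পোর্ট বা ওয়াইফাই সংযোগ চালু করা
+  Serial.begin(9600);
+}
+
+void loop() {
+  // বারবার চলে - যেমন সেন্সর থেকে ডেটা নিয়ে ক্লাউডে পাঠানো
+  Serial.println("Sending sensor data...");
+
+  // 10 সেকেন্ড (10,000 মিলিসেকেন্ড) বিরতি - শক্তি সঞ্চয়ের জন্য
+  delay(10000);
+}
+```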
+
+
+
+✅ এই প্রোগ্রাম আর্কিটেকচারকে বলা হয় *event loop* অথবা *message loop*. অনেক অ্যাপ্লিকেশন এটি ব্যবহার করে এবং উইন্ডোজ, ম্যাক-ওএস বা লিনাক্সের মতো ওএসে চালিত বেশিরভাগ ডেস্কটপ অ্যাপ্লিকেশনগুলির জন্য এটি স্ট্যান্ডার্ড। `loop` এখানে বাটনের মতো ইউজার ইন্টারফেস উপাদান বা কীবোর্ডের মতো ডিভাইস থেকে আসা বার্তা গ্রহণ করে এবং সেই অনুযায়ী সাড়া দেয়। আরো বিস্তারিত জানতে [ইভেন্ট ল্যুপ](https://wikipedia.org/wiki/Event_loop) সংক্রান্ত লেখাটি পড়তে পারি।
+
+আরডুইনো মাইক্রোকন্ট্রোলার এবং আই/ও পিনের সাথে সংযোগের জন্য স্ট্যান্ডার্ড লাইব্রেরি সরবরাহ করে, যা একই কোড বিভিন্ন মাইক্রোকন্ট্রোলারে চালাতে সহায়তা করে। উদাহরণস্বরূপ, [`delay` function](https://www.arduino.cc/reference/en/language/functions/time/delay/) কোন প্রোগ্রামকে নির্দিষ্ট সময়ের জন্য বন্ধ রাখবে, [`digitalRead` function](https://www.arduino.cc/reference/en/language/functions/digital-io/digitalread/) বোর্ডের পিনগুলি থেকে `HIGH` অথবা `LOW` মান সংগ্রহ করবে - তা যে বোর্ডেই কোড রান করা হোক না কেন। এই স্ট্যান্ডার্ড লাইব্রেরিগুলির অর্থ, একটি বোর্ডের জন্য লিখিত আরডুইনো কোড অন্য যে কোন আরডুইনো বোর্ডের জন্য পুনরায় ব্যবহার করা যেতে পারে এবং চলবে (পিনগুলি একই এবং বোর্ডগুলি একই বৈশিষ্ট্য সমর্থন করে - এমন হলে)।
+
+থার্ড-পার্টি আরডুইনো লাইব্রেরির একটি বড় সংগ্রহ রয়েছে যা আরডুইনো প্রকল্পগুলিতে অতিরিক্ত বৈশিষ্ট্য যুক্ত করার অনুমতি দেয় যেমন সেন্সর এবং অ্যাকচুয়েটর ব্যবহার করে বা ক্লাউড আইওটি সার্ভিসগুলিতে সংযুক্ত করা।
+
+##### কাজ
+
+Wio Terminal পর্যালোচনা করি।
+
+এই লেসনের জন্য আমরা Wio Terminal ব্যবহার করলে, গত লেসনের কোডগুলো আবার একটু দেখি। [Wio Terminal প্রোডাক্ট পেইজে](https://www.seeedstudio.com/Wio-Terminal-p-4509.html) একটু নিচে গেলেই *Pinout diagram* নামে একটি অংশ পাওয়া যাবে যেখানে আমরা সহজেই পিনগুলো দেখতে পারি। Wio Terminal এ পিন নাম্বার দেয়ার জন্য, স্টিকার দেয়া হয়ে থাকে - এগুলো ব্যবহার না করে থাকলে, দেরি না করে এখনই করে ফেলা যাক।
+
+## সিংগেল-বোর্ড কম্পিউটারের আরো গভীরে
+
+শেষ লেসনে আমরা সিংগেল-বোর্ড কম্পিউটার এর সাথে পরিচিত হয়েছিলাম। এখন তাদের আরও গভীরভাবে জানবো।
+
+### রাস্পবেরি পাই
+
+
+
+[Raspberry Pi Foundation](https://www.raspberrypi.org) হলো মূলত স্কুল পর্যায়ে কম্পিউটার বিজ্ঞান অধ্যয়নের প্রচারের জন্য ২০০৯ সালে প্রতিষ্ঠিত যুক্তরাজ্যের একটি দাতব্য সংস্থা। এই মিশনের অংশ হিসাবে তারা সিংগেল-বোর্ড কম্পিউটার তৈরী করে, যার নাম রাস্পবেরি পাই। এটি বর্তমানে ৩টি ভেরিয়েন্টে পাওয়া যায় - একটি পূর্ণ আকারের সংস্করণ, ছোট পাই জিরো এবং একটি কম্পিউট মডিউল যা আমাদের আইওটি ডিভাইসের ভেতরে বিল্ট-ইন হিসেবে ব্যবহার করা যেতে পারে।
+
+
+
+***Raspberry Pi 4. Michael Henzler / [Wikimedia Commons](https://commons.wikimedia.org/wiki/Main_Page) / [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)***
+
+পূর্ণ আকারের রাস্পবেরি পাইয়ের সর্বশেষ ভার্সন হল Raspberry Pi 4B। এটিতে রয়েছে 1.5GHz গতির একটি কোয়াড-কোর (4 কোর) সিপিইউ, 2, 4, বা 8 জিবি র্যাম, গিগাবিট ইথারনেট, ওয়াইফাই, 4K স্ক্রিন সমর্থনকারী 2টি এইচডিএমআই পোর্ট, একটি অডিও ও কম্পোজিট ভিডিও আউটপুট পোর্ট, ইউএসবি পোর্ট (2টি USB 2.0, 2টি USB 3.0), 40টি জিপিআইও পিন, রাস্পবেরি পাই ক্যামেরা মডিউলের জন্য একটি ক্যামেরা কানেক্টর এবং একটি এসডি কার্ড স্লট। এই সবকিছু নিয়ে বোর্ডটির সাইজ 88mm x 58mm x 19.5mm এবং এটি একটি 3A USB-C পাওয়ার সাপ্লাই দ্বারা চালিত। রাস্পবেরি পাইয়ের দাম 35 মার্কিন ডলার থেকে শুরু হয়, যা পিসি বা ম্যাক এর তুলনায় অনেক কম।
+
+> 💁 Pi400 নামে একটি "একের-ভিতর-সব" কম্পিউটার রয়েছে, যার কীবোর্ডে Pi4 বিল্ট-ইন রয়েছে।
+
+
+
+পাই জিরো এর আকার অনেক ছোট এবং এর ক্ষমতাও কম। এটিতে একটি একক কোর 1GHz সিপিইউ, 512 এমবি র্যাম, ওয়াইফাই (Zero W মডেলে), একটি এইচডিএমআই পোর্ট, একটি মাইক্রো-ইউএসবি পোর্ট, 40টি জিপিআইও পিন, রাস্পবেরি পাই ক্যামেরা মডিউলের জন্য একটি ক্যামেরা কানেক্টর এবং একটি এসডি কার্ড স্লট রয়েছে। পাই জিরোর আকার 65 মিমি x 30 মিমি x 5 মিমি এবং এটি খুব অল্প পাওয়ার নিয়েই কাজ করতে পারে। পাই জিরো এর মূল্য 5 মার্কিন ডলার, আর ওয়াইফাইসহ W ভার্সনটির দাম 10 মার্কিন ডলার।
+
+> 🎓 এখানের সিপিইউ গুলো ARM processor এর, যা আমাদের পিসি বা ম্যাক এর রেগুলার Intel/AMD x86 বা x64 প্রসেসরের মতো নয়। তবে মাইক্রোকন্ট্রোলারের পাশাপাশি প্রায় সমস্ত মোবাইল ফোন, মাইক্রোসফ্ট সারফেস এক্স এবং নতুন অ্যাপল সিলিকন ভিত্তিক অ্যাপল ম্যাক এর সিপিইউগুলির সাথে এদের মিল রয়েছে।
+
+রাস্পবেরি পাই এর সমস্ত ভ্যারিয়েন্ট রাস্পবেরি পাই ওএস নামে ডেবিয়ান লিনাক্সের একটি ভার্সন চালায়। এটি ডেস্কটপ ছাড়া একটি 'লাইট' ভার্সন হিসেবে পাওয়া যায়, যা 'হেডলেস' প্রজেক্টগুলোর জন্য উপযুক্ত - যেখানে স্ক্রিন বা পূর্ণ ডেস্কটপ পরিবেশের কোন প্রয়োজনই নেই; আবার ওয়েব ব্রাউজার, অফিস অ্যাপ্লিকেশন, কোডিং সরঞ্জাম এবং গেমসহ একটি পূর্ণ ডেস্কটপ ভার্সনও রয়েছে। এই ওএস ডেবিয়ান লিনাক্সের একটি সংস্করণ বলে, ডেবিয়ানে চলে এবং ARM প্রসেসরের জন্য তৈরি - এমন প্রায় যেকোন অ্যাপ্লিকেশন বা টুল আমরা এতে ইনস্টল করতে পারবো।
+
+#### কাজ
+
+রাস্পবেরি পাই পর্যালোচনা
+
+যদি এই লেসনটির জন্য আমরা রাস্পবেরি পাই ব্যবহার করি, তাহলে বোর্ডের বিভিন্ন হার্ডওয়্যার উপাদানগুলি সম্পর্কে ভালোভাবে জানতে হবে।
+
+* প্রসেসর সম্পর্কিত ডিটেইলস [Raspberry Pi hardware documentation page](https://www.raspberrypi.org/documentation/hardware/raspberrypi/) এ পাওয়া যাবে। আমাদের ব্যবহার করা পাই এর প্রসেসর সম্পর্কে ঐ পেইজটি থেকে জানতে পারবো।
+* GPIO পিনগুলো খুঁজে বের করি। [Raspberry Pi GPIO documentation](https://www.raspberrypi.org/documentation/hardware/raspberrypi/gpio/README.md)থেকে এদের ব্যাপারে আরো বিস্তারিত জানতে পারবো। [GPIO Pin Usage guide](https://www.raspberrypi.org/documentation/usage/gpio/README.md) টি পড়লে পাই এর বিভিন্ন পিন সম্পর্কে আমরা বিস্তারিত জানবো।
+
+### সিংগেল-বোর্ড কম্পিউটারে প্রোগ্রামিং
+
+সিংগেল-বোর্ড কম্পিউটারগুলিকে সম্পূর্ণ কম্পিউটার বলা যায়, যা একটি সম্পূর্ণ ওএস-এ চলে। এর অর্থ হল, মাইক্রোকন্ট্রোলারের (যেমন আরডুইনোর) মতো বোর্ডের ফ্রেমওয়ার্ক সাপোর্টের মধ্যে সীমাবদ্ধ না থেকে, অনেক বেশি প্রোগ্রামিং ভাষা, ফ্রেমওয়ার্ক এবং টুল ব্যবহার করে এখানে কোডিং করা যায়। বেশিরভাগ প্রোগ্রামিং ভাষারই এমন লাইব্রেরি রয়েছে যা GPIO পিন ব্যবহার করে সেন্সর থেকে ডেটা গ্রহণ এবং অ্যাকচুয়েটরে সংকেত প্রেরণ করতে পারে।
+
+✅ আমরা কোন কোন প্রোগ্রামিং ভাষার সাথে পরিচিত? তারা কি লিনাক্স এ সাপোর্টেড ?
+
+রাস্পবেরি পাইতে আইওটি অ্যাপ্লিকেশন তৈরির সর্বাধিক সাধারণ প্রোগ্রামিং ল্যাঙ্গুয়েজ হল পাইথন। পাইয়ের জন্য বানানো হার্ডওয়্যারের এক বিশাল ইকোসিস্টেম রয়েছে এবং এগুলির প্রায় সবগুলিতেই পাইথন লাইব্রেরি হিসাবে তাদের ব্যবহার করার জন্য প্রয়োজনীয় কোড অন্তর্ভুক্ত রয়েছে। এই ইকোসিস্টেমের কিছু অংশ 'হ্যাট' এর উপর ভিত্তি করে তৈরি - টুপির মতো একটি লেয়ার যা পাইয়ের উপরে বসে 40টি জিপিআইও পিনের বড় সকেটের সাথে সংযুক্ত থাকে। এই হ্যাটগুলি অতিরিক্ত কিছু সুবিধা দেয়, যেমন স্ক্রিন, সেন্সর, রিমোট কন্ট্রোলড কার, অথবা এমন অ্যাডাপ্টার যার সাহায্যে স্ট্যান্ডার্ড ক্যাবল ব্যবহার করেই সেন্সর যুক্ত করা যায়।
+
+### প্রফেশনাল পর্যায়ে আইওটি তৈরীর ক্ষেত্রে সিংগেল-বোর্ড কম্পিউটারের ব্যবহার
+
+সিংগেল-বোর্ড কম্পিউটারগুলি কেবলমাত্র ডেভলাপার কিট হিসাবে নয়, বরং প্রফেশনাল পর্যায়ে আইওটি তৈরীর ক্ষেত্রেও ব্যবহৃত হয়। এগুলো হার্ডওয়্যার নিয়ন্ত্রণ এবং মেশিন লার্নিং মডেল চালানোর মতো জটিল কাজ সম্পাদনের শক্তিশালী উপায় সরবরাহ করতে পারে। উদাহরণস্বরূপ, [রাস্পবেরি পাই-4 কম্পিউট মডিউল](https://www.raspberrypi.org/blog/raspberry-pi-compute-module-4/) রয়েছে যা রাস্পবেরি পাই-4 এর সমস্ত ক্ষমতা সরবরাহ করে, তবে একটি ছোট ও সাশ্রয়ী ফর্ম ফ্যাক্টরে - যেখানে বেশিরভাগ পোর্ট নেই এবং যা কাস্টম হার্ডওয়্যারে ইনস্টল করার জন্য নকশা করা হয়েছে।
+
+---
+
+## 🚀 চ্যালেঞ্জ
+
+গত লেসনের চ্যালেঞ্জটি ছিল বাড়ি, স্কুল বা কর্মক্ষেত্রে যতগুলি আইওটি ডিভাইস রয়েছে তার তালিকা করা। এই তালিকার প্রতিটি ডিভাইসের জন্য কী মাইক্রোকন্ট্রোলার বা সিংগেল-বোর্ড কম্পিউটার ব্যবহৃত হয় ? নাকি উভয়ের মিশ্রণের ফলেই এরা নির্মিত?
+
+## লেকচার পরবর্তী কুইজ
+
+[লেকচার পরবর্তী কুইজ](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/4)
+
+## রিভিউ এবং স্ব-অধ্যয়ন
+
+* [Arduino getting started guide](https://www.arduino.cc/en/Guide/Introduction) টি পড়ে আরডুইনো প্লাটফর্ম সম্পর্কে আরো জানতে হবে।
+* [introduction to the Raspberry Pi 4](https://www.raspberrypi.org/products/raspberry-pi-4-model-b/) পড়ে রাস্পবেরি পাই সম্পর্কে আরো জানতে হবে।
+
+✅ কোন হার্ডওয়্যার প্ল্যাটফর্ম ব্যবহার করা হবে, নাকি শুধুমাত্র ভার্চুয়াল ডিভাইস ব্যবহার করা হবে - তা সিদ্ধান্ত নেওয়ার জন্য [হার্ডওয়্যার গাইডের](../../../hardware.md) লিংকগুলোতে প্রদত্ত খরচের তুলনা করা যেতে পারে।
+
+## এসাইনমেন্ট
+
+[মাইক্রোকন্ট্রোলার এবং সিংগেল-বোর্ড কম্পিউটারের তুলনা করে পার্থক্য দাঁড় করানো](assignment.md)।
diff --git a/1-getting-started/lessons/2-deeper-dive/translations/assignment.ar.md b/1-getting-started/lessons/2-deeper-dive/translations/assignment.ar.md
new file mode 100644
index 00000000..df7ca1c2
--- /dev/null
+++ b/1-getting-started/lessons/2-deeper-dive/translations/assignment.ar.md
@@ -0,0 +1,16 @@
+
+
+# قارن بين المتحكمات الدقيقة وأجهزة الكمبيوتر أحادية اللوحة
+
+## التعليمات
+
+غطى هذا الدرس المتحكمات الدقيقة وأجهزة الكمبيوتر أحادية اللوحة. قم بإنشاء جدول لمقارنتها ، ولاحظ على الأقل سببين لاستخدام متحكم دقيق على جهاز كمبيوتر ذي لوحة واحدة ، وسببين على الأقل لاستخدام جهاز كمبيوتر من لوحة واحدة بدلاً من متحكم دقيق.
+
+## نماذج
+
+| معايير | نموذجي | مناسب | يحتاج الى تحسين |
+| -------- | --------- | -------- | ----------------- |
+| قم بإنشاء جدول يقارن المتحكمات الدقيقة بأجهزة الكمبيوتر أحادية اللوحة | إنشاء قائمة بالعناصر المتعددة للمقارنة والتباين بشكل صحيح | تم إنشاء قائمة تحتوي على عنصرين فقط | كان قادرًا على الخروج بعنصر واحد فقط ، أو لا توجد عناصر للمقارنة والتباين |
+| أسباب استخدام أحدهما على الآخر | كان قادرًا على تقديم سببين أو أكثر لاجهزة التحكم الدقيق ، وسببين أو أكثر لأجهزة الكمبيوتر ذات اللوحة الواحدة | كان قادرًا فقط على تقديم 1-2 سبب لمتحكم دقيق ، وسببين أو اكثر لجهاز كمبيوتر لوحة واحدة | لم يكن قادرًا على تقديم سبب واحد أو أكثر لمتحكم دقيق أو لجهاز كمبيوتر أحادي اللوحة |
+
+
\ No newline at end of file
diff --git a/1-getting-started/lessons/2-deeper-dive/translations/assignment.bn.md b/1-getting-started/lessons/2-deeper-dive/translations/assignment.bn.md
new file mode 100644
index 00000000..1b6c97b4
--- /dev/null
+++ b/1-getting-started/lessons/2-deeper-dive/translations/assignment.bn.md
@@ -0,0 +1,13 @@
+# মাইক্রোকন্ট্রোলার এবং সিংগেল-বোর্ড কম্পিউটারের তুলনা করে পার্থক্য দাঁড় করানো
+
+## নির্দেশনা
+
+এই পাঠটিতে মাইক্রোকন্ট্রোলার এবং সিংগেল-বোর্ড কম্পিউটার নিয়ে আলোচনা হয়েছে। তাদের তুলনা এবং বৈপরীত্য সম্বলিত একটি সারণী তৈরি করে কমপক্ষে ২টি কারণ লিখতে হবে যে কেন একটি সিংগেল-বোর্ড কম্পিউটারের পরিবর্তে মাইক্রোকন্ট্রোলার ব্যবহার করা উচিত। একইভাবে কমপক্ষে ২টি কারণ লিখতে হবে যে কেন একটি মাইক্রোকন্ট্রোলারের পরিবর্তে সিংগেল-বোর্ড কম্পিউটার ব্যবহার করা উচিত।
+
+
+## এসাইনমেন্ট মূল্যায়ন মানদন্ড
+
+| ক্রাইটেরিয়া | দৃষ্টান্তমূলক ব্যখ্যা (সর্বোত্তম) | পর্যাপ্ত ব্যখ্যা (মাঝারি) | আরো উন্নতির প্রয়োজন (নিম্ন) |
+| -------- | ---------------------- | ------------------- | ------------------------- |
+| একক-বোর্ড কম্পিউটারের সাথে মাইক্রোকন্ট্রোলার এর তুলনা করে একটি সারণী তৈরি করা | একাধিক আইটেম সঠিকভাবে তুলনা এবং বৈপরীত্যসহ একটি তালিকা তৈরি করেছে | কেবল অল্প কয়েকটি বিষয় নিয়ে একটি তালিকা তৈরি করেছে | শুধুমাত্র একটি আইটেম দিতে পেরেছে, অথবা তুলনা ও বৈপরীত্যের জন্য কোন আইটেমই দিতে পারেনি |
+| একটির পরিবর্তে অন্যটি ব্যবহারের কারণ | ২ বা ততোধিক কারণ প্রদর্শন করেছে | ১ বা ২টি কারণ প্রদর্শন করেছে | ১ বা ততোধিক কারণ প্রদর্শন করতে পারেনি |
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/README.md b/1-getting-started/lessons/3-sensors-and-actuators/README.md
index e0546633..4efefeb5 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/README.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/README.md
@@ -1,8 +1,8 @@
# Interact with the physical world with sensors and actuators
-Add a sketchnote if possible/appropriate
+
-
+> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py
index 0e1ccc45..43f1cfaf 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-actuator/pi/nightlight/app.py
@@ -1,12 +1,12 @@
import time
-import seeed_si114x
+from grove.grove_light_sensor_v1_2 import GroveLightSensor
from grove.grove_led import GroveLed
-light_sensor = seeed_si114x.grove_si114x()
+light_sensor = GroveLightSensor(0)
led = GroveLed(5)
while True:
- light = light_sensor.ReadVisible
+ light = light_sensor.light
print('Light level:', light)
if light < 300:
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py b/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py
index 5d44a7a3..54f58874 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py
+++ b/1-getting-started/lessons/3-sensors-and-actuators/code-sensor/pi/nightlight/app.py
@@ -1,9 +1,10 @@
import time
-import seeed_si114x
+from grove.grove_light_sensor_v1_2 import GroveLightSensor
-light_sensor = seeed_si114x.grove_si114x()
+light_sensor = GroveLightSensor(0)
while True:
- light = light_sensor.ReadVisible
+ light = light_sensor.light
print('Light level:', light)
+
time.sleep(1)
\ No newline at end of file
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md b/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md
index b6a082e6..78e78cc6 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/pi-actuator.md
@@ -44,7 +44,7 @@ Connect the LED.
## Program the nightlight
-The nightlight can now be programmed using the Grove sunlight sensor and the Grove LED.
+The nightlight can now be programmed using the Grove light sensor and the Grove LED.
### Task - program the nightlight
@@ -93,7 +93,7 @@ Program the nightlight.
python3 app.py
```
- You should see light values being output to the console.
+ Light values will be output to the console.
```output
pi@raspberrypi:~/nightlight $ python3 app.py
@@ -105,7 +105,7 @@ Program the nightlight.
Light level: 290
```
-1. Cover and uncover the sunlight sensor. Notice how the LED will light up if the light level is 300 or less, and turn off when the light level is greater than 300.
+1. Cover and uncover the light sensor. Notice how the LED will light up if the light level is 300 or less, and turn off when the light level is greater than 300.
> 💁 If the LED doesn't turn on, make sure it is connected the right way round, and the spin button is set to full on.
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md b/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md
index 7c968115..f4af4d18 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/pi-sensor.md
@@ -4,33 +4,31 @@ In this part of the lesson, you will add a light sensor to your Raspberry Pi.
## Hardware
-The sensor for this lesson is a **sunlight sensor** that uses [photodiodes](https://wikipedia.org/wiki/Photodiode) to convert visible and infrared light to an electrical signal. This is an analog sensor that sends an integer value from 0 to 1,023 indicating a relative amount of light, but this can be used to calculate exact values in [lux](https://wikipedia.org/wiki/Lux) by taking data from the separate infrared and visible light sensors.
+The sensor for this lesson is a **light sensor** that uses a [photodiode](https://wikipedia.org/wiki/Photodiode) to convert light to an electrical signal. This is an analog sensor that sends an integer value from 0 to 1,000 indicating a relative amount of light that doesn't map to any standard unit of measurement such as [lux](https://wikipedia.org/wiki/Lux).
-The sunlight sensor is an eternal Grove sensor and needs to be connected to the Grove Base hat on the Raspberry Pi.
+The light sensor is an external Grove sensor and needs to be connected to the Grove Base hat on the Raspberry Pi.
-### Connect the sunlight sensor
+### Connect the light sensor
-The Grove sunlight sensor that is used to detect the light levels needs to be connected to the Raspberry Pi.
+The Grove light sensor that is used to detect the light levels needs to be connected to the Raspberry Pi.
-#### Task - connect the sunlight sensor
+#### Task - connect the light sensor
-Connect the sunlight sensor
+Connect the light sensor
-
+
-1. Insert one end of a Grove cable into the socket on the sunlight sensor module. It will only go in one way round.
+1. Insert one end of a Grove cable into the socket on the light sensor module. It will only go in one way round.
-1. With the Raspberry Pi powered off, connect the other end of the Grove cable to one of the three the I2C sockets marked **I2C** on the Grove Base hat attached to the Pi. This socket is the second from the right, on the row of sockets next to the GPIO pins.
+1. With the Raspberry Pi powered off, connect the other end of the Grove cable to the analog socket marked **A0** on the Grove Base hat attached to the Pi. This socket is the second from the right, on the row of sockets next to the GPIO pins.
- > 💁 I2C is a way sensors and actuators can communicate with an IoT device. It will be covered in more detail in a later lesson.
+
-
+## Program the light sensor
-## Program the sunlight sensor
+The device can now be programmed using the Grove light sensor.
-The device can now be programmed using the Grove sunlight sensor.
-
-### Task - program the sunlight sensor
+### Task - program the light sensor
Program the device.
@@ -38,44 +36,36 @@ Program the device.
1. Open the nightlight project in VS Code that you created in the previous part of this assignment, either running directly on the Pi or connected using the Remote SSH extension.
-1. Run the following command to install a pip package for working with the sunlight sensor:
-
- ```sh
- pip3 install seeed-python-si114x
- ```
-
- Not all the libraries for the Grove Sensors are installed with the Grove install script you used in an earlier lesson. Some need additional packages.
-
1. Open the `app.py` file and remove all code from it
1. Add the following code to the `app.py` file to import some required libraries:
```python
import time
- import seeed_si114x
+ from grove.grove_light_sensor_v1_2 import GroveLightSensor
```
The `import time` statement imports the `time` module that will be used later in this assignment.
- The `import seeed_si114x` statement imports the `seeed_si114x` module that has code to interact with the Grove sunlight sensor.
+ The `from grove.grove_light_sensor_v1_2 import GroveLightSensor` statement imports the `GroveLightSensor` from the Grove Python libraries. This library has code to interact with a Grove light sensor, and was installed globally during the Pi setup.
1. Add the following code after the code above to create an instance of the class that manages the light sensor:
```python
- light_sensor = seeed_si114x.grove_si114x()
+ light_sensor = GroveLightSensor(0)
```
- The line `light_sensor = seeed_si114x.grove_si114x()` creates an instance of the `grove_si114x` sunlight sensor class.
+ The line `light_sensor = GroveLightSensor(0)` creates an instance of the `GroveLightSensor` class connecting to pin **A0** - the analog Grove pin that the light sensor is connected to.
1. Add an infinite loop after the code above to poll the light sensor value and print it to the console:
```python
while True:
- light = light_sensor.ReadVisible
+ light = light_sensor.light
print('Light level:', light)
```
- This will read the current sunlight level on a scale of 0-1,023 using the `ReadVisible` property of the `grove_si114x` class. This value is then printed to the console.
+ This will read the current light level on a scale of 0-1,023 using the `light` property of the `GroveLightSensor` class. This property reads the analog value from the pin. This value is then printed to the console.
1. Add a small sleep of one second at the end of the `loop` as the light levels don't need to be checked continuously. A sleep reduces the power consumption of the device.
@@ -89,16 +79,16 @@ Program the device.
python3 app.py
```
- You should see sunlight values being output to the console. Cover and uncover the sunlight sensor to see the values change:
+ Light values will be output to the console. Cover and uncover the light sensor, and the values will change:
```output
pi@raspberrypi:~/nightlight $ python3 app.py
- Light level: 259
- Light level: 265
- Light level: 265
- Light level: 584
- Light level: 550
- Light level: 497
+ Light level: 634
+ Light level: 634
+ Light level: 634
+ Light level: 230
+ Light level: 104
+ Light level: 290
```
> 💁 You can find this code in the [code-sensor/pi](code-sensor/pi) folder.
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/translations/README.ar.md b/1-getting-started/lessons/3-sensors-and-actuators/translations/README.ar.md
new file mode 100644
index 00000000..a9377534
--- /dev/null
+++ b/1-getting-started/lessons/3-sensors-and-actuators/translations/README.ar.md
@@ -0,0 +1,227 @@
+# تفاعل مع العالم المادي باستخدام المستشعرات والمحركات
+
+يقدم هذا الدرس اثنين من المفاهيم الهامة لجهاز إنترنت الأشياء الخاص بك - أجهزة الاستشعار والمشغلات. ستحصل أيضًا على تدريب عملي مع كليهما ، بإضافة مستشعر الضوء إلى مشروع إنترنت الأشياء الخاص بك ، ثم إضافة مؤشر LED يتم التحكم فيه بواسطة مستويات الإضاءة ، مما يؤدي إلى بناء ضوء ليلي بشكل فعال.
+
+
+
+سنغطي في هذا الدرس:
+
+
+* [ما هي المستشعرات؟](#what-are-sensors)
+* [استخدم جهاز استشعار](#use-a-sensor)
+* [أنواع أجهزة الاستشعار](#sensor-types)
+* [ما هي المحركات؟](#what-are-actuators)
+* [استخدم مشغل](#use-an-actuator)
+* [أنواع المحركات](#actuator-types)
+
+## ما هي المستشعرات؟
+
+أجهزة الاستشعار هي أجهزة تستشعر العالم المادي - أي أنها تقيس خاصية واحدة أو أكثر من حولها وترسل المعلومات إلى جهاز إنترنت الأشياء. تغطي المستشعرات مجموعة كبيرة من الأجهزة حيث يوجد الكثير من الأشياء التي يمكن قياسها ، من الخصائص الطبيعية مثل درجة حرارة الهواء إلى التفاعلات الفيزيائية مثل الحركة.
+
+تتضمن بعض المستشعرات الشائعة ما يلي:
+
+* مستشعرات درجة الحرارة - تستشعر درجة حرارة الهواء أو درجة حرارة ما يتم غمرها فيه. بالنسبة للهواة والمطورين ، غالبًا ما يتم دمجها مع ضغط الهواء والرطوبة في مستشعر واحد.
+* الأزرار - تشعر بها عند الضغط عليها.
+* مستشعرات الضوء - تكتشف مستويات الضوء ويمكن أن تكون لألوان معينة أو ضوء الأشعة فوق البنفسجية أو ضوء الأشعة تحت الحمراء أو الضوء المرئي العام.
+* الكاميرات - تستشعر التمثيل المرئي للعالم من خلال التقاط صورة أو دفق الفيديو.
+* مقاييس التسارع - تستشعر هذه الحركة في اتجاهات متعددة.
+* الميكروفونات - تستشعر الأصوات ، سواء كانت مستويات صوت عامة أو صوت اتجاهي.
+
+✅ قم ببعض البحث. ما المستشعرات الموجودة بهاتفك؟
+
+تشترك جميع المستشعرات في شيء واحد - فهي تحول كل ما تشعر به إلى إشارة كهربائية يمكن تفسيرها بواسطة جهاز إنترنت الأشياء. تعتمد كيفية تفسير هذه الإشارة الكهربائية على المستشعر ، بالإضافة إلى بروتوكول الاتصال المستخدم للتواصل مع جهاز إنترنت الأشياء.
+
+
+## استخدم جهاز استشعار
+
+اتبع الدليل ذي الصلة أدناه لإضافة مستشعر إلى جهاز إنترنت الأشياء الخاص بك:
+
+* [Arduino - Wio Terminal](wio-terminal-sensor.md)
+* [كمبيوتر ذو لوحة واحدة - Raspberry Pi](pi-sensor.md)
+* [كمبيوتر ذو لوحة واحدة - جهاز افتراضي](virtual-device-sensor.md)
+
+## أنواع أجهزة الاستشعار
+
+المستشعرات إما قياسية أو رقمية.
+
+### المستشعرات القياسية
+
+بعض المستشعرات الأساسية هي أجهزة استشعار تمثيلية. تتلقى هذه المستشعرات فولت من جهاز إنترنت الأشياء ، وتقوم مكونات المستشعر بضبط هذا الفولت ، ويتم قياس الفولت الذي يتم إرجاعه من المستشعر لإعطاء قيمة المستشعر.
+
+> 🎓 الفولت هو مقياس لمقدار الدفع لنقل الكهرباء من مكان إلى آخر ، مثل من طرف موجب للبطارية إلى الطرف السالب. على سبيل المثال ، بطارية AA القياسية هي 1.5 فولت (V هو رمز فولت) ويمكنها دفع الكهرباء بقوة 1.5 فولت من طرفها الموجب إلى الطرف السالب. تتطلب الأجهزة الكهربائية المختلفة فولتًا مختلفًا للعمل ، على سبيل المثال ، يمكن لمصباح LED أن يضيء بفولت يتراوح بين 2-3 فولت ، لكن المصباح الخيطي 100 وات يحتاج إلى 240 فولت. يمكنك قراءة المزيد عن الفولت على صفحة الفولت على ويكيبيديا
+
+أحد الأمثلة على ذلك هو مقياس الفولت. هذا قرص يمكنك تدويره بين موضعين ويقيس المستشعر الدوران.
+
+
+
+***A potentiometer. Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
+
+سيرسل جهاز إنترنت الأشياء إشارة كهربائية إلى مقياس الفولت ، مثل 5 فولت . عندما يتم ضبط مقياس الفولت فإنه يغير الفولت الذي يخرج من الجانب الآخر. تخيل أن لديك مقياس فولت مُصنَّف على أنه قرص يمتد من 0 إلى 11 ، مثل مقبض الصوت في مكبر الصوت. عندما يكون مقياس الفولت في وضع إيقاف التشغيل الكامل (0) ، فسيخرج 0 فولت (0 فولت). عندما يكون في وضع التشغيل الكامل (11) ، سيخرج 5 فولت (5 فولت).
+
+> 🎓 هذا تبسيط مفرط ، ويمكنك قراءة المزيد عن مقاييس الفولت والمقاومات المتغيرة على potentiometer Wikipedia page
+
+يتم بعد ذلك قراءة الفولت الذي يخرج من المستشعر بواسطة جهاز إنترنت الأشياء ، ويمكن للجهاز الاستجابة له. اعتمادًا على المستشعر ، يمكن أن يكون هذا الفولت قيمة عشوائية أو يمكن تعيينه إلى وحدة قياسية. على سبيل المثال ، يقوم مستشعر درجة الحرارة التناظرية المستند إلى thermistor بتغيير مقاومته اعتمادًا على درجة الحرارة. يمكن بعد ذلك تحويل فولت الخرج إلى درجة حرارة بوحدة كلفن ، وبالتالي إلى درجة مئوية أو درجة فهرنهايت ، عن طريق الحسابات في الكود.
+
+✅ ما الذي يحدث برأيك إذا قام المستشعر بإرجاع فولت أعلى مما تم إرساله (على سبيل المثال قادم من مصدر طاقة خارجي)؟ ⛔️ لا تختبر ذلك.
+
+#### التحويل القياسي إلى الرقمي
+
+أجهزة إنترنت الأشياء رقمية - لا يمكنها العمل مع القيم التناظرية ، فهي تعمل فقط مع 0 و 1. هذا يعني أنه يجب تحويل قيم المستشعرات التناظرية إلى إشارة رقمية قبل معالجتها. تحتوي العديد من أجهزة إنترنت الأشياء على محولات من التناظرية إلى الرقمية (ADC) لتحويل المدخلات التناظرية إلى تمثيلات رقمية لقيمتها. يمكن أن تعمل المستشعرات أيضًا مع ADC عبر لوحة موصل. على سبيل المثال ، في نظام Seeed Grove البيئي مع Raspberry Pi ، تتصل المستشعرات التناظرية بمنافذ محددة على "قبعة" مثبتة على Pi متصلة بدبابيس GPIO الخاصة بـ Pi ، وتحتوي هذه القبعة على ADC لتحويل الجهد إلى إشارة رقمية التي يمكن إرسالها من دبابيس GPIO الخاصة بـ Pi.
+
+تخيل أن لديك مستشعر ضوء تناظري متصل بجهاز إنترنت الأشياء يستخدم 3.3 فولت ويعيد قيمة 1 فولت. لا يعني هذا 1V أي شيء في العالم الرقمي ، لذلك يجب تحويله. سيتم تحويل الجهد إلى قيمة تمثيلية باستخدام مقياس يعتمد على الجهاز والمستشعر. أحد الأمثلة على ذلك هو مستشعر الضوء Seeed Grove الذي ينتج قيمًا من 0 إلى 1023. بالنسبة لهذا المستشعر الذي يعمل عند 3.3 فولت ، سيكون خرج 1 فولت بقيمة 300. لا يمكن لجهاز إنترنت الأشياء التعامل مع 300 كقيمة تناظرية ، لذلك سيتم تحويل القيمة إلى "0000000100101100" ، التمثيل الثنائي 300 بواسطة Grove قبعة. ثم تتم معالجة ذلك بواسطة جهاز إنترنت الأشياء.
+
+✅ إذا كنت لا تعرف النظام الثنائي ، فقم بإجراء قدر صغير من البحث لمعرفة كيفية تمثيل الأرقام بالأصفار والآحاد. تعتبر مقدمة BBC Bitesize للدرس الثنائي مكانًا رائعًا للبدء.
+
+من منظور الترميز ، يتم التعامل مع كل هذا عادةً بواسطة المكتبات التي تأتي مع أجهزة الاستشعار ، لذلك لا داعي للقلق بشأن هذا التحويل بنفسك. بالنسبة لمستشعر الضوء Grove ، يمكنك استخدام مكتبة Python واستدعاء خاصية "light" ، أو استخدام مكتبة Arduino واستدعاء "analogRead" للحصول على قيمة 300.
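+
+على سبيل المثال، الرسم التوضيحي المبسّط التالي (وهو مجرد مثال افتراضي يفترض استخدام إطار عمل Arduino وأن المستشعر موصول بالمنفذ التناظري A0) يقرأ القيمة ويطبعها:
+
+```cpp
+// Hypothetical example: read an analog light sensor assumed to be wired to pin A0
+void setup() {
+  Serial.begin(9600);
+  pinMode(A0, INPUT);
+}
+
+void loop() {
+  int light = analogRead(A0);   // the library/ADC handles the conversion, e.g. returns 300
+  Serial.print("Light level: ");
+  Serial.println(light);
+  delay(1000);
+}
+```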
+
+### المستشعرات الرقمية
+
+تكتشف المستشعرات الرقمية ، مثل المستشعرات التناظرية ، العالم من حولها باستخدام التغيرات في الجهد الكهربائي. الفرق هو أنهم يخرجون إشارة رقمية ، إما عن طريق قياس حالتين فقط أو باستخدام ADC مدمج. أصبحت المستشعرات الرقمية أكثر شيوعًا لتجنب الحاجة إلى استخدام ADC إما في لوحة الموصل أو على جهاز إنترنت الأشياء نفسه.
+
+أبسط مستشعر رقمي هو زر أو مفتاح. هذا جهاز استشعار بحالتين ، يعمل أو لا يعمل.
+
+
+
+***A button. Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
+
+يمكن أن تقيس الدبابيس الموجودة على أجهزة إنترنت الأشياء مثل دبابيس GPIO هذه الإشارة مباشرة على أنها 0 أو 1. إذا كان الجهد المرسل هو نفس الجهد الذي تم إرجاعه ، فإن القيمة المقروءة هي 1 ، وإلا فإن القيمة المقروءة هي 0. ليست هناك حاجة للتحويل الإشارة ، يمكن أن تكون 1 أو 0 فقط.
+
+> 💁 الفولتية لا تكون دقيقة أبدًا خاصة وأن المكونات الموجودة في المستشعر سيكون لها بعض المقاومة ، لذلك عادة ما يكون هناك تفاوت. على سبيل المثال ، تعمل دبابيس GPIO على Raspberry Pi على 3.3 فولت ، وتقرأ إشارة عودة أعلى من 1.8 فولت على أنها 1 ، وأقل من 1.8 فولت مثل 0.
+
+* 3.3 فولت يذهب إلى الزر. الزر مغلق حتى يخرج 0 فولت ، مما يعطي القيمة 0
+* 3.3 فولت يذهب إلى الزر. الزر في وضع التشغيل بحيث يخرج 3.3 فولت ، مما يعطي القيمة 1
+
+تقوم المستشعرات الرقمية الأكثر تقدمًا بقراءة القيم التناظرية ، ثم تحويلها باستخدام ADC المدمجة إلى إشارات رقمية. على سبيل المثال ، سيظل مستشعر درجة الحرارة الرقمي يستخدم مزدوجًا حراريًا بنفس طريقة المستشعر التناظري ، وسيظل يقيس التغير في الجهد الناتج عن مقاومة المزدوجة الحرارية عند درجة الحرارة الحالية. بدلاً من إرجاع القيمة التناظرية والاعتماد على الجهاز أو لوحة الموصل للتحويل إلى إشارة رقمية ، ستقوم وحدة ADC المدمجة في المستشعر بتحويل القيمة وإرسالها كسلسلة من 0 و 1 إلى جهاز إنترنت الأشياء. يتم إرسال هذه القيم من 0 و 1 بنفس طريقة إرسال الإشارة الرقمية للزر حيث يمثل 1 جهدًا كاملاً و 0 يمثل 0 فولت.
+
+
+
+***A digital temperature sensor. Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+يتيح إرسال البيانات الرقمية لأجهزة الاستشعار أن تصبح أكثر تعقيدًا وإرسال بيانات أكثر تفصيلاً ، حتى البيانات المشفرة لأجهزة الاستشعار الآمنة. مثال واحد هو الكاميرا. هذا مستشعر يلتقط صورة ويرسلها كبيانات رقمية تحتوي على تلك الصورة ، عادة بتنسيق مضغوط مثل JPEG ، ليقرأها جهاز إنترنت الأشياء. يمكنه حتى دفق الفيديو عن طريق التقاط الصور وإرسال إما إطار الصورة الكامل بإطار أو بث فيديو مضغوط.
+
+## ما هي المحركات؟
+
+المشغلات هي عكس المستشعرات - فهي تقوم بتحويل الإشارة الكهربائية من جهاز إنترنت الأشياء الخاص بك إلى تفاعل مع العالم المادي مثل إصدار الضوء أو الصوت أو تحريك المحرك.
+
+تتضمن بعض المحركات الشائعة ما يلي:
+
+* LED - ينبعث منها ضوء عند تشغيله
+* مكبر الصوت - يصدر صوتًا بناءً على الإشارة المرسلة إليهم ، من الجرس الأساسي إلى مكبر الصوت الذي يمكنه تشغيل الموسيقى
+* محرك متدرج - يقوم بتحويل الإشارة إلى مقدار محدد من الدوران ، مثل تدوير القرص بزاوية 90 درجة
+* الترحيل - هذه هي المفاتيح التي يمكن تشغيلها أو إيقاف تشغيلها بواسطة إشارة كهربائية. إنها تسمح بجهد صغير من جهاز إنترنت الأشياء لتشغيل الفولتية الأكبر.
+* الشاشات - هذه مشغلات أكثر تعقيدًا وتعرض معلومات على شاشة متعددة الأجزاء. تختلف الشاشات من شاشات LED البسيطة إلى شاشات الفيديو عالية الدقة.
+
+✅ قم ببعض البحث. ما هي المشغلات التي يمتلكها هاتفك؟
+
+## استخدام مشغل
+
+اتبع الدليل ذي الصلة أدناه لإضافة مشغل إلى جهاز إنترنت الأشياء الخاص بك ، والذي يتحكم فيه المستشعر ، لإنشاء ضوء ليلي لإنترنت الأشياء. سيجمع مستويات الضوء من مستشعر الضوء ، ويستخدم مشغلًا على شكل LED لإصدار الضوء عندما يكون مستوى الضوء المكتشف منخفضًا جدًا.
+
+
+
+***A flow chart of the assignment showing light levels being read and checked, and the LED begin controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
+
+* [Arduino - Wio Terminal](wio-terminal-actuator.md)
+* [كمبيوتر ذو لوحة واحدة - Raspberry Pi](pi-actuator.md)
+* [كمبيوتر ذو لوحة واحدة - Virtual device](virtual-device-actuator.md)
+
+## أنواع المحرك
+
+مثل المستشعرات ، تكون المحركات إما قياسية أو رقمية.
+
+### المحركات القياسية
+
+تأخذ المشغلات القياسية إشارة قياسية وتحولها إلى نوع من التفاعل ، حيث يتغير التفاعل بناءً على الجهد المزود.
+
+أحد الأمثلة هو الضوء الخافت ، مثل الذي قد يكون لديك في منزلك. يحدد مقدار الجهد المقدم للضوء مدى سطوعه.
+
+
+
+***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+كما هو الحال مع المستشعرات ، يعمل جهاز إنترنت الأشياء الفعلي على الإشارات الرقمية وليس التناظرية. هذا يعني لإرسال إشارة تناظرية ، يحتاج جهاز إنترنت الأشياء إلى محول رقمي إلى تناظري (DAC) ، إما على جهاز إنترنت الأشياء مباشرة ، أو على لوحة الموصل. سيؤدي هذا إلى تحويل 0 و 1 من جهاز إنترنت الأشياء إلى جهد تناظري يمكن أن يستخدمه المشغل.
+
+✅ ما الذي يحدث برأيك إذا أرسل جهاز إنترنت الأشياء جهدًا أعلى مما يستطيع المشغل تحمله؟ ⛔️ لا تختبر ذلك.
+
+#### تعديل عرض النبض
+
+هناك خيار آخر لتحويل الإشارات الرقمية من جهاز إنترنت الأشياء إلى إشارة تمثيلية وهو تعديل عرض النبضة. يتضمن هذا إرسال الكثير من النبضات الرقمية القصيرة التي تعمل كما لو كانت إشارة تمثيلية.
+
+على سبيل المثال ، يمكنك استخدام PWM للتحكم في سرعة المحرك.
+
+تخيل أنك تتحكم في محرك مزود بمصدر 5 فولت. تقوم بإرسال نبضة قصيرة إلى المحرك الخاص بك ، حيث تقوم بتحويل الجهد إلى الجهد العالي (5 فولت) لمدة مائتي ثانية (0.02 ثانية). في ذلك الوقت ، يمكن لمحركك أن يدور عُشر الدوران ، أو 36 درجة. ثم تتوقف الإشارة مؤقتًا لمدة مائتي ثانية (0.02 ثانية) ، لإرسال إشارة منخفضة (0 فولت). كل دورة تشغيل ثم إيقاف تستمر 0.04 ثانية. ثم تتكرر الدورة.
+
+
+
+***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+هذا يعني أنه في ثانية واحدة لديك 25 نبضة 5 فولت من 0.02 ثانية والتي تقوم بتدوير المحرك ، يتبع كل منها توقف مؤقت بمقدار 0.02 ثانية بمقدار 0 فولت لا يقوم بتدوير المحرك. تقوم كل نبضة بتدوير المحرك بمقدار عُشر الدوران ، مما يعني أن المحرك يكمل 2.5 دورة في الثانية. لقد استخدمت إشارة رقمية لتدوير المحرك بمعدل 2.5 دورة في الثانية ، أو 150 دورة في الدقيقة ، وهو مقياس غير قياسي لسرعة الدوران).
+
+```output
+25 نبضة في الثانية × 0.1 دورة لكل نبضة = 2.5 دورة في الثانية
+2.5 دورة في الثانية × 60 ثانية في الدقيقة = 150 دورة في الدقيقة
+```
+> 🎓 عندما تكون إشارة PWM قيد التشغيل لمدة نصف الوقت ، وإيقاف تشغيلها لنصف المدة ، يشار إليها على أنها 50٪ دورة عمل. يتم قياس دورات التشغيل كنسبة مئوية من الوقت تكون فيه الإشارة في حالة التشغيل مقارنة بحالة إيقاف التشغيل.
+
+
+
+***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+يمكنك تغيير سرعة المحرك عن طريق تغيير حجم النبضات. على سبيل المثال ، باستخدام نفس المحرك ، يمكنك الحفاظ على نفس وقت الدورة عند 0.04 ثانية ، مع خفض نبضة التشغيل إلى النصف إلى 0.01 ثانية ، وزيادة نبضة الإيقاف إلى 0.03 ثانية. لديك نفس عدد النبضات في الثانية (25) ، ولكن كل نبضة تساوي نصف الطول. نبضة بطول نصف تدير المحرك بمقدار جزء من عشرين من الدورة ، وعند 25 نبضة في الثانية ستكمل 1.25 دورة في الثانية أو 75 دورة في الدقيقة. من خلال تغيير سرعة النبض لإشارة رقمية ، تكون قد خفضت سرعة المحرك التناظري إلى النصف.
+
+```output
+25 نبضة في الثانية × 0.05 دورة لكل نبضة = 1.25 دورة في الثانية
+1.25 دورة في الثانية × 60 ثانية في الدقيقة = 75 دورة في الدقيقة
+```
+
+✅ كيف تحافظ على سلاسة دوران المحرك ، خاصة عند السرعات المنخفضة؟ هل ستستخدم عددًا صغيرًا من النبضات الطويلة مع فترات توقف طويلة أم الكثير من النبضات القصيرة جدًا مع فترات توقف قصيرة جدًا؟
+
+> 💁 تستخدم بعض المستشعرات أيضًا PWM لتحويل الإشارات التناظرية إلى إشارات رقمية.
+
+> 🎓 يمكنك قراءة المزيد عن تعديل عرض النبض على صفحة تعديل عرض النبض على ويكيبيديا.
+
+### المشغلات الرقمية
+
+المشغلات الرقمية ، مثل المستشعرات الرقمية ، إما لها حالتان يتم التحكم فيهما بجهد مرتفع أو منخفض أو تحتوي على DAC مدمجة بحيث يمكنها تحويل إشارة رقمية إلى إشارة تمثيلية.
+
+أحد المشغلات الرقمية البسيطة هو LED. عندما يرسل الجهاز إشارة رقمية بقيمة 1 ، يتم إرسال جهد عالي يضيء مؤشر LED. عند إرسال إشارة رقمية بقيمة 0 ، ينخفض الجهد إلى 0 فولت وينطفئ مؤشر LED.
+
+
+
+***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+✅ ما هي المشغلات البسيطة الأخرى ذات الحالتين التي يمكنك التفكير فيها؟ أحد الأمثلة على ذلك هو الملف اللولبي ، وهو مغناطيس كهربائي يمكن تنشيطه للقيام بأشياء مثل تحريك مسمار قفل الباب / فتح قفل الباب.
+
+تتطلب المحركات الرقمية الأكثر تقدمًا ، مثل الشاشات ، إرسال البيانات الرقمية بتنسيقات معينة. عادةً ما تأتي مع مكتبات تسهل إرسال البيانات الصحيحة للتحكم فيها.
+
+---
+
+## 🚀 التحدي
+
+كان التحدي في الدرسين الأخيرين هو سرد أكبر عدد ممكن من أجهزة إنترنت الأشياء الموجودة في منزلك أو مدرستك أو مكان عملك وتحديد ما إذا كانت مبنية على وحدات تحكم دقيقة أو أجهزة كمبيوتر أحادية اللوحة ، أو حتى مزيج من الاثنين معًا.
+
+لكل جهاز أدرجته ، ما المستشعرات والمشغلات التي يتصلون بها؟ ما هو الغرض من كل حساس ومشغل متصل بهذه الأجهزة؟
+
+## اختبار ما بعد المحاضرة
+
+اختبار ما بعد المحاضرة
+
+## مراجعة ودراسة ذاتية
+
+* اقرأ عن الكهرباء والدوائر على ThingLearn.
+* اقرأ عن الأنواع المختلفة من مستشعرات درجة الحرارة في دليل مستشعرات درجة الحرارة من Seeed Studios.
+* اقرأ عن مصابيح LED على صفحة Wikipedia LED
+
+## الواجب
+
+[أجهزة الاستشعار والمحركات البحثية](assignment.md)
+
+
+
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/translations/README.bn.md b/1-getting-started/lessons/3-sensors-and-actuators/translations/README.bn.md
new file mode 100644
index 00000000..7ef60e20
--- /dev/null
+++ b/1-getting-started/lessons/3-sensors-and-actuators/translations/README.bn.md
@@ -0,0 +1,215 @@
+# সেন্সর এবং অ্যাকচুয়েটরের সাহায্যে বাহ্যিক জগতের সাথে যোগাযোগ
+
+## লেকচার পূর্ববর্তী কুইজ
+
+[লেকচার পূর্ববর্তী কুইজ](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/5)
+
+## পরিচিতি
+
+এই লেসনটি আমাদের আইওটি ডিভাইসের জন্য দুটি গুরুত্বপূর্ণ ধারণার পরিচয় করিয়ে দেয় - সেন্সর এবং অ্যাকচুয়েটর । আইওটি প্রজেক্টে লাইট সেন্সর যুক্ত করে, তারপরে আলোর মাত্রা দ্বারা নিয়ন্ত্রিত একটি এলইডি সংযুক্ত করার মাধ্যমে কার্যকরভাবে একটি রাতের আলোকীয় যন্ত্র বা 'নাইটলাইট' তৈরি করা যাবে।
+
+এই পাঠ্যে আমরা দেখবো :
+
+* [সেন্সর কী?](#সেন্সর-কী)
+* [একটি সেন্সর ব্যবহার](#একটি-সেন্সর-ব্যবহার)
+* [সেন্সর কত প্রকার](#সেন্সর-কত-প্রকার)
+* [অ্যাকচুয়েটর কী?](#অ্যাকচুয়েটর-কী)
+* [একটি অ্যাকচুয়েটর ব্যবহার](#একটি-অ্যাকচুয়েটর-ব্যবহার)
+* [অ্যাকচুয়েটর কত প্রকার](#অ্যাকচুয়েটর-কত-প্রকার)
+
+## সেন্সর কী?
+
+সেন্সরগুলি এমন হার্ডওয়্যার ডিভাইস যা বাহ্যিক জগতকে বুঝতে পারে - অর্থাৎ তারা তাদের চারপাশের এক বা একাধিক বৈশিষ্ট্য পরিমাপ করে এবং তথ্যগুলো আইওটি ডিভাইসে প্রেরণ করে। পরিমাপযোগ্য বিষয় অনেক - প্রাকৃতিক বৈশিষ্ট্য যেমন বায়ুর তাপমাত্রা থেকে শুরু করে শারীরিক মিথস্ক্রিয়া যেমন চলাফেরা পর্যন্ত - তাই সেন্সরগুলি বিস্তৃত পরিসরের ডিভাইসকে অন্তর্ভুক্ত করে।
+
+কিছু অতি ব্যবহৃত সেন্সর হলো:
+
+* তাপমাত্রা সেন্সর - এগুলি বায়ুর তাপমাত্রা বা যে মাধ্যমে নিমজ্জিত রয়েছে সেটির তাপমাত্রা নির্ণয় করতে পারে। শৌখিন এবং প্রফেশনাল ডেভলাপারদের জন্য প্রায়ই এই একটি সেন্সরেই বায়ুচাপ এবং আর্দ্রতার ও নির্ণয়ের সুবিধা প্রদান করা হয়।
+* বাটন - কেউ যদি বাটনে প্রেস করে তখন তারা তা বুঝতে পারে।
+* আলোকীয় সেন্সর - এগুলি আলোর মাত্রা সনাক্ত করে এবং নির্দিষ্ট রঙ, ইউভি আলো, আইআর লাইট বা সাধারণ দৃশ্যমান আলোর জন্য সুনির্দিষ্টভাবে কাজ করতে পারে।
+* ক্যামেরা - এগুলি কোন ছবি বা স্ট্রিমিং ভিডিও গ্রহণ করে বিশ্বের চিত্রিত প্রতিরূপ তৈরী করতে পারে।
+* একসেলেরোমিটার - বিভিন্ন দিকে গতিবিধির পরিবর্তন বুঝতে পারে।
+* মাইক্রোফোন - সাধারণ শব্দ স্তর বা সুনির্দিষ্ট কোন দিক থেকে আসা শব্দ বুঝতে পারে।
+
+✅ ছোট্ট একটি কাজ করা যাক এখন। আমাদের ব্যবহৃত ফোনে কী কী সেন্সর রয়েছে তা চিন্তা করি।
+
+সব সেন্সর এর মধ্যে একটি সাধারণ বিষয় রয়েছে - তারা যা কিছু সেন্স করতে পারে, তা বৈদ্যুতিক সংকেতে রূপান্তর করে যে ডেটা আইওটি ডিভাইস ব্যবহার করতে পারে। এই বৈদ্যুতিক সংকেতটি কীভাবে ব্যবহৃত হবে, তা সেন্সরের উপর নির্ভর করে, পাশাপাশি আইওটি ডিভাইসের সাথে যোগাযোগ করার জন্য ব্যবহৃত যোগাযোগ প্রোটোকলও এই ক্ষেত্রে প্রভাব রাখে।
+
+## একটি সেন্সর ব্যবহার
+
+আইওটি ডিভাইসে সেন্সর যুক্ত করতে নীচের কোন একটি প্রাসঙ্গিক গাইড অনুসরণ করতে হবে:
+
+* [Arduino - Wio Terminal](wio-terminal-sensor.md)
+* [Single-board computer - Raspberry Pi](pi-sensor.md)
+* [Single-board computer - Virtual device](virtual-device-sensor.md)
+
+## সেন্সর কত প্রকার
+
+সেন্সরগুলি মূলত ২ প্রকার - অ্যানালগ এবং ডিজিটাল।
+
+### অ্যানালগ সেন্সর
+
+সবথেকে বেসিক সেন্সর হল এনালগ সেন্সর। এগুলো আইওটি ডিভাইস থেকে একটি ভোল্টেজ গ্রহণ করে, সেন্সর উপাদানগুলি এই ভোল্টেজটি সামঞ্জস্য করে এবং সেন্সর থেকে ফিরে আসা ভোল্টেজটিই মান পরিমাপ করে।
+
+> 🎓ভোল্টেজ এমন এক রাশি যা বিদ্যুৎ এক জায়গা থেকে অন্য জায়গায় প্রবাহিত হবে কিনা তা ঠিক করে, যেমন ব্যাটারির পসিটিভ টার্মিনাল থেকে নেগেটিভ টার্মিনালে যায়। উদাহরণস্বরূপ, একটি স্ট্যান্ডার্ড ডাবল-এ ব্যাটারি 1.5V (V হলো ভোল্টের প্রতীক) এবং এটি পসিটিভ টার্মিনাল থেকে নেগেটিভ টার্মিনালে 1.5V এর শক্তি দিয়ে বিদ্যুৎকে প্রবাহিত করতে পারে। বিভিন্ন বৈদ্যুতিক হার্ডওয়্যারের কাজ করার জন্য বিভিন্ন ভোল্টেজের প্রয়োজন হয়, উদাহরণস্বরূপ, একটি এলইডি ২-৩ ভোল্টেজের মধ্যে আলোকিত হতে পারে, তবে একটি 100ওয়াট ফিলামেন্ট লাইটবাল্বের জন্য 240V প্রয়োজন হবে। [ভোল্টেজ - উইকিপিডিয়া পেইজ](https://wikipedia.org/wiki/Voltage) পড়লে এই সংক্রান্ত বিস্তারিত জানা যাবে।
+
+উদাহরণস্বরূপ পোটেনশিওমিটার এর কথা ধরা যাক। এটি এমন একটি ডায়াল যা আমরা দুটি অবস্থানের মধ্যে ঘোরাই এবং সেন্সরটি ঘূর্ণনটি পরিমাপ করে প্রয়োজনীয় তথ্য সংগ্রহ করে।
+
+
+
+***পটেনশিওমিটার । Microcontroller by Template / dial by Jamie Dickinson - all from the [Noun Project](https://thenounproject.com)***
+
+আইওটি ডিভাইসগুলো কোন নির্দিষ্ট ভোল্টেজে (যেমনঃ 5V) পোটেনশিওমিটারে বৈদ্যুতিক সংকেত পাঠাবে। পটেনশিওমিটার অ্যাডজাস্ট করার সাথে সাথে এটি অন্য দিক থেকে আগত ভোল্টেজকে পরিবর্তন করে। কল্পনা করি, ভলিউম নবের মতো 0 থেকে [11](https://wikipedia.org/wiki/Up_to_eleven) পর্যন্ত লেবেলযুক্ত একটি পটেনশিওমিটার রয়েছে। যখন পটেনশিওমিটার পূর্ণ অফ অবস্থানে (0) থাকবে তখন এটি 0V (0 ভোল্ট) দেবে, আর যখন এটি সম্পূর্ণ অন পজিশনে থাকবে (11), তখন 5V (5 ভোল্ট) মান দিবে।
+
+> 🎓 পুরো বিষয়টিকে অত্যন্ত সহজভাবে বোঝানোর চেষ্টা করা হয়েছে। পোটেনশিওমিটার এবং পরিবর্তনযোগ্য রোধক সম্পর্কে [পোটেনশিওমিটার উইকিপিডিয়া পেইজ](https://wikipedia.org/wiki/Potentiometer) এ বিশদ ব্যখ্যা রয়েছে।
+
+সেন্সর প্রদত্ত ভোল্টেজটি আইওটি ডিভাইস গ্রহণ করে এবং ডিভাইস এটিতে সাড়া দিতে পারে। সেন্সরের উপর নির্ভর করে, এই ভোল্টেজ যেকোন মানের হতে পারে বা একটি আদর্শ এককের সাথে সম্পর্কিত হতে পারে। উদাহরণস্বরূপ, [থার্মিস্টর](https://wikipedia.org/wiki/Thermistor) এর উপর ভিত্তি করে তৈরী একটি অ্যানালগ তাপমাত্রা সেন্সর, তাপমাত্রার উপর নির্ভর করে তার রেসিস্ট্যান্স বা রোধ এর মান পরিবর্তন করে। আউটপুট ভোল্টেজটি তখন কোডিং এর মাধ্যমে হিসাব করে কেলভিন তাপমাত্রায় এবং সেখান থেকে সেলসিয়াস বা ফারেনহাইটে রূপান্তর করা যাবে।
+
+✅ সেন্সর যদি প্রদত্ত ভোল্টেজ (যেমনঃ কোন এক্সটার্নাল পাওয়ার সাপ্লাই থেকে) এর চাইতে বেশি রিটার্ন করে , তাহলে কী ঘটবে বলে মনে হয় ? ⛔️ এটার বাস্তবিক টেস্ট করা থেকে সর্বাবস্থায় বিরত থাকা উচিত।
+
+#### অ্যানালগ থেকে ডিজিটালে রূপান্তর
+
+আইওটি ডিভাইসগুলি হলো ডিজিটাল যন্ত্র - এগুলো অ্যানালগ মান নিয়ে কাজ করতে পারে না, তারা কেবল 0 এবং 1 এর মাধ্যমে কাজ করে। এর অর্থ হল এনালগ সেন্সর মানগুলি নিয়ে কাজ করার আগে, তাদেরকে ডিজিটাল সিগন্যালে রূপান্তর করা দরকার। অনেক আইওটি ডিভাইসে এনালগ ইনপুটগুলিকে তাদের মানের ডিজিটাল ফর্মে রূপান্তর করতে 'অ্যানালগ-থেকে-ডিজিটাল কনভার্টার (এডিসি)' থাকে। সেন্সরগুলি সংযোগকারী বোর্ডের মাধ্যমেও এডিসিগুলির সাথে কাজ করতে পারে। উদাহরণস্বরূপ, রাস্পবেরি পাই সহ সীড গ্রোভ ইকোসিস্টেমে, অ্যানালগ সেন্সরগুলি একটি 'হ্যাট'- এর নির্দিষ্ট পোর্টের সাথে সংযোগ করে যা পাই এর জিপিআইও পিনের সাথে যুক্ত হয়ে পাইতে বসে এবং এই হ্যাটটির একটি 'অ্যানালগ-থেকে-ডিজিটাল কনভার্টার (এডিসি)' রয়েছে যা প্রাপ্ত মানকে ডিজিটাল সিগন্যালে পরিণত করে যা জিপিআইও দ্বারা প্রেরিত হয়।
+
+কল্পনা করি যে আমাদের আইওটি ডিভাইসের সাথে সংযুক্ত একটি এনালগ লাইট সেন্সর রয়েছে যা 3.3V ব্যবহার করে এবং 1V রিটার্ন করছে । এই 1V বিষয়টি ডিজিটাল জগতে কোন কিছুই বোঝায়না, তাই এটি রূপান্তর করা দরকার। ভোল্টেজটিকে ডিভাইস এবং সেন্সরের উপর নির্ভর করে, নির্দিষ্ট স্কেলে ব্যবহার করে অ্যানালগ মানে রূপান্তরিত করা হবে। উদাহরণস্বরূপ, সীড গ্রোভ লাইট সেন্সর যা 0 থেকে 1023 পর্যন্ত মান আউটপুট দেয়। 3.3V-তে চলমান এই সেন্সরটির জন্য, 1V আউটপুটটির মান হবে 300 । একটি আইওটি ডিভাইস 300 এনালগ মান হিসাবে পরিচালনা করতে পারে না, সুতরাং মানটি `0000000100101100` এ রূপান্তরিত হবে, যা গ্রোভের দ্বারা রুপান্তরিত 300 এর বাইনারি রূপ এবং এটি পরে আইওটি ডিভাইস দ্বারা প্রক্রিয়া করা হবে।
+
+✅ বাইনারি সম্পর্কে জানা না থাকলে, 0 এবং 1 দ্বারা লিখিত এই সংখ্যাপদ্ধতি সম্পর্কে আমাদের জানতে হবে। এই [BBC Bitesize introduction to binary lesson](https://www.bbc.co.uk/bitesize/guides/zwsbwmn/revision/1) থেকে আমরা আমাদের বাইনারি সংক্রান্ত জ্ঞান আহরণ শুরু করতে পারি।
+
+কোডিং দৃষ্টিকোণ থেকে, এইসব বিষয় সাধারণত সেন্সরগুলির সাথে থাকা লাইব্রেরি দ্বারা পরিচালিত হয়, সুতরাং আমাদেরকে আসলে এই রূপান্তর সম্পর্কে অতোটা চিন্তা করার দরকার নেই। গ্রোভ লাইট সেন্সরের জন্য আমরা পাইথন লাইব্রেরিটি ব্যবহার করবো এবং `light` নামক প্রপার্টিকে কল করলে বা আরডুইনো লাইব্রেরিটি ব্যবহার করে `analogRead` কল করলে, 300 এর মান পাওয়া যাবে।
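+
+উদাহরণস্বরূপ, নিচের ছোট্ট স্কেচটি (একটি অনুমানভিত্তিক উদাহরণ - ধরে নেওয়া হয়েছে সেন্সরটি A0 অ্যানালগ পিনে লাগানো এবং আরডুইনো ফ্রেমওয়ার্ক ব্যবহার করা হচ্ছে) `analogRead` দিয়ে মানটি পড়ে কনসোলে প্রিন্ট করে:
+
+```cpp
+// অনুমানভিত্তিক উদাহরণ: A0 পিনে লাগানো লাইট সেন্সর থেকে অ্যানালগ মান পড়া
+void setup() {
+  Serial.begin(9600);
+  pinMode(A0, INPUT);
+}
+
+void loop() {
+  int light = analogRead(A0);   // রূপান্তরের কাজটি লাইব্রেরি/এডিসি-ই সামলায়, আমরা সরাসরি সংখ্যা (যেমন 300) পাই
+  Serial.print("Light level: ");
+  Serial.println(light);
+  delay(1000);
+}
+```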
+
+### ডিজিটাল সেন্সর
+
+অ্যানালগ সেন্সরের মতো ডিজিটাল সেন্সরগুলি বৈদ্যুতিক ভোল্টেজের পরিবর্তনগুলি ব্যবহার করে চারপাশের বাহ্যিক জগৎ সনাক্ত করে। পার্থক্য হল ডিজিটাল সেন্সরগুলো হয় ২টি ভিন্ন স্টেট এর মাঝে তুলনা করে বা বিল্ট-ইন এডিসি ব্যবহার করে একটি ডিজিটাল সিগন্যাল আউটপুট দেয়। বর্তমানে বিভিন্ন কানেক্টর বোর্ড বা আইওটি ডিভাইসে এডিসি এর ব্যবহার এড়ানোর জন্য ডিজিটাল সেন্সরগুলোই বেশি ব্যবহৃত হচ্ছে।
+
+সবচেয়ে সহজ সাধারণ ডিজিটাল সেন্সর হলো বাটন বা স্যুইচ। এটি ২টি অবস্থা সম্পন্ন একটি সেন্সর , অবস্থা দুটি হলো চালু (on) এবং বন্ধ (off) ।
+
+
+
+***বাটন । Microcontroller by Template / Button by Dan Hetteix - all from the [Noun Project](https://thenounproject.com)***
+
+আইওটি ডিভাইসে থাকা পিনগুলি যেমন জিপিআইও পিনগুলি এই সংকেতটি সরাসরি 0 বা 1 হিসাবে পরিমাপ করতে পারে। প্রেরিত এবং প্রাপ্ত ভোল্টেজ সমান হলে, এর মান হয় 1, অন্যথায় মানটি হয় 0। এক্ষেত্রে সিগন্যাল রূপান্তর করার দরকার নেই কারণ এদের মান কেবল 1 বা 0 হতে পারে।
+
+> 💁 ভোল্টেজগুলি কখনই হুবহু মেলে না, বিশেষত যেহেতু একটি সেন্সরের উপাদানগুলির রোধ থাকে, তাই এক্ষেত্রে ভোল্টেজের হেরফের হয়। উদাহরণস্বরূপ, জিপিআইও পিনগুলি একটি রাস্পবেরি পাইতে 3.3V-তে কাজ করে এবং রিটার্ন সিগন্যালে 1.8V এর উপর ভোল্টেজ এর মানকে 1 হিসেবে বিবেচনা করে এবং 1.8V এর কম হলে 0 হিসাবে বিবেচনা করে থাকে।
+
+* 3.3V বাটনে যায়। এটি বন্ধ (off), তাই 0V বেরিয়ে আসে, তাই এর মান হয় 0 ।
+* 3.3V বাটনে যায়। এটি চালু (on), তাই 3.3V বেরিয়ে আসে,তাই এর মান হয় 1 ।
+
+আরও উন্নত ডিজিটাল সেন্সরগুলো অ্যানালগ মানগুলি গ্রহণ করে, তারপরে অন-বোর্ড এডিসি ব্যবহার করে ডিজিটাল সিগন্যালে রূপান্তর করে। উদাহরণস্বরূপ, একটি ডিজিটাল টেম্পারেচার সেন্সর, এনালগ সেন্সরের মতোই থার্মোকাপল ব্যবহার করবে এবং বর্তমান তাপমাত্রায় থার্মোকাপলের রোধের কারণে সৃষ্ট ভোল্টেজের পরিবর্তনকে পরিমাপ করবে। এনালগ ভ্যালু রিটার্ন করে এটিকে ডিজিটাল সিগন্যালে রূপান্তরের জন্য যন্ত্র বা কানেক্টর বোর্ডের উপর নির্ভর করার পরিবর্তে, সেন্সরের বিল্ট-ইন এডিসিটিই এই রূপান্তর করে দেয় এবং 0 আর 1 সিরিজবিশিষ্ট মান রিটার্ন করে আইওটি ডিভাইসে। একটি বাটন যেমন 1 বলতে ফুল ভোল্টেজ এবং 0 বলতে শূন্য ভোল্টেজ বোঝায়, এখানেও একইভাবে সম্পূর্ণ বাইনারি সিরিজটি প্রেরিত হয়।
+
+
+
+***ডিজিটাল তাপমাত্রা সেন্সর । Temperature by Vectors Market / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+ডিজিটাল ডেটা প্রেরণের জন্য সেন্সরগুলো আরও জটিল হয়ে উঠছে। একইসাথে অনেক বেশি বিস্তারিতভাবে তথ্য প্রেরণ করা হচ্ছে, এমনকি সুরক্ষিত সেন্সরগুলির জন্য এনক্রিপ্ট করা ডেটা প্রেরণের ব্যবস্থাও লক্ষণীয়। এর একটি উদাহরণ হলো ক্যামেরা - এটি এমন একটি সেন্সর যা একটি চিত্র ধারণ করে এবং আইওটি ডিভাইসে সাধারণত JPEG-এর মতো সংকুচিত বিন্যাসে ডিজিটাল ডেটা হিসাবে প্রেরণ করে। স্থিরচিত্রের পাশাপাশি ক্যামেরা দিয়ে ভিডিও স্ট্রীমিংও সম্ভব - হয় পুরো ছবিগুলো ফ্রেম বাই ফ্রেম পাঠিয়ে, নয়তো কম্প্রেস করা ভিডিও স্ট্রীম পাঠানোর মাধ্যমে।
+
+## অ্যাকচুয়েটর কী?
+
+অ্যাকচুয়েটর হলো সেন্সর এর বিপরীত - এগুলো আইওটি ডিভাইস থেকে বৈদ্যুতিক সংকেতকে বাহ্যিক জগতের সাথে মিথস্ক্রিয়ায় রূপান্তর করে। যেমন, আলোক বা শব্দ নির্গমন করা বা একটি মোটরকে চালানো।
+
+কিছু অতিব্যবহৃত অ্যাকচুয়েটর হিসেবে বলা যায় -
+
+* এলইডি - চালু করলে, এগুলি আলোকিত হয়।
+* স্পিকার - একটি সাধারণ buzzer থেকে শুরু করে, অডিও চালাতে সক্ষম এমন যন্ত্রগুলোই স্পিকার। এরা প্রেরিত সিগন্যালের উপর ভিত্তি করে শব্দ তৈরী করে।
+* স্টেপার মোটর - এগুলি সংকেতকে একটি সুনির্দিষ্ট পরিমাণ ঘূর্ণনে রূপান্তর করে, যেমন কোন ডায়ালকে 90° কোণে বাঁকানো।
+* রিলে - এগুলো এমন সুইচ যা বৈদ্যুতিক সংকেতের সাহায্যে অন/অফ করা যায়। এগুলো আইওটি ডিভাইস থেকে প্রাপ্ত ক্ষুদ্র মানের ভোল্টেজ দ্বারা বৃহত্তর মানের ভোল্টেজ চালু করে।
+* স্ক্রিন - এগুলো বেশ জটিল ধরণের অ্যাকচুয়েটর যা একটি পর্দা (display) এর বিভিন্ন অংশে বিভিন্ন তথ্য প্রদর্শন করে। সাধারণ LED display থেকে শুরু করে হাই-রেস্যুলেশন পর্যন্ত প্রদর্শনযোগ্য স্ক্রিন রয়েছে।
+
+✅ ছোট্ট একটি কাজ করা যাক এখন। আমাদের ব্যবহৃত ফোনে কী কী অ্যাকচুয়েটর রয়েছে তা চিন্তা করি।
+
+## একটি অ্যাকচুয়েটর ব্যবহার
+
+আইওটি ডিভাইসে একটি অ্যাকচুয়েটর যুক্ত করতে নিচের প্রাসঙ্গিক গাইডগুলোর যেকোনো একটি অনুসরণ করতে হবে। এর সাহায্যে সেন্সর-নিয়ন্ত্রিত nightlight প্রজেক্টটি তৈরি করা হবে - সেন্সর দিয়ে পরিবেশের আলোর মাত্রা শনাক্ত করা হবে এবং অ্যাকচুয়েটর হিসেবে একটি এলইডি ব্যবহার করা হবে, যা (সেন্সর থেকে প্রাপ্ত ডেটা অনুসারে) আলোর মাত্রা কম হলে জ্বলে উঠবে।
+
+
+
+***A flow chart of the assignment showing light levels being read and checked, and the LED being controlled. ldr by Eucalyp / LED by abderraouf omara - all from the [Noun Project](https://thenounproject.com)***
+
+* [Arduino - Wio Terminal](wio-terminal-actuator.md)
+* [Single-board computer - Raspberry Pi](pi-actuator.md)
+* [Single-board computer - Virtual device](virtual-device-actuator.md)
+
+## অ্যাকচুয়েটর কত প্রকার
+
+সেন্সর এর মতো, অ্যাকচুয়েটরও মূলত ২ প্রকার - অ্যানালগ এবং ডিজিটাল।
+
+### অ্যানালগ অ্যাকচুয়েটর
+
+অ্যানালগ অ্যাকচুয়েটর একটি অ্যানালগ সংকেত নিয়ে এটিকে বাহ্যিক জগতের মিথস্ক্রিয়ায় রূপান্তর করে, যেখানে প্রদত্ত ভোল্টেজের ভিত্তিতে মিথস্ক্রিয়া পরিবর্তিত হয়। উদাহরণ হিসেবে, আমাদের বাসাবাড়িতে ব্যবহৃত নিয়ন্ত্রণযোগ্য লাইটের কথা চিন্তা করা যেতে পারে। এটি প্রাপ্ত ভোল্টেজের ভিত্তিতেই নির্ধারিত হয় যে, এই আলোর ঔজ্জ্বল্য কতটা হবে।
+
+
+
+***A light controlled by the voltage output of an IoT device. Idea by Pause08 / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+সেন্সরগুলির মতোই, প্রকৃত আইওটি ডিভাইস ডিজিটাল সিগন্যালে কাজ করে, অ্যানালগে নয়। একটি অ্যানালগ সিগন্যাল প্রেরণ করার জন্য আইওটি ডিভাইসটির একটি ডিজিটাল-টু-অ্যানালগ কনভার্টার (DAC) দরকার হয়। DAC হয় আইওটি ডিভাইসে সরাসরি থাকতে পারে, নয়তো কোনও সংযোজক বোর্ডের মাধ্যমে যুক্ত করতে হবে। এটি আইওটি ডিভাইস থেকে আসা 0 ও 1-গুলোকে এমন একটি অ্যানালগ ভোল্টেজে রূপান্তর করে, যা অ্যাকচুয়েটর ব্যবহার করতে পারে।
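+
+অনেক ডেভেলপার বোর্ডে আসল DAC থাকে না; বাস্তবে প্রায়ই পরের অংশে আলোচিত PWM কৌশল দিয়ে অ্যানালগ-সদৃশ আউটপুট তৈরি করা হয়। নিচের ন্যূনতম স্কেচটি কেবল ধারণাটি দেখানোর জন্য - ধরে নেওয়া হয়েছে একটি রাস্পবেরি পাই, `gpiozero` লাইব্রেরি এবং GPIO পিন 17-তে লাগানো একটি এলইডি (পিন নম্বরটি অনুমানভিত্তিক উদাহরণ)।
+
+```python
+# Minimal sketch (assumptions: a Raspberry Pi with gpiozero, and an LED wired to
+# GPIO pin 17 - the pin number is only an example). gpiozero's PWMLED approximates
+# an analog output using PWM rather than a true DAC.
+from gpiozero import PWMLED
+
+led = PWMLED(17)
+
+led.value = 1.0   # full brightness
+led.value = 0.3   # dimmer - roughly 30% of the full output
+led.value = 0.0   # off
+```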
+
+✅ আইওটি ডিভাইসটি যদি অ্যাকচুয়েটরের সহ্যসীমার বেশি ভোল্টেজ প্রদান করে, তাহলে কী ঘটবে বলে মনে হয়? ⛔️ এটার বাস্তবিক টেস্ট করা থেকে সর্বাবস্থায় বিরত থাকা উচিত।
+
+#### পালস-উইডথ মড্যুলেশন (PWM)
+
+আইওটি ডিভাইসের ডিজিটাল সিগন্যালকে অ্যানালগ সিগন্যালে রূপান্তর করার আরেকটি বিকল্প হলো পালস-উইডথ মড্যুলেশন। এতে প্রচুর সংক্ষিপ্ত ডিজিটাল পালস প্রেরণ করা হয়, যা সব মিলিয়ে একটি অ্যানালগ সিগন্যালের মতো কাজ করে। উদাহরণস্বরূপ, PWM দ্বারা মোটরের গতি নিয়ন্ত্রণ করা যায়।
+
+কল্পনা করি যে আমরা 5V পাওয়ার সাপ্লাই দিয়ে, মোটরটি নিয়ন্ত্রণ করছি। ভোল্টেজটি ০.০২ সেকেন্ডের জন্য high অর্থাৎ 5V রাখার মাধ্যমে, মোটরে একটি সংক্ষিপ্ত পালস প্রেরণ করি। সেই সময়ে মোটরটি একটি পূর্ণ ঘূর্ণনের দশমাংশ বা 36° ঘুরতে পারে। এর পরে লো সিগন্যাল দিয়ে অর্থাৎ 0V প্রেরণ করে, সিগন্যালটি 0.02 সেকেন্ডের জন্য বিরতি দেয়। তারপরে অন-অফ এর প্রতিটি চক্র 0.04s অবধি চলে। তারপরে আবারও পুনরাবৃত্তি করে।
+
+
+
+***PWM rotation of a motor at 150RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+তাহলে প্রতি সেকেন্ডে ২৫টি পালস দেওয়া হচ্ছে, যেখানে ৫ ভোল্টের প্রতিটি সিগন্যালে ০.০২ সেকেন্ড ধরে মোটর ঘুরছে, আবার ০ ভোল্টের জন্য ০.০২ সেকেন্ড মোটর বিরতি নিচ্ছে। প্রতিটি পালস এখানে মোটরকে একটি ঘূর্ণনের দশমাংশ ঘুরায়, যার অর্থ মোটর প্রতি সেকেন্ডে 2.5টি ঘূর্ণন সম্পন্ন করে। এভাবে ডিজিটাল সিগন্যাল ব্যবহার করে আমরা মোটরটিকে প্রতি সেকেন্ডে 2.5টি ঘূর্ণনে, অর্থাৎ ১৫০ আরপিএম বা [revolutions per minute](https://wikipedia.org/wiki/Revolutions_per_minute)-এ ঘুরিয়েছি।
+
+```output
+25 pulses per second x 0.1 rotations per pulse = 2.5 rotations per second
+2.5 rotations per second x 60 seconds in a minute = 150rpm
+```
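+
+নিচে একটি ন্যূনতম স্কেচ দেওয়া হলো, যেখানে উপরের উদাহরণের মতো প্রতি সেকেন্ডে ২৫টি পালস তৈরি করা হয়েছে, যার প্রতিটি চক্রের অর্ধেক সময় সিগন্যাল high থাকে। ধরে নেওয়া হয়েছে একটি রাস্পবেরি পাই, `gpiozero` লাইব্রেরি এবং GPIO পিন 12-তে সংযুক্ত মোটর ড্রাইভারের কন্ট্রোল পিন - পিন নম্বর ও সংযোগটি অনুমানভিত্তিক, আর বাস্তবে মোটর চালাতে আলাদা ড্রাইভার সার্কিট লাগে।
+
+```python
+# Minimal sketch (assumptions: a Raspberry Pi with gpiozero, and a motor driver's
+# control pin on GPIO 12 - the pin number is only an example; a real motor needs
+# a driver circuit, it cannot be powered directly from a GPIO pin).
+from gpiozero import PWMOutputDevice
+
+# 25 pulses per second, as in the example above: each on/off cycle lasts 0.04s
+motor = PWMOutputDevice(12, frequency=25)
+
+# 0.02s high + 0.02s low per cycle = a 50% duty cycle
+motor.value = 0.5
+```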
+
+> 🎓 কোন PWM সিগন্যাল যদি অর্ধেক সময় ON থাকে এবং বাকি অর্ধেক সময় OFF থাকে, তবে এই বিষয়টিকে বলা হয় [50% ডিউটি সাইকেল](https://wikipedia.org/wiki/Duty_cycle)। ডিউটি সাইকেল হলো মূলত অন-অফ এই দুই অবস্থার সময়ের দৈর্ঘ্যের তুলনা।
+
+
+
+***PWM rotation of a motor at 75RPM. motor by Bakunetsu Kaito / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+পালসের আকার পরিবর্তন করে মোটরের গতি পরিবর্তন করা যায়। উদাহরণস্বরূপ, একই মোটরে আমরা 0.04 সেকেন্ডের একই চক্র রাখতে পারি, যেখানে ON পালসটি 0.01 সেকেন্ড এবং OFF পালসটি 0.03 সেকেন্ড ধরে থাকবে। প্রতি সেকেন্ডে পালসের সংখ্যা একই রয়েছে (25), তবে পালসের ON অবস্থার দৈর্ঘ্য এখন অর্ধেক। অর্ধেক দৈর্ঘ্যের একটি পালস মোটরটিকে কেবল একটি ঘূর্ণনের এক-বিংশাংশ ঘুরতে দেয়, ফলে 25টি পালসে প্রতি সেকেন্ডে 1.25টি ঘূর্ণন সম্পন্ন হবে, অর্থাৎ ৭৫ আরপিএম। ডিজিটাল সিগন্যালের পালসের দৈর্ঘ্য পরিবর্তন করে এভাবে অ্যানালগ মোটরের গতি অর্ধেকে নামিয়ে আনা যায়।
+
+```output
+25 pulses per second x 0.05 rotations per pulse = 1.25 rotations per second
+1.25 rotations per second x 60 seconds in a minute = 75rpm
+```
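+
+উপরের দুটি হিসাবই একই সূত্র অনুসরণ করে। নিচের ছোট পাইথন ফাংশনটি শুধু সেই গাণিতিক সম্পর্কটি দেখানোর জন্য - এটি পাঠের অংশ নয়, ফাংশনের নামটিও অনুমানভিত্তিক।
+
+```python
+# Illustrative arithmetic only: pulses per second and rotation per pulse
+# combine into RPM - this reproduces the two examples above.
+def rpm(pulses_per_second: float, rotations_per_pulse: float) -> float:
+    rotations_per_second = pulses_per_second * rotations_per_pulse
+    return rotations_per_second * 60  # 60 seconds in a minute
+
+print(rpm(25, 0.1))    # 150.0 - 0.02s pulses, one tenth of a rotation each
+print(rpm(25, 0.05))   # 75.0  - 0.01s pulses, one twentieth of a rotation each
+```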
+
+✅ কীভাবে মোটরের ঘূর্ণন (বিশেষত কম গতিতে) মসৃণ রাখা যায়? এখানে দীর্ঘ বিরতিসহ স্বল্প সংখ্যক লম্বা পালস, নাকি খুব সংক্ষিপ্ত বিরতি দিয়ে প্রচুর সংক্ষিপ্ত পালস - কোনটি ব্যবহার করা উচিত?
+
+> 💁 কিছু সেন্সরও PWM ব্যবহার করে অ্যানালগ সিগন্যালগুলিকে ডিজিটাল সিগন্যালে রূপান্তর করে।
+
+> 🎓 পালস-উইডথ মড্যুলেশন (PWM) সম্পর্কে আরো জানতে [উইকিপিডিয়ার এই আর্টিকেলটি](https://wikipedia.org/wiki/Pulse-width_modulation) পড়া যেতে পারে।
+
+### ডিজিটাল অ্যাকচুয়েটর
+
+ডিজিটাল অ্যাকচুয়েটরও ডিজিটাল সেন্সরগুলোর মতো - হয় উচ্চ বা নিম্ন ভোল্টেজ দ্বারা নিয়ন্ত্রিত দুটি স্টেট-এ থাকে, নয়তো একটি বিল্ট-ইন DAC থাকে, যাতে ডিজিটাল সিগন্যালটিকে অ্যানালগে রূপান্তর করা যায়।
+
+একটি সাধারণ ডিজিটাল অ্যাকচুয়েটরের উদাহরণ হলো এলইডি। যখন কোন ডিভাইস ডিজিটাল সিগন্যাল হিসেবে 1 প্রেরণ করে, তখন একটি উচ্চ ভোল্টেজ পাঠানো হয় যা এলইডি জ্বালায়। আবার 0-এর ডিজিটাল সিগন্যাল প্রেরণ করা হলে, ভোল্টেজ 0V-এ নেমে আসে এবং এলইডি বন্ধ হয়ে যায়।
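+
+নিচে একটি ন্যূনতম স্কেচ দেওয়া হলো, যা একটি এলইডিকে ডিজিটাল 1 ও 0 পাঠিয়ে জ্বালানো-নেভানো দেখায়। ধরে নেওয়া হয়েছে একটি রাস্পবেরি পাই, `gpiozero` লাইব্রেরি এবং GPIO পিন 17-তে লাগানো একটি এলইডি - পিন নম্বরটি কেবল উদাহরণ, এটি পাঠের মূল কোড নয়।
+
+```python
+# Minimal sketch (assumptions: a Raspberry Pi with gpiozero and an LED wired
+# to GPIO pin 17 - the pin number is only an example).
+from time import sleep
+
+from gpiozero import LED
+
+led = LED(17)
+
+led.on()    # send a digital 1 - the pin goes high and the LED lights up
+sleep(2)
+led.off()   # send a digital 0 - the pin drops to 0V and the LED turns off
+```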
+
+
+
+***An LED turning on and off depending on voltage. LED by abderraouf omara / Microcontroller by Template - all from the [Noun Project](https://thenounproject.com)***
+
+✅ ২-অবস্থা বিশিষ্ট আর কোন অ্যাকচুয়েটর কি আশেপাশে দেখা যায়? একটি উদাহরণ হলো সলিনয়েড - এটি একটি ইলেক্ট্রোম্যাগনেট, যা সক্রিয় করে দরজার ছিটকিনি সরিয়ে দরজা লক বা আনলক করা যায়।
+
+আরও উন্নত ডিজিটাল অ্যাকচুয়েটর, যেমন স্ক্রিনের জন্য ডিজিটাল ডেটা নির্দিষ্ট ফরম্যাটে প্রেরণ করা প্রয়োজন। এগুলো সাধারণত প্রোগ্রাম লাইব্রেরির মাধ্যমে নিয়ন্ত্রিত হয়, যা নিয়ন্ত্রণের জন্য সঠিক ডেটা পাঠানো সহজ করে দেয়।
+
+---
+
+## 🚀 চ্যালেঞ্জ
+
+শেষ দুটি পাঠের চ্যালেঞ্জ ছিল বাসস্থান, স্কুল বা কর্মক্ষেত্রে যতগুলি আইওটি ডিভাইস রয়েছে তা তালিকাভুক্ত করা এবং তারা মাইক্রোকন্ট্রোলার বা একক-বোর্ড কম্পিউটার, বা উভয়ের মিশ্রণের দ্বারা নির্মিত কিনা তা সিদ্ধান্ত নেওয়া। এবারের চ্যালেঞ্জ হলো, তালিকাভুক্ত প্রতিটি ডিভাইসের জন্য, তারা কোন সেন্সর এবং অ্যাকচুয়েটর সাথে সংযুক্ত আছে? এই ডিভাইসগুলির সাথে সংযুক্ত প্রতিটি সেন্সর এবং অ্যাকচুয়েটরের উদ্দেশ্য কী?
+
+## লেকচার পরবর্তী কুইজ
+
+[লেকচার পরবর্তী কুইজ](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/6)
+
+## রিভিউ এবং স্ব-অধ্যয়ন
+
+* [ThingLearn](http://www.thinglearn.com/essentials/) থেকে ইলেক্ট্রিসিটি ও সার্কিটের ব্যাপারে পড়া।
+* [Seeed Studios Temperature Sensors guide](https://www.seeedstudio.com/blog/2019/10/14/temperature-sensors-for-arduino-projects/) থেকে বিভিন্ন ধরণের তাপমাত্রা সেন্সরের ব্যাপারে জানা।
+* এলইডি সম্পর্কে [Wikipedia LED page](https://wikipedia.org/wiki/Light-emitting_diode) থেকে আরো বিস্তারিত ধারণা লাভ করা।
+
+## এসাইনমেন্ট
+
+[সেন্সর এবং অ্যাকচুয়েটর নিয়ে গবেষণা](assignment.md)
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/translations/assignment.ar.md b/1-getting-started/lessons/3-sensors-and-actuators/translations/assignment.ar.md
new file mode 100644
index 00000000..f6986eb8
--- /dev/null
+++ b/1-getting-started/lessons/3-sensors-and-actuators/translations/assignment.ar.md
@@ -0,0 +1,21 @@
+
+
+# بحث مستشعرات و مشغلات
+
+## تعليمات
+
+غطى هذا الدرس أجهزة الاستشعار والمحركات. ابحث وأوصف مستشعرًا ومشغلًا واحدًا يمكن استخدامه مع مجموعة أدوات تطوير إنترنت الأشياء ، بما في ذلك:
+
+* ماذا يفعل
+* الأجهزة الإلكترونية / الأجهزة المستخدمة بالداخل
+* هل هو تناظري أم رقمي
+* ما هي وحدات ونطاق المدخلات أو القياسات
+
+## الموضوع
+
+| المعايير | نموذجي | كافية | يحتاج إلى تحسين |
+| -------- | --------- | -------- | ----------------- |
+| وصف جهاز استشعار | وصف جهاز استشعار بما في ذلك تفاصيل عن جميع الأقسام الأربعة المذكورة أعلاه. | وصف جهاز استشعار ، ولكنه كان قادرًا فقط على توفير 2-3 من الأقسام أعلاه | وصف جهاز استشعار ، لكنه كان قادرًا فقط على توفير 1 من الأقسام أعلاه |
+| وصف المشغل | وصف المشغل بما في ذلك التفاصيل لجميع الأقسام الأربعة المذكورة أعلاه. | وصف مشغل ، لكنه كان قادرًا فقط على توفير 2-3 من الأقسام أعلاه | وصف مشغل ، لكنه كان قادرًا فقط على توفير 1 من الأقسام أعلاه |
+
+
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/translations/assignment.bn.md b/1-getting-started/lessons/3-sensors-and-actuators/translations/assignment.bn.md
new file mode 100644
index 00000000..819c65e5
--- /dev/null
+++ b/1-getting-started/lessons/3-sensors-and-actuators/translations/assignment.bn.md
@@ -0,0 +1,17 @@
+# সেন্সর এবং অ্যাকচুয়েটর সংক্রান্ত গবেষণা
+
+## নির্দেশনা
+
+এই পাঠটিতে সেন্সর এবং অ্যাকচুয়েটর নিয়ে আলোচনা করা হয়েছে। একটি আইওটি ডেভেলপার কিটে ব্যবহার করা যেতে পারে এমন একটি সেন্সর এবং একটি অ্যাকচুয়েটর বর্ণনা করতে হবে, যেখানে উল্লেখ থাকবে:
+
+* এটি কী কাজ করে
+* ভিতরে ব্যবহৃত ইলেকট্রনিক্স/হার্ডওয়্যার
+* এটি কি অ্যানালগ নাকি ডিজিটাল
+* ইনপুট বা পরিমাপের একক কী এবং যন্ত্রটির ব্যবহার্য সীমা (range) কতটুকু
+
+## এসাইনমেন্ট মূল্যায়ন মানদন্ড
+
+| ক্রাইটেরিয়া | দৃষ্টান্তমূলক ব্যখ্যা (সর্বোত্তম) | পর্যাপ্ত ব্যখ্যা (মাঝারি) | আরো উন্নতির প্রয়োজন (নিম্ন) |
+| -------- | --------- | -------- | ----------------- |
+| একটি সেন্সর সংক্রান্ত বর্ণনা | উপরে তালিকাভুক্ত 4টি বিভাগের বিশদ ব্যাখ্যাসহ সেন্সর বর্ণিত হয়েছে | একটি সেন্সর বর্ণিত হয়েছে, তবে উপরের তালিকা থেকে কেবল 2-3টি বিষয় ব্যাখ্যা করতে সক্ষম হয়েছে | একটি সেন্সর বর্ণিত হয়েছে, তবে উপরের তালিকা থেকে কেবল 1টি বিষয় ব্যাখ্যা করতে সক্ষম হয়েছে |
+| একটি অ্যাকচুয়েটর সংক্রান্ত বর্ণনা | উপরে তালিকাভুক্ত 4টি বিভাগের বিশদ ব্যাখ্যাসহ অ্যাকচুয়েটর বর্ণিত হয়েছে | একটি অ্যাকচুয়েটর বর্ণিত হয়েছে, তবে উপরের তালিকা থেকে কেবল 2-3টি বিষয় ব্যাখ্যা করতে সক্ষম হয়েছে | একটি অ্যাকচুয়েটর বর্ণিত হয়েছে, তবে উপরের তালিকা থেকে কেবল 1টি বিষয় ব্যাখ্যা করতে সক্ষম হয়েছে |
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md
index 74895af1..d782fb62 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-actuator.md
@@ -91,7 +91,7 @@ Program the nightlight.
python3 app.py
```
- You should see light values being output to the console.
+ Light values will be output to the console.
```output
(.venv) ➜ GroveTest python3 app.py
@@ -101,7 +101,7 @@ Program the nightlight.
Light level: 253
```
-1. Change the *Value* or the *Random* settings to vary the light level above and below 300. You will see the LED turn on and off.
+1. Change the *Value* or the *Random* settings to vary the light level above and below 300. The LED will turn on and off.

diff --git a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md
index e6d961de..72590325 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/virtual-device-sensor.md
@@ -87,7 +87,7 @@ Program the device.
python3 app.py
```
- You should see light values being output to the console. Initially this value will be 0.
+ Light values will be output to the console. Initially this value will be 0.
1. From the CounterFit app, change the value of the light sensor that will be read by the app. You can do this in one of two ways:
@@ -95,7 +95,7 @@ Program the device.
* Check the *Random* checkbox, and enter a *Min* and *Max* value, then select the **Set** button. Every time the sensor reads a value, it will read a random number between *Min* and *Max*.
- You should see the values you set appearing in the console. Change the *Value* or the *Random* settings to see the value change.
+ The values you set will be output to the console. Change the *Value* or the *Random* settings to make the value change.
```output
(.venv) ➜ GroveTest python3 app.py
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md b/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md
index 9175874f..c0dd79a1 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-actuator.md
@@ -83,7 +83,7 @@ Program the nightlight.
1. Reconnect the Wio Terminal to your computer, and upload the new code as you did before.
-1. Connect the Serial Monitor. You should see light values being output to the terminal.
+1. Connect the Serial Monitor. Light values will be output to the terminal.
```output
> Executing task: platformio device monitor <
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-sensor.md b/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-sensor.md
index e89cbefc..73fa96ea 100644
--- a/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-sensor.md
+++ b/1-getting-started/lessons/3-sensors-and-actuators/wio-terminal-sensor.md
@@ -40,7 +40,7 @@ Program the device.
Serial.println(light);
```
- This code reads an analog value from the `WIO_LIGHT` pin. This reads a value from 0-1,023 from the on-board light sensor. This value is then sent to the serial port so you can see it in the Serial Monitor when this code is running. `Serial.print` writes the text without a new line on the end, so each line will start with `Light value:` and end with the actual light value.
+ This code reads an analog value from the `WIO_LIGHT` pin. This reads a value from 0-1,023 from the on-board light sensor. This value is then sent to the serial port so you can read it in the Serial Monitor when this code is running. `Serial.print` writes the text without a new line on the end, so each line will start with `Light value:` and end with the actual light value.
1. Add a small delay of one second (1,000ms) at the end of the `loop` as the light levels don't need to be checked continuously. A delay reduces the power consumption of the device.
@@ -50,7 +50,7 @@ Program the device.
1. Reconnect the Wio Terminal to your computer, and upload the new code as you did before.
-1. Connect the Serial Monitor. You should see light values being output to the terminal. Cover and uncover the light sensor on the back of the Wio Terminal to see the values change.
+1. Connect the Serial Monitor. Light values will be output to the terminal. Cover and uncover the light sensor on the back of the Wio Terminal, and the values will change.
```output
> Executing task: platformio device monitor <
diff --git a/1-getting-started/lessons/4-connect-internet/README.md b/1-getting-started/lessons/4-connect-internet/README.md
index 49ae2dbe..f351b945 100644
--- a/1-getting-started/lessons/4-connect-internet/README.md
+++ b/1-getting-started/lessons/4-connect-internet/README.md
@@ -1,8 +1,8 @@
# Connect your device to the Internet
-Add a sketchnote if possible/appropriate
+
-
+> Sketchnote by [Nitya Narasimhan](https://github.com/nitya). Click the image for a larger version.
## Pre-lecture quiz
@@ -88,7 +88,7 @@ Messages can be sent with a quality of service (QoS), which determines the guara
Although the name is Message Queueing (initials in MQTT), it doesn't actually support message queues. This means that if a client disconnects, then reconnects it won't receive messages sent during the disconnection, except for those messages that it had already started to process using the QoS process. Messages can have a retained flag set on them. If this is set, the MQTT broker will store the last message sent on a topic with this flag, and send this to any clients who later subscribe to the topic. This way, the clients will always get the latest message.
-MQTT also supports a keep alive function that checks to see if the connection is still alive during long gaps between messages.
+MQTT also supports a keep alive function that checks if the connection is still alive during long gaps between messages.
> 🦟 [Mosquitto from the Eclipse Foundation](https://mosquitto.org) has a free MQTT broker you can run yourself to experiment with MQTT, along with a public MQTT broker you can use to test your code, hosted at [test.mosquitto.org](https://test.mosquitto.org).
@@ -198,13 +198,13 @@ Configure a Python virtual environment and install the MQTT pip packages.
source ./.venv/bin/activate
```
-1. Once the virtual environment has been activated, the default `python` command will run the version of Python that was used to create the virtual environment. Run the following to see this:
+1. Once the virtual environment has been activated, the default `python` command will run the version of Python that was used to create the virtual environment. Run the following to get the version:
```sh
python --version
```
- You should see the following:
+ The output will be similar to the following:
```output
(.venv) ➜ nightlight-server python --version
@@ -249,7 +249,7 @@ Write the server code.
code .
```
-1. When VS Code launches, it will activate the Python virtual environment. You will see this in the bottom status bar:
+1. When VS Code launches, it will activate the Python virtual environment. This will be reported in the bottom status bar:

@@ -257,7 +257,7 @@ Write the server code.

-1. Launch a new VS Code Terminal by selecting *Terminal -> New Terminal, or pressing `` CTRL+` ``. The new terminal will load the virtual environment, and you will see the call to activate this in the terminal, as well as having the name of the virtual environment (`.venv`) in the prompt:
+1. Launch a new VS Code Terminal by selecting *Terminal -> New Terminal, or pressing `` CTRL+` ``. The new terminal will load the virtual environment, with the call to activate this appearing in the terminal. The name of the virtual environment (`.venv`) will also be in the prompt:
```output
➜ nightlight source .venv/bin/activate
@@ -311,15 +311,15 @@ Write the server code.
The app will start listening to messages from the IoT device.
-1. Make sure your device is running and sending telemetry messages. Adjust the light levels detected by your physical or virtual device. You will see messages being received in the terminal.
+1. Make sure your device is running and sending telemetry messages. Adjust the light levels detected by your physical or virtual device. Messages being received will be printed to the terminal.
```output
(.venv) ➜ nightlight-server python app.py
Message received: {'light': 0}
Message received: {'light': 400}
```
-
- The app.py file in the nightlight virtual environment has to be running for the app.py file in the nightlight-server virtual environment to recieve the messages being sent.
+
+ The app.py file in the nightlight virtual environment has to be running for the app.py file in the nightlight-server virtual environment to receive the messages being sent.
> 💁 You can find this code in the [code-server/server](code-server/server) folder.
@@ -382,7 +382,7 @@ The next step for our Internet controlled nightlight is for the server code to s
1. Run the code as before
-1. Adjust the light levels detected by your physical or virtual device. You will see messages being received and commands being sent in the terminal:
+1. Adjust the light levels detected by your physical or virtual device. Messages being received and commands being sent will be written to the terminal:
```output
(.venv) ➜ nightlight-server python app.py
diff --git a/1-getting-started/lessons/4-connect-internet/code-commands/wio-terminal/nightlight/platformio.ini b/1-getting-started/lessons/4-connect-internet/code-commands/wio-terminal/nightlight/platformio.ini
index 2d16f6d5..0e141c71 100644
--- a/1-getting-started/lessons/4-connect-internet/code-commands/wio-terminal/nightlight/platformio.ini
+++ b/1-getting-started/lessons/4-connect-internet/code-commands/wio-terminal/nightlight/platformio.ini
@@ -15,8 +15,8 @@ framework = arduino
lib_deps =
knolleary/PubSubClient @ 2.8
bblanchon/ArduinoJson @ 6.17.3
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
\ No newline at end of file
diff --git a/1-getting-started/lessons/4-connect-internet/code-mqtt/wio-terminal/nightlight/platformio.ini b/1-getting-started/lessons/4-connect-internet/code-mqtt/wio-terminal/nightlight/platformio.ini
index 3b2294a4..f11c7019 100644
--- a/1-getting-started/lessons/4-connect-internet/code-mqtt/wio-terminal/nightlight/platformio.ini
+++ b/1-getting-started/lessons/4-connect-internet/code-mqtt/wio-terminal/nightlight/platformio.ini
@@ -14,8 +14,8 @@ board = seeed_wio_terminal
framework = arduino
lib_deps =
knolleary/PubSubClient @ 2.8
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/1-getting-started/lessons/4-connect-internet/code-telemetry/wio-terminal/nightlight/platformio.ini b/1-getting-started/lessons/4-connect-internet/code-telemetry/wio-terminal/nightlight/platformio.ini
index aa415c00..db2bae01 100644
--- a/1-getting-started/lessons/4-connect-internet/code-telemetry/wio-terminal/nightlight/platformio.ini
+++ b/1-getting-started/lessons/4-connect-internet/code-telemetry/wio-terminal/nightlight/platformio.ini
@@ -14,8 +14,8 @@ board = seeed_wio_terminal
framework = arduino
lib_deps =
knolleary/PubSubClient @ 2.8
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/1-getting-started/lessons/4-connect-internet/single-board-computer-commands.md b/1-getting-started/lessons/4-connect-internet/single-board-computer-commands.md
index ae6a975a..84464cb6 100644
--- a/1-getting-started/lessons/4-connect-internet/single-board-computer-commands.md
+++ b/1-getting-started/lessons/4-connect-internet/single-board-computer-commands.md
@@ -46,7 +46,7 @@ Subscribe to commands.
1. Run the code in the same way as you ran the code from the previous part of the assignment. If you are using a virtual IoT device, then make sure the CounterFit app is running and the light sensor and LED have been created on the correct pins.
-1. Adjust the light levels detected by your physical or virtual device. You will see messages being received and commands being sent in the terminal. You will also see the LED is being turned on and off depending on the light level.
+1. Adjust the light levels detected by your physical or virtual device. Messages being received and commands being sent will be written to the terminal. The LED will also be turned on and off depending on the light level.
> 💁 You can find this code in the [code-commands/virtual-device](code-commands/virtual-device) folder or the [code-commands/pi](code-commands/pi) folder.
diff --git a/1-getting-started/lessons/4-connect-internet/wio-terminal-mqtt.md b/1-getting-started/lessons/4-connect-internet/wio-terminal-mqtt.md
index 08cf6ef8..f1dd6650 100644
--- a/1-getting-started/lessons/4-connect-internet/wio-terminal-mqtt.md
+++ b/1-getting-started/lessons/4-connect-internet/wio-terminal-mqtt.md
@@ -24,8 +24,8 @@ Install the Arduino libraries.
```ini
lib_deps =
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/2-farm/lessons/1-predict-plant-growth/README.md b/2-farm/lessons/1-predict-plant-growth/README.md
index 1c545284..5a56781b 100644
--- a/2-farm/lessons/1-predict-plant-growth/README.md
+++ b/2-farm/lessons/1-predict-plant-growth/README.md
@@ -1,9 +1,5 @@
# Predict plant growth with IoT
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/9)
diff --git a/2-farm/lessons/1-predict-plant-growth/code-publish-temperature/wio-terminal/temperature-sensor/platformio.ini b/2-farm/lessons/1-predict-plant-growth/code-publish-temperature/wio-terminal/temperature-sensor/platformio.ini
index 05b21598..916fd4b6 100644
--- a/2-farm/lessons/1-predict-plant-growth/code-publish-temperature/wio-terminal/temperature-sensor/platformio.ini
+++ b/2-farm/lessons/1-predict-plant-growth/code-publish-temperature/wio-terminal/temperature-sensor/platformio.ini
@@ -16,8 +16,8 @@ lib_deps =
seeed-studio/Grove Temperature And Humidity Sensor @ 1.0.1
knolleary/PubSubClient @ 2.8
bblanchon/ArduinoJson @ 6.17.3
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/2-farm/lessons/2-detect-soil-moisture/README.md b/2-farm/lessons/2-detect-soil-moisture/README.md
index 697bd885..eae4a828 100644
--- a/2-farm/lessons/2-detect-soil-moisture/README.md
+++ b/2-farm/lessons/2-detect-soil-moisture/README.md
@@ -1,9 +1,5 @@
# Detect soil moisture
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/11)
diff --git a/2-farm/lessons/3-automated-plant-watering/README.md b/2-farm/lessons/3-automated-plant-watering/README.md
index c4d257be..4dbaf134 100644
--- a/2-farm/lessons/3-automated-plant-watering/README.md
+++ b/2-farm/lessons/3-automated-plant-watering/README.md
@@ -1,9 +1,5 @@
# Automated plant watering
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/13)
diff --git a/2-farm/lessons/3-automated-plant-watering/code-mqtt/wio-terminal/soil-moisture-sensor/platformio.ini b/2-farm/lessons/3-automated-plant-watering/code-mqtt/wio-terminal/soil-moisture-sensor/platformio.ini
index 38827344..2f148840 100644
--- a/2-farm/lessons/3-automated-plant-watering/code-mqtt/wio-terminal/soil-moisture-sensor/platformio.ini
+++ b/2-farm/lessons/3-automated-plant-watering/code-mqtt/wio-terminal/soil-moisture-sensor/platformio.ini
@@ -15,8 +15,8 @@ framework = arduino
lib_deps =
knolleary/PubSubClient @ 2.8
bblanchon/ArduinoJson @ 6.17.3
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md
index 7f232755..994ae719 100644
--- a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md
+++ b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md
@@ -1,9 +1,5 @@
# Migrate your plant to the cloud
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/15)
@@ -132,6 +128,8 @@ The video below gives a short overview of Azure IoT Hub:
[](https://www.youtube.com/watch?v=smuZaZZXKsU)
+> 🎥 Click the image above to watch a video
+
✅ Take a moment to do some research and read the overview of IoT hub in the [Microsoft IoT Hub documentation](https://docs.microsoft.com/azure/iot-hub/about-iot-hub?WT.mc_id=academic-17441-jabenn).
The cloud services available in Azure can be configured through a web-based portal, or via a command-line interface (CLI). For this task, you will use the CLI.
@@ -382,6 +380,8 @@ For now, you won't be updating your server code. Instead you can use the Azure C
The time values in the annotations are in [UNIX time](https://wikipedia.org/wiki/Unix_time), representing the number of seconds since midnight on 1st January 1970.
+ Exit the event monitor when you are done.
+
### Task - control your IoT device
You can also use the Azure CLI to call direct methods on your IoT device.
diff --git a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/code/wio-terminal/soil-moisture-sensor/platformio.ini b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/code/wio-terminal/soil-moisture-sensor/platformio.ini
index a240bd42..3daba989 100644
--- a/2-farm/lessons/4-migrate-your-plant-to-the-cloud/code/wio-terminal/soil-moisture-sensor/platformio.ini
+++ b/2-farm/lessons/4-migrate-your-plant-to-the-cloud/code/wio-terminal/soil-moisture-sensor/platformio.ini
@@ -14,8 +14,8 @@ board = seeed_wio_terminal
framework = arduino
lib_deps =
bblanchon/ArduinoJson @ 6.17.3
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/2-farm/lessons/5-migrate-application-to-the-cloud/README.md b/2-farm/lessons/5-migrate-application-to-the-cloud/README.md
index 5120becd..80223f76 100644
--- a/2-farm/lessons/5-migrate-application-to-the-cloud/README.md
+++ b/2-farm/lessons/5-migrate-application-to-the-cloud/README.md
@@ -1,9 +1,5 @@
# Migrate your application logic to the cloud
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/17)
@@ -46,7 +42,7 @@ Despite the name, serverless does actually use servers. The naming is because yo
As an IoT developer, the serverless model is ideal. You can write a function that is called in response to messages sent from any IoT device that is connected to your cloud-hosted IoT service. Your code will handle all messages sent, but only be running when needed.
-✅ Look back at the code you wrote as server code listening to messages over MQTT. As is, how might this run in the cloud using serverless? How do you think the code might be changed to support serverless computing?
+✅ Look back at the code you wrote as server code listening to messages over MQTT. How might this run in the cloud using serverless? How do you think the code might be changed to support serverless computing?
> 💁 The serverless model is moving to other cloud services in addition to running code. For example, serverless databases are available in the cloud using a serverless pricing model where you pay per request made against the database, such as a query or insert, usually using pricing based on how much work is done to service the request. For example a single select of one row against a primary key will cost less than a complicated operation joining many tables and returning thousands of rows.
@@ -60,6 +56,8 @@ The short video below has an overview of Azure Functions
[](https://www.youtube.com/watch?v=8-jz5f_JyEQ)
+> 🎥 Click the image above to watch a video
+
✅ Take a moment to do some research and read the overview of Azure Functions in the [Microsoft Azure Functions documentation](https://docs.microsoft.com/azure/azure-functions/functions-overview?WT.mc_id=academic-17441-jabenn).
To write Azure Functions, you start with an Azure Functions app in the language of your choice. Out of the box Azure Functions supports Python, JavaScript, TypeScript, C#, F#, Java, and Powershell. In this lesson you will learn how to write an Azure Functions app in Python.
@@ -192,6 +190,23 @@ The Azure Functions CLI can be used to create a new Functions app.
> ⚠️ If you get a firewall notification, grant access as the `func` application needs to be able to read and write to your network.
+ > ⚠️ If you are using macOS, there may be warnings in the output:
+ >
+ > ```output
+ > (.venv) ➜ soil-moisture-trigger func start
+ > Found Python version 3.9.1 (python3).
+ >
+ > Azure Functions Core Tools
+ > Core Tools Version: 3.0.3442 Commit hash: 6bfab24b2743f8421475d996402c398d2fe4a9e0 (64-bit)
+ > Function Runtime Version: 3.0.15417.0
+ >
+ > [2021-06-16T08:18:28.315Z] Cannot create directory for shared memory usage: /dev/shm/AzureFunctions
+ > [2021-06-16T08:18:28.316Z] System.IO.FileSystem: Access to the path '/dev/shm/AzureFunctions' is denied. Operation not permitted.
+ > [2021-06-16T08:18:30.361Z] No job functions found.
+ > ```
+ >
+ > You can ignore these warnings as long as the Functions app starts correctly and lists the running functions. As mentioned in [this question on the Microsoft Docs Q&A](https://docs.microsoft.com/answers/questions/396617/azure-functions-core-tools-error-osx-devshmazurefu.html?WT.mc_id=academic-17441-jabenn), they can be safely ignored.
+
1. Stop the Functions app by pressing `ctrl+c`.
1. Open the current folder in VS Code, either by opening VS Code, then opening this folder, or by running the following:
@@ -254,12 +269,12 @@ You are now ready to create the event trigger.
1. From the VS Code terminal run the following command from inside the `soil-moisture-trigger` folder:
```sh
- func new --name iot_hub_trigger --template "Azure Event Hub trigger"
+ func new --name iot-hub-trigger --template "Azure Event Hub trigger"
```
- This creates a new Function called `iot_hub_trigger`. The trigger will connect to the Event Hub compatible endpoint on the IoT Hub, so you can use an event hub trigger. There is no specific IoT Hub trigger.
+ This creates a new Function called `iot-hub-trigger`. The trigger will connect to the Event Hub compatible endpoint on the IoT Hub, so you can use an event hub trigger. There is no specific IoT Hub trigger.
-This will create a folder inside the `soil-moisture-trigger` folder called `iot_hub_trigger` that contains this function. This folder will have the following files inside it:
+This will create a folder inside the `soil-moisture-trigger` folder called `iot-hub-trigger` that contains this function. This folder will have the following files inside it:
* `__init__.py` - this is the Python code file that contains the trigger, using the standard Python file name convention to turn this folder into a Python module.
@@ -292,7 +307,7 @@ This will create a folder inside the `soil-moisture-trigger` folder called `iot_
* `"type": "eventHubTrigger"` - this tells the function it needs to listen to events from an Event Hub
* `"name": "events"` - this is the parameter name to use for the Event Hub events. This matches the parameter name in the `main` function in the Python code.
- * `"direction": "in",` - this is an input binding, the data from the event hub comes into the function
+ * `"direction": "in"` - this is an input binding, the data from the event hub comes into the function
* `"connection": ""` - this defines the name of the setting to read the connection string from. When running locally, this will read this setting from the `local.settings.json` file.
> 💁 The connection string cannot be stored in the `function.json` file, it has to be read from the settings. This is to stop you accidentally exposing your connection string.
@@ -307,13 +322,17 @@ This will create a folder inside the `soil-moisture-trigger` folder called `iot_
### Task - run the event trigger
+1. Make sure you are not running the IoT Hub event monitor. If this is running at the same time as the functions app, the functions app will not be able to connect and consume events.
+
+ > 💁 Multiple apps can connect to the IoT Hub endpoints using different *consumer groups*. These are covered in a later lesson.
+
1. To run the Functions app, run the following command from the VS Code terminal
```sh
func start
```
- The Functions app will start up, and will discover the `iot_hub_trigger` function. It will then process any events that have already been sent to the IoT Hub in the past day.
+ The Functions app will start up, and will discover the `iot-hub-trigger` function. It will then process any events that have already been sent to the IoT Hub in the past day.
```output
(.venv) ➜ soil-moisture-trigger func start
@@ -325,23 +344,23 @@ This will create a folder inside the `soil-moisture-trigger` folder called `iot_
Functions:
- iot_hub_trigger: eventHubTrigger
+ iot-hub-trigger: eventHubTrigger
For detailed output, run func with --verbose flag.
[2021-05-05T02:44:07.517Z] Worker process started and initialized.
- [2021-05-05T02:44:09.202Z] Executing 'Functions.iot_hub_trigger' (Reason='(null)', Id=802803a5-eae9-4401-a1f4-176631456ce4)
+ [2021-05-05T02:44:09.202Z] Executing 'Functions.iot-hub-trigger' (Reason='(null)', Id=802803a5-eae9-4401-a1f4-176631456ce4)
[2021-05-05T02:44:09.205Z] Trigger Details: PartionId: 0, Offset: 1011240-1011632, EnqueueTimeUtc: 2021-05-04T19:04:04.2030000Z-2021-05-04T19:04:04.3900000Z, SequenceNumber: 2546-2547, Count: 2
[2021-05-05T02:44:09.352Z] Python EventHub trigger processed an event: {"soil_moisture":628}
[2021-05-05T02:44:09.354Z] Python EventHub trigger processed an event: {"soil_moisture":624}
- [2021-05-05T02:44:09.395Z] Executed 'Functions.iot_hub_trigger' (Succeeded, Id=802803a5-eae9-4401-a1f4-176631456ce4, Duration=245ms)
+ [2021-05-05T02:44:09.395Z] Executed 'Functions.iot-hub-trigger' (Succeeded, Id=802803a5-eae9-4401-a1f4-176631456ce4, Duration=245ms)
```
- Each call to the function will be surrounded by a `Executing 'Functions.iot_hub_trigger'`/`Executed 'Functions.iot_hub_trigger'` block in the output, so you can how many messages were processed in each function call.
+ Each call to the function will be surrounded by an `Executing 'Functions.iot-hub-trigger'`/`Executed 'Functions.iot-hub-trigger'` block in the output, so you can see how many messages were processed in each function call.
> If you get the following error:
```output
- The listener for function 'Functions.iot_hub_trigger' was unable to start. Microsoft.WindowsAzure.Storage: Connection refused. System.Net.Http: Connection refused. System.Private.CoreLib: Connection refused.
+ The listener for function 'Functions.iot-hub-trigger' was unable to start. Microsoft.WindowsAzure.Storage: Connection refused. System.Net.Http: Connection refused. System.Private.CoreLib: Connection refused.
```
Then check Azurite is running and you have set the `AzureWebJobsStorage` in the `local.settings.json` file to `UseDevelopmentStorage=true`.
@@ -370,7 +389,7 @@ To connect to the Registry Manager, you need a connection string.
Replace `` with the name you used for your IoT Hub.
- The connection string is requested for the *ServiceConnect* policy using the `--policy-name service` parameter. When you request a connection string, you can specify what permissions that connection string will allow. The ServiceConnect policy allows yor code to connect and send messages to IoT devices.
+ The connection string is requested for the *ServiceConnect* policy using the `--policy-name service` parameter. When you request a connection string, you can specify what permissions that connection string will allow. The ServiceConnect policy allows your code to connect and send messages to IoT devices.
✅ Do some research: Read up on the different policies in the [IoT Hub permissions documentation](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-security#iot-hub-permissions?WT.mc_id=academic-17441-jabenn)
@@ -561,7 +580,7 @@ Deployment successful.
Remote build succeeded!
Syncing triggers...
Functions in soil-moisture-sensor:
- iot_hub_trigger - [eventHubTrigger]
+ iot-hub-trigger - [eventHubTrigger]
```
Make sure your IoT device is running. Change the moisture levels by adjusting the soil moisture, or moving the sensor in and out of the soil. You will see the relay turn on and off as the soil moisture changes.
diff --git a/2-farm/lessons/5-migrate-application-to-the-cloud/assignment.md b/2-farm/lessons/5-migrate-application-to-the-cloud/assignment.md
index 4980d6fc..b8d0dd2f 100644
--- a/2-farm/lessons/5-migrate-application-to-the-cloud/assignment.md
+++ b/2-farm/lessons/5-migrate-application-to-the-cloud/assignment.md
@@ -35,7 +35,7 @@ Some hints:
relay_on: [GET,POST] http://localhost:7071/api/relay_on
- iot_hub_trigger: eventHubTrigger
+ iot-hub-trigger: eventHubTrigger
```
Paste the URL into your browser and hit `return`, or `Ctrl+click` (`Cmd+click` on macOS) the link in the terminal window in VS Code to open it in your default browser. This will run the trigger.
diff --git a/2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot_hub_trigger/__init__.py b/2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot-hub-trigger/__init__.py
similarity index 100%
rename from 2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot_hub_trigger/__init__.py
rename to 2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot-hub-trigger/__init__.py
diff --git a/2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot_hub_trigger/function.json b/2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot-hub-trigger/function.json
similarity index 100%
rename from 2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot_hub_trigger/function.json
rename to 2-farm/lessons/5-migrate-application-to-the-cloud/code/functions/soil-moisture-trigger/iot-hub-trigger/function.json
diff --git a/2-farm/lessons/6-keep-your-plant-secure/README.md b/2-farm/lessons/6-keep-your-plant-secure/README.md
index fe4b6616..b999f533 100644
--- a/2-farm/lessons/6-keep-your-plant-secure/README.md
+++ b/2-farm/lessons/6-keep-your-plant-secure/README.md
@@ -1,9 +1,5 @@
# Keep your plant secure
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/19)
@@ -39,7 +35,7 @@ If your IoT application is not secure, there are a number of risks:
These are real world scenarios, and happen all the time. Some examples were given in earlier lessons, but here are some more:
-* In 2018 hackers used an open WiFi access point on a fish tank thermostat to gain access to a casino's network to steal data. [The Hacker News - Casino Gets Hacked Through Its Internet-Connected Fish Tank Thermometer](https://thehackernews.com/2018/04/iot-hacking-thermometer.html)
+* In 2018, hackers used an open WiFi access point on a fish tank thermostat to gain access to a casino's network to steal data. [The Hacker News - Casino Gets Hacked Through Its Internet-Connected Fish Tank Thermometer](https://thehackernews.com/2018/04/iot-hacking-thermometer.html)
* In 2016, the Mirai Botnet launched a denial of service attack against Dyn, an Internet service provider, taking down large portions of the Internet. This botnet used malware to connect to IoT devices such as DVRs and cameras that used default usernames and passwords, and from there launched the attack. [The Guardian - DDoS attack that disrupted internet was largest of its kind in history, experts say](https://www.theguardian.com/technology/2016/oct/26/ddos-attack-dyn-mirai-botnet)
* Spiral Toys had a database of users of their CloudPets connected toys publicly available over the Internet. [Troy Hunt - Data from connected CloudPets teddy bears leaked and ransomed, exposing kids' voice messages](https://www.troyhunt.com/data-from-connected-cloudpets-teddy-bears-leaked-and-ransomed-exposing-kids-voice-messages/).
* Strava tagged runners that you ran past and showed their routes, allowing strangers to effectively see where you live. [Kim Komndo - Fitness app could lead a stranger right to your home — change this setting](https://www.komando.com/security-privacy/strava-fitness-app-privacy/755349/).
@@ -90,7 +86,7 @@ Unfortunately, not everything is secure. Some devices have no security, others a
Encryption comes in two types - symmetric and asymmetric.
-**Symmetric** encryption uses the same key to encrypt and decrypt the data. Both the sender and receive need to know the same key. This is the least secure type, as the key needs to be shared somehow. For a sender to send an encrypted message to a recipient, the sender first might have to send the recipient the key.
+**Symmetric** encryption uses the same key to encrypt and decrypt the data. Both the sender and receiver need to know the same key. This is the least secure type, as the key needs to be shared somehow. For a sender to send an encrypted message to a recipient, the sender first might have to send the recipient the key.

@@ -146,9 +142,11 @@ After the connection, all data sent to the IoT Hub from the device, or to the de
>
> When learning IoT it is often easier to put the key in code, as you did in an earlier lesson, but you must ensure this key is not checked into public source code control.
+Devices have 2 keys, and 2 corresponding connection strings. This allows you to rotate the keys - that is, switch from one key to another if the first gets compromised, and re-generate the first key.
+
### X.509 certificates
-When you are using a asymmetric encryption with a public/private key pair, you need to provide your public key to anyone who wants to send you data. The problem is, how can the recipient of your key be sure it's actually your public key, not someone else pretending to be you? Instead of providing a key, you can instead provide your public key inside a certificate that has been verified by a trusted third party, called an X.509 certificate.
+When you are using asymmetric encryption with a public/private key pair, you need to provide your public key to anyone who wants to send you data. The problem is, how can the recipient of your key be sure it's actually your public key, not someone else pretending to be you? Instead of providing a key, you can instead provide your public key inside a certificate that has been verified by a trusted third party, called an X.509 certificate.
X.509 certificates are digital documents that contain the public key part of the public/private key pair. They are usually issued by one of a number of trusted organizations called [Certification authorities](https://wikipedia.org/wiki/Certificate_authority) (CAs), and digitally signed by the CA to indicate the key is valid and comes from you. You trust the certificate and that the public key is from who the certificate says it is from, because you trust the CA, similar to how you would trust a passport or driving license because you trust the country issuing it. Certificates cost money, so you can also 'self-sign', that is create a certificate yourself that is signed by you, for testing purposes.
@@ -162,7 +160,7 @@ When using X.509 certificates, both the sender and the recipient will have their

-***nstead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it. Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)***
+***Instead of sharing a public key, you can share a certificate. The user of the certificate can verify that it comes from you by checking with the certificate authority who signed it. Certificate by alimasykurm from the [Noun Project](https://thenounproject.com)***
One big advantage of using X.509 certificates is that they can be shared between devices. You can create one certificate, upload it to IoT Hub, and use this for all your devices. Each device then just needs to know the private key to decrypt the messages it receives from IoT Hub.
@@ -176,7 +174,7 @@ The certificate used by your device to encrypt messages it sends to the IoT Hub
The steps to generate an X.509 certificate are:
-1. Create a public/private key pair. One of the most widely used algorithm to generate a public/private key pair is called [RSA](https://wikipedia.org/wiki/RSA_(cryptosystem)).
+1. Create a public/private key pair. One of the most widely used algorithms to generate a public/private key pair is called [Rivest–Shamir–Adleman](https://wikipedia.org/wiki/RSA_(cryptosystem)) (RSA).
1. Submit the public key with associated data for signing, either by a CA, or by self-signing
diff --git a/2-farm/lessons/6-keep-your-plant-secure/single-board-computer-x509.md b/2-farm/lessons/6-keep-your-plant-secure/single-board-computer-x509.md
index 93bf7985..00ddabf1 100644
--- a/2-farm/lessons/6-keep-your-plant-secure/single-board-computer-x509.md
+++ b/2-farm/lessons/6-keep-your-plant-secure/single-board-computer-x509.md
@@ -47,8 +47,10 @@ The next step is to connect your device to IoT Hub using the X.509 certificates.
```
This will connect using the X.509 certificate instead of a connection string.
+
+1. Delete the line with the `connection_string` variable.
-1, RUn your code. Monitor the messages sent to IoT Hub, and send direct method requests as before. You will see the device connecting and sending soil moisture readings, as well as receiving direct method requests.
+1. Run your code. Monitor the messages sent to IoT Hub, and send direct method requests as before. You will see the device connecting and sending soil moisture readings, as well as receiving direct method requests.
> 💁 You can find this code in the [code/pi](code/pi) or [code/virtual-device](code/virtual-device) folder.
diff --git a/3-transport/README.md b/3-transport/README.md
index 061f44d1..61bc84a2 100644
--- a/3-transport/README.md
+++ b/3-transport/README.md
@@ -1,10 +1,10 @@
# Transport from farm to factory - using IoT to track food deliveries
-Many farmers grow food to sell - either they are commercial growers who sell everything they grow, or they are subsistence farmers who sell their excess produce to buy necessities. Somehow the food has to get from the farm to the consumer, and this usually relies on bulk transport from farms, to hubs or processing plants, then on to stores. For example, a tomato farmer will harvest tomatoes, pack them into boxes, load the boxes into a truck then deliver to a processing plant. The tomatoes will then be sorted, and from there delivered to the consumers in the form of retail, food processing, or restaurants.
+Many farmers grow food to sell - either they are commercial farmers who sell everything they grow, or they are subsistence farmers who sell their excess produce to buy necessities. Somehow the food has to get from the farm to the consumer, and this usually relies on bulk transport from farms, to hubs or processing plants, then to stores. For example, a tomato farmer will harvest tomatoes, pack them into boxes, load the boxes into a truck then deliver to a processing plant. The tomatoes will then be sorted, and from there delivered to the consumers in the form of processed food, retail sales, or consumed at restaurants.
-IoT can help with this supply chain by tracking the food in transit - ensuring drivers are going where they should, monitoring vehicle locations, and getting alerts when vehicles arrive so that food can be unloaded, ready for processing as soon as possible.
+IoT can help with this supply chain by tracking the food in transit - ensuring drivers are going where they should, monitoring vehicle locations, and getting alerts when vehicles arrive so that food can be unloaded, and be ready for processing as soon as possible.
-> 🎓 A *supply chain* is the sequence of activities to make and deliver something. For example, in tomato farming it covers seed, soil, fertilizer and water supply, growing tomatoes, delivering tomatoes to a central hub, transporting them to a supermarkets local hub, transporting to the individual supermarket, being put out on display, then sold to a consumer and taken home to eat. Each step is like the links in a chain.
+> 🎓 A *supply chain* is the sequence of activities to make and deliver something. For example, in tomato farming it covers seed, soil, fertilizer and water supply, growing tomatoes, delivering tomatoes to a central hub, transporting them to a supermarket's local hub, transporting to the individual supermarket, being put out on display, then sold to a consumer and taken home to eat. Each step is like the links in a chain.
> 🎓 The transportation part of the supply chain is know as *logistics*.
@@ -15,10 +15,10 @@ In these 4 lessons, you'll learn how to apply the Internet of Things to improve
## Topics
1. [Location tracking](lessons/1-location-tracking/README.md)
-1. [Store location data](./3-transport/lessons/2-store-location-data/README.md)
+1. [Store location data](lessons/2-store-location-data/README.md)
1. [Visualize location data](lessons/3-visualize-location-data/README.md)
1. [Geofences](lessons/4-geofences/README.md)
## Credits
-All the lessons were written with ♥️ by [Jim Bennett](https://GitHub.com/JimBobBennett)
+All the lessons were written with ♥️ by [Jen Looper](https://github.com/jlooper) and [Jim Bennett](https://GitHub.com/JimBobBennett)
diff --git a/3-transport/lessons/1-location-tracking/README.md b/3-transport/lessons/1-location-tracking/README.md
index 2bb82fb8..cea9db86 100644
--- a/3-transport/lessons/1-location-tracking/README.md
+++ b/3-transport/lessons/1-location-tracking/README.md
@@ -1,22 +1,20 @@
# Location tracking
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/21)
## Introduction
-The main process for getting food from a farmer to a consumer involves loading boxes of produce on to trucks, ships, airplanes, or other commercial transport vehicles, and delivering the food somewhere - either direct to a customer, or to a central hub or warehouse for processing. The whole end-to-end process from farm to consumer is part of a process called the *supply chain*. The video below from Arizona State University's W. P. Carey School of Business talks about the idea of the supply chain and how it is managed in more detail.
+The main process for getting food from a farmer to a consumer involves loading boxes of produce on to trucks, ships, airplanes, or other commercial transport vehicles, and delivering the food somewhere - either directly to a customer, or to a central hub or warehouse for processing. The whole end-to-end process from farm to consumer is part of a process called the *supply chain*. The video below from Arizona State University's W. P. Carey School of Business talks about the idea of the supply chain and how it is managed in more detail.
[](https://www.youtube.com/watch?v=Mi1QBxVjZAw)
+> 🎥 Click the image above to watch a video
+
Adding IoT devices can drastically improve your supply chain, allowing you to manage where items are, plan transport and goods handling better, and respond quicker to problems.
-When managing a fleet of vehicles such as trucks, it is helpful to know where each vehicle is at a given time. Vehicles can be fitted with GPS sensors that send their location to IoT systems, allowing the owners to pinpoint their location, see the route they have taken, and know when they will arrive at their destination. Most vehicles operate outside of WiFi coverage, so they use cellular networks to send this kind of data. Sometimes the GPS sensor is built into more complex IoT devices such as electronic log books. These devices track how long a truck has been driven for to ensure drivers are in compliance with local laws on working hours.
+When managing a fleet of vehicles such as trucks, it is helpful to know where each vehicle is at a given time. Vehicles can be fitted with GPS sensors that send their location to IoT systems, allowing the owners to pinpoint their location, see the route they have taken, and know when they will arrive at their destination. Most vehicles operate outside of WiFi coverage, so they use cellular networks to send this kind of data. Sometimes the GPS sensor is built into more complex IoT devices such as electronic log books. These devices track how long a truck has been in transit to ensure drivers are in compliance with local laws on working hours.
In this lesson you will learn how to track a vehicle's location using a Global Positioning System (GPS) sensor.
@@ -53,11 +51,11 @@ The core component of vehicle tracking is GPS - sensors that can pinpoint their
## Geospatial coordinates
-Geospatial coordinates are used to define points on the Earth's surface, similar to how coordinates can be used to draw to a pixel on a computer screen or position stitches in cross stitch. For a single point, you have a pair of coordinates. For example, the Microsoft Campus in Redmond, Washington, USA is located at 47.6423109,-122.1390293.
+Geospatial coordinates are used to define points on the Earth's surface, similar to how coordinates can be used to draw to a pixel on a computer screen or position stitches in cross stitch. For a single point, you have a pair of coordinates. For example, the Microsoft Campus in Redmond, Washington, USA is located at 47.6423109, -122.1390293.
### Latitude and longitude
-The Earth is a sphere - a three-dimensional circle. Because of this, points are defined is by dividing it into 360 degrees, the same as the geometry of circles. Latitude measures the number of degrees north to south, longitude measures the number of degrees east to west.
+The Earth is a sphere - the three-dimensional equivalent of a circle. Because of this, points on its surface are defined by dividing it into 360 degrees, the same as the geometry of circles. Latitude measures the number of degrees north to south, longitude measures the number of degrees east to west.
> 💁 No-one really knows the original reason why circles are divided into 360 degrees. The [degree (angle) page on Wikipedia](https://wikipedia.org/wiki/Degree_(angle)) covers some of the possible reasons.
@@ -136,7 +134,7 @@ You can use a GPS sensor on your IoT device to get GPS data.
### Task - connect a GPS sensor and read GPS data
-Work through the relevant guide to measure soil moisture using your IoT device:
+Work through the relevant guide to read GPS data using your IoT device:
* [Arduino - Wio Terminal](wio-terminal-gps-sensor.md)
* [Single-board computer - Raspberry Pi](pi-gps-sensor.md)
@@ -178,7 +176,7 @@ Rather than use the raw NMEA data, it is better to decode it into a more useful
### Task - decode GPS sensor data
-Work through the relevant guide to measure soil moisture using your IoT device:
+Work through the relevant guide to decode GPS sensor data using your IoT device:
* [Arduino - Wio Terminal](wio-terminal-gps-decode.md)
* [Single-board computer - Raspberry Pi/Virtual IoT device](single-board-computer-gps-decode.md)
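+
+> 💁 If you want a quick feel for what these guides cover, below is a minimal sketch (not part of the lesson code) that decodes a single GGA sentence using the `pynmea2` Pip package. The sample sentence is a standard example - you can generate your own with [nmeagen.org](https://www.nmeagen.org).
+
+```python
+import pynmea2
+
+# A sample GGA sentence - generate your own at https://www.nmeagen.org
+line = '$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47'
+
+msg = pynmea2.parse(line)
+if msg.sentence_type == 'GGA':
+    # Convert the degrees/decimal minutes values to signed decimal degrees
+    lat = pynmea2.dm_to_sd(msg.lat)
+    lon = pynmea2.dm_to_sd(msg.lon)
+
+    if msg.lat_dir == 'S':
+        lat = lat * -1
+    if msg.lon_dir == 'W':
+        lon = lon * -1
+
+    print(f'{lat},{lon} - from {msg.num_sats} satellites')
+```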
diff --git a/3-transport/lessons/1-location-tracking/assignment.md b/3-transport/lessons/1-location-tracking/assignment.md
index e524e326..76d106e6 100644
--- a/3-transport/lessons/1-location-tracking/assignment.md
+++ b/3-transport/lessons/1-location-tracking/assignment.md
@@ -6,7 +6,7 @@ The NMEA sentences that come from your GPS sensor have other data in addition to
For example - can you get the current date and time? If you are using a microcontroller, can you set the clock using GPS data in the same way you set it using NTP signals in the previous project? Can you get elevation (your height above sea level), or your current speed?
-If you are using a virtual IoT device, then you can get some of this data by sending MENA sentences generated using tools [nmeagen.org](https://www.nmeagen.org).
+If you are using a virtual IoT device, then you can get some of this data by sending NMEA sentences generated using tools such as [nmeagen.org](https://www.nmeagen.org).
## Rubric
diff --git a/3-transport/lessons/1-location-tracking/code-gps-decode/pi/gps-sensor/app.py b/3-transport/lessons/1-location-tracking/code-gps-decode/pi/gps-sensor/app.py
index 3aad9865..a5de9192 100644
--- a/3-transport/lessons/1-location-tracking/code-gps-decode/pi/gps-sensor/app.py
+++ b/3-transport/lessons/1-location-tracking/code-gps-decode/pi/gps-sensor/app.py
@@ -2,21 +2,12 @@ import time
import serial
import pynmea2
import json
-from azure.iot.device import IoTHubDeviceClient, Message
-
-connection_string = ''
serial = serial.Serial('/dev/ttyAMA0', 9600, timeout=1)
serial.reset_input_buffer()
serial.flush()
-device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
-
-print('Connecting')
-device_client.connect()
-print('Connected')
-
-def printGPSData(line):
+def print_gps_data(line):
msg = pynmea2.parse(line)
if msg.sentence_type == 'GGA':
lat = pynmea2.dm_to_sd(msg.lat)
@@ -28,16 +19,13 @@ def printGPSData(line):
if msg.lon_dir == 'W':
lon = lon * -1
- message_json = { "gps" : { "lat":lat, "lon":lon } }
- print("Sending telemetry", message_json)
- message = Message(json.dumps(message_json))
- device_client.send_message(message)
+ print(f'{lat},{lon} - from {msg.num_sats} satellites')
while True:
line = serial.readline().decode('utf-8')
while len(line) > 0:
- printGPSData(line)
+ print_gps_data(line)
line = serial.readline().decode('utf-8')
time.sleep(1)
diff --git a/3-transport/lessons/1-location-tracking/code-gps-decode/virtual-device/gps-sensor/app.py b/3-transport/lessons/1-location-tracking/code-gps-decode/virtual-device/gps-sensor/app.py
index 0383f8fd..2b819a56 100644
--- a/3-transport/lessons/1-location-tracking/code-gps-decode/virtual-device/gps-sensor/app.py
+++ b/3-transport/lessons/1-location-tracking/code-gps-decode/virtual-device/gps-sensor/app.py
@@ -5,18 +5,11 @@ import time
import counterfit_shims_serial
import pynmea2
import json
-from azure.iot.device import IoTHubDeviceClient, Message
connection_string = ''
serial = counterfit_shims_serial.Serial('/dev/ttyAMA0')
-device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
-
-print('Connecting')
-device_client.connect()
-print('Connected')
-
def send_gps_data(line):
msg = pynmea2.parse(line)
if msg.sentence_type == 'GGA':
@@ -29,10 +22,7 @@ def send_gps_data(line):
if msg.lon_dir == 'W':
lon = lon * -1
- message_json = { "gps" : { "lat":lat, "lon":lon } }
- print("Sending telemetry", message_json)
- message = Message(json.dumps(message_json))
- device_client.send_message(message)
+ print(f'{lat},{lon} - from {msg.num_sats} satellites')
while True:
line = serial.readline().decode('utf-8')
@@ -41,4 +31,4 @@ while True:
send_gps_data(line)
line = serial.readline().decode('utf-8')
- time.sleep(60)
\ No newline at end of file
+ time.sleep(1)
\ No newline at end of file
diff --git a/3-transport/lessons/1-location-tracking/code-gps/pi/gps-sensor/app.py b/3-transport/lessons/1-location-tracking/code-gps/pi/gps-sensor/app.py
index 8a282ee1..0dfad1e9 100644
--- a/3-transport/lessons/1-location-tracking/code-gps/pi/gps-sensor/app.py
+++ b/3-transport/lessons/1-location-tracking/code-gps/pi/gps-sensor/app.py
@@ -5,14 +5,14 @@ serial = serial.Serial('/dev/ttyAMA0', 9600, timeout=1)
serial.reset_input_buffer()
serial.flush()
-def printGPSData():
+def print_gps_data():
print(line.rstrip())
while True:
line = serial.readline().decode('utf-8')
while len(line) > 0:
- printGPSData()
+ print_gps_data()
line = serial.readline().decode('utf-8')
time.sleep(1)
diff --git a/3-transport/lessons/1-location-tracking/code-gps/virtual-device/gps-sensor/app.py b/3-transport/lessons/1-location-tracking/code-gps/virtual-device/gps-sensor/app.py
index 7dd2a965..811ffded 100644
--- a/3-transport/lessons/1-location-tracking/code-gps/virtual-device/gps-sensor/app.py
+++ b/3-transport/lessons/1-location-tracking/code-gps/virtual-device/gps-sensor/app.py
@@ -6,14 +6,14 @@ import counterfit_shims_serial
serial = counterfit_shims_serial.Serial('/dev/ttyAMA0')
-def printGPSData(line):
+def print_gps_data(line):
print(line.rstrip())
while True:
line = serial.readline().decode('utf-8')
while len(line) > 0:
- printGPSData(line)
+ print_gps_data(line)
line = serial.readline().decode('utf-8')
time.sleep(1)
\ No newline at end of file
diff --git a/3-transport/lessons/1-location-tracking/pi-gps-sensor.md b/3-transport/lessons/1-location-tracking/pi-gps-sensor.md
index 63d8793f..01148da1 100644
--- a/3-transport/lessons/1-location-tracking/pi-gps-sensor.md
+++ b/3-transport/lessons/1-location-tracking/pi-gps-sensor.md
@@ -24,7 +24,7 @@ Connect the GPS sensor.
1. With the Raspberry Pi powered off, connect the other end of the Grove cable to the UART socket marked **UART** on the Grove Base hat attached to the Pi. This socket is on the middle row, on the side nearest the SD Card slot, the other end from the USB ports and ethernet socket.
-
+ 
1. Position the GPS sensor so that the attached antenna has visibility to the sky - ideally next to an open window or outside. It's easier to get a clearer signal with nothing in the way of the antenna.
@@ -42,7 +42,7 @@ Program the device.
1. Launch VS Code, either directly on the Pi, or connect via the Remote SSH extension.
- > ⚠️ You can refer to [the instructions for setting up and launch VS Code in lesson 1 if needed](../../../1-getting-started/lessons/1-introduction-to-iot/pi.md).
+ > ⚠️ You can refer to [the instructions for setting up and launching VS Code in lesson 1 if needed](../../../1-getting-started/lessons/1-introduction-to-iot/pi.md).
1. With newer versions of the Raspberry Pi that support Bluetooth, there is a conflict between the serial port used for Bluetooth, and the one used by the Grove UART port. To fix this, do the following:
@@ -118,14 +118,14 @@ Program the device.
serial.reset_input_buffer()
serial.flush()
- def printGPSData(line):
+ def print_gps_data(line):
print(line.rstrip())
while True:
line = serial.readline().decode('utf-8')
while len(line) > 0:
- printGPSData(line)
+ print_gps_data(line)
line = serial.readline().decode('utf-8')
time.sleep(1)
@@ -133,9 +133,9 @@ Program the device.
This code imports the `serial` module from the `pyserial` Pip package. It then connects to the `/dev/ttyAMA0` serial port - this is the address of the serial port that the Grove Pi Base Hat uses for its UART port. It then clears any existing data from this serial connection.
- Next a function called `printGPSData` is defined that prints out the line passed to it to the console.
+ Next a function called `print_gps_data` is defined that prints out the line passed to it to the console.
- Next the code loops forever, reading as many lines of text as it can from the serial port in each loop. It calls the `printGPSData` function for each line.
+ Next the code loops forever, reading as many lines of text as it can from the serial port in each loop. It calls the `print_gps_data` function for each line.
After all the data has been read, the loop sleeps for 1 second, then tries again.
diff --git a/3-transport/lessons/1-location-tracking/single-board-computer-gps-decode.md b/3-transport/lessons/1-location-tracking/single-board-computer-gps-decode.md
index bf7bc26f..39954623 100644
--- a/3-transport/lessons/1-location-tracking/single-board-computer-gps-decode.md
+++ b/3-transport/lessons/1-location-tracking/single-board-computer-gps-decode.md
@@ -24,7 +24,7 @@ Program the device to decode the GPS data.
import pynmea2
```
-1. Replace the contents of the `printGPSData` function with the following:
+1. Replace the contents of the `print_gps_data` function with the following:
```python
msg = pynmea2.parse(line)
diff --git a/3-transport/lessons/1-location-tracking/virtual-device-gps-sensor.md b/3-transport/lessons/1-location-tracking/virtual-device-gps-sensor.md
index d6c58030..a524c7b5 100644
--- a/3-transport/lessons/1-location-tracking/virtual-device-gps-sensor.md
+++ b/3-transport/lessons/1-location-tracking/virtual-device-gps-sensor.md
@@ -77,28 +77,28 @@ Program the GPS sensor app.
1. Add the following code below this to read from the serial port and print the values to the console:
```python
- def printGPSData(line):
+ def print_gps_data(line):
print(line.rstrip())
while True:
line = serial.readline().decode('utf-8')
while len(line) > 0:
- printGPSData(line)
+ print_gps_data(line)
line = serial.readline().decode('utf-8')
time.sleep(1)
```
- A function called `printGPSData` is defined that prints out the line passed to it to the console.
+ A function called `print_gps_data` is defined that prints out the line passed to it to the console.
- Next the code loops forever, reading as many lines of text as it can from the serial port in each loop. It calls the `printGPSData` function for each line.
+ Next the code loops forever, reading as many lines of text as it can from the serial port in each loop. It calls the `print_gps_data` function for each line.
After all the data has been read, the loop sleeps for 1 second, then tries again.
1. Run this code, ensuring you are using a different terminal to the one that the CounterFit app is running in, so that the CounterFit app remains running.
-1. From the CounterFit app, change the value of the gps sensor. You can do this in one of thess ways:
+1. From the CounterFit app, change the value of the gps sensor. You can do this in one of these ways:
* Set the **Source** to `Lat/Lon`, and set an explicit latitude, longitude and number of satellites used to get the GPS fix. This value will be sent only once, so check the **Repeat** box to have the data repeat every second.
diff --git a/3-transport/lessons/1-location-tracking/wio-terminal-gps-decode.md b/3-transport/lessons/1-location-tracking/wio-terminal-gps-decode.md
index 8eb9622a..b2662363 100644
--- a/3-transport/lessons/1-location-tracking/wio-terminal-gps-decode.md
+++ b/3-transport/lessons/1-location-tracking/wio-terminal-gps-decode.md
@@ -31,7 +31,7 @@ Program the device to decode the GPS data.
TinyGPSPlus gps;
```
-1. Change the contents of the `printGPSData` function to be the following:
+1. Change the contents of the `printGPSData` function to the following:
```cpp
if (gps.encode(Serial3.read()))
diff --git a/3-transport/lessons/1-location-tracking/wio-terminal-gps-sensor.md b/3-transport/lessons/1-location-tracking/wio-terminal-gps-sensor.md
index ea65a736..3a48b86d 100644
--- a/3-transport/lessons/1-location-tracking/wio-terminal-gps-sensor.md
+++ b/3-transport/lessons/1-location-tracking/wio-terminal-gps-sensor.md
@@ -22,15 +22,15 @@ Connect the GPS sensor.
1. Insert one end of a Grove cable into the socket on the GPS sensor. It will only go in one way round.
-1. With the Wio Terminal disconnected from your computer or other power supply, connect the other end of the Grove cable to the left-hand side Grove socket on the Wio Terminal as you look at the screen. This is the socket closest to from the power button.
+1. With the Wio Terminal disconnected from your computer or other power supply, connect the other end of the Grove cable to the left-hand side Grove socket on the Wio Terminal as you look at the screen. This is the socket closest to the power button.
-
+ 
1. Position the GPS sensor so that the attached antenna has visibility to the sky - ideally next to an open window or outside. It's easier to get a clearer signal with nothing in the way of the antenna.
1. You can now connect the Wio Terminal to your computer.
-1. The GPS sensor has 2 LEDs - a blue LED that flashes when data is transmitted, and a green LED that flashes every second when receiving data from satellites. Ensure the blue LED is flashing when you power up the Pi. After a few minutes the green LED will flash - if not, you may need to reposition the antenna.
+1. The GPS sensor has 2 LEDs - a blue LED that flashes when data is transmitted, and a green LED that flashes every second when receiving data from satellites. Ensure the blue LED is flashing when you power up the Wio Terminal. After a few minutes the green LED will flash - if not, you may need to reposition the antenna.
## Program the GPS sensor
diff --git a/3-transport/lessons/2-store-location-data/README.md b/3-transport/lessons/2-store-location-data/README.md
index fd2a00ab..0cbb658c 100644
--- a/3-transport/lessons/2-store-location-data/README.md
+++ b/3-transport/lessons/2-store-location-data/README.md
@@ -1,16 +1,12 @@
# Store location data
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/23)
## Introduction
-In the last lesson, you learned how to use a GPS sensor to capture location data. To use this data to visualize the both the location of a truck laden with food, but also it's journey, it needs to be sent to an IoT service in the cloud, and then stored somewhere.
+In the last lesson, you learned how to use a GPS sensor to capture location data. To use this data to visualize the location of a truck laden with food, and its journey, it needs to be sent to an IoT service in the cloud, and then stored somewhere.
In this lesson you will learn about the different ways to store IoT data, and learn how to store data from your IoT service using serverless code.
@@ -18,6 +14,7 @@ In this lesson we'll cover:
* [Structured and unstructured data](#structured-and-unstructured-data)
* [Send GPS data to an IoT Hub](#send-gps-data-to-an-iot-hub)
+* [Hot, warm, and cold paths](#hot-warm-and-cold-paths)
* [Handle GPS events using serverless code](#handle-gps-events-using-serverless-code)
* [Azure Storage Accounts](#azure-storage-accounts)
* [Connect your serverless code to storage](#connect-your-serverless-code-to-storage)
@@ -44,6 +41,8 @@ Imagine you were adding IoT devices to a fleet of vehicles for a large commercia
This data can change constantly. For example, if the IoT device is in a truck cab, then the data it sends may change as the trailer changes, for example only sending temperature data when a refrigerated trailer is used.
+✅ What other IoT data might be captured? Think about the kinds of loads trucks can carry, as well as maintenance data.
+
This data varies from vehicle to vehicle, but it all gets sent to the same IoT service for processing. The IoT service needs to be able to process this unstructured data, storing it in a way that allows it to be searched or analyzed, while still working with the different structures the data can have.
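+
+For example, two trucks in the same fleet might send telemetry with different shapes. A minimal sketch (the field names here are purely illustrative, not part of the lesson's code):
+
+```python
+# Telemetry from a truck pulling a refrigerated trailer - includes a temperature reading
+refrigerated_truck = {
+    'gps': {'lat': 47.6423, 'lon': -122.1390},
+    'trailer_temperature': 3.5
+}
+
+# Telemetry from a truck pulling a flatbed trailer - no temperature, but a load weight
+flatbed_truck = {
+    'gps': {'lat': 47.6423, 'lon': -122.1390},
+    'load_weight_kg': 14000
+}
+```
+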
### SQL vs NoSQL storage
@@ -58,13 +57,17 @@ The first databases were Relational Database Management Systems (RDBMS), or rela
For example, if you stored a user's personal details in a table, you would have some kind of internal unique ID per user that is used in a row in a table that contains the user's name and address. If you then wanted to store other details about that user, such as their purchases, in another table, you would have one column in the new table for that user's ID. When you look up a user, you can use their ID to get their personal details from one table, and their purchases from another.
-SQL databases are ideal for storing structured data, and for when you want to ensure the data matches your schema. Some well known SQL databases are Microsoft SQL Server, MySQL, and PostgreSQL.
+SQL databases are ideal for storing structured data, and for when you want to ensure the data matches your schema.
✅ If you haven't used SQL before, take a moment to read up on it on the [SQL page on Wikipedia](https://wikipedia.org/wiki/SQL).
+Some well known SQL databases are Microsoft SQL Server, MySQL, and PostgreSQL.
+
+✅ Do some research: Read up on some of these SQL databases and their capabilities.
+
#### NoSQL database
-NoSQL databases are so called because they don't have the same rigid structure of SQL databases. There are also known as document databases as they can store unstructured data such as documents.
+NoSQL databases are called NoSQL because they don't have the same rigid structure as SQL databases. They are also known as document databases as they can store unstructured data such as documents.
> 💁 Despite their name, some NoSQL databases allow you to use SQL to query the data.
@@ -74,6 +77,8 @@ NoSQL database do not have a pre-defined schema that limits how data is stored,
Some well known NoSQL databases include Azure CosmosDB, MongoDB, and CouchDB.
+✅ Do some research: Read up on some of these NoSQL databases and their capabilities.
+
In this lesson, you will be using NoSQL storage to store IoT data.
## Send GPS data to an IoT Hub
@@ -88,7 +93,7 @@ In the last lesson you captured GPS data from a GPS sensor connected to your IoT
1. Create a new IoT Hub using the free tier.
- > ⚠️ You can refer to [the instructions for creating an IoT Hub from project 2, lesson 4 if needed](../../../2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md#create-an-iot-service-in-the-cloud).
+ > ⚠️ You can refer to the [instructions for creating an IoT Hub from project 2, lesson 4](../../../2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md#create-an-iot-service-in-the-cloud) if needed.
Remember to create a new Resource Group. Name the new Resource Group `gps-sensor`, and the new IoT Hub a unique name based on `gps-sensor`, such as `gps-sensor-`.
@@ -98,7 +103,7 @@ In the last lesson you captured GPS data from a GPS sensor connected to your IoT
1. Update your device code to send the GPS data to the new IoT Hub using the device connection string from the previous step.
- > ⚠️ You can refer to [the instructions for connecting your device to an IoT from project 2, lesson 4 if needed](../../../2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md#connect-your-device-to-the-iot-service).
+ > ⚠️ You can refer to the [instructions for connecting your device to an IoT from project 2, lesson 4](../../../2-farm/lessons/4-migrate-your-plant-to-the-cloud/README.md#connect-your-device-to-the-iot-service) if needed.
1. When you send the GPS data, do it as JSON in the following format:
@@ -136,9 +141,33 @@ message = Message(json.dumps(message_json))
Run your device code and ensure messages are flowing into IoT Hub using the `az iot hub monitor-events` CLI command.
+## Hot, warm, and cold paths
+
+Data that flows from an IoT device to the cloud is not always processed in real time. Some data needs real time processing, other data can be processed a short time later, and other data can be processed much later. The flow of data to different services that process it at different times is referred to as the hot, warm, and cold paths.
+
+### Hot path
+
+The hot path refers to data that needs to be processed in real time or near real time. You would use hot path data for alerts, such as an alert that a vehicle is approaching a depot, or that the temperature in a refrigerated truck is too high.
+
+To use hot path data, your code would respond to events as soon as they are received by your cloud services.
+
+### Warm path
+
+The warm path refers to data that can be processed a short while after being received, for example for reporting or short term analytics. You would use warm path data for daily reports on vehicle mileage, using data gathered the previous day.
+
+Warm path data is stored as soon as it is received by the cloud service, in some kind of storage that can be accessed quickly.
+
+### Cold path
+
+The cold path refers to historic data, storing data for the long term to be processed whenever needed. For example, you could use the cold path to get annual mileage reports for vehicles, or run analytics on routes to find the optimal route to reduce fuel costs.
+
+Cold path data is stored in data warehouses - databases designed for storing large amounts of data that will never change and can be queried quickly and easily. You would normally have a scheduled job in your cloud application that runs each day, week, or month to move data from warm path storage into the data warehouse.
+
+✅ Think about the data you have captured so far in these lessons. Is it hot, warm or cold path data?
+
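+The same stream of device messages can feed all three paths. The sketch below (not part of this lesson's code - the helper functions are hypothetical stand-ins for real alerting, storage and queuing services) shows how an event handler might fan a single message out to each path:
+
+```python
+import json
+
+# Hypothetical helpers - stand-ins for real alerting, storage and queuing services
+def send_alert(telemetry):
+    print('HOT PATH - alert:', telemetry)
+
+def save_to_warm_storage(telemetry):
+    print('WARM PATH - store for short-term reporting:', telemetry)
+
+def queue_for_warehouse(telemetry):
+    print('COLD PATH - queue for the data warehouse:', telemetry)
+
+def handle_event(event_body):
+    telemetry = json.loads(event_body)
+
+    # Hot path - react in (near) real time, for example if a refrigerated trailer is too warm
+    if telemetry.get('trailer_temperature', 0) > 5:
+        send_alert(telemetry)
+
+    # Warm path - keep recent data close at hand for daily reports
+    save_to_warm_storage(telemetry)
+
+    # Cold path - everything eventually ends up in long-term storage
+    queue_for_warehouse(telemetry)
+
+handle_event('{"gps": {"lat": 47.73481, "lon": -122.25701}, "trailer_temperature": 7.5}')
+```
+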
## Handle GPS events using serverless code
-Once data is flowing into your IoT Hub, you can write some serverless code to listen for events published to the Event-Hub compatible endpoint.
+Once data is flowing into your IoT Hub, you can write some serverless code to listen for events published to the Event-Hub compatible endpoint. This is the warm path - this data will be stored and used in the next lesson for reporting on the journey.

@@ -148,21 +177,21 @@ Once data is flowing into your IoT Hub, you can write some serverless code to li
1. Create an Azure Functions app using the Azure Functions CLI. Use the Python runtime, and create it in a folder called `gps-trigger`, and use the same name for the Functions App project name. Make sure you create a virtual environment to use for this.
- > ⚠️ You can refer to [the instructions for creating an Azure Functions Project from project 2, lesson 5 if needed](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#create-a-serverless-application).
+ > ⚠️ You can refer to the [instructions for creating an Azure Functions Project from project 2, lesson 5](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#create-a-serverless-application) if needed.
1. Add an IoT Hub event trigger that uses the IoT Hub's Event Hub compatible endpoint.
- > ⚠️ You can refer to [the instructions for creating an IoT Hub event trigger from project 2, lesson 5 if needed](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#create-an-iot-hub-event-trigger).
+ > ⚠️ You can refer to the [instructions for creating an IoT Hub event trigger from project 2, lesson 5](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#create-an-iot-hub-event-trigger) if needed.
1. Set the Event Hub compatible endpoint connection string in the `local.settings.json` file, and use the key for that entry in the `function.json` file.
1. Use the Azurite app as a local storage emulator
-Run your functions app to ensure it is receiving events from your GPS device. Make sure your IoT device is also running and sending GPS data.
+1. Run your functions app to ensure it is receiving events from your GPS device. Make sure your IoT device is also running and sending GPS data.
-```output
-Python EventHub trigger processed an event: {"gps": {"lat": 47.73481, "lon": -122.25701}}
-```
+ ```output
+ Python EventHub trigger processed an event: {"gps": {"lat": 47.73481, "lon": -122.25701}}
+ ```
## Azure Storage Accounts
@@ -180,13 +209,13 @@ You will use blob storage in this lesson to store IoT data.
### Table storage
-Table storage allows you to store semi-structured data. Table storage is actually a NoSQL database, so doesn't require a defined set of tables up front, but is designed to store data in one or more tables, with unique keys to define each row.
+Table storage allows you to store semi-structured data. Table storage is actually a NoSQL database, so doesn't require a defined set of tables up front, but it is designed to store data in one or more tables, with unique keys to define each row.
✅ Do some research: Read up on [Azure Table Storage](https://docs.microsoft.com/azure/storage/tables/table-storage-overview?WT.mc_id=academic-17441-jabenn)
### Queue storage
-Queue storage allows you to store messages of up to 64KB in size in a queue. You can add messages to the back of the queue, and read them off the front. Queues store messages indefinitely as long as there is still storage space, so allows messages to be stored long term. then read off when needed. For example, if you wanted to run a monthly job to process GPS data you could add it to a queue every day for a month, then at the end of the month process all the messages off the queue.
+Queue storage allows you to store messages of up to 64KB in size in a queue. You can add messages to the back of the queue, and read them off the front. Queues store messages indefinitely as long as there is still storage space, so messages can be stored long term, then read off when needed. For example, if you wanted to run a monthly job to process GPS data you could add it to a queue every day for a month, then at the end of the month process all the messages off the queue.
✅ Do some research: Read up on [Azure Queue Storage](https://docs.microsoft.com/azure/storage/queues/storage-queues-introduction?WT.mc_id=academic-17441-jabenn)
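+
+As a taste of how the queue approach works, here is a minimal sketch (not part of this lesson's code) using the `azure-storage-queue` Pip package, assuming you have a storage account connection string:
+
+```python
+import json
+from azure.core.exceptions import ResourceExistsError
+from azure.storage.queue import QueueClient
+
+connection_string = '<storage connection string>'
+
+# Connect to a queue called gps-data, creating it if it doesn't exist
+queue_client = QueueClient.from_connection_string(connection_string, 'gps-data')
+try:
+    queue_client.create_queue()
+except ResourceExistsError:
+    pass
+
+# Add a message to the back of the queue
+queue_client.send_message(json.dumps({'gps': {'lat': 47.73481, 'lon': -122.25701}}))
+
+# Later - for example in a monthly job - read the messages off the front of the queue
+for message in queue_client.receive_messages():
+    print('Processing', message.content)
+    queue_client.delete_message(message)
+```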
@@ -227,7 +256,7 @@ The data will be saved as a JSON blob with the following format:
1. Create an Azure Storage account. Name it something like `gps`.
- > ⚠️ You can refer to [the instructions for creating a storage account from project 2, lesson 5 if needed](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---create-the-cloud-resources).
+ > ⚠️ You can refer to the [instructions for creating a storage account from project 2, lesson 5](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---create-the-cloud-resources) if needed.
If you still have a storage account from the previous project, you can re-use this.
@@ -258,13 +287,13 @@ The data will be saved as a JSON blob with the following format:
> pip install --upgrade pip
> ```
-1. In the `__init__.py` file for the `iot_hub_trigger`, add the following import statements:
+1. In the `__init__.py` file for the `iot-hub-trigger`, add the following import statements:
```python
import json
import os
import uuid
- from azure.storage.blob import BlobServiceClient
+ from azure.storage.blob import BlobServiceClient, PublicAccess
```
The `json` system module will be used to read and write JSON, the `os` system module will be used to read the connection string, the `uuid` system module will be used to generate a unique ID for the GPS reading.
@@ -282,11 +311,13 @@ The data will be saved as a JSON blob with the following format:
if container.name == name:
return blob_service_client.get_container_client(container.name)
- return blob_service_client.create_container(name)
+ return blob_service_client.create_container(name, public_access=PublicAccess.Container)
```
The Python blob SDK doesn't have a helper method to create a container if it doesn't exist. This code will load the connection string from the `local.settings.json` file (or the Application Settings once deployed to the cloud), then create a `BlobServiceClient` class from this to interact with the blob storage account. It then loops through all the containers for the blob storage account, looking for one with the provided name - if it finds one it will return a `ContainerClient` class that can interact with the container to create blobs. If it doesn't find one, then the container is created and the client for the new container is returned.
+ When the new container is created, public access is granted to query the blobs in the container. This will be used in the next lesson to visualize the GPS data on a map.
+
1. Unlike with soil moisture, with this code we want to store every event, so add the following code inside the `for event in events:` loop in the `main` function, below the `logging` statement:
```python
@@ -294,7 +325,7 @@ The data will be saved as a JSON blob with the following format:
blob_name = f'{device_id}/{str(uuid.uuid1())}.json'
```
- This code gets the device ID from the event metadata, then uses it to create a blob name. Blobs can be stored in folders, and device ID will be used for the folder name, so each device will have all it's GPS events in one folder. The blob name is this folder, followed by a document name, separated with forward slashes, similar to Linux and macOS paths (similar to Windows as well, but Windows uses back slashes). The document name is a unique ID generated using the Python `uuid` module, with the file type of `json`.
+ This code gets the device ID from the event metadata, then uses it to create a blob name. Blobs can be stored in folders, and device ID will be used for the folder name, so each device will have all its GPS events in one folder. The blob name is this folder, followed by a document name, separated with forward slashes, similar to Linux and macOS paths (similar to Windows as well, but Windows uses back slashes). The document name is a unique ID generated using the Python `uuid` module, with the file type of `json`.
For example, for the `gps-sensor` device ID, the blob name might be `gps-sensor/a9487ac2-b9cf-11eb-b5cd-1e00621e3648.json`.
@@ -339,6 +370,9 @@ The data will be saved as a JSON blob with the following format:
[2021-05-21T01:31:14.351Z] Writing blob to gps-sensor/4b6089fe-ba8d-11eb-bc7b-1e00621e3648.json - {'device_id': 'gps-sensor', 'timestamp': '2021-05-21T00:57:53.878Z', 'gps': {'lat': 47.73092, 'lon': -122.26206}}
```
+ > 💁 Make sure you are not running the IoT Hub event monitor at the same time.
+
> 💁 You can find this code in the [code/functions](code/functions) folder.
### Task - verify the uploaded blobs
@@ -404,15 +438,15 @@ Now that your Function app is working, you can deploy it to the cloud.
1. Create a new Azure Functions app, using the storage account you created earlier. Name this something like `gps-sensor-` and add a unique identifier on the end, like some random words or your name.
- > ⚠️ You can refer to [the instructions for creating a Functions app from project 2, lesson 5 if needed](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---create-the-cloud-resources).
+ > ⚠️ You can refer to the [instructions for creating a Functions app from project 2, lesson 5](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---create-the-cloud-resources) if needed.
1. Upload the `IOT_HUB_CONNECTION_STRING` and `STORAGE_CONNECTION_STRING` values to the Application Settings
- > ⚠️ You can refer to [the instructions for uploading Application Settings from project 2, lesson 5 if needed](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---upload-your-application-settings).
+ > ⚠️ You can refer to the [instructions for uploading Application Settings from project 2, lesson 5](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---upload-your-application-settings) if needed.
1. Deploy your local Functions app to the cloud.
- > ⚠️ You can refer to [the instructions for deploying your Functions app from project 2, lesson 5 if needed](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---deploy-your-functions-app-to-the-cloud).
+ > ⚠️ You can refer to the [instructions for deploying your Functions app from project 2, lesson 5](../../../2-farm/lessons/5-migrate-application-to-the-cloud/README.md#task---deploy-your-functions-app-to-the-cloud) if needed.
---
diff --git a/3-transport/lessons/4-geofences/code/functions/gps-trigger/iot_hub_trigger/__init__.py b/3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot-hub-trigger/__init__.py
similarity index 86%
rename from 3-transport/lessons/4-geofences/code/functions/gps-trigger/iot_hub_trigger/__init__.py
rename to 3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot-hub-trigger/__init__.py
index d908e887..eb4a0a3c 100644
--- a/3-transport/lessons/4-geofences/code/functions/gps-trigger/iot_hub_trigger/__init__.py
+++ b/3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot-hub-trigger/__init__.py
@@ -5,7 +5,7 @@ import azure.functions as func
import json
import os
import uuid
-from azure.storage.blob import BlobServiceClient
+from azure.storage.blob import BlobServiceClient, PublicAccess
def get_or_create_container(name):
connection_str = os.environ['STORAGE_CONNECTION_STRING']
@@ -15,7 +15,7 @@ def get_or_create_container(name):
if container.name == name:
return blob_service_client.get_container_client(container.name)
- return blob_service_client.create_container(name)
+ return blob_service_client.create_container(name, public_access=PublicAccess.Container)
def main(events: List[func.EventHubEvent]):
for event in events:
diff --git a/3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot_hub_trigger/function.json b/3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot-hub-trigger/function.json
similarity index 100%
rename from 3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot_hub_trigger/function.json
rename to 3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot-hub-trigger/function.json
diff --git a/3-transport/lessons/2-store-location-data/code/pi/gps-sensor/app.py b/3-transport/lessons/2-store-location-data/code/pi/gps-sensor/app.py
index a0b5c0cf..3364f6b4 100644
--- a/3-transport/lessons/2-store-location-data/code/pi/gps-sensor/app.py
+++ b/3-transport/lessons/2-store-location-data/code/pi/gps-sensor/app.py
@@ -1,13 +1,14 @@
import time
-from grove.adc import ADC
-from grove.grove_relay import GroveRelay
+import serial
+import pynmea2
import json
-from azure.iot.device import IoTHubDeviceClient, Message, MethodResponse
+from azure.iot.device import IoTHubDeviceClient, Message
connection_string = ''
-adc = ADC()
-relay = GroveRelay(5)
+serial = serial.Serial('/dev/ttyAMA0', 9600, timeout=1)
+serial.reset_input_buffer()
+serial.flush()
device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
@@ -15,24 +16,28 @@ print('Connecting')
device_client.connect()
print('Connected')
-def handle_method_request(request):
- print("Direct method received - ", request.name)
-
- if request.name == "relay_on":
- relay.on()
- elif request.name == "relay_off":
- relay.off()
+def print_gps_data(line):
+ msg = pynmea2.parse(line)
+ if msg.sentence_type == 'GGA':
+ lat = pynmea2.dm_to_sd(msg.lat)
+ lon = pynmea2.dm_to_sd(msg.lon)
- method_response = MethodResponse.create_from_method_request(request, 200)
- device_client.send_method_response(method_response)
+ if msg.lat_dir == 'S':
+ lat = lat * -1
-device_client.on_method_request_received = handle_method_request
+ if msg.lon_dir == 'W':
+ lon = lon * -1
+
+ message_json = { "gps" : { "lat":lat, "lon":lon } }
+ print("Sending telemetry", message_json)
+ message = Message(json.dumps(message_json))
+ device_client.send_message(message)
while True:
- soil_moisture = adc.read(0)
- print("Soil moisture:", soil_moisture)
+ line = serial.readline().decode('utf-8')
- message = Message(json.dumps({ 'soil_moisture': soil_moisture }))
- device_client.send_message(message)
+ while len(line) > 0:
+ print_gps_data(line)
+ line = serial.readline().decode('utf-8')
- time.sleep(10)
\ No newline at end of file
+ time.sleep(60)
diff --git a/3-transport/lessons/2-store-location-data/code/virtual-device/gps-sensor/app.py b/3-transport/lessons/2-store-location-data/code/virtual-device/gps-sensor/app.py
index aa211db1..0383f8fd 100644
--- a/3-transport/lessons/2-store-location-data/code/virtual-device/gps-sensor/app.py
+++ b/3-transport/lessons/2-store-location-data/code/virtual-device/gps-sensor/app.py
@@ -2,15 +2,14 @@ from counterfit_connection import CounterFitConnection
CounterFitConnection.init('127.0.0.1', 5000)
import time
-from counterfit_shims_grove.adc import ADC
-from counterfit_shims_grove.grove_relay import GroveRelay
+import counterfit_shims_serial
+import pynmea2
import json
-from azure.iot.device import IoTHubDeviceClient, Message, MethodResponse
+from azure.iot.device import IoTHubDeviceClient, Message
connection_string = ''
-adc = ADC()
-relay = GroveRelay(5)
+serial = counterfit_shims_serial.Serial('/dev/ttyAMA0')
device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
@@ -18,24 +17,28 @@ print('Connecting')
device_client.connect()
print('Connected')
-def handle_method_request(request):
- print("Direct method received - ", request.name)
-
- if request.name == "relay_on":
- relay.on()
- elif request.name == "relay_off":
- relay.off()
+def send_gps_data(line):
+ msg = pynmea2.parse(line)
+ if msg.sentence_type == 'GGA':
+ lat = pynmea2.dm_to_sd(msg.lat)
+ lon = pynmea2.dm_to_sd(msg.lon)
- method_response = MethodResponse.create_from_method_request(request, 200)
- device_client.send_method_response(method_response)
+ if msg.lat_dir == 'S':
+ lat = lat * -1
-device_client.on_method_request_received = handle_method_request
+ if msg.lon_dir == 'W':
+ lon = lon * -1
+
+ message_json = { "gps" : { "lat":lat, "lon":lon } }
+ print("Sending telemetry", message_json)
+ message = Message(json.dumps(message_json))
+ device_client.send_message(message)
while True:
- soil_moisture = adc.read(0)
- print("Soil moisture:", soil_moisture)
+ line = serial.readline().decode('utf-8')
- message = Message(json.dumps({ 'soil_moisture': soil_moisture }))
- device_client.send_message(message)
+ while len(line) > 0:
+ send_gps_data(line)
+ line = serial.readline().decode('utf-8')
- time.sleep(10)
\ No newline at end of file
+ time.sleep(60)
\ No newline at end of file
diff --git a/3-transport/lessons/2-store-location-data/code/wio-terminal/gps-sensor/platformio.ini b/3-transport/lessons/2-store-location-data/code/wio-terminal/gps-sensor/platformio.ini
index f09ffc65..6aee0066 100644
--- a/3-transport/lessons/2-store-location-data/code/wio-terminal/gps-sensor/platformio.ini
+++ b/3-transport/lessons/2-store-location-data/code/wio-terminal/gps-sensor/platformio.ini
@@ -14,8 +14,8 @@ board = seeed_wio_terminal
framework = arduino
lib_deps =
bblanchon/ArduinoJson @ 6.17.3
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/3-transport/lessons/3-visualize-location-data/README.md b/3-transport/lessons/3-visualize-location-data/README.md
index dc632d58..27c27ddb 100644
--- a/3-transport/lessons/3-visualize-location-data/README.md
+++ b/3-transport/lessons/3-visualize-location-data/README.md
@@ -1,8 +1,10 @@
# Visualize location data
-Add a sketchnote if possible/appropriate
+This video gives an overview of Azure Maps with IoT, a service that will be covered in this lesson.
-
+[](https://www.youtube.com/watch?v=P5i2GFTtb2s)
+
+> 🎥 Click the image above to watch the video
## Pre-lecture quiz
@@ -10,24 +12,331 @@ Add a sketchnote if possible/appropriate
## Introduction
-In this lesson you will learn about
+In the last lesson you learned how to get GPS data from your sensors to save to the cloud in a storage container using serverless code. Now you will discover how to visualize those points on an Azure map. You will learn how to create a map on a web page, and about the GeoJSON data format and how to use it to plot all the captured GPS points on your map.
In this lesson we'll cover:
-* [Thing 1](#thing-1)
+* [What is data visualization](#what-is-data-visualization)
+* [Map services](#map-services)
+* [Create an Azure Maps resource](#create-an-azure-maps-resource)
+* [Show a map on a web page](#show-a-map-on-a-web-page)
+* [The GeoJSON format](#the-geojson-format)
+* [Plot GPS data on a Map using GeoJSON](#plot-gps-data-on-a-map-using-geojson)
+
+> 💁 This lesson will involve a small amount of HTML and JavaScript. If you would like to learn more about web development using HTML and JavaScript, check out [Web development for beginners](https://github.com/microsoft/Web-Dev-For-Beginners).
+
+## What is data visualization
+
+Data visualization, as the name suggests, is about visualizing data in ways that make it easier for humans to understand. It is usually associated with charts and graphs, but covers any way of pictorially representing data that helps humans not only understand it better, but also make decisions.
+
+Taking a simple example - back in the farm project you captured soil moisture readings. A table of soil moisture data captured every hour on the 1st of June 2021 might look something like the following:
+
+| Date | Reading |
+| ---------------- | ------: |
+| 01/06/2021 00:00 | 257 |
+| 01/06/2021 01:00 | 268 |
+| 01/06/2021 02:00 | 295 |
+| 01/06/2021 03:00 | 305 |
+| 01/06/2021 04:00 | 325 |
+| 01/06/2021 05:00 | 359 |
+| 01/06/2021 06:00 | 398 |
+| 01/06/2021 07:00 | 410 |
+| 01/06/2021 08:00 | 429 |
+| 01/06/2021 09:00 | 451 |
+| 01/06/2021 10:00 | 460 |
+| 01/06/2021 11:00 | 452 |
+| 01/06/2021 12:00 | 420 |
+| 01/06/2021 13:00 | 408 |
+| 01/06/2021 14:00 | 431 |
+| 01/06/2021 15:00 | 462 |
+| 01/06/2021 16:00 | 432 |
+| 01/06/2021 17:00 | 402 |
+| 01/06/2021 18:00 | 387 |
+| 01/06/2021 19:00 | 360 |
+| 01/06/2021 20:00 | 358 |
+| 01/06/2021 21:00 | 354 |
+| 01/06/2021 22:00 | 356 |
+| 01/06/2021 23:00 | 362 |
+
+As a human, understanding that data can be hard. It's a wall of numbers without any meaning. As a first step to visualizing this data, it can be plotted on a line chart:
+
+
+
+This can be further enhanced by adding a line to indicate when the automated watering system was turned on at a soil moisture reading of 450:
+
+
+
+This chart shows very quickly not only what the soil moisture levels were, but the points where the watering system was turned on.
+
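+A chart like this only takes a few lines of code to produce. As a minimal sketch (not part of the lesson code), the readings from the table above could be plotted with matplotlib:
+
+```python
+import matplotlib.pyplot as plt
+
+# Hourly soil moisture readings for the 1st of June 2021, taken from the table above
+readings = [257, 268, 295, 305, 325, 359, 398, 410, 429, 451, 460, 452,
+            420, 408, 431, 462, 432, 402, 387, 360, 358, 354, 356, 362]
+
+plt.plot(range(24), readings, label='Soil moisture')
+# The threshold at which the automated watering system turns on
+plt.axhline(y=450, color='red', linestyle='--', label='Watering system on')
+plt.xlabel('Hour of day')
+plt.ylabel('Soil moisture reading')
+plt.legend()
+plt.show()
+```
+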
+Charts are not the only tool to visualize data. IoT devices that track weather can have web apps or mobile apps that visualize weather conditions using symbols, such as a cloud symbol for cloudy days, a rain cloud for rainy days and so on. There are a huge number of ways to visualize data, many serious, some fun.
+
+✅ Think about ways you've seen data visualized. Which methods have been the clearest and have allowed you to make decisions fastest?
+
+The best visualizations allow humans to make decisions quickly. For example, having a wall of gauges showing all manner of readings from industrial machinery is hard to process, but a flashing red light when something goes wrong allows a human to make a decision. Sometimes the best visualization is a flashing light!
+
+When working with GPS data, the clearest visualization can be to plot the data on a map. A map showing delivery trucks, for example, can help workers at a processing plant see when trucks will arrive. If this map shows more than just pictures of trucks at their current locations, but gives an idea of the contents of a truck, then the workers at the plant can plan accordingly - if they see a refrigerated truck close by they know to prepare space in a fridge.
+
+## Map services
+
+Working with maps is an interesting exercise, and there are many to choose from, such as Bing Maps, Leaflet, Open Street Maps, and Google Maps. In this lesson, you will learn about [Azure Maps](https://azure.microsoft.com/services/azure-maps/?WT.mc_id=academic-17441-jabenn) and how it can display your GPS data.
+
+
+
+Azure Maps is "a collection of geospatial services and SDKs that use fresh mapping data to provide geographic context to web and mobile applications." Developers are provided with tools to create beautiful, interactive maps that can do things like provide recommended traffic routes, information about traffic incidents, indoor navigation, search capabilities, elevation information, weather services and more.
+
+✅ Experiment with some [mapping code samples](https://docs.microsoft.com/samples/browse/?products=azure-maps&WT.mc_id=academic-17441-jabenn)
+
+You can display the maps as a blank canvas, tiles, satellite images, satellite images with roads superimposed, various types of grayscale maps, maps with shaded relief to show elevation, night view maps, and a high contrast map. You can get real-time updates on your maps by integrating them with [Azure Event Grid](https://azure.microsoft.com/services/event-grid/?WT.mc_id=academic-17441-jabenn). You can control the behavior and look of your maps by enabling various controls to allow the map to react to events like pinch, drag, and click. To control the look of your map, you can add layers that include bubbles, lines, polygons, heat maps, and more. Which style of map you implement depends on your choice of SDK.
+
+You can access Azure Maps APIs by leveraging its [REST API](https://docs.microsoft.com/javascript/api/azure-maps-rest/?view=azure-maps-typescript-latest&WT.mc_id=academic-17441-jabenn), its [Web SDK](https://docs.microsoft.com/azure/azure-maps/how-to-use-map-control?WT.mc_id=academic-17441-jabenn), or, if you are building a mobile app, its [Android SDK](https://docs.microsoft.com/azure/azure-maps/how-to-use-android-map-control-library?pivots=programming-language-java-android&WT.mc_id=academic-17441-jabenn).
+
+In this lesson, you will use the web SDK to draw a map and display your sensor's GPS location's path.
+
+## Create an Azure Maps resource
+
+Your first step is to create an Azure Maps account.
+
+### Task - create an Azure Maps resource
+
+1. Run the following command from your Terminal or Command Prompt to create an Azure Maps resource in your `gps-sensor` resource group:
+
+ ```sh
+ az maps account create --name gps-sensor \
+ --resource-group gps-sensor \
+ --accept-tos \
+ --sku S1
+ ```
+
+    This will create an Azure Maps resource called `gps-sensor`. The tier being used is `S1`, which is a paid tier that includes a range of features, but with a generous number of free calls included.
+
+ > 💁 To see the cost of using Azure Maps, check out the [Azure Maps pricing page](https://azure.microsoft.com/pricing/details/azure-maps/?WT.mc_id=academic-17441-jabenn).
+
+1. You will need an API key for the maps resource. Use the following command to get this key:
+
+ ```sh
+ az maps account keys list --name gps-sensor \
+ --resource-group gps-sensor \
+ --output table
+ ```
+
+ Take a copy of the `PrimaryKey` value.
+
+## Show a map on a web page
+
+Now you can take the next step, which is to display your map on a web page. We will use just one `html` file for your small web app; keep in mind that in a production or team environment, your web app will most likely have more moving parts!
+
+### Task - show a map on a web page
+
+1. Create a file called `index.html` in a folder somewhere on your local computer. Add HTML markup to hold a map:
+
+    ```html
+    <html>
+    <head>
+        <style>
+            #myMap {
+                width: 100%;
+                height: 100%;
+            }
+        </style>
+    </head>
+    <!-- the init function added in the next steps will launch the map -->
+    <body onload="init()">
+        <div id="myMap"></div>
+    </body>
+    </html>
+    ```
+
+ The map will load in the `myMap` `div`. A few styles allow it to span the width and height of the page.
+
+    > 🎓 A `div` is a section of a web page that can be named and styled.
+
+1. Under the opening `<head>` tag, add an external style sheet to control the map display, and an external script from the Web SDK to manage its behavior:
+
+    ```html
+    <link rel="stylesheet" href="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.css" type="text/css">
+    <script src="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.js"></script>
+    ```
+
+ This style sheet contains the settings for how the map looks, and the script file contains code to load the map. Adding this code is similar to including C++ header files or importing Python modules.
-## Thing 1
+1. Under that script, add a script block to launch the map.
+
+    ```javascript
+    <script type="text/javascript">
+        function init() {
+            var map = new atlas.Map('myMap', {
+                center: [-122.26473, 47.73444],
+                zoom: 14,
+                authOptions: {
+                    authType: "subscriptionKey",
+                    subscriptionKey: "<subscription key>"
+                }
+            });
+        }
+    </script>
+    ```
+
+    Replace `<subscription key>` with the API key for your Azure Maps account.
+
+    If you open your `index.html` page in a web browser, you should see a map loaded and focused on the Seattle area.
+
+ 
+
+ ✅ Experiment with the zoom and center parameters to change your map display. You can add different coordinates corresponding to your data's latitude and longitude to re-center the map.
+
+> 💁 A better way to work with web apps locally is to install [http-server](https://www.npmjs.com/package/http-server). You will need [node.js](https://nodejs.org/) and [npm](https://www.npmjs.com/) installed before using this tool. Once those tools are installed, you can navigate to the location of your `index.html` file and type `http-server`. The web app will be served on a local webserver at [http://127.0.0.1:8080/](http://127.0.0.1:8080/).
+
+## The GeoJSON format
+
+Now that you have your web app in place with the map displaying, you need to extract GPS data from your storage account and display it in a layer of markers on top of the map. Before we do that, let's look at the [GeoJSON](https://wikipedia.org/wiki/GeoJSON) format that is required by Azure Maps.
+
+[GeoJSON](https://geojson.org/) is an open standard JSON specification with special formatting designed to handle geographic-specific data. You can learn about it by testing sample data using [geojson.io](https://geojson.io), which is also a useful tool to debug GeoJSON files.
+
+Sample GeoJSON data looks like this:
+
+```json
+{
+ "type": "FeatureCollection",
+ "features": [
+ {
+ "type": "Feature",
+ "geometry": {
+ "type": "Point",
+ "coordinates": [
+ -2.10237979888916,
+ 57.164918677004714
+ ]
+ }
+ }
+ ]
+}
+```
+
+Of particular interest is the way the data is nested as a `Feature` within a `FeatureCollection`. Within that object you will find the `geometry`, with `coordinates` indicating longitude and latitude.
+
+✅ When building your geoJSON, pay attention to the order of `latitude` and `longitude` in the object, or your points will not appear where they should! GeoJSON expects data in the order `lon,lat` for points, not `lat,lon`.
+
+`Geometry` can have different types, such as a single point or a polygon. In this example, it is a point with two coordinates specified: the longitude, then the latitude.
+
+✅ Azure Maps supports standard GeoJSON plus some [enhanced features](https://docs.microsoft.com/azure/azure-maps/extend-geojson?WT.mc_id=academic-17441-jabenn) including the ability to draw circles and other geometries.
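+
+To make the `lon,lat` ordering concrete, here is a minimal Python sketch (not part of the lesson code) that builds a GeoJSON `FeatureCollection` from the kind of GPS telemetry captured in the last lesson:
+
+```python
+import json
+
+# GPS readings in the shape the device sends them - lat first, lon second
+gps_readings = [
+    {'lat': 47.73481, 'lon': -122.25701},
+    {'lat': 47.73092, 'lon': -122.26206},
+]
+
+features = []
+for reading in gps_readings:
+    features.append({
+        'type': 'Feature',
+        'geometry': {
+            'type': 'Point',
+            # GeoJSON coordinates are longitude first, then latitude
+            'coordinates': [reading['lon'], reading['lat']]
+        }
+    })
+
+feature_collection = {'type': 'FeatureCollection', 'features': features}
+print(json.dumps(feature_collection, indent=2))
+```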
+
+## Plot GPS data on a Map using GeoJSON
+
+Now you are ready to consume data from the storage that you built in the previous lesson. As a reminder, it is stored as a number of files in blob storage so you will need to retrieve the files and parse them so that Azure Maps can use the data.
+
+### Task - configure storage to be accessed from a web page
+
+If you make a call to your storage to fetch the data you might be surprised to see errors occurring in your browser's console. That's because you need to set permissions for [CORS](https://developer.mozilla.org/docs/Web/HTTP/CORS) on this storage to allow external web apps to read its data.
+
+> 🎓 CORS stands for "Cross-Origin Resource Sharing" and usually needs to be set explicitly in Azure for security reasons. It stops sites you don't expect from being able to access your data.
+
+1. Run the following command to enable CORS:
+
+ ```sh
+ az storage cors add --methods GET \
+ --origins "*" \
+ --services b \
+        --account-name <storage account name> \
+        --account-key <account key>
+ ```
+
+    Replace `<storage account name>` with the name of your storage account. Replace `<account key>` with the account key for your storage account.
+
+    This command allows any website (the wildcard `*` means any) to make a *GET* request, that is a request to read data, from your storage account. The `--services b` means only apply this setting for blobs.
+
+### Task - load the GPS data from storage
+
+1. Replace the entire contents of the `init` function with the following code:
+
+ ```javascript
+    fetch("https://<storage account name>.blob.core.windows.net/gps-data/?restype=container&comp=list")
+ .then(response => response.text())
+ .then(str => new window.DOMParser().parseFromString(str, "text/xml"))
+ .then(xml => {
+ let blobList = Array.from(xml.querySelectorAll("Url"));
+ blobList.forEach(async blobUrl => {
+ loadJSON(blobUrl.innerHTML)
+ });
+ })
+ .then( response => {
+ map = new atlas.Map('myMap', {
+ center: [-122.26473, 47.73444],
+ zoom: 14,
+ authOptions: {
+ authType: "subscriptionKey",
+                subscriptionKey: "<subscription key>"
+ }
+ });
+ map.events.add('ready', function () {
+ var source = new atlas.source.DataSource();
+ map.sources.add(source);
+ map.layers.add(new atlas.layer.BubbleLayer(source));
+ source.add(features);
+ })
+ })
+ ```
+
+    Replace `<storage account name>` with the name of your storage account. Replace `<subscription key>` with the API key for your Azure Maps account.
+
+    There are several things happening here. First, the code fetches your GPS data from your blob container using a URL endpoint built using your storage account name. This URL queries the `gps-data` container, indicating the resource type is a container (`restype=container`), and lists information about all the blobs. This list won't return the blobs themselves, but will return a URL for each blob that can be used to load the blob data.
+
+ > 💁 You can put this URL into your browser to see details of all the blobs in your container. Each item will have a `Url` property that you can also load in your browser to see the contents of the blob.
+
+ This code then loads each blob, calling a `loadJSON` function, which will be created next. It then creates the map control, and adds code to the `ready` event. This event is called when the map is displayed on the web page.
+
+ The ready event creates an Azure Maps data source - a container that contains GeoJSON data that will be populated later. This data source is then used to create a bubble layer - that is a set of circles on the map centered over each point in the GeoJSON.
+
+1. Add the `loadJSON` function to your script block, below the `init` function:
+
+ ```javascript
+ var map, features;
+
+    function loadJSON(file) {
+        var xhr = new XMLHttpRequest();
+        features = [];
+        xhr.onreadystatechange = function () {
+            if (xhr.readyState === XMLHttpRequest.DONE) {
+                if (xhr.status === 200) {
+                    // Parse the blob contents and convert the GPS reading into a GeoJSON point feature
+                    const gps = JSON.parse(xhr.responseText)
+                    features.push(
+                        new atlas.data.Feature(new atlas.data.Point([parseFloat(gps.gps.lon), parseFloat(gps.gps.lat)]))
+                    )
+                }
+            }
+        };
+        // Request the blob contents asynchronously
+        xhr.open("GET", file, true);
+        xhr.send();
+    }
+ ```
+
+    This function is called by the fetch routine for each blob. It parses the JSON data and converts the longitude and latitude coordinates into GeoJSON.
+    Once parsed, each reading is added to the `features` array as a GeoJSON `Feature`. The map will then be initialized, and little bubbles will appear along the path that your data is plotting.
+
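+    As a rough illustration (the property names come from the code above, the values are made up), each blob is expected to contain JSON shaped something like the following, which is why the code reads `gps.gps.lat` and `gps.gps.lon`:
+
+    ```javascript
+    // A hypothetical example of the contents of one stored GPS reading blob
+    const exampleReading = {
+        gps: {
+            lat: "47.73481",
+            lon: "-122.25771"
+        }
+    };
+    ```
+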
+1. Load the HTML page in your browser. It will load the map, then load all the GPS data from storage and plot it on the map.
+
+ 
+
+> 💁 You can find this code in the [code](./code) folder.
---
## 🚀 Challenge
+It's nice to be able to display static data on a map as markers. Can you enhance this web app to add animation and show the path of the markers over time, using the timestamped JSON files? Here are [some samples](https://azuremapscodesamples.azurewebsites.net/) of using animation within maps.
+
## Post-lecture quiz
[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/26)
## Review & Self Study
+Azure Maps is particularly useful for working with IoT devices.
+
+* Research some of the uses in the [Azure Maps documentation on Microsoft docs](https://docs.microsoft.com/azure/azure-maps/tutorial-iot-hub-maps?WT.mc_id=academic-17441-jabenn).
+* Deepen your knowledge of map making and waypoints with the [create your first route finding app with Azure Maps self-guided learning module on Microsoft Learn](https://docs.microsoft.com/learn/modules/create-your-first-app-with-azure-maps/?WT.mc_id=academic-17441-jabenn).
+
## Assignment
-[](assignment.md)
+[Deploy your app](assignment.md)
diff --git a/3-transport/lessons/3-visualize-location-data/assignment.md b/3-transport/lessons/3-visualize-location-data/assignment.md
index da157d5c..f1b770b8 100644
--- a/3-transport/lessons/3-visualize-location-data/assignment.md
+++ b/3-transport/lessons/3-visualize-location-data/assignment.md
@@ -1,9 +1,12 @@
-#
+# Deploy your app
## Instructions
+There are several ways that you can deploy your app so that you can share it with the world, including using GitHub pages or using one of many service providers. An excellent way to do this is to use Azure Static Web Apps. In this assignment, build your web app and deploy it to the cloud by following [these instructions](https://github.com/Azure/static-web-apps-cli) or watching [these videos](https://www.youtube.com/watch?v=ADVGIXciYn8&list=PLlrxD0HtieHgMPeBaDQFx9yNuFxx6S1VG&index=3).
+A benefit of using Azure Static Web Apps is that you can hide any API keys in the portal, so take this opportunity to refactor your subscriptionKey as a variable and store it in the cloud.
+
## Rubric
-| Criteria | Exemplary | Adequate | Needs Improvement |
-| -------- | --------- | -------- | ----------------- |
-| | | | |
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------- |
+| | A working web app is presented in a documented GitHub repository with its subscriptionKey stored in the cloud and called via a variable | A working web app is presented in a documented GitHub repository but its subscriptionKey is not stored in the cloud | The web app contains bugs or does not work properly |
diff --git a/3-transport/lessons/3-visualize-location-data/code/index.html b/3-transport/lessons/3-visualize-location-data/code/index.html
new file mode 100644
index 00000000..e653c742
--- /dev/null
+++ b/3-transport/lessons/3-visualize-location-data/code/index.html
@@ -0,0 +1,68 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/3-transport/lessons/4-geofences/README.md b/3-transport/lessons/4-geofences/README.md
index 697cbbd9..77b67eec 100644
--- a/3-transport/lessons/4-geofences/README.md
+++ b/3-transport/lessons/4-geofences/README.md
@@ -1,11 +1,11 @@
# Geofences
-Add a sketchnote if possible/appropriate
-
This video gives an overview of geofences and how to use them in Azure Maps, topics that will be covered in this lesson:
[](https://www.youtube.com/watch?v=nsrgYhaYNVY)
+> 🎥 Click the image above to watch a video
+
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/27)
@@ -39,7 +39,7 @@ There are many reasons why you would want to know that a vehicle is inside or ou
* Preparation for unloading - getting a notification that a vehicle has arrived on-site allows a crew to be prepared to unload the vehicle, reducing vehicle waiting time. This can allow a driver to make more deliveries in a day with less waiting time.
* Tax compliance - some countries, such as New Zealand, charge road taxes for diesel vehicles based on the vehicle weight when driving on public roads only. Using geofences allows you to track the mileage driven on public roads as opposed to private roads on sites such as farms or logging areas.
-* Monitoring theft - if a vehicle should only remain in a certain area such as on a farm, and it leaves the geofence, it might be being stolen.
+* Monitoring theft - if a vehicle should only remain in a certain area such as on a farm, and it leaves the geofence, it might have been stolen.
* Location compliance - some parts of a work site, farm or factory may be off-limits to certain vehicles, such as keeping vehicles that carry artificial fertilizers and pesticides away from fields growing organic produce. If a geofence is entered, then a vehicle is outside of compliance and the driver can be notified.
✅ Can you think of other uses for geofences?
@@ -210,7 +210,7 @@ For example, imagine GPS readings showing a vehicle was driving along a road tha

-In the above image, there is a geofence over part of the Microsoft campus. The red line shows a truck driving along the 520, with circles to show the GPS readings. Most of these are accurate and along the 520, with one inaccurate reading inside the geofence. The is no way that reading can be correct - there are no roads for the truck to suddenly divert from the 520 onto campus, then back onto the 520. The code that checks this geofence will need to take the previous readings into consideration before acting on the results of the geofence test.
+In the above image, there is a geofence over part of the Microsoft campus. The red line shows a truck driving along the 520, with circles to show the GPS readings. Most of these are accurate and along the 520, with one inaccurate reading inside the geofence. There is no way that reading can be correct - there are no roads for the truck to suddenly divert from the 520 onto campus, then back onto the 520. The code that checks this geofence will need to take the previous readings into consideration before acting on the results of the geofence test.
✅ What additional data would you need to check to see if a GPS reading could be considered correct?
@@ -235,7 +235,7 @@ In the above image, there is a geofence over part of the Microsoft campus. The r
1. Use curl to make a GET request to this URL:
```sh
- curl --request GET
+    curl --request GET '<URL>'
```
> 💁 If you get a response code of `BadRequest`, with an error of:
@@ -253,7 +253,7 @@ In the above image, there is a geofence over part of the Microsoft campus. The r
"geometries": [
{
"deviceId": "gps-sensor",
- "udId": "1ffb2047-6757-8c29-2c3d-da44cec55ff9",
+ "udId": "7c3776eb-da87-4c52-ae83-caadf980323a",
"geometryId": "1",
"distance": 999.0,
"nearestLat": 47.645875,
@@ -322,6 +322,8 @@ When you create an IoT Hub, you get the `$Default` consumer group created by def
geofence gps-sensor
```
+> 💁 When you ran the IoT Hub event monitor in an earlier lesson, it connected to the `$Default` consumer group. This is why you can't run the event monitor and an event trigger at the same time. If you want to run both, then you can use other consumer groups for all your function apps, and keep `$Default` for the event monitor.
+
### Task - create a new IoT Hub trigger
1. Add a new IoT Hub event trigger to your `gps-trigger` function app that you created in an earlier lesson. Call this function `geofence-trigger`.
diff --git a/3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot_hub_trigger/__init__.py b/3-transport/lessons/4-geofences/code/functions/gps-trigger/iot-hub-trigger/__init__.py
similarity index 100%
rename from 3-transport/lessons/2-store-location-data/code/functions/gps-trigger/iot_hub_trigger/__init__.py
rename to 3-transport/lessons/4-geofences/code/functions/gps-trigger/iot-hub-trigger/__init__.py
diff --git a/3-transport/lessons/4-geofences/code/functions/gps-trigger/iot_hub_trigger/function.json b/3-transport/lessons/4-geofences/code/functions/gps-trigger/iot-hub-trigger/function.json
similarity index 100%
rename from 3-transport/lessons/4-geofences/code/functions/gps-trigger/iot_hub_trigger/function.json
rename to 3-transport/lessons/4-geofences/code/functions/gps-trigger/iot-hub-trigger/function.json
diff --git a/4-manufacturing/README.md b/4-manufacturing/README.md
index 6158dc51..b68b0319 100644
--- a/4-manufacturing/README.md
+++ b/4-manufacturing/README.md
@@ -1,6 +1,6 @@
# Manufacturing and processing - using IoT to improve the processing of food
-Once food reaches a central hub or processing plant, it isn't always just shipped out to supermarkets. A lot of the time the food goes through a number of processing steps, such as sorting by quality. This is a process that used to be manual - it would start in the field when pickers would only pick ripe fruit, then at the factory the fruit would be ride a conveyer belt and employees would manually remove any bruised or rotten fruit. Having picked and sorted strawberries myself as a summer job during school, I can testify that this isn't a fun job.
+Once food reaches a central hub or processing plant, it isn't always just shipped out to supermarkets. A lot of the time the food goes through a number of processing steps, such as sorting by quality. This is a process that used to be manual - it would start in the field when pickers would only pick ripe fruit, then at the factory the fruit would ride a conveyer belt and employees would manually remove any bruised or rotten fruit. Having picked and sorted strawberries myself as a summer job during school, I can testify that this isn't a fun job.
More modern setups rely on IoT for sorting. Some of the earliest devices like the sorters from [Weco](https://wecotek.com) use optical sensors to detect the quality of produce, rejecting green tomatoes for example. These can be deployed in harvesters on the farm itself, or in processing plants.
@@ -10,7 +10,7 @@ As advances happen in Artificial Intelligence (AI) and Machine Learning (ML), th
In these 4 lessons you'll learn how to train image-based AI models to detect fruit quality, how to use these from an IoT device, and how to run these on the edge - that is on an IoT device rather than in the cloud.
-> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [Clean up your project](../clean-up.md).
+> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
## Topics
diff --git a/4-manufacturing/lessons/1-train-fruit-detector/README.md b/4-manufacturing/lessons/1-train-fruit-detector/README.md
index 181af1e4..938ff3a4 100644
--- a/4-manufacturing/lessons/1-train-fruit-detector/README.md
+++ b/4-manufacturing/lessons/1-train-fruit-detector/README.md
@@ -1,11 +1,11 @@
# Train a fruit quality detector
-Add a sketchnote if possible/appropriate
-
This video gives an overview of the Azure Custom Vision service, a service that will be covered in this lesson.
[](https://www.youtube.com/watch?v=TETcDLJlWR4)
+> 🎥 Click the image above to watch the video
+
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/29)
@@ -44,6 +44,8 @@ The video below shows one of these machines in action.
[](https://www.youtube.com/watch?v=AcRL91DouAU)
+> 🎥 Click the image above to watch a video
+
In this video, as tomatoes fall from one conveyer belt to another, green tomatoes are detected and flicked into a bin using levers.
✅ What conditions would you need in a factory or in a field for these optical sensors to work correctly?
@@ -56,11 +58,11 @@ Traditional programming is where you take data, apply an algorithm to the data,

-Machine learning turns this around - you start with data and known outputs, and the machine learning tools work out the algorithm. You can then take that algorithm, called a *machine learning model*, and input new data and get new output.
+Machine learning turns this around - you start with data and known outputs, and the machine learning algorithm learns from the data. You can then take that trained algorithm, called a *machine learning model* or *model*, and input new data and get new output.
-> 🎓 The process of a machine learning tool generating a model is called *training*. The inputs and known outputs are called *training data*.
+> 🎓 The process of a machine learning algorithm learning from the data is called *training*. The inputs and known outputs are called *training data*.
-For example, you could give a model millions of pictures of unripe bananas as input training data, with the training output set as `unripe`, and millions of ripe banana pictures as training data with the output set as `ripe`. The ML tools will then generate a model. You then give this model a new picture of a banana and it will predict if the new picture is a ripe or an unripe banana.
+For example, you could give a model millions of pictures of unripe bananas as input training data, with the training output set as `unripe`, and millions of ripe banana pictures as training data with the output set as `ripe`. The ML algorithm will then create a model based off this data. You then give this model a new picture of a banana and it will predict if the new picture is a ripe or an unripe banana.
> 🎓 The results of ML models are called *predictions*
@@ -70,6 +72,8 @@ ML models don't give a binary answer, instead they give probabilities. For examp
The ML model used to detect images like this is called an *image classifier* - it is given labelled images, and then classifies new images based off these labels.
+> 💁 This is an over-simplification, and there are many other ways to train models that don't always need labelled outputs, such as unsupervised learning. If you want to learn more about ML, check out [ML for beginners, a 24 lesson curriculum on Machine Learning](https://aka.ms/ML-beginners).
+
## Train an image classifier
To successfully train an image classifier you need millions of images. As it turns out, once you have an image classifier trained on millions or billions of assorted images, you can re-use it and re-train it using a small set of images and get great results, using a process called *transfer learning*.
@@ -118,6 +122,8 @@ To use Custom Vision, you first need to create two cognitive services resources
Replace `` with the location you used when creating the Resource Group.
This will create a Custom Vision training resource in your Resource Group. It will be called `fruit-quality-detector-training` and use the `F0` sku, which is the free tier. The `--yes` option means you agree to the terms and conditions of the cognitive services.
+
+> 💁 Use the `S0` sku if you already have a free account using any of the Cognitive Services.
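+>
+> As a rough sketch (assuming the resource and group names used earlier in this lesson, and the standard `az cognitiveservices account create` parameters), a command that creates the training resource with the `S0` sku could look like this - the only change is the value passed to `--sku`:
+>
+> ```sh
+> az cognitiveservices account create --name fruit-quality-detector-training \
+>                                     --resource-group fruit-quality-detector \
+>                                     --kind CustomVision.Training \
+>                                     --sku S0 \
+>                                     --yes \
+>                                     --location <location>
+> ```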
1. Use the following command to create a free Custom Vision prediction resource:
@@ -146,6 +152,8 @@ To use Custom Vision, you first need to create two cognitive services resources

+✅ Take some time to explore the Custom Vision UI for your image classifier.
+
### Task - train your image classifier project
To train an image classifier, you will need multiple pictures of fruit, both good and bad quality to tag as good and bad, such as a ripe and an overripe banana.
@@ -166,7 +174,7 @@ Image classifiers run at very low resolution. For example Custom Vision can take
* Repeat the same process using 2 unripe bananas
- You should have at least 10 training images, with at least 5 ripe and 5 unripe, and 4 testing images, 2 ripe, 2 unripe. You're images should be png or jpegs, small than 6MB. If you create them with an iPhone for example they may be high-resolution HEIC images, so will need to be converted and possibly shrunk. The more images the better, and you should have a similar number of ripe and unripe.
+    You should have at least 10 training images, with at least 5 ripe and 5 unripe, and 4 testing images, 2 ripe, 2 unripe. Your images should be PNGs or JPEGs, smaller than 6MB. If you create them with an iPhone for example they may be high-resolution HEIC images, so they will need to be converted and possibly shrunk. The more images the better, and you should have a similar number of ripe and unripe.
If you don't have both ripe and unripe fruit, you can use different fruits, or any two objects you have available. You can also find some example images in the [images](./images) folder of ripe and unripe bananas that you can use.
@@ -196,7 +204,7 @@ Once your classifier is trained, you can test it by giving it a new image to cla
## Retrain your image classifier
-When you test you classifier, it may not give the results you expect. Image classifiers use machine learning to make predictions about what is in an image, based of probabilities that particular features of an image mean that it matches a particular label. It doesn't understand what is in the image - it doesn't know what a banana is or understand what makes a banana a banana instead of a boat. You can improve your classifier by retraining it with images it gets wrong.
+When you test your classifier, it may not give the results you expect. Image classifiers use machine learning to make predictions about what is in an image, based on probabilities that particular features of an image mean that it matches a particular label. It doesn't understand what is in the image - it doesn't know what a banana is or understand what makes a banana a banana instead of a boat. You can improve your classifier by retraining it with images it gets wrong.
Every time you make a prediction using the quick test option, the image and results are stored. You can use these images to retrain your model.
diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/README.md b/4-manufacturing/lessons/2-check-fruit-from-device/README.md
index 1de30bf0..fe522c07 100644
--- a/4-manufacturing/lessons/2-check-fruit-from-device/README.md
+++ b/4-manufacturing/lessons/2-check-fruit-from-device/README.md
@@ -1,7 +1,5 @@
# Check fruit quality from an IoT device
-Add a sketchnote if possible/appropriate
-

## Pre-lecture quiz
@@ -66,7 +64,7 @@ When you are happy with an iteration, you can publish it to make it available to
Iterations are published from the Custom Vision portal.
-1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already.
+1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `fruit-quality-detector` project.
1. Select the **Performance** tab from the options at the top
diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-camera/wio-terminal/fruit-quality-detector/platformio.ini b/4-manufacturing/lessons/2-check-fruit-from-device/code-camera/wio-terminal/fruit-quality-detector/platformio.ini
index 57efb3ca..d2d6f51d 100644
--- a/4-manufacturing/lessons/2-check-fruit-from-device/code-camera/wio-terminal/fruit-quality-detector/platformio.ini
+++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-camera/wio-terminal/fruit-quality-detector/platformio.ini
@@ -13,8 +13,8 @@ platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
- seeed-studio/Seeed Arduino rpcWiFi @ 1.0.3
- seeed-studio/Seeed Arduino FS @ 2.0.2
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
seeed-studio/Seeed Arduino SFUD @ 2.0.1
seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
diff --git a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini
index 1e0cd574..5f3eb8a7 100644
--- a/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini
+++ b/4-manufacturing/lessons/2-check-fruit-from-device/code-classify/wio-terminal/fruit-quality-detector/platformio.ini
@@ -13,12 +13,12 @@ platform = atmelsam
board = seeed_wio_terminal
framework = arduino
lib_deps =
- seeed-studio/Seeed Arduino rpcWiFi
- seeed-studio/Seeed Arduino FS
- seeed-studio/Seeed Arduino SFUD
- seeed-studio/Seeed Arduino rpcUnified
- seeed-studio/Seeed_Arduino_mbedtls
- seeed-studio/Seeed Arduino RTC
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
+ seeed-studio/Seeed Arduino SFUD @ 2.0.1
+ seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
+ seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
+ seeed-studio/Seeed Arduino RTC @ 2.0.0
bblanchon/ArduinoJson @ 6.17.3
build_flags =
-w
diff --git a/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md b/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md
index 120c1d18..de3d4c73 100644
--- a/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md
+++ b/4-manufacturing/lessons/3-run-fruit-detector-edge/README.md
@@ -1,11 +1,13 @@
# Run your fruit detector on the edge
-Add a sketchnote if possible/appropriate
+
This video gives an overview of running image classifiers on IoT devices, the topic that is covered in this lesson.
[](https://www.youtube.com/watch?v=_K5fqGLO8us)
+> 🎥 Click the image above to watch a video
+
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/33)
@@ -16,9 +18,73 @@ In this lesson you will learn about
In this lesson we'll cover:
-* [Thing 1](#thing-1)
+* [Edge computing](#edge-computing)
+* [Azure IoT Edge](#azure-iot-edge)
+* [Register an IoT Edge device](#register-an-iot-edge-device)
+* [Set up an IoT Edge device](#set-up-an-iot-edge-device)
+* [Run your classifier on the edge](#run-your-classifier-on-the-edge)
+
+## Edge computing
+
+## Azure IoT Edge
+
+
+
+IoT Edge runs code from containers.
+
+## Register an IoT Edge device
+
+To use an IoT Edge device, it needs to be registered in IoT Hub.
+
+### Task - register an IoT Edge device
+
+1. Create an IoT Hub in the `fruit-quality-detector` resource group. Give it a unique name based around `fruit-quality-detector`.
+
+1. Register an IoT Edge device called `fruit-quality-detector-edge` in your IoT Hub. The command to do this is similar to the one used to register a non-edge device, except you pass the `--edge-enabled` flag.
+
+ ```sh
+ az iot hub device-identity create --edge-enabled \
+ --device-id fruit-quality-detector-edge \
+                                      --hub-name <hub_name>
+ ```
+
+    Replace `<hub_name>` with the name of your IoT Hub.
+
+1. Get the connection string for your device using the following command:
+
+ ```sh
+ az iot hub device-identity connection-string show --device-id fruit-quality-detector-edge \
+ --output table \
+                                                   --hub-name <hub_name>
+ ```
+
+    Replace `<hub_name>` with the name of your IoT Hub.
+
+ Take a copy of the connection string that is shown in the output.
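+
+    If you want to double-check that the device was registered (an optional step, not in the original instructions), you can list the device identities in your hub:
+
+    ```sh
+    az iot hub device-identity list --output table \
+                                    --hub-name <hub_name>
+    ```
+
+    Replace `<hub_name>` with the name of your IoT Hub. You should see the `fruit-quality-detector-edge` device in the output.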
+
+## Set up an IoT Edge device
+
+### Task - set up an IoT Edge device
+
+The IoT Edge runtime only runs Linux containers. It can be run on Linux, or on Windows using Linux Virtual Machines.
+
+* If you are using a Raspberry Pi as your IoT device, then this runs a supported version of Linux and can host the IoT Edge runtime. Follow the [Install Azure IoT Edge for Linux guide on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge?WT.mc_id=academic-17441-jabenn) to install IoT Edge and set the connection string.
+
+ > 💁 Remember, Raspberry Pi OS is a variant of Debian Linux.
+
+* If you are not using a Raspberry Pi, but have a Linux computer, you can run the IoT Edge runtime. Follow the [Install Azure IoT Edge for Linux guide on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/how-to-install-iot-edge?WT.mc_id=academic-17441-jabenn) to install IoT Edge and set the connection string.
+
+* If you are using Windows, you can install the IoT Edge runtime in a Linux Virtual Machine by following the [Install and start the IoT Edge runtime section of the Deploy your first IoT Edge module to a Windows device quickstart on Microsoft docs](https://docs.microsoft.com/azure/iot-edge/quickstart?WT.mc_id=academic-17441-jabenn#install-and-start-the-iot-edge-runtime). You can stop when you reach the *Deploy a module* section.
+
+* If you are using macOS, you can create a virtual machine (VM) in the cloud to use for your IoT Edge device. These are computers you can create in the cloud and access over the internet. You can create a Linux VM that has IoT Edge installed. Follow the [Create a virtual machine running IoT Edge guide](vm-iotedge.md) for instructions on how to do this.
+
+## Create a classifier that can run on the edge
+
+## Run your classifier on the edge
+
+### Task - deploy your classifier using IoT Edge
-## Thing 1
+### Task - use the edge classifier from your IoT device
---
diff --git a/4-manufacturing/lessons/3-run-fruit-detector-edge/vm-iotedge.md b/4-manufacturing/lessons/3-run-fruit-detector-edge/vm-iotedge.md
new file mode 100644
index 00000000..54c8736f
--- /dev/null
+++ b/4-manufacturing/lessons/3-run-fruit-detector-edge/vm-iotedge.md
@@ -0,0 +1,66 @@
+# Create a virtual machine running IoT Edge
+
+In Azure, you can create a virtual machine - a computer in the cloud that you can configure any way you wish and run your own software on.
+
+> 💁 You can read more about virtual machines on the [Virtual Machine page on Wikipedia](https://wikipedia.org/wiki/Virtual_machine).
+
+## Task - Set up an IoT Edge virtual machine
+
+1. Run the following command to create a VM that has Azure IoT Edge pre-installed:
+
+ ```sh
+ az deployment group create \
+ --resource-group fruit-quality-detector \
+ --template-uri https://raw.githubusercontent.com/Azure/iotedge-vm-deploy/1.2.0/edgeDeploy.json \
+        --parameters dnsLabelPrefix=<vm_name> \
+        --parameters adminUsername=<username> \
+        --parameters deviceConnectionString="<connection_string>" \
+        --parameters authenticationType=password \
+        --parameters adminPasswordOrKey="<password>"
+ ```
+
+    Replace `<vm_name>` with a name for this virtual machine. This needs to be globally unique, so use something like `fruit-quality-detector-vm-` with your name or another value on the end.
+
+    Replace `<username>` and `<password>` with a username and password to use to log in to the VM. These need to be relatively secure, so you can't use admin/password.
+
+    Replace `<connection_string>` with the connection string of your `fruit-quality-detector-edge` IoT Edge device.
+
+    This will create a VM configured as a `DS1 v2` virtual machine. These categories indicate how powerful the machine is, and therefore how much it costs. This VM has 1 CPU and 3.5GB of RAM.
+
+ > 💰 You can see the current pricing of these VMs on the [Azure Virtual Machine pricing guide](https://azure.microsoft.com/pricing/details/virtual-machines/linux/?WT.mc_id=academic-17441-jabenn)
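+
+    If you are curious about what other VM sizes are available in your region (purely an optional aside), you can list them with the following command, replacing `<location>` with the Azure region you are using:
+
+    ```sh
+    az vm list-sizes --location <location> \
+                     --output table
+    ```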
+
+    Once the VM has been created, the IoT Edge runtime will be installed automatically, and configured to connect to your IoT Hub as your `fruit-quality-detector-edge` device.
+
+1. VMs cost money. At the time of writing, a DS1 VM costs about $0.06 per hour. To keep costs down, you should shut down the VM when you are not using it, and delete it when you are finished with this project.
+
+ To shut down the VM, use the following command:
+
+ ```sh
+ az vm deallocate --resource-group fruit-quality-detector \
+                     --name <vm_name>
+ ```
+
+    Replace `<vm_name>` with the name of your virtual machine.
+
+ > 💁 There is an `az vm stop` command which will stop the VM, but it keeps the computer allocated to you, so you still pay as if it was still running.
+
+ To restart the VM, use the following command:
+
+ ```sh
+ az vm start --resource-group fruit-quality-detector \
+                --name <vm_name>
+ ```
+
+    Replace `<vm_name>` with the name of your virtual machine.
+
+ You can also configure your VM to automatically shut down at a certain time each day. This means if you forget to shut it down, you won't be billed for more than the time till the automatic shutdown. Use the following command to set this:
+
+ ```sh
+ az vm auto-shutdown --resource-group fruit-quality-detector \
+                        --name <vm_name> \
+                        --time <shutdown_time_utc>
+ ```
+
+    Replace `<vm_name>` with the name of your virtual machine.
+
+    Replace `<shutdown_time_utc>` with the UTC time that you want the VM to shut down, using 4 digits as HHMM. For example, if you want to shut down at midnight UTC, you would set this to `0000`. For 7:30PM on the west coast of the USA, you would use `0230` (7:30PM on the US west coast is 2:30AM UTC).
diff --git a/4-manufacturing/lessons/4-trigger-fruit-detector/README.md b/4-manufacturing/lessons/4-trigger-fruit-detector/README.md
index c3f92593..b358d2b2 100644
--- a/4-manufacturing/lessons/4-trigger-fruit-detector/README.md
+++ b/4-manufacturing/lessons/4-trigger-fruit-detector/README.md
@@ -1,9 +1,5 @@
# Trigger fruit quality detection from a sensor
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/35)
diff --git a/5-retail/README.md b/5-retail/README.md
index 0471ed34..c5a2e4c4 100644
--- a/5-retail/README.md
+++ b/5-retail/README.md
@@ -1,7 +1,20 @@
# Retail - using IoT to manage stock levels
+The last stage for food before it reaches consumers is retail - the markets, greengrocers, supermarkets and stores that sell produce to consumers. These stores want to ensure they have produce out on shelves for consumers to see and buy.
+
+One of the most manual, time-consuming tasks in food stores, especially in large supermarkets, is making sure the shelves are stocked, checking individual shelves to ensure any gaps are filled with produce from store rooms.
+
+IoT can help with this, using AI models running on IoT devices to count stock - machine learning models that don't just classify images, but can detect individual objects and count them.
+
+In these 2 lessons you'll learn how to train image-based AI models to count stock, and run these models on IoT devices.
+
+> 💁 These lessons will use some cloud resources. If you don't complete all the lessons in this project, make sure you [clean up your project](../clean-up.md).
+
## Topics
+1. [Train a stock detector](./lessons/1-train-stock-detector/README.md)
+1. [Check stock from an IoT device](./lessons/2-check-stock-device/README.md)
## Credits
+All the lessons were written with ♥️ by [Jim Bennett](https://GitHub.com/JimBobBennett)
diff --git a/5-retail/lessons/1-train-stock-detector/README.md b/5-retail/lessons/1-train-stock-detector/README.md
new file mode 100644
index 00000000..7516434f
--- /dev/null
+++ b/5-retail/lessons/1-train-stock-detector/README.md
@@ -0,0 +1,189 @@
+# Train a stock detector
+
+This video gives an overview of object detection with the Azure Custom Vision service, a service that will be covered in this lesson.
+
+[](https://www.youtube.com/watch?v=wtTYSyBUpFc)
+
+> 🎥 Click the image above to watch the video
+
+## Pre-lecture quiz
+
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/37)
+
+## Introduction
+
+In the previous project, you used AI to train an image classifier - a model that can tell if an image contains something, such as ripe fruit or unripe fruit. Another type of AI model that can be used with images is object detection. These models don't classify an image by tags; instead they are trained to recognize objects, and can find them in images, not only detecting that an object is present, but also detecting where in the image it is. This allows you to count objects in images.
+
+In this lesson you will learn about object detection, including how it can be used in retail. You will also learn how to train an object detector in the cloud.
+
+In this lesson we'll cover:
+
+* [Object detection](#object-detection)
+* [Use object detection in retail](#use-object-detection-in-retail)
+* [Train an object detector](#train-an-object-detector)
+* [Test your object detector](#test-your-object-detector)
+* [Retrain your object detector](#retrain-your-object-detector)
+
+## Object detection
+
+Object detection involves detecting objects in images using AI. Unlike the image classifier you trained in the last project, object detection is not about predicting the best tag for an image as a whole, but about finding one or more objects in an image.
+
+### Object detection vs image classification
+
+Image classification is about classifying an image as a whole - what are the probabilities that the whole image matches each tag. You get back probabilities for every tag used to train the model.
+
+
+
+In the example above, two images are classified using a model trained to classify tubs of cashew nuts or cans of tomato paste. The first image is a tub of cashew nuts, and has two results from the image classifier:
+
+| Tag | Probability |
+| -------------- | ----------: |
+| `cashew nuts` | 98.4% |
+| `tomato paste` | 1.6% |
+
+The second image is of a can of tomato paste, and the results are:
+
+| Tag | Probability |
+| -------------- | ----------: |
+| `cashew nuts` | 0.7% |
+| `tomato paste` | 99.3% |
+
+You could use these values with a threshold percentage to predict what was in the image. But what if an image contained multiple cans of tomato paste, or both cashew nuts and tomato paste? The results would probably not give you what you want. This is where object detection comes in.
+
+Object detection involves training a model to recognize objects. Instead of giving it images containing the object and telling it each image is one tag or another, you highlight the section of an image that contains the specific object, and tag that. You can tag a single object in an image or multiple. This way the model learns what the object itself looks like, not just what images that contain the object look like.
+
+When you then use it to predict images, instead of getting back a list of tags and percentages, you get back a list of detected objects, with their bounding box and the probability that the object matches the assigned tag.
+
+> 🎓 *Bounding boxes* are the boxes around an object.
+
+
+
+The image above contains both a tub of cashew nuts and three cans of tomato paste. The object detector detected the cashew nuts, returning the bounding box that contains the cashew nuts with the percentage chance that that bounding box contains the object, in this case 97.6%. The object detector has also detected three cans of tomato paste, and provides three separate bounding boxes, one for each detected can, and each one has a percentage probability that the bounding box contains a can of tomato paste.
+
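+As a rough sketch of how you might work with results like these in code (the property names and values below are illustrative, not the exact format returned by any particular API), each detection can be treated as a tag, a probability and a bounding box, and you can count objects by filtering on a probability threshold:
+
+```javascript
+// Hypothetical detection results for the image described above
+const detections = [
+    { tag: 'cashew nuts',  probability: 0.976, box: { left: 0.04, top: 0.61, width: 0.39, height: 0.35 } },
+    { tag: 'tomato paste', probability: 0.961, box: { left: 0.19, top: 0.06, width: 0.22, height: 0.27 } },
+    { tag: 'tomato paste', probability: 0.918, box: { left: 0.44, top: 0.05, width: 0.21, height: 0.28 } },
+    { tag: 'tomato paste', probability: 0.893, box: { left: 0.68, top: 0.04, width: 0.23, height: 0.29 } }
+];
+
+// Count the cans of tomato paste detected with at least 70% probability
+const threshold = 0.7;
+const tomatoPasteCount = detections.filter(d => d.tag === 'tomato paste' && d.probability >= threshold).length;
+
+console.log(`Cans of tomato paste detected: ${tomatoPasteCount}`); // 3
+```
+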
+✅ Think of some different scenarios you might want to use image-based AI models for. Which ones would need classification, and which would need object detection?
+
+### How object detection works
+
+Object detection uses complex ML models. These models work by dividing the image up into multiple cells, then checking if the center of the bounding box is the center of an image that matches one of the images used to train the model. You can think of this as being like running an image classifier over different parts of the image to look for matches.
+
+> 💁 This is a drastic over-simplification. There are many techniques for object detection, and you can read more about them on the [Object detection page on Wikipedia](https://wikipedia.org/wiki/Object_detection).
+
+There are a number of different models that can do object detection. One particularly famous model is [YOLO (You only look once)](https://pjreddie.com/darknet/yolo/), which is incredibly fast and can detect 20 different classes of objects, such as people, dogs, bottles and cars.
+
+✅ Read up on the YOLO model at [pjreddie.com/darknet/yolo/](https://pjreddie.com/darknet/yolo/)
+
+Object detection models can be re-trained using transfer learning to detect custom objects.
+
+## Use object detection in retail
+
+Object detection has multiple uses in retail. Some include:
+
+* **Stock checking and counting** - recognizing when stock is low on shelves. If stock is too low, notifications can be sent to staff or robots to re-stock shelves.
+* **Mask detection** - in stores with mask policies during public health events, object detection can recognize people with masks and those without.
+* **Automated billing** - detecting items picked off shelves in automated stores and billing customers appropriately.
+* **Hazard detection** - recognizing broken items on floors, or spilled liquids, alerting cleaning crews.
+
+✅ Do some research: What are some more use cases for object detection in retail?
+
+## Train an object detector
+
+You can train an object detector using Custom Vision, in a similar way to how you trained an image classifier.
+
+### Task - create an object detector
+
+1. Create a Resource Group for this project called `stock-detector`
+
+1. Create a free Custom Vision training resource, and a free Custom Vision prediction resource in the `stock-detector` resource group. Name them `stock-detector-training` and `stock-detector-prediction`.
+
+ > 💁 You can only have one free training and prediction resource, so make sure you've cleaned up your project from the earlier lessons.
+
+ > ⚠️ You can refer to [the instructions for creating training and prediction resources from project 4, lesson 1 if needed](../../../4-manufacturing/lessons/1-train-fruit-detector/README.md#task---create-a-cognitive-services-resource).
+
+1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai), and sign in with the Microsoft account you used for your Azure account.
+
+1. Follow the [Create a new Project section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#create-a-new-project) to create a new Custom Vision project. The UI may change and these docs are always the most up to date reference.
+
+ Call your project `stock-detector`.
+
+    When you create your project, make sure to use the `stock-detector-training` resource you created earlier. Use an *Object Detection* project type, and the *Products on Shelves* domain.
+
+ 
+
+    ✅ The products on shelves domain is specifically targeted for detecting stock on store shelves. Read more on the different domains in the [Select a domain documentation on Microsoft Docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/select-domain?WT.mc_id=academic-17441-jabenn#object-detection)
+
+✅ Take some time to explore the Custom Vision UI for your object detector.
+
+### Task - train your object detector
+
+To train your model you will need a set of images containing the objects you want to detect.
+
+1. Gather images that contain the object to detect. You will need at least 15 images containing each object to detect from a variety of different angles and in different lighting conditions, but the more the better. This object detector uses the *Products on shelves* domain, so try to set up the objects as if they were on a store shelf. You will also need a few images to test the model. If you are detecting more than one object, you will want some testing images that contain all the objects.
+
+ > 💁 Images with multiple different objects count towards the 15 image minimum for all the objects in the image.
+
+    Your images should be PNGs or JPEGs, smaller than 6MB. If you create them with an iPhone for example they may be high-resolution HEIC images, so they will need to be converted and possibly shrunk. The more images the better, and you should have a similar number of images of each object.
+
+ The model is designed for products on shelves, so try to take the photos of the objects on shelves.
+
+    You can find some example images of cashew nuts and tomato paste that you can use in the [images](./images) folder.
+
+1. Follow the [Upload and tag images section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#upload-and-tag-images) to upload your training images. Create relevant tags depending on the types of objects you want to detect.
+
+ 
+
+ When you draw bounding boxes for objects, keep them nice and tight around the object. It can take a while to outline all the images, but the tool will detect what it thinks are the bounding boxes, making it faster.
+
+ 
+
+    > 💁 If you have more than 15 images for each object, you can train after 15 then use the **Suggested tags** feature. This will use the trained model to detect the objects in the untagged images. You can then confirm the detected objects, or reject and re-draw the bounding boxes. This can save a *lot* of time.
+
+1. Follow the [Train the detector section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#train-the-detector) to train the object detector on your tagged images.
+
+ You will be given a choice of training type. Select **Quick Training**.
+
+The object detector will then train. It will take a few minutes for the training to complete.
+
+## Test your object detector
+
+Once your object detector is trained, you can test it by giving it new images to detect objects in.
+
+### Task - test your object detector
+
+1. Use the **Quick Test** button to upload testing images and verify the objects are detected. Use the testing images you created earlier, not any of the images you used for training.
+
+ 
+
+1. Try all the testing images you have access to and observe the probabilities.
+
+## Retrain your object detector
+
+When you test your object detector, it may not give the results you expect, the same as with image classifiers in the previous project. You can improve your object detector by retraining it with images it gets wrong.
+
+Every time you make a prediction using the quick test option, the image and results are stored. You can use these images to retrain your model.
+
+1. Use the **Predictions** tab to locate the images you used for testing
+
+1. Confirm any accurate detections, delete any incorrect ones, and add any missing objects.
+
+1. Retrain and re-test the model.
+
+---
+
+## 🚀 Challenge
+
+What would happen if you used the object detector with similar looking items, such as cans of tomato paste and chopped tomatoes from the same brand?
+
+If you have any similar looking items, test it out by adding images of them to your object detector.
+
+## Post-lecture quiz
+
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/38)
+
+## Review & Self Study
+
+* When you trained your object detector, you would have seen values for *Precision*, *Recall*, and *mAP* that rate the model that was created. Read up on what these values are using [the Evaluate the detector section of the Build an object detector quickstart on the Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/custom-vision-service/get-started-build-detector?WT.mc_id=academic-17441-jabenn#evaluate-the-detector)
+* Read more about object detection on the [Object detection page on Wikipedia](https://wikipedia.org/wiki/Object_detection)
+
+## Assignment
+
+[Compare domains](assignment.md)
diff --git a/5-retail/lessons/1-train-stock-detector/assignment.md b/5-retail/lessons/1-train-stock-detector/assignment.md
new file mode 100644
index 00000000..13c342d0
--- /dev/null
+++ b/5-retail/lessons/1-train-stock-detector/assignment.md
@@ -0,0 +1,14 @@
+# Compare domains
+
+## Instructions
+
+When you created your object detector, you had a choice of multiple domains. Compare how well they work for your stock detector, and describe which gives better results.
+
+To change the domain, select the **Settings** button on the top menu, select a new domain, select the **Save changes** button, then retrain the model. Make sure you test with the new iteration of the model trained with the new domain.
+
+## Rubric
+
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------- | -------- | ----------------- |
+| Train the model with a different domain | Was able to change the domain and re-train the model | Was able to change the domain and re-train the model | Was unable to change the domain or re-train the model |
+| Test the model and compare the results | Was able to test the model with different domains, compare results, and describe which is better | Was able to test the model with different domains, but was unable to compare the results and describe which is better | Was unable to test the model with different domains |
diff --git a/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5305.png b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5305.png
new file mode 100644
index 00000000..1b68c8d9
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5305.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5306.png b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5306.png
new file mode 100644
index 00000000..f8f4bee5
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5306.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5321.png b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5321.png
new file mode 100644
index 00000000..428e2820
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5321.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5343.png b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5343.png
new file mode 100644
index 00000000..fc96c31a
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/testing/IMG_5343.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5307.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5307.png
new file mode 100644
index 00000000..9688128b
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5307.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5308.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5308.png
new file mode 100644
index 00000000..b74c98fb
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5308.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5309.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5309.png
new file mode 100644
index 00000000..59e74933
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5309.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5310.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5310.png
new file mode 100644
index 00000000..dacc6f89
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5310.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5311.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5311.png
new file mode 100644
index 00000000..56a5b535
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5311.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5312.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5312.png
new file mode 100644
index 00000000..772b087d
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5312.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5313.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5313.png
new file mode 100644
index 00000000..73f4752f
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5313.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5314.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5314.png
new file mode 100644
index 00000000..e5f0a295
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5314.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5315.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5315.png
new file mode 100644
index 00000000..2071c95e
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5315.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5316.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5316.png
new file mode 100644
index 00000000..d7c4c032
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5316.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5317.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5317.png
new file mode 100644
index 00000000..f86727c2
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5317.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5318.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5318.png
new file mode 100644
index 00000000..92388841
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5318.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5319.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5319.png
new file mode 100644
index 00000000..c26ae5a6
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5319.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5320.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5320.png
new file mode 100644
index 00000000..2a27e76b
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5320.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5322.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5322.png
new file mode 100644
index 00000000..ece228c4
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5322.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5323.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5323.png
new file mode 100644
index 00000000..628905a1
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5323.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5324.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5324.png
new file mode 100644
index 00000000..9abe3b40
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5324.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5325.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5325.png
new file mode 100644
index 00000000..e066181a
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5325.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5326.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5326.png
new file mode 100644
index 00000000..c2ad92de
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5326.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5327.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5327.png
new file mode 100644
index 00000000..269190c8
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5327.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5328.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5328.png
new file mode 100644
index 00000000..307fc8b6
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5328.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5329.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5329.png
new file mode 100644
index 00000000..3298f753
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5329.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5330.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5330.png
new file mode 100644
index 00000000..25a3af17
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5330.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5331.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5331.png
new file mode 100644
index 00000000..7215fb3b
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5331.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5332.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5332.png
new file mode 100644
index 00000000..0b3c6975
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5332.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5333.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5333.png
new file mode 100644
index 00000000..4e0dfb2e
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5333.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5334.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5334.png
new file mode 100644
index 00000000..5ab25123
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5334.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5335.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5335.png
new file mode 100644
index 00000000..66bdea2c
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5335.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5336.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5336.png
new file mode 100644
index 00000000..9c72fc51
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5336.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5337.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5337.png
new file mode 100644
index 00000000..63fe37a4
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5337.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5338.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5338.png
new file mode 100644
index 00000000..a59d84a9
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5338.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5340.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5340.png
new file mode 100644
index 00000000..dbdc680d
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5340.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5341.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5341.png
new file mode 100644
index 00000000..dda9743b
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5341.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5342.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5342.png
new file mode 100644
index 00000000..c75bfedd
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5342.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5344.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5344.png
new file mode 100644
index 00000000..9e2e07fa
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5344.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5345.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5345.png
new file mode 100644
index 00000000..0bb1b2d2
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5345.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5346.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5346.png
new file mode 100644
index 00000000..5196cd37
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5346.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5347.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5347.png
new file mode 100644
index 00000000..5410d143
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5347.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5348.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5348.png
new file mode 100644
index 00000000..a13b4415
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5348.png differ
diff --git a/5-retail/lessons/1-train-stock-detector/images/training/IMG_5349.png b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5349.png
new file mode 100644
index 00000000..3612f097
Binary files /dev/null and b/5-retail/lessons/1-train-stock-detector/images/training/IMG_5349.png differ
diff --git a/1-getting-started/lessons/2-deeper-dive/translations/.dummy.md b/5-retail/lessons/1-train-stock-detector/translations/.dummy.md
similarity index 100%
rename from 1-getting-started/lessons/2-deeper-dive/translations/.dummy.md
rename to 5-retail/lessons/1-train-stock-detector/translations/.dummy.md
diff --git a/5-retail/lessons/2-check-stock-device/README.md b/5-retail/lessons/2-check-stock-device/README.md
new file mode 100644
index 00000000..595c4e80
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/README.md
@@ -0,0 +1,169 @@
+# Check stock from an IoT device
+
+## Pre-lecture quiz
+
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/39)
+
+## Introduction
+
+In the previous lesson you learned about the different uses of object detection in retail. You also learned how to train an object detector to identify stock. In this lesson you will learn how to use your object detector from your IoT device to count stock.
+
+In this lesson we'll cover:
+
+* [Stock counting](#stock-counting)
+* [Call your object detector from your IoT device](#call-your-object-detector-from-your-iot-device)
+* [Bounding boxes](#bounding-boxes)
+* [Retrain the model](#retrain-the-model)
+* [Count stock](#count-stock)
+
+## Stock counting
+
+Object detectors can be used for stock checking, either counting stock or ensuring stock is where it should be. IoT devices with cameras can be deployed all around the store to monitor stock, starting with hot spots where restocking is most important, such as areas where small numbers of high-value items are stocked.
+
+For example, if a camera is pointing at a set of shelves that can hold 8 cans of tomato paste, and an object detector only detects 7 cans, then one is missing and needs to be restocked.
+
+
+
+In the above image, an object detector has detected 7 cans of tomato paste on a shelf that can hold 8 cans. Not only can the IoT device send a notification of the need to restock, but it can even give an indication of the location of the missing item, important data if you are using robots to restock shelves.
+
+> 💁 Depending on the store and popularity of the item, restocking probably wouldn't happen if only 1 can was missing. You would need to build an algorithm that determines when to restock based on your products, customers and other criteria.
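+
+A minimal sketch of this kind of restocking rule is shown below, assuming you already have a list of Custom Vision predictions for one shelf image. The `EXPECTED_COUNT` and `RESTOCK_AT` values are hypothetical - you would tune them for your own store and products.
+
+```python
+# Hypothetical thresholds - tune these for your own store and products
+EXPECTED_COUNT = 8   # how many cans the shelf holds when full
+RESTOCK_AT = 6       # restock once this many cans or fewer are detected
+
+def needs_restock(predictions, tag='tomato paste', probability_threshold=0.3):
+    # Count only confident detections of the item this shelf should hold
+    detected = [p for p in predictions
+                if p.tag_name == tag and p.probability > probability_threshold]
+    missing = EXPECTED_COUNT - len(detected)
+    return len(detected) <= RESTOCK_AT, missing
+```
+
+The first value returned says whether a restock notification should be sent, the second how many items appear to be missing.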
+
+✅ In what other scenarios could you combine object detection and robots?
+
+Sometimes the wrong stock can be on the shelves. This could be human error when restocking, or customers changing their mind on a purchase and putting an item back in the first available space. For a non-perishable item such as canned goods, this is an annoyance. For a perishable item such as frozen or chilled goods, it can mean that the product can no longer be sold, as it might be impossible to tell how long the item was out of the freezer.
+
+Object detection can be used to detect unexpected items, again alerting a human or robot to return the item as soon as it is detected.
+
+
+
+In the above image, a can of baby corn has been put on the shelf next to the tomato paste. The object detector has detected this, allowing the IoT device to notify a human or robot to return the can to its correct location.
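+
+A similar check can flag unexpected items from the same prediction list. This is a small sketch using the same Custom Vision prediction objects as above; `EXPECTED_TAGS` is a hypothetical set of tags that belong on this particular shelf.
+
+```python
+# Hypothetical set of tags that are allowed on this particular shelf
+EXPECTED_TAGS = {'tomato paste'}
+
+def unexpected_items(predictions, probability_threshold=0.3):
+    # Return any confident detections whose tag doesn't belong on this shelf
+    return [p for p in predictions
+            if p.probability > probability_threshold and p.tag_name not in EXPECTED_TAGS]
+```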
+
+## Call your object detector from your IoT device
+
+The object detector you trained in the last lesson can be called from your IoT device.
+
+### Task - publish an iteration of your object detector
+
+Iterations are published from the Custom Vision portal.
+
+1. Launch the Custom Vision portal at [CustomVision.ai](https://customvision.ai) and sign in if you don't have it open already. Then open your `stock-detector` project.
+
+1. Select the **Performance** tab from the options at the top
+
+1. Select the latest iteration from the *Iterations* list on the side
+
+1. Select the **Publish** button for the iteration
+
+ 
+
+1. In the *Publish Model* dialog, set the *Prediction resource* to the `stock-detector-prediction` resource you created in the last lesson. Leave the name as `Iteration2`, and select the **Publish** button.
+
+1. Once published, select the **Prediction URL** button. This will show details of the prediction API, and you will need these to call the model from your IoT device. The lower section is labelled *If you have an image file*, and these are the details you want. Take a copy of the URL that is shown, which will be something like:
+
+ ```output
+ https://<location>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/<id>/detect/iterations/Iteration2/image
+ ```
+
+ Where `<location>` will be the location you used when creating your Custom Vision resource, and `<id>` will be a long ID made up of letters and numbers. A sketch showing how this URL can be split into the parts needed to call the model is given after this list.
+
+ Also take a copy of the *Prediction-Key* value. This is a secure key that you have to pass when you call the model. Only applications that pass this key are allowed to use the model; any other applications are rejected.
+
+ 
+
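+The prediction URL itself contains everything needed to call the model. Here is a minimal Python sketch of splitting a URL in this format into the endpoint, project ID and iteration name - the same approach the device code later in this lesson uses:
+
+```python
+prediction_url = 'https://<location>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/<id>/detect/iterations/Iteration2/image'
+
+# After splitting on /, part 2 is the host, part 6 is the project ID and part 9 is the iteration name
+parts = prediction_url.split('/')
+endpoint = 'https://' + parts[2]
+project_id = parts[6]
+iteration_name = parts[9]
+```
+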
+✅ When a new iteration is published, it will have a different name. How do you think you would change the iteration an IoT device is using?
+
+### Task - call your object detector from your IoT device
+
+Follow the relevant guide below to use the object detector from your IoT device:
+
+* [Arduino - Wio Terminal](wio-terminal-object-detector.md)
+* [Single-board computer - Raspberry Pi/Virtual device](single-board-computer-object-detector.md)
+
+## Bounding boxes
+
+When you use the object detector, you not only get back the detected objects with their tags and probabilities, but you also get the bounding boxes of the objects. These define where the object detector detected the object with the given probability.
+
+> 💁 A bounding box is a box that defines the area containing the detected object - the boundary of the object.
+
+The results of a prediction in the **Predictions** tab in Custom Vision have the bounding boxes drawn on the image that was sent for prediction.
+
+
+
+In the image above, 4 cans of tomato paste were detected. In the results a red square is overlaid for each object that was detected in the image, indicating the bounding box for that object.
+
+✅ Open the predictions in Custom Vision and check out the bounding boxes.
+
+Bounding boxes are defined with 4 values - top, left, height and width. These values are on a scale of 0-1, representing the positions as a proportion of the size of the image. The origin (the 0,0 position) is the top left of the image, so the top value is the distance from the top, and the bottom of the bounding box is the top plus the height.
+
+
+
+The above image is 600 pixels wide and 800 pixels tall. The bounding box starts at 320 pixels down, giving a top coordinate of 0.4 (800 x 0.4 = 320). From the left, the bounding box starts at 240 pixels across, giving a left coordinate of 0.4 (600 x 0.4 = 240). The height of the bounding box is 240 pixels, giving a height value of 0.3 (800 x 0.3 = 240). The width of the bounding box is 120 pixels, giving a width value of 0.2 (600 x 0.2 = 120).
+
+| Coordinate | Value |
+| ---------- | ----: |
+| Top | 0.4 |
+| Left | 0.4 |
+| Height | 0.3 |
+| Width | 0.2 |
+
+Using proportional values from 0-1 means that no matter what size the image is scaled to, the bounding box starts 0.4 of the way along and down, and is 0.3 of the height and 0.2 of the width.
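+
+The following is a short sketch of this conversion, using the values from the table above and the 600 x 800 pixel image size from the example:
+
+```python
+image_width, image_height = 600, 800
+box = {'left': 0.4, 'top': 0.4, 'width': 0.2, 'height': 0.3}
+
+# Scale the 0-1 values by the image dimensions to get pixel coordinates
+left = box['left'] * image_width                        # 240
+top = box['top'] * image_height                         # 320
+right = (box['left'] + box['width']) * image_width      # 360
+bottom = (box['top'] + box['height']) * image_height    # 560
+```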
+
+You can use bounding boxes combined with probabilities to evaluate how accurate a detection is. For example, an object detector can detect multiple objects that overlap, such as one can detected inside another. Your code could look at the bounding boxes, understand that this is impossible, and ignore any objects that have a significant overlap with other objects.
+
+
+
+In the example above, one bounding box indicated a predicted can of tomato paste at 78.3%. A second bounding box is slightly smaller, and is inside the first bounding box with a probability of 64.3%. Your code can check the bounding boxes, see that they overlap completely, and ignore the lower probability prediction as there is no way one can could be inside another.
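+
+One way to measure that overlap is to intersect the two boxes and compare the overlapping area to the area of the smaller box - the same idea the stock counting code later in this lesson uses. Below is a minimal sketch with plain tuples, assuming each box is given as (left, top, width, height) values from 0-1:
+
+```python
+def overlap_area(box1, box2):
+    # Boxes are (left, top, width, height) tuples with values from 0-1
+    left = max(box1[0], box2[0])
+    top = max(box1[1], box2[1])
+    right = min(box1[0] + box1[2], box2[0] + box2[2])
+    bottom = min(box1[1] + box1[3], box2[1] + box2[3])
+
+    if right <= left or bottom <= top:
+        return 0.0      # the boxes don't overlap at all
+    return (right - left) * (bottom - top)
+
+def is_duplicate(box1, box2, overlap_threshold=0.2):
+    # Treat two boxes as the same object if the overlap is a large
+    # fraction of the smaller box's area
+    smaller_area = min(box1[2] * box1[3], box2[2] * box2[3])
+    return overlap_area(box1, box2) > overlap_threshold * smaller_area
+```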
+
+✅ Can you think of a situation where it is valid to detect one object inside another?
+
+## Retrain the model
+
+Like with the image classifier, you can retrain your model using data captured by your IoT device. Using this real-world data will ensure your model works well when used from your IoT device.
+
+Unlike with the image classifier, you can't just tag an image. Instead you need to review every bounding box detected by the model. If the box is around the wrong thing it needs to be deleted; if it is in the wrong location it needs to be adjusted.
+
+### Task - retrain the model
+
+1. Make sure you have captured a range of images using your IoT device.
+
+1. From the **Predictions** tab, select an image. You will see red boxes indicating the bounding boxes of the detected objects.
+
+1. Work through each bounding box. Select it first and you will see a pop-up showing the tag. Use the handles on the corners of the bounding box to adjust the size if necessary. If the tag is wrong, remove it with the **X** button and add the correct tag. If the bounding box doesn't contain an object, delete it with the trashcan button.
+
+1. Close the editor when done and the image will move from the **Predictions** tab to the **Training Images** tab. Repeat the process for all the predictions.
+
+1. Use the **Train** button to re-train your model. Once it has trained, publish the iteration and update your IoT device to use the URL of the new iteration.
+
+1. Re-deploy your code and test your IoT device.
+
+## Count stock
+
+Using a combination of the number of objects detected and the bounding boxes, you can count the stock on a shelf.
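+
+Putting the pieces together, counting comes down to filtering out duplicate detections and taking the length of what remains. This is a sketch that reuses the hypothetical `is_duplicate` helper from the bounding boxes section above:
+
+```python
+def count_stock(predictions, probability_threshold=0.3, overlap_threshold=0.2):
+    # Keep only confident detections, as (left, top, width, height) tuples
+    boxes = [(p.bounding_box.left, p.bounding_box.top, p.bounding_box.width, p.bounding_box.height)
+             for p in predictions if p.probability > probability_threshold]
+
+    counted = []
+    for box in boxes:
+        # Only count a box if it doesn't overlap one already counted
+        if not any(is_duplicate(box, other, overlap_threshold) for other in counted):
+            counted.append(box)
+
+    return len(counted)
+```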
+
+### Task - count stock
+
+Follow the relevant guide below to count stock using the results from the object detector from your IoT device:
+
+* [Arduino - Wio Terminal](wio-terminal-count-stock.md)
+* [Single-board computer - Raspberry Pi/Virtual device](single-board-computer-count-stock.md)
+
+---
+
+## 🚀 Challenge
+
+Can you detect incorrect stock? Train your model on multiple objects, then update your app to alert you if the wrong stock is detected.
+
+Maybe even take this further and detect stock side by side on the same shelf, and see if something has been put in the wrong place by defining limits on the bounding boxes.
+
+## Post-lecture quiz
+
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/40)
+
+## Review & Self Study
+
+* Learn more about how to architect an end-to-end stock detection system from the [Out of stock detection at the edge pattern guide on Microsoft Docs](https://docs.microsoft.com/hybrid/app-solutions/pattern-out-of-stock-at-edge?WT.mc_id=academic-17441-jabenn)
+* Learn other ways to build end-to-end retail solutions combining a range of IoT and cloud services by watching this [Behind the scenes of a retail solution - Hands On! video on YouTube](https://www.youtube.com/watch?v=m3Pc300x2Mw).
+
+## Assignment
+
+[Use your object detector on the edge](assignment.md)
diff --git a/5-retail/lessons/2-check-stock-device/assignment.md b/5-retail/lessons/2-check-stock-device/assignment.md
new file mode 100644
index 00000000..eb2342f5
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/assignment.md
@@ -0,0 +1,11 @@
+# Use your object detector on the edge
+
+## Instructions
+
+In the last project, you deployed your image classifier to the edge. Do the same with your object detector, exporting it as a compact model and running it on the edge, accessing the edge version from your IoT device.
+
+## Rubric
+
+| Criteria | Exemplary | Adequate | Needs Improvement |
+| -------- | --------- | -------- | ----------------- |
+| Deploy your object detector to the edge | Was able to use the correct compact domain, export the object detector and run it on the edge | Was able to use the correct compact domain, and export the object detector, but was unable to run it on the edge | Was unable to use the correct compact domain, export the object detector, and run it on the edge |
diff --git a/5-retail/lessons/2-check-stock-device/code-count/pi/fruit-quality-detector/app.py b/5-retail/lessons/2-check-stock-device/code-count/pi/fruit-quality-detector/app.py
new file mode 100644
index 00000000..74ab558a
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/pi/fruit-quality-detector/app.py
@@ -0,0 +1,92 @@
+import io
+import time
+from picamera import PiCamera
+
+from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
+from msrest.authentication import ApiKeyCredentials
+
+from PIL import Image, ImageDraw, ImageColor
+from shapely.geometry import Polygon
+
+camera = PiCamera()
+camera.resolution = (640, 480)
+camera.rotation = 0
+
+time.sleep(2)
+
+image = io.BytesIO()
+camera.capture(image, 'jpeg')
+image.seek(0)
+
+with open('image.jpg', 'wb') as image_file:
+ image_file.write(image.read())
+
+prediction_url = ''
+prediction_key = ''
+
+parts = prediction_url.split('/')
+endpoint = 'https://' + parts[2]
+project_id = parts[6]
+iteration_name = parts[9]
+
+prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
+predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
+
+image.seek(0)
+results = predictor.detect_image(project_id, iteration_name, image)
+
+threshold = 0.3
+
+predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
+
+for prediction in predictions:
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
+
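+# If two boxes overlap by more than this fraction of the smaller box's area,
+# treat them as duplicate detections of the same item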
+overlap_threshold = 0.002
+
+def create_polygon(prediction):
+ scale_left = prediction.bounding_box.left
+ scale_top = prediction.bounding_box.top
+ scale_right = prediction.bounding_box.left + prediction.bounding_box.width
+ scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
+
+ return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])
+
+to_delete = []
+
+for i in range(0, len(predictions)):
+ polygon_1 = create_polygon(predictions[i])
+
+ for j in range(i+1, len(predictions)):
+ polygon_2 = create_polygon(predictions[j])
+ overlap = polygon_1.intersection(polygon_2).area
+
+ smallest_area = min(polygon_1.area, polygon_2.area)
+
+ if overlap > (overlap_threshold * smallest_area):
+ to_delete.append(predictions[i])
+ break
+
+for d in to_delete:
+ predictions.remove(d)
+
+print(f'Counted {len(predictions)} stock items')
+
+
+with Image.open('image.jpg') as im:
+ draw = ImageDraw.Draw(im)
+
+ for prediction in predictions:
+ scale_left = prediction.bounding_box.left
+ scale_top = prediction.bounding_box.top
+ scale_right = prediction.bounding_box.left + prediction.bounding_box.width
+ scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
+
+ left = scale_left * im.width
+ top = scale_top * im.height
+ right = scale_right * im.width
+ bottom = scale_bottom * im.height
+
+ draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)
+
+ im.save('image.jpg')
diff --git a/5-retail/lessons/2-check-stock-device/code-count/virtual-iot-device/fruit-quality-detector/app.py b/5-retail/lessons/2-check-stock-device/code-count/virtual-iot-device/fruit-quality-detector/app.py
new file mode 100644
index 00000000..37b464ea
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/virtual-iot-device/fruit-quality-detector/app.py
@@ -0,0 +1,92 @@
+from counterfit_connection import CounterFitConnection
+CounterFitConnection.init('127.0.0.1', 5000)
+
+import io
+from counterfit_shims_picamera import PiCamera
+
+from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
+from msrest.authentication import ApiKeyCredentials
+
+from PIL import Image, ImageDraw, ImageColor
+from shapely.geometry import Polygon
+
+camera = PiCamera()
+camera.resolution = (640, 480)
+camera.rotation = 0
+
+image = io.BytesIO()
+camera.capture(image, 'jpeg')
+image.seek(0)
+
+with open('image.jpg', 'wb') as image_file:
+ image_file.write(image.read())
+
+prediction_url = ''
+prediction_key = ''
+
+parts = prediction_url.split('/')
+endpoint = 'https://' + parts[2]
+project_id = parts[6]
+iteration_name = parts[9]
+
+prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
+predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
+
+image.seek(0)
+results = predictor.detect_image(project_id, iteration_name, image)
+
+threshold = 0.3
+
+predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
+
+for prediction in predictions:
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
+
+overlap_threshold = 0.002
+
+def create_polygon(prediction):
+ scale_left = prediction.bounding_box.left
+ scale_top = prediction.bounding_box.top
+ scale_right = prediction.bounding_box.left + prediction.bounding_box.width
+ scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
+
+ return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])
+
+to_delete = []
+
+for i in range(0, len(predictions)):
+ polygon_1 = create_polygon(predictions[i])
+
+ for j in range(i+1, len(predictions)):
+ polygon_2 = create_polygon(predictions[j])
+ overlap = polygon_1.intersection(polygon_2).area
+
+ smallest_area = min(polygon_1.area, polygon_2.area)
+
+ if overlap > (overlap_threshold * smallest_area):
+ to_delete.append(predictions[i])
+ break
+
+for d in to_delete:
+ predictions.remove(d)
+
+print(f'Counted {len(predictions)} stock items')
+
+
+with Image.open('image.jpg') as im:
+ draw = ImageDraw.Draw(im)
+
+ for prediction in predictions:
+ scale_left = prediction.bounding_box.left
+ scale_top = prediction.bounding_box.top
+ scale_right = prediction.bounding_box.left + prediction.bounding_box.width
+ scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
+
+ left = scale_left * im.width
+ top = scale_top * im.height
+ right = scale_right * im.width
+ bottom = scale_bottom * im.height
+
+ draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)
+
+ im.save('image.jpg')
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/.gitignore b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/.gitignore
new file mode 100644
index 00000000..89cc49cb
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/.gitignore
@@ -0,0 +1,5 @@
+.pio
+.vscode/.browse.c_cpp.db*
+.vscode/c_cpp_properties.json
+.vscode/launch.json
+.vscode/ipch
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/.vscode/extensions.json b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/.vscode/extensions.json
new file mode 100644
index 00000000..0f0d7401
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/.vscode/extensions.json
@@ -0,0 +1,7 @@
+{
+ // See http://go.microsoft.com/fwlink/?LinkId=827846
+ // for the documentation about the extensions.json format
+ "recommendations": [
+ "platformio.platformio-ide"
+ ]
+}
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/include/README b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/include/README
new file mode 100644
index 00000000..194dcd43
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/include/README
@@ -0,0 +1,39 @@
+
+This directory is intended for project header files.
+
+A header file is a file containing C declarations and macro definitions
+to be shared between several project source files. You request the use of a
+header file in your project source file (C, C++, etc) located in `src` folder
+by including it, with the C preprocessing directive `#include'.
+
+```src/main.c
+
+#include "header.h"
+
+int main (void)
+{
+ ...
+}
+```
+
+Including a header file produces the same results as copying the header file
+into each source file that needs it. Such copying would be time-consuming
+and error-prone. With a header file, the related declarations appear
+in only one place. If they need to be changed, they can be changed in one
+place, and programs that include the header file will automatically use the
+new version when next recompiled. The header file eliminates the labor of
+finding and changing all the copies as well as the risk that a failure to
+find one copy will result in inconsistencies within a program.
+
+In C, the usual convention is to give header files names that end with `.h'.
+It is most portable to use only letters, digits, dashes, and underscores in
+header file names, and at most one dot.
+
+Read more about using header files in official GCC documentation:
+
+* Include Syntax
+* Include Operation
+* Once-Only Headers
+* Computed Includes
+
+https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/lib/README b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/lib/README
new file mode 100644
index 00000000..6debab1e
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/lib/README
@@ -0,0 +1,46 @@
+
+This directory is intended for project specific (private) libraries.
+PlatformIO will compile them to static libraries and link into executable file.
+
+The source code of each library should be placed in its own separate directory
+("lib/your_library_name/[here are source files]").
+
+For example, see a structure of the following two libraries `Foo` and `Bar`:
+
+|--lib
+| |
+| |--Bar
+| | |--docs
+| | |--examples
+| | |--src
+| | |- Bar.c
+| | |- Bar.h
+| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
+| |
+| |--Foo
+| | |- Foo.c
+| | |- Foo.h
+| |
+| |- README --> THIS FILE
+|
+|- platformio.ini
+|--src
+ |- main.c
+
+and a contents of `src/main.c`:
+```
+#include <Foo.h>
+#include <Bar.h>
+
+int main (void)
+{
+ ...
+}
+
+```
+
+PlatformIO Library Dependency Finder will find automatically dependent
+libraries scanning project source files.
+
+More information about PlatformIO Library Dependency Finder
+- https://docs.platformio.org/page/librarymanager/ldf.html
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/platformio.ini b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/platformio.ini
new file mode 100644
index 00000000..5f3eb8a7
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/platformio.ini
@@ -0,0 +1,26 @@
+; PlatformIO Project Configuration File
+;
+; Build options: build flags, source filter
+; Upload options: custom upload port, speed and extra flags
+; Library options: dependencies, extra library storages
+; Advanced options: extra scripting
+;
+; Please visit documentation for the other options and examples
+; https://docs.platformio.org/page/projectconf.html
+
+[env:seeed_wio_terminal]
+platform = atmelsam
+board = seeed_wio_terminal
+framework = arduino
+lib_deps =
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
+ seeed-studio/Seeed Arduino SFUD @ 2.0.1
+ seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
+ seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
+ seeed-studio/Seeed Arduino RTC @ 2.0.0
+ bblanchon/ArduinoJson @ 6.17.3
+build_flags =
+ -w
+ -DARDUCAM_SHIELD_V2
+ -DOV2640_CAM
\ No newline at end of file
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/camera.h b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/camera.h
new file mode 100644
index 00000000..2028039f
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/camera.h
@@ -0,0 +1,160 @@
+#pragma once
+
+#include <ArduCAM.h>
+#include <SPI.h>
+
+class Camera
+{
+public:
+ Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
+ {
+ _format = format;
+ _image_size = image_size;
+ }
+
+ bool init()
+ {
+ // Reset the CPLD
+ _arducam.write_reg(0x07, 0x80);
+ delay(100);
+
+ _arducam.write_reg(0x07, 0x00);
+ delay(100);
+
+ // Check if the ArduCAM SPI bus is OK
+ _arducam.write_reg(ARDUCHIP_TEST1, 0x55);
+ if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
+ {
+ return false;
+ }
+
+ // Change MCU mode
+ _arducam.set_mode(MCU2LCD_MODE);
+
+ uint8_t vid, pid;
+
+ // Check if the camera module type is OV2640
+ _arducam.wrSensorReg8_8(0xff, 0x01);
+ _arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
+ _arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
+ if ((vid != 0x26) && ((pid != 0x41) || (pid != 0x42)))
+ {
+ return false;
+ }
+
+ _arducam.set_format(_format);
+ _arducam.InitCAM();
+ _arducam.OV2640_set_JPEG_size(_image_size);
+ _arducam.OV2640_set_Light_Mode(Auto);
+ _arducam.OV2640_set_Special_effects(Normal);
+ delay(1000);
+
+ return true;
+ }
+
+ void startCapture()
+ {
+ _arducam.flush_fifo();
+ _arducam.clear_fifo_flag();
+ _arducam.start_capture();
+ }
+
+ bool captureReady()
+ {
+ return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
+ }
+
+ bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
+ {
+ if (!captureReady()) return false;
+
+ // Get the image file length
+ uint32_t length = _arducam.read_fifo_length();
+ buffer_length = length;
+
+ if (length >= MAX_FIFO_SIZE)
+ {
+ return false;
+ }
+ if (length == 0)
+ {
+ return false;
+ }
+
+ // create the buffer
+ byte *buf = new byte[length];
+
+ uint8_t temp = 0, temp_last = 0;
+ int i = 0;
+ uint32_t buffer_pos = 0;
+ bool is_header = false;
+
+ _arducam.CS_LOW();
+ _arducam.set_fifo_burst();
+
+ while (length--)
+ {
+ temp_last = temp;
+ temp = SPI.transfer(0x00);
+ //Read JPEG data from FIFO
+ if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
+ {
+ buf[buffer_pos] = temp;
+
+ buffer_pos++;
+ i++;
+
+ _arducam.CS_HIGH();
+ }
+ if (is_header == true)
+ {
+ //Write image data to buffer if not full
+ if (i < 256)
+ {
+ buf[buffer_pos] = temp;
+ buffer_pos++;
+ i++;
+ }
+ else
+ {
+ _arducam.CS_HIGH();
+
+ i = 0;
+ buf[buffer_pos] = temp;
+
+ buffer_pos++;
+ i++;
+
+ _arducam.CS_LOW();
+ _arducam.set_fifo_burst();
+ }
+ }
+ else if ((temp == 0xD8) & (temp_last == 0xFF))
+ {
+ is_header = true;
+
+ buf[buffer_pos] = temp_last;
+ buffer_pos++;
+ i++;
+
+ buf[buffer_pos] = temp;
+ buffer_pos++;
+ i++;
+ }
+ }
+
+ _arducam.clear_fifo_flag();
+
+ _arducam.set_format(_format);
+ _arducam.InitCAM();
+ _arducam.OV2640_set_JPEG_size(_image_size);
+
+ // return the buffer
+ *buffer = buf;
+
+ return true;
+ }
+
+private:
+ ArduCAM _arducam;
+ int _format;
+ int _image_size;
+};
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/config.h b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/config.h
new file mode 100644
index 00000000..ef40b4fa
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/config.h
@@ -0,0 +1,49 @@
+#pragma once
+
+#include <string>
+
+using namespace std;
+
+// WiFi credentials
+const char *SSID = "";
+const char *PASSWORD = "";
+
+const char *PREDICTION_URL = "";
+const char *PREDICTION_KEY = "";
+
+// Microsoft Azure DigiCert Global Root G2 global certificate
+const char *CERTIFICATE =
+ "-----BEGIN CERTIFICATE-----\r\n"
+ "MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
+ "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
+ "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
+ "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
+ "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
+ "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
+ "ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
+ "wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
+ "iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
+ "ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
+ "aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
+ "0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
+ "gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
+ "sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
+ "lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
+ "N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
+ "Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
+ "AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
+ "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
+ "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
+ "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
+ "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
+ "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
+ "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
+ "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
+ "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
+ "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
+ "+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
+ "cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
+ "kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
+ "trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
+ "8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
+ "-----END CERTIFICATE-----\r\n";
\ No newline at end of file
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/main.cpp b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/main.cpp
new file mode 100644
index 00000000..b8bf581f
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/src/main.cpp
@@ -0,0 +1,223 @@
+#include <Arduino.h>
+#include <ArduinoJson.h>
+#include <HTTPClient.h>
+#include <rpcWiFi.h>
+#include "SD/Seeed_SD.h"
+#include <Seeed_FS.h>
+#include <SPI.h>
+#include <vector>
+#include <WiFiClientSecure.h>
+#include <Wire.h>
+
+#include "config.h"
+#include "camera.h"
+
+Camera camera = Camera(JPEG, OV2640_640x480);
+
+WiFiClientSecure client;
+
+void setupCamera()
+{
+ pinMode(PIN_SPI_SS, OUTPUT);
+ digitalWrite(PIN_SPI_SS, HIGH);
+
+ Wire.begin();
+ SPI.begin();
+
+ if (!camera.init())
+ {
+ Serial.println("Error setting up the camera!");
+ }
+}
+
+void connectWiFi()
+{
+ while (WiFi.status() != WL_CONNECTED)
+ {
+ Serial.println("Connecting to WiFi..");
+ WiFi.begin(SSID, PASSWORD);
+ delay(500);
+ }
+
+ client.setCACert(CERTIFICATE);
+ Serial.println("Connected!");
+}
+
+void setup()
+{
+ Serial.begin(9600);
+
+ while (!Serial)
+ ; // Wait for Serial to be ready
+
+ delay(1000);
+
+ connectWiFi();
+
+ setupCamera();
+
+ pinMode(WIO_KEY_C, INPUT_PULLUP);
+}
+
+const float threshold = 0.0f;
+const float overlap_threshold = 0.20f;
+
+struct Point {
+ float x, y;
+};
+
+struct Rect {
+ Point topLeft, bottomRight;
+};
+
+float area(Rect rect)
+{
+ return abs(rect.bottomRight.x - rect.topLeft.x) * abs(rect.bottomRight.y - rect.topLeft.y);
+}
+
+float overlappingArea(Rect rect1, Rect rect2)
+{
+ float left = max(rect1.topLeft.x, rect2.topLeft.x);
+ float right = min(rect1.bottomRight.x, rect2.bottomRight.x);
+ float top = max(rect1.topLeft.y, rect2.topLeft.y);
+ float bottom = min(rect1.bottomRight.y, rect2.bottomRight.y);
+
+
+ if ( right > left && bottom > top )
+ {
+ return (right-left)*(bottom-top);
+ }
+
+ return 0.0f;
+}
+
+Rect rectFromBoundingBox(JsonVariant prediction)
+{
+ JsonObject bounding_box = prediction["boundingBox"].as<JsonObject>();
+
+ float left = bounding_box["left"].as<float>();
+ float top = bounding_box["top"].as<float>();
+ float width = bounding_box["width"].as<float>();
+ float height = bounding_box["height"].as<float>();
+
+ Point topLeft = {left, top};
+ Point bottomRight = {left + width, top + height};
+
+ return {topLeft, bottomRight};
+}
+
+void processPredictions(std::vector<JsonVariant> &predictions)
+{
+ std::vector<JsonVariant> passed_predictions;
+
+ for (int i = 0; i < predictions.size(); ++i)
+ {
+ Rect prediction_1_rect = rectFromBoundingBox(predictions[i]);
+ float prediction_1_area = area(prediction_1_rect);
+ bool passed = true;
+
+ for (int j = i + 1; j < predictions.size(); ++j)
+ {
+ Rect prediction_2_rect = rectFromBoundingBox(predictions[j]);
+ float prediction_2_area = area(prediction_2_rect);
+
+ float overlap = overlappingArea(prediction_1_rect, prediction_2_rect);
+ float smallest_area = min(prediction_1_area, prediction_2_area);
+
+ if (overlap > (overlap_threshold * smallest_area))
+ {
+ passed = false;
+ break;
+ }
+ }
+
+ if (passed)
+ {
+ passed_predictions.push_back(predictions[i]);
+ }
+ }
+
+ for(JsonVariant prediction : passed_predictions)
+ {
+ String boundingBox = prediction["boundingBox"].as<String>();
+ String tag = prediction["tagName"].as<String>();
+ float probability = prediction["probability"].as<float>();
+
+ char buff[32];
+ sprintf(buff, "%s:\t%.2f%%\t%s", tag.c_str(), probability * 100.0, boundingBox.c_str());
+ Serial.println(buff);
+ }
+
+ Serial.print("Counted ");
+ Serial.print(passed_predictions.size());
+ Serial.println(" stock items.");
+}
+
+void detectStock(byte *buffer, uint32_t length)
+{
+ HTTPClient httpClient;
+ httpClient.begin(client, PREDICTION_URL);
+ httpClient.addHeader("Content-Type", "application/octet-stream");
+ httpClient.addHeader("Prediction-Key", PREDICTION_KEY);
+
+ int httpResponseCode = httpClient.POST(buffer, length);
+
+ if (httpResponseCode == 200)
+ {
+ String result = httpClient.getString();
+
+ DynamicJsonDocument doc(1024);
+ deserializeJson(doc, result.c_str());
+
+ JsonObject obj = doc.as<JsonObject>();
+ JsonArray predictions = obj["predictions"].as<JsonArray>();
+
+ std::vector<JsonVariant> passed_predictions;
+
+ for(JsonVariant prediction : predictions)
+ {
+ float probability = prediction["probability"].as<float>();
+ if (probability > threshold)
+ {
+ passed_predictions.push_back(prediction);
+ }
+ }
+
+ processPredictions(passed_predictions);
+ }
+
+ httpClient.end();
+}
+
+void buttonPressed()
+{
+ camera.startCapture();
+
+ while (!camera.captureReady())
+ delay(100);
+
+ Serial.println("Image captured");
+
+ byte *buffer;
+ uint32_t length;
+
+ if (camera.readImageToBuffer(&buffer, length))
+ {
+ Serial.print("Image read to buffer with length ");
+ Serial.println(length);
+
+ detectStock(buffer, length);
+
+ delete (buffer);
+ }
+}
+
+void loop()
+{
+ if (digitalRead(WIO_KEY_C) == LOW)
+ {
+ buttonPressed();
+ delay(2000);
+ }
+
+ delay(200);
+}
diff --git a/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/test/README b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/test/README
new file mode 100644
index 00000000..b94d0890
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-count/wio-terminal/fruit-quality-detector/test/README
@@ -0,0 +1,11 @@
+
+This directory is intended for PlatformIO Unit Testing and project tests.
+
+Unit Testing is a software testing method by which individual units of
+source code, sets of one or more MCU program modules together with associated
+control data, usage procedures, and operating procedures, are tested to
+determine whether they are fit for use. Unit testing finds problems early
+in the development cycle.
+
+More information about PlatformIO Unit Testing:
+- https://docs.platformio.org/page/plus/unit-testing.html
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/pi/fruit-quality-detector/app.py b/5-retail/lessons/2-check-stock-device/code-detect/pi/fruit-quality-detector/app.py
new file mode 100644
index 00000000..8c1182fe
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/pi/fruit-quality-detector/app.py
@@ -0,0 +1,40 @@
+import io
+import time
+from picamera import PiCamera
+
+from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
+from msrest.authentication import ApiKeyCredentials
+
+camera = PiCamera()
+camera.resolution = (640, 480)
+camera.rotation = 0
+
+time.sleep(2)
+
+image = io.BytesIO()
+camera.capture(image, 'jpeg')
+image.seek(0)
+
+with open('image.jpg', 'wb') as image_file:
+ image_file.write(image.read())
+
+prediction_url = ''
+prediction_key = ''
+
+parts = prediction_url.split('/')
+endpoint = 'https://' + parts[2]
+project_id = parts[6]
+iteration_name = parts[9]
+
+prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
+predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
+
+image.seek(0)
+results = predictor.detect_image(project_id, iteration_name, image)
+
+threshold = 0.3
+
+predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
+
+for prediction in predictions:
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/virtual-iot-device/fruit-quality-detector/app.py b/5-retail/lessons/2-check-stock-device/code-detect/virtual-iot-device/fruit-quality-detector/app.py
new file mode 100644
index 00000000..cc53a73c
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/virtual-iot-device/fruit-quality-detector/app.py
@@ -0,0 +1,40 @@
+from counterfit_connection import CounterFitConnection
+CounterFitConnection.init('127.0.0.1', 5000)
+
+import io
+from counterfit_shims_picamera import PiCamera
+
+from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
+from msrest.authentication import ApiKeyCredentials
+
+camera = PiCamera()
+camera.resolution = (640, 480)
+camera.rotation = 0
+
+image = io.BytesIO()
+camera.capture(image, 'jpeg')
+image.seek(0)
+
+with open('image.jpg', 'wb') as image_file:
+ image_file.write(image.read())
+
+prediction_url = ''
+prediction_key = ''
+
+parts = prediction_url.split('/')
+endpoint = 'https://' + parts[2]
+project_id = parts[6]
+iteration_name = parts[9]
+
+prediction_credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
+predictor = CustomVisionPredictionClient(endpoint, prediction_credentials)
+
+image.seek(0)
+results = predictor.detect_image(project_id, iteration_name, image)
+
+threshold = 0.3
+
+predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
+
+for prediction in predictions:
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/.gitignore b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/.gitignore
new file mode 100644
index 00000000..89cc49cb
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/.gitignore
@@ -0,0 +1,5 @@
+.pio
+.vscode/.browse.c_cpp.db*
+.vscode/c_cpp_properties.json
+.vscode/launch.json
+.vscode/ipch
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/.vscode/extensions.json b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/.vscode/extensions.json
new file mode 100644
index 00000000..0f0d7401
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/.vscode/extensions.json
@@ -0,0 +1,7 @@
+{
+ // See http://go.microsoft.com/fwlink/?LinkId=827846
+ // for the documentation about the extensions.json format
+ "recommendations": [
+ "platformio.platformio-ide"
+ ]
+}
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/include/README b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/include/README
new file mode 100644
index 00000000..194dcd43
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/include/README
@@ -0,0 +1,39 @@
+
+This directory is intended for project header files.
+
+A header file is a file containing C declarations and macro definitions
+to be shared between several project source files. You request the use of a
+header file in your project source file (C, C++, etc) located in `src` folder
+by including it, with the C preprocessing directive `#include'.
+
+```src/main.c
+
+#include "header.h"
+
+int main (void)
+{
+ ...
+}
+```
+
+Including a header file produces the same results as copying the header file
+into each source file that needs it. Such copying would be time-consuming
+and error-prone. With a header file, the related declarations appear
+in only one place. If they need to be changed, they can be changed in one
+place, and programs that include the header file will automatically use the
+new version when next recompiled. The header file eliminates the labor of
+finding and changing all the copies as well as the risk that a failure to
+find one copy will result in inconsistencies within a program.
+
+In C, the usual convention is to give header files names that end with `.h'.
+It is most portable to use only letters, digits, dashes, and underscores in
+header file names, and at most one dot.
+
+Read more about using header files in official GCC documentation:
+
+* Include Syntax
+* Include Operation
+* Once-Only Headers
+* Computed Includes
+
+https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/lib/README b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/lib/README
new file mode 100644
index 00000000..6debab1e
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/lib/README
@@ -0,0 +1,46 @@
+
+This directory is intended for project specific (private) libraries.
+PlatformIO will compile them to static libraries and link into executable file.
+
+The source code of each library should be placed in its own separate directory
+("lib/your_library_name/[here are source files]").
+
+For example, see a structure of the following two libraries `Foo` and `Bar`:
+
+|--lib
+| |
+| |--Bar
+| | |--docs
+| | |--examples
+| | |--src
+| | |- Bar.c
+| | |- Bar.h
+| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
+| |
+| |--Foo
+| | |- Foo.c
+| | |- Foo.h
+| |
+| |- README --> THIS FILE
+|
+|- platformio.ini
+|--src
+ |- main.c
+
+and a contents of `src/main.c`:
+```
+#include <Foo.h>
+#include <Bar.h>
+
+int main (void)
+{
+ ...
+}
+
+```
+
+PlatformIO Library Dependency Finder will find automatically dependent
+libraries scanning project source files.
+
+More information about PlatformIO Library Dependency Finder
+- https://docs.platformio.org/page/librarymanager/ldf.html
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/platformio.ini b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/platformio.ini
new file mode 100644
index 00000000..5f3eb8a7
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/platformio.ini
@@ -0,0 +1,26 @@
+; PlatformIO Project Configuration File
+;
+; Build options: build flags, source filter
+; Upload options: custom upload port, speed and extra flags
+; Library options: dependencies, extra library storages
+; Advanced options: extra scripting
+;
+; Please visit documentation for the other options and examples
+; https://docs.platformio.org/page/projectconf.html
+
+[env:seeed_wio_terminal]
+platform = atmelsam
+board = seeed_wio_terminal
+framework = arduino
+lib_deps =
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino FS @ 2.0.3
+ seeed-studio/Seeed Arduino SFUD @ 2.0.1
+ seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
+ seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
+ seeed-studio/Seeed Arduino RTC @ 2.0.0
+ bblanchon/ArduinoJson @ 6.17.3
+build_flags =
+ -w
+ -DARDUCAM_SHIELD_V2
+ -DOV2640_CAM
\ No newline at end of file
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/camera.h b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/camera.h
new file mode 100644
index 00000000..2028039f
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/camera.h
@@ -0,0 +1,160 @@
+#pragma once
+
+#include <ArduCAM.h>
+#include <SPI.h>
+
+class Camera
+{
+public:
+ Camera(int format, int image_size) : _arducam(OV2640, PIN_SPI_SS)
+ {
+ _format = format;
+ _image_size = image_size;
+ }
+
+ bool init()
+ {
+ // Reset the CPLD
+ _arducam.write_reg(0x07, 0x80);
+ delay(100);
+
+ _arducam.write_reg(0x07, 0x00);
+ delay(100);
+
+ // Check if the ArduCAM SPI bus is OK
+ _arducam.write_reg(ARDUCHIP_TEST1, 0x55);
+ if (_arducam.read_reg(ARDUCHIP_TEST1) != 0x55)
+ {
+ return false;
+ }
+
+ // Change MCU mode
+ _arducam.set_mode(MCU2LCD_MODE);
+
+ uint8_t vid, pid;
+
+ // Check if the camera module type is OV2640
+ _arducam.wrSensorReg8_8(0xff, 0x01);
+ _arducam.rdSensorReg8_8(OV2640_CHIPID_HIGH, &vid);
+ _arducam.rdSensorReg8_8(OV2640_CHIPID_LOW, &pid);
+ if ((vid != 0x26) && ((pid != 0x41) || (pid != 0x42)))
+ {
+ return false;
+ }
+
+ _arducam.set_format(_format);
+ _arducam.InitCAM();
+ _arducam.OV2640_set_JPEG_size(_image_size);
+ _arducam.OV2640_set_Light_Mode(Auto);
+ _arducam.OV2640_set_Special_effects(Normal);
+ delay(1000);
+
+ return true;
+ }
+
+ void startCapture()
+ {
+ _arducam.flush_fifo();
+ _arducam.clear_fifo_flag();
+ _arducam.start_capture();
+ }
+
+ bool captureReady()
+ {
+ return _arducam.get_bit(ARDUCHIP_TRIG, CAP_DONE_MASK);
+ }
+
+ bool readImageToBuffer(byte **buffer, uint32_t &buffer_length)
+ {
+ if (!captureReady()) return false;
+
+ // Get the image file length
+ uint32_t length = _arducam.read_fifo_length();
+ buffer_length = length;
+
+ if (length >= MAX_FIFO_SIZE)
+ {
+ return false;
+ }
+ if (length == 0)
+ {
+ return false;
+ }
+
+ // create the buffer
+ byte *buf = new byte[length];
+
+ uint8_t temp = 0, temp_last = 0;
+ int i = 0;
+ uint32_t buffer_pos = 0;
+ bool is_header = false;
+
+ _arducam.CS_LOW();
+ _arducam.set_fifo_burst();
+
+ while (length--)
+ {
+ temp_last = temp;
+ temp = SPI.transfer(0x00);
+ //Read JPEG data from FIFO
+ if ((temp == 0xD9) && (temp_last == 0xFF)) //If find the end ,break while,
+ {
+ buf[buffer_pos] = temp;
+
+ buffer_pos++;
+ i++;
+
+ _arducam.CS_HIGH();
+ }
+ if (is_header == true)
+ {
+ //Write image data to buffer if not full
+ if (i < 256)
+ {
+ buf[buffer_pos] = temp;
+ buffer_pos++;
+ i++;
+ }
+ else
+ {
+ _arducam.CS_HIGH();
+
+ i = 0;
+ buf[buffer_pos] = temp;
+
+ buffer_pos++;
+ i++;
+
+ _arducam.CS_LOW();
+ _arducam.set_fifo_burst();
+ }
+ }
+ else if ((temp == 0xD8) & (temp_last == 0xFF))
+ {
+ is_header = true;
+
+ buf[buffer_pos] = temp_last;
+ buffer_pos++;
+ i++;
+
+ buf[buffer_pos] = temp;
+ buffer_pos++;
+ i++;
+ }
+ }
+
+ _arducam.clear_fifo_flag();
+
+ _arducam.set_format(_format);
+ _arducam.InitCAM();
+ _arducam.OV2640_set_JPEG_size(_image_size);
+
+ // return the buffer
+ *buffer = buf;
+
+ return true;
+ }
+
+private:
+ ArduCAM _arducam;
+ int _format;
+ int _image_size;
+};
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/config.h b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/config.h
new file mode 100644
index 00000000..ef40b4fa
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/config.h
@@ -0,0 +1,49 @@
+#pragma once
+
+#include <string>
+
+using namespace std;
+
+// WiFi credentials
+const char *SSID = "";
+const char *PASSWORD = "";
+
+const char *PREDICTION_URL = "";
+const char *PREDICTION_KEY = "";
+
+// Microsoft Azure DigiCert Global Root G2 global certificate
+const char *CERTIFICATE =
+ "-----BEGIN CERTIFICATE-----\r\n"
+ "MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
+ "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
+ "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
+ "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
+ "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
+ "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
+ "ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
+ "wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
+ "iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
+ "ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
+ "aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
+ "0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
+ "gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
+ "sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
+ "lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
+ "N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
+ "Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
+ "AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
+ "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
+ "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
+ "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
+ "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
+ "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
+ "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
+ "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
+ "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
+ "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
+ "+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
+ "cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
+ "kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
+ "trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
+ "8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
+ "-----END CERTIFICATE-----\r\n";
\ No newline at end of file
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/main.cpp b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/main.cpp
new file mode 100644
index 00000000..d507be7d
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/src/main.cpp
@@ -0,0 +1,145 @@
+#include <Arduino.h>
+#include <ArduinoJson.h>
+#include <HTTPClient.h>
+#include <rpcWiFi.h>
+#include "SD/Seeed_SD.h"
+#include <Seeed_FS.h>
+#include <SPI.h>
+#include <vector>
+#include <WiFiClientSecure.h>
+#include <Wire.h>
+
+#include "config.h"
+#include "camera.h"
+
+Camera camera = Camera(JPEG, OV2640_640x480);
+
+WiFiClientSecure client;
+
+void setupCamera()
+{
+ pinMode(PIN_SPI_SS, OUTPUT);
+ digitalWrite(PIN_SPI_SS, HIGH);
+
+ Wire.begin();
+ SPI.begin();
+
+ if (!camera.init())
+ {
+ Serial.println("Error setting up the camera!");
+ }
+}
+
+void connectWiFi()
+{
+ while (WiFi.status() != WL_CONNECTED)
+ {
+ Serial.println("Connecting to WiFi..");
+ WiFi.begin(SSID, PASSWORD);
+ delay(500);
+ }
+
+ client.setCACert(CERTIFICATE);
+ Serial.println("Connected!");
+}
+
+void setup()
+{
+ Serial.begin(9600);
+
+ while (!Serial)
+ ; // Wait for Serial to be ready
+
+ delay(1000);
+
+ connectWiFi();
+
+ setupCamera();
+
+ pinMode(WIO_KEY_C, INPUT_PULLUP);
+}
+
+const float threshold = 0.3f;
+
+void processPredictions(std::vector<JsonVariant> &predictions)
+{
+ for(JsonVariant prediction : predictions)
+ {
+ String tag = prediction["tagName"].as<String>();
+ float probability = prediction["probability"].as<float>();
+
+ char buff[32];
+ sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
+ Serial.println(buff);
+ }
+}
+
+void detectStock(byte *buffer, uint32_t length)
+{
+ HTTPClient httpClient;
+ httpClient.begin(client, PREDICTION_URL);
+ httpClient.addHeader("Content-Type", "application/octet-stream");
+ httpClient.addHeader("Prediction-Key", PREDICTION_KEY);
+
+ int httpResponseCode = httpClient.POST(buffer, length);
+
+ if (httpResponseCode == 200)
+ {
+ String result = httpClient.getString();
+
+ DynamicJsonDocument doc(1024);
+ deserializeJson(doc, result.c_str());
+
+ JsonObject obj = doc.as<JsonObject>();
+ JsonArray predictions = obj["predictions"].as<JsonArray>();
+
+ std::vector<JsonVariant> passed_predictions;
+
+ for(JsonVariant prediction : predictions)
+ {
+ float probability = prediction["probability"].as<float>();
+ if (probability > threshold)
+ {
+ passed_predictions.push_back(prediction);
+ }
+ }
+
+ processPredictions(passed_predictions);
+ }
+
+ httpClient.end();
+}
+
+void buttonPressed()
+{
+ camera.startCapture();
+
+ while (!camera.captureReady())
+ delay(100);
+
+ Serial.println("Image captured");
+
+ byte *buffer;
+ uint32_t length;
+
+ if (camera.readImageToBuffer(&buffer, length))
+ {
+ Serial.print("Image read to buffer with length ");
+ Serial.println(length);
+
+ detectStock(buffer, length);
+
+ delete (buffer);
+ }
+}
+
+void loop()
+{
+ if (digitalRead(WIO_KEY_C) == LOW)
+ {
+ buttonPressed();
+ delay(2000);
+ }
+
+ delay(200);
+}
\ No newline at end of file
diff --git a/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/test/README b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/test/README
new file mode 100644
index 00000000..b94d0890
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/code-detect/wio-terminal/fruit-quality-detector/test/README
@@ -0,0 +1,11 @@
+
+This directory is intended for PlatformIO Unit Testing and project tests.
+
+Unit Testing is a software testing method by which individual units of
+source code, sets of one or more MCU program modules together with associated
+control data, usage procedures, and operating procedures, are tested to
+determine whether they are fit for use. Unit testing finds problems early
+in the development cycle.
+
+More information about PlatformIO Unit Testing:
+- https://docs.platformio.org/page/plus/unit-testing.html
diff --git a/5-retail/lessons/2-check-stock-device/single-board-computer-count-stock.md b/5-retail/lessons/2-check-stock-device/single-board-computer-count-stock.md
new file mode 100644
index 00000000..a6a627da
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/single-board-computer-count-stock.md
@@ -0,0 +1,163 @@
+# Count stock from your IoT device - Virtual IoT Hardware and Raspberry Pi
+
+A combination of the predictions and their bounding boxes can be used to count stock in an image.
+
+## Show bounding boxes
+
+As a helpful debugging step you can not only print out the bounding boxes, but you can also draw them on the image that was written to disk when an image was captured.
+
+### Task - print the bounding boxes
+
+1. Ensure the `stock-counter` project is open in VS Code, and the virtual environment is activated if you are using a virtual IoT device.
+
+1. Change the `print` statement in the `for` loop to the following to print the bounding boxes to the console:
+
+ ```python
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%\t{prediction.bounding_box}')
+ ```
+
+1. Run the app with the camera pointing at some stock on a shelf. The bounding boxes will be printed to the console, with left, top, width and height values from 0-1.
+
+ ```output
+ pi@raspberrypi:~/stock-counter $ python3 app.py
+ tomato paste: 33.42% {'additional_properties': {}, 'left': 0.3455171, 'top': 0.09916268, 'width': 0.14175442, 'height': 0.29405564}
+ tomato paste: 34.41% {'additional_properties': {}, 'left': 0.48283678, 'top': 0.10242918, 'width': 0.11782813, 'height': 0.27467814}
+ tomato paste: 31.25% {'additional_properties': {}, 'left': 0.4923783, 'top': 0.35007596, 'width': 0.13668466, 'height': 0.28304994}
+ tomato paste: 31.05% {'additional_properties': {}, 'left': 0.36416405, 'top': 0.37494493, 'width': 0.14024884, 'height': 0.26880276}
+ ```
+
+### Task - draw bounding boxes on the image
+
+1. The Pip package [Pillow](https://pypi.org/project/Pillow/) can be used to draw on images. Install this with the following command:
+
+ ```sh
+ pip3 install pillow
+ ```
+
+ If you are using a virtual IoT device, make sure to run this from inside the activated virtual environment.
+
+1. Add the following import statement to the top of the `app.py` file:
+
+ ```python
+ from PIL import Image, ImageDraw, ImageColor
+ ```
+
+ This imports code needed to edit the image.
+
+1. Add the following code to the end of the `app.py` file:
+
+ ```python
+ with Image.open('image.jpg') as im:
+ draw = ImageDraw.Draw(im)
+
+ for prediction in predictions:
+ scale_left = prediction.bounding_box.left
+ scale_top = prediction.bounding_box.top
+ scale_right = prediction.bounding_box.left + prediction.bounding_box.width
+ scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
+
+ left = scale_left * im.width
+ top = scale_top * im.height
+ right = scale_right * im.width
+ bottom = scale_bottom * im.height
+
+ draw.rectangle([left, top, right, bottom], outline=ImageColor.getrgb('red'), width=2)
+
+ im.save('image.jpg')
+ ```
+
+    This code opens the image that was saved earlier for editing. It then loops through the predictions, getting each bounding box and calculating the bottom-right coordinate using the bounding box values from 0-1. These are then converted to image coordinates by multiplying by the relevant dimension of the image. For example, if the left value was 0.5 on an image that was 600 pixels wide, this would convert it to 300 (0.5 x 600 = 300).
+
+ Each bounding box is drawn on the image using a red line. Finally the edited image is saved, overwriting the original image.
+
+1. Run the app with the camera pointing at some stock on a shelf. You will see the `image.jpg` file in the VS Code explorer, and you will be able to select it to see the bounding boxes.
+
+ 
+
+## Count stock
+
+In the image shown above, the bounding boxes have a small overlap. If this overlap was much larger, then the bounding boxes may indicate the same object. To count the objects correctly, you need to ignore boxes with a significant overlap.
+
+### Task - count stock ignoring overlap
+
+1. The Pip package [Shapely](https://pypi.org/project/Shapely/) can be used to calculate the intersection. If you are using a Raspberry Pi, you will need to install a library dependency first:
+
+ ```sh
+ sudo apt install libgeos-dev
+ ```
+
+1. Install the Shapely Pip package:
+
+ ```sh
+ pip3 install shapely
+ ```
+
+ If you are using a virtual IoT device, make sure to run this from inside the activated virtual environment.
+
+1. Add the following import statement to the top of the `app.py` file:
+
+ ```python
+ from shapely.geometry import Polygon
+ ```
+
+ This imports code needed to create polygons to calculate overlap.
+
+1. Above the code that draws the bounding boxes, add the following code:
+
+ ```python
+ overlap_threshold = 0.20
+ ```
+
+ This defines the percentage overlap allowed before the bounding boxes are considered to be the same object. 0.20 defines a 20% overlap.
+
+1. To calculate overlap using Shapely, the bounding boxes need to be converted into Shapely polygons. Add the following function to do this:
+
+ ```python
+ def create_polygon(prediction):
+ scale_left = prediction.bounding_box.left
+ scale_top = prediction.bounding_box.top
+ scale_right = prediction.bounding_box.left + prediction.bounding_box.width
+ scale_bottom = prediction.bounding_box.top + prediction.bounding_box.height
+
+ return Polygon([(scale_left, scale_top), (scale_right, scale_top), (scale_right, scale_bottom), (scale_left, scale_bottom)])
+ ```
+
+ This creates a polygon using the bounding box of a prediction.
+
+1. The logic for removing overlapping objects involves comparing all bounding boxes, and if any pair of predictions has bounding boxes that overlap by more than the threshold, deleting one of the predictions. To compare all the predictions, you compare prediction 1 with 2, 3, 4, etc., then 2 with 3, 4, etc. The following code does this:
+
+ ```python
+ to_delete = []
+
+ for i in range(0, len(predictions)):
+ polygon_1 = create_polygon(predictions[i])
+
+ for j in range(i+1, len(predictions)):
+ polygon_2 = create_polygon(predictions[j])
+ overlap = polygon_1.intersection(polygon_2).area
+
+ smallest_area = min(polygon_1.area, polygon_2.area)
+
+ if overlap > (overlap_threshold * smallest_area):
+ to_delete.append(predictions[i])
+ break
+
+ for d in to_delete:
+ predictions.remove(d)
+
+ print(f'Counted {len(predictions)} stock items')
+ ```
+
+    The overlap is calculated using the Shapely `Polygon.intersection` method, which returns a polygon covering the overlapping area, and the area of that polygon is then measured. The overlap threshold is not an absolute value but a percentage of a bounding box, so the smaller of the two bounding boxes is found and multiplied by the threshold to work out the largest overlap area that is allowed. If the overlap exceeds this, the prediction is marked for deletion.
+
+ Once a prediction has been marked for deletion it doesn't need to be checked again, so the inner loop breaks out to check the next prediction. You can't delete items from a list whilst iterating through it, so the bounding boxes that overlap more than the threshold are added to the `to_delete` list, then deleted at the end.
+
+    Finally the stock count is printed to the console. This could then be sent to an IoT service to alert if the stock levels are low - a minimal sketch of this is shown after these steps. All of this code runs before the bounding boxes are drawn, so you will see the stock predictions without overlaps on the generated images.
+
+    > 💁 This is a very simplistic way to remove overlaps, just removing the first one in an overlapping pair. For production code, you would want to put more logic in here, such as considering the overlaps between multiple objects, or whether one bounding box is contained by another.
+
+1. Run the app with the camera pointing at some stock on a shelf. The output will indicate the number of bounding boxes left after removing those whose overlap exceeds the threshold. Try adjusting the `overlap_threshold` value to see predictions being ignored.
+
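+If you want to go further, the stock count could be sent to an IoT service so that something can react when stock levels are low. The following is a minimal sketch only, not part of the lesson code - it assumes you have an IoT Hub with a registered device from earlier lessons, and the connection string is a placeholder you would need to fill in. It would run after the overlapping predictions have been removed.
+
+```python
+import json
+from azure.iot.device import IoTHubDeviceClient, Message
+
+# Placeholder - use the connection string for your registered device
+connection_string = '<connection string>'
+
+device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
+device_client.connect()
+
+# Send the stock count calculated above as a JSON telemetry message
+message = Message(json.dumps({'stock_count': len(predictions)}))
+device_client.send_message(message)
+```
+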
+> 💁 You can find this code in the [code-count/pi](code-count/pi) or [code-count/virtual-device](code-count/virtual-device) folder.
+
+😀 Your stock counter program was a success!
diff --git a/5-retail/lessons/2-check-stock-device/single-board-computer-object-detector.md b/5-retail/lessons/2-check-stock-device/single-board-computer-object-detector.md
new file mode 100644
index 00000000..49760468
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/single-board-computer-object-detector.md
@@ -0,0 +1,74 @@
+# Call your object detector from your IoT device - Virtual IoT Hardware and Raspberry Pi
+
+Once your object detector has been published, it can be used from your IoT device.
+
+## Copy the image classifier project
+
+The majority of your stock detector is the same as the image classifier you created in a previous lesson.
+
+### Task - copy the image classifier project
+
+1. Create a folder called `stock-counter` either on your computer if you are using a virtual IoT device, or on your Raspberry Pi. If you are using a virtual IoT device make sure you set up a virtual environment.
+
+1. Set up the camera hardware.
+
+ * If you are using a Raspberry Pi you will need to fit the PiCamera. You might also want to fix the camera in a single position, for example, by hanging the cable over a box or can, or fixing the camera to a box with double-sided tape.
+    * If you are using a virtual IoT device then you will need to install CounterFit and the CounterFit PyCamera shim. If you are going to use still images, capture some images that your object detector hasn't seen yet; if you are going to use your web cam, make sure it is positioned so that it can see the stock you are detecting.
+
+1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---capture-an-image-using-an-iot-device) to capture images from the camera.
+
+1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---classify-images-from-your-iot-device) to call the image classifier. The majority of this code will be re-used to detect objects.
+
+## Change the code from a classifier to an image detector
+
+The code you used to classify images is very similar to the code to detect objects. The main difference is the method called on the Custom Vision SDK, and the results of the call.
+
+### Task - change the code from a classifier to an image detector
+
+1. Delete the three lines of code that classify the image and process the predictions:
+
+ ```python
+ results = predictor.classify_image(project_id, iteration_name, image)
+
+ for prediction in results.predictions:
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
+ ```
+
+ Remove these three lines.
+
+1. Add the following code to detect objects in the image:
+
+ ```python
+ results = predictor.detect_image(project_id, iteration_name, image)
+
+ threshold = 0.3
+
+ predictions = list(prediction for prediction in results.predictions if prediction.probability > threshold)
+
+ for prediction in predictions:
+ print(f'{prediction.tag_name}:\t{prediction.probability * 100:.2f}%')
+ ```
+
+ This code calls the `detect_image` method on the predictor to run the object detector. It then gathers all the predictions with a probability above a threshold, printing them to the console.
+
+ Unlike an image classifier that only returns one result per tag, the object detector will return multiple results, so any with a low probability need to be filtered out.
+
+1. Run this code. It will capture an image, send it to the object detector, and print out the detected objects. If you are using a virtual IoT device, ensure you have an appropriate image set in CounterFit, or your web cam is selected. If you are using a Raspberry Pi, make sure your camera is pointing at objects on a shelf.
+
+ ```output
+ pi@raspberrypi:~/stock-counter $ python3 app.py
+ tomato paste: 34.13%
+ tomato paste: 33.95%
+ tomato paste: 35.05%
+ tomato paste: 32.80%
+ ```
+
+ > 💁 You may need to adjust the `threshold` to an appropriate value for your images.
+
+ You will be able to see the image that was taken, and these values in the **Predictions** tab in Custom Vision.
+
+ 
+
+> 💁 You can find this code in the [code-detect/pi](code-detect/pi) or [code-detect/virtual-device](code-detect/virtual-device) folder.
+
+😀 Your stock counter program was a success!
diff --git a/1-getting-started/lessons/3-sensors-and-actuators/translations/.dummy.md b/5-retail/lessons/2-check-stock-device/translations/.dummy.md
similarity index 100%
rename from 1-getting-started/lessons/3-sensors-and-actuators/translations/.dummy.md
rename to 5-retail/lessons/2-check-stock-device/translations/.dummy.md
diff --git a/5-retail/lessons/2-check-stock-device/wio-terminal-count-stock.md b/5-retail/lessons/2-check-stock-device/wio-terminal-count-stock.md
new file mode 100644
index 00000000..c6bc9b3c
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/wio-terminal-count-stock.md
@@ -0,0 +1,167 @@
+# Count stock from your IoT device - Wio Terminal
+
+A combination of the predictions and their bounding boxes can be used to count stock in an image.
+
+## Count stock
+
+
+
+In the image shown above, the bounding boxes have a small overlap. If this overlap was much larger, then the bounding boxes may indicate the same object. To count the objects correctly, you need to ignore boxes with a significant overlap.
+
+### Task - count stock ignoring overlap
+
+1. Open your `stock-counter` project if it is not already open.
+
+1. Above the `processPredictions` function, add the following code:
+
+ ```cpp
+ const float overlap_threshold = 0.20f;
+ ```
+
+ This defines the percentage overlap allowed before the bounding boxes are considered to be the same object. 0.20 defines a 20% overlap.
+
+1. Below this, and above the `processPredictions` function, add the following code to calculate the overlap between two rectangles:
+
+ ```cpp
+ struct Point {
+ float x, y;
+ };
+
+ struct Rect {
+ Point topLeft, bottomRight;
+ };
+
+ float area(Rect rect)
+ {
+ return abs(rect.bottomRight.x - rect.topLeft.x) * abs(rect.bottomRight.y - rect.topLeft.y);
+ }
+
+ float overlappingArea(Rect rect1, Rect rect2)
+ {
+ float left = max(rect1.topLeft.x, rect2.topLeft.x);
+ float right = min(rect1.bottomRight.x, rect2.bottomRight.x);
+ float top = max(rect1.topLeft.y, rect2.topLeft.y);
+ float bottom = min(rect1.bottomRight.y, rect2.bottomRight.y);
+
+
+ if ( right > left && bottom > top )
+ {
+ return (right-left)*(bottom-top);
+ }
+
+ return 0.0f;
+ }
+ ```
+
+ This code defines a `Point` struct to store points on the image, and a `Rect` struct to define a rectangle using a top left and bottom right coordinate. It then defines an `area` function that calculates the area of a rectangle from a top left and bottom right coordinate.
+
+    Next it defines an `overlappingArea` function that calculates the overlapping area of 2 rectangles. If they don't overlap, it returns 0.
+
+1. Below the `overlappingArea` function, declare a function to convert a bounding box to a `Rect`:
+
+ ```cpp
+ Rect rectFromBoundingBox(JsonVariant prediction)
+ {
+        JsonObject bounding_box = prediction["boundingBox"].as<JsonObject>();
+
+        float left = bounding_box["left"].as<float>();
+        float top = bounding_box["top"].as<float>();
+        float width = bounding_box["width"].as<float>();
+        float height = bounding_box["height"].as<float>();
+
+ Point topLeft = {left, top};
+ Point bottomRight = {left + width, top + height};
+
+ return {topLeft, bottomRight};
+ }
+ ```
+
+    This takes a prediction from the object detector, extracts the bounding box and uses its values to define a rectangle. The right side is calculated as the left plus the width, and the bottom as the top plus the height.
+
+1. The predictions need to be compared to each other, and if 2 predictions have an overlap of more than the threshold, one of them needs to be deleted. The overlap threshold is a percentage, so it needs to be multiplied by the size of the smallest bounding box to check that the overlap exceeds the given percentage of the bounding box, not the given percentage of the whole image. Start by deleting the content of the `processPredictions` function.
+
+1. Add the following to the empty `processPredictions` function:
+
+ ```cpp
+    std::vector<JsonVariant> passed_predictions;
+
+ for (int i = 0; i < predictions.size(); ++i)
+ {
+ Rect prediction_1_rect = rectFromBoundingBox(predictions[i]);
+ float prediction_1_area = area(prediction_1_rect);
+ bool passed = true;
+
+ for (int j = i + 1; j < predictions.size(); ++j)
+ {
+ Rect prediction_2_rect = rectFromBoundingBox(predictions[j]);
+ float prediction_2_area = area(prediction_2_rect);
+
+ float overlap = overlappingArea(prediction_1_rect, prediction_2_rect);
+ float smallest_area = min(prediction_1_area, prediction_2_area);
+
+ if (overlap > (overlap_threshold * smallest_area))
+ {
+ passed = false;
+ break;
+ }
+ }
+
+ if (passed)
+ {
+ passed_predictions.push_back(predictions[i]);
+ }
+ }
+ ```
+
+ This code declares a vector to store the predictions that don't overlap. It then loops through all the predictions, creating a `Rect` from the bounding box.
+
+ Next this code loops through the remaining predictions, starting at the one after the current prediction. This stops predictions being compared more than once - once 1 and 2 have been compared, there's no need to compare 2 with 1, only with 3, 4, etc.
+
+    For each pair of predictions the overlapping area is calculated. This is then compared to the area of the smallest bounding box - if the overlap exceeds the threshold percentage of the smallest bounding box, the prediction is marked as not passed. If, after all the overlaps have been compared, the prediction still passes the checks, it is added to the `passed_predictions` collection.
+
+    > 💁 This is a very simplistic way to remove overlaps, just removing the first one in an overlapping pair. For production code, you would want to put more logic in here, such as considering the overlaps between multiple objects, or whether one bounding box is contained by another.
+
+1. After this, add the following code to send details of the passed predictions to the serial monitor:
+
+ ```cpp
+ for(JsonVariant prediction : passed_predictions)
+ {
+        String boundingBox = prediction["boundingBox"].as<String>();
+        String tag = prediction["tagName"].as<String>();
+        float probability = prediction["probability"].as<float>();
+
+ char buff[32];
+ sprintf(buff, "%s:\t%.2f%%\t%s", tag.c_str(), probability * 100.0, boundingBox.c_str());
+ Serial.println(buff);
+ }
+ ```
+
+ This code loops through the passed predictions and prints their details to the serial monitor.
+
+1. Below this, add code to print the number of counted items to the serial monitor:
+
+ ```cpp
+ Serial.print("Counted ");
+ Serial.print(passed_predictions.size());
+ Serial.println(" stock items.");
+ ```
+
+ This could then be sent to an IoT service to alert if the stock levels are low.
+
+1. Upload and run your code. Point the camera at objects on a shelf and press the C button. Try adjusting the `overlap_threshold` value to see predictions being ignored.
+
+ ```output
+ Connecting to WiFi..
+ Connected!
+ Image captured
+ Image read to buffer with length 17416
+ tomato paste: 35.84% {"left":0.395631,"top":0.215897,"width":0.180768,"height":0.359364}
+ tomato paste: 35.87% {"left":0.378554,"top":0.583012,"width":0.14824,"height":0.359382}
+ tomato paste: 34.11% {"left":0.699024,"top":0.592617,"width":0.124411,"height":0.350456}
+ tomato paste: 35.16% {"left":0.513006,"top":0.647853,"width":0.187472,"height":0.325817}
+ Counted 4 stock items.
+ ```
+
+> 💁 You can find this code in the [code-count/wio-terminal](code-count/wio-terminal) folder.
+
+😀 Your stock counter program was a success!
diff --git a/5-retail/lessons/2-check-stock-device/wio-terminal-object-detector.md b/5-retail/lessons/2-check-stock-device/wio-terminal-object-detector.md
new file mode 100644
index 00000000..5c6e7fec
--- /dev/null
+++ b/5-retail/lessons/2-check-stock-device/wio-terminal-object-detector.md
@@ -0,0 +1,102 @@
+# Call your object detector from your IoT device - Wio Terminal
+
+Once your object detector has been published, it can be used from your IoT device.
+
+## Copy the image classifier project
+
+The majority of your stock detector is the same as the image classifier you created in a previous lesson.
+
+### Task - copy the image classifier project
+
+1. Connect your ArduCam to your Wio Terminal, following the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/wio-terminal-camera.md#task---connect-the-camera).
+
+ You might also want to fix the camera in a single position, for example, by hanging the cable over a box or can, or fixing the camera to a box with double-sided tape.
+
+1. Create a brand new Wio Terminal project using PlatformIO. Call this project `stock-counter`.
+
+1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---capture-an-image-using-an-iot-device) to capture images from the camera.
+
+1. Replicate the steps from [lesson 2 of the manufacturing project](../../../4-manufacturing/lessons/2-check-fruit-from-device/README.md#task---classify-images-from-your-iot-device) to call the image classifier. The majority of this code will be re-used to detect objects.
+
+## Change the code from a classifier to an image detector
+
+The code you used to classify images is very similar to the code to detect objects. The main difference is the URL that is called (the one you obtained from Custom Vision), and the results of the call.
+
+### Task - change the code from a classifier to an image detector
+
+1. Add the following include directive to the top of the `main.cpp` file:
+
+ ```cpp
+    #include <vector>
+ ```
+
+1. Rename the `classifyImage` function to `detectStock`, changing both the name of the function and the call to it in the `buttonPressed` function.
+
+1. Above the `detectStock` function, declare a threshold to filter out any detections that have a low probability:
+
+ ```cpp
+ const float threshold = 0.3f;
+ ```
+
+ Unlike an image classifier that only returns one result per tag, the object detector will return multiple results, so any with a low probability need to be filtered out.
+
+1. Above the `detectStock` function, declare a function to process the predictions:
+
+ ```cpp
+    void processPredictions(std::vector<JsonVariant> &predictions)
+ {
+ for(JsonVariant prediction : predictions)
+ {
+            String tag = prediction["tagName"].as<String>();
+            float probability = prediction["probability"].as<float>();
+
+ char buff[32];
+ sprintf(buff, "%s:\t%.2f%%", tag.c_str(), probability * 100.0);
+ Serial.println(buff);
+ }
+ }
+ ```
+
+ This takes a list of predictions and prints them to the serial monitor.
+
+1. In the `detectStock` function, replace the contents of the `for` loop that loops through the predictions with the following:
+
+ ```cpp
+    std::vector<JsonVariant> passed_predictions;
+
+ for(JsonVariant prediction : predictions)
+ {
+        float probability = prediction["probability"].as<float>();
+ if (probability > threshold)
+ {
+ passed_predictions.push_back(prediction);
+ }
+ }
+
+ processPredictions(passed_predictions);
+ ```
+
+    This loops through the predictions, comparing the probability to the threshold. All predictions that have a probability higher than the threshold are added to a `vector` and passed to the `processPredictions` function.
+
+1. Upload and run your code. Point the camera at objects on a shelf and press the C button. You will see the output in the serial monitor:
+
+ ```output
+ Connecting to WiFi..
+ Connected!
+ Image captured
+ Image read to buffer with length 17416
+ tomato paste: 35.84%
+ tomato paste: 35.87%
+ tomato paste: 34.11%
+ tomato paste: 35.16%
+ ```
+
+ > 💁 You may need to adjust the `threshold` to an appropriate value for your images.
+
+ You will be able to see the image that was taken, and these values in the **Predictions** tab in Custom Vision.
+
+ 
+
+> 💁 You can find this code in the [code-detect/wio-terminal](code-detect/wio-terminal) folder.
+
+😀 Your stock counter program was a success!
diff --git a/6-consumer/README.md b/6-consumer/README.md
index 09fa9331..62a89c35 100644
--- a/6-consumer/README.md
+++ b/6-consumer/README.md
@@ -1,8 +1,8 @@
# Consumer IoT - build a smart voice assistant
-The fod has been grown, driven to a processing plant, sorted for quality, sold in the store and now it's time to cook! One of the core pieces of any kitchen is a timer. Initially these started as simple hour glasses - your food was cooked when all the sand trickled down into the bottom bulb. They then went clockwork, then electric.
+The food has been grown, driven to a processing plant, sorted for quality, sold in the store and now it's time to cook! One of the core pieces of any kitchen is a timer. Initially these started as hourglasses - your food was cooked when all the sand trickled down into the bottom bulb. They then went clockwork, then electric.
-The latest iterations are now part of our smart devices. In kitchens all throughout the world you'll head chefs shouting "Hey Siri - set a 10 minute timer", or "Alexa - cancel my bread timer". No longer do you have to walk back to the kitchen to check on a timer, you can do it from your phone, or a call out across the room.
+The latest iterations are now part of our smart devices. In kitchens in homes all throughout the world you'll hear cooks shouting "Hey Siri - set a 10 minute timer", or "Alexa - cancel my bread timer". No longer do you have to walk back to the kitchen to check on a timer, you can do it from your phone, or with a call across the room.
In these 4 lessons you'll learn how to build a smart timer, using AI to recognize your voice, understand what you are asking for, and reply with information about your timer. You'll also add support for multiple languages.
@@ -12,7 +12,7 @@ In these 4 lessons you'll learn how to build a smart timer, using AI to recogniz
1. [Recognize speech with an IoT device](./lessons/1-speech-recognition/README.md)
1. [Understand language](./lessons/2-language-understanding/README.md)
-1. [Provide spoken feedback](./lessons/3-spoken-feedback/README.md)
+1. [Set a timer and provide spoken feedback](./lessons/3-spoken-feedback/README.md)
1. [Support multiple languages](./lessons/4-multiple-language-support/README.md)
## Credits
diff --git a/6-consumer/lessons/1-speech-recognition/README.md b/6-consumer/lessons/1-speech-recognition/README.md
index c94c02c8..417af583 100644
--- a/6-consumer/lessons/1-speech-recognition/README.md
+++ b/6-consumer/lessons/1-speech-recognition/README.md
@@ -1,12 +1,14 @@
# Recognize speech with an IoT device
-Add a sketchnote if possible/appropriate
+This video gives an overview of the Azure speech service, a topic that will be covered in this lesson:
-
+[](https://www.youtube.com/watch?v=iW0Fw0l3mrA)
+
+> 🎥 Click the image above to watch a video
## Pre-lecture quiz
-[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/33)
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/41)
## Introduction
@@ -87,6 +89,22 @@ These samples are taken many thousands of times per second, using well-defined s
✅ Do some research: If you use a streaming music service, what sample rate and size does it use? If you use CDs, what is the sample rate and size of CD audio?
+There are a number of different formats for audio data. You've probably heard of mp3 files - audio data that is compressed to make it smaller with little perceptible loss in quality. Uncompressed audio is often stored as a WAV file - this is a file with 44 bytes of header information, followed by raw audio data. The header contains information such as the sample rate (for example 16000 for 16KHz), the sample size (16 for 16-bit), and the number of channels. After the header, the WAV file contains the raw audio data.
+
+> 🎓 Channels refers to how many different audio streams make up the audio. For example, for stereo audio with left and right, there would be 2 channels. For 7.1 surround sound for a home theater system this would be 8.
+
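+As a rough illustration of this format, the following minimal sketch uses Python's built-in `wave` module to write one second of 16KHz, 16-bit mono audio (the file name and generated tone are just examples, not part of the lesson code). The resulting file is the 44-byte header followed by the raw samples.
+
+```python
+import math
+import os
+import struct
+import wave
+
+# One second of 16-bit mono audio at 16KHz - 16,000 samples of 2 bytes each
+sample_rate = 16000
+samples = [int(10000 * math.sin(2 * math.pi * 440 * n / sample_rate)) for n in range(sample_rate)]
+
+with wave.open('example.wav', 'wb') as wav_file:
+    wav_file.setnchannels(1)            # mono
+    wav_file.setsampwidth(2)            # 16-bit samples
+    wav_file.setframerate(sample_rate)  # 16KHz
+    wav_file.writeframes(struct.pack(f'<{len(samples)}h', *samples))
+
+print(os.path.getsize('example.wav'))  # 44 header bytes plus 32,000 bytes of audio data
+```
+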
+### Audio data size
+
+Audio data is relatively large. For example, capturing uncompressed 16-bit audio at 16KHz (a good enough rate for use with a speech to text model) takes 32KB of data for each second of audio:
+
+* 16-bit means 2 bytes per sample (1 byte is 8 bits).
+* 16KHz is 16,000 samples per second.
+* 16,000 x 2 bytes = 32,000 bytes per second.
+
+This sounds like a small amount of data, but if you are using a microcontroller with limited memory, this can be a lot. For example, the Wio Terminal has 192KB of memory, and that needs to store program code and variables. Even if your program code was tiny, you couldn't capture more than 5 seconds of audio.
+
+Microcontrollers can access additional storage, such as SD cards or flash memory. When building an IoT device that captures audio you will need to ensure not only that you have additional storage, but also that your code writes the audio captured from your microphone directly to that storage, and that when sending it to the cloud, you stream from storage to the web request. That way you can avoid running out of memory by trying to hold the entire block of audio data in memory at once.
+
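+For example, on a Raspberry Pi or virtual IoT device, one way to stream a recorded WAV file from storage to a web request without loading it all into memory is to pass an open file object to the request. This is a minimal sketch only - the URL and file name are placeholders, not part of the lesson code.
+
+```python
+import requests
+
+# Stream the recorded audio from storage to the web request in chunks,
+# rather than reading the whole file into memory first
+with open('recording.wav', 'rb') as audio_file:
+    response = requests.post('https://example.com/speech-to-text',
+                             headers={'Content-Type': 'audio/wav'},
+                             data=audio_file)
+
+print(response.status_code)
+```
+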
## Capture audio from your IoT device
Your IoT device can be connected to a microphone to capture audio, ready for conversion to text. It can also be connected to speakers to output audio. In later lessons this will be used to give audio feedback, but it is useful to set up speakers now to test the microphone.
@@ -113,7 +131,7 @@ Speech to text, or speech recognition, involves using AI to convert words in an
### Speech recognition models
-To convert speech to text, samples from the audio signal are grouped together and fed into a machine learning model based around a Recurrent Neural network (RNN). This is a type of machine learning model that can use previous data to make a decision about incoming data. For example, the RNN could detect one block of audio samples as the sound 'Hel', and when it receives another that it thinks is the sound 'lo', it can combine this with the previous sound, see that 'Hello' is a valid word and select that as the outcome.
+To convert speech to text, samples from the audio signal are grouped together and fed into a machine learning model based around a Recurrent Neural network (RNN). This is a type of machine learning model that can use previous data to make a decision about incoming data. For example, the RNN could detect one block of audio samples as the sound 'Hel', and when it receives another that it thinks is the sound 'lo', it can combine this with the previous sound, find that 'Hello' is a valid word and select that as the outcome.
ML models always accept data of the same size every time. The image classifier you built in an earlier lesson resizes images to a fixed size and processes them. The same with speech models, they have to process fixed sized audio chunks. The speech models need to be able to combine the outputs of multiple predictions to get the answer, to allow it to distinguish between 'Hi' and 'Highway', or 'flock' and 'floccinaucinihilipilification'.
@@ -143,7 +161,9 @@ To avoid the complexity of training and using a wake word model, the smart timer
## Convert speech to text
-Just like with image classification in the last project, there are pre-built AI services that can take speech as an audio file and convert it to text. Once such service is the Speech Service, part of the Cognitive Services, pre-built AI services you can use in your apps.
+
+
+Just like with image classification in an earlier project, there are pre-built AI services that can take speech as an audio file and convert it to text. One such service is the Speech Service, part of the Cognitive Services, pre-built AI services you can use in your apps.
### Task - configure a speech AI resource
@@ -180,37 +200,17 @@ Work through the relevant guide to convert speech to text on your IoT device:
* [Single-board computer - Raspberry Pi](pi-speech-to-text.md)
* [Single-board computer - Virtual device](virtual-device-speech-to-text.md)
-### Task - send converted speech to an IoT services
-
-To use the results of the speech to text conversion, you need to send it to the cloud. There it will be interpreted and responses sent back to the IoT device as commands.
-
-1. Create a new IoT Hub in the `smart-timer` resource group, and register a new device called `smart-timer`.
-
-1. Connect your IoT device to this IoT Hub using what you have learned in previous lessons, and send the speech as telemetry. Use a JSON document in this format:
-
- ```json
- {
- "speech" : ""
- }
- ```
-
- Where `` is the output from the speech to text call.
-
-1. Verify that messages are being sent by monitoring the Event Hub compatible endpoint using the `az iot hub monitor-events` command.
-
-> 💁 You can find this code in the [code-iot-hub/virtual-iot-device](code-iot-hub/virtual-iot-device), [code-iot-hub/pi](code-iot-hub/pi), or [code-iot-hub/wio-terminal](code-iot-hub/wio-terminal) folder.
-
---
## 🚀 Challenge
-Speech recognition has been around for a long time, and is continuously improving. Research the current capabilities and see how these have evolved over time, including how accurate machine transcriptions are compared to human.
+Speech recognition has been around for a long time, and is continuously improving. Research the current capabilities and compare how these have evolved over time, including how accurate machine transcriptions are compared to human transcription.
What do you think the future holds for speech recognition?
## Post-lecture quiz
-[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/42)
## Review & Self Study
diff --git a/6-consumer/lessons/1-speech-recognition/code-iot-hub/virtual-iot-device/smart-timer/app.py b/6-consumer/lessons/1-speech-recognition/code-iot-hub/virtual-iot-device/smart-timer/app.py
deleted file mode 100644
index 59df6635..00000000
--- a/6-consumer/lessons/1-speech-recognition/code-iot-hub/virtual-iot-device/smart-timer/app.py
+++ /dev/null
@@ -1,33 +0,0 @@
-import json
-import time
-from azure.cognitiveservices.speech import SpeechConfig, SpeechRecognizer
-from azure.iot.device import IoTHubDeviceClient, Message
-
-api_key = ''
-location = ''
-language = ''
-connection_string = ''
-
-device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
-
-print('Connecting')
-device_client.connect()
-print('Connected')
-
-speech_config = SpeechConfig(subscription=api_key,
- region=location,
- speech_recognition_language=language)
-
-recognizer = SpeechRecognizer(speech_config=speech_config)
-
-def recognized(args):
- if len(args.result.text) > 0:
- message = Message(json.dumps({ 'speech': args.result.text }))
- device_client.send_message(message)
-
-recognizer.recognized.connect(recognized)
-
-recognizer.start_continuous_recognition()
-
-while True:
- time.sleep(1)
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/include/README b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/include/README
new file mode 100644
index 00000000..194dcd43
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/include/README
@@ -0,0 +1,39 @@
+
+This directory is intended for project header files.
+
+A header file is a file containing C declarations and macro definitions
+to be shared between several project source files. You request the use of a
+header file in your project source file (C, C++, etc) located in `src` folder
+by including it, with the C preprocessing directive `#include'.
+
+```src/main.c
+
+#include "header.h"
+
+int main (void)
+{
+ ...
+}
+```
+
+Including a header file produces the same results as copying the header file
+into each source file that needs it. Such copying would be time-consuming
+and error-prone. With a header file, the related declarations appear
+in only one place. If they need to be changed, they can be changed in one
+place, and programs that include the header file will automatically use the
+new version when next recompiled. The header file eliminates the labor of
+finding and changing all the copies as well as the risk that a failure to
+find one copy will result in inconsistencies within a program.
+
+In C, the usual convention is to give header files names that end with `.h'.
+It is most portable to use only letters, digits, dashes, and underscores in
+header file names, and at most one dot.
+
+Read more about using header files in official GCC documentation:
+
+* Include Syntax
+* Include Operation
+* Once-Only Headers
+* Computed Includes
+
+https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/lib/README b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/lib/README
new file mode 100644
index 00000000..6debab1e
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/lib/README
@@ -0,0 +1,46 @@
+
+This directory is intended for project specific (private) libraries.
+PlatformIO will compile them to static libraries and link into executable file.
+
+The source code of each library should be placed in its own separate directory
+("lib/your_library_name/[here are source files]").
+
+For example, see a structure of the following two libraries `Foo` and `Bar`:
+
+|--lib
+| |
+| |--Bar
+| | |--docs
+| | |--examples
+| | |--src
+| | |- Bar.c
+| | |- Bar.h
+| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
+| |
+| |--Foo
+| | |- Foo.c
+| | |- Foo.h
+| |
+| |- README --> THIS FILE
+|
+|- platformio.ini
+|--src
+ |- main.c
+
+and a contents of `src/main.c`:
+```
+#include <Foo.h>
+#include <Bar.h>
+
+int main (void)
+{
+ ...
+}
+
+```
+
+PlatformIO Library Dependency Finder will find automatically dependent
+libraries scanning project source files.
+
+More information about PlatformIO Library Dependency Finder
+- https://docs.platformio.org/page/librarymanager/ldf.html
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/platformio.ini b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/platformio.ini
new file mode 100644
index 00000000..c5999f17
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/platformio.ini
@@ -0,0 +1,19 @@
+; PlatformIO Project Configuration File
+;
+; Build options: build flags, source filter
+; Upload options: custom upload port, speed and extra flags
+; Library options: dependencies, extra library storages
+; Advanced options: extra scripting
+;
+; Please visit documentation for the other options and examples
+; https://docs.platformio.org/page/projectconf.html
+
+[env:seeed_wio_terminal]
+platform = atmelsam
+board = seeed_wio_terminal
+framework = arduino
+lib_deps =
+ seeed-studio/Seeed Arduino FS @ 2.0.3
+ seeed-studio/Seeed Arduino SFUD @ 2.0.1
+build_flags =
+ -DSFUD_USING_QSPI
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/flash_writer.h b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/flash_writer.h
new file mode 100644
index 00000000..87fdff29
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/flash_writer.h
@@ -0,0 +1,60 @@
+#pragma once
+
+#include <Arduino.h>
+#include <sfud.h>
+
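+// Buffers audio bytes in memory and writes them to the SPI flash one erase block at a time using the SFUD library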
+class FlashWriter
+{
+public:
+ void init()
+ {
+ _flash = sfud_get_device_table() + 0;
+ _sfudBufferSize = _flash->chip.erase_gran;
+ _sfudBuffer = new byte[_sfudBufferSize];
+ _sfudBufferPos = 0;
+ _sfudBufferWritePos = 0;
+ }
+
+ void reset()
+ {
+ _sfudBufferPos = 0;
+ _sfudBufferWritePos = 0;
+ }
+
+ void writeSfudBuffer(byte b)
+ {
+ _sfudBuffer[_sfudBufferPos++] = b;
+ if (_sfudBufferPos == _sfudBufferSize)
+ {
+ sfud_erase_write(_flash, _sfudBufferWritePos, _sfudBufferSize, _sfudBuffer);
+ _sfudBufferWritePos += _sfudBufferSize;
+ _sfudBufferPos = 0;
+ }
+ }
+
+ void flushSfudBuffer()
+ {
+ if (_sfudBufferPos > 0)
+ {
+ sfud_erase_write(_flash, _sfudBufferWritePos, _sfudBufferSize, _sfudBuffer);
+ _sfudBufferWritePos += _sfudBufferSize;
+ _sfudBufferPos = 0;
+ }
+ }
+
+ void writeSfudBuffer(byte *b, size_t len)
+ {
+ for (size_t i = 0; i < len; ++i)
+ {
+ writeSfudBuffer(b[i]);
+ }
+ }
+
+private:
+ byte *_sfudBuffer;
+ size_t _sfudBufferSize;
+ size_t _sfudBufferPos;
+ size_t _sfudBufferWritePos;
+
+ const sfud_flash *_flash;
+};
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/main.cpp b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/main.cpp
new file mode 100644
index 00000000..0f77c9bd
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/main.cpp
@@ -0,0 +1,49 @@
+#include <Arduino.h>
+#include <sfud.h>
+#include <SPI.h>
+
+#include "mic.h"
+
+void setup()
+{
+ Serial.begin(9600);
+
+ while (!Serial)
+ ; // Wait for Serial to be ready
+
+ delay(1000);
+
+ while (!(sfud_init() == SFUD_SUCCESS))
+ ;
+
+ sfud_qspi_fast_read_enable(sfud_get_device(SFUD_W25Q32_DEVICE_INDEX), 2);
+
+ pinMode(WIO_KEY_C, INPUT_PULLUP);
+
+ mic.init();
+
+ Serial.println("Ready.");
+}
+
+void processAudio()
+{
+
+}
+
+void loop()
+{
+ if (digitalRead(WIO_KEY_C) == LOW && !mic.isRecording())
+ {
+ Serial.println("Starting recording...");
+ mic.startRecording();
+ }
+
+ if (!mic.isRecording() && mic.isRecordingReady())
+ {
+ Serial.println("Finished recording");
+
+ processAudio();
+
+ mic.reset();
+ }
+}
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/mic.h b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/mic.h
new file mode 100644
index 00000000..ecdeb418
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/src/mic.h
@@ -0,0 +1,248 @@
+#pragma once
+
+#include <Arduino.h>
+
+#include "flash_writer.h"
+
+#define RATE 16000
+#define SAMPLE_LENGTH_SECONDS 4
+#define SAMPLES RATE * SAMPLE_LENGTH_SECONDS
+#define BUFFER_SIZE (SAMPLES * 2) + 44
+#define ADC_BUF_LEN 1600
+
+class Mic
+{
+public:
+ Mic()
+ {
+ _isRecording = false;
+ _isRecordingReady = false;
+ }
+
+ void startRecording()
+ {
+ _isRecording = true;
+ _isRecordingReady = false;
+ }
+
+ bool isRecording()
+ {
+ return _isRecording;
+ }
+
+ bool isRecordingReady()
+ {
+ return _isRecordingReady;
+ }
+
+ void init()
+ {
+ analogReference(AR_INTERNAL2V23);
+
+ _writer.init();
+
+ initBufferHeader();
+ configureDmaAdc();
+ }
+
+ void reset()
+ {
+ _isRecordingReady = false;
+ _isRecording = false;
+
+ _writer.reset();
+
+ initBufferHeader();
+ }
+
+ void dmaHandler()
+ {
+ static uint8_t count = 0;
+ static uint16_t idx = 0;
+
+ if (DMAC->Channel[1].CHINTFLAG.bit.SUSP)
+ {
+ DMAC->Channel[1].CHCTRLB.reg = DMAC_CHCTRLB_CMD_RESUME;
+ DMAC->Channel[1].CHINTFLAG.bit.SUSP = 1;
+
+ if (count)
+ {
+ audioCallback(_adc_buf_0, ADC_BUF_LEN);
+ }
+ else
+ {
+ audioCallback(_adc_buf_1, ADC_BUF_LEN);
+ }
+
+ count = (count + 1) % 2;
+ }
+ }
+
+private:
+ volatile bool _isRecording;
+ volatile bool _isRecordingReady;
+ FlashWriter _writer;
+
+typedef struct
+ {
+ uint16_t btctrl;
+ uint16_t btcnt;
+ uint32_t srcaddr;
+ uint32_t dstaddr;
+ uint32_t descaddr;
+ } dmacdescriptor;
+
+ // Globals - DMA and ADC
+ volatile dmacdescriptor _wrb[DMAC_CH_NUM] __attribute__((aligned(16)));
+ dmacdescriptor _descriptor_section[DMAC_CH_NUM] __attribute__((aligned(16)));
+ dmacdescriptor _descriptor __attribute__((aligned(16)));
+
+ void configureDmaAdc()
+ {
+ // Configure DMA to sample from ADC at a regular interval (triggered by timer/counter)
+ DMAC->BASEADDR.reg = (uint32_t)_descriptor_section; // Specify the location of the descriptors
+ DMAC->WRBADDR.reg = (uint32_t)_wrb; // Specify the location of the write back descriptors
+ DMAC->CTRL.reg = DMAC_CTRL_DMAENABLE | DMAC_CTRL_LVLEN(0xf); // Enable the DMAC peripheral
+ DMAC->Channel[1].CHCTRLA.reg = DMAC_CHCTRLA_TRIGSRC(TC5_DMAC_ID_OVF) | // Set DMAC to trigger on TC5 timer overflow
+ DMAC_CHCTRLA_TRIGACT_BURST; // DMAC burst transfer
+
+ _descriptor.descaddr = (uint32_t)&_descriptor_section[1]; // Set up a circular descriptor
+ _descriptor.srcaddr = (uint32_t)&ADC1->RESULT.reg; // Take the result from the ADC0 RESULT register
+ _descriptor.dstaddr = (uint32_t)_adc_buf_0 + sizeof(uint16_t) * ADC_BUF_LEN; // Place it in the adc_buf_0 array
+ _descriptor.btcnt = ADC_BUF_LEN; // Beat count
+ _descriptor.btctrl = DMAC_BTCTRL_BEATSIZE_HWORD | // Beat size is HWORD (16-bits)
+ DMAC_BTCTRL_DSTINC | // Increment the destination address
+ DMAC_BTCTRL_VALID | // Descriptor is valid
+ DMAC_BTCTRL_BLOCKACT_SUSPEND; // Suspend DMAC channel 0 after block transfer
+ memcpy(&_descriptor_section[0], &_descriptor, sizeof(_descriptor)); // Copy the descriptor to the descriptor section
+
+ _descriptor.descaddr = (uint32_t)&_descriptor_section[0]; // Set up a circular descriptor
+ _descriptor.srcaddr = (uint32_t)&ADC1->RESULT.reg; // Take the result from the ADC0 RESULT register
+ _descriptor.dstaddr = (uint32_t)_adc_buf_1 + sizeof(uint16_t) * ADC_BUF_LEN; // Place it in the adc_buf_1 array
+ _descriptor.btcnt = ADC_BUF_LEN; // Beat count
+ _descriptor.btctrl = DMAC_BTCTRL_BEATSIZE_HWORD | // Beat size is HWORD (16-bits)
+ DMAC_BTCTRL_DSTINC | // Increment the destination address
+ DMAC_BTCTRL_VALID | // Descriptor is valid
+ DMAC_BTCTRL_BLOCKACT_SUSPEND; // Suspend DMAC channel 0 after block transfer
+ memcpy(&_descriptor_section[1], &_descriptor, sizeof(_descriptor)); // Copy the descriptor to the descriptor section
+
+ // Configure NVIC
+ NVIC_SetPriority(DMAC_1_IRQn, 0); // Set the Nested Vector Interrupt Controller (NVIC) priority for DMAC1 to 0 (highest)
+ NVIC_EnableIRQ(DMAC_1_IRQn); // Connect DMAC1 to Nested Vector Interrupt Controller (NVIC)
+
+ // Activate the suspend (SUSP) interrupt on DMAC channel 1
+ DMAC->Channel[1].CHINTENSET.reg = DMAC_CHINTENSET_SUSP;
+
+ // Configure ADC
+ ADC1->INPUTCTRL.bit.MUXPOS = ADC_INPUTCTRL_MUXPOS_AIN12_Val; // Set the analog input to ADC0/AIN2 (PB08 - A4 on Metro M4)
+ while (ADC1->SYNCBUSY.bit.INPUTCTRL)
+ ; // Wait for synchronization
+ ADC1->SAMPCTRL.bit.SAMPLEN = 0x00; // Set max Sampling Time Length to half divided ADC clock pulse (2.66us)
+ while (ADC1->SYNCBUSY.bit.SAMPCTRL)
+ ; // Wait for synchronization
+ ADC1->CTRLA.reg = ADC_CTRLA_PRESCALER_DIV128; // Divide Clock ADC GCLK by 128 (48MHz/128 = 375kHz)
+ ADC1->CTRLB.reg = ADC_CTRLB_RESSEL_12BIT | // Set ADC resolution to 12 bits
+ ADC_CTRLB_FREERUN; // Set ADC to free run mode
+ while (ADC1->SYNCBUSY.bit.CTRLB)
+ ; // Wait for synchronization
+ ADC1->CTRLA.bit.ENABLE = 1; // Enable the ADC
+ while (ADC1->SYNCBUSY.bit.ENABLE)
+ ; // Wait for synchronization
+ ADC1->SWTRIG.bit.START = 1; // Initiate a software trigger to start an ADC conversion
+ while (ADC1->SYNCBUSY.bit.SWTRIG)
+ ; // Wait for synchronization
+
+ // Enable DMA channel 1
+ DMAC->Channel[1].CHCTRLA.bit.ENABLE = 1;
+
+ // Configure Timer/Counter 5
+ GCLK->PCHCTRL[TC5_GCLK_ID].reg = GCLK_PCHCTRL_CHEN | // Enable perhipheral channel for TC5
+ GCLK_PCHCTRL_GEN_GCLK1; // Connect generic clock 0 at 48MHz
+
+ TC5->COUNT16.WAVE.reg = TC_WAVE_WAVEGEN_MFRQ; // Set TC5 to Match Frequency (MFRQ) mode
+ TC5->COUNT16.CC[0].reg = 3000 - 1; // Set the trigger to 16 kHz: (4Mhz / 16000) - 1
+ while (TC5->COUNT16.SYNCBUSY.bit.CC0)
+ ; // Wait for synchronization
+
+ // Start Timer/Counter 5
+ TC5->COUNT16.CTRLA.bit.ENABLE = 1; // Enable the TC5 timer
+ while (TC5->COUNT16.SYNCBUSY.bit.ENABLE)
+ ; // Wait for synchronization
+ }
+
+ uint16_t _adc_buf_0[ADC_BUF_LEN];
+ uint16_t _adc_buf_1[ADC_BUF_LEN];
+
+ // WAV files have a header. This struct defines that header
+ struct wavFileHeader
+ {
+ char riff[4]; /* "RIFF" */
+ long flength; /* file length in bytes */
+ char wave[4]; /* "WAVE" */
+ char fmt[4]; /* "fmt " */
+ long chunk_size; /* size of FMT chunk in bytes (usually 16) */
+ short format_tag; /* 1=PCM, 257=Mu-Law, 258=A-Law, 259=ADPCM */
+ short num_chans; /* 1=mono, 2=stereo */
+ long srate; /* Sampling rate in samples per second */
+ long bytes_per_sec; /* bytes per second = srate*bytes_per_samp */
+ short bytes_per_samp; /* 2=16-bit mono, 4=16-bit stereo */
+ short bits_per_samp; /* Number of bits per sample */
+ char data[4]; /* "data" */
+ long dlength; /* data length in bytes (filelength - 44) */
+ };
+
+ void initBufferHeader()
+ {
+ wavFileHeader wavh;
+
+ strncpy(wavh.riff, "RIFF", 4);
+ strncpy(wavh.wave, "WAVE", 4);
+ strncpy(wavh.fmt, "fmt ", 4);
+ strncpy(wavh.data, "data", 4);
+
+ wavh.chunk_size = 16;
+ wavh.format_tag = 1; // PCM
+ wavh.num_chans = 1; // mono
+ wavh.srate = RATE;
+ wavh.bytes_per_sec = (RATE * 1 * 16 * 1) / 8;
+ wavh.bytes_per_samp = 2;
+ wavh.bits_per_samp = 16;
+ wavh.dlength = RATE * 2 * 1 * 16 / 2;
+ wavh.flength = wavh.dlength + 44;
+
+ _writer.writeSfudBuffer((byte *)&wavh, 44);
+ }
+
+ void audioCallback(uint16_t *buf, uint32_t buf_len)
+ {
+ static uint32_t idx = 44;
+
+ if (_isRecording)
+ {
+ for (uint32_t i = 0; i < buf_len; i++)
+ {
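+                // Scale the 12-bit unsigned ADC reading (centred on 2048) to a signed 16-bit PCM sample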
+ int16_t audio_value = ((int16_t)buf[i] - 2048) * 16;
+
+ _writer.writeSfudBuffer(audio_value & 0xFF);
+ _writer.writeSfudBuffer((audio_value >> 8) & 0xFF);
+ }
+
+ idx += buf_len;
+
+ if (idx >= BUFFER_SIZE)
+ {
+ _writer.flushSfudBuffer();
+ idx = 44;
+ _isRecording = false;
+ _isRecordingReady = true;
+ }
+ }
+ }
+};
+
+Mic mic;
+
+void DMAC_1_Handler()
+{
+ mic.dmaHandler();
+}
diff --git a/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/test/README b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/test/README
new file mode 100644
index 00000000..b94d0890
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-record/wio-terminal/smart-timer/test/README
@@ -0,0 +1,11 @@
+
+This directory is intended for PlatformIO Unit Testing and project tests.
+
+Unit Testing is a software testing method by which individual units of
+source code, sets of one or more MCU program modules together with associated
+control data, usage procedures, and operating procedures, are tested to
+determine whether they are fit for use. Unit testing finds problems early
+in the development cycle.
+
+More information about PlatformIO Unit Testing:
+- https://docs.platformio.org/page/plus/unit-testing.html
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/pi/smart-timer/app.py b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/pi/smart-timer/app.py
index 64eb2990..b3bd252a 100644
--- a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/pi/smart-timer/app.py
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/pi/smart-timer/app.py
@@ -1,5 +1,4 @@
import io
-import json
import pyaudio
import requests
import time
@@ -39,13 +38,13 @@ def capture_audio():
return wav_buffer
-api_key = ''
+speech_api_key = ''
location = ''
language = ''
def get_access_token():
headers = {
- 'Ocp-Apim-Subscription-Key': api_key
+ 'Ocp-Apim-Subscription-Key': speech_api_key
}
token_endpoint = f'https://{location}.api.cognitive.microsoft.com/sts/v1.0/issuetoken'
@@ -66,17 +65,20 @@ def convert_speech_to_text(buffer):
}
response = requests.post(url, headers=headers, params=params, data=buffer)
- response_json = json.loads(response.text)
+ response_json = response.json()
if response_json['RecognitionStatus'] == 'Success':
return response_json['DisplayText']
else:
return ''
+def process_text(text):
+ print(text)
+
while True:
while not button.is_pressed():
time.sleep(.1)
buffer = capture_audio()
text = convert_speech_to_text(buffer)
- print(text)
\ No newline at end of file
+ process_text(text)
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/virtual-iot-device/smart-timer/app.py b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/virtual-iot-device/smart-timer/app.py
index 15632657..4c9ea0a1 100644
--- a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/virtual-iot-device/smart-timer/app.py
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/virtual-iot-device/smart-timer/app.py
@@ -1,18 +1,21 @@
import time
from azure.cognitiveservices.speech import SpeechConfig, SpeechRecognizer
-api_key = ''
+speech_api_key = ''
location = ''
language = ''
-speech_config = SpeechConfig(subscription=api_key,
- region=location,
- speech_recognition_language=language)
+recognizer_config = SpeechConfig(subscription=speech_api_key,
+ region=location,
+ speech_recognition_language=language)
-recognizer = SpeechRecognizer(speech_config=speech_config)
+recognizer = SpeechRecognizer(speech_config=recognizer_config)
+
+def process_text(text):
+ print(text)
def recognized(args):
- print(args.result.text)
+ process_text(args.result.text)
recognizer.recognized.connect(recognized)
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/include/README b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/include/README
new file mode 100644
index 00000000..194dcd43
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/include/README
@@ -0,0 +1,39 @@
+
+This directory is intended for project header files.
+
+A header file is a file containing C declarations and macro definitions
+to be shared between several project source files. You request the use of a
+header file in your project source file (C, C++, etc) located in `src` folder
+by including it, with the C preprocessing directive `#include'.
+
+```src/main.c
+
+#include "header.h"
+
+int main (void)
+{
+ ...
+}
+```
+
+Including a header file produces the same results as copying the header file
+into each source file that needs it. Such copying would be time-consuming
+and error-prone. With a header file, the related declarations appear
+in only one place. If they need to be changed, they can be changed in one
+place, and programs that include the header file will automatically use the
+new version when next recompiled. The header file eliminates the labor of
+finding and changing all the copies as well as the risk that a failure to
+find one copy will result in inconsistencies within a program.
+
+In C, the usual convention is to give header files names that end with `.h'.
+It is most portable to use only letters, digits, dashes, and underscores in
+header file names, and at most one dot.
+
+Read more about using header files in official GCC documentation:
+
+* Include Syntax
+* Include Operation
+* Once-Only Headers
+* Computed Includes
+
+https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/lib/README b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/lib/README
new file mode 100644
index 00000000..6debab1e
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/lib/README
@@ -0,0 +1,46 @@
+
+This directory is intended for project specific (private) libraries.
+PlatformIO will compile them to static libraries and link into executable file.
+
+The source code of each library should be placed in its own separate directory
+("lib/your_library_name/[here are source files]").
+
+For example, see a structure of the following two libraries `Foo` and `Bar`:
+
+|--lib
+| |
+| |--Bar
+| | |--docs
+| | |--examples
+| | |--src
+| | |- Bar.c
+| | |- Bar.h
+| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
+| |
+| |--Foo
+| | |- Foo.c
+| | |- Foo.h
+| |
+| |- README --> THIS FILE
+|
+|- platformio.ini
+|--src
+ |- main.c
+
+and a contents of `src/main.c`:
+```
+#include <Foo.h>
+#include <Bar.h>
+
+int main (void)
+{
+ ...
+}
+
+```
+
+PlatformIO Library Dependency Finder will find automatically dependent
+libraries scanning project source files.
+
+More information about PlatformIO Library Dependency Finder
+- https://docs.platformio.org/page/librarymanager/ldf.html
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/platformio.ini b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/platformio.ini
new file mode 100644
index 00000000..5adbe733
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/platformio.ini
@@ -0,0 +1,22 @@
+; PlatformIO Project Configuration File
+;
+; Build options: build flags, source filter
+; Upload options: custom upload port, speed and extra flags
+; Library options: dependencies, extra library storages
+; Advanced options: extra scripting
+;
+; Please visit documentation for the other options and examples
+; https://docs.platformio.org/page/projectconf.html
+
+[env:seeed_wio_terminal]
+platform = atmelsam
+board = seeed_wio_terminal
+framework = arduino
+lib_deps =
+ seeed-studio/Seeed Arduino FS @ 2.0.3
+ seeed-studio/Seeed Arduino SFUD @ 2.0.1
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
+ seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
+ seeed-studio/Seeed Arduino RTC @ 2.0.0
+ bblanchon/ArduinoJson @ 6.17.3
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/config.h b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/config.h
new file mode 100644
index 00000000..cca25e6a
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/config.h
@@ -0,0 +1,89 @@
+#pragma once
+
+#define RATE 16000
+#define SAMPLE_LENGTH_SECONDS 4
+#define SAMPLES RATE * SAMPLE_LENGTH_SECONDS
+#define BUFFER_SIZE (SAMPLES * 2) + 44
+#define ADC_BUF_LEN 1600
+
+const char *SSID = "";
+const char *PASSWORD = "";
+
+const char *SPEECH_API_KEY = "";
+const char *SPEECH_LOCATION = "";
+const char *LANGUAGE = "";
+
+const char *TOKEN_URL = "https://%s.api.cognitive.microsoft.com/sts/v1.0/issuetoken";
+const char *SPEECH_URL = "https://%s.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=%s";
+
+const char *TOKEN_CERTIFICATE =
+ "-----BEGIN CERTIFICATE-----\r\n"
+ "MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
+ "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
+ "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
+ "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
+ "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
+ "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
+ "ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
+ "wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
+ "iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
+ "ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
+ "aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
+ "0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
+ "gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
+ "sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
+ "lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
+ "N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
+ "Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
+ "AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
+ "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
+ "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
+ "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
+ "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
+ "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
+ "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
+ "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
+ "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
+ "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
+ "+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
+ "cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
+ "kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
+ "trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
+ "8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
+ "-----END CERTIFICATE-----\r\n";
+
+const char *SPEECH_CERTIFICATE =
+ "-----BEGIN CERTIFICATE-----\r\n"
+ "MIIF8zCCBNugAwIBAgIQCq+mxcpjxFFB6jvh98dTFzANBgkqhkiG9w0BAQwFADBh\r\n"
+ "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
+ "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
+ "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
+ "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
+ "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwMTCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
+ "ggIPADCCAgoCggIBAMedcDrkXufP7pxVm1FHLDNA9IjwHaMoaY8arqqZ4Gff4xyr\r\n"
+ "RygnavXL7g12MPAx8Q6Dd9hfBzrfWxkF0Br2wIvlvkzW01naNVSkHp+OS3hL3W6n\r\n"
+ "l/jYvZnVeJXjtsKYcXIf/6WtspcF5awlQ9LZJcjwaH7KoZuK+THpXCMtzD8XNVdm\r\n"
+ "GW/JI0C/7U/E7evXn9XDio8SYkGSM63aLO5BtLCv092+1d4GGBSQYolRq+7Pd1kR\r\n"
+ "EkWBPm0ywZ2Vb8GIS5DLrjelEkBnKCyy3B0yQud9dpVsiUeE7F5sY8Me96WVxQcb\r\n"
+ "OyYdEY/j/9UpDlOG+vA+YgOvBhkKEjiqygVpP8EZoMMijephzg43b5Qi9r5UrvYo\r\n"
+ "o19oR/8pf4HJNDPF0/FJwFVMW8PmCBLGstin3NE1+NeWTkGt0TzpHjgKyfaDP2tO\r\n"
+ "4bCk1G7pP2kDFT7SYfc8xbgCkFQ2UCEXsaH/f5YmpLn4YPiNFCeeIida7xnfTvc4\r\n"
+ "7IxyVccHHq1FzGygOqemrxEETKh8hvDR6eBdrBwmCHVgZrnAqnn93JtGyPLi6+cj\r\n"
+ "WGVGtMZHwzVvX1HvSFG771sskcEjJxiQNQDQRWHEh3NxvNb7kFlAXnVdRkkvhjpR\r\n"
+ "GchFhTAzqmwltdWhWDEyCMKC2x/mSZvZtlZGY+g37Y72qHzidwtyW7rBetZJAgMB\r\n"
+ "AAGjggGtMIIBqTAdBgNVHQ4EFgQUDyBd16FXlduSzyvQx8J3BM5ygHYwHwYDVR0j\r\n"
+ "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
+ "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
+ "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
+ "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
+ "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
+ "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
+ "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
+ "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
+ "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQAlFvNh7QgXVLAZSsNR2XRmIn9iS8OHFCBA\r\n"
+ "WxKJoi8YYQafpMTkMqeuzoL3HWb1pYEipsDkhiMnrpfeYZEA7Lz7yqEEtfgHcEBs\r\n"
+ "K9KcStQGGZRfmWU07hPXHnFz+5gTXqzCE2PBMlRgVUYJiA25mJPXfB00gDvGhtYa\r\n"
+ "+mENwM9Bq1B9YYLyLjRtUz8cyGsdyTIG/bBM/Q9jcV8JGqMU/UjAdh1pFyTnnHEl\r\n"
+ "Y59Npi7F87ZqYYJEHJM2LGD+le8VsHjgeWX2CJQko7klXvcizuZvUEDTjHaQcs2J\r\n"
+ "+kPgfyMIOY1DMJ21NxOJ2xPRC/wAh/hzSBRVtoAnyuxtkZ4VjIOh\r\n"
+ "-----END CERTIFICATE-----\r\n";
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/flash_stream.h b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/flash_stream.h
new file mode 100644
index 00000000..b841f1d0
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/flash_stream.h
@@ -0,0 +1,69 @@
+#pragma once
+
+#include
+#include
+#include
+
+#include "config.h"
+
+class FlashStream : public Stream
+{
+public:
+ FlashStream()
+ {
+ _pos = 0;
+ _flash_address = 0;
+ _flash = sfud_get_device_table() + 0;
+
+ populateBuffer();
+ }
+
+ virtual size_t write(uint8_t val)
+ {
+ return 0;
+ }
+
+ virtual int available()
+ {
+ int remaining = BUFFER_SIZE - ((_flash_address - HTTP_TCP_BUFFER_SIZE) + _pos);
+ int bytes_available = min(HTTP_TCP_BUFFER_SIZE, remaining);
+
+ if (bytes_available == 0)
+ {
+ bytes_available = -1;
+ }
+
+ return bytes_available;
+ }
+
+ virtual int read()
+ {
+ int retVal = _buffer[_pos++];
+
+ if (_pos == HTTP_TCP_BUFFER_SIZE)
+ {
+ populateBuffer();
+ }
+
+ return retVal;
+ }
+
+ virtual int peek()
+ {
+ return _buffer[_pos];
+ }
+
+private:
+ void populateBuffer()
+ {
+ sfud_read(_flash, _flash_address, HTTP_TCP_BUFFER_SIZE, _buffer);
+ _flash_address += HTTP_TCP_BUFFER_SIZE;
+ _pos = 0;
+ }
+
+ size_t _pos;
+ size_t _flash_address;
+ const sfud_flash *_flash;
+
+ byte _buffer[HTTP_TCP_BUFFER_SIZE];
+};
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/flash_writer.h b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/flash_writer.h
new file mode 100644
index 00000000..87fdff29
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/flash_writer.h
@@ -0,0 +1,60 @@
+#pragma once
+
+#include
+#include
+
+class FlashWriter
+{
+public:
+ void init()
+ {
+ _flash = sfud_get_device_table() + 0;
+ _sfudBufferSize = _flash->chip.erase_gran;
+ _sfudBuffer = new byte[_sfudBufferSize];
+ _sfudBufferPos = 0;
+ _sfudBufferWritePos = 0;
+ }
+
+ void reset()
+ {
+ _sfudBufferPos = 0;
+ _sfudBufferWritePos = 0;
+ }
+
+ void writeSfudBuffer(byte b)
+ {
+ _sfudBuffer[_sfudBufferPos++] = b;
+ if (_sfudBufferPos == _sfudBufferSize)
+ {
+ sfud_erase_write(_flash, _sfudBufferWritePos, _sfudBufferSize, _sfudBuffer);
+ _sfudBufferWritePos += _sfudBufferSize;
+ _sfudBufferPos = 0;
+ }
+ }
+
+ void flushSfudBuffer()
+ {
+ if (_sfudBufferPos > 0)
+ {
+ sfud_erase_write(_flash, _sfudBufferWritePos, _sfudBufferSize, _sfudBuffer);
+ _sfudBufferWritePos += _sfudBufferSize;
+ _sfudBufferPos = 0;
+ }
+ }
+
+ void writeSfudBuffer(byte *b, size_t len)
+ {
+ for (size_t i = 0; i < len; ++i)
+ {
+ writeSfudBuffer(b[i]);
+ }
+ }
+
+private:
+ byte *_sfudBuffer;
+ size_t _sfudBufferSize;
+ size_t _sfudBufferPos;
+ size_t _sfudBufferWritePos;
+
+ const sfud_flash *_flash;
+};
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/main.cpp b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/main.cpp
new file mode 100644
index 00000000..37924a60
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/main.cpp
@@ -0,0 +1,69 @@
+#include
+#include
+#include
+#include
+
+#include "config.h"
+#include "mic.h"
+#include "speech_to_text.h"
+
+void connectWiFi()
+{
+ while (WiFi.status() != WL_CONNECTED)
+ {
+ Serial.println("Connecting to WiFi..");
+ WiFi.begin(SSID, PASSWORD);
+ delay(500);
+ }
+
+ Serial.println("Connected!");
+}
+
+void setup()
+{
+ Serial.begin(9600);
+
+ while (!Serial)
+ ; // Wait for Serial to be ready
+
+ delay(1000);
+
+ connectWiFi();
+
+ while (!(sfud_init() == SFUD_SUCCESS))
+ ;
+
+ sfud_qspi_fast_read_enable(sfud_get_device(SFUD_W25Q32_DEVICE_INDEX), 2);
+
+ pinMode(WIO_KEY_C, INPUT_PULLUP);
+
+ mic.init();
+
+ speechToText.init();
+
+ Serial.println("Ready.");
+}
+
+void processAudio()
+{
+ String text = speechToText.convertSpeechToText();
+ Serial.println(text);
+}
+
+void loop()
+{
+ if (digitalRead(WIO_KEY_C) == LOW && !mic.isRecording())
+ {
+ Serial.println("Starting recording...");
+ mic.startRecording();
+ }
+
+ if (!mic.isRecording() && mic.isRecordingReady())
+ {
+ Serial.println("Finished recording");
+
+ processAudio();
+
+ mic.reset();
+ }
+}
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/mic.h b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/mic.h
new file mode 100644
index 00000000..5f0815de
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/mic.h
@@ -0,0 +1,242 @@
+#pragma once
+
+#include
+
+#include "config.h"
+#include "flash_writer.h"
+
+class Mic
+{
+public:
+ Mic()
+ {
+ _isRecording = false;
+ _isRecordingReady = false;
+ }
+
+ void startRecording()
+ {
+ _isRecording = true;
+ _isRecordingReady = false;
+ }
+
+ bool isRecording()
+ {
+ return _isRecording;
+ }
+
+ bool isRecordingReady()
+ {
+ return _isRecordingReady;
+ }
+
+ void init()
+ {
+ analogReference(AR_INTERNAL2V23);
+
+ _writer.init();
+
+ initBufferHeader();
+ configureDmaAdc();
+ }
+
+ void reset()
+ {
+ _isRecordingReady = false;
+ _isRecording = false;
+
+ _writer.reset();
+
+ initBufferHeader();
+ }
+
+ void dmaHandler()
+ {
+ static uint8_t count = 0;
+
+ if (DMAC->Channel[1].CHINTFLAG.bit.SUSP)
+ {
+ DMAC->Channel[1].CHCTRLB.reg = DMAC_CHCTRLB_CMD_RESUME;
+ DMAC->Channel[1].CHINTFLAG.bit.SUSP = 1;
+
+ if (count)
+ {
+ audioCallback(_adc_buf_0, ADC_BUF_LEN);
+ }
+ else
+ {
+ audioCallback(_adc_buf_1, ADC_BUF_LEN);
+ }
+
+ count = (count + 1) % 2;
+ }
+ }
+
+private:
+ volatile bool _isRecording;
+ volatile bool _isRecordingReady;
+ FlashWriter _writer;
+
+typedef struct
+ {
+ uint16_t btctrl;
+ uint16_t btcnt;
+ uint32_t srcaddr;
+ uint32_t dstaddr;
+ uint32_t descaddr;
+ } dmacdescriptor;
+
+ // Globals - DMA and ADC
+ volatile dmacdescriptor _wrb[DMAC_CH_NUM] __attribute__((aligned(16)));
+ dmacdescriptor _descriptor_section[DMAC_CH_NUM] __attribute__((aligned(16)));
+ dmacdescriptor _descriptor __attribute__((aligned(16)));
+
+ void configureDmaAdc()
+ {
+ // Configure DMA to sample from ADC at a regular interval (triggered by timer/counter)
+ DMAC->BASEADDR.reg = (uint32_t)_descriptor_section; // Specify the location of the descriptors
+ DMAC->WRBADDR.reg = (uint32_t)_wrb; // Specify the location of the write back descriptors
+ DMAC->CTRL.reg = DMAC_CTRL_DMAENABLE | DMAC_CTRL_LVLEN(0xf); // Enable the DMAC peripheral
+ DMAC->Channel[1].CHCTRLA.reg = DMAC_CHCTRLA_TRIGSRC(TC5_DMAC_ID_OVF) | // Set DMAC to trigger on TC5 timer overflow
+ DMAC_CHCTRLA_TRIGACT_BURST; // DMAC burst transfer
+
+ _descriptor.descaddr = (uint32_t)&_descriptor_section[1]; // Set up a circular descriptor
+ _descriptor.srcaddr = (uint32_t)&ADC1->RESULT.reg; // Take the result from the ADC0 RESULT register
+ _descriptor.dstaddr = (uint32_t)_adc_buf_0 + sizeof(uint16_t) * ADC_BUF_LEN; // Place it in the adc_buf_0 array
+ _descriptor.btcnt = ADC_BUF_LEN; // Beat count
+ _descriptor.btctrl = DMAC_BTCTRL_BEATSIZE_HWORD | // Beat size is HWORD (16-bits)
+ DMAC_BTCTRL_DSTINC | // Increment the destination address
+ DMAC_BTCTRL_VALID | // Descriptor is valid
+ DMAC_BTCTRL_BLOCKACT_SUSPEND; // Suspend DMAC channel 0 after block transfer
+ memcpy(&_descriptor_section[0], &_descriptor, sizeof(_descriptor)); // Copy the descriptor to the descriptor section
+
+ _descriptor.descaddr = (uint32_t)&_descriptor_section[0]; // Set up a circular descriptor
+ _descriptor.srcaddr = (uint32_t)&ADC1->RESULT.reg; // Take the result from the ADC0 RESULT register
+ _descriptor.dstaddr = (uint32_t)_adc_buf_1 + sizeof(uint16_t) * ADC_BUF_LEN; // Place it in the adc_buf_1 array
+ _descriptor.btcnt = ADC_BUF_LEN; // Beat count
+ _descriptor.btctrl = DMAC_BTCTRL_BEATSIZE_HWORD | // Beat size is HWORD (16-bits)
+ DMAC_BTCTRL_DSTINC | // Increment the destination address
+ DMAC_BTCTRL_VALID | // Descriptor is valid
+ DMAC_BTCTRL_BLOCKACT_SUSPEND; // Suspend DMAC channel 0 after block transfer
+ memcpy(&_descriptor_section[1], &_descriptor, sizeof(_descriptor)); // Copy the descriptor to the descriptor section
+
+ // Configure NVIC
+ NVIC_SetPriority(DMAC_1_IRQn, 0); // Set the Nested Vector Interrupt Controller (NVIC) priority for DMAC1 to 0 (highest)
+ NVIC_EnableIRQ(DMAC_1_IRQn); // Connect DMAC1 to Nested Vector Interrupt Controller (NVIC)
+
+ // Activate the suspend (SUSP) interrupt on DMAC channel 1
+ DMAC->Channel[1].CHINTENSET.reg = DMAC_CHINTENSET_SUSP;
+
+ // Configure ADC
+ ADC1->INPUTCTRL.bit.MUXPOS = ADC_INPUTCTRL_MUXPOS_AIN12_Val; // Set the analog input to ADC0/AIN2 (PB08 - A4 on Metro M4)
+ while (ADC1->SYNCBUSY.bit.INPUTCTRL)
+ ; // Wait for synchronization
+ ADC1->SAMPCTRL.bit.SAMPLEN = 0x00; // Set max Sampling Time Length to half divided ADC clock pulse (2.66us)
+ while (ADC1->SYNCBUSY.bit.SAMPCTRL)
+ ; // Wait for synchronization
+ ADC1->CTRLA.reg = ADC_CTRLA_PRESCALER_DIV128; // Divide Clock ADC GCLK by 128 (48MHz/128 = 375kHz)
+ ADC1->CTRLB.reg = ADC_CTRLB_RESSEL_12BIT | // Set ADC resolution to 12 bits
+ ADC_CTRLB_FREERUN; // Set ADC to free run mode
+ while (ADC1->SYNCBUSY.bit.CTRLB)
+ ; // Wait for synchronization
+ ADC1->CTRLA.bit.ENABLE = 1; // Enable the ADC
+ while (ADC1->SYNCBUSY.bit.ENABLE)
+ ; // Wait for synchronization
+ ADC1->SWTRIG.bit.START = 1; // Initiate a software trigger to start an ADC conversion
+ while (ADC1->SYNCBUSY.bit.SWTRIG)
+ ; // Wait for synchronization
+
+ // Enable DMA channel 1
+ DMAC->Channel[1].CHCTRLA.bit.ENABLE = 1;
+
+ // Configure Timer/Counter 5
+        GCLK->PCHCTRL[TC5_GCLK_ID].reg = GCLK_PCHCTRL_CHEN |        // Enable peripheral channel for TC5
+                                         GCLK_PCHCTRL_GEN_GCLK1;    // Connect generic clock 1 at 48MHz
+
+ TC5->COUNT16.WAVE.reg = TC_WAVE_WAVEGEN_MFRQ; // Set TC5 to Match Frequency (MFRQ) mode
+        TC5->COUNT16.CC[0].reg = 3000 - 1;                          // Set the trigger to 16 kHz: (48MHz / 16000) - 1
+ while (TC5->COUNT16.SYNCBUSY.bit.CC0)
+ ; // Wait for synchronization
+
+ // Start Timer/Counter 5
+ TC5->COUNT16.CTRLA.bit.ENABLE = 1; // Enable the TC5 timer
+ while (TC5->COUNT16.SYNCBUSY.bit.ENABLE)
+ ; // Wait for synchronization
+ }
+
+ uint16_t _adc_buf_0[ADC_BUF_LEN];
+ uint16_t _adc_buf_1[ADC_BUF_LEN];
+
+ // WAV files have a header. This struct defines that header
+ struct wavFileHeader
+ {
+ char riff[4]; /* "RIFF" */
+ long flength; /* file length in bytes */
+ char wave[4]; /* "WAVE" */
+ char fmt[4]; /* "fmt " */
+ long chunk_size; /* size of FMT chunk in bytes (usually 16) */
+ short format_tag; /* 1=PCM, 257=Mu-Law, 258=A-Law, 259=ADPCM */
+ short num_chans; /* 1=mono, 2=stereo */
+ long srate; /* Sampling rate in samples per second */
+ long bytes_per_sec; /* bytes per second = srate*bytes_per_samp */
+ short bytes_per_samp; /* 2=16-bit mono, 4=16-bit stereo */
+ short bits_per_samp; /* Number of bits per sample */
+ char data[4]; /* "data" */
+ long dlength; /* data length in bytes (filelength - 44) */
+ };
+
+ void initBufferHeader()
+ {
+ wavFileHeader wavh;
+
+ strncpy(wavh.riff, "RIFF", 4);
+ strncpy(wavh.wave, "WAVE", 4);
+ strncpy(wavh.fmt, "fmt ", 4);
+ strncpy(wavh.data, "data", 4);
+
+ wavh.chunk_size = 16;
+ wavh.format_tag = 1; // PCM
+ wavh.num_chans = 1; // mono
+ wavh.srate = RATE;
+ wavh.bytes_per_sec = (RATE * 1 * 16 * 1) / 8;
+ wavh.bytes_per_samp = 2;
+ wavh.bits_per_samp = 16;
+ wavh.dlength = RATE * 2 * 1 * 16 / 2;
+ wavh.flength = wavh.dlength + 44;
+
+ _writer.writeSfudBuffer((byte *)&wavh, 44);
+ }
+
+ void audioCallback(uint16_t *buf, uint32_t buf_len)
+ {
+ static uint32_t idx = 44;
+
+ if (_isRecording)
+ {
+ for (uint32_t i = 0; i < buf_len; i++)
+ {
+ int16_t audio_value = ((int16_t)buf[i] - 2048) * 16;
+
+ _writer.writeSfudBuffer(audio_value & 0xFF);
+ _writer.writeSfudBuffer((audio_value >> 8) & 0xFF);
+ }
+
+ idx += buf_len;
+
+ if (idx >= BUFFER_SIZE)
+ {
+ _writer.flushSfudBuffer();
+ idx = 44;
+ _isRecording = false;
+ _isRecordingReady = true;
+ }
+ }
+ }
+};
+
+Mic mic;
+
+void DMAC_1_Handler()
+{
+ mic.dmaHandler();
+}
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/speech_to_text.h b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/speech_to_text.h
new file mode 100644
index 00000000..a7ce075f
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/src/speech_to_text.h
@@ -0,0 +1,102 @@
+#pragma once
+
+#include
+#include
+#include
+#include
+
+#include "config.h"
+#include "flash_stream.h"
+
+class SpeechToText
+{
+public:
+ void init()
+ {
+ _token_client.setCACert(TOKEN_CERTIFICATE);
+ _speech_client.setCACert(SPEECH_CERTIFICATE);
+ _access_token = getAccessToken();
+ }
+
+ String convertSpeechToText()
+ {
+ char url[128];
+ sprintf(url, SPEECH_URL, SPEECH_LOCATION, LANGUAGE);
+
+ HTTPClient httpClient;
+ httpClient.begin(_speech_client, url);
+
+ httpClient.addHeader("Authorization", String("Bearer ") + _access_token);
+ httpClient.addHeader("Content-Type", String("audio/wav; codecs=audio/pcm; samplerate=") + String(RATE));
+ httpClient.addHeader("Accept", "application/json;text/xml");
+
+ Serial.println("Sending speech...");
+
+ FlashStream stream;
+ int httpResponseCode = httpClient.sendRequest("POST", &stream, BUFFER_SIZE);
+
+ Serial.println("Speech sent!");
+
+ String text = "";
+
+ if (httpResponseCode == 200)
+ {
+ String result = httpClient.getString();
+ Serial.println(result);
+
+ DynamicJsonDocument doc(1024);
+ deserializeJson(doc, result.c_str());
+
+            JsonObject obj = doc.as<JsonObject>();
+            text = obj["DisplayText"].as<String>();
+ }
+ else if (httpResponseCode == 401)
+ {
+ Serial.println("Access token expired, trying again with a new token");
+ _access_token = getAccessToken();
+ return convertSpeechToText();
+ }
+ else
+ {
+ Serial.print("Failed to convert text to speech - error ");
+ Serial.println(httpResponseCode);
+ }
+
+ httpClient.end();
+
+ return text;
+ }
+
+private:
+ String getAccessToken()
+ {
+ char url[128];
+ sprintf(url, TOKEN_URL, SPEECH_LOCATION);
+
+ HTTPClient httpClient;
+ httpClient.begin(_token_client, url);
+
+ httpClient.addHeader("Ocp-Apim-Subscription-Key", SPEECH_API_KEY);
+ int httpResultCode = httpClient.POST("{}");
+
+ if (httpResultCode != 200)
+ {
+ Serial.println("Error getting access token, trying again...");
+ delay(10000);
+ return getAccessToken();
+ }
+
+ Serial.println("Got access token.");
+ String result = httpClient.getString();
+
+ httpClient.end();
+
+ return result;
+ }
+
+ WiFiClientSecure _token_client;
+ WiFiClientSecure _speech_client;
+ String _access_token;
+};
+
+SpeechToText speechToText;
diff --git a/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/test/README b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/test/README
new file mode 100644
index 00000000..b94d0890
--- /dev/null
+++ b/6-consumer/lessons/1-speech-recognition/code-speech-to-text/wio-terminal/smart-timer/test/README
@@ -0,0 +1,11 @@
+
+This directory is intended for PlatformIO Unit Testing and project tests.
+
+Unit Testing is a software testing method by which individual units of
+source code, sets of one or more MCU program modules together with associated
+control data, usage procedures, and operating procedures, are tested to
+determine whether they are fit for use. Unit testing finds problems early
+in the development cycle.
+
+More information about PlatformIO Unit Testing:
+- https://docs.platformio.org/page/plus/unit-testing.html
diff --git a/6-consumer/lessons/1-speech-recognition/pi-audio.md b/6-consumer/lessons/1-speech-recognition/pi-audio.md
index 7e8f4163..a51c0dcc 100644
--- a/6-consumer/lessons/1-speech-recognition/pi-audio.md
+++ b/6-consumer/lessons/1-speech-recognition/pi-audio.md
@@ -190,7 +190,7 @@ You can capture audio from the microphone using Python code.
1. Run the code. Press the button and speak into the microphone. Release the button when you are done, and you will hear the recording.
- You may see some ALSA errors when the PyAudio instance is created. This is due to configuration on the Pi for audio devices you don't have. You can ignore these errors.
+ You may get some ALSA errors when the PyAudio instance is created. This is due to configuration on the Pi for audio devices you don't have. You can ignore these errors.
```output
pi@raspberrypi:~/smart-timer $ python3 app.py
@@ -200,7 +200,7 @@ You can capture audio from the microphone using Python code.
ALSA lib pcm.c:2565:(snd_pcm_open_noupdate) Unknown PCM cards.pcm.side
```
- If you see the following error:
+ If you get the following error:
```output
OSError: [Errno -9997] Invalid sample rate
diff --git a/6-consumer/lessons/1-speech-recognition/pi-speech-to-text.md b/6-consumer/lessons/1-speech-recognition/pi-speech-to-text.md
index ee4bb242..1f18f11f 100644
--- a/6-consumer/lessons/1-speech-recognition/pi-speech-to-text.md
+++ b/6-consumer/lessons/1-speech-recognition/pi-speech-to-text.md
@@ -12,22 +12,21 @@ The audio can be sent to the speech service using the REST API. To use the speec
1. Remove the `play_audio` function. This is no longer needed as you don't want a smart timer to repeat back to you what you said.
-1. Add the following imports to the top of the `app.py` file:
+1. Add the following import to the top of the `app.py` file:
```python
import requests
- import json
```
1. Add the following code above the `while True` loop to declare some settings for the speech service:
```python
- api_key = ''
+ speech_api_key = ''
location = ''
language = ''
```
- Replace `` with the API key for your speech service. Replace `` with the location you used when you created the speech service resource.
+ Replace `` with the API key for your speech service resource. Replace `` with the location you used when you created the speech service resource.
     Replace `` with the locale name for the language you will be speaking in, for example `en-GB` for English, or `zh-HK` for Cantonese. You can find a list of the supported languages and their locale names in the [Language and voice support documentation on Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/speech-service/language-support?WT.mc_id=academic-17441-jabenn#speech-to-text).
@@ -36,7 +35,7 @@ The audio can be sent to the speech service using the REST API. To use the speec
```python
def get_access_token():
headers = {
- 'Ocp-Apim-Subscription-Key': api_key
+ 'Ocp-Apim-Subscription-Key': speech_api_key
}
token_endpoint = f'https://{location}.api.cognitive.microsoft.com/sts/v1.0/issuetoken'
@@ -74,7 +73,7 @@ The audio can be sent to the speech service using the REST API. To use the speec
```python
response = requests.post(url, headers=headers, params=params, data=buffer)
- response_json = json.loads(response.text)
+ response_json = response.json()
if response_json['RecognitionStatus'] == 'Success':
return response_json['DisplayText']
@@ -84,14 +83,21 @@ The audio can be sent to the speech service using the REST API. To use the speec
This calls the URL and decodes the JSON value that comes in the response. The `RecognitionStatus` value in the response indicates if the call was able to extract speech into text successfully, and if this is `Success` then the text is returned from the function, otherwise an empty string is returned.
-1. Finally replace the call to `play_audio` in the `while True` loop with a call to the `convert_speech_to_text` function, as well as printing the text to the console:
+1. Above the `while True:` loop, define a function to process the text returned from the speech to text service. This function will just print the text to the console for now.
+
+ ```python
+ def process_text(text):
+ print(text)
+ ```
+
+1. Finally replace the call to `play_audio` in the `while True` loop with a call to the `convert_speech_to_text` function, passing the text to the `process_text` function:
```python
text = convert_speech_to_text(buffer)
- print(text)
+ process_text(text)
```
-1. Run the code. Press the button and speak into the microphone. Release the button when you are done, and you will see the audio converted to text in the output.
+1. Run the code. Press the button and speak into the microphone. Release the button when you are done, and the audio will be converted to text and printed to the console.
```output
pi@raspberrypi:~/smart-timer $ python3 app.py
diff --git a/6-consumer/lessons/1-speech-recognition/virtual-device-speech-to-text.md b/6-consumer/lessons/1-speech-recognition/virtual-device-speech-to-text.md
index 5e982344..5aaf1025 100644
--- a/6-consumer/lessons/1-speech-recognition/virtual-device-speech-to-text.md
+++ b/6-consumer/lessons/1-speech-recognition/virtual-device-speech-to-text.md
@@ -16,7 +16,7 @@ On Windows, Linux, and macOS, the speech services Python SDK can be used to list
pip install azure-cognitiveservices-speech
```
- > ⚠️ If you see the following error:
+ > ⚠️ If you get the following error:
>
> ```output
> ERROR: Could not find a version that satisfies the requirement azure-cognitiveservices-speech (from versions: none)
@@ -32,6 +32,7 @@ On Windows, Linux, and macOS, the speech services Python SDK can be used to list
1. Add the following imports to the `app,py` file:
```python
+ import requests
import time
from azure.cognitiveservices.speech import SpeechConfig, SpeechRecognizer
```
@@ -41,13 +42,13 @@ On Windows, Linux, and macOS, the speech services Python SDK can be used to list
1. Add the following code to declare some configuration:
```python
- api_key = ''
+ speech_api_key = ''
location = ''
language = ''
- speech_config = SpeechConfig(subscription=api_key,
- region=location,
- speech_recognition_language=language)
+ recognizer_config = SpeechConfig(subscription=speech_api_key,
+ region=location,
+ speech_recognition_language=language)
```
Replace `` with the API key for your speech service. Replace `` with the location you used when you created the speech service resource.
@@ -59,14 +60,17 @@ On Windows, Linux, and macOS, the speech services Python SDK can be used to list
1. Add the following code to create a speech recognizer:
```python
- recognizer = SpeechRecognizer(speech_config=speech_config)
+ recognizer = SpeechRecognizer(speech_config=recognizer_config)
```
-1. The speech recognizer runs on a background thread, listening for audio and converting any speech in it to text. You can get the text using a callback function - a function you define and pass to the recognizer. Every time speech is detected, the callback is called. Add the following code to define a callback that prints the text to the console, and pass this callback to the recognizer:
+1. The speech recognizer runs on a background thread, listening for audio and converting any speech in it to text. You can get the text using a callback function - a function you define and pass to the recognizer. Every time speech is detected, the callback is called. Add the following code to define a function that processes the text by writing it to the console, then define a callback and pass it to the recognizer:
```python
+ def process_text(text):
+ print(text)
+
def recognized(args):
- print(args.result.text)
+ process_text(args.result.text)
recognizer.recognized.connect(recognized)
```
@@ -80,7 +84,7 @@ On Windows, Linux, and macOS, the speech services Python SDK can be used to list
time.sleep(1)
```
-1. Run this app. Speak into your microphone and you will see the audio converted to text in the console.
+1. Run this app. Speak into your microphone and the audio will be converted to text and output to the console.
```output
(.venv) ➜ smart-timer python3 app.py
diff --git a/6-consumer/lessons/1-speech-recognition/wio-terminal-audio.md b/6-consumer/lessons/1-speech-recognition/wio-terminal-audio.md
index 9c643d65..aca2922f 100644
--- a/6-consumer/lessons/1-speech-recognition/wio-terminal-audio.md
+++ b/6-consumer/lessons/1-speech-recognition/wio-terminal-audio.md
@@ -1,3 +1,536 @@
# Capture audio - Wio Terminal
-Coming soon!
+In this part of the lesson, you will write code to capture audio on your Wio Terminal. Audio capture will be controlled by one of the buttons on the top of the Wio Terminal.
+
+## Program the device to capture audio
+
+You can capture audio from the microphone using C++ code. The Wio Terminal only has 192KB of RAM, not enough to capture more than a couple of seconds of audio. It also has 4MB of flash memory, so this can be used instead, saving captured audio to the flash memory.
+
+The built-in microphone captures an analog signal, which gets converted to a digital signal that the Wio Terminal can use. When capturing audio, the data needs to be captured at the correct time - for example to capture audio at 16KHz, the audio needs to be captured exactly 16,000 times per second, with equal intervals between each sample. Rather than use your code to do this, you can use the direct memory access controller (DMAC). This is circuitry that can capture a signal from somewhere and write to memory, without interrupting your code running on the processor.
+
+✅ Read more on DMA on the [direct memory access page on Wikipedia](https://wikipedia.org/wiki/Direct_memory_access).
+
+
+
+The DMAC can capture audio from the ADC at fixed intervals, such as 16,000 times a second for 16KHz audio. It can write this captured data to a pre-allocated memory buffer, and when this is full, make it available to your code to process. Processing this buffer could delay the audio capture, but you can set up multiple buffers. The DMAC writes to buffer 1, then when it's full, notifies your code to process buffer 1, whilst the DMAC writes to buffer 2. When buffer 2 is full, it notifies your code, and goes back to writing to buffer 1. That way, as long as you process each buffer in less time than it takes to fill one, you will not lose any data.
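+
+Below is a simplified sketch of this ping-pong buffer pattern. The names here are made up for illustration - the real implementation later in this lesson is driven by the DMAC hardware and an interrupt handler rather than a plain function call.
+
+```cpp
+#include <stdint.h>
+
+// Two buffers - while the hardware fills one, your code processes the other
+const int BUF_LEN = 1600;
+uint16_t buffer_0[BUF_LEN];
+uint16_t buffer_1[BUF_LEN];
+
+void processBuffer(uint16_t *buf, int len)
+{
+    // Handle the captured samples - for the smart timer this will mean
+    // writing them to flash memory
+}
+
+// Called each time a buffer is full. In the real code this is driven by
+// a DMAC interrupt rather than being called directly.
+void onBufferFull()
+{
+    static int count = 0;
+
+    if (count == 0)
+    {
+        // Buffer 0 is full and safe to read - the hardware is now filling buffer 1
+        processBuffer(buffer_0, BUF_LEN);
+    }
+    else
+    {
+        // Buffer 1 is full - the hardware is now filling buffer 0
+        processBuffer(buffer_1, BUF_LEN);
+    }
+
+    count = (count + 1) % 2;
+}
+```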
+
+Once each buffer has been captured, it can be written to the flash memory. Flash memory needs to be written to using defined addresses, specifying where to write and how large to write, similar to updating an array of bytes in memory. Flash memory has granularity, meaning erase and write operations are not only a fixed size, but must also be aligned to that size. For example, if the granularity is 4096 bytes and you request an erase at address 4200, it could erase all the data from address 4096 to 8192. This means when you write the audio data to flash memory, it has to be in chunks of the correct size.
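+
+As a quick illustration of the alignment arithmetic, the sketch below rounds an address down to the start of its block - the 4096-byte granularity is just an example, the real code reads the granularity from the flash chip.
+
+```cpp
+#include <stdint.h>
+
+// Example granularity - the actual value comes from the flash chip's device table
+const uint32_t ERASE_GRANULARITY = 4096;
+
+// Round an address down to the start of the block that contains it
+uint32_t blockStart(uint32_t address)
+{
+    return (address / ERASE_GRANULARITY) * ERASE_GRANULARITY;
+}
+
+// blockStart(4200) is 4096, so an erase requested at address 4200 affects
+// the whole block from 4096 up to 8192
+```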
+
+### Task - configure flash memory
+
+1. Create a brand new Wio Terminal project using PlatformIO. Call this project `smart-timer`. Add code in the `setup` function to configure the serial port.
+
+1. Add the following library dependencies to the `platformio.ini` file to provide access to the flash memory:
+
+ ```ini
+ lib_deps =
+ seeed-studio/Seeed Arduino FS @ 2.0.3
+ seeed-studio/Seeed Arduino SFUD @ 2.0.1
+ ```
+
+1. Open the `main.cpp` file and add the following include directive for the flash memory library to the top of the file:
+
+ ```cpp
+ #include
+ #include
+ ```
+
+ > 🎓 SFUD stands for Serial Flash Universal Driver, and is a library designed to work with all flash memory chips
+
+1. In the `setup` function, add the following code to set up the flash storage library:
+
+ ```cpp
+ while (!(sfud_init() == SFUD_SUCCESS))
+ ;
+
+ sfud_qspi_fast_read_enable(sfud_get_device(SFUD_W25Q32_DEVICE_INDEX), 2);
+ ```
+
+ This loops until the SFUD library is initialized, then turns on fast reads. The built-in flash memory can be accessed using a Queued Serial Peripheral Interface (QSPI), a type of SPI controller that allows continuous access via a queue with minimal processor usage. This makes it faster to read and write to flash memory.
+
+1. Create a new file in the `src` folder called `flash_writer.h`.
+
+1. Add the following to the top of this file:
+
+ ```cpp
+ #pragma once
+
+ #include
+ #include
+ ```
+
+    This includes some needed header files, including the header file for the SFUD library used to interact with the flash memory.
+
+1. Define a class in this new header file called `FlashWriter`:
+
+ ```cpp
+ class FlashWriter
+ {
+ public:
+
+ private:
+ };
+ ```
+
+1. In the `private` section, add the following code:
+
+ ```cpp
+ byte *_sfudBuffer;
+ size_t _sfudBufferSize;
+ size_t _sfudBufferPos;
+ size_t _sfudBufferWritePos;
+
+ const sfud_flash *_flash;
+ ```
+
+    This defines some fields for the buffer to use to store data before writing it to the flash memory. There is a byte array, `_sfudBuffer`, to write data to, and when this is full, the data is written to flash memory. The `_sfudBufferPos` field stores the current location to write to in this buffer, and `_sfudBufferWritePos` stores the location in flash memory to write to. `_flash` is a pointer to the flash memory to write to - some microcontrollers have multiple flash memory chips.
+
+1. Add the following method to the `public` section to initialize this class:
+
+ ```cpp
+ void init()
+ {
+ _flash = sfud_get_device_table() + 0;
+ _sfudBufferSize = _flash->chip.erase_gran;
+ _sfudBuffer = new byte[_sfudBufferSize];
+ _sfudBufferPos = 0;
+ _sfudBufferWritePos = 0;
+ }
+ ```
+
+    This configures the flash memory on the Wio Terminal to write to, and sets up the buffers based on the erase granularity of the flash memory. This is in an `init` method, rather than a constructor, as this needs to be called after the flash memory has been set up in the `setup` function.
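+
+    For example, if the flash chip reports an erase granularity of 4,096 bytes, then `_sfudBuffer` will be a 4,096-byte array, and every write to the flash memory will be a 4,096-byte aligned chunk.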
+
+1. Add the following code to the `public` section:
+
+ ```cpp
+ void writeSfudBuffer(byte b)
+ {
+ _sfudBuffer[_sfudBufferPos++] = b;
+ if (_sfudBufferPos == _sfudBufferSize)
+ {
+ sfud_erase_write(_flash, _sfudBufferWritePos, _sfudBufferSize, _sfudBuffer);
+ _sfudBufferWritePos += _sfudBufferSize;
+ _sfudBufferPos = 0;
+ }
+ }
+
+ void writeSfudBuffer(byte *b, size_t len)
+ {
+ for (size_t i = 0; i < len; ++i)
+ {
+ writeSfudBuffer(b[i]);
+ }
+ }
+
+ void flushSfudBuffer()
+ {
+ if (_sfudBufferPos > 0)
+ {
+ sfud_erase_write(_flash, _sfudBufferWritePos, _sfudBufferSize, _sfudBuffer);
+ _sfudBufferWritePos += _sfudBufferSize;
+ _sfudBufferPos = 0;
+ }
+ }
+ ```
+
+    This code defines methods to write bytes to the flash storage system. It works by writing to an in-memory buffer that is the right size for the flash memory, and when this is full, it is written to the flash memory, erasing any existing data at that location. There is also a `flushSfudBuffer` method to write an incomplete buffer, as the data being captured won't be an exact multiple of the erase granularity, so the end part of the data needs to be written.
+
+    > 💁 Flushing the final partial buffer will also write whatever unwanted data is left in the rest of the buffer, but this is ok as only the data that is needed will be read back.
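+
+    As a rough illustration of how these methods fit together, writing an arbitrary block of data might look like the following sketch - the data here is made up, the smart timer will write audio samples instead:
+
+    ```cpp
+    FlashWriter writer;
+    writer.init();
+
+    // Writes are buffered in memory, and only sent to the flash memory
+    // once a full erase-granularity sized chunk has been collected
+    byte data[100];
+    for (size_t i = 0; i < sizeof(data); ++i)
+    {
+        data[i] = (byte)i;
+    }
+    writer.writeSfudBuffer(data, sizeof(data));
+
+    // Write out whatever is left in the in-memory buffer
+    writer.flushSfudBuffer();
+    ```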
+
+### Task - set up audio capture
+
+1. Create a new file in the `src` folder called `config.h`.
+
+1. Add the following to the top of this file:
+
+ ```cpp
+ #pragma once
+
+ #define RATE 16000
+ #define SAMPLE_LENGTH_SECONDS 4
+ #define SAMPLES RATE * SAMPLE_LENGTH_SECONDS
+ #define BUFFER_SIZE (SAMPLES * 2) + 44
+ #define ADC_BUF_LEN 1600
+ ```
+
+ This code sets up some constants for the audio capture.
+
+ | Constant | Value | Description |
+ | --------------------- | -----: | - |
+    | RATE | 16000 | The sample rate for the audio. 16,000 samples a second is 16KHz |
+ | SAMPLE_LENGTH_SECONDS | 4 | The length of audio to capture. This is set to 4 seconds. To record longer audio, increase this. |
+ | SAMPLES | 64000 | The total number of audio samples that will be captured. Set to the sample rate * the number of seconds |
+    | BUFFER_SIZE | 128044 | The size of the audio buffer to capture. Audio will be captured as a WAV file, which is 44 bytes of header, then 128,000 bytes of audio data (each sample is 2 bytes) |
+ | ADC_BUF_LEN | 1600 | The size of the buffers to use to capture audio from the DMAC |
+
+ > 💁 If you find 4 seconds is too short to request a timer, you can increase the `SAMPLE_LENGTH_SECONDS` value, and all the other values will recalculate.
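+
+    For example, with the default values `SAMPLES` is 16,000 × 4 = 64,000 samples, and `BUFFER_SIZE` is (64,000 × 2) + 44 = 128,044 bytes - 2 bytes for each 16-bit sample, plus 44 bytes for the WAV file header.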
+
+1. Create a new file in the `src` folder called `mic.h`.
+
+1. Add the following to the top of this file:
+
+ ```cpp
+ #pragma once
+
+ #include
+
+ #include "config.h"
+ #include "flash_writer.h"
+ ```
+
+ This includes some needed header files, including the `config.h` and `FlashWriter` header files.
+
+1. Add the following to define a `Mic` class that can capture from the microphone:
+
+ ```cpp
+ class Mic
+ {
+ public:
+ Mic()
+ {
+ _isRecording = false;
+ _isRecordingReady = false;
+ }
+
+ void startRecording()
+ {
+ _isRecording = true;
+ _isRecordingReady = false;
+ }
+
+ bool isRecording()
+ {
+ return _isRecording;
+ }
+
+ bool isRecordingReady()
+ {
+ return _isRecordingReady;
+ }
+
+ private:
+ volatile bool _isRecording;
+ volatile bool _isRecordingReady;
+ FlashWriter _writer;
+ };
+
+ Mic mic;
+ ```
+
+ This class currently only has a couple of fields to track if recording has started, and if a recording is ready to be used. When the DMAC is set up, it continuously writes to memory buffers, so the `_isRecording` flag determines if these should be processed or ignored. The `_isRecordingReady` flag will be set when the required 4 seconds of audio has been captured. The `_writer` field is used to save the audio data to flash memory.
+
+ A global variable is then declared for an instance of the `Mic` class.
+
+1. Add the following code to the `private` section of the `Mic` class:
+
+ ```cpp
+ typedef struct
+ {
+ uint16_t btctrl;
+ uint16_t btcnt;
+ uint32_t srcaddr;
+ uint32_t dstaddr;
+ uint32_t descaddr;
+ } dmacdescriptor;
+
+ // Globals - DMA and ADC
+ volatile dmacdescriptor _wrb[DMAC_CH_NUM] __attribute__((aligned(16)));
+ dmacdescriptor _descriptor_section[DMAC_CH_NUM] __attribute__((aligned(16)));
+ dmacdescriptor _descriptor __attribute__((aligned(16)));
+
+ void configureDmaAdc()
+ {
+ // Configure DMA to sample from ADC at a regular interval (triggered by timer/counter)
+ DMAC->BASEADDR.reg = (uint32_t)_descriptor_section; // Specify the location of the descriptors
+ DMAC->WRBADDR.reg = (uint32_t)_wrb; // Specify the location of the write back descriptors
+ DMAC->CTRL.reg = DMAC_CTRL_DMAENABLE | DMAC_CTRL_LVLEN(0xf); // Enable the DMAC peripheral
+ DMAC->Channel[1].CHCTRLA.reg = DMAC_CHCTRLA_TRIGSRC(TC5_DMAC_ID_OVF) | // Set DMAC to trigger on TC5 timer overflow
+ DMAC_CHCTRLA_TRIGACT_BURST; // DMAC burst transfer
+
+ _descriptor.descaddr = (uint32_t)&_descriptor_section[1]; // Set up a circular descriptor
+ _descriptor.srcaddr = (uint32_t)&ADC1->RESULT.reg; // Take the result from the ADC0 RESULT register
+ _descriptor.dstaddr = (uint32_t)_adc_buf_0 + sizeof(uint16_t) * ADC_BUF_LEN; // Place it in the adc_buf_0 array
+ _descriptor.btcnt = ADC_BUF_LEN; // Beat count
+ _descriptor.btctrl = DMAC_BTCTRL_BEATSIZE_HWORD | // Beat size is HWORD (16-bits)
+ DMAC_BTCTRL_DSTINC | // Increment the destination address
+ DMAC_BTCTRL_VALID | // Descriptor is valid
+ DMAC_BTCTRL_BLOCKACT_SUSPEND; // Suspend DMAC channel 0 after block transfer
+ memcpy(&_descriptor_section[0], &_descriptor, sizeof(_descriptor)); // Copy the descriptor to the descriptor section
+
+ _descriptor.descaddr = (uint32_t)&_descriptor_section[0]; // Set up a circular descriptor
+ _descriptor.srcaddr = (uint32_t)&ADC1->RESULT.reg; // Take the result from the ADC0 RESULT register
+ _descriptor.dstaddr = (uint32_t)_adc_buf_1 + sizeof(uint16_t) * ADC_BUF_LEN; // Place it in the adc_buf_1 array
+ _descriptor.btcnt = ADC_BUF_LEN; // Beat count
+ _descriptor.btctrl = DMAC_BTCTRL_BEATSIZE_HWORD | // Beat size is HWORD (16-bits)
+ DMAC_BTCTRL_DSTINC | // Increment the destination address
+ DMAC_BTCTRL_VALID | // Descriptor is valid
+ DMAC_BTCTRL_BLOCKACT_SUSPEND; // Suspend DMAC channel 0 after block transfer
+ memcpy(&_descriptor_section[1], &_descriptor, sizeof(_descriptor)); // Copy the descriptor to the descriptor section
+
+ // Configure NVIC
+ NVIC_SetPriority(DMAC_1_IRQn, 0); // Set the Nested Vector Interrupt Controller (NVIC) priority for DMAC1 to 0 (highest)
+ NVIC_EnableIRQ(DMAC_1_IRQn); // Connect DMAC1 to Nested Vector Interrupt Controller (NVIC)
+
+ // Activate the suspend (SUSP) interrupt on DMAC channel 1
+ DMAC->Channel[1].CHINTENSET.reg = DMAC_CHINTENSET_SUSP;
+
+ // Configure ADC
+ ADC1->INPUTCTRL.bit.MUXPOS = ADC_INPUTCTRL_MUXPOS_AIN12_Val; // Set the analog input to ADC0/AIN2 (PB08 - A4 on Metro M4)
+ while (ADC1->SYNCBUSY.bit.INPUTCTRL)
+ ; // Wait for synchronization
+ ADC1->SAMPCTRL.bit.SAMPLEN = 0x00; // Set max Sampling Time Length to half divided ADC clock pulse (2.66us)
+ while (ADC1->SYNCBUSY.bit.SAMPCTRL)
+ ; // Wait for synchronization
+ ADC1->CTRLA.reg = ADC_CTRLA_PRESCALER_DIV128; // Divide Clock ADC GCLK by 128 (48MHz/128 = 375kHz)
+ ADC1->CTRLB.reg = ADC_CTRLB_RESSEL_12BIT | // Set ADC resolution to 12 bits
+ ADC_CTRLB_FREERUN; // Set ADC to free run mode
+ while (ADC1->SYNCBUSY.bit.CTRLB)
+ ; // Wait for synchronization
+ ADC1->CTRLA.bit.ENABLE = 1; // Enable the ADC
+ while (ADC1->SYNCBUSY.bit.ENABLE)
+ ; // Wait for synchronization
+ ADC1->SWTRIG.bit.START = 1; // Initiate a software trigger to start an ADC conversion
+ while (ADC1->SYNCBUSY.bit.SWTRIG)
+ ; // Wait for synchronization
+
+ // Enable DMA channel 1
+ DMAC->Channel[1].CHCTRLA.bit.ENABLE = 1;
+
+ // Configure Timer/Counter 5
+        GCLK->PCHCTRL[TC5_GCLK_ID].reg = GCLK_PCHCTRL_CHEN |        // Enable peripheral channel for TC5
+                                         GCLK_PCHCTRL_GEN_GCLK1;    // Connect generic clock 1 at 48MHz
+
+ TC5->COUNT16.WAVE.reg = TC_WAVE_WAVEGEN_MFRQ; // Set TC5 to Match Frequency (MFRQ) mode
+        TC5->COUNT16.CC[0].reg = 3000 - 1;                          // Set the trigger to 16 kHz: (48MHz / 16000) - 1
+ while (TC5->COUNT16.SYNCBUSY.bit.CC0)
+ ; // Wait for synchronization
+
+ // Start Timer/Counter 5
+ TC5->COUNT16.CTRLA.bit.ENABLE = 1; // Enable the TC5 timer
+ while (TC5->COUNT16.SYNCBUSY.bit.ENABLE)
+ ; // Wait for synchronization
+ }
+
+ uint16_t _adc_buf_0[ADC_BUF_LEN];
+ uint16_t _adc_buf_1[ADC_BUF_LEN];
+ ```
+
+    This code defines a `configureDmaAdc` method that configures the DMAC, connecting it to the ADC and setting it to populate two different alternating buffers, `_adc_buf_0` and `_adc_buf_1`.
+
+ > 💁 One of the downsides of microcontroller development is the complexity of the code needed to interact with hardware, as your code runs at a very low level interacting with hardware directly. This code is more complex than what you would write for a single-board computer or desktop computer as there is no operating system to help. There are some libraries available that can simplify this, but there is still a lot of complexity.
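+
+    One value worth unpicking is the Timer/Counter setting that drives the sample rate. TC5 triggers the DMAC each time its counter reaches the value in `CC[0]`, so assuming the generic clock feeding TC5 runs at 48MHz (as the comments in the code note), a count of 3,000 gives 48,000,000 ÷ 3,000 = 16,000 triggers a second - the 16KHz sample rate defined by `RATE` in `config.h`.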
+
+1. Below this, add the following code:
+
+ ```cpp
+ // WAV files have a header. This struct defines that header
+ struct wavFileHeader
+ {
+ char riff[4]; /* "RIFF" */
+ long flength; /* file length in bytes */
+ char wave[4]; /* "WAVE" */
+ char fmt[4]; /* "fmt " */
+ long chunk_size; /* size of FMT chunk in bytes (usually 16) */
+ short format_tag; /* 1=PCM, 257=Mu-Law, 258=A-Law, 259=ADPCM */
+ short num_chans; /* 1=mono, 2=stereo */
+ long srate; /* Sampling rate in samples per second */
+ long bytes_per_sec; /* bytes per second = srate*bytes_per_samp */
+ short bytes_per_samp; /* 2=16-bit mono, 4=16-bit stereo */
+ short bits_per_samp; /* Number of bits per sample */
+ char data[4]; /* "data" */
+ long dlength; /* data length in bytes (filelength - 44) */
+ };
+
+ void initBufferHeader()
+ {
+ wavFileHeader wavh;
+
+ strncpy(wavh.riff, "RIFF", 4);
+ strncpy(wavh.wave, "WAVE", 4);
+ strncpy(wavh.fmt, "fmt ", 4);
+ strncpy(wavh.data, "data", 4);
+
+ wavh.chunk_size = 16;
+ wavh.format_tag = 1; // PCM
+ wavh.num_chans = 1; // mono
+ wavh.srate = RATE;
+ wavh.bytes_per_sec = (RATE * 1 * 16 * 1) / 8;
+ wavh.bytes_per_samp = 2;
+ wavh.bits_per_samp = 16;
+ wavh.dlength = RATE * 2 * 1 * 16 / 2;
+ wavh.flength = wavh.dlength + 44;
+
+ _writer.writeSfudBuffer((byte *)&wavh, 44);
+ }
+ ```
+
+    This code defines the WAV header as a struct that takes up 44 bytes of memory. It fills in details about the audio sample rate, size, and number of channels. This header is then written to the flash memory.
+
+1. Below this code, add the following to declare a method to be called when the audio buffers are ready to process:
+
+ ```cpp
+ void audioCallback(uint16_t *buf, uint32_t buf_len)
+ {
+ static uint32_t idx = 44;
+
+ if (_isRecording)
+ {
+ for (uint32_t i = 0; i < buf_len; i++)
+ {
+ int16_t audio_value = ((int16_t)buf[i] - 2048) * 16;
+
+ _writer.writeSfudBuffer(audio_value & 0xFF);
+ _writer.writeSfudBuffer((audio_value >> 8) & 0xFF);
+ }
+
+ idx += buf_len;
+
+ if (idx >= BUFFER_SIZE)
+ {
+ _writer.flushSfudBuffer();
+ idx = 44;
+ _isRecording = false;
+ _isRecordingReady = true;
+ }
+ }
+ }
+ ```
+
+    The audio buffers are arrays of 16-bit integers containing the audio from the ADC. The ADC returns 12-bit unsigned values (0-4095), so these need to be converted to 16-bit signed values, and then converted into 2 bytes to be stored as raw binary data.
+
+    These bytes are written to the flash memory buffers. The write starts at index 44 - this is the offset from the 44 bytes written as the WAV file header. Once all the bytes needed for the required audio length have been captured, the remaining data is written to the flash memory.
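+
+    For example, a raw ADC reading of 3,000 becomes (3,000 - 2,048) × 16 = 15,232, which is stored as the low byte `0x80` followed by the high byte `0x3B`.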
+
+1. In the `public` section of the `Mic` class, add the following code:
+
+ ```cpp
+ void dmaHandler()
+ {
+ static uint8_t count = 0;
+
+ if (DMAC->Channel[1].CHINTFLAG.bit.SUSP)
+ {
+ DMAC->Channel[1].CHCTRLB.reg = DMAC_CHCTRLB_CMD_RESUME;
+ DMAC->Channel[1].CHINTFLAG.bit.SUSP = 1;
+
+ if (count)
+ {
+ audioCallback(_adc_buf_0, ADC_BUF_LEN);
+ }
+ else
+ {
+ audioCallback(_adc_buf_1, ADC_BUF_LEN);
+ }
+
+ count = (count + 1) % 2;
+ }
+ }
+ ```
+
+ This code will be called by the DMAC to tell your code to process the buffers. It checks that there is data to process, and calls the `audioCallback` method with the relevant buffer.
+
+1. Outside the class, after the `Mic mic;` declaration, add the following code:
+
+ ```cpp
+ void DMAC_1_Handler()
+ {
+ mic.dmaHandler();
+ }
+ ```
+
+    The `DMAC_1_Handler` will be called by the DMAC when the buffers are ready to process. This function is found by name, so it just needs to exist with this exact name to be called.
+
+1. Add the following two methods to the `public` section of the `Mic` class:
+
+ ```cpp
+ void init()
+ {
+ analogReference(AR_INTERNAL2V23);
+
+ _writer.init();
+
+ initBufferHeader();
+ configureDmaAdc();
+ }
+
+ void reset()
+ {
+ _isRecordingReady = false;
+ _isRecording = false;
+
+ _writer.reset();
+
+ initBufferHeader();
+ }
+ ```
+
+    The `init` method contains code to initialize the `Mic` class. This method sets the correct voltage for the microphone pin, sets up the flash memory writer, writes the WAV file header, and configures the DMAC. The `reset` method resets the flash memory and re-writes the header after the audio has been captured and used.
+
+### Task - capture audio
+
+1. In the `main.cpp` file, add an include directive for the `mic.h` header file:
+
+ ```cpp
+ #include "mic.h"
+ ```
+
+1. In the `setup` function, initialize the C button. Audio capture will start when this button is pressed, and continue for 4 seconds:
+
+ ```cpp
+ pinMode(WIO_KEY_C, INPUT_PULLUP);
+ ```
+
+1. Below this, initialize the microphone, then print to the console that audio is ready to be captured:
+
+ ```cpp
+ mic.init();
+
+ Serial.println("Ready.");
+ ```
+
+1. Above the `loop` function, define a function to process the captured audio. For now this does nothing, but later in this lesson it will send the speech to be converted to text:
+
+ ```cpp
+ void processAudio()
+ {
+
+ }
+ ```
+
+1. Add the following to the `loop` function:
+
+ ```cpp
+ void loop()
+ {
+ if (digitalRead(WIO_KEY_C) == LOW && !mic.isRecording())
+ {
+ Serial.println("Starting recording...");
+ mic.startRecording();
+ }
+
+ if (!mic.isRecording() && mic.isRecordingReady())
+ {
+ Serial.println("Finished recording");
+
+ processAudio();
+
+ mic.reset();
+ }
+ }
+ ```
+
+    This code checks the C button, and if this is pressed and recording hasn't started, then the `_isRecording` field of the `Mic` class is set to true. This will cause the `audioCallback` method of the `Mic` class to store audio until 4 seconds have been captured. Once 4 seconds of audio have been captured, the `_isRecording` field is set to false, and the `_isRecordingReady` field is set to true. This is then checked in the `loop` function, and when true the `processAudio` function is called, then the `Mic` class is reset.
+
+1. Build this code, upload it to your Wio Terminal and test it out through the serial monitor. Press the C button (the one on the left-hand side, closest to the power switch), and speak. 4 seconds of audio will be captured.
+
+ ```output
+ --- Available filters and text transformations: colorize, debug, default, direct, hexlify, log2file, nocontrol, printable, send_on_enter, time
+ --- More details at http://bit.ly/pio-monitor-filters
+ --- Miniterm on /dev/cu.usbmodem1101 9600,8,N,1 ---
+ --- Quit: Ctrl+C | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
+ Ready.
+ Starting recording...
+ Finished recording
+ ```
+
+> 💁 You can find this code in the [code-record/wio-terminal](code-record/wio-terminal) folder.
+
+😀 Your audio recording program was a success!
diff --git a/6-consumer/lessons/1-speech-recognition/wio-terminal-microphone.md b/6-consumer/lessons/1-speech-recognition/wio-terminal-microphone.md
index 1e9bb936..4212f414 100644
--- a/6-consumer/lessons/1-speech-recognition/wio-terminal-microphone.md
+++ b/6-consumer/lessons/1-speech-recognition/wio-terminal-microphone.md
@@ -1,3 +1,11 @@
# Configure your microphone and speakers - Wio Terminal
-Coming soon!
+In this part of the lesson, you will add speakers to your Wio Terminal. The Wio Terminal already has a microphone built-in, and this can be used to capture speech.
+
+## Hardware
+
+Coming soon
+
+### Task - connect speakers
+
+Coming soon
diff --git a/6-consumer/lessons/1-speech-recognition/wio-terminal-speech-to-text.md b/6-consumer/lessons/1-speech-recognition/wio-terminal-speech-to-text.md
index e89f1caa..a1e2cddb 100644
--- a/6-consumer/lessons/1-speech-recognition/wio-terminal-speech-to-text.md
+++ b/6-consumer/lessons/1-speech-recognition/wio-terminal-speech-to-text.md
@@ -1,3 +1,521 @@
# Speech to text - Wio Terminal
-Coming soon!
+In this part of the lesson, you will write code to convert speech in the captured audio to text using the speech service.
+
+## Send the audio to the speech service
+
+The audio can be sent to the speech service using the REST API. To use the speech service, first you need to request an access token, then use that token to access the REST API. These access tokens expire after 10 minutes, so your code should request them on a regular basis to ensure they are always up to date.
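+
+One way to handle this is to cache the token along with the time it was requested, and fetch a new one when it is close to expiring. The following is a minimal sketch of that idea using the Arduino `millis()` timer and a `getAccessToken` helper like the one you will write in the next task - it is an illustration only, not part of the smart timer code:
+
+```cpp
+String getAccessToken();                // you will write a version of this in the next task
+
+String accessToken;                     // the cached token
+unsigned long tokenIssuedAt = 0;        // when the cached token was requested
+
+// Tokens expire after 10 minutes, so refresh a little before that
+const unsigned long TOKEN_LIFETIME_MS = 9UL * 60UL * 1000UL;
+
+String getCurrentToken()
+{
+    // Request a new token if there isn't one yet, or the cached one is getting old
+    if (accessToken.length() == 0 || millis() - tokenIssuedAt > TOKEN_LIFETIME_MS)
+    {
+        accessToken = getAccessToken();
+        tokenIssuedAt = millis();
+    }
+
+    return accessToken;
+}
+```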
+
+### Task - get an access token
+
+1. Open the `smart-timer` project if it's not already open.
+
+1. Add the following library dependencies to the `platformio.ini` file to access WiFi and handle JSON:
+
+ ```ini
+ seeed-studio/Seeed Arduino rpcWiFi @ 1.0.5
+ seeed-studio/Seeed Arduino rpcUnified @ 2.1.3
+ seeed-studio/Seeed_Arduino_mbedtls @ 3.0.1
+ seeed-studio/Seeed Arduino RTC @ 2.0.0
+ bblanchon/ArduinoJson @ 6.17.3
+ ```
+
+1. Add the following code to the `config.h` header file:
+
+ ```cpp
+ const char *SSID = "";
+ const char *PASSWORD = "";
+
+ const char *SPEECH_API_KEY = "";
+ const char *SPEECH_LOCATION = "";
+ const char *LANGUAGE = "";
+
+ const char *TOKEN_URL = "https://%s.api.cognitive.microsoft.com/sts/v1.0/issuetoken";
+ ```
+
+ Replace `` and `` with the relevant values for your WiFi.
+
+ Replace `` with the API key for your speech service resource. Replace `` with the location you used when you created the speech service resource.
+
+    Replace `` with the locale name for the language you will be speaking in, for example `en-GB` for English, or `zh-HK` for Cantonese. You can find a list of the supported languages and their locale names in the [Language and voice support documentation on Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/speech-service/language-support?WT.mc_id=academic-17441-jabenn#speech-to-text).
+
+ The `TOKEN_URL` constant is the URL of the token issuer without the location. This will be combined with the location later to get the full URL.
+
+1. Just like connecting to Custom Vision, you will need to use an HTTPS connection to connect to the token issuing service. To the end of `config.h`, add the following code:
+
+ ```cpp
+ const char *TOKEN_CERTIFICATE =
+ "-----BEGIN CERTIFICATE-----\r\n"
+ "MIIF8zCCBNugAwIBAgIQAueRcfuAIek/4tmDg0xQwDANBgkqhkiG9w0BAQwFADBh\r\n"
+ "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
+ "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
+ "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
+ "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
+ "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwNjCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
+ "ggIPADCCAgoCggIBALVGARl56bx3KBUSGuPc4H5uoNFkFH4e7pvTCxRi4j/+z+Xb\r\n"
+ "wjEz+5CipDOqjx9/jWjskL5dk7PaQkzItidsAAnDCW1leZBOIi68Lff1bjTeZgMY\r\n"
+ "iwdRd3Y39b/lcGpiuP2d23W95YHkMMT8IlWosYIX0f4kYb62rphyfnAjYb/4Od99\r\n"
+ "ThnhlAxGtfvSbXcBVIKCYfZgqRvV+5lReUnd1aNjRYVzPOoifgSx2fRyy1+pO1Uz\r\n"
+ "aMMNnIOE71bVYW0A1hr19w7kOb0KkJXoALTDDj1ukUEDqQuBfBxReL5mXiu1O7WG\r\n"
+ "0vltg0VZ/SZzctBsdBlx1BkmWYBW261KZgBivrql5ELTKKd8qgtHcLQA5fl6JB0Q\r\n"
+ "gs5XDaWehN86Gps5JW8ArjGtjcWAIP+X8CQaWfaCnuRm6Bk/03PQWhgdi84qwA0s\r\n"
+ "sRfFJwHUPTNSnE8EiGVk2frt0u8PG1pwSQsFuNJfcYIHEv1vOzP7uEOuDydsmCjh\r\n"
+ "lxuoK2n5/2aVR3BMTu+p4+gl8alXoBycyLmj3J/PUgqD8SL5fTCUegGsdia/Sa60\r\n"
+ "N2oV7vQ17wjMN+LXa2rjj/b4ZlZgXVojDmAjDwIRdDUujQu0RVsJqFLMzSIHpp2C\r\n"
+ "Zp7mIoLrySay2YYBu7SiNwL95X6He2kS8eefBBHjzwW/9FxGqry57i71c2cDAgMB\r\n"
+ "AAGjggGtMIIBqTAdBgNVHQ4EFgQU1cFnOsKjnfR3UltZEjgp5lVou6UwHwYDVR0j\r\n"
+ "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
+ "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
+ "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
+ "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
+ "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
+ "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
+ "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
+ "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
+ "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQB2oWc93fB8esci/8esixj++N22meiGDjgF\r\n"
+ "+rA2LUK5IOQOgcUSTGKSqF9lYfAxPjrqPjDCUPHCURv+26ad5P/BYtXtbmtxJWu+\r\n"
+ "cS5BhMDPPeG3oPZwXRHBJFAkY4O4AF7RIAAUW6EzDflUoDHKv83zOiPfYGcpHc9s\r\n"
+ "kxAInCedk7QSgXvMARjjOqdakor21DTmNIUotxo8kHv5hwRlGhBJwps6fEVi1Bt0\r\n"
+ "trpM/3wYxlr473WSPUFZPgP1j519kLpWOJ8z09wxay+Br29irPcBYv0GMXlHqThy\r\n"
+ "8y4m/HyTQeI2IMvMrQnwqPpY+rLIXyviI2vLoI+4xKE4Rn38ZZ8m\r\n"
+ "-----END CERTIFICATE-----\r\n";
+ ```
+
+ This is the same certificate you used when connecting to Custom Vision.
+
+1. Add an include for the WiFi header file and the config header file to the top of the `main.cpp` file:
+
+ ```cpp
+ #include <rpcWiFi.h>
+
+ #include "config.h"
+ ```
+
+1. Add code to connect to WiFi in `main.cpp` above the `setup` function:
+
+ ```cpp
+ void connectWiFi()
+ {
+ while (WiFi.status() != WL_CONNECTED)
+ {
+ Serial.println("Connecting to WiFi..");
+ WiFi.begin(SSID, PASSWORD);
+ delay(500);
+ }
+
+ Serial.println("Connected!");
+ }
+ ```
+
+1. Call this function from the `setup` function after the serial connection has been established:
+
+ ```cpp
+ connectWiFi();
+ ```
+
+1. Create a new header file in the `src` folder called `speech_to_text.h`. In this header file, add the following code:
+
+ ```cpp
+ #pragma once
+
+ #include <Arduino.h>
+ #include <ArduinoJson.h>
+ #include <HTTPClient.h>
+ #include <WiFiClientSecure.h>
+
+ #include "config.h"
+ #include "mic.h"
+
+ class SpeechToText
+ {
+ public:
+
+ private:
+
+ };
+
+ SpeechToText speechToText;
+ ```
+
+ This includes some necessary header files for an HTTP connection, configuration and the `mic.h` header file, and defines a class called `SpeechToText`, before declaring an instance of that class that can be used later.
+
+1. Add the following 2 fields to the `private` section of this class:
+
+ ```cpp
+ WiFiClientSecure _token_client;
+ String _access_token;
+ ```
+
+ The `_token_client` is a WiFi Client that uses HTTPS and will be used to get the access token. This token will then be stored in `_access_token`.
+
+1. Add the following method to the `private` section:
+
+ ```cpp
+ String getAccessToken()
+ {
+ char url[128];
+ sprintf(url, TOKEN_URL, SPEECH_LOCATION);
+
+ HTTPClient httpClient;
+ httpClient.begin(_token_client, url);
+
+ httpClient.addHeader("Ocp-Apim-Subscription-Key", SPEECH_API_KEY);
+ int httpResultCode = httpClient.POST("{}");
+
+ if (httpResultCode != 200)
+ {
+ Serial.println("Error getting access token, trying again...");
+ delay(10000);
+ return getAccessToken();
+ }
+
+ Serial.println("Got access token.");
+ String result = httpClient.getString();
+
+ httpClient.end();
+
+ return result;
+ }
+ ```
+
+ This code builds the URL for the token issuer API using the location of the speech resource. It then creates an `HTTPClient` to make the web request, setting it up to use the WiFi client configured with the token endpoint's certificate. It sets the API key as a header for the call, then makes a POST request to get the access token, retrying if it gets any errors. Finally the access token is returned.
+
+1. To the `public` section, add an `init` method that sets up the token client:
+
+ ```cpp
+ void init()
+ {
+ _token_client.setCACert(TOKEN_CERTIFICATE);
+ _access_token = getAccessToken();
+ }
+ ```
+
+ This sets the certificate on the WiFi client, then gets the access token.
+
+1. In `main.cpp`, add this new header file to the include directives:
+
+ ```cpp
+ #include "speech_to_text.h"
+ ```
+
+1. Initialize the `SpeechToText` class at the end of the `setup` function, after the `mic.init` call but before `Ready` is written to the serial monitor:
+
+ ```cpp
+ speechToText.init();
+ ```
+
+### Task - read audio from flash memory
+
+1. In an earlier part of this lesson, the audio was recorded to the flash memory. This audio will need to be sent to the Speech Services REST API, so it needs to be read from the flash memory. It can't be loaded into an in-memory buffer as it would be too large. The `HTTPClient` class that makes REST calls can stream data using an Arduino Stream - a class that can load data in small chunks, sending the chunks one at a time as part of the request. Every time you call `read` on a stream it returns the next block of data. An Arduino stream can be created that can read from the flash memory. Create a new file called `flash_stream.h` in the `src` folder, and add the following code to it:
+
+ ```cpp
+ #pragma once
+
+ #include <Arduino.h>
+ #include <HTTPClient.h>
+ #include <sfud.h>
+
+ #include "config.h"
+
+ class FlashStream : public Stream
+ {
+ public:
+ virtual size_t write(uint8_t val)
+ {
+ }
+
+ virtual int available()
+ {
+ }
+
+ virtual int read()
+ {
+ }
+
+ virtual int peek()
+ {
+ }
+ private:
+
+ };
+ ```
+
+ This declares the `FlashStream` class, deriving from the Arduino `Stream` class. `Stream` is an abstract class - derived classes have to implement a few methods before the class can be instantiated, and those methods are declared here with empty bodies, ready to be implemented in the following steps.
+
+ ✅ Read more on Arduino Streams in the [Arduino Stream documentation](https://www.arduino.cc/reference/en/language/functions/communication/stream/)
+
+1. Add the following fields to the `private` section:
+
+ ```cpp
+ size_t _pos;
+ size_t _flash_address;
+ const sfud_flash *_flash;
+
+ byte _buffer[HTTP_TCP_BUFFER_SIZE];
+ ```
+
+ This defines a temporary buffer to store data read from the flash memory, along with fields to store the current position when reading from the buffer, the current address to read from the flash memory, and the flash memory device.
+
+1. In the `private` section, add the following method:
+
+ ```cpp
+ void populateBuffer()
+ {
+ sfud_read(_flash, _flash_address, HTTP_TCP_BUFFER_SIZE, _buffer);
+ _flash_address += HTTP_TCP_BUFFER_SIZE;
+ _pos = 0;
+ }
+ ```
+
+ This code reads from the flash memory at the current address and stores the data in a buffer. It then increments the address, so the next call reads the next block of memory. The buffer is sized based on the largest chunk that the `HTTPClient` will send to the REST API at one time.
+
+ > 💁 Erasing flash memory has to be done using the grain size; reading, on the other hand, does not.
+
+1. In the `public` section of this class, add a constructor:
+
+ ```cpp
+ FlashStream()
+ {
+ _pos = 0;
+ _flash_address = 0;
+ _flash = sfud_get_device_table() + 0;
+
+ populateBuffer();
+ }
+ ```
+
+ This constructor sets up all the fields to start reading from the start of the flash memory block, and loads the first chunk of data into the buffer.
+
+1. Implement the `write` method. This stream will only read data, so this can do nothing and return 0:
+
+ ```cpp
+ virtual size_t write(uint8_t val)
+ {
+ return 0;
+ }
+ ```
+
+1. Implement the `peek` method. This returns the data at the current position without moving the stream along. Calling `peek` multiple times will always return the same data as long as no data is read from the stream.
+
+ ```cpp
+ virtual int peek()
+ {
+ return _buffer[_pos];
+ }
+ ```
+
+1. Implement the `available` function. This returns how many bytes can be read from the stream, or -1 if the stream is complete. For this class, the maximum available will never be more than the HTTP client's chunk size. When this stream is used by the HTTP client, the client calls this function to see how much data is available, then requests that much data to send to the REST API. We don't want each chunk to be more than the HTTP client's chunk size, so if more than that is available, the chunk size is returned. If less, then what is available is returned. Once all the data has been streamed, -1 is returned.
+
+ ```cpp
+ virtual int available()
+ {
+ int remaining = BUFFER_SIZE - ((_flash_address - HTTP_TCP_BUFFER_SIZE) + _pos);
+ int bytes_available = min(HTTP_TCP_BUFFER_SIZE, remaining);
+
+ if (bytes_available == 0)
+ {
+ bytes_available = -1;
+ }
+
+ return bytes_available;
+ }
+ ```
+
+1. Implement the `read` method to return the next byte from the buffer, incrementing the position. If the position exceeds the size of the buffer, it populates the buffer with the next block from the flash memory and resets the position.
+
+ ```cpp
+ virtual int read()
+ {
+ int retVal = _buffer[_pos++];
+
+ if (_pos == HTTP_TCP_BUFFER_SIZE)
+ {
+ populateBuffer();
+ }
+
+ return retVal;
+ }
+ ```
+
+1. In the `speech_to_text.h` header file, add an include directive for this new header file:
+
+ ```cpp
+ #include "flash_stream.h"
+ ```
+
+### Task - convert the speech to text
+
+1. The speech can be converted to text by sending the audio to the Speech Service via a REST API. This REST API has a different certificate to the token issuer, so add the following code to the `config.h` header file to define this certificate:
+
+ ```cpp
+ const char *SPEECH_CERTIFICATE =
+ "-----BEGIN CERTIFICATE-----\r\n"
+ "MIIF8zCCBNugAwIBAgIQCq+mxcpjxFFB6jvh98dTFzANBgkqhkiG9w0BAQwFADBh\r\n"
+ "MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3\r\n"
+ "d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBH\r\n"
+ "MjAeFw0yMDA3MjkxMjMwMDBaFw0yNDA2MjcyMzU5NTlaMFkxCzAJBgNVBAYTAlVT\r\n"
+ "MR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKjAoBgNVBAMTIU1pY3Jv\r\n"
+ "c29mdCBBenVyZSBUTFMgSXNzdWluZyBDQSAwMTCCAiIwDQYJKoZIhvcNAQEBBQAD\r\n"
+ "ggIPADCCAgoCggIBAMedcDrkXufP7pxVm1FHLDNA9IjwHaMoaY8arqqZ4Gff4xyr\r\n"
+ "RygnavXL7g12MPAx8Q6Dd9hfBzrfWxkF0Br2wIvlvkzW01naNVSkHp+OS3hL3W6n\r\n"
+ "l/jYvZnVeJXjtsKYcXIf/6WtspcF5awlQ9LZJcjwaH7KoZuK+THpXCMtzD8XNVdm\r\n"
+ "GW/JI0C/7U/E7evXn9XDio8SYkGSM63aLO5BtLCv092+1d4GGBSQYolRq+7Pd1kR\r\n"
+ "EkWBPm0ywZ2Vb8GIS5DLrjelEkBnKCyy3B0yQud9dpVsiUeE7F5sY8Me96WVxQcb\r\n"
+ "OyYdEY/j/9UpDlOG+vA+YgOvBhkKEjiqygVpP8EZoMMijephzg43b5Qi9r5UrvYo\r\n"
+ "o19oR/8pf4HJNDPF0/FJwFVMW8PmCBLGstin3NE1+NeWTkGt0TzpHjgKyfaDP2tO\r\n"
+ "4bCk1G7pP2kDFT7SYfc8xbgCkFQ2UCEXsaH/f5YmpLn4YPiNFCeeIida7xnfTvc4\r\n"
+ "7IxyVccHHq1FzGygOqemrxEETKh8hvDR6eBdrBwmCHVgZrnAqnn93JtGyPLi6+cj\r\n"
+ "WGVGtMZHwzVvX1HvSFG771sskcEjJxiQNQDQRWHEh3NxvNb7kFlAXnVdRkkvhjpR\r\n"
+ "GchFhTAzqmwltdWhWDEyCMKC2x/mSZvZtlZGY+g37Y72qHzidwtyW7rBetZJAgMB\r\n"
+ "AAGjggGtMIIBqTAdBgNVHQ4EFgQUDyBd16FXlduSzyvQx8J3BM5ygHYwHwYDVR0j\r\n"
+ "BBgwFoAUTiJUIBiV5uNu5g/6+rkS7QYXjzkwDgYDVR0PAQH/BAQDAgGGMB0GA1Ud\r\n"
+ "JQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjASBgNVHRMBAf8ECDAGAQH/AgEAMHYG\r\n"
+ "CCsGAQUFBwEBBGowaDAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu\r\n"
+ "Y29tMEAGCCsGAQUFBzAChjRodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln\r\n"
+ "aUNlcnRHbG9iYWxSb290RzIuY3J0MHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6Ly9j\r\n"
+ "cmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5jcmwwN6A1oDOG\r\n"
+ "MWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RHMi5j\r\n"
+ "cmwwHQYDVR0gBBYwFDAIBgZngQwBAgEwCAYGZ4EMAQICMBAGCSsGAQQBgjcVAQQD\r\n"
+ "AgEAMA0GCSqGSIb3DQEBDAUAA4IBAQAlFvNh7QgXVLAZSsNR2XRmIn9iS8OHFCBA\r\n"
+ "WxKJoi8YYQafpMTkMqeuzoL3HWb1pYEipsDkhiMnrpfeYZEA7Lz7yqEEtfgHcEBs\r\n"
+ "K9KcStQGGZRfmWU07hPXHnFz+5gTXqzCE2PBMlRgVUYJiA25mJPXfB00gDvGhtYa\r\n"
+ "+mENwM9Bq1B9YYLyLjRtUz8cyGsdyTIG/bBM/Q9jcV8JGqMU/UjAdh1pFyTnnHEl\r\n"
+ "Y59Npi7F87ZqYYJEHJM2LGD+le8VsHjgeWX2CJQko7klXvcizuZvUEDTjHaQcs2J\r\n"
+ "+kPgfyMIOY1DMJ21NxOJ2xPRC/wAh/hzSBRVtoAnyuxtkZ4VjIOh\r\n"
+ "-----END CERTIFICATE-----\r\n";
+ ```
+
+1. Add a constant to this file for the speech URL without the location. This will be combined with the location and language later to get the full URL.
+
+ ```cpp
+ const char *SPEECH_URL = "https://%s.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=%s";
+ ```
+
+1. In the `speech_to_text.h` header file, in the `private` section of the `SpeechToText` class, define a field for a WiFi Client using the speech certificate:
+
+ ```cpp
+ WiFiClientSecure _speech_client;
+ ```
+
+1. In the `init` method, set the certificate on this WiFi Client:
+
+ ```cpp
+ _speech_client.setCACert(SPEECH_CERTIFICATE);
+ ```
+
+1. Add the following code to the `public` section of the `SpeechToText` class to define a method to convert speech to text:
+
+ ```cpp
+ String convertSpeechToText()
+ {
+
+ }
+ ```
+
+1. Add the following code to this method to create an HTTP client using the WiFi client configured with the speech certificate, and using the speech URL set with the location and language:
+
+ ```cpp
+ char url[128];
+ sprintf(url, SPEECH_URL, SPEECH_LOCATION, LANGUAGE);
+
+ HTTPClient httpClient;
+ httpClient.begin(_speech_client, url);
+ ```
+
+1. Some headers need to be set on the connection:
+
+ ```cpp
+ httpClient.addHeader("Authorization", String("Bearer ") + _access_token);
+ httpClient.addHeader("Content-Type", String("audio/wav; codecs=audio/pcm; samplerate=") + String(RATE));
+ httpClient.addHeader("Accept", "application/json;text/xml");
+ ```
+
+ This sets a header for authorization using the access token, a header for the audio format including the sample rate, and a header indicating that the client expects the result as JSON.
+
+1. After this, add the following code to make the REST API call:
+
+ ```cpp
+ Serial.println("Sending speech...");
+
+ FlashStream stream;
+ int httpResponseCode = httpClient.sendRequest("POST", &stream, BUFFER_SIZE);
+
+ Serial.println("Speech sent!");
+ ```
+
+ This creates a `FlashStream` and uses it to stream data to the REST API.
+
+1. Below this, add the following code:
+
+ ```cpp
+ String text = "";
+
+ if (httpResponseCode == 200)
+ {
+ String result = httpClient.getString();
+ Serial.println(result);
+
+ DynamicJsonDocument doc(1024);
+ deserializeJson(doc, result.c_str());
+
+ JsonObject obj = doc.as<JsonObject>();
+ text = obj["DisplayText"].as<String>();
+ }
+ else if (httpResponseCode == 401)
+ {
+ Serial.println("Access token expired, trying again with a new token");
+ _access_token = getAccessToken();
+ return convertSpeechToText();
+ }
+ else
+ {
+ Serial.print("Failed to convert speech to text - error ");
+ Serial.println(httpResponseCode);
+ }
+ ```
+
+ This code checks the response code.
+
+ If it is 200, the code for success, then the result is retrieved, decoded from JSON, and the `DisplayText` property is set into the `text` variable. This is the property that the text version of the speech is returned in.
+
+ If the response code is 401, then the access token has expired (these tokens only last 10 minutes). A new access token is requested, and the call is made again.
+
+ Otherwise, an error is sent to the serial monitor, and the `text` is left blank.
+
+1. Add the following code to the end of this method to close the HTTP client and return the text:
+
+ ```cpp
+ httpClient.end();
+
+ return text;
+ ```
+
+1. In `main.cpp` call this new `convertSpeechToText` method in the `processAudio` function, then log out the speech to the serial monitor:
+
+ ```cpp
+ String text = speechToText.convertSpeechToText();
+ Serial.println(text);
+ ```
+
+1. Build this code, upload it to your Wio Terminal and test it out through the serial monitor. Press the C button (the one on the left-hand side, closest to the power switch), and speak. 4 seconds of audio will be captured, then converted to text.
+
+ ```output
+ --- Available filters and text transformations: colorize, debug, default, direct, hexlify, log2file, nocontrol, printable, send_on_enter, time
+ --- More details at http://bit.ly/pio-monitor-filters
+ --- Miniterm on /dev/cu.usbmodem1101 9600,8,N,1 ---
+ --- Quit: Ctrl+C | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
+ Connecting to WiFi..
+ Connected!
+ Got access token.
+ Ready.
+ Starting recording...
+ Finished recording
+ Sending speech...
+ Speech sent!
+ {"RecognitionStatus":"Success","DisplayText":"Set a 2 minute and 27 second timer.","Offset":4700000,"Duration":35300000}
+ Set a 2 minute and 27 second timer.
+ ```
+
+> 💁 You can find this code in the [code-speech-to-text/wio-terminal](code-speech-to-text/wio-terminal) folder.
+
+😀 Your speech to text program was a success!
diff --git a/6-consumer/lessons/2-language-understanding/README.md b/6-consumer/lessons/2-language-understanding/README.md
index 00dcb0ac..e6da5a28 100644
--- a/6-consumer/lessons/2-language-understanding/README.md
+++ b/6-consumer/lessons/2-language-understanding/README.md
@@ -1,33 +1,527 @@
# Understand language
-Add a sketchnote if possible/appropriate
-
-
-
## Pre-lecture quiz
-[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/33)
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/43)
## Introduction
-In this lesson you will learn about
+In the last lesson you converted speech to text. For this to be used to program a smart timer, your code will need to have an understanding of what was said. You could assume the user will speak a fixed phrase, such as "Set a 3 minute timer", and parse that expression to get how long the timer should be, but this isn't very user-friendly. If a user were to say "Set a timer for 3 minutes", you or I would understand what they mean, but your code would not, as it would be expecting a fixed phrase.
+
+This is where language understanding comes in, using AI models to interpret text and return the details that are needed, for example being able to take both "Set a 3 minute timer" and "Set a timer for 3 minutes", and understand that a timer is required for 3 minutes.
+
+In this lesson you will learn about language understanding models, how to create them, train them, and use them from your code.
In this lesson we'll cover:
-* [Thing 1](#thing-1)
+* [Language understanding](#language-understanding)
+* [Create a language understanding model](#create-a-language-understanding-model)
+* [Intents and entities](#intents-and-entities)
+* [Use the language understanding model](#use-the-language-understanding-model)
+
+## Language understanding
+
+Humans have used language to communicate for hundreds of thousands of years. We communicate with words, sounds, or actions, and understand what is said: not only the meaning of the words, sounds, or actions, but also their context. We understand sincerity and sarcasm, allowing the same words to mean different things depending on the tone of our voice.
+
+✅ Think about some of the conversations you have had recently. How much of the conversation would be hard for a computer to understand because it needs context?
+
+Language understanding, also called natural-language understanding, is part of a field of artificial intelligence called natural-language processing (or NLP), and deals with reading comprehension, trying to understand the details of words or sentences. If you use a voice assistant such as Alexa or Siri, you have used language understanding services. These are the behind-the-scenes AI services that convert "Alexa, play the latest album by Taylor Swift" into my daughter dancing around the living room to her favorite tunes.
+
+> 💁 Computers, despite all their advances, still have a long way to go to truly understand text. When we refer to language understanding with computers, we don't mean anything anywhere near as advanced as human communication, instead we mean taking some words and extracting key details.
+
+As humans, we understand language without really thinking about it. If I asked another human to "play the latest album by Taylor Swift" then they would instinctively know what I meant. For a computer, this is harder. It would have to take the words, converted from speech to text, and work out the following pieces of information:
+
+* Music needs to be played
+* The music is by the artist Taylor Swift
+* The specific music is a whole album of multiple tracks in order
+* Taylor Swift has many albums, so they need to be sorted in chronological order, and the most recently published is the one required
+
+✅ Think of some other sentences you have spoken when making requests, such as ordering coffee or asking a family member to pass you something. Try to break them down into the pieces of information a computer would need to extract to understand the sentence.
+
+Language understanding models are AI models that are trained to extract certain details from language, and then are trained for specific tasks using transfer learning, in the same way you trained a Custom Vision model using a small set of images. You can take a model, then train it using the text you want it to understand.
+
+## Create a language understanding model
+
+
+
+You can create language understanding models using LUIS, a language understanding service from Microsoft that is part of Cognitive Services.
+
+### Task - create an authoring resource
+
+To use LUIS, you need to create an authoring resource.
+
+1. Use the following command to create an authoring resource in your `smart-timer` resource group:
+
+ ```sh
+ az cognitiveservices account create --name smart-timer-luis-authoring \
+ --resource-group smart-timer \
+ --kind LUIS.Authoring \
+ --sku F0 \
+ --yes \
+ --location <location>
+ ```
+
+ Replace `<location>` with the location you used when creating the Resource Group.
+
+ > ⚠️ LUIS isn't available in all regions, so if you get the following error:
+ >
+ > ```output
+ > InvalidApiSetId: The account type 'LUIS.Authoring' is either invalid or unavailable in given region.
+ > ```
+ >
+ > pick a different region.
+
+ This will create a free-tier LUIS authoring resource.
+
+### Task - create a language understanding app
+
+1. Open the LUIS portal at [luis.ai](https://luis.ai?WT.mc_id=academic-17441-jabenn) in your browser, and sign in with the same account you have been using for Azure.
+
+1. Follow the instructions on the dialog to select your Azure subscription, then select the `smart-timer-luis-authoring` resource you have just created.
+
+1. From the *Conversation apps* list, select the **New app** button to create a new application. Name the new app `smart-timer`, and set the *Culture* to your language.
+
+ > 💁 There is a field for a prediction resource. You can create a second resource just for prediction, but the free authoring resource allows 1,000 predictions a month which should be enough for development, so you can leave this blank.
+
+1. Read through the guide that appears once you create the app to get an understanding of the steps you need to take to train the language understanding model. Close this guide when you are done.
+
+## Intents and entities
+
+Language understanding is based around *intents* and *entities*. Intents capture what the user wants to do, for example playing music, setting a timer, or ordering food. Entities are the things the intent refers to, such as the album, the length of the timer, or the type of food. Each sentence that the model interprets should have at least one intent, and optionally one or more entities.
+
+Some examples:
+
+| Sentence | Intent | Entities |
+| --------------------------------------------------- | ---------------- | ------------------------------------------ |
+| "Play the latest album by Taylor Swift" | *play music* | *the latest album by Taylor Swift* |
+| "Set a 3 minute timer" | *set a timer* | *3 minutes* |
+| "Cancel my timer" | *cancel a timer* | None |
+| "Order 3 large pineapple pizzas and a caesar salad" | *order food* | *3 large pineapple pizzas*, *caesar salad* |
+
+✅ With the sentences you thought about earlier, what would be the intent and any entities in that sentence?
+
+To train LUIS, first you set the entities. These can be a fixed list of terms, or learned from the text. For example, you could provide a fixed list of food available from your menu, with variations (or synonyms) of each word, such as *egg plant* and *eggplant* as variations of *aubergine*. LUIS also has pre-built entities that can be used, such as numbers and locations.
+
+For setting a timer, you could have one entity using the pre-built number entities for the time, and another for the units, such as minutes and seconds. Each unit would have multiple variations to cover the singular and plural forms - such as minute and minutes.
+
+Once the entities are defined, you create intents. These are learned by the model based on example sentences that you provide (known as utterances). For example, for a *set timer* intent, you might provide the following sentences:
+
+* `set a 1 second timer`
+* `set a timer for 1 minute and 12 seconds`
+* `set a timer for 3 minutes`
+* `set a 9 minute 30 second timer`
+
+You then tell LUIS what parts of these sentences map to the entities:
+
+
+
+The sentence `set a timer for 1 minute and 12 seconds` has the intent of `set timer`. It also has 2 entities with 2 values each:
+
+| | time | unit |
+| ---------- | ---: | ------ |
+| 1 minute | 1 | minute |
+| 12 seconds | 12 | second |
+
+To train a good model, you need a range of different example sentences to cover the many different ways someone might ask for the same thing.
+
+> 💁 As with any AI model, the more data and the more accurate the data you use to train, the better the model.
+
+✅ Think about the different ways you might ask the same thing and expect a human to understand.
+
+### Task - add entities to the language understanding models
+
+For the timer, you need to add 2 entities - one for the unit of time (minutes or seconds), and one for the number of minutes or seconds.
+
+You can find instructions for using the LUIS portal in the [Quickstart: Build your app in LUIS portal documentation on Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/luis/luis-get-started-create-app?WT.mc_id=academic-17441-jabenn).
+
+1. From the LUIS portal, select the *Entities* tab and add the *number* prebuilt entity by selecting the **Add prebuilt entity** button, then selecting *number* from the list.
+
+1. Create a new entity for the time unit using the **Create** button. Name the entity `time unit` and set the type to *List*. Add values for `minute` and `second` to the *Normalized values* list, adding the singular and plural forms to the *synonyms* list. Press `return` after adding each synonym to add it to the list.
+
+ | Normalized value | Synonyms |
+ | ---------------- | --------------- |
+ | minute | minute, minutes |
+ | second | second, seconds |
+
+### Task - add intents to the language understanding models
+
+1. From the *Intents* tab, select the **Create** button to create a new intent. Name this intent `set timer`.
+
+1. In the examples, enter different ways to set a timer using minutes, seconds, and minutes and seconds combined. Examples could be:
+
+ * `set a 1 second timer`
+ * `set a 4 minute timer`
+ * `set a four minute six second timer`
+ * `set a 9 minute 30 second timer`
+ * `set a timer for 1 minute and 12 seconds`
+ * `set a timer for 3 minutes`
+ * `set a timer for 3 minutes and 1 second`
+ * `set a timer for three minutes and one second`
+ * `set a timer for 1 minute and 1 second`
+ * `set a timer for 30 seconds`
+ * `set a timer for 1 second`
+
+ Mix up numbers as words and numerics so the model learns to handle both.
+
+1. As you enter each example, LUIS will start detecting entities, and will underline and label any it finds.
+
+ 
+
+### Task - train and test the model
+
+1. Once the entities and intents are configured, you can train the model using the **Train** button on the top menu. Select this button, and the model should train in a few seconds. The button will be greyed out whilst training, and be re-enabled once done.
+
+1. Select the **Test** button from the top menu to test the language understanding model. Enter text such as `set a timer for 5 minutes and 4 seconds` and press return. The sentence will appear in a box under the text box that you typed it into, and below that will be the *top intent*, or the intent that was detected with the highest probability. This should be `set timer`. The intent name will be followed by the probability that the intent detected was the right one.
+
+1. Select the **Inspect** option to see a breakdown of the results. You will see the top-scoring intent with its percentage probability, along with lists of the entities detected.
+
+1. Close the *Test* pane when you are done testing.
+
+### Task - publish the model
+
+To use this model from code, you need to publish it. When publishing from LUIS, you can publish to either a staging environment for testing, or a production environment for a full release. In this lesson, a staging environment is fine.
+
+1. From the LUIS portal, select the **Publish** button from the top menu.
+
+1. Make sure *Staging slot* is selected, then select **Done**. You will see a notification when the app is published.
+
+1. You can test this using curl. To build the curl command, you need three values - the endpoint, the application ID (App ID) and an API key. These can be accessed from the **MANAGE** tab that can be selected from the top menu.
+
+ 1. From the *Settings* section, copy the App ID
+
+ 1. From the *Azure Resources* section, select *Authoring Resource*, and copy the *Primary Key* and *Endpoint URL*
+
+1. Run the following curl command in your command prompt or terminal:
+
+ ```sh
+ curl "<ENDPOINT URL>/luis/prediction/v3.0/apps/<APP ID>/slots/staging/predict" \
+ --request GET \
+ --get \
+ --data "subscription-key=<API KEY>" \
+ --data "verbose=false" \
+ --data "show-all-intents=true" \
+ --data-urlencode "query=<SENTENCE>"
+ ```
+
+ Replace `<ENDPOINT URL>` with the Endpoint URL from the *Azure Resources* section.
+
+ Replace `<APP ID>` with the App ID from the *Settings* section.
+
+ Replace `<API KEY>` with the Primary Key from the *Azure Resources* section.
+
+ Replace `<SENTENCE>` with the sentence you want to test with.
+
+1. The output of this call will be a JSON document that details the query, the top intent, and a list of entities broken down by type.
+
+ ```JSON
+ {
+ "query": "set a timer for 45 minutes and 12 seconds",
+ "prediction": {
+ "topIntent": "set timer",
+ "intents": {
+ "set timer": {
+ "score": 0.97031575
+ },
+ "None": {
+ "score": 0.02205793
+ }
+ },
+ "entities": {
+ "number": [
+ 45,
+ 12
+ ],
+ "time-unit": [
+ [
+ "minute"
+ ],
+ [
+ "second"
+ ]
+ ]
+ }
+ }
+ }
+ ```
+
+ The JSON above came from querying with `set a timer for 45 minutes and 12 seconds`:
+
+ * The `set timer` was the top intent with a probability of 97%.
+ * Two *number* entities were detected, `45` and `12`.
+ * Two *time-unit* entities were detected, `minute` and `second`.
-## Thing 1
+## Use the language understanding model
+
+Once published, the LUIS model can be called from code. In previous lessons, you have used an IoT Hub to handle communication with cloud services, sending telemetry and listening for commands. This is very asynchronous - once telemetry is sent your code doesn't wait for a response, and if the cloud service is down, you wouldn't know.
+
+For a smart timer, we want a response straight away, so we can tell the user that a timer is set, or alert them that the cloud services are unavailable. To do this, our IoT device will call a web endpoint directly, instead of relying on an IoT Hub.
+
+Rather than calling LUIS from the IoT device, you can use serverless code with a different type of trigger - an HTTP trigger. This allows your function app to listen for REST requests, and respond to them. This function will be a REST endpoint your device can call.
+
+> 💁 Although you can call LUIS directly from your IoT device, it's better to use something like serverless code. This way, if you want to change the LUIS app that you call, for example when you train a better model or a model in a different language, you only have to update your cloud code, not re-deploy code to potentially thousands or millions of IoT devices.
+
+### Task - create a serverless functions app
+
+1. Create an Azure Functions app called `smart-timer-trigger`, and open this in VS Code
+
+1. Add an HTTP trigger to this app called `text-to-timer` using the following command from inside the VS Code terminal:
+
+ ```sh
+ func new --name text-to-timer --template "HTTP trigger"
+ ```
+
+ This will create an HTTP trigger called `text-to-timer`.
+
+1. Test the HTTP trigger by running the functions app. When it runs you will see the endpoint listed in the output:
+
+ ```output
+ Functions:
+
+ text-to-timer: [GET,POST] http://localhost:7071/api/text-to-timer
+ ```
+
+ Test this by loading the [http://localhost:7071/api/text-to-timer](http://localhost:7071/api/text-to-timer) URL in your browser.
+
+ ```output
+ This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.
+ ```
+
+### Task - use the language understanding model
+
+1. The SDK for LUIS is available via a Pip package. Add the following line to the `requirements.txt` file to add the dependency on this package:
+
+ ```sh
+ azure-cognitiveservices-language-luis
+ ```
+
+1. Make sure the VS Code terminal has the virtual environment activated, and run the following command to install the Pip packages:
+
+ ```sh
+ pip install -r requirements.txt
+ ```
+
+ > 💁 If you get errors, you may need to upgrade pip with the following command:
+ >
+ > ```sh
+ > pip install --upgrade pip
+ > ```
+
+1. Add new entries to the `local.settings.json` file for your LUIS API Key, Endpoint URL, and App ID from the **MANAGE** tab of the LUIS portal:
+
+ ```JSON
+ "LUIS_KEY": "",
+ "LUIS_ENDPOINT_URL": "",
+ "LUIS_APP_ID": ""
+ ```
+
+ Set `LUIS_ENDPOINT_URL` to the Endpoint URL from the *Azure Resources* section of the **MANAGE** tab. This will be `https://<location>.api.cognitive.microsoft.com/`, where `<location>` is the location of your authoring resource.
+
+ Set `LUIS_APP_ID` to the App ID from the *Settings* section of the **MANAGE** tab.
+
+ Set `LUIS_KEY` to the Primary Key from the *Azure Resources* section of the **MANAGE** tab.
+
+1. Add the following imports to the `__init__.py` file:
+
+ ```python
+ import json
+ import os
+ from azure.cognitiveservices.language.luis.runtime import LUISRuntimeClient
+ from msrest.authentication import CognitiveServicesCredentials
+ ```
+
+ This imports some system libraries, as well as the libraries to interact with LUIS.
+
+1. Delete the contents of the `main` method, and add the following code:
+
+ ```python
+ luis_key = os.environ['LUIS_KEY']
+ endpoint_url = os.environ['LUIS_ENDPOINT_URL']
+ app_id = os.environ['LUIS_APP_ID']
+
+ credentials = CognitiveServicesCredentials(luis_key)
+ client = LUISRuntimeClient(endpoint=endpoint_url, credentials=credentials)
+ ```
+
+ This loads the values you added to the `local.settings.json` file for your LUIS app, creates a credentials object with your API key, then creates a LUIS client object to interact with your LUIS app.
+
+1. This HTTP trigger will be called passing the text to understand as JSON, with the text in a property called `text`. The following code extracts the value from the body of the HTTP request, and logs it to the console. Add this code to the `main` function:
+
+ ```python
+ req_body = req.get_json()
+ text = req_body['text']
+ logging.info(f'Request - {text}')
+ ```
+
+1. Predictions are requested from LUIS by sending a prediction request - a JSON document containing the text to predict. Create this with the following code:
+
+ ```python
+ prediction_request = { 'query' : text }
+ ```
+
+1. This request can then be sent to LUIS, using the staging slot that your app was published to:
+
+ ```python
+ prediction_response = client.prediction.get_slot_prediction(app_id, 'Staging', prediction_request)
+ ```
+
+1. The prediction response contains the top intent - the intent with the highest prediction score, along with the entities. If the top intent is `set timer`, then the entities can be read to get the time needed for the timer:
+
+ ```python
+ if prediction_response.prediction.top_intent == 'set timer':
+ numbers = prediction_response.prediction.entities['number']
+ time_units = prediction_response.prediction.entities['time unit']
+ total_seconds = 0
+ ```
+
+ The `number` entities will be an array of numbers. For example, if you said *"Set a four minute 17 second timer."*, then the `number` array will contain 2 integers - 4 and 17.
+
+ The `time unit` entities will be an array of arrays of strings, with each time unit as an array of strings inside the array. For example, if you said *"Set a four minute 17 second timer."*, then the `time unit` array will contain 2 arrays with single values each - `['minute']` and `['second']`.
+
+ The JSON version of these entities for *"Set a four minute 17 second timer."* is:
+
+ ```json
+ {
+ "number": [4, 17],
+ "time unit": [
+ ["minute"],
+ ["second"]
+ ]
+ }
+ ```
+
+ This code also defines a variable to hold the total time for the timer in seconds. This will be populated by the values from the entities.
+
+1. The entities aren't linked, but we can make some assumptions about them. They will be in the order spoken, so the position in the array can be used to determine which number matches to which time unit. For example:
+
+ * *"Set a 30 second timer"* - this will have one number, `30`, and one time unit, `second` so the single number will match the single time unit.
+ * *"Set a 2 minute and 30 second timer"* - this will have two numbers, `2` and `30`, and two time units, `minute` and `second` so the first number will be for the first time unit (2 minutes), and the second number for the second time unit (30 seconds).
+
+ The following code gets the count of items in the number entities, and uses that to extract the first item from each array, then the second and so on. Add this inside the `if` block.
+
+ ```python
+ for i in range(0, len(numbers)):
+ number = numbers[i]
+ time_unit = time_units[i][0]
+ ```
+
+ For *"Set a four minute 17 second timer."*, this will loop twice, giving the following values:
+
+ | loop count | `number` | `time_unit` |
+ | ---------: | -------: | ----------- |
+ | 0 | 4 | minute |
+ | 1 | 17 | second |
+
+1. Inside this loop, use the number and time unit to calculate the total time for the timer, adding 60 seconds for each minute, and the number of seconds for any seconds.
+
+ ```python
+ if time_unit == 'minute':
+ total_seconds += number * 60
+ else:
+ total_seconds += number
+ ```
+
+1. After this loop through the entities, still inside the `if` block, log the total time for the timer:
+
+ ```python
+ logging.info(f'Timer required for {total_seconds} seconds')
+ ```
+
+1. The number of seconds needs to be returned from the function as an HTTP response. At the end of the `if` block, add the following:
+
+ ```python
+ payload = {
+ 'seconds': total_seconds
+ }
+ return func.HttpResponse(json.dumps(payload), status_code=200)
+ ```
+
+ This code creates a payload containing the total number of seconds for the timer, converts it to a JSON string and returns it as an HTTP result with a status code of 200, which means the call was successful.
+
+1. Finally, outside the `if` block, handle if the intent was not recognized by returning an error code:
+
+ ```python
+ return func.HttpResponse(status_code=404)
+ ```
+
+ 404 is the status code for *not found*.
+
+1. Run the function app and test it out using curl.
+
+ ```sh
+ curl --request POST 'http://localhost:7071/api/text-to-timer' \
+ --header 'Content-Type: application/json' \
+ --include \
+ --data '{"text":"<text>"}'
+ ```
+
+ Replace `<text>` with the text of your request, for example `set a 2 minute 27 second timer`.
+
+ You will see the following output from the functions app:
+
+ ```output
+ Functions:
+
+ text-to-timer: [GET,POST] http://localhost:7071/api/text-to-timer
+
+ For detailed output, run func with --verbose flag.
+ [2021-06-26T19:45:14.502Z] Worker process started and initialized.
+ [2021-06-26T19:45:19.338Z] Host lock lease acquired by instance ID '000000000000000000000000951CAE4E'.
+ [2021-06-26T19:45:52.059Z] Executing 'Functions.text-to-timer' (Reason='This function was programmatically called via the host APIs.', Id=f68bfb90-30e4-47a5-99da-126b66218e81)
+ [2021-06-26T19:45:53.577Z] Timer required for 147 seconds
+ [2021-06-26T19:45:53.746Z] Executed 'Functions.text-to-timer' (Succeeded, Id=f68bfb90-30e4-47a5-99da-126b66218e81, Duration=1750ms)
+ ```
+
+ The call to curl will return the following:
+
+ ```output
+ HTTP/1.1 200 OK
+ Date: Tue, 29 Jun 2021 01:14:11 GMT
+ Content-Type: text/plain; charset=utf-8
+ Server: Kestrel
+ Transfer-Encoding: chunked
+
+ {"seconds": 147}
+ ```
+
+ The number of seconds for the timer is in the `"seconds"` value.
+
+> 💁 You can find this code in the [code/functions](code/functions) folder.
+
+### Task - make your function available to your IoT device
+
+1. For your IoT device to call your REST endpoint, it will need to know the URL. When you accessed it earlier, you used `localhost`, which is a shortcut to access REST endpoints on your local machine. To allow your IoT device to get access, you need to either:
+
+ * Publish the Functions app - follow the instructions in earlier lessons to publish your functions app to the cloud. Once published, the URL will be `http://<APP NAME>.azurewebsites.net/api/text-to-timer`, where `<APP NAME>` is the name of your functions app.
+ * Run the functions app locally, and access using the IP address - you can get the IP address of your computer on your local network, and use that to build the URL.
+
+ Find your IP address:
+
+ * On Windows 10, follow the [Find your IP address guide](https://support.microsoft.com/windows/find-your-ip-address-f21a9bbc-c582-55cd-35e0-73431160a1b9?WT.mc_id=academic-17441-jabenn)
+ * On macOS, follow the [How to find your IP address on a Mac guide](https://www.hellotech.com/guide/for/how-to-find-ip-address-on-mac)
+ * On Linux, follow the section on finding your private IP address in the [How to find your IP address in Linux guide](https://opensource.com/article/18/5/how-find-ip-address-linux)
+
+ Once you have your IP address, you will be able to access the function at `http://<IP ADDRESS>:7071/api/text-to-timer`, where `<IP ADDRESS>` is your IP address, for example `http://192.168.1.10:7071/api/text-to-timer`.
+
+ > 💁 This will only work if your IoT device is on the same network as your computer.
+
+1. Test the endpoint by accessing it using your browser.
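+Since the trigger reads the text to understand from a JSON body, a plain browser request won't exercise the full code path. As an alternative, here is a minimal Python sketch - using the `requests` package and an example address - that posts to the endpoint in the same way your IoT device will:
+
+```python
+import requests
+
+# Example URL - replace with your functions app URL, or your computer's IP address when running locally
+url = 'http://192.168.1.10:7071/api/text-to-timer'
+
+response = requests.post(url, json={'text': 'set a 2 minute 27 second timer'})
+
+# A 200 response contains the number of seconds, for example {"seconds": 147}
+# A 404 response means the set timer intent was not recognized
+print(response.status_code)
+print(response.text)
+```
+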
---
## 🚀 Challenge
+There are many ways to request the same thing, such as setting a timer. Think of different ways to do this, and use them as examples in your LUIS app. Test these out, to see how well your model can cope with multiple ways to request a timer.
+
## Post-lecture quiz
-[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/44)
## Review & Self Study
+* Read more about LUIS and its capabilities on the [Language Understanding (LUIS) documentation page on Microsoft docs](https://docs.microsoft.com/azure/cognitive-services/luis/?WT.mc_id=academic-17441-jabenn)
+* Read more about language understanding on the [Natural-language understanding page on Wikipedia](https://wikipedia.org/wiki/Natural-language_understanding)
+* Read more on HTTP triggers in the [Azure Functions HTTP trigger documentation on Microsoft docs](https://docs.microsoft.com/azure/azure-functions/functions-bindings-http-webhook-trigger?tabs=python&WT.mc_id=academic-17441-jabenn)
+
## Assignment
-[](assignment.md)
+[Cancel the timer](assignment.md)
diff --git a/6-consumer/lessons/2-language-understanding/assignment.md b/6-consumer/lessons/2-language-understanding/assignment.md
index da157d5c..acfdd3ae 100644
--- a/6-consumer/lessons/2-language-understanding/assignment.md
+++ b/6-consumer/lessons/2-language-understanding/assignment.md
@@ -1,9 +1,14 @@
-#
+# Cancel the timer
## Instructions
+So far in this lesson you have trained a model to understand setting a timer. Another useful feature is cancelling a timer - maybe your bread is ready and can be taken out of the oven before the timer has elapsed.
+
+Add a new intent to your LUIS app to cancel the timer. It won't need any entities, but will need some example sentences. Handle this in your serverless code if it is the top intent, logging that the intent was recognized and returning an appropriate response.
+
## Rubric
| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
-| | | | |
+| Add the cancel timer intent to the LUIS app | Was able to add the intent and train the model | Was able to add the intent but not train the model | Was unable to add the intent and train the model |
+| Handle the intent in the serverless app | Was able to detect the intent as the top intent and log it | Was able to detect the intent as the top intent | Was unable to detect the intent as the top intent |
diff --git a/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/host.json b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/host.json
new file mode 100644
index 00000000..291065f8
--- /dev/null
+++ b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/host.json
@@ -0,0 +1,15 @@
+{
+ "version": "2.0",
+ "logging": {
+ "applicationInsights": {
+ "samplingSettings": {
+ "isEnabled": true,
+ "excludedTypes": "Request"
+ }
+ }
+ },
+ "extensionBundle": {
+ "id": "Microsoft.Azure.Functions.ExtensionBundle",
+ "version": "[2.*, 3.0.0)"
+ }
+}
\ No newline at end of file
diff --git a/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/local.settings.json b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/local.settings.json
new file mode 100644
index 00000000..ee6b34ce
--- /dev/null
+++ b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/local.settings.json
@@ -0,0 +1,10 @@
+{
+ "IsEncrypted": false,
+ "Values": {
+ "FUNCTIONS_WORKER_RUNTIME": "python",
+ "AzureWebJobsStorage": "",
+ "LUIS_KEY": "",
+ "LUIS_ENDPOINT_URL": "",
+ "LUIS_APP_ID": ""
+ }
+}
\ No newline at end of file
diff --git a/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/requirements.txt b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/requirements.txt
new file mode 100644
index 00000000..d0405a38
--- /dev/null
+++ b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/requirements.txt
@@ -0,0 +1,4 @@
+# Do not include azure-functions-worker as it may conflict with the Azure Functions platform
+
+azure-functions
+azure-cognitiveservices-language-luis
\ No newline at end of file
diff --git a/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/text-to-timer/__init__.py b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/text-to-timer/__init__.py
new file mode 100644
index 00000000..d15d6e68
--- /dev/null
+++ b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/text-to-timer/__init__.py
@@ -0,0 +1,46 @@
+import logging
+
+import azure.functions as func
+import json
+import os
+from azure.cognitiveservices.language.luis.runtime import LUISRuntimeClient
+from msrest.authentication import CognitiveServicesCredentials
+
+
+def main(req: func.HttpRequest) -> func.HttpResponse:
+ luis_key = os.environ['LUIS_KEY']
+ endpoint_url = os.environ['LUIS_ENDPOINT_URL']
+ app_id = os.environ['LUIS_APP_ID']
+
+ credentials = CognitiveServicesCredentials(luis_key)
+ client = LUISRuntimeClient(endpoint=endpoint_url, credentials=credentials)
+
+ req_body = req.get_json()
+ text = req_body['text']
+ logging.info(f'Request - {text}')
+ prediction_request = { 'query' : text }
+
+ prediction_response = client.prediction.get_slot_prediction(app_id, 'Staging', prediction_request)
+
+ if prediction_response.prediction.top_intent == 'set timer':
+ numbers = prediction_response.prediction.entities['number']
+ time_units = prediction_response.prediction.entities['time unit']
+ total_seconds = 0
+
+ for i in range(0, len(numbers)):
+ number = numbers[i]
+ time_unit = time_units[i][0]
+
+ if time_unit == 'minute':
+ total_seconds += number * 60
+ else:
+ total_seconds += number
+
+ logging.info(f'Timer required for {total_seconds} seconds')
+
+ payload = {
+ 'seconds': total_seconds
+ }
+ return func.HttpResponse(json.dumps(payload), status_code=200)
+
+ return func.HttpResponse(status_code=404)
\ No newline at end of file
diff --git a/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/text-to-timer/function.json b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/text-to-timer/function.json
new file mode 100644
index 00000000..d9019652
--- /dev/null
+++ b/6-consumer/lessons/2-language-understanding/code/functions/smart-timer-trigger/text-to-timer/function.json
@@ -0,0 +1,20 @@
+{
+ "scriptFile": "__init__.py",
+ "bindings": [
+ {
+ "authLevel": "function",
+ "type": "httpTrigger",
+ "direction": "in",
+ "name": "req",
+ "methods": [
+ "get",
+ "post"
+ ]
+ },
+ {
+ "type": "http",
+ "direction": "out",
+ "name": "$return"
+ }
+ ]
+}
\ No newline at end of file
diff --git a/6-consumer/lessons/3-spoken-feedback/README.md b/6-consumer/lessons/3-spoken-feedback/README.md
index da6a602b..3cff7a3f 100644
--- a/6-consumer/lessons/3-spoken-feedback/README.md
+++ b/6-consumer/lessons/3-spoken-feedback/README.md
@@ -1,33 +1,123 @@
-# Provide spoken feedback
-
-Add a sketchnote if possible/appropriate
-
-
+# Set a timer and provide spoken feedback
## Pre-lecture quiz
-[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/33)
+[Pre-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/45)
## Introduction
-In this lesson you will learn about
+Smart assistants are not one-way communication devices. You speak to them, and they respond:
+
+"Alexa, set a 3 minute timer"
+
+"Ok, your timer is set for 3 minutes"
+
+In the last 2 lessons you learned how to take speech and create text, then extract a set timer request from that text. In this lesson you will learn how to set the timer on the IoT device, responding to the user with spoken words confirming their timer, and alerting them when their timer is finished.
In this lesson we'll cover:
-* [Thing 1](#thing-1)
+* [Text to speech](#text-to-speech)
+* [Set the timer](#set-the-timer)
+* [Convert text to speech](#convert-text-to-speech)
+
+## Text to speech
+
+Text to speech, as the name suggests, is the process of converting text into audio that contains the text as spoken words. The basic principle is to break down the words in the text into their constituent sounds (known as phonemes), and stitch together audio for those sounds, either using pre-recorded audio or using audio generated by AI models.
+
+
+
+Text to speech systems typically have 3 stages:
+
+* Text analysis
+* Linguistic analysis
+* Wave-form generation
+
+### Text analysis
+
+Text analysis involves taking the text provided, and converting it into words that can be used to generate speech. For example, if you convert "Hello world", there is no text analysis needed: the two words can be converted directly to speech. If you have "1234" however, then this might need to be converted either into the words "One thousand, two hundred thirty four" or "One, two, three, four" depending on the context. For "I have 1234 apples", it would be "One thousand, two hundred thirty four", but for "The child counted 1234" it would be "One, two, three, four".
+
+The words created vary not only with the language, but also with the locale of that language. For example, in American English, 120 would be "One hundred twenty", whereas in British English it would be "One hundred and twenty", with the use of "and" after the hundreds.
+
+✅ Some other examples that require text analysis include "in" as a short form of inch, and "st" as a short form of saint and street. Can you think of other examples in your language of words that are ambiguous without context?
+
+Once the words have been defined, they are sent for linguistic analysis.
+
+### Linguistic analysis
+
+Linguistic analysis breaks the words down into phonemes. Phonemes are based not just on the letters used, but the other letters in the word. For example, in English the 'a' sound in 'car' and 'care' is different. The English language has 44 different phonemes for the 26 letters in the alphabet, some shared by different letters, such as the same phoneme used at the start of 'circle' and 'serpent'.
+
+✅ Do some research: What are the phonemes for your language?
+
+Once the words have been converted to phonemes, these phonemes need additional data to support intonation, adjusting the tone or duration depending on the context. One example is in English, where a pitch increase can be used to convert a statement into a question: raising the pitch of the last word implies a question.
+
+For example - the sentence "You have an apple" is a statement saying that you have an apple. If the pitch goes up at the end, increasing for the word apple, it becomes the question "You have an apple?", asking if you have an apple. The linguistic analysis needs to use the question mark at the end to decide to increase the pitch.
+
+Once the phonemes have been generated, they can be sent for wave-form generation to produce the audio output.
-## Thing 1
+### Wave-form generation
+
+The first electronic text to speech systems used single audio recordings for each phoneme, leading to very monotonous, robotic sounding voices. The linguistic analysis would produce phonemes; these would be loaded from a database of sounds and stitched together to make the audio.
+
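+To make the idea of stitching sounds together concrete, here is a very rough Python sketch of this kind of concatenative synthesis. It assumes you have pre-recorded WAV files for each phoneme (the file names here are made up), all sharing the same sample rate and format:
+
+```python
+import wave
+
+# Hypothetical database of pre-recorded phoneme sounds - one WAV file per phoneme
+phoneme_files = {'HH': 'hh.wav', 'EH': 'eh.wav', 'L': 'l.wav', 'OW': 'ow.wav'}
+
+def stitch_phonemes(phonemes, output_file):
+    frames = []
+    params = None
+
+    # Load the recording for each phoneme in turn and collect the audio frames
+    for phoneme in phonemes:
+        with wave.open(phoneme_files[phoneme], 'rb') as phoneme_wav:
+            params = phoneme_wav.getparams()
+            frames.append(phoneme_wav.readframes(phoneme_wav.getnframes()))
+
+    # Write the frames out one after the other to make the spoken word
+    with wave.open(output_file, 'wb') as output_wav:
+        output_wav.setparams(params)
+        output_wav.writeframes(b''.join(frames))
+
+# 'Hello' as a sequence of phonemes
+stitch_phonemes(['HH', 'EH', 'L', 'OW'], 'hello.wav')
+```
+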
+✅ Do some research: Find some audio recordings from early speech synthesis systems. Compare it to modern speech synthesis, such as that used in smart assistants.
+
+More modern wave-form generation uses ML models built using deep learning (very large neural networks that act in a similar way to neurons in the brain) to produce more natural sounding voices that can be indistinguishable from humans.
+
+> 💁 Some of these ML models can be re-trained using transfer learning to sound like real people. This means using voice as a security system, something banks are increasingly trying to do, is no longer a good idea as anyone with a recording of a few minutes of your voice can impersonate you.
+
+These large ML models are being trained to combine all three steps into end-to-end speech synthesizers.
+
+## Set the timer
+
+To set the timer, your IoT device needs to call the REST endpoint you created using serverless code, then use the resulting number of seconds to set a timer.
+
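+As a rough sketch of this flow, the following Python snippet calls the text-to-timer function (the URL is a placeholder) and uses the returned number of seconds to schedule a callback. The device-specific guides below cover the real implementation for your hardware:
+
+```python
+import threading
+import requests
+
+# Placeholder - replace with the URL of your text-to-timer function
+url = 'http://192.168.1.10:7071/api/text-to-timer'
+
+def announce_timer(minutes, seconds):
+    # Called when the timer fires - later in this lesson this response will be spoken aloud
+    print(f'Your {minutes} minute {seconds} second timer is complete')
+
+def create_timer(text):
+    response = requests.post(url, json={'text': text})
+    if response.status_code != 200:
+        # The set timer intent wasn't recognized
+        return
+
+    seconds = response.json()['seconds']
+    # Schedule the announcement to run once the timer has elapsed
+    threading.Timer(seconds, announce_timer, args=[seconds // 60, seconds % 60]).start()
+
+create_timer('set a 2 minute 27 second timer')
+```
+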
+### Task - call the serverless function to get the timer time
+
+Follow the relevant guide to call the REST endpoint from your IoT device and set a timer for the required time:
+
+* [Arduino - Wio Terminal](wio-terminal-set-timer.md)
+* [Single-board computer - Raspberry Pi/Virtual IoT device](single-board-computer-set-timer.md)
+
+## Convert text to speech
+
+The same speech service you used to convert speech to text can be used to convert text back into speech, and this can be played through a speaker on your IoT device. The text to convert is sent to the speech service, along with the type of audio required (such as the sample rate), and binary data containing the audio is returned.
+
+When you send this request, you send it using *Speech Synthesis Markup Language* (SSML), an XML-based markup language for speech synthesis applications. This defines not only the text to be converted, but also the language of the text and the voice to use, and can even be used to define speed, volume, and pitch for some or all of the words in the text.
+
+For example, this SSML defines a request to convert the text "Your 3 minute 5 second timer has been set" to speech using a British English voice called `en-GB-MiaNeural`:
+
+```xml
+<speak version='1.0' xml:lang='en-GB'>
+    <voice xml:lang='en-GB' name='en-GB-MiaNeural'>
+        Your 3 minute 5 second timer has been set
+    </voice>
+</speak>
+```
+
+> 💁 Most text to speech systems have multiple voices for different languages, with relevant accents such as a British English voice with an English accent and a New Zealand English voice with a New Zealand accent.
+
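+To see what this request looks like in isolation, here is a minimal Python sketch that sends SSML to the speech service REST endpoint and saves the returned audio. The location and access token are placeholders, and the output format shown is just one of the formats the service supports:
+
+```python
+import requests
+
+location = '<location>'             # Placeholder - the location of your speech resource
+access_token = '<access token>'     # Placeholder - a token from the token issuer endpoint
+
+ssml = ("<speak version='1.0' xml:lang='en-GB'>"
+        "<voice xml:lang='en-GB' name='en-GB-MiaNeural'>"
+        "Your 3 minute 5 second timer has been set"
+        "</voice></speak>")
+
+response = requests.post(
+    f'https://{location}.tts.speech.microsoft.com/cognitiveservices/v1',
+    headers={
+        'Authorization': 'Bearer ' + access_token,
+        'Content-Type': 'application/ssml+xml',
+        # The audio format to return - here 16KHz 16-bit mono PCM in a WAV (RIFF) container
+        'X-Microsoft-OutputFormat': 'riff-16khz-16bit-mono-pcm'
+    },
+    data=ssml.encode('utf-8'))
+
+# The response body is binary audio that can be saved, or played through a speaker
+with open('timer.wav', 'wb') as audio_file:
+    audio_file.write(response.content)
+```
+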
+### Task - convert text to speech
+
+Work through the relevant guide to convert text to speech using your IoT device:
+
+* [Arduino - Wio Terminal](wio-terminal-text-to-speech.md)
+* [Single-board computer - Raspberry Pi](pi-text-to-speech.md)
+* [Single-board computer - Virtual device](virtual-device-text-to-speech.md)
---
## 🚀 Challenge
+SSML has ways to change how words are spoken, such as adding emphasis to certain words, adding pauses, or changing pitch. Try some of these out, sending different SSML from your IoT device and comparing the output. You can read more about SSML, including how to change the way words are spoken in the [Speech Synthesis Markup Language (SSML) Version 1.1 specification from the World Wide Web consortium](https://www.w3.org/TR/speech-synthesis11/).
+
## Post-lecture quiz
-[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/34)
+[Post-lecture quiz](https://brave-island-0b7c7f50f.azurestaticapps.net/quiz/46)
## Review & Self Study
+* Read more on speech synthesis on the [Speech synthesis page on Wikipedia](https://wikipedia.org/wiki/Speech_synthesis)
+* Read more on ways criminals are using speech synthesis to steal on the [Fake voices 'help cyber crooks steal cash' story on BBC news](https://www.bbc.com/news/technology-48908736)
+
## Assignment
-[](assignment.md)
+[Cancel the timer](assignment.md)
diff --git a/6-consumer/lessons/3-spoken-feedback/assignment.md b/6-consumer/lessons/3-spoken-feedback/assignment.md
index da157d5c..efaad571 100644
--- a/6-consumer/lessons/3-spoken-feedback/assignment.md
+++ b/6-consumer/lessons/3-spoken-feedback/assignment.md
@@ -1,9 +1,12 @@
-#
+# Cancel the timer
## Instructions
+In the assignment for the last lesson, you added a cancel timer intent to LUIS. For this assignment, you need to handle this intent in the serverless code, send a command to the IoT device, and then cancel the timer on the device.
+
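+One possible shape for the device-side cancellation, assuming you keep a reference to the `threading.Timer` object when you create it, is sketched below; the names here are illustrative and not part of the lesson code.
+
+```python
+import threading
+
+current_timer = None   # illustrative: the most recently started timer
+
+def create_timer(total_seconds):
+    global current_timer
+    current_timer = threading.Timer(total_seconds, lambda: print('Timer finished'))
+    current_timer.start()
+
+def cancel_timer():
+    # Call this when the cancel command arrives on the device
+    global current_timer
+    if current_timer is not None:
+        current_timer.cancel()
+        current_timer = None
+```
+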
## Rubric
| Criteria | Exemplary | Adequate | Needs Improvement |
| -------- | --------- | -------- | ----------------- |
-| | | | |
+| Handle the intent in serverless code and send a command | Was able to handle the intent and send a command to the device | Was able to handle the intent but was unable to send the command to the device | Was unable to handle the intent |
+| Cancel the timer on the device | Was able to receive the command and cancel the timer | Was able to receive the command but not cancel the timer | Was unable to receive the command |
diff --git a/6-consumer/lessons/3-spoken-feedback/code-spoken-response/pi/smart-timer/app.py b/6-consumer/lessons/3-spoken-feedback/code-spoken-response/pi/smart-timer/app.py
new file mode 100644
index 00000000..50290b21
--- /dev/null
+++ b/6-consumer/lessons/3-spoken-feedback/code-spoken-response/pi/smart-timer/app.py
@@ -0,0 +1,189 @@
+import io
+import json
+import pyaudio
+import requests
+import time
+import wave
+import threading
+
+from grove.factory import Factory
+button = Factory.getButton('GPIO-HIGH', 5)
+
+audio = pyaudio.PyAudio()
+microphone_card_number = 1
+speaker_card_number = 1
+rate = 16000
+
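+# Capture audio from the microphone while the button is held down and return it as an in-memory WAV file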
+def capture_audio():
+ stream = audio.open(format = pyaudio.paInt16,
+ rate = rate,
+ channels = 1,
+ input_device_index = microphone_card_number,
+ input = True,
+ frames_per_buffer = 4096)
+
+ frames = []
+
+ while button.is_pressed():
+ frames.append(stream.read(4096))
+
+ stream.stop_stream()
+ stream.close()
+
+ wav_buffer = io.BytesIO()
+ with wave.open(wav_buffer, 'wb') as wavefile:
+ wavefile.setnchannels(1)
+ wavefile.setsampwidth(audio.get_sample_size(pyaudio.paInt16))
+ wavefile.setframerate(rate)
+ wavefile.writeframes(b''.join(frames))
+ wav_buffer.seek(0)
+
+ return wav_buffer
+
+speech_api_key = ''
+location = ''
+language = ''
+
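+# Get an access token for the speech service using the API key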
+def get_access_token():
+ headers = {
+ 'Ocp-Apim-Subscription-Key': speech_api_key
+ }
+
+ token_endpoint = f'https://{location}.api.cognitive.microsoft.com/sts/v1.0/issuetoken'
+ response = requests.post(token_endpoint, headers=headers)
+ return str(response.text)
+
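+# Send the captured audio to the speech to text REST API and return the recognized text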
+def convert_speech_to_text(buffer):
+ url = f'https://{location}.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1'
+
+ headers = {
+ 'Authorization': 'Bearer ' + get_access_token(),
+ 'Content-Type': f'audio/wav; codecs=audio/pcm; samplerate={rate}',
+ 'Accept': 'application/json;text/xml'
+ }
+
+ params = {
+ 'language': language
+ }
+
+ response = requests.post(url, headers=headers, params=params, data=buffer)
+ response_json = json.loads(response.text)
+
+ if response_json['RecognitionStatus'] == 'Success':
+ return response_json['DisplayText']
+ else:
+ return ''
+
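+# Call the serverless function (set url to your function's endpoint) to convert the text into a number of seconds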
+def get_timer_time(text):
+ url = ''
+
+ body = {
+ 'text': text
+ }
+
+ response = requests.post(url, json=body)
+
+ if response.status_code != 200:
+ return 0
+
+ payload = response.json()
+ return payload['seconds']
+
+def process_text(text):
+ print(text)
+
+ seconds = get_timer_time(text)
+ if seconds > 0:
+ create_timer(seconds)
+
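+# Get the name of the first voice from the speech service that matches the configured language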
+def get_voice():
+ url = f'https://{location}.tts.speech.microsoft.com/cognitiveservices/voices/list'
+
+ headers = {
+ 'Authorization': 'Bearer ' + get_access_token()
+ }
+
+ response = requests.get(url, headers=headers)
+ voices_json = json.loads(response.text)
+
+ first_voice = next(x for x in voices_json if x['Locale'].lower() == language.lower())
+ return first_voice['ShortName']
+
+voice = get_voice()
+print(f'Using voice {voice}')
+
+playback_format = 'riff-48khz-16bit-mono-pcm'
+
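+# Convert text to speech by sending SSML to the speech service, returning the audio as a WAV buffer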
+def get_speech(text):
+ url = f'https://{location}.tts.speech.microsoft.com/cognitiveservices/v1'
+
+ headers = {
+ 'Authorization': 'Bearer ' + get_access_token(),
+ 'Content-Type': 'application/ssml+xml',
+ 'X-Microsoft-OutputFormat': playback_format
+ }
+
+    ssml = f'<speak version=\'1.0\' xml:lang=\'{language}\'>'
+    ssml += f'<voice xml:lang=\'{language}\' name=\'{voice}\'>'
+    ssml += text
+    ssml += '</voice>'
+    ssml += '</speak>'
+
+ response = requests.post(url, headers=headers, data=ssml.encode('utf-8'))
+ return io.BytesIO(response.content)
+
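+# Play the returned WAV audio through the speaker using PyAudio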
+def play_speech(speech):
+ with wave.open(speech, 'rb') as wave_file:
+ stream = audio.open(format=audio.get_format_from_width(wave_file.getsampwidth()),
+ channels=wave_file.getnchannels(),
+ rate=wave_file.getframerate(),
+ output_device_index=speaker_card_number,
+ output=True)
+
+ data = wave_file.readframes(4096)
+
+ while len(data) > 0:
+ stream.write(data)
+ data = wave_file.readframes(4096)
+
+ stream.stop_stream()
+ stream.close()
+
+def say(text):
+ speech = get_speech(text)
+ play_speech(speech)
+
+def announce_timer(minutes, seconds):
+ announcement = 'Times up on your '
+ if minutes > 0:
+ announcement += f'{minutes} minute '
+ if seconds > 0:
+ announcement += f'{seconds} second '
+ announcement += 'timer.'
+ say(announcement)
+
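+# Start a background timer for the requested time and announce that it has started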
+def create_timer(total_seconds):
+ minutes, seconds = divmod(total_seconds, 60)
+ threading.Timer(total_seconds, announce_timer, args=[minutes, seconds]).start()
+ announcement = ''
+ if minutes > 0:
+ announcement += f'{minutes} minute '
+ if seconds > 0:
+ announcement += f'{seconds} second '
+ announcement += 'timer started.'
+ say(announcement)
+
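+# Handle a set-timer command by creating a timer for the number of seconds in the payload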
+def handle_method_request(request):
+ if request.name == 'set-timer':
+ payload = json.loads(request.payload)
+ seconds = payload['seconds']
+ if seconds > 0:
+ create_timer(payload['seconds'])
+
+while True:
+ while not button.is_pressed():
+ time.sleep(.1)
+
+ buffer = capture_audio()
+ text = convert_speech_to_text(buffer)
+ process_text(text)
\ No newline at end of file
diff --git a/6-consumer/lessons/3-spoken-feedback/code-spoken-response/virtual-iot-device/smart-timer/app.py b/6-consumer/lessons/3-spoken-feedback/code-spoken-response/virtual-iot-device/smart-timer/app.py
new file mode 100644
index 00000000..fa2e3c5a
--- /dev/null
+++ b/6-consumer/lessons/3-spoken-feedback/code-spoken-response/virtual-iot-device/smart-timer/app.py
@@ -0,0 +1,86 @@
+import requests
+import threading
+import time
+from azure.cognitiveservices.speech import SpeechConfig, SpeechRecognizer, SpeechSynthesizer
+
+speech_api_key = ''
+location = ''
+language = ''
+
+recognizer_config = SpeechConfig(subscription=speech_api_key,
+ region=location,
+ speech_recognition_language=language)
+
+recognizer = SpeechRecognizer(speech_config=recognizer_config)
+
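+# Speak the given text, pausing recognition while speaking so the microphone does not pick up the synthesized speech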
+def say(text):
+    ssml = f'<speak version=\'1.0\' xml:lang=\'{language}\'>'
+    ssml += f'<voice xml:lang=\'{language}\' name=\'{first_voice.short_name}\'>'
+    ssml += text
+    ssml += '</voice>'
+    ssml += '</speak>'
+
+ recognizer.stop_continuous_recognition()
+ speech_synthesizer.speak_ssml(ssml)
+ recognizer.start_continuous_recognition()
+
+def announce_timer(minutes, seconds):
+ announcement = 'Times up on your '
+ if minutes > 0:
+ announcement += f'{minutes} minute '
+ if seconds > 0:
+ announcement += f'{seconds} second '
+ announcement += 'timer.'
+ say(announcement)
+
+def create_timer(total_seconds):
+ minutes, seconds = divmod(total_seconds, 60)
+ threading.Timer(total_seconds, announce_timer, args=[minutes, seconds]).start()
+ announcement = ''
+ if minutes > 0:
+ announcement += f'{minutes} minute '
+ if seconds > 0:
+ announcement += f'{seconds} second '
+ announcement += 'timer started.'
+ say(announcement)
+
+def get_timer_time(text):
+ url = ''
+
+ body = {
+ 'text': text
+ }
+
+ response = requests.post(url, json=body)
+
+ if response.status_code != 200:
+ return 0
+
+ payload = response.json()
+ return payload['seconds']
+
+def process_text(text):
+ print(text)
+
+ seconds = get_timer_time(text)
+ if seconds > 0:
+ create_timer(seconds)
+
+def recognized(args):
+ process_text(args.result.text)
+
+recognizer.recognized.connect(recognized)
+
+recognizer.start_continuous_recognition()
+
+speech_config = SpeechConfig(subscription=speech_api_key,
+ region=location)
+speech_config.speech_synthesis_language = language
+speech_synthesizer = SpeechSynthesizer(speech_config=speech_config)
+
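+# Use the first voice that matches the configured language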
+voices = speech_synthesizer.get_voices_async().get().voices
+first_voice = next(x for x in voices if x.locale.lower() == language.lower())
+speech_config.speech_synthesis_voice_name = first_voice.short_name
+
+while True:
+ time.sleep(1)
\ No newline at end of file
diff --git a/6-consumer/lessons/1-speech-recognition/code-iot-hub/pi/smart-timer/app.py b/6-consumer/lessons/3-spoken-feedback/code-timer/pi/smart-timer/app.py
similarity index 62%
rename from 6-consumer/lessons/1-speech-recognition/code-iot-hub/pi/smart-timer/app.py
rename to 6-consumer/lessons/3-spoken-feedback/code-timer/pi/smart-timer/app.py
index b821a839..478501c8 100644
--- a/6-consumer/lessons/1-speech-recognition/code-iot-hub/pi/smart-timer/app.py
+++ b/6-consumer/lessons/3-spoken-feedback/code-timer/pi/smart-timer/app.py
@@ -1,27 +1,17 @@
import io
-import json
import pyaudio
import requests
+import threading
import time
import wave
-from azure.iot.device import IoTHubDeviceClient, Message
-
from grove.factory import Factory
button = Factory.getButton('GPIO-HIGH', 5)
-connection_string = ''
-
-device_client = IoTHubDeviceClient.create_from_connection_string(connection_string)
-
-print('Connecting')
-device_client.connect()
-print('Connected')
-
audio = pyaudio.PyAudio()
microphone_card_number = 1
speaker_card_number = 1
-rate = 48000
+rate = 16000
def capture_audio():
stream = audio.open(format = pyaudio.paInt16,
@@ -49,13 +39,13 @@ def capture_audio():
return wav_buffer
-api_key = ''
+speech_api_key = ''
location = ''
language = ''
def get_access_token():
headers = {
- 'Ocp-Apim-Subscription-Key': api_key
+ 'Ocp-Apim-Subscription-Key': speech_api_key
}
token_endpoint = f'https://{location}.api.cognitive.microsoft.com/sts/v1.0/issuetoken'
@@ -76,19 +66,63 @@ def convert_speech_to_text(buffer):
}
response = requests.post(url, headers=headers, params=params, data=buffer)
- response_json = json.loads(response.text)
+ response_json = response.json()
if response_json['RecognitionStatus'] == 'Success':
return response_json['DisplayText']
else:
return ''
+def get_timer_time(text):
+ url = ''
+
+ body = {
+ 'text': text
+ }
+
+ response = requests.post(url, json=body)
+
+ if response.status_code != 200:
+ return 0
+
+ payload = response.json()
+ return payload['seconds']
+
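+# For now just print the text - spoken feedback replaces this later in the lesson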
+def say(text):
+ print(text)
+
+def announce_timer(minutes, seconds):
+ announcement = 'Times up on your '
+ if minutes > 0:
+ announcement += f'{minutes} minute '
+ if seconds > 0:
+ announcement += f'{seconds} second '
+ announcement += 'timer.'
+ say(announcement)
+
+def create_timer(total_seconds):
+ minutes, seconds = divmod(total_seconds, 60)
+ threading.Timer(total_seconds, announce_timer, args=[minutes, seconds]).start()
+
+ announcement = ''
+ if minutes > 0:
+ announcement += f'{minutes} minute '
+ if seconds > 0:
+ announcement += f'{seconds} second '
+ announcement += 'timer started.'
+ say(announcement)
+
+def process_text(text):
+ print(text)
+
+ seconds = get_timer_time(text)
+ if seconds > 0:
+ create_timer(seconds)
+
while True:
while not button.is_pressed():
time.sleep(.1)
buffer = capture_audio()
text = convert_speech_to_text(buffer)
- if len(text) > 0:
- message = Message(json.dumps({ 'speech': text }))
- device_client.send_message(message)
\ No newline at end of file
+ process_text(text)
\ No newline at end of file
diff --git a/6-consumer/lessons/3-spoken-feedback/code-timer/virtual-iot-device/smart-timer/app.py b/6-consumer/lessons/3-spoken-feedback/code-timer/virtual-iot-device/smart-timer/app.py
new file mode 100644
index 00000000..0f745b8a
--- /dev/null
+++ b/6-consumer/lessons/3-spoken-feedback/code-timer/virtual-iot-device/smart-timer/app.py
@@ -0,0 +1,70 @@
+import requests
+import threading
+import time
+from azure.cognitiveservices.speech import SpeechConfig, SpeechRecognizer
+
+speech_api_key = ''
+location = ''
+language = '