UBISS2024
Latest revision as of 11:35, 5 September 2024
This course is designed as a week-long tutorial on working with ubiquitous devices in the domain of smart environments and on using machine learning to build smart devices. Here, we use an Arduino Nano RP2040 Connect.
Final Projects

StressLess Shell

See the teaser video on YouTube by Songyan Teng, Jingyao Zheng, and Tim Zindulka.

Plant Monitoring and Warning System

See the teaser video on YouTube by Shenxiu Wu and Huong Nguyen.

IntelliPen

The pen that can recognize the characters that you write! See the teaser video on YouTube by Mohammed Khalili and Ali Mahmoudi.

Hand Gesture Recognition

Find the code and documentation on GitHub. The project was completed by Mohammed Farhoudi and Samira Kamali Poorazad.
Schedule

Day 1

- 16:00-17:00 Lecture: Introduction & Creating Interactive Smart Objects and Environments
- 17:00-17:45 Hands-On: Task 1: Getting Started (components, tools, and development environments)
- 17:45-18:00 Lecture: Preview of the Tasks Ahead
Day 2

- 10:00-10:45 Lecture: Designing and Implementing Sensor-Based Systems
- 10:45-11:00 The ML Development Cycle: data collection, data cleaning and labeling, selection of the AI/ML approach, hyper-parameter selection and implementing the model, training the model/system, deploying the model, operations, re-training/continuous improvement
- 11:00-12:00 Hands-On: Task 2: Read Data (accelerometer and analog pin)
- 12:00-13:00 Lunch break
- 13:00-14:00 Hands-On: Task 3: Record the accelerometer data for four different actions (labeled dataset), store it, and transfer it to a PC using the Arduino IDE
- 14:00-14:30 Lecture: Rule-Based Systems: how to design them; pros: explainability; cons: good rules are hard to find
- 14:30-15:00 Hands-On: Task 4: Analyze the data in Excel/Google Sheets and find rules for the four actions
- 15:00-15:45 Hands-On: Task 5: Implement your rule-based algorithm in the Arduino IDE; optionally include explanations of why a state is recognized
- 15:45-16:00 Coffee break
- 16:00-17:00 Ideation and testing ideas
- 17:00-17:15 Present your ideas
- 17:15-18:00 Hands-On: Discussion and presentation of the results of Tasks 4 and 5
Day 3

- 10:00-10:30 Lecture: Introduction to Jupyter Notebooks; training an ML model on a given data set and the self-recorded data set on the PC
- 10:30-11:00 Lecture: Introduction to ML Libraries (everywhereML)
- 11:00-12:00 Hands-On: Project specification: ideation and discussion of project ideas, group forming (groups of 2 or 3); specify your projects and refine them to work with the available components
- 12:00-13:00 Lunch break
- 13:00-15:30 Hands-On: Task 6: Getting Started with Jupyter Notebook (installing Jupyter Notebook for MicroPython, controlling an LED, reading data, storing data)
- 15:30-16:00 Coffee break
- 16:00-16:30 Hands-On: Presentation: status update on your project
- 16:30-18:00 Hands-On: Task 7: Deploy Machine Learning Models (implementing a basic model using everywhereML)
Day 4

- 10:00-10:45 Hands-On: Definition of project, project outline
- 10:45-11:15 Hands-On: project presentation: 60 sec per team
- 11:15-12:00 Hands-On: project work
- 12:00-13:00 Lunch break
- 13:00-15:00 Hands-On: project work
- 15:00-15:30 Hands-On: stand-up meeting on project progress
- 15:30-16:00 Coffee break
- 16:00-16:15 Lecture: How to run your system off a battery (see Tutorial AutoRun)
- 16:15-17:30 Hands-On: project work
- 17:30-18:00 Lecture: How to Evaluate ML Solutions (talk and discussion)
Day 5

- 10:00-10:30 Hands-On: stand-up meeting on project challenges and solutions
- 10:30-11:30 Hands-On: project work and preparing the presentation
- 11:30-12:00 Lecture: Pitfalls and Challenges in Developing ML/AI for IoT
- 12:00-13:00 Lunch break
- 13:00-15:30 Hands-On: project work
- 15:30-16:00 Coffee break
- 16:00-16:30 Lecture: Testing and Reporting ML Performance (How to Test the Prototype? & How to Report Performance?)
- 16:30-17:30 Hands-On: Testing of prototype performance
- 17:30-18:00 Hands-On: Open issues for the presentation on Saturday, Feedback sessions
Day 6
- 13:15-18:15: Workshop Result Presentations
- 18:30-18:50: Debriefing
Requirements for the Final Presentation

- The presentation has to be 4 minutes long (we stop you after 4 minutes!)
- First slide: your team name and your names, and, if you want, a photo of the team
- A short video of the tech you envision (up to 60 sec, Kickstarter-style promotion)
- A technology description, including the list of components used in the prototype
- A description of your data set and how it was acquired
- The ML model/approach you took to learn from the data
- An evaluation of how well your ML model works on the data set (and, optionally, in real life)
Final Submissions

You have to upload your final submission to the drive. This should include:

- a video where you explain your technology
  - show the electronic components and name them
  - show the physical setup that you created
  - show the code you wrote and briefly explain it
- a zip file with all the code used in your project
- a schematic/drawing of your system as a PDF or image (drawing it on paper and taking a photo is fine)
- your final presentation (as PDF or PowerPoint)
- [optional] a drawing of your system architecture (a hand drawing is fine)
Tasks

Task 1: Getting Started

- Connect an Arduino Nano RP2040 Connect board; for this, see Install the Arduino Nano RP2040 Connect Firmware
- If you use Linux/macOS and run into issues, check the serial port permissions; see e.g. https://www.xanthium.in/linux-serial-port-programming-using-python-pyserial-and-arduino-avr-pic-microcontroller and https://github.com/arduino/lab-micropython-editor/issues/64
- Install the Arduino Lab for MicroPython development environment: https://labs.arduino.cc/en/labs/micropython
- Task 1.1: Make the orange LED (pin 6) blink using MicroPython: https://docs.arduino.cc/micropython/basics/digital-analog-pins/
- Task 1.2: Connect an external RGB LED (pin D2 = GPIO25, D3 = GPIO15, D4 = GPIO16) and control it (on, off, color mixing, brightness): https://www.sketching-with-hardware.org/wiki/RGB_LED
- Task 1.3: Add the photoresistors to your board, read their values, and write them to a file; see the instructions for LDR.
- Task 1.4: Combine your LDR and RGB_LED examples to change the blinking interval with the measured light value.
Solution Task 1.1: LED Blinking

# Blinky example

import time
from machine import Pin

# This is the only LED pin available on the Nano RP2040,
# other than the RGB LED connected to the Nano WiFi module.
led = Pin(6, Pin.OUT)

while True:
    led.on()
    time.sleep_ms(250)
    led.off()
    time.sleep_ms(200)
Solution Task 1.2: Control External RGB LED

# RGB example

import time
from machine import Pin

# RGB LED connected to the RP2040
ledG = Pin(25, Pin.OUT)
ledR = Pin(15, Pin.OUT)
ledB = Pin(16, Pin.OUT)
print("start")

# cycle through green, red, and blue
while True:
    print("*")
    ledG.on()
    ledR.off()
    ledB.off()
    time.sleep_ms(250)
    ledG.off()
    ledR.on()
    ledB.off()
    time.sleep_ms(250)
    ledG.off()
    ledR.off()
    ledB.on()
    time.sleep_ms(250)
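Solution 1.2 only switches each channel fully on or off; the brightness and color-mixing part of Task 1.2 needs PWM (in MicroPython, machine.PWM with duty_u16). The duty-cycle arithmetic can be sketched hardware-independently; the 8-bit channel range and the helper name color_to_duty are our choices here, not part of any library:

```python
def color_to_duty(value, max_duty=65535):
    """Map an 8-bit color channel (0-255) to a 16-bit PWM duty cycle."""
    if not 0 <= value <= 255:
        raise ValueError("channel value must be in 0..255")
    return value * max_duty // 255

# On the board, you would feed these values into
# machine.PWM(pin).duty_u16(...) for each RGB channel.
print(color_to_duty(0))    # 0  -> channel fully off
print(color_to_duty(255))  # 65535 -> channel fully on
print(color_to_duty(128))  # 32896 -> roughly half brightness
```

Mixing a color then means computing one duty cycle per channel, e.g. (255, 128, 0) for orange.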
Solution Task 1.3: Read Light-Dependent Resistor (LDR)

See LDR. A0 is the analog input, read with 16-bit resolution. The example reads the analog value every second and prints it to the console.

# Example usage for the Arduino Nano
from machine import Pin, ADC
from time import sleep

analogPin = ADC(Pin(26))

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    sleep(1)
Solution Task 1.4: Combine Light-Dependent Resistor (LDR) with Blinking LED

from machine import Pin, ADC
import time

led = Pin(6, Pin.OUT)
analogPin = ADC(Pin(26))

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    rate = analogVal16 / 300  # simple mapping from light value to delay
    led.on()
    time.sleep_ms(int(rate))  # sleep_ms expects an integer
    led.off()
    time.sleep_ms(int(rate))
Task 2: Read Data

- Read data from the accelerometer and the gyro and print them (Arduino IDE): https://docs.arduino.cc/micropython/basics/board-examples/
- Extend your program to write the data from the accelerometer to a file; see the instructions for FileIO.
- Transfer the file to your computer
- Optional: add the photoresistors to your board, read their values, and write them to the file; see the instructions for LDR.
Solution Task 2.1: Read Accelerometer and Gyro

import time
from lsm6dsox import LSM6DSOX
from machine import Pin, I2C

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))

while True:
    accel_data = lsm.accel()
    print('Accelerometer: x:{:>8.3f} y:{:>8.3f} z:{:>8.3f}'.format(*accel_data))
    gyro_data = lsm.gyro()
    print('Gyroscope: x:{:>8.3f} y:{:>8.3f} z:{:>8.3f}'.format(*gyro_data))
    print("")
    time.sleep_ms(100)
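The file-writing step of Task 2 (see FileIO) can be sketched hardware-independently. Here is a minimal logging sketch; the accel.csv filename and the CSV layout are our choices, and the hard-coded sample tuples stand in for repeated lsm.accel() readings so the snippet runs anywhere:

```python
# Log (stand-in) accelerometer samples to a CSV file.
# On the board, replace the samples list with repeated lsm.accel() calls.
samples = [
    (0.01, -0.02, 0.98),
    (0.02, -0.01, 1.01),
    (0.50, 0.30, 0.70),
]

with open("accel.csv", "w") as f:
    f.write("ax,ay,az\n")  # header row
    for ax, ay, az in samples:
        f.write("{:.3f},{:.3f},{:.3f}\n".format(ax, ay, az))

# Read the file back to verify what was written
with open("accel.csv") as f:
    lines = f.read().splitlines()
print(lines[0])    # ax,ay,az
print(len(lines))  # 4 (1 header + 3 data rows)
```

On the board, the resulting file can then be transferred to the PC with the Arduino Lab file manager.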
Solution Task 2.2: Read Analog Values

A0 is the analog input, read with 16-bit resolution. The example reads the analog value every second and prints it to the console.

from machine import Pin, ADC
from time import sleep

analogPin = ADC(Pin(26))

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    sleep(1)
Task 6: Getting Started with Jupyter Notebook

- Connect the board
- Install the Jupyter Notebook
- Read the accelerometer and the gyro and show the values in the notebook
Task 6.1: Is it moved?

- read acceleration and gyro
- calculate the differences between consecutive values
- show an output when it is moved
- create a file on the device that logs when it is moved
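The movement test itself can be written as a small hardware-independent rule. A sketch, under the assumption of a 0.15 g per-axis threshold on the change between consecutive accelerometer readings (on the board, feed it successive lsm.accel() tuples and append hits to a log file):

```python
def is_moved(prev, curr, threshold=0.15):
    """Return True if any accelerometer axis changed by more than
    the threshold between two consecutive (x, y, z) readings."""
    return any(abs(c - p) > threshold for p, c in zip(prev, curr))

# Hard-coded stand-ins for two consecutive lsm.accel() readings
resting = (0.01, -0.02, 0.98)
still   = (0.02, -0.03, 0.99)   # tiny sensor jitter only
shaken  = (0.40, -0.02, 0.60)   # large change on x and z

print(is_moved(resting, still))   # False: all changes below threshold
print(is_moved(resting, shaken))  # True: x changed by 0.39
```

The threshold is a guess to be tuned against the data you recorded in Task 2/3.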
Task 6.2: Was it turned upside down?

- read acceleration and gyro
- make a rule-based "AI" that records when
  - it was put upside down
  - it was turned 360°
  - it was moved "quickly"
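Such rules can be prototyped as a plain function before moving them to the board. A sketch: the thresholds (0.8 g on the z axis, 2.0 on the summed gyro rate) are assumptions to tune against your own recordings, and detecting a full 360° turn would additionally require summing the gyro rate over time, which is omitted here:

```python
def classify(accel, gyro, z_thresh=0.8, move_thresh=2.0):
    """Rule-based 'AI': map one (x, y, z) accel reading and one
    gyro reading to a coarse state label."""
    ax, ay, az = accel
    gx, gy, gz = gyro
    if az < -z_thresh:
        return "upside down"   # gravity points along -z
    if abs(gx) + abs(gy) + abs(gz) > move_thresh:
        return "moved quickly" # large total angular rate
    return "idle"

print(classify((0.0, 0.0, 1.0), (0.0, 0.1, 0.0)))    # idle
print(classify((0.0, 0.1, -0.95), (0.0, 0.0, 0.1)))  # upside down
print(classify((0.2, 0.1, 0.9), (1.5, 1.0, 0.8)))    # moved quickly
```

On the board, each recognized state can be appended to a log file as in Task 6.1.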
Task 7: Deploy Machine Learning Models

We will use https://github.com/eloquentarduino/everywhereml to detect the same gestures as in Task 6.2. See EverywhereML for the full example and a dataset to experiment with. First, install everywhereml:
pip3 install -U everywhereml
Using everywhereml, we can train a model on a more powerful machine and deploy it on a microcontroller. See https://eloquentarduino.com/posts/micropython-machine-learning for an example of such a training process. There are other approaches to code generation, such as m2cgen (https://github.com/BayesWitnesses/m2cgen/tree/master) or emlearn (https://github.com/emlearn/emlearn).
Assuming our trained ML model is stored in the variable clf, we can save it to a file using:
clf.to_micropython_file("MyModel.py")
The file MyModel.py can then be copied to the microcontroller and called directly. Assuming your data is stored in x and you trained a RandomForestClassifier, you can predict with the following snippet:
import MyModel
clf = MyModel.RandomForestClassifier()
clf.predict(x)
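To make the code-generation idea concrete, here is a hand-written toy (not actual everywhereml output) with the shape such an exported model takes: predict is just nested threshold checks derived from the trained trees, which is why it runs on a microcontroller without any ML library:

```python
class TinyTreeClassifier:
    """Toy stand-in for a generated model file such as MyModel.py.
    A real export contains many such branches; no ML library is
    needed at prediction time."""

    def predict_one(self, x):
        # x = [mean accel magnitude, gyro energy] (illustrative features)
        if x[1] <= 0.5:
            return 0   # class 0: device at rest
        if x[0] <= 1.2:
            return 1   # class 1: slow gesture
        return 2       # class 2: fast gesture

    def predict(self, X):
        return [self.predict_one(x) for x in X]

toy_clf = TinyTreeClassifier()
print(toy_clf.predict([[1.0, 0.1], [1.0, 0.9], [2.0, 0.9]]))  # [0, 1, 2]
```

The feature names, thresholds, and class labels here are invented for illustration; the real ones come from your training run.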
Task 8: Connect to WiFi

This is an optional task.

- Connect both boards to WiFi using Tutorial Network
- Use one Arduino Nano RP2040 Connect as output (showing a color)
- Use the other Arduino Nano RP2040 Connect as input (recognize 3 gestures with rules)
Links
See the full list of links: UBISS2024-Links
Local Links

- https://ubicomp.net/sw/db1/var2db.php?
- http://localhost:8888/notebooks/ArduinoNanoRP2040_v01.ipynb
- http://localhost:8888/doc/tree/create-ML-model01.ipynb
Reading

Required Reading before the course
- Albrecht Schmidt (2020) Interactive Human Centered Artificial Intelligence: A Definition and Research Challenges. In Proceedings of the International Conference on Advanced Visual Interfaces (AVI '20). Association for Computing Machinery, New York, NY, USA, Article 3, 1–4. https://doi.org/10.1145/3399715.3400873 https://uni.ubicomp.net/as/iHCAI2020.pdf (4p)
- Albrecht Schmidt and Kristof van Laerhoven (2001) How to build smart appliances? IEEE Personal Communications, vol. 8, no. 4, pp. 66-71, Aug. 2001. https://doi.org/10.1109/98.944006 https://www.eti.uni-siegen.de/ubicomp/papers/sl_ieeepc2001.pdf (6p)
- Albrecht Schmidt (2017) Understanding and researching through making: a plea for functional prototypes. interactions 24, 3, 78-81. https://doi.org/10.1145/3058498 https://www.sketching-with-hardware.org/files/functional3058498.pdf (4p)
- Huy Viet Le, Sven Mayer, and Niels Henze (2021) Deep learning for human-computer interaction. interactions 28, 1 (January-February 2021), 78–82. https://doi.org/10.1145/3436958 https://sven-mayer.com/wp-content/uploads/2021/01/huy2021deep.pdf (5p)
- Huy Viet Le, Sven Mayer, Max Weiß, Jonas Vogelsang, Henrike Weingärtner, and Niels Henze (2020) Shortcut Gestures for Mobile Text Editing on Fully Touch Sensitive Smartphones. ACM Trans. Comput.-Hum. Interact. 27, 5, Article 38. https://sven-mayer.com/wp-content/uploads/2020/09/le2020shortcuts.pdf (38p)
- Judith Hurwitz and Daniel Kirsch (2018) Machine Learning for Dummies. IBM Limited Edition. https://www.ibm.com/downloads/cas/GB8ZMQZ3 (pages 3-18 and 29-47, i.e., Chapters 1 and 3) (35p)
- Chris Garrett. MicroPython: An Intro to Programming Hardware in Python. https://realpython.com/micropython/ (14 pages)
- MicroPython Basics. https://docs.arduino.cc/micropython/basics/micropython-basics/ (5 pages)
Recommended Reading before the course
- John D. Kelleher (2019) Deep Learning. https://mitpress.mit.edu/9780262537551/deep-learning/
- Yuli Vasiliev, Python for Data Science: A Hands-On Introduction, https://nostarch.com/python-data-science
- Tutorial on Jupyter Notebooks: https://www.datacamp.com/tutorial/tutorial-jupyter-notebook
- Alex Smola, and S. V. N. Vishwanathan (2008) Introduction to machine learning. Cambridge University, UK 32.34. https://alex.smola.org/drafts/thebook.pdf
Random Commands
pip install micropython-lsm6dsox
picotool.exe load -x C:\Users\ru42qak\AppData\Roaming\OpenMV\openmvide\firmware\ARDUINO_NANO_RP2040_CONNECT\firmware.bin
pip install jupyterlab
pip install everywhereml
python -m pip install jupyter
git clone https://github.com/goatchurchprime/jupyter_micropython_kernel.git
pip install -e jupyter_micropython_kernel
python -m notebook
python -m jupyter kernelspec list
C:\Users\ru42qak\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\jupyterlab>pip install -e jupyter_micropython_kernel
C:\Users\ru42qak\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\jupyterlab>python -m notebook