UBISS2024


This course is designed as a week-long tutorial on working with ubiquitous devices in the domain of smart environments and on using machine learning to build smart devices. Throughout the course, we use the Arduino Nano RP2040 Connect.

Schedule

Day 1

  • 16:00-17:00 Lecture: Introduction & Creating Interactive Smart Objects and Environments
  • 17:00-17:45 Hands-On: Task 1: Getting Started (components, tools, and development environments)
  • 17:45-18:00 Lecture: Preview of the Tasks Ahead

Day 2

  • 10:00-10:45 Lecture: Designing and Implementing Sensor-Based Systems
  • 10:45-11:00 The ML Development Cycle: data collection, data cleaning and labeling, selection of the AI/ML approach, hyper-parameter selection and model implementation, training the model/system, deploying the model, operations, and re-training/continuous improvement
  • 11:00-12:00 Hands-On: Task 2: Read Data (accelerometer and analog pin)
  • 12:00-13:00 Lunch break
  • 13:00-14:00 Hands-On: Task 3: Record the accelerometer data for four different actions (labeled dataset), store it, and transfer it to the PC using the Arduino IDE
  • 14:00-14:30 Lecture: Rule-based Systems: how to design them, their pros (explainability) and cons (they are hard to design)
  • 14:30-15:00 Hands-On: Task 4: Analyze the data in Excel/Google Sheets and find rules for the four actions
  • 15:00-15:45 Hands-On: Task 5: Implement your rule-based algorithm using the Arduino IDE; optionally include explanations of why the state is recognized
  • 15:45-16:00 Coffee break
  • 16:00-17:00 Ideation and testing ideas
  • 17:00-17:15 Present your ideas
  • 17:15-18:00 Hands-On: Discussion and presentation of the results of Tasks 4 and 5

Day 3

  • 10:00-10:30 Lecture: Introduction to Jupyter Notebooks; training an ML model on the PC based on a given dataset and the self-recorded dataset
  • 10:30-11:00 Lecture: Introduction to ML Libraries (everywhereML)
  • 11:00-12:00 Hands-On: Project specification and ideation: discuss project ideas, form groups of 2 or 3, specify your projects, and check that the required components are available (refine your idea to work with the components at hand)
  • 12:00-13:00 Lunch break
  • 13:00-15:30 Hands-On: Task 6: Getting Started with Jupyter Notebook (installing Jupyter Notebook for MicroPython, controlling the LED, reading data, storing data)
  • 15:30-16:00 Coffee break
  • 16:00-16:30 Hands-On: Presentation: status update on your project
  • 16:30-18:00 Hands-On: Task 7: Deploy Machine Learning Models (implementing a basic model using everywhereML)

Day 4

  • 10:00-10:45 Hands-On: Definition of project, project outline
  • 10:45-11:15 Hands-On: project presentation: 60 sec per team
  • 11:15-12:00 Hands-On: project work
  • 12:00-13:00 Lunch break
  • 13:00-15:00 Hands-On: project work
  • 15:00-15:30 Hands-On: stand-up meeting on project progress
  • 15:30-16:00 Coffee break
  • 16:00-17:30 Hands-On: project work
  • 17:30-18:00 Lecture: How to Evaluate ML Solutions (talk and discussion)

Day 5

  • 10:00-10:30 Hands-On: stand-up meeting — project challenges and solutions
  • 10:30-11:30 Hands-On: project work and preparing the presentation
    • Requirements for the Presentation:
      • Your team name and your names (and, if you want, a photo)
      • A short video of the tech you envision (up to 60 seconds, Kickstarter-style promotion)
      • A technology description, including the list of components used in the prototype
      • A description of your data set and how it was acquired
      • The ML model/approach you used to learn from the data
      • An evaluation of how well your ML model works with the data set (and optionally in real life)
  • 11:30-12:00 Lecture: Pitfalls and Challenges in Developing ML/AI for IoT
  • 12:00-13:00 Lunch break
  • 13:00-15:30 Hands-On: project work
  • 15:30-16:00 Coffee break
  • 16:00-16:30 Lecture: Testing and Reporting ML Performance (How to Test the Prototype? & How to Report Performance?)
  • 16:30-17:30 Hands-On: Testing of prototype performance
  • 17:30-18:00 Hands-On: Open issues for the presentation on Saturday, Feedback sessions

Day 6

  • 13:15-18:15: Workshop Result Presentations
  • 18:30-18:50: Debriefing

Tasks

Task 1: Getting Started

Solution Task 1.1: LED Blinking

# Blinky example

import time
from machine import Pin

# This is the only LED pin available on the Nano RP2040,
# other than the RGB LED connected to the Nano WiFi module.
led = Pin(6, Pin.OUT)

while True:
    led.on()
    time.sleep_ms(250)
    led.off()
    time.sleep_ms(200)

Solution Task 1.2: Control External RGB LED

# RGB example

import time
from machine import Pin

# External RGB LED connected to the RP2040
ledG = Pin(25, Pin.OUT)
ledR = Pin(15, Pin.OUT)
ledB = Pin(16, Pin.OUT)
print("start")

while True:
    print("*")
    ledG.on()
    ledR.off()
    ledB.off()
    time.sleep_ms(250)
    ledG.off()
    ledR.on()
    ledB.off()
    time.sleep_ms(250)
    ledG.off()
    ledR.off()
    ledB.on()
    time.sleep_ms(250)

Solution Task 1.3: Read Light-Dependent Resistor (LDR)

See LDR. A0 is the analog input with 16-bit resolution. The example reads the analog value every second and prints it to the console.

# Example usage for Arduino Nano
from machine import Pin, ADC
from time import sleep

analogPin = ADC(Pin(26))

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    sleep(1)


Solution Task 1.4: Combine Light-Dependent Resistor (LDR) with Blinking LED

from machine import Pin, ADC
import time

led = Pin(6, Pin.OUT)
analogPin = ADC(Pin(26))  # set up the ADC once, outside the loop

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    rate = analogVal16 / 300  # simple mapping from light level to blink delay
    led.on()
    time.sleep_ms(int(rate))  # convert the rate to an integer number of milliseconds
    led.off()
    time.sleep_ms(int(rate))

Task 2: Read Data

  • read data from the accelerometer and the gyro and print them (Arduino IDE) https://docs.arduino.cc/micropython/basics/board-examples/
  • extend your program to write the data from the accelerometer to a file; see the instructions for FileIO (a minimal sketch is shown after Solution Task 2.2 below).
  • transfer the file to your computer
  • optional: add the photoresistors to your board, read their values, and write them to the file; see the instructions for LDR.

Solution Task 2.1: Read Accelerometer and Gyro

import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))

while True:
    accel_data = lsm.accel()
    print('Accelerometer: x:{:>8.3f} y:{:>8.3f} z:{:>8.3f}'.format(*accel_data))
    gyro_data = lsm.gyro()
    print('Gyroscope:     x:{:>8.3f} y:{:>8.3f} z:{:>8.3f}'.format(*gyro_data))
    print("")
    time.sleep_ms(100)

Solution Task 2.2: Read analog values

A0 is the analog input with 16-bit resolution. The example reads the analog value every second and prints it to the console.

from machine import Pin, ADC
from time import sleep

analogPin = ADC(Pin(26))

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    sleep(1)
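The file-logging step of Task 2 (writing the accelerometer readings to a file and transferring it later) has no solution listed above. Below is a minimal sketch, assuming the same LSM6DSOX wiring as in Solution Task 2.1; the file name accel_log.csv, the label string, and the sample count are placeholders to adapt to your own recording.

# Minimal sketch: log accelerometer samples to a CSV file on the board.
# Wiring as in Solution Task 2.1; file name, label, and sample count are
# placeholders for your own recording setup.
import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))
label = "action1"  # change this for each of the four actions

with open("accel_log.csv", "w") as f:
    f.write("t_ms,ax,ay,az,label\n")
    for i in range(500):  # roughly 5 seconds at 10 ms per sample
        ax, ay, az = lsm.accel()
        f.write("{},{:.3f},{:.3f},{:.3f},{}\n".format(time.ticks_ms(), ax, ay, az, label))
        time.sleep_ms(10)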

Task 6: Getting Started with Jupyter Notebook

  • Connect the board
  • Install the Jupyter Notebook
  • Read the accelerometer and the gyro and show the values in the notebook (see the sketch below)
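A minimal sketch of the first notebook cells, assuming the jupyter_micropython_kernel listed under Random Commands below is installed and the board shows up as a serial port; the port name is an example (on Windows it is typically COMx, on Linux/macOS something like /dev/ttyACM0):

# Cell 1: connect the MicroPython kernel to the board
# (the port name is an example; use the port your board enumerates as)
%serialconnect to --port=/dev/ttyACM0 --baud=115200

# Cell 2: read accelerometer and gyro samples and show them in the notebook
import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))
for _ in range(10):
    print("accel:", lsm.accel(), "gyro:", lsm.gyro())
    time.sleep_ms(100)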

Task 6.1: Is it moved?

  • read acceleration and gyro
  • calculate the differences between values
  • show an output when it is moved (see the sketch below)
  • create a file on the device that logs when it is moved
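A minimal rule-based sketch for Task 6.1, assuming the LSM6DSOX wiring from Task 2; the threshold value and the log file name are placeholders to tune and rename as you like.

# Minimal sketch: flag movement when consecutive accelerometer samples differ
# by more than a threshold, and append the event to a log file on the device.
# Wiring as in Task 2; THRESHOLD and the file name are placeholder values.
import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))
THRESHOLD = 0.15  # in g, tune by observing the printed differences

prev = lsm.accel()
while True:
    cur = lsm.accel()
    diff = sum(abs(c - p) for c, p in zip(cur, prev))
    if diff > THRESHOLD:
        print("moved! diff =", diff)
        with open("movement_log.txt", "a") as f:
            f.write("{} moved (diff={:.3f})\n".format(time.ticks_ms(), diff))
    prev = cur
    time.sleep_ms(100)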

Task 6.2: Was it turned upside down?

  • read acceleration and gyro
  • make a rule-based "AI" that records when (a sketch for the first rule is shown after this list)
    • it was put upside down
    • it was turned 360°
    • it was moved "quickly"
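A minimal sketch for the first rule (upside-down detection), assuming lsm.accel() reports acceleration in g and that z is roughly +1 g when the board lies flat and upright; the -0.8 threshold is a placeholder to tune on your own board.

# Minimal sketch: rule-based detection of "turned upside down" via the z-axis.
# Assumes accel() returns values in g and z is roughly +1 when the board is
# upright; the -0.8 threshold is a placeholder to tune on your own board.
import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))

while True:
    ax, ay, az = lsm.accel()
    if az < -0.8:
        print("rule fired: board is upside down (z = {:.2f} g)".format(az))
    time.sleep_ms(100)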

Task 7: Deploy Machine Learning Models

We will use https://github.com/eloquentarduino/everywhereml to detect the same gestures as in Task 2.2. For this, install everywhereml:

See EverywhereML for downloading the full example and a dataset to experiment with.

pip3 install -U everywhereml

Using everywhereml, we can train a model on a more powerful machine for deployment on a microcontroller. See https://eloquentarduino.com/posts/micropython-machine-learning for an example of such a training process. Assuming that our ML model is trained and stored in the variable clf, we can save the model to a file using

clf.to_micropython_file("MyModel.py")
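For reference, here is a hedged sketch of the PC-side training step. It assumes that everywhereml exposes a scikit-learn-style RandomForestClassifier under everywhereml.sklearn.ensemble (as in the blog post above) and that the recorded data is a CSV named accel_log.csv with feature columns ax, ay, az and a label column; the file name and column layout are assumptions to adapt to your own data.

# PC-side training sketch (runs on your computer, not the microcontroller).
# Import path, file name, and column names are assumptions; adapt as needed.
import pandas as pd
from everywhereml.sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("accel_log.csv")
X = df[["ax", "ay", "az"]].values  # feature columns (placeholder names)
y = df["label"].values             # label column (placeholder name)

clf = RandomForestClassifier(n_estimators=10, max_depth=10)
clf.fit(X, y)
print(clf.predict(X[:5]))  # quick sanity check on the training data

# export the trained model as a MicroPython file for the board
clf.to_micropython_file("MyModel.py")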

The MyModel.py file can then be copied to the microcontroller and imported directly. To run the model on the microcontroller, assume your feature data is stored in x and that you trained a RandomForestClassifier; you can then predict with the following code snippet:

import MyModel
clf = MyModel.RandomForestClassifier()
clf.predict(x)

Task 8: Connect to WiFi

This is an optional task.

  • Connect both boards to WiFi using Tutorial Network (a minimal connection sketch is shown after this list)
  • Use the Arduino Nano RP2040 Connect as output (showing a color)
  • Use the Arduino Nano RP2040 Connect as input (recognize with rules 3 gestures)
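A minimal connection sketch, assuming the board's MicroPython firmware exposes the standard network.WLAN station interface; the SSID and password are placeholders for your own network credentials.

# Minimal WiFi connection sketch for the Nano RP2040 Connect.
# Assumes the firmware provides the standard network.WLAN station interface;
# SSID and PASSWORD are placeholders for your own network credentials.
import time
import network

SSID = "your-network"
PASSWORD = "your-password"

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect(SSID, PASSWORD)

while not wlan.isconnected():
    print("waiting for connection...")
    time.sleep(1)

print("connected, ifconfig:", wlan.ifconfig())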

Links

See the full list of links: UBISS2024-Links

Local Links

  • https://ubicomp.net/sw/db1/var2db.php?
  • http://localhost:8888/notebooks/ArduinoNanoRP2040_v01.ipynb
  • http://localhost:8888/doc/tree/create-ML-model01.ipynb

Reading

Required Reading before the course

Recommended Reading before the course

Random Commands

pip install micropython-lsm6dsox

picotool.exe load -x C:\Users\ru42qak\AppData\Roaming\OpenMV\openmvide\firmware\ARDUINO_NANO_RP2040_CONNECT\firmware.bin

pip install jupyterlab

pip install everywhereml

python -m pip install jupyter

git clone https://github.com/goatchurchprime/jupyter_micropython_kernel.git

pip install -e jupyter_micropython_kernel

python -m notebook

python -m jupyter kernelspec list

