UBISS2024


Link Page

https://www.sketching-with-hardware.org/wiki/UBISS2024-Links

Tasks

Project 0: connect an Arduino Nano ESP32 board

Solution to Project 0: LED Blinking

# Blinky example

import time
from machine import Pin

# This is the only LED pin available on the Nano RP2040,
# other than the RGB LED connected to Nano WiFi module.
led = Pin(6, Pin.OUT)

while True:
    led.on()
    time.sleep_ms(250)
    led.off()
    time.sleep_ms(200)

Solution to Project 0: Control an external RGB LED

# RGB example

import time
from machine import Pin

# External RGB LED connected to pins 2 (green), 3 (red) and 4 (blue).
ledG = Pin(2, Pin.OUT)
ledR = Pin(3, Pin.OUT)
ledB = Pin(4, Pin.OUT)
print("start")

while True:
    print("*")
    ledG.on()
    ledR.off()
    ledB.off()
    time.sleep_ms(250)
    ledG.off()
    ledR.on()
    ledB.off()
    time.sleep_ms(250)
    ledG.off()
    ledR.off()
    ledB.on()
    time.sleep_ms(250)

Project 1: read acceleration from the Arduino Nano ESP32 board

Solution to Project 1: Read accelerometer and gyro

import time
from lsm6dsox import LSM6DSOX

from machine import Pin, I2C
lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))

while True:
    accel_data = lsm.accel()
    print('Accelerometer: x:{:>8.3f} y:{:>8.3f} z:{:>8.3f}'.format(*accel_data))
    gyro_data = lsm.gyro()
    print('Gyroscope:     x:{:>8.3f} y:{:>8.3f} z:{:>8.3f}'.format(*gyro_data))
    print("")
    time.sleep_ms(100)


Solution to Project 2: Read analog values - code example for the Arduino Nano RP2040 Connect

A0 is the analog input; read_u16() returns the reading scaled to the full 16-bit range (0-65535). The example reads the analog value every second and prints it to the console.

# Example usage for the Arduino Nano RP2040 Connect (A0 = GPIO 26)
from machine import Pin, ADC
from time import sleep

analogPin = ADC(Pin(26))

while True:
    analogVal16 = analogPin.read_u16()
    print(analogVal16)
    sleep(1)

Project 2: Jupyter Notebook
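
To run notebook cells on the board, one option is the jupyter_micropython_kernel listed under Random Commands: open a notebook with that kernel and connect it to the board's serial port. The magic below follows that kernel's README; the port name is a placeholder (a COM port on Windows, /dev/ttyACM0 or similar on Linux), so adjust it to your setup.

%serialconnect to --port=COM4 --baud=115200

# once connected, subsequent cells run on the board itself, e.g.:
from machine import Pin
Pin(6, Pin.OUT).on()        # the LED used in Project 0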


Task 2.1: Is it being moved? (see the sketch after this list)

  • read acceleration and gyro
  • calculate the differences between consecutive values
  • show an output when it is moved
  • create a file on the device that logs when it is moved
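
A minimal sketch for Task 2.1, assuming the same LSM6DSOX wiring as in the Project 1 solution; the threshold, sampling interval and log file name are arbitrary choices, not prescribed by the task:

import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))

THRESHOLD = 0.15              # in g, ad-hoc value - tune on the real device
last = lsm.accel()

while True:
    ax, ay, az = lsm.accel()
    lx, ly, lz = last
    # difference between the current and the previous acceleration reading
    delta = abs(ax - lx) + abs(ay - ly) + abs(az - lz)
    if delta > THRESHOLD:
        print("moved, delta =", delta)
        # append a time-stamped line to a log file on the device's flash
        with open("movement_log.txt", "a") as f:
            f.write("{} moved, delta={}\n".format(time.ticks_ms(), delta))
    last = (ax, ay, az)
    time.sleep_ms(100)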

Task 2.2: Was it turned upside down? (see the sketch after this list)

  • read acceleration and gyro
  • build a rule-based "AI" that records when
    • it was put upside down
    • it was turned by 360°
    • it was moved "quickly"
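
A minimal rule-based sketch for Task 2.2 under the same wiring assumption; it additionally assumes that accel() reports values in g and gyro() in degrees per second, and all thresholds are ad-hoc guesses to be tuned on the device:

import time
from machine import Pin, I2C
from lsm6dsox import LSM6DSOX

lsm = LSM6DSOX(I2C(0, scl=Pin(13), sda=Pin(12)))

DT_MS = 50                    # sampling interval in milliseconds
angle_z = 0.0                 # integrated rotation around the z axis in degrees
last = lsm.accel()

while True:
    ax, ay, az = lsm.accel()
    gx, gy, gz = lsm.gyro()

    # rule 1: upside down when the z axis reads roughly -1 g
    if az < -0.8:
        print("rule: it was put upside down")

    # rule 2: integrate the z gyro to detect a full 360 degree turn
    angle_z += gz * (DT_MS / 1000)
    if abs(angle_z) >= 360:
        print("rule: it was turned 360 degrees")
        angle_z = 0.0

    # rule 3: a large jump between consecutive accel readings = moved quickly
    lx, ly, lz = last
    if abs(ax - lx) + abs(ay - ly) + abs(az - lz) > 1.0:
        print("rule: it was moved quickly")
    last = (ax, ay, az)

    time.sleep_ms(DT_MS)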

Task 3: ML on the Arduino Nano RP2040 Connect
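
Task 3 is not spelled out here; as one possible starting point, the sketch below trains a small scikit-learn classifier on the laptop from a hypothetical CSV of labelled IMU readings (file name and column layout are illustrative assumptions). A converter such as everywhereml, listed under Random Commands, can then turn such a model into code for the board; that step is not shown.

# Training-side sketch (runs on the laptop, not on the board).
# Assumes a file "gestures.csv" with numeric columns: label, ax, ay, az, gx, gy, gz.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

data = np.loadtxt("gestures.csv", delimiter=",", skiprows=1)
X, y = data[:, 1:], data[:, 0]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=20, max_depth=5, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))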

Task 4: connect both boards to WiFi (see the sketch after this list)

  • connect both boards to WiFi using Tutorial_Network
  • use the Arduino Nano ESP32 as output (showing a color)
  • use the Arduino Nano RP2040 Connect as input (recognizing 3 gestures with rules)
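
A sketch of the networking part of Task 4, assuming a MicroPython build with the standard network and socket modules; the SSID, password, IP address and port are placeholders, and the real credentials come from Tutorial_Network:

# Sender side (the board that recognizes gestures); all credentials are placeholders.
import network
import socket
import time

def connect_wifi(ssid, password):
    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect(ssid, password)
    while not wlan.isconnected():
        time.sleep(1)
    print("connected:", wlan.ifconfig())
    return wlan

connect_wifi("MY_SSID", "MY_PASSWORD")

# send a recognized gesture label to the other board as a UDP packet
PEER = ("192.168.0.42", 5005)          # IP address of the Arduino Nano ESP32
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"gesture:1", PEER)

# the receiving board would bind() to the same port, recvfrom() in a loop,
# and set its RGB pins according to the received label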

Links

See the full list of links: UBISS2024-Links

Local Links

https://ubicomp.net/sw/db1/var2db.php?

http://localhost:8888/notebooks/ArduinoNanoRP2040_v01.ipynb

http://localhost:8888/doc/tree/create-ML-model01.ipynb

Reading

Required Reading before the course

Albrecht Schmidt. 2020. Interactive Human Centered Artificial Intelligence: A Definition and Research Challenges. In Proceedings of the International Conference on Advanced Visual Interfaces (AVI '20). Association for Computing Machinery, New York, NY, USA, Article 3, 1–4. https://doi.org/10.1145/3399715.3400873 https://uni.ubicomp.net/as/iHCAI2020.pdf (4p)

Albrecht Schmidt and Kristof van Laerhoven. "How to build smart appliances?" IEEE Personal Communications, vol. 8, no. 4, pp. 66-71, Aug. 2001. https://doi.org/10.1109/98.944006 https://www.eti.uni-siegen.de/ubicomp/papers/sl_ieeepc2001.pdf (6p)

Albrecht Schmidt. "Understanding and researching through making: a plea for functional prototypes." interactions 24.3 (2017): 78-81. https://www.sketching-with-hardware.org/files/functional3058498.pdf (4p)

Huy Viet Le, Sven Mayer, and Niels Henze. 2020. Deep learning for human-computer interaction. interactions 28, 1 (January-February 2021), 78–82. https://doi.org/10.1145/3436958 https://sven-mayer.com/wp-content/uploads/2021/01/huy2021deep.pdf (5p)

Huy Viet Le, Sven Mayer, Max Weiß, Jonas Vogelsang, Henrike Weingärtner, and Niels Henze. 2020. Shortcut Gestures for Mobile Text Editing on Fully Touch Sensitive Smartphones. ACM Trans. Comput.-Hum. Interact. 27, 5, Article 38. ISSN 1073-0516. https://sven-mayer.com/wp-content/uploads/2020/09/le2020shortcuts.pdf (38p)

Judith Hurwitz and Daniel Kirsch. "Machine Learning For Dummies," IBM Limited Edition (2018). https://www.ibm.com/downloads/cas/GB8ZMQZ3 (pages 3-18 and 29-47, i.e. Chapters 1 and 3) (35p)

Chris Garrett. MicroPython: An Intro to Programming Hardware in Python. https://realpython.com/micropython/ (14p)

MicroPython Basics. https://docs.arduino.cc/micropython/basics/micropython-basics/ (5p)

Recommended Reading before the course

John D. Kelleher, Deep Learning, https://mitpress.mit.edu/9780262537551/deep-learning/

Yuli Vasiliev, Python for Data Science: A Hands-On Introduction, https://nostarch.com/python-data-science

Tutorial on Jupyter Notebooks: https://www.datacamp.com/tutorial/tutorial-jupyter-notebook

Alex Smola and S. V. N. Vishwanathan. Introduction to Machine Learning. Cambridge University Press, 2008. https://alex.smola.org/drafts/thebook.pdf


Random Commands

pip install micropython-lsm6dsox

picotool.exe load -x C:\Users\ru42qak\AppData\Roaming\OpenMV\openmvide\firmware\ARDUINO_NANO_RP2040_CONNECT\firmware.bin

pip install jupyterlab

pip install everywhereml

python -m pip install jupyter

git clone https://github.com/goatchurchprime/jupyter_micropython_kernel.git

pip install -e jupyter_micropython_kernel

python -m notebook

python -m jupyter kernelspec list


C:\Users\ru42qak\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\jupyterlab>pip install -e jupyter_micropython_kernel

C:\Users\ru42qak\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\jupyterlab>python -m notebook