This course is designed as a week-long tutorial on ubiquitous devices in the domain of smart environments and on using machine learning to build smart devices. Here, we use an [[Arduino_Nano_RP2040_Connect|Arduino Nano RP2040 Connect]].
= Final Projects =
 
 
== StressLess Shell ==
 
See the teaser video on [https://www.youtube.com/watch?v=NBSCIGqiXqM YouTube] by Songyan Teng, Jingyao Zheng, and Tim Zindulka.
 
 
== Plant Monitoring and Warning System ==
 
See the teaser video on [https://www.youtube.com/watch?v=DxjguBbUobk YouTube] by Shenxiu Wu and Huong Nguyen.
 
 
== IntelliPen ==
 
The pen that can recognize the characters you write! See the teaser video on [https://www.youtube.com/watch?v=WdLBq__ORBQ YouTube] by Mohammed Khalili and Ali Mahmoudi.
 
 
== Hand Gesture Recognition ==
 
Find the code and documentation at [https://github.com/mamadzebal/Morse-Code-Detector GitHub]. The project was completed by Mohammed Farhoudi and Samira Kamali Poorazad.
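The repository name suggests the gestures are decoded as Morse code. As a minimal illustration of the decoding step (a hypothetical sketch, not code from the linked repository — the table here is a small excerpt of the standard Morse alphabet):

```python
# Partial Morse table (standard International Morse Code)
MORSE = {'.-': 'A', '-...': 'B', '-.-.': 'C', '.': 'E', '...': 'S', '---': 'O'}

def decode(sequence):
    """Decode a space-separated string of dot/dash symbols into text."""
    return ''.join(MORSE[sym] for sym in sequence.split())

print(decode('... --- ...'))  # -> SOS
```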
 
  
 
= Schedule =
 
 
* 15:00-15:30 Hands-On: stand-up meeting on project progress
* 15:30-16:00 Coffee break
* 16:00-16:15 Lecture: How to run your system off a battery (see [[Tutorial AutoRun]]).
* 16:15-17:30 Hands-On: project work
* 17:30-18:00 Lecture: How to Evaluate ML Solutions (talk and discussion)
  
 
* 10:00-10:30 Hands-On: stand-up meeting — project challenges and solutions
* 10:30-11:30 Hands-On: project work and preparing the presentation
** [[UBISS2024#Requirements_for_the_Final_Presentation|Requirements for the Presentation]]
* 11:30-12:00 Lecture: Pitfalls and Challenges in Developing ML/AI for IoT
* 12:00-13:00 Lunch break
 
* 13:15-18:15: Workshop Result Presentations
* 18:30-18:50: Debriefing
 
== Requirements for the Final Presentation ==
 
* The presentation has to be 4 minutes long (we stop you after 4 minutes!)
 
* First slide: Your team name and your names - and if you want a photo of the team
 
* A short video of the tech you envision (up to 60 sec, [https://www.kickstarter.com/ Kickstarter]-style promotion type)
 
* A technology description, including the list of components used in the prototype
 
* A description of your data set and how it was acquired
 
* The ML model/approach you took to learn from the data

* An evaluation of how well your ML model works with the data set (and optionally in real life)
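For the evaluation point above, the simplest headline number is accuracy on a held-out test split. A minimal sketch (the helper name and example labels are ours, not from the course material):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical example: 3 of 4 held-out samples classified correctly
print(accuracy([0, 1, 1, 0], [0, 1, 0, 0]))  # -> 0.75
```

For gesture or sensor data, make sure the test split contains recordings not used during training, otherwise the number overstates real-world performance.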
 
  
 
== Final Submissions ==
 
 
from machine import Pin, ADC
import time

led = Pin(6, Pin.OUT)       # LED connected to pin 6
analogPin = ADC(Pin(26))    # analog input on GPIO 26

while True:
    analogVal16 = analogPin.read_u16()  # 16-bit reading, 0-65535
    print(analogVal16)
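If you want the reading in volts rather than raw counts, the 16-bit value can be rescaled. A minimal sketch, assuming the board's 3.3 V ADC reference (the helper name is ours, not from the course code):

```python
def adc_to_voltage(raw16, vref=3.3):
    """Convert a 16-bit ADC reading (0-65535) to a voltage."""
    return raw16 * vref / 65535

print(adc_to_voltage(65535))  # full scale -> 3.3
print(adc_to_voltage(0))      # -> 0.0
```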
  
 
<syntaxhighlight lang="Bash">
pip3 install -U everywhereml
</syntaxhighlight>
  
