Tasks WiSe2021

Semesterplan and Tasks

  • Regular Online Session on Thursday 18:00-19:00
  • Tutors available in discord:
    • 10.12.2020 (Thursday) from 17:00-21:00
    • 16.12.2020 (Wednesday) from 17:00-21:00
    • on request - please let us know when you want to work on it

Session 1: Introduction (Monday 07th Dec)

  • Introduction to the course, presentation of the hardware.
  • Organisation - make teams of 2 people

Videos and Tutorials to watch

  • Look at the following page: https://micropython-on-wemos-d1-mini.readthedocs.io/en/latest/basics.html

Tasks and Submission (Deadline 16th of Dec)

  • Unpacking of the hardware
  • Install the software on your computer, put the software image onto the ESP8266 D1 mini, see Tutorial Basics
  • Task 1: Digital IO
    • Connect 3 external LEDs and let them blink at different speeds
    • Connect a button to a digital input; when the button is pressed, all LEDs should be on
  • Task 2: Analog In, PWM Out (a minimal example sketch follows this list)
    • Connect 1 external LED
    • Connect a potentiometer to A0 and read the analog value
    • change the brightness of the LED with the analog value (PWM)
    • write the analog value to the serial line
  • Submission:
    • Deadline: 16th of December 2020 (Wednesday), 23:59 at the latest
    • For each task, submit a video of max. 30 seconds (it should show the setup, your code on the screen, and the system functioning)
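
A minimal MicroPython sketch for Task 2 might look like the following; the pin choices (potentiometer on A0, LED driven via PWM on GPIO5, i.e. D1 on the D1 mini) and the timing are assumptions, so adjust them to your wiring:

 from machine import Pin, ADC, PWM
 import time

 adc = ADC(0)                    # A0 on the D1 mini, returns 0-1023
 led = PWM(Pin(5), freq=1000)    # LED on GPIO5 (D1) - assumption, adjust to your wiring

 while True:
     value = adc.read()          # read the potentiometer (0..1023)
     led.duty(value)             # ESP8266 PWM duty range is also 0..1023
     print(value)                # write the analog value to the serial line
     time.sleep_ms(100)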

Session 2: Sensors and Actuators (Thursday 17th Dec)

  • Discussion of sensors, actuators, libraries
  • Organisation - check that teams of 2 people exist

Videos and Tutorials to watch

  • Understand how to connect a servo and how to move it, see Tutorial Digital IO
  • Learn how to create sounds, see Piezo Speaker
  • Look at how to control a stepper motor, see Stepper Motor and ULN2003
  • Get familiar with how to connect the Ultrasonic Sensor HC-SR04

Tasks and Submission (Deadline 13th of Jan)

  • Task 1: Play a sound, PWM
    • Connect the piezo speaker to an output pin
    • Modify the example given in Piezo Speaker to play a part of a Xmas song
  • Task 2: Read the distance with HC-SR04
    • Connect the Ultrasonic Sensor HC-SR04
    • Understand how to include a library in MicroPython (see Tutorial Network, Tutorial Display)
    • Use the hcsr04.py library and read out the distance
    • Write the distance to the serial line
  • Task 3: Implement a Theremin-like instrument (a minimal example sketch follows this list)
    • Connect the piezo speaker to an output pin (task 1)
    • Connect the Ultrasonic Sensor HC-SR04 (task 2)
    • Change the tone (frequency) that is played based on the distance
    • write the frequency you play to the serial line
  • Task 4: Stepper Motor
    • Connect the stepper motor, see Stepper Motor and ULN2003
    • Connect a button (see Session 1)
    • For a short press on the button move 45° and for a long press move 180°
    • There may be a problem with power - see Stepper Motor and ULN2003 - if this is the case, just build the setup, write the program, and document this
  • Task 5:
    • Connect the servo motor, see Tutorial Digital IO and SG90 Servo
    • Connect a potentiometer to A0 and read the analog value
    • Set the position (angle) of the servo based on the analog value
    • There may be a problem with power - see SG90 Servo - if this is the case, just build the setup, write the program, and document this
  • Submission:
    • Deadline: 13th of January 2021 (Wednesday), 23:59 at the latest
    • For each task, submit a video of max. 30 seconds (it should show the setup, your code on the screen, and the system functioning)
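
For Task 3 (the Theremin), a sketch along these lines could work. It assumes the piezo on GPIO4 (D2), the HC-SR04 on GPIO12/GPIO14 (D6/D5), and that hcsr04.py exposes HCSR04(trigger_pin=..., echo_pin=...) with a distance_cm() method, as the commonly used MicroPython driver does; the distance-to-frequency mapping is only an example:

 from machine import Pin, PWM
 from hcsr04 import HCSR04       # hcsr04.py driver, as used in Task 2
 import time

 sensor = HCSR04(trigger_pin=12, echo_pin=14)   # D6/D5 - assumption, adjust to your wiring
 speaker = PWM(Pin(4), freq=440, duty=512)      # piezo on D2; 50% duty gives an audible tone

 while True:
     distance = min(max(sensor.distance_cm(), 5), 50)   # clamp to 5-50 cm
     freq = int(200 + (distance - 5) * 40)               # example mapping: 5 cm -> 200 Hz, 50 cm -> 2000 Hz
     speaker.freq(freq)
     print(freq)                 # write the frequency to the serial line
     time.sleep_ms(50)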

Session 3: Networked IoT Systems (Thursday 14th Jan)

  • Network library, Display, ESP32, MPU6050

Videos and Tutorials to watch

  • Understand the network library Tutorial Network
  • Understand the display library Tutorial Display (requires coming to the lab for soldering/picking up ESP32)
  • Understand the MPU 6050 Sensor and I2C connection (requires coming to the lab for soldering)

Tasks and Submission (Deadline 20th of Jan)

  • Task 1: Play a sound over the network, connected devices
    • Connect a potentiometer to A0 and read the analog value
    • Write the value to the network, see Tutorial Network
    • Read the value from the network from your team colleague
    • Play a sound based on the value you get from the network
  • NOT POSSIBLE as we have no access to the LAN: Task 2: Subtle communication, connected devices
    • (requires coming to the lab for soldering)
    • Solder the Pins onto the MPU 6050
    • Connect the MPU 6050 Sensor as I2C device
    • When the sensor is moved, write a value to the network, see Tutorial Network
    • When the device has not been moved for 10 seconds, write a different value to the network
    • Read the movement value from the network from your team colleague
    • Switch on the LED if the remote device has been moved
  • Alternative Task 2: Subtle communication, connected devices
    • Connect a Sensor (LDR, Ultrasonic Sensor HC-SR04, PIR Sensor, ...)
    • If the sensor value changes, write a value to the network, see Tutorial Network (not more than every 5 seconds)
    • Read the sensor value from the network from your team colleague
    • Switch on the LED if the remote site is active (changing sensor values)
  • Task 3: Quotes on the Display (a minimal example sketch follows this list)
    • (requires coming to the lab for soldering/picking up ESP32)
    • Put the MicroPython image onto the ESP32, see Tutorial Basics and Session 1
    • Show a random quote on the display every minute (at least 3 different quotes), see Tutorial Display
  • Task 4: Ideation (for the group projects). Generate ideas within your team using the following process:
    • First, before meeting your partner, each of you should independently think about 5-10 issues with remote connectedness (eating, dancing, etc.)
    • Second, before meeting your partner, each of you should independently think about 5-10 hardware components (sensors, actuators, etc.)
    • Third, together (!) randomly pick one issue and one hardware component and imagine a project. Save your solution! Create and describe at least five projects this way. So, person A picks both and describes a solution. Next, person B does the same, and so on.
  • Submission:
    • Deadline: 20th of January 2021 (Wednesday), 23:59 at the latest
    • For each task, submit a video of max. 30 seconds (it should show the setup, your code on the screen, and the system functioning)
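
For Task 3 (quotes on the display), here is a sketch assuming an SSD1306 I2C OLED on the typical ESP32 I2C pins (SCL=22, SDA=21) and the standard MicroPython ssd1306 driver; check Tutorial Display for the display and library actually used in the course:

 from machine import Pin, SoftI2C
 import ssd1306, urandom, time

 i2c = SoftI2C(scl=Pin(22), sda=Pin(21))    # typical ESP32 I2C pins - assumption
 oled = ssd1306.SSD1306_I2C(128, 64, i2c)   # 128x64 OLED - assumption

 quotes = ["Stay hungry.", "Less is more.", "Keep it simple."]

 while True:
     quote = quotes[urandom.getrandbits(8) % len(quotes)]   # pick a random quote
     oled.fill(0)
     oled.text(quote, 0, 28)
     oled.show()
     time.sleep(60)              # change the quote every minute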

Session 4: Group projects (Thursday 21st Jan)

  • Group project ideation: Presentation of ideas.

Tasks and Submission (Deadline 27th of Jan)

  • Task 1: Describe your group project in detail
    • Pick an idea that you want to implement
    • Create a visual concept of how your system will work
    • Describe what needs to be done
  • Create a List:
    • Team name
    • Can all members pick up hardware from Frauenlobstraße 7a? (yes/no)
      • If not, please add the address
    • List the hardware components that you will need
      • Type (e.g. pressure sensor)
      • Specific Name/Number
      • Quantity
      • Link to the article in a shop: https://www.sketching-with-hardware.org/wiki/List_of_shops
  • Submission:
    • Deadline: 27th of January 2021 (Wednesday), 23:59 at the latest
    • A PDF for the group project description


Project Ideas

Resources

Pinout of Pi Zero

physical connections
JL: https://peppe8o.com/raspberry-pi-zero-pinout/
CR: https://de.pinout.xyz/pinout/io_pi_zero
HW: https://www.argon40.com/learn/index.php/2020/03/06/a-comprehensive-guide-on-raspberry-pi-3-and-raspberry-pi-zero-and-zero-w-gpio-pinout/
JW: https://www.raspberrypi.org/documentation/usage/gpio/


SF: With the gpiozero library you will by default want to reference the pin number as in GPIOx where x is the pin you define in software. Example: LED(5) in software; connect to GPIO5 on the Pi Zero header. (see also: https://gpiozero.readthedocs.io/en/stable/recipes.html#pin-numbering)

Setting up the OS and Python

NOOBS configure
JL: https://www.tutorialspoint.com/python/python_environment.htm
CR: https://learn.adafruit.com/setting-up-a-raspberry-pi-with-noobs/download-noobs
HW: https://www.raspberrypi.org/help/noobs-setup./
JW: https://www.raspberrypi.org/documentation/installation/noobs.md
SG: https://medium.com/@nikosmouroutis/how-to-use-atom-with-your-raspberry-pi-through-sftp-382c43176a9e


SF: The simple three-step wizard we talked about: https://www.raspberrypi.org/software/

Access to GPIO

JL: http://codefoster.com/pi-basicgpio/
CR: https://fedoramagazine.org/raspberry-pi-zero-gpio-fedora/
HW: https://webnist.de/raspberry-pi-zero-gpio-pins-ueber-den-desktop-steuern/
SG: https://github.com/fivdi/onoff


SF: With the gpiozero library you will by default want to reference the pin number as in GPIOx where x is the pin you define in software. Example: LED(5) in software; connect to GPIO5 on the Pi Zero header. (see also: https://gpiozero.readthedocs.io/en/stable/recipes.html#pin-numbering)

Blinking an LED

JL: https://raspberrypihq.com/making-a-led-blink-using-the-raspberry-pi-and-python/
CR: https://fedoramagazine.org/raspberry-pi-zero-gpio-fedora/
HW: https://webnist.de/raspberry-pi-zero-gpio-pins-ueber-den-desktop-steuern/
JW: http://www.heidislab.com/tutorials/getting-started-with-raspberry-pi-led-blinking-on-raspberry-pi-zero
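
A minimal gpiozero blink sketch in the spirit of the tutorials above; the LED pin (GPIO17) is only an example:

 from gpiozero import LED
 from signal import pause

 led = LED(17)                           # LED on GPIO17 - example pin
 led.blink(on_time=0.5, off_time=0.5)    # toggles the pin in the background
 pause()                                 # keep the script running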

Reading a Pin

JL: http://raspberry.io/projects/view/reading-and-writing-from-gpio-ports-from-python/
CR: https://learn.sparkfun.com/tutorials/raspberry-gpio/all
HW: https://mjrobot.org/rpi-gpiozero/
SG: https://tutorials-raspberrypi.de/raspberry-pi-gpio-erklaerung-beginner-programmierung-lernen/


SF: What state does your input pin have by default? With gpiozero, you can configure the pin to be pulled up (pull_up=True) or pulled low (pull_up=False). If you pull up, you can only change the state by connecting the pin physically to GND. If pulled low, you can only change the input state by connecting it to 3.3V. (see also: https://gpiozero.readthedocs.io/en/stable/api_input.html). By default, the pin is pulled up. If you always define the state explicitly in your code, you might make it easier for yourself to debug the code. Example: Button(15, pull_up=True)
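
Following the note above, here is a small gpiozero sketch that reads a pulled-up button on GPIO15 and mirrors it on an LED on GPIO5; both pin numbers are examples:

 from gpiozero import Button, LED
 from signal import pause

 led = LED(5)                        # example pin
 button = Button(15, pull_up=True)   # pin reads high until it is connected to GND

 button.when_pressed = led.on        # pressing the button lights the LED
 button.when_released = led.off
 pause()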

Running a Webserver that accesses GPIO

JL: https://towardsdatascience.com/python-webserver-with-flask-and-raspberry-pi-398423cc6f5d
CR: https://www.e-tinkers.com/2018/04/how-to-control-raspberry-pi-gpio-via-http-web-server/
HW: https://hackaday.io/project/173322-controlling-gpio-outputs-using-a-web-interface-wit
JW: https://www.youtube.com/watch?v=c6FOpPXbLjs
SG: https://randomnerdtutorials.com/raspberry-pi-web-server-using-flask-to-control-gpios/
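
In the spirit of the Flask tutorials linked above, here is a minimal sketch that switches an LED (assumed on GPIO5) via HTTP; the routes and port are examples:

 from flask import Flask
 from gpiozero import LED

 app = Flask(__name__)
 led = LED(5)                        # example pin

 @app.route("/led/on")
 def led_on():
     led.on()
     return "LED is on"

 @app.route("/led/off")
 def led_off():
     led.off()
     return "LED is off"

 if __name__ == "__main__":
     app.run(host="0.0.0.0", port=8080)   # reachable from other devices on the network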

RESTful API

JL: https://sites.google.com/site/cartwrightraspberrypiprojects/home/other-projects/rest-api
CR: https://docs.dataplicity.com/docs/control-gpios-using-rest-api
HW: https://www.youtube.com/watch?v=JZfevVG-VuA&list=PLLIDdNg0t5ceg3mI3vn0YJocJ4ndMtM98&index=4&t=0s
SG: https://tutorials-raspberrypi.de/raspberry-pi-nodejs-webserver-installieren-gpios-steuern/

Description of the components

JL: https://www.adafruit.com/product/3411
CR: https://computer.howstuffworks.com/raspberry-pi2.htm
HW: https://learn.adafruit.com/introducing-the-raspberry-pi-zero
JW: https://www.youtube.com/watch?v=WJdQ4rknBX0

Raspberry Pi

Groups

HW, CR

YS, JW

JL, SG, JP

SG

mindSat:


A small portable gadget that can record sensor data and give feedback to the user. Mind satisfaction for your office days.
It is hard to do small exercises; we often find excuses not to do them. Concentrated breathing can help your brain relax, and focusing on small things can reduce stress. This information is used to create short (~5 min) exercises.

JL

Home Office Buddy:
A speaker that communicates in a pleasant way with the target person in order to minimize possible social isolation and to encourage keeping important routines and daily structures, so that a certain productivity is maintained during the working day.
Example: suggest unplanned short breaks if the target person is showing a lack of concentration, as measured by a certain level of body movement.

JP

Problem: too much sitting while working in the office
Idea: adjustable desk
Sensors: on the chair, to check whether the user is sitting and for how long
Notification: the user gets a message after a certain time that the desk will rise for at least 30 minutes. After that time the desk can be lowered again, if desired. The sensor will then measure again.
Gamification: a little "stand-up figure" on the desk will stand up when the desk is raised and collapse when the desk is lowered.

CR

Problem: Tension due to prolonged sitting and incorrect sitting positions (e.g. crooked shoulders)
Sensors: Kinect for analysis of posture, sensors that measure the tension in the neck
The longer one maintains a bad sitting position, the older/more bent an avatar on the screen becomes. Through movement the avatar becomes younger/more upright again.

YS

WALL-E: your digital stress cleaner
Features

  • The buggy has two “hands” and one trash can
  • Users’ stress detection: EDA/HRV
  • If the user is stressed: the buggy will
    • search for the smartphone (RFID or GPS)
    • steal the smartphone
    • drop it into the trash can
  • If the user calms down: they are allowed to use the smartphone again

JW

Problem: ① Uncomfortable working environment in terms of lighting and air quality; ② Avoidable disturbance by colleagues, etc.; ③ Remaining stationary for too long

Solution: ① Automatically adjustable light, automatic roller blind, automatic humidifier, LED as indicator next to windows; ② Display as door sign; ③ Speaker with a switch as indicator

Sensors: temperature and humidity sensor BME280, light sensor TSL2561, CO2 sensor, motion sensor

HW

Problem: Pain in the hand and wrist due to consistent movement or incorrect hand/arm position
Sensors: e.g. Leap Motion for motion and posture detection, keyboard/mouse to track activity (?)
Vibration of the mouse/keyboard if the same position has been kept for too long or no pause was taken. A visualization of an inflammation (red pulsating circles) is displayed, which gets worse or better depending on user behavior. This warns the user when their hand position is worsening. The user may choose whether or not to be prevented from continuing to work.