Artefacts presented at the AE49 SoSe Rundgang in R311 of UdK's Medienhaus

Credits: Nikoloz Kapanadze, Astrid Kraniger, Kohei Kimura, Akif Mehmet Sari, Ozcan Ertek, Anna Petzer

Coordinator: Prof. Dr. Dr. Daniel Devatman Hromada (ECDF Juniorprofessor for Digital Education)

Avatar 0

(with Chris Schmidts, Akif Sari)
A head-shaped tridecahedral (13-plane) wooden shell crafted by Chris Schmidts, housing a Raspberry Pi 3A+ and a Seeed ReSpeaker (4-mic circular array, AC108 converter, APA102 12-LED RGB pixel ring), enriched with an Ultrasonic Ranger system that sends UDP packets to the "active wall" installation by architecture student Akif Sari.
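The ranger-to-wall link described above can be sketched as follows. This is a minimal illustration, not the installation's actual protocol: the host, port, payload format and the read_distance() stand-in are all assumptions.

```python
import socket

WALL_HOST = "192.168.0.42"   # hypothetical address of the "active wall"
WALL_PORT = 5005             # hypothetical port

def encode_distance(cm):
    """Encode one distance reading (in cm) as a UTF-8 UDP payload."""
    return "distance:{:.1f}".format(cm).encode("utf-8")

def send_reading(sock, cm, host=WALL_HOST, port=WALL_PORT):
    """Ship one reading to the wall as a single UDP datagram."""
    sock.sendto(encode_distance(cm), (host, port))

# On the Pi, a loop would poll the Ultrasonic Ranger and call
# send_reading(sock, read_distance()) a few times per second.
```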

A 6-inch e-ink display shows page 4 of McGuffey's Eclectic Primer, a well-known primer (Fibel) of the Victorian era. The display is currently powered off and as such has a zero carbon dioxide trace (hello, Greta!) while still teaching the alphabet. For a more interactive e-ink setup, please check the artefact CardboardFibel0.

Cardboard Primer

(with Astrid Kraniger)
The first functional cardboard-embedded artefact, combining touchless gesture-based recognition (Seeed Grove Gesture Recognition Sensor PAJ7620U2) with an e-ink display driven by an IT8951 board and some ANSI C code, for the purpose of diagnosing difficulties in the acquisition of reading and writing (Lese-Rechtschreib-Schwierigkeiten, LRS).

The diagnostics focus on so-called "Rapid Automatized Naming", which is considered one of the most important LRS predictors.

Visual content ("animal pictures") scanned from a re-edition of Lumen Picturae et Delineationis (Amsterdam, 1660, BE310).

Instead of a signature, this artefact contains a four-leaf clover (harvested in July) attached with duct tape above the e-ink screen. The bottom of the cardboard shell is photovoltaic, making it possible to transform PappeFibel 0 or one of its derivatives into an energy-autarkic ("eutark"; Hromada, 2019, AE49) digital education artefact.

TASK: Identify the mismatch between the visual and the textual modality.

INSTRUCTION OF USE: You interact with the device by moving your hand in front of the Gesture Recognition Sensor (to the right of the e-ink screen). Movement along the vertical axis (up/down) maps to boolean (true/false, JA/NEIN) answers. Movement along the horizontal axis (left/right) is used to browse the content. Rotation is used to switch between "learning" and "testing" mode.

CAVEAT: When changing modes of operation, new data has to be loaded into the buffer of the e-ink controller. This takes a few seconds. Be patient. Breathe.
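The gesture-to-action mapping described in the instructions above can be sketched as a small dispatch function. The gesture names follow the PAJ7620U2 conventions, but how the sensor library actually reports them, and the state layout, are assumptions for illustration only.

```python
# Sketch of the gesture mapping: up/down -> boolean answer,
# left/right -> page browsing, rotation -> learning/testing toggle.
def interpret_gesture(gesture, state):
    """Apply one recognized gesture to a small state dict and return it."""
    if gesture == "UP":
        state["answer"] = True            # JA
    elif gesture == "DOWN":
        state["answer"] = False           # NEIN
    elif gesture == "LEFT":
        state["page"] -= 1                # browse backwards
    elif gesture == "RIGHT":
        state["page"] += 1                # browse forwards
    elif gesture in ("CLOCKWISE", "ANTI_CLOCKWISE"):
        # rotation toggles between "learning" and "testing" mode
        state["mode"] = "testing" if state["mode"] == "learning" else "learning"
    return state
```

On the real device, each mode switch would additionally trigger the slow e-ink buffer reload mentioned in the caveat.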

Touchless Ukulele

(with Anna Petzer)
This upcycled, portable, cable-less digital artefact combines touchless gesture-based recognition (a Skywriter sensor) with a Raspberry Pi 3A+ and an integrated Bluetooth loudspeaker.

TASK: Play with the speed (time), position (space) and quality (suspense) of your hand movements.

INSTRUCTION OF USE: You interact with the device by moving your hand in front of the Skywriter sensor (above the sound hole of the instrument). Pluck a string to observe how the string's vibration deforms the field measured by the Skywriter.
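One way the hand position above the sound hole could steer playback is a simple height-to-volume mapping. This is a sketch under assumed conventions: the Skywriter's position events are taken to report floats in a 0..1 range, and the dead zones z_min/z_max are invented parameters.

```python
# Map hand height z (0 = close to the sensor, 1 = far) to a volume
# in [0, 1], with clipping outside the assumed useful range.
def position_to_volume(z, z_min=0.1, z_max=0.9):
    """Linear mapping of hand height to playback volume."""
    t = (z - z_min) / (z_max - z_min)
    return max(0.0, min(1.0, t))
```

The resulting value could then be fed to whatever audio backend the artefact uses (e.g. a mixer channel's volume).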

sonic

(with Nikoloz Kapanadze)
sonic is an experiment on the topic of screenless computers: what would our smart devices look like, and act like, if they were not bound by the rectangular standard of the screen? sonic is a pair of head-tracking headphones powered by a Raspberry Pi. The head-tracking information is used to control a SuperCollider program that performs binaural processing. In other words, the spatial positions of sound sources are decoupled from the listener's frame of reference: the sounds stay in place as the user changes the orientation of their head. It is like VR, but for your ears rather than your eyes.

The device can also record stereo audio using the two microphones mounted on the two earcups. Having microphones so close to where one's ears would be gives the sound a lot of presence. The components are mounted on a pair of headphones that I have had for around five years now, and the whole assembly is modular, meaning that faulty components can be easily swapped out.

sonic uses twin I2S MEMS microphones (SPH0645) for audio recording and an accelerometer/gyroscope/magnetometer IC (LSM9DS1) for head-tracking.
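The core head-tracking step can be sketched as follows, assuming the LSM9DS1 is held roughly level: the magnetometer's horizontal components yield a compass heading (yaw), which the SuperCollider patch could consume (e.g. over OSC) to counter-rotate the binaural sound field. Tilt compensation via the accelerometer, and the actual driver and transport code, are omitted here.

```python
import math

def heading_degrees(mag_x, mag_y):
    """Yaw in degrees [0, 360) from the horizontal magnetometer
    components of an LSM9DS1-style sensor held level."""
    deg = math.degrees(math.atan2(mag_y, mag_x))
    return deg % 360.0

# The binaural renderer would then rotate every source by -heading,
# so that sources appear fixed in the room as the head turns.
```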

developed by Nikoloz Kapanadze

In a Praise Of Void

The stable "wall" is the very basis of land creatures' existence; thus, by "having responsive space rather than the linear presentation of the wall," the body is suddenly made aware of the situation it is in.

With the current housing crisis and the reduction of living space per person, the interplay of occupancy and void becomes more valuable day by day. But this concept, void, which can only be defined through its contrast, fullness, and which can change the perception of everyday spaces we are accustomed to, like a room, is not easily readable. This removes the concept from our daily practices and traps it in the dusty pages of abstract architectural books.

The project, based on an ordinary example wall of a living space, offers a responsive space that puts the concept of void on the agenda in order to challenge perceptions of living space and physical context.

This exploration simulates an extreme architecture of a responsive, variable structure by suggesting a technological infrastructure with a hypothetical material.

The narrative analyzes the benefits and limitations of these projects but avoids explaining the necessity and outcome of living in such a space. It remains speculative, leaving the questions open for the next phases.

Bowl

(with Kohei Kimura)
Skywriter + Raspberry Pi Zero + 3W shaker + Tibetan singing bowl + physical setup by Kohei Kimura

CyberPlant 0

(with Ozcan Ertek)

cyberplant.py

#DDH, based on code from Adafruit industries, mrGPL

import os,glob

import pygame

DRUM_FOLDER = "KidsDay/drums2"

BANK = os.path.join(os.path.dirname(__file__), DRUM_FOLDER)

 

pygame.mixer.init(44100, -16, 1, 512)

pygame.mixer.set_num_channels(16)

 

files = glob.glob(os.path.join(BANK, "*.wav"))

files.sort()

 

samples = [pygame.mixer.Sound(f) for f in files]

 

 

import sys

import time

 

import Adafruit_MPR121.MPR121 as MPR121

print('Adafruit MPR121 Capacitive Touch Sensor Test')

 

# Create MPR121 instance.

cap = MPR121.MPR121()

 

if not cap.begin():

    print('Error initializing MPR121.  Check your wiring!')

    sys.exit(1)

 

def handle_hit(sensor_id):

    # sensor_id is the zero-based index of the touched MPR121 pad (0-11);

    # guard against pads that have no corresponding .wav sample loaded.

    if sensor_id < len(samples):

        samples[sensor_id].play(loops=0)

        print("You hit pad {}, playing: {}".format(sensor_id, files[sensor_id]))

 

# Alternatively, specify a custom I2C address such as 0x5B (ADDR tied to 3.3V),

# 0x5C (ADDR tied to SDA), or 0x5D (ADDR tied to SCL).

#cap.begin(address=0x5B)

 

# Also you can specify an optional I2C bus with the bus keyword parameter.

#cap.begin(busnum=1)

 

# Main loop to print a message every time a pin is touched.

print('Press Ctrl-C to quit.')

last_touched = cap.touched()

while True:

    current_touched = cap.touched()

    # Check each pin's last and current state to see if it was pressed or released.

    for i in range(12):

        # Each pin is represented by a bit in the touched value.  A value of 1

        # means the pin is being touched, and 0 means it is not being touched.

        pin_bit = 1 << i

        # First check if transitioned from not touched to touched.

        if current_touched & pin_bit and not last_touched & pin_bit:

            print('{0} touched!'.format(i))

            handle_hit(i)

        # Next check if transitioned from touched to not touched.

        if not current_touched & pin_bit and last_touched & pin_bit:

            print('{0} released!'.format(i))

    # Update last state and wait a short period before repeating.

    last_touched = current_touched

    time.sleep(0.1)

 

Zeus appears to Eve in the shape of an elliptic curve (Zeus erscheint Eva in Gestalt einer elliptischen Kurve)