ITP Blog
Light and Interactivity
Final Project Progress 2
This week I focused on the machine learning and gesture detection experiment we had planned, and the results are not bad: although the model needs more fine-tuning and training, it can almost reliably detect the gesture.
We had to use the Arduino Nano BLE since it's one of the few microcontrollers that currently support the TensorFlow Lite library. I collected 200 data samples, each covering 2 seconds of sensor readings sampled at 100ms intervals, so each row of data had 20 numbers. I then generated around 200 rows of random noise as negative examples. Not an ideal way to do it, but good enough for a basic prototype.
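As a rough illustration of that collection format, here is a minimal sketch that prints one CSV row of 20 readings per 2-second window over Serial. The analog pin, baud rate, and pause between windows are assumptions for the example, not our exact setup:

```cpp
// Data collection sketch: prints one CSV row per 2-second window,
// 20 sensor readings taken 100 ms apart, for training on the computer.
const int SENSOR_PIN = A0;         // placeholder: whichever sensor is being read
const int SAMPLES_PER_WINDOW = 20; // 2 s window at 100 ms per sample

void setup() {
  Serial.begin(9600);
  while (!Serial);
}

void loop() {
  // Collect one window of readings and print it as a single CSV row.
  for (int i = 0; i < SAMPLES_PER_WINDOW; i++) {
    int reading = analogRead(SENSOR_PIN);
    Serial.print(reading);
    if (i < SAMPLES_PER_WINDOW - 1) {
      Serial.print(",");
    }
    delay(100); // 100 ms sampling interval
  }
  Serial.println();
  delay(1000); // short pause between windows before the next sample
}
```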
On the Arduino side, the code loads the model, collects the sensor readings, and runs them through the model.
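A minimal sketch of that flow, assuming the Arduino_TensorFlowLite library's micro interpreter API; the model header (model.h with a g_model byte array), the analog sensor read, the tensor arena size, and the 0.8 detection threshold are placeholders rather than our exact values:

```cpp
#include <TensorFlowLite.h>
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"
#include "tensorflow/lite/version.h"
#include "model.h"  // the trained model exported as a C byte array (g_model)

namespace {
  tflite::MicroErrorReporter micro_error_reporter;
  tflite::AllOpsResolver resolver;
  const tflite::Model* model = nullptr;
  tflite::MicroInterpreter* interpreter = nullptr;
  TfLiteTensor* input = nullptr;
  TfLiteTensor* output = nullptr;

  constexpr int kTensorArenaSize = 8 * 1024;  // placeholder scratch memory size
  uint8_t tensor_arena[kTensorArenaSize];
}

const int SENSOR_PIN = A0;          // placeholder sensor
const int SAMPLES_PER_WINDOW = 20;  // same 2 s / 100 ms window as training

void setup() {
  Serial.begin(9600);
  while (!Serial);

  // Map the model byte array into a usable model structure.
  model = tflite::GetModel(g_model);
  if (model->version() != TFLITE_SCHEMA_VERSION) {
    Serial.println("Model schema version mismatch");
    while (true);
  }

  // Build an interpreter that runs the model using the arena as scratch memory.
  static tflite::MicroInterpreter static_interpreter(
      model, resolver, tensor_arena, kTensorArenaSize, &micro_error_reporter);
  interpreter = &static_interpreter;
  interpreter->AllocateTensors();

  input = interpreter->input(0);
  output = interpreter->output(0);
}

void loop() {
  // Fill the input tensor with one 2-second window of readings.
  for (int i = 0; i < SAMPLES_PER_WINDOW; i++) {
    input->data.f[i] = analogRead(SENSOR_PIN) / 1023.0f;  // normalize to 0..1
    delay(100);
  }

  // Run inference on the window.
  if (interpreter->Invoke() != kTfLiteOk) {
    Serial.println("Invoke failed");
    return;
  }

  // Read the gesture score and compare against a placeholder threshold.
  float gestureScore = output->data.f[0];
  Serial.print("gesture score: ");
  Serial.println(gestureScore);
  if (gestureScore > 0.8) {
    Serial.println("Gesture detected!");
  }
}
```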
The model performs OK considering that it was only trained for 10 epochs on 400 rows of data. It detects the gesture, but it sometimes also fires on a single wave, which means it may need more data to differentiate between one hand wave and a back-and-forth motion.
In terms of shape and fabrication, we have finalized the form and built a small prototype. In doing so, we found that this shape could probably be used on its own and at various angles rather than being fixed to a base, so we're considering fitting all the electronics inside the fixture.