RingGesture - finger and hand gestures
My project is coming along and making progress.
Before diving into AI and ML programming to recognize gestures, I verified the quality of my sensor data, just to make sure the input "looks good". And it does.
Here are two videos taken during my tests:
Instead of using the tool "Processing 4" (which did not really help me "see" anything), I wrote a Python script. Plotting the data with matplotlib as 3D plots lets me see much more clearly what my Time of Flight sensor data looks like, and I can already recognize the differences between gestures and finger movements. So it should be possible to distinguish finger gestures with AI or ML (the feasibility I set out to verify - done).
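As a rough illustration of this kind of visualization, here is a minimal sketch that renders one depth frame as a 3D surface with matplotlib. The 8x8 zone layout and the synthetic random frame are assumptions for the example; a real frame would come from the Time of Flight sensor (e.g. over serial).

```python
# Minimal sketch: render one Time of Flight depth frame as a 3D surface.
# The 8x8 grid and the random data are placeholders, not real sensor output.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Synthetic 8x8 distance frame in millimetres.
frame = np.random.default_rng(0).uniform(30, 200, size=(8, 8))

x, y = np.meshgrid(np.arange(8), np.arange(8))
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x, y, frame, cmap="viridis")
ax.set_xlabel("zone x")
ax.set_ylabel("zone y")
ax.set_zlabel("distance (mm)")
fig.savefig("tof_frame.png")
```

Replotting each incoming frame this way makes differences between finger poses visible at a glance, which is hard to judge from raw numbers.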
For the "fourth" dimension: I have already prepared an IMU sensor (soldered onto the hub). This sensor will detect when I twist or move my arm. These arm movements (especially twisting) should also "rotate" my finger gestures in 3D space, which adds another dimension: the orientation in which the gesture was performed. The optical sensors have no reference frame in 3D space, only the relation between the fingers, but the IMU adds that "missing" reference.
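The idea of the IMU adding the "missing" reference can be sketched as follows: an orientation estimate from the IMU (here a unit quaternion, with the (w, x, y, z) convention as an assumption) rotates the sensor-relative finger points into a world frame. The example points and the 90-degree twist are made up for illustration.

```python
# Sketch: rotate sensor-frame finger points into a world frame using
# an IMU orientation quaternion. Quaternion convention (w, x, y, z)
# and the sample data are assumptions for this example.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# A 90-degree twist of the forearm, modelled as rotation about the x axis.
angle = np.pi / 2
q = np.array([np.cos(angle / 2), np.sin(angle / 2), 0.0, 0.0])

# Sensor-frame finger points in metres (placeholder values).
points = np.array([[0.05, 0.02, 0.00],
                   [0.06, 0.00, 0.01]])

# Apply the rotation to every point (rows), giving world-frame positions.
world_points = points @ quat_to_matrix(q).T
```

The same finger configuration then maps to different world-frame point clouds depending on arm orientation, which is exactly the extra information the optical sensors alone cannot provide.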
So, the hardware setup works; time to dive into the AI programming. The next step is to demonstrate which gestures are recognized (and, for instance, display the hand gesture with/in Unity ("XR Hands")).
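Before a full ML model, a simple baseline can already show whether the frames separate by gesture. The sketch below uses a 1-nearest-neighbour classifier on flattened 8x8 frames; the two gesture "templates" and all data are synthetic placeholders, not the project's real recordings.

```python
# Sketch: 1-nearest-neighbour baseline for gesture classification on
# flattened 8x8 ToF frames. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic gesture templates (e.g. "pinch" vs "open hand" - labels
# are made up) plus noisy training samples around each template.
template_a = rng.uniform(30, 100, size=64)
template_b = rng.uniform(100, 200, size=64)
train_x = np.vstack([template_a + rng.normal(0, 2, (10, 64)),
                     template_b + rng.normal(0, 2, (10, 64))])
train_y = np.array([0] * 10 + [1] * 10)

def classify(frame):
    """Return the label of the closest training frame (Euclidean 1-NN)."""
    dists = np.linalg.norm(train_x - frame, axis=1)
    return int(train_y[np.argmin(dists)])

pred = classify(template_b + rng.normal(0, 2, 64))
```

If even this baseline separates the gestures, a learned model (and an eventual Unity "XR Hands" visualization of the recognized pose) has a solid starting point.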