What it is and how it works
This is a system that lets a MacBook screen register exactly where the user is touching it. The idea is to place a small mirror in front of the built-in MacBook camera so that the camera views the screen at an acute angle. That way the camera can see the user's fingers, and computer vision turns the video stream into touch-control commands.
To implement the idea, the makers needed:
• hot glue
• miniature mirror
• hard paper plate (this is not a joke)
• door hinges
Using these parts, the mirror is mounted above the MacBook on a makeshift rig. Here is how it came together:
The camera sees both the finger and its reflection in the glossy screen; when the two nearly meet, the finger is touching the surface. The algorithm then estimates the midpoint between them and takes it as the touch point on the laptop screen.
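The touch-point logic can be sketched roughly as follows. This is a simplified illustration, not the project's actual code: it assumes fingertip and reflection coordinates have already been detected in the camera frame, and the distance threshold is an arbitrary example value.

```python
def touch_point(finger, reflection):
    """Estimate the touch point as the midpoint between the detected
    fingertip and its mirror reflection (camera-frame coordinates)."""
    fx, fy = finger
    rx, ry = reflection
    return ((fx + rx) / 2.0, (fy + ry) / 2.0)

def is_touching(finger, reflection, threshold=8.0):
    """The finger is considered to be touching the screen when it nearly
    meets its own reflection, i.e. the gap between the points is small.
    The threshold (in pixels) is an illustrative assumption."""
    fx, fy = finger
    rx, ry = reflection
    return ((fx - rx) ** 2 + (fy - ry) ** 2) ** 0.5 < threshold
```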
RANSAC (random sample consensus) – a method for estimating model parameters from random subsets of the data.
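To illustrate the idea, here is a minimal RANSAC sketch that fits a 2-D line to noisy points by repeatedly sampling two points and keeping the candidate line with the most inliers. This is a generic textbook example in NumPy, not code from the project, and all parameter values are illustrative:

```python
import random
import numpy as np

def ransac_line(points, iters=200, inlier_tol=1.0, seed=0):
    """Fit a line y = slope * x + intercept to 2-D points with RANSAC:
    sample two random points, build the line through them, count points
    within inlier_tol of that line, and keep the best candidate."""
    rng = random.Random(seed)
    pts = np.asarray(points, dtype=float)
    best_inliers = None
    for _ in range(iters):
        (x1, y1), (x2, y2) = pts[rng.sample(range(len(pts)), 2)]
        # Line through the two samples in ax + by + c = 0 form.
        a, b = y2 - y1, x1 - x2
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue  # the two samples coincide; skip this iteration
        c = -(a * x1 + b * y1)
        dist = np.abs(pts @ np.array([a, b]) + c) / norm
        inliers = pts[dist < inlier_tol]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refine with a least-squares fit over the inliers of the best model.
    slope, intercept = np.polyfit(best_inliers[:, 0], best_inliers[:, 1], 1)
    return slope, intercept
```

Because the line is estimated from the consensus set rather than from all points at once, the two outliers in a usage like `ransac_line([(x, 2*x + 1) for x in range(20)] + [(5, 30), (10, -4)])` do not skew the fit.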
In the final prototype, the developers converted touches and finger movements into mouse commands. The model was built with a 480p camera, and the higher the camera's resolution, the more accurately movements can be read.
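Turning a detected touch into a pointer position amounts to mapping camera-frame coordinates onto screen pixels. The sketch below uses a plain linear rescale for clarity; a real system would calibrate a perspective (homography) transform instead, and the frame and screen resolutions here are illustrative assumptions:

```python
def camera_to_screen(cam_x, cam_y,
                     cam_size=(640, 480),       # assumed 480p camera frame
                     screen_size=(1440, 900)):  # assumed display resolution
    """Map a camera-frame point to screen pixels with a plain linear
    rescale (a calibrated homography would be more accurate)."""
    sx = cam_x / cam_size[0] * screen_size[0]
    sy = cam_y / cam_size[1] * screen_size[1]
    return int(round(sx)), int(round(sy))
```

The resulting coordinates can then be handed to an OS-level input API to move the cursor and emit click events. The formula also shows why resolution matters: at 480p each camera pixel covers several screen pixels, so a sharper camera yields a finer-grained touch position.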
The creators published the computer-vision system's source code on GitHub, along with detailed installation instructions. [Anish Athalye]