I'm excited to share a small personal project I've just completed: a hand-gesture-navigable website! 👋🏻
Inspired by Charlie Gerard's workshop on machine learning with JavaScript on Frontend Masters, I decided to create something of my own. This led me to develop a simple experiment exploring the potential of TensorFlow.js and MediaPipe for gesture recognition.
While it's a basic project, it allowed me to dive deeper into these technologies and reflect on how they could be applied to improve web accessibility.
Some things I learned:
🔥 Implementing real-time gesture recognition is complex but fascinating
🔥 Balancing performance and accuracy is an interesting challenge
🔥 There's enormous potential for more intuitive and inclusive user interfaces
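To give a rough idea of how gesture-based navigation can work: MediaPipe Hands detects 21 hand landmarks per frame, and a small pure function can map them to coarse gestures that drive the page. This is a minimal sketch under my own assumptions — the function names, thresholds, and gesture-to-action mapping are illustrative, not taken from the actual project.

```javascript
// Sketch: classify a simple gesture from MediaPipe Hands landmarks.
// Assumes the 21-keypoint format used by MediaPipe / hand-pose-detection,
// where each landmark has {x, y} image coordinates and y grows downward.

const FINGER_TIPS = [8, 12, 16, 20]; // index, middle, ring, pinky tips
const FINGER_PIPS = [6, 10, 14, 18]; // corresponding PIP joints

// A finger counts as extended when its tip sits above its PIP joint.
function countExtendedFingers(landmarks) {
  return FINGER_TIPS.filter(
    (tip, i) => landmarks[tip].y < landmarks[FINGER_PIPS[i]].y
  ).length;
}

// Map finger counts to coarse gestures that could drive navigation.
function detectGesture(landmarks) {
  const extended = countExtendedFingers(landmarks);
  if (extended >= 4) return 'open_palm'; // e.g. scroll down
  if (extended === 0) return 'fist';     // e.g. scroll up
  return 'other';                        // ignore ambiguous poses
}
```

In a real page, the landmarks would come from the detector running on a webcam stream each frame; the hard part (as the list above hints) is smoothing noisy detections so gestures feel deliberate rather than accidental.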
This technology could be very useful in everyday situations, like cooking, where you can navigate recipes without touching your device with messy hands. More importantly, it has the potential to significantly reduce accessibility barriers, making digital interfaces more usable for a wider range of people.
The next small project I'm thinking of working on is an eye-movement-navigable website. This will be a bit tough, though: I'll need to design the page specifically to be usable with the eyes alone.
Thanks for reading this all the way through!
Here are some useful links!
Link to the website: https://hands-gestures-control.netlify.app/
My LinkedIn profile: https://www.linkedin.com/in/federico-casadei-8b572b1b8/