Gesture Control of Smart Home Devices with a Xilinx KV260


Winter 2022

Summary

I did this project as part of the Adaptive Computing Challenge contest that AMD-Xilinx was running over on hackster.io. This was a really fun project to work on, since I got to work with some brand new hardware: I was provided with a free KV260 Vision AI starter kit. And, even better, the project won 3rd prize in the Edge Computing category, and the prize money will be going towards future projects!

The KV260 is a nice little FPGA SoM intended for developing computer vision applications. I loaded it up with Ubuntu Linux, installed PYNQ, compiled a custom PyTorch model using Vitis-AI 1.4, connected a USB webcam, and wrote some Python code to tie it all together. Before long, I was able to wave my hands in front of my TV to navigate the menus, no remote required!
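To give a feel for how the pieces fit together, here is a minimal sketch of what an inference loop like this can look like on the KV260 using PYNQ-DPU and OpenCV. The model file name, preprocessing, and gesture labels below are placeholders for illustration, not the project's actual values:

```python
import cv2
import numpy as np
from pynq_dpu import DpuOverlay

# Load the DPU bitstream and a Vitis-AI compiled model
# ("gesture.xmodel" is a placeholder name).
overlay = DpuOverlay("dpu.bit")
overlay.load_model("gesture.xmodel")
dpu = overlay.runner

# Query tensor shapes so the buffers match the compiled model.
input_tensor = dpu.get_input_tensors()[0]
output_tensor = dpu.get_output_tensors()[0]
_, h, w, c = tuple(input_tensor.dims)

input_data = [np.empty(tuple(input_tensor.dims), dtype=np.float32)]
output_data = [np.empty(tuple(output_tensor.dims), dtype=np.float32)]

GESTURES = ["up", "down", "left", "right", "select"]  # placeholder labels

cap = cv2.VideoCapture(0)  # the USB webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Resize and scale the frame to the model's expected input.
    img = cv2.resize(frame, (w, h)).astype(np.float32) / 255.0
    input_data[0][0] = img

    # Run inference on the DPU and wait for the result.
    job_id = dpu.execute_async(input_data, output_data)
    dpu.wait(job_id)

    gesture = GESTURES[int(np.argmax(output_data[0][0]))]
    print(gesture)  # in the real project, this drives the TV menu
```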

You can check out the code on GitHub and the full article on hackster.io using the links below!

Full Article
Code