A Beginner-Friendly STEAM Project Using Raspberry Pi, Computer Vision & Python
You walk into a room. Instead of searching for a switch, you simply raise your finger — and the light turns on.
Lower your finger — the light turns off.
No buttons.
No physical contact.
Just vision, code, and a bit of electronics.
This is not science fiction. This is exactly what your students will build in this project.
Welcome to a hands-on STEAM activity where computer vision meets electronics, designed especially for beginners who are learning Raspberry Pi, Python, and real-world problem solving.
In real life, gesture control is already everywhere:
- Automatic doors
- Touch-free switches
- Smart TVs
- Motion-controlled games
- Assistive technology for people with limited mobility
This project shows students how those systems actually work: not through theory slides, but by building one themselves.
By the end of this activity, students will:
- Use a USB camera to detect a human hand
- Track fingers using computer vision
- Write Python code to interpret gestures
- Control a real LED using hand movements
What This Project Does
You point a camera at your hand.
- Raise your index finger → the LED turns ON
- Lower your finger → the LED turns OFF
That’s it.
No buttons.
No touch sensors.
Just vision + code + a bit of GPIO.
Under the hood, the Pi is:
- Looking at the camera feed
- Detecting your hand
- Tracking finger positions
- Translating that into a physical action
It’s the same idea behind gesture-controlled TVs and smart devices — just stripped down to something you can build in an afternoon.
Why an LED?
Yes, this controls an LED.
But the LED is just a placeholder.
Once this works, you can swap it out for:
- A relay (to control real lights)
- A fan
- A motor
- A buzzer
- Anything you’d normally control with a GPIO pin
The hard part isn’t the output.
It’s teaching the computer to understand your hand.
What You’ll Need
Nothing exotic here:
- Raspberry Pi 5 (headless is fine)
- A laptop (I’m using Windows, but Mac/Linux works)
- USB webcam (a Logitech BRIO 100 works reliably)
- 1 LED
- 1 × 330 Ω resistor
- Breadboard
- Jumper wires
Wiring the LED
This is the simplest part, but don’t rush it.
- GPIO17 (Pin 11) → 330 Ω resistor → long leg of LED (anode)
- Short leg of LED (cathode) → GND
The resistor matters.
Skipping it is a good way to shorten the life of both the LED and your Pi.
Once wired, leave it alone for now — we’ll test it later.
A Quick Word About Tools
You’ll use a few different things during this build:
- SSH / Terminal → installing software on the Pi
- VNC Viewer → seeing camera windows
- Python IDLE → writing and running code
This isn’t overkill — it’s just how headless Pi setups work.
Connecting to the Raspberry Pi
From your laptop, open a terminal or Command Prompt and connect:
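The original command isn't shown here; a typical SSH invocation looks like this (the username and hostname are illustrative, so substitute your Pi's):

```shell
# Connect from your laptop to the Pi over the local network:
ssh pi@raspberrypi.local
```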
Once you’re in, everything you type runs directly on the Raspberry Pi.
First Sanity Check: Is the Camera Visible?
Before installing anything fancy, make sure the Pi can see the camera:
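One quick way to check, run over SSH (the exact command choice is mine, not necessarily the original's):

```shell
# List the video devices the kernel has registered:
ls /dev/video*

# Or confirm the webcam appears on the USB bus:
lsusb
```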
You should see your webcam listed among the video devices.
If you don’t:
- Unplug and reconnect the camera
- Try a reboot
No point going further until this works.
The Python Version Problem (This Is Where Most Tutorials Break)
Here’s the awkward reality:
- Raspberry Pi OS uses Python 3.11+
- MediaPipe (the hand-tracking library) still requires Python 3.10
- Installing Python 3.10 with apt no longer works
- Downgrading system Python is a bad idea
This is why so many people get stuck halfway through gesture projects.
The solution isn’t a hack — it’s standard practice.
Installing Python 3.10 Safely Using pyenv
Instead of touching system Python, we install another Python version just for this project.
This keeps your Pi stable and your project clean.
Install the required build tools
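The original package list isn't shown; this is the standard set of build dependencies pyenv recommends for compiling Python from source:

```shell
sudo apt update
sudo apt install -y build-essential libssl-dev zlib1g-dev libbz2-dev \
  libreadline-dev libsqlite3-dev curl git libncursesw5-dev xz-utils \
  tk-dev libxml2-dev libxmlsec1-dev libffi-dev liblzma-dev
```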
Install pyenv
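The usual route is the official installer script (assumed here, since the original command is not shown):

```shell
# Downloads and runs the pyenv installer:
curl -fsSL https://pyenv.run | bash
```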
Tell your shell about it:
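These are the standard lines pyenv asks you to add to `~/.bashrc` (restart the shell afterwards so they take effect):

```shell
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
```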
Check:
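A quick sanity check that the shell can now find pyenv:

```shell
pyenv --version
```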
Install Python 3.10
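The exact patch version (3.10.13) is taken from the verification step later in this guide:

```shell
# Compiles Python 3.10.13 from source -- slow on a Pi:
pyenv install 3.10.13
```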
This takes a while on a Pi. Let it finish.
Setting Up a Clean Project Environment
Now let’s keep everything contained:
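A typical setup looks like this (the folder and environment names are illustrative, not the original's):

```shell
# Project folder:
mkdir gesture_led && cd gesture_led

# Pin this folder to the pyenv-built interpreter:
pyenv local 3.10.13

# Create and activate a virtual environment:
python -m venv venv
source venv/bin/activate
```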
Verify:
`python --version`
You should see Python 3.10.13.
At this point:
- System Python is untouched
- This project uses exactly what it needs
- MediaPipe will behave
Installing the Libraries
With the virtual environment active:
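The package names below are inferred from the libraries this guide uses; exact versions may differ on your setup:

```shell
pip install mediapipe opencv-python gpiozero
```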
Opening Python IDLE (This Detail Matters)
From VNC Viewer, open a terminal:
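One way to launch IDLE under the project's interpreter (assumed; the key detail is activating the virtual environment first so IDLE runs Python 3.10):

```shell
cd gesture_led
source venv/bin/activate
python -m idlelib
```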
Inside IDLE:
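A quick check you can type into the IDLE shell to confirm which interpreter is running:

```python
# Confirm IDLE is using the virtual environment's interpreter.
import sys

version = sys.version.split()[0]
print(version)  # expect 3.10.x when the venv is active
```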
Make sure it still says Python 3.10.
Testing the Camera
Create a file called camera_test.py:
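The original listing isn't shown, so here is a minimal sketch, assuming OpenCV is installed and the webcam is device index 0:

```python
# camera_test.py -- minimal live-preview sketch (press q to quit).
import cv2

cap = cv2.VideoCapture(0)  # device index 0 assumed
if not cap.isOpened():
    raise SystemExit("Camera not found -- check the USB connection.")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Camera Test", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```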
If a live window opens, the camera side is done.
Testing the LED (One Pi-5 Specific Fix)
Raspberry Pi 5 uses a newer GPIO backend. Install it once:
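One common fix is installing the lgpio backend into the project's virtual environment (this is an assumption about the original setup; your exact package choice may differ):

```shell
# With the virtual environment active:
pip install lgpio
sudo reboot
```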
After reboot, test the LED:
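A minimal blink sketch using gpiozero, assuming the LED is wired to GPIO17 as described above:

```python
# led_test.py -- blink the LED five times.
from time import sleep
from gpiozero import LED

led = LED(17)  # BCM numbering: GPIO17 = physical pin 11

for _ in range(5):
    led.on()
    sleep(0.5)
    led.off()
    sleep(0.5)
```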
Blinking LED = everything is wired correctly.
The Fun Part: Hand Gesture Control
Now we connect vision to hardware.
Create hand_led_control.py:
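The original listing isn't shown; this is a minimal sketch of the idea, assuming MediaPipe, OpenCV, and gpiozero are installed in the Python 3.10 environment and the LED sits on GPIO17. The gesture test uses MediaPipe's hand landmark indices: 8 is the index fingertip, 6 is the joint below it, and image y-coordinates grow downward, so a raised finger means the tip's y is smaller than the joint's:

```python
# hand_led_control.py -- raise index finger = LED on, lower it = LED off.
import cv2
import mediapipe as mp
from gpiozero import LED

led = LED(17)  # BCM numbering: GPIO17 = physical pin 11

mp_hands = mp.solutions.hands
hands = mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # MediaPipe expects RGB; OpenCV delivers BGR.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)

    finger_up = False
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        # Landmark 8 = index fingertip, 6 = index PIP joint.
        # y grows downward, so "raised" means tip.y < joint.y.
        finger_up = lm[8].y < lm[6].y

    if finger_up:
        led.on()
    else:
        led.off()

    cv2.putText(frame, "LED ON" if finger_up else "LED OFF",
                (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("Hand LED Control", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
led.off()
```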
Run it.
Raise your index finger.
Watch the LED respond.
That moment never gets old.
Where This Can Go Next
Once this works, ideas start multiplying:
- Replace the LED with a relay
- Add more gestures
- Control multiple outputs
- Combine with voice or sensors
- Build touch-free interfaces
This project isn’t about an LED.
It’s about realizing that your hands can directly control hardware using code — and once you see that, you start seeing applications everywhere.