BrilliantSole (@BrilliantSole)
2025-12-31 | ❤️ 830 | 🔁 252
3D hand tracking on smart glasses
Not only can we stream camera data with our Wearables SDK, but we can also run MediaPipe for hand tracking, applying it in WebXR environments like @threejs and @aframevr
Works with compatible smart glasses, in this case the @omidotme Omi Glass, based on the @seeedstudio XIAO ESP32-S3 Sense (running our custom firmware)
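MediaPipe reports hand landmarks in normalized image coordinates (x, y in [0, 1] with y growing downward), while three.js and A-Frame use a right-handed space with y up, so the landmarks need a small coordinate remap before they can drive a WebXR scene. A minimal sketch of that remap, assuming the camera frame is projected onto a virtual plane of a chosen size (the plane dimensions and depth scale here are illustrative, not from the SDK):

```python
def landmark_to_world(lm, plane_width=1.0, plane_height=0.75, depth_scale=1.0):
    """Map one MediaPipe hand landmark into a three.js-style space.

    lm is an (x, y, z) tuple as MediaPipe reports it: x and y are
    normalized to the image, z is a relative depth (negative toward
    the camera). The plane dimensions are hypothetical scene units.
    """
    x, y, z = lm
    wx = (x - 0.5) * plane_width    # centre horizontally on the plane
    wy = (0.5 - y) * plane_height   # flip y: image y grows downward
    wz = -z * depth_scale           # push landmarks away from the viewer
    return (wx, wy, wz)
```

In a real pipeline each of the 21 landmarks per hand would be remapped this way every frame and fed to the corresponding scene objects.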
Media
🔗 Related
- open-source-robot-arm-meets-hand-tracking-github-below-it
- watch-the-awesome-4dgs-plugin-running-in-lichtfeld-studio
- new-workflow-thanks-to-steipetes-awesome-twitter-cli-bird-1
- the-next-evolution-vla-models-just-yesterday-msftresearch
- thx-akhaliq-for-sharing-control-the-camera-control-the
Quoted tweet
BrilliantSole (@BrilliantSole)
Detect head gestures with smart glasses
Used @EdgeImpulse to create a simple classifier that detects head gestures (nod and shake) from motion data, all running on the glasses themselves via TensorFlow Lite
Using the @omidotme Omi Glass (based on the @seeedstudio XIAO ESP32-S3 Sense) running our custom firmware
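The actual classifier is trained in Edge Impulse and deployed as a TensorFlow Lite model, but the underlying idea is simple: a nod is a pitch-axis oscillation and a shake is a yaw-axis oscillation in the gyroscope stream. A hand-rolled heuristic stand-in (the threshold and window format are illustrative assumptions, not the trained model):

```python
def classify_head_gesture(gyro_window, threshold=1.5):
    """Classify a window of gyro samples as 'nod', 'shake', or 'none'.

    gyro_window: list of (pitch_rate, yaw_rate) samples in rad/s.
    Nod = dominant pitch energy, shake = dominant yaw energy; this is
    a heuristic sketch, not the Edge Impulse classifier itself.
    """
    pitch_energy = sum(p * p for p, _ in gyro_window)
    yaw_energy = sum(y * y for _, y in gyro_window)
    if max(pitch_energy, yaw_energy) < threshold:
        return "none"   # no strong oscillation on either axis
    return "nod" if pitch_energy > yaw_energy else "shake"
```

On-device, a trained model replaces this hand-tuned energy comparison and is far more robust to walking, turning, and other everyday head motion.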
🎬 Video