XR Blocks
JavaScript library for rapid XR and AI prototyping
Site -- Manual -- Templates -- Demos -- YouTube -- arXiv -- Blog
Description
XR Blocks is a lightweight, cross-platform library for rapidly prototyping advanced XR and AI experiences. Built upon three.js, it targets Chrome v136+ with WebXR support on Android XR (e.g., Galaxy XR) and also includes a powerful desktop simulator for development. The framework emphasizes a user-centric, developer-friendly SDK designed to simplify the creation of immersive applications with features like:
- Hand Tracking & Gestures: Access advanced hand tracking, custom gestures with TensorFlow Lite / PyTorch models, and interaction events.
- Gesture Recognition: Opt into pinch, open-palm, fist, thumbs-up, point, and spread detection with options.enableGestures(), tune providers or thresholds, and subscribe to gesturestart / gestureupdate / gestureend events from the shared subsystem.
- World Understanding: Explore samples with depth sensing, geometry-aware physics, and object recognition with Gemini in both XR and the desktop simulator.
- AI Integration: Seamlessly connect to Gemini for multimodal understanding and live conversational experiences.
- Cross-Platform: Write once and deploy to both XR devices and desktop Chrome browsers.
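The gesture feature described above can be sketched roughly as follows. Only the option name enableGestures() and the three event names come from the list above; the xb import alias, the Options/init calls, the gestures subsystem handle, and the event payload shape are all assumptions for illustration, not confirmed API:

```javascript
import * as xb from 'xrblocks';

// Assumption: an Options object exposes the enableGestures() toggle
// named in the feature list above.
const options = new xb.Options();
options.enableGestures();

// Assumption: the shared gesture subsystem is reachable as xb.core.gestures
// and dispatches the gesturestart/gestureupdate/gestureend events listed above.
xb.core.gestures.addEventListener('gesturestart', (event) => {
  // event.gesture (e.g. 'pinch', 'thumbs-up') is an illustrative payload shape.
  console.log('gesture started:', event.gesture);
});
xb.core.gestures.addEventListener('gestureend', (event) => {
  console.log('gesture ended:', event.gesture);
});

xb.init(options);
```

Consult the Manual for the actual option and subsystem names; this sketch only shows the opt-in-then-subscribe flow the bullet describes.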
We welcome contributors to help foster an AI + XR community! Read our blog post and white paper for a visionary roadmap.
Usage
XR Blocks can be imported directly into a webpage using an importmap. The basic example builds an XR scene containing a cylinder; pinching your fingers (in XR) or clicking (in the desktop simulator) changes the cylinder's color. Check out this live demo and the simple code below:
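A minimal page sketch of the importmap setup and the color-changing cylinder described above. The three.js CDN URL is illustrative, the xrblocks module path is a placeholder (see the Templates page for the real one), and the xb.Script / xb.add / xb.init entry points and onSelect hook are assumptions about the SDK surface:

```html
<!-- Map bare specifiers to module URLs so the page can import 'three'
     and 'xrblocks' directly. The xrblocks path below is a placeholder. -->
<script type="importmap">
{
  "imports": {
    "three": "https://unpkg.com/three@0.160.0/build/three.module.js",
    "xrblocks": "./xrblocks.module.js"
  }
}
</script>
<script type="module">
  import * as THREE from 'three';
  import * as xb from 'xrblocks'; // import alias is an assumption

  // A script that adds a cylinder and recolors it on select
  // (pinch in XR, click in the desktop simulator).
  class CylinderScript extends xb.Script {
    init() {
      this.mesh = new THREE.Mesh(
          new THREE.CylinderGeometry(0.2, 0.2, 0.5),
          new THREE.MeshStandardMaterial({color: 0x4285f4}));
      this.add(this.mesh);
    }
    onSelect() {
      // Pick a random color on each pinch/click.
      this.mesh.material.color.setHex(Math.floor(Math.random() * 0xffffff));
    }
  }

  xb.add(new CylinderScript());
  xb.init();
</script>
```

The importmap is what lets the module script use bare specifiers like 'three' in the browser without a bundler; everything else is regular three.js scene construction.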