BrilliantSole
@BrilliantSole
Making smart insoles for developers. They have pressure sensors (8 per insole), a 3DoF motion sensor, and 2 haptics per insole (heel & ball). Waitlist link below
Kicking a virtual soccer ball in WebXR on the Quest 3 ⚽️ @aframevr Using our @ConcreteSciFi motion module (strapped to our ankle) to detect kicks and stomps
Stomp, Kick, and Toss green shells in WebXR on the Quest 3 @aframevr By strapping our @ConcreteSciFi modules to our ankle, we can detect kicks and stomps by running an @EdgeImpulse model on-device, and then send the results to the headset via WebSockets We also add haptic…
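The on-device detection → WebSocket relay described above can be sketched like this. The message shape, the 0.8 confidence floor, and the function names are illustrative assumptions, not the actual Brilliant Sole SDK API:

```javascript
// Hedged sketch of the ankle-module → headset relay over WebSockets.
// Message format and thresholds are assumptions for illustration only.

// Pure helper: wrap a classifier result in a compact JSON message.
function packGesture(label, confidence, timestamp) {
  return JSON.stringify({ type: "gesture", label, confidence, timestamp });
}

// Pure helper: decode and sanity-check an incoming message.
function unpackGesture(raw) {
  const msg = JSON.parse(raw);
  return msg.type === "gesture" ? msg : null;
}

// Ankle side (not executed here): forward each on-device detection.
function sendGesture(ws, label, confidence) {
  ws.send(packGesture(label, confidence, Date.now()));
}

// Headset side: react only to confident detections, e.g. "kick" or "stomp".
function handleMessage(raw, onGesture) {
  const msg = unpackGesture(raw);
  if (msg && msg.confidence > 0.8) onGesture(msg.label);
}
```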
Adding haptic feedback to create immersive WebXR experiences on the Quest 3 @aframevr Here we vibrate our @ConcreteSciFi modules differently based on hand gesture (pet, punch, grab, and release) We added WiFi support to our JavaScript SDK, allowing us to connect to our…
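The per-gesture vibration idea above could look something like this. The waveform names, intensity values, and the `triggerVibration()` method are hypothetical stand-ins, not the real JavaScript SDK surface:

```javascript
// Hedged sketch: distinct vibration patterns per hand gesture.
// Pattern fields and the device method below are assumptions.

// Pure helper: choose a vibration pattern for a recognized gesture.
function vibrationFor(gesture) {
  const patterns = {
    pet: { waveform: "softBump", intensity: 0.3 },
    punch: { waveform: "sharpClick", intensity: 1.0 },
    grab: { waveform: "doubleClick", intensity: 0.7 },
    release: { waveform: "softBump", intensity: 0.5 },
  };
  return patterns[gesture] ?? null;
}

// Device call (hypothetical SDK method; not executed here).
function onGesture(device, gesture) {
  const pattern = vibrationFor(gesture);
  if (pattern) device.triggerVibration(pattern);
}
```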
Smart Glasses Webcam Used @OBSProject to create a Virtual Webcam of our smart glasses camera stream demo Use in video conference apps like Zoom and Google Meet, for live-streaming on Twitch, or even as a quick way to test realtime computer vision demos (eg @huggingface spaces)…
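Picking the OBS virtual camera up from a web page (e.g. to feed a computer-vision demo) can be done with standard `mediaDevices` APIs. The label match below is a heuristic — OBS usually exposes itself as "OBS Virtual Camera", but labels vary by platform:

```javascript
// Hedged sketch: open the OBS virtual camera from the browser.

// Pure helper: find a likely virtual-camera entry among media devices.
function pickVirtualCam(devices) {
  return (
    devices.find(
      (d) => d.kind === "videoinput" && /obs|virtual/i.test(d.label)
    ) || null
  );
}

// Browser side (not executed here): open that camera as a MediaStream.
async function openVirtualCam() {
  // Device labels are only populated after a camera permission grant.
  const devices = await navigator.mediaDevices.enumerateDevices();
  const cam = pickVirtualCam(devices);
  if (!cam) throw new Error("OBS Virtual Camera not found");
  return navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: cam.deviceId } },
  });
}
```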
Express yourself with smart glasses 👓 Using MediaPipe’s face tracking to visualize a user’s eyes and eyebrows on a smart glasses display (in this case the @brilliantlabsAR Frame running custom firmware), sorta like a simple FaceTime Memoji While Frame doesn’t have built-in…
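The MediaPipe side of the eye/eyebrow demo above could be sketched with the Tasks Vision `FaceLandmarker`, whose landmarks come back normalized to [0, 1]. The display dimensions and the mapping helper are illustrative assumptions; the `@mediapipe/tasks-vision` calls are from that library's documented API:

```javascript
// Hedged sketch: browser face tracking feeding a glasses display.

// Pure helper: map a normalized MediaPipe landmark {x, y} to display pixels.
function toDisplay(pt, width, height) {
  return { x: Math.round(pt.x * width), y: Math.round(pt.y * height) };
}

// Browser side (not executed here); package loaded lazily so this stays inert.
async function startFaceTracking(videoEl, onLandmarks) {
  const { FaceLandmarker, FilesetResolver } = await import(
    "@mediapipe/tasks-vision"
  );
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
  );
  const landmarker = await FaceLandmarker.createFromOptions(vision, {
    baseOptions: { modelAssetPath: "face_landmarker.task" }, // assumed local path
    runningMode: "VIDEO",
  });
  const tick = () => {
    const result = landmarker.detectForVideo(videoEl, performance.now());
    if (result.faceLandmarks.length) onLandmarks(result.faceLandmarks[0]);
    requestAnimationFrame(tick);
  };
  tick();
}
```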
Our smart glasses display simulator also works in WebXR, allowing us to preview AR glasses applications on headsets like the Quest 3 in @aframevr And despite Web Bluetooth not working in the Oculus Browser, our JavaScript SDK allows us to indirectly connect to compatible smart…
Added a smart glasses display simulator to our JavaScript SDK Easily preview how your applications will look in the browser, using your webcam, a video, or an @aframevr WebXR scene as the background Even works alongside compatible smart glasses in realtime (in this case the…
Added display support for smart glasses (in this case the @brilliantlabsAR Frame) to our SDK Easily draw primitives (rectangles, rounded rectangles, circles, ellipses, polygons, line segments), set outline thickness, change colors, apply rotations, and even apply regular and…
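A drawing API over Bluetooth like the one above is often built as a command list that gets serialized and sent in one batch. This builder is a hypothetical sketch mirroring the primitives listed; the real SDK's method names and wire format may differ entirely:

```javascript
// Hedged sketch: a chainable command-list builder for a glasses display.
// Command names echo the primitives above; the actual API is assumed.
function makeDisplayList() {
  const commands = [];
  return {
    rect(x, y, w, h, opts = {}) {
      commands.push({ op: "rect", x, y, w, h, ...opts });
      return this;
    },
    circle(x, y, r, opts = {}) {
      commands.push({ op: "circle", x, y, r, ...opts });
      return this;
    },
    rotate(degrees) {
      commands.push({ op: "rotate", degrees });
      return this;
    },
    build() {
      return commands.slice(); // would be serialized and sent to the device
    },
  };
}
```

Batching keeps the number of Bluetooth writes low, which matters at BLE data rates.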
Running speech recognition on smart glasses 👓 As we port our smart insole firmware to the @brilliantlabsAR Frame, we added microphone support to our SDKs, allowing us to easily run @huggingface’s Whisper Web to perform speech recognition in the browser (both realtime and…
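Feeding glasses microphone audio into browser-side Whisper roughly means converting the PCM stream to the Float32 format the model expects, then running a Transformers.js pipeline. The PCM16 assumption and the helper are illustrative; the `pipeline` call is Transformers.js's documented API (model id assumed):

```javascript
// Hedged sketch: glasses mic audio → browser speech recognition.

// Pure helper: convert signed 16-bit PCM samples to Float32 in [-1, 1],
// assuming the mic stream arrives as PCM16 (an assumption about the firmware).
function pcm16ToFloat32(samples) {
  const out = new Float32Array(samples.length);
  for (let i = 0; i < samples.length; i++) out[i] = samples[i] / 32768;
  return out;
}

// Browser side (not executed here); model downloads on first use.
async function transcribe(float32Audio) {
  const { pipeline } = await import("@xenova/transformers");
  const asr = await pipeline(
    "automatic-speech-recognition",
    "Xenova/whisper-tiny.en" // assumed model choice
  );
  const { text } = await asr(float32Audio);
  return text;
}
```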
Track hands, faces, and objects with your smart glasses 👓 Running MediaPipe and @huggingface models in the browser, using camera data sent from the @brilliantlabsAR Frame via Bluetooth The Frame is running custom firmware ported from our smart insoles, using our own…
Streaming camera data from smart glasses 👓 to a smart watch ⌚️ via Bluetooth As we port our smart insole firmware to the @brilliantlabsAR Frame, we added camera support to our SDKs For context, this is running on an Apple Watch Series 5 (2019)
Ported our firmware to the @brilliantlabsAR Frame 👓, making it compatible with our SDKs
Connecting an Apple Watch to our Smart Insoles Using our Brilliant Sole Swift Package in a SwiftUI app Get sensor data, trigger haptics, and even send ML models to the insoles It also works in Xcode Previews (Core Bluetooth doesn’t work on non-macOS schemes, but you can use…
Virtual punching bag, using our @ConcreteSciFi hardware to detect punches and provide haptic feedback Running an ML model on-device to detect punches, hooks, and uppercuts, made with @EdgeImpulse All running in the browser @aframevr
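Consuming the classifier output above typically means picking the top label only when it clears a confidence floor. The result shape `[{ label, value }]` follows Edge Impulse's JavaScript classifier output convention, but treat the exact field names and the 0.6 threshold as assumptions:

```javascript
// Hedged sketch: turn classifier scores into a single gesture decision.
// Result shape and threshold are assumptions for illustration.
function topGesture(results, threshold = 0.6) {
  let best = null;
  for (const r of results) {
    if (r.value >= threshold && (!best || r.value > best.value)) best = r;
  }
  return best ? best.label : "uncertain";
}
```

Gating on a threshold keeps low-confidence windows (e.g. between punches) from triggering spurious haptics.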
Experimenting with simple wrist controls using our @ConcreteSciFi hardware, all running in a webpage @aframevr Their hardware is modular, so we can remove the insole & shoe clip, add a loop clip, and strap it around our wrist, relying just on motion+haptics We used…
Rewrote @ConcreteSciFi’s firmware so their hardware is now 100% compatible with our SDKs - get motion data, pressure data, and trigger haptics Despite having different motion sensors, different MCUs (ours is @NordicTweets, theirs is @ESP32net), and different platforms (ours is…
Connecting our smart insoles to an Apple Vision Pro (simulator) using our Swift package + SwiftUI sample app The AVP simulator doesn’t have Bluetooth enabled, but fortunately our Node.js server can relay data via UDP Even works in Xcode previews - great for iterating (even if…
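The Node.js UDP relay idea above can be sketched with the standard `dgram` module. The one-byte-type framing is a made-up example format, not the actual Brilliant Sole protocol:

```javascript
// Hedged sketch: relay insole data to the AVP simulator over UDP (Node.js).

// Pure helpers with an assumed framing: [1-byte packet type][payload bytes].
function frameSensorPacket(typeByte, payload) {
  return Buffer.concat([Buffer.from([typeByte]), payload]);
}
function parseSensorPacket(buf) {
  return { type: buf[0], payload: buf.subarray(1) };
}

// Relay (not executed here): forward every datagram received on listenPort
// to the simulator at forwardHost:forwardPort.
function startUdpRelay(listenPort, forwardHost, forwardPort) {
  const dgram = require("dgram"); // Node.js standard library
  const sock = dgram.createSocket("udp4");
  sock.on("message", (msg) => sock.send(msg, forwardPort, forwardHost));
  sock.bind(listenPort);
  return sock;
}
```

UDP suits this bridge since sensor frames are small and a dropped sample matters less than added latency.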
Interfacing with our smart insoles on an Apple TV using our Swift package + SwiftUI sample app