Thank you again to everyone here for helping out, and to the developer of FreePIE for such an awesome tool.

Project files for DIY HTC Vive motion controllers, using a Leap Motion mounted to the VR headset for positional tracking:

Over the weekend I finished 3D printing a second DIY motion controller and played a few Steam VR apps. There were a few glitches that were very specific to each developer's implementation, but overall it's very satisfying to finally have access to the Steam VR stuff with motion controllers. Without any industry standards for VR controllers, your mileage will vary in app support. Rec Room didn't recognize the pickup action, but paintball was fun.

While researching and reading through forums, I found a lot of people with Wii Nunchuks trying to do this. With a few edits to the FreePIE script, it should be easily converted to whatever device/buttons/orientation data they want. The hard part was figuring out the 3D transforms; now it's just a matter of mapping the data. Next up is calibrating the Arduino-IMU orientation and mapping the joystick + buttons. It should be in a playable state before the weekend.

The biggest limitation is the 2014 FreePIE plugin for the Leap Motion. It outputs the normalized (and limited) hand positions of the InteractionBox. The x, y, z positions of the left and right hands are given in the InteractionBox's normalized space instead of the raw mm values of the full FOV from the Leap sensor. Until the driver is updated/recreated, we're stuck using the limited InteractionBox volume. I hope this is a simple fix, but I know nothing about writing plugins: all that would be needed is to output the raw hand position data for the full range of motion the Leap captures. If anyone with Windows DLL plugin development experience thinks it's straightforward, this driver could enable a flurry of VR applications for the FreePIE community. If anyone could help here, that would be awesome.
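To make the normalized-vs-raw issue concrete, here's a minimal FreePIE-style Python sketch that maps the plugin's normalized InteractionBox output back to approximate millimeters. The box dimensions and center height below are assumptions for illustration (the real values come from the Leap API's InteractionBox properties, which the current plugin doesn't expose), so this is a workaround sketch, not the actual driver fix:

```python
# Sketch: recover approximate mm positions from the plugin's
# normalized [0,1] InteractionBox output.
# ASSUMED values -- the real box size comes from the Leap API's
# InteractionBox width/height/depth and varies per device/session.
BOX_W, BOX_H, BOX_D = 235.0, 235.0, 147.0  # mm, assumed
BOX_CENTER_Y = 200.0                       # mm above the sensor, assumed

def denormalize(nx, ny, nz):
    """Map normalized InteractionBox coords back to mm in Leap space
    (x right, y up from the sensor, z toward the user)."""
    x = (nx - 0.5) * BOX_W
    y = BOX_CENTER_Y + (ny - 0.5) * BOX_H
    z = (nz - 0.5) * BOX_D
    return x, y, z
```

Even with correct box values this only stretches the clamped volume back to its true size; hands outside the InteractionBox are still lost, which is why the plugin itself needs to export the raw positions.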
Unfortunately, the author of the Leap plugin for FreePIE is no longer active, as it was made over two years ago. If you look closely at the gif, you'll see the hand get stuck inside a "virtual bounded box." That's the InteractionBox function of the Leap sensor.

What's the distance between the Hydra base station and eye-center when re-centering the Hydra base to 0,0,0? There were many pitfalls (e.g. no documentation on the Hydra driver, and I had no idea how it calibrated, so everything was inverted for a while), but I hope someone has these numbers.

He has most recently been experimenting with emulating HTC Vive controllers using an Arduino, a BNO055 9-axis orientation sensor, and a Leap Motion unit.
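For anyone puzzling over the same re-centering question, here's a rough Python sketch of why the base-to-eye offset matters: the emulated Hydra reports positions relative to the base station, while the head-mounted Leap measures hands relative to the headset. The offset vector and axis conventions below are hypothetical placeholders (the real offset is exactly the number the post is asking for), and the z flip is just one plausible cause of the "everything was inverted" pitfall:

```python
# HYPOTHETICAL eye-to-base offset, in mm -- the real value is unknown
# and is what needs measuring/calibrating.
EYE_TO_BASE = (0.0, -250.0, -500.0)

def leap_to_hydra(leap_mm, eye_to_base=EYE_TO_BASE):
    """Translate a head-relative Leap position (x right, y up,
    z toward the user) into base-relative Hydra space."""
    x, y, z = leap_mm
    # Assumed sign flip: Leap's +z points toward the user; if the
    # Hydra driver expects +z the other way, forgetting this flip
    # inverts all depth motion.
    return (x + eye_to_base[0],
            y + eye_to_base[1],
            -z + eye_to_base[2])
```

With the correct offset and axis signs, a hand held at the headset origin would land at the eye position in Hydra space rather than at the base station.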