-
Hello, I've been building a VR experiment with UXF, but have run into a potential issue implied by the title of this topic. I'm planning to use active markers (e.g., OptiTrack, WorldViz PPT) to record positional data (e.g., reaching characteristics). Normally, these sample at ~200 Hz, and much of the research I see using them holds to that standard. However, the rate at which position is sampled in Unity is tied to the framerate, with most HMDs running at 90 fps. Is there a way to decouple the markers' sampling rate from the framerate, or to sample a transform multiple times per frame? Alternatively, is my concern here overblown? Does it even matter? I've spent a few days spinning my wheels trying to work out a solution, to no avail. Coroutines evaluate at most once per frame, and other approaches (e.g., multithreading, the Job System) don't allow me to access Unity methods (and thus the transform of the tracked object). However, I'm also relatively new to Unity, so I thought I'd ask here. Thanks,
-
Hi Josh, As to whether it's needed - I guess it depends on your use case. For fast, jerky movements, you can imagine 200, 500, or 1000 Hz being very useful. But for many experiments, 90 Hz could be enough. I recall some published experiments finding that ~100 Hz is enough for most human movement data. Maybe @mwarb has an opinion? Jack
-
I ended up solving this problem! In short, I created a new Tracker class that overrides a few of the parent class's methods, opening a separate thread during trials to sample position data from VRPN outside the Unity API. I put up an example at JashoBell/rapid-sampling-for-unity-experiments for anyone navigating the same issue. It probably won't be plug-and-play, but hopefully it will give a good head start. I might clean the project up over time, but that might not be for a few months. Let me know if I need to credit UXF differently than I am; I'm very new to this.