Good evening,
I'm trying to combine the picsphere functionality with another open-source project that uses Tango, RTAB-Map, for a university project I'm doing. The GPL will be fully respected.
I've managed to import all the files of the complete app so that they build with Gradle in Android Studio 2.3.3, and I've recompiled all of the external dependencies with ndk-build (r13b, if I recall correctly), so all of the shared libraries work with the 2013 versions of the Hugin tools. So far so good: the picsphere applet captures images, saves them, and then attempts to render them, but the results often don't come out equirectangular at all. I suspect it comes down to a quirk of how the SensorFusion class interfaces with the gyro/accelerometer/magnetometer in the Lenovo Phab 2 Pro I'm trying to port Focal to. I know it's a weird and awkwardly uncommon phone, but it's the only Tango device I could get my hands on. It runs Marshmallow.
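For what it's worth, this is roughly the standalone check I've been running to see whether the platform sensors themselves behave sanely on the Phab 2 Pro, independent of Focal's SensorFusion class. It's just my own debugging sketch (the activity name is mine, not anything from the repo): it logs the platform's ROTATION_VECTOR output so I can compare it against the orientation the picsphere guide dots are drawn from.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class SensorSanityCheckActivity extends Activity implements SensorEventListener {
    private static final String TAG = "SensorSanityCheck";
    private SensorManager mSensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        Sensor rotation = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        if (rotation != null) {
            mSensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_GAME);
        } else {
            Log.w(TAG, "No ROTATION_VECTOR sensor reported on this device");
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Convert the rotation vector to yaw/pitch/roll so it's easy to compare
        // against where the picsphere guide dots end up on screen.
        float[] rotMatrix = new float[9];
        float[] orientation = new float[3];
        SensorManager.getRotationMatrixFromVector(rotMatrix, event.values);
        SensorManager.getOrientation(rotMatrix, orientation);
        Log.d(TAG, String.format("yaw=%.1f pitch=%.1f roll=%.1f",
                Math.toDegrees(orientation[0]),
                Math.toDegrees(orientation[1]),
                Math.toDegrees(orientation[2])));
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        Log.d(TAG, sensor.getName() + " accuracy -> " + accuracy);
    }
}
```

The raw readings look stable on their own, so I suspect the issue is in how they get fused rather than in the hardware itself, but I haven't confirmed that yet.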
This feeds into another problem: I have a feeling the possible sensor errors can be traced back to where the phone thinks the last snapshot in the picsphere sequence sits in 3D space. The blue guide dots, which are drawn every 40 degrees, move in circles at varying speeds and in all directions, mostly slowly but sometimes dizzyingly fast. So if I can take one snapshot and render it into the GLES20 scene, I can work out how the sensor data received by the app is behaving. However, none of the snapshot textures I take before the full sphere is rendered (via startRendering() in PicSphereManager) are actually drawn to the GLES20 scene. I think it's because of this error, which I'm exploring with logcat:
07-10 21:37:01.315 25854-26166/fr.xplod.focal W/GLConsumer: [SurfaceTexture-1-25854-1] bindTextureImage: clearing GL error: 0x502
which, as far as I can tell, means a GL_INVALID_OPERATION has occurred somewhere. It recurs constantly whilst in picsphere mode.
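To try to pin down where the 0x502 actually originates, I've been sprinkling a small helper of my own (not anything from the Focal repo) after each GLES20 call in the picsphere render path, so the first failing call shows up in logcat next to a readable label instead of only being surfaced later when GLConsumer clears the error:

```java
import android.opengl.GLES20;
import android.opengl.GLU;
import android.util.Log;

public final class GlDebug {
    private static final String TAG = "GlDebug";

    private GlDebug() {}

    // Call after a GL operation, e.g. GlDebug.checkGlError("glDrawArrays snapshot quad"),
    // to drain and log any pending GL errors with a human-readable name.
    public static void checkGlError(String label) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.e(TAG, label + ": glError 0x" + Integer.toHexString(error)
                    + " (" + GLU.gluErrorString(error) + ")");
        }
    }
}
```

One thing I'm checking with this is whether the snapshot textures coming from the camera SurfaceTexture are being bound with the GL_TEXTURE_EXTERNAL_OES target and whether updateTexImage() is being called on the GL thread, since a mismatch there is a common way to end up with a GL_INVALID_OPERATION, but I haven't ruled anything out yet.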
Now, I do realise that this repo hasn't been touched for a while, but making it work on modern handsets would have lasting benefits for the project I'm working on, which aims to help blind and visually impaired people.
I'm sort of reaching out to xplodwild and your divine Android wisdom here, I guess. I'm rather new to Android programming, and I was wondering whether you or anyone else has come across this sort of error on a handset newer than the Nexus 4 (a name I see littered about the comments; I get the strong feeling the app was developed against a 2013 CyanogenMod build on that device), and whether the sensor interface needs updating. I'm also running my Gradle-built APK on near enough stock Android 6.0.1: is root access explicitly required for any part of Focal to work?