GearVRf
Q / How can I use a 180 degree GVRVideoSceneObject?
I used a 360 degree GVRVideoSceneObject:
GVRSphereSceneObject sphere = new GVRSphereSceneObject(gvrContext, 72, 144, false);
GVRMesh mesh = sphere.getRenderData().getMesh();
environmentObject = new GVRVideoSceneObject( gvrContext, mesh, EnviromentSceneObjectPlayer, GVRVideoSceneObject.GVRVideoType.MONO );
environmentObject.getTransform().setScale(21f, 21f, 21f);
environmentObject.setName( "environmentObject" );
mGVRContext.getMainScene().addSceneObject(environmentObject);
I want to use a 180 degree GVRVideoSceneObject. How can I do that?
I'm assuming that currently, your code maps the 180 degree video across the entire interior of the sphere instead of a single hemisphere. In order to achieve the effect you want, there are a few solutions.
First, you could adjust the UV mapping of the spherical mesh so that the texture (video) maps only to a single hemisphere. I believe you may be able to achieve this programmatically: since the standard storage format for 360 degree videos is an equirectangular projection, taking the original UV coordinates and multiplying the U coordinate by 2 will cause the video to project onto half as much area in the x-direction, so the video will show up on a single hemisphere. Depending on the render settings, you may see the video repeat on the opposite hemisphere as well.
Alternatively, if you have the ability to edit the video, you can extend the video in the x-direction with a black rectangular area the same size as your current video. This creates a second, black hemisphere that allows the original part of the video to be properly projected onto a single hemisphere.
If the video is not in equirectangular format, however, you may need to try a different approach.
@J0Nreynolds Thank you for your reply. Could you show me some sample code? If you have sample code, please let me know. I need your kind help.
In order to perform my first suggestion, modify your code to be the following:
GVRSphereSceneObject sphere = new GVRSphereSceneObject(gvrContext, 72, 144, false);
GVRMesh mesh = sphere.getRenderData().getMesh();
float[] texCoords = mesh.getTexCoords();
for(int i = 0; i < texCoords.length; i+=2){
texCoords[i] = 2*texCoords[i];
}
mesh.setTexCoords(texCoords);
environmentObject = new GVRVideoSceneObject( gvrContext, mesh, EnviromentSceneObjectPlayer, GVRVideoSceneObject.GVRVideoType.MONO );
environmentObject.getTransform().setScale(21f, 21f, 21f);
environmentObject.setName( "environmentObject" );
mGVRContext.getMainScene().addSceneObject(environmentObject);
I tested this with a 180 degree photo, and it worked as I expected.
@miraclehwan If you could attach a picture of what your video looks like, I'd be able to determine the format that your video is stored in so I can help you map it to the sphere correctly.
@miraclehwan
Is the 180 video an unwarped panoramic video? Or is it a specific video format that must be warped to a hemisphere?
If it's the former, a cylinder geometry (or half of it) is suitable.
If it's the latter, use J0Nreynolds' approach.
Alternatively, there are also shaders one can write to remap a 360 degree video to a 180 degree hemisphere instead of remapping the UV coordinates.
@J0Nreynolds I ran into this thread and read your code. If u = 2*u for a mesh, my beginner's guess is that the image should expand horizontally, enough to wrap the sphere two times. (But the result is the opposite; I checked it as well.) Would you kindly help me understand UV coordinates correctly?
@bohuang3d What kind of shader do we use to remap a video? I've looked into shader-related things, but couldn't understand how...
@joelsung Just for debugging purposes, what happens if you do u = 0.5*u instead?
The remapping of u is more or less a vertex shader modification. One can also do a remapping in fragment shaders for example, to map a rectangle to a sphere, or one of many cartographical projections that your video may use.
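As a concrete illustration of the fragment-shader route (a minimal sketch, not from the GearVRf sources; the sampler and varying names are assumptions), this does the same U-doubling as the vertex-shader approach, wrapping with fract() so the video repeats on the back hemisphere:

```glsl
#version 300 es
precision highp float;
in vec2 uvOut;                 // interpolated UV from the vertex shader
uniform sampler2D u_texture;   // the video frame (name is illustrative)
out vec4 fragColor;

void main()
{
    // Scale U by 2 and wrap into [0,1): the video compresses onto one
    // hemisphere and repeats on the other, mirroring the UV-remap trick.
    vec2 uv = vec2(fract(uvOut.x * 2.0), uvOut.y);
    fragColor = texture(u_texture, uv);
}
```

More elaborate cartographical remappings would replace the single fract() line with the projection math for the video's format.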
You could share the format of your video and what you wish to do.
@bohuang3d If I do u = 0.5*u, half of the video stretches over the entire sphere. What I'm curious about is why it works the opposite way from my guess.
I'm thinking about doing a conference call through VR (right now, I'm playing a local mp4 file) :)
@joelsung I agree that it seems a bit unintuitive that doubling the U coordinate compresses the projection by a factor of 2 instead of stretching it by a factor of 2, but I think I can explain it to you in a way that makes sense.
Recall that UV coordinates indicate which part of a texture should appear at a certain point of a mesh. Because of this, each vertex in a mesh has a UV coordinate that indicates what part of the texture should be drawn there. If we were to double all of the U coordinates, this would indicate that we'd want twice as much of the texture to be drawn on the mesh in the U direction. For example, a vertex that originally had a U coordinate of 0.5 (halfway through the texture) would now have a U coordinate of 1.0 (the end of the texture). Since the mesh itself is unchanged, we need to fit twice as much texture on the same surface area as before, which means the texture will be compressed to fit on the mesh's surface, not stretched.
Similarly, multiplying the U coordinates by 0.5 will result in half as much of the texture being drawn on the mesh. A vertex that originally had a U coordinate of 0.5 (halfway through the texture) would now have a U coordinate of 0.25 (a quarter of the way through the texture). In this case, half as much texture will be displayed on the same surface area, and the texture will be stretched as a result.
I hope this explanation helps you understand this effect a little better.
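The arithmetic in the explanation above can be checked with a tiny standalone Java sketch (illustrative only, not GearVRf code) that simulates how a scaled U coordinate samples a texture under GL_REPEAT wrapping:

```java
public class UvScalingDemo {
    // Simulate GL_REPEAT wrapping: the sampled coordinate is the
    // fractional part of the (possibly scaled) U coordinate.
    static double repeat(double u) {
        return u - Math.floor(u);
    }

    // meshU is the vertex's original U coordinate on the sphere;
    // scale is the factor applied to it (2.0 in the thread's code).
    static double sampledU(double meshU, double scale) {
        return repeat(meshU * scale);
    }

    public static void main(String[] args) {
        // A vertex a quarter of the way around the sphere now samples the
        // middle of the texture, so the whole video fits in half the sphere.
        System.out.println(sampledU(0.25, 2.0)); // 0.5
        // Three quarters of the way around, the texture repeats.
        System.out.println(sampledU(0.75, 2.0)); // 0.5
        // Scaling by 0.5 instead: halfway around the sphere only reaches a
        // quarter of the texture, so half the video stretches over it all.
        System.out.println(sampledU(0.5, 0.5)); // 0.25
    }
}
```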
@joelsung
For u = 0.5*u the result is what you expect, but for u = 2*u the result is the 'opposite'? Could you elaborate?
Also, check your texture wrapping mode. It should be set to GL_REPEAT for your U coordinate.
@joelsung
Please also send screenshots for both cases. Thanks.
@J0Nreynolds Thanks so much for the detailed explanation!! Thanks to your kind effort, I could understand it and try displaying other videos. But it was not easy for me to split the sphere into multiple regions just by assigning U and V.
So I finally decided to import part of a sphere mesh, and I'm having a really rough time loading the OBJ file. (I also tried to use GVRSphereSceneObject, but the mesh shape was too limited.) Thanks though!
@bohuang3d Sorry, I already changed my method. Could I learn how to do the vertex shader modification? I tried to figure out how to do it with a shader, but couldn't get a glimpse of it. Is it possible to share a bit of code?
@joelsung Something like this for the vertex shader (using GLSL ES 3.0 in/out qualifiers; in GLSL ES 1.0 you would write attribute/varying instead, and float literals take no f suffix):
#version 300 es
precision highp float;
//Incoming vertex attributes
in vec3 position;
in vec2 uv;
uniform mat4 modelViewProjMatrix;
//This is filled in the vertex shader and passed to the fragment shader
out vec2 uvOut;
void main()
{
    gl_Position = modelViewProjMatrix * vec4(position, 1.0);
    //This is where you insert the scaling factor for your uv
    uvOut = uv * 2.0;
}
@J0Nreynolds Thank you for your comment.
I solved this issue, but I have a small problem.
Please look at the pictures below.
I don't want to show the back image. Can I change the back color, or show the same video image there?
Sorry for asking so many questions, and my English is not good..
My current VideoScene:


What I want my VideoScene to look like:


@miraclehwan
The circular stripes you see are usually due to one of the texture parameters (either U or V) not changing across the surface being mapped, or to the texture repeat mode being set to GL_CLAMP while the texture coordinates go beyond the [0,1] range.
@bohuang3d Thank you for your comment. If you know how to solve this issue, please let me know. I used the code below.
float[] texCoords = mesh.getTexCoords();
for(int i = 0; i < texCoords.length; i+=2){
    texCoords[i] = 2*texCoords[i];
}
@miraclehwan
When you do texCoords[i] = 2*texCoords[i];, you are mapping the texture range from [0,1] to [0,2].
You could try something like this:
//Create texture parameters and set the texture wrap style to GL_REPEAT. The S,T correspond to the U,V I mentioned.
GVRTextureParameters parameters = new GVRTextureParameters(mGVRContext);
parameters.setWrapSType(TextureWrapType.GL_REPEAT);
parameters.setWrapTType(TextureWrapType.GL_REPEAT);
//And pass the parameters to your texture
GVRTexture texture = mGVRContext.loadTexture(
new GVRAndroidResource(mGVRContext, R.drawable.ground_512), parameters);
@bohuang3d I am using GVRVideoSceneObject. How can I set texture parameters on the video?
GVRSphereSceneObject sphere = new GVRSphereSceneObject(gvrContext, 72, 144, false);
GVRMesh mesh = sphere.getRenderData().getMesh();
EnvironmentVideoSceneObject = new GVRVideoSceneObject( gvrContext, mesh, EnviromentSceneObjectPlayer, VideoType );
EnvironmentVideoSceneObject.getTransform().setScale(21f, 21f, 21f);
if (VideoDegreeType.equals("180")){
    float[] texCoords = mesh.getTexCoords();
    for(int i = 0; i < texCoords.length; i+=2){
        texCoords[i] = 2*texCoords[i];
    }
    mesh.setTexCoords(texCoords);
    EnvironmentVideoSceneObject.getTransform().rotateByAxis(-180, 0, 1, 0);
}else{
    EnvironmentVideoSceneObject.getTransform().rotateByAxis(-90, 0, 1, 0);
}
EnvironmentVideoSceneObject.setName( "EnvironmentVideoSceneObject" );
mainScene.addSceneObject(EnvironmentVideoSceneObject);
GVRTextureParameters texparams = new GVRTextureParameters(mGVRContext);
texparams.setWrapSType(TextureWrapType.GL_REPEAT);
texparams.setWrapTType(TextureWrapType.GL_REPEAT);
EnvironmentVideoSceneObject.getRenderData().getMaterial().getMainTexture().updateTextureParameters(texparams);
@NolaDonato Thank you for your comment. But I still have a problem. Could you check my code please?
EnviromentSceneObjectPlayer = MakeMediaPlayer(PlayVideoUrl);
GVRSphereSceneObject sphere = new GVRSphereSceneObject(gvrContext, 72, 144, false);
GVRMesh mesh = sphere.getRenderData().getMesh();
EnvironmentVideoSceneObject = new GVRVideoSceneObject( gvrContext, mesh, EnviromentSceneObjectPlayer, VideoType );
EnvironmentVideoSceneObject.getTransform().setScale(21f, 21f, 21f);
if (VideoDegreeType.equals("180")){
float[] texCoords = mesh.getTexCoords();
for(int i = 0; i < texCoords.length; i+=2){
texCoords[i] = 2*texCoords[i];
}
mesh.setTexCoords(texCoords);
EnvironmentVideoSceneObject.getTransform().rotateByAxis(-180, 0, 1, 0);
GVRTextureParameters texparams = new GVRTextureParameters(mGVRContext);
texparams.setWrapSType(GVRTextureParameters.TextureWrapType.GL_REPEAT);
texparams.setWrapTType(GVRTextureParameters.TextureWrapType.GL_REPEAT);
EnvironmentVideoSceneObject.getRenderData().getMaterial().getMainTexture().updateTextureParameters(texparams);
}else{
EnvironmentVideoSceneObject.getTransform().rotateByAxis(-90, 0, 1, 0);
}
EnvironmentVideoSceneObject.setName( "EnvironmentVideoSceneObject" );
mainScene.addSceneObject(EnvironmentVideoSceneObject);
Current VideoView


@miraclehwan Run the texture parameters block on the GL thread:
getGVRContext().runOnGlThread(new Runnable() {
@Override
public void run() {
GVRTextureParameters texparams = new GVRTextureParameters(mGVRContext);
texparams.setWrapSType(GVRTextureParameters.TextureWrapType.GL_REPEAT);
texparams.setWrapTType(GVRTextureParameters.TextureWrapType.GL_REPEAT);
EnvironmentVideoSceneObject.getRenderData().getMaterial().getMainTexture().updateTextureParameters(texparams);
}
});
If you still experience problems, please consider creating a test app for us to examine. Thanks.
@liaxim Thank you for your comment. I used the GL thread, but my app crashes, so I updated my code. Please check & test my project.
@bohuang3d Since the video scene object has to use an external texture, the texture wrapping mode is fixed to GL_CLAMP_TO_EDGE per the OES_EGL_image_external spec.
@miraclehwan I am not seeing a crash trying your app, btw.
Given the video from the sample app, a half cylinder is probably the mesh that is needed.
@miraclehwan I will modify your project with the initial half-sphere method to keep it simple and straight-forward and get back to you
@miraclehwan I modified your Main.java and added a class called PartialSphereSceneObject that can generate hemispheres, partial spheres, and full spheres to map your video to. Both files are attached; please give them a try, thanks.
//A new class that generates a part of a sphere
//The last two parameters (both range from 0.0f to 1.0f) determine how fully
// the sphere is constructed
// i.e.
// 1.0f and 1.0f = a full sphere
// 1.0f and 0.5f = a half sphere, vertically sliced
// 0.5f and 1.0f = a half sphere, horizontally sliced
final PartialSphereSceneObject sphere = new PartialSphereSceneObject(gvrContext, 72, 144,
false,
new GVRMaterial(gvrContext),
1.0f, //the vertical direction
0.5f //the horizontal direction
);
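For reference, the angle and UV math inside such a partial-sphere generator can be sketched in plain Java (this is an illustration of the idea, not the actual PartialSphereSceneObject source; the names and layout are assumptions):

```java
import java.util.ArrayList;
import java.util.List;

public class PartialSphereMesh {
    final List<float[]> positions = new ArrayList<>();
    final List<float[]> uvs = new ArrayList<>();

    // verticalFraction 1.0f sweeps the polar angle over the full [0, pi];
    // horizontalFraction 0.5f sweeps the azimuth over half of [0, 2*pi],
    // producing a vertically sliced hemisphere (unit radius here).
    PartialSphereMesh(int stacks, int slices,
                      float verticalFraction, float horizontalFraction) {
        for (int i = 0; i <= stacks; i++) {
            double theta = Math.PI * verticalFraction * i / stacks;
            for (int j = 0; j <= slices; j++) {
                double phi = 2.0 * Math.PI * horizontalFraction * j / slices;
                float x = (float) (Math.sin(theta) * Math.cos(phi));
                float y = (float) Math.cos(theta);
                float z = (float) (Math.sin(theta) * Math.sin(phi));
                positions.add(new float[] { x, y, z });
                // UVs still span the full [0,1] range, so the entire video
                // maps onto just the generated part of the sphere.
                uvs.add(new float[] { (float) j / slices, (float) i / stacks });
            }
        }
    }
}
```

Because the UVs cover [0,1] while the geometry covers only part of the sphere, no UV rescaling or wrap-mode change is needed, which is why this route sidesteps the external-texture GL_CLAMP_TO_EDGE limitation mentioned above.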
@liaxim I updated my code. I solved this issue, but I am curious why my app crashes when I use the GL thread.
@bohuang3d Thank you for your code. I solved this issue. Thank you so much. I have a question: if I want to show the video duplicated in front & back, how can I edit my code? I think I need to duplicate my video texture, but I don't know how... Sorry for all the questions.
Please look at this image:

@miraclehwan
For the duplicate video, can you make two such hemispheres with the video mapped to each? One for the front and one for the back. You can then position and rotate them however you want.
I didn't crash either when running your app. What's your device?
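On the two-hemisphere suggestion: the back hemisphere is just the front one rotated 180 degrees about the Y axis, which in GearVRf terms would mean creating a second video scene object from the same hemisphere mesh and player and calling getTransform().rotateByAxis(180, 0, 1, 0) on it (an assumption about the setup, not tested against the attached class). This standalone snippet only verifies the rotation arithmetic:

```java
public class BackHemisphereDemo {
    // Rotating a point 180 degrees about the Y axis negates x and z.
    // This is the geometric effect of rotateByAxis(180, 0, 1, 0)
    // on a second hemisphere sharing the same mesh and video.
    static float[] rotateY180(float[] p) {
        return new float[] { -p[0], p[1], -p[2] };
    }

    public static void main(String[] args) {
        float[] frontCenter = { 0f, 0f, 1f }; // center of the front hemisphere
        float[] backCenter = rotateY180(frontCenter);
        // The copy now faces the opposite direction, so the same video
        // covers the space behind the viewer as well.
        System.out.println(backCenter[2] < 0f); // true
    }
}
```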
@miraclehwan Can you attach the logcat output at the time of the crash?