sceneview-android
Some issue with existing project samples
First of all, great library with great functions and examples; thanks for the support. I am a bit new to this library, and I am facing issues with some points in the samples and documentation regarding implementation. If possible, can the community fix the issues below in the samples, or add examples for these cases?
- The cloud anchor sample does not work with the latest version.
- There is no proper code or example for a video node with a chroma color or an augmented image. (I tried it myself, but the video texture always shows black.)
- A guide for custom gestures would also help, e.g. rotation with a single finger and scaling/moving with two fingers.
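As context for the custom-gesture request, the mapping such a scheme needs is mostly plain math that can be sketched independently of any sceneview or Android API. The function names below are purely illustrative (they are not part of the library): a one-finger horizontal drag maps to a yaw rotation, and a two-finger pinch maps to a scale factor.

```kotlin
import kotlin.math.hypot

// Convert a one-finger horizontal drag (in pixels) into a yaw delta in degrees.
// degreesPerPixel is a tuning constant you would pick for your scene.
fun dragToYawDegrees(dxPixels: Float, degreesPerPixel: Float = 0.25f): Float =
    dxPixels * degreesPerPixel

// Convert the change in distance between two pointers into a multiplicative
// scale factor: > 1 means the fingers moved apart (zoom in / scale up).
fun pinchToScaleFactor(
    x0: Float, y0: Float, x1: Float, y1: Float,    // current pointer positions
    px0: Float, py0: Float, px1: Float, py1: Float // previous pointer positions
): Float {
    val current = hypot(x1 - x0, y1 - y0)
    val previous = hypot(px1 - px0, py1 - py0)
    return if (previous > 0f) current / previous else 1f
}
```

In a real listener you would feed these deltas from `MotionEvent` coordinates into the node's `rotation` and `scale` properties each frame.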
@ThomasGorisse
@grassydragon
and All members
I am also experiencing issues on 2.0.3 that do not occur in 2.0.2 🤔 For example, the custom gestures we implemented on 2.0.2 no longer work.
Regarding gestures, on 2.0.2 we could build our own gestures by passing a `GestureDetector.OnGestureListener` to a custom `SceneView` and overriding `cameraManipulator`:
```kotlin
class CustomSceneView @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null,
    defStyleAttr: Int = 0,
    defStyleRes: Int = 0
) : SceneView(
    context,
    attrs,
    defStyleAttr,
    defStyleRes,
    sharedOnGestureListener = CustomGestureListener()
) {
    override val cameraManipulator: Manipulator
        get() = Manipulator.Builder()
            //... what you want
            .build(/* ... */)
}

class CustomGestureListener : GestureDetector.OnGestureListener {
    //...
}
```
However, in 2.0.3:
```kotlin
class CustomSceneView @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null,
    defStyleAttr: Int = 0,
    defStyleRes: Int = 0
) : SceneView(
    context,
    attrs,
    defStyleAttr,
    defStyleRes,
    sharedOnGestureListener = CustomGestureListener(),
    cameraManipulator = cameraManipulator
) {
    companion object {
        private val cameraManipulator: ((View, CameraNode) -> Manipulator) = { view, cameraNode ->
            Manipulator.Builder()
                //... what you want
                .build(/* ... */)
        }
    }
}
```
It does not override the gestures anymore.
Does anyone have an idea about these changes? Please let us know. Also, if anyone has a working video node with this SDK, please share it.
@dhaval-android For a videoNode, I think you need to use VideoMaterial
(or create your own with ExoPlayer if you are using ExoPlayer)
```kotlin
sceneView.apply {
    val node = /* ... */
    val videoMaterial = VideoMaterial(/* ... */)
    val videoInstance = materialLoader.createVideoInstance(videoTexture = videoMaterial.texture).apply {
        // Do not forget this line
        setExternalTexture(videoMaterial.texture)
    }
    node.setMaterialInstance(videoInstance)
    addChildNode(node)
}
```
something like that
First of all, thanks @JeromeCHA for the reply.
I am doing almost the same thing: creating a custom video material with a stream and texture, then applying the surface to the media player. The video loads now, but not in the proper shape, and when I use a chroma color it replicates the camera stream instead of the video stream. My code is below; please let me know if you find anything wrong.
```kotlin
class VideoNodeT(
    sceneView: SceneView,
    val player: MediaPlayer,
    size: Size = Plane.DEFAULT_SIZE,
    center: Position = Plane.DEFAULT_SIZE,
    normal: Direction = Plane.DEFAULT_NORMAL,
    chromaColor: Color? = null,
    scaleToVideoRatio: Boolean = true,
    keepAspectRatio: Boolean = true,
    renderableApply: RenderableManager.Builder.() -> Unit = {},
    val onLoaded: ((materialInstance: MaterialInstance) -> Unit)? = null
) : PlaneNode(
    engine = sceneView.engine,
    size = size,
    center = center,
    normal = normal,
    builderApply = renderableApply
) {
    var isKeepAspectRatio = true
    var isAlwaysLookAtCamera = false

    val surfaceTexture: SurfaceTexture = SurfaceTexture(0).apply {
        detachFromGLContext()
    }

    init {
        if (scaleToVideoRatio) {
            player.doOnVideoSized { player, width, height ->
                if (player == this.player) {
                    if (keepAspectRatio) {
                        scale = scale.apply {
                            x = if (width >= height) 1.0f else width.toFloat() / height.toFloat()
                            y = if (width >= height) height.toFloat() / width.toFloat() else 1.0f
                        }
                    }
                }
            }
        }
        try {
            sceneView?.materialLoader?.loadMaterialAsync(
                if (chromaColor == null) "materials/video_texture.filamat" else "materials/video_texture.filamat",
                onResult = { itMaterial ->
                    if (itMaterial != null) {
                        /*if (chromaColor != null) {
                            itMaterial.defaultInstance.setColor("chromaKeyColor", chromaColor, Colors.RgbaType.LINEAR)
                        }*/
                        val stream: Stream = Stream.Builder()
                            .stream(surfaceTexture)
                            .build(engine)
                        var surface: Surface
                        val videoTexture = VideoTexture.Builder()
                            .stream(stream)
                            .build(engine).apply {
                                surface = Surface(surfaceTexture)
                                setExternalStream(engine, stream)
                                materialInstance = itMaterial.defaultInstance
                                setMaterialInstances(itMaterial.defaultInstance)
                                player.setSurface(surface)
                                onLoaded?.invoke(itMaterial.defaultInstance)
                            }
                    }
                }
            )
        } catch (error: Exception) {
        }
    }

    override fun destroy() {
        super.destroy()
        player.stop()
    }
}
```
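As a sanity check on the "not in proper shape" problem, the keep-aspect-ratio scaling inside the `init` block above can be isolated into a pure function and verified on its own (this is a sketch for checking the math, not part of the sceneview API):

```kotlin
// Reproduces the keep-aspect-ratio scaling from the init block above as a
// standalone function: the longer video side maps to 1.0 and the shorter
// side is scaled down proportionally. Returns (scaleX, scaleY).
fun aspectRatioScale(width: Int, height: Int): Pair<Float, Float> {
    val x = if (width >= height) 1.0f else width.toFloat() / height.toFloat()
    val y = if (width >= height) height.toFloat() / width.toFloat() else 1.0f
    return x to y
}
```

If the plane still renders with wrong proportions even though this math checks out, the distortion likely comes from elsewhere, e.g. the `size`/`center` defaults passed to `PlaneNode` or an extra `scale` applied after construction.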
Implementation of augmented image detection:
```kotlin
onSessionUpdated = { session, frame ->
    frame.getUpdatedAugmentedImages().forEach { augmentedImage ->
        val augmentedImageNode = AugmentedImageNode(engine, augmentedImage).apply {
            when (augmentedImage.name) {
                "rabbit" -> {
                    if (!isNodAdded) {
                        isNodAdded = true
                        addChildNode(
                            ModelNode(
                                modelInstance = modelLoader.createModelInstance(
                                    assetFileLocation = "models/ToyCar.glb"
                                ),
                                scaleToUnits = 0.1f,
                                centerOrigin = Position(0.0f)
                            )
                        )
                    }
                }
                "qrcode" -> {
                    if (!isNodAdded) {
                        playerMedia = MediaPlayer.create(
                            this@ArActivity,
                            R.raw.test_vid
                        )
                        playerMedia.apply {
                            isLooping = true
                            setOnPreparedListener {
                                /* if (augmentedImage.isTracking) {
                                    Log.d("testScen", "start() called")
                                } */
                            }
                        }
                        isNodAdded = true
                        addChildNode(
                            VideoNodeT(
                                bindingView!!.sceneView,
                                playerMedia,
                                size = normalize(
                                    Size(
                                        playerMedia.videoWidth.toFloat(),
                                        playerMedia.videoHeight.toFloat()
                                    )
                                ),
                                normal = Direction(x = 0f, y = 1f, z = 0f),
                                center = Position(0f, 0f, 0f),
                                scaleToVideoRatio = true,
                                keepAspectRatio = true,
                                chromaColor = colorOf(R.color.green),
                                onLoaded = {
                                    playerMedia.start()
                                }
                            ).apply {
                                scale = Scale(-1f, 1f, 1f)
                                rotation = Rotation(0f, -0.5f, 0f)
                                onTrackingStateChanged = { trackingState ->
                                    Log.d("augmentedImageTrace", "$trackingState")
                                    when (trackingState) {
                                        TrackingState.TRACKING -> {
                                            if (!playerMedia.isPlaying) {
                                                playerMedia.start()
                                            }
                                        }
                                        else -> {
                                            if (playerMedia.isPlaying) {
                                                playerMedia.pause()
                                            }
                                        }
                                    }
                                }
                            }
                        )
                    }
                }
            }
        }
        addChildNode(augmentedImageNode)
    }
}
```
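For what it's worth, the play/pause decision inside `onTrackingStateChanged` above reduces to a tiny piece of pure logic. Sketched here with a stand-in enum (since ARCore's real `TrackingState` is an Android dependency), which makes it easy to verify the intent: the video should play only while the augmented image is actively tracked.

```kotlin
// Stand-in for com.google.ar.core.TrackingState, so the decision can be
// shown and tested in isolation from ARCore.
enum class Tracking { TRACKING, PAUSED, STOPPED }

// Mirrors the when-branch above: play only while tracked, pause otherwise.
fun shouldPlay(state: Tracking): Boolean = state == Tracking.TRACKING
```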
Does anyone know what's going on with version 2.0.3? In 2.0.2 everything I use works fine, but in 2.0.3 the application launches to a black screen. I can see the planeRenderer points and the view does render, but the screen stays black. Can anyone help me with this?
Hey there, it looks like there has been no activity on this issue recently. Has the issue been fixed, or does it still require the community's attention? This issue may be closed if no further activity occurs. Thank you for your contributions.
Reopen if it's still the case in v2.2.0