AR.js
[bug report] Stretched camera feed
The video camera feed is stretched on my Android Chrome (AR.js geolocation). I took a photo with AR.js, and then with my phone's camera app, so you can see the difference.
ARJS:

Camera app

@marcusx2 this is a known problem, I will try to get to it when I have a moment.
Hi @nickw1 , thanks for the reply. If it helps at all, MindAR uses aframe as well and it doesn't have this issue.
@marcusx2 thanks for the tip re. MindAR.
In both the ar-threex-location-only and aframe-ar built files (why both files?), there's this line that seems to be the issue: navigator.mediaDevices&&navigator.mediaDevices.getUserMedia){const t={video:{width:1280,height:720,facingMode:"environment"}}. It hard-codes the width and height to specific values...
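Expanded for readability, that minified constraint amounts to this (a reconstruction from the quoted line, not the exact source):

```javascript
// Reconstruction of the minified constraint object quoted above (the minified `t`).
const constraints = {
  video: { width: 1280, height: 720, facingMode: "environment" }
};

// The fixed 1280x720 request always yields a 16:9 feed regardless of the
// screen's aspect ratio, so on a portrait phone the texture gets stretched.
console.log(constraints.video.width / constraints.video.height); // 1.777... (16:9)
```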
In this file, https://github.com/hiukim/mind-ar-js/blob/master/src/image-target/aframe.js, this code snippet seems to be the solution:
navigator.mediaDevices.getUserMedia({audio: false, video: {
  facingMode: 'environment',
}}).then((stream) => {
  this.video.addEventListener('loadedmetadata', () => {
    //console.log("video ready...", this.video);
    this.video.setAttribute('width', this.video.videoWidth);
    this.video.setAttribute('height', this.video.videoHeight);
    this._startAR();
  });
  this.video.srcObject = stream;
}).catch((err) => {
  console.log("getUserMedia error", err);
  this.el.emit("arError", {error: 'VIDEO_FAIL'});
});
@nickw1 If I use videoTexture: false the problem goes away. But then the cube stretches, so another bug comes up ;_; damn it. Should I create another issue for this one?

The scale of the cube is 1 1 1, it shows up correctly if I set videoTexture: true on the a-scene's arjs component, but then the video feed is the one that stretches lol.
Thanks for that. I think what I'll probably do is incorporate fixes for all your issues into a new version 3.5, with the agreement of @kalwalt, as the world-origin-as-original-GPS-position will be a breaking change. I'll also try and add the multiple cameras feature.
with the agreement of @kalwalt, as the world-origin-as-original-GPS-position will be a breaking change
It doesn't have to be breaking if it's optional! It can default to false to keep the functionality as it is.
I'll also try and add the multiple cameras feature.
Multiple cameras? Don't know anything about that. Must be some other feature request.
I just thought that, as you have made quite a few requests, it would be best to collate them and include them in a new release. And if the initial position as world origin is the preferred behaviour anyway, maybe it's best to make that the default by switching to that with a 3.5 release. It won't affect most use cases anyhow. (This will also fit in with some other work I'm doing to try and combine AR.js with the SLAM library AlvaAR).
Sorry, I lost track of who requested the multiple cameras feature!
And if the initial position as world origin is the preferred behaviour anyway, maybe it's best to make that the default by switching to that with a 3.5 release.
It is better for other frameworks (PlayCanvas and Unity at least, probably more) that want to use AR.js, as I explained, because of the floating point issue. I also think it makes sense semantically that the experience starts at the world origin.
(This will also fit in with some other work I'm doing to try and combine AR.js with the SLAM library AlvaAR)
AlvaAR integration would be amazing for sure! AlvaAR alone without geolocation would already be awesome: world tracking on the web!
Just to summarize this issue includes 2 bug reports:
1- Camera feed stretching when videoTexture: true. No entity stretching.
2- Entity stretching when videoTexture: false. No camera feed stretching.
@marcusx2 I have implemented your suggested fix for stretching for the videoTexture: true mode (note that videoTexture: false is deprecated for location-based AR in any case).
Do you want to try testing it? You need to check out the stretched-video-fix branch of AR.js and include the built files from that branch into your project, e.g.
<script type='text/javascript' src='LOCATION_OF_ARJS_ON_YOUR_SYSTEM/three.js/build/ar-threex-location-only.js'></script>
<script type='text/javascript' src='LOCATION_OF_ARJS_ON_YOUR_SYSTEM/aframe/build/aframe-ar.js'></script>
I simply took the ar-threex-location-only and aframe-ar files from the branch and added them to the same folder as the index.html.
Like this
<!DOCTYPE html>
<html>
  <head>
    <title>AR.js A-Frame Location-based</title>
    <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, minimum-scale=1.0, user-scalable=no" />
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
    <script type='text/javascript' src='./ar-threex-location-only.js'></script>
    <script type='text/javascript' src='./aframe-ar.js'></script>
  </head>
  <body>
    <a-scene vr-mode-ui='enabled: false' arjs='sourceType: webcam; videoTexture: true; debugUIEnabled: false' renderer='antialias: true; alpha: true'>
      <a-camera look-controls-enabled='false' arjs-device-orientation-controls='smoothingFactor: 0.1' gps-new-camera='positionMinAccuracy: 100; gpsMinDistance: 5; simulateLatitude: 51.049; simulateLongitude: -0.723; simulateAltitude: 0; gpsTimeInterval: 0;' position='0 10 0'></a-camera>
      <a-entity material='color: red' geometry='primitive: box' gps-new-entity-place="latitude: -21.1873371; longitude: -47.7996175" scale="1 1 1"></a-entity>
    </a-scene>
    <script>
      const entity = document.querySelector("[gps-new-entity-place]");
      console.log(entity.components['gps-new-entity-place'].distance); // returns undefined
      setTimeout(() => {
        console.log(entity.components['gps-new-entity-place'].distance); // returns distance
      }, 0);
    </script>
  </body>
</html>
That should be correct, right? The fix didn't work here though, unless I missed something.
I have implemented your suggested fix for stretching for the videoTexture: true mode (note that videoTexture: false is deprecated for location-based AR in any case).
Please don't deprecate/remove it. videoTexture: false is better for projects that use AR.js as an API: they get the positioning information and apply it elsewhere. For example, with PlayCanvas I don't need videoTexture to be true, because I just take the camera and entity information and use it to place the objects in PlayCanvas, which draws the scene in its own canvas overlaying the video element. The three.js canvas is completely useless in cases like this, just consuming unnecessary resources, because I don't need A-Frame/three.js to actually draw anything.
If it helps at all, you can take a look at this issue from another repo. I tried the fix provided there, but it didn't work here. Maybe the JeelizResizer and/or the JeelizThreeHelper can give some clues.
I tried checking out the new branch as well, just in case. Still didn't work =/. The problem happens on my Android Chrome and on iOS Safari, and also on Chrome for desktop (stretched horizontally).
OK this may be a tricky one rather than a quick fix. I'm not sure it will make it to the next bugfix release in that case, but I will include PRs #507 and #508 though.
Does the bug not happen for you? Hopefully it's something you can debug.
I know, but it might be something that requires some research to fix. This is very much dependent on my time availability, which is a bit restricted right now - whereas the other two problems are already fixed. Hence, as soon as #507 and #508 have been reviewed, I can make a bugfix release for those two issues, at least.
Can this bug be worked around in the current version of AR.js?
bump
Hey @nickw1, I recently had this stretching issue with another app, and I fixed it with something like this:
myclass.windowResizeEvent = function () {
  if (window.innerHeight > window.innerWidth) {
    if (myclass.firstResize !== "portrait" && myclass.firstResize !== "landscape") {
      video.videoWidthbk = video.videoWidth;
      video.videoHeightbk = video.videoHeight;
      myclass.firstResize = "portrait";
    } else {
      if (myclass.firstResize == "portrait") {
        video.videoWidthbk = video.videoWidth;
        video.videoHeightbk = video.videoHeight;
      } else {
        video.videoWidthbk = video.videoHeight;
        video.videoHeightbk = video.videoWidth;
      }
    }
    dynCall_vii(callback, video.videoWidthbk, video.videoHeightbk);
  } else {
    if (myclass.firstResize !== "portrait" && myclass.firstResize !== "landscape") {
      video.videoWidthbk = video.videoHeight;
      video.videoHeightbk = video.videoWidth;
      myclass.firstResize = "landscape";
    } else {
      if (myclass.firstResize == "landscape") {
        video.videoWidthbk = video.videoHeight;
        video.videoHeightbk = video.videoWidth;
      } else {
        video.videoWidthbk = video.videoWidth;
        video.videoHeightbk = video.videoHeight;
      }
    }
    dynCall_vii(callback, video.videoHeightbk, video.videoWidthbk);
  }
};
window.addEventListener("resize", myclass.windowResizeEvent);
window.dispatchEvent(new Event("orientationchange"));
I don't know if this applies to AR.js or not, but the problem I was having in that app was that, depending on whether it started in landscape or portrait mode, I had to swap the video width with the video height (and vice versa) so that the aspect ratio was always correct and the feed didn't show up stretched. I had to check the first resize event to detect whether the app started in portrait or landscape. Don't know if this helps at all, but just throwing it here just in case, who knows.
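Distilled down, the swap is just an orientation match between the video and the viewport. A simplified sketch (hypothetical helper name, ignoring the first-resize bookkeeping above):

```javascript
// Hypothetical helper: given the raw video dimensions and whether the page
// is currently in portrait, return dimensions whose orientation matches
// the viewport so the aspect ratio stays correct.
function orientedVideoSize(videoWidth, videoHeight, isPortrait) {
  const videoIsPortrait = videoHeight > videoWidth;
  // If the video's orientation already matches the viewport, keep it as-is;
  // otherwise swap width and height.
  if (videoIsPortrait === isPortrait) {
    return { width: videoWidth, height: videoHeight };
  }
  return { width: videoHeight, height: videoWidth };
}

console.log(orientedVideoSize(1280, 720, true));  // { width: 720, height: 1280 }
console.log(orientedVideoSize(1280, 720, false)); // { width: 1280, height: 720 }
```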
I really need videoTexture: false to work properly.
Can this bug be worked around in the current version of AR.js?
@kbs1 I've found a workaround:
</a-scene>
<button id="start" onclick="start()" style="position: absolute; top: 0; width: 100%; height: 100%; z-index: 10;">Start</button>
</body>
</html>
<script>
  var elem = document.documentElement;
  /* View in fullscreen */
  function start() {
    if (elem.requestFullscreen) {
      elem.requestFullscreen();
    } else if (elem.webkitRequestFullscreen) { /* Safari */
      elem.webkitRequestFullscreen();
    } else if (elem.msRequestFullscreen) { /* IE11 */
      elem.msRequestFullscreen();
    }
    document.querySelector("#start").remove();
    document.querySelector('video').setAttribute('style', 'width: 100vw;height: 100vh;object-fit: cover;position: absolute;top: 0;left: 0;');
  }
</script>
I'm full-screening because I want to, but the only important bits are removing the button and then setting the style attribute.
I found the style doesn't work if set directly, probably something to do with needing to set the style after AR.js has finished setting up the scene. So I imagine you could get this to work without the 'start' button by running the start function after the first GPS camera position update, although I haven't tried it:
let testEntityAdded = false;
document
  .querySelector("#camera")
  .addEventListener("gps-camera-update-position", (e) => {
    if (!testEntityAdded) {
      start();
      testEntityAdded = true; // only run once, on the first position update
    }
  });
Never mind, actually; I just noticed that for some reason changing the video's CSS seems to destroy all entities, as does setting a-scene to embedded.
@marcusx2 @Platform-Group apologies for the late reply, I've had very little time to look at AR.js lately.
@marcusx2 could you submit your fix as a PR? Please provide a sample which works with the fix and doesn't work without the fix. This will ease the integration into the main repo.
@nickw1 I didn't apply anything to AR.js. It's just that I had a similar problem with something else and posted my solution here, in case AR.js can use the same idea or something similar.
I've found that the issue arises when the camera aspect ratio is not the same as the device (mobile phone) aspect ratio, or more specifically, the usable viewport aspect ratio. AR.js creates a PlaneBufferGeometry of width 1 and height 1; if you tweak these values to have the same aspect ratio as the viewport, you can unstretch the video feed (video texture) on that particular device.
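For illustration, the tweak could look something like this (a sketch with assumed example numbers; the real change goes wherever AR.js constructs its PlaneBufferGeometry):

```javascript
// Assumed example viewport (a portrait phone); in practice these values
// would come from window.innerWidth / window.innerHeight before AR.js
// initialises.
const viewportWidth = 390, viewportHeight = 844;
const viewportAspect = viewportWidth / viewportHeight;

// AR.js hard-codes new THREE.PlaneBufferGeometry(1, 1). Giving the plane
// the same aspect ratio as the viewport removes the stretch on that device:
const planeWidth = 1;
const planeHeight = planeWidth / viewportAspect;
console.log(planeHeight.toFixed(3)); // "2.164", taller than wide, matching portrait
```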
@kbs1 is this tweaking done by modifying ar.js's source code?
@Platform-Group yes, you can search the source for PlaneBufferGeometry; there will be 2 instances. You can find the active one based on your use case and tweak the values there. For my use case, I added reading from window (global) variables so the correct values could be calculated before initialising AR.js itself.
@kbs1 thanks for that. Do you want to submit a PR to fix the issue?
@nickw1 the fix I used was not proper, and it required reading from magic window..... variables at the patched PlaneBufferGeometry creation. It also assumed AR.js would be full-width and full-height in the viewport, which might not be the case. It also did not work on iOS devices, so I added a workaround so that it would not activate on those.
I'm not familiar enough with the AR.js / A-Frame / three.js codebases to develop a proper fix, and the fix itself needs further research. The way I understand it, AR.js creates a plane that is filled with the video texture (the camera feed). This plane is 1x1 in dimensions. Somehow, this plane is stretched to "fit" the scene, and this creates the distortion (possibly due to the OrthographicCamera being used).
The code in question is here: https://github.com/AR-js-org/AR.js/blob/3.4.5/aframe/src/location-based/arjs-webcam-texture.js#L6-L20
The proper fix would be to emulate behavior like CSS's background-size: cover; background-position: center for the video feed. That way, it would not matter what the viewport aspect ratio is or what the video feed aspect ratio is. It might, however, be necessary to adjust the positions of AR objects so they are truest to the shown camera feed, but that's outside my ability to judge.
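The cover behaviour can be expressed as a small calculation (hypothetical helper, not an AR.js API): scale the source uniformly so it fully covers the destination, cropping the overflow instead of stretching:

```javascript
// Hypothetical sketch of "background-size: cover" math for the video plane:
// scale the source by the larger of the two ratios so it covers the
// destination on both axes; the overflowing axis gets cropped, not stretched.
function coverSize(srcW, srcH, dstW, dstH) {
  const scale = Math.max(dstW / srcW, dstH / srcH);
  return { width: srcW * scale, height: srcH * scale };
}

// A 1280x720 feed covering a 390x844 portrait viewport:
console.log(coverSize(1280, 720, 390, 844)); // width ≈ 1500.4 (cropped), height ≈ 844
```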
@kbs1 I think you can still contribute your modifications as a PR, we will review it, and if your code is not compatible stylistically or otherwise with AR.js, we will modify it.
Even if it's not a full fix, if it partly solves some problems, it's still worth incorporating.
@nickw1 here, I had a crack at it; works for me:
const constraints = {
  video: {
    facingMode: "environment",
    width: { ideal: 1920 },
    height: { ideal: 1080 }
  },
};
navigator.mediaDevices.getUserMedia(constraints).then(stream => {
  let streamSettings = stream.getVideoTracks()[0].getSettings();
  console.log('navigator by geom ', streamSettings);
  let sourceAspectRatio = streamSettings.width / streamSettings.height;
  let displayAspectRatio = window.innerWidth / window.innerHeight;
  let geomX = 1;
  let geomY = 1;
  if (displayAspectRatio > sourceAspectRatio) {
    // Display is wider than source
    geomX = sourceAspectRatio / displayAspectRatio;
  } else {
    // Display is taller than source
    geomY = displayAspectRatio / sourceAspectRatio;
  }
  console.log('geomY ', geomY, ' geomX ', geomX);
  this.geom = new THREE.PlaneBufferGeometry(geomX, geomY);
  this.texture = new THREE.VideoTexture(this.video);
  this.material = new THREE.MeshBasicMaterial({ map: this.texture });
  const mesh = new THREE.Mesh(this.geom, this.material);
  this.texScene.add(mesh);
})
It still has problems though:
- Doesn't handle resizing, such as toggling to fullscreen
- Uses window, so it won't work with embedded (not that it matters right now, as embedded is broken anyway, at least for me using location-based)
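For the resize problem, one option might be to factor the geometry math into a pure helper and re-run it from a window resize listener (hypothetical, untested sketch):

```javascript
// Hypothetical helper: same geometry math as above, factored out so it can
// be recomputed whenever the viewport changes.
function planeSize(sourceAspectRatio, displayAspectRatio) {
  let geomX = 1, geomY = 1;
  if (displayAspectRatio > sourceAspectRatio) {
    geomX = sourceAspectRatio / displayAspectRatio; // display wider than source
  } else {
    geomY = displayAspectRatio / sourceAspectRatio; // display taller than source
  }
  return { geomX, geomY };
}

// In a resize listener (assumes `mesh` and `streamSettings` from the code above):
//   const { geomX, geomY } = planeSize(
//     streamSettings.width / streamSettings.height,
//     window.innerWidth / window.innerHeight);
//   mesh.geometry.dispose(); // free the old buffer
//   mesh.geometry = new THREE.PlaneBufferGeometry(geomX, geomY);

console.log(planeSize(16 / 9, 9 / 16)); // { geomX: 1, geomY: ≈0.316 }
```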