                        Result issues in webgl tests
Possibly related to https://github.com/web-platform-tests/results-collection/issues/98
See e.g. https://wpt.fyi/webgl/bufferSubData.html?sha=6736e3f46b
Edge result is "(results not found)"
Why is there no result for Edge? There are results in earlier runs, e.g. https://wpt.fyi/webgl/bufferSubData.html?sha=c218fe33f4
Firefox fails with "Failure message: assert_true: Should be able to get a context. expected true got false"
@Ms2ger suggested in IRC the Firefox result is "Possibly to do with a headless setup?". Is that the case? If so, is there a way to run (some) tests with a different setup that can run webgl tests?
> Why is there no result for Edge? There are results in earlier runs, e.g. https://wpt.fyi/webgl/bufferSubData.html?sha=c218fe33f4
This is due to intermittent connectivity problems with Sauce Labs. I've created a dedicated issue to explain that problem and track its resolution: https://github.com/web-platform-tests/results-collection/issues/544
> Firefox fails with "Failure message: assert_true: Should be able to get a context. expected true got false"
> @Ms2ger suggested in IRC the Firefox result is "Possibly to do with a headless setup?". Is that the case? If so, is there a way to run (some) tests with a different setup that can run webgl tests?
This would be my guess as well. One bit of background that may be relevant: we switched from using fancy GPU-powered Amazon AWS instances to lower-powered "headless" instances on March 20. Results collected for Firefox before that date (say, March 7) demonstrate the same failure, suggesting that removing display virtualization and adding a GPU may not solve the problem.
However, my local testing supports @Ms2ger's theory. The test passes locally when executed using the following command:
$ ./wpt run --include webgl/bufferSubData.html --no-pause-after-test --log-tbpl - firefox
And it fails after the introduction of xvfb (the display virtualization technology we use to collect results):
$ xvfb-run --auto-servernum ./wpt run --include webgl/bufferSubData.html --no-pause-after-test --log-tbpl - firefox
The output of the latter command has some more information:
TEST-START | /webgl/bufferSubData.html
PID 12427 | JavaScript warning: http://web-platform.test:8000/webgl/common.js, line 3: Error: WebGL warning: Disallowing antialiased backbuffers due to blacklisting.
PID 12427 | JavaScript warning: http://web-platform.test:8000/webgl/common.js, line 3: Error: WebGL warning: Refused to create native OpenGL context because of blacklist entry:
PID 12427 | JavaScript warning: http://web-platform.test:8000/webgl/common.js, line 3: Error: WebGL warning: Failed to create WebGL context: WebGL creation failed:
PID 12427 | * Refused to create native OpenGL context because of blacklist entry:
PID 12427 | * Exhausted GL driver options.
A quick web search turned up a potential workaround: the MOZ_GFX_SPOOF_VENDOR_ID environment variable. Those folks weren't dealing with xvfb, but their problem was similar. Sure enough, the test passes with the following invocation:
$ MOZ_GFX_SPOOF_VENDOR_ID=0x80EE xvfb-run --auto-servernum ./wpt run --include webgl/bufferSubData.html --no-pause-after-test --log-tbpl - firefox
This suggests that the issue is GPU related, but that it can't be satisfied by any old GPU. Perhaps the GPU-sporting AWS EC2 instances we used previously (despite being very expensive, I assure you) didn't pass muster.
I'm not suggesting that we spoof graphics cards like this. Doing so might be enough to satisfy surface-level tests like the one @zcorpan has identified, but it's unlikely to produce fully conformant behavior (otherwise, why implement the blacklist at all?).
To get an idea of the severity of this issue, I wrote a script to scrape the Firefox logs from Buildbot (below) and searched those for the tests which mentioned the blacklist (also below). The affected tests are:
- /WebIDL/current-realm.html?sha=6736e3f46b
- /webgl/uniformMatrixNfv.html?sha=6736e3f46b
- /webgl/texSubImage2D.html?sha=6736e3f46b
- /webgl/texImage2D.html?sha=6736e3f46b
- /webgl/compressedTexSubImage2D.html?sha=6736e3f46b
- /webgl/compressedTexImage2D.html?sha=6736e3f46b
- /webgl/bufferSubData.html?sha=6736e3f46b
- /2dcontext/imagebitmap/createImageBitmap-invalid-args.html
- /css/css-regions/elements/canvas3d-002.html
- /css/css-regions/elements/canvas3d-001.html
Which is a bit disappointing after all that text. Sorry! Note the disparity between WPT test type, test location, and failure message, though.
My takeaways:
- We need to be more picky when/if we invest in GPU-powered machines
- Resolving this has a very high cost/benefit ratio (today, anyway. That could change if the WebGL test suite becomes more substantial or if we find other reason to use GPUs)
- Optimizing resource utilization will be difficult (since neither the WPT test type, the test file location, nor even the test's failure message reliably indicates that a GPU is required)
Script used to scrape logs from http://builds.wpt.fyi
'use strict';
const http = require('http');
const path = require('path');
const util = require('util');
const writeFile = util.promisify(require('fs').writeFile);
const baseUrl = 'http://builds.wpt.fyi/api/v2';
function fetch(url) {
  return new Promise((resolve, reject) => {
    http.get(url, (response) => {
      const { statusCode } = response;
      if (statusCode !== 200) {
        reject(new Error('Request Failed.\n' + `Status Code: ${statusCode}`));
        // consume response data to free up memory
        response.resume();
        return;
      }
      resolve(response);
    }).on('error', reject);
  });
}
async function fetchText(url) {
  const response = await fetch(url);
  response.setEncoding('utf8');
  let rawData = '';
  response.on('data', (chunk) => { rawData += chunk; });
  return new Promise((resolve, reject) => {
    // Propagate stream errors so the returned promise doesn't hang
    response.on('error', reject);
    response.on('end', () => resolve(rawData));
  });
}
async function fetchLog(builderId, buildNumber, stepNumber) {
  const url = `${baseUrl}/builders/${builderId}/builds/${buildNumber}` +
    `/steps/${stepNumber}/logs/stdio`;
  const logInfo = JSON.parse(await fetchText(url)).logs[0];
  logInfo.contents = await fetchText(`${baseUrl}/logs/${logInfo.logid}/raw`);
  return logInfo;
}
(async function() {
  const reqs = process.argv.slice(2)
    .map((buildNumber) => fetchLog(3, buildNumber, 8));
  const logs = await Promise.all(reqs);
  await Promise.all(
    logs.map((logInfo) => writeFile(logInfo.logid + '.txt', logInfo.contents, 'utf-8'))
  );
}());
Command used to search the logs
$ tac *.txt | awk '/blacklisting/, /TEST_START/' | grep TEST_START
51:45.93 TEST_START: /WebIDL/current-realm.html
18:17.47 TEST_START: /webgl/uniformMatrixNfv.html
18:14.37 TEST_START: /webgl/texSubImage2D.html
18:11.11 TEST_START: /webgl/texImage2D.html
18:07.77 TEST_START: /webgl/compressedTexSubImage2D.html
18:04.38 TEST_START: /webgl/compressedTexImage2D.html
18:01.38 TEST_START: /webgl/bufferSubData.html
 5:58.39 TEST_START: /2dcontext/imagebitmap/createImageBitmap-invalid-args.html
 1:56.38 TEST_START: /css/css-regions/elements/canvas3d-002.html
 1:53.44 TEST_START: /css/css-regions/elements/canvas3d-001.html
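For reference, the same search can be expressed as a small Node helper, keeping everything in the same language as the scraper above. This is a sketch: the function name `blacklistedTests` is mine, and it assumes each test's output in the Buildbot log begins with a "TEST_START: <name>" line, as in the excerpt above.

```javascript
'use strict';

// Find tests whose log output mentions the GL driver blacklist.
// Assumes each test's output begins with a "TEST_START: <name>" line
// (the Buildbot log format shown above); any "blacklist" mention before
// the next TEST_START line is attributed to the current test.
function blacklistedTests(log) {
  const affected = [];
  let current = null;
  let mentionsBlacklist = false;
  for (const line of log.split('\n')) {
    const start = line.match(/TEST_START: (\S+)/);
    if (start) {
      if (current && mentionsBlacklist) affected.push(current);
      current = start[1];
      mentionsBlacklist = false;
    } else if (line.includes('blacklist')) {
      mentionsBlacklist = true;
    }
  }
  if (current && mentionsBlacklist) affected.push(current);
  return affected;
}

module.exports = { blacklistedTests };
```

Unlike the `tac`/`awk` pipeline, this reads the log in its natural order, which makes the attribution logic a little easier to follow.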
Here's some documentation from Mozilla's wiki:
https://wiki.mozilla.org/Blocklisting/Blocked_Graphics_Drivers
Notably:
> If you would like to forcibly enable a graphics feature that is blocked on your system, follow these instructions. Warning: do this at your own risk. There usually are good reasons why features are blocked.
And further on:
> If force-enabling a feature doesn't work, that probably means that your hardware doesn't support it.
This makes me think that subverting the blacklist may be a viable option. If we did that, though, test failures would become our responsibility. I'd rather not have to manually verify individual test results every time we want to publish!
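If we did subvert the blacklist and own the resulting failures, one alternative to manually verifying results would be wpt's expectation metadata, which records known failures per test. A hypothetical sketch (the subtest name here is made up, and whether our infrastructure could consume such files is an open question):

```
[bufferSubData.html]
  [WebGL bufferSubData]
    expected: FAIL
```

That would at least make the set of tolerated failures explicit and reviewable, rather than something we re-check by hand before each publish.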