#1015 Binary CIF support
This PR adds support for parsing Binary CIF files and changes the RCSB data source provider to use this new format instead of the deprecated MMTF format.
Changes made
- Added the molstar library as a dependency, to use the CIF/Binary CIF parsing code from that project. Thanks to tree-shaking only the import tree relevant to the CIF reader is pulled in, but this still causes a noticeable increase of the overall bundle size.
- The code for reading Binary CIF is the same as the one used for reading CIF files. The only difference is the streamer invocation.
- Added code to parse "Upgraded mmCIF files" from PDBe. These contain connectivity information and make it possible to obtain the proper bond orders.
- Added `pdbe` as a new datasource. Data can be loaded from PDBe using the pseudo-protocol `pdbe://4hhb`, which downloads an uncompressed Binary CIF with the full connectivity (see the usage sketch after this list).
- The NGL viewer will download files from PDBe by default, unless an AlphaFold code is used (I can't find a way to download a Binary CIF from PDBe for AlphaFold structures).
- The `jest` test runner has been replaced with `vitest`. This was due to a bug with Jest when parsing the cif-parser file: the import from molstar is not an import of bundled code but an import of an ES module, which is not supported natively by Node and requires a transformer. However, Jest does not transform files from node_modules. Despite trying various approaches, I could not make it work and resorted to vitest, which worked out of the box.
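To illustrate the new datasource, here is a minimal usage sketch (the entry 4hhb and the `viewport` element id are just examples, not part of the PR itself):

```ts
import { Stage } from 'ngl'

// Bind the viewer to an existing DOM element ('viewport' is just an example id).
const stage = new Stage('viewport')

// New pdbe:// pseudo-protocol: downloads an uncompressed Binary CIF with full connectivity.
stage.loadFile('pdbe://4hhb', { defaultRepresentation: true })

// rcsb:// now resolves to Binary CIF from models.rcsb.org instead of the deprecated MMTF files.
stage.loadFile('rcsb://4hhb', { defaultRepresentation: true })
```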
Fixes
- Secondary structure in files from AlphaFold is now handled
- PDB codes are no longer limited to 4 letters when importing from RCSB, and AlphaFold codes can be used as well
Comments
Small benchmark, using PDB entry 5z6y (a relatively small structure, GFP):
Format | Provider | Download size
---|---|---
mmtf | RCSB | 17.2 kB*
bcif | PDBe | 182 kB
bcif | RCSB | 33.4 kB*
(*) RCSB response is gzipped
Despite the claim that bcif achieves better compression, it seems that there are still some caveats, and generally speaking forcing the transition from mmtf to bcif creates regressions (but also some improvements for specific use cases where the extra data content is relevant).
Thanks again - will have a proper look asap, just to note that some of our examples are already failing due to this, e.g. https://nglviewer.org/ngl/?script=showcase/viruses
EDIT: To clarify, failing because they can't grab the mmtf file, not because of changes in this PR!
Hmm, 3nap seems to be causing me some problems in the symmetry processing (might be a horror-show example as it's a virus)
I did not implement the alphaCarbondsOnly flag, so this might be it. I'll look at the issue with the mmtf file more closely. The code allowed downloading backbone-only structures; I haven't checked whether the same exists for bcif files.
Fixed:
(it seems the bug was already there in the previous cif parser)
Okay - looks good, I've gone through all the parser examples and they all work except for:
- parser/map (which breaks on parsing 4UJD.cif.gz - I've tested with recent versions of this entry in case it was edited in the repo or something, but no luck)
- parser/validation (parsing the 3PQR.cif from the data directory)
I can dig some more into these but thought I'd flag first as it might be something simple when you're familiar with the code.
Apologies, the first one that wasn't working is parser/map (edited above; previously said ccp4, but adding this comment in case you're following by email too)
Oh, and you should make yourself the author for pdbe-datasource.ts!
Good catch @fredludlow !
The issue with the 4UJD.cif.gz file was due to how compression is handled. The streamer returns an ArrayBuffer, which needs to be converted to a string to be processed by the CIF parsing library.
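For illustration, a minimal sketch of that conversion step, assuming a TextDecoder-based approach (the actual fix in the PR may look different):

```ts
// Hypothetical helper: decode an (already gunzipped) ArrayBuffer into text
// before handing it to the CIF text parser. Assumes UTF-8 content.
function arrayBufferToString(buffer: ArrayBuffer): string {
  return new TextDecoder('utf-8').decode(new Uint8Array(buffer))
}
```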
The second one is a bug with handling altlocs (in fact, they were not being processed correctly).
Both were pretty major issues. Maybe we should add more tests to better cover this code?
Can confirm both those are now working for me.
I've got a local PDB mirror and am running a script to try NGL.autoLoad on every mmCIF formatted entry - if this works there may still be other classes of bug, but it would definitely be reassuring.
Happy for you to merge this in the meantime (and thank you again!)
Hmm, 7a4p is causing issues
That's a tricky file: one of the chains (identifier `U`, entity id `20`) is missing from the coordinates block. It is reported elsewhere as a 3 aa chain.
So that's a missing null check I think.
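A rough sketch of the kind of guard that seems to be missing; the types and names here are illustrative assumptions, not NGL's actual internal data structures:

```ts
// Illustrative type, not NGL's internal model.
interface ChainRecord { id: string; atomCount: number }

// Look up a chain that may have no rows in the coordinates block,
// as with chain U (entity 20) in 7a4p: a missing entry must not throw.
function getChainAtomCount(chains: Map<string, ChainRecord>, chainId: string): number {
  const chain = chains.get(chainId)
  return chain !== undefined ? chain.atomCount : 0
}
```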
7a4p was the only one that threw an error / rejected the promise. There were approx. 250 entries where the spacegroup was either undefined or one that isn't recognized (`P b c a`, `P 21 21 2 A`, `P 1 21/n 1`, `P n n a`, `C 4 21 2` and `F 4 2 2` - putting these here in case this comes up in the future), but I don't think that's related to the parser.
For reference, script is here: https://gist.github.com/fredludlow/e0a2a4af29d902350c872162315538d1
Thanks @fredludlow that's so useful!
I'm not sure if the data source is set up properly for this. The following doesn't work for me (on a development server): `new Stage(...).loadFile('rscb://5z6y');`
This tries to access http://models.rcsb.org/5z6y.bcif.gz, but that returns a 301 Moved Permanently, referring to the new location https://models.rcsb.org/5z6y.bcif.gz (using HTTPS!), which works when used explicitly in the code. On the other hand, shouldn't the API described by RCSB be utilized instead? However, there seems to be no option for compression. The https://models.rcsb.org/5z6y.bcif.gz API seems to be the best option after all, even though it doesn't offer any options (I think that's just the download link on their website, right?).
The protocol part (http:// vs https://) comes from the current location (i.e. the server that serves the current page, here your development server). We should make this always https then (I think that's already the case for PDBe). Regarding the compression, are you referring to compression headers? I'm not sure they are necessary, as the file is already transmitted as a compressed stream.
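A minimal sketch of the kind of fix meant here; the helper name is illustrative, not the actual datasource code in the PR:

```ts
// Always use an explicit https scheme instead of inheriting the protocol
// of the page hosting the viewer (which may be plain http in development).
function getRcsbBcifUrl(pdbId: string): string {
  return `https://models.rcsb.org/${pdbId.toLowerCase()}.bcif.gz`
}
```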
Oh, I see. I think that would be good.
Regarding the compression: I was referring to the RCSB model API, which offers several endpoints to make requests to. I was wondering if, instead of using links like https://models.rcsb.org/5z6y.bcif.gz directly, maybe this API should be queried instead (like the `full` endpoint), as it seems to be the "official", documented way. But it apparently doesn't offer any compression options, so maybe using the direct download link is still better.
So, I've just checked and the https://models.rcsb.org endpoint does a good job of sending the data stream compressed using the HTTP headers. Here is the download size compared with the resource size:

I'll make a fix for the https vs http.
@panda-byte #1043 has been merged and published as v2.3.1 with the http/https fix for rcsb