PEP 426: Define a JSON-LD context as part of the proposal
I finally found time to investigate JSON-LD as Wes Turner has regularly suggested. It does look like a good fit for what I want to achieve with the metadata 2.0 spec: http://www.w3.org/TR/json-ld/#basic-concepts
Also useful to me was this blog post from the JSON-LD lead editor: http://manu.sporny.org/2014/json-ld-origins-2/
I've long ignored the semantic web people because they tend to design and create overengineered solutions that are completely impractical for real world use. Sporny's post persuaded me that JSON-LD wasn't like that, and hence worth investigating further.
So, this is a fairly frequent documentation need, and an opportunity for linked requirements traceability (#LinkedData (EDIT: #LinkedReproducibility #PEP426JSONLD)):
- https://westurner.org/wiki/tools
- https://westurner.org/tools/ -- Sphinx, inline blocks (#NotStructuredData)
| Homepage: ...
| Src: git https://bitbucket.org/./.
| Download: .../download/
| Issues: bitbucket.org/././issues
| Docs: `<https://containsparens_(disambiguation)>`__
[... add'l ad-hoc attributes]
Before writing this as (most minimal, ordered) inline blocks, I wrote 'bobcat' (which requires FuXi for OWL schema reasoning) and one day drafted some thoughts for a 'sphinxcontrib-rdf' extension to add roles and directives.
- bobcat -> RST -> Sphinx (Use case: add an Appendix listing component RDF attributes to system docs)
- sphinxcontrib-rdf <-> Sphinx
More practically, how do I simulate `pip install` without running any setup.py files (i.e. traverse and solve from the given requirements rules)?
- The install_requires and extras_require edges need to be in the JSON[-LD]
  - https://github.com/ipython/ipython/blob/master/setup.py#L182
    - Note that here these variables are conditional based upon e.g. platform string parameters.
  - Is it possible to serialize these edges to JSON [EDIT: e.g. and JSON-LD] at next build / release time? (see the sketch below)
  - https://pypi.python.org/pypi/ipython/json
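A minimal sketch of what such build-time serialization could look like (the key names and the `requires.json` filename are placeholders, not part of any PEP):

```python
# Hypothetical sketch: evaluate the platform-conditional install_requires /
# extras_require at build time and dump the resolved edges as JSON next to the
# built distribution. Key names and the output filename are made up here.
import json
import sys

install_requires = ["decorator", "pickleshare"]
extras_require = {"notebook": ["jinja2", "tornado>=4.0"]}
if sys.platform == "win32":                      # a platform-conditional edge
    extras_require.setdefault("terminal", []).append("pyreadline>=2.0")

edges = {
    "run_requires": install_requires,
    "extras": extras_require,
    "environment": {"platform": sys.platform},   # record why the edges differ
}
with open("requires.json", "w") as f:
    json.dump(edges, f, indent=2)
```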
And then positive externalities of exposing JSON[-LD] that is schema.org compatible:
- It's possible that search engines could index schema.org/SoftwareApplication (from JSON-LD and/or RDFa in crawled pages); see the sketch after this list
- Other tools can retrieve metadata in a structured way (e.g. version checks with pip-tools / pip-sync, ...)
- Upstream and downstream packages could be linked with URIs (and provenance metadata, with signatures)
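A minimal, hypothetical schema.org/SoftwareApplication description of one release, as the kind of JSON-LD a project page or API could expose for crawlers (the URLs are the real pip 7.1.2 URLs; nothing here is emitted by PyPI/warehouse today):

```python
# Sketch only: one release described with schema.org terms.
import json

release = {
    "@context": "http://schema.org",
    "@type": "SoftwareApplication",
    "name": "pip",
    "softwareVersion": "7.1.2",
    "url": "https://pypi.python.org/pypi/pip",
    "downloadUrl": "https://pypi.python.org/packages/source/p/pip/pip-7.1.2.tar.gz",
}
print(json.dumps(release, indent=2))
```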
A broader discussion of tools (in any language) for/with RDFJS: https://text.allmende.io/p/rdfjs (see "### Classes")
- In terms of mapping types to JSON-LD:
- https://wrdrd.com/docs/consulting/knowledge-engineering#json-ld
- this may be generally helpful: https://github.com/westurner/elasticsearchjsonld/blob/master/elasticsearchjsonld/elasticsearchjsonld.py
Also of potential interest would be linking this in to the ISO/IEC Software Identification effort: http://tagvault.org/about/
Do they have URNs that could be the object of a (pypi:projectname, ex:, urn:x-tagvault:xyz) triple?
The total graph of install_requires and extras_require is the sum of each of the built eggs' JSON[-LD] representations of runtime setup.py state.
- { } Generate a separate JSON-LD
- { } Aggregate all JSON-LD metadata sets for each instance of each version of each package
- pydist.json: https://www.python.org/dev/peps/pep-0426/#metadata-format https://github.com/vsajip/distlib/blob/master/tests/pydist.json
- metadata.json (wheel): "Rename pydist.json to metadata.json to avoid stepping on the PEP"
- pydist.jsonld ?
"@context": "https://pypi.python.org/ns/pydist"
I think some level of schema.org interoperability should be a goal:
- http://lists.w3.org/Archives/Public/public-vocabs/2014Oct/0018.html
Also of interest: "[Distutils] pip/warehouse feature idea: 'help needed'" https://mail.python.org/pipermail/distutils-sig/2015-April/026108.html
On Sat, Apr 11, 2015 at 1:14 PM, Wes Turner <wes.turner at gmail.com> wrote:
On Sat, Apr 11, 2015 at 12:29 PM, Marc Abramowitz wrote:
Interesting. One of the things that would help with getting people to help, and which is in the PEPs but (last I checked) wasn't yet implemented, is the metadata that allows putting in all kinds of URLs; the ones I'm primarily thinking of here are the source code repository URL and the issue tracker URL.
http://legacy.python.org/dev/peps/pep-0459/:
[...]
A JSON-LD context would be outstanding.
- [ ] Additional properties for {...} (see RDFJS https://text.allmende.io/p/rdfjs ## Tools Schema)
I personally sigh when I see a PyPI page that lists its URL as said PyPI page, as this seems redundant and not useful; I'd rather see a GitHub or Bitbucket URL (or maybe a foo-project.org or readthedocs URL, but the repo URL is usually what I'm most interested in).
If we had the metadata with all the different kinds of URLs and the tools to show it and search it, then it would be clearer what to put where and would make it easier for consumers to find what they're looking for.
Another thought I had while reading your email was the OpenHatch project and if there could be some tie-in with that.
It also would be interesting if package maintainers had a channel to communicate with their user base. Back when I was at Yahoo, our proprietary package tool kept track of all installs of packages and stored the information in a centralized database. As a result, a package maintainer could see how many people had installed each version of their package and could send emails to folks who had installed a particular version or folks who had installed any version. A lot of folks used this to warn user bases about security issues, bugs, deprecations, etc. and to encourage folks to upgrade to newer versions and monitor the progress of such efforts.
Links to e.g. cvedetails, lists, and RSS feeds would be super helpful.
Links to e.g. IRC, Slack, Gitter would be super helpful.
Where Links == {edges, predicates, new metadata properties}
Links to downstream packages (and their RSS feeds) would also be helpful.
Debian has RDF (and also more structured link types that would be useful for project metadata)
- changelog / "release notes"
- build logs
- https://wiki.debian.org/RDF
- https://packages.qa.debian.org/p/python3-defaults.html
- https://packages.qa.debian.org/p/python3-defaults.ttl
What URI should pypi:readme or warehouse:readme expand to?
@prefix pypi: <https://pypi.python.org/pypi/> ;
@prefix warehouse: <https://warehouse.python.org/project/> ;
@prefix github: <https://github.com/> ;
- pypi:json["info"]["name"] ( + ".json" )
- warehouse:json["info"]["name"]
- github:json["info"]["name"]

@prefix doap: <http://usefulinc.com/ns/doap#> ;
- http://lov.okfn.org/dataset/lov/vocabs/doap

@prefix schema: <http://schema.org/> ;
- schema:SoftwareApplication -> https://schema.org/SoftwareApplication
- schema:Code -> https://schema.org/Code
- schema:Project -> TODO (new framework for extension vocabularies)
Should/could there be a pypa: namespace?
@prefix pypa: <https://pypa.github.io/ns/pypa/#> ;
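A sketch of a JSON-LD "@context" pulling those prefixes together (the pypa/pydist namespace URIs are placeholders, not registered anywhere; the term mappings at the bottom are illustrative choices):

```python
# Sketch only: prefixes from above declared in one JSON-LD context.
context = {
    "@context": {
        "doap": "http://usefulinc.com/ns/doap#",
        "schema": "http://schema.org/",
        "pypi": "https://pypi.python.org/pypi/",
        "warehouse": "https://warehouse.python.org/project/",
        "pypa": "https://pypa.github.io/ns/pypa/#",
        "name": "schema:name",
        "homepage": "doap:homepage",
        "repository": "doap:repository",
    }
}
```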
working thoughts:
I am working on adding Schema.org RDFa metadata to project detail pages for the next-gen PyPI (http://warehouse.python.org) [1]
There are structured fields for Python Packaging metadata [2][3] and there are tables in warehouse [4].
Challenges: (a) Mapping Author / Maintainer to <Person>s: (author/creator, editor, contributor, accountablePerson) (publisher, sourceOrganization may not be feasible)
(b) Picking a canonical [URI] for a Package:
- warehouse.python.org/project/<name>
- warehouse.python.org/project/<name>/<version>
- pypi.python.org/pypi/<name>
- pypi.python.org/pypi/<name>/<version>
- release.homepage
- release.project_url

(c) Expressing softwareVersion[s]
- How to express the project <--- release[.version] relation? [5]
(d) documentationUrl, bugtrackerUrl, [repositoryUrl]
- I think it could be helpful to amend SoftwareApplication with these properties.
- There is not yet an analogue of repositoryUrl in Python Packaging Metadata.
- DOAP Project and Versions [6]
- DOAP schema includes typed *Repository properties [5]
- {CVS, SVN, Darcs, Bk, Git, Hg; BazaarBranch }
(e) Should there be a SoftwareRelease?
- Currently: SoftwareApplication:softwareVersion (Text)
- This is lossy, as different releases have different filesizes and checksum/signature types.
- { } { PEP Metadata -> JSON-LD, pypa.ttl RDF Ontology }
[1] https://github.com/pypa/warehouse/blob/master/warehouse/templates/projects/detail.html
[...]
[4] https://github.com/pypa/warehouse/blob/master/warehouse/packaging/tables.py
[5] https://github.com/edumbill/doap/blob/master/schema/doap.rdf
[6] https://github.com/edumbill/doap/tree/master/licenses
- https://github.com/mozillascience/code-research-object/issues/15 "(JSON-LD) Metadata for software discovery"
[EDIT] ~fulltext cc here, emphasis added, markdown [EDIT] warehouse pkg detail template is now at https://github.com/pypa/warehouse/blob/master/warehouse/templates/packaging/detail.html
Also of potential interest would be linking this in to the ISO/IEC Software Identification effort: http://tagvault.org/about/
Do they have URNs that could be the object of a (pypi:projectname, ex:, urn:x-tagvault:xyz) triple?
Here is the XSD schema for "[ISO/IEC 19770-2:2009 Software Identification Tag Standard]" from http://tagvault.org/standards/swid_tagstandard/:
- http://standards.iso.org/iso/19770/-2/2009/schema.xsd
AFAIU, there is not yet support for ISO/IEC 19770-2:2009 "Software Identification (SWID) Tag Standard" tags in schema.org (e.g. schema.org/SoftwareApplication).
- A. add a schema.org/swid property to schema.org/SoftwareApplication
- B. create an extension vocabulary (in RDFa), generate the TTL and JSON-LD context, and host those:
  - https://schema.org/MedicalCode shows how to create a flexible reified edge (that can be used with many codingSystems)
  - This is/could/should/would then be defined in a JSON-LD context, and referenced as properties ("predicates") from package metadata RDF (as represented as JSON-LD); see the sketch below
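A rough sketch of what such a reified identifier edge could look like (every "pydist:" term below is hypothetical, and the URN is a placeholder):

```python
# Hypothetical reified SWID edge, loosely modeled on how schema.org/MedicalCode
# pairs a codeValue with its codingSystem; none of these terms exist yet.
swid_edge = {
    "@context": {"pydist": "https://pypi.python.org/ns/pydist#",
                 "schema": "http://schema.org/"},
    "@type": "schema:SoftwareApplication",
    "schema:name": "pip",
    "pydist:swid": {
        "@type": "pydist:SoftwareIdentifier",
        "pydist:codeValue": "urn:x-tagvault:...",       # placeholder URN
        "pydist:codingSystem": "ISO/IEC 19770-2:2009",
    },
}
```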
Possible prefix URIs (these don't have to resolve as dereferenceable URLs (they are URIs), but it's helpful if there is an HTML(+RDFa) representation there, for reference, which links to the source vocabs):
- http://schema.python.org/ # "reviewed/hosted extension"
- http://pydist.schema.org/ # "external extension"
- https://pypi.python.org/namespaces/pydist
- https://pypi.python.org/ns/pydist
- https://pypi.python.org/ns/pydist-v1.0.1
- https://pypi.python.org/ns/pep1234
- https://python.org/ns/pydist
- https://pypa.io/ns/pydist
Docs on creating schema.org extension vocabulary for [Python] packages:
- https://schema.org/docs/extension.html
- https://github.com/schemaorg/schemaorg/tree/sdo-phobos/data/releases/2.2
- https://github.com/schemaorg/schemaorg/blob/sdo-phobos/data/releases/2.2/schema.rdfa "RDFa master document" --> { other RDF formats }
- https://github.com/schemaorg/schemaorg/blob/sdo-phobos/app.yaml
- https://schema.org/version/2.2/
- [edit] http://dataliberate.com/2016/02/evolving-schema-org-in-practice-pt1-the-bits-and-pieces/
[EDIT] Links [EDIT] schema.org 2.1 -> 2.2 links
#PEP426JSONLD
@westurner #WhatDoHashTagsMean?
Compare:
- https://www.google.com/search?q=PEP426JSONLD
- https://www.google.com/search?q=PEP+426+JSONLD
Should there be / would it be useful to have:
[
  {'distro': '...'},
  {'distro': 'Ubuntu',
   'pkgname': 'python-pip',
   'url': 'http://packages.ubuntu.com/trusty/python-pip',
   # ... may also be present in e.g. downstream DOAP RDF records
   'maintainers': [{
       'name': 'Ubuntu MOTU Developers',
       'url': 'http://lists.ubuntu.com/archives/ubuntu-motu/',
       'emailAddress': '[email protected]',
   }]
  },
]
- [ ] How to remap pkg names to URIs?
  - pkgname -> pkg_url
  - "pip" -> {index_server_n}/pip
- PyPI ensures a unique constraint on pkg names of pkgs uploaded to PyPI by requiring the `register` step
- DevPi also supports authenticated package registration (package URL registration and/or package hosting)
For a given Python package [that is registered and uploaded to [PyPI/devpi]], there is thus a distinction between:
- a name property (e.g. "pip")
- the URN URI: e.g. something like urn:x-pythonpkg:pip
- the registered URI/URL: https://pypi.python.org/pypi/pip
- the retrieval URLs (e.g. {index_server}/{pkgname}/{version} -> {pkg.file.name.egg.whl})
- https://pypi.python.org/simple/pip/
- https://pypi.python.org/pypi/pip
- https://pypi.python.org/pypi/pip/7.1.2
- https://pypi.python.org/pypi/pip/7.1.2/json
- https://pypi.python.org/packages/py2.py3/p/pip/pip-7.1.2-py2.py3-none-any.whl#md5=5ff9fec0be479e4e36df467556deed4d
... So, in Linked Data terminology, the package URN URI (urn:x-pythonpkg:pip) is resolved to a dereferenceable URL at install time, given the distutils/setuptools/pip configuration (~index_servers and find-links).
For the distro metadata question, that's the main reason the draft metadata 2.0 proposal moves project details out to a metadata extension: https://www.python.org/dev/peps/pep-0459/#the-python-project-extension
Having the project metadata in an extension means it is then trivial to re-use the same format for redistributor metadata: https://www.python.org/dev/peps/pep-0459/#the-python-integrator-extension
For the pkgname to URI question: what practical problem will that solve for Python developers? What will they be able to do if metadata 2.0 defines that mapping that they won't be able to do if we don't define it?
For the distro metadata question, that's the main reason the draft metadata 2.0 proposal moves project container details out to a metadata extension: https://www.python.org/dev/peps/pep-0459/#the-python-project-extension
Got it, thanks; I hadn't been aware of this draft spec.
- https://schema.org/Person also defines 'name', 'email', and 'url' (`"@type": "schema.org/Person"`)
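e.g. a hypothetical mapping of a pydist contact entry onto schema.org/Person:

```python
# Sketch: map a PEP 426 contact entry (name/email/url) onto schema.org/Person.
contact = {"name": "A. Maintainer", "email": "maintainer at example.org",
           "url": "https://example.org"}
person = {
    "@context": "http://schema.org",
    "@type": "Person",
    "name": contact["name"],
    "email": contact["email"],
    "url": contact["url"],
}
```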
Having the project metadata in an extension means it is then trivial to re-use the same format for redistributor metadata: https://www.python.org/dev/peps/pep-0459/#the-python-integrator-extension
For the pkgname to URI question: what practical problem will that solve for Python developers? What will they be able to do if metadata 2.0 defines that mapping that they won't be able to do if we don't define it?
Linked Data names things with namespaced URIs for many of the same reasons that Python uses namespaces.
- Build a graph of package metadata that more completely describes the actual installation / build requirements for a given package
- JOIN with other sources of metadata using a canonical [URI] key
- Downstream packaging
- Metrics
- [CVE] databases
- https://github.com/nvie/pip-tools
- A procedure for resolving / expanding (with context) such that all of the package specifiers in the following pip requirements file describe the same package resource (given the state of index_servers, pip configuration, PyPI); see the sketch after this list:
pip
pip==7.1.2
https://pypi.python.org/packages/source/p/pip/pip-7.1.2.tar.gz#md5=3823d2343d9f3aaab21cf9c917710196
https://pypi.python.org/packages/py2.py3/p/pip/pip-7.1.2-py2.py3-none-any.whl#md5=5ff9fec0be479e4e36df467556deed4d
-e git+https://github.com/pypa/pip#egg=pip
-e git+ssh://[email protected]/pypa/pip#egg=pip
-e git+ssh://[email protected]/pypa/[email protected]#egg=pip
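A minimal sketch of that kind of normalization (this is not pip's resolution logic; it only handles the simple forms listed above, and the index URL is just one possible index_server):

```python
# Sketch only: normalize a few requirement-line forms to one canonical project
# URL on a single index server. Real resolution must consult index_servers,
# find-links, and pip configuration.
from urllib.parse import urlparse

INDEX_URL = "https://pypi.python.org/pypi"

def canonical_url(line, index_url=INDEX_URL):
    line = line.strip()
    if line.startswith("-e "):                        # editable VCS requirement
        name = line.rsplit("#egg=", 1)[-1]            # "...#egg=pip" -> "pip"
    elif line.startswith(("http://", "https://")):    # direct archive URL
        filename = urlparse(line).path.rsplit("/", 1)[-1]
        name = filename.split("-")[0]                 # "pip-7.1.2.tar.gz" -> "pip"
    else:                                             # "pip" or "pip==7.1.2"
        name = line.split("==")[0].split(">=")[0].strip()
    return "{0}/{1}".format(index_url, name.lower())

for spec in ["pip", "pip==7.1.2",
             "-e git+https://github.com/pypa/pip#egg=pip"]:
    print(spec, "->", canonical_url(spec))
```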
Practical utility of this:
- Do I already have the metadata for this package?
- Do I already have the metadata for this [installed] package in my journaled, append-only, JSON-LD log of (system/VIRTUAL_ENV) pip operations?
If, in the future, I want to store checksums for each and every file in a package (so that they can be later reviewed), what do I key that auxiliary document to? Should I be able to just ingest 1+ JSON-LD documents into an [in-memory, ..., RDF] graph datastore?
This is a graph of packages which happened to have fit a given set of constraints on a given date and time, with a given index_servers, pip configuration... At present, pip.log and pip freeze are not sufficient to recreate / reproduce / CRC a given environment.
What I would like is:
- (pkgname, version, install_date, installed_from_URI, installed_for=[])
- (pkgname, version, filename, file checksum)
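As records, that could look something like this (field names as listed above; a sketch, not an existing pip data structure, with values taken from the pip 7.1.2 URLs earlier in this thread and a placeholder timestamp):

```python
# Sketch: the two record shapes described above.
from collections import namedtuple

InstallRecord = namedtuple(
    "InstallRecord",
    "pkgname version install_date installed_from_uri installed_for")
FileRecord = namedtuple("FileRecord", "pkgname version filename checksum")

records = [
    InstallRecord("pip", "7.1.2", "2015-01-01T00:00:00Z",   # placeholder date
                  "https://pypi.python.org/pypi/pip/7.1.2", []),
    FileRecord("pip", "7.1.2", "pip-7.1.2-py2.py3-none-any.whl",
               "md5:5ff9fec0be479e4e36df467556deed4d"),
]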
IIUC, currently, the suggested solution is "just rebuild [in a venv [in a Docker container named 'distro']] and re-run the comprehensive test suite".
The currently suggested solution for cryptographic assurance of repeated installations is to use peep to capture the hash of the Python components in the requirements.txt file: https://pypi.python.org/pypi/peep
If you want full traceability, then Nix is a better fit than any other current packaging system: http://nixos.org/nix/about.html
Offering these kinds of capabilities by default isn't a current design goal for the upstream Python ecosystem, since they can already be added by the folks that need them, and providing them by default doesn't help lower barriers to entry for new users.
FWIW pip 8.0 will include peep’s functionality built into pip (though it is opt in by adding hashes to your requirements file).
Is/should this also be defined in "PEP 0508 -- Dependency specification for Python Software Packages" https://www.python.org/dev/peps/pep-0508/ ? ... :+1:
Pip docs of interest (on specifying Python package dependencies); a PEP 508 parsing example follows this list:
- https://pip.readthedocs.org/en/stable/reference/pip_install/#requirements-file-format
- https://pip.readthedocs.org/en/stable/reference/pip_install/#requirement-specifiers
- https://packaging.python.org/en/latest/glossary/#term-requirement-specifier
- https://packaging.python.org/en/latest/glossary/#term-version-specifier
- https://pip.pypa.io/en/latest/user_guide/#constraints-files
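For PEP 508 strings specifically, the `packaging` library (pip vendors a copy) already parses them; a quick look at what a specifier decomposes into:

```python
# PEP 508 requirement strings parsed with the `packaging` library
# (`pip install packaging` for standalone use).
from packaging.requirements import Requirement

req = Requirement('pip[testing]>=7.0,<8.0; python_version >= "2.7"')
print(req.name)       # 'pip'
print(req.extras)     # {'testing'}
print(req.specifier)  # the version constraints: >=7.0,<8.0
print(req.marker)     # python_version >= "2.7"
```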
Do I already have the metadata for this [installed] package in my journaled, append-only, JSON-LD log of (system/VIRTUAL_ENV) pip operations?
{
  "@graph": {
    "actions": [
      {"@type": "InstallAction",
       "command": "pip install -U pip",
       "description": "log message",
       "packages": [
         {"name": "pip", "version": "7.1.2", "versionwas": "7.1.0",
          "versionspec_constraint": ">=7.0.0",
          # ... pypi/pip/json metadata ...
         }
       ]}
    ]}
}
Then indexing on actions[*]["packages"][*][("name", "version" [, PEP0508])] would get the current snapshot off the top of the journaled history of the env (according to [pip, ])
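e.g. a rough sketch of that indexing over the journal shape shown above (assuming the journal is stored as plain JSON, without the inline comment):

```python
# Sketch: replay an append-only journal of InstallActions (shape as sketched
# above) and keep the most recent (name -> version) pair for each package.
import json

def current_snapshot(journal_path):
    with open(journal_path) as f:
        doc = json.load(f)
    snapshot = {}
    for action in doc["@graph"]["actions"]:          # oldest first
        for pkg in action.get("packages", []):
            snapshot[pkg["name"]] = pkg["version"]   # later actions win
    return snapshot
```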
A JSON-LD journal of package Actions [and inlined-metadata.json] would be an improvement over (PEP376 .dist-info directories) and (pip-log.txt, pip.log) because:
- It would then be possible to differentiate between pip environment changes and system package environment changes (as compared with the outputs from `pip freeze` or `pip-ls`)
- Each VIRTUAL_ENV would then have something like a `pip-log.jsonld` JSON-LD file w/ an inlined `@context`
https://github.com/pypa/interoperability-peps/blob/master/pep-0376-installation-db.rst https://www.python.org/dev/peps/pep-0376/
pip log (`--log=$PIP_LOG_FILE`):
- https://github.com/pypa/pip/blob/develop/tests/unit/test_options.py#L84
- https://github.com/pypa/pip/blob/develop/pip/cmdoptions.py#L118
- [install [--update] [--user] [!--prefix], uninstall]
A JSONLD context for the current JSON would need an "index map" to skip over the version keys;
- https://www.w3.org/TR/json-ld/#dfn-index-map
- https://www.w3.org/TR/json-ld/#data-indexing
but in JSONLD 2.0, we would need the ability to not skip but apply the key to each nested record.
... https://github.com/json-ld/tests
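For reference, a sketch of such an index-map term definition for the PyPI /json "releases" object, which is keyed by version string (the "pydist:" vocabulary URI is a placeholder):

```python
# Sketch: a JSON-LD 1.0 context fragment; "@container": "@index" keeps the
# version-string keys as data instead of treating them as property IRIs.
context = {
    "@context": {
        "pydist": "https://pypi.python.org/ns/pydist#",
        "releases": {"@id": "pydist:release", "@container": "@index"},
    }
}
```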
- Common Platform Enumeration (CPE)
https://scap.nist.gov/specifications/cpe/
- https://scap.nist.gov/schema/cpe/2.3/cpe-naming_2.3.xsd
- https://scap.nist.gov/schema/cpe/2.3/cpe-dictionary_2.3.xsd
- https://scap.nist.gov/schema/cpe/2.3/cpe-dictionary-extension_2.3.xsd
- https://scap.nist.gov/schema/cpe/2.3/cpe-language_2.3.xsd
This discussion indicates that there may be a need to add reified edges for packages which, according to maintainers and/or index maintainers, supersede existing packages (e.g. PIL -> Pillow)
- "[Distutils] Outdated packages on pypi" http://comments.gmane.org/gmane.comp.python.distutils.devel/26306