
Create standard for custom resource bundles

Open Synzvato opened this issue 8 years ago • 13 comments

It should be possible for end-users to create, share, and import custom resource bundles. This will allow us to keep Decentraleyes lean and tidy, without having to disappoint power-users. To be able to support these bundles, we will need to develop an open standard. Bundles should contain the actual resources and essential metadata (such as resource mappings, a title, and a description).
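To make the idea more concrete, here is a rough sketch of what a bundle manifest could look like, written as a JavaScript object literal. Every field name is purely illustrative and not part of any agreed standard.

```javascript
// Sketch of a possible bundle manifest: essential metadata plus resource mappings.
// All field names here are placeholders for discussion, not a finalized format.
const exampleManifest = {
    title: 'Example community bundle',
    description: 'Libraries that are not part of the core resource set',
    mappings: {
        // CDN path that should be intercepted -> file shipped inside the bundle
        'ajax.googleapis.com/ajax/libs/examplelib/2.0.0/examplelib.min.js':
            'resources/examplelib/2.0.0/examplelib.min.js'
    },
    resources: [
        { path: 'resources/examplelib/2.0.0/examplelib.min.js', sha256: '...' }
    ]
};
```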

Feel free to weigh in with suggestions, ideas, or concrete proposals.

Synzvato avatar Nov 27 '15 23:11 Synzvato

@Synzvato If you implement this, consider adding a warning about the danger of importing JavaScript from untrusted sources. To give you ideas, consider how the Greasemonkey add-on handles this same vulnerability. To see this for yourself, install the add-on, then go to the greasyfork site to install a user script.

Alternatively, the standard could work such that these custom bundles do not include the JS themselves, but rather a list of URLs to the scripts they would like to save locally. Decentraleyes would then download the scripts itself (over HTTPS if possible).

RoxKilly avatar Dec 04 '15 21:12 RoxKilly

[...] consider adding a warning about the danger of importing JavaScript from untrusted sources.

That's quite crucial indeed. There should definitely be a prominent warning.

Alternatively, the standard could work such that these custom bundles do not include the JS themselves, but rather a list of URLs to the scripts they would like to save locally. Decentraleyes would then download the scripts itself (over HTTPS if possible).

Although this is a good idea, there are two sides to this. The reason that all bundled resources are currently pushed into this repository, and that there's a handcrafted list of all included files (files.js), is that the contents of the resulting package should be set in stone.

This is important, because it allows multiple parties to verify a fixed set of files. The approach is meant to help reduce the amount of trust required in Content Delivery Networks. Do you think it could be an idea to make a non-optional file checksum validation mechanism part of the standard?
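As a rough illustration of what a mandatory checksum check could look like (a sketch only, assuming bundles record a SHA-256 digest per file; the function and parameter names are hypothetical):

```javascript
// Sketch: compare a bundled file against the digest recorded for it in the bundle.
// Uses the Web Crypto API, which is available in Firefox extension code.
async function verifyResource(fileBytes, expectedSha256Hex) {
    const digest = await crypto.subtle.digest('SHA-256', fileBytes);
    const actualHex = Array.from(new Uint8Array(digest))
        .map((byte) => byte.toString(16).padStart(2, '0'))
        .join('');
    return actualHex === expectedSha256Hex;
}
```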

I would be happy to hear your thoughts on the strategy.

Synzvato avatar Dec 06 '15 16:12 Synzvato

I wasn't suggesting changing the add-on itself when I discussed listing URLs instead of including scripts. The add-on would still come with its bundled scripts. I was talking about the custom bundles. If my understanding is correct, the way your add-on works, it associates each supported URL with a local script, right? OK, so suppose you use an open standard to allow users to add their own bundles. What is to prevent someone from associating a valid URL (e.g. code.jquery.com...) with a malicious script, then bundling all that and publishing it for users to import? If that happened, when the victim tried to reach code.jquery.com... as a result of everyday browsing, the malicious script would get executed.
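For context, the mapping mentioned above can be thought of as a lookup table from CDN URLs to locally bundled files, roughly along these lines (a simplified sketch for illustration, not the add-on's actual code):

```javascript
// Simplified sketch: answer a request for a known CDN URL with a bundled local copy.
const localMappings = {
    'ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js':
        'resources/jquery/1.11.1/jquery.min.js'
};

function resolveRequest(requestUrl) {
    const key = requestUrl.replace(/^https?:\/\//, '');
    // Return the bundled file if there is one, otherwise let the request through.
    return localMappings[key] || null;
}
```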

What I suggested is that instead of including the JS that's meant to be served, the custom bundle would just list the URLs it wants to cache locally (e.g. code.jquery.com...). When a user installs the bundle, the add-on will download all the scripts and import them. This way you will be sure that in regular browsing, when the browser wants code.jquery.com..., the legitimate JS will be served. This way the amount of trust we put in the CDN is the same as always: we execute the script that they store. However, the CDN is contacted only once, when the bundle is imported.
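A rough sketch of the install-time fetch described above (the storage helper is hypothetical; the point is that the CDN is contacted exactly once, at import time):

```javascript
// Sketch: resolve a mapping-only bundle at install time by fetching each listed
// script once, preferring HTTPS, and storing the bytes locally for later use.
async function importBundle(urlList, store) {
    for (const url of urlList) {
        const secureUrl = url.replace(/^http:\/\//, 'https://');
        const response = await fetch(secureUrl);
        if (!response.ok) {
            throw new Error(`Could not fetch ${secureUrl}: ${response.status}`);
        }
        await store.save(secureUrl, await response.arrayBuffer()); // store: hypothetical helper
    }
}
```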

I don't think I understood well what you were trying to explain about parties verifying a fixed set of files.

RoxKilly avatar Dec 06 '15 18:12 RoxKilly

OK, so suppose you use an open standard to allow users to add their own bundles. What is to prevent someone from associating a valid URL [...] with a malicious script, then bundling all that and publishing it for users to import? If that happened, then [...] would get executed.

Although that's essentially true, what would prevent a devious package maintainer from simply linking to malicious resources? It's worth noting that multiple links can point to the same local file.

When a user installs the bundle, the add-on will download all the scripts and import them. This way you will be sure that [...] the legitimate JS will be served. This way the amount of trust we put in the CDN is the same as always: we execute the script that they store.

Only, the current implementation puts even less trust in centralized delivery networks. If every user fetched copies of files from these central parties, it would technically re-open the door for big players to inject malicious code into specific environments (highly targeted, and therefore next to impossible to detect).

However the CDN is contacted only once, when the bundle is imported. I don't think I understood well what you were trying to explain about parties verifying a fixed set of files.

I think it would be safer to force package maintainers to include the files in their bundles. These files can then be compared to one mapping entry (a link, selected by the creator). Bundles can then easily be verified, as several parties will be comparing the same files to the same central copy.

The result would be a complete, signed bundle (signed to safeguard the integrity of its contents) that can be scrutinized by anyone and that puts zero trust in delivery networks. Does this clear things up?

Synzvato avatar Dec 14 '15 23:12 Synzvato

What would prevent a devious package maintainer from simply linking to malicious resources?

Nothing would prevent it, but it wouldn't matter. Suppose the attacker wants to run bad.js on the user's machine. With my suggestion, the user would not end up running bad.js when a website requests a legitimate jquery.js file. The user would run bad.js only if a website specifically asked for the bad.js URL. For all other users, although the bundle will contain bad.js, it will never get executed.

the current implementation puts even less trust in centralized delivery networks

I don't think I understand this. Doesn't the add-on currently execute the same code that's available at the CDNs? So if https://ajax.googleapis.com/ajax/libs/angularjs/1.3.15/angular.min.js is malicious, both non-users and users of Decentraleyes will be victimized. No?

I think it would be safer to force package maintainers to include the files in their bundles. These files can then be compared to one mapping entry (a link, selected by the creator).

OK, suppose I create a bundle that includes instructions to serve bad.js whenever the browser requests any jquery 1.2.1 link. I then sign my bundle and point to a reference online that all users can verify their local copy against. Now everyone can be sure that they have an authentic copy of my bundle. Isn't it true that all users who browse to a site that uses jquery 1.2.1 will run bad.js instead of the legitimate jQuery code, and will therefore be infected? I just don't understand how they will be protected. With my previous suggestion, we can be sure that only the legitimate jQuery code will be run. The bundle can still be signed for integrity checking, just as you could sign any text with a PGP key, or compare hashes to determine whether two sources are identical.
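To illustrate the signing part, here is a minimal sketch of verifying a detached signature over a bundle with the Web Crypto API (the key handling and signature scheme are assumptions, not a proposed spec):

```javascript
// Sketch: check a detached signature over the serialized bundle before importing it.
// publicKey is a CryptoKey imported elsewhere; signature and bundleBytes are ArrayBuffers.
async function verifyBundleSignature(publicKey, signature, bundleBytes) {
    return crypto.subtle.verify(
        { name: 'RSASSA-PKCS1-v1_5' },
        publicKey,
        signature,
        bundleBytes
    );
}
```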

RoxKilly avatar Dec 15 '15 01:12 RoxKilly

Anyone who wishes to curate a "custom resource bundle" could simply provide a set of mappings. The remote resources would be retrieved at install time. This approach could even be applied to the decentraleyes "base set" of resources, and doing so would obviate the need for an "audit" utility script (needed to satisfy AMO review each time the extension is incrementally updated?)

Yes, I've skimmed the closed issues list and have read https://github.com/Synzvato/decentraleyes/issues/35. "Step 4" proposed there (pass, and allow retrieval from the browser's regular cache) may be prone to failure if the server originally stipulated "must-revalidate" in the response header.
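For reference, the failure mode mentioned there could be detected by inspecting the response's caching directives, roughly like this (a sketch, not code from the add-on):

```javascript
// Sketch: a response served with "must-revalidate" (or stricter directives) cannot
// simply be reused from the browser cache without contacting the origin again.
function requiresRevalidation(response) {
    const cacheControl = response.headers.get('Cache-Control') || '';
    return /must-revalidate|no-store|no-cache/i.test(cacheControl);
}
```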

stewie avatar Feb 08 '16 17:02 stewie

@stewie Providing mappings instead of the JS content is what I've been advocating for as well. The scripts themselves would be downloaded at bundle install time, and it would be much harder for bad actors to get malicious JS to run when the browser tries to fetch a legitimate script.

RoxKilly avatar Feb 11 '16 04:02 RoxKilly

@stewie

This approach could even be applied to the decentraleyes "base set" of resources, and doing so would obviate the need for an "audit" utility script

I don't know of any downsides to the current audit script, and cannot see what advantages there would be to removing the audit tool from the current codebase. It's quite easy to maintain.

@RoxKilly

Providing mappings instead of the JS content is what I've been advocating for as well. The scripts themselves would be downloaded at bundle install time, and it would be much harder for bad actors to get malicious JS to run when the browser tries to fetch a legitimate script.

This all looks good in theory, but it would break the concept of keeping a lot of sites working while not relying on third parties at all. This might not sound like too big of a deal, but let me elaborate.

Let's say I live in China (where access to ajax.googleapis.com is often blocked). The current version of Decentraleyes would greatly help people keep using websites that rely on such networks. In this situation, the proposed (stripped-down) version of the add-on would fail spectacularly.

Bundled resources are added based on web usage trends, which is why the bundle is just over 5 MB in size and able to block a fairly high number of requests to known CDNs.

I'm trying to work towards a situation where anyone can put Decentraleyes (along with his or her favorite custom bundles) on a USB-drive and use it anywhere without relying on any third parties.

I hope this managed to highlight some key advantages to the current approach.

Synzvato avatar Feb 11 '16 14:02 Synzvato

@Synzvato Thanks for taking the time to explain. I better understand your position now. If you don't want to rely on contact with the CDNs at all, and you're worried about censorship blocking the download when the add-on tries to resolve the mappings, then you're right: mappings won't work.

What I still don't understand is how you then combat the danger that I raised. If I publish a bundle that includes malicious JavaScript and links it to http://ajax.googleapis.com/ajax/libs/jquery..., then anyone who uses my bundle will become infected as soon as the browser tries to fetch jQuery.

Another solution might be to install only those bundles that have been signed with your developer private key, so that only you can build official bundles. Under that scenario, if I wanted to publish a custom bundle, I would provide you with mappings; you would resolve them, download and import the JS, build the bundle, sign it, and return it. It would work as a web service: I submit the mappings to your server, and I get back a signed, built bundle that includes all the JavaScript and can be distributed.
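A sketch of how the client side of such a service might look (the endpoint and request shape are entirely hypothetical):

```javascript
// Sketch: submit a list of mappings to a hypothetical signing service and receive
// back a built, signed bundle that is ready for distribution.
async function requestSignedBundle(mappings) {
    const response = await fetch('https://example.invalid/bundle-signing', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ mappings })
    });
    if (!response.ok) {
        throw new Error(`Signing service rejected the request: ${response.status}`);
    }
    return response.arrayBuffer(); // the signed bundle
}
```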

At the end of the day, whatever solution you adopt, users should have a guarantee that when the browser tries to fetch jquery, it's the legitimate jquery script that gets executed.

RoxKilly avatar Feb 12 '16 04:02 RoxKilly

Related question: as users, is there any reason we should refrain from adding every single library available?

(Feel free to remove this comment if you think it doesn't belong in this thread)

Bisaloo avatar Apr 13 '16 06:04 Bisaloo

@Bisaloo

[...] is there any reason we should refrain from adding every single library available?

I'd say this highly depends on your actual use case and on your personal preferences. Doing so should be possible. If you ever decide to have a go at it, please share your personal findings!

Synzvato avatar Oct 24 '16 08:10 Synzvato

After reading the discussion, it seems that there are several questions we need to answer.

1. Should we give users the option to download and install scripts from third parties?

I think we should give users the option to install scripts from third parties. That option should be turned off by default, and when it is turned on, the user should be warned about the risks of installing scripts from external sources.

I know you are concerned about trust, but the problem here is that we are limiting users. If we take your example about people in China, this can be really problematic for them, since the base bundle only covers a fraction of the libraries that exist on the web. And that's without even mentioning other resources like CSS files. For example, Bootstrap is a pretty popular library, and a lot of websites use a CDN to load its JS, CSS, and web fonts.

In conclusion, we have three choices:

  1. Keep the code as is.
  2. Let users install third-party scripts (the scripts can come from upstream; for example, I could download jQuery directly from the developers and add it to Decentraleyes myself if there were an interface like the one in GreaseMonkey or ViolentMonkey). Upstream is normally a trustworthy source.
  3. Host our own repository, but that option would cost a lot of money and users would need to trust us, so we're back to square one.

2. Should we keep a base package of the most popular libraries?

As you pointed out, people in China are often confronted with government censorship, and providing a base bundle would help them even if the bundle isn't up to date (bleeding edge).

Bonus: It would be a good idea to keep only some versions of the libraries. For example, minor versions are often backward compatible. That way, we could include more libraries in the base bundle. Of course, there's the problem of keeping the plugin up to date, and the process to push an update to the Firefox add-on site is awful! That option is halfway between "keep the code as is" and "give users the possibility to install whatever they want".
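A sketch of what such version aliasing could look like (purely illustrative; the version ranges and names are assumptions):

```javascript
// Sketch: serve one bundled release for any patch version in the same minor series,
// e.g. answer requests for jQuery 1.11.0 through 1.11.3 with a bundled 1.11.1.
const bundledVersions = { 'jquery/1.11': '1.11.1' };

function aliasVersion(library, requestedVersion) {
    const minorSeries = requestedVersion.split('.').slice(0, 2).join('.');
    return bundledVersions[`${library}/${minorSeries}`] || null;
}
```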

dannycolin avatar Nov 24 '17 04:11 dannycolin

@Synzvato I don't want to rush you, but is there any news about this feature request?

dannycolin avatar Feb 15 '18 23:02 dannycolin