helix
Disable loading of workspace configs by default and make it configurable
Makes loading of workspace-local `config.toml` and `languages.toml` files optional and disables it by default, to protect against arbitrary code execution when running Helix in an untrusted directory. Should fix #9514 and alleviate #2697.
This is a quick fix, not the full solution I have in mind:
- The editor should keep an allowlist file containing the allowed directories, each followed by a hash of the config file.
- This list is cross-referenced when trying to load a local config. If a config is detected but not on the list, a status message should appear.
- `:allow` would append the file and its hash to the allowlist.
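The allowlist idea above could be sketched roughly as follows. This is only an illustration, not the proposed implementation: `Allowlist`, `allow`, and `is_allowed` are hypothetical names, and `DefaultHasher` stands in for whatever real (ideally cryptographic) hash would be used.

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};
use std::path::{Path, PathBuf};

/// Hash the config file's contents. A real implementation would use a
/// cryptographic hash (e.g. SHA-256); DefaultHasher is only a stand-in.
fn content_hash(contents: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    contents.hash(&mut hasher);
    hasher.finish()
}

/// Maps an approved workspace directory to the hash of the config file
/// that was approved there (hypothetical structure).
struct Allowlist {
    entries: HashMap<PathBuf, u64>,
}

impl Allowlist {
    fn new() -> Self {
        Allowlist { entries: HashMap::new() }
    }

    /// What `:allow` would do: record the directory and the current hash.
    fn allow(&mut self, dir: &Path, contents: &str) {
        self.entries.insert(dir.to_path_buf(), content_hash(contents));
    }

    /// A local config is only loaded if its directory is listed and the
    /// file is unchanged since it was approved.
    fn is_allowed(&self, dir: &Path, contents: &str) -> bool {
        self.entries.get(dir) == Some(&content_hash(contents))
    }
}
```

The point of storing a hash rather than just the directory is that a config edited after approval (e.g. by a `git pull` in a trusted checkout) would require re-approval.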
@the-mikedavis I'd be fine with your workspace trust concept once it is implemented and if the prompt is absolutely clear about the risks. I'd really like the ability to activate language servers and loading of workspace configs separately though.
That seems like security theater to me, and VSCode doesn't do that either. If you trust the LSP, you trust the config. Going the other way around, you can disable the LSP in your workspace config.
@archseer There is one `workspace-config = true` for `config.toml` and one for `languages.toml`. The current code structure makes it way easier to implement this way, and I think it makes sense too :shrug:
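Assuming the flag keeps the name mentioned above, opting back in might look something like this in the global (trusted) config; the exact placement of the key is an assumption, not taken from the patch:

```toml
# ~/.config/helix/config.toml — global, trusted config
# Hypothetical: opt back in to loading workspace-local configs.
workspace-config = true
```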
@pascalkuthe It seems entirely reasonable to me to have language servers statically analyze code, provide snippets etc. without opening up the gates for ACE :flushed: I'm not a heavy user of language servers but this is what GPT-4 says about it:
> **User:** How common is unprotected execution of code from currently edited code repositories in language servers used in IDEs?
>
> **GPT-4:** Unprotected execution of code from currently edited repositories in language servers is not common and is generally considered a security risk. Language servers typically provide features like code analysis, linting, autocomplete, and refactoring. They parse the code and understand its structure, but they do not execute the actual code being edited. When execution is necessary, for example, during debugging sessions, it is done in a controlled environment, often with explicit user consent and understanding of the risks.
I'd assume that a lot of people who switch to a small-code-base editor that doesn't rely on plugins, advertises to be usable over SSH, and is implemented in a security-driven language expect it to follow a security-conscious approach and configurability.
Can other users of VSCode confirm that enabling language servers is coupled to loading project local configurations and executing code from them? I've never really used the thing but that'd seem like a no-go to me.
GPT is not a reliable source for anything and, as usual, provides word soup without substance.
The majority of language servers will execute arbitrary code, which is why VSCode disables them by default. For example, rust-analyzer will execute all `build.rs` files and procedural macros (yes, without any sandboxing). Language servers often even have their own config files, which they read from the project, and from which they can and do execute commands.
@pascalkuthe Certainly no proof, and unreliable, but often decent at reiterating and summarizing information. I know that Rust does this, but that is also pretty much the first thing that comes up when putting "language server" and "arbitrary code execution" into a search engine. I don't have a reliable source that says most LSPs don't do this. Do you have a source for the opposite? Jedi apparently is one explicit counter-example. In any case, the fact that there are plenty of ways to use the Language Server Protocol without arbitrary code execution warrants an option to use them that way IMO.
I pushed my set of changes based on some comments above here: https://github.com/helix-editor/helix/commit/5ed223f9476810a65c427c464d99950f1df6ec49
@archseer Oh, shoot. I refactored a bunch already. Is my version okay for you?