About configuring VSCode to interpret php
Hello my friend,
Can you help me configure VSCode to interpret PHP?
I need to point VSCode at the PHP inside Docker.
Is it possible? Can you help me?
I think it is possible; some people have done it and there is a script somewhere. But it would be really cool if someone could put together a step-by-step guide on how to get this done with Devilbox.
If you are talking about setting up XDebug with Devilbox and VSCode: here you go
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@mrbig00 Is this the same as setting up XDebug? I don't think so.
If we simply want VSCode to be able to access PHP inside the Devilbox container, what do we need to do? So regardless of project, just have VSCode "connect" to PHP inside the Devilbox container. How is that done please?
@mozgbrasil Did you find a solution? Do you use XDebug or just have PHP be "available" inside VSCode regardless of project?
Good Morning
I installed PHP on the operating system
And because the operating system's Apache conflicts with Devilbox, my practice is to stop Apache first
sudo service apache2 stop
@mozgbrasil
I am using Devilbox and Docker to keep my OS free of Apache, NGINX, PHP, etc. Of course I could have installed it all on the OS and simply connected VSCode to it.
@cytopia I am trying to "connect" PHP from inside the Devilbox container to VSCode on the OS. What parts in the https://devilbox.readthedocs.io/en/latest/index.html might be able to point me in the right direction?
https://devilbox.readthedocs.io/en/latest/intermediate/configure-php-xdebug/linux/vscode.html#configure-php-xdebug-lin-vscode is for XDebug. But what if I just want to "connect" VSCode to PHP in the Devilbox container and not use XDebug?
if I just want to "connect" VSCode to PHP in the Devilbox container and not use XDebug?
What is the reason, or what is this useful for, if I may ask?
Not sure if related, but for me, running tools like phpcs (PHP CodeSniffer), I currently have them pointed at the OS PHP install, which is not really correct. They should be linked to the container in case you are using a different PHP version than your OS.
@cytopia What @Cipa wrote is exactly one of the reasons for example.
Another reason would be to be able to use https://github.com/Automattic/VIP-Coding-Standards or https://github.com/squizlabs/PHP_CodeSniffer.
In fact being able to do that would be truly awesome.
On purpose I am keeping the host OS slim, fast and clean. Being able to use these tools inside VSCode through Devilbox, oh my, that would be really nice.
@mozgbrasil @robots4life @Cipa as this is very VSCode specific and I've never used it, I am unfortunately unable to help you guys. I would suggest getting support directly at the relevant VSCode places:
- https://stackoverflow.com/questions/tagged/vscode
- https://github.com/Microsoft/vscode/issues
- https://www.reddit.com/r/vscode/
@cytopia Ok, thank you.
Can you explain: if I wanted to make PHP available outside the Devilbox container, other than through the browser, would I need to open a port or do something similar, like changing the .env file? Not sure if this applies, but what if I wanted to check e.g. the PHP version inside the Devilbox container from a console on the host OS? How could I pass such a command into the Devilbox container?
Where in the docs can I read up on the concept of making PHP available outside the Devilbox container? I guess this is more Docker specific, but if you have a topic in the docs that you think might be able to help even remotely, please let me know.
Thank you.
If you are in the Devilbox directory, you can use it as such:
docker-compose exec -T --user devilbox php php -v
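Building on that, such calls can be wrapped in a small shell helper so any PHP tool runs inside the container. A minimal sketch, assuming you call it from the Devilbox directory and that the compose service is named php; the function name dbox-php is made up:

```shell
# Hypothetical helper: run any command inside the Devilbox php container
# as the devilbox user. Assumes the current directory is the Devilbox
# checkout and that the compose service is called "php".
dbox-php() {
  docker-compose exec -T --user devilbox php "$@"
}

# Usage (not executed here):
#   dbox-php php -v
#   dbox-php composer install
```

This keeps tools like phpcs callable from the host without installing PHP on the OS.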
Hi @robots4life, I am using the new VS Code Remote Development to accomplish this. Make sure you have one of the following extensions installed: ms-vscode-remote.vscode-remote-extensionpack (a package containing remote development for SSH, Containers and WSL) or ms-vscode-remote.remote-containers (for remote development in containers).
After installing these extensions all you need is a devcontainer.json file in a .devcontainer folder inside of your workspace. <project>/.devcontainer/devcontainer.json.
In case of Devilbox the file should look like the following:
{
    // See https://aka.ms/vscode-remote/devcontainer.json for format details.
    "name": "Existing Docker Compose (Extend)",

    // Update the 'dockerComposeFile' list if you have more compose files or use different names.
    // The .devcontainer/docker-compose.yml file contains any overrides you need/want to make.
    "dockerComposeFile": [
        "../../../../docker-compose.yml" // Point to Devilbox's docker-compose.yml
    ],

    // The 'service' property is the name of the service for the container that VS Code should
    // use. Update this value and .devcontainer/docker-compose.yml to the real service name.
    "service": "php", // Name of the service we want to remote to

    // The optional 'workspaceFolder' property is the path VS Code should open by default when
    // connected. This is typically a file mount in .devcontainer/docker-compose.yml
    "workspaceFolder": "/shared/httpd/<project-name>", // For example: /shared/httpd/my-website

    // Uncomment the next line if you want to keep your containers running after VS Code shuts down.
    // "shutdownAction": "none",

    // Uncomment the next line if you want to add in default container specific settings.json values
    // "settings": { "workbench.colorTheme": "Quiet Light" },

    // Uncomment the next line to run commands after the container is created - for example installing git.
    // "postCreateCommand": "apt-get update && apt-get install -y git",

    // Add the IDs of any extensions you want installed in the array below.
    "extensions": [
        "eamodio.gitlens",
        "felixfbecker.php-debug",
        "felixfbecker.php-intellisense",
        "fterrag.vscode-php-cs-fixer",
        "atlassian.atlascode"
    ]
}
This approach needs Devilbox to be up and running beforehand.
When the extensions are installed AND the devcontainer.json file is present in the appropriate folder, VS Code will notify you (upon reopening the workspace) that it detected the file and that you can reopen the project inside the container.

After reopening the project inside the container you can also run commands inside the container straight from VS Code's built-in terminal, for example composer install.

@boumanb thanks for the detailed post. One more thing to sort out is how to make the terminal start up as the devilbox user instead of root. The devilbox user has the same uid/gid as your local user, and I suppose you would rather issue any commands as that user to keep permissions synchronized between the inside of the Docker container and your host.
No problem. Glad I can contribute in some way, truly a great project. Glad to see you active again.
You're totally right. It's better to keep these permissions synchronized. However, I've not been able to find a solution for this. VS Code has some documentation on using a different user while developing remotely: https://code.visualstudio.com/docs/remote/containers-advanced#_adding-a-nonroot-user-to-your-dev-container
Apparently, when using the approach described above, we have to define the user in the docker-compose.yml (docker-compose.override.yml in our case) like user: "devilbox", although this causes the php service to crash:
php_1 | [INFO] Changing group 'devilbox' gid to: 1000
php_1 | groupmod: Permission denied.
php_1 | groupmod: cannot lock /etc/group; try again later.
// Uncomment the next line to run commands after the container is created - for example installing git. // "postCreateCommand": "apt-get update && apt-get install -y git",
Could you perhaps run su - devilbox as a post-create command?
Normally, yes. But in my case Devilbox is running 24/7 in the background. The "postCreateCommand": "apt-get update && apt-get install -y git" will only be executed when creating a new remote environment from an image, Dockerfile or docker-compose, for example. What I tell my VS Code with the configured devcontainer.json is to make use of the pre-existing environment.
@boumanb Thank you! I will give all this a try and report back either way.
Great, looking forward to your response!
@boumanb
Make sure you have one of the following extensions installed ms-vscode-remote.vscode-remote-extensionpack (package containing remote development for SSH, Containers and WSL) or ms-vscode-remote.remote-containers (for remote development in containers).
My workflow is such that I run node, npm, grunt, sass, etc. locally on the host in the project folder and like to do things like php coding standards or php code sniffer in the container.
First, would you see an advantage in running everything in the container, while having HOST_PATH_HTTPD_DATADIR pointing to a local folder with all my projects? Meaning in those local folders just have the files for the project and have tools etc. in the containers.
At the moment I am fine with having the tools mixed into the project folders. This means I need to have node and npm on the host, which sometimes leads to minor issues when updates or different versions are required.
I tried both ways and found that having things local is a bit faster. Would you see another advantage of running all the tools in the containers?
Then I am not sure what package to pick from the two you mention. Like I said I do all the development locally and would only need to run the php tools in the container. What package is better suited for that?
@boumanb Given I might not have grasped the concept of containers and working inside them you can disregard my previous message/question.
I guess I am close but it does not work. Here is what I have done.
First I installed this extension pack for VS Code. https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack
All my projects are on the hard drive in this folder /media/user/d/WWW/.
Inside that folder I also have /devilbox/. On the konsole (Kubuntu) I can simply run docker-compose up -d and the Devilbox will start. So far so good.
The .env file in the /devilbox/ folder contains this line.
HOST_PATH_HTTPD_DATADIR=/media/user/d/WWW/
This enables me to have access to all the projects I have in that folder when I open the devilbox shell for example. With this I can also create any new folder and have a project up and running fast. I add the entry for the project to the hosts file and it works.
So my VS Code workspace is the folder /media/user/d/WWW/ and inside that are multiple projects, all of which can use devilbox.
Now inside /media/user/d/WWW/, my workspace, I made a folder .devcontainer and inside that created the file devcontainer.json. That file has the following content.
{
    "name": "Existing Docker Compose (Extend)",
    "dockerComposeFile": ["/media/user/d/WWW/devilbox/docker-compose.yml"],
    "service": "php",
    "workspaceFolder": "/shared/httpd/",
    "shutdownAction": "none"
}
Now when I open VS Code and open the workspace being at /media/user/d/WWW/ I am greeted with the following messages.

Now when I click on Reopen in Container this happens.

So if you have any idea to what I can do different please let me know. I am reading these docs https://code.visualstudio.com/docs/remote/remote-overview and hopefully I can find something there. Thank you @boumanb
Hi,
Sorry for not responding to your previous message. I'll get back to you this weekend!
No worries please. Do take your time. This is not time critical at all. Easy going here.. and thx.
Hi,
I can reproduce this exact same error message by not having devilbox up and running on reopening in container.
Could you verify that you have devilbox up and running upon reopening in container through VS Code?
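A quick sanity check along these lines could confirm the php service is up before reopening in the container (a sketch; the function name devilbox_up is made up and assumes docker-compose v1-style output with an "Up" state column):

```shell
# Hypothetical check: returns success if the Devilbox "php" service shows
# an "Up" state. Run from the Devilbox directory.
devilbox_up() {
  docker-compose ps php | grep -q 'Up'
}

# Usage (not executed here):
#   devilbox_up && echo "php container is running"
```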
First I removed the vscode-remote-extensionpack and closed VS Code.
Then I removed the .devcontainer folder and its contents, the devcontainer.json file.
Then I stopped Devilbox. Then I cleaned/removed Devilbox.
Then I made a new .devcontainer folder with a devcontainer.json file in it with the exact content of the above post.
Then I opened VS Code again and installed this again. https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack
Then I started the Devilbox. Then I started VS Code.
Then, while VS Code was starting up I could see a notification saying this.
I quickly clicked details and was presented with this log.
devcontainer.log
Then the terminal opens with Devilbox! Yes!

So in fact all I did was remove and add the pack, folder, file and then stop and start Devilbox. That did the trick. Thank you @boumanb
Now what about using the container sensibly? Do I understand this correct?
When I issue commands in that terminal, like npm init inside a project, all that work will be done inside the container, right? So this is not touching the host OS any more, right? Well, assuming the default setting in the .env file has not been changed. In my case it is changed, because I set HOST_PATH_HTTPD_DATADIR in the .env file to /media/user/d/WWW/. So the changes I make in the terminal that is connected to the Dev Container will be reflected on my local host, but the tools to run these packages will reside in the container. So this means I can have the local host clean and do the work in the container while also having the project files on the local host. That is quite neat in fact, if I understand all that right.
Then what about this?
Now that I have the Dev Container connected, what should I put in the VS Code settings, meaning how can I tell VS Code where to find the PHP executable path now?
Then you mention some extension.
"extensions": [
    "eamodio.gitlens",
    "felixfbecker.php-debug",
    "felixfbecker.php-intellisense",
    "fterrag.vscode-php-cs-fixer",
    "atlassian.atlascode"
],
These will be installed in VS Code I assume, right? So this means I could also just install them normally, without running the Dev Container, but only once I have told VS Code where the PHP executable path is, no?
So once VS Code has that setting right I can in fact use vscode-php-cs-fixer or php-intellisense or do something like this https://tommcfarlin.com/php-codesniffer-in-visual-studio-code/ to then be able to use https://github.com/Automattic/VIP-Coding-Standards for example. Or would I need to add all this to the devcontainer.json file in the extensions key?
Last but not least, what about the Devilbox user that you speak about above?
I don't run Devilbox 24/7; I stop and start it as I work. So I might be able to issue the postCreateCommand being su - devilbox in the devcontainer.json file or, like you say, add user: "devilbox" to the docker-compose.override.yml file, right?
I guess I am close to making much, much better use of Devilbox, with your kind help. Thank you!
Great to see you got it working! 😄 I do recommend making a .devcontainer folder with devcontainer.json inside each of your projects instead of inside /media/user/d/WWW/, since sometimes you may need different devcontainer.json settings for specific projects. I like to make a clear separation there. Like so:
/media/user/d/WWW/project1/.devcontainer/devcontainer.json
/media/user/d/WWW/project2/.devcontainer/devcontainer.json
But using your whole /media/user/d/WWW/ as the workspace can come in handy too sometimes, I guess. Maybe it's just a matter of taste haha.
So this means I can have the local host clean and do the work in the container while also having the project files on the local host. That is quite neat in fact if I understand all that right.
Spot on. That is exactly what it means. No dependencies on your host OS (node or PHP etc.). 😃
About interpreting
Did you get that notification when reopening in the container? For me it resolves the executablePath out of the box, I guess; I may be wrong. Will look into it Monday.
About extensions
These will be installed in VS Code I assume, right? So this means I could also just install them normally, without running the Dev Container, but only once I have told VS Code where the PHP executable path is, no?
The extensions that have dependencies like PHP will get installed on the remote side, which is the container in our case. Extensions that do not require any special dependencies live on the local side. See the following image. Thereby you don't have to set the PHP executable path; as stated before, it should resolve itself because PHP is in the default installation directory inside of Devilbox.

Or would I need to add all this to the devcontainer.json file in the extensions key?
Only the ones that you want to have installed automatically once you open the workspace in the container. You should play with it; you'll notice that you're also able to install extensions afterwards by simply going to the extensions menu, where you'll see options like Install local and Install remotely. Read more about it here
About the user
Yes, you might be able to do so. I settled with using root, although I know it's bad. I haven't made a proof-of-concept yet where it works as intended. Sad. 😢
Please let me know if you find a way to login as devilbox.
I guess I am close to making much, much better use of Devilbox, with your kind help. Thank you!
That is exactly how I felt when I started experimenting with this a couple of months ago. At our organization we're all very happy (I suppose) with this new way of working; we're able to do everything from within our IDE now. The developers using PHPStorm have also found a way to do so.
I do recommend to make a .devcontainer folder with devcontainer.json inside each of your projects. I like to make a clear separation there.
Ok yeah, I see what you mean. Also, I think by default I am "abusing" the workspace setting of VS Code; I think the workspace is meant for just one project. But yeah, when requirements change I can adjust this.
Did you get that notification when reopening in the container?
No, not any more. So that is good, very good in fact. Sorted.
The extensions that have dependencies like PHP will get installed on the remote side, which is the container in our case. Extensions that do not require any special dependencies live on the local side. See here https://code.visualstudio.com/docs/remote/containers#_managing-extensions
Perfect and thank you for the link and image. Fully understand now.
Please let me know if you find a way to login as devilbox.
Well, I tried with the postCreateCommand but that is not being issued when I open the terminal for some reason. What does work though is simply executing this command once you are in the shell. So just su - devilbox being inside the project that you like. That will change you to devilbox as user. So that could be a way, albeit not automatically. So if you have other postCreateCommands that run automatically they will most likely run as root.
I also tried with docker-compose.override.yml but quickly saw that I would need to replicate most of the docker-compose.yml settings in it, and I would not know where to add the user: "devilbox" line. That is still something to look into.
Bottom line. Great. Now we can use Devilbox with VS Code in a Dev Container and benefit from all the tools/services running in the containers while keeping the host OS clean. Have a good start to the week and thank you indeed for your contribution @boumanb and of course @cytopia for the Devilbox.
@boumanb When I am in the Dev Container and install the extensions into the container the next time I start up Devilbox the extensions are gone. Do you know why this happens?
So I am in the Dev Container and in the extensions selection I install these two for example.
As you can see I install them from inside the "local - installed" panel. There are none in the "Dev Container" panel.
Once I have installed the extensions and they run in the Dev Container and I finished the work for the day I shut down the Devilbox.
For that I use docker-compose stop and then docker-compose rm -f.
Is it perhaps possible that the extensions are removed because I run docker-compose rm -f at the end?
In your example devcontainer.json above you mention:
Add the IDs of any extensions you want installed in the array below.
"extensions": [
    "eamodio.gitlens",
    "felixfbecker.php-debug",
    "felixfbecker.php-intellisense",
    "fterrag.vscode-php-cs-fixer",
    "atlassian.atlascode"
],
Does this list need to be in the devcontainer.json at all times and do the extensions need to be re-installed every time the Dev Container is started or does it have to do with the docker-compose rm -f command?
First of all, great questions!
When I am in the Dev Container and install the extensions into the container the next time I start up Devilbox the extensions are gone. Do you know why this happens?
As shown in the diagram in my last reply, VS Code sets up a server on the remote OS and installs the extensions there (the extensions that need an interpreter). When you shut down Devilbox you also remove the installed VS Code server along with the extensions. The next time you bring up Devilbox and open a project leveraging the devcontainer, VS Code needs to reinstall the VS Code server along with the extensions.
For that I use docker-compose stop and then docker-compose rm -f.
I think docker-compose down combines those two (stop and remove containers + networks).
Is it perhaps possible that the extensions are removed because I run docker-compose rm -f at the end?
Sure. The VS Code remote server along with the extensions gets removed by this I guess.
Does this list need to be in the devcontainer.json at all times
No it does not actually. I ended up putting them in the VS Code settings.json (global settings) like so:
"remote.containers.defaultExtensions": [
    "eamodio.gitlens",
    "felixfbecker.php-debug",
    "felixfbecker.php-intellisense",
    "fterrag.vscode-php-cs-fixer",
    "atlassian.atlascode"
],
and do the extensions need to be re-installed every time the Dev Container is started or does it have to do with the docker-compose rm -f command?
I think we've covered this already. So yes, I guess every time we remove the containers, the VS Code remote server along with the extensions gets removed. Using only docker-compose stop and docker-compose start may result in different behavior, but I haven't tried.
@boumanb All good. So far I now have this in the VS Code workspace settings.
"remote.containers.defaultExtensions": [
    "formulahendry.auto-rename-tag",
    "streetsidesoftware.code-spell-checker",
    "oderwat.indent-rainbow",
    "sirtori.indenticator",
    "bmewburn.vscode-intelephense-client",
    "esbenp.prettier-vscode"
],
When I simply stop and start Devilbox without removing the containers, the extensions are there. It all starts up super fast. Now I'm exploring which useful extensions I can use for PHP and JS development. Quite happy it all works out with Devilbox, being able to use the VS Code terminal and keep the host clean and fast. Thank you.
@cytopia @robots4life
About changing the user: while reading the VS Code documentation today I ran into the solution for specifying the user you want to use inside the container. It's so simple; I probably read over it a thousand times. All you have to do is put the following into the devcontainer.json: "remoteUser": "devilbox". Then restart VS Code, and when starting a terminal inside VS Code it should be logged in as the user devilbox.
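Putting it together, a devcontainer.json with the non-root user might look like this (a sketch based on the earlier example; the compose path and project folder are placeholders for your own setup):

```json
{
    "name": "Existing Docker Compose (Extend)",
    "dockerComposeFile": ["../../../../docker-compose.yml"],
    "service": "php",
    "workspaceFolder": "/shared/httpd/<project-name>",
    "shutdownAction": "none",
    "remoteUser": "devilbox"
}
```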
@boumanb wow, great news, just saw this. Changed the file and it works! Thank you. :heart:
To confirm, we are advised to set the remote user to "devilbox", right? :question: Because the rest of the Devilbox runs with that user, so that we can use all the tools, right?
I was doing some automated WP setup script and ran it under root. WP CLI was giving me a warning that I can then control all files on the server and that is dangerous. :scream:
So this "user" that we set has in fact nothing to do with app accounts, but with the underlying file system, right? So what implications does that have on the app we are working on? What are solid patterns for the "user" at this level when working with Devilbox?
I mainly do PHP work with Devilbox, sometimes also Node stuff. So far I did not have to change the user or rights when uploading to the host, perhaps they change file permissions automatically. So yeah, what is that "devilbox" user about?
To confirm, we are advised to set the remote user to "devilbox", right?
Correct. The general advice (good practice) on Unix-like systems is to do your work as a non-root user. We elevate to root privileges when needed by prepending sudo to our commands. For example:
$ reboot
warning: must be root!
$ sudo reboot
I was doing some automated WP setup script and ran it under root. WP CLI was giving me a warning that I can then control all files on the server and that is dangerous.
Yes, running that script as root means that it runs with root privileges, which is dangerous (though not always). One could add malicious commands such as sudo rm -rf / --no-preserve-root, which deletes all your files.
So this "user" that we set has in fact nothing to do with app accounts, but with the underlying file system, right?
It affects how you interact with the system as a whole, not solely the filesystem.
So what implications does that have on the app we are working on? What are solid patterns for the "user" at this level when working with Devilbox?
Use devilbox and elevate using sudo when needed.
@boumanb Thank you for your explanation.
Use devilbox and elevate using sudo when needed.
The user on the local machine is devilbox. I understand that.
So when I do the work locally in the remote container with Devilbox under that user all is fine.
Now here it comes. When I then upload the files to the live server, e.g. on shared hosting where the hosting company already creates users and sets their permissions, will those files hold any reference in their file attributes, or somewhere else in the file, saying that they were created or edited by the devilbox user? Or will the files be, well, just files, and the hosting company has to take care of the user and the user permissions for those files?
I am asking, since when I work locally, I could also work as root since I am the only one working on this machine. No, I am not doing that of course, but this would help me understand better.
Meaning when working as root locally, e.g. with the WP CLI script, I could theoretically disregard that message, because when those PHP, HTML, CSS, JS etc. files are uploaded to the live server, the hosting company has to create the users and set the user permissions there, right?
The actual uploaded files will not hold a reference to the local root user and therefore won't be dangerous on the live server. The files are just files, and depending on what system they reside on and what user and user permissions are set on that system, those files will be dangerous or not. Is this correct? I mean, it would be nuts if users and user permissions could be passed along with a simple file.
https://wordpress.org/support/article/changing-file-permissions/ https://stackoverflow.com/questions/18352682/correct-file-permissions-for-wordpress
For example when I FTP into a shared hosting with WordPress on it, it looks like this by default.

Long story short, the user and user permissions are not passed with the files from the system they came from but are always set on the system where the files currently are. Correct?
In general, when you FTP files to a different machine, the owner will be set to your user account on that server.
If you want to confirm that, you can either see if there is a right click -> permissions option in your ftp client, or contact your hosting provider.
@science695 Perfect, thank you.
The developers using PHPStorm have also found a way to do so.
@boumanb Any chance you could get those developers to share a few bullet points about their setup? I don't see any information on Google about this.
@weitzman Ha. Just yesterday I looked into PHPStorm because I am going through the Symfony tutorials, and using PHPStorm with that seems to be a real pleasure, doing a lot of things automatically. Having a quick look around I found https://blog.jetbrains.com/phpstorm/2018/08/quickstart-with-docker-in-phpstorm/ and https://medium.com/docker-captain/dockerized-php-development-with-phpstorm-f5d69fec133. Would that help any of us in case we have PHPStorm? I am asking since I am contemplating getting PHPStorm. :thinking:
@boumanb How are you doing? Hope you're fine! Things here are busy, learning Hapi.js and Symfony. So yeah, if you happen to have a link to a tutorial that does indeed make PHPStorm work with Devilbox, you know what to do. Happy greetings to ya. :smile:
@weitzman I showed this thread to my colleague and he is willing to share information about setting up PHPStorm to work with Devilbox. He told me he was going to try to respond tomorrow.
@robots4life I'm doing fine! How about you? Good to see you're learning new things. Regarding the tut, the colleague I was talking about told me he was planning on writing a blog post about it. Anyhow, knowing him, the information he'll supply you with in this thread is probably sufficient to get you up and running.
@cytopia I guess quite some people would like to see something about this in the documentation. I'll probably make a PR to add some documentation to intermediate/work-inside-the-php-container regarding configuring VS Code and PHPStorm to enable developers to work inside of the container from within their IDE.
Hi, I'm The Colleague that @boumanb mentioned. These are the steps to get Devilbox integration in PHPStorm. I use Linux, but I think these settings should also work on Windows.
Basic stuff
First, go to your .env file (located in your devilbox directory) and add the following to the bottom:
PHP_IDE_CONFIG=serverName=devilbox
Save and restart devilbox.
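As a sketch, the two steps above from inside the Devilbox directory (the restart is left commented out so you can run it when it suits you):

```shell
# Append the serverName that PHPStorm's server entry must match to
# Devilbox's .env. Assumes the current directory is the Devilbox checkout.
echo 'PHP_IDE_CONFIG=serverName=devilbox' >> .env

# Then restart Devilbox so the variable reaches the php container, e.g.:
#   docker-compose restart
```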
Next, open your project in PHPStorm and go to Languages & Frameworks > PHP in the settings. Click on the "..." next to "CLI Interpreter". In the popup window, click the plus sign in the upper left corner and choose "From Docker, Vagrant [..]".
In the next popup window, choose "Docker" at the radio buttons section at the top and click on the "New.." button. Give this server a name and choose "Unix socket". If everything went well you should see a "Connection successful" message at the bottom. Now click on OK to close the second popup window.
In the first popup window (named "Configure Remote PHP Interpreter") choose the "devilbox/php-fpm:
If you click on the reload icon next to the "PHP executable" field, you should get a message below it stating your PHP version. Now give this interpreter a name at the top and optionally uncheck the box next to "Visible only for this project" to make this interpreter available to other projects on your computer. Click OK to close the window.
Now click on the folder icon next to "Docker container". In the popup window that appears, set "Network mode" to "devilbox_app_net" and uncheck the "Disable network" and "Publish all ports" boxes if they're checked. Now in the "Volume bindings" section you should have an entry. Click that entry and then click on the pencil on the right to edit the entry.
In the popup window that appears, set the "Container path" to the path where your project is located in Devilbox (for example: /shared/httpd/my_project) and set the "Host path" to the location of your project on Windows/Linux (for example: C:\devilbox\data\www\my_project). Uncheck the "Read only" box if it's checked, click the OK button to close the popup window, and do the same for the "Edit Docker Container Settings" window.
Now go to Languages & Frameworks > PHP > Servers and click on the plus sign in the top left. For the name enter "devilbox" (this is not something you can choose; it has to match the serverName from the .env file). Uncheck the box next to "Shared" because these settings will be different for every project. The "host" is the URL to your project without the "www" and "https", so for example "my_project.local". The port should be "80" and the debugger "Xdebug". Next, check the box with the label "Use path mappings". In the section below that checkbox you will see a collapsible part called "Project files". Click on the path in the "Absolute path on the server" column and enter the path to your project as it is in Devilbox, for example "/shared/httpd/my_project". Now click the Apply button.
PHPUnit
Go to the Languages & Frameworks > PHP > Test Frameworks setting. Click on the plus sign to add a new framework. Choose "PHPUnit by remote interpreter".
In the new popup window choose the interpreter we created earlier and click OK to close the popup window. Now make sure that the "Path mappings" field is correct. For the "Docker container" field enter the following:
--net="devilbox_app_net" -v <Windows path to your project>:<Devilbox path to your project>
For example: --net="devilbox_app_net" -v C:\devilbox\data\www\my_project:/shared/httpd/my_project.
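These are plain docker run options, so you can sanity-check the string you are about to paste by assembling it in a shell first (a sketch; the two paths are example values, not anything PHPStorm or Devilbox mandates):

```shell
#!/usr/bin/env bash
# Assemble the flag string for PHPStorm's "Docker container" field.
# HOST_PATH and CONTAINER_PATH are example values; adjust them to your project.
HOST_PATH='C:\devilbox\data\www\my_project'
CONTAINER_PATH='/shared/httpd/my_project'
FLAGS="--net=\"devilbox_app_net\" -v ${HOST_PATH}:${CONTAINER_PATH}"
echo "$FLAGS"
```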
Click the "Path to phpunit.phar" radio button and enter the path to phpunit in Devilbox. You can find this path by starting an SSH session to Devilbox and entering the command which phpunit. Leave the "Path to script", "Default configuration file" and "Default bootstrap file" fields empty unless you need them. If, like me, you need a bootstrap file, be aware that you need to enter the Devilbox container path, not the Windows path.
Click on the refresh button and you should now see a "PHPUnit version: …" message.
Xdebug
Go to the Languages & Frameworks > PHP > Debug setting and to the Xdebug section. Check all the checkboxes in this section and set the "debug port" to 9000 (no not over 9000, just 9000). Save with the Apply button at the bottom. Note: There is a link to validate your Xdebug settings at the top, but this isn't working for me. Xdebug works fine in PHPStorm though. You can test it by placing a breakpoint in your code by clicking in the gutter, next to the line number. A red dot should appear and it should halt on that line when you open the page in your browser.
PHPCS
Go to the Languages & Frameworks > PHP > Quality Tools setting, expand the PHP_CodeSniffer section and choose your interpreter in the "Configuration" dropdown. Now click the "..." button next to it.
In the popup window that appears, set the "PHP_CodeSniffer path" to phpcs and click the "Validate" button. A message should appear at the bottom of the popup window indicating that it found PHPCS. Click OK to close the popup window.
PHP Mess detector
Go to the Languages & Frameworks > PHP > Quality Tools setting, expand the Mess Detector section and choose your interpreter in the "Configuration" dropdown. Now click the "..." button next to it.
In the popup window that appears, set the "PHP Mess Detector path" to phpmd and click the "Validate" button. A message should appear at the bottom of the popup window indicating that it found PHPMD. Click OK to close the popup window.
PHP CS Fixer
Go to the Languages & Frameworks > PHP > Quality Tools setting, expand the PHP CS Fixer section and choose your interpreter in the "Configuration" dropdown. Now click the "..." button next to it.
In the popup window that appears, set the "PHP CS Fixer path" to php-cs-fixer and click the "Validate" button. A message should appear at the bottom of the popup window indicating that it found PHP CS Fixer. Click OK to close the popup window.
Thanks everyone for gathering this info together.
Thanks for the detailed post @insyht.
I'm specifically interested in using PHPStorm + VS Code Server. Is that what you are using? I'm using terminology per this diagram

I don't know if that's possible. What functionality of VS Code Server do you want to use in PHPStorm?
@insyht Man, excellent post up there about PHPStorm, thank you heaps! :+1: :zap: :rocket:
Hi @robots4life, I am using the new VS Code Remote Development to accomplish this. Make sure you have one of the following extensions installed
ms-vscode-remote.vscode-remote-extensionpack (a package containing remote development for SSH, Containers and WSL) or ms-vscode-remote.remote-containers (for remote development in containers).

After installing these extensions all you need is a devcontainer.json file in a .devcontainer folder inside of your workspace: <project>/.devcontainer/devcontainer.json. In case of Devilbox the file should look like the following:

```json
// See https://aka.ms/vscode-remote/devcontainer.json for format details.
{
    "name": "Existing Docker Compose (Extend)",
    // Update the 'dockerComposeFile' list if you have more compose files or use different names.
    // The .devcontainer/docker-compose.yml file contains any overrides you need/want to make.
    "dockerComposeFile": [
        "../../../../docker-compose.yml" // Point to Devilbox's docker-compose.yml
    ],
    // The 'service' property is the name of the service for the container that VS Code should
    // use. Update this value and .devcontainer/docker-compose.yml to the real service name.
    "service": "php", // Name of the service we want to remote to
    // The optional 'workspaceFolder' property is the path VS Code should open by default when
    // connected. This is typically a file mount in .devcontainer/docker-compose.yml
    "workspaceFolder": "/shared/httpd/<project-name>", // For example: /shared/httpd/my-website
    // Uncomment the next line if you want to keep your containers running after VS Code shuts down.
    // "shutdownAction": "none",
    // Uncomment the next line if you want to add in default container specific settings.json values
    // "settings": { "workbench.colorTheme": "Quiet Light" },
    // Uncomment the next line to run commands after the container is created - for example installing git.
    // "postCreateCommand": "apt-get update && apt-get install -y git",
    // Add the IDs of any extensions you want installed in the array below.
    "extensions": [
        "eamodio.gitlens",
        "felixfbecker.php-debug",
        "felixfbecker.php-intellisense",
        "fterrag.vscode-php-cs-fixer",
        "atlassian.atlascode"
    ]
}
```

This approach needs Devilbox to be up and running beforehand.
When the extensions are installed AND the devcontainer.json file is present in the appropriate folder, VS Code will notify you (upon reopening the workspace) that it detected the file and that you can reopen the project inside of the container.
After reopening the project inside of the container you're also able to fire commands inside of the container straight away from VS Code's built-in terminal, for example composer install etc.
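If you want such commands to run automatically whenever the container is (re)created, the postCreateCommand hook that is commented out in the devcontainer.json above can be used; a minimal sketch (the project path is an example):

```json
{
    "postCreateCommand": "composer install --working-dir=/shared/httpd/my-website"
}
```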
Works like a charm for me!

@CoffeeQuotes yeah indeed, not only does Devilbox rock, but this setup is really just so comfy and ace. It does not fill up the machine with unneeded data, and if you want a fresh start you can just clean the containers out. If you work with Node and write JS apps have a look at https://github.com/cytopia/devilbox/issues/709, in that thread are configs for Gulp and Rollup.
I just tried the configuration quoted by @CoffeeQuotes and it is working, but it is so CPU-intensive that I had to drop it... my workstation config is very good but somehow it seems to be preparing for launching like a rocket...
@masiorama a) Are you doing exactly what is mentioned here ? https://github.com/cytopia/devilbox/issues/628#issuecomment-558044181
b) What are you doing with the setup, what are you working on?
c) What are the specs of your box?
Mine are these, a laptop, and everything is butter smooth. I do not have Node, PHP or anything installed locally, just in containers; it works excellently and fast, and the box does not even hum when coding.

@robots4life yup, followed the steps, checked a few times to be sure. Indeed it is working, but it is somehow too CPU-intensive. My notebook's specs are: OS: Windows 10 Pro, CPU: Intel(R) Core(TM) i7-8700K @ 3.70GHz, RAM: 32.0 GB. I'm gonna try again in the next days, anyway. Thanks!
@masiorama Please do check https://github.com/docker/for-win/issues/1936 and https://www.reddit.com/r/docker/comments/87uinr/docker_is_very_slow_on_windows_10/ and https://nickjanetakis.com/blog/setting-up-docker-for-windows-and-wsl-to-work-flawlessly, perhaps that might help.
Since I have the data I work with stored locally and mounted into the containers, I can remove and clean all containers on a regular basis, or whenever I need to, to keep things spotless and fast. This applies to the containers, not the images.
docker-compose stop
docker-compose rm -f
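The two commands can be wrapped in a tiny helper so you don't run them by accident (a sketch; the dry-run default is my own convention, not a Devilbox feature):

```shell
#!/usr/bin/env bash
# cleanup.sh - stop and remove the Devilbox containers.
# Images and the host-mounted data survive; only the containers go away.
# Defaults to a dry run that only prints the commands; set DRY_RUN=0 to execute.
DRY_RUN="${DRY_RUN:-1}"
run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "$@"
  else
    "$@"
  fi
}
run docker-compose stop
run docker-compose rm -f
```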
Docker for Windows is basically running a Linux VM to run the Linux Docker containers.
Hope this is somewhat helpful. If you find the cause and it has to do with Devilbox, please do let us know, thank you. :+1:
A note on using php-cs-fixer in VSCode in a remote Devilbox container: php-cs-fixer is listed as an available tool at https://devilbox.readthedocs.io/en/latest/readings/available-tools.html#available-tools, though it seems it is not actually installed.
Using the VSCode extension for php-cs-fixer https://github.com/junstyle/vscode-php-cs-fixer these steps are required to make things work in VSCode with the remote Devilbox container.
I have a local folder mounted to the container. For me this is simply /www/phptools/. In the remote container this resolves to /shared/httpd/phptools/.
Though for you this points to anything you set as value for HOST_PATH_HTTPD_DATADIR in your .env file.
Locally I took these steps to make php-cs-fixer available in the container. Inside the /www/phptools/ folder, or whatever folder you have pointed HOST_PATH_HTTPD_DATADIR to in the .env file, do this.
https://github.com/FriendsOfPHP/PHP-CS-Fixer/blob/2.18/doc/installation.rst
wget https://cs.symfony.com/download/php-cs-fixer-v2.phar -O php-cs-fixer
sudo chmod a+x php-cs-fixer
In the settings for the PHP CS Fixer extension (for Remote and Workspace), now simply put /shared/httpd/phptools/php-cs-fixer as the value for the executable path; the error will go away and the extension will work.
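In settings.json terms that is a single entry (a sketch; php-cs-fixer.executablePath is the key used by the junstyle extension, and the path is the example mount from above):

```json
{
    "php-cs-fixer.executablePath": "/shared/httpd/phptools/php-cs-fixer"
}
```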

Since I regularly remove stopped containers I opted not to add php-cs-fixer to the $PATH in the Devilbox container. However you are free to use other installation options and/or modify the Devilbox $PATHs.
I had a look in the documentation and have not found a setting to persist changes to the Devilbox $PATH across removal of stopped containers. Is there something I could do so that paths added to $PATH in Devilbox survive removing stopped containers?
Ideally in the Devilbox shell I would want to run
sudo cp /shared/httpd/phptools/php-cs-fixer /usr/local/bin/php-cs-fixer
and then
(https://unix.stackexchange.com/a/26059)
PATH=$PATH:/usr/local/bin
(note that $PATH entries are directories, so you add /usr/local/bin, not the file itself; on most images /usr/local/bin is already on $PATH)
and keep those changes and then be able to just point the executable path in the remote and workspace extension settings to php-cs-fixer.
@CamdenGonzalez aight, I will give it a shot. It took me some time to figure out that php-cs-fixer has to be installed; I was under the assumption it is available, as in, ready to use. Then I checked out phpcs and phpcbf and the VSCode extensions for them, but had to do some work so I stopped "tooling". But since there is demand for this I will try and get this rocking.. :guitar:
@CamdenGonzalez have you got a way to meet in chat, I have a feeling that could benefit both of us, the long term plan is to add documentation for how to use the available tools in a remote container in VSCode.
If chat is not an option, can you elaborate on how you, if I understand you correctly, got Xdebug working over the browser and phpcs / php-cs-fixer working locally?
edit: which php-cs-fixer in the shell does not show me anything for example when doing a fresh start on the Devilbox, so wonder why you have a full path there while I get no output. However which phpcs does show me the same path as you have it.
Also, under the devilbox path I just added this phptools.sh bash file that gets php-cs-fixer working with the https://github.com/junstyle/vscode-php-cs-fixer extension. Feel free to change this as you like, e.g. by including a check for the date of the last php-cs-fixer-v2.phar download and only renewing it every so often. It is just a basic example of how I currently set this up.
#! /usr/bin/env bash
# VSCode remote container - php-cs-fixer
# 1. go to /shared/httpd/phptools
cd /shared/httpd/phptools
# 2. remove the current version and download the latest version
rm -rf php-cs-fixer
# 3. download php-cs-fixer into /shared/httpd/phptools
wget https://cs.symfony.com/download/php-cs-fixer-v2.phar -O php-cs-fixer
# 4. give executable permission to php-cs-fixer
sudo chmod a+x php-cs-fixer
# 5. copy the executable php-cs-fixer to the /usr/local/bin/ directory
sudo cp /shared/httpd/phptools/php-cs-fixer /usr/local/bin/php-cs-fixer
# 6. ensure /usr/local/bin is on Devilbox's $PATH ($PATH entries are directories, not files)
PATH=$PATH:/usr/local/bin
# 7. return to /shared/httpd
cd /shared/httpd
# Install PHP coding standards ..
Concerning phpcs and phpcbf I am looking at the following.
- https://github.com/valeryan/vscode-phpsab
- https://github.com/WordPress/WordPress-Coding-Standards
- https://github.com/tommcfarlin/phpcs-wpcs-vscode
WordPress is not the focus here, though figuring out how to get rules, fixing and beautifying going with that standard might also help with the other standards.
How to get SvelteKit working with Devilbox. https://github.com/cytopia/devilbox/issues/797#issuecomment-843941126
Thank you for all this information! (Especially the detailed information from @boumanb helped me.)
Possibly this information would also be suitable for the devilbox documentation like: https://devilbox.readthedocs.io/en/latest/intermediate/configure-php-xdebug.html (even if it is not a devilbox specific topic)
I will try in any case to take up this approach in our Contao documentation or to extend it ...
I will try in any case to take up this approach in our Contao documentation or to extend it ...
Cool. Let me know when it's in and I can replicate it on the Devilbox documentation as well
Cool. Let me know when it's in and I can replicate it on the Devilbox documentation as well
Check. I will summarize the info and create a PR for the Contao doc. As soon as this is accepted I will let you know...
So I don't forget: See: https://github.com/contao/docs/issues/790
Unfortunately, I still have a question. I had used VSCode with Xdebug, with this "PHP Debug" extension installed locally in VSCode. Together with the information from the devilbox documentation it worked.
Now I have installed the "Visual Studio Code Remote Development" pack as described here. It works so far. In the VSCode terminal I am inside the devilbox php container with the user "devilbox". Also e.g. "Intelephense" via this extension works.
But: Now I have to install in VSCode, analogous to e.g. "Intelephense", also the "PHP Debug" extension in the container itself. Now unfortunately it does not work anymore. I have tried various entries in the "devilbox\cfg\php-ini-7.4\xdebug.ini" and ".vscode\launch.json".
- Windows 10 Pro / WSL 2 / current devilbox version (with PHP 7.4).
- /shared/httpd/demo
- devilbox/data/www/demo/htdocs
- https://demo.loc
Could anyone provide a working configuration/help for Xdebug usage in this environment? Thanks a lot
Addition: Here is my working configuration when the VSCode "PHP debug" extension is installed locally within VSCode (analogous to the devilbox documentation):
\devilbox\cfg\php-ini-7.4\xdebug.ini
xdebug.mode = debug
xdebug.start_with_request = yes
xdebug.remote_handler = dbgp
xdebug.remote_port = 9000
xdebug.remote_connect_back = 0
;from vEthernet (WSL):
xdebug.client_host = 172.19.112.1
xdebug.idekey = VSCODE
xdebug.remote_log = /var/log/php/xdebug.log
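Side note on this file: it mixes Xdebug 2 directives (the xdebug.remote_* ones) with Xdebug 3 directives (xdebug.mode, xdebug.start_with_request, xdebug.client_host); Xdebug 3 ignores the remote_* settings. If your PHP container ships Xdebug 3, an equivalent sketch using only Xdebug 3 names would be (the client host IP is the example value from above):

```ini
xdebug.mode = debug
xdebug.start_with_request = yes
xdebug.client_host = 172.19.112.1
xdebug.client_port = 9000
xdebug.idekey = VSCODE
xdebug.log = /var/log/php/xdebug.log
```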
\devilbox\data\www\demo\.vscode\launch.json
{
"version": "0.2.0",
"configurations": [
{
"name": "Listen for Xdebug",
"type": "php",
"request": "launch",
"port": 9000,
"pathMappings": {"/shared/httpd/demo/htdocs": "${workspaceRoot}/htdocs"}
}
]
}
Hint: When using Docker via WSL 2, the entry for "xdebug.client_host" corresponds to the IPv4 address of "vEthernet (WSL)" (see ipconfig).
When using Docker via VM (Windows 10 Pro / Hyper-V), it corresponds to the IPv4 address of "vEthernet (Default Switch)" (see ipconfig).
Now I need help with the procedure when using "Visual Studio Code Remote Development": VSCode asks me to relaunch according to this:
\devilbox\data\www\demo\.devcontainer\devcontainer.json
{
"name": "Existing Docker Compose (Extend)",
"dockerComposeFile": [
"../../../../docker-compose.yml"
],
"service": "php",
"workspaceFolder": "/shared/httpd/demo",
"remoteUser": "devilbox",
}
As already mentioned it works then regarding using VSCode terminal (inside the devilbox php container with the user "devilbox") or using the "Intelephense" extension for example.
The existing extension "PHP Debug" is now grayed out with the option to install it within the PHP container. After installation, I am now trying to set a breakpoint and debug without any other changes:
Result: Error: listen EADDRINUSE: address already in use :::9000
I then tried different entries in "xdebug.ini" and "launch.json". Unfortunately without further success. Grateful for your suggestions ...
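For what it's worth, one plausible cause (an assumption, not verified against this exact setup): when "PHP Debug" runs inside the PHP container, port 9000 is typically already taken by PHP-FPM itself, so the debug adapter cannot listen on it. A sketch that moves the debug session to port 9003 instead; it requires setting xdebug.client_port = 9003 and, since Xdebug now connects back into the same container, xdebug.client_host = 127.0.0.1 in xdebug.ini:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Listen for Xdebug (inside container)",
            "type": "php",
            "request": "launch",
            "port": 9003,
            "pathMappings": { "/shared/httpd/demo/htdocs": "${workspaceFolder}/htdocs" }
        }
    ]
}
```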
I am stuck as @fkaminski is.
From my previous comments I totally switched to WSL2 (ubuntu + docker), hoping for the best.
I am trying to make things to work, but still no success.
Plus, I am unhappy with devcontainer.json bringing up all the services; at the moment I haven't found the correct way to bring up only a few selected services (like you can via docker-compose up php httpd).
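For what it's worth, the devcontainer.json format does have a runServices property that limits which Compose services are started; a sketch based on the file above (I have not verified this against Devilbox specifically):

```json
{
    "name": "Existing Docker Compose (Extend)",
    "dockerComposeFile": ["../../../../docker-compose.yml"],
    "service": "php",
    "runServices": ["php", "httpd"],
    "workspaceFolder": "/shared/httpd/demo"
}
```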
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Does anyone know how to install Vercel, Netlify and the GitHub CLI in Devilbox so they work from VS Code and stay installed between shutdown and startup (not when removing containers)?
https://devilbox.readthedocs.io/en/latest/autostart/custom-scripts-globally.html#custom-scripts-globally
In /devilbox/autostart I have created a file clis.sh with following content.
#! /usr/bin/env bash
# run this script
# chmod +x clis.sh
# ./clis.sh
npm install vercel -g
npm install netlify-cli -g
Remember to chmod +x clis.sh inside /devilbox/autostart so the script can be executed.
This will install the Vercel and Netlify CLIs globally in Devilbox and persist the installation as long as the clis.sh file is kept inside /devilbox/autostart.
Per docs, i.e.
# Install grunt as devilbox user
# su -c "npm install grunt" -l devilbox
asks for a password that I don't know and that is not the host password.
But the CLIs are installed as the devilbox user since in devcontainer.json I have the line "remoteUser": "devilbox",.
https://github.com/cli/cli/blob/trunk/docs/install_linux.md
The GitHub CLI has a different approach to install and since git is pretty much the only tool I have on the host I am happy installing the GitHub CLI on the host as well.