Document/improve migration options
I noticed this in the docs: "The migrations are supported while working in local environment only."
It might be helpful to document alternatives so that people can run their migrations. Perhaps there's an opportunity to include a special migration task in the library? Then people could trigger or queue the job on deploy.
The ability to trigger or queue a job on deploy looks a little dangerous to me from a security point of view. Anyway, I would appreciate it if we could discuss the matter further.
As I see it, in order to run the migrations we have to deploy them first and then have something to trigger them. Please describe the approach you have in mind and maybe we can analyze and implement it.
I think Artisan::call('migrate') would do the job, but we need to run it only once and make sure no one will be able to trigger it again.
Doesn't GAE set a specific header when jobs are run as tasks?
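It does: App Engine strips the X-AppEngine-* headers from external traffic, so their presence means the request really came from GAE infrastructure (the Task Queue sets X-AppEngine-QueueName on push tasks, and the Cron Service sets X-Appengine-Cron). A minimal check could look like this (the helper name is just for illustration, not part of the library):

```php
<?php
// App Engine strips these headers from external requests, so their
// presence means the request originated from GAE itself.
// Illustrative helper; pass $_SERVER (or a test array) to it.
function isGaeInternalRequest(array $server): bool
{
    return isset($server['HTTP_X_APPENGINE_QUEUENAME']) // Task Queue push task
        || isset($server['HTTP_X_APPENGINE_CRON']);     // Cron Service request
}
```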
Looks like a good solution to me. If I understand you correctly, you suggest using a cron job to trigger the migrations. Application URLs can be secured for cron, so security should not be a problem.
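For completeness, the cron side would be a small cron.yaml entry along these lines (the URL and schedule are just examples; Laravel's migrate command only applies pending migrations, so repeated runs are harmless):

```yaml
cron:
- description: run pending database migrations
  url: /artisan/migrate
  schedule: every 24 hours
```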
The next issue to solve is how to collect the results of the command and show them to the operator/admin. I presume we can collect the results, although it is not entirely straightforward, because Artisan::call('migrate') does not return the command's output.
One possible option for presenting the results would be sending an email to the app admin.
The output of the last artisan command can be retrieved using Artisan::output().
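Putting the two together, a sketch of the handler could look like this, assuming Laravel 5's facades (the route path and recipient address are placeholders, not part of the library):

```php
<?php
// Illustrative route handler: run migrations, capture the output,
// and mail it to the admin. The route path and email address are
// placeholders; secure the route via app.yaml before using anything
// like this.
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Mail;
use Illuminate\Support\Facades\Route;

Route::get('artisan/migrate', function () {
    Artisan::call('migrate', ['--force' => true]); // --force skips the production confirmation prompt
    $output = Artisan::output();                   // text the command printed

    Mail::raw($output, function ($message) {
        $message->to('admin@example.com')->subject('Migration results');
    });

    return $output;
});
```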
The only other thing I would suggest is to ensure that the migrations can be triggered on demand from a build server. Not necessarily directly, but via the gcloud tools running on the build server.
That way people can do deploys and then migrations in one swing.
After some thinking I'm going to take a different path.
Since GAE supports 'protected' URLs, I added the following lines to my app.yaml:
- url: /artisan
  script: public/index.php
  login: admin
The URL is handled by a controller capable of running artisan commands and returning their results. I also added a simple view containing a form with an input field for an artisan command and a textarea for its results. Let me know if you are interested in trying it out or if you have any other suggestions.
I spent some time thinking about this on the weekend and I think I agree with your approach. I'm guessing this will be much better than trying to slip into the task queue.
Once authenticated, a client can simply hit that endpoint to run commands (like from a build server).
To that end, I think it would be a good idea to either document or provide resources and best/common practices for:
- Authenticating a headless client like a build server
- Issuing authenticated requests (cURL, Jenkins) using the credentials
- Parameters for the endpoint
Even if the above three are just a wiki page linked from readme.md, I think it would be quite valuable. I suspect many people, especially those getting started, won't be fully versed in App Engine conventions, and those coming from a "pure" PHP/Laravel environment are almost guaranteed to struggle initially.
Ok, now it is time to do some work. I'll send updates on my progress.
I'm going to be creating the following for my project. The api prefix is just a convention I'm using, but the task/* part might be nice to encourage.
- url: /api/task/*
  script: public/index.php
  login: admin
That will give me the route /api/task/migrate for triggering my migrations.
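On the Laravel side, the matching routes could be grouped under the same prefix; a sketch, with the handler kept minimal (the names here are mine, not the library's):

```php
<?php
// Every route under api/task/* falls under the app.yaml 'login: admin'
// handler above, so GAE authenticates the request before Laravel sees it.
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Route;

Route::group(['prefix' => 'api/task'], function () {
    Route::post('migrate', function () {
        Artisan::call('migrate', ['--force' => true]);
        return Artisan::output(); // return the command's output to the caller
    });
});
```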
I'm going to push Artisan Console for GAE very soon. I still need to update the documentation.
The Console is there.
I'll check things out next time I have my hands in this stuff, which should be in the next couple of days.
@shpasser - Got a chance to try the web-based artisan console for the first time. Very handy!
One change that might be worthwhile is removing the CSRF token check, and possibly adding Accept-header support for JSON responses.
I'm looking for something I can use to trigger commands remotely from my deployment server, but unfortunately it has no way of getting a CSRF token.
I think I will have to add an additional controller to handle deployment-server requests in JSON. As for the CSRF token, I would like to do some digging.
If you add another controller, then the CSRF token thing won't be an issue. These endpoints will (ideally) already require GAE admin access.
I agree, I just wanted to make sure we do not break anything by ignoring CSRF protection. As I see it, since the access here is done from a deployment server via a script, we are not exposed to CSRF attacks. Therefore admin access should be sufficient.
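For the record, Laravel 5.1+ lets you exempt specific URIs from CSRF verification via the $except property of the VerifyCsrfToken middleware; something along these lines (the URI pattern is an example):

```php
<?php

namespace App\Http\Middleware;

use Illuminate\Foundation\Http\Middleware\VerifyCsrfToken as BaseVerifier;

class VerifyCsrfToken extends BaseVerifier
{
    // URIs excluded from CSRF verification. These endpoints rely on
    // GAE admin authentication instead; the pattern is illustrative.
    protected $except = [
        'api/task/*',
    ];
}
```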
Hi, long time no 'see'. I've been a little busy lately. I've implemented an artisan controller for REST API calls, which receives commands and their parameters and returns the results in JSON.
Of course there are some issues related to ease of use, integration and security. Most of them can be addressed via the documentation. In the case of the artisan console I added a separate service provider to emphasize the implications of its use. Now I think I will have to add a completely separate documentation file with examples of use and integration with cURL, Jenkins and the like.
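For that documentation, a cURL invocation against such a JSON endpoint might look roughly like this (the hostname, path, and payload shape are placeholders, not the actual API):

```shell
# Hypothetical example: trigger 'migrate' through the JSON artisan endpoint.
# The endpoint path, payload fields, and authentication are placeholders;
# an authenticated GAE admin session/credential would still be required.
curl -X POST "https://my-app.appspot.com/api/task/artisan" \
     -H "Content-Type: application/json" \
     -H "Accept: application/json" \
     -d '{"command": "migrate", "parameters": {"--force": true}}'
```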
Haha, not a problem. I know how things get ;)
I'll give it all a go next chance I get.
Please let me know and I will create a special branch for it.