golem
run_app and datasets not recognized as exported when using docker files
I created a golem app and tested the deployment on shinyapps.io, where it worked smoothly.
Then I created a Dockerfile and tried to deploy it, but for some reason it told me run_app
wasn't an exported function.
This was fixed by replacing mypkg::run_app() with mypkg:::run_app() in the Dockerfile (i.e. adding a third colon).
This is puzzling to me as the "#' @export" line is definitely there (I didn't tweak the boilerplate), and again, this works locally and on shinyapps.io.
Then I ran into issues because my datasets were not recognized. This could be solved by replacing the last line of the Dockerfile with:
CMD R -e 'data(dataset1, package = "mypkg");data(dataset2, package = "mypkg");options(shiny.port = 80, shiny.host = "0.0.0.0");mypkg:::run_app()'
This kind of namespace issue has happened to me in the past when passing expressions from my package to functions from other packages that use NSE (non-standard evaluation).
Finally, a single simple change solved everything at once: adding a library() call to attach my package.
CMD R -e 'library(mypkg);run_app()'
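For context, here is a sketch of what the tail of a golem-generated Dockerfile looks like with this fix applied; the install line, build_zone path, and port follow the default golem template, and mypkg is a placeholder package name.

```dockerfile
# Install the app package from the copied sources (golem's template uses remotes)
RUN R -e 'remotes::install_local(upgrade = "never")'
RUN rm -rf /build_zone
EXPOSE 80
# library() attaches the package, so run_app() and the package
# datasets are found on the search path without :: or :::
CMD R -e 'options(shiny.port = 80, shiny.host = "0.0.0.0");library(mypkg);run_app()'
```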
@moodymudskipper I also had this problem and solved it by adding mypkg::dataset every time the dataset is called. In a Shiny app with many modules this can be painful, so I would suggest creating reactive() objects to store the mypkg::dataset calls and calling the reactive when you need the data.
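A minimal sketch of that pattern in a module server, where mypkg and dataset1 are placeholder names for your package and one of its datasets:

```r
# Wrap the namespaced dataset in a reactive once,
# instead of writing mypkg::dataset1 at every call site.
mod_table_server <- function(id) {
  shiny::moduleServer(id, function(input, output, session) {
    dataset1_r <- shiny::reactive({
      mypkg::dataset1
    })

    output$table <- shiny::renderTable({
      utils::head(dataset1_r())
    })
  })
}
```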
There are other options here: https://coolbutuseless.github.io/2018/12/10/r-packages-internal-and-external-data/
Good luck!
Note to self: try and see if using "LazyData: false" in the DESCRIPTION changes that.
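For reference, the documented DESCRIPTION field is LazyData (not Lazyload); with LazyData: true, the package's datasets are lazy-loaded and reachable as mypkg::dataset1 without an explicit data() call. A minimal sketch, with placeholder package metadata:

```
Package: mypkg
Title: My Golem App
Version: 0.0.0.9000
LazyData: true
```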
I think this might be related to https://github.com/ThinkR-open/golem/issues/413.
For some reason I haven't clearly identified yet, some internal datasets are not recognized, and the only workaround so far is to explicitly namespace them, which is not a viable solution.
The issue right now is that I haven't found a way to reprex this reliably. I'll keep you posted on this :)
Hi all, I have run into this issue recently, and I am glad to see a few options out there. The mypkg::dataset approach is cumbersome, but CMD R -e 'library(mypkg);run_app()' in the Dockerfile is what I did, and it is the most straightforward solution.
I also feel that there is a difference between the {golem} approach and runApp() from an app directory, namely the use of the global.R file. I use the global.R file to load packages (like the library(mypkg) in CMD), to load and pre-process data sets that should be available globally, and to set global variables (which is covered by get_golem_config()).
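On the get_golem_config() side: golem reads inst/golem-config.yml, so app-level settings can live there instead of in a global.R. A sketch of that file, where mypkg and the values are placeholders:

```yaml
default:
  golem_name: mypkg
  golem_version: 0.0.0.9000
  app_prod: no
production:
  app_prod: yes
```

In the app, golem::get_golem_config("app_prod") then returns the value for the active configuration.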
In the Docker context, one way would be to use CMD Rscript index.R and have the index.R file load the necessary objects:
options(shiny.port = 80, shiny.host = "0.0.0.0")
library(mypkg)
mydata <- t(myotherdata)
run_app()
The question would be how to automate this part, given that the build_zone
is removed after package install. I can see having it in inst/app/index.R
with the following template:
options(shiny.port = 80, shiny.host = "0.0.0.0")
mypkg::run_app()
And with these lines added to the Dockerfile:
...
RUN rm -rf ./build_zone
COPY inst/app/index.R .
EXPOSE 80
CMD Rscript index.R
Users can then edit the index.R file as they would the global.R file.
Of course one can do all of this without {golem}. I just wanted to throw it into the mix because it might be an overlooked consequence of the early decision that "you don't need global.R
in {golem}" (see https://github.com/ThinkR-open/golem/issues/6). @ColinFay what do you think?
Hi all,
Do you have an example where doing "library(mypkg);run_app()" does not solve all the problems with data?
In my opinion (but I may be wrong), if you need to source a file, there is a problem in the package (you may have to play with the lazy-loading parameters in DESCRIPTION).