A package to make it easier to pull data from the Australian Bureau of Meteorology
A package similar to rnoaa (https://www.google.com.au/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=rnoaa%20cran) to make it easy to get data from the Australian Bureau of Meteorology.
Back in 2013 we wrote an R package to download and format the Bureau of Meteorology's gridded meteorological datasets for Australia - the Australian Water Availability Project (AWAP).
https://github.com/swish-climate-impact-assessment/awaptools
(The team dispersed, and I am still hoping to develop tools for extreme weather events)
Basically this downloads and formats the zips off the BoM server that holds the grids. Daily data are available up to two days before the current date, on a 0.05 decimal degree grid, from 1900 for rain, 1950 for temperature and vapour pressure, and 1990 for solar. I also worked on code to use the THREDDS server holding the TERN eMAST improved grids, but they only produced 1970 - 2012 and then stopped.
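In rough outline, the workflow looks something like the sketch below (the URL and file names are placeholders, not the real BoM endpoints):

```r
# Rough sketch of the download-and-read step; URL and file names are
# placeholders, not real BoM endpoints.
library(raster)

url  <- "http://example.bom.server/awap/rain_20160101.grid.zip"  # hypothetical
zipf <- file.path(tempdir(), basename(url))
download.file(url, zipf, mode = "wb")

grid_file <- unzip(zipf, exdir = tempdir())[1]  # the ASCII grid inside the zip
r <- raster(grid_file)                          # raster reads Arc/Info ASCII grids
writeRaster(r, "rain_20160101.tif", format = "GTiff", overwrite = TRUE)
```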
Markus Nolf used this code/data to publish a paper in Plant, Cell & Environment titled "Stem and leaf hydraulic properties are finely coordinated in three tropical rain forest tree species" (http://dx.doi.org/10.1111/pce.12581), which resulted from a 2012 study of hydraulic traits in the Daintree Rainforest.
I'm interested in continuing this development (either at the auunconf or separately). I might be able to help tie it in to IMOS (https://github.com/aodn/imos-user-code-library) or perhaps MODIS (http://modis.gsfc.nasa.gov/data/dataprod/). I need to look deeper through AWAP to see where I might be useful.
Awesome. I was in contact with another R/weather developer, Gopi Gopeti, a few years back via the Rain Project on GitHub: http://rationshop.blogspot.com.au/2014/01/the-rain-project-r-based-open-source.html. When Nick put up this idea I contacted him again, but Gopi says it is no longer being developed, although his "raincpc" package is still available from CRAN.
I was looking into that just recently. I had issues installing it from CRAN (didn't try too hard at the time) and I noticed that the github.io page was down. I'd be interested in getting things back up to working order and updating.
Really liking the discussion going on here, guys!
We're hacking away on this over the weekend, so will have a bit more information and progress for everyone on where the package is at.
I'm interested in this project. R is not my area of expertise but I can definitely develop some stuff to make the actual pulling of data from the BoM servers easier.
I've done similar work automating the navigation and pulling of patient imaging and associated data on TCGA servers for my PhD.
I've heard rumours the AWAP data may be (now - or soon to be) exposed as NetCDF via a THREDDS server, perhaps at NCI in Canberra? That would be better. TERN eMAST has this for their improved Hutchinson spline data, but it is only for 1970 - 2012 and they got defunded. The current AWAP FTP solution is soooo 1990s.
Anybody up-to-date with the Aust Met rumour-mill?
I'm also really interested in developing tools for scraping and processing Bureau of Meteorology data; however, I'm more concerned with station data, in particular daily rainfall (http://www.bom.gov.au/climate/data/stations/). I've written a few functions for pre-processing the station data for applications in extremes and would be really interested in being a part of this project. I'm a PhD student in Maths and Stats / Earth Sciences and I'm yet to register for the unconference but will do so shortly - hopefully there are still places!
So I had a request from my Earth Science supervisor, David Karoly. He was involved in the "Art + Climate = Change" festival held in 2015 (and being held again in 2017), where the following visualisation was commissioned: http://artclimatechange.org/. The visualisation displays the weather data for Melbourne in a circle, where the bar plots are temperature, the circles are rainfall observations and the line is wind data, for 365 days of a year! He suggested the Earth Sciences community/BoM would be really interested in a package/app that would recreate this picture for any location in Australia. It would also be a really nice end deliverable we could produce in the couple of days we have!
Hi Kate,
interesting idea! Do you have a link to a precedent?
Cheers
Andrew
Hey Andrew - if you are asking whether there is existing code for the visualisation, the answer is no; as far as I am aware it was just done as a one-off and not in R. Or did you mean concept copyright? David is looking into that now for me, but as he was involved in the commissioning he doesn't think it will be a problem.
Hi Kate,
No, I mean do you have a link to the visualisation itself.
Andrew
https://github.com/chrislad/phenotypicForest has a polarHistogram function that makes some headway towards this through coord_polar(), e.g. http://chrisladroue.com/2012/02/polar-histogram-pretty-and-useful/
To be honest, it's more useful as an art-piece than a graph. That said, it wouldn't be too difficult to program and generalise once a good data source was compiled.
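As a rough sketch of what the coord_polar() approach might look like with made-up daily data (the real version would pull temperature, rainfall and wind observations from BoM):

```r
# Made-up daily data standing in for the real BoM observations
library(ggplot2)
set.seed(1)
d <- data.frame(
  day  = 1:365,
  tmax = 20 + 10 * sin(2 * pi * (1:365) / 365) + rnorm(365),
  rain = rgamma(365, shape = 0.3, scale = 10)
)

ggplot(d, aes(x = day)) +
  geom_col(aes(y = tmax), width = 1, fill = "firebrick") +   # temperature bars
  geom_point(aes(y = tmax + 5, size = rain),                 # rainfall as circles
             alpha = 0.3, colour = "steelblue") +
  coord_polar() +
  theme_minimal()
```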
Hi everyone, this is a really interesting topic.
Perhaps we can do some time series forecasting and spatial analysis with the climate and location data. Visualisation with ggplot2, ggvis, and ggmap would be awesome as well.
I am new to this rOpenSci unconference and would like to connect with the attendees and form a team to do a project together.
Fang
The awaptools package downloads the weather grids, unzips them and converts them to GeoTIFF for further processing, such as extracting over polygons or creating time series. Working at the unconference we found dependencies caused problems, especially with MacOS, such as RODBC and RPostgreSQL/wkt_raster. I have removed those dependencies now and published v1.2 binaries at http://swish-climate-impact-assessment.github.io/tools/awaptools/awaptools-downloads.html. Is it possible for someone to test on MacOS? Cheers, Ivan.
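For anyone new to the raster workflow, a minimal sketch of the "extract over polygons" step (the file names are placeholders, not outputs awaptools actually produces):

```r
# Sketch of the "extract over polygons" step; file names are placeholders.
library(raster)

r     <- raster("rain_20160101.tif")      # a GeoTIFF written out by awaptools
polys <- shapefile("catchments.shp")      # hypothetical polygon layer
# mean rainfall within each polygon
extract(r, polys, fun = mean, na.rm = TRUE)
```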
I'd recommend checking out Travis CI and AppVeyor for automated build testing. Travis can handle Ubuntu and Mac OS X, and AppVeyor does Windows.
https://github.com/craigcitro/r-travis
http://r-pkgs.had.co.nz/check.html#check
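Locally, something like the following gives a rough equivalent of what those services run on each push (assuming a standard devtools/roxygen2 package layout):

```r
# Run roughly the same checks locally that Travis/AppVeyor run on each push
# (assumes a standard devtools/roxygen2 package layout).
library(devtools)
document()   # regenerate NAMESPACE and Rd files
check()      # build the package and run R CMD check
```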
Hi guys, I got no news about the latest awaptools version failing to install, so I have gone ahead and put it back into your vignette, removing the requirement to manually load the awaptools functions: https://github.com/saundersk1/auunconf16/pull/1. I got advice from paleo13 to set up Travis to automatically check whether MacOS builds OK, but I don't have time for that (https://github.com/ropensci/auunconf/issues/6#issuecomment-215262494) - if any of you do, that would be awesome!
@ivanhanigan I've submitted a pull request with the Travis and AppVeyor files. You just need to tell Travis and AppVeyor to use them. It's pretty painless - or if you add me to the repo I can set it up.
Ivan, I cloned awaptools locally and built it on my machine here with no RODBC. The only dependencies on startup were raster and sp. Shouldn't be any problems. I'll update the vignette later.
Request for bomaRang weather data package testers:
Thanks to @paleo13 and @adamhsparks I have pushed a working version of awaptools into master. The Travis CI checks show it should now work on Mac OS as well as Linux and windoze.
The next step for me is to optimise local storage. Does anyone have an opinion about whether GeoTIFF or NetCDF would be preferable? I like NetCDF because I heard the BoM are moving toward that format in general in future (and perhaps they will provide AWAP that way rather than as ASCII grids too?).
PS @saundersk1 would you please review Adam's pull request on the vignette https://github.com/saundersk1/auunconf16/pull/3 and either accept it or let us know what changes you'd prefer?
Thanks all!
FYI that link again https://github.com/swish-climate-impact-assessment/awaptools
I don't mind either way, but I thought I'd offer my opinion anyway :)
I find a key benefit of the GeoTIFF format is that it is generally much easier to work with than NetCDF (YMMV of course). Importing GeoTIFFs is trivial using the R raster package, and they are also well supported by other GIS platforms (e.g. QGIS and ArcGIS).
Also, note that you can save multi-band GeoTIFF rasters. If the main benefit of using NetCDF is reduced disk usage because you can save multiple rasters with the same extent/dimensions in a single file, you can obtain similar benefits using multi-band rasters: just put all your rasters into a RasterStack object and then use writeRaster.
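For example, something along these lines (the file names are made up):

```r
# Multi-band GeoTIFF via a RasterStack; the file names are made up.
library(raster)

files <- c("rain_20160101.tif", "rain_20160102.tif", "rain_20160103.tif")
s <- stack(files)                         # layers must share extent/resolution
writeRaster(s, "rain_stack.tif", format = "GTiff", overwrite = TRUE)

s2 <- stack("rain_stack.tif")             # reads all bands back in one call
nlayers(s2)
```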
Also, something funny's going on with the AppVeyor badge: it says "invalid" and when I click on it, it says the project is not found. This might mean that the Windows checks aren't working properly. It should say "passing" or "failed" like the Travis CI one.
Could you verify if AppVeyor is set up for this package?
If you go to the AppVeyor website (https://www.appveyor.com/) and sign in using your GitHub account, does it have awaptools listed?
If not, can you activate it? (Click on "new project" and then under "GitHub" find "awaptools".)
Thanks
I agree with @paleo13; GeoTIFFs are more portable and standardised. I tend to dislike things that are too proprietary. NetCDF is OK if you're working with R, I suppose, but disk space is cheap nowadays.
thanks @paleo13 re https://github.com/ropensci/auunconf/issues/6#issuecomment-223451739
I had to allow AppVeyor to read that repo, and for some reason it references it as appveyor/ci/ivanhanigan/awaptools rather than appveyor/ci/swish-climate-impact-assessment/awaptools, but it all works now, which is great - thanks again.
Awesome - no worries!
@adamhsparks re https://github.com/ropensci/auunconf/issues/6#issuecomment-223463404 thanks for this.
One thing I think may be relevant: if/when the BoM shift from the current FTP service to a THREDDS NetCDF service (as I have heard they intend to), we (or I) will have to set up the package to use ncdf4 to get the data. If that is the case then it may make sense to use that format locally too, to keep consistency between the tools being used to access/manipulate both the online and local data.
I do also like the rich metadata options that NetCDF has.
It would be easy enough to give the user the option of GeoTIFFs, but default to NetCDF for the reasons you stated.
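As a very rough sketch of what the ncdf4 route might look like once such a service exists (the URL and variable names below are placeholders):

```r
# Sketch of reading a daily grid from a THREDDS/OPeNDAP endpoint with ncdf4;
# the URL and variable names are placeholders, as no such BoM service exists yet.
library(ncdf4)

nc   <- nc_open("http://example.thredds.server/dodsC/awap/rain_daily.nc")
lon  <- ncvar_get(nc, "lon")
lat  <- ncvar_get(nc, "lat")
# pull a single day's grid rather than the whole archive
rain <- ncvar_get(nc, "rain",
                  start = c(1, 1, 1),
                  count = c(length(lon), length(lat), 1))
nc_close(nc)
```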