rnoaa
Consider support for CMORPH (NOAA rainfall data)?
http://www.cpc.ncep.noaa.gov/products/janowiak/cmorph_description.html
@cboettig do you have any of this data in a human-consumable format? The data is in binary files that are, as is typical for NOAA, hard to wrap your head around. I want to make sure the data is extracted correctly.
Ha, I was hoping you might help me wrap my head around NOAA's use of binary formats. My googling suggested these .Z files are compressed big-endian data, which I guess is just a binary stream, but it should correspond to some kind of spatial raster? I was hoping GDAL or something might recognize it. It's all Greek to me; I'm just passing on a suggestion from a student.
Right, I have seen some code that reads it using raster, but there's a fair amount of manipulation before that can happen. Will keep looking.
Notes:
- ftp://ftp.cpc.ncep.noaa.gov/precip/global_CMORPH/README.cmorph.8km_30minute
- ftp://ftp.cpc.ncep.noaa.gov/precip/global_CMORPH/README.cmorph-only.8km-hourly
This is sort of a hot mess; putting it off for later.
Moving to the next milestone. If anyone wants to get this working, that'd be great - the file is at inst/ignore/cmorph.R
This seems close; I haven't checked for upside-down-ness yet (see the orientation note after the code).
#tx <- readLines("ftp://ftp.cpc.ncep.noaa.gov/precip/global_CMORPH/README.cmorph.8km_30minute")
#u <- "ftp://ftp.cpc.ncep.noaa.gov/precip/global_CMORPH/30min_8km/CMORPH_8KM-30MIN_2017030100.Z"
u <- "ftp://ftp.cpc.ncep.noaa.gov/precip/global_CMORPH/30min_8km/CMORPH_8KM-30MIN_2017030114.Z"
#system(glue::glue("gdalinfo {u}"))
f <- basename(u)
if (!file.exists(f)) {
curl::curl_download(u, f)
}
f0 <- gsub("\\.Z$", "", f)
system(glue::glue("uncompress {f0}"))
dims <- c(1649, 4948, 6)
a <- array(readBin(f0, integer(), size = 1, endian = "little", signed = FALSE, n = prod(dims)), dims)
#range(a)
a[a > 254] <- NA
a <- a * 0.2 ## mm/hr
# op <- par(mfrow = c(6, 1), mar = rep(0, 4))
# for (i in seq_len(dims[3])) image(a[,,1], useRaster = TRUE)
# par(op)
# Each direct access record is a 4948 x 1649 CHARACTER*1 (use FORTRAN ichar
# command to retrieve integer values for all parameters) array with grid
# increments of 0.072756669 degrees of longitude and 0.072771377 of latitude,
# which is approximately 8 km at the equator. The arrays are oriented from
# North to South, beginning from latitude 59.963614N, and from West to East from
# longitude 0.036378335E.
library(raster)
#> Loading required package: sp
offs <- c(0.036378335, 59.963614)
delt <- c(0.072756669, -0.072771377)
b <- setExtent(brick(a, crs = "+init=epsg:4326"), extent(offs[1], offs[1] + dims[2] * delt[1],
offs[2] + dims[1] * delt[2], offs[2]))
plot(b[[4]])
maps::map("world2", add = TRUE)
Created on 2019-05-02 by the reprex package (v0.2.1)
(Got stuck for a while because I didn't set readBin(n = ...). LOL)
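On the unchecked upside-down-ness: a small sketch for eyeballing orientation against the coastline overlay above, assuming the brick b built in the code. raster::flip() along y is a possible fix only if the overlay looks mirrored in latitude; whether that is actually needed here is unverified.
# Sketch: if the coastlines above look mirrored north/south, flip the brick
# along y and re-plot; skip this if the first plot already lines up.
b_flipped <- raster::flip(b, direction = "y")
plot(b_flipped[[4]])
maps::map("world2", add = TRUE)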
readBin(archive::file_read(f, ...))
would avoid the need to decompress the .Z file, I think.
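A minimal sketch of that idea, assuming archive::file_read() can hand readBin a connection to the .Z (Unix compress) stream via libarchive; whether .Z is supported on a given libarchive build, and the mode argument, are assumptions to verify.
# Sketch: read the compressed .Z file through a libarchive connection,
# skipping the external uncompress step (assumes .Z support in libarchive).
library(archive)
dims <- c(1649, 4948, 6)
con <- archive::file_read(f, mode = "rb")
a <- array(readBin(con, integer(), size = 1, signed = FALSE, n = prod(dims)), dims)
close(con)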
@sckott @cboettig @mdsumner @raymondben @lesserwhirls
Is this the data you are after:
https://rda.ucar.edu/thredds/catalog/files/g/ds502.0/catalog.html
If so, there are at least .nc files, though unaggregated, and you should be able to access them using ncdf4 or probably RNetCDF. If there is demand for them, we might add them to our ERDDAP (with permission from NCAR). We may or may not be able to virtually aggregate them in ERDDAP if we do serve them; all of this would provide many more options for download.
The person who would do this is on travel this week, and I would need to have some feel that there is reasonable demand.
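For reference, a minimal sketch of reading one of those unaggregated .nc files with ncdf4, as suggested above. The file name below is a placeholder for whatever gets downloaded from the THREDDS catalog, and no particular variable name is assumed.
# Sketch: open a CMORPH .nc file downloaded from the THREDDS catalog above.
# "cmorph_example.nc" is a placeholder file name, not a real path.
library(ncdf4)
nc <- nc_open("cmorph_example.nc")
print(names(nc$var))                      # see which variables the file carries
precip <- ncvar_get(nc, names(nc$var)[1]) # pull the first variable as an array
nc_close(nc)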
Thanks very much @mdsumner and @raymondben - I'll try that out.
@rmendels I'm not sure; @cboettig is the one that brought this up, so I don't know myself what folks want per se.
@sckott @cboettig let me know if the THREDDS data are what Carl is after. If so, it's sort of overkill to add to rnoaa, as either ncdf4 or RNetCDF should presently work. And as I said, we would consider adding them to ERDDAP (we just point at the THREDDS), and that would make them work in rerddap.
What about this: ftp://ftp.cpc.ncep.noaa.gov/precip/CMORPH_V1.0/CRT/8km-30min/1998/ ? I got stuck plotting and converting the binary format to hourly GeoTIFF layers. Any ideas?
@neda1366 Thanks for your comment.
What about this ftp://ftp.cpc.ncep.noaa.gov/precip/CMORPH_V1.0/CRT/8km-30min/1998/
Please clarify what you are asking. What about it?
I want to have hourly CMORPH_V1.0 tiff files. I downloaded them from ftp://ftp.cpc.ncep.noaa.gov/precip/CMORPH_V1.0/CRT/8km-30min/1998/ . The original files are in binary format, one per month. I used the code mentioned here to read the files and write them out in .tiff format. The output is a single tiff file for the given month, but I want hourly raster files for each month.
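If it helps, here is a rough sketch of one way to split the brick b built in the code above into one GeoTIFF per layer. The one-layer-per-timestep assumption and the file naming are mine; the V1.0 CRT files may have a different record layout than the 8km-30min example above, so adjust dims and layer count accordingly.
# Sketch: write each layer of the brick `b` (from the code above) to its own
# GeoTIFF, one file per timestep (layer meaning is an assumption).
library(raster)
for (i in seq_len(nlayers(b))) {
  writeRaster(b[[i]], filename = sprintf("cmorph_layer_%02d.tif", i),
              format = "GTiff", overwrite = TRUE)
}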
Tried coming back to this. I definitely don't want to do system calls to uncompress, so I looked at jimhester/archive, but I can't get it to install on macOS; I keep getting compilation errors. If anyone has tips ... @raymondben ?