This blog is run by Jason Jon Benedict and Doug Beare to share insights and developments on open source software that can be used to analyze patterns and trends in all types of data from the natural world. Jason currently works as a geospatial professional based in Malaysia. Doug lives in the United Kingdom and is currently Director of Globefish Consultancy Services, which provides scientific advice to organisations including the STECF (Scientific, Technical and Economic Committee for Fisheries, https://stecf.jrc.europe.eu/) and ICCAT (https://www.iccat.int/en/).

Showing posts with label CRAN. Show all posts

Monday, 23 February 2015

Why are the British tweeting about climate change at night?

Points of post:

  • To demonstrate how the twitteR library can be combined with the mapping capabilities of ggplot2.
  • To reveal the locations of people tweeting about climate change at two points in time.
  • To show that climate change tweeters are mostly from the developed world.
  • To show that some (sad) tweeters are twittering about climate change in the middle of the night.


In a previous blog post, we used twitteR and wordcloud to summarize which other words were occurring in tweets (combined from around the world) together with the keywords, ‘climate-change’ and ‘Bangladesh’ at two arbitrarily selected time-points. 

The geo-locations of tweets are, however, also often available and are potentially very interesting and revealing. 

The maximum number of tweets that can be captured with one call to the twitteR library is 1,500. There are probably ways to get more data, but we suspect you would then have to spend money.
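One way around the per-call cap (a sketch only, not something we have run against the live API) is to page backwards through older results; twitteR's searchTwitter() accepts a maxID argument for exactly this. The loop below shows the paging logic with a hypothetical fetch_page() stub standing in for searchTwitter(), so it runs without any Twitter credentials.

```r
# Hedged sketch: paging backwards through older tweets, as one would with
# twitteR's 'maxID' argument. fetch_page() is a made-up stub standing in
# for searchTwitter(): it returns at most n tweets with ids older than
# max_id, drawn from a fake pool of 40.
fake_pool <- data.frame(id = 40:1, text = paste("tweet", 40:1),
                        stringsAsFactors = FALSE)

fetch_page <- function(n, max_id) {
  page <- fake_pool[fake_pool$id < max_id, , drop = FALSE]
  head(page, n)
}

collect_tweets <- function(n_per_call = 15, max_calls = 10) {
  all_pages <- list()
  max_id <- Inf                      # start from the most recent tweet
  for (i in seq_len(max_calls)) {
    page <- fetch_page(n_per_call, max_id)
    if (nrow(page) == 0) break       # no older tweets left
    all_pages[[i]] <- page
    max_id <- min(page$id)           # next call asks for strictly older ids
  }
  do.call(rbind, all_pages)
}

tweets <- collect_tweets()
nrow(tweets)   # all 40 fake tweets collected across several 'calls'
```

With the real API you would substitute searchTwitter(searchTerm, n = n_per_call, maxID = ...) for the stub, and Twitter's own rate limits would then apply.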

Here we used #climatechange because it coincided with the last days of COP20 in Lima, Peru, which ran from 1st to 12th December 2014. It is an extremely important global forum at which nations meet to discuss their options for reducing carbon emissions (see http://unfccc.int/meetings/lima_dec_2014/meeting/8141.php).


The first ‘tweet map’ (below) we produced is based on approximately 1500 geo-located tweets that contained the hash-tag, #climatechange, and which were ‘tweeted’ at about 10am GMT on the 11th December 2014. It shows that #climatechange tweets were coming from four main areas: North America, Europe, India and Australia. There didn’t appear to be many tweets coming out of Lima, which surprised us. Maybe the delegates were too busy enjoying the South American hospitality and catching up with old mates to take much interest in climate change!

Geo-located tweets with #climatechange tweeted at around 10am GMT 11th December 2014

The second ‘tweet-map’ (below), also based on approximately 1500 geo-located #climatechange tweets, is a snapshot taken about 17 hours later, at around 3am GMT on the final day of the conference (12th December 2014). The overall pattern between the maps remains the same, but the relative frequency of #climatechange tweeters from Europe, as compared to North America, has increased. People in the United Kingdom were particularly keen, twittering like mad about climate change at 3am. Why? We don’t know.

Geo-located tweets with #climatechange tweeted at around 3am GMT 12th December 2014

Note that tweets are geo-located either by using the location defined in the user’s profile, or by ascertaining the exact location directly, if the user allows it. The latter can be done either from GPS-enabled software, which many people have installed on their smart-phones, or from an IP address. This means that not all tweets can be geo-located with great precision. Some are only geo-located at the national and/or regional level, as is evident from the large circle in the middle of Australia. That is to say, these cautious tweeters only gave ‘Australia’ as their location.
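This collapsing of coarse locations onto a single point is easy to see in miniature. In the made-up example below, three users who gave only ‘Australia’ as their location all geocode to the identical coordinate, and aggregating duplicate coordinates into counts (as the mapping code later in this post does) turns them into one large circle:

```r
# Illustrative only: three 'tweeters' who gave 'Australia' as their location
# all geocode to the same (made-up) point, plus two precisely located users.
locations <- data.frame(
  x = c(134.5, 134.5, 134.5, 151.2, 115.9),
  y = c(-25.7, -25.7, -25.7, -33.9, -31.9)
)

# Collapse identical coordinates into a per-point tweet count.
counts <- aggregate(locations$x,
                    by = list(x = locations$x, y = locations$y),
                    length)
names(counts)[3] <- "count"

counts   # the 'Australia'-only users appear as one point with count 3
```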

As we have explained in an earlier blog post on wordclouds and twitteR, to pull the data from Twitter using its API you will need to have a Twitter account and carry out a 'twitter authentication'. The R code to search Twitter for the selected 'term(s)' and to map the results is detailed below.

# Load required libraries
 
library(RCurl)
library(maps)
library(stringr)
library(tm)
library(twitteR)
library(streamR)
library(grid)
library(ggplot2)
library(rgdal)
library(ggmap)
 
# Set working directory
 
setwd("D:/ClimData/")
 
#### Fonts on Windows ####
windowsFonts(ClearSans="TT Clear Sans")
 
# Load Credentials
 
load("D:/ClimData/Twitter/twitter authentification.Rdata")
registerTwitterOAuth(twitCred)
options(RCurlOptions = list(cainfo = system.file("CurlSSL", "cacert.pem", package = "RCurl")))
 
# Search term on twitter
 
searchTerm <- "#climatechange"
searchResults <- searchTwitter(searchTerm,n=1500,since='2014-12-11', until='2014-12-12')  
tweetFrame <- twListToDF(searchResults) 
 
userInfo <- lookupUsers(tweetFrame$screenName)  
userFrame <- twListToDF(userInfo)
 
locatedUsers <- !is.na(userFrame$location)
 
# Geocode locations using 'ggmap' library
 
locations <- geocode(userFrame$location[locatedUsers])
 
locations_robin <- project(as.matrix(locations), "+proj=robin")
 
locations_robin_df <- as.data.frame(locations_robin)
 
# Import world boundaries
 
world <- readOGR(dsn="D:/Data/ne_10m_admin_0_countries", layer="ne_10m_admin_0_countries")
 
world_robin <- spTransform(world, CRS("+proj=robin"))
 
world_robin_df <- fortify(world_robin)
 
counts <- aggregate(locations_robin_df$V1,by=list(x=locations_robin_df$V1,y=locations_robin_df$V2),length)
names(counts)[3] <- "count"
 
# Theme options for Map
 
theme_opts <- list(theme(panel.grid.minor = element_blank(),
                         panel.grid.major = element_blank(),
                         panel.background = element_blank(),
                         panel.border = element_blank(),
                         plot.background = element_blank(),
                         axis.line = element_blank(),
                         axis.text.x = element_blank(),
                         axis.text.y = element_blank(),
                         axis.ticks = element_blank(),
                         axis.title.x = element_blank(),
                         axis.title.y = element_blank(),
                         legend.position = "bottom",
                         legend.key = element_blank(),
                         legend.title = element_text(colour="black", size=12, face="bold",family="Clear Sans"),
                         legend.text = element_text(colour="black", size=10, face="bold",family="Clear Sans"),
                         plot.title = element_text(size=15,face="bold",lineheight=0.5,family="Clear Sans")))
 
# Plot map and tweet counts 
 
tp <- ggplot(world_robin_df)+
      geom_polygon(aes(x = long, y = lat, group = group), fill = "grey20")+
      geom_path(aes(x = long, y = lat, group = group),colour = "grey40", lwd = 0.2)+
      geom_point(data= counts, aes(x=x,y=y,size=count),color="#32caf6", alpha=I(8/10))+
      scale_size_continuous(name="Number of tweets")+
      ggtitle("Twitter Map of #climatechange\n")+
      xlab("")+ ylab("")+
      coord_equal()+
      theme_bw() + 
      guides(size = guide_legend(title.position = "top",title.hjust =0.5))+
      theme_opts
 
tp
 
# Save to png
 
ggsave(tp,file="D:/Twitter_ClimateChange_Map.png",dpi=500,w=10,h=6,unit="in",type="cairo-png")

Friday, 30 January 2015

The recent deluge in Malaysia - using raincpc to map extreme rainfall events

Key point of post


  • To describe an application of raincpc to map the rainfall that led to the recent (December 2014) floods in Malaysia

raincpc (http://cran.r-project.org/web/packages/raincpc/vignettes/raincpc_demo.html) is a new R library that exploits the Climate Prediction Center’s (CPC, www.cpc.ncep.noaa.gov) global (1979 to present, 50 km resolution) precipitation datasets. It makes CPC’s rainfall data more readily available for plotting and visualization, allowing any user to conveniently side-step problems relating to changing data formats, file-naming conventions etc. And all this free of charge!
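Once raincpc has returned one raster grid per day, a multi-day total is just the cell-wise sum of the layers. Summing a list with Reduce() is a tidy way to do this without typing every layer by hand (and risking dropping one). The sketch below uses plain numbers as stand-ins for the daily RasterLayer objects, since `+` behaves the same way for both:

```r
# Cell-wise totals over several daily grids: Reduce() sums a whole list,
# so no layer can be accidentally omitted from a long a + b + c chain.
# Plain numbers stand in here for the daily RasterLayer objects returned
# by cpc_read_rawdata(), because '+' dispatches the same way on rasters.
daily_rain <- list(12, 0, 85, 140, 60, 33, 5, 18)   # eight days of 'grids'

rain_tot <- Reduce(`+`, daily_rain)
rain_tot   # 353
```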

We thought it would be fun to demonstrate the use of raincpc, focusing on Malaysia which experienced devastating floods over the Christmas and New Year Period, leading to the evacuation of 1000s of people. The damage has been estimated to have cost ~ 2 billion RM. Please see the following links - http://en.wikipedia.org/wiki/2014%E2%80%9315_Malaysia_floods and http://reliefweb.int/report/malaysia/asean-flash-update-northeast-monsoon-flood-24-december-2014.



In the plot below we used raincpc to show the amount of rain that fell over south-east Asia between 17th and 24th December 2014. It confirms that rainfall was indeed particularly heavy along the east coast of peninsular Malaysia, but also over northern Sumatera. Penang was certainly wet during December, but the island had nothing like the amount of rainfall endured by communities on Malaysia’s east coast.

We do not know what caused the extreme rainfall that led to the flooding. Meteorologists think that it is related to the 'Madden-Julian Oscillation' (http://www.themalaysianinsider.com/malaysia/article/malaysia-could-see-more-severe-floods-like-in-kelantan-say-experts) and its interaction with the north-east monsoon. Very heavy rain is of course common in the tropics, but it doesn't necessarily lead to flooding if drainage is adequate. Some experts think that rampant deforestation in Malaysia has led to more siltation of rivers, in effect blocking Malaysia's drains, and that this exacerbates the impact of rainfall events (https://www.youtube.com/watch?v=r_eZUxgoxCw).

As usual the R-code for producing the map is outlined below.
## Load package libraries
 
library(raincpc)
library(SDMTools)
library(raster)
library(ggplot2)
library(rgdal)
library(grid)
library(maptools)
 
# Set working directory
 
setwd("D:/ClimData/")
 
## Get raw CPC rain data - data has a 2 day lag
 
cpc_get_rawdata(2014,12,17,2014,12,24) 
 
## Read raw CPC rain data into raster grids
 
rain1 <- cpc_read_rawdata(2014, 12, 17)
rain2 <- cpc_read_rawdata(2014, 12, 18)
rain3 <- cpc_read_rawdata(2014, 12, 19)
rain4 <- cpc_read_rawdata(2014, 12, 20)
rain5 <- cpc_read_rawdata(2014, 12, 21)
rain6 <- cpc_read_rawdata(2014, 12, 22)
rain7 <- cpc_read_rawdata(2014, 12, 23)
rain8 <- cpc_read_rawdata(2014, 12, 24)
 
# Combine multiple day rasters
 
rain_tot <- rain1 + rain2 + rain3 + rain4 + rain5 + rain6 + rain7 + rain8
 
# Get summary of raster grid
 
print(rain_tot)
 
raster_ggplot <- function(rastx) {
 
  require(SDMTools)
 
  stopifnot(class(rastx) == "RasterLayer")
 
  gfx_data <- getXYcoords(rastx)
  # lats need to be flipped
  gfx_data <- expand.grid(lons = gfx_data$x, lats = rev(gfx_data$y), 
                          stringsAsFactors = FALSE, KEEP.OUT.ATTRS = FALSE)
  gfx_data$rain <- rastx@data@values
 
  return(gfx_data)
}
 
rain_gg <- raster_ggplot(rain_tot)
 
# Read shapefile of country boundaries (shapefiles can be downloaded from http://thematicmapping.org/downloads/world_borders.php)
 
bounds <- readOGR(dsn="D:/Data/World_Borders", layer="TM_WORLD_BORDERS-0.3")
 
## Extents of ggplot map
 
xmin<-95
xmax<-120
ymin<--10
ymax<-15
 
interval <-(xmax-xmin)/5
 
lon_vals <- seq(xmin, xmax, 0.5)
lat_vals <- seq(ymin, ymax, 0.5)
 
 
# Set theme options
 
theme_opts <- list(theme(panel.grid.minor = element_blank(),
                         panel.grid.major = element_blank(),
                         panel.background = element_rect(fill="grey95"),
                         panel.border = element_rect(colour="black"),
                         axis.line = element_blank(),
                         axis.text.x = element_blank(),
                         axis.text.y = element_blank(),
                         axis.ticks = element_blank(),
                         axis.title.x = element_blank(),
                         legend.key.size=unit(0.35,"in"),
                         legend.key.width=unit(0.15,"in"),
                         legend.text=element_text(size=14,family="Myriad Pro Cond"),
                         legend.title=element_text(size=16,family="Myriad Pro Cond"),
                         plot.title = element_text(size=23,face="bold",vjust = -10,hjust=0.96,family="Graph Black"),
                         legend.position = c(0.17, 0), 
                         legend.justification = c(1, 0), 
                         legend.background = element_blank(),
                         axis.title.y = element_blank()))
 
# Plot rainfall map
 
rf <-  ggplot()+
       geom_raster(data=rain_gg,aes(x=lons,y=lats,fill=rain),alpha=0.8) +
       scale_fill_gradientn(colours=c("#FFFFFF","#ADD8E6","#95BBE9","#7E9EEC","#6781F0","#5064F3","#3948F6","#222BFA","#0B0EFD","#0A02FE","#1F06FC","#350AFA","#4A0EF8","#6013F6","#7517F3"),limits=c(0,1200),na.value=NA, name="Rainfall (mm)\n")+
       geom_polygon(data=bounds, aes(long,lat, group=group),colour="grey30",fill="transparent",size=0.35)+
       coord_equal(xlim=c(xmin,xmax),ylim=c(ymin,ymax)) + theme_bw()+
       ggtitle("Extreme rainfall event over Malaysia\n(17th to 24th of December 2014)\n")+
       xlab("") + ylab("")+ theme_opts +
       annotate("text", x = 115.00, y = -9.5, label = "Data source: Climate Prediction Center - NOAA (2014)",size=5,family="Myriad Pro Cond") 
 
plot(rf)
 
# Save rainfall map to png file
 
ggsave(rf,file="D:/ClimData/CPC_Extreme_Rainfall_Event_MYS_Dec2014.png",dpi=500,w=10,h=10,unit="in",type="cairo-png")

Tuesday, 16 December 2014

13 decades on...has Penang gotten warmer?

  • Here we compare recent temperature data for Penang with a report written in 1885.
  • We find that air temperatures are higher now in the afternoon and evening.
  • Morning air temperatures seem to be about the same now as they were 130 years ago.
  • There are sources of bias which may affect the interpretation, but about which we can do nothing.

Last week we received the email below from Mike Gibby (mikegibby@gmail.com).

“Hi Jason, Doug

I found your website / blog on Temperature Change in Penang while searching for info on rain, and more especially on rain gauges in Penang, and wondered if you could give any advice or help?

I'm part of a team who are researching for a guide to hiking trails in Penang; most of these trails were developed by farmers for access, but there are some trails that only 'stitch together' the sites of rain gauges, so they must have been created for data collection in the old, pre-digital days. An example here would be what is described as 'the longest trail in Penang', that runs from Telok Bahang to Penang Hill, via Bukit Laksamana.

So, my question is, can you point me at any sources of information, quite probably of the colonial era, on the setting up, siting etc of rain gauges?
In case you don't know it, I attach an article on the climate in Penang in 1885.

I look forward to hearing from you- best wishes !

Mike Gibby”

We replied to Mike that we didn’t know anything about the rain gauges or the old-colonial data collection protocols or policies.

We were, however, very interested in the 1885 weather report that Mike attached. We’ve since discovered that the report is available from the Malaysian Branch of the Royal Asiatic Society. The 1885 report was compiled by Mr Thomas Irvine Rowell (Principal Civil Medical Officer) in Singapore in early 1886, about the same time Karl Benz patented the first gasoline-driven automobile, and covers Singapore, Penang, ‘Province Wellesley’ and Malacca.

Given our interest in Penang’s current and past weather, we decided to compare recent air temperature measurements with those taken 130 years previously in 1885 (please see below).



In the figure below we’ve plotted annual patterns for six different temperature parameters (mean monthly, maximum monthly, minimum monthly, 9am, 3pm and 9pm) for both 1885 (blue lines) and the period 2003-2013 (red lines). Monthly mean temperatures are around 28°C. March and April average temperatures in Penang appear to have been a little hotter in 1885 than we’ve seen more recently, but for the rest of the year air temperatures were fairly similar. Recent maximum air temperatures are higher than they were in 1885 and minimum temperatures lower. We think, however, that this is probably caused by confounding/bias, i.e. in 1885 data were only collected three times daily (9am, 3pm and 9pm), whereas nowadays we can make observations continuously, at 5-minute intervals, and publish them instantly around the world!
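The sampling-bias point can be illustrated with a toy calculation: if you observe a smooth diurnal cycle at only three fixed clock times, the recorded maximum can never exceed (and will usually undershoot) the maximum of a continuous 5-minute record of the same day, and likewise for the minimum. The sinusoidal 'day' below is entirely made up for illustration:

```r
# Toy illustration of sampling bias: an idealised diurnal temperature
# cycle peaking in the mid-afternoon, observed in two different ways.
hours_5min <- seq(0, 24, by = 1/12)                  # 'modern' 5-minute record
diurnal <- function(h) 28 + 4 * sin(2 * pi * (h - 10) / 24)  # peak at 4pm

full_series   <- diurnal(hours_5min)                 # continuous observations
three_samples <- diurnal(c(9, 15, 21))               # 1885-style readings

max(full_series) - max(three_samples)   # continuous max is higher
min(three_samples) - min(full_series)   # continuous min is lower
```

The thrice-daily record therefore compresses the apparent daily range, which is consistent with the 1885 maxima looking lower and the minima looking higher than the modern ones.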



Given such problems of bias, the most meaningful comparisons we can make between these datasets are those observations taken at 9am, 3pm and 9pm. At 9am, temperatures in 1885 were similar to current measurements. In the afternoon (3pm, typically around the hottest time of day in Penang) and evening (9pm), however, temperatures are now higher in all months by about 1°C.

So, in conclusion, if we assume that the measuring equipment is accurate, current air temperatures are around 1°C warmer in Penang in the afternoons and evenings, while Mr Rowell would probably have experienced very similar morning temperatures to us now!

These differences are less pronounced than we’d expected.

[Note: We’ve not addressed the potentially serious issue of ‘spatial bias’; the locations at which our weather observations were made are not the same as those used by Mr Rowell.]

The R code we used to produce the above line plots is outlined below. The 2003-2013 temperatures were downloaded from NOAA NCDC's Global Summary of the Day (GSOD).

# Load required libraries
 
library(zoo)
library(lubridate)
library(plyr)
library(ggplot2)
library(grid)
library(XLConnect)
library(reshape2)
library(extrafont)
 
# Set working directory
 
setwd("D:/ClimData")
 
# Read table of 1885 Temperatures
 
temp1885<-readWorksheet(loadWorkbook("1885_Temps.xlsx"),sheet=1)
 
temps1885 <- temp1885[c(1,2,10,11,12,13,14,15)]
 
colnames(temps1885) <- c("month","monthf","temp_9am","temp_3pm","temp_9pm","mon_mean","mon_max","mon_min")
 
temps1885$year <- "1885"
 
temps1885_df <- temps1885[c(1,2,9,3:8)]
 
 
# Read data from NCDC
 
dat<-read.table("2110827004508dat.txt",header=TRUE,fill=TRUE,na.strings=c("*","**","***","****","*****","******","0.00T*****"))
 
colnames(dat)<-tolower(colnames(dat))
 
# Convert hourly data from UTC to local time zone
 
Sys.setenv(TZ = "UTC")
dat$dates <- as.POSIXct(strptime(dat$yr..modahrmn,format="%Y%m%d%H%M"))  + 8 * 60 * 60
 
# Convert temperatures from degrees Fahrenheit to degrees Celsius
 
dat$tempc <- (dat$temp-32) * (5/9)
 
dat$tempc[dat$tempc<=10] <- NA
dat$tempc[dat$tempc>=40] <- NA 
 
# Extract times and dates and reformat data
 
dat$year <- as.numeric(as.POSIXlt(dat$dates)$year+1900)
dat$month <- as.numeric(as.POSIXlt(dat$dates)$mon+1)
dat$monthf <- factor(dat$month,levels=as.character(1:12),labels=c("Jan","Feb","Mar","Apr","May","Jun","Jul","Aug","Sep","Oct","Nov","Dec"),ordered=TRUE)
dat$jday <- yday(dat$dates)
dat$hour <- as.numeric(format(strptime(dat$dates, format = "%Y-%m-%d %H:%M"),format = "%H"))
dat$minute <- as.numeric(format(strptime(dat$dates, format = "%Y-%m-%d %H:%M"),format = "%M"))
 
dat_01 <- subset(dat, year >= 2003 & year <= 2013 )
 
dat_02 <- dat_01[c(34,35,36,37,38,39,40,41)]
 
dat_9am <- subset(dat_02,hour == 9 & minute == 0 )
dat_3pm <- subset(dat_02,hour == 15 & minute == 0 )
dat_9pm <- subset(dat_02,hour == 21 & minute == 0 )
 
dat_9am_melt<- ddply(dat_9am, .(month,monthf), summarize, temp_9am=mean(tempc,na.rm=T))
dat_3pm_melt<- ddply(dat_3pm, .(month,monthf), summarize, temp_3pm=mean(tempc,na.rm=T))
dat_9pm_melt<- ddply(dat_9pm, .(month,monthf), summarize, temp_9pm=mean(tempc,na.rm=T))
 
dat_hourly <- cbind(dat_9am_melt,dat_3pm_melt[3],dat_9pm_melt[3])
 
dat_melt<- ddply(dat_01, .(month,monthf), summarize, mon_mean=mean(tempc,na.rm=T),mon_max=max(tempc,na.rm=T),mon_min=min(tempc,na.rm=T))
 
dat_yrs <- cbind(dat_melt,dat_hourly[-c(1,2)]) 
 
dat_yrs$year <- "10 Year Average (2003-2013)"
 
tempcomb <- rbind(dat_yrs,temps1885_df)
 
tempcomb_df <- melt(tempcomb, id.vars=c("month","monthf","year"),variable.name="type",value.name="temps")
 
# Plot data
 
p <- ggplot(data=tempcomb_df, aes(x=monthf,temps, colour=year))+
     geom_point(data=subset(tempcomb_df,year!=1885),size=3, alpha=0.5,aes(group=year))+ 
     geom_line(data=subset(tempcomb_df,year!=1885),size=1, alpha=0.5,aes(group=year))+
     geom_point(data=subset(tempcomb_df,year==1885),size=3, alpha=0.5,aes(group=year))+ 
     geom_line(data=subset(tempcomb_df,year==1885),size=1, alpha=0.5,aes(group=year))+
     geom_text(aes(label=round(temps,1)),hjust=1, vjust=-1,size=2.5)+
     facet_wrap(~type)+
     xlab("")+ ylab ("Temperature (Degree C)\n") +
     labs(title="Comparison of year 1885 and 10 year monthly average (2003-2013) temperatures in Penang\n")+
     theme(panel.background = element_rect(fill= "transparent"),
     plot.title = element_text(lineheight=1.2, face="bold",size = 13, colour = "grey20",family="Clear Sans"),
     panel.border = element_rect(colour = "grey90",fill=F,size=0.5),
     panel.grid.major.x = element_blank(),panel.grid.minor = element_blank(),
     axis.line=element_blank(),axis.title.x=element_blank(),
     axis.text.x=element_text(face="bold",size = 10, colour = "grey20",family="Clear Sans"),
     axis.title.y=element_text(face="bold",size = 10, colour = "grey20",family="Clear Sans"),
     axis.text.y=element_text(face="bold",size = 8, colour = "grey20",family="Clear Sans"),
     legend.title = element_text(colour="black", size=12, face="bold",family="Clear Sans"),
     legend.text = element_text(colour="black", size=10, family="Clear Sans"),
     legend.position="bottom",
     axis.ticks=element_line(size=0.5,colour="grey20"),panel.grid.major.y = element_line(colour="grey90",linetype="longdash",size=1))+
     guides(color = guide_legend(title = "Year", title.position = "top"))
 
p
 
 
# Save plot to png
 
ggsave(p,file="PenangTemps_1885_10yrAverage.png",dpi=500,w=12,h=8,unit="in",type="cairo-png")

Tuesday, 2 December 2014

Mapping Typhoon Haiyan/Yolanda with rNOMADS


The purpose of this article is to:

  • Promote the utility of the recently published rNOMADS software package which links directly to the climate model archive maintained by NOAA.
  • Demonstrate the use of rNOMADS for plotting global surface temperatures and wind-speeds.
  • Show how we used rNOMADS to map the extent and track of Typhoon Yolanda/Haiyan.

rNOMADS allows R users to connect directly with the National Oceanic and Atmospheric Administration's (NOAA) Operational Model Archive and Distribution System (NOMADS). It allows rapid and efficient downloading of a vast range of different types of meteorological data from both global and regional weather models. Data for the past are available in addition to data for all sorts of forecasts.

Global surface temperatures from the NOAA archives (here, the 'gfsanl' GFS analysis model used in the code below) are shown in the map below for 18:00 GMT on 29th October 2014. The band of high sea surface temperatures associated with the tropics is clear. It is also interesting to note the influence of the ‘upwelling zones’ on the western coasts of the South American and African continents, which tend to force cold water northwards. On land, the relatively cool areas associated with the Andes and the Ethiopian Plateau are also visible.


We follow up global temperatures with the map below of global wind-speeds at ground level. The ‘Roaring Forties’, ‘Furious Fifties’ and ‘Screaming Sixties’ are clearly discernible, as are the ‘Doldrums’, the areas of calm wind which hug the equator. You can also make out Cyclone Nilofar in the north Indian Ocean, which was about to strike the west coast of India.


The gif animation below uses the same data source to summarize the path of Typhoon Haiyan/Yolanda, which hit the Philippines in November 2013 causing great devastation and loss of life. The animation depicts wind-speed (km/hr) at three time points: 15:00 GMT and 21:00 GMT on 7th November 2013, and then early morning (03:00 GMT) on 8th November. Each raster grid cell is roughly 50 km by 50 km (0.5 degrees by 0.5 degrees) in extent, so the whole storm was around 650 km across. At its height wind-speeds exceeded 150 km/hr, and the ‘eye of the storm’ is clearly visible as the blue dot in the very centre.
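These back-of-envelope figures are easy to verify: a degree of latitude is about 111 km everywhere on a spherical Earth, while a degree of longitude shrinks by the cosine of the latitude. At the Philippines' latitude (roughly 11°N, our assumption for this sketch) a 0.5-degree cell is therefore about 55 km north-south and slightly less east-west:

```r
# Approximate size of a 0.5-degree grid cell on a spherical Earth.
earth_radius_km <- 6371
km_per_deg_lat  <- pi * earth_radius_km / 180    # ~111 km per degree of latitude

cell_deg <- 0.5
lat_deg  <- 11                                   # rough latitude of the Philippines

cell_ns <- cell_deg * km_per_deg_lat             # north-south extent, ~55.6 km
cell_ew <- cell_ns * cos(lat_deg * pi / 180)     # east-west extent, ~54.6 km

# A storm ~650 km across therefore spans roughly a dozen cells:
round(650 / cell_ns)   # ~12
```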


As usual, the R code is provided below if you are interested in producing similar plots for an area or time of interest. You can get plenty of information and code examples for the rNOMADS package on CRAN (link provided earlier in the post) as well as at the developers' blog posts on the package at the following link - https://bovineaerospace.wordpress.com/tag/rnomads/

# Load libraries
 
library(rNOMADS) 
library(GEOmap)
library(aqfig)
library(rgdal)
 
# Define levels
 
levels <- c("surface", "2 m above ground", "10 m above ground", "300 mb")
 
# Define variables - temperature and wind
 
variables <- c("TMP", "UGRD", "VGRD")
 
# Define additional model inputs
 
abbrev <- "gfsanl"
model.date <- 20141029
model.run <- 06
pred <- 3
 
# Get the data
 
grib.info <- ArchiveGribGrab(abbrev, model.date,model.run, pred, file.type = "grib2")
 
# Read data into R
 
grib.data <- ReadGrib(grib.info$file.name, levels, variables)
 
resolution <- c(0.5, 0.5) #Resolution of the model
 
# Make an array for easier manipulation
 
atmos <- ModelGrid(grib.data, resolution)
 
# Set up display
 
# Plot temperature at 2 m above ground
 
li <- which(atmos$levels == "2 m above ground")
vi <- which(atmos$variables == "TMP")
colormap <- rev(rainbow(500, start = 0 , end = 5/6))
 
# Read world boundaries
 
world <- readOGR(dsn="D:/Data/ne_10m_admin_0_countries", layer="ne_10m_admin_0_countries")
 
png(file = "world_surface_temp.png", width = 9, height = 6, res=400, type="cairo-png",units="in", bg="white",restoreConsole = TRUE)
 
image(atmos$x , sort(atmos$y), atmos$z[li,vi,,], col = colormap,
      xlab = "Longitude", ylab = "Latitude",
      main = paste("World Temperature at Ground Level (deg C):", atmos$fcst.date ,"GMT"))
 
# Plot coastlines
 
plot(world, border = "black", add = TRUE, MAPcol = NA)
 
# Plot legend, convert to Celsius
 
vertical.image.legend(col=colormap, zlim = range(atmos$z[li,vi,,] - 273.15))
dev.off()
 
# Plot wind magnitude at 10 m above ground
 
li <- which(atmos$levels == "10 m above ground")
vi.zonal <- which(atmos$variables == "UGRD") # East-West wind
vi.merid <- which(atmos$variables == "VGRD") # North-South wind
 
wind.mag <- sqrt(atmos$z[li,vi.zonal,,]^(2) + atmos$z[li,vi.merid,,]^(2))
colormap <- rev(rainbow(500, start = 0 , end = 5/6))
 
png(file = "world_surface_wind.png", width = 9, height = 6, res=400, type="cairo-png",units="in", bg="white",restoreConsole = TRUE)
 
image(atmos$x, sort(atmos$y), wind.mag, col = colormap,
      xlab = "Longitude", ylab = "Latitude",xlim=c(100,150),ylim=c(-5,20),
      main = paste("World Winds at Ground Level (km/hr):", atmos$fcst.date, "GMT"))
 
# Plot coastlines
 
plot(world, border = "black", add = TRUE, MAPcol = NA)
 
# Plot legend, convert to km/hr
 
vertical.image.legend(col=colormap, zlim = range(wind.mag * 3.6))
 
dev.off()