Note
The original version of this post (December 2018) used the {gmapsdistance} package. I updated it extensively in 2020 to use the {osrm} package, which doesn’t require an API key or billing details.
tl;dr
The {osrm} R package can retrieve travel durations between points from the OSRM API. I looked at these data for NBA basketball-team arenas, whose details I scraped from the web using {rvest} and mapped with {leaflet}.
On the road
Fans don’t have far to travel in the UK if they want to see their favourite sports team play an away match.
The USA is pretty big, though.
The National Basketball Association (NBA) compensates by separating its teams into Eastern and Western conferences, each with three divisions of five teams. This means that the majority of regular-season games aren’t too far away.
But this varies. Teams are clustered near Lakes Michigan and Erie in the Central division, but the Northwest division stretches from Portland in the Pacific Northwest to Oklahoma City in the centre-south of the country.
What would it take to be a basketball fan who wanted to drive to away games? How long would it take?
R can help
Surprise, this is all a ruse for me to practice with some R packages:
- {rvest} for scraping web pages
- {leaflet} for interactive mapping
- {osrm} for calculating duration of travel between points
There are four main parts to the post: (1) scrape team data, (2) map the locations, (3) get travel duration and (4) make a heatmap.
Let’s start by attaching the packages we need. As always, make sure these are installed first, using install.packages().
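If they’re not installed already, a one-off call like this should do it (a sketch: the package names mirror the library() calls below, plus {janitor}, which is used later for rounding).

# One-off install of the packages used in this post (CRAN versions)
install.packages(c(
  "tidyverse", "rvest", "sf", "osrm",
  "leaflet", "DT", "plotly", "janitor"
))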
# Tidyverse
library(tidyverse) # data handling and plotting
library(rvest) # scrape data
# Geography and travel
library(sf) # handle geographies
library(osrm) # fetch travel info
# Interactive elements
library(leaflet) # interactive maps
library(DT) # interactive tables
library(plotly) # interactive plots
1. Scrape team data
Use {rvest}
The Wikipedia page for the NBA has a table with each team and its location, including coordinates. We can use the {rvest} web-scraping package to extract that table into a data frame with these steps:
- Read the HTML of the page with xml2::read_html()
- Extract the HTML node for the table with rvest::html_nodes()
- Parse the HTML as a table with rvest::html_table()
Note that you have to provide html_nodes() with a CSS selector or an XPath that identifies the table’s ‘location’ in the HTML. You can find these using a tool like SelectorGadget, or with your browser’s ‘inspect’ tool (for Chrome, right-click the element on the page, select ‘Inspect’, right-click the HTML for that element, go to ‘Copy’, then ‘Copy full XPath’).
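If you’d rather not copy a full XPath, a CSS selector works too. The sketch below is an assumption about the page’s current structure: the .wikitable class and the position of the teams table may change, so inspect the returned list to find the right element.

# Hedged alternative: fetch all 'wikitable'-class tables, then pick out
# the one holding the team details by inspecting the list
nba_tables <-
  read_html("https://en.wikipedia.org/wiki/National_Basketball_Association") %>%
  html_nodes("table.wikitable") %>%
  html_table(fill = TRUE)

length(nba_tables)  # how many tables were found?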
nba_scrape <-
read_html("https://en.wikipedia.org/wiki/National_Basketball_Association") %>%
html_nodes(xpath = "/html/body/div[3]/div[3]/div[5]/div[1]/table[4]") %>%
html_table(fill = TRUE, header = NA) %>%
.[[1]] # list was returned, so extract first list element
glimpse(nba_scrape) # a data frame
## Rows: 32
## Columns: 9
## $ Division <chr> "Eastern Conference", "Atlantic", "Atlantic", "Atlantic…
## $ Team <chr> "Eastern Conference", "Boston Celtics", "Brooklyn Nets"…
## $ `City, State` <chr> "Eastern Conference", "Boston, Massachusetts", "New Yor…
## $ Arena <chr> "Eastern Conference", "TD Garden", "Barclays Center", "…
## $ Capacity <chr> "Eastern Conference", "18,624", "17,732", "19,812", "21…
## $ Coordinates <chr> "Eastern Conference", "42°21′59″N 71°03′44″W / 42.366…
## $ Founded <chr> "Eastern Conference", "1946", "1967*", "1946", "1946*",…
## $ Joined <chr> "Eastern Conference", "1946", "1976", "1946", "1949", "…
## $ NA <chr> "Eastern Conference", NA, NA, NA, NA, NA, NA, NA, NA, N…
So, the table has been returned, but it needs to be tidied up.
Wrangle the data
To summarise the main cleaning steps required:
- remove the rogue NA-filled column
- filter out the spanning headers that identify the conferences
- add a column for each team’s conference
- make the arena capacity numeric
- split city and state into separate columns
- isolate the latitude and longitude by separating them from the Coordinates column
- remove the ‘zero width no-break space’ unicode character in the longitude column
- retain only the columns of interest
nba_wrangle <- nba_scrape %>%
select(-length(.)) %>% # remove the last column (NA)
dplyr::filter(!str_detect(Division, "Conference")) %>%
mutate(
Conference = c(rep("Eastern", 15), rep("Western", 15)),
Capacity = as.numeric(str_remove(Capacity, ","))
) %>%
separate(`City, State`, c("City", "State"), sep = ", ") %>%
separate(Coordinates, c("Coords1", "Coords2", "Coords3"), " / ") %>%
separate(Coords3, c("Latitude", "Longitude"), sep = "; ") %>%
separate(Longitude, c("Longitude", "X"), sep = " \\(") %>%
mutate(
Latitude = as.numeric(Latitude),
Longitude = as.numeric(
str_remove(Longitude, "\\ufeff") # remove rogue unicode
)
) %>%
select(
Team, Conference, everything(),
-Founded, -Joined, -Coords1, -Coords2, -X
) %>%
as_tibble() # convert to tibble
glimpse(nba_wrangle)
## Rows: 30
## Columns: 9
## $ Team <chr> "Boston Celtics", "Brooklyn Nets", "New York Knicks", "Phi…
## $ Conference <chr> "Eastern", "Eastern", "Eastern", "Eastern", "Eastern", "Ea…
## $ Division <chr> "Atlantic", "Atlantic", "Atlantic", "Atlantic", "Atlantic"…
## $ City <chr> "Boston", "New York City", "New York City", "Philadelphia"…
## $ State <chr> "Massachusetts", "New York", "New York", "Pennsylvania", "…
## $ Arena <chr> "TD Garden", "Barclays Center", "Madison Square Garden", "…
## $ Capacity <dbl> 18624, 17732, 19812, 21600, 19800, 20917, 20562, 20491, 17…
## $ Latitude <dbl> 42.36630, 40.68265, 40.75056, 39.90111, 43.64333, 41.88056…
## $ Longitude <dbl> -71.06223, -73.97469, -73.99361, -75.17194, -79.37917, -87…
Great, a clean and tidy table.
Add more information
I also wanted to add three-letter team codes, which can be scraped from another Wikipedia table.
I also wanted to add team colours to later customise the marker pins on the interactive map. These are a little tricky to get in an automated way, so I input them manually with reference to teamcolorcodes.com. With {leaflet}, the markers can only take a small set of named colours (see ?awesomeIcons), whereas the icon can use any CSS-valid colour (like hex codes).
I’ve hidden the code for this below to save space and because it doesn’t introduce anything new.
Click for the code that creates a data frame of team codes and colours
nba_abbr_cols <-
read_html(
"https://en.wikipedia.org/wiki/Wikipedia:WikiProject_National_Basketball_Association/National_Basketball_Association_team_abbreviations"
) %>%
html_nodes(xpath = "/html/body/div[3]/div[3]/div[5]/div[1]/table") %>%
html_table(header = TRUE) %>%
.[[1]] %>%
rename(Code = `Abbreviation/Acronym`) %>%
mutate(
# {leaflet} markers take a named colour
colour_marker = case_when(
Code == "ATL" ~ "red",
Code == "BKN" ~ "black",
Code == "BOS" ~ "green",
Code == "CHA" ~ "darkblue",
Code == "CHI" ~ "red",
Code == "CLE" ~ "darkred",
Code == "DAL" ~ "blue",
Code == "DEN" ~ "darkblue",
Code == "DET" ~ "red",
Code == "GSW" ~ "blue",
Code == "HOU" ~ "red",
Code == "IND" ~ "darkblue",
Code == "LAC" ~ "red",
Code == "LAL" ~ "blue",
Code == "MEM" ~ "lightblue",
Code == "MIA" ~ "red",
Code == "MIL" ~ "darkgreen",
Code == "MIN" ~ "darkblue",
Code == "NOP" ~ "darkblue",
Code == "NYK" ~ "blue",
Code == "OKC" ~ "blue",
Code == "ORL" ~ "blue",
Code == "PHI" ~ "blue",
Code == "PHX" ~ "darkblue",
Code == "POR" ~ "red",
Code == "SAC" ~ "purple",
Code == "SAS" ~ "black",
Code == "TOR" ~ "red",
Code == "UTA" ~ "darkblue",
Code == "WAS" ~ "darkblue"
),
# {leaflet} marker icons take hex
colour_icon = case_when(
Code == "ATL" ~ "#C1D32F",
Code == "BKN" ~ "#FFFFFF",
Code == "BOS" ~ "#BA9653",
Code == "CHA" ~ "#00788C",
Code == "CHI" ~ "#000000",
Code == "CLE" ~ "#FDBB30",
Code == "DAL" ~ "#B8C4CA",
Code == "DEN" ~ "#FEC524",
Code == "DET" ~ "#1D42BA",
Code == "GSW" ~ "#FFC72C",
Code == "HOU" ~ "#000000",
Code == "IND" ~ "#FDBB30",
Code == "LAC" ~ "#1D428A",
Code == "LAL" ~ "#FDB927",
Code == "MEM" ~ "#12173F",
Code == "MIA" ~ "#F9A01B",
Code == "MIL" ~ "#EEE1C6",
Code == "MIN" ~ "#9EA2A2",
Code == "NOP" ~ "#C8102E",
Code == "NYK" ~ "#F58426",
Code == "OKC" ~ "#EF3B24",
Code == "ORL" ~ "#C4CED4",
Code == "PHI" ~ "#ED174C",
Code == "PHX" ~ "#E56020",
Code == "POR" ~ "#000000",
Code == "SAC" ~ "#63727A",
Code == "SAS" ~ "#C4CED4",
Code == "TOR" ~ "#000000",
Code == "UTA" ~ "#F9A01B",
Code == "WAS" ~ "#E31837"
)
) %>%
as_tibble()
sample_n(nba_abbr_cols, 5)
## # A tibble: 5 x 4
## Code Franchise colour_marker colour_icon
## <chr> <chr> <chr> <chr>
## 1 IND Indiana Pacers darkblue #FDBB30
## 2 UTA Utah Jazz darkblue #F9A01B
## 3 NOP New Orleans Pelicans darkblue #C8102E
## 4 SAC Sacramento Kings purple #63727A
## 5 MIN Minnesota Timberwolves darkblue #9EA2A2
Now this extra information can be joined to our scraped and wrangled data frame from before.
nba_table <- nba_wrangle %>%
left_join(nba_abbr_cols, by = c("Team" = "Franchise")) %>%
select(Code, everything())
glimpse(nba_table)
## Rows: 30
## Columns: 12
## $ Code <chr> "BOS", "BKN", "NYK", "PHI", "TOR", "CHI", "CLE", "DET",…
## $ Team <chr> "Boston Celtics", "Brooklyn Nets", "New York Knicks", "…
## $ Conference <chr> "Eastern", "Eastern", "Eastern", "Eastern", "Eastern", …
## $ Division <chr> "Atlantic", "Atlantic", "Atlantic", "Atlantic", "Atlant…
## $ City <chr> "Boston", "New York City", "New York City", "Philadelph…
## $ State <chr> "Massachusetts", "New York", "New York", "Pennsylvania"…
## $ Arena <chr> "TD Garden", "Barclays Center", "Madison Square Garden"…
## $ Capacity <dbl> 18624, 17732, 19812, 21600, 19800, 20917, 20562, 20491,…
## $ Latitude <dbl> 42.36630, 40.68265, 40.75056, 39.90111, 43.64333, 41.88…
## $ Longitude <dbl> -71.06223, -73.97469, -73.99361, -75.17194, -79.37917, …
## $ colour_marker <chr> "green", "black", "blue", "blue", "red", "red", "darkre…
## $ colour_icon <chr> "#BA9653", "#FFFFFF", "#F58426", "#ED174C", "#000000", …
Now we have everything we need to visualise the data and fetch the travel duration times.
2. Map the locations
So where are all the arenas?
We can create a simple interactive map with {leaflet} by plotting the Latitude and Longitude columns and creating custom point markers with a basketball icon and each team’s colours, as well as an information box that appears on-click.
leaflet(nba_table) %>%
addProviderTiles(providers$Stamen.TonerLite) %>% # add basemap
addAwesomeMarkers( # add markers
lng = ~Longitude, lat = ~Latitude, # coordinates
popup = ~paste0( # HTML content for popup info
"<b>", nba_table$Team, "</b>", # team name
"<br>", paste0(nba_table$Arena, ", ", nba_table$City), # location
if_else( # division/conference information
nba_table$Conference == "Eastern",
paste0("<br><font color='#0000FF'>", nba_table$Division,
" Division (Eastern Conference)</font>"),
paste0("<br><font color='#FF0000'>", nba_table$Division,
" Division (Western Conference)</font>")
)
),
icon = awesomeIcons(
library = "ion", icon = "ion-ios-basketball", # add basketball icon
markerColor = nba_table$colour_marker, # colour the marker
iconColor = nba_table$colour_icon # colour the basketball icon
)
) %>%
addMeasure() # add straight-line distance-measuring tool
You can drag and zoom and click the points.
3. Get travel duration
So how far between these locations?
The {osrm} R package from Timothée Giraud, Robin Cura and Matthieu Viry lets you fetch shortest paths and travel times from OpenStreetMap via the OSRM API. It defaults to driving, but you can select walking and biking too. Since we’re using the demo server for OSRM, we can only fetch duration.
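As a hedged aside, the server and routing profile are controlled via global options rather than function arguments, so switching profiles would look something like the sketch below. The URL is a hypothetical placeholder and the valid profile names depend on your {osrm} version; the demo server only serves the driving profile.

# Inspect the current settings ({osrm} points at the demo server by default)
getOption("osrm.server")
getOption("osrm.profile")

# To use your own OSRM instance with a different profile (hypothetical URL;
# profile names vary by {osrm} version, e.g. "bike"/"foot"):
# options(osrm.server = "https://my-osrm-instance.example.com/")
# options(osrm.profile = "bike")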
Duration matrix
The osrm::osrmTable() function takes a data frame (or spatial object) where the first three columns are an identifier and coordinates. The return object is a list, where the first element is a matrix of durations for each pair of points.
nba_locs <- select(nba_table, Code, Longitude, Latitude)
nba_dur <- osrmTable(loc = nba_locs)
glimpse(nba_dur)
## List of 3
## $ durations : num [1:30, 1:30] 0 277 269 377 649 ...
## ..- attr(*, "dimnames")=List of 2
## .. ..$ : chr [1:30] "BOS" "BKN" "NYK" "PHI" ...
## .. ..$ : chr [1:30] "BOS" "BKN" "NYK" "PHI" ...
## $ sources :'data.frame': 30 obs. of 2 variables:
## ..$ lon: num [1:30] -71.1 -74 -74 -75.2 -79.4 ...
## ..$ lat: num [1:30] 42.4 40.7 40.7 39.9 43.6 ...
## $ destinations:'data.frame': 30 obs. of 2 variables:
## ..$ lon: num [1:30] -71.1 -74 -74 -75.2 -79.4 ...
## ..$ lat: num [1:30] 42.4 40.7 40.7 39.9 43.6 ...
Duration: all teams
Let’s take this matrix and tidy it into a data frame so there’s one row per team-pair. We can also round the durations to the nearest minute and calculate a rounded number of hours.
nba_dur_all <-
as.data.frame(nba_dur$durations) %>%
rownames_to_column("Start") %>%
pivot_longer(
cols = BOS:SAS,
names_to = "End",
values_to = "Duration (mins)"
) %>%
mutate(
`Duration (mins)`= janitor::round_half_up(`Duration (mins)`),
`Duration (hrs)` = janitor::round_half_up(`Duration (mins)` / 60)
) %>%
arrange(desc(`Duration (mins)`))
Here’s a {DT} interactive table sorted by duration that you can filter. Click the ‘CSV’ button to download the data.
nba_dur_all %>%
datatable(
filter = "top",
extensions = c("Buttons","Scroller"),
class = "compact", width = "100%",
options = list(
dom = "Blrtip",
scroller = TRUE, scrollY = 300,
buttons = list("csv")
)
)
So an incredible 58 hours of driving to get from Miami to Portland.
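As a quick check, a short sketch against the columns we just created pulls out that longest leg directly.

# The single longest trip in the table (with_ties = FALSE avoids returning
# the reverse direction too)
nba_dur_all %>%
  slice_max(`Duration (mins)`, n = 1, with_ties = FALSE)

# Or look up the specific pair
nba_dur_all %>%
  dplyr::filter(Start == "MIA", End == "POR")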
Duration: by division
We can also narrow this down to get only the team-pairs that play in the same division as each other.
nba_dur_div <- nba_dur_all %>%
left_join(select(nba_table, Code, Division), by = c("Start" = "Code")) %>%
left_join(select(nba_table, Code, Division), by = c("End" = "Code")) %>%
dplyr::filter(Division.x == Division.y, `Duration (mins)` != 0) %>%
select(Division = Division.x, everything(), -Division.y) %>%
arrange(Division, desc(`Duration (mins)`))
Again, here’s an interactive table that you can use to explore the data. Note that it’s ordered by Division and then duration in minutes. I’ve hidden the code because it’s the same as for the table above.
Click for the {DT} code
nba_dur_div %>%
datatable(
filter = "top",
extensions = c("Buttons","Scroller"),
rownames = FALSE,
class = "compact", width = "100%",
options = list(
dom = "Blrtip",
scroller = TRUE, scrollY = 300,
buttons = list("csv")
)
)
This time we can see that there’s a maximum of 33 hours of driving required between two teams in the same division: Portland to Oklahoma City.
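To compare divisions at a glance, here’s a small sketch that pulls out the longest within-division drive for each one (with_ties = FALSE stops both directions of the same pair being returned).

# Longest within-division journey per division
nba_dur_div %>%
  group_by(Division) %>%
  slice_max(`Duration (mins)`, n = 1, with_ties = FALSE) %>%
  ungroup()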
A quick diversion: routing
We know from using osrm::osrmTable() that Miami to Portland has the longest travel duration. What’s the route?
Fortunately, {osrm} has the function osrmRoute() for fetching the route between a pair of points.
We can grab a vector of coordinates for each team from our nba_table object and set these as our origin (src) and destination (dst) in osrm::osrmRoute(). The return object is a ‘linestring’ object that contains detail on the coordinates and coordinate system for the route.
# Function to extract latlong vectors for teams
get_ll <- function(data, team_code) {
team_data <- dplyr::filter(data, Code == team_code)
lng <- pull(team_data, Longitude)
lat <- pull(team_data, Latitude)
lnglat <- c(lng, lat)
return(lnglat)
}
# Get route between latlong pairs
route <- osrmRoute(
src = get_ll(nba_table, "MIA"),
dst = get_ll(nba_table, "POR"),
returnclass = "sf"
)
route
## Simple feature collection with 1 feature and 4 fields
## geometry type: LINESTRING
## dimension: XY
## bbox: xmin: -122.6658 ymin: 25.78202 xmax: -80.15664 ymax: 45.8407
## CRS: EPSG:4326
## src dst duration distance geometry
## src_dst src dst 3459.378 5245.24 LINESTRING (-80.18809 25.78...
Now we can set up the same type of {leaflet} map as earlier, but we’ll include only Miami and Portland. I’ve hidden the map definition because it’s almost the same as before.
Click for the {leaflet} map definition
mia_por <- nba_table %>%
dplyr::filter(Code %in% c("MIA", "POR"))
mia_por_map <-
leaflet(mia_por) %>%
addProviderTiles(providers$Stamen.TonerLite) %>% # add basemap
addAwesomeMarkers( # add markers
lng = ~Longitude, lat = ~Latitude, # coordinates
popup = ~paste0( # HTML content for popup info
"<b>", mia_por$Team, "</b>", # team name
"<br>", paste0(mia_por$Arena, ", ", mia_por$City), # location
if_else( # division/conference information
mia_por$Conference == "Eastern",
paste0("<br><font color='#0000FF'>", mia_por$Division,
" Division (Eastern Conference)</font>"),
paste0("<br><font color='#FF0000'>", mia_por$Division,
" Division (Western Conference)</font>")
)
),
icon = awesomeIcons(
library = "ion", icon = "ion-ios-basketball", # add basketball icon
markerColor = mia_por$colour_marker, # colour the marker
iconColor = mia_por$colour_icon # colour the basketball icon
)
) %>%
addMeasure() # add straight-line distance-measuring tool
And to that map we can add the line that defines the route:
mia_por_map %>% addPolylines(data = st_geometry(route))
That’s a long way.
4. Make a heatmap
A quick way to visualise the data is to create a heatmap, where we take a matrix of teams in each division and colour by duration. Here, lighter colours indicate greater travel duration.
The plot is interactive; you can hover over squares in each facet to see specific information about that pair, including the exact duration value.
p <- nba_dur_div %>%
ggplot(aes(Start, End)) +
geom_tile(aes(fill = `Duration (hrs)`)) +
xlab("") + ylab("") +
facet_wrap(~Division, scales = "free")
ggplotly(p)
Note the light colours in the Northwest division where teams have to travel far (like the 33-hour trip from Portland to Oklahoma City), while travel durations in the Atlantic and Central divisions are shorter. Of course, the Clippers and Lakers both play in the Staples Center in LA, so their journey time is zero.
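As a quick sanity check on that last point, a sketch against the data we already fetched: the duration between the two Los Angeles teams should be zero (or near-zero, allowing for rounding and snapping to the road network).

# LA Clippers to LA Lakers: same arena, so effectively no travel
nba_dur_all %>%
  dplyr::filter(Start == "LAC", End == "LAL")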
Ending the journey
So, this post shows the power of the {osrm} package for travel distance, duration and routing information.
Of course, it’s rarely as simple as having your geographic data ready to go, so I hope this post also provides a good use case for {rvest} to help you collect information and the {tidyverse} for wrangling it.
The plots here are pretty minimal, but they hopefully give a flavour of how to use {leaflet} for plotting points and the routing between them according to {osrm}.
This post was initially written before the travel restrictions brought about by the 2020 pandemic. Of course, the maps would have been much simpler for the 2020 playoffs, which all took place in a ‘bubble’ at Disney World, Florida!
Session info
## ─ Session info ───────────────────────────────────────────────────────────────
## setting value
## version R version 4.0.2 (2020-06-22)
## os macOS Mojave 10.14.6
## system x86_64, darwin17.0
## ui X11
## language (EN)
## collate en_GB.UTF-8
## ctype en_GB.UTF-8
## tz Europe/London
## date 2020-11-15
##
## ─ Packages ───────────────────────────────────────────────────────────────────
## package * version date lib source
## assertthat 0.2.1 2019-03-21 [1] CRAN (R 4.0.0)
## backports 1.1.8 2020-06-17 [1] CRAN (R 4.0.0)
## bitops 1.0-6 2013-08-17 [1] CRAN (R 4.0.0)
## blob 1.2.1 2020-01-20 [1] CRAN (R 4.0.2)
## blogdown 0.19 2020-05-22 [1] CRAN (R 4.0.0)
## bookdown 0.19 2020-05-15 [1] CRAN (R 4.0.0)
## broom 0.7.0 2020-07-09 [1] CRAN (R 4.0.2)
## cellranger 1.1.0 2016-07-27 [1] CRAN (R 4.0.2)
## class 7.3-17 2020-04-26 [1] CRAN (R 4.0.2)
## classInt 0.4-3 2020-04-07 [1] CRAN (R 4.0.0)
## cli 2.0.2 2020-02-28 [1] CRAN (R 4.0.0)
## colorspace 1.4-1 2019-03-18 [1] CRAN (R 4.0.0)
## crayon 1.3.4 2017-09-16 [1] CRAN (R 4.0.0)
## crosstalk 1.1.0.1 2020-03-13 [1] CRAN (R 4.0.0)
## curl 4.3 2019-12-02 [1] CRAN (R 4.0.0)
## data.table 1.12.8 2019-12-09 [1] CRAN (R 4.0.0)
## DBI 1.1.0 2019-12-15 [1] CRAN (R 4.0.0)
## dbplyr 1.4.4 2020-05-27 [1] CRAN (R 4.0.2)
## digest 0.6.25 2020-02-23 [1] CRAN (R 4.0.0)
## dplyr * 1.0.0 2020-08-10 [1] Github (tidyverse/dplyr@5e3f3ec)
## DT * 0.13 2020-03-23 [1] CRAN (R 4.0.0)
## e1071 1.7-3 2019-11-26 [1] CRAN (R 4.0.0)
## ellipsis 0.3.1 2020-05-15 [1] CRAN (R 4.0.0)
## evaluate 0.14 2019-05-28 [1] CRAN (R 4.0.0)
## fansi 0.4.1 2020-01-08 [1] CRAN (R 4.0.0)
## farver 2.0.3 2020-01-16 [1] CRAN (R 4.0.0)
## forcats * 0.5.0 2020-03-01 [1] CRAN (R 4.0.2)
## fs 1.5.0 2020-07-31 [1] CRAN (R 4.0.2)
## generics 0.0.2 2018-11-29 [1] CRAN (R 4.0.0)
## gepaf 0.1.1 2018-03-05 [1] CRAN (R 4.0.0)
## ggplot2 * 3.3.1 2020-05-28 [1] CRAN (R 4.0.0)
## glue 1.4.1 2020-05-13 [1] CRAN (R 4.0.0)
## gtable 0.3.0 2019-03-25 [1] CRAN (R 4.0.0)
## haven 2.3.1 2020-06-01 [1] CRAN (R 4.0.2)
## hms 0.5.3 2020-01-08 [1] CRAN (R 4.0.2)
## htmltools 0.5.0 2020-06-16 [1] CRAN (R 4.0.2)
## htmlwidgets 1.5.1 2019-10-08 [1] CRAN (R 4.0.0)
## httr 1.4.2 2020-07-20 [1] CRAN (R 4.0.2)
## icon 0.1.0 2020-05-24 [1] Github (ropenscilabs/icon@a5bc1cc)
## janitor 2.0.1 2020-04-12 [1] CRAN (R 4.0.0)
## jsonlite 1.7.0 2020-06-25 [1] CRAN (R 4.0.0)
## KernSmooth 2.23-17 2020-04-26 [1] CRAN (R 4.0.2)
## knitr 1.29 2020-06-23 [1] CRAN (R 4.0.2)
## labeling 0.3 2014-08-23 [1] CRAN (R 4.0.0)
## lattice 0.20-41 2020-04-02 [1] CRAN (R 4.0.2)
## lazyeval 0.2.2 2019-03-15 [1] CRAN (R 4.0.0)
## leaflet * 2.0.3 2019-11-16 [1] CRAN (R 4.0.0)
## leaflet.providers 1.9.0 2019-11-09 [1] CRAN (R 4.0.0)
## lifecycle 0.2.0 2020-03-06 [1] CRAN (R 4.0.0)
## lubridate 1.7.8 2020-04-06 [1] CRAN (R 4.0.0)
## magrittr 1.5 2014-11-22 [1] CRAN (R 4.0.0)
## modelr 0.1.8 2020-05-19 [1] CRAN (R 4.0.2)
## munsell 0.5.0 2018-06-12 [1] CRAN (R 4.0.0)
## osrm * 3.3.3 2020-04-14 [1] CRAN (R 4.0.0)
## pillar 1.4.6 2020-07-10 [1] CRAN (R 4.0.2)
## pkgconfig 2.0.3 2019-09-22 [1] CRAN (R 4.0.0)
## plotly * 4.9.2.1 2020-04-04 [1] CRAN (R 4.0.0)
## purrr * 0.3.4 2020-04-17 [1] CRAN (R 4.0.0)
## R6 2.4.1 2019-11-12 [1] CRAN (R 4.0.0)
## Rcpp 1.0.5 2020-07-06 [1] CRAN (R 4.0.2)
## RCurl 1.98-1.2 2020-04-18 [1] CRAN (R 4.0.0)
## readr * 1.3.1 2018-12-21 [1] CRAN (R 4.0.2)
## readxl 1.3.1 2019-03-13 [1] CRAN (R 4.0.2)
## reprex 0.3.0 2019-05-16 [1] CRAN (R 4.0.2)
## rlang 0.4.7 2020-07-09 [1] CRAN (R 4.0.2)
## rmarkdown 2.1 2020-01-20 [1] CRAN (R 4.0.0)
## rstudioapi 0.11 2020-02-07 [1] CRAN (R 4.0.0)
## rvest * 0.3.6 2020-07-25 [1] CRAN (R 4.0.2)
## scales 1.1.1 2020-05-11 [1] CRAN (R 4.0.0)
## selectr 0.4-2 2019-11-20 [1] CRAN (R 4.0.0)
## sessioninfo 1.1.1 2018-11-05 [1] CRAN (R 4.0.0)
## sf * 0.9-4 2020-06-13 [1] CRAN (R 4.0.0)
## snakecase 0.11.0 2019-05-25 [1] CRAN (R 4.0.0)
## sp 1.4-2 2020-05-20 [1] CRAN (R 4.0.0)
## stringi 1.4.6 2020-02-17 [1] CRAN (R 4.0.0)
## stringr * 1.4.0 2019-02-10 [1] CRAN (R 4.0.0)
## tibble * 3.0.3 2020-07-10 [1] CRAN (R 4.0.2)
## tidyr * 1.1.0 2020-05-20 [1] CRAN (R 4.0.0)
## tidyselect 1.1.0 2020-05-11 [1] CRAN (R 4.0.0)
## tidyverse * 1.3.0 2019-11-21 [1] CRAN (R 4.0.2)
## units 0.6-7 2020-06-13 [1] CRAN (R 4.0.0)
## utf8 1.1.4 2018-05-24 [1] CRAN (R 4.0.0)
## vctrs 0.3.2 2020-07-15 [1] CRAN (R 4.0.0)
## viridisLite 0.3.0 2018-02-01 [1] CRAN (R 4.0.0)
## withr 2.2.0 2020-04-20 [1] CRAN (R 4.0.0)
## xfun 0.16 2020-07-24 [1] CRAN (R 4.0.2)
## xml2 * 1.3.2 2020-04-23 [1] CRAN (R 4.0.0)
## yaml 2.2.1 2020-02-01 [1] CRAN (R 4.0.0)
##
## [1] /Library/Frameworks/R.framework/Versions/4.0/Resources/library