FVCOM Access in R and Spatial Subsetting

Using Bounding Polygons to Trim FVCOM Mesh Data

Published

April 28, 2024

Subsetting FVCOM Spatially with {fvcom}

This approach leans on the work of Ben Tupper at Bigelow Labs and their package {fvcom}. Many thanks to Ben and the Bigelow team for their work on this.

This package can be used to get FVCOM data from downloaded files or from links to THREDDS/OPeNDAP endpoints. Once the NetCDF files are opened in the environment, the {fvcom} package uses the coordinate information to generate a mesh with the {sf} package. This mesh can then be clipped using geoprocessing functions from {sf}; done this way, the triangular mesh structure is preserved and no elements (current direction, etc.) become orphaned.

FVCOM meshes can be saved as shapefile/GeoJSON/CSV files: lightweight files that contain the lat/lon index numbers and their coordinates for verification. These index numbers can be used directly to subset FVCOM files in R or Python.

NOTE: Mesh structure is not consistent across FVCOM versions. FVCOM-GOM3 uses a different mesh than FVCOM-GOM4 or FVCOM-GOM5, which have expanded regional coverage and increased mesh density.

Read more about their package here: https://github.com/BigelowLab/fvcom

Accessing NECOFS GOM4

As explained in the repo documentation for Ben’s {fvcom} package, data are served via OPeNDAP on a THREDDS server. NECOFS has its own THREDDS directory which can be used to browse what is available.
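As a minimal sketch of opening one of these connections (the catalog path and filename below are examples; confirm the current layout on the NECOFS THREDDS directory before using them):

```r
library(fvcom)
library(ncdf4)

# Example OPeNDAP path to a monthly hindcast file -- the exact catalog
# layout and filename are assumptions; browse the NECOFS THREDDS
# directory to confirm
uri <- paste0(
  "http://www.smast.umassd.edu:8080/thredds/dodsC/models/fvcom/NECOFS/",
  "Archive/Seaplan_33_Hindcast_v1/gom4_201907.nc")

# Opens a lazy connection to the remote NetCDF; data are only transferred
# when variables are actually requested
x <- nc_open(uri)
```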

Working with the Nodes & Elements

We will use this July 2019 data to get information about the nodes we wish to gather data from.
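As a quick sketch, the {fvcom} package’s fvcom_time() helper (assuming the open connection x from above) returns the timestamps stored in the file:

```r
# First few timestamps available in this file (POSIXct, UTC)
head(fvcom_time(x))
```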

[1] "2019-07-01 00:00:00 UTC" "2019-07-01 01:01:52 UTC"
[3] "2019-07-01 01:58:07 UTC" "2019-07-01 03:00:00 UTC"
[5] "2019-07-01 04:01:52 UTC" "2019-07-01 04:58:07 UTC"

Working with the Mesh using Simple Features

The {fvcom} package has functions that can generate the mesh as a simple features dataframe. This carries the node locations in a table as an interconnected mesh.
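A minimal sketch, assuming the open connection x: get_mesh_geometry() should return the triangular mesh as an {sf} polygon object in lon/lat coordinates.

```r
library(sf)

# Triangular mesh as simple features: one polygon per mesh element,
# with the node ids carried along as attributes
mesh <- get_mesh_geometry(x, what = "lonlat")
```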

The point IDs for these nodes should correspond directly with their index order in the NetCDF files. If all goes to plan, we should be able to crop/clip this mesh as we would any polygon, and use the remaining point IDs later to subset from the OPeNDAP connection.

Finding the Nodes We Care About:

We can do ourselves a favor by identifying which nodes we’re interested in before pulling variables. There are a couple different approaches to try:

  1. Use the nodes as points, and use st_intersects() to identify which fall within the areas we are studying (sketched below).
  2. Use the mesh itself, to preserve the triangular geometries, then use the {fvcom} functions to pull data.
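Here is a rough sketch of the first approach. study_area is a placeholder for one of the polygons listed below, and it must share the node layer’s CRS (st_transform() first if not):

```r
# Node locations as an sf point layer (node id + lon/lat)
nodes_sf <- fvcom_nodes(x, what = "lonlat") |>
  st_as_sf(coords = c("lon", "lat"), crs = 4326)

# Flag the nodes falling inside the study polygon, keep those rows
inside      <- st_intersects(nodes_sf, study_area, sparse = FALSE)[, 1]
study_nodes <- nodes_sf[inside, ]
```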

Areas We are Interested in

  1. VTS Survey, Nearshore Lobster Habitat
  2. GOM & GB Ecological Production Units
  3. Northeast Shelf

Mesh Containment Within an Area

We’ll test this first with the offshore areas, but the plan is to take the mesh and check for overlap/containment within the polygons above.
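A sketch of that containment check, assuming mesh from above and a study polygon study_area in the same CRS. Keeping only triangles fully within the polygon means no element is left with orphaned nodes:

```r
# TRUE where a mesh triangle lies entirely within the study polygon
contained <- st_within(mesh, study_area, sparse = FALSE)[, 1]
mesh_sub  <- mesh[contained, ]
```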

Coordinate Reference System:
  User input: EPSG:6318 
  wkt:
GEOGCRS["NAD83(2011)",
    DATUM["NAD83 (National Spatial Reference System 2011)",
        ELLIPSOID["GRS 1980",6378137,298.257222101,
            LENGTHUNIT["metre",1]]],
    PRIMEM["Greenwich",0,
        ANGLEUNIT["degree",0.0174532925199433]],
    CS[ellipsoidal,2],
        AXIS["geodetic latitude (Lat)",north,
            ORDER[1],
            ANGLEUNIT["degree",0.0174532925199433]],
        AXIS["geodetic longitude (Lon)",east,
            ORDER[2],
            ANGLEUNIT["degree",0.0174532925199433]],
    USAGE[
        SCOPE["Horizontal component of 3D system."],
        AREA["Puerto Rico - onshore and offshore. United States (USA) onshore and offshore - Alabama; Alaska; Arizona; Arkansas; California; Colorado; Connecticut; Delaware; Florida; Georgia; Idaho; Illinois; Indiana; Iowa; Kansas; Kentucky; Louisiana; Maine; Maryland; Massachusetts; Michigan; Minnesota; Mississippi; Missouri; Montana; Nebraska; Nevada; New Hampshire; New Jersey; New Mexico; New York; North Carolina; North Dakota; Ohio; Oklahoma; Oregon; Pennsylvania; Rhode Island; South Carolina; South Dakota; Tennessee; Texas; Utah; Vermont; Virginia; Washington; West Virginia; Wisconsin; Wyoming. US Virgin Islands - onshore and offshore."],
        BBOX[14.92,167.65,74.71,-63.88]],
    ID["EPSG",6318]]

Use New Mesh to Extract Vars:

Getting Averages Over Multiple Times
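One way to sketch the time-averaging with plain {ncdf4} reads, assuming the usual FVCOM variable layout of [node, siglay, time]:

```r
# Surface temperature: all nodes, top sigma layer (index 1), every time
# step in the file -- returns a node x time matrix
surf_temp <- ncvar_get(x, "temp",
                       start = c(1, 1, 1),
                       count = c(-1, 1, -1))

# Average across the time dimension for one value per node
surf_temp_avg <- rowMeans(surf_temp)
```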

[1] 53087    40
[1] 53087    41

Bottom Temp:

I had originally done this with January data, and the bottom temperatures were higher than the surface temperatures. Revisiting it here with July data:
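A sketch of pulling the bottom field, assuming the deepest sigma layer is the last siglay index:

```r
# Number of sigma layers in this file
n_siglay <- x$dim$siglay$len

# Bottom temperature: same read as the surface, but at the deepest layer
bot_temp <- ncvar_get(x, "temp",
                      start = c(1, n_siglay, 1),
                      count = c(-1, 1, -1))
bot_temp_avg <- rowMeans(bot_temp)
```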

Surface/Bottom Comparison

Compare the difference in surface and bottom measurements:
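Continuing the sketch above, the node-wise difference is just:

```r
# Positive values = warmer at the surface, as expected for July
temp_diff <- surf_temp_avg - bot_temp_avg
summary(temp_diff)
```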

Extracting Vector Elements

The magnitude and direction of current flow are measured as two velocity components: an eastward (“u”) and a northward (“v”) water velocity. These are stored in FVCOM on the elements, with single values associated with the centroids of the triangular mesh pieces.

The fvcom package has a function for extracting data for these elements:
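Per the package README that function is get_elem_var(); a minimal sketch with the open connection x:

```r
# u/v velocity components at the element centroids (one value per triangle)
uv <- get_elem_var(x, var = c("u", "v"))
```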

Interpolate Currents to Regular Grid
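The {fvcom} package has its own rasterization helpers, but as a generic sketch, inverse-distance-weighted interpolation with {gstat} works from element-centroid points onto a regular grid. Here elems_sf is assumed to be an sf point layer of centroids carrying u and v columns:

```r
library(sf)
library(gstat)

# Regular grid of prediction points covering the elements
grid <- st_make_grid(elems_sf, cellsize = 0.05, what = "centers") |>
  st_as_sf()

# IDW-interpolate each velocity component onto the grid
u_grid <- idw(u ~ 1, locations = elems_sf, newdata = grid, idp = 2)
v_grid <- idw(v ~ 1, locations = elems_sf, newdata = grid, idp = 2)

# Current speed on the regular grid
speed <- sqrt(u_grid$var1.pred^2 + v_grid$var1.pred^2)
```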


Next Steps - GOM3 Node/Data Matching

To go back further than 2016 we need to use the GOM3 hindcast records. These have a different file location in the THREDDS directory, and we’ll need to confirm the grid structure and variables are consistent across these different model runs.

Verify that the IDs for nodes and elements are the same between NECOFS and FVCOM-GOM3.

If we want to go back further in time than NECOFS covers, we’ll need to use data from the FVCOM-GOM3 hindcast. Best-case scenario, these mesh IDs match so we can append the time periods together.

Can we verify they match? They seem to have the same number of nodes and the same number of elements.
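A sketch of that check, comparing coordinates directly between two open connections (gom3_nc and necofs_nc are assumed handles to a GOM3 hindcast file and a NECOFS file; lon/lat are the node coordinates and lonc/latc the element centroids in FVCOM files):

```r
# Node longitudes identical, in the same order?
all.equal(ncvar_get(gom3_nc, "lon"), ncvar_get(necofs_nc, "lon"))

# Element-centroid longitudes identical?
all.equal(ncvar_get(gom3_nc, "lonc"), ncvar_get(necofs_nc, "lonc"))
```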

[1] TRUE
[1] TRUE

Temporal Frequency

What about the time frequency: what is the pace of the hindcast records? It looks hourly, which is probably more than we need. This will require some thought on how to match hourly data to daily biological data, and on pulling the right indices over which to average.
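A sketch of both checks using the file’s timestamps; the value printed below looks like the span of the file in days:

```r
t <- fvcom_time(x)

# Median gap between consecutive records, in hours (~1 if hourly)
median(as.numeric(diff(t), units = "hours"))

# Total span of the file in days
as.numeric(difftime(max(t), min(t), units = "days"))
```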

[1] 31.26466

Save Node/Element IDs for regions of interest

Now that we know the node and element IDs are consistent across GOM3 & NECOFS, we can use the approach above to save the mesh information for areas within the regions we are looking at:

These can be used, if needed, to limit the data being transferred over the OPeNDAP connection and save data transfer/processing time. They would also be useful simply as a means to match points to a standalone mesh geometry.
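A sketch of writing one of these out (filenames are illustrative), using the clipped mesh and node set from above:

```r
# Clipped mesh with node/element ids carried as attribute columns
st_write(mesh_sub, "gom3_mesh_vts_survey.geojson", delete_dsn = TRUE)

# Lightweight CSV of node ids + coordinates for verification
coords <- st_coordinates(study_nodes)
write.csv(
  data.frame(node = study_nodes$node,
             lon  = coords[, 1],
             lat  = coords[, 2]),
  "gom3_nodes_vts_survey.csv", row.names = FALSE)
```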

Coordinate Reference System: EPSG:6318, NAD83(2011) (same WKT as printed above)


Testing Multiple Dates of Data with Target Mesh

So where are we at:

  1. Write some code that can open a connection for an area we care about.
  2. Grab 1-4 variables at the surface, and the same ones at the bottom.
  3. Average for the month and output them in a table.
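A rough sketch of that workflow as a single function, under the same assumptions used throughout (variable layout [node, siglay, time]; node_idx is the vector of node IDs saved above; the file path passed in follows whatever monthly naming the THREDDS directory uses):

```r
library(ncdf4)

# Hypothetical monthly extraction: open one monthly hindcast file, read a
# variable at the surface and bottom layers, and return node-wise monthly
# means for the nodes we kept
monthly_surface_bottom <- function(uri, node_idx, var = "temp") {
  nc <- nc_open(uri)
  on.exit(nc_close(nc))
  n_siglay <- nc$dim$siglay$len

  # Read all nodes, one layer, every time step; subset to our nodes after
  # (simpler than strided OPeNDAP reads, at some transfer cost)
  surf <- ncvar_get(nc, var, start = c(1, 1, 1),        count = c(-1, 1, -1))
  bot  <- ncvar_get(nc, var, start = c(1, n_siglay, 1), count = c(-1, 1, -1))

  data.frame(node         = node_idx,
             surface_mean = rowMeans(surf)[node_idx],
             bottom_mean  = rowMeans(bot)[node_idx])
}
```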

Accessing Mesh Data for an Extended Period of Time

We don’t need hourly, or even daily records. But we will need to make sure we’re aggregating in time correctly and not just grabbing snapshots.

So, where we sit now: we know the exact IDs for the nodes and centroids that make up complete mesh elements within our area of interest. We also know the variables we are interested in, and possibly the time period we want to cover and the temporal resolution we’re interested in (months).