# Base container images for running WIBL in Docker

This page describes how to use the WIBL base container images to run WIBL in Docker or other container runtime environments.

There are currently two tags for the WIBL container image: `1.0.4-amazonlinux` and `1.0.4`, which at present reference the same image; for more information, see here. In the future, these may be split into distinct images for users not running Amazon Linux.

The WIBL container image contains installed and runnable versions of both wibl-python and LogConvert. For more information about how these tools are set up and installed, see the Dockerfile.
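
As a quick smoke test before wiring the image into a larger workflow, you can pull the image and confirm that both tools are available inside the container. The `--help` invocation below is an assumption (a conventional flag for Python CLIs) rather than something documented in this README:

```
# Pull the image referenced throughout this page
$ docker pull ghcr.io/ccomjhc/wibl:1.0.4

# Confirm both tools are on the PATH inside the container
$ docker run --rm -ti ghcr.io/ccomjhc/wibl:1.0.4 'command -v wibl logconvert'

# Assumed: wibl-python prints usage information with --help
$ docker run --rm -ti ghcr.io/ccomjhc/wibl:1.0.4 'wibl --help'
```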

To see how the WIBL container image was built, see BUILDING.md.

## Using the image directly with `docker run`

To run a wibl-python command within the container while reading and writing data from a directory on your host computer, do the following:

```
$ docker run -v ./:/var/wibl -ti ghcr.io/ccomjhc/wibl:1.0.4 'wibl datasim -f test.bin -d 360 -b'
...
INFO:wibl.command.datasim:Step to time: 359000000
INFO:wibl.command.datasim:Step to time: 359157800
INFO:wibl.command.datasim:Step to time: 360000000
INFO:wibl.command.datasim:Total iterations: 600
$ docker run -v ./:/var/wibl -ti ghcr.io/ccomjhc/wibl:1.0.4 'wibl editwibl -m sensor-inject.json test.bin test-inject.bin'
...
```

where:

- `-v ./:/var/wibl` mounts the current working directory on the host to `/var/wibl` in the container.
- `'wibl datasim -f test.bin -d 360 -b'` is the wibl-python command to run in the container.

**Note:** the entire command to be run in the container must be enclosed in quotes.

Since specifying the full docker run command each time you run a wibl-python command can be error-prone, it is often easier to open a shell in the wibl-base container and then run multiple wibl commands:

```
$ docker run -v ./:/var/wibl -ti ghcr.io/ccomjhc/wibl:1.0.4
bash-5.2# wibl datasim -f test.bin -d 360 -b
...
INFO:wibl.command.datasim:Step to time: 359000000
INFO:wibl.command.datasim:Step to time: 359467302
INFO:wibl.command.datasim:Step to time: 360000000
INFO:wibl.command.datasim:Total iterations: 603
bash-5.2# wibl editwibl -m sensor-inject.json test.bin test-inject.bin
...
bash-5.2# exit
exit
```

## Use in your own Dockerfile

To build your own image based on the WIBL image, create a Dockerfile that begins with the following:

```
FROM ghcr.io/ccomjhc/wibl:1.0.4
...
```
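
A derived image built from such a Dockerfile is then built and run the same way as the base image; the tag `my-wibl-workflow` below is only an illustrative name, not something published by the project:

```
# Build the derived image from the directory containing your Dockerfile
$ docker build -t my-wibl-workflow .

# Run it exactly like the base image, mounting the host working directory
$ docker run -v ./:/var/wibl -ti my-wibl-workflow 'wibl datasim -f test.bin -d 360 -b'
```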

## Automating WIBL data processing using PowerShell scripts

Assuming you have a ZIP file named `ydvr-test.zip` (containing one or more YDVR data files) in the current directory of your host computer, you can batch convert these files to WIBL format using the `convertToWibl.ps1` PowerShell script:

```
$ docker run -v ./:/var/wibl -ti ghcr.io/ccomjhc/wibl:1.0.4
bash-5.2# pwsh /opt/bin/convertToWibl.ps1 -Source ydvr-test.zip -OutputFolder wibl-test -LogConvertPath /usr/local/bin/logconvert -Format YDVR -verbose -debug
VERBOSE: Preparing to expand...
VERBOSE: Created '/var/wibl/test/wibl-test/00020001.DAT'.
VERBOSE: Created '/var/wibl/test/wibl-test/00020002.DAT'.
DEBUG: Full logconvert path: /usr/local/bin/logconvert
DEBUG: Source format is: YDVR
VERBOSE: Processing file 00020001.DAT of  ---> 00020001.wibl
VERBOSE: Processing file 00020002.DAT of  ---> 00020002.wibl
warning: generating algorithm request for 'nodatareject' due to bad data.
warning: generating algorithm request for 'nodatareject' due to bad data.
bash-5.2#
```

This will write the converted WIBL files into a directory named wibl-test. Then you should be able to batch process the resulting WIBL files using processWibl.ps1 as follows from within the wibl-base container:

```
bash-5.2# pwsh /opt/bin/processWibl.ps1 wibl-test b12_v3_metadata_example.json configure.local.json
```

where `wibl-test` is the directory in which the input WIBL files are stored (generated by `convertToWibl.ps1` above), `b12_v3_metadata_example.json` is an example CSB provider metadata file, and `configure.local.json` is an example WIBL local processing configuration file. See the wibl-python README and the data management scripts README for more details on these files.

Now that we have GeoJSON files containing soundings from our WIBL files along with B-12 metadata, we can batch validate these files using validateWibl.ps1 from within the wibl-base container:

```
bash-5.2# pwsh /opt/bin/validateWibl.ps1 wibl-test -extension geojson
Validating file /var/wibl/wibl-test/00020001.geojson...
Validation of /var/wibl/wibl-test/00020001.geojson against schema 3.1.0-2023-08 failed due to the following errors:
Path: /properties/trustedNode/uniqueVesselID, error: 'SEAID-UNKNOWN' does not match '^[a-zA-Z][a-zA-Z0-9]*-[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$'
Path: /properties/platform, error: 'length' is a required property
Path: /properties/platform/IDType, error: 'LoggerName' is not one of ['MMSI', 'IMO']
Path: /properties/platform/IDNumber, error: 'UNKNOWN' is not valid under any of the given schemas
Path: /properties/platform/uniqueID, error: 'SEAID-UNKNOWN' does not match '^[a-zA-Z][a-zA-Z0-9]*-[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$'
Path: /properties/platform/IDType, error: Unknown IDType LoggerName.
Path: /properties/platform/dataProcessed, error: dataProcessed flag is 'false', but 'processing' properties were found.
Validating file /var/wibl/wibl-test/00020002.geojson...
Validation of /var/wibl/wibl-test/00020002.geojson against schema 3.1.0-2023-08 failed due to the following errors:
Path: /properties/trustedNode/uniqueVesselID, error: 'SEAID-UNKNOWN' does not match '^[a-zA-Z][a-zA-Z0-9]*-[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$'
Path: /properties/platform, error: 'length' is a required property
Path: /properties/platform/IDType, error: 'LoggerName' is not one of ['MMSI', 'IMO']
Path: /properties/platform/IDNumber, error: 'UNKNOWN' is not valid under any of the given schemas
Path: /properties/platform/uniqueID, error: 'SEAID-UNKNOWN' does not match '^[a-zA-Z][a-zA-Z0-9]*-[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$'
Path: /properties/platform/IDType, error: Unknown IDType LoggerName.
Path: /properties/platform/dataProcessed, error: dataProcessed flag is 'false', but 'processing' properties were found.
```

These example GeoJSON files don't validate. If these were real data, we'd have some work to do to reconcile the metadata template passed to processWibl.ps1 with the WIBL data that we have before being able to submit to DCDB.
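
One possible iteration loop, reusing the commands shown above: edit the metadata template on the host (the working directory is mounted at /var/wibl), then re-run processing and validation inside the container until the files validate:

```
# After editing b12_v3_metadata_example.json on the host...
bash-5.2# pwsh /opt/bin/processWibl.ps1 wibl-test b12_v3_metadata_example.json configure.local.json
bash-5.2# pwsh /opt/bin/validateWibl.ps1 wibl-test -extension geojson
```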

Once your GeoJSON files validate, you can use `submitDCDB.ps1` to batch submit validated GeoJSON files to DCDB:

```
bash-5.2# pwsh -Command Get-Help /opt/bin/submitDCDB.ps1
submitDCDB.ps1 [-inPath] <string> -authFile <string> -configFile <string> [-extension <string>] [<CommonParameters>]
```
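
Based on that parameter signature, a submission run might look like the following; the authentication file name (`dcdb-auth.txt`) is a placeholder for whatever credentials file your DCDB provider setup requires, and the configuration file is assumed (not confirmed here) to be the same one used for local processing above:

```
bash-5.2# pwsh /opt/bin/submitDCDB.ps1 wibl-test -authFile dcdb-auth.txt -configFile configure.local.json -extension geojson
```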

For more information on how to perform DCDB submissions, see the wibl-python README and the data management scripts README.