| Exercise | XML and Met.ie |
|---|---|
| Overall Goal | To show how an XML-based web service can be used within FME |
| Demonstrates | Calling a web service and processing the XML response using FME's XML transformers |
| Completed Workspace | Download Completed Workspace |
This exercise retrieves weather data for each of the RNLI's offshore lifeboat stations around Ireland to produce a table view of the latest weather conditions their all-weather boats (ALBs) would face if they had to launch on a shout in response to a request from the Coast Guard.

The exercise takes the latest data from the RNLI's ArcGIS Online portal, filters out the stations of interest, and then, for each station, makes a request to Met.ie for the latest weather and forecast information.

The station locations can be seen in the ArcGIS portal here: RNLI Lifeboat Stations. The exercise will use the API provided to bring in the data from their service.
1) Create a New Blank Workspace
Open FME Desktop and select New to create a new blank canvas.
2) Read in Station Data
Add a GeoJSON Reader and set the Dataset to the following URL:

https://opendata.arcgis.com/datasets/7dad2e58254345c08dfde737ec348166_0.geojson

Then set the Coord. System to EPSG:4326.
This will add a Reader feature type for the Station data. If you run the Reader and view the output in the Visual Preview you will see that data is returned for the UK and Ireland so our next step is to filter the data to just Ireland.
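Outside of FME, the Reader's job can be sketched in a few lines of Python. The snippet below parses a tiny inline sample that stands in for the live ArcGIS Open Data response (the station name and coordinates are illustrative, not real values from the service), just to show the GeoJSON structure the Reader is consuming.

```python
import json

# A trimmed, illustrative stand-in for the live GeoJSON response; the real
# feed carries many more properties per station, including Country and
# Station_Ty, which we will filter on shortly.
sample = json.loads("""
{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"Station": "Howth", "Country": "IRL", "Station_Ty": "ALB"},
     "geometry": {"type": "Point", "coordinates": [-6.07, 53.39]}}
  ]
}
""")

for feature in sample["features"]:
    props = feature["properties"]
    lon, lat = feature["geometry"]["coordinates"]  # GeoJSON order is [lon, lat]
    print(props["Station"], props["Country"], lat, lon)
```

Note the coordinate order: GeoJSON stores longitude first, which is why declaring the coordinate system as EPSG:4326 in the Reader matters.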
3) Filter Data to Ireland
From viewing the data above we can see that Irish stations are identified within the data, as are the stations that have an all-weather boat (ALB). To filter on these, add a Tester transformer to the workspace and connect it to the GeoJSON Reader feature type.
In the Tester we can add the following test criteria:

| Logic | Left Value | Operator | Right Value |
|---|---|---|---|
| (leave blank) | Country | = | IRL |
| AND | Station_Ty | != | ILB |
The completed tests should look like the below.

Remember that in 2020.1 you can make use of Cached Values to pick from the values contained within the relevant attribute. To do this, select the drop-down arrow on the far right of the Right Value field and select Cached Values, followed by the value you want to use.
If you reinspect the data on the Passed port of the Tester you will see we now only have stations in Ireland where the Station type is ALB (All Weather) or ALB/ILB (All Weather with Inshore).
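The Tester's two conditions can be sketched as a simple Python predicate. The attribute names Country and Station_Ty come from the source data; the station list below is illustrative only.

```python
# Illustrative station records; the real data comes from the GeoJSON Reader.
stations = [
    {"Station": "Howth",         "Country": "IRL", "Station_Ty": "ALB"},
    {"Station": "Skerries",      "Country": "IRL", "Station_Ty": "ILB"},
    {"Station": "Penlee",        "Country": "GBR", "Station_Ty": "ALB"},
    {"Station": "Dun Laoghaire", "Country": "IRL", "Station_Ty": "ALB/ILB"},
]

def passes(feature):
    # Country = IRL AND Station_Ty != ILB, exactly as configured in the Tester.
    return feature["Country"] == "IRL" and feature["Station_Ty"] != "ILB"

passed = [f for f in stations if passes(f)]
print([f["Station"] for f in passed])
```

Note that "ALB/ILB" is not equal to "ILB", so stations with both boat types pass the test, which is the behaviour we want.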
4) Tidy up Attributes
In general, and especially when merging data, it is best practice to remove attributes that are not required and to ensure that the attribute names used are logical.

Add an AttributeManager and connect it to the Tester's Passed port as shown below.

Inside the transformer, set up the attributes as per the table below and remove the others.
| Input | Output | Action |
|---|---|---|
| SAP_ID | StationID | Rename |
| Station | Station | Do Nothing |
| County | County | Do Nothing |
| Lat_Dec_De | Lat | Rename |
| Long_Dec_D | Long | Rename |
The transformer should look like the below when complete.
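The AttributeManager's actions amount to a rename-and-drop mapping, which can be sketched as a dictionary comprehension. The sample attribute values here are hypothetical.

```python
# Keep-and-rename mapping from the table above; anything not listed is dropped.
rename = {
    "SAP_ID": "StationID",
    "Station": "Station",
    "County": "County",
    "Lat_Dec_De": "Lat",
    "Long_Dec_D": "Long",
}

# A hypothetical incoming feature with one extra attribute to be removed.
raw = {"SAP_ID": "S123", "Station": "Howth", "County": "Dublin",
       "Lat_Dec_De": 53.39, "Long_Dec_D": -6.07, "OBJECTID": 7}

tidy = {new: raw[old] for old, new in rename.items() if old in raw}
print(tidy)
```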
5) Sample Data For Testing
For testing we do not need to pass all of the features into the HTTPCaller, as doing so makes unnecessary requests. The Sampler transformer can be used to pass a single feature into the HTTPCaller. Add a Sampler and connect it to the AttributeManager, then in the transformer's parameters window set the Sampling Rate (N) to 1 and the Sampling Type to First N Features, as shown below.
6) Call the Met.ie Service
Add an HTTPCaller and connect it to the Sampled port of the Sampler transformer.

Open the HTTPCaller's parameters window and configure the settings as follows.

Set the Request URL to

http://metwdb-openaccess.ichec.ie/metno-wdb2ts/locationforecast?

and the HTTP Method to GET, then set the Query String Parameters to:

| Name | Value |
|---|---|
| lat | Lat |
| long | Long |

Finally, as a best-practice step, set the Response Body Attribute to _xmlPayload. This gives us a more descriptive name for the response attribute we are using and makes it easier to track through the workflow.

The resulting configuration should look like the below.
7) Validate the Response
As a best-practice step, and to ensure that the response is valid, we can use the XMLValidator. Add an XMLValidator transformer and connect it to the Output port of the HTTPCaller.

In the transformer's parameters window, set the Attribute with XML Text to _xmlPayload.

Run the transformer and the response should come out of the Passed port.
8) Extract the Values from XML
Now that we have a valid XML payload, we can review the response that comes back from the service to understand how the data is structured. From reviewing the data we can see the different tags it uses: there is a root weatherdata tag, a meta tag holding information on the forecast models, and a product tag which holds the forecast information.

From the data we can see that this is a nested structure, with each time tag holding the forecast information for a given hour.

Remember that your response will reflect current conditions, so your data values will differ from those shown as you move through the remainder of the exercise.
To get the forecast information as individual records we can use the XMLFragmenter transformer to break the data up into individual features. Add an XMLFragmenter transformer and connect it to the Passed port of the XMLValidator.

In the transformer's parameters window, set the XML Attribute to _xmlPayload, the Elements to Match to time, and lastly set Merge Attributes From Input Feature to Yes.
This last option ensures we keep all of the original attributes containing information about each station on the new features. This is useful because, when we process all of the stations, it tells us which forecasts relate to which station.

As we want the information from the forecast to become attributes on the feature, we can set these to be flattened onto it as part of the process.

Click on Options next to Flatten Options and, in the subsequent dialogue, check Enable Flattening.
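Fragmenting and flattening can be sketched together in Python: emit one record per time element, with the attributes of nested child elements collapsed into dotted names such as location.temperature.value. The payload below is a trimmed, illustrative response, not real forecast data.

```python
import xml.etree.ElementTree as ET

# Trimmed, illustrative payload with two hourly <time> fragments.
_xmlPayload = """<weatherdata><product>
  <time from="2020-08-01T10:00:00Z" to="2020-08-01T11:00:00Z">
    <location>
      <temperature value="14.5"/>
      <windSpeed beaufort="4"/>
    </location>
  </time>
  <time from="2020-08-01T11:00:00Z" to="2020-08-01T12:00:00Z">
    <location>
      <temperature value="15.1"/>
      <windSpeed beaufort="3"/>
    </location>
  </time>
</product></weatherdata>"""

def flatten(elem, prefix=""):
    """Collect attributes of elem and its descendants under dotted keys."""
    out = dict((prefix + name, value) for name, value in elem.attrib.items())
    for child in elem:
        out.update(flatten(child, prefix + child.tag + "."))
    return out

root = ET.fromstring(_xmlPayload)
fragments = [flatten(t) for t in root.iter("time")]  # one record per <time>
for frag in fragments:
    print(frag["from"], frag["location.temperature.value"])
```

The dotted keys produced here mirror the flattened attribute names the XMLFragmenter creates, which is what we expose in the next step.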
9) Expose Attributes
As with the JSON exercises we can leverage the new functionality in FME 2020.1 to easily expose the attributes from the feature on to the canvas.
Add an AttributeExposer and connect it to the Fragments port of the XMLFragmenter.

From the AttributeExposer's parameters window, select Import and then From Feature Cache.

This will provide a list of all of the available attributes. Select the following attributes from the list to expose them on the canvas:
- location.cloudiness.percent
- location.humidity.value
- location.pressure.value
- location.temperature.value
- location.windDirection.name
- location.windSpeed.beaufort
10) Filter for Current Forecast
Our requirement is for the current forecast at each station, but the service returns approximately 10 days of forecast information. With a single response it is easy to filter off the first record, but with multiple responses we have to work out which is the first forecast for each station.

As the forecast information is pre-sorted by date and time, we know that the first record is the current situation. To uniquely identify this record we can use a Counter to assign a count to the forecasts that restarts for each station.

To do this, add a Counter transformer and connect it to the AttributeExposer. In the parameters window, set the Counter Name to StationID. This will create a separate count of forecasts for every station.
To keep just the first record for each station, add a Tester transformer and connect it to the Counter transformer.

Within the parameters, set the Left Value to _count, the Operator to =, and the Right Value to 0.

This will isolate the first forecast record for each station.
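The Counter-plus-Tester pattern can be sketched as a per-station count starting at 0, keeping only the zero-count record. The station IDs and times below are illustrative; `_count` is assumed to be the Counter's output attribute name.

```python
# Illustrative forecast records, pre-sorted by time within each station,
# as the Met.ie response delivers them.
forecasts = [
    {"StationID": "S1", "from": "10:00"},
    {"StationID": "S1", "from": "11:00"},
    {"StationID": "S2", "from": "10:00"},
    {"StationID": "S2", "from": "11:00"},
]

# Counter step: an independent count per station, starting at 0.
counts = {}
for f in forecasts:
    f["_count"] = counts.get(f["StationID"], 0)
    counts[f["StationID"]] = f["_count"] + 1

# Tester step: _count = 0 keeps only the first (current) forecast per station.
current = [f for f in forecasts if f["_count"] == 0]
print([(f["StationID"], f["from"]) for f in current])
```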
11) Tidy up Output
We now have the data we need, but we also have a lot of additional attributes left over from the process. It is best practice to tidy these up and to give our new attributes names relevant to our workflow. This makes the workspace more efficient, and easier to maintain and understand.
The AttributeManager allows us to manage our current schema efficiently in a single transformer. Add an AttributeManager and connect to the Passed port on the Tester.
Inside the AttributeManager configure the attributes as shown below, giving the new forecast attributes relevant names. Remove the other attributes that are no longer required.
You can use the arrows at the bottom left of the Attribute Actions section to move the attributes up and down the dialogue so that the ones being kept and renamed are all at the top. This makes it easier to see the changes by grouping them together and presenting the relevant changes first. Whilst not required it makes the transformer easier to understand when reviewing.
Lastly run the transformer and review the output within the Visual Preview.
12) Run for all Stations
So far we have been building our process using a single station feature that we sampled off at the start of our workflow.

To run for all stations we can either remove the Sampler transformer or connect the NotSampled port up to the HTTPCaller along with the Sampled port. The latter is often a useful development step when testing; once we are ready to use the workspace live, we will remove the Sampler.
Connect up the NotSampled Port and use the Run From This function to run the Workspace from this transformer. This will trigger the downstream transformers highlighted green to run with all of the features rather than just the sample we've used to date.
Having run the workspace, you can view the output of the AttributeManager at the end of the workflow for a complete view of all of the stations' forecasts in their tidied state.
Advanced Task

The task above only views the data in the Visual Preview window, but we could use the information to create an HTML report served by FME Server, allowing a user to quickly get an overview of the weather conditions across all stations. Use your FME skills to see if you can create a report showing the station location and forecast information.
CONGRATULATIONS

The tasks above demonstrate how you can use FME to read data from one web service, use it to make a second call to another web service with a different payload structure, and combine the information together into usable features within your workflow.