
Adam Stout
ServiceNow Employee

There have been a lot of requests recently to view current COVID-19 infection information. To simplify my users’ experience, I was asked how we can display this on a dashboard along with other information that is key to managing the situation.

I’m glad I have the Now Platform and such a great team to work with. [Thanks for your help with this @Padmanabam Tirumala.]

We have a few options to work with, so let’s review them before diving into each one.

  1. IFrame on Dashboard: This is cutting corners as the look and feel will most likely not match, and we have no control over the contents. For quick results or keeping an eye on “outside” perspectives or reference, this works fine.
  2. Remote Table to Import on the Fly: use an external table as a source for our styled visuals. It will match the look and feel of the rest of our dashboard, but we are not importing the data, so Performance Analytics will not be able to create snapshots of the data for trending and advanced analysis purposes.
  3. Import the Data: we import the data and can then do whatever we need and want including trending and providing us with more advanced analysis and filtering options.

IFrame on Dashboard

The fastest way to import the data is to not import it but instead embed an existing site as an iframe. To set this up, we need to add a new IFrame Content Block.


Once that is added, we need to configure the iframe. (This requires content_admin to manage Content Blocks.) All we need to do is give it a name, enter the URL (https://ncov2019.live/data), and set the height and width. IFrames with external content do not auto-resize, so pick dimensions that will work for most of your users.
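
Conceptually, the content block boils down to a plain iframe. This is just an illustrative sketch of the markup, not the exact HTML the block generates; the name, URL, and size come from the Content Block form:

```html
<!-- illustrative only: width and height are the fixed values you set on the block -->
<iframe src="https://ncov2019.live/data" width="1200" height="800"
        frameborder="0" title="COVID-19 data"></iframe>
```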


Now we have embedded a dashboard website with the information.


Note: If you are embedding content from some systems, the other system may block its content from showing up in frames. If you get a security error, you may need to investigate the security settings on the other system to whitelist your instance.
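
For reference, a site that blocks framing usually does so with HTTP response headers; seeing either of these on the response in your browser’s developer tools explains a blank or blocked iframe:

```
X-Frame-Options: SAMEORIGIN
Content-Security-Policy: frame-ancestors 'self'
```

The values shown are just the common defaults; a site that allows specific partners may instead list the permitted origins in frame-ancestors.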

Remote Table to Import on the Fly

Remote Tables, a feature introduced in the New York release, allow us to import data on the fly. The data is imported into a temporary table in memory. This table can be used as a standard table through most of the platform (including for reports and interactive filters) but is populated when you query it (there are some caching options if desired).

Note: Remote Tables leveraging external data requires an entitlement to IntegrationHub. Be sure to understand if you are entitled to use this technique. Check with your account executive if you have any questions.

For COVID-19 data, we can access publicly available information. Johns Hopkins University posts data daily at https://github.com/CSSEGISandData.

Here is an example of the data file we will work with:

Province/State,Country/Region,Last Update,Confirmed,Deaths,Recovered,Latitude,Longitude 
Hubei,China,2020-03-12T09:53:06,67781,3056,50318,30.9756,112.2707
,Italy,2020-03-11T21:33:02,12462,827,1045,43.0000,12.0000
,Iran,2020-03-12T11:13:27,10075,429,2959,32.0000,53.0000
,"Korea, South",2020-03-12T05:13:02,7869,66,333,36.0000,128.0000
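
Note the quoted field in the last row: "Korea, South" contains a comma, so a naive split(',') would break the row apart. The Remote Table script below handles this with a regex that only splits on commas outside double quotes; here it is in isolation so you can see what it does:

```javascript
// Split a CSV row on commas, but only those outside double quotes.
// The lookahead accepts a comma only if the quotes remaining after it pair up evenly.
var line = ',"Korea, South",2020-03-12T05:13:02,7869,66,333,36.0000,128.0000';
var fields = line.split(/,(?=(?:(?:[^"]*"){2})*[^"]*$)/);

console.log(fields.length); // 8
console.log(fields[1]);     // "Korea, South" (the surrounding quotes are kept)
```

This simple approach keeps the surrounding quotes in the field value and would not handle escaped quotes inside fields, but it is good enough for this data file.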

Create Table

To create a new Remote Table, we start with the Tables module (under Remote Table) in the System Definition application.


Click New to create a new table. For this table, the only required field is “sys_id”; the other normal “sys” fields are not created by default and are not needed. We are going to add a field (with the correct type) for each field in the data file.

We are also going to add a reference to Location [cmn_location] to enable some more robust reporting.


Fields

  • Country - string
  • Province - string
  • Last Update - date/time
  • Confirmed - integer
  • Deaths - integer
  • Recovered - integer
  • Latitude - decimal
  • Longitude - decimal
  • Location - reference to cmn_location

Create Definition

Now that we have the table, we need to populate it. To populate the Remote Table, we use a Remote Table Definition.

Click on the Definitions module under Remote Tables and put in a script to parse the data and create the rows. Checking Advanced enables us to set the caching settings, which isn’t essential here but may be useful for other use cases.


Here is the script I used with some inline comments on what it is doing.

(function executeQuery(v_table, v_query)
{
    var MAX_LOOK_BACK_DAYS = 7; // how many days we will look back for data
    var REMOTE_DATA = null;

    /**
     * Format the date string in the required format
     * @param {GlideDateTime} theDate 
     * @returns {string} formatted date string
     */
    var getFileDate = function (theDate)
    {
        return ("00" + theDate.getMonthLocalTime()).substr(-2, 2) + '-' + ("00" + theDate.getDayOfMonthLocalTime()).substr(-2, 2) + '-' + theDate.getYearLocalTime();
    };

    /**
     * Pull data from public site and populate the REMOTE_DATA variable
     * @param {*} v_table 
     * @param {*} v_query 
     * @returns {void}
     */
    var getCovidDataFromGit = function (v_table, v_query)
    {
        // Use RESTMessageV2 to call the external REST service
        try {
            var fileFound = false;
            var theDate = new GlideDateTime();
            var loopCount = 0;

            // the data is posted at a consistent time; look for today's file and, if it is not there yet, fall back to yesterday, then the day before, up to MAX_LOOK_BACK_DAYS
            while(!fileFound)
            {
                var url = "https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_daily_reports/" + getFileDate(theDate) + ".csv";
                var restMessage = new sn_ws.RESTMessageV2();
                restMessage.setHttpMethod("get");
                restMessage.setEndpoint(url);
                var response = restMessage.execute(); //Might throw exception if http connection timed out or some issue with sending request itself because of encryption/decryption of password.

                // if the file isn't found, look at the day before
                if (response.getStatusCode() == "404")
                {
                    theDate.addDaysLocalTime(-1); // look for data from yesterday
                } else {
                    gs.addInfoMessage('Data retrieved from: ' + url); // add a message to the user that we pulled the data.  This is optional.
                    fileFound = true; // we found the file so stop looking back in time.
                }

                // bail out once we have looked back more than MAX_LOOK_BACK_DAYS
                if(loopCount++ > MAX_LOOK_BACK_DAYS)
                {
                    fileFound = true;
                }
            }

            // if REST call ends up in an error, set the last error message which shows up
            // at the bottom of the list view
            if (response.haveError()) {
                v_query.setLastErrorMessage(response.getErrorMessage());
                // can use gs.error() or gs.addErrorMessage() while debugging
                // gs.debug() messages visible in session debugger
                // gs.debug(response.getErrorMessage());
                return;
            }
			
            REMOTE_DATA = response.getBody();
            
        } catch (ex) {
            v_query.setLastErrorMessage(ex.message);
            // gs.debug(ex.message);
            return;
        }
    };
    
    /**
     * lookupLocation - Get the sys_id for the location based on the country and province
     * @param {string} country 
     * @param {string} province 
     * @returns {sys_id} Location sys_id
     */
    var lookupLocation = function (country, province)
    {
            var loc = new GlideRecord('cmn_location');
            loc.setLimit(1);
            loc.addQuery('country', country);
            // this doesn't make a lot of sense, but matches my data
            if(province)
            {
                //loc.addQuery('state', province);
                loc.addQuery('city', province); // we have some odd data to handle
            } else {
                //loc.addNullQuery('state');
                loc.addQuery('city', country); // we have some odd data to handle
            }
            loc.addNullQuery('street');
            loc.query();
            if(loc.next())
            {
                return loc.getValue('sys_id');
            }
            return null;
    };

    /**
     * Fix the location by mapping the string we get to the string in cmn_location
     * @param {string} loc
     * @returns {string} Location name in cmn_location 
     */
    var fixLocation = function (loc)
    {
        var locationMap = {"US": "USA"};
        if(locationMap && locationMap.hasOwnProperty(loc))
        {
            loc = locationMap[loc];
        }
        return loc;
    };

    /**
     * Generate a unique name for each row (standin for a sys_id)
     * @param {string} latitude 
     * @param {string} longitude 
     * @returns {string} A unique name for the row
     */
    var uniqueName = function (latitude, longitude)
    {
        return latitude + '_' + longitude;
    };

    /**
     * Process the data that was previously pulled into REMOTE_DATA
     * @returns {void}
     */
    var processData = function ()
    {
        // would like to use sn_tfrm.TransformerRuleList() but data is a CSV not JSON so we'll brute force it
        var rows = REMOTE_DATA.split("\n");
        for(var r in rows)
        {
            var rowObj = {};
            var row = rows[r].split(/,(?=(?:(?:[^"]*"){2})*[^"]*$)/); // handle comma inside of double quotes
            rowObj.latitude = row[6];
            rowObj.longitude = row[7];
            rowObj.sys_id = uniqueName(rowObj.latitude, rowObj.longitude);
            rowObj.province = row[0];
            rowObj.country = row[1];
            rowObj.last_update = row[2]; // the data seems to change over time so we need to handle that
            rowObj.confirmed = row[3];
            rowObj.deaths = row[4];
            rowObj.recovered = row[5];
            rowObj.location = lookupLocation(fixLocation(row[1]), fixLocation(row[0]));
            //gs.addInfoMessage(JSON.stringify(rowObj));
            if(r > 0)
            {
                // this is the most important line, where we actually add the row to the remote table
                v_table.addRow(rowObj);
            }
        }
    };
    // do the work
    getCovidDataFromGit(v_table, v_query);
    processData();
})(v_table, v_query);
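
One easy-to-miss detail is the zero-padding trick in getFileDate: prefixing with "00" and keeping the last two characters. Here is the same idea in plain JavaScript, using the standard Date object in place of GlideDateTime so you can try it outside the instance (note that Date months are 0-based):

```javascript
// Build the MM-DD-YYYY file name the daily report files use.
// ("00" + n).substr(-2, 2) keeps the last two characters, zero-padding n.
function getFileDate(theDate) {
    var mm = ("00" + (theDate.getMonth() + 1)).substr(-2, 2); // Date months are 0-based
    var dd = ("00" + theDate.getDate()).substr(-2, 2);
    return mm + '-' + dd + '-' + theDate.getFullYear();
}

console.log(getFileDate(new Date(2020, 2, 12))); // "03-12-2020"
console.log(getFileDate(new Date(2020, 0, 5)));  // "01-05-2020"
```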

 

Use the Table

Now that we have a table, we can build reports and interactive filters to display the data.


Because we are using a Remote Table, there are some limitations, since the data is not materialized in the database: specifically, we cannot apply Performance Analytics to this data or use it in Related List Conditions. However, the data is easy to load, very current, and accessible in most of the platform.

Import Data

The third option is to use a standard Data Source, Transform Map, and Scheduled Import to load the data into an ordinary table.


Create a Data Source

We need a standard data source that pulls the data from the same location we used in the Remote Table example.


Create a Transform Map

For this data, we’ll use two transform maps. One loads the locations into cmn_location (this can be skipped if you already have them and just need to look them up), and the second (order does matter) loads the COVID-19 case data.


Nothing special is required for the field mappings, but for this data, I coalesced on last updated, country, and region (being sure to allow blanks for these). I also used a script to look up the location from cmn_location.

Schedule the Import

We want this data every day, so we need to schedule the load. Because there is a new file each day, I used a pre-import script that updates the file_path and connection_url on the data source to the current date each time the job runs.


Here is an example of what your pre-import script might look like:

var importTable = 'x_snc_covid_imp_covid_data';

var getDataSource = function (sourceTable) {
    var ds = new GlideRecord('sys_data_source');
    ds.addQuery('import_set_table_name', '=', sourceTable);
    ds.query();
    if (ds.next()) {
        return ds;
    }
    return null;
};

var replaceDateInString = function (d, str) {
    return str.replace(/\d{2}\-\d{2}\-\d{4}/gi, d);
};

var ds = getDataSource(importTable);
if (ds) { // guard against a missing data source
    var theDate = new GlideDateTime();
    var dateString = ("00" + theDate.getMonthLocalTime()).substr(-2, 2) + '-' + ("00" + theDate.getDayOfMonthLocalTime()).substr(-2, 2) + '-' + theDate.getYearLocalTime();

    ds.setValue('connection_url', replaceDateInString(dateString, ds.connection_url.toString()));
    ds.setValue('file_path', replaceDateInString(dateString, ds.file_path.toString()));
    ds.update();
}

I didn’t make this conditional, but it may make sense to run this job every hour and add a condition that checks whether a new file is available.
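
If you want to sanity check the date replacement off-platform, the same regex can be exercised with plain strings; it swaps whatever MM-DD-YYYY date is embedded in the URL for the one you pass in:

```javascript
// Replace any MM-DD-YYYY date embedded in a string (same regex as the pre-import script).
function replaceDateInString(d, str) {
    return str.replace(/\d{2}-\d{2}-\d{4}/g, d);
}

var url = "https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_daily_reports/03-11-2020.csv";
console.log(replaceDateInString("03-12-2020", url));
// ...csse_covid_19_daily_reports/03-12-2020.csv
```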

Use the Data

When we do a physical import of the data, we have no restrictions on what to do with it. Performance Analytics and Related List Conditions work, so we have the full power of the platform at our fingertips.


Choose what makes sense

You have many options for exposing this data to your users: show it with an IFrame, leverage reports with a Remote Table, or unleash the full power of the platform with an import.

Where else can you use these techniques to create better user experiences, empower your users, and supercharge your workflows?
