
Michael Ritchie
ServiceNow Employee

ServiceNow offers a wide variety of APIs for integrating with other systems: web services (SOAP and REST), JDBC, LDAP, PowerShell, shell script, scheduled file import, and bi-directional email. Unfortunately, not all systems and tools offer the same variety of choices, and loading data via spreadsheets or files can feel like the only option. Files can certainly be imported manually through Import Sets, or through another solution I documented called "Consumerize" Data Imports to Non-ServiceNow Administrators, but what if this could be automated? ServiceNow can connect to FTP sites or pull files via a MID Server, but what if that still doesn't work for the system or vendor you are trying to integrate with? Then I would say the lowest common denominator for integration is email.

 

We all know parsing email text can be tricky and problematic; however, if you can get an email template set up, it can be a useful integration method. The ability to process an inbound email and import its data at that time is often overlooked. I often see and hear about spreadsheets being emailed around and then saved so the data can be imported, but again, what if that could happen automatically?

 

Loading data from an email attachment

There have been a few solutions for this documented over the years, including UPDATED - Use an email with an attachment to create a Data Source, Load Data, and Run Transform. Those solutions were documented many years ago and are now obsolete. This requirement to load data from an email attachment came up again the other day, so I thought I would post a working solution for the Geneva and later releases.

 

Set up prerequisites to load data from an email attachment

  1. You must establish an import file/template that will always be used.
    • The columns in the spreadsheet must remain the same, since adding, changing, or removing columns will require changes in ServiceNow.
  2. Your email needs to contain something unique to look for, so the system knows you want to process the email attachment.
    • In other words, you don't want to try to import every email attachment that is sent to your ServiceNow instance. Options include keywords in the subject or body of the email, or even emails from a specific email address. Again, you need something unique about these emails for the inbound email action to look for.
  3. You will need to set up the import set table and transform maps.
    • This can be done by manually importing the template as an administrator. Verify that the import generated records in your target table and that everything looks good. This post isn't going to cover those steps, but once you can manually import the file, you can automate that process.
    • You will need to write down or copy/paste a few things once this is set up, for use in a script provided in this post.
      • Name of your import set table - You can get this by refreshing your browser so the application navigator is updated.
        • Navigate to System Import Sets > Import Set Tables, where you should see a table that matches what you called your import when loading the file.
        • Click the module that matches your table name. When the list shows, click any of the three-line icons beside the column headers, then Configure, and choose Table.
        • When the table record shows up, copy the name value and paste it into a temporary text file.
      • Sys_id of your transform map(s). This is the transform map that processes data in your import set table and drops it into your target table.
        • Navigate to System Import Sets > Administration > Transform Maps, where you should see a record that matches what you typed when manually importing your file.
        • Right-click on that row and choose Copy sys_id.
        • Depending on your browser, it may just copy that value to the clipboard, and you will need to paste it into a text file to see the value. Paste it into the temporary text file you used in the prior step.
        • If multiple transform maps need to be leveraged, repeat the steps above to capture the additional sys_ids.
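The two values captured above end up as plain string variables in the inbound email action script later in this post. As a quick sanity check before pasting them in, a small stand-alone JavaScript helper (hypothetical, not part of the platform) can confirm that each transform map sys_id is a valid 32-character hexadecimal string:

```javascript
// Hypothetical helper: sanity-check the values captured above before
// pasting them into the inbound email action script.
// A ServiceNow sys_id is a 32-character hexadecimal string.
function isValidSysId(id) {
  return /^[0-9a-f]{32}$/i.test(id);
}

// Multiple transform maps are passed as a comma-separated list (no spaces).
function validateTransformMapIDs(list) {
  return list.split(",").every(isValidSysId);
}

var transformMapIDs = "a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6"; // example value only
console.log(validateTransformMapIDs(transformMapIDs)); // true
console.log(validateTransformMapIDs("not-a-sys-id"));  // false
```

This is just a convenience for catching a truncated copy/paste; the platform itself does not validate these strings for you before the import runs.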

 

Automate the processing of the inbound email with the attachment

Now that you have your email requirements established and your file set up for import, we can automate the processing of the inbound email with the attachment. This involves creating an inbound email action. To better understand how this works, look over the documentation on inbound email actions, inbound email action variables, creating inbound email actions, inbound email action examples, and inbound email action ordering.

 

Set up your inbound email action

  1. Navigate to System Policy > Email > Inbound Actions
  2. Click New.
  3. Set the following values:
    • Name: Give it a name that makes sense to you.
    • Set the Target table to Data Source (sys_data_source). This is because we expect these emails to contain an Excel or CSV file, and we need to generate a data source with that attachment that can then be imported.
    • Set Active to true.
    • Set Stop processing to true, since we don't want any other inbound email actions to process this email or file.
  4. In the When to run section/tab:
    • Consider changing the Order to a very low or negative number so that other inbound actions don't process these emails first.
    • If you expect these emails to come from a specific address, you can set the From user.
    • Set the condition based on Pre-req 2 above; for example, Subject contains "file import". Again, this needs to be something unique that will always appear in these inbound emails.
  5. In the Actions section/tab:
    • Paste in the following script:
      (function runAction(/*GlideRecord*/ current, /*GlideRecord*/ event, /*EmailWrapper*/ email, /*ScopedEmailLogger*/ logger, /*EmailClassifier*/ classifier) {
            var importSetTableName = "IMPORT SET TABLE NAME";
            var transformMapIDs = "SYS-ID(s) OF YOUR TRANSFORM MAP TO UTILIZE";   //Use a comma to specify multiple transform maps
            var applicationScope = "Global";
      
            // Create the datasource record
            current.name = "File import from: " + email.from;   //Feel free to rename this as appropriate
            current.import_set_table_name = importSetTableName;
            current.file_retrieval_method = "Attachment";
            current.type = "File";
            current.format = "Excel"; // For Excel files
            //current.format = "CSV"; // For CSV files
            current.header_row = 1;
            current.sheet_number = 1;
            current.sys_package.setDisplayValue(applicationScope);
            current.sys_scope.setDisplayValue(applicationScope);
            var dataSourceID = current.insert();
           
            /*
             * Schedule Load of Attachment
             *
             * This inbound email action will generate an import data source, however the attachment isn't copied to the data source until
             * after the insert of the record.   Scheduling the import to happen 30 seconds later so that attachment has time to be copied.
             */
            new global.EmailFileImportUtils().scheduleImport(dataSourceID, transformMapIDs);
      })(current, event, email, logger, classifier);
    • Set the importSetTableName and transformMapIDs variables at the top of the script to the values you captured in Pre-req 3 above.
      • You can specify multiple transform maps in transformMapIDs by separating them with commas and no spaces.
    • If your file is in CSV format, comment out the current.format = "Excel" line and uncomment the current.format = "CSV" line.
    • If this inbound action is part of a scoped application, or if you are loading data into a scoped application, change the application scope variable at the top of the script to match the scoped application name.
  6. Click Submit.
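The "When to run" settings above are declarative, so no script is required for them, but the matching logic they express is simple. As a plain-JavaScript illustration only, assuming the unique marker from Pre-req 2 is the phrase "file import" in the subject:

```javascript
// Illustration only: mirrors the inbound action condition configured above.
// In the platform this check is declarative, not scripted.
// An email qualifies only if it carries the agreed-upon unique marker.
function shouldProcessAttachment(email) {
  var subject = (email.subject || "").toLowerCase();
  return subject.indexOf("file import") !== -1;
}

console.log(shouldProcessAttachment({ subject: "Weekly file import" })); // true
console.log(shouldProcessAttachment({ subject: "Lunch menu" }));         // false
```

Whatever marker you choose, make sure the sending system includes it on every message; a single missing keyword means the attachment is silently ignored.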

 

Set up your utility script include

Now we need to create the utility script include that is called by the inbound email action.

  1. Navigate to System UI > Script Includes
  2. Click New.
  3. Set the following values:
    • Name: EmailFileImportUtils
    • Accessible from: All applications scopes - setting this to all scopes in case you want to use this for a scoped application
    • Script: paste in the following:
      var EmailFileImportUtils = Class.create();
      EmailFileImportUtils.prototype = {
      	initialize: function() {
      	},
      	
      	scheduleImport: function(dataSourceID, transformMapIDs) {
      		/*
       		* Create scheduled job to process import
       		*
       		* The inbound email action will generate an import data source, however the attachment isn't copied to the data source until
       		* after the insert of the record.   The code below will create a scheduled job to process the import 30 seconds later
       		* so that attachment has time to be copied to the data source from the email.
       		*/
      		
      		var schRec = new GlideRecord("sys_trigger");
      		schRec.name = "Load Data Source: " + dataSourceID;
      		schRec.trigger_type = 0;   // Run Once
      		schRec.script = "new global.EmailFileImportUtils().loadImportSet('" + dataSourceID + "', '" + transformMapIDs + "')";
      		
      		var nextAction = new GlideDateTime();
      		nextAction.addSeconds(30);   // 30 seconds should be enough time however this can be changed.
      		schRec.next_action = nextAction;
      		schRec.insert();
      	},
      	
      	loadImportSet: function(dataSourceID, transformMapIDs) {
      		// Get Datasource Record
      		var dataSource = new GlideRecord("sys_data_source");
      		dataSource.get(dataSourceID);
      		
      		// If CSV and header isn't on row 1, recreate attachment with empty rows removed, and delete original
      		if (dataSource.getValue("format") == "CSV" && dataSource.getValue("header_row") > 1) {
      			var attachmentRec = new GlideRecord("sys_attachment");
      			attachmentRec.addQuery("table_sys_id", dataSource.getValue("sys_id"));
      			attachmentRec.query();
      			if (attachmentRec.next()) {
      				var oldAttachmentID = attachmentRec.getValue("sys_id");
      				var inputStream = new GlideSysAttachment().getContentStream(oldAttachmentID);
      				var textReader = new GlideTextReader(inputStream);
      				var ln = " ";
      				var newLine = "";
      				
      				var lineCounter = 0;
      				var headerRow = parseInt(dataSource.getValue("header_row"));
      				while((ln = textReader.readLine()) != null) {
      					lineCounter += 1;
      					if (lineCounter < headerRow) {
      						continue;
      					}
      					
      					if (ln.length > 1) {
      						newLine += ln + "\n";
      					}
      				}
      				new GlideSysAttachment().write(dataSource, 'Changed ' + attachmentRec.getValue('file_name'), "text/csv", newLine);
      				//Delete original email attachment so that the new one is processed.  If two remain, API may get confused on which one to process.
      				attachmentRec.deleteRecord();
      			}
      		}		
      		
      		// Process data source file
      		var loader = new GlideImportSetLoader();
      		var importSetRec = loader.getImportSetGr(dataSource);
      		var ranload = loader.loadImportSetTable(importSetRec, dataSource);
      		importSetRec.state = "loaded";
      		importSetRec.update();
      		
      		// Transform import set
      		var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapIDs);
      		transformWorker.setBackground(true);
      		transformWorker.start();
      	},
      	
      	type: 'EmailFileImportUtils'
      	
      };
  4. Click Submit.
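The trickiest part of loadImportSet() is the CSV cleanup: rows above the configured header row are dropped, along with (near-)empty lines, so the import API treats the header as row 1. Here is that loop as a stand-alone JavaScript sketch, with GlideTextReader replaced by a simple split on newlines so it can run anywhere:

```javascript
// Stand-alone sketch of the CSV cleanup loop in loadImportSet().
// Rows before the header row are skipped, and lines of length <= 1
// are dropped (mirroring the ln.length > 1 check in the script include).
function trimCsvToHeader(text, headerRow) {
  var lines = text.split("\n");
  var out = "";
  for (var i = 0; i < lines.length; i++) {
    var lineNumber = i + 1;          // 1-based counter, like the reader loop
    if (lineNumber < headerRow) {
      continue;                      // skip preamble rows above the header
    }
    if (lines[i].length > 1) {
      out += lines[i] + "\n";        // keep only non-empty data lines
    }
  }
  return out;
}

var raw = "Generated by vendor\n\nname,email\nAda,ada@example.com\n";
console.log(trimCsvToHeader(raw, 3)); // "name,email\nAda,ada@example.com\n"
```

In the script include, the cleaned text is written back as a new attachment and the original is deleted, so the import API never sees the preamble rows.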

 

If the data load is part of a scoped application, or if you are loading data into a scoped table and changed the application scope variable in your inbound email action, then you will need to perform the following steps. If not, you can skip ahead to testing.

 

By default, the Data Sources table only allows records to be created from the Global scope. Since your scoped application needs to create a data source via the inbound email action, that setting needs to change.

  1. Navigate to System Import Sets > Administration > Data Sources.
  2. Click the three-line Additional Actions icon beside Data Sources:

    data sources.jpg

  3. Then choose Configure and select Table:

    configure data sources.jpg

  4. Go to the Application Access Section or tab and check the Can Create checkbox.

    can create data source table.jpg

  5. Click Update.

 

Now test by sending an email with your file attached that meets the conditional criteria of your inbound email action. Within a few minutes you should see data populated in your table. Keep in mind that the out-of-the-box scheduled job called Email Reader runs every two minutes to check for new inbound emails. This can be changed to run more frequently, but doing so may cause system performance issues. Once your email is processed, it will take another 30 seconds to process the attachment.

 

If you would like to set up another inbound email action to process a different file, simply repeat the inbound email action setup steps above. The script include does not need to be recreated.

 

Troubleshooting your setup:

  • All inbound emails are stored in the database and can be viewed by navigating to System Mailboxes > Received. Here you can see a copy of the email; if things worked correctly, the Target field at the top should reference a Data Source. At the bottom, the Email Log related list shows which inbound email actions processed the email.
  • If the target of the received email is not a data source and your inbound email action is part of a scoped application, check that you changed the Data Source table's application access as described in the steps above.
  • You can view the data source and spreadsheet sent via email by navigating to System Import Sets > Administration > Data Sources. Add the Updated column to your list and sort it in descending order to see the latest at the top. All data sources created by the emails will be named "File import from: " plus the sender's email address, unless you changed the current.name line of the inbound email action script. Each of the data sources should have the attachment sent via email; if one doesn't, that is the cause of the failure.
  • You can view all imported data and the status of each import by navigating to System Import Sets > Advanced > Import Sets. Add the Updated column to your list and sort it in descending order to see the latest at the top. Each import set should be in a state of Processed if it was successfully processed.
  • You can also check the system logs for any other errors by navigating to System Logs > System Log > All. Sort the list by Created in descending order and look for errors around the time of the inbound email processing.

 
