
7 Posts authored by: Michael Ritchie

ServiceNow Notify is a great feature that provides additional communication vehicles to the platform via SMS and voice; the datasheet can be downloaded here.  Included with this plugin is Notify on Task, which provides the ability to send an SMS message or initiate a conference call from any task within your instance, including incidents, problems, changes, and any table that extends task.  The only catch is that out of the box it works only for individual users; you cannot leverage groups, which is a common requirement.

 

Recently posted on Share is an update set that extends Notify on Task with the ability to send SMS messages and initiate conference calls to groups.

Notify on Task for Groups

 

This update set adds a Groups section at the bottom of the popup as shown above.  If On-Call Scheduling is enabled in your instance, you can also elect to include only the current on-call person for the selected groups rather than all the members of the group.  This option will not appear if On-Call Scheduling is not activated within your instance.  Since group managers may not always be members of the group, there is an option to include the group manager as well.  Duplicates are automatically removed, consistent with the out of the box code, so users won't receive more than one message or call.
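The de-duplication behavior can be sketched in plain JavaScript (a hypothetical helper, not the shipped update set code): the recipient list is built from group members (or just the on-call users when that option is selected), plus any group managers, then de-duplicated by sys_id so nobody is messaged or called twice.

```javascript
// Sketch only: assumes the caller has already resolved group members,
// on-call users, and group managers to arrays of user sys_ids.
function buildRecipientList(groupMembers, onCallUsers, groupManagers, onCallOnly) {
    var candidates = (onCallOnly ? onCallUsers : groupMembers).concat(groupManagers);
    var seen = {};
    return candidates.filter(function (userSysId) {
        if (seen[userSysId]) return false;  // drop duplicates
        seen[userSysId] = true;
        return true;
    });
}
```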

 

Note: Several out of the box components are modified by this update set, so please be aware of this as you upgrade your instance.

 

Enjoy!


Please mark this post or any post helpful or the correct answer so others viewing can benefit.

Ever since Apple created the "Genius Bar" concept many years ago, many companies, including ServiceNow internally, have created walk-up style technical support, especially as BYOD has become popular.  While ServiceNow is a powerful platform for scheduling and capturing the details of a task/request, it has lacked a feature-rich UI that allows users to easily schedule an appointment, similar to OpenTable's view for making a restaurant reservation.  That is, until now!

 

This solution provides a "no-code" way of defining how appointments should be scheduled, along with Service Portal widgets that allow users to schedule new appointments and view their existing appointments.

 

 

The ServiceNow platform offers several features that make this solution possible:

  • Feature Rich Task Table:
    • The task table is the base table for most applications within ServiceNow.  Incident, Problem, Change, Case, etc. all extend Task.  This table includes common attributes for any type of task/case, and extending it speeds up development while bringing along many features such as SLAs, On-Call Scheduling, etc.
    • The task table includes Start and End attributes that various applications leverage for the scheduling of work.  Because of this, the Task Appointment Scheduling solution can be applied to any application within your instance that extends the task table.
  • Service Portal:
    • The Service Portal, introduced in the Helsinki release, brought a whole new dynamic for creating responsive, feature-rich UIs that improve the usability of ServiceNow.
    • Using AngularJS and Bootstrap, reusable widgets can be created and easily leveraged on any page within your Service Portals.
    • The Service Portal widget is reusable in that each instance of the widget is tied to a Task Appointment Definition record.  So you could have two widgets on a page where one creates an Incident appointment while another creates a Change appointment.

 

Sush Chandrashekar (sush_c) and I partnered to come up with a Task-based solution that allows you to:

  • Define a schedule for accepting appointments and a duration of time for each appointment
  • Set which Task table to use to store the appointments.  This can be any table that extends task, including out of the box tables like Incident, Problem, Change, Case, etc, or any custom table.
  • Set how many concurrent appointments can be scheduled before an appointment time is marked unavailable.  Default is 1.
  • Set template values that will be applied when the appointment record is created.
  • Set the style of how you want the available appointments to be presented within the Service Portal:
    • Timecards
    • Chiclet Style Buttons
    • Dropdown
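The concurrency setting above can be sketched as a simple availability check (hypothetical helper names, not the actual update set code): a time slot stays available while the number of appointments already booked for it is below the configured limit.

```javascript
// Sketch only: bookedAppointments is assumed to be an array of
// objects with a "start" property identifying the time slot.
function isSlotAvailable(slotStart, bookedAppointments, maxConcurrent) {
    var booked = bookedAppointments.filter(function (appt) {
        return appt.start === slotStart;
    }).length;
    return booked < maxConcurrent;  // e.g. maxConcurrent defaults to 1
}
```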

 

The Code:

The solution involves several components that are available for download from ServiceNow Share: Task Appointment Scheduling.  Download the update set from Share, load/preview/commit it, and then you can leverage it in your instance.

 

The Setup:

  • After committing the update set from Share, navigate to Task Appointments > Task Appointments and click New.
    • Give it a name
    • Set the schedule for the time frame that you wish to accept appointments and the duration of the appointments.  You may need to create a new schedule if you don't already have one defined that matches the appointment schedule.
    • Optionally set a group you wish to route the appointments to and set the number of concurrent appointments this group can accept.  You can also set the group in the Task Values instead.  The task values will override the value set in the Group field.
    • Set the Task table used to store the appointment records
    • Use the Task Values template field to set any other fields in the task record such as short description, assignment group, category, etc which is applied as an appointment is created.
    • Click Submit.
  • Navigate to Service Portal > Service Portal Configuration and choose Designer to add the widgets to a page.
    • Search for the page that you wish to add this solution to or create a new page if necessary.
    • On the left under Widgets, you should find Appointment Scheduling and Appointments List; drag them onto your page.
    • Once added, click the Pencil to edit the properties of the Appointment Scheduling widget.
      • Give the widget a Title that will appear above the widget.
      • Set the Task Appointment record to use from step 1.
      • Choose the layout of how you want to display the available appointments.  See a screenshot of the options above.
      • Set whether you wish to allow reminders to be sent prior to an appointment.  This utilizes a little known feature covered by ctomasi in TechNow Episode 39.
      • Set the maximum number of days out that you wish to allow appointments.  Default is 30 days.
      • If you would like to collect Short Description, Description, and Location when creating appointments, check the applicable box(es).
      • Click Save.
    • Click the Pencil to edit the properties of the Appointments List widget.
      • Give the widget a name and set the Task Appointment record to use from step 1.
      • If you would like to show Short Description and Location in the list, check the applicable box(es).
      • Click Save.
  • Enjoy!

 

Other Important Notes:

  • The update set adds a new field to the out of the box Task table called Task Appointment (u_task_appointment) which is a reference to the Task Appointment (u_task_appointment) table.
    • This reference is utilized to link appointments back to the defined configuration record.  It is also leveraged when displaying a user's scheduled appointments.
  • The work_start and work_end fields on the Task table are leveraged to store the appointment start and end time.  The out of the box labels for these fields are Actual Start and Actual End, but this could vary by instance and table.
  • A business rule called "Enforce Task Appointment Schedule" is included to
    • Zero out the seconds associated to an appointment.  This is important to track reserved time slots.
    • Ensure that the work_start and work_end values are valid for the schedule within the Task Appointment definition record
      • The work_end of a task can be extended beyond the Appointment Duration defined within the Task Appointment record; however, it must be a whole multiple of that duration.  In other words, if the Task Appointment Duration is set to 30 minutes, a task can have a duration of 1 hour, 1.5 hours, 2 hours, etc., in 30 minute increments.
      • If the work_end date of a task is outside the duration time frame, the business rule will prevent the record from being saved and the following message will appear at the top of the screen to the user:
        • End time must be in X minute intervals.
        • This message can be modified in the TaskAppointmentUtils Script Include, checkTaskDates function.
    • A link to this script is provided in the Task Appointment application navigator.
  • The TaskAppointmentUtils Script Include defines all the APIs leveraged by the Service Portal widget.
    • Each function is documented within the script include.
    • This script can be edited to provide further detail in the widget or change data that gets set as appointments are created, updated, and deleted.
    • A link to this script is provided in the Task Appointment application navigator.
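The increment rule enforced by the "Enforce Task Appointment Schedule" business rule can be sketched in plain JavaScript (a hypothetical helper, not the shipped script): the span from work_start to work_end must be a positive whole multiple of the configured appointment duration.

```javascript
// Sketch only: times are passed as epoch milliseconds with seconds
// already zeroed, as the business rule does before this check.
function isValidAppointmentSpan(startMs, endMs, durationMinutes) {
    var intervalMs = durationMinutes * 60 * 1000;
    var span = endMs - startMs;
    // The end may extend beyond one duration, but only in whole increments;
    // anything else would trigger "End time must be in X minute intervals."
    return span > 0 && span % intervalMs === 0;
}
```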



A common use case in ServiceNow is to have data-driven forms that fill themselves in based on input from other fields.  For example, after selecting a user, fill in location fields, department fields, etc. based upon the selected user's information.  This is typically accomplished in a ServiceNow client script using the out of the box g_form.getReference() function.  While this function is useful, it has its shortcomings, including the inability to "dot-walk" into the referenced record's references.  For example, if your form needs to display the user's location information like street, city, state, and zip, the getReference() function cannot "dot-walk" into the user's location record to gather these attributes because "location" is a reference on the user record.  The typical answer to this requirement is GlideAjax, which involves a client-side call to a server-side script that gathers the details and passes them back to the client.  GlideAjax is quite useful and very powerful; however, it can be difficult to set up since it requires somewhat complicated step-by-step instructions to make it work.

 

On top of the detailed step-by-step instructions, GlideAjax setup differs between the desktop UI and the Service Portal UI.  If you are not careful, you can end up with GlideAjax "sprawl," where you have many script include records for point solutions.  There have been several ideas on the ServiceNow Community discussing how to solve this problem, such as goranlundqvist's "Lets make GlideAjax a little more dynamic" and tim.harris's "Standardizing GlideAjax Calls to Prevent Script Include Sprawl".  Those posts have been very useful in getting people going with GlideAjax.

 

"getReferenceAdvanced" is an easier solution: a simple function call from a client script.  In this function call, you can specify one or more attributes that you want from the referenced record, including "dot-walk" attributes.  A global UI script enables this functionality on any form or Service Portal in your instance.

 

Note: The mobile app and mobile UI do not support UI Scripts, so this solution will not work for the app store mobile app or the mobile browser ($m.do) at this time.

 

Another challenge with GlideAjax is that queries happen server side as a system process, and if you are not careful you may expose more data than the logged-in user is supposed to have access to.  You will find many comments on the Community about this.  This solution avoids that issue by using GlideRecordSecure instead of GlideRecord, thus enforcing ACLs.

 

The Use Case:

I searched the Community for common use cases where the getReference function wasn't working and GlideAjax was proposed.  One customer had a form with a location reference field and the location's street, city, state, and zip needed to be filled into separate fields on the form.  Another user needed to fill in separate fields for email, department name, and department cost center's code after filling in the caller.  These are both excellent examples of where the coding to accomplish this use case can be complicated.

 

The getReferenceAdvanced solution is much simpler.  In testing, I set up a catalog item form that prompts for a "Requested For" user, which is pretty typical across any type of task/case intake form.  Once the user is selected, details from the user's profile are filled in below the Requested For field, including location and department information.  Below is a screenshot of this form indicating the source of data when the getReferenceAdvanced function runs:

 

The Code:

This solution involves several components that are available for download from ServiceNow Share: getReferenceAdvanced Client Script Function, a GlideAjax Alternative.  Download the update set from Share, load/preview/commit it, and then you can leverage it in your client scripts.

 

  • UI Script called getReferenceAdvanced: The getReferenceAdvanced* functions are part of a global UI script, which makes them available to a client script on any form in any scope in your ServiceNow instance.  This script validates the data passed to it and then makes a REST web service call back to the instance to gather data.
    • For security purposes the user's session token is passed to the REST web service, which prevents unauthorized access to instance data as well as enforcing ACLs for the logged-in user.
  • Scripted REST API called getReferenceAdvanced: The getReferenceAdvanced UI Script passes data to this Scripted REST API that then gathers the data requested and passes it back to the client script.  A Scripted REST API was used instead of GlideAjax because calling a web service behaves the same in the Desktop UI and the Service Portal UI which simplifies the code.  This web service requires authentication for security purposes and the user's session token is used for the authentication.

 

The Setup:

The desktop UI and the Service Portal UI behave a little differently so two different function calls are provided for the two different UIs:

  • Desktop:
    • getReferenceAdvancedDesktop("REFERENCE-FIELD-NAME", "SEMICOLON-SEPARATED-LIST-OF-FIELDS-YOU-WANT");
  • Service Portal:
    • getReferenceAdvancedPortal(g_form, "REFERENCE-FIELD-NAME", "SEMICOLON-SEPARATED-LIST-OF-FIELDS-YOU-WANT");
    • Notice the getReferenceAdvancedPortal function has an additional parameter, g_form, passed first.  This is a required input: an object containing all the details about the form being displayed.  The g_form object is not available to the UI Script from the Service Portal the way it is in the desktop UI, so it must be passed in.

 

The getReferenceAdvancedDesktop and getReferenceAdvancedPortal functions return a JSON object (name/value pairs) with the fields requested and their values from the referenced record.  Because the data is returned in an object, all the "dots" that were passed into the function are changed to underscores so that you can address each data element individually.  For example, the value of "location.street" is stored in the "location_street" attribute of the returned object.  This data can then be combined with the g_form.setValue() function to set values on the form.
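The dot-to-underscore mapping can be sketched as follows (a hypothetical illustration, not the shipped UI Script): each requested attribute becomes a property of the returned object with its dots replaced by underscores.

```javascript
// Sketch only: fieldList is the semicolon-separated string passed to
// getReferenceAdvanced*, and values holds the corresponding results.
function buildResultObject(fieldList, values) {
    var result = {};
    fieldList.split(';').forEach(function (field, i) {
        // "location.street" becomes the "location_street" property
        result[field.replace(/\./g, '_')] = values[i];
    });
    return result;
}
```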

 

OK, so this sounds complicated. How is it simpler, and how do I use it?  Considering the use case mentioned above and the test form screenshot, the following code is used in an onChange client script to fill in the form.

 

Desktop Script:

function onChange(control, oldValue, newValue, isLoading) {
    if (isLoading || newValue == '') {
        return;
    }
    
    var reqFor = getReferenceAdvancedDesktop("requested_for", "location.street;location.city;location.state;location.zip;department;department.dept_head;department.cost_center.code");
    g_form.setValue("street", reqFor.location_street);
    g_form.setValue("city", reqFor.location_city);
    g_form.setValue("state", reqFor.location_state);
    g_form.setValue("department", reqFor.department);
    g_form.setValue("department_manager", reqFor.department_dept_head);
    g_form.setValue("department_cc_code", reqFor.department_cost_center_code);    
}

 

Service Portal Script:

function onChange(control, oldValue, newValue, isLoading) {
    if (isLoading || newValue == '') {
        return;
    }
    
    var reqFor = getReferenceAdvancedPortal(g_form, "requested_for", "location.street;location.city;location.state;location.zip;department;department.dept_head;department.cost_center.code");
    g_form.setValue("street", reqFor.location_street);
    g_form.setValue("city", reqFor.location_city);
    g_form.setValue("state", reqFor.location_state);
    g_form.setValue("department", reqFor.department);
    g_form.setValue("department_manager", reqFor.department_dept_head);
    g_form.setValue("department_cc_code", reqFor.department_cost_center_code);
    
}

 

As you can see in the example script, instead of making multiple getReference calls, only one call is made with a semicolon-separated list of the attributes needed from the Requested For user's record.  This has a performance benefit because fewer database calls are made, as well as a huge user experience improvement because the user waits less time for the form data to be filled in.

 

Notice the getReferenceAdvanced* call result is assigned to a variable (reqFor in the examples) that is then used for the setValue() function calls.  Again, each of the attributes requested using "dot-walking" keeps the same name with the dots replaced by underscores.  If only one attribute is needed from the referenced record, either of the following scripts will work.  In this example the user's department's manager (dept_head) is needed:

g_form.setValue("department_manager", getReferenceAdvancedDesktop("requested_for", "department.dept_head").department_dept_head);

--OR--

var reqFor = getReferenceAdvancedDesktop("requested_for", "department.dept_head");
g_form.setValue("department_manager", reqFor.department_dept_head);

 

To get the getReferenceAdvanced UI Script working in your Service Portal(s), you will need to configure your Service Portal theme.  As specified by James.Neale in this Community article:

  1. Navigate to Service Portal > Themes and select your theme.  If you are using the OOB Service Portal, click Stock.
  2. Scroll down to the "JS Includes" Related List and click New.
  3. Set the name to getReferenceAdvanced or whatever makes sense to you, set Source to UI Script, then select the getReferenceAdvanced UI Script and click Submit.
  4. Repeat steps 1-3 if you have multiple Service Portals that need this capability.

 

Error Handling/Troubleshooting:

To make this as painless as possible to set up, error handling is built into the getReferenceAdvanced UI Script.  These alert popup messages will appear when testing your client script:

  • If you call getReferenceAdvanced* in your client script and forget to specify a reference field, you will receive the following message: The getReferenceAdvanced() function requires a field to be passed.
  • If you call getReferenceAdvanced* in your client script and the field specified is NOT a reference field, you will receive the following message: The getReferenceAdvanced() function only works with reference fields.  The field specified is not a reference field.
  • If you call getReferenceAdvancedPortal in your client script and you don't pass the g_form object, you will receive the following message: The getReferenceAdvanced() function requires the g_form object to be passed from the Service Portal.  Please pass that as the first parameter.
    • Remember that Service Portal client scripts (Type is Mobile/Service Portal) require you to use the getReferenceAdvancedPortal() function, and the first parameter needs to be g_form, as shown in the example scripts above.
  • If you call getReferenceAdvanced* in your client script and you don't pass the fields needed from the reference record, you will receive the following message: The getReferenceAdvanced() function requires a string of field(s) you wish to return.
  • If you call getReferenceAdvanced* in your client script and the field passed does not have a value, you will receive the following message: The getReferenceAdvanced() function requires the field 'FIELD-NAME' to have a value.
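The validation order above can be sketched as a plain JavaScript helper (hypothetical, not the shipped UI Script) that returns the error message for the first failed check, or null when the call is well formed.

```javascript
// Sketch only: the caller supplies the field name, its current value,
// whether it is a reference field, and the requested field list string.
function validateGetReferenceCall(fieldName, fieldValue, isReferenceField, fieldList) {
    if (!fieldName)
        return "The getReferenceAdvanced() function requires a field to be passed.";
    if (!isReferenceField)
        return "The getReferenceAdvanced() function only works with reference fields.";
    if (!fieldList)
        return "The getReferenceAdvanced() function requires a string of field(s) you wish to return.";
    if (!fieldValue)
        return "The getReferenceAdvanced() function requires the field '" + fieldName + "' to have a value.";
    return null;  // all checks passed
}
```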



ServiceNow offers a wide variety of APIs to integrate with other systems: Web Services (SOAP and REST), JDBC, LDAP, PowerShell, Shell Script, scheduled file import, and bi-directional email. Unfortunately, not all systems and tools offer this same variety of choices, and loading data via spreadsheet or file feels like the only option. The files can certainly be imported manually through Import Sets or through another solution I documented called "Consumerize" Data Imports to Non-ServiceNow Administrators, but what if this could be automated?  ServiceNow can connect to FTP sites or pull files via MID Server, but what if that still doesn't work for the system or vendor you are trying to integrate with? Then I would say the lowest common denominator for integration is email.

 

We all know parsing email text can be tricky and problematic; however, if you can get an email template set up, it can be a useful integration method. The ability to process an inbound email and import its data at the same time is often overlooked. I often see and hear about spreadsheets being emailed around and then saved so the data can be imported, but again, what if that could happen automatically?

 

Loading data from an email attachment in Geneva, Helsinki, Istanbul & Jakarta

There have been a few solutions for this documented over the years, including UPDATED - Use an email with an attachment to create a Data Source, Load Data, and Run Transform. These solutions were documented many years ago and are now obsolete. This requirement to load data from an email attachment came up the other day, so I thought I would post a working solution for the Geneva, Helsinki, Istanbul, and Jakarta releases.

 

Set up prerequisites to load data from an email attachment

  1. You must establish an import file/template that will always be used.
    • The columns in the spreadsheet must remain the same, since adding, changing, or removing columns will require changes in ServiceNow.
  2. Your email needs to contain something unique to look for in order to know you want to process the email attachment. 
    • In other words, you don't want to try to import every email attachment that is sent to your ServiceNow instance.  Options are keywords in the subject or body of the email, or even emails from a specific email address.  Again, you need something unique about these emails for the inbound email action to look for.
  3. You will need to set up the import set table and transform maps. 
    • This can be done by manually importing the template as an administrator.  Verify the import generated records in your target table and that everything looks good.  This post isn't going to cover those steps, but once you can manually import the file, you can automate the process.
    • You will need to write down or copy/paste a few things once this is set up for use in a script provided in this post.
      • Name of your import set table - you can get this by refreshing your browser so the application navigator is updated.
        • Navigate to System Import Sets > Import Set Tables and you should see a table that matches what you called your import when loading the file.
        • Click the module that matches your table name and, when the list shows, click any of the three-line icons beside the column headers, then Configure, and choose Table.
        • When the table record shows up, copy the Name value and paste it into a temporary text file.
      • SysID of your transform map(s).  This is the transform map that processes data in your import set table and drops it into your target table.
        • Navigate to System Import Sets > Administration > Transform Maps and there you should see a record that matches what you typed in when manually importing your file.
        • Right-click on that row and choose Copy sys_id
        • Depending on your browser it may just copy that value into memory and you will need to paste it into a text file to see the value.  Paste it into the temporary text file you used in the prior step.
        • If multiple transform maps need to be leveraged, repeat the steps above to capture the additional SysIDs of the transform maps.

 

Automate the processing of the inbound email with the attachment

Now that you have your email requirements established and your file set up for import, we can now automate the processing of the inbound email with the attachment.  This will involve creating an inbound email action.  To better understand how this works, look over the documentation on inbound email actions, inbound email action variables, creating inbound email actions, inbound email action examples, and inbound email action ordering.

 

Set up your inbound email action

  1. Navigate to System Policy > Email > Inbound Actions
  2. Click New.
  3. Set the following values:
    • Name: Give it a name that makes sense to you.
    • Set the Target table to Data Source (sys_data_source).  This is because we expect these emails to contain an Excel or CSV file and we need to generate a data source with that attachment that can then be imported.
    • Set Active to true
    • Set Stop processing to true since we don't want any other inbound email actions to process this email or file.
  4. In the When to run section/tab:
    • You may consider changing the order to a very low or negative number so that other inbound actions don't process these emails.
    • If you are expecting these emails to come from a specific email, you can select the From user.
    • Set the condition based on prerequisite 2 above.  An example is Subject contains "file import".  Again, this needs to be something unique that will always appear in these inbound emails.
  5. In the Actions section/tab:
    • Paste in the following script.
      (function runAction(/*GlideRecord*/ current, /*GlideRecord*/ event, /*EmailWrapper*/ email, /*ScopedEmailLogger*/ logger, /*EmailClassifier*/ classifier) {
          
          var importSetTableName = "IMPORT SET TABLE NAME";
          var transformMapIDs = "SYS-ID(s) OF YOUR TRANSFORM MAP TO UTILIZE";  //Use a comma to specify multiple transform maps
          var applicationScope = "Global";
          
          // Create the datasource record
          current.name = "File import from: " + email.from;  //Feel free to rename this as appropriate
          current.import_set_table_name = importSetTableName;
          current.file_retrieval_method = "Attachment";
          current.type = "File";
          current.format = "Excel"; // For Excel Files
          //current.format = "CSV"; // For CSV Files
          current.header_row = 1;
          current.sheet_number = 1;
          current.sys_package.setDisplayValue(applicationScope);
          current.sys_scope.setDisplayValue(applicationScope);
          var dataSourceID = current.insert();
          
          /*
           * Schedule Load of Attachment
           *
           * This inbound email action will generate an import data source, however the attachment isn't copied to the data source until
           * after the insert of the record.  Scheduling the import to happen 30 seconds later so that attachment has time to be copied.
           */
          new global.EmailFileImportUtils().scheduleImport(dataSourceID, transformMapIDs);
          
      })(current, event, email, logger, classifier);
      
    • Set the values of the variables declared in lines 3 and 4 of the script to what you captured in pre-req 3 above.
      • You can specify multiple Transform Maps by separating them by a comma with no spaces on line 4.
    • If your file is in CSV format, comment line 12 and uncomment line 13.
    • If this inbound action is part of a scoped application, or if you are loading data into a scoped application, change the variable on line 5 to match the scoped application name.
  6. Click Submit.

 

Set up your utility script include

Now we need to create the utility script include that is called by the inbound email action.

  1. Navigate to System UI > Script Includes
  2. Click New.
  3. Set the following values:
    • Name: EmailFileImportUtils
    • Accessible from: All application scopes - setting this to all scopes in case you want to use this for a scoped application
    • Script: paste in the following:
      var EmailFileImportUtils = Class.create();
      EmailFileImportUtils.prototype = {
          initialize: function() {
          },
          
          scheduleImport: function(dataSourceID, transformMapIDs) {
              /*
               * Create scheduled job to process import
               *
               * The inbound email action will generate an import data source, however the attachment isn't copied to the data source until
               * after the insert of the record.  The code below will create a scheduled job to process the import 30 seconds later
               * so that attachment has time to be copied to the data source from the email.
               */
              
              var schRec = new GlideRecord("sys_trigger");
              schRec.name = "Load Data Source: " + dataSourceID;
              schRec.trigger_type = 0;  // Run Once
              schRec.script = "new global.EmailFileImportUtils().loadImportSet('" + dataSourceID + "', '" + transformMapIDs + "')";
              
              var nextAction = new GlideDateTime();
              nextAction.addSeconds(30);  // 30 seconds should be enough time however this can be changed.
              schRec.next_action = nextAction;
              schRec.insert();
          },
          
          loadImportSet: function(dataSourceID, transformMapIDs) {
              // Get Datasource Record
              var dataSource = new GlideRecord("sys_data_source");
              dataSource.get(dataSourceID);
              
              // Process data source file
              var loader = new GlideImportSetLoader();
              var importSetRec = loader.getImportSetGr(dataSource);
              var ranload = loader.loadImportSetTable(importSetRec, dataSource);
              importSetRec.state = "loaded";
              importSetRec.update();
              
              // Transform import set
              var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapIDs);
              transformWorker.setBackground(true);
              transformWorker.start();
          },
          
          type: 'EmailFileImportUtils'
      };
      
  4. Click Submit.

 

If the data load is part of a scoped application, or if you are loading data into a scoped table and changed line 5 in your inbound email action, then you will need to perform the following steps.  If not, you can skip ahead to testing.

 

By default, the Data Sources table only allows records to be created from the Global scope, and since your scoped application needs to create a data source via the inbound email action, we need to change that.

  1. Navigate to System Import Sets > Administration > Data Sources.
  2. Click the Additional Actions (three-line) icon beside Data Sources.

  3. Choose Configure and select Table.

  4. Go to the Application Access section or tab and check the Can Create checkbox.

  5. Click Update.

 

Now test by sending an email that meets the conditional criteria of your inbound email action, with your file attached.  Within a few minutes you should see data populated in your table.  Keep in mind that the out of the box scheduled job called Email Reader runs every two minutes to check for new inbound emails.  This can be changed to run more frequently, but doing so may cause system performance issues.  Once your email is processed, it will take another 30 seconds to process the attachment.

 

If you would like to set up another inbound email action to process a different file, simply repeat steps 1-5 above.  The script include does not need to be recreated.

 

Troubleshooting your setup:

  • All inbound emails are stored in the database and can be viewed by navigating to System Mailboxes > Received.  Here you can see a copy of the email; if things worked correctly, the Target field at the top should be a Data Source.  At the bottom, the Email Log list shows which inbound email actions processed the email.
  • If the target of the received email is not a data source and your inbound email action is part of a scoped application, check to make sure you changed the Data Source table application access as described above.
  • You can view the data source and the spreadsheet sent via email by navigating to System Import Sets > Administration > Data Sources.  You can add the Updated column to your list and sort in descending order to see the latest at the top.  All data sources created by the emails will be named "File import from" followed by the sender's email address, unless you changed line 8 of the inbound email action script.  Each data source should have the attachment sent via email; if one is missing, that is the problem and the cause of the failure.
  • You can view all imported data and the status of each import by navigating to System Import Sets > Advanced > Import Sets.  You can add the Updated column to your list and sort in descending order to see the latest at the top.  Each import set should be in a state of Processed if it completed successfully.
  • You can also view the system logs for any other errors by navigating to System Logs > System Log > All.  Make sure you sort the list by Created in descending order and look for any errors during the time of the inbound email processing.


Please mark this post or any post helpful or the correct answer so others viewing can benefit.

During my tenure at ServiceNow, I have always stressed the importance of "data-driven" code.  What I mean is making workflows, business rules, etc. dependent on tables and records in ServiceNow that can be maintained outside of your internal enhancement release process.  In other words, I shouldn't have to promote code to change something as simple as an approver in a workflow.  I find that ServiceNow administrators are often bogged down maintaining data instead of enhancing the process to be more efficient and save time.  Examples:

  • Use the task's configuration item whenever possible to store important process attributes for that particular item.  In a workflow, "dot-walk" to the task's CI for attributes like Approval Group, Support Group, Owned By, and Location, and leverage those attributes instead of hard-coding values in a workflow or script.
  • Create your own custom tables to store data in support of your process.  Does the incident category really need to be a choice field that only admins can add choices to?  No!  You can easily create a custom category table and change the category field to a reference instead.  Then create ACLs that allow users to maintain this data for you.
  • Don't be afraid to add attributes to out of the box tables like locations and departments.  I have seen cases where locations have a specific support group for that campus, building, or floor.  Instead of writing code to determine the group based on the task's location, simply add a Support Group attribute to the location record that can be maintained outside of code, and use that in your workflows and scripts.
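To make the dot-walk idea concrete, here is a minimal sketch of the fallback logic in plain JavaScript.  In an actual business rule or workflow script you would dot-walk through GlideRecord references instead (for example, current.cmdb_ci.support_group); the object shapes and field names below are illustrative assumptions, not real platform APIs:

```javascript
// Illustrative stand-in for dot-walking in a business rule or workflow.
// In ServiceNow this would be current.cmdb_ci.support_group or
// current.location.u_support_group; these field names are assumptions.
function resolveSupportGroup(task) {
  // Prefer the CI's own support group, then the location's, then a default.
  if (task.cmdb_ci && task.cmdb_ci.support_group) {
    return task.cmdb_ci.support_group;
  }
  if (task.location && task.location.u_support_group) {
    return task.location.u_support_group;
  }
  return 'Service Desk'; // hypothetical fallback group
}
```

The point is that the group lives on the CI or location record, so changing an approver or support group becomes a data update rather than a code promotion.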

 

Coding this way takes more time up front to do the right look-ups, but it will save you a ton of time in the long run and make your ServiceNow administrators happy.  Plus, you will have the ability to "delegate" the maintenance of this data to people outside the ServiceNow administration group if you so choose.  I cringe every time I hear of administrators being asked to manually change data in ServiceNow "just because" they are the only users with access to update it.  Mistakes can and will happen!  So instead, modify the ACLs, grant access, etc. so the users who own that data can maintain it themselves.

 

Easy Import:

I am sure this all sounds good, but I commonly get a follow-up question: how can non-administrators maintain data in these custom tables, especially if there are a lot of rows to maintain?  The answer is usually to import a spreadsheet.  Unfortunately, data imports in ServiceNow are an admin function, and import sets can be very confusing to set up.  The Fuji release introduced "Easy Import", which automatically creates an import and update template for you:

http://wiki.servicenow.com/index.php?title=Easy_Import#gsc.tab=0

 

 

Unfortunately, the Easy Import feature is only available to administrators out of the box, but this can be changed.  Navigate to System UI \ UI Context Menus, search for Name = Import, and open the record.

 

You will see this is a global action that is available to anyone with the 'admin' role (see the condition).  If you want to make this feature available on a specific table, you can easily clone this record and scope it to a specific table and a specific role.  Simply change the Table from Global to your specific table, and then change the Condition to something more appropriate:

Condition Example: gs.hasRole('YOUR-CUSTOM-ROLE') && !ListProperties.isRelatedList() && !ListProperties.isRefList()

 

  • The gs.hasRole('YOUR-CUSTOM-ROLE') part of the condition checks whether the logged-in user has the role between the quotes.
  • The !ListProperties.isRelatedList() part of the condition prevents this action from showing up on related lists.
  • The !ListProperties.isRefList() part of the condition prevents this action from showing up on reference list popups.

 

You may also want to change the Name from Import to something else, because administrators will see duplicate actions when they are logged in, and this way they will know which is which.  Then click Additional Actions > Insert.  Once this is done, non-administrators will have access to Easy Import for your specific table.

 

Even Easier Import:

Now truth be told, while Easy Import is an awesome feature, it can still be somewhat confusing, especially to non-technical people.  By default it also allows inserting and updating of every field on the table.  What if you wanted to simply provide a locked-down Excel import template with a fixed list of columns and allow users to import data into ServiceNow?  Again, out of the box, importing spreadsheets is an admin function, but fortunately there is another way... Service Catalog Record Producers.  Record Producers are a powerful platform feature with many uses: they are accessible from the Service Catalog, which all users can reach; you can utilize User Criteria to restrict or enable access to them; they can put data into any table in ServiceNow; and they can call a script.

 

To make this write-up easier, I am choosing to walk you through importing data into an out of the box table.  But the concept of creating an import template that is loaded by a record producer can be applied to any table in ServiceNow, as the process and code are very similar.  Let's first start with a use case to set the context for what I will be walking you through...

 

During conversations about incident and change management, customers often ask, "How can I associate an incident or change with hundreds or thousands of CIs?"  The Affected CI related list is the best out of the box solution for listing all of those CIs.  The Geneva release introduced a new UI for adding affected CIs to a change record, and this can certainly be extended to other tables like incident and problem, but sometimes importing a spreadsheet of CIs can be easier, especially if this is a change that you perform on a recurring basis.

 

The steps below will walk you through the necessary pieces to make this work: an Import Set table, a transform map, and a record producer.  Once complete, users will be able to access this feature from the service catalog to download the import template and be prompted for the task to associate with the list of CIs.  The final solution will look like this:

 

  • First we need to create the import set staging table and transform map.  I won't go into every detail about import sets since they are well documented.
    • Create the import template that you would like your users to utilize.  Name your columns in words that the end users will understand.
      • In my example use case, I created an Excel spreadsheet with one column for the Configuration Item, though again you can add any number of columns to the spreadsheet.  Since I don't want users to have to enter the change number hundreds or thousands of times on the spreadsheet, I will prompt for the Task in a record producer variable.
      • Populate the spreadsheet with test data and save the spreadsheet somewhere on your computer.
    • Navigate to System Import Sets \ Load Data.  Choose Create table, name your import set table and then choose your import template.
    • Click Submit.  ServiceNow will automatically create an Import Set staging table for you and import the data from the spreadsheet.
    • Once complete, click Loaded data.  Since we are prompting for the Task in the record producer, we need a place to store the task ID, so we need to add a new field to the import set table.
    • While viewing the import set table, Affected CI Imports in my use case, click on one of the "hamburger" icons beside one of the column headers, then choose Configure, and finally Table.
    • Write down the Name of your import set table since you will need it later in the setup.
    • Click New in the Columns section to create a new field.
    • Enter the following information:
      • Type: String
      • Column label: Task ID
      • Column name: u_task_id
        • Write down the name of your new column since you will need it later in the setup.
      • Max length: 40
    • Click Submit to create the new field.
    • Click Update on the Affected CI Import table record so you are taken back to the Affected CI Imports list of imported records.
    • Click Transform Maps under Related Links on the Affected CI Imports list so we can create a new transform map for this new table.
    • Since we don't have a transform map yet, the list will be empty; click New to create a new Transform Map.
    • Name your Transform Map and set the Target Table.  In my example use case the target table is CIs Affected (task_ci).  All other fields can remain default.
    • Click Mapping Assist under Related Links.
      • If your spreadsheet column names match the field labels, you can click Auto Map Matching Fields instead which will automate the creation of field maps.
      • Don't click the Submit button, because that will require extra steps to create the field maps.
    • Map your source fields to the target table fields.  In my example use case there are two field maps: Configuration Item to Configuration Item and Task ID to Task.
    • Click Save.
    • Since the Configuration Item field is a reference, you can make further adjustments, like setting whether to create a record in the reference table if the CI in the spreadsheet isn't found in the CMDB.  We don't want that to happen, so let's edit the field map.  More details can be found here: Creating New Transform Maps - ServiceNow Wiki
      • In the Field Maps related list at the bottom, click "u_configuration_item" to edit this record.
      • Set Choice Action to Reject, since in our example use case we don't want to process the row if the CI entered in the spreadsheet is not valid.
        • In other use cases you may want to set it to Ignore if you have additional columns in your spreadsheet and you want to process the row but simply ignore the invalid value in that one column.
        • In other cases you may want to create a record in the target table, in which case choose Create.
        • You may also find the Referenced value field name attribute useful.  In my example use case I am expecting the CI's name to match a record in the CMDB, but what if you prefer to enter the CI's serial number or asset tag instead?  You can enter that field's column name (database column name, not label) here, and the lookup will be performed against that field instead of the default, name.
      • Click Update.
    • Click the "hamburger" Additional Actions button and choose Copy sys_id and paste this into a text file because we will need it later in the setup.
    • We are now done with the Import Set Components.
  • Second we need to create a Service Catalog Record Producer that users can access from the catalog, which will provide a link to download the import template as well as prompt for the task to link the list of CIs to.  The approach is that the record producer will create an Import Set Data Source record with the Excel import file attached to it.  The record producer script will then automatically process and transform the Excel file.
    • Navigate to Service Catalog \ Catalog Definitions \ Record Producers and click New.
    • Set the Name and Short description to something that will make sense to your users, in my example I am setting both to "Affected CI Import".
    • Set the Table name to Data Source (sys_data_source).
    • For easy access and administration we will attach the import template directly to this record producer.  Either drag and drop your Excel import template into your browser or click the paperclip to browse for it.
    • Right-click on your attachment and choose Copy link address in Chrome or Copy link location in Firefox, etc.
    • Now that we have the URL for the import template, we can add a clickable link in the Description text.
    • Set the Description to provide instructions for your users.  In my example description, step 1 instructs the user to download the template by "clicking here"; we will turn that text into a clickable link.
    • After entering the description text, highlight the text you want to make the clickable link to download the template and then click the Insert/edit link button.
    • Paste in the URL into the URL field and then click OK.
    • Click the Accessibility tab and choose the Catalog(s) that you want this Record Producer to be in along with the category within that catalog.
    • Click the "hamburger" Additional Actions button and choose Save so we can add the Task reference variable.
    • Scroll to the bottom of the form to the Variables related list and click New.
    • Set the following fields:
      • Type (Top of form): Reference
      • Mandatory (Top of form): true
      • Question (Question Section): Task Number
      • Name (Question Section): task_number
      • Reference (Type Specifications Section): Task (task)
        • You could specify a specific type of task like change_request
        • You could also specify a Reference qualifier condition such as active is true
    • Click Submit.
    • Now we need to set the script to run when the record producer is submitted.  Go back to the What it will contain tab, scroll to the Script field, and paste in the following script.  The script has embedded comments explaining what everything does.
// Set the following variables with the name of your import set table and task id column
var importSetTableName = "u_affected_ci_";
var importSetTaskIDFieldName = "u_task_id";
var transformMapID = "63f9ee304f8a2e00d1676bd18110c74c";

// Setup data source for attachment
current.name = "Affected CI Import for:  " + producer.task_number.getDisplayValue();
current.import_set_table_name = importSetTableName;
current.file_retrieval_method = "Attachment";
current.type = "File";
current.format = "Excel";
current.header_row = 1;
current.sheet_number = 1;
current.insert();

// Process excel file
var loader = new GlideImportSetLoader();
var importSetRec = loader.getImportSetGr(current);
var ranload = loader.loadImportSetTable(importSetRec, current);
importSetRec.state = "loaded";
importSetRec.update();

// Update processed rows with task sys_id
var importSetRow = new GlideRecord(importSetTableName);
importSetRow.addQuery("sys_import_set", importSetRec.sys_id);
importSetRow.query();
while (importSetRow.next()) {
    importSetRow[importSetTaskIDFieldName] = producer.task_number;
    importSetRow.update();
}

// Transform import set
var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapID);
transformWorker.setBackground(true);
transformWorker.start();

// Take user to task
gs.addErrorMessage("Data import may take time to load; please reload the record to see all the Affected CIs.");
var redirectURL = "task.do?sys_id=" + producer.task_number;
producer.redirect = redirectURL;

// Since we inserted data source already, abort additional insert by record producer
current.setAbortAction(true);

 

    • Set lines 2-4 of the script using the information you copied down in the earlier steps.  If you followed along and named everything exactly as these instructions provide, the importSetTableName and importSetTaskIDFieldName variables should be similar, but you will need to paste in the sys_id of the transform map you created.
    • Click Update.
    • A further idea is to create a Catalog Client Script that ensures there is an attachment on the record producer before it can be submitted.  Check the community for solutions on how to do this.
  • You have now completed creating the record producer.
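Stripped of the Glide-specific APIs, the heart of the record producer script above is the loop that stamps every staged row with the chosen task before the transform runs.  Here is a simplified sketch in plain JavaScript; the real script iterates a GlideRecord over the import set table, and the column name is the one created earlier:

```javascript
// Simplified stand-in for the while-loop in the record producer script:
// copy the task sys_id into the u_task_id column of every staged row.
function stampRowsWithTask(stagedRows, taskSysId) {
  return stagedRows.map(function (row) {
    var stamped = Object.assign({}, row);
    stamped.u_task_id = taskSysId; // mirrors importSetRow[importSetTaskIDFieldName]
    return stamped;
  });
}
```

Because the task is stamped onto every row before the transform starts, the Task ID field map can resolve each row back to the task the user selected in the record producer.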

 

Now it's time to test!  Cross your fingers that you followed along closely and that this will work on the first try.

  • Navigate to the Service Catalog, go to the category where you added your record producer, and click it.
    • Or feel free to open the record producer again and click Try it.
  • Be sure to test that the template download link works.
  • Choose a task you want to test with, and attach a completed import template with a list of Configuration Items.
  • Click Submit.
  • It will take a few seconds to start processing the data load, but the record producer script will take you to the task you chose so you can view the list of Affected CIs that were imported.  As noted in the message at the top of the screen, it may take several seconds to process the entire data load, so reloading the record may be required to validate.

 

Hopefully you found this useful.  Again, I chose to use an out of the box table as an example, but these steps can be applied to any table in ServiceNow.  The record producer script is generic enough to plug in your own tables and additional steps.  Enjoy!



Organizations often struggle to notify people in crisis situations, which can be system outages, natural disasters, or any other event that requires people to be notified.  A time-consuming manual process is usually involved in sending notifications and ensuring the right people receive them.  Sometimes laws, regulations, and/or audits require that the recipients confirm receipt of the alert or confirm whether or not they need help.

 

Crisis Alert is a custom scoped application built on the ServiceNow platform, leveraging ServiceNow Notify to solve this issue.  It was built as a utility-type application that accepts as input Groups, Users, and/or a filter condition of users that need to be notified.  Crisis Alerts can be created by other applications within the ServiceNow platform to drive mass communications for those records.  Notifications can be sent via email, SMS, and/or text-to-voice, and the solution looks at each targeted recipient's notification devices in ServiceNow and reaches out to those users via those devices.  If input is required from the user, their confirmation that they received the crisis alert, or that action is required, is logged as well for auditing purposes.

 

Licensing Requirements:

  • Platform Runtime for the creators of Crisis Alerts.
  • Notify licenses for the recipients of the alerts.
    • A Twilio account is also required, but is set up separately.

 

The solution consists of two update sets found on ServiceNow Share: https://share.servicenow.com/app.do#/detailV2/66c49c9c1386a600f609d6076144b036/overview

  1. Crisis Alert Scoped Application vX.xml - This includes all the scoped application files.  Upload, preview, and commit this update set first.
  2. Crisis Alert Global Scope Code vX.xml - This file can be found in the Supporting Files section on the Details tab.  This includes global scoped files that are required for the application.  Upload, preview, and commit this update set second.

 

Setup:

  • Ensure that ServiceNow Notify is configured and working on the instance.
    • It works best if you purchase a dedicated Twilio phone number for Crisis Alerts.  This number can be purchased through your Twilio account; once purchased, it will be automatically downloaded into your instance when you click Twilio configuration under the Notify\Administration application.
  • Copy down the E.164 formatted Twilio phone number that you would like to utilize for the Crisis Alert application.  Example: +18005551212
  • Upload, preview, and commit the two update sets from ServiceNow Share.
  • Change your application scope to Crisis Alert by clicking the Settings gear in the upper right corner of your desktop browser, and then choosing Developer.
  • Navigate to Crisis Alert\Properties, and enter the E.164 Twilio phone number into the x_snc_crisis_alert.crisis_alert.phone_number system property on this page.
  • Set the Notify group for the Twilio number to the Crisis Alert Group.
    • Navigate to Notify\Numbers and select the Twilio number that will be utilized by the Crisis Alert application.
    • Type 'Crisis Alert Group' in the Notify group field.
    • Click Update to save the record.
  • Setup notification devices for the ServiceNow users.
    • These can be added via Notification preferences on the user profile, or a script can be leveraged to create them for all users.
    • It may be useful to add the Notification Devices related list to the User form.
    • When adding SMS type notification devices, the Service Provider field is required out of the box, but this field is not utilized by ServiceNow Notify so any selection can be made.
    • Make sure the SMS and Voice phone numbers are entered in E.164 format.
  • Navigate to Crisis Alert\Create New to test the application setup.
  • An annotation has been added at the top of the Crisis Alert form with further instructions.  This annotation can easily be removed once you are familiar with the application.
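Because both the Crisis Alert phone number property and the notification devices expect E.164 formatting, a quick sanity check can catch badly formatted numbers before they cause silent delivery failures.  This sketch is a simplification, checking only for a leading "+" and 8 to 15 digits, not full numbering-plan validity:

```javascript
// Minimal E.164 sanity check (illustrative, not a full validator):
// a "+" followed by a nonzero digit and 7-14 more digits (15 digits max).
function looksLikeE164(number) {
  return /^\+[1-9]\d{7,14}$/.test(number);
}
```

For example, looksLikeE164('+18005551212') passes, while '18005551212' and numbers containing spaces or dashes do not.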

 

 

 

The data center is on fire!



ServiceNow homepages/dashboards are a great feature leveraged by all users of the platform.  A common request comes from team managers who want to view various types of metrics for individual team members.  Unfortunately, there is no out of the box mechanism for making interactive homepages where you can choose a value and the homepage dynamically updates its reports based on the value chosen.  That is, until today.

 

We have a new option on Share that makes the process of creating interactive homepages much easier than writing Jelly script, JavaScript, and many manual steps.  Interactive Homepages provides a simple way to define what data you want in your homepage control gauge and then automates the creation of the necessary components so you can quickly build an interactive homepage.

Interactive Homepages on Share

 

Also included in this update set is an example My Team Dashboard that dynamically presents a list of team members from all of the currently logged-in user's groups and subgroups, along with three gauges that dynamically update when the team member selection changes.

Team Homepage.png


