
I found a question on the community where someone was looking for a solution to close a ticket after 3 business days. As we all know, the "autoclose" for incidents doesn't count business days, it just counts calendar days. That pretty much leaves people setting a value high enough to cover most short holidays so the customers won't be angry. This solution can be used on all kinds of tickets.

 

EDIT: I also put this as an "idea" on the community, so go in and vote for it and see if we can get it into a future release: Be able to use Business hours/days to autoclose ticket

 

I will use the following to get it to work:

 

  • Relative Durations. I use this to set the number of business days I want it to wait until it closes. Using relative duration
  • DurationCalculator. To calculate the duration against a schedule. Using DurationCalculator to calculate a due date
  • Scheduled Job. Then I'll put the code in a scheduled job that runs as often as you want, to check and close the tickets that are "due".

 

 

The first step is to create a relative duration that fits the number of days you want. You can easily find it under System Scheduler->Schedules->Relative Durations.

As you can see there are a few examples that you can either use or look at to understand how they are built up.

But let's create a new one that we want to be "3 business days". It doesn't need to be more than this:

Copy the sys_id of this record, since you will need it later when coding the scheduled job.
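If you prefer scripting over copying from the record, here is a small background-script sketch (just an illustration; '3 business days' is the name used in this example) that prints the sys_id of the relative duration:

var rd = new GlideRecord('cmn_relative_duration');
rd.addQuery('name', '3 business days');
rd.query();
if (rd.next()) {
    gs.print('Relative duration sys_id: ' + rd.getUniqueValue());
}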

 

The next step is just to get the sys_id of the schedule you want to use. Head to the schedule record and copy that sys_id as well.

 

Now we have all the nice things we need. So let's head to System Definition->Scheduled Jobs and press New. Choose "Automatically run a script of your choosing" and we get to the new record.

Here you can set how often you want it to run; once a day would probably be nice, at a time when we don't think many people are working.

Remember to put all your code in an anonymous function so its variables etc. don't get mixed up with other things that are running. If you don't do this, strange things can happen if, for example, you have two jobs running at the same time and both have a variable called gr.

 

Summary of the code below:

  • Get the sys_id of the relative duration I want to use
  • Query all the records I want to go through. Here that means records that are active and have a resolve date.
  • Loop through the records and check if it has been more than 3 business days since each one was resolved. If it has, close it.

 

(function() {
    //Get the relative duration for 3 business days
    var relDur = 'f3ae5fc70f0132004cf365ba32050eb9';
    //Get the incidents that we want to go through
    var encQue = 'active=true^resolved_atISNOTEMPTY';

    var gr = new GlideRecord('incident');
    gr.addEncodedQuery(encQue);
    gr.query();

    while (gr.next()) {
        //Calculate and see if the resolve date is more than 3 business days ago. If it is, close the ticket.
        var dc = new DurationCalculator();

        //Load the schedule into our calculation through the function below
        addSchedule(dc);

        //Do the calculation and see if the end date is before today
        dc.setStartDateTime(gr.resolved_at);
        if (!dc.calcRelativeDuration(relDur)) {
            gs.error("*** calcRelativeDuration failed for record {0}", gr.number);
        }
        if (dc.getEndDateTime() < gs.nowDateTime()) {
            gr.setValue('state', 7);
            gr.update();
        }
    }

    function addSchedule(durationCalculator) {
        // Load the "8-5 weekdays" schedule into our duration calculator.
        var scheduleName = "8-5 weekdays";
        var grSched = new GlideRecord('cmn_schedule');
        grSched.addQuery('name', scheduleName);
        grSched.query();
        if (!grSched.next()) {
            gs.error("*** Could not find schedule {0}.", scheduleName);
            return;
        }
        return durationCalculator.setSchedule(grSched.getUniqueValue(), "GMT");
    }

})();
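Before scheduling the job, you can sanity-check the duration calculation in a background script. This is only a sketch; the two sys_ids are placeholders for your own relative duration and schedule records:

var dc = new DurationCalculator();
dc.setSchedule('<schedule sys_id>', 'GMT');
dc.setStartDateTime(new GlideDateTime());
if (dc.calcRelativeDuration('<relative duration sys_id>')) {
    gs.print('A ticket resolved right now would close after: ' + dc.getEndDateTime());
}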

 

That's it. I hope it gives you some ideas on how to use the duration calculator for more fun stuff.

 

 

//Göran

 



ServiceNow Witch Doctor and MVP
-----------------------------------
For all my blog posts: http://bit.ly/2fCzj1g

Available now in the ServiceNow app store!

With the NewRocket Contextual Search Results Widget your company can achieve incident deflection in the Service Portal. As self-service users enter details about their issue, search results appear within the form, allowing the user to view relevant knowledge articles, catalog items, or social Q&A threads instead of submitting an incident. The result: fewer incidents submitted and more satisfied users.

Key Features

– Integrates with existing record producers

– View search results while typing

– View individual result items without navigating to a new page

– Simple to configure to show relevant knowledge articles, pinned articles, social Q&A and/or catalog items

 

contextual-results3.png

 

Thank you, Nathan!

 

Nathan Firth,
Founder of NewRocket, Inc. and ServiceNow architect, web developer, and entrepreneur with over 20 years experience in web development. Former senior engineer and team lead of Service Portal at ServiceNow, originally from Sweden, but currently living in San Diego, CA.

 

Read Some Recent Articles:

Communicating between the Client Script and the Server Script of a widget

Create custom action buttons in Service Portal
Using Events to Communicate Between Widgets

 


The method below was originally published on Oct 25, 2012, and is an outdated process

Please see the following post for help! >  Loading data from an email attachment


 

A number of people responded to this post with the same problem, and together with my colleague I rewrote the instructions to make them work.

 

I had a customer that wanted the IBM Maximo product to send an email to ServiceNow with a CSV/XLS file attached, and have it automatically create a new Data Source, load the data, then transform it, all just from receiving the email. Here is what I did to get that to work:

 


1. Create a Data Source Manually first so you can set up the Import Set Table and the Transform Map that you will use later. This is necessary since you need a Transform Map defined before you send the email inbound to ServiceNow. This Data Source can be deleted after creating the Import Set Table and Transform Map.
2. Get a sample CSV/XLS file and attach it to the Data Source and load the data at least once.
3. Define your Transform Map and set it up so the data is getting transformed into the right table in the format that you need.
4. Create a user with no roles and a password; you will use this account for basic authentication.
5. Create a REST Message named "Inbound Import Set Poster" that points to your own instance, with https://instancename/sys_import.do as the endpoint URL.
6. Check the box for "Use basic auth" and set the username and password from step #4.
7. Edit the "get" function and add the following function parameters: tableLabel, tableName, and customDataSource (the same parameters set by setStringParameter in the inbound email action script below).

8. Create an Inbound Email Action with the following settings:

HERE IS THE CODE TO COPY:


if (current.getTableName() == "sys_data_source") {

    current.name = email.subject.substring(3);
    current.import_set_table_label = email.body.import_set_table_label;
    current.import_set_table_name = email.body.import_set_table_name;
    current.file_retrieval_method = email.body.file_retrieval_method;
    current.format = email.body.format;

    current.update();

    var r = new RESTMessage('Inbound Import Set Poster', 'get');
    r.setStringParameter('tableLabel', encodeURIComponent(current.name));
    r.setStringParameter('customDataSource', current.sys_id);
    r.setStringParameter('tableName', current.import_set_table_name);
    r.execute();
}


9. The Condition can be anything you want but the Target table needs to be sys_data_source.
10. Enable the inbound email settings in your instance to receive email and process them. See WIKI http://wiki.servicenow.com/index.php?title=Configuring_Email for this.
11. Now send an email like this to the email instancename@unix.service-now.com (The "unix" part of this avoids any email formatting issues with email clients):

  1. To: instancename@unix.service-now.com
  2. Subject: DS:Test Data Source 1
  3. Body:


        import_set_table_label:Test Import Set Table
        import_set_table_name:u_test_import_set_table
        file_retrieval_method:Attachment
        format:CSV

 

NOTE: Use Excel instead of CSV if you are using an XLS file

 

12. Wait until the email is processed; it should have created your Data Source, named with whatever follows the "DS:" in the subject
13. Make sure the import_set_table_name is set to whatever you named your import set table when you created your Data Source in step 1 (this is very important since the Transform Map is linked to this table)
14. Make sure the format is set to either Excel or CSV.
15. Attach the file you want to load to the email so it ends up on the Data Source.



The method above was originally published on Oct 25, 2012, and is an outdated process

Please see the following post for help! >  Loading data from an email attachment


ServiceNow offers a wide variety of APIs to integrate with other systems: Web Services (SOAP and REST), JDBC, LDAP, PowerShell, Shell Script, scheduled file import, and bi-directional email. Unfortunately, not all systems and tools offer this same variety of choices, and loading data via spreadsheets or files feels like the only option. The files can certainly be imported manually through Import Sets or through another solution I documented called "Consumerize" Data Imports to Non-ServiceNow Administrators, but what if this could be automated? ServiceNow can connect to FTP sites or pull files via MID Server, but what if that still doesn't work for the system or vendor you are trying to integrate with? Then I would say the lowest common denominator for integration is email.

 

We all know parsing email text can be very tricky and problematic; however, if you can get an email template set up, it can be a useful integration method. The ability to process an inbound email and import its data right away is often overlooked. I often see and hear about spreadsheets being emailed around and then saved so the data can be imported, but again, what if that could happen automatically?

 

Loading data from an email attachment in Geneva, Helsinki, Istanbul & Jakarta

There have been a few solutions for this documented over the years, including UPDATED - Use an email with an attachment to create a Data Source, Load Data, and Run Transform. These solutions were documented many years ago and are now obsolete. This requirement to load data from an email attachment came up the other day, so I thought I would post a working solution for the Geneva, Helsinki and Istanbul releases.

 

Set up prerequisites to load data from an email attachment

  1. You must establish an import file/template that will always be used.
    • The columns in the spreadsheet must remain the same since it will require changes in ServiceNow to add, change, or remove columns.
  2. Your email needs to contain something unique to look for in order to know you want to process the email attachment. 
    • In other words, you don't want to try to import every email attachment that is sent to your ServiceNow instance.  Options are keywords in the subject or body of the email, or even emails from a specific email address.  Again, you need something that will be unique about these emails for the inbound email action to look for.
  3. You will need to set up the import set table and transform maps. 
    • This can be done by manually importing the template as an administrator.  Verify the import generated records in your target table and everything looks good.  This blog isn't going to cover those steps, but once you can manually import the file, then you can automate that process.
    • You will need to write down or copy/paste a few things once this is set up for use in a script provided in this post.
      • Name of your import set table - You can get this by refreshing your browser so the application navigator is updated
        • Navigate to System Import Sets > Import Set Tables and there you should see a table that matches what you called your import when loading in the file
        • Click the module that matches your table name and when the list shows, click any of the 3-line icons beside the column headers, then Configure, and choose table.
        • When the table record shows up, copy down the name value or better yet you can copy the name value and paste it into a temporary text file.
      • SysID of your transform map(s).  This is the transform map that processes data in your import set table and drops it into your target table (a script to look these up follows this list).
        • Navigate to System Import Sets > Administration > Transform Maps and there you should see a record that matches what you typed in when manually importing your file.
        • Right-click on that row and choose Copy sys_id
        • Depending on your browser it may just copy that value into memory and you will need to paste it into a text file to see the value.  Paste it into the temporary text file you used in the prior step.
        • If multiple transform maps need to be leveraged, repeat the steps above to capture the additional SysIDs of the transform maps.

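If you prefer a script over copying sys_ids from the list, here is a small background-script sketch that prints the transform maps for an import set table; 'u_my_import_table' is a placeholder for the table name you noted above:

var tm = new GlideRecord('sys_transform_map');
tm.addQuery('source_table', 'u_my_import_table');
tm.query();
while (tm.next()) {
    gs.print(tm.getValue('name') + ': ' + tm.getUniqueValue());
}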
 

Automate the processing of the inbound email with the attachment

Now that you have your email requirements established and your file set up for import, we can now automate the processing of the inbound email with the attachment.  This will involve creating an inbound email action.  To better understand how this works, look over the documentation on inbound email actions, inbound email action variables, creating inbound email actions, inbound email action examples, and inbound email action ordering.

 

Set up your inbound email action

  1. Navigate to System Policy > Email > Inbound Actions
  2. Click New.
  3. Set the following values:
    • Name: Give it a name that makes sense to you.
    • Set the Target table to Data Source (sys_data_source).  This is because we expect these emails to contain an Excel or CSV file and we need to generate a data source with that attachment that can then be imported.
    • Set Active to true
    • Set Stop processing to true since we don't want any other inbound email actions to process this email or file.
  4. In the When to run section/tab:
    • You may consider changing the order to a very low or negative number so that other inbound actions don't process these emails.
    • If you are expecting these emails to come from a specific email, you can select the From user.
    • Set the condition based on Pre-req 2 above.  Examples are subject contains "file import" or something.  Again this needs to be something unique but something that will always appear in these inbound emails.
  5. In the Actions section/tab:
    • Paste in the following script.
      (function runAction(/*GlideRecord*/ current, /*GlideRecord*/ event, /*EmailWrapper*/ email, /*ScopedEmailLogger*/ logger, /*EmailClassifier*/ classifier) {
          
          var importSetTableName = "IMPORT SET TABLE NAME";
          var transformMapIDs = "SYS-ID(s) OF YOUR TRANSFORM MAP TO UTILIZE";  //Use a comma to specify multiple transform maps
          var applicationScope = "Global";
          
          // Create the datasource record
          current.name = "File import from: " + email.from;  //Feel free to rename this as appropriate
          current.import_set_table_name = importSetTableName;
          current.file_retrieval_method = "Attachment";
          current.type = "File";
          current.format = "Excel"; // For Excel Files
          //current.format = "CSV"; // For CSV Files
          current.header_row = 1;
          current.sheet_number = 1;
          current.sys_package.setDisplayValue(applicationScope);
          current.sys_scope.setDisplayValue(applicationScope);
          var dataSourceID = current.insert();
          
          /*
           * Schedule Load of Attachment
           *
           * This inbound email action will generate an import data source, however the attachment isn't copied to the data source until
           * after the insert of the record.  Scheduling the import to happen 30 seconds later so that attachment has time to be copied.
           */
          new global.EmailFileImportUtils().scheduleImport(dataSourceID, transformMapIDs);
          
      })(current, event, email, logger, classifier);
      
    • Set the values of the variables declared in lines 3 and 4 of the script to what you captured in pre-req 3 above.
      • You can specify multiple Transform Maps by separating them by a comma with no spaces on line 4.
    • If your file is in CSV format, comment line 12 and uncomment line 13.
    • If this inbound action is part of a scoped application or if you are loading data in a scoped application change the variable in line 5 to match the scoped application name.
  6. Click Submit.

 

Set up your utility script include

Now we need to create the utility script include that is called by the inbound email action.

  1. Navigate to System UI > Script Includes
  2. Click New.
  3. Set the following values:
    • Name: EmailFileImportUtils
    • Accessible from: All applications scopes - setting this to all scopes in case you want to use this for a scoped application
    • Script: paste in the following:
      var EmailFileImportUtils = Class.create();
      EmailFileImportUtils.prototype = {
          initialize: function() {
          },
          
          scheduleImport: function(dataSourceID, transformMapIDs) {
              /*
               * Create scheduled job to process import
               *
               * The inbound email action will generate an import data source, however the attachment isn't copied to the data source until
               * after the insert of the record.  The code below will create a scheduled job to process the import 30 seconds later
               * so that attachment has time to be copied to the data source from the email.
               */
              
              var schRec = new GlideRecord("sys_trigger");
              schRec.name = "Load Data Source: " + dataSourceID;
              schRec.trigger_type = 0;  // Run Once
              schRec.script = "new global.EmailFileImportUtils().loadImportSet('" + dataSourceID + "', '" + transformMapIDs + "')";
              
              var nextAction = new GlideDateTime();
              nextAction.addSeconds(30);  // 30 seconds should be enough time however this can be changed.
              schRec.next_action = nextAction;
              schRec.insert();
          },
          
          loadImportSet: function(dataSourceID, transformMapIDs) {
              // Get Datasource Record
              var dataSource = new GlideRecord("sys_data_source");
              dataSource.get(dataSourceID);
              
              // Process data source file
              var loader = new GlideImportSetLoader();
              var importSetRec = loader.getImportSetGr(dataSource);
              var ranload = loader.loadImportSetTable(importSetRec, dataSource);
              importSetRec.state = "loaded";
              importSetRec.update();
              
              // Transform import set
              var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapIDs);
              transformWorker.setBackground(true);
              transformWorker.start();
          },
          
          type: 'EmailFileImportUtils'
      };
      
  4. Click Submit.

 

If the data load is part of a scoped application, or if you are loading data into a scoped table and changed line 5 in your inbound email action, then you will need to perform the following steps.  If not, you can skip to the next section.

 

By default the Data Sources table only allows records to be created from the Global scope, and since your scoped application needs to create a data source via the inbound email action, we need to change that.

  1. Navigate to System Import Sets > Administration > Data Sources.
  2. Click the Additional Actions icon (three lines) beside Data Sources:

    data sources.jpg

  3. Then choose Configure and select Table:

    configure data sources.jpg

  4. Go to the Application Access Section or tab and check the Can Create checkbox.

                can create data source table.jpg

  5. Click Update.

 

Now test by sending an email that meets the conditional criteria of your inbound email action, with your file attached. Within a few minutes you should see data populated in your table.  Keep in mind that the out-of-the-box scheduled job called Email Reader runs every two minutes to check for new inbound emails.  This can be changed to run faster, but that may cause system performance issues. Once your email is processed it will take another 30 seconds to process the attachment.

 

If you would like to set up another inbound email action to process a different file, simply repeat steps 1-5 above.  The script include does not need to be recreated.

 

Troubleshooting your setup:

  • All inbound emails are stored in the database and can be viewed by navigating to System Mailboxes > Received. Here you can see a copy of the email and the Target field at the top should be a Data Source if things worked correctly.  At the bottom, see the Email Log list that shows which inbound email actions processed the email.
  • If the target of the received email is not a data source and your inbound email action is part of a scoped application, check to make sure you changed the Data Source table application access as described in the previous section.
  • You can view the data source and spreadsheet sent via email by navigating to System Import Sets > Administration > Data Sources (a quick script to list the latest ones follows this list). You can add the Updated column to your list and sort in descending order to see the latest at the top.  All data sources created by the emails will be named "File import from: " plus the sender's email address, unless you changed line 8 of the inbound email action script.  Each of the data sources should have the attachment sent via email; if there isn't one, then that is the problem and the cause of the failure.
  • You can view all data imported and status of the import by navigating to System Import Sets > Advanced > Import Sets.  You can add the Updated column to your list and sort in descending order to see the latest at the top.  Each of the import sets should be in a state of Processed if they were successfully processed.
  • You can also view the system logs for any other errors by navigating to System Logs > System Log > All.  Make sure you sort the list by Created in descending order and look for any errors during the time of the inbound email processing.
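As a quick check, here is a small background-script sketch that lists the most recent data sources created by the inbound email action; the name filter assumes you kept the default "File import from:" prefix used in the script above:

var ds = new GlideRecord('sys_data_source');
ds.addQuery('name', 'STARTSWITH', 'File import from:');
ds.orderByDesc('sys_created_on');
ds.setLimit(5);
ds.query();
while (ds.next()) {
    gs.print(ds.getValue('name') + ' | created ' + ds.getValue('sys_created_on'));
}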


Please mark this post or any post helpful or the correct answer so others viewing can benefit.

Here's the dilemma of the day - you’re developing applications on your DEVelopment instance, and you need to clone your PRODuction instance to your DEV instance to test the changes. But here's the catch - one or more of those applications are still in-development. What should you do?

 

Let's say you throw caution to the wind and start the clone anyway. When the clone is complete and you refresh your DEV instance (clone target) from PRODuction (clone source), you find that the in-development applications are completely missing or are the wrong versions on the target instance, post-clone. Arg! What happened? In this sixth installment of our best practices series, we look at three different ways you can preserve those in-development apps.

 

What happens to in-development applications when you clone over the development instance

After a clone, the target instance will have whatever version of the application the source instance had when cloned. For example, if the source had the latest version, the target will have the latest version post-clone. The application will still be a sys_app application (sys_store_app applications are read-only), meaning it can be developed further. However, if the latest application version is 1.2 and the clone source has version 1.1, the post-clone target will have a developable application that is version 1.1, the same as the source instance. If the clone source has no version of the application installed, the target will, likewise, have no version of the application available post-clone.

 

 

Post-clone development best practices

If you have applications in development on an instance that will be used as a clone target, before cloning to that instance, do one of the following with all of your in-development applications to allow continued post-clone development:

  • Link the application to a source control repository and commit your changes, so you can import it back to the target instance, post-clone.

source_control.JPG

  • Publish the application to an update set, and install it back to the target instance, post-clone.

publish_to_update_set.JPG

  • Install the most recent version of the application to the source instance prior to the clone.

 

Dilemma solved -  you can clone your PROD instance and keep your fledgling apps. Happy developing!

 

--

 

Behind the scenes here at ServiceNow, the Knowledge Management and Multimedia teams work closely with subject matter experts to disseminate critical information to our customers. We’ve found that certain topics come up frequently, in the form of best practices that can help you keep your ServiceNow instances running smoothly. This series aims to target those topics so that you and your organization can benefit from our collective expertise.

 

See Annotate scripts and customizations with comments for the first installment on script comments.

See Limit the Number of Users with the Admin Role for the second installment on user roles.

See Where to avoid linking to a reference field when configuring a list for the third installment on list configuration.

See Developing on your production instance for the fourth installment on development.

See When to create a new table vs. when to extend for the fifth installment on application development.

 

To access all of the blog posts in this series, search for "nowsupport best practices series."

This best practice only applies if you are working on the Istanbul or later release.  When using the Automated Test Framework, it is a good idea to build your tests while you build your form and code.  This best practice really only applies to testing your ServiceNow forms.  Once you feel like you have the mandatory fields and default view identified and your data model pretty solid, you should create a test.  I recommend doing one test per form (if possible).  Some people will create a separate test per field, and this is not recommended in my book.  It takes a long time to start up and run these tests, so I have learned to chain related tests together using the steps within the Automated Test Framework (ATF).

For example: below is the form from a property management application.  This form is specifically used for managing the properties I rent.  You can see there is a mandatory field on Address, but there is also a UI policy that controls the visibility of the Parking under structure field.  If the Type field is switched to Single Family Home, the Parking under structure field goes away.

Now that my form is done and I have the behavior I expect, I can create one automated test for this page that will test the behavior of the entire form (see below).

 

Rental_Property___ServiceNow.jpg

Rental_Property___ServiceNow 2.jpg

 

You can see here that I am creating a test that checks the state of more than one field at a time.  You should not have a separate test for each field; that is overkill.  For example, you should not have a separate test just for checking that the Address field is mandatory.

 

Rental_Property_Form_Test___ServiceNow.jpg

 

Here is the flow of my test.

 

Rental_Property_Form_Test___ServiceNow 2.jpg

 

 

There may be exceptions to this rule, especially if your form is very complex and is controlled with lots of scripting logic, etc., but the idea is to minimize the number of places you have to maintain code, tests, and files.  Simplicity is your friend.

A while back I realized that we were running into what I would call, "script include sprawl." Let me explain...

 

Pretty much every GlideAjax call that was being made had its own corresponding script include to return specific data. At best, a new method was being added to an already existing script include. YUCK!! Think about it: it doesn't take long for things to get really messy.

 

Then I got the ACE report results and realized that there were tons of client scripts doing direct GlideRecord queries instead of GlideAjax calls. Things were about to get even messier unless I provided a standardized way to do all GlideAjax calls through one script include.

 

Here is the solution I came up with:

 

var AjaxGlideRecord = Class.create();
AjaxGlideRecord.prototype = Object.extendsObject(AbstractAjaxProcessor, {

    getRecordField: function() {
        var data = '';
        var table = this.getParameter('sysparm_table_name') + '';
        var enc = this.getParameter('sysparm_encoded_query') + '';
        var field = this.getParameter('sysparm_field') + '';

        var gr = new GlideRecord(table);
        gr.addEncodedQuery(enc);
        gr.query();
        if (gr.next()) {
            data = gr.getValue(field);
        }
        return data;
    },

    getRecordFields: function() {
        var data = {};
        var table = this.getParameter('sysparm_table_name') + '';
        var enc = this.getParameter('sysparm_encoded_query') + '';
        var fields = this.getParameter('sysparm_fields') + '';
        var field_names = fields.split("|");

        var gr = new GlideRecord(table);
        gr.addEncodedQuery(enc);
        gr.query();
        if (gr.next()) {
            field_names.forEach(function(element) {
                data[element] = gr.getValue(element);
            });
        }
        data = new global.JSON().encode(data);
        return data;
    },

    type: 'AjaxGlideRecord'
});

 

Here is how to retrieve a single piece of data from a client script:

 

var ga = new GlideAjax('AjaxGlideRecord');
ga.addParam('sysparm_name','getRecordField');
ga.addParam('sysparm_table_name',"sys_user");
ga.addParam('sysparm_encoded_query',"sys_id=ba1443d38db62400932b553d6ead9dec");
ga.addParam('sysparm_field',"email");
ga.getXML(parseResult);


function parseResult(response) {
   var answer = response.responseXML.documentElement.getAttribute("answer");
   console.log(answer);
}

 

Here is how to retrieve multiple pieces of data from a client script:

 

var ga = new GlideAjax('AjaxGlideRecord');
ga.addParam('sysparm_name','getRecordFields');
ga.addParam('sysparm_table_name',"sys_user");
ga.addParam('sysparm_encoded_query',"sys_id=ba1443d38db62400932b553d6ead9dec");
ga.addParam('sysparm_fields',"email|user_name");
ga.getXML(parseResult);


function parseResult(response) {
   var answer = response.responseXML.documentElement.getAttribute("answer");
   answer = answer.evalJSON(); // Turn JSON string into object
   console.log(answer.email);
   console.log(answer.user_name);
}

 

Problem solved.

Oops!... I did it again. While I was testing out data sources, I was tampering with some import set tables by changing some string fields to reference fields. However, I noticed some extra data got created when loading data into them, before the transformation. I was not expecting this, so I thought I would share my findings. Reference fields are very useful to normalise and organise data, but sometimes reference fields can be too powerful. Bow down before reference fields.

did it again.gif

Changing the field 'name' from type String to Reference

In this example I will show you what happens when you change the field 'name' from type String to a reference field pointing to the sys_user table. I wanted to have the sys_id in the field instead of the full name.  If you need to replace ugly sys_ids with the actual "Display" values, or to normalize the data, or to better relate your data, you may choose to set the field as a reference field. This is a very unusual case.

sysid field.jpg

Normally, we would expect to import most string values as String types.

string types.jpg

 

However, tables are flexible and you can customize some of those to be reference fields.

customize ref field.jpg

 

Misspellings when importing records

The problem is that if we have a simple spelling mistake like "Boris Catino X", the data load could create a new record or set the reference field to NULL.

 

This data is inserted on the "Load all data" stage and there is NO transformation map executed yet.

 

Here are the results of my testing:

  • Import data "beverly campbel", expected "Beverly Campbel": the display value matches (matching is not case sensitive), and the result is the sys_id of the matching record.
  • Import data "Billie Cowley", expected "Billie Cowley": the display value matches, and the result is the sys_id of the matching record.
  • Import data "Boris Catino X", expected "Boris Catino": the display value does not match, so the result is the sys_id of a new record. A new record is created because the display value does not match; in some cases this can return false.

 

You want to avoid new records being created by the reference fields themselves; those records can cause confusion. If the records are not handled carefully, the loaded data could also be set to null when there is no matching display value. This applies whenever data is passed into a reference field.

user ref fields.jpg

 

Using reference fields is very useful if you are importing accurate data, or the sys_id of the records directly. If the imported data is flaky, keep the fields as Strings. You can then use the transform maps to gain control over how the data is processed and when new data is created, as the sketch below shows.
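As an illustration of that approach, here is a minimal transform map field script sketch for a reference target field (for example a user reference). The 'u_name' staging column is an assumption; the point is that you decide what happens when there is no match instead of letting a new record be created:

answer = (function transformEntry(source) {
    var usr = new GlideRecord('sys_user');
    usr.addQuery('name', source.u_name); // 'u_name' is an assumed staging column
    usr.query();
    if (usr.next()) {
        return usr.getUniqueValue();
    }
    return ''; // no match: leave the reference empty rather than creating a stray record
})(source);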

 

More information here:

Here is the second article in the Portal diaries series. In this article, I would like to discuss displaying the summary of the approved record in Service Portal. As customers started using ServiceNow more widely, the idea of the approval process drifted into other modules like resource plans, issues, stories, HR cases and test plans, which are just a few to mention.

 

In my opinion the OOB approval page in Service Portal focuses only on RITM approvals, displaying all the variables, whereas for other tables it only shows the following four fields (Short description, Opened by, and Start and End dates if applicable).

 

The backend form view of a ServiceNow approval record displays a summary of the record being approved with the help of the approval summarizer UI macro. If the specific table has an approval view, the UI macro (Approval Summarizer) displays the fields of that view; if not, the default view of the form is shown.

 

Here is a screenshot of the change request fields in an approval record using the approval summarizer.

 

A similar solution can be implemented in Service Portal via a custom script include taking a GlideRecord as input and outputting the fields of the default view along with their field positions. When I was working on a different requirement, I came across one of the methods in the $sp portal API called getForm, and an <sp-model> tag to render the form.

 

Here is the format for calling getForm api and using it in HTML view

 

HTML Template

<sp-model form-model="data.f" mandatory="mandatory"></sp-model>

 

Server Code

data.f = $sp.getForm( table_name, sys_id_of_record);

 

This solution seemed clean until I realized that the getForm method displays fields with edit access, which is undesirable. On a positive note, getForm returns the fields after all the rules like UI policies, client scripts, ACLs and even dictionary-level read only have been applied. This gave me the idea of forcing all the fields to be read only.

 

In order to make all the fields read only, I hacked the data.f object and updated the sys_readonly property.

 

Here is the server script to make all the fields of a record read only

 

data.f = $sp.getForm( table_name, sys_id);

for (var field_name in data.f._fields) {
     data.f._fields[field_name]["sys_readonly"] = true;
}
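Putting it together, a minimal widget server script could look like the sketch below. The 'table' and 'sys_id' page parameters are assumptions for this illustration; in the attached widget the record being rendered comes from the approval instead:

(function() {
    var tableName = $sp.getParameter('table');
    var recordID = $sp.getParameter('sys_id');
    if (!tableName || !recordID) {
        return;
    }
    data.f = $sp.getForm(tableName, recordID);
    // Force every field on the rendered form to read only
    for (var field_name in data.f._fields) {
        data.f._fields[field_name]['sys_readonly'] = true;
    }
})();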


 

Screenshot of Story form in approval widget

 

Screenshot of HR case form in approval widget

 

Here is more info about what getForm returns

 

gs.log('display value: ' + data.f.display_value);
gs.log('ui actions: ' + data.f._ui_actions);
gs.log('short description: ' + data.f.short_description);
gs.log('plural: ' + data.f.plural);
gs.log('view_title: ' + data.f.view_title);
gs.log('_perf: ' + data.f._perf);
gs.log('_sections: ' + data.f._sections);
gs.log('label: ' + data.f.label);
gs.log('title: ' + data.f.title);
gs.log('_fields: ' + data.f._fields);
gs.log('_formatters: ' + data.f._formatters);
gs.log('sys_id: ' + data.f.sys_id);
gs.log('view: ' + data.f.view);
gs.log('scratchpad: ' + data.f.g_scratchpad);
gs.log('_view: ' + data.f._view);
gs.log('attachmentGUID: ' + data.f._attachmentGUID);
gs.log('client_script: ' + data.f.client_script);
gs.log('related_lists: ' + data.f._related_lists);
gs.log('table: ' + data.f.table);
gs.log('policy: ' + data.f.policy);

 

In this article I talked mostly about the properties of fields. Here are all the properties of a field:

 

gs.log('sys_mandatory: ' + data.f._fields.cmdb_ci.sys_mandatory);
gs.log('visible: ' + data.f._fields.cmdb_ci.visible);
gs.log('dependentField: ' + data.f._fields.cmdb_ci.dependentField);
gs.log('dbType: ' + data.f._fields.cmdb_ci.dbType);
gs.log('label: ' + data.f._fields.cmdb_ci.label);
gs.log('sys_readonly: ' + data.f._fields.cmdb_ci.sys_readonly);
gs.log('type: ' + data.f._fields.cmdb_ci.type);
gs.log('mandatory: ' + data.f._fields.cmdb_ci.mandatory);
gs.log('refTable: ' + data.f._fields.cmdb_ci.refTable);
gs.log('displayValue: ' + data.f._fields.cmdb_ci.displayValue);
gs.log('readonly: ' + data.f._fields.cmdb_ci.readonly);
gs.log('hint: ' + data.f._fields.cmdb_ci.hint);
gs.log('name: ' + data.f._fields.cmdb_ci.name);
gs.log('attributes: ' + data.f._fields.cmdb_ci.attributes);
gs.log('reference_key: ' + data.f._fields.cmdb_ci.reference_key);
gs.log('readonlyClickthrough: ' + data.f._fields.cmdb_ci.readonlyClickthrough);
gs.log('choice: ' + data.f._fields.cmdb_ci.choice);
gs.log('value: ' + data.f._fields.cmdb_ci.value);
gs.log('max_length: ' + data.f._fields.cmdb_ci.max_length);
gs.log('ed: ' + data.f._fields.cmdb_ci.ed);
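Since every entry in data.f._fields carries these properties, a small loop can dump them for inspection; this sketch only uses the label and displayValue properties listed above:

for (var name in data.f._fields) {
    var fld = data.f._fields[name];
    gs.log(fld.label + ' (' + name + '): ' + fld.displayValue);
}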

 

Please find the attached XML of updated Approval Record widget.

 

Edit: updated the widget to include record attachments

 

Previous blogs in this series

Portal diaries: Service Portal – Making Rejection Comments Mandatory on Approval Record

Portal diaries: Service Portal – Multiple Catalogs (Part 1)

We’re excited to announce the CreatorCon judging panel!

 

Three legends of the enterprise IT and SaaS worlds will apply their expert scrutiny in selecting the 1st, 2nd, and 3rd place startups (and their apps built on the ServiceNow platform) that will deliver tremendous value to ServiceNow customers via distribution on the ServiceNow Store. They will judge the three finalists who will compete in a live pitch-off, on stage in a general session at CreatorCon on May 11th in Orlando, FL.

 

trophy.png

 

To find out who the judges are, please read the official announcement from ServiceNow CTO Allan Leinwand.

 

If you are selected as one of the three finalists, you’ll meet our esteemed judges. These vital industry connections add even more value to the $500K in cash investments from ServiceNow Ventures and $325K in marketing and business development prizes. I'd even go out on a limb and suggest this pushes it over the $1 million mark.

 

Be sure to get your entries in by the deadline of 2/15, for a shot at this tremendous opportunity!

 

Team CreatorCon Challenge

Martin Barclay
Director, Product Marketing
App Store and ISVs
ServiceNow
Santa Clara, CA

I just stumbled over a new security constraint that I never noticed before. I had a user with the role "user_admin" so they could do some standard admin stuff.
Now with that role, you have the "Groups" module and in it you can press "New". Here you put in your data and press save, and it all works, unless you want to add a type.

 

Then this happens…
ACL 1.JPG

 

 

 

That was kind of a bummer. But my guess was some kind of ACL so I put on my superhero cape and started to debug security.

I was expecting a page with at least ONE red row with an ACL to confirm that I was on the right track, but this is what I got.

 

acl 2 all green.JPG

 

I see the "Security constraints prevent access to requested page" message. However, it is all green, and after another look I can see that there is no ACL at all. This should be your "red flag".

I remember the days when I took the sys admin course, going through ACLs for the first time; it was VERY complex, and I never thought I would understand it.

Nevertheless, I learned that if the system cannot find an ACL for the specific table, it goes down to the wildcard one, etc.

 

And I have this beautiful picture in my head =)

 

Acl_matching.png

 

Now, there is a * ACL, but it requires the admin role to get access. But I do not see any red rows in my debug...

It all boils down to the security settings. By default, it has this setting under System Properties -> Security:

ACL 2.JPG

 

Meaning that if there is no ACL at all, access will be denied... and it didn't have any.

 

So to fix this, I created an ACL and put in the role I wanted to have access to the type field.

And here you can see with the ACL execution plan that it adds it just like we want =)

 

 

And when the user now goes in and clicks on type, it shows up.

 

ACL works.JPG

 

 

I hope this will help in the daily ACL work that we all love

 

//Göran

 


ServiceNow Witch Doctor and MVP
-----------------------------------
For all my blog posts: http://bit.ly/2fCzj1g

In this series, I would like to share solutions that are missing in Service Portal. Today I would like to discuss how to make rejection comments mandatory in Service Portal.

 

Approvals in ServiceNow are handled either via email or through an approval record. One of the significant reasons to use approvals via the form is the ability to capture comments when rejecting a record. In the form view there is an OOB UI policy, Comments mandatory on rejection, that makes the comments field mandatory when rejecting a record.

 

When I initially started playing with the Portal, I did not find a way to capture comments in the OOB approval widgets. So I cloned the OOB Approval info widget and added a new text box:

<textarea ng-model="c.data.comment" style="color: grey; width: 100%; margin-top: .5em;" placeholder="Rejection Comments" class="form-control" rows="5"></textarea>

 

Below is a screenshot of the comments box

 

In order to make the comments field required when rejecting, first validate whether it is empty or not. If it is, stop the approval process.

After adding a couple of lines to the server and client controller, I'm able to stop the rejection when the comments are NULL, and an alert box pops up as shown in the image below.

 

Here is the code that has to be updated in server and client controller blocks of a cloned widget.

 

Server Code

 

if (input.comment){
     gr.comments = input.comment;
     gr.update();
}

 

Client Controller

 

c.action = function(state) {
    if ((c.data.comment == undefined || c.data.comment == '') && state == 'rejected') {
        $window.alert('Rejection Comments cannot be empty');
        return false;
    }
    c.data.op = state;
    c.data.state = state;
    c.server.update();
}
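For context, here is a rough sketch of how the comment handling can sit in the cloned widget's server script. This is illustrative only; the real Approval info widget already loads the approval record, and the way the sys_id is obtained here is an assumption:

(function() {
    var gr = new GlideRecord('sysapproval_approver');
    if (!gr.get($sp.getParameter('sys_id'))) { // assumption: approval sys_id comes from the page URL
        return;
    }
    if (input && input.op) {
        if (input.comment) {
            gr.comments = input.comment; // carry the rejection comment onto the approval record
        }
        gr.state = input.op; // 'approved' or 'rejected'
        gr.update();
    }
    data.state = gr.getValue('state');
})();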

 

 

 

Please find the attached XML of updated Approval info widget.

 

 

Blogs in this series

Portal diaries: Service Portal – Approval Summarizer in Portal

Portal diaries: Service Portal – Multiple Catalogs (Part 1)

After web services were first introduced, they became very popular. Web services are a great tool to exchange information, and the ServiceNow implementation of web services is top notch. When encoding international characters, the safest option is Unicode. One of the most popular implementations is UTF-8, which is the one ServiceNow has adopted.  If you need to connect to your instance and use non-ASCII characters, you should read this blog, especially if you are seeing � or question marks (?) as part of the data received using web services. Luckily the solution is very simple: safe-encode non-ASCII characters before sending them (see below) if you are not using UTF-8.

 

Utf8webgrowth.png

Encoding aside, incoming SOAP requests with characters like "&", "<" or ">" in the data can cause errors. Those characters are reserved, as they are part of the XML tags used to delimit the data. If used in the data, they interfere with the SOAP message itself, and when such a message is read by ServiceNow the request will fail with "Unable to parse SOAP document".

Recommendations to safely pass non-ASCII characters

I will focus on SOAP web services. They use SOAP messages to exchange data between the SOAP nodes (e.g. your SOAP client and the instance). However, while the data is provided in the message, both parties need to agree on the encoding. ServiceNow makes it easy: the instance uses UTF-8 to encode characters.

 

unicode-shield.png

 

If you are interacting with your instance and you use a different encoding with special characters in your data, you may face problems if you are not using Unicode. However, XML offers the option to safely encode most characters as escape sequences (safe encoding).

 

For a SOAP message, whether it needs to be safe encoded depends on the direction and on whether it is encoded in UTF-8:

  • Incoming to the instance, encoded in UTF-8: safe encoding is not necessary (a safe-encoded message will also work).
  • Incoming to the instance, not encoded in UTF-8: yes, safe encode the data to avoid ? or � characters.
  • Outbound from the instance, encoded in UTF-8: yes, safe encode if the target is not UTF-8, to avoid ? or � characters.
  • Outbound from the instance, not encoded in UTF-8: yes, always safe encode the data.

 

The outbound SOAP messages from the instance are always encoded in UTF-8.

 

"Incoming" means that SOAP calls toward your instance. "Outbound" means SOAP calls from the instance to your end-point. To safely encode the message, you need to transform any non-ascii character into a XML code.

 

 

Example of a message sent to the instance using a Unicode (UTF-8) encoding

In the following example, I will use SOAP UI to transfer a "comments" value that contains non-ASCII characters and a "work_notes" value that contains the same characters safely encoded. I would expect the system to have NO problem with the characters, as both SOAP nodes are using UTF-8.

 

Using SOAP UI looks like:

SOAP ui.jpg

 

In more detail, the message looks like:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:inc="http://www.service-now.com/incident">
   <soapenv:Header/>
   <soapenv:Body>
      <inc:insert>
         <short_description>Testing with encoding characters characters</short_description>
         <comments>Testing with encoding characters characters -not safe encoded
Basic Latin
! " # $ % &amp; ' ( ) * + , - . / 0 1 2 3 4 5 6 7 8 9 : ; &#x3C; = &#x3E; ? @ A B C D E F G H I J K L M N O P Q R S T U V W X Y Z [ \ ] ^ _ ` a b c d e f g h i j k l m n o p q r s t u v w x y z { | } ~
Latin-1 Supplement
  ¡ ¢ £ ¤ ¥ ¦ § ¨ © ª « ¬  ® ¯ ° ± ² ³ ´ µ ¶ · ¸ ¹ º » ¼ ½ ¾ ¿ À Á Â Ã Ä Å Æ Ç È É Ê Ë Ì Í Î Ï Ð Ñ Ò Ó Ô Õ Ö × Ø Ù Ú Û Ü Ý Þ ß à á â ã ä å æ ç è é ê ë ì í î ï ð ñ ò ó ô õ ö ÷ ø ù ú û ü ý þ ÿ
Latin Extended-A
Ā ā Ă ă Ą ą Ć ć Ĉ ĉ Ċ ċ Č č Ď ď Đ đ Ē ē Ĕ ĕ Ė ė Ę ę Ě ě Ĝ ĝ Ğ ğ Ġ ġ Ģ ģ Ĥ ĥ Ħ ħ Ĩ ĩ Ī ī Ĭ ĭ Į į İ ı IJ ij Ĵ ĵ Ķ ķ ĸ Ĺ ĺ Ļ ļ Ľ ľ Ŀ ŀ Ł ł Ń ń Ņ ņ Ň ň ʼn Ŋ ŋ Ō ō Ŏ ŏ Ő ő Œ œ Ŕ ŕ Ŗ ŗ Ř ř Ś ś Ŝ ŝ Ş ş Š š Ţ ţ Ť ť Ŧ ŧ Ũ ũ Ū ū Ŭ ŭ Ů ů Ű ű Ų ų Ŵ ŵ Ŷ ŷ Ÿ Ź ź Ż ż Ž ž ſ
Latin Extended-B
ƀ Ɓ Ƃ ƃ Ƅ ƅ Ɔ Ƈ ƈ Ɖ Ɗ Ƌ ƌ ƍ Ǝ Ə Ɛ Ƒ ƒ Ɠ Ɣ ƕ Ɩ Ɨ Ƙ ƙ ƚ ƛ Ɯ Ɲ ƞ Ɵ Ơ ơ Ƣ ƣ Ƥ ƥ Ʀ Ƨ ƨ Ʃ ƪ ƫ Ƭ ƭ Ʈ Ư ư Ʊ Ʋ Ƴ ƴ Ƶ ƶ Ʒ Ƹ ƹ ƺ ƻ Ƽ ƽ ƾ ƿ ǀ ǁ ǂ ǃ DŽ Dž dž LJ Lj lj NJ Nj nj Ǎ ǎ Ǐ ǐ Ǒ ǒ Ǔ ǔ Ǖ ǖ Ǘ ǘ Ǚ ǚ Ǜ ǜ ǝ Ǟ ǟ Ǡ ǡ Ǣ ǣ Ǥ ǥ Ǧ ǧ Ǩ ǩ Ǫ ǫ Ǭ ǭ Ǯ ǯ ǰ DZ Dz dz Ǵ ǵ Ƕ Ƿ Ǹ ǹ Ǻ ǻ Ǽ ǽ Ǿ ǿ ...</comments>
         <work_notes>Testing with encoding characters characters -safe encoded
Basic Latin
! &#x22; # $ % &#x26; &#x27; ( ) * + , - . / 0 1 2 3 4 5 6 7 8 9 : ; &#x3C; = &#x3E; ? @ A B C D E F G H I J K L M N O P Q R S T U V W X Y Z [ \ ] ^ _ &#x60; a b c d e f g h i j k l m n o p q r s t u v w x y z { | } ~
Latin-1 Supplement
  &#xA1; &#xA2; &#xA3; &#xA4; &#xA5; &#xA6; &#xA7; &#xA8; &#xA9; &#xAA; &#xAB; &#xAC; &#xAD; &#xAE; &#xAF; &#xB0; &#xB1; &#xB2; &#xB3; &#xB4; &#xB5; &#xB6; &#xB7; &#xB8; &#xB9; &#xBA; &#xBB; &#xBC; &#xBD; &#xBE; &#xBF; &#xC0; &#xC1; &#xC2; &#xC3; &#xC4; &#xC5; &#xC6; &#xC7; &#xC8; &#xC9; &#xCA; &#xCB; &#xCC; &#xCD; &#xCE; &#xCF; &#xD0; &#xD1; &#xD2; &#xD3; &#xD4; &#xD5; &#xD6; &#xD7; &#xD8; &#xD9; &#xDA; &#xDB; &#xDC; &#xDD; &#xDE; &#xDF; &#xE0; &#xE1; &#xE2; &#xE3; &#xE4; &#xE5; &#xE6; &#xE7; &#xE8; &#xE9; &#xEA; &#xEB; &#xEC; &#xED; &#xEE; &#xEF; &#xF0; &#xF1; &#xF2; &#xF3; &#xF4; &#xF5; &#xF6; &#xF7; &#xF8; &#xF9; &#xFA; &#xFB; &#xFC; &#xFD; &#xFE; &#xFF;
Latin Extended-A
&#x100; &#x101; &#x102; &#x103; &#x104; &#x105; &#x106; &#x107; &#x108; &#x109; &#x10A; &#x10B; &#x10C; &#x10D; &#x10E; &#x10F; &#x110; &#x111; &#x112; &#x113; &#x114; &#x115; &#x116; &#x117; &#x118; &#x119; &#x11A; &#x11B; &#x11C; &#x11D; &#x11E; &#x11F; &#x120; &#x121; &#x122; &#x123; &#x124; &#x125; &#x126; &#x127; &#x128; &#x129; &#x12A; &#x12B; &#x12C; &#x12D; &#x12E; &#x12F; &#x130; &#x131; &#x132; &#x133; &#x134; &#x135; &#x136; &#x137; &#x138; &#x139; &#x13A; &#x13B; &#x13C; &#x13D; &#x13E; &#x13F; &#x140; &#x141; &#x142; &#x143; &#x144; &#x145; &#x146; &#x147; &#x148; &#x149; &#x14A; &#x14B; &#x14C; &#x14D; &#x14E; &#x14F; &#x150; &#x151; &#x152; &#x153; &#x154; &#x155; &#x156; &#x157; &#x158; &#x159; &#x15A; &#x15B; &#x15C; &#x15D; &#x15E; &#x15F; &#x160; &#x161; &#x162; &#x163; &#x164; &#x165; &#x166; &#x167; &#x168; &#x169; &#x16A; &#x16B; &#x16C; &#x16D; &#x16E; &#x16F; &#x170; &#x171; &#x172; &#x173; &#x174; &#x175; &#x176; &#x177; &#x178; &#x179; &#x17A; &#x17B; &#x17C; &#x17D; &#x17E; &#x17F;
Latin Extended-B
&#x180; &#x181; &#x182; &#x183; &#x184; &#x185; &#x186; &#x187; &#x188; &#x189; &#x18A; &#x18B; &#x18C; &#x18D; &#x18E; &#x18F; &#x190; &#x191; &#x192; &#x193; &#x194; &#x195; &#x196; &#x197; &#x198; &#x199; &#x19A; &#x19B; &#x19C; &#x19D; &#x19E; &#x19F; &#x1A0; &#x1A1; &#x1A2; &#x1A3; &#x1A4; &#x1A5; &#x1A6; &#x1A7; &#x1A8; &#x1A9; &#x1AA; &#x1AB; &#x1AC; &#x1AD; &#x1AE; &#x1AF; &#x1B0; &#x1B1; &#x1B2; &#x1B3; &#x1B4; &#x1B5; &#x1B6; &#x1B7; &#x1B8; &#x1B9; &#x1BA; &#x1BB; &#x1BC; &#x1BD; &#x1BE; &#x1BF; &#x1C0; &#x1C1; &#x1C2; &#x1C3; &#x1C4; &#x1C5; &#x1C6; &#x1C7; &#x1C8; &#x1C9; &#x1CA; &#x1CB; &#x1CC; &#x1CD; &#x1CE; &#x1CF; &#x1D0; &#x1D1; &#x1D2; &#x1D3; &#x1D4; &#x1D5; &#x1D6; &#x1D7; &#x1D8; &#x1D9; &#x1DA; &#x1DB; &#x1DC; &#x1DD; &#x1DE; &#x1DF; &#x1E0; &#x1E1; &#x1E2; &#x1E3; &#x1E4; &#x1E5; &#x1E6; &#x1E7; &#x1E8; &#x1E9; &#x1EA; &#x1EB; &#x1EC; &#x1ED; &#x1EE; &#x1EF; &#x1F0; &#x1F1; &#x1F2; &#x1F3; &#x1F4; &#x1F5; &#x1F6; &#x1F7; &#x1F8; &#x1F9; &#x1FA; &#x1FB; &#x1FC; &#x1FD; &#x1FE; &#x1FF; 
</work_notes> 
      </inc:insert>
   </soapenv:Body>
</soapenv:Envelope>

 

Once processed into the target table, the characters are correctly displayed.

UTF 8 ENCODED.jpg

This shows that the data is processed without any problems.

 

Example of a message sent to the instance using a non-Unicode (ISO-8859-1) encoding

Similarly to the previous example, I will use SOAP UI to transfer a "comments" value that contains non-ASCII characters and a "work_notes" value that contains the same characters safely encoded. This time I will encode in ISO-8859-1. I would expect the system to try to match the characters against UTF-8.

 

Using soap UI looks like:

non unicode 8 soap ui.jpg

In more detail, the message looks like:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:inc="http://www.service-now.com/incident">
   <soapenv:Header/>
   <soapenv:Body>
      <inc:insert>
         <short_description>Testing with encoding characters characters</short_description>
         <comments>Testing with encoding characters characters -not safe encoded
Basic Latin
 ! " # $ % ' ( ) * + , - . / 0 1 2 3 4 5 6 7 8 9 : ;  @ A B C D E F G H I J K L M N O P Q R S T U V W X Y Z [ \ ] ^ _ ` a b c d e f g h i j k l m n o p q r s t u v w x y z { | } ~
Latin Extended-A
A a Ă ă Ą ą Ć ć C c C c Č č Ď ď Đ đ E e E e E e Ę ę Ě ě G g G g G g G g H h H h I i I i I i I i I i J j K k Ĺ ĺ L l Ľ ľ Ł ł Ń ń N n Ň ň O o O o Ő ő O o Ŕ ŕ R r Ř ř Ś ś S s Ş ş Š š Ţ ţ Ť ť T t U u U u U u Ů ů Ű ű U u W w Y y Y Ź ź Ż ż Ž ž 
Latin Extended-B
b Đ F f I l O O o t T U u z | ! A a I i O o U u U u U u U u U u A a G g G g K k O o O o j
</comments>
         <work_notes>Testing with encoding characters characters -safe encoded
Basic Latin
! &#x22; # $ % &#x26; &#x27; ( ) * + , - . / 0 1 2 3 4 5 6 7 8 9 : ; &#x3C; = &#x3E; ? @ A B C D E F G H I J K L M N O P Q R S T U V W X Y Z [ \ ] ^ _ &#x60; a b c d e f g h i j k l m n o p q r s t u v w x y z { | } ~
Latin-1 Supplement
  &#xA1; &#xA2; &#xA3; &#xA4; &#xA5; &#xA6; &#xA7; &#xA8; &#xA9; &#xAA; &#xAB; &#xAC; &#xAD; &#xAE; &#xAF; &#xB0; &#xB1; &#xB2; &#xB3; &#xB4; &#xB5; &#xB6; &#xB7; &#xB8; &#xB9; &#xBA; &#xBB; &#xBC; &#xBD; &#xBE; &#xBF; &#xC0; &#xC1; &#xC2; &#xC3; &#xC4; &#xC5; &#xC6; &#xC7; &#xC8; &#xC9; &#xCA; &#xCB; &#xCC; &#xCD; &#xCE; &#xCF; &#xD0; &#xD1; &#xD2; &#xD3; &#xD4; &#xD5; &#xD6; &#xD7; &#xD8; &#xD9; &#xDA; &#xDB; &#xDC; &#xDD; &#xDE; &#xDF; &#xE0; &#xE1; &#xE2; &#xE3; &#xE4; &#xE5; &#xE6; &#xE7; &#xE8; &#xE9; &#xEA; &#xEB; &#xEC; &#xED; &#xEE; &#xEF; &#xF0; &#xF1; &#xF2; &#xF3; &#xF4; &#xF5; &#xF6; &#xF7; &#xF8; &#xF9; &#xFA; &#xFB; &#xFC; &#xFD; &#xFE; &#xFF;
Latin Extended-A
&#x100; &#x101; &#x102; &#x103; &#x104; &#x105; &#x106; &#x107; &#x108; &#x109; &#x10A; &#x10B; &#x10C; &#x10D; &#x10E; &#x10F; &#x110; &#x111; &#x112; &#x113; &#x114; &#x115; &#x116; &#x117; &#x118; &#x119; &#x11A; &#x11B; &#x11C; &#x11D; &#x11E; &#x11F; &#x120; &#x121; &#x122; &#x123; &#x124; &#x125; &#x126; &#x127; &#x128; &#x129; &#x12A; &#x12B; &#x12C; &#x12D; &#x12E; &#x12F; &#x130; &#x131; &#x132; &#x133; &#x134; &#x135; &#x136; &#x137; &#x138; &#x139; &#x13A; &#x13B; &#x13C; &#x13D; &#x13E; &#x13F; &#x140; &#x141; &#x142; &#x143; &#x144; &#x145; &#x146; &#x147; &#x148; &#x149; &#x14A; &#x14B; &#x14C; &#x14D; &#x14E; &#x14F; &#x150; &#x151; &#x152; &#x153; &#x154; &#x155; &#x156; &#x157; &#x158; &#x159; &#x15A; &#x15B; &#x15C; &#x15D; &#x15E; &#x15F; &#x160; &#x161; &#x162; &#x163; &#x164; &#x165; &#x166; &#x167; &#x168; &#x169; &#x16A; &#x16B; &#x16C; &#x16D; &#x16E; &#x16F; &#x170; &#x171; &#x172; &#x173; &#x174; &#x175; &#x176; &#x177; &#x178; &#x179; &#x17A; &#x17B; &#x17C; &#x17D; &#x17E; &#x17F;
Latin Extended-B
&#x180; &#x181; &#x182; &#x183; &#x184; &#x185; &#x186; &#x187; &#x188; &#x189; &#x18A; &#x18B; &#x18C; &#x18D; &#x18E; &#x18F; &#x190; &#x191; &#x192; &#x193; &#x194; &#x195; &#x196; &#x197; &#x198; &#x199; &#x19A; &#x19B; &#x19C; &#x19D; &#x19E; &#x19F; &#x1A0; &#x1A1; &#x1A2; &#x1A3; &#x1A4; &#x1A5; &#x1A6; &#x1A7; &#x1A8; &#x1A9; &#x1AA; &#x1AB; &#x1AC; &#x1AD; &#x1AE; &#x1AF; &#x1B0; &#x1B1; &#x1B2; &#x1B3; &#x1B4; &#x1B5; &#x1B6; &#x1B7; &#x1B8; &#x1B9; &#x1BA; &#x1BB; &#x1BC; &#x1BD; &#x1BE; &#x1BF; &#x1C0; &#x1C1; &#x1C2; &#x1C3; &#x1C4; &#x1C5; &#x1C6; &#x1C7; &#x1C8; &#x1C9; &#x1CA; &#x1CB; &#x1CC; &#x1CD; &#x1CE; &#x1CF; &#x1D0; &#x1D1; &#x1D2; &#x1D3; &#x1D4; &#x1D5; &#x1D6; &#x1D7; &#x1D8; &#x1D9; &#x1DA; &#x1DB; &#x1DC; &#x1DD; &#x1DE; &#x1DF; &#x1E0; &#x1E1; &#x1E2; &#x1E3; &#x1E4; &#x1E5; &#x1E6; &#x1E7; &#x1E8; &#x1E9; &#x1EA; &#x1EB; &#x1EC; &#x1ED; &#x1EE; &#x1EF; &#x1F0; &#x1F1; &#x1F2; &#x1F3; &#x1F4; &#x1F5; &#x1F6; &#x1F7; &#x1F8; &#x1F9; &#x1FA; &#x1FB; &#x1FC; &#x1FD; &#x1FE; &#x1FF; 
</work_notes> 
      </inc:insert>
   </soapenv:Body>
</soapenv:Envelope>

 

Below, you can see that some characters were translated incorrectly.

incorrectly translated UTF .jpg

Using non-ASCII characters with a non-UTF-8 encoding can cause them to be transferred incorrectly. However, using XML-encoded characters you can safely transfer those characters even when the encoding is not UTF-8.

 

How to safely encode XML data

There are several ways to encode the data. Below is a simple script that encodes the data to be transferred.

 

Here is a simple background script function that encodes the data:

// Simple XML data encoding function - do not double-escape any characters.
function escapeXMLEntities(xmldata) {
    return xmldata.replace(/[\u00A0-\u2666<>\&]/g, function (a) {
        return "&#" + a.charCodeAt(0) + ";";
    });
}

var str = "A a Ă ă Ą ą Ć ć C c C c Č č Ď ď Đ đ E e E e E e Ę ę Ě ě G g G g G g G g H h H h I i I i I i I i I i J j K k Ĺ ĺ L l Ľ ľ Ł ł Ń ń N n Ň ň O o O o Ő ő O o Ŕ ŕ R r Ř ř Ś ś S s Ş ş Š š Ţ ţ Ť ť T t U u U u U u Ů ů Ű ű U u W w Y y Y Ź ź Ż ż Ž ž";
gs.print(escapeXMLEntities(str));

 

Result:

Script completed in scope global: script

*** Script: A a &#258; &#259; &#260; &#261; &#262; &#263; C c C c &#268; &#269; &#270; &#271; &#272; &#273; E e E e E e &#280; &#281; &#282; &#283; G g G g G g G g H h H h I i I i I i I i I i J j K k &#313; &#314; L l &#317; &#318; &#321; &#322; &#323; &#324; N n &#327; &#328; O o O o &#336; &#337; O o &#340; &#341; R r &#344; &#345; &#346; &#347; S s &#350; &#351; &#352; &#353; &#354; &#355; &#356; &#357; T t U u U u U u &#366; &#367; &#368; &#369; U u W w Y y Y &#377; &#378; &#379; &#380; &#381; &#382;
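The same function can also help when sending data out from a script. Below is a hedged sketch that escapes a value before placing it into an outbound SOAP message; the message name 'My SOAP Message', its 'insert' function, and the 'comments' variable are assumptions, and escapeXMLEntities is the function defined above (paste both into the same script):

var raw = "Ă ă Ą ą Ć ć";
var sm = new sn_ws.SOAPMessageV2('My SOAP Message', 'insert');
sm.setStringParameterNoEscape('comments', escapeXMLEntities(raw)); // NoEscape avoids double-escaping the references
var response = sm.execute();
gs.print(response.getStatusCode());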

 

When using SOAP messages, ensure you are using UTF-8 to transfer the data, or ensure non-ASCII characters are escaped safely into XML character references. As a good note, this can also be used in many other situations. Luckily, programmatically it is not a big challenge.

 

More information can be found here:
