
As companies move their existing legacy applications to the cloud and adopt a mobile- and social-first approach, users are becoming less tolerant of antiquated interfaces, standalone applications that lack proper integration, and data scattered across silos. Today's companies need modern, easy-to-use apps that are accessible from any device, consolidated data, easy integration with back-office systems, and the ability to innovate and adapt.

 

As part of the journey to modernize the application portfolio, many companies are looking to replace Domino Lotus Notes (DLN) applications. These applications range in variety, complexity, and functionality. Many DLN applications are task and workflow based, which makes them ideal candidates to migrate onto the ServiceNow Platform. In this paper, we describe an approach to migrating your DLN applications onto the ServiceNow Platform.

 

Based on our experience from a large number of DLN to ServiceNow Platform migrations, we typically see four phases in migrating a DLN application:

 

Phase 1: Determine Complexity

In this phase you will be working with the process owner of the app. They should be able to take you through the workflow and use cases for the app. Don't worry about solving the technical details at this point, but get the requirements fleshed out, and leverage the expertise of the process owner to take notes on where the app can improve when migrated to ServiceNow.

 

I often get asked at this phase – "Does ServiceNow have a tool that lets me migrate my DLN app into ServiceNow automatically?" There is no migration tool, but what I have found is that most people want to use this opportunity to revisit the implementation of the previously written DLN app and fix issues with that implementation, rather than migrate the same problems over.

 

It is at this phase that we break down an existing DLN application and really understand what needs to be migrated. Use this opportunity to rethink how things should be implemented. Avoid over-engineering, and avoid carrying unwanted functionality (process, data, integrations, etc.) over to the new Platform. Take note of fields on your forms that are unused, overloaded, or the wrong field type, making your workflow more difficult than it should be.

 

I normally like to collect answers to the following questions (thank you to my colleague Frank Schuster for sharing these questions):

  • What is the functional app description?
  • Who is the business owner of the app?
  • What is the business criticality of the app (on a scale from 1-5, 5 being very critical)?
  • What was the usage of the app in the past 3 months?
  • How many Notes databases are required for this app?
  • Are there integrations or messaging involved with this app?  Does the app use Sametime for messaging?
  • What is the current size of the database?
  • What is the number of documents in the database?
  • Does the app use ACL and what does that security structure look like? Who has what roles?
  • Does the app need to be optimized for mobile?
  • Does the app generate Outlook calendar invites?
  • Do we need to migrate the existing data into ServiceNow?

 

Phase 2: Plan for Success

You have a few options to solve your app requirements on the Now Platform.

  1. If things are pretty simple, in terms of data model and workflow, then a catalog item may be the best approach. Catalog items are services that are available to order from a service catalog, and they use the out-of-box service catalog request data model. Administrators and catalog administrators can define catalog items, with details such as formatted descriptions, workflow, etc. There is no hard and fast rule here, but it is generally a good idea to keep it simple and clean, and to keep the item in the appropriate ServiceNow scope.
  2. You can also define your own data model, not using the service catalog request item table, and keep things nicely contained within one functional ServiceNow application. These simple data models make for easy DLN migrations and can be represented similarly to a catalog item (within a service catalog and with their own custom workflow). The main difference between this type of implementation and a catalog item implementation is usually licensing.
  3. Ultimately, if what you need to convert has a pretty complex data model, workflow, security, integrations, or other business logic, then you should probably do the implementation as a separate application on the Now Platform.

 

As a best practice you should not create a one-to-one application in ServiceNow for every DLN application. In ServiceNow, you have the ability to create application scopes that represent functional groups. A functional group is defined in terms of the service you are offering, meaning that a single application may contain several DLN applications in one ServiceNow application scope. For example, if you have several DLN apps that are used for invoicing, you may want to create one ServiceNow application for invoicing and combine the functionality of those DLN apps into one ServiceNow application scope.

 

Extending the Task table

Another big question to ask at this point is whether to extend your data model from the ServiceNow Task table. Task is one of the core tables provided with the ServiceNow Platform; it provides a series of standard fields used on each of the tables that extend it, such as the Incident and Problem tables. In addition, any table that extends Task can take advantage of task-specific functionality for driving tasks.

 

Once again, we do not have to talk about the implementation details here, but we do need to decide whether to use the functionality that ServiceNow gives us around extending the Task table. There is no changing your mind once the table is created without rebuilding the entire table. I generally like to introduce some of the features of extending Task in order to determine whether we want to introduce this in our DLN app migration. There are several out-of-box features you get if your data model extends Task, but there also comes a lot of extra metadata you may not really need. A previous blog post talks about extending the task table here: What you get by extending the Task table. If your DLN app requires SLAs or Visual Task Boards, then the decision is easy - extend Task. Look at the provided link and make that determination at this stage in the process.
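To make the trade-off concrete, here is a minimal sketch of what you inherit in a script when your table extends Task. The scoped table name x_acme_invoice_request is made up for this example; the number, active, approval, and assigned_to fields shown come from the Task table rather than anything you would have to create yourself.

// Sketch only: x_acme_invoice_request is a hypothetical scoped table that extends Task.
// The fields queried below are inherited from Task, not custom columns.
var inv = new GlideRecord('x_acme_invoice_request');
inv.addQuery('active', true);          // inherited from Task
inv.addQuery('approval', 'approved');  // inherited from Task
inv.query();
while (inv.next()) {
    gs.info('Request ' + inv.getValue('number') + ' is assigned to ' +
        inv.assigned_to.getDisplayValue());
}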

 

I also like to start mapping parts of the DLN app to the appropriate ServiceNow application files as we migrate the app to ServiceNow. It is much easier to break down functional requirements if we have gone through this exercise up front. We can also identify any gaps in product features at this point and decide whether ServiceNow is the right tool for this particular DLN application. I usually like to categorize application components into three main categories:

  1. Data Model
  2. Display
  3. Code

 

Data model

The data model from DLN will include tables, fields, files, and data relationships.  This maps pretty nicely into ServiceNow as a table, with the appropriate fields and security.

 

Display

The user interface from DLN will include forms, views, navigators, and web pages; these will map into UI Pages, Catalog Items, Portals, Process State Flows, Dashboards, Reports, and Related Lists in ServiceNow.

 

Code

The code from DLN will include formulas, LotusScript, Java, JavaScript, and other API calls. This will map into business rules, workflows, script includes, and events within ServiceNow. All of the coding for converting these DLN files is done in JavaScript on the ServiceNow Platform; with the exception of web pages (which use HTML, CSS, etc.), all of the implementation in ServiceNow is done in JavaScript.
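As a rough illustration of that mapping, a Notes save agent or formula that stamps a document usually becomes a before business rule in ServiceNow. The sketch below is only an example; the table and field names (x_acme_invoice_request, u_quantity, u_unit_price, u_total) are invented for illustration.

// Sketch of a before insert/update business rule replacing a DLN save agent.
// Table and field names are hypothetical.
(function executeRule(current, previous /*null when async*/) {
    // Recalculate a total the way a Notes formula might have done on document save
    current.u_total = parseFloat(current.getValue('u_quantity') || 0) *
        parseFloat(current.getValue('u_unit_price') || 0);

    // Flag high-value requests, similar to a LotusScript validation agent
    if (current.u_total > 10000) {
        current.work_notes = 'Total exceeds 10,000 - flagged for manual review';
    }
})(current, previous);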

 

Phase 3: Build and Test for Quality

Once you know what needs to be built and have a migration plan, the implementation isn't too hard to do. I usually start by building the data model first. Most of the functionality in ServiceNow is data driven. What I mean by this is that once we create the right table structure, ServiceNow will autogenerate APIs, list views, and forms based on the data model. This can all be done with no code on the ServiceNow Platform, saving quite a bit of development time.

 

After all of the tables are created, I start to tweak the out-of-box list and form views of the newly created tables. Remember, these forms are autogenerated by ServiceNow; once again, we can do this without code. I usually like to insert some sample data and create some reports and dashboards in this phase, as well as use the out-of-box Service Portal if a more modern single-page application is necessary.
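If you would rather script that sample data than type it in, a quick background-script sketch like the one below does the job; the table and field names are again placeholders from the earlier example.

// Background script sketch for seeding sample data into a hypothetical table.
var samples = [
    { short_description: 'Sample invoice request 1', u_quantity: 3,  u_unit_price: 250 },
    { short_description: 'Sample invoice request 2', u_quantity: 10, u_unit_price: 42 }
];

samples.forEach(function (row) {
    var gr = new GlideRecord('x_acme_invoice_request');
    gr.initialize();
    gr.short_description = row.short_description;   // inherited from Task
    gr.u_quantity = row.u_quantity;
    gr.u_unit_price = row.u_unit_price;
    gr.insert();
});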

 

Lastly, I tackle the business logic. This is where I build out the workflow and reimplement the coding details from the DLN side of things. Understanding the process at this point is imperative. Do you need approvals for a particular request? Do you calculate data when a particular database trigger fires? Do you need to implement a particular integration with an external system? All of those requirements are addressed in this phase.
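The approval question, for example, often ends up as a small script include that a workflow If activity can call. This is only a sketch; the class name and the 10,000 threshold are invented, and it reuses the hypothetical field names from the business rule example above.

// Sketch of a scoped script include a workflow "If" activity could call to decide on approvals.
var InvoiceApprovalRules = Class.create();
InvoiceApprovalRules.prototype = {
    initialize: function () {},

    // Returns true when the record should go through an approval step
    needsApproval: function (record) {
        return parseFloat(record.getValue('u_total') || 0) > 10000;
    },

    type: 'InvoiceApprovalRules'
};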

 

Ultimately, you do not have to follow these steps exactly. You may find yourself bouncing around these steps, but as a best practice it is important to test frequently along the way. What I mean is that you implement a functional requirement and test, then implement another requirement and test again. These are not full tests, but small functional tests of the feature you are currently implementing. Once you feel your application is in a good state, you can publish to a test instance of ServiceNow and do a full end-to-end test with real production data. Depending on the complexity of the app, you may have multiple deployments (resulting in more than one version before it is pushed to production). It is recommended that you use Git as a repository for the ServiceNow application, as it facilitates the management of these versions. It is also a best practice to do all code changes and development on your ServiceNow development instance and not on your test/QA or production instances.

 

Phase 4: Deploy into Production

At this point in the migration we are ready to get our application into the hands of our customers (internal or external). As a best practice, I like to deploy frequently. Do not be afraid to get your application into production and start getting feedback from your user base. ServiceNow has a feature for publishing applications to a private repository for your company domain. It is through this publishing that your other instances will be able to install or update your newly created app. The length of time to convert a DLN app depends on the complexity of what you are converting; you may find yourself converting and publishing multiple applications in a day, while others may take a week or more.

 

This paper is only the beginning of what you can do once you convert an app from DLN into ServiceNow. There are many features within the ServiceNow Platform we did not get into, but they can be utilized to enhance your DLN application from then to NOW!

 

Feel free to contact me at: chris.maloy@servicenow.com

This best practice only applies if you are working on the Istanbul or later release. When using the Automated Test Framework, it is a good idea to build your tests while you build your form and code. This best practice really only applies to testing your ServiceNow forms. Once you feel like you have the mandatory fields and default view identified and your data model pretty solid, you should create a test. I recommend doing one test per form (if possible). Some people will create a separate test per field, and that is not recommended in my book. It takes a long time to start up and run these tests, so I have learned to chain related tests together using the steps within the Automated Test Framework (ATF).

For example: below is the form from a property management application. This form is specifically used for managing the properties I rent. You can see there is a mandatory field on Address, but there is also a UI policy that controls the visibility of the Parking under structure field. If the Type field is switched to Single Family Home, the Parking under structure field goes away.

Now that my form is done and behaves the way I expect, I can create one automated test for this page that will test the behavior of the entire form (see below).

 

Rental_Property___ServiceNow.jpg

Rental_Property___ServiceNow 2.jpg

 

You can see here that I am creating a test that checks the state of more than one field at a time. You should not have a separate test for each field; that is overkill. For example, you should not have a separate test just for verifying that the Address field is mandatory.

 

Rental_Property_Form_Test___ServiceNow.jpg

 

Here is the flow of my test.

 

Rental_Property_Form_Test___ServiceNow 2.jpg

 

 

There may be exceptions to this rule, especially if your form is very complex and controlled by lots of scripting logic, but the idea is to minimize the number of places you have to maintain code, tests, and files. Simplicity is your friend.

I am working on a plan to help those who have apps in the global namespace convert those apps into scoped apps. I'd be willing to hear any thoughts the community wants to share (either post on this thread and/or email me at chris.maloy@servicenow.com). There are many difficult steps in this process (like migrating to a similar but different data model, as well as identifying which API calls you may be using that need to change because they are not accessible to other scopes).

 

For this reason I am going to start writing a bunch of scripts that can help in the conversion process.

This is a rough first draft of a helper script that can be called to help identify some global dependencies that will change during your conversion process. Don't be too critical - I wrote this while on a phone call, and I don't multi-task well. I am sure there are better ways to do this with regular expressions (instead of splitting strings like I am doing - I welcome any feedback).

 

You will need to call scanScript, passing in the sys_id of the script you want to scan (and the type, if it is not a script include). If you are in a domain-separated environment, you will need to query with NoDomain.

 

var siNameCalls = scanScript('ca4033c1d7110100fceaa6859e610326');

// The default script type is a script include (no need to pass in a type unless it is not a script include).
function scanScript(scriptSysId, type) {
    // Business rules live in sys_script; everything else defaults to sys_script_include
    var lgr = new GlideRecord(type === 'businessrule' ? 'sys_script' : 'sys_script_include');

    if (lgr.get(scriptSysId)) {
        var txt = lgr.getValue('script'),
            ans = txt.split('new'),
            anslength = ans.length;

        // Each fragment after a "new" should start with the constructor name, up to the "("
        while (anslength--) {
            var parsedData = ans[anslength];
            var dependency = parsedData.substr(0, parsedData.indexOf('('));
            var result = identifySIDependencies(dependency);

            if (result === 0) {
                gs.print(dependency + " is invalid script include usage. Not accessible in app scope.");
            }
        }
    }
}

function identifySIDependencies(dependency) {
    var sigr = new GlideRecord('sys_script_include'),
        enc = 'active=true^name=' + dependency.trim();

    // Build the lookup query
    sigr.addEncodedQuery(enc);
    sigr.query();

    if (sigr.next()) {
        // Check if public access
        if (sigr.access == 'public') {
            return 1; // public
        }

        return 0; // private
    }

    return -1; // not found
}

Just doing a quick blog today.

 

I have been on a few calls lately where developers are trying to process XML from an inbound SOAP/REST request. This scenario usually involves a scripted SOAP web service or a processor (where the payload is in XML format). If you are doing this in a scoped application, you have probably already started noticing some API limitations (compared to what you had access to when working in the global namespace). For example, if you need to support namespaces or need to get a list of all the attributes on a node, you may start to pull your hair out dealing with XML.

 

Here is the best strategy you can use, and I am declaring it a best practice: stop working in XML while in ServiceNow. Your first step on receiving XML and having to deal with it in JavaScript should be to convert it to JSON. Work in JSON while in the ServiceNow platform.

 

There is a scoped-application-safe function call you can make (as the first thing you do) that will do this conversion for you:

 

var obj = gs.xmlToJSON(xmlString);

 

There is one gotcha/caveat to watch out for with this approach: it will not work well with large XML payloads. Converting the XML into JSON brings the whole thing into memory, which may cause issues.
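To put it together, here is a rough sketch of a scoped Scripted REST resource script that accepts XML and immediately switches to JSON. The payload shape (an order element with a number child) and the response body are invented for the example.

// Sketch of a scoped Scripted REST resource script that accepts XML and works in JSON.
// The payload structure (order/number) is hypothetical.
(function process(request, response) {
    var xmlString = request.body.dataString;   // raw XML body of the inbound request
    var obj = gs.xmlToJSON(xmlString);         // convert once, then stay in JSON

    // From here on we work with a plain JavaScript object instead of XML nodes
    var orderNumber = (obj && obj.order) ? obj.order.number : null;
    gs.info('Received order: ' + orderNumber);

    response.setStatus(200);
    response.setBody({ received: orderNumber });
})(request, response);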

So here is the scenario. Let's say you have a scheduled job, and 10 different users log into ServiceNow (with rights to see scheduled jobs) and each clicks the Execute Now button. What happens to the running of those scripts? Does it queue them to be run in first-come order? Does it run them all simultaneously if a worker thread is available to run them?

 

The answer is that it runs them simultaneously if worker threads are available to run them. Execute Now sets the next action date to 0, which tells the system (along with the state being Ready) to run the script as soon as a worker becomes available.

 

ServiceNow.jpg

 

If you don't want these scripts running simultaneously in this multi-threaded fashion, you will want to implement your scripting logic differently. You could do a quick check in your script code to see whether another job of the same type is running before you run the bulk of your script.
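One simple way to do that check, sketched below, is to use a system property as a crude lock flag at the top of the scheduled script. The property name is made up, and this is not a bulletproof lock - just a guard against the obvious double-run.

// Sketch: crude lock using a system property so two Execute Now clicks don't overlap.
// The property name x_acme.nightly_job.running is hypothetical.
if (gs.getProperty('x_acme.nightly_job.running') === 'true') {
    gs.info('Another run of this job appears to be in progress - skipping.');
} else {
    gs.setProperty('x_acme.nightly_job.running', 'true');
    try {
        // ... the bulk of the scheduled job's work goes here ...
    } finally {
        gs.setProperty('x_acme.nightly_job.running', 'false');
    }
}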

If you happen to run into the following error within a scoped application - Error constructing REST Message/Method: - the chances are you need to qualify your REST Message name with the global namespace.

 

Here are the symptoms of this behavior. You create a REST Message in the global scope that is accessible to all application scopes. You test it using the Test related link.

 

get___ServiceNow.jpg

 

It returns a beautiful 200 HTTP status code. You use Preview Script Usage and copy the contents to use within a scoped application. It doesn't work; you get Error constructing REST Message/Method:.

 

You need to prefix your REST Message name with global, like so:

 

try {
    var r = new sn_ws.RESTMessageV2('global.rest_message_name', 'get');
    var response = r.execute();
    var responseBody = response.getBody();
    var httpStatus = response.getStatusCode();
} catch (ex) {
    var message = ex.getMessage();
}

Whenever I get a question about Jelly I start to sweat nervously. Then you throw in a question about phase 1 and phase 2 idiosyncrasies and I start to mumble like a crazy person.  So, here’s a little story I’ve got to tell - </end-beastie-boy-impression>

 

Stephan Nolan, an employee working with me at RMIT, asked the question:

"We have a dynamic content block with the following:

 

<?xml version="1.0" encoding="utf-8" ?>
<j:jelly trim="false" xmlns:j="jelly:core" xmlns:g="glide" xmlns:j2="null" xmlns:g2="null">
     <h1>Hello World</h1>

     <script id="template" type="text/x-handlebars-template">
         <![CDATA[
             <label>
                 <input
                     name="example"
                     type="radio"
                     value="1"
                     {{#if blah}}checked="checked"{{/if}}
               />
                 <span>Radio button</span>
             </label>
         ]]>
     </script>
</j:jelly>

 

 

 

Which works fine when two-phase processing is turned off. However, when we turn it on, the <![CDATA[]]> tags are ignored and the following error is thrown:

Element type "input" must be followed by either attribute specifications, ">" or "/>".: org.xml.sax.SAXParseException: Element type "input" must be followed by either attribute specifications, ">" or "/>".:

 

We need two-phase processing turned on to enable HTML escaping. Currently our only solution is to replace the '<' of the input with &lt; to bypass Jelly parsing."

 

I couldn't figure out a nice and simple answer for him on this, but much to my delight he figured out a nice solution later on and shared it: if you have this requirement, wrap your script in a UI Macro and use an inline template, for example <g2:inline template="uimacro.xml" />. It looks like the first phase eats up the CDATA tags, and when we go into phase two we get the above error. If we inline in the second phase, we get to keep the CDATA tag and there is no need to escape all those pesky < > brackets. Thanks Stephan for the tip. I hope it helps someone else out there in Jelly ****, I mean land.

 

Extensions to Jelly Syntax - ServiceNow Wiki

I learned something very interesting from my Australian brothers Steve Farrar and Zac Murray today. Quick shout-out to these Technical Consultants - they are geniuses, and I thank them for imparting their wisdom to me.

 

If you happen to get the following error - "Upload Failed" "YourAppName" could not be uploaded due to the following error: User name or password invalid - DON'T panic. All you need to do is make sure you DO NOT publish your application with a version equal to or lower than one you have already published. Change the version and you will be right as rain.

 

The error below is seen when trying to publish a version of an application ("Make App available on other instances") without increasing the version number. The error you get is a bit misleading.

 

errorpublish.png

 

Some other community posts addressing the same issue:

Error in making application available to other instances

Why am I seeing "upload failed" after I try to Make my app available on other instances?

Full credit to Kenny Caldwell for this list. Kenny is a brilliant engineer, and I always learn something when working with him, so I am passing some of the info on to the community. If you are wondering what you get by extending the Task table in ServiceNow, these are some things to consider.

 

  1. Tables/Fields which are limited to the Task table.
    1. Approval Rules[sysrule_approvals] http://wiki.servicenow.com/index.php?title=Approval_Rules#gsc.tab=0
    2. Assignment Rules[sysrule_assignment] -http://wiki.servicenow.com/index.php?title=Defining_Assignment_Rules#gsc.tab=0
    3. Assignment Rules/Data Definition LookUp - http://wiki.servicenow.com/index.php?title=Defining_Assignment_Rules#gsc.tab=0
    4. Assessment conditions[assessment_conditions] - http://wiki.servicenow.com/index.php?title=Using_Change_Risk_Assessment#gsc.tab=0
    5. Service Level Agreements[sysrule_escalate] Inactivity Monitor/Legacy SLA - http://wiki.servicenow.com/index.php?title=Setting_Inactivity_Monitors#gsc.tab=0
    6. State Flows[sf_state_flow] - http://wiki.servicenow.com/index.php?title=State_Flows#gsc.tab=0
    7. Rate Cards[fm_rate_card] - http://wiki.servicenow.com/index.php?title=Cost_Management#gsc.tab=0
    8. Task Relationships[task_rel_task] - http://wiki.servicenow.com/index.php?title=Many_to_Many_Task_Relations#gsc.tab=0
    9. Execution plans[sc_cat_item_delivery_plan]  http://wiki.servicenow.com/index.php?title=Using_Execution_Plans#gsc.tab=0
    10. Visual Task Board[vtb_board] http://wiki.servicenow.com/index.php?title=Visual_Task_Boards#gsc.tab=0
    11. Survey Conditions[survey_conditions] - legacy http://wiki.servicenow.com/index.php?title=Survey_Management#gsc.tab=0
    12. SLA[contract_sla] - http://wiki.servicenow.com/index.php?title=Defining_an_SLA#gsc.tab=0
  2. Workflow Items
    1. Approvals
      1. Available in a workflow for a standalone task:
        1. Approval – User
        2. Approval Action
        3. Rollback To
      2. Not available in a workflow for a standalone task:
        1. Approval – Group
        2. Approval Coordinator
        3. Generate
        4. Manual Approvals
    2. Tasks: Selection Not Available

It is often a design decision to normalize data into multiple lookup tables and create reference fields for that data. Do take note, and use caution when doing this, because there are limits on the number of indexes and columns that can be created on a single table (and if you are extending a table like Task, realize that some indexes and columns are already used up, or get used up in the flattening process).

 

Reference fields are indexed and can push you closer to the upper limit of 64 indexes.

 

What I am trying to say is that it is a good design practice to control your reference fields.

 

Introduction to Fields - ServiceNow Wiki

Tables and Columns Module - ServiceNow Wiki

This may already be a known issue in the community, but this week was the first time I encountered the problem. You will notice that when you create a table within a scoped application, the field names you create in the dictionary do not begin with u_.

 

This is great, but this week I encountered a customer naming a True/False field with a name that started with a number. They were seeing INVALID_CHARACTER_ERR as an error upon commit. I am not sure what other field types may have this problem, but as a best practice, don't name any columns starting with numerals. There can be several weird form-rendering side effects with formatters and other things.

 

Remember, you can use a label for a column that is different from the field name, if you want (labels can include or start with numerals).

 

Avoid starting field names with numerals.

Two things recently happened that prompted me to post this blog:

  1. I recently read Gregor Hohpe's article "Programming Without a Call Stack - Event-driven Architectures" found here: http://www.enterpriseintegrationpatterns.com/docs/EDA.pdf
  2. I was recently challenged to show how ServiceNow could support EDA within the context of scoped application development.

 

As with all things ServiceNow, the Platform is powerful and flexible enough to allow me to implement an EDA design pattern. Within the scoped application development world of ServiceNow, remember that application scoping ensures that one application does not impact another; they are mutually exclusive. This kind of isolation between scoped applications makes it hard to take advantage of data and resources in the single data model, so maybe this is where an EDA comes in handy.

 

Remember, the key characteristics of an EDA (from Gregor's article) are:

  • Broadcast communications
  • Timeliness
  • Asynchrony
  • Fine Grained
  • Ontology
  • Complex Event Processing

 

I think we have those requirements met by using the ServiceNow Event Queue.

 

Scoped applications have access to the Event Registry (Event Registry - ServiceNow Wiki) to fire events using the gs.eventQueue and gs.eventQueueScheduled API calls. They will only be able to select the tables (data model) that are in their scope. All other applications, outside of the one firing the event, will need to create a Script Action (Script Actions - ServiceNow Wiki) to handle custom events fired by applications outside their scope.
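A minimal sketch of both halves is below; the event name, parameters, and field names are invented for the example. The publishing application fires a registered event, and a consuming application in another scope registers a Script Action against that same event.

// Publisher side (inside the application that owns the record):
// fire a registered event; the event name and parameters are hypothetical.
gs.eventQueue('x_acme_invoicing.invoice.approved', current,
    current.getValue('number'), current.getValue('u_total'));

// Consumer side: the script of a Script Action in another scope, registered
// against the same event. The platform provides the 'event' and 'current' variables.
gs.info('Invoice ' + event.parm1 + ' approved for amount ' + event.parm2);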

 

This is one way to implement inter-application communication using the EDA pattern.  Hope this helps.  Cheers.

 

AgendaTracker.jpg

 

AgTracker.jpg
