Bulk Loading Attachments to ServiceNow With PowerShell

Version 3

    Often, when migrating from an older tool into ServiceNow, it is a requirement that the file attachments be migrated from that tool into their associated records in ServiceNow.  I was recently faced with this task, and as I dug into the process I chose to use PowerShell to issue SOAP XML calls for uploading the file attachments through ServiceNow's Attachment Creator web service (http://wiki.servicenow.com/index.php?title=AttachmentCreator_SOAP_Web_Service).

    My goal was to create a fully automated solution for loading files as attachments into ServiceNow, and in doing so I found a number of obstacles that needed to be overcome.


    Before the files are imported, the following must be completed:


    Once again, my goal was to develop a process that would programmatically fulfill each of these prerequisites while being simple enough for almost anyone to incorporate into their own import scenarios.

    The output of this exercise is a robust PowerShell script (attached to this document) that performs all of the above tasks, attaching files to records across a wide array of tables with a single execution of the script.


    Before I get into the script itself, I would like to cover some of the pre-work I went through in prepping my system for this import.



    If you are unfamiliar with MIME, you can follow the link I included earlier to Wikipedia for a detailed explanation.  At a very high level, MIME was originally designed for e-mail: the MIME type is embedded into the transmission of the message, letting the client software know what kind of file is attached.  Similarly, other Internet-based services use the MIME type to determine the file type and, in turn, what application should be used to open it.


    In working towards an automated way of discovering this, I chose to query the registry of the computer running the PowerShell script to determine the MIME type of a file.  The top level of the registry (HKEY_CLASSES_ROOT) contains entries for file extensions as well as their MIME types.


    In the image above, I highlighted the .bmp extension registry key. Windows stores the MIME type of that extension as the Content Type data element within that key.


    To query the MIME type, the script maps a new drive path to HKEY_CLASSES_ROOT.  PowerShell is very powerful in this way, letting you map not just file systems but any system that is built out as a PowerShell provider.  Once the registry is mapped, the script simply checks for the existence of the Content Type value under the extension's associated registry key and stores it in a variable.
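    A minimal sketch of that lookup follows; the variable names here are illustrative stand-ins, not necessarily those used in the attached script.

```powershell
# Map HKEY_CLASSES_ROOT as a PowerShell drive (only needs to be done once per session)
if (-not (Get-PSDrive -Name HKCR -ErrorAction SilentlyContinue)) {
    New-PSDrive -Name HKCR -PSProvider Registry -Root HKEY_CLASSES_ROOT | Out-Null
}

# Read the Content Type value stored under the extension's key
$extension = ".bmp"
$mime = (Get-ItemProperty -Path "HKCR:\$extension" -ErrorAction SilentlyContinue)."Content Type"
# On a system with the standard entry present, $mime now holds "image/bmp"
```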


    There are a few limitations to this approach, the biggest being that the registry needs to contain the MIME types for all of the file types being uploaded.  I used the following approach to combat this.  First, I obtained a list of all of the extensions present within the files that were going to be imported and manually verified the registry for each of them, noting which were present and which were not.  Second, I created a registry file with the missing extensions, which was later used to merge the missing extensions and MIME types into the registry.


    In a perfect world, there would be a web service that was kept current with all of the most recent extensions and MIME types.  If that were the case, one could simply utilize such a service instead of querying the registry.  Unfortunately, there are gaps in these systems, so I have found it most thorough to check the extensions against http://www.stdicon.com (a web service that returns the MIME type of an extension) and then manually Google any that are missing from its database.   By using a URL formatted like http://www.stdicon.com/ext/bmp, where the extension is listed after the last forward slash, the web service outputs the associated MIME type.  In this example, it returns image/bmp.  Armed with this knowledge, it is possible to create a registry import file with the following contents:


    Windows Registry Editor Version 5.00

    [HKEY_CLASSES_ROOT\.bmp]
    "Content Type"="image/bmp"


    Once the registry file is fully populated with all of the missing extensions, it is imported into the registry by simply right-clicking it and choosing Merge.  At this point, the system running the script is ready.  This process took me about an hour to complete for 72 extensions.  I have attached to this document a sample registry import file that contains many of the more common extensions and MIME types.


    The system that I was exporting the attachments from had no file type restrictions.  This brought about the inevitability that some extensions would be either customized or just plain old typos, resulting in invalid MIME types.  ServiceNow requires a valid MIME type, so to remedy this scenario the script errors when verifying the registry because the MIME type doesn't exist.  This error is caught by the script, which forces the file through a conversion process where it is compressed into a ZIP file and loaded with the .zip MIME type.
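    A hypothetical sketch of that fallback logic is below.  Compress-Archive (PowerShell 5+) stands in for the ZIP function the attached script actually uses (see the poshcode link in the credits); $extension and $filePath are illustrative stand-ins.

```powershell
# If the registry has no Content Type for the extension, zip the file and
# load it with the .zip MIME type instead.
try {
    $mime = (Get-ItemProperty -Path "HKCR:\$extension" -ErrorAction Stop)."Content Type"
    if (-not $mime) { throw "No Content Type registered for $extension" }
}
catch {
    # Compress-Archive is a stand-in here for the script's own ZIP function
    Compress-Archive -Path $filePath -DestinationPath "$filePath.zip"
    $filePath = "$filePath.zip"
    $mime     = "application/zip"
}
```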


    Updating the extensions / MIME types in the registry is not required for running the PowerShell script, though by NOT doing this more files will be loaded as ZIP files because their MIME types will be unknown.

    Export Folder Structure

    This step is crucial to the PowerShell script functioning properly.  The script actually utilizes the folder structure containing the attachments to determine what table and record the file should be attached to in ServiceNow.


    When exporting file attachments from the old system, it is important to do so in a fashion that lets the PowerShell script know which record each file is supposed to be attached to in ServiceNow.   As an example, if assets are being exported from another asset management tool, the attachment could be exported into a folder named after the ASSET TAG or SERIAL NUMBER of the device.  The most important thing to remember is that this value must be unique in ServiceNow's Asset table.  In the example below, File A.txt and File B.txt were attached to an asset in the old asset management system with an Asset Tag of AST12189, so they were exported into a folder named as such.


    Once the export of all of the files is complete, there are a number of folders named after various unique ASSET TAGs, each containing the files that were attached to that asset.


    The next step is to create a folder named after the FIELD NAME (not label) that is being mapped to in ServiceNow and then move all of the ASSET TAG folders into it.


    Last, create a folder that is named after the TABLE NAME that is being mapped to in ServiceNow and move the FIELD NAME folder into it.


    That might sound a bit complicated so here is a visualization of what this looks like.



    In the above import, there are 7 attachments being loaded across three assets and one incident.  As seen here, the script is capable of handling imports to multiple tables.  For AST12189 it will import to the record where the Table = alm_hardware and the Asset Tag field = AST12189.  The script will loop through all of the assets and then move on to the incident import where, based on the folder structure, it will import File G.txt to the record where the Table = incident and the Number field = INC2064860.
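    The attached script's exact loop may differ, but the traversal of that folder structure can be sketched like this, using the example folder names above ($rootFolder is assumed to point at the export_files folder):

```powershell
# Walk the export tree: <table>\<field>\<value>\<files>
Get-ChildItem -Path $rootFolder -Directory | ForEach-Object {
    $qTable = $_.Name                                        # e.g. alm_hardware
    Get-ChildItem -Path $_.FullName -Directory | ForEach-Object {
        $qField = $_.Name                                    # e.g. asset_tag
        Get-ChildItem -Path $_.FullName -Directory | ForEach-Object {
            $qValue = $_.Name                                # e.g. AST12189
            Get-ChildItem -Path $_.FullName -File | ForEach-Object {
                # upload $_ to the record in $qTable where $qField equals $qValue
            }
        }
    }
}
```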


    Notice that File F.adz has an invalid extension; this file is converted to a ZIP file before it is uploaded to the Asset table where the Asset Tag = AST31234.


    The PowerShell Script (COPY IT)

    At this point, the script (Import_SNOW_Attachments.ps1) is copied to the machine it is going to be executed from.  I saved it to my desktop in a folder named SOAP and then moved the export_files folder and subfolders (as shown above) into that directory.  When creating the directory structure, the PowerShell script and export folders can be placed anywhere locally; they do not have to be in the same folder structure.  The reason for this is explained in the next step.


    The PowerShell Script (MODIFY IT)

    Open the PowerShell script in an editor; there are 5 variables listed right at the top of the script.  These variables, along with their definitions, are detailed in the image below.  Once these are set to the correct values for your setup and instance, the script is ready to execute.
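    As an illustration, those values typically cover the instance, credentials, and paths.  The names and values below are hypothetical stand-ins, not necessarily those defined in the attached Import_SNOW_Attachments.ps1:

```powershell
# Illustrative stand-ins for the variables set at the top of the script
$instance   = "yourinstance.service-now.com"            # target ServiceNow instance
$user       = "soap.import"                             # account with the SOAP roles
$pass       = "P@ssw0rd"                                # that account's password
$rootFolder = "C:\Users\jvb\Desktop\SOAP\export_files"  # top of the export tree
$logFolder  = "C:\Users\jvb\Desktop\SOAP"               # where the log file is written
```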


    In order to execute the SOAP queries against an instance, the ServiceNow user account used for the import requires the correct SOAP roles (http://wiki.servicenow.com/index.php?title=SOAP_Web_Service#SOAP_Roles).  The account also requires read / write access to the tables and records that the files are being targeted towards (http://wiki.servicenow.com/index.php?title=Using_Access_Control_Rules).

    As the script runs and loops through the files to be attached, PowerShell will output feedback to the screen as well as to a log file for final verification of the upload.


    The log file (logfile.csv) is written into the same directory that contains the PowerShell script.  This is the output for the example shown in the screenshot above.


    By viewing the asset in ServiceNow (shown below), it can be seen that the attachment was successfully added to the Asset record.


    Another log file from the earlier folder structure example is shown below.  Take notice of line 7 and how the file had an invalid extension, causing it to be renamed and compressed into a ZIP file.  These actions are all tracked in the log file (red highlighting added; not present in the actual log file).


    Included below is a graphic that quickly summarizes the REQUIRED steps to complete this process.  As I stated before, the first step of updating the registry of the computer running the PowerShell script is optional, but in my opinion it yields the best end results.




    The Techie Stuff

    How Does it convert the file to BASE64?

    BASE64 is another one of those things that, if you are not familiar with it and would like a detailed explanation, you can refer to the Wikipedia link I included earlier.  Briefly, it is a method of encoding a binary file into a text string so it can be sent over e-mail, and it goes hand in hand with the file's MIME type.  To convert the file to Base64, the PowerShell script utilizes a native Windows utility called CERTUTIL (https://technet.microsoft.com/en-us/library/Cc732443.aspx).  This is a command line utility that takes the FULL PATH of a file as input and writes a Base64-encoded string to an output file.  The output contains two lines designating START and END that are parsed out and removed before the file's contents are stored in a variable used as the payload in the SOAP envelope.
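    That step can be sketched as follows; the input path is a hypothetical example, and the actual parsing in the attached script may differ.

```powershell
# Encode a file to Base64 with certutil, then strip the marker lines it adds
$inputFile = "C:\Users\jvb\Desktop\SOAP\export_files\alm_hardware\asset_tag\AST12189\File A.txt"
$encoded   = Join-Path $env:TEMP "encoded.tmp"
certutil -encode "$inputFile" "$encoded" | Out-Null

# Drop the "-----BEGIN/END CERTIFICATE-----" lines and join the rest into one string
$fileContentEncoded = (Get-Content $encoded | Where-Object { $_ -notmatch '^-----' }) -join ''
Remove-Item $encoded
```

    As an aside, on PowerShell 3 and later the same result can be had without a temp file via [Convert]::ToBase64String([IO.File]::ReadAllBytes($inputFile)).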


    How Is the Sys_ID of the Target Record Discovered?

    Actually, this is where the proper folder structure comes into play.  The PowerShell script parses the directory structure and stores the various folder names in three variables: $qTable (target table), $qField (known field), and $qValue (known field value).  There is a small piece of code (shown below) that uses these three variables to create the XML envelope for querying the SYS_ID.  The script uses this to query the target table for the value of the Sys_ID.
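    A hypothetical reconstruction of that envelope is below; the element and namespace conventions follow ServiceNow's legacy direct SOAP web services, and the exact code in the attached script may differ.

```powershell
# Build a getRecords envelope from the folder-derived variables
# (e.g. $qTable = "alm_hardware", $qField = "asset_tag", $qValue = "AST12189")
$lookupEnvelope = @"
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <getRecords xmlns="http://www.service-now.com/$qTable">
      <$qField>$qValue</$qField>
    </getRecords>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
"@
```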

    Earlier I displayed a screen grab of PowerShell as it uploaded a file; shown below is the XML that was generated from the above code for that lookup.


    Once the XML is created, authentication is established and the web request is sent to the ServiceNow instance.  The response itself is stored as an XML object that is later parsed for the SYS_ID of the record result and stored in a variable named $Sys_ID.
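    A sketch of that request and parse step, assuming the legacy direct SOAP endpoint ($lookupEnvelope here stands for the getRecords envelope built from the folder variables; $instance, $user, and $pass are the connection values set at the top of the script):

```powershell
# POST the lookup envelope with basic auth and pull the sys_id from the reply
$secPass  = ConvertTo-SecureString $pass -AsPlainText -Force
$cred     = New-Object System.Management.Automation.PSCredential ($user, $secPass)
$response = Invoke-WebRequest -Uri "https://$instance/$qTable.do?SOAP" `
    -Method Post -ContentType "text/xml" -Body $lookupEnvelope -Credential $cred

[xml]$xml = $response.Content
$Sys_ID   = $xml.Envelope.Body.getRecordsResponse.getRecordsResult.sys_id
```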


    How Does the File get Attached to the Record?

    Similarly to the lookup, this portion of the script makes use of the folder structure to create the XML.  It also makes use of two other variables discussed earlier, $mime (MIME type) and $fileContentEncoded (Base64 encoding).  The actual contents of the insert are created with the code shown below.
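    A sketch of that insert envelope, following the field format documented on the AttachmentCreator wiki page linked earlier: name is "filename:MIME type", source is "table:sys_id", and payload is the Base64-encoded file content ($fileName here is a stand-in for however the script carries the file's name).

```powershell
# Build the ecc_queue insert that the AttachmentCreator processes
$attachEnvelope = @"
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <insert xmlns="http://www.service-now.com/ecc_queue">
      <agent>AttachmentCreator</agent>
      <topic>AttachmentCreator</topic>
      <name>$fileName:$mime</name>
      <source>$qTable:$Sys_ID</source>
      <payload>$fileContentEncoded</payload>
    </insert>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
"@
```

    This envelope is then POSTed to the ecc_queue table's SOAP endpoint rather than to the target table itself.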



    Again, the following XML file was generated from the above code for the upload example shown earlier.


    It is noticeable here that the payload of the file is very small.  The test.txt file only contained the word test, and the payload is the Base64 encoding of that file.


    The code for executing this is nearly the same as for the lookup, so I won't show it here.  The difference is that the query returns a SYS_ID when it is successful.

    This SYS_ID is the ID of the record that was inserted into the ECC Queue and is written to the log file as confirmation that the file was uploaded to the queue.


    A Quick Note About the Log File

    The log file is generated throughout execution and is written at the end of each file upload.  While the files are being loaded, the log file has a TXT extension.  This is by design, as it allows the user executing the script to open the log file while it is running without putting a lock on the file; when a TXT file is open, the system is still able to write to it.  As soon as the last file is uploaded to ServiceNow, the log file is renamed to a CSV file so that it will open natively in Excel.


    A Final Note About PowerShell

    If you are running an older version of PowerShell, you may need to upgrade to a more recent version that supports these web requests.  Info for upgrading PowerShell to V.4 can be found at How to Install Windows PowerShell 4.0 - TechNet Articles - United States (English) - TechNet Wiki.


    Attached you will find the full PowerShell script ready for use, a sample output log file, a registry file with common MIME types, a Word version of this document, and a PDF of the overview graphic shown above.


    I truly hope you find this to be useful in automating your file uploads into ServiceNow.


    --John VanBruggen




    Thanks to tltoulson for his assistance with the thread "Has anyone successfully imported multiple files as attachments to multiple records?"

    http://poshcode.org/4845 - PowerShell function for creating ZIP files natively
