
Developer Community

2 Posts authored by: Marty Grinstead Employee

As a Technical Support Engineer (TSE), I draw most of my blog topics from experiences reported by our customer base. PRBs are never nice, but they do offer great opportunities for blog posts. One in particular, documented in KB0639042, has been causing issues when running transforms after upgrading to Jakarta.

 

The tricky part about this PRB is that the import set only reports a java.lang.NullPointerException, so we need to dig into the log file to determine the underlying issue.

 

Looking in the system logs, we get a better idea of where the exception happened:

 

SEVERE *** ERROR *** com.glide.system_import_set.ImportSetTransformerWorker

java.lang.NullPointerException

        at java.util.Calendar.setTime(Calendar.java:1770)

        at java.text.SimpleDateFormat.format(SimpleDateFormat.java:943)

        at java.text.SimpleDateFormat.format(SimpleDateFormat.java:936)

        at java.text.DateFormat.format(DateFormat.java:345)

        at com.glide.db.impex.transformer.TransformEntry.getDateString(TransformEntry.java:126)

        at com.glide.db.impex.transformer.TransformerField.getSourceValue(TransformerField.java:116)

        at com.glide.db.impex.transformer.TransformerField.transformField(TransformerField.java:88)

        at com.glide.db.impex.transformer.TransformRow.transformCurrent(TransformRow.java:100)

        at com.glide.db.impex.transformer.TransformRow.transform(TransformRow.java:69)

        at com.glide.db.impex.transformer.Transformer.transformBatch(Transformer.java:154)

        at com.glide.db.impex.transformer.Transformer.transform(Transformer.java:79)

        at com.glide.system_import_set.ImportSetTransformerImpl.transformEach(ImportSetTransformerImpl.java:239)

        at com.glide.system_import_set.ImportSetTransformerImpl.transformAllMaps(ImportSetTransformerImpl.java:91)

        at com.glide.system_import_set.ImportSetTransformerWorker.startWork(ImportSetTransformerWorker.java:40)

        at com.glide.worker.AbstractProgressWorker.startAndWait(AbstractProgressWorker.java:116)

        at com.glide.worker.ProgressWorker.startAndWait(ProgressWorker.java:52)

        at com.glide.worker.BackgroundProgressJob.execute(BackgroundProgressJob.java:54)

        at com.glide.schedule.JobExecutor.executeJob(JobExecutor.java:103)

        at com.glide.schedule.JobExecutor.execute(JobExecutor.java:89)

        at com.glide.schedule.GlideScheduleWorker.executeJob(GlideScheduleWorker.java:219)

        at com.glide.schedule.GlideScheduleWorker.lambda$process$48(GlideScheduleWorker.java:161)

        at com.glide.schedule.GlideScheduleWorker$$Lambda$30/3401850.run(Unknown Source)

        at com.glide.worker.TransactionalWorkerThread.executeInTransaction(TransactionalWorkerThread.java:35)

        at com.glide.schedule.GlideScheduleWorker.process(GlideScheduleWorker.java:161)

        at com.glide.schedule.GlideScheduleWorker.run(GlideScheduleWorker.java:72)

 

From the stack trace (note TransformEntry.getDateString), we can see that we are dealing with a date/time column.

 

Underlying causes of transform issues after upgrading to Jakarta

 

There are two specific PRBs that we have identified as the source of these data load and transform issues.

  • KB0639042 ("Transforming a Basic Date/Time to Date/Time field creates null pointer exception") has been identified as the underlying issue.

    This PRB is specific to Jakarta: as the short description indicates, we encounter this exception when the import set (staging) table has a column with a data type of "Basic date/time".

    The workaround for this PRB is to modify the import set table and change all of the data types from "Basic date/time" to "Date/Time" (or "String"). This fix has been incorporated into Jakarta Patch 5 and Kingston.

 

  • A second PRB affects JDBC data sources, where the MID Server reports a similar error:

    MID Server reported error: java.lang.NullPointerException

at java.util.Calendar.setTime(Calendar.java:1770)

at java.text.SimpleDateFormat.format(SimpleDateFormat.java:943)

at java.text.SimpleDateFormat.format(SimpleDateFormat.java:936)

at java.text.DateFormat.format(DateFormat.java:345)

at com.service_now.monitor.jdbc.JDBCRowSet.next(JDBCRowSet.java:181)

at com.service_now.monitor.jdbc.JDBCMultiRowSet.next(JDBCMultiRowSet.java:19)

at com.service_now.mid.probe.JDBCProbe.doSelect(JDBCProbe.java:319)

at com.service_now.mid.probe.JDBCProbe.doQuery(JDBCProbe.java:198)

at com.service_now.mid.probe.JDBCProbe.probe(JDBCProbe.java:123)

at com.service_now.mid.probe.AProbe.process(AProbe.java:81)

at com.service_now.mid.queue_worker.AWorker.runWorker(AWorker.java:119)

at com.service_now.mid.queue_worker.AWorkerThread.run(AWorkerThread.java:20)

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

at java.lang.Thread.run(Thread.java:745)

 

This PRB has been fixed in Jakarta Patch 2 Hot Fix 1, Jakarta Patch 3 (and all subsequent Jakarta patches), and Kingston.

 

So, when you upgrade to Jakarta, you can either examine all of your import set row tables for Basic date/time data types, or simply upgrade to Jakarta Patch 5 or later and avoid these issues altogether!
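If you would rather script that check than eyeball each staging table, the idea can be sketched in plain JavaScript. On an instance this would be a sys_dictionary query against each import set table; both the mocked dictionary rows and the internal type name below are assumptions for illustration, not confirmed values.

```javascript
// Hypothetical sketch: scan a staging table's column definitions and flag
// any "Basic date/time" columns so they can be changed to Date/Time or
// String before running transforms. The rows are mocked; on an instance
// they would come from a sys_dictionary query, and the internal type name
// 'basic_date_time' is an assumption.
var stagingColumns = [
    { element: 'u_recnumber', type: 'string' },
    { element: 'u_opened',    type: 'basic_date_time' },
    { element: 'u_field1',    type: 'string' }
];

var suspectColumns = stagingColumns
    .filter(function (col) { return col.type === 'basic_date_time'; })
    .map(function (col) { return col.element; });

console.log(suspectColumns); // columns to review before transforming
```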

Transform Map Scripts: Order of Execution

What do transform map scripts and algebra have in common? Not much, except that the order of execution (precedence) is very important to the final result. Transform maps provide many opportunities for scripting, but do you know in which order the scripts will execute when data is processed through the mapping?

 

I was recently involved with an import project that needed some fine-tuning.  During this exercise, we found that we were making multiple GlideRecord calls to find the same pieces of data, and this was being done through the various transform map scripts.  Of course, we decided the best option was to make the call once, then store the returned data in variables.  Even though this was the best approach, the results were not exactly what we expected.
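The caching pattern we settled on can be sketched in plain JavaScript. A stand-in fetch function replaces the GlideRecord query here, and the names are mine rather than from the project:

```javascript
// Look a value up once, cache it, and reuse it on later rows instead of
// repeating the query. In ServiceNow the fetch would wrap a GlideRecord
// query; here a counter shows the expensive call only runs once per key.
var lookupCache = {};

function cachedLookup(key, fetch) {
    if (!(key in lookupCache)) {
        lookupCache[key] = fetch(key); // expensive call happens once per key
    }
    return lookupCache[key];
}

// Stand-in "query" that counts how often it is really invoked.
var calls = 0;
function fakeQuery(key) {
    calls++;
    return 'value-for-' + key;
}

cachedLookup('user1', fakeQuery);
cachedLookup('user1', fakeQuery);
console.log(calls); // prints 1: the fetch ran once despite two lookups
```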

 

After some trial and error, we found that the order in which we expected the scripts to execute was not the order in which they actually ran.  To make sure we all know exactly what to expect, I created some data to process and wrote a few different types of scripts (with log.info() statements) to help capture some details.  The log statements will show us which scripts executed, and in which order.

 

Here is an exercise to validate the exact order of execution for the various types of transform map scripts.

First, let's define the types of transform map scripts:

  • onStart - The onStart event script is processed at the start of an import run, before any data rows are read.
  • onBefore - The onBefore event script is processed at the start of a row transformation, before the source row is transformed into the target row.
  • onAfter - The onAfter event script is processed at the end of a row transformation, after the source row has been transformed into the target row and saved.
  • onComplete - The onComplete event script is processed at the end of an import run, after all data rows are read and transformed.
  • Field Map - the script available in a field map's Source script field.
  • "Run script" - the script displayed on the transform map when the "Run script" checkbox is selected.

 

To exercise the majority of options while keeping the output log manageable, here are the tests I decided to run.

  1. A successful insert.
  2. An update of an existing record (via the coalesce field).

 

The import set will contain four rows. The third row will be ignored, and the fourth row will raise an error (ignore and error will be set in the onBefore script).  Each row will contain three columns: the first will be the coalesce field, and the next two will each have field map scripts.

 

I will have log.info() statements in each of the scripts, printing some details about the script and the row being processed.  I also added the date/time (in milliseconds) to the start of each log statement, so that sorting A-Z on the Message column puts the messages in the correct order of execution.

 

Here's what I used:

var gdt = new GlideDateTime();

var info = gdt.getNumericValue() + " <script name and row details>";

log.info(info);
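Outside ServiceNow, the same trick can be demonstrated in plain JavaScript, with Date.now() standing in for gdt.getNumericValue(). Because the epoch-milliseconds value stays a fixed 13 digits until the year 2286, an alphabetical sort matches chronological order whenever the timestamps differ:

```javascript
// Prefix each log message with epoch milliseconds so sorting the text A-Z
// reproduces execution order (the fixed digit count keeps the string sort
// consistent with numeric order).
function stamped(message) {
    return Date.now() + ' ' + message;
}

var messages = [stamped('onStart'), stamped('onBefore 1 Ignore: false Error: false')];
messages.sort(); // the A-Z sort applied to the Message column

console.log(messages.join('\n'));
```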

 

DATA:

RecNumber | Field1     | Field2
1         | Rec1Field1 | Rec1Field2
1         | Rec2Field1 | Rec2Field2
3         | Rec3Field1 | Rec3Field2
4         | Rec4Field1 | Rec4Field2

 

I know it is not very exciting, but hopefully it will make reading the log statements clear.

 

I created a .csv file with this set of data.  Using a "File" data source, I attached the file, and performed a test load of 20 records.

[Screenshot: DataSource.jpg]

 

This created an import set table called u_transformscriptlogger.  I was then able to create the transform map and attach the scripts.  The scripts I used (provided below) contain log.info() statements so I could capture output in the transform log (found in the navigator).

[Screenshot: ImportLog_Filter.jpg]

This filter shows the import log messages that were printed through the log statements.

 

For example:

[Screenshot: LogSamples.jpg]

 

After running the transform map to process the data, this was the message output:

1462763686601 onStart

1462763686610 onBefore 1 Ignore: false Error: false

1462763686612 Field Map Rec1Field1

1462763686614 Field Map Rec1Field2

1462763686616 Run script

1462763686637 onAfter 1

1462763686641 onBefore 1 Ignore: false Error: false

1462763686644 Field Map Rec2Field1

1462763686646 Field Map Rec2Field2

1462763686648 Run script

1462763686653 onAfter 1

1462763686656 onBefore 3 Ignore: true Error: false

1462763686659 onAfter 3

1462763686663 onBefore 4 Ignore: false Error: true

com.glide.db.impex.transformer.TransformerScriptException: onBefore script error.

at com.glide.db.impex.transformer.TransformerScript.checkErrorCondition(TransformerScript.java:79)

at com.glide.db.impex.transformer.TransformerScript.runScript(TransformerScript.java:63)

at com.glide.db.impex.transformer.TransformerScript.runWhenScript(TransformerScript.java:122)

at com.glide.db.impex.transformer.Transformer.runOnBeforeScript(Transformer.java:269)

at com.glide.db.impex.transformer.Transformer.transformBatch(Transformer.java:142)

at com.glide.db.impex.transformer.Transformer.transform(Transformer.java:76)

at com.glide.system_import_set.ImportSetTransformerImpl.transformEach(ImportSetTransformerImpl.java:237)

at com.glide.system_import_set.ImportSetTransformerImpl.transformAllMaps(ImportSetTransformerImpl.java:91)

at com.glide.system_import_set.ImportSetTransformerWorker.startWork(ImportSetTransformerWorker.java:40)

at com.glide.worker.AbstractProgressWorker.startAndWait(AbstractProgressWorker.java:86)

at com.glide.worker.ProgressWorker.startAndWait(ProgressWorker.java:52)

at com.glide.worker.ProgressWorkerThread.run(ProgressWorkerThread.java:50)

onBefore script error.

 

The expectation was that the "Run script" would execute either just before or just after the onBefore script for each row, but this testing shows that this is not the case. The onStart script executed first, as expected. Then, for the first inserted record, the onBefore script executed, followed by the field map scripts, and then the "Run script" after the field mapping scripts. Finally, the onAfter script for the first record executed.

 

The next row executed in the same order, so it didn't matter whether the record was inserted or updated (based on the coalesce value).

 

The third row may not have executed as expected.

 

The log shows:

1462763686656 onBefore 3 Ignore: true Error: false

1462763686659 onAfter 3

 

So, even though the third record set the "ignore" variable to true, the onAfter script still executed.  The field map scripts and the "Run script" obeyed the "ignore", but the onAfter script did not.  As I was working on this blog post, I saw a tweet by fellow Support engineer jonnyseymour that helped explain exactly why this happened.  Here's the tweet and KB number:

[Screenshot: transform map tweet.jpg]

Linking to ServiceNow KB: Import Set onAfter script runs for ignored rows during transformation (KB0563524)

 

Finally, for row 4, I set error = true, and processing was terminated: not just processing for that row, but for the entire import set. I ran it again without setting any errors, and after the onAfter of the last row executed, the onComplete script ran.

 

This gives us three take-aways:

  1. The correct order of execution for transform map scripts is:
    1. onStart
    2. onBefore, executed on each row before any other processing
    3. the field mapping scripts
    4. the "Run script" script, following the field map scripts
    5. the onAfter script
    6. and finally, after all of the data has been processed, the onComplete script.
  2. Setting "ignore" to true will not prevent the onAfter script from executing.  It only prevents the "run script" and field mappings from being processed for the record being ignored.
  3. When "error" is set to true, all processing for that import set will stop.  No additional scripts will be processed, including onComplete.
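These take-aways can be condensed into a small plain-JavaScript simulation of the observed order. This is a sketch of the behavior I measured, not ServiceNow's actual Transformer code, and all names in it are mine:

```javascript
// Simulate the observed transform-map order: onStart, then per row
// onBefore -> field maps -> "Run script" -> onAfter, then onComplete,
// with the measured ignore/error behavior.
function transform(rows, scripts) {
    var log = [];
    log.push('onStart');
    for (var i = 0; i < rows.length; i++) {
        var ctx = { ignore: false, error: false };
        scripts.onBefore(rows[i], ctx);
        log.push('onBefore ' + rows[i].recNumber);
        if (ctx.error) {
            // error aborts the whole import set; onComplete never runs
            log.push('error: transform aborted');
            return log;
        }
        if (!ctx.ignore) {
            // ignore skips field maps and "Run script"...
            log.push('fieldMaps ' + rows[i].recNumber);
            log.push('runScript ' + rows[i].recNumber);
        }
        log.push('onAfter ' + rows[i].recNumber); // ...but not onAfter
    }
    log.push('onComplete');
    return log;
}

var rows = [{recNumber: 1}, {recNumber: 1}, {recNumber: 3}, {recNumber: 4}];
var out = transform(rows, {
    onBefore: function (row, ctx) {
        if (row.recNumber === 3) ctx.ignore = true;
        if (row.recNumber === 4) ctx.error = true;
    }
});
console.log(out.join('\n'));
```

Running it reproduces the shape of the log above: row 3 gets an onAfter entry despite being ignored, and row 4's error stops everything before onComplete.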

 

Happy Mapping!

 

PS: For completeness, here are the scripts:

"run script"

var gdt = new GlideDateTime();

var info = gdt.getNumericValue()+" Run script";

log.info(info);

onStart

var gdt = new GlideDateTime();

var info = gdt.getNumericValue()+" onStart";

log.info(info);

onBefore

if (source.u_recnumber == 3)

  ignore = true;

if (source.u_recnumber == 4)

  error = true;

 

var gdt = new GlideDateTime();

var info = gdt.getNumericValue() + " onBefore " + source.u_recnumber + " Ignore: " + ignore + " Error: " + error;

 

log.info(info);

onAfter

var gdt = new GlideDateTime();

var info = gdt.getNumericValue()+" onAfter "+source.u_recnumber;

log.info(info);

onComplete

var gdt = new GlideDateTime();

var info = gdt.getNumericValue()+" onComplete ";

log.info(info);

 

Field Map:

var gdt = new GlideDateTime();

var info = gdt.getNumericValue()+" Field Map "+source.u_field1;

log.info(info);

 

and

 

var gdt = new GlideDateTime();

var info = gdt.getNumericValue()+" Field Map " +source.u_field2;

log.info(info);


 

 

My testing was performed with Fuji and Geneva.
