Legacy data migration is a critical part of most CSM implementations. Migrating historic and in-flight support information such as cases, case notes, customer comments, case logs, and file attachments to the new Customer Service Management system in ServiceNow is a common need. The company's industry and specific requirements dictate how much data must be converted from the legacy system. This paper outlines the importance of legacy data migration, along with best practices and tips that will be helpful during your Customer Service Management implementation project.
Importance of legacy data migration
Almost every ServiceNow Customer Service Management implementation involves data migration requirements, yet in most cases sufficient time and resources are not allocated to data migration activities; most of the attention goes to implementing new business processes and systems. It is critical that quality data is imported into the new ServiceNow Customer Service Management application. Inaccurate, redundant, and duplicate data has a direct adverse impact on customer satisfaction, agent productivity, and day-to-day reporting.
A better approach to legacy data migration
Legacy data migration should be treated as a standalone project rather than a simple export-and-import activity, with its own plan, approach, budget, and team.
Data migration is a complex process, so it is important to plan all aspects of the data migration program ahead of time. Execute the following activities during the planning phase:
- Define the goal and scope of the data migration: It is very important to clearly identify the goal and scope of the project, and to determine what is achievable in terms of what the source systems support. Perform a high-level analysis of the source systems to understand the type, size, and volume of legacy data. Based on this analysis, refine and finalize the scope, resources, and budget of the project. The following sample questions should help clarify the scope:
o What data needs to be migrated to the new Customer Service Management application?
o How many source systems are involved?
o What is the size and complexity of legacy system data?
o Is any data transformation or scrubbing needed before the data is moved to the new application?
o Should all the legacy data be converted at once or can this be done in phases after Go Live?
- Identify key stakeholders from IT and business: Establishing a team of the right people is vital to the success of the project. It is important to assemble a project team with representation from both IT and the business. Identify the following key roles/personas:
o Executive sponsor: Establishes the project vision, goals, and outcomes.
o Subject matter expert: Oversees the data migration process from the business perspective. Responsible for establishing data governance policy. Manages the team of data migration leads and data stewards, who are responsible for data auditing, cleansing, and validation.
o Project/program manager: Coordinates with business data migration leads, subject matter experts, and IT to execute the data migration per the project timeline.
o Business systems analyst: Understands the history, structure, and meaning of legacy data and systems. Responsible for mapping legacy data to the new system based on requirements provided by the business.
o Data ETL team (development): Extracts, transforms, and loads data from the legacy system to the new system.
o IT operations team (DBA, sys admin): Assists during deployment and Go Live.
The business is ultimately responsible for the quality of the data loaded into the new system. A data migration project gives the business an opportunity to re-examine its data from a usability perspective. It is vital to review and audit data from the legacy system before it is moved to the new system. The business should perform the following activities, with the help of the IT team, to maintain data sanity:
- Identify and remove obsolete data: Identify stale, unused data that has no business value, e.g., orders and cases that have been closed for a considerable duration, expired contracts, etc. Based on your data retention policies, obsolete records can be removed from the data migration scope upfront.
- Data de-duplication: De-duplication involves finding and removing duplicate data before it is imported into the new system. This process increases the reliability of the data and helps in consistent reporting. De-duplication is done for both master data (e.g., accounts, contacts, locations) and transactional data (e.g., cases, work orders, knowledge articles). It can be performed manually or with specialized de-duplication tools and services. It is also advisable to establish de-duplication rules for every entity and to implement these rules in the new system to avoid potential duplicates in the future.
- Data enrichment: A legacy data migration project also provides an opportunity to correct the underlying data. Data enrichment enhances, refines, or otherwise improves the quality of the data. Typically, external business applications and third-party services are used to enrich the data.
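As a minimal sketch, a de-duplication rule for a master-data entity such as contacts might key on a normalized email address. The matching rule and field name here are assumptions; real projects often rely on specialized matching tools:

```python
def dedupe_contacts(contacts):
    """Drop duplicate contacts, keeping the first record seen for
    each normalized email address (hypothetical matching rule)."""
    seen = set()
    unique = []
    for contact in contacts:
        key = contact["email"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(contact)
    return unique
```

The same pattern generalizes to other entities by swapping the matching key, which is why establishing per-entity de-duplication rules up front pays off.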
A typical Customer Service Management implementation requires both master and transactional data migration.
Master Data Migration
Master data conversion requires a one-time data migration effort to import accounts, contacts, locations, products, assets, and contracts from the legacy customer service application. You may also need to build an ongoing integration to synchronize this information with the backend ERP/CRM system, which typically serves as the single source of truth (SSOT).
Transaction Data Migration
As part of transaction data migration, historical cases, case activities, work orders, incidents, knowledge articles, and other types of business transactions are imported into ServiceNow Customer Service Management. Below are some general guidelines:
- Environment planning: Identify and plan for the number of instances (DEV, TEST, PROD) and tools needed to support data migration project. Consider availability of both source application and ServiceNow instances.
- Data migration specifications: Develop a mapping specification and supporting documentation based on the requirements provided by the business. The specification should cover the following details:
- Mapping between source and target objects
- Relationships between source objects versus relationships between target objects
- Field-level mapping between the source and the target object
- Field-level compliance for data types, length, permitted values, and integrity checks
- Record level validations and transformation
- Error handling and reporting
- Error re-processing
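A mapping specification like the one outlined above can be captured as data and enforced during the transform step. This is a minimal sketch; the field names, target names, and compliance rules are hypothetical:

```python
# Hypothetical field-level mapping spec: legacy field -> target field,
# plus compliance rules (type, max length, permitted values).
CASE_SPEC = {
    "ticket_no": {"target": "number", "type": str, "max_len": 40},
    "prio": {"target": "priority", "type": int, "allowed": {1, 2, 3, 4}},
}

def transform_case(legacy):
    """Map one legacy record to the target shape, collecting
    field-level compliance errors for reporting and re-processing."""
    target, errors = {}, []
    for src, rule in CASE_SPEC.items():
        value = legacy.get(src)
        if not isinstance(value, rule["type"]):
            errors.append(f"{src}: expected {rule['type'].__name__}")
            continue
        if "max_len" in rule and len(value) > rule["max_len"]:
            errors.append(f"{src}: exceeds max length {rule['max_len']}")
            continue
        if "allowed" in rule and value not in rule["allowed"]:
            errors.append(f"{src}: value {value!r} not permitted")
            continue
        target[rule["target"]] = value
    return target, errors
```

Keeping the spec as data (rather than hard-coded logic) makes it easy to review with the business and to reuse for error reporting.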
- System performance: Legacy data migration brings substantial data into the new ServiceNow environment, so you may need to create new indexes on columns to speed up queries. Consider creating a new index in the following situations:
- The column is queried frequently
- A referential integrity exists on the column
- The column has a unique value (candidate for Unique Index)
Always evaluate existing indexes before creating a new index.
- When running migration scripts that insert or delete records in the target tables, evaluate whether business rules are required at migration time. If not, turn them off by setting the workflow flag to 'false' so that unnecessary business rules are not fired, which can slow down the migration process.
- Other Technical Considerations:
- It is a good idea to tag imported legacy data separately for future identification purposes
- Leverage Import Sets and Transform Maps to stage the data before importing it into the target application tables
- The order of migrating the data objects is important to make sure referential integrity constraints are maintained
- Define appropriate coalesce fields to avoid data duplication in the target tables.
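The load-order point above can be made concrete with a dependency map and a topological sort, so that referenced objects are always loaded before the objects that reference them. The object names and dependencies here are illustrative:

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies: each object lists the objects it
# references, which must therefore be loaded first.
DEPENDS_ON = {
    "account": [],
    "contact": ["account"],
    "case": ["account", "contact"],
    "case_task": ["case"],
}

def load_order(deps):
    """Return an object load order that respects referential
    integrity (parents before children)."""
    return list(TopologicalSorter(deps).static_order())
```

With the map above, accounts load first, then contacts, then cases, then case tasks; adding a new object only requires declaring what it depends on.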
Legacy Data Conversion – Approach
In this phase, data is extracted from the source system, transformed, cleansed, and loaded into the target system using the migration rules. A segmented approach offers more control over data migration projects; depending on size, you can plan migration activities in multiple phases. Focus on high-value data that is heavily consumed and utilized. During execution, capture and report the following information:
o Total number of records from the source system/file
o Total number of records imported successfully
o Total number of records failed during conversion, reasons for failure and next steps
o Processing time
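The execution metrics listed above can be collected in a simple run report. This is a sketch assuming a per-record load function; in practice these numbers would typically come from the import tooling:

```python
import time

def run_migration(records, load_one):
    """Load records one by one, capturing total/success/failure
    counts, failure reasons, and processing time for the run report."""
    report = {"total": len(records), "imported": 0, "failed": 0,
              "failures": []}
    start = time.perf_counter()
    for record in records:
        try:
            load_one(record)
            report["imported"] += 1
        except Exception as exc:
            report["failed"] += 1
            report["failures"].append({"record": record,
                                       "reason": str(exc)})
    report["seconds"] = round(time.perf_counter() - start, 3)
    return report
```

Recording the failure reason alongside each failed record is what makes error re-processing practical later.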
It is important to execute and test data conversion activities in a non-production environment.
Testing the converted data is an important part of data conversion. Define test plans and test cases ahead of time, based on the specifications. The test plan should cover:
- Functional testing
o Data validation: Business users validate the imported data/records to ensure data sanity
o Access validation: Imported data is accessible to the appropriate set of users
o Regression validation: Run new business transactions with the imported data, e.g., if you have imported accounts, try to create a few cases for those accounts.
- Performance testing
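Data validation often begins with a source-versus-target reconciliation. A sketch comparing per-entity record counts (the entity names are illustrative):

```python
def reconcile_counts(source_counts, target_counts):
    """Compare per-entity record counts between the legacy source
    and the new system; return the entities whose counts differ."""
    mismatches = {}
    for entity in set(source_counts) | set(target_counts):
        s = source_counts.get(entity, 0)
        t = target_counts.get(entity, 0)
        if s != t:
            mismatches[entity] = {"source": s, "target": t}
    return mismatches
```

An empty result is a necessary (though not sufficient) sign that the load completed; business users still need to spot-check field-level content.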
Archiving Legacy Data
High-volume legacy data may eventually cause performance issues by consuming system resources and slowing down queries and reports. ServiceNow's Data Archiving plugin moves data that is no longer needed for immediate day-to-day access from primary tables into a set of archive tables, and allows you to set up flexible archival rules for individual tables. It is vital for high-volume Customer Service Management customers to establish data retention and archival policies for optimal performance.
To succeed, data migrations must be given the attention they deserve rather than simply being treated as part of a larger project. Without this attention and proper planning, there is a high risk that the project will go over budget, exceed the allotted time, or fail completely.
Review the relevant ServiceNow documentation to avoid or address performance-related issues encountered during data conversion.