Data Conversion and Migration Strategy Essay

1. Data Conversion & Migration Strategy

The scope of this section is to define the data migration strategy from a CRM perspective. By its very nature, CRM is not a wholesale replacement of legacy systems with BSC CRM but rather the coordination and management of customer interaction within the existing application landscape. Therefore a large-scale data migration in the traditional sense is not required; only a select few data entities will need to be migrated into BSC CRM.

Data migration is typically a ‘one-off’ activity prior to go-live. Any ongoing data loads required on a frequent or ad-hoc basis are considered to be interfaces, and are not part of the data migration scope.


This section outlines how STEE-Infosoft intends to manage the data migration from the CAMS and HPSM legacy systems to the BSC CRM system.

STEE-InfoSoft will provide a comprehensive data conversion and migration solution to migrate the current legacy databases of CAMS and HPSM. The solution will adopt the most suitable and appropriate technology for database migration, using our proven methodology and professional expertise. STEE-InfoSoft’s data migration methodology assures clients of the quality, consistency, and accuracy of the results.

Table 11 shows the STEE-InfoSoft data migration value proposition using our methodology.

Table 11: STEE-InfoSoft data migration value proposition

Value: Cost Effective
Details: STEE-InfoSoft adopts a cost-effective data migration solution. Minimal downtime can be achieved for the data migration. Extensive use of automation speeds up the work and makes post-run alterations and corrections practical. Error tracking and correction capabilities help to avoid repeated conversion re-runs. Customisation enables getting the job done the right way.

Value: Very Short Downtime
Details: Downtime is minimised because most of the migration processes are external to the running application system and do not affect its normal workflow. Downtime is further reduced by allowing the data conversion to be performed in stages.

Value: Assured Data Integrity
Details: Scripts and programs are automatically generated for later use when testing and validating the data.

Value: Control Over the Migration Process
Details: Unique ETL (Extract, Transform and Load) scripts are created to run the extract and load processes in order to reduce the downtime of the existing systems. These scripts handle merging fields, filtering, splitting data, changing field definitions and translating field content, together with addition, deletion, transformation, aggregation, and validation rules for cleansing data.
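As an illustration of the ETL-script approach described in the table, the sketch below shows a minimal extract-transform-load routine in Python against an in-memory SQLite database. The table names, columns and sample values (legacy_customer, stg_customer, etc.) are assumptions made purely for this example and do not represent the actual CAMS, HPSM or BSC CRM schemas.

# Minimal ETL sketch (illustrative schema, not the real CAMS/HPSM/BSC CRM tables).
# Demonstrates the extract -> transform -> load pattern using an in-memory
# SQLite database so the example is self-contained and runnable.
import sqlite3

def extract(conn):
    # Extract raw rows from the (illustrative) legacy table.
    return conn.execute(
        "SELECT cust_id, first_name, last_name, status FROM legacy_customer"
    ).fetchall()

def transform(rows):
    # Merge name fields, filter out records marked for deletion,
    # and translate the status code into a standardised value.
    cleaned = []
    for cust_id, first_name, last_name, status in rows:
        if status == "DEL":                                               # filtering
            continue
        full_name = f"{first_name.strip()} {last_name.strip()}".title()   # merging fields
        cleaned.append((cust_id, full_name, status.upper()))              # translating content
    return cleaned

def load(conn, rows):
    # Load the transformed rows into the (illustrative) staging table.
    conn.executemany(
        "INSERT INTO stg_customer (cust_id, full_name, status) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE legacy_customer (cust_id, first_name, last_name, status)")
    conn.execute("CREATE TABLE stg_customer (cust_id, full_name, status)")
    conn.executemany(
        "INSERT INTO legacy_customer VALUES (?, ?, ?, ?)",
        [(1, " alice ", "tan", "act"), (2, "bob", "lee", "DEL")],
    )
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM stg_customer").fetchall())

A real migration script would follow the same extract, transform and load steps, but would read from the legacy database connections, apply the agreed mapping and cleansing rules, and write into the staging tables used for the CRM load.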

1.1. Data Migration Overview

Data migration is the transfer of data from one location, storage medium, or hardware/software system to another. Migration efforts are often prompted by the need for upgrades in technical infrastructure or changes in business requirements.

Best practice in data migration recommends two principles which are integral to successful data migration:

  1. Perform data migration as a project dedicated to the unique objective of establishing a new (target) data store.
  2. Perform data migration in four primary phases: Data Migration Planning, Data Migration Analysis and Design, Data Migration Implementation, and Data Migration Closeout, as shown in 1.1.

In addition, successful data migration projects were ones that maximised opportunities and mitigated risks. The following critical success factors were identified:

Perform data migration as an independent project.

Establish and manage expectations throughout the process.

Understand current and future data and business requirements.

Identify individuals with expertise regarding legacy data.

Collect available documentation regarding legacy system(s).

Define data migration project roles & responsibilities clearly.

Perform a comprehensive overview of data content, quality, and structure.

Coordinate with business owners and stakeholders to determine the importance of business data and data quality.

1.2. STEE-Info Data Migration Project Lifecycle

Table 12 lists the high-level processes for each phase of the STEE-Info Data Migration Project Lifecycle.

While all data migration projects follow the four phases in the Data Migration Project Lifecycle, the high-level and low-level processes may vary depending on the size, scope and complexity of each migration project. Therefore, the following information should serve as a guideline for developing, evaluating, and implementing data migration efforts. Each high-level and low-level process should be included in a Data Migration Plan. For those processes not deemed appropriate, a justification for exclusion should be documented in the Data Migration Plan.

Table 12: Data Migration Project Lifecycle with high-level tasks identified.

Data Migration Planning Phase: Plan Data Migration Project; Determine Data Migration Requirements; Assess Current Environment; Develop Data Migration Plan; Define and Assign Team Roles and Responsibilities.

Data Migration Analysis & Design Phase: Analyze Assessment Results; Define Security Controls; Design Data Environment; Design Migration Procedures; Validate Data Quality.

Data Migration Implementation Phase: Develop Procedures; Stage Data; Cleanse Data; Convert/Transform Data (as needed); Migrate Data (test/deployment); Validate Migration Results (iterative); Validate Post-migration Results.

Data Migration Closeout Phase: Document Data Migration Results; Document Lessons Learned; Perform Knowledge Transfer; Communicate Data Migration Results.

During the lifecycle of a data migration project, the team moves the data through the activities shown in 1.2.

The team will repeat these data management activities as needed to ensure a successful data load to the new target data store.

1.3. Data Migration Guiding Principles

1.3.1. Data Migration Approach

1.3.1.1. Master Data – (e.g. Customers, Assets)

The approach is that master data will be migrated into CRM providing these conditions hold:

The application where the data resides is being replaced by CRM.

The master records are required to support CRM functionality post-go-live.

There is a key operational, reporting or legal/statutory requirement.

The master data is current (e.g. records marked for deletion need not be migrated) OR is required to support another migration.

The legacy data is of a sufficient quality so as not to adversely impact the day-to-day running of the CRM system OR will be cleansed by the business/enhanced sufficiently within the data migration process to meet this requirement.

Note: Where the master data resides in an application that is not being replaced by CRM, but is required by CRM to support specific functionality, the data will NOT be migrated but accessed from CRM using a dynamic query look-up. A dynamic query look-up is a real-time query accessing the data in the source application as and when it is required. The advantages of this approach are:

Avoids the duplication of data throughout the system landscape.

Avoids data within CRM becoming out of date.

Avoids the development and running of frequent interfaces to update the data within CRM.

Reduces the quantity of data within the CRM systems.
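As a minimal sketch of the dynamic query look-up pattern, the code below fetches a record from the source application's data store at the moment it is needed, rather than reading a copy migrated into CRM. The connection, table and column names are illustrative assumptions only.

import sqlite3

def dynamic_asset_lookup(source_conn, asset_id):
    # Real-time look-up: query the source application's database on demand
    # rather than reading a copy that was migrated into CRM earlier.
    return source_conn.execute(
        "SELECT asset_id, description, owner FROM asset WHERE asset_id = ?",
        (asset_id,),
    ).fetchone()   # None if the asset does not exist in the source system

if __name__ == "__main__":
    # Stand-in for the legacy application's database (illustrative schema).
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE asset (asset_id, description, owner)")
    src.execute("INSERT INTO asset VALUES (?, ?, ?)", (42, "Laptop", "Alice"))
    print(dynamic_asset_lookup(src, 42))   # fetched fresh from the source, never migrated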

1.3.1.2. ‘Open’ Transactional Data (e.g. Service Tickets)

The approach is that ‘open’ transactional data will NOT be migrated to CRM unless ALL of these conditions are met:

There is a key operational, reporting or legal/statutory requirement.

The legacy system is to be decommissioned as a result of the BSC CRM project in timescales that would prevent a ‘run down’ of open items.

The parallel ‘run down’ of open items within the legacy system is impractical due to operational, timing or resource constraints.

The CRM build and structures permit a correct and consistent interpretation of legacy system items alongside CRM-generated items.

The business owner is able to commit resources to own data reconciliation and sign-off at a detailed level in a timely manner across multiple project phases.

1.3.1.3. Historical Master and Transactional Data

The approach is that historical data will not be migrated unless ALL of these conditions are met:

There is a key operational, reporting or legal/statutory requirement that cannot be met by using the remaining system.

The legacy system is to be decommissioned as a direct result of the BSC CRM project within the BSC CRM project timeline.

An archiving solution could not meet the requirements.

The CRM build and structures permit a correct and consistent interpretation of legacy system items alongside CRM-generated items.

The business owner is able to commit resources to own data reconciliation and sign-off at a detailed level in a timely manner across multiple project phases.

1.3.2. Data Migration Testing Cycles

In order to test and verify the migration process it is proposed that there will be three testing cycles before the final live load:

Test Load 1: Unit testing of the extract and load routines.

Test Load 2: The first test of the complete end-to-end data migration process for each data entity. The main purpose of this load is to ensure the extract routines work correctly, the staging-area transformation is correct, and the load routines can load the data successfully into CRM. The various data entities will not necessarily be loaded in the same sequence as will be used during the live cutover.

Test Cutover: A complete dry run of the live data migration process. The execution will be done using the cutover plan in order to validate that the plan is realistic and can be completed in the agreed timescale. A final set of cleansing actions will come out of the test cutover (for any records which failed during the migration because of data quality issues). There will be at least one test cutover. For complex, high-risk migrations several test runs may be performed, until the result is completely satisfactory and 100% correct.

Live Cutover: The execution of all tasks required to prepare BSC CRM for the go-live of a particular release. The large majority of these tasks will be related to data migration.

1.3.3. Data Cleansing

Before data can be successfully migrated it needs to be clean; data cleansing is therefore an important element of any data migration activity (a minimal cleansing sketch follows the list below):

Data needs to be consistent, standardised and correctly formatted to allow successful migration into CRM (e.g. CRM holds addresses as structured addresses, whereas some legacy systems might hold this information in a freeform format).

Data needs to be complete, to ensure that upon migration all fields which are mandatory in CRM are populated. Any fields flagged as mandatory which are left blank will cause the migration to fail.

Data needs to be de-duplicated and be of sufficient quality to allow efficient and correct support of the defined business processes. Duplicate records can either be marked for deletion at source (the preferred option), or should be excluded in the extract/conversion process.

Legacy data fields could have been misused (holding information different from what the field was initially intended to be used for). Data cleansing should pick this up, and a decision needs to be made whether this information should be excluded (i.e. not migrated) or transferred into a more appropriate field.
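The following is a minimal cleansing sketch illustrating the points above: free-form addresses are split into structured fields, mandatory fields are checked, and duplicates are flagged. The field layout and the simple comma-based address rule are assumptions for illustration only; the actual cleansing rules would be agreed with the data owner.

def cleanse(records, mandatory_fields=("name", "address")):
    # Illustrative cleansing pass: structure addresses, check mandatory fields,
    # and flag duplicates so they can be marked for deletion at source.
    seen = set()
    clean, rejected = [], []
    for rec in records:
        # Completeness: mandatory CRM fields must be populated.
        if any(not rec.get(f) for f in mandatory_fields):
            rejected.append((rec, "missing mandatory field"))
            continue
        # Consistency: split a free-form address into structured parts.
        parts = [p.strip() for p in rec["address"].split(",")]
        rec["street"] = parts[0]
        rec["city"] = parts[1] if len(parts) > 1 else ""
        # De-duplication: key on a simple name + street combination.
        key = (rec["name"].lower(), rec["street"].lower())
        if key in seen:
            rejected.append((rec, "duplicate"))
            continue
        seen.add(key)
        clean.append(rec)
    return clean, rejected

if __name__ == "__main__":
    sample = [
        {"name": "Alice Tan", "address": "1 Science Park Rd, Singapore"},
        {"name": "alice tan", "address": "1 Science Park Rd , Singapore"},
        {"name": "Bob Lee", "address": ""},
    ]
    ok, bad = cleanse(sample)
    print(len(ok), "clean;", [reason for _, reason in bad])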

It is the responsibility of the data owner (i.e. MOM) to ensure the data provided to STEE-Info for migration into BSC CRM (whether this is from a legacy source or a template populated specifically for BSC CRM) is accurate.

Data cleansing should, wherever possible, be done at source, i.e. in the legacy systems, for the following reasons:

Unless a data change freeze is put in place, extracted datasets become out of date as soon as they have been extracted, due to updates taking place in the source system. When re-extracting the data at a later date to get the most recent updates, data cleansing actions will get overwritten. Therefore cleansing will have to be repeated each time a new dataset is extracted. In most cases, this is impractical and requires a large effort.

Data cleansing is typically a business activity. Therefore, cleansing in the actual legacy system has the advantage that business people already have access to the legacy system and are also familiar with the application; this is not the case when data is stored in staging areas. In certain cases it may be possible to develop a programme to achieve a certain degree of automated cleansing, although this adds additional risk of data errors.

If data cleansing is done at source, each time a new (i.e. more recent) extract is taken, the results of the latest cleansing actions will automatically come across in the extract without additional effort.

1.3.4. Pre-Migration Testing

Testing breaks down into two core subject areas: logical errors and physical errors. Physical errors are typically syntactical in nature and can be easily identified and resolved. Physical errors have nothing to do with the quality of the mapping effort; rather, this level of testing deals with the semantics of the scripting language used in the transformation effort. Testing is where we identify and resolve logical errors. The first step is to execute the mapping. Even if the mapping completes successfully, we must still ask questions such as the following (a sketch of such checks follows the list):

How many records did we expect this script to create?

Did the correct number of records get created?

Has the data been loaded into the correct fields?

Has the data been formatted correctly?
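A minimal sketch of such checks is shown below, assuming an illustrative staging table and an agreed expected record count; the table, fields and status codes are placeholders rather than the real BSC CRM structures.

import sqlite3

def check_load(conn, expected_count):
    # Logical checks after running a mapping: did the expected number of
    # records get created, and were mandatory fields loaded and formatted?
    actual = conn.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
    assert actual == expected_count, f"expected {expected_count} rows, loaded {actual}"

    unpopulated = conn.execute(
        "SELECT COUNT(*) FROM stg_customer WHERE full_name IS NULL OR full_name = ''"
    ).fetchone()[0]
    assert unpopulated == 0, f"{unpopulated} rows missing mandatory full_name"

    badly_formatted = conn.execute(
        "SELECT COUNT(*) FROM stg_customer WHERE status NOT IN ('ACT', 'INA')"
    ).fetchone()[0]
    assert badly_formatted == 0, f"{badly_formatted} rows with unexpected status codes"

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_customer (cust_id, full_name, status)")
    conn.executemany(
        "INSERT INTO stg_customer VALUES (?, ?, ?)",
        [(1, "Alice Tan", "ACT"), (2, "Bob Lee", "INA")],
    )
    check_load(conn, expected_count=2)
    print("pre-migration checks passed")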

The fact is that data mapping often does not make sense to most people until they can physically interact with the new, populated data structures. Frequently, this is where the majority of transformation and mapping requirements will be discovered. Most people simply do not realise they have missed something until it is no longer there. For this reason, it is critical to let them loose on the populated target data structures as soon as possible. The data migration testing phase must be reached as early as possible to ensure that it occurs prior to the design and build phases of the core project. Otherwise, months of development effort can be lost as each additional migration requirement slowly but surely wreaks havoc on the data model. This, in turn, requires substantial changes to the applications built upon the data model.

1.3.5. Migration Validation

Before the migration can be considered a success, one critical step remains: to validate the post-migration environment and confirm that all expectations have been met prior to committing. At a minimum, network access, file permissions, directory structure, and databases/applications need to be validated, which is often done via non-production testing. Another good strategy to validate the migration is to benchmark the way the business functions pre-migration and then compare that benchmark to the behaviour after migration. The most effective way to collect benchmark measurements is to collect and analyse quality metrics for the various business areas and their corresponding affairs.
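As a sketch of this benchmarking approach, and assuming a set of agreed metric names and tolerances, the comparison could be as simple as capturing the metrics before migration and re-measuring them afterwards:

def compare_benchmarks(pre, post, tolerance=0.0):
    # Compare pre-migration and post-migration quality metrics; any metric
    # that drifts beyond the agreed tolerance is reported for investigation.
    issues = []
    for metric, before in pre.items():
        after = post.get(metric)
        if after is None:
            issues.append(f"{metric}: missing after migration")
        elif abs(after - before) > tolerance:
            issues.append(f"{metric}: {before} -> {after}")
    return issues

if __name__ == "__main__":
    # Illustrative metrics only; the real set would be agreed per business area.
    pre = {"open_service_tickets": 1250, "active_customers": 40321}
    post = {"open_service_tickets": 1250, "active_customers": 40319}
    print(compare_benchmarks(pre, post) or "benchmarks match")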

1.3.6. Data Conversion Process

Mapped data and the data conversion program will be put into use during this period. The duration and timeframe of this process will depend on:

Amount of data to be migrated

Number of legacy systems to be migrated

Resource limitations such as server performance

Errors which are churned out by this process

The conversion error management approach aims to reject all records containing a serious error as early as possible during the conversion process. Correction facilities are provided during the conversion; where possible, these will use the existing amendment interface.

Errors can be classified as follows (a classification sketch follows the list):

Fatal errors – which are so serious that they prevent the record from being loaded onto the database. These will include errors that cause a breach of database integrity, such as duplicate primary keys or invalid foreign key references. These errors will be the focus of data cleansing both before and during the conversion. Attempts to correct errors without user interaction are normally ineffective.

Non-fatal errors – which are less serious. The affected record is loaded onto the database, still containing the error, and the error is communicated to the user via a work management item attached to the record. The error is then corrected with information from the user.

Auto-corrected errors – for which the offending data item is replaced by a previously agreed value by the conversion modules. The agreed replacement values are determined together with the users before the conversion process starts.
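A sketch of this classification, using made-up rules, field names and correction values purely for illustration, might look as follows:

AUTO_CORRECTIONS = {"status": {"": "UNKNOWN"}}   # agreed default values (illustrative)

def classify(record, existing_keys):
    # Route each record into fatal, non-fatal or auto-corrected handling.
    if record["cust_id"] in existing_keys:               # duplicate primary key
        return "fatal", record                           # reject: would breach integrity
    if record.get("status", "") in AUTO_CORRECTIONS["status"]:
        record["status"] = AUTO_CORRECTIONS["status"][record.get("status", "")]
        return "auto-corrected", record                  # replaced by an agreed value
    if not record.get("email"):                          # incomplete but loadable
        return "non-fatal", record                       # load, then raise a work item
    return "clean", record

if __name__ == "__main__":
    existing = {101}
    for rec in [
        {"cust_id": 101, "status": "ACT", "email": "a@x.sg"},   # duplicate key
        {"cust_id": 102, "status": "", "email": "b@x.sg"},      # blank status
        {"cust_id": 103, "status": "ACT", "email": ""},         # missing email
    ]:
        outcome, _ = classify(rec, existing)
        print(rec["cust_id"], outcome)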

One of the important tasks in the process of data conversion is data validation. Data validation in a broad sense includes checking the translation process itself, or checking the data to see to what degree the conversion process is an information-preserving mapping.

Some of the common verification methods to be used are (a sketch follows the list):

Financial verifications (verifying pre- to post-conversion totals for key financial values, and verifying subsidiary to general ledger totals) – to be conducted centrally in the presence of accounts, audit, compliance & risk management;

Mandatory exception verifications and corrections (on those exceptions that must be resolved to avoid production problems) – to be reviewed centrally, but with branches executing and confirming the corrections, again in the presence of network management, audit, compliance & risk management;

Detailed verifications (where full details are printed and the users will need to perform random detailed verification against legacy system data) – to be conducted at branches with final confirmation sign-off by branch deployment and the branch manager; and

Electronic file matching (matching field by field or record by record) using pre-defined files.
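The financial-totals and electronic file matching checks might be sketched as follows; the control totals, keys and fields are illustrative placeholders only:

def verify_totals(pre_total, post_total):
    # Financial verification: pre- and post-conversion control totals must agree.
    assert pre_total == post_total, f"control totals differ: {pre_total} vs {post_total}"

def match_records(legacy_rows, crm_rows, key="id"):
    # Electronic file matching: compare record by record, field by field.
    legacy = {r[key]: r for r in legacy_rows}
    crm = {r[key]: r for r in crm_rows}
    mismatches = []
    for k, old in legacy.items():
        new = crm.get(k)
        if new is None:
            mismatches.append((k, "missing in CRM"))
        elif old != new:
            mismatches.append((k, "field values differ"))
    return mismatches

if __name__ == "__main__":
    verify_totals(pre_total=105_000.00, post_total=105_000.00)
    legacy = [{"id": 1, "amount": 500}, {"id": 2, "amount": 250}]
    crm = [{"id": 1, "amount": 500}, {"id": 2, "amount": 255}]
    print(match_records(legacy, crm))   # -> [(2, 'field values differ')]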

1.4. Data Migration Method

The primary method of transferring data from a legacy system into Siebel CRM is through Siebel Enterprise Integration Manager (EIM). This facility enables the bidirectional exchange of data between non-Siebel databases and the Siebel database. It is a server component in the Siebel eAI component group that transfers data between the Siebel database and other corporate data sources. This exchange of information is accomplished through intermediary tables called EIM tables. The EIM tables act as a staging area between the Siebel application database and other data sources.

The following figure illustrates how data from the HPSM, CAMS, and IA databases will be migrated to the Siebel CRM database.
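As a minimal sketch of the EIM loading pattern, the code below stages one batch of legacy contacts into an EIM-style interface table. The SQLite database stands in for the Siebel staging schema, and the table and column names used here (EIM_CONTACT, IF_ROW_BATCH_NUM, IF_ROW_STAT, CON_FST_NAME, CON_LAST_NAME) are assumptions to be verified against the actual Siebel repository. In the real migration, a Siebel EIM server task driven by an .ifb configuration would then import each staged batch into the Siebel base tables.

import sqlite3

# Stand-in for the Siebel staging schema; in practice this would be a connection
# to the Siebel database and the table would be a real EIM interface table.
EIM_TABLE = "EIM_CONTACT"   # assumed/illustrative choice of interface table

def stage_batch(conn, batch_num, contacts):
    # Populate the EIM staging table with one batch of legacy contacts.
    # IF_ROW_BATCH_NUM groups rows for a single EIM task run; IF_ROW_STAT is
    # later updated by EIM to report the load status of each row (assumed columns).
    conn.executemany(
        f"INSERT INTO {EIM_TABLE} (IF_ROW_BATCH_NUM, IF_ROW_STAT, CON_FST_NAME, CON_LAST_NAME) "
        "VALUES (?, 'FOR_IMPORT', ?, ?)",
        [(batch_num, c["first"], c["last"]) for c in contacts],
    )
    conn.commit()
    # Next step (outside this sketch): run the EIM server task to import this
    # batch from the interface table into the Siebel base tables.

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        f"CREATE TABLE {EIM_TABLE} (IF_ROW_BATCH_NUM, IF_ROW_STAT, CON_FST_NAME, CON_LAST_NAME)"
    )
    stage_batch(conn, batch_num=1, contacts=[{"first": "Alice", "last": "Tan"}])
    print(conn.execute(f"SELECT * FROM {EIM_TABLE}").fetchall())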

1.5. Data Conversion and Migration Schedule

The following is the proposed data conversion and migration schedule for migrating the HPMS, CAMS, and IA databases into the Siebel CRM database.

1.6. Risks and Assumptions

1.6.1. Risks

  1. MOM may not be able to confidently reconcile large and/or complex data sets. Since the data migration will need to be reconciled a minimum of 3 times (system test, test cutover and live cutover), the effort required within the business to comprehensively test the migrated data set is significant. In addition, technical data loading constraints during cutover may mean a limited time window is available for reconciliation tasks (e.g. overnight or during weekends).
  2. MOM may not be able to comprehensively cleanse the legacy data in line with the BSC CRM project timescales. Since the migration to BSC CRM may be dependent on a number of cleansing activities being carried out in the legacy systems, the effort required within the business to achieve this will increase proportionally with the volume of data migrated. Failure to complete this exercise in the required timescale may result in data being unable to be migrated into BSC CRM in time for the planned cutover.
  3. The volume of data errors in the live system may be increased if reconciliation is not completed to the required standard. The larger/more complex a migration becomes, the more likely it is that anomalies will occur. Some of these may initially go undetected. In the best case such data issues can lead to a business and project overhead in correcting the errors after the event. In the worst case this can lead to a business operating on inaccurate data.
  4. The more data that is migrated into BSC CRM, the more complex and lengthy the cutover becomes, resulting in an increased risk of not being able to complete the migration tasks on time. Any further resource or technical constraints can add to this risk.
  5. Due to the volume of the undertaking, data migration can divert project and business resources away from key activities such as the initial system build, functional testing and user acceptance testing.

1.6.2. Assumptions

  1. Data Access – Access to the data held within the CAMS, HPSM and IA applications is required to enable data profiling, the identification of data sources and the writing of functional and technical specifications.
  2. An access connection is required to the HPMS, CAMS, and IA databases to enable execution of the data migration scripts.
  3. MOM is to provide workstations to run the ETL scripts for the data migration of the HPMS, CAMS, and IA databases.
  4. There must not be any schema changes on the legacy HPMS, CAMS, and IA databases during the data migration phase.
  5. MOM is to provide a sample of production data for testing the developed ETL scripts.
  6. MOM business resource availability:

Required to assist in data profiling, the identification of data sources and the creation of functional and technical specifications.

Required to develop and run data extracts from the CAMS & HPSM systems.

Required to validate/reconcile/sign-off data loads.

Required for data cleansing.

  7. Data cleansing of source data is the responsibility of MOM. STEE-Info will help identify data anomalies during the data migration process; however STEE-Info will not cleanse the data in the CAMS & HPSM applications. Depending on the data quality, data cleansing can require considerable effort and involve a large amount of resources.
  8. The scope of the data migration requirements has not yet been finalised; as data objects are identified they will be added to the data object register.