OP2OL Whitepaper
This document is provided "as-is". Information and views expressed in this document, including URL and other
Internet Web site references, may change without notice.
Some examples depicted herein are provided for illustration only and are fictitious. No real association or
connection is intended or should be inferred.
This document does not provide you with any legal rights to any intellectual property in any Microsoft product.
You may copy and use this document for your internal, reference purposes. You may modify this document for
your internal reference purposes.
The videos and eBooks may be in English only. Also, if you click the links, you may be redirected to a U.S. website
whose content is in English.
© 2024 Microsoft. All rights reserved.
Many customers expect migrating to online to be an easy task facilitated by Microsoft, in which the customer simply provides their organization and it is then provisioned in Dynamics 365 Online. While complete automation is not possible due to the complexity of the solution, the tool and process described in this document provide a way to perform a migration from on-premises to online (OP2OL) with the least effort possible.
Please pay attention to the supported versions and ensure they follow the supportability/compatibility documentation. The minimum supported CRM on-premises version for this migration service is CRM 2015.
If a failure not previously experienced is encountered, additional investigation and troubleshooting are necessary, potentially extending the duration of the dry run. Response times are lenient, though the team is committed to promptly resolving issues. The project team should be prepared for possible delays resulting from these unforeseen circumstances.
A maximum of 2 successful dry runs is permitted. A dry run is deemed successful when the migration is not
abandoned. It's important to note that the primary objective of dry runs should not be the capture of end-to-end
timelines.
Production mock
Following the completion of dry runs, the Production Mock migration is scheduled close to the actual production
migration date, preferably a week before. This serves as a dress rehearsal for production migration, with the main
goal being to capture end-to-end timelines and complete migration with minimal failures. The experience gained
from this migration is typically repeated during production migration. If there are opportunities to reduce the
migration window, those steps will be deployed for the production migration.
All migration-blocking issues are addressed with the highest priority, similar to the approach to production
migration. Regular status updates are communicated via email.
Production migration
Production migration must be scheduled within 1 to 2 weeks after the Production Mock is completed. During the
production migration, all blocking issues are handled with the highest priority, and regular email and/or Teams
chat updates are provided throughout the process. Factory team members are available for contact via Teams call
or chat at any time during this migration window. If the production migration is not scheduled within the suggested timeframe after the Production Mock, migration support from Factory Team members may be delayed.
The image below shows the migration process and highlights the steps that are customer/partner responsibilities.
4. Once the above steps have been executed, please reach out to your Microsoft Migration Factory contact with the customer's name and Tenant ID to activate your profile on LCS.
5. After your Microsoft contact confirms the activation, please make sure you log out of the LCS portal and log in again.
To add either your colleagues or Migration Factory team members to the project:
1. Open the Project from LCS home.
2. Select the Project Users tile.
3. Select (+) to add the user to the project. Specify the Project role as Project owner, and then click Invite. Then send the Project URL and Name to the Migration Factory team, notifying them of the project creation. NOTE: Please use the UPN (e.g. alias@... instead of first.lastname@...).
2. Database size is not more than 2 TB. Let the factory team ([email protected]) know if your on-premises database size is more than 2 TB.
3. Databases over 900 GB require additional work. Refer to Appendix B for steps to split the data file.
4. The database collation needs to be aligned with the default for the specified language (e.g. 'Latin1_General_CI_AI' for 1033). Supported Collations
5. The SQL database backup set has only one LDF (log) file. Multiple data files are allowed.
6. The SQL DB recovery model should be set to SIMPLE before taking the backup.
7. Review the biggest tables in the database and reduce their size (e.g. the AsyncOperationBase table should be cleaned, and audit logs should be removed if not needed).
8. Full-text search should be set to "No" in Dynamics CRM on-premises and SQL full-text indexes removed.
9. SQL TDE, as well as any third-party encryption (using a symmetric key), must be disabled and the DB/fields decrypted before upload.
10. Remove extra filegroups.
11. Run the DBCC command to shrink the DB and get rid of empty space in the data file.
12. Run the command DBCC CHECKDB ('DBName') WITH EXTENDED_LOGICAL_CHECKS and provide the output to the Microsoft team.
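As an illustration, the checklist items covering the recovery model, shrinking, and the consistency check map to T-SQL along these lines. This is a sketch only — the database name is a placeholder, and the commands should be reviewed with your DBA before running:

```sql
-- Item 6: set the recovery model to SIMPLE before taking the backup
ALTER DATABASE [mycrm_MSCRM] SET RECOVERY SIMPLE;

-- Item 11: shrink the database to release empty space in the data file
DBCC SHRINKDATABASE (N'mycrm_MSCRM');

-- Item 12: consistency check; save the full output for the Microsoft team
DBCC CHECKDB (N'mycrm_MSCRM') WITH EXTENDED_LOGICAL_CHECKS;
```

Shrinking and CHECKDB are I/O-intensive on large databases, so plan to run them outside business hours.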
Organization name: The unique name of the Microsoft Dynamics CRM on-premises organization. It should not contain any spaces.
Version (current): The Microsoft Dynamics CRM on-premises version of your database.
SQL version: The SQL Server version of the Microsoft Dynamics CRM on-premises database.
CRM geo: The region where the D365 organization needs to be deployed. Please discuss this option with Microsoft to ensure you follow the best option for your scenario.
Sandbox Organization Name: Specify the target sandbox environment to which you are going to migrate your on-premises database; e.g. for mycompanytestmigration.crm4.dynamics.com you have to use mycompanytestmigration. The environment will be overwritten. This information should not be updated after entering Phase 2 of the project. If any changes are needed, please contact the Microsoft Factory Team.
CRM Region: The name of the region the environment will be migrated to.
Database size: Enter the database size here. The database size is not the same as the backup file size; it is the size of the (MDF + NDF) files, in gigabytes, of the on-premises org database. Please use the given SQL to find the DB size.
Migration Type: Dry Run, Production Mock, or Production Run — this will help Microsoft plan support accordingly.
Migration Date: The date you're targeting for the final step of migration (migrate to online) to occur.
Security Group: Ignore this if you have already selected a security group at the environment level.
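If the SQL provided by Microsoft is not at hand, a minimal equivalent that sums the data files (MDF + NDF, excluding the log) looks like this — the database name is a placeholder:

```sql
USE [mycrm_MSCRM];  -- placeholder: your on-premises org database
GO
-- size is stored in 8 KB pages; convert to GB, counting data files only
SELECT SUM(CAST(size AS BIGINT)) * 8.0 / 1024 / 1024 AS DataSizeGB
FROM sys.database_files
WHERE type_desc = 'ROWS';  -- MDF + NDF only; excludes the LDF log file
```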
4. There should be 0 allocation errors and 0 consistency errors. If you notice any errors, please fix them before we attempt migration.
5. Once complete, send the results in a text file to the Migration Factory contacts.
To obtain a backup with checksum, you can run the following command in SSMS:
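A typical form of that command — the database name and destination path below are placeholders:

```sql
BACKUP DATABASE [mycrm_MSCRM]
TO DISK = N'D:\Backups\mycrm_MSCRM.bak'
WITH CHECKSUM, STATS = 10;  -- CHECKSUM validates pages; STATS reports progress
```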
IMPORTANT: Make sure you import the BAK file as a “block blob”, not “page blob”.
1. Open the LCS project and go to the Step 1.3. Access Migration Service storage.
2. Choose Create and Save Connection String under CRM Azure Storage Connection Manager.
2. If backup files are uploaded, you will be able to select the respective file from the dropdown box.
NOTE: Once connected, you will have 24 hours to upload the files. If the key expires, please contact Microsoft at [email protected].
Note: After Step 2.1 is kicked off by the factory team, Steps 2.3 and 2.5 will be performed automatically.
TIP: If this is a test run and you only need a few users to work on the system after the migration is completed, just
map those. If this is a rehearsal for the final migration or the final migration itself, make sure you map all the users
you need to have mapped (these should exist and be licensed in the Microsoft 365 portal too).
Mapping B2B users (external domain) will require some specific actions, please consult Microsoft for those details.
1. Open the LCS project and go to Step 1.3 Access Migration Service storage.
2. Choose Create and Save Connection String under CRM Azure Storage Connection Manager.
If you use Notepad or a similar editor, the structure of the file should be comma-separated values with double quotes as the qualifier, as in the example below.
✓ Make sure that there are no empty rows in the document, including the first and last row.
✓ If there is no Primary E-mail for a user, please leave it blank.
✓ There should not be any duplicate UPNs in the TargetUser column.
Save the file as ADUsers.csv (please note the name is case sensitive) and upload it to the Blob storage account as
described in step 3.4.7.
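For illustration only, a double-quote-qualified file might look like the sketch below. The column layout and values here are hypothetical — follow the template and column order provided with your LCS project:

```
"CONTOSO\jdoe","jdoe@contoso.com","jdoe@contoso.com"
"CONTOSO\asmith","","asmith@contoso.com"
```

In this hypothetical layout the columns are the source AD account, the primary e-mail (left blank when the user has none), and the target UPN; note the second row shows a blank Primary E-mail rather than an omitted field.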
Validate ADUsers.csv
After the ADUsers file is uploaded, you must perform validation by selecting Validate AD Users file in step 2.4.
Microsoft will monitor the progress. The upgrade process may fail due to issues on the source database, and
remediation steps may be necessary to proceed with migration. You may also be asked to raise a support ticket
and work with Microsoft Support Engineering team on the resolution steps.
IMPORTANT: Before starting this phase, ensure the environment you specified in the Customer Information step exists on Power Platform. If it does not, please create an environment WITH a Database and Dynamics 365 Apps in the Power Platform admin center and return to Step 1.2 to validate.
IMPORTANT: Please contact Microsoft ([email protected]) to initiate the Migrate Organization task.
NOTE: Once this step is marked as complete, no further user mapping can be done through LCS. If there is a requirement for additional user mapping, a Microsoft Support ticket needs to be raised.
NOTE: The migration project and all the files in the storage account are deleted 7 days after the migration finishes. If you need to keep the project and files available for a longer period of time, please contact Microsoft.
If you need to delete the files earlier than 7 days after migration, you can access the Storage Account and delete
those manually.
The PowerApps Solution Checker is available for most of our Dynamics 365/PowerApps customers, online and on-premises. It's supported from Dynamics CRM 2011 through the latest Common Data Service/Model-Driven apps solutions.
For online customers, you can run PowerApps Solution Checker within the PowerApps maker portal.
For on-premises you can leverage the API to run the assessments too. More information about the PowerShell
option can be found here: Automatically validate your solutions using the PowerApps checker PowerShell Module
| Microsoft Power Apps
You can use this process within your ALM standards to submit your solution files for analysis before you push the
changes to production.
Once notified, you will be able to login and start importing your solutions. If you encounter errors importing your
solutions, related to dependencies from other Microsoft (or any other) apps, please verify if those are installed or
request assistance from your Microsoft contact or Microsoft Technical Support.
At this point, you can also re-configure your integrations to make those work with the newly migrated Dynamics
365 online environment.
The approach, in general terms, is to move online as quickly as possible: do not change anything in the original on-premises systems; rather, just remove any blockers and take the time to work on the solution changes directly in the online environment.
You can have up to two successful dry runs (see also section 3.2). Each will require a new LCS project, and they will help capture failures and record the mitigation actions, preparing for the final go-live migration.
NOTE: A successful dry run is a migration that is completed. If errors/issues encountered as part of the migration are mitigated, this is also considered a successful migration, as the mitigation steps are then noted and will be performed in the production migration should there be a need to do so.
NOTE: For any non-production database migration, a maximum of 1 successful dry run is allowed.
Runbook
It is advisable to create a migration runbook. Establishing a logbook that sequences all migration-related tasks, recording detailed actions, owners, and expected start/finish times, will significantly contribute to a successful migration. Make sure to include sufficient time buffers for any unforeseen situations.
The Dynamics 365 App for Outlook enhances user experience across Outlook on desktop, web, or phone. Upon
installation, depending on the app version, users will observe a Dynamics 365 apps pane or window alongside
selected Outlook email messages. This feature is also visible when composing an email message or setting up a
meeting or appointment. Dynamics 365 App for Outlook Overview (Dynamics 365 apps) | Microsoft Docs
6.10. Auditing
Audit logs assist administrators and other privileged users in reviewing the creation and update history of records.
During post-migration steps, there might be a requirement to update transactional or master data. It is advisable
to ensure that auditing is enabled before making any changes as part of post-migration steps.
https://docs.microsoft.com/en-us/power-platform/admin/manage-dataverse-auditing#configure-auditing-for-a-dataverse-environment
FAQ
7.1. Where does my data reside during staging?
The location of the data during migration is determined by the region where the target sandbox environment resides. This means that if you create an environment with the URL myorg.crm4.dynamics.com, the migration tooling will run in European datacenters. Additionally, ensure that you are utilizing the correct location for LCS tools; refer to section 3.2 for a list of available LCS endpoints.
Step 1:
-- Disable full-text search at the organization level
UPDATE OrganizationBase
SET IsFullTextSearchEnabled = 0

-- Drop FT indexes to address this issue:
/*
Cannot drop index 'cndx_PrimaryKey_ColumnMapping' because it enforces the full-text key for table or indexed
view 'ColumnMappingBase'. Could not drop constraint. See previous errors.
*/
DECLARE @output TABLE (tableName VARCHAR(1024))
DECLARE @T INT, @count INT, @dropTableName VARCHAR(1024)

INSERT @output
SELECT OBJECT_NAME(object_id) AS TableName
FROM sys.fulltext_indexes
WHERE OBJECT_NAME(object_id) NOT IN ('BusinessDataLocalizedLabelBase',
                                     'DocumentIndex',
                                     'MultiSelectAttributeOptionValuesBase')

SET @count = (SELECT COUNT(*) FROM @output)
SET @T = 1
WHILE @T <= @count
BEGIN
    SET @dropTableName = (SELECT TOP 1 tableName FROM @output)
    EXEC('DROP FULLTEXT INDEX ON ' + @dropTableName)
    DELETE @output WHERE tableName = @dropTableName
    SET @T = @T + 1
END

-- Drop the stored procedure to address this issue:
/*
Error in Action: Bin\Microsoft.Crm.ObjectModel.dll:InstallStoredProcedures on attempt 3.
System.Reflection.TargetInvocationException:
Exception has been thrown by the target of an invocation. ---> Microsoft.Crm.CrmException: Error
installing sp with name=EntityIdCollectionCreation
Cannot drop type 'dbo.EntityIdCollection' because it is being referenced by object
'p_GrantAccessBulkForCreate'.
There may be other objects that reference this type.
*/
DROP PROCEDURE [dbo].[p_GrantAccessBulkForCreate]
GO

-- Remove the FT indexes from the metadata index tables
BEGIN TRANSACTION CleanUpMetadataSchema
GO
DELETE FROM [MetadataSchema].[IndexAttributes]
WHERE [IndexId] IN
    (SELECT [IndexId]
     FROM [MetadataSchema].[EntityIndex]
     WHERE IndexType = 9
       AND [Name] NOT IN ('BusinessDataLocalizedLabel_FullText',
                          'DocumentIndex_FullText')
       AND [State] <> 3);
GO
DELETE FROM [MetadataSchema].[EntityIndex]
WHERE IndexType = 9
  AND [Name] NOT IN ('BusinessDataLocalizedLabel_FullText',
                     'DocumentIndex_FullText')
  AND [State] <> 3;
GO
COMMIT TRANSACTION CleanUpMetadataSchema
Step 2:
Deleting Full Text Catalog:
You can follow this documentation to learn how to find and delete full-text catalogs:
https://docs.microsoft.com/en-us/sql/relational-databases/search/create-and-manage-full-text-catalogs?view=sql-server-ver15
1. In SSMS, in Object Explorer, expand the server, expand Databases, and expand the database that
contains the full-text catalog you want to remove.
2. Expand Storage and expand Full Text Catalogs.
3. Right-click the full-text catalog that you want to remove, and then select Delete.
4. In the Delete Objects dialog box, click OK.
Step 3:
Run the output that is generated against your CRM organization database; it will drop all the indexes needed. Then try deleting the full-text catalog again; it should succeed now.
Step 4:
DB name: mycrm_MSCRM
DB Size: 1TB
1. Add 4 additional files with an initial size of 200 GB and an autogrowth of 512 MB.
2. In the sample script, two files are added to the E:\ drive, and one file each to the G:\ and L:\ drives, respectively.
3. When performing on the Production database, schedule a rebuild index job to redistribute data across
all files, excluding the attachment table. Ensure this runs outside business hours.
4. Schedule a Log Backup job on the database at 5-minute intervals while the rebuild index is running to
prevent the Transaction Log file from filling up and causing disk space issues on the "log" drive. Note: Set
the recovery model to SIMPLE as per op2ol prerequisites.
5. Identify and drop any unused indexes on the production database.
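A sketch of such a script, assuming the drive layout described in step 2 above — the database name, logical file names, and folder paths are placeholders:

```sql
-- Two additional data files on the E:\ drive
ALTER DATABASE [mycrm_MSCRM] ADD FILE
    (NAME = N'mscrm_data2', FILENAME = N'E:\Data\mscrm_data2.ndf', SIZE = 200GB, FILEGROWTH = 512MB),
    (NAME = N'mscrm_data3', FILENAME = N'E:\Data\mscrm_data3.ndf', SIZE = 200GB, FILEGROWTH = 512MB);

-- One file each on the G:\ and L:\ drives
ALTER DATABASE [mycrm_MSCRM] ADD FILE
    (NAME = N'mscrm_data4', FILENAME = N'G:\Data\mscrm_data4.ndf', SIZE = 200GB, FILEGROWTH = 512MB);
ALTER DATABASE [mycrm_MSCRM] ADD FILE
    (NAME = N'mscrm_data5', FILENAME = N'L:\Data\mscrm_data5.ndf', SIZE = 200GB, FILEGROWTH = 512MB);
```

All files are added to the PRIMARY filegroup, consistent with the prerequisite to remove extra filegroups.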
Throughout these steps, consistently monitor the size of the .MDF file. As you see the file size reducing, limit the max file size to a little over the current file size (example: if the size of the .MDF file is 500 GB, set the max size to 550 GB).
This is an iterative step that must be run until the .MDF file size is ~250 GB.
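One iteration of that shrink-and-cap loop can be sketched as follows — the logical file name, database name, and sizes are placeholders for your environment:

```sql
-- Reclaim space freed by the index rebuilds; the target size is given in MB (~250 GB here)
DBCC SHRINKFILE (N'mscrm', 256000);

-- Cap the .MDF a little above its current size (e.g. 550 GB max for a 500 GB file)
ALTER DATABASE [mycrm_MSCRM]
MODIFY FILE (NAME = N'mscrm', MAXSIZE = 550GB);
```

Shrinking in smaller increments (rather than one large shrink) keeps each pass shorter and easier to schedule around business hours.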
Scripts to be used:
Note: Before running any of the scripts below, please consult with your DBA.
In order to perform the data distribution by reindexing, the following Stored Procedure can be used:
create procedure [DBO].[P_DAMSREINDEXALL](
    @ALLINDEXTYPES int = 1,        -- 0: Clustered index only, 1: Clustered and non-clustered indexes
    @MAXRUNTIME int = NULL,        -- Maximum allowed running time (in seconds)
    @FRAGREBUILDPCTHIGH int = 30,  -- Min. percentage of fragmentation above which indexes are REBUILD unless @DefragType is set to 1.
*/
/* 1. ONLY REORGANIZE for frag 10% - 40%:    exec p_reindexAll @FragRebuildPctLow=10, @MaxFragPctToRebuild=40, @DefragType=1 */
/* 2. ONLY REBUILD for frag 40% - 100%:      exec p_reindexAll @FragRebuildPctLow=40, @MaxFragPctToRebuild=100, @DefragType=2 */
/* 3. Defragment indexes (default behavior): exec p_reindexAll (REORGANIZE fragmentation less than 30%; REBUILD otherwise) */
/***********************************************************************************************/
/* RETURN CODES:                                                                               */
/*   0 - Success                                                                               */
/*   1 - Partial success - some indexes could not be rebuilt                                   */
/*   2 - Partial success - indexes were rebuilt, but some statistics were not updated          */
/*   5 - Invalid input parameter(s)                                                            */
/***********************************************************************************************/
/*
Returns
a) Always first recordset - one row with these integer columns:
   ResultCode              - see RETURN CODES
   TotalIndexesToRebuild   - total count of indexes detected to be rebuilt
   RebuiltWithOnlineON     - count of indexes rebuilt with option ONLINE = ON
   RebuiltWithOnlineOFF    - count of indexes rebuilt with option ONLINE = OFF (can't be rebuilt with ONLINE = ON)
   TotalStatisticsToUpdate - total count of the indexes to update statistics for
   StatisticsUpdated       - count of the number of indexes updated
   FragmentationScanDurationInSeconds    - duration to identify the fragmented indexes that need rebuild/reorg
   MiscStatsUpdateBeforeRebuildInSeconds - duration of all UPDATE STATISTICS dbo.Subscription* runs done before rebuild starts
   ReindexComments         - reports any comments to be logged in telemetry, such as # of online rebuilds that failed due to BLOBs
b) Always second recordset - see @errors table
c) Only when @Verbose=1, the third recordset with detailed info about all indexes
d) Only when @Verbose=1, the fourth recordset with detailed info about all indexes to update statistics on
*/
use mycrm_MSCRM
set lock_timeout 20000
exec [p_DamsReindexAll] @FragRebuildPctLow=1, @MaxFragPctToRebuild=100, @DefragType=2
In the example, the job is scheduled to run at a specified time (11:30 PM to 5:45 AM).
Please discuss the DB backup strategy with your DBA while the reindexing job is running.