Lessons learnt from SAP Central Finance Initial Load
Data is increasingly treated as an organizational asset with strategic and operational value. That status demands data which is accurate and consistent, so that it can be relied upon for decision making and reporting.
The Initial Load (IL) is the process of transferring opening balances, open items and accounting documents from the source systems to Central Finance (cFIN) for a defined set of company codes over a defined timeframe. It is an important pillar of data transformation and drives the success of a Central Finance (cFIN) project. We have observed strong client interest in this area because of its value proposition of delivering a harmonized data migration.
Having delivered it successfully in a complex environment, we consider it a logical and mature process, provided there is an understanding of how to perform it, the posting schema, the key considerations and the technical challenges. The specifics differ from project to project based on a variety of factors, e.g. the status quo, the source systems, and the volume and quality of data.
In this blog, we focus on the cFIN Initial Load process, covering our insights and lessons learnt from recent experience across each phase of the implementation.
Our experiences
cFIN is a deployment option that creates a single source of harmonized truth across disparate ERP applications, providing a single platform for planning, reporting and consolidation. Our clients achieved the following:
- Business harmonization and reporting goals for the Enterprise structure, Chart of Accounts, Master data and reporting
- Complex data transformation rules engineered for data enrichment
- More than 10 million Financial and Controlling records and historical balances migrated via the IL
- More than 50,000 documents on average replicated per day per system via AIF replication
Initial Load activities
The recommended sequence of activities to ensure completeness and correctness is as follows:
Below is the persistence layer view during IL:
Key lessons learnt from Initial Load
Let us switch gears and look at the key challenges and the lessons we learnt:
Migration scoping considerations
The scope of data migration affects project timelines, with a direct impact on IL run activities, error resolution and value reconciliation. Consider the points below to achieve the right balance between reporting requirements and effort:
- Limit the volume of data migration – Restricting the migration scope to what is needed for reporting requirements and business continuity at cFIN go-live reduces the chances of extraction failures and serialization errors and simplifies work process optimization
- Limit Controlling records migration – The scope is determined by the IL parameters. Given the sheer volume of data involved, consider migrating Controlling records only from the period when live replication is active, especially under constrained project timelines
- Validate the data source for extraction – Validate whether the standard extraction tables in the source system provide the required reporting dimensions. For example, migrating balances from a classic G/L source has limitations on dimension availability (no cost centres, functional area etc.). Consider alternative bespoke migration options from a data source that includes the required dimensions (a simple coverage check is sketched after this list)
- Document splitting – If segment reporting (IFRS 8/IAS 14 compliance) or profit centre reporting is required, document splitting enrichment rules must be set up in cFIN. This applies to the period for which entire documents and open items are replicated during the Initial Load, not to the balance-only periods where postings are made to migration/substitution accounts
- Profitability Analysis (costing-based CO-PA active in the source system, migrated to account-based CO-PA in cFIN) – The scope is determined by the IL parameters and the migration runs as part of the FI and CO Initial Load, during which the CO-PA-relevant information (characteristics) from source postings with a profitability segment (such as billing documents) is fed into the cFIN universal journal. Consider migrating CO-PA-relevant information only from the period when live replication is active, especially if project timelines are constrained, as custom developments (BAdIs) may be required depending on the complexity of the source documents
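To make the data-source validation concrete, here is a minimal Python sketch (not an SAP program) that checks which required reporting dimensions are actually populated in a hypothetical balance extract. The field names, sample rows and required-dimension list are assumptions for illustration only.

```python
# Minimal sketch: check whether a (hypothetical) classic G/L balance extract
# carries the reporting dimensions needed in cFIN. Field names and sample
# data are illustrative assumptions, not real table structures.

REQUIRED_DIMENSIONS = ["profit_center", "cost_center", "functional_area", "segment"]

balance_extract = [  # pretend rows extracted from the source balance tables
    {"company_code": "1000", "gl_account": "400000", "profit_center": "PC01",
     "cost_center": None, "functional_area": None, "segment": "SEG_A", "amount": 1200.0},
    {"company_code": "1000", "gl_account": "500000", "profit_center": "PC02",
     "cost_center": "CC10", "functional_area": "0400", "segment": None, "amount": -300.0},
]

def dimension_coverage(rows, dimensions):
    """Return the share of rows in which each required dimension is populated."""
    total = len(rows)
    return {dim: sum(1 for r in rows if r.get(dim)) / total for dim in dimensions}

coverage = dimension_coverage(balance_extract, REQUIRED_DIMENSIONS)
for dim, share in coverage.items():
    flag = "OK" if share == 1.0 else "GAP -> consider a bespoke migration source"
    print(f"{dim:<16} populated in {share:.0%} of rows  [{flag}]")
```

If a dimension shows a coverage gap, that is the signal to source the balances from a richer data set or to plan a bespoke migration for that dimension.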
Data preparation activities
Successful IL postings depend on preparation activities that address potential issues upfront. Consider including the activities below in your cFIN work plan:
Business activity for data cleansing
- Clear the open items in the source system to the extent possible
- Lock the redundant or defunct G/Ls, Profit centres, Cost centres
- Eliminate duplicates in master data (materials, customers, vendors)
Business mapping rules engineering
- Deploy appropriately skilled SAP resources in the data harmonization workshops to ensure that the right guiding principles are defined
- Gather the business mapping requirements (1:1, 1:N, N:1 mappings) and transformation rules to meet reporting requirements
- Where business needs are not supported by standard SAP mapping, consider implementing the built-in BAdIs for complex transformation rules and data enrichment logic (e.g. 1:N mappings); a simplified mapping sketch follows this list
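As an illustration of the mapping patterns above, the following Python sketch shows a 1:1 and N:1 lookup and why a 1:N split needs extra context. The account numbers and the split rule are purely hypothetical; in a real cFIN project these rules are maintained in the mapping framework and BAdIs rather than in standalone code like this.

```python
# Minimal sketch of business mapping rules (1:1 and N:1) via a lookup table,
# plus a 1:N case that needs extra context. All account numbers are
# illustrative assumptions, not a real chart of accounts.

# N:1 mapping: several legacy accounts collapse into one central account.
GL_MAPPING = {
    "0000400000": "41000000",   # 1:1
    "0000400010": "41000000",   # N:1 - merged into the same central account
    "0000400020": "41000000",   # N:1
    "0000500000": "51000000",   # 1:1
}

def map_gl_account(source_account):
    """Apply the lookup rule; unmapped accounts surface as errors early."""
    if source_account not in GL_MAPPING:
        raise ValueError(f"No mapping maintained for source G/L {source_account}")
    return GL_MAPPING[source_account]

def map_gl_account_1_to_n(source_account, cost_center=None):
    """A 1:N split needs context beyond the account itself (here: cost center),
    which is why such rules typically end up in enrichment logic (BAdIs)."""
    if source_account == "0000600000":          # hypothetical split rule
        return "61000000" if cost_center else "61000099"
    return map_gl_account(source_account)

print(map_gl_account("0000400010"))                  # -> 41000000
print(map_gl_account_1_to_n("0000600000", "CC10"))   # -> 61000000
print(map_gl_account_1_to_n("0000600000"))           # -> 61000099
```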
Limit data inconsistencies
- Adopt G/L mapping design principles that keep cFIN master data attributes consistent with the source system (e.g. currency); any inconsistencies will result in posting errors later
- The SAP consistency check report (FINS_CFIN_CC) can be leveraged for this. However, consider building custom programs for exhaustive checks (e.g. the reconciliation account type is not covered in the standard report, and a custom check need not rely on RFC connectivity)
- SAP requires master data, configuration and mapping consistency to ensure data consistency in cFIN for automatic G/L derivation processes, e.g. special G/L. Consider building a checker tool to identify inconsistencies prior to posting (a simplified checker is sketched after this list)
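The sketch below illustrates the kind of checker tool mentioned above, assuming attribute extracts from the source and cFIN systems are already available. The accounts, attributes and values are invented for illustration; a real check would read the actual master data tables and cover many more attributes.

```python
# Minimal sketch of a custom consistency checker, assuming we already hold
# extracts of G/L master attributes from the source and cFIN systems plus the
# account mapping. Attribute names and sample values are illustrative.

source_gl = {
    "0000400000": {"currency": "GBP", "recon_type": ""},
    "0000140000": {"currency": "GBP", "recon_type": "D"},   # customer recon account
}
cfin_gl = {
    "41000000": {"currency": "GBP", "recon_type": ""},
    "14000000": {"currency": "EUR", "recon_type": ""},      # inconsistent on purpose
}
GL_MAPPING = {"0000400000": "41000000", "0000140000": "14000000"}

CHECKED_ATTRIBUTES = ["currency", "recon_type"]

def find_inconsistencies():
    """Yield (source acct, central acct, attribute, source value, cFIN value)."""
    for src, tgt in GL_MAPPING.items():
        for attr in CHECKED_ATTRIBUTES:
            src_val, tgt_val = source_gl[src][attr], cfin_gl[tgt][attr]
            if src_val != tgt_val:
                yield src, tgt, attr, src_val, tgt_val

for issue in find_inconsistencies():
    print("Inconsistency:", issue)
# -> ('0000140000', '14000000', 'currency', 'GBP', 'EUR')
# -> ('0000140000', '14000000', 'recon_type', 'D', '')
```

Running such a check before every posting cycle surfaces the inconsistencies that would otherwise appear later as IL posting errors.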
Define optimal IL Groups
- An IL group represents the combination of logical system and source company codes at which the IL activities are performed
- SAP recommends smaller IL groups, as they reduce processing time through parallel runs; however, the effort of monitoring the jobs increases with the number of IL groups
- Roll-back activities (if required, e.g. for erroneous postings) for extracted, simulated or posted data are performed at this level
- Therefore, define IL groups that work for your project needs, weighing parallelization and monitoring effort against limiting the impact on data activities in the unlikely event a roll-back is required (a simple partitioning sketch follows this list)
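The following sketch illustrates the trade-off in code terms: it carves a hypothetical (logical system, company code) scope into IL groups of a chosen size and shows how the number of parallel streams to monitor grows as groups get smaller. The system names, company codes and group naming are made up for illustration.

```python
# Minimal sketch of carving (logical system, company code) pairs into IL
# groups of a chosen size, to weigh parallel throughput against the number of
# job streams to monitor. System/company code values are illustrative.

from itertools import islice

scope = [("ECCCLNT100", cc) for cc in ("1000", "1010", "2000", "2100", "3000")] + \
        [("ECCCLNT200", cc) for cc in ("4000", "4100")]

def build_il_groups(pairs, codes_per_group):
    """Split the (logical system, company code) scope into IL groups per source system."""
    groups = []
    for system in sorted({s for s, _ in pairs}):
        codes = iter(sorted(cc for s, cc in pairs if s == system))
        index = 0
        while chunk := list(islice(codes, codes_per_group)):
            index += 1
            groups.append({"il_group": f"{system}_G{index:02d}",
                           "system": system, "company_codes": chunk})
    return groups

for size in (1, 3):
    groups = build_il_groups(scope, size)
    print(f"{size} company code(s) per group -> {len(groups)} parallel streams to monitor")
    for g in groups:
        print("  ", g["il_group"], g["company_codes"])
```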
Extraction process
During this step the data extracted from the legacy system is staged into the cFIN staging tables before being transformed and posted in subsequent steps. Despite recent improvements by SAP in this area, we advise careful consideration of the points below to ensure completeness:
- Network connectivity – Ensure reliable network connectivity to the source system servers so that extraction jobs run without failures; this also reduces extraction time when data volumes are large
- Completeness of records extraction – An incomplete extraction becomes a time killer during reconciliation. There can be missing documents that are neither extracted during the Initial Load nor replicated via AIF. Mitigate this by implementing the relevant SAP OSS notes and running the extraction programme multiple times (delta loads) to cover documents posted during the packaging phase. SAP support may also be required to identify other missing documents, with manual resolution driven from the IL error logs (a simple completeness check is sketched after this list)
- Cutover window – To save time on delta loads and reconciliation, the extraction cutover window should ideally fall in a period of minimal postings, e.g. weekends or nights (consider business time zones)
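A simple way to reason about extraction completeness is a set comparison between the document keys known in the source and those that have arrived in cFIN. The sketch below shows the idea with invented document numbers; in practice both sets would come from system extracts for the in-scope company codes and periods.

```python
# Minimal sketch of a completeness check between the source document list and
# what has landed in the cFIN staging/replication layer. Document numbers are
# illustrative; in practice both sets would come from system extracts.

source_documents = {("1000", "2021", "1900000001"),
                    ("1000", "2021", "1900000002"),
                    ("1000", "2021", "1900000003")}

staged_or_replicated = {("1000", "2021", "1900000001"),
                        ("1000", "2021", "1900000003")}

missing = sorted(source_documents - staged_or_replicated)
print(f"{len(missing)} document(s) missing from cFIN:")
for company_code, year, doc_no in missing:
    print(f"  company code {company_code}, fiscal year {year}, document {doc_no}")
# Missing documents are candidates for a delta extraction run or manual follow-up.
```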
Transformation process
During this step the defined transformation rules (mapping and enrichment logic) are applied to the extracted source data. Run mapping and posting simulation jobs (smoke testing for Controlling data) to identify and resolve errors quickly before the posting run, and aim to resolve as many critical errors as possible at this stage. Consider the SAP technical limitations below during simulation runs:
- A high number of N:1 mappings is not supported during simulation runs if the system memory parameters are low; potential mitigations are increasing the memory parameters or using BAdIs
- Simulation error logs cannot be downloaded beyond roughly 3-5 million errors; simulation jobs need to be cancelled once the threshold is reached, the errors resolved and the jobs re-triggered (a chunked log summary is sketched after this list)
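When the error log is too large to download in one piece, summarizing it in chunks still lets the team prioritize the biggest error clusters. The sketch below shows the idea with a stand-in chunk reader and invented error identifiers; it is not an SAP log format.

```python
# Minimal sketch of summarizing a very large simulation error log in chunks,
# so the biggest error clusters can be prioritized without downloading the
# whole log at once. The chunk reader and error IDs are illustrative stand-ins.

from collections import Counter

def read_error_log_in_chunks():
    """Stand-in generator yielding the error log one chunk at a time."""
    yield [{"error_id": "ERR_GL_MAPPING", "doc": "1900000001"},
           {"error_id": "ERR_GL_MAPPING", "doc": "1900000002"},
           {"error_id": "ERR_DOC_SPLIT",  "doc": "1900000003"}]
    yield [{"error_id": "ERR_DOC_SPLIT",  "doc": "1900000004"},
           {"error_id": "ERR_GL_MAPPING", "doc": "1900000005"}]

error_totals = Counter()
for chunk in read_error_log_in_chunks():        # only one chunk in memory at a time
    error_totals.update(entry["error_id"] for entry in chunk)

print("Largest error clusters to tackle first:")
for error_id, count in error_totals.most_common():
    print(f"  {error_id}: {count} occurrence(s)")
```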
Load process
During this step the source data is posted into the universal journal after the data transformation rules have been applied and the system validations passed. Consider the points below when deciding when to switch on live replication of documents:
- IL and AIF replication for FI/CO can run in parallel – Successful completion of the IL is not a prerequisite for switching on live replication of documents (via SLT/AIF), as the two processes have different posting mechanisms. The IL posts the legacy data extracted into the cFIN staging tables (CFIN_ACCHD, CFIN_ACCIT etc.) to the cFIN universal journal, whereas AIF uses the database trigger mechanism to populate the staging tables in the source system (CFIN_ACCHD, CFIN_ACCIT etc.), from which the documents replicate to the cFIN universal journal
- An IL posting success rate of 95-98% can be used as a benchmark to ensure that AIF errors remain relatively low once replication is activated
Validate
During this step error logs are monitored and resolved and the data is reconciled. Below are the key lessons learnt from our experience of several test cycles and hypercare activities (with mappings still evolving on the business side):
Error logs review
- For management status reporting – consider building dashboards in SAC or Tableau with quantified mapping and master data metrics (e.g. mapping readiness, master data quality)
- For efficient day-to-day error resolution – consider combining the standard error reports (e.g. RFINS_CFIN_DISPLAY_LOG) into a single log file with error categorization, so that issues can be routed efficiently to the right project team members (a simple routing sketch follows this list)
- It is important to understand the SAP error report terminology for efficient error resolution: "Count of errors" refers to the number of unique errors for a message variable, while "Number of occurrences" refers to the total number of work packages affected by a given error message
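The sketch below shows one way to merge several error log extracts into a single categorized worklist for routing, as suggested above. The log entries, keyword rules and owning teams are illustrative assumptions, not the structure of RFINS_CFIN_DISPLAY_LOG output.

```python
# Minimal sketch of merging error log extracts into one categorized worklist
# for routing. The log entries, category keywords and owner assignments are
# illustrative assumptions, not an SAP report format.

log_extracts = [
    [{"doc": "1900000001", "message": "No mapping found for G/L account 400010"},
     {"doc": "1900000002", "message": "Document splitting: constant not assigned"}],
    [{"doc": "1900000003", "message": "Trading partner missing for account 140000"}],
]

ROUTING_RULES = [  # (keyword in message, category, owning workstream)
    ("mapping",         "Mapping",            "Data harmonization team"),
    ("splitting",       "Document splitting", "G/L design team"),
    ("trading partner", "Master data",        "Master data team"),
]

def categorize(message):
    for keyword, category, owner in ROUTING_RULES:
        if keyword in message.lower():
            return category, owner
    return "Uncategorized", "Triage"

worklist = []
for extract in log_extracts:                    # merge all extracts into one list
    for entry in extract:
        category, owner = categorize(entry["message"])
        worklist.append({**entry, "category": category, "owner": owner})

for item in worklist:
    print(f"{item['doc']}  [{item['category']:<18}] -> {item['owner']}: {item['message']}")
```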
Types of Errors
- The majority of our errors related to source data failing the data model validations in cFIN (e.g. missing trading partner, functional area). These were resolved through complex mapping or data enrichment rules
- Special G/L indicator errors – Conflicts between special G/L indicators and complex mapping rules meant spending a considerable amount of time harmonizing and standardizing the set-up. Inconsistent source documents posted with different special G/L indicators required additional configuration in cFIN
- Certain errors required process changes in the source systems so that ongoing replication would pass the cFIN validations and post
- Certain inconsistent documents in the source systems resulted in posting errors and required manual journals in cFIN, e.g. inconsistent documents where G/L attributes had been switched
- Document splitting errors – Depending on the source company code set-up, the complexity of the source documents and the business reporting requirements, consider the options: standard set-up (with or without default constants), custom development (for complex source documents), or subsequent implementation of document splitting (check the SAP prerequisites)
- Profitability Analysis errors – Consider using BAdIs to populate missing CO-PA characteristics in cases such as specific billing document scenarios, missing profitability segments or missing quantity updates in cFIN
- The SAP program FINS_MASS_DATA_TEST is a timesaver for error resolution, as it highlights the source documents causing the errors
Data reconciliation
- Lock down the reconciliation scope with the client as early as possible
- Standard SAP comparison reports do not fully work for the IL, as most postings (aggregated G/L balances) have no link back to the source documents; they work well for online replicated documents
- Consider building custom reports to reconcile data wherever complex transformation logic has been applied (a minimal reconciliation sketch follows this list)
- ETL tools such as SAP Data Services (SAP DS) can be useful where ad hoc queries are needed to support the reconciliation
- If the number of legal entities is high, consider automating the reconciliation outputs (e.g. with Excel macros) so that files for multiple entities can be generated in minutes
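To illustrate a custom reconciliation with transformation logic applied, the sketch below re-maps hypothetical source balances to central G/L accounts and compares them with the balances posted in cFIN per company code. All accounts, amounts and the mapping are invented; a real report would read the source totals and the universal journal.

```python
# Minimal sketch of a reconciliation with transformation logic applied:
# source balances are re-mapped to central G/L accounts and compared with the
# cFIN balances per company code. All figures and mappings are illustrative.

from collections import defaultdict

GL_MAPPING = {"0000400000": "41000000", "0000400010": "41000000",
              "0000500000": "51000000"}

source_balances = [  # (company code, source G/L, amount)
    ("1000", "0000400000", 1000.0),
    ("1000", "0000400010",  250.0),
    ("1000", "0000500000", -400.0),
]
cfin_balances = {  # (company code, central G/L) -> amount posted by the IL
    ("1000", "41000000"): 1250.0,
    ("1000", "51000000"): -390.0,     # deliberately off by 10 for illustration
}

def reconcile():
    """Return the (company code, account) keys where cFIN differs from the mapped source."""
    expected = defaultdict(float)
    for company_code, src_account, amount in source_balances:
        expected[(company_code, GL_MAPPING[src_account])] += amount
    differences = []
    for key in sorted(set(expected) | set(cfin_balances)):
        delta = round(cfin_balances.get(key, 0.0) - expected.get(key, 0.0), 2)
        if delta:
            differences.append((key, expected.get(key, 0.0),
                                cfin_balances.get(key, 0.0), delta))
    return differences

for (company_code, account), exp, act, delta in reconcile():
    print(f"CoCd {company_code} G/L {account}: expected {exp}, in cFIN {act}, delta {delta}")
# -> CoCd 1000 G/L 51000000: expected -400.0, in cFIN -390.0, delta 10.0
```

The same pattern scales to per-entity output files, which is where automating the generation (rather than producing each file manually) pays off when the number of legal entities is high.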
Ending note
The IL remains a focus area for SAP, with incremental improvements in every new release, as more clients adopt or consider cFIN as their deployment option of choice.
The latest releases (1909 and 2020) have brought improvements to the Initial Load functionality, e.g. supporting additional dimensions from classic G/L for the balance load, taking document splitting information into account, supporting a different fiscal year variant for the non-leading ledger, providing more flexibility with CO-PA (such as a standard feature for mapping characteristics), and central budgeting and availability control for internal orders. These improvements indicate that SAP is closing the remaining gaps towards making cFIN the single source of both financial and management reporting.
In upcoming SAP releases we would like to see improvements in the error and reconciliation reports, and dashboards for reporting statistics, as these would accelerate the testing and cutover cycles. We would also like to see further improvements in the Profitability Analysis Initial Load, which could reduce the custom development effort.
We look forward to covering more cFIN functional topics (e.g. document splitting, Profitability Analysis) and experiences in subsequent blogs.