Murex Reporting Design Principles & Guidelines

The following design principles and guidelines can be referred to when creating a Datamart table, or a report based on the Datamart, in a Murex application.

Client-specific nomenclature may vary, but the core approach remains the same, hence the 'principles':

  1. Check whether the required trades are already captured in key tables such as the transaction, profit & loss, and cash flow tables. Create a subset table only if the deals are not captured in these key tables, or if the table requires new user functions (parser functions via horizontal fields) that do not exist in them.
  2. Always create dynamic tables under User dynamic tables, which can be referenced by batches, whereas User additional dynamic tables should be made accessible to users for their online runs.
  3. Avoid using select * from table in your report logic: if someone changes the table structure (adding, deleting, or modifying a column), the reports or downstream processing dependent on this table could fail. Therefore, always select the individual fields from the source table in the extract.
  4. Avoid unnecessary trims in the join conditions, as applying functions to join columns can prevent the database from using indexes.
  5. Follow table naming conventions such as DYN_XXX_REP for dynamic-table-based reports, DBF_XXX_REP for SQL-based reports, and TEMP_XXX_REP for any temporary tables.
  6. Likewise, follow the conventions BF_XXX for a Batch of Feeder, FEED_XXX for a Table Feeder, BE_XXX for a Batch of Extraction, PS_XXX for a Processing Script, GF_XXX for a Global Filter, and H_XXX for any parser-function-based horizontal field in a dynamic table.
  7. Don't use too many parser functions in a dynamic table, as they may hinder the performance of that table's publishing. Weigh the performance cost of these tables against the value of the user fields: these tables can process millions of records in every end-of-day batch publishing run, so overusing these fields could impact the runtime SLA.
  8. For simulation-based tables, ensure that the simulation view can only be updated by a designated group (an IT group, for example) to enforce change management.
  9. Always create index definitions in Datamart management. Function-based indexes can be created at the database level.
  10. If an existing Datamart table structure is changed, assess the impact on downstream systems to catch any unforeseen processing issues during the EOD. For instance, query user_source/all_views to check which other user tables, reports, or database views expose the impacted column.
  11. Avoid basing reporting extractions on financial database tables such as the transaction header table, or any real-time tables impacted by online trading activity. Likewise, avoid using post-trade workflow tables such as STPDOC_ENTRY* and other STP* tables, since they are in constant use for straight-through processing of business objects such as payments, events, market operations, and transactions.
  12. Avoid feeding a Datamart table using a batch of Datamart extraction. During the CLS settlement initiative, the data populated in the table turned out to be inconsistent for forex trades because the dynamic table filter configuration interfered with the extract logic. The conventional 2-step approach is recommended: configure a feeder and an extraction independently, then populate the underlying table by running the feeder first, followed by the extract run.
  13. Avoid exposing native audit tables to any downstream system; they should remain exclusive to the source system because they contain crucial system-specific metrics.
  14. Understand the purpose of each report requested by the respective Business Units and check whether it is already covered by existing reports, to avoid redundant builds.
  15. Always assess the performance impact on the system if a Business Unit requests multiple runs of a report per day, especially for system-intensive cashflow queries.
  16. Use the Datamart extraction framework for report creation as much as possible, so that configurations are maintained in Murex. Create stored procedures only for exceptional cases involving complex computations.
  17. Parametrize the Datamart extracts with entity and other SQL expression tags for reusability.
  18. For any extract sent to a downstream system, ensure that headers and footers are properly defined with a row count to facilitate sanity checks.
  19. Also avoid using post commands in the batch of extraction to transfer files. This could lead to accidental deletion of files in the backend, or the job could fail due to a permission issue. Instead, handle file transfer at the job level in a scheduler such as Control-M.
  20. Irrespective of build complexity or trade volume, a report should not take more than 5 minutes; otherwise, optimize it. Validate indexes and run an SQL explain plan to assess the costly joins. Try running the report for a single trade to determine whether the slowness comes from poorly written SQL or from the trade volume.
  21. Create the extracts with a computing-date suffix (via either the batch of extraction file settings or the Control-M order date feature) and housekeep the generated files to keep a trail of generated jobs for debugging.
  22. Assign proper in-conditions and out-conditions in the Control-M job to ensure the report is correctly placed in the EOD sequence. Choose the job's run timing during the EOD carefully to minimize slowness on the system.
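Principle 3 can be illustrated with a small, self-contained sketch. This uses SQLite and Python rather than the Murex Datamart database, and the table and column names are hypothetical, chosen only to show how a schema change silently breaks consumers of select *:

```python
import sqlite3

# Hypothetical table standing in for a Datamart dynamic table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dyn_trade_rep (trade_id INTEGER, portfolio TEXT, notional REAL)")
conn.execute("INSERT INTO dyn_trade_rep VALUES (1, 'FX_DESK', 1000000.0)")

# Downstream logic that relies on column positions from SELECT *.
row = conn.execute("SELECT * FROM dyn_trade_rep").fetchone()
assert row[2] == 1000000.0          # fragile: assumes notional is the 3rd column

# The table structure changes: a column is added.
conn.execute("ALTER TABLE dyn_trade_rep ADD COLUMN currency TEXT")

# SELECT * now returns 4 columns; any consumer expecting exactly 3
# fields per record (fixed-width files, positional parsers) breaks.
star_row = conn.execute("SELECT * FROM dyn_trade_rep").fetchone()
print(len(star_row))                # 4, not the 3 the downstream expected

# Explicit column selection keeps the extract stable across the change.
safe_row = conn.execute(
    "SELECT trade_id, portfolio, notional FROM dyn_trade_rep"
).fetchone()
print(safe_row)                     # (1, 'FX_DESK', 1000000.0)
```

The same reasoning applies to column deletions and renames: an explicit field list fails loudly at extract time instead of shipping shifted data downstream.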
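The header/footer row-count idea in principle 18 can be sketched as follows. The record layout (H/D/T markers, pipe delimiter) is a common convention but hypothetical here, not a Murex-mandated format:

```python
import csv
import io

def write_extract(rows, report_name="DBF_TRADE_REP"):
    """Wrap detail rows with a header and a trailer carrying the row count."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="|", lineterminator="\n")
    writer.writerow(["H", report_name])        # header record
    for row in rows:
        writer.writerow(["D"] + list(row))     # detail records
    writer.writerow(["T", len(rows)])          # trailer with detail row count
    return buf.getvalue()

def sanity_check(extract_text):
    """Downstream-side check: trailer count must match the detail records."""
    lines = extract_text.strip().splitlines()
    details = [line for line in lines if line.startswith("D|")]
    trailer_count = int(lines[-1].split("|")[1])
    return len(details) == trailer_count

extract = write_extract([(1, "FX"), (2, "IRS")])
print(sanity_check(extract))                   # True
```

A truncated file transfer then fails the receiving system's sanity check immediately, instead of being processed as a complete extract.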
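Principle 17's parametrization can be sketched with a simple template. The tag names (ENTITY, COB_DATE) and the query are hypothetical placeholders, not actual Murex SQL expression tags, but they show how one extract definition is reused across entities and dates:

```python
from string import Template

# One extract definition, parametrized by entity and computing date.
EXTRACT_SQL = Template(
    "SELECT trade_id, portfolio, notional "
    "FROM dyn_trade_rep "
    "WHERE entity = '$ENTITY' AND trade_date = '$COB_DATE'"
)

def build_extract(entity, cob_date):
    """Resolve the tags for one run; each entity/date reuses the same config."""
    return EXTRACT_SQL.substitute(ENTITY=entity, COB_DATE=cob_date)

print(build_extract("LONDON", "2024-06-28"))
print(build_extract("SINGAPORE", "2024-06-28"))
```

Within Murex itself the substitution is handled by the extraction framework's own tags; the point is simply that nothing entity-specific should be hard-coded in the extract body.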
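The explain-plan step in principle 20 can be demonstrated with SQLite's EXPLAIN QUERY PLAN (the databases underlying a Murex installation have their own equivalents, e.g. Oracle's EXPLAIN PLAN). The table and index names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dyn_trade_rep (trade_id INTEGER, portfolio TEXT)")

QUERY = "SELECT * FROM dyn_trade_rep WHERE trade_id = 42"

# Without an index, the plan shows a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchall()[0][-1]
print(plan_before)        # a SCAN of the whole table

# After creating the index, the same query is resolved via an index search.
conn.execute("CREATE INDEX idx_trade_id ON dyn_trade_rep (trade_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchall()[0][-1]
print(plan_after)         # a SEARCH using idx_trade_id
```

Reading the plan before and after adding an index is usually the quickest way to tell whether slowness comes from a missing index and costly joins, or simply from volume.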
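The date-suffixing and housekeeping in principle 21 can be sketched as below; in practice the suffix comes from the batch of extraction file settings or Control-M's order date, and the purge would run as a scheduled job. File names and the retention window are illustrative assumptions:

```python
import os
import tempfile
from datetime import date, timedelta

def dated_filename(report_name, cob_date):
    """Suffix the extract with the computing date, e.g. DBF_TRADE_REP_20240628.csv."""
    return f"{report_name}_{cob_date.strftime('%Y%m%d')}.csv"

def housekeep(directory, report_name, keep_days, today):
    """Delete this report's extracts older than the retention window."""
    cutoff = (today - timedelta(days=keep_days)).strftime("%Y%m%d")
    for fname in os.listdir(directory):
        if fname.startswith(report_name + "_"):
            suffix = fname.rsplit("_", 1)[-1].split(".")[0]
            if suffix < cutoff:      # lexicographic compare works for YYYYMMDD
                os.remove(os.path.join(directory, fname))

# Usage sketch: ten daily extracts, then purge beyond a 7-day retention.
workdir = tempfile.mkdtemp()
today = date(2024, 6, 28)
for d in range(10):
    name = dated_filename("DBF_TRADE_REP", today - timedelta(days=d))
    open(os.path.join(workdir, name), "w").close()

housekeep(workdir, "DBF_TRADE_REP", keep_days=7, today=today)
print(len(os.listdir(workdir)))      # 8: files within retention remain
```

Keeping a dated trail like this is what makes it possible to rerun or diff a specific day's extract when debugging an EOD issue.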


#MurexReporting #Datamart #CapitalMarkets #Reporting #LinkedInArticle #MurexNetwork #Article #Murex
