Insights of Business Intelligence (BI)

Self Sourced #Hrishikesh Sharma

What is Business Intelligence (BI)

Business intelligence (BI) is a process that uses data analysis to provide actionable information to executives, managers, and employees. BI is a combination of strategy and technology that gathers, analyzes, and interprets data from internal and external sources.

The ultimate goal of BI initiatives is to drive better business decisions that enable organizations to increase revenue, improve operational efficiency and gain competitive advantages over business rivals.

To achieve that goal, BI incorporates a combination of analytics, data management and reporting tools, plus various methodologies for managing and analyzing data. BI tools can extract, transform, and present data to help with data analysis, trend identification, and strategic decision-making.

Business intelligence (BI) can help businesses to:


  • Improve decisions
  • Identify problems
  • Spot market trends
  • Find new opportunities
  • Improve ROI
  • Understand customer behaviour


Business intelligence (BI) tools can access different types of data, including historical and current, third-party and in-house, as well as semi-structured data and unstructured data like social media. BI tools can produce business analytics that reveal patterns in historical and current data, as well as predictive modeling to peer into the future.

To understand BI there are some prerequisite concepts; I have explained them briefly in the final section of this article.

Business Intelligence (BI) Evolution Chronology


Steps Involved in Business intelligence (BI)

The steps below provide a general framework for the BI process, but it is important to tailor the approach to the specific needs and requirements of each organization. Additionally, BI is an iterative process, with continuous refinement and improvement based on feedback and changing business needs.

Business Intelligence (BI) includes a variety of processes and methodologies aimed at collecting, analyzing, and presenting data to support decision-making within an organization. While the specific steps involved in BI can vary depending on the organization's needs and the complexity of the data environment, here are some common steps typically involved in the BI process:



  1. Data Collection: The first step in the BI process is to gather data from various internal and external sources. This may include transactional databases, spreadsheets, CRM systems, ERP systems, social media platforms, and external data sources. Data can be structured (e.g., databases) or unstructured (e.g., text files, images).
  2. Data Integration: Once data is collected, it needs to be integrated into a single, unified data repository. This involves cleaning, transforming, and standardizing data to ensure consistency and accuracy across different sources. Data integration may also involve resolving data quality issues, such as missing values, duplicates, or inconsistencies.
  3. Data Warehousing: In many cases, organisations use data warehouses or data marts to store and manage their integrated data. Data warehouses are centralised repositories that store historical and current data from multiple sources in a structured format optimised for analytics and reporting.
  4. Data Modelling: Data modelling involves designing the structure and relationships of the data stored in the data warehouse. This includes defining tables, columns, relationships, and hierarchies to support specific BI use cases and analytical queries. Common data modelling techniques include dimensional modelling and entity-relationship modelling.
  5. Data Analysis: With the data in place, organizations can perform various types of analysis to derive insights and make informed decisions. This may include descriptive analysis (e.g., summarizing data), diagnostic analysis (e.g., explaining why trends and patterns occurred), predictive analysis (e.g., forecasting future outcomes and what-if scenarios), and prescriptive analysis (e.g., recommending actions). BI programs often incorporate forms of advanced analytics, such as data mining, predictive analytics, text mining, statistical analysis, and big data analytics. A common example is predictive modelling that enables what-if analysis of different business scenarios. In most cases, though, advanced analytics are conducted by separate teams of data scientists, statisticians, predictive modelers, and other skilled analytics professionals, while BI teams oversee more straightforward querying and analysis of business data.
  6. Data Visualization / Reporting and Dashboarding: Data visualization is the process of representing data visually through charts, graphs, dashboards, and other visualizations. Visualization tools allow users to explore data, identify trends, and communicate insights effectively. Visualization plays a crucial role in making complex data accessible and understandable to decision-makers. Reports typically present summary-level data and insights, while dashboards provide real-time visibility into key performance indicators (KPIs) and metrics.
  7. Data Exploration and Discovery: Data exploration involves exploring and analyzing data to uncover hidden patterns, correlations, and insights. This may involve interactive exploration of data using ad-hoc queries, drill-down analysis, and data discovery tools to gain a deeper understanding of the underlying data.
  8. Deployment and Maintenance: Once BI solutions are developed, they need to be deployed to end-users and maintained over time. This involves ongoing monitoring, performance tuning, and updates.
  9. User Training and Adoption: Finally, organizations must provide training and support to end-users to ensure that they can effectively use BI tools and solutions to make data-driven decisions.
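To make the flow concrete, here is a minimal, self-contained Python sketch of the collect, integrate, and analyze steps described above. The in-memory CSV, table names, and figures are all invented for illustration; SQLite stands in for a real warehouse:

```python
import csv
import io
import sqlite3

# --- Collect: read raw sales records (an in-memory CSV stands in for a real source) ---
raw = io.StringIO(
    "region,product,amount\n"
    "North,Widget,100\n"
    "North,Widget,150\n"
    "South,Gadget,200\n"
)
rows = list(csv.DictReader(raw))

# --- Integrate/transform: standardize types and drop records with a missing amount ---
clean = [
    {"region": r["region"].strip(), "product": r["product"].strip(), "amount": float(r["amount"])}
    for r in rows
    if r["amount"]
]

# --- Load: store the cleaned data in a warehouse-style table ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (:region, :product, :amount)", clean)

# --- Analyze/report: the kind of summary query a BI dashboard might run ---
summary = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(summary)  # [('North', 250.0), ('South', 200.0)]
```

Real pipelines add scheduling, incremental loads, and data-quality checks, but the extract, transform, load, and query shape stays the same.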


Companies that effectively employ BI tools and techniques can translate their collected data into valuable insights about their business processes and strategies. Such insights can then be used to make better business decisions that increase productivity and revenue, leading to accelerated business growth and higher profits.


BI Tools Market Share

As per FinancesOnline, the projected market share of BI tools is given below:

https://financesonline.com/business-intelligence-software-statistics/


Popular BI Tools

Depending on the industry and the needs of the organization, different sets of BI tools are popular. I am listing the tools that are commonly used in banking and financial organizations. Popularity varies with specific requirements such as cloud vs. on-premise deployment, integration with the existing ecosystem, domain leadership, etc.

#1. Microsoft Power BI (By Microsoft)

Power BI integrates seamlessly with other Microsoft products like Excel, Microsoft 365, and Azure. This allows you to combine data from various sources and leverage Azure's AI capabilities for deeper insights. Microsoft Power BI is a powerful tool that can help businesses of all sizes gain valuable insights from their data. It has components like Power BI Desktop, Power BI Service, etc.

#2. Tableau (By Salesforce)

Tableau was acquired by Salesforce in June 2019 and is currently owned by Salesforce.

Tableau is known for its drag-and-drop functionality, making it accessible to people with varying levels of technical expertise. It is very popular and one of the leading BI tools in the market.

#3. Qlik Sense and QlikView

Qlik is a leader in the BI industry, serving over 40,000 customers globally across various industries. Qlik competes with other major BI players like Microsoft Power BI and Tableau.

Qlik Sense and QlikView are both business intelligence (BI) tools developed by Qlik, but they cater to different user needs and approaches to data analysis.

QlikView is a first-generation analytics platform that supports visual data discovery, self-service BI reporting, and the development and sharing of data dashboards. Qlik Sense is a modern analytics solution that supports more free-form analytics and allows users to build data and web applications through API connections.

QlikView: choose it to quickly develop controlled, guided analytics applications, and when developers prefer granular design control.

Qlik Sense: choose it to prioritize user-friendly self-service analytics for a mix of technical and non-technical users, and when you value a modern, touch-friendly interface with good integration capabilities.

#4. SAP BusinessObjects BI (Aka SAP BI)

SAP BusinessObjects Business Intelligence is a centralised suite for data reporting, visualization, and sharing. As the on-premise BI layer for SAP's Business Technology Platform, it transforms data into useful insights, available anytime, anywhere.

SAP BO is a front-end BI platform, so the data is not stored at the application level, but is integrated from the various back-end sources. SAP BI/BW is the technological part where data is stored and analytical tools are available for analysis.

SAP BusinessObjects BI integrates seamlessly with other SAP software products and ERPs. It also supports integration with third-party data sources and applications through connectors and APIs.

Organizations already invested in the SAP ecosystem and those dealing with substantial data volumes are good candidates to adopt SAP BI.

#5. Looker (by Google)

Looker is a cloud-based business intelligence (BI) and analytics platform acquired by Google Cloud in 2020. Looker was founded in 2011 and is based in California, United States.

Looker has deep integration with other Google Cloud services, including BigQuery, Google Cloud Storage, Google Sheets, and Google Data Studio. This enables seamless data integration and analytics workflows within the Google Cloud ecosystem.

Looker stands out for its multi-cloud flexibility. You can choose to deploy Looker on Google Cloud Platform (GCP) or leverage it with other cloud providers like Amazon Web Services (AWS) or Microsoft Azure, depending on your existing infrastructure.

#6. SAS Business Intelligence

SAS stands for Statistical Analysis System. It's a collection of software programs that can store, retrieve, and modify data. SAS can also perform statistical analyses, create reports, and produce graphics.

SAS BI integrates with SAS's powerful statistical and analytical tools, enabling users to perform in-depth data mining, forecasting, and predictive analytics.

It is mainly used by organizations that have already invested heavily in the SAS software suite or SAS ecosystem.

#7. IBM Cognos Analytics (Cognos Analytics with Watson)

IBM Cognos Analytics with Watson (aka Cognos Analytics, and formerly known as IBM Cognos Business Intelligence) is a web-based integrated business intelligence suite by IBM. It provides a toolset for reporting, analytics, score-carding, and monitoring of events and metrics.

#8. Oracle BI

Oracle Business Intelligence (BI) is a comprehensive suite of business analytics tools and applications offered by Oracle Corporation. Some of the offerings include Oracle Essbase (OLAP DB), Oracle BI Enterprise Edition (OBIEE), Oracle Analytics Cloud (OAC), etc.

#9. Other BI Tools

Other notable tools available in the market are:


  • Zoho Analytics
  • Sisense
  • MicroStrategy
  • Datapine
  • Domo
  • BOARD (By Board Tool Kit)
  • Pentaho (By Hitachi)
  • Mathworks (Produces software such as MATLAB and Simulink)

Business Intelligence (BI) Pre-Requisite Concepts

Understanding BI requires familiarity with a few prerequisite concepts, summarized below.

#1. OLAP

OLAP stands for Online Analytical Processing. It is a category of software tools and technologies used to perform multidimensional analysis of data. OLAP enables users to analyze large volumes of data from multiple perspectives quickly and interactively.

Here are some key aspects of OLAP:


  • Multidimensional Analysis: OLAP allows users to analyze data across multiple dimensions or attributes simultaneously. By slicing and dicing data along different dimensions, users can gain insights into trends, patterns, and relationships within the data.
  • Aggregation and Summarization: OLAP systems support aggregation and summarization of data to provide a high-level view of information. Users can drill up or drill down to view data at different levels of granularity, from summary-level aggregates to detailed transactional data.
  • Interactive Analysis: OLAP tools provide interactive capabilities that enable users to explore and analyze data dynamically. Users can manipulate data, apply filters, and perform ad-hoc queries to answer specific business questions and gain insights in real-time.
  • Complex Calculations: OLAP systems support complex calculations within and across dimensions, such as year-over-year trends, market share calculations, and other key performance indicators (KPIs). These calculations can be pre-defined or created on the fly by users.
  • Speed and Performance: OLAP databases are optimized for fast query performance, enabling users to analyze large volumes of data quickly. OLAP systems use specialized data storage and indexing techniques to optimize performance and support analysis.
  • OLAP Cubes: OLAP data is typically stored in multidimensional structures called OLAP cubes or data cubes. These cubes organize data into dimensions, measures, and hierarchies, enabling efficient multidimensional analysis.
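The slicing, dicing, and roll-up behaviour described above can be sketched in a few lines of Python. The `aggregate` helper and the toy fact records below are invented for illustration, but they show how a cube rolls a measure up to any chosen set of dimensions:

```python
from collections import defaultdict

# Toy fact data with three dimensions (year, region, product) and one measure (sales).
facts = [
    {"year": 2023, "region": "North", "product": "Widget", "sales": 120},
    {"year": 2023, "region": "South", "product": "Widget", "sales": 80},
    {"year": 2024, "region": "North", "product": "Widget", "sales": 150},
    {"year": 2024, "region": "North", "product": "Gadget", "sales": 60},
]

def aggregate(facts, dims):
    """Roll the sales measure up to the requested dimensions (a simple cube view)."""
    totals = defaultdict(int)
    for f in facts:
        key = tuple(f[d] for d in dims)
        totals[key] += f["sales"]
    return dict(totals)

# "Roll-up": total sales by year only (region and product collapsed).
by_year = aggregate(facts, ["year"])
print(by_year)         # {(2023,): 200, (2024,): 210}

# "Dice": sales by year and region together.
by_year_region = aggregate(facts, ["year", "region"])
print(by_year_region)  # {(2023, 'North'): 120, (2023, 'South'): 80, (2024, 'North'): 210}
```

A real OLAP engine pre-computes and indexes many of these aggregates so the same questions answer in milliseconds over millions of rows.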



In short, OLAP is a powerful technology for interactive analysis of large volumes of data, enabling users to gain insights, make informed decisions, and drive business performance. It is widely used in business intelligence, data analytics, and decision support systems across various industries.

#2. OLTP

OLTP stands for online transaction processing. OLTP software helps businesses and individuals complete transactions quickly, efficiently, and accurately, and is designed for use by frontline workers like cashiers and tellers.

It refers to a class of systems and technologies used to manage and process transactions in real-time. OLTP systems are designed to support high-volume transactional workloads, such as recording sales, processing orders, updating inventory, and managing customer interactions.

OLTP is typically used with relational database management systems (RDBMS) like Oracle, MS SQL Server, MySQL, etc.

Here are some key aspects of OLTP:


  • Real-Time Processing: OLTP systems are optimized for real-time transaction processing, enabling organizations to capture and process transactions as they occur.
  • Transactional Workloads/ Processing: OLTP systems handle transactional workloads involving frequent insertions, updates, and deletions of data records.
  • Concurrent Access: OLTP systems support concurrent access by multiple users and applications, allowing for simultaneous execution of transactions without causing conflicts or data inconsistencies.
  • ACID Properties: OLTP systems adhere to the principles of ACID (Atomicity, Consistency, Isolation, Durability) to ensure transactional reliability and integrity. Transactions are atomic (i.e., all or nothing), consistent (i.e., maintain data integrity constraints), isolated (i.e., execute independently of other transactions), and durable (i.e., persist changes permanently).
  • Normalized Data Model: OLTP databases typically use a normalized data model to minimize redundancy and optimize data integrity.
  • Indexing and Query Optimization: OLTP systems use indexing and query optimization techniques to facilitate efficient data retrieval and manipulation.


OLTP systems are essential for supporting day-to-day business operations, enabling organizations to process transactions efficiently, maintain data integrity, and provide timely access to critical business information. They are commonly used in various industries, including banking, retail, healthcare, and e-commerce, to support mission-critical business processes and ensure operational efficiency.
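As a small illustration of the ACID properties listed above, the sqlite3 sketch below (a toy example with invented account data, not a production pattern) shows a transfer that either fully commits or fully rolls back:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (name TEXT PRIMARY KEY, "
    "balance REAL NOT NULL CHECK (balance >= 0))"
)
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 100.0), ("bob", 50.0)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move money between accounts atomically: both updates commit, or neither does."""
    try:
        with conn:  # the sqlite3 context manager commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?", (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?", (amount, dst))
    except sqlite3.IntegrityError:
        pass  # the CHECK constraint fired (overdraft); the whole transaction was rolled back

transfer(conn, "alice", "bob", 30.0)   # succeeds: balances become 70 / 80
transfer(conn, "alice", "bob", 500.0)  # violates the CHECK; atomically rolled back

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # {'alice': 70.0, 'bob': 80.0}
```

The failed transfer leaves no partial update behind, which is exactly the atomicity and consistency guarantee OLTP systems provide.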


#3. OLAP vs. OLTP

In short, OLTP systems capture and process day-to-day transactions in real time using normalized schemas optimized for fast writes, while OLAP systems analyze large volumes of historical data using multidimensional structures optimized for complex, read-heavy queries.

#4. Data Modelling

Data models provide a blueprint for designing a new database or re-engineering a legacy application. Data modelling is the process of creating a diagram of a software system and its data elements. It's a central step in software engineering and a critical process in the development of software applications and database systems.
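As an illustrative sketch (all table and column names are invented), dimensional modelling can be as simple as a star schema: descriptive dimension tables arranged around a central fact table that holds the measures:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension tables hold descriptive attributes; the fact table holds measures
# plus foreign keys into each dimension (a classic star schema).
conn.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INT, quarter INT, month INT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_id    INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    units      INTEGER,
    revenue    REAL
);
""")

conn.execute("INSERT INTO dim_date VALUES (1, 2024, 1, 2)")
conn.execute("INSERT INTO dim_product VALUES (10, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (1, 10, 5, 99.5)")

# An analytical query joins the fact table to its dimensions.
row = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_id = d.date_id
    JOIN dim_product p ON f.product_id = p.product_id
    GROUP BY d.year, p.category
""").fetchone()
print(row)  # (2024, 'Hardware', 99.5)
```

This denormalized shape trades some redundancy for queries that are easy to write and fast to aggregate, which is why it dominates BI data modelling.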


#5. Database

A database is a collection of information that is organized for easy access, management, and updating.

A database management system (DBMS) is a set of computer software that allows users to interact with one or more databases.


#6. Data Warehouse

A data warehouse is a system that collects data from multiple sources into a single repository. It's a central repository of information that can be analyzed to make more informed decisions. A data warehouse 'houses' data that has been collected from many disparate sources through the ETL (Extract, Transform, Load) process.


#7. Data Lake

Data lakes can accommodate all types of data, which is then used to power big data analytics, machine learning, and other forms of intelligent action.

Data lakes offer more storage options, have more complexity, and have different use cases compared to a data warehouse.

The high-level layers in a data lake typically include data ingestion, storage, processing, and consumption.


#8. Database vs. Data Warehouse vs. Data Lake

A database is a collection of data that is organized for storage, accessibility, and retrieval. A data warehouse is a type of database that integrates copies of transaction data from different source systems and provisions them for analytical use.

Data warehouses are designed to facilitate reporting and analysis. The rows and columns are typically read-only and maintain historical entry data, not just the most recent entry.


#9. Datamart

A data mart is a data storage system that contains a small, selected part of an organization's data. It's a simple form of data warehouse that focuses on a single business unit, department, or subject area. In other words, it is a subject- or domain-specific subset of a data warehouse.
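A minimal way to picture a data mart is as a department-specific slice carved out of the warehouse. The SQLite sketch below (with invented table names and figures) shows the idea as a simple view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse_sales (dept TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO warehouse_sales VALUES (?, ?, ?)",
    [("Retail", "Widget", 120.0), ("Retail", "Gadget", 80.0), ("Wholesale", "Widget", 500.0)],
)

# A data mart can be as simple as a department-specific view over the warehouse:
# the Retail team sees only its own slice of the data.
conn.execute(
    "CREATE VIEW retail_mart AS "
    "SELECT product, revenue FROM warehouse_sales WHERE dept = 'Retail'"
)

mart_rows = conn.execute("SELECT product, revenue FROM retail_mart ORDER BY product").fetchall()
print(mart_rows)  # [('Gadget', 80.0), ('Widget', 120.0)]
```

In practice a mart is often a physically separate, pre-aggregated store rather than a view, but the subsetting principle is the same.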


#10. Data Mining

Data mining is the process of discovering patterns, trends, correlations, and insights from large datasets using statistical, mathematical, and machine learning techniques. It involves extracting valuable knowledge and actionable information from raw data to support decision-making, strategic planning, and business intelligence.

Data mining is a process that involves:


  • Data Collection: Gathering the raw data from relevant internal and external sources.
  • Data Preprocessing: Cleaning, transforming, and preparing the data for analysis. This includes tasks such as handling missing values, removing duplicates, standardizing formats, and normalizing data to ensure consistency and quality.
  • Exploratory Data Analysis (EDA): This may include generating summary statistics, histograms, scatter plots, and correlation matrices to identify patterns and outliers in the data.
  • Feature Selection and Engineering: Identifying the most relevant features or variables that contribute to the predictive power of the model.
  • Model Selection and Training: Applying suitable machine learning or statistical algorithms and training models on the prepared data.
  • Model Evaluation and Validation: After training the model, evaluate its performance and validate its accuracy using appropriate metrics and techniques.
  • Knowledge Discovery and Interpretation: The final step in data mining is interpreting the results and extracting actionable insights from the discovered patterns and trends. This involves translating the findings into meaningful business recommendations, strategies, or decisions that drive value and impact.
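As a tiny, hypothetical example of the exploratory analysis step above, the snippet below flags outliers in an invented set of transaction amounts using a simple two-standard-deviation rule:

```python
import statistics

# Toy dataset: daily transaction amounts, with one suspicious spike.
amounts = [102.0, 98.5, 101.2, 99.8, 100.4, 97.9, 550.0, 101.1]

mean = statistics.mean(amounts)
stdev = statistics.stdev(amounts)

# Flag points more than 2 standard deviations from the mean as outliers.
outliers = [x for x in amounts if abs(x - mean) > 2 * stdev]
print(outliers)  # [550.0]
```

Real data mining would use more robust methods (the spike itself inflates the mean and standard deviation here), but the loop of compute summary statistics, then flag what deviates, is the essence of exploratory discovery.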




