How to run Data & Analytics projects completely into the ground - and how to prevent it
Does this sound familiar? A new project comes up - reporting, Big Data Analytics, Machine Learning, or a data platform for the company. The controlling department wants to analyze the financial data of the ERP system, the sales & marketing team has to evaluate the CRM data, and production is running its first IoT prototypes. A lot is invariably invested in "modern" technologies - but how successful are these investments?
Digression: Data & Analytics projects fail
Hardly any information about failed Data & Analytics projects reaches the public - why is that? There are plenty of articles on how and why Business Intelligence or Big Data projects fail, but concrete use cases and lessons learned are hard to find. In contrast, failed ERP (Enterprise Resource Planning) implementations at corporations such as LIDL, Haribo, Otto, Deutsche Bank and Deutsche Post are well documented (Link). Even without concrete public reports, we can safely assume: Data & Analytics projects fail too.
Data & Analytics projects differ from ERP projects
Let us stay with this example and make it concrete: Data & Analytics projects differ from ERP projects. An ERP implementation serves to handle the primary process chain effectively and efficiently, viewed backwards from the customer (the customer orders, shipping delivers, production produces, purchasing procures), and to support secondary processes in HR, finance, controlling and IT. Data & Analytics projects built on top of an ERP implementation usually serve to analyze and report on this data. Another important difference: a Data & Analytics project usually has no "hard" end date (yes, there are exceptions!), whereas all participants in an ERP implementation work towards a fixed go-live date. If that deadline is missed, an outcry is inevitable.
How can Data & Analytics projects be kept from failing?
Those responsible for and involved in Data & Analytics projects should know these differences, because they allow for different planning and implementation approaches. But how do you achieve a good result relatively quickly, easily and cost-effectively? In other words, how do you reliably deliver a high-quality result?
"High-quality results" are always very positive user experiences with IT-system supported, data-driven processes (in short: use of analytical applications). To finally deliver a high quality application, a design phase is required - in which the analyst understands the customer's needs, creates the conceptual design and then creates the final product. This is the approach of "Design Thinking" - and I love it.
Design Thinking means developing with the customer in mind
Why? Design Thinking means developing software for a customer from the customer's point of view - that sounds simple. In everyday practice, however, interviews followed by requirement and functional specifications still dominate. For the design phase of Data & Analytics projects, this approach is no longer best practice - the methodology is dated and has been overtaken by newer techniques.
Design Thinking provides a framework to find out who the user is, what drives him and what frustrates him in his daily work. Users are grouped into personas - groups of people with common characteristics - and it is recorded how they view the analytical application and what they really need to improve their performance. A practical example, and one of my most valuable personal experiences: a sales manager wanted to see and analyze customer sales based on each customer's own fiscal year, because this information put him in a stronger position in negotiations. Consequently, the master data had to be extended (see the sketch below). Without the interview and an understanding of his daily work, this function would never have been built.
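To make this requirement tangible, here is a minimal Python sketch of how customer sales could be rolled up by each customer's own fiscal year. All table and column names are hypothetical, not from a real project; the only assumption carried over from the story is that the customer master data was extended by a fiscal year attribute.

```python
import pandas as pd

# Hypothetical customer master data, extended with each customer's
# fiscal year start month - the attribute the requirement made necessary.
customers = pd.DataFrame({
    "customer_id": ["C1", "C2"],
    "fiscal_year_start_month": [4, 10],  # e.g. April and October
})

# Hypothetical sales transactions from the ERP/CRM system.
sales = pd.DataFrame({
    "customer_id": ["C1", "C1", "C2", "C2"],
    "booking_date": pd.to_datetime(
        ["2023-03-15", "2023-05-02", "2023-09-20", "2023-11-01"]
    ),
    "revenue": [1000.0, 2500.0, 1800.0, 900.0],
})

df = sales.merge(customers, on="customer_id")

# A booking before the fiscal year start month still belongs to the
# fiscal year that began in the previous calendar year.
df["fiscal_year"] = df["booking_date"].dt.year - (
    df["booking_date"].dt.month < df["fiscal_year_start_month"]
).astype(int)

report = df.groupby(["customer_id", "fiscal_year"])["revenue"].sum().reset_index()
print(report)
```

For customer C1 (fiscal year starting in April), the March 2023 booking correctly lands in fiscal year 2022, while the May booking lands in fiscal year 2023.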
Once the user's needs are understood, the conceptual design phase begins with storyboards, wireframes, visual design and the presentation of the use case. Direct consultations with open questions to the user are allowed and explicitly encouraged - the user should understand what he will get. More often than not, users refine their actual needs in the process.
Subsequently, the results of the conceptual phase are used to create the final productive design. This includes interactions with the analytical application via the end device, visual specifications, and formal capture as a user story for the developers.
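What such a formal capture might contain can be sketched in a few lines. The structure below is purely illustrative - real teams typically capture user stories in a ticket system, and every field name here is an assumption, not a standard:

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """Hypothetical structure for formally capturing a user story."""
    persona: str                      # who the user is
    need: str                         # what the user wants to accomplish
    benefit: str                      # why it improves their performance
    target_device: str                # how the application will be used
    visual_spec: str                  # reference to wireframes / visual design
    acceptance_criteria: list[str] = field(default_factory=list)

story = UserStory(
    persona="Sales manager",
    need="analyze customer sales based on each customer's own fiscal year",
    benefit="enter negotiations with better information",
    target_device="desktop dashboard",
    visual_spec="wireframe-042 (illustrative reference)",
    acceptance_criteria=[
        "Revenue is aggregated per customer fiscal year, not per calendar year",
        "The fiscal year start month is taken from the extended master data",
    ],
)
```

Such a structured capture keeps the customer's language intact while giving developers unambiguous acceptance criteria to build against.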
And yes, you read that correctly - up to this point not a single line of code has been written, no Data & Analytics tool has been touched, and no Data Lake, Lakehouse or Data Warehouse has been built. Development only begins once the customer knows what he is getting and the analyst has documented the requirements in such a way that the developers understand them from the customer's point of view.
Implementation? Provide the organizational and technical framework
Once user stories are ready for implementation, they can be developed. A prerequisite is that organizational and technical guidelines are in place. It is best to describe the architectural solution and technical components directly in the appendix of the user story - nothing is worse than developers arguing about the "best" solution. It is important that this architecture has been defined as a standard within the organization. Otherwise, sooner or later the solution will be declared either a "shadow IT project" or an "individual solution", which will eventually exact a high price in technical debt.
In parallel to the standard architecture and the technical components, it should also be defined who implements which work packages with which tools. Who fills which role in the team, who does the work and what capacity is available - these are absolute basics in every project, modern or classic, and yet they are surprisingly often left undefined.
Once the team has implemented the user story, the user acceptance test follows, in which the customer accepts or rejects the product or service. This test is carried out jointly by the team's business analyst and the customer, possibly together with the key user of the business area. Only when the customer confirms that the product can go live is the user story complete. If the customer rejects it, the user story goes back to the team. What counts is the basic idea: "whoever orders pays, and whoever pays is right".
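The acceptance test itself is a joint, human review. Where acceptance criteria are precise, however, parts of them can be checked automatically before the customer even sees the result - a minimal sketch, reusing the hypothetical fiscal-year logic from above and assuming a test runner such as pytest:

```python
import pandas as pd

def assign_fiscal_year(booking_dates: pd.Series, fy_start_month: pd.Series) -> pd.Series:
    """Fiscal-year assignment, as in the earlier sketch."""
    return booking_dates.dt.year - (booking_dates.dt.month < fy_start_month).astype(int)

def test_booking_before_fy_start_belongs_to_previous_fiscal_year():
    # Acceptance criterion: revenue is aggregated per customer fiscal
    # year, not per calendar year.
    dates = pd.to_datetime(pd.Series(["2023-03-15"]))
    start_month = pd.Series([4])  # fiscal year starts in April
    assert assign_fiscal_year(dates, start_month).iloc[0] == 2022
```

Automating such checks does not replace the joint sign-off; it merely ensures the conversation with the customer starts from a working product.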
How do you drive BI and data analytics developments?
If you have any questions or want to give feedback - I am always interested in an exchange. Please use the comment function below for a public discussion. For private messages, please contact me via my Xing or LinkedIn profile.