Not Everyone is Allowed to Play
tl;dr - do we need a sign around all of our data and AI tools that reads: 'danger lurks here'?
Part 1 of a #x-part series discussing the use of Intelligent Data platforms and Artificial Intelligence systems.
Risks associated with centralized data structures necessitate a barrier to entry for who gets to work with them. Particularly in more mature organizations that have centralized their data, processes, and tools, users of these systems can, at best, waste valuable time playing with models that have no business value and, at worst, cause irreparable damage to datasets that sets the organization back. While some of these failures can be prevented through appropriate permissioning, the ultimate solution is good judgement. Technically minded people need to develop good business judgement in order to use their time effectively. Business minded people need to develop good technical judgement so they respect the complexity of these systems and play within their own capabilities.
The implications
Perhaps there is a need for badges or certifications before anyone can access certain datasets or tools. Ultimately, this is a question of educating the people who play with data and AI. With technologies, tools, and approaches constantly evolving, however, that education becomes less about specific skills and knowledge and more about instilling a sense of good judgement and responsibility around their use.
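As a toy illustration of the badge-gating idea above (all dataset and badge names here are hypothetical, not from any real system), an access check might amount to nothing more than looking up which certification a dataset requires:

```python
# Toy sketch of certification-gated dataset access.
# All names are hypothetical; a real system would pull these
# from a governance catalog, not a hard-coded dict.

REQUIRED_BADGE = {
    "customer_pii": "data-handling-cert",  # sensitive: badge required
    "sandbox_metrics": None,               # open to everyone
}

def can_access(user_badges: set[str], dataset: str) -> bool:
    """Return True if the user holds the badge the dataset requires."""
    required = REQUIRED_BADGE.get(dataset)
    if required is None:
        return True  # open (or unlisted) datasets are unrestricted here
    return required in user_badges

# Usage
certified_analyst = {"data-handling-cert"}
new_intern = set()
print(can_access(certified_analyst, "customer_pii"))  # True
print(can_access(new_intern, "customer_pii"))         # False
```

The point of the sketch is that the mechanical check is trivial; the hard, human part is deciding who earns the badge in the first place.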
Thoughts?
rm
Assisting clients in improving operational outcomes by helping them become more data-driven and people-centric.
9 months ago
Great comments. The race to deploy AI solutions is leaving many companies vulnerable to unintended consequences. Investing time and money in solid data governance often becomes an afterthought until a major incident occurs, and then suddenly everyone is on board with taking the time to do it right. Data and business teams need to make sure their AI aspirations don't get too far ahead of their ability to govern the data and the solutions they support. This has always been best practice; let's not repeat the sins of the past.
Marketing Executive | B2B | B2C | Storyteller | IT | ex-Microsoft, TechData & Philips
9 months ago
Insightful! Thank you!
Chief Technology Officer (CTO) at Inspired Intellect and WorldLink US
9 months ago
The basic thought: just because you can does not mean you should. However, relying on individuals' subjective perspectives on can vs. should is dangerous and frankly reckless for an enterprise, so governance, especially data and data product governance, needs to be built into your data management and operations. AI is not the first liability-heavy usage pattern that exposes the various constituents (data users/consumers, contributors, owners, governors, etc.) to risk and misuse. As the sayings go, the road to hell is paved with good intentions, and no good deed goes unpunished; both are common in the misuse and misunderstanding of AI on enterprise data, from competitive confidential content overshared with a public AI service with the best of intentions to improve enterprise operations, to misalignment of technical viability with business value. This is in part why the CDO role was originally created, and why security and operational governance have to be in place to be a truly mature enterprise. Having a mature centralized data environment without such governance is like setting up a nuclear reactor in the middle of a farmers' market.