The barriers to adopt machine learning are in our heads

The seven takeaways of this post:

  • Thanks to modern machine learning technology, organizations have never been better positioned to manage risk holistically.
  • Today’s barriers to adopting state-of-the-art machine learning are no longer in technology but in our heads.
  • The lack of productivity in data analytics cannot be solved by hiring more people.
  • New cloud-born machine learning platforms are taking predictive analytics to the next level. Machine learning is moving from a standalone product to an integral service: Machine Learning as a Service.
  • The entire machine learning workflow is getting automated little by little.
  • Companies’ domain experts and IT developers are starting to use new consumable and interpretable machine learning platforms in their daily work.
  • Innovative organizations in different industries will soon transform into efficient predictive model factories.

Increasing computational power is bringing within reach problems that humans are helpless to solve on their own, because machines can explore relationships between thousands of variables and pick out the ones that matter. Machine learning is great at identifying patterns in sparse data and zeroing in on the variables that strongly influence the outcome of events the organization cares about: Will the customer be profitable? How likely is it that the claim is fraudulent? What is the next best action the customer could be interested in? How severe will an event turn out to be under the given circumstances? This ‘super-power’ is of enormous value: each newly discovered pattern can have a direct impact on costs, revenues and customer satisfaction.
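
To make this concrete, here is a minimal sketch of what “zeroing in on the influential variables” can look like in practice. It assumes scikit-learn and pandas, and the file name and the is_fraudulent column are hypothetical placeholders rather than a reference to any particular dataset.

```python
# Minimal sketch (illustration only): ranking the variables that most strongly
# influence an outcome such as claim fraud. File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

claims = pd.read_csv("claims.csv")              # hypothetical historical claims data
X = claims.drop(columns=["is_fraudulent"])      # candidate variables
y = claims["is_fraudulent"]                     # the event of interest

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Rank the variables by how strongly they drive the model's predictions.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head(10))
print("Held-out accuracy:", model.score(X_test, y_test))
```

The point is not the particular algorithm; it is that a few lines like these already surface the handful of variables worth acting on.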

As is often the case with technological advances, the never-ending march of bigger data, greater computing capacity and better tools has finally broken the feasibility barrier for doing machine learning at scale.

Democratised productivity
Today’s barriers to adopting state-of-the-art machine learning technology are no longer in technology but in our heads. “We’ve always done it this way” comes up quickly even in machine learning projects. Most industries have recently hired people from academia and, along with them, inherited the technology used in academia, without questioning whether the same tools can solve problems whose requirements differ from those of scientific research. The belief that a lack of productivity in data analytics can be solved by hiring more smart people, rather than by upgrading to a standards-based end-to-end framework, runs counter to every other development in the digital world. Yet these barriers remain omnipresent in corporate settings when it comes to taking analytics beyond merely interesting insights and into real business processes that touch the end customer. But the shift is inevitable.

New cloud-born machine learning platforms (like BigML) are moving predictive analytics from the far corners of support functions into the daylight of everyday business decisions, making it consumable and interpretable and opening it up to companies’ domain experts and IT developers. Machine learning is moving from a standalone product to an integral service: Machine Learning as a Service. This shift allows many more use cases to be solved programmatically, because good ideas can easily be cloned across the organization and beyond it, along the whole supply chain. The movement is bound to result in large productivity gains from smarter applications with real-time insights at the point of action, all without having to hire dozens of PhDs and ask them to play nice in their sandboxes.
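
To illustrate what “Machine Learning as a Service” means in day-to-day work, here is a rough sketch of the kind of programmatic workflow BigML’s Python bindings expose, as far as I understand them; the data file and the input fields are hypothetical, so treat this as an outline and check the current documentation for the exact calls.

```python
# Rough sketch (not official sample code): building and using a model through an
# MLaaS API instead of a local toolchain. File name and input fields are hypothetical.
from bigml.api import BigML

api = BigML()  # reads BIGML_USERNAME and BIGML_API_KEY from the environment

source = api.create_source("claims.csv")   # upload the raw data
api.ok(source)                             # wait until the resource is ready
dataset = api.create_dataset(source)
api.ok(dataset)
model = api.create_model(dataset)
api.ok(model)

# The same model can now be cloned, shared and called from any application.
prediction = api.create_prediction(model, {"claim amount": 12500, "prior claims": 3})
print(prediction["object"]["output"])
```

Because every step is an API call, the whole workflow can be scripted, versioned and cloned across teams, which is exactly what lets good ideas spread across the organization.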

Focus on creativity
The entire machine learning workflow is getting automated little by little. The level of human involvement in repetitive tasks, such as extracting the right variables and choosing the right predictive model for the problems that matter to the organization, is gradually but surely marching down to zero. Thanks to these productivity gains, in-house analytics experts can afford to focus their attention on the more creative aspects of their jobs, the ones that translate into true innovation.
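
As a small illustration of what automating the repetitive parts looks like, here is a sketch that hands both variable selection and model tuning to a search procedure; it assumes scikit-learn and uses one of its bundled datasets purely as a stand-in for real business data.

```python
# Minimal sketch (illustration only): letting a pipeline and a grid search do the
# repetitive work of picking variables and tuning the model.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)   # stand-in for any tabular business dataset

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif)),   # automated variable selection
    ("clf", LogisticRegression(max_iter=1000)),      # candidate model
])

# The grid search explores the choices an analyst would otherwise make by hand.
param_grid = {
    "select__k": [5, 10, 20],
    "clf__C": [0.1, 1.0, 10.0],
}
search = GridSearchCV(pipeline, param_grid, cv=5)
search.fit(X, y)

print("Best configuration:", search.best_params_)
print("Cross-validated score:", round(search.best_score_, 3))
```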

Old conventions
To facilitate this key transition, the focus of next-generation machine learning platforms extends beyond pure statistical programming to questions like: How can we easily deploy predictive models into our existing infrastructure and serve the prediction at the right moment to the right recipient? How can we build end-to-end predictive applications with dynamic models in half the time it took us before? How can we analyse new real-time data sources, such as the dozens of sensors in vehicles collecting driving data?
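
The deployment question in particular is easier than it used to be. Here is a bare-bones sketch of serving an already trained model over HTTP so an existing application can ask for a prediction at the point of action; it assumes Flask and joblib, and the model file and payload format are hypothetical.

```python
# Bare-bones sketch (illustration only): exposing a persisted model as a prediction
# service. The model file and the expected payload are hypothetical.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("churn_model.joblib")    # a previously trained, persisted model

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()             # e.g. {"features": [0.3, 12, 1, 0.8]}
    prediction = model.predict([payload["features"]])[0]
    return jsonify({"prediction": int(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```
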
Although a number of available platforms can already deliver a big part of this vision today, the industry is still caught up in old reflexes and false beliefs. "Predictive analytics is extremely difficult." "In order to increase the productivity of our data science department we need to double the number of data scientists." "It will take us years to replace our hard-wired business rules with smart, adaptive rules based on predictive models." "Our hopes lie in improving our models' accuracy with deep learning." It runs the gamut from never questioning the status quo to pseudo-scientific talk about techniques that are promising but not yet fully vetted in most business domains.

Return to the forefront
This is a wake-up call. It is high time to question the status quo: make a concerted effort to adopt next-generation machine learning platforms, revisit existing processes and get ready for industrialized machine learning before it takes you by surprise. The time of machine learning workshops and “Big Data” strategy slideware is coming to an end. It is your job to break down the walls and empower the people participating in those workshops and projects with the right scope of responsibilities, real-life use cases and cutting-edge tools. Innovative companies will soon transform into efficient predictive model factories that continually manage orders of magnitude more models, optimized for every product, every risk and every customer. Are you ready for the upcoming change?

Hartmuth Gieldanowski, CPCC, ACC

Agile Coach at Takeda Pharmaceuticals | Certified Professional Co-Active Coach, CPCC | ICF ACC | President elect ICF Switzerland | Organizational Catalyst & Product Champion Coach | Pragmatic Utopian & Metamodern Thinker

8y

Great article! Making capabilities accessible and usable for users and customers is the important step in unlocking their potential. The same has been true with manufacturing (bringing people from operating machines to creating them) or with computing: everyone has an accessible superpower device in their pocket. So it's clear that the "connected" wisdom of "academia" will empower people much more. But with great power also comes great responsibility... let's use it for the good. Keep up the amazing work you guys do @BigML. Cheers
