Think the next billion-dollar investment you must make is AI?
Robin Dymond
Expert in Agile Transformation: Helping people & companies become resilient and adaptive. CST.
Let me tell you a true story about the software industry's 30+ years of failed promises to automate away knowledge work.
TLDR at bottom of article.
We have had so many experiences like this that you'd think we'd start seeing the repeating pattern...
In the early '90s, the fourth-generation programming environment MATLAB/Simulink was going to save me from having to code in C: I could just express my algorithms as simple math statements and it would generate the code. It turned out MATLAB/Simulink generated bloated, low-quality code that would never deliver the performance needed in an embedded environment. I ended up using MATLAB only to test algorithm ideas, and then hand-coded the production code.
In 1994 CD-ROMs were all the rage, with 17.5 million CD-ROM drives and $590 million in discs sold. We were told we'd better get into the multimedia business, creating CD-ROMs with the visual design tool Macromedia Director. Except producing CD-ROMs was very labour-intensive (expensive), and discs were returned at a rate of 20% (1 in 5) because of compatibility problems.
By 1998 everyone was focused on the internet and creating websites. Macromedia Dreamweaver was going to eliminate the tedious coding of web page layouts by converting a designer's beautiful page to HTML/JavaScript with no coding. Except the generated code was bloated, hard to read, harder to fix, and wasn't compatible across web browsers. In 2004 I joined an agency building websites for large customers (Procter & Gamble, Mercedes, etc.). To have readable, maintainable, compatible and performant websites, all of the sites were coded by hand.
In 1999-2002, long before the iPod, I had a startup, Sensate Inc., that was going to bring internet audio to cellphones, cars and consumer electronics. The new programming platform Java was all the rage. Its creator, James Gosling, said Java would allow people writing software for devices to work in a much faster and easier-to-use environment than old C and C++. After a few days of research and experimentation I dropped the idea of using Java for my startup, as it would lead to many issues in supporting our solution across the various technology platforms our customers were using. We went with C++ on Windows CE and Linux, and had demonstration hardware and software running in 3 months. The embedded systems market didn't adopt Java until embedded computing became roughly 10X more powerful (the early 2010s) and Google made Java the native development language for Android.
Fast forward to 2023: I started using ChatGPT to help write articles. With some prompts, ChatGPT produced articles containing some good points and lots of boilerplate. I then started editing and feeding my changes back into ChatGPT to improve the article. Pretty soon ChatGPT was removing the nuance and specific guidance I was adding, dumbing down the writing. After 3 hours of repeating the process, I was thinking: wouldn't this have been faster if I had just run a couple of queries with ChatGPT and then written the article myself, instead of trying to correct this regurgitated content? Very quickly I found ChatGPT was correcting towards the norm, filtering out the more interesting parts of the article because they were unique, different from what the model had been trained on.

In another experiment, my daughter, a synchronized swimmer, started asking ChatGPT the kinds of questions she would ask a synchro coach about the sport. It turns out ChatGPT makes a very poor advisor on synchronized swimming. It was an amusing demonstration that a large language model not only doesn't understand synchronized swimming (there wasn't much information online about it), it is also incapable of asking intelligent questions to become better informed.
All of these tools did advance the state of technology in some way; however, all of them were oversold and overhyped. For example, Java found its place in web browsers and enterprise application development, a long way from James Gosling's original idea of improving software development for devices and industrial applications.
Want the best return on your AI investment? Start with projects that are narrow in scope and have well-defined outcomes. Make sure you have access to specific training data sets that have high-quality labels or are relatively easy to label. Low-quality training data will produce poor results (garbage in, garbage out); a minimal sketch of one label-quality check follows below. Grand projects with lofty goals and unclear outcomes will be costly failures. Don't try to boil the ocean. Spend more slowly, and focus on learning and on educating yourself and your staff in how to use these new tools.
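As a concrete illustration of the garbage-in, garbage-out point: before funding a model, it's worth measuring whether humans can even agree on the labels. Below is a minimal sketch, assuming you have the same data sample labeled independently by two annotators (the file name and column names are hypothetical); low inter-annotator agreement is a warning that the training labels, and therefore the model, will be noisy.

```python
# Minimal label-quality audit: if two humans can't agree on the labels,
# a model trained on those labels will inherit the noise.
# Assumes a hypothetical labels.csv with columns: item_id, annotator_a, annotator_b
import pandas as pd
from sklearn.metrics import cohen_kappa_score

df = pd.read_csv("labels.csv")

# Raw percent agreement (easy to read, but inflated when one label dominates)
agreement = (df["annotator_a"] == df["annotator_b"]).mean()

# Cohen's kappa corrects for the agreement expected by pure chance
kappa = cohen_kappa_score(df["annotator_a"], df["annotator_b"])

print(f"Raw agreement: {agreement:.1%}")
print(f"Cohen's kappa: {kappa:.2f}")

# A common rule of thumb: kappa below ~0.6 suggests the labeling task is too
# ambiguous as defined; tighten the labeling guide before buying more data.
if kappa < 0.6:
    print("Warning: labels look too noisy to train on reliably.")
```

The point isn't the specific threshold; it's that a cheap check like this, run before the big spend, tells you whether your "well-defined outcome" is actually well defined.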
#TLDR Too long; didn't read
(should you be in charge of mega investments if you don't read? Hmmm.)
Today we have the AI hype, in which companies will spend and lose billions creating AI bots that don't deliver the expected return on investment.
Comments:

Embedded Systems Engineer (3 months ago): I remember those days. (Oh no, I'm showing my age.)

Robin Dymond, author (4 months ago): Interesting. I must be wrong, because Meta, the company that two years ago was spending billions on the virtual-reality metaverse, has switched gears, cancelled the metaverse projects, and now plans to spend $50 billion on AI. Either I am wrong, or now is a good time to sell your Facebook shares? Meanwhile, Meta has banned the sharing of news articles on Facebook in Canada. https://www.dhirubhai.net/news/story/meta-seen-spending-even-more-on-ai-6109844/