AI - What's Old is New Again

As we move into the final quarter of 2023, I cannot think of a more compelling buzz term for the year than “AI.” Certainly, my LinkedIn feed seems to be a steady stream of product promotions that somehow incorporate AI as a product feature. AI is hip, it’s now, it’s happening. It’s also older than most of us on LinkedIn.

Yes, the generative AI of ChatGPT and its kindred reflects huge advancements in the field. It should be applauded for all of the opportunity it affords, and scrutinized for all of the risks that people may create with it. But we should also appreciate that it is only the current state of an evolution that started many decades ago.

With the above in mind, I am sharing the proposal that kick-started the artificial intelligence discussion back in 1955 and led to the Dartmouth Summer Research Project on Artificial Intelligence in 1956. Take a moment to read it, and you will recognize that what’s old is new again.

Julie Saslow Schroeder, J.D., L.L.M.(Actively searching)

Chief Legal Officer | Cybersecurity | Risk Management Expert | Privacy and Governance Leader | Emerging Technologies Expertise | Digital Transformation| Corporate Business Strategist | Integrator | Speaker

1 year

David Mitchell Adrian K Francis #ai is not a thing; it is a collection of #machinelearning tools. Second, the theory started with Markov chains in 1906. Markov processes are the basis for random simulation methods (Markov chain Monte Carlo) and have found application in Bayesian statistics, thermodynamics, statistical mechanics, physics, chemistry, economics, finance, and signal/information theory/speech processing. In 1950, Alan Turing wrote about a “thinking” machine. The premise? If a machine could be part of a conversation and imitated a human so completely that there were no noticeable differences, then the machine could be considered capable of “thinking.” In 1952, the Hodgkin-Huxley model showed how the brain uses neurons to form an electrical network. These events helped inspire the idea of artificial intelligence (AI), natural language processing (NLP), and the evolution of computers. Noam Chomsky published Syntactic Structures in 1957 and revolutionized linguistics with the idea that, for a computer to understand a language, the sentence structure would have to be transformed. Chomsky created phrase-structure grammar, which methodically translated natural language sentences into a format usable by computers. Then #DARPA/ #NLP….
