Pulling Back the Curtain on Deep Learning, Neural Networks, and Gen AI at LA's MOCA
Machines imitating life, and life imitating machines.
The Columbia Alumni Club of Southern California gathered a couple of weeks ago at MOCA | The Museum of Contemporary Art in the City of Angels.
We welcomed Columbia Business School Professor Daniel Guetta, who gave a lively presentation on the inner workings of Deep Learning, Neural Networks, and Gen AI models. A docent-led tour of MOCA's impressive exhibit of human artists simulating machine-produced images followed.
Early AI systems go back to the 1960s. Professor Guetta presented ELIZA, a simple pattern-matching chatbot designed to mimic a psychotherapist. Fast-forward to today: machine learning (ML) is a far more comprehensive version of that early idea, showing the computer many examples of inputs and outputs so it can learn the relationship between them, an approach commonly called predictive analytics.
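The flavor of ELIZA-style systems can be sketched in a few lines: match a pattern, reflect part of the user's words back. This is a toy, hypothetical rule set for illustration, not Weizenbaum's original program:

```python
import re

# Toy ELIZA-style responder: each rule pairs a pattern with a
# reflection template that echoes the captured text back.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {}?"),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1))
    return "Please tell me more."  # default when no rule matches

print(respond("I feel anxious about AI"))  # Why do you feel anxious about AI?
```

No learning happens here at all, which is exactly the contrast with modern ML: every response is hand-written in advance.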
Deep learning, within this category, refers to a computer program's ability to learn from unstructured data (photos, free text) as well as structured data (tables of numbers, categorical labels, etc.).
Neural networks stack many layers of simple computing units on top of one another (that stacking is what makes the learning "deep"), helping us better manage, synthesize, and act upon vast quantities of information.
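The stacking idea can be sketched as a forward pass through a few layers: each layer multiplies its input by a weight matrix and applies a simple nonlinearity. The sizes and random weights below are arbitrary illustrations, not any particular model:

```python
import numpy as np

# Three stacked layers: 4 inputs -> 8 hidden -> 8 hidden -> 2 outputs.
rng = np.random.default_rng(0)
layers = [rng.normal(size=(4, 8)),
          rng.normal(size=(8, 8)),
          rng.normal(size=(8, 2))]

def forward(x, weights):
    for W in weights[:-1]:
        x = np.maximum(0, x @ W)  # ReLU nonlinearity on hidden layers
    return x @ weights[-1]        # linear output layer

out = forward(rng.normal(size=(1, 4)), layers)
print(out.shape)  # (1, 2)
```

Training would adjust those weight matrices from examples; here they are frozen at random values just to show the data flowing through the stack.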
Cool. So how does one create, or fact-check, such a model?
It was fascinating to learn how words (and images) are mapped to coordinate pairs (think X,Y axis) called embeddings or vectors. Basic geometry (which non-quants thought they would never use again) is then applied to calculate the relationship between those coordinates: the Ancient Greeks' simple but elegant Pythagorean theorem, which gives the straight-line distance between two points, is key to neural networks.
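As a toy illustration (the 2-D coordinates below are made up; real embeddings have hundreds of dimensions), the Pythagorean theorem measures how close two word embeddings are:

```python
import math

# Hypothetical 2-D embeddings: related words sit near each other.
embeddings = {
    "cat": (1.0, 2.0),
    "dog": (1.5, 2.5),
    "car": (8.0, 0.5),
}

def distance(a: str, b: str) -> float:
    # Pythagorean theorem: straight-line (Euclidean) distance.
    (x1, y1), (x2, y2) = embeddings[a], embeddings[b]
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

print(distance("cat", "dog") < distance("cat", "car"))  # True
```

The same formula generalizes to any number of dimensions by summing the squared differences across all coordinates.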
Embeddings represent the context of individual words and concepts. Computer software can now scrape the internet and calculate the context of ALL THE TEXT ON THE INTERNET, in every language. These vector-space models were first trained at scale on Google News data in 2013 (the word2vec model).
Advanced vector calculations (a vector has both direction and magnitude) in neural networks mean the program can now account for context, connotation, and subtlety, enabling a next generation of predictive power that blows everything else away.
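One common way direction captures context is cosine similarity: two vectors pointing the same way score near 1 regardless of their lengths. A sketch with made-up vectors, imagining "bank" used in a financial sense:

```python
import math

def cosine(u, v):
    # Cosine of the angle between two vectors: dot product
    # divided by the product of their magnitudes.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical vectors: "bank" here points closer to "money" than "river".
money, river, bank = (0.9, 0.1), (0.1, 0.9), (0.8, 0.3)
print(cosine(bank, money) > cosine(bank, river))  # True
```

Because cosine ignores magnitude, it isolates direction, which is where much of the contextual meaning in an embedding space lives.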
Generative AI (Gen AI) is enabled by the transformer architecture, introduced in 2017, which took off publicly in 2022. Gen AI handles massive, extended word/concept relationships, capturing their subtleties to enable a range of tasks, from research and writing assistance with usable summaries to more advanced applications in quantum computing, cybersecurity, and medicine.
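At the heart of the transformer is scaled dot-product attention, softmax(QK^T / sqrt(d)) V: every word scores its relevance to every other word, and those scores weight the information that flows forward. A minimal NumPy sketch, with random matrices standing in for real queries, keys, and values:

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted blend of the values

rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
out = attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Each row of the softmax weights sums to 1, so every output position is a blend of all the value vectors; that all-pairs mixing is what lets transformers track long-range relationships between words.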
In sum, these sophisticated models power use cases people now take for granted when working with GPT-, BERT-, BART-, and DALL-E-based systems (DALL-E generates images from text descriptions). Alexa, for example, uses transformer technology to understand and respond to voice commands.
This brief and informative introduction to the inner workings of the models driving the AI revolution was followed by a terrific docent-led tour of MOCA's exhibition Ordinary People: Photorealism and the Work of Art since 1968. It was striking to observe artists painting and sculpting people to mimic machine outputs just after learning about the progress of artificial intelligence in simulating human outputs.
Thank you, Daniel Guetta, for your very lively and informative presentation. Thank you, Nancy Kwon Merrihew, MOCA Trustee, for sponsoring, and thank you, esteemed minkyung K., SoCal CAA President, for organizing.
Columbia University Columbia Business School Columbia Business School Alumni Columbia College Alumni Association MOCA | The Museum of Contemporary Art
#artificialintelligence #AI #innovation #technology #future