This AI newsletter is all you need #22
Towards AI
Making AI accessible to all with our courses, blogs, tutorials, books & community.
What happened this week in AI by Louis
One word: Galactica.
Galactica, Meta's most recent large language model, which can store, combine, and reason about scientific knowledge, was shut down after many users reported misleading or incorrect results. There is a lot of controversy around this model, mostly to do with the gap between Meta's confidence in the model and its rightfully questionable outputs. The demo was not as catastrophic as Microsoft's Tay incident of 2016, but it, too, quickly crossed the line between fun experimental tool and dangerous propagator of misinformation. Galactica represents a significant advance for large language models, but given that it was intended for scientific use, the expected level of rigor was far from met.
On my end, I really liked a tweet shared by my friend Lior, which neatly summarizes my thoughts. I'd like to quote it here:
“The drama surrounding Galactica baffles me. Let's remember we're all on the same team trying to make our tiny field progress.”
Was Galactica perfect? No. But GPT-3, Stable Diffusion, and DALL-E weren't either. It's by releasing a model into the world that the feedback loop starts, and these insights help us build better tools over time.
To add an ethical perspective from Lauren: let's not forget the effects this might have on the world, and our responsibility as AI co-creators to handle those effects, whether negative or positive. This is neither the first nor the last language model to accidentally spread falsehoods, but understanding and learning from these mistakes ensures that the progress we work toward in AI forges the future we want.
Hottest News
Most interesting papers of the week
Enjoy these papers and news summaries? Get a daily recap in your inbox!
The Learn AI Together Community section!
Meme of the week!
Featured Community post from the Discord
JacobBum#7456 just published “Breaking it Down: K-Means Clustering”. This is a great article that explores and visualizes the fundamentals of K-means clustering with NumPy and scikit-learn. If you write articles and publish them on your blog or on our Medium publication, share them on our Discord server and you might get a chance to be featured here too!
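As a taste of what the article covers, here is a minimal sketch of K-means clustering with NumPy and scikit-learn. The synthetic two-blob data and all parameter choices are our own illustration, not taken from the article:

```python
import numpy as np
from sklearn.cluster import KMeans

# Two well-separated 2-D blobs: one around (0, 0), one around (5, 5)
rng = np.random.default_rng(0)
points = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(50, 2)),
    rng.normal(loc=5.0, scale=0.5, size=(50, 2)),
])

# Fit K-means with k=2; n_init restarts guard against bad initializations
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)

# Each point is assigned to its nearest learned centroid
print(kmeans.cluster_centers_)  # centroids near (0, 0) and (5, 5)
print(kmeans.labels_[:5])       # cluster index of the first five points
```

With data this cleanly separated, the learned centroids land very close to the true blob centers; the article digs into what happens in less friendly settings.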
AI poll of the week!
TAI Curated section
Article of the week
Training machine learning models can be time- and memory-consuming, especially when your data is large. It is important to optimize the workflow to save computation time and memory, particularly when training the model multiple times with different hyperparameters to find the best configuration. This article shares six practical tips to decrease computation time and memory consumption while training a machine learning model.
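To give a flavor of this kind of optimization (this is our own illustration, not necessarily one of the article's six tips): downcasting feature matrices from the NumPy default of float64 to float32 halves memory use, usually with no meaningful loss in model quality:

```python
import numpy as np

# 100k samples x 50 features; NumPy defaults to float64 (8 bytes/value)
X = np.random.rand(100_000, 50)

# Downcast to float32 (4 bytes/value): half the memory footprint
X32 = X.astype(np.float32)

print(X.nbytes // 1024**2, "MiB")    # 38 MiB
print(X32.nbytes // 1024**2, "MiB")  # 19 MiB
```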
Our must-read articles
In-depth Azure Machine Learning Model Train, Test, and Deploy Pipelines on Cloud With Endpoints for Web APIs by Amit Chauhan
If you are interested in publishing with Towards AI, check our guidelines and sign up. We will publish your work to our network if it meets our editorial policies and standards.
Job offers
Interested in sharing a job opportunity here? Contact [email protected].