Create a Movie Recommendation Chatbot with Zilliz Cloud & AI/ML Terms You May Not Know
In this issue:
Create a Movie Recommendation Chatbot with Zilliz Cloud
Create a chatbot service to suggest new movies to watch! Perfect for movie streaming companies or movie nerds.
This chatbot is made easy in 4 steps via Zilliz Cloud. In this demo with Frank Liu, we'll build the service end-to-end, showing how easy it is to build any semantic search solution.
Let us know your favorite movie in the comments!
AI/ML Terms You May Not Know
Softmax
What is Softmax?
The softmax function, also known as the normalized exponential function, is a popular activation function for multi-class classification. Whereas activation functions like sigmoid are limited to binary or single-label use cases, softmax handles multiple classes: it scales logits, the raw outputs of a neural network, into an array of probabilities.
Why is it relevant?
The softmax activation function, typically found in a neural network's output layer, enables multi-class classification by outputting class probabilities. It transforms raw scores into a probability distribution, bridging the gap between network output and meaningful confidence scores. This makes softmax vital for real-world machine learning problems such as:
- Image classification
- Sentiment analysis
- Speech recognition
The formula:
f(x_i) = e^(x_i) / Σ_j e^(x_j)
Where:
x = the vector of raw outputs (logits) from the previous layer of the neural network
i = the index of the class whose probability is being computed
e = Euler's number (≈ 2.718)
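The formula above can be sketched in a few lines of NumPy (a minimal illustration; the logits here are made-up values, not from a real network):

```python
import numpy as np

def softmax(x):
    # Subtract the max logit for numerical stability; the result is unchanged
    exps = np.exp(x - np.max(x))
    return exps / exps.sum()

logits = np.array([2.0, 1.0, 0.1])   # raw outputs from a network's final layer
probs = softmax(logits)
print(probs)        # ≈ [0.659, 0.242, 0.099] - the largest logit gets the highest probability
print(probs.sum())  # 1.0 - a valid probability distribution
```

Note the max-subtraction trick: exponentiating large logits directly can overflow, and shifting all logits by a constant leaves the softmax output unchanged.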
Examples:
- Image classification - Neural networks excel at image classification, analyzing an image and assigning it to one of several predefined classes. Softmax converts the network's final scores into the class probabilities used to make that call.
- Speech recognition - A speech model must accurately identify the user's words from predefined alternatives and formulate its response accordingly. The model analyzes the input audio and generates a score for each candidate word, and softmax then assigns a probability to each alternative.
What is Locality-Sensitive Hashing?
Locality-Sensitive Hashing (LSH) is a technique used in approximate nearest neighbor (ANN) search to accelerate similarity searches. Unlike conventional hashing, which tries to spread data points evenly and avoid collisions, LSH deliberately groups similar data points together. This is its "locality sensitivity": points that are close in the original data space are likely to end up in the same location, referred to as a "bucket."
Why is it relevant?
In the era of big data and machine learning, efficiently finding nearest neighbors or similar items in large, high-dimensional datasets is a challenge across many applications. Traditional techniques like linear scans or space-partitioning trees degrade as the number of dimensions and data points grows. This is where Locality-Sensitive Hashing comes into play. Example applications:
- Image retrieval
- Computational biology
- Plagiarism detection
- Audio/video fingerprinting
- Fraud detection
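To make the bucketing idea concrete, here is a minimal sketch of one classic LSH family, random-hyperplane hashing for cosine similarity. Each hyperplane contributes one bit of the hash; the planes and vectors below are toy values chosen for illustration (a real index would draw many random planes over high-dimensional embeddings):

```python
import numpy as np

def lsh_signature(vec, planes):
    # One bit per hyperplane: which side of the plane does the vector fall on?
    return tuple(bool(b) for b in (planes @ vec) > 0)

# Three fixed hyperplanes in 2-D, given by their normal vectors
planes = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])

a = np.array([1.0, 2.0])      # similar direction to b
b = np.array([1.1, 1.9])
c = np.array([-2.0, -1.0])    # roughly opposite direction

print(lsh_signature(a, planes))  # (True, True, True)
print(lsh_signature(b, planes))  # (True, True, True)  -> same bucket as a
print(lsh_signature(c, planes))  # (False, False, False) -> a different bucket
```

Vectors pointing in similar directions tend to fall on the same side of each hyperplane, so they hash to the same bucket, and a search only needs to compare against candidates in that bucket rather than the whole dataset.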
ICYMI - Meetup Recaps
Missed a recent meetup? Check out the YouTube videos and slides from all the talks!
- Ensuring Secure and Permission-Aware RAG Deployments
- How CXAI Toolkit uses RAG for Intelligent Q&A
- Multimodal Embeddings
- A Different Angle: Retrieval Optimized Embedding Models
- From Dev to Prod: Vector Database Made Easy
- Building the Future of Neural Search: How to Train State-of-the-Art Embeddings
Get Started with LangChain
LangChain stands at the forefront of large language model application development, offering a versatile framework for building text-based systems. Learn how to get started with LangChain, create an open source chatbot quickly, and more!
Upcoming Events
Aug 13: South Bay Unstructured Data Meetup (in-person)
We'll be back at SAP in Palo Alto for our meetup! Talks from TwelveLabs, Zilliz, and Intuit!
- Advanced Video Search - Leveraging Twelve Labs and Milvus for Semantic Retrieval (James Le)
- Implement Agentic RAG Using Claude 3.5 Sonnet, LlamaIndex, and Milvus (Bill Z.)
- Inference on Streaming Data (Sriharsha Yayi & Derek Wang)
Aug 13: New York Unstructured Data Meetup (in-person)
We've got a stacked speaker lineup for the New York meetup! Join us for the following AI talks:
- Quick intro to unstructured data, edge AI, and Milvus (Tim Spann)
- Modern Analytics & Reporting with Milvus Vector DB and GenAI (Bill Reynolds)
- cuVS + Milvus (Corey Nolet)
- Combining Hugging Face Transformer Models and Visual Data with FiftyOne (Jacob Marks)
Aug 29: Multimodal RAG with Milvus and GPT-4o (virtual)
If you're interested in learning multimodal RAG live, don't miss this webinar. In this tutorial, Jiang Chen will vectorize a dataset of text and images into the same embedding space, store the embeddings in Milvus, retrieve the most relevant data for an LLM query, and pass that multimodal data as context to GPT-4o.
-
Thanks for reading and see you next time!