Intro to Vector Operations in AI Systems

Register here.

Vectors are everywhere in AI. LLMs rely on embeddings, which are vectors, and cosine similarity and dot products are vector operations used routinely in these systems. Vectors are the rows or columns of matrices (the way tabular data is stored) and generalize to tensors, the data structure used throughout neural networks; frameworks such as PyTorch and TensorFlow are built around them.
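
As a quick illustration, here is a minimal NumPy sketch of the two operations just mentioned, the dot product and the cosine similarity of two embedding vectors (the numbers are made up for the example):

```python
import numpy as np

# Two toy "embedding" vectors (arbitrary values, for illustration only)
a = np.array([0.2, -0.5, 0.8, 0.1])
b = np.array([0.1, -0.4, 0.9, 0.0])

# Dot product: sum of elementwise products
dot = np.dot(a, b)

# Cosine similarity: dot product normalized by the vectors' lengths
cos_sim = dot / (np.linalg.norm(a) * np.linalg.norm(b))

print(f"dot product: {dot:.4f}, cosine similarity: {cos_sim:.4f}")
```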

This presentation explains how these vector manipulations work behind the scenes. In particular:

• What quantization is, and how to quantize vectors for faster operations

• Processing tabular data vector-wise rather than elementwise (as in NumPy), avoiding loops for much faster processing; see the sketch after this list

• Vector databases and fast vector search, matching embeddings derived from prompts to those stored in backend tables: the backbone of RAG/LLM systems

• Further optimization with variable-length vectors, for instance variable-length embeddings

• Alternatives to dot products and cosine distance, such as PMI (pointwise mutual information)

• Geometric interpretation of these operations

• Vectors in higher dimensions, and data structures for large sparse vectors consisting mostly of zeroes

• Vector functions and Jacobian matrices, at the core of gradient descent and thus of all neural networks
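
To make the first two items above concrete, here is a minimal NumPy sketch (the stored table and the query embedding are random placeholders, not real data) that scores a prompt embedding against every row of a backend table in one vectorized operation, with no Python loop, and returns the closest match. This is the basic step behind fast vector search in RAG systems:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical backend table: 10,000 stored embeddings of dimension 128
# (random values stand in for real embeddings)
table = rng.normal(size=(10_000, 128))

# Embedding derived from a prompt (also random here, for illustration)
query = rng.normal(size=128)

# Vector-wise, no Python loop: one matrix-vector product gives all dot products,
# then normalization turns them into cosine similarities
dots = table @ query
cos_sims = dots / (np.linalg.norm(table, axis=1) * np.linalg.norm(query))

# Index of the stored embedding most similar to the query
best = int(np.argmax(cos_sims))
print(f"closest stored vector: row {best}, cosine similarity {cos_sims[best]:.4f}")
```

Production systems replace the exhaustive scan with an approximate nearest-neighbor index, but the underlying vector operations are the same.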

This hands-on workshop is aimed at developers and AI professionals, and features state-of-the-art technology, case studies, code sharing, and live demos. The recording and GitHub material will be available to registrants who cannot attend the free 60-minute session.

Register here.


Shahab uddin

Learning Cloud Generative AI

2 months ago

Great, can't wait to join.

Gyan Prakash Rastogi

Human Resource Management Professional using Artificial Intelligence & SAP SuccessFactors.

2 months ago

Thanks - Good insight into the importance of vectors in AI and ML
