Introduction
Vector search has been used by tech giants like Google and Amazon for decades, and is widely credited as a significant driver of clicks, views, and sales across many platforms. Yet it was only with Faiss that this technology became broadly accessible.
In the past few years, vector search has exploded in popularity. It has driven e-commerce sales, powered music and podcast search, and even recommended your next favorite shows on streaming platforms. Vector search is everywhere, and in the following chapters you will discover why it has found such great success and how to apply it yourself using the Facebook AI Similarity Search (Faiss) library.
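Faiss itself is introduced in Chapter 1, but the core idea behind it is simple: represent items as vectors, then find the vectors closest to a query by distance. As a primer, here is a minimal NumPy sketch of the exact (exhaustive) search that a flat index performs; the example data and the choice of squared Euclidean distance are illustrative assumptions, not part of this book's text.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                    # vector dimensionality (toy value)
xb = rng.random((1000, d), dtype=np.float32)   # "database" vectors
# Query: a lightly perturbed copy of database vector 42
xq = xb[42] + np.float32(0.001) * rng.random(d, dtype=np.float32)

# Exhaustive search: squared L2 distance from the query to every vector
dists = np.sum((xb - xq) ** 2, axis=1)

# Indices of the k nearest neighbors, closest first
k = 4
nearest = np.argsort(dists)[:k]
print(nearest[0])  # the best match should be vector 42
```

Exhaustive search like this is exact but scales linearly with the dataset; the approximate indexes covered in later chapters (LSH, PQ, HNSW) trade a little accuracy for dramatically faster queries over millions of vectors.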
Chapter 1
Introduction to Facebook AI Similarity Search (Faiss)
An overview of the Faiss library and similarity search.
Chapter 2
Nearest Neighbor Indexes for Similarity Search
Learn how to choose the right index in Faiss.
Chapter 3
Locality Sensitive Hashing (LSH): The Illustrated Guide
Take your first steps toward a deeper understanding of approximate nearest neighbor indexes with LSH.
Chapter 4
Random Projection for Locality Sensitive Hashing
Apply LSH to modern dense vector representations using random projection.
Chapter 5
Product Quantization
Learn how Product Quantization (PQ) can be used to compress indexes by up to 97%.
Chapter 6
Hierarchical Navigable Small Worlds (HNSW)
HNSW graphs are among the top performing indexes in similarity search.
Chapter 7
Composite Indexes and the Faiss Index Factory
Learn how to apply all we have learned so far to create multi-step composite indexes.
New chapters coming soon!