Embeddings transform text into numerical vectors that capture semantic meaning. Combined with vector databases, they enable powerful features like semantic search, recommendation systems, and retrieval-augmented generation (RAG).
An embedding is a dense vector (list of floating-point numbers) that represents the meaning of a piece of text. Texts with similar meanings have vectors that are close together in the embedding space.
"The cat sat on the mat" → [0.023, -0.114, 0.891, ...] (1536 dimensions)
"A kitten was on the rug" → [0.025, -0.109, 0.887, ...] (very similar!)
"Stock prices rose today" → [-0.412, 0.067, 0.203, ...] (very different)
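"Close together" is usually measured with cosine similarity: the dot product of two vectors divided by the product of their lengths, which is 1.0 for identical directions and lower for unrelated text. A minimal sketch, using the truncated 3-dimensional example vectors above (real embeddings have hundreds or thousands of dimensions):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

cat = [0.023, -0.114, 0.891]      # "The cat sat on the mat"
kitten = [0.025, -0.109, 0.887]   # "A kitten was on the rug"
stocks = [-0.412, 0.067, 0.203]   # "Stock prices rose today"

print(cosine_similarity(cat, kitten))  # ≈ 1.0 — near-identical meaning
print(cosine_similarity(cat, stocks))  # noticeably lower
```

In practice, vector databases compute this (or an equivalent distance) for you at query time; the formula is worth knowing because it explains what "nearest neighbor" means in embedding space.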
| Use Case | How Embeddings Help |
|---|---|
| Semantic search | Find results by meaning, not just keywords |
| Recommendations | Suggest similar items based on vector proximity |
| Clustering | Group similar documents automatically |
| RAG | Retrieve relevant context for LLM prompts |
| Anomaly detection | Identify outliers in vector space |
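The first two rows of the table reduce to the same operation: embed a query, then rank stored vectors by similarity to it. A toy sketch of that ranking step, with made-up 3-dimensional vectors standing in for real embedding-API output (the document texts and values here are hypothetical):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend these came from an embeddings API (hypothetical values).
documents = {
    "The cat sat on the mat": [0.02, -0.11, 0.89],
    "A kitten was on the rug": [0.03, -0.10, 0.88],
    "Stock prices rose today": [-0.41, 0.07, 0.20],
}
query_vector = [0.02, -0.12, 0.90]  # pretend embedding of "where is the cat?"

# Rank documents by similarity to the query — the core of semantic search.
ranked = sorted(documents,
                key=lambda d: cosine_similarity(query_vector, documents[d]),
                reverse=True)
print(ranked[0])   # most relevant document
print(ranked[-1])  # least relevant document
```

A real system delegates the ranking to a vector database index instead of a linear scan, but the scoring logic is the same.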
```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.embeddings.create(
    model="text-embedding-3-small",
    input="The cat sat on the mat",
)

# The vector itself: a list of 1536 floats for text-embedding-3-small.
embedding = response.data[0].embedding
```