Dive into the world of text embeddings. This course will guide you through leveraging text embeddings to enhance various natural language processing (NLP) tasks.
Implementation demonstrating how temperature, top-p (nucleus sampling), and top-k sampling parameters transform raw logits into probability distributions for text generation. Includes mathematical explanations and visual examples of each sampling strategy.
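Below is a minimal sketch (not the repository's code) of how these three parameters reshape raw logits into the distribution a token is drawn from: temperature rescales the logits, top-k keeps only the k highest-scoring tokens, and top-p keeps the smallest set of tokens whose cumulative probability reaches the threshold.

```python
# Illustrative sketch of temperature, top-k, and top-p (nucleus) sampling.
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    """Sample a token index from filtered, temperature-scaled logits."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)

    # Top-k: mask out everything below the k-th largest logit.
    if top_k > 0:
        kth_best = np.sort(logits)[-top_k]
        logits = np.where(logits < kth_best, -np.inf, logits)

    # Softmax to probabilities.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # Top-p (nucleus): keep the smallest prefix of tokens, in descending
    # probability order, whose cumulative mass reaches top_p; renormalize.
    if top_p < 1.0:
        order = np.argsort(probs)[::-1]
        cumulative = np.cumsum(probs[order])
        cutoff = np.searchsorted(cumulative, top_p) + 1
        keep = np.zeros_like(probs, dtype=bool)
        keep[order[:cutoff]] = True
        probs = np.where(keep, probs, 0.0)
        probs /= probs.sum()

    return rng.choice(len(probs), p=probs)

# Example with a tiny 5-token vocabulary.
print(sample_next_token([2.0, 1.0, 0.5, -1.0, -3.0], temperature=0.75, top_k=3, top_p=0.9))
```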
A short TeX paper formalizing the “Anthem” decoding recipe (temp=0.75, top_k=50, top_p=0.95, min_p=0.05) and explaining why pairing it with a strong persona/system prompt produces coherent, agentic “thinking-being” outputs.
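The recipe's min_p step is the one piece not covered by the sketch above; a small illustrative filter (not the paper's code) is shown here, using the parameter values quoted in the description. Min-p drops any token whose probability falls below min_p times the probability of the most likely token, so the cutoff adapts to how peaked the distribution is.

```python
# Illustrative min-p filter; recipe values from the description:
# temperature=0.75, top_k=50, top_p=0.95, min_p=0.05.
import numpy as np

def min_p_filter(probs, min_p=0.05):
    """Zero out tokens below min_p times the top token's probability, then renormalize."""
    probs = np.asarray(probs, dtype=np.float64)
    keep = probs >= min_p * probs.max()
    filtered = np.where(keep, probs, 0.0)
    return filtered / filtered.sum()

# With min_p=0.05 and a top probability of 0.50, tokens below 0.025 are pruned.
print(min_p_filter([0.50, 0.30, 0.15, 0.04, 0.01], min_p=0.05))
```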
phi3mini 🧊🖥️🛝 : Microsoft Phi 3 Mini Model # Generative AI # Chat Playground # Microsoft Foundry
sdkgenai 🛠️🔃📦 : Gen AI SDK # Model Parameters # Safety Filters # Multi-turn Chat # Content Streaming # Asynchronous Requests # Token Counting # Context Caching # Function Calling # Batch Prediction # Text Embeddings