Glossary

The Model Context Protocol (MCP) is an open standard that enables large language models (LLMs) to dynamically interact with external tools, databases, and APIs through a standardized interface. Introduced by Anthropic in November 2024 and later adopted by OpenAI, MCP replaces fragmented, one-off integrations with a single universal protocol for connecting AI systems to data sources, enabling seamless, secure, and scalable AI workflows.
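For illustration, the sketch below constructs the JSON-RPC 2.0 messages an MCP client exchanges with a server: first listing the tools the server exposes, then invoking one by name. The tool name and arguments are hypothetical; only the message shapes follow the protocol.

```python
import json

# MCP is built on JSON-RPC 2.0: a client asks a server which tools it
# exposes, then calls one of them by name with structured arguments.
list_tools_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

call_tool_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query_orders_db",               # hypothetical tool exposed by a server
        "arguments": {"customer_id": "C-1042"},  # hypothetical tool arguments
    },
}

print(json.dumps(call_tool_request, indent=2))
```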
LLM Observability is the comprehensive practice of monitoring, tracking, and analyzing the behavior, performance, and outputs of Large Language Models (LLMs) throughout their entire lifecycle, from development to production. It provides real-time visibility into every layer of an LLM-based system, enabling organizations to understand not just what their AI models are doing but why specific behaviors occur, ensuring reliable, safe, and cost-effective AI operations.
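As a minimal, vendor-neutral sketch (not any particular observability product's API), the wrapper below records latency, token counts, and an estimated cost for every LLM call; the per-token price and the stubbed model are assumed placeholders.

```python
import time
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class LLMTrace:
    prompt: str
    response: str
    latency_s: float
    prompt_tokens: int
    completion_tokens: int
    estimated_cost_usd: float

def traced_completion(
    call_llm: Callable[[str], Tuple[str, int, int]],
    prompt: str,
    usd_per_1k_tokens: float = 0.002,  # assumed placeholder price
) -> LLMTrace:
    """Wrap any LLM call so every request emits a structured trace record."""
    start = time.perf_counter()
    response, prompt_tokens, completion_tokens = call_llm(prompt)
    latency = time.perf_counter() - start
    cost = (prompt_tokens + completion_tokens) / 1000 * usd_per_1k_tokens
    return LLMTrace(prompt, response, latency, prompt_tokens, completion_tokens, cost)

# Example with a stubbed model standing in for a real LLM client:
trace = traced_completion(lambda p: ("Hello!", len(p.split()), 2), "Say hello")
print(trace)
```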
A Cost-Based Optimizer (CBO) is a query optimization framework designed to maximize database performance by systematically evaluating multiple candidate execution plans and selecting the one with the lowest estimated computational cost. In contrast to traditional rule-based optimizers, which depend on fixed heuristic rules, a CBO leverages statistical metadata, including data distribution, table cardinality, and index availability, to make context-aware, data-driven optimization decisions.
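The toy cost model below illustrates the idea: given table statistics (row count, predicate selectivity, index availability), it estimates the cost of a full scan versus an index lookup and picks the cheaper plan. The formulas are simplified for illustration and do not reflect any real database engine.

```python
import math

def choose_plan(table_rows: int, selectivity: float, has_index: bool,
                rows_per_page: int = 100) -> tuple[str, dict]:
    """Pick the cheapest of two candidate plans under a simplified cost model."""
    # Full scan: read every page of the table.
    costs = {"full_scan": table_rows / rows_per_page}
    if has_index:
        # Index lookup: B-tree descent plus one fetch per matching row.
        costs["index_lookup"] = math.log2(max(table_rows, 2)) + table_rows * selectivity
    best = min(costs, key=costs.get)
    return best, costs

# A highly selective predicate favors the index; a non-selective one favors the scan.
print(choose_plan(1_000_000, selectivity=0.0001, has_index=True))
print(choose_plan(1_000_000, selectivity=0.5, has_index=True))
```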
RAG (Retrieval-Augmented Generation) is an AI framework that enhances large language models (LLMs) by combining them with external knowledge retrieval systems. This architecture allows LLMs to access up-to-date, domain-specific information from external databases, documents, or knowledge bases during the generation process, significantly improving the accuracy, relevance, and factuality of AI-generated responses.
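A minimal retrieve-then-generate sketch, assuming document embeddings have already been computed: rank documents by cosine similarity to the query vector, then prepend the top matches to the prompt before calling the model.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec: np.ndarray, doc_vecs: list[np.ndarray],
             docs: list[str], k: int = 3) -> list[str]:
    """Return the k documents whose embeddings are most similar to the query."""
    scored = sorted(zip(docs, doc_vecs),
                    key=lambda pair: cosine(query_vec, pair[1]),
                    reverse=True)
    return [doc for doc, _ in scored[:k]]

def build_rag_prompt(question: str, context_docs: list[str]) -> str:
    """Augment the user question with retrieved context before generation."""
    context = "\n".join(f"- {d}" for d in context_docs)
    return (f"Answer using only the context below.\n"
            f"Context:\n{context}\n\nQuestion: {question}")
```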
Hybrid search is a powerful search approach that combines multiple search methodologies, primarily keyword-based (lexical) search and vector-based (semantic) search, to deliver more comprehensive and accurate search results. By leveraging the strengths of both exact term matching and semantic understanding, hybrid search provides users with relevant results that capture both literal matches and contextual meaning, significantly improving search precision and user satisfaction.
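One common way to merge the two result lists is Reciprocal Rank Fusion (RRF); the sketch below fuses a keyword ranking and a vector ranking into a single score per document (the document IDs are illustrative).

```python
from collections import defaultdict

def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[tuple[str, float]]:
    """Fuse several ranked lists: each document earns 1 / (k + rank) from every list."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

keyword_hits = ["doc3", "doc1", "doc7"]   # lexical (e.g. BM25) ranking
semantic_hits = ["doc1", "doc9", "doc3"]  # vector-similarity ranking
print(reciprocal_rank_fusion([keyword_hits, semantic_hits]))
# Documents appearing high in both lists (doc1, doc3) rise to the top.
```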
Vector search is a modern search technique that enables finding similar items by converting data into high-dimensional numerical representations called vectors or embeddings. Unlike traditional keyword-based search that matches exact terms, vector search understands semantic meaning and context, allowing users to find relevant content even when exact keywords don't match. This technology powers recommendation systems, similarity search, and AI applications by measuring mathematical distances between vectors in multi-dimensional space.
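The sketch below shows the core operation: rank stored vectors by cosine similarity to a query vector and return the nearest ones. The toy 3-dimensional vectors stand in for real embeddings, which typically have hundreds or thousands of dimensions.

```python
import numpy as np

def top_k_similar(query: np.ndarray, vectors: np.ndarray, k: int = 2) -> list[int]:
    """Return indices of the k vectors closest to the query by cosine similarity."""
    sims = vectors @ query / (np.linalg.norm(vectors, axis=1) * np.linalg.norm(query))
    return list(np.argsort(-sims)[:k])

# Toy "embeddings" for three items.
corpus = np.array([
    [0.90, 0.10, 0.00],   # item 0
    [0.10, 0.80, 0.10],   # item 1
    [0.85, 0.20, 0.05],   # item 2
])
query_vec = np.array([1.0, 0.0, 0.0])
print(top_k_similar(query_vec, corpus))  # items 0 and 2 are nearest to the query
```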