Available for: UI for ASP.NET MVC | UI for ASP.NET AJAX | UI for Blazor | UI for WPF | UI for WinForms | UI for Silverlight | UI for Xamarin | UI for WinUI | UI for ASP.NET Core | UI for .NET MAUI


PartialContextQuestionProcessor

The PartialContextQuestionProcessor class enables you to ask questions about a PDF document and receive answers based on the most relevant parts of the document content. This processor uses embeddings to identify and send only the relevant portions of the document to the AI model, making it more efficient for token usage and more suitable for large documents. This class inherits from the abstract AIProcessorBase class, which provides common functionality for all AI processors.

The PartialContextQuestionProcessor is ideal for the following scenarios:

  1. Large Documents: When the document exceeds the token limit of the AI model and cannot be processed in a single call.
  2. Efficient Token Usage: When you want to minimize token consumption and optimize costs.
  3. Specific Questions: When questions are targeted at specific information within the document rather than requiring complete document understanding.

Public API and Configuration

| Constructor | Platform | Description |
| --- | --- | --- |
| PartialContextQuestionProcessor(IChatClient chatClient, IEmbeddingSettings settings, SimpleTextDocument document) | Specific* | Creates an instance with built-in embeddings storage |
| PartialContextQuestionProcessor(IChatClient chatClient, IEmbedder embedder, IEmbeddingSettings settings, SimpleTextDocument document) | Any | Creates an instance with custom embedding |
| PartialContextQuestionProcessor(IChatClient chatClient, IContextRetriever contextRetriever, IEmbeddingSettings settings, SimpleTextDocument document) | Any | Allows custom PartialContextQuestionProcessor configuration |

*Specific: The constructor that omits the IEmbedder parameter is available on .NET 8+ (Target OS Windows) with the packages for .NET 8 and .NET 10 for Windows and uses a default IEmbedder internally, while the cross-platform and .NET Framework constructor requires a custom implementation of IEmbedder, as shown in the Implementing Custom IEmbedder section.

Properties and Methods

| Member | Type | Description |
| --- | --- | --- |
| Settings | Property | Gets the read-only IEmbeddingSettings that configure the AI process |
| AnswerQuestion(string question) | Method | Returns an answer to the question using relevant document context |

Security Warning: The output produced by this API is generated by a Large Language Model (LLM). As such, the content should be considered untrusted and may include unexpected or unsafe data. It is strongly recommended to properly sanitize or encode all output before displaying it in a user interface, logging, or using it in any security-sensitive context.
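
For example, when the answer is rendered in a web page, it can be HTML-encoded before display. The following snippet is a minimal sketch that uses the standard System.Net.WebUtility class and assumes a processor instance created as shown in the examples below; apply the encoding or sanitization appropriate for your output target:

string question = "What are the key findings in the document?";
string answer = await processor.AnswerQuestion(question);

// Treat the LLM output as untrusted: HTML-encode it before inserting it into HTML markup
string safeAnswer = System.Net.WebUtility.HtmlEncode(answer);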

IEmbeddingSettings

The settings are created only through EmbeddingSettingsFactory's CreateSettingsForTextDocuments method and can only be passed to the constructor of the processor. They expose configuration options for the question-answering process through the following properties:

  • ModelMaxInputTokenLimit: Maximum input token limit the model allows
  • TokenizationEncoding: Tokenization encoding used (default: null)
  • ModelId: ID of the AI model (default: null)
  • MaxNumberOfEmbeddingsSent: Maximum number of context chunks sent (default: null)
  • EmbeddingTokenSize: Size in tokens of each context chunk (default: null)
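
For example, with MaxNumberOfEmbeddingsSent set to 20 and EmbeddingTokenSize set to 500 (the values used in the snippets below), at most roughly 20 × 500 = 10,000 tokens of document context are sent with each question, which fits well within a 128,000-token input limit.

The following snippet shows how to create the settings through the EmbeddingSettingsFactory and pass them to the processor constructor:
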
// Convert the document to a simple text representation
SimpleTextDocument plainDoc = fixedDocument.ToSimpleTextDocument(TimeSpan.FromSeconds(20));

// Set up the AI client (Azure OpenAI in this example)
string key = "AZUREOPENAI_KEY";
string endpoint = "AZUREOPENAI_ENDPOINT";
string model = "gpt-4o-mini";

Azure.AI.OpenAI.AzureOpenAIClient azureClient = new AzureOpenAIClient(
    new Uri(endpoint),
    new Azure.AzureKeyCredential(key),
    new Azure.AI.OpenAI.AzureOpenAIClientOptions());
ChatClient chatClient = azureClient.GetChatClient(model);

IChatClient iChatClient = new OpenAIChatClient(chatClient);
int maxTokenCount = 128000;
int maxNumberOfEmbeddingsSent = 20;
int embeddingTokenSize = 500;

IEmbeddingSettings partialProcessorSettings = EmbeddingSettingsFactory.CreateSettingsForTextDocuments(maxTokenCount, model, null, maxNumberOfEmbeddingsSent, embeddingTokenSize);

// Create the processor with a custom IEmbedder implementation
IEmbedder customEmbedder = new CustomOpenAIEmbedder();
using (PartialContextQuestionProcessor processor = new PartialContextQuestionProcessor(iChatClient, customEmbedder, partialProcessorSettings, plainDoc))
{
    // Ask a question
}

Usage Examples

Example 1: Using PartialContextQuestionProcessor with default embedding

This example demonstrates how to use the PartialContextQuestionProcessor with the built-in embeddings storage, available on .NET 8+ (Target OS Windows) with the packages for .NET 8 and .NET 10 for Windows. For setting up the AI client, see the AI Provider Setup section:

async Task AskQuestionsUsingPartialContext()
{
    // Load the PDF document
    string filePath = @"path\to\your\document.pdf";
    PdfFormatProvider formatProvider = new PdfFormatProvider();
    RadFixedDocument fixedDocument;

    using (FileStream fs = File.OpenRead(filePath))
    {
        fixedDocument = formatProvider.Import(fs, TimeSpan.FromSeconds(20));
    }

    // Convert the document to a simple text representation
    SimpleTextDocument plainDoc = fixedDocument.ToSimpleTextDocument(TimeSpan.FromSeconds(20));

    // Set up the AI client (Azure OpenAI in this example)
    string key = "AZUREOPENAI_KEY";
    string endpoint = "AZUREOPENAI_ENDPOINT";
    string model = "gpt-4o-mini";

    Azure.AI.OpenAI.AzureOpenAIClient azureClient = new AzureOpenAIClient(
        new Uri(endpoint),
        new Azure.AzureKeyCredential(key),
        new Azure.AI.OpenAI.AzureOpenAIClientOptions());
    ChatClient chatClient = azureClient.GetChatClient(model);

    IChatClient iChatClient = new OpenAIChatClient(chatClient);
    int maxTokenCount = 128000;
    int maxNumberOfEmbeddingsSent = 20;
    int embeddingTokenSize = 500;

    // Create the processor with built-in DefaultEmbeddingsStorage (.NET 8+ only)
    IEmbeddingSettings partialProcessorSettings = EmbeddingSettingsFactory.CreateSettingsForTextDocuments(maxTokenCount, model, null, maxNumberOfEmbeddingsSent, embeddingTokenSize);

    using (PartialContextQuestionProcessor processor = new PartialContextQuestionProcessor(iChatClient, partialProcessorSettings, plainDoc))
    {
        // Ask a question
        string question = "What are the key findings in the document?";
        string answer = await processor.AnswerQuestion(question);

        Console.WriteLine($"Question: {question}");
        Console.WriteLine($"Answer: {answer}");

        // Ask additional questions using the same processor
        string question2 = "What methodology was used in the research?";
        string answer2 = await processor.AnswerQuestion(question2);

        Console.WriteLine($"\nQuestion: {question2}");
        Console.WriteLine($"Answer: {answer2}");
    }
}

Example 2: Using PartialContextQuestionProcessor with Custom IEmbedder

This example demonstrates how to use the PartialContextQuestionProcessor with a custom IEmbedder implementation as described in the Implementing Custom IEmbedder section:

async Task AskQuestionsUsingPartialContext()
{
    // Load the PDF document
    string filePath = @"path\to\your\document.pdf";
    PdfFormatProvider formatProvider = new PdfFormatProvider();
    RadFixedDocument fixedDocument;

    using (FileStream fs = File.OpenRead(filePath))
    {
        fixedDocument = formatProvider.Import(fs, TimeSpan.FromSeconds(10));
    }

    // Convert the document to a simple text representation
    SimpleTextDocument plainDoc = fixedDocument.ToSimpleTextDocument(TimeSpan.FromSeconds(20));

    // Set up the AI client (Azure OpenAI in this example)
    string key = "AZUREOPENAI_KEY";
    string endpoint = "AZUREOPENAI_ENDPOINT";
    string model = "gpt-4o-mini";

    Azure.AI.OpenAI.AzureOpenAIClient azureClient = new AzureOpenAIClient(
        new Uri(endpoint),
        new Azure.AzureKeyCredential(key),
        new Azure.AI.OpenAI.AzureOpenAIClientOptions());
    ChatClient chatClient = azureClient.GetChatClient(model);

    IChatClient iChatClient = new OpenAIChatClient(chatClient);
    int maxTokenCount = 128000;
    int maxNumberOfEmbeddingsSent = 20;
    int embeddingTokenSize = 500;

    // Create the processor with a custom IEmbedder implementation
    IEmbedder customEmbedder = new CustomOpenAIEmbedder();

    IEmbeddingSettings partialProcessorSettings = EmbeddingSettingsFactory.CreateSettingsForTextDocuments(maxTokenCount, model, null, maxNumberOfEmbeddingsSent, embeddingTokenSize);
    using (PartialContextQuestionProcessor processor = new PartialContextQuestionProcessor(iChatClient, customEmbedder, partialProcessorSettings, plainDoc))
    {
        // Ask a question
        string question = "What are the main conclusions of the document?";
        string answer = await processor.AnswerQuestion(question);

        Console.WriteLine($"Question: {question}");
        Console.WriteLine($"Answer: {answer}");
    }
}

Implementing Custom IEmbedder

A sample CustomOpenAIEmbedder implementation of the IEmbedder interface is shown in the code snippet below:

Requires installing the following NuGet packages:

  • Azure.AI.OpenAI
  • Microsoft.Extensions.AI.OpenAI (v9.3)
  • Telerik.Windows.Documents.AIConnector
  • Telerik.Windows.Documents.Fixed

public class CustomOpenAIEmbedder : IEmbedder
{
    private readonly HttpClient httpClient;

    public CustomOpenAIEmbedder()
    {
        HttpClient httpClient = new HttpClient();
        httpClient.Timeout = TimeSpan.FromMinutes(5);
        string endpoint = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_ENDPOINT");
        string apiKey = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_KEY");

        httpClient.BaseAddress = new Uri(endpoint);
        httpClient.DefaultRequestHeaders.Add("api-key", apiKey);
        httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

        this.httpClient = httpClient;
    }

    public async Task<IList<Embedding>> EmbedAsync(IList<IFragment> fragments)
    {
        AzureEmbeddingsRequest requestBody = new AzureEmbeddingsRequest
        {
            Input = fragments.Select(p => p.ToEmbeddingText()).ToArray(),
            Dimensions = 3072
        };

        string json = JsonSerializer.Serialize(requestBody);
        StringContent content = new StringContent(json, Encoding.UTF8, "application/json");

        string apiVersion = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_APIVERSION");
        string deploymentName = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_DEPLOYMENT");
        string url = $"openai/deployments/{deploymentName}/embeddings?api-version={apiVersion}";
        using HttpResponseMessage response = await this.httpClient.PostAsync(url, content, CancellationToken.None);
        response.EnsureSuccessStatusCode();

        Embedding[] embeddings = new Embedding[fragments.Count];

        string responseJson = await response.Content.ReadAsStringAsync(CancellationToken.None);
        AzureEmbeddingsResponse responseObj = JsonSerializer.Deserialize<AzureEmbeddingsResponse>(responseJson);

        List<EmbeddingData> sorted = responseObj.Data.OrderBy(d => d.Index).ToList();

        for (int i = 0; i < sorted.Count; i++)
        {
            EmbeddingData item = sorted[i];
            embeddings[i] = new Embedding(fragments[i], item.Embedding);
        }

        return embeddings;
    }

    private sealed class AzureEmbeddingsRequest
    {
        [System.Text.Json.Serialization.JsonPropertyName("input")]
        public string[] Input { get; set; } = Array.Empty<string>();

        [System.Text.Json.Serialization.JsonPropertyName("dimensions")]
        public int? Dimensions { get; set; }
    }

    private sealed class AzureEmbeddingsResponse
    {
        [System.Text.Json.Serialization.JsonPropertyName("data")]
        public EmbeddingData[] Data { get; set; } = Array.Empty<EmbeddingData>();

        [System.Text.Json.Serialization.JsonPropertyName("model")]
        public string? Model { get; set; }

        [System.Text.Json.Serialization.JsonPropertyName("usage")]
        public UsageInfo? Usage { get; set; }
    }

    private sealed class UsageInfo
    {
        [System.Text.Json.Serialization.JsonPropertyName("prompt_tokens")]
        public int PromptTokens { get; set; }

        [System.Text.Json.Serialization.JsonPropertyName("total_tokens")]
        public int TotalTokens { get; set; }
    }

    private sealed class EmbeddingData
    {
        [System.Text.Json.Serialization.JsonPropertyName("embedding")]
        public float[] Embedding { get; set; } = Array.Empty<float>();

        [System.Text.Json.Serialization.JsonPropertyName("index")]
        public int Index { get; set; }
    }
}
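
This implementation calls the Azure OpenAI embeddings REST endpoint directly. Any embedding provider can be plugged in the same way, as long as EmbedAsync returns one Embedding per input fragment and preserves the fragment order, which is what the index-based mapping above relies on.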

Example 3: Processing Specific Pages

// Convert only pages 5-10 to a simple text document (0-based index)
// ISimpleTextDocument partialDoc = fixedDocument.ToSimpleTextDocument(4, 9); // Obsolete since Q4 2025
SimpleTextDocument partialDoc = fixedDocument.ToSimpleTextDocument(4, 9, TimeSpan.FromSeconds(20));
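
The resulting partialDoc is a SimpleTextDocument and is passed to the PartialContextQuestionProcessor constructor in the same way as a full document, so the answers are based only on the selected pages.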
