PartialContextQuestionProcessor
| Minimum Version | Q4 2025 |
|---|---|
The PartialContextQuestionProcessor class enables you to ask questions about an Excel document and receive answers based on the most relevant parts of the document content. This processor uses embeddings to identify and send only the relevant portions of the document to the AI model, making it more efficient for token usage and more suitable for large documents. This class inherits from the abstract AIProcessorBase class, which provides common functionality for all AI processors.
The PartialContextQuestionProcessor is ideal for the following scenarios:
- Large Documents: When the document exceeds the token limit of the AI model and cannot be processed in a single call.
- Efficient Token Usage: When you want to minimize token consumption and optimize costs.
- Specific Questions: When questions are targeted at specific information within the document rather than requiring complete document understanding.
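Conceptually, the processor splits the document into chunks, embeds each chunk as well as the question, and sends only the chunks most similar to the question to the model. The following standalone sketch illustrates the similarity-ranking idea; it is not the Telerik API, and all names in it are illustrative:

```csharp
using System;
using System.Linq;

// Cosine similarity between two embedding vectors.
static double CosineSimilarity(double[] a, double[] b)
{
    double dot = 0, na = 0, nb = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
    }
    return dot / (Math.Sqrt(na) * Math.Sqrt(nb));
}

// Illustrative (tiny) chunk embeddings and a question embedding.
double[][] chunkEmbeddings =
{
    new[] { 1.0, 0.0 },  // chunk 0
    new[] { 0.0, 1.0 },  // chunk 1
    new[] { 0.7, 0.7 },  // chunk 2
};
double[] questionEmbedding = { 1.0, 0.1 };

// Rank chunks by similarity to the question and keep only the top 2;
// only those chunks would be sent to the answering model as context.
int[] topChunks = Enumerable.Range(0, chunkEmbeddings.Length)
    .OrderByDescending(i => CosineSimilarity(chunkEmbeddings[i], questionEmbedding))
    .Take(2)
    .ToArray();

Console.WriteLine(string.Join(", ", topChunks)); // indices of the most relevant chunks first
```

In the real processor this selection is driven by the IEmbedder and IContextRetriever abstractions and bounded by the token limits configured in IEmbeddingSettings.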
Public API and Configuration
| Constructor | Platform | Description |
|---|---|---|
| PartialContextQuestionProcessor(IChatClient chatClient, IEmbeddingSettings settings, SimpleTextDocument document) | Specific* | Creates an instance with built-in embeddings storage |
| PartialContextQuestionProcessor(IChatClient chatClient, IEmbedder embedder, IEmbeddingSettings settings, SimpleTextDocument document) | Any | Creates an instance with custom embedding |
| PartialContextQuestionProcessor(IChatClient chatClient, IContextRetriever contextRetriever, IEmbeddingSettings settings, SimpleTextDocument document) | Any | Allows custom PartialContextQuestionProcessor configuration |
*Specific: On .NET 8+ targeting Windows (with the packages for .NET 8 and .NET 10 for Windows), this constructor uses a default IEmbedder internally. The cross-platform and .NET Framework constructors require a custom implementation of IEmbedder, as shown in the Implementing Custom IEmbedder section.
Properties and Methods
| Member | Type | Description |
|---|---|---|
| Settings | Property | Gets the readonly IEmbeddingSettings that configure the AI process |
| AnswerQuestion(string question) | Method | Returns an answer to the question using relevant document context |
Security Warning: The output produced by this API is generated by a Large Language Model (LLM). As such, the content should be considered untrusted and may include unexpected or unsafe data. It is strongly recommended to properly sanitize or encode all output before displaying it in a user interface, logging, or using it in any security-sensitive context.
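For example, when displaying an answer in an HTML-based UI, encode it before rendering. A minimal sketch using only the standard library (the answer string is a made-up example):

```csharp
using System;
using System.Net;

// Treat model output as untrusted text: encode it before rendering in HTML.
string answer = "<script>alert('xss')</script> Total revenue: 42";
string safeAnswer = WebUtility.HtmlEncode(answer);
Console.WriteLine(safeAnswer);
```

The same principle applies to logging and to any other sink that interprets markup or control characters.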
IEmbeddingSettings
The settings can be created only through the EmbeddingSettingsFactory.CreateSettingsForSpreadDocuments method and are passed to the processor constructor. They expose the following configuration options for the question-answering process:
- ModelMaxInputTokenLimit: Maximum input token limit the model allows
- ModelId: ID of the AI model (default: null)
- TokenizationEncoding: Tokenization encoding used (default: null)
- EmbeddingTokenSize: Size in tokens of each context chunk (default: null)
- ProduceJsonFormattedContext: Whether the context should be formatted as JSON (default: false)
- TotalContextTokenLimit: Total token limit for the context sent to the answering model (default: null, which uses half of ModelMaxInputTokenLimit)
The following snippet shows how to create the settings through the factory and pass them to the processor:

```csharp
// Convert the document to a simple text representation
SimpleTextDocument plainDoc = workbook.ToSimpleTextDocument(TimeSpan.FromSeconds(20));

// Set up the AI client (Azure OpenAI in this example)
string key = "AZUREOPENAI_KEY";
string endpoint = "AZUREOPENAI_ENDPOINT";
string model = "gpt-4o-mini";
Azure.AI.OpenAI.AzureOpenAIClient azureClient = new AzureOpenAIClient(
    new Uri(endpoint),
    new Azure.AzureKeyCredential(key),
    new Azure.AI.OpenAI.AzureOpenAIClientOptions());
ChatClient chatClient = azureClient.GetChatClient(model);
IChatClient iChatClient = new OpenAIChatClient(chatClient);

int modelMaxInputTokenLimit = 128000;
int totalContextTokenLimit = 50000;
int embeddingTokenSize = 500;
string tokenizationEncoding = "cl100k_base";
bool produceJsonFormattedContext = false;

IEmbeddingSettings partialProcessorSettings = EmbeddingSettingsFactory.CreateSettingsForSpreadDocuments(
    modelMaxInputTokenLimit, model, tokenizationEncoding, embeddingTokenSize, produceJsonFormattedContext, totalContextTokenLimit);

// Create the processor with a custom IEmbedder implementation
IEmbedder customEmbedder = new CustomOpenAIEmbedder();
using (PartialContextQuestionProcessor processor = new PartialContextQuestionProcessor(iChatClient, customEmbedder, partialProcessorSettings, plainDoc))
{
    // Ask a question
    string question = "What content does the document contain?";
    string answer = await processor.AnswerQuestion(question);
}
```
Usage Examples
Example 1: Using PartialContextQuestionProcessor with default embedding.
This example demonstrates how to use the PartialContextQuestionProcessor with the built-in embedding, which is available on .NET 8+ targeting Windows (with the packages for .NET 8 and .NET 10 for Windows). For setting up the AI client, see the AI Provider Setup section:
```csharp
public async Task AskQuestionsUsingPartialContext()
{
    // Load the XLSX document
    string filePath = @"path\to\your\document.xlsx";
    XlsxFormatProvider formatProvider = new XlsxFormatProvider();
    Workbook workbook;
    using (FileStream fs = File.OpenRead(filePath))
    {
        workbook = formatProvider.Import(fs, TimeSpan.FromSeconds(10));
    }

    // Convert the document to a simple text representation
    SimpleTextDocument plainDoc = workbook.ToSimpleTextDocument(TimeSpan.FromSeconds(10));

    // Set up the AI client (Azure OpenAI in this example)
    string key = "AZUREOPENAI_KEY";
    string endpoint = "AZUREOPENAI_ENDPOINT";
    string model = "gpt-4o-mini";
    Azure.AI.OpenAI.AzureOpenAIClient azureClient = new AzureOpenAIClient(
        new Uri(endpoint),
        new Azure.AzureKeyCredential(key),
        new Azure.AI.OpenAI.AzureOpenAIClientOptions());
    ChatClient chatClient = azureClient.GetChatClient(model);
    IChatClient iChatClient = new OpenAIChatClient(chatClient);

    int modelMaxInputTokenLimit = 128000;
    int embeddingTokenSize = 500;

    // Create the settings and the processor with the built-in DefaultEmbeddingsStorage (.NET 8+ for Windows only)
    IEmbeddingSettings partialProcessorSettings = EmbeddingSettingsFactory.CreateSettingsForSpreadDocuments(
        modelMaxInputTokenLimit, model, null, embeddingTokenSize, false, null);
    using (PartialContextQuestionProcessor processor = new PartialContextQuestionProcessor(iChatClient, partialProcessorSettings, plainDoc))
    {
        // Ask a question
        string question = "What content does the document contain?";
        string answer = await processor.AnswerQuestion(question);
        Console.WriteLine($"Question: {question}");
        Console.WriteLine($"Answer: {answer}");

        // Ask additional questions using the same processor
        string question2 = "What is the main topic of the document?";
        string answer2 = await processor.AnswerQuestion(question2);
        Console.WriteLine($"\nQuestion: {question2}");
        Console.WriteLine($"Answer: {answer2}");
    }
}
```
Example 2: Using PartialContextQuestionProcessor with Custom IEmbedder
This example demonstrates how to use the PartialContextQuestionProcessor with a custom IEmbedder implementation as described in the Implementing Custom IEmbedder section:
```csharp
public async Task AskQuestionsUsingPartialContext()
{
    // Load the XLSX document
    string filePath = @"path\to\your\document.xlsx";
    XlsxFormatProvider formatProvider = new XlsxFormatProvider();
    Workbook workbook;
    using (FileStream fs = File.OpenRead(filePath))
    {
        workbook = formatProvider.Import(fs, TimeSpan.FromSeconds(10));
    }

    // Convert the document to a simple text representation
    SimpleTextDocument plainDoc = workbook.ToSimpleTextDocument(TimeSpan.FromSeconds(10));

    // Set up the AI client (Azure OpenAI in this example)
    string key = "AZUREOPENAI_KEY";
    string endpoint = "AZUREOPENAI_ENDPOINT";
    string model = "gpt-4o-mini";
    Azure.AI.OpenAI.AzureOpenAIClient azureClient = new AzureOpenAIClient(
        new Uri(endpoint),
        new Azure.AzureKeyCredential(key),
        new Azure.AI.OpenAI.AzureOpenAIClientOptions());
    ChatClient chatClient = azureClient.GetChatClient(model);
    IChatClient iChatClient = new OpenAIChatClient(chatClient);

    int modelMaxInputTokenLimit = 128000;
    int embeddingTokenSize = 500;

    // Create the processor with a custom IEmbedder implementation
    IEmbedder customEmbedder = new CustomOpenAIEmbedder();
    IEmbeddingSettings partialProcessorSettings = EmbeddingSettingsFactory.CreateSettingsForSpreadDocuments(
        modelMaxInputTokenLimit, model, null, embeddingTokenSize, false, null);
    using (PartialContextQuestionProcessor processor = new PartialContextQuestionProcessor(iChatClient, customEmbedder, partialProcessorSettings, plainDoc))
    {
        // Ask a question
        string question = "What are the main conclusions of the document?";
        string answer = await processor.AnswerQuestion(question);
        Console.WriteLine($"Question: {question}");
        Console.WriteLine($"Answer: {answer}");
    }
}
```
Implementing Custom IEmbedder
A sample CustomOpenAIEmbedder implementation of the IEmbedder interface is shown in the following code snippet. It requires installing the following NuGet packages:
- Azure.AI.OpenAI
- Microsoft.Extensions.AI.OpenAI (v9.3)
- Telerik.Windows.Documents.AIConnector
- Telerik.Windows.Documents.Spreadsheet
```csharp
public class CustomOpenAIEmbedder : IEmbedder
{
    private readonly HttpClient httpClient;

    public CustomOpenAIEmbedder()
    {
        HttpClient httpClient = new HttpClient();
        httpClient.Timeout = TimeSpan.FromMinutes(5);

        string endpoint = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_ENDPOINT");
        string apiKey = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_KEY");
        httpClient.BaseAddress = new Uri(endpoint);
        httpClient.DefaultRequestHeaders.Add("api-key", apiKey);
        httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

        this.httpClient = httpClient;
    }

    public async Task<IList<Embedding>> EmbedAsync(IList<IFragment> fragments)
    {
        AzureEmbeddingsRequest requestBody = new AzureEmbeddingsRequest
        {
            Input = fragments.Select(p => p.ToEmbeddingText()).ToArray(),
            Dimensions = 3072
        };
        string json = JsonSerializer.Serialize(requestBody);
        StringContent content = new StringContent(json, Encoding.UTF8, "application/json");

        string apiVersion = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_APIVERSION");
        string deploymentName = Environment.GetEnvironmentVariable("AZUREEMBEDDINGOPENAI_DEPLOYMENT");
        string url = $"openai/deployments/{deploymentName}/embeddings?api-version={apiVersion}";

        using HttpResponseMessage response = await this.httpClient.PostAsync(url, content, CancellationToken.None);
        response.EnsureSuccessStatusCode();

        string responseJson = await response.Content.ReadAsStringAsync(CancellationToken.None);
        AzureEmbeddingsResponse responseObj = JsonSerializer.Deserialize<AzureEmbeddingsResponse>(responseJson);

        // The service may return items out of order; sort by index so each
        // embedding is paired with the fragment it was produced from.
        List<EmbeddingData> sorted = responseObj.Data.OrderBy(d => d.Index).ToList();
        Embedding[] embeddings = new Embedding[fragments.Count];
        for (int i = 0; i < sorted.Count; i++)
        {
            embeddings[i] = new Embedding(fragments[i], sorted[i].Embedding);
        }

        return embeddings;
    }

    private sealed class AzureEmbeddingsRequest
    {
        [System.Text.Json.Serialization.JsonPropertyName("input")]
        public string[] Input { get; set; } = Array.Empty<string>();

        [System.Text.Json.Serialization.JsonPropertyName("dimensions")]
        public int? Dimensions { get; set; }
    }

    private sealed class AzureEmbeddingsResponse
    {
        [System.Text.Json.Serialization.JsonPropertyName("data")]
        public EmbeddingData[] Data { get; set; } = Array.Empty<EmbeddingData>();

        [System.Text.Json.Serialization.JsonPropertyName("model")]
        public string? Model { get; set; }

        [System.Text.Json.Serialization.JsonPropertyName("usage")]
        public UsageInfo? Usage { get; set; }
    }

    private sealed class UsageInfo
    {
        [System.Text.Json.Serialization.JsonPropertyName("prompt_tokens")]
        public int PromptTokens { get; set; }

        [System.Text.Json.Serialization.JsonPropertyName("total_tokens")]
        public int TotalTokens { get; set; }
    }

    private sealed class EmbeddingData
    {
        [System.Text.Json.Serialization.JsonPropertyName("embedding")]
        public float[] Embedding { get; set; } = Array.Empty<float>();

        [System.Text.Json.Serialization.JsonPropertyName("index")]
        public int Index { get; set; }
    }
}
```