
Enable AI-Powered Insights with Built-in AI Client

This tutorial shows how to enable and configure AI-powered insights using built-in support for popular LLM providers, such as Azure OpenAI, OpenAI, Azure AI Foundry, and Ollama, so that end users can run predefined or custom prompts against the data behind the currently previewed report and receive responses from an LLM.

If you use a Telerik Report Server instead of a standalone Telerik Reporting REST service, check the Report Server article AI-Powered Features Settings instead.

Prerequisites

To follow the steps from this tutorial, you must have:

You can also connect to LLM providers that are not supported out of the box. To do this, create a custom Telerik.Reporting.AI.IClient implementation to integrate the provider into Reporting and enable the AI-powered insights functionality. For more details, refer to the article Enable AI-Powered Insights with Custom AI Client.

Using AI-Powered Insights with a REST service

To enable the AI-powered insights, follow these steps:

  1. Install exactly one of the following NuGet packages, depending on the LLM provider you use:

    • Telerik.Reporting.AI.Microsoft.Extensions.AzureAIInference—for Azure AI Foundry
    • Telerik.Reporting.AI.Microsoft.Extensions.AzureOpenAI—for Azure OpenAI resources
    • Telerik.Reporting.AI.Microsoft.Extensions.OpenAI—for OpenAI
    • Telerik.Reporting.AI.Microsoft.Extensions.Ollama—for Ollama
  2. Add the AIClient element to the report engine configuration in your application's configuration file. This element specifies the AI model, endpoint, and authentication credentials. The following example demonstrates a basic Azure OpenAI configuration, first in JSON (for appsettings.json) and then in XML (for app.config/web.config):

{
    "telerikReporting": {
        "AIClient": {
            "friendlyName": "MicrosoftExtensionsAzureOpenAI",
            "model": "gpt-4o-mini",
            "endpoint": "https://ai-explorations.openai.azure.com/",
            "credential": "YOUR_API_KEY"
        }
    }
}
<Telerik.Reporting>
    <AIClient
        friendlyName="MicrosoftExtensionsAzureOpenAI"
        model="gpt-4o-mini"
        endpoint="https://ai-explorations.openai.azure.com/"
        credential="YOUR_API_KEY">
    </AIClient>
</Telerik.Reporting>
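For comparison, a configuration for a local Ollama instance typically requires no credential. The sketch below assumes Ollama's default local endpoint; the model name is a placeholder for whichever model you have pulled:

```json
{
    "telerikReporting": {
        "AIClient": {
            "friendlyName": "MicrosoftExtensionsOllama",
            "model": "llama3.1",
            "endpoint": "http://localhost:11434/"
        }
    }
}
```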

If you haven't configured the report engine before, see the article Report Engine Configuration Overview to familiarize yourself with the topic.

The friendlyName attribute identifies the LLM provider to the report engine. Each provider has specific configuration requirements:

  • Azure OpenAI: Use MicrosoftExtensionsAzureOpenAI. Requires model, endpoint, and credential.
  • Azure AI Foundry: Use MicrosoftExtensionsAzureAIInference. Requires model, endpoint, and credential.
  • OpenAI: Use MicrosoftExtensionsOpenAI. Requires only model and credential (uses the default OpenAI API endpoint).
  • Ollama: Use MicrosoftExtensionsOllama. Requires only model and endpoint (no authentication needed for local deployments).
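For instance, a minimal OpenAI configuration omits the endpoint attribute and relies on the default OpenAI API endpoint. This is a sketch; the model name and API key below are placeholders:

```json
{
    "telerikReporting": {
        "AIClient": {
            "friendlyName": "MicrosoftExtensionsOpenAI",
            "model": "gpt-4o-mini",
            "credential": "YOUR_OPENAI_API_KEY"
        }
    }
}
```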
