Factors Affecting Report Performance
Reporting performance is generally understood to be the speed and efficiency with which reports are generated by the system when end-users perform a query. Performance depends on a variety of factors, including system bandwidth, number of concurrent users, volume of data to be presented, etc.
Naturally, some of these performance factors are “environmental,” that is, largely outside of the report developer’s control. Others, however, are not, and the smart report developer should be aware of these factors and use them to their advantage to create reports that perform efficiently and don’t place undue burden on the system.
Before elaborating on the major performance factors, let's go through some main points about the reporting engine's inner workings and principles.
Report Processing Stages
- Loading assemblies: When the first report rendering starts, the application loads the assemblies it depends on into memory. This may cause an additional delay; consider a delay of ~2s as normal for this initial loading.
- Processing data: Once the raw data is fetched from the data source, the reporting engine uses it to create an OLAP cube. Its dimensions are built according to the Group Definitions of the report and contain the Data Items. Measures are calculated from all expressions. This is repeated for every DataSource in the report definition.
- Processing the report items: Using the report definition and the already calculated OLAP cube hierarchies, the Report Processing Object Tree is built. The Report Processing Tree contains all required layout information, regardless of the PageSettings information.
- Rendering the report: The report is rendered to the desired media, paging is performed, and page aggregates are calculated. The sketch after this list shows where these stages surface when rendering programmatically.
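To see where these stages surface, here is a minimal sketch that times two consecutive renders, assuming a hypothetical SampleReport.trdp file and the .NET Framework flavor of the engine (on .NET Core the ReportProcessor constructor takes a configuration argument). The first render pays the one-time assembly-loading cost; the second reflects only data processing, item processing, and rendering:

```csharp
using System;
using System.Diagnostics;
using Telerik.Reporting;
using Telerik.Reporting.Processing;

class RenderTiming
{
    static void Main()
    {
        // "SampleReport.trdp" is a hypothetical report file - use your own.
        var reportSource = new UriReportSource { Uri = "SampleReport.trdp" };
        var processor = new ReportProcessor();

        // First render: includes the one-time assembly-loading delay (~2s).
        var watch = Stopwatch.StartNew();
        RenderingResult first = processor.RenderReport("PDF", reportSource, null);
        Console.WriteLine($"First render:  {watch.Elapsed} ({first.DocumentBytes.Length} bytes)");

        // Second render: data processing, item processing, and rendering only.
        watch.Restart();
        RenderingResult second = processor.RenderReport("PDF", reportSource, null);
        Console.WriteLine($"Second render: {watch.Elapsed} ({second.DocumentBytes.Length} bytes)");
    }
}
```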
Environmental Reporting Performance Factors and Suggestions
Report processing and rendering are memory-intensive operations. The OLAP cube, constructed from the raw data, is kept in memory during the report life cycle. This may be a reason for out-of-memory exceptions, especially when the process that renders the report is run in 32-bit mode. The rendering performance is also affected by the CPU power and the available memory. The faster the hardware, the better. For data- and layout-intensive reports, we would suggest a minimum of a dual-core processor with at least 2 GB of RAM. Network traffic bandwidth also impacts performance.
Handling a Large Workload
To get the highest performance when handling large workloads that include user requests for large reports, implement the following recommendations.
Control the Size of Your Reports
You will first want to determine the purpose of these reports and whether a large multipage report is even necessary. If a large report is necessary, how frequently will it be used? If you provide users with smaller summary reports, can you reduce the frequency with which users attempt to access this large multipage report? Large reports place a significant processing load on the reporting environment, so it is necessary to evaluate each report on a case-by-case basis.
Some common problems with these large reports are that they contain data fields that are not used in the report or they contain duplicate datasets. Often users retrieve more data than they need. To significantly reduce the load placed on your reporting environment, create summary reports that use aggregates and include only the necessary columns, as in the sketch below.
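For example, a minimal sketch of such a summary data source, aggregating on the database side and selecting only the columns the report uses (the connection name and the Sales.Orders schema are hypothetical):

```csharp
using Telerik.Reporting;

static Report CreateSalesSummaryReport()
{
    var report = new Report();

    // Aggregate on the database side and fetch only the needed columns,
    // instead of pulling every detail row into the report.
    report.DataSource = new SqlDataSource
    {
        ConnectionString = "SalesDb", // hypothetical connection string name
        SelectCommand = "SELECT Region, SUM(Amount) AS TotalAmount, COUNT(*) AS OrderCount " +
                        "FROM Sales.Orders GROUP BY Region"
    };
    return report;
}
```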
Use Telerik Reporting REST Service Cache
If you have reports that do not need live execution, set the service's ReportSharingTimeout to an appropriate value, as in the sketch below. For more information, see the HTML5 Report Viewer and Reporting REST services article.
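A minimal sketch of such a configuration, assuming the ASP.NET Web API flavor of the REST service; the storage type, resolver path, host application id, and timeout value are illustrative, property names may vary slightly across versions, and the timeout's unit should be verified against your version's documentation:

```csharp
using Telerik.Reporting.Services;
using Telerik.Reporting.Services.WebApi;

public class ReportsController : ReportsControllerBase
{
    static readonly ReportServiceConfiguration configurationInstance =
        new ReportServiceConfiguration
        {
            HostAppId = "MyReportingApp", // hypothetical application id
            Storage = new Telerik.Reporting.Cache.File.FileStorage(),
            ReportSourceResolver = new UriReportSourceResolver(@"C:\Reports"), // hypothetical path
            // Let identical requests share the already processed report
            // instead of re-processing it on every request.
            ReportSharingTimeout = 60
        };

    public ReportsController()
    {
        this.ReportServiceConfiguration = configurationInstance;
    }
}
```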
Use Telerik ReportViewer's PrintPreview Mode
When you have a huge Table/List/Crosstab in your report's detail section, it will be rendered fully before being displayed in the Viewer when the latter is in its default Interactive View mode. The reason is that this mode supports only soft pagination, which occurs at the end of a section when it occupies more space than the specified page size. With tables of thousands of rows, the soft page size may be hundreds of times larger than the specified hard page size.
To avoid this problem, you may switch the viewer's View Mode to Print Preview, which respects the Report PageSettings and the corresponding Page Size, as in the sketch below. In Print Preview, the rendered pages have the specified size, which in this case results in more pages of smaller size. The improved performance comes from the smaller content of a single page, which may be a key performance factor in the HTML and XAML renderings.
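A minimal sketch of switching the mode programmatically. The property and enum names below are assumptions based on the desktop (WinForms) viewer's API and may differ per viewer flavor and version; the HTML5 viewer exposes an equivalent viewMode option (telerikReportViewer.ViewModes.PRINT_PREVIEW). Verify against your viewer's documentation:

```csharp
// Assumption: WinForms viewer API; check the exact property/enum for your
// viewer flavor (WinForms, WPF, HTML5) and version in its documentation.
reportViewer1.ViewMode = Telerik.ReportViewer.WinForms.ViewMode.PrintPreview;
```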
The Interactive and Print Layouts article elaborates further on the view modes.
Pagination is explained in more detail in the Understanding Pagination article.
Deliver Rendered Reports for Non-browser Formats
To reduce the load on your reporting environment, you can use our Report Server, which can schedule a report for off-peak hours and send it by email when ready. This avoids the waiting entirely and comes out of the box.
Performance Factors in the Developer's Control
- PageCount: The PageCount object triggers an extra paging pass that, depending on the report volume and complexity, may significantly slow down the rendering. If performance is important, consider avoiding the PageCount object; e.g. a page footer expression of = PageNumber renders without the extra pass, while = PageNumber + " of " + PageCount forces it;
- Number of report items: The number of report items affects the loading time, just as a web page that requests more information takes longer to load. Consider changing your large and slow report layout and creating a report with Actions (drill down, drill through) and Report Parameters (on-demand data), as in the drill-through sketch after this list. Besides being beneficial to the end user, these features make the report faster and more efficient, lightening the reporting load by serving only the currently requested state of the data;
- Rendering media: After the report is processed, a respective rendering extension is used to render the report in the desired media. Every rendering extension has its specifics; if all reports process slowly only in a particular format, try adding more memory or choosing a different format;
- Excel Rendering: The Excel rendering engines create a complex matrix, taking into account every item's coordinates and size, and then create a table with many cells and rows, some of which are used merely as "spacers". This way an item's exact location and size are preserved, but at the cost of many calculations. When a report has many items that are not vertically and horizontally aligned, the resulting table is gigantic. Align your items horizontally and vertically relative to each other and make sure that you have NO warning signs at design time. While in other formats the rendering locations and sizes do not affect performance, in Excel this is critical: the more items are aligned horizontally and vertically, the fewer dummy spacer rows and cells the Excel rendering engine will create;
- Volume of the data retrieved from the data source: If your reports display large amounts of data, it may take a long time for the data to be retrieved from the data source. In that case you need to retrieve the smallest usable set of data to reduce network traffic, use resources effectively, and render reports quickly. This can be achieved with selections applied to the data at the report level vs. user selections applied at the data source - review the Overview. Selections applied at the data source level are effective when the data does not need to be re-queried, i.e. when only a narrow range of data is required; additional trips to the data source for different views of the same data increase the network traffic cost and the time needed to render the report. When user selections are applied at the report level, the data is cached in memory; this is effective when different views of the same data are required and the data is not too large. A combination of these techniques may be the most appropriate solution. To filter data on the database level and fetch only a subset of it, review the Using Parameters with Data Source objects article and the parameter sketch after this list;
- Expressions: Complicated expressions (especially over many report items) require time to be evaluated and can also lead to slowness. It is a good idea to replace these expressions with User Functions, because user functions are already compiled code (see the user-function sketch after this list);
- Number of SubReport items: The SubReport item references a report definition that must be processed separately. This is an expensive process and can be avoided with a Table or List item that is bound to the same data source and has the same layout as the subreport's report definition. If you need data from the master data item's (report's) data source for the child data item's data source parameter, you can take advantage of the Data Source Components' relations capability, as elaborated in the How to use the ReportItem.DataObject property in expressions article;
- Number of HtmlTextBox items: The HtmlTextBox content is parsed with an HTML parser, which is a resource-consuming process. When performance is important, our recommendation is to avoid extensive use of this item. Instead of complex text styling in a single HtmlTextBox, try to achieve the desired layout with multiple TextBox items;
- Events: Interrupting the report processing with events comes at a price. Instead, consider using Expressions and User Functions. If you have to keep the events, make sure that no time- and resource-consuming actions in the event handlers are slowing down the report processing;
- Number of Chart items: The old Chart item does not use the reporting data engine but its own, so showing many charts involves extra data processing. In Q1 2013 we introduced the Graph item, which utilizes the optimized reporting data engine, so we highly recommend it over the obsolete Chart item. If you still have to use the obsolete Chart item, avoid its IntelligentLabels feature;
- Hidden Report Items: Hiding report items (setting their visibility to false) will not prevent the reporting engine from processing them, because hidden items must be processed for other report features, such as Actions, to work correctly. If such a report item is a Data Item, it will still retrieve data from its data source and reduce the overall performance. Therefore, it is recommended to add server-side filtering based on the same condition that controls the visibility of the data item (see the filtering sketch after this list).
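A minimal drill-through sketch for the Number of report items factor above: a summary textbox navigates to a separate, smaller detail report and passes only the key it needs. The report file name, fields, and the CustomerID parameter are hypothetical:

```csharp
using Telerik.Reporting;
using Telerik.Reporting.Drawing;

static TextBox CreateDrillThroughCell()
{
    // A summary cell that navigates to a detail report on demand,
    // instead of rendering all detail rows up front.
    var textBox = new TextBox
    {
        Value = "=Fields.CustomerName",
        Size = new SizeU(Unit.Inch(2), Unit.Inch(0.3))
    };

    var action = new NavigateToReportAction
    {
        ReportSource = new UriReportSource { Uri = "CustomerDetails.trdp" } // hypothetical detail report
    };
    // Pass only the key the detail report needs to fetch its own data.
    action.Parameters.Add(new Parameter("CustomerID", "=Fields.CustomerID"));
    textBox.Action = action;
    return textBox;
}
```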
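A minimal parameter sketch for the Volume of the data factor: the report parameter value is passed to the SQL query, so only the needed subset is fetched. The connection name, query, and CustomerID parameter are hypothetical:

```csharp
using System.Data;
using Telerik.Reporting;

static SqlDataSource CreateFilteredDataSource()
{
    var dataSource = new SqlDataSource
    {
        ConnectionString = "SalesDb", // hypothetical connection string name
        SelectCommand = "SELECT OrderID, OrderDate, Amount " +
                        "FROM Sales.Orders WHERE CustomerID = @CustomerID"
    };
    // Bind the query parameter to the report parameter so the filtering
    // happens on the database server, not in the reporting engine.
    dataSource.Parameters.Add("@CustomerID", DbType.Int32, "=Parameters.CustomerID.Value");
    return dataSource;
}
```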
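A minimal user-function sketch for the Expressions factor: a public static method compiled with the application and called from an expression. The function name and logic are illustrative:

```csharp
namespace MyReports
{
    public static class ReportFunctions
    {
        // Already compiled code - cheaper than re-evaluating a long
        // nested IIf expression for every row.
        public static string Classify(double amount)
        {
            if (amount >= 10000) return "Large";
            if (amount >= 1000) return "Medium";
            return "Small";
        }
    }
}
```

A textbox value then becomes = MyReports.ReportFunctions.Classify(Fields.Amount) instead of a long nested IIf expression. Note that the function must be discoverable by the report engine, e.g. defined in the same assembly as the report or registered as described in the User Functions documentation.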
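A minimal filtering sketch for the Hidden Report Items factor: the same condition that hides the table also empties its query, so the hidden table no longer drags the full result set through the engine. The connection name, schema, and the ShowDetails parameter are hypothetical:

```csharp
using System.Data;
using Telerik.Reporting;

static void ConfigureOptionalDetailTable(Table detailTable)
{
    var detailData = new SqlDataSource
    {
        ConnectionString = "SalesDb", // hypothetical connection string name
        SelectCommand = "SELECT OrderID, Amount FROM Sales.Orders " +
                        "WHERE @ShowDetails = 1" // returns no rows when the table is hidden
    };
    detailData.Parameters.Add("@ShowDetails", DbType.Boolean, "=Parameters.ShowDetails.Value");

    detailTable.DataSource = detailData;
    // Hide the table with the same condition that empties its query.
    detailTable.Bindings.Add(new Binding("Visible", "=Parameters.ShowDetails.Value"));
}
```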
For Additional Assistance
If your case is not listed above, or you have tried all suggestions but still experience slow rendering, please open a new support ticket and send us the following, so that we can investigate your case and find what is causing the slowdown:
- Why you think the report is slow (benchmarks, comparisons, etc.);
- Amount of data/pages you are trying to show;
- Report layout complexity (type and number of items used);
- Machine’s hardware and software configuration where the reports are rendered;
- Your archived report files;
- The data source you're binding to.