Factors Affecting Report Performance
Reporting performance refers to the speed and efficiency with which the system generates reports when end users run a query. Performance depends on a variety of factors, including system bandwidth, the number of concurrent users, and the volume of data to be presented.
Naturally, some of these performance factors are “environmental,” that is, pretty much outside of the report developer’s control. Others, however, are not, and the smart report developer should be aware of these factors and use them to her or his advantage to create reports that perform efficiently and that don’t place undue burden on the system.
Before elaborating on the major performance factors, we recommend going through some key points about the inner workings and principles of the reporting engine.
Report Processing Stages
- Loading assemblies: When the first report rendering is started, the application loads the assemblies it depends on into memory. This may cause an additional delay; a delay of roughly two seconds is normal for this initial loading.
- Processing data: Once the raw data is fetched from the data source, the reporting engine uses it to create an OLAP cube. The cube's dimensions are built according to the group definitions of the report and its contained data items, and its measures are calculated from all expressions. This is repeated for every data source in the report definition.
- Processing the report items: Using the report definition and the already calculated OLAP cube hierarchies, the report processing object tree is built. The report processing tree contains all required layout information, regardless of the PageSettings information.
- Rendering: The report is rendered to the desired media, paging is performed, and page aggregates are calculated.
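All of these stages run whenever a report is rendered, whether through a viewer or from code. For reference, here is a minimal sketch that triggers the full pipeline programmatically; the report file and output path are hypothetical:

```csharp
using Telerik.Reporting;
using Telerik.Reporting.Processing;

// Rendering a report from code runs all the stages above: assembly loading
// (first call only), data processing, item processing, and rendering to the
// requested media. File names below are illustrative only.
var reportSource = new UriReportSource { Uri = "Invoices.trdp" };
var processor = new ReportProcessor();

var result = processor.RenderReport("PDF", reportSource, null);
System.IO.File.WriteAllBytes("Invoices.pdf", result.DocumentBytes);
```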
Environmental reporting performance factors and suggestions:
Report processing and rendering are memory-intensive operations. The OLAP cube, constructed from the raw data, is kept in memory for the duration of the report life cycle. This may cause out-of-memory exceptions, especially when the process that renders the report runs in 32-bit mode. Rendering performance is also affected by the CPU power and the available memory; obviously, the faster the hardware, the better. For data- and layout-intensive reports, we would suggest at least a dual-core processor with a minimum of 2 GB of RAM. Network traffic bandwidth also impacts performance.
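Because out-of-memory exceptions are far more likely in a 32-bit process, it is worth verifying how the process that renders the reports actually runs. A quick, Telerik-independent .NET check:

```csharp
using System;

// A 32-bit process can address only a few GB of memory, which a large
// in-memory OLAP cube may exhaust during report processing.
Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
Console.WriteLine($"64-bit OS:      {Environment.Is64BitOperatingSystem}");
```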
Handling a Large Workload
To get the highest performance when handling large workloads that include user requests for large reports, implement the following recommendations.
Control the Size of Your Reports
You will first want to determine the purpose of these reports and whether a large multipage report is even necessary. If a large report is necessary, how frequently will it be used? If you provide users with smaller summary reports, can you reduce the frequency with which users attempt to access this large multipage report? Large reports place a significant processing load on the reporting environment, so it is necessary to evaluate each report on a case-by-case basis.
Some common problems with these large reports are that they contain data fields that are not used in the report or that they contain duplicate datasets. Often users retrieve more data than they really need. To significantly reduce the load placed on your reporting environment, create summary reports that use aggregates and include only the necessary columns, as sketched below.
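As a sketch of such a summary report, the data source can aggregate at the database and return only the displayed columns. The connection string, table, and column names are hypothetical:

```csharp
using Telerik.Reporting;

// Retrieve pre-aggregated totals instead of raw detail rows; only the
// columns the report actually shows are selected. Connection string,
// table, and column names are illustrative only.
var summarySource = new SqlDataSource
{
    ConnectionString = "Data Source=.;Initial Catalog=Sales;Integrated Security=true",
    SelectCommand = @"SELECT Region, SUM(Amount) AS TotalAmount
                      FROM Orders
                      GROUP BY Region"
};

var summaryReport = new Report();
summaryReport.DataSource = summarySource;
```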
Use Telerik Reporting REST Service Cache
If you have reports that do not need live execution, set the service's ReportSharingTimeout to an appropriate value. For more information, see: HTML5 Report Viewer and Reporting REST services
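A minimal sketch of such a configuration in an ASP.NET Core host; the host application id, cache storage, resolver path, and timeout value are assumptions, so consult the article above for the exact setup:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Telerik.Reporting.Cache.File;
using Telerik.Reporting.Services;

public void ConfigureServices(IServiceCollection services)
{
    services.AddSingleton<IReportServiceConfiguration>(sp =>
        new ReportServiceConfiguration
        {
            HostAppId = "MyReportingApp",                // hypothetical application id
            Storage = new FileStorage(),                 // file-based cache for rendered documents
            ReportSourceResolver = new UriReportSourceResolver("Reports"), // hypothetical reports folder
            // Serve an already rendered document to identical subsequent
            // requests instead of re-rendering; the value is an example.
            ReportSharingTimeout = 60
        });
    services.AddControllers();
}
```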
Deliver Rendered Reports for Non-browser Formats
To reduce the load on your reporting environment, you can use our Report Server, which lets you schedule a report to run during off-peak hours and send it by mail when ready. This spares end users the wait, and it comes out of the box.
Performance factors in the developer’s control:
- PageCount The PageCount global object triggers an extra paging pass that, depending on the report's volume and complexity, may significantly slow down the rendering. If performance is important, consider avoiding the PageCount object; for example, a page footer expression such as =PageNumber avoids the extra pass required by =PageNumber + " of " + PageCount;
- Number of report items The number of report items affects loading time, just as a web page that requests more information takes longer to load. Consider restructuring a large and slow report layout into a report with Actions (drill-down, drill-through) and Report Parameters (on-demand data), as sketched after this list. Besides being beneficial to the end-user, these features make the report faster and more efficient, lightening the reporting load by delivering only the currently requested state of the data;
- Rendering media After the report is processed, the respective rendering extension is used to render the report in the desired media. Every rendering extension has its specifics, so if all reports process slowly only in a particular format, try adding more memory or choosing a different format;
- Excel Rendering The Excel rendering engine creates a complex matrix taking into account every item's coordinates and size, and then creates a table with many table cells and rows, some of which are used merely as "spacers". This way an item's exact location and size are preserved, but at the cost of many calculations. When a report has many items that are not vertically and horizontally aligned, the resulting table is gigantic. Align your items horizontally and vertically relative to each other and make sure that you have no warning signs at design time. While in other formats the rendering locations and sizes do not relate to performance, in Excel this is critical: the more items are aligned horizontally and vertically, the fewer dummy spacer rows and cells the Excel rendering engine will create;
- Volume of the data retrieved from the data source If your reports display large amounts of data, retrieving that data from the data source may take a long time. With large amounts of data you need to retrieve the smallest usable set of data in order to reduce network traffic, use resources effectively, and render reports quickly. This comes down to choosing between selections applied at the data source level and selections applied at the report level - review the Overview. Selections applied at the data source level are effective when the data does not need to be re-queried, for example when only a narrow range of data is required; additional trips to the data source for different views of the same data increase the network traffic cost and the time needed to render the report. Selections applied at the report level work against data cached in-memory, which is effective when different views of the same data are required and the data is not too large. A combination of these techniques may be the most appropriate solution. To filter data at the database level and fetch only a subset of the data, review Using Parameters with Data Source objects and the sketch after this list;
- Expressions Complicated expressions (especially in combination with many report items) take time to evaluate and can also lead to slowness. It is a good idea to replace these expressions with User Functions, because user functions are already compiled code (see the sketch after this list);
- Number of SubReport items The SubReport item references a report definition that has to be processed separately. This is an expensive process and can be avoided with a Table or List item that is bound to the same data source and has the same layout as the subreport's report definition. If you need some data from the master data item's (report's) data source for the child data item's data source parameter, you can take advantage of the Data Source Components relations capability, as elaborated in the How to use the ReportItem.DataObject property in expressions article;
- Number of HtmlTextBox items The HtmlTextBox content is parsed with an HTML parser, which is a resource-consuming process. When performance is important, our recommendation is to avoid extensive use of this item. Instead of complex text styling in a single HtmlTextBox, you can try to achieve the desired layout with multiple TextBox items;
- Events Interrupting the report processing with events comes at a price. Consider Using Expressions and User Functions instead. If you have to keep the events, make sure that no time- and resource-consuming actions in the event handlers are slowing down the report processing;
- Number of Chart items The old Chart item does not use the reporting data engine, but its own; as a result, showing many charts involves extra data processing. In Q1 2013 we introduced the Graph item, which utilizes the optimized reporting data engine, so we highly recommend it over the obsolete Chart item. If you still have to use the obsolete Chart item, avoid using the IntelligentLabels functionality;
- Hidden Report Items Hiding report items (setting their visibility to false) will not prevent the reporting engine from processing those items. Hidden items have to be processed in order for other features of the report, such as Actions, to work correctly. If such a report item is a data item, it will still retrieve data from its data source and reduce the overall performance. Therefore, it is recommended to add server-side filtering based on the same condition that controls the visibility of the data item (see the last sketch after this list).
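The sketches below illustrate some of the suggestions above. First, on-demand data via Actions: a drill-through action defers the processing of the detail report until the user actually asks for it. The report names, parameter, and field are hypothetical:

```csharp
using Telerik.Reporting;

// A drill-through action on a summary cell: the detail report is processed
// only when the user clicks through, not on every rendering of the summary.
// Report names, parameter, and field below are illustrative only.
var detailsAction = new NavigateToReportAction
{
    ReportSource = new UriReportSource { Uri = "OrderDetails.trdp" }
};
// Pass the clicked row's key to the detail report's parameter.
detailsAction.ReportSource.Parameters.Add(new Parameter("OrderId", "=Fields.OrderId"));

var summaryCell = new TextBox { Value = "=Fields.OrderId" };
summaryCell.Action = detailsAction;
```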
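Next, filtering at the database level with data source parameters, so only the needed subset of rows ever leaves the database. The connection string, query, and names are hypothetical:

```csharp
using System.Data;
using Telerik.Reporting;

// The WHERE clause runs on the database server, so only matching rows
// travel over the network. Connection string, query, and names are
// illustrative only.
var dataSource = new SqlDataSource
{
    ConnectionString = "Data Source=.;Initial Catalog=Sales;Integrated Security=true",
    SelectCommand = "SELECT OrderId, Region, Amount FROM Orders WHERE Region = @Region"
};

// Bind the SQL parameter to a report parameter supplied by the user.
dataSource.Parameters.Add(new SqlDataSourceParameter(
    "@Region", DbType.String, "=Parameters.Region.Value"));

var report = new Report();
report.ReportParameters.Add(new ReportParameter("Region", ReportParameterType.String, "North"));
report.DataSource = dataSource;
```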
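Replacing a complicated expression with a user function comes down to defining a public static method and calling it from the expression once its assembly is known to the reporting engine. The class and method names here are hypothetical:

```csharp
// A user function is plain compiled code, so it evaluates faster than a
// long inline expression. Class and method names are hypothetical; the
// containing assembly must be registered in the Telerik Reporting
// configuration so the expression engine can resolve it.
public static class MyReportFunctions
{
    public static string FormatAmount(object value)
    {
        // Whatever logic the long inline expression used to perform.
        return value == null ? "N/A" : string.Format("{0:C2}", value);
    }
}

// A TextBox expression can then call the function directly:
//   = MyReportFunctions.FormatAmount(Fields.Amount)
```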
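Finally, the server-side filtering suggestion for hidden data items: push the same condition that controls the visibility into the query, so a hidden item fetches no rows. The names and query are hypothetical:

```csharp
using System.Data;
using Telerik.Reporting;

// If the detail table is hidden when the ShowDetails report parameter is
// false, apply the same condition in the query so the hidden table
// retrieves nothing. Connection string, query, and names are illustrative.
var detailSource = new SqlDataSource
{
    ConnectionString = "Data Source=.;Initial Catalog=Sales;Integrated Security=true",
    SelectCommand = @"SELECT OrderId, Amount FROM OrderDetails
                      WHERE @ShowDetails = 1" // empty result when details are hidden
};
detailSource.Parameters.Add(new SqlDataSourceParameter(
    "@ShowDetails", DbType.Boolean, "=Parameters.ShowDetails.Value"));
```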
For additional assistance
If your case is not listed above, or you have tried all suggestions but still experience slow rendering, please open a new support ticket and send us the following so that we can investigate your case and determine what is causing the problem:
- Why you think the report is slow (benchmarks, comparisons, etc.);
- Amount of data/pages you are trying to show;
- Report layout complexity (type and number of items used);
- Machine’s hardware and software configuration where the reports are rendered;
- Your archived report files;
- The data source you're binding to.