Execution Extension

Test Studio provides an extensibility model for test execution. This model makes it easier to integrate Test Studio into an environment that already has its own results reporting and defect tracking.

 

To demonstrate the execution extensibility model, let's build a simple Execution Extension for Test Studio that writes the results of a test list to a text file.

  1. Create a Class Library project in Visual Studio. This example uses C#.
  2. Add references to three DLLs from %ProgramFiles%\Telerik\Test Studio\Bin\
    • ArtOfTest.WebAii.dll
    • ArtOfTest.WebAii.Design.dll
    • Telerik.TestStudio.Interfaces.dll

     

    Also add the following .NET references:

    • System.Runtime.Serialization
    • System.Windows.Forms

     

  3. Add the following using statements to the class file:

    using System;
    using System.IO;
    using System.Data;
    using System.Data.OleDb;
    using System.Windows.Forms;
    using ArtOfTest.WebAii.Design.Execution;

     

  4. The ArtOfTest.WebAii.Design.Execution namespace contains an IExecutionExtension interface that our class needs to implement:

    namespace ClassLibrary1
    {
        public class Class1 : IExecutionExtension
        {
        }
    }

     

  5. Right-click IExecutionExtension and select Implement Interface > Implement Interface. This stubs out all of the methods and notifications that Test Studio exposes. Here are definitions for each IExecutionExtension member:
    • Leave the methods you are not using empty (remove the throw new NotImplementedException(); line from each stub).
    • OnInitializeDataSource cannot be left empty because it returns a value; have it return null if you are not using it.

     

    namespace ClassLibrary1
    {
        public class Class1 : IExecutionExtension
        {
            #region IExecutionExtension Members

            /// <summary>
            /// Called by ArtOfTest.Runner after each test completes.
            /// </summary>
            /// <param name="executionContext">The execution context the test is running under.</param>
            /// <param name="result">The actual result of the test.</param>
            public void OnAfterTestCompleted(ExecutionContext executionContext, TestResult result)
            {
                // Your custom implementation here.
            }

            /// <summary>
            /// Called by the Scheduling Server after the test list completes.
            /// </summary>
            /// <param name="result">The entire RunResult object.</param>
            public void OnAfterTestListCompleted(RunResult result)
            {
                // Your custom implementation here.
            }

            /// <summary>
            /// Called by the Scheduling Server before the test list begins execution.
            /// </summary>
            /// <param name="list">The test list that is about to start.</param>
            public void OnBeforeTestListStarted(TestList list)
            {
                // Your custom implementation here.
            }

            /// <summary>
            /// Called by ArtOfTest.Runner before a test is about to start.
            /// </summary>
            /// <param name="executionContext">The execution context the test is running under.</param>
            /// <param name="test">The test we are about to start running.</param>
            public void OnBeforeTestStarted(ExecutionContext executionContext, ArtOfTest.WebAii.Design.ProjectModel.Test test)
            {
                // Your custom implementation here.
            }

            /// <summary>
            /// Use this to return your own data source.
            /// </summary>
            /// <param name="executionContext">The execution context.</param>
            public System.Data.DataTable OnInitializeDataSource(ExecutionContext executionContext)
            {
                // Your custom implementation here. Return null to leave the
                // test's original data binding in effect.
                return null;
            }

            /// <summary>
            /// Called only on a step failure.
            /// </summary>
            /// <param name="executionContext">The execution context the test is running under.</param>
            /// <param name="stepResult">The step result that just failed.</param>
            public void OnStepFailure(ExecutionContext executionContext, ArtOfTest.WebAii.Design.AutomationStepResult stepResult)
            {
                // Your custom implementation here.
            }

            #endregion
        }
    }

     

    A few notes about the code above:

    • ExecutionContext - The context object gives you access to most of the other objects relevant to the current execution. For example, from it you can reach the run-time Manager object, the ActiveBrowser, the Log object, the Find object, and so on: the run-time objects used to execute your test. If you use the Telerik Testing Framework or coded steps, you will be familiar with these objects. It also exposes the ExecutionContext.ExecutingTestAsStep property, which distinguishes between a test being run normally and as a subtest.
    • Result objects - RunResult contains a list of TestResult objects, one for each test that is executed. Each TestResult in turn holds a list of AutomationStepResult objects, one for each step that executed. You have access to all the metadata of the execution, so you can generate your own reports.
    • OnInitializeDataSource - Bind a test to a custom data source by returning a DataTable from this method; Test Studio then uses that DataTable to data drive the test. As of internal build 2012.1.816, if this method returns a DataTable, it is used regardless of whether the test is data bound. In older versions, the method is only called if the test is bound to a data source or if the InheritParentDataSource Test Property is checked.
    • Scope of Variables - Notice that OnBeforeTestListStarted() and OnAfterTestListCompleted() are called by the Scheduling Server, while OnBeforeTestStarted() and OnAfterTestCompleted() are called by ArtOfTest.Runner on the Execution Server. Because these run in separate processes, variables initialized inside OnBeforeTestListStarted or OnAfterTestListCompleted are not reliably available inside OnBeforeTestStarted or OnAfterTestCompleted, and vice versa. To use the same variables in both sets of methods, lazily initialize them so they are never null when accessed, as sketched below.
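
    Since each process creates its own instance of the extension class, a lazily initialized member guarantees the variable exists in whichever process first touches it. Here is a minimal sketch of the pattern; the field name and log path are illustrative, not part of the Test Studio API:

    private StreamWriter logWriter;

    private StreamWriter LogWriter
    {
        get
        {
            // Created on first access, so the Scheduling Server and
            // ArtOfTest.Runner each build their own instance, in their
            // own process, the first time they use it.
            if (logWriter == null)
            {
                logWriter = new StreamWriter("c:\\extension-log.txt", true);
            }
            return logWriter;
        }
    }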

     

  6. For the first example, we'll add code to the OnAfterTestListCompleted method that writes the result of a test list to a text file as a simple string:

    public void OnAfterTestListCompleted(RunResult result)
    {
        string msg = string.Format("TestList '{0}' completed on '{1}'. ({2}/{3}) Passed",
            result.Name, result.EndTime, result.PassedCount, result.TestResults.Count);

        // A using block guarantees the file is flushed and closed even if WriteLine throws.
        using (StreamWriter file = new StreamWriter("c:\\test-list-results.txt"))
        {
            file.WriteLine(msg);
        }
    }
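
    The StreamWriter above recreates the file on every run, so it only ever holds the latest result. If you would rather keep a history across runs, a small variation (using only the RunResult members already shown; the path is just an example) appends instead:

    public void OnAfterTestListCompleted(RunResult result)
    {
        string msg = string.Format("TestList '{0}' completed on '{1}'. ({2}/{3}) Passed",
            result.Name, result.EndTime, result.PassedCount, result.TestResults.Count);

        // AppendAllText creates the file on first use and appends afterwards,
        // so the file grows into a log of every test list execution.
        File.AppendAllText("c:\\test-list-results.txt", msg + Environment.NewLine);
    }

    You could extend either version to write one line per test by iterating result.TestResults; use the Visual Studio Object Browser on the referenced Telerik assemblies to see the exact members TestResult exposes in your version.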

     

  7. Compile the class library.
  8. Deploy the extension by copying the DLL from the %Project Folder%\ClassLibrary1\ClassLibrary1\bin\Debug to the following directory:
    • %ProgramFiles%\Telerik\Test Studio\Bin\Plugins\
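
    Test Studio loads execution extensions from this folder when it starts, so if Test Studio is already running, restart it before executing the test list so the new DLL is picked up.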

     

  9. Now execute a test list. The result string is written to the text file defined above.
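
    To verify, open c:\test-list-results.txt after the run completes. It should contain a single line similar to the following (the list name, date, and counts here are examples):

    TestList 'MyList' completed on '11/30/2012 4:15:02 PM'. (3/3) Passed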

 


 

Let's look at another example, this time using the OnInitializeDataSource method. It assumes your test is already bound to an Excel file, and that any Excel file you select at run time has the same column names as the original one.

  1. Add the following code to that method:

    public System.Data.DataTable OnInitializeDataSource(ExecutionContext executionContext)
    {
        System.Data.DataTable table = null;

        // OpenFileDialog requires an STA thread and this method may be called
        // from an MTA thread, so show the dialog on a dedicated STA thread and
        // block until it finishes.
        var thread = new System.Threading.Thread(obj =>
        {
            try
            {
                System.Windows.Forms.OpenFileDialog ofd = new System.Windows.Forms.OpenFileDialog();
                ofd.Title = "Open Excel File";
                if (ofd.ShowDialog() == DialogResult.OK)
                {
                    string excel = ofd.FileName;
                    DataSet foo = ImportExcelXLS(excel, true);
                    table = foo.Tables[0];
                }
            }
            catch (Exception ex)
            {
                // Parent the error message to the browser window under test so
                // it appears in front of it.
                NativeWindow a = new NativeWindow();
                a.AssignHandle(ArtOfTest.WebAii.Core.Manager.Current.ActiveBrowser.Window.Handle);
                MessageBox.Show(a, ex.Message);
            }
        });
        thread.SetApartmentState(System.Threading.ApartmentState.STA);
        thread.Start();
        thread.Join();

        // Null (e.g. the dialog was canceled) keeps the test's original data source.
        return table;
    }
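
    If the user presses Cancel, table is still null when the thread finishes. As noted earlier, returning null from OnInitializeDataSource leaves the test's original data binding in effect, which is what makes the Cancel option in the last step below work.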

     

  2. Now add the following ImportExcelXLS helper method within the same class:

    private static DataSet ImportExcelXLS(string FileName, bool hasHeaders)
    {
        // HDR tells the OLE DB provider whether the first row holds column
        // names; IMEX=1 makes it read intermixed data types as text.
        string HDR = hasHeaders ? "Yes" : "No";
        string strConn = null;
        if (FileName.Substring(FileName.LastIndexOf('.')).ToLower() == ".xlsx")
        {
            // .xlsx workbooks require the newer ACE provider.
            strConn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + FileName + ";Extended Properties=\"Excel 12.0;HDR=" + HDR + ";IMEX=1\"";
        }
        else
        {
            // .xls workbooks use the older Jet provider.
            strConn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + FileName + ";Extended Properties=\"Excel 8.0;HDR=" + HDR + ";IMEX=1\"";
        }

        DataSet output = new DataSet();

        using (OleDbConnection conn = new OleDbConnection(strConn))
        {
            conn.Open();

            // Enumerate the worksheets in the workbook.
            DataTable schemaTable = conn.GetOleDbSchemaTable(
              OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });

            // Load each worksheet into a DataTable named after the sheet.
            foreach (DataRow schemaRow in schemaTable.Rows)
            {
                string sheet = schemaRow["TABLE_NAME"].ToString();

                OleDbCommand cmd = new OleDbCommand("SELECT * FROM [" + sheet + "]", conn);
                cmd.CommandType = CommandType.Text;

                DataTable outputTable = new DataTable(sheet);
                output.Tables.Add(outputTable);
                new OleDbDataAdapter(cmd).Fill(outputTable);
            }
        }
        return output;
    }
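
    One deployment note: the Microsoft.ACE.OLEDB.12.0 provider used for .xlsx files is not part of the .NET Framework. It ships with Microsoft Office or with the standalone Access Database Engine redistributable, so it must be installed on each execution machine. The older Jet 4.0 provider used for .xls files is available only to 32-bit processes.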

     

  3. Rebuild the class library, copy the resulting DLL, and paste it into the Plugins folder (overwriting the existing file).
  4. When you execute a data-driven test, you are prompted to select an Excel file (.xls or .xlsx). You have two choices:
    • Select a new Excel file and press OK.
    • Press Cancel to use the original data source.