The overall flow of automated testing is shown in the figure below. This module provides automated task creation, task execution, and report generation.
After you finish scripting, package your local scripts into a zip archive and upload the file to the UDT project.
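The packaging step can be scripted. Below is a minimal sketch in Python that zips a local script folder, preserving relative paths; the folder name `my_scripts` is an example, not a name required by UDT.

```python
# Package local automation scripts into a zip archive for upload.
# "my_scripts" is an example folder name; substitute your own script directory.
import os
import zipfile

def package_scripts(script_dir: str, zip_path: str) -> None:
    """Zip every file under script_dir, preserving relative paths."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(script_dir):
            for name in files:
                full = os.path.join(root, name)
                # Store each entry relative to the script directory,
                # so the archive root matches the folder contents.
                zf.write(full, os.path.relpath(full, script_dir))

package_scripts("my_scripts", "my_scripts.zip")
```

Storing entries relative to the script directory keeps the archive layout identical to your local folder, which avoids surprises when the platform unpacks it.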
First, choose the type of test you need. You can test your application on Android or iOS devices. Then click the button to select an application or upload a new one from your local machine.
After clicking the button, the app management pop-up appears. Select an app or upload a new one.
Select the test cases, test devices, and test account, and configure custom parameters.
Script dependency environment: Choose the environment your scripts depend on. You can choose Python, WeAumatour, Appium, or Airtest.
Test Case Library: Select the test case library for this test. You can go to the test cases page and add a new case library via Git hosting or script upload.
Select Test Case: Select the test cases for this test. You can select all cases or specify particular test cases to run.
Choose Devices: Select the devices to be tested this time. Choose the device pool and the specific devices you want to test on, then click the Confirm button.
Test Account: If you have a test account file, click here to upload it.
Custom parameters: This is an optional field and is empty by default. Custom parameters are stored as environment variables during the test run and can be retrieved in the test script via the EXTRA_INFO variable.
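Inside a test script, reading the custom parameters is a plain environment-variable lookup. The sketch below assumes only what the text above states (the value is exposed as the EXTRA_INFO environment variable); the `env=staging,retries=2` value is a simulated example, and the format of the value is whatever you entered at task creation.

```python
# Read custom parameters inside a test script.
# Per the docs, custom parameters are exposed to the script as
# environment variables; EXTRA_INFO is the documented variable name.
import os

def get_extra_info(default: str = "") -> str:
    """Return the raw EXTRA_INFO value, or default if it is not set."""
    return os.environ.get("EXTRA_INFO", default)

# During a real run the platform sets EXTRA_INFO before the script starts.
# Here we simulate it with an example value.
os.environ["EXTRA_INFO"] = "env=staging,retries=2"
print(get_extra_info())
```

Because the field is empty by default, a script should handle the variable being absent rather than assume it exists.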
Test Execution Configuration
You can configure how tasks are executed, such as whether to automatically launch the application, the execution timeout, and so on.
You can choose between two Execution Modes:
Sequential Execution: All test cases are executed sequentially on each device.
Distributed Execution: Test cases are evenly distributed across devices, and each test case is executed once.
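The difference between the two modes can be sketched as a scheduling plan. This is an illustrative model, not platform code; the round-robin split for distributed execution is an assumption about how "evenly distributed" might work.

```python
# Illustrative sketch of the two execution modes (not UDT internals).

def sequential(devices, cases):
    """Sequential Execution: every device runs the full case list."""
    return {d: list(cases) for d in devices}

def distributed(devices, cases):
    """Distributed Execution: cases are split across devices so each
    case runs exactly once (round-robin split assumed here)."""
    plan = {d: [] for d in devices}
    for i, case in enumerate(cases):
        plan[devices[i % len(devices)]].append(case)
    return plan

devices = ["phone-a", "phone-b"]
cases = ["login", "search", "checkout"]
print(sequential(devices, cases))   # each device runs all 3 cases
print(distributed(devices, cases))  # 3 cases split across 2 devices
```

Sequential mode gives per-device coverage of every case; distributed mode trades that coverage for total runtime, since each case runs only once.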
Depending on your test scenario, choose whether to enable the following features.
After configuring all the parameters, click the Create New Task button.
Click Automation>Tasks to enter the page.
The current page displays a list of the tasks that have been created, including the task name, creator, test framework information, last execution time, and test ID.
Click on the Start up button to the right of the task to execute it.
Click Automation>Test Reports to view recent reports.
Click a record to view its detailed test report.
You can also check reports from tasks.
Click the task name and choose a record to open the detailed test report.
A detailed test report is generated at the end of each automated test. The report provides test details such as screenshots, logs, performance data, video, and test files. There are two ways to view a report: by device or by test case.
On this page, you can see the test results for each device, including device activity, screenshots, video, performance data, logs, and files.
This view displays test information by test case, including test case information, device information, test case results (JSON), screenshots, video, performance data, and logs.
On the Automation>Tasks page, click the Edit button to go to the task edit page, where you can update the task information.
You can edit the task name, select a new test application, change test cases, and more.
Click on the Delete button to delete this task.
Click the Clone button to create a new task with the same configuration.
Enter a task name to complete the task creation.