Altova MobileTogether Designer

Automated Testing


The Automated Testing feature enables you to compare two test runs (which are essentially simulations) to detect differences in the design, page source data, component styles or layout, or solution environment.

 

The process works as follows: First, a base test run (or test case) of a design is recorded. This test case progresses through certain user and design actions. The test case is subsequently played back with different parameters (for example, with different page source data or on a different device OS version). If the playback in MobileTogether Designer returns differences from the test case, the playback is recorded. A recorded playback is called a test run, as opposed to the original test case. The test run can then be compared with its originating test case, and any issues that are identified can be addressed. Furthermore, a test case for a design can be deployed, together with the design, to MobileTogether Server. This enables the test case to be downloaded to multiple client devices for playback. Playbacks on client devices are saved to the server and can be retrieved in MobileTogether Designer for comparison.

 

The typical Automated Testing scenario would progress as follows:

 

1. Record a test case. The recorded test case can be played back in a different environment.

2. Play back a test case. Playbacks are saved as test runs of the test case. If the playback is carried out in MobileTogether Designer, only those playbacks that return a difference are stored as test runs. If a test case is deployed to MobileTogether Server and played back on a client device, all such client playbacks are stored on the server.

3. Compare the test run with its originating test case. Comparisons are carried out in MobileTogether Designer. The differencing level can be configured, and the differences can be examined in detail (illustrated in the sketch after this list). Test runs that are returned from client playbacks (and stored on the server) must first be retrieved to MobileTogether Designer for comparison.
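To make the idea of comparing a test run with its originating test case more concrete, the following Python sketch compares two page source snapshots and reports the nodes whose values differ. It is a minimal, hypothetical illustration of the kind of comparison involved, not MobileTogether's actual implementation; the function names and the XML structure are invented for the example.

# Hypothetical sketch only: not the MobileTogether API. It models the idea of
# diffing a test run's page source snapshot against the snapshot recorded in
# the original test case.
import xml.etree.ElementTree as ET

def element_paths(elem, prefix=""):
    """Flatten an XML tree into {path: text} pairs for a simple comparison."""
    path = prefix + "/" + elem.tag
    pairs = {path: (elem.text or "").strip()}
    for i, child in enumerate(elem):
        pairs.update(element_paths(child, path + "[" + str(i) + "]"))
    return pairs

def diff_snapshots(test_case_xml, test_run_xml):
    """Return the paths whose values differ between the two snapshots."""
    base = element_paths(ET.fromstring(test_case_xml))
    run = element_paths(ET.fromstring(test_run_xml))
    return {p: (base.get(p), run.get(p))
            for p in sorted(base.keys() | run.keys())
            if base.get(p) != run.get(p)}

# Example: the same page played back with different source data.
case = "<Page><User>admin</User><Status>OK</Status></Page>"
run = "<Page><User>admin</User><Status>Error</Status></Page>"
print(diff_snapshots(case, run))   # {'/Page[1]/Status': ('OK', 'Error')}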

 

Using Automated Testing to quickly carry out routine steps

In cases where certain routine steps need to be carried out every time you run a simulation, these steps can be recorded as a test case and subsequently played back. For example, the design might prompt the user to enter log-in details or other items of data that do not change. If entering this data is time-consuming, the data-entry steps can be recorded in a test case. You can then play back the test case to complete these routine steps quickly and carry out any additional test steps manually. Used in this way, Automated Testing can help you save time during the design phase.

 

Automated Testing menu commands

The commands to run the Automated Testing feature are in the Run menu. They are also available via the Automated Testing toolbar (screenshot below).

(Screenshot: Automated Testing toolbar)

Record New Test Case: Starts a new test case in the Simulator and records user actions. When recording stops, you are prompted to give the recording a name and save it as a test case. See Recording a Test Case for details.


Playback Test Case: Plays back the test case that is selected in the Available Test Cases for Playback combo box. If the playback returns differences from the test case, then the playback is saved. See Playing Back a Test Case.


Trial Run Test Cases on Client: Plays back, on a connected client, the test case that is selected in the Available Test Cases for Playback combo box. If the playback returns differences, then the playback is saved. See Playing Back a Test Case.


Manage Test Cases and Runs: Displays the Manage Test Cases and Runs dialog.

 

The Available Test Cases for Playback combo box is displayed only after a test case has been recorded. It lists all the recorded test cases. Select the test case you want to play back; the test case that is selected here will be run when Playback Test Case or Trial Run Test Cases on Client is clicked.

 

In this section

This section is organized as follows:

 

Recording a Test Case

Playing Back a Test Case

Managing Test Cases and Runs

Deploying Test Cases to Server

Comparing Test Runs

 
