The tests tab allows workflow tests to be created and run against the TB Engine before a workflow is exported or deployed, reducing the risk of deploying a faulty workflow or script.
Mocks are created for each adapter and sub-workflow within a workflow, and these mocks are used during the test. For instance, a real LDAP server is not needed because the response from the server can be mocked.
Tests are created by the TrustBuilder administrator for each workflow. The adapters and sub-workflows are initially mocked automatically.
Tests are run within TBA itself, so no TrustBuilder instance needs to be created in order to run a test; just click the Run Test button within TBA.
Create a Test File for a Workflow
For each workflow there is a row in the tests tab.
To create a test file click the Create Test File button.
This will create a test by reading the workflow.
For each adapter activity found in the workflow a mock adapter will be made.
For each sub-workflow activity found in the workflow a mock sub-workflow will be made.
For each component activity found in the workflow a mock component will be made.
Mocks provide canned responses so that a real adapter does not need to be called to run the test.
For instance, if there is an LDAP adapter in the workflow, a mock adapter can be created that returns a specific user in the adapter response, which the workflow can then process.
This means that an LDAP server does not need to be installed just to run a test.
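As a sketch of why mocking removes the need for a live server, the Python below substitutes a mock LDAP adapter for a real one. The class and field names are invented for illustration only and are not part of the TrustBuilder API.

```python
# Illustrative sketch only: these classes and field names are invented
# for this example and are not part of the TrustBuilder API.

class LdapAdapter:
    """Stand-in for a real LDAP adapter that would query a directory server."""
    def lookup(self, username):
        raise RuntimeError("would need a live LDAP server")

class MockLdapAdapter(LdapAdapter):
    """Mock that returns a canned user, so no LDAP server is needed."""
    def lookup(self, username):
        return {"uid": username, "cn": "Test User", "mail": "test.user@example.com"}

def workflow(adapter, username):
    """Simplified workflow step that processes the adapter response."""
    user = adapter.lookup(username)
    return f"Hello, {user['cn']}"

# During a test, the mock is substituted for the real adapter:
print(workflow(MockLdapAdapter(), "tuser"))  # prints: Hello, Test User
```

The workflow code is unchanged between test and production; only the adapter it is given differs.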
For each adapter activity a mock adapter is created in the test. Each mock adapter is represented as a form in the Adapters tab of the test file.
|Field Name||Description||Is Required?|
|Status||The status returned in the response from the adapter. The default is 0, meaning OK. Check the adapter documentation for relevant response status codes.||no|
|Sub Status||The sub status returned in the response from the adapter. The default is 0, meaning OK. Check the adapter documentation for relevant response status codes.||no|
|Message||This is a message that will be added to the response.||no|
|Return Code||The return code in the response from the adapter. The default is OK. Check the adapter documentation for relevant return codes.||no|
|Properties||Name/value pairs that will be added to the response.||no|
To add extra properties to a mock adapter click the Add Property button. To delete a property click the rubbish bin icon beside the name entry of the property.
If changes are made, click the Save button.
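The fields above can be pictured as a single mocked response object with the stated defaults. The Python sketch below is illustrative only; the key names mirror the form fields but are an assumption, not the TB Engine's actual response format.

```python
# Illustrative only: key names mirror the form fields above but are
# assumptions, not the actual format used by the TB Engine.

def make_mock_adapter_response(status=0, sub_status=0, message=None,
                               return_code="OK", properties=None):
    """Build a mocked adapter response using the defaults described above."""
    return {
        "status": status,           # default 0, meaning OK
        "subStatus": sub_status,    # default 0, meaning OK
        "message": message,         # optional message added to the response
        "returnCode": return_code,  # default is OK
        "properties": dict(properties or {}),  # extra name/value pairs
    }

# A mock response carrying one extra property for the workflow to read:
response = make_mock_adapter_response(properties={"uid": "tuser"})
```

Adding a property via the Add Property button corresponds to adding one more name/value pair to the response.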
For each sub-workflow activity a mock sub-workflow is made in the test file.
The form for the mock sub-workflow has a single input, Return. If a value is entered, this is what will be returned from the sub-workflow when the workflow is run.
For each component activity a mock component is made in the test file.
The form for the mock component has a single input, Return. If a value is entered, this is what will be returned from the component when the workflow is run.
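Conceptually, a mock sub-workflow or component behaves like a function that ignores its inputs and always yields the configured Return value. This minimal Python sketch is an illustration of that behaviour, not TrustBuilder's implementation.

```python
# Illustrative only: not TrustBuilder's internal representation of a mock.

def make_mock(return_value):
    """Mock for a sub-workflow or component: always yields the configured Return value."""
    def mock(*_args, **_kwargs):
        return return_value  # inputs are ignored; the canned value is returned
    return mock

# Whatever the workflow passes in, the mock returns the configured value:
mock_subflow = make_mock({"authenticated": True})
result = mock_subflow("any", "input")  # {"authenticated": True}
```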
For each mock adapter a test is created by default.
New tests can be created by entering a name for a test in the input box at the top of the test name tab and then clicking the Add New Test button.
There is a form for each test:
|Form Field||Description||Is Required?|
|Parameters||Data that will be passed to the workflow as request parameters when it is tested.||yes, null is acceptable|
|Headers||Data that will be passed to the workflow as request HTTP headers when it is tested.||yes, null is acceptable|
|Cookies||Data that will be passed to the workflow as request HTTP cookies when it is tested.||yes, null is acceptable|
|Body||The body of the request that will be passed to the workflow when it is tested.||yes, null is acceptable but not usual|
|Assertions||These assertions will be tested once the call to the workflow has been run.||yes, at least one|
|Assertion: Variable Name||The name of the variable to be tested.||yes|
|Assertion: Type||The type of assertion: whether the resulting value must be exact or match. Match can be used with regular expressions.||yes|
|Assertion: Value||The value that is expected.||yes|
|Evaluations||Evaluations are run when the test is run. Commonly used to add a log line, such as marking the beginning of the test.||no|
For more details and examples regarding these fields, please refer to the testing your workflow documentation.
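The two assertion types can be sketched in Python as follows. The checking logic, in particular the use of a full regular-expression match for the match type, is an assumption for illustration and not the TB Engine's implementation.

```python
import re

# Illustrative sketch of the two assertion types described above; the
# checking logic is an assumption, not the TB Engine's implementation.

def check_assertion(variables, name, kind, expected):
    """Check one assertion against the workflow's resulting variables.

    kind "exact" -> the value must equal the expected value exactly
    kind "match" -> the expected value is a regular expression that
                    the value must match
    """
    actual = str(variables.get(name, ""))
    if kind == "exact":
        return actual == expected
    if kind == "match":
        return re.fullmatch(expected, actual) is not None
    raise ValueError(f"unknown assertion type: {kind}")

# Example resulting variables after a workflow run:
results = {"userId": "u12345", "status": "OK"}
check_assertion(results, "status", "exact", "OK")     # True
check_assertion(results, "userId", "match", r"u\d+")  # True
```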
To add new assertions click the Add New Assertion button. To delete assertions click the rubbish bin icon to the right of the value entry.
To add a new evaluation click the Add New Evaluation button. To delete evaluations click the rubbish bin icon to the right of the input box.
To delete a test click the Delete button in the bottom right hand corner of the relevant test box.
If changes are made, including adding or removing assertions and evaluations, the test must be saved by clicking the Save button in the bottom left hand corner of the test box.
Editing a Test
To edit a test click the Edit button in the list of workflow tests in the relevant row.
Deleting a Test
To delete a test click the Delete Test button in the list of workflow tests in the relevant row.
If a test is deleted but the workflow remains, a new test can be created by clicking the Create Test File button.
Running a Test
To run a test click the Run Test button in the list of workflow tests in the relevant row.
Running a test will open the test results screen, showing a summary of the test results and the output log from the TB Engine when it ran the test.
If the test failed for any reason, such as an error in a script, workflow, or the configuration, the log will be displayed so that the error can be diagnosed and corrected.
If there are no errors from the engine and it can run the tests, the results will detail the number of tests run, passed, and failed.
If there are failed tests, the results of the failed assertions will be printed in an errors list.
If the tests run successfully, this is displayed.