Orbit Planner Testing Notes
Some Test Cases (with a few notes on how to test):
A) Trans correctness when run from APT. (TEST & PSDB)
- Verify that Trans generates appropriate output when fed test proposals by APT.
- APT test proposals can be created by converting from proposals in
existing Trans test suites.
- The output from Trans can be analyzed for correctness independently,
or by comparing the output to that created by Trans when running the
same proposals from tdf files.
- Such comparisons could help validate the proposal converter, APT's
communication with Trans, and Trans' correctness when run via APT.
- Verify that Trans generates appropriate output when fed proposal updates by APT.
- Update test scripts can be created by using Trans' recording mechanism
while editing proposals in APT.
- One validation criterion is that Trans should never crash in response
to an update.
- Trans output can be captured after each update, or after a series of
updates.
- If that output can be validated, it can be baselined and used for
future comparison testing.
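The baseline comparison described above could be scripted roughly as follows. This is a minimal sketch, assuming Trans leaves its report files in an output directory after an update playback; the directory layout and the ".rpt" extension are hypothetical examples, not the actual Trans file naming.

```python
# Sketch of a baseline comparison for recorded update sessions.
# Assumes Trans has been run on a playback script and left its report
# files in out_dir; baseline_dir holds previously validated output.
# Paths, layout, and the ".rpt" extension are hypothetical.
import filecmp
from pathlib import Path

def compare_to_baseline(out_dir: str, baseline_dir: str) -> list[str]:
    """Return descriptions of report files that differ from the baseline."""
    out = Path(out_dir)
    base = Path(baseline_dir)
    diffs = []
    for report in sorted(base.glob("*.rpt")):
        candidate = out / report.name
        if not candidate.exists():
            diffs.append(f"{report.name}: missing from new output")
        elif not filecmp.cmp(candidate, report, shallow=False):
            diffs.append(f"{report.name}: differs from baseline")
    return diffs
```

An empty return value would mean the new output matches the baseline; anything else flags a regression (or an intentional change that requires re-baselining).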
B) Trans correctness in the new operational flow. (TEST & PSDB)
- Verify that Trans generates appropriate operational output when run from tdf files
which contain the new exposure properties (orbit number and duration).
- Verify that Trans packs the orbits just as it did for the same
proposal in APT.
- Verify that Trans honors the overridden subexposure durations.
- Verify that Trans is willing to automatically adjust certain exposure
durations up to 20% to correct for overfull orbits.
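A check on the 20% adjustment limit above could look like the following sketch. The 20% figure comes from these notes; the data layout (pairs of requested vs. scheduled durations, in seconds) is a hypothetical stand-in for however the test harness extracts durations from Trans output.

```python
# Sketch of a validation check for Trans' automatic exposure-duration
# adjustment in overfull orbits. The 20% limit is from the test notes;
# the (requested, scheduled) pair representation is hypothetical.
def within_adjustment_limit(requested: float, scheduled: float,
                            limit: float = 0.20) -> bool:
    """True if the scheduled duration stays within the allowed
    fractional adjustment of the requested duration."""
    return abs(scheduled - requested) <= limit * requested

def over_adjusted(durations: list[tuple[float, float]]) -> list[int]:
    """Return indices of exposures adjusted beyond the limit."""
    return [i for i, (req, sched) in enumerate(durations)
            if not within_adjustment_limit(req, sched)]
```

For example, a 100 s exposure scheduled at 80 s is at the 20% boundary and passes, while one scheduled at 79 s would be flagged.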
C) Overall usability in the new operational flow. (APSB, OPB & FSE)
- Verify that operational flow is seamless.
- Users can access and edit the APT proposal.
- Users can initiate the conversion to prop format and initiate
operational processing.
- Operational processing continues to work essentially as it did last
cycle.
- Typical tasks are not too onerous.
D) Orbit Planner GUI correctness. (APSB, OPB, DAB & ID)
- The display is correct with respect to expected Trans results.
- Expected Trans results can be observed in the DG or other Trans
reports.
- All expected GUI actions are present and respond as expected.
E) Orbit Planner GUI usability. (APSB, OPB, DAB & ID)
- The GUI is reasonably intuitive and not too confusing.
- Typical editing tasks are straightforward.
Some Possible Tests
In the following list of tests, comparison tests are represented as
equalities. In '==' equalities, the quantities on the left and right are
expected to be (nearly) exactly the same. In '~=', the quantities on the
left and right should be very similar, but some predictable differences are
expected.
In the equality notation, the following terms are used:
- aptTrans[-n] - A version of Trans suitable for use with
APT. Different n's can denote different revisions.
- currentTrans - The latest operational version of Trans not suitable
for use with APT.
- reports() - A function representing running Trans on some input and
generating report files.
- The input is either xml (implies Trans run from APT), prop (implies
Trans run from PP-generated tdf), or update (implies Trans running a
playback of a recorded APT session).
- dg() - A function representing running Trans and displaying the
output in the DG.
- op() - A function representing running Trans via APT and displaying
the output in the Orbit Planner.
- savedXml() - A function representing an xml file saved from APT at
the end of a recorded update session.
1. aptTrans.reports(prop) == aptTrans.reports(prop2xml(prop))
2. aptTrans.reports(xml) == aptTrans.reports(xml2prop(xml))
3. aptTrans.reports(prop) ~= currentTrans.reports(prop), for props without
OP-EXPOSURE-VALUES
4. aptTrans.reports(update) ~= aptTrans.reports(savedXml(update))
5. aptTrans1.reports(update) == aptTrans2.reports(update)
6. aptTrans.dg(xml) ~= aptTrans.op(xml)
7. Users building and modifying proposals using APT
8. Users building, modifying and processing proposals in APT and
Operations
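The '==' and '~=' comparisons in the list above could be scripted roughly as shown below. This is a sketch under stated assumptions: reports are compared line by line, and for '~=' the predictable differences are captured as regex patterns that a differing line pair must both match. The example patterns (timestamps, version strings) are hypothetical; the real list would come from analyzing actual aptTrans vs. currentTrans output.

```python
# Sketch of the '==' (exact) vs. '~=' (approximate) report comparison.
# With an empty `allowed` list this is the exact test; with patterns
# for the predictable differences it becomes the approximate test.
import re

def compare_reports(left, right, allowed=()):
    """Line-by-line report comparison. Differing line pairs pass only
    if both lines match one of the `allowed` regex patterns."""
    lines_l = left.splitlines()
    lines_r = right.splitlines()
    if len(lines_l) != len(lines_r):
        return False
    for a, b in zip(lines_l, lines_r):
        if a == b:
            continue
        if not any(re.search(p, a) and re.search(p, b) for p in allowed):
            return False
    return True
```

Tests 7 and 8 are interactive usability exercises and would not be scripted this way.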
Test Applicability Matrix
Tests vs. Test Cases |  1 |  2 |  3 |  4 |  5 |  6 |  7 |  8 |
A1                   |  * |  * |    |    |    |    |    |    |
A2                   |    |    |    |  * |  * |    |    |    |
B1                   |    |  * |  * |    |    |    |    |    |
C1                   |    |    |    |    |    |    |    |  * |
C2                   |    |    |    |    |    |    |    |  * |
D1                   |    |    |    |    |    |    |  * |    |
D2                   |    |    |    |    |    |  * |  * |    |
E1                   |    |    |    |    |    |    |  * |    |
E2                   |    |    |    |    |    |    |  * |    |
To Do Before Testing
- Enable testers (through instruction or s/w) to record proposal update
scripts.
- Enable testers (through instruction or s/w) to play back proposal update
scripts (including in batch mode).
- Finish prop to xml proposal converter.
- Investigate and implement any necessary restrictions for subexposure
duration overrides.
- Create test data and plans for analyzing output.
- Add APT/OP support for exposure groups (patterns, seq non-int, etc.).
- Review and complete APT/OP support for all (or at least most) optional
parameters and special requirements.
- Design and implement operational proposal storage/retrieval/conversion
mechanism.