CSC 307 Lecture Notes Week 8
Use of Formal Method Specification in Testing
Introduction to System Testing Techniques
Testing Implementation in TestNG and JUnit
post: return == x + y;
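A postcondition like this maps directly to a test assertion. The following is a minimal sketch, assuming a hypothetical add method specified by the postcondition above; the class name and data values are invented for illustration:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class AdderTest {

    // Hypothetical method under test, specified by "post: return == x + y;"
    static int add(int x, int y) {
        return x + y;
    }

    @Test
    public void testAddMeetsPostcondition() {
        int x = 2, y = 3;

        int result = add(x, y);

        // The assertion is a direct transcription of the postcondition.
        assertEquals(x + y, result);
    }
}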
The expression

    if X then Y else Z

is semantically equivalent to the following standard Java expression:

    X ? Y : Z

The expression

    if X then Y

is semantically equivalent to no standard Java expression, since Java always requires the "else" part of an if-then-else expression. It is equivalent to

    X ? Y : null

for all types for which null is a legitimate value.
forall (T x ; constraint ; predicate)
exists (T x ; constraint ; predicate)
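In test code, quantified clauses of this form can be checked by iterating over the bound variable's range. The following is a hedged sketch using Java streams; the collection and predicates are illustrative only and do not come from any particular specification:

import static org.junit.Assert.assertTrue;

import java.util.Arrays;
import java.util.List;

import org.junit.Test;

public class QuantifierTest {

    @Test
    public void testQuantifiedClauses() {
        // Illustrative data standing in for the bound variable's range.
        List<Integer> values = Arrays.asList(2, 4, 6, 8);

        // forall (Integer x ; values.contains(x) ; x % 2 == 0)
        assertTrue(values.stream().allMatch(x -> x % 2 == 0));

        // exists (Integer x ; values.contains(x) ; x > 5)
        assertTrue(values.stream().anyMatch(x -> x > 5));
    }
}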
-- Now on to System Testing Techniques --
Case No. | Inputs           | Expected Output    | Remarks
1        | parm 1 = ...     | ref parm 1 = ...   |
         | ...              | ...                |
         | parm m = ...     | ref parm n = ...   |
         |                  | return = ...       |
         | data field a = ... | data field a = ... |
         | ...              | ...                |
         | data field z = ... | data field z = ... |
n        | parm 1 = ...     | ref parm 1 = ...   |
         | ...              | ...                |
         | parm m = ...     | ref parm n = ...   |
         |                  | return = ...       |
         | data field a = ... | data field a = ... |
         | ...              | ...                |
         | data field z = ... | data field z = ... |
Table 1: Unit test plan.
unix3:~gfisher/work/calendar/testing/implementation/source/java/caltool/integration-test-plan.html
class X {
    // Method under test
    public Y m(A a, B b, C c) { ... }

    // Data field inputs
    I i;
    J j;

    // Data field output
    Z z;
}
class XTest {
    public void testM() {
        // Set up
        X x = new X(...);
        ...

        // Invoke
        Y y = x.m(aValue, bValue, cValue);

        // Validate
        assertEquals(expectedY, y);
    }
}
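A concrete instance of this skeleton in JUnit 4 might look like the following; the Account class, its deposit method, and the test values are hypothetical, invented only to fill in the set-up/invoke/validate pattern:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class AccountTest {

    // Hypothetical class under test, standing in for class X above.
    static class Account {
        private int balance;                 // data field input and output

        Account(int balance) { this.balance = balance; }

        // Method under test, standing in for m(a, b, c).
        int deposit(int amount) {
            balance += amount;
            return balance;
        }

        int getBalance() { return balance; }
    }

    @Test
    public void testDeposit() {
        // Set up
        Account account = new Account(100);

        // Invoke
        int result = account.deposit(50);

        // Validate the return value and the data field output.
        assertEquals(150, result);
        assertEquals(150, account.getBalance());
    }
}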
Test No. | Inputs       | Expected Output  | Remarks | Path
i        | parm 1 =     | ref parm 1 =     |         | p
         | ...          | ...              |         |
         | parm m =     | ref parm n =     |         |

where p is the number of the method path covered by test case i.
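As a hedged illustration of path-numbered test cases, consider a hypothetical method with two paths; each test case in the plan would cite the path number it covers:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class AbsTest {

    // Hypothetical method under test with two paths:
    //   path 1: x >= 0,  path 2: x < 0
    static int abs(int x) {
        if (x >= 0) {
            return x;    // path 1
        } else {
            return -x;   // path 2
        }
    }

    @Test
    public void testAbsPath1() {
        // Covers path 1 (non-negative input).
        assertEquals(5, abs(5));
    }

    @Test
    public void testAbsPath2() {
        // Covers path 2 (negative input).
        assertEquals(5, abs(-5));
    }
}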
unix3:~gfisher/work/calendar/testing/implementation/source/java/Makefile
class SomeModestModel {
    public void doSomeModelThing(String name) {
        ...
        hdb.doSomeProcessThing(...);
        ...
    }

    protected HumongousDatabase hdb;
}

class HumongousDatabase {
    public void doSomeProcessThing(...) {
        ...
    }
}
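One way to exercise SomeModestModel without a real HumongousDatabase is to substitute a lightweight stub. The sketch below is an assumption-laden illustration: the concrete parameter types, the constructor injection of hdb, and the stub class are all invented here, not part of the example above:

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class SomeModestModelTest {

    // Hypothetical concrete versions of the classes above, with the elided
    // parameter lists filled in only for the sake of this sketch.
    static class HumongousDatabase {
        public void doSomeProcessThing(String name) {
            // Imagine an expensive database operation here.
        }
    }

    static class SomeModestModel {
        protected HumongousDatabase hdb;

        SomeModestModel(HumongousDatabase hdb) { this.hdb = hdb; }

        public void doSomeModelThing(String name) {
            hdb.doSomeProcessThing(name);
        }
    }

    // Stub that records the call instead of doing real database work.
    static class StubDatabase extends HumongousDatabase {
        boolean called = false;

        @Override
        public void doSomeProcessThing(String name) {
            called = true;
        }
    }

    @Test
    public void testDoSomeModelThingDelegatesToDatabase() {
        StubDatabase stub = new StubDatabase();
        SomeModestModel model = new SomeModestModel(stub);

        model.doSomeModelThing("example");

        // The model delegated its processing to the (stubbed) database.
        assertTrue(stub.called);
    }
}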
Figure 1: Testing directory structure.
Directory or File | Description |
*Test.java | Implementation of class testing plans. Per the project testing methodology, each testing class is a subclass of the design/implementation class that it tests. |
input | Test data input files used by test classes. These files contain large input data values, as necessary. This subdirectory is empty in cases where testing is performed entirely programmatically, i.e., the testing classes construct all test input data dynamically within the test methods, rather than reading it from test data files. |
output-good | Output results from the last good run of the tests. These are results that have been confirmed to be correct. Note that these good results are platform-independent, i.e., the correct results should be the same across all platforms. |
output-prev-good | Previous good results, in case current results were erroneously confirmed to be good. This directory is superfluous if version control of test results is properly employed. However, this directory remains as a backup to avoid nasty data loss in case version control has not been kept up to date. |
$PLATFORM/output | Current platform-specific output results. These are the results produced by issuing a make command in a platform-specific directory. Note that current results are maintained separately in each platform-specific subdirectory. This allows for the case that current testing results differ across platforms. |
$PLATFORM/diffs | Differences between current and good results. |
$PLATFORM/Makefile | Makefile to compile tests, execute tests, and difference current results with good results. |
$PLATFORM/.make* | Shell scripts called from the Makefile to perform specific testing tasks. |
$PLATFORM/.../*.class | Test implementation object files. |
Table 2: Test file and directory descriptions.