CSC 307 Milestones 7 and 8
Continued Model/View Design and Implementation
Test Planning and Design
Code Reviews




Due:
Milestone 7 is a rough draft of the implementation. It will be discussed during the week 9 lab meetings, so your draft needs to be released by the time of your meeting on Wednesday or Friday of week 9.

The Milestone 8 release is due on two separate dates:

Deliverables:
Continued work on Design and Implementation:

  1. Make any necessary refinements to the package design of Milestone 6.

  2. Provide design documentation for all packages, classes, methods, and data fields

    1. Note Well: Permanent point deductions will be made for any missing documentation.

    2. The 307 Design and Implementation Conventions handout has the details of what this documentation entails.

    3. In particular, model class documentation must consist of

      1. a description of the purpose of the class

      2. a summary of the methods that the class provides

      3. for classes that define significant data structures, a description of what the data structures are, including pictures if necessary

      4. for classes that implement a significant piece of processing, a description of the processing algorithms employed

    4. At the end of the conventions handout is the following point breakdown for the design documentation:
      • packages: 5%
      • classes: 15%
      • methods: 12%
      • data fields: 8%
      which means you will lose up to 40% of the points for your portion of the milestone if any or all design documentation is missing.

  3. Complete the .java files for all derived model classes.

    1. The files should have public methods that trace to operations in the requirements specification, some of which are implemented (see below).

    2. If a method is unimplemented, make it a "stub", as described in the Milestones 5-6 writeup.

  4. Define model class data representations.

    1. I.e., provide the class data fields that define how class data are represented.

    2. Begin implementation of model methods, to produce real data or canned testing data (see the class sketch at the end of this Design and Implementation list).

    3. As necessary, write test data-generation methods, to support model classes where underlying process classes are not yet completed.

    4. Each team member must have some key aspect of the model data design implemented, where the precise meaning of "key" will be determined in our lab meetings, and "implemented" means that the execution of the program can be validated as described under HOW-TO-RUN below.

  5. Refine view classes.

    1. Continue to refine view class design and implementation.

    2. As necessary, provide hand-built model testing data, when underlying process classes are not completed.

  6. Bottom Line -- some things must actually run for Milestone 8.

    1. There must be implemented model classes that hold real data. (Real data can include canned testing values; "real" means that data are held in the model, not just views.)

    2. GUIs on the screen must communicate with the models to change and view the data.

    3. We will discuss in Week 8 and 9 labs precisely what each team member is responsible for.
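
To make the documentation, stub, and data-representation requirements above concrete, here is a minimal sketch of a derived model class. The Roster class, its package, and its methods are purely hypothetical illustrations, not part of any team's project; your own classes must be documented per the 307 conventions and trace to your requirements specification.

package myproject.model.roster;

import java.util.ArrayList;
import java.util.List;

/**
 * The Roster class holds the names of the students enrolled in a course.
 * It provides methods to add a name, remove a name, and retrieve the
 * current list of names.  The data are represented as a List of Strings,
 * kept in the order in which the names were added.
 */
public class Roster {

    /** The student names currently on the roster, in insertion order. */
    private List<String> names;

    /** Construct an empty roster. */
    public Roster() {
        names = new ArrayList<String>();
    }

    /**
     * Add the given student name to the roster.
     *
     * @param name the name to add
     */
    public void add(String name) {
        names.add(name);
    }

    /**
     * Remove the given student name from the roster.  This method is a
     * stub for Milestone 8, per the Milestones 5-6 writeup.
     *
     * @param name the name to remove
     */
    public void remove(String name) {
        // STUB -- to be implemented in a later milestone
    }

    /**
     * Return the current list of names, for display by view classes.
     *
     * @return the names on the roster
     */
    public List<String> getNames() {
        return names;
    }

    /**
     * Load a small set of canned testing data, for use until the real
     * underlying process classes are completed.
     */
    public void loadCannedData() {
        add("Ada Lovelace");
        add("Alan Turing");
    }
}

A view class would then call methods such as getNames() to populate its GUI components, so that the data displayed on the screen are held in the model rather than only in the view.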
Testing:

  1. For the entire team:

    1. Set up the project testing directory.

    2. The exact structure is up to you, but it must separate the testing code from the application implementation code.

      1. A sample structure is pictured at the end of the Lecture Notes for week 8.

      2. A sample concrete implementation of this structure is in the Milestone 8 example.

  2. Per team member:

    1. Use Spest to validate the preconditions/postconditions for at least three model methods (a hedged, JML-style sketch of such pre/postconditions appears at the end of this Testing section).

    2. Use Spest to generate the tests for these methods.

    3. Put the generated test methods in the appropriate testing directory.

    For Milestone 8, you do not need to write any test methods yourself. The test methods you submit for this milestone are those that Spest generates. For Milestone 10 (the final deliverable), you may need to write test methods yourself, depending on the test coverage that you get from the Spest-generated methods.

  3. Write a draft integration test plan for the entire team. Put the plan here:
    testing/implementation/integration-test-plan.html

  4. Perform code reviews.

    1. Code reviews involve colleagues inspecting one another's code.

    2. Each team member will review the code of a fellow team member. It can be helpful if the code being reviewed is related to the reviewer's own work, but that is not required.

    3. The review is performed for each class, and for each method within each class.

    4. The following are the specific standards that must be met for each class:

      1. follows the 307 design and implementation conventions

      2. compiles and executes properly, where "properly" means it uses actual model data as described above

    5. The following are the specific standards that must be met for each method:

      1. follows the 307 design and implementation conventions, in particular the javadoc commenting and 50-line rule

      2. for the three methods chosen by the team member, the pre- and postconditions are provided and compile with the JML checker

    6. The code review report takes the following form:
      Class X: Review Remarks
      Method X.m1: Review Remarks
      . . .
      Method X.mn: Review Remarks

      Class Y: Review Remarks
      Method Y.m1: Review Remarks
      . . .
      Method Y.mn: Review Remarks
      where Review Remarks are either the description "Meets all standards" or an itemized description of the standards that are not met.

    7. Put the code review files in HTML format in the project directory testing/implementation/code-reviews. Name each file with the UNIX ID of the team member performing the review. This means there will be six code review files in total in testing/implementation/code-reviews.
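
As a concrete but hedged illustration of pre/postconditions on a model method: Spest has its own notation, covered in lab and in the Spest materials, so the following excerpt only sketches the flavor of requires/ensures clauses in JML style, reusing the hypothetical Roster class from the sketch above. Adapt it to whatever notation Spest and the JML checker actually accept.

package myproject.model.roster;

import java.util.ArrayList;
import java.util.List;

/**
 * Excerpt of the hypothetical Roster class, with JML-style
 * pre/postconditions on the add method.
 */
public class Roster {

    /** The student names on the roster; spec_public so the specs may refer to it. */
    private /*@ spec_public @*/ List<String> names = new ArrayList<String>();

    /**
     * Add the given student name to the roster.
     *
     * @param name the name to add
     */
    //@ requires name != null;
    //@ ensures names.contains(name);
    //@ ensures names.size() == \old(names.size()) + 1;
    public void add(String name) {
        names.add(name);
    }
}

The test methods that Spest generates from such specifications go in the project testing directory, parallel to the implementation sources, as in the example paths shown at the end of this handout.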
Administration:

  1. Provide a HOW-TO-RUN.html file in the project administration directory.

    1. This file explains how to start up the one or more executables for your project.

    2. Where necessary, be very precise about what commands the end-user performs, what files need to be present, and any other necessary support structures that need to be in place for execution to be successful.

  2. Provide an m8-duties.html file.

    1. It itemizes for each team member the parts of the project that produce validatable execution results.

    2. It also itemizes precisely for each team member what aspects of the implementation the member is responsible for, and how the implementation is validated.

    3. Validation must include at least some display in a GUI window, with communication to appropriate model class(es).

    4. In addition, validation may also be presented by:

      1. the creation of data files

      2. the creation of database entries

      3. output to stdout
      where all such forms of validation must be described clearly and completely in the HOW-TO-RUN.

    For the Tuesday Milestone 8 deliverable, the m8-duties file also says what classes contain Spest specifications, and where the generated Spest tests are.

    NOTE ABOUT SPEST PROBLEMS: If you cannot get your Spest specs to validate and/or generate tests, provide a brief explanation of the problem you had.

  3. Update work-breakdown.html as necessary.

Here's an example of what goes into each of these three files:

HOW-TO-RUN.html:

To run the instructor version of the program, cd to implementation/executables
and run the command "java -jar testtool_instructor.jar".  To run the student
version, the command is "java -jar testtool_student.jar".

m8-duties, for one team member (there's one such entry for each team member):

I worked on test generation.  What works for Milestone 8 is basic test
generation.  To see this run, go to the 'Test' tab in the main UI, then click
the 'Basic' button under the 'Test Generation' heading.  When the GUI pops up,
enter some values and press the 'Generate' button.  The contents of the
generated test appear on stdout, since the test display GUI hasn't been
implemented yet for Milestone 8.

The model class with Spest specifications is implementation/source/testtool/model/tests/generate/BasicGeneration.java. The generated Spest tests are in testing/implementation/source/testtool/model/tests/generate/BasicGenerationTest.java. NOTE ABOUT SPEST PROBLEMS: If there was a problem with Spest validation or generation, put a brief explanation here.

work-breakdown, for one team member:

Test.java class in package testtool/model/tests; all classes in sub-package
testtool/model/tests/generate.



