2.4.1 Random Test Generation Continued

    The purpose of this section is to expand on the test-refining parameters available to the user when creating a random test, beyond those already discussed in 2.2 Basic Test Creation.  For the purposes of this scenario, it is assumed that the user has permission to create a test, that a populated database of questions exists, and that a list of classes has also been created.

    To generate a random test using TestTool's test generation feature, the user selects Test Creation from the startup window, and then selects Test->Generate from the toolbar.  If the user has not defined a question database in the global preferences, or has not loaded a question database into the program, the dialog in Figure 1 appears.

 

Figure 1: Load database dialog.

    Before the user can begin to set the initial parameters for the test, he/she must first select which database to draw questions from.  The OK button is grayed out to indicate that the program cannot move forward without a loaded database.  The user clicks the Browse... button, which displays a standard file chooser window where the user selects a database file to load into the TestTool program.  Once a question database has been loaded into the system, the original load database window reappears, showing the file path of the database that the user has chosen, and the OK button is enabled, as shown in Figure 2:

Figure 2: Loading a database.

    The user then clicks OK to confirm their database selection, or Cancel to stop the process.

    After loading their desired question database, a new window is displayed as shown in Figure 3 below:

Figure 3: Test creation dialog.

    The Test Name field is a one-line string that gives a name to this test file.  The Class drop-down menu displays the list of classes already entered into the question database.  Here the user selects which class the database of questions should be filtered by, as shown in Figure 4:

Figure 4: Selecting a class.

    The two empty text fields below the Class drop-down menu accept numerical input, giving the user the ability to specify the exam's length by time allotted and by number of questions.  The user will not be able to generate a test until the Test Name text field, the time allotted text field, and the Class field are all filled with valid values.  The number of desired questions on the test is an optional parameter.  To stop the test creation process and return to the original screen, the user clicks Cancel.  To generate a test without defining any more parameters, the user clicks the now-enabled Generate Test button, which generates a test and displays the "test statistics" window shown in Figure 8.  In this scenario, however, the user clicks the More... button, which drops down another window panel with more advanced test generation options.  This new panel can be seen in Figure 5.
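
    The enabling rule described above amounts to a simple required-field check: Test Name, time allotted, and Class must hold valid values, while the number of questions may be left blank.  The sketch below illustrates one way that check could look; the class and field names are illustrative assumptions, not taken from the TestTool source.

// Sketch of the required-field check that enables the Generate Test button.
// Class and field names are illustrative only, not the actual TestTool code.
public class BasicTestForm {
    private String testName = "";
    private String selectedClass = "";   // value chosen from the Class drop-down
    private String timeAllotted = "";    // time allotted, as typed by the user
    private String questionCount = "";   // number of questions; optional

    /** Generate Test is enabled only when the required fields hold valid values. */
    public boolean canGenerate() {
        // questionCount is optional, so it does not affect whether the button is enabled
        return !testName.trim().isEmpty()
            && !selectedClass.trim().isEmpty()
            && isPositiveInteger(timeAllotted);
    }

    private static boolean isPositiveInteger(String s) {
        try {
            return Integer.parseInt(s.trim()) > 0;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}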

 

Figure 5: Creating a CSC 102 midterm using basic preferences.

    In this new panel, the button that was previously labeled More... is now labeled Less....  To retract the newly displayed panel, the user clicks the Less... button, which also discards all values entered in the second panel.  A drop-down menu entitled Difficulty lets the user set the test's difficulty on a scale of 1-10.  The Keywords text field is a one-line string that restricts the test to questions whose descriptions contain certain words.  If the user wishes to enter multiple keywords in the Keywords text field, the semicolon (;) character is used as a delimiter between keywords.  Entering more keywords means that every question on the completed test must contain all of the user-entered keywords.  A checkbox associated with a text field allows the user to filter out questions that have been used within the last user-defined number of days.  To enable the text field, the user must check the checkbox, then enter a valid integer into the text field.  If the field is left empty, a default value of "0" is used.  The next checkbox allows the user to filter questions drawn from the database by the author of the question.  To enable the text field, the user must check the checkbox, then enter a string of characters.  If the user wishes to enter multiple authors in the text field, the semicolon (;) character is used as a delimiter between authors.  Should the user not enter any value, no argument is passed to the program, and no authors are filtered.  The user clicks the More... button, and in response, the system displays additional options to further refine the test creation process.  This new, third panel drops down just as the second panel did from the first, as shown in Figure 6.
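
    The three second-panel filters described above can each be read as a predicate over a question: a keyword filter, a last-used filter, and an author filter.  The sketch below shows one plausible reading of those rules; the class, method, and parameter names are assumptions for illustration and do not come from the TestTool implementation.

import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

// Sketch of the second-panel filters.  Names are assumptions for illustration.
public class QuestionFilter {

    /** True if the description contains every semicolon-delimited keyword (empty field = no filter). */
    static boolean matchesKeywords(String description, String keywordField) {
        if (keywordField.trim().isEmpty()) return true;
        for (String kw : keywordField.split(";")) {
            if (!description.toLowerCase().contains(kw.trim().toLowerCase())) return false;
        }
        return true;
    }

    /** True if the question was not used within the last 'days' days (0 or unchecked = no filter). */
    static boolean notRecentlyUsed(LocalDate lastUsed, int days) {
        if (days <= 0 || lastUsed == null) return true;
        return ChronoUnit.DAYS.between(lastUsed, LocalDate.now()) > days;
    }

    /** True if the author appears in the semicolon-delimited author list (empty field = no filter). */
    static boolean matchesAuthor(String author, String authorField) {
        if (authorField.trim().isEmpty()) return true;
        for (String a : authorField.split(";")) {
            if (a.trim().equalsIgnoreCase(author.trim())) return true;
        }
        return false;
    }
}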

Figure 6: Creating a CSC 102 midterm using non-default difficulty.

    In the last filter panel, the user refines the specific types of questions to be asked (i.e., multiple choice, true/false, essay, etc.).  A pair of radio buttons allows the user either to randomly select questions of all types from the question bank, or to specify the number of each question format.  In this example, the user chooses to specify the number of each type of question, editing the Count field of each question format.  The program uses the user-entered Count value as a median and the variance specified in the global preferences to determine how many questions of each format will appear on the generated test.  In this case, the default variance of five is used.  A completed test creation window can be seen below in Figure 7.
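
    The Count/variance rule described above is interpreted here as a draw around the user-entered Count, bounded by the variance from the global preferences (default five).  The specification does not state which distribution TestTool actually uses, so the uniform draw in the sketch below is only an assumption, and the class name is illustrative.

import java.util.Random;

// One plausible reading of the Count/variance rule: the number of questions of a given
// format is drawn uniformly around the user-entered Count, within the variance set in
// the global preferences (default 5).  The distribution and class name are assumptions.
public class FormatCountPicker {
    private final Random random = new Random();

    /** Returns a count in [median - variance, median + variance], never below zero. */
    public int pickCount(int median, int variance) {
        int low = Math.max(0, median - variance);
        int high = median + variance;
        return low + random.nextInt(high - low + 1);
    }

    public static void main(String[] args) {
        FormatCountPicker picker = new FormatCountPicker();
        // e.g. the user asks for 10 multiple-choice questions with the default variance of 5
        System.out.println("Multiple choice count: " + picker.pickCount(10, 5));
    }
}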

Figure 7: Creating a CSC 102 midterm using advanced preferences.

    The user is now done setting the specifications for the exam.  The user clicks the Generate Test button, which generates a test and displays the "test statistics" window for the test, showing the number of questions, the approximate duration of the test, and the distribution of question difficulty and question format, as shown in Figure 8:

Figure 8: Test statistics dialog.

    If these statistics are unsatisfactory, the user selects the Generate Another Test button, which uses the same parameters already entered and generates another test.  To go back to the test creation preferences window, the user clicks the < Back button.  Clicking on the Done button displays the user's randomly generated test, which can be seen in Figure 9.

Figure 9: A completed test.

    Figure 9 shows that the Difficulty value of each question lies within the default variance of 2 from the user-entered test difficulty of 7, and that the final question distribution lies within the default variance of the user-entered Count medians.  Additionally, each question lists one of the user-entered authors as its Author, the Class field of each question matches the user-selected class, and the Keywords of each question include every user-entered keyword.  The system also shows the duration of the completed test and the number of questions on the test, each of which lies within the default range of the corresponding user-entered value.
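
    The difficulty property described above reduces to a range check: every question's difficulty must fall within the difficulty variance (default 2) of the requested test difficulty.  The sketch below illustrates that check with hypothetical values; it is not the actual TestTool code.

// Minimal sketch of the difficulty check implied by Figure 9: every question's difficulty
// must lie within the difficulty variance (default 2) of the requested test difficulty.
// The values in main are hypothetical.
public class DifficultyCheck {

    static boolean difficultiesInRange(int[] difficulties, int requested, int variance) {
        for (int d : difficulties) {
            if (Math.abs(d - requested) > variance) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        int[] hypothetical = {6, 7, 8, 9, 5};  // all within 2 of the requested difficulty of 7
        System.out.println(difficultiesInRange(hypothetical, 7, 2));  // prints true
    }
}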

    With a completed test, the user edits the test manually as he/she sees fit, using basic text editing, or by using the Add, Edit, Delete, Replace buttons at the bottom of the test window.  Discussion of this scenario can be seen in 2.4.2 Manual Test Creation.


Next: Manual Test Creation | Up: Advanced Test Creation | Top: index