Thinking about a basic use-case (and general requirements) process for team use?
This may be modified or ignored if you have another successful
process for eliciting requirements, but it can be a good starting
point. It is not a precise definition of a process, but an outline
of something that can work for you.
Brainstorm basic features and scenarios. Start with basic user scenarios (concrete examples of the use of the proposed system, or of a similar one).
List the scenarios and generalize this (short, simple) list into a list of features necessary for the new system.
Add the features necessary to make your system what it is (those parts beyond the standard system).
(Note Karl Wiegers' definition of "feature": a feature
encompasses several requirements that together let the user take
advantage of it.)
Take this list of features and carefully review it for correctness, completeness, and consistency at a high level.
Check the list with your users and customers to be sure it covers their picture of needed features.
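The scenario-to-feature generalization above can be sketched as plain data. This is only an illustration; the scenario and feature names are invented for the example and do not come from any real project:

```python
# Hypothetical example: generalizing concrete user scenarios into features.
# All scenario and feature names here are invented for illustration.

scenarios = [
    "A student searches the catalog for a course by title",
    "An advisor searches the catalog for a course by number",
    "A student registers for an open section",
    "A student is placed on a waitlist for a full section",
]

# Several concrete scenarios generalize into one feature each.
features = {
    "FE-1 Catalog search": scenarios[0:2],
    "FE-2 Section registration": scenarios[2:3],
    "FE-3 Waitlisting": scenarios[3:4],
}

# A quick completeness check: every brainstormed scenario should be
# covered by some feature, and every feature should cover something.
for feature, covered in features.items():
    print(feature, "-", len(covered), "scenario(s)")
```

The point of keeping the mapping explicit is the review step that follows: an uncovered scenario or an empty feature is exactly the kind of completeness defect you want to catch before writing use-cases.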
Determine basic use-cases required to make scenarios work. Begin to develop use-cases that implement the features:
Create a list of "bite-sized" tasks for a given "role" (an entity that wants to utilize your system).
Postulate the precondition and postcondition (notable states of the
system that MUST be true before and after the use-case executes).
Develop the "normal course" for the use-case, using Wiegers' use-case
form (the "normal course" is the one in which all goes smoothly and
successfully).
Develop any "alternate" courses for accomplishing the same use-case
(these could be candidates for "include" or "extend" relationships; see
the Schach text for hints).
Indicate any exceptional courses: what can go wrong, and what behavior is required under that condition?
If the use-case is large or shows repeated (or separable) behavior, consider breaking it into more than one use-case.
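The parts of a use-case listed above can be sketched as a simple record. This is a minimal sketch only: the field names approximate Wiegers' use-case form but are not an exact copy of his template, and the "Withdraw cash" example is hypothetical:

```python
# A minimal sketch of a Wiegers-style use-case record. Field names
# approximate his template; they are an assumption, not an exact copy.
from dataclasses import dataclass, field

@dataclass
class UseCase:
    name: str
    role: str                        # the "role" (actor) that uses the system
    preconditions: list[str]         # MUST be true before the use-case executes
    postconditions: list[str]        # MUST be true after it completes
    normal_course: list[str]         # steps where all goes smoothly
    alternate_courses: dict[str, list[str]] = field(default_factory=dict)
    exceptions: dict[str, str] = field(default_factory=dict)  # what can go wrong -> required behavior

# Hypothetical example for illustration only:
withdraw = UseCase(
    name="Withdraw cash",
    role="Account holder",
    preconditions=["Card is valid", "Account is open"],
    postconditions=["Balance reduced by amount dispensed"],
    normal_course=[
        "1. Holder inserts card and enters PIN",
        "2. Holder selects an amount",
        "3. System dispenses cash and updates the balance",
    ],
    alternate_courses={
        "2a. Holder picks a custom amount": ["2a1. Holder types an amount"],
    },
    exceptions={"Insufficient funds": "Refuse withdrawal and display balance"},
)
```

Writing the courses as separate fields makes the team QA step below concrete: a missing postcondition or an empty exceptions list is visible at a glance.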
Team QA check on use-cases. Take
each individual use-case to the team, hand each team member a copy,
then walk the team through it. The team should be QA for you
here, asking questions that should be answered in the document, or
noting defects for later repair. The team should also look at the
top-level list of use-cases (the use-case diagram) and should continue
to check for correctness, completeness, and consistency at that top
level. This whole process can define your functional
requirements, so it is important.
Meet with the customer - customer walkthrough. After
your team check, the use-case should be pretty-printed and spruced up
(simplified as much as possible, with diagrams where necessary - and
where possible :-). Gather any accessory information (old documents the
customer has seen and commented on, other connected documents that may
be needed - you want to be prepared to make your customer's job easy
and your job efficient). Organize your questions for your
customer beforehand and take a copy of your use-cases (just a few at a
time) to her for her review. Walk her through the use-cases and
ask for comments on their correctness, completeness, and consistency.
Have some team member take notes while another dialogues with
her. (I'd have another team member paying attention to all the
"other" sorts of information you may encounter during such sessions
that is not relevant to your use-cases at that moment - someone to note
quality attributes, business requirements, system requirements, design
ideas, data items from the problem domain for your data dictionary...
all notes to help you now and in the future.)
Repeat as needed. Feedback
and corrections (attributed to the customer as contributor/reviewer)
are very important. Notice that there is a point of diminishing
returns, and keep in mind your context diagram and main features (don't
stray if at all possible).
Do this all as a formal and scheduled process. Have set goals and track your progress.
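One lightweight way to track that progress is a status table over the review stages described above. This is a sketch only; the stage names and use-case names are invented for the example:

```python
# Hypothetical progress tracker: each use-case moves through the
# review stages described in this process. Stage names are assumed.
STAGES = ["drafted", "team-reviewed", "customer-reviewed", "accepted"]

status = {
    "UC-1 Withdraw cash": "customer-reviewed",
    "UC-2 Deposit check": "team-reviewed",
    "UC-3 Transfer funds": "drafted",
}

def behind(status, goal="customer-reviewed"):
    """Return the use-cases that have not yet reached the goal stage."""
    goal_idx = STAGES.index(goal)
    return [uc for uc, s in status.items() if STAGES.index(s) < goal_idx]

# Use-cases still needing a customer walkthrough:
print(behind(status))
```

Reviewing a list like this at each scheduled meeting gives you the set goals and visible progress the process calls for.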