...

  • Any known differences between the evaluated context and the expected context of use
    • There is no significant difference between our expectations and the actual context of use, since we use the system ourselves and experience what the respondents experience.
  • Tasks
    • retrieving student data
    • changing the address
    • checking confirmed participation
    • checking completed courses
    • checking the study guideline
    • checking the courses offered by the department
    • enrolling in the offered courses
    • changing the data language
    • checking the enrollment status (season)
  • Describe the task scenarios for testing
    • Explain why these tasks were selected
      • Because they are the basic and most common tasks performed in the system.
    • Describe the source of these tasks
      • From our own experience
    • Include any task data/information given to the participants
      • We provided a questionnaire.
    • Completion or performance criteria established for each task
      • The respondents were expected to answer YES, NO, or YES WITH DIFFICULTIES for each task; in addition, space was provided for their personal opinions.
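The completion criterion above can be tallied mechanically. A minimal sketch in Python, assuming the three answer labels from the questionnaire and counting YES WITH DIFFICULTIES as a completion (the sample answers are illustrative, not real data):

```python
from collections import Counter

# Answer labels as defined in the questionnaire.
COMPLETED = {"YES", "YES WITH DIFFICULTIES"}

def completion_rate(responses):
    """Share of responses counted as completed; YES WITH DIFFICULTIES counts too."""
    counts = Counter(responses)
    done = sum(counts[label] for label in COMPLETED)
    return done / len(responses)

# Illustrative sample, not the collected data.
sample = ["YES", "YES WITH DIFFICULTIES", "NO", "YES"]
print(completion_rate(sample))  # 0.75
```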

Test Facility
Describe the setting, and type of space in which the evaluation was conducted

  • The test was performed in the labs.

Detail any relevant features or circumstances that could affect the results (e.g., there was a breakdown of the server, which disrupted the test for a while and created unnecessary tension; there was unforeseeable noise that disturbed the test, etc.)

  • There was a printing error, which we corrected right away.

Participant's Computing Environment

  • Computer configuration, including model, OS version, settings
    • OS: Mac
  • Browser name and version
    • Firefox
  • Relevant plug-in names and versions (this bullet means stating, e.g., what browsers and computers the users are using in the test. In field trials this information is not known by the technical partners. For example, in one of the tests during spring 2007, one of the users was at home using SSp during the test, so she was asked what she used: Internet Explorer 6 and Mozilla Firefox 2.0.0.6, on a Compaq Presario with Windows XP and an IBM ThinkPad with Windows XP. If all of this is not known, it would still be good to try to get the information. Plug-ins can refer, for example, to browser add-ons; in Firefox these are found in the Tools menu. Sometimes it is necessary to know whether certain plug-ins are on or off, because they might change or prevent some functions.)
    • OS: Mac
    • Firefox

Display Devices (report if relevant, e.g., Paper prototypes are tested or static prototypes are tested on screen)

...

  • If a questionnaire was used, describe or specify it here (add these to appendix)
    • The questionnaire is attached at the top of this page.
  • Describe any hardware or software used to control the test or to record data (audio, video)
    • We used printed A4 paper forms (no audio or video recording).

Experimental Design

  • Define independent variables and control variables
  • Describe the measures for which data were recorded (the scale/scope of the recorded data, if relevant for the particular test, i.e., written notes, think aloud in audio recording, etc.).

...

  • Operational definitions of measures (e.g., how it is decided that a task is completed)
    • A task is considered completed when the corresponding questionnaire item has been filled in.
  • Policies and procedures for interaction between tester(s) and subjects (e.g., is the test conductor allowed to answer the user's questions, provide help, etc.)
    • We only handed out the questionnaire and waited for the answers (no help was given in answering the questions).
  • State which of the following were used: non-disclosure agreements, form completion, warm-ups, pre-task training, and debriefing
    • None of the above
  • Specific steps followed to execute the test sessions and record data
    • The questionnaire was handed out and collected after the participants had finished.
  • Number and roles of people who interacted with the participants during the test session
    • Four of us
  • Specify if other individuals were present in the test environment
    • No
  • State whether participants were paid
    • No

Participant General Instructions (here or in Appendix)

  • Instructions given to the participants
    • Perform the tasks which were provided in the questionnaire and answer the questions.
  • Task instruction summary
    • The tasks consisted of answering the given questions.
  • Usability Metrics (if used)
    • Efficiency, memorability, errors, satisfaction
  • Metrics for effectiveness
  • Metrics for efficiency
    • Based on the participants' answers
  • Metrics for satisfaction, etc.
    • Based on the participants' answers
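As an illustration of how an effectiveness metric (task success rate) could be derived from the answers, a small Python sketch; the task names follow the report's task list, but the tallies are hypothetical, not the actual results:

```python
# Hypothetical per-task tallies, for illustration only.
tallies = {
    "retrieving student data": {"YES": 4, "YES WITH DIFFICULTIES": 0, "NO": 0},
    "changing the address":    {"YES": 2, "YES WITH DIFFICULTIES": 1, "NO": 1},
}

def effectiveness(tally):
    """Task success rate: YES and YES WITH DIFFICULTIES both count as success."""
    total = sum(tally.values())
    return (tally["YES"] + tally["YES WITH DIFFICULTIES"]) / total

for task, tally in tallies.items():
    print(f"{task}: {effectiveness(tally):.0%} success")
```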

Results

  • Data Analysis
  • Quantitative data analysis 
  • Qualitative data analysis
  • Presentation of the Results
    • From quantitative data analysis
    • From qualitative data analysis (descriptive and clarifying presentation of the results)

...