Title: Usability Report. Tuubi Portal

Due date of Report: 10.12.2009

Actual submission date: 14.12.2009

Revised version: 14.12.2009 (final)

Product name and version:

Tuubi Portal, stable version (released 2008)

Organisers of the test:

Kashi Gauli, Ademola Somoye

Date of the test:

10.12.2009

Date of the report:

12.12.2009

Editor:

Kashi Gauli

Contact name(s):

Kashi Gauli (kashira()metropolia.fi)

Executive summary

Provide a brief high-level overview of the test (including the purpose of the test)

Name the product: Tuubi Portal
Purpose/objectives of the test: To get functionality and usability views from users.
Method 1: pilot testing, verbal questionnaire
Number and type of participants: 2, students
Tasks (if tasks are used):
Method 2: paper questionnaire

Results in main points

  • Tuubi interface looks good
  • information accessibility is not clear
  • easy navigation
  • useful workspaces
  • easy upload of assignments


Introduction

Full Product Description

  • Product name and release date: Tuubi Portal, released 2008
  • Describe what parts of the product were evaluated

Interface, Accessibility of information, Navigation system, Workspaces, Information retrieval

  • The user population for which the product is intended

About 12,000 users.

  • Brief description of the environment in which it should be used (this means the context of use of the product/tool, e.g., is it an education product used in primary school, higher education, etc., or maybe a research tool used in the field - and if so, what could the field be)

School is the primary environment for the Tuubi portal, because the academic activities it supports happen there. It is used in all computer labs where an internet connection is available, as well as in the library, the lobby, etc. The secondary environment is home, where students and teachers use Tuubi for academic purposes: students access Tuubi for school-related work such as homework and workspaces, and teachers use it to check assignments, upload new assignments, and so on.

Test Objectives

  • State the objectives for the test and any areas of specific interest
    • Functions and components with which the user directly and indirectly interacted

Interaction with the Tuubi interface, navigation between different pages, uploading documents or assignments, etc.

    • Reason for focusing on a product subset

These are the most critical parts of the Tuubi portal.

Method

Participants

  • The total number of participants tested

         3

  • Segmentation of user groups tested, if more than one user group was tested

                Only one Tuubi user group was tested: students.

  • Key characteristics and capabilities of user group (this info might have been acquired through the background (pre) questionnaires, thus it can be just referred here, e.g. linked to the description of the results of the background (pre) questionnaires)

Using Tuubi is compulsory for all students at Metropolia, so everyone needs to know its basic functionality, i.e. how to use Tuubi, as well as its usability aspects.

        

  • How participants were selected; whether they had the essential characteristics

There was no formal selection; we chose classmates who are students and daily users of the Tuubi portal.

  • Differences between the participant sample and the user population

All participants are students, so they do not have full rights in Tuubi; teachers mostly get administrative rights, while students, the primary users of Tuubi, have limited rights.

Context of Product Use in the Test

  • Any known differences between the evaluated context and the expected context of use

There were differences in users' views of the Tuubi portal interface and of the workspaces; some users thought the workspaces were useless.

  • Tasks

To get views on the interface, information accessibility, the navigation system, the usefulness of workspaces, uploading assignments, etc.

  • Describe the task scenarios for testing
    • Explain why these tasks were selected

Because these are the most essential parts of the Tuubi portal, which all of its users must know.

    • Describe the source of these tasks

                  Individual interaction with Tuubi gave us the idea of how to build the tasks related to these essential parts.

    • Include any task data/information given to the participants

Paper Questionnaire (attachment)

    • Completion or performance criteria established for each task

We received good feedback from the participants.

Test Facility
Describe the setting and type of space in which the evaluation was conducted
Detail any relevant features or circumstances which could affect the results (e.g. there was a breakdown of the server, which messed up the test for a while and created unnecessary tension; there was unforeseeable noise that disturbed the test, etc.)

We organized our actual testing in a classroom where students were having their lab session. It was a good environment, and since we used a paper questionnaire it was easy to conduct the test.

Participant's Computing Environment

  • Computer configuration, including model, OS version, settings

  Mac OS.

  • Browser name and version;

Mozilla Firefox 3.5.5

  • Relevant plug-in names and versions (the bullets mean stating, e.g., what browser and computers the users are using in the test. In field trials this is information that is not known by the technical partners. For example, in one of the tests during last spring 2007, one of the users was at home using SSp during the test, so it was asked what she used, e.g., Internet Explorer 6 and Mozilla Firefox 2.0.0.6, a Compaq Presario with Windows XP and an IBM ThinkPad with Windows XP. If all of this is not known, it is not reported, but it would be good to try to get the info. Plug-ins can refer, for example, to the browser add-ons (in Firefox these are found in the upper Tools menu). Sometimes it is necessary to know whether some plug-ins are on or off, because this might change or prohibit some functions.)

Display Devices (report if relevant, e.g., Paper prototypes are tested or static prototypes are tested on screen)

  • If screen-based, screen size, resolution, and color setting

Default; it was set already in the lab.

  • If print-based, the media size and print resolution

Test Administrator Tools (report if relevant for the particular test)

  • If a questionnaire was used, describe or specify it here (add these to appendix)
  • Describe any hardware or software used to control the test or to record data (audio, video)

Experimental Design

  • Define independent variables and control variables

It was an independent-variable kind of testing, because the users were quite familiar with the test questions.

  • Describe the measures for which data were recorded (the scale/scope of the recorded data, if relevant for the particular test, i.e., written notes, think aloud in audio recording, etc.).

Written notes.

Procedure

  • Operational definitions of measures (e.g., how it is decided that a task is completed)

The test was considered complete when we got the tasks back from the participants.

  • Policies and procedures for interaction between tester(s) and subjects (e.g., is the test conductor allowed to answer questions of the user, provide help, etc.)

We provided some help regarding the parts of the test questionnaire that were unclear to the participants.

  • State what was used: non-disclosure agreements, form completion, warm-ups, pre-task training, and debriefing

 Nothing was used.

  • Specific steps followed to execute the test sessions and record data

We followed a few steps: we handed out the questionnaire at the beginning, waited for the participants to complete the task, collected the questionnaires back from them, and finally evaluated them all together.

  • Number and roles of people who interacted with the participants during the test session

Two of us interacted with the participants.

  • Specify if other individuals were present in the test environment

No

  • State whether participants were paid

No

Participant General Instructions (here or in Appendix)

  • Instructions given to the participants

We asked them to fill in all the relevant parts of the questionnaire and to feel free to give feedback.

  • Task instruction summary: to go through the questionnaire and fill in the relevant places.
  • Usability Metrics (if used)

No

  • Metrics for effectiveness
    • Metrics for efficiency
    • Metrics for satisfaction, etc.

Results

  • Data Analysis

All the filled-in questionnaires were collected from the 3 participants. We compared each participant's view on the same question and tried to identify the common views (a simple tallying sketch of this kind of analysis is given after this list).

  • Quantitative data analysis: the quantitative results can be considered good, because most of the feedback received from the participants was positive in terms of interface, accessibility, navigation, etc.
  • Qualitative data analysis

The qualitative data analysis reflected more or less the personal views of the participants, because their thoughts about Tuubi were not very similar to one another.

  • Presentation of the Results
    • From quantitative data analysis
    • From qualitative data analysis (descriptive and clarifying presentation of the results)
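
Since the questionnaire uses a fixed five-point scale and there were only 3 participants, the quantitative analysis amounts to tallying the answers given to each question and taking the most common one. The short Python sketch below illustrates that kind of tally; the question numbers and answer values in it are illustrative placeholders, not the answers collected in this test.

from collections import Counter

# One dict per participant: question number -> chosen point on the five-point
# scale (Strongly Agree ... Strongly Disagree). Placeholder values only.
responses = [
    {1: "Agree", 2: "Neither Agree nor Disagree", 3: "Agree"},
    {1: "Agree", 2: "Agree", 3: "Strongly Agree"},
    {1: "Strongly Agree", 2: "Neither Agree nor Disagree", 3: "Agree"},
]

# For each question, tally the three answers and report the most common view.
for question in sorted(responses[0]):
    tally = Counter(participant[question] for participant in responses)
    view, count = tally.most_common(1)[0]
    print(f"Question {question}: {view} ({count} of {len(responses)} participants)")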

Reliability and Validity

Reliability is the question of whether one would get the same result if the test were to be repeated.
This is hard to acquire in usability tests, but it can be reasoned how significant the findings are.
(Example from expert evaluation: This review was made by one reviewer in order to give quick feedback to the development team. To get more reliable results it would have been desirable to use three or at least two reviewers, as it is often the case that different reviewers look at different things. We do feel, however, that for the purpose of this report, and the essence of quick feedback, one reviewer has given enough feedback to enhance the usability of the system.)

The results would not be exactly the same if the test were repeated, because different participants will always give different views and ideas. If we were to run the test again with different people, they would come up with different answers in the questionnaire.

Validity is the question of whether the usability test measured what was thought it would measure, i.e., provides answers to. Typical validity problems involve: using the wrong users, giving them the wrong tasks.
(Example from expert evaluation: "The reviewer is an experienced usability professional that has evaluated systems for many years. We therefore feel that the method used as well as the tasks used give an appropriate view of how ordinary users would behave in the system.")

Generally, we feel we arrived at almost the same answers as we had expected beforehand.

Summary Appendices

  • Custom Questionnaires (if used; e.g., in an expert evaluation there are no participants)
  • Participant General Instructions
  • Participant Task Instructions, if tasks were used in the test

Questionnaire:

Usability and Interface Design

Actual Testing 2009-12-10

Tuubi Portal (tuubi.metropolia.fi)

DAP07S Testing Result

Case study - Students. Tuubi has just been born, but it used to be the Ovi portal.

The Tuubi portal was built to accommodate the merger of two schools, namely Evtek and Stadia Polytechnic. The new Tuubi serves as a medium or network through which students, teachers and staff of Metropolia interact in terms of sending and receiving information.

The basic features of the all-new Tuubi include:

  • Course Assessment
  • Course Timetable
  • Moodle links
  • E-mail links
  • WinhaWille links
  • Helpdesk Information
  • Forums

The introduction of Tuubi has led to different challenges often faced by students.

The questionnaire tries to throw more light on the basic questions that can help the users make Tuubi a better forum or platform. Students were given some basic questions whose answers will help us improve the functionality of Tuubi.

Tick the appropriate portion with an X.

Number of participants: 3

In the table below, the results based on the 3 different users' views have been marked. We analyze the questions and answers based on the strong points they pointed out in terms of "Agree" or "Disagree".

Question Types (scale: Strongly Agree / Agree / Neither Agree nor Disagree / Disagree / Strongly Disagree)

1. Is the interface good i.e. color scheme etc? - Agree
2. Is information easily accessible? - Neither Agree nor Disagree
3. Is it easy to navigate from one page to another? - Agree
4. Are the workspaces a useful tool? - Neither Agree nor Disagree
5. Does it serve as a good media of retrieving information within the school? - Agree
6. Can you upload your assignment easily? - Strongly Agree
7. Does it serve as a social media within the school? - Disagree


8. How do you think Tuubi can be improved?

- Some unused features should be removed
- Better interaction between students and teachers, feedback, etc.
- Color management (use of darker colors)
- Tuubi, teachers' personal pages and Winha should be merged




9. What additional functionality would you like to see in Tuubi?

- Chatting
- Web FTP to access personal pages (users.metropolia.fi/-personal name)


10. Any other comments or feedback?

- Tuubi is getting improvements
- It will be better in the future