Testing Results

Usability Report.


Due date of Report:

10.12.2009

Actual submission date:

10.12.2009

Product name and version:

WinhaWille (student page)

Organisers of the test:

Bella Kayumova
Recep Ercan
Lilia Galyautdinova

Date of the test:

Week 48-50

Date of the report:

10.12.2009

Executive summary

The test was conducted on Metropolia school premises; the representative group consisted of full-time bachelor's degree Metropolia students. WinhaWille is a standard application used in higher education schools across Finland. It is intended to assist in the study process: study year enrolment, managing the ISP (individual study plan) through course registration, monitoring study progress, and so on. WinhaWille has two user interfaces: one for teachers and one for students. In our testing we focused on the student page, due to its availability.

Name of the product: WinhaWille (student page)

Purpose/objectives of the test: to research usability aspects of the WinhaWille student page and to identify problems and benefits.

Since WinhaWille is strictly an educational assistance tool, we do not cover the attractiveness and visual design of the application.

Users and tools used: pre-questionnaire, interview, and questionnaire with 5 users.

Improvement suggestions:

  • Login: make login possible by pressing Enter; modify the time-out throw-out page
  • Menu structure:
    • delete the items: Exam, ISP contents, Bulletin board
    • unite "Student data" and "Change address" into "Student data"
    • add "Enrol to a study year"
  • Title definitions: change to more comprehensible ones where needed
  • ISP and Implementation: modify the pages

Language:

  • Possibility to change the language without losing the session; language consistency

General:

  • Ensure that all options and functions work properly
  • Add a "Help" link on every subsection page that requires user input
  • Provide sufficient and up-to-date course descriptions
  • In search results, show all courses (already enrolled, not yet enrolled)
  • Make "order by name" a clearer option by specifying which name (course, group, teacher)

Introduction

Product Description

Name: WinhaWille - Student Page, version 6.4, release date: 28.08.2009.

Parts evaluated: Menu structure, title definitions, navigation logic.

User population of the system: students and academic staff of Finnish higher education institutions.

Variable usage context: depends on the user's location (university premises, home, cafeteria).

Test Objectives

The objectives were to research usability aspects of the WinhaWille student page and to identify problems and benefits.

WinhaWille is strictly an educational assistance tool; therefore the attractiveness and visual design of the application are not emphasised in the following tasks:

- to determine whether the structure of the menu is clear and the titles are understandable

- to determine whether the users are able to successfully complete the tasks

- to detect whether the information presented is clear and understandable

- to detect whether the data is reliable and accurate

- to identify the most relevant/important parts of the application, as well as the missing, unnecessary, and less useful ones

- to reveal general areas for improvement

Method

Participants

Total number of participants: 5

Key characteristics: full-time bachelor's degree Metropolia students.

Selection criteria: random sampling among classmates.

Differences between the participant sample and the user population: school, field of study, degree programme, study level, language of education, level of computer skills, study type (full-time, exchange, distance learning, adult education, etc.), and frequency of WinhaWille use.

Context of Product Use in the Test

Difference between evaluated context and the expected context of use:

  • aimed for individual use
  • no assistance given in performed task
  • acknowledgement of testing
  • pressure of personal evaluation (knowledge of ISP, computer skills, cultural factors, etc.)

The pre-questionnaire was intended to ensure that the user is part of the representative group and to evaluate familiarity with the system (participation in Orientation Days, number of terms completed). It was conducted by the interviewer.

The interview was performed using the real application.

  • Task 1: The user was asked to access the WinhaWille page; the starting point was no open browser window. This was meant to identify whether the user knows the link by heart, and if not, what the common way to access it is and whether it is easy to find.
  • Task 2: The user was then asked to describe the menu titles without clicking on them, in order to understand the comprehension level of the title definitions and functions, and the user's logic regarding the menu structure.
  • Task 3: The user was asked to access the following links: ISP, ISP contents, Enrol > Implementation, Enrol > Exam, and to give a detailed explanation of each field, title, option and data item. If the user did not know an option or had no previous experience using it, he was asked to guess what it is for and what the title means to him. Users were not corrected if the given answer was wrong. The choice of links was based on the most relevant features of the system and on the time limitation. The completion criterion was a sufficiently detailed response from the user (not in terms of correctness, but in clarity and detail).
  • Task 4: Task scenario - the user is at the beginning of a study term and needs to enrol in it; what will he do? The user should show and explain his actions. This was meant to show whether the user is able to find access to study term enrolment, whether he remembers how it works, and where it is located.
  • Task 5: Task scenario - the user needs to enrol in a study course. The user should show and explain his actions. This was meant to show whether the user is able to find access to course enrolment, whether he remembers how it works, and where it is located.
  • Task 6: Task scenario - the user is asked to check his study progress. Whether the users thought study progress meant checking the number of completed credits or the received grades was not relevant; both options sufficed.
  • Task 7: The user is asked to show his most common area of usage. The task was used to find out the most effective feature of the application and to get an idea of its most important part, which we could focus on even more than the others.
  • Task 8: The user is asked to show his least common area of usage.

Questionnaire 

The questionnaire consisted of 10 open questions. The questions were chosen in order to reveal the main usability aspects of the system, listed above.

An example of the questionnaire is in the Appendix.

Test Facility

The testing was carried out in a computer classroom on the Metropolia university premises, Leppävaara unit. No disturbances were observed by the testing team.

 Participant's Computing Environment

Computer configuration: We used university computers in our sessions. We took 3 relatively new computers, each with a dual-core processor, sufficient memory (2 GB), DirectX version 9.0 (the latest) installed, and a graphics card adequate for our purposes; one was used for testing the user and the others for writing comments and taking notes. All the computers had the same configuration and ran the same OS, Windows XP Professional. They were connected to the university network; we could access and use all the software installed on them, and an internet connection was also available.

Browser name and version: Different browsers were installed on these computers: Internet Explorer version 8.0.6001.18702, which is installed automatically with the OS, and Mozilla Firefox, which is the most common browser used at the university. The version of Mozilla Firefox was /5.0. The choice of browser depended on what each user was used to.

Plug-ins: All plug-ins were installed for both browsers: Adobe Reader, Adobe Flash Player, Java, QuickTime, Shockwave, and Windows Media Player.

Display Devices

The screen resolution was 1280 by 1024 pixels on each computer, and the highest colour quality level (32-bit) was used. A plug-and-play monitor was used, driven by a Radeon X1300PRO graphics card with 256 MB of memory.

We also used paper in each interview, for personal notes and the questionnaire, in the standard A4 format.

Test Administrator Tools

For this testing we used the tools at our own disposal. We did not make recordings; we used basic tools such as Microsoft Word and Notepad for taking notes. For the next interviews we could use a camera or voice recording: it would make note taking easier and would capture the users' attitudes, because sometimes the way they said something conveyed more than the notes alone could.

Procedure Details

Number of testers: There were usually 2-3 persons interacting with the tested user: an interviewer who asked the questions and took some notes, and the note takers (one or two persons seated at a nearby computer, who typed almost everything the user said, as well as some of the interviewer's remarks).

Procedure and policy: The test conductor (interviewer) read the questions aloud. Some additional explanations were sometimes required (depending on the individual); these are not included in the testing report because of the scope of the project and their estimated irrelevance to the user's performance. If the user had a problem with a particular task, he was reminded (comforted) that it was the application being tested, and that failing the task simply pointed us to possible problems with the application that should be improved.

Only one question in the interview included multiple-choice answer options: "How well are you familiar with ISP?"

a) Perfect

b) Very good

c) Good

d) Satisfactory

e) Poor

The intention was to understand the users' self-evaluated familiarity with the system.

The other questions in the interview were more performance-based: the user was asked to think aloud and explain his actions.

Accomplishment criteria: The task was considered completed when the user himself concluded that he had performed the task to the end (either thought he had obtained the desired result or realized he was not able to do it); in this way the interviewer team did not force the user to move on to the next question.

The questionnaire was used as another method of giving the test respondent the possibility of completely anonymous answers (even though in the interview it was specified that no names would be used, there was still a personal factor of recognition involved, whereas handwriting is thought to be more anonymous, so the answers could be more open and, thus, valid).

Environment: The tests were usually done while other people who were not part of the testing team were present in the same room, but this was not thought to be an obstacle or an influential factor that could affect the user's answers.

Payment: Participants in the evaluation were not paid in any way.

Results

In our result analysis we used the following types of analysis:

Quantitative data analysis: to calculate the shares of users' responses and of successful task accomplishment (a small calculation sketch follows below).

Qualitative data analysis: used during the experiment and the interview.
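
As an illustration of the quantitative part, a minimal sketch (in TypeScript) of how such response shares could be computed is given below; the data structure and the answer values are hypothetical placeholders, not the actual test records.

    // Hypothetical response records for the five participants (illustrative only).
    interface Response {
      user: string;
      easyToLearn: boolean;        // answer to "was it easy to learn the system?"
      completedEnrolment: boolean; // did the user complete the enrolment task?
    }

    const responses: Response[] = [
      { user: "U1", easyToLearn: false, completedEnrolment: true },
      { user: "U2", easyToLearn: false, completedEnrolment: true },
      { user: "U3", easyToLearn: false, completedEnrolment: false },
      { user: "U4", easyToLearn: false, completedEnrolment: true },
      { user: "U5", easyToLearn: false, completedEnrolment: false },
    ];

    // Share (in %) of participants for whom a given predicate holds.
    function share(predicate: (r: Response) => boolean): number {
      return (responses.filter(predicate).length / responses.length) * 100;
    }

    console.log(`Found it easy to learn: ${share((r) => r.easyToLearn)}%`);              // 0%
    console.log(`Completed the enrolment task: ${share((r) => r.completedEnrolment)}%`); // 60%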

Results' interpretation and conclusions:

We will not divide quantitative and qualitative data in our results, because they are closely related and splitting them would cause interpretation and comprehension difficulties. However, we will separate the results of the interview and the questionnaire:

Questionnaire:

The following conclusions and interpretations were made after analysing the user details and the questionnaire.

Orientation Day participation: 60% of the users had participated in Orientation Days, where information about the system was given; furthermore, one of the users used to be a tutor, presenting the system and giving tutorials on it. Length of completed studies: all of the users had been studying at Metropolia for over 2 years. Familiarity with the system before Metropolia: one of the users had experience with the system before this school.

However, self-evaluation of system familiarity was very low: 40% of the users evaluated their familiarity with the system as good, the same share as satisfactory, and 20% as poor. One of the reasons might be that the system is not used on a daily basis. Frequency of usage: 60% of the users said that they use it only at the beginning and end of the term, 20% once a month, and 20% almost never. But we assume that the main factor is poor system design and an unusable interface.

This supposition is supported by the following: 100% of the users admitted that it was not easy to learn how to use the system the first time, and 100% of the users also agreed that it is not easy to use. All of the participants said that the menu titles and definitions are not clear. Menu structure and navigation are poorly designed in the opinion of 80% of the users.

The most common problems and irritating issues with WinhaWille were:

  • Bad titles and definitions
  • Bad course descriptions
  • Search does not give all results
  • No information on when the courses you have to take become available
  • No possibility to enrol in courses which you took previously and failed
  • Empty bulletin board
  • English mistakes
  • Help is not sufficient and has no English version

Interview:

WinhaWille access: It was evident that the users had no problem accessing WinhaWille, either by finding the link from the Tuubi portal or from the Metropolia homepage (which was also the default webpage for the browsers used in the testing). All the users were quite familiar with the task of logging in, with virtually no problems arising, and the timing was on average less than half a minute. Nevertheless, we found that none of the users knew the direct link to access WinhaWille.

Login: When logging in, the only bothersome point for half of the users was that the application did not accept the login form, with all fields completed correctly, by pressing Enter, but only with a left mouse click on the "login" button.

As the interview took its course, the users found themselves logged out of the application due to its "15-minute time-out feature": WinhaWille closes the connection automatically after the browser has been inactive for this long.

It was noted by the users that this would not be a problem in itself (since it is part of securing their own information), but because none of the users were able to log in again directly from this page (they went back to their usual route of accessing WinhaWille, through other websites), the time-out throw-out feature became an irritating issue. The testers noticed that none of the users were willing to read the long text on the page (in which case they would have found the instruction to "press on the owl picture to go back to Wille"); thus it was concluded that the page should be modified and its text made clear and short, so that logging back into the application after the time-out would be more intuitive. The testers also note that the developers of the application created the Wille character, a "wise owl" graphic that is supposed to be associated with the academic nature of WinhaWille; nevertheless, the character identity has not been developed enough for users to associate it with the homepage of this application. Though the focus of this usability testing was not the graphical interface, we may still note that with better graphics the user's experience and navigation logic could be improved.

Student data and Change address: In the first part of the interview, the users were asked to explain what the main menu items meant. "Student data" and "Change address" were quite clear to all the users. It was suggested, though, to merge these two items into one for easier usage.

"ISP": menu title was not absolutely clear to 80 % of the users. First of all, many have had difficulties in stating what the abbreviation "ISP" meant. In the experiment it was also found out that the application itself has some inconsistencies regarding linguistics (for example, ISP is translated as Personal Study Plan instead of Individual Study Plan in some places). When the user was asked to enroll to the course or talk about general experience of using WinhaWille, it was stated by some that working with ISP and ISP contents is usually a method of "trial and error", where the user did not remember how he did a certain task in the past. Thus, it can be concluded that using this application is about learning it again and again, even though the features stay the same, we hope to improve it with our mock-up.

"Enrol > Implementation" submenu was thought to be quite understandable by 75 % of the users, with the majority stating rightly that it was "an administration page for enrolling to courses" and the minority remembering how they were not able to guess what "that fancy word meant" when first used the application. When the users clicked on the "Implementation", not all were able to state what each of the fields meant.

The usual route of enrolling in a course was the following: the user received a file about the available courses from a department coordinator (usually an Excel file posted as an announcement on the Tuubi portal at the beginning of each period), then found out the course code and enrolled in each course separately, one by one. Enrolling in courses by typing in the study group code was also popular with the users; they were all confident that this operation would list all the courses (compulsory and optional) available to the group. Even though in "Enrol > Implementation" the "Pgrm responsible" field was correctly guessed to mean the teacher responsible, it was still just an unconfident guess; in fact, this feature had never been used before by 80% of the users.

"Study code" was mistaken by 20 % of the users to mean the code of the study group instead of the course code, thus these users' search remained fruitless when they tried to find out which courses were available for their group. "Study type", "Study method" and "Including virtual education" were thought to be "surprising features" of the application, which the users have not been using before. Whether it is a limitation of the application or the impracticality of using these when studying in Metropolia is arguable (we have to remember the application is developed for availability to all higher education institutions in Finland). "Study type" has 4 options, stating the abbreviation (which comes from Finnish language and not understandable to English speakers) and then description in English, these were thought to be misleading and very unclear.

In the course of testing, whenever the users saw an option for inserting a date (in the ISP and enrolment menu sections), none of them was clear on what date was in question. There were different opinions and mainly guesses: whether it was the date "when the person responsible for the course makes it available in the application" or the date "when the course is about to start"; therefore this section should be made clearer. Also, 20% of the users noted that it was not clear to them what format should be used when inserting a date, whether it is DDMMYY or something else entirely. It has to be noted that WinhaWille specifies the expected date format only on "Enrol > Implementation", even though it should be specified everywhere.

"ISP Contents": it was noticed that to 80% of the users it was not clear what type of information was given in the brackets. The title of the list is misleading itself - "Study type / code / name (extent)" - it doesn't provide any instructions on the numbers and letters in the brackets, their explanations. For the example, the dash in the end of the brackets was a confusing point to the majority, and when there was a number in its place, it was mistaken for the number of credits the student got, instead of the grade for the course (and dash consequently meaning that the course had not been graded). In the case of the user having such information in the brackets "5 cr/ T / 3", it was guessed wrong that the last number meant that the student got only 3 credits instead of 5 he was supposed to get. Editing the ISP Contents in a more understandable way is crucial for a user to monitor his study progress.

It was also found that ISP, Implementation and ISP Contents are precisely the most useful features for all the users, so improving these should be a priority. The "More info" menu section, on the other hand, was the least used by the users: therefore "Bulletin board" can be removed altogether, but as a courtesy to the developers' credits, we believe the "Application" link should still stay there.

"Exam":item was not used by any of the users before. In fact, this item was thought to be unclear, and thus not used. It is important to state here that it has been observed as a general procedure in MetropoliaUniversity, Leppävaara campus, that teachers do not require from a student a registration to the exam, at all. The only registration is done for the re-sit exams, and follows a bureaucratic system, involving paper forms with signatures and the porter-information desk. It is negotiated whether the "Exam" feature should be eliminated from the application in order to avoid clutter.

When asked to perform the task of enrolling in a new study year, none of the users remembered exactly how to do it; again they said they had to click around and search for it every time. In the course of the experiment it was found that the enrolment option automatically disappears when the student has already registered and the study term has already begun. Nevertheless, none of the users suspected that until they had performed a rather fruitless search; only then did 50% of the users conclude that such an option is not available "at this time of the year", while the other half felt they had failed the task, that they "just can't find it now". In developing the application's mock-up, attention should be paid to making the "enrol to a study term" feature visible at all times, shown only as an "unavailable" button when the student has already enrolled.

The users also stated separately that more responsible filling in of the data on the teachers' and other staff's side would somewhat improve WinhaWille; then they would use features such as "Pgrm responsible" and "Exam". Typing in data and not getting sufficient search results because of such negligence results in negative feedback on the application itself.

Improvement suggestions:

Login:

  • Make login possible by pressing Enter, not only by clicking the mouse (see the sketch after this list).
  • Modify the time-out throw-out page (the irritating page that appears after 15 minutes of inactivity) to make logging back in easier; provide a link on it to go back to the login page.
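
A minimal sketch of the Enter-key suggestion is shown below (TypeScript for a browser page). It assumes the login page exposes username and password fields and a login button; the element IDs used here are hypothetical, not WinhaWille's actual markup.

    // Sketch only: wire the Enter key to the existing "login" button.
    // "username", "password" and "login-button" are assumed IDs, not WinhaWille's real ones.
    const loginButton = document.getElementById("login-button") as HTMLButtonElement | null;

    document
      .querySelectorAll<HTMLInputElement>("#username, #password")
      .forEach((field) => {
        field.addEventListener("keydown", (event: KeyboardEvent) => {
          if (event.key === "Enter") {
            event.preventDefault(); // suppress any default page behaviour
            loginButton?.click();   // behave exactly as a mouse click on "login"
          }
        });
      });

An even simpler fix would be to place the fields and the button inside a single HTML form, since browsers then submit the form on Enter by default.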

Menu structure, title definitions:

  • Unite "Student data" and "Change address" into "Student data" with a button to "Modify" it.
  • Rename following menu items:
  •  ISP -> Individual Study Plan (ISP)
  • Enrol >Implemetation  -> Course enrollment
  • Application -> Application info
  • To the main page ->  Main or Home
  • Close -> Logout
  • Eliminate some features from the main menu:
  • ISP contents
  • Exam (no one is using it)
  • Bulletin board (more popular on Tube portal) - make the application less cluttered.* To add:
  •  "Enroll to a study year" - make itavailable at all times, it should appear as non-functional button when the user has already registered for a study term or when the period of enrollment is over. It is suggested to put it on the main page.

ISP:

  • Delete the Study Code, Date, and Projects options
  • Planned studies - show study modules with all planned courses in them, with information on credits and earned grades.

Implementation:

  • Study Code -> Course Code
  • Group -> Group Code
  • Study type - provide a sufficient list of study types
  • Study method - provide a sufficient list of study methods
  • Delete: Date
  • Lecturing language - add English
  • In the results, show all courses even if the user is already enrolled in them, with a mark: "Already enrolled".

General:

  • Ensure that all options and functions work properly
  • Add  a "Help" link on every subsection page that requires user's input, clicking on it should open a pop-up window with the instructions text for the specific subsection (for example, in ISP it should be about ISP first and how to work with it).
  • Provide sufficient and up-to-date course descriptions
  • Make "order by name" a clearer option, to specify which name. (course, group, teacher).

Language:

  • When the language is changed, it should not throw the user out of the application completely and demand a separate login.
  • Fix language inconsistencies (if English is chosen, everything should be in English); this also concerns the drop-down options in "Study method", "Study type", etc.

Further developments - our suggestions:

  • Help users remember the direct link to WinhaWille

Reliability and Validity

Reliability: We had three testers involved in the testing, which increases the reliability of the test. The total number of tested users is not very high (5), but the homogeneity of the group and our own experience with the system give us reason to believe that the test results are highly reliable.

Validity: We do not have a usability expert on our team, therefore there might be validity issues. However, we are persuaded that the test was highly valid. We tried to create our questions according to the usability objectives we stated, we used the pre-questionnaire to select representative users, and all tasks were analysed in depth in order to evaluate and measure the usability aspects listed at the beginning of the experiment. The selected tasks broadly represent the most common goals for using the system.

Appendix:

User Code:_____

WinhaWille Usability Test

User details:

Program Degree:  ___________________________

How many academic terms have you been present at school? _______________

Have you been to Orientation Days? ___________
How well are you familiar with the Study Plan:
¨    Perfect
¨    Very Good
¨    Satisfactory
¨    Poor
 
User code:_______
Questionnaire:
1.     How often do you use WinhaWille?_________________________________________________
2.     In the beginning, was it easy to learn how to use the system?______________________________________________________________________________
3.     Did anyone teach you how to use the system?____________If yes, who ______________
4.     Do you find it easy to use?______________________________________________________________________________
If no, why _____________________________________________________________________
5.     What kind of problems did you previously have while using WinhaWille?______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
6.     Is the layout of the system clear and understandable? If not, specify:
Menu structure and navigation ______________________________________________________________________________
______________________________________________________________________________
Menu items and definitions: ______________________________________________________________________________
______________________________________________________________________________
7.     Did you have problems with data accuracy (wrong credits published, wrong registration, insufficient course description)? ________________________________________________________________
________________________________________________________________________________
8.     Would you recommend any improvements regarding the problems you have mentioned, or others?________________________________________________________________________________
___________________________________________________________________________________
9.     What kind of irritating things do you find in WinhaWille?________________________________________________________________________________
___________________________________________________________________________________
10. Have you ever used the Help page in WinhaWille? Any comments: ___________________________________________________________________________________
 
