Fdroidserver UX Testing Report

We ran user tests of fdroidserver, the set of tools that developers use to create and manage F-Droid repositories of apps and media. The test was set up to gather usability feedback about the tools themselves and the related documentation. The tests were designed and run by Seamus Tuohy / Prudent Innovation. Participants completed a pre-test demographic/background information questionnaire.

The facilitator then explained that the time taken to complete each test task would be measured, and that exploratory behavior within the app should take place only after the tasks were completed. The participant was then provided a laptop with a web browser window open to the F-Droid documentation. The facilitator let the participant know that if they felt they needed anything external, they could ask the facilitator for it.


At the start of each task, the participant was read the task and shown the resources available for them to use on the USB stick provided by the facilitator. The participant was then instructed to read the task description from the printed copy and begin the task. Time-on-task measurement began when the participant started the task. The facilitator instructed the participant to ‘think aloud’ so that the facilitator could capture their otherwise internal thoughts about interactions with the F-Droid server software.

The facilitator observed and entered participant behavior, participant feedback, and system actions into their record of the session. After each task, the participant discussed the task with the facilitator. After all task scenarios were attempted, the participant completed a post-test satisfaction questionnaire. Each participant was asked to complete six specific tasks. At the beginning of each task, the participant was read the task and shown the resources available for them to use.

The participant was then instructed to read the task description from the printed copy and start the task. The six tasks were:

  1. Set up an F-Droid software repository with the applications on this USB drive.
  2. Connect to that repository using the F-Droid client.
  3. Find the applications under your repository's name in the F-Droid interface.
  4. Download the barcode scanner app using the F-Droid client.
  5. Update your F-Droid repository with an update to the barcode scanner app.
  6. Download the update using the phone.

Tasks are marked as “complete” after the participant says they have completed the task or after the test facilitator has sufficient evidence that the task has been completed. Tasks are marked as “uncompleted” after the participant says they cannot complete the task and requests assistance.
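Tasks 1 and 5 exercise the server-side workflow that fdroidserver's fdroid command-line tool supports. The sketch below, written as a small Python driver purely for illustration, shows roughly what that workflow involves; the repository path, USB mount point, and APK filename are hypothetical, and exact flags may vary between fdroidserver versions.

    #!/usr/bin/env python3
    # Rough sketch of the repository workflow behind tasks 1 and 5.
    # Assumes the `fdroid` command from fdroidserver is on PATH; the
    # paths and APK names below are hypothetical examples.
    import shutil
    import subprocess
    from pathlib import Path

    repo_root = Path("~/fdroid-repo").expanduser()   # hypothetical repository location
    repo_root.mkdir(parents=True, exist_ok=True)

    # Task 1: initialise the repository (creates a config and a signing keystore),
    # copy the APKs from the USB stick into repo/, then generate the signed index.
    subprocess.run(["fdroid", "init"], cwd=repo_root, check=True)
    (repo_root / "repo").mkdir(exist_ok=True)
    for apk in Path("/media/usb").glob("*.apk"):     # hypothetical USB mount point
        shutil.copy(apk, repo_root / "repo")
    subprocess.run(["fdroid", "update", "--create-metadata"], cwd=repo_root, check=True)

    # Task 5: add a newer APK of the barcode scanner and regenerate the index
    # so connected clients can see the update.
    shutil.copy("/media/usb/barcodescanner-v2.apk", repo_root / "repo")  # hypothetical file
    subprocess.run(["fdroid", "update"], cwd=repo_root, check=True)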

Test time will begin after the participant says they have understood the task and will begin. Test time will end after the participant says they have completed the task or after the test facilitator has sufficient evidence that the task has been completed. In the case of a critical error, test time will stop after the participant says they cannot complete the task and asks for assistance. Usability metrics identify participant performance when completing the assigned tasks. This includes completion success rates, error rates, time to task completion, and subjective evaluations/interviews. Critical Errors: Critical errors are recorded as errors that result in failure to complete the task.

Participants may or may not be aware that the task goal is wrong or incomplete. Independent completion of the task is the goal; assistance from the test facilitator or others is to be marked as a critical error. Non-critical Errors: Non-critical errors are errors that the participant recovers from on their own and that do not prevent the participant from completing the task.

They can include errors such as excessive steps taken to complete a task or initially using the wrong function but recovering from that incorrect step. Exploratory Behavior: Errors that are off task from the main task being attempted will be designated as exploratory behavior. It should be noted that many of the errors encountered in tasks two, four, and six were the result of configuration decisions and non-critical errors in tasks one and five.

Task 1: Set up an F-Droid application repository with the applications on this USB drive.

The participants were provided a remote server that was pre-configured to host F-Droid applications. The research team did this because setting up a web server to host an F-Droid repository seemed out of scope and would have increased the length of the UX session unacceptably. The participant was confused about where in the documentation to begin deploying the server.
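For context, once a repository index has been generated, the contents of the repo/ directory only need to be reachable over HTTP(S); the pre-configured remote server stood in for that hosting step. The sketch below shows one way the publishing step could look, using rsync over SSH from the same hypothetical repository directory as above; the host name, user, and web root are invented, and fdroidserver can also automate this push when a server webroot is configured.

    #!/usr/bin/env python3
    # Sketch of publishing a generated repository to a web host.
    # The host, user, and paths are hypothetical; participants did not need
    # this step because the remote server was pre-configured for them.
    import subprocess
    from pathlib import Path

    repo_root = Path("~/fdroid-repo").expanduser()   # same hypothetical repository as above

    # Mirror the signed index and APKs into the web server's document root so
    # clients can add something like https://repo.example.org/fdroid/repo as a repository.
    subprocess.run(
        [
            "rsync", "--archive", "--delete",
            str(repo_root / "repo") + "/",
            "deploy@repo.example.org:/var/www/fdroid/repo/",  # hypothetical target
        ],
        check=True,
    )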