User Testing in Streamlining Global Scheduling

At the heart of Hivekind’s approach to developing ZoneJam was a strong emphasis on User Experience (UX). Our goal was to create an interface that was not only visually appealing but also genuinely user-friendly, an emphasis that shaped every stage of ZoneJam’s design and development.

To facilitate a rapid feedback loop during all user testing phases, Hivekind adopted a methodology of concurrent design and development sprints. This approach allowed for immediate insights and adjustments, ensuring that the tool was continually refined and improved upon throughout the development process.

Understanding Different Types of User Testing

User testing is a multifaceted discipline, offering a range of methods to evaluate and enhance the user experience of a product or service. Each method has its unique strengths, tailored to different stages of product development or specific research objectives. Here’s a brief overview of some common user testing methods:

  1. Focus Group Testing: A qualitative research method where a group of participants discuss the product or service. Facilitated by a moderator, it’s an excellent way to gather diverse opinions and perceptions in an interactive setting.
  2. Field Studies: Observations of users in their natural environment using the product. It provides real-world insights but can be more time-consuming and less controlled.
  3. A/B Testing: Users are presented with two or more versions of a product to determine which one performs better in terms of user engagement, conversion rates, or other metrics.
  4. Surveys and Questionnaires: Users are asked to provide feedback on their experience using structured forms. It’s a quick way to gather data from a large audience but might lack depth.
  5. Remote Usability Testing: Conducted remotely, allowing participants from diverse locations to take part. It can be moderated (real-time interaction with a facilitator) or unmoderated (participants complete tasks at their own pace).
  6. Hallway Testing: An informal method where random individuals (like those passing in a hallway) are asked to use the product. It’s quick and can provide rapid feedback, especially useful in the early stages of design.

Given the diverse range of user testing methods available, it’s crucial to select the ones that align with the product’s stage of development and the specific insights sought. For Hivekind’s development of ZoneJam, we gravitated towards Remote Usability Testing and Hallway Testing.

Our choice of Remote Usability Testing was driven by the need to gather feedback from a geographically diverse user base, reflecting the global nature of our target audience. This method allowed us to capture rich insights from users in different regions, ensuring ZoneJam’s design resonated with a wide range of cultural contexts and usage habits.

We also chose Hallway Testing for its agility. Given the rapid development cycles of ZoneJam, we needed a method that could provide swift feedback on mobile intuitiveness and other design elements. By observing the spontaneous interactions of users unfamiliar with the product, we could quickly identify and address usability challenges, ensuring an intuitive and user-friendly experience.

Competitive Analysis

During the exploration phase, Hivekind analysed competing products in the market. While some solutions existed, they were generally clunky and lacked user-friendly interfaces. We understood that to make ZoneJam stand out, it needed to prioritise the user experience above all else. This emphasis on UX became the cornerstone of ZoneJam’s design and development.

Design Thinking and Competitive Analysis Workshop

Moderated Remote Usability Testing with the Internal Team

Usability testing is a critical method for evaluating a product’s effectiveness, efficiency, and satisfaction from the perspective of the end-users. It involves observing participants as they interact with the product or system, often while they carry out a series of tasks or scenarios that a typical user might encounter. These tasks are usually designed to evaluate how easily users can complete primary tasks, how intuitively they can navigate the system, and how satisfied they are with the overall experience. During the test, facilitators observe and record users’ actions, comments, and areas of confusion or frustration. Insights from usability testing can directly influence design improvements, resulting in a more user-friendly and efficient product or service.

As ZoneJam was born out of Hivekind’s own experiences, we started with moderated usability testing with our internal team. This provided immediate, valuable insights and allowed the design team to observe the tool’s usability in a real-world context.

We started by validating the problem statement: does the team really find it hard to schedule meetings with their stakeholders, or were we simply inventing the problem ourselves? Once we had validated the problem statement, we identified the pain points throughout the user journey and the core journey that needed to be improved.

Ideation and Exploration

We then planned the test scenarios around the outcomes we wanted from the user test:

  • How users get onboarded without any assistance
  • Where they get stuck or confused
  • How they feel after finishing all the tasks
  • Any other feedback from users
High Level User Journey of ZoneJam

We crafted questions around the onboarding experience and the core functionalities of the application, and observed how users reacted as naturally as possible, without any assistance. If we identified that a user was lost, we provided guidance and noted the respective point in the journey for improvement.

To illustrate, here are some examples of the scenarios and features we tested:

Scenario 1: You are a Product Manager living in London, and you want to schedule a call with a potential client who lives in Kuala Lumpur, Malaysia. Another team member, a developer based in Texas, United States, might want to join the call too. The task is to set up a call with the potential client and, if time permits, to include your developer.

Feature: Adding users with their respective time zones and viewing the resulting table display.

  1. Are they able to successfully add the team members and their respective time zones/locations?
  2. Does adding a name help the user to “profile” the guests?
  3. Do they search by location or by time zone?
  4. Do they search or use the dropdown option?
  5. At which point do they stop adding individual users and switch to Add multiple?
  6. When they realise that one guest’s timing is simply off, do they remove that guest from the calendar? If yes, do they press the X icon? If not, do they leave it as it is or replace the guest with someone else?
Usability Test Outcome
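
To make the scheduling problem behind Scenario 1 concrete, the sketch below shows one way the overlap could be computed. This is an illustrative example only, not ZoneJam’s implementation; the IANA time zone IDs, the 09:00-18:00 working window, and the participant names are assumptions made for illustration.

```typescript
// Illustrative sketch only: not ZoneJam's implementation.
// Assumptions: each participant works 09:00-18:00 local time, and the
// scenario's locations map to the IANA zones used below.

type Participant = { name: string; zone: string; required: boolean };

const participants: Participant[] = [
  { name: "Product Manager", zone: "Europe/London", required: true },
  { name: "Client", zone: "Asia/Kuala_Lumpur", required: true },
  { name: "Developer", zone: "America/Chicago", required: false }, // Texas, US
];

// Local hour of the day (0-23) in `zone` at a given UTC instant.
function localHour(utc: Date, zone: string): number {
  return Number(
    new Intl.DateTimeFormat("en-GB", {
      timeZone: zone,
      hour: "numeric",
      hourCycle: "h23",
    }).format(utc),
  );
}

// True if the instant falls inside the assumed 09:00-18:00 working window.
function isWorkingHour(utc: Date, zone: string): boolean {
  const hour = localHour(utc, zone);
  return hour >= 9 && hour < 18;
}

// Scan one day in hourly steps: keep slots where every required participant
// is working, and note which optional participants could also join.
function candidateSlots(year: number, month: number, day: number) {
  const required = participants.filter((p) => p.required);
  const optional = participants.filter((p) => !p.required);
  const slots: { utc: string; optionalJoiners: string[] }[] = [];

  for (let h = 0; h < 24; h++) {
    const t = new Date(Date.UTC(year, month - 1, day, h));
    if (required.every((p) => isWorkingHour(t, p.zone))) {
      slots.push({
        utc: t.toISOString(),
        optionalJoiners: optional
          .filter((p) => isWorkingHour(t, p.zone))
          .map((p) => p.name),
      });
    }
  }
  return slots;
}

// Example: a mid-January day; expect a narrow London/Kuala Lumpur overlap.
console.log(candidateSlots(2024, 1, 15));
```

With these assumed working hours there is only a narrow morning overlap between London and Kuala Lumpur, and the Texas-based developer typically falls outside it, which mirrors the “if time permits” trade-off the scenario asks testers to navigate.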

Scenario 2: After you’ve sent out the invite to everyone, the client messages to tell you that he is now in Jakarta, Indonesia, which is one hour behind Kuala Lumpur. How do you incorporate this new information?

Feature: Making changes to the existing list of team members and their selected locations/time zones.

  1. Do they click on the individual profile, add a new user with the new time zone, or remove the existing one and add a new one?
  2. If they click on the profile and realise they can add office hour slots, do they use the feature at least for their own calendar?
  3. If the suggested time doesn’t work, how do they find the time slots in the timeline view?
  4. Do they scroll up and down?
  5. Do they zoom in and out?
  6. Do they manually click the arrows on the calendar to change the date?
  7. Do they manually select a different date on the calendar picker?

Moderated Remote User Testing with External Users

Remote user testing offers the advantage of gathering feedback from users who aren’t co-located with the evaluators. This method can be performed in two ways: moderated and unmoderated. In moderated remote testing, a test facilitator interacts with a participant in real-time over a video or audio call, guiding them through the tasks and asking questions to gather insights. In unmoderated remote testing, participants complete the tasks at their own pace without direct interaction with a facilitator. They use a platform that records their screens and voices as they interact with the product. Remote user testing is especially useful for getting feedback from geographically diverse users or those who can’t be physically present due to other constraints.

To further refine and validate our design, we expanded to moderated remote user testing with external users, skipping unmoderated testing because we wanted to extend the conversation with open-ended feedback. We conducted interviews with global testers from Germany, India, and the United States. This diverse group of testers provided a broad range of perspectives, reflecting different cultural contexts and usage habits. Their feedback was instrumental in refining ZoneJam’s design and functionality, ensuring that our tool was truly user-centric and globally applicable.

We used Hotjar, a powerful user behaviour analytics tool, to narrow down our sample of testers. During our design thinking process, we had defined a User Persona that represented our target user. With Hotjar, we were able to select testers that fit this persona, ensuring that our testing was focused and relevant.

Identifying Our User Persona

This approach allowed us to gather more organic feedback. We were able to ask detailed questions and delve deeper into the users’ experiences. We learned about their pain points, their needs, and their expectations. This invaluable feedback helped us understand how real users interacted with ZoneJam, what worked for them, and what didn’t.

Mobile Intuitiveness via Hallway Testing

Hallway testing is a quick, informal usability testing method where individuals who happen to pass by (i.e., picked up from the ‘hallway’) are asked to test a product or system. These participants are typically not familiar with the product, mimicking the perspective of new users. They are given tasks to complete while the facilitator observes their interactions with the product. This form of testing can quickly reveal usability issues that might be overlooked by designers or developers who are too close to the project. It’s especially useful in the early stages of design, providing rapid feedback and enabling swift iterations.

Understanding the increasing reliance on smartphones in modern work culture, a particular focus was given to mobile intuitiveness. We aimed to ensure that ZoneJam was not only functional but also intuitive and user-friendly on mobile devices.

To capture natural user interactions, we conducted live, in-person interviews. These interviews allowed us to observe users interacting with ZoneJam on mobile devices, providing valuable insights into the user journey. We were able to observe how users navigated the app, how they interacted with the interface, and where they encountered difficulties or confusion.

In addition to testing ZoneJam’s prototype, we also conducted a comparative analysis of other mobile apps and websites with UX patterns similar to those we wanted to incorporate into ZoneJam’s features. This helped us understand industry best practices and how we could adapt them to improve ZoneJam.

The objective of these tests was not just to validate features, but also to refine the granular details of the UI/UX. We looked at aspects such as font sizes and spacing, scrolling behaviour, tapping and dragging actions, and colour contrast. These elements, while seemingly minor, can significantly impact the overall user experience.

Hallway Test Outcome

Feedback from these extensive testing rounds was then incorporated into subsequent design iterations. This ensured that ZoneJam remained aligned with user needs and expectations, and that the tool was continually improved upon based on real-world user experiences. This user-centric approach to design and development underscores Hivekind’s commitment to creating products that truly meet the needs of its users.
