After spending two months defining and developing our first MVP, we wanted to validate that the platform's design made sense and that the basic flow could be completed.
To host usability testing sessions and gather as much feedback as possible, so that we could fix any alarming issues and make improvements before our first paid-event launch.
Testing Facilitator, Outreach, and Survey Designer
Before the usability testing, we needed to make sure that our testers were our target audience. This would help us filter out a lot of noise and focus on users who would be able to provide valuable feedback to make our product better. To screen for the users we wanted, I kept these objectives in mind, which helped me structure the questions that were asked:
> How valuable do they find networking?
> Do they do any prep work before or after the conference? If so, what is it that they do?
> What is their definition of a good conference?
> How do they define a good networking experience?
Survey link: https://forms.gle/VooATpmNRGXpkFnKA
In my opinion, the survey would have been very helpful if we were hosting formal usability interviews, but because this was a voluntary exercise, the completion rate was low. To compensate, I asked these questions during my initial outreach conversations instead, to learn more about our potential testers.
This is the general outline of the tasks our team agreed to test. We wanted to see if the basic flow made sense and whether testers could navigate through the system without any guidance.
Prototype 1 - Desktop
1. Invitation email
* Understanding? Thoughts? Next steps
* Too much text?
* Does profile strength help?
* Fields make sense?
2. Onboarding + Discover
* Take a look around, explain the UI. What would you do first?
* Can they see Top Matches and Search?
3. Send meeting request
Prototype 2 - Mobile
1. Meeting request email
2. Accept meeting
3. Cancel meeting
4. Decline meeting (if possible)
5. Chat overview screen
6. Review pending meeting
7. Cancel meeting request
8. Edit availability
The most challenging part of user testing for me is, and has always been, getting actual potential C-end users to test our product without any incentives. So I would like to take this opportunity to say thank you to those who participated; it was very much appreciated! For this round of user testing, other than manually inviting guests the old-fashioned way, we also participated in Testla HK, "a UX testing community based in Hong Kong. It encourages UX Designers, Product Managers, Entrepreneurs and curious individuals to test their products and to hear feedback from each other." Participating in open events like Testla gives us more product exposure and lets us gauge people's initial response to and engagement with the product. The downside, however, is that the testers we get matched with may not be our potential users. Their feedback therefore differs, and we need to be aware of and eliminate the noise that might otherwise influence our decision making.
Top common issues for improvement:
1. They don’t speak our language - need to restructure the text and open-field questions for easier understanding
2. Educate users on how they are being rated to show that our AI works; based on that, they would be able to improve their profiles
3. More prompts/guidance/affordances for the meter rating and questions
4. Remove the company logo and even the phone number - not useful
5. Communicate profile info visibility - if users know that it's public, they will put more thought into it