Project ONTOS
Improvements Post MVP
2-PART CASE STUDY
NWEA Research Application
Suitability Research, Usability Testing, & Platform Improvements
Design Brief
PART 1
Suitability Research for SaaS Potential
We conducted interviews with NWEA researchers who had not yet seen or worked with the Ontos product to gauge their interest in the Ontos Research Application and to see whether we could position the tool as a SaaS product. We also built out a business model to explore whether the tool could gain traction externally.
PART 2
Product Improvements: Usability Testing New Features
After interviewing NWEA researchers, we identified opportunities to improve the product, including new features and design refinements. Using feedback from our Suitability Research, I created a prototype and tested our ideas for the new features and improvements.
PART 1: SUITABILITY RESEARCH
Business Model Strategy Work
My partner and I collaborated to uncover how we could create a business model for the NWEA Research Application. For a deeper look into our thought process, check out our Miro board.
PART 1: SUITABILITY RESEARCH
Interviews
We conducted interviews with NWEA employees outside of the Ontos team who had NOT been using the tool. Our goal was to learn more about educational researchers and their needs, and to assess the market potential of our lab application as a SaaS product outside the company.
We showed participants an informational video I created to orient them to the tool and how it's used. This helped anchor the conversation and gave our interviewees context for some of our questions about the product.
I designed everything you see in the video: the research administrative application and the student assessments.
PART 1: SUITABILITY RESEARCH
Findings & Synthesis
The 12 interviews were fruitful, providing valuable input on feature additions and improvements to the product. The majority of participants praised the product, which was very rewarding to hear as the sole designer of the MVP.
Our research application was the first at NWEA to collect simple yet extremely valuable data points on children's performance on quizzes. The data included clicks (on actionable and non-actionable places on the page), time on page, recorded answers, and activity throughout the test (for example, going back to change an answer). Many interviewees asked when their teams could start using our product to conduct testing, since they considered it superior to any other tools they had access to at the company.
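For illustration, here is a minimal sketch of how that kind of interaction data could be modeled as a typed event log. The type names and fields below are hypothetical and are not the actual Ontos data schema.

```typescript
// Hypothetical sketch only; these names and fields are illustrative,
// not the actual Ontos data schema.

// Each recorded behavior is one event in an ordered activity log.
type InteractionEvent =
  | { kind: "click"; pageId: string; target: string; actionable: boolean; at: number }
  | { kind: "page_time"; pageId: string; millisecondsOnPage: number }
  | { kind: "answer_recorded"; itemId: string; answer: string; at: number }
  | { kind: "answer_changed"; itemId: string; previousAnswer: string; newAnswer: string; at: number };

interface StudentSessionLog {
  sessionId: string;
  assessmentId: string;
  events: InteractionEvent[]; // activity throughout the test, in order
}

// Example log: a click, a recorded answer, a later change to that answer,
// and total time spent on the page.
const exampleLog: StudentSessionLog = {
  sessionId: "session-001",
  assessmentId: "math-quiz-3",
  events: [
    { kind: "click", pageId: "p1", target: "choice-b", actionable: true, at: 1200 },
    { kind: "answer_recorded", itemId: "q1", answer: "B", at: 1500 },
    { kind: "answer_changed", itemId: "q1", previousAnswer: "B", newAnswer: "C", at: 9800 },
    { kind: "page_time", pageId: "p1", millisecondsOnPage: 10400 },
  ],
};
```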
Our team had dreams of including qualitative tools such as facial emotion reading and voice tonal analysis. However, as an outside party, our tool was restricted by the extremely sensitive PII (Personal Identifiable Information) rules around testing with children participants.
Below are the findings from our interviews, covering product improvements for non-Ontos NWEA researchers and potential SaaS expansion to other research companies:
Improving the app
Enhancing cognitive interview features for more qualitative data
Participants wanted richer qualitative signals from cognitive interviews. As noted above, the sensitive PII rules around testing with child participants restricted tools like facial emotion reading and voice tonal analysis.
Improve site accessibility
The product was WCAG compliant, but it would benefit from more attention to accessibility, especially when external tools are used alongside the computer.
More creative question types; make it easier to build assessments
The app ingested assessments as .JSON files created by the Ontos Research team. The process was very technical, and items could not be built within the platform; a hypothetical sketch of such a file appears after this list.
Student Profiles
Researchers wanted longevity in tracking student performance. At the time, our company could not do this because of the PII privacy rules for child participants. However, the tool had the capability, and I had always envisioned a 'Student Profile' section to track progress and compare student understanding and performance across subjects.
Consider Verbiage - Jargon
Most of the terms in the app were chosen on the recommendation of our client team of researchers, but they could still confuse people using the tool for simpler surveys or researchers without technical training.
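Returning to the .JSON upload finding above: as a rough illustration of why hand-authoring these files felt technical to researchers, here is a hypothetical, much-simplified sketch of what such an assessment file could contain. The field names are assumptions for this example, not the real Ontos format.

```typescript
// Hypothetical, simplified sketch of an uploaded assessment definition;
// field names are assumptions for illustration, not the real Ontos format.
interface AssessmentItem {
  id: string;
  type: "multiple_choice" | "open_response"; // illustrative question types
  prompt: string;
  choices?: string[];     // only used for multiple_choice items
  correctChoice?: number; // index into choices
}

interface AssessmentFile {
  assessmentId: string;
  title: string;
  items: AssessmentItem[];
}

// A file like this had to be authored outside the platform, which is why
// "build assessments in the app" surfaced as a request in the interviews.
const exampleAssessment: AssessmentFile = {
  assessmentId: "demo-quiz-1",
  title: "Grade 3 Math Check-in",
  items: [
    {
      id: "q1",
      type: "multiple_choice",
      prompt: "What is 7 + 5?",
      choices: ["10", "11", "12", "13"],
      correctChoice: 2,
    },
  ],
};
```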
Part 2: Improving the Product
Usability Testing
Our client asked us to switch gears and no longer pursue a business plan to extend the app as a SaaS product. However, our rich insights gave us plenty of discovery work to build on by improving the Research App MVP. Following the 5 main areas for improvement listed above, we set out to test feature enhancements, additions, and some information architecture rework. Our tried-and-true formula for success is below:
PART 2: USABILITY TESTING
Planning
My partner and I gave our client 3 different paths of exploration after wrapping up our Suitability Research. Our client happily chose our favored idea: usability testing the new feature ideas and app improvements we had heard from our interview participants.
To my delight, this path would most benefit the Ontos Research Team for 2 reasons:
They were continuing testing for another 2 years and could use product improvements
They had expressed many similar thoughts prior to our Suitability Research. In June 2021, after the first round of testing with the MVP in April 2021, I ran an MVP retrospective and ideation workshop with our core team about new test content and features that could improve their research data and the product.
View workshop notes here
PART 2: USABILITY TESTING
Site Mapping the Prototype
After reviewing the researchers' initial ideas from the workshop, we sent them brief surveys and conducted 30-minute interviews with active tool users. This helped us assess their overall understanding of the tool's features and gather feedback on desired changes, aspects they found insignificant, and new features that would significantly enhance their workflow. We compiled all of this information and created a site map/task flow for the prototype.
The things we mainly focused on learning were:
Simplifying/improving workflows and Information Architecture
Experimenting with new verbiage
Adding features to the Cognitive interview tool for better/more qualitative data.
PART 2: USABILITY TESTING
Usability Testing the Prototype
PART 2: USABILITY TESTING
Synthesis & Findings
Ask me to view the final report.
Below are the reactions, behaviors, and opinions of the researchers as they went through the prototype.