Gordon Center's Essential Cardiac Auscultation re-design

User testing and UX design enhancements to the Gordon Center’s web e-learning platform

      Project Overview

Enhancing healthcare learning by improving Gordon Center’s Cardiac Auscultation e-learning modules.

In this project, I conducted a thorough evaluation of the Gordon Center’s Essential Cardiac Auscultation learning module, an online program designed for medical students to practice using a stethoscope to listen to heartbeat sounds. I collaborated with my co-UX researcher to perform 9 moderated virtual usability tests on the platform.

Outcome:

In this evaluation, I identified 21 unique usability issues and pinpointed 8 UI elements that were effective and should remain unchanged. To address the usability issues, I created 21 mockups of proposed design solutions, which were pitched to the client. Overall, the client approved of the re-design suggestions and planned to utilize our findings to update the web platform with their in-house developers.

Project type:
Gordon Center for Simulation and Innovation
Category:
UX Design, User Research
Industry:
Education, Medical
Responsibilities:
test plan creation, recruitment, usability test moderation, insight synthesis, UX/UI design recommendations, wireframing, pitch presentation
My Role:
UX designer & UX researcher / Team of 2
Timeline:
6 weeks / 2022
      Prototype

People were unaware of their progress toward the final testing section, so a progress indicator was needed.

I addressed this uncertainty by adding a progress indicator with a percentage scale that shows how much of the module users have left to complete.

Quote:

"A progress bar on the top were the nav bar is would be super helpful when going through the entire site." - usability test participant
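As a sketch of the proposed indicator, the displayed percentage can be derived from the learner's position in the module. The function name and page counts below are hypothetical, not part of the actual platform:

```javascript
// Hypothetical helper for the proposed progress indicator: given how many
// module pages a learner has completed and the module's total page count,
// return the rounded percentage to display near the nav bar.
function progressPercent(completedPages, totalPages) {
  if (totalPages <= 0) throw new Error("totalPages must be positive");
  // Clamp so out-of-range inputs never render a percentage below 0 or above 100.
  const clamped = Math.min(Math.max(completedPages, 0), totalPages);
  return Math.round((clamped / totalPages) * 100);
}

// Example: 7 of 20 pages completed renders as 35%.
progressPercent(7, 20); // 35
```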

      Problem Space

The Gordon Center Essential Cardiac Auscultation module is a web-based educational tool developed by the Gordon Center.

The platform teaches through interactive lessons, practice pages with audio, and skills testing with video case studies. It is built on Moodle, an open-source learning management system (LMS), and developed in-house by the center's front-end developers.

Our client at the Gordon Center expressed a need to evaluate the overall platform's usability and sought design recommendations to enhance the user interface (UI) and user experience (UX).

      Project Approach

I spearheaded a 6-week user evaluation and re-design process. With my partner, I planned and moderated 90-minute remote usability tests with 6 medical students and 3 non-medical students, structured around 3 task-based scenarios. Participants also completed the System Usability Scale (SUS) questionnaire and answered post-study questions.

My responsibilities included:

  • Weeks 1-2: Set goals and criteria for the usability tests; create detailed task scenarios and questions.
  • Weeks 3-4: Moderate 4 virtual usability tests and take notes for 3 more.
  • Week 5: Collect and analyze data from the 9 usability tests, identify key areas for improvement based on feedback, and create design recommendations in Figma.
  • Week 6: Prepare a research report with recommendations and key test findings for the client, and deliver additional handoff documents.

Our goals were to track the module's efficiency, gauge the content's effectiveness and presentation, assess overall usability, and evaluate the design to ensure an optimal learning experience for users.

      Research

We planned a 5-step procedure for a 3-task evaluation, with the goal of enhancing the existing product.

The goals of the usability test were to track the module's efficiency, gauge the content's effectiveness, assess overall usability, and evaluate the design to ensure it provided an engaging learning experience for its users.

Procedure:

  1. Participants met with us over Zoom and were asked to fill out a consent form and complete a demographic questionnaire before the start of the usability test.
  2. For all 3 tasks, we recorded completion time, success rate, number of usability issues, task start and end locations, and task flow.
  3. After completing all test tasks, participants answered the SUS questionnaire on a Qualtrics form.
  4. Participants were then asked open-ended questions about the UI elements they saw or interacted with on the website.
  5. Participants completed post-study interview questions about how clear, useful, and engaging the website's content was.
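For reference, the SUS responses gathered in step 3 are scored on the standard 0–100 scale: odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5. A minimal scoring sketch (the function name is hypothetical):

```javascript
// Score a System Usability Scale questionnaire. `responses` is an array of
// ten 1-5 ratings in questionnaire order (item 1 first). Odd-numbered items
// contribute (rating - 1), even-numbered items contribute (5 - rating), and
// the summed contributions are scaled by 2.5 onto a 0-100 scale.
function susScore(responses) {
  if (responses.length !== 10) throw new Error("SUS has exactly 10 items");
  const sum = responses.reduce(
    (acc, rating, i) => acc + (i % 2 === 0 ? rating - 1 : 5 - rating),
    0
  );
  return sum * 2.5;
}

// Example: a maximally positive response pattern scores 100.
susScore([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]); // 100
```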
      User Insights

All participants passed Tasks 1 & 2, but Task 3 ('Finding Help') had only a 33% success rate, highlighting an urgent need for better help accessibility.

Users found it difficult to locate help information. They disliked that the only way to get help was through the “Submit Ticket” feature on the homepage, which opened their native email app.

Quote:

“This seems as if someone will get back to me sometime during business hours.” - usability test participant

      Research Findings

We found that instructions for the practice questions were missing.

I suggested adding clearer instructions that point users to the interactive elements on the page, and including a hover state for the sound buttons.

Quote:

"I didn’t know those were clickable, I just skimmed pass them thinking the instructions were for an upcoming page. "  - usability test participant

      Exploring Solutions

As I prepared to dive into Figma, the client revealed that students strongly prefer using tablets and phones for studying on the go.

With this newfound information, I had to work within the constraint of using the same, if not more, content in a limited screen space. Despite this, I decided to focus the new designs on optimizing the learning experience for tablet view, prioritizing mobile-friendly solutions for learning on the go over traditional desk-based learning.

      Design Challenges

I recommended that the E-learning module’s overall visual design should derive from University of Miami’s visual branding.

The e-learning platform's current UI followed a different style guide than its landing page on the Gordon Center's official website, which could cause confusion. To ensure consistency and recognition, it is important that users can easily identify the e-learning modules as a UM product.

      Design Challenges

To address the issue of the large wall of transcription text that made following the video difficult for users, I introduced visuals.

I suggested adding video timestamps, pictures, and text-highlighting cues so that users have visuals to follow along with the lesson while reading the transcript.

Quote:

"I wish that the transcript section had visuals or connected with the video visuals a bit better." - usability test participant

      Design Solutions

Finding immediate help within the modules was challenging, so an error-free path to guidance was required for a better user experience.

I recommended adding a contact section in the footer with immediate resources that are easily accessible throughout the site. Additionally, users should be able to submit a form to connect with a help representative.

Quote:

"I really don't like that the ‘Submit a Ticket’ button opens up my personal email. Plus how do I even know I’ll get a response quickly?" - usability test participant

      Design Handoff

Finally, I created a design handoff document that lays out fundamental design guidelines: functional, yet flexible enough to ensure that users get an optimal learning experience on the website.

I generated a series of templates from the mockups and documented the design specifications. I wanted the documentation to be easy to consume yet detailed, so that our client had the right tools to re-design their learning platform with their own in-house designers and developers.

      Usability Test Tasks

Within the test, participants also answered post-study questions about the challenges or delights they experienced while using the platform.

Tasks included:

  • Test Task 1: Participants look through the entire module and give first impressions. (Measured by usability issues, the Single Ease Question (SEQ, 7-point rating), and the System Usability Scale (SUS, 5-point items).)
  • Test Task 2: Participants complete the entirety of the module up to the “Testing” section. (Measured by task completion time.)
  • Test Task 3: Toward the end of the module, participants “Submit Ticket” to seek help. (Measured by success rate: pass or fail.)
      Usability Findings

Usability testing revealed 21 unique issues in the existing platform.

To better assess the design problems, I organized the findings into four categories: content, functionality, visibility, and visual design.

Content findings:

  • Missing Instructions for Practice Questions: Users struggled due to the lack of clear instructions for practice questions.

Functionality findings:

  • Audio Overlap Issues: Users experienced audio overlap on pages with multiple videos. Without manually stopping one video, both audios played simultaneously, undermining the purpose of sound distinction.

Visibility findings:

  • Navigation and Progress Tracking Problems: Users had trouble knowing their location within the module. There was no indication of which sections they had completed or how much remained before the testing section.

Participants also voiced that they wanted to keep features such as the navigation bar, video transcripts, practice videos, and the test section interface layout, so the new design needed to retain these key elements.
      Solutions

To address the audio overlap, I recommended ensuring that a playing video stops as soon as a new one is clicked, and highlighting the video currently playing audio with a blue border.

In total, I crafted 21 design solutions in Figma to address the challenges within the learning platform. While designing, I refreshed essential UI components such as the navigation bar, video transcripts, and the testing layout, as students found these elements most useful. I also introduced new elements like a progress bar and interactive features, such as highlighted text within video transcripts, to make the content less overwhelming to read.
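One functionality issue was audio overlapping when multiple videos on a page played at once. A minimal sketch of an exclusive-playback fix, pausing the others when one video starts and tracking the active one for a blue-border highlight, assuming each player exposes play() and pause() as HTMLMediaElement does (the controller and names here are hypothetical):

```javascript
// Sketch of exclusive playback: when one video starts, pause every other
// video so their audio never overlaps, and track the active video so the UI
// can outline the one currently playing (e.g., with a blue border).
function createPlaybackController(videos) {
  let active = null;
  return {
    play(video) {
      for (const v of videos) {
        if (v !== video) v.pause(); // stop any overlapping audio
      }
      video.play();
      active = video; // the UI highlights this player
    },
    activeVideo: () => active,
  };
}
```

Wired to real `<video>` elements, the same logic could listen for each element's `play` event and toggle a CSS border class on the active player.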
      Impact + Learnings

In the end, the proposed design solutions were well received: the client was happy with them and is considering implementing our ideas to improve the platform.

With more time, I would have expanded our testing group beyond the three medical students. Doing so would give me a better understanding of different learning preferences and let me directly assess the impact of our design solutions by presenting the changes to the same participants.

This project also validated the lesson that when making design recommendations for an existing product, there is no perfect solution. There can be many ways to solve a problem, as long as there is a good reason for how it positively impacts the user. We can only determine a solution's effectiveness by testing it with real users.


Gordon Center's Essential Cardiac Heart Auscultation re-design

User testing and UX design enhancements to the Gordon Center’s web e-learning platform

      Project Overview

Enhancing healthcare learning by improving Gordon Center’s Cardiac Auscultation e-learning modules.

In this project, I conducted a thorough evaluation of the Gordon Center’s Essential Cardiac Auscultation learning module, an online program designed for medical students to practice using a stethoscope to listen to heartbeat sounds. I collaborated with my co-UX researcher to perform 9 moderated virtual usability tests on the platform.

Outcome:

In this evaluation, I identified 21 unique usability issues and pinpointed 8 UI elements that were effective and should remain unchanged. To address the usability issues, I created 21 mockups of proposed design solutions, which were pitched to the client. Overall, the client approved of the re-design suggestions and planned to utilize our findings to update the web platform with their in-house developers.

Project type:
Gordon Center for Simulation and Innovation
Category:
UX Design
,
User Research
Industry:
SaaS, health care, HIPAA reporting, employee administration
Responsibilities
design strategy, user flows, information architecture, dashboard design, data table design, system interactions, prototyping, design token creation
My Role:
UX designer & UX researcher / Team of 2
Timeline:
6 weeks / 2023
Prototype:
      Problem sPACE

The Gordon Center Essential Cardiac Auscultation module is a web-based educational tool developed by the Gordon Center.

Through interactive teaching, practice with audio pages, and skills testing using video case studies, the platform leverages Moodle LMS as an open-source platform for education, featuring in-house development by the center's front-end developers.

Our client at the Gordon Center expressed a need to evaluate the overall platform's usability and sought design recommendations to enhance the user interface (UI) and user experience (UX).

      Project Apprpoach

I spearheaded a 6 week user evaluation and re-design process.

With my partner, I conducted and moderated 90-minute remote usability tests with a diverse group of participants, including 6 medical students and 3 non-medical students.

I designed the tests to be structured around three task-based scenarios and I utilized the insights from each usability test to make informed design recommendations to improve the functionality, visual design and content within the website.

My responsibilities included:

  • Week 1-2: Set goals and criteria for the usability tests, create detailed task scenarios and questions.
  • Week 3-4: moderate 4 virtual usability tests and note-take for 3.
  • Week 5: Collect and analyze data from the 9 usability tests, Identify key areas for improvement based on feedback and create design recommendations using Figma
  • Week 6: Prepare a research report with recommendations and key test findings  for the client and delivered additional hand off documents.
      RESEARCH

We planned a 5 step procedure for a 3 task evaluation with the goals to enhance the existing product.

My goals of the usability test were to track the module's efficiency, gauge the content's effectiveness, assess overall usability, and evaluate the design to ensure it did provide an engaging learning experience for it's users.

Procedure:

  1. 6 participants were met  through Zoom and were asked to filled a consent form and complete demographic questionnaire prior to the start of the usability test.
  2. For all 3 tasks: completion time, success rate, number of usability issues, location of start of task, location of end of task and task flow was recorded.
  3. After participants completed all test task, participants answered SUS questionnaire on a Qualtrics form.
  4. Participants were then asked open ended questions related to the UI elements that they saw or interacted with on the website.
  5. Participants completed post study interview questions about how clear, useful and engaging the content of the website was.
      DETAILS

All participants passed Tasks 1 & 2 but Task 3: 'Finding Help' had only 33% success rate, highlighting urgent need for better help accessibility.

Users found it difficult to locate help information. They disliked that the only way to get help was through the “Submit Ticket” feature on the homepage, which opened their native email app.

Quote:

“This seems as if someone will get back to me sometime during business hours.” - usability test participant

      RESEARCH Findings

Users also expressed a desire to retain six important features.

Participants also voiced that they wanted to keep features such as the navigation bar, video transcripts, practice videos, and the test section interface layout.

I had to ensure that the new website design retained key elements from the original layout. It was essential to keep the components that users know and love, such as the navigation bar, video transcripts, practice videos, and the test section interface layout.

Quotes:

"If I were going through this as a review, I would definitely use  timestamps to skip to the specific sections so I don't have to watch the full video.” - usability test participant

      USER PERSONAS

As I prepared to dive into Figma, the client revealed that students strongly prefer using tablets and phones for studying on the go.

With this newfound information, I had to work within the constraint of using the same, if not more, content in a limited screen space. Despite this, I decided to focus the new designs on optimizing the learning experience for tablet view, prioritizing mobile-friendly solutions for learning on the go over traditional desk-based learning.

      USER JOURNEY

I recommended that the E-learning module’s overall visual design should derive from University of Miami’s visual branding.

The current UI of the e-learning platform had a different style guide than it's landing page on the gordon center's official website, which could cause confusion. To ensure consistency and recognition, it is important that users can easily identify the E-learning modules as a UM product.

      VISUAL DESIGN

To address the issue of the large wall of transcription text that made following the video difficult for users, I introduced visuals.

I suggested to add video timestamps pictures  and text highlighting queues so that users can have visuals to follow along with the lesson while reading the transcript

Quote:

"I wish that the transcript section had visuals or connected with the video visuals a bit better." - usability test participant

      dashboard design

People experienced audio overlap on pages with multiple videos, so I proposed having a clear distinction between played and paused video.

I recommended to make sure videos stop once a new one is clicked and highlight videos that plays audio with a blue colored border.

Quote:

"It’s annoying that I need to stop one video and to play the next one properly."- usability test participant

      design solutions

People were unaware of their progress toward the final testing section, so a progress indicator was needed.

I fix this uncertainty by adding a progress indicator as a percentage scale to show how much of the module users have left to complete

Quote:

"A progress bar on the top were the nav bar is would be super helpful when going through the entire site." - usability test participant

      design Solutions

Finally, I created a design hand off document that showcases fundamental design guidelines that  are functional yet flexible enough to ensure that the users are getting an optimal learning experience on the website. 

I generated a series of templates made from the mockups and documented the design specifications. i wanted to make sure the documentation was easy to consume detailed so that our client had the right tools to re-design their learning platform on their own in -house designers and developers.

      design challengeS

Within the test, participants also had to answer post-study questions about the challenges or delights they experience while using the platform.

Tasks include:

  • Test Task 1: Participants must look through the entire module and get first impressions. (measured by usability issues, Single Ease Questionnaire (SEQ)- 7pt rating score 1System Usability Scale (SUS) - 5pt rating)
  • Test Task 2: Participant completes the entirety of the module up to the “Testing” section. (measured by task completion time)

Gordon Center's Essential Cardiac Auscultation re-design

User testing and UX design enhancements to the Gordon Center’s web e-learning platform

      Project Overview

Enhancing healthcare learning by improving Gordon Center’s Cardiac Auscultation e-learning modules.

In this project, I conducted a thorough evaluation of the Gordon Center’s Essential Cardiac Auscultation learning module, an online program designed for medical students to practice using a stethoscope to listen to heartbeat sounds. I collaborated with my co-UX researcher to perform 9 moderated virtual usability tests on the platform.

Outcome:

In this evaluation, I identified 21 unique usability issues and pinpointed 8 UI elements that were effective and should remain unchanged. To address the usability issues, I created 21 mockups of proposed design solutions, which were pitched to the client. Overall, the client approved of the re-design suggestions and planned to utilize our findings to update the web platform with their in-house developers.

Category:
UX Design
,
User Research
Industry:
Education, Medical
Responsibilities:
test plan creation, recruitment, usability test moderation, divergent insight synthesis, UX/UI design recommendation creation, wireframing, pitch presentation
My Role:
UX designer & UX researcher / Team of 2
Timeline:
6 weeks / 2022
Official usability report:
      Client overview

The Gordon Center Essential Cardiac Auscultation module is a web-based educational tool developed by the Gordon Center.

Through interactive teaching, practice with audio pages, and skills testing using video case studies, the platform leverages Moodle LMS as an open-source platform for education, featuring in-house development by the center's front-end developers.

Our client at the Gordon Center expressed a need to evaluate the overall platform's usability and sought design recommendations to enhance the user interface (UI) and user experience (UX).

      Project Approach

I spearheaded a 6-week user evaluation and re-design process.

With my partner, I conducted and moderated 90-minute remote usability tests with a diverse group of participants, including 6 medical students and 3 non-medical students.

I structured the tests around three task-based scenarios and used the insights from each usability test to make informed design recommendations to improve the functionality, visual design, and content of the website.

My responsibilities included:

  • Weeks 1-2: Set goals and criteria for the usability tests; create detailed task scenarios and questions.
  • Weeks 3-4: Moderate 4 virtual usability tests and take notes for 3 others.
  • Week 5: Collect and analyze data from the 9 usability tests, identify key areas for improvement based on feedback, and create design recommendations in Figma.
  • Week 6: Prepare a research report with recommendations and key test findings for the client, and deliver additional handoff documents.
      test planning

We planned a 5-step procedure for a 3-task evaluation, with the goal of enhancing the existing product.

My goals for the usability tests were to track the module's efficiency, gauge the content's effectiveness, assess overall usability, and evaluate the design to ensure it provided an engaging learning experience for its users.

Procedure:

  1. Participants met with us through Zoom and were asked to fill out a consent form and complete a demographic questionnaire before the start of the usability test.
  2. For all 3 tasks, we recorded completion time, success rate, number of usability issues, the locations where each task started and ended, and the task flow.
  3. After completing all test tasks, participants answered the System Usability Scale (SUS) questionnaire on a Qualtrics form.
  4. Participants were then asked open-ended questions about the UI elements they saw or interacted with on the website.
  5. Participants completed post-study interview questions about how clear, useful, and engaging the website's content was.
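The SUS responses collected in step 3 reduce to a single 0-100 score using the standard SUS formula: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5. A minimal sketch of that arithmetic (the `susScore` helper is illustrative, not part of the study materials):

```typescript
// Standard SUS scoring for one participant's ten 1-5 Likert responses.
// Odd-numbered items (1st, 3rd, ...) contribute (response - 1);
// even-numbered items contribute (5 - response); the sum times 2.5
// yields a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS has exactly 10 items");
  }
  let sum = 0;
  responses.forEach((r, i) => {
    if (r < 1 || r > 5) throw new Error("responses must be 1-5");
    sum += i % 2 === 0 ? r - 1 : 5 - r; // index 0 is item 1 (odd-numbered)
  });
  return sum * 2.5;
}
```

An all-positive sheet (5 on odd items, 1 on even items) scores 100, while all-neutral 3s score 50, which is why SUS results are easy to compare across participants.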
      test tasks

Within the test, participants also had to answer post-study questions about the challenges or delights they experienced while using the platform.

Tasks included:

  • Test Task 1: Participants look through the entire module and give first impressions. (measured by usability issues, a Single Ease Question (SEQ) 7-point rating, and a System Usability Scale (SUS) 5-point rating)
  • Test Task 2: Participants complete the entirety of the module up to the “Testing” section. (measured by task completion time)
  • Test Task 3: Towards the end of the module, participants should “Submit Ticket” to seek help. (measured by success rate, pass or fail)
      TEST Insights

Usability testing revealed 21 unique issues in the existing platform.

During our usability testing, we identified 21 unique issues with the existing platform. To better assess the design problems, I organized the findings into four categories: content, functionality, visibility, and visual design.

Content findings:

  • Missing Instructions for Practice Questions: Users struggled due to the lack of clear instructions for practice questions.

Functionality findings:

  • Audio Overlap Issues: Users experienced audio overlap on pages with multiple videos. Without manually stopping one video, both audios played simultaneously, undermining the purpose of sound distinction.

Visibility findings:

  • Navigation and Progress Tracking Problems: Users had trouble knowing their location within the module. There was no indication of which sections they had completed or how much they had left until the testing section.
      DETAILS

All participants passed Tasks 1 & 2, but Task 3, 'Finding Help,' had only a 33% success rate, highlighting an urgent need for better help accessibility.

Users found it difficult to locate help information. They disliked that the only way to get help was through the “Submit Ticket” feature on the homepage, which opened their native email app.

Quote:

“This seems as if someone will get back to me sometime during business hours.” - usability test participant

      keepers

Users also expressed a desire to retain six important features.

Participants voiced that they wanted to keep features such as the navigation bar, video transcripts, practice videos, and the test section's interface layout.

I had to ensure that the new website design retained these key elements from the original layout. It was essential to keep the components that users already knew and relied on.

Quote:

"If I were going through this as a review, I would definitely use  timestamps to skip to the specific sections so I don't have to watch the full video.” - usability test participant

      CLIENT CONSIDERATIONS

As I prepared to dive into Figma, the client revealed that students strongly prefer using tablets and phones for studying on the go.

With this newfound information, I had to work within the constraint of using the same, if not more, content in a limited screen space. Despite this, I decided to focus the new designs on optimizing the learning experience for tablet view, prioritizing mobile-friendly solutions for learning on the go over traditional desk-based learning.

      VISUAL DESIGN

I recommended that the e-learning module's overall visual design derive from the University of Miami's visual branding.

The e-learning platform's current UI followed a different style guide than its landing page on the Gordon Center's official website, which could cause confusion. To ensure consistency and recognition, users should be able to easily identify the e-learning modules as a University of Miami (UM) product.

      UX Enhancements

21 design solutions in Figma were crafted to address and mitigate the challenges within the learning platform.

While designing, I refreshed essential UI components such as the navigation bar, video transcripts, and the testing layout, as students found these elements most useful.

I also introduced new design elements like a progress bar and interactive features, such as highlighted text within video transcripts, to make the content less overwhelming to read.

      design solutions

We found that instructions were missing for the practice questions.

I suggested adding clearer instructions that point users to interact with elements on the page, and including a hover state for the sound buttons.

Quote:

"I didn’t know those were clickable, I just skimmed pass them thinking the instructions were for an upcoming page. "  - usability test participant

      design solutions

To address the issue of the large wall of transcription text that made following the video difficult for users, I introduced visuals.

I suggested adding video timestamp images and text-highlighting cues so that users have visuals to follow along with the lesson while reading the transcript.

Quote:

"I wish that the transcript section had visuals or connected with the video visuals a bit better." - usability test participant

      design solutions

People experienced audio overlap on pages with multiple videos, so I proposed having a clear distinction between played and paused video.

I recommended ensuring that a playing video stops once a new one is clicked, and highlighting the video currently playing audio with a blue border.

Quote:

"It’s annoying that I need to stop one video and to play the next one properly."- usability test participant
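The "only one video plays at a time" rule can be enforced with a small amount of front-end state, independent of any framework. This is a sketch under assumed names (`Player` and `PlayerGroup` are hypothetical, not the platform's actual code); in the browser, `play()` would be triggered by each `<video>` element's `play` event and would call `pause()` on the others:

```typescript
// Sketch of a "single active player" rule: when one video starts,
// every other registered player is paused and un-highlighted.
// The Player shape and class names are hypothetical.
interface Player {
  id: string;
  playing: boolean;
  highlighted: boolean; // e.g. the blue border on the active video
}

class PlayerGroup {
  private players = new Map<string, Player>();

  register(id: string): Player {
    const p = { id, playing: false, highlighted: false };
    this.players.set(id, p);
    return p;
  }

  // Play one video; pause and un-highlight all the others.
  play(id: string): void {
    for (const p of this.players.values()) {
      p.playing = p.id === id;
      p.highlighted = p.id === id;
    }
  }
}
```

Centralizing the state this way means the blue-border highlight and the audio exclusivity can never drift apart, since both derive from the same "active player" record.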

      design solutions

People were unaware of their progress toward the final testing section, so a progress indicator was needed.

I addressed this uncertainty by adding a percentage-based progress indicator that shows how much of the module users have left to complete.

Quote:

"A progress bar on the top were the nav bar is would be super helpful when going through the entire site." - usability test participant
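The percentage such an indicator shows is simple arithmetic: completed pages over total pages, rounded and clamped. A sketch of that calculation (the helper name is my own; the real page counts would come from the LMS):

```typescript
// Hypothetical helper: percentage of the module completed, rounded to a
// whole number and clamped to 0-100 so the indicator never overflows.
function progressPercent(completedPages: number, totalPages: number): number {
  if (totalPages <= 0) return 0;
  const pct = Math.round((completedPages / totalPages) * 100);
  return Math.min(100, Math.max(0, pct));
}
```

Clamping guards against edge cases such as optional pages pushing the completed count past the total.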

      design solutions

Finding immediate help within the modules was challenging, so an error-free path to guidance was required for a better user experience.

I recommended adding a contact section in the footer with immediate resources that are easily accessible throughout the site. Additionally, users should be able to submit a form to connect with a help representative.

Quote:

"I really don't like that the ‘Submit a Ticket’ button opens up my personal email. Plus how do I even know I’ll get a response quickly?" - usability test participant

      impact + lessons

In the end, the proposed design solutions were well-received by our client.

Our client was happy with the proposed design solutions and is considering implementing our ideas to improve the platform! However, I've learned through this process that there is no perfect solution for an existing product. Multiple approaches can work, as long as they positively impact the user, so we can only determine a solution's effectiveness by testing it with real users.

      Learnings

With more time, I would have expanded our testing group beyond the three medical students.

By doing so, I would have gained a better understanding of different learning preferences and could have directly assessed the impact of our design solutions by presenting the changes to the same participants.

This project also validated the lesson that when coming up with design recommendations for an existing product, there is no perfect solution. There can be many different ways to solve a problem, as long as there is a good reason for how it can positively impact the user. We can only determine the effectiveness of a solution by testing it.
