Summer 2023 Internship

Epic

Epic, also known as Epic Systems, is the leading electronic medical record (EMR) software company in the United States, based in Madison, Wisconsin. As a Summer 2023 User Experience Designer Intern there, I had the opportunity to own a project within their Telehealth application alongside a paired Software Developer Intern.

Project

Summer Internship

Epic Systems

Skills

User research, sketching, wireframing, prototyping, usability testing

Timeline

May 2023 - Aug 2023

Note: All of the frames below are low-fidelity, intended to just give a sense of the project!

Premise

What is Telehealth?

Epic’s Telehealth application is dedicated to building software that enables patients to access healthcare from wherever they are. Its primary service is the Epic Video Client (EVC), Epic’s homegrown browser-based videotelephony platform.

What was the project?

As part of preparing for a Telehealth video visit, patients are encouraged to test their device to make sure that it’s working and ready to go. They do this through a hardware test, and EVC currently has two:

PRE-VISIT TEST

Occurs immediately before the visit

When it’s time for a user’s visit, clicking ‘Join Video Visit’ leads them to the Pre-Visit Test, since the test is part of the interface for joining the call. It acts as a final “You’re good to go, and you can join when you’re ready!” check.

STANDALONE TEST

Ideally occurs a few days to a week in advance

The Standalone Hardware Test is an independent test that opens in a brand-new tab. It’s offered to patients when scheduling or when e-checking in to a video visit within Epic’s patient portal MyChart, depending on how customers configure it; it can also be sent as a direct link to those who do not have MyChart accounts.

We worked on redesigning the Standalone Hardware Test, following customers’ concerns over its current interface.

Preliminary Research

discover

The internship’s time frame didn’t allow us to conduct formal user interviews, as contact and research with patients goes through a regulated process, so we looked to usability testing later on to validate our assumptions.

Baseline Heuristic Evaluation

I documented an assessment of the current interface within the internal heuristic evaluation system, measuring it against Nielsen Norman Group’s 10 Usability Heuristics for User Interface Design:

1. Visibility of system status

2. Match between system and the real world

3. User control and freedom

4. Consistency and standards

5. Error prevention

6. Recognition rather than recall

7. Flexibility and efficiency of use

8. Aesthetic and minimalist design

9. Help users recognize, diagnose, and recover from errors

10. Help and documentation

Among the violations I found:

No direct exit from the page’s interface besides closing the tab

Cognitive overload when testing everything at once, since three results mean three actions

Results that aren’t actionable

Recovery guidance that sits apart from the error it addresses

Competitive Analysis

I then took a look at the existing structures and successes of other hardware tests and noticed variations across several dimensions.

Within the feedback method, I found an interesting difference between Automated and User-Directed testing that would shape the structure of our test later.

AUTOMATED

Twilio

As soon as a user clicks into the test, it automatically checks whether everything is working technically and then displays the results. This is really convenient and efficient for quick checks.
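
For a sense of what an automated check like this involves under the hood, here’s a minimal browser-side sketch in TypeScript. It’s purely illustrative (not Epic’s or Twilio’s actual code) and uses only the standard MediaDevices API:

```typescript
// Minimal sketch of a browser-side automated hardware check.
// Hypothetical helper, not a real product's implementation.

interface HardwareCheckResult {
  camera: boolean;
  microphone: boolean;
  error?: string;
}

async function runAutomatedCheck(): Promise<HardwareCheckResult> {
  try {
    // Requesting a stream both prompts for permission and verifies that
    // a working camera and microphone can actually produce tracks.
    const stream = await navigator.mediaDevices.getUserMedia({
      video: true,
      audio: true,
    });
    const result: HardwareCheckResult = {
      camera: stream.getVideoTracks().length > 0,
      microphone: stream.getAudioTracks().length > 0,
    };
    // Release the devices once the check is done.
    stream.getTracks().forEach((track) => track.stop());
    return result;
  } catch (err) {
    // Permission denied or no device found: surface it as a failed check.
    return { camera: false, microphone: false, error: String(err) };
  }
}
```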

USER-DIRECTED

The user determines whether or not they can see themselves. The accuracy of the test relies on the user’s judgment, but it helps catch environmental factors that technical tests can’t (e.g., camera covers, muffled audio).
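
To illustrate how lightweight the user-directed mechanic can be, here’s a hypothetical sketch of a speaker check: the app plays a tone with the standard Web Audio API, and the “result” is whatever the user reports.

```typescript
// Hypothetical sketch of a user-directed speaker check (illustrative only).
// The app plays a short test tone; the user, not the system, judges success.

function playTestTone(durationMs = 1000): void {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.frequency.value = 440; // a simple A4 tone
  osc.connect(ctx.destination);
  osc.start();
  setTimeout(() => {
    osc.stop();
    ctx.close(); // release audio resources once the tone ends
  }, durationMs);
}

playTestTone();
// The "result" is the user's own judgment, which is what lets this style
// of test catch muffled or covered hardware that still works technically.
const heard = window.confirm("Did you hear the test sound?");
console.log(heard ? "Speaker OK" : "Show speaker troubleshooting tips");
```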

Customer Solutions

define

Design Sprint

With all of these swirling understandings of the project, I decided that a Design Sprint would be the best way to resolve lingering questions and considerations in our problem space, spark design ideas, and catch potential issues with our current trajectory.

*sushi-themed slides for World Ocean Day!

I invited five other User Experience Designers, two Quality Managers, and three Software Developers for an intense three-hour rapid brainstorm-and-sketch session, which produced a lot of valuable discussion and raw material...

...which I was then able to convert into some really useful affinity maps!

Problem Statement

Our problem statement then crystallized into...

How can we seamlessly support users with varying technology skills and testing needs in a way that instills confidence in their preparation for Telehealth visits?

...with our main goals being:

1. Results are more digestible. The original interface elicited a lot of visual and cognitive overload.

2. Troubleshooting is guided. Instructions were not placed immediately with the error.

3. Workflow is clear. Users were unaware of the difference between the Pre-Visit and Standalone Tests.

Iteration

design

I made and played around with 2,000+ mockups, so I’ve condensed a sample that showcases each stage of iteration and some of our milestone decisions.

Sketches

Drawing from our Design Sprint collection, I sketched up some ideas of how to guide users through tests of the network, camera, speaker, and microphone.

Lo-fi

I explored a few avenues of flow through low-fidelity wireframes.

SEPARATED

Users select which element they would like to test

This is one layout example that uses a menu.

LINEAR

Users are led from test to test

This is one layout example that follows a horizontal subway-style stepper.

BLENDED

We attended the Telehealth Weekly Design Meetings to grab some feedback, and the idea of a blended workflow was born!

Users can still choose what they want, but there is also circular navigation within the tests.

Hi-fi + Milestone Decisions

Our project team visualized milestone decisions best in hi-fi, so that was the stage in which we spent a good chunk of time! During this stage, we focused on mobile.

THOROUGHNESS

Automated

Early on, we were met with suggestions to continue the automated style of testing from the current Standalone Hardware Test, where as soon as a user clicks into the test, we automatically check whether everything is working and then display the results. This is really convenient for users who are more tech-literate and just want a quick check.

Self-Directed

But since we wanted to do away with the cognitive overload of the current Standalone Hardware Test’s all-at-once status display, we thought it best to have users go through the camera, speaker, and microphone individually to see their results.

As in the competitive analysis, this also helps catch issues that might not necessarily be technical (e.g., a speaker that works but sounds muffled), and prompts the user to ensure that the quality is up to par.

WORKFLOW

Menu-based

A user can jump into any test, allowing them to test what they want, but they can only access other tests through the menu.

Step-by-step

Offers less choice than the menu, but guides the user through each test in sequence.

Blended

Our Goldilocks! There’s choice through the menu for users who just want to check one thing, but circular navigation within each test for users who want guidance.
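
To make “circular navigation” concrete, here’s a tiny hypothetical model of the blended workflow’s navigation in TypeScript (names are my own for illustration, not the production code’s):

```typescript
// Illustrative model of the blended workflow's navigation. A menu lets
// users jump straight to any test; within a test, "Next" cycles through
// the rest.

type TestId = "camera" | "speaker" | "microphone";

const ORDER: TestId[] = ["camera", "speaker", "microphone"];

// Circular "Next": wraps around, so every test is reachable no matter
// which one the user opened from the menu.
function nextTest(current: TestId): TestId {
  const i = ORDER.indexOf(current);
  return ORDER[(i + 1) % ORDER.length];
}

// A user who opens "speaker" from the menu and keeps pressing Next is
// still led through the remaining tests before circling back.
console.log(nextTest("speaker")); // "microphone"
console.log(nextTest("microphone")); // "camera"
```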

Mid-Summer Presentation

At this point came our Mid-Summer Presentation, where we got some good feedback and reflected internally as a project team on our progress and direction.

[REDACTED]

[REDACTED]

Though we received praise for our compromise with Blended, we as a project team felt that our decisions had been too skewed by our feedback demographic’s higher technological skill, and wanted to revisit the scope of our target user.

re-define

Customer Call

The customer that had sparked this project was available for a call, and our discussion with them helped narrow our focus.

USERS

Our original problem statement tried to address all users with varying technology skills and testing needs...

(Users mapped on two axes: Technological Literacy, low to high, and Frequency of Use, low to high.)

...but trying to cater to too much ultimately took away from those who needed it the most:

Older, lower-tech-literacy users who might not be using technology or video visits frequently

GOALS

1. Gain confidence that they’re prepared

2. Be guided through any troubles

Persona

From there, we created a persona to keep us anchored to our target user!

Meet Sal (short for Salvatore)!

ABOUT

65-year-old retired landscaper

Never really had to use technology much

Usually offloads any technological tasks/issues to his son

SITUATION

Son recently moved away → Sal must take on more tech-heavy tasks

Needs to prepare for his upcoming follow-up video visit about his new hypertension medicine

GOAL

Complete the test for confidence that his device is ready for the visit

FRUSTRATIONS

Overwhelmed when faced with a lack of clear instructions and directions

Struggles with both the digital and physical technology

Revised Problem Statement

How can we seamlessly support users with lower technological literacy in a way that instills confidence in their preparation for Telehealth visits?

re-design

Hi-fi + Milestone Decisions

With this narrower scope, we were able to continue more confidently in our hi-fis.

WORKFLOW

Menu-based

Step-by-step

Blended

Since trying to cater to too much variation in tech-savviness would hinder clarity for the users who really need the test, we reverted to the Step-by-step workflow to align with our target users.

ERROR RECOVERY

Within the Step-by-step workflow, though, we ran into our final decision: where to surface troubleshooting.

Originally, troubleshooting was only given in the results after the test was completed. The rationale was that users should focus on one task at a time: test, figure out what’s wrong, and then troubleshoot. If a user troubleshoots as they go and runs into an error they can’t fix, they may just give up and not test the other components.

Fix at end

Usability Testing

ROUND 1

We were unsure about our choice to place troubleshooting at the end, and decided it might be best to inform our decision with some usability testing. We structured it as a task-based run-through of the test, as our goal was to observe how users interacted with the workflow.

“It’s a little weird to have to go all the way through the test in order to fix it. I would rather fix the problem right there.”

“It's almost aimless to go on to the test when something is failing because you have to go back anyways.”

“If I was an old person, I'd think that the phone was going to forget to bring me back to the tips!”

The consensus was clear: the anxiety of getting an error and having to wait to resolve it outweighs the anxiety of the error itself.

ERROR RECOVERY

Fix at end

Fix as you go

We ultimately decided to put troubleshooting within testing. This runs the risk of users getting stuck on errors they can’t troubleshoot, but we put mitigations in place, such as a sticky bar at the bottom with an ever-present Next button to signal that you can always move forward, as well as an advisory message to move on if a user tries to test again.
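
Here’s a rough sketch of that “fix as you go” logic with hypothetical state names (my illustration, not the shipped code): tips appear inline on failure, Next is always available, and retrying after a failure surfaces the advisory to move on.

```typescript
// Hypothetical sketch of the fix-as-you-go mitigation, not the shipped code.

interface StepState {
  failedOnce: boolean; // has this hardware test failed at least once?
  showTips: boolean; // inline troubleshooting, right where the error is
  showMoveOnAdvisory: boolean; // nudge that the user can always continue
}

const NEXT_ALWAYS_ENABLED = true; // sticky bottom bar: Next never locks

function onTestResult(state: StepState, passed: boolean): StepState {
  if (passed) {
    return { failedOnce: state.failedOnce, showTips: false, showMoveOnAdvisory: false };
  }
  // Troubleshooting appears immediately with the error, not at the end.
  return { failedOnce: true, showTips: true, showMoveOnAdvisory: false };
}

function onRetry(state: StepState): StepState {
  // If the user re-tests after a failure, advise that moving on is okay,
  // so nobody gets stuck looping on an error they can't fix.
  return { ...state, showMoveOnAdvisory: state.failedOnce };
}
```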

ROUND 2

We ran another task-based run-through of the test to validate our changes, and were met with positive reactions!

“It was intuitive where I needed to go next.”

“Super straightforward and clean!”

“The process was seamless.”

Final Look

Demo

Mobile

Desktop

MOCKUP WALKTHROUGH

Final Presentation

All of this culminated in our final presentation at Demo Days, where we got to showcase all of our hard work from the summer!

Here with my paired software developer and our mentors🫶🏻

Reflections

Main Takeaways

Cross-functional compromise: Working with a paired Software Developer Intern meant our skill sets complemented each other, but learning to understand each other’s priorities and timelines was definitely a challenge that we took on!

What it means to be a designer: Empathy is often cited in designers’ toolkits, but it isn’t just for the end user. I was surprised to learn that in my day-to-day I spent about 50% of my time in meetings; to be a great designer, then, is not only to be a creator of great products, but also a great listener and a great communicator.



Made by Fiona with the help of lots of kombucha!
