Student Performance Record
customer: University of Michigan Medical School (2016)
definition: Web application that aggregates and displays academic performance information from several sources
impact: Eliminated 40+ hours per month of data foraging and formatting in key student support services across the org; enabled team to pivot to a more comprehensive solution within 6 months
ux methods: Interviewing, contextual inquiry, facilitation, narrative framing, paired designing, rapid prototyping, iterative design
ux deliverables: Research debriefs, sketches, process models, epics and user stories, mockups, HTML prototypes, product demonstrations
My team and I observed a pattern among customer requests arriving through help tickets in Zendesk. Multiple tickets asked for new auto-generated reports or for cosmetic changes to existing reports. One user wanted direct SQL access to the database where all student information is stored. Some requests were technically possible but bad for the business in the long term; others posed risks to student privacy, which is federally protected.
We hypothesized that these requests stemmed from a common root problem and could, in turn, be solved by a single solution. I took the lead on planning research efforts, gathering data, and synthesizing my findings for the team. As we shifted into product development, I defined the workflow and high-level design of the application. I also stayed in touch with users who had participated in interviews and gathered their feedback on the product regularly.
Finding the Right Problem
Before putting marker to whiteboard, I sought to understand the problems that prompted the requests from users. My team strives to build sustainable software that meets users' needs. Solving the right problem is critical to building solutions that users like to use and customers like to invest in. Nailing the problem doesn't just make for a better design; I think it also makes for a better business investment.
I interviewed users about the work that they had been doing with academic performance data and asked them to show me how they work. The administrative staff and faculty that participated in user research for this project helped me understand some key pain points around accessing and formatting data.
An anti-pattern is a common response to a recurring problem that risks being counterproductive. I used this term to describe the user behaviors I observed around usage of performance data. Through interviewing, I identified three meaningful anti-patterns that describe the root problems.
- Data foraging - users described spending a lot of time and energy finding and gathering data. The Registrar of the Medical School described a painstaking process of retrieving the data used to decide a prestigious student award. They spent hours collecting, verifying, and re-collecting students' clinical grades and entering them into a spreadsheet.
- Data wrangling - users showed us that they spent a lot of time formatting and distributing data. A statistician showed me a sample performance summary compiled for a single student; it contained core competency scores, assessment comments, and quiz and exam grades, and ran 10 pages in total. The statistics team reported spending at least 40 hours per month compiling and formatting up to 30 of these summaries for a monthly student support meeting. This monthly meeting has up to 20 participants, resulting in 20 copies of the collected summaries, each up to 300 pages long.
- Data not found - some data could not be accessed without expensive IT intervention. When data was simply not available through any user interface, users sought help from software developers to retrieve it.
Articulating anti-patterns helped me triangulate the root UX problem: users had no single source for academic performance data. Having identified the root problem, a sort of "lowest common denominator," I was confident that one software solution could effectively fulfill the requests from our customers.
Our users have to use our software because it is the only interface available to them for retrieving the information they need in order to do their jobs. This is a reality of working on an in-house team that builds custom software solutions. UX personas are useful for articulating user diversity in design projects that aim for a broad audience, but on this project they tended to be uncanny caricatures of the people we knew would use the app.
Another UX designer in my unit recommended "proto-personas" as a design artifact that could create shared understanding of audience and goals across my team without becoming overly specific. The IT unit my team operates under has a long history of doing requirements-driven development and focusing on building features "for Bob" or "for Sally." I wanted to avoid that here, because it results in unsustainable software that inevitably becomes confusing and even unusable when Bob or Sally moves on to other roles and is no longer the primary user. For me, "proto" in proto-personas refers to the level of specificity expressed in the persona. I ran with the idea and presented a job role-based schema of primary users and potential users to my team. Sample slides from my discovery slide deck are included below.
Proto-personas helped my team orient around the roles and organizational functions that we wanted to support with our solution in this project. By "zooming out" from individuals, we were able to build an application that was useful to our customer and usable by our users.
Academic performance data helps drive a number of academic support and professional development activities at the Medical School. Without a useful, reliable, and secure resource, the Medical School administration was stuck doing cumbersome work just to get data into a usable state, effort that might have been allocated elsewhere.
My team and I came back together to compare the results of our research. Our software ecosystem was made up of data silos and obscure points of entry for users. We agreed that the root problem was shared across the customer requests we had collected and that we could build one solution that made performance data accessible in the browser as well as in portable file formats for easy reuse.
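The core idea, one record assembled from several silos and exportable to a portable format, can be sketched in a few lines. This is a minimal illustration only: the source names, fields, and merge logic below are hypothetical assumptions, not the Medical School's actual systems or schema.

```python
import csv
import io

# Simulated "silos": each system returns partial records keyed by student ID.
# (Illustrative data only; real sources would be databases or APIs.)
registrar = {"S001": {"name": "A. Student", "cumulative_grade": "Pass"}}
assessments = {"S001": {"exam_avg": 87.5, "comment_count": 2}}

def merged_record(student_id):
    """Combine fields from each silo into a single performance record."""
    record = {"student_id": student_id}
    for source in (registrar, assessments):
        record.update(source.get(student_id, {}))
    return record

def to_csv(records):
    """Export merged records to a portable file format (CSV)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

merged = merged_record("S001")
# merged now holds fields from both sources in one record,
# ready to render in the browser or hand off via to_csv([merged]).
```

The design choice mirrored in the real application: one merge step behind one interface, so users never forage across systems themselves.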
Learning through Prototypes
Functional prototypes can be costly and aren't always possible on our projects, but on this project I used simple HTML prototypes to introduce new concepts to users and collect feedback. Later in development, I would show demos on our test server to users and solicit feedback that way.
I demoed my prototypes for users and collected feedback. This level of transparency alleviated customer anxiety and helped us learn about key requirements that wouldn't have surfaced in discovery. My project manager and I also sought out other use cases that might be served by our solution. The positive demos and additional use cases reassured the members of the team who thought it too risky to do anything but satisfy the customer's explicit request.
As we transitioned into interface design and development planning, the team agreed that the best solution would be a new application that provided one interface to view data from many sources. We began development with lo-fi design assets (wireframes and workflows to communicate structure) and iterated on the visual design through cross-disciplinary pairing. To be clear, the GUI is an important component of the project, but my primary focus was on ensuring that we solved the right problem and delivered structured, accessible information.
Pivoting to the Next Big Thing
Student Performance Record was a major success for my team. Launching an initial release in Summer 2016 enabled my unit to conduct further testing and richer user research, which led to the rapid evolution of our medical education application portfolio. By 2017, I had moved on to another area of the portfolio, but Student Performance Record was pivoted to become a core offering in our Undergraduate Medical Education Suite of applications. Core data views, including student information, assessment scores and comments, and cumulative grades, continue to be used by Medical School administrators, deans, academic review board members, academic counselors, and instructors.