Assessment Instrument Administration

Overview

customer: University of Michigan Medical School, 2016

definition: Web-based medical student performance assessment form builder, instrument testing platform, and instrument version manager

impact: Eliminated dependency on IT for curriculum delivery in medical school; consolidated instrument management to enable single-sourcing and reuse in delivery of performance assessments

ux methods: Concept modeling, sketching, paired design, prototyping, observation and interviewing, product demos, user assistance training

ux deliverables: Visual models, sketches, agile user stories, system maps, documentation, mockups 


 

Background

In 2014, the U-M Medical School began transitioning to a new curriculum. The old curriculum, its structure largely unchanged for over 100 years, generated a high volume of summative student performance data through quizzes and exams. Summative performance data indicated mastery of topical material (e.g., epidemiology, anatomy, the gastrointestinal system), but did little to help educators and students alike pinpoint areas for growth in the more nebulous aspects of doctor training, such as critical problem-solving in patient care, collaboration, and, of course, bedside manner. In concert with the goals of the new curriculum, more frequent assessment of students was planned; these assessments would generate formative feedback that could help students focus their doctoring practice and study habits.

 
 

Problem

Medical School administrators and educators could not alter or create assessment instruments without the assistance of IT staff. Instruments were hard-coded into the assessment scheduling and tabulation applications, which made changing an existing assessment or deploying a new one difficult. In practice, my team spent much of its time coding assessment forms, revising wording, and testing assessments, when our expertise was really in designing, building, and launching software. In turn, the people who were actually experts in assessment instruments were far removed from the process.

Business process flow depicting the "current" problem state and the future state. The area highlighted in red is the portion of the process our project focused on improving.

 

Solution

Medical School administrators and educational staff now have unimpeded access to assessment instrument authorship and management through a web application we built for creating and managing instruments. After analyzing our existing assessment scheduling and outcomes systems, we identified a gap between them and explored what solutions might fill it.

User interviews revealed a modular business process, carried out by different types of users, spanning instrument authorship and management, scheduling, and outcomes analysis. Rather than extend either the scheduling or the outcomes application, we designed a new web application dedicated to instrument authorship and management.
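
To make the single-sourcing idea concrete, here is a minimal sketch of how a versioned instrument record might be shared between the scheduling and outcomes systems. All names here are hypothetical illustrations, not the actual U-M data model:

```typescript
// Hypothetical sketch of a single-sourced, versioned instrument record.
// None of these names come from the actual U-M system; they only
// illustrate the single-sourcing and reuse described above.

interface InstrumentVersion {
  instrumentId: string; // stable identity shared across versions
  version: number;      // incremented on each published revision
  title: string;
  status: "draft" | "published" | "retired";
  sections: Section[];
}

interface Section {
  topic: string; // e.g. "collaboration", "bedside manner"
  items: Item[];
}

interface Item {
  prompt: string;
  responseType: "likert" | "freeText" | "checklist";
}

// Downstream systems reference a published version by (id, version),
// so scheduling and outcomes always agree on which form was delivered.
function referenceFor(v: InstrumentVersion): string {
  return `${v.instrumentId}@v${v.version}`;
}
```

Under this kind of model, an instrument is authored once, versioned on publication, and consumed by reference everywhere else, rather than being hard-coded into each application.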

 

Early product concepts

An early concept design in which users would select assessment components by topic in a checklist, submit the checklist to the application, and receive the completed assessment instrument. This concept mirrors how users operate some of our legacy software. While it's only a few clicks from nothing to a fully tricked-out assessment instrument, this concept hides and abstracts the end product from the user in a way that's not useful.

 

This later concept cuts down on abstraction by depicting a dynamic instrument editor (screen 3). A user still marks checkboxes to include desired content, but instead of a transactional "submit" action, the application would refresh to reflect the user's selections on the left. Likewise, topics could be rearranged and contents edited at will.
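
The difference between the two concepts comes down to when the instrument is computed. A minimal sketch, with hypothetical names, of the live-editor idea: the instrument preview is derived from selection state on every change, so there is no one-shot "generate" step.

```typescript
// Minimal sketch (hypothetical names) of the dynamic-editor concept:
// the instrument is derived from selection state, so every checkbox
// change is reflected immediately in the preview.

type Topic = { id: string; label: string; items: string[] };

interface EditorState {
  topics: Topic[];       // everything available in the left-hand list
  selected: Set<string>; // topic ids the user has checked
  order: string[];       // user-controlled arrangement of selected topics
}

// Derived view: recomputed on every state change rather than on "submit".
function renderInstrument(state: EditorState): Topic[] {
  return state.order
    .filter((id) => state.selected.has(id))
    .map((id) => state.topics.find((t) => t.id === id)!);
}

// Toggling a selection returns new state; a UI framework would simply
// re-render the preview from it. No transactional generate step.
function toggleTopic(state: EditorState, id: string): EditorState {
  const selected = new Set(state.selected);
  const order = [...state.order];
  if (selected.has(id)) {
    selected.delete(id);
  } else {
    selected.add(id);
    order.push(id);
  }
  return { ...state, selected, order };
}
```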

 

My contributions

I designed the instrument form builder in this application. My goal was an interface flexible enough to meet current assessment needs while enabling the invention of new assessment instruments as the curriculum continued to evolve. Our drag-and-drop form builder was informed by a pattern analysis of current and past assessment forms, by concept modeling and testing with users, and by a desire to build software that was transparent to users about how it worked.
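
One way to picture the flexibility goal is a schema-driven builder: the form is data, not code, so a new instrument is just a new composition of typed components. The sketch below is illustrative only; the names are assumptions, not the actual implementation.

```typescript
// Hypothetical sketch of the schema behind a drag-and-drop form builder.
// A form is data rather than code, so educators can compose new
// instruments without IT involvement. All names are illustrative.

type FieldType = "likertScale" | "freeText" | "checklist" | "heading";

interface FormField {
  id: string;
  type: FieldType;
  label: string;
  options?: string[]; // e.g. scale anchors for a Likert item
}

interface FormSchema {
  fields: FormField[]; // array order is the display order
}

// Drag-and-drop reduces to reordering the array behind the form.
function moveField(schema: FormSchema, from: number, to: number): FormSchema {
  const fields = [...schema.fields];
  const [moved] = fields.splice(from, 1);
  fields.splice(to, 0, moved);
  return { fields };
}
```

Because the renderer only interprets the schema, new instrument types can emerge from new arrangements of existing components rather than from new code, which is what keeps instrument authorship in the hands of assessment experts.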