Annotation Assignment

While working on GE Healthcare’s Edison AI Workbench, I was tasked with designing a better way for our customers to manage their image annotation process. The product already supported annotating images, but the workflow was inefficient and difficult to manage, so I set out to streamline it.

The Challenge

We had heard from our users that the existing annotation workflow was insufficient for some of their use cases. In particular, they had the following issues: 

  • Annotators had to manually locate the images they were assigned. 

  • There was no way to differentiate between annotations intended for different uses. 

  • There was no way for managers to track when annotators had completed their annotations. 

Initial Design

To solve these problems, I introduced the concepts of Projects and Tasks. 

Projects were buckets that could be used to isolate annotations intended for a specific purpose. The idea was that a manager would create a project for each algorithm they were developing. Annotations done within that project would only be used to train that algorithm, and users wouldn’t have to worry about other annotations on those images contaminating their training data.

Tasks were discrete assignments of images to be annotated by a specific annotator. A manager could create a task inside a project, and the assigned annotator could begin annotating directly from that task. It also allowed the manager who assigned the work to easily track annotation progress. Related tasks, such as the annotation and then review of the same data, could be grouped into a Job.
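
To make the model concrete, here is a minimal sketch of how these entities might relate, written in TypeScript. All names and fields are illustrative assumptions, not the actual Edison AI Workbench schema:

    // Illustrative sketch only; names and fields are assumptions,
    // not the actual product schema.
    type TaskStatus = "not_started" | "in_progress" | "complete";

    interface Project {
      id: string;
      name: string;           // e.g. the algorithm being developed
      imageIds: string[];     // data imported into the project
      annotatorIds: string[]; // annotators imported into the project
    }

    interface Task {
      id: string;
      projectId: string;      // tasks live inside a project
      jobId?: string;         // related tasks (annotate, then review) share a job
      kind: "annotation" | "review";
      annotatorId: string;    // the single assigned annotator
      imageIds: string[];     // the images assigned to this task
      status: TaskStatus;     // lets managers track progress
    }

    interface Job {
      id: string;
      projectId: string;
      taskIds: string[];      // e.g. an annotation task and its review task
    }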

By utilizing Projects and Tasks, our users would be able to manage assignments within our application, keep annotations scoped to their intended purpose, and track the progress of annotation work. 

User Feedback

Once the concept was developed, we put an MVP in front of our users to get their feedback. The overall reception was excellent. They appreciated the ability to organize and track their work, and they saw the potential of the Projects concept to incorporate additional functions, such as model training and testing, in the future.

Our users were also able to give us great insight into how we could improve the experience going forward. They gave us the following feedback: 

  • Projects, while powerful, were cumbersome to set up, as annotators and data had to be imported before tasks could be created. 

  • Tasks were inflexible: each task could only be assigned to a single annotator, and users wanted to distribute the annotation work across multiple annotators.

Revised Design

Having obtained this valuable feedback, I began to brainstorm ways to streamline project and task creation, while also providing additional flexibility. 

I eventually landed on a design that would allow users to specify data and annotators as part of task creation, without importing them into the project first. Images and annotators that were added to a task would automatically be imported, without any additional action from the user.
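
Continuing the earlier sketch, the create-task flow might import on the fly along these lines. The function and its behavior are hypothetical, a sketch of the idea rather than the real implementation:

    // Hypothetical sketch: creating a task imports any images or annotators
    // the project hasn't seen yet, with no separate import step for the user.
    function createTask(
      project: Project,
      annotatorId: string,
      imageIds: string[]
    ): Task {
      for (const id of imageIds) {
        if (!project.imageIds.includes(id)) project.imageIds.push(id);
      }
      if (!project.annotatorIds.includes(annotatorId)) {
        project.annotatorIds.push(annotatorId);
      }
      return {
        id: crypto.randomUUID(),
        projectId: project.id,
        kind: "annotation",
        annotatorId,
        imageIds,
        status: "not_started",
      };
    }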

I also modified the task creation design so that a user could select multiple annotators and request additional annotation passes. During the annotation process, annotators would be shown images dynamically, rather than being assigned a fixed set of images at the start of the task. This allowed a team of annotators to work together on their own schedules and complete the annotation process more efficiently.
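
As a sketch of how that dynamic assignment might work (the pass count, record shape, and function below are all hypothetical): a shared task carries a pool of annotators and a desired pass count, and each annotator is served the next image that still needs a pass and that they have not already annotated.

    // Hypothetical sketch of dynamic image dispatch for a shared task.
    interface SharedTask {
      imageIds: string[];
      annotatorIds: string[]; // the pool of annotators sharing the task
      passes: number;         // annotations wanted per image
    }

    interface AnnotationRecord {
      imageId: string;
      annotatorId: string;
    }

    // Serve the next image that still needs a pass and that this annotator
    // hasn't already handled, so each pass comes from a different person.
    function nextImage(
      task: SharedTask,
      annotatorId: string,
      done: AnnotationRecord[]
    ): string | undefined {
      return task.imageIds.find(imageId => {
        const records = done.filter(r => r.imageId === imageId);
        return records.length < task.passes &&
          !records.some(r => r.annotatorId === annotatorId);
      });
    }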

This revised design made Projects the preferred way for users to assign and annotate images, and set us up to expand the concept and add more value in the future.