Humanloop

WEBSITE

SAAS

AI

Simplifying data annotation interfaces for no-code software that trains AI to understand data

Humanloop is a SaaS company offering software for training and building advanced AI to understand, label, and annotate data. Over the course of six weeks, I worked on a team tasked with improving the annotation interface, with the goal of making the product more efficient and user-friendly.

Problem

Humanloop has a single data annotation interface, whether it is being used by an experienced data analyst or a brand-new user. For inexperienced users, the interface can feel cluttered, overwhelming them with unnecessary choices and information.

Solution

We simplified all data annotation interfaces and created a new process for using sample data provided by Humanloop. We also designed a brand-new rapid annotation screen for inexperienced users and for users who do not need the full capability of Humanloop's software.

Key takeaway

Designing for an emerging technology like AI was a great learning experience. Even though it wasn't a topic I had much experience with or interest in at the time, applying user-centered design methodology still enabled me to make small, positive design changes that resulted in a better user experience.

"This would be really impactful. We're constrained by time to really think through and design and iterate… these features are crucial to our company's offering."

Jordan Burgess, CPO of Humanloop

Learning about competitors

A competitive analysis of six data annotation platforms surfaced potential opportunities for growth, which we synthesized into an easily digestible presentation:

1. A more robust help section

2. Video, image, and audio annotation

3. Client testimonials and reviews

4. Visual, animated examples of main product features

Identifying current design issues

I completed a heuristic evaluation of Humanloop's current data annotation interface and prepared suggestions for improvement, including:

1. Ensuring buttons and labels meet contrast guidelines

2. Using more user-friendly terms throughout

3. Placing shortcuts consistently on buttons

4. Removing redundant links

Visualizing the steps

Using the three user stories provided by Jordan as a guide, I created a user flow. This took a number of iterations, as the flows include both a demo experience for users trying out the platform and full annotation experiences for users with data and labels:

I don't have labels or data and just want to look around by using sample labels on sample data.

I have data, but need to create labels.

I have data and labels, but need to apply the labels.

Striving for consistent solutions

After sketching out some initial ideas, our team worked together to create low-fidelity wireframes. One challenge was maintaining consistency across different designers; in retrospect, we should have used a wireframe kit to speed up the process.

Working within an existing design system

After receiving positive feedback from our client, we moved on to creating high-fidelity screens. Because Humanloop already has a design system library, certain aspects of the designs, like colors, typography, and button styles, were already predetermined.

Ensuring a smooth handoff

We annotated our high-fidelity screens to prepare them for developer handoff. Because not all features could be fully conveyed through the screens themselves, we used careful annotations to remove uncertainties and make the designs as clear as possible.

Key metrics

Potential key metrics would include:

  • Average time on task per manual annotation

  • Task success rate for manual annotations

  • Improvement in System Usability Scale (SUS) score

  • Number of users who add a subscription after using the sample data

Bringing value

These improvements have the potential to bring value in a number of ways. Allowing new users to try the software with both sample data and sample labels lets them understand its performance in a low-stakes, hands-on way, which could lead to more subscribers. Lowering the average time on task per manual annotation while improving the task success rate could make users more efficient, saving them time and highlighting the benefits of Humanloop's software.

Reflecting on Humanloop

Designing in a team

Designing in a team required constant planning and open lines of communication, and I appreciated the different perspectives and skill sets my teammates brought. Some steps could have gone more smoothly with better communication, but that was part of the process of learning to work together.

Emerging technologies

Designing for an emerging technology like AI was a great learning experience. Even though it wasn't a topic I had much experience with or interest in at the time, applying user-centered design methodology still enabled me to make small, positive design changes that resulted in a better user experience.

DESIGNED IN SAN FRANCISCO

Let's chat!

Likely topics include design, fantasy football, retro Nikes, fatherhood, sour candy, and Frank Ocean
