What we learned from introducing interactive automation to our users

Kognic provides fast annotations while keeping quality high. An important factor in making sure this stays true is the use of efficient annotation workflows that leverage interactive automation features. The goal of introducing automation tools to the annotation workflow is to reduce the time and effort spent on annotating while maintaining data quality. In this article, we share the main insights we gathered from making interactive machine learning services part of the annotation user experience.

Through the years we have grown to have a large number of annotators spread across the world. One type of data we work a lot with is a fusion of 2D images and LiDAR 3D point clouds. In order for our annotators to successfully annotate images, videos and sequences, they receive definitions of what to annotate and guidelines on how to do so. Additionally, we continuously develop tools, features and functions that assist the annotators in their annotation workflows.

We have put a lot of effort into reducing time-consuming and inefficient interactions within the user workflows. To help the annotators work more efficiently, we have provided them with a variety of interactive automation features.

Using integrated machine learning services to create more efficient annotation workflows

The Engineering Team at Kognic was tasked with creating a drawing tool that would allow the annotators to spend less time drawing and adjusting cuboids while annotating objects in LiDAR 3D point clouds. This tool, which we call The Machine Assisted 3D Box tool, calculates the size, position, and rotation of objects. The annotator provides an initial judgment while the machine learning service encapsulates all the points of the object. This way we draw on the different strengths of human and machine; a rough sketch of this division of labor follows below.
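
To make the division of labor concrete, here is a minimal, illustrative sketch of how such a tool could work. It is not Kognic's production algorithm: the function name, the seed radius, and the radius-cut clustering shortcut are all assumptions made for the example. The annotator supplies a rough center for the object, and the routine tightens a yaw-aligned box around the nearby LiDAR points, estimating the heading with PCA.

```python
import numpy as np

def fit_cuboid(points: np.ndarray, seed_center: np.ndarray, seed_radius: float = 3.0):
    """Fit a yaw-aligned cuboid around the LiDAR points near an annotator's seed.

    Illustrative sketch only: a real tool would use proper clustering,
    ground removal, and a learned model rather than a radius cut and PCA.
    points      -- (N, 3) array of x, y, z coordinates
    seed_center -- the annotator's rough (x, y, z) guess for the object
    """
    # Keep only the points near the annotator's initial judgment.
    near = np.linalg.norm(points[:, :2] - seed_center[:2], axis=1) < seed_radius
    cluster = points[near]
    if len(cluster) < 2:
        raise ValueError("too few points near the seed; ask the annotator to re-seed")

    # Estimate the heading from the dominant horizontal direction (PCA).
    xy = cluster[:, :2] - cluster[:, :2].mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(xy.T))
    major = eigvecs[:, np.argmax(eigvals)]
    yaw = np.arctan2(major[1], major[0])

    # Rotate into the box frame and take extents that encapsulate all points.
    c, s = np.cos(yaw), np.sin(yaw)
    to_box = np.array([[c, s], [-s, c]])          # rotation by -yaw
    local = xy @ to_box.T
    size = np.array([np.ptp(local[:, 0]),         # length
                     np.ptp(local[:, 1]),         # width
                     np.ptp(cluster[:, 2])])      # height
    center = np.array([*cluster[:, :2].mean(axis=0),
                       (cluster[:, 2].min() + cluster[:, 2].max()) / 2])
    return center, size, yaw
```

The point of the split is that the human is good at the coarse judgment (which blob of points is the object) while the machine is good at the fine adjustment (tightening the box around every point).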

In addition to The Machine Assisted 3D Box tool, a set of tracking features were developed for 3D cuboids. These features adjust the object’s position, rotation, and size through a point cloud sequence. This decreases the time our annotators have to spend adjusting the cuboids in each frame of the sequences and therefore saves time while keeping the annotation quality high. 
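
The idea behind such tracking can be sketched in the same spirit: predict where the cuboid should be in the next frame, then let the point cloud correct the prediction. Again, this is a toy illustration under assumed names, not the actual tracking implementation, which combines more signals than a constant-velocity prior and a centroid snap.

```python
import numpy as np

def propagate_cuboid(center, yaw, velocity, next_points, window: float = 3.0):
    """Carry a cuboid one frame forward through a point cloud sequence.

    Toy illustration: a constant-velocity prediction refined by the
    centroid of the points it lands on. Real tracking would also update
    rotation and size, and handle occlusion.
    """
    predicted = center + velocity  # constant-velocity prior

    # Refine the position using the points inside the prediction window.
    near = np.linalg.norm(next_points[:, :2] - predicted[:2], axis=1) < window
    if not near.any():
        return predicted, yaw      # nothing to refine against; keep the prior
    refined = next_points[near].mean(axis=0)
    return refined, yaw
```

With something like this in place, the annotator no longer re-places the box in every frame; they only correct the frames where the prediction drifts.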

"In the early stage of introducing the users to the interactive automation tools we had a low success rate in getting the users to utilize these features."

User experience findings and key takeaways

In the early stage of introducing the users to the interactive automation tools, we had a low success rate in getting the users to utilize these features. This was the case even though we could conclude, both from observations and user metrics, that these tools sped up the annotation workflows.

To investigate the low adoption rate, we held interviews and sent out surveys to the users, but the response wasn't positive. The users had experienced these new tools as a slower and less accurate alternative and simply couldn't see how they would benefit their workflow. So we were left with a solution the engineers liked and the users ignored. This led us to question which factors had landed us in this unfortunate situation.

User expectations

The users didn't appreciate the automation tools because working with them didn't deliver the expected result: efficiently created, high-quality cuboids. When the tools didn't meet their expectations, the users preferred to work with the manual tools and stay in control of the result.

How could we have prevented the negative experiences with the machine-assisted tool? According to Google's People + AI Guidebook, users build trust in a system when it is explained to them. You decide at what level to explain it, whether that means how the data behind the automated interaction works or how users can fit it into their personal workflows. Providing users with this information helps them form mental models that set reasonable expectations of how to interact and work with the system.

When we first introduced the interactive automation tools to the users, our engineers had some knowledge about what to expect from the interactions. They knew the limitations of the automation tools, along with how they were meant to make the user workflows more efficient. However, it seems as though we failed to prepare and inform the users well enough to build the necessary trust in the tools.

Communicating with the users

What we had learned so far was that the automated interactions did not replace the users' own judgments in the way they expected. This left the users disappointed and a little confused as to why they should use the tools at all. How could we have communicated and prepared them better for the new types of interactions these tools came with?

Lessons we’ve learned so far:

  1. Be honest about what the feature can and cannot achieve. Allow for open communication between the engineers and users without going into detailed information about how the automation works in the code. Furthermore, keep the users in the loop about the evolution of the automated features and functions. This might be easier said than done, but striking that balance can make a big difference in how the users approach the new tools.
  2. Set expectations by communicating, through visual and written instructions, how to work with the interactive automation tools. If there are required steps, or steps that work better than others, share these with the users and spare them unnecessary trial and error. According to Amy Schade at the Nielsen Norman Group, providing instructions and information through multiple mediums such as video, images, and text, preferably a combination of all, allows each individual user to choose their preferred method of learning.
  3. Guide the users by communicating feature availability in the user interface. By doing this, you make users aware of the feature instead of relying on them to recall its existence. Our preferred ways of doing this are context menus, instruction boxes, and states.

By being honest, setting expectations, and using visual affordances, you prepare and guide the users through the workflows. However, it is equally important that an automated workflow is more efficient and less complex than the manual workflow it replaces.

"...make sure that the performance of the automated interactions bring positive, noticeable changes that the users appreciate."

Automation performance

If you introduce a new workflow to the users, it is important that they understand why it has been introduced. Changing a habit is never appreciated amongst the users, especially when they can't see what's "wrong" with the old one. A lesson we learned was to make sure that the automation features and functions are easier to work with than the manual ones. More specifically, we have to make sure that the performance of the automated interactions brings positive, noticeable changes that the users appreciate.

Preparation and good communication can lead to a greater level of user acceptance when interacting with automation. However, it is difficult to change the users' feelings towards tools, features, and functions if their first experience was negative, and that in turn affects how willing they are to work with them in the future. Therefore, if the automation feature you want to implement still has usability shortcomings, start by introducing it to a small group of users. This allows them to test it and provide feedback before it becomes part of everyone's workflow.

Conclusion

So what did we learn from introducing this interactive automation into the annotation workflows of our users?

  • Be honest about what the feature can and cannot achieve.
  • Set expectations by communicating, through visual and written instructions, how to work with it.
  • Guide the users by communicating feature availability in the user interface.
  • Make the interactive automation tools, features, and functions easier to work with than the manual ones.
  • Introduce the interactive automation tool, feature, or function to a small group of users first and allow them to test it and provide feedback.