Part 1 — Why We Built an Annotation Tool
This is the first article in a series for our Sequence Annotation Tool. Read on to learn more about our journey along the learning curve.
At Sequence, we provide text and image annotation services for machine learning projects. Our most common use cases involve image annotation at high volumes.
Annotation is a core part of our platform, and our decision to build our own tool is about laying the right foundation for future work. For us, it's a win-win: by investing in our own annotation tool, we can better serve the needs of our clients and our contributors while working with a tool that's just the right fit.
We wanted our Annotation Tool to be:
1. Web-based
We needed the annotation tool to run in the browser. Our platform is web-based, which allows anyone to work on it as long as they have internet access. With no software to install and no manual transfer of work files from desktop to web, we can simplify the experience of working on a task.
2. Flexible
We wanted to be able to configure shortcuts, add specific shapes in the future (beyond bounding boxes and polygons), and tailor the behaviour of the tool to our business use cases. Since contributors spend most of their time on the platform inside this tool, controlling its functionality at a granular level gives us an opportunity to refine details and experiment with new ideas.
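As a rough illustration of what this configurability looks like, here is a minimal sketch of a shortcut map with user overrides. The names (`AnnotationAction`, `defaultShortcuts`, `resolveShortcuts`) are hypothetical and not from the actual Sequence codebase:

```typescript
// Hypothetical sketch of configurable shortcuts for an annotation tool.
type AnnotationAction = "select" | "drawBoundingBox" | "drawPolygon" | "deleteShape";

type ShortcutMap = Record<string, AnnotationAction>;

// Sensible defaults, keyed by the keyboard key that triggers each action.
const defaultShortcuts: ShortcutMap = {
  v: "select",
  b: "drawBoundingBox",
  p: "drawPolygon",
  Delete: "deleteShape",
};

// Merge user overrides on top of the defaults, so bindings can be
// tailored per project without changing the tool itself.
function resolveShortcuts(overrides: Partial<ShortcutMap>): ShortcutMap {
  return { ...defaultShortcuts, ...overrides };
}

const custom = resolveShortcuts({ r: "drawBoundingBox" });
console.log(custom["r"]); // "drawBoundingBox"
console.log(custom["v"]); // "select" (default preserved)
```

The same pattern extends naturally to new shape types: adding a member to `AnnotationAction` is a type-checked, one-line change.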
3. Robust
To guarantee the same platform behaviour across browsers, and under slower network conditions, we think an in-house build is the right choice for us. Small bugs can snowball into significant negative experiences on the platform, so it was important for us to be able to run extensive tests against the tool and make it easy to ensure the quality we want to provide.
4. Fast
Our goal is to limit delays and lag as much as possible. We pride ourselves on keeping the platform feeling responsive, especially on older machines. Building our annotation tool in-house lets us make sure the tool runs smoothly for contributors working on low-powered computers.
These constraints helped us focus on what we wanted for our platform and gave us clarity on the requirements we needed to best serve our users.
We launched the tool on our platform in September 2018, and the results have been positive. We learned a lot from the process, and if you’d like to read more about our learnings from the build, stay tuned for the follow-up article that’s coming soon: Part 2 — Annotation Tool: Retrospective on the Build.