A world where "click-working" is not only rewarding, but fun and exciting!
This project was sponsored by Bosch RTC. Bosch RTC is working on a new internal project that aims to replace the rote tasks currently completed on Amazon's Mechanical Turk with a platform of contractors, or annotators, who are paid more fairly and given more enjoyable work than Mechanical Turk workers.
The current plan is for Bosch internal teams to migrate to the new platform (Bosch currently uses Amazon Mechanical Turk for much of its autonomous vehicle and machine learning labeling work), then open it up to Bosch partners, and eventually open the platform to external developers.
Because this is a two-sided marketplace, we designed a new gamified, mobile annotation platform that annotators around the world could enjoy working on, delivering a great user experience to incentivize them to create high-quality annotations. We also designed a new streamlined platform that developers could leverage to create and manage annotation tasks.
Crowd workers today perform annotation tasks on platforms like Mechanical Turk for low wages. Apart from the low monetary incentive, annotation tasks are repetitive and boring, and the results are often of low quality.
These annotation tasks feed Bosch's autonomous (self-driving) vehicle algorithms, where even slightly low-quality annotations could be fatal!
Our goal: to investigate breakdowns in the current task flow on both sides of the marketplace, delivering a delightful user experience that incentivizes users to provide high-quality annotations.
How Bosch operates today
It is evident that there are several breakdowns in the way Bosch RTC operates!
Market size of autonomous annotations
The Total Addressable Market is currently $137M. This figure is based on our estimates across a wide variety of annotation applications, from health care to autonomous driving to manufacturing.
The Serviceable Attainable Market, based solely on automotive annotations, is currently $53M.
Our estimated target market projection for 2021 is $12M, based on 30% year-over-year growth of internal annotation requests within Bosch. This is a conservative estimate, based on our current information about Bosch's annotation capabilities with the Bangalore team and its use of third-party platforms.
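To make the projection method concrete, a year-over-year growth estimate like this one simply compounds a base figure forward. A minimal sketch, assuming a hypothetical base value and horizon (the report's actual inputs are not given in the text):

```python
# Sketch of a simple year-over-year growth projection, like the one behind
# the $12M target-market estimate. The base value and number of years below
# are hypothetical placeholders, not figures from the report.
def project(base_value_m, growth_rate, years):
    """Compound a starting market size ($M) forward at a fixed annual rate."""
    return base_value_m * (1 + growth_rate) ** years

# e.g. a hypothetical $5.2M internal annotation spend growing 30% per year
# for 3 years:
print(round(project(5.2, 0.30, 3), 2))  # → 11.42
```

The same one-liner reproduces any of the market-sizing figures once the base spend and horizon are known.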
Developer Persona: Susan Sanders
Susan is a developer at BMW in Germany. BMW plans to roll out its own line of autonomous vehicles and requires almost 600,000 images of real-time traffic data to be annotated with 100% accuracy.
Currently, BMW submits this data to Amazon MTurk, but there are two main problems:
Annotator Persona: Ravi Kumar
Ravi works as a sales assistant in Bangalore, India during the day and spends an average of 3 hours a day working on Amazon Mechanical Turk. He has completed almost 100,000 tasks and has been rejected 24 times. Of these 24 rejections, one request was in a language foreign to him, and he understands that this was his fault. The rest were the result of requesters scamming the system to get work done for free.
This frustrates Ravi, who puts in the effort to ensure the quality of his work. He wishes a platform existed where annotators could communicate to developers so as to minimize misunderstandings and navigate potential language barriers.
We used a radar diagram to visualize our competitive analysis, to be consistent with Bosch RTC. We performed our analysis based on several decision criteria:
Competitive Analysis on the autonomous annotation industry
Our final deliverable, and all of our decisions throughout the project, were informed by users. We interviewed industry and academic experts (developers) and potential users (annotators).
"Annotation quality on MTurk is really bad because the instructions are not comprehensive enough. Annotators cannot provide high quality results even if they wanted to!" Eshed Ohn-Bar, Postdoctoral Fellow, Robotics Institute, CMU
"The GUI on existing platforms like Crowd Flower are horrible." Eshed Ohn-Bar, Postdoctoral Fellow, Robotics Institute, CMU
"I currently use MTurk. One reason that would convince me to immediately switch services is the number of scammers on MTurk." Gunnar Sigurdsson, PhD Student, advised by Abhinav Gupta, Robotics Institute, CMU
"There should be a system to remove poor requesters. For example, after so many negative reports they are suspended. If it is found that they are fraudulent, then they are removed or never allowed back on the platform. Requesters are starting to think they can get free work." Annotator 92
"I am part of a community to see things I didn't see on the platform yet." Annotator 64
"I have 24 rejections in nearly 100,000 submitted tasks. Only one was because of something I did wrong. The rest were mistakes from the requester that were never rectified or straight-up scams." Annotator 57
A seamless data-submission dashboard, including a preset list of annotation tasks with an option for task customization as necessary, plus real-time review and task-progress updates for quality assurance.
Developer Dashboard to streamline task creation and management
Storyboard for Bosch Annotate
"I would use this. I have always wanted a platform like this on mobile that I could use when I'm bored, like when I'm on the bus coming to school." Master's Student, Robotics Institute, CMU
"I never knew such platforms even existed. I currently work as a bartender after school. I would be interested in this app. While making the world a better place sounds great, I would still use the app for the money!"
An annotation platform for both the web and smartphones, using gamification to incentivize users.
MVP - UI Click Through
As part of our final deliverable to Bosch RTC, we consolidated all our findings and insights into a Design Guidelines document covering the design process and design thinking behind all aspects of the MVP. These categories range from onboarding, to the gacha mechanic, to the feedback platform.
Design Guidelines for Bosch Annotate. Download the document here
The new product will automatically assign workers tasks based on their performance and efficiency with certain task types. These per-image cost estimates include semantic segmentation and 20 bounding boxes. Our new total costs are now competitive with leading platforms such as Scale API.
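The performance-based assignment described above can be sketched as a simple scoring rule: blend each worker's historical accuracy and speed on a task type, then route the task to the highest scorer. The weights, field names, and stats below are illustrative assumptions; the platform's actual matching logic is not specified here.

```python
# Minimal sketch of performance-based task assignment. The 0.7/0.3 weighting
# and the worker records are hypothetical, for illustration only.
def score(stats, task_type, accuracy_weight=0.7):
    """Blend a worker's historical accuracy and speed for one task type."""
    s = stats.get(task_type)
    if s is None:
        return 0.0  # no history with this task type yet
    return accuracy_weight * s["accuracy"] + (1 - accuracy_weight) * s["speed"]

def assign(workers, task_type):
    """Pick the worker with the best blended score for this task type."""
    return max(workers, key=lambda w: score(w["stats"], task_type))

workers = [
    {"name": "A", "stats": {"bounding_box": {"accuracy": 0.95, "speed": 0.6}}},
    {"name": "B", "stats": {"bounding_box": {"accuracy": 0.80, "speed": 0.9}}},
]
print(assign(workers, "bounding_box")["name"])  # → A
```

Weighting accuracy above speed reflects the project's emphasis on annotation quality: for safety-critical autonomous-driving data, a fast but sloppy annotator should lose the assignment.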
Pricing Plan for Bosch's new Internal Platform
Crowdsourcing jobs have always been viewed with negative sentiment; people have formed mental models comparing crowd workers to robots working in factories. By moving away from the desktops on which crowdsourcing platforms have always been built, toward mobile, and by designing a meaningful, interesting user experience, we can not only expect high-quality results from existing crowd workers, but hopefully also create a new, diverse segment of crowd workers and, in doing so, redefine the industry!