Segmentation
Last updated
This is where you can view the productivity and quality of your annotators' work, and see the types of mistakes they are making.
To see annotator performance for a task, click on the task name.
a) Users: Number of users who have attempted the task.
b) Manhours: Total time spent by all users on this task.
c) Avg. Making Time: Average time a user spent making (annotating) a job in this task.
d) Avg. Editing Time: Average time a user spent editing a rejected job in the second loop.
e) Questions solved: Total jobs solved in this task.
f) Questions edited: Total jobs that went through the second loop in this task (i.e. were rejected and re-edited).
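The productivity metrics above can be sketched as simple aggregations over per-job time records. This is a minimal illustration, not the platform's actual computation; the job records, field names, and times below are hypothetical.

```python
from statistics import mean

# Hypothetical job records: seconds spent making each job, and seconds
# spent editing it in the second loop (None if it was never rejected).
jobs = [
    {"user": "alice", "making_s": 120, "editing_s": 30},
    {"user": "bob",   "making_s": 90,  "editing_s": None},
    {"user": "alice", "making_s": 150, "editing_s": 45},
]

users = len({j["user"] for j in jobs})                    # distinct users who attempted the task
manhours = sum(j["making_s"] + (j["editing_s"] or 0) for j in jobs) / 3600
avg_making = mean(j["making_s"] for j in jobs)            # Avg. Making Time (seconds)
edited = [j["editing_s"] for j in jobs if j["editing_s"] is not None]
avg_editing = mean(edited) if edited else 0               # Avg. Editing Time (seconds)
questions_solved = len(jobs)
questions_edited = len(edited)                            # jobs that went through the second loop
```

Here "Questions edited" is a subset of "Questions solved", since a job only enters the second loop after a reviewer rejects it.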
a) Pixel Accuracy: When a reviewer edits a job, the maker's and reviewer's versions are compared pixel by pixel to calculate accuracy. A pixel counts as correct if its value is unchanged; a change means either its class or its instance ID was edited.
b) Questions solved: Total jobs solved by the Maker.
c) Questions checked: Total jobs checked by the reviewers.
d) Questions rejected: Total jobs rejected by the reviewers.
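The pixel-accuracy comparison described above can be sketched as follows. This is an illustrative implementation, not the platform's code; it assumes each pixel is represented as a (class, instance ID) pair, so a change to either field makes the pixel incorrect.

```python
def pixel_accuracy(maker_mask, reviewer_mask):
    """Compare two masks pixel by pixel.

    Each pixel is a (class, instance_id) pair; a pixel is correct only if
    the reviewer changed neither the class nor the instance ID.
    """
    total = 0
    correct = 0
    for maker_row, reviewer_row in zip(maker_mask, reviewer_mask):
        for maker_px, reviewer_px in zip(maker_row, reviewer_row):
            total += 1
            if maker_px == reviewer_px:  # no class change, no instance ID change
                correct += 1
    return correct / total

# Hypothetical 2x2 masks: the reviewer relabels one "sky" pixel as "road",
# so 3 of 4 pixels are unchanged.
maker = [[("car", 1), ("road", 0)], [("car", 1), ("sky", 0)]]
reviewer = [[("car", 1), ("road", 0)], [("car", 1), ("road", 0)]]
acc = pixel_accuracy(maker, reviewer)  # 0.75
```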
You can see the types of mistakes happening in this task, which helps you give better feedback to annotators.
a) Class: The correct class for the object.
b) Mislabelled Class: The incorrect class marked by the annotator.
c) Mislabelled Pixels: The percentage of pixels mislabelled for that class.
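A mistake report like the one above can be sketched by treating the reviewer-corrected mask as the reference and tallying, per correct class, which wrong class the annotator used most and what fraction of that class's pixels were mislabelled. This is an assumed reconstruction for illustration, not the platform's actual logic.

```python
from collections import Counter

def mislabel_report(maker_mask, reference_mask):
    """For each correct class, report the most common wrong class the
    annotator used and the percentage of that class's pixels mislabelled."""
    per_class_total = Counter()   # correct class -> total pixels
    per_class_wrong = {}          # correct class -> Counter of wrong labels
    for maker_row, ref_row in zip(maker_mask, reference_mask):
        for maker_px, ref_px in zip(maker_row, ref_row):
            per_class_total[ref_px] += 1
            if maker_px != ref_px:
                per_class_wrong.setdefault(ref_px, Counter())[maker_px] += 1
    report = []
    for cls, wrong in per_class_wrong.items():
        mislabelled_class, _ = wrong.most_common(1)[0]
        pct = 100 * sum(wrong.values()) / per_class_total[cls]
        report.append((cls, mislabelled_class, round(pct, 1)))
    return report

# Hypothetical example: 1 of 4 "road" pixels was marked "sidewalk",
# so 25% of "road" pixels are mislabelled, most often as "sidewalk".
maker = [["road", "road"], ["sidewalk", "road"]]
reference = [["road", "road"], ["road", "road"]]
report = mislabel_report(maker, reference)  # [("road", "sidewalk", 25.0)]
```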
Tip: Go to the User Details tab to see all the metrics for a particular user.
Note: The second loop happens if a reviewer rejects a job.