Segmentation

This is where you can view the productivity and quality of your annotators' work, and see the types of mistakes they are making.

To see annotator performance for a task, click on the task name.

Productivity Metrics

a) Users: Number of users who have attempted the task.

b) Manhours: Total time spent by all the users in this task.

c) Avg. Making Time (see the sketch after this list):

Avg.\ Making\ Time = \frac{Total\ time\ spent}{Total\ images\ solved}

d) Avg. Editing Time:

Avg.\ Editing\ Time = \frac{Total\ time\ spent\ in\ the\ second\ loop}{Total\ images\ solved\ in\ the\ second\ loop}

👉 Note: The second loop happens if a reviewer rejects a job.

e) Questions solved: Total jobs solved in this task.

f) Questions edited: Total jobs that went through the task in the second loop.
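
As a rough illustration of the two averages above, here is a minimal sketch that computes them from hypothetical per-task totals. The function and field names are assumptions made for this example, not part of the platform.

```python
# Minimal sketch of the two productivity averages, using made-up totals.
# Function and parameter names are illustrative only; they are not the platform's API.

def avg_making_time(total_time_spent_sec: float, images_solved: int) -> float:
    """Average time (seconds) a maker spends per solved image."""
    return total_time_spent_sec / images_solved if images_solved else 0.0

def avg_editing_time(second_loop_time_sec: float, second_loop_images: int) -> float:
    """Average time (seconds) per image re-solved in the second loop after a rejection."""
    return second_loop_time_sec / second_loop_images if second_loop_images else 0.0

# Example: 5 hours of making across 300 images, and 40 minutes of editing across 20 rejected jobs.
print(avg_making_time(5 * 3600, 300))   # 60.0 seconds per image
print(avg_editing_time(40 * 60, 20))    # 120.0 seconds per edited image
```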

Quality Metrics

a) Pixel Accuracy: If the reviewer edits a job, the edited result is compared with the maker's result pixel by pixel to calculate accuracy. A pixel whose value is unchanged counts as a correct pixel; a change means either a class change or an instance ID change (see the sketch after this list).

Pixel\ Accuracy = \frac{Correct\ Pixels}{Total\ Pixels}

b) Questions solved: Total jobs solved by the Maker.

c) Questions checked: Total jobs checked by the reviewers.

d) Questions rejected: Total jobs rejected by the reviewers.
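
To make the pixel-level comparison concrete, below is a minimal sketch that assumes the maker's and reviewer's results are available as per-pixel class and instance-ID arrays. That data layout is an assumption for illustration, not the platform's internal format.

```python
# Minimal sketch of pixel accuracy, assuming each result is a pair of integer arrays
# of the same shape: one holding the class ID per pixel, one the instance ID per pixel.
# The data layout is illustrative; it is not the platform's internal format.
import numpy as np

def pixel_accuracy(maker_class, maker_instance, reviewer_class, reviewer_instance):
    """Fraction of pixels whose class AND instance ID were left unchanged by the reviewer."""
    unchanged = (maker_class == reviewer_class) & (maker_instance == reviewer_instance)
    return unchanged.sum() / unchanged.size

# Example on a tiny 2x2 "image": the reviewer changes the class of one pixel.
maker_cls  = np.array([[1, 1], [2, 2]])
maker_inst = np.array([[0, 0], [1, 1]])
rev_cls    = np.array([[1, 3], [2, 2]])   # one pixel reclassified
rev_inst   = np.array([[0, 0], [1, 1]])
print(pixel_accuracy(maker_cls, maker_inst, rev_cls, rev_inst))  # 0.75
```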

Classification of Mistakes

You can see the types of mistakes being made in this task, which will help you give better feedback to annotators. A sketch of how such a breakdown can be computed follows the list below.

a) Class: The correct class for the object.

b) Mislabelled Class: The incorrect class that the annotator marked instead.

c) Mislabelled Pixels: The percentage of pixels mislabelled for that class.
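
As a rough sketch of how such a per-class breakdown could be computed from the maker's mask and the reviewer-corrected mask, the example below reports the percentage of mislabelled pixels for each correct class. The masks and class IDs are made up for illustration.

```python
# Minimal sketch of a per-class mistake breakdown: for each correct class in the
# reviewer's mask, the percentage of its pixels that the maker labelled with another class.
# Masks and class IDs are made up for illustration.
import numpy as np

def mislabelled_pixels_by_class(maker_class, reviewer_class):
    """Map correct class ID -> percentage of its pixels that the maker mislabelled."""
    breakdown = {}
    for cls in np.unique(reviewer_class):
        in_class = reviewer_class == cls
        wrong = (maker_class != reviewer_class) & in_class
        breakdown[int(cls)] = 100.0 * wrong.sum() / in_class.sum()
    return breakdown

maker    = np.array([[1, 1, 2], [2, 2, 2]])
reviewer = np.array([[1, 2, 2], [2, 2, 2]])   # one class-2 pixel was labelled as class 1
print(mislabelled_pixels_by_class(maker, reviewer))  # {1: 0.0, 2: 20.0}
```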

Tip: You can go to the User Details tab to see all the metrics for a particular user.
