Machine Learning Workbench: Content Annotation
Content Annotation is used to identify and tag entities, concepts, and other important information. In Machine Learning Workbench > Content Annotation, you can enter a piece of text and see for yourself how relevant the entities defined in Taxonomy are to it. Alternatively, you can define your own entity values on the go. In the image, three entity values have been defined for the text.
Upon clicking Run Annotation, you can find the most relevant entity values for the text.
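Conceptually, Run Annotation scores each candidate entity value against the text and ranks the values by relevance. The product's actual scoring model is not documented here; the sketch below uses a toy occurrence-based measure, and the function name and scoring formula are illustrative assumptions.

```python
import re

def annotate(text, entity_values):
    """Toy relevance scorer (illustrative only): score each entity
    value by how often it appears in the text, normalized by the
    text's word count, then rank the values from most to least
    relevant. The real Content Annotation model is more sophisticated.
    """
    words = len(text.split())
    scores = {}
    for value in entity_values:
        hits = len(re.findall(re.escape(value.lower()), text.lower()))
        scores[value] = hits / words if words else 0.0
    # Most relevant entity values first
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranked = annotate("Motorola released a new Motorola phone.",
                  ["Motorola", "Nokia"])
```

With this toy measure, "Motorola" ranks first because it appears twice in the sample text, while "Nokia" scores zero.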
Clicking Run Again takes you back to the Content Annotation Playground where you can update the text or edit entity values.
This section allows you to manage content annotation requests submitted through Content Sources > Content Annotation. You can view and accept the entity values in annotation requests. You can also set a threshold so that only the entity values matching or exceeding it come up for your review.
To review annotation requests, pick a value from Select Annotation Request. It will fill List of Records with the documents that have been annotated.
Optionally, you can refine the list with Select Field and Keyword. Select Field limits the list to documents that have the selected field. To find only the records that contain a certain term, use Keyword. In the image, only the records that contain the keyword "Motorola" have been included.
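The two filters compose: a record must have the selected field and contain the keyword to stay in the list. A minimal sketch of that behavior, assuming records are simple dictionaries with a "text" field (the function name and record shape are illustrative, not the product's API):

```python
def filter_records(records, field=None, keyword=None):
    """Narrow a record list the way the Select Field and Keyword
    controls do: keep records that have `field`, then keep those
    whose text contains `keyword` (case-insensitive)."""
    result = records
    if field is not None:
        result = [r for r in result if field in r]
    if keyword is not None:
        result = [r for r in result
                  if keyword.lower() in r.get("text", "").lower()]
    return result

records = [
    {"title": "Phone review", "text": "The Motorola Edge impressed us."},
    {"text": "A quiet week in the markets."},
]
hits = filter_records(records, field="title", keyword="Motorola")
```

Here only the first record survives: the second lacks a "title" field and does not mention the keyword.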
Click on a record to view the following details:
Actual Tagged Entity Value(s) contains the entities that annotate the record.
Entity Values contains the relevance of the entities that annotate the record, captured in the Strength column. The more relevant an entity, the higher its Strength.
Annotation Rules, where you can set a minimum relevance threshold for entity values using the orange slider. Two rules are available:
Threshold Rule. Only the entities that match the threshold value are used to annotate the content. Check it to use it.
Strongest Entity Rule. Only the strongest entity is used to annotate the content. Check it to use it.
An entity that matches either of those rules is picked for annotating content.
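The two rules above can be sketched as a simple selection over entity values and their Strength scores. This is a minimal illustration, assuming a mapping of entity value to Strength; the function and parameter names are hypothetical, not the product's API:

```python
def apply_rules(entity_values, threshold=None, strongest_only=False):
    """Pick the entity values used to annotate content.

    entity_values: mapping of entity value -> Strength score.
    threshold:      Threshold Rule -- keep values whose Strength
                    matches or exceeds this value (None = rule off).
    strongest_only: Strongest Entity Rule -- keep the single
                    strongest value.
    A value that matches either enabled rule is picked.
    """
    selected = set()
    if threshold is not None:
        selected |= {v for v, s in entity_values.items() if s >= threshold}
    if strongest_only and entity_values:
        selected.add(max(entity_values, key=entity_values.get))
    return selected

picked = apply_rules({"Motorola": 0.9, "Nokia": 0.4, "Sony": 0.1},
                     threshold=0.3, strongest_only=True)
```

With a threshold of 0.3, "Motorola" and "Nokia" pass the Threshold Rule, and "Motorola" is also the strongest entity, so both are picked.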
The small button on the far right of the panel is Annotation Re-Submission History.
Click on it to view the person who requested an annotation, the time of the request, and the current status. The status can be Successful (the annotation has been accepted), Pending (the annotation is under review), Discarded (the annotation has been rejected), or Error (the annotation is incorrect).
Last updated: Tuesday, February 6, 2024