chunk

class ChunkEvaluator(label_list, suffix=False)

Bases: paddle.metric.metrics.Metric
ChunkEvaluator computes the precision, recall and F1-score for chunk detection. It is often used in sequence tagging tasks, such as Named Entity Recognition (NER).
- Parameters
label_list (list) – The label list.
suffix (bool) – If True, each label ends with ‘-B’, ‘-I’, ‘-E’ or ‘-S’; otherwise each label starts with them. Defaults to False.
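As an illustration of what the suffix flag controls, the following minimal, framework-free sketch (not the library's implementation) splits a tag into its chunk type and its B/I/E/S position marker under both conventions:

```python
def parse_tag(tag, suffix=False):
    """Split a sequence tag into (chunk_type, position).

    With suffix=False, tags put the position first, e.g. 'B-PER';
    with suffix=True, the position comes last, e.g. 'PER-B'.
    """
    if tag == "O":
        return None, "O"
    if suffix:
        chunk_type, _, position = tag.rpartition("-")
        return chunk_type, position
    position, _, chunk_type = tag.partition("-")
    return chunk_type, position

# Position-first convention (suffix=False)
print(parse_tag("B-PER"))               # → ('PER', 'B')
# Position-last convention (suffix=True)
print(parse_tag("LOC-E", suffix=True))  # → ('LOC', 'E')
```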
compute(lengths, predictions, labels, dummy=None)

Computes the precision, recall and F1-score for chunk detection.
- Parameters
lengths (Tensor) – The valid length of every sequence, a tensor with shape [batch_size].
predictions (Tensor) – The prediction indices, a tensor with shape [batch_size, sequence_length].
labels (Tensor) – The label indices, a tensor with shape [batch_size, sequence_length].
dummy (Tensor, optional) – Unused parameter, kept for compatibility with older versions whose parameter list was inputs, lengths, predictions, labels. Defaults to None.
- Returns
num_infer_chunks (Tensor) – The number of inferred chunks. num_label_chunks (Tensor) – The number of label chunks. num_correct_chunks (Tensor) – The number of correctly detected chunks.
- Return type
tuple
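To make the three returned counts concrete, here is a standalone sketch (an approximation, not the library routine) that extracts BIO chunks from a predicted and a gold tag sequence and derives the same three quantities:

```python
def extract_chunks(tags):
    """Return the set of (chunk_type, start, end) spans in a BIO tag list."""
    chunks, start, ctype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel flushes the last open chunk
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != ctype
        ):
            if start is not None:
                chunks.add((ctype, start, i))
                start, ctype = None, None
        if tag.startswith("B-"):
            start, ctype = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, ctype = i, tag[2:]  # tolerate an I- tag without a leading B-
    return chunks

pred = ["B-PER", "I-PER", "O", "B-LOC"]
gold = ["B-PER", "I-PER", "O", "B-ORG"]
infer_set, label_set = extract_chunks(pred), extract_chunks(gold)
num_infer_chunks = len(infer_set)                # → 2
num_label_chunks = len(label_set)                # → 2
num_correct_chunks = len(infer_set & label_set)  # → 1 (the PER span)
```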
update(num_infer_chunks, num_label_chunks, num_correct_chunks)

This function takes (num_infer_chunks, num_label_chunks, num_correct_chunks) as input and accumulates them into the running state of the ChunkEvaluator object. The update rule is:

\[\begin{split}\begin{array}{l}{\text{self.num\_infer\_chunks} \mathrel{+}= \text{num\_infer\_chunks}} \\ {\text{self.num\_label\_chunks} \mathrel{+}= \text{num\_label\_chunks}} \\ {\text{self.num\_correct\_chunks} \mathrel{+}= \text{num\_correct\_chunks}}\end{array}\end{split}\]

- Parameters
num_infer_chunks (int|numpy.array) – The number of inferred chunks on the given mini-batch.
num_label_chunks (int|numpy.array) – The number of label chunks on the given mini-batch.
num_correct_chunks (int|float|numpy.array) – The number of chunks that appear in both the inference and the labels on the given mini-batch.
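The update rule above, and the scores the running counts ultimately feed, can be sketched with a tiny framework-free class (assumed semantics for illustration; the `scores` helper here is hypothetical and simply stands in for the step that turns the accumulated counts into precision, recall and F1):

```python
class ChunkCounter:
    """Minimal sketch of the running-state update described above."""

    def __init__(self):
        self.num_infer_chunks = 0
        self.num_label_chunks = 0
        self.num_correct_chunks = 0

    def update(self, num_infer_chunks, num_label_chunks, num_correct_chunks):
        # Mirrors: self.num_infer_chunks += num_infer_chunks, etc.
        self.num_infer_chunks += num_infer_chunks
        self.num_label_chunks += num_label_chunks
        self.num_correct_chunks += num_correct_chunks

    def scores(self):
        """Precision, recall and F1 derived from the accumulated counts."""
        p = self.num_correct_chunks / self.num_infer_chunks if self.num_infer_chunks else 0.0
        r = self.num_correct_chunks / self.num_label_chunks if self.num_label_chunks else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        return p, r, f1

counter = ChunkCounter()
counter.update(2, 2, 1)  # counts from one mini-batch
counter.update(3, 4, 3)  # counts from another mini-batch
precision, recall, f1 = counter.scores()  # precision = 4/5, recall = 4/6
```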