modeling
class RobertaModel(vocab_size, hidden_size=768, num_hidden_layers=12, num_attention_heads=12, intermediate_size=3072, hidden_act='gelu', hidden_dropout_prob=0.1, attention_probs_dropout_prob=0.1, max_position_embeddings=512, type_vocab_size=16, initializer_range=0.02, pad_token_id=0)

    Bases: paddlenlp.transformers.roberta.modeling.RobertaPretrainedModel
class RobertaPretrainedModel(name_scope=None, dtype='float32')

    Bases: paddlenlp.transformers.model_utils.PretrainedModel

    An abstract class for pretrained RoBERTa models. It provides the RoBERTa-related
    model_config_file, resource_files_names, pretrained_resource_files_map,
    pretrained_init_configuration and base_model_prefix for downloading and loading
    pretrained models. See PretrainedModel for more details.

    base_model_class
        alias of paddlenlp.transformers.roberta.modeling.RobertaModel
class RobertaForSequenceClassification(roberta, num_classes=2, dropout=None)

    Bases: paddlenlp.transformers.roberta.modeling.RobertaPretrainedModel

    Model for sentence (pair) classification tasks with RoBERTa.

    Parameters:
        roberta (RobertaModel) – An instance of RobertaModel.
        num_classes (int, optional) – The number of classes. Defaults to 2.
        dropout (float, optional) – The dropout probability for the output of RoBERTa.
            If None, uses the same value as hidden_dropout_prob of the RobertaModel
            instance roberta. Defaults to None.
class RobertaForTokenClassification(roberta, num_classes=2, dropout=None)

    Bases: paddlenlp.transformers.roberta.modeling.RobertaPretrainedModel
class RobertaForQuestionAnswering(roberta, dropout=None)

    Bases: paddlenlp.transformers.roberta.modeling.RobertaPretrainedModel