GPT-2

Reference:

Radford et al. “Language Models are Unsupervised Multitask Learners”. 2019.

class textbox.model.LM.gpt2.GPT2(config, dataset)[source]

Bases: UnconditionalGenerator

GPT-2 is an auto-regressive language model with stacked Transformer decoders.
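
To make “stacked Transformer decoders” concrete, the short sketch below loads the pretrained GPT-2 from Hugging Face transformers (which this wrapper is assumed to build on) and inspects the decoder stack:

    from transformers import GPT2LMHeadModel

    model = GPT2LMHeadModel.from_pretrained("gpt2")
    print(model.config.n_layer)    # 12 identical decoder blocks in base GPT-2
    print(model.transformer.h[0])  # one block: masked self-attention + MLP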

calculate_nll_test(corpus, epoch_idx=-1)[source]
forward(corpus, epoch_idx=-1, nll_test=False)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
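
The Corpus layout is TextBox-internal, but the token-level negative log-likelihood that calculate_nll_test and forward(..., nll_test=True) are expected to report can be sketched as below; target_idx and pad_id are hypothetical stand-ins for the batch fields:

    import torch
    import torch.nn.functional as F

    def sequence_nll(logits, target_idx, pad_id=0):
        # logits: [batch_size, seq_len, vocab_size] from the decoder
        # target_idx: [batch_size, seq_len] gold token ids
        # Shift so the model at position t predicts token t+1,
        # the standard causal language-modeling objective.
        shift_logits = logits[:, :-1, :]
        shift_targets = target_idx[:, 1:]
        token_nll = F.cross_entropy(
            shift_logits.reshape(-1, shift_logits.size(-1)),
            shift_targets.reshape(-1),
            ignore_index=pad_id,
            reduction="none",
        ).view_as(shift_targets)
        # Sum over non-pad tokens to get one NLL value per sequence.
        return token_nll.sum(dim=1)

    logits = torch.randn(2, 5, 100)          # fake decoder output
    targets = torch.randint(1, 100, (2, 5))  # fake gold ids
    print(sequence_nll(logits, targets))     # tensor of shape [2]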

generate(batch_data, eval_data)[source]

Generate texts conditioned on a noise vector or an input sequence.

Parameters
  • batch_data (Corpus) – Corpus class of a single batch.

  • eval_data – Common data of all the batches.

Returns

Generated text, shape: [batch_size, max_len]

Return type

torch.Tensor
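
A minimal sketch of the greedy auto-regressive loop behind generate, producing the [batch_size, max_len] tensor of token ids described above; the pretrained Hugging Face model stands in for the wrapped decoder, and the batch_size and max_len values are illustrative:

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    batch_size, max_len = 2, 16
    # Start every sequence from the BOS token and append one
    # greedily chosen token per step until max_len is reached.
    ids = torch.full((batch_size, 1), tokenizer.bos_token_id, dtype=torch.long)
    with torch.no_grad():
        while ids.size(1) < max_len:
            next_id = model(ids).logits[:, -1, :].argmax(dim=-1, keepdim=True)
            ids = torch.cat([ids, next_id], dim=-1)

    print(ids.shape)  # torch.Size([2, 16])
    print(tokenizer.decode(ids[0]))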

training: bool