claf.modules package

Submodules

claf.modules.activation.get_activation_fn(name)[source]

Return a PyTorch built-in activation function selected by name.
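
A minimal usage sketch, assuming that get_activation_fn maps a name such as "relu" to the corresponding activation callable (the exact set of supported names is defined inside claf.modules.activation):

    import torch
    from claf.modules.activation import get_activation_fn

    # Assumption: "relu" is one of the supported names and the returned
    # object is a callable that applies the activation element-wise.
    relu = get_activation_fn("relu")

    x = torch.randn(2, 4)
    y = relu(x)  # same shape as x; negative entries mapped to 0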

Some of the functional helpers below are adapted from allennlp (https://github.com/allenai/allennlp); a usage sketch follows the list below.

  • add_masked_value : replace_masked_values (allennlp)

  • get_mask_from_tokens : get_mask_from_tokens (allennlp)

  • last_dim_masked_softmax : last_dim_masked_softmax (allennlp)

  • masked_softmax : masked_softmax (allennlp)

  • weighted_sum : weighted_sum (allennlp)
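
A minimal end-to-end sketch of the masked-attention pattern these helpers support, assuming the allennlp conventions carry over: masks use 1 for real tokens and 0 for padding, add_masked_value pushes masked positions toward a large negative value, and weighted_sum computes an attention-weighted sum of matrix rows. The shapes shown are illustrative only; the exact broadcasting rules are defined in claf.modules.functional.

    import torch
    from claf.modules import functional as f

    batch_size, seq_len, hidden = 2, 5, 8

    # Unnormalized attention scores for each position in the sequence.
    scores = torch.randn(batch_size, seq_len)

    # 1 marks real tokens, 0 marks padding (assumed mask convention).
    mask = torch.tensor([[1, 1, 1, 0, 0],
                         [1, 1, 1, 1, 1]]).float()

    # Option A: mask inside the softmax so padded positions get ~0 probability.
    attention = f.masked_softmax(scores, mask)

    # Option B (same idea): push padded scores to a large negative value first
    # (default -1e7, assumed), then apply a plain softmax.
    masked_scores = f.add_masked_value(scores, mask)
    attention_b = torch.nn.functional.softmax(masked_scores, dim=-1)

    # Attention-weighted sum of the sequence representations.
    matrix = torch.randn(batch_size, seq_len, hidden)
    context = f.weighted_sum(attention, matrix)  # assumed: (batch_size, hidden)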

claf.modules.functional.add_masked_value(tensor, mask, value=-10000000.0)[source]
claf.modules.functional.forward_rnn_with_pack(rnn_module, tensor, seq_config)[source]
claf.modules.functional.get_mask_from_tokens(tokens)[source]
claf.modules.functional.get_sorted_seq_config(features, pad_index=0)[source]
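
get_sorted_seq_config and forward_rnn_with_pack wrap PyTorch's packed-sequence API so that an RNN skips padded time steps. The exact layout of seq_config is an internal detail, so the sketch below shows the underlying plain-PyTorch pattern these helpers are assumed to bundle, not their literal implementation:

    import torch
    from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

    rnn = torch.nn.GRU(input_size=8, hidden_size=16, batch_first=True)

    # Padded batch: 2 sequences of true lengths 5 and 3, padded to length 5.
    inputs = torch.randn(2, 5, 8)
    lengths = torch.tensor([5, 3])

    # Pack so the RNN only processes real time steps, then unpack back to a
    # padded tensor. enforce_sorted=False lets PyTorch handle the length
    # sorting that get_sorted_seq_config would otherwise precompute.
    packed = pack_padded_sequence(inputs, lengths, batch_first=True,
                                  enforce_sorted=False)
    packed_output, _ = rnn(packed)
    output, _ = pad_packed_sequence(packed_output, batch_first=True,
                                    total_length=5)
    # output: (2, 5, 16); rows past each true length are zero-padded.
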
claf.modules.functional.last_dim_masked_softmax(x, mask)[source]
claf.modules.functional.masked_log_softmax(vector, mask)[source]
claf.modules.functional.masked_softmax(x, mask)[source]
claf.modules.functional.masked_zero(tensor, mask)[source]

Tensor masking operation
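
A minimal sketch, assuming masked_zero zeroes out the entries where the mask is 0 and leaves the rest unchanged (i.e. an element-wise product with a broadcast mask):

    import torch
    from claf.modules import functional as f

    tensor = torch.randn(2, 4)
    mask = torch.tensor([[1, 1, 0, 0],
                         [1, 1, 1, 1]]).float()

    # Assumed behaviour: padded positions become exactly 0, real positions
    # are left unchanged (equivalent to tensor * mask here).
    zeroed = f.masked_zero(tensor, mask)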

claf.modules.functional.weighted_sum(attention, matrix)[source]
claf.modules.initializer.weight(module)[source]

Weight initialization, selected according to the module type.

  • Args:

    module: torch.nn.Module
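
A minimal usage sketch, assuming weight is intended to be passed to torch.nn.Module.apply so that every submodule is initialized according to its type:

    import torch
    from claf.modules import initializer

    model = torch.nn.Sequential(
        torch.nn.Linear(16, 32),
        torch.nn.ReLU(),
        torch.nn.Linear(32, 4),
    )

    # apply() calls initializer.weight(module) on every submodule, letting it
    # pick an initialization scheme per module type (e.g. Linear vs. others).
    model.apply(initializer.weight)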

Module contents