optimizer

Configuration classes available in the pytext.optimizer package, grouped by submodule:
adabelief
    AdaBelief.Config
fp16_optimizer
    FP16Optimizer.Config
    FP16OptimizerApex.Config
    FP16OptimizerFairseq.Config
    MemoryEfficientFP16OptimizerFairseq.Config
lamb
    Lamb.Config
madgrad
    MADGRAD.Config
optimizers
    Adagrad.Config
    Adam.Config
    AdamW.Config
    Optimizer.Config
    SGD.Config
privacy_engine
    PrivacyEngine.Config
radam
    RAdam.Config
scheduler
    BatchScheduler.Config
    CosineAnnealingLR.Config
    CyclicLR.Config
    ExponentialLR.Config
    LmFineTuning.Config
    PolynomialDecayScheduler.Config
    ReduceLROnPlateau.Config
    Scheduler.Config
    SchedulerWithWarmup.Config
    StepLR.Config
    WarmupScheduler.Config
sparsifiers
    blockwise_sparsifier
        BlockwiseMagnitudeSparsifier.Config
    sparsifier
        CRF_L1_SoftThresholding.Config
        CRF_MagnitudeThresholding.Config
        CRF_SparsifierBase.Config
        L0_projection_sparsifier.Config
        SensitivityAnalysisSparsifier.Config
        Sparsifier.Config
swa
    StochasticWeightAveraging.Config
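
Each entry above is the nested Config class of a component in pytext.optimizer; in a training config the optimizer is normally selected by class name (for example Adam with its lr and weight_decay fields). The sketch below is illustrative rather than taken from the PyText docs: it assumes the usual PyText component pattern of a nested Config class plus a from_config factory, and uses a toy nn.Linear in place of a real PyText model. Check the Adam class in pytext.optimizer.optimizers for the exact signature.

    # Rough sketch of building an optimizer from its Config in Python.
    import torch.nn as nn
    from pytext.optimizer import Adam

    model = nn.Linear(16, 2)  # stand-in for a real PyText model

    # Each optimizer exposes its tunable hyperparameters through a nested
    # Config class; these are the same fields set in a JSON/YAML config.
    config = Adam.Config(lr=1e-3, weight_decay=1e-5)

    # Components are instantiated from their Config; optimizers also receive
    # the model whose parameters they will update.
    optimizer = Adam.from_config(config, model)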