fastxtend (fastai extended) is a collection of tools, extensions, and add-ons for fastai

Feature overview

General Features

  • Fused optimizers that are 21 to 293 percent faster than fastai's native optimizers.
  • Flexible metrics which can log on train, valid, or both. Backwards compatible with fastai metrics.
  • Easily use multiple losses and log each individual loss on train and valid.
  • A simple profiler for profiling fastai training.



Check out the documentation for additional splitters, callbacks, schedulers, utilities, and more.


fastxtend is available on PyPI:

pip install fastxtend

To install with dependencies for vision, audio, or all tasks run one of:

pip install fastxtend[vision]

pip install fastxtend[audio]

pip install fastxtend[all]

Or to create an editable install:

git clone
cd fastxtend
pip install -e ".[dev]"


Like fastai, fastxtend provides safe wildcard imports using Python's __all__.

from fastai.vision.all import *
from fastxtend.vision.all import *

In general, import fastxtend after all fastai imports, as fastxtend modifies fastai. Any method modified by fastxtend is backwards compatible with the original fastai code.
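The "safe" part of safe wildcard imports comes from Python's __all__ mechanism: a star import only pulls in the names a module explicitly exports, so modules that define __all__ cannot accidentally leak internal helpers into your namespace. A minimal, self-contained sketch (the module and function names here are hypothetical, not part of fastxtend):

```python
# Sketch of how __all__ makes wildcard imports "safe": only the names a
# module lists in __all__ are pulled in by `from module import *`.
# "toy_module" and its functions are illustrative, not fastxtend code.
import sys
import types

mod = types.ModuleType("toy_module")
exec(
    "__all__ = ['public_fn']\n"
    "def public_fn(): return 'exported'\n"
    "def _private_fn(): return 'hidden'\n",
    mod.__dict__,
)
sys.modules["toy_module"] = mod

ns = {}
exec("from toy_module import *", ns)

print("public_fn" in ns)    # True: listed in __all__
print("_private_fn" in ns)  # False: the wildcard import skips it
```

Because fastai and fastxtend both define __all__ in their modules, stacking the two wildcard imports stays predictable: fastxtend's exports override fastai's only where fastxtend deliberately replaces a name.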


Use a fused ForEach optimizer:

Learner(..., opt_func=adam(fused=True))
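The speedup from ForEach optimizers comes from replacing the per-parameter Python loop with batched multi-tensor operations (in PyTorch, presumably the torch._foreach_* kernels), so the update for the whole parameter list is issued in a few fused calls instead of one call per tensor. A pure-Python sketch of the control-flow difference, using a toy SGD step (this only mirrors the structure, not fastxtend's implementation):

```python
# Toy illustration of the ForEach idea: batch the optimizer update
# across the whole parameter list instead of looping per parameter.
# Real ForEach optimizers use multi-tensor kernels; this pure-Python
# sketch only mirrors the control flow with scalar "parameters".
def sgd_loop(params, grads, lr):
    # Baseline: one update per parameter, Python overhead per tensor.
    return [p - lr * g for p, g in zip(params, grads)]

def sgd_foreach(params, grads, lr):
    # ForEach style: scale the whole gradient list, then subtract
    # list-wise -- two batched passes instead of N small ones.
    scaled = [lr * g for g in grads]
    return [p - s for p, s in zip(params, scaled)]

params, grads = [1.0, 2.0, 3.0], [0.5, 0.5, 0.5]
print(sgd_loop(params, grads, 0.1) == sgd_foreach(params, grads, 0.1))  # True
```

Both paths compute the same update; the win in the real implementation is fewer kernel launches and less Python overhead per step.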

Log an accuracy metric on the training set as a smoothed metric and on the validation set as normal:

Learner(..., metrics=[Accuracy(log_metric=LogMetric.Train, metric_type=MetricType.Smooth),
                      Accuracy()])
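A smoothed train metric is typically an exponentially weighted moving average over batch values, the same idea as fastai's smoothed training loss. A minimal sketch of that running computation (the class name and the smoothing factor beta=0.9 are illustrative, not fastxtend's API or default):

```python
# Sketch of a smoothed metric: an exponentially weighted moving average
# of per-batch values, the kind of running value usually logged on the
# training set. Beta=0.9 is illustrative, not fastxtend's default.
class SmoothMetric:
    def __init__(self, beta=0.9):
        self.beta, self.val = beta, None

    def update(self, batch_value):
        if self.val is None:
            self.val = batch_value  # first batch seeds the average
        else:
            self.val = self.beta * self.val + (1 - self.beta) * batch_value
        return self.val

m = SmoothMetric()
for acc in [0.5, 0.7, 0.9]:
    m.update(acc)
print(round(m.val, 3))  # 0.558: the average lags behind the raw values
```

This is why a smoothed metric is a better fit for the training set: batch-level accuracy is noisy, and the moving average gives a stable trend line without a second pass over the data.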

Log multiple losses as individual metrics on train and valid:

mloss = MultiLoss(loss_funcs=[nn.MSELoss, nn.L1Loss], 
                  weights=[1, 3.5], loss_names=['mse_loss', 'l1_loss'])

Learner(..., loss_func=mloss, metrics=RMSE(), cbs=MultiLossCallback)
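The idea behind combining losses this way: backpropagate through one weighted sum while keeping each component's value available so a callback can log it as its own metric. A toy version of that structure (the class and attribute names are illustrative, not the fastxtend MultiLoss API):

```python
# Toy version of the multiple-loss idea: combine weighted losses into a
# single scalar for backprop while recording each component so it can be
# logged individually. Names are illustrative, not fastxtend's API.
class ToyMultiLoss:
    def __init__(self, loss_funcs, weights, loss_names):
        self.loss_funcs, self.weights, self.loss_names = loss_funcs, weights, loss_names
        self.losses = {}  # individual values, available for logging

    def __call__(self, pred, target):
        total = 0.0
        for fn, w, name in zip(self.loss_funcs, self.weights, self.loss_names):
            val = fn(pred, target)
            self.losses[name] = val   # stash the unweighted component
            total += w * val          # weighted sum drives backprop
        return total

mse = lambda p, t: sum((a - b) ** 2 for a, b in zip(p, t)) / len(p)
l1 = lambda p, t: sum(abs(a - b) for a, b in zip(p, t)) / len(p)

mloss = ToyMultiLoss([mse, l1], weights=[1, 3.5], loss_names=['mse_loss', 'l1_loss'])
total = mloss([1.0, 2.0], [0.0, 0.0])
print(mloss.losses['mse_loss'], mloss.losses['l1_loss'], total)  # 2.5 1.5 7.75
```

Logging the unweighted components separately makes it easy to see which term dominates training and to tune the weights accordingly.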

Apply MixUp, CutMix, or Augmentation while training:

Learner(..., cbs=CutMixUpAugment)
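At the core of MixUp-style augmentation is a per-batch interpolation: blend two inputs and their targets with a mixing coefficient lam, normally drawn from a Beta distribution. A deterministic sketch of that blend (fixed lam here for reproducibility; the function name is illustrative):

```python
# Minimal sketch of the MixUp interpolation applied per batch: blend two
# inputs and their (one-hot) targets with coefficient lam. Real
# callbacks sample lam from a Beta distribution; it is fixed here so the
# output is deterministic. The function name is illustrative.
def mixup(x1, y1, x2, y2, lam):
    x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return x, y

x, y = mixup([1.0, 0.0], [1, 0], [0.0, 1.0], [0, 1], lam=0.7)
print([round(v, 2) for v in x])  # [0.7, 0.3]
print([round(v, 2) for v in y])  # [0.7, 0.3]
```

CutMix replaces the element-wise blend with a pasted rectangular region, weighting the targets by the area ratio; a combined callback can pick between the strategies per batch.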

Profile a fastai training loop:

from fastxtend.callback import simpleprofiler

learn = Learner(...).profile()
learn.fit_one_cycle(2, 3e-3)
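A simple training profiler boils down to timing named phases of the loop (data loading, forward, backward, and so on) and accumulating totals across batches. A toy version of that mechanism (illustrative only, not fastxtend's simpleprofiler implementation):

```python
# Toy version of what a simple training profiler does: time named phases
# of a loop with a context manager and accumulate totals. Illustrative
# only; not fastxtend's simpleprofiler implementation.
import time
from collections import defaultdict
from contextlib import contextmanager

class ToyProfiler:
    def __init__(self):
        self.totals = defaultdict(float)

    @contextmanager
    def phase(self, name):
        start = time.perf_counter()
        try:
            yield
        finally:
            # accumulate elapsed time even if the phase raises
            self.totals[name] += time.perf_counter() - start

prof = ToyProfiler()
for _ in range(3):
    with prof.phase("forward"):
        sum(i * i for i in range(1000))  # stand-in for the forward pass
    with prof.phase("backward"):
        sum(i * i for i in range(1000))  # stand-in for the backward pass

print(sorted(prof.totals))  # ['backward', 'forward']
```

Hooking such timers into a training loop makes it easy to see whether time is going to the dataloader, the model, or the optimizer step.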

Train in channels last format:

Learner(..., cbs=ChannelsLast)

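Channels last stores the same image data so that all channel values for a pixel sit next to each other in memory (NHWC) instead of plane-by-plane (NCHW), a layout that lets mixed-precision convolutions run faster on recent GPUs. A pure-Python sketch of the index arithmetic for a 3-channel 2x2 image (helper names are illustrative):

```python
# Sketch of the channels-last idea: the flat-memory offset of element
# (c, h, w) under the two layouts. Under NHWC the C channel values of a
# pixel are contiguous; under NCHW they are a full H*W plane apart.
# Helper names are illustrative.
def nchw_index(c, h, w, C, H, W):
    # channels-first: one full H*W plane per channel
    return c * H * W + h * W + w

def nhwc_index(c, h, w, C, H, W):
    # channels-last: all C values for a pixel are contiguous
    return h * W * C + w * C + c

C, H, W = 3, 2, 2
# Offsets of the three channel values of pixel (0, 0):
print([nchw_index(c, 0, 0, C, H, W) for c in range(C)])  # [0, 4, 8]
print([nhwc_index(c, 0, 0, C, H, W) for c in range(C)])  # [0, 1, 2]
```

The contiguous per-pixel channel values are what convolution kernels on tensor cores can exploit; a callback handles converting the model and batches so training code does not change.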

fastxtend requires fastai to be installed. See the fastai documentation for installation instructions.