FluxTraining.jl

Docs (master)

A Julia package for using and writing powerful, extensible training loops for deep learning models.

What does it do?

When should you use FluxTraining.jl?

  • You don't want to implement your own metrics tracking, hyperparameter scheduling, or (insert common training feature here) for the 10th time
  • You want to use composable and reusable components that enhance your training loop (see the sketch after the usage example below)
  • You want a simple training loop with reasonable defaults that can grow with the needs of your project

How do you use it?

Install it like any other Julia package using the package manager:

]add FluxTraining

After installation, import it and create a Learner from a Flux.jl model, a loss function, and, optionally, an optimizer. Finally, train with fit!, passing the number of epochs and your training and validation data iterators.

using FluxTraining

# `model` is a Flux.jl model and `lossfn` a loss function, e.g. Flux.Losses.mse
learner = Learner(model, lossfn)

# train for 10 epochs on the training and validation data iterators
fit!(learner, 10, (trainiter, validiter))
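The composable components mentioned above are callbacks that you pass to the Learner. The following is only a rough sketch of how that composition looks, assuming the `callbacks` keyword and the `Metrics`/`Checkpointer` callbacks described in the package docs; exact constructor arguments and names may differ between versions, so check the documentation.

```julia
using FluxTraining

# Sketch: compose reusable callbacks into the Learner instead of hand-rolling
# metrics tracking or checkpointing yourself.
learner = Learner(
    model, lossfn;
    callbacks = [
        Metrics(accuracy),             # track an accuracy metric alongside the loss
        Checkpointer("checkpoints/"),  # periodically save model checkpoints to a folder
    ],
)

fit!(learner, 10, (trainiter, validiter))
```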

Next, you may want to read the documentation (see the Docs link above).

Acknowledgements

The design of FluxTraining.jl's two-way callbacks is adapted from fastai's training loop.
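Concretely, a callback hooks into named events of the training loop and can both observe and influence training state. The snippet below is only a rough sketch of such a hook based on the custom-callback API described in the FluxTraining.jl docs (`FluxTraining.Callback`, `FluxTraining.on`, and the `Events` and `Phases` modules); the callback name is hypothetical, and the exact types and signatures should be verified against the documentation for your version.

```julia
import FluxTraining

# Hypothetical callback for illustration: report when a training epoch finishes.
struct EpochLogger <: FluxTraining.Callback end

# React to the EpochEnd event during any training phase.
function FluxTraining.on(
        ::FluxTraining.Events.EpochEnd,
        ::FluxTraining.Phases.AbstractTrainingPhase,
        ::EpochLogger,
        learner)
    println("Finished a training epoch")
end
```

Because the callbacks are two-way, a callback can also affect the loop itself, for example by signaling that training should stop early; see the docs for the exact control-flow mechanism.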
