Flux provides a single, intuitive way to define models, just like mathematical notation. Julia transparently compiles your code, optimising and fusing kernels for the GPU, for the best performance.
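As a minimal sketch of this style (layer sizes here are arbitrary, chosen for illustration), a two-layer perceptron in Flux reads much like the underlying math:

```julia
using Flux

# ŷ = W₂ · relu(W₁x + b₁) + b₂, written directly as a chain of layers
model = Chain(
    Dense(10 => 32, relu),  # affine map with ReLU activation
    Dense(32 => 2),         # affine output layer
)

x = rand(Float32, 10)
y = model(x)  # forward pass; y is a length-2 vector
```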
Existing Julia libraries are differentiable and can be incorporated directly into Flux models. Cutting-edge models such as Neural ODEs are first class, and Zygote enables overhead-free gradients.
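For example, Zygote differentiates plain Julia functions without any special annotations; a minimal sketch:

```julia
using Zygote

f(x) = 3x^2 + 2x + 1

# Reverse-mode gradient of an ordinary Julia function.
# gradient returns a tuple, one entry per argument.
df(x) = gradient(f, x)[1]

df(2.0)  # → 14.0, matching f'(x) = 6x + 2
```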
GPU kernels can be written directly in Julia via CUDA.jl. Flux is uniquely hackable and any part can be tweaked, from GPU code to custom gradients and layers.
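To illustrate that hackability, any callable Julia struct can serve as a layer; a minimal sketch (the `Scale` layer here is hypothetical, invented for illustration):

```julia
using Flux

# A custom layer: elementwise learnable scaling.
struct Scale
    s::Vector{Float32}
end
Scale(n::Integer) = Scale(ones(Float32, n))

# The forward pass is just ordinary Julia code.
(l::Scale)(x) = l.s .* x

# Register the struct so Flux can find its trainable parameters
# (on older Flux versions, use Flux.@functor instead).
Flux.@layer Scale

m = Chain(Dense(4 => 4), Scale(4))
```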
Flux offers a rich collection of example scripts to learn from, or to tweak for your own data. Trained Flux models can be used from TextAnalysis or Metalhead.
Flux models can be compiled to TPUs for cloud supercomputing, and run from Google Colab notebooks.
The Turing.jl and Stheno.jl libraries enable probabilistic programming, Bayesian inference, and Gaussian processes on top of Flux.
GeometricFlux.jl is a geometric deep learning library for Flux, with GPU acceleration via CUDA.jl.
Metalhead includes many state-of-the-art computer vision models which can easily be used for transfer learning.
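As a sketch of how a pretrained backbone might be loaded (assuming Metalhead's `ResNet` constructor and its `pretrain` keyword; check the Metalhead documentation for your installed version):

```julia
using Metalhead

# Load a ResNet-18 with ImageNet weights as a starting point
# for transfer learning; pretrain = true downloads the weights.
model = ResNet(18; pretrain = true)
```

From there, the usual transfer-learning recipe applies: keep the pretrained feature extractor and swap the final classifier head for one matching your own number of classes.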
The SciML ecosystem uses Flux and Zygote to mix neural nets with differential equations, to get the best of black box and mechanistic modelling.
Transformers.jl provides components for Transformer models for NLP, as well as providing several trained models out of the box.
DiffEqFlux.jl provides tools for creating Neural Differential Equations.
Machine Learning in Julia community.
Official Julia Slack for casual conversation.
Zulip server for the Julia programming language community.