
# Omniglot MAML examples

In this directory we've provided some examples of training on Omniglot that reproduce the experiments from [the original MAML paper](https://arxiv.org/abs/1703.03400).

Each can be run via `python <filename>`.

`maml-omniglot-higher.py` uses the [facebookresearch/higher](https://github.com/facebookresearch/higher) meta-learning package and is the reference implementation. It runs all of its tasks sequentially.

`maml-omniglot-transforms.py` uses functorch. It runs all of its tasks in parallel. In theory this should lead to some speedups, but we haven't finished implementing all the rules for vmap that would actually make training faster.

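The parallel-tasks idea can be sketched as follows. This is a minimal illustration, not the script's actual code: it uses functorch-style transforms via their `torch.func` home in recent PyTorch, a toy linear model in place of the Omniglot CNN, random data in place of real tasks, and hypothetical names (`inner_loss`, `adapted_loss`). The key point is that `vmap` maps a single-task inner loop over a leading task dimension, so all tasks adapt in one batched call instead of a Python loop.

```python
import torch
from torch.func import functional_call, grad, vmap

# Toy stand-in for the Omniglot CNN (hypothetical; for illustration only).
net = torch.nn.Linear(4, 2)
params = dict(net.named_parameters())

def inner_loss(params, x, y):
    # Evaluate the model functionally with an explicit parameter dict.
    logits = functional_call(net, params, (x,))
    return torch.nn.functional.cross_entropy(logits, y)

def adapted_loss(params, x_spt, y_spt, x_qry, y_qry, lr=0.1):
    # One inner-loop SGD step on the support set...
    grads = grad(inner_loss)(params, x_spt, y_spt)
    adapted = {k: p - lr * grads[k] for k, p in params.items()}
    # ...then evaluate the adapted parameters on the query set.
    return inner_loss(adapted, x_qry, y_qry)

# Fake meta-batch: 8 tasks, 5 support/query examples each.
num_tasks, shots = 8, 5
x_spt = torch.randn(num_tasks, shots, 4)
y_spt = torch.randint(0, 2, (num_tasks, shots))
x_qry = torch.randn(num_tasks, shots, 4)
y_qry = torch.randint(0, 2, (num_tasks, shots))

# vmap over the task dimension: every task's inner loop runs in parallel.
# in_dims=None broadcasts the shared initial parameters to all tasks.
per_task_losses = vmap(adapted_loss, in_dims=(None, 0, 0, 0, 0))(
    params, x_spt, y_spt, x_qry, y_qry)
print(per_task_losses.shape)  # torch.Size([8])
```

Averaging `per_task_losses` and backpropagating to `params` would give the MAML outer step; the actual scripts add more inner steps and a real data loader.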
`maml-omniglot-ptonly.py` is an implementation of `maml-omniglot-transforms.py` that runs all of its tasks sequentially (and also doesn't use the higher package).