The module amber.modeler provides classes and interfaces to convert an architecture (usually a list of tokens/strings) into a model. For simple sequential models, the out-of-the-box tf.keras.Sequential suffices as the implementation. However, additional classes are needed for advanced architectures, such as converting an enas super-net into sub-nets.
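The core idea of token-to-model conversion can be illustrated without any deep-learning framework. The sketch below is purely hypothetical (the names LAYERS and build_sequential are not AMBER's API): each token names a layer, and a sequential model is just the composition of the named layers, in order.

```python
# Framework-free sketch of converting a token list into a model.
# LAYERS and build_sequential are illustrative names, not AMBER's API.

LAYERS = {
    "double": lambda x: 2.0 * x,
    "inc": lambda x: x + 1.0,
    "relu": lambda x: max(x, 0.0),
}

def build_sequential(tokens):
    """Compose the layer callables named by `tokens` into one model function."""
    funcs = [LAYERS[t] for t in tokens]
    def model(x):
        for f in funcs:
            x = f(x)
        return x
    return model

model = build_sequential(["double", "inc", "relu"])
print(model(-3.0))  # 2 * -3 + 1 = -5, then relu -> 0.0
```

In a real setting the callables would be keras layer constructors and build_sequential would return a tf.keras.Sequential, but the mapping from tokens to an ordered stack of layers is the same.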

At a high level, we first need an analog of tf.keras.Sequential that returns a model object when called; in AMBER, this is amber.modeler.ModelBuilder and its subclasses. To wrap different implementations of neural networks (e.g., a sequential keras model vs. an enas sub-net implemented in tensorflow), ModelBuilder takes amber.architect.ModelSpace as the unifying reference for model architectures, so that different implementation frameworks (tensorflow, keras, pytorch) look the same to the search algorithms in amber.architect, easing their burden.
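The builder pattern described above can be sketched as follows. This is a minimal, hypothetical illustration (the class name and the list-of-lists model space are assumptions, not AMBER's actual signatures): the builder is configured once with a model space, then called with an architecture to produce a model.

```python
# Hypothetical sketch of the ModelBuilder pattern: configured with a model
# space, called with an architecture to return a model object.
# All names here are illustrative, not AMBER's API.

class SequentialModelBuilder:
    def __init__(self, model_space):
        # model_space: for each layer position, a list of candidate operations
        self.model_space = model_space

    def __call__(self, arc):
        # arc: one token index per layer, indexing into the model space
        ops = [self.model_space[i][t] for i, t in enumerate(arc)]
        def model(x):
            for op in ops:
                x = op(x)
            return x
        return model

space = [
    [lambda x: x + 1, lambda x: x - 1],  # layer 0 candidates
    [lambda x: x * 2, lambda x: x * 3],  # layer 1 candidates
]
builder = SequentialModelBuilder(space)
model = builder([0, 1])  # pick op 0 at layer 0, op 1 at layer 1
print(model(4))          # (4 + 1) * 3 = 15
```

Because the search algorithm only interacts with the builder's call interface and the shared model space, the backend that actually constructs the model (keras, tensorflow, pytorch) can be swapped without changing the architect side.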

Moving one level further, we need an analog of tf.keras.Model that exposes training and evaluation as class methods. This is implemented in amber.modeler.child.
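The point of such a training interface is that the search loop only calls fit and evaluate, regardless of what runs underneath. A minimal sketch, with entirely illustrative names and a stubbed-out training step (real child models would run gradient updates):

```python
# Sketch of a child-model training interface in the spirit of tf.keras.Model.
# ChildModel and its methods are hypothetical names for illustration only.

class ChildModel:
    def __init__(self, predict_fn):
        self.predict_fn = predict_fn
        self.trained_epochs = 0

    def fit(self, x, y, epochs=1):
        # A real child model would run gradient steps here; this stub
        # only records that training was requested.
        self.trained_epochs += epochs
        return self

    def evaluate(self, x, y):
        # Mean squared error of predictions against targets.
        preds = [self.predict_fn(xi) for xi in x]
        return sum((p - yi) ** 2 for p, yi in zip(preds, y)) / len(y)

child = ChildModel(lambda x: 2 * x)
child.fit(x=[1, 2], y=[2, 4], epochs=3)
print(child.evaluate([1, 2, 3], [2, 4, 7]))  # errors 0, 0, 1 -> MSE = 1/3
```

The evaluate result is what the architect consumes as a reward signal when scoring candidate architectures.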

Under the hood of the child models, the corresponding tensor operations and computation graphs are constructed in the module amber.modeler.dag. Currently, AMBER builds enas sub-graphs as keras models, as well as branching and multi-input/multi-output keras models. Next steps include constructing pytorch computation graphs.
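The defining feature of the enas super-net/sub-net construction is weight sharing: every candidate operation's weights live in one shared structure, and each architecture selects a sub-graph that reuses them. The toy sketch below is a hypothetical illustration of that idea only; the names and flat dictionary layout are assumptions, not amber.modeler.dag's actual structures.

```python
# Illustrative sketch of the enas super-net -> sub-net idea: all candidate
# ops share one pool of "weights", and an architecture picks a sub-graph.
# Names and structure are hypothetical, not AMBER's dag module.

shared_weights = {          # super-net: one weight per candidate op per layer
    (0, "scale2"): 2.0,
    (0, "scale3"): 3.0,
    (1, "shift1"): 1.0,
    (1, "shift5"): 5.0,
}

def build_subnet(arc):
    """arc: list of (layer, op_name) picks; the sub-net reuses shared weights."""
    def model(x):
        for layer, op_name in arc:
            w = shared_weights[(layer, op_name)]
            x = x * w if op_name.startswith("scale") else x + w
        return x
    return model

subnet = build_subnet([(0, "scale3"), (1, "shift1")])
print(subnet(2.0))  # 2 * 3 + 1 = 7.0
```

Because every sub-net reads from the same shared_weights pool, training one sub-net updates parameters that all other sub-nets reuse, which is what makes enas-style search efficient.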

Model Builders



Child Models: Training Interface

DAG: Computation Graph for Child Models

Architecture Decoder