Submitted by gahaalt t3_ypkfwq in MachineLearning
violentdeli8 t1_ivk090t wrote
This is really cool! I am wondering if I can use this as a “compiler” for Neural Architecture Search. During the search phase one has to sample from a search space of architectures and “compile” each sample into a valid nn.Module. Usually that compilation code is written separately for each search space, so I'm wondering if that part becomes far less tedious with this.
gahaalt OP t1_ivk3tvj wrote
Yeah! You have a lot of flexibility to do NAS here. You can create a huge graph of layers and sample a smaller path from it to create a Symbolic Model. One non-standard thing you need to do to pull it off is to modify the ._children attribute of the Symbolic Data when you want to rewire the connections in the graph.
I might add an example for a simple NAS soon.
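In the meantime, here is a minimal sketch of what the sample-then-compile loop might look like. It assumes the Input / SymbolicModel API shown in the repo README; the CANDIDATE_OPS dict and the sample_architecture / compile_architecture helpers are hypothetical names made up for illustration, not part of the library.

```python
import random

from torch import nn
from pytorch_symbolic import Input, SymbolicModel  # assuming the API from the repo README

# Hypothetical toy search space: for each block, pick one candidate op.
CHANNELS = 16
CANDIDATE_OPS = {
    "conv3x3": lambda: nn.Conv2d(CHANNELS, CHANNELS, 3, padding=1),
    "conv5x5": lambda: nn.Conv2d(CHANNELS, CHANNELS, 5, padding=2),
    "identity": lambda: nn.Identity(),
}

def sample_architecture(num_blocks=4):
    # The "sample from the search space" step: one op name per block.
    return [random.choice(list(CANDIDATE_OPS)) for _ in range(num_blocks)]

def compile_architecture(arch):
    # The "compile" step: trace the sampled ops through symbolic data,
    # Keras-style, and get back a regular nn.Module.
    inputs = Input(shape=(CHANNELS, 32, 32))  # shape without the batch dim, as in the README
    x = inputs
    for op_name in arch:
        x = CANDIDATE_OPS[op_name]()(x)  # assuming layer(symbolic_data) registers the layer
        x = nn.ReLU()(x)
    outputs = nn.Flatten()(x)
    return SymbolicModel(inputs, outputs)

arch = sample_architecture()
model = compile_architecture(arch)  # a regular nn.Module, ready to train / evaluate
```

Rebuilding the model per sample like this sidesteps the ._children rewiring entirely; the rewiring trick would matter more if you wanted to keep one big supergraph resident and carve different paths out of it.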
violentdeli8 t1_ivk4uel wrote
Figure 2 of https://arxiv.org/pdf/2203.02094.pdf describes a GPT architecture configuration search space. Can one compile samples from this search space easily in Symbolic?
violentdeli8 t1_ivk53vn wrote
A very simple NAS example, perhaps using one of the tabular benchmarks like NAS-Bench-201, would be a very useful illustration. Thanks!!