Neural architecture search (NAS) techniques generate complex model architectures by automatically exploring a restricted portion of the model space. Various NAS algorithms have been proposed, and several efficient model structures have been discovered, including MobileNetV3 and EfficientNet. By reformulating the multi-objective NAS problem as a combinatorial optimization problem, the LayerNAS method greatly reduces its complexity: it shrinks the number of model candidates to search, cuts the computation required for multi-trial searches, and still identifies the model architectures that work best. On a search space built from MobileNetV2 and MobileNetV3 backbones, it found models with higher ImageNet accuracy, up to 4.9% better than current state-of-the-art alternatives.
LayerNAS is built on search spaces that satisfy two conditions: an optimal model can be constructed from one of the candidate models produced by searching the previous layer, combined with a search option applied to the current layer; and, if a FLOPs constraint is placed on the current layer, a constraint for the previous layers can be derived by subtracting the FLOPs of the current layer. Under these conditions, the layers can be searched linearly from layer 1 to layer n, because once the best choice for a layer has been found, changing any earlier layer will not improve the performance of the model.
Candidates can then be grouped according to their cost, which limits the number of candidates stored per layer. When two models have the same FLOPs, only the more accurate one is kept, under the assumption that this does not affect the architecture of the subsequent layers. This cost-based, layer-wise approach drastically reduces the search space while keeping the algorithm's complexity rigorously polynomial. By contrast, an exhaustive treatment would have a search space that grows exponentially with the number of layers, because the full range of options is available at every layer. Empirical evaluation shows that the best models can be found within these constraints.
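Below is a minimal Python sketch of this layer-wise, cost-bucketed search. It is an illustration, not LayerNAS's actual implementation: the helper names (`search_options`, `layer_flops`, `train_and_eval`) and the fixed FLOPs bucket size are assumptions made for the example.

```python
# A minimal sketch (not LayerNAS's actual code) of a layer-wise, cost-bucketed
# search. The helpers `search_options`, `layer_flops`, and `train_and_eval`,
# and the fixed bucket size, are illustrative assumptions.
def layerwise_search(search_options, layer_flops, train_and_eval,
                     flops_budget, bucket_size=1_000_000):
    """search_options[i]: choices for layer i; layer_flops(i, opt): FLOPs of a
    choice; train_and_eval(arch): accuracy of a (partial) architecture."""
    # Candidates after layer 0, keyed by FLOPs bucket: (accuracy, arch, flops).
    candidates = {}
    for opt in search_options[0]:
        flops = layer_flops(0, opt)
        if flops <= flops_budget:
            _update(candidates, flops // bucket_size,
                    (train_and_eval([opt]), [opt], flops))

    # Search layer by layer; per bucket, keep only the most accurate candidate.
    for i in range(1, len(search_options)):
        next_candidates = {}
        for _, arch, prev_flops in candidates.values():
            for opt in search_options[i]:
                flops = prev_flops + layer_flops(i, opt)
                if flops > flops_budget:   # prune over-budget candidates early
                    continue
                _update(next_candidates, flops // bucket_size,
                        (train_and_eval(arch + [opt]), arch + [opt], flops))
        candidates = next_candidates

    # Most accurate complete architecture found within the FLOPs budget.
    return max(candidates.values(), key=lambda c: c[0])

def _update(table, bucket, entry):
    """Keep only the highest-accuracy entry for each FLOPs bucket."""
    if bucket not in table or entry[0] > table[bucket][0]:
        table[bucket] = entry
```

Keeping only one candidate per FLOPs bucket at each layer is what keeps the number of stored candidates, and hence the total search cost, polynomial in the number of layers.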
LayerNAS reduces NAS to a combinatorial optimization problem by applying a layer-wise cost approach. For each layer i, the cost and reward can be computed after training with a given search option Si. This leads to the following combinatorial problem: how does one choose one option per layer, while staying within the cost budget, so as to achieve the best reward? There are many ways to solve this problem, and dynamic programming is one of the simplest. When comparing NAS algorithms, the following metrics are evaluated: quality, stability, and efficiency. The algorithm is evaluated on the NATS-Bench benchmark using 100 NAS runs and compared with other NAS algorithms such as random search, regularized evolution, and proximal policy optimization. The differences between these search algorithms on the metrics above are reported as mean accuracy and variance for each comparison (variance is indicated by a shaded region corresponding to the 25% to 75% interquartile range).
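The combinatorial formulation can be illustrated with a small knapsack-style dynamic program: choose exactly one option per layer, stay within an overall cost budget, and maximize the total reward. The function name, the integer cost units, and the toy numbers below are hypothetical; they sketch the idea rather than the paper's implementation.

```python
# A hedged sketch of the underlying combinatorial problem as a knapsack-style
# dynamic program: pick exactly one option per layer, stay within the cost
# budget, and maximize total reward. Costs are assumed to be integers (e.g.,
# FLOPs rounded to a coarse unit); all numbers below are toy values.
def select_options(costs, rewards, budget):
    """costs[i][j], rewards[i][j]: cost and reward of option j at layer i."""
    # best[c] = (total_reward, chosen_options) for partial models costing exactly c.
    best = {0: (0.0, [])}
    for layer_costs, layer_rewards in zip(costs, rewards):
        next_best = {}
        for c, (r, chosen) in best.items():
            for j, (cost_j, reward_j) in enumerate(zip(layer_costs, layer_rewards)):
                c_new = c + cost_j
                if c_new > budget:        # over budget: discard this extension
                    continue
                r_new = r + reward_j
                # Keep only the best reward seen for each exact cost value.
                if c_new not in next_best or r_new > next_best[c_new][0]:
                    next_best[c_new] = (r_new, chosen + [j])
        best = next_best
    # Best complete assignment of one option per layer within the budget.
    return max(best.values(), key=lambda x: x[0])

# Example: three layers, two options each, budget of 10 cost units.
costs = [[3, 5], [2, 4], [1, 3]]
rewards = [[0.60, 0.70], [0.10, 0.15], [0.05, 0.08]]
print(select_options(costs, rewards, budget=10))  # roughly (0.90, [1, 1, 0])
```

The table `best` never holds more than one entry per distinct cost value, which is what keeps this approach polynomial rather than exponential in the number of layers.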
To avoid searching many unhelpful model designs, LayerNAS formulates the problem differently by separating cost and reward. Candidate models tend to perform better with fewer channels in the earlier layers, and LayerNAS discovers better models faster than other methods because it does not waste trials on models with unfavorable cost distributions. By using combinatorial optimization, which effectively limits the search complexity to polynomial, LayerNAS has been proposed as a solution to the multi-objective NAS challenge.
Researchers have created a new method, called LayerNAS, for finding better neural network architectures. They compared it with other methods and found that it performs better, and they also used it to find improved models based on MobileNetV2 and MobileNetV3.
Niharika is a Technical Consultant Intern at Marktechpost. She is a third-year undergraduate student pursuing a Bachelor of Technology degree at the Indian Institute of Technology (IIT), Kharagpur. She is a highly motivated person with a keen interest in machine learning, data science, and artificial intelligence and an avid reader of the latest developments in these areas.