
NASGraph: A novel graph-based machine learning method for NAS that is lightweight (CPU-only), data-independent, and training-free

Designing state-of-the-art deep learning models is an incredibly complex challenge that researchers are tackling with an approach called Neural Architecture Search (NAS). The goal of NAS is to automate the discovery of optimal neural network architectures for a given task by evaluating thousands of candidate architectures against a performance metric such as accuracy on a validation dataset.
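To make this loop concrete, here is a minimal, purely illustrative sketch of naive NAS: sample candidate architectures from a search space and keep the one with the best validation score. All names and the scoring stand-in are assumptions for illustration; in real NAS the scoring step trains each network, which is exactly the cost the paper targets.

```python
import random

# Toy search space of layer operations (illustrative, not the paper's space).
SEARCH_SPACE = ["conv3x3", "conv1x1", "skip", "maxpool"]

def sample_architecture(num_layers=4, rng=random.Random(0)):
    """Sample one candidate as a list of layer operations."""
    return [rng.choice(SEARCH_SPACE) for _ in range(num_layers)]

def validation_accuracy(arch):
    """Stand-in for the expensive train-then-evaluate step.

    In real NAS this would train the network and measure accuracy
    on a validation set; here we fake a deterministic score.
    """
    return sum(len(op) for op in arch) / (10.0 * len(arch))

def naive_nas(num_candidates=100):
    """Sample candidates and return the highest-scoring one."""
    candidates = [sample_architecture() for _ in range(num_candidates)]
    return max(candidates, key=validation_accuracy)

best = naive_nas()
print(best)
```

Even with a cheap fake scorer, the structure shows why NAS is expensive: the scoring function sits inside a loop over thousands of candidates.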

However, previous NAS methods faced significant bottlenecks as each candidate architecture had to be extensively trained, making the process extremely computationally intensive and time-consuming. Researchers have proposed various techniques such as weight sharing, differentiable search spaces, and predictor-based methods to accelerate NAS, but computational complexity remained a major hurdle.

https://arxiv.org/abs/2405.01306

This paper presents NASGraph (shown in Figure 1), an innovative method that drastically reduces the computational effort when searching for neural architectures. Instead of fully training each candidate architecture, NASGraph converts them into graph representations and uses graph metrics to efficiently estimate their performance.

Specifically, the neural network is first split into blocks containing layers such as convolutions and activations. For each block, the technique determines how much each input channel contributes to each output channel using a single forward pass. Mapping channels to nodes and these contributions to weighted edges yields the graph representation.
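The conversion step can be sketched as follows. This is a hedged toy version, not the paper's implementation: a random linear map plus ReLU stands in for a conv-and-activation block, and per-channel contributions are estimated by ablating one input channel at a time and measuring the change at each output channel.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_forward(x, weight):
    """Toy block: linear map + ReLU, standing in for conv/activation layers."""
    return np.maximum(weight @ x, 0.0)

def channel_contributions(weight, n_in):
    """Zero out one input channel at a time; the resulting change at each
    output channel becomes the weight of an (input, output) edge."""
    x = np.ones(n_in)
    base = block_forward(x, weight)
    edges = {}  # (in_channel, out_channel) -> contribution weight
    for i in range(n_in):
        x_ablate = x.copy()
        x_ablate[i] = 0.0
        delta = np.abs(base - block_forward(x_ablate, weight))
        for j, w in enumerate(delta):
            if w > 0:
                edges[(i, j)] = float(w)
    return edges

W = rng.normal(size=(3, 4))  # 4 input channels -> 3 output channels
print(channel_contributions(W, n_in=4))
```

The key property this illustrates is that building the edge weights needs only forward passes, never gradient updates, which is what makes the method CPU-friendly and training-free.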

Once the architecture is represented as a graph, NASGraph computes its average degree (the average number of connections per node) as a proxy for ranking architecture quality. To accelerate the process further, the researchers also introduce surrogate models with reduced computational cost.

These NASGraph(h, c, m) surrogate models use fewer channels h, fewer search cells c per module, and fewer modules m. As shown in their systematic study, which follows the convention of EcoNAS, such computationally reduced settings trade a small amount of ranking accuracy for significant speedups.
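The (h, c, m) convention can be pictured as a small config sketch. The concrete numbers below are made-up examples, not the paper's settings, and `relative_cost` is only a rough product-of-dimensions proxy for compute:

```python
from dataclasses import dataclass

@dataclass
class SurrogateConfig:
    h: int  # number of channels
    c: int  # search cells per module
    m: int  # number of modules

full = SurrogateConfig(h=16, c=5, m=3)      # full-size setting (example values)
surrogate = SurrogateConfig(h=4, c=1, m=1)  # reduced NASGraph(4, 1, 1)

def relative_cost(cfg, ref):
    """Crude compute proxy: product of channels, cells, and modules."""
    return (cfg.h * cfg.c * cfg.m) / (ref.h * ref.c * ref.m)

print(relative_cost(surrogate, full))
```

Shrinking all three dimensions multiplies the savings, which is why the surrogate settings yield large speedups at modest ranking cost.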

To evaluate NASGraph, the team tested it on several NAS benchmarks, including NAS-Bench-101, NAS-Bench-201, and TransNAS-Bench-101. They compared the rankings produced by the average degree metric against ground truth and against other training-free NAS methods. The average degree metric correlated strongly with actual architecture performance, outperformed previous training-free NAS methods, and showed little bias toward specific operations relative to ground-truth rankings. Furthermore, combining this graph metric with other training-free metrics such as Jacobian covariance improved ranking further, yielding new state-of-the-art Spearman rank correlations greater than 0.8 on datasets such as CIFAR-10, CIFAR-100, and ImageNet-16-120.
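The evaluation protocol boils down to rank correlation between proxy scores and ground-truth accuracies. A dependency-free sketch of Spearman correlation is below; the toy numbers are invented for illustration and are not benchmark results.

```python
def rankdata(values):
    """Ranks (1-based), averaging ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

proxy_scores = [2.1, 3.4, 1.0, 4.2, 2.8]   # e.g. average degree per architecture
true_accs    = [71.2, 73.5, 69.8, 74.1, 72.0]
print(spearman(proxy_scores, true_accs))  # 1.0: proxy ranks match ground truth
```

A correlation above 0.8, as reported in the paper, means the cheap proxy orders architectures almost the same way full training would.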

In summary, NASGraph represents a paradigm shift in neural architecture search by leveraging a sophisticated graph-based approach. It overcomes a major computational bottleneck that plagued previous NAS methods by circumventing the need to train candidate architectures. With its strong performance, low bias, data-independent nature, and remarkable efficiency, NASGraph could usher in a new era of rapid exploration of neural architectures and discovery of powerful AI models for various applications.


Visit the paper. All credit for this research goes to the researchers of this project.



Vineet Kumar is a consulting intern at MarktechPost. He is currently completing his bachelor's degree at the Indian Institute of Technology (IIT) Kanpur. He is a machine learning enthusiast and is passionate about research and the latest advances in deep learning, computer vision, and related areas.



