Neural Network Structure Optimization by Simulated Annealing.

Keywords: heuristics; neural network pruning; simulated annealing; structure optimization

Journal

Entropy (Basel, Switzerland)
ISSN: 1099-4300
Abbreviated title: Entropy (Basel)
Country: Switzerland
NLM ID: 101243874

Publication information

Publication date:
28 Feb 2022
History:
received: 20 Dec 2021
revised: 21 Feb 2022
accepted: 23 Feb 2022
entrez: 25 Mar 2022
pubmed: 26 Mar 2022
medline: 26 Mar 2022
Status: epublish

Abstract

A critical problem in large neural networks is overparameterization: the sheer number of weight parameters limits their use on edge devices, where computational power and memory/storage requirements are prohibitive. To make neural networks practical on edge devices and in real-time industrial applications, they need to be compressed in advance. Since edge devices cannot train networks, or access trained ones, when internet resources are scarce, preloading smaller networks is essential. Various works in the literature have shown that redundant branches of a fully connected network can be pruned strategically without significantly sacrificing performance. However, the majority of these methods require substantial computational resources because they interleave weight training via the back-propagation algorithm with the network compression process. In this work, we draw attention to optimizing the network structure so that performance is preserved despite aggressive pruning. The structure optimization is performed with the simulated annealing algorithm alone, without back-propagation for branch weight training. As a heuristic method for non-convex optimization, simulated annealing provides a near-globally-optimal solution to this NP-hard problem for a given percentage of pruned branches. Our simulation results show that simulated annealing can significantly reduce the complexity of a fully connected network while maintaining performance, without the help of back-propagation.
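To illustrate the kind of procedure the abstract describes (this is a minimal sketch, not the authors' code), the Python snippet below runs simulated annealing over a binary pruning mask for a toy single-layer network with fixed random weights; the toy data, the mask-swap neighbor move, and the geometric cooling schedule are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch: simulated annealing over a binary pruning mask for a
# fixed-weight single-layer network. Weights are never retrained; only the
# mask (i.e., the network structure) is optimized.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: binary classification of random points (assumption, not the
# paper's benchmark).
X = rng.normal(size=(200, 16))
w_true = rng.normal(size=16)
y = (X @ w_true > 0).astype(float)

# Fixed, randomly initialized weights; no back-propagation anywhere.
W = rng.normal(size=16)

def loss(mask):
    """Mean squared error of the masked network on the toy data."""
    pred = (X @ (W * mask) > 0).astype(float)
    return np.mean((pred - y) ** 2)

def neighbor(mask):
    """Swap one kept and one pruned branch, preserving the pruning ratio."""
    new = mask.copy()
    on = np.flatnonzero(new == 1)
    off = np.flatnonzero(new == 0)
    new[rng.choice(on)] = 0
    new[rng.choice(off)] = 1
    return new

# Start from a random mask that prunes 50% of the branches.
mask = np.zeros(16)
mask[rng.choice(16, size=8, replace=False)] = 1

T, cooling = 1.0, 0.95  # initial temperature and geometric cooling rate
cur = loss(mask)
for step in range(500):
    cand = neighbor(mask)
    new = loss(cand)
    # Metropolis criterion: always accept improvements; accept worse masks
    # with probability exp(-delta / T) to escape local optima.
    if new < cur or rng.random() < np.exp((cur - new) / T):
        mask, cur = cand, new
    T *= cooling

print(f"final loss: {cur:.3f}, kept branches: {int(mask.sum())}/16")
```

The key design point the sketch mirrors is that the search space is the set of masks at a fixed pruning percentage, so the neighbor move swaps a pruned branch for a kept one rather than flipping bits independently.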

Identifiers

pubmed: 35327859
pii: e24030348
doi: 10.3390/e24030348
pmc: PMC8947290

Publication types

Journal Article

Languages

eng

Grants

Agency: National Natural Science Foundation of China
ID: 71971127

Authors

Chun Lin Kuo (CL)

Tsinghua-Berkeley Shenzhen Institute, Nanshan, Shenzhen 518071, China.

Ercan Engin Kuruoglu (EE)

Tsinghua-Berkeley Shenzhen Institute, Nanshan, Shenzhen 518071, China.

Wai Kin Victor Chan (WKV)

Tsinghua-Berkeley Shenzhen Institute, Nanshan, Shenzhen 518071, China.

MeSH classifications