DeepSwarm

Optimising Convolutional Neural Networks using Swarm Intelligence

Edvinas Byla, Wei Pang*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we propose DeepSwarm, a novel neural architecture search (NAS) method based on Swarm Intelligence principles. At its core, DeepSwarm uses Ant Colony Optimization (ACO) to generate an ant population which uses pheromone information to collectively search for the best neural architecture. Furthermore, by using local and global pheromone update rules our method ensures a balance between exploitation and exploration. On top of this, to make our method more efficient we combine progressive neural architecture search with weight reusability. Moreover, due to the nature of ACO our method can incorporate heuristic information which can further speed up the search process. After systematic and extensive evaluation, we find that our proposed method demonstrates competitive performance compared to existing systems on three different datasets (MNIST, Fashion-MNIST, and CIFAR-10). Finally, we open source DeepSwarm (https://github.com/Pattio/DeepSwarm) as a NAS library and hope it can be used by more deep learning researchers and practitioners.
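
The local and global pheromone update rules mentioned in the abstract follow the general Ant Colony System scheme of decaying pheromone on edges that ants have just traversed while reinforcing the edges of the best path found so far. The short Python sketch below illustrates that scheme only; the decay rates, the accuracy-based deposit, and the function names are illustrative assumptions rather than DeepSwarm's actual implementation (see https://github.com/Pattio/DeepSwarm for the real code).

# Illustrative sketch of ACS-style local and global pheromone updates.
# Constants and the accuracy-based deposit are assumptions for clarity,
# not DeepSwarm's exact implementation.

INITIAL_PHEROMONE = 0.1  # tau_0: pheromone assigned to every new edge
LOCAL_DECAY = 0.1        # how strongly a traversed edge decays (exploration)
GLOBAL_DECAY = 0.1       # how strongly the best path is reinforced (exploitation)

def local_update(pheromone):
    # Applied to each edge an ant traverses: pulls the value back towards
    # tau_0, so subsequent ants are nudged to explore different paths.
    return (1 - LOCAL_DECAY) * pheromone + LOCAL_DECAY * INITIAL_PHEROMONE

def global_update(pheromone, best_accuracy):
    # Applied only to edges on the best architecture found so far:
    # reinforces them in proportion to that architecture's accuracy.
    return (1 - GLOBAL_DECAY) * pheromone + GLOBAL_DECAY * best_accuracy

# Example: a freshly traversed edge decays, an edge on the best path grows.
print(local_update(0.5))                       # ≈ 0.46
print(global_update(0.5, best_accuracy=0.92))  # ≈ 0.542
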
Original language: English
Title of host publication: Advances in Computational Intelligence Systems
Subtitle of host publication: Contributions Presented at the 19th UK Workshop on Computational Intelligence, September 4-6, 2019, Portsmouth, UK
Editors: Zhaojie Ju, Longzhi Yang, Chenguang Yang, Alexander Gegov, Dalin Zhou
Place of Publication: Cham
Publisher: Springer
Pages: 119-130
Number of pages: 12
ISBN (Electronic): 9783030299330
ISBN (Print): 9783030299323
DOI: https://doi.org/10.1007/978-3-030-29933-0_10
Publication status: E-pub ahead of print - 30 Aug 2019
Event: 19th Annual UK Workshop on Computational Intelligence - University of Portsmouth, Portsmouth, United Kingdom
Duration: 4 Sep 2019 - 6 Sep 2019

Publication series

Name: Advances in Intelligent Systems and Computing
Publisher: Springer
Volume: 1043
ISSN (Print): 2194-5357
ISSN (Electronic): 2194-5365

Conference

Conference: 19th Annual UK Workshop on Computational Intelligence
Country: United Kingdom
City: Portsmouth
Period: 4/09/19 - 6/09/19

Fingerprint

Ant colony optimization
Neural networks
Reusability
Swarm intelligence
Deep learning

Keywords

  • Ant Colony Optimization
  • Neural Architecture Search

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science(all)

Cite this

Byla, E., & Pang, W. (2020). DeepSwarm: Optimising Convolutional Neural Networks using Swarm Intelligence. In Z. Ju, L. Yang, C. Yang, A. Gegov, & D. Zhou (Eds.), Advances in Computational Intelligence Systems: Contributions Presented at the 19th UK Workshop on Computational Intelligence, September 4-6, 2019, Portsmouth, UK (pp. 119-130). (Advances in Intelligent Systems and Computing; Vol. 1043). Cham: Springer. https://doi.org/10.1007/978-3-030-29933-0_10

@inproceedings{5c8ae47bbfbf468c9f1da78f2c38f425,
title = "DeepSwarm: Optimising Convolutional Neural Networks using Swarm Intelligence",
abstract = "In this paper we propose DeepSwarm, a novel neural architecture search (NAS) method based on Swarm Intelligence principles. At its core DeepSwarm uses Ant Colony Optimization (ACO) to generate ant population which uses the pheromone information to collectively search for the best neural architecture. Furthermore, by using local and global pheromone update rules our method ensures the balance between exploitation and exploration. On top of this, to make our method more efficient we combine progressive neural architecture search with weight reusability. Furthermore, due to the nature of ACO our method can incorporate heuristic information which can further speed up the search process. After systematic and extensive evaluation, we discover that on three different datasets (MNIST, Fashion-MNIST, and CIFAR-10) when compared to existing systems our proposed method demonstrates competitive performance. Finally, we open source DeepSwarm (https://github.com/Pattio/DeepSwarm) as a NAS library and hope it can be used by more deep learning researchers and practitioners.",
keywords = "Ant Colony Optimization, Neural Architecture Search",
author = "Edvinas Byla and Wei Pang",
year = "2019",
month = "8",
day = "30",
doi = "10.1007/978-3-030-29933-0_10",
language = "English",
isbn = "9783030299323",
series = "Advances in Intelligent Systems and Computing",
publisher = "Springer",
pages = "119--130",
editor = "Zhaojie Ju and Longzhi Yang and Chenguang Yang and Alexander Gegov and Dalin Zhou",
booktitle = "Advances in Computational Intelligence Systems",
}

TY - GEN

T1 - DeepSwarm

T2 - Optimising Convolutional Neural Networks using Swarm Intelligence

AU - Byla, Edvinas

AU - Pang, Wei

PY - 2019/8/30

Y1 - 2019/8/30

N2 - In this paper we propose DeepSwarm, a novel neural architecture search (NAS) method based on Swarm Intelligence principles. At its core DeepSwarm uses Ant Colony Optimization (ACO) to generate ant population which uses the pheromone information to collectively search for the best neural architecture. Furthermore, by using local and global pheromone update rules our method ensures the balance between exploitation and exploration. On top of this, to make our method more efficient we combine progressive neural architecture search with weight reusability. Furthermore, due to the nature of ACO our method can incorporate heuristic information which can further speed up the search process. After systematic and extensive evaluation, we discover that on three different datasets (MNIST, Fashion-MNIST, and CIFAR-10) when compared to existing systems our proposed method demonstrates competitive performance. Finally, we open source DeepSwarm (https://github.com/Pattio/DeepSwarm) as a NAS library and hope it can be used by more deep learning researchers and practitioners.

AB - In this paper we propose DeepSwarm, a novel neural architecture search (NAS) method based on Swarm Intelligence principles. At its core DeepSwarm uses Ant Colony Optimization (ACO) to generate ant population which uses the pheromone information to collectively search for the best neural architecture. Furthermore, by using local and global pheromone update rules our method ensures the balance between exploitation and exploration. On top of this, to make our method more efficient we combine progressive neural architecture search with weight reusability. Furthermore, due to the nature of ACO our method can incorporate heuristic information which can further speed up the search process. After systematic and extensive evaluation, we discover that on three different datasets (MNIST, Fashion-MNIST, and CIFAR-10) when compared to existing systems our proposed method demonstrates competitive performance. Finally, we open source DeepSwarm (https://github.com/Pattio/DeepSwarm) as a NAS library and hope it can be used by more deep learning researchers and practitioners.

KW - Ant Colony Optimization

KW - Neural Architecture Search

UR - http://www.scopus.com/inward/record.url?scp=85072860836&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-29933-0_10

DO - 10.1007/978-3-030-29933-0_10

M3 - Conference contribution

SN - 9783030299323

T3 - Advances in Intelligent Systems and Computing

SP - 119

EP - 130

BT - Advances in Computational Intelligence Systems

A2 - Ju, Zhaojie

A2 - Yang, Longzhi

A2 - Yang, Chenguang

A2 - Gegov, Alexander

A2 - Zhou, Dalin

PB - Springer

CY - Cham

ER -