CCE Theses and Dissertations
Campus Access Only
All rights reserved. This publication is intended for use solely by faculty, students, and staff of Nova Southeastern University. No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, now known or later developed, including but not limited to photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the author or the publisher.
Date of Award
2010
Document Type
Dissertation - NSU Access Only
Degree Name
Doctor of Philosophy in Information Systems (DISS)
Department
Graduate School of Computer and Information Sciences
Advisor
Easwar A Nyshadham
Committee Member
Sumitra Mukherjee
Committee Member
Randall S Sexton
Keywords
Genetic Algorithms, Neural Networks, Parallel Computing
Abstract
Parallelizing neural networks is an active area of research. Current approaches center on parallelizing the widely used back-propagation (BP) algorithm, whose substantial communication overhead makes it a poor fit for parallelization. An algorithm that does not depend on the calculation of derivatives and the backward propagation of errors lends itself better to a parallel implementation.
One well-known training algorithm for neural networks explicitly incorporates network structure into the objective function to be minimized, which yields simpler neural networks. Prior work implemented this approach with a modified genetic algorithm in a serial fashion that does not scale, limiting its usefulness.
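The dissertation itself gives the exact formulation; as a rough, hypothetical Python sketch of how a structure-aware objective for a genetic algorithm might look, a candidate can be scored by its prediction error plus a penalty for each active connection, so that simpler networks are rewarded. All names and the toy network below are illustrative, not the author's code.

import numpy as np

def fitness(weights, mask, X, y, penalty=0.01):
    # Lower is better: training error plus a cost per active connection.
    # 'mask' is a binary vector marking which connections are kept, so the
    # structure of the network appears directly in the objective value.
    active = weights * mask                # prune connections the mask turns off
    preds = np.tanh(X @ active)            # toy single-layer network output
    error = np.mean((preds - y) ** 2)      # data-fit term
    structure_cost = penalty * mask.sum()  # term that favors simpler networks
    return error + structure_cost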
This dissertation created a parallel version of the algorithm. The performance of the proposed algorithm is compared against the existing algorithm on a variety of synthetic and real-world problems. Computational experiments with benchmark datasets indicate that the parallel algorithm proposed in this research outperforms the serial version from prior research, finding better minima in the same amount of time while also identifying simpler architectures.
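One reason a derivative-free, population-based trainer parallelizes well is that each candidate's fitness can be evaluated independently, so workers need only exchange candidates and scores once per generation rather than synchronizing gradients at every step as back-propagation requires. A minimal, hypothetical sketch of such parallel fitness evaluation using Python's multiprocessing module (not the dissertation's implementation) follows.

from multiprocessing import Pool

def evaluate_population(population, fitness_fn, workers=4):
    # Score every candidate independently; only candidates and their scores
    # cross process boundaries, which keeps per-generation communication low.
    # fitness_fn must be a picklable, top-level function.
    with Pool(processes=workers) as pool:
        return pool.map(fitness_fn, population)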
NSUWorks Citation
Shannon Dale McMurtrey. 2010. Training and Optimizing Distributed Neural Networks Using a Genetic Algorithm. Doctoral dissertation. Nova Southeastern University. Retrieved from NSUWorks, Graduate School of Computer and Information Sciences. (243)
https://nsuworks.nova.edu/gscis_etd/243.