Item
-
Title
-
Genetic optimization of neural network configurations for natural language learning.
-
Identifier
-
AAI9924802
-
identifier
-
9924802
-
Creator
-
Davila, Jaime Jesus.
-
Contributor
-
Adviser: Virginia Teller
-
Date
-
1999
-
Language
-
English
-
Publisher
-
City University of New York.
-
Subject
-
Computer Science
-
Abstract
-
One approach used by researchers developing computer systems capable of understanding natural language is that of training a neural network (NN) for the task. Because of the large number of parameters that can be controlled in a neural network (such as topology, training data, transfer function, learning algorithm, and others), researchers have used networks with different configurations to achieve success in various natural language tasks.

A major unanswered question in NN research is how best to set a series of configuration parameters so as to maximize the network's performance. Variables such as the number and type of connections among neurons, the data to be used during training, and the speed of learning can dramatically affect performance. Moreover, the ways in which these parameters interact and affect the network's behavior are not well understood.

In the research reported here, no particular NN configuration is chosen in advance. Instead, genetic algorithms (GA) are used to search the configuration space for optimal combinations. The GA system defines important aspects of network configurations: topology (the number of nodes and layers and the connections among them), training set composition, and learning parameters.

Networks evolved by the GA system were successful in solving the natural language task; the percentage of correct outputs ranged from 80% to 97%. Network topology was the most important factor in the successful configurations. Training set composition also played a role, but transfer function and mutation rate were not significant factors.

The four main topologies discovered by the GA system were compared to four of the most commonly used topologies in NN research and were found to be significantly superior in their performance. The use of GAs to optimize NNs shows promise for all tasks where an optimal configuration is not known, and at the same time helps identify required NN characteristics given particular natural language properties.
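The GA search over network configurations described in the abstract can be sketched in miniature. The dissertation's actual chromosome encoding, selection scheme, and fitness measure (accuracy on the natural language task after training) are not specified in the abstract, so the parameter pools, the toy fitness surrogate, and all function names below are illustrative assumptions, not the author's method:

```python
import random

# Hypothetical configuration space; the real parameter ranges are assumptions.
HIDDEN_LAYERS = [1, 2, 3]
NODES_PER_LAYER = [4, 8, 16, 32]
LEARNING_RATES = [0.01, 0.05, 0.1, 0.5]
TRAINING_FRACTIONS = [0.5, 0.7, 0.9]
POOLS = {"layers": HIDDEN_LAYERS, "nodes": NODES_PER_LAYER,
         "lr": LEARNING_RATES, "train_frac": TRAINING_FRACTIONS}

def random_config():
    """Sample one NN configuration (one GA chromosome)."""
    return {key: random.choice(pool) for key, pool in POOLS.items()}

def fitness(cfg):
    """Stand-in for training the network and scoring it on the task.
    In the dissertation this would be the network's percent correct output;
    here it is a toy surrogate so the example runs without training."""
    return cfg["nodes"] * cfg["layers"] * cfg["train_frac"] / (1 + cfg["lr"])

def crossover(a, b):
    """Uniform crossover: each gene comes from one parent at random."""
    return {k: random.choice([a[k], b[k]]) for k in a}

def mutate(cfg, rate=0.1):
    """Resample each gene from its pool with probability `rate`."""
    return {k: (random.choice(POOLS[k]) if random.random() < rate else v)
            for k, v in cfg.items()}

def evolve(pop_size=20, generations=15):
    """Truncation selection: keep the fitter half, refill with offspring."""
    pop = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the elite half of the population survives each generation, the best configuration found never gets worse; the GA needs only a way to score a configuration, which is what makes it applicable whenever an optimal NN setup is not known in advance.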
-
Type
-
dissertation
-
Source
-
PQT Legacy CUNY.xlsx
-
degree
-
Ph.D.