Open Access Senior Thesis
Bachelor of Arts
© 2022 Seiji A. Akera
Artificial Neural Networks (ANNs) are a non-linear modeling method often used in place of linear models to analyze neural data. These networks generally serve one of two purposes: pure prediction, or improving our understanding of the brain through neural interpretability. FieldNet is an ANN designed by Dr. Gautam Agarwal that takes complex-valued theta oscillations recorded from implanted multi-electrodes and predicts a rat's location as it moves through a maze. In learning neural features to make this classification, FieldNet appears to reconstruct place fields. The construction of FieldNet affects both the performance and the reconstruction capabilities of the network. In this paper we focus on how choices in network design affect the performance of FieldNet, so that we may arrive at general principles for network design. Grid search was used on five hyperparameters: number of nodes, learning rate, activation constant, activation exponent, and batch normalization method. A Clutter-Based Dimension Reordering (CBDR) plot was used to visualize relationships between hyperparameters. Regarding hyperparameter codependence, the scalability of the number of nodes depends on the batch normalization method, with "unit norm" scaling most poorly. Poor data quality manifests as a more limited space of high-performing hyperparameter combinations. During construction and hyperparameter tuning of the network, we therefore recommend mixing some intentionally poor-quality data with clean data. We also recommend intentionally using a poor hyperparameter choice, such as the "unit norm" batch normalization method, to help identify the optimal combination of the other hyperparameters.
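The exhaustive grid search described above can be sketched in a few lines of Python. This is a minimal illustration only: the candidate values and the `evaluate` stub are hypothetical placeholders, not the settings or training procedure used in the thesis.

```python
from itertools import product

# Hypothetical search space over the five hyperparameters named in the
# abstract; the candidate values below are illustrative, not the thesis's.
grid = {
    "n_nodes": [32, 64, 128],
    "learning_rate": [1e-3, 1e-2],
    "activation_constant": [0.5, 1.0],
    "activation_exponent": [1, 2],
    "batch_norm": ["none", "batch", "unit_norm"],
}

def evaluate(params):
    """Placeholder: train FieldNet with `params` and return validation score."""
    return 0.0  # a real implementation would train and score the network

# Grid search = evaluate every combination in the Cartesian product.
results = []
for values in product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    results.append((params, evaluate(params)))

# 3 * 2 * 2 * 2 * 3 = 72 combinations in this illustrative grid
print(len(results))
```

The resulting list of (combination, score) pairs is the kind of five-dimensional table that a CBDR plot then reorders and flattens so that interactions between hyperparameters, such as node count versus normalization method, become visible.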
Akera, Seiji, "Hyperparameter Codependence in FieldNet: Guidelines for ANN construction" (2023). Pitzer Senior Theses. 160.