Charles M Higgins

Associate Professor, Neuroscience
Associate Professor, Neuroscience - GIDP
Associate Professor, Applied Mathematics - GIDP
Associate Professor, Electrical and Computer Engineering
Associate Professor, Entomology / Insect Science - GIDP
Associate Professor, BIO5 Institute
Contact
(520) 621-6604

Research Interest

Charles Higgins, PhD, is an Associate Professor in the Department of Neuroscience with a dual appointment in Electrical Engineering at the University of Arizona, where he also leads the Higgins Lab. Though he started his career as an electrical engineer, his fascination with the natural world has led him to study insect vision and visual processing while working to meld the worlds of robotics and biology. His research ranges from software simulations of brain circuits to interfacing live insect brains with robots, but his driving interest remains building truly intelligent machines.

Dr. Higgins’ lab conducts research in areas ranging from computational neuroscience to biologically inspired engineering. The unifying goal of all these projects is to understand the representations and computational architectures used by biological systems. These projects are conducted in close collaboration with neurobiology laboratories that perform anatomical, electrophysiological, and histological studies, mostly in insects.

More than three years ago he captured news headlines when he and his lab team demonstrated a robot they built that was guided by the brain and eyes of a moth. The moth, immobilized inside a plastic tube, was mounted on a six-inch-tall wheeled robot. When the moth moved its eyes to the right, the robot turned in that direction, demonstrating brain-machine interaction. While the demonstration was effective, Higgins soon went to work to overcome a difficulty the methodology presented: keeping the electrodes attached to the moth's brain while the robot was in motion. This challenge has led him to focus his work on another insect species.

Publications

Northcutt, B. D., & Higgins, C. M. (2018). A Minimal Computational Architecture for Range Estimation and Mapping. Robotics and Autonomous Systems.
Higgins, C. M., & Goodman, R. M. (1991). Incremental learning with rule-based neural networks. Proceedings. IJCNN-91-Seattle: International Joint Conference on Neural Networks, 875-880.

Abstract:

A classifier for discrete-valued variable classification problems is presented. The system utilizes an information-theoretic algorithm for constructing informative rules from example data. These rules are then used to construct a neural network to perform parallel inference and posterior probability estimation. The network can be grown incrementally, so that new data can be incorporated without repeating the training on previous data. It is shown that this technique performs as well as other techniques such as backpropagation while having unique advantages in incremental learning capability, training efficiency, knowledge representation, and hardware implementation suitability.
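The information-theoretic rule construction described here is in the spirit of the J-measure from Goodman's earlier work on rule induction, which weights a rule's coverage by the information its condition provides about the class distribution. A minimal sketch of that style of rule scoring on a toy discrete-valued dataset (the function name, dataset, and exact scoring details are illustrative, not taken from the paper):

```python
import math
from collections import Counter

def rule_score(examples, labels, feature, value):
    """J-measure-style score for the rule 'feature == value -> class'.

    Multiplies the rule's coverage P(condition) by the relative entropy
    between the class distribution given the condition and the prior
    class distribution. Higher scores mean more informative rules.
    """
    n = len(examples)
    covered = [y for x, y in zip(examples, labels) if x[feature] == value]
    if not covered:
        return 0.0
    p_cond = len(covered) / n
    prior = Counter(labels)
    posterior = Counter(covered)
    j = 0.0
    for label, cnt in posterior.items():
        p_post = cnt / len(covered)
        p_prior = prior[label] / n
        j += p_post * math.log2(p_post / p_prior)
    return p_cond * j

# Toy dataset: feature 0 fully determines the label; feature 1 is noise.
X = [(0, 1), (0, 0), (1, 1), (1, 0)]
y = ["a", "a", "b", "b"]
print(rule_score(X, y, feature=0, value=0))  # → 0.5 (informative rule)
print(rule_score(X, y, feature=1, value=0))  # → 0.0 (uninformative rule)
```

Scoring candidate rules this way lets new rules be added as new data arrives, which is what makes the incremental growth of the network possible without retraining on previous data.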

Johnson, L. A., & Higgins, C. M. (2006). A navigation aid for the blind using tactile-visual sensory substitution. Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings, 6289-6292.

Abstract:

The objective of this study is to improve the quality of life for the visually impaired by restoring their ability to self-navigate. In this paper we describe a compact, wearable device that converts visual information into a tactile signal. This device, constructed entirely from commercially available parts, enables the user to perceive distant objects via a different sensory modality. Preliminary data suggest that this device is useful for object avoidance in simple environments. © 2006 IEEE.

Northcutt, B. D., Dyhr, J. P., & Higgins, C. M. (2017). An Insect-Inspired Model for Visual Binding I: Learning Objects and Their Characteristics. Biological Cybernetics, 111(2), 185-206.
Dyhr, J. P., & Higgins, C. M. (2010). The spatial frequency tuning of optic-flow-dependent behaviors in the bumblebee Bombus impatiens. The Journal of Experimental Biology, 213(Pt 10).

Abstract:

Insects use visual estimates of flight speed for a variety of behaviors, including visual navigation, odometry, grazing landings and flight speed control, but the neuronal mechanisms underlying speed detection remain unknown. Although many models and theories have been proposed for how the brain extracts the angular speed of the retinal image, termed optic flow, we lack the detailed electrophysiological and behavioral data necessary to conclusively support any one model. One key property by which different models of motion detection can be differentiated is their spatiotemporal frequency tuning. Numerous studies have suggested that optic-flow-dependent behaviors are largely insensitive to the spatial frequency of a visual stimulus, but they have sampled only a narrow range of spatial frequencies, have not always used narrowband stimuli, and have yielded slightly different results between studies based on the behaviors being investigated. In this study, we present a detailed analysis of the spatial frequency dependence of the centering response in the bumblebee Bombus impatiens using sinusoidal and square wave patterns.
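One classic motion-detection model whose spatiotemporal tuning distinguishes it from a pure speed detector is the Hassenstein-Reichardt correlator, which delays the signal from one photoreceptor and correlates it with its neighbor. A minimal sketch (receptor separation, delay time constant, and stimulus parameters are illustrative, not taken from the paper) showing that its mean response to a grating moving at a fixed angular speed still varies with spatial frequency:

```python
import numpy as np

def reichardt_response(spatial_freq, speed, dt=0.001, T=2.0,
                       sep=0.01, tau=0.05):
    """Mean output of a Hassenstein-Reichardt correlator viewing a
    sine grating. spatial_freq in cycles/unit, speed in units/s,
    sep = photoreceptor separation, tau = delay-filter time constant."""
    t = np.arange(0, T, dt)
    tf = spatial_freq * speed  # temporal frequency seen by each receptor
    a = np.sin(2 * np.pi * tf * t)                         # receptor 1
    b = np.sin(2 * np.pi * (tf * t - spatial_freq * sep))  # receptor 2 (lags)

    def lowpass(x):
        # First-order low-pass filter acting as the delay stage.
        y = np.zeros_like(x)
        alpha = dt / (tau + dt)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    # Opponent subtraction of the two mirror-symmetric correlator arms.
    return np.mean(lowpass(a) * b - lowpass(b) * a)

# Same angular speed, two spatial frequencies: the responses differ,
# so the correlator signals temporal frequency rather than speed alone.
print(reichardt_response(0.5, 10.0), reichardt_response(2.0, 10.0))
```

Because the correlator's output depends on temporal frequency (spatial frequency times speed) rather than on speed alone, behavioral experiments that vary spatial frequency at fixed speed, as in this study, can discriminate correlator-like models from genuinely speed-tuned mechanisms.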