Charles M Higgins

Associate Professor, Neuroscience
Associate Professor, Neuroscience - GIDP
Associate Professor, Applied Mathematics - GIDP
Associate Professor, Electrical and Computer Engineering
Associate Professor, Entomology / Insect Science - GIDP
Associate Professor, BIO5 Institute
Contact
(520) 621-6604

Research Interest

Charles Higgins, PhD, is an Associate Professor in the Department of Neuroscience with a dual appointment in Electrical Engineering at the University of Arizona, where he also leads the Higgins Lab. Though he started his career as an electrical engineer, his fascination with the natural world has led him to study insect vision and visual processing while also working to meld the worlds of robotics and biology. His research ranges from software simulations of brain circuits to interfacing live insect brains with robots, but his driving interest remains building truly intelligent machines.

Dr. Higgins' lab conducts research in areas ranging from computational neuroscience to biologically inspired engineering. The unifying goal of these projects is to understand the representations and computational architectures used by biological systems. The projects are conducted in close collaboration with neurobiology laboratories that perform anatomical, electrophysiological, and histological studies, mostly in insects.

More than three years ago he captured news headlines when he and his lab team demonstrated a robot they had built that was guided by the brain and eyes of a moth. The moth, immobilized inside a plastic tube, was mounted on a 6-inch-tall wheeled robot. When the moth moved its eyes to the right, the robot turned in that direction, demonstrating brain-machine interaction. While the demonstration was effective, Higgins soon went to work to overcome a difficulty of the methodology: keeping the electrodes attached to the moth's brain while the robot was in motion. This has led him to focus his work on another insect species.

Publications

Higgins, C. M., Pant, V., & Deutschmann, R. (2005). Analog VLSI implementation of spatio-temporal frequency tuned visual motion algorithms. IEEE Transactions on Circuits and Systems I: Regular Papers, 52(3), 489-502.

Abstract:

The computation of local visual motion can be accomplished very efficiently in the focal plane with custom very large-scale integration (VLSI) hardware. Because they incorporate no thresholding operation, algorithms based on measurement of the spatial and temporal frequency content of the visual motion signal allow highly sensitive responses to low-contrast and low-speed visual motion stimuli. We describe analog VLSI implementations of the three most prominent spatio-temporal frequency-based visual motion algorithms, present characterizations of their performance, and compare the advantages of each on an equal basis. This comparison highlights important issues in the design of analog VLSI sensors, including the effects of circuit design on power consumption, the tradeoffs of subthreshold versus above-threshold MOSFET biasing, and methods of layout for focal plane vision processing arrays. The presented sensors are capable of distinguishing the direction of motion of visual stimuli at less than 5% contrast while consuming as little as 1 μW of electrical power. These visual motion sensors are useful in embedded applications where minimum power consumption, size, and weight are crucial. © 2005 IEEE.
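
For readers unfamiliar with this class of algorithms, the sketch below is a software analogue (not the analog VLSI circuits described in the paper) of one well-known spatio-temporal frequency-tuned motion model, the Hassenstein-Reichardt correlator, operating on two adjacent photoreceptor signals; the sign of its time-averaged output indicates direction of motion without any thresholding. The filter time constant and stimulus values are illustrative assumptions.

```python
# Minimal sketch of a discrete-time Hassenstein-Reichardt correlator.
import numpy as np

def lowpass(x, tau=5.0):
    """First-order low-pass filter acting as the temporal delay stage."""
    y = np.zeros_like(x)
    alpha = 1.0 / tau
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def reichardt_emd(p_left, p_right, tau=5.0):
    """Opponent correlator: delayed signal from one input multiplied by the
    undelayed neighbor, minus the mirror-symmetric term."""
    return lowpass(p_left, tau) * p_right - lowpass(p_right, tau) * p_left

# Sinusoidal grating sampled at two nearby points; the right input lags,
# i.e., the pattern moves from left to right.
t = np.arange(200)
p_left = np.sin(0.2 * t)
p_right = np.sin(0.2 * t - 0.4)
print(np.mean(reichardt_emd(p_left, p_right)))   # > 0 when motion is left-to-right
```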

Özalevli, E., Hasler, P., & Higgins, C. M. (2006). Winner-take-all-based visual motion sensors. IEEE Transactions on Circuits and Systems II: Express Briefs, 53(8), 717-721.

Abstract:

We present a novel analog VLSI implementation of visual motion computation based on the lateral inhibition and positive feedback mechanisms that are inherent in the hysteretic winner-take-all circuit. By use of an input-dependent bias current and threshold mechanism, the circuit resets itself to prepare for another motion computation. This implementation was inspired by the Barlow-Levick model of direction selectivity in the rabbit retina. Each pixel uses 33 transistors and two small capacitors to detect the direction of motion and can be altered with the addition of six more transistors to measure the interpixel transit time. Simulation results and measurements from fabricated VLSI designs are presented to show the operation of the circuits. © 2006 IEEE.
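
As a rough illustration of the Barlow-Levick scheme referenced above (a discrete-time software sketch with hypothetical delay and stimulus values, not the 33-transistor circuit), a delayed signal from the neighboring pixel vetoes the local response, so motion arriving from the neighbor's side (the null direction) is suppressed while motion in the preferred direction passes through:

```python
# Minimal sketch of a Barlow-Levick style direction-selective unit.
import numpy as np

def delay(x, n=3):
    """Pure n-sample delay standing in for the circuit's temporal filter."""
    return np.concatenate([np.zeros(n), x[:-n]])

def barlow_levick(p_here, p_neighbor, n=3):
    """Respond to the local signal unless the neighbor fired n samples earlier;
    motion arriving from the neighbor's side is the vetoed (null) direction."""
    veto = delay(p_neighbor, n) > 0.5
    return np.where(veto, 0.0, p_here)

# An edge that reaches the neighbor first and this pixel afterwards.
p_neighbor = np.array([0, 1, 1, 1, 1, 1, 1, 1, 1, 1], dtype=float)
p_here     = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1], dtype=float)
print(barlow_levick(p_here, p_neighbor).sum())   # ~0: null-direction motion vetoed
print(barlow_levick(p_neighbor, p_here).sum())   # > 0: mirror unit (opposite preference) responds
```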

Higgins, C. M., & Goodman, R. M. (1994). Fuzzy rule-based networks for control. IEEE Transactions on Fuzzy Systems, 2(1), 82-88.

Abstract:

We present a method for learning fuzzy logic membership functions and rules to approximate a numerical function from a set of examples of the function's independent variables and the resulting function value. This method uses a three-step approach to building a complete function approximation system: first, learning the membership functions and creating a cell-based rule representation; second, simplifying the cell-based rules using an information-theoretic approach for induction of rules from discrete-valued data; and, finally, constructing a computational (neural) network to compute the function value given its independent variables. This function approximation system is demonstrated with a simple control example: learning the truck-and-trailer backer-upper control system.
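
The sketch below conveys the general flavor of fuzzy rule-based function approximation; the membership functions, rule table, and weighted-average defuzzification here are illustrative assumptions, not the learning procedure described in the paper.

```python
# Minimal sketch: triangular membership functions over one input, a rule
# table mapping fuzzy labels to output values, and defuzzification by a
# firing-strength-weighted average.
import numpy as np

def tri(x, left, center, right):
    """Triangular membership function."""
    return np.maximum(0.0, np.minimum((x - left) / (center - left),
                                      (right - x) / (right - center)))

# Hypothetical membership functions for one input variable on [0, 10].
memberships = [
    ("LOW",    lambda x: tri(x, -5.0, 0.0, 5.0)),
    ("MEDIUM", lambda x: tri(x,  0.0, 5.0, 10.0)),
    ("HIGH",   lambda x: tri(x,  5.0, 10.0, 15.0)),
]

# One rule per fuzzy label: "IF x is LABEL THEN y is value" (a cell-based table).
rules = {"LOW": 1.0, "MEDIUM": 4.0, "HIGH": 9.0}

def fuzzy_approx(x):
    """Approximate y = f(x) as the firing-strength-weighted average of rule outputs."""
    strengths = np.array([mu(x) for _, mu in memberships])
    outputs = np.array([rules[name] for name, _ in memberships])
    return float(np.dot(strengths, outputs) / strengths.sum())

print(fuzzy_approx(2.5))   # blends the LOW and MEDIUM rules
```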

Higgins, C. M. (2001). Sensory architectures for biologically inspired autonomous robotics. Biological Bulletin, 200(2), 235-242.

PMID: 11341590

Abstract:

Engineers have a lot to gain from studying biology. The study of biological neural systems alone provides numerous examples of computational systems that are far more complex than any man-made system and perform real-time sensory and motor tasks in a manner that humbles the most advanced artificial systems. Despite the evolutionary genesis of these systems and the vast apparent differences between species, there are common design strategies employed by biological systems that span taxa, and engineers would do well to emulate these strategies. However, biologically-inspired computational architectures, which are continuous-time and parallel in nature, do not map well onto conventional processors, which are discrete-time and serial in operation. Rather, an implementation technology that is capable of directly realizing the layered parallel structure and nonlinear elements employed by neurobiology is required for power- and space-efficient implementation. Custom neuromorphic hardware meets these criteria and yields low-power dedicated sensory systems that are small, light, and ideal for autonomous robot applications. As examples of how this technology is applied, this article describes both a low-level neuromorphic hardware emulation of an elementary visual motion detector, and a large-scale, system-level spatial motion integration system.

Ozalevli, E., & Higgins, C. M. (2003). Multi-chip implementation of a biomimetic VLSI vision sensor based on the Adelson-Bergen algorithm. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2714, 433-440.

Abstract:

Biological motion sensors found in the retinas of species ranging from flies to primates are tuned to specific spatio-temporal frequencies to determine the local motion vectors in their visual field and perform complex motion computations. In this study, we present a novel implementation of a silicon retina based on the Adelson-Bergen spatio-temporal energy model of primate cortical cells. By employing a multi-chip strategy, we successfully implemented the model without much sacrifice of the fill factor of the photoreceptors in the front-end chip. In addition, the characterization results showed that this spatio-temporal frequency tuned silicon retina can detect the direction of motion of a sinusoidal input grating down to 10 percent contrast and over more than an order of magnitude in velocity. This multi-chip biomimetic vision sensor will allow complex visual motion computations to be performed in real-time. © Springer-Verlag Berlin Heidelberg 2003.
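
For orientation, here is a hedged software sketch of the Adelson-Bergen opponent motion-energy computation that the sensor implements in hardware; the filter shapes, constants, and sign conventions below are illustrative assumptions, not the chip's parameters. Quadrature pairs of spatial and temporal filters are combined into oppositely tuned energies whose difference signals the direction of a drifting grating.

```python
# Minimal sketch of Adelson-Bergen opponent motion energy on a (time, space) array.
import numpy as np

def motion_energy(stimulus, sigma=2.0, k=0.5, tau_fast=2.0, tau_slow=6.0):
    """Returns the mean opponent energy, which comes out positive for the
    rightward-drifting grating built below under these illustrative filter
    sign conventions."""
    T, X = stimulus.shape
    xc = np.arange(X) - X // 2
    t = np.arange(T)

    # Quadrature (even/odd) spatial filters and fast/slow temporal kernels.
    envelope = np.exp(-xc**2 / (2 * sigma**2))
    s_even = envelope * np.cos(k * xc)
    s_odd = envelope * np.sin(k * xc)
    t_fast = (t / tau_fast) * np.exp(-t / tau_fast)
    t_slow = (t / tau_slow) * np.exp(-t / tau_slow)

    def respond(t_filt, s_filt):
        spatial = stimulus @ s_filt              # project each frame onto the spatial filter
        return np.convolve(spatial, t_filt)[:T]  # causal temporal filtering

    a = respond(t_fast, s_even)
    b = respond(t_slow, s_odd)
    c = respond(t_fast, s_odd)
    d = respond(t_slow, s_even)

    # Oriented quadrature pairs, squared and summed into motion energies.
    right = (a - b)**2 + (d + c)**2
    left = (a + b)**2 + (d - c)**2
    return float(np.mean(right - left))

# Rightward-drifting sinusoidal grating, shape (time, space).
t = np.arange(200)[:, None]
x = np.arange(64)[None, :]
grating = np.sin(0.5 * x - 0.2 * t)
print(motion_energy(grating))   # positive => rightward under this convention
```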