BRAIN-INSPIRED ATTENTIONAL SEARCH MODEL TO SOLVE OBJECT RECOGNITION PROBLEMS IN 2D AND 3D ENVIRONMENTS

Date: 16th Oct 2023

Time: 11:00 AM

Venue: Google Meet

PAST EVENT

Details

We propose a brain-inspired attentional search model to solve real-world image recognition problems such as object detection, target search in 2D and 3D space, and object counting. Results on these problems demonstrate that the attentional approach not only enhances performance but also reduces computational cost by using smaller attention windows as input to the model. Replacing long short-term memory (LSTM) neurons, which have four gating variables, with JK flip-flop neurons, which have two, significantly reduces the number of trainable parameters in the network.

The proposed brain-inspired attentional search model has three separate channels: a classifier network, an eye-position network, and a saccade network. Multiple attentional windows with different resolutions and a common center are given as input to the classifier network and the saccade network, while a heat-map representation of the attentional windows' location is given as input to the eye-position network. The saccade network predicts the next jump of the attention windows with the help of reward signals received from the classifier network, which determines the object class. The classifier network and the saccade network are analogous to the processing of visual information along the "what" and "where/how" pathways, respectively. The model uses Elman, Jordan, and JK flip-flop recurrence layers as memory to store the history of views and their corresponding locations; these resemble the feedback loops present among visual cortical areas, for example from V1 to the thalamus or from V2 to V1. Furthermore, the flip-flop neurons are considered similar to the UP/DOWN neurons found in the prefrontal cortex (PFC), which are responsible for working memory.
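As a rough illustration of the parameter saving mentioned above, the sketch below implements a two-gate flip-flop recurrent cell in PyTorch and compares its parameter count with a four-gate LSTM cell of the same width. The class name, the gate wiring, and the soft JK update Q_next = J*(1 - Q) + (1 - K)*Q are assumptions drawn from the abstract, not the authors' published implementation.

import torch
import torch.nn as nn

class FlipFlopCell(nn.Module):
    """Minimal sketch of a JK-flip-flop recurrent cell.

    Only two gates (J and K) are computed per step, versus the four gates
    (input, forget, cell, output) of a standard LSTM cell, so the recurrent
    parameter count is roughly halved for the same hidden width.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # One linear map produces both gates from [input, previous state].
        self.gates = nn.Linear(input_size + hidden_size, 2 * hidden_size)

    def forward(self, x: torch.Tensor, q_prev: torch.Tensor) -> torch.Tensor:
        j, k = self.gates(torch.cat([x, q_prev], dim=-1)).chunk(2, dim=-1)
        j, k = torch.sigmoid(j), torch.sigmoid(k)
        # Soft JK update: set when J is high, reset when K is high, hold otherwise.
        return j * (1.0 - q_prev) + (1.0 - k) * q_prev

# Illustrative parameter comparison against an LSTM cell of the same width.
x_dim, h_dim = 64, 128
flipflop = FlipFlopCell(x_dim, h_dim)
lstm = nn.LSTMCell(x_dim, h_dim)
print(sum(p.numel() for p in flipflop.parameters()))  # two gates' worth of weights
print(sum(p.numel() for p in lstm.parameters()))      # four gates' worth of weights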

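The multi-resolution attention windows with a common center could be extracted along the lines of the sketch below. The function name, window sizes, number of scales, and bilinear resizing are illustrative choices, not necessarily those used in the model.

import torch
import torch.nn.functional as F

def multi_resolution_glimpse(image: torch.Tensor,
                             center: tuple,
                             base: int = 16,
                             scales: int = 3) -> torch.Tensor:
    """Crop `scales` square windows sharing one center, doubling in size,
    and resize each to `base` x `base`, giving a cheap foveated summary of
    the scene around the current fixation.

    image: (C, H, W) tensor; center: (row, col) in pixels.
    """
    c, h, w = image.shape
    cy, cx = center
    patches = []
    for s in range(scales):
        half = (base * 2 ** s) // 2
        # Clamp the crop to the image bounds (simplification: no padding).
        top, bottom = max(cy - half, 0), min(cy + half, h)
        left, right = max(cx - half, 0), min(cx + half, w)
        patch = image[:, top:bottom, left:right].unsqueeze(0)
        patches.append(F.interpolate(patch, size=(base, base),
                                     mode="bilinear", align_corners=False))
    # (scales*C, base, base): the stacked input for the classifier and saccade channels.
    return torch.cat(patches, dim=1).squeeze(0)

# Example: three windows of 16, 32, and 64 pixels around the fixation (50, 50).
img = torch.rand(3, 100, 100)
glimpse = multi_resolution_glimpse(img, center=(50, 50))  # shape (9, 16, 16)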

Journal Publications:


Sweta Kumari and V. Srinivasa Chakravarthy (2022), Biologically inspired image classifier based on saccadic eye movement design for convolutional neural networks, Neurocomputing, Volume 513, Pages 294-317, ISSN 0925-2312, Paper



Sweta Kumari, C. Vigneswaran, and V. Srinivasa Chakravarthy (2021), The flip-flop neuron: a memory efficient alternative for solving challenging sequence processing and decision-making problems, Neural Computing and Applications, Springer, Paper



Sweta Kumari, Shobha Amala, Nivethithan M, and V. Srinivasa Chakravarthy (2022), BIAS-3D: Brain inspired attentional search model fashioned after what and where/how pathways for target search in 3D environment, Frontiers in Computational Neuroscience, Paper

Speakers

Sweta Kumari (BT17D019)

Department of Biotechnology