Neuroscientists have identified two brain regions involved in creating panoramic memories, helping us merge fleeting views of our surroundings into a seamless, 360-degree panorama.
As we look at a scene, visual information flows from our retinas into the brain, which has regions that are responsible for processing different elements of what we see, such as faces or objects.
"Our understanding of our environment is largely shaped by our memory for what's currently out of sight," said lead author Caroline Robertson, post doctoral student at the Massachusetts Institute of Technology (MIT) in the US.
The study identified the hubs in the brain where memories of the panoramic environment are integrated with the current field of view.
The researchers suspected that areas involved in processing scenes -- the occipital place area (OPA), the retrosplenial complex (RSC) and the parahippocampal place area (PPA) -- might also be involved in generating panoramic memories of a place such as a street corner.
Brain scans conducted on study participants revealed that when participants saw two images that they knew were linked, the response patterns in the RSC and OPA regions were similar.
However, this was not the case for image pairs that the participants had not seen as linked.
This suggests that the RSC and OPA, but not the PPA, are involved in building panoramic memories of our surroundings, the researchers said.
"Our hypothesis was that as we begin to build memory of the environment around us, there would be certain regions of the brain where the representation of a single image would start to overlap with representations of other views from the same scene," Robertson added.
For the study, the team used immersive virtual reality headsets, which allowed them to show people many different panoramic scenes. The researchers showed participants images from 40 street corners in Boston's Beacon Hill neighbourhood.
The images were presented in two ways. Half the time, participants saw a 100-degree stretch of a 360-degree scene; the other half of the time, they saw two non-continuous stretches of a 360-degree scene.
After showing participants these panoramic environments, the researchers showed them 40 pairs of images and asked whether they came from the same street corner.
Participants were much better able to determine whether pairs came from the same corner if they had seen the two scenes linked in the 100-degree image than if they had seen them unlinked, according to the paper, published in the journal Current Biology.
--IANS