Large-scale foundation models and generative AI for BigData neuroscience
Recent advances in machine learning have led to revolutionary breakthroughs in computer games, image and natural language understanding, and scientific discovery. Foundation models and large-scale language models (LLMs) have recently achieved human-like intelligence thanks to BigData. With the help of self-supervised learning (SSL) and transfer learning, these models may potentially reshape the landscapes of neuroscience research and make a significant impact on the future. Here we present a mini-review on recent advances in foundation models and generative AI models as well as their applications in neuroscience, including natural language and speech, semantic memory, brain-machine interfaces (BMIs), and data augmentation. We argue that this paradigm-shift framework will open new avenues for many neuroscience research directions and discuss the accompanying challenges and opportunities.
| Main Authors: | Ran Wang, Zhe Sage Chen |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-06-01 |
| Series: | Neuroscience Research |
| Subjects: | Foundation model; Generative AI; BigData; Transformer; Self-supervised learning; Transfer learning |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S0168010224000750 |
| author | Ran Wang; Zhe Sage Chen |
|---|---|
| affiliation (Ran Wang) | Department of Psychiatry, New York University Grossman School of Medicine, New York, NY 10016, USA |
| affiliation (Zhe Sage Chen) | Department of Psychiatry, New York University Grossman School of Medicine, New York, NY 10016, USA; Department of Neuroscience and Physiology, Neuroscience Institute, New York University Grossman School of Medicine, New York, NY 10016, USA; Department of Biomedical Engineering, New York University Tandon School of Engineering, Brooklyn, NY 11201, USA (corresponding author) |
| collection | DOAJ |
| institution | OA Journals |
| format | Article |
| language | English |
| issn | 0168-0102 |
| publisher | Elsevier |
| publishDate | 2025-06-01 |
| series | Neuroscience Research |
| citation | Neuroscience Research 215 (2025) 3–14 |
| doi | 10.1016/j.neures.2024.06.003 |
| description | Recent advances in machine learning have led to revolutionary breakthroughs in computer games, image and natural language understanding, and scientific discovery. Foundation models and large-scale language models (LLMs) have recently achieved human-like intelligence thanks to BigData. With the help of self-supervised learning (SSL) and transfer learning, these models may potentially reshape the landscapes of neuroscience research and make a significant impact on the future. Here we present a mini-review on recent advances in foundation models and generative AI models as well as their applications in neuroscience, including natural language and speech, semantic memory, brain-machine interfaces (BMIs), and data augmentation. We argue that this paradigm-shift framework will open new avenues for many neuroscience research directions and discuss the accompanying challenges and opportunities. |
| topic | Foundation model; Generative AI; BigData; Transformer; Self-supervised learning; Transfer learning |
| url | http://www.sciencedirect.com/science/article/pii/S0168010224000750 |