MoS2-based optoelectronic synapse array: the future of neuromorphic computing

March 05, 2025 04:20 PM AEDT | By EIN Presswire

GA, UNITED STATES, March 5, 2025 /EINPresswire.com/ -- In a major development for artificial intelligence, researchers have unveiled a 28×28 synaptic device array that promises to revolutionize artificial visual systems. This compact array, measuring just 0.7×0.7 cm², integrates sensing, memory, and processing to mimic the intricate functions of the human visual system. Built from wafer-scale monolayer molybdenum disulfide (MoS2) with gold nanoparticles for enhanced electron capture, the array exhibits remarkable coordination between its optical and electrical components. It can both write and erase images and has achieved 96.5% accuracy in handwritten digit recognition, marking a significant leap forward in the development of large-scale neuromorphic systems.

The human visual system processes complex visual data efficiently through an interconnected network that allows for parallel processing. Current artificial vision systems, by contrast, face numerous challenges, including circuit complexity, high power consumption, and difficulties in miniaturization. These issues stem from the physical separation between sensing devices and processing units, which hinders parallel processing of visual information. Despite previous attempts, simulating a complete, biologically inspired vision system with a single device has remained elusive, driving the need for more integrated, efficient solutions capable of real-time processing.

On January 13, 2025, a study (DOI: 10.1038/s41378-024-00859-2) published in Microsystems & Nanoengineering introduced a game-changing solution to these longstanding challenges. Led by a team from the Beijing Institute of Technology, the study presents a 28×28 synaptic device array fabricated from MoS2 floating-gate field-effect transistors. The array not only replicates the neural networks of the human visual system but also delivers exceptional optoelectronic synaptic performance, setting the stage for more efficient and integrated artificial visual systems.

The research team has designed a 28×28 array in which each device mimics the synaptic plasticity found in the human visual system. Using MoS2 floating-gate transistors with gold nanoparticles as electron-capture layers, they achieved stable and uniform optoelectronic performance, capable of simulating key synaptic behaviors such as excitatory postsynaptic current (EPSC) and paired-pulse facilitation (PPF). The array demonstrated an on/off ratio of around 10⁶ and an average mobility of 8 cm² V⁻¹ s⁻¹. Notably, the array was able to store and process image data, such as the emblem of Beijing Institute of Technology, showcasing its potential for optical data processing. Furthermore, the ability to adjust light intensity and fine-tune recognition accuracy provides a new way to optimize the system's performance under varying lighting conditions.
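
To make these synaptic behaviors concrete, the short Python sketch below models an EPSC that decays exponentially after a light pulse and a paired-pulse facilitation gain that shrinks as the interval between two pulses grows. This is a generic phenomenological model offered for illustration only; the amplitudes and time constants are assumed values, not figures from the study.

```python
import numpy as np

# Phenomenological sketch of EPSC decay and paired-pulse facilitation (PPF),
# two behaviors reported for the MoS2 synaptic devices. All parameter
# values below (amplitudes, time constants) are illustrative assumptions,
# NOT numbers taken from the study.

def epsc(t, t_pulse, amplitude=1.0, tau=50.0):
    """EPSC triggered at t_pulse (ms), decaying exponentially with time constant tau (ms)."""
    dt = t - t_pulse
    return np.where(dt >= 0, amplitude * np.exp(-dt / tau), 0.0)

def ppf_index(interval_ms, tau_f=80.0, c=0.6):
    """Relative gain of the second EPSC over the first; the facilitation
    fades as the inter-pulse interval grows (a single exponential is used
    here for brevity, though biexponential fits are common)."""
    return 1.0 + c * np.exp(-interval_ms / tau_f)

t = np.linspace(0, 300, 3001)        # time axis, ms
interval = 30.0                      # gap between the two light pulses, ms
a1 = 1.0                             # first EPSC amplitude, arb. units
a2 = a1 * ppf_index(interval)        # facilitated second amplitude
current = epsc(t, 0.0, a1) + epsc(t, interval, a2)

print(f"PPF index at {interval:.0f} ms interval: {ppf_index(interval):.2f}")
print(f"Peak current after the second pulse: {current.max():.2f} arb. units")
```

In models of this kind, a shorter inter-pulse interval produces a larger second response, which is the signature of paired-pulse facilitation that the array reproduces optically.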

Jing Zhao, the corresponding author of the study, emphasized the importance of these findings: “Our results offer a viable pathway toward large-scale integrated artificial visual neuromorphic systems. The performance of the MoS2-based synaptic array represents a major step toward practical applications, from device-level simulations to system-wide integration.”

These advances in artificial synaptic neural networks offer numerous advantages, including high integration density, stable uniformity, and powerful parallel processing. Such attributes could transform the performance of computational systems. The network's ability to process optoelectronic signals and adjust synaptic weights via light signals simultaneously has already delivered impressive results in handwritten digit recognition, with an accuracy of 96.5%. This breakthrough opens up exciting possibilities for deep learning and artificial vision, potentially ushering in smarter, more efficient systems.
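
As a rough illustration of how a 28×28 array's outputs could drive digit recognition, the Python sketch below trains a single-layer softmax classifier over 784 inputs, with each weight playing the role of a light-tunable synaptic conductance. The dataset is synthetic and every parameter is an assumption; this does not reproduce the authors' network or their 96.5% result.

```python
import numpy as np

# Minimal sketch: softmax regression over 28x28 = 784 inputs, standing in
# for a readout layer fed by the synaptic array. Data, learning rate, and
# iteration count are all illustrative assumptions.

rng = np.random.default_rng(0)
n_classes, n_pixels = 10, 28 * 28

# Synthetic dataset: each class is a noisy version of a random prototype image.
prototypes = rng.normal(size=(n_classes, n_pixels))
labels = rng.integers(0, n_classes, size=2000)
images = prototypes[labels] + 0.5 * rng.normal(size=(2000, n_pixels))

W = np.zeros((n_pixels, n_classes))   # weights, analogous to device conductances
b = np.zeros(n_classes)
lr = 0.1

for _ in range(200):                  # full-batch gradient descent
    logits = images @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    onehot = np.eye(n_classes)[labels]
    W -= lr * images.T @ (probs - onehot) / len(labels)
    b -= lr * (probs - onehot).mean(axis=0)

accuracy = (probs.argmax(axis=1) == labels).mean()
print(f"Training accuracy on synthetic digits: {accuracy:.1%}")
```

In the actual device, the analogous weight updates would be written optically and electrically into each floating-gate transistor rather than computed in software.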

DOI
10.1038/s41378-024-00859-2

Original Source URL
https://doi.org/10.1038/s41378-024-00859-2

Funding information
This work was supported by the National Natural Science Foundation of China (NSFC, Grant Nos. 62127810 and 61804009), the State Key Laboratory of Explosion Science and Safety Protection (QNKT24-03), the Xiaomi Young Scholar program, the Beijing Institute of Technology Research Fund Program for Young Scholars, and the Analysis & Testing Center, Beijing Institute of Technology.

Lucy Wang
BioDesign Research
