Neural Processing Letters
Published by Springer Nature
ISSN: 1370-4621 · eISSN: 1573-773X
Abbreviation: Neural Process. Lett.
Aims & Scope
Neural Processing Letters is an international journal that promotes the rapid exchange of current, state-of-the-art contributions among the community of artificial neural network researchers and users.
The Journal publishes technical articles on various aspects of artificial neural networks and machine learning systems.
Coverage includes novel architectures, supervised and unsupervised learning algorithms, deep nets, learning theory, network dynamics, self-organization, optimization, biological neural network modelling, and hybrid neural/fuzzy logic/genetic systems.
The Journal publishes articles on methodological innovations for applying the aforementioned systems to classification, pattern recognition, signal processing, image and video processing, robotics, control, autonomous vehicles, financial forecasting, big data analytics, and other multidisciplinary applications.
Metrics & Ranking
Impact Factor
| Year | Value |
|---|---|
| 2025 | 2.8 |
| 2024 | 2.60 |
Journal Rank
| Year | Value |
|---|---|
| 2024 | 8518 |
Journal Citation Indicator
| Year | Value |
|---|---|
| 2024 | 3943 |
SJR (SCImago Journal Rank)
| Year | Value |
|---|---|
| 2024 | 0.672 |
Quartile
| Year | Value |
|---|---|
| 2024 | Q2 |
h-index
| Year | Value |
|---|---|
| 2024 | 72 |
Abstracting & Indexing
The journal is indexed in leading academic databases, ensuring global visibility and accessibility of its peer-reviewed research.
Subjects & Keywords
The journal's research areas cover key disciplines and specialized sub-topics in Computer Science and Neuroscience, supporting cutting-edge academic discovery.
Most Cited Articles
The Most Cited Articles section features the journal's most impactful research, ranked by citation count. These articles have been cited frequently by other researchers, indicating their significant contribution to their respective fields.
- Differential Evolution Training Algorithm for Feed-Forward Neural Networks
  Citations: 503 | Authors: Jarmo, Joni-Kristian, Jouni
- Accelerating a Recurrent Neural Network to Finite-Time Convergence for Solving Time-Varying Sylvester Equation by Using a Sign-Bi-power Activation Function
  Citations: 340 | Authors: Shuai, Sanfeng, Bo
- Clustering Incomplete Data Using Kernel-Based Fuzzy C-means Algorithm
  Citations: 265 | Authors: Dao-Qiang, Song-Can
- Self-Adaptive Evolutionary Extreme Learning Machine
  Citations: 261 | Authors: Jiuwen, Zhiping, Guang-Bin
- Growing Grid — a self-organizing network with constant neighborhood range and adaptation strength
  Citations: 225 | Authors: Bernd
- Daily Activity Feature Selection in Smart Homes Based on Pearson Correlation Coefficient
  Citations: 219 | Authors: Yaqing, Yong, Keyu, Yiming, Jinghuan