Article
A Scientometric Study of Neurocomputing Publications (1992–2018): An Aerial Overview of Intrinsic Structure
Manvendra Janmaijaya 1, Amit K. Shukla 1, Ajith Abraham 2 and Pranab K. Muhuri 1,*
1 Department of Computer Science, South Asian University, New Delhi 110021, India; [email protected] (M.J.); [email protected] (A.K.S.)
2 Machine Intelligence Research Labs (MIR Labs), 3rd Street NW, P.O. Box 2259, Auburn, WA 98071, USA; [email protected]
* Correspondence: [email protected]
Received: 11 May 2018; Accepted: 16 July 2018; Published: 19 July 2018
Abstract: The international journal Neurocomputing (NC) is considered to be one of the most sought-after journals in the computer science research fraternity. In this paper, an extensive bibliometric overview of this journal is performed. The bibliometric data is extracted from the Web of Science (WoS) repository. The main objective of this study is to reveal internal structures and hidden inferences, such as highly productive and influential authors, the most contributing countries, top institutions, collaborating authors, and so on. CiteSpace and VOSviewer are used for the graphical mapping of the bibliometric data. Further, the document co-citation network, cluster detection, and references with strong citation bursts are analyzed to reveal the intellectual base of NC publications.
Keywords: neurocomputing; bibliometric study; scientometric mapping; co-citation analysis; Web of Science
1. Introduction
Bibliometrics (also referred to as scientometrics) as a research area has received tremendous attention from the scientific fraternity in recent times [1]. Bibliometrics is identified as an efficient technique to build a general outline of a certain scientific research field. There has been rapid development in the field in recent times, as social developments have gained significantly from the advancement of the internet and computers [2]. Scientometrics therefore gives an aerial view of scientific activity. It is basically a part of library and information science, but it has been judiciously used in numerous other research areas such as management [3,4], energy and fuels [5,6], psychology [7,8], dielectric and bioimpedance research [9], selfish memes [10], and so on. Bibliometric study is not only celebrated in the field of computer science but is also finding increasing attention in domains such as decision support systems [11,12], bioinformatics [13,14], heuristics [15], big data [16–18], and software engineering [19]. Moreover, there have been quality scientometric research contributions in the last few years on topics such as fuzzy decision making [20], fuzzy theory research in China [21], real-time operating systems [22], linguistic decision making studies [23], ordered weighted averaging operators [24], and Atanassov intuitionistic fuzzy sets [25]. Apart from subject-specific bibliometric studies, there are several journal-specific studies, such as those on Knowledge-Based Systems (KBS) [26], the International Journal of Intelligent Systems (IJIS) [27], IEEE Transactions on Fuzzy Systems (TFS) [28], the European Journal of Operational Research (EJOR) [29], Computers & Industrial Engineering [30], Information Sciences (INS) 1968–2016 [31,32], Medicine [33], Applied Soft Computing [34], and others [35–38].
Neurocomputing (NC) is one of the pioneering journals in the field of neurocomputing. Throughout the years it has maintained a high reputation and is widely read by the computer science research fraternity. NC publishes various document types, such as articles, theoretical contributions, literature reviews, conference and workshop meeting reports, short communications, and so on. These document types cover a wide area of the computer science discipline, including signal processing, speech processing, image processing, computer vision, control, robotics, optimization, scheduling, resource allocation, and financial forecasting. NC covers not only theoretical aspects but also practical contributions, such as simulation software environments, emulation hardware architectures, models of concurrent computation, neurocomputers, and neurochips (digital, analog, optical, and biodevices) [39]. The popularity of the NC journal can be observed from the impact factor it has acquired in a short span of time. The impact factor (IF) is a measure of the frequency with which an article of the journal is cited in a given year and is a journal metric used for ranking journals. According to the latest Journal Citation Reports of the Web of Science, which is owned by Clarivate Analytics, NC has an impact factor of 3.317. The major contributions of this paper are as follows:
1. This paper reports a detailed study of the generic bibliometric impression of NC publications by examining the most prominent articles, institutions, authors, author collaborations, institute collaborations, country collaborations, etc.
2. The purpose of this study is to provide an intrinsic structure of NC publications and the targeted research areas where the contributions are made.
3. A detailed study of the NC publication base is also performed to understand the knowledge base of NC publications.
4. The top cited articles in this journal are listed to give highlights of the NC publications.
5. The data of all the documents considered for this study have been retrieved from the Web of Science (WoS) database and comprise several document types, such as articles, proceedings papers, letters, editorials, and reviews.
6. Statistical techniques and analysis determine the nature of the NC citations and their structure.
The paper is organized as follows: Section 2 explains the data collection methodology, the performance indicators, and the visualization tools used in this paper. In Section 3, we cover the publication and citation structure, the most productive and influential authors, the top-ranked countries, and the institutions contributing most to NC. The visualization of the publication structure of the various entities (authors, countries, institutions, etc.) is shown with the help of CiteSpace. We then explain the bibliometric landscape and the document co-citation analysis in Sections 4 and 5, respectively. Finally, a conclusion is drawn in Section 6, followed by the references used in this paper.
2. Methodology and Performance Indicators
The repository of the Web of Science (WoS), a scientific citation indexing service maintained by Clarivate Analytics, is utilized for this study. WoS was the first multidisciplinary bibliographic index of journal publications and is considered a standard data source for bibliometrics; other databases, such as Scopus by Elsevier, are also used by bibliometricians. The first volume of NC appeared in 1989; however, the data for this study have been extracted from WoS, where NC's publications are indexed from 1992. Therefore, the data span of this study is from 1992 to 2018. The data was collected in December 2017, and a total of 10,838 publications were retrieved for this period. The query used in the search engine of WoS was "SO = Neurocomputing". Each record retrieved from WoS comprises a number of fields, such as author, author affiliation, title, abstract, citation record, and so on.
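As a minimal sketch of how such retrieved records can be processed (the data below is a toy stand-in, not the actual export; PY and TC are the standard WoS field tags for publication year and times cited, and the export is assumed to be tab-delimited):

```python
import csv
import io
from collections import Counter

# Toy stand-in for a tab-delimited Web of Science export; PY (publication
# year), TC (times cited), and TI (title) are standard WoS field tags.
sample_export = "PY\tTC\tTI\n2016\t12\tSome title\n2016\t0\tAnother title\n2017\t3\tThird title\n"

def summarise(handle):
    """Count papers and citations per publication year."""
    reader = csv.DictReader(handle, delimiter="\t")
    papers, citations = Counter(), Counter()
    for row in reader:
        year = row["PY"]
        papers[year] += 1
        citations[year] += int(row["TC"] or 0)
    return papers, citations

papers_per_year, citations_per_year = summarise(io.StringIO(sample_export))
for year in sorted(papers_per_year):
    print(year, "TP =", papers_per_year[year], "TC =", citations_per_year[year])
```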
This study uses a range of scientometric indicators combined with statistical analysis and visualization tools. Associations between several entities/variables (authors, organizations, or countries) have been explored; they are defined as follows:
(i) Co-citation analysis: This is a way to analyze the citation structure; it provides a glimpse of the relationships between papers, and through them other entities, inside a research domain. If two entities are co-cited, i.e., cited together more frequently, then there are closer academic or disciplinary ties between them.
(ii) Bibliographic coupling: This is the converse of co-citation; it is the number of times two or more entities cite the same entity. Both co-citation and bibliographic coupling indicate disciplinary links. The number of co-authored documents identifies collaborative or co-authorship links between two entities, directly linking authors, institutions, or countries. (By entity we mean either an author, an organization, or a country.)
(iii) Document co-citation analysis (DCA): This explores the citation base of publications, giving an insight into the inspiration corpus.
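To make the difference between the two link types concrete, the following sketch counts co-citation and bibliographic coupling links from toy reference lists (illustrative data only, not drawn from the NC corpus):

```python
from itertools import combinations
from collections import Counter

# Illustrative reference lists: each citing paper maps to the set of
# documents it cites (toy data).
references = {
    "P1": {"A", "B", "C"},
    "P2": {"B", "C", "D"},
    "P3": {"A", "C"},
}

# Co-citation: two cited documents are linked each time they appear
# together in the same reference list.
co_citation = Counter()
for cited in references.values():
    for pair in combinations(sorted(cited), 2):
        co_citation[pair] += 1

# Bibliographic coupling: two citing papers are linked by the number of
# references they share.
coupling = Counter()
for (p, refs_p), (q, refs_q) in combinations(references.items(), 2):
    coupling[(p, q)] = len(refs_p & refs_q)

print(co_citation)  # e.g. ('B', 'C') is co-cited twice
print(coupling)     # e.g. P1 and P2 share two references
```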
Performance indicators and visual tools: We now explain the various scientometric indicators used for the analysis:
(i) Total Papers (TP)—the total number of papers from a particular source;
(ii) Total Citations (TC)—the total number of citations generated by a particular publication within the WoS database;
(iii) Citations per Paper (CPP)—TC divided by TP;
(iv) Hirsch index or h-index—the number of papers (N) of an entity that have at least N citations each [40].
Further, two widely used graphical mapping software packages are employed, namely VOSviewer and CiteSpace:
(i) VOSviewer is a tool for the generation and visualization of bibliographic networks. It is used here for depicting bibliographic coupling and co-authorship between different entities. VOSviewer can construct networks of scientific publications, scientific journals, researchers, research organizations, countries, keywords, or terms. Items are visualized as nodes and the edges connecting these nodes, and can be linked by co-authorship, co-occurrence, citation, bibliographic coupling, or co-citation links; examples are bibliographic coupling links between publications, co-authorship links between researchers, and co-occurrence links between terms. VOSviewer version 1.6.5 has been used for this study.
(ii) CiteSpace [41] is used here for the visualization and analysis of the document co-citations of NC publications. It is an open source Java application used for visualizing trends from the metadata of scientific literature. CiteSpace version 5.0 R4 SE has been used for this study.
3. Citation Structure of NC Publications
In the year 1992, NC published 29 research articles. Since 1998, the number of publications has never dropped below 100, and the trend is mostly increasing; the maximum of 1840 articles was published in 2016, and the years 2015–2017 account for approximately 40% of the total NC publications. In 2017 there were 1118 publications up to the month of November, when the search query was performed, and this count will certainly increase; for 2018, 143 papers are already included in NC. The journal is expected to follow the current increasing trend in the years to come. Figure 1 shows the distribution of citations of NC publications over the years (1992–2018), which received a total of 106,536 citations. Interestingly, the year 1992 reflects only 30 citations from a total of 29 publications. The year 1998 accounted for a citation count of 3361 with only 108 publications,
making this the year with the maximum citations per year. However, the maximum number of citations was attracted in the year 2015, with a total count of 9108. The TC is not shown for 2018 as counting is yet to start. Moreover, the year 2006, with 335 publications, attracted citations (7617) comparable to those of 2016. On further analysis, this is found to be because of the article by Huang et al. [31], in which the extreme learning machine is presented along with its theory and applications.
Figure 1. Neurocomputing (NC) publications/year distribution.
The diversity in the total number of citations with respect to the total number of publications over the years is shown in Figure 2. There has been a gradual increase in the citations over the years. The maximum is in 1998, with a CPP value of 31.21, as seven publications from that year have attracted more than 100 citations each up to the date that the data was collected for this study. It is followed by the publications of the years 2006 and 2003, with CPPs of 22.7 and 21.5, respectively; this can also be attributed to the fact that many publications from these years have attracted a good number of citations. However, the citations-per-year count has degraded in the last few years, as shown in Figure 3. This is possibly because publications of the later years are fairly new and will likely attract more citations in the times to come.
Figure 2. Citations structure of NC over the years.
Table 1 displays the general citation structure of NC publications from 1992 to 2018; here, '%' is used to represent the percentage. There are 16 papers which received more than 200 citations, which is 0.14% of the total NC publications. With more than 100 citations, there are 56 publications, which account for 0.51% of the total. There are 2146 papers with no citations at all, which is 19.8% of the total publications. Approximately 80% of the publications had at least one citation; among them, a major chunk was published in the last five years, and thus this number is bound to change as they are fairly young publications. The year 1998 stands at the top position with respect to CPP, and almost 50% of NC publications have attracted more than five citations, which indicates the reach of the journal.
Figure 3. Citations per year of NC.
The Neurocomputing journal has been publishing quality and impactful research articles since its induction. Table 2 displays a list of the top 40 most cited articles as indexed by WoS; all the papers in this list got more than 100 citations. Huang et al. [42] received the highest number of citations, 2762 since 2006, for the paper titled "Extreme learning machine: Theory and applications". It has received citations with increasing speed and has a count of 230.17 citations per year, which is the highest in the top 40 list of most cited publications. This research article proposed a novel learning algorithm for single-hidden-layer feedforward neural networks, in which the hidden nodes are randomly chosen and the weights are analytically assigned. The second position is acquired by the paper "Time series forecasting using a hybrid ARIMA and neural network model" authored by Zhang [43], who proposed a hybrid approach combining the autoregressive integrated moving average model and artificial neural networks for linear and non-linear modelling in forecasting. These two are followed by Suykens et al. [44], Kim [45], and Kohonen [46], for their titles as follows: Weighted least squares support vector machines: robustness and sparse approximation; Financial time series forecasting using support vector machines; and The self-organizing map, respectively. The paper by Suykens et al. [44] proposed a method for obtaining robust estimates for regression by applying a weighted version of the least squares support vector machine (LS-SVM), along with a sparse approximation procedure for weighted and unweighted LS-SVM. Kim [45] examined the feasibility of support vector machines for financial forecasting. Kohonen [46] gave an overview of the self-organizing map algorithm, namely, a method for visualizing high-dimensional data. There are six papers in the list which were published before 2000, with the rest published after 2000. From the last five years, two papers appear in the top 40 most cited list: "A survey on fall detection: Principles and approaches" and "Weighted extreme learning machine for imbalance learning" from Mubashir et al. [47] and Zong et al. [48], respectively.
Table 1. General citation structure in the journal NC (1992–2018).
Year | ≥200 | ≥100 | ≥50 | ≥20 | ≥10 | ≥5 | ≥1 | 0 | TP | h-Index | TC | CPP
2018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 143 | 143 | 0 | 0 | 0.00
2017 | 0 | 0 | 0 | 1 | 4 | 9 | 264 | 854 | 1118 | 7 | 510 | 0.46
2016 | 0 | 0 | 3 | 21 | 93 | 419 | 1389 | 451 | 1840 | 20 | 5813 | 3.16
2015 | 0 | 0 | 1 | 73 | 322 | 715 | 1200 | 101 | 1301 | 27 | 9190 | 7.06
2014 | 0 | 1 | 8 | 106 | 321 | 576 | 845 | 57 | 902 | 31 | 8495 | 9.42
2013 | 0 | 2 | 13 | 115 | 296 | 468 | 663 | 39 | 702 | 35 | 8200 | 11.68
2012 | 0 | 0 | 14 | 89 | 207 | 296 | 398 | 29 | 427 | 36 | 5826 | 13.64
2011 | 1 | 1 | 20 | 91 | 189 | 260 | 350 | 19 | 369 | 33 | 5605 | 15.19
2010 | 1 | 2 | 19 | 94 | 187 | 261 | 329 | 16 | 345 | 36 | 5840 | 16.93
2009 | 0 | 2 | 27 | 124 | 235 | 313 | 389 | 25 | 414 | 41 | 7095 | 17.14
2008 | 1 | 7 | 30 | 115 | 199 | 277 | 367 | 26 | 393 | 40 | 7261 | 18.48
2007 | 1 | 4 | 24 | 75 | 151 | 227 | 322 | 20 | 342 | 35 | 5405 | 15.80
2006 | 1 | 3 | 16 | 82 | 153 | 219 | 305 | 30 | 335 | 35 | 7627 | 22.77
2005 | 1 | 4 | 17 | 53 | 97 | 154 | 233 | 22 | 255 | 32 | 3806 | 14.93
2004 | 0 | 0 | 8 | 46 | 104 | 161 | 261 | 34 | 295 | 27 | 3077 | 10.43
2003 | 4 | 9 | 21 | 57 | 92 | 131 | 210 | 42 | 252 | 33 | 5439 | 21.58
2002 | 2 | 5 | 11 | 47 | 102 | 162 | 257 | 51 | 308 | 31 | 4331 | 14.06
2001 | 1 | 1 | 6 | 30 | 67 | 108 | 230 | 27 | 257 | 22 | 2234 | 8.69
2000 | 0 | 2 | 10 | 27 | 56 | 97 | 185 | 34 | 219 | 24 | 2193 | 10.01
1999 | 0 | 1 | 5 | 25 | 47 | 82 | 162 | 22 | 184 | 24 | 1737 | 9.44
1998 | 2 | 7 | 17 | 40 | 58 | 79 | 96 | 12 | 108 | 30 | 3365 | 31.16
1997 | 0 | 1 | 3 | 17 | 26 | 39 | 59 | 15 | 74 | 18 | 982 | 13.27
1996 | 1 | 1 | 8 | 21 | 34 | 47 | 69 | 34 | 103 | 20 | 1407 | 13.66
1995 | 0 | 1 | 3 | 11 | 22 | 31 | 52 | 10 | 62 | 15 | 765 | 12.34
1994 | 0 | 2 | 3 | 7 | 11 | 15 | 25 | 14 | 39 | 10 | 516 | 13.23
1993 | 0 | 0 | 1 | 3 | 3 | 9 | 15 | 7 | 22 | 6 | 180 | 8.18
1992 | 0 | 0 | 0 | 1 | 3 | 9 | 17 | 12 | 29 | 6 | 106 | 3.66
Total papers | 16 | 56 | 288 | 1371 | 3079 | 5164 | 8692 | 2146 | 10,838 | | |
% | 0.14 | 0.51 | 2.65 | 12.64 | 28.40 | 47.64 | 80.19 | 19.8 | | | |
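The threshold columns of Table 1 can be derived directly from per-record publication years and citation counts; the following is a rough sketch under that assumption (toy records shown, not the retrieved NC data):

```python
from collections import defaultdict

# Toy records: (publication year, times cited). In practice these would be
# the PY and TC fields of the retrieved records.
records = [(2016, 250), (2016, 40), (2016, 0), (2015, 120), (2015, 7), (2015, 1)]

thresholds = (200, 100, 50, 20, 10, 5, 1)
table = defaultdict(lambda: defaultdict(int))
for year, cites in records:
    for t in thresholds:
        if cites >= t:
            table[year][f">={t}"] += 1   # count papers at or above each threshold
    if cites == 0:
        table[year]["0"] += 1            # uncited papers
    table[year]["TP"] += 1
    table[year]["TC"] += cites

for year in sorted(table):
    row = table[year]
    row["CPP"] = round(row["TC"] / row["TP"], 2)
    print(year, dict(row))
```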
Table 2. Top 40 cited publications in NC.
Rank | Title | Authors | Year | TC | Citations per Year
1 | Extreme learning machine: Theory and applications | Huang GB; Zhu QY; Siew CK | 2006 | 2762 | 230.17
2 | Time series forecasting using a hybrid ARIMA and neural network model | Zhang GP | 2003 | 660 | 44.00
3 | Weighted least squares support vector machines: robustness and sparse approximation | Suykens JAK; De Brabanter J; Lukas L; Vandewalle J | 2002 | 615 | 38.44
4 | Financial time series forecasting using support vector machines | Kim KJ | 2003 | 463 | 30.87
5 | The self-organizing map | Kohonen T | 1998 | 462 | 23.10
6 | Convex incremental extreme learning machine | Huang GB; Chen L | 2007 | 454 | 41.27
7 | Enhanced random search based incremental extreme learning machine | Huang GB; Chen L | 2008 | 376 | 37.60
8 | Blind separation of convolved mixtures in the frequency domain | Smaragdis P | 1998 | 361 | 18.05
9 | Optimization method based extreme learning machine for classification | Huang GB; Ding XJ; Zhou HM | 2010 | 335 | 41.88
10 | Designing a neural network for forecasting financial and economic time series | Kaastra I; Boyd M | 1996 | 322 | 14.64
11 | The support vector machine under test | Meyer D; Leisch F; Hornik K | 2003 | 291 | 19.40
12 | Evaluation of simple performance measures for tuning SVM hyperparameters | Duan K; Keerthi SS; Poo AN | 2003 | 280 | 18.67
13 | (2D)(2)PCA: Two-directional two-dimensional PCA for efficient face representation and recognition | Zhang DQ; Zhou ZH | 2005 | 265 | 20.38
14 | An approach to blind source separation based on temporal structure of speech signals | Murata N; Ikeda S; Ziehe A | 2001 | 240 | 14.12
15 | Error-backpropagation in temporally encoded networks of spiking neurons | Bohte SM; Kok JN; La Poutre H | 2002 | 235 | 14.69
16 | Recent advances and trends in visual tracking: A review | Yang HX; Shao L; Zheng F; Wang L; Song Z | 2011 | 229 | 32.71
17 | Modified support vector machines in financial time series forecasting | Tay FEH; Cao LJ | 2002 | 194 | 12.13
18 | Support vector machines experts for time series forecasting | Cao LJ | 2003 | 192 | 12.80
19 | A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine | Cao LJ; Chua KS; Chong WK; Lee HP; Gu QM | 2003 | 191 | 12.73
20 | Empirical evaluation of the improved Rprop learning algorithms | Igel C; Husken M | 2003 | 188 | 12.53
21 | Determination of the spread parameter in the Gaussian kernel for classification and regression | Wang WJ; Xu ZB; Lu WZ; Zhang XY | 2003 | 186 | 12.40
22 | Evolutionary tuning of multiple SVM parameters | Friedrichs F; Igel C | 2005 | 185 | 14.23
23 | A survey on fall detection: Principles and approaches | Mubashir M; Shao L; Seed L | 2013 | 174 | 34.80
24 | WEBSOM - Self-organizing maps of document collections | Kaski S; Honkela T; Lagus K; Kohonen T | 1998 | 169 | 8.45
25 | Asymptotic and robust stability of genetic regulatory networks with time-varying delays | Ren F; Cao J | 2008 | 168 | 16.80
26 | Natural Actor-Critic | Peters J; Schaal S | 2008 | 167 | 16.70
27 | Artificial neural networks in hardware: A survey of two decades of progress | Misra J; Saha I | 2010 | 165 | 20.63
28 | Learning And Generalization Characteristics Of The Random Vector Functional-Link Net | Pao YH; Park GH; Sobajic DJ | 1994 | 163 | 6.79
29 | A new correlation-based measure of spike timing reliability | Schreiber S; Fellous JM; Whitmer D; Tiesinga P; Sejnowski TJ | 2003 | 158 | 10.53
30 | Time-series prediction using a local linear wavelet neural network | Chen YH; Yang B; Dong JW | 2006 | 156 | 13.00
31 | A fast pruned-extreme learning machine for classification problem | Rong HJ; Ong YS; Tan AH; Zhu ZX | 2008 | 154 | 15.40
32 | The nonlinear PCA learning rule in independent component analysis | Oja E | 1997 | 154 | 7.33
33 | Weighted extreme learning machine for imbalance learning | Zong WW; Huang GB; Chen YQ | 2013 | 147 | 29.40
34 | A case study on using neural networks to perform technical forecasting of forex | Yao JT; Tan CL | 2000 | 143 | 7.94
35 | Hopfield neural networks for optimization: study of the different dynamics | Joya G; Atencia MA; Sandoval F | 2002 | 142 | 8.88
36 | Fully complex extreme learning machine | Li MB; Huang GB; Saratchandran P; Sundararajan N | 2005 | 140 | 10.77
37 | Ensemble of online sequential extreme learning machine | Lan Y; Soh YC; Huang GB | 2009 | 139 | 15.44
38 | Multimodality medical image fusion based on multiscale geometric analysis of contourlet transform | Yang L; Guo BL; Ni W | 2008 | 133 | 13.30
39 | A comprehensive review of current local features for computer vision | Li J; Allinson NM | 2008 | 127 | 12.70
40 | Methodology for long-term prediction of time series | Sorjamaa A; Hao J; Reyhani N; Ji YN; Lendasse A | 2007 | 125 | 11.36
Top Authorship, Country, and Institutions
The most productive and influential authors are listed in Table 3. The most productive authors are ordered by TP, while the most influential authors are identified by TC; the table also gives each author's total citations, citations per paper, and h-index (a minimal sketch of this per-author aggregation follows Table 3). Cao and Yang both have 74 publications; however, Cao received 1574 citations, which is far higher than Yang's 510. These two are followed by Wang (TP = 72) and Zhang (TP = 64), while Li and Liu each published 60 papers.
Table 3. Most Productive and Influential Authors in NC (1992–2018).
Rank | Name | Country | Affiliation | TP | TC | CPP | h-Index
1 | Cao JD | China | Southeast University | 74 | 1574 | 21.27 | 23
2 | Yang J | China | Beijing Institute of Technology | 74 | 510 | 6.89 | 11
3 | Wang J | China | Tsinghua University | 72 | 374 | 5.19 | 12
4 | Zhang HG | China | Northeastern University | 64 | 899 | 14.05 | 17
5 | Li J | China | Xidian University | 60 | 528 | 8.80 | 13
6 | Liu Y | China | Northeastern University | 60 | 495 | 8.25 | 14
7 | Zhang J | China | Beijing Institute of Technology | 58 | 429 | 7.40 | 10
8 | Wang Y | China | Central South University | 55 | 344 | 6.25 | 10
9 | Zhang Y | China | Tongji University | 54 | 448 | 8.30 | 11
10 | Yuan Y | China | Chinese Academy of Sciences | 52 | 615 | 11.83 | 15
11 | Li XL | China | Chinese Academy of Sciences | 50 | 461 | 9.22 | 12
12 | Sanchez VD | USA | University of Miami | 50 | 197 | 3.94 | 7
13 | Wang D | China | Dalian University of Technology | 49 | 565 | 11.53 | 13
14 | Wang L | China | Wuhan University | 48 | 662 | 13.79 | 12
15 | Liu J | China | Qingdao Agricultural University | 45 | 215 | 4.78 | 8
16 | Huang TW | Qatar | Texas A&M University | 44 | 496 | 11.27 | 13
17 | Liu YR | China | Yangzhou University | 43 | 712 | 16.56 | 14
18 | Zhang H | China | PLA University of Science & Technology | 43 | 365 | 8.49 | 11
19 | Zhang L | China | Hong Kong Polytechnic University | 43 | 323 | 7.51 | 12
20 | Gao XB | China | Xidian University | 42 | 362 | 8.62 | 12
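As referenced above, a minimal sketch of the per-author aggregation behind Table 3, assuming each retrieved record carries an author list and a times-cited value (toy records; names shown only for illustration):

```python
from collections import defaultdict

# Toy records with author lists and times-cited counts.
records = [
    {"authors": ["Cao JD", "Yang J"], "tc": 30},
    {"authors": ["Cao JD"], "tc": 12},
    {"authors": ["Yang J"], "tc": 4},
]

per_author = defaultdict(list)
for rec in records:
    for author in rec["authors"]:
        per_author[author].append(rec["tc"])

for author, cites in per_author.items():
    tp, tc = len(cites), sum(cites)
    ranked = sorted(cites, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    print(author, "TP =", tp, "TC =", tc, "CPP =", round(tc / tp, 2), "h-index =", h)
```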
Again, Cao tops the list of most influential authors with a TC of 1574. Zhang HG, Liu YR, Wang L, and Yuan Y are next in the list with TCs of 899, 712, 662, and 615, respectively. The maximum citations per paper (CPP) were attained by Cao (21.27), Liu (16.56), Zhang (14.05), Wang (13.79), and Yuan (11.83). The h-index is again highest for Cao (23); Zhang (17) and Yuan (15) come second and third in the list, followed by Liu YR and Liu Y with the same h-index of 14. Figure 4 conveys similar information to Table 3; for example, the circles representing Cao and Zhang are the biggest, followed by Wang and Huang. Further, it depicts the author co-citation network and thus describes the disciplinary ties between the authors.
Figure 4. Author Co-Citation Analysis.
Researchers from various countries publish in Neurocomputing, one of the leading journals of computer science. The 25 leading countries publishing in NC are listed in Table 4. Apart from total papers, the table also reports total citations, citations per paper, and h-index, and the countries are ordered by the total number of publications. The People's Republic of China is the most productive country, taking the first spot with a total publication count of 5179. It is followed by the USA (1498), England (590), Spain (564), and Germany (492). Surprisingly, there is no developing country in the list of the top 10 most productive countries; India, a developing nation, ranks 11th with 277 publications.
Table 4. Most Productive Countries Publishing in NC (1992–2018).
Rank | Country | TP | TC | CPP | h-Index
1 | Peoples R China | 5179 | 45,688 | 8.82 | 63
2 | USA | 1498 | 13,933 | 9.30 | 46
3 | England | 590 | 6596 | 11.18 | 37
4 | Spain | 564 | 4880 | 8.65 | 31
5 | Germany | 492 | 5145 | 10.46 | 30
6 | Japan | 439 | 3584 | 8.16 | 27
7 | France | 346 | 3909 | 11.30 | 30
8 | Australia | 343 | 2846 | 8.30 | 25
9 | Italy | 294 | 2825 | 9.61 | 26
10 | Singapore | 278 | 8764 | 31.53 | 35
11 | India | 277 | 3046 | 11.00 | 26
12 | South Korea | 276 | 3016 | 10.93 | 27
13 | Brazil | 256 | 1899 | 7.42 | 21
14 | Canada | 253 | 2481 | 9.81 | 27
15 | Taiwan | 244 | 2516 | 10.31 | 25
16 | Iran | 174 | 1511 | 8.68 | 20
17 | Saudi Arabia | 170 | 1550 | 9.12 | 21
18 | Finland | 149 | 2434 | 16.34 | 23
19 | Poland | 139 | 1427 | 10.27 | 23
20 | Belgium | 118 | 2074 | 17.58 | 21
21 | The Netherlands | 111 | 1410 | 12.70 | 20
22 | Scotland | 104 | 840 | 8.08 | 16
23 | Mexico | 101 | 686 | 6.79 | 15
24 | Greece | 86 | 900 | 10.47 | 17
25 | Malaysia | 77 | 393 | 5.10 | 10
The ranking of the most influential countries remains the same as that of the most productive countries for China (TC = 45,688) and the USA (TC = 13,933). Interestingly, Singapore comes next with 8764 citations from only 278 publications, followed by England (6596), Germany (5145), and Spain (4880). The citations per paper show a different picture of the influence of the countries. Singapore tops the list with the highest CPP of 31.53, followed by Belgium (17.58), Finland (16.34), and the Netherlands (12.7); these three countries were not even in the list of the top 10 most productive countries. In total, 12 countries have a CPP of more than 10. With respect to the h-index, China tops the list with a value of 63, followed by the USA (46), England (37), Singapore (35), and Spain (31), while France and Germany share the same h-index value of 30. The dominance of China can also be seen in Figure 5, which describes the co-citation analysis of countries. The inference from this visualization is that, almost every time, a paper originating from China is cited along with papers originating from other countries.
Figure 5. Countries Co-Citation Analysis.
The institution-wise ordering is displayed in Table 5, which shows the total publications, total citations, citations per paper, and h-index of the most productive institutions contributing to NC publications. The Chinese Academy of Sciences, China, produced 526 papers with 4752 total citations. The second spot is taken by the Harbin Institute of Technology, also from China, with 219 papers, and Southeast University, China, takes the third spot with 206 papers. As can be seen, most of the institutes in the list belong to China. However, in terms of TC, Nanyang Technological University, Singapore, stands at the top with 6862 citations, followed by the Chinese Academy of Sciences (4752) and Southeast University (3267). The institutes with the highest average citations per paper (CPP) are Nanyang Technological University, Singapore (46.68), the National University of Singapore (27.46), and Southeast University (15.86); there are 11 institutions in this list of the top 25 with a CPP of at least 10. Two institutes have an h-index of 31, the Chinese Academy of Sciences and Southeast University, both from China. They are followed by Nanyang Technological University (30), the Harbin Institute of Technology, China (23), and Northeastern University, China (23). The dominance of China is also evident here, as most of the institutions in Table 5 are from China. Among the institutions, as shown by the institution co-citation network in Figure 6, the dominance of the Chinese Academy of Sciences can be concluded, as it overlaps with almost all the other institutions; this indicates that publications from the Chinese Academy of Sciences are almost always cited along with those of other institutions.
Figure 6. Institution Co-Citation Analysis.
Table 5. Most Productive and Influential Institutions in NC (1992–2018).
Rank | Institute (Country) | TP | TC | CPP | h-Index
1 | Chinese Academy of Sciences, China | 526 | 4752 | 9.03 | 31
2 | Harbin Institute of Technology, China | 219 | 2123 | 9.69 | 23
3 | Southeast University, China | 206 | 3267 | 15.86 | 31
4 | Nanjing University of Science Technology, China | 180 | 1943 | 10.79 | 22
5 | Huazhong University of Science Technology, China | 169 | 1662 | 9.83 | 21
6 | Tsinghua University, China | 166 | 1243 | 7.49 | 19
7 | Shanghai Jiao Tong University, China | 164 | 1508 | 9.20 | 21
8 | Xidian University, China | 158 | 1178 | 7.46 | 19
10 | Northeastern University, China | 157 | 1492 | 9.50 | 23
11 | University of Electronic Science Technology of China, China | 149 | 1395 | 9.36 | 21
12 | Nanyang Technological University, Singapore | 147 | 6862 | 46.68 | 30
13 | Xi An Jiaotong University, China | 136 | 1588 | 11.68 | 17
14 | Beihang University, China | 132 | 884 | 6.70 | 16
15 | Centre National de la Recherche Scientifique CNRS, France | 128 | 1704 | 13.31 | 22
16 | Dalian University of Technology, China | 127 | 1104 | 8.69 | 18
17 | University of California System, USA | 126 | 1276 | 10.13 | 16
18 | King Abdulaziz University, Saudi Arabia | 120 | 1302 | 10.85 | 21
19 | Hong Kong Polytechnic University, Hong Kong | 114 | 1350 | 11.84 | 20
20 | University of Chinese Academy of Sciences CAS, China | 106 | 729 | 6.88 | 16
21 | South China University of Technology, China | 99 | 624 | 6.30 | 13
22 | Chongqing University, China | 98 | 1194 | 12.18 | 19
23 | National University of Singapore, Singapore | 98 | 2691 | 27.46 | 20
24 | Tianjin University, China | 97 | 613 | 6.32 | 13
25 | University of London, England | 96 | 1005 | 10.47 | 20
4. Bibliographic Landscape
This section demonstrates the disciplinary links with the help of bibliographic coupling. The VOSviewer visualization tool is used to show the coupling between the authors, the countries, and the institutions.
We first show the bibliographic coupling between the authors in Figure 7. The authors are connected by links of different widths: a wider link indicates a larger intersection, or degree of overlap, between their reference lists, and thus stronger coupling. Moreover, each node representing an author is marked with one of three colors, red, blue, or green. Each color forms a cluster whose members cite each other's papers more frequently. In the prominent cluster, shown in red, Wang J, Yang J, and Yuan Y are the authors whose papers are most cited by the other authors in that cluster. The second cluster, in green, contains authors such as Zhang J, Wang D, Zhang HG, and Zhang Y. Hammer B, Chen I, Wang GR, and others form the third cluster, in blue. We have written each author's name as the last name followed by the initials of the first name, as many Chinese authors' names match closely.
Figure 7. Bibliographic coupling of the authors in NC.
Figure 8 shows the bibliographic coupling of the top countries/territories publishing in NC. The countries are denoted by circular nodes with different colors and are connected by links of different widths: a wider link indicates a larger overlap between the reference lists of the two countries, and thus stronger coupling. Two countries are bibliographically coupled if there is an intersection between the papers cited by them. Among all the countries, China appears as the prominent one in Figure 8; it is also the center node of a cluster containing countries such as Singapore, South Korea, India, Turkey, and Saudi Arabia. This suggests that these countries are most likely working on similar research areas and referencing similar work, and that most of the other countries in this cluster are citing papers from China. The other visible cluster, shown in red, contains countries such as the USA, England, Japan, Germany, and France. Spain, Brazil, Italy, Canada, and Taiwan form the third cluster, in green.
Figure 8. Bibliographic coupling of the countries publishing in NC.
In the last visualization of this section, we show the bibliographic coupling among several institutions in Figure 9. The Chinese Academy of Sciences is the prominent node (institution) in the red cluster; other institutes in this cluster are Xidian University, the National University of Singapore, Peking University, and so on. This strong bibliographic coupling for the Chinese Academy of Sciences shows that most of the publications of the other institutions in this cluster have cited its papers twice or more. The cluster with the blue color has universities such as Southeast University, the Harbin Institute of Technology, King Abdulaziz University, and so on. Nanyang Technological University, Northeastern University, the University of Granada, and so on form the group of institutes in the third cluster.
Figure 9. Bibliographic coupling of the institutions publishing in NC.
5. Document Co-Citation Network
This section uses CiteSpace for the visualization of NC publications. CiteSpace is an open source Java-based application used for visualizing trends from the metadata of scientific literature. Furthermore, the document co-citation network is analyzed, along with the visualization of the document co-citation clusters and the references with strong citation bursts. The document co-citation network helps to explore the citation base of NC publications and can be seen in Figure 10. The network consists of nodes and links: the nodes in Figure 10 represent the cited references, the highly cited references are labeled with their first author and year of publication, and the size of a node is proportional to its citation count.
Figure 10. Document co-citation analysis.
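For illustration, a document co-citation network of the kind shown in Figure 10 can be assembled from reference lists as sketched below; this uses the networkx library as a stand-in, with toy reference lists, and is not the CiteSpace internals or the actual NC data:

```python
from itertools import combinations
import networkx as nx

# Toy reference lists of three citing papers (labels only for illustration).
reference_lists = [
    {"Huang 2006", "Zhang 2003", "Suykens 2002"},
    {"Huang 2006", "Suykens 2002"},
    {"Huang 2006", "Kohonen 1998"},
]

graph = nx.Graph()
for refs in reference_lists:
    # Every pair of references cited together gets a co-citation edge;
    # the edge weight counts how often the pair is co-cited.
    for a, b in combinations(sorted(refs), 2):
        weight = graph.get_edge_data(a, b, {}).get("weight", 0) + 1
        graph.add_edge(a, b, weight=weight)

# Node size in a visualization would be proportional to how often each
# reference is cited overall.
citation_counts = {node: sum(node in refs for refs in reference_lists) for node in graph}
print(citation_counts)  # "Huang 2006" appears in all three toy lists
```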
The top 10 references most cited by NC publications, from various sources (journals, conferences, etc.), are tabulated in Table 6. The first is the paper by Huang et al. [49], which proposes a new learning approach called the extreme learning machine (ELM) for regression and multiclass classification; it has been cited 1283 times according to WoS. The second paper, by Huang et al. [42], discusses the theory and applications of ELM. Third ranked is the paper by Chang et al. [50], which proposes a library for support vector machines (SVM) called LIBSVM. The fourth paper is a survey on ELM by Huang et al. [51]. It is very interesting to note that, of the ten tabulated articles, six are by Huang, of which three are co-authored by Chen.
Table 6. Top 10 Highly Cited References in NC.
Rank | Title | Year | Source | Authors
1 | Extreme Learning Machine for Regression and Multiclass Classification | 2012 | IEEE Transactions on Systems, Man, and Cybernetics | Huang GB, Zhou H, Ding X, Zhang R
2 | Extreme learning machine: Theory and applications | 2006 | Neurocomputing | Huang GB, Zhu QY, Siew CK
3 | LIBSVM: A library for support vector machines | 2011 | ACM Transactions on Intelligent Systems and Technology | Chang CC, Lin CJ
4 | Extreme learning machines: a survey | 2011 | International Journal of Machine Learning and Cybernetics | Huang GB, Wang DH, Lan Y
5 | Letters: Convex incremental extreme learning machine | 2007 | Neurocomputing | Huang GB, Chen L
6 | Enhanced random search based incremental extreme learning machine | 2008 | Neurocomputing | Huang GB, Chen L
7 | Universal Approximation Using Incremental Constructive Feedforward Networks With Random Hidden Nodes | 2006 | IEEE Transactions on Neural Networks | Huang GB, Chen L, Siew CK
8 | Graph Embedding and Extensions: A General Framework for Dimensionality Reduction | 2007 | IEEE Transactions on Pattern Analysis and Machine Intelligence | Yan SC, Xu D, Zhang B, Zhang HJ, Yang Q, Lin S
9 | Object detection with discriminatively trained part-based models | 2010 | IEEE Transactions on Pattern Analysis and Machine Intelligence | Felzenszwalb PF, Girshick RB, McAllester D, Ramanan D
10 | OP-ELM: Optimally Pruned Extreme Learning Machine | 2010 | IEEE Transactions on Neural Networks | Miche Y, Sorjamaa A, Bas P, Simula O, Jutten C, Lendasse A
5.1. Cluster Detection and Analysis
The research patterns and emerging trends of NC publications are explored using CiteSpace. Figure 11 displays the document co-citation clusters of NC publications. The title term of each cluster is extracted from the titles of the articles cited by NC publications using three methods, namely the log-likelihood ratio test (LLR), the mutual information test (MI), and term frequency–inverse document frequency (TF-IDF). For the NC publications there are a total of 187 clusters, the largest of which are listed in Table 7 along with the top terms from each of the above-mentioned methods. Considering the LLR labels, all the members of cluster 0 have spiking neurons as a common research theme, and cluster 1 has neural networks in common. This can further be seen graphically in Figure 11, where all the clusters are shown. A total of 91 citing documents form cluster 1, labelled neural networks by LLR; it is a fairly young cluster. Cluster 2 is formed of 84 citing documents (labelled theoretical results by TF-IDF), and cluster 3 is formed of 76 citing documents and is labelled face recognition by LLR. Cluster naming in Figure 11 has been done using LLR, which visualizes these clusters over the document co-citation network. From Figure 11 and Table 7 it is evident that the three largest clusters are spiking neurons, neural networks, and face recognition. Neural networks is the oldest cluster and class is the youngest. The minimum silhouette value is 0.889, which indicates good clustering. The largest cluster, cluster 0, is of size 97 with a silhouette of 0.919 and is labeled spiking neurons by LLR. Figure 12 shows the timeline view of the document co-citations of the top 24 largest clusters. All of these clusters are plotted horizontally; the figure depicts the sequence of features of every cluster, including citation patterns and relationships with the other clusters. The temporal spread of each cluster can be seen in Figure 12: spiking neurons was a top research interest from 1992 to 2005, whereas neural networks was from 2002 to 2014. Moreover, a strong connection can be seen between the neural networks cluster and the convolutional neural networks cluster, which is a very natural relationship. The temporal span of convolutional neural networks, as shown in Figure 12, starts in 2008 and continues until the year 2017.
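As a simplified illustration of the TF-IDF labelling variant (not the exact CiteSpace procedure; the cluster texts below are toy concatenations of cited titles):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
import numpy as np

# Each "document" stands in for the concatenated titles cited by one cluster.
clusters = [
    "spiking neurons temporal coding spiking networks",
    "face recognition feature extraction face images",
    "extreme learning machine classification regression",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(clusters)
terms = vectorizer.get_feature_names_out()

# Label each cluster with its two highest-weighted TF-IDF terms.
for idx, row in enumerate(matrix.toarray()):
    top_terms = [terms[i] for i in row.argsort()[::-1][:2]]
    print(f"cluster {idx}: label = {' '.join(top_terms)}")
```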
Figure 11. Document co-citation clusters.
Figure 12. Timeline view of the document co-citations.
Table 7. Top 12 clusters in NC publications.

Cluster ID | Size | Silhouette | Label (TFIDF) | Label (LLR) | Label (MI) | Mean (Year)
0 | 89 | 0.919 | Spiking neurons | Spiking neurons | Nonlinear interactions | 1998
1 | 91 | 0.938 | Neural networks | Neural networks | Nonlinear interactions | 2007
2 | 84 | 0.889 | Neural networks | Theoretical results | Nonlinear interactions | 1990
3 | 76 | 0.916 | Face recognition | Face recognition | Nonlinear interactions | 2007
4 | 74 | 0.944 | Independent component analysis | Independent component analysis | Nonlinear interactions | 1997
5 | 59 | 0.948 | Neural networks | Conjugate gradient | Nonlinear interactions | 1991
6 | 49 | 0.954 | Neural networks | Periodic solution | Nonlinear interactions | 2003
7 | 47 | 0.934 | Neural networks | Kernel method | Nonlinear interactions | 1999
8 | 42 | 0.929 | Spatio-temporal receptive fields | Visual cortex | Nonlinear interactions | 1995
9 | 40 | 0.891 | Computational model | Weakly electric fish | Nonlinear interactions | 1996
10 | 37 | 0.982 | Extreme learning machine | Extreme learning machine | Nonlinear interactions | 2008
11 | 36 | 0.977 | Class | Fuzzy systems | Nonlinear interactions | 2013
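The silhouette values reported in Table 7 measure how well the members of a cluster fit their own cluster relative to the nearest neighbouring cluster, with values close to 1 indicating well-separated clusters. The minimal sketch below computes the coefficient from a distance matrix; the matrix and the cluster labels are invented for illustration and are not the NC co-citation data.

```python
# Minimal sketch of the silhouette coefficient reported in Table 7, computed on
# an invented co-citation distance matrix (values are illustrative only).
import numpy as np

def silhouette(dist, labels):
    """dist: (n, n) symmetric distance matrix; labels: cluster id per item."""
    labels = np.asarray(labels)
    n = len(labels)
    s = np.zeros(n)
    for i in range(n):
        same = (labels == labels[i])
        same[i] = False
        if not same.any():          # singleton cluster: silhouette taken as 0
            continue
        a = dist[i, same].mean()    # mean distance to the item's own cluster
        b = min(dist[i, labels == c].mean()
                for c in set(labels.tolist()) if c != labels[i])  # nearest other
        s[i] = (b - a) / max(a, b)
    return s

# Toy example: 6 items, two clusters; distances invented for illustration.
D = np.array([[0.0, 0.1, 0.2, 0.9, 0.8, 0.9],
              [0.1, 0.0, 0.1, 0.8, 0.9, 0.8],
              [0.2, 0.1, 0.0, 0.9, 0.8, 0.9],
              [0.9, 0.8, 0.9, 0.0, 0.2, 0.1],
              [0.8, 0.9, 0.8, 0.2, 0.0, 0.1],
              [0.9, 0.8, 0.9, 0.1, 0.1, 0.0]])
labels = [0, 0, 0, 1, 1, 1]
per_item = silhouette(D, labels)
print("mean silhouette per cluster:",
      {c: float(per_item[np.array(labels) == c].mean()) for c in set(labels)})
```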
5.2. References with Strong Citation Bursts

Research articles that receive a surge of citations within a period of time give an insight into the emerging developments of the research domain. Citation bursts identify the references to which the research fraternity pays the most attention. The top 30 references with the highest citation bursts are shown in Table 8.

Table 8. Top 30 References with the Strongest Citation Bursts.

References | Year | Strength | Begin | End
Hertz J, 1991, Intro Theory Neural, V, P | 1991 | 15.3824 | 1994 | 1999
Hornik K, 1989, Neural Networks, V2, P359 | 1989 | 12.3758 | 1994 | 1997
Haykin S, 1994, Neural Networks Comp, V, P | 1994 | 10.6235 | 1996 | 2002
Cybenko G, 1989, Mathematics Of Control And Systems, P | 1989 | 8.6326 | 1996 |
Rumelhart De, 1986, Parallel Distributed, V1, P318 | 1986 | 7.5576 | 1993 |
Poggio T, 1990, P IEEE, V78, P1481 | 1990 | 7.4954 | 1995 |
Chen S, 1991, IEEE T Neural Networ, V2, P302 | 1991 | 7.2449 | 1996 |
Moody J, 1989, Neural Computation, V1, P281 | 1989 | 7.2193 | 1995 |
Pao Y-H, 1989, Adaptive Pattern Rec, V, P | 1989 | 7.1573 | 1994 |
Kohonen T, 1990, P IEEE, V78, P1464 | 1990 | 6.3202 | 1996 |
Fahlman S E, 1990, Adv Neural Informati, V2, P524 | 1990 | 6.2445 | 1995 |
Narendra K S, 1990, Ieee Trans Neural Netw, V1, P4 | 1990 | | |
Funahashi K, 1989, Neural Networks, V2, P183 | 1989 | 5.8544 | 1994 |
Rumelhart De, 1986, Parallel Distributed, V1, P | 1986 | 5.4868 | 1992 |
Kohonen Teuvo, 1989, Self Org Ass Memory, V, P | 1989 | 5.3089 | 1996 |
Waibel A, 1989, Ieee T Acoust Speech, V37, P328 | 1989 | 5.2033 | 1994 |
Jacobs Ra, 1988, Neural Networks, V1, P295 | 1988 | 4.6453 | 1994 |
Xu L, 1994, Neural Networks, V7, P609 | 1994 | 4.3694 | 1995 |
Bishop C, 1991, Neural Computation, V3, P579 | 1991 | 4.0149 | 1995 |
Baum E B, 1989, Neural Computation, V1, P151 | 1989 | 3.9806 | 1996 |
Ritter H, 1992, Neural Computation S, V, P | 1992 | 3.7901 | 1996 |
Battiti R, 1992, Neural Comput, V4, P141 | 1992 | 3.5404 | 1994 |
Barnard E, 1992, Ieee T Neural Networ, V3, P232 | 1992 | | |
Ku Cc, 1995, Ieee T Neural Networ, V6, P144 | 1995 | 3.521 | 1997 |
Rumelhart De, 1986, Nature, V323, P533 | 1986 | 3.4325 | 1993 |
Drago Gp, 1992, Ieee T Neural Networ, V3, P627 | 1992 | 3.4191 | 1996 |
Girosi F, 1995, Neural Comput, V7, P219 | 1995 | 3.4191 | 1996 |
Kohonen T, 1988, Self Org Ass Memory, V, P | 1988 | 3.3453 | 1995 |
Broomhead D S, 1988, Complex Systems, V2, P321 | 1988 | 3.3453 | 1995 |
Hornik K, 1990, Neural Networks, V3, P551 | 1990 | | |
▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ 18 of 21 ▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂ ▂▂▂▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂ ▂▂▂▂▂▂▂▂
The top rank is taken by Hertz [52], with the highest burst strength of 15.3824. The reference is a book titled "Introduction to the Theory of Neural Computation", published in the year 1991. The second article [53] proposed the concept of multilayer feed-forward networks as universal approximators. The reference with the longest citation burst saw its burst begin in 1996, two years after its publication in 1994. The book is titled "Neural Networks: A Comprehensive Foundation" and introduces the concept of neural networks from an engineering perspective.
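The burst strengths reported in Table 8 are produced by CiteSpace [41], whose citation-burst metric is based on Kleinberg's burst-detection algorithm. As an illustration of the underlying idea only, the following is a minimal two-state sketch in Python; it is not CiteSpace's exact implementation, and the yearly counts in the usage example, as well as the parameter settings s and gamma, are hypothetical.

```python
import math

def burst_intervals(r, d, s=2.0, gamma=1.0):
    """Simplified two-state Kleinberg burst detection over yearly batches.

    r[t]: citations to the target reference in year t (illustrative input)
    d[t]: total citing volume in year t
    Returns (start_index, end_index, weight) for each maximal burst interval,
    where the weight plays the role of the burst "strength".
    """
    n = len(r)
    p0 = max(sum(r) / sum(d), 1e-12)      # baseline citation rate
    p1 = min(0.9999, s * p0)              # elevated ("bursting") rate

    def log_choose(a, b):
        return math.lgamma(a + 1) - math.lgamma(b + 1) - math.lgamma(a - b + 1)

    def sigma(p, rt, dt):                 # negative log-likelihood of rt hits in dt trials
        return -(log_choose(dt, rt) + rt * math.log(p) + (dt - rt) * math.log(1 - p))

    trans = gamma * math.log(n)           # cost of entering the burst state
    INF = float("inf")
    cost = [[INF, INF] for _ in range(n + 1)]
    back = [[0, 0] for _ in range(n + 1)]
    cost[0][0] = 0.0                      # start in the baseline state
    for t in range(1, n + 1):
        emit = [sigma(p0, r[t - 1], d[t - 1]), sigma(p1, r[t - 1], d[t - 1])]
        for j in (0, 1):
            for i in (0, 1):
                c = cost[t - 1][i] + (trans if (i == 0 and j == 1) else 0.0) + emit[j]
                if c < cost[t][j]:
                    cost[t][j], back[t][j] = c, i
    states = [0] * n                      # backtrack the cheapest state sequence
    j = 0 if cost[n][0] <= cost[n][1] else 1
    for t in range(n, 0, -1):
        states[t - 1] = j
        j = back[t][j]
    bursts, t = [], 0                     # collect maximal runs of the burst state
    while t < n:
        if states[t] == 1:
            start, weight = t, 0.0
            while t < n and states[t] == 1:
                weight += sigma(p0, r[t], d[t]) - sigma(p1, r[t], d[t])
                t += 1
            bursts.append((start, t - 1, weight))
        else:
            t += 1
    return bursts

if __name__ == "__main__":
    years = list(range(1994, 2003))
    cites = [1, 2, 9, 12, 11, 3, 2, 1, 1]   # hypothetical yearly citations to one reference
    volume = [300] * len(cites)             # hypothetical yearly citing volume
    for a, b, w in burst_intervals(cites, volume):
        print(f"burst {years[a]}-{years[b]}, strength {w:.2f}")
```

In this toy run, the three years with elevated citation counts form one burst interval whose accumulated likelihood gain over the baseline state is reported as its strength, which is the same notion as the strength column of Table 8.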
6. Conclusions

This paper presents an overall insight into the hidden structure of the journal of Neurocomputing (NC) through an extensive bibliometric study of its publications and citations history from 1992–2018. Over these years of its publications, NC has emerged as one of the most prominent journals. Table 9 shows the comparison of NC with several other related journals with respect to the TP, TC, CPP, and IF; the analysis is performed over the last 10 years of publications. NC stands at the top in terms of the total number of publications, with a TP of 7954, and ranks at the 2nd spot in terms of total citations. With respect to the IF, NC (3.317) ranks above Soft Computing (2.472) and Engineering Applications of Artificial Intelligence (2.894).

In this paper, we have explored the publication history using bibliometric methods and techniques. The study probed the publication structure and the citation history of the journal. A total of 10,308 papers published from 1992–2018 were analyzed, which received a total of 106,536 citations (CPP = 10.33). The highest number of publications came in 2016, while the maximum citations were received in 2015. There are 16 papers which received more than 200 citations, and the most cited paper, published in 2006, has received 2762 citations to date.
The most productive and influential author of NC is Cao. China and the USA are the top two nations in terms of NC publications, and the Chinese Academy of Sciences, China, published the highest number of papers. Interestingly, from the overall compilation of the bibliometric results, it is evident that China has played a prominent role in the publications and citations structure of NC.

Table 9. Comparison of NC with other related Journals.

Journal | Induction Year | TP (10 Years) | TC (10 Years) | CPP | Impact Factor
Information Sciences (INS) | 1968 | 5125 | 102,308 | 20.0 | 4.832
Soft Computing (SC) | 1997 | 1620 | 14,096 | 8.7 | 2.472
Knowledge-Based Systems (KBS) | 1987 | 2092 | 31,516 | 15.1 | 4.529
Engineering Applications of Artificial Intelligence (EAAI) | 1988 | 1617 | 22,107 | 13.7 | 2.894
IEEE Transactions on Fuzzy Systems (IEEE TFS) | 1993 | 1186 | 42,013 | 35.4 | 7.651
Applied Soft Computing (ASOC) | 2001 | 3581 | 56,584 | 15.8 | 3.541
Neurocomputing (NC) | 1989 | 7954 | 63,491 | 7.98 | 3.317
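The derived figures quoted above follow directly from the raw counts in Table 9: CPP is simply TC divided by TP over the 10-year window, and the rankings by publications, citations, and IF can be read off the same numbers. The following minimal sketch in Python hard-codes the table values purely for illustration:

```python
# Values copied from Table 9 (10-year window); CPP = TC / TP.
journals = {
    "Information Sciences (INS)":      {"TP": 5125, "TC": 102308, "IF": 4.832},
    "Soft Computing (SC)":             {"TP": 1620, "TC": 14096,  "IF": 2.472},
    "Knowledge-Based Systems (KBS)":   {"TP": 2092, "TC": 31516,  "IF": 4.529},
    "Eng. Appl. of AI (EAAI)":         {"TP": 1617, "TC": 22107,  "IF": 2.894},
    "IEEE Trans. Fuzzy Syst. (TFS)":   {"TP": 1186, "TC": 42013,  "IF": 7.651},
    "Applied Soft Computing (ASOC)":   {"TP": 3581, "TC": 56584,  "IF": 3.541},
    "Neurocomputing (NC)":             {"TP": 7954, "TC": 63491,  "IF": 3.317},
}

for name, j in journals.items():
    j["CPP"] = j["TC"] / j["TP"]          # e.g. NC: 63491 / 7954 ≈ 7.98

# NC has the most publications and the 2nd-highest citation total, as stated above.
print(max(journals, key=lambda k: journals[k]["TP"]))
print(sorted(journals, key=lambda k: journals[k]["TC"], reverse=True)[:2])
print(sorted(journals, key=lambda k: journals[k]["IF"], reverse=True)[:3])

# The overall CPP of NC quoted in the conclusions: 106,536 citations over 10,308 papers.
print(106536 / 10308)                     # about 10.33
```

Running this reproduces the CPP column (up to rounding) and the publication, citation, and IF rankings referred to in the paragraphs above.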
However, it should be noted that this study has certain limitations because of its focus on the data retrieved from the WoS database. These limitations include the coverage of research publications, since WoS omits certain research articles, and a smaller database of indexed journals, which leads to under-reported statistics. Nevertheless, the data quality in WoS is excellent, and it is therefore the most acknowledged source for scientometric studies; the major aim of this paper is to reveal the current status of the NC publications by utilizing the records indexed in WoS [54,55]. Another limitation of this study arises from the fact that it operates at the metric level, that is, it deals with the numbers of citations and papers; such numbers represent quantity, but citations do not necessarily represent quality. A further limitation is that this study creates a profile of the NC journal rather than the research area itself, although it can be claimed that NC has been a pioneer in leading the research in this field. Finally, because not all journals are indexed in WoS, a portion of citations is inevitably missed; a comparison with other indexing platforms, such as Scopus and Google Scholar, would be a good direction for future work.

In conclusion, the journal of NC has played an important role in shaping academic research since its inception. NC is certainly discovering and developing trends in the domain of soft computing.

Author Contributions: Conceptualization: P.K.M.; Methodology: A.K.S.; Data Curation: M.J.; Writing—Original Draft Preparation: M.J.; Writing—Review & Editing: M.J., A.K.S., P.K.M. and A.A.; Supervision: P.K.M. and A.A.

Funding: This research received no external funding.

Acknowledgments: The authors gratefully acknowledge the valuable comments received from the editors and the reviewers, which have helped them in improving the paper significantly.

Conflicts of Interest: The authors declare no conflict of interest.
References

1. Garcia-Silvente, Y.A.M. An overview of iris recognition: A bibliometric analysis. Scientometrics 2014, 101, 2003–2033.
2. Merigó, J.M.; Rocafort, A.; Aznar-Alarcón, J.P. Bibliometric Overview of Business & Economics Research. J. Bus. Econ. Manag. 2016, 17, 397–413.
3. Du, Y. A Bibliometrics Portrait of Chinese Research through the Lens of China Economic Review: A research proposal. Econ. Manag. Res. Proj. 2012, 1, 79–91.
4. Hoepner, A.G.F.; Kant, B.; Scholtens, B.; Yu, P.S. Environmental and ecological economics in the 21st century: An age adjusted citation analysis of the influential articles, journals, authors and institutions. Ecol. Econ. 2012, 77, 193–206. [CrossRef]
5. Kolle, S.R.; Vijayashree, M.S.; Shankarappa, T.H. Highly cited articles in malaria research: A bibliometric analysis. Collect. Build. 2017, 36, 45–57. [CrossRef]
6. Jiang, H.; Qiang, M.; Lin, P. A topic modeling based bibliometric exploration of hydropower research. Renew. Sustain. Energy Rev. 2016, 57, 226–237. [CrossRef]
7. Yataganbaba, A.; Kurtbaş, I. A scientific approach with bibliometric analysis related to brick and tile drying: A review. Renew. Sustain. Energy Rev. 2016, 59, 206–224. [CrossRef]
8. Lee, C.I.S.G.; Felps, W.; Baruch, Y. Toward a taxonomy of career studies through bibliometric visualization. J. Vocat. Behav. 2014, 85, 339–351. [CrossRef]
9. El Khaled, D.; Novas, N.; Gazquez, J.A.; Manzano-Agugliaro, F. Dielectric and bioimpedance research studies: A scientometric approach using the Scopus database. Publications 2018, 6, 6. [CrossRef]
10. Aaen-Stockdale, C. Selfish Memes: An Update of Richard Dawkins' Bibliometric Analysis of Key Papers in Sociobiology. Publications 2017, 5, 12. [CrossRef]
11. Vanleeuwen, T. Letter to the editor: Publication trends in social psychology journals: A long-term bibliometric analysis. Eur. J. Soc. Psychol. 2013, 11, 9–11. [CrossRef]
12. Arnott, D.; Pervan, G. A critical analysis of decision support systems research. Formul. Res. Methods Inf. Syst. 2015, 20, 127–168.
13. Arnott, D.; Pervan, G. A Critical Analysis of Decision Support Systems Research Revisited: The Rise of Design Science. In Enacting Research Methods in Information Systems; Palgrave Macmillan: Cham, Switzerland, 2016; Volume 29, pp. 43–103.
14. Kim, M.C.; Jeong, Y.K.; Song, M. Investigating the integrated landscape of the intellectual topology of bioinformatics. Scientometrics 2014, 101, 309–335. [CrossRef]
15. Song, M.; Kim, S.; Zhang, G.; Ding, Y.; Chambers, T. Productivity and influence in bioinformatics: A bibliographic analysis using PubMed Central. J. Am. Soc. Inf. Sci. Technol. 2014, 65, 352–371. [CrossRef]
16. Loock, M.; Hinnen, G. Heuristics in organizations: A review and a research agenda. J. Bus. Res. 2015, 68, 2027–2036. [CrossRef]
17. Huang, Y.; Schuehle, J.; Porter, A.L.; Youtie, J. A systematic method to create search strategies for emerging technologies based on the Web of Science: Illustrated for 'Big Data'. Scientometrics 2015, 105, 2005–2022. [CrossRef]
18. Prathap, G. Big data and false discovery: Analyses of bibliometric indicators from large data sets. Scientometrics 2014, 98, 1421–1422. [CrossRef]
19. Garousi, V. A bibliometric analysis of the Turkish software engineering research community. Scientometrics 2015, 105, 23–49. [CrossRef]
20. Blanco-Mesa, F.; Merigó, J.M.; Gil-Lafuente, A.M. Fuzzy decision making: A bibliometric-based review. J. Intell. Fuzzy Syst. 2017, 32, 2033–2050. [CrossRef]
21. Yu, D.; Xu, Z.; Wang, W. Bibliometric analysis of fuzzy theory research in China: A 30-year perspective. Knowl.-Based Syst. 2018, 141, 188–199. [CrossRef]
22. Shukla, A.K.; Sharma, R.; Muhuri, P.K. A Review of the Scopes and Challenges of the Modern Real-Time Operating Systems. Int. J. Embed. Real-Time Commun. Syst. 2018, 9, 66–82. [CrossRef]
23. Yu, D.; Li, D.F.; Merigó, J.M.; Fang, L. Mapping development of linguistic decision making studies. J. Intell. Fuzzy Syst. 2016, 30, 2727–2736. [CrossRef]
24. Emrouznejad, M.M.A. Ordered weighted averaging operators 1988–2014: A citation based literature survey. Int. J. Intell. Syst. 2014, 29, 994–1014. [CrossRef]
25. Yu, D.; Shi, S. Researching the development of Atanassov intuitionistic fuzzy set: Using a citation network analysis. Appl. Soft Comput. 2015, 32, 189–198. [CrossRef]
26. Cobo, M.J.; Martínez, M.A.; Gutiérrez-Salcedo, M.; Fujita, H.; Herrera-Viedma, E. 25 years at Knowledge-Based Systems: A bibliometric analysis. Knowl.-Based Syst. 2015, 80, 3–13. [CrossRef]
27. Merigó, J.M.; Blanco-Mesa, F.; Gil-Lafuente, A.M.; Yager, R.R. Thirty years of the International Journal of Intelligent Systems: A bibliometric review. Int. J. Intell. Syst. 2017, 32, 526–554. [CrossRef]
28. Xu, Z.; Yu, D.; Kao, Y.; Lin, C.T. The structure and citation landscape of IEEE Transactions on Fuzzy Systems (1994–2015). IEEE Trans. Fuzzy Syst. 2018, 26, 430–442.
29. Laengle, S.; Merigó, J.M.; Miranda, J.; Słowiński, R.; Bomze, I.; Borgonovo, E.; Dyson, R.G.; Oliveira, J.F.; Teunter, R. Forty years of the European Journal of Operational Research: A bibliometric overview. Eur. J. Oper. Res. 2017, 262, 803–816. [CrossRef]
30. Cancino, C.; Merigó, J.M.; Coronado, F.; Dessouky, Y.; Dessouky, M. Forty years of Computers & Industrial Engineering: A bibliometric analysis. Comput. Ind. Eng. 2017, 113, 614–629.
31. Yu, D.; Xu, Z.; Pedrycz, W.; Wang, W. Information Sciences 1968–2016: A retrospective analysis with text mining and bibliometric. Inf. Sci. 2017, 418–419, 619–634. [CrossRef]
32. Merigó, J.M.; Pedrycz, W.; Weber, R.; De la Sotta, C. Fifty years of Information Sciences: A bibliometric overview. Inf. Sci. 2018, 432, 245–268. [CrossRef]
33. Wakeling, S.; Willett, P.; Creaser, C.; Fry, J.; Pinfield, S.; Spezi, V. Transitioning from a Conventional to a 'Mega' Journal: A Bibliometric Case Study of the Journal Medicine. Publications 2017, 5, 7. [CrossRef]
34. Muhuri, P.K.; Shukla, A.K.; Janmaijaya, M.; Basu, A. Applied Soft Computing: A Bibliometric Analysis of the Publications and Citations during (2004–2016). Appl. Soft Comput. 2018, 69, 381–392. [CrossRef]
35. Amiguet, L.; Gil-Lafuente, A.M.; Kydland, F.E.; Merigó, J.M. One hundred and twenty-five years of the Journal of Political Economy: A bibliometric overview. J. Political Econ. 2017, 125, 6.
36. Martínez-López, F.J.; Merigó, J.M.; Valenzuela, L.; Nicolás, C. Fifty years of the European Journal of Marketing: A bibliometric analysis. Eur. J. Mark. 2018, 52, 439–468. [CrossRef]
37. Wang, W.; Laengle, S.; Merigó, J.M.; Yu, D.J.; Herrera-Viedma, E.; Cobo, M.J.; Bouchon-Meunier, B. A bibliometric analysis of the first twenty-five years of the International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2018, 26, 169–193. [CrossRef]
38. Tur-Porcar, A.; Mas-Tur, A.; Merigó, J.M.; Roig-Tierno, N.; Watt, J. A bibliometric history of the Journal of Psychology between 1936 and 2015. J. Psychol. 2018, 152, 199–225. [CrossRef] [PubMed]
39. Neurocomputing. Available online: https://www.journals.elsevier.com/neurocomputing (accessed on 21 February 2018).
40. Hirsch, J.E. An index to quantify an individual's scientific research output. Proc. Natl. Acad. Sci. USA 2005, 102, 16569–16572. [CrossRef] [PubMed]
41. Chen, C. CiteSpace II: Detecting and visualizing emerging trends and transient patterns in scientific literature. J. Assoc. Inf. Sci. Technol. 2006, 57, 359–377. [CrossRef]
42. Huang, G.B.; Zhu, Q.Y.; Siew, C.K. Extreme learning machine: Theory and applications. Neurocomputing 2006, 70, 489–501. [CrossRef]
43. Zhang, G.P. Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing 2003, 50, 159–175. [CrossRef]
44. Suykens, J.A.K.; Brabanter, J.D.; Lukas, L.; Vandewalle, J. Weighted least squares support vector machines: Robustness and sparse approximation. Neurocomputing 2002, 48, 85–105. [CrossRef]
45. Kim, K. Financial time series forecasting using support vector machines. Neurocomputing 2003, 55, 307–319. [CrossRef]
46. Kohonen, T. The self-organizing map. Neurocomputing 1998, 21, 1–6. [CrossRef]
47. Mubashir, M.; Shao, L.; Seed, L. A survey on fall detection: Principles and approaches. Neurocomputing 2013, 100, 144–152. [CrossRef]
48. Zong, W.; Huang, G.B.; Chen, Y. Weighted extreme learning machine for imbalance learning. Neurocomputing 2013, 101, 229–242. [CrossRef]
49. Huang, G.B.; Zhou, H.; Ding, X.; Zhang, R. Extreme learning machine for regression and multiclass classification. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2012, 42, 513–529. [CrossRef] [PubMed]
50. Chang, C.C.; Lin, C.J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 3, 27. [CrossRef]
51. Huang, G.B.; Wang, D.H.; Yuan, L. Extreme learning machines: A survey. Int. J. Mach. Learn. Cybern. 2011, 2, 107–122. [CrossRef]
52. Hertz, J.; Krogh, A.; Palmer, R.G. Introduction to the Theory of Neural Computation; Addison-Wesley: Boston, MA, USA, 1991.
53. Haykin, S. Neural Networks: A Comprehensive Foundation; Macmillan Publishers: New York, NY, USA, 2004; p. 41.
54. Fiala, D.; Tutoky, G. Computer Science Papers in Web of Science: A Bibliometric Analysis. Publications 2017, 5, 23. [CrossRef]
55. Arik, B.T.; Arik, E. "Second Language Writing" Publications in Web of Science: A Bibliometric Analysis. Publications 2017, 5, 4. [CrossRef]

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).