000 12324nam a2200637 i 4500
001 9292525
003 IEEE
005 20240917144827.0
006 m o d
007 cr |n|||||||||
008 210105s2020 nju ob 001 eng d
010 _z 2020015458 (print)
020 _a1119705916
020 _a1119705878
020 _z9781119705918
_qePub
020 _a9781119705871
_qelectronic bk.
020 _z9781119705925
_qelectronic bk.
020 _z1119705924
_qelectronic bk.
020 _z9781119705864
_qcloth
020 _z111970586X
024 7 _a10.1002/9781119705925
_2doi
035 _a(CaBNVSL)mat09292525
035 _a(IDAMS)0b0000648d5918e0
040 _aCaBNVSL
_beng
_erda
_cCaBNVSL
_dCaBNVSL
082 0 0 _a153.1/20113
100 1 _aGhosh, Lidia,
_eauthor.
245 1 0 _aCognitive modeling of human memory and learning :
_ba non-invasive brain-computer interfacing approach /
_cLidia Ghosh, Artificial Intelligence Lab., Dept. of Electronics and Tele-Communication Engineering, Amit Konar, Artificial Intelligence Lab., Dept. of Electronics and Tele-Communication Engineering, Pratyusha Rakshit, Artificial Intelligence Lab., Dept. of Electronics and Tele-Communication Engineering.
264 1 _aHoboken, New Jersey :
_bWiley,
_c[2020]
264 2 _a[Piscataway, New Jersey] :
_bIEEE Xplore,
_c[2020]
300 _a1 PDF.
336 _atext
_2rdacontent
337 _aelectronic
_2isbdmedia
338 _aonline resource
_2rdacarrier
504 _aIncludes bibliographical references and index.
505 0 _aChapter 1: Introduction to Human Memory and Learning Models -- 1.1 Introduction 2 -- 1.2 Philosophical Contributions to Memory Research 4 -- 1.2.1 Atkinson and Shiffrin's Model 4 -- 1.2.2 Tveter's Model 6 -- 1.2.3 Tulving's model 6 -- 1.2.4 The Parallel and Distributed Processing (PDP) Approach 8 -- 1.2.5 Procedural and Declarative Memory 9 -- 1.3 Brain-theoretic Interpretation of Memory Formation 11 -- 1.3.1 Coding for Memory 11 -- 1.3.2 Memory Consolidation 13 -- 1.3.3 Location of stored Memories 16 -- 1.3.4 Isolation of Information in Memory 16 -- 1.4 Cognitive Maps 17 -- 1.5 Neural Plasticity 18 -- 1.6 Modularity 19 -- 1.7 The cellular Process behind STM Formation 20 -- 1.8 LTM Formation 21 -- 1.9 Brain Signal Analysis in the Context of Memory and Learning 22 -- 1.9.1 Association of EEG alpha and theta band with memory performances 22 -- 1.9.2 Oscillatory beta and gamma frequency band activation in STM performance 26 -- 1.9.3 Change in EEG band power with changing working memory load 26 -- 1.9.4 Effects of Electromagnetic field on the EEG response of Working Memory 29 -- 1.9.5 EEG Analysis to discriminate focused attention and WM performance 30 -- 1.9.6 EEG power changes in memory repetition effect 31 -- 1.9.7 Correlation between LTM Retrieval and EEG features 34 -- 1.9.8 Impact of math anxiety on WM response: An EEG study 37 -- 1.10 Memory Modelling by Computational Intelligence Techniques 38 -- 1.11 Scope of the Book 43 -- References 47 -- Chapter 2: Working Memory Modeling Using Inverse Fuzzy Relational Approach -- 2.1 Introduction 56 -- 2.2 Problem Formulation and Approach 59 -- 2.2.1 Independent Component Analysis as a Source Localization Tool 61 -- 2.2.2 Independent Component Analysis vs Principal Component Analysis 62 -- 2.2.3 Feature Extraction 63 -- 2.2.4 Phase 1: WM Modeling 64 -- 2.2.4.1 Step I: WM modeling of subject using EEG signals during full face encoding and recall from specific part of same face 65 -- 2.2.4.2 Step II: WM modeling of subject using EEG signals during full face encoding and recall from all parts of same face 68.
505 8 _a2.2.5 Phase 2: WM Analysis 69 -- 2.2.6 Finding Max-Min Composition of Weight Matrix 70 -- 2.3 Experimental Results and Performance Analysis 75 -- 2.3.1 Experimental Set-up 75 -- 2.3.2 Source Localization using e-LORETA 78 -- 2.3.3 Pre-processing 79 -- 2.3.4 Selection of EEG Features 80 -- 2.3.5 WM Model Consistency across Partial Face Stimuli 81 -- 2.3.6 Inter-person Variability of Weight Matrix W 85 -- 2.3.7 Variation in Imaging Attributes 87 -- 2.3.8 Comparative Analysis with existing Fuzzy Inverse Relations 87 -- 2.4 Discussion 88 -- 2.5 Conclusion 89 -- References 90 -- Chapter 3: Short-Term Memory Modeling in Shape-Recognition Task by Type-2 Fuzzy Deep Brain Learning -- 3.1 Introduction 98 -- 3.2 System Overview 101 -- 3.3 Brain Functional Mapping using Type-2 Fuzzy DBLN 107 -- 3.3.1 Overview of Type-2 Fuzzy Sets 107 -- 3.3.2 Type-2 Fuzzy Mapping and Parameter Adaptation by Perceptron-like Learning 108 -- 3.3.2.1 Construction of the Proposed Interval Type-2 Fuzzy Membership Function 109 -- 3.3.2.2 Construction of IT2FS Induced Mapping Function 110 -- 3.3.2.3 Secondary Membership Function Computation of Proposed GT2FS 112 -- 3.3.2.4 Proposed General Type-2 Fuzzy Mapping 114 -- 3.3.3 Perceptron-like Learning for Weight Adaptation 115 -- 3.3.4 Training of the Proposed Shape-Reconstruction Algorithm 116 -- 3.3.5 The Test Phase of the Memory Model 118 -- 3.4 Experiments and Results 118 -- 3.4.1 Experimental Set-up 118 -- 3.4.2 Experiment 1: Validation of the STM Model with respect to Error Metric 121 -- 3.4.3 Experiment 2: Similar Encoding by a Subject for Similar Input Object-Shapes 122 -- 3.4.4 Experiment 3: Study of Subjects' Learning Ability with Increasing Complexity in Object Shape 123 -- 3.4.5 Experiment 4: Convergence Time of the Weight Matrix G for Increased Complexity of the Input Shape Stimuli 124 -- 3.4.6 Experiment 5: Abnormality in G matrix for the subjects with Brain Impairment 125 -- 3.5 Biological Implications 126 -- 3.6 Performance Analysis 128.
505 8 _a3.6.1 Performance Analysis of the Proposed T2FS Methods 128 -- 3.6.2 Computational Performance Analysis of the Proposed T2FS Methods 130 -- 3.6.3 Statistical Validation using Wilcoxon Signed-Rank Test 130 -- 3.6.4 Optimal Parameter Selection and Robustness Study 131 -- 3.7 Conclusions 133 -- References 135 -- Chapter 4: EEG Analysis for Subjective Assessment of Motor Learning Skill in Driving Using Type-2 Fuzzy Reasoning -- 4.1 Introduction 142 -- 4.2 System Overview 144 -- 4.2.1 Rule Design to determine the degree of learning 145 -- 4.2.2 Single Trial Detection of Brain Signals 148 -- 4.2.2.1 Feature Extraction 149 -- 4.2.2.2 Feature Selection 149 -- 4.2.2.3 Classification 150 -- 4.2.3 Type-2 Fuzzy Reasoning 151 -- 4.2.4 Training and Testing of the Classifiers 151 -- 4.3 Determining Type and Degree of Learning by Type-2 Fuzzy Reasoning 151 -- 4.3.1 Preliminaries on IT2FS and GT2FS 153 -- 4.3.2 Proposed Reasoning Method 1: CIT2FS based Reasoning 153 -- 4.3.3 Computation of Percentage Normalized Degree of Learning 155 -- 4.3.4 Optimal (Sn(B Selection in IT2FS Reasoning 156 -- 4.3.5 Proposed Reasoning Method 2: Triangular Vertical Slice Based CGT2FS Reasoning 156 -- 4.3.6 Proposed Reasoning Method 3: CGT2FS Reasoning with Gaussian Secondary Membership Function (MF) 158 -- 4.4 Experiments and Results 162 -- 4.4.1 The Experimental set-up 162 -- 4.4.2 Stimulus Presentation 163 -- 4.4.3 Experiment 1: Source Localization using eLORETA 163 -- 4.4.4 Experiment 2: Validation of the Rules 164 -- 4.4.5 Experiment 3: Pre-processing and Artifact Removal using ICA 165 -- 4.4.6 Experiment 4: N400 Old/New Effect Observation over the Successive Trials 167 -- 4.4.7 Experiment 5: Selection of the Discriminating EEG Features using PCA 168 -- 4.5 Performance Analysis and Statistical Validation 169 -- 4.5.1 Performance Analysis of the LSVM Classifiers 169 -- 4.5.2 Robustness Study 170 -- 4.5.3 Performance Analysis of the Proposed T2FS Reasoning Methods 170 -- 4.5.4 Computational Performance Analysis of the Proposed T2FS Reasoning Methods 171.
505 8 _a4.5.5 Statistical Validation using Wilcoxon Signed-Rank Test 172 -- 4.6 Conclusion 173 -- References 173 -- Chapter 5: EEG Analysis to Decode Human Memory Responses in Face Recognition Task Using Deep LSTM Network -- 5.1 Introduction 182 -- 5.2 CSP Modeling 186 -- 5.2.1 The Standard CSP Algorithm 186 -- 5.2.2 The Proposed CSP Algorithm 187 -- 5.3 Proposed LSTM Classifier with Attention Mechanism 189 -- 5.4 Experiment and Results 195 -- 5.4.1 The Experimental Set-up 195 -- 5.4.2 Experiment 1: Activated Brain Region Selection using eLORETA 196 -- 5.4.3 Experiment 2: Detection of the ERP signals associated with the familiar and unfamiliar face discrimination 198 -- 5.4.4 Experiment 3: Performance Analysis of the Proposed CSP algorithm as a Feature extraction Technique 199 -- 5.4.5 Experiment 4: Performance Analysis of the Proposed LSTM based Classifier 201 -- 5.4.6 Experiment 5: Classifier Performance Analysis with varying EEG Time-Window Length 202 -- 5.4.7 Statistical Validation of the Proposed LSTM Classifier using McNemar's Test 203 -- 5.5 Conclusions 204 -- References 204 -- Chapter 6: Cognitive Load Assessment in Motor Learning Tasks by Near-Infrared Spectroscopy Using Type-2 Fuzzy Sets -- 6.1 Introduction 214 -- 6.2 Principles and Methodologies 216 -- 6.2.1 Normalization of Raw Data 217 -- 6.2.2 Pre-processing 218 -- 6.2.3 Feature Extraction 218 -- 6.2.4 Training Instance Generation for Offline Training 219 -- 6.2.5 Feature Selection using Evolutionary Algorithm 219 -- 6.2.6 Classifier Training and Testing 221 -- 6.3 Classifier Design 221 -- 6.3.1 Preliminaries of IT2FS and GT2FS 221 -- 6.3.2 IT2FS Induced Classifier Design 222 -- 6.3.3 GT2FS Induced Classifier Design 228 -- 6.4 Experiments and Results 230 -- 6.4.1 Experimental Set-up 230 -- 6.4.2 Participants 232 -- 6.4.3 Stimulus Presentation for Online Classification 232 -- 6.4.4 Experiment 1: Demonstration of decreasing Cognitive Load with increasing Learning Epochs for similar stimulus 233 -- 6.4.5 Experiment 2: Automatic Extraction of Discriminating fNIRs features 234.
505 8 _a6.4.6 Experiment 3: Optimal Parameter Setting of Feature Selection and Classifier Units 235 -- 6.5 Biological Implications 237 -- 6.6 Performance Analysis 239 -- 6.6.1 Performance Analysis of the proposed IT2FS and GT2FS Classifier 239 -- 6.6.2 Statistical Validation of the Classifiers using McNemar's Test 242 -- 6.7 Conclusion 243 -- References 243 -- Chapter 7: Conclusions and Future Directions of Research on BCI based Memory and Learning -- 7.1 Self-Review of the Works Undertaken in the Book 250 -- 7.2 Limitations of EEG BCI-Based Memory Experiments 252 -- 7.3 Further Scope of Future Research on Memory and Learning 253 -- References.
506 _aRestricted to subscribers or individual electronic text purchasers.
520 _a"This book models human memory from a cognitive standpoint by utilizing brain activations acquired from the cortex by electroencephalographic (EEG) and functional near-infrared-spectroscopic (f-NIRs) means. It begins with an overview of the early models of memory. The authors then propose a simplistic model of Working Memory (WM) built with fuzzy Hebbian learning. A second perspective of memory models is concerned with Short-Term Memory (STM)-modeling in the context of 2-dimensional object-shape reconstruction from visually examined memorized instances. A third model assesses the subjective motor learning skill in driving from erroneous motor actions. Other models introduce a novel strategy of designing a two-layered deep Long Short-Term Memory (LSTM) classifier network and also deal with cognitive load assessment in motor learning tasks associated with driving. The book ends with concluding remarks based on principles and experimental results acquired in previous chapters"--
_cProvided by publisher
530 _aAlso available in print.
538 _aMode of access: World Wide Web.
650 0 _aMemory.
650 0 _aBrain-computer interfaces.
650 0 _aCognitive neuroscience.
655 4 _aElectronic books.
700 1 _aKonar, Amit,
_eauthor.
700 1 _aRakshit, Pratyusha,
_eauthor.
710 2 _aIEEE Xplore (Online Service),
_edistributor.
710 2 _aWiley,
_epublisher.
776 0 8 _iPrint version:
_aGhosh, Lidia.
_tCognitive modeling of human memory and learning.
_dHoboken, New Jersey : Wiley, [2020]
_z9781119705864
_w(DLC) 2020015457
856 4 2 _3Abstract with links to resource
_uhttps://ieeexplore.ieee.org/xpl/bkabstractplus.jsp?bkn=9292525
942 _2ddc
_cBK
999 _c40928
_d40928