|
BIL5050 | Artificial Neural Systems | 3+0+0 | ECTS: 7.5
Year / Semester | Fall Semester
Level of Course | Second Cycle
Status | Elective
Department | Department of Computer Engineering
Prerequisites and co-requisites | None
Mode of Delivery | Face to face
Contact Hours | 14 weeks - 3 hours of lectures per week
Lecturer | Prof. Dr. Murat EKİNCİ
Co-Lecturer | None
Language of Instruction |
Professional Practice (Internship) | None
The aim of the course | To give information about artificial neural systems (ANS).
Programme Outcomes
Upon successful completion of the course, the students will be able to:
PO-1 | understand what an ANN is and how it works | CTPO: 1,4,5,8,9,10,15 | TOA: 1,3
PO-2 | design and train feedforward networks | CTPO: 1,4,5,8,15 | TOA: 1,3
PO-3 | design and train feedback networks | CTPO: 1,3,5,8,13,15 | TOA: 1,3
PO-4 | gain knowledge of how multi-layer ANNs work and how they are trained | CTPO: 1,3,4,5,8,9,14,15 | TOA: 1,3
PO-5 | design and train associative memory networks | CTPO: 1,3,5,8,15 | TOA: 1,3
PO-6 | design and apply convolutional neural networks | |
CTPO: Contribution to programme outcomes. TOA: Type of assessment (1: written exam, 2: oral exam, 3: homework assignment, 4: laboratory exercise/exam, 5: seminar/presentation, 6: term paper). PO: Learning outcome.
Fundamental concepts and models of ANS; single-layer perceptron classifiers; multilayer feedforward networks; single-layer feedback networks; associative memories. Convolutional neural networks: architectures, convolution/pooling layers, case studies (AlexNet, VGGNet, DarkNet, ResNet, DenseNet); recurrent neural networks (RNN); long short-term memory (LSTM) networks.
|
Course Syllabus
Week 1 | Fundamental concepts and models of ANS: biological neurons, models of artificial neural networks (ANN), neural processing, learning and adaptation
Week 2 | Learning rules in neural networks
Week 3 | Neuron activation function models and mathematical concepts (sign, sigmoid, softmax, ReLU, etc.)
Week 4 | Single-layer perceptron networks; error back-propagation concepts and theories
Week 5 | Multi-layer fully connected network (FCN) theory and design
Week 6 | Error back-propagation training in multilayer fully connected networks, and coding with C/C++
Week 7 | Learning factors in multi-layer FCN models
Week 8 | Regression with FCN
Week 9 | Mid-term examination
Week 10 | Single-layer feedback networks; associative memory and networks; unsupervised learning and clustering
Week 11 | Convolutional neural networks: basic concepts, architecture, and feed-forward processing
Week 12 | Convolutional neural networks: training weight parameters with error back-propagation; theory and coding with C/C++
Week 13 | CNN architecture models (case studies: AlexNet, VGGNet, ResNet, DenseNet, ViT, etc.)
Week 14 | Neuron models used in CNNs (ResNet, RNN, LSTM, GAN)
Week 15 | CNN models for detection (FCN, YOLO, etc.); CNN models for image segmentation (SegNet, U-Net, etc.)
Week 16 | End-of-term exam
Textbook
1 | Jacek M. Zurada, Introduction to Artificial Neural Systems, West Publishing Company
Recommended Reading
1 | Simon Haykin, Neural Networks and Learning Machines, Pearson International Edition
2 | Mohamad H. Hassoun, Fundamentals of Artificial Neural Networks, The MIT Press
Method of Assessment
Type of assessment | Week No | Date | Duration (hours) | Weight (%)
Project | 15 | 25/01/2021 | 10 | 50
End-of-term exam | 16 | 17/01/2021 | 2.0 | 50
Student Work Load and its Distribution
Type of work | Duration (hours per week) | Number of weeks / activities | Hours in total per term
Face-to-face education | 3 | 14 | 42
Out-of-class study | 3 | 14 | 42
Preparation for mid-term exam | 4 | 1 | 4
Mid-term exam | 2 | 1 | 2
Project | 25 | 1 | 25
Preparation for end-of-term exam | 5 | 1 | 5
End-of-term exam | 2 | 1 | 2
Total work load | | | 122
|