BIL4015 | Artificial Neural Networks | 3+0+0 | ECTS: 4
Year / Semester | Fall Semester
Level of Course | First Cycle
Status | Elective
Department | DEPARTMENT of COMPUTER ENGINEERING
Prerequisites and co-requisites | None
Mode of Delivery |
Contact Hours | 14 weeks - 3 hours of lectures per week
Lecturer | Prof. Dr. Murat EKİNCİ
Co-Lecturer | None
Language of instruction | Turkish
Professional practice (internship) | None
The aim of the course: | The course aims to teach students the principles of Artificial Neural Networks (ANN). The fundamentals of artificial neural systems theory, algorithms for information acquisition and retrieval, example applications, and implementation issues are also covered.
Learning Outcomes | CTPO | TOA
Upon successful completion of the course, the students will be able to:
LO - 1 : | understand what an ANN is and how it works | 2,3,4,12 | 1, 3
LO - 2 : | design and train feedforward networks | 2,3,4,12 | 1, 3
LO - 3 : | design and train feedback networks | 2,3,4,12 | 1, 3
LO - 4 : | gain knowledge of how multi-layer ANNs work and are trained | 2,3,4,12 | 1, 3
CTPO: Contribution to programme outcomes, TOA: Type of assessment (1: written exam, 2: oral exam, 3: homework assignment, 4: laboratory exercise/exam, 5: seminar / presentation, 6: term paper), LO: Learning Outcome
Introduction; Fundamental Concepts and Models of Artificial Neural Networks; Learning Rules; Classification Models: Discriminant Functions, Linear Machine; Nonparametric Training Concepts; Training and Classification Using the Discrete Perceptron; Single-Layer Single-Level Continuous Perceptron Networks; Single-Layer Multi-Level Continuous Networks; Delta Learning Rule for the Multiperceptron Layer; Generalized Delta Rule for Fully Connected Networks (FCN); Learning Factors in FCN; Single-Layer Feedback Networks, Unsupervised Learning and Clustering; Convolutional Neural Networks (CNN); CNN Architectures: AlexNet, VGGNet, ResNet, YOLO
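As an illustration of the single-layer continuous perceptron and delta learning rule topics listed above, the following minimal NumPy sketch trains one sigmoid unit on a toy linearly separable problem. It is not part of the official course material; the AND-gate data, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the delta learning rule for a single continuous
# (sigmoid) perceptron. Toy AND-gate data and hyperparameters are
# illustrative assumptions, not course-specified values.

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# Inputs augmented with a constant bias component of 1.
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
d = np.array([0, 0, 0, 1], dtype=float)   # desired outputs (AND gate)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=3)          # weights, including bias weight
eta = 0.5                                  # learning rate (assumed)

for epoch in range(5000):
    for x, target in zip(X, d):
        o = sigmoid(w @ x)                 # actual output
        # Delta rule: dw = eta * (d - o) * f'(net) * x, with f'(net) = o(1 - o)
        w += eta * (target - o) * o * (1.0 - o) * x

print("trained outputs:", np.round(sigmoid(X @ w), 3))
```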
Course Syllabus
Week | Subject | Related Notes / Files
Week 1 | Artificial neural systems: Introduction |
Week 2 | Fundamental Concepts and Models of Artificial Neural Networks |
Week 3 | Neural Network Learning Rules |
Week 4 | Classification Models: Discriminant Functions, Linear Machine |
Week 5 | Nonparametric Training Concepts |
Week 6 | Training and Classification with the Single-Layer Discrete Perceptron |
Week 7 | Single-Layer Continuous Perceptron Networks for Classification and Regression |
Week 8 | Linearly Nonseparable Pattern Classification |
Week 9 | Mid-term exam |
Week 10 | Delta Learning Rule for Multiperceptron Layers |
Week 11 | Generalized Delta Rule (Error Backpropagation) for Fully Connected Networks (FCN) |
Week 12 | Learning Factors in Multi-Layer Networks |
Week 13 | Single-Layer Feedback Networks, Unsupervised Learning and Clustering |
Week 14 | Convolutional Neural Networks (CNN) |
Week 15 | CNN Architectures for Classification, Detection, Segmentation: AlexNet, VGGNet, ResNet, YOLO, U-Net |
Week 16 | End-of-term exam |
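To illustrate the generalized delta rule (error backpropagation) covered in Weeks 11-12, the sketch below trains a small fully connected network on the XOR problem, a standard example of the linearly nonseparable classification discussed in Week 8. The network size (2-2-1), learning rate, and epoch count are illustrative assumptions, not course-prescribed values.

```python
import numpy as np

# Minimal sketch: generalized delta rule (error backpropagation) for a
# 2-2-1 fully connected network solving XOR. Hyperparameters are assumed;
# convergence depends on the random weight initialization.

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.5, size=(2, 2))   # input -> hidden weights
b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1))   # hidden -> output weights
b2 = np.zeros(1)
eta = 0.5                                 # learning rate (assumed)

for epoch in range(20000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)              # hidden-layer activations
    o = sigmoid(h @ W2 + b2)              # network outputs

    # Backward pass: error signals for output and hidden layers
    delta_o = (d - o) * o * (1.0 - o)
    delta_h = (delta_o @ W2.T) * h * (1.0 - h)

    # Weight and bias updates (batch gradient step)
    W2 += eta * h.T @ delta_o
    b2 += eta * delta_o.sum(axis=0)
    W1 += eta * X.T @ delta_h
    b1 += eta * delta_h.sum(axis=0)

print("XOR outputs:", np.round(o.ravel(), 3))
```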
1 | Zurada, J. M., 1992, Introduction to Artificial Neural Systems, West Publishing Company, 825 p.
1 | Cichocki, A., Unbehauen, R., 1993, Neural Networks for Optimization and Signal Processing, John Wiley, 526 p.
Method of Assessment
Type of assessment | Week No | Date | Duration (hours) | Weight (%)
Mid-term exam | 9 | 25/11/2020 | 2 | 30
Project | 15 | 29/12/2020 | 2 | 20
End-of-term exam | 16 | 25/01/2021 | 2 | 50
Student Work Load and its Distribution
Type of work | Duration (hours per week) | No of weeks / Number of activity | Hours in total per term
Face-to-face education | 3 | 14 | 42
Out-of-class study | 1 | 14 | 14
Preparation for mid-term exam | 10 | 1 | 10
Mid-term exam | 2 | 1 | 2
Homework | 2 | 8 | 16
Preparation for end-of-term exam | 11 | 1 | 11
End-of-term exam | 2 | 1 | 2
Total work load | | | 97