International Teaching | MACHINE LEARNING
MACHINE LEARNING
cod. 0623200005

Department of Information and Electrical Engineering and Applied Mathematics
EQF7
Information Engineering for Digital Medicine
2024/2025
Compulsory
Year of course: 1
Year of didactic system: 2022
Spring semester
SSD | CFU | Hours | Activity
---|---|---|---
ING-INF/05 | 4 | 32 | Lessons
ING-INF/05 | 4 | 32 | Lab
ING-INF/05 | 1 | 8 | Exercises
Objectives

The course aims to provide the student with theoretical, methodological and technological knowledge of machine learning and of the analysis of large data sets, covering both traditional techniques and innovative paradigms such as deep learning.

Knowledge and understanding: paradigms of structural learning, statistical learning and neural learning; unsupervised learning; deep learning; paradigms and tools for big data analysis.

Applying knowledge and understanding: design and realization of solutions to learning and data-analytics problems by integrating existing tools and effectively tuning their operating parameters.
Prerequisites

The course requires basic knowledge of the Python programming language.
Contents

Learning Unit 1: Foundations (lecture/exercise/laboratory hours: 6/0/4)
- 1 (2 hours, lecture): Definition of machine learning. Historical outline. Machine learning tasks: supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning.
- 2 (2 hours, lecture): Training data. Kinds of data: numerical, categorical, structured. Learning as an optimization problem. Parameters and hyper-parameters. Overfitting. Bias and variance errors. The "no free lunch" theorem.
- 3 (2 hours, laboratory): Google Colab. The NumPy library.
- 4 (2 hours, lecture): Performance evaluation. Test set. Hyper-parameter tuning. Validation set. K-fold cross validation. Data augmentation. Regularization. The curse of dimensionality. Performance evaluation metrics: accuracy, precision, recall, receiver operating characteristic (ROC) curve.
- 5 (2 hours, laboratory): Exercise on a simple classification problem.
Knowledge and understanding: fundamental learning paradigms.
Applied knowledge and understanding: designing and implementing simple learning solutions.

Learning Unit 2: Introduction to artificial neural networks. The MLP, LVQ and SOM networks. (lecture/exercise/laboratory hours: 12/0/12)
- 6 (2 hours, lecture): Biological neurons. Artificial neural networks. The McCulloch-Pitts neuron. Rosenblatt's perceptron. Combinations of neurons. Feed-forward, lateral-connection and recurrent architectures. Fully-connected and sparsely-connected networks. Multi-layer perceptrons.
- 7 (2 hours, lecture): The universal approximation theorem. Training an MLP. Gradient descent. The back-propagation algorithm.
- 8 (2 hours, lecture): Stochastic gradient descent. Early stopping. Momentum. Adaptive learning rate. Regularization. Sigmoid and tanh activation functions. MLPs as binary classifiers. Binary cross-entropy loss. MLPs as multi-class classifiers. Categorical cross-entropy loss.
- 9 (2 hours, laboratory): The PyTorch framework.
- 10 (2 hours, laboratory): Model definition and training in PyTorch.
- 11 (2 hours, laboratory): Exercise on the use of an MLP as a classifier.
- 12 (2 hours, laboratory): Exercise on the use of an MLP as a regressor.
- 13 (2 hours, lecture): Reject option of a classifier.
- 14 (2 hours, laboratory): Exercise on the reject option of a classifier.
- 15 (2 hours, lecture): Competitive neural networks. Learning vector quantization. Supervised and unsupervised learning with LVQ.
- 16 (2 hours, lecture): The manifold learning problem. Self-organizing maps. Training a SOM.
- 17 (2 hours, laboratory): Exercise on LVQ and SOM networks.
Knowledge and understanding: the neural learning paradigm; architecture and operation of the MLP, LVQ and SOM networks.
Applied knowledge and understanding: designing and implementing machine learning solutions based on the neural paradigm; using the PyTorch framework for implementing and tuning neural networks.

Learning Unit 3: Deep learning (lecture/exercise/laboratory hours: 10/2/10)
- 18 (2 hours, lecture): Third-generation neural networks. Vanishing gradient and the ReLU activation function. Sparse connections. Weight sharing. Deep architectures. Advantages of deep networks. Representation learning. Transfer learning.
- 19 (2 hours, lecture): Convolutional neural networks. Operation and structure of a convolutional layer. Stride and padding. Pooling layers. Dropout layers. Output layer of a CNN.
- 20 (2 hours, laboratory): Exercises on CNNs.
- 21 (2 hours, laboratory): Exercises on CNNs, part 2.
- 22 (2 hours, lecture): Advanced aspects of the PyTorch framework. Non-sequential models. Weight sharing.
- 23 (2 hours, lecture): Customization of loss functions. Generators. Data augmentation for images.
- 24 (2 hours, laboratory): Exercise on fine-tuning and data augmentation.
- 25 (2 hours, lecture): Learning strategies for deep networks. Learning degradation. Skip connections and residual learning. Batch normalization. Greedy supervised pre-training. Auxiliary heads.
- 26 (2 hours, laboratory): Exercise on skip connections and residual learning.
- 27 (2 hours, laboratory): Exercise on residual learning with fine-tuning.
- 28 (2 hours, exercises): Project work presentation.
Knowledge and understanding: deep learning models and architectures, with special reference to convolutional networks; techniques to improve the training of deep networks.
Applied knowledge and understanding: designing and implementing machine learning solutions based on deep neural networks using PyTorch, including solutions based on fine-tuning of existing networks.

Learning Unit 4: Advanced architectures (lecture/exercise/laboratory hours: 6/2/8)
- 29 (2 hours, lecture): Reinforcement learning. Problem definition. Episodes. The state-action function Q. The Q-learning algorithm.
- 30 (2 hours, lecture): The actor-critic model. Replay buffer.
- 31 (2 hours, laboratory): Exercise on Q-learning.
- 32 (2 hours, laboratory): Exercise on the actor-critic model.
- 33 (2 hours, lecture): Recurrent neural networks. Unfolding. Learning tasks: sequence-to-sequence, sequence-to-value, value-to-value, sequence-to-sequence of different lengths. Back-propagation through time. LSTM and GRU architectures.
- 34 (2 hours, laboratory): Exercise on LSTMs.
- 35 (2 hours, exercises): Check on project work.
- 36 (2 hours, exercises): Check on project work.
Knowledge and understanding: reinforcement learning and the Q-learning algorithm; advanced architectures for deep learning, including RNNs.
Applied knowledge and understanding: designing and realizing learning solutions based on advanced architectures, with specific reference to reinforcement learning and solutions for sequential data processing.

Total lecture/exercise/laboratory hours: 34/6/32
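To illustrate the k-fold cross validation procedure covered in Learning Unit 1, the following is a minimal NumPy sketch, not course material: the fold count, the shuffling seed and the dummy model interface (`fit`/`predict`) are illustrative assumptions.

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Shuffle the sample indices and split them into k disjoint folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    return np.array_split(idx, k)

def cross_validate(model_fn, X, y, k=5):
    """Train on k-1 folds, measure accuracy on the held-out fold,
    and average the k scores (model_fn builds a fresh model each round)."""
    folds = k_fold_indices(len(X), k)
    scores = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        model = model_fn()
        model.fit(X[train_idx], y[train_idx])
        scores.append(np.mean(model.predict(X[test_idx]) == y[test_idx]))
    return float(np.mean(scores))
```

Averaging over folds gives a lower-variance performance estimate than a single train/test split, which is why the technique is paired with hyper-parameter tuning in lecture 4.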
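In the spirit of laboratory sessions 9-12 (Learning Unit 2), this is a minimal sketch of defining and training an MLP classifier in PyTorch; the layer sizes, learning rate, momentum, epoch count and the toy separable data set are all illustrative choices, not the course's assignments.

```python
import torch
from torch import nn

torch.manual_seed(0)

# A small MLP with one hidden layer, used as a binary classifier.
model = nn.Sequential(
    nn.Linear(2, 16),   # 2 input features -> 16 hidden units
    nn.ReLU(),
    nn.Linear(16, 2),   # 2 output logits, one per class
)
loss_fn = nn.CrossEntropyLoss()  # cross-entropy on raw logits
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Toy training set: label 1 when the sum of the two features is positive.
X = torch.randn(256, 2)
y = (X.sum(dim=1) > 0).long()

for epoch in range(100):      # full-batch gradient descent with momentum
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()           # back-propagation computes the gradients
    opt.step()

acc = (model(X).argmax(dim=1) == y).float().mean()
```

The same loop becomes stochastic gradient descent (lecture 8) by iterating over mini-batches of `X` instead of the full tensor.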
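The skip connections and residual learning of lecture 25 can be sketched as a PyTorch module; the channel count, kernel size and placement of batch normalization here are illustrative assumptions, not the exercise solution.

```python
import torch
from torch import nn

class ResidualBlock(nn.Module):
    """Residual block: output = relu(F(x) + x), where F is two
    convolutional layers. The skip connection lets the block learn a
    residual correction to the identity, easing deep-network training."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)  # skip connection adds the input back
```

Because `F(x)` and `x` are summed, the block preserves the input shape, which is why stacked residual blocks keep a constant channel count unless a projection is added.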
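The Q-learning algorithm of Learning Unit 4 can be illustrated with tabular Q-learning on a toy environment; the 1-D corridor, the reward scheme and the constants (alpha, gamma, epsilon) are invented for the sketch and are not the course's exercise.

```python
import numpy as np

# Toy environment: states 0..4 on a line, actions 0 (left) and 1 (right);
# reward 1 for reaching the terminal state 4, otherwise 0.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))           # state-action value table
alpha, gamma, eps = 0.5, 0.9, 0.2             # illustrative constants
rng = np.random.default_rng(0)

def step(s, a):
    """Apply the action; the episode ends when state 4 is reached."""
    s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    reward = 1.0 if s2 == n_states - 1 else 0.0
    return s2, reward, s2 == n_states - 1

for episode in range(200):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection balances exploration/exploitation
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        # Q-learning update: bootstrap on the best value of the next state
        target = r + (0.0 if done else gamma * np.max(Q[s2]))
        Q[s, a] += alpha * (target - Q[s, a])
        s = s2
```

After training, the greedy policy `argmax(Q[s])` moves right from every non-terminal state, and the values decay by a factor gamma per step from the goal.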
Teaching Methods

The course comprises theoretical lectures, in-class exercises and practical laboratory sessions.
Verification of learning

The exam consists of the discussion of a team project work (for teams of 3-4 people) and an oral interview. The discussion of the project work evaluates the ability to build a simple application of the tools presented in the course to a problem assigned by the teacher, and includes a practical demonstration of the realized application, a presentation of a quantitative evaluation of its performance, and a description of the technical choices involved in its realization. The interview evaluates the level of knowledge and understanding of the theoretical topics, together with the candidate's exposition skills.
Texts

"Deep Learning", Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press.

Lecture notes and other material provided during the course. Supplementary teaching material will be available on the University e-learning platform (http://elearning.unisa.it), accessible to students using their own university credentials.
More Information

The course is held in English.