
COMPUTER SCIENCE

Master's degree programme

Study Plan


Curricula:


BDT - Big Data Technologies

First year

  • Parallel and Distributed Systems: paradigms and models (9 cfu)

    • The course aims to provide a mix of foundations and advanced knowledge in the field of parallel computing, specifically targeted at high-performance applications. A first, relatively small part of the course will provide the necessary background on parallel hardware, from multicores to accelerators up to distributed systems such as clusters and clouds. The principles of parallel computing will then be addressed, including the measures that characterize parallel computations, the mechanisms and policies that support parallel computing, and the typical models for high-performance computing. Finally, a survey of existing programming frameworks will be included, aimed at preparing students to use and exploit the most modern and advanced frameworks currently in use both in research institutions and in production contexts. As a result, students attending the course will be given a general perspective of the parallel computing area as well as a complete survey of the frameworks currently available for high-performance computing. The whole series of topics will be complemented by practical exercises, in class (according to the "bring-your-own-device" principle) or as homework assignments, to be carried out also on machines made available by our department. The different programming frameworks used in the course will be introduced by detailing their main features and usage models, leaving to the student the task of learning the low-level syntactic details (under the supervision of the teachers) as part of the homework assignments.

      Contents
      a) Evolution of computing devices from sequential to parallel: introduction to multicore, multi-purpose cores, accelerators, cluster and cloud architectures.
      b) Principles of parallel computing: measures of interest (time and power), horizontal and vertical scalability, communication/sharing and synchronization mechanisms, concurrent activities (processes, threads, kernels), vectorization, typical patterns for data-intensive parallel computing.
      Laboratory exercises and assignments using state-of-the-art parallel programming frameworks targeting shared-memory multicores.
      c) Advanced parallel and distributed processing frameworks for data-intensive applications: GPUs, data stream processing and data-intensive programming frameworks.
      Laboratory exercises and assignments using state-of-the-art programming frameworks targeting distributed architectures or accelerators.
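As a quick illustration of the measures that characterize parallel computations mentioned in the course description, the ideal speedup and efficiency predicted by Amdahl's law can be sketched in a few lines (an illustrative example, not part of the course material):

```python
def amdahl_speedup(serial_fraction, n_workers):
    """Ideal speedup with n_workers, given the fraction of the
    work that must remain sequential (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

def efficiency(serial_fraction, n_workers):
    """Parallel efficiency: achieved speedup divided by worker count."""
    return amdahl_speedup(serial_fraction, n_workers) / n_workers
```

Even a 10% sequential fraction caps the speedup on 8 workers at roughly 4.7, i.e. about 59% efficiency.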

  • Data Mining (9 cfu)

    • This course provides a structured introduction to the key methods of data mining and the design of knowledge discovery processes. Organizations and businesses are overwhelmed by the flood of data continuously collected into their data warehouses as well as sensed by all kinds of digital technologies - the web, social media, mobile devices, the internet of things. Traditional statistical techniques may fail to make sense of the data, due to its inherent complexity and size. Data mining, knowledge discovery and statistical learning techniques emerged as an alternative approach, aimed at revealing patterns, rules and models hidden in the data, and at supporting the analyst in developing descriptive and predictive models for a number of challenging problems.
      • Fundamentals of data mining and of the knowledge discovery process from data.
      • Design of data analysis processes.
      • Statistical exploratory analytics for data understanding.
      • Dimensionality reduction and Principal Component Analysis.
      • Clustering analysis with centroid-based, hierarchical and density-based methods.
      • Predictive analytics and classification models (including decision trees, Bayesian, rule-based, kernel-based, SVM, random forest and ensemble methods).
      • Pattern mining and association rule discovery.
      • Validation and interpretation of discovered patterns and models within statistical frameworks.
      • Design and development of data mining processes using state of the art technology, including KNIME, Python, and R, within a wrap-up project aimed at using and possibly modifying the DM tools and libraries learned in class.
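As a sketch of the centroid-based clustering listed above, here is a minimal k-means in plain Python (illustrative only; the course relies on KNIME, Python and R toolkits rather than hand-rolled code, and the naive first-k initialization is an assumption made for brevity):

```python
def kmeans(points, k, iters=100):
    """Naive k-means: seed centroids with the first k points,
    then alternate assignment and centroid-update steps."""
    centroids = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to the centroid with smallest squared distance
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        new_centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged: assignments stable
            break
        centroids = new_centroids
    return centroids, clusters
```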

  • Bioinformatics (6 cfu)

    • This course aims to give the student an overview of algorithmic methods conceived for the analysis of genomic sequences, and the ability to critically observe the practical impact of algorithmic design on real problems with relevant applications. The exam, besides the obvious goal of evaluating the students' understanding of the course contents, is additionally meant as a chance to learn what a scientific paper is like, and how to give an oral presentation on scientific/technical topics and design it for a specific audience.
      • A brief introduction to molecular biology
      • Sequence Alignment
      • Pattern Matching
      • Fragment Assembly
      • Next-Generation Sequencing
      • Motif Extraction

  • Algorithm engineering (9 cfu)

    • Study, design and analyze advanced algorithms and data structures for the efficient solution of combinatorial problems involving all basic data types, such as integer sequences, strings, (geometric) points, trees and graphs. The design and analysis will involve several models of computation — such as RAM, 2-level memory, cache-oblivious, streaming — in order to take into account the architectural features of modern PCs and the availability of Big Data upon which algorithms work. We will complement this theoretical analysis with several engineering considerations stemming from the implementation of the proposed algorithms and from experiments published in the literature.
      • Design of algorithms for massive datasets: disk-aware or cache-oblivious
      • Design of advanced data structures in hierarchical memories for atomic or string data
      • Data compression for structured and unstructured data
      • Algorithms for large graphs
      • Engineering considerations about the implementation of algorithms and data structures

  • Information Retrieval (6 cfu)

    • In this course we will study, design and analyze (theoretically and experimentally) software tools for IR applications dealing with unstructured (raw), structured (DB-centric) or semi-structured data (i.e. HTML, XML). We will mainly concentrate on the basic components of a modern Web search engine, examining in detail the algorithmic solutions currently adopted to implement its main software modules. We will also discuss their performance and/or computational limitations, as well as introduce measures for evaluating their efficiency and effectiveness. Finally, we will survey some algorithmic techniques frequently adopted in the design of IR tools managing large datasets.
      • Search engines
      • Crawling, text analysis, indexing, ranking
      • Storage of Web pages and the (hyper-)link graph
      • Results processing and visualization
      • Other data types: XML, textual DBs
      • Data processing for IR tools
      • Data streaming
      • Data sketching
      • Data compression
      • Data clustering (sketch)
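The indexing and query-processing components listed above can be illustrated with a toy inverted index supporting boolean AND queries (a minimal sketch, not the production-grade algorithms studied in the course):

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def and_query(index, terms):
    """Intersect postings lists, shortest first, to answer an AND query."""
    postings = sorted((index.get(t, set()) for t in terms), key=len)
    result = set(postings[0]) if postings else set()
    for p in postings[1:]:
        result &= p
    return sorted(result)
```

Intersecting the shortest postings list first is the classic optimization: the running intersection can only shrink, so later (longer) lists are scanned against a small candidate set.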


  • Advanced databases (9 cfu)

    • Database systems occupy a central position in our information-based society, and computer scientists and database application designers should have a good knowledge of both the theoretical and the engineering concepts that underlie these systems, to ensure the desired application performance. The student who completes the course successfully will demonstrate advanced technical knowledge of the main issues related to the implementation of both classical centralized relational database systems for operational and OLAP processing and of recent advances in non-relational data models (columnar, document, key-value, graph) and scalable distributed architectures. The skills provided will make the student a sophisticated developer of high-performance database applications.
      • Internals of relational database management systems.
      • Data Warehousing management systems and On-Line Analytical Processing.
      • Extract-Transform-Load and query/reporting in OLAP systems.
      • Beyond SQL: NoSQL data management systems for big data.
      • Distributed data processing and the Map-Reduce paradigm.

  • Computational mathematics for learning and data analysis (9 cfu)

    • The course introduces some of the main techniques for the solution of numerical problems that find widespread use in fields like data analysis, machine learning, and artificial intelligence. These techniques often combine concepts typical of numerical analysis with those proper of numerical optimization, since numerical analysis tools are essential to solve optimization problems, and, vice-versa, problems of numerical analysis can be solved by optimization algorithms. The course has a significant hands-on part whereby students learn how to use some of the most common tools for computational mathematics; during these sessions, specific applications will be briefly illustrated in fields like regression and parameter estimation in statistics, approximation and data fitting, machine learning, artificial intelligence, data mining, information retrieval, and others.
      • Multivariate and matrix calculus
      • Matrix factorization, decomposition and approximation
      • Eigenvalue computation
      • Nonlinear optimization: theory and algorithms
      • Least-squares problems and data fitting
      • MATLAB and other software tools (lab sessions with applications)
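As a tiny worked example of the least-squares topic, fitting a straight line y = a + b·x by solving the 2×2 normal equations in closed form (an illustrative sketch; the course itself works with MATLAB and general matrix formulations):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x via the normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx          # determinant of the 2x2 normal matrix
    b = (n * sxy - sx * sy) / det    # slope
    a = (sy - b * sx) / n            # intercept
    return a, b
```

On data lying exactly on a line the residual is zero and the coefficients are recovered exactly; on noisy data the same formulas minimize the sum of squared vertical residuals.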

  • 6 CFU to be chosen from group BDT-1: related 6-CFU courses in the first year

    • Related 6-CFU courses of the KD curriculum offered in the first year
    • Peer to peer systems and blockchains (6 cfu)

      • Objectives: introduction to the basic technologies for the development of highly distributed systems and to some real scenarios exploiting them; presentation of the disruptive technology of blockchains and its numerous applications to different fields.
        P2P Topologies (2 CFU)
        - Peer-to-peer (P2P) systems: general concepts (1/2 CFU)
        - Unstructured overlays: flooding, random walks, epidemic diffusion (1/2 CFU)
        - Structured overlays: Distributed Hash Tables (DHT), routing on a DHT (1/2 CFU)
        - Case studies: BitTorrent as a Content Distribution Network; the KAD implementation of the Kademlia DHT; game-based cooperation (1/2 CFU)
        Complex networks for the analysis of P2P systems (2 CFU)
        - Network models (1 CFU)
        - Case study: the Freenet darknet (1 CFU)
        Cryptocurrencies and blockchains (5 CFU)
        - Basic concepts (1 CFU)
        - The Bitcoin protocol (2 CFU)
        - Bitcoin extensions/alternatives (1/2 CFU): overview of altcoins; sidechains; the Stellar Consensus Protocol
        - Further applications of blockchains (1 CFU): Ethereum and programming smart contracts; Blockchain 1.0: cryptocurrencies; Blockchain 2.0: financial instruments built on cryptocurrencies; Blockchain 3.0: applications beyond cryptocurrencies (DNS, lotteries, voting, IoT...)
        - Legal aspects of cryptocurrencies (1/2 CFU)
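The core idea behind the blockchain topics above, each block committing to its predecessor through a cryptographic hash, can be sketched in a few lines of Python (a toy hash chain; consensus, proof-of-work and transaction structure are deliberately omitted):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor

def block_hash(block):
    """Deterministic SHA-256 digest of a block's canonical JSON encoding."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(transactions):
    """Build a chain where each block stores the hash of the previous one."""
    chain, prev = [], GENESIS
    for tx in transactions:
        block = {"prev": prev, "tx": tx}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Recompute every link; any tampering breaks a downstream hash."""
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True
```

Modifying any block invalidates the `prev` pointer of its successor, which is what makes retroactive tampering detectable.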
    • Computational models for complex systems (6 cfu)

      • The objective of this course is to train experts in systems modelling and analysis methodologies. Of course, this will require understanding, to some degree of detail, the mathematical and computational techniques involved. However, this will be done with the aim of shaping good modellers, who know the advantages/disadvantages/risks of the different modelling and analysis methodologies, who are aware of what happens under the hood of a modelling and analysis tool, and who can develop their own tools if needed. The course will focus on advanced modelling approaches that combine different paradigms and analysis techniques: from ODEs to stochastic models, from simulation to model checking. Case studies from population dynamics, biochemistry, epidemiology, economics and the social sciences will be analysed. Moreover, synergistic approaches that combine computational modelling techniques with data-driven methodologies will be outlined.
        - Modelling with ODEs: examples
        - (Timed and) hybrid automata: definition and simulation techniques
        - Stochastic simulation methods (Gillespie's algorithm and its variants)
        - Hybrid simulation methods (stochastic/ODEs)
        - Rule-based modelling
        - Probabilistic/stochastic model checking: principles, applicability and tools
        - Statistical model checking
        - Process mining (basic notions)
    • Social and ethical issues in information technology (6 cfu)

      • The progress in AI research makes it timely to focus not only on making AI more capable, but also on maximizing the societal benefit of AI [from "Research Priorities for Robust and Beneficial Artificial Intelligence", an open letter]. This concern is by necessity interdisciplinary, because it involves both society and AI. It ranges from economics, law and philosophy to computer security, formal methods and, of course, various branches of AI itself. The course will be organized as a series of seminars on different hot topics. Contents may include:
        • Philosophical implications of AI
          - Technological limitations to AI
          - Biological limitations to human intelligence
        • Economic impact of emerging technologies
          - The disappearance of intellectual jobs
          - The rise of the corporate colossus
          - The power of data in the hands of big companies, and self-regulation (Partnership on AI)
        • Legal and ethical issues
          - Privacy, transparency, biases and fairness, data lock-in
          - Autonomous vehicles and weapons
          - Machines for human care and threats to human dignity: customer care, robot companions...
        • Trustworthiness of the technologies
          - Safety, robustness, and control
        • Future scenarios
          - Superintelligence implications
    • ICT infrastructures (6 cfu)

      • The goal of the course is to introduce students to the computing infrastructures powering cloud services. At the end of the course a student should be able to understand the general organization of a datacenter and the logical infrastructure that powers virtualization and containers. The course starts from physical infrastructures such as power and datacenter organization. The network fabric is then introduced, with particular focus on SDN techniques used to balance East-West and North-South traffic. Storage and compute are then introduced, with special attention to hyperconverged systems.
        - Physical infrastructures (datacenters, energy and PUE, SCADAs) (1 CFU)
        - Networking (SDN and overlays; fabrics: RDMA, OPA, InfiniBand; monitoring techniques) (2 CFU)
        - Storage (SDS) (1 CFU)
        - Computing (hypervisors) (2 CFU)
    • Big Data Analytics (6 cfu)

      • In our digital society, every human activity is mediated by information technologies. Therefore, every activity leaves digital traces behind that can be stored in some repository: phone call records, transaction records, web search logs, movement trajectories, social media texts and tweets... Every minute an avalanche of "big data" is produced by humans, consciously or not, representing a novel, accurate digital proxy of social activities at global scale. Big data provide an unprecedented "social microscope", a novel opportunity to understand the complexity of our societies, and a paradigm shift for the social sciences. This course is an introduction to the emergent field of big data analytics and social mining, aimed at acquiring and analyzing big data from multiple sources to discover the patterns and models of human behavior that explain social phenomena. The focus is on what can be learnt from big data in different domains: mobility and transportation, urban planning, demographics, economics, social relationships, opinion and sentiment, etc., and on the analytical and mining methods that can be used. An introduction to scalable analytics is also given, using the "map-reduce" paradigm.
        1. Big data sources: open (linked) data, web activity data, social network data, social media data, mobile phone data, GPS navigation data, commercial transaction data, tourism-related data, crowdsourcing/crowdsensing.
        2. Big data analytics and social mining methods: data preprocessing, exploratory data analysis, correlation analysis, feature selection, semantic enrichment, pattern discovery, classification and prediction, clustering and segmentation for:
           - the discovery of individual social profiles
           - the analysis of collective behavior
           - the discovery of the emotional content of text and sentiment analysis
        3. Big data analytics domains:
           - mobility and transportation
           - nowcasting of socio-economic indicators of progress, happiness, etc.
           - Twitterology and nowcasting of social mood and trends
           - tourism
        4. Ethical issues of big data analytics:
           - privacy and personal data protection
           - privacy-preserving analytics
           - social responsibility of data scientists
        5. Scalable data analytics:
           - paradigms of NoSQL databases
           - data analysis processes with the "map-reduce" paradigm
    • Scientific and large data visualization (6 cfu)

      • The availability of data generated from sensors, mobile devices, social networks, and so on has grown continuously in recent years. Visualization is what one needs to put data to good use: it allows us to analyze, explore and communicate possibly large and complex data in a meaningful way. The first part of the course will deal with scientific visualization, which concerns the graphical illustration of scientific data (for example, biological data) to understand and glean insights into the underlying phenomena. The second part of the course will introduce the fundamentals of information visualization. Unlike scientific visualization, where data have an immediate physical representation, information visualization often deals with abstract data which do not have a direct visual representation, like the network of people's connections on a social network. We will learn to decide what to visualize, how to abstract and encode it using different graph types, and how to evaluate different solutions according to visual perception rules. Fundamentals of scientific and data visualization. Visual perception. Best practices in data visualization. Visualization techniques for both scientific phenomena and abstract data. Visualization libraries. By the end of the course, the students will be able to illustrate and communicate data and results using visualization, also for complex and large datasets, using existing libraries and software tools for visualization purposes (e.g. seaborn, D3.js).
        Syllabus
        1. Introduction: differences between scientific visualization, data visualization, interactive visualization, visual analytics and infographics.
        2. Scientific visualization: 3D data visualization; spatial data structures and indexing; flow visualization; the Paraview tool; Topological Data Analysis for scientific visualization; TTK - Topology Toolkit.
        3. Data visualization pipeline: data and attribute types; data preprocessing; graph and chart types; encoding and decoding processes; evaluation of visualizations.
        4. Visual perception: fundamentals; Gestalt laws; preattentive processes; color.
        5. Time series.
        6. Animated charts.
        7. Graph drawing: trees; small graphs; large graphs.
        8. Multi-dimensional data visualization: multi-dimensional glyphs; dimensionality reduction techniques; ordering/sorting; dataset summarization.
        9. Machine learning and data visualization.
        10. Python for data science and visualization: intro, NumPy, Pandas; Python's visualization ecosystem (Matplotlib, Plotly, ...).
        11. D3.js.
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective, in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools will be provided to help the understanding of the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science, engineering).
        Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games, together with the main algorithms for their analysis. The theoretical analysis will be paired with applications to problems from a wide range of areas, chosen according to the students' interests (e.g., ranging from computational economics and the social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning).
        Skills: The course aims at providing students with a suitable background to:
        - formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers
        - understand the inner mechanisms of competition and cooperation
        - understand the inner mechanisms of threats and promises
        - forecast the behaviour of agents
        - design mechanisms to steer systems towards desired objectives through adequate mathematical models
        Syllabus:
        - Noncooperative games
        - Auctions and bargaining
        - Cooperative games
        - Game theory in practice
        - Applications (computer science, economic computation, etc.)
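A pure-strategy Nash equilibrium check for a two-player game gives a concrete feel for the noncooperative-games material (a minimal sketch; the payoff table and strategy encoding below are illustrative assumptions, not course material):

```python
def is_nash(payoff, profile, n_strategies=2):
    """Check whether the pure strategy profile (i, j) is a Nash equilibrium:
    neither player can gain by unilaterally deviating."""
    i, j = profile
    if any(payoff[(a, j)][0] > payoff[(i, j)][0] for a in range(n_strategies)):
        return False  # player 1 has a profitable deviation
    if any(payoff[(i, b)][1] > payoff[(i, j)][1] for b in range(n_strategies)):
        return False  # player 2 has a profitable deviation
    return True

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect; entries are (p1, p2).
prisoners_dilemma = {
    (0, 0): (-1, -1), (0, 1): (-3, 0),
    (1, 0): (0, -3),  (1, 1): (-2, -2),
}
```

Mutual defection is the unique pure Nash equilibrium of this game, even though mutual cooperation gives both players a higher payoff, which is exactly the tension between competition and cooperation the course analyses.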
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce Master students in Computer Science to the entrepreneurial mindset and give them preliminary training in it. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students attending the course do not need to have a startup idea: they will participate in a "startup building process", meaning that they will learn and practice all the main steps that shape a (possibly simple) "ICT technical idea" into a viable product, be it that of a startup or of a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course will hinge on frontal lectures on the basic principles and methodologies underlying innovation, combined with a learning-by-doing experience.
        Skills: entrepreneurial mindset, pills of innovation methodologies and IP protection, how to build the value of an ICT idea.
        Syllabus:
        - What a company is and what its purpose is
        - Pills of IP protection
        - ICT company structure and roles
        - B2B vs B2C
        - Value Proposition Design for ICT
        - ICT team management
        - From idea to startup: the journey
        - How to found a startup in less than 10 hours
        - Fundraising and spending: why you need the money and how you should spend it
    • 3D Geometric Modeling & Processing (6 cfu)

      • In this course we study the fundamental algorithms, data structures, and mathematics behind current approaches for manipulating and processing geometric data in a variety of real-world applications, like computer-aided design, interactive computer graphics, reliable physical simulations, and robust 3D representations for machine learning. The course will present the data structures for simplicial complexes and the current discrete representations used to manage 3D shapes in common applications. It will also introduce the basic notions of differential geometry and topology that are useful for a better comprehension of algorithms in computer graphics. The most common mesh processing algorithms will be explained with their practical applications and available implementations. The purpose of the course is to illustrate the most critical mathematical, geometric and algorithmic foundations for representing and processing 3D shapes in computer graphics.
        Syllabus:
        1. Basics of differential geometry and topology for computer graphics
        2. Discrete representations and data structures for simplicial complexes and spatial indexing
        3. Mesh processing algorithms: remeshing, refinement & simplification; parametrization and texturing; fairing and smoothing; surface reconstruction and sampling
        4. Shape analysis and representations for machine learning
        Prerequisites: knowledge of linear algebra and calculus. Recommended: C++, Python
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction to, and practical experience with, the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that will allow them to implement quantum algorithms on actual cloud-based physical quantum computers.
        Skills: The course aims at providing students with a suitable background to understand the new quantum computing way of reasoning, and to design/analyze quantum algorithms for various application fields. The algorithms will be run on both simulators and real prototypes of quantum machines.
        Prerequisites: linear algebra, basic concepts of numerical analysis, theory of algorithms.
        Syllabus:
        - Fundamental concepts: basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac's bracket notation); qubits, quantum gates, and circuits; superposition and entanglement
        - Fundamental algorithms: teleportation; Grover's quantum search algorithm; Quantum Fourier Transform; Shor's integer factorization algorithm
        - Recent applications: quantum data preparation and QRAM; examples of quantum machine learning algorithms; quantum random walks; quantum PageRank
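The qubit/gate formalism in the syllabus can be demonstrated with a tiny state-vector simulation of the Hadamard gate (an illustrative sketch, not the software stack used in the course):

```python
import math

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Multiply a unitary matrix by a state vector of amplitudes."""
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitude magnitudes."""
    return [abs(amplitude) ** 2 for amplitude in state]
```

Applying H twice returns the original state, since the Hadamard gate is its own inverse; measuring the intermediate superposition gives |0> or |1> with probability 1/2 each.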
    • Computational Health Laboratory (6 cfu)

      • The purpose of this laboratory course is to introduce computer science students to the applicative domain of computational health. Industrial-scale applications will be handled with the tools acquired by the students over a 5-year course of study. Pharmaceutical, food and biotech industries are increasingly becoming computationally driven, and skills to address these challenges are missing. The lab will teach the students a language to interact with medical doctors or scientists in the reference industries, a prerequisite for practitioners and scientists in the field. The lab will teach, with hands-on experience, emerging technologies that are essential in this applicative domain and will help students navigate the plethora of available methods and technologies to select the most suitable for each problem. Besides the technological aspects of the lab, we will also make clear connections with the social and ethical aspects of computational health and with how students with these skills can have an impact in the world.
        Knowledge: The course will quickly present the working language needed to address the biological and medical concepts one must understand for working in a biomedical, pharmaceutical or food computational context. It will introduce some emerging technologies to cope with big data: natural language processing for text-mining of the scientific literature, data integration from heterogeneous sources, biomarker identification, pathway analysis and, eventually, modeling and simulation for in silico experiments. The knowledge will be delivered through practical examples and projects to be developed during the lab. Artificial intelligence, programming, databases, statistics and computational mathematics will be revisited through the practical solution of biomedical problems.
        Syllabus:
        - Computational biology, bioinformatics, medical informatics, computational health
        - Public domain knowledge: publicly available resources, text-mining, DB mining
        - Data integration: *-omics levels, structured and unstructured public and proprietary data, constraints, quality check
        - Biomarker identification: stratification of patients, diagnostic tools, prognostic tools
        - Functional analysis: pathway and network biology, complexity reduction, module identification
        - Dynamic modeling: modeling technologies, simulation algorithms, hybrid strategies
        Each item above will be introduced through real industrial case studies.
Second year

  • Thesis (24 cfu)


  • Free choice (9 cfu)

    • Free choice exam to be approved by the Academic Board
  • 18 CFU to be chosen from group BDT-2: related 9-CFU courses in the second year

    • Related 9-CFU courses of the KD curriculum offered in the second year
    • Machine learning (9 cfu)

      • We introduce the principles and the critical analysis of the main paradigms for learning from data, together with their applications. The course provides the machine learning basis both for building new adaptive intelligent systems and for building powerful predictive models for intelligent data analysis.
        - Computational learning tasks for prediction, learning as function approximation, the concept of generalization
        - Linear models and nearest neighbors (learning algorithms and properties, regularization)
        - Neural networks (MLP and deep models, SOM)
        - Probabilistic graphical models
        - Principles of learning processes: elements of statistical learning theory, model validation
        - Support Vector Machines and kernel-based models
        - Introduction to applications and advanced models
        Applicative project: implementation and use of ML/NN models, with emphasis on the rigorous application of validation techniques.
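The model-validation theme above can be sketched with a hold-out evaluation of a toy k-nearest-neighbors classifier (illustrative only; the dataset shape, split fraction and seed are assumptions made for the example):

```python
import random

def knn_predict(train, x, k=1):
    """Predict the label of x by majority vote among the k nearest training points."""
    neighbors = sorted(train,
                       key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], x)))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)

def holdout_accuracy(data, test_fraction=0.25, seed=0):
    """Shuffle, split into train/test sets, return test accuracy of 1-NN."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    n_test = max(1, int(len(data) * test_fraction))
    test, train = data[:n_test], data[n_test:]
    hits = sum(knn_predict(train, x) == y for x, y in test)
    return hits / len(test)
```

Measuring accuracy only on held-out points, never on the training set, is the basic discipline behind the "rigorous application of validation techniques" the project requires.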
    • Mobile and cyber-physical systems (9 cfu)

      • The course covers mobile and cyber-physical systems by providing an overview of issues, solutions, architectures, technologies and standards. It offers the students an overall, coherent view of the organization of Internet of Things (IoT) systems, from the networking and sensing levels up to the applications. Specifically, it shows how mobile, heterogeneous elements (from low-end sensors to high-end devices) form pervasive networks integrated in the internet, and how they interact among themselves and with the surrounding physical world. The course is organized in three parts. The first part (3 CFU) introduces the principles of wireless communications and network architectures for mobility management. The second part (4 CFU) presents the foundations of signal processing and sensing and discusses the applications of sensor networks. The third part (2 CFU) provides an overview of the main standards and platforms of the IoT.
        - Foundations of wireless technologies and mobility management (3 CFU): 5G mobile, ad hoc networks, mobile social networks, IEEE 802.x standards
        - Cyber-physical systems (4 CFU): foundations of signal processing, wireless sensor networks, energy harvesting, localization, elements of embedded programming
        - Internet of Things (2 CFU): ZigBee, Bluetooth, sensor network gateways, IoT platforms & standards (OneM2M, FIWARE, CoAP, MQTT)
    • Human language technologies (9 cfu)

      • The course presents the principles, models and state-of-the-art techniques for the analysis of natural language, focusing mainly on statistical machine learning approaches and on Deep Learning in particular. Students will learn how to apply these techniques in a wide range of applications using modern programming libraries.
        - Formal and statistical approaches to NLP
        - Statistical methods: language models, Hidden Markov Models, the Viterbi algorithm, generative vs discriminative models
        - Linguistic essentials (tokenization, morphology, PoS, collocations, etc.)
        - Parsing (constituency and dependency parsing)
        - Processing pipelines
        - Lexical semantics: corpora, thesauri, gazetteers
        - Distributional semantics: word embeddings, character embeddings
        - Deep learning for natural language
        - Applications: entity recognition, entity linking, classification, summarization
        - Opinion mining, sentiment analysis
        - Question answering, language inference, dialogic interfaces
        - Statistical machine translation
        - NLP libraries: NLTK, Theano, TensorFlow
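The language-model item above can be illustrated with a maximum-likelihood bigram model in plain Python (a minimal sketch; the course itself works with libraries such as NLTK):

```python
from collections import Counter

def train_bigram(corpus):
    """Count unigram contexts and bigrams over sentences padded with <s>/</s>."""
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
        unigrams.update(tokens[:-1])            # every token that can serve as a context
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, prev, word):
    """Maximum-likelihood estimate of P(word | prev); 0.0 for unseen contexts."""
    if unigrams[prev] == 0:
        return 0.0
    return bigrams[(prev, word)] / unigrams[prev]
```

The MLE estimate assigns probability zero to unseen bigrams; smoothing techniques (a standard topic in statistical NLP) exist precisely to fix that, and are omitted here for brevity.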
    • ICT risk assessment (9 cfu)

      • At the end of this course, the student should be able to discover and analyze the weaknesses and the vulnerabilities of a system in order to evaluate, in a quantitative and formal way, the risk it poses. The student should be able to select and deploy a cost-effective set of countermeasures at the various implementation levels to improve the overall ability of the system to withstand its attackers. The focus of the course is on a predictive approach where risk assessment and management is a step in the system design. The student should also know the various tools that can support the assessment and simplify both the assessment and the selection of countermeasures. In this framework, the focus on cloud computing makes it possible to cover the most complex assessments.
        • Risk Assessment and Management of ICT Systems 3 CFU
          o Vulnerabilities/Attacks 1 CFU
          o Countermeasures 1 CFU
          o Tools for Automating Assessment & Management 1 CFU
        • Security of Cloud Computing 6 CFU
          o Economic Reasons/Deployment Models/Service Models 1 CFU
          o Virtualization and TCM 1 CFU
          o New Vulnerabilities 1 CFU
          o New Attacks 1 CFU
          o New Countermeasures 1 CFU
          o Certification of Cloud Provider 1 CFU
  • 6 cfu to be chosen from the BDT-2 group of related 6-cfu courses in the second year

    • Related 6-cfu courses in the KD curriculum activated in the second year
    • Peer to peer systems and blockchains (6 cfu)

      • Objectives - Introduction to the basic technologies for the development of highly distributed systems and of some real scenarios exploiting them. Presentation of the disruptive technology of blockchains and its numerous applications to different fields.
        P2P Topologies (2 CFU)
        - Peer to Peer (P2P) systems: general concepts (1/2 CFU)
        - Unstructured Overlays: Flooding, Random Walks, Epidemic Diffusion (1/2 CFU)
        - Structured Overlays: Distributed Hash Tables (DHT), Routing on a DHT (1/2 CFU)
        - Case Studies: Bittorrent as a Content Distribution Network, KAD implementation of the Kademlia DHT, game-based cooperation (1/2 CFU)
        Complex Networks for the analysis of P2P systems (2 CFU)
        - Network models (1 CFU)
        - Case Studies: Darknet Freenet (1 CFU)
        Cryptocurrencies and Blockchains (5 CFU)
        - Basic concepts (1 CFU)
        - The Bitcoin protocol (2 CFU)
        - Bitcoin extensions/alternatives (1/2 CFU): overview of altcoins, sidechains, the Stellar Consensus Protocol
        - Further applications of blockchains (1 CFU): Ethereum and programming smart contracts; Blockchain 1.0: cryptocurrencies; Blockchain 2.0: financial instruments built on cryptocurrencies; Blockchain 3.0: applications beyond cryptocurrencies (DNS, lotteries, voting, IoT...)
        - Legal aspects of cryptocurrencies (1/2 CFU)
    • Computational models for complex systems (6 cfu)

      • The objective of this course is to train experts in systems modelling and analysis methodologies. Of course, this will require understanding, to some degree of detail, the mathematical and computational techniques involved. However, this will be done with the aim of shaping good modellers, who know the advantages/disadvantages/risks of the different modelling and analysis methodologies, who are aware of what happens under the hood of a modelling and analysis tool, and who can develop their own tools if needed. The course will focus on advanced modelling approaches that combine different paradigms and analysis techniques: from ODEs to stochastic models, from simulation to model checking. Case studies from population dynamics, biochemistry, epidemiology, economy and social sciences will be analysed. Moreover, synergistic approaches that combine computational modelling techniques with data-driven methodologies will be outlined.
        - Modelling with ODEs: examples
        - (Timed and) Hybrid Automata: definition and simulation techniques
        - Stochastic simulation methods (Gillespie’s algorithm and its variants)
        - Hybrid simulation methods (stochastic/ODEs)
        - Rule-based modelling
        - Probabilistic/stochastic model checking: principles, applicability and tools
        - Statistical model checking
        - Process mining (basic notions)
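      The stochastic simulation methods in the syllabus can be sketched with Gillespie's direct method for a single degradation reaction X -> 0 (the rate constant and initial count below are illustrative):

```python
import math
import random

def gillespie_decay(x0, k, t_end, seed=42):
    """Gillespie's direct method for the single reaction X -> 0 at rate k*X."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while x > 0 and t < t_end:
        a = k * x                               # total propensity
        t += -math.log(1.0 - rng.random()) / a  # exponential waiting time
        x -= 1                                  # fire the only reaction
        trajectory.append((t, x))
    return trajectory

traj = gillespie_decay(x0=100, k=0.5, t_end=10.0)
print(f"{len(traj)} events, final count {traj[-1][1]}")
```

With several reactions, the direct method additionally samples which reaction fires, proportionally to its propensity; the single-reaction case above keeps the waiting-time logic visible.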
    • Social and ethical issues in information technology (6 cfu)

      • The progress in AI research makes it timely to focus not only on making AI more capable, but also on maximizing the societal benefit of AI [from Research Priorities for Robust and Beneficial Artificial Intelligence, an open letter]. This concern is by necessity interdisciplinary, because it involves both society and AI. It ranges from economics, law and philosophy to computer security, formal methods and, of course, various branches of AI itself. The course will be organized as a series of seminars on different hot topics. Contents may include:
        • Philosophical implications of AI
          o Technological limitations to AI
          o Biological limitations to human intelligence
        • Economic impact of emerging technologies
          o The disappearance of intellectual jobs
          o The rise of the corporate colossus
          o The power of data in the hands of big companies and auto-regulations (Partnership on AI)
        • Legal and ethical issues
          o Privacy, transparency, biases and fairness, data lock-in
          o Autonomous vehicles and weapons
          o Machines for human care and threats to human dignity: customer care, robot companions, ...
        • Trustworthiness of the technologies
          o Safety, robustness, and control
        • Future scenarios
          o Superintelligence implications
    • ICT infrastructures (6 cfu)

      • The goal of the course is to introduce students to the computing infrastructures powering cloud services. At the end of the course a student should be able to understand the general organization of a datacenter and the logical infrastructure that powers virtualization and containers. The course starts from physical infrastructures such as power and datacenter organization. The network fabric is introduced, with particular focus on SDN techniques used to balance East-West and North-South traffic. Storage and compute are then introduced with special attention to hyperconverged systems.
        - Physical infrastructures (datacenters, energy and PUE, SCADAs) (1 CFU)
        - Networking (SDN and overlay, fabrics (RDMA, OPA, InfiniBand), monitoring techniques) (2 CFU)
        - Storage (SDS) (1 CFU)
        - Computing (hypervisor) (2 CFU)
    • Big Data Analytics (6 cfu)

      • In our digital society, every human activity is mediated by information technologies. Therefore, every activity leaves digital traces behind that can be stored in some repository: phone call records, transaction records, web search logs, movement trajectories, social media texts and tweets, … Every minute, an avalanche of “big data” is produced by humans, consciously or not, that represents a novel, accurate digital proxy of social activities at a global scale. Big data provide an unprecedented “social microscope”, a novel opportunity to understand the complexity of our societies, and a paradigm shift for the social sciences. This course is an introduction to the emergent field of big data analytics and social mining, aimed at acquiring and analyzing big data from multiple sources for the purpose of discovering the patterns and models of human behavior that explain social phenomena. The focus is on what can be learnt from big data in different domains: mobility and transportation, urban planning, demographics, economics, social relationships, opinion and sentiment, etc.; and on the analytical and mining methods that can be used. An introduction to scalable analytics is also given, using the “map-reduce” paradigm.
        1. Big data sources: open (linked) data, Web activity data, social network data, social media data, mobile phone data, navigation GPS data, commercial transaction data, tourism-related data, crowdsourcing/crowdsensing.
        2. Big data analytics and social mining methods: data preprocessing, exploratory data analysis, correlation analysis, feature selection, semantic enrichment, pattern discovery, classification and prediction, clustering and segmentation for:
          - the discovery of individual social profiles
          - the analysis of collective behavior
          - the discovery of emotional content of text and sentiment analysis
        3. Big data analytics domains
          - Mobility and transportation
          - Nowcasting of socio-economic indicators of progress, happiness, etc.
          - Twitterology and nowcasting of social mood and trends
          - Tourism
        4. Ethical issues of big data analytics
          - Privacy and personal data protection
          - Privacy-preserving analytics
          - Social responsibility of data scientists
        5. Scalable data analytics
          - Paradigms of NO-SQL databases
          - Data analysis processes with the “map-reduce” paradigm
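      The “map-reduce” paradigm mentioned above can be illustrated with a word count written as explicit map, shuffle and reduce phases (a single-machine sketch, not a distributed framework):

```python
from itertools import groupby
from operator import itemgetter

def map_phase(doc):
    # emit an intermediate (word, 1) pair for every word in the document
    return [(w, 1) for w in doc.split()]

def reduce_phase(word, counts):
    # combine all counts emitted for the same word
    return (word, sum(counts))

docs = ["big data big models", "data mining"]
# shuffle: group all intermediate pairs by key (sorting simulates it locally)
pairs = sorted(p for doc in docs for p in map_phase(doc))
result = dict(
    reduce_phase(word, [c for _, c in group])
    for word, group in groupby(pairs, key=itemgetter(0))
)
print(result)  # → {'big': 2, 'data': 2, 'mining': 1, 'models': 1}
```

In a real framework the shuffle is performed by the runtime across machines; the programmer supplies only the two functions above.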
    • Scientific and large data visualization (6 cfu)

      • The availability of data generated from sensors, mobile devices, social networks, and so on has grown continuously in recent years. Visualization is what one needs to put data to good use: it allows one to analyze, explore and communicate possibly large and complex data in a meaningful way. The first part of the course will deal with scientific visualization, which concerns the graphical illustration of scientific data (for example, biological data) to understand and glean insights into the underlying phenomena. The second part of the course will introduce the fundamentals of information visualization. Unlike scientific visualization, where data have an immediate physical representation, information visualization often deals with abstract data, which do not have a direct visual representation, like the network of people connections on a social network. We will learn to decide what to visualize, how to abstract and encode it using different graph types, and how to evaluate different solutions according to visual perception rules. Fundamentals of scientific and data visualization. Visual perception. Best practices in data visualization. Visualization techniques for both scientific phenomena and abstract data. Visualization libraries. By the end of the course, the students will be able to:
        - illustrate and communicate data and results using visualization, also for complex and large datasets
        - use existing libraries and software tools for visualization purposes (e.g. seaborn, D3.js)
        Syllabus
        1. Introduction: differences between scientific visualization, data visualization, interactive visualization, visual analytics and infographics.
        2. Scientific Visualization
          a. 3D data visualization
          b. Spatial data structures and indexing
          c. Flow visualization
          d. Paraview tool
          e. Topological Data Analysis for scientific visualization
          f. TTK - Topology Toolkit
        3. Data Visualization Pipeline
          a. Data and attribute types
          b. Data preprocessing
          c. Graph and chart types
          d. Encoding and decoding processes
          e. Evaluation of visualizations
        4. Visual Perception
          a. Fundamentals
          b. Gestalt laws
          c. Preattentive processes
          d. Color
        5. Time Series
        6. Animated Charts
        7. Graph Drawing
          a. Trees
          b. Small graphs
          c. Large graphs
        8. Multi-dimensional Data Visualization
          a. Multi-dimensional glyphs
          b. Dimensionality reduction techniques
          c. Ordering/sorting
          d. Dataset summarization
        9. Machine Learning and Data Visualization
        10. Python for Data Science and Visualization
          a. Intro, NumPy, Pandas
          b. Python’s visualization ecosystem (Matplotlib, Plotly, ...)
        11. D3.js
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools will be provided to help the understanding of the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science, engineering).
        Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games together with the main algorithms for their analysis. The theoretical analysis will be paired with applications to problems from a wide range of different areas. Applications will be chosen upon the students’ interests (e.g., ranging from computational economics and social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning).
        Skills: The course aims at providing students with a suitable background to
        • formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers
        • understand the inner mechanisms of competition and cooperation
        • understand the inner mechanisms of threats and promises
        • forecast the behaviour of agents
        • design mechanisms to steer systems towards desired objectives through adequate mathematical models
        Syllabus:
        • Noncooperative games
        • Auctions and bargaining
        • Cooperative games
        • Game theory in practice
        • Applications (computer science, economic computation et al.)
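      To make the noncooperative notions concrete, a small sketch that finds the pure-strategy Nash equilibria of a bimatrix game by checking mutual best responses, applied to the textbook Prisoner's Dilemma:

```python
def pure_nash(payoff_a, payoff_b):
    """All pure-strategy Nash equilibria (i, j) of a two-player bimatrix game."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(rows):
        for j in range(cols):
            # i must be a best response to j, and j a best response to i
            best_a = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            best_b = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
            if best_a and best_b:
                equilibria.append((i, j))
    return equilibria

# Prisoner's Dilemma: strategy 0 = cooperate, 1 = defect
A = [[-1, -3], [0, -2]]   # row player's payoffs
B = [[-1, 0], [-3, -2]]   # column player's payoffs
print(pure_nash(A, B))    # → [(1, 1)]: mutual defection is the unique equilibrium
```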
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce, and preliminarily train, Master students in Computer Science to the entrepreneurial mindset. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students who attend the course do not need to have a startup idea, but they will participate in a “startup building process”, meaning that they will learn and practice all the main steps that shape a (possibly simple) “ICT technical idea” into a viable product, be it the one of a startup or of a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course will hinge on frontal lectures on the basic principles and methodologies underlying innovation, combined with a learning-by-doing experience.
        Skills: Entrepreneurial mindset, pills of innovation methodologies and IP protection, how to make the value of an ICT idea.
        Syllabus:
        • What a company is and what its purpose is
        • Pills of IP protection
        • ICT company structure and roles
        • B2B vs B2C
        • Value proposition design for ICT
        • ICT team management
        • From idea to startup, the journey
        • How to found a startup in less than 10 hours
        • Fundraising and spending: why you need the money and how you should spend it
    • 3D Geometric Modeling & Processing (6 cfu)

      • In this course, we plan to study the fundamental algorithms, data structures, and mathematics behind the current approaches for manipulating and processing geometric data in a variety of real-world applications, like computer aided design, interactive computer graphics, reliable physical simulations, and robust 3D representations for machine learning. The course will present the data structures for simplicial complexes and the current discrete representations used to manage 3D shapes in common applications. The course will also introduce the basic notions of differential geometry and topology that can be useful for a better comprehension of algorithms in Computer Graphics. The most common mesh processing algorithms will be explained with their practical applications and available implementations. The purpose of the course is to illustrate the most critical mathematical, geometric and algorithmic foundations for representing and processing 3D shapes in computer graphics.
        Syllabus:
        1. Basics of Differential Geometry and Topology for Computer Graphics
        2. Discrete Representations and Data Structures for Simplicial Complexes and Spatial Indexing
        3. Mesh Processing Algorithms
          a. Remeshing, Refinement & Simplification
          b. Parametrization and Texturing
          c. Fairing and Smoothing
          d. Surface Reconstruction and Sampling
        4. Shape Analysis and Representations for Machine Learning
        Prerequisites: Knowledge of linear algebra and calculus.
        Advisement recommendations: C++, Python
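      As a tiny taste of the discrete topology notions above, the Euler characteristic V - E + F of a triangle mesh can be computed directly from its face list (the tetrahedron below is illustrative):

```python
def euler_characteristic(faces):
    """V - E + F for a triangle mesh given as vertex-index triples."""
    vertices = {v for f in faces for v in f}
    # undirected edges: each face contributes its three sides
    edges = {frozenset(e) for f in faces
             for e in ((f[0], f[1]), (f[1], f[2]), (f[2], f[0]))}
    return len(vertices) - len(edges) + len(faces)

# Boundary of a tetrahedron: 4 vertices, 6 edges, 4 faces
tet = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(euler_characteristic(tet))  # → 2, as for any closed sphere-like surface
```

The invariant is useful as a cheap sanity check after remeshing or simplification: a watertight genus-0 mesh must keep the value 2.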
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction and a practical experience with the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that will allow them to implement quantum algorithms on actual cloud-based physical quantum computers.
        Skills: The course aims at providing students with a suitable background to understand the new quantum computing reasoning, and to design/analyze quantum algorithms for various application fields. The algorithms will be run on both simulators and real prototypes of quantum machines.
        Prerequisites: Linear algebra, basic concepts of Numerical Analysis, Theory of Algorithms
        Syllabus:
        • Fundamental Concepts
          - Basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac’s bracket notation)
          - Qubits, quantum gates, and circuits
          - Superposition and entanglement
        • Fundamental Algorithms
          - Teleportation
          - Grover’s quantum search algorithm
          - Quantum Fourier Transform
          - Shor’s integer factorization algorithm
        • Recent Applications
          - Quantum data preparation and QRAM
          - Examples of Quantum Machine Learning algorithms
          - Quantum Random Walk
          - Quantum Page Rank
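      A minimal sketch of the superposition concept in the syllabus: applying the Hadamard gate to the state |0> in plain Python, with no quantum SDK assumed:

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a single-qubit state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: maps |0> to the uniform superposition (|0> + |1>)/sqrt(2)
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

ket0 = [1.0, 0.0]              # |0> in the computational basis
plus = apply_gate(H, ket0)     # the |+> state
probs = [abs(a) ** 2 for a in plus]
print(probs)  # measurement yields 0 or 1 with probability 1/2 each
```

On a real device or simulator the same circuit would be one `H` gate followed by a measurement; the amplitudes-squared rule above is what the measurement statistics reflect.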
    • Computational Health Laboratory (6 cfu)

      • The purpose of this laboratory course is to introduce computer science students to the applicative domain of computational health. Industrial-scale applications will be handled with the tools acquired by the students in a 5-year course. Pharmaceutical, food and biotech industries are increasingly becoming computationally driven, and skills to address these challenges are missing. The lab will teach the students a language to interact with medical doctors or scientists in the reference industries -- a prerequisite for practitioners and scientists in the field. The lab will teach, with hands-on experience, emerging technologies that are essential in this applicative domain and will help students navigate the plethora of available methods and technologies to select the most suitable for each problem. Besides the technological aspects of the lab, we will also make clear connections with social and ethical aspects of computational health and how students with these skills can have an impact in the world.
        Knowledge: The course will quickly present the working language to address biological and medical concepts that one needs to understand for working in a biomedical, pharmaceutical or food computational context. This course will introduce some emerging technologies to cope with big data: natural language processing for text-mining of scientific literature, data integration from heterogeneous sources, biomarker identification, pathway analysis and eventually modeling and simulation for in silico experiments. The knowledge will be delivered through practical examples and projects to be developed during the lab. Artificial intelligence, programming, databases, statistics and computational mathematics will be revisited through the practical solution of biomedical problems.
        Syllabus:
        - Computational biology, bioinformatics, medical informatics, computational health.
        - Public domain knowledge: publicly available resources, text-mining, DB mining.
        - Data integration: *-omics levels, structured and unstructured public and proprietary data, constraints, quality check.
        - Biomarker identification: stratification of patients, diagnostic tools, prognostic tools.
        - Functional analysis: pathway and network biology, complexity reduction, module identification.
        - Dynamic modeling: modeling technologies, simulation algorithms, hybrid strategies.
        - Each item above will be introduced through real industrial case studies.

  • ICT - ICT Solutions Architect

    First year

  • Peer to peer systems and blockchains (6 cfu)

    • Objectives - Introduction to the basic technologies for the development of highly distributed systems and of some real scenarios exploiting them. Presentation of the disruptive technology of blockchains and its numerous applications to different fields.
      P2P Topologies (2 CFU)
      - Peer to Peer (P2P) systems: general concepts (1/2 CFU)
      - Unstructured Overlays: Flooding, Random Walks, Epidemic Diffusion (1/2 CFU)
      - Structured Overlays: Distributed Hash Tables (DHT), Routing on a DHT (1/2 CFU)
      - Case Studies: Bittorrent as a Content Distribution Network: KAD implementation of the Kademlia DHT, game-based cooperation (1/2 CFU)
      Complex Network for the analysis of P2P systems (2 CFU)
      - Network models (1 CFU)
      - Case Studies: Darknet Freenet (1 CFU)
      Cryptocurrencies and Blockchains (5 CFU)
      - basic concepts (1 CFU)
      - the Bitcoin protocol (2 CFU)
      Bitcoin Extensions/alternatives (1/2 CFU)
      - overview of altcoins
      - sidechains
      - the Stellar Consensus Protocol
      Further applications of blockchains (1 CFU)
      - Ethereum: programming smart contracts
      - Blockchain 1.0: cryptocurrencies
      - Blockchain 2.0: financial instruments built on cryptocurrencies
      - Blockchain 3.0: applications beyond cryptocurrencies (DNS, lotteries, voting, IoT...)
      Legal aspects of cryptocurrencies (1/2 CFU)
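      The DHT routing listed above is based, in Kademlia, on the XOR metric between node identifiers; a minimal sketch of one lookup step (IDs shortened to small integers for illustration):

```python
def xor_distance(a, b):
    """Kademlia distance between two node/key IDs: bitwise XOR as an integer."""
    return a ^ b

def closest_nodes(key, nodes, k=2):
    """The k known nodes whose IDs are XOR-closest to `key` (one lookup step)."""
    return sorted(nodes, key=lambda n: xor_distance(n, key))[:k]

nodes = [0b0001, 0b0101, 0b1011, 0b1100]
# looking up key 0b0100: distances are 5, 1, 15, 8 respectively
print(closest_nodes(0b0100, nodes))  # → [5, 1]
```

A full Kademlia lookup iterates this step, querying the returned nodes for even closer contacts until no progress is made; the XOR metric guarantees convergence in O(log N) steps.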

  • Algorithm engineering (9 cfu)

    • Study, design and analyze advanced algorithms and data structures for the efficient solution of combinatorial problems involving all basic data types, such as integer sequences, strings, (geometric) points, trees and graphs. The design and analysis will involve several models of computation — such as RAM, 2-level memory, cache-oblivious, streaming — in order to take into account the architectural features of modern PCs and the availability of Big Data on which algorithms may work. We will complement this theoretical analysis with several engineering considerations arising from the implementation of the proposed algorithms and from experiments published in the literature.
      · Design of algorithms for massive datasets: disk aware or cache oblivious
      · Design of advanced data structures in hierarchical memories for atomic or string data
      · Data compression for structured and unstructured data
      · Algorithms for large graphs
      · Engineering considerations about the implementation of algorithms and data structures
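      In the streaming model mentioned above, reservoir sampling illustrates the flavour of such algorithms: it maintains a uniform sample of k items from a one-pass stream of unknown length in O(k) memory (a sketch, with a fixed seed for reproducibility):

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Uniform random sample of k items from an iterable seen exactly once."""
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)        # keep item with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(100_000), k=5)
print(sample)  # 5 items, each stream element equally likely to appear
```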

  • ICT infrastructures (6 cfu)

    • The goal of the course is to introduce students to the computing infrastructures powering cloud services. At the end of the course a student should be able to understand the general organization of a datacenter and the logical infrastructure that powers virtualization and containers. The course starts from physical infrastructures such as power and datacenter organization. The network fabric is introduced, with particular focus on SDN techniques used to balance East-West and North-South traffic. Storage and compute are then introduced with special attention to hyperconverged systems.
      - Physical infrastructures (datacenters, energy and PUE, SCADAs) (1 CFU)
      - Networking (SDN and overlay, fabrics (RDMA, OPA, InfiniBand), monitoring techniques) (2 CFU)
      - Storage (SDS) (1 CFU)
      - Computing (hypervisor) (2 CFU)
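      The PUE metric from the physical-infrastructure part is simply total facility energy divided by IT-equipment energy; the figures below are invented for illustration:

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: 1.0 is ideal; real datacenters are higher
    because cooling, power distribution and lighting also draw energy."""
    return total_facility_kwh / it_equipment_kwh

# e.g. 1500 kWh drawn by the whole site for every 1000 kWh consumed by servers
print(pue(1500, 1000))  # → 1.5
```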

  • ICT risk assessment (9 cfu)

    • At the end of this course, the student should be able to discover and analyze the weaknesses and the vulnerabilities of a system in order to evaluate, in a quantitative and formal way, the risk it poses. The student should be able to select and deploy a cost-effective set of countermeasures at the various implementation levels to improve the overall ability of the system to withstand its attackers. The focus of the course is on a predictive approach where risk assessment and management is a step in the system design. The student should also know the various tools that can support the assessment and simplify both the assessment and the selection of countermeasures. In this framework, the focus on cloud computing makes it possible to cover the most complex assessments.
      • Risk Assessment and Management of ICT Systems 3 CFU
      o Vulnerabilities/Attacks 1 CFU
      o Countermeasures 1 CFU
      o Tools for Automating Assessment & Management 1 CFU
      • Security of Cloud Computing 6 CFU
      o Economic Reasons/Deployment Models/ Service Models 1 CFU
      o Virtualization and TCM 1 CFU
      o New Vulnerabilities 1 CFU
      o New Attacks 1 CFU
      o New Countermeasures 1 CFU
      o Certification of Cloud Provider 1 CFU
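      One common way to make quantitative risk assessment concrete (not necessarily the formulation used in the course) is the annualized loss expectancy ALE = SLE × ARO; a sketch for weighing a countermeasure, with all figures invented:

```python
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Annualized Loss Expectancy for one threat: expected yearly loss."""
    return single_loss_expectancy * annual_rate_of_occurrence

def countermeasure_value(ale_before, ale_after, annual_cost):
    """Net yearly benefit of a countermeasure; deploy it only if positive."""
    return ale_before - ale_after - annual_cost

before = ale(50_000, 0.4)    # 0.4 incidents/year, 50k loss each → 20000.0
after = ale(50_000, 0.05)    # countermeasure cuts the rate → 2500.0
print(countermeasure_value(before, after, annual_cost=5_000))  # → 12500.0
```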

  • Mobile and cyber-physical systems (9 cfu)

    • The course covers mobile and cyber-physical systems by providing an overview of issues, solutions, architectures, technologies and standards. It offers students an overall, coherent view of the organization of internet of things (IoT) systems, from the networking and sensing levels to the applications. Specifically, it shows how mobile, heterogeneous elements (from low-end sensors to high-end devices) form pervasive networks integrated in the internet and how they interact among themselves and with the surrounding physical world. The course is organized in three parts. The first part (3 CFU) introduces the principles of wireless communications and network architectures for mobility management. The second part (4 CFU) presents the foundations of signal processing and sensing and discusses the applications of sensor networks. The third part (2 CFU) provides an overview of the main standards and platforms of the IoT.
      • Foundations of wireless technologies and mobility management (3 CFU)
      5G mobile, ad hoc networks, mobile social networks, IEEE 802.x standards
      • Cyber-physical systems (4 CFU)
      Foundations of signal processing, wireless sensor networks, energy harvesting, localization, elements of embedded programming
      • Internet of Things (2 CFU)
      ZigBee, Bluetooth, sensor network gateways, IoT platforms & standards (OneM2M, FIWARE, COAP, MQTT)
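      As a minimal taste of the signal-processing foundations covered in the second part, a moving-average filter smoothing noisy sensor readings (the sample values are invented for illustration):

```python
def moving_average(samples, window=3):
    """Smooth a sensor signal by averaging each run of `window` samples."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

readings = [20.0, 20.4, 25.1, 20.2, 19.9, 20.3]  # one noisy spike at index 2
print(moving_average(readings))  # each output averages 3 consecutive readings
```

On a constrained node the same filter is typically kept as a running sum over a circular buffer, so each new sample costs O(1) instead of re-summing the window.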

  • Advanced programming (9 cfu)

    • The objectives of this course are:
      to provide the students with a deep understanding of how high level programming concepts and metaphors map into executable systems and which are their costs and limitations
      to acquaint the students with modern principles, techniques, and best practices of sophisticated software construction
      to introduce the students to techniques of programming at higher abstraction levels, in particular generative programming, component programming and web computing
      to present state-of-the-art frameworks incorporating these techniques.
      This course focuses on the quality issues pertaining to detailed design and coding, such as reliability, performance, adaptability and integrability into larger systems.
      -Programming Language Pragmatics
      -Run Time Support and Execution Environments
      -Generic Programming
      -Class Libraries and Frameworks
      -Generative Programming
      -Language Interoperability
      -Component Based Programming
      -Web Services
      -Web and Application Frameworks
      -Scripting Languages


  • Advanced software engineering (9 cfu)

    • Objectives – The objective of the course is to introduce some of the main aspects of the design, analysis, development and deployment of modern software systems. Service-based and cloud-based systems are taken as references to present design, analysis and deployment techniques. DevOps practices are discussed, and in particular containerization is introduced. The course includes a "hands-on" lab where students will experiment weekly with the design, analysis, development and deployment techniques introduced.
      • Service-based software engineering (3 CFU)
      - core interoperability standards
      - software design by service composition, microservice architecture, examples of design patterns
      - business process modelling and analysis
      - service descriptions and service level agreements
      • DevOps practices (1.5 CFU)
      - DevOps toolchain, continuous delivery
      - Docker and containerization
      • Cloud-based software engineering (1.5 CFU)
      - service and deployment models
      - cross-cloud deployment and management of applications
      • Hands-on laboratory (3 CFU)

  • 6 cfu to be chosen from the ICT-1 group of related 6-cfu courses in the first year

    • Related 6-cfu courses in the ICT curriculum activated in the first year
    • Information Retrieval (6 cfu)

      • In this course we will study, design and analyze (theoretically and experimentally) software tools for IR applications dealing with unstructured (raw) data, structured (DB-centric) data or semi-structured data (i.e. HTML, XML). We will mainly concentrate on the basic components of a modern Web search engine, by examining in detail the algorithmic solutions currently adopted to implement its main software modules. We will also discuss their performance and/or computational limitations, as well as introduce measures for evaluating their efficiency and efficacy. Finally, we will survey some algorithmic techniques which are frequently adopted in the design of IR tools managing large datasets.
        - Search engines
        - Crawling, text analysis, indexing, ranking
        - Storage of Web pages and the (hyper-)link graph
        - Results processing and visualization
        - Other data types: XML, textual DBs
        - Data processing for IR tools
        - Data streaming
        - Data sketching
        - Data compression
        - Data clustering (sketch)
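      A tiny sketch of the indexing module mentioned above: an in-memory inverted index with an AND query answered by intersecting postings lists (the toy documents are invented):

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the sorted list of doc ids containing it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in text.lower().split():
            index[term].add(doc_id)
    return {term: sorted(ids) for term, ids in index.items()}

def and_query(index, *terms):
    """Documents containing all terms: intersect their postings lists."""
    postings = [set(index.get(t, ())) for t in terms]
    return sorted(set.intersection(*postings)) if postings else []

docs = ["web search engines", "web crawling", "ranking web search results"]
idx = build_index(docs)
print(and_query(idx, "web", "search"))  # → [0, 2]
```

Production engines keep postings sorted and compressed on disk and intersect them with skip pointers; the set intersection above captures only the logical core.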
    • Scientific and large data visualization (6 cfu)

      • The availability of data generated from sensors, mobile devices, social networks, and so on has grown continuously in recent years. Visualization is what one needs to put data to good use: it allows one to analyze, explore and communicate possibly large and complex data in a meaningful way. The first part of the course will deal with scientific visualization, which concerns the graphical illustration of scientific data (for example, biological data) to understand and glean insights into the underlying phenomena. The second part of the course will introduce the fundamentals of information visualization. Unlike scientific visualization, where data have an immediate physical representation, information visualization often deals with abstract data, which do not have a direct visual representation, like the network of people connections on a social network. We will learn to decide what to visualize, how to abstract and encode it using different graph types, and how to evaluate different solutions according to visual perception rules. Fundamentals of scientific and data visualization. Visual perception. Best practices in data visualization. Visualization techniques for both scientific phenomena and abstract data. Visualization libraries. By the end of the course, the students will be able to:
        - illustrate and communicate data and results using visualization, also for complex and large datasets
        - use existing libraries and software tools for visualization purposes (e.g. seaborn, D3.js)
        Syllabus
        1. Introduction: differences between scientific visualization, data visualization, interactive visualization, visual analytics and infographics.
        2. Scientific Visualization
          a. 3D data visualization
          b. Spatial data structures and indexing
          c. Flow visualization
          d. Paraview tool
          e. Topological Data Analysis for scientific visualization
          f. TTK - Topology Toolkit
        3. Data Visualization Pipeline
          a. Data and attribute types
          b. Data preprocessing
          c. Graph and chart types
          d. Encoding and decoding processes
          e. Evaluation of visualizations
        4. Visual Perception
          a. Fundamentals
          b. Gestalt laws
          c. Preattentive processes
          d. Color
        5. Time Series
        6. Animated Charts
        7. Graph Drawing
          a. Trees
          b. Small graphs
          c. Large graphs
        8. Multi-dimensional Data Visualization
          a. Multi-dimensional glyphs
          b. Dimensionality reduction techniques
          c. Ordering/sorting
          d. Dataset summarization
        9. Machine Learning and Data Visualization
        10. Python for Data Science and Visualization
          a. Intro, NumPy, Pandas
          b. Python’s visualization ecosystem (Matplotlib, Plotly, ...)
        11. D3.js
    • Business Process Modeling (6 cfu)

      • The course illustrates the main concepts and issues involved in process management, where processes are understood as workflows built by composing atomic activities, and provides an overview of the languages, conceptual models, and automatic tools based on them that can be used to address these issues adequately. The learning path will lead students to become familiar with the technical terms of the area, with the different models for structuring and composing processes rigorously, with the logical properties these models may be required to satisfy, and with techniques for process analysis and verification. Students will also be able to experiment with the concepts through automatic tools for designing and analysing processes. Syllabus - Introduction to process management issues. - Terminology (business process, business process management, business process management system, business process model, process orchestration, business process lifecycle, workflow) and classification (orchestration vs choreography, automation, structuring). - Overview of the evolution of business process management systems. - Process modelling. - Conceptual models and abstraction levels. - Functional decomposition and modularisation. - Process orchestration. - Properties of interest in the design, analysis, and verification of workflow-based processes. - Orchestration patterns (sequence, parallel split, exclusive split, and-join, exclusive join) and structured workflows. - Rigorous models for workflows: Petri nets and workflow nets. - Automatic tools for workflow design and analysis. Hands-on experimentation in a workflow process design environment, with automatic tools for designing and analysing workflow processes.
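To make the workflow-net idea in the syllabus concrete, here is a minimal sketch (not course material; the two-task net and all place/transition names are invented for illustration) of token firing in a Petri net:

```python
# Minimal Petri-net firing sketch: places hold tokens, and a transition
# fires when every input place holds a token, consuming and producing tokens.
# The net below models the workflow  start -> task A -> task B -> end.

def enabled(marking, transition):
    """A transition is enabled if all its input places hold a token."""
    inputs, _ = transition
    return all(marking.get(p, 0) > 0 for p in inputs)

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    inputs, outputs = transition
    new = dict(marking)
    for p in inputs:
        new[p] -= 1
    for p in outputs:
        new[p] = new.get(p, 0) + 1
    return new

# A workflow net has one source place ("start") and one sink place ("end").
t_a = (["start"], ["mid"])   # task A
t_b = (["mid"], ["end"])     # task B

m = {"start": 1}             # initial marking: one case to process
for t in (t_a, t_b):
    assert enabled(m, t)
    m = fire(m, t)

print(m)  # the case has completed: one token on "end"
```

Soundness of a workflow net (every case started eventually reaches "end" with no tokens left behind) can be checked by exploring exactly these firings, which is what the automatic analysis tools mentioned above do at scale.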
    • Wireless Networks of Embedded Systems (6 cfu)

      • Objectives: The course presents the fundamental principles of wireless sensor networks (WSN), with a specific focus on micro-kernels, IP programming, and service-oriented architectures. An Internet of Things approach will be adopted, also discussing technological aspects related to real-time programming. Syllabus: • Introduction to WSN technology and its applications • Device architectures and components • Low-level coding and operating systems • Wireless communication techniques at the PHY and MAC layers • Characterisation of IP protocols • Basic programming for embedded systems • Application cases • Sensor network abstractions (db-like) • Case studies: Embedded Vision, Multinode Data Aggregation • Real-world applications, in particular in the field of Intelligent Transportation.
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools will be provided to help the understanding of the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science, engineering). Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games together with the main algorithms for their analysis. The theoretical analysis will be paired with applications to problems from a wide range of different areas, chosen based on the students' interests (e.g., ranging from computational economics and social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning). Skills: The course aims at providing students with a suitable background to • formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers • understand the inner mechanisms of competition and cooperation • understand the inner mechanisms of threats and promises • forecast the behaviour of agents • design mechanisms to steer systems towards desired objectives through adequate mathematical models. Syllabus: • Noncooperative games • Auctions and bargaining • Cooperative games • Game theory in practice • Applications (computer science, economic computation, etc.)
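As a small illustration of noncooperative game analysis (an example sketch, not part of the syllabus), the following finds the pure-strategy Nash equilibria of the textbook Prisoner's Dilemma by checking best responses:

```python
# Find pure-strategy Nash equilibria of a 2-player normal-form game by
# checking that neither player can gain by unilateral deviation.
# Payoffs are the classic Prisoner's Dilemma (C = cooperate, D = defect).

payoffs = {  # (row strategy, col strategy) -> (row payoff, col payoff)
    ("C", "C"): (-1, -1),
    ("C", "D"): (-3, 0),
    ("D", "C"): (0, -3),
    ("D", "D"): (-2, -2),
}
strategies = ["C", "D"]

def is_nash(r, c):
    """(r, c) is a Nash equilibrium if no unilateral deviation improves a payoff."""
    row_ok = all(payoffs[(r2, c)][0] <= payoffs[(r, c)][0] for r2 in strategies)
    col_ok = all(payoffs[(r, c2)][1] <= payoffs[(r, c)][1] for c2 in strategies)
    return row_ok and col_ok

equilibria = [(r, c) for r in strategies for c in strategies if is_nash(r, c)]
print(equilibria)  # [('D', 'D')]: mutual defection is the unique equilibrium
```

The same exhaustive best-response check scales (exponentially) to any finite game, which is precisely why the algorithmic side of the course matters.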
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce Master students in Computer Science to the entrepreneurial mindset and give them preliminary training in it. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students attending the course do not need to have a startup idea; rather, they will take part in a "startup building process", meaning they will learn and practice all the main steps that shape a (possibly simple) "ICT technical idea" into a viable product, be it that of a startup or of a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course will combine frontal lectures on the basic principles and methodologies underlying Innovation with a learning-by-doing experience. Skills: Entrepreneurial mindset, pills of Innovation methodologies and IP protection, how to make the value of an ICT idea. Syllabus: • What a company is and what its purpose is • Pills of IP protection • ICT company structure and roles • B2B vs B2C • Value Proposition Design for ICT • ICT team management • From idea to startup, the journey • How to found a startup in less than 10 hours • Fundraising and spending: why you need the money and how to spend it
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction to, and practical experience with, the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that will allow them to implement quantum algorithms on actual cloud-based physical quantum computers. Skills: The course aims at providing students with a suitable background to understand quantum computational reasoning and to design/analyse quantum algorithms for various application fields. The algorithms will be run on both simulators and real prototypes of quantum machines. Prerequisites: Linear algebra, basic concepts of Numerical Analysis, Theory of Algorithms. Syllabus: • Fundamental Concepts - Basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac's bra-ket notation) - Qubits, quantum gates, and circuits - Superposition and entanglement • Fundamental Algorithms - Teleportation - Grover's quantum search algorithm - Quantum Fourier Transform - Shor's integer factorization algorithm • Recent applications - Quantum data preparation and QRAM - Examples of quantum machine learning algorithms - Quantum random walk - Quantum PageRank
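The superposition concept in the syllabus can be illustrated in a few lines of plain Python (a sketch, not the course's software stack): applying the Hadamard gate to |0⟩ and reading off measurement probabilities with the Born rule.

```python
# Simulate a single qubit as a 2-component complex state vector.
# H|0> = (|0> + |1>) / sqrt(2): an equal superposition, so measurement
# yields 0 or 1 with probability 1/2 each.
import math

def apply(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

ket0 = [1 + 0j, 0 + 0j]               # the |0> basis state
psi = apply(H, ket0)                  # superposition state
probs = [abs(a) ** 2 for a in psi]    # Born rule: |amplitude|^2

print(probs)  # ~[0.5, 0.5]

# H is unitary and its own inverse: applying it again returns |0>.
back = apply(H, psi)
```

The cloud-based quantum computers mentioned above run the same linear algebra on physical qubits; a simulator like this one is just the 2^n-dimensional matrix arithmetic made explicit.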
  • Second year

  • Thesis (24 cfu)


  • Free choice (9 cfu)

    • Free choice exam to be approved by the Academic Board
  • 18 cfu to be chosen from the ICT-2 group of related 9 cfu courses in the second year

    • Related 9 cfu courses of the ICT curriculum offered in the second year
    • Data Mining (9 cfu)

      • This course provides a structured introduction to the key methods of data mining and the design of knowledge discovery processes. Organizations and businesses are overwhelmed by the flood of data continuously collected into their data warehouses as well as sensed by all kinds of digital technologies - the web, social media, mobile devices, the internet of things. Traditional statistical techniques may fail to make sense of the data, due to their inherent complexity and size. Data mining, knowledge discovery and statistical learning techniques emerged as an alternative approach, aimed at revealing patterns, rules and models hidden in the data, and at supporting the analytical user in developing descriptive and predictive models for a number of challenging problems. • Fundamentals of data mining and of the knowledge discovery process from data. • Design of data analysis processes. • Statistical exploratory analytics for data understanding. • Dimensionality reduction and Principal Component Analysis. • Clustering analysis with centroid-based, hierarchical and density-based methods; predictive analytics and classification models (including decision trees, Bayesian, rule-based, kernel-based, SVM, random forest and ensemble methods); pattern mining and association rule discovery. • Validation and interpretation of discovered patterns and models within statistical frameworks. • Design and development of data mining processes using state-of-the-art technology, including KNIME, Python, and R, within a wrap-up project aimed at using and possibly modifying the DM tools and libraries learned in class.
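As a small illustration of the centroid-based clustering listed above, here is a one-dimensional k-means sketch in pure Python; the data points and the naive initialisation are invented for the example.

```python
# One-dimensional k-means: alternate an assignment step (each point joins
# its nearest centroid) and an update step (each centroid moves to its
# cluster mean) until convergence.

def kmeans_1d(points, k=2, iters=20):
    centroids = points[:k]                       # naive initialisation
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                         # assignment step
            i = min(range(k), key=lambda j: abs(p - centroids[j]))
            clusters[i].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]  # update step
                     for i, c in enumerate(clusters)]
    return centroids, clusters

centroids, clusters = kmeans_1d([1.0, 1.2, 0.8, 10.0, 10.4, 9.6])
print(sorted(centroids))  # the two clumps are recovered: ~[1.0, 10.0]
```

Real data mining toolkits (KNIME, scikit-learn in Python, R) add smarter initialisation and distance metrics, but the two-step loop is the same.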
    • Software validation and verification (9 cfu)

      • The goal of the course is to introduce techniques for verifying and validating software properties, either by analysing a model extracted from a program via model checking, by testing the software before (the next) deployment, or by equipping the running software with tools that monitor its execution. - Specifying software properties [2 CFU] o Assertions o Invariants, Safety and Liveness Properties, Fairness o Temporal logics: LTL, CTL, CTL* - Model Checking [4 CFU] o Transition systems and Program graphs o Checking regular safety properties o Checking omega-regular properties with Büchi automata o Overview of Promela-SPIN and SMV o Extracting models from Java source code: BANDERA - Testing [3 CFU] o Coverage criteria and metrics: statement, function, branch, path and data-flow coverage o Test case selection, prioritization and minimization o Automatic generation of test cases. Topics to be chosen among: o Component-based and service-oriented system testing o Object-oriented testing and JUnit o Access control system testing o Performance and other non-functional aspects testing
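The explicit-state flavour of model checking covered above can be sketched as exhaustive exploration of a transition system; the toy counter below and its "bad" state are invented for illustration, not tied to SPIN or SMV.

```python
# Check a safety property ("state 3 is never reached") on a small
# transition system by breadth-first exploration of the reachable states -
# the core idea behind explicit-state model checking. A violation comes
# back as a counterexample trace, exactly as a model checker reports it.
from collections import deque

def successors(state):
    """Transition relation of a toy mod-3 counter (it wraps before 3)."""
    return [(state + 1) % 3]

def violates_safety(state):
    return state == 3  # the bad state the property forbids

def check_safety(initial):
    """BFS over reachable states; return a counterexample path or None."""
    frontier = deque([[initial]])
    seen = {initial}
    while frontier:
        path = frontier.popleft()
        if violates_safety(path[-1]):
            return path                  # counterexample trace
        for nxt in successors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None                          # property holds on all reachable states

print(check_safety(0))  # None: state 3 is unreachable, the invariant holds
```

Checking liveness or omega-regular properties requires Büchi automata rather than plain reachability, which is where the second half of the module picks up.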
    • Machine learning (9 cfu)

      • We introduce the principles and the critical analysis of the main paradigms for learning from data and their applications. The course provides the Machine Learning basis for both the aims of building new adaptive Intelligent Systems and powerful predictive models for intelligent data analysis. - Computational learning tasks for predictions, learning as function approximation, generalization concept. - Linear models and Nearest-Neighbors (learning algorithms and properties, regularization). - Neural Networks (MLP and deep models, SOM). - Probabilistic graphical models. - Principles of learning processes: elements of statistical learning theory, model validation. - Support Vector Machines and kernel-based models. - Introduction to applications and advanced models. Applicative project: implementation and use of ML/NN models with emphasis on the rigorous application of validation techniques.
    • Language-based technology for security (9 cfu)

      • Overview : Traditionally, computer security has been largely enforced at the level of operating systems. However, operating-system security policies are low-level (such as access control policies, protecting particular files), while many attacks are high-level, or application-level (such as email worms that pass by access controls pretending to be executed on behalf of a mailer application). The key to defending against application-level attacks is application-level security. Because applications are typically specified and implemented in programming languages, this area is generally known as language-based security. A direct benefit of language-based security is the ability to naturally express security policies and enforcement mechanisms using the developed techniques of programming languages. The aim of the course is to allow each student to develop a solid understanding of application level security, along with a more general familiarity with the range of research in the field. In-course discussion will highlight opportunities for cutting-edge research in each area. The course intends to provide a variety of powerful tools for addressing software security issues: - To obtain a deeper understanding of programming language-based concepts for computer security. - To understand the design and implementation of security mechanisms. - To understand and move inside the research in the area of programming languages and security. Content: This course combines practical and cutting-edge research material. For the practical part, the dual perspective of attack vs. protection is threaded through the lectures, laboratory assignments, and projects. For the cutting-edge research part, the course's particular emphasis is on the use of formal models of program behaviour for specifying and enforcing security properties. 
Topics include: - Certifying Compilers - Code obfuscation - In-lined Reference Monitors - Formal Methods for security - Security in web applications - Information Flow Control. Lab assignments and final examination: There are lab assignments, consisting of experimental activities on specific problems. To pass the course, students must pass the labs by presenting the assignments in class and by submitting a written report, meeting the stated requirements, that documents the activities carried out. Learning Goals: After the course, students should be able to apply practical knowledge of security for modern programming languages. This includes the ability to identify application- and language-level security threats, design and argue for application- and language-level security policies, and design and argue for the security, clarity, usability, and efficiency of solutions, as well as implement such solutions in expressive programming languages. Students should be able to demonstrate critical knowledge of the principles behind application-level attacks such as race conditions, buffer overruns, and code injections, and should master the principles behind language-based protection mechanisms such as static security analysis, program transformation, and reference monitoring.
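One language-based mechanism from the list above, dynamic information-flow (taint) tracking, can be sketched in a few lines; the class, source and sink names and the injected string are all invented for illustration.

```python
# Toy dynamic taint tracking: values derived from untrusted input carry a
# taint mark that propagates through operations; a security-sensitive sink
# (the reference monitor) refuses tainted data, blocking injection-style
# flows at the language level rather than at the OS level.

class Tainted:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):            # taint propagates through operations
        o = other.value if isinstance(other, Tainted) else other
        return Tainted(self.value + o)

def untrusted_input(s):
    return Tainted(s)                    # source: mark external data as tainted

def sink(v):
    """Security-sensitive operation: reject tainted data."""
    if isinstance(v, Tainted):
        raise ValueError("information-flow violation: tainted value at sink")
    return v

query = untrusted_input("'; DROP TABLE users --") + " AND x=1"
try:
    sink(query)                          # the monitor blocks the derived value
except ValueError as e:
    print(e)

sink("SELECT 1")                         # untainted data passes unchanged
```

Static information-flow analysis, also covered in the course, makes the same judgement at compile time instead of tagging values at run time.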
    • Parallel and Distributed Systems: paradigms and models (9 cfu)

      • The course aims to provide a mix of foundational and advanced knowledge in the field of parallel computing, specifically targeted at high-performance applications. A first, relatively small part of the course provides the necessary background on parallel hardware, from multicores to accelerators up to distributed systems such as clusters and clouds. The principles of parallel computing are then addressed, including the measures that characterise parallel computations, the mechanisms and policies supporting parallel computing, and the typical models for high-performance computing. Finally, a survey of existing programming frameworks is included, aimed at preparing students to use and exploit the most modern and advanced frameworks currently adopted both in research institutions and in production contexts. As a result, students attending the course will gain a general perspective on the parallel computing area as well as a comprehensive survey of the frameworks currently available for high-performance computing. The whole series of topics will be complemented by practical exercises, in class - following the "bring-your-own-device" principle - or as homework assignments, which may also be carried out on machines made available by our department. The programming frameworks used in the course will be introduced by detailing their main features and usage models, leaving to the students the task of learning the low-level syntactic details (under the supervision of the teachers) as part of the homework assignments. Contents a) Evolution of computing devices from sequential to parallel: introduction to multicore, many-core, accelerators, cluster and cloud architectures.
b) Principles of parallel computing: measures of interest (time and power), horizontal and vertical scalability, communication/sharing and synchronisation mechanisms, concurrent activities (processes, threads, kernels), vectorisation, typical patterns for data-intensive parallel computing. Laboratory exercises and assignments using state-of-the-art parallel programming frameworks targeting shared-memory multicores. c) Advanced parallel and distributed processing frameworks for data-intensive applications: GPUs, data stream processing, and data-intensive programming frameworks. Laboratory exercises and assignments using state-of-the-art programming frameworks targeting distributed architectures or accelerators.
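A minimal example of the data-parallel map pattern discussed above, using Python's standard thread pool as a stand-in for the dedicated frameworks covered in the course; the `heavy` kernel is a trivial placeholder for a genuinely compute-intensive function.

```python
# Data-parallel "map" pattern: apply the same kernel to every element of a
# collection, letting a pool of workers process elements concurrently.
# Python's stdlib pool keeps the framework-independent idea visible.
from concurrent.futures import ThreadPoolExecutor

def heavy(x):
    """Stand-in for a compute-intensive kernel."""
    return x * x

data = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(heavy, data))  # order-preserving parallel map

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The frameworks surveyed in the course expose the same pattern while additionally controlling data placement, communication and accelerator offloading, which is where the real performance measures (time, power, scalability) come in.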
    • Intelligent Systems for pattern recognition (9 cfu)

      • The course introduces students to the design of A.I. based solutions to complex pattern recognition problems and discusses how to realize applications exploiting machine learning techniques. The course also presents fundamentals of signal and image processing. Particular focus will be given to pattern recognition problems and models dealing with sequential and visual data. • Signal processing and time-series analysis • Image processing, filters and visual feature detectors • Bayesian learning and deep learning for machine vision and signal processing • Deep learning for pattern recognition on non-vectorial data (physiological data, sensor streams, etc) • Adaptive methods for graphs and relational data • Reinforcement learning and intelligent agents • Pattern recognition applications: machine vision, bio-informatics, robotics, medical imaging, etc. • ML and deep learning libraries overview: e.g. Keras, Pytorch, Tensorflow, Ray, ... A final project will introduce students to the implementation of a pattern recognition application or to the development of computational intelligence applications.
  • 6 cfu to be chosen from the ICT-2 group of related 6 cfu courses in the second year

    • Related 6 cfu courses of the ICT curriculum offered in the second year
    • Information Retrieval (6 cfu)

      • In this course we will study, design and analyze (theoretically and experimentally) software tools for IR-applications dealing with unstructured (raw data), structured (DB-centric) or semi-structured data (i.e. HTML, XML). We will mainly concentrate on the basic components of a modern Web search engine, by examining in detail the algorithmic solutions currently adopted to implement its main software modules. We will also discuss their performance and/or computational limitations, as well as introduce measures for evaluating their efficiency and efficacy. Finally, we will survey some algorithmic techniques which are frequently adopted in the design of IR-tools managing large datasets. -Search engines -Crawling, Text analysis, Indexing, Ranking -Storage of Web pages and (hyper-)link graph -Results processing and visualization -Other data types: XML, textual DBs -Data processing for IR tools -Data streaming -Data sketching -Data compression -Data clustering (sketch)
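The inverted index at the core of the indexing and ranking modules above can be sketched in a few lines; the corpus is invented and the tokenisation deliberately naive.

```python
# Inverted index: map each term to the set of documents containing it,
# then answer a conjunctive (AND) query by intersecting posting lists.
from collections import defaultdict

docs = {
    0: "the quick brown fox",
    1: "the lazy dog",
    2: "the quick dog",
}

index = defaultdict(set)             # term -> posting set of doc ids
for doc_id, text in docs.items():
    for term in text.split():        # naive whitespace tokenisation
        index[term].add(doc_id)

def query_and(*terms):
    """Intersect posting lists (real engines intersect smallest-first)."""
    postings = [index[t] for t in terms]
    return sorted(set.intersection(*postings)) if postings else []

print(query_and("quick", "dog"))  # [2]: the only doc with both terms
```

A production engine stores postings compressed and sorted on disk, and layers ranking (e.g. BM25, link analysis) on top of exactly this intersection step.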
    • Scientific and large data visualization (6 cfu)

      • The availability of data generated from sensors, mobile devices, social networks, and so on has grown continuously in recent years. Visualization is what one needs to put data to good use: it allows one to analyze, explore and communicate possibly large and complex data in a meaningful way. The first part of the course will deal with scientific visualization, which concerns the graphical illustration of scientific data (for example, biological data) to understand and glean insights into the underlying phenomena. The second part of the course will introduce the fundamentals of information visualization. Unlike scientific visualization, where data have an immediate physical representation, information visualization often deals with abstract data, which do not have a direct visual representation, like the network of people connections on a social network. We will learn to decide what to visualize, how to abstract and encode it using different graph types, and how to evaluate different solutions according to visual perception rules. Fundamentals of scientific and data visualization. Visual perception. Best practices in data visualization. Visualization techniques for both scientific phenomena and abstract data. Visualization libraries. By the end of the course, the students will be able to • illustrate and communicate data and results using visualization, also for complex and large datasets • use existing libraries and software tools for visualization purposes (e.g. seaborn, D3.js). Syllabus 1. Introduction: differences between scientific visualization, data visualization, interactive visualization, visual analytics and infographics. 2. Scientific Visualization a. 3D data visualization b. Spatial data structures and indexing c. Flow visualization d. ParaView tool e. Topological Data Analysis for scientific visualization f. TTK - Topology ToolKit 3. Data Visualization Pipeline a. Data and attribute types b. Data preprocessing c. Graph and chart types d. Encoding and decoding processes e. Evaluation of visualizations 4. Visual Perception a. Fundamentals b. Gestalt laws c. Preattentive processes d. Color 5. Time Series 6. Animated Charts 7. Graph Drawing a. Trees b. Small graphs c. Large graphs 8. Multi-dimensional Data Visualization a. Multi-dimensional glyphs b. Dimensionality reduction techniques c. Ordering/sorting d. Dataset summarization 9. Machine Learning and Data Visualization 10. Python for Data Science and Visualization a. Intro, NumPy, Pandas b. Python's visualization ecosystem (Matplotlib, Plotly, ...) 11. D3.js
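As a small illustration of the "data preprocessing" stage of the visualization pipeline above, the sketch below bins raw values into counts that a bar chart (drawn with Matplotlib, seaborn or D3.js) would then visually encode; the data values are invented.

```python
# Histogram binning: turn raw values into chart-ready counts, the step
# between "data preprocessing" and "encoding" in the visualization pipeline.

def histogram(values, n_bins, lo, hi):
    """Count values into n_bins equal-width bins over [lo, hi)."""
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for v in values:
        if lo <= v < hi:
            counts[int((v - lo) / width)] += 1
    return counts

data = [0.1, 0.2, 0.25, 0.7, 0.8, 0.95]
counts = histogram(data, n_bins=4, lo=0.0, hi=1.0)
print(counts)  # [2, 1, 1, 2]

# A crude text rendering of the resulting bar chart, one bar per bin:
for i, c in enumerate(counts):
    print(f"bin {i}: " + "#" * c)
```

Choosing the number of bins is itself a perceptual design decision: too few bins hide structure, too many amplify noise, which is exactly what the visual-perception part of the course evaluates.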
    • Business Process Modeling (6 cfu)

      • The course illustrates the main concepts and issues involved in process management, where processes are understood as workflows built by composing atomic activities, and provides an overview of the languages, conceptual models, and automatic tools based on them that can be used to address these issues adequately. The learning path will lead students to become familiar with the technical terms of the area, with the different models for structuring and composing processes rigorously, with the logical properties these models may be required to satisfy, and with techniques for process analysis and verification. Students will also be able to experiment with the concepts through automatic tools for designing and analysing processes. Syllabus - Introduction to process management issues. - Terminology (business process, business process management, business process management system, business process model, process orchestration, business process lifecycle, workflow) and classification (orchestration vs choreography, automation, structuring). - Overview of the evolution of business process management systems. - Process modelling. - Conceptual models and abstraction levels. - Functional decomposition and modularisation. - Process orchestration. - Properties of interest in the design, analysis, and verification of workflow-based processes. - Orchestration patterns (sequence, parallel split, exclusive split, and-join, exclusive join) and structured workflows. - Rigorous models for workflows: Petri nets and workflow nets. - Automatic tools for workflow design and analysis. Hands-on experimentation in a workflow process design environment, with automatic tools for designing and analysing workflow processes.
    • Wireless Networks of Embedded Systems (6 cfu)

      • Objectives: The course presents the fundamental principles of wireless sensor networks (WSN), with a specific focus on micro-kernels, IP programming, and service-oriented architectures. An Internet of Things approach will be adopted, also discussing technological aspects related to real-time programming. Syllabus: • Introduction to WSN technology and its applications • Device architectures and components • Low-level coding and operating systems • Wireless communication techniques at the PHY and MAC layers • Characterisation of IP protocols • Basic programming for embedded systems • Application cases • Sensor network abstractions (db-like) • Case studies: Embedded Vision, Multinode Data Aggregation • Real-world applications, in particular in the field of Intelligent Transportation.
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools will be provided to help the understanding of the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science, engineering). Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games together with the main algorithms for their analysis. The theoretical analysis will be paired with applications to problems from a wide range of different areas, chosen based on the students' interests (e.g., ranging from computational economics and social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning). Skills: The course aims at providing students with a suitable background to • formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers • understand the inner mechanisms of competition and cooperation • understand the inner mechanisms of threats and promises • forecast the behaviour of agents • design mechanisms to steer systems towards desired objectives through adequate mathematical models. Syllabus: • Noncooperative games • Auctions and bargaining • Cooperative games • Game theory in practice • Applications (computer science, economic computation, etc.)
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce Master students in Computer Science to the entrepreneurial mindset and give them preliminary training in it. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students attending the course do not need to have a startup idea; rather, they will take part in a "startup building process", meaning they will learn and practice all the main steps that shape a (possibly simple) "ICT technical idea" into a viable product, be it that of a startup or of a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course will combine frontal lectures on the basic principles and methodologies underlying Innovation with a learning-by-doing experience. Skills: Entrepreneurial mindset, pills of Innovation methodologies and IP protection, how to make the value of an ICT idea. Syllabus: • What a company is and what its purpose is • Pills of IP protection • ICT company structure and roles • B2B vs B2C • Value Proposition Design for ICT • ICT team management • From idea to startup, the journey • How to found a startup in less than 10 hours • Fundraising and spending: why you need the money and how to spend it
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction to, and practical experience with, the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that will allow them to implement quantum algorithms on actual cloud-based physical quantum computers. Skills: The course aims at providing students with a suitable background to understand quantum computational reasoning and to design/analyse quantum algorithms for various application fields. The algorithms will be run on both simulators and real prototypes of quantum machines. Prerequisites: Linear algebra, basic concepts of Numerical Analysis, Theory of Algorithms. Syllabus: • Fundamental Concepts - Basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac's bra-ket notation) - Qubits, quantum gates, and circuits - Superposition and entanglement • Fundamental Algorithms - Teleportation - Grover's quantum search algorithm - Quantum Fourier Transform - Shor's integer factorization algorithm • Recent applications - Quantum data preparation and QRAM - Examples of quantum machine learning algorithms - Quantum random walk - Quantum PageRank

  • AI - Artificial Intelligence

    First year

  • Machine learning (9 cfu)

    • We introduce the principles and the critical analysis of the main paradigms for learning from data and their applications. The course provides the Machine Learning basis for both the aims of building new adaptive Intelligent Systems and powerful predictive models for intelligent data analysis.
      - Computational learning tasks for predictions, learning as function approximation, generalization concept.
      - Linear models and Nearest-Neighbors (learning algorithms and properties, regularization).
      - Neural Networks (MLP and deep models, SOM).
      - Probabilistic graphical models.
      - Principles of learning processes: elements of statistical learning theory, model validation.
      - Support Vector Machines and kernel-based models.
      - Introduction to applications and advanced models.
      Applicative project: implementation and use of ML/NN models with emphasis on the rigorous application of validation techniques.
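The validation discipline stressed above can be illustrated with a minimal hold-out protocol around a 1-nearest-neighbour classifier; the 1-D data and the alternating split are invented for the example.

```python
# Hold-out validation: fit on one part of the data, estimate generalization
# accuracy on the held-out part - never on the training points themselves.

def nn_predict(train, x):
    """Predict the label of the closest training point (1-NN)."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

data = [(0.1, "a"), (0.2, "a"), (0.3, "a"), (0.9, "b"), (1.0, "b"), (1.1, "b")]
train, test = data[::2], data[1::2]   # naive alternating hold-out split

correct = sum(nn_predict(train, x) == y for x, y in test)
accuracy = correct / len(test)
print(accuracy)  # 1.0 on this trivially separable example
```

In practice one would use stratified or repeated splits (cross-validation) and keep a further untouched test set for the final estimate, which is exactly the rigour the applicative project asks for.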

  • Artificial intelligence fundamentals (6 cfu)

    • The course aims to offer a view of the classical/symbolic approach to Artificial Intelligence and serves as a basis for a more in-depth treatment of specific theories and technologies for building complete A.I. systems integrating different approaches and methods.
      - Advanced search
      - Constraint satisfaction problems
      - Knowledge representation and reasoning
      - Non-standard logics
      - Uncertain and probabilistic reasoning (Bayesian networks, fuzzy sets).
      - Foundations of semantic web: semantic networks and description logics.
      - Rules systems: use and efficient implementation.
      - Planning systems
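Constraint satisfaction, listed above, can be sketched as backtracking search; the map-colouring instance below (regions, adjacencies, colours) is invented for illustration.

```python
# Backtracking search for a CSP: assign variables one at a time, keep only
# values consistent with the constraints, and backtrack on dead ends.
# Instance: colour a path of regions A-B-C so neighbours differ.

def backtrack(assignment, variables, domains, neighbours):
    if len(assignment) == len(variables):
        return assignment                      # all variables assigned
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        # Constraint check: adjacent regions must take different colours.
        if all(assignment.get(n) != value for n in neighbours[var]):
            result = backtrack({**assignment, var: value},
                               variables, domains, neighbours)
            if result is not None:
                return result
    return None                                # dead end: backtrack

variables = ["A", "B", "C"]
domains = {v: ["red", "green"] for v in variables}
neighbours = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}

solution = backtrack({}, variables, domains, neighbours)
print(solution)  # {'A': 'red', 'B': 'green', 'C': 'red'}
```

Real CSP solvers add the heuristics the course covers (variable/value ordering, constraint propagation such as arc consistency) on top of this same skeleton.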

  • Human language technologies (9 cfu)

    • The course presents principles, models and the state of the art techniques for the analysis of natural language, focusing mainly on statistical machine learning approaches and Deep Learning in particular. Students will learn how to apply these techniques in a wide range of applications using modern programming libraries.
      - Formal and statistical approaches to NLP.
      - Statistical methods: Language Model, Hidden Markov Model, Viterbi Algorithm, Generative vs Discriminative Models
      - Linguistic essentials (tokenization, morphology, PoS, collocations, etc.).
      - Parsing (constituency and dependency parsing).
      - Processing Pipelines.
      - Lexical semantics: corpora, thesauri, gazetteers.
      - Distributional Semantics: Word embeddings, Character embeddings.
      - Deep Learning for natural language.
      - Applications: Entity recognition, Entity linking, classification, summarization.
      - Opinion mining, Sentiment Analysis.
      - Question answering, Language inference, Dialogic interfaces.
      - Statistical Machine Translation.
      - NLP libraries: NLTK, Theano, Tensorflow.
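Among the statistical methods listed above is the Viterbi algorithm. A compact pure-Python decoder on a textbook toy HMM (the weather/activity example; all probabilities here are illustrative, not course material):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an observation sequence
    under a discrete HMM (probabilities given as nested dicts)."""
    # V[t][s] = (best probability of any path ending in s at time t, that path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        V.append({})
        for s in states:
            prob, path = max(
                (V[-2][r][0] * trans_p[r][s] * emit_p[s][o], V[-2][r][1] + [s])
                for r in states)
            V[-1][s] = (prob, path)
    return max(V[-1].values())

states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}
prob, path = viterbi(['walk', 'shop', 'clean'], states, start_p, trans_p, emit_p)
# path is ['Sunny', 'Rainy', 'Rainy']
```

Dynamic programming keeps only the best path into each state at each step, so decoding is linear in the sequence length rather than exponential.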

  • Intelligent Systems for pattern recognition (9 cfu)

    • The course introduces students to the design of A.I. based solutions to complex pattern recognition problems and discusses how to realize applications exploiting machine learning techniques. The course also presents fundamentals of signal and image processing. Particular focus will be given to pattern recognition problems and models dealing with sequential and visual data.
      • Signal processing and time-series analysis
      • Image processing, filters and visual feature detectors
      • Bayesian learning and deep learning for machine vision and signal processing
      • Deep learning for pattern recognition on non-vectorial data (physiological data, sensor streams, etc)
      • Adaptive methods for graphs and relational data
      • Reinforcement learning and intelligent agents
      • Pattern recognition applications: machine vision, bio-informatics, robotics, medical imaging, etc.
      • ML and deep learning libraries overview: e.g. Keras, Pytorch, Tensorflow, Ray, ...
      A final project will introduce students to the implementation of a pattern recognition application or to the development of computational intelligence applications.


  • Parallel and Distributed Systems: paradigms and models (9 cfu)

    • The course aims to provide a mix of foundational and advanced knowledge in the field of parallel computing, specifically targeted at high-performance applications. A relatively small first part of the course provides the necessary background on parallel hardware, from multicores to accelerators up to distributed systems such as clusters and clouds. The principles of parallel computing are then covered, including the measures that characterize parallel computations, the mechanisms and policies that support parallel computing, and the typical models for high-performance computing. Finally, a survey of existing programming frameworks is included, aimed at preparing students to use and exploit the most modern and advanced frameworks currently in use both in research institutions and in production settings. Students attending the course will therefore gain a general perspective on the parallel computing area as well as a comprehensive survey of the frameworks currently available for high-performance computing. The whole set of topics will be complemented by practical exercises, in class (following the "bring-your-own-device" principle) or as homework assignments, which can also be carried out on machines made available by our department. The programming frameworks used in the course will be introduced by detailing their main features and usage models, leaving to the students the task of learning the low-level syntactic details (under the supervision of the teachers) as part of the homework.

      Contents
      a) Evolution of computing devices from sequential to parallel: introduction to multi-core and many-core processors, accelerators, cluster and cloud architectures.
      b) Principles of parallel computing: measures of interest (time and power), horizontal and vertical scalability, communication/sharing and synchronization mechanisms, concurrent activities (processes, threads, kernels), vectorization, typical patterns for data-intensive parallel computing.
      Lab exercises and assignments using state-of-the-art parallel programming frameworks targeting shared-memory multicores.
      c) Advanced parallel and distributed processing frameworks for data-intensive applications: GPUs, data stream processing, and data-intensive programming frameworks.
      Lab exercises and assignments using state-of-the-art programming frameworks targeting distributed architectures or accelerators.

  • Computational mathematics for learning and data analysis (9 cfu)

    • The course introduces some of the main techniques for the solution of numerical problems that find widespread use in fields like data analysis, machine learning, and artificial intelligence. These techniques often combine concepts typical of numerical analysis with those proper of numerical optimization, since numerical analysis tools are essential to solve optimization problems, and, vice-versa, problems of numerical analysis can be solved by optimization algorithms. The course has a significant hands-on part whereby students learn how to use some of the most common tools for computational mathematics; during these sessions, specific applications will be briefly illustrated in fields like regression and parameter estimation in statistics, approximation and data fitting, machine learning, artificial intelligence, data mining, information retrieval, and others.
      - Multivariate and matrix calculus
      - Matrix factorization, decomposition and approximation
      - Eigenvalue computation
      - Nonlinear optimization: theory and algorithms
      - Least-squares problems and data fitting
      - MATLAB and other software tools (lab sessions with applications)
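As a taste of the least-squares topic above, a closed-form line fit via the normal equations in plain Python (a two-parameter special case; the course treats the general problem):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x via the normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Solve the 2x2 normal-equation system for slope b and intercept a.
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
# exact data on a line: a = 1.0, b = 2.0
```

For larger or ill-conditioned systems one would use a QR or SVD-based solver instead of the explicit normal equations, which is precisely the kind of trade-off the course analyses.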

  • 12 cfu to be chosen in the AI-1 group of related 6-cfu courses in the first year

    • Related 6-cfu courses of the AI curriculum offered in the first year
    • Robotics (6 cfu)

      • The course introduces the fundamentals of robotics, viewed as an application domain for computer science, intelligent systems, and machine learning; provides students with the basic tools to integrate and program a robotic system, with special attention to the realization of perception-action schemes and behaviour control; and improves students' experimental work capacity through the analysis of case studies and laboratory work.
        • Introduction to robotics: main definitions, illustration of application domains
        • Mechanics and kinematics of the robot
        • Sensors for robotics
        • Robot Control
        • Architectures for controlling behaviour in robots
        • Robotic Navigation
        • Tactile Perception in humans and robots
        • Vision in humans and robots
        • Analysis of case studies of robotic systems
        • Project laboratory: student work in the lab with robotic systems
    • Information Retrieval (6 cfu)

      • In this course we will study, design and analyze (theoretically and experimentally) software tools for IR applications dealing with unstructured (raw), structured (DB-centric) or semi-structured data (e.g., HTML, XML). We will mainly concentrate on the basic components of a modern Web search engine, examining in detail the algorithmic solutions currently adopted to implement its main software modules. We will also discuss their performance and/or computational limitations, and introduce measures for evaluating their efficiency and efficacy. Finally, we will survey some algorithmic techniques frequently adopted in the design of IR tools managing large datasets.
        - Search engines
        - Crawling, text analysis, indexing, ranking
        - Storage of Web pages and the (hyper-)link graph
        - Results processing and visualization
        - Other data types: XML, textual DBs
        - Data processing for IR tools
        - Data streaming
        - Data sketching
        - Data compression
        - Data clustering (sketch)
    • Computational models for complex systems (6 cfu)

      • The objective of this course is to train experts in systems modelling and analysis methodologies. This requires understanding, to some degree of detail, the mathematical and computational techniques involved. However, this will be done with the aim of shaping good modellers who know the advantages, disadvantages and risks of the different modelling and analysis methodologies, who are aware of what happens under the hood of a modelling and analysis tool, and who can develop their own tools if needed. The course will focus on advanced modelling approaches that combine different paradigms and analysis techniques: from ODEs to stochastic models, from simulation to model checking. Case studies from population dynamics, biochemistry, epidemiology, economics and the social sciences will be analysed. Moreover, synergistic approaches that combine computational modelling techniques with data-driven methodologies will be outlined.
        - Modelling with ODEs: examples
        - (Timed and) Hybrid Automata: definition and simulation techniques
        - Stochastic simulation methods (Gillespie’s algorithm and its variants)
        - Hybrid simulation methods (stochastic/ODEs)
        - Rule-based modelling
        - Probabilistic/stochastic model checking: principles, applicability and tools
        - Statistical model checking
        - Process mining (basic notions)
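To give a concrete feel for the stochastic simulation methods mentioned above, here is a minimal sketch of Gillespie's direct method for the single degradation reaction A → ∅ (a one-reaction special case chosen for brevity; real models have many competing reaction channels):

```python
import random

def gillespie_decay(n0, k, t_end, seed=0):
    """Gillespie's direct method for the reaction A -> 0 with
    propensity a(n) = k * n; returns the (time, count) trajectory."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    traj = [(t, n)]
    while n > 0 and t < t_end:
        a = k * n                   # total propensity of the system
        t += rng.expovariate(a)     # exponentially distributed waiting time
        if t >= t_end:
            break
        n -= 1                      # fire the (only) reaction
        traj.append((t, n))
    return traj
```

With several reactions, one extra uniform draw per step selects which channel fires, with probability proportional to its propensity; the waiting-time logic is unchanged.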
    • Social and ethical issues in information technology (6 cfu)

      • The progress in AI research makes it timely to focus not only on making AI more capable, but also on maximizing the societal benefit of AI [from Research Priorities for Robust and Beneficial Artificial Intelligence, an open letter]. This concern is by necessity interdisciplinary, because it involves both society and AI. It ranges from economics, law and philosophy to computer security, formal methods and, of course, various branches of AI itself. The course will be organized as a series of seminars on different hot topics. Contents may include:
        • Philosophical implications of AI
          o Technological limitations to AI
          o Biological limitations to human intelligence
        • Economic impact of emerging technologies
          o The disappearance of intellectual jobs
          o The rise of the corporate colossus
          o The power of data in the hands of big companies, and self-regulation (Partnership on AI)
        • Legal and ethical issues
          o Privacy, transparency, biases and fairness, data lock-in
          o Autonomous vehicles and weapons
          o Machines for human care and threats to human dignity: customer care, robot companions, …
        • Trustworthiness of the technologies
          o Safety, robustness, and control
        • Future scenarios
          o Superintelligence implications
    • Semantic web (6 cfu)

      • The course presents Semantic Web technologies, enabling students to design and implement knowledge bases based on ontologies encoded with Semantic Web languages, and to offer access to them as Linked Data.
        • The architecture of the Web and the Semantic Web stack; URIs
        • Resource Description Framework (RDF) and RDF Schema
        • The query language SPARQL
        • Linked Data: creation of data sets from DB relations; access
        • Web Ontology Language (OWL): syntax and semantics
        • Top ontologies: main definitions and examples (DOLCE and CRM)
        • Specific ontologies, such as semantic sensor networks
        • Extraction of knowledge from KBs (DBpedia, Freebase)
        • Project consisting in the creation of ontologies (using Protégé)
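The heart of SPARQL query evaluation is matching triple patterns against an RDF graph. A minimal in-memory illustration in plain Python (a toy, not a SPARQL engine; the `ex:` names are invented for the example):

```python
def match(triples, pattern):
    """Match one (subject, predicate, object) pattern against a list of
    RDF-style triples; strings starting with '?' are variables.
    Returns one variable-binding dict per matching triple."""
    results = []
    for t in triples:
        binding = {}
        for term, val in zip(pattern, t):
            if term.startswith('?'):
                if binding.get(term, val) != val:
                    break          # variable already bound to something else
                binding[term] = val
            elif term != val:
                break              # constant term does not match
        else:
            results.append(binding)
    return results

triples = [
    ("ex:Pisa", "rdf:type", "ex:City"),
    ("ex:Pisa", "ex:region", "ex:Tuscany"),
    ("ex:Florence", "rdf:type", "ex:City"),
]
rows = match(triples, ("?c", "rdf:type", "ex:City"))
# bindings for ?c: ex:Pisa and ex:Florence
```

A real engine joins the binding sets of several such patterns (a basic graph pattern) and adds indexes; this sketch only shows the single-pattern step.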
    • Computational neuroscience (6 cfu)

      • The objectives of the "Computational neuroscience" class include advanced computational neural models for learning; architectures and learning methods for dynamical/recurrent neural networks for temporal data, and the analysis of their properties; bio-inspired neural modelling; spiking and reservoir computing neural networks; and the role of computational neuroscience in real-world applications (through case studies). The content includes the following topics:
        - Computational models of the biological neuron (neuroscience modeling)
        - Models of synaptic plasticity and learning (representation/deep learning)
        - Recurrent neural networks (dynamical models for temporal data)
        - Applications (case studies)
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective, in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools will be provided to help the understanding of the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science, and engineering).
        Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games together with the main algorithms for their analysis. The theoretical analysis will be paired with applications to problems from a wide range of different areas, chosen according to the students’ interests (e.g., ranging from computational economics and social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning).
        Skills: The course aims at providing students with a suitable background to
        • formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers
        • understand the inner mechanisms of competition and cooperation
        • understand the inner mechanisms of threats and promises
        • forecast the behaviour of agents
        • design mechanisms to steer systems towards desired objectives through adequate mathematical models.
        Syllabus:
        • Noncooperative games
        • Auctions and bargaining
        • Cooperative games
        • Game theory in practice
        • Applications (computer science, economic computation, etc.)
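A basic algorithmic task in noncooperative game theory is finding pure-strategy Nash equilibria of a two-player bimatrix game by checking best responses. A minimal sketch in plain Python, exercised on the classic Prisoner's Dilemma (payoffs chosen as a standard textbook instance):

```python
def pure_nash(payoff_a, payoff_b):
    """Pure-strategy Nash equilibria of a bimatrix game.
    (i, j) is an equilibrium if neither player gains by deviating alone."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eq = []
    for i in range(rows):
        for j in range(cols):
            # Row player cannot improve by switching row, given column j...
            best_row = all(payoff_a[i][j] >= payoff_a[r][j] for r in range(rows))
            # ...and column player cannot improve by switching column, given row i.
            best_col = all(payoff_b[i][j] >= payoff_b[i][c] for c in range(cols))
            if best_row and best_col:
                eq.append((i, j))
    return eq

# Prisoner's Dilemma: strategy 0 = cooperate, 1 = defect.
A = [[-1, -3], [0, -2]]   # row player's payoffs
B = [[-1, 0], [-3, -2]]   # column player's payoffs
# the unique equilibrium is mutual defection, (1, 1)
```

The check is exhaustive and runs in O(rows × cols × (rows + cols)); computing mixed equilibria is the genuinely hard algorithmic problem the course discusses.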
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce Master students in Computer Science to the entrepreneurial mindset and give them preliminary training in it. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students attending the course do not need to have a startup idea: they will participate in a "startup building process", meaning that they will learn and practice all the main steps that shape a (possibly simple) "ICT technical idea" into a viable product, be it that of a startup or of a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course will hinge on frontal lectures on the basic principles and methodologies underlying innovation, combined with a learning-by-doing experience.
        Skills: Entrepreneurial mindset, pills of innovation methodologies and IP protection, how to realize the value of an ICT idea.
        Syllabus:
        • What a company is and what is its purpose
        • Pills of IP protection
        • ICT company structure and roles
        • B2B vs B2C
        • Value proposition design for ICT
        • ICT team management
        • From idea to startup, the journey
        • How to found a startup in less than 10 hours
        • Fundraising and spending: why you need the money and how you should spend it
    • 3D Geometric Modeling & Processing (6 cfu)

      • In this course, we plan to study the fundamental algorithms, data structures, and mathematics behind the current approaches for manipulating and processing geometric data in a variety of real-world applications, such as computer-aided design, interactive computer graphics, reliable physical simulations, and robust 3D representations for machine learning. The course will present the data structures for simplicial complexes and the current discrete representations used to manage 3D shapes in common applications. The course will also introduce the basic notions of differential geometry and topology that can be useful for a better comprehension of algorithms in computer graphics. The most common mesh processing algorithms will be explained together with their practical applications and available implementations. The purpose of the course is to illustrate the most critical mathematical, geometric and algorithmic foundations for representing and processing 3D shapes in computer graphics.
        Syllabus:
        1. Basics of differential geometry and topology for computer graphics
        2. Discrete representations and data structures for simplicial complexes and spatial indexing
        3. Mesh processing algorithms
           a. Remeshing, refinement & simplification
           b. Parametrization and texturing
           c. Fairing and smoothing
           d. Surface reconstruction and sampling
        4. Shape analysis and representations for machine learning
        Prerequisites: knowledge of linear algebra and calculus. Advisement recommendations: C++, Python.
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction to, and practical experience with, the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that will allow them to implement quantum algorithms on actual cloud-based physical quantum computers.
        Skills: The course aims at providing students with a suitable background to understand quantum computing reasoning and to design/analyze quantum algorithms for various application fields. The algorithms will be run on both simulators and real prototypes of quantum machines.
        Prerequisites: linear algebra, basic concepts of numerical analysis, theory of algorithms.
        Syllabus:
        • Fundamental concepts
          - Basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac’s bracket notation)
          - Qubits, quantum gates, and circuits
          - Superposition and entanglement
        • Fundamental algorithms
          - Teleportation
          - Grover’s quantum search algorithm
          - Quantum Fourier Transform
          - Shor’s integer factorization algorithm
        • Recent applications
          - Quantum data preparation and QRAM
          - Examples of quantum machine learning algorithms
          - Quantum random walk
          - Quantum PageRank
    • Computational Health Laboratory (6 cfu)

      • The purpose of this laboratory course is to introduce computer science students to the applicative domain of computational health. Industrial-scale applications will be handled with the tools acquired by the students over a 5-year course of study. Pharmaceutical, food and biotech industries are increasingly becoming computationally driven, and skills to address these challenges are missing. The lab will teach students a language to interact with medical doctors or scientists in the reference industries, a prerequisite for practitioners and scientists in the field. Through hands-on experience, the lab will teach emerging technologies that are essential in this applicative domain and will help students navigate the plethora of available methods and technologies to select the most suitable for each problem. Besides the technological aspects of the lab, we will also make clear connections with the social and ethical aspects of computational health and show how students with these skills can have an impact in the world.
        Knowledge: The course will quickly present the working language needed to address the biological and medical concepts one must understand to work in a biomedical, pharmaceutical or food computational context. It will introduce some emerging technologies to cope with big data: natural language processing for text mining of the scientific literature, data integration from heterogeneous sources, biomarker identification, pathway analysis and, eventually, modeling and simulation for in silico experiments. The knowledge will be delivered through practical examples and projects to be developed during the lab. Artificial intelligence, programming, databases, statistics and computational mathematics will be revisited through the practical solution of biomedical problems.
        Syllabus:
        - Computational biology, bioinformatics, medical informatics, computational health
        - Public-domain knowledge: publicly available resources, text mining, DB mining
        - Data integration: *-omics levels, structured and unstructured public and proprietary data, constraints, quality check
        - Biomarker identification: stratification of patients, diagnostic tools, prognostic tools
        - Functional analysis: pathway and network biology, complexity reduction, module identification
        - Dynamic modeling: modeling technologies, simulation algorithms, hybrid strategies
        - Each item above will be introduced through real industrial case studies
  • Second year

  • Smart applications (9 cfu)

    • The aim of the course is to explore methods and technologies for the development of smart connected applications, i.e. applications which exhibit intelligent behaviour (through the use of artificial intelligence techniques introduced in other courses) and that are deployed in immersive environments, including smart objects (as embodied by Internet of Things devices), mobile devices (smartphones, tablets), wearables (smartwatches, fitness trackers), home automation devices, web technologies, and cloud services and infrastructure. As such, applications considered for the course will include elements of context-awareness, sensor intelligence, spoken-language interfaces, and the like.
      The course will be based around a single case study for a novel smart application; students will cooperate as a single team, under the leadership of the instructor, in the design and implementation of a complete solution. In addition to standard lectures, classroom activities will include workshop-like sessions, where alternative designs are discussed, decisions are taken, and tasks are assigned. Weekly homework on the various phases of the joint project will be assigned to the team, and results reviewed the following week. The final goal is the delivery of a fully-functioning prototype of a smart application addressing the initial problem.
      While the specific technologies adopted for each case study will vary based on needs and opportunities, the following general themes will be explored in lectures (examples of specific subjects are noted next to each theme):
      • Introduction to the course and to the case study
      o examples: a voice-activated ambient assistant to answer student queries about the logistics of lectures in a classroom building, or autonomous software for a robotic rover for exploring inaccessible environments
      • Common designs for smart applications
      o examples: fuzzy logic in control systems or cloud analysis of field sensors data streams
      • Make or buy: selecting appropriate procurement strategies
      o example: writing your own RNN architecture vs. using cloud services
      • Development platforms for smart objects
      o examples: Brillo (IoT devices) or Android TV (Smart TVs)
      • Development platforms for smart architectures
      o examples: TensorFlow (server-side RNNs), or the Face Recognition API (mobile)
      • Cloud services for smart applications
      o examples: Google Cloud Machine Learning API, Google Cloud Vision API, Google Cloud Speech API, or Deploying Deep Neural Networks on Microsoft Azure GPU VMs
      • Deployment and operations
      o examples: cloud hosting vs. device hosting, or harnessing user feedback to drive improvement
      • Measuring success: methods and metrics
      o examples: defining user engagement and satisfaction metrics, or assessing the naturalness of smart interactions

  • Free choice (9 cfu)

    • Free choice exam to be approved by the Academic Board
  • Thesis (24 cfu)


  • 9 cfu to be chosen in the AI-2 group of related 9-cfu courses in the second year

    • Related 9-cfu courses of the AI curriculum offered in the second year
    • Data Mining (9 cfu)

      • This course provides a structured introduction to the key methods of data mining and the design of knowledge discovery processes. Organizations and businesses are overwhelmed by the flood of data continuously collected into their data warehouses as well as sensed by all kinds of digital technologies: the web, social media, mobile devices, the internet of things. Traditional statistical techniques may fail to make sense of the data, due to its inherent complexity and size. Data mining, knowledge discovery and statistical learning techniques emerged as an alternative approach, aimed at revealing patterns, rules and models hidden in the data, and at supporting the analytical user in developing descriptive and predictive models for a number of challenging problems.
        • Fundamentals of data mining and of the knowledge discovery process from data
        • Design of data analysis processes
        • Statistical exploratory analytics for data understanding
        • Dimensionality reduction and Principal Component Analysis
        • Clustering analysis with centroid-based, hierarchical and density-based methods; predictive analytics and classification models (including decision trees, Bayesian, rule-based, kernel-based, SVM, random forest and ensemble methods); pattern mining and association rule discovery
        • Validation and interpretation of discovered patterns and models within statistical frameworks
        • Design and development of data mining processes using state-of-the-art technology, including KNIME, Python, and R, within a wrap-up project aimed at using and possibly modifying the DM tools and libraries learned in class
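Among the clustering methods above, the centroid-based family is typified by Lloyd's k-means algorithm. A minimal sketch in plain Python on a tiny 2-D toy set (the data and starting centroids are invented for illustration):

```python
import math

def kmeans(points, centroids, iters=20):
    """Lloyd's algorithm: alternate nearest-centroid assignment
    and centroid recomputation for a fixed number of iterations."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # assign each point to its nearest centroid
            i = min(range(len(centroids)),
                    key=lambda i: math.dist(p, centroids[i]))
            clusters[i].append(p)
        # recompute each centroid as the mean of its cluster
        # (an empty cluster keeps its previous centroid)
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)]
    return centroids, clusters

cents, clusters = kmeans([(0, 0), (0, 1), (10, 10), (10, 11)],
                         [(0, 0), (10, 10)])
# two well-separated clusters of two points each
```

In practice one iterates until the assignments stop changing and restarts from several random initializations, since the result depends on the starting centroids.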
    • Algorithm engineering (9 cfu)

      • Study, design and analyze advanced algorithms and data structures for the efficient solution of combinatorial problems involving all basic data types, such as integer sequences, strings, (geometric) points, trees and graphs. The design and analysis will involve several models of computation, such as RAM, 2-level memory, cache-oblivious, and streaming, in order to take into account the architectural features of modern PCs and the availability of the Big Data upon which algorithms work. This theoretical analysis will be complemented by several engineering considerations stemming from the implementation of the proposed algorithms and from experiments published in the literature.
        - Design of algorithms for massive datasets: disk-aware or cache-oblivious
        - Design of advanced data structures in hierarchical memories for atomic or string data
        - Data compression for structured and unstructured data
        - Algorithms for large graphs
        - Engineering considerations about the implementation of algorithms and data structures
    • Mobile and cyber-physical systems (9 cfu)

      • The course covers mobile and cyber-physical systems by providing an overview of issues, solutions, architectures, technologies and standards. It offers students an overall, coherent view of the organization of Internet of Things (IoT) systems, from the networking and sensing levels to the applications. Specifically, it shows how mobile, heterogeneous elements (from low-end sensors to high-end devices) form pervasive networks integrated into the internet, and how they interact among themselves and with the surrounding physical world. The course is organized in three parts. The first part (3 CFU) introduces the principles of wireless communications and network architectures for mobility management. The second part (4 CFU) presents the foundations of signal processing and sensing and discusses the applications of sensor networks. The third part (2 CFU) provides an overview of the main standards and platforms of the IoT.
        • Foundations of wireless technologies and mobility management (3 CFU): 5G mobile, ad hoc networks, mobile social networks, IEEE 802.x standards
        • Cyber-physical systems (4 CFU): foundations of signal processing, wireless sensor networks, energy harvesting, localization, elements of embedded programming
        • Internet of Things (2 CFU): ZigBee, Bluetooth, sensor network gateways, IoT platforms & standards (OneM2M, FIWARE, CoAP, MQTT)
  • 6 cfu to be chosen in the AI-2 group of related 6-cfu courses in the second year

    • Related 6-cfu courses of the AI curriculum offered in the second year
    • Robotics (6 cfu)

      • The course introduces the fundamentals of robotics, viewed as an application domain for computer science, intelligent systems, and machine learning; provides students with the basic tools to integrate and program a robotic system, with special attention to the realization of perception-action schemes and behaviour control; and improves students' experimental work capacity through the analysis of case studies and laboratory work.
        • Introduction to robotics: main definitions, illustration of application domains
        • Mechanics and kinematics of the robot
        • Sensors for robotics
        • Robot Control
        • Architectures for controlling behaviour in robots
        • Robotic Navigation
        • Tactile Perception in humans and robots
        • Vision in humans and robots
        • Analysis of case studies of robotic systems
        • Project laboratory: student work in the lab with robotic systems
    • Information Retrieval (6 cfu)

      • In this course we will study, design and analyze (theoretically and experimentally) software tools for IR applications dealing with unstructured (raw), structured (DB-centric) or semi-structured data (e.g., HTML, XML). We will mainly concentrate on the basic components of a modern Web search engine, examining in detail the algorithmic solutions currently adopted to implement its main software modules. We will also discuss their performance and/or computational limitations, and introduce measures for evaluating their efficiency and efficacy. Finally, we will survey some algorithmic techniques frequently adopted in the design of IR tools managing large datasets.
        - Search engines
        - Crawling, text analysis, indexing, ranking
        - Storage of Web pages and the (hyper-)link graph
        - Results processing and visualization
        - Other data types: XML, textual DBs
        - Data processing for IR tools
        - Data streaming
        - Data sketching
        - Data compression
        - Data clustering (sketch)
    • Computational models for complex systems (6 cfu)

      • The objective of this course is to train experts in systems modelling and analysis methodologies. This requires understanding, to some degree of detail, the mathematical and computational techniques involved. However, this will be done with the aim of shaping good modellers who know the advantages, disadvantages and risks of the different modelling and analysis methodologies, who are aware of what happens under the hood of a modelling and analysis tool, and who can develop their own tools if needed. The course will focus on advanced modelling approaches that combine different paradigms and analysis techniques: from ODEs to stochastic models, from simulation to model checking. Case studies from population dynamics, biochemistry, epidemiology, economics and the social sciences will be analysed. Moreover, synergistic approaches that combine computational modelling techniques with data-driven methodologies will be outlined.
        - Modelling with ODEs: examples
        - (Timed and) Hybrid Automata: definition and simulation techniques
        - Stochastic simulation methods (Gillespie’s algorithm and its variants)
        - Hybrid simulation methods (stochastic/ODEs)
        - Rule-based modelling
        - Probabilistic/stochastic model checking: principles, applicability and tools
        - Statistical model checking
        - Process mining (basic notions)
    • Social and ethical issues in information technology (6 cfu)

      • The progress in AI research makes it timely to focus not only on making AI more capable, but also on maximizing the societal benefit of AI [from Research Priorities for Robust and Beneficial Artificial Intelligence, an open letter]. This concern is by necessity interdisciplinary, because it involves both society and AI. It ranges from economics, law and philosophy to computer security, formal methods and, of course, various branches of AI itself. The course will be organized as a series of seminars on different hot topics. Contents may include:
        • Philosophical implications of AI
          o Technological limitations to AI
          o Biological limitations to human intelligence
        • Economic impact of emerging technologies
          o The disappearance of intellectual jobs
          o The rise of the corporate colossus
          o The power of data in the hands of big companies, and self-regulation (Partnership on AI)
        • Legal and ethical issues
          o Privacy, transparency, biases and fairness, data lock-in
          o Autonomous vehicles and weapons
          o Machines for human care and threats to human dignity: customer care, robot companions, …
        • Trustworthiness of the technologies
          o Safety, robustness, and control
        • Future scenarios
          o Superintelligence implications
    • Semantic web (6 cfu)

      • The course presents Semantic Web technologies, enabling the student to design and implement knowledge bases built on ontologies encoded in Semantic Web languages and offered as Linked Data.
        • The architecture of the Web and the Semantic Web stack; URIs
        • Resource Description Framework (RDF) and RDF Schema
        • The query language SPARQL
        • Linked Data: creation of data sets from DB relations; access
        • Web Ontology Language (OWL): syntax and semantics
        • Top ontologies: main definitions and examples (DOLCE and CRM)
        • Specific ontologies, such as semantic sensor networks
        • Extraction of knowledge from KBs (DBpedia, Freebase)
        • Project consisting in the creation of ontologies (using Protégé)
    • Computational neuroscience (6 cfu)

      • The objectives of the "Computational neuroscience" class include advanced computational neural models for learning, architectures and learning methods for dynamical/recurrent neural networks for temporal data and the analysis of their properties, bio-inspired neural modelling, spiking and reservoir computing neural networks, and the role of computational neuroscience in real-world applications (via case studies). The content includes the following topics:
        - Computational models of the biological neuron (neuroscience modeling)
        - Models of synaptic plasticity and learning (representation/deep learning)
        - Recurrent neural networks (dynamical models for temporal data)
        - Applications (case studies)
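As a small illustration of the biological neuron models mentioned above, the following Python sketch (an assumption made for illustration, not course material) simulates a leaky integrate-and-fire neuron with simple Euler integration:

```python
def simulate_lif(current, dt=1.0, tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential v leaks
    towards v_rest, is driven by the input current, and emits a spike
    (then resets) whenever it crosses the threshold.
    Returns the membrane trace and the list of spike steps."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(current):
        # Euler step for: tau * dv/dt = -(v - v_rest) + i_in
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:
            spikes.append(step)
            v = v_reset
        trace.append(v)
    return trace, spikes


# A constant supra-threshold input produces periodic firing.
trace, spikes = simulate_lif([2.0] * 50)
```

Spiking models like this one are the starting point for the synaptic-plasticity and reservoir-computing topics in the syllabus.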
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective, in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools will be provided to help the understanding of the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science and engineering).
        Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games, together with the main algorithms for their analysis. The theoretical analysis will be paired with applications to problems from a wide range of different areas. Applications will be chosen upon the students' interests (e.g., ranging from computational economics and social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning).
        Skills: The course aims at providing students with a suitable background to
        • formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers
        • understand the inner mechanisms of competition and cooperation
        • understand the inner mechanisms of threats and promises
        • forecast the behaviour of agents
        • design mechanisms to steer systems towards desired objectives through adequate mathematical models.
        Syllabus:
        • Noncooperative games
        • Auctions and bargaining
        • Cooperative games
        • Game theory in practice
        • Applications (computer science, economic computation et al.)
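A core notion of noncooperative games is the pure-strategy Nash equilibrium: a profile from which no player gains by deviating unilaterally. A minimal Python sketch (illustrative, not course material) finds all such equilibria of a two-player game by brute force, using the Prisoner's Dilemma as an example:

```python
def pure_nash(payoff_a, payoff_b):
    """Return all pure-strategy Nash equilibria of a two-player game,
    given as payoff matrices (rows: player A's actions, columns: B's).
    (i, j) is an equilibrium iff i is a best response to j and vice versa."""
    n, m = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(n):
        for j in range(m):
            a_best = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n))
            b_best = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(m))
            if a_best and b_best:
                equilibria.append((i, j))
    return equilibria


# Prisoner's Dilemma: action 0 = cooperate, 1 = defect
A = [[3, 0], [5, 1]]   # row player's payoffs
B = [[3, 5], [0, 1]]   # column player's payoffs
```

Here mutual defection (1, 1) is the unique equilibrium, even though mutual cooperation would make both players better off, which is precisely the tension between competition and cooperation the course analyses.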
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce Master students in Computer Science to the entrepreneurial mindset and to give them preliminary training in it. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students attending the course do not need to have a startup idea: they will take part in a "startup building process", meaning that they will learn and practice all the main steps that shape a (possibly simple) "ICT technical idea" into a viable product, whether for a startup or for a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course will hinge on frontal lectures on the basic principles and methodologies underlying innovation, combined with a learning-by-doing experience.
        Skills: Entrepreneurial mindset, pills of innovation methodologies and IP protection, how to make the value of an ICT idea.
        Syllabus:
        • What a company is and what its purpose is
        • Pills of IP protection
        • ICT company structure and roles
        • B2B vs B2C
        • Value proposition design for ICT
        • ICT team management
        • From idea to startup: the journey
        • How to found a startup in less than 10 hours
        • Fundraising and spending: why you need the money and how you should spend it
    • 3D Geometric Modeling & Processing (6 cfu)

      • In this course, we plan to study the fundamental algorithms, data structures, and mathematics behind the current approaches for manipulating and processing geometric data in a variety of real-world applications, like computer-aided design, interactive computer graphics, reliable physical simulations, and robust 3D representations for machine learning. The course will present the data structures for simplicial complexes and the current discrete representations used to manage 3D shapes in common applications. The course will also introduce the basic notions of differential geometry and topology that can be useful for a better comprehension of algorithms in computer graphics. The most common mesh processing algorithms will be explained with their practical applications and available implementations. The purpose of the course is to illustrate the most critical mathematical, geometric and algorithmic foundations for representing and processing 3D shapes in computer graphics.
        Syllabus:
        1. Basics of differential geometry and topology for computer graphics
        2. Discrete representations and data structures for simplicial complexes and spatial indexing
        3. Mesh processing algorithms
           a. Remeshing, refinement & simplification
           b. Parametrization and texturing
           c. Fairing and smoothing
           d. Surface reconstruction and sampling
        4. Shape analysis and representations for machine learning
        Prerequisites: Knowledge of linear algebra and calculus.
        Advisement recommendations: C++, Python
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction to, and practical experience with, the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that will allow them to implement quantum algorithms on actual cloud-based physical quantum computers.
        Skills: The course aims at providing students with a suitable background to understand the new quantum computing reasoning, and to design/analyze quantum algorithms for various application fields. The algorithms will be run on both simulators and real prototypes of quantum machines.
        Prerequisites: Linear algebra, basic concepts of numerical analysis, theory of algorithms.
        Syllabus:
        • Fundamental concepts
          - Basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac's bracket notation)
          - Qubits, quantum gates, and circuits
          - Superposition and entanglement
        • Fundamental algorithms
          - Teleportation
          - Grover's quantum search algorithm
          - Quantum Fourier Transform
          - Shor's integer factorization algorithm
        • Recent applications
          - Quantum data preparation and QRAM
          - Examples of quantum machine learning algorithms
          - Quantum random walk
          - Quantum PageRank
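The basic objects above (state vectors, unitary gates, the Born rule) can be illustrated in a few lines of plain Python, without any quantum toolkit (the course itself uses dedicated software packages; this is only a sketch). A single qubit is a pair of complex amplitudes, a gate is a 2×2 unitary, and measurement probabilities are squared amplitude magnitudes:

```python
import math


def apply_gate(gate, state):
    """Apply a 2x2 unitary matrix to a single-qubit state [amp0, amp1]."""
    a, b = state
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]


# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2)
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

ket0 = [1 + 0j, 0 + 0j]              # the basis state |0>
plus = apply_gate(H, ket0)           # superposition state
probs = [abs(a) ** 2 for a in plus]  # Born rule: measurement probabilities
back = apply_gate(H, plus)           # H is its own inverse: back to |0>
```

Measuring the superposition yields 0 or 1 with probability 1/2 each, while applying H twice returns the qubit to |0> exactly, a first glimpse of interference.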
    • Computational Health Laboratory (6 cfu)

      • The purpose of this laboratory course is to introduce computer science students to the applicative domain of computational health. Industrial-scale applications will be handled with the tools acquired by the students over a five-year course of study. Pharmaceutical, food and biotech industries are increasingly becoming computationally driven, and skills to address these challenges are missing. The lab will teach the students a language for interacting with medical doctors or scientists in the reference industries -- a prerequisite for practitioners and scientists in the field. The lab will teach, with hands-on experience, emerging technologies that are essential in this applicative domain, and will help students navigate the plethora of available methods and technologies to select the most suitable for each problem. Besides the technological aspects of the lab, we will also make clear connections with the social and ethical aspects of computational health, and show how students with these skills can have an impact in the world.
        Knowledge: The course will quickly present the working language needed to address the biological and medical concepts that one must understand to work in a biomedical, pharmaceutical or food computational context. This course will introduce some emerging technologies to cope with big data: natural language processing for text-mining of scientific literature, data integration from heterogeneous sources, biomarker identification, pathway analysis and, eventually, modeling and simulation for in silico experiments. The knowledge will be delivered through practical examples and projects to be developed during the lab. Artificial intelligence, programming, databases, statistics and computational mathematics will be revisited through the practical solution of biomedical problems.
        Syllabus:
        - Computational biology, bioinformatics, medical informatics, computational health
        - Public domain knowledge: publicly available resources, text-mining, DB mining
        - Data integration: *-omics levels, structured and unstructured public and proprietary data, constraints, quality check
        - Biomarker identification: stratification of patients, diagnostic tools, prognostic tools
        - Functional analysis: pathway and network biology, complexity reduction, module identification
        - Dynamic modeling: modeling technologies, simulation algorithms, hybrid strategies
        - Each item above will be introduced through real industrial case studies

  • SW - Software: Programming, Principles, and Technologies

    First year

  • Languages, compilers and interpreters (9 cfu)

    • The course teaches the core of compilation and the program analysis techniques used in compilers and software development tools to improve productivity and reliability. Emphasis is placed on the methodology of applying formal abstractions to writing complex software, using compilers as an example. The course will explore the basic static techniques that are the cornerstone of a variety of program analysis tools, including optimizing compilers, just-in-time compilers, program verifiers, bug finders and code refactoring tools. As case studies, tools developed within the LLVM Compiler Infrastructure will be analysed and used in experimentations.
      - Abstract Machines, Compilation and Interpretation
      - Lexical Analysis and Lexical Analyser Generators
      - Parsing and Parser Generators
      - Static analysis
      - Intermediate Code Generation
      - Optimization
      - Runtime Support
      - Just-in-time compilation
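The first stages of the pipeline above (lexical analysis and parsing) can be sketched in a few lines; this hypothetical Python example (the course works with LLVM-based tools, not this code) tokenizes arithmetic expressions and evaluates them with a recursive-descent parser, one function per grammar rule:

```python
import re


def tokenize(src):
    """Lexical analysis: split the source into numbers, operators, parentheses."""
    return re.findall(r"\d+|[-+*/()]", src)


def parse_expr(tokens):
    # expr := term (('+' | '-') term)*
    value = parse_term(tokens)
    while tokens and tokens[0] in "+-":
        op = tokens.pop(0)
        rhs = parse_term(tokens)
        value = value + rhs if op == "+" else value - rhs
    return value


def parse_term(tokens):
    # term := factor (('*' | '/') factor)*   (higher precedence than +/-)
    value = parse_factor(tokens)
    while tokens and tokens[0] in "*/":
        op = tokens.pop(0)
        rhs = parse_factor(tokens)
        value = value * rhs if op == "*" else value // rhs  # integer division
    return value


def parse_factor(tokens):
    # factor := NUMBER | '(' expr ')'
    tok = tokens.pop(0)
    if tok == "(":
        value = parse_expr(tokens)
        tokens.pop(0)  # consume the closing ')'
        return value
    return int(tok)


def evaluate(src):
    """Interpret an arithmetic expression directly from its parse."""
    return parse_expr(tokenize(src))
```

Operator precedence and left associativity fall out of the grammar structure itself; a real compiler would build an intermediate representation here instead of evaluating on the fly.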

  • Competitive programming and contests (6 cfu)

    • The goal of the course is to improve the programming and problem solving skills of the students by confronting them with difficult problems and by presenting the techniques that help their reasoning in the implementation of correct and efficient solutions. The importance of these skills has been recognized by the most important software companies worldwide, which evaluate candidates in their job interviews mostly on their ability to address such difficult problems. A natural goal is to involve the students in the intellectual pleasure of programming and problem solving, also preparing them for the most important international online contests, such as TopCoder, HackerRank, CodeChef, Facebook Hacker Cup, Google Code Jam and so on, as well as for internships at the most important companies and their interviews. A desirable side-effect of the course could be to organize and prepare teams of students for the ACM International Collegiate Programming Contests. The course will give the opportunity to level students' backgrounds in algorithms and programming in view of the subsequent courses and will be central to getting them involved in the computing platforms of the future.
      - An official language for contests: C++ and its standard template library
      - Efficient code: programming, benchmarking and profiling
      - Real-world applications of sorting
      - Basic data structures: priority queues, search trees, and hash maps
      - Advanced data structures: union-find, Fenwick tree, interval trees, range-minima query
      - Basic string algorithms
      - Basic graph algorithms
      - Fast optimization with dynamic programming
      - Computational geometry
      Each topic of the above syllabus will be covered by
      - offering a quick recap of the related concepts from an introductory class on algorithms;
      - programming and engineering fast software solutions for real-life computational problems;
      - learning how to recognize their applicability through contests and experimentation.
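Among the advanced data structures listed above, the Fenwick (binary indexed) tree is a contest staple; as a quick illustration (in Python here, though the course's official contest language is C++), it supports point updates and prefix sums in O(log n) using the low-bit trick:

```python
class Fenwick:
    """Fenwick (binary indexed) tree over n elements, initially all zero.
    Both operations run in O(log n)."""

    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)   # 1-based internal indexing

    def add(self, i, delta):
        """Add delta to element i (0-based)."""
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)           # jump to the next covering node

    def prefix_sum(self, i):
        """Sum of elements 0..i (inclusive)."""
        i += 1
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & (-i)           # strip the lowest set bit
        return total
```

A range sum over [l, r] is then `prefix_sum(r) - prefix_sum(l - 1)`, which is exactly the operation behind many counting and inversion problems in contests.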

  • Principles for software composition (9 cfu)

    • This course introduces concepts and techniques in the study of advanced programming languages, as well as their formal logical underpinnings. The central theme is the view of individual programs and whole languages as mathematical entities about which precise claims may be made and proved. The course will cover the basic techniques for assigning meaning to programs with higher-order, concurrent and probabilistic features (e.g., domain theory, logical systems, well-founded induction, structural recursion, labelled transition systems, Markov chains, probabilistic reactive systems) and for proving their fundamental properties, such as termination, normalisation, determinacy, behavioural equivalence and logical equivalence. In particular, some emphasis will be placed on modularity and compositionality, in the sense of guaranteeing a property of the whole by proving simpler properties of its parts. The introduced concepts will be put into practice through experimentation with state-of-the-art tools.
      • Introduction and background [1 CFU]
      • Induction and recursion, partial orders, fixed points, lambda-notation [1 CFU]
      • Functional programming with Haskell and analysis of higher-order functional languages [1 CFU theory and 1 CFU exercises and experimentation]
      • Concurrent programming with Google Go and Erlang and analysis of concurrent and non-deterministic systems [2 CFU theory and 1 CFU exercises and experimentation]
      • Code orchestration with Orc and analysis of coordination languages [1 CFU theory and experimentation]
      • Models and analysis of probabilistic and stochastic systems [1 CFU theory and experimentation]

  • Algorithm design (9 cfu)

    • The course focuses on developing algorithmic design skills, exposing the students to complex problems that cannot be directly handled by standard libraries (being aware that several basic algorithms and data structures are already covered by the libraries of modern programming languages), thus requiring a significant effort in problem solving. These problems involve all basic data types, such as integers, strings, (geometric) points, trees and graphs, as a starting point, and the syllabus is structured to highlight the applicative situations in which the corresponding algorithms can be successfully applied. Brainstorming activities will be central to helping students learn from their mistakes. The level of detail of each topic can be adapted year by year to trending themes, and will be decided according to requests coming from other courses and/or specific issues arising in, possibly novel, applicative scenarios.
      - Exploring the algorithms behind standard libraries [2 CFU]
      - External-memory and cache-efficient algorithms [2 CFU]
      - Randomized algorithms [2 CFU]
      - Approximation algorithms and complexity [2 CFU]
      - Argument chosen from an emerging scenario [1 CFU]
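A classic example from the randomized-algorithms module is reservoir sampling: keeping a uniform sample of k items from a stream whose length is unknown in advance, using only O(k) memory. A minimal Python sketch (illustrative only, not prescribed course material) of Algorithm R:

```python
import random


def reservoir_sample(stream, k, seed=0):
    """Uniform random sample of k items from a stream of unknown length.
    After seeing item i (0-based), each item seen so far is in the
    reservoir with probability k / (i + 1)."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)     # item survives with prob k/(i+1)
            if j < k:
                sample[j] = item         # evict a random resident
    return sample
```

The invariant (every prefix item equally likely to be in the reservoir) is a short induction, which makes this a nice exercise in proving randomized algorithms correct, not just implementing them.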

  • 18 CFU to be chosen from the SW-1 group of related 9-CFU courses in the first year

    • Related 9-CFU courses of the SW curriculum offered in the first year
    • Computational mathematics for learning and data analysis (9 cfu)

      • The course introduces some of the main techniques for the solution of numerical problems that find widespread use in fields like data analysis, machine learning, and artificial intelligence. These techniques often combine concepts typical of numerical analysis with those proper of numerical optimization, since numerical analysis tools are essential to solve optimization problems and, vice versa, problems of numerical analysis can be solved by optimization algorithms. The course has a significant hands-on part whereby students learn how to use some of the most common tools for computational mathematics; during these sessions, specific applications will be briefly illustrated in fields like regression and parameter estimation in statistics, approximation and data fitting, machine learning, artificial intelligence, data mining, information retrieval, and others.
        - Multivariate and matrix calculus
        - Matrix factorization, decomposition and approximation
        - Eigenvalue computation
        - Nonlinear optimization: theory and algorithms
        - Least-squares problems and data fitting
        - MATLAB and other software tools (lab sessions with applications)
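As the simplest instance of the least-squares problems listed above, fitting a line y = a·x + b to data has a closed-form solution via the normal equations. A Python sketch for illustration (the course itself works with MATLAB and general matrix formulations):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b: the one-dimensional closed-form
    solution of the normal equations, expressed via centred sums."""
    n = len(xs)
    mx = sum(xs) / n                    # mean of x
    my = sum(ys) / n                    # mean of y
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx                       # slope
    b = my - a * mx                     # intercept
    return a, b
```

For many parameters the same idea becomes the matrix problem min ‖Ax − y‖², solved in practice with the factorizations (QR, SVD) covered in the syllabus rather than by forming the normal equations explicitly.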
    • Advanced programming (9 cfu)

      • The objectives of this course are:
        - to provide the students with a deep understanding of how high-level programming concepts and metaphors map into executable systems, and of their costs and limitations;
        - to acquaint the students with modern principles, techniques, and best practices of sophisticated software construction;
        - to introduce the students to techniques of programming at higher abstraction levels, in particular generative programming, component programming and web computing;
        - to present state-of-the-art frameworks incorporating these techniques.
        This course focuses on the quality issues pertaining to detailed design and coding, such as reliability, performance, adaptability and integrability into larger systems.
        - Programming language pragmatics
        - Run-time support and execution environments
        - Generic programming
        - Class libraries and frameworks
        - Generative programming
        - Language interoperability
        - Component-based programming
        - Web services
        - Web and application frameworks
        - Scripting languages
    • Advanced software engineering (9 cfu)

      • Objectives – The objective of the course is to introduce some of the main aspects of the design, analysis, development and deployment of modern software systems. Service-based and cloud-based systems are taken as references to present design, analysis and deployment techniques. DevOps practices are discussed, and in particular containerization is introduced. The course includes a hands-on lab where students will experiment weekly with the design, analysis, development and deployment techniques introduced.
        • Service-based software engineering (3 CFU)
          - core interoperability standards
          - software design by service composition, microservice architecture, examples of design patterns
          - business process modelling and analysis
          - service descriptions and service level agreements
        • DevOps practices (1.5 CFU)
          - DevOps toolchain, continuous delivery
          - Docker and containerization
        • Cloud-based software engineering (1.5 CFU)
          - service and deployment models
          - cross-cloud deployment and management of applications
        • Hands-on laboratory (3 CFU)
    • Smart applications (9 cfu)

      • The course aim is to explore methods and technologies for the development of smart connected applications, i.e. applications which exhibit intelligent behaviour -- through the use of artificial intelligence techniques introduced in other courses -- and that are deployed in immersive environments, including smart objects (as embodied by Internet of Things devices), mobile devices (smartphones, tablets), wearables (smartwatches, fitness trackers), home automation devices, web technologies, and cloud services and infrastructure. As such, applications considered for the course will include elements of context-awareness, sensor intelligence, and spoken-language interfaces. The course will be based around a single case study for a novel smart application; students will cooperate as a single team, under the leadership of the instructor, in the design and implementation of a complete solution. In addition to standard lectures, classroom activities will include workshop-like sessions, where alternative designs are discussed, decisions are taken, and tasks are assigned. Weekly homework on the various phases of the joint project will be assigned to the team, and results reviewed the following week. The final goal is the delivery of a fully-functioning prototype of a smart application addressing the initial problem.
        While the specific technologies adopted for each case study will vary based on needs and opportunities, the following general themes will be explored in lectures (examples of specific subjects are noted next to each theme):
        • Introduction to the course and to the case study
          o examples: a voice-activated ambient assistant to answer student queries about the logistics of lectures in a classroom building, or autonomous software for a robotic rover for exploring inaccessible environments
        • Common designs for smart applications
          o examples: fuzzy logic in control systems or cloud analysis of field sensor data streams
        • Make or buy: selecting appropriate procurement strategies
          o example: writing your own RNN architecture vs. using cloud services
        • Development platforms for smart objects
          o examples: Brillo (IoT devices) or Android TV (Smart TVs)
        • Development platforms for smart architectures
          o examples: TensorFlow (server-side RNNs), or the Face Recognition API (mobile)
        • Cloud services for smart applications
          o examples: Google Cloud Machine Learning API, Google Cloud Vision API, Google Cloud Speech API, or deploying deep neural networks on Microsoft Azure GPU VMs
        • Deployment and operations
          o examples: cloud hosting vs. device hosting, or harnessing user feedback to drive improvement
        • Measuring success: methods and metrics
          o examples: defining user engagement and satisfaction metrics, or assessing the naturalness of smart interactions
    • Machine learning (9 cfu)

      • We introduce the principles and the critical analysis of the main paradigms for learning from data, and their applications. The course provides the machine learning basis both for building new adaptive intelligent systems and for building powerful predictive models for intelligent data analysis.
        - Computational learning tasks for predictions, learning as function approximation, the concept of generalization
        - Linear models and Nearest-Neighbors (learning algorithms and properties, regularization)
        - Neural Networks (MLP and deep models, SOM)
        - Probabilistic graphical models
        - Principles of learning processes: elements of statistical learning theory, model validation
        - Support Vector Machines and kernel-based models
        - Introduction to applications and advanced models
        Applicative project: implementation and use of ML/NN models, with emphasis on the rigorous application of validation techniques.
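Model validation, stressed in the syllabus above, rests on splitting the data so that every example is used for validation exactly once. A minimal Python sketch of k-fold splitting (illustrative only; in practice one would use an established ML library):

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k disjoint validation folds,
    distributing any remainder over the first folds."""
    folds = []
    fold_size, rest = divmod(n, k)
    start = 0
    for i in range(k):
        size = fold_size + (1 if i < rest else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds


def cross_validate(n, k):
    """Yield (train, validation) index lists for each of the k folds."""
    folds = k_fold_indices(n, k)
    for i, val in enumerate(folds):
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, val
```

Averaging a model's score over the k validation folds gives the generalization estimate used for model selection; shuffling the indices first (omitted here) avoids bias from ordered data.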
    • Mobile and cyber-physical systems (9 cfu)

      • The course covers mobile and cyber-physical systems by providing an overview of issues, solutions, architectures, technologies and standards. It offers the students an overall, coherent view of the organization of Internet of Things (IoT) systems, from the networking and sensing levels up to the applications. Specifically, it shows how mobile, heterogeneous elements (from low-end sensors to high-end devices) form pervasive networks integrated in the internet, and how they interact among themselves and with the surrounding physical world. The course is organized in three parts. The first part (3 CFU) introduces the principles of wireless communications and network architectures for mobility management. The second part (4 CFU) presents the foundations of signal processing and sensing and discusses the applications of sensor networks. The third part (2 CFU) provides an overview of the main standards and platforms of the IoT.
        • Foundations of wireless technologies and mobility management (3 CFU): 5G mobile, ad hoc networks, mobile social networks, IEEE 802.x standards
        • Cyber-physical systems (4 CFU): foundations of signal processing, wireless sensor networks, energy harvesting, localization, elements of embedded programming
        • Internet of Things (2 CFU): ZigBee, Bluetooth, sensor network gateways, IoT platforms & standards (oneM2M, FIWARE, CoAP, MQTT)
    • Language-based technology for security (9 cfu)

      • Overview: Traditionally, computer security has been largely enforced at the level of operating systems. However, operating-system security policies are low-level (such as access control policies protecting particular files), while many attacks are high-level, or application-level (such as email worms that pass by access controls pretending to be executed on behalf of a mailer application). The key to defending against application-level attacks is application-level security. Because applications are typically specified and implemented in programming languages, this area is generally known as language-based security. A direct benefit of language-based security is the ability to naturally express security policies and enforcement mechanisms using the developed techniques of programming languages. The aim of the course is to allow each student to develop a solid understanding of application-level security, along with a more general familiarity with the range of research in the field. In-course discussion will highlight opportunities for cutting-edge research in each area. The course intends to provide a variety of powerful tools for addressing software security issues:
        - To obtain a deeper understanding of programming-language-based concepts for computer security.
        - To understand the design and implementation of security mechanisms.
        - To understand and move inside the research in the area of programming languages and security.
        Content: This course combines practical and cutting-edge research material. For the practical part, the dual perspective of attack vs. protection is threaded through the lectures, laboratory assignments, and projects. For the cutting-edge research part, the course's particular emphasis is on the use of formal models of program behaviour for specifying and enforcing security properties.
        Topics include:
        - Certifying compilers
        - Code obfuscation
        - In-lined reference monitors
        - Formal methods for security
        - Security in web applications
        - Information flow control
        Lab assignments and final examination: There are lab assignments, consisting of experimental activities on specific problems. To pass the course, students must pass the labs by presenting the assignments in class and by meeting the requirements on a written report that documents the activities done.
        Learning goals: After the course, students should be able to apply practical knowledge of security for modern programming languages. This includes the ability to identify application- and language-level security threats, to design and argue for application- and language-level security policies, and to design and argue for the security, clarity, usability, and efficiency of solutions, as well as to implement such solutions in expressive programming languages. Students should be able to demonstrate critical knowledge of the principles behind application-level attacks such as race conditions, buffer overruns, and code injections, and to master the principles behind language-based protection mechanisms such as static security analysis, program transformation, and reference monitoring.
    • Parallel and Distributed Systems: paradigms and models (9 cfu)

      • The course aims to provide a mix of foundational and advanced knowledge in the field of parallel computing, specifically targeted at high-performance applications. A first, relatively small part of the course provides the necessary background on parallel hardware, from multicores to accelerators, up to distributed systems such as clusters and clouds. The principles of parallel computing are then covered, including the measures characterizing parallel computations, the mechanisms and policies supporting parallel computing, and the models typical of high-performance computing. Finally, a survey of existing programming frameworks is included, aimed at preparing students to use and exploit the most modern and advanced frameworks currently in use both in research institutions and in production contexts. As a result, students attending the course will gain a general perspective on the area of parallel computing, as well as a comprehensive survey of the frameworks currently available for high-performance computing. The whole set of topics will be complemented by practical exercises, in class -- on a "bring-your-own-device" basis -- or as homework assignments, to be carried out also on machines made available by our department. The different programming frameworks used in the course will be introduced by detailing their main features and usage models, leaving to the student the task of learning the low-level syntactic details (under the supervision of the teachers) as part of the homework assignments.
        Contents
        a) Evolution of computing devices from sequential to parallel: introduction to multicore, many-core, accelerators, cluster and cloud architectures.
        b) Principles of parallel computing: measures of interest (time and power), horizontal and vertical scalability, communication/sharing and synchronization mechanisms, concurrent activities (processes, threads, kernels), vectorization, typical patterns for data-intensive parallel computing. Laboratory exercises and assignments using state-of-the-art parallel programming frameworks targeting shared-memory multicores.
        c) Advanced parallel and distributed processing frameworks for data-intensive applications: GPUs, data-stream processing and data-intensive programming frameworks. Laboratory exercises and assignments using state-of-the-art programming frameworks targeting distributed architectures or accelerators.
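The most basic of the parallel patterns mentioned in point b) is the data-parallel map: apply the same function independently to every element of a collection. A minimal Python sketch (purely illustrative; the course uses dedicated state-of-the-art frameworks, and a process pool rather than threads would suit CPU-bound work):

```python
from concurrent.futures import ThreadPoolExecutor


def parallel_map(func, items, workers=4):
    """Data-parallel 'map' pattern: a pool of workers applies func to
    every item; results are returned in the original order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))


squares = parallel_map(lambda x: x * x, range(8))
```

Because the per-element computations are independent, the ideal speedup is bounded by the number of workers; measuring how far real code falls short of that bound is exactly what the course's scalability measures capture.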
  • 12 CFU to be chosen from the SW-1 group of related 6-CFU courses in the first year

    • Related 6-CFU courses of the SW curriculum offered in the first year
    • Bioinformatics (6 cfu)

      • This course has the goal of giving the student an overview of the algorithmic methods that have been conceived for the analysis of genomic sequences, and of enabling them to critically observe the practical impact of algorithmic design on real problems with relevant applications. The exam, besides the obvious goal of evaluating the students' understanding of the course contents, is additionally meant as a chance to learn what a scientific paper looks like, and how to prepare an oral presentation on scientific/technical topics and design it for a specific audience.
        • A brief introduction to molecular biology
        • Sequence alignments
        • Pattern matching
        • Fragment assembly
        • Next-generation sequencing
        • Motif extraction
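The sequence-alignment topic above is built on dynamic programming; as a minimal illustration (not course material), here is the Levenshtein edit distance in Python, whose recurrence, once equipped with substitution scores and gap penalties, becomes global alignment in the Needleman-Wunsch style:

```python
def edit_distance(s, t):
    """Minimum number of insertions, deletions and substitutions
    turning s into t, by dynamic programming in O(len(s)*len(t))."""
    m, n = len(s), len(t)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                          # delete all of s[:i]
    for j in range(n + 1):
        dp[0][j] = j                          # insert all of t[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # match / substitution
    return dp[m][n]
```

Tracing back through the table recovers the alignment itself, which is the form actually used when comparing genomic sequences.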
    • Information Retrieval (6 cfu)

      • In this course we will study, design and analyze (theoretically and experimentally) software tools for IR applications dealing with unstructured (raw), structured (DB-centric) or semi-structured data (i.e. HTML, XML). We will mainly concentrate on the basic components of a modern Web search engine, examining in detail the algorithmic solutions currently adopted to implement its main software modules. We will also discuss their performance and/or computational limitations, as well as introduce measures for evaluating their efficiency and effectiveness. Finally, we will survey some algorithmic techniques that are frequently adopted in the design of IR tools managing large datasets.
        - Search engines
        - Crawling, text analysis, indexing, ranking
        - Storage of Web pages and the (hyper-)link graph
        - Results processing and visualization
        - Other data types: XML, textual DBs
        - Data processing for IR tools
        - Data streaming
        - Data sketching
        - Data compression
        - Data clustering (sketch)
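The central data structure behind the indexing module above is the inverted index: a map from each term to the sorted list of documents containing it. A minimal Python sketch (illustrative; real engines compress and merge postings far more cleverly) with a conjunctive boolean query:

```python
from collections import defaultdict


def build_index(docs):
    """Inverted index: term -> sorted list of the ids of documents
    containing that term (docs is a list of strings; id = position)."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in text.lower().split():
            index[term].add(doc_id)
    return {term: sorted(ids) for term, ids in index.items()}


def boolean_and(index, *terms):
    """Conjunctive query: ids of documents containing every term."""
    postings = [set(index.get(t, [])) for t in terms]
    return sorted(set.intersection(*postings)) if postings else []


docs = ["the quick brown fox", "the lazy dog", "quick dog"]
idx = build_index(docs)
```

Production engines keep postings sorted precisely so that this intersection can be done by merging (or skipping) rather than via hash sets, which is one of the algorithmic trade-offs the course analyses.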
    • Computational models for complex systems (6 cfu)

      • The objective of this course is to train experts in systems modelling and analysis methodologies. Of course, this requires understanding, to some degree of detail, the mathematical and computational techniques involved. However, this will be done with the aim of shaping good modellers, who know the advantages/disadvantages/risks of the different modelling and analysis methodologies, who are aware of what happens under the hood of a modelling and analysis tool, and who can develop their own tools if needed. The course will focus on advanced modelling approaches that combine different paradigms and analysis techniques: from ODEs to stochastic models, from simulation to model checking. Case studies from population dynamics, biochemistry, epidemiology, economics and the social sciences will be analysed. Moreover, synergistic approaches that combine computational modelling techniques with data-driven methodologies will be outlined.
        - Modelling with ODEs: examples
        - (Timed and) Hybrid Automata: definition and simulation techniques
        - Stochastic simulation methods (Gillespie's algorithm and its variants)
        - Hybrid simulation methods (stochastic/ODEs)
        - Rule-based modelling
        - Probabilistic/stochastic model checking: principles, applicability and tools
        - Statistical model checking
        - Process mining (basic notions)
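To give an idea of the stochastic simulation topic above, Gillespie's direct method for the simplest possible system, a single decay reaction, can be sketched in Python (illustrative only; the rate constant and fixed seed are arbitrary):

```python
import random

def gillespie_decay(n0, k, t_end, rng=random.Random(42)):
    """Gillespie's direct method for the single decay reaction A -> 0:
    at each step, draw an exponential waiting time from the current
    total propensity k*n, then fire the (only) reaction."""
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while n > 0:
        propensity = k * n
        t += rng.expovariate(propensity)   # exponential waiting time
        if t > t_end:
            break
        n -= 1                             # one molecule decays
        trajectory.append((t, n))
    return trajectory

traj = gillespie_decay(n0=100, k=1.0, t_end=5.0)
print(traj[0], traj[-1])  # starts at (0.0, 100); molecules decay over time
```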
    • ICT infrastructures (6 cfu)

      • The goal of the course is to introduce students to the computing infrastructures powering cloud services. At the end of the course a student should be able to understand the general organization of a datacenter and the logical infrastructure that powers virtualization and containers. The course starts from physical infrastructures, such as power delivery and datacenter organization. The network fabric is then introduced, with particular focus on SDN techniques used to balance East-West and North-South traffic. Storage and compute are then introduced, with special attention to hyperconverged systems.
        - Physical infrastructures (datacenters, energy and PUE, SCADAs) (1 CFU)
        - Networking (SDN and overlays, fabrics (RDMA, OPA, InfiniBand), monitoring techniques) (2 CFU)
        - Storage (SDS) (1 CFU)
        - Computing (hypervisors) (2 CFU)
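As an illustration of the PUE metric mentioned above: Power Usage Effectiveness is the ratio between the total power drawn by a facility and the power actually delivered to its IT equipment (the figures below are invented for the example):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power divided by the
    power actually delivered to IT equipment. 1.0 is the ideal minimum;
    the excess goes to cooling, power-distribution losses, etc."""
    return total_facility_kw / it_equipment_kw

# Invented figures: the site draws 1200 kW, of which 800 kW reach the racks.
print(pue(1200, 800))  # -> 1.5
```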
    • Foundation of computing (6 cfu)

      • The goal of the course is to present the mathematical foundations of some advanced models of computation. The contents can vary over the years. In the current instance, detailed below, the focus is on algebraic and categorical foundations of calculi for higher-order and concurrent computing. Future instances of the course will focus on the mathematical foundations of other computational models, such as Quantum Computing and Biologically Inspired Computational Mechanisms. No prerequisites are required except for some elementary knowledge of logic and algebra.
        - Simply typed lambda calculus
        - Curry-Howard isomorphism
        - PCF and its cpo model, with applications to functional programming languages
        - Elements of recursive and polymorphic types, with applications to object-oriented programming languages
        - Categories as partial algebras
        - Monoidal, cartesian and cartesian closed (CCC) categories
        - CCC as models of simply typed lambda calculus
        - Algebraic specifications, categories of models and adjunctions
        - Petri nets and their (strictly) symmetric monoidal models
        - Labelled Transition Systems (LTS) as coalgebras
        - The Calculus of Communicating Systems (CCS) and its bialgebraic models
        - The Pi-Calculus and its presheaf coalgebraic models
    • Algorithmic Game Theory (6 cfu)

      • Description: The course aims at introducing the main paradigms of game theory from an algorithmic perspective, in order to analyse the behaviour of multi-agent systems and phenomena. Suitable mathematical models and tools are provided to help understand the inner mechanisms of competition and cooperation, threats and promises. A critical analysis of the power and limitations of game theory is addressed as well, together with a few applications to problems from different areas (such as economics, computer science, engineering). Knowledge: The course provides the main theoretical concepts of cooperative and noncooperative games, together with the main algorithms for their analysis. The theoretical analysis is paired with applications to problems from a wide range of different areas, chosen according to the students' interests (e.g., ranging from computational economics and the social sciences to traffic, ICT and social networks, energy management, blockchain technologies, security, pattern recognition and machine learning). Skills: The course aims at providing students with a suitable background to
        - formulate and analyse phenomena and systems with interactions between multiple agents/decision-makers
        - understand the inner mechanisms of competition and cooperation
        - understand the inner mechanisms of threats and promises
        - forecast the behaviour of agents
        - design mechanisms to steer systems towards desired objectives through adequate mathematical models.
        Syllabus:
        - Noncooperative games
        - Auctions and bargaining
        - Cooperative games
        - Game theory in practice
        - Applications (computer science, economic computation et al.)
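To illustrate the central notion of equilibrium in noncooperative games, the pure-strategy Nash equilibria of a small two-player game can be found by exhaustive enumeration (a toy sketch using the standard prisoner's-dilemma payoffs, not course material):

```python
from itertools import product

def pure_nash_equilibria(payoff_a, payoff_b):
    """Enumerate pure-strategy Nash equilibria of a two-player game given
    as payoff matrices (rows: player A's strategies, columns: player B's).
    A cell is an equilibrium when neither player gains by deviating alone."""
    n_rows, n_cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i, j in product(range(n_rows), range(n_cols)):
        best_for_a = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n_rows))
        best_for_b = all(payoff_b[i][j] >= payoff_b[i][k] for k in range(n_cols))
        if best_for_a and best_for_b:
            equilibria.append((i, j))
    return equilibria

# Prisoner's dilemma (strategy 0 = cooperate, 1 = defect): mutual defection
# is the unique pure equilibrium even though mutual cooperation pays more.
A = [[3, 0], [5, 1]]
B = [[3, 5], [0, 1]]
print(pure_nash_equilibria(A, B))  # -> [(1, 1)]
```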
    • Laboratory on ICT Startup Building (6 cfu)

      • Description: The purpose of this laboratory course is to introduce Master students in Computer Science to the entrepreneurial mindset and give them preliminary training in it. The course is organised as a series of seminars and an intensive hands-on activity focused on building a simple startup project. Teachers will come from academia, venture capital and startups. Students attending the course do not need to have a startup idea: they will take part in a “startup building process”, meaning that they will learn and practice all the main steps that shape a (possibly simple) “ICT technical idea” into a viable product, be it that of a startup or of a corporate project. Students, working in groups, will eventually reach the stage of pitching the startup project in front of a seed venture capitalist or drafting a project proposal for seed funding. The course hinges on frontal lectures on the basic principles and methodologies underlying innovation, combined with a learning-by-doing experience. Skills: Entrepreneurial mindset, pills of innovation methodologies and IP protection, how to bring out the value of an ICT idea. Syllabus:
        - What a company is and what its purpose is
        - Pills of IP protection
        - ICT company structure and roles
        - B2B vs B2C
        - Value Proposition Design for ICT
        - ICT team management
        - From idea to startup, the journey
        - How to found a startup in less than 10 hours
        - Fundraising and spending: why you need the money and how you should spend it
    • Introduction to Quantum Computing (6 cfu)

      • Description: This course provides an introduction to, and practical experience with, the emerging field of quantum computing. We focus on the fundamental physical principles and the necessary mathematical background, and provide detailed discussions of the most important quantum algorithms. Some recent applications in machine learning and network analysis will also be presented. In addition, students will learn to use specific software packages and tools that allow them to implement quantum algorithms on actual cloud-based physical quantum computers. Skills: The course aims at providing students with a suitable background to understand quantum computational reasoning and to design/analyse quantum algorithms for various application fields. The algorithms will be run both on simulators and on real prototypes of quantum machines. Prerequisites: Linear algebra, basic concepts of Numerical Analysis, Theory of Algorithms. Syllabus:
        - Fundamental Concepts: basic mathematical tools (complex numbers, Hilbert spaces, tensor product properties, unitary matrices, Dirac's bracket notation); qubits, quantum gates, and circuits; superposition and entanglement
        - Fundamental Algorithms: teleportation; Grover's quantum search algorithm; Quantum Fourier Transform; Shor's integer factorization algorithm
        - Recent applications: quantum data preparation and QRAM; examples of Quantum Machine Learning algorithms; quantum random walks; quantum PageRank
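To give a feel for the qubit/gate formalism listed above, a single-qubit Hadamard gate applied to |0⟩ can be simulated directly on the statevector (a pure-Python sketch for illustration; actual course work would use a proper quantum SDK):

```python
import math

def apply_gate(gate, state):
    """Apply a single-qubit gate (a 2x2 matrix) to a statevector [a, b]."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: sends |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

zero = [1.0, 0.0]                                # the basis state |0>
plus = apply_gate(H, zero)                       # superposition state
probs = [round(abs(a) ** 2, 10) for a in plus]   # Born-rule probabilities
print(probs)                                     # -> [0.5, 0.5]
back = apply_gate(H, plus)   # H is its own inverse: applying it twice restores |0>
```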
  • Second year

  • Thesis (24 cfu)


  • Laboratory for innovative software (6 cfu)

    • Practical development of software requires an understanding of successful methods for bridging the gap between a problem to be solved and a working reliable software system. This course will train the student to develop large software systems working in real projects by exploiting the techniques and the skills acquired in the fundamental courses of the curriculum.
      The main novelty of the course is the attempt to balance traditional lectures and experimental activities with technical meetings with software architects of innovative software enterprises. During the course students will face and deal with the up-to-date issues of software design, implementation and testing of real projects. In this way students will also learn how to actively inspect software solutions.
      Each time the course is offered, the design and implementation of a new innovative software artifact will be addressed; however, the main underlying theme will always be building reliable code. To this purpose the course experiments with modern techniques for making software more robust. These techniques include, but are not limited to:
      - Ad hoc static code analyses and tools.
      - Model checkers.
      - Code verification.
      - Machine learning techniques applied to code analysis.
      - Undefined behavior detectors.
      - Testing frameworks.
      - Language-based security frameworks.
      By the end of this class, students will have developed skills in three distinct competency areas:
      Reliable coding:
      Writing code that is well organized at a high level; exploiting the best programming language features appropriately and avoiding troublesome ones; applying sophisticated idioms to structure code elegantly; using innovative toolkits to check program properties including automatable unit tests in the code base; preventing security attacks.
      Design:
      Analyzing problems to understand what the tricky aspects are; identifying key design issues, and analyzing their tradeoffs; selecting features for a minimal viable product.
      Professionalism:
      Constructing and delivering presentations of the deployed software; collaborating with team members; making constructive critiques.


  • Software validation and verification (9 cfu)

    • The goal of the course is to introduce techniques for verifying and validating software properties, either by analysing a model extracted from a program with model checking, by testing the software before (the next) deployment, or by equipping the running software with tools that monitor its execution.
      - Specifying software properties [2 CFU]
      o Assertions
      o Invariants, Safety and Liveness Properties, Fairness
      o Temporal logics: LTL, CTL, CTL*
      - Model Checking [4 CFU]
      o Transition systems and Program graphs
      o Checking regular safety properties
      o Checking omega regular properties with Büchi automata
      o Overview of Promela-SPIN and SMV
      o Extracting models from Java source code: BANDERA
      - Testing [3 CFU]
      o Coverage criteria and metrics: statement, function, branch, path and data-flow coverage
      o Test cases selection, prioritization and minimization
      o Automatic generation of test cases
      Topics to be chosen among:
      o Component-based and Service-oriented system testing
      o Object-oriented testing and Junit
      o Access Control Systems testing
      o Performance and other non-functional aspects testing
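To illustrate the connection between the model-checking topics above, checking a regular safety property on a finite transition system reduces to searching for a reachable “bad” state, e.g. by breadth-first search (the toy mutual-exclusion model below is invented for illustration):

```python
from collections import deque

def violates_safety(initial, transitions, is_bad):
    """Explore a finite transition system breadth-first and return a
    counterexample path to the first reachable bad state, or None if
    the safety property holds in every reachable state."""
    queue = deque([(initial, [initial])])
    visited = {initial}
    while queue:
        state, path = queue.popleft()
        if is_bad(state):
            return path                    # shortest counterexample
        for nxt in transitions.get(state, ()):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# Toy mutual-exclusion model: states encode which processes are in the
# critical section; "cc" (both inside) would violate the safety property.
transitions = {
    "nn": ["cn", "nc"],
    "cn": ["nn"],
    "nc": ["nn"],
}
print(violates_safety("nn", transitions, lambda s: s == "cc"))  # -> None
```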

  • Free choice (9 cfu)

    • Free choice exam to be approved by the Academic Board
  • 9 cfu to be chosen from the SW-2 group of related 9-cfu courses in the second year

    • Related 9-cfu courses of the SW curriculum offered in the second year
    • Computational mathematics for learning and data analysis (9 cfu)

      • The course introduces some of the main techniques for the solution of numerical problems that find widespread use in fields like data analysis, machine learning, and artificial intelligence. These techniques often combine concepts typical of numerical analysis with those proper of numerical optimization, since numerical analysis tools are essential to solve optimization problems and, vice versa, problems of numerical analysis can be solved by optimization algorithms. The course has a significant hands-on part whereby students learn how to use some of the most common tools for computational mathematics; during these sessions, specific applications will be briefly illustrated in fields like regression and parameter estimation in statistics, approximation and data fitting, machine learning, artificial intelligence, data mining, information retrieval, and others.
        - Multivariate and matrix calculus
        - Matrix factorization, decomposition and approximation
        - Eigenvalue computation
        - Nonlinear optimization: theory and algorithms
        - Least-squares problems and data fitting
        - MATLAB and other software tools (lab sessions with applications)
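As a small illustration of the least-squares topic above, the one-variable case admits a closed-form solution of the normal equations (an illustrative sketch; the course uses MATLAB-class tools for the general multivariate case):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x: the closed-form
    normal-equation solution for the one-variable case."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    b = sxy / sxx                 # slope
    a = mean_y - b * mean_x       # intercept
    return a, b

# Noise-free data on the line y = 1 + 2x recovers the coefficients exactly.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # -> 1.0 2.0
```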
    • Advanced programming (9 cfu)

      • The objectives of this course are: to provide the students with a deep understanding of how high-level programming concepts and metaphors map into executable systems, and of their costs and limitations; to acquaint the students with modern principles, techniques, and best practices of sophisticated software construction; to introduce the students to techniques of programming at higher abstraction levels, in particular generative programming, component programming and web computing; and to present state-of-the-art frameworks incorporating these techniques. This course focuses on the quality issues pertaining to detailed design and coding, such as reliability, performance, adaptability and integrability into larger systems.
        - Programming Language Pragmatics
        - Run-Time Support and Execution Environments
        - Generic Programming
        - Class Libraries and Frameworks
        - Generative Programming
        - Language Interoperability
        - Component-Based Programming
        - Web Services
        - Web and Application Frameworks
        - Scripting Languages
    • Advanced software engineering (9 cfu)

      • Objectives – The objective of the course is to introduce some of the main aspects of the design, analysis, development and deployment of modern software systems. Service-based and cloud-based systems are taken as references to present design, analysis and deployment techniques. DevOps practices are discussed, and in particular containerization is introduced. The course includes a hands-on lab where students experiment weekly with the design, analysis, development and deployment techniques introduced.
        • Service-based software engineering (3 CFU)
        - core interoperability standards
        - software design by service composition, microservice architecture, examples of design patterns
        - business process modelling and analysis
        - service descriptions and service level agreements
        • DevOps practices (1.5 CFU)
        - DevOps toolchain, continuous delivery
        - Docker and containerization
        • Cloud-based software engineering (1.5 CFU)
        - service and deployment models
        - cross-cloud deployment and management of applications
        • Hands-on laboratory (3 CFU)
    • Smart applications (9 cfu)

      • The aim of the course is to explore methods and technologies for the development of smart connected applications, i.e. applications which exhibit intelligent behaviour -- through the use of artificial intelligence techniques introduced in other courses -- and that are deployed in immersive environments, including smart objects (as embodied by Internet of Things devices), mobile devices (smartphones, tablets), wearables (smartwatches, fitness trackers), home automation devices, web technologies, and cloud services and infrastructure. As such, applications considered for the course will include elements of context-awareness, sensor intelligence and spoken-language interfaces. The course will be based around a single case study for a novel smart application; students will cooperate as a single team, under the leadership of the instructor, in the design and implementation of a complete solution. In addition to standard lectures, classroom activities will include workshop-like sessions, where alternative designs are discussed, decisions are taken, and tasks are assigned. Weekly homework on the various phases of the joint project will be assigned to the team, and results reviewed the following week. The final goal is the delivery of a fully-functioning prototype of a smart application addressing the initial problem.
        While the specific technologies adopted for each case study will vary based on needs and opportunities, the following general themes will be explored in lectures (examples of specific subjects are noted next to each theme):
        • Introduction to the course and to the case study
        o examples: a voice-activated ambient assistant to answer student queries about the logistics of lectures in a classroom building, or autonomous software for a robotic rover for exploring inaccessible environments
        • Common designs for smart applications
        o examples: fuzzy logic in control systems or cloud analysis of field sensor data streams
        • Make or buy: selecting appropriate procurement strategies
        o example: writing your own RNN architecture vs. using cloud services
        • Development platforms for smart objects
        o examples: Brillo (IoT devices) or Android TV (Smart TVs)
        • Development platforms for smart architectures
        o examples: TensorFlow (server-side RNNs), or the Face Recognition API (mobile)
        • Cloud services for smart applications
        o examples: Google Cloud Machine Learning API, Google Cloud Vision API, Google Cloud Speech API, or deploying Deep Neural Networks on Microsoft Azure GPU VMs
        • Deployment and operations
        o examples: cloud hosting vs. device hosting, or harnessing user feedback to drive improvement
        • Measuring success: methods and metrics
        o examples: defining user engagement and satisfaction metrics, or assessing the naturalness of smart interactions
    • Machine learning (9 cfu)

      • We introduce the principles and the critical analysis of the main paradigms for learning from data and their applications. The course provides the Machine Learning basis both for building new adaptive Intelligent Systems and for building powerful predictive models for intelligent data analysis.
        - Computational learning tasks for prediction, learning as function approximation, the concept of generalization
        - Linear models and Nearest-Neighbours (learning algorithms and properties, regularization)
        - Neural Networks (MLP and deep models, SOM)
        - Probabilistic graphical models
        - Principles of learning processes: elements of statistical learning theory, model validation
        - Support Vector Machines and kernel-based models
        - Introduction to applications and advanced models
        Applicative project: implementation and use of ML/NN models, with emphasis on the rigorous application of validation techniques.
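To illustrate the Nearest-Neighbours paradigm listed above, a 1-NN classifier fits in a few lines of Python (the tiny 2-D training set is invented for illustration):

```python
def nearest_neighbor_predict(train, query):
    """1-nearest-neighbour classifier: return the label of the training
    point closest to the query in squared Euclidean distance."""
    def sq_dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    _, label = min(train, key=lambda item: sq_dist(item[0], query))
    return label

# Hypothetical 2-D training set with two classes.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]
print(nearest_neighbor_predict(train, (0.2, 0.1)))  # prints A
```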
    • Mobile and cyber-physical systems (9 cfu)

      • The course covers mobile and cyber-physical systems by providing an overview of issues, solutions, architectures, technologies and standards. It offers the students an overall, coherent view of the organization of Internet of Things (IoT) systems, from the networking and sensing levels up to the applications. Specifically, it shows how mobile, heterogeneous elements (from low-end sensors to high-end devices) form pervasive networks integrated in the Internet, and how they interact among themselves and with the surrounding physical world. The course is organized in three parts. The first part (3 CFU) introduces the principles of wireless communications and network architectures for mobility management. The second part (4 CFU) presents the foundations of signal processing and sensing and discusses the applications of sensor networks. The third part (2 CFU) provides an overview of the main standards and platforms of the IoT.
        • Foundations of wireless technologies and mobility management (3 CFU): 5G mobile, ad hoc networks, mobile social networks, IEEE 802.x standards
        • Cyber-physical systems (4 CFU): foundations of signal processing, wireless sensor networks, energy harvesting, localization, elements of embedded programming
        • Internet of Things (2 CFU): ZigBee, Bluetooth, sensor network gateways, IoT platforms & standards (OneM2M, FIWARE, CoAP, MQTT)
    • Language-based technology for security (9 cfu)

      • Overview: Traditionally, computer security has been largely enforced at the level of operating systems. However, operating-system security policies are low-level (such as access control policies protecting particular files), while many attacks are high-level, or application-level (such as email worms that pass by access controls by pretending to be executed on behalf of a mailer application). The key to defending against application-level attacks is application-level security. Because applications are typically specified and implemented in programming languages, this area is generally known as language-based security. A direct benefit of language-based security is the ability to naturally express security policies and enforcement mechanisms using the developed techniques of programming languages. The aim of the course is to allow each student to develop a solid understanding of application-level security, along with a more general familiarity with the range of research in the field. In-course discussion will highlight opportunities for cutting-edge research in each area. The course intends to provide a variety of powerful tools for addressing software security issues:
        - to obtain a deeper understanding of programming language-based concepts for computer security;
        - to understand the design and implementation of security mechanisms;
        - to understand and move within the research in the area of programming languages and security.
        Content: This course combines practical and cutting-edge research material. For the practical part, the dual perspective of attack vs. protection is threaded through the lectures, laboratory assignments, and projects. For the cutting-edge research part, the course's particular emphasis is on the use of formal models of program behaviour for specifying and enforcing security properties.
        Topics include:
        - Certifying Compilers
        - Code obfuscation
        - In-lined Reference Monitors
        - Formal Methods for security
        - Security in web applications
        - Information Flow Control
        Lab assignments and final examination: There are lab assignments, consisting of experimental activities on specific problems. To pass the course, students must pass the labs by presenting the assignments in class and meeting the requirements on a written report that documents the activities carried out.
        Learning Goals: After the course, students should be able to apply practical knowledge of security for modern programming languages. This includes the ability to identify application- and language-level security threats, to design and argue for application- and language-level security policies, and to design and argue for the security, clarity, usability, and efficiency of solutions, as well as to implement such solutions in expressive programming languages. Students should be able to demonstrate critical knowledge of the principles behind application-level attacks such as race conditions, buffer overruns, and code injections, and to master the principles behind language-based protection mechanisms such as static security analysis, program transformation, and reference monitoring.
    • Parallel and Distributed Systems: paradigms and models (9 cfu)

      • See the description of this course under the first-year BDT curriculum above.
