Research at IvI

Our research institute is organised into individual Research Groups:

The Amsterdam Machine Learning Lab (AMLab) conducts research on large-scale modelling of complex data sources. This includes the development of new methods for probabilistic graphical models and non-parametric Bayesian models, faster (approximate) inference and learning methods, deep learning, causal inference, reinforcement learning and multi-agent systems, and the application of all of the above to large-scale data domains in science and industry (‘Big Data’ problems).

AMLab is co-directed by Max Welling and Joris Mooij. Other faculty in AMLab include Ben Kröse (professor at the Hogeschool Amsterdam), who does research in ambient robotics; Dariu Gavrila (Daimler), known for his research in human-aware intelligence; and Zeynep Akata (scientific co-director of Delta Lab, co-affiliated with the Max Planck Institute for Informatics), who does research on machine learning at the intersection of vision and language.

Marketplaces to share data in a trustworthy and transparent way.

The Complex Cyber Infrastructure (CCI) group is part of the Informatics Institute at the University of Amsterdam. CCI focuses on the complexity of man-made systems at all scales. The scale can be small, such as the devices you carry with you, the apps they run, or the communication protocols these apps use to interact. It can also be large, as in systems such as data centres or multi-domain networks.

The complexity of these systems is caused by the fact that more and more cyber infrastructure – e.g. routers, switches, the cloud – is reprogrammable nowadays. This offers many possibilities, but it also makes the equipment more difficult to operate and less transparent. A further source of complexity is mapping into computational terms the data-sharing requirements that are defined at the societal level through legislation, organisational policies, private data-sharing agreements, and consents.

The Computational Science Lab, led by Alfons Hoekstra, tries to understand how information is processed in natural settings through the study of a large variety of dynamic multi-scale complex systems with a focus on – but not limited to – biomedicine.

We study this ‘natural information processing’ in complex systems by computational modelling and simulation. An example is the spread of HIV: many processes on a large range of spatiotemporal scales play a role, from the molecular scale (e.g. the details of how the virus enters a cell) to the organism level (the sequence of events leading from an initial infection to the development of AIDS, and the medication that keeps the infection under control), and even to the population level (the actual spread of the virus).

We rely on a variety of modelling approaches (such as agent-based models, cellular automata, dynamic complex networks, particle methods, and models based on differential equations), on multiscale modelling methods that capture the transmission and transformation of information up and down the scales, on formal methods (theories of natural information processing), and on Problem Solving Environments (workflows, visualisation, multiscale coupling libraries and e-science infrastructures for distributed multiscale computing).
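
As a purely illustrative flavour of the kind of models listed above, the sketch below (in Python) implements a one-dimensional cellular automaton, elementary rule 110. It is not one of the lab's actual models; it only shows how simple local update rules, applied in parallel, give rise to non-trivial global dynamics, the sense in which such systems ‘process information’.

    # Purely illustrative: a one-dimensional cellular automaton (elementary rule 110).
    # Simple local rules, applied in parallel, produce non-trivial global dynamics.
    def step(cells, rule=110):
        n = len(cells)
        nxt = []
        for i in range(n):
            left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
            pattern = (left << 2) | (centre << 1) | right  # 3-cell neighbourhood as a 3-bit index
            nxt.append((rule >> pattern) & 1)              # look up the next state in the rule number
        return nxt

    cells = [0] * 40 + [1]                                 # start from a single live cell
    for _ in range(20):
        print("".join(".#"[c] for c in cells))
        cells = step(cells)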

The mission of the Computer Vision research group is to study core computer vision technologies, in particular colour processing, 3D reconstruction, object recognition, and human-behaviour analysis.

The aim is to provide theories, representation models and computational methods that are essential for image and video understanding. Research ranges from image processing (filtering, feature extraction, reflection modelling, and photometry), invariants (colour, descriptors, scene), image understanding (physics-based, probabilistic) and object recognition (classification and detection) to activity recognition with a focus on human behaviour (eye tracking, facial expression, head pose, age and gender).
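
To make the low-level end of this range concrete, the following sketch (in Python, using numpy) shows a basic filtering and feature-extraction step: convolving an image with a Sobel kernel so that vertical edges give strong responses. It is a generic textbook illustration, not the group's code; the image and kernel values are toy assumptions.

    # Generic illustration of low-level image filtering / feature extraction.
    # Not the group's code; the image and kernel values are toy examples.
    import numpy as np

    def convolve2d(image, kernel):
        kh, kw = kernel.shape
        out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
        for y in range(out.shape[0]):
            for x in range(out.shape[1]):
                out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
        return out

    sobel_x = np.array([[-1, 0, 1],
                        [-2, 0, 2],
                        [-1, 0, 1]], dtype=float)  # horizontal-gradient (Sobel) kernel
    image = np.random.rand(8, 8)                   # toy 8x8 grey-scale image
    edges = convolve2d(image, sobel_x)             # strong responses at vertical edges
    print(edges.shape)                             # (6, 6)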

What are the mathematical properties of information? How can we describe how information flows between humans or computers? Questions such as these lie at the heart of the research conducted at the Institute for Logic, Language and Computation (ILLC), a world-class research institute in the interdisciplinary area between mathematics, linguistics, computer science, philosophy and artificial intelligence, which is rooted in the Amsterdam logic research tradition dating back to the early twentieth century.

Research at the Institute for Logic, Language and Computation is organized into three core programmes. Logic & Language includes much of the research on formal semantics, pragmatics, and the philosophy of language. Language & Computation houses the ILLC’s research in computational linguistics, computational musicology and information retrieval. Logic & Computation is home to our research into epistemic and mathematical logic, social choice, and the foundations of mathematics.

Below are short descriptions of these programmes. Additionally, we have chosen two key research themes that span the three programmes: Logic & Game Theory and Cognitive Modelling.

Led by Maarten de Rijke, the Information and Language Processing Systems research group combines research on information retrieval, language technology, semi-structured data and result presentation in order to identify semantically meaningful information in large volumes of online content.

Our research is aimed at intelligent information access, especially in the face of massive amounts of information. We work on finding and analyzing content (information retrieval, machine translation, language technology), the analysis of structural information (social networks, linked data) and the analysis of user behavior (self-learning search, log analysis, user studies).

We combine fundamental, experimental and applied research, and we do so using a broad range of textual data: data from the web or from enterprises, edited or user-generated, or obtained from (automatic) transcriptions of audio or video. We are involved in a large number of projects with other groups, both within and outside academia. Our research is funded by NWO, KNAW, the EU and through a range of public-private partnerships.

The Intelligent Data Engineering Lab (INDElab), led by Prof. Paul Groth, investigates intelligent systems that support people in their work with data and information from diverse sources. This includes addressing problems related to the preparation, management, integration and reuse of data.

We perform both applied and fundamental research informed by empirical insights into data science practice. Topics of interest include: data supply chains, data provenance, transparency, information integration, automated knowledge base population, knowledge graph construction, and data semantics.
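
As a minimal, hypothetical illustration of two of these topics, knowledge graph construction and data provenance, the sketch below (in Python) represents facts as plain subject-predicate-object triples, each carrying a provenance label, and filters them by source. The entities and sources are invented for the example; this is not INDElab data or tooling.

    # Minimal, hypothetical illustration of knowledge-graph-style triples with provenance.
    # Entities and sources are invented; this is not INDElab data or tooling.
    triples = [
        # (subject,     predicate,   object,        provenance)
        ("amsterdam",   "locatedIn", "netherlands", "source:gazetteer"),
        ("ivi",         "partOf",    "uva",         "source:website"),
        ("uva",         "locatedIn", "amsterdam",   "source:website"),
    ]

    def facts_from(source):
        """Return all (subject, predicate, object) facts recorded with the given provenance."""
        return [(s, p, o) for (s, p, o, prov) in triples if prov == source]

    print(facts_from("source:website"))
    # [('ivi', 'partOf', 'uva'), ('uva', 'locatedIn', 'amsterdam')]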

The Intelligent Sensory Information Systems research group, led by Cees Snoek, considers visual information in its many aspects. Research spans the perceptual and cognitive processes involved in understanding visual information, methods for automatically learning and indexing the semantics of visual information, methods for the presentation of and interaction with large visual collections, and computer vision methods for human-computer interaction.

The world is full of digital images and videos. In this deluge of visual information, the grand challenge is to unlock its content. This quest is the central research aim of the Intelligent Sensory Information Systems group. We address the complete knowledge chain of image and video retrieval by machine and human.

Topics of study are semantic understanding, image and video mining, interactive picture analytics, and scalability. Our research strives for automation that matches human visual cognition, interaction that surpasses man and machine intelligence, visualization that blends it all into interfaces giving instant insight, and database architectures for extremely large visual collections.

Our research culminates in state-of-the-art image and video search engines which we evaluate in leading benchmarks, often as the best performer, in user studies, and in challenging applications.

Multiscale systems that make a difference.

The Multiscale Networked Systems (MNS) group is part of the Informatics Institute at the University of Amsterdam. The group focuses its research on multiscale systems, e.g. cloud systems or clusters that define themselves by their dynamic size and scale, and on the network connecting them. The MNS group explores emerging architectures that can support new applications across the future internet.

The predominant questions that the group tries to answer are: how can these distributed systems work as efficiently as possible, and how do these systems need to evolve to satisfy constantly changing application requirements?

Extra-functional behaviour of computer systems in full glory.

The Parallel Computing Systems (PCS) group is part of the Informatics Institute at the University of Amsterdam. It is the foremost research group in the Netherlands in the field of system optimization of multi-core and multi-processor computer systems. The PCS group looks at system performance, power/energy consumption, reliability, security & safety, but also at the productivity with which these systems can be designed and programmed: the extra-functional behaviour of computer systems in full glory.

The top research of the PCS group is indispensable for developments in, for example, Artificial Intelligence. To cope with increasingly demanding computations in computer science, it is essential that computer systems become faster and more efficient. Without the skills of computer-systems researchers, AI, among other fields, would certainly not be where it is today.

The Theory of Computer Science group, led by Alban Ponse, is concerned with the development of theoretical foundations of computer science, based on logic and mathematics.

The aim is to seek greater understanding of fundamental computational techniques and their inherent limitations. The emphasis is not only on the abstract aspects of computing, but also on the application of theory in the field of computer science.

The focus is on developing theory and tools in the field of algebraic specification, which can be used to specify, analyse, and verify concurrent communicating and programmed systems.