The ISISLab laboratory mainly works on three research topics, which are summarized below (and then described in detail), with reference to the 2012 ACM Computing Classification System for placement.

  • Distributed and Parallel Computations
    • Cloud and Edge Computing
    • Massive simulations
    • Parallel algorithms and architectures
  • Collaborative and Social Computing
    • Co-creation of Open Data
    • Collaborative design
    • Synchronous and co-located collaboration
    • Privacy
    • Adaptive and customized systems
  • Visualization and Interaction
    • Scalable visualization
    • Virtual Reality Applications
    • Innovative interfaces

Distributed and Parallel Computations

The evolution of computer science over the last three decades has been characterized by an architectural shift from the centralized computing paradigm to distributed and parallel architectures, where data processing and storage are carried out cooperatively by different nodes interconnected by a network.

The laboratory has followed the evolution of this field: its research topics have moved, on the one hand, towards computing on the cloud and on the edge and, on the other, towards the execution of massive agent-based simulations and the efficient execution of algorithms. Making computations more efficient by using multiple processing units is one of the great themes of computer science research, one that has involved, over the history of the discipline, research on hardware architectures, on software (computing infrastructures, programming languages, etc.), on algorithms, and, finally, on computation models and performance evaluation.

It is in this area that ISISLab's research has been carried out, addressing various issues that range from recent studies on the efficient use of Cloud and Edge computing, to execution environments for massive agent-based simulations, to the study of parallel and distributed architectures and algorithms.

Cloud and Edge Computing

Research first focused on the opportunities offered by so-called Edge Computing, that is, the infrastructure that sits between clients and service providers.
Since the first introduction and spread of the World Wide Web, the laboratory has produced research on Web intermediaries, that is, programmable HTTP proxies, which can be a useful tool for building complex services on top of existing Web services.

The scalability of these systems was studied and successfully addressed, and various systems were developed and tested to improve Web accessibility for visually impaired users and to support browsing on the first available mobile devices.
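
The idea can be sketched in a few lines: a minimal programmable HTTP proxy (a hypothetical illustration in Python, not the laboratory's software) that rewrites pages in transit, the kind of hook such an intermediary offers for adapting content, e.g., for accessibility or early mobile browsers:

```python
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def transform(body: bytes, content_type: str) -> bytes:
    """Programmable hook: rewrite the page in transit.

    A real intermediary could simplify markup for screen readers
    or reformat pages for small mobile displays; here we just tag the page."""
    if "text/html" in content_type:
        return body.replace(b"<body>", b"<body><p>[via proxy]</p>", 1)
    return body

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # In a forward HTTP proxy, the request line carries the absolute URL.
        with urllib.request.urlopen(self.path) as upstream:
            ctype = upstream.headers.get("Content-Type", "")
            body = transform(upstream.read(), ctype)
        self.send_response(200)
        self.send_header("Content-Type", ctype or "application/octet-stream")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Point a browser's HTTP proxy setting at 127.0.0.1:8080 (plain HTTP only).
    ThreadingHTTPServer(("127.0.0.1", 8080), ProxyHandler).serve_forever()
```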

With the advent and spread of the Cloud, which has shown itself to offer an unparalleled performance/price ratio for scientific computing, research interest has moved in this direction.

In particular, optimization systems were developed to run on the Cloud (simulation optimization and exploration frameworks in the Cloud) that are extremely simple to use (zero configuration) and highly effective and efficient.

Massive simulations

A particularly useful instrument for scientific research is agent-based simulation, which aims to recreate complex phenomena that emerge from the simulation of many agents exhibiting very simple behaviors. This computational technique of scientific investigation is useful in very diverse contexts, from physics, chemistry and biology up to the so-called soft sciences, such as psychology, sociology, pedagogy and jurisprudence.
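
To give a flavor of the technique, the following minimal sketch (an illustrative classic, not the laboratory's code) implements Schelling's segregation model, where a striking global pattern, segregated neighborhoods, emerges from one very simple individual rule:

```python
import random

# Illustrative classic, not the laboratory's code: Schelling's segregation
# model. Each agent follows one simple rule (move if too few neighbors are
# like me), yet a clear global pattern of segregated clusters emerges.
SIZE, STEPS, TOLERANCE = 20, 50_000, 0.5  # grid side, iterations, similarity wanted

# Two agent types (0 and 1) plus empty cells (None), placed at random.
cells = [random.choice([0, 1, None]) for _ in range(SIZE * SIZE)]

def unhappy(i):
    me, x, y = cells[i], i % SIZE, i // SIZE
    neighbors = [cells[(x + dx) % SIZE + ((y + dy) % SIZE) * SIZE]
                 for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
    occupied = [n for n in neighbors if n is not None]
    return bool(occupied) and sum(n == me for n in occupied) / len(occupied) < TOLERANCE

for _ in range(STEPS):
    i = random.randrange(SIZE * SIZE)
    if cells[i] is not None and unhappy(i):
        j = random.choice([k for k, c in enumerate(cells) if c is None])
        cells[j], cells[i] = cells[i], None  # the unhappy agent relocates

for y in range(SIZE):                        # crude text view of the clusters
    print("".join("." if c is None else "XO"[c] for c in cells[y * SIZE:(y + 1) * SIZE]))
```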

The laboratory's work has focused mainly on the problem of massive simulations, i.e., those involving a very high and scalable number of agents, which pose significant challenges for the efficiency and scalability of the simulation. In this context, research focused on the creation of a distributed and scalable cluster environment, called D-MASON, which (built on top of an existing simulation system designed at George Mason University, USA) allows scalable, efficient and effective simulations to be carried out, even in the presence of non-regular partitioning of the agent field. Recently, the research work has also involved the creation of domain-specific languages for distributed agent simulation.
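
The general idea behind distributing an agent field can be sketched as follows (a simplified illustration with made-up parameters, not D-MASON's actual API): the field is split among workers, and agents near a border are mirrored to the neighboring worker as read-only "halo" copies so that each simulation step can be computed locally:

```python
# Simplified illustration of field partitioning for distributed agent
# simulation (not D-MASON's actual API): the field is cut into vertical
# strips; each worker owns the agents in its strip, and agents within the
# interaction radius of a border are mirrored to the neighbor as read-only
# "halo" copies so local steps need no remote lookups.
from collections import defaultdict

WIDTH, WORKERS, RADIUS = 1000.0, 4, 10.0   # hypothetical field parameters
STRIP = WIDTH / WORKERS

def distribute(agents):
    """agents: list of (agent_id, x, y) -> {worker: {'own': [...], 'halo': [...]}}."""
    plan = defaultdict(lambda: {"own": [], "halo": []})
    for agent in agents:
        _, x, _ = agent
        w = min(int(x / STRIP), WORKERS - 1)
        plan[w]["own"].append(agent)
        if w > 0 and x - w * STRIP < RADIUS:                  # near left border
            plan[w - 1]["halo"].append(agent)
        if w < WORKERS - 1 and (w + 1) * STRIP - x < RADIUS:  # near right border
            plan[w + 1]["halo"].append(agent)
    return plan

agents = [(i, (37 * i) % WIDTH, (53 * i) % WIDTH) for i in range(20)]
for w, part in sorted(distribute(agents).items()):
    print(f"worker {w}: owns {len(part['own'])}, halo {len(part['halo'])}")
```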

Parallel algorithms and architectures

The laboratory also has a thriving theoretical research activity aimed at designing algorithms and architectures that make basic operations more efficient.

After several works on minimizing access conflicts to in-memory data structures, a significant part of the laboratory's work concerned the analysis and design of networks and routing schemes for peer-to-peer (P2P) networks. Within the P2P architecture called Chord, the original scheme was improved by proposing a scheme based on Fibonacci numbers, F-Chord, which exhibits better performance both in routing (maximum and average number of hops) and in memory occupation (size of the routing table per node).
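
A small sketch conveys the flavor of the approach (illustrative parameters, not the published F-Chord construction in full detail): fingers are placed at Fibonacci distances instead of powers of two, and routing greedily takes the longest jump that does not overshoot the target:

```python
import random

# Illustrative sketch of Fibonacci-distance routing on a Chord-like ring.
# Fingers sit at Fibonacci offsets instead of powers of two; greedy routing
# repeatedly takes the longest finger jump that does not overshoot, which
# amounts to the Zeckendorf decomposition of the remaining distance.
M = 16
RING = 1 << M            # identifier space of size 2^M

def fib_offsets(limit):
    a, b, out = 1, 2, []
    while a < limit:
        out.append(a)
        a, b = b, a + b
    return out

OFFSETS = fib_offsets(RING)   # one finger per Fibonacci number below 2^M

def route(src, dst):
    """Count hops from src to dst, assuming every identifier is a live node."""
    hops, cur = 0, src
    while cur != dst:
        dist = (dst - cur) % RING
        jump = max(o for o in OFFSETS if o <= dist)  # greedy: longest safe finger
        cur = (cur + jump) % RING
        hops += 1
    return hops

pairs = [(random.randrange(RING), random.randrange(RING)) for _ in range(1000)]
print("average hops:", sum(route(s, d) for s, d in pairs) / len(pairs))
```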

Collaborative and Social Computing

This research area explores how users of all types, with all kinds of goals, can interact effectively, efficiently and effortlessly, so as to actively share, build a network of contacts, communicate, coordinate and collaborate with anyone, anywhere, at any time. The laboratory's research therefore investigates how collaborative activities and their coordination can be supported by groupware systems, and which principles of design and empirical validation must be followed to obtain concrete results. In addition to studying these groupware systems in different contexts (remote/co-located) and with different types of users (citizens, professionals, students), the research has also sought to identify the privacy losses that these Web-based systems entail, and to verify how access to resources and tools can be made more accessible and personalized.

Open Data Co-creation

Open Data are data freely accessible to all, with at most the obligation to cite the source and to redistribute any changes openly. Particularly significant is the recent focus on so-called open government, which aims to make the Public Administration open to citizens, not only through the transparency of its actions but also through direct participation, exploiting new digital technologies. In this context, Open Data has gained popularity alongside other open movements (such as open source).

The strong limitations of Open Data in terms of quality, accessibility, ease of interpretation and reusability are well known in the literature; these limitations jeopardize the ultimate purpose of data opening by the Public Administration: improving the relationship with citizens. The difficulties of the Public Administration in managing the production process are compounded by the inadequacy of the tools available to citizens to access, make sense of and interpret the Open Data that the Public Administration provides.

In this context, the laboratory's research work, carried out in part within the European Horizon 2020 ROUTE-TO-PA research project (www.routetopa.eu), aimed to facilitate the creation of quality Open Data collaboratively among different users (so that different skills and competencies can be combined), with an agile process that, on the one hand, gives users the freedom to structure and organize tasks as they see fit and, on the other, provides non-intrusive guidance and control tools that increase the quality of the product.

Central to this work was the creation of a social platform, called Social Platform for Open Data, where users (citizens, public employees, experts, etc.) can interact naturally and familiarly, taking advantage of tools for the co-creation of shared Open Data and for its easy export and display. This tool, part of a multi-platform architecture, represents the state of the art in supporting citizens and public administrations, and the prototype has been evaluated in several real contexts.


This approach increases the public value of the generated Open Data, since collaborative creation raises the quality of what is produced; this is especially useful in production contexts within the Public Administration, where transparency towards citizens is at stake. The use of proactive error-correction tools significantly increases the usability of the data and its real utility. Finally, the architecture is particularly efficient, portable and scalable.

Collaborative design

In research and industrial production contexts, it is essential to have efficient tools that foster collaboration among teams of specialists, who are often far apart and who take part, from different places and at different times, in a complex process in which they hold various roles with different responsibilities.
Laboratory research has investigated these problems and proposed technological solutions, designing and building prototypes that aim to facilitate these particularly complex processes, which are difficult to automate. The platform must therefore foster and stimulate collaboration, at the right time and with the most appropriate tools, without being prescriptive or binding, for a team of highly specialized professionals with specific needs and peculiarities.

Laboratory research led to the creation of a complete collaborative system for a team of specialists which, through an innovative visual interface, enables effective collaboration on shared tasks of information seeking and the use of heterogeneous Computational Fluid Dynamics applications, with significant testing work that led to validation in a real context (the Fiat Chrysler Automobiles CFD laboratory). The platform also offers flexible tools that allow interoperability between different proprietary software packages, using open intermediate formats.

The laboratory has also studied typical social-network primitives applied to the design context, developing a general technique that applies collaboration functionality orthogonally and transversally to the functionality offered by a traditional tool, applied and tested in a specific case (SVN). In other application contexts, such as biochemistry, the research work mainly consisted of supporting workgroups with advanced Web-based tools.

Synchronous and co-located collaboration

A particular type of collaboration occurs when the tool must support a team that is present in the same place (co-located) at the same time (synchronous). This requires different approaches to groupware design and evaluation.

In this context, the laboratory's research work, carried out in part within the European LEAD project ("Technology-enhanced Learning and Problem-solving Discussions: Networked Learning Environment in the Classroom") of the 6th Framework Programme, aimed to support computer-mediated discussion and debate in a didactic setting, in the classroom. The aim was to study how a computer-based discussion tool, hybridized with voice discussion, could improve the quality of interactions and arguments during discussions and debates, and what educational effect it could have on the class.

The CoFFEE groupware architecture (Cooperative Face-to-Face Educational Environment), conceived, designed, implemented and tested in different schools in Europe within LEAD, is designed to facilitate brainstorming, stimulate participation by all students, and be flexible for teachers, who can easily reconfigure how it is used. Its special features include the ability to contribute anonymously to the discussion; to structure the collaboration in steps, with subgroups and different tools (collaborative editors, graphical mind-map tools, structured discussions, voting tools, quizzes, etc.); to easily analyze the collaboration that has taken place; and to analyze the free-riding mechanism (riding on other students' activities) with precision.

Privacy

In systems based on collaboration and social networks, users' privacy is seriously threatened when they rely on social and Web environments designed precisely to collect data from their users, in order to target advertising messages and to resell these profiling capabilities as widely as possible.

The laboratory has therefore studied in depth how the user's awareness during Web browsing can be improved through customizable plugins for popular browsers, and how this can influence users.
Subsequently, an attempt was made to measure how much, during browsing on mobile phones, the loss of privacy (understood as sensitive data transmitted covertly to aggregators and third parties) involves higher energy consumption. After designing and building a mobile plugin that tracks and blocks such privacy losses, a series of real "in vivo" experiments, both with mobile devices actually in use and accurately monitored, and with real site navigation (under 4 different workloads), showed that the plugin is remarkably efficient in terms of energy saved by not transmitting private data (from 8% to 36%, depending on the category of sites), while its effectiveness is considerably better than competing tools, even consuming less energy for its filtering tasks.
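
The mechanism at the core of such a plugin can be sketched as follows (hypothetical blocklist and energy figure, purely illustrative, not the plugin's actual implementation): outgoing requests are matched against a tracker list, and the bytes never transmitted give a rough proxy for the radio energy saved:

```python
from urllib.parse import urlparse

# Minimal sketch of a privacy-loss blocker (hypothetical blocklist and
# energy figure): requests to known trackers are dropped, and the
# untransmitted bytes approximate the radio energy saved on a mobile device.
BLOCKLIST = {"tracker.example", "ads.example"}  # hypothetical tracker domains
ENERGY_PER_KB = 0.1                             # hypothetical J/KB figure

saved_bytes = 0

def filter_request(url: str, payload: bytes) -> bool:
    """Return True if the request may proceed, False if it was blocked."""
    global saved_bytes
    host = urlparse(url).hostname or ""
    if host in BLOCKLIST or any(host.endswith("." + d) for d in BLOCKLIST):
        saved_bytes += len(payload)             # data never leaves the device
        return False
    return True

requests = [
    ("https://news.example/page", b"GET"),
    ("https://tracker.example/beacon?id=123", b"device-id&location&contacts"),
    ("https://cdn.tracker.example/pixel", b"browsing-history"),
]
sent = [u for u, p in requests if filter_request(u, p)]
print(f"sent {len(sent)}/{len(requests)} requests, "
      f"saved ~{saved_bytes / 1024 * ENERGY_PER_KB:.4f} J")
```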

Adaptive and customized systems

From the very beginning of the spread of the World Wide Web, the laboratory was active in research on adaptive systems on the Web, i.e., systems that customize their response based on a user or group profile.

One of the first adaptive systems on the Web was created in 1998, along with one of the first synchronous Web-based collaborative navigation systems, while some of the first applications of profiling to e-commerce were proposed and analyzed. In this context, the potential of these systems in education was also studied, with adaptivity provided on the basis of the behavior of the individual within the group.

Visualization and Interaction

Part of the laboratory's research work studies how to better represent information and manage interactions with users in varied contexts. The lines of research followed aim at creating interactive visualizations that scale under considerable use by many users, and that allow complex processes to be monitored efficiently. There have been several applications of Virtual Reality to Cultural Heritage and the reconstruction of archaeological sites, also using design methodologies borrowed from the videogame world (Serious Games). Finally, some works have studied and contributed to research on innovative interfaces through graphic and auditory metaphors, and on the use of graphics processors for modeling complex behaviors.

Scalable visualization

Within the ROUTE-TO-PA project, a particularly important topic addressed by the laboratory was the creation of a platform for displaying Open Data that would be, on the one hand, extremely simple to use, so that users without technical skills could adopt it, and, on the other, efficient, allowing the visualization to scale by using the resources made available on the client side.

To this end, the Datalet Ecosystem Provider (DEEP) architecture allows the creation of reusable Web components (the datalets) which, once loaded by the client, take care of loading the data, displaying it as required, and handling user interactions, without placing load on the server. This also enables the development of live visualizations, which follow the real-time content of the Open Data dataset, allowing continuous oversight during the creation process.

In this field, the research experience has also turned to tools for real-time process visualization, so that processes can be monitored in an easy, versatile and scalable way. In particular, different frameworks have been designed, developed and analyzed for visualizing running processes, filesystems, and clustering.

Virtual Reality Applications

The methods of using Virtual Reality tools were examined and studied, in particular in the context of the use (local/remote, individual/group) of visitable reconstructions of archaeological sites (Cultural Heritage). A tool often used together with Virtual Reality reconstructions is that of so-called "Serious Games": educational games that have not only recreational purposes, but aim to entertain while educating and providing information on the sites or artifacts visited.

In this context, the experience gained with immersive serious games (i.e., those using interfaces such as stereoscopic viewers and motion-sensing devices like Microsoft Kinect) seems to offer interesting perspectives and good results. Recently, interesting results on the automatic generation of virtual environments based on Open Data have been achieved in the context of Open Data reuse. In the past, the work also focused on the use of mobile devices in hybrid visits (real-world positioning with visualization in virtual reality) and in historical simulations (nineteenth-century battles).

Innovative interfaces

Innovative ways of interfacing with computations and data were also the subject of the laboratory's research, which took an early interest in the possibilities offered by the sound representation (sonification) of real processes, such as the behavior of an HTTP server, to aid monitoring.

The results, based on a prototype tested in a controlled user study, were positive, showing how audio can supplement the information conveyed to users. Other systems for audio monitoring via SNMP were built and tested, including a multi-purpose programmable architecture.
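
As a simple illustration of the approach (not the prototypes described above), server measurements can be mapped to pitches and rendered as audio; here, hypothetical response latencies become tones whose pitch rises as the server slows down:

```python
import math, struct, wave

# Simple sonification illustration: each server response time is mapped to a
# tone whose pitch rises with the latency, so an operator can literally hear
# the server slowing down.
RATE = 44100                                   # samples per second

def tone(freq_hz, dur_s=0.15, volume=0.4):
    n = int(RATE * dur_s)
    return b"".join(
        struct.pack("<h", int(volume * 32767 * math.sin(2 * math.pi * freq_hz * i / RATE)))
        for i in range(n))

latencies_ms = [12, 15, 11, 40, 95, 180, 60, 14]  # hypothetical HTTP response times

with wave.open("server_sound.wav", "wb") as out:
    out.setnchannels(1)
    out.setsampwidth(2)                        # 16-bit samples
    out.setframerate(RATE)
    for ms in latencies_ms:
        # Logarithmic latency-to-pitch mapping: 10 ms -> 440 Hz, 100 ms -> 880 Hz.
        freq = 220 * 2 ** math.log10(ms)
        out.writeframes(tone(freq))
print("wrote server_sound.wav")
```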

Graphics Processors

The use of Graphics Processing Units (GPUs) as computing tools was also a subject of study and research in the laboratory. GPUs are particularly useful for performing extremely fine-grained computations, where all the data needed during the computation can reside in the GPU's own internal memory.

In particular, the work started with the use of GPUs as a tool for massive simulations, subsequently allowing the simulation of even complex individual behaviors through a library written to facilitate the use of the GPUs of the time.
Some experiments on terrain rendering on GPUs were also implemented and tested. Today, GPUs are easily programmable with high-level programming languages, which has made their potential much more accessible and easy to use.
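
A small example of this fine-grained style (assuming the CuPy library and a CUDA-capable GPU; this is not the library developed by the laboratory) is one step of a cellular automaton, where every cell is updated in parallel and all data stays in GPU memory:

```python
import cupy as cp  # assumes a CUDA-capable GPU with CuPy installed

# Fine-grained GPU computation: one Game of Life step updates every cell of
# the grid in parallel, with all data resident in GPU memory throughout.
def life_step(grid):
    # Sum the 8 neighbors of every cell at once with wrap-around shifts.
    neighbors = sum(cp.roll(cp.roll(grid, dy, 0), dx, 1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    # A cell lives with 3 neighbors, or with 2 if it was already alive.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(cp.int8)

grid = (cp.random.random((4096, 4096)) < 0.2).astype(cp.int8)  # stays on the GPU
for _ in range(100):
    grid = life_step(grid)
print(int(grid.sum()), "cells alive")  # only the scalar result leaves the GPU
```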