Documents

Keynote Lectures

Cloud Computing - State-of-the-Art and Future Research Trends
Eleni Karatza, Informatics, Aristotle University of Thessaloniki, Greece

The Role of Domain Specific Languages in Modeling and Simulation
Adelinde M. Uhrmacher, University of Rostock, Germany

Parallel Discrete Event Simulation - Past, Present and Future
Richard Fujimoto, Computational Science and Engineering, Georgia Institute of Technology, United States

Agent-based Models for Exploring Social Complexity, with an Application of Network Analysis to Agents
Pietro Terna, Dipartimento di Scienze economico-sociali e matematico-statistiche, Università di Torino, Italy

 

Cloud Computing - State-of-the-Art and Future Research Trends

Eleni Karatza
Informatics, Aristotle University of Thessaloniki
Greece
 

Brief Bio
Helen Karatza is a Professor Emeritus in the Department of Informatics at the Aristotle University of Thessaloniki, Greece, where she teaches courses at the postgraduate and undergraduate levels and supervises doctoral and postdoctoral research. Dr. Karatza's research interests include Computer Systems Modeling and Simulation, Performance Evaluation, Grid and Cloud Computing, Energy Efficiency in Large Scale Distributed Systems, Resource Allocation and Scheduling, and Real-time Distributed Systems. Dr. Karatza has authored or co-authored over 215 technical papers and book chapters, including five papers that earned best paper awards at international conferences. She is a senior member of IEEE, ACM and SCS, and she served as an elected member of the Board of Directors at Large of the Society for Modeling and Simulation International. She has served as chair and keynote speaker at international conferences. Dr. Karatza is the Editor-in-Chief of the Elsevier journal “Simulation Modelling Practice and Theory” and Senior Associate Editor of Elsevier's “Journal of Systems and Software”. She was Editor-in-Chief of “Simulation: Transactions of The Society for Modeling and Simulation International” and Associate Editor of “ACM Transactions on Modeling and Computer Simulation”. She has served as guest editor of special issues of international journals. More information about her activities and publications can be found at http://agent.csd.auth.gr/~karatza/


Abstract
Clouds have become very popular, and their performance grows more important as the number of users and applications increases. Currently, many enterprises are adopting clouds to achieve high performance for their applications at low cost.
Because of the nature of these systems, there are important issues that must be addressed, such as resource allocation, efficient scheduling, energy conservation, reliability, security and trust, cost, availability, and quality. Effective management of cloud resources is crucial for exploiting the power of these systems and achieving high system performance. Furthermore, software structures that best exploit cloud capabilities while providing application compatibility should be examined.
Cloud computing is a concept that has emerged from grid computing; it gives users the ability to acquire computational resources on demand from a virtually infinite pool on a pay-as-you-go basis.
The cloud computing paradigm can offer various types of services, such as computational resources for HPC applications, web services, social networking, etc. Resource allocation and scheduling are difficult tasks in clouds, where there are many alternative heterogeneous computers. If cloud computing is going to be used for HPC, appropriate methods must be considered for allocating resources to user requests efficiently, for scaling VMs, and for scheduling tasks effectively. The scheduling algorithms must seek to maintain a good ratio of response time to leasing cost. Furthermore, adequate data security and availability are critical issues that have to be considered, along with the energy-efficient solutions that are required to minimize the impact of cloud computing on the environment.
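As an illustration of the response-time/leasing-cost trade-off described above (a minimal sketch, not an algorithm from the talk; the names VM, Task and schedule are hypothetical), a greedy scheduler can place each task on the heterogeneous VM that minimizes a weighted sum of finish time and leasing cost:

```python
# Illustrative sketch: greedy task placement on heterogeneous VMs,
# trading off response time against leasing cost.
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    speed: float          # work units processed per hour
    cost_per_hour: float  # leasing cost
    busy_until: float = 0.0  # hour at which this VM becomes free

@dataclass
class Task:
    name: str
    work: float  # work units

def schedule(tasks, vms, alpha=0.5):
    """Assign each task to the VM minimizing a weighted sum of
    finish time and leasing cost (alpha weights response time)."""
    plan = []
    for task in tasks:
        def score(vm):
            runtime = task.work / vm.speed
            finish = vm.busy_until + runtime      # FIFO queue per VM
            cost = runtime * vm.cost_per_hour
            return alpha * finish + (1 - alpha) * cost
        best = min(vms, key=score)
        best.busy_until += task.work / best.speed
        plan.append((task.name, best.name, best.busy_until))
    return plan

vms = [VM("small", speed=1.0, cost_per_hour=0.05),
       VM("large", speed=4.0, cost_per_hour=0.40)]
tasks = [Task("t1", work=8.0), Task("t2", work=2.0)]
for name, vm, finish in schedule(tasks, vms):
    print(f"{name} -> {vm}, finishes at hour {finish:.2f}")
```

With these toy numbers, the large expensive VM absorbs the long task while the short task goes to the cheap VM; tuning `alpha` shifts the balance between response time and cost.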



 

 

The Role of Domain Specific Languages in Modeling and Simulation

Adelinde M. Uhrmacher
University of Rostock
Germany
 

Brief Bio
Adelinde M. Uhrmacher is Professor at the University of Rostock, Germany, and head of the modeling and simulation lab. Her research is on discrete-event modeling and simulation methods and their applications in different areas, such as demography and cell biology. It comprises the development of modeling languages and simulation algorithms, support for simulation experimentation, and the realization of simulation systems. She has been involved in the organization of various modeling and simulation conferences, e.g., as program chair of the Winter Simulation Conference 2012. She has been, and is, a member of the editorial boards of several simulation-related journals. She was Editor-in-Chief of SCS Simulation from 2000 until 2006. Since 2013 she has served as Editor-in-Chief of the ACM Transactions on Modeling and Computer Simulation.


Abstract
Simulation, as an experiment performed with a formal model, is now widely accepted as the third branch of science, complementing theoretical and empirical science. However, “in-silico” experiments often suffer from a lack of reproducibility, which has led to the view that published simulation results cannot be relied upon. In this talk I will take a closer look at what role domain-specific languages can play in addressing this problem. The syntax and the semantics of domain-specific modeling languages determine how, and which, systems can be described, respectively. Only by clearly distinguishing between syntax and semantics can the advantage of syntax be exploited, i.e., to provide a compact and succinct description of the system of interest. Domain-specific languages are also applied to support experiments: specifying an experiment in a domain-specific language helps in storing, revising, and adapting it. How domain-specific languages, for modeling as well as for simulation experiments, may influence modeling and simulation and may contribute to the reproducibility of simulation will be discussed based on case studies from cell biology and demography.
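To picture how a domain-specific experiment language can aid reproducibility (a minimal sketch only, not one of the speaker's actual languages; Experiment and mm1_service are hypothetical names), one can treat an experiment as a declarative, storable object with a fixed seed, so it can be saved, revised, and re-run with identical results:

```python
# Illustrative sketch: a tiny internal DSL in which a simulation
# experiment is plain data (model name, factors, replications, seed),
# making it storable and exactly reproducible.
import itertools
import json
import random

class Experiment:
    def __init__(self, model, replications=1, seed=0, **factors):
        self.spec = {"model": model, "replications": replications,
                     "seed": seed, "factors": factors}

    def to_json(self):
        # Experiments are plain data, hence easy to store and revise.
        return json.dumps(self.spec, sort_keys=True)

    def run(self, models):
        """Run the full factorial design; `models` maps names to
        functions f(rng, **params) -> observable."""
        spec = self.spec
        f = models[spec["model"]]
        names = sorted(spec["factors"])
        results = {}
        for combo in itertools.product(*(spec["factors"][n] for n in names)):
            params = dict(zip(names, combo))
            rng = random.Random(spec["seed"])   # fixed seed -> reproducible
            obs = [f(rng, **params) for _ in range(spec["replications"])]
            results[combo] = sum(obs) / len(obs)
        return results

# A toy stochastic model: mean of 100 exponential service times with rate mu.
def mm1_service(rng, mu):
    return sum(rng.expovariate(mu) for _ in range(100)) / 100

exp = Experiment("service", replications=5, seed=42, mu=[1.0, 2.0])
print(exp.to_json())
print(exp.run({"service": mm1_service}))
```

Because the whole design lives in one serializable specification, re-running the stored experiment yields bit-identical averages, which is the reproducibility property the talk is concerned with.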



 

 

Parallel Discrete Event Simulation - Past, Present and Future

Richard Fujimoto
Computational Science and Engineering, Georgia Institute of Technology
United States
 

Brief Bio
Richard M. Fujimoto is a Regents’ Professor in the School of Computational Science and Engineering at the Georgia Institute of Technology. He received the M.S. and Ph.D. degrees in Computer Science and Electrical Engineering from the University of California at Berkeley in 1980 and 1983. He did his undergraduate work at the University of Illinois at Urbana-Champaign, where he received B.S. degrees in Computer Science and Computer Engineering in 1977 and 1978, respectively. He has been an active researcher in the parallel and distributed simulation field since 1985 and has published over 200 papers in this area. He has received several best paper awards for his research as well as the ACM SIGSIM Distinguished Contributions in Simulation Award. He led the definition of the time management services for the High Level Architecture (IEEE Standard 1516). Fujimoto has served as Co-Editor-in-Chief of the journal Simulation: Transactions of the Society for Modeling and Simulation International and was a founding area editor for the ACM Transactions on Modeling and Computer Simulation journal. He has also served on the organizing committees of several leading conferences in the parallel and distributed simulation field. He was the founding chair of the School of Computational Science and Engineering at Georgia Tech and led the creation of several graduate and undergraduate programs in this field at Georgia Tech.


Abstract
The parallel discrete event simulation field emerged in the 1970s around the problem of distributing the execution of a discrete event simulation program across multiple processors while still obtaining the same results as a sequential execution. The field has since evolved and addressed many challenges to speeding up and scaling simulation programs, and remains an active area of research to this day. Many impressive successes have been reported in the literature. Today, issues such as the “power wall” facing modern processors and new developments such as massively parallel supercomputers and cloud computing platforms make the technology more important than ever before. Further, broader technology trends such as “big data” and the Internet of Things present new challenges and opportunities.
I will give a retrospective view of the parallel discrete event simulation field dating back to its origins in solving the so-called synchronization problem. I will describe important advances and successes that illustrate the potential offered by this technology, and I will discuss key impediments that have prevented the technology from achieving more widespread adoption by the general modeling and simulation community, as well as important research challenges that remain.
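The synchronization problem mentioned above can be made concrete with the sequential case it generalizes (a minimal sketch, not from the talk; simulate and the toy arrival/departure model are hypothetical): a discrete event simulator processes events strictly in timestamp order, and the PDES problem is how to preserve that order once the event list is partitioned across processors.

```python
# Illustrative sketch: the core loop of a sequential discrete event
# simulation. Events are popped from a priority queue (the future
# event list) in nondecreasing timestamp order; handlers may schedule
# new events. Parallelizing this loop while preserving timestamp
# order is the PDES synchronization problem.
import heapq

def simulate(initial_events, handlers, end_time):
    """initial_events: list of (time, kind, data) tuples;
    handlers: kind -> function(time, data) returning new events.
    Returns the trace of processed (time, kind) pairs."""
    fel = list(initial_events)   # future event list
    heapq.heapify(fel)
    trace = []
    while fel and fel[0][0] <= end_time:
        time, kind, data = heapq.heappop(fel)
        trace.append((time, kind))
        for new_event in handlers[kind](time, data):
            heapq.heappush(fel, new_event)
    return trace

# Toy model: a job arrives every 2 time units and departs 1 unit later.
handlers = {
    "arrive": lambda t, d: [(t + 1, "depart", d), (t + 2, "arrive", d + 1)],
    "depart": lambda t, d: [],
}
print(simulate([(0, "arrive", 0)], handlers, end_time=5))
```

In a parallel execution each logical process keeps its own event list, and conservative or optimistic protocols ensure that no process handles an event out of timestamp order with respect to events arriving from other processes.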



 

 

Agent-based Models for Exploring Social Complexity, with an Application of Network Analysis to Agents

Pietro Terna
Dipartimento di Scienze economico-sociali e matematico-statistiche, Università di Torino
Italy
web.econ.unito.it/terna
 

Brief Bio
Pietro Terna (born in 1944) is a retired professor of Economics at the University of Torino (Italy). His more recent work is in the fields (i) of artificial neural networks and economic and financial modeling and (ii) of social simulation with agent-based models, where he has pioneered the use of Swarm. He has developed an original simulation system to reproduce enterprise and organization behavior, named java Enterprise Simulator, visible at http://web.econ.unito.it/terna/jes, and he is now developing a new Python-based version of both Swarm and jES, named SLAPP (Swarm-Like Agent Protocol in Python), visible at http://web.econ.unito.it/terna/slapp. He teaches an advanced course on Simulation Models for Economics and runs a seminar activity on applications of simulation in economics, developing a school in the discipline. He also teaches the courses on Economic Simulation and Network Analysis for the students of the PhD School of Economics of the University of Torino. He is the author of numerous papers in journals and collective volumes, published in Italy and abroad, and co-author of a book on the application of artificial neural networks to economics and finance. His scientific production covers the following topics: applications of Monte Carlo analysis of estimators in econometrics; the quantitative analysis of economic phenomena; quantitative methodology in economics. In addition to the theme of neural networks for building agents capable of learning and choice, he has devoted much of his recent research to the use of advanced simulation techniques for the construction of economic models. Publication list at http://web.econ.unito.it/terna


Abstract
Thinking of agent-based models as artifacts, useful for exploring economic complexity, means introducing three concepts: (i) on the technical side, the agent-based methodology; (ii) from the social science perspective, the idea of building artifacts also in the social domain; (iii) in a more general view, the idea of complexity. In a striking image, this way of doing research amounts to applying Galileo's method to the social sciences.
Recalling the theoretical roots in cybernetics and, more recently, in complexity science, we also need technical roots: the capability of building models that can be accepted by a wide audience.
Agent-based models can also be particularly useful for policy-making, especially if they are both quite simple in their structure and highly sophisticated in their outcomes. The pursuit of simplicity and sophistication can be made more effective by applying network analysis to the emergent results.
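The combination of a structurally simple agent-based model with network analysis of its emergent results can be sketched as follows (an illustrative toy only, not the speaker's SLAPP system; Agent, step and the exchange rule are hypothetical names):

```python
# Illustrative sketch: a minimal agent-based exchange model whose
# emergent interaction network is then analysed directly.
import random
from collections import Counter

class Agent:
    def __init__(self, aid, wealth=10):
        self.aid, self.wealth = aid, wealth

def step(agents, rng, edges):
    """One tick: each agent with positive wealth gives one unit to a
    random partner; every exchange is recorded as a network edge."""
    for a in agents:
        if a.wealth == 0:
            continue
        b = rng.choice([x for x in agents if x is not a])
        a.wealth -= 1
        b.wealth += 1
        edges.append(tuple(sorted((a.aid, b.aid))))

rng = random.Random(1)
agents = [Agent(i) for i in range(20)]
edges = []
for _ in range(50):
    step(agents, rng, edges)

# Network analysis on the emergent graph: count each agent's distinct
# trading partners (degree), then find the most connected agent.
degree = Counter()
for a, b in set(edges):
    degree[a] += 1
    degree[b] += 1
hub = max(degree, key=degree.get)
print("most connected agent:", hub, "with", degree[hub], "distinct partners")
```

The behavioral rule stays deliberately simple, while the degree analysis of the emergent interaction network is where the model's sophistication (here, a trivial centrality measure) is read off.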


