December 16, 2021
Cardinal Stefan Wyszyński University in Warsaw
Human brain atlases and their applications
October 12, 2021
CSC – IT Center for Science and Nordic e-Infrastructure Collaboration (NeIC)
LUMI: the EuroHPC pre-exascale system of the North
May 27, 2021
Professor of Biochemistry and Genetics, La Trobe University
Professor of Medicinal Chemistry, Monash University
Visiting Professor in Pharmacy, University of Nottingham
Fellow, Evolutionary Robotics, CSIRO Data61
Computational insights into the origin of SARS-CoV-2 and repurposing of drugs for COVID-19
Princeton University, USA
Towards whole-brain Connectomes: Reconstructing all neurons in a fly brain at nanometer resolution
Abstract: Comprehensive neuronal wiring diagrams derived from Electron Microscopy images allow researchers to test models of how brain circuits give rise to neuronal activity and drive behavior. Due to advances in automated image acquisition and analysis, whole-brain connectomes with thousands of neurons are finally on the horizon. However, many person-years of manual proofreading are still required to correct errors in these automated reconstructions. We created FlyWire to facilitate the proofreading of neuronal circuits in an entire fly brain by a community of researchers distributed across the world. While FlyWire is dedicated to the fly brain, its methods will be generally applicable to whole-brain connectomics and are already in use to proofread multiple datasets. In this talk I will describe how FlyWire’s computational and social structures are organized to scale up to whole-brain connectomics and present on our progress towards the generation of a proofread whole-brain connectome of the fruit fly.
Biosketch: Sven Dorkenwald is currently a PhD student in the Seung Lab at Princeton University. In his PhD he is developing systems, infrastructure and machine learning methods to facilitate the analysis of large-scale connectomics datasets. Together with collaborators at the Allen Institute for Brain Science, he developed proofreading and annotation infrastructure that is used to host multiple large-scale connectomics datasets and runs FlyWire. FlyWire.ai is an online community for proofreading neural circuits in a whole fly brain based on the FAFB EM dataset.
Ivo F. Sbalzarini
Chair of Scientific Computing for Systems Biology, Faculty of Computer Science, TU Dresden; MOSAIC Group, Center for Systems Biology Dresden; Max Planck Institute of Molecular Cell Biology and Genetics, Dresden
Title: Computational Developmental Biology
Abstract: Our vision is to develop a computer simulation of a developing embryo, incorporating the known biochemistry and biophysics into a computational model in 3D space and time, which predicts the emerging shape and function of the tissue. Development and morphogenesis of tissues, organs, and embryos emerge from the collective self-organization of cells that communicate through chemical and mechanical signals. Decisions about growth, division, and migration are taken locally by each cell based on the collective information. In this sense, a developing tissue is akin to a massively parallel computer system, where each cell computes robust local decisions, integrating communication with other cells. Mechanistically understanding and reprogramming this system is a grand challenge. While the "hardware" (proteins, lipids, etc.) and the "source code" (genome) are increasingly known, we know virtually nothing about the algorithms that this code implements on this hardware. Using examples from our work, I highlight computational challenges along the way. These range from content-adaptive data representations for machine learning, to novel languages for parallel high-performance computing, to virtual reality and real-time visualization for 3D microscopy and numerical simulations of nonlinear and non-equilibrium mechanical models. This cooperative interdisciplinary effort contributes to all involved disciplines.
Research Fellow & STA STEM Ambassador | ARC Centre of Excellence for All Sky Astrophysics in 3D | School of Physics Senior Research Data Specialist | Melbourne Data Analytics Platform & Petascale Campus Initiative | The University of Melbourne, Australia
Title: HPC Simulations of the early Universe
Abstract: Understanding the formation and evolution of the first galaxies in the Universe is a vital piece of the puzzle in understanding how all galaxies, including our own Milky Way, came to be. It is also a key aim of major forthcoming international facilities such as the Square Kilometre Array and James Webb Space Telescope. In order to maximise what we can learn from observations made by these facilities, we need to be able to accurately simulate the early Universe and model how galaxies affected and interacted with their environments.
Massachusetts Institute of Technology
Title: High Performance Computing: The Power of Language
Abstract: Julia is now being used for high performance computing for the important problems of today including climate change and Covid-19. We describe how language is making all the difference.
Founder & CEO, Wolfram Research
Title: Emerging Surprises in Applying the Computational Paradigm (and the Deep Relations between Physics, Distributed Computing and AI)
Digital Museology, Digital Humanities Institute | Lead: Laboratory for Experimental Museology (eM+) | Director: ArtLab | EPFL Lausanne Switzerland
Title: Cultural data sculpting
Abstract: In 1889 the curator G. B. Goode of the Smithsonian Institution delivered an anticipatory lecture entitled ‘The Future of the Museum’, in which he said this future museum would stand ‘side by side with the library and the laboratory’. Convergence in collecting organisations, propelled by the liquidity of digital data, now sees them reconciled as information providers in a networked world. The media theorist Lev Manovich described this world-order as “database logic,” whereby users transform the physical assets of cultural organisations into digital assets to be uploaded, downloaded, visualized, and shared, treating institutions not as storehouses of physical objects but as datasets to be manipulated. This presentation explores how such a mechanistic description can be replaced by ways in which computation has become ‘experiential, spatial and materialized; embedded and embodied’. It was at the birth of the Information Age in the 1950s that the prominent designer Gyorgy Kepes of MIT said “information abundance” should be a “landscape of the senses” that organizes both perception and practice. This ‘felt order’, he said, should be “a source of beauty, data transformed from its measured quantities and recreated as sensed forms exhibiting properties of harmony, rhythm and proportion.” Archives call for the creation of new prosthetic architectures for the production and sharing of archival resources. At the intersection of immersive visualisation technologies, visual analytics, aesthetics and cultural (big) data, this presentation explores digital cultural heritage experiences of diverse archives from scientific, artistic and humanistic perspectives. Drawing on a series of experimental and embodied platforms, the discussion argues for a reformulation of engagement with digital archives at the intersection of the tangible and intangible and as a convergence across domains.
The performative interfaces and repertoires described demonstrate opportunities to reformulate narrative in a digital context and the ways they support personal affective engagement with cultural memory.
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Samsung R&D, Poland
Title: Predicting the course of the COVID-19 epidemic in Poland
Abstract: One of the most promising approaches to predicting the possible scenarios of the epidemic is based on agent-based models. The idea of this model family is quite simple: reproduce the demographic and sociological structure of the society, and run simulations of the disease spreading through that structure. Directly reproducing the contact structure makes it possible to investigate the consequences of various administrative measures, such as school closures or travel restrictions. The model results can be visualised as a dynamic map of the spreading disease and enable local assessment of burden-of-disease factors. Our model, constructed more than 10 years ago for influenza epidemics, has been revived and tuned to COVID-19 parameters. It now produces both scientific results and pragmatic reports, which are passed on to Polish governmental authorities.
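The model family described above can be illustrated with a minimal sketch. The following toy agent-based SIR simulation is not the ICM model itself; the population size, contact rate, transmission probability, and infectious period are all invented for illustration. The point is the mechanism: each agent makes daily contacts, infection spreads through those contacts, and interventions are modelled simply by changing the contact structure.

```python
import random

def simulate(n_agents=2000, contacts_per_day=8, p_transmit=0.05,
             days_infectious=7, n_days=120, seed=1):
    """Toy agent-based SIR epidemic on random daily contacts.

    Illustrative sketch only: real models of this family replace the
    uniform random contacts with demographic structure (households,
    schools, workplaces, travel)."""
    rng = random.Random(seed)
    S, I, R = 0, 1, 2
    state = [S] * n_agents
    clock = [0] * n_agents            # days of infectiousness remaining
    state[0], clock[0] = I, days_infectious   # seed one infection
    history = []
    for _ in range(n_days):
        infectious = [i for i in range(n_agents) if state[i] == I]
        for i in infectious:
            for _ in range(contacts_per_day):   # daily contact structure
                j = rng.randrange(n_agents)
                if state[j] == S and rng.random() < p_transmit:
                    state[j], clock[j] = I, days_infectious
        for i in infectious:          # recover after the infectious period
            clock[i] -= 1
            if clock[i] == 0:
                state[i] = R
        history.append(state.count(I))
    return history

curve = simulate()
```

An administrative measure such as a school closure would be modelled here by lowering `contacts_per_day` (or, in a structured model, by removing the school contact layer) and comparing the resulting epidemic curves.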
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Espace-DEV, IRD - Institut de Recherche pour le Développement, Montpellier, France
Title: The promises of the One Health concept in the age of the Anthropocene
Abstract: In May 2019 an article was published: “Anthropocene now: influential panel votes to recognise Earth’s new epoch”, situating in the stratigraphy of Earth’s history a new geological epoch defined by the domination of human influence in shaping the Earth’s environment. When humans are the central figure in an ecological niche, the result is massive subordination and transformation of the environment for their needs. Unfortunately, the consequence is the plundering of natural resources, with socially unexpected results – a global epidemiological crisis. The current COVID-19 pandemic is an excellent example. It seems that one of the most important questions of the Anthropocene era is how to maintain stable epidemiological conditions now and in the future. The One Health concept proposes a new paradigm – a deep look at the sources of our well-being: our relationship with the environment. Our health status is interdependent with the well-being of the environment. It is clear that disturbance of the socio-ecological niche results in the spread of pathogens. Can sustainable development of socio-ecological niches help us? Let’s take a look at the results!
President at The Systems Biology Institute, Tokyo; Professor at Okinawa Institute of Science and Technology Graduate University, Okinawa; President & CEO at Sony Computer Science Laboratories, Inc., Tokyo; Executive Vice President at Sony Corporation, Tokyo
Title: Nobel Turing Challenge — Creating the Engine of Scientific Discovery
Abstract: A new grand challenge for AI: to develop an AI system that can make major scientific discoveries in biomedical sciences and that is worthy of a Nobel Prize. There are a series of human cognitive limitations that prevent us from making accelerated scientific discoveries, particularly in biomedical sciences. As a result, scientific discoveries are left at the level of a cottage industry. AI systems can transform scientific discovery into a highly efficient practice, thereby enabling us to expand knowledge in unprecedented ways. Such systems may outcompute all possible hypotheses and may redefine the nature of scientific intuition, and hence the scientific discovery process.
Postdoc in the Computer Science and Mathematics Division at Oak Ridge National Laboratory
Title: MGARD: Hierarchical Compression of Scientific Data
Abstract: The increasing scale of scientific datasets poses challenges to computational scientists needing to store, transfer, and understand their data. Lossy compression can help alleviate these problems, but practitioners are wary of incurring errors that may affect downstream analyses. In this talk, we present MGARD (MultiGrid Adaptive Reduction of Data), an algorithm for scientific data compression with guaranteed error bounds and the ability to preserve quantities of scientific interest. We begin by motivating the hierarchical decomposition underlying the algorithm. Next, we discuss quantization techniques for compression with guaranteed error bounds. Finally, we present numerical examples showcasing MGARD's ability to preserve quantities of scientific interest.
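The two ingredients named in the abstract, a hierarchical decomposition followed by quantization with a guaranteed error bound, can be sketched in a few lines. The code below is a one-level toy illustration of that general idea, not the MGARD algorithm itself (MGARD uses multilevel decompositions with carefully constructed norms on general grids): odd-indexed samples are predicted by linear interpolation from the coarse grid, and only the small detail coefficients are quantized, with a step size chosen so the pointwise reconstruction error never exceeds the requested bound.

```python
import numpy as np

def compress(u, tau):
    """One level of a hierarchical decomposition with uniform quantization
    of the detail coefficients. Requires len(u) odd so the even-indexed
    coarse grid brackets every odd-indexed sample."""
    coarse = u[::2]                            # coarse-grid samples
    interp = 0.5 * (coarse[:-1] + coarse[1:])  # linear prediction at odd indices
    detail = u[1::2] - interp                  # detail coefficients (small for smooth data)
    step = 2.0 * tau
    q = np.round(detail / step)                # uniform quantization: error <= step/2 = tau
    return coarse, q, step

def decompress(coarse, q, step):
    detail = q * step
    interp = 0.5 * (coarse[:-1] + coarse[1:])
    u = np.empty(len(coarse) + len(detail))
    u[::2] = coarse
    u[1::2] = interp + detail
    return u

u = np.sin(np.linspace(0.0, np.pi, 2**10 + 1))
tau = 1e-3
coarse, q, step = compress(u, tau)
u_hat = decompress(coarse, q, step)
```

Because the coarse samples are kept exact and each quantized detail is off by at most `step/2 = tau`, the reconstruction satisfies `max|u - u_hat| <= tau` by construction; compression comes from the detail coefficients quantizing to small integers that entropy-code well.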
Valerie E. Polichar
Sr. Director & SATO, Academic Technology Services IT Services University of California San Diego
Title: Creating a Technology Vision: Planning for the Next Generation of Research Support
Abstract: In 2009, the University of California San Diego released the Blueprint for a Digital University, a broad vision for research IT that set the direction for UC San Diego’s offerings for the next ten years and led to the creation of multiple campus services, as well as the university’s first Research IT Services and Research Data Curation teams. To refresh this vision for a new era, in late 2019, we embarked on a multi-year process to look to the next ten years and imagine a new range of research data and compute support services. We’ll discuss our process and some of the early themes emerging from the first year’s discussions – and how they differ from those of the past. Attendees will take away a how-to approach that can be applied at any research institution to build vision and develop their own strategic plan.
Distinguished Professor in the Department of Computer Science; Director of the Institute for Data Science at New Jersey Institute of Technology
Title: Solving Global Grand Challenges with High Performance Data Analytics
Abstract: Data science aims to solve grand global challenges such as: detecting and preventing disease in human populations; revealing community structure in large social networks; protecting our elections from cyber-threats, and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these social problems at scale often raises new challenges because of the sparsity and lack of locality in the data, the need for research on scalable algorithms and architectures, and development of frameworks for solving these real-world problems on high performance computers, and for improved models that capture the noise and bias inherent in the torrential data streams. In this talk, Bader will discuss the opportunities and challenges in massive data science for applications in social sciences, physical sciences, and engineering.
Dr. Nicolas Weber, Senior Researcher, NEC Laboratories Europe
Dr. Erich Focht, Senior Manager R&D, NEC HPC Europe
Workshop title: EFFECTIVE NEURAL NETWORKS *WITHOUT* GPU: SOL - Transparent Neural Network Acceleration on NEC SX-Aurora TSUBASA
Date: Wed, Sept 30, 2020
Time: 09.00 am to 12.30 pm CEST
Limited access: 15 participants - first come, first served
Payment: Free of charge
Note! Registration deadline: Sept 27, 2020. Participants will be informed about their qualification for the workshop by September 28.
Introduction
In 2019, ICM University of Warsaw expanded its HPC infrastructure with a specialized vector computer, NEC SX-Aurora TSUBASA, with eight vector processors. SX-Aurora TSUBASA is used at ICM UW for calculations in physics, chemistry, and AI, as well as for development work aimed at adapting and optimizing existing software for the new computer architecture. Distinctive features of NEC SX-Aurora TSUBASA are:
- High memory bandwidth (48 GB, HBM2) of the Vector Engine (>1 TB/s) at < 300 W,
- 64 fully functional vector registers and 192 double precision FP operations per cycle,
- Works within the GNU/Linux environment, natively or in accelerator mode.
- NEC SOL - Transparent Neural Network Acceleration - an AI acceleration middleware enabling a wide range of optimizations for neural network workloads. It integrates with existing machine learning frameworks such as PyTorch and TensorFlow. SOL offers broad support for hardware architectures, including CPUs (x86, arm64), GPUs (NVIDIA), and NEC SX-Aurora TSUBASA. It does not require modification of the original source code, allowing the user to focus on solving the problem rather than on the specifics of the hardware architecture;
- Frovedis - FRamework Of VEctorized and DIStributed data analytics - data analytics software primarily targeting the NEC SX-Aurora TSUBASA architecture.
- SOL: Transparent Neural Network Acceleration:
- Integration with PyTorch;
- Integration with ONNX;
- Frovedis: FRamework Of VEctorized and DIStributed data analytics.
- Hands-on session: SOL at ICM infrastructure.
Nicolas Weber is a Senior Researcher at NEC Laboratories Europe. He received his PhD from TU Darmstadt in 2017 for work on automated memory access optimizations for GPUs. Since then he has focused on the efficient mapping of artificial intelligence workloads onto various accelerator processors, such as the NEC SX-Aurora or GPUs, to transparently increase performance and efficiency.
Erich Focht is Senior Manager of the Research and Development group at NEC HPC Europe. His work covers distributed systems software, parallel file systems, hybrid programming models, system software, and tools and compilers for vector systems, with a current focus on applications, linear algebra, AI, and collaborations around the SX-Aurora vector engine.