Past seminars & videos
Prof. David Winkler
Professor of Biochemistry and Genetics, La Trobe University
Professor of Medicinal Chemistry, Monash University
Visiting Professor in Pharmacy, University of Nottingham
Fellow, Evolutionary Robotics, CSIRO Data61
Title: Computational insights into the origin of SARS-CoV-2 and repurposing of drugs for COVID-19
Abstract: In the last 20 years the world has faced three different coronavirus (CoV) pandemic threats: Severe Acute Respiratory Syndrome coronavirus (SARS-CoV) starting in 2002, Middle East respiratory syndrome coronavirus (MERS-CoV) in 2012, and finally COVID-19, caused by SARS-CoV-2, in late 2019. All posed serious global pandemic threats, with estimated case fatality rates of 15% for SARS, 34% for MERS and 1-3% for SARS-CoV-2 (1). With the current pandemic still far from over, there is an urgent need to understand where the virus came from and to find new drugs to treat COVID-19, the disease caused by SARS-CoV-2. We can assume this will not be the last coronavirus to threaten humanity, hence we need better tools to track virus origins and to identify drugs active against future coronavirus threats. In this seminar I discuss in silico modelling and screening approaches to estimate the SARS-CoV-2 susceptibility of humans and other important animal species. I will also illustrate how state-of-the-art computational methods can rapidly identify drugs from existing drug libraries that might be repurposed to treat COVID-19 patients. We also describe how this computational screening pipeline can be expanded in the future to identify drugs with broad-spectrum activity against a wide diversity of coronaviruses. Protection offered by any individual drug may be short-lived, given coronaviruses' rapid mutation rates and the development of drug resistance; CoV drugs should therefore hit multiple targets within viruses to minimize resistance. For example, one of the key and surprising findings of our drug screens to date is that the anthelmintic drug ivermectin can inhibit multiple SARS-CoV-2 protein targets, potentially making it difficult for SARS-CoV-2 to develop resistance to it. I will describe the current state of development of in silico CoV drug screening, the challenges and pitfalls of these approaches, and our predictions of how such methods may be used to develop drugs for future CoV pandemics even before they occur.
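As a toy illustration of the multi-target logic described in the abstract, the Python sketch below ranks a hypothetical drug library by how many viral protein targets each compound is predicted to inhibit. All drug names, scores and the cutoff are invented for illustration; this is not the speaker's actual screening pipeline.

```python
# Hypothetical multi-target filtering step of a repurposing screen.
# Scores are invented; a real pipeline would use docking or QSAR predictions.

# Predicted binding scores per drug and viral target (lower = stronger binding).
scores = {
    "drug_A": {"Mpro": -9.1, "RdRp": -6.0, "helicase": -8.4},
    "drug_B": {"Mpro": -5.2, "RdRp": -5.8, "helicase": -5.5},
    "drug_C": {"Mpro": -8.7, "RdRp": -8.9, "helicase": -7.9},
}

THRESHOLD = -7.0  # invented cutoff defining a "predicted hit"

def multi_target_hits(scores, threshold):
    """Count, per drug, how many targets it is predicted to inhibit."""
    return {
        drug: sum(1 for s in per_target.values() if s <= threshold)
        for drug, per_target in scores.items()
    }

# Rank drugs by breadth of predicted inhibition: hitting several targets
# at once should make it harder for the virus to evolve resistance.
hits = multi_target_hits(scores, THRESHOLD)
for drug in sorted(hits, key=hits.get, reverse=True):
    print(drug, hits[drug], "targets hit")
```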
Prof Andrew Turpin
BCom BSc(Hons) PhD
Senior Academic Convenor, Petascale Campus Initiative.
Lead, Melbourne Data Analytics Platform.
Title: Supporting digital research methods: the Melbourne experience
Abstract: In this talk I will discuss the Petascale Campus Initiative, a program at The University of Melbourne to boost access to computational and data-management resources for all faculties at the university. Melbourne is a large, comprehensive university, and balancing the needs of medicine, fine arts, physics, and architecture has its challenges. In addition to hardware, the Initiative is also committed to developing a workforce of data and computer scientists who can support researchers in the academy in making use of digital technology in their research. This workforce (the Melbourne Data Analytics Platform – MDAP) is unique in Australia, and rare in the world, in that it is composed of academics whose “KPIs” are built around supporting research, not necessarily leading independent research. The talk will discuss the journey in establishing MDAP at Melbourne: the challenges and the successes.
Dr. Nicolas Weber, Senior Researcher, NEC Laboratories Europe
Dr. Erich Focht, Senior Manager R&D, NEC HPC Europe
Workshop title: EFFECTIVE NEURAL NETWORKS *WITHOUT* GPU
SOL: Transparent Neural Network Acceleration on NEC SX-Aurora TSUBASA
Date: Wed, Sept 30, 2020
Time: 09.00 am to 12.30 pm CEST
Limited access: 15 participants – first come first served
Payment: Free of charge
Note! Registration deadline: Sept 27, 2020
Participants will be informed about their qualification for the workshop by September 28th.
Introduction
In 2019, ICM University of Warsaw expanded its HPC infrastructure with a specialized vector computer, NEC SX-Aurora TSUBASA [1], with eight vector processors. The SX-Aurora TSUBASA is used at ICM UW for calculations in physics, chemistry and AI, as well as for development work intended to adapt and optimize existing software for the new computer architecture.
Distinctive features of NEC SX-Aurora TSUBASA are:
- High-bandwidth memory (48 GB HBM2) on the Vector Engine, delivering >1 TB/s at <300 W,
- 64 fully functional vector registers and 192 double precision FP operations per cycle,
- Works within the GNU/Linux environment – natively or in accelerator mode.
The Workshop is intended as an introduction to two software frameworks designed specifically for the NEC SX-Aurora TSUBASA:
- NEC SOL – Transparent Neural Network Acceleration [2] – an AI acceleration middleware enabling a wide range of optimizations for neural network workloads. It integrates with existing machine learning frameworks such as PyTorch and TensorFlow, and offers broad support for hardware architectures including CPUs (x86, arm64), GPUs (NVIDIA), and the NEC SX-Aurora TSUBASA. It does not require modification of the original source code, allowing the user to focus on solving the problem rather than on the specifics of the hardware architecture (a usage sketch follows after this list);
- Frovedis – FRamework Of VEctorized and DIStributed data analytics [3] – data analytics software primarily targeting the NEC SX-Aurora TSUBASA architecture.
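As a rough sketch of the “no source changes” workflow that SOL advertises, consider the snippet below. The sol.optimize entry point is our assumption for illustration; consult the SOL documentation [2] for the actual interface.

```python
# Sketch of using an optimizing middleware with an unmodified PyTorch model.
# The sol.optimize call is an assumed entry point, shown commented out;
# see the SOL documentation [2] for the real API.
import torch
import torchvision.models as models
# import sol  # SOL middleware (assumed import name)

model = models.resnet18()               # unmodified PyTorch model
example = torch.rand(1, 3, 224, 224)    # example input defining shapes

# opt_model = sol.optimize(model, example)  # compile for CPU/GPU/VE backend
# output = opt_model(example)               # drop-in replacement module
output = model(example)                     # plain reference run on the host
print(output.shape)
```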
Brief agenda
- SOL: Transparent Neural Network Acceleration:
- Introduction;
- Integration with PyTorch;
- Integration with ONNX;
- Deployment.
- Frovedis: FRamework Of VEctorized and DIStributed data analytics.
- Hands-on session: SOL at ICM infrastructure.
About SX-Aurora TSUBASA
The NEC SX-Aurora TSUBASA, introduced to the market in 2018, is a vector processor (Vector Engine, VE) belonging to the SX architecture line, which has been developed by NEC Corporation since the mid-1980s [1]. Unlike its stand-alone predecessors, the TSUBASA has been designed as a PCIe-attached card working within, and operated by, an x86_64 host server (Vector Host, VH) running a distribution of the GNU/Linux operating system. The host provides a complete software development environment for the connected VEs and runs the Vector Engine Operating System (VEOS), which in turn serves as the operating system for VE programs [4].
[1] https://www.nec.com/en/global/solutions/hpc/sx/index.html
[2] https://sysml.neclab.eu/projects/sol/
[3] https://github.com/frovedis/frovedis
[4] https://github.com/veos-sxarr-NEC/veos
About presenters
Nicolas Weber is a Senior Researcher at NEC Laboratories Europe. He received his PhD from TU Darmstadt in 2017 for work on automated memory access optimizations for GPUs. Since then he has focused on the efficient mapping of artificial intelligence workloads onto accelerator processors such as the NEC SX-Aurora or GPUs, to transparently increase performance and efficiency.
Erich Focht is Senior Manager of the Research and Development group at NEC HPC Europe. His work covers distributed systems software, parallel file systems, hybrid programming models, system software, and tools and compilers for vector systems, with a current focus on applications, linear algebra, AI and cooperations on the SX-Aurora vector engine.
SARAH KENDERDINE
Digital Museology, Digital Humanities Institute | Lead: Laboratory for Experimental Museology (eM+) | Director: ArtLab | EPFL Lausanne Switzerland
Title: Cultural data sculpting
Abstract: In 1889 the curator G. B. Goode of the Smithsonian Institution delivered an anticipatory lecture entitled ‘The Future of the Museum’, in which he said this future museum would stand ‘side by side with the library and the laboratory’. Convergence in collecting organisations, propelled by the liquidity of digital data, now sees them reconciled as information providers in a networked world. The media theorist Lev Manovich described this world-order as “database logic”, whereby users transform the physical assets of cultural organisations into digital assets to be uploaded, downloaded, visualized and shared, treating institutions not as storehouses of physical objects but as datasets to be manipulated. This presentation explores how such a mechanistic description can be replaced by ways in which computation has become ‘experiential, spatial and materialized; embedded and embodied’. It was at the birth of the Information Age in the 1950s that the prominent MIT designer Gyorgy Kepes said “information abundance” should be a “landscape of the senses” that organizes both perception and practice. This ‘felt order’, he said, should be “a source of beauty, data transformed from its measured quantities and recreated as sensed forms exhibiting properties of harmony, rhythm and proportion.”
Archives call for the creation of new prosthetic architectures for the production and sharing of archival resources. At the intersection of immersive visualisation technologies, visual analytics, aesthetics and cultural (big) data, this presentation explores digital cultural heritage experiences of diverse archives from scientific, artistic and humanistic perspectives. Drawing on a series of experimental and embodied platforms, the discussion argues for a reformulation of engagement with digital archives at the intersection of the tangible and intangible, and as a convergence across domains. The performative interfaces and repertoires described demonstrate opportunities to reformulate narrative in a digital context, and the ways they support personal, affective engagement with cultural memory.
Professor Sarah Kenderdine researches at the forefront of interactive and immersive experiences for galleries, libraries, archives and museums. In widely exhibited installation works, she has amalgamated cultural heritage with new media art practice, especially in the realms of interactive cinema, augmented reality and embodied narrative. In 2017, Sarah was appointed Professor of Digital Museology at the École polytechnique fédérale de Lausanne (EPFL), Switzerland where she has built a new laboratory for experimental museology (eM+), exploring the convergence of aesthetic practice, visual analytics and cultural data. She is also director and lead curator of EPFL’s new art/science initiative ArtLab.
ANETA AFELT
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Espace-DEV, IRD – Institut de Recherche pour le Développement, Montpellier, France
Title: The promises of the One Health concept in the age of anthropocene
Abstract: In May 2019 an article was published, “Anthropocene now: influential panel votes to recognise Earth’s new epoch”, placing in the stratigraphy of Earth’s history a new geological epoch defined by the domination of human influence in shaping the Earth’s environment. When humans are the central figure in an ecological niche, the result is massive subordination and transformation of the environment for their needs. Unfortunately, the consequence is the plundering of natural resources.
The social consequences are unexpected – a global epidemiological crisis. The current COVID-19 pandemic is an excellent example. It seems that one of the most important questions of the Anthropocene era is how to maintain stable epidemiological conditions now and in the future. The One Health concept proposes a new paradigm – a deep look at the sources of our well-being: our relationship with the environment. Our health status is interdependent with the well-being of the environment. It is clear that socio-ecological niche disturbance results in the spread of pathogens. Can sustainable development of socio-ecological niches help us? Let’s take a look at the results!
Aneta Afelt, PhD, is a geographer working in the area of health geography. Her research interest is the One Health concept, in which environment, epidemiology and epizootiology are considered as interconnected processes located in socio-ecological niches. Her research shows that the destruction of ecosystems has epidemiological consequences. She works at the Interdisciplinary Centre for Mathematical and Computational Modelling of the University of Warsaw, Poland, and is currently a guest researcher at Espace-DEV, IRD – Institut de Recherche pour le Développement, Montpellier, France. She is also a member of the scientific committee for COVID-19 of the Ministry of Science in Poland and a scientific consultant of the European Research Agency for actions dedicated to COVID-19.
SELECTED PUBLICATIONS
Genetic diversity of coronaviruses in bats in Lao PDR and Cambodia. [Infection Genetics and Evolution, 48 (2017) 10–18]
Incidence of dengue and chikungunya viruses in mosquitoes and human patients in border provinces of Vietnam [BioMed Central, Springer Nature, 9 November 2017]
Japanese encephalitis in Indonesia: An update on epidemiology and transmission ecology [Acta Tropica, 187(2018), 240-247]
Distribution of bat-borne viruses and environment patterns [Infection Genetics and Evolution 58(2018), 181-191]
Bats, Coronaviruses, and Deforestation: Toward the Emergence of Novel Infectious Diseases? [Frontiers in Microbiology (11 April 2018)]
Bats, Bat-Borne Viruses, and Environmental Changes [in: Bats, IntechOpen (July 4th 2018)]
CATHERINE McGEOCH
D-Wave Systems
Title: What Do We Know About Performance of Quantum Annealing Systems?
Abstract: Quantum annealing (QA) falls within the Adiabatic Quantum Computing paradigm, which is a different approach to quantum computing than the more familiar quantum gate model (GM). The error models for open-system QA vs. GM are quite distinct, which means that different approaches to performance evaluation are needed. The first part of the talk will present a brief introduction to quantum annealing and how errors are modeled.
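For readers new to the paradigm: a quantum annealer minimizes an Ising (or, equivalently, QUBO) cost function whose ground state encodes the answer. The sketch below defines and exactly solves a tiny two-spin Ising problem with the open-source dimod package; the tooling choice is ours for illustration and is not prescribed by the talk.

```python
# Minimal Ising problem of the kind a quantum annealer minimizes.
# Uses the open-source dimod package; on D-Wave hardware one would submit
# the same model to a quantum sampler instead of the brute-force ExactSolver.
import dimod

h = {"a": -1.0, "b": 0.5}        # linear biases on the spins
J = {("a", "b"): 1.0}            # coupling between spins a and b
bqm = dimod.BinaryQuadraticModel.from_ising(h, J)

# Enumerate all spin configurations (feasible only for tiny problems).
sampleset = dimod.ExactSolver().sample(bqm)
print(sampleset.first.sample, sampleset.first.energy)
```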
Quantum computing is a highly multidisciplinary field, and each sub-discipline has its own ideas about what would constitute a demonstration of superior quantum performance. The second part of the talk will review some prominent proposals and argue that they are largely incompatible, in the sense that a demonstration of superior quantum performance in one discipline will not satisfy researchers in other disciplines. I will give an update on what is known about the performance of quantum annealing processors manufactured by D-Wave Systems.
Catherine McGeoch received her Ph.D. from Carnegie Mellon University in 1986 and spent almost 30 years on the faculty at Amherst College. Her research interests center on the development of methodologies for empirical performance evaluation of algorithms and heuristics. She co-founded the DIMACS Challenge series and the ALENEX meetings on Algorithm Engineering and Experiments, and is past editor-in-chief of the ACM Journal of Experimental Algorithmics. In 2014 she joined D-Wave Systems, where she now works on methods for assessing the performance of quantum annealing processors. She is the author of two books: “A Guide to Experimental Algorithmics” and “Adiabatic Quantum Computing and Quantum Annealing: Theory and Practice”.
FRANCISZEK RAKOWSKI
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Samsung R&D, Poland
Title: Predicting the course of the COVID-19 epidemic in Poland
Abstract: One of the most promising approaches to predicting the possible scenarios of the epidemic is based on agent-based models. The idea of this model family is quite simple: reproduce the demographic and sociological structure of the society, and run simulations of the disease spreading through that structure. The direct reproduction of the contact structure makes it possible to investigate the consequences of various administrative measures, such as school closures or travel restrictions. The model results can be visualised as a dynamic map of the spreading disease and enable local assessment of disease-burden factors. Our model, constructed more than 10 years ago for influenza epidemics, has been revived and tuned to COVID-19 parameters. It now produces both scientific results and pragmatic reports, which are passed to the Polish governmental authorities.
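The mechanism scales down to a few lines of Python. The toy sketch below (ours, not the ICM model, which uses a realistic, data-driven contact structure) spreads an infection stochastically through random contacts and reports susceptible/infected/recovered counts over time.

```python
# Toy agent-based SIR model on random contacts (illustration only; the ICM
# model reproduces the real demographic and sociological contact structure).
import random

random.seed(42)
N, P_INFECT, T_RECOVER, STEPS = 200, 0.05, 14, 60
CONTACTS = 8                     # random contacts per agent per step

state = ["S"] * N                # S: susceptible, I: infected, R: recovered
days_infected = [0] * N
state[0] = "I"                   # seed a single infection

for step in range(STEPS):
    newly_infected = set()
    for i in range(N):
        if state[i] != "I":
            continue
        for j in random.sample(range(N), CONTACTS):
            if state[j] == "S" and random.random() < P_INFECT:
                newly_infected.add(j)
        days_infected[i] += 1
        if days_infected[i] >= T_RECOVER:
            state[i] = "R"
    for j in newly_infected:     # synchronous update of new infections
        state[j] = "I"
    print(step, state.count("S"), state.count("I"), state.count("R"))
```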
Affiliated Researcher at ICM University of Warsaw. Principal Data Scientist at the AI institute, Samsung R&D, Poland. Almost 10 years ago, while working at ICM UW as a researcher, he led a project to construct a large-scale agent-based epidemic spread model for influenza.
Recently, during the COVID-19 pandemic, he became the project leader for the development and adaptation of the ICM Epidemic Model to the coronavirus disease. His interests also cover cognitive science, neuroscience and computational biology.
Michael Bussmann
CASUS – Center for Advanced Systems Understanding, Helmholtz-Zentrum Dresden-Rossendorf
Title: The discovery machines – how supercomputers will shape the future of data-driven science
Abstract: Research infrastructures play a key role in delivering high-quality scientific data to many scientific communities. In the future, we will face a tremendous increase in data volume and rate at these facilities. This will fundamentally change the role of computing at these facilities, and with this change new possibilities for using supercomputers for science arise. We will discuss what that future might look like, what is necessary to bring it about and – most importantly – how it will allow us to foster interdisciplinary science in a complex world.
A theoretical plasma physicist by origin, Michael is now extending his reach well beyond physics, setting up the Center for Advanced Systems Understanding (CASUS) in Görlitz, Germany. CASUS is a new institute for data-driven complex systems science, fostering interdisciplinary research on new digital methods for understanding real-world systems. Michael is the speaker for “Data Management & Analysis” for the Helmholtz Research Field Matter and part of the Helmholtz Incubator for Data and Information Science.
Scott Aaronson
UT Austin
Title: Quantum Computational Supremacy and Its Applications
Abstract: Last fall, a team at Google announced the first-ever demonstration of “quantum computational supremacy”—that is, a clear quantum speedup over a classical computer for some task—using a 53-qubit programmable superconducting chip called Sycamore.
Google’s accomplishment drew on a decade of research in my field of quantum complexity theory. This talk will discuss questions like:
- What exactly was the (contrived) problem that Sycamore solved?
- How does one verify the outputs using a classical computer?
- And how confident are we that the problem is classically hard—especially in light of subsequent counterclaims by IBM and others?
I’ll end with a possible application for Google’s experiment—namely, the generation of trusted public random bits, for use (for example) in cryptocurrencies—that I’ve been developing and that Google and NIST are now working to test.
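For context on the verification question above: the standard yardstick in such sampling experiments is the linear cross-entropy benchmark (XEB), F = 2^n * E[p_ideal(x)] - 1, which can be computed classically for modest n. Below is a simplified numpy sketch of ours, using synthetic Porter-Thomas-like probabilities in place of a real circuit simulation.

```python
# Linear cross-entropy benchmark (XEB): F = 2**n * mean(p_ideal(sample)) - 1.
# F is ~1 for an ideal sampler and ~0 for uniform noise. Ideal probabilities
# require classical simulation, so the check is only feasible for modest n.
import numpy as np

rng = np.random.default_rng(0)
n = 10
dim = 2**n

# Stand-in for ideal circuit output probabilities (Porter-Thomas-like).
p_ideal = rng.exponential(1.0, dim)
p_ideal /= p_ideal.sum()

def xeb(samples, p_ideal, n):
    return 2**n * np.mean(p_ideal[samples]) - 1

ideal_samples = rng.choice(dim, size=5000, p=p_ideal)  # device sampling ideally
noise_samples = rng.integers(0, dim, size=5000)        # uniformly random noise
print("ideal sampler XEB:  ", xeb(ideal_samples, p_ideal, n))
print("uniform sampler XEB:", xeb(noise_samples, p_ideal, n))
```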
Scott Aaronson is David J. Bruton Centennial Professor of Computer Science at the University of Texas at Austin. He received his bachelor’s from Cornell University and his PhD from UC Berkeley. Before coming to UT Austin, he spent nine years as a professor of Electrical Engineering and Computer Science at MIT. Aaronson’s research in theoretical computer science has focused mainly on the capabilities and limits of quantum computers. His first book, Quantum Computing Since Democritus, was published in 2013 by Cambridge University Press. He received the National Science Foundation’s Alan T. Waterman Award, the United States PECASE Award, and the Tomassoni-Chisesi Prize in Physics.
Valerie E. Polichar
Sr. Director & SATO, Academic Technology Services
IT Services
University of California San Diego
Title: Creating a Technology Vision: Planning for the Next Generation of Research Support
Abstract: In 2009, the University of California San Diego released the Blueprint for a Digital University, a broad vision for research IT that set the direction for UC San Diego’s offerings for the next ten years and led to the creation of multiple campus services, as well as the university’s first Research IT Services and Research Data Curation teams. To refresh this vision for a new era, in late 2019, we embarked on a multi-year process to look to the next ten years and imagine a new range of research data and compute support services. We’ll discuss our process and some of the early themes emerging from the first year’s discussions – and how they differ from those of the past. Attendees will take away a how-to approach that can be applied at any research institution to build vision and develop their own strategic plan.
Erol Gelenbe
Institute of Theoretical and Applied Informatics, Polish Academy of Sciences; CNRS I3S Laboratory, University Côte d’Azur (France); Visiting Professor, Imperial College
Title: A Dynamic Self-Aware Approach to Cybersecurity
Abstract: This presentation will argue that cyberattacks impair not just security but also Quality of Service, and that they increase energy consumption in systems and networks. Thus not only do they cause damage to the users of a system, but they also impair its reputation and trust, and increase its operating costs. We will also take the view that these are dynamic phenomena which occur unexpectedly. Therefore future systems will have to constantly observe their own state to be able to react very rapidly to dynamic attacks. We will suggest a Self-Aware approach to dynamically respond to cyberattacks, based on the Cognitive Packet Network dynamic routing algorithm, which uses Recurrent Random Neural Networks and Reinforcement Learning. Illustrations will be provided from two FP7 and H2020 research actions that I proposed and which were funded by the European Union.
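As a loose illustration of the reinforcement-learning routing idea (and only that: the Cognitive Packet Network itself uses Random Neural Networks rather than this simple smoothed-reward rule), the sketch below routes greedily over candidate paths with occasional exploration, and adapts when a path degrades, e.g. under attack.

```python
# Toy self-adaptive path selection via reinforcement learning (illustration
# only; not Gelenbe's actual CPN/Random Neural Network algorithm).
import random

random.seed(1)
paths = {"A": 10.0, "B": 25.0}      # true mean latency in ms, unknown to router
reward = {p: 0.0 for p in paths}    # smoothed reward estimate per path
ALPHA, EPSILON = 0.2, 0.1           # smoothing factor, exploration rate

def send_probe(path, t):
    latency = random.gauss(paths[path], 2.0)
    if path == "A" and t >= 100:    # path A degrades mid-run (e.g. an attack)
        latency += 50.0
    return 1.0 / max(latency, 1e-6) # reward: inverse observed latency

for t in range(200):
    if random.random() < EPSILON:            # explore a random path
        choice = random.choice(list(paths))
    else:                                    # exploit the best estimate
        choice = max(reward, key=reward.get)
    r = send_probe(choice, t)
    reward[choice] = (1 - ALPHA) * reward[choice] + ALPHA * r

print("preferred path after adaptation:", max(reward, key=reward.get))
```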
Erol Gelenbe, PhD, DSc, Dr h.c. mult., was elected IFIP Fellow, Fellow of the ACM and IEEE, and member of the National Academy of Technologies of France, the science academies of Belgium, Hungary, Poland and Turkey, and Academia Europaea. A citizen of France and Turkey, he graduated from Ankara Koleji and Middle East Technical University, Ankara, with high honours. He has received several science prizes, including the ACM SIGMETRICS Life-Time Achievement Award, the IET Oliver Lodge Medal, the Grand Prix France-Telecom, the In Memoriam Gabor Denes Prize, and the Mustafa Prize.
– Erol received his PhD in Electrical Engineering under Prof. Ed Smith at NYU on “Stochastic Automata with Structural Restrictions” by showing the mathematical links between state transition functions of probabilistic automata and the formal languages that they recognize, published in Information and Computation, and IEEE Transactions on Computers.
– After a brief period at Philips Research, Eindhoven, designing virtual memory for Algol-based “stack” computers, he became Assistant Professor at the University of Michigan, Ann Arbor, where he taught programming languages, algorithms and operating systems, and published research on reliable memory management in Comm. ACM and Acta Informatica.
– In 1972-73 at INRIA he pioneered research on Computer and System Performance Modelling using Queueing Networks, helping to solve the thrashing problem in operating systems with virtual memory, and contributing to the QNAP software package and INRIA’s first start-up SIMULOG. In 1973 he received a Doctorat ès Sciences under Prof. J.-L. Lions.
– Appointed Chair Professor at University of Liège in 1974, he continued as consultant at INRIA making seminal contributions to Diffusion Approximations, Optimum Checkpoints, and Optimum Control of Ethernet-like Channels published in several Journal ACM papers.
– Returning to Paris as Professor at Orsay in 1979 he co-founded the LRI (Laboratoire de Recherche en Informatique) with J.-C. Bermond and J. Vuillemin. At Orsay and University Paris-Descartes (1986-93), he invented G-Networks and Random Neural Networks, patented the first Voice-over-IP switch Sycomore (Thales), published seminal work on resequencing in codecs in Journal ACM, and other work in Journal of Applied Probability, Comm. ACM, Theoretical Computer Science, and Management Science. He founded two PhD programs, developed the commercial FLEXSIM manufacturing simulator, served as Ministerial Adviser for Science (1983-86), and co-founded IFIP WG7.3 with Paul Green (IBM Yorktown Heights).
– In 1993 he became New Jersey State Endowed Professor of Computer Science at NJIT (USA), and later Nello L. Teer Professor at Duke University, where he developed neuronal adaptive video compression methods and an algorithm for recognizing brain tumours in MRI images, published in Proceedings of the IEEE. In 1998-2003, as Director of the School of EECS at the University of Central Florida, he invented the “Cognitive Packet Network” that routes packets adaptively in a network to optimize Quality of Service.
– At Imperial College (2003-19) he developed research on Self-Aware Networks, Cybersecurity and Energy-Aware Cloud Computing publishing in IEEE and ACM Transactions, Computer Journal and Physical Review, and keynotes at numerous conferences.
– After Brexit, he continues his research as Professor in the Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, and is involved in several EU H2020 programs. He has coordinated two FP7 and H2020 Research Actions on Cybersecurity.
– He has graduated over 90 PhDs: some became ACM President, University Presidents and Provosts, Fellows of National Academies in France and Canada, industry leaders and professors in France, Canada, Greece, China, Turkey, Morocco, UK, and USA.
– Erol was awarded Chevalier de la Légion d’Honneur and Commandeur du Mérite (France), Commendatore al Merito and Grande Ufficiale della Stella d’Italia (Italy), and Doctor Honoris Causa from the Universities of Liège (Belgium), Roma II (Italy) and Bogazici (Turkey).
Alan Edelman
Massachusetts Institute of Technology
Title: High Performance Computing: The Power of Language
Abstract: Julia is now being used in high performance computing for the important problems of today, including climate change and COVID-19. We describe how language is making all the difference.
Alan Edelman is a professor of applied mathematics at MIT, a member of MIT’s Computer Science & AI Laboratory, and the leader of the JuliaLab and Applied Computing Group at MIT. He is also a cofounder of Julia Computing Inc. He works on numerical linear algebra, random matrix theory and parallel computing. He is a fellow of SIAM, IEEE, and the American Mathematical Society. He has won numerous prizes for his research, most recently the Fernbach Prize from IEEE for innovation in high performance computing.
Simon Mutch
Research Fellow & STA STEM Ambassador, ARC Centre of Excellence for All Sky Astrophysics in 3D, School of Physics | Senior Research Data Specialist, Melbourne Data Analytics Platform & Petascale Campus Initiative | The University of Melbourne, Australia
Title: HPC Simulations of the early Universe
Abstract: Understanding the formation and evolution of the first galaxies in the Universe is a vital piece of the puzzle in understanding how all galaxies, including our own Milky Way, came to be. It is also a key aim of major forthcoming international facilities such as the Square Kilometre Array and James Webb Space Telescope. In order to maximise what we can learn from observations made by these facilities, we need to be able to accurately simulate the early Universe and model how galaxies affected and interacted with their environments.
Dr Simon Mutch is a Postdoctoral Research Fellow in the Australian Research Council Centre of Excellence for All-Sky Astrophysics in 3-Dimensions (ASTRO 3D) and a Research Data Specialist in the Melbourne Data Analytics Platform (MDAP), based at the University of Melbourne. Dr Mutch received a Masters degree in Physics from the University of Edinburgh and was awarded his PhD in astrophysics from Swinburne University of Technology, Australia, in 2013. His astronomy research focuses on the first galaxies and their impact on the evolution of the Universe, which he studies using a combination of supercomputer simulations and theoretical modelling. Through his role with MDAP, he also collaborates with academics in areas such as climate science and ecology to help uplift their digital research capabilities. Dr Mutch contributes to the Australian astronomical community through a number of national committees and is an inaugural Science and Technology Australia STEM Ambassador, a position which sees him meet with policy and decision makers to advocate for the importance of STEM and related fields.
STEPHEN WOLFRAM
Founder & CEO, Wolfram Research
Title: Emerging Surprises in Applying the Computational Paradigm (and the Deep Relations between Physics, Distributed Computing and AI)
Stephen Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science and other books; the originator of the Wolfram Physics Project; and the founder and CEO of Wolfram Research.
Ivo F. Sbalzarini
Chair of Scientific Computing for Systems Biology, Faculty of Computer Science, TU Dresden;
MOSAIC Group, Center for Systems Biology Dresden;
Max Planck Institute of Molecular Cell Biology and Genetics, Dresden
Title: Computational Developmental Biology
Abstract: Our vision is to develop a computer simulation of a developing embryo that incorporates the known biochemistry and biophysics into a computational model in 3D space and time, and predicts the emerging shape and function of the tissue. Development and morphogenesis of tissues, organs, and embryos emerge from the collective self-organization of cells that communicate through chemical and mechanical signals. Decisions about growth, division, and migration are taken locally by each cell based on the collective information. In this sense, a developing tissue is akin to a massively parallel computer system, in which each cell computes robust local decisions, integrating communication with other cells. Mechanistically understanding and reprogramming this system is a grand challenge. While the “hardware” (proteins, lipids, etc.) and the “source code” (genome) are increasingly known, we know virtually nothing about the algorithms that this code implements on this hardware. Using examples from our work, I highlight computational challenges along the way. These range from content-adaptive data representations for machine learning, to novel languages for parallel high-performance computing, to virtual reality and real-time visualization for 3D microscopy and numerical simulations of nonlinear and non-equilibrium mechanical models. This cooperative interdisciplinary effort contributes to all involved disciplines.
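The “tissue as parallel computer” analogy can be made concrete with a toy local-rule simulation of our own devising, in which each cell exchanges a chemical signal with its neighbours (communication) and divides when a local threshold is crossed (decision). It is purely illustrative, not a real morphogenesis model.

```python
# Toy "tissue as parallel computer": a 1D row of cells exchanges a signal
# with neighbours and takes local division decisions. Purely illustrative.
import numpy as np

rng = np.random.default_rng(3)
signal = rng.random(20)            # per-cell chemical concentration
D, THRESHOLD = 0.2, 0.8            # diffusion strength, division threshold

for step in range(50):
    # Communication: discrete diffusion with zero-flux boundaries.
    left = np.roll(signal, 1);   left[0] = signal[0]
    right = np.roll(signal, -1); right[-1] = signal[-1]
    signal = signal + D * (left + right - 2 * signal)

    # Local decision: cells above threshold divide, splitting their signal
    # between mother and daughter. Iterate in reverse so indices stay valid.
    for i in reversed(np.where(signal > THRESHOLD)[0]):
        signal = np.insert(signal, i + 1, signal[i] / 2)
        signal[i] /= 2

print("final tissue size:", signal.size)
```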
David Bader
Distinguished Professor in the Department of Computer Science;
Director of the Institute for Data Science at New Jersey Institute of Technology
Title: Solving Global Grand Challenges with High Performance Data Analytics
Abstract: Data science aims to solve grand global challenges such as detecting and preventing disease in human populations, revealing community structure in large social networks, protecting our elections from cyber-threats, and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these social problems at scale often raises new challenges: the sparsity and lack of locality of the data, the need for research on scalable algorithms and architectures, the development of frameworks for solving these real-world problems on high performance computers, and improved models that capture the noise and bias inherent in torrential data streams. In this talk, Bader will discuss the opportunities and challenges in massive data science for applications in social sciences, physical sciences, and engineering.
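One of the challenges named in the abstract, revealing community structure in social networks, is easy to demonstrate at small scale with networkx. This is a deliberately tiny example of ours; the talk concerns the massive-scale regime where such single-machine, in-memory methods break down.

```python
# Small-scale community detection demo; massive graphs need the scalable
# parallel algorithms and frameworks discussed in the talk.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()                 # classic small social network
communities = greedy_modularity_communities(G)
for i, members in enumerate(communities):
    print(f"community {i}: {sorted(members)}")
```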
David A. Bader is a Distinguished Professor in the Department of Computer Science and Director of the Institute for Data Science at New Jersey Institute of Technology. Prior to this, he served as founding Professor and Chair of the School of Computational Science and Engineering, College of Computing, at Georgia Institute of Technology. He is a Fellow of the IEEE, AAAS, and SIAM, and advises the White House, most recently on the National Strategic Computing Initiative (NSCI). Dr. Bader is a leading expert in solving global grand challenges in science, engineering, computing, and data science. His interests are at the intersection of high-performance computing and real-world applications, including cybersecurity, massive-scale analytics, and computational genomics, and he has co-authored over 250 articles in peer-reviewed journals and conferences. Dr. Bader has served as a lead scientist in several DARPA programs including High Productivity Computing Systems (HPCS) with IBM, Ubiquitous High Performance Computing (UHPC) with NVIDIA, Anomaly Detection at Multiple Scales (ADAMS), Power Efficiency Revolution For Embedded Computing Technologies (PERFECT), Hierarchical Identify Verify Exploit (HIVE), and Software-Defined Hardware (SDH). He has also served as Director of the Sony-Toshiba-IBM Center of Competence for the Cell Broadband Engine Processor. Bader is a cofounder of the Graph500 List for benchmarking “Big Data” computing platforms. Bader is recognized as a “RockStar” of High Performance Computing by InsideHPC and as one of HPCwire’s People to Watch in 2012 and 2014. In April 2019, Bader was awarded an NVIDIA AI Lab (NVAIL) award, and in July 2019, Bader received a Facebook Research AI Hardware/Software Co-Design award.
https://www.cs.njit.edu/~bader
Hiroaki Kitano
President at The Systems Biology Institute, Tokyo;
Professor at Okinawa Institute of Science and Technology Graduate University, Okinawa;
President & CEO at Sony Computer Science Laboratories, Inc., Tokyo;
Executive Vice President at Sony Corporation, Tokyo
Title: Nobel Turing Challenge — Creating the Engine of Scientific Discovery
Abstract: A new grand challenge for AI: to develop an AI system that can make major scientific discoveries in biomedical sciences and that is worthy of a Nobel Prize. A series of human cognitive limitations prevent us from making accelerated scientific discoveries, particularly in biomedical sciences. As a result, scientific discoveries remain at the level of a cottage industry. AI systems can transform scientific discovery into a highly efficient practice, thereby enabling us to expand knowledge in unprecedented ways. Such systems may outcompute all possible hypotheses and may redefine the nature of scientific intuition, and hence the scientific discovery process.
Hiroaki Kitano received a PhD in computer science from Kyoto University in 1991 for the thesis in machine translation titled “Speech-to-speech translation: a massively parallel memory-based approach”. His work includes a broad spectrum of publications on artificial intelligence and systems biology.
Kitano is known for developing AIBO, Sony’s entertainment robot, and for the grand challenge project known as RoboCup, which aims at developing a team of fully autonomous robots that can outperform the World Cup winning team in soccer. He is also a pioneer of systems biology and has served as a scientific advisor for a number of research institutions and companies, including the European Molecular Biology Laboratory (EMBL), ALSTOM, and others. He was awarded the IJCAI Computers and Thought Award in 1993 and the Nature Award for Creative Mentoring in Science in 2009.
Ben Whitney
Postdoc in the Computer Science and Mathematics Division at Oak Ridge National Laboratory
Title: MGARD: Hierarchical Compression of Scientific Data
Abstract: Lossy compression will be an essential tool in the efforts to manage and understand the imposing volumes of data that will be produced by exascale simulations and experiments. While ad hoc methods like spatial subsampling are easy to implement and understand, the absence of guaranteed bounds on the errors they incur hinders their widespread adoption by the scientific community. MGARD (MultiGrid Adaptive Reduction of Data) is one of a new generation of scientific data compressors aiming to win the trust of domain scientists with rigorous mathematical underpinnings and guaranteed error bounds. This tutorial will be an introduction to MGARD, starting with the hierarchical decomposition algorithm that forms the core of the method and its ties to the theories of finite element methods, multigrid solvers, and wavelet analysis. We will then discuss quantization techniques enabling bounds on the compression error as measured both in the original data and in quantities of interest computed from the data. We will conclude with a variety of numerical examples and a demonstration of MGARD’s publicly available implementation.
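To convey the hierarchical idea in miniature: the sketch below is a generic Haar-like multilevel transform of ours (not MGARD's actual algorithm). It splits a signal into a coarse part plus level-wise detail corrections, quantizes everything against an absolute error budget, and verifies that the guaranteed bound holds.

```python
# Generic multilevel compression with a guaranteed absolute error bound --
# an illustration of the hierarchical principle, not MGARD's algorithm.
import numpy as np

def decompose(x, levels):
    """Split x into a coarse signal plus per-level detail coefficients."""
    details = []
    for _ in range(levels):
        coarse = 0.5 * (x[0::2] + x[1::2])
        details.append(0.5 * (x[0::2] - x[1::2]))
        x = coarse
    return x, details

def reconstruct(coarse, details):
    for d in reversed(details):
        x = np.empty(2 * coarse.size)
        x[0::2] = coarse + d
        x[1::2] = coarse - d
        coarse = x
    return coarse

def quantize(a, step):
    return np.round(a / step) * step      # max error per entry: step / 2

rng = np.random.default_rng(7)
x = np.cumsum(rng.standard_normal(1024))  # smooth-ish test signal
LEVELS, EPS = 5, 1e-2

coarse, details = decompose(x, LEVELS)
step = 2 * EPS / (LEVELS + 1)             # split error budget across levels
x_rec = reconstruct(quantize(coarse, step),
                    [quantize(d, step) for d in details])
print("max abs error:", np.abs(x - x_rec).max(), "<= eps:", EPS)
```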
May 27, 2021
Prof. David Winkler
Professor of Biochemistry and Genetics, La Trobe University
Professor of Medicinal Chemistry, Monash University
Visiting Professor in Pharmacy, University of Nottingham
Fellow, Evolutionary Robotics, CSIRO Data61
Computational insights into the origin of SARS-CoV-2 and repurposing of drugs for COVID-19
Apr 22, 2021
Prof Andrew Turpin
BCom BSc(Hons) PhD
Senior Academic Convenor, Petascale Campus Initiative.
Lead, Melbourne Data Analytics Platform.
Supporting digital research methods: the Melbourne experience
Mar 18, 2021
Valerie E. Polichar
Sr. Director & SATO, Academic Technology Services
IT Services
University of California San Diego
Creating a Technology Vision: Planning for the Next Generation of Research Support
Feb 18, 2021
Ben Whitney
Postdoc in the Computer Science and Mathematics Division at Oak Ridge National Laboratory
MGARD: Hierarchical Compression of Scientific Data
Jan 29, 2021
David Bader
Distinguished Professor in the Department of Computer Science;
Director of the Institute for Data Science at New Jersey Institute of Technology
Solving Global Grand Challenges with High Performance Data Analytics
Nov 26, 2020
Hiroaki Kitano
President at The Systems Biology Institute, Tokyo;
Professor at Okinawa Institute of Science and Technology Graduate University, Okinawa;
President & CEO at Sony Computer Science Laboratories, Inc., Tokyo;
Executive Vice President at Sony Corporation, Tokyo
Nobel Turing Challenge — Creating the Engine of Scientific Discovery
Oct 22, 2020
Ivo F. Sbalzarini
Chair of Scientific Computing for Systems Biology, Faculty of Computer Science, TU Dresden; MOSAIC Group, Center for Systems Biology Dresden; Max Planck Institute of Molecular Cell Biology and Genetics, Dresden
Computational Developmental Biology
Sept 30, 2020
Nicolas Weber & Erich Focht
NEC Laboratories Europe, NEC HPC Europe
Effective Neural Networks *Without* GPU | SOL: Transparent Neural Network Acceleration on NEC SX-Aurora TSUBASA
June 18, 2020
STEPHEN WOLFRAM
Founder & CEO, Wolfram Research
Emerging Surprises in Applying the Computational Paradigm (and the Deep Relations between Physics, Distributed Computing and AI)
June 11, 2020
SARAH KENDERDINE
Digital Museology, Digital Humanities Institute | Lead: Laboratory for Experimental Museology (eM+) | Director: ArtLab | EPFL Lausanne Switzerland
Cultural data sculpting
June 04, 2020
Alan Edelman
Massachusetts Institute of Technology
High Performance Computing: The Power of Language
May 28, 2020
ANETA AFELT
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Espace-DEV, IRD – Institut de Recherche pour le Développement, Montpellier, France
The promises of the One Health concept in the age of anthropocene
May 21, 2020
FRANCISZEK RAKOWSKI
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Samsung R&D, Poland
Predicting the course of the COVID-19 epidemic in Poland
May 14, 2020
CATHERINE McGEOCH
D-Wave Systems
What Do We Know About Performance of Quantum Annealing Systems?
May 7, 2020
Simon Mutch
Research Fellow & STA STEM Ambassador, ARC Centre of Excellence for All Sky Astrophysics in 3D, School of Physics | Senior Research Data Specialist, Melbourne Data Analytics Platform & Petascale Campus Initiative | The University of Melbourne, Australia
HPC Simulations of the early Universe
April 30, 2020
Michael Bussmann
CASUS – Center for Advanced Systems Understanding, Helmholtz-Zentrum Dresden-Rossendorf
The discovery machines – how supercomputers will shape the future of data-driven science
April 8, 2020
Erol Gelenbe
Institute of Theoretical and Applied Informatics Polish Academy of Sciences, CNRS I3S Laboratory University Cote d’Azur (France), Visiting Professor Imperial College
A Dynamic Self-Aware Approach to Cybersecurity
INAUGURAL SEMINAR – April 1, 2020
Scott Aaronson
UT Austin
Quantum Computational Supremacy and Its Applications
Coming
VIRTUAL ICM SEMINARS AFTER SCFE20
The series of Virtual ICM Seminars in Computer and Computational Science will return in the new academic year 2020/2021.