Past seminars & videos
Dr. Nicolas Weber, Senior Researcher, NEC Laboratories Europe
Dr. Erich Focht, Senior Manager R&D, NEC HPC Europe
Workshop title: EFFECTIVE NEURAL NETWORKS *WITHOUT* GPU
SOL: Transparent Neural Network Acceleration on NEC SX-Aurora TSUBASA
Date: Wed, Sept 30, 2020
Time: 09.00 am to 12.30 pm CEST
Limited access: 15 participants – first come, first served
Payment: Free of charge
Note: the registration deadline is Sept 27, 2020.
Participants will be informed whether they have been accepted to the workshop by September 28th.
Introduction
In 2019, ICM University of Warsaw expanded its HPC infrastructure with a specialized vector computer, the NEC SX-Aurora TSUBASA [1], equipped with eight vector processors. At ICM UW, Aurora TSUBASA is used for calculations in physics, chemistry and AI, as well as for development work aimed at adapting and optimizing existing software for the new computer architecture.
Distinctive features of NEC SX-Aurora TSUBASA are:
- High-bandwidth memory (48 GB HBM2) on the Vector Engine, delivering more than 1 TB/s at under 300 W,
- 64 fully functional vector registers and 192 double-precision FP operations per cycle,
- Works within the GNU/Linux environment, either natively or in accelerator mode.
The Workshop is intended as an introduction to two software frameworks designed specifically for the NEC SX-Aurora TSUBASA:
- NEC SOL – Transparent Neural Network Acceleration [2] – an AI acceleration middleware enabling a wide range of optimizations for neural network workloads. It integrates with existing machine learning frameworks such as PyTorch and TensorFlow, and offers broad support for hardware architectures including CPUs (x86, arm64), GPUs (NVIDIA) and the NEC SX-Aurora TSUBASA. It does not require modification of the original source code, allowing the user to focus on solving the problem rather than on the specifics of the hardware architecture (a minimal, hypothetical usage sketch follows this list);
- Frovedis – FRamework Of VEctorized and DIStributed data analytics [3] – data analytics software primarily targeting the NEC SX-Aurora TSUBASA architecture.
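To make the integration model concrete, here is a minimal, hypothetical sketch of how an unmodified PyTorch model might be handed to an acceleration middleware such as SOL. The module path `sol.pytorch` and the call `sol.optimize` are assumptions made purely for illustration, not the documented SOL API; consult [2] for the actual interface and supported devices.

```python
# Hypothetical sketch: accelerating an unmodified PyTorch model with an
# acceleration middleware such as SOL. The `sol.pytorch` module and the
# `sol.optimize` signature are assumptions for illustration only; see [2]
# for the real API and the list of supported devices.
import torch
import torchvision.models as models

# An ordinary, unmodified PyTorch model and an example input batch.
model = models.resnet50(pretrained=True).eval()
example_input = torch.randn(1, 3, 224, 224)

try:
    import sol.pytorch as sol  # assumed import path
    # The middleware is expected to analyze the model graph and return an
    # optimized module targeting the selected device (x86, arm64, NVIDIA
    # GPU, or the SX-Aurora Vector Engine) without any source changes.
    model = sol.optimize(model, example_input)  # assumed signature
except ImportError:
    pass  # middleware not installed: fall back to plain PyTorch

with torch.no_grad():
    output = model(example_input)
print(output.shape)  # the optimized module keeps the original interface
```

Even in this toy form the key point of the middleware approach is visible: the surrounding inference (or training) code stays untouched, and a single optimization call selects the target hardware.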
Brief agenda
- SOL: Transparent Neural Network Acceleration:
- Introduction;
- Integration with PyTorch;
- Integration with ONNX;
- Deployment.
- Frovedis: FRamework Of VEctorized and DIStributed data analytics.
- Hands-on session: SOL at ICM infrastructure.
About SX-Aurora TSUBASA
NEC SX-Aurora TSUBASA, introduced to the market in 2018, is a vector processor (Vector Engine, VE) belonging to the SX architecture line developed by NEC Corporation since the mid-1980s [1]. Unlike its stand-alone predecessors, TSUBASA has been designed as a PCIe-attached card working within, and operated by, an x86_64 host server (Vector Host, VH) running a distribution of the GNU/Linux operating system. The host provides a complete software development environment for the connected VEs and runs the Vector Engine Operating System (VEOS), which in turn serves as the operating system for VE programs [4].
[1] https://www.nec.com/en/global/solutions/hpc/sx/index.html
[2] http://sysml.neclab.eu/projects/sol/
[3] https://github.com/frovedis/frovedis
[4] https://github.com/veos-sxarr-NEC/veos
About presenters
Nicolas Weber is a Senior Researcher at NEC Laboratories Europe. He received his PhD from TU Darmstadt in 2017 for work on automated memory-access optimizations for GPUs. Since then he has focused on the efficient mapping of artificial intelligence workloads onto accelerator processors such as the NEC SX-Aurora or GPUs, to transparently increase performance and efficiency.
Erich Focht is Senior Manager of the Research and Development group at NEC HPC Europe. His work covers distributed systems software, parallel file systems, hybrid programming models, system software, and tools and compilers for vector systems, with a current focus on applications, linear algebra, AI and collaborations around the SX-Aurora Vector Engine.
SARAH KENDERDINE
Digital Museology, Digital Humanities Institute | Lead: Laboratory for Experimental Museology (eM+) | Director: ArtLab | EPFL Lausanne Switzerland
Title: Cultural data sculpting
Abstract: In 1889 the curator G. B. Goode of the Smithsonian Institution delivered an anticipatory lecture entitled ‘The Future of the Museum’, in which he said this future museum would ‘stand side by side with the library and the laboratory’. Convergence in collecting organisations, propelled by the liquidity of digital data, now sees them reconciled as information providers in a networked world. The media theorist Lev Manovich described this world-order as “database logic”, whereby users transform the physical assets of cultural organisations into digital assets to be uploaded, downloaded, visualized and shared – users who treat institutions not as storehouses of physical objects but as datasets to be manipulated. This presentation explores how such a mechanistic description can be replaced by ways in which computation has become ‘experiential, spatial and materialized; embedded and embodied’. It was at the birth of the Information Age in the 1950s that the prominent MIT designer Gyorgy Kepes said that “information abundance” should be rendered as “landscapes of the senses” that organize both perception and practice. This ‘felt order’, he said, should be “a source of beauty, data transformed from its measured quantities and recreated as sensed forms exhibiting properties of harmony, rhythm and proportion.”
Archives call for the creation of new prosthetic architectures for the production and sharing of archival resources. At the intersection of immersive visualisation technologies, visual analytics, aesthetics and cultural (big) data, this presentation explores digital cultural heritage experiences built around diverse archives, from scientific, artistic and humanistic perspectives. Drawing on a series of experimental and embodied platforms, the discussion argues for a reformulation of engagement with digital archives at the intersection of the tangible and intangible, and as a convergence across domains. The performative interfaces and repertoires described demonstrate opportunities to reformulate narrative in a digital context and the ways they support personal, affective engagement with cultural memory.
Professor Sarah Kenderdine researches at the forefront of interactive and immersive experiences for galleries, libraries, archives and museums. In widely exhibited installation works, she has amalgamated cultural heritage with new media art practice, especially in the realms of interactive cinema, augmented reality and embodied narrative. In 2017, Sarah was appointed Professor of Digital Museology at the École polytechnique fédérale de Lausanne (EPFL), Switzerland where she has built a new laboratory for experimental museology (eM+), exploring the convergence of aesthetic practice, visual analytics and cultural data. She is also director and lead curator of EPFL’s new art/science initiative ArtLab.
ANETA AFELT
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Espace-DEV, IRD – Institut de Recherche pour le Développement, Montpellier, France
Title: The promises of the One Health concept in the age of the Anthropocene
Abstract: In May 2019 an article was published, “Anthropocene now: influential panel votes to recognise Earth’s new epoch”, situating in the stratigraphy of Earth’s history a new geological epoch defined by the domination of human influence in shaping the Earth’s environment. When humans become the central figure in an ecological niche, the result is a massive subordination and transformation of the environment to their needs. Unfortunately, the consequence is a plundering of natural resources.
The social consequences are unexpected – a global epidemiological crisis. The current COVID-19 pandemic is an excellent example. It seems that one of the most important questions of the Anthropocene era is how to maintain stable epidemiological conditions now and in the future. The One Health concept proposes a new paradigm – a deep look at the sources of our well-being: our relationship with the environment. Our health status is interdependent with the well-being of the environment. It is clear that disturbance of the socio-ecological niche results in the spread of pathogens. Can sustainable development of socio-ecological niches help us? Let’s take a look at the results!
Aneta Afelt, PhD, is a geographer working in the area of health geography. Her research interest is the One Health concept, in which environment, epidemiology and epizootiology are considered as interconnected processes located in socio-ecological niches. Her research shows that the destruction of ecosystems has epidemiological consequences. She works at the Interdisciplinary Centre for Mathematical and Computational Modelling of the University of Warsaw, Poland, and is currently a Guest Researcher at Espace-DEV, IRD – Institut de Recherche pour le Développement, Montpellier, France. She is also a member of the scientific committee for COVID-19 of the Ministry of Science in Poland and a scientific consultant to the European Research Agency for actions dedicated to COVID-19.
SELECTED PUBLICATIONS
Genetic diversity of coronaviruses in bats in Lao PDR and Cambodia. [Infection Genetics and Evolution, 48 (2017) 10–18]
Incidence of dengue and chikungunya viruses in mosquitoes and human patients in border provinces of Vietnam [BioMed Central, Springer Nature, 9 November 2017]
Japanese encephalitis in Indonesia: An update on epidemiology and transmission ecology [Acta Tropica, 187(2018), 240-247]
Distribution of bat-borne viruses and environment patterns [Infection Genetics and Evolution 58(2018), 181-191]
Bats, Coronaviruses, and Deforestation: Toward the Emergence of Novel Infectious Diseases? [Frontiers in Microbiology (11 April 2018)]
Bats, Bat-Borne Viruses, and Environmental Changes [in: Bats, IntechOpen (July 4th 2018)]
CATHERINE McGEOCH
D-Wave Systems
Title: What Do We Know About Performance of Quantum Annealing Systems?
Abstract: Quantum annealing (QA) falls within the Adiabatic Quantum Computing paradigm, which is a different approach to quantum computing than the more familiar quantum gate model (GM). The error models for open-system QA vs. GM are quite distinct, which means that different approaches to performance evaluation are needed. The first part of the talk will present a brief introduction to quantum annealing and how errors are modeled.
Quantum computing is a highly multidisciplinary field, and each sub-discipline has its own ideas about what would constitute a demonstration of superior quantum performance. The second part of the talk will review some prominent proposals and argue that they are largely incompatible, in the sense that a demonstration of superior quantum performance in one discipline will not satisfy researchers in other disciplines. I will give an update on what is known about the performance of quantum annealing processors manufactured by D-Wave Systems.
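For readers new to quantum annealing, the textbook transverse-field Ising formulation may help frame the first part of the talk. This is standard background rather than material from the abstract: the annealer evolves under a time-dependent Hamiltonian

```latex
H(s) = -\frac{A(s)}{2}\sum_{i}\sigma_x^{(i)}
       + \frac{B(s)}{2}\left(\sum_{i} h_i\,\sigma_z^{(i)}
       + \sum_{i<j} J_{ij}\,\sigma_z^{(i)}\sigma_z^{(j)}\right),
```

where the anneal fraction s runs from 0 to 1, A(s) decreases while B(s) increases, and the biases h_i and couplers J_ij encode the optimization problem. If the evolution is slow enough relative to the minimum spectral gap, the system ends near the ground state of the problem term, whose low-energy configurations are the sought solutions; deviations from this ideal closed-system picture are exactly where the error models mentioned in the abstract enter.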
Catherine McGeoch received her Ph.D. from Carnegie Mellon University in 1986 and spent almost 30 years on the faculty at Amherst College. Her research interests center on the development of methodologies for empirical performance evaluation of algorithms and heuristics. She co-founded the DIMACS Challenge series and the ALENEX meetings on Algorithm Engineering and Experiments, and is past editor-in-chief of the ACM Journal of Experimental Algorithmics. In 2014 she joined D-Wave Systems, where she now works on methods for assessing the performance of quantum annealing processors. She is the author of two books: “A Guide to Experimental Algorithmics” and “Adiabatic Quantum Computing and Quantum Annealing: Theory and Practice”.
FRANCISZEK RAKOWSKI
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Samsung R&D, Poland
Title: Predicting the course of the COVID-19 epidemic in Poland
Abstract: One of the most promising approaches to predicting the possible scenarios of an epidemic is based on agent-based models. The idea behind this model family is quite simple: reproduce the demographic and sociological structure of the society, and run simulations of the disease spreading through that structure. The direct reproduction of the contact structure makes it possible to investigate the consequences of various administrative measures, such as school closures or travel restrictions. The model results can be visualised as a dynamic map of the spreading disease and enable local assessment of disease-burden factors. Our model, constructed more than 10 years ago for influenza epidemics, has been revived and tuned to COVID-19 parameters. It now produces both scientific results and pragmatic reports, which are passed on to the Polish governmental authorities.
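To illustrate the agent-based idea sketched in the abstract (and only that idea), here is a minimal toy example. It is not the ICM model: the contact structure, all parameters, and the intervention switch are invented purely for demonstration.

```python
# Toy agent-based epidemic sketch. NOT the ICM model: the contact structure,
# the parameters and the intervention switch are invented for illustration.
import random
from collections import defaultdict

random.seed(42)

N_AGENTS = 10_000
P_TRANSMIT = 0.03             # per-contact, per-day transmission probability (made up)
INFECTIOUS_DAYS = 7
DAYS = 120
WORK_AND_SCHOOL_OPEN = True   # set False to mimic a closure intervention

# Assign every agent to a household and to a workplace/school context group.
groups = defaultdict(list)
for agent in range(N_AGENTS):
    groups[("home", agent // 4)].append(agent)    # households of 4
    groups[("work", agent // 25)].append(agent)   # workplaces/classes of 25

# Simple S/I/R state per agent, seeded with a few initial cases.
state = ["S"] * N_AGENTS
days_left = [0] * N_AGENTS
for seed_case in random.sample(range(N_AGENTS), 10):
    state[seed_case], days_left[seed_case] = "I", INFECTIOUS_DAYS

for day in range(DAYS):
    newly_infected = set()
    # Transmission happens only between members of the same context group.
    for (kind, _), members in groups.items():
        if kind == "work" and not WORK_AND_SCHOOL_OPEN:
            continue                              # intervention removes these contacts
        for src in (a for a in members if state[a] == "I"):
            for dst in members:
                if state[dst] == "S" and random.random() < P_TRANSMIT:
                    newly_infected.add(dst)
    # Disease progression, then apply the new infections.
    for agent in range(N_AGENTS):
        if state[agent] == "I":
            days_left[agent] -= 1
            if days_left[agent] == 0:
                state[agent] = "R"
    for agent in newly_infected:
        state[agent], days_left[agent] = "I", INFECTIOUS_DAYS
    if day % 10 == 0:
        print(f"day {day:3d}: infected={state.count('I'):5d} recovered={state.count('R'):5d}")
```

A real model of this family replaces the synthetic groups with census-derived households, schools, workplaces and travel patterns, which is precisely what makes administrative measures such as school closures directly testable in simulation.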
Franciszek Rakowski is an Affiliated Researcher at ICM University of Warsaw and Principal Data Scientist at the AI Institute, Samsung R&D, Poland. Almost 10 years ago, while working at ICM UW as a researcher, he led a project to construct a large-scale agent-based epidemic spread model for influenza.
Recently, during the COVID-19 pandemic, he became the project leader for the development and adaptation of the ICM epidemic model to the coronavirus disease. His interests also cover cognitive science, neuroscience and computational biology.
Michael Bussmann
CASUS – Center for Advanced Systems Understanding Helmholtz-Zentrum
Title: The discovery machines – how supercomputers will shape the future of data-driven science
Abstract: Research infrastructures play a key role in delivering high-quality scientific data to many scientific communities. In the future, we will face a tremendous increase in data volumes and data rates at these facilities, which will fundamentally change the role of computing there. With this change, new possibilities of using supercomputers for science arise. We will discuss what that future might look like, what is necessary to bring it to reality and – most importantly – how it will allow us to foster interdisciplinary science in a complex world.
A theoretical plasma physicist by origin, Michael is now extending his reach well beyond physics, setting up the Center for Advanced Systems Understanding (CASUS) in Görlitz, Germany. CASUS is a new institute for data-driven complex systems science, fostering interdisciplinary research on new digital methods for understanding real-world systems. Michael is the speaker for “Data Management & Analysis” for the Helmholtz Research Field Matter and part of the Helmholtz Incubator for Data and Information Science.
Scott Aaronson
UT Austin
Title: Quantum Computational Supremacy and Its Applications
Abstract: Last fall, a team at Google announced the first-ever demonstration of “quantum computational supremacy”—that is, a clear quantum speedup over a classical computer for some task—using a 53-qubit programmable superconducting chip called Sycamore.
Google’s accomplishment drew on a decade of research in my field of quantum complexity theory. This talk will discuss questions like:
- What exactly was the (contrived) problem that Sycamore solved?
- How does one verify the outputs using a classical computer?
- And how confident are we that the problem is classically hard—especially in light of subsequent counterclaims by IBM and others?
I’ll end with a possible application for Google’s experiment – namely, the generation of trusted public random bits, for use (for example) in cryptocurrencies – that I’ve been developing and that Google and NIST are now working to test.
Scott Aaronson is David J. Bruton Centennial Professor of Computer Science at the University of Texas at Austin. He received his bachelor’s degree from Cornell University and his PhD from UC Berkeley. Before coming to UT Austin, he spent nine years as a professor in Electrical Engineering and Computer Science at MIT. Aaronson’s research in theoretical computer science has focused mainly on the capabilities and limits of quantum computers. His first book, Quantum Computing Since Democritus, was published in 2013 by Cambridge University Press. He has received the National Science Foundation’s Alan T. Waterman Award, the United States PECASE Award, and the Tomassoni-Chisesi Prize in Physics.
Erol Gelenbe
Institute of Theoretical and Applied Informatics, Polish Academy of Sciences | CNRS I3S Laboratory, Université Côte d’Azur (France) | Visiting Professor, Imperial College London
Title: A Dynamic Self-Aware Approach to Cybersecurity
Abstract: This presentation will argue that cyberattacks impair not just security but also Quality of Service, and that they increase energy consumption in systems and networks. Thus not only do they cause damage to the users of a system, but they also impair its reputation and trust, and increase its operating costs. We will also take the view that these are dynamic phenomena which occur unexpectedly. Therefore future systems will have to constantly observe their own state in order to react very rapidly to dynamic attacks. We will suggest a Self-Aware approach to dynamically responding to cyberattacks, based on the Cognitive Packet Network dynamic routing algorithm, which uses Recurrent Random Neural Networks and Reinforcement Learning. Illustrations will be provided from two FP7 and H2020 projects that I proposed and which were funded by the European Union.
Erol Gelenbe, PhD, DSc, Dr. h.c. mult., has been elected a Fellow of IFIP, ACM and IEEE, and a member of the National Academy of Technologies of France, the science academies of Belgium, Hungary, Poland and Turkey, and Academia Europaea. A citizen of France and Turkey, he graduated from Ankara Koleji and the Middle East Technical University, Ankara, with High Honours. He has received several science prizes, including the ACM SIGMETRICS Life-Time Achievement Award, the IET Oliver Lodge Medal, the Grand Prix France-Telecom, the In Memoriam Gabor Denes Prize and the Mustafa Prize.
– Erol received his PhD in Electrical Engineering under Prof. Ed Smith at NYU for the thesis “Stochastic Automata with Structural Restrictions”, in which he showed the mathematical links between the state transition functions of probabilistic automata and the formal languages that they recognize; this work was published in Information and Computation and in IEEE Transactions on Computers.
– After a brief period at Philips Research, Eindhoven, designing a virtual memory for Algol-based “stack” computers, he became Assistant Professor at the University of Michigan, Ann Arbor, where he taught programming languages, algorithms and operating systems, and published research on reliable memory management in Comm. ACM and Acta Informatica.
– In 1972-73 at INRIA he pioneered research on Computer and System Performance Modelling using Queueing Networks, helping to solve the thrashing problem in operating systems with virtual memory, and contributing to the QNAP software package and INRIA’s first start-up SIMULOG. In 1973 he received a Doctorat ès Sciences under Prof. J.-L. Lions.
– Appointed Chair Professor at University of Liège in 1974, he continued as consultant at INRIA making seminal contributions to Diffusion Approximations, Optimum Checkpoints, and Optimum Control of Ethernet-like Channels published in several Journal ACM papers.
– Returning to Paris as Professor at Orsay in 1979 he co-founded the LRI (Laboratoire de Recherche en Informatique) with J.-C. Bermond and J. Vuillemin. At Orsay and University Paris-Descartes (1986-93), he invented G-Networks and Random Neural Networks, patented the first Voice-over-IP switch Sycomore (Thales), published seminal work on resequencing in codecs in Journal ACM, and other work in Journal of Applied Probability, Comm. ACM, Theoretical Computer Science, and Management Science. He founded two PhD programs, developed the commercial FLEXSIM manufacturing simulator, served as Ministerial Adviser for Science (1983-86), and co-founded IFIP WG7.3 with Paul Green (IBM Yorktown Heights).
– In 1993 he became New Jersey State Endowed Professor of Computer Science at NJIT (USA), and later Nello L. Teer Professor at Duke University, where he developed neuronal adaptive video compression methods and an algorithm for recognizing brain tumours in MRI images, published in the Proceedings of the IEEE. In 1998-2003, as Director of the School of EECS at the University of Central Florida, he invented the “Cognitive Packet Network” that routes packets adaptively in a network to optimize Quality of Service.
– At Imperial College (2003-19) he developed research on Self-Aware Networks, Cybersecurity and Energy-Aware Cloud Computing, publishing in IEEE and ACM Transactions, the Computer Journal and Physical Review, and giving keynotes at numerous conferences.
– After Brexit, he continues his research as Professor in the Institute of Theoretical and Applied Informatics, Polish Academy of Sciences, and is involved in several EU H2020 programs. He has coordinated two FP7 and H2020 Research Actions on Cybersecurity.
– He has graduated over 90 PhDs: some became ACM President, University Presidents and Provosts, Fellows of National Academies in France and Canada, industry leaders and professors in France, Canada, Greece, China, Turkey, Morocco, UK, and USA.
– Erol was awarded Chevalier de la Légion d’Honneur and Commandeur du Mérite (France), Commendatore al Merito and Grande Ufficiale della Stella d’Italia (Italy), and Doctor Honoris Causa from the Universities of Liège (Belgium), Roma II (Italy) and Bogazici (Turkey).
Alan Edelman
Massachusetts Institute of Technology
Title: High Performance Computing: The Power of Language
Abstract: Julia is now being used for high performance computing for the important problems of today including climate change and Covid-19. We describe how language is making all the difference.
Alan Edelman is a professor of applied mathematics at MIT, a member of MIT’s Computer Science & AI Laboratory, and the leader of the Julia Lab and Applied Computing Group at MIT. He is also a cofounder of Julia Computing Inc. He works on numerical linear algebra, random matrix theory and parallel computing. He is a fellow of SIAM, IEEE, and the American Mathematical Society. He has won numerous prizes for his research, most recently the IEEE Sidney Fernbach Award for innovation in high performance computing.
Simon Mutch
Research Fellow & STA STEM Ambassador | ARC Centre of Excellence for All Sky Astrophysics in 3D | School of Physics Senior Research Data Specialist | Melbourne Data Analytics Platform & Petascale Campus Initiative | The University of Melbourne, Australia
Title: HPC Simulations of the early Universe
Abstract: Understanding the formation and evolution of the first galaxies in the Universe is a vital piece of the puzzle in understanding how all galaxies, including our own Milky Way, came to be. It is also a key aim of major forthcoming international facilities such as the Square Kilometre Array and James Webb Space Telescope. In order to maximise what we can learn from observations made by these facilities, we need to be able to accurately simulate the early Universe and model how galaxies affected and interacted with their environments.
Dr Simon Mutch is a Postdoctoral Research Fellow in the Australian Research Council Centre of Excellence for All-Sky Astrophysics in 3 Dimensions (ASTRO 3D) and a Research Data Specialist in the Melbourne Data Analytics Platform (MDAP) based at the University of Melbourne. Dr Mutch received a Masters degree in Physics from the University of Edinburgh and was awarded his PhD in astrophysics from Swinburne University of Technology, Australia, in 2013. His astronomy research focuses on the first galaxies and their impact on the evolution of the Universe, which he studies using a combination of supercomputer simulations and theoretical modelling. Through his role with MDAP, he also collaborates with academics in areas such as climate science and ecology to help uplift their digital research capabilities. Dr Mutch contributes to the Australian astronomical community through a number of national committees and is an inaugural Science and Technology Australia STEM Ambassador, a position which sees him meet with policy- and decision-makers to advocate for the importance of STEM and related fields.
STEPHEN WOLFRAM
Founder & CEO, Wolfram Research
Title: Emerging Surprises in Applying the Computational Paradigm (and the Deep Relations between Physics, Distributed Computing and AI)
Stephen Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science and other books; the originator of the Wolfram Physics Project; and the founder and CEO of Wolfram Research.
Ivo F. Sbalzarini
Chair of Scientific Computing for Systems Biology, Faculty of Computer Science, TU Dresden;
MOSAIC Group, Center for Systems Biology Dresden;
Max Planck Institute of Molecular Cell Biology and Genetics, Dresden
Title: Computational Developmental Biology
Abstract: Our vision is to develop a computer simulation of a developing embryo, incorporating the known biochemistry and biophysics into a computational model in 3D space and time, which predicts the emerging shape and function of the tissue. Development and morphogenesis of tissues, organs, and embryos emerge from the collective self-organization of cells that communicate through chemical and mechanical signals. Decisions about growth, division, and migration are taken locally by each cell based on the collective information. In this sense, a developing tissue is akin to a massively parallel computer system, where each cell computes robust local decisions, integrating communication with other cells. Mechanistically understanding and reprogramming this system is a grand challenge. While the “hardware” (proteins, lipids, etc.) and the “source code” (genome) are increasingly known, we know virtually nothing about the algorithms that this code implements on this hardware. Using examples from our work, I will highlight computational challenges along the way. These range from content-adaptive data representations for machine learning, to novel languages for parallel high-performance computing, to virtual reality and real-time visualization for 3D microscopy and numerical simulations of nonlinear and non-equilibrium mechanical models. This cooperative interdisciplinary effort contributes to all of the disciplines involved.
Hiroaki Kitano
President at The Systems Biology Institute, Tokyo;
Professor at Okinawa Institute of Science and Technology Graduate University, Okinawa;
President & CEO at Sony Computer Science Laboratories, Inc., Tokyo;
Executive Vice President at Sony Corporation, Tokyo
Title: Nobel Turing Challenge — Creating the Engine of Scientific Discovery
Abstract: The Nobel Turing Challenge poses a new grand challenge for AI: to develop an AI system that can make major scientific discoveries in the biomedical sciences, discoveries worthy of a Nobel Prize. A series of human cognitive limitations prevents us from making accelerated scientific discoveries, particularly in the biomedical sciences. As a result, scientific discovery remains at the level of a cottage industry. AI systems can transform scientific discovery into a highly efficient practice, thereby enabling us to expand knowledge in unprecedented ways. Such systems may computationally explore all possible hypotheses and may redefine the nature of scientific intuition, and hence the scientific discovery process.
Hiroaki Kitano received a PhD in computer science from Kyoto University in 1991 for the thesis in machine translation titled “Speech-to-speech translation: a massively parallel memory-based approach”. His work includes a broad spectrum of publications on artificial intelligence and systems biology.
Kitano is known for developing AIBO, Sony’s entertainment robot, and for the grand challenge project known as RoboCup, which aims at developing a team of fully autonomous robots that can outperform the World Cup-winning team in soccer. He is also a pioneer of systems biology and has served as a scientific advisor for a number of research institutions and companies, including the European Molecular Biology Laboratory (EMBL), ALSTOM, and others. He was awarded the IJCAI Computers and Thought Award in 1993 and the Nature Award for Creative Mentoring in Science in 2009.
Nov 26, 2020
Hiroaki Kitano
President at The Systems Biology Institute, Tokyo;
Professor at Okinawa Institute of Science and Technology Graduate University, Okinawa;
President & CEO at Sony Computer Science Laboratories, Inc., Tokyo;
Executive Vice President at Sony Corporation, Tokyo
Nobel Turing Challenge — Creating the Engine of Scientific Discovery
Oct 22, 2020
Ivo F. Sbalzarini
Chair of Scientific Computing for Systems Biology, Faculty of Computer Science, TU Dresden; MOSAIC Group, Center for Systems Biology Dresden; Max Planck Institute of Molecular Cell Biology and Genetics, Dresden
Computational Developmental Biology
Sept 30, 2020
Nicolas Weber & Erich Focht
NEC Laboratories Europe, NEC HPC Europe
Effective Neural Networks *Without* GPU | SOL: Transparent Neural Network Acceleration on NEC SX-Aurora TSUBASA
June 18, 2020
STEPHEN WOLFRAM
Founder & CEO, Wolfram Research
Emerging Surprises in Applying the Computational Paradigm (and the Deep Relations between Physics, Distributed Computing and AI)
June 11, 2020
SARAH KENDERDINE
Digital Museology, Digital Humanities Institute | Lead: Laboratory for Experimental Museology (eM+) | Director: ArtLab | EPFL Lausanne Switzerland
Cultural data sculpting
June 04, 2020
Alan Edelman
Massachusetts Institute of Technology
High Performance Computing: The Power of Language
May 28, 2020
ANETA AFELT
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Espace-DEV, IRD – Institut de Recherche pour le Développement, Montpellier, France
The promises of the One Health concept in the age of the Anthropocene
May 21, 2020
FRANCISZEK RAKOWSKI
Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw | Samsung R&D, Poland
Predicting the course of the COVID-19 epidemic in Poland
May 14, 2020
CATHERINE McGEOCH
D-Wave Systems
What Do We Know About Performance of Quantum Annealing Systems?
May 7, 2020
Simon Mutch
Research Fellow & STA STEM Ambassador | ARC Centre of Excellence for All Sky Astrophysics in 3D | School of Physics Senior Research Data Specialist | Melbourne Data Analytics Platform & Petascale Campus Initiative | The University of Melbourne, Australia
HPC Simulations of the early Universe
April 30, 2020
Michael Bussmann
CASUS – Center for Advanced Systems Understanding Helmholtz-Zentrum
The discovery machines – how supercomputers will shape the future of data-driven science
April 8, 2020
Erol Gelenbe
Institute of Theoretical and Applied Informatics Polish Academy of Sciences, CNRS I3S Laboratory University Cote d’Azur (France), Visiting Professor Imperial College
A Dynamic Self-Aware Approach to Cybersecurity
INAUGURAL SEMINAR – April 1, 2020
Scott Aaronson
UT Austin
Quantum Computational Supremacy and Its Applications
Coming
VIRTUAL ICM SEMINARS AFTER SCFE20
The series of Virtual ICM Seminars in Computer and Computational Science will return in the new academic year 2020/2021.