{"id":49,"date":"2021-05-20T11:37:30","date_gmt":"2021-05-20T11:37:30","guid":{"rendered":"http:\/\/nbmds.uva.es\/?page_id=49"},"modified":"2021-12-02T09:09:00","modified_gmt":"2021-12-02T09:09:00","slug":"plenary-speakers","status":"publish","type":"page","link":"http:\/\/nbmds.uva.es\/plenary-speakers\/","title":{"rendered":"Plenary Speakers"},"content":{"rendered":"


Plenary Speakers



Joan Bruna

Courant Institute, New York University

Title: Prospects and Challenges of Machine Learning in the Physical World (video)

Abstract:

The last decade has witnessed an experimental revolution in data science, led by the huge empirical success of deep learning methods across many areas of science and engineering. In order to capitalise on these successes, it has become increasingly important to provide a mathematical foundation that gives guiding design principles, and mitigates the current data 'hunger' of these DL architectures, to enable further applications within computational science.

In this talk, we will describe the crucial role that data structure plays in constructing such foundations. Existing mathematical models are mostly agnostic to data structure, and as a result rely on strong assumptions in order to break the curse of dimensionality. Alternatively, we will present a geometrical perspective that unifies all successful DL architectures (CNNs, RNNs, Transformers, GNNs) from the principles of symmetry and scale separation, providing a viable mathematical picture where the curse of dimensionality is avoided under more realistic assumptions. We will cover theoretical analyses that highlight the role of data structure, as well as applications of these models to computational science, emphasizing the incipient area of 'Scientific Machine Learning'.
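For illustration only, the short Python sketch below (an editorial example, not material from the talk; all names and values are toy assumptions) checks numerically that a shared per-node map followed by sum pooling, the basic aggregation step behind GNN-style architectures, is invariant to relabelling the nodes, which is the symmetry principle referred to above.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # toy node features: 5 nodes, 3 features each
W = rng.normal(size=(3, 2))   # a weight matrix shared across all nodes

def readout(X, W):
    """Apply the same map to every node, then sum-pool over nodes."""
    return np.tanh(X @ W).sum(axis=0)

perm = rng.permutation(5)     # relabel the nodes
print(np.allclose(readout(X, W), readout(X[perm], W)))  # True: permutation invariant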


Coralia Cartis

University of Oxford

Title: Challenges and improvements in optimization algorithms for machine learning (video)

Abstract:

We will discuss some key challenges to optimization algorithm development arising from machine learning. In particular, we investigate dimensionality reduction techniques in the variable/parameter domain for local and global optimization; these rely crucially on random projections. We describe and use sketching results that allow efficient projections to low dimensions while preserving useful properties, as well as other useful tools from random matrix theory and conic integral geometry. We focus on functions with low effective dimensionality, which are conjectured to provide an insightful proxy for neural network landscapes. Time permitting, we will also discuss first- versus second-order optimization methods for training, and/or stochastic variants of classical optimization methods that allow biased noise and adaptive parameters and have almost-sure convergence.
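To make the random-projection idea concrete, here is a minimal editorial sketch in Python (a toy construction under our own assumptions, not code from the talk): a 100-dimensional objective whose value depends on only two hidden directions is minimized through a random Gaussian sketch of dimension five, which with high probability preserves the optimum.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
D, d_eff, d_sketch = 100, 2, 5      # ambient, effective, and sketch dimensions
T = rng.normal(size=(d_eff, D))     # hidden matrix defining the effective subspace

def f(x):
    """f depends on x only through T @ x, so its effective dimension is 2."""
    z = T @ x
    return (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2

A = rng.normal(size=(D, d_sketch)) / np.sqrt(d_sketch)  # Gaussian sketch
res = minimize(lambda y: f(A @ y), np.zeros(d_sketch))  # optimize in the sketched space
print(res.fun)  # close to 0: the random subspace captures the effective directions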


Marco Cuturi

Google Brain / ENSAE, Institut Polytechnique de Paris

Title: Learning through ambiguity: differentiable matchings and mappings (video)

Abstract:

Data points (or, more generally, entire datasets) studied under the lens of ML undergo shifts of all types. In many machine learning tasks, such as fair classification, domain adaptation or robustness against attacks, crafting classifiers that can handle such shifts is crucial. In several fields of science (such as single-cell genomics or neuroscience), understanding and modeling the dynamics of such shifts is of the essence.

For all of these examples, a canonical strategy is to lift the ambiguity inherent to such shifts through some form of {matching, assignment, registration} step that helps build correspondences across datasets (e.g., match a cell observed at time t1 with another at time t2; an individual from group A to another in minority B; an image in setup 1 to another in setup 2; a word or a sentence in a {language, paragraph} to another in a different {language, paragraph}). These correspondences can be costly to compute and even more difficult to differentiate; it can also be challenging to extend them to out-of-sample settings.

I will show in this work how recent ideas in the fields of optimal transport and differentiable programming have joined forces to provide scalable software solutions, offering new ways to learn through the ambiguity of shifts with the flexibility of end-to-end learning pipelines. Entropic regularization, quantile transforms, input-convex neural networks and the implicit function theorem will each play a role in these approaches.
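As one hedged illustration of how entropic regularization yields a smooth coupling between datasets (an editorial Python sketch, not the speaker's code; the point clouds, sizes and the regularization strength eps are arbitrary choices), the snippet below runs plain Sinkhorn iterations to build a soft matching between two small point clouds.

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(4, 2))                          # source points
y = rng.normal(size=(5, 2)) + 1.0                    # shifted target points
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
a, b = np.full(4, 1 / 4), np.full(5, 1 / 5)          # uniform marginal weights
eps = 0.5                                            # entropic regularization strength

K = np.exp(-C / eps)
u = np.ones(4)
for _ in range(200):                                 # Sinkhorn fixed-point iterations
    v = b / (K.T @ u)
    u = a / (K @ v)

P = u[:, None] * K * v[None, :]                      # smooth transport plan ("soft matching")
print(P.sum(axis=1), P.sum(axis=0))                  # marginals approximately equal a and b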


Jeff Goldsmith

Columbia University

Title: Functional data methods for wearable device data (video)

Abstract:

In the last ten years, technological advances have made many activity- and physiology-monitoring wearable devices available for use in large-scale epidemiological studies. This trend is likely to continue and even expand as devices become cheaper and more reliable. These developments open up a tremendous opportunity for clinical and public health researchers to collect critical data at an unprecedented level of detail, while posing new challenges for the statistical analysis of rich, complex data. This talk will present a collection of approaches in functional data analysis for identifying and interpreting variability in activity trajectories within and across participants, for building regression models in which activity trajectories are the response, and for understanding shifts in the circadian rhythms that underlie the timing of activity. We'll draw on several applications, including the Baltimore Longitudinal Study of Aging and data collected through the Columbia Center for Children's Environmental Health.
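As a small, hypothetical illustration of the functional-data viewpoint (synthetic curves invented for this sketch, not data or code from the talk), the Python snippet below simulates daily activity profiles with subject-specific timing and amplitude, then extracts the leading modes of variability with a basic functional PCA via the SVD.

import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 24, 144)                        # time of day on a 10-minute grid
n = 50                                             # participants
shift = rng.normal(0, 2, size=n)                   # subject-specific timing of the activity peak
scale = rng.lognormal(0, 0.3, size=n)              # subject-specific overall activity level
Y = scale[:, None] * np.exp(-((t[None, :] - 14 - shift[:, None]) ** 2) / 20)
Y += rng.normal(0, 0.05, size=Y.shape)             # measurement noise

Yc = Y - Y.mean(axis=0)                            # center curves across participants
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)  # functional PCA via the SVD
print((s ** 2 / (s ** 2).sum())[:3])               # share of variability in the leading modes
# Rows of Vt are principal activity patterns; U[:, :k] * s[:k] are subject-level scores.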


Alfio Quarteroni

Politecnico di Milano

Title: Physics-Based and Data-Driven-Based Algorithms for the Simulation of the Heart Function (video)

Abstract:

In this talk, I will present a mathematical model that is suitable for simulating cardiac function, thanks to its capability to describe the interaction between the electrical, mechanical, and fluid-dynamical processes occurring in the heart.

The model comprises a system of nonlinear differential equations (both ordinary and partial) featuring a multi-physics and multi-scale nature. Efficient numerical strategies are devised to allow for the analysis of both heart function and dysfunction. These strategies rely on both classical physics-based numerical discretization methods and machine-learning algorithms, as well as on their interplay.

Acknowledgment: The work presented in this talk is part of the project iHEART, which has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (grant agreement No 740132).
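As a heavily simplified, purely illustrative sketch (our own example, not part of the iHEART model), the Python snippet below integrates the FitzHugh-Nagumo equations, a classic reduced ODE description of cardiac-cell excitability, with explicit Euler; it stands in for the "ordinary" part of the coupled system described above, using standard textbook parameter values.

import numpy as np

a, b, tau, I_ext = 0.7, 0.8, 12.5, 0.5    # standard FitzHugh-Nagumo parameters
dt, steps = 0.01, 20000                    # explicit Euler time step and horizon
v, w = -1.0, 1.0                           # membrane potential and recovery variable

trace = np.empty(steps)
for k in range(steps):
    dv = v - v ** 3 / 3 - w + I_ext        # fast excitation dynamics
    dw = (v + a - b * w) / tau             # slow recovery dynamics
    v, w = v + dt * dv, w + dt * dw
    trace[k] = v

print(trace.min(), trace.max())            # sustained oscillations (repeated action potentials)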

