The last decade has witnessed an experimental revolution in data science, led by the huge empirical success of deep learning methods across many areas of science and engineering. In order to capitalise on these successes, it has become increasingly important to provide a mathematical foundation that gives guiding design principles and mitigates the current data 'hunger' of these DL architectures, to enable further applications within computational science.
In this talk, we will describe the crucial role that data structure plays in constructing such foundations. Existing mathematical models are mostly agnostic to data structure and, as a result, rely on strong assumptions in order to break the curse of dimensionality. Instead, we will present a geometrical perspective that unifies all successful DL architectures (CNNs, RNNs, Transformers, GNNs) from the principles of symmetry and scale separation, providing a viable mathematical picture in which the curse of dimensionality is avoided under more realistic assumptions. We will cover theoretical analyses that highlight the role of data structure, as well as applications of these models to computational science, emphasizing the incipient area of 'Scientific Machine Learning'.
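To make the symmetry principle concrete: a convolution is a linear map that commutes with translations, and this equivariance can be checked numerically. The sketch below is a minimal NumPy illustration under assumed toy inputs (the signal and kernel are arbitrary), not code from the talk; it verifies that shifting the input and then convolving agrees with convolving and then shifting.

    import numpy as np

    def circular_conv(x, w):
        # 1-D circular convolution: y[i] = sum_k w[k] * x[(i - k) mod n]
        n = len(x)
        return np.array([sum(w[k] * x[(i - k) % n] for k in range(len(w)))
                         for i in range(n)])

    def shift(x, s):
        # Cyclic translation of the signal by s positions.
        return np.roll(x, s)

    rng = np.random.default_rng(0)
    x = rng.standard_normal(8)   # illustrative signal
    w = rng.standard_normal(3)   # illustrative kernel

    # Translation equivariance: conv(shift(x)) == shift(conv(x)).
    lhs = circular_conv(shift(x, 2), w)
    rhs = shift(circular_conv(x, 2 * 0 + 2) if False else circular_conv(x, w), 2)
    assert np.allclose(lhs, rhs)

The same commutation property, with translations replaced by other group actions (e.g. permutations for GNNs and self-attention), is the sense in which these architectures share one geometric blueprint.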