Lunch will begin at 11:45am
Abstract:
Tensor-valued data, which arise naturally in neuroimaging, genomics, climate science, and spatiotemporal networks, encode rich multilinear dependencies that are often lost under vectorization. This paper develops a unified statistical framework for tensor neural networks (TNNs) that provides finite-sample guarantees, uncertainty quantification, and provable structure discovery. The proposed Dual-Channel Tensor Neural Network (DC-TNN) decomposes each tensor into a low-rank core and a sparse refinement component processed in parallel through coupled tensor layers, unifying CP-, Tucker-, and Tensor-Train-based formulations for joint learning of global and localized structures.
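For concreteness, here is a minimal sketch of the dual-channel idea, assuming a Tucker-style low-rank channel coupled with a soft-thresholded sparse refinement channel. The module name, ranks, and coupling below are illustrative assumptions, not the paper's exact DC-TNN layer.

```python
# Hypothetical sketch of a dual-channel tensor layer (not the paper's exact DC-TNN).
# Channel 1: Tucker-style low-rank projection of a 3-way input tensor.
# Channel 2: sparse refinement via soft-thresholding of the residual.
import torch
import torch.nn as nn


class DualChannelTensorLayer(nn.Module):
    def __init__(self, dims=(16, 16, 16), ranks=(4, 4, 4), threshold=0.1):
        super().__init__()
        # One factor matrix U_k per tensor mode (Tucker-style).
        self.factors = nn.ParameterList(
            [nn.Parameter(torch.randn(d, r) * 0.1) for d, r in zip(dims, ranks)]
        )
        self.threshold = threshold

    def mode_product(self, x, u, mode):
        # Multiply tensor x (batch, d1, d2, d3) by u along the given mode.
        x = torch.movedim(x, mode + 1, -1)           # bring the target mode last
        x = x @ u                                     # contract that mode with u
        return torch.movedim(x, -1, mode + 1)         # move it back into place

    def forward(self, x):
        # Low-rank channel: project onto the core, then reconstruct.
        core = x
        for k, u in enumerate(self.factors):
            core = self.mode_product(core, u, k)      # (batch, r1, r2, r3)
        low_rank = core
        for k, u in enumerate(self.factors):
            low_rank = self.mode_product(low_rank, u.t(), k)  # back to (batch, d1, d2, d3)

        # Sparse refinement channel: soft-threshold what the low-rank core misses.
        residual = x - low_rank
        sparse = torch.sign(residual) * torch.clamp(residual.abs() - self.threshold, min=0)

        # Coupled output: global low-rank structure plus localized refinement.
        return low_rank + sparse


if __name__ == "__main__":
    layer = DualChannelTensorLayer()
    x = torch.randn(8, 16, 16, 16)                    # a batch of 3-way tensors
    print(layer(x).shape)                             # torch.Size([8, 16, 16, 16])
```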
For uncertainty quantification, we introduce Core-Refined Conformal ROC (CoR-ROC) bands that perform conformal calibration in the tensor’s core-refinement latent space, yielding sharper and less conservative confidence regions than naive conformal methods. We establish finite-sample coverage guarantees for CoR-ROC and derive non-asymptotic generalization and oracle bounds for DC-TNN estimators.
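The coverage logic can be pictured with a standard split-conformal calibration step applied to scores computed in a latent space. In the sketch below, `encode` and `predict_latent` are hypothetical placeholders standing in for a core-refinement map, and the worst-case latent error score is a simplification, not the CoR-ROC band construction itself.

```python
# Minimal split-conformal sketch, assuming scores are computed in a latent
# (core-refinement) space. `encode` and `predict_latent` are placeholders.
import numpy as np


def conformal_quantile(scores, alpha):
    # Finite-sample-adjusted empirical quantile used by split conformal methods.
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")


def calibrate_latent_band(encode, predict_latent, X_cal, Y_cal, alpha=0.1):
    # Nonconformity: worst-case absolute error in the latent representation.
    scores = np.array(
        [np.max(np.abs(encode(y) - predict_latent(x))) for x, y in zip(X_cal, Y_cal)]
    )
    return conformal_quantile(scores, alpha)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    encode = lambda y: y.mean(axis=(0, 1))            # toy latent map: mode-wise means
    predict_latent = lambda x: x.mean(axis=(0, 1))    # toy predictor in the same space
    X_cal = rng.normal(size=(200, 8, 8, 3))
    Y_cal = X_cal + 0.1 * rng.normal(size=X_cal.shape)
    q = calibrate_latent_band(encode, predict_latent, X_cal, Y_cal, alpha=0.1)
    print(f"latent band half-width: {q:.4f}")         # band = predict_latent(x) +/- q
```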
To select among candidate tensor decompositions, we further propose a Conformal Structure Selector (CoSS), which provides the first distribution-free statistical test for low-rank structure identification in TNNs. Simulations and real-data experiments confirm that DC-TNNs deliver competitive predictive accuracy and reliably recover the underlying tensor structure. Together, these developments offer a principled foundation for learning, quantifying, and selecting tensor structures within modern neural networks.
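As a rough illustration of a distribution-free structure check, the sketch below computes a conformal p-value for whether a held-out sample conforms to a candidate rank-r fit; the truncated-SVD fit on matrices, the residual score, and the decision rule are placeholder assumptions and do not reproduce the CoSS test.

```python
# Illustrative only: a conformal p-value for checking whether held-out data conform
# to a candidate low-rank structure. The rank-r fit here is a plain truncated SVD.
import numpy as np


def lowrank_residual(M, r):
    # Reconstruction error after keeping the top-r singular directions.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    approx = (U[:, :r] * s[:r]) @ Vt[:r]
    return np.linalg.norm(M - approx)


def conformal_p_value(cal_scores, test_score):
    # Distribution-free p-value under exchangeability of calibration and test scores.
    n = len(cal_scores)
    return (1 + np.sum(cal_scores >= test_score)) / (n + 1)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    make = lambda rank: (rng.normal(size=(20, rank)) @ rng.normal(size=(rank, 20))
                         + 0.05 * rng.normal(size=(20, 20)))
    cal = [make(2) for _ in range(100)]          # calibration sample with rank-2 structure
    scores = np.array([lowrank_residual(M, 2) for M in cal])
    for name, M in [("rank-2 test sample", make(2)), ("rank-4 test sample", make(4))]:
        p = conformal_p_value(scores, lowrank_residual(M, 2))
        print(f"{name}: conformal p-value = {p:.3f}")  # small p flags structural mismatch
```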
Bio:
Dr. Elynn Chen is an Assistant Professor of Technology, Operations and Statistics (TOPS) at NYU Stern School of Business. Her research focuses on developing innovative methodologies for data-driven decision-making and complex data analysis, with applications spanning business, economics, and healthcare domains.
Before joining NYU Stern in 2021, she completed postdoctoral fellowships at UC Berkeley's EECS department with Prof. Michael I. Jordan and at Princeton University's ORFE department with Prof. Jianqing Fan. She also served as a Research Scholar at OpenAI in 2019. Her research contributions have been recognized with the NSF Postdoc Award.
Dr. Chen holds a Ph.D. in Statistics from Rutgers University, where she was advised by Prof. Rong Chen and worked on tensor time series. Her research spans three main areas: tensor learning, statistical reinforcement learning, and transfer learning. In tensor learning, she develops efficient algorithms for multi-dimensional data analysis that offer more natural representations of complex physical phenomena. Her work in statistical reinforcement learning focuses on designing algorithms for social applications across the business, education, and healthcare sectors, with the goal of optimizing decision-making and improving outcomes. She also investigates transfer learning methods that enhance performance across related tasks, particularly in the contexts of reinforcement learning and tensor analysis.

Her work has had significant impact in fields including international trade, corporate finance, and clinical dynamic treatments. Through her research and teaching, she continues to bridge the gap between complex statistical methodologies and their practical applications in business and healthcare settings. Website: https://elynncc.github.io/