
SOCR News & Events: 2023 JMM/AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data


The rapid growth in data volume and heterogeneity requires novel mathematical foundations for the representation, modeling, analysis, and interpretation of complex multisource information. This special session will explore the mathematical, physical, and computational aspects of tensors as one promising direction for data compression, classification, and model-based and model-free inference. By bringing together a broad range of experts, the session will provide a platform for open exchange of ideas, reports of recent developments, cross-fertilization of new techniques, translation of mathematical models into data-analytic techniques, and the embedding of application-specific constraints into mathematical formulations.

The session talks will cover new mathematical, computational, and statistical approaches for tensor-based representation, modeling, and inference, with direct applications to high-dimensional and longitudinal data. Topics include coupled tensor-tensor completion strategies, complex time (kime) representation and tensor linear modeling of kimesurfaces, and current advances in tensor computing. Various data science, biomedical health, environmental, climate, and econometrics applications will be showcased.
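As a concrete illustration of the low-rank tensor structure underlying several of these topics (compression, completion, CP decomposition), the following minimal Python/NumPy sketch is illustrative only and is not drawn from any of the talks. It builds a rank-3 CP tensor from random factor matrices and checks that the matrix rank of its mode-1 unfolding is bounded by the CP rank:

```python
import numpy as np

rng = np.random.default_rng(0)
R = 3  # CP rank

# Random factor matrices for a 10 x 12 x 14 third-order tensor
A, B, C = (rng.standard_normal((n, R)) for n in (10, 12, 14))

# Build T = sum_r A[:, r] (outer) B[:, r] (outer) C[:, r], a rank-R CP tensor
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Mode-1 unfolding: a 10 x (12*14) matrix whose matrix rank is at most R
T1 = T.reshape(10, -1)
print(np.linalg.matrix_rank(T1))  # 3
```

This is why low CP rank enables compression: the full tensor has 10 x 12 x 14 = 1680 entries, but the three factor matrices store only (10 + 12 + 14) x 3 = 108 numbers.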


Session Logistics

Abstract Submission

The organizers of this special session on Tensor Representation, Completion, Modeling and Analytics of Complex Data invite abstract submissions for the upcoming Joint Mathematics Meetings in Boston, MA, January 4-7, 2023 (Wednesday-Saturday). This 3-part session will meet on Wed-Thu, January 4-5, 2023. Abstracts for [20+5+5]-minute invited talks are welcome until the deadline, September 13, 2022. The JMM 2023 homepage contains a wealth of information about the JMM 2023 conference and this special session. Here are some AMS policies concerning special sessions:

  • Please note the deadline of *September 13* for abstract submission. We strongly encourage you to submit your abstract at least a week before that deadline in order to avoid any last-minute problems. When the "Conclude Submission" button is clicked, an email will be sent to the Presenting Author's email and the Submitter's email confirming receipt of the submission. Please note your abstract is *NOT* submitted until you click the "Conclude Submission" button.
  • Your talk must be delivered in person, via a computer projection system. Please note that overhead projectors will not be provided for sessions. Also, the meeting rooms will not have blackboards or whiteboards.
  • The AMS does not pay any expenses of faculty attending special sessions. However, PhD students are encouraged to apply for travel funding from AMS. The AMS also has a program of child care grants for all JMM attendees (faculty, students, etc.). Note that these programs have their own deadlines and application procedures.
  • Everyone who attends the meeting is required to pay a registration fee.


Session 1 (Wed 1/4/23, 8AM-12PM)

Time (US ET, GMT-5) | Presenter / Affiliation | Title | Classification - Abstract ID
8:00 AM | Mason A. Porter / UCLA | Node Centralities in Multilayer Networks | 15A99-17029
8:30 AM | Alex Townsend / Cornell | Why are so many matrices and tensors compressible? | 15-02-20266
9:00 AM | Joe Kileel / Texas | Estimation in Mixture Models Through Implicit Tensor Decomposition | 65F99-21426
9:30 AM | Luke Oeding / Auburn University | Dimensions of Restricted Secant Varieties of Grassmannians | 15A69-22223
10:00 AM | John Blake Temple / UC-Davis | On the regularity implied by the assumptions of geometry | 53B30-22415
10:30 AM | Yizhe Zhu / UC-Irvine | Non-backtracking spectra of random hypergraphs and community detection | 60C05-18313
11:00 AM | Bruno N. de Oliveira / University of Miami | Abundance of symmetric differential tensors, birational geometry of surfaces and hypersurfaces in \(\mathbb{P}^3\) | 14J60-22507
11:30 AM | Ivo Dinov / Michigan | Quantum Physics, Data Science, Tensor Linear Modeling, and Spacekime Analytics | 81Q65-14771

Session 2 (Wed 1/4/23, 1PM-6PM)

Session 2 (2270:SS19B): AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data II Date: Wednesday, January 4, 2023, Time: 1:00 – 6:00 PM, Location: Hynes Convention Center - 206

Time (US ET, GMT-5) | Presenter / Affiliation | Title | Classification - Abstract ID
1:00 PM | Anru Zhang / Duke | Tensor Learning in 2020s: Methodology, Theory, and Applications | 62H99-17081
1:30 PM | Giuseppe Cotardo / Virginia Tech | The Tensor Rank in Coding Theory | 03D15-20548
2:00 PM | Edinah Koffi Gnang / Johns Hopkins | On the complexity of hypermatrix equivalence | 15-02-20813
2:30 PM | Hirotachi (Hiro) Abo / Idaho | Algebro-geometric approaches to the tensor eigenproblem | 15A69-20858
3:00 PM | Oscar Fabian Lopez / Florida Atlantic | Zero-Truncated Poisson Regression for Sparse Multiway Count Data Corrupted by False Zeros | 15B99-21425
3:30 PM | Anna Konstorum / Yale | Optimizing component recovery in CP decomposition of immunology data | 92-08-21503
4:00 PM | Tianyi Shi / Lawrence Berkeley National Laboratory | Tensor equation methods for electron correlation energy computation | 65F99-21870
4:30 PM | Jonathan Gryak / CUNY | Tensor Denoising via Amplification and Stable Rank Methods | 15A72-22096
5:00 PM | M. Alex O. Vasilescu / UCLA | Kernel Tensor Factor Analysis | 15-06-22645

Session 3 (Thu 1/5/23, 8AM-11:30AM)

Session 3 (2270:SS19C): AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data III Date: Thursday, January 5, 2023, Time: 8:00 – 11:30 AM, Location: Hynes Convention Center - 206

Time (US ET, GMT-5) | Presenter / Affiliation | Title | Classification - Abstract ID
8:00 AM | Elina Robeva / UBC | High-order Cumulants for Learning Linear Non-Gaussian Causal Models | 62H22-18002
8:30 AM | Zhen Dai / Chicago | From tensor rank to the inversion of a complex matrix | 65F05-19986
9:00 AM | Eric Evert / KU Leuven | Best low rank approximations of positive definite tensors | 15-02-20965
9:30 AM | Hajer Bouzaouache / University Tunis, El Manar | On the Importance of Tensor representation in the stability analysis of Nonlinear systems | 93D05-22326
10:00 AM | Rachel Minster / Wake Forest University | Randomized Parallel Algorithms for Tucker Decompositions | 65F99-21497
10:30 AM | Harm Derksen / Northeastern | Tensor Denoising via Amplification and Stable Rank Methods | 15-02-20663
11:00 AM | Maryam Bagherian / Michigan | Tensor Recovery Under Metric Learning Constraints | 68U99-18227

Speakers, Titles, and Abstracts

  • ...
... abstract...

