Difference between revisions of "SOCR JMM 2023"

== [[SOCR_News | SOCR News & Events]]: 2023 JMM/AMS Special Session on ''Tensor Representation, Completion, Modeling and Analytics of Complex Data'' ==
 
[[Image:BigData_AMS_JMM_2014.gif|250px|thumbnail|right| [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_program.html 2023 JMM/AMS Tensor Analytics Session] ]]
  
 
==Overview==
The accelerated rate of increase of data volume and heterogeneity requires novel mathematical foundations for representation, modeling, analysis, and interpretation of complex multisource information. This special session will explore the mathematical, physical, and computational aspects of tensors as one promising direction for data compression, classification, and model-based and model-free inference. By bringing together a broad range of experts, the session will provide a platform for open exchange of ideas, reports of recent developments, cross-fertilization of new techniques, translation of mathematical models into data-analytic techniques, and the embedding of application-specific constraints into mathematical formulations.

The session talks will cover new mathematical, computational, and statistical approaches for tensor-based representation, modeling, and inference, with direct applications to high-dimensional and longitudinal data. Topics include coupled tensor-tensor completion strategies, complex-time (kime) representation and tensor linear modeling of kimesurfaces, and recent advances in tensor computing. Applications in data science, biomedical health, environmental, climate, and econometric settings will be showcased.
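
As a quick illustration of the tensor representation and compression ideas the session surveys, the Python sketch below (illustrative only; the function names, the test tensor, and the ranks (2, 2, 2) are our own choices, not taken from any session talk) compresses a 3-way array with low multilinear rank via a truncated higher-order SVD (HOSVD), one of the standard Tucker-style decompositions:

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode (mode-n unfolding)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: a small core tensor plus one factor matrix per mode."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_multiply(core, U.T, m)
    return core, factors

def reconstruct(core, factors):
    """Expand the compressed (core, factors) pair back into a full tensor."""
    T = core
    for m, U in enumerate(factors):
        T = mode_multiply(T, U, m)
    return T

rng = np.random.default_rng(0)
# Build a 10x10x10 tensor with multilinear rank (2, 2, 1):
# T[i, j, k] = A[i, j] * v[k], where rank(A) = 2.
A = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 10))
v = rng.standard_normal(10)
T = A[:, :, None] * v[None, None, :]

core, factors = hosvd(T, (2, 2, 2))
rel_err = np.linalg.norm(reconstruct(core, factors) - T) / np.linalg.norm(T)
# Storage drops from 1000 entries to 2*2*2 + 3*(10*2) = 68 entries, and because
# the truncation ranks dominate the multilinear rank, the reconstruction is
# exact up to floating-point error.
print(f"relative reconstruction error: {rel_err:.1e}")
```

The same pipeline degrades gracefully on noisy data: truncating to ranks below the multilinear rank yields the dominant multilinear structure, which is the starting point for the completion and denoising methods discussed in the talks.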
  
 
== Organizers==

* [https://umich.edu/~dinov Ivo Dinov], [https://www.umich.edu University of Michigan], [https://www.socr.umich.edu SOCR], [https://midas.umich.edu MIDAS].
* [https://experts.umich.edu/discover/experts_publication?and_facet_profiles_author=5597 Joshua Welch], [https://www.umich.edu University of Michigan], [https://welch-lab.github.io/ WelchLab].
  
 
==Session Logistics==
[[Image:JMM_2023_banner_Boston.jpg|300px|thumbnail|right| [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_program.html 2023 JMM/AMS Tensor Reps & Analytics] ]]
 
* '''Date/Time''':
** [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_program_wednesday.html '''Session 1 (2270:SS19A)''': Wed Jan. 4, 2023, 8:00AM - 12:00PM], [https://www.timeanddate.com/time/zones/et US Eastern time zone (GMT-5)]
** [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_program_thursday.html '''Session 2 (2270:SS19B)''': Wed Jan. 4, 2023, 1:00PM - 6:00PM], [https://www.timeanddate.com/time/zones/et US Eastern time zone (GMT-5)]
** [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_program_thursday.html '''Session 3 (2270:SS19C)''': Thu Jan. 5, 2023, 8:00AM - 12:00PM], [https://www.timeanddate.com/time/zones/et US Eastern time zone (GMT-5)]
* '''Venue''': [https://www.massconvention.com/about-us/contact-us/john-b-hynes-veterans-memorial-convention-center Hynes Convention Center], [https://www.signatureboston.com/hynes/floor-plans-and-specs/space-finder/hynes-meeting-room-206 Room 206].
* '''Registration''': [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_reg Meeting registration is required].
* '''Conference''': [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_program.html 2023 Joint Mathematics Meetings (JMM)], session 2270:SS19 (A, B, C).
* '''Session Format''': Three half-day sessions, four to five hours each, with [20+5+5]-minute talks.
* '''Equipment''': Computer projector and screen with HDMI connection. The session room does not include overhead transparency projectors, computers/laptops, blackboards, or whiteboards.
* '''AMS Special Session Manual''': [http://www.ams.org/meetings/meet-specialsessionmanual AMS/JMM Session Manual].
* [https://meetings.ams.org/math/jmm2023/cfp.cgi '''Paper Abstract Submission''']: click the AMS (American Mathematical Society) “BEGIN A SUBMISSION” button. The '''deadline for submission of invited abstracts has passed''' (September 13, 2022). Interested presenters are encouraged to [https://meetings.ams.org/math/jmm2023/cfp.cgi submit contributed talk abstracts].
* [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_policy JMM'23 policies/procedures].
* [https://myumi.ch/DJ75R Session URL]: https://myumi.ch/DJ75R.
  
== Abstract Submission==

The organizers of this special session on ''Tensor Representation, Completion, Modeling and Analytics of Complex Data'' invite abstract submissions for the upcoming Joint Mathematics Meetings in Boston, MA, January 4-7, 2023 (Wednesday-Saturday). This three-part session will meet Wednesday-Thursday, January 4-5, 2023. Abstracts for invited [20+5+5]-minute talks are welcome until the deadline, September 13, 2022.

The [https://www.jointmathematicsmeetings.org/meetings/national/jmm2023/2270_intro JMM 2023 homepage] contains a wealth of information about the JMM 2023 conference and this special session. Here are some AMS policies concerning special sessions:

* Each speaker '''must''' submit an abstract before a talk can be scheduled. [https://meetings.ams.org/math/jmm2023/cfp.cgi Abstracts must be submitted electronically through the AMS portal]: click the AMS (American Mathematical Society) “BEGIN A SUBMISSION” button.
* Please note the '''September 13''' deadline for abstract submission. We strongly encourage you to submit your abstract at least a week before the deadline to avoid any last-minute problems. When the "Conclude Submission" button is clicked, a confirmation email will be sent to the Presenting Author and the Submitter. Your abstract is '''not''' submitted until you click the "Conclude Submission" button.
* Your talk must be delivered in person, via a computer projection system. Overhead projectors will not be provided, and the meeting rooms will not have blackboards or whiteboards.
* The AMS does not pay any expenses of faculty attending special sessions. However, [http://www.ams.org/student-travel PhD students are encouraged to apply for travel funding from the AMS]. The AMS also has a [http://www.ams.org/profession/opportunities/meetings-child-care-grants program of child care grants for all JMM attendees (faculty, students, etc.)]. Note that these programs have their own deadlines and application procedures.
* Everyone who attends the meeting is required to pay a registration fee.
  
 
== Program==
=== Session 1 (Wed 1/4/23, 8AM-12PM) ===
<center>
{| class="wikitable"
|-
! Time [https://www.timeanddate.com/time/zones/et US ET time zone (GMT-5)] || Presenter/Affiliation || Title || Classification - Abstract ID
|-
| 8:00AM || Mason A. Porter / UCLA || ''Node Centralities in Multilayer Networks'' || 15A99-17029
|-
| 8:30AM || Alex Townsend / Cornell || ''Why are so many matrices and tensors compressible?'' || 15-02-20266
|-
| 9:00AM || Joe Kileel / Texas || ''Estimation in Mixture Models Through Implicit Tensor Decomposition'' || 65F99-21426
|-
| 9:30AM || Luke Oeding / Auburn University || ''Dimensions of Restricted Secant Varieties of Grassmannians'' || 15A69-22223
|-
| 10:00AM || John Blake Temple / UC-Davis || ''On the regularity implied by the assumptions of geometry'' || 53B30-22415
|-
| 10:30AM || Yizhe Zhu / UC-Irvine || ''Non-backtracking spectra of random hypergraphs and community detection'' || 60C05-18313
|-
| 11:00AM || Bruno N. de Oliveira / University of Miami || ''Abundance of symmetric differential tensors, birational geometry of surfaces and hypersurfaces in <math>\mathbb{P}^3</math>'' || 14J60-22507
|-
| 11:30AM || Ivo Dinov / Michigan || ''Quantum Physics, Data Science, Tensor Linear Modeling, and Spacekime Analytics'' || 81Q65-14771
|}
</center>
  
=== Session 2 (Wed 1/4/23, 1PM-6PM) ===
Session 2 (2270:SS19B): AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data II.
Date: Wednesday, January 4, 2023, Time: 1:00 – 6:00 PM, Location: Hynes Convention Center, Room 206.
<center>
{| class="wikitable"
|-
! Time [https://www.timeanddate.com/time/zones/et US ET time zone (GMT-5)] || Presenter/Affiliation || Title || Classification - Abstract ID
|-
| 1:00PM || Anru Zhang / Duke || ''Tensor Learning in 2020s: Methodology, Theory, and Applications'' || 62H99-17081
|-
| 1:30PM || Giuseppe Cotardo / Virginia Tech || ''The Tensor Rank in Coding Theory'' || 03D15-20548
|-
| 2:00PM || Edinah Koffi Gnang / Johns Hopkins || ''On the complexity of hypermatrix equivalence'' || 15-02-20813
|-
| 2:30PM || Hirotachi (Hiro) Abo / Idaho || ''Algebro-geometric approaches to the tensor eigenproblem'' || 15A69-20858
|-
| 3:00PM || Oscar Fabian Lopez / Florida Atlantic || ''Zero-Truncated Poisson Regression for Sparse Multiway Count Data Corrupted by False Zeros'' || 15B99-21425
|-
| 3:30PM || Anna Konstorum / Yale || ''Optimizing component recovery in CP decomposition of immunology data'' || 92-08-21503
|-
| 4:00PM || Tianyi Shi / Lawrence Berkeley National Laboratory || ''Tensor equation methods for electron correlation energy computation'' || 65F99-21870
|-
| 4:30PM || Jonathan Gryak / CUNY || ''Tensor Denoising via Amplification and Stable Rank Methods'' || 15A72-22096
|-
| 5:00PM || M. Alex O. Vasilescu / UCLA || ''Kernel Tensor Factor Analysis'' || 15-06-22645
|}
</center>
=== Session 3 (Thu 1/5/23, 8AM-11:30AM) ===
Session 3 (2270:SS19C): AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data III.
Date: Thursday, January 5, 2023, Time: 8:00 – 11:30 AM, Location: Hynes Convention Center, Room 206.
<center>
{| class="wikitable"
|-
! Time [https://www.timeanddate.com/time/zones/et US ET time zone (GMT-5)] || Presenter/Affiliation || Title || Classification - Abstract ID
|-
| 8:00AM || Elina Robeva / UBC || ''High-order Cumulants for Learning Linear Non-Gaussian Causal Models'' || 62H22-18002
|-
| 8:30AM || Zhen Dai / Chicago || ''From tensor rank to the inversion of a complex matrix'' || 65F05-19986
|-
| 9:00AM || Eric Evert / KU Leuven || ''Best low rank approximations of positive definite tensors'' || 15-02-20965
|-
| 9:30AM || Hajer Bouzaouache / University of Tunis El Manar || ''On the Importance of Tensor representation in the stability analysis of Nonlinear systems'' || 93D05-22326
|-
| 10:00AM || Rachel Minster / Wake Forest University || ''Randomized Parallel Algorithms for Tucker Decompositions'' || 65F99-21497
|-
| 10:30AM || Harm Derksen / Northeastern || ''Tensor Denoising via Amplification and Stable Rank Methods'' || 15-02-20663
|-
| 11:00AM || Maryam Bagherian / Michigan || ''Tensor Recovery Under Metric Learning Constraints'' || 68U99-18227
|}
</center>
 
==Speakers, Titles, and Abstracts==
* ...
* [https://www.umich.edu/~dinov/ Ivo D. Dinov (University of Michigan)]: ''...title...'' ([TBD Abstract ###])
: ... abstract...
  
 
==Resources==
 
* [https://wiki.socr.umich.edu/images/5/53/Tensor_InvitedSpecialSession_JMM_2023_Boston_Flier.pdf Session Flier]
* Slides/papers ... coming up later ...
  
Latest revision as of 17:06, 25 September 2022

SOCR News & Events: 2023 JMM/AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data

Overview

The accelerated rate of increase of data volume and heterogeneity requires novel mathematical foundations for representation, modeling, analysis and interpretation of complex multisource information. This special session will explore the mathematical, physical, and computational aspects of tensors, as one promising direction for data compression, classification, model-based and model-free inference. By bringing together a broad range of experts, the session will provide a platform for open exchange of ideas, reports of recent developments, cross-fertilization of new techniques, translation of mathematical models into data analytic techniques, and the embedding of application-specific constraints into mathematical formulations.

The session talks will cover new mathematical, computational, and statistical approaches for tensor-based representation, modeling and inference with direct applications to high-dimensional and longitudinal data. Talks will cover coupled tensor-tensor completion strategies, complex time (kime) representation and tensor linear modeling of kimesurfaces, and current advances in tensor computing. Various data science, biomedical health, environmental, climate, and econometrics applications will be showcased.

Organizers

Session Logistics

Abstract Submission

The organizers of this special session on Tensor Representation, Completion, Modeling and Analytics of Complex Data invite abstract submission for at the upcoming Joint Mathematics Meetings in Boston, MA, January 4-7, 2023 (Wednesday-Saturday). This 3-part session will meet on Wed-Thu January 4-5, 2023. Submission of abstracts for invited talks [20+5+5]-minute are welcome until the deadline, September 13, 2022. The JMM 2023 homepage contains a wealth of information about the JMM 2023 conference and this special session. Here are some AMS policies concerning special sessions:

* Please note the deadline of '''September 13''' for abstract submission. We strongly encourage you to submit your abstract at least a week before that deadline to avoid any last-minute problems. When the "Conclude Submission" button is clicked, an email will be sent to the Presenting Author's email and the Submitter's email confirming receipt of the submission. Please note your abstract is '''not''' submitted until you click the "Conclude Submission" button.
* Your talk must be delivered in person, via a computer projection system. Please note that overhead projectors will not be provided for sessions. Also, the meeting rooms will not have blackboards or whiteboards.
* The AMS does not pay any expenses of faculty attending special sessions. However, PhD students are encouraged to apply for travel funding from the AMS. The AMS also has a program of child-care grants for all JMM attendees (faculty, students, etc.). Note that these programs have their own deadlines and application procedures.
* Everyone who attends the meeting is required to pay a registration fee.

==Program==

===Session 1 (Wed 1/4/23, 8AM-12PM)===

{| class="wikitable"
! Time (US ET, GMT-5) !! Presenter / Affiliation !! Title !! Classification - Abstract ID
|-
| 8:00AM || Mason A Porter / UCLA || Node Centralities in Multilayer Networks || 15A99-17029
|-
| 8:30AM || Alex Townsend / Cornell || Why are so many matrices and tensors compressible? || 15-02-20266
|-
| 9:00AM || Joe Kileel / Texas || Estimation in Mixture Models Through Implicit Tensor Decomposition || 65F99-21426
|-
| 9:30AM || Luke Oeding / Auburn University || Dimensions of Restricted Secant Varieties of Grassmannians || 15A69-22223
|-
| 10:00AM || John Blake Temple / UC-Davis || On the regularity implied by the assumptions of geometry || 53B30-22415
|-
| 10:30AM || Yizhe Zhu / UC-Irvine || Non-backtracking spectra of random hypergraphs and community detection || 60C05-18313
|-
| 11:00AM || Bruno N. de Oliveira / University of Miami || Abundance of symmetric differential tensors, birational geometry of surfaces and hypersurfaces in <math>\mathbb{P}^3</math> || 14J60-22507
|-
| 11:30AM || Ivo Dinov / Michigan || Quantum Physics, Data Science, Tensor Linear Modeling, and Spacekime Analytics || 81Q65-14771
|}

===Session 2 (Wed 1/4/23, 1PM-6PM)===

Session 2 (2270:SS19B): AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data II. Date: Wednesday, January 4, 2023; Time: 1:00-6:00 PM; Location: Hynes Convention Center, Room 206.

{| class="wikitable"
! Time (US ET, GMT-5) !! Presenter / Affiliation !! Title !! Classification - Abstract ID
|-
| 1:00PM || Anru Zhang / Duke || Tensor Learning in 2020s: Methodology, Theory, and Applications || 62H99-17081
|-
| 1:30PM || Giuseppe Cotardo / Virginia Tech || The Tensor Rank in Coding Theory || 03D15-20548
|-
| 2:00PM || Edinah Koffi Gnang / Johns Hopkins || On the complexity of hypermatrix equivalence || 15-02-20813
|-
| 2:30PM || Hirotachi (Hiro) Abo / Idaho || Algebro-geometric approaches to the tensor eigenproblem || 15A69-20858
|-
| 3:00PM || Oscar Fabian Lopez / Florida Atlantic || Zero-Truncated Poisson Regression for Sparse Multiway Count Data Corrupted by False Zeros || 15B99-21425
|-
| 3:30PM || Anna Konstorum / Yale || Optimizing component recovery in CP decomposition of immunology data || 92-08-21503
|-
| 4:00PM || Tianyi Shi / Lawrence Berkeley National Laboratory || Tensor equation methods for electron correlation energy computation || 65F99-21870
|-
| 4:30PM || Jonathan Gryak / CUNY || Tensor Denoising via Amplification and Stable Rank Methods || 15A72-22096
|-
| 5:00PM || M. Alex O. Vasilescu / UCLA || Kernel Tensor Factor Analysis || 15-06-22645
|}

===Session 3 (Thu 1/5/23, 8AM-11:30AM)===

Session 3 (2270:SS19C): AMS Special Session on Tensor Representation, Completion, Modeling and Analytics of Complex Data III. Date: Thursday, January 5, 2023; Time: 8:00-11:30 AM; Location: Hynes Convention Center, Room 206.

{| class="wikitable"
! Time (US ET, GMT-5) !! Presenter / Affiliation !! Title !! Classification - Abstract ID
|-
| 8:00AM || Elina Robeva / UBC || High-order Cumulants for Learning Linear Non-Gaussian Causal Models || 62H22-18002
|-
| 8:30AM || Zhen Dai / Chicago || From tensor rank to the inversion of a complex matrix || 65F05-19986
|-
| 9:00AM || Eric Evert / KU Leuven || Best low rank approximations of positive definite tensors || 15-02-20965
|-
| 9:30AM || Hajer Bouzaouache / University of Tunis El Manar || On the Importance of Tensor Representation in the Stability Analysis of Nonlinear Systems || 93D05-22326
|-
| 10:00AM || Rachel Minster / Wake Forest University || Randomized Parallel Algorithms for Tucker Decompositions || 65F99-21497
|-
| 10:30AM || Harm Derksen / Northeastern || Tensor Denoising via Amplification and Stable Rank Methods || 15-02-20663
|-
| 11:00AM || Maryam Bagherian / Michigan || Tensor Recovery Under Metric Learning Constraints || 68U99-18227
|}

==Speakers, Titles, and Abstracts==

* ...
* ... abstract...

==Resources==





