The submission deadline for the "Learn to Compress & Compress to Learn" Workshop @ ISIT 2025 is now extended to March 28, 2025. For more detailed information, please visit the workshop website.
This workshop aims to unite experts from machine learning, computer science, and information theory to delve into the dual themes of learning-based compression and using compression as a tool for learning tasks.
The program will feature invited talks from:
* (University of Pennsylvania)
* (University of Cambridge)
* (Granica)
* (Apple)
* (Texas A&M University)
* (Chan Zuckerberg Initiative)
We invite researchers from machine learning, compression, and related fields to submit their latest work to the workshop. We welcome submissions of recent work that has been presented, published, or is currently under review elsewhere, provided the authors opt out of publishing their paper on IEEE Xplore.
All accepted papers will be presented as posters during the poster session. Some papers will also be selected for spotlight presentations. Topics of interest include but are not limited to:
"Learn to Compress": Advancing Compression with Learning
- Learning-Based Data Compression: New techniques for compressing data (e.g., images, video, audio), model weights, and emerging modalities (e.g., 3D content and AR/VR applications).
- Efficiency for Large-Scale Foundation Models: Accelerating training and inference for large-scale foundation models, particularly in distributed and resource-constrained settings.
- Theoretical Foundations of Neural Compression: Fundamental limits (e.g., rate-distortion bounds), distortion/perceptual/realism metrics, distributed compression, compression without quantization (e.g., channel simulation, relative entropy coding), and stochastic/probabilistic coding techniques.

"Compress to Learn": Leveraging Principles of Compression to Improve Learning
- Compression as a Tool for Learning: Leveraging principles of compression and source coding to understand and improve learning and generalization.
- Compression as a Proxy for Learning: Understanding the information-theoretic role of compression in tasks like unsupervised learning, representation learning, and semantic understanding.
- Interplay of Algorithmic Information Theory and Source Coding: Exploring connections between Algorithmic Information Theory concepts (e.g., Kolmogorov complexity, Solomonoff induction) and emerging source coding methods.
Important Dates
- Paper submission deadline: [Extended] March 28, 2025 (11:59 PM, Anywhere on Earth).
- Decision notification: April 18, 2025
- Camera-ready paper deadline: May 1, 2025
- Workshop date: June 26, 2025
Organizing Committee
* (NYU)
* (University of Cambridge / Imperial College London)
* (Imperial College London)
* (NYU)