Rate-Constrained Remote Contextual Bandits
We consider a rate-constrained contextual multi-armed bandit (RC-CMAB) problem, in which a group of agents solve the same contextual multi-armed bandit (CMAB) problem. The contexts, however, are observed by a remotely connected decision-maker, which updates the policy to maximize the returned rewards and communicates the arms to be sampled by the agents to a controller over a rate-limited communication channel.
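To make the setup concrete, the following is a minimal sketch of one interaction round, under the assumption that the decision-maker compresses its arm-selection policy into a coarsely quantized distribution that the controller then samples from on behalf of the agents; the softmax policy, the quantization rule, and all parameter values are illustrative assumptions rather than the scheme analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

K, N, d = 8, 16, 5   # arms, agents per round, context dimension (illustrative)
BITS = 4             # assumed resolution of the policy sent over the rate-limited link

def decision_maker_policy(context, theta):
    """Softmax policy over arms given the remotely observed context (assumed form)."""
    scores = theta @ context
    p = np.exp(scores - scores.max())
    return p / p.sum()

def quantize_distribution(p, bits):
    """Hypothetical rate limitation: snap each probability to a grid of
    2**bits levels and renormalize before transmission to the controller."""
    q = np.round(p * 2**bits) / 2**bits
    q = np.maximum(q, 1e-9)
    return q / q.sum()

theta = rng.normal(size=(K, d))          # current policy parameters
context = rng.normal(size=d)             # context observed only at the decision-maker

p = decision_maker_policy(context, theta)
p_hat = quantize_distribution(p, BITS)   # what survives the rate constraint
arms = rng.choice(K, size=N, p=p_hat)    # controller samples arms for the agents
rewards = rng.binomial(1, 0.5, size=N)   # placeholder feedback used to update theta
```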
On the Rate-Distortion-Perception Function
Rate-distortion-perception theory extends Shannon's rate-distortion theory by introducing a constraint on the perceptual quality of the output. The perception constraint complements the conventional distortion constraint and aims to enforce distribution-level consistency. In this new theory, the information-theoretic limit is characterized by the rate-distortion-perception function.
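Concretely, the rate-distortion-perception function is commonly written as the smallest mutual information compatible with a distortion budget $D$ and a perception budget $P$; the distortion measure $\Delta$ and the divergence $d$ between the source and reconstruction distributions are left generic here:

$$
R(D, P) = \min_{p_{\hat{X} \mid X}} I(X; \hat{X}) \quad \text{s.t.} \quad \mathbb{E}\big[\Delta(X, \hat{X})\big] \le D, \qquad d\big(p_X, p_{\hat{X}}\big) \le P .
$$

Setting $P = \infty$ removes the perception constraint and recovers Shannon's rate-distortion function $R(D)$.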
Optimality of Huffman Code in the Class of 1-Bit Delay Decodable Codes
For a given independent and identically distributed (i.i.d.) source, the Huffman code achieves the optimal average codeword length in the class of instantaneous codes with a single code table. However, it is known that there exist time-variant encoders that achieve a shorter average codeword length than the Huffman code by using multiple code tables and allowing a decoding delay of at most $k$ bits, for $k = 2, 3, 4, \ldots$.
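For reference, the baseline in this comparison is the standard Huffman construction; a minimal sketch for an i.i.d. source (the symbol probabilities below are illustrative) is:

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Binary Huffman code table: repeatedly merge the two least likely nodes."""
    tie = count()   # tie-breaker so entries with equal probability compare cleanly
    heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)
        p1, _, code1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

# Illustrative i.i.d. source: the resulting average length (1.75 bits/symbol here)
# is optimal among instantaneous codes that use a single code table.
probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}
table = huffman_code(probs)
print(table, sum(probs[s] * len(table[s]) for s in probs))
```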
DeepJSCC-Q: Constellation Constrained Deep Joint Source-Channel Coding
Recent works have shown that modern machine learning techniques can provide an alternative approach to the long-standing joint source-channel coding (JSCC) problem. Very promising initial results, superior to popular digital schemes that utilize separate source and channel codes, have been demonstrated for wireless image and video transmission using deep neural networks (DNNs).
Cross-Domain Lossy Compression as Entropy Constrained Optimal Transport
We study an extension of lossy compression where the reconstruction is subject to a distribution constraint which can be different from the source distribution. We formulate our setting as a generalization of optimal transport with an entropy bottleneck to account for the rate constraint due to compression. We provide expressions for the tradeoff between compression rate and the achievable distortion with and without shared common randomness between the encoder and decoder.
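As a sketch of this formulation (the exact constraint set and how common randomness enters may differ from the paper), the problem can be written as transport of the source $X \sim p_X$ to a reconstruction $Y$ with a prescribed law $Q_Y$, subject to a mutual-information bottleneck imposed by the rate:

$$
\min_{P_{Y \mid X} \,:\, Y \sim Q_Y} \mathbb{E}\big[ d(X, Y) \big] \quad \text{s.t.} \quad I(X; Y) \le R .
$$

As $R \to \infty$ the bottleneck becomes inactive and this is classical optimal transport between $p_X$ and $Q_Y$; when $Q_Y = p_X$ it specializes to a distribution-preserving compression problem.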
All You Need Is Feedback: Communication With Block Attention Feedback Codes
Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures.
Guest Editorial
Welcome to the ninth (June 2022) issue of the IEEE Journal on Selected Areas in Information Theory (JSAIT), dedicated to "Distributed Coding and Computation".