Fundamental Limits of Cache-Aided Broadcast Networks with User Cooperation
We consider cache-aided broadcast networks with user cooperation, where a server connects to multiple users and the users can cooperate with each other through a cooperation network. A new definition of transmission delay is introduced to characterize the latency cost during the delivery phase for arbitrary cache-aided networks. We investigate both the deterministic caching setup and the decentralized random caching setup. For the deterministic caching setup, we propose a new coded caching scheme that fully exploits the time resource by allowing parallel transmission between the server and the users. A constant multiplicative gap is established between the information-theoretic lower and upper bounds on the transmission delay. For the decentralized random caching setup, we show that if the cache size of each user exceeds a small threshold that tends to zero as the number of users goes to infinity, then the proposed decentralized coded caching scheme achieves the minimum transmission delay to within a constant multiplicative factor. For both the centralized and decentralized scenarios, we characterize a cooperation gain (offered by cooperation among the users) and a parallel gain (offered by parallel transmission between the server and multiple users), both of which greatly reduce the transmission delay. Furthermore, we show that, due to a tradeoff among the parallel gain, cooperation gain, and multicast gain, the number of users that transmit in parallel should be chosen adaptively according to the system parameters; letting more users transmit in parallel can in fact increase the transmission delay.
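As background for the multicast gain mentioned above, here is a minimal worked example in the standard single shared-link coded caching setup (Maddah-Ali and Niesen), assumed purely for illustration and not the cooperation model or the delay metric studied in this paper: with a library of $N$ files, $K$ users, and a per-user cache of $M$ files, uncoded delivery requires a load of $K(1 - M/N)$ file units, whereas coded multicasting requires only

\[
  R_{\mathrm{coded}}(M) \;=\; \underbrace{K\Bigl(1 - \tfrac{M}{N}\Bigr)}_{\text{uncoded load}} \cdot \underbrace{\frac{1}{1 + KM/N}}_{\text{multicast gain}} .
\]

For instance, with $N = K = 20$ and $M = 5$ (each user caches a quarter of the library), the uncoded load is $20 \cdot 3/4 = 15$, the multicast gain is $1/(1+5) = 1/6$, and the coded load is $2.5$ file units. The cooperation gain and parallel gain characterized in this work are additional effects that trade off against this multicast gain.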