
Coded Caching under Arbitrary Popularity Distributions

Jinbei Zhang, Xiaojun Lin, Xinbing Wang

Caching and Coded-Caching

[Figure: a server connected over a shared broadcast link to Users 1, 2, 3, each with a local cache]

• Caching is important for reducing the backhaul requirement when serving content that many users are interested in
• The cache size at each user needs to be reasonably large compared to the amount of “popular” content
• Coded caching can further reduce the backhaul requirement even when each individual cache size is small, but the global cache size is substantial [Maddah-Ali and Niesen ’14]

Traditional (Uncoded) Caching: Individual cache size needs to be large

Setup: $N = 3$ unit-size files $A = (A_1, A_2, A_3)$, $B = (B_1, B_2, B_3)$, $C = (C_1, C_2, C_3)$; $K = 3$ users, each with cache size $M = 1$, connected to the server over a broadcast channel.

[Figure: every user caches $(A_1, B_1, C_1)$; User 1 wants $A$, User 2 wants $B$, User 3 wants $C$; the server must still send $A_2, A_3$, $B_2, B_3$, and $C_2, C_3$]

• Uncoded caching back-haul requirement: $K \cdot \left(1 - \frac{M}{N}\right) = 2$ (individual caching gain)
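As a sanity check on the back-haul count above, here is a minimal Python sketch (the file and subfile names are simply the ones from this example) that enumerates what the server must unicast under uncoded caching:

```python
# A minimal sketch of the uncoded-caching example on this slide (N = 3 unit-size files,
# K = 3 users, cache size M = 1).  Each file is split into N = 3 subfiles and every user
# caches the first subfile of each file, i.e. a fraction M/N = 1/3 of every file.
N, K, M = 3, 3, 1

files = {name: [f"{name}{j}" for j in range(1, N + 1)] for name in "ABC"}
cache = {subfiles[0] for subfiles in files.values()}   # {A1, B1, C1}, held by every user
demands = {1: "A", 2: "B", 3: "C"}                     # user k requests file demands[k]

# With uncoded caching the server unicasts every subfile a user is missing.
sent = [s for k in demands for s in files[demands[k]] if s not in cache]
print(sorted(sent))                           # ['A2', 'A3', 'B2', 'B3', 'C2', 'C3']
print(len(sent) / N, "==", K * (1 - M / N))   # 2.0 file units: the K*(1 - M/N) = 2 on the slide
```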

Coded Caching: Global Caching Gains

Setup as before: $N = 3$ files $A, B, C$, $K = 3$ users, cache size $M = 1$, broadcast channel.

[Figure: User 1 caches $(A_1, B_1, C_1)$, User 2 caches $(A_2, B_2, C_2)$, User 3 caches $(A_3, B_3, C_3)$; the server broadcasts $A_2 \oplus B_1$, $A_3 \oplus C_1$, and $B_3 \oplus C_2$]

• Uncoded caching back-haul requirement: $K \cdot \left(1 - \frac{M}{N}\right) = 2$
• Coded caching [1] back-haul requirement: $K \cdot \left(1 - \frac{M}{N}\right) \cdot \frac{1}{1 + \frac{KM}{N}} = 1$ (global caching gain)

[1] M. Maddah-Ali and U. Niesen, “Fundamental Limits of Caching,” IEEE Trans. Inf. Theory, 2014.
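The XOR delivery can be verified mechanically. The Python sketch below (toy byte-string subfiles and helper names of my own choosing) checks that each of the three coded broadcasts lets both of its intended users recover a missing subfile using only their cache contents:

```python
# Minimal sketch of the coded delivery above (Maddah-Ali/Niesen placement with N = K = 3,
# M = 1): user k caches the k-th subfile of every file, and the server broadcasts one XOR
# per pair of users.  Subfiles are modeled as equal-length toy byte strings.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def subfile(name: str, j: int) -> bytes:
    return f"{name}{j}".encode().ljust(4)        # toy 4-byte subfile, e.g. b"A2  "

caches = {k: {name: subfile(name, k) for name in "ABC"} for k in (1, 2, 3)}
demands = {1: "A", 2: "B", 3: "C"}

# The three broadcast messages on the slide: A2^B1, A3^C1, B3^C2.
messages = {(1, 2): xor(subfile("A", 2), subfile("B", 1)),
            (1, 3): xor(subfile("A", 3), subfile("C", 1)),
            (2, 3): xor(subfile("B", 3), subfile("C", 2))}

# Each user XORs out the subfile it already caches and recovers a missing piece of its file.
for (i, j), msg in messages.items():
    for me, other in ((i, j), (j, i)):
        recovered = xor(msg, caches[me][demands[other]])
        assert recovered == subfile(demands[me], other)

print("3 coded subfiles = 1 file unit sent, vs. 2 file units with uncoded caching")
```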

Average-Case vs. Worst-Case

• Expected rate: $\sum_{i=1}^{N^K} r(W_i) P(W_i)$
• Worst-case rate [Maddah-Ali and Niesen ’14]: $\max_{i=1,\dots,N^K} r(W_i)$
• Obviously, $\sum_{i=1}^{N^K} r(W_i) P(W_i) \le \max_{i=1,\dots,N^K} r(W_i)$
• However, constant-factor results do NOT carry over from the worst case to the average case:
  $\dfrac{\max_{i} r(W_i)}{\max_{i} r^*(W_i)} \le c$ does not imply $\dfrac{\sum_{i=1}^{N^K} r(W_i) P(W_i)}{\sum_{i=1}^{N^K} r^*(W_i) P(W_i)} \le c$
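A toy numerical illustration of why the implication fails (the two rate profiles below are made up for illustration, not derived from any actual caching scheme): the worst-case bound compares only the two maxima, so a scheme can match the optimal peak rate while being far from optimal on the demands that carry most of the probability.

```python
# Toy illustration: worst-case ratio is 1, yet the expected-rate ratio is about 91.
P     = {"W1": 0.99, "W2": 0.01}      # popularity of two demand vectors
r_opt = {"W1": 0.01, "W2": 10.0}      # hypothetical optimal scheme: cheap on the common demand
r     = {"W1": 10.0, "W2": 10.0}      # hypothetical scheme tuned only for the worst case

worst_case_ratio = max(r.values()) / max(r_opt.values())
expected = lambda rate: sum(rate[w] * P[w] for w in P)
average_ratio = expected(r) / expected(r_opt)

print(worst_case_ratio)   # 1.0  -> identical worst-case rate
print(average_ratio)      # ~91  -> but the expected rate is ~91x the optimum
```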

Related Work on the Average Case

• U. Niesen and M. A. Maddah-Ali, “Coded Caching with Nonuniform Demands,” arXiv:1308.0178v2 [cs.IT], Mar. 2014. (TIT 2016)
  ➢ Divide the files into groups
  ➢ The gap between the lower bound and the achievable (upper) bound increases with the # of groups (unbounded)
• J. Hachem, N. Karamchandani, and S. Diggavi, “Multi-level Coded Caching,” arXiv:1404.6563 [cs.IT], Apr. 2014.
  ➢ Popularity has multiple levels
  ➢ The gap increases with the # of levels (unbounded)
• M. Ji, A. Tulino, J. Llorca, and G. Caire, “On the Average Performance of Caching and Coded Multicasting with Random Demands,” arXiv:1402.4576v2 [cs.IT], Jul. 2014.
  ➢ Zipf popularity distribution: $p_i \propto 1/i^{\alpha}$
  ➢ The gap increases with $1/(\alpha - 1)$ when $\alpha > 1$ (unbounded)

The Open Question

Can we find a coded caching scheme whose average-case performance is at most a constant factor away from the minimum, independently of the popularity distribution?

Our Main Results

• Constant-factor gap between the lower bound ($R_{lb}$) and the achievable (upper) bound ($R_{ub}$) on the expected backhaul transmission rate, under an arbitrary popularity distribution:
  $R_{ub} \le 87 R_{lb} + 2$ and $R_{ub} \le 55 R_{lb}$
• The achievable bound ($R_{ub}$) is attained by a simple coded caching scheme similar to [Ji et al. ’14]
  – Perform coded caching only among the most popular $N_1$ files
  – However, all $N_1$ popular files are treated uniformly
• The key step is to show a matching lower bound

[Figure: popularity vs. file index, with the $KM$ and $N_1$ thresholds marked on the file-index axis; coded caching is applied uniformly to the $N_1$ most popular files]
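A rough sketch of the flavor of this scheme follows (the threshold $N_1$, the Zipf example, and the simple rate accounting below are illustrative assumptions of mine; the paper's choice of $N_1$ and its analysis are more careful): run uniform coded caching over the $N_1$ most popular files and serve every request outside that set by a plain unicast.

```python
# Rough sketch only: dedicate the cache to the N1 most popular files, run uniform coded
# caching over them, and unicast any request that falls outside the top N1.
def rate_upper_bound(p, K, M, N1):
    """p: popularities in decreasing order; returns a crude bound on the expected rate."""
    # Coded part: uniform-popularity coded-caching rate over N1 files
    # (ignoring the integrality of K*M/N1).
    r_coded = K * (1 - M / N1) / (1 + K * M / N1)
    # Tail part: each request outside the top N1 files is served by a unit-rate unicast,
    # and on average K * sum(p[N1:]) users request such files.
    return r_coded + K * sum(p[N1:])

# Example: Zipf(alpha = 0.8) popularity over N = 1000 files, K = 50 users, cache M = 20.
N, K, M, alpha = 1000, 50, 20, 0.8
weights = [1 / (i + 1) ** alpha for i in range(N)]
total = sum(weights)
p = [w / total for w in weights]

best_rate, best_N1 = min((rate_upper_bound(p, K, M, N1), N1) for N1 in range(M + 1, N + 1))
print(f"best N1 = {best_N1}, expected-rate bound = {best_rate:.2f}")
```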

Network Model

• Server with a broadcast channel
• $K$ users, each with cache size $M$
• $N$ files: $\mathcal{F} = \{F_1, \dots, F_N\}$; popularity (decreasing): $\mathcal{P} = \{p_1, \dots, p_N\}$
• Random request $W_i = (f_{i_1}, \dots, f_{i_K})$ with $f_{i_k} \in \mathcal{F}$; the rate for serving $W_i$ is $r(W_i)$
• Expected rate: $R(K, \mathcal{F}, \mathcal{P}) = \sum_{i=1}^{N^K} r(W_i) P(W_i)$

[Figure: three phases: (1) file placement into the user caches, (2) file requests from the $K$ users, (3) file transmission from the server over the broadcast channel]
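The expected rate defined above can be estimated numerically for any given scheme. The sketch below is my own illustration (the Monte Carlo estimator and the toy rate function are not part of the model): demand vectors are drawn i.i.d. from the popularity distribution, since enumerating all $N^K$ of them is infeasible for realistic $N$ and $K$.

```python
# Estimate R(K, F, P) = sum_W r(W) P(W) by sampling demand vectors W from P.
import random

def expected_rate(r, p, K, samples=100_000, seed=0):
    rng = random.Random(seed)
    files = range(len(p))
    total = 0.0
    for _ in range(samples):
        demand = tuple(rng.choices(files, weights=p, k=K))   # one requested file per user
        total += r(demand)
    return total / samples

# Toy rate function: one unit-rate transmission per *distinct* requested file.
r_distinct = lambda demand: len(set(demand))
print(expected_rate(r_distinct, p=[0.5, 0.3, 0.2], K=3))
```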

Main Intuition: An “Insensitivity” Property

• The “best” worst-case rate for serving $N$ files can be achieved by uniform caching [Maddah-Ali and Niesen ’14]:
  $K \cdot \left(1 - \frac{M}{N}\right) \cdot \frac{1}{1 + \frac{KM}{N}} = \frac{1 - \frac{M}{N}}{\frac{1}{K} + \frac{M}{N}} \approx \frac{N}{M} - 1$ whenever $K \gg N/M$
• Key Insight: beyond $K = N/M$, the above rate is independent of the number of users $K$
• Due to its global caching gain, coded caching significantly reduces the threshold for this insensitivity to arise

[Figure: back-haul rate vs. $K$; the coded-caching curve saturates near $N/M - 1$ once $K \approx N/M$, while the uncoded-caching curve grows toward $N - M$]
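A quick numerical check of the saturation behavior sketched in the figure ($N = 100$ and $M = 10$ below are arbitrary illustrative values): the coded-caching rate levels off near $N/M - 1$ once $K$ exceeds roughly $N/M$, while the uncoded rate keeps growing until it is capped at $N - M$.

```python
# Rate vs. number of users K for fixed N files and cache size M (illustrative values).
N, M = 100, 10

def coded(K):
    return K * (1 - M / N) / (1 + K * M / N)

def uncoded(K):
    return min(K * (1 - M / N), N - M)   # at most N - M worth of uncached content exists

for K in (5, 10, 20, 50, 100, 1000):
    print(f"K={K:4d}  coded={coded(K):5.2f}  uncoded={uncoded(K):5.2f}  (N/M - 1 = {N / M - 1:.0f})")
```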
