Introduction to Algorithms (English edition), Lecture 5, Prof. Erik Demaine

Document information
Category: Library document
Format: PDF
Pages: 47
Size: 847.16 KB
Summary
All the sorting algorithms we have seen so far are comparison sorts: they use only comparisons to determine the relative order of elements.

Introduction to Algorithms
6.046J/18.401J/SMA5503
Lecture 5
Prof. Erik Demaine
(Slides © 2001 by Charles E. Leiserson)

How fast can we sort?
All the sorting algorithms we have seen so far are comparison sorts: they use only comparisons to determine the relative order of elements.
• E.g., insertion sort, merge sort, quicksort, heapsort.
The best worst-case running time that we've seen for comparison sorting is O(n lg n).
Is O(n lg n) the best we can do?
Decision trees can help us answer this question.
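
The defining property of a comparison sort is that every decision it makes depends only on the outcome of an element comparison. A small sketch (not from the slides) that makes this observable: functools.cmp_to_key wraps a counting comparator around Python's built-in sort, itself a comparison sort, so every ordering decision passes through the counter. The helper name counting_sorted is illustrative, not part of the lecture.

```python
# A small sketch (not from the slides): observe a comparison sort purely through
# the comparisons it makes.
from functools import cmp_to_key

def counting_sorted(a):
    comparisons = [0]                      # mutable counter captured by cmp
    def cmp(x, y):
        comparisons[0] += 1
        return (x > y) - (x < y)           # -1, 0, or 1, like a classic comparator
    return sorted(a, key=cmp_to_key(cmp)), comparisons[0]

result, used = counting_sorted([5, 2, 4, 6, 1, 3])
print(result, used)                        # sorted list and the number of comparisons
```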

Decision-tree example
Sort ⟨a1, a2, …, an⟩

                 1:2
              /       \
          2:3           1:3
         /    \        /    \
      123     1:3    213     2:3
             /   \          /   \
           132   312      231   321

Each internal node is labeled i:j for i, j ∈ {1, 2, …, n}.
• The left subtree shows subsequent comparisons if ai ≤ aj.
• The right subtree shows subsequent comparisons if ai ≥ aj.

Decision-tree example
Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩:
At the root node 1:2, the comparison gives 9 ≥ 4 (a1 ≥ a2), so execution continues in the right subtree.

Decision-tree example
Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩:
At node 1:3, the comparison gives 9 ≥ 6 (a1 ≥ a3), so execution continues in the right subtree.

Decision-tree example
Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩:
At node 2:3, the comparison gives 4 ≤ 6 (a2 ≤ a3), so execution continues in the left subtree.

Decision-tree example
Sort ⟨a1, a2, a3⟩ = ⟨9, 4, 6⟩:
The path ends at the leaf 231, and the ordering 4 ≤ 6 ≤ 9 has been established.
Each leaf contains a permutation ⟨π(1), π(2), …, π(n)⟩ to indicate that the ordering aπ(1) ≤ aπ(2) ≤ ⋯ ≤ aπ(n) has been established.
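
To make the example concrete, here is a minimal sketch (not from the slides) of the n = 3 decision tree above, hard-coded with 0-based indices; TREE and decision_tree_sort are illustrative names, not part of the lecture. Walking the tree on ⟨9, 4, 6⟩ performs exactly the comparisons traced on the previous slides.

```python
# A minimal sketch (not from the slides): the n = 3 decision tree above.
# Internal nodes are ((i, j), left, right) -- take the left branch if
# a[i] <= a[j], the right branch if a[i] >= a[j]; leaves are index
# permutations pi meaning a[pi[0]] <= a[pi[1]] <= a[pi[2]].
TREE = ((0, 1),
        ((1, 2), (0, 1, 2),
                 ((0, 2), (0, 2, 1), (2, 0, 1))),
        ((0, 2), (1, 0, 2),
                 ((1, 2), (1, 2, 0), (2, 1, 0))))

def decision_tree_sort(a):
    """Sort a 3-element sequence by walking the decision tree."""
    node = TREE
    while isinstance(node[0], tuple):      # internal node: ((i, j), left, right)
        (i, j), left, right = node
        node = left if a[i] <= a[j] else right
    return [a[k] for k in node]            # leaf: permutation of indices

print(decision_tree_sort([9, 4, 6]))       # [4, 6, 9], via the path 1:2, 1:3, 2:3
```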

Decision-tree model
A decision tree can model the execution of any comparison sort:
• One tree for each input size n.
• View the algorithm as splitting whenever it compares two elements.
• The tree contains the comparisons along all possible instruction traces.
• The running time of the algorithm = the length of the path taken.
• Worst-case running time = height of tree.
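
One way to see "worst-case running time = height of tree" experimentally: instrument a comparison sort and run it on every permutation of a given size, so that each run traces one root-to-leaf path. The sketch below (not from the slides; it assumes insertion sort as the example algorithm and a mutable list as the counter) reports the deepest path actually taken.

```python
# A sketch (not from the slides): each run of the instrumented sort traces one
# root-to-leaf path in its decision tree, so the maximum comparison count over
# all permutations is the depth of the deepest leaf reached -- the worst case.
from itertools import permutations

def insertion_sort(a, count):
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            count[0] += 1                  # one comparison = one tree node visited
            if a[j] <= key:
                break
            a[j + 1] = a[j]                # shift the larger element right
            j -= 1
        a[j + 1] = key
    return a

n = 4
worst = 0
for p in permutations(range(1, n + 1)):
    count = [0]
    assert insertion_sort(p, count) == sorted(p)
    worst = max(worst, count[0])
print(f"n = {n}: deepest path taken has {worst} comparisons")
```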

Lower bound for decision-tree sorting
Theorem. Any decision tree that can sort n elements must have height Ω(n lg n).
Proof. The tree must contain ≥ n! leaves, since there are n! possible permutations. A height-h binary tree has ≤ 2^h leaves. Thus, n! ≤ 2^h.
∴ h ≥ lg(n!)         (lg is mono. increasing)
    ≥ lg((n/e)^n)    (Stirling's formula)
    = n lg n − n lg e
    = Ω(n lg n).
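
A quick numerical check (not from the slides) of the inequality used in the proof, lg(n!) ≥ lg((n/e)^n) = n lg n − n lg e; lg(n!) is computed exactly via the log-gamma function to avoid huge factorials.

```python
# Compare lg(n!) with the Stirling lower bound n lg n - n lg e for a few n.
import math

for n in (10, 100, 1000, 10**6):
    lg_factorial = math.lgamma(n + 1) / math.log(2)        # lg(n!)
    stirling = n * math.log2(n) - n * math.log2(math.e)    # n lg n - n lg e
    print(f"n = {n:>7}: lg(n!) = {lg_factorial:14.1f}   n lg n - n lg e = {stirling:14.1f}")
```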

Lower bound for comparison sorting
Corollary. Heapsort and merge sort are asymptotically optimal comparison sorting algorithms.
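
To see the corollary concretely, this sketch (not from the slides) counts the element comparisons made by a straightforward top-down merge sort on a random input and prints them next to n lg n; the count stays within a small constant factor of the lower bound.

```python
# A sketch (not from the slides): top-down merge sort with a comparison counter.
import math, random

def merge_sort(a, count):
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid], count)
    right = merge_sort(a[mid:], count)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        count[0] += 1                      # one element comparison
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

n = 1 << 12
data = random.sample(range(10 * n), n)
count = [0]
assert merge_sort(data, count) == sorted(data)
print(f"n = {n}: {count[0]} comparisons,  n lg n = {int(n * math.log2(n))}")
```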
