TH>OTOH, a heap is only slightly more complex, but its efficiency is
TH>O(log2 N),
BK>Shouldn't that be at least O(N * log2 N)? How could a sort
BK>algorithm ever be faster than O(N), which would IMHO mean that it
BK>wouldn't even have to access every element.
TH> Yep. A heap does *not* access every element - that is per insertion.
TH> O(logN) is a telltale sign of a binary tree, and that is what a heap
TH> is. However, a heap is usually implemented as an array for efficiency.
TH> So, when an insertion takes place, the comparison of the insertion
TH> key starts at the end of the array, and in binary fashion trickles to
TH> the top. Each level up halves the possibilities. So, say, 255 elements
TH> need only seven accesses at most; it may need to make only
TH> one access. This is called reheaping.
Tim, this *IS* N*lgN. After all, you need to make lgN compares for each of N
elements...
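The per-insertion trickle-up described above can be sketched as follows: a minimal MAX-heap sift-up in Python, with the array layout Tim describes (parent of index i at (i - 1) // 2). The names `sift_up` and `heap_push` are mine, for illustration:

```python
def sift_up(heap, i):
    # Bubble the element at index i up toward the root until the
    # MAX-heap property (parent >= child) is restored.
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] >= heap[i]:
            break
        heap[parent], heap[i] = heap[i], heap[parent]
        i = parent

def heap_push(heap, key):
    heap.append(key)              # new key starts at the end of the array
    sift_up(heap, len(heap) - 1)  # at most ~log2(N) swaps up to the root
```

Each insertion costs at most log2(N) comparisons, which is the O(logN) Tim quotes; doing it for all N elements is where the N*logN comes from.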
TH> When the highest or lowest (depending on whether it is a MAX-heap or
TH> a MIN-heap) element is popped off, a reheap is also required. That,
TH> too, is O(logN).... Hmmm, come to think of it, maybe I should have
TH> said O(2logN), since it does get reheaped twice: once going in and
TH> again coming out.
One aspect of O-notation is that there aren't any constants... that's still
"O(logN)" (and still wrong - you need to take N elements off the heap as well,
so it's O(N*logN)).
TH> BTW, heaps make really good priority trees, but they are not the
TH> panacea of sorts. You can only look at the top element. If you
TH> want to look at them all, you have to pop them off, and that
TH> destroys the heap as they go. You can work off a copy of the heap
TH> to avoid this.
For priority trees, one item that I saw that seemed faster was an unsorted
heap - where a node was guaranteed only to be higher than its children, but
the order of the children was unimportant. This reduces the constant, but
does not (IIRC) change the order of the algorithm.
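To illustrate that point: the heap property only constrains each parent against its children, never siblings against each other, so the underlying array need not be sorted. A quick check with `heapq` (the variable names are mine):

```python
import heapq

data = [7, 2, 9, 4, 1, 8]
heapq.heapify(data)  # MIN-heap: data[i] <= both of its children

# The parent/child ordering holds at every node...
heap_ok = all(data[(i - 1) // 2] <= data[i] for i in range(1, len(data)))
# ...but sibling order is unconstrained, so the array is generally unsorted.
is_sorted = data == sorted(data)
```

That weaker invariant is what keeps insert and pop at O(logN) with a small constant, while a fully sorted structure would pay more per operation for ordering nobody asked for.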
---
* Origin: Tanktalus' Tower BBS (1:250/102)