Big-O Run Times of Sorting Algorithms (2024)

We've had to learn a lot of details for each algorithm we've covered in this course.

Developers who need to implement their own algorithms often need to choose an algorithm for each and every problem they need to solve.

And they often need to discuss their decisions with other developers.

Can you imagine needing to describe all the algorithms in this same level of detail all the time?

You'd spend all your time in meetings rather than programming.

That's why Big O Notation was created: as a way of quickly describing how an algorithm performs as the data set it's working on increases in size.

Big O Notation lets you quickly compare several algorithms to choose the best one for your problem.

The algorithms we've discussed in this course are very well known.

Some job interviewers are going to expect you to know their Big O runtimes, so let's look at them.

Remember that the n in Big O Notation refers to the number of elements you're operating on.

With selection sort, you need to check each item in the list to see if it's the lowest, so you can move it over to the sorted list.

So that's n operations.

Suppose you're doing selection sort on a list of 5 items; n, in this case, would be 5.

So that's 5 operations before you can move an item to the sorted list.

But with selection sort, you have to loop over the entire list for each item you want to move.

There are 5 items in the list, and you have to do 5 comparisons to move each one.

So it's more like 5 times 5 operations, or if we replace 5 with n, it's n times n, or n squared.

But wait, you might say, half of that 5-by-5 grid of operations is missing, because we're testing one fewer item in the unsorted list with each pass.

So isn't it more like 1/2 times n times n?

And this is true: we are not doing a full n squared operations.

But remember, in Big O Notation, as the value of n gets really big, constants like 1/2 become insignificant, and so we discard them.

The Big O runtime of selection sort is widely recognized as being O(n squared).
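To make that concrete, here's a minimal selection sort sketch that also counts comparisons (the function name and the comparison counter are just illustrative additions, not part of the course's code):

```python
# Selection sort sketch that counts comparisons, to show why it's O(n^2).
def selection_sort(items):
    items = list(items)          # work on a copy
    comparisons = 0
    for i in range(len(items)):
        lowest = i
        # Scan the unsorted remainder for the lowest item.
        for j in range(i + 1, len(items)):
            comparisons += 1     # one comparison per remaining item
            if items[j] < items[lowest]:
                lowest = j
        # Move the lowest item to the front of the unsorted section.
        items[i], items[lowest] = items[lowest], items[i]
    return items, comparisons

result, count = selection_sort([4, 1, 5, 3, 2])
# For n = 5 the inner loop runs 4 + 3 + 2 + 1 = 10 times,
# which is n * (n - 1) / 2 -- roughly 1/2 * n^2.
```

Notice the count is exactly that "half grid" of operations; the discarded 1/2 constant is visible in the formula.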

Quicksort requires one operation for each element of the list it's sorting.

It needs to select a pivot first, and then it needs to sort elements into lists that are less than or greater than the pivot.

So that's n operations.

To put that another way, if you have a list of 8 items, then n is 8, so it will take 8 operations to split the list around the pivot.

But of course, the list isn't sorted after splitting it around the pivot just once.

You have to repeat those n operations several times.

In the best case, you'll pick a pivot that's right in the middle of the list, so that you're dividing the list exactly in half.

Then you keep dividing the list in half until you have a list with a length of one.

The number of times you need to divide n in half until you reach one is expressed as log n.
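If that's hard to picture, this tiny sketch (a hypothetical helper, just for illustration) counts the halvings directly:

```python
# Count how many times a list of length n can be halved
# before reaching length 1. For n = 8: 8 -> 4 -> 2 -> 1,
# which is 3 halvings, and log2(8) is also 3.
def halvings(n):
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count
```

So for a list of 8 items, you divide 3 times, because 2 to the 3rd power is 8.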

So you need to repeat n sorting operations log n times.

That leaves us with a best-case runtime for Quicksort of O(n log n).
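Here's a minimal Quicksort sketch that, like the version discussed in this course, picks the first item as the pivot (the function name is just illustrative):

```python
# Quicksort sketch using the first item as the pivot.
def quicksort(values):
    if len(values) <= 1:
        return values             # a list of 0 or 1 items is already sorted
    pivot = values[0]
    # n operations: compare every remaining item to the pivot.
    less = [v for v in values[1:] if v < pivot]
    greater = [v for v in values[1:] if v >= pivot]
    # Recursively sort each half and join around the pivot.
    return quicksort(less) + [pivot] + quicksort(greater)
```

With a well-placed pivot, `less` and `greater` are each about half the list, giving the log n levels of splitting described above.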

But that's the best case. What about the worst case?

Well, if you pick the wrong pivot, you won't be dividing the list exactly in half.

If you pick a really bad pivot, the next recursive call to Quicksort will only reduce the list length by 1.

Since our Quicksort function simply picks the first item to use as a pivot, we can make it pick the worst possible pivot repeatedly, simply by giving it a list that's sorted in reverse order.

If we pick the worst possible pivot every time, we'll have to split the list once for every item it contains, doing n sorting operations each time.

You already know another sorting algorithm that only manages to reduce the list by one element with each pass: selection sort.

Selection sort has a runtime of O(n squared), and in the worst case, that's the runtime for Quicksort as well.

So which do we consider when trying to decide whether to use Quicksort, the best case or the worst case?

Well, as long as your implementation doesn't just pick the first item as a pivot (which we did so we could demonstrate this issue), it turns out that on average, Quicksort performs closer to the best case.

Many Quicksort implementations accomplish this simply by picking a pivot at random on each recursive call.

Here we are sorting our reverse-sorted data again, but this time we pick pivots at random, which reduces the number of recursive operations needed.

Sure, random pivots will sometimes give you the best case, and sometimes you'll randomly get the worst case.

But it all averages out over multiple calls to the Quicksort function.
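A random-pivot variant can be sketched like this (the name `random_pivot_quicksort` is just illustrative):

```python
import random

# Quicksort sketch that picks a pivot at random on each recursive call.
# On average this keeps the runtime close to O(n log n),
# even on reverse-sorted input.
def random_pivot_quicksort(values):
    if len(values) <= 1:
        return list(values)
    i = random.randrange(len(values))      # random pivot index
    pivot = values[i]
    rest = values[:i] + values[i + 1:]     # everything except the pivot
    less = [v for v in rest if v < pivot]
    greater = [v for v in rest if v >= pivot]
    return random_pivot_quicksort(less) + [pivot] + random_pivot_quicksort(greater)
```

Feeding it a reverse-sorted list no longer triggers the worst case reliably, because the pivot choice no longer depends on the input's order.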

Now, with Merge Sort, there is no pivot to pick.

Your list of n items always gets divided in half log n times.

That means Merge Sort always has a Big O runtime of O(n log n).
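For comparison, here's a minimal Merge Sort sketch (again, the function name is just illustrative):

```python
# Merge Sort sketch: the list is always split exactly in half
# (log n levels), and merging at each level touches all n items,
# so the runtime is O(n log n) in every case.
def merge_sort(values):
    if len(values) <= 1:
        return list(values)
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    # Merge the two sorted halves back together.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one side may have leftovers
    merged.extend(right[j:])
    return merged
```

Unlike Quicksort, there's no decision that can go wrong here: the split point is always the middle of the list.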

Contrast that with Quicksort, which only has a runtime of O(n log n) in the best case.

In the worst case, Quicksort's runtime is O(n squared).

And yet, out in the real world, Quicksort is more commonly used than Merge Sort.

Now, why is that, if Quicksort's Big O runtime can sometimes be worse than Merge Sort's?

This is one of those situations where Big O Notation doesn't tell you the whole story.

All Big O can tell you is the number of times an operation is performed; it doesn't describe how long that operation takes.

And the operation Merge Sort performs repeatedly takes longer than the operation Quicksort performs repeatedly.

Big O is a useful tool for quickly describing how the runtime of an algorithm increases as the data set it's operating on gets really, really big.

But you can't always choose between two algorithms based just on their Big O runtimes.

Sometimes there's additional info you need to know about an algorithm to make a good decision.

Now that we can sort a list of items, we're well on our way to being able to search a list efficiently as well.

We'll look at how to do that in the next stage.
