Time for some computer science.

Yes, yes… I know.

I hear ya.

You didn’t come to my website to learn computer science (at least, probably not).

But *this* is a fundamental part of computer science that very well may help you down the road.

Because let’s face it…

If you interview for a developer job, there’s a good chance you’re gonna have to show that you can work with **algorithms**.

(Don’t worry, this is not going to be as difficult as it sounds.)

## Before we continue…

**Full disclosure**: I do not have a degree in computer science!

And you know what?

It isn’t necessary.

There. I said it.

Don’t get me wrong… there’s a lot to be said for earning a CS degree.

For one thing, you get to learn all of this stuff among peers in a classroom setting.

It can definitely help if you have a CS degree when trying to snag that first developer job.

But in the end, if you know what you *need* to learn, you can just learn it.

RIGHT HERE ON THE INTERNET.

FOR FREE (mostly, anyway).

## Back to the topic at hand…

So, Big-O…

It’s all about being able to *objectively compare code solutions to determine which one is best for your situation*.

For example…

Write a function that calculates the sum of all numbers from 1 up to (and including) some number *n*.

Here’s a solution you might come up with:

```
function addUpTo(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}
```

And that works great.

Wonderful.

But here’s another one (albeit less intuitive):

```
function addUpTo(n) {
return n * (n + 1) / 2;
}
```

This also works.
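(If you want proof that they agree, a quick spot check does the trick. I’ve renamed the two functions here, just so both can live in one snippet:)

```
function addUpToLoop(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}

function addUpToFormula(n) {
  return n * (n + 1) / 2;
}

// Spot-check that both give the same answer
for (const n of [1, 10, 100, 1000]) {
  console.log(n, addUpToLoop(n) === addUpToFormula(n)); // all true
}
```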

So…

Which one is better??

And the real question is: *DOES IT MATTER?*

Well, it depends on what you mean by *better*.

It could mean:

- Faster
- Less memory-intensive
- More readable

**I’m gonna go ahead and give a personal preference to SPEED…**

So check this out…

If we use `performance.now()` to compare the two, we’ll discover the truth.

The **second** algorithm is *clearly* faster.

But how do we describe *how much* faster the second algorithm is?

(other than just timing it on our local machine)

I’m glad you asked!

## Counting Operations

This is where the rubber hits the road.

Take function #2…

```
function addUpTo(n) {
  return n * (n + 1) / 2;
}
```

How many *operations* must take place?

Did you answer **three**?

Correct!

Now what if *n* = 8473928239 ?

How many operations must take place?

uh… still three?

That’s right!

Regardless of *n*’s value, it still only takes three operations to find a solution.

Compare that to function #1:

```
function addUpTo(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
  }
  return total;
}
```
```

How many operations? How many assignments? How many additions?

It can become tricky rather quickly.

But as you can see, the *for* loop creates a LOT of extra operations, assignments, and additions.

It could end up being like 5*n* + 2.

### Ok, this is getting too much like math class…

So the good news is, you don’t have to dig this deeply to see what really matters…

And that is:

The number of operations grows roughly *proportionally with n*.

As *n* grows, so does the number of operations.
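You can even watch that growth yourself by instrumenting the loop with a counter (the `ops` counter is purely for illustration; here I only count the additions, which is enough to show the trend):

```
// A counted version of the loop solution
function addUpToCounted(n) {
  let ops = 0;
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i;
    ops++; // count each addition
  }
  return { total, ops };
}

console.log(addUpToCounted(10).ops);   // 10
console.log(addUpToCounted(1000).ops); // 1000
```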

## So what does this have to do with Big-O?

We say that an algorithm is **O(f(n))** if the number of simple operations the computer has to do is eventually less than a constant times **f(n)**, as **n** increases.

Now go back and read that like 4 or 5 more times…

Seriously, read it again.

Still doesn’t make much sense, does it?

Basically, **f(n)** could be:

- linear (f(n) = n)
- quadratic (f(n) = n²)
- constant (f(n) = 1)
- something entirely different

So function 1 is **O(n)**, while function 2 is **O(1)**.
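And for a taste of the *quadratic* case: nest one loop inside another, and the inner body runs roughly n × n times. A contrived sketch (the function name is made up):

```
// Collects every (i, j) pair. The inner loop runs n times for each
// of the n outer iterations, so the work grows as n * n -- O(n²).
function allPairs(n) {
  const pairs = [];
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      pairs.push([i, j]);
    }
  }
  return pairs;
}

console.log(allPairs(3).length);  // 9
console.log(allPairs(10).length); // 100
```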

## How do we make this simpler?

First of all, **constants don’t matter**.

| Actual | Simple |
|---|---|
| O(2n) | O(n) |
| O(500) | O(1) |
| O(13n²) | O(n²) |

Secondly, **smaller terms don’t matter**.

| Longer | Shorter |
|---|---|
| O(n + 10) | O(n) |
| O(1000n + 50) | O(n) |
| O(n² + 5n + 8) | O(n²) |

## Some reminders

- Arithmetic operations are *constant*
- Variable assignments are *constant*
- Accessing elements in an array (by index) or object (by key) is *constant*
- In a loop, the complexity is the length of the loop times the complexity of whatever happens inside the loop
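Those reminders, sketched in code (the names here are made up for illustration):

```
const arr = [5, 10, 15, 20];

// Constant time: index access doesn't depend on the array's size
const first = arr[0]; // O(1)

// Linear time: one pass over the array, constant work per element
function sumArray(a) {
  let total = 0;       // O(1) assignment
  for (const x of a) { // loop runs a.length times
    total += x;        // O(1) work inside, so O(n) overall
  }
  return total;
}

console.log(first);         // 5
console.log(sumArray(arr)); // 50
```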

You don’t need to memorize this stuff…

Just have an idea of how it all relates to the algorithms you use.

I know this was a bit mathy.

## What about the other factors like memory, readability, etc… ?

This post focuses primarily on the TIME it takes an algorithm to run.

If any of the other factors are of concern, I’m sure a quick Google search can get you where you need to be.

Remember, a huge part of web development is just **finding out how to do crap**!

Google is your friend.