# What is ‘n’ in O(n)?

Big O notation is the language we use to describe the execution time and complexity of an algorithm. Understanding Big O notation is the first step to learning data structures and algorithms, and with a firm understanding of it, practice will lead to sharper critical thinking and more optimized code.

The classic Big O complexity chart plots how the runtime of different classes of algorithms grows with input size. In this blog post we will focus only on O(n) ("O of n"). O(n) is linear. But before we get into why it is linear, let's discuss: what is n?

n represents the size of the input. Depending on the complexity of your code, the larger the input, the longer the runtime. Many algorithms can be optimized to better runtimes, sometimes even constant time, O(1), but that's not always possible. To write more optimized code, aim for the fastest runtime you know how to achieve.
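To make the contrast concrete, here is a minimal Python sketch (the function names are just illustrative): one function whose work grows with n, and one that does a constant amount of work no matter how big the list is.

```python
def contains(items, target):
    # Linear search: in the worst case we look at every element,
    # so the work grows with n = len(items)  ->  O(n)
    for item in items:
        if item == target:
            return True
    return False

def first_element(items):
    # Indexing a list takes the same amount of work whether the
    # list has 10 elements or 10 million  ->  O(1)
    return items[0]
```

No matter how long `items` gets, `first_element` does the same amount of work, while `contains` may have to walk the entire list.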

Now that we understand what n represents, we can see why O(n) is linear: the runtime grows in direct proportion to the size of the input. However, O(n) falls toward the bad zone of the complexity chart, because the more inputs we have, the longer our computers take to run through the code.
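We can see that proportional growth directly by counting loop steps. This is a small sketch (the step counter is just for illustration) showing that doubling the input size doubles the work:

```python
def sum_with_counter(items):
    # Sum a list while counting how many basic steps the loop performs.
    steps = 0
    total = 0
    for x in items:
        total += x
        steps += 1  # one step per element -> steps == n
    return total, steps

_, steps_small = sum_with_counter(list(range(1000)))
_, steps_large = sum_with_counter(list(range(2000)))
# steps_large is exactly twice steps_small: linear growth
```

Twice the input, twice the steps; that straight-line relationship between input size and work is exactly what "linear" means in O(n).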