Time Complexity in Algorithm Programming
When developing software or working out a solution to a programming problem, we often hear the term "time complexity".
Before we define what time complexity is, we first need to be familiar with the term "algorithm". An algorithm is a finite sequence of well-defined, well-ordered steps to solve a problem or perform a particular task.
For example, let's define an algorithm to check whether a number is odd. We can describe it step by step in a "human-language" way rather than a "computer-language" way: take a number, divide it by 2, and if the remainder is greater than 0, the number is odd.
We defined 3 ordered steps (take a number, divide it by 2, check the remainder) to identify an odd number, and that is an algorithm.
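As a minimal sketch (assuming Python, which the post doesn't specify), those three steps might look like this:

```python
def is_odd(number: int) -> bool:
    # Step 1: take a number (the function argument).
    # Step 2: divide it by 2 and keep the remainder.
    remainder = number % 2
    # Step 3: check the remainder; greater than 0 means the number is odd.
    return remainder > 0

print(is_odd(7))   # True
print(is_odd(10))  # False
```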
To measure whether our algorithm is efficient and optimal, we can use several tools and simple calculations, and one of them is time complexity.
Put simply, time complexity is the amount of "time" an algorithm takes to run and complete its process, expressed in terms of the length of its "input". The length of the input determines the number of operations the algorithm performs, and therefore how much time it needs. The result is often called the execution time (how long one algorithm takes to run and complete a process).
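To build intuition for how input length drives the number of operations, here is a small sketch (the function name is just for illustration) that counts how many additions a simple summing loop performs for inputs of different lengths:

```python
def sum_with_operation_count(numbers):
    # Performs one addition per input element, so the number of
    # operations grows linearly with the input length.
    operations = 0
    total = 0
    for value in numbers:
        total += value
        operations += 1
    return total, operations

for n in (10, 100, 1000):
    _, ops = sum_with_operation_count(list(range(n)))
    print(f"input length {n} -> {ops} operations")
```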
There is a standard way to express time complexity, which assigns each algorithm an order of growth describing how its running time scales: Big O notation (O).
There are several common types of time complexity (in Big O terms), illustrated with short sketches after the list:
- Constant time = O(1)
- Linear time = O(n)
- Logarithmic time = O(log n)
- Quadratic time = O(n²), and so on...
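As a rough sketch (assuming Python, with hypothetical function names chosen for illustration), here is one example of each class:

```python
def constant_time(items):
    # O(1): accessing the first element takes the same time
    # no matter how long the list is.
    return items[0]

def linear_time(items, target):
    # O(n): in the worst case, every element is inspected once.
    for item in items:
        if item == target:
            return True
    return False

def logarithmic_time(sorted_items, target):
    # O(log n): binary search halves the remaining range each step.
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return True
        if sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return False

def quadratic_time(items):
    # O(n²): a nested loop touches every pair of elements.
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs
```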
I will describe these in more detail in the next post.