What is big-O notation, and how is it calculated in mathematics?
Answers
Big-O notation (definition): a theoretical measure of the cost of executing an algorithm, usually the time or memory needed, as a function of the problem size n, which is usually the number of items. Informally, writing f(n) = O(g(n)) means that f(n) is at most some constant multiple of g(n) for all sufficiently large n.
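As a minimal worked sketch of how you "calculate" it (the function f(n) = 3n^2 + 5n + 2 below is just an illustrative example, not from the definition above): to show f(n) = O(g(n)), you find constants c > 0 and n_0 such that f(n) <= c * g(n) for all n >= n_0.

    f(n) = 3n^2 + 5n + 2
        <= 3n^2 + 5n^2 + 2n^2    (for n >= 1, since n <= n^2 and 1 <= n^2)
        = 10n^2

So f(n) <= 10 * n^2 for all n >= 1, which means f(n) = O(n^2) with c = 10 and n_0 = 1. In practice you usually just keep the fastest-growing term and drop its constant factor.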