Computer Science, asked by shikhakaur4954, 9 months ago

What is an example of an algorithm with complexity O(n^4)?

Answers

Answered by gjay5599

Answer:

Algorithmic complexity is concerned with how fast or slow a particular algorithm performs. We define complexity as a numerical function T(n), time versus the input size n. We want to describe the time taken by an algorithm without depending on implementation details. But T(n) clearly does depend on the implementation! A given algorithm will take different amounts of time on the same inputs depending on factors such as processor speed, instruction set, disk speed, and brand of compiler. The way around this is to estimate the efficiency of each algorithm asymptotically: we measure time T(n) as the number of elementary "steps" (defined in any way), provided each such step takes constant time.

Let us consider a classical example: addition of two integers. We add two integers digit by digit (or bit by bit), and this defines a "step" in our computational model. Therefore, we say that addition of two n-bit integers takes n steps. Consequently, the total computational time is T(n) = c * n, where c is the time taken by the addition of two bits. On different computers, the addition of two bits might take different times, say c1 and c2; thus the addition of two n-bit integers takes T(n) = c1 * n on one machine and T(n) = c2 * n on the other. Different machines give different slopes, but in each case the time T(n) grows linearly as the input size increases.
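To see the model in code, here is a minimal Python sketch of my own (the function name and bit-list representation are illustrative, not from the answer above) that performs the bit-by-bit addition and counts the elementary steps; the step count comes out equal to n, matching T(n) = c * n.

```python
def add_n_bit(x_bits, y_bits):
    """Add two n-bit integers given as bit lists (least significant bit first).

    Assumes both lists have the same length n (pad the shorter with zeros).
    Each loop iteration is one elementary "step" of the model, so the
    returned step count is exactly n and T(n) = c * n.
    """
    n = len(x_bits)
    carry, steps, result = 0, 0, []
    for i in range(n):
        total = x_bits[i] + y_bits[i] + carry
        result.append(total % 2)   # sum bit at position i
        carry = total // 2         # carry into position i + 1
        steps += 1                 # one elementary step per bit position
    result.append(carry)           # final carry bit
    return result, steps

# 5 + 3 = 8: [1, 0, 1] + [1, 1, 0] -> ([0, 0, 0, 1], 3 steps)
print(add_n_bit([1, 0, 1], [1, 1, 0]))
```

Doubling the number of bits doubles the step count, which is exactly what "grows linearly" means here.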

The process of abstracting away details and determining the rate of resource usage in terms of the input size is one of the fundamental ideas in computer science.

Asymptotic Notations

The goal of computational complexity is to classify algorithms according to their performance. We represent the time function T(n) using "big-O" notation to express an algorithm's runtime complexity. For example, the statement

T(n) = O(n²)

says that an algorithm has a quadratic time complexity.
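To make the statement concrete, here is a minimal sketch of my own (not part of the original answer) of an algorithm with quadratic time complexity, a brute-force duplicate check:

```python
def has_duplicate(items):
    """Return True if any value appears twice in the list.

    The two nested loops compare every pair of the n items, about
    n^2 / 2 comparisons in the worst case, so T(n) = O(n^2).
    """
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False
```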

Definition of "big Oh"

For any monotonic functions f(n) and g(n) from the positive integers to the positive integers, we say that f(n) = O(g(n)) when there exist constants c > 0 and n0 > 0 such that

f(n) ≤ c * g(n), for all n ≥ n0

Intuitively, this means that the function f(n) does not grow faster than g(n); in other words, g(n) is an upper bound on f(n) for all sufficiently large n.

[Figure: graphic representation of the f(n) = O(g(n)) relation — f(n) stays below c * g(n) to the right of n0.]
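As a small illustration of the definition (a sketch of my own, with a hypothetical helper name), the following code empirically checks the inequality f(n) ≤ c * g(n) from a threshold n0 onward. For f(n) = 2n + 1 and g(n) = n, the witnesses c = 3 and n0 = 1 work, because 2n + 1 ≤ 3n whenever n ≥ 1.

```python
def is_bounded(f, g, c, n0, n_max=10_000):
    """Empirically check f(n) <= c * g(n) for n0 <= n <= n_max.

    A finite check does not prove f = O(g), but it illustrates the
    definition: the bound only needs to hold from some n0 onward.
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# f(n) = 2n + 1 is O(n): take c = 3 and n0 = 1.
print(is_bounded(lambda n: 2 * n + 1, lambda n: n, c=3, n0=1))  # True
```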

Examples:

1 = O(n)

n = O(n²)

log(n) = O(n)

2n + 1 = O(n)
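Coming back to the original question: a textbook example of an algorithm with complexity O(n⁴) is a brute-force search over all quadruples of an n-element array, such as the naive 4-SUM check sketched below (the function name and problem choice are my own illustration). Four nested loops examine about n⁴/24 quadruples, so the running time is O(n⁴).

```python
def count_zero_sum_quadruples(a):
    """Count index quadruples i < j < k < l with a[i]+a[j]+a[k]+a[l] == 0.

    Four nested loops over an n-element list examine about n^4 / 24
    quadruples, so the running time is T(n) = O(n^4).
    """
    n = len(a)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                for l in range(k + 1, n):
                    if a[i] + a[j] + a[k] + a[l] == 0:
                        count += 1
    return count

# Exactly one quadruple of [-2, -1, 0, 1, 2] sums to zero: (-2, -1, 1, 2).
print(count_zero_sum_quadruples([-2, -1, 0, 1, 2]))  # 1
```

More generally, k nested loops over the input give O(nᵏ); the sketch above is simply the k = 4 case.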
