Computer Science, asked by Rishitanawal7510, 11 months ago

Assume an algorithm takes log₂ n microseconds to solve a problem. Find the largest input size n such that the algorithm solves the problem in at most 24 days.

Answers

Answered by jeehelper

Since the algorithm's running time is given in microseconds, first convert 24 days into microseconds:

1 day = 24 × 60 × 60 s = 86,400 s = 8.64 × 10¹⁰ microseconds

24 days = 24 × 8.64 × 10¹⁰ = 2.0736 × 10¹² microseconds

Set the running time equal to this time budget:

log₂(n) = 2.0736 × 10¹²

To undo the log₂ on the left side, raise 2 to both sides of the equation:

n = 2^(2.0736 × 10¹²)

So n = 2^(2,073,600,000,000) is the largest input size. The number itself is astronomically large, so it is left in exponential form.
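The conversion and the final exponent can be checked with a few lines of Python. This is just a sketch of the arithmetic above; the constant names are my own:

```python
# Convert the 24-day budget into microseconds, then report the exponent
# of the largest solvable input size n = 2**t.
MICROSECONDS_PER_DAY = 24 * 60 * 60 * 10**6  # 86,400 s/day × 10^6 µs/s = 8.64e10

t = 24 * MICROSECONDS_PER_DAY  # total budget in microseconds
print(t)  # 2073600000000, i.e. 2.0736e12

# n = 2**t is far too large to print in full (over 6e11 decimal digits),
# so we state it as a power of two instead.
print(f"largest n = 2^{t}")
```

Computing `2**t` directly would exhaust memory, which is why the answer is reported as an exponent rather than a decimal number.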
