Computer Science, asked by Prathap06, 1 year ago

Assume an algorithm that takes log₂(n) microseconds to solve a problem. Find the largest input size n such that the algorithm solves the problem in at most 24 days.

Answers

Answered by jeehelper

Since the given algorithm is based on time in microseconds, let us convert 24 days into microseconds:

Days into microseconds ---> multiply the number of days by 8.64×10¹⁰ (86,400 seconds per day × 10⁶ microseconds per second)

24 days = 24 × 8.64×10¹⁰ = 2.0736×10¹² microseconds

log₂(n) = 2.0736×10¹²

To remove log₂ from the left side, raise 2 to the power of each side of the equation.

n = 2^(2.0736×10¹²)

n = 2^(2,073,600,000,000) is the largest input size.
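The unit conversion above can be checked with a short script (a minimal sketch in Python; the variable names are illustrative, not from the problem statement):

```python
import math

# Microseconds in one day: 86,400 seconds/day * 1e6 microseconds/second
US_PER_DAY = 24 * 60 * 60 * 10**6          # 8.64e10

days = 24
budget_us = days * US_PER_DAY              # time budget in microseconds
print(budget_us)                           # 2073600000000 = 2.0736e12

# The algorithm takes log2(n) microseconds, so the largest solvable
# input size satisfies log2(n) = budget_us, i.e. n = 2**budget_us.
# That number is astronomically large, so we only report the exponent;
# the inverse relationship can be sanity-checked at a small scale:
assert math.log2(2 ** 64) == 64
print(f"n = 2^{budget_us}")
```

Note that 2^(2.0736×10¹²) is far too large to compute explicitly, which is why the answer is left in exponential form.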

Answered by brainlyinuser

Answer:

n = 2^(2.0736×10¹²)

Explanation:

From the given data we have,

Time taken by the algorithm to solve a problem: log₂(n) microseconds

Time available to find the largest input size: 24 days

The total number of days must be converted into microseconds.

For this,

The number of days is multiplied by 8.64×10¹⁰ (microseconds per day):

24 days = 24 × 8.64×10¹⁰

= 2.0736×10¹² microseconds

In order to remove log₂ we raise 2 to the power of each side of the equation, which results in:

n = 2^(2.0736×10¹²)

n = 2^(2,073,600,000,000)
