Computer Science, asked by sumi1002, 1 year ago

Assume an algorithm takes log2 n microseconds to solve a problem. Find the largest input size n such that the algorithm solves the problem within 24 days.

Answers

Answered by brainlyinuser

Answer:

n = 2^(2.0736 × 10^12)

Explanation:

From the given data we have,

Time taken by the algorithm to solve a problem of size n: log2 n microseconds

Largest input size n to be found for a time budget of: 24 days

The total number of days must first be converted into microseconds. One day contains 24 × 60 × 60 × 10^6 = 8.64 × 10^10 microseconds, so:

24 days = 24 × 8.64 × 10^10

= 2.0736 × 10^12 microseconds

Setting log2 n equal to this budget and raising 2 to both sides of the equation removes the logarithm:

log2 n = 2.0736 × 10^12

n = 2^(2.0736 × 10^12)
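The conversion and the resulting exponent can be checked with a short script. This is a minimal sketch, assuming the runtime is exactly log2 n microseconds; the value of n itself (2 raised to roughly two trillion) is far too large to materialize as an integer, so only the exponent is computed.

```python
# Largest n such that log2(n) microseconds <= 24 days.
# The answer is n = 2**budget_us, where budget_us is the
# time budget expressed in microseconds.

MICROSECONDS_PER_DAY = 24 * 60 * 60 * 10**6  # 8.64e10

budget_us = 24 * MICROSECONDS_PER_DAY  # exponent of 2 in the answer

print(budget_us)  # 2073600000000, i.e. 2.0736e12
```

So the largest solvable input size is n = 2^2,073,600,000,000; the number n itself has about 6.2 × 10^11 decimal digits and cannot be printed in practice.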
