Computer Science, asked by akshaysajeev252, 1 year ago

Assume that an algorithm takes log₂ n microseconds to solve a problem. Find the largest input size n such that the algorithm solves the problem within 24 days.

Answers

Answered by dhruv0002

The running time is log₂ n microseconds, and 1 microsecond = 10⁻⁶ second.

First convert 24 days into microseconds:

24 days = 24 × 24 hours = 24 × 24 × 60 × 60 seconds

= 2,073,600 seconds

= 2,073,600 × 10⁶ microseconds

= 2.0736 × 10¹² microseconds

Now equate the running time to the available time:

log₂ n = 2.0736 × 10¹²

So, n = 2^(2.0736 × 10¹²)

This is the largest input size solvable in 24 days. Because the running time grows only logarithmically, n is astronomically large (far too large to write out in decimal), which is the point of the exercise.
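As a sanity check (not part of the original answer), the exponent log₂ n is just the number of microseconds in 24 days, which a few lines of Python can confirm; the variable names here are illustrative:

```python
# The algorithm takes log2(n) microseconds, so the largest solvable n
# satisfies log2(n) = (microseconds in 24 days), i.e. n = 2**exponent.
MICROSECONDS_PER_SECOND = 10**6

seconds_in_24_days = 24 * 24 * 60 * 60          # 24 days -> 2,073,600 seconds
exponent = seconds_in_24_days * MICROSECONDS_PER_SECOND

print(exponent)  # 2073600000000, i.e. n = 2**2073600000000
```

Actually evaluating 2**2073600000000 is infeasible (the result would need roughly 2 × 10¹² bits of storage), so the answer is left in exponential form.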
