Computer Science, asked by sriganesh2550, 1 year ago

To represent one billion (1 billion = 1000 million), the minimum number of bits needed is:
1) 12
2) 15
3) 30
4) 20

Answers

Answered by ruchiagrawal11
The answer is option 3) 30.

With n bits you can represent 2^n distinct values (0 to 2^n - 1). One billion is 10^9 = 1,000,000,000. Since 2^29 = 536,870,912 is smaller than 10^9 while 2^30 = 1,073,741,824 is larger, 29 bits are not enough and 30 bits are the minimum needed.

Hope it helps.


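As a quick check, here is a minimal sketch in Python (assuming a standard Python 3 interpreter); the built-in int.bit_length() returns the number of bits needed to represent a positive integer:

# Minimum number of bits needed to represent one billion.
n = 1_000_000_000          # 10**9
print(n.bit_length())      # prints 30
print(2**29, n, 2**30)     # 536870912 1000000000 1073741824

Because 10^9 falls between 2^29 and 2^30, bit_length() reports 30.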
Answered by sneha8665
30 (option 3). 2^29 < 10^9 <= 2^30 - 1, so at least 30 bits are required.

Hope it will help you.

