Computer Science, asked by monuchaudhari17, 2 months ago

Explain the double dabble method with a suitable example.

Answers

Answered by Anonymous

Answer:

In computer science, the double dabble algorithm is used to convert binary numbers into binary-coded decimal (BCD) notation. It is also known as the shift-and-add-3 algorithm: the binary value is shifted left into the BCD register one bit at a time, and before each shift, any 4-bit BCD digit that is 5 or greater has 3 added to it. Once all bits have been shifted in, each 4-bit group of the result holds one decimal digit. The algorithm can be implemented with a small number of gates in computer hardware, but at the expense of high latency.
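As a suitable example, here is a minimal sketch of the shift-and-add-3 procedure in Python (the function name `double_dabble` and the `digits` parameter are my own choices for illustration):

```python
def double_dabble(n, digits=3):
    """Convert a non-negative integer n to BCD using double dabble."""
    bcd = 0
    for i in range(n.bit_length() - 1, -1, -1):
        # Before each shift, add 3 to every BCD digit that is 5 or more.
        for d in range(digits):
            if (bcd >> (4 * d)) & 0xF >= 5:
                bcd += 3 << (4 * d)
        # Shift the BCD register left by one, bringing in the next bit of n.
        bcd = (bcd << 1) | ((n >> i) & 1)
    return bcd

# 243 in binary is 11110011; double dabble yields BCD 0010 0100 0011,
# i.e. the digits 2, 4, 3.
print(hex(double_dabble(243)))  # → 0x243
```

Tracing 243 by hand: after three shifts the register holds 7, so 3 is added (giving 10) before the fourth shift; this "dabble" step is what keeps every 4-bit group a valid decimal digit.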
