Explain the double dabble method with a suitable example.
Answer:
In computer science, the double dabble algorithm is used to convert binary numbers into binary-coded decimal (BCD) notation. It is also known as the shift-and-add-3 algorithm: the binary value is shifted left into a BCD register one bit at a time, and before each shift, 3 is added to any BCD digit that is 5 or greater so that the digit carries correctly into the next decimal place. The algorithm can be implemented with a small number of gates in computer hardware, but at the expense of high latency.
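
As an example, here is a minimal sketch of the shift-and-add-3 procedure in C for an 8-bit input. The function name double_dabble_8bit and the choice of 243 as the test value are just illustrative, not part of any standard library:

```c
#include <stdio.h>
#include <stdint.h>

/* Convert an 8-bit binary value to BCD, one decimal digit per nibble
   (three digits are enough, since the maximum input is 255). */
uint16_t double_dabble_8bit(uint8_t binary)
{
    uint16_t bcd = 0;   /* scratch register holding the BCD digits */

    for (int i = 0; i < 8; i++) {
        /* "Dabble": before shifting, add 3 to any BCD digit that is 5 or more */
        if (((bcd >> 0) & 0xF) >= 5) bcd += 0x003;
        if (((bcd >> 4) & 0xF) >= 5) bcd += 0x030;
        if (((bcd >> 8) & 0xF) >= 5) bcd += 0x300;

        /* "Double": shift left by one, pulling in the next bit of the input,
           starting from the most significant bit */
        bcd = (bcd << 1) | ((binary >> (7 - i)) & 1);
    }
    return bcd;
}

int main(void)
{
    uint8_t value = 243;                 /* 1111 0011 in binary */
    uint16_t bcd = double_dabble_8bit(value);

    printf("binary %u -> BCD digits %u %u %u\n",
           value,
           (bcd >> 8) & 0xF,             /* hundreds digit: 2 */
           (bcd >> 4) & 0xF,             /* tens digit:     4 */
           bcd & 0xF);                   /* ones digit:     3 */
    return 0;
}
```

Tracing 243 (binary 1111 0011) through the loop, the BCD register ends up holding 0x243, i.e. the digits 2, 4 and 3, which is exactly the decimal representation of the input.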