Explain the double dabble method with a suitable example.
Answer:
In computer science, the double dabble algorithm (also called shift-and-add-3) converts a binary number into binary-coded decimal (BCD). The binary number is shifted left, one bit at a time, into a BCD register; before each shift, 3 is added to any 4-bit BCD digit whose value is 5 or greater, so that the following shift produces the correct decimal carry instead of an invalid digit. After as many shifts as there are input bits, the BCD register holds the decimal digits. The algorithm can be implemented with a small number of gates in hardware, at the expense of relatively high latency. For example, converting 1111 0011 (decimal 243) takes eight shift steps and leaves the BCD digits 0010 0100 0011, i.e. 2, 4, 3.
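A minimal software sketch of the same idea is shown below, written in Python for illustration (the function name, digit layout, and 8-bit default are my own choices, not part of the original answer). It keeps one list entry per BCD digit, applies the "add 3 if the digit is 5 or more" rule before every shift, and then shifts the whole register left by one bit while pulling in the next input bit.

```python
def double_dabble(value: int, num_bits: int = 8) -> list[int]:
    """Convert an unsigned integer to its BCD digits (most significant first)
    using the double dabble / shift-and-add-3 algorithm."""
    num_digits = len(str((1 << num_bits) - 1))   # decimal digits needed for num_bits
    bcd = [0] * num_digits                       # bcd[0] = least significant digit

    for i in range(num_bits - 1, -1, -1):        # feed input bits MSB-first
        # "Dabble" step: add 3 to every digit that is 5 or greater,
        # so the next shift carries correctly into the next decimal digit.
        for d in range(num_digits):
            if bcd[d] >= 5:
                bcd[d] += 3

        # "Double" step: shift the whole BCD register left by one bit,
        # bringing the next binary input bit in at the bottom.
        carry = (value >> i) & 1
        for d in range(num_digits):
            shifted = (bcd[d] << 1) | carry
            bcd[d] = shifted & 0xF               # keep 4 bits per digit
            carry = shifted >> 4                 # overflow moves to the next digit

    return list(reversed(bcd))                   # most significant digit first


# Example from the answer: 1111 0011 (decimal 243) -> digits [2, 4, 3]
print(double_dabble(0b11110011, 8))
```

Running the example prints `[2, 4, 3]`, matching the hand-worked conversion of 243 above; the hardware version performs the same add-3 corrections and shifts with gates and registers instead of loops.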