Computer Science, asked by dillonbarnes07, 7 months ago

What is the largest number of binary codes used by Unicode to represent a character?

Answers

Answered by gurleen2717

Answer:

ASCII and Unicode

The largest number that can be held in 7 bits is 1111111 in binary (127 in decimal). Therefore 128 different characters can be represented in the ASCII character set (using codes 0 to 127).
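
As a quick sanity check, here is a minimal Python sketch (added for illustration; it is not part of the original answer) confirming the 7-bit arithmetic:

    # Largest value that fits in 7 bits: all seven bits set to 1.
    largest = 0b1111111
    print(largest)    # 127

    # So 2**7 = 128 distinct codes are available (0 through 127).
    print(2 ** 7)     # 128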


Answered by syed2020ashaels

Answer: ASCII uses 7 bits to encode each character. The largest number that can fit in 7 bits is 1111111 in binary (127 in decimal). Therefore, 128 different characters can be represented in the ASCII character set (using the codes 0 to 127).

Explanation:

Every time a character is typed on the keyboard, a code number is transmitted to the computer.

Code numbers are stored on the computer as binary integers, according to a character set such as ASCII.

The coded characters are the digits 0 to 9, the lowercase letters a to z, the uppercase letters A to Z, basic punctuation marks, the space, and control codes that originated with Teletype machines. For example, a lowercase 'j' is 1101010 in binary and 106 in decimal. ASCII defines 128 characters in total: 33 non-printable control characters (many now obsolete) that affect how text and whitespace are processed, and 95 printable characters, including the space.
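
The question itself concerns Unicode rather than plain ASCII. Unicode keeps ASCII as its first 128 code points and extends the range up to 10FFFF in hexadecimal, and an encoding such as UTF-8 then uses between 1 and 4 bytes (8 to 32 bits) per character. A short Python sketch (added here for illustration, not part of the original answer) shows the 'j' example above and these byte lengths:

    # ord() gives a character's code number; chr() goes the other way.
    code = ord('j')
    print(code, bin(code))    # 106 0b1101010
    print(chr(106))           # j

    # The first 128 Unicode code points are identical to ASCII. Beyond
    # them, UTF-8 spends 1 to 4 bytes (8 to 32 bits) per character.
    for ch in ('j', 'é', '€', '😀'):
        print(ch, hex(ord(ch)), len(ch.encode('utf-8')), 'byte(s)')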

#SPJ6

Related question: https://brainly.in/question/15150538
