Computer Science, asked by Pratapchaudhary1, 1 year ago

Write the definition of Bit,Byte and Mb in 10 lines

Answers

Answered by pramodaki2006

BYTE: The byte (/baɪt/) is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer,[1][2] and for this reason it is the smallest addressable unit of memory in many computer architectures.

The size of the byte has historically been hardware dependent and no definitive standards existed that mandated the size – byte-sizes from 1[3] to 48 bits[4] are known to have been used in the past. Early character encoding systems often used six bits, and machines using six-bit and nine-bit bytes were common into the 1960s. These machines most commonly had memory words of 12, 24, 36, 48 or 60 bits, corresponding to two, four, six, eight or 10 six-bit bytes. In this era, bytes in the instruction stream were often referred to as syllables, before the term byte became common.

The modern de facto standard of eight bits, as documented in ISO/IEC 2382-1:1993, is a convenient power of two permitting the values 0 through 255 for one byte.[5] The international standard IEC 80000-13 codified this common meaning. Many types of applications use information representable in eight or fewer bits, and processor designers optimize for this common usage. The popularity of major commercial computing architectures has aided in the ubiquitous acceptance of the eight-bit size.[6] Modern architectures typically use 32- or 64-bit words, built of four or eight bytes.
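For illustration, a minimal Python sketch of the arithmetic behind these sizes:

# Illustrative sketch: value range of an 8-bit byte and common word sizes.
BITS_PER_BYTE = 8
values_per_byte = 2 ** BITS_PER_BYTE             # an n-bit unit holds 2**n distinct values
print(values_per_byte)                           # 256, i.e. the values 0 through 255
print(32 // BITS_PER_BYTE, 64 // BITS_PER_BYTE)  # 4 8 -> bytes in a 32-bit and a 64-bit word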

The unit symbol for the byte was designated as the upper-case letter B by the International Electrotechnical Commission (IEC) and the Institute of Electrical and Electronics Engineers (IEEE),[7] in contrast to the bit, whose IEEE symbol is a lower-case b. Internationally, the unit octet, symbol o, explicitly denotes a sequence of eight bits, eliminating the ambiguity of the byte.

MEGABYTE: The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB. The unit prefix mega is a multiplier of 1,000,000 (10^6) in the International System of Units (SI).[1] Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.

However, in the computer and information technology fields, several other definitions are used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes (2^20 B), a measurement that conveniently expresses the binary multiples inherent in digital computer memory architectures. However, most standards bodies have deprecated this usage in favor of a set of binary prefixes,[2] in which this quantity is designated by the unit mebibyte (MiB). Less common is a convention that used the megabyte to mean 1000×1024 (1,024,000) bytes. NOTE: I don't know about bit.
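For illustration, a minimal Python sketch of the three "megabyte" conventions described above:

# Illustrative sketch: the three historical meanings of "megabyte".
MB_DECIMAL = 1000 ** 2      # SI megabyte: 1,000,000 bytes
MIB_BINARY = 1024 ** 2      # mebibyte (MiB): 1,048,576 bytes, i.e. 2**20
MB_MIXED = 1000 * 1024      # mixed convention: 1,024,000 bytes
print(MB_DECIMAL, MIB_BINARY, MB_MIXED)   # 1000000 1048576 1024000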

Answered by varshneysamyakoxg8tj
bit stands for binary digit. It is used to represent a single binary value, either 0 or 1.
The value of a bit can be 0 or 1 only, which is why it is called a binary digit.

byte
It is a sequence of 8 bits; eight bits combined together make one byte.
Generally, the smallest addressable unit of memory in a computer is 1 byte.

e.g. 10010011 represents 1 byte (8 bits)
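For illustration, a minimal Python sketch that interprets that bit pattern as one byte:

# Illustrative sketch: the 8-bit pattern 10010011 as a single byte value.
bits = "10010011"
value = int(bits, 2)             # parse the string as a base-2 number
print(value)                     # 147, one of the 256 values (0-255) a byte can hold
print(value.to_bytes(1, "big"))  # b'\x93' -- the same value stored in exactly one byte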
MB (often loosely written Mb) stands for megabyte.
1 megabyte = 10^6 bytes

It is a commonly used unit for describing the size of computer data and storage.
e.g. the capacity of a CD is approx 700 MB, and the capacity of the floppy disks used in the old
days is 1.44 MB.
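For illustration, a minimal Python sketch of the floppy figure, assuming the mixed 1000×1024-byte "megabyte" mentioned in the first answer:

# Illustrative sketch: the "1.44 MB" floppy uses the mixed 1000*1024-byte megabyte.
MIXED_MB = 1000 * 1024          # 1,024,000 bytes
floppy_bytes = 1440 * 1024      # 1,474,560 bytes of formatted capacity
print(floppy_bytes / MIXED_MB)  # 1.44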