Can anyone explain why 100 years contain only 24 leap years and not 25, without using the divisibility-by-400 rule?
False answers will be reported...
Answers
Answered by
2
Answer:
Explanation:
For better understanding: since you are in class 10, you must have solved arithmetic progressions (AP).
Now consider the 100 years from 2000 to 2100.
The years 2000, 2004, 2008, …, 2096 can be considered leap years.
Now using the AP formula a_n = a + (n-1)d, with a_n = 2096, a = 2000, and d = 4:
2096 = 2000 + (n-1)·4
96 = (n-1)·4
n = 25
NOW YOU CAN SHOW THIS TO THOSE WHO TOLD YOU THAT THERE ARE 24 LEAP YEARS IN 100 YEARS. I HAVE PROVEN IT WITH SIMPLE MATHS.
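The AP count above can be checked directly. This is a minimal sketch that lists every fourth year from 2000 to 2096 inclusive (the terms of the progression) and counts them:

```python
# The terms of the AP: 2000, 2004, ..., 2096 (step d = 4).
leap_candidates = list(range(2000, 2096 + 1, 4))
n = len(leap_candidates)
print(n)  # 25
```

Note that this count includes the year 2000 itself; which 25 (or 24) years fall in "a century" depends on where the century is taken to start, which is what the next answer addresses.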
Answered by
1
As others have been pointing out, under the current, Gregorian calendar, a year is a leap year if and only if one of the following is true:
The year number is divisible by 400;
The year number is divisible by 4 but not by 100.
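The two rules translate directly into code. This is a small sketch of the Gregorian test (the function name `is_leap` is just a label chosen here):

```python
def is_leap(year: int) -> bool:
    """Gregorian leap-year test: divisible by 400, or by 4 but not by 100."""
    return year % 400 == 0 or (year % 4 == 0 and year % 100 != 0)

print(is_leap(2000))  # True  (divisible by 400 -- Rule 1)
print(is_leap(1900))  # False (divisible by 100 but not by 400)
print(is_leap(2024))  # True  (divisible by 4, not by 100 -- Rule 2)
```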
These rules are a rewording of the rules for leap years in the Gregorian calendar listed in subclause 3.2.1 of ISO 8601:2004 (an international standard for the representation of times and dates). Note that there is no rule pertaining to divisibility by 4000—such a rule has been proposed but never accepted by any relevant authority, as there are more pressing problems for such people to deal with than a one-day error every few thousand years.
Rule 2 says that every year number ending in 04, 08, 12, 16, 20, …, 80, 84, 88, 92, and 96 is a leap year. The case 00 is left out because such a year number is divisible by 100, contradicting Rule 2. That is 24 values out of 100 possibilities (the 100 possibilities 00 through 99 constituting a century, that is, 100 years). This is no doubt the origin of the question.
However, Rule 1 provides an exception: even when Rule 2 fails, a year can still be a leap year. Rule 1 allows the year number to end in 00 as long as it is divisible by 400. Thus 1600 and 2000 are leap years (complying with Rule 1), but 1700, 1800, and 1900 are not (violating both Rule 1, since they are not divisible by 400, and Rule 2, since they are divisible by 100). Thus, in some centuries, there is a 25th leap year. More specifically, a 25th leap year occurs in one out of every four centuries, so there are 97 leap years every 400 years, making an average of 24.25 leap years per century when averaged over centuries.
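The 97-per-400 figure can be verified by brute force over one full Gregorian cycle. This sketch uses the years 1601–2000 as an example cycle (any 400 consecutive years would give the same count):

```python
def is_leap(year):
    # Gregorian rules: divisible by 400, or by 4 but not by 100.
    return year % 400 == 0 or (year % 4 == 0 and year % 100 != 0)

# One full 400-year cycle, 1601 through 2000 inclusive.
total = sum(is_leap(y) for y in range(1601, 2001))
print(total)       # 97
print(total / 4)   # 24.25 leap years per century on average
```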
This somewhat complicated set of rules replaces the rule under the Julian calendar, which had a leap year every 4 years (year number divisible by 4) without any further conditions or stipulations. The change came about because the length of the tropical year, on which the seasons are based, is now known to be about 365.24219 d. The Julian calendar added 1 d every 4 years, so on average 0.2500 d each year (and 25 leap days per century). This meant the calendar was treating each year as 365.2500 d, about 0.00781 d too long, which accumulated to 1 d about every 128 years, and the consequent drift of the seasons through the calendar had become noticeable by the Middle Ages. The Gregorian calendar removed 3 leap days every 400 years, keeping 97 leap years every 400 years, which averages out to 24.25 leap days per century and 0.2425 leap days per year; the average length of the Gregorian year is therefore 365.2425 d, only 0.00031 d too long, which accumulates to 1 d approximately every 3300 years (hence the suggestion of a 4000 rule).
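The arithmetic in that paragraph is easy to reproduce. This sketch computes the average Gregorian year length from the 97-leap-days-per-400-years figure and compares it against the tropical year value quoted above:

```python
# Average Gregorian year: 365 whole days plus 97 leap days per 400 years.
avg_year = 365 + 97 / 400
print(avg_year)  # 365.2425

tropical = 365.24219                # tropical year length in days, as quoted
error_per_year = avg_year - tropical  # ~0.00031 d of excess per year
print(round(1 / error_per_year))    # ~3226 years for the error to reach 1 d
```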