But Catherine, why would you assume that the modern psychosis of one-upmanship in all things – and its extension, which demands ringing in a new century/millennium a whole year before the old one has completed, on the basis of laziness induced by decades terminology – would have been prevalent a century ago, before the onset of ubiquitous, instant media? (If anything, I wonder whether it would even have been considered a big enough deal to be worthy of a party.) In those days, cooler heads would have won out. Print media, under the guidance of editors who cared more about facts than about hyped instant feedback (not least because the latter didn't exist yet), would have presented the facts of the matter.
Specifically: the new century begins only once the xx00 year has ended.
For starters, the calendar system we use has no year 0. It counts down 3-2-1 BCE, then switches to 1-2-3 CE. The first century needs a full count of 100 years, so it runs from 1 CE to 100 CE. Consequently, the 20th century ran all the way to the end of 2000, inclusive.
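That arithmetic is easy enough to mechanize. A minimal sketch (the function name is mine, purely illustrative):

```python
def century_of(year: int) -> int:
    """Return the CE century containing `year`, given that there is no year 0.

    Century N spans years 100*(N-1)+1 through 100*N inclusive,
    so century 1 is 1-100 and century 20 is 1901-2000.
    """
    if year < 1:
        raise ValueError("CE years start at 1; there is no year 0")
    return (year + 99) // 100

print(century_of(100))   # 1  -> last year of the 1st century
print(century_of(2000))  # 20 -> still the 20th century
print(century_of(2001))  # 21 -> the 21st begins here
```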
The laziness aspect probably comes from the way people refer to age – someone "being in their 20s." We say our age is 20 only once we have completed our 20th year of life (and during the first year, before age 1, age is counted in days, weeks, then months). When that habit is mapped onto the grouping of decades – "the 20s" – based on the tens digit of the year, cognitive laziness triggers a need to apply the same approach to centuries and millennia, despite the logical error.
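The two counting habits can be contrasted directly: everyday age is zero-based (you state the number of *completed* years), while years of life – like centuries – are counted ordinally from 1. A toy sketch, ignoring leap years (names are mine):

```python
DAYS_PER_YEAR = 365  # simplification: leap years ignored

def stated_age(days_lived: int) -> int:
    # Everyday age counts *completed* years: zero-based.
    return days_lived // DAYS_PER_YEAR

def year_of_life(days_lived: int) -> int:
    # Ordinal counting starts at 1, like centuries: your first
    # day already falls within your 1st year of life.
    return days_lived // DAYS_PER_YEAR + 1

print(stated_age(20 * 365))    # 20 -> "in their 20s"
print(year_of_life(20 * 365))  # 21 -> the 21st year has begun
```

Mixing the two – reading an ordinal count as if it were a zero-based label – is exactly the error that moves the century boundary a year early.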
Perhaps the best approach in a case like this, other than avoidance, is to have the accuracy challenged within the presented events themselves, thereby establishing what is correct and what is merely (modern) cultural laziness.