She’s stopping him from being a horder. Deep down he knows he shouldn’t be. But he needs her to tell him no.
Potential horders would understand.
Helpful(?) spelling tip: There’s a difference between a horde and a hoard, and likewise a difference between horders and hoarders.
He’s a horder. The Calendar had nice pictures of Phish from the H.O.R.D.E. tour in the 90s?
My niece spent several years in China and often sent her grandma a calendar – a scroll one-page affair with all the months together under a picture of, say, the Great Wall. 2001, 2002, 2003 and 2005 are still hanging in the kitchen, mostly on top of each other. But I used the inspiration of this post to check and find that indeed 2002 is good to go for this year. So I brought it forward.
Anyway, the pictures could indeed be good and keepable… several of my partner’s family give each other self-published calendars from one of the online services like Printerpix, made up of often personal and relevant memorable images. Be nice to keep them.
Also, many wall calendars, even commercial ones, are annotated with notes, dates, appointments, important events, holiday space selections. In the “we’ve got to look out for each other” set of strategies for dealing with oncoming mental decline, I would suggest that holding on to easy-to-grok aides-mémoire of the shapes of the past few years might be more useful than chucking them into the bin and leaving your house a blanker information- and context-free space – whatever the danger of becoming hoarders. So I am with CIDU Bill, “thanks” for what?
Plus, you never know when you might be nominated for the Supreme Court and need some backup!
“I would suggest that holding on to easy-to-grok aides-mémoire of the shapes of the past few years might be more useful than chucking them into the bin and leaving your house a blanker information- and context-free space – whatever the danger of becoming hoarders. So I am with CIDU Bill, “thanks” for what?”
But that’s you, not them.
I’m personally of the idea that I’d rather not get any of the damned calendars into my barely functioning routine in the first place. I never use them and then they just guilt me in that if I *don’t* use them one day I can never use them that day ever and I just wasted one 365th of the calendar that I can never redeem and that makes me morbidly depressed so I’d rather these foul and disgusting things never enter my house.
But that’s me, not them.
They… are worried about hoarding.
I have my calendars on a word processing program; I print out the year, staple it all together, and file in the filing cabinet. And yes, it’s come in handy at times.
Dog-themed calendars and DiscWorld Calendars are saved on the appropriate book case shelf; I NEVER write on those.
That’s why I never write on regular calendars; as soon as you write something down, the date or time of that event is changed. My handwriting’s terrible, so who wants to look at the calendar . . . a computer calendar that can be changed and printed out a few times/month is much easier for us.
He is not smiling or happy. “Thanks for nothing!” is what he might be thinking.
I went from regular calendars to a calendar app very abruptly last year after I found myself downtown wanting to buy a theatre ticket but not knowing which weekends we were free. I realized that if I broke down and joined the 21st century, I’d never have that problem again.
In the last century I used a paper appointment book in which I kept my record of what I had done for work as well as in general. It was awkward to carry all the time and had limited room for entries.
In the early 2000s Robert needed cataract surgery. He had a Palm Pilot III. During the 2 years or so he put off having the surgery and could not see to use the Palm, he “lent” it to me to see if I liked it. Combined with Lotus Organizer, I found it the best thing since bread itself was invented. The original idea had been for me to get one of my own if I liked it, but he decided that since he had gone without a calendar for about 3 years, he did not need one and told me to keep it.
It was later followed by the Palm Centro cell phone (supposed to be the first smartphone). When we switched cell phone companies some years later (as husband needed a better phone for work) I ended up with a Blackberry – one of the last before they went Android – which was able to sync (using old Blackberry software Robert found for me) with the Lotus Organizer. Alas and alack, the Blackberry phone function died – and, yes, I thought about keeping it and making calls online instead, but 2 weeks later the Internet connection died also and I was dragged kicking and screaming into an Android (the smallest one he could find).
So I carry the Centro around the house and on trips to use with the Organizer and duplicate appointments into the Android. (The Blackberry’s camera still works and I use it in the house, as it is better than the Android’s camera.)
If Robert asks (argues) that last year we were away in such and such on such and such date I can check. If he asks which space we had at the RV park on a trip – I can tell him – it is all in the Organizer. Phone numbers I will probably never use (state senator?) are in the Centro/Organizer but not in the android.
The new laptop – Windows 10 – will have a Win XP virtual machine when we finish setting it up just so I can use Organizer and sync with the Palm in it.
Is eleven years always the right interval? I thought it might matter how many leap years your span took in.
Mitch4: short answer is no, it’s not always 11 years. More details here:
https://www.timeanddate.com/calendar/repeating.html?year=2018&country=27
From that page, it seems it’s either 6 or 11 years, but the explanation as to why is lacking…
365 is 1 more than a multiple of 7, and 366 is 2 more than a multiple of 7. A nonleap year following a nonleap year will have 01 January (and all dates through the year) fall one step further along in the weekday cycle than it was in the previous year. A year (necessarily nonleap) following a leap year will have 01 January (and I think all dates throughout the year) fall two steps further in the weekday cycle. A leap year following a (necessarily) non-leap year will have 01 January (and all dates in Jan and Feb) fall one step along, and dates in the last 10 months of the year falling two steps along.
So:
0M 1Tu 2W 3LTh 4Sa 5Su 6M
0M 1Tu 2LW 3F 4Sa 5Su 6LM 7W 8Th 9F 10LSa 11M
0M 1LTu 2Th 3F 4Sa 5LSu 6M
0LM 1W 2Th 3F 4LSa 5M 6Tu 7W 8LTh 9Sa 10Su 11M
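To make the shift rule above concrete, here is a minimal C sketch (my illustration, not something from the thread; the year range is arbitrary) that steps 01 January forward one weekday per common year and two per leap year, starting from the fact that 2001-01-01 was a Monday, and reports how long each year’s calendar takes to come around again. It shows the 6-, 11- and 28-year gaps discussed in the comments that follow.

```c
#include <stdio.h>
#include <stdbool.h>

/* Gregorian leap-year rule */
static bool is_leap(int y) {
    return (y % 4 == 0 && y % 100 != 0) || (y % 400 == 0);
}

int main(void) {
    static const char *wd[] = {"Su", "M", "Tu", "W", "Th", "F", "Sa"};
    int jan1 = 1;                               /* 2001-01-01 was a Monday */

    for (int y = 2001; y <= 2030; y++) {
        /* walk forward until 01 January lands on the same weekday
           in a year with the same leap status                      */
        int w = jan1, gap = 0;
        for (int z = y; gap == 0; z++) {
            w = (w + (is_leap(z) ? 2 : 1)) % 7;     /* 01 Jan of year z+1 */
            if (w == jan1 && is_leap(z + 1) == is_leap(y))
                gap = z + 1 - y;
        }
        printf("%d%s starts on %-2s -> reusable after %2d years\n",
               y, is_leap(y) ? "L" : " ", wd[jan1], gap);
        jan1 = (jan1 + (is_leap(y) ? 2 : 1)) % 7;   /* step to next year */
    }
    return 0;
}
```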
@ Mitch4 – The interval between matching calendars is not uniform. The gap is usually 6 or 11 years, or in rare cases 12 years (related to leap years at the break between centuries).
I researched this several years ago, after someone sent me a “page per day” desk calendar with scenes from South Park. The calendar had no text or other information (just the pictures), so I thought it was absolutely worthless, but I was thinking of trying to find a dedicated South Park fan on E-Bay, preferably shortly before the next year in which it could be re-used.
Hi Kilby, if you can decode my notations, the scribblings show how to account for two cases of 6 and two cases of 11. (The fourth line should have ended with 11LM.)
I think 2000 was a leap year, so we would need to go back to 1900 or ahead to 2100 to worry about the century exception clause.
Meryl A, I still use my Palm Pilot, too, the “Tungsten” model that syncs with Palm Desktop on the computer. But I don’t use the calendar anymore. I was syncing it (when I was running XP), and got a message that the calendar was full. After it finished syncing, all entries in the calendar on both the Palm and the Desktop were empty! So, fair warning. But I still rely on all my memos and addresses there. They contain everything I need to remember! I have been struggling to keep the Desktop running on W10 — it runs just fine on my laptop, but not my desktop — it keeps losing connection with the data files. So thanks for the info about XP in a virtual window. I hadn’t thought of that, and will investigate.
For a calendar, I now use Google calendar on my phone and browser, and find it works for me. But my phone just doesn’t compare with the Palm for convenient text files that I can sync with my computer so I can type on a real keyboard, and I can group alphabetically in folders. (If anyone has a suggestion for a phone/computer app they like, I’d love to hear about it!) And the battery on my Palm lasts a week or more, while my phone is doing good to last a full day!
“Is eleven years always the right interval? I thought it might matter how many leap years your span took in.”
It *does* matter. And for 2019 the span is 11 years. But sometimes it is 6 years and sometimes it is 28 years.
There are 14 possible calendars, based on the 7 different days a year can start and the 2 ways it can be a leap year (it is or it isn’t). In a 28-year period the 7 leap calendars occur exactly once, and the 7 regular calendars occur exactly three times. Every 28 years the cycle starts over again.
If 2000 was a SaL (first day was Saturday and it was a leap year) and 2001 was a M (first day was Monday and it was not a leap year) the 28 year cycle will be:
SaL:M:Tu:W:ThL:Sa:Su:M:TuL:Th:F:Sa:SuL:Tu:W:Th:FL:Su:M:Tu:WL:F:Sa:Su:ML:W:Th:F
The gap for the year after leap year is 6 years.
The gap for the year before leap year is 11 years.
The gap for two years after leap year is 11 years.
And the gap for leap years is 28 years.
2000 was an exception to the century exception. Every 400 years the century years *are* leap years.
So we can say that every 2800 years the calendars repeat.
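A quick way to check the 28-year listing above is to generate it. This C sketch (again just an illustration, using the same +1/+2 stepping and the fact that 2000-01-01 was a Saturday) prints the cycle from 2000 through 2027; its output matches the SaL:M:Tu:… sequence given above.

```c
#include <stdio.h>
#include <stdbool.h>

static bool is_leap(int y) {
    return (y % 4 == 0 && y % 100 != 0) || (y % 400 == 0);
}

int main(void) {
    static const char *wd[] = {"Su", "M", "Tu", "W", "Th", "F", "Sa"};
    int jan1 = 6;                            /* 2000-01-01 was a Saturday */

    for (int y = 2000; y < 2028; y++) {      /* one full 28-year cycle */
        printf("%s%s%s", wd[jan1], is_leap(y) ? "L" : "",
               y < 2027 ? ":" : "\n");
        jan1 = (jan1 + (is_leap(y) ? 2 : 1)) % 7;
    }
    return 0;
}
```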
“So we can say that every 2800 years the calendars repeat.”
They don’t, but we can say that.
@James Pollock: “They don’t, but we can say that.”
How many legs does a cow have, if we call the tail a leg?
“They don’t, but we can say that.”
They would if we applied current calendar convention for 2800 years. We haven’t and we aren’t going to. But if we did they would.
The (Gregorian) calendar repeats itself exactly every 400 years. You might not find an English calendar from 1619 useful this year but a Spanish calendar from 1619 should do just fine.
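For anyone wondering why the 400-year figure works exactly: a 400-year Gregorian span contains 97 leap days, and the total day count comes to a whole number of weeks, so both the weekday pattern and the leap pattern line up again. A tiny check of the arithmetic (mine, not from the comment above):

```c
#include <stdio.h>

int main(void) {
    /* 400 Gregorian years contain 97 leap days:
       100 multiples of 4, minus 4 century years, plus 1 for the year
       divisible by 400.                                              */
    int days = 400 * 365 + 97;
    printf("days in 400 Gregorian years: %d\n", days);       /* 146097 */
    printf("remainder modulo 7:          %d\n", days % 7);   /* 0      */
    return 0;
}
```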
Lunes, nada
Martes, nada
Miercoles y Jueves, nada
Viernes, por cambio
Un poco mas nada
Sabado otra vez nada
It’s also possible that she’s saving him from becoming his father, saving and reusing calendars, tea bags, paper towels, ziplock bags, or scraps of foil. My family is infamous for saving inkless ballpoint pens, unidentified chargers, and broken small appliances like microwaves.
Mitch4, that’s a QIDU, if it is a quote from something. I mean, I know enough Spanish to translate it, and it even sounds vaguely familiar, but I’m coming up empty, unless it’s from “The Old Man and the Sea.” A little help?
Huh, it seems I got a completely different vibe from this comic than everyone else. For me it reads as a comment on how old they’re both getting; between panels 3 and 4, Janis told Arlo that in 11 years’ time he’ll likely be dead. Arlo saying “Thanks” is sarcasm.
@guero, that’s from a song called “Nothing”, by The Fugs. Not very relevant, I must admit, but it was my first association from MiB’s mention of a Spanish calendar.
“The (Gregorian) calendar repeats itself exactly every 400 years.”
The Gregorian calendar is more accurate than the Julian calendar was, but it also doesn’t quite match up to the celestial timings, either. Let it run for long enough, and it’ll need correction, too. But not in my lifetime.
Kind of the same way we kicked the Y2K computer bug down the road. It’s solved for Y2K, all right, but come Y10K, we’re going to have to go through all of that again. Except in Y10K it’ll be harder to find COBOL programmers.
Long before we have to worry seriously about the Y10K issue, programmers will have to find a solution for the 32-bit Unix timestamp problem, which (if unfixed) will cycle on 19-Jan-2038, bumping everything back to 13-Dec-1901.
Yeah, but you don’t need any COBOL programmers to fix Unix.
I don’t care about accuracy. And I don’t care about whether people actually use a calendar. All I care about is the fun mathematics of periodicity.
A system with 7 days a week and 365 = 52×7 + 1 days a year will start each year a day later and repeat every 7 years. But a system with 365 days a year plus an extra day every four years will be offset, though it will have a 28-year (4×7) cycle of repeating. Every non-leap year will repeat within 6 to 11 years and every leap year every 28 years, but a complete cycle is 28 years.
If, however, every 100 years there *isn’t* a leap year on the years divisible by 100, then our 28-year cycles are off. But we will have a 700-year cycle (700 being the lowest common multiple of 100 and 28).
But if we throw it off further in that every 400 years there *is* a leap year on years divisible by 400, then our 700-year cycles are off and we have a 2800-year cycle. I’m not sure what the greatest gap will be. I think it would be a leap year that would have repeated in 28 years but didn’t, because that year was divisible by 100 and wasn’t a leap year (although it started on the same day), but then it would start on the same day 12 years later (12 years plus 2 leap days is 14 days, or two weeks) and that would be a leap year.
So I think the longest gap between calendars is 40 years. That century will have 28-year cycles, but they will be offset from the 28-year cycles of other centuries. (The cycle of 20th- and 21st-century leap years that start on Monday is 1912, 1940, 1968, 1996, 2024, 2052, 2080. But the cycle for the 19th century was 1816, 1844, 1872. See? It was broken when 1900 was not a leap year.)
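Rather than reason out the worst case by hand, it can be brute-forced. The sketch below (my own, not from the thread; the 1600–2400 range is arbitrary) computes the 01-January weekday for each year from the known Monday of 2001-01-01 and searches for the longest wait until a year’s calendar can be reused; it lands on the same 40-year figure.

```c
#include <stdio.h>
#include <stdbool.h>

static bool is_leap(int y) {
    return (y % 4 == 0 && y % 100 != 0) || (y % 400 == 0);
}

/* Weekday of 01 January (0 = Sunday), anchored on the fact that
   2001-01-01 was a Monday, stepping year by year in either direction. */
static int jan1_weekday(int year) {
    int w = 1;                                                 /* 2001 */
    for (int y = 2001; y < year; y++)  w = (w + (is_leap(y) ? 2 : 1)) % 7;
    for (int y = 2000; y >= year; y--) w = (w + 7 - (is_leap(y) ? 2 : 1)) % 7;
    return w;
}

int main(void) {
    int worst_gap = 0, worst_year = 0;

    for (int y = 1600; y <= 2400; y++) {
        int gap = 1;
        while (jan1_weekday(y + gap) != jan1_weekday(y) ||
               is_leap(y + gap) != is_leap(y))
            gap++;
        if (gap > worst_gap) { worst_gap = gap; worst_year = y; }
    }
    printf("longest wait to reuse a calendar: %d years (e.g. %d -> %d)\n",
           worst_gap, worst_year, worst_year + worst_gap);
    return 0;
}
```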
Interesting discussion about the calendar. Question for Kilby (or anyone) about the UNIX timestamp: Is 13-Dec-1901 represented by all 0’s or by a 1 and the rest 0’s? In other words, will the 1 or the last 0 be lost?
The way the Unix filestamp works is they picked an arbitrary point and labeled that as second #0, and just started counting every second since then as a 32-bit integer. That system works just fine for about 136 years or so, then you have to deal with it being second #0 AGAIN.
The way it will probably be patched is by throwing away the bottom half of the date range and tacking it on again at the end. Seconds 1000…000 to 111 … 111 will keep the original reference point, and seconds 000…000 to 011…111 will have a different reference point. This solution is (relatively) easy to program, and leaves (most) existing filestamps as-is.
This solution gives them another 80 years to decide if they want to go to the “real” solution, which is to switch from a 32-bit number to a 64-bit number.
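For concreteness, here is a small C sketch of the numbers involved (an illustration only, assuming a platform where time_t is 64-bit and gmtime accepts pre-1970 values, as glibc does): it prints the last second representable in a signed 32-bit count from the 1970 epoch and the moment the count wraps to, which are the 2038 and 1901 dates mentioned above.

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Classic time_t: a signed 32-bit count of seconds since the epoch,
       1970-01-01 00:00:00 UTC. Print the last representable second and
       the moment the count wraps to.                                    */
    time_t last = (time_t)INT32_MAX;   /* expected: 2038-01-19 03:14:07 UTC */
    time_t wrap = (time_t)INT32_MIN;   /* expected: 1901-12-13 20:45:52 UTC */
    char buf[64];

    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&last));
    printf("last 32-bit second: %s\n", buf);
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&wrap));
    printf("wraps around to   : %s\n", buf);
    return 0;
}
```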
Thank you, James. And thanks for today’s biggest laugh — Y10K COBOL programmers!
As I understand it, the Unix epoch is Jan. 1, 1970, so the apocalypse in 2038 is 2^31 seconds from that point, not 2^32 seconds, because the leftmost bit is the “sign” bit (1 signifying a negative number). The “simple” solution is to treat all 2^32 values as positive, buying an additional 68 years, but I’m not so sure that is a simple software fix. At any rate, it is buried in the OS rather than applications (assuming the applications use time/date functions correctly).

The Y2K problem was brought about because programmers, in an effort to save space, only provided 2 bytes (digits, characters, whatever) for the year field in date data. Probably the majority of programs were written and maintained in-house, so it was up to these in-house systems to fix their code. (As an old COBOL programmer, I was in the middle of this; I spent New Year’s 2000 at work, monitoring systems.) In my opinion, a large part of the dot-com boom was a result of companies replacing old systems/hardware with Y2K-compliant systems/hardware, rather than mess with tracking down and fixing code. That, and the fact that a lot of mainframe systems could be replaced with small servers running packaged software at 1/10th the cost. Once into the new millennium, all of that corporate money went elsewhere, and the bubble burst.

64-bit computers are already common, so whatever tweaks have to be made will be at the OS or language level. The only issues will come from those applications where the programmers thought they were doing something clever with the date/time computations. That is not to say that a lot of hardware and software will not be made obsolete, so there will probably be another mini dot-com boom as corporations upgrade. You heard it here first. ;-)
“Janis told Arlo that in 11 years’ time he’ll likely be dead.”
Even if Arlo and Janis are as old as they say they are (early 70s) and not as old as they act (late 40s to early 50s), that’s exceedingly grim.
Any software developer can learn COBOL in a few days. It wasn’t that hard to learn even back then. Modern versions have block structure and objects and other good stuff, but you have to know how to live without garbage collection.
“Any software developer can learn COBOL in a few days.”
Unless people who’ve only known OO languages have the same problems going to procedural ones as many who grew up with procedural languages have going to OO.
When I was a lad earning my very first degree, I had the misfortune to show up on campus at the same time that the stupid decision was made that beginning Computer Science students should take all their first- and second-year classes in Pascal, and then and only then should they be permitted to use real programming languages.
I hate Pascal. I consider it an abomination, and refused to put any effort into classes taught in Pascal. Had I been just a couple of years younger, I would have arrived after the Great Pascal Experiment was given up as a bad idea, and had all my first- and second-year classes in C. But I didn’t, so I don’t have a CS degree.
(They ALSO had the Great Idea of moving all the first-year classes off the big computer and into the Macintosh lab. Original, first-generation Macintoshes. So each student had to buy a copy of the Pascal compiler, and the bookstore didn’t order enough copies. So I went through the first month of the class without being able to do anything on the computer, and this was not a good enough excuse for not including the compile report and program output for the first four or five weeks’ worth of programming assignments.)
I suffered from the same kind of problem as JP. In high school we started with Basic and then advanced to Fortran. When I got to college, they put the CS students on the Unix machine (using C), but the engineering students were stuck on a DEC VAX (so we could use Fortran and all of the “wonderful” libraries written in it). I didn’t mind this at the time (I would rather use a card punch than the vi editor), but in retrospect it was a major mistake: I would have been much better off learning a new language, rather than trying to make sense out of dusty old archives.
P.S. @ Bookworm – The problem with the timestamp variable is that it is a 32-bit signed integer. When it counts all the way forward, so that the value is a zero followed by 31 ones, the rollover turns the sign bit to a one (followed by 31 zeros). This has the advantage that it permits date calculation (and storage) nearly 70 years in either direction (from the basis date assigned to the value zero), but that also makes it much harder to just “flip” the meaning of the negative range (by changing the definition to an unsigned integer).
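Kilby’s bit-pattern description, in runnable form: a minimal sketch (illustrative only) that adds one second to the largest positive 32-bit pattern and shows how the same bits read as signed versus unsigned, which is the crux of the “just treat it as unsigned” idea.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t bits = 0x7FFFFFFFu;       /* a zero followed by 31 ones */

    printf("before: 0x%08X = %11d signed, %10u unsigned\n",
           bits, (int32_t)bits, bits);

    bits += 1u;                        /* one more second...         */
    /* now a one followed by 31 zeros; reinterpreting it as signed is
       implementation-defined in older C standards, but on two's-
       complement machines it reads as the most negative value.       */
    printf("after : 0x%08X = %11d signed, %10u unsigned\n",
           bits, (int32_t)bits, bits);
    return 0;
}
```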
JameK and woozy: grim, but it does work.
Pascal was conceived as a language for teaching programming, and accordingly had limitations — for instance that a set can only have as many elements as there are bits in a word — and was probably based on the idea that all the IBM 650’s were gone so the students could no longer have the painful experience of programming the 650 in machine language.
I may have missed it, but it seems that nobody has stated that some non-leap year calendars can be good in 5 years (as well as 6 or 11 years).
Thanks, Kilby. I had forgotten about the sign bit. Too many years ago.
I learned Fortran using the card punch. Then 10 years later I went back to school and learned BASIC and Pascal. That only made me hungry for more, so I took COBOL, which seemed like a lot of busy work, and finally C, which was the most fun. I ended up doing a contract job in, of all things, BASIC, because that was what they wanted it in, then went into IT and ran a small college network. Now I’m watching my grandkids have fun with programming, most of which I’ve forgotten. But it was fun.
“Pascal was conceived as a language for teaching programming, and accordingly had limitations”
More specifically, Pascal was a language designed for teaching structured programming, and as such it actively prevented programmers from being creative.
64-bit timestamps are a thing in most Unix flavors by now. At least at the system level. The chief problem is finding and fixing accidental truncations in applications, for which the level of automated support in readily available tools is abysmal. That and some legacy file formats and protocols, but most of those are already pretty well dead.
James: bollocks. Everyone else who’s interested: look for an article called “Why Pascal is Not My Favorite Programming Language”.
“James: bollocks.”
You can bite me bollocks.
Speaking of coding (we were, weren’t we?) . . .
https://www.gocomics.com/deflocked/2019/01/08
http://onthefastrack.com/comics/january-8-2019/
Having located and read Mr. Kernighan’s paper, I’m puzzled as to why you cited it to support a claim that I’m wrong about Pascal. Granted, Mr. K has a good reason to be biased in favor of C over other programming languages, but he’s undisputedly an expert in the field, and he agreed with me.
If I’d been a couple of years younger, I’d have come to CS after the Pascal fad died out, and I would have been given C as a first programming language instead, and I’d have a CS degree today. Then again, another couple of years, and I’d have arrived in time for C++ and the switch to OOP. I didn’t care for it (though possibly because I had so much time and effort invested into learning functional languages). I got through a class that used C++ as the implementation language, on the technicality that all C programs are ALSO C++ programs.
James: you’ve missed the point. That citation was not in support of anything, it was for the interest of passersby. I did not support my assertion because arguing with you is useless. If you want to actually engage for a change, you might begin by supporting your assertion about actively preventing creativity, and what this has to do with structured programming.
The POINT of structured programming is that everything is planned in advance, often in multiple stages.
Pascal is designed to require this. It actively limits or even prevents late or frequent changes. It is not a tool for solving problems, it is a tool for promoting rigid repetition of “proper” (structured) programming. The intended purpose is to stifle creativity, because creativity is incompatible with rigid process. The designed purpose of Pascal is NOT the creation of useful code, rather, the product of Pascal is programmers who follow the same, predetermined process to create code.
By contrast, C is a language developed by programmers, for programmers, for the purposes of producing useful code. It doesn’t have features intended to make you follow the same procedures for producing code, it has features that facilitate turning ideas into functioning programs.
These are not exactly secrets. I’m surprised you have to have them spelled out for you.
C is also a structured programming language and has almost the same set of language features related to program structure that Pascal does. By your logic, it must also be a tool for promoting rigid repetition of process.
Meanwhile, you keep making these unsupported (and unsupportable) assertions like “it actively limits”. Evidence?
Pascal is not my favorite programming language either, but it is adequate for many things (note how much useful software has been written in Turbo Pascal) and its chief failings have nothing to do with either structured programming as a concept or forty-year-old trendy software methodologies.
Normally this would be something I’d be interested in discussing, as I had a decently long career as a software engineer and graduate student in CS, but I just don’t bother even replying to James at all anymore.