For those interested in mere data points ... 62 here and still joyously hacking.
It's been a blast so far, starting with punch cards and Burroughs JCL in the late-60s era of "high priesthood" machine-tending, then on through minicomputers, the internet, microcomputers, the web, and the sudden ubiquity of personal computing devices. My love of sci-fi dates from the mid-60s and I'm still hugely buzzed to find myself living in the future I used to read about; it's an absolutely fascinating time and a real privilege to be working in the field.
Just last night I was using my laptop to watch a YouTube video of a TV programme on dark matter (that's still a personal "wow" on so many levels). The programme mentioned that it was in the 1920s that Hubble first postulated the existence of galaxies other than the Milky Way. The leap from "Hey, those fuzzy patches of light could be galaxies" to "and here we have a map of the dark matter in the observable universe" is almost as mind-boggling as the sheer speed of the advancement; less than 100 years, one person's lifetime.
It's now quite obvious to me (that's all I want to claim) that this period in human history is having a profoundly formative effect on the progress of the species and, should we collectively survive the immense challenges currently facing us, will be a subject of special interest to future historians. So yeah, ageing can give you a greater facility for intuiting a wider view.
The fields of programming, computer science, and software engineering are still young, and (as a cognitivist interested in the cognitive psychology of programmers and programming) ISTM that we have a long way to go before we can develop reliably predictive models of programmers and programming, along with the related issues of learning, ageing, skill acquisition and retention, organisational principles, and so on. Ageing does allow you to watch the same problems being identified over and over again without any really effective solutions being devised. I spent nearly a decade as a cube-monkey in Hubris-Pachyderm labs between the mid-80s and mid-90s, and during that time the problem of the "technical ladder" remained unsolved by our highly-paid senior management, much to their discredit. I note with some disappointment that the issue apparently remains generally unsolved today; that's 30 years of successive cohorts of senior management across the industry who have proved unequal to the task, sigh.
In my personal experience, the fields seem just as vulnerable to fads and fashion as any other formal or semi-formal endeavour. I'm given to understand (by a vastly more knowledgeable colleague) that DeMarco (the bane of my programming life in the 80s) has now recanted on structured analysis. I'm still contemplating the amount of damage inflicted on programmers by that movement, so you might forgive me for having a slightly jaundiced attitude towards contemporary ideas (can't even call them "theories") about pair programming, agile methodology, etc. But I mustn't get started on the subject of propeller-heads making uninformed, uneducated pronouncements on what are essentially topics in the domain of cognitive psychology; that way lies isolation. I've learned to keep my head down and avoid rocking the boat, 'cos I've found it gets right up other people's noses. I realised that all I really have to do is wait: eventually, reality will force them to acknowledge the error of their ways. The trouble is, a decade or two later they've forgotten the entire conversation <spit>. If you live and breathe R&D in this field, you need to get used to the fact that your perceptions of how it's going to play out in 10 years' time will just prompt laughter and disbelief in others; get used to being viewed forever as the loony in the corner.
FWIW, I'm still happy learning new languages and new ways of working, but I'm increasingly picky about what I spend my time on, having wasted so much of it previously on sussing out half-assed but nevertheless seductive notions (e.g. VRML, to pick one at random). I can, and sometimes still do, spend ludicrous amounts of contiguous, hyper-concentrated time on things (up for 3 days and nights), because if I choose my subject carefully I can get through two weeks' familiarisation in three working days. But that's my problem: I'm a generalist, this stuff takes effort for me, and there's so much of it and so much hype that it's become quite difficult not to throw out entire nurseries of infants along with the bathwater. On the plus side, keeping mentally fit is thought to improve one's cognitive reserve [1].
In my case, the peripherals are starting to show distinct signs of wear, but the CPU and RAM still seem adequately rated for the task. Anyway, it's indoor work with no heavy lifting and, as a child of the 50s, I know that's a big plus - but YMMV.
> It's now quite obvious to me (that's all I want to claim) that this period in human history is having a profoundly formative effect on the progress of the species and, should we collectively survive the immense challenges currently facing us, will be a subject of special interest to future historians.
This stood out to me as one of the most beautiful things I've read on hacker news to date, and I completely agree. Thank you!
[1] https://en.wikipedia.org/wiki/Cognitive_reserve