There was once a programmer who wrote software for personal computers. "Look at how well off I am here," he said to a mainframe programmer who came to visit. "I have my own operating system and file storage device. I do not have to share my resources with anyone. The software is self-consistent and easy-to-use. Why do you not quit your present job and join me here?"
The mainframe programmer then began to describe his system to his friend, saying, "The mainframe sits like an ancient Sage meditating in the midst of the Data Center. Its disk drives lie end-to- end like a great ocean of machinery. The software is as multifaceted as a diamond, and as convoluted as a primeval jungle. The programs, each unique, move through the system like a swift-flowing river. That is why I am happy where I am."
The personal computer programmer, upon hearing this, fell silent. But the two programmers remained friends until the end of their days.
The wise programmer is told about Tao and follows it. The average programmer is told about Tao and searches for it. The foolish programmer is told about Tao and laughs at it.
In 1999, a COBOL programmer, tasked with updating bank software for the Y2K switch, became overwhelmed.
"Look at all of this code!" he exclaimed. "I'll never be able to fix it all in time!"
Horrified by the media depictions of apocalypse brought by financial meltdown and software-launched nuclear warheads, all because of the Y2K bug, he became very anxious.
So, he went to the cryogenic freezing facility and told them, "Wake me up when Y2K is over."
He lay in the cryotube and gently fell asleep as the cold began to overtake him.
The next thing he knew, he was lying on a bed in a warm room, bright lights and white walls giving the room a certain sterility. A doctor in a lab coat was standing over him holding a clipboard.
"Oh, thank heavens!" said the programmer. "We've made it out alive!"
"Yes," said the doctor. "The year is 2999, and it says in your chart you know COBOL?"
And yet, if you didn't mind that kind of work, you could have been making a fortune from 1999 to now. You'll be retired before COBOL is no longer used.
There is a large market for maintaining legacy COBOL, SNOBOL, and various esoteric assembly codebases. Additionally, APL and its derivatives are not entirely dead.
SNOBOL should be looked at, if for nothing else, for its string handling (add a brief look at Icon right afterwards). I would have killed to take APL in college, but despite having to program on an IBM 370 mainframe, no classes were offered. I'm still a little ticked.
I have played with it over the years, but I was rather annoyed that, having to put up with all the disadvantages of using an IBM 370, we didn't get some of the advantages.
And depending on when your brother-in-law made the statement, he stood just as much of a chance or greater of being right than you did. COBOL will always be around (those 80 billion lines of code don't maintain themselves), but microprocessors and PCs could have broken any number of ways. Granted, a good knowledge of C and Unix will go a long way and be applicable to whatever ends up "winning", but it was no guarantee of the exact language and OS that would keep one employed.
My interests in C, compilers, small computers and Unix at the time weren't driven by career aspirations. I was just genuinely interested in those things. Maybe I was "unemployable".
Meh, I'm down with that. I would be a much richer man today had I chosen more wisely. But I would not be a happier man.
Maybe I was "unemployable".
Despite my comment to the (somewhat) contrary, as long as you're not a specialist in BeOS or the like, in software you'll almost always make money doing what you like to do. Again, whatever ends up winning, the abstractions and concepts you pick up will generally prove useful wherever you end up. You might not be raking in the Benjamins like your SAP buddy over there pulling down $350K/year, but you'll be relatively wealthy and happy.
In the late '80s, my academic advisor said he wouldn't sign off on me taking a COBOL class because it was a business-department IT class and would not go with the rest of my studies[1]. I wasn't really into the idea anyway and filled my schedule with other classes. I do wonder what other CompSci departments at the time were doing.
1) The EE department was the sole teacher of FORTRAN, and I decided against that as well, more because of a teacher problem than any lack of desire to learn FORTRAN, which I later did (without the all caps).
We had it too (I went to a mixed IT-business university). I think the idea was to get some perspective on what COBOL is, so if you're managing a company one day and IT people you hire tell you that something is hard to do, or takes too much time because the system is in COBOL, you know what they're talking about and don't think that they're just messing with you.
You know, I really didn't know then or now. The Business school taught COBOL and RPG on, I do believe, an AS/400 they owned separate from the rest of the computer infrastructure. I might be remembering wrong, but I was pretty sure that was the story[1]. I think it was something to do with the information systems degree requirements.
1) My dad did IT on an AS/400; I still hear "wand in" instead of "log in".
I once had the pleasure to meet a young person (~30 years old at the time) who had learned to program at a bank, so I had to ask her about COBOL.
Her reply was interesting, if I looked at it as a general-purpose language like C++/C#/Java, then, yes, it sucks really hard. But if you look at COBOL as a DSL for building specific business applications, it does not look all that bad.
Having never used COBOL beyond a simple HELLO-WORLD, I am not sure I can judge it either way. But her reply sounded surprisingly pragmatic to me.
if I looked at it as a general-purpose language like C++/C#/Java, then, yes, it sucks really hard.
Plenty of people get by swinging a hammer all day and not a Swiss army knife. COBOL is one person's hammer. For me, at one point, it was RPG. You think COBOL is "domain specific"? RPG has the DSL right in the name: "Report Program Generator". Yuppers, RPG is a report writer. And when you're writing banking software or the like, which is just inputs munged into outputs, RPG did the job. Why use a language with shitty, dangerous string handling, no syntactic sugar for the task at hand, whose output keyword consists of "printf", dragging along a runtime with crap I'll never use, and other "general purpose" language problems when you can use a language crafted specifically for what you're trying to do?
I made plenty of money back in the day writing FoxPro. FoxPro sucks for embedded systems and writing an operating system. But ya know what it doesn't suck at? Taking some data, transforming it, doing a little munging, and printing some output. Which was precisely what customers were paying me to do at the time, and not writing a Windows competitor.
In my first job after my training, I had a boss who had practically single-handedly written a couple of bespoke line-of-business applications for customers in FoxPro.
Three years ago I had to do some maintenance on an old-ish C++ application (also a one-of-a-kind thing) that used a dBase backend for storage. I was lucky, though: the code was very well-written and, let's say, humble. It avoided unnecessary complexities whenever possible, and even without comments or any other documentation, I was able to figure out how it worked (at least the part that I was supposed to modify).
Part of the fun was getting a development environment to work with an ancient version of Visual Studio, but the second fun part was accessing that dBase database. I tried a couple of programs, none of them overly helpful, until I realized that ____ing Excel could open them. I don't know how well Excel supports interacting with these, but it worked well enough for inspection. ;-)
Excel should have worked just fine. The DBF format is not hard to understand, and it was documented in the manual that came with (IIRC) both dBase and FoxPro (I can still picture the page in my head from the FoxBase+ manual). Just a header with a record count and some field definitions, then the data. I haven't tried, but I bet with the docs in front of me I could write a parser in any number of languages in about 30 minutes, 30 minutes more to write to the DBF without corrupting it. You're on your own for indices, I never had to crack that nut. :-)
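In that spirit, here is a minimal sketch of such a parser in Python, written from memory of the documented dBase III layout (record count and sizes at fixed header offsets, 32-byte field descriptors terminated by a 0x0D byte, then fixed-width records with a deletion flag). Treat the offsets as assumptions to double-check against the actual manual, and as the comment says, indices are not handled:

```python
import struct

def read_dbf(data: bytes):
    """Parse a dBase III .DBF file from bytes; returns (fields, records)."""
    # Header: byte 0 version, bytes 1-3 last-update date, then
    # record count (uint32), header length and record length (uint16 each).
    nrecords, header_len, record_len = struct.unpack_from("<IHH", data, 4)

    # Field descriptors start at offset 32, 32 bytes each, 0x0D terminates.
    fields = []
    offset = 32
    while data[offset] != 0x0D:
        name = data[offset:offset + 11].split(b"\x00")[0].decode("ascii")
        ftype = chr(data[offset + 11])     # 'C', 'N', 'D', 'L', 'M', ...
        length = data[offset + 16]         # field width in bytes
        fields.append((name, ftype, length))
        offset += 32

    # Records follow the header; each starts with a deletion-flag byte.
    records = []
    for i in range(nrecords):
        rec = data[header_len + i * record_len:
                   header_len + (i + 1) * record_len]
        if rec[0:1] == b"*":               # '*' marks a deleted record
            continue
        row, pos = {}, 1
        for name, ftype, length in fields:
            row[name] = rec[pos:pos + length].decode("ascii").strip()
            pos += length
        records.append(row)
    return fields, records
```

Writing back without corrupting the file would mostly be the same layout in reverse, plus updating the record count in the header.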
That is what that programmer told me, isn't it? COBOL was designed for a specific purpose, and according to her, having used it for that exact purpose, it gets the job done. COBOL seems to be the language hackers love to hate, but there is a reason it is still around.
During my training I once had a supervisor who spent years writing reports in RPG on an AS/400. She had interesting stories to tell, but unfortunately, she was always far too busy to actually tell them. :-(
I don't buy the DSL argument, as it's not more suitable for any domain than any other language unless shuffling COBOL specific data structures from a DL/1 database counts as a domain.
Interestingly, Lisp is (slightly) older than COBOL and is actually truly living up to the DSL thing - at least nowadays.
Both languages are actually centered around data structures, kind of, but while COBOL is massively built on fixed structures and boilerplate, Lisp is built around dynamic structures of lists (and pairs, or CONS cells).
It's interesting that they were invented more or less at the same time, as COBOL feels as dated as anything can be, and Lisp feels rather ok. Scheme (a dialect) is more than 40 years old and is actually rather fun to program in, and feels modern in more or less every sense.
Lisp source code is built around structures of lists built on cons pairs (and numbers, symbols, string literals, other kinds of literals, ...). (COBOL source code is just characters.)
Lisp data structures are varied. CLOS objects, multi-dimensional arrays, hash tables, strings, bit fields, ...
The Scheme of today is extremely different from the Scheme of 40 years ago. Comparing the language of R0RS and R1RS with R5RS, you'll see just how different they are. I'd say that "modern" Scheme is due more to Clinger than to Sussman & Steele!
My point is that, yes, Scheme as a concept originated in the mid 70s, but it has been updated many times over the years. The Scheme of today isn't 40 years old.
Indeed, it's an excellent language for what it was designed for: mainline data processing, consisting primarily of reading databases and creating reports.
>> the-values hold information about variables received. At the moment, these come from PATH (GET and POST are on the way). You can access them in the same order they occur in path.
Well, this is kinda the crux of the situation, isn't it? Query parameter parsing and POST body parsing: you can't do much without these. Is this to be implemented in COBOL? I really hope not; parsing anything is quite untenable in COBOL. I believe GnuCOBOL (formerly OpenCOBOL) has pretty good .so binding, so you could conceivably use third-party utilities like Apache APR request parsing.
Why is parsing untenable in COBOL? Because idiomatic COBOL doesn't provide dynamic memory allocation or associative arrays, so you have to build up all that infrastructure from scratch.
But the absence of dynamic memory allocation is also a characteristic that makes COBOL extremely stable and safe, and mainframe applications legendarily bulletproof. All memory is required to be statically declared at compile time (similar to CSECTs in assembler), so the runtime can ensure user programs play nice with each other.
You're the first person I've run into saying that who isn't a mainframe programmer. It was an interesting language in the summaries I read, but I never really dug into it. What was good about Rexx for writing parsers?
I've worked for a big financial corp. I was in one of their mainframe teams for a year or so [1]. There's still a lot I don't know but I sure know Rexx is the most fun you can have on a mainframe. It's kind of like the polar opposite of JCL, or getting stuck on a TSO command line.
The first reason why Rexx is great for parsing is the fact that variables equal themselves. There's no "string" data type. If you type "myvar" then that's the name and the contents of your variable (unless you assign it another value):
DO INDEX = 1 TO 10 BY 2
SAY HELLO
END
/* Outputs:
HELLO
HELLO
HELLO
HELLO
HELLO
***
*/
Skipping the assignment reduces clutter and makes it very easy to see what you're dealing with. Unless you mess it up later, obviously.
Second, for parsing, Rexx has a keyword instruction called ...(drumroll)... PARSE that essentially implements pattern matching of a sort, probably as a regular automaton like regexes, but without the regex syntax, which is actually really nice. Instead, you define a pattern as a string of literals and variables, plus the special symbol '.' (a period) to indicate any token (similar to the regex '.', except that it matches whole space-delimited tokens).
Here's an example from my notes from that time (spaces delimit literals and variables) [2]:
/* Fun with patterns at the REXXREPL*/
BANDS = 'SLAYER, BATHORY, MOTORHEAD, VENOM'
PARSE VAR BANDS ABAND ', ' BEST ', ' REST
SAY ABAND
SLAYER
SAY BEST
BATHORY
SAY REST
MOTORHEAD, VENOM
PARSE VAR BANDS . ', ' BEST ', ' REST /* Skipped Slayer */
SAY BEST
BATHORY
SAY REST
MOTORHEAD, VENOM
You can also use numbers to indicate positions in a string (absolute, in the example below, but there's also relative ones):
DIGITS = '11001011'
PARSE VAR DIGITS FIRSTTWO 3 NEXTFOUR 7 LAST
SAY FIRSTTWO
11
SAY NEXTFOUR
0010
SAY LAST
11
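For readers without a mainframe at hand, roughly the same two templates can be sketched in Python: str.partition stands in for the literal delimiters, and plain slicing for the absolute positions (Rexx columns are 1-based, hence the off-by-one in the slices). Variable names mirror the Rexx above:

```python
bands = "SLAYER, BATHORY, MOTORHEAD, VENOM"

# PARSE VAR BANDS ABAND ', ' BEST ', ' REST
aband, _, tail = bands.partition(", ")
best, _, rest = tail.partition(", ")
print(aband)  # SLAYER
print(best)   # BATHORY
print(rest)   # MOTORHEAD, VENOM

digits = "11001011"

# PARSE VAR DIGITS FIRSTTWO 3 NEXTFOUR 7 LAST
# Column 3 in Rexx is index 2 in Python, column 7 is index 6.
firsttwo, nextfour, last = digits[0:2], digits[2:6], digits[6:]
print(firsttwo)  # 11
print(nextfour)  # 0010
print(last)      # 11
```

Of course, the Rexx template does all of this declaratively in one line, which is rather the point.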
On top of that, Rexx has "compound variables", a data structure that (to me) resembles a tree and lets you compose hierarchical structures. For instance, here's part of a program that reads in JCL files (formatted to a standard format with another Rexx program so that everything of the same kind is in the same column) and stores their DD, dataset, step and program names in such a structure:
/*PARSE JOBNAME OFF FIRST LINE IN JCL FILE*/
PARSE VAR CA.1 '//' JOBNAME .
/*PARSE OTHER NAMES INTO A STEM CALLED 'MATCHES.'
EACH TYPE OF NAME GOES INTO ITS OWN TAIL
TAILS FOR DDNAMES AND DATASET NAMES ARE INDEXED IN TANDEM
SAME FOR TAILS WITH STEP AND PROGRAM NAMES
IN OTHER WORDS, MATCHES. IS STRUCTURED AS FOLLOWS:
TAIL INDEX CONTENTS
---------- --------
MATCHES.DDNAMES.N N'TH DD NAME FOUND IN JCL FILE
MATCHES.DSNAMES.N DATASET NAME FOR N'TH DD NAME
MATCHES.STEPNAMES.M M'TH STEP NAME
MATCHES.PROGNAMES.M PROGRAM NAME FOR M'TH STEP NAME
SO, TO GET THE DATASET NAME THAT CORRESPONDS
TO THE 5TH DD AND DATASET NAME
FOUND IN THE JCL FILE, YOU LOOK INTO:
MATCHES.DDNAMES.5
MATCHES.DSNAMES.5
*/
DO INDEX = 2 TO CA.0 BY 1
PARSE VAR CA.INDEX '//' DDNAME . 'DSN=' DSNAME ','
/* IF DD NAME IS EMPTY, DON'T ASSUME DDNAME='DD'*/
IF DDNAME = 'DD' THEN
DDNAME = ''
PARSE VAR CA.INDEX '//' STEPNAME . 'EXEC' PROGNAME ','
IF DSNAME ¬= '' THEN DO
MATCHES.DDNAMES.INDEX = DDNAME
MATCHES.DSNAMES.INDEX = DSNAME
END
IF PROGNAME ¬= '' THEN DO
MATCHES.STEPNAMES.INDEX = STEPNAME
MATCHES.PROGNAMES.INDEX = PROGNAME
END
END
(The above will probably make more sense if you've seen a jcl script before).
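For comparison, a rough Python analogue of that compound-variable structure, using nested dicts and regexes in place of stems, tails and PARSE templates. The JCL-ish lines here are made up for illustration, and the regexes are simplified stand-ins for the Rexx templates:

```python
import re

# Hypothetical, pre-formatted JCL-ish lines, as the comments above assume.
lines = [
    "//MYJOB    JOB (ACCT),'EXAMPLE'",
    "//STEP1    EXEC PGM=IEFBR14,",
    "//DD1      DD   DSN=MY.DATA.SET,DISP=SHR,",
]

# Rough equivalent of the MATCHES. stem: one dict per tail.
matches = {"ddnames": {}, "dsnames": {}, "stepnames": {}, "prognames": {}}

# PARSE VAR CA.1 '//' JOBNAME .
jobname = lines[0].removeprefix("//").split()[0]

for index, line in enumerate(lines[1:], start=2):
    # PARSE VAR CA.INDEX '//' DDNAME . 'DSN=' DSNAME ','
    m = re.match(r"//(\S+)\s+DD\s+DSN=([^,]+),", line)
    if m:
        matches["ddnames"][index] = m.group(1)
        matches["dsnames"][index] = m.group(2)
    # PARSE VAR CA.INDEX '//' STEPNAME . 'EXEC' PROGNAME ','
    m = re.match(r"//(\S+)\s+EXEC\s+PGM=([^,]+),", line)
    if m:
        matches["stepnames"][index] = m.group(1)
        matches["prognames"][index] = m.group(2)
```

The nested-dict shape mirrors the stem: matches["dsnames"][5] plays the role of MATCHES.DSNAMES.5.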
The point is that all of that is very hard to do in COBOL and, I believe, impossible in JCL [3]. In Rexx, on the other hand, it's a breeze.
I also note that all of the above is nice to have in any language and that not very many of the languages popular outside of mainframes let you parse strings that easily, without explicitly calling a regex library or such.
_______________
[1] I raised hell until they put me in one of those teams. I could get plenty of experience with Spring and Hibernate outside of that corp. Since I was there, I figured I might as well learn something I couldn't learn anywhere else as easily. Still, people looked to me as if I was mad. "You want to be on a mainframe team? Why?".
Dude. Big computers. Millions of users. What the hell?
[2] Note that the "REXXREPL" in my comment is a Rexx program running on TSO (the Z/OS command line, ish):
/* REXX INTERPRET */
REPLPROMPT = "SAY '>>'"
DO FOREVER
INTERPRET REPLPROMPT
PULL USERINPUT
IF USERINPUT = "END" THEN EXIT 0
INTERPRET USERINPUT
END
The INTERPRET instruction is another funky bit of Rexx that's real cool to have on a mainframe.
[3] Though I'm sure there's someone, somewhere still on this planet who would laugh in my face for saying that. But probably no more than four or five people.
Appreciate the examples. Yeah, that looks almost like a 4GL compared to regular mainframe programming. A lot more powerful.
"Since I was there, I figured I might as well learn something I couldn't learn anywhere else as easily. Still, people looked to me as if I was mad. "You want to be on a mainframe team? Why?". Dude. Big computers. Millions of users. What the hell?"
I like your style. Same excuse I had for trying (but failing, sigh) to get onto a team with SGI NUMA machines. They were confused about why I'd want to do tedious work tweaking numerical applications. For 256 CPUs and 3 TB of RAM in one machine? Obviously...?
Note: I imagined it had time in between jobs and a quota system not designed with people like me in mind, too. ;)
> But, the absence of dynamic memory allocation is also a characteristic that makes cobol extremely stable and safe, and mainframe applications legendarily bullet proof.
Of course, it can also make programs curiously limited.
If you have a COBOL program that loads its configuration into memory before processing data, you may find that too much configuration (think things like rules, transaction codes, etc) results in your program no longer running because the array that holds it isn't big enough.
You folks think you're being funny, but I have to vary off the interface on our iSeries to move it to a different ethernet switch, then do an IPL over the weekend.
What's the reliability of the iSeries models in terms of the per-server failures? And have they worsened or gotten better on reliability or admin side than the AS/400's that preceded them? Just curious about such things.
It never fails. It's a bit grumpy (like: do not unplug the ethernet without telling it first), but it's rock solid. Calling IBM is pretty easy, and I've never had a real problem with them. I would say it would probably be even better if we had a newer unit, since ours is about 10 years old.
I remember this from a few years ago. The only recent updates appear to be a change to the tutorial link.
If you haven't clicked through to the inspiration (http://www.coboloncogs.org/HOME.HTM) then it's well worth a look. Although the Rails page doesn't look like that any more, so it's not quite as funny.
There are other non-free webframeworks that run COBOL and COBOL-like backends. One I am familiar with is WEbPortal from a company called Zortec in the US.
It works very well, is very fast, and is under active development. Although the backend can be written in COBOL, making porting to the web quite easy, the language it actually uses is called 'Z'.
Can we get an E-Commerce Shopping Cart platform written on this now? I'd like mine to be faster than 10 sec loading times most offer currently. /slightsarcasm
I enjoyed reading through the simple code. These older languages can often seem somewhat esoteric to the uninitiated; seeing the COBOL language twisted to fit into an MVC-lite pattern helped me understand what was going on. It read like the applications I'm used to seeing.
"You don't have any Cobol, PL/1, OS360 Assembly Language or APL on here", he observed.
"I am going to focus on C, Unix and operating systems. I want to write compilers for microprocessor based computers".
When he finished laughing, he called me "unemployable" and said, "microprocessors will never be used for anything but toys. Take COBOL!"
Since then, I have made a point of not even looking at COBOL code. I have been in the presence of COBOL, but have averted my eyes.