Way back in the dark ages of computing, long before IBM's S/360 was even a gleam in some System Engineer's eye, there was a business computer called the 1401. It was a rudimentary sort of machine -- even the addressing scheme was in decimal. The original 1401 could address 4,000 bytes of memory. Later versions (notably the 1410) boasted a 16,000 byte address space. IBM introduced this line of solid-state computers to the world in 1959. Besides the central processing unit, a 14xx system might contain a card reader / card punch, a line printer (the venerable "1403"), one or more tape drives, and (eventually) a disk drive that could hold as much as 50 MB of data (50 platters, one megabyte per disk). The 1401 did not have an Operating System, per se. Each application program had to incorporate its own I/O support. Card, printer, and tape I/O is reasonably simple. Disk I/O is a lot trickier, so most 1401 users did not employ DASDs.
The primary programming language used with the 1401 was a macro assembler called 1401 Autocoder. In 1961 IBM began to develop a set of programs written in Autocoder that eventually became known as '62 CFO (for Consolidated Functions Ordinary). They spared no expense, even hiring a team of professional writers to create the system documentation (in 12 loose-leaf volumes). Subsequent systems developed for the life insurance industry have all drawn more or less directly on this original system. IBM did develop the Advanced Life Information System (aka ALIS) in COBOL for the System 360, but in spite of H. Ross Perot's best efforts at EDS, ALIS (or EDS's version, "Pacesetter") never became as popular as '62 CFO. Ten years after the advent of S/360, many life insurance companies were still running the old Autocoder programs under emulation on a 360 or a 370. Rumor has it that some insurance companies are still running '62 CFO (under two layers of emulation) on small, closed blocks of business in 2017 -- 55 years after the system was created!
I was first exposed to '62 CFO at Pacific Fidelity Life in L.A. But that exposure was only to the policy master file; I didn't actually have to learn anything about Autocoder while I was at PFLIC. That all changed in a hurry when I started working for United American in Denver. UAL had a facilities management contract with Cybertek, so the daily cycle was being run (under CFO+) at the data center in Los Angeles. The jobs were submitted via Remote Job Entry (RJE) by a small Cybertek crew working in UAL's building. Cybertek had not yet developed all the reporting jobs associated with end-of-quarter and end-of-year processing. Temporarily, they were using a program to convert the CFO II master file into '62 CFO format, then running the old Autocoder report programs under emulation. The main problem with this procedure was that the operators in Denver were entirely unfamiliar with the old system. The programming department had been disbanded when Cybertek got the facilities management contract. There was a library with all the old program listings carefully preserved in hanging folders, but nobody left who knew how to read the programs, or how to debug them when something went wrong.
My new job at United American came with some outsized responsibilities. I only had one employee reporting to me: Gary Clemons, a nice young man who was still trying to pass part I of the actuarial exams. But I was the company's chief actuary, and I was reporting directly to Bill Ellis, the president of UAL. I was also in charge of paying the monthly bills from Milliman & Robertson, an actuarial consulting firm with offices just half a block away. The first time I saw their monthly bill my jaw nearly hit the floor. "For services rendered, $15,000," it said. No description of what kind of services, or who had performed them. Just one grand total. (To put this in perspective, you should know that I was being paid $25,000 per annum. My assistant Gary was making $12,000, and my total budget for additional expenses (supplies, telephone, secretarial help, etc.) was just about $10,000. So the bill from M&R was a very big deal.) I called Gary Dahman, the head actuary at M&R, and made an appointment to see him.
The people at M&R were all as nice as pie. Yes, they'd be glad to itemize the services they were billing me for. The main project they had going was to develop GAAP factors for UAL's book of business. They would be glad to have me involved in the process. They would supply me with a list of the assumptions they were using. In the future they would check with me before starting to calculate additional valuation factors. (Eventually, I took over the job of developing the factors for UAL's GAAP reporting; I still had the PL/I program I had written at PFLIC. And within six months I had reduced M&R's bill to roughly $2,000 per month. I was a very busy boy during my first year at United American Life.)
When I arrived at UAL in 1975 the company was going through a triennial examination. Tony Fagiano, the chief actuary for the Colorado Department of Insurance, and his assistant, Don Yee, were in my office with new questions on a daily basis. I finally learned enough about the company to start answering most of the questions, but it was an uphill struggle. It's a good thing I got to know Tony that autumn; he went to bat for me with J. Richard Barnes, the commissioner of insurance, and I received permission to sign UAL's annual statement for 1974. (M&R wanted $50,000 to sign the statement. I was certainly saving UAL a bundle!)
The year-end schedule for Cybertek's valuation reports said all of them would be delivered by the 8th of January. The 8th came and went. I asked for my reports. "We've had some problems at the data center," I was told. "You'll get your reports in a couple more days." Two more days came and went. Still no reports. I was growing concerned. The deadline for filing the annual statement was March 1. It had to be at the print shop by the 24th of February at the very latest. Every day that passed was increasing the time pressure on Gary and me. I decided to take the bull by the horns. I called George Napoles in LA and asked him point blank, "What the f**k are you people doing with my year-end reports?" "I don't know," he said, "but I'll soon find out." An hour later he called me back. "Believe it or not, the problem is that the RJE operators at your end don't know how to set up the JCL to run an Autocoder program, and the people at this end say it's not their job." "OK, fine," I said. "I'll set up the JCL and we'll start submitting the jobs through RJE. Just make sure I get the run time I need, because I'm almost a week behind already." "Roger, Wilco," he said.
Setting up the JCL was the easy part. Most of the year-end jobs just read the policy master file (or an extracted version of it) and one rate tape. There were a few SORT routines interspersed throughout the jobs. So the JCL generally consisted of a JOB card, one or two TLBL cards, a UPSI card (more about that in a minute), an EXEC card to invoke the emulator program, the 1401 object deck, and then a "/*" and "/&", denoting end of file and end of job, respectively. Oh yeah -- the data center in LA was running IBM DOS/VS on an S/370-145, so the job deck had to be wrapped inside a POWER jobset ($$JOB in front; $$EOJ at the end). POWER was the DOS equivalent of ASP / HASP (the spooling subsystems that ran under OS/MVT and/or OS/MVS at that time). The UPSI card was used to simulate the set of seven sense switches on the 1401's front panel. Most of the time you just had to be sure sense switch A was set (i.e., "// UPSI X'40' "), but a few programs required that additional bits be set on.
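To give the flavor, a typical deck looked something like the sketch below. The job name (VALRPT), the tape label names (MASTIN, RATEIN), and the emulator program name (EM1401) are just placeholders for this illustration, not the real names, and the exact operands varied from job to job, but the overall shape was always the same:

   * $$ JOB JNM=VALRPT,CLASS=A
   // JOB VALRPT
   * YEAR-END VALUATION REPORT, RUN UNDER THE 1401 EMULATOR
   * (JOB, FILE, AND PROGRAM NAMES HERE ARE PLACEHOLDERS)
   // TLBL MASTIN,'POLICY MASTER'
   // TLBL RATEIN,'RATE TAPE'
   * TURN ON SENSE SWITCH A (UPSI NOTATION AS DESCRIBED ABOVE)
   // UPSI X'40'
   // EXEC EM1401
      ... 1401 object deck (card images) goes here ...
   /*
   /&
   * $$ EOJ

The two POWER statements simply told the spooler where the job began and ended; everything between the JOB card and the "/&" was ordinary DOS JCL, and the emulator picked up the 1401 object deck from the input stream, with the "/*" marking the end of that data.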
As it turned out, the biggest problems were not with the JCL. The biggest problems were with the various "Halt" codes, which were totally unfamiliar to the people at Cybertek. The primary means of indicating an error condition in Autocoder was to issue a "Halt" instruction, with an address (000-999) that indicated what the error was. Code "999" was a successful end of job. Anything else demanded some sort of operator intervention. It took about two weeks, but eventually I was able to track down all the Halt codes and tell the guys in LA what they had to do in each situation. It would have been nice if there had been one big book documenting all the Halt codes, but if such a book had ever existed, I never found it. I had to research each program individually, reading the programmer's comments to ascertain what each Halt code meant.
