View Full Version : BASIC Programming language is 50 years old today



jmpokc1957
05-01-2014, 10:13 AM
It is being reported that the BASIC programming language made its debut fifty years ago today at Dartmouth College. I first used BASIC sometime in the early '70s when a time-sharing terminal became available at Putnam City High School. I've posted about that previously. I remember that my father, who was an engineer at General Electric/Honeywell at the time, brought home a BASIC manual for me to use. As I remember, it was pretty plain by today's standards, but it got the job done.

I think my first program was a crude artillery game where a target was placed at a random distance and you had to enter an elevation angle to try to hit it:

010 INPUT "ENTER ELEVATION ANGLE", A

Something like that. It got me started. I've been a software engineer for over 30 years now, and, seeing how things have developed, sometimes I wish I'd been a farmer!
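For the curious, here is a fuller sketch of what such a game might have looked like. This is a reconstruction for illustration only - the line numbers, constants, and variable names below are invented, not from the original - but any Microsoft-style BASIC of the era should run something close to it:

10 REM CRUDE ARTILLERY GAME - ILLUSTRATIVE RECONSTRUCTION, NOT THE ORIGINAL
20 T = INT(RND(1) * 900) + 100 : REM RANDOM TARGET, 100 TO 999 YARDS OUT
30 INPUT "ENTER ELEVATION ANGLE"; A
40 R = 2000 * SIN(2 * A * 3.14159 / 180) : REM SHOT RANGE FOR A FIXED MUZZLE VELOCITY
50 IF ABS(R - T) < 25 THEN PRINT "A HIT!" : END
60 PRINT "MISSED BY"; INT(ABS(R - T)); "YARDS. TRY AGAIN."
70 GOTO 30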

Of Sound Mind
05-01-2014, 11:36 AM
01001000 01100001 01110000 01110000 01111001 00100000 01100010 01101001 01110010 01110100 01101000 01100100 01100001 01111001 00101100 00100000 01000010 01000001 01010011 01001001 01000011 00100001

Just the facts
05-01-2014, 12:09 PM
01001000 01100001 01110000 01110000 01111001 00100000 01100010 01101001 01110010 01110100 01101000 01100100 01100001 01111001 00101100 00100000 01000010 01000001 01010011 01001001 01000011 00100001

10 CLS
20 PRINT "That is not basic."
30 PRINT "That is binary, and it has been around since the beginning of time."
40 CLS
50 PRINT "Unless you wrote something clever about basic in binary,"
60 PRINT "in which case I now look stupid."
70 END

On a side note, if you have young ones you want to teach programming to, check out SMALLBASIC. It is free.

http://smallbasic.com/

Of Sound Mind
05-01-2014, 12:12 PM
10 cls
20 PRINT "That is not basic."
30 PRINT "That is binary, and it has been around since the beginning of time."
40 END
10 cls
20 PRINT "I didn't claim it was BASIC"
30 PRINT "BASIC ultimately communicates in binary."
40 PRINT "I was offering birthday wishes from one computer language to another."
50 PRINT "Sorry it didn't compute."
60 PRINT "P.S. 'Beginning of time'?"
70 END

Just the facts
05-01-2014, 12:21 PM
Counting has always existed. In fact, the very first thing created was the binary option of dark and light. The next counting method was base 7 - which we still use today.

SoonerDave
05-01-2014, 12:25 PM
01001000 01100001 01110000 01110000 01111001 00100000 01100010 01101001 01110010 01110100 01101000 01100100 01100001 01111001 00101100 00100000 01000010 01000001 01010011 01001001 01000011 00100001



40 print "i was offering birthday wishes from one computer language to another."


10 cls
20 print "omigosh, he wasn't kidding"
30 print "holy ascii, batman!"

Pete
05-01-2014, 12:44 PM
Took Basic programming at OU in 1980 and ran the programs on punch cards.

Crazy to think how much things have changed in such a short period of time.

Mel
05-01-2014, 01:22 PM
01001000 01100001 01110000 01110000 01111001 00100000 01100010 01101001 01110010 01110100 01101000 01100100 01100001 01111001 00101100 00100000 01000010 01000001 01010011 01001001 01000011 00100001

dang it, you beat me to it.

Mel
05-01-2014, 01:26 PM
Q-bit is a fun page to play around with this stuff.

Snowman
05-01-2014, 01:33 PM
10 cls
20 PRINT "I didn't claim it was BASIC"
30 PRINT "BASIC ultimately communicates in binary."
40 PRINT "I was offering birthday wishes from one computer language to another."
50 PRINT "Sorry it didn't compute."
60 PRINT "P.S. 'Beginning of time'?"
70 END

So for reference the beginning of time is 1679 now.

Urbanized
05-01-2014, 08:27 PM
Took Basic programming at OU in 1980 and ran the programs on punch cards.

Crazy to think how much things have changed in such a short period of time.

It's not as short of a period of time as you and I both wish it was.

SoonerDave
05-02-2014, 11:53 AM
So for reference the beginning of time is 1679 now.

Any programmer worth their non-cryptographic salt (you either get that reference or you don't) knows that time began 1 Jan 1970.

ctchandler
05-02-2014, 01:25 PM
OSM,
I've been patiently waiting for one of the group to post this oldie but goodie: "There are 10 kinds of people - those that understand binary and those that don't."
C. T.
p.s. I'm a little disappointed with you folks! Or maybe ashamed? Possibly even jealous because you are all probably a lot younger than me and that quote is old.
01001000 01100001 01110000 01110000 01111001 00100000 01100010 01101001 01110010 01110100 01101000 01100100 01100001 01111001 00101100 00100000 01000010 01000001 01010011 01001001 01000011 00100001

Just the facts
05-02-2014, 02:56 PM
Any programmer worth their non-cryptographic salt (you either get that reference or you don't) knows that time began 1 Jan 1970.

...and ended on 12/31/99

gjl
05-02-2014, 09:29 PM
...and ended on 12/31/99

I was working at Lucent Technologies on that date. I think we had every maintenance tech (I was one of them) and a crapload of engineers working that night in case all the computers in the plant imploded when Y2K hit. Talk about a non-event. I can remember loading software patches on the computers for months prior, preparing for it.

RadicalModerate
05-03-2014, 06:40 AM
I've mentioned this before, but back when I was a Junior in High School (1969), taking Algebra II, we had a terminal connected to the Denver and Rio Grande Railroad mainframe. It ran on punchtape. Actually, it ran on electricity, but data was input with punchtape. Part of our math education included writing flowcharts and programs--in BASIC--to solve Algebra problems that I didn't have the foggiest clue how to solve with pencil and paper. I've always been more of a "Social Studies/History/English" or "Arts and Parties" kind of guy. (To this very day, when I see the phrase "graphing functions" or "the wrapping function" I cringe inside. But I was fairly good at Geometry.) The whole experience was so bad that I shunned computers completely until about 1993. Happy Birthday, BASIC.

SoonerDave
05-03-2014, 06:51 AM
...and ended on 12/31/99

No, not quite - time doesn't end until sometime in 2038 :) Someone will surely get that :)

SoonerDave
05-03-2014, 06:55 AM
I was working at Lucent Technologies on that date. I think we had every maintenance tech (I was one of them) and a crapload of engineers working that night in case all the computers in the plant imploded when Y2K hit. Talk about a non-event. I can remember loading software patches on the computers for months prior, preparing for it.

That was one of the most infuriating technical non-events in my lifetime. I lost track of the number of consultants I came across who were making a *fortune* "selling" Y2K "protection" services for people's computers and doing next to nothing for them. I guess, had I not had a shred of decency, I could have made my own mint selling the same thing during that time. Just couldn't do it. Friends would ask me how "worried" or "scared" about Y2K I was, and I'd say "not in the slightest." They'd look at me like I was nuts, even though I was the tech geek among them. The media was insisting the world would implode at midnight that night, and I knew it wouldn't - but since the media obviously knew better, they couldn't be wrong...but they were.

Granted, there were a few holes here and there, but the cataclysmic failure portended by the hype mongers and consultants of the day never came close to materializing, and the hype itself was nothing short of nauseating.

RadicalModerate
05-03-2014, 07:00 AM
Granted, there were a few holes here and there, but the cataclysmic failure portended by the hype mongers and consultants of the day never came close to materializing, and the hype itself was nothing short of nauseating.

http://demotivators.despair.com/demotivational/consultingdemotivator.jpg

Snowman
05-03-2014, 07:53 AM
No, not quite - time doesn't end until sometime in 2038 :) Someone will surely get that :)

Upgrade to a 64-bit OS and you can usually go to 15:30:08 on Sunday, December 4th, 292,277,026,596.
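The arithmetic behind both jokes, sketched out (this assumes the usual signed 32- and 64-bit time_t limits, counted in seconds from 1 Jan 1970, with a year approximated as 365.25 days):

10 REM SECONDS A SIGNED 32-BIT time_t CAN COUNT: 2^31 - 1
20 PRINT (2^31 - 1) / 31557600 : REM ROUGHLY 68.05 YEARS PAST 1970 -> 19 JAN 2038
30 REM AND THE SIGNED 64-BIT VERSION: 2^63 - 1
40 PRINT (2^63 - 1) / 31557600 : REM ROUGHLY 2.92E+11 YEARS -> THE YEAR 292,277,026,596
50 END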

Urbanized
05-03-2014, 08:14 AM
http://4.bp.blogspot.com/-9qgrLusZCDM/UJhXB1HqaFI/AAAAAAAAFVo/Q8f33b2Efks/s1600/Revenge+of+the+Nerds+Booger+Poindexter.jpg

Just the facts
05-03-2014, 04:03 PM
That Y2K non-event was probably the result of all the money spent trying to prevent it. I worked for a small city at the time and I remember running tests for the water department to make sure the water bills printed with two zeros. On the first run of the new year they realized the paper was pre-printed with a 19__, so the bills printed as 1900 anyhow.

SoonerDave
05-03-2014, 06:15 PM
That Y2K non-event was probably the result of all the money spent trying to prevent it. I worked for a small city at the time and I remember running tests for the water department to make sure the water bills printed with two zeros. On the first run of the new year they realized the paper was pre-printed with a 19__, so the bills printed as 1900 anyhow.

I'd qualify that with a 60/40 disagree over agree. Was there a real issue? Of course. And, yes, the attention drawn to the problem got some of those issues fixed. But was the problem ever the "society will implode at midnight" doomsday stockpile food-and-ammo survivalist orgy some had predicted? Absolutely not.

I just know that by the time this "crisis" hit, and all the hypemongers had had their day, there'd be a lot of cheap generators available come Jan 2000....

RadicalModerate
05-03-2014, 07:53 PM
I just know that by the time this "crisis" hit, and all the hypemongers had had their day, there'd be a lot of cheap generators available come Jan 2000....

From a strictly personal, sociological/geometric standpoint, that seems like a fairly selfish, free-market attitude. =)
Hopefully, you would have given all those generators acquired through BASIC insider trading to the poor.
Then taken a tax write-off for a donation to charity (even if the poor couldn't afford gasoline to run the generators).
At the same time as The Environment couldn't afford bigger carbon footprints.

Just the facts
05-03-2014, 09:37 PM
I'd qualify that with a 60/40 disagree over agree. Was there a real issue? Of course. And, yes, the attention drawn to the problem got some of those issues fixed. But was the problem ever the "society will implode at midnight" doomsday stockpile food-and-ammo survivalist orgy some had predicted? Absolutely not.

I just know that by the time this "crisis" hit, and all the hypemongers had had their day, there'd be a lot of cheap generators available come Jan 2000....

I agree that the fear-mongering was way over the top. Some people thought the laws of physics would cease to exist and gravity would be shut off. I was new to IT at that time and I asked my boss if we needed to be on-site on Dec 31. He said if you see a mushroom cloud over Australia, go ahead and come in; we will have about 19 hours to fix whatever the problem is.

Brett
05-05-2014, 04:23 AM
I took BASIC programming in high school. I still remember that it was referred to as "GOTOless programming". Loved creating infinite loops with the GOTO command. The amusement lasted all of three seconds. My teacher was an absolute dolt and didn't realize that I lifted the teacher's edition of the textbook from his bookcase. It's pretty easy to program BASIC when you have all of the answers.

SoonerDave
05-05-2014, 05:55 AM
I took BASIC programming in high school. I still remember that it was referred to as "GOTOless programming". Loved creating infinite loops with the GOTO command. The amusement lasted all of three seconds. My teacher was an absolute dolt and didn't realize that I lifted the teacher's edition of the textbook from his bookcase. It's pretty easy to program BASIC when you have all of the answers.

I've heard BASIC referred to as a lot of things, even some of them not including profanity, but "GOTOless" wasn't among them... :)

ctchandler
05-05-2014, 09:58 AM
SoonerDave,
BASIC can be "gotoless" with the use of GOSUBs, and top-down/structured BASIC "gurus" probably didn't use the "GOTO" command.
C. T.
I've heard BASIC referred to as a lot of things, even some of them not including profanity, but "GOTOless" wasn't among them... :)

SoonerDave
05-05-2014, 10:05 AM
SoonerDave,
BASIC can be "gotoless" with the use of GOSUBs, and top-down/structured BASIC "gurus" probably didn't use the "GOTO" command.
C. T.

I have a feeling that the use of GOSUBs as a GOTO replacement was merely for the sake of salving their conscience :) LOL

I wrote a BUNCH of business, inventory, personal banking, and genealogy software (which I actually tried to market way back before the days of the Internet or anything like Ancestry.com) with the various incarnations of BASIC on the old Radio Shack series of TRS-80 computers and later PC compatibles. I was just a teenager having a good time, writing stuff that my aunt and uncle wanted/needed, never even thinking that if there were a way to get them to the mainstream, I could have sold them for a small fortune and retired really, really young. The financial stuff allowed for management of checkbooks and savings accounts, all across multiple institutions, account ledger printing, statement intervals, you name it. The home inventory stuff let you categorize items down to the serial number and original receipt cost and print all manner of reports, and the genealogy stuff was robust enough that my uncle filled at least a couple of boxfuls of old 5-1/4" floppy disks with our family's ancestry back to the 1700s on it. If only I'd had a fraction more business sense back then.....

ctchandler
05-05-2014, 10:43 AM
SoonerDave,
I never wrote any BASIC (other than playing around on my Atari 800), but isn't the GOSUB command the same as the Fortran "Do/do until", the COBOL PERFORM, and the Assembler combination of BAL/BCR?
C. T.
I have a feeling that the use of GOSUBs as a GOTO replacement was merely for the sake of salving their conscience :) LOL

SoonerDave
05-05-2014, 11:29 AM
SoonerDave,
I never wrote any BASIC (other than playing around on my Atari 800), but isn't the GOSUB command the same as the Fortran "Do/do until", the COBOL PERFORM, and the Assembler combination of BAL/BCR?
C. T.

GOSUB was nothing more than a GOTO that remembered where it was called from, so the flow of execution would return there upon the discovery of a RETURN statement. That gave it at least a fleeting similarity to a "subroutine," e.g.

10 CLS
20 GOTO 100
30 PRINT "I'LL NEVER GET HERE"
100 PRINT "BUT I'LL GET HERE"


versus


10 CLS
20 GOSUB 100
30 PRINT "THIS WILL PRINT SECOND"
40 END : REM WITHOUT THIS, EXECUTION FALLS INTO LINE 100 AGAIN AND HITS RETURN WITHOUT GOSUB
100 PRINT "BUT THIS WILL PRINT FIRST"
110 RETURN

ctchandler
05-05-2014, 01:39 PM
SoonerDave,
So, the answer is "Yes". That's exactly what the commands I mentioned do. The GOSUB allows for "gotoless" code, which has been pretty much the standard (although a lot of programmers refused to follow the "standard") since the 80's. I was first introduced to structured programming in 1980 at Scrivner and never used another go to of any kind after that. Well, sometimes when I was coding a throwaway/one-time-use program I used go tos, but never for production. Not to get too technical, but go tos are "killers" for re-entrant code/programs.
C. T.
GOSUB was nothing more than a GOTO that remembered where it was called from, so the flow of execution would return there upon the discovery of a RETURN statement. That gave it at least a fleeting similarity to a "subroutine," e.g.

10 CLS
20 GOTO 100
30 PRINT "I'LL NEVER GET HERE"
100 PRINT "BUT I'LL GET HERE"


versus


10 CLS
20 GOSUB 100
30 PRINT "THIS WILL PRINT SECOND"
40 END : REM WITHOUT THIS, EXECUTION FALLS INTO LINE 100 AGAIN AND HITS RETURN WITHOUT GOSUB
100 PRINT "BUT THIS WILL PRINT FIRST"
110 RETURN

SoonerDave
05-05-2014, 02:28 PM
SoonerDave,
So, the answer is "Yes". That's exactly what the commands I mentioned do. The GOSUB allows for "gotoless" code, which has been pretty much the standard (although a lot of programmers refused to follow the "standard") since the 80's. I was first introduced to structured programming in 1980 at Scrivner and never used another go to of any kind after that. Well, sometimes when I was coding a throwaway/one-time-use program I used go tos, but never for production. Not to get too technical, but go tos are "killers" for re-entrant code/programs.
C. T.

Well, while I guess it's just a matter of semantics, to me there was never really any difference between GOSUB and a GOTO other than the return point. They were still both GOTOs, and thus both evil; no change in variable scoping, no passing of parameters, etc. But given that BASIC didn't really have "named" functions and subs, you didn't have much choice - although there were varying implementations of the function definition syntax - DEF FN()...
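For readers who never met it, a minimal sketch of that DEF FN syntax as it looked in most Microsoft-style dialects (the function name and values here are invented for illustration). It also shows the scoping point above: only the parameter is local, and everything else is a plain global.

10 Y = 4 : REM NOT A PARAMETER, SO THE FUNCTION SEES THE GLOBAL Y
20 DEF FNHYP(X) = SQR(X * X + Y * Y) : REM ONE-LINE "NAMED" FUNCTION
30 PRINT FNHYP(3) : REM PRINTS 5
40 END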

Actually, in between the "classic" era of BASIC and the later Visual Basic 4/5/6 and VB.NET era there was a really nice "intermediate" flavor that MS pushed out called "QuickBASIC." It was a full, compiled language, and it took a lot of the ugliness of line numbers and such out of the language and put in a fairly nice named procedure/function syntax and scoping rules. It was intended primarily, I think, as a competitor to Borland's very popular Turbo Pascal back in the day. I remember to this day writing Turbo Pascal code for my Pascal classes at OU (there were a couple where Pascal was used primarily and one incidentally) and the profs never knew the diff LOL :)

As OO-based (Object-Oriented) languages started becoming more legitimate, that whole era of languages started to ebb, but if you wanted to get some stuff done, QuickBASIC wasn't a bad choice at all. Pretty robust little language/compiler for its era, and MS did a pretty nice job of supporting it. Think they released a QuickC about that same time frame, but don't think it was quite as successful. Don't recall for sure :)

ctchandler
05-05-2014, 03:02 PM
SoonerDave,
Enjoying the conversation, and I'm not sure it's really "on topic", but it's not semantics. The goto leaves a point with no link to where it came from, therefore no automatic way to return and continue on. The gosub completes the code in the paragraph and upon that completion automatically returns to the instruction following the gosub and continues the structured/top-down coding. That's a substantial difference between the two operations. If you have done Assembler coding on IBM mainframes, the BAL, BALR, and BCR instructions are used for "automated returns", and the B and BC illustrate the goto without automated returns. I probably should have "PMd" you but hopefully the viewers of this thread will have comments.
C. T.
Well, while I guess it's just a matter of semantics, to me there was never really any difference between GOSUB and a GOTO other than the return point. They were still both GOTOs, and thus both evil; no change in variable scoping, no passing of parameters, etc. But given that BASIC didn't really have "named" functions and subs, you didn't have much choice - although there were varying implementations of the function definition syntax - DEF FN()...

ctchandler
05-05-2014, 03:05 PM
I had an old Navy buddy that used to say the "gosub" was a go someplace and return from whence it emanated (originated).

SoonerDave
05-05-2014, 04:00 PM
SoonerDave,
Enjoying the conversation, and I'm not sure it's really "on topic", but it's not semantics. The goto leaves a point with no link to where it came from, therefore no automatic way to return and continue on. The gosub completes the code in the paragraph and upon that completion automatically returns to the instruction following the gosub and continues the structured/top-down coding. That's a substantial difference between the two operations. If you have done Assembler coding on IBM mainframes, the BAL, BALR, and BCR instructions are used for "automated returns", and the B and BC illustrate the goto without automated returns. I probably should have "PMd" you but hopefully the viewers of this thread will have comments.
C. T.

Oh, yeah, to be sure, I'm entirely familiar with the details of GOTOs and GOSUBs, CT. I spent most of my teenage years in front of a keyboard pounding out thousands of lines of each.

Either way, in readability terms, GOTOs and GOSUBs are still spaghetti-code artifacts, and I'm thankful we've developed object-oriented techniques, languages, and practices that have overcome both. Obviously none of it is perfect, and I've been doing it long enough to realize there aren't any panaceas. I'm very much a C# guy these days, and even enjoy delving back into plain, old, classic K&R C once in a great while :)

I've got two great old, classic books that were published by the long-defunct "Creative Computing" magazine. These were "101 BASIC COMPUTER GAMES" and its sequel, "MORE BASIC COMPUTER GAMES" - just page after page of source listings of great old BASIC programs that someone had translated from some old mainframes into the more common dialects that had come to be in vogue at that time. The greatest game of this collection was one called "SUPER STAR TREK," where the Enterprise went around the galaxy chasing Klingons and firing phasers and photon torpedoes. It was intended as a teletypewriter-based game, and I ended up converting it to run as a "pseudo-video" game by making all the output look "kinda" animated on my old character-cell TRS-80 and combining all the commands, scans, and game messages into a standard format. It was great fun, but it also taught me TREMENDOUS things about the differences in dialects between the variety of BASIC used in the original source listings and what my ol' TRS-80 could support - and the hardest one to translate was the ol' DEF FN syntax that the TRS-80 BASIC didn't support.
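One common workaround for exactly that missing DEF FN - sketched here with invented line numbers and variables, not the actual SUPER STAR TREK listing - was to unroll the function into a GOSUB that passes its inputs and result through shared variables:

100 REM STANDS IN FOR SOMETHING LIKE: DEF FND(X) = SQR(X * X + Y * Y)
110 X = 3 : Y = 4 : REM "ARGUMENTS" GO IN THROUGH PLAIN VARIABLES
120 GOSUB 500 : REM "CALL" THE FUNCTION; THE RESULT COMES BACK IN D
130 PRINT "DISTANCE ="; D
140 END
500 D = SQR(X * X + Y * Y) : RETURN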

The killer for me was having typed in the original source listing - conserving every byte I could because it occupied most of my voluminous 16K of RAM :) - I started it up the first time and it hung just as it started my mission. I spent who knows how much time poring over the source listing trying to figure out where my typos were - and I never could find *the* typo that kept it from running. Welllll, goofball that I am, I decided to *start over* and type the whole thing in from scratch - only to have it hang at precisely the same point again. Frustrated to the point of giving up, I set it aside for a few days, then looked one last time at a source listing, and discovered I had confused an "O" for a "0" in a variable name declaration. Fixed that, and the whole thing finally worked :) Looked at the *original* listing I typed in, and realized I'd made the SAME mistake there, too!!! AARRGGHH

That was a great golden age in computing that's past, unfortunately. You really had to learn how things worked back then!

ctchandler
05-05-2014, 04:29 PM
SoonerDave,
You're right; when I first started programming (1965) on an IBM 1401, we actually looked at the "I-time", or instruction execution time, before selecting the assembler op code to use. The ORs, ANDs, and Exclusive ORs were much faster, and if they could be used, that's what we chose, because computers weren't as fast as they are now. That was in the day when computers were expensive and people (programmers) were cheap. You had to code efficiently to get the job done. I was around long enough to experience most of what you have done, and of course client/server environments, as well as the progression from cards and mag tape through flat files on disk to relational DBs. We actually were using client/server systems before the name "client/server" existed. A lot of fun, but trust me, retirement can be a lot more fun.
C. T.

That was a great golden age in computing that's past, unfortunately. You really had to learn how things worked back then!

jmpokc1957
05-05-2014, 10:55 PM
I've mentioned this before, but back when I was a Junior in High School (1969), taking Algebra II, we had a terminal connected to the Denver and Rio Grande Railroad mainframe. It ran on punchtape. Actually, it ran on electricity, but data was input with punchtape. Part of our math education included writing flowcharts and programs--in BASIC--to solve Algebra problems that I didn't have the foggiest clue how to solve with pencil and paper. I've always been more of a "Social Studies/History/English" or "Arts and Parties" kind of guy. (To this very day, when I see the phrase "graphing functions" or "the wrapping function" I cringe inside. But I was fairly good at Geometry.) The whole experience was so bad that I shunned computers completely until about 1993. Happy Birthday, BASIC.

Writing flowcharts! I'm sure many of you will recognize the picture of a genuine, vintage, ca.-1970s IBM Flowcharting Template! If you don't recognize it, don't worry - you haven't missed a thing!

[attachment 7735: photo of the template]

ljbab728
05-05-2014, 11:24 PM
Writing flowcharts! I'm sure many of you will recognize the picture of a genuine, vintage, ca.-1970s IBM Flowcharting Template! If you don't recognize it, don't worry - you haven't missed a thing!

[attachment 7735: photo of the template]
I certainly recognize that. I have one in my briefcase right now.

SoonerDave
05-06-2014, 07:08 AM
I certainly recognize that. I have one in my briefcase right now.

One thing I think we are seriously lacking in today's environment is a really good system for documenting the design and operation of software systems. It's virtually all tribal knowledge. And when you talk about putting designs to paper in some form (written, graphical, whatever), people roll their eyes in disgust, but they get even more disgusted when the One Critical Guy leaves and no one left behind knows how things operate...really a serious issue to me, I think, going forward. You'd be amazed at the number of fairly important software houses that "seem" like big companies but are really just a few guys doing all the tech stuff. Lots of fragility in there, in my book. And that's where the loss of "system smarts" really hurts.

jmpokc1957
05-06-2014, 10:10 AM
One thing I think we are seriously lacking in today's environment is a really good system for documenting the design and operation of software systems. It's virtually all tribal knowledge. And when you talk about putting designs to paper in some form (written, graphical, whatever), people roll their eyes in disgust, but they get even more disgusted when the One Critical Guy leaves and no one left behind knows how things operate...really a serious issue to me, I think, going forward. You'd be amazed at the number of fairly important software houses that "seem" like big companies but are really just a few guys doing all the tech stuff. Lots of fragility in there, in my book. And that's where the loss of "system smarts" really hurts.

You are exactly right! You must have worked at the places that I have over the years.

What people (i.e., management, typically) don't realize is the complexity of the stuff. Most of the programs in the industry I work in (industrial automation) tend to be in the small to medium size, say 50,000 to 100,000 lines of code. Most are written by one to three people. There is no way anyone can really keep up with that and keep the bugs out. Pity the poor guy (and I've been him) who has to come into a company and maintain and modify a 100,000-line program that he has never seen before! And that is just the code we have access to! What about the run-time libraries, third-party code, etc.?

One of the arguments against President Reagan's "Strategic Defense Initiative", better known as "Star Wars", was the complexity of the software needed to control such a system. It would be impossible to know, much less control, all the possible states which could occur.

We are surrounded by so much complex technology that we don't think about the complexity of it. It just seems to work (most of the time). And, as you pointed out, it seems to be done by just a handful of people.

For what it's worth, my rant of the day!

SoonerDave
05-06-2014, 11:24 AM
You are exactly right! You must have worked at the places that I have over the years.

What people (i.e., management, typically) don't realize is the complexity of the stuff. Most of the programs in the industry I work in (industrial automation) tend to be in the small to medium size, say 50,000 to 100,000 lines of code. Most are written by one to three people. There is no way anyone can really keep up with that and keep the bugs out. Pity the poor guy (and I've been him) who has to come into a company and maintain and modify a 100,000-line program that he has never seen before! And that is just the code we have access to! What about the run-time libraries, third-party code, etc.?

One of the arguments against President Reagan's "Strategic Defense Initiative", better known as "Star Wars", was the complexity of the software needed to control such a system. It would be impossible to know, much less control, all the possible states which could occur.

We are surrounded by so much complex technology that we don't think about the complexity of it. It just seems to work (most of the time). And, as you pointed out, it seems to be done by just a handful of people.

For what it's worth, my rant of the day!

I think at some point most software systems of any appreciable complexity reach that condition of "state unknowability," particularly when you start delving into systems with terabytes of virtual address space and multiple possible levels of thread affinity or parallelism, etc. The best you can do is try to manage the "knowability" of certain defined (well, expected) states and keep those states stable through regression testing and such. Diving into the big middle of a long-standing software project and even beginning to make appreciable changes is like balancing crystal goblets in a minefield during an earthquake.

ctchandler
05-06-2014, 01:51 PM
Jmpokc,
I know things have changed, but the way we handled a large program in a 24/7 real-time environment was one main program that called several hundred sub-programs. We could (and normally did) have 3 to 5, even more, programmers working on a large project, and each programmer worked on a sub-program. When they tested, they all linked their modified subs together to build the main. We were doing that in the 70's and it was still functioning quite well on a different platform when I retired in 2004. That's how an SDI project would have been handled: not one program with 100,000 lines of code, but one main with hundreds of subs. And of course, a project that large, like our reservation system, would have many mains that other mains would "pass off to". We never had several hundred programmers working on it at the same time, but it would have been very easy to do. We just didn't have that many programmers, much less that many enhancements/changes/modifications/bug fixes going on at the same time. I think we should have a new thread; this stuff still interests me as it seems to interest you and SoonerDave, but it is off topic and I'm afraid I'm the main guilty party.
C. T.
You are exactly right! You must have worked at the places that I have over the years.

What people (i.e., management, typically) don't realize is the complexity of the stuff. Most of the programs in the industry I work in (industrial automation) tend to be in the small to medium size, say 50,000 to 100,000 lines of code. Most are written by one to three people. There is no way anyone can really keep up with that and keep the bugs out. Pity the poor guy (and I've been him) who has to come into a company and maintain and modify a 100,000-line program that he has never seen before! And that is just the code we have access to! What about the run-time libraries, third-party code, etc.?

One of the arguments against President Reagan's "Strategic Defense Initiative", better known as "Star Wars", was the complexity of the software needed to control such a system. It would be impossible to know, much less control, all the possible states which could occur.

We are surrounded by so much complex technology that we don't think about the complexity of it. It just seems to work (most of the time). And, as you pointed out, it seems to be done by just a handful of people.

For what it's worth, my rant of the day!

SoonerDave
05-06-2014, 02:38 PM
Jmpokc,
I know things have changed, but the way we handled a large program in a 24/7 real-time environment was one main program that called several hundred sub-programs. We could (and normally did) have 3 to 5, even more, programmers working on a large project, and each programmer worked on a sub-program. When they tested, they all linked their modified subs together to build the main. We were doing that in the 70's and it was still functioning quite well on a different platform when I retired in 2004. That's how an SDI project would have been handled: not one program with 100,000 lines of code, but one main with hundreds of subs. And of course, a project that large, like our reservation system, would have many mains that other mains would "pass off to". We never had several hundred programmers working on it at the same time, but it would have been very easy to do. We just didn't have that many programmers, much less that many enhancements/changes/modifications/bug fixes going on at the same time. I think we should have a new thread; this stuff still interests me as it seems to interest you and SoonerDave, but it is off topic and I'm afraid I'm the main guilty party.
C. T.

I don't think it's really that off-topic, because we're really spinning a pretty natural discussion from the topic, but if TPTB want us to move, no problem :)

By the time SDI was announced, I'm all but certain that Ada had become the de facto MIL-STD programming language - and back then, Ada-certified compilers - ones that had actually passed through the DOD validation/verification suite - cost tens of thousands of dollars. But that was when OO was seen, even in its infancy, as the solution to the problems of top-down, structured programming wherein so much code was rewritten and so many wheels reinvented. I actually got to take a two-week course in Ada from one of the language designers - Grady Booch - and it was a great experience from a tech-nerd perspective. Although Ada was, in today's terms, kinda klunky, the design concepts behind it - abstracted components, reusable interface contracts, polymorphism, designing for re-use - were tremendous. You could see how it all just might work.

I never have been, and probably never will be, a big advocate of the "lines of code" metric. Any student of C (or its descendants) could easily write one "line" of code that did a TON of stuff, and a thousand lines of code that did nothing. Object design at least gives you a framework of reusability and methodology up front.

SDI would have been very difficult initially, I think, because it was an idea well ahead of the hardware available at the time. I shudder to think of the depths of the kinds of calculations such a system would have to make - like the old Patriot missiles, only drastically more complex - all combined with the rapidity of inputs from untold numbers of sensors or satellites (what have you). We didn't really ramp up beyond 20 MHz CPUs until we got out of the late 80's-mid 90's, and the computational horsepower needed for SDI would have been staggering.

jmpokc1957
05-06-2014, 03:15 PM
In response to ctchandler: I think you once mentioned you worked at Hertz, and what you describe may be a perfectly valid model for that environment. In the West Coast tech environment I've known for the last 30 years, the model usually goes like this: one person tends to be the guy who develops the architecture and the "vision", if you will. Oftentimes this falls to someone who has never done anything even remotely like it. The saying used to be, "If it's on your desk for 15 minutes then you're the expert!" You just have to run with it. Usually there will be a few others to help integrate various subsystems like vision systems, motion control, embedded microprocessors, digital I/O, and, the 80-percenter of the whole project, the user interface. Sometimes you have to do it all yourself. That model has been pretty consistent in the industry I've worked in. Each industry seems to be different.

In response to SoonerDave: I agree that lines of code is not the best metric, but it seems to fit the situation, and just because you use objects doesn't mean the objects themselves aren't comprised of hundreds of thousands of lines. I mainly use C++ and can have hundreds of objects, but add up the lines and voila!

The increase in processor speed and memory has really been something. As I say, a few gigahertz can make up for a lot of inefficiency, not that I don't strive to be wickedly efficient. Speaking of efficiency, I'm in awe of the guys who wrote the OS for the DEC PDP-11 series computers. My all-time favorites. What those guys could put into 8K of memory! I've seen their code, Wow!

ctchandler
05-06-2014, 05:42 PM
Jmpokc1957,
I recently threw away a template from 1962. It was yellow, not green and included things like a (punch) card, a card file, and of course the diamond for decisions and other items. It was used by the predecessors of systems analysts, called "project planners". It was also for the old electronic accounting machines (EAM). The templates were hard to come by, you had to be fairly senior and experienced to get one.
C. T.
Writing flowcharts! I'm sure many of you will recognize the picture of a genuine, vintage, ca.-1970s IBM Flowcharting Template! If you don't recognize it, don't worry - you haven't missed a thing!

[attachment 7735: photo of the template]

SoonerDave
05-07-2014, 12:22 PM
In response to SoonerDave: I agree that lines of code is not the best metric, but it seems to fit the situation, and just because you use objects doesn't mean the objects themselves aren't comprised of hundreds of thousands of lines. I mainly use C++ and can have hundreds of objects, but add up the lines and voila!

The increase in processor speed and memory has really been something. As I say, a few gigahertz can make up for a lot of inefficiency, not that I don't strive to be wickedly efficient. Speaking of efficiency, I'm in awe of the guys who wrote the OS for the DEC PDP-11 series computers. My all-time favorites. What those guys could put into 8K of memory! I've seen their code, Wow!

Oh, absolutely - my only point was that I think objects give you a better sense of granularity when it comes to creating something of utility. If I've designed a system and I know the problem domain suits, say (random number), 50 objects, then saying I've built, and have started testing, 10 of those objects conveys a better general picture of the state of development than saying, "well, I've churned out 10,000 lines of code." Whether that work is spread out over 10 lines or 10,000, if the abstraction is one-up to objects (well, to be accurate, classes), then the lines of code aren't nearly as relevant. That's all I was getting at.

And, yes, the guys who could pack stuff into those small memory footprints had to know the systems/hardware they were using. I think there's much less of that expertise, or even teaching of it - economy of resources - even if memory is all but infinite these days :) That's one of the downsides of object-based systems - lots of overhead. All about tradeoffs, I s'pose :) Heck, one of the reasons C became so popular was that it relied on the developer to be very cautious about how memory was used, how pointers were used, even to the point of borrowing an old Unix sysadmin axiom that to this day makes me chuckle: "UNIX is not your mother."

SoonerDave
05-07-2014, 12:27 PM
Awesome fun geek discussion, BTW :)

ctchandler
05-08-2014, 09:31 AM
SoonerDave,
Do you know who the "Father" of Ada is? Hint: he's a she!
C. T.

ctchandler
05-08-2014, 09:41 AM
Since this is the "Nostalgia and Memories" forum, does anyone remember the programming language "JOVIAL"? It was developed fifty-five years ago by a man named Jules Schwartz, and JOVIAL is an acronym for "Jules' Own Version of the International Algorithmic Language." It was used primarily by the government. It's the first "high level" language that I learned. I actually went to a two-week JOVIAL class, but never actually coded a program using it.
C. T.

RadicalModerate
05-08-2014, 10:34 AM
If I were drawing flowcharts, today--which I most certainly am NOT--this is the stencil I would use.
It's part of the Visio software I have on my Home PC and laptop. I use Visio for other things. =)
http://pic.dhe.ibm.com/infocenter/p8docs/v5r1m0/topic/com.ibm.p8.pe.user.doc/visio_basic_flowchart_shapes_stencil.jpg

SoonerDave
05-08-2014, 06:09 PM
SoonerDave,
Do you know who the "Father" of Ada is? Hint: he's a she!
C. T.

Ada Lovelace, a 19th-century mathematician and daughter of Lord Byron, who was also fascinated with the whole notion of computing machines and how they might cast their spells on things other than numbers. She was fascinated by the "Babbage Machine" and is considered to be among the first "computer programmers," even though that concept wouldn't come to conventional fruition for another century or so :)

:)

ctchandler
05-08-2014, 06:32 PM
SoonerDave,
Excellent answer. Actually, I suppose the man that designed and wrote the compiler was really the "Father", but it was named for Ada Lovelace. A lot of folks would have guessed Grace Hopper because she threw her "powerful" weight to ADA as the government standard.
C. T.
Ada Lovelace, a 19th-century mathematician and daughter of Lord Byron, who was also fascinated with the whole notion of computing machines and how they might cast their spells on things other than numbers. She was fascinated by the "Babbage Machine" and is considered to be among the first "computer programmers," even though that concept wouldn't come to conventional fruition for another century or so :)

:)