I coded in traditional languages (mostly Fortran) until National Instruments' object-oriented LabVIEW language came out. LabVIEW is simple to use, yet powerful, and it benefits from a large user base.
I sold Tektronix test equipment and went to a number of seminars by National Instruments on LabVIEW. As with all higher-level languages, it was amazing what you could do, and how quickly; however, it was often frustrating what you could not do.
I also learned mainly Fortran. I was excited about programming until I did an internship and found out that the kernel of what I liked about programming, solving an engineering problem, took up maybe 5% of the effort. Another 45% of the effort was devoted to making pretty, user-friendly I/O, and the other 50% was testing and documentation.
You mean that you really do not like computer programming. You do not have to. Other people do. I am not one of them. But I do recognize and appreciate their skills.
Let me ask you how the world would look if we had no industrial designers. All computers would be gray. All cars would be gray. Instead of db's "pretty, user friendly I/O" we would be living in Stalingrad.
Moreover, bare-bones functionality might argue against any coverings at all. Just keep everything dust-free and lubricated all the time... How would that work out?
Would you really trade your current computer for an IBM 3270 terminal?
1) I like computer programming, but I'm not as good at it as some people. Moreover, I learn more efficiently graphically; I have a photographic memory.
2) My reference to the old Tektronix printers was a compliment to db. They were five years ahead of their time and are still ahead of inkjet printing. I wouldn't want to go back to those printers because the color laser printers are better now.
3) To me, the beauty of a computer (or peripheral) is in how efficient and robust it is. I have plenty of old pieces of equipment, bought on eBay, that do the job.
You seem to have missed my point. When I asked, "Would you really trade your current computer for an IBM 3270 terminal?" I was referring to the green raster on black VDT or CRT look. You can still find that presentation on modern laptops on desks in organizations with IBM mainframes.
I also have old tools. They look nice; they work well. The IBM 3270 did not look nice because the designers were not the users.
I came to technical writing through programming; but, like Tron, I fight for the users. So, it is easy for me to point out to the programmers where their designs are counter-intuitive. Back in 2000, I stopped a group of SQL developers from putting "File Update" on the F1 key. It was fine for them; they needed it there. Once the system was live, the clericals were never going to need it. The clericals expected Help under F1.
Yes, Mike, I did use an IBM 3270 at one point. At that time, the monitor was not something I had a problem with. The problem back then was the blankety-blank stack of cards necessary to execute the program. While I'm younger than you are, I did have one year of mainframe computing on punch cards. I remember how much that sucked.
My point, MikeM, was that your comment was completely irrelevant to what I had written prior to your entry into the conversation, as was your latest response. I tried to say that without being offensive the first time.
MM makes a valid point but misses the real argument, which is this: programmers and techs generally spend one third of their time (my figure) on cosmetics, because the work is inspected and approved by managers who do not understand it but judge it on prettiness.
"All computers would be gray. All cars would be gray." Gray computers- would not bother me. Gray cars- my car has been resprayed to suit my 'taste' in a custom color, my money my choice. Recall, Henry Ford said all cars should be black to keep costs down, he later found out that despite the plant manager's opinion, the customer is king,
You were in "genius mode" but you lacked the "capability maturity model." You could write a quick-and-dirty program to do a thing you knew; and you could run the program. But no one else could. You did not document your work. Your interface was idiosyncratic, and you failed to actually test your work, i.e., to attempt to falsify it, against an objective standard. In my 30 years as a technical writer, I have seen far too much of all that.
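The "attempt to falsify" point can be made concrete in a few lines. A quick-and-dirty program "works" when run once on a case the author knew; testing means trying to break it against an objective standard. A hypothetical Python sketch (the function and its fixed points are my example, not from this discussion):

```python
# A quick-and-dirty function, of the kind a "genius mode" programmer might write.
def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5.0 / 9.0

# Attempting to falsify it: feed it known fixed points and edge cases,
# and check the answers against an objective standard.
def test_fahrenheit_to_celsius():
    assert fahrenheit_to_celsius(32) == 0.0        # freezing point of water
    assert fahrenheit_to_celsius(212) == 100.0     # boiling point of water
    assert fahrenheit_to_celsius(-40) == -40.0     # the two scales cross at -40
    assert abs(fahrenheit_to_celsius(98.6) - 37.0) < 1e-9  # body temperature

test_fahrenheit_to_celsius()
```

The point is not the arithmetic; it is that the tests are written to catch the program being wrong, not merely to demonstrate it running.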
BTW, I also started with Fortran, but since then, I learned Java and Ruby. You went into law. We all make career choices.
Documentation: much of it exists for no functional or good reason at all; it is to make sure that talented individuals can be gotten rid of easily. More projects fail from over-management, with endless formal processes, committees, documentation, and timetables, than from talented but informal designers. 'But no one else could' ... run that program. So? I do have to add that I have seen work made deliberately obscure to preserve the self-importance (sometimes it is incompetence) of the designer.
In what you state here, Mike, you are totally correct, and that is why both db and I agreed on the amount of time allocated to each of the common tasks in computer programming (code: 5-10%; making it easy for others to use: 45-50%; testing: 50%).
Teaching computer programming is so much easier with LabVIEW because most people I teach are familiar with either process flow diagrams or wiring diagrams.
The friendly GUI of LabVIEW gives me the illusion that I ought to be able to jump in and make changes with no training or reading of references. I have always found it harder, though, than other languages. That may not be true for someone experienced in LabVIEW.
The size of the wiring diagram can be intimidating for some folks, but a patient editor of someone else's program should have no problem finding where to edit.
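For readers who have not seen LabVIEW, its wiring diagrams are dataflow: a node fires when all of its inputs have arrived, and its outputs travel along wires to the next nodes. A rough text-language analogy in Python (the node names here are hypothetical, my own invention, not LabVIEW APIs):

```python
# Each function plays the role of a node on the diagram; the variable
# assignments play the role of the wires connecting node outputs to inputs.
def read_sensor():
    return [20.1, 20.4, 19.8, 20.0]   # stand-in for a data-acquisition read

def moving_average(samples):
    return sum(samples) / len(samples)

def threshold_alarm(value, limit=21.0):
    return value > limit

# "Wire" the nodes together: each output feeds the next node's input.
samples = read_sensor()
avg = moving_average(samples)
print(threshold_alarm(avg))   # False: the average, 20.075, is under the limit
```

In LabVIEW the same chain is drawn graphically, which is why people who already read process flow or wiring diagrams pick it up quickly.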
But is that really computer "programming?" I'd call it more like "computer building block assembly." But, yeah, if you understand process flow, it gets real easy.
Unless you are programming in 0s and 1s, _all_ computer programming is "building block assembly." That is why they invented Assembler. Do you program in that? (See my code in the Gulch here: http://www.galtsgulchonline.com/posts/2a...).
Fortran, Cobol, C, C++, Java, Ruby: they are all just libraries of function calls to lower and lower levels of instructions. Ultimately, at the lowest level, every computer program is just one very, very long binary number.
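That layering is easy to demonstrate. Python, for instance, will show you the lower-level instructions that a high-level function compiles into (a sketch; the exact opcodes vary by Python version):

```python
import dis

def add(a, b):
    return a + b

# The one-line body compiles down to lower-level bytecode instructions,
# which the interpreter in turn executes as machine code -- ultimately,
# at the bottom, one very long binary number.
instructions = [ins.opname for ins in dis.Bytecode(add)]
print(instructions)   # e.g. ends with 'RETURN_VALUE'; the rest varies by version
```

The same story repeats below the interpreter: bytecode to machine instructions, machine instructions to bits.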
Epistemologically, if not for Objects, we would be without Concepts and would have to discuss every program as a huge string of concretes.
Imagine if you could not say "Liberals are wrong." but would have to name each and every one of them and each and every one of their errors each and every time you wanted to discuss it.
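That is exactly what a class gives a programmer: one concept that subsumes an open-ended number of concretes. A minimal, hypothetical Python sketch:

```python
import math

# Without the concept, every circle is a separate pile of concretes:
# (0, 0, 1.0), (2, 5, 3.0), ... and every rule about circles would have
# to be restated for each one. With a class, we state the rule once.
class Circle:
    """The concept: any circle whatsoever, defined by its radius."""
    def __init__(self, radius):
        self.radius = radius

    def area(self):
        # Stated once, true of every instance -- every concrete circle.
        return math.pi * self.radius ** 2

# Concretes subsumed under the concept:
circles = [Circle(1.0), Circle(2.0), Circle(3.0)]
print([round(c.area(), 2) for c in circles])   # [3.14, 12.57, 28.27]
```

One definition stands for all circles, past and future, just as one concept stands for all its referents.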
As an old "hard core" programmer (machine language, assembly language, and channel programming [binary, octal, hexadecimal]), I admit those languages were easier to debug, but the creation of user-friendly GUIs and higher-level languages expanded the family of programmers, bringing in more talent. I adopted the Digital Research GUI before Windows, converted to Mac, and never looked back.
Wow, the Digital Research GUI. I used that to program a database of 20,000 homeowners in NJ to send "This house just listed or sold in your neighborhood" mailings, back before Macs came out and before real-estate farming had become computerized. Yes, you have me to blame for those annoying postcards from the realtors. Back then, in the early-to-mid '80s, direct mail to a typical realtor's farm of 200 homes resulted in an additional two listings for that agent per year. That was a fun summer job for a high schooler.
Robbie, I understand. As you can tell from the code that I posted, I have no problem with the command line. That said, I have been on a Mac three times since 1991 for professional work, in stints of a year or two. This is being written on a white MacBook. I also bought a Microsoft Surface. I love the touch screen, but I had to clean out Justin Bieber and the NFL and put the Command window on the start-up screen.
One of my USB drives glitched on Eject, so I wanted to clear it off, reformat it, and reload it. Hard to do nowadays... At least from the GUI. From the Command Line it was much easier.
So, I share that perspective with you. Even so, the Command Line is itself a convenient user interface.
The first time that I saw a computer was 1962 or so. I was 12 or 13. The woman next door was a mathematician, a refugee from Castro's Cuba. She ran the data processing department for a hospital. To program her computer, she pulled off the front panel and changed the wires on a plug board.
One time, I was doing some low-level work, rewriting the same 20 or 30 lines in debug, and I said to myself, "I wish I had an editor," and I heard the voice of Obi-Wan Kenobi: "Beware the Dark Side, Luke!"
OOP was created so that cheap programmers from the 3rd world could be hired instead of brilliant, creative coders/designers with Western English as a first language. (Admittedly, there was a limited supply of those brilliant coders/designers.) No creative tricks for higher performance (apparently unnecessary with cheap hardware), and lower maintenance costs from cents-per-hour drone programmers. Similar to allowing unfettered illegal immigration from dictatorships with no history of individual liberty to eliminate a majority of people who demand liberty and justice from a looter-dominated government. ;^)
I have to disagree, both with the claims of fact, and the presentation. English is a first language in India. See here: http://necessaryfacts.blogspot.com/2013/... Moreover, as educated people in India commonly speak one or more languages in addition to English, they are generally very good conceptual thinkers. I have worked with computer programmers from India for over 20 years. With 1.1 billion people there, you can find stupid Indians. You do not find them in IT.
(A friend of mine returned from Europe and complained about stupid Germans. "I never met a stupid German," I said. "That's because you never went to Germany," she replied. People are people everywhere.)
As you acknowledged in your linked article, Indian English often bears little resemblance to American English. What matters to American businessmen is communication, not a claim to speaking English when it does not provide adequate communication. No doubt there are brilliant people in India, even some conceptual thinkers, but in my experience, speaking multiple languages is no guarantee of good conceptual thinking. I have also met some educated and capable Indians working in IT, but many expressed agreement with my opinion of OOP, and they are not the fish that the OOP net is meant to catch.
We have been discussing this for over a decade. Adam Reed has been writing for libertarian and Objectivist media since the 1970s. Adam Reed cites this work in his biography on the "Rebirth of Reason" discussion board here http://rebirthofreason.com/Articles/Reed...
Do you mean her metaphysics? I don't know if it was an influence or not, but I do not think it has to be. It is enough for Plato or Aristotle to have been the inspiration.
Having worked in IT for over 25 years, I came in right as OOP was going mainstream.
I have always looked at OOP as being related to Plato's metaphysics: there is a universe of perfect Forms from which everything in our universe is rendered.
I will put this on the "stack of stuff to get to", but definitely high up in the stack.
Thanks for posting this.
My perception of the programming effort was pretty similar to yours, db.
That kind of glitch is typical of bad design. See here: http://necessaryfacts.blogspot.com/2012/...
or just put "worst designs" in your search engine.
This Old House has 14 examples here:
http://www.thisoldhouse.com/toh/photos/0...