Ian –
I do not recall ever using the commands PEEK and POKE, but I do remember being told that "spaghetti programming" was caused not by a bad language but by programmers' bad use of the language – i.e. poor design, and not using the language to its best.
That suggests that if the language itself does have a weakness, it is in allowing that to happen.
However – and this is what seems thin in this thread – programming is not done because one has a computer and knows how to programme it, but to perform some intended work. You don't make a model engine because you have a lathe, for example; you have the lathe because you want to make the engine – but the related parallel is that both need plenty of special skill.
So my superiors did not learn to write in BASIC just to fill up computers (and floppy discs), but to perform real work, and their programmes worked very well whatever the purists might sniff at. I dare say later languages might have been more efficient or whatever, and the users were constrained by the computers available, but that misses the point.
Our workhorse office PCs all had MS Win-x loaded, but the lab computers were HP machines without MS software, matched to the measuring analysers made by HP, Solartron, etc., via appropriate data connectors. HP also sold its own version of BASIC for this, using simple command lines to pass instructions such as frequency range and steps to the analyser, and to receive the results back.
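For anyone curious, that command-line style of instrument control still exists today. A hedged sketch in Python rather than HP BASIC – the command names here (e.g. "FREQ:START") are generic SCPI-style placeholders, not the actual HP ones, and no real analyser is attached:

```python
# Sketch of command-string instrument control: build short ASCII commands to
# set up a frequency sweep, and parse the ASCII reply an analyser would send.
# Command names are invented SCPI-style examples, not HP BASIC's actual syntax.

def build_sweep_commands(start_hz, stop_hz, points):
    """Build the command strings that would be sent to an analyser."""
    return [
        f"FREQ:START {start_hz}",
        f"FREQ:STOP {stop_hz}",
        f"SWE:POIN {points}",
        "INIT",  # start the sweep
    ]

def parse_trace(reply):
    """Analysers typically return results as comma-separated ASCII numbers."""
    return [float(v) for v in reply.split(",")]

# Example: a 1 kHz to 1 MHz sweep with 201 points
commands = build_sweep_commands(1e3, 1e6, 201)
trace = parse_trace("0.12,0.15,0.19")
```

The point is simply that "programming the analyser" meant little more than composing and sending lines of text like these, which is why plain BASIC served so well.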
Our department needed hefty great number-mills that the contemporary PCs, and anything beginning with 'W', could not handle, so it built a small local network of an internal server and PCs, all Sun Microsystems machines. These had their own stripped-down windowing system and entailed a lot of DOS-like command-writing to use, let alone programme.
In later years the firm installed circuit-diagrammatic applications like LabVIEW, but these are not programming languages as such. Rather, they are a sort of laboratory 2D CAD for creating a single test-and-analysis system from the PC and the attached sensors, etc.; and like CAD, they do not need users to learn programming.
I learnt BASIC to a moderate level (ASCII-value-based string-handling routines were fun!) but could never master creating and reading data files, despite having a real text-book to use. So my programmes could only print the result of each entry-set one at a time, on screen and paper. I have not attempted C++, although Win-11 includes it in the applications directory – dangerous, if the MS software itself is all in C++? My interest is now simply as an observer – I have no engineering purpose for programming.
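For the record, the data-file handling that the text-book made heavy going of is only a few lines in a modern language. A hedged sketch in Python, not BASIC – the file layout and names here are invented for illustration:

```python
# Sketch of "creating and reading data files": write each entry-set as one
# comma-separated line, then read them all back, instead of printing results
# one at a time. File name and field layout are made up for this example.

def save_results(path, results):
    """Write one (label, value) entry-set per line."""
    with open(path, "w") as f:
        for label, value in results:
            f.write(f"{label},{value}\n")

def load_results(path):
    """Read the file back into a list of (label, value) pairs."""
    pairs = []
    with open(path) as f:
        for line in f:
            label, value = line.strip().split(",")
            pairs.append((label, float(value)))
    return pairs

# The ASCII-value string handling BASIC did with ASC() and CHR$()
# is ord() and chr() here:
assert ord("A") == 65 and chr(65) == "A"
```

With that, a whole session's entry-sets can be saved and re-read later, rather than existing only on screen and paper.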
I am intrigued though that even allowing for the development of the electronics (to fill with even more bells and whistles?), there are so many different languages for basically the same circuits.
The computer can only really work in one way, often with circuits common to many makes; so whatever its make, whatever application and language you throw at it still needs translating into what its little blocks of transistors understand. (I don't know the differences between assembling, compiling and interpreting; only that it all needs reducing to 1s and 0s.) It's like having a dozen different books, with different styles of drawings, on making that same cited model engine: whatever order you make the parts in, whichever holding methods or tools you use, the lathe still only works in one way!
So, apart from updating to exploit electronics development – since any one PC will run various languages within its power if loaded with the right translators – is the Babel-esque range of languages genuinely technical or merely commercial?