Retro Computing (on Steroids)

  • #651365
    Robert Atkinson 2
    Participant
      @robertatkinson2

      Why do people keep saying BASIC is interpreted? While many BASIC implementations do use interpreters, compilers are available, and some versions are solely compiled. And if interpreted languages are so bad, why is Python so good? Python needs huge resources. OK, silicon is cheap now, but more silicon still costs more and may be less reliable. Even MicroPython needs a processor with 256kB of program memory and 16kB of RAM. I've written functional code, used in professional commercial products, in BASIC (PIC Basic Pro) that ran in 256 words (12-bit) of program space and 25 bytes of RAM. Yes, that is 409 bytes of total memory.

      Robert

      #651368
      Ady1
      Participant
        @ady1

        Most of it took ages to learn and forever to program

        and if it was easy to learn it ran too slow

        It was fun but it was very rarely done properly by "professionals"

        The NHS got humped for billions by the computer guys who promised the moon and delivered a horse and cart

        At times there seemed to be more lines of legal get-out-of-jail-free code than computer code when you looked at a user's EULA

        The computer industry was the first modern industrial supplier that could deliver a product which didn't actually work and not have to give the money back

        It was fun though!

        #651378
        An Other
        Participant
          @another21905

          Just for information:

          LINK

          Interesting reading the diversion into programming. My first attempts at programming began around 1965, using Elliott Autocode on Elliott 803 computers – hand-punching paper tape – and later moved on to the likes of Fortran and Cobol, at the same time getting into more exotic machine languages through involvement in research into fully steerable satellite-tracking systems.

          Working both as a 'programmer' and an 'engineer' over the years (whatever those are), it always struck me that programming languages can never provide everything to everybody. I found I was using BASIC (in various forms) to produce quick-and-dirty programs to solve immediate engineering problems, yet at other times getting involved in in-depth programming to produce 'operational' software, and in my opinion the two differed tremendously.

          I didn't care how it was done in BASIC, and nor did I want to mess about with compiling, linking and so on – the good thing was being able to write or edit something quickly and see it working. Our focus was on the equipment, not the software – it was just a tool. Yet BASIC, for all the reasons listed in this thread, was hopeless for 'serious' software production.

          I suppose we now recognise the shortcomings of all software, but we should remember that the languages we use now were designed years ago – times, equipment and usage all change. In 1965 we had no idea of the scourge modern hacking would grow into, forcing programmers (in theory) towards more secure languages (Rust, et al.). C and its variants were considered perfectly safe for use years ago; security was left to the programmer's competence, if it was considered at all, but not any more.

          The introduction of more efficient and secure languages has brought its own problems. Python was, and is, often touted as "the" language to use, yet it too has brought along problems of its own. As someone already said on this thread, it is horrendous to learn – I believe I read recently that there is even a move to simplify Python and reduce the multiplicity of associated tools that all do the same job.

          I think software will always carry these problems – and different people will always use what they are familiar with, and like to use.

          #651392
          SillyOldDuffer
          Moderator
            @sillyoldduffer
            Posted by Robert Atkinson 2 on 09/07/2023 08:08:41:

            Why do people keep saying BASIC is interpreted? …

            And if interpreted languages are so bad why is Python so good?

            Robert

            People usually think BASIC is interpreted because that's what they learned. It's very common, and it was unusual for amateurs to invest in a compiler. If they'd done so, they'd have found compiled BASIC isn't as cuddly as interpreted!

            But the assumption misses an important point: a computer language and its implementation aren't the same thing. Interpreted, compiled, and combinations of the two are all possible, each with pros and cons.

            Python, like Java and many others, compiles to an intermediate byte code, which is then interpreted. This approach combines many of the advantages of both (there's a short demonstration after the list):

            • the compile phase error-checks and optimises the code – loop unrolling, hoisting invariant code out of loops, minimising branching, reordering, factoring out duplicate code, removing dead code, keeping busy variables in registers, and many other tricks.
            • the interpreter provides efficient base functionality and memory management. It's fed clean, fast byte code, without the complexities implicit in machine code.
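
            That byte code isn't hypothetical – standard CPython will show it to you. A minimal sketch using only the standard library's dis module (the tally function is just an illustration):

```python
import dis

def tally(words):
    """Toy function: count occurrences of each word in a list."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

# Print the byte-code instructions CPython compiled tally() to.
# Each output line is one instruction for the byte-code interpreter.
dis.dis(tally)
```

            Running it prints one interpreter instruction per line – the clean, fast byte code described above.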

            Although the approach works well, it means Python is a hefty beast – too big in its full form to fit on smaller microcontrollers, so not a good choice for embedded computing. However, there are Python compilers that do generate machine code, the main problem being that the implementations aren't identical.

            As with all tools, the best computer language is determined by what it's for. Most languages have glass ceilings that stop the programmer doing what he needs. In my experience:

            • The original interpreted BASIC has a low glass ceiling. Fine and dandy up to a point, then a shattering stop, maybe forcing the whole program to be rewritten in something else. Compiled BASIC removes some obstacles, but not all, and converting people and code costs money and delay. Worse, having to switch to a compiler suggests a need to think again: having failed to choose the right tool once, it would be foolish to do it again! Not the sort of language problem the average internet Joe can advise on. If the code is focused on a system-like problem, then C/C++ is a good choice. But if the code is focused on efficient number crunching, then maybe the answer is FORTRAN. Back in the day, even horrible old COBOL was massively better at data processing than BASIC. Today there are many alternatives. If it were a road car, original interpreted BASIC would be a Citroen 2CV.
            • C/C++ probably doesn't have a glass ceiling! It's a high-performance system language, close to the machine, used to develop other software tools. Chances are your BASIC is written in C, as is much of Python, and many other languages besides. The downside is that there is a lot to learn, and much of the work is a slow, low-level plod: the programmer is responsible for almost everything. If it were a road vehicle, C would be an MV Agusta Brutale 1000 Nürburgring with bald tyres.
            • Python has a high glass ceiling, and is extensible. Although it can do system work, the main focus is productivity. For example, classic BASIC only had two data structures, strings and arrays, which aren't enough for advanced work. Say a program is needed to count how many times each word occurs in a book. An array could keep the tally, but arrays are fixed size and we don't know how many unique words a book contains. A different data structure is needed, one that can grow, and BASIC doesn't have one. Python provides sets, dictionaries, deques, counters, lists and tuples (there's a sketch after this list). Their availability makes Python highly productive. If Python were a car it would be something like a high-end SUV: comfy cabin in the front, big carrying capacity, reasonable on- and off-road performance, a winch on the back, and the driver an ordinary chap with a plain licence.
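
            The word-count problem above takes only a few lines in Python; a minimal sketch, assuming a text file to count (book.txt is just an illustrative name):

```python
from collections import Counter

# Tally how often each word occurs in a book-sized text file.
# "book.txt" is a placeholder name for this sketch.
with open("book.txt", encoding="utf-8") as f:
    counts = Counter(f.read().lower().split())

# Counter grows as new words appear - no fixed-size array needed.
print(counts.most_common(10))  # the ten most frequent words
```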

            In the distant past I wrote BASIC for money and it caused many problems. Later, perl was wonderful for several years, but it couldn't keep up with Python. I also wrote C/C++. Now retired, I find Python and C/C++ complement each other delightfully. Python is good for rapid development of big complex general purpose programs that don't need space/time optimisation. C/C++ is excellent when space/time and performance are vital, for operating system level work, and embedded code (Arduino and friends).

            Anyone remember Filetab? It was a decision-table language used for report writing. Back in the 70s it was brilliant, far better than COBOL, apart from its glass ceilings! Really hot at reading files, but not at writing them, which limited what it was useful for. Not good at maths either. A more serious shortcoming appeared as the code grew in size: up to about two pages, decision tables sparkle, but after that people start having trouble following the logic. Above a certain level of complexity it was easier to write COBOL, even though COBOL is clunky.
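
            For anyone who never met one, the decision-table idea is easy to sketch in a modern language. This is a toy in Python, not Filetab syntax, and the names, conditions and rules are all invented for illustration:

```python
# Toy decision table (a Python sketch, not Filetab syntax).
# Each combination of condition outcomes maps to one action,
# like reading down a column of a printed decision table.

def classify(record):
    over_limit = record["amount"] > 1000   # condition 1
    flagged = record["flagged"]            # condition 2
    table = {
        (True, True): "reject",
        (True, False): "refer",
        (False, True): "refer",
        (False, False): "accept",
    }
    return table[(over_limit, flagged)]

print(classify({"amount": 1500, "flagged": False}))  # -> refer
```

            The appeal is the same as Filetab's: while the table fits on a page, the logic is obvious at a glance.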

            My advice: look for language limitations! Likes are secondary.

            Dave

            #651394
            IanT
            Participant
              @iant

              ANO – I'm not sure linking two Unos to provide VGA output is the best way to do things; it's more of a "look what I can do" techie thing. The PicomiteVGA is a far more elegant solution to this requirement, with excellent integration of graphics into the overall system. I haven't looked much further into Tiny Basic, but the simple 'Hello World' example uses line numbers and a GOTO statement – so I guess it is certainly "Retro" from that point of view.

              Nigel (and your Amstrad memories) – I suspect many still remember BASIC as being like that: line numbers, PEEKs & POKEs, nested GOTO loops – so-called Spaghetti Code. If you really need them, they are still available in MMB, but their use is really not recommended. Instead I have access to named subroutines and functions, CASE statements, and simple input/output functionality (no need to manipulate memory to access hardware).

              However, I think ease of use is really the best feature of all. When I'm writing programmes and make a typo or mistake (a frequent occurrence, I'm afraid), MMB detects my error and drops me straight into the editor at the faulty statement so it can be fixed. Having made changes, I simply hit F3 and MMB saves and runs the programme again. It's very quick and intuitive to use, which for me far outweighs any issues around interpreters, speed and so on. Modern micros are very (very) much faster than the 8-bit ones of forty years ago, so processing speed isn't a practical problem in most of the applications I write. Much of the time the system is just sat waiting for something to happen…

              Regards,

              IanT

              Edited By IanT on 09/07/2023 13:41:42

              #651419
              Nigel Graham 2
              Participant
                @nigelgraham2

                Ian –

                I do not recall ever using the commands PEEK and POKE, but I do remember being told that "Spaghetti Programming" was caused not by a bad language but by programmers' bad use of the language – i.e. poor design and not using the language to its best.

                That suggests that if the language itself does have a weakness, it is in allowing that to happen.

                However – and this seems thin in this thread – programming is not done because one has a computer and knows how to programme it, but to perform some intended work. You don't make a model engine because you have a lathe; you have the lathe because you want to make the engine. The related parallel is that both need plenty of special skill.

                So my superiors did not learn to write BASIC just to fill up computers (and floppy discs), but to perform real work, and their programmes worked very well, whatever the purists might sniff at. I dare say later languages might have been more efficient, and the users were constrained by the computers available, but that misses the point.

                Our workhorse office PCs all had MS Windows loaded, but the lab computers were HP machines without MS software, matched to the measuring analysers made by HP, Solartron, etc., with appropriate data connectors. HP also sold its own version of BASIC for this, using simple command lines to pass instructions such as frequency range and steps to the analyser, and to receive the results back.

                Our department needed hefty great number-mills that the contemporary PCs – and anything beginning with 'W' – could not handle, so it built a small local network of an internal server and PCs, all Sun Microsystems machines. These had their own stripped-down windowing system and entailed a lot of DOS-like command-writing to use, let alone programme.

                In later years the firm installed circuit-diagrammatic applications like LabVIEW, but these are not programming languages as such. Rather, they are a sort of laboratory 2D CAD, creating a single test-and-analysis system from the PC and the attached sensors and so on; and, like CAD, they do not need users to learn programming.

                .

                I learnt BASIC to a moderate level (ASCII-value-based string-handling routines were fun!) but could never master creating and reading data files, despite having a real textbook to hand. So my programmes could only print the result of each entry-set one at a time, on screen and paper. I have not attempted C++, although Win-11 includes it in the applications directory – dangerous, if the MS software itself is all in C++? My interest is now simply as an observer – I have no engineering purpose for programming.
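
                For comparison, the data-file step that proved such a hurdle in old BASIC is a few lines in a modern language. A minimal Python sketch, with illustrative names, values and file name:

```python
# Write each entry-set's result to a file instead of only printing it;
# names, values and the file name "results.txt" are illustrative.
results = [("entry-1", 3.14), ("entry-2", 2.72)]

with open("results.txt", "w", encoding="utf-8") as f:
    for name, value in results:
        f.write(f"{name},{value}\n")

# Read the whole set back later.
with open("results.txt", encoding="utf-8") as f:
    for line in f:
        name, value = line.strip().split(",")
        print(name, float(value))
```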

                .

                I am intrigued, though, that even allowing for the development of the electronics (to fill with ever more bells and whistles?), there are so many different languages for basically the same circuits.

                The computer can only really work in one way, often with circuits common to many makes, so whatever its make, whatever application and language you throw at it still needs translating into what its little blocks of transistors understand. (I don't know the differences between assembling, compiling and interpreting; only that it all needs reducing to 1s and 0s.) It's like having a dozen different books, with different styles of drawings, on making that same cited model engine: whatever order you make the parts in, whichever holding methods or tools, the lathe still only works in one way!
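
                As it happens, the compiling/interpreting split Dave described is visible from inside Python itself: source text is first translated into a code object, which the interpreter then runs. A minimal sketch:

```python
# Python's pipeline in miniature: compile source text to a code
# object (the translation step), then let the interpreter run it.
source = "total = sum(range(10))\nprint(total)"

code_obj = compile(source, "<demo>", "exec")  # compile step
exec(code_obj)                                # interpret step; prints 45
```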

                So, apart from updating to exploit electronics development, since any one PC will run various languages within its power if loaded with the right translators, is the Babel-esque range of languages genuinely technical or merely commercial?

                #651435
                Ady1
                Participant
                  @ady1

                  If the hardware had been better, things would have been a lot easier; the industry had a lot of limitations forced upon it and worked hard to overcome them

                  I paid 200 quid for 8MB of extra RAM in my 1600-quid, 1GB-HDD computer, so almost 2k of the 30k I got for a two-bedroom flat in Edinburgh went on my new Windows 3.1 wundermachine

                  #651488
                  David Taylor
                  Participant
                    @davidtaylor63402

                    Old-school BASIC did have significant problems that stopped anyone writing good code with it. The best you could do was code that worked and was as good as the language would allow. I'd say those problems have been greatly reduced now.

                    I recommended one of our users try out a MicroMite for talking to a small grain silo he has. He's already used an Arduino for a pneumatic gate system for sorting sheep, so he's technically very capable, but I thought the MicroMite, with its more immediate programming model, might be a nice change.

                    If my kids were at all interested in programming I'd probably get one. I think it's a great project.

                    But I still mourn the simplicity the old 80s micros had, hardware-wise.

                    The range of languages available is driven by demand, not hardware compatibility. It can be as random as one guy deciding to write an operating system kernel for fun, which then goes on to run most of the world's computers, so lots of people keep its implementation language (straight C) alive; or another deciding he wants to write a language, and it gets popular for some reason (Python). Java was one of the rarer, industry-driven examples of a language becoming popular: marketing, the interpreted write-once-run-anywhere feature, and the fact that it was actually a mostly better C/C++ for application-level coding meant it took off. It's largely considered the COBOL of the 21st century, which I think is unfair – COBOL is still the COBOL of the 21st century anyway!

                    #651507
                    Gerard O’Toole
                    Participant
                      @gerardotoole60348

                      I was never a programmer, rather a scientist. But in the seventies (and eighties) there was little application software suitable for our calculations, and dabbling in BASIC to get an error-free result was common. Every laboratory seemed to have at least one person who would manage to create a simple program to do the tedious calculations. And at that level BASIC was a reasonably useful programming language.

                      But I went on to attempt to create a program to record requests, accept results and print reports for an analytical medical laboratory. BASIC was chosen because it was all that was available on the HP computer we had. It was complemented by a very good assembler, but everything depended on the BASIC program. And did that program – which was essentially a specialised database-management application – highlight the severe deficiencies of BASIC! It really is totally unsuitable for any sort of structured development. Even with the exclusion, where possible, of GOTO, and only using labelled GOSUB routines, it was still a nightmare. I have no way of knowing, but I suspect it was one of the largest BASIC programs ever written and used continuously. It was always under development. Any real programmer would have known not to attempt such a large project in BASIC.

                      Thankfully I have left that drama well behind. But if I need to code a small, or large, program I tend to go to either C# (Windows) or Java (only for Android), never BASIC. While my preference is for C#, I am much taken with the ubiquity of the mobile phone, and find anything coded for Android is usually more useful. Of course, Google has decided that the preferred language is now Kotlin instead of Java, but I am not too sure I could be bothered learning something new. I can accept Dave and David's assurance that Python is marvellous, but its benefits and delights will probably remain a mystery to me.
