FORTRAN to Fortran
I wrote my first computer program sometime in October 1970 - the year I began studying Natural Sciences at Cambridge. I could not wait to sign up for a FORTRAN course at the Computer Laboratory, using the university's "Titan" computer. (Titan was advanced for its time - but only ran at about one million instructions per second, and everyone using it at the same time had to share only 128k of main memory. Modern home computers are several thousand times as powerful. Equivalent university computers are now millions of times more powerful.) At that time, of course, the only places you could find computers were universities, government research establishments and a few large corporations.
I used the FORTRAN language continuously from 1970 until my retirement in 2015. However, the language syntax changed considerably during that period. The original FORTRAN had a limited and relatively simple syntax and grammar, so was very easy to learn. It was also specifically designed to support scientific programming. (The name is a contraction of FORmula TRANslation.) Computer time was then scarce and valuable, so tended to be used mainly for serious stuff like scientific research. For a number of decades FORTRAN was the automatic choice of anyone writing scientific programs because it had all that was required to handle the programs of relatively small size and complexity that it was then feasible to write and run.
Rather surprisingly, many of these programs and others written in subsequent decades are still in regular use in important roles.
Hence Fortran is still a mainstream language because there is a large base of valuable software that will not be replaced any time soon, given the huge cost and high risk of things going wrong. (The product of one of my early major projects is still lively and healthy at 30 years old, and could well remain so for another 20. Providing an equivalent capability today would probably cost in excess of £10 million.) The big advantage of Fortran was always its efficiency: it was good at exploiting the capabilities of the hardware to the maximum extent. Scientists performing advanced simulations - such as predicting how proteins fold or calculating the consequences of black-hole collisions - always want to push the boundaries of what is technically feasible in order to solve cutting-edge problems. For that you need to squeeze out the last drop of available performance.
The fact that some 40-year-old programs are still fulfilling important roles also testifies to Fortran's relative portability: computers die every two to five years, but the programs live on. This is, however, partly due to the nature of the problems one is trying to solve. If you are working at the level of abstraction necessary to solve, say, differential equations, specific features of the computer hardware may not matter too much: the underlying equations are always the same. (There is a caveat: the most demanding problems sometimes need special computer architectures, such as large-scale parallel computing, and programs may then be written specifically to suit an individual computer architecture.)
Over the past 40 years typical Fortran programs have become larger and more complex. In the early days I would have regarded a program of 1000 lines as rather large, but by the time I retired our most important nuclear reactor simulation program had about 500,000 lines of Fortran. By modern standards this is actually rather modest: the weather forecasters can probably multiply this by a significant factor. The growth in size and complexity has, however, meant that program designers needed to work at ever higher levels of design abstraction, and in order to translate these abstractions into code most easily they needed additional features in the language.
The latest incarnations of Fortran have adopted many of the programming constructs that first appeared in other languages (and the change of the official name from FORTRAN to Fortran is supposed to flag the evolution), so it is now a much larger and more powerful (though possibly harder to learn) language. By this I mean that it is easier to express complex ideas with the new incarnations, but possibly a bit harder than previously to express simple ideas. Fortran is therefore still a reasonable choice for scientists. On the whole, I have found that people who are more interested in physics and maths than in software engineering can readily acquire adequate Fortran programming skills, which they can use to build understandable and efficient mathematically based programs without having to navigate the traps of more complex environments (such as C++).
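To give a flavour of what I mean, here is a minimal sketch (invented for this article, not taken from any real project) of the modern style: the module, the whole-array arithmetic and the "pure" attribute below are all things the original FORTRAN lacked, where the same calculation would have needed an explicit DO loop and a shared COMMON block or long argument lists.

    module vector_ops
      implicit none
    contains
      ! Whole-array arithmetic: z = a*x + y in one line, no explicit DO loop
      pure function axpy(a, x, y) result(z)
        real, intent(in) :: a
        real, intent(in) :: x(:), y(:)
        real :: z(size(x))
        z = a*x + y
      end function axpy
    end module vector_ops

    program demo
      use vector_ops
      implicit none
      real :: x(5) = [1.0, 2.0, 3.0, 4.0, 5.0]
      real :: y(5) = 1.0
      print *, axpy(2.0, x, y)    ! prints 3.0 5.0 7.0 9.0 11.0
    end program demo

The intent declarations and the module interface let the compiler catch mistakes that old FORTRAN would have silently accepted, which is exactly the kind of help a physicist-programmer benefits from.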
More recently, in my opinion, it has been suffering from a lack of commercial support: Fortran programmers tend to use a lot of open source, free software, and are less focussed on programming productivity than software engineers who produce front-line products for organisations such as banks.
Such organisations need rapid time-to-market and they are prepared to spend money to achieve it, so commercial suppliers of software tools work on productivity environments for the types of languages used by these customers (such as Java and C++).
Less software is now written in Fortran and it appears less frequently in job adverts. Programs that I would once have written in Fortran are now written in Python (see below). And quite right too! It is a better tool for relatively small-scale data-handling applications. Most physical science and engineering students emerging from university will have undertaken programming courses in either Python or Java (and almost never Fortran). Given no other constraints they will work with what they know (and quite right too). There are, however, still problems that are handled better in Fortran than in any other language, so it is likely to retain a niche user community in areas of scientific application where the mathematics is to the fore and techniques such as parallel programming are important.
Fortran is not a language for graphical applications. We scientists usually run our large simulations without graphics, saving the output to large data files which we then post-process for graphical analysis with specialist tools written in other languages.