Discussion about this post

TioMoco:

I'd say that the development of specialized programming languages is responsible for much of what is observed. These languages are designed to address many of the problems of earlier languages and the constraints they placed on how you had to think about a problem. When a language is designed to get out of your way and let you focus on higher-level constructs, you'd be surprised how much more you can do with it.

And that's leaving aside the incredible amount of work that modern compilers must do to translate purely imaginary constructs, invented for a particular application, into something the hardware can run.

The stuff you can do with a few lines of something as widely known as Java or Python would have required far more code to accomplish in, say, FORTRAN - and that assumes FORTRAN could somehow, through sheer volume of code, accomplish the same task at all.

It's like asking your kid to mow the yard with tweezers. I'll take my ExMark 36" commercial mower any day.
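To make that gap concrete, here is a minimal, hypothetical sketch (not from the original comment): a word-frequency count fits in a few lines of Python because the language ships dictionaries, dynamic strings, and iterators, whereas an older FORTRAN-style environment would need hand-written hash tables and string handling for the same job. The file name input.txt is just a placeholder.

```python
# Hypothetical sketch: word-frequency count in a few lines of Python.
# In a language without built-in dictionaries or dynamic strings,
# the hash table and string handling would have to be written by hand.
from collections import Counter

def word_counts(path):
    with open(path, encoding="utf-8") as f:
        return Counter(f.read().lower().split())

if __name__ == "__main__":
    # Print the ten most frequent words in the (placeholder) input file.
    for word, n in word_counts("input.txt").most_common(10):
        print(f"{n:6d}  {word}")
```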

gwern:

> Finding the asymptote is high school math, not a field of study.

More playing with words, more ignoring my historical point. Why aren't you addressing my original point? (And calculus is not 'a field of study'? OK then...)

In what sense is Brutus correct that "Programming theory used to not consider asymptotic time to be an important field of study"? Can you provide any evidence for this bizarre claim, like the absence of asymptotics from Knuth, no Turing Awards before 1980 for asymptotic improvements, etc.?

If you can't, then I'm going to stop replying, because debating the minutiae of how 'Big O isn't *really* correct' is irrelevant to the OP, to my original comment, and to Brutus's claim.
