Why is Python so popular despite being so slow?

Speed is only important when you’re writing something that’s very computation-intensive or will be run many many times. For one-off jobs, or other jobs where speed of execution isn’t really important, speed of development becomes much more important.
Python is a more semantically powerful, high-level language than faster languages, which makes it much quicker and less painful to develop in. The same dynamism that makes it so semantically powerful is also what makes it slow: the language is too dynamic to be statically compiled, so it must be interpreted. There is a trade-off among programming languages between speed of execution and convenience of development; for many projects the former is more important, and for many others the latter is. Python probably won out over the other high-level/scripting languages because it also happens to be a very elegant language.
Also, Python has many, many libraries available for it, many of which are written in C, which means you can do things that are very CPU-intensive with Python at the helm. (Of course, the extensive base of libraries is itself another reason why Python is such a useful language to code in, but the fact that Python is an elegant language probably explains why it gained the momentum that resulted in so many libraries being written for it in the first place.)
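To make the "C at the helm" point concrete, here's a small sketch (assuming NumPy is installed) where the same arithmetic is done once in a pure-Python loop and once as a vectorized NumPy operation that runs in compiled C:

```python
# Summing 1,000,000 squares: pure Python vs. NumPy.
# NumPy delegates the loop to compiled C code, which is why
# CPU-intensive work can still be fast "with Python at the helm".
import numpy as np

n = 1_000_000

# Pure-Python loop: every iteration goes through the interpreter.
py_total = sum(i * i for i in range(n))

# NumPy: the same arithmetic as a single vectorized C loop
# over one contiguous buffer.
arr = np.arange(n, dtype=np.int64)
np_total = int((arr * arr).sum())

assert py_total == np_total
```

The NumPy version is typically one to two orders of magnitude faster on a loop like this, though the exact ratio depends on your machine and versions.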
There is also PyPy, which speeds up Python execution by several hundred percent by just-in-time compiling it.
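The kind of code PyPy helps most with is a pure-Python hot loop, which its tracing JIT compiles to machine code. As a hedged illustration, this script runs unchanged under CPython or PyPy (e.g. `pypy3 primes.py`); only the runtime differs:

```python
# A pure-Python hot loop of the sort PyPy's tracing JIT turns into
# machine code. No C extensions involved, so the interpreter (or JIT)
# does all the work.
def count_primes(limit):
    """Count primes below `limit` by trial division."""
    count = 0
    for n in range(2, limit):
        is_prime = True
        d = 2
        while d * d <= n:
            if n % d == 0:
                is_prime = False
                break
            d += 1
        if is_prime:
            count += 1
    return count

print(count_primes(10_000))  # 1229 primes below 10,000
```

Loops like this are where PyPy's speedups show up; code that mostly calls into C libraries sees much less benefit.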
I guess it’s just cheaper to throw more servers at it than it is to write and maintain, say, C or C++. Or maybe it’s that in typical server workloads, internet bandwidth, disk access, etc. become a bottleneck before actual code execution does. Just speculation, though; I don’t know much about it.
Being fast really matters. As somebody said in her talk, "Users really respond to speed."
A major problem for the future is that datasets keep getting bigger, and at a rate much faster than memory bandwidth and latency improve. I was speaking to a hedge fund tech guy at the D language meetup last night about this. His datasets are maybe 10x bigger than 10 years ago, and memory is maybe only 2x as fast. These relative trends show no sign of slowing, so Moore's Law isn't going to bail you out here. He found that Python chokes on log datasets around 30 gig. He also said that you can prise numpy from his cold dead hands, as it's very useful for quick prototyping. But, contra Guido, no, Python isn't fast enough for many serious people, and this problem will only get worse. Python has horrible cache locality, and when the CPU has to wait on a memory access because the data isn't in cache, you may stall for hundreds of cycles.
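The cache-locality point can be made concrete with a memory-layout sketch (assuming NumPy; the sizes in the comments are typical for 64-bit CPython 3, not guaranteed): a Python list holds pointers to boxed int objects scattered across the heap, while a NumPy array holds the raw values in one contiguous block.

```python
# Why Python's cache locality is poor: a list of ints is an array of
# 8-byte pointers, each pointing at a separate ~28-byte int object on
# the heap, so a traversal can miss cache on every element. A NumPy
# int64 array is one contiguous buffer of raw 8-byte values.
import sys
import numpy as np

n = 100_000
py_list = list(range(n))
np_arr = np.arange(n, dtype=np.int64)

# List cost: the pointer array plus every boxed int object it references.
list_bytes = sys.getsizeof(py_list) + sum(sys.getsizeof(x) for x in py_list)

# Array cost: just the contiguous data buffer (8 bytes per element).
arr_bytes = np_arr.nbytes

print(list_bytes, arr_bytes)
```

On a typical 64-bit build the list ends up several times larger than the array, and the pointer-chasing, not just the size, is what makes the CPU wait on memory.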