
So we are paying 99% of the performance just for the 1% of cases where it's nice to code in.

Why do people think it's a good trade-off?



Performance is worthless if the code isn't correct. It's easier to write correct code reasonably quickly in Python in simple cases (integers don't overflow like in C, don't wrap around like in C#, no absurd implicit conversions like in other scripting languages).
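For example (a tiny illustration, not from the comment above): Python ints are arbitrary precision, so arithmetic that would silently overflow a fixed-width C int or wrap around in C# just gives the exact answer.

    # Python ints grow as needed; there is no fixed width to overflow.
    print(2 ** 63)        # 9223372036854775808 -- already past the signed 64-bit max
    print(2 ** 200 + 1)   # still exact
    # In C, the equivalent signed-int arithmetic would be undefined behaviour;
    # in C#, an unchecked int would wrap around and a checked one would throw.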

Also you don't need code to be fast a lot of the time. If you just need some number crunching that is occasionally run by a human, taking a whole second is fine. Pretty good replacement for shell scripting too.


But many of the language decisions that make Python so slow don't make code easier to write correctly. Take monkey patching: it's very powerful and can be useful, but it can also create huge maintainability issues, and its mere existence as a feature hinders making the code faster.
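To make that concrete (a small sketch, not from the comment above): because any attribute of a module or class can be reassigned at runtime, the interpreter can't assume that a call like json.dumps resolves to the same function from one call to the next, which rules out a lot of ahead-of-time optimisation.

    import json

    _original_dumps = json.dumps

    def logging_dumps(obj, **kwargs):
        # Wrap the real serialiser with a side effect.
        print(f"serialising a {type(obj).__name__}")
        return _original_dumps(obj, **kwargs)

    json.dumps = logging_dumps       # the patch affects every caller, everywhere
    print(json.dumps({"a": 1}))
    json.dumps = _original_dumps     # and can be undone just as easily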


I mean, you can see it from your own experience: folks will post a 50-line snippet of ordinary C code in a blog post that looks like you're reading a long-dead ancient language littered with macros, then be like "this is a lot to grok, here's the equivalent code in Python / Ruby", and it's 3 lines and completely obvious.

Folks on HN are so weird when it comes to why these languages exist and why people keep writing in them. For all their faults, their dynamism, GC, and lack of static typing, in the real world with real devs you get code that is more correct, written faster, when you use a higher-level language. It's Go's raison d'être.


Because it's nice to code in. Not everything needs to scale or be fast.

Personally, I think it's crazier to optimize 99% of the time just to need it 1% of the time.


That’s why Python is the second best language for everything.

The amount of complexity you can code up in a short time, in a way that almost everyone can contribute to, is incredible.


It isn't an either-or choice. The people interested in optimizing performance are typically different people than those interested in implementing syntactic sugar. It's certainly true that growing the overall codebase risks introducing tension between some feature sets, but that's just a consideration you take on when diligently adding to the language.


Because many have never used Smalltalk, Common Lisp, Self, Dylan, ... so they think CPython is the only way there is. Plus, their computer resources are already being wasted by tons of Electron apps anyway, so they hardly question CPython's performance, or lack thereof.


Has it ever crossed your mind that they just like Python?


I use Python 90% of my day and I can't say I like it or hate it or care about it at all. I use it because it has all the libraries I need, and LLMs seem to know it pretty well too. It's a great language for people that don't actually care about programming languages and just want to get stuff done.


And slow code. Yes, it has crossed my mind.

Usually they also call libraries that are 95% C code "Python".


The hypocrisy gets even worse: the C code then gets compiled to assembly!


Except C developers actually acknowledge that; they don't call libraries written in assembly "C code".


Sure they do.

https://github.com/OpenMathLib/OpenBLAS https://github.com/FFmpeg/FFmpeg

Plenty of assembly in those projects but no mention of it in the README. Most C projects don't acknowledge the assembly they use.


It's much more than 1%; it's what enables commonly used libraries like pytest and Pydantic.
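For a sense of the mechanism (a minimal sketch using only the standard library, not Pydantic's actual implementation): libraries like this lean on Python's runtime introspection, e.g. reading a class's type annotations when the class is defined and validating constructor arguments against them.

    class ValidatedModel:
        def __init_subclass__(cls, **kwargs):
            super().__init_subclass__(**kwargs)
            # Capture the declared field types once, at class-definition time.
            cls._fields = dict(cls.__annotations__)

        def __init__(self, **values):
            for name, expected in self._fields.items():
                value = values[name]
                if not isinstance(value, expected):
                    raise TypeError(f"{name} must be {expected.__name__}, "
                                    f"got {type(value).__name__}")
                setattr(self, name, value)

    class User(ValidatedModel):
        name: str
        age: int

    User(name="Ada", age=36)         # fine
    # User(name="Ada", age="36")     # would raise TypeError at runtime

pytest relies on similar runtime hooks, e.g. inspecting a test function's signature to decide which fixtures to inject.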


Most of the time you are waiting on a human, or at least on something other than the CPU. And most of the time, more time is spent by the programmer writing the code than by all the users combined waiting for the program to run.

Between those two, performance is most often just fine to trade off.


Because computers are more than 100x faster than they were when I started programming, and they were already fast enough back then? (And meanwhile my coding ability isn't any better, if anything it's worse)


I don't think anyone aware of this thinks it's a good tradeoff.

The more interesting question is why the tradeoff was made in the first place.

The answer is, it's relatively easy for us to see and understand the impact of these design decisions because we've been able to see their outcomes over the last 20+ years of Python. Hindsight is 20/20.

Remember that Python was released in 1991, before even Java. What we knew about programming back then vs what we know now is very different.

Oh and also, these tradeoffs are very hard to make in general. A design decision that you may think is irrelevant at the time may in fact end up being crucial to performance later on, but by that point the design is set in stone due to backwards compatibility.


It isn't. There are many things Python isn't up to the task for. However, it has been around forever, and in some influential niche verticals like cyber security it was as useful as, or more useful than, native tooling, and it works on multiple platforms.


Because Matplotlib and Pandas etc. save programmer time, even where they waste processor time.


I can say with certainty I’ve never paid a penny. Have you?



