Hacker News
3 years ago by patrick91

Presentation from Guido van Rossum at the Python Language Summit: https://github.com/faster-cpython/ideas/blob/main/FasterCPyt...

3 years ago by antman

Comments on the points being made, based on my experience:

- without breaking anyone's code (they did that for the print function etc. and delayed py3 adoption for a decade, but not for the most user-requested feature, which is speed)

- No large PRs (how can a 5x speedup happen through small patches that no one has figured out yet?)

- a small team (not enough money; it would be less risky if they tried to adopt other pypy-esque projects that are more advanced speed-wise; a more comprehensive plan would invite donations, and I would certainly give. Based on this presentation alone, I don't know)

3 years ago by kzrdude

I don't think "without breaking anyone's code" even needs to be a goal. Python has rolling deprecations and feature removals planned in the 3.x release series, and careful feature transitions like these could be used to help JIT features or similar, too.

To back that up: Python 3.10 will remove long-deprecated features; see the issue tracker (https://bugs.python.org/issue41165) and the "Removed" section of What's New (https://docs.python.org/3.10/whatsnew/3.10.html#removed).

3 years ago by uluyol

WRT large PRs, it doesn't sound like they expect all the improvements to be from small changes. I think the point is the code will be changed in increments, so that even large changes (overall) will be easier to review and give feedback on.

3 years ago by yjftsjthsd-h

> without breaking anyone's code (they did that for the print function etc. and delayed py3 adoption for a decade, but not for the most user-requested feature, which is speed)

The Python 3 debacle is a beautifully worked example of why being backwards compatible is so important, and I can 100% sympathize with them not wanting to go through that again.

3 years ago by axaxs

Well, glad he's coming around. Years back he seemed to stand by the notion that Python wasn't slow, showing a toy example. I remember thinking it was just condescending, whether intentional or not.

3 years ago by takeda

I think years ago things were much different, and even today that might still partly be true.

Among interpreted languages, Python is/was one of the faster ones.

NodeJS has similar speed, and in some cases is faster. I don't know of any other interpreted language that does better, though.

People also complain about the GIL, but ironically NodeJS is single-threaded.

3 years ago by qsort

Node.js is much faster than Python even for workloads where their use cases overlap, like web APIs and scripting.

The only production language with performance in the ballpark of Python is Ruby.

I agree that more often than not it simply isn't an issue, but CPython performance isn't far ahead of toy languages, for a whole lot of reasons that we are all sick and tired of hearing.

3 years ago by etaioinshrdlu

I updated a large Django project from Python 2.7 to 3.9 recently. Afterward, I was pleased that the app was both faster and used less than half the memory. It was somewhat surprising that the new version was so much better - most software seems to typically get only heavier over time!

So, I take this as a good sign that progress will continue.

3 years ago by guggle

> most software seems to typically get only heavier over time!

It's often the case for end-user applications, but I find that most languages/runtimes tend to get better with time (and work, of course!). There's a similar trend with the PHP runtime.

3 years ago by MaxBarraclough

That rule of thumb applies to the web too. Browsers keep getting faster, websites keep getting slower.

3 years ago by cogman10

I can't think of a language/runtime that has actually gotten slower with time. The nearest I can really point to is the uptick in compile time that happens with languages like Rust/C++ every so often. That's not really a "language is slower" problem though.

3 years ago by kzrdude

rustc's trend over the last twenty releases has been that the compiler is getting faster. I guess every dependency is slowly growing in size, offsetting this in your perception? The compiler benchmarks are of course held constant to be able to compare releases.

See the perf website https://perf.rust-lang.org/dashboard.html

3 years ago by 1vuio0pswjnm7

But isn't the Python install getting heavier?

3 packages will be installed:

  expat-2.3.0_1 
  gdbm-1.19_1 
  python-2.7.18_3 
Size required on disk: 18MB

3 packages will be installed:

  expat-2.3.0_1 
  gdbm-1.19_1 
  python3-3.9.4_1 
Size required on disk: 22MB

3 years ago by kristianp

True, but that's tiny by today's standards.

3 years ago by kzrdude

This is exciting! So that we don't miss the main point: Eric Snow, GvR and Mark Shannon will be working for Microsoft to improve Python performance.

In a way, it seems like Mark Shannon found someone to take him up on his offer on a plan for speeding up Python.

3 years ago by BiteCode_dev

For people googling, not that Mark Shannon:

https://en.wikipedia.org/wiki/Mark_Shannon_(actor)

That Mark Shannon:

https://github.com/markshannon

He tried to do HotPy in 2011, and triggered the idea of FAT bytecode, which I believe inspired the FAT Python attempt by Victor Stinner.

Now Stinner is working on HPy, Dropbox on Pyston, Instagram on Cinder, and Pyjion just got a release.

Looks like Python is going to get faster one way or another.

3 years ago by sizediterable

Dropbox hasn't sponsored Pyston development in quite a while https://blog.pyston.org/2017/01/31/pyston-0-6-1-released-and...

3 years ago by rqst

Why do people in the Python community get so excited over announcements? I'd rather see working code, but no one cares about that in the Python universe.

It is always announcements, talks, and conferences, and if something does emerge it is a bit weird, like the pattern matching.

Meanwhile the Erlang people quietly produced a JIT without any advertisements.

3 years ago by coldtea

>Why do people in the Python community get so excited over announcements? I'd rather see working code, but no one cares about that in the Python universe.

Because almost everything you've seen as 'working code' started its life as an announcement.

And because to coordinate and discuss future work, there would need to be some announcements.

And because some announcements are more important than others, based on the people involved (e.g. here GvR is involved), the funding (e.g. here MS is involved), or the specificity (e.g. here a 3.10 timeframe is discussed).

>Meanwhile the Erlang people quietly produced a JIT without any advertisements.

Good for them. That's maybe because far fewer people care about Erlang (and thus about its announcements) relative to Python (which has a much larger dev base), so the former's announcements are posted less often and discussed by fewer people.

3 years ago by HUSSTECH

Just to add to the coordination point: Python has a large user base, and that's not to say larger user numbers equal better/superior. So the broadcasting of intention or direction is certainly welcome.

Not only is it a large user base, but a varied one too. Flask, Django, FastAPI, Twisted... and those are just the web frameworks! We have scientific use, research use, CLI tools. Perhaps in some cases end users (developers or not) of those tools aren't even aware Python is the foundation of said tool.

Anecdotally the Erlang users I've met have been incredibly knowledgeable and in tune with the language features and development. I find that pretty cool.

In my opinion, an argument can be made that Elixir is the most prominent part of the Erlang ecosystem, so developers can just keep up with that and not the underlying language if they wish. Compare that to the Python ecosystem. As Erlang inevitably grows in popularity, it too may fragment.

3 years ago by dragonwriter

> Meanwhile the Erlang people quietly produced a JIT without any advertisements.

Er, no, they didn’t, they made plenty of announcements before the release, most of which even made their way to HN.

3 years ago by kzrdude

This project has already submitted a few small changes into CPython, like https://github.com/python/cpython/pull/25069 and https://github.com/python/cpython/pull/25729

3 years ago by BiteCode_dev

> Why do people in the Python community get so excited over announcements?

When you like a tech, knowing people are brewing perf improvements for it is exciting. With Python, the motto has always been "it's fast enough", "if you want performance, don't use Python", "C extensions will solve this", "we don't want to make the main implementation complicated", "Python's dynamism and the GIL make it a hard problem", etc.

So in the Python world, it's particularly big news, especially given that previous attempts (the Gilectomy, Unladen Swallow, the first Pyston...) all died.

3 years ago by WesolyKubeczek

> So in the Python world, it's particularly big news, especially given that previous attempts (the Gilectomy, Unladen Swallow, the first Pyston...) all died.

This is precisely the point you seem to be missing. All those things you mentioned were similar big announcements back in their day, and they all just fizzled out and died. What sets this one apart?

3 years ago by pca006132

Just wondering, why is CPython a lot slower than JS? It seems to me that both languages are interpreted and come with reflection features (like modifying object methods), but JS seems a lot faster in most of the benchmarks at https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

Edit: Why is this being downvoted? Is my statement incorrect or offensive to some people?

3 years ago by scatters

CPython has a C API. Any Python object may at any time be passed into a C extension that expects to see all its introspectible attributes, nicely type-erased. This inhibits inlining, elision and monomorphization, all essential optimization techniques available to the JavaScript JIT.
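
To put the dynamism in concrete terms, here's a minimal pure-Python sketch (my own illustration, not from the parent): any code, including a C extension holding a reference, may rewrite a class at runtime, so nothing about an object's method layout can be assumed stable the way a monomorphizing JIT would like.

  class Point:
      def __init__(self, x, y):
          self.x, self.y = x, y

      def norm(self):
          return (self.x ** 2 + self.y ** 2) ** 0.5

  p = Point(3, 4)
  print(p.norm())  # 5.0

  # Any caller may swap the method out at runtime; CPython has to honour this
  # everywhere, including across the C API boundary.
  Point.norm = lambda self: abs(self.x) + abs(self.y)
  print(p.norm())  # 7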

3 years ago by epidemian

JS has been heavily optimized by browser vendors, fueled by the intense competition in the browser space and the huge resources invested by companies like Google.

IIRC Firefox 3 was the first browser to have what we would call a modern JS engine, with heavy emphasis on JIT compiling. Google Chrome soon followed and V8 dominated the JS perf story.

Python doesn't have such fierce competition among implementations. And besides, it has the "escape hatch" of being able to implement performance-critical code paths as C extensions, which is what heavy-lifting libraries like numpy do.
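
As a minimal sketch of that escape hatch (assuming numpy is installed): the pure-Python loop below runs through the bytecode interpreter on every iteration, while the numpy version pushes the whole loop into compiled C.

  import numpy as np

  def sum_of_squares(values):
      # Every iteration is interpreted bytecode plus boxed float objects.
      total = 0.0
      for v in values:
          total += v * v
      return total

  data = list(range(1_000_000))
  arr = np.arange(1_000_000, dtype=np.float64)

  print(sum_of_squares(data))   # interpreted loop
  print(np.sum(arr * arr))      # same computation, done in C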

3 years ago by est

> Python doesn't have such fierce competition of implementations

Python has many competing implementations, but the language lacks a formal spec and the only reference is CPython.

3 years ago by el_oni

I think they were referring to a lack of interpreter competition. Sure, there is PyPy, Jython, IronPython, etc., but they aren't competing. They are just used for different use cases.

In the browser space there has been a lot of competition to make the fastest JS runtime, because it makes the browser faster.

3 years ago by pansa2

> It seems to me that both languages are interpreted

Node.js uses the V8 engine for JavaScript which isn't just an interpreter - it includes a JIT compiler.

3 years ago by dec0dedab0de

Yes, the JIT has got to be the biggest difference. I wonder how Node compares to PyPy.

I think when GvR mentions "There's machine code generation in our future" he is talking about a JIT.

3 years ago by throwaway894345

I suspect Node is still quite a lot faster than PyPy because the former benefits from V8, a well-funded project that doesn't have to worry about compatibility with the sprawling C-extension interface that CPython exposes.

3 years ago by ForHackernews

I think mostly because Google has put a massive amount of money and effort into making V8 faster.

Performance has always been a lower-tier priority for CPython, and other, faster implementations (like PyPy) have not seen much mainstream adoption.

3 years ago by fdej

I'd like to see optimizations targeting ctypes, or a successor to ctypes. I wish I could write elegant, performant C wrappers in pure Python. Right now the best choices are Cython, which is a hassle (separate, slow compilation, various warts), and ctypes, which is slow (and has some design problems of its own). Julia's ccall alone is a major selling point over Python right now.
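
For reference, this is roughly what the ctypes route looks like today (a minimal sketch assuming a Unix-like system where find_library can locate libm); every call goes through ctypes' generic argument marshalling, which is where the per-call overhead comes from.

  import ctypes
  import ctypes.util

  # Load the C math library and declare the signature of cos() by hand.
  libm = ctypes.CDLL(ctypes.util.find_library("m"))
  libm.cos.argtypes = [ctypes.c_double]
  libm.cos.restype = ctypes.c_double

  print(libm.cos(0.0))  # 1.0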

3 years ago by ihnorton

CFFI [1] is a step in the right direction, inspired by LuaJIT's FFI. It originated from the PyPy folks and is supported in PyPy [2]; it's also supported to some degree by Numba [3]. I don't know what level of C call optimization is available when using the JITs, so I can't speak to the performance, but I've used it casually via CPython and was impressed by the API. That said, it has been around for a while and the traction seems somewhat limited -- I would guess because most people who have this kind of problem also need more than "just" FFI.

[1] https://cffi.readthedocs.io/en/latest/index.html [2] https://doc.pypy.org/en/latest/extending.html#cffi [3] https://numba.readthedocs.io/en/stable/reference/pysupported...
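
For comparison with the ctypes sketch above, CFFI's ABI mode looks roughly like this (a minimal sketch assuming a Unix-like system; the declaration is pasted by hand rather than parsed from a header):

  from cffi import FFI

  ffi = FFI()
  ffi.cdef("size_t strlen(const char *s);")  # declaration copied from string.h
  libc = ffi.dlopen(None)                    # None loads the standard C library

  print(libc.strlen(b"hello"))               # 5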

3 years ago by BiteCode_dev

It's not Nuitka's main purpose, but it can produce a lib and speed up the code: https://nuitka.net/

3 years ago by simonw

I'm excited about WASM here. https://github.com/wasmerio/wasmer-python lets you call WASM binaries from Python, which means if you can compile a C library to WASM you can then call it from Python... without having to worry about introducing crashing bugs and security vulnerabilities thanks to the WASM sandbox.

3 years ago by adsharma

@fdej: what successor to ctypes did you have in mind? Other than the names (I prefer i8 over c_int8, but easy to work around) and lack of ergonomic interop with python ints, are there other issues you're aware of?

3 years ago by fdej

My number one wish would be the ability to read definitions from .h files automatically. Other than that, I've had some issues with memory management with ctypes (objects being deallocated prematurely) that I never had with Cython, but that may just be my own fault. I agree about the lack of interop with Python ints.
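
One classic case of that premature deallocation (a sketch of the pattern the ctypes docs warn about, assuming a Unix-like system; not necessarily my exact issue): callback objects passed to C must be kept alive on the Python side, or they can be collected while C still holds the pointer.

  import ctypes
  import ctypes.util

  libc = ctypes.CDLL(ctypes.util.find_library("c"))
  libc.qsort.restype = None

  # qsort() comparison callback: ctypes wraps the Python function in a C-callable thunk.
  CMPFUNC = ctypes.CFUNCTYPE(ctypes.c_int,
                             ctypes.POINTER(ctypes.c_int),
                             ctypes.POINTER(ctypes.c_int))

  def py_cmp(a, b):
      return a[0] - b[0]

  cmp_cb = CMPFUNC(py_cmp)  # keep this reference; if it is garbage-collected
                            # while C still holds the pointer, the call crashes

  arr = (ctypes.c_int * 5)(5, 1, 7, 33, 99)
  libc.qsort(arr, len(arr), ctypes.sizeof(ctypes.c_int), cmp_cb)
  print(list(arr))  # [1, 5, 7, 33, 99]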

3 years ago by morelisp

> My number one wish would be the ability to read definitions from .h files automatically.

This was originally a feature of ctypes but was dropped before stdlib inclusion because it's difficult to solve in the general, "public distribution" case. Nonetheless, I don't think there's any need to replace ctypes to reintroduce it; it was done as a library / external tool before and still could be.

https://svn.python.org/projects/ctypes/trunk/ctypeslib/ctype...

3 years ago by milliams

I posted the link to the first PEP yesterday (https://news.ycombinator.com/item?id=27134290). I think this looks like a very promising project.

3 years ago by qteax

25-50% has been hoped for many times in the past 3 decades. I don't find that very impressive.

CPython got big because of its relatively decent C API and glue capabilities. 25% does not make a difference; decent C extensions have speedups on the order of 10-100 times (yes, times, not percent).

If I had a dollar for every time someone proposed a Python speedup ...

3 years ago by pjmlp

Python is feeling the pressure of not having a JIT out of the box.

Microsoft is reactivating their JIT project for Python as well.

Their talk tomorrow:

"Talk: Restarting Pyjion, a general purpose JIT for Python – is it worth it?"

https://us.pycon.org/2021/schedule/presentation/52/

3 years ago by aztec100

There have been many similar talks in the past decades. At some point Unladen Swallow was presented as "Google's project" (which is quite exaggerated).

It looks more like Microsoft has JIT envy now that Instagram and others have open sourced (quite restricted) JIT projects and has to show presence again.

I think every CPython JIT project will be full of corner cases in both performance and behavior, so it will add to the growing database of weird Python behavior that users have to memorize.

3 years ago by klyrs

> If I had a dollar for every time someone proposed a Python speedup ...

Well, that's just the thing. Many have tried, many have succeeded, but their work hasn't been upstreamed so adoption is almost nonexistent. This is different. It's finally a mainline project. That's noteworthy, and long overdue given the plurality of proofs-of-concept.

3 years ago by MR4D

I think you are right to be skeptical, but this is GvR with Microsoft's money.

So hopefully the combination of those two can make it work.

3 years ago by Too

If you had claimed JavaScript was fast 10 years ago, people would also have laughed in your face. Today's JS is 10 times faster or more, thanks to the JIT.

3 years ago by The_rationalist

The Python JIT that promises the highest throughput once it reaches maturity is https://github.com/oracle/graalpython
