numpy works on pypy these days. You have to install it via pip, and since there are no wheels it takes a few minutes to build, but otherwise it works. Numpy-heavy workloads will not be sped up by pypy, but depending on what exactly you are doing, other parts of your python program might be faster.
We are actively trying to secure funding, with the goal of making numpy fast now that it works, and we are working together with the numpy community to make sure it stays working.
The blog could be written by Tralfamadorian trolls and it wouldn't matter - so long as you trusted that "pypy.org" was part of the PyPy project and that its link to PayPal was valid.
I don't think your last comment is relevant. What I think you are saying is that the blog should be hosted under pypy.org and not blogspot. It could be WordPress or any other system.
It couldn't easily be Pelican because the blog supports comments, which means integrating with some other commenting system - plus authentication and spam detection. Why not get rid of the hassle of manually adding files and rebuilding the server and just use blogspot?
Lastly, are you really saying that people aren't donating to PyPy because some supportive blog site somewhere else, run by who knows who, isn't hosted on pypy.org?
yeah - because people don't go to the homepage. What gets tweeted out is this blogspot page (including here on HN). So people check out the nice blog post, see a donate link on the left of https://morepypy.blogspot.com/ and then go ...hmm, looks shady.
Not many read the blog post, then go to Google, search for pypy and then click the link and then donate.
Fundraising is not an easy process - there is an aspect of sales here. Blogspot may be easy, but it's not good branding. And that was my point - because I want them to succeed.
Please explain how it looks shady, because I don't understand it.
If I start a blog - completely independent of PyPy - which promotes PyPy and describes the development efforts, and I want to encourage people to support the PyPy project, then I might link to the PyPy donation page, yes?
Yet, because I'm not affiliated with the project, I can't put the blog under pypy.org.
Would my blog also be shady? If so, doesn't that mean that no one should promote donating to a project unless that promotion is done under the project's domain?
You write "Not many read the blog post, then go to Google, search for pypy and then click the link and then donate."
That would be true even if the blog post were hosted under pypy.org. That is, why would anyone donate after reading one blog post?
I would think that more people would donate after they download PyPy, try it out, and see that it's worthwhile enough to fund.
In that case, they've already found out how to get PyPy, so there's no need to "Google, search for pypy" because they've already done that.
Nor is PyPy unusual in this respect. It's hard to find projects which have both a blog and a donate page. Of the 50 or so projects I looked at, the following are "shady" according to your definition:
- Orange, with a blog at https://blog.biolab.si/ is a sibling subdomain to the main project site at https://orange.biolab.si/ , making it ... semi-shady? I mean, how are we to know that "biolab.si" is not some sort of large ISP where any of their customers can make a subdomain?
I did find two places where the "donate" and "blog" pages were hosted on the same domain: http://rssbandit.org/ (which links to a SourceForge page which redirects to PayPal) and http://sagerss.com/ .
To summarize: I don't see why it's shady; your view seems to imply that no one should promote donating to a project except the project itself; static site generation tools don't seem to be a good fit for blog support; and the effort to move everything to a single domain doesn't seem likely to have a worthwhile effect on the donation rate.
Your examples are weird - I'll try to explain again if I wasn't clear before.
I'm not saying that the blog is bad and you are explaining why the blog is good.
I'm saying it is hosted on blogspot while the main site is at pypy.org. People don't think the way you do. Try it out - ask 10 people if they would open this blogspot page and ever click on the donate link. PyPy's blog is not shady - blogspot as a free hosting service is shady.
Your examples of mozilla etc. blogs are completely orthogonal - they are hosted on an authoritative domain. Again, I'm repeating - it has nothing to do with PyPy. It's about blogspot.
You are again attacking my suggestion of a static site - the reason I suggested that was because it's a python-based site generator with a CMS. You could try a Wix site or whatever, as long as it's on an authoritative domain.
It's not about me - it's how people think. You should go and check.
So if someone read a blog post under gitlab.com/gitlab_blog/, then they still can't trust the donate link but must instead, as you wrote earlier "go to Google, search for [gitlab] and then click the link and then donate" .. though neither takes donations on their home page.
How is the PyPy situation any worse for hosting the blog on a non-authoritative URL?
I've attempted to run it (not with this latest release yet though) and it appeared as if calling a numpy function essentially disabled the JIT for that function, which was not acceptable for us at that point. What did seem to help curiously was wrapping the numpy function in a python function and then calling that, which seemed to prevent the JIT being disabled in the calling function.
PyPy is fantastic. In my opinion, if they manage to be faster than CPython in all scenarios (I'm thinking of the CPython C API abuse in Numpy/Scikit/Pandas) and get full Windows support (they do not support multiprocessing on Windows), they WILL be the preferred Python runtime.
I doubt they'll ever get even as fast as CPython for CPython API usage. What allows PyPy to run Python code fast is the same thing that makes the CPython API slow: The two implementations look nothing alike on the inside. The API is too leaky an abstraction to implement efficiently on anything but CPython.
Actually, there's been quite some speculation that PyPy could inherit the 2.x crown when CPython expires its support for 2.x (1 Jan 2020). PyPy is probably the best non-CPython Python 2.7 implementation, so it's a natural fit to some degree. I suppose it's all dependent on where the PyPy team wants to spend its time and whether or not someone might fund them for this effort (Canonical, RH, etc).
I don't know that they have the will to bug-for-bug reproduce CPython but the C API emulation layer added within the last few years probably means that it will be really high fidelity.
I don't know that I agree that it's a fiasco. I've been asked to work on a PDP11 in the last few years and while it was mind-boggling (the request -- I turned it down), it was understandable from a business perspective. So I can see how some businesses might want Just A Little While Longer on Python 2.x.
shrug, use python 3.x and don't sweat the fact that some folks are still stuck on 2.x. You certainly can't stop someone from offering support for 2.x -- the language is well defined and the reference implementation is open source. Anyone out there could put up a "Python 2.x Support: Cheap!" sign and be in business.
Reinventing the wheel keeps the circus going forever. As long as Python 2 is supported, I don't have to touch any of the scripts I have to "maintain". The moment I am forced to migrate everything to Python 4 (they broke everything already, why not again?) I will have to deal with a huge amount of pointless busywork just trying to restore functionality.
Funnily enough, I can still watch VHS movies on a relatively new TV. I just had to plug in the VHS recorder and play; meanwhile, the description of 2to3 makes it look like plug and pray.
I mean, really? A script that might mention, as warnings, some cases it can't fix. "Optional" fixes that replace missing classes with incompatible replacements, giving you the "choice" of fixing the mess by hand. Also an apparently unstable API for your own fixers - doubling down on breaking your code while you try to fix your broken code takes dedication.
> Note: The lib2to3 API should be considered unstable and may change drastically in the future.
Well, 2to3 works surprisingly well and I've converted tons of projects with it. Most major Python projects support both 2 and 3. So, private codebases which "just work" should stay on the "just working" old versions of the interpreter instead of insisting that maintainers waste time entertaining their laziness - because things that "just work" don't require ANY updates, right?
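For what it's worth, the common fixers really are mechanical. An illustrative before/after (my own toy snippet, not from any project I converted):

```python
# Python 2 original:
#     d = {"a": 1, "b": 2}
#     for k, v in d.iteritems():
#         print k, v
#     squares = [i * i for i in xrange(5)]

# After running `2to3 -w script.py`:
d = {"a": 1, "b": 2}
for k, v in d.items():               # iteritems() -> items()
    print(k, v)                      # print statement -> print() function
squares = [i * i for i in range(5)]  # xrange() -> range()
```

The hairy cases are bytes/str handling and the like, which no tool can fix mechanically - but the bulk of a typical codebase is stuff like the above.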
I am happy as long as I can plug in old versions of the interpreter on a system 10 years from now as easily as I could plug my VHS recorder into the TV. I might be able to do that by installing a Windows version of Python under Wine; there is a good chance (~100%) that the dependencies of Python on Linux will not remain compatible that long.
Yeah, but dependencies are another story. So, my point is, if something "just works" and you don't want to have to upgrade, you need to freeze the entire system with all dependencies, which includes hardware, OS, etc. Everything else is just an illusion.
I still have people using my Win16-based software written in Borland Pascal with Objects that I wrote 30 years ago. They keep changing power supplies etc. on an old system, but recently they had a major hardware problem and asked me to help them. I couldn't and wouldn't.
They ended up having somebody create a VM from their hard drive, and they are happy now: they can back up that image and keep the circus going forever.
If pypy wants to support 2, they can. If developers want to take advantage of that, they can. If library maintainers want to drop 2 support, they can. If others want to fork the libraries to provide security for the 2.x fork, they can.
What is not acceptable is trying to dictate what technology others choose to use to suit your personal preference.
Well, there's demand for McDonald's today as well, and people happy to supply it! The Python community has dragged this circus on for way too long; it is now finally being put to an end. But there will always be dinosaurs who won't embrace progress and innovation, of course.
I could kiss this comment. You don't have a right to stop other people using py2 just so your ecosystem is a little cleaner. Yet that level of entitlement is apparently not uncommon in Python discussions.
There are two things in play here - one is that pypy is a large py2-only codebase that is unlikely to be ported. Second is that maintaining python 2 does not cost much, and it's used by the majority of our users, despite what you hear on the internet. So there is both good reason to keep it and a large amount of work involved in dropping it.
I might be mistaken, but it was my impression that the changes to the 2.7 version now are mostly "core" changes that also apply to PyPy3. Basically, most language-specific work is going toward Python 3, and PyPy2 benefits when features they have in common are improved.
It'll be interesting to see if there's an upswing in PyPy usage over the next few months, for the explicit reason that they're not slated to drop 2.7 compatibility. I would imagine a noticeable percentage of 2.7 codebases without the resources for a rewrite (which has, by definition, no business value) would welcome anything close to a drop-in replacement. I'll be very interested to see if I'm wrong though!
3.7 is a minor update from 3.6 that basically consists of changes to the standard library rather than the language, and I think 3.8 will be the same, so once 3.6 support is implemented it will take less time to implement 3.7 support.
Dataclasses are a sweet feature that I can see getting wide adoption pretty quickly. This said, they have a backport and it's mostly just sugar, so I expect one could probably get them working on 3.6 pypy.
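For the curious, here's a minimal sketch of the sugar involved (my own example; on 3.6 the `dataclasses` backport from PyPI provides the same module):

```python
from dataclasses import dataclass, field

@dataclass
class Point:
    x: float
    y: float
    tags: list = field(default_factory=list)

# __init__, __repr__ and __eq__ are all generated for us
p = Point(1.0, 2.0)
q = Point(1.0, 2.0)
print(p)       # Point(x=1.0, y=2.0, tags=[])
print(p == q)  # True
```

Since it's almost all compile-time code generation rather than new runtime machinery, there's nothing obviously PyPy-hostile about it.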
I doubt that they'd let you work on a VM there; I saw only marketing and content jobs. In Munich or Zurich they do have VM teams. Maciej is from Warsaw; he could go surfing in Munich. Armin Rigo is in Switzerland.
How realistic is it for an experienced programmer to dedicate 4 hours a week and join as a contributor? Not sure it is enough to bootstrap enough "project" domain experience to be productive. Asking for a friend.
I don't have a suitable workload with me right now. Has anyone tested the performance of PyPy3 vs PyPy2? Last time I tested it on a problem, PyPy2 was significantly faster (at least x3). That was maybe one or two years ago, when it was in beta.
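If anyone wants to compare for themselves, a tiny 2/3-compatible script like the one below (arbitrary pure-Python workload, not a serious benchmark) can just be run as-is under pypy, pypy3 and python3:

```python
from __future__ import print_function
import time

def work(n):
    # pure-Python arithmetic loop: the kind of code PyPy's JIT helps most
    total = 0
    for i in range(n):
        total += (i * i) % 7
    return total

def bench(n=1000000, repeats=3):
    # report the best of a few runs to reduce noise
    best = None
    for _ in range(repeats):
        t0 = time.time()
        work(n)
        elapsed = time.time() - t0
        if best is None or elapsed < best:
            best = elapsed
    return best

if __name__ == "__main__":
    print("best of 3: %.3fs" % bench())
```

Obviously a micro-benchmark like this says nothing about real workloads, but it's enough to spot a 3x gap between interpreters.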
At work we had this internal service that did search entirely in memory, using sorted lists. When it was initially written, years ago, that was deemed "good enough".
By the time I joined the company a little over a year ago, search requests were sometimes exceeding 2 minutes, and would be killed due to HTTP timeout (somehow users were OK with search taking a long time, they were just annoyed that now sometimes it didn't work at all). While I was working to rewrite the whole thing to push data to ES and do everything from there, the old system needed to keep running.
So I moved it over to PyPy. I did need to swap blist (a C extension) for sortedcontainers (pure Python), but that was a minimal change. Search got about 30% faster - enough to keep chugging along while the rewrite happened.
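For anyone unfamiliar with the pattern: the sorted-list search idea can be sketched with the stdlib `bisect` module (sortedcontainers' SortedList implements the same operations with better scaling; this toy class is mine, not the actual service code):

```python
import bisect

class SortedKeys(object):
    def __init__(self, items=()):
        self._keys = sorted(items)

    def add(self, key):
        # O(n) insert due to list shifting, but O(log n) position lookup
        bisect.insort(self._keys, key)

    def range_query(self, lo, hi):
        # all keys with lo <= key < hi, via two binary searches
        i = bisect.bisect_left(self._keys, lo)
        j = bisect.bisect_left(self._keys, hi)
        return self._keys[i:j]

s = SortedKeys([5, 1, 9])
s.add(3)
print(s.range_query(2, 9))   # [3, 5]
```

Being pure Python is exactly why this kind of code benefits from PyPy's JIT, while a C extension like blist sits outside what the JIT can optimize.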
The place where I work uses it for back-end report generation. In my experience, PyPy's claimed speed benefits are very realistic. For most of our code, we get about a 3x speedup. For some areas of code, we get up to a 7x speedup. The only compatibility issues we've run into stemmed from mistakes on our part (e.g. all dicts are effectively OrderedDicts in PyPy, but your code will break in CPython if you rely on that implementation detail).
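To illustrate that pitfall with a toy example (mine, not our actual code): if you need ordering, spell it out with OrderedDict instead of leaning on a plain dict that merely happens to preserve insertion order on your interpreter:

```python
from collections import OrderedDict

# On PyPy (and CPython 3.7+) a plain dict would preserve this order too,
# but on CPython 2.7 it would not - OrderedDict guarantees it everywhere.
d = OrderedDict()
d["first"] = 1
d["second"] = 2
d["third"] = 3

print(list(d))   # ['first', 'second', 'third']
```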
As a side note, for some reason, it's terrible for pytest — your tests will take literally 100x as long. If you trust PyPy's compatibility, I'd recommend running your tests with CPython even if you run PyPy in production.
For what it's worth, I was using it for a Django project and it did speed up noticeably - but I also ran into mysterious crashes. Now, I can't pin it to pypy necessarily, might have been some Django lib not playing ball with pypy. Had no time to test it in depth because moving back to regular Python was "good enough".
Why do you think so? Rust is an interesting language in itself, but I don't see why you'd expect a Rust port of CPython to compete with PyPy in terms of performance. It's still, at its core, the CPython architecture.