It was an interesting read. I'm not sure if they are like the shark - already dead, but the message hasn't gotten to the swimming part yet - or a harbinger of the future.
One of the things engineers are going to have to come to grips with is that you can actually be "done" doing new design. It's really really hard in FOSS stuff because bug fixing is so much less rewarding than new feature development. But programming languages are tools, and if you're familiar with tools in the physical world you realize that once you get to a certain optimum, there isn't a lot of 'new feature' to add.
The nature of tools is why a good oscilloscope or bandsaw is still a good oscilloscope or bandsaw 25 years later. It does what it needs as well now as it did when it was new. If there is some new 'space' to work in, you might need a variation of the tool, but the basic tool is fine.
Computers, and computer tools, are maturing. We've seen this in the slowing of the upgrade cycle and in the resistance to change that the OP writes about: people are OK with their tools. That will be a different world of computers than we are used to, I suspect.
I'm not sure if your dead shark is referring to Eve or Python 3.
I'm of two minds here. I currently work for Google. I doubt Google will ever move to Python 3. There's just too much legacy code, not enough certainty that upgrading it won't introduce bugs, and too little business reason to switch.
However, I also try to stay reasonably current on technologies available outside of Google, mostly out of paranoia that I'll end up one of those irrelevant big-company employees. And when I try to weigh all of the technology options I might use for a startup against my accumulated experience and what I want in technology infrastructure - Python 3 still stacks up fairly well. I'm not entirely certain I would use it - Go is an intriguing new option, and the non-Java JVM alternatives have gotten a lot better since I was last in the startup world in 2008. But the big changes in Python 3 - Unicode and async - have helped it stay competitive, and disciplined use of function annotations could help eliminate much of the maintenance/documentation problems of not having static typing, and it still is way beyond the competition in terms of convenient syntax and helpful abstractions.
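To make that last point about annotations concrete, here's a minimal sketch of the kind of discipline I mean (the function and names are invented, and Python itself doesn't enforce these; you still need convention or tooling around them):

    # Hypothetical example: annotations carrying the type documentation
    # that would otherwise live in docstrings or on a wiki.
    def transfer_isk(sender: str, recipient: str, amount: int) -> bool:
        """Move `amount` ISK between two wallets; return True on success."""
        if amount <= 0:
            return False
        # ... real wallet logic would go here ...
        return True

    # Annotations are introspectable, so tooling can check or render them:
    print(transfer_isk.__annotations__)
    # e.g. {'sender': <class 'str'>, 'recipient': <class 'str'>,
    #       'amount': <class 'int'>, 'return': <class 'bool'>}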
In general tech infrastructure has about zero chance of gaining adoption in existing large enterprises, because the costs of switching are prohibitive regardless of how good it is. Companies stick with whatever was popular when they were founded, which is why Google (1998) is still a C++/Java shop, Facebook (2004) still uses PHP, and Dropbox (2007) is all Python. But that's not how new technologies get adopted. They get adopted by old companies dying off and getting replaced by new ones, and as long as tech companies continue to die (which seems a virtual certainty), there will be room for new languages and tools.
Agreed - this pulls together the ideas that maintenance is 90% of software, that Moore's Law really does appear to be ending, and that the choice of our tools has a lot less to do with "best tool for the job" than with a sort of gravitational shifting between evangelists and convenience.
I have not actually met anyone who moved to 3 because 2 was not good enough, or who had major issues with, say, unicode that they could not solve any other way.
Let's face it, there are going to be projects that will NEVER be ported to Python 3. Not a couple, a lot. Why should they be? If they're using the software to make money and it works, porting it to Python 3 may just be too expensive and too risky.
The point is that, as time passes, hopefully more new projects will be created in Python 3, and they can drive the development of more and more modules and tools, until eventually all new projects are in Python 3 and Python 2 is just legacy.
Legacy code happens. And for good reasons. And there are lots of people working on them and earning money. Software is a means to an end. I love to use new technology and using old tech drives me crazy, but from the point of view of business, it's a decision that makes sense. There are still COBOL systems that behave perfectly.
EVE is a game. It currently works. Its players don't care if it's done in Lisp, x86 assembly or Haskell. The people working with the code care, but migrating it to Python 3 is simply too costly and risky to do. And it probably always will be.
If they start a new project, they may be starting it on Python 3. And that's fine. That's the way it should be.
If they're going to keep developing their game for 20 years, and they're smart, then they'll migrate sooner or later; otherwise the weight of finding developers willing to work in the ancient language that python 2.7 will become, and of teaching them that language, will drag them down, costing them more in the long term than a migration would. Just as it has for COBOL projects. It sounds like they are making efforts to pay down their technical debt; once they have decent test coverage, the move will be much less intimidating, and it sounds like their string-handling library is debt they really want to be rid of - it's just not top priority at the moment.
For projects that are being mothballed in "maintenance mode", sticking to the old tech makes sense. But for a project being actively developed, either you pay off your technical debt, or you pay interest on it forever.
Maybe that's true and maybe it's not. Unlike COBOL, Python 2 isn't inferior to its putative successor. Maybe people will eventually move anyway. Maybe they won't. Maybe they will, but not until after we are all dead.
Or maybe somebody will say bugger this for a game of soldiers, take the current Python 2, add such features from 3 as can be backported without breaking compatibility and release it as Python 4, and everyone will move to that and forget about Python 3 altogether. I don't know that will happen, of course, but I don't know it won't either. Prediction is hard, especially about the future.
I'd love to see an example of a project that has been actively developed for 20 years and has made a similar change. Linux comes to mind as a project old enough that has been kept updated regularly, but it has always been good old C (not sure, though; maybe there have been some changes introduced).
(I'm not trying to be sarcastic, I think it will be interesting to see how they faced that kind of change)
You're right, migration may be needed at some point. But "at some point" is a very fuzzy definition, and my guess is that it will be later rather than sooner, given the complexity of it. Paying interest on the technical debt this year, and saying "we'll talk again next year" indefinitely, may well be an option if getting rid of the technical debt is too costly. Sure, you'll pay more in the long term, but each year you'll face the same question: "Should I do this costly long-term solution, or just keep going with relative pain for a while?"
Except 20-year-old code bases tend not to need much development.
PS: At one point I was maintaining one by myself and noticed hey, I don't actually have any open issues at all. Which compared to most software development is an odd place to be.
I don't think it should surprise anyone here that users with enterprise-size Python codebases aren't jumping to Python 3. PEP 373 doesn't end support until 2015, and I assume at that point RHEL or maybe even someone in the Python community will pick up the torch (a la Rails LTS).
I think by far the more interesting question is what is going on with today's green-field projects (a.k.a. the enterprises in the making). Whoever steps forward to maintain 2.7 long-term will probably be charging an arm and a leg (Red Hat, I'm looking at you), so of course this will be unaffordable for bootstrapped companies, and ordinary folks will not be able to afford a security-patched Python 2 install.
Meanwhile the statistics I've seen for green-field development are mixed-to-promising for Python 3. I know for my new projects I'm over 50% 3.3. And when the shoe drops with 2.x support EOL, I think that's going to give it a serious kick.
All the new python3/python2 libraries I've noticed ad hoc recently have been doing the horrible 'polyglot' one-code-base-support-python2-and-python3 thing, or been in python2.
Beyond not getting to take full advantage of python3, it's really a relatively painless solution (though when you hit the pain points, bytes vs strings vs unicode, you'll notice).
There's nothing particularly wrong with it (honestly polyglot should have been python 3.1, with new features rolled out in 3.2, 3.3, etc., in my opinion); but if the reason for using python 3 is that it has shiny new features (like async io), it's a bit of a bummer because you can't use them if your library is py2 compatible.
...which kind of makes it 'python 3 but actually python 2 running on the python 3 runtime', and the code is harder to maintain than either python 2 or python 3 (twice as much testing to do too).
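For anyone who hasn't seen it, this is roughly what the polyglot style looks like in practice - a hedged sketch with invented names (real projects usually lean on six for the shims):

    # Runs unchanged on Python 2.7 and 3.x, at the cost of version checks
    # and compatibility shims scattered through the code base.
    from __future__ import print_function, unicode_literals

    import sys

    PY2 = sys.version_info[0] == 2

    if PY2:
        from urlparse import urlparse          # Python 2 module layout
    else:
        from urllib.parse import urlparse      # renamed in Python 3


    def host_of(url):
        """Return the host of a URL as text on either interpreter."""
        if isinstance(url, bytes):             # the bytes/str pain point
            url = url.decode("utf-8")
        return urlparse(url).netloc


    print(host_of(b"https://example.com/market"))   # -> example.com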
Not really much of a carrot for library developers to support python 3.
What kind of library could transparently use async IO, or not, based on platform? Using explicit async for anything dramatically changes the behavior and public API of the code.
There is of course a huge carrot for library developers to support python 3, which is so that python 3 users can use your library, rather than having all of your work replaced by something else and generally holding back the community.
Well I suppose it's all in how you interpret them. For example, if you look at https://wiki.python.org/moin/2.x-vs-3.x-survey?action=Attach... which shows 22% Python 3 vs Python 2 adoption, perhaps that sounds bad. But if you imagine that those 22% are the green-field projects and 78% are the people maintaining Eve Online, then that is a great outcome for Python 3. So I guess the question hinges on whether you believe Python 3 usage is highly correlated with green-field development, which is a belief I hold.
> I presume you've read
There are a lot of things I dislike about this article. One is that it acknowledges that library support can pull you to Python 2, while totally ignoring that library support can also pull you to Python 3. For example, as a result of spending a lot of time in Python 3 I have a lot of Python 3 code (e.g. what was a month ago green-field library development). These libraries are neat, and I want to use them, and it is a requirement for using them that I use them from Python 3, and they cause me to use Python 3 in the same way that Twisted's Python 2-only support causes some other developer to go to 2-only. The pendulum can swing both ways, and it's up to us as library authors to decide which way we want to swing it. In my view, the real villains here are library authors who have decided to make Python 2.x a first-class citizen and add new features to the 2.x versions of their library instead of designating 2.x as bugfix-only like the Python Software Foundation did long ago.
The second thing I dislike about the article is that it ignores the obvious solution: port package X to Python 3. Now for very large libraries this may not be feasible, but certainly the subset you're going to use in your project could be ported, and of course the entire library can be ported for smaller cases. We all need to be doing more to give back to the open source community; most of us take and then give nothing, and this is not a good combination. The fact that "submit a patch" is never discussed as a possibility in these "boohoo Twisted doesn't support Python 3" discussions is beyond ridiculous and exposes what I think is a very serious problem in the community. Are we hackers or are we whiners? So I think we should hack, and if we want package XXX for Python 3 we should all pitch in and work on that.
> All the new python3/python2 libraries I've noticed ad hoc recently have been doing the horrible 'polyglot' one-code-base-support-python2-and-python3 thing, or been in python2.
uh, what's better? two codebases? heh, hardly. Trust me.
As someone who works in VFX I can sympathise with CCP on this.
However the real question is: why should we port everything to 3? None of the software in VFX uses it, plus a lot of people aren't really up to speed with the changes.
What is the point? Yes, the language is "purer", but that doesn't make my life easier...
It's a chicken and egg situation - until VFX houses (often with close to a million lines of Python code integrating stuff) need Python 3 (I don't see why they would based on the new features), they're not going to ask companies like The Foundry and Autodesk to add support for it.
And the commercial companies aren't going to waste dev (and huge amounts of test time) on a feature 95% of their customers don't want or need.
Houdini's got dual 2/3 support, but I find it very difficult to believe any big studio is using the 3 support (as it wouldn't integrate with the rest of the pipeline).
If you haven't already written your own horrible hacky string handling framework, python 3's unicode support is much nicer. And IMO, for projects of the size I work on, super() alone justifies the upgrade.
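A tiny sketch of the super() difference, with made-up class names, for anyone who hasn't tried 3 yet:

    class BaseExporter(object):
        def export(self):
            return "base geometry"

    class SceneExporter(BaseExporter):
        def export(self):
            # Python 3: zero-argument super(), no class name to keep in
            # sync when classes get renamed or re-parented.
            return super().export() + " + cameras"
            # The Python 2 spelling of the same call would have been:
            #   return super(SceneExporter, self).export() + " + cameras"

    print(SceneExporter().export())   # -> base geometry + cameras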
But I think to talk about porting is to ask the wrong question. The main reason for python 3 isn't to port existing projects, it's to make python a good choice for new projects.
You correctly point out that unicode is much, much better, but how often are you dealing with it? I know that I certainly don't have to think about it in my pipeline.
In some projects, not at all; in others, most of the time. But it's string/unicode that takes up most of the time and effort in porting; projects that don't involve much string handling are pretty trivial to port to python3, IME.
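A contrived sketch of the kind of boundary I mean; Python 2 would silently coerce here, while Python 3 makes you decide where bytes become text:

    # The str/bytes boundary that eats most of the porting time.
    raw = b"J\xc3\xb3hanna,Reykjav\xc3\xadk"   # bytes as read off disk or the network

    try:
        city = "City: " + raw.split(b",")[1]                  # str + bytes: TypeError on 3
    except TypeError:
        city = "City: " + raw.split(b",")[1].decode("utf-8")  # decode explicitly

    print(city)   # -> City: Reykjavík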
High-end wasn't mentioned before. But I know, I know, blender is often just not an option. Although I don't find the interface as bad as everyone is saying. But then again, I'm a software developer and I don't find GIMP's interface that bad either.
And it's what free/open source software gets wrong almost all the time (and I say that as an advocate of it), due to the fact that devs rarely listen to users. The features are often there, but they're not that easy or flexible to use, so often they're practically useless in certain situations.
There was something last year when someone came up with some workflow/design improvement suggestions for blender, and the developers just didn't understand why they were necessary, saying "it's possible to do stuff in blender".
One would think that this is not the case for blender, because of the blender movie projects where users and developers work in the same room.
> There was something last year when someone came up with some workflow/design improvement suggestions for blender
Do you mean these awful ribbons? IIRC he also suggested different views for different tasks, even though blender has always had that feature! What was a valid point were the inconsistencies in the interface and some minor things.
I think that statement can often be taken in good and bad ways, depending on the devs.
Some software I love and use is developed by people who do not care what their users think because they are their own users. They build for themselves and produce excellent results.
Some developers listen to users too much, and end up allowing every piece of their software to be configurable. Even if a very small percentage of users actually needs that configurability. They then lose focus on simplicity and elegance that the majority of their users were attracted to in the first place.
Of course, it goes the other way as well and many pieces of software are better due to user feedback.
"listen to the users" does not actually mean "change the software the way the users say to" it means to listen to how they're failing to achieve the results they are trying for and use that knowledge to improve the software. Listening to users does not preclude sticking to UI conventions and such since your users are unlikely to be software or UI designers themselves.
The point of keeping the source code on the latest version is to guarantee its survival in the long term: at some point you will have to pass ownership to another, younger dev who might not have been taught this older technology, or port your script to newer hardware, and eventually deal with a sub-optimal or incomplete VM.
Either you keep maintaining against the latest version and spread the cost over time, or one day you will have to start from scratch, and probably lose more time and money doing so.
See every organization still running on XP? They may be sentenced to death in 3 months; that may be the hardest way to learn it.
"See every organization still running on XP ? They may be sentence to death in 3 month, that may be the hardest way learn it."
Huh? XP will just keep chugging along, for another decade probably in some places. There are still businesses being run on DOS software, FFS. Which is easier: paying for a complete replacement somewhere in the next decade or two, or upgrading everything every few years to stay with the times?
(written from Windows XP and not likely to move for at least another half year...)
I'm sure that a lot of XP machines in the corporate world aren't connected to the internet either. For example, there's no reason to give a bunch of low-level employees in the accounting department access to the internet, and if their company's internal software runs fine on IE8 (or as native code), there's no particular reason to move away from XP. PCs on a factory floor that control industrial machinery probably aren't connected to the internet either.
Well, with Microsoft (and many other software companies) ready to pull the plug on updates for this OS, I hope you have faith in your antivirus to stop every unpublished exploit.
Anti-virus systems are not required for anybody who practices even a tiny modicum of caution (don't browse with plugins like Java enabled, never open attachments, don't click on links) - and standalone, with a half-decent set of firewall rules, your Windows XP system will be fine.
A patched system with a firewall on and without "trojan horses" brought in by the user is relatively safe.
XP will stop getting patches soon.
And this list (http://www.cvedetails.com/vulnerability-list/vendor_id-26/pr...) is only going to get longer and longer, because even though Microsoft will be EOLing XP, there will be tens of millions of Internet facing machines using it, probably even in 2020.
Having a firewall + not loading trojans gets you 99.9% of the way to security.
The problems are that normally people (A) don't want to deal with the hassle of a firewall, (B) don't like to be cautious about opening attachments, (C) don't like to be restrained about what they click on, and finally (D) tend to browse with all sorts of plugins loaded (not to mention Javascript being almost universally enabled).
For those people, yes, they will need to have a lot more handholding by their operating system vendor.
For somebody running a Windows XP system that doesn't have to do any of those (Cash Register, Kiosk, Office Machine) - they are fine, can be locked down, and can probably run Windows XP for the next 20 years without concern.
Most places I know of running WinXP are completely cut off from Internet, using personal media like pendrives is prohibited and the identity of a user is confirmed with physical "PKI card" or something.
They will use their XP's long after the universe dies, I think.
"Most"? Really? Most of the XP machines I know of are being used like any normal desktop machine: email, browsing, office, thumb drives, etc.
I know things are different with industrial machine control, they might be different at my doctor's office, and so forth. But I don't think those special situations add up to "most". Not yet.
And the dentist's office is one of them, actually. Anyway, I may be wrong about this right now, but as you note, in a few years, when the support dies and nothing works on XP any longer, it will still thrive in the environments I describe.
Remember, refactoring costs man-hours, which cost real cash. If it works, there is no point changing it, especially if the benefit is something as nebulous as a "sub-optimal VM". If you were worried about speed in the first place, it wouldn't have been written in Python.
Also, you wouldn't port your script to new hardware; you'd port Python (well, wait for someone else to...).
> And that's why so many businesses are outperformed by new ones...
So many? Is this including the massive number of new businesses that fail in the first few years? I think that is a very hard assertion to make without seeing the actual numbers of these "new businesses." Most are private and don't report their profits, so you have no idea if they are burning through cash or actually making money -- you just see the hype.
Inertia is a total killer. Your extant codebase is both leverage and inertia. If The New Thing requires changing course and isn't amenable to your extant leverage, most (all?) companies can not maneuver to deal with The New Thing. It simply costs a great deal for very small benefit.
I would suggest that is a fault of process, not programming.
If there is a need to upgrade then obviously you need to do it. However, if your pipeline doesn't touch unicode and you're upgrading to Python 3.3 purely for unicode, then it's a massive waste of manpower.
Programs are tools, nothing more. If the tool works there is no need to change it. In fact it can be very expensive, especially if no one knows how to use it.
For example, most people don't need a pneumatic drill for their DIY. Yes it might be much more flexible, and really really fast. But the cost of maintaining it, and training to get the best use out of it is prohibitive.
However, if you own a garage, hand tools are far too slow, and there is a rich pool of talent to use your fast, powerful tools.
Servicing technical debt sometimes costs more than refactoring. Sometimes not. A good manager/programmer can decide which situation they are in and act accordingly.
> at some point you will have to pass ownership to another, younger dev who might not have been taught this older technology, or port your script to newer hardware, and eventually deal with a sub-optimal or incomplete VM.
We're talking about Python 2 vs 3 here, not upgrading your system from Cobol to Clojure.
But the python source code's available to build, and anyway, quite a bit of the python in VFX is run through embedded interpreters in Maya, Nuke, Houdini or Katana which means they can keep using an old version indefinitely.
Reading the article, I've realised I do now support the idea of a Python 2.8, but I personally believe it should have only one feature...
There's no getting around the fact that large Python codebases have been built up in 2.x. Going forward, there are three paths for the developers of these codebases to take:
1. They can stay with Python 2.x until the lack of updates becomes a problem.
2. They can attempt to switch out Python 2.x code with Python 3.x code where appropriate.
3. They can choose to rewrite with a language other than Python.
Of the three options, which is the least beneficial to the Python community? The third one. Which is the most beneficial? The second one. To enable this, it makes sense to enable easier mixing of Python 2 with Python 3. So the 'one feature' I'd like to see in Python 2.8, if it is ever made, is to be able to interpret both Python 2.7 code and some set version of Python 3.
How would the interpreter know the difference? The code could easily be labeled using comments, such as how the "# -*- coding: utf-8 -*-" switch works. It's not a new idea, and it will be very familiar to many developers (references to HTML and XML schemas spring to mind).
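Purely as a sketch of the idea (the marker syntax is invented; no interpreter understands it today):

    # -*- python-version: 3.3 -*-    # hypothetical per-file marker, not real syntax
    # -*- coding: utf-8 -*-

    def greet(name: str) -> str:     # Python 3-only syntax, opted in by the
        return "hello, " + name      # imaginary marker above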
This way everyone wins: Python 2 developers get the chance to move to Python 3 as and when it makes sense, Python 3 developers can make use of Python 2 libraries until the ports are ready, and it should be (relatively) easy to implement.
I changed it to "relatively few." We actually have several thousand altogether, but we have a lot of code. I don't want to discourage the great progress we've made.
The core point is the same: with sufficient test coverage, you can change one variable (Python 3 vs 2) and control for all others. Tests are not an end; they are a means to an end: the courage to change your software in the face of a constantly changing world.
Sounds like you're not there, but maybe you have specific domains of EVE that you feel comfortable changing at your whim. That's the success to focus on.
Somewhat naive too, to be honest. As someone who had plain old Ruby code circa 1.8.5-ish (not Rails) that I ported to 1.9 a while back, even with tests there were differences in behavior that I guarantee you probably weren't thinking of writing a unit test for.
Unit tests are nice, but they don't make things braindead easy. They just make it a bit easier to narrow down where something might be broken/different.
I understand your position. I've managed a group that was struggling to increase test coverage. It's hard to make testing work if management wasn't behind it from the get-go. Short term thinking almost ensures this doesn't happen. It is possible to catch up, however. A friend of mine did it with a DoD project. (In C++!)
That's quite a glorified view of tests. You can have the best test coverage in the world backed by a great CI culture but if porting something is a huge time investment, it may still be a huge time investment. Maybe there's one 2.x feature that you use almost everywhere that has been removed in 3.x and there's no simple way to replace it automatically.
When I ported one of my projects to Python 3, by far the majority of the time was spent on porting the test suite. The tests were written in Python 2 as well, and of course there are no tests of the test suite. It required extensive use of coverage to make sure that things were still being tested, plus new code to test things that lost coverage (e.g. items that were bytes). All of that effort then left me in the same place as I was with Python 2.
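Roughly what I mean by "extensive use of coverage", as a sketch (the package name and test path are made up):

    # Measure what the ported test suite actually exercises on Python 3,
    # so the result can be compared against the Python 2 baseline report.
    import coverage
    import pytest

    cov = coverage.Coverage(source=["mypackage"])   # hypothetical package
    cov.start()
    pytest.main(["tests/"])                         # run the ported suite
    cov.stop()
    cov.save()
    cov.report(show_missing=True)                   # diff against the old report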
> You can have the best test coverage in the world backed by a great CI culture but if porting something is a huge time investment, it may still be a huge time investment.
The right way to port is this: You make a rewriting engine that does the porting for you, and you develop on that, not on the port directly. The maintenance team keeps adding features to the legacy engine, but you can absorb those changes by running your rewriting engine on the whole code base.
Then, when the ported version is passing tests and looks good enough, you switch people over to the port. I've done this. It works. In fact, I know of at least one company that has based a consulting practice around this technique.
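A minimal sketch of that kind of rewriting engine, built on the stdlib's lib2to3 (real setups layer project-specific fixers on top; the file names here are invented):

    # Regenerate the Python 3 version of a legacy module on every run, so
    # changes the maintenance team keeps making to the 2.x tree are absorbed
    # automatically instead of being merged by hand.
    from lib2to3.refactor import RefactoringTool, get_fixers_from_package

    fixers = get_fixers_from_package("lib2to3.fixes")
    tool = RefactoringTool(fixers)

    with open("legacy_module.py") as f:       # hypothetical legacy source
        source = f.read()
    if not source.endswith("\n"):             # lib2to3 wants a trailing newline
        source += "\n"

    tree = tool.refactor_string(source, "legacy_module.py")

    with open("ported/legacy_module.py", "w") as f:
        f.write(str(tree))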
Yeah, but if you have a test suite you add Python 3 to your test runs and silence the existing failures. Now you can (at least partially) avoid writing new code that wouldn't work on 3, as well as gain the option of refactoring things to resolve compatibility - with lots of freedom to decide how much effort to spend on that if any.
It'd no longer be a huge risky project where you have to cut the QA team over to Python 3, then spend a bunch of QA's time putting the new code through the ropes with no end in sight while explaining to management that this is all providing zero new functionality.
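Concretely, the "silence the existing failures" part could look something like this (a sketch with an invented test; pytest's xfail marker does the silencing):

    # Let the suite run green on Python 3 today by marking the known
    # not-yet-ported failures, then burn the list down release by release.
    import sys
    import pytest

    py3_todo = pytest.mark.xfail(
        sys.version_info[0] == 3,
        reason="not yet ported to Python 3",
        strict=False,
    )

    @py3_todo
    def test_market_export_roundtrip():
        payload = b"ISK,1000"
        # Legacy code that still treats bytes and str as interchangeable.
        assert payload.split(",")[0] == "ISK"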
> Yeah, but if you have a test suite you add Python 3 to your test runs and silence the existing failures.
They likely already know where a lot of their failures are going to be even without tests. The effort involved in fixing those known failures alone is prohibitive. The fact that the rest of regression testing will be easier with tests is nice, but doesn't reduce the work required to a level where they can justify doing it.
> Now you can (at least partially) avoid writing new code that wouldn't work on 3,
Once you're on 3, you can just run your code and see if it works on 3. You don't need tests to avoid writing code that works on 2 but not 3.
> as well as gain the option of refactoring things to resolve compatibility - with lots of freedom to decide how much effort to spend on that if any.
I don't think resolving compatibility issues is optional...
> I don't think resolving compatibility issues is optional...
It is if you stay on 2 for production/qa and the bulk of new feature development in the meantime. Only CI (and any developers working on python 3 compat.) would be running 3 and only to provide reports of how compatible things are.
Only once the test suite passes would you swap QA to 3, start removing 2 compatibility code (i.e. whatever you've done to hack around handling strings) and eventually ship 3 to production.
Edit: The solution of "just switch to 3 then make it work" is exactly what they can't do because of (completely valid) business and political concerns. I think that if they were able to have a goal of some small percent of test progress on Python 3 per release - but continue to ship features - they would be able to find time to do the work if it's at all a priority, even if it took many many releases. If it's not at all a priority then maybe that's the one-sentence summary instead.
I wonder if things would look different if they made everything more modular and went for an open-source approach. Sure, it has its downsides, but from what they list:
- "We have our own localization solution inside EVE..." - if it's better than gettext, other people would use it too and some would push for porting it to py3
- "We just removed a custom importer we’ve wanted to remove for years..." - was it for DI? would it become a better framework on its own if it was adopted in other codebases?
- "Ultimately we’ve been ... monolithic" - maybe that's also something they noticed
It would be great in cases like this to look into an alternative reality where they both used available modules and open-sourced any big chunk they created themselves, and see how things turned out. Maybe they would be worse off for some reason...
It was interesting, and the biggest point that sticks out is "we have little automated tests", and that is where the main problem is. From working in both tested and untested environments I can attest that a tested environment can rip technologies out and put new ones in easily and quickly. Sure, it may take some time, but you always know you didn't break the world.
You used to be able to wrap the client in your own Python, execute that, and give yourself extraordinary powers (read data from any market in game, send a pop up message to any player, etc)
Yes, you can get caught by logs. That, however, still leaves the already existing consequences of said illegal (in terms of player-game interaction) manipulations; for example, compensating players for expensive ships destroyed through use of an exploit. It also affects customer satisfaction, and support must deal with that too.
The whole client uses python, or at least did back in 2008. Someone decompiled and released the source at one point, which allowed various exploits and easier macroing.
It's interesting to see an open-source language fail like that. Perhaps there is a need for some sort of strategic direction, to ensure new versions don't cause upgrade issues?
The last 2 weeks have been full of people championing the idea that perhaps a 2.8 with backported 3.x features is a better solution than the 3.x line, because it provides new features with an easy upgrade path for existing code bases.
...just because most of the core python developers currently seem to hate this idea (PEP 404) doesn't mean it won't happen.
If 3.4 is as much of a flop as the 3.x line has been thus far, we'll almost certainly see some change of direction over the next few months.
I don't understand why Python 3 is a flop. It's an incompatible new version of a programming language. It's seen slow, steady, constant adoption. In the low single digits, but that's to be expected for an incompatible new version.
Most Linux distributions will be adopting it, most mainstream packages are ported or are planning a port.
Want to bet that in 3 years' time Python 3 will be the most popular Python version (51%-49% maybe, but still a majority)?
Wait for all the LTS/Enterprise distributions to switch to it - that will be the tipping point.
And then people have bitched and moaned about Java's type erasure ever since Java added generics. You can't have everything. Python 3 is not backwards compatible; its adoption was bound not to be 90% in a year or so, especially for an already widely used and embedded system.
Python 3 is not a success, but it is a quite nice base for building the next 10 years of Python. And for this goal, Python 3 is probably on track.
The real answer is that PG, Google, and many other Serious Folk have blogged and talked about how growth is the single most important metric for tracking project success.
Actual total users are irrelevant; they're like page views. Cute, but useless.
What's critical is that once a product has overcome its initial growth spurt of early adopters, if the rate of rate of growth isn't in the right ballpark, the project is in trouble.
My fears for python 3 are twofold:
1) The initial 'early adopters' growth spurt of python 3 is over. However, we're not seeing any large scale migrations to py3; lots and lots of people just sticking with py2. Sure, new stuff is being written in python 3, but the question is, is it growing? How fast is it growing compared to the overall growth rate of python adoption?
2) NO ONE KNOWS. The core python team is not tracking this information; either they don't care, or they're totally out of touch with reality.
So when I say 'no one is using it' I mean, 'relatively few new people are using it, the growth rate over time of python3 relative to python2 appears to be flat', and if you were part of YC, that would mean the project catastrophically failing.
...and the developers should be paying attention, and they don't appear to be.
My definition of success for shadowcat doesn't require that level of growth. I'm sure many, many companies have a definition of success that doesn't require a 'rate of rate of growth' congruent with being 'part of YC'.
Given that, attempting to apply the same definition of success to a programming language just seems silly, especially given explosive growth usually ends up with a disastrous pop culture - think 1999-era Perl or 2003-era PHP or 2006-era Rails for examples where the majority of people using the language weren't necessarily adding noticeable value to the ecosystem, and in the long term have created a lot of hatred for the respective platforms as a result of code that was just plain bad.
I'm just saying that if the rate of adoption of python 3 is flat or negative, that's extremely bad regardless of the absolute number of people using it.
It's naive to think that growth isn't a key metric for any community project.
I'm not going to justify that; there's plenty of research out there about it.