If you read it and translate what he's saying into programming terms, you can glimpse a form of software that would make what people today call "readable code" seem as primitive as mathematics before the advent of decimal numbers seems to us.
This is an extraordinary (and enticing and often advocated) claim that has, so far, failed to produce the extraordinary evidence. It says something that a person as concerned with notation as Knuth used mathematical notation for the analysis of algorithms and a primitive imperative machine language (MIX) to describe behaviour.
I see no connection here to what I wrote, which has nothing to do with functional vs. imperative programming. I'm talking about names and readability in code.
Imperativeness is a separate matter. One can easily have it without longDescriptiveNames, and although I don't have Knuth handy, I imagine he managed without them too.
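To make that concrete, here's a minimal sketch (hypothetical code, not drawn from Knuth): the same imperative routine written twice, once with terse mathematical names and once with long descriptive ones. The imperativeness is identical; only the naming vocabulary changes.

    def gcd(a, b):
        # Terse, mathematical names: readable because the
        # algorithm (Euclid's) is well known.
        while b:
            a, b = b, a % b
        return a

    def greatest_common_divisor(first_number, second_number):
        # The same imperative steps with longDescriptiveNames:
        # more verbose, but not more (or less) imperative.
        while second_number:
            first_number, second_number = second_number, first_number % second_number
        return first_number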
On first read the idea you propose is very attractive, but I think you do need to address why APL didn't take off. Perhaps its designers chose a poor vocabulary; are there better ways to represent algorithms?
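For concreteness (a sketch of the vocabulary question, not an answer to it): the arithmetic mean in APL is (+/x)÷⍴x, the sum of x divided by its length. The Python below contrasts a terse rendering in that spirit with a spelled-out version of the same algorithm; the names are mine, chosen for illustration.

    def mean_terse(x):
        # Close in spirit to the APL one-liner (+/x)÷⍴x.
        return sum(x) / len(x)

    def mean_verbose(list_of_numbers):
        # The same algorithm with a spelled-out vocabulary.
        total = 0.0
        for number in list_of_numbers:
            total += number
        return total / len(list_of_numbers)

    print(mean_terse([1, 2, 3, 4]))    # 2.5
    print(mean_verbose([1, 2, 3, 4]))  # 2.5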
I'm sorry I didn't reply to this during the conversation, but am traveling this week. IMO the short answer is that questions like "why didn't APL take off" presuppose an orderliness to history that doesn't really exist. Plenty of historical factors (e.g. market dynamics) can intervene to prevent an idea from taking off. Presumably if an idea is really superior it will be rediscovered many times in multiple forms, and one of them will eventually spark.