I like his article claiming that Costco's UK website doesn't work and never has, while people in the comments are saying "it works fine".
If a major website doesn't work for me, and it's not a one-time thing, I tend to assume the problem might be on my end and troubleshoot from there, rather than writing an article marveling that nobody else has ever noticed it's been down forever.
Well, you should read the reply on the kernel bug tracker:
"Peter already gave a quick reply in this regard a couple of years ago [1], and I think more tests and numbers are required in different environments, instead of thinking solely in terms of the raw number of cores. When does a high-throughput server start to fall down based on the task time slices? Maybe increasing from 8 to 24 is fine for normal memory-intensive tasks, but what about 128 cores in a high-demand network server?

8 has proven to be "enough" so far (compared to other OSes), but, of course, that doesn't mean there isn't room for improvement."
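To illustrate the "don't think solely in raw core counts" point: the Linux CFS scheduler historically scaled its latency-related tunables logarithmically, not linearly, with the number of online CPUs (a factor of roughly 1 + floor(log2(ncpus)) under its default logarithmic scaling mode). A minimal sketch of that scaling, assuming the logarithmic mode; the function name here is my own, not a kernel symbol:

```python
import math

def scaling_factor(ncpus: int) -> int:
    # Mimics CFS's logarithmic tunable scaling: 1 + ilog2(ncpus).
    # This is a sketch of the idea, not kernel code.
    return 1 + int(math.log2(ncpus))

for ncpus in (1, 8, 24, 128):
    print(f"{ncpus:>3} CPUs -> factor {scaling_factor(ncpus)}")
```

So going from 8 to 128 cores only doubles the factor (4 to 8), which is why the reply argues the right threshold can't be read straight off the core count.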
So it's pretty clear that it's not going to change any time soon, and that it's considered intended behavior rather than a bug.