Also, according to Chrome's telemetry, very, very few websites are using it in practice. It's not like the proposal is threatening to make some significant portion of the web inaccessible. At least we can see the data underlying the proposal here.
I'm curious about the scope of the problem: if the HTML spec drops XSLT, what would the solutions be? I've never really used XSLT (once, maybe, 20 years ago). Besides pre-rendering your pages server-side, I assume another option is a JavaScript library that performs the transformations, if they need to happen client-side?
As a general rule of thumb, 0.1% of PageVisits (1 in 1000) is large, while 0.001% is considered small but non-trivial. Anything below about 0.00001% (1 in 10 million) is generally considered trivial.
There are around 771 billion web pages viewed in Chrome every month (not counting other Chromium-based browsers). So seriously breaking even 0.0001% still results in someone being frustrated every 3 seconds, and so not to be taken lightly!
--- end quote ---
3. Any feature removal on the web has to be given thorough thought and investigation, which we haven't seen. The Library of Congress apparently uses XSLT, and Chrome devs couldn't care less.
This was mentioned in the discussions and is an easy search away. Which means that the Googlers, in their arrogance, didn't do any research at all, and that their counter underrepresents the data, as explicitly stated in their own document.
And yet it's not the Googlers and other browser implementers who didn't do even a modicum of research who are arrogant, but me, because I made a potential mistake quickly searching for something on my phone at night?
> Chrome telemetry underreports a lot of use cases
Sure; in that case, I would suggest to the people with those use cases that they should stop switching off telemetry. Everyone on HN seems to forget telemetry isn't there for shits and giggles, it's there to help improve a product. If you refuse to help improve the product, don't expect a company to improve the product for you, for free.
Looking at the problem differently: say some change would make Hacker News unusable. The data would support this change and show that it practically affects no one.
First, we are an insignificant portion of the web, and it's okay to admit that.
Second, if HN were built upon outdated Web standards that practically nobody else uses, I'm sure Y Combinator could address the issue before the deadline (which would probably be at least a year or two out) to meet the needs of its community. Every plant needs nourishment to survive.
First, you're assuming that those portions of the Web won't evolve in order to survive. Second, you're ascribing a motive to Google that you assume (probably falsely) they possess.