Hacker News | pseudohadamard's comments

And it looks the same with and without Javascript enabled. Unlike 99% of all web sites, which range from shite through to blank pages without JS.

That's "UB that was detected in this study". Since gcc will silently break code when it detects UB and you can't tell until you hit that specific case, the 40% is a lower bound. In practice it could be anything up to the full 100%.

In theory. But most C programs do not rely on UB. What is the basis for your claim?

Uhh... mathematics and logic? Since there's no perfect UB detector, one that detects UB in 40% of programs can only be presenting a lower bound. And I don't know why you think C programs rely on UB; it's present without the programmer even knowing about it.

It follows from mathematics and logic that "larger than 40%" could be 100%, but it does not follow that this is likely or reasonable to assume.

One thing you need to add is that UB can be incredibly subtle and almost impossible to spot even by people with decades of programming experience. However, the compiler - and we're talking almost exclusively gcc here - will spot it and silently break your code. It won't warn "hey, I've spotted UB here!" even with every possible warning enabled, it will just quietly break your code without giving you any indication that it's done so.

It's some of the most user-hostile behavior I've ever encountered in an application.


I and other developers I know have tried giving feedback to the gcc project. On the whole, going outside and shouting at clouds is more productive.

I felt the same. GCC has too few contributors. At some point I started to fix the bugs that I had filed myself. Still, it is important that users make themselves heard.

I think it's a circular problem. The gcc developers are very insular and respond to outside input with anything from ignoring it to long lawyeristic arguments about why, if you squint at the text just right, their way is the only right way, which strongly discourages outside contributions. There are only so many hours in the day, and arguing till you're blue in the face about code being silently mutated into unexpected different code that always segfaults when run, based on a truly tortured interpretation of two sentences of standards text, gets old fast. The gcc devs would make great lawyers for bypassing things like environmental law: they'd find some tortuous interpretation of an environmental protection law that let them dump refinery waste into a national park, and then gleefully do it because their particular interpretation of the law didn't prohibit it.

Contrast this with Linus' famous "we do not break userspace" rant which is the polar opposite of the gcc devs "we love to break your code to show how much cleverererer than you we are". Just for reference the exact quote, https://lkml.org/lkml/2012/12/23/75, is:

  And you *still* haven't learnt the first rule of kernel maintenance?  If a change results in user programs breaking, it's a bug in the kernel. We never EVER blame the user programs. How hard can this be to understand?  ... WE DO NOT BREAK USERSPACE!
Ah, Happy Fun Linus. Can you imagine the gcc devs ever saying "if we break your code it's a problem with gcc" or "we never blame the user"?

This really seems to be a gcc-specific problem. It doesn't affect other compilers like MSVC, Diab, IAR, Green Hills; it's only gcc and, to a lesser extent, clang. Admittedly this is from a rather small sample, but the big difference between those two sets that jumps out is that the first one is commercial, with responsibilities to customers, and the second one isn't.


In my experience it is worse with clang, which exploits UB even more aggressively than GCC to optimize (and Chris Lattner very much justified this line of thinking in his famous blog post), and I have seen similar things with MSVC. I do not know about the others.

I think that GCC changed a bit in recent years, but I am also not sure that an optimizing compiler can have the same policy as the kernel. For the kernel, it is about keeping APIs stable, which is realistic; an optimizing compiler, by contrast, inherently relies on some semantic interpretation of the program code, and if a mismatch there causes something to break, it is often difficult to fix. Many issues were also not caused by a sudden decision of "let's now exploit this UB we haven't exploited before" — the compiler always relied on it, but an improved optimization now makes it affect more, or different, programs. That creates a difficult situation, because it is not clear how to fix it without rolling back an improvement you spent a lot of time on and others paid for. Don't get me wrong, I agree they went too far in the past in exploiting UB, but I do think this is less of a problem looking forward, and there is also generally more concern about the impact on safety and security now.


Good point, yeah. I really want to like clang because it's not gcc but they have been following the gcc path a lot in recent years. I haven't actually seen it with MSVC, but I'm still on an old pre-bloat version of Visual Studio so maybe they've got worse in recent versions too.

I think a lot of the UB though isn't "let's exploit UB", it's "we didn't even know we had UB in the code". An example is twos-complement arithmetic, which the C language has finally acknowledged more than half a century after the last non-twos-complement machine was built (was the CDC 6600 the last ones-complement machine? Were most of the gcc devs even born when that was released?). So everyone on earth has been under the crazy notion that their computer uses twos-complement maths, while the gcc (and clang) devs know that signed overflow is actually UB, which allows them to do whatever they want with your code when they encounter it.


>so developers will have to ditch memory-sucking frameworks and start to optimize things again.

Can you DM me your contact details? I have a nice shiny new bridge that I can get you a great deal on.


Same here. I have friends in the US but I've resigned myself to not seeing them again for a long time because I won't subject myself to that amount of harassment. If ever a country hung out a giant "Please stay away, you're not welcome here" sign, this is it.

It's also possible to run an economy on empty for a long, long time provided there's a war on. Look at Germany, which literally ran the country on empty throughout all of WWII; books like Adam Tooze's "Wages of Destruction" cover this in great detail. There was a saying in the last few years there, "enjoy the war, the peace will be terrible", and it was, because once you took the wartime tourniquet off, all the toxins flooded the body. It wasn't until the Marshall Plan that the bare subsistence life was slowly eradicated.

Now, can you see anyone giving Putin's kleptocracy a few trillion dollars to rebuild Рашка? The Marshall Plan rebuilt Germany because the US realised that without that as the economic powerhouse of Europe the place would be a basket case in need of US support for decades, but when you're just a gas station masquerading as a country (McCain) no-one's going to bail you out except insofar as it keeps the gas flowing, and if you look at places like Nigeria you don't need much to keep the gas flowing.


> can you see anyone giving Putin's kleptocracy a few trillion dollars

Unfortunately, if the leaks are true, they might actually be discussing that.

"Fortunately" Putin is more bent on having the whole Donbas, which Ukraine will not give up, so I don't believe this will happen in the near future — which is bad for Russia as a country.


> Unfortunately, if the leaks are true, they might actually be discussing that.

Which specific leaks are discussing giving Putin a few trillion dollars? Who is the money coming from?


So it's an open-source version of Exchange then?

I worked with VC++ 6.0 up until Windows 11 when it really, really wouldn't run any more, then switched to VS 2008. The code is portable across multiple systems so it didn't really matter which version of VS it's developed with, and VC++ 6.0 would load, build the project, and have it ready to run while VS 2022 was still struggling through its startup process.

VS 2008 is starting to show the elephantine... no, continental land-mass bloat that VS is currently at, and has a number of annoying bugs, but it's still vastly better than anything after about VS 2012. And the cool thing is that MS can't fuck with it any more. When I fire up VS tomorrow it'll be the exact same VS I used today, not with half a dozen features broken, moved around, gone without a trace, ...


Yeah recent VS is awful. I recently tried VS2022. What a mess.

VS2026 is even worse. And if you thought the CoPilot enshittification in 2022 was bad, wait'll you see 2026. We only use it for final builds now, so develop under a bloated but at least not enshittified yet version, then do release builds and testing with whatever the latest version is before shipping.

Hey, don't diss SACD. I have a DVD player that happens to do SACD that I'm hoping to resell to an audiophile for five or more times what I paid for it at some point, I just have to wait another ten years or so for it to become a rare piece of classic audio hardware, so far it's only about 20 years old.
