It depends. Many modern microcontrollers are perfectly fine driving LEDs directly off IO pins, provided the pin specs say it is rated for sufficient current (say, 20mA). However, weaker parts like the ESP8266 can only manage a few mA per pin, and the classic 8051 even less. You can also run into a total power budget issue if you are driving too many pins at once. Also, some IO pins are perfectly fine at sinking current to ground but aren't suited for sourcing it; in that case the LED is wired from an external supply rail to the pin, and the pin simply switches to ground or not.
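A minimal sketch of that sink configuration in C, with hypothetical register names and addresses (GPIO_DIR, GPIO_OUT) standing in for whatever your part's datasheet actually specifies:

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO registers -- placeholder names and
       addresses; substitute the ones from your MCU's datasheet. */
    #define GPIO_DIR (*(volatile uint32_t *)0x40020000) /* 1 bit per pin, 1 = output */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020004) /* output levels             */

    #define LED_PIN 5 /* LED anode -> supply rail via resistor, cathode -> this pin */

    static void led_init(void)
    {
        GPIO_OUT |= (1u << LED_PIN);  /* drive high first: no current flows, LED off */
        GPIO_DIR |= (1u << LED_PIN);  /* then make the pin an output                 */
    }

    static void led_set(int on)
    {
        if (on)
            GPIO_OUT &= ~(1u << LED_PIN); /* pull low: pin sinks current, LED lights */
        else
            GPIO_OUT |= (1u << LED_PIN);  /* high: no voltage across LED, it's off   */
    }

The point of the wiring is that the supply rail does the sourcing; the pin only ever pulls to ground, which is the direction many IO cells are stronger at.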
> When C code is run in machines capable of failing with gruesome death, its unsafeness may indeed result in gruesome death.
And yet, it never does. It has been powering those types of machines for likely longer than you have been alive, and in the one exception I can think of where lives were lost, the experts found that the development process was at fault, not the language.
If it were as bad as you make out, we'd have seen many, many occurrences of this starting in the 80s. We don't.
How was this flamebait? It is an example of how bad programming choices/assumptions/guardrails cost lives, a counterargument to the statement 'And yet, it never does'. Splitting hairs over whether the language is C or assembly misses the spirit of the argument, as both languages share the linguistic footguns that made this horrible situation happen (but hey, it _was_ the 80s, and the choice of languages was limited!). Even allowing the "well ackuacally" cop-out, it is trivial to find examples of C code failing due to out-of-bounds memory usage; these bugs are found constantly (and reported here, on HN!). You would then need to argue "well _none_ of those programs are used in life-saving tech" or "well _none_ of those failures would, could, or did cause injury", to which I call shenanigans. The link drop was meant to do just that.
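To make the footgun concrete, here is a minimal illustration (mine, not from any of the incidents discussed): an off-by-one loop bound that compilers typically accept without a peep at default warning levels, and whose effect at runtime is undefined behavior:

    #include <stdio.h>

    int main(void)
    {
        int limit = 100;
        int readings[4] = {0};

        /* Valid indices are 0..3, but the loop also writes readings[4].
           In C this is undefined behavior; in practice it often silently
           clobbers whatever happens to sit next to the array. */
        for (int i = 0; i <= 4; i++)
            readings[i] = -1;

        printf("limit = %d\n", limit); /* may print 100, -1, or anything */
        return 0;
    }

No diagnostic is required, nothing traps, and the corruption can surface arbitrarily far from the bug.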
We need to agree to disagree on this one; the claim that C is fine and does not cause harm, despite its multitude of footguns, is, I think, egregious and false. So don't make false claims and don't post toxic positivity, I guess?
I don't think the prevalence of these articles at this time of year is because the authors go on holiday, but because the new year is the perfect time to ponder: "Will this be the year of the Linux desktop?"
> It would be great for the browser to become the cross-platform application target.
This is the kind of thing that I feel is very nice and terrible at the same time. Yes, it is convenient, but the browser is also such a complex piece of software that it's sad it is required to run GUI apps. OK, it may not be required yet, per se, but I have mixed feelings about this direction.
Looking at the complexity and area of hardware floating point, I often wonder why we don't see more combined integer+floating-point units, like the one in the R4200 [1], which reused most of the integer datapath while adding only a small extra 12-bit datapath for the exponent.
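For intuition on why the integer datapath covers most of the work, here is a hand-rolled sketch (my illustration, not the R4200's actual microarchitecture) that adds two positive, normal IEEE-754 floats using nothing but integer compares, shifts, and adds; only the narrow exponent bookkeeping is "extra":

    #include <stdint.h>
    #include <string.h>
    #include <stdio.h>

    /* Simplified float add: no NaN/Inf/zero/subnormals/rounding. */
    static float fadd_sketch(float a, float b)
    {
        uint32_t ua, ub;
        memcpy(&ua, &a, 4);
        memcpy(&ub, &b, 4);

        /* Split into exponent and mantissa (implicit leading 1 restored). */
        int32_t  ea = (ua >> 23) & 0xFF, eb = (ub >> 23) & 0xFF;
        uint32_t ma = (ua & 0x7FFFFF) | 0x800000;
        uint32_t mb = (ub & 0x7FFFFF) | 0x800000;

        /* Align the smaller operand: integer compare, subtract, shift.
           (Shift clamped to 31 since shifts >= 32 are UB in C.) */
        if (ea < eb) { ma >>= (eb - ea > 31 ? 31 : eb - ea); ea = eb; }
        else         { mb >>= (ea - eb > 31 ? 31 : ea - eb); }

        uint32_t m = ma + mb;                  /* plain integer add         */
        if (m & 0x1000000) { m >>= 1; ea++; }  /* renormalize if it carried */

        uint32_t ur = ((uint32_t)ea << 23) | (m & 0x7FFFFF);
        float r;
        memcpy(&r, &ur, 4);
        return r;
    }

    int main(void)
    {
        printf("%f\n", fadd_sketch(1.5f, 2.25f)); /* 3.750000 */
        return 0;
    }

The mantissa path is just a shifter and an adder the integer unit already has; the exponent path (8 bits for single precision, 11 for double, hence a ~12-bit sidecar) is tiny by comparison.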
If you use a tile-based hardware renderer, such as the graphics chip in the original Nintendo console, pixels are rendered to the screen on the fly: the hardware automatically pulls each pixel from the tile data selected by the tile map.
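A rough sketch of the per-pixel lookup such hardware effectively performs, assuming 8x8 tiles and a 32x30 tile map (sizes chosen to resemble the NES layout; this is an illustration, not register-accurate emulation -- real hardware packs pixels into bitplanes and adds palettes and scrolling):

    #include <stdint.h>

    #define TILE_W 8
    #define TILE_H 8
    #define MAP_W  32  /* tiles per row */
    #define MAP_H  30  /* rows of tiles */

    /* Illustrative memories: tile indices, and one byte per tile pixel. */
    uint8_t tile_map[MAP_H][MAP_W];
    uint8_t tile_set[256][TILE_H][TILE_W];

    /* What the video hardware computes for each pixel as the beam scans,
       with no framebuffer anywhere: */
    uint8_t pixel_at(int x, int y)
    {
        uint8_t tile = tile_map[y / TILE_H][x / TILE_W]; /* which tile        */
        return tile_set[tile][y % TILE_H][x % TILE_W];   /* which pixel in it */
    }

Because only the small tile map and tile set live in memory, a full-screen image costs a fraction of what a per-pixel framebuffer would.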
id Software should have just partnered with a heavy metal band to jointly release an album of Doom music you could put in your stereo while you play the game.