It just seems to me that if you think something is unsafe, you shouldn't build it in the first place. It's like developing nuclear reactors while hoping to invent control rods before they're needed.
Framing this as "alignment" rather than "risk" suggests the real answer: they're perfectly happy inventing a Monkey's Paw as long as it actually grants wishes.
I think one can reasonably draw three regions on the spectrum: at one extreme, safe enough to build without thinking hard; at the other, dangerous enough not to build, no hard thinking required; and a middle region between them.
Many LessWrong folks are in the latter camp, but some are in the middle, believing either in high rewards if this is done right, or in simple inevitability, either of which they take to outweigh the high risks.
Personally, I think that from a geopolitical standpoint this tech is going to be built regardless of safety; I'd rather we get some friendly AGIs built before Skynet comes online. There is a balance-of-power dynamic in which advanced friendly AGI may be the only defense against advanced unfriendly AGI.
Put more simply: even if I assess the expected value of building it as negative, is that expected value less negative if I build it than if the US or Chinese military does?