I’m personally not worried about AI causing human extinction.
It may reflect my educational background, but I genuinely think climate issues are a more urgent priority today. That said, it does not mean we can skip the conversation about AI.
People seem really worried that AI will become sentient and annihilate us all, and there is probably some small chance of that happening. What I think is more dangerous, and far more likely, is that AI will be used by nefarious people, organizations, or nations in power plays, at whatever cost to humanity. Ultimately, AI in the hands of bad humans has more potential to destroy humanity than AI alone.