
"Personally, I find this unrelenting fixation on 'AI death' rather counterproductive. It skews the discourse because it isn't the primary concern of the majority of AI researchers, and it paints 'AI' as an ominous force—almost as if 'it' were already a self-governing entity rather than a tool shaped by human hands."

I find this to be an... interesting take. My chronic concern over AI has never been that it will become sentient and decide "the human is obsolete" - it is *precisely* that it is shaped and wielded by human hands.

The humans who built and ran the Chernobyl reactor knew full well how unforgiving nuclear forces are, and still caused one of the worst industrial disasters in history in their efforts to cut corners financially.

We have let climate change slip farther and farther into disaster because the humans who stand to profit from perpetuating it have their checkbooks open in the halls of government.

Facebook isn't destructive to the world because it is sentient - it's destructive because it makes far more money when tuned for maximum engagement rather than maximum human wellbeing, with damage ranging from documented harm to teen mental health to election interference in multiple democracies.

What scares me about AI is not sentience - it is that it is powerful, that its power is only going to grow, and that it is wielded by humans who consistently put profits ahead of compassion and conflate market demands with human needs.
