Bianca and I were at the beach the other day when we saw a seagull repeatedly picking up a rock, flying a few metres into the air, and dropping it back down again. It was very cute. I think the bird must've been playing basketball: by himself, and without a ball that could actually bounce.
One thing I always wonder about is whether there's some animal out there who, if they had the physical ability to express it, would be the best at a human task. A seagull who could outdunk LeBron James. A dog smarter than Einstein. A cat President.
But what a stink deal that would be! Having the IQ and the ability to think far beyond human capability, but lacking the opposable thumbs or the power of speech, and thus never being able to express those skills.
I'll go off on a bit of a sidetrack here, but I swear it'll connect soon.
Here's one thing I think about LLMs and artificial intelligence moving forward: they've largely plateaued, and will stay that way until they gain the ability to feel, or to think just for fun. I think sentience can only come from the ability to feel fear or comfort, and until sentience can be introduced, AI will stay largely the same.
Don't get me wrong: I think they'll get better at faking humanity. We already see so many people emotionally attached to their ChatGPTs that a new update genuinely saddens them, and that's proof of how carefully these systems are socially engineered. I think this can continue to a point where we mistake their programming for sentience. This makes sense, seeing as humans have never before spoken with something that isn't alive; it's human nature to read emotion into unemotional objects. That's why I feel bad for my old teddy bears before shoving them into an airtight plastic tub...
But actually making them 'alive' or 'feeling' would obviously be the next step, if that's a direction we're interested in going down.
But I think what's more likely than us coding feeling into them is hijacking an already-living being and replacing its thoughts with code: lab-grown cells that can perceive or feel, fused with some Neuralink-esque technology, producing an AI with perception and a capacity for pain.
That's a bit of a conspiracy theory and probably not accurate. But the one thing I'll definitely stand by is that we'll aim to make them feel pain first, not comfort.
I actually can't remember how this linked to the beginning. But I guess the point I'm getting at is that we see our own way as the way to be, and all other forms as deficiencies rather than differences. We focus on changing AI to be as close to us as possible, whatever the painful results for them, instead of keeping them as the tool they should be. Maybe the real mistake is seeing our way of being as superior, as a benchmark everyone would want to reach.
I felt bad for that seagull because it couldn't use its arms, or go to a store to buy an actually bouncy ball the way I could. But maybe that seagull felt bad for us, stuck watching it have some fun, sitting on our stupid tailless butts and painting the weird, tiny talons on our wingless arms. That seagull might be seen in the seagull world as the Gullerson Gilly of Gullsketball, and we'd be none the wiser.
I bet he'd look at LeBron James and feel pity.