The Minds Of Machines


My telephone wants my attention;

I can tell by the way that it yells.
It announces these things by the way that it rings
Its annoying cacophonous bells.
My car wants to see the mechanic,
And says so by flashing a light;
I’m quite apprehensive–it’s always expensive
When something goes wrong (and it might!).
My laptop computer, it hates me
(And the feeling is mutual, jerk!)
No warning light flashes, just ‘click’ and it crashes
And vanishes all of my work.
When I don’t understand their construction,
And their actions are hard to explain,
There are times that I find I just call it their mind
And the same thing is true for my brain.
This comment over on the confabulator brain post at Pharyngula has not quite got it right. We do infer inner causes–wants and desires–in our machines, but not in the circumstances he uses as examples. “My lawnmower wants to mow the grass” is, I agree, not the way we speak of “want”. But my lawnmower does not want to start on cold mornings–it wants to be warm. It wants a new oil filter–it tells me in smoke signals. My computer wants me to back up files. My car wants an oil change. My kids want ice cream. I want a job that pays.
“Wanting” implies that, given the opportunity to do X, we will. I cannot know the internal state of my computer’s desire for backup, but I also cannot know my kids’ internal state. Nor do I know the internal state of those who taught me the word “want”, so I have no way of knowing if my feelings are the same as theirs–I only know that my behavior matches. “Want” works just as well–in the appropriate situations–for machines as it does for people or animals.
We use inner cause words in other situations, too–those situations in which causality is not easily determined. If my computer crashes because of a power surge that also takes out the transformer down the block and puts half the town in the dark, it is clear what happened; if my computer crashes for no apparent reason, it is because it has a mind of its own. Or it hates me.
Which it does.

Comments

  1. says

    Someone should set this masterpiece to a tune… Something like the title song for Disney's Aladdin… Too good, Mr. Cuttlefish! What you describe is akin to pareidolia, no?

  2. says

    Wow, Suirauqa, I had never thought of it that way, but that's a very apt comparison! We see faces in tortillas, and mind in behavior. I gotta give this a bit more thought–it's a very appealing notion, which is exactly when I want to make sure to be skeptical!

  3. says

    It doesn't take much for us to start imbuing a system with a sort of "mind" – a spinning top is very simple, but when knocked about it "wants" to stay upright, and to keep spinning. This would also apply to religion: the big invisible mind that must exist, because certain large-scale events are easier for some to fit into their model with a mind behind them.

  4. says

    Suirauqa — I was thinking the same thing. This does demand music. :-) I'm busy today, but I'll give it a try tomorrow or some other time soon and put it on my website if it comes out decent and Cuttlefish lets me again. My rendition of "I Am Charles Darwin" went over pretty well…
