Louis Shalako
Are you worried about the future of artificial
intelligence, A.I.?
Maybe you should be. That’s because you have no
idea who might be in charge of building it, and even less idea of who’s training
it. Therefore, you have no idea what sort of moral values, or whose, will be
inculcated.
The truth may be rather more sobering than one would
like.
That’s because you, and people like you, are already
training A.I.
You pitch in your fair share of unpaid effort multiple
times, each and every day. That’s the great thing about the modern world.
Everybody just wants to help, and to be heard.
You do it on Facebook, Twitter, and Amazon, almost
every single day. You do it a little bit more with each and every Google
search. You are training those algorithms, the basis of much of the experimental
work going on in artificial intelligence.
Maybe you are right to be concerned. After all, you
know who you are. And you make a choice every time you click on something—and
the algorithms take note.
You may be seizing on and reposting every negative
little thing, or every negative big thing.
This might include the latest
atrocity, the latest terror plot, the latest bad thing the President, the
government, or the Mayor has done. Maybe your soufflé fell or your great Uncle
Rudolf finally kicked off at the age of 104. Thoughts and prayers, right? Or you could just post a lot of
pictures of cute and cuddly kittens. Your emotions mean nothing to a machine.
It’s just a series of numbers, measuring engagement.
How popular is this?
It doesn’t care why.
It cannot measure truth, or love, or anger.
It doesn’t have the capacity to ask why, although it
may be very clever with the thesaurus and the rules of language.
(Although I would submit that understanding the rules of language is an important first step in asking why.)
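To make that point concrete, here is a minimal, purely hypothetical sketch of the kind of engagement tally such a system keeps. The item names, actions, and weights are invented for illustration, and real platforms use their own unpublished scoring; the point is simply that a kitten photo and an obituary go through exactly the same arithmetic.

```python
from collections import defaultdict

# Hypothetical engagement weights; real systems use their own, unpublished values.
WEIGHTS = {"click": 1, "like": 2, "comment": 3, "share": 5}

def score_engagement(events):
    """Tally a popularity score per item from (item, action) events.

    Note what is missing: nothing here asks what the item is about,
    or why anyone clicked. It only counts.
    """
    scores = defaultdict(int)
    for item, action in events:
        scores[item] += WEIGHTS.get(action, 0)
    return dict(scores)

# A cute kitten and a family bereavement are measured the same way.
events = [
    ("cute_kitten.jpg", "like"),
    ("cute_kitten.jpg", "share"),
    ("uncle_rudolf_obituary", "comment"),
    ("uncle_rudolf_obituary", "like"),
]
print(score_engagement(events))
# {'cute_kitten.jpg': 7, 'uncle_rudolf_obituary': 5}
```

A series of numbers, measuring engagement, and nothing more.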
Somewhere in the world, there is someone prepared to
pay dearly for those little bits of information.
Your wants, your likes and your dislikes. Your fears
and pet peeves might be very useful in the right hands.
Either way, you are helping to train those bots, those
artificial intelligences that will be shaping your world of tomorrow.
It’s not up to me to tell you how to live your life or
what might be the appropriate choices. It’s clear the really bad guys are
working their asses off trying to change the fate of the world.
They're trying to impose the most socially negative outcomes.
Take comfort in
the fact that they represent only a minuscule percentage of the world’s
population, over ninety-nine percent of whom are good, decent people not
looking to make trouble. All they want is to get by and to live in peace with
their neighbours. To raise their kids, to have a home and to make a living.
The silent majority won’t be quite as silent as they
might have been in the past.
The influences will change. A new balance will happen
because it must happen. It is inevitable, mathematically. The values of a
billion people in India will far outweigh the values of thirty-five million
Canadians when measured by bots and algorithms. (Think of the sheer spending
power.) Most of them prefer cute cats to predicting the apocalypse, which has
recently been renamed ‘declinism’. This guy says the idea of decline is
dangerous and I tend to agree. I thought long and hard before sharing or linking
to his article.
My opinion is that it’s a bit overblown, although the
rise of anti-democratic or anti-egalitarian value systems must always give us
pause for thought.
That’s the funny thing about the world, besides change
being a constant.
We’re not always in the majority.
This time, I would say that we are. And that's what they're against. They are against rule of the majority, by the majority, and for the benefit of the majority. No, ladies and gentlemen. The bad guys don't like that one at all.
Ultimately, the artificial intelligences will be able
to measure that. Right now, all they know is that you’re browsing for white
cotton shirts, or a grappling-hook, or a pair of size seventeen wingtip shoes.
They never think to ask why, but then they never cared
in the first place.
END
...hey. Check out my new book, Tactics of Delay, on Smashwords.
Thank you for reading.