Are we really so open to manipulation by artificial intelligence?

A piece in Technology Review suggests that humans are easily influenced by artificial intelligence 'agents' - the likes of Siri and Alexa, and in this particular case dedicated agents for particular tasks (HT to Andrew Rosenbloom for drawing it to my attention).

The article's author suggests that many of our connections with other human beings are shallow and unreliable, so we appreciate that a suitably programmed agent will always be there for us - sympathetic and responsive. However, I do wonder if too much is being read into the clients' responses, in part because they were probably predominantly American, and it's hard to generalise from one society, especially one that is atypical in many ways. The other thing that struck me was whether any allowance was made for fun and system probing. If I say anything non-practical to Siri or Alexa, it is almost always either because I'm being ironic or because I'm curious to see how the system will respond. (A typical example would be the number of times I've asked Siri to open the pod bay doors.)

More worryingly, perhaps, we read: 'Every behavioral change we at Cognea wanted, we got. If we wanted a user to buy more product, we could double sales. If we wanted more engagement, we got people going from a few seconds of interaction to an hour or more a day.' This is something else I have to be sceptical about. How was this tested? I can't imagine any circumstance in which I would want to talk to an AI agent for 'an hour a day'. I might do it once, to test out its abilities - but I would no more talk to an AI agent for an hour a day than I would to a telephone salesman. Equally, I don't doubt agents can increase sales by the way they interact - but so can a well-designed web page. And human psychology being what it is, I'd expect a small percentage of this improvement to be down to being flattered by the agent's attention, even if the client is aware that it is a program speaking - because we are hard-wired for a lot of these things.

Overall, there's no doubt that as AI becomes more sophisticated we will be increasingly at risk of being manipulated by it - particularly if 'we' are the kind of people who fall for scam emails. (Again, who were the people in these tests? How big were the sample sizes? How were the samples selected? Where are the statistics? It's all a bit vague.) We need to make sure that this manipulation does not go too far, just as we do with our strict controls on advertising (again, this may be a US difference, where advertising seems to have far fewer restrictions on what it can say than is the case in Europe). But I don't think it's time to panic yet.
