I'm sure you're all familiar with this scenario. There's a computer - or a robot, or an automaton of some type, or an android. And it speaks. But it doesn't speak in contractions. Many scenarios even take this further, showing development in the artificial individual by having him/her/it start to use contractions. It's a hint of sentience.
But what does it actually mean?
Just yesterday I got to thinking through what this sudden acquisition of contractions would mean in a linguistic sense, and I arrived at a conclusion which made me blink.
I should of course begin by explaining that I'm a descriptive linguist and very much a believer in the chaos-theoretical model of language and language learning. Chomsky-style "universal grammar" isn't necessary in such a model, and since languages vary so widely across the world, that comes as a relief to me. [Language "universals" tend not to be true universals, but very large-scale trends.]
So what kind of system is assumed to underlie speech that uses no contractions? Basically, it implies a language built up from a list of vocabulary words and a set of syntactic rules. An artificial intelligence working from those resources wouldn't use contractions because they aren't part of its programming, so the sudden appearance of contractions would imply that it has transcended its fundamental programming.
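To make that concrete, here's a minimal sketch (all names hypothetical) of the kind of system the trope implies: a generator working from a word list, where contractions only ever appear if someone explicitly adds a rewrite rule for them.

```python
# Toy sketch: a word-list-plus-rules generator. Contracted forms are
# not emergent; they exist only as an explicit, optional rule table.

CONTRACTION_RULES = {
    "do not": "don't",
    "I am": "I'm",
    "it is": "it's",
    "cannot": "can't",
}

def generate(words, use_contractions=False):
    """Join a word list into a sentence, optionally applying
    contraction rewrite rules as a post-processing step."""
    sentence = " ".join(words)
    if use_contractions:
        for full, short in CONTRACTION_RULES.items():
            sentence = sentence.replace(full, short)
    return sentence

print(generate(["I", "am", "afraid", "I", "cannot", "do", "that"]))
# -> I am afraid I cannot do that
print(generate(["I", "am", "afraid", "I", "cannot", "do", "that"],
               use_contractions=True))
# -> I'm afraid I can't do that
```

In a system like this, "learning to use contractions" means someone flipped a switch that was in the program all along, which is exactly why the trope reads as transcending the programming.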
If, however, we look at the acquisition problem from a neural network perspective, it looks very different. A neural network acquires language from examples it receives in its environment. It would parse out words based on patterns of repetition and difference, and it would parallel the developmental curves we see in human language learning: proficiency climbs steadily at first, driven by memorization; then the learner grasps a larger pattern and overapplies it, so performance on tests actually dips (a child who has been saying "went" starts saying "goed"); and finally the learner re-introduces the exceptions to the rule. I would expect a successful neural network-based artificial intelligence to speak in contractions very early on, regardless of its proficiency in other matters (not to mention its sentience).
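The U-shaped curve above can be sketched as three toy stages, using English past tense as the classic example (this is an illustration of the developmental pattern, not of how a real neural network is implemented):

```python
# Hypothetical sketch of the U-shaped learning curve: rote
# memorization, then an overapplied rule, then rule plus exceptions.

MEMORIZED = {"go": "went", "run": "ran", "walk": "walked"}
IRREGULARS = {"go": "went", "run": "ran"}

def stage1(verb):
    # Early: pure memorization; correct, but only on known words.
    return MEMORIZED.get(verb)

def stage2(verb):
    # Middle: the "-ed" rule is grasped and overapplied, so
    # performance on irregulars actually gets worse ("goed").
    return verb + "ed"

def stage3(verb):
    # Late: exceptions are re-introduced on top of the rule.
    return IRREGULARS.get(verb, verb + "ed")

print(stage1("go"), stage2("go"), stage3("go"))  # went goed went
```

The point is that the dip in stage 2 is a sign of generalization, not regression; a learner that never overapplies a pattern probably never extracted one.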
So in the end, the contractions question is something of a conceit. That isn't necessarily a bad thing. By "conceit" I mean it's a gesture toward the idea of developing intelligence - subtle enough that not everyone will notice it, yet a definite change in an artificial being's behavior. If you try to reason through why this super-amazing artificial brain can't seem to operate on anything more sophisticated than a vocabulary list and a set of syntactic rules, you might get tripped up... But the most important thing here is that the use of contractions is a marker that ordinary people will notice. And as a marker, it has been very useful - and continues to be useful - to science fiction storytellers.