I still remember the hit, back in 2002, of Bowlingual on the Japanese market: a product that let you use a cellphone (or a separate device) to connect to your dog. The dog wore a collar fitted with sensors (a microphone plus movement sensors able to detect the wagging of the tail) and a loudspeaker. The data on its movements (particularly the tail wagging) and its barking were processed by a microchip and converted into a sentence, out of a set of 20 possible phrases, that you could hear on the phone. You could call your dog using your phone, talk to it and get responses like “I feel hungry, when are you coming home?”.
The basic idea was to have a computer interpret a number of clues making up the dog’s language and translate them for us.
The product is still available today (it sells on Amazon for about US$200), has seen a number of revisions and has been flanked by Meowlingual (you can guess what that one is for…). You can google both to read about their effectiveness.
The basic idea is good. Can we use a computer to help us understand “alien” communications? In recent years we have seen a tremendous uptake of Artificial Intelligence, so it should be no surprise that it is being applied to this area as well.
Here comes the news, reported by Wired, of Ariana Anderson, a mom of three and a UCLA computational neurophysiologist, who has developed Chatterbaby, an AI-powered application that is claimed to be able to translate babies’ cries into meaning. The app has been trained on 1,700 babies and hundreds of thousands of cry recordings and has, according to the developers, an accuracy of 90% in identifying the reason for crying: being hungry, feeling pain, being fussy or seeking company…
Back in 2012 a study pointed out that babies who would later be diagnosed with an autism spectrum disorder gave some hints, as early as when they were 6 months old, in the way they cried. Their cry had a higher than normal pitch (see clip). The difference was hard for an untrained ear to detect, but a computer analysing the sound could spot it. The study was based on a limited set of babies in families that already had a kid with autism (as part of a study assessing the probability of autism recurring in siblings), so it couldn’t be taken as statistically sound.
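To give a feel for how a computer can “hear” what an untrained ear cannot, here is a minimal sketch of pitch detection via autocorrelation. Everything in it is illustrative: the 600 Hz threshold is an invented placeholder, not a clinical value, and a synthetic tone stands in for a real cry recording.

```python
# Illustrative sketch: estimate the fundamental frequency of a sound with a
# simple autocorrelation peak search, then flag unusually high pitch.
# The threshold and signal are hypothetical, for demonstration only.
import numpy as np

def fundamental_hz(signal, sample_rate, fmin=100.0, fmax=1200.0):
    """Estimate the fundamental frequency (Hz) via the autocorrelation peak."""
    sig = signal - signal.mean()
    # Autocorrelation for non-negative lags only.
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lo = int(sample_rate / fmax)          # smallest lag worth considering
    hi = int(sample_rate / fmin)          # largest lag worth considering
    lag = lo + np.argmax(corr[lo:hi])     # lag of the strongest repetition
    return sample_rate / lag

def high_pitch_flag(signal, sample_rate, threshold_hz=600.0):
    # Purely illustrative threshold, not a diagnostic criterion.
    return fundamental_hz(signal, sample_rate) > threshold_hz

# Synthetic example: a 700 Hz tone stands in for a high-pitched cry.
rate = 16000
t = np.arange(2048) / rate
cry = np.sin(2 * np.pi * 700 * t)
print(fundamental_hz(cry, rate))   # close to 700 Hz
print(high_pitch_flag(cry, rate))  # True
```

A real system would of course work on framed, noisy recordings and far subtler acoustic features than a single pitch number, but the principle is the same: the machine measures what the ear only vaguely senses.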
Now researchers are looking into the possible use of Chatterbaby as a diagnostic tool for early autism detection. Chatterbaby has the potential to be used by thousands of families, and the possibility of analysing data from such a large sample may provide a definitive, statistically sound answer. Notice that, as in most cases in medicine, you don’t have a straightforward one-to-one relation of the sort “high pitch” → “autism”. It is just a hint; it needs to be evaluated against several other parameters, themselves fuzzy. This is where access to a large data lake, analysed with artificial intelligence algorithms that get smarter over time (meaning that when a clue detected now is confirmed in two years’ time the algorithm “learns” and gets better), may pay off.
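The “gets smarter over time” idea can be sketched in a few lines: a simple model scores acoustic clues, and every time an early flag is later confirmed or disproved, the model is nudged accordingly. The feature names, weights and numbers below are all invented for illustration; nothing here reflects Chatterbaby’s actual internals.

```python
# Toy sketch of learning from delayed confirmations: a logistic model over
# (hypothetical) acoustic features is updated whenever an outcome is known.
import math

class CryScreeningModel:
    def __init__(self, n_features, lr=0.1):
        self.weights = [0.0] * n_features
        self.bias = 0.0
        self.lr = lr

    def risk(self, features):
        """Probability-like score in (0, 1) that a clue is meaningful."""
        z = self.bias + sum(w * x for w, x in zip(self.weights, features))
        return 1.0 / (1.0 + math.exp(-z))

    def confirm(self, features, outcome):
        """Called when a past clue is confirmed (1) or disproved (0),
        possibly years later; one gradient step on the log-loss."""
        err = outcome - self.risk(features)
        self.weights = [w + self.lr * err * x
                        for w, x in zip(self.weights, features)]
        self.bias += self.lr * err

# Hypothetical features: [normalised pitch score, cry-duration score]
model = CryScreeningModel(n_features=2)
high_pitch = [1.0, 0.8]
typical = [0.2, 0.3]
for _ in range(200):            # confirmations trickling in over the years
    model.confirm(high_pitch, 1)
    model.confirm(typical, 0)
print(model.risk(high_pitch) > model.risk(typical))  # True
```

The point of the sketch is the feedback loop, not the model: each confirmed outcome makes the next prediction a little better, which is exactly why a large, long-lived data lake matters.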
It would be a great help in the management of autism, since the sooner the diagnosis is made, the better the chances of helping the growing kid with customised support.