In the next few years, if it seems like your smartphone’s recommendations are getting better, it may be because it’s getting smarter:
Smartphones are ideal devices for [mood detection] because they are filled with sensors that detect light, sound, motion and location, all of which might help deduce a user’s emotional state. In tests, MoodExplorer could guess the mood of users correctly from their smartphone data 76 per cent of the time, where mood was judged as either happy, sad, angry, surprised, afraid or disgusted. [“App guesses your emotions to target you with adverts,” NewScientist (17 February 2018)]
I definitely have mixed feelings on this one, being a fairly private person. I recall a high school class where an assignment was given out to write up how you were feeling at the moment. My response was that it was a private matter. The teacher suggested that perhaps I should see a counselor or psychologist, a suggestion I promptly and completely ignored as ludicrous.
But it’s not surprising that a machine learning system could be trained for this sort of deduction: we share a common body language, and even if it’s somewhat culturally dependent, that merely means first identifying the culture and then selecting the proper system for it.
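As a toy illustration of that sort of deduction, here is a minimal sketch of classifying mood from smartphone sensor readings. Everything here is invented for illustration: the feature set (ambient light, sound level, motion), the centroid values, and the nearest-centroid approach are assumptions, not MoodExplorer's actual features or model, which the article does not detail.

```python
# Hypothetical sketch: guessing mood from smartphone sensor features.
# Feature vectors and centroids are invented for illustration only;
# MoodExplorer's real features and model are not described here.
from math import dist

# Invented centroids of (ambient_light, sound_level, motion) vectors,
# each normalized to [0, 1], one per mood category from the article.
MOOD_CENTROIDS = {
    "happy":     (0.8, 0.6, 0.7),
    "sad":       (0.2, 0.2, 0.1),
    "angry":     (0.5, 0.9, 0.8),
    "surprised": (0.7, 0.8, 0.5),
    "afraid":    (0.3, 0.4, 0.9),
    "disgusted": (0.4, 0.3, 0.3),
}

def guess_mood(features):
    """Nearest-centroid guess: return the mood whose centroid is closest."""
    return min(MOOD_CENTROIDS, key=lambda m: dist(features, MOOD_CENTROIDS[m]))

print(guess_mood((0.75, 0.65, 0.7)))  # closest to the "happy" centroid
```

A real system would of course learn such profiles from labeled data rather than hard-code them, and per the cultural-dependence point above, one could maintain a separate centroid table per culture and select the appropriate one first.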
Is this something that would open us up to manipulation? An interesting question to meditate on.