In 2013, I set myself a goal of running 1000 miles. I used RunKeeper to record my runs and its goal feature to track my progress and quickly see how much I had left. Two days before the new year, I hit my goal and got a little notification from RunKeeper congratulating me. That small notification got me thinking about how emotion is built into our products. RunKeeper doesn’t care whether the goal was 1 mile or 1000 miles - the reaction would be the same. Yet if I shared those two achievements with my friends, the reactions would be completely different. Sure, an algorithm could be designed to treat accomplishments of varying difficulty differently, and it could even be adapted to account for the fact that, for some people, running one mile is as hard as running 1000 miles is for others.
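To be concrete, here’s the kind of scaling I mean - a toy Python sketch, purely illustrative (the function name and thresholds are made up, and this is nothing like RunKeeper’s actual code), that rates a goal against a runner’s own history:

```python
def praise_level(goal_miles: float, typical_weekly_miles: float) -> str:
    """Rate a yearly mileage goal against what this runner usually logs."""
    if typical_weekly_miles <= 0:
        return "huge"  # any goal is a big deal for a brand-new runner

    # Roughly how many "typical years" of running the goal represents for this person.
    relative_effort = goal_miles / (typical_weekly_miles * 52)

    if relative_effort < 0.5:
        return "nice"
    elif relative_effort < 1.5:
        return "great"
    return "huge"


# 1000 miles is "huge" for someone who barely runs, but only "nice"
# for someone already logging 40 miles a week.
print(praise_level(1000, 5))   # huge   (1000 / 260  ≈ 3.8)
print(praise_level(1000, 40))  # nice   (1000 / 2080 ≈ 0.48)
```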
But I doubt a smarter algorithm would actually make a difference. We might appreciate its intelligence, but we’re not going to believe that the digital praise was authentic or that our software actually cares. We already have Google Now promising to give us the information we need when we need it, and Amazon trying to ship products to our doors before we even place the order. Yet as smart as these are, they’re not emotional. Even Siri is just an algorithm.

As technology gets smarter, though, I wonder whether future generations will feel this way. We’ll continue to see these improvements as simply smarter software and better data; we’ve watched how dumb our technology has been and won’t be able to think of it as anything more than software. Future generations will be born into and grow up in a world surrounded by smarter and better versions of what we have, and they won’t be saddled with that bias. Many believe the singularity will happen in our lifetimes, but I think this shift in perception will have the larger effect: we’ll start viewing technology as our equal.