According to Trendhunter magazine, a Chinese robotics company called Siasun has built a robot that:
… has the ability to spot gas leaks, monitor your health, talk to you when you are alone, communicate with the police, send text messages and take phone calls when you are not at home.
The robot is built to take on the role of caretaker for the elderly. To many people, the above-mentioned features hardly seem enough to call it a ‘nanny robot’ or a ‘caretaker robot’. Regardless, the idea of a homebound robot caring for an elderly person or a child isn’t new. Rosie from The Jetsons has inspired many roboticists, and the PR2 robot from Willow Garage seems to dream of becoming a real-life Rosie: it can now fold towels, recognize and grab different objects, and plug itself in to recharge (watch a demo of PR2 folding towels below). So why aren’t there more nanny / caretaker robots around our houses? One of the reasons is that these systems raise an issue a lot of us worry about – privacy. M. Ryan Calo and Dr. Ian Kerr, from Stanford University and the University of Ottawa respectively, are among the few who are beginning to discuss such privacy issues. In a forthcoming roboethics anthology, Robot Ethics: The Ethical and Social Implications of Robotics, Calo is writing a chapter on the potential impact of robots on privacy (a sample excerpt of the chapter is available on the Stanford Law website).
One of Kerr’s students, Michal, recently covered this very topic in a blog post on ‘the laws of robotics’, listing some of the privacy concerns that arise when robotic technologies start to roam about our homes. To generalize, any personal information collected about you – whether it be statistics on when you’re most likely to get home, or whereabouts in the home you are at any given moment – can be extremely useful to an intelligent robot trying to assist you in your daily activities, but the same information can just as easily be abused. Take, for example, ‘what time you’re expected to get home’. In a positive scenario, a robot could use this information to start cooking dinner for you, so that at the end of a long day you come home to a warm home-cooked (robot-cooked?) meal. In a negative scenario, the same information could be passed on to a burglar – possibly a robotic burglar!? – who would use it to decide when to break into your house. Hence, it is not surprising that Siasun’s robot, whose physical detection system can automatically alert others, raises flags related to privacy. As Trendhunter magazine puts it, the software is ‘controversial’.