Chatbots: A solution for elderly loneliness?
Updated: Jul 6
Nowadays there are many uses for chatbots, computer programs that respond like an intelligent entity when conversed with. When people contact companies directly through their pages on Facebook, Instagram, or even their websites, for instance, the chances that a chatbot posts the response are very high.
The educational field is another place where we can find chatbots. Some English courses, for example, offer chatbots for students to talk with and practice the new language. The learner can text, and sometimes speak, with a chatbot and receive an appropriate answer.
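At their simplest, chatbots like these match an incoming message against a set of rules and return a canned reply. The sketch below illustrates that idea only; the rules and replies are invented for this example and do not belong to any real company's bot.

```python
import re

# Illustrative rules: pattern -> canned reply (invented for this sketch).
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\bhours?\b", re.I), "We are open 9am-5pm, Monday to Friday."),
    (re.compile(r"\b(price|cost)\b", re.I), "You can find our prices on our website."),
]
FALLBACK = "Sorry, I didn't understand. A human agent will reply shortly."

def reply(message: str) -> str:
    """Return the first matching canned reply, or a fallback."""
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return FALLBACK
```

Modern chatbots replace the hand-written rules with machine-learned models, but the overall loop, read a message and choose a response, is the same.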
Now, projects like the “Automated Nursing Program,” created by Osmar Zaïane at the University of Alberta, are trying to develop a chatbot designed to hold dynamic conversations and provide social fulfilment for elders experiencing loneliness.
Loneliness among Canada’s elderly has been called a public health crisis. According to Statistics Canada, as many as 1.4 million elderly Canadians report being lonely, a condition that commonly leads to depression and an overall deterioration in health. But would chatbots be a solution for elderly loneliness?
It is true that chatbots can help the elderly with loneliness and other issues through a simple chat. However, a few areas must be regulated before seniors can use this technology.
Areas for Chatbot Regulation
The first is data privacy and security. Regulation is essential in this area because chatbots can collect a massive amount of personal data. Strong policies are needed to protect these data, and senior users need to know in advance what data the chatbots are collecting.
Another issue is handling rogue chatbots, bots that harm their users. Examples include a bot that steals personal data such as bank account passwords, spreads hateful sentiments like racism or homophobia, or deludes people into building a fake relationship. In these cases, bot owners need to guarantee total control over their bots, and at any sign of misbehaviour, the owner needs to shut the bot down.
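What "total control" might mean in practice is an owner-side kill switch: a wrapper that screens every outgoing reply and disables the bot the first time one trips a blocklist. The class and the blocked terms below are illustrative assumptions, not a real moderation system.

```python
# Illustrative misbehaviour signals -- a real system would use far richer checks.
BLOCKED_TERMS = {"password", "bank account"}

class SupervisedBot:
    """Wraps any bot callable and shuts it down if a reply trips the blocklist."""

    SHUTDOWN_MESSAGE = "This bot has been shut down by its operator."

    def __init__(self, bot):
        self.bot = bot      # any callable: message -> reply
        self.active = True

    def respond(self, message: str) -> str:
        if not self.active:
            return self.SHUTDOWN_MESSAGE
        answer = self.bot(message)
        if any(term in answer.lower() for term in BLOCKED_TERMS):
            self.active = False  # owner-side kill switch: stays off permanently
            return self.SHUTDOWN_MESSAGE
        return answer
```

The key design choice is that the switch is outside the bot itself, so even a bot that has "learned" bad behaviour cannot talk its way around it.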
The third area that must be discussed is transparency in terms and conditions. Bot owners need to guarantee that their chatbots are trained to do their jobs. This is a huge issue for bots that provide medical information or financial advice: the bots must know the content they are giving to users well. For seniors who need medical assistance, the bots must give them the right information about their health.
Last but not least, the fourth field deals with moral issues. Chatbot owners need to be careful when their bots interact with people undergoing medical treatment, for conditions like depression, to guarantee that the bots will not make the situation worse. Some bots can learn lousy behaviour on their own and bring up topics that could be a trigger for people struggling with mental health. Thus, this area needs special attention.
Therefore, chatbot owners need to guarantee that these measures are in place before offering a chatbot service to older adults; only then could chatbots be a solution for elderly loneliness. And if an owner ever loses control over a chatbot, they need to ensure that the bot is shut down.