Artificial Companionship in Times of Loneliness

“Little Miss” was just a child when her father acquired a domestic robot, which she named Andrew. Although Andrew was supposed to serve as a butler and valet and perform other domestic chores, “he” spent a significant amount of time playing with Little Miss and her sister, who enjoyed and demanded his company. With time, the other family members also grew fond of the robot. Andrew’s bond with Little Miss lasted throughout her life. She died in old age, surrounded by her children and grandchildren and holding Andrew’s hand (Asimov, 1976).

The narrative above is a brief summary of “The Bicentennial Man”, one of the many stories written by Isaac Asimov, one of the most celebrated science-fiction writers. In his fictional worlds, robots were as intelligent as human beings and were made to help them, so humankind would not have to face “the universe alone and without a friend” (Asimov, 1950, p. 8). Although robots can be defined as machines imbued with artificial intelligence (Kilani, Hamida, & Hamam, 2017), robots as smart as the ones described by Asimov may still be far from current reality. How smart AI might become is a controversial question, and there is no agreement on whether it is possible for a machine to reach human-level intelligence, referred to as “artificial general intelligence” (AGI), or to exceed human intelligence. Some AI experts believe that AGI will be achieved within the next 50 years, while others think it will take longer or never happen (see Baum, Goertzel, & Goertzel, 2011; Grace et al., 2018). However, the consensus is that, due to constant technological development, robots will become increasingly similar to humans in the way they act and interact (Barrat, 2013). Robotics is a fast-expanding field, and social robots may bring profound changes to human relationships. Since 2006 there has been an annual conference on human-robot interaction (HRI), and it is believed that, one day, “robots may become caretaking assistants for the elderly, or academic tutors for our children, or medical assistants, day-care assistants, or psychological counsellors (…) They may become our friends” (Freier, 2010).

If robots may become our friends, could they be a solution to the so-called “loneliness epidemic” that has been broadly announced by the media? This question calls for further investigation: first, given the hypothesis that lonely people are more inclined to ascribe human characteristics to non-human agents (Epley et al., 2007), which may result in a higher acceptance of robots by this specific group; second, due to the expected progress of technology. Thus, this paper seeks to explore HRI and how it could potentially evolve, and whether today, or one day, lonely, or even non-lonely, people will be able to rely on robots for companionship or, at least, to elicit positive emotions. This article will be structured as follows. First, it will explore why social robots are being considered as an alternative form of social interaction; this involves looking at humans’ social nature and the structure of their relationships, and understanding the concept of loneliness, its impacts, the current scenario and how it can affect people in different ways. Second, the phenomenon of anthropomorphism and its different sources of influence will be considered, along with the possible benefits that may derive from this process; this section is relevant because it explores humans’ attachment to non-human agents, thus building the case for human interaction with social robots. Third, based on what is already known, aspects of social robots will be explored: the current technology and the types of robots that are available will be discussed. Fourth, the features that are most likely to lead to an increase in robots’ acceptance and sustained interest will be examined; this section aims to provide insights for the design of robots as well as to sustain the argument that social robots have the potential to interact with people in significant ways. Fifth, possible outcomes of HRI will be considered and questions will be raised, aiming to prevent negative scenarios and to encourage healthy and positive interactions with robots as additional resources, not as substitutes for human-human relationships. Finally, the different types of strategies that may be used to address loneliness (e.g. interventions or interaction with non-human agents) will be briefly analysed.

The Social Brain of the Social Species

Across many species, living in communities favours members’ security, health and performance, among other benefits (Sapolsky, 2018). Living in groups is, for many, key to guaranteeing survival and well-being, but alongside its benefits there is a downside to living with others. Conflicts among group members may occur, as food has to be shared and members may fight for dominance; avoiding all this demands organisation. For a group to be functional, its members must be able to interact and collaborate (Lieberman, 2013). The more organised the group, the higher its chances of success. Because of that, many scholars argue in favour of the Social Brain Hypothesis (SBH), which states that humans developed a larger neocortex to meet the demands of living in complex social systems (Barton & Dunbar, 1997; Dunbar, 1998). This hypothesis, therefore, attributes Homo sapiens’ dominance to social reasoning; in other words, it could be said that evolution designed brains “wired for reaching out and interacting with others” (Lieberman, 2013, p. 9).

Aspects of human behaviour, such as emotional contagion, can be used to support the SBH. Emotional contagion can be defined as attentional, emotional or behavioural synchrony between two or more people, occurring in response to another’s emotion (Hatfield et al., 1992). Behavioural mimicry is a form of emotional contagion that occurs when someone unconsciously and unintentionally imitates another’s movements, expressions, postures or vocalisations, for example, yawning (Provine, 1986, 1989), laughing (Provine, 1992) or smiling (Krumhuber et al., 2014; Mojzisch et al., 2006) while watching someone else do it. Although emotional contagion may occur between strangers, it is mostly related to rapport or mutual liking (Chartrand & Bargh, 1999; Webb et al., 2011). Other forms of emotional contagion happen at a deeper level, when a person is able to experience another’s distress, which can be argued to be a demonstration of empathy; for instance, it has been suggested that adults’ moods can be influenced by hearing a speech declaimed in a sad or happy tone of voice (Neumann & Strack, 2000) and that babies cry harder when they hear other infants crying (Sagi & Hoffman, 1976).

The importance of connecting with others is such that emotional pain appears to have the same impact on the body as physical pain. A study that analysed brain activity during moments of social rejection demonstrated that the brain region associated with physical pain was also active in moments of psychological pain (Eisenberger et al., 2003). Just as disconnection may bring pain, social connection may relieve it (Coan et al., 2006; Master et al., 2009). In an experiment in which participants underwent a painful procedure, those who felt the least pain were the ones who held their partner’s hand during the procedure, followed by those who looked at (and held) a picture of their partner; in third place were those who held a stranger’s hand, and in last place were those who held an object (Master et al., 2009). This study, besides evidencing the importance of close connections, also alludes to the importance of physical contact in human relationships. Overall, social touch has been shown to elicit physiological, emotional and behavioural responses. For instance, hugging is a primary form of comforting someone who is experiencing stress or negative emotions (Dolin & Booth-Butterfield, 1993). Physical touch affects bonding (Nummenmaa et al., 2016) and is one of the five ways a person expresses love (Goff et al., 2007). Among couples, touch has a positive impact on emotional regulation and well-being (Debrot et al., 2013, 2014). Regarding behavioural responses, there is some evidence that being casually touched, on the arm or shoulder, increases one’s prosocial behaviour (Crusco & Wetzel, 1984; Guéguen & Fischer-Lokou, 2003).

Social connectedness and the role others play in one’s life are also fundamental for human thriving, optimal functioning and flourishing. Well-being depends on five basic elements, and positive relationships are one of them, along with positive emotions, engagement, meaning and achievement (Seligman, 2012). If any of these elements is compromised, well-being cannot be achieved, whereas maximum well-being results from the maximisation of all five. Besides, “hive emotions”, such as love, kindness, compassion and cooperation, are a consequence of humans’ social nature and, therefore, threats to our positive interactions also threaten these socially driven behaviours (Seligman, 2012). Finally, play among adults is associated with social purposes and viewed as a mechanism that fosters cooperation and enhances behavioural flexibility (one’s ability to adapt to changes in the environment; Palagi, 2006). While playing, both human and non-human primates often smile and laugh, which may signal that one is experiencing positive emotions and may also have a positive influence on the playing partner (Cordoni & Palagi, 2011).

The Structure of Human Relationships

Given that being socially oriented is inherent to human nature, it might be asked whether people have similar relationship structures or a certain number of people with whom they interact more frequently. According to Dunbar (1993), the answer is “yes”, and a person’s “natural” group size, or social circle, contains about 150 people. These 150 people are allocated to four hierarchically inclusive circles; the level of intimacy and frequency of contact determine who is allocated to which circle. The innermost circle has three to five people, to whom the individual is emotionally closest and with whom contact occurs at least once a week; this group is also known as “the support clique”, as it is formed by those most likely to provide emotional support in times of crisis. The second layer, or “sympathy group”, contains 12 to 15 people, with whom interactions happen at least once a month and with whom one shares a relationship of mutual support. The third and fourth circles contain around 50 and 150 people, respectively. Outside the social circle there are two outer layers, of about 500 and 1,500 people. The number of people in each layer is constant, but a given person might move to a different circle or begin and cease to be part of someone’s life (Dunbar, 2010).

Dunbar (2014) also claimed that 20% of a person’s day is devoted to social interactions. Of this time allocated to socialising, 40% is spent with the people in the innermost group, and the rest is spread proportionally across the remaining 145 people of the social network (Sutcliffe et al., 2012).
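As a rough illustration only, Dunbar’s layered structure and the time-allocation figures cited above can be sketched in code. The layer sizes and the 20%/40% shares come from the text; the function, the unnamed layer labels and the assumed 16-hour waking day are illustrative choices, not part of Dunbar’s model:

```python
# Illustrative sketch of Dunbar's hierarchically inclusive circles.
# Cumulative layer sizes follow the approximate figures cited above;
# labels for the third and fourth layers are placeholders, as the text
# names only the first two.
CIRCLES = {
    "support clique": 5,    # emotionally closest; contact at least weekly
    "sympathy group": 15,   # mutual support; contact at least monthly
    "third circle": 50,
    "full social circle": 150,
}

SOCIAL_SHARE_OF_DAY = 0.20  # ~20% of the day spent on social interaction
INNER_GROUP_SHARE = 0.40    # ~40% of that time goes to the innermost ~5

def daily_social_minutes(waking_minutes=960):
    """Split a (assumed 16-hour) waking day's social time between the
    innermost group and the remaining ~145 people of the network."""
    social = waking_minutes * SOCIAL_SHARE_OF_DAY
    inner = social * INNER_GROUP_SHARE
    return {
        "total_social": social,
        "inner_group": inner,                # shared among ~5 people
        "remaining_network": social - inner, # spread over ~145 people
    }
```

On these assumptions, a 16-hour waking day yields roughly 192 minutes of social time, of which about 77 minutes go to the innermost three to five people.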

Loneliness: An Overview

Although it can be argued that being part of a group has ceased to be the primary asset for guaranteeing one’s survival, there is increasing evidence that loneliness and social isolation can be extremely harmful to physical and mental health. Together, these conditions have been correlated with increased systolic blood pressure (Hawkley et al., 2006), lower immunity and a higher incidence of infections (J. T. Cacioppo et al., 2003), poor sleep quality and elevated chances of suffering from sleep disorders (Friedman et al., 2005; Mahon, 1994), and increased levels of stress hormones (Holt-Lunstad et al., 2015), among others. Regarding mental health, loneliness has been associated with a higher incidence of depression (Heinrich & Gullone, 2006) and has been indicated as a predictor of increases in depressive symptoms over time (J. T. Cacioppo et al., 2006).

Loneliness and social isolation are commonly used to refer to the same condition, yet these concepts do not necessarily have the same meaning; poor social connections and infrequent social interactions, objective measures of social isolation, are not always indicators of loneliness (Coyle & Dugan, 2012). One may have few people in one’s social circle but not feel lonely; conversely, loneliness may be felt by those who have a rich social life. Moreover, being in the same environment as others is not sufficient to avoid feeling lonely, as, for that, a person must feel an emotional connection to others (S. Cacioppo et al., 2015). In summary, loneliness can be understood as subjective social isolation; it is a negative feeling that emerges when one is not satisfied with one’s relationships (Peplau & Perlman, 1982). Due to its subjective nature, it seems reasonable to argue that loneliness, for some, might be caused by the absence of one important person, such as a partner, a close friend or a family member.

Types of Loneliness

The most common way to measure loneliness is the revised UCLA Loneliness Scale (R-UCLA), which measures, among other things, how often people feel isolated and whether they believe they have others to rely on (Russell et al., 1980). Results range from highly lonely to highly socially connected but can also provide information on the dimensions of loneliness, namely intimate, relational and collective loneliness (S. Cacioppo et al., 2015; Hawkley et al., 2005). These dimensions reflect a social structure similar to the circles described by Dunbar (Dunbar, 2010, 2014; Hawkley et al., 2005).
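A hypothetical sketch of how such a questionnaire is typically scored may make the measure concrete. The 20-item, 1–4 response format and the reverse-scoring of positively worded items follow common descriptions of the scale, but the function below and any item set passed to it are illustrative, not the published instrument:

```python
# Hypothetical scoring sketch for an R-UCLA-style questionnaire.
# Assumed format: 20 items rated 1 ("never") to 4 ("often"), with
# positively worded items reverse-scored so that higher totals
# indicate greater loneliness. Item content is not reproduced here.
def score_r_ucla(responses, reversed_items):
    """Sum 1-4 responses, reverse-scoring the items whose indices
    appear in `reversed_items` (a set of 0-based positions)."""
    total = 0
    for i, r in enumerate(responses):
        if not 1 <= r <= 4:
            raise ValueError(f"item {i}: response must be 1-4, got {r}")
        total += (5 - r) if i in reversed_items else r
    return total  # for 20 items, ranges from 20 to 80
```

Under these assumptions, a respondent answering “never” (1) to every negatively worded item, with no reverse-scored items, would obtain the minimum score of 20.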

Intimate loneliness corresponds to feelings of aloneness, rejection and dissatisfaction with one’s social relationships at a personal level. The most relevant predictor of this type of loneliness is marital status, as not being married or not living with a partner tends to increase such feelings; the second predictor, which applies only to the unmarried or those not living with a partner, is the number of friends and relatives, meaning that having few close connections is correlated with higher levels of isolation. As intimate loneliness might occur due to the absence of a significant other or of close people to rely on in times of need, it can be said to arise when there is a deficiency in what Dunbar described as the innermost circle, which contains three to five people. The second dimension, relational loneliness, regards familiarity and closeness to others; it can be understood as social dissatisfaction at an interpersonal level, and it occurs when a person has few or no friends or relatives with whom they maintain contact at least once every two weeks. In Dunbar’s model, this type of loneliness would involve a person’s second and third layers which, together, comprise up to 50 people. Finally, the third dimension, collective loneliness, can be viewed as social dissatisfaction at a group level, corresponding, in Dunbar’s model, to the remaining layers of the inner and outer circles. The number of group memberships a person has is considered the best predictor of collective loneliness.

Predictors and Measures of Loneliness

For a few years, the media has been announcing a loneliness epidemic (Easton, 2018; Howe, 2019) that portrays a concerning scenario. One of the premises on which the loneliness epidemic is based is a global rise in living alone, which also concerns older adults. According to Snell (2017), in 2011, 31% of the British lived alone. The highest rates registered were among the north-western European countries: the percentage of single-person households was higher than 40% in Norway and higher than 47% in Sweden. India (4%) and China (10%), nations historically characterised by joint household systems, were among the countries with the lowest rates. Data from 2010 demonstrated that in the US, although the average share of single-person households was 27%, in larger cities it could be as high as 40% (Bloch et al., 2010; Snell, 2017). It has also been argued that the percentage of younger people (16-24 years old) who reported often feeling lonely was higher than in any other age range (UK Office for National Statistics, 2017). Finally, beliefs that each generation is lonelier than previous ones, due to societal changes (Ortiz-Ospina, 2019), have also contributed to the conclusion that loneliness is on the rise.

Although most of the premises of the “loneliness epidemic” are valid, they do not provide evidence of a rise in loneliness, for several reasons. First, not living with a partner is a strong predictor of loneliness (Hawkley et al., 2005), and thus so is living alone, but these are not its only predictors. Age is also a factor of influence, mainly when associated with the loss of a partner or friends, lack of social activities and poor physical health (Aartsen & Jylhä, 2011); yet other factors influence loneliness as much as age does, such as nationality (Yang & Victor, 2011), psychological resources such as mastery and self-efficacy (Suanet & van Tilburg, 2019) or depression (Victor & Yang, 2012). Second, concerning loneliness among the youth, it has been demonstrated that, among the British, loneliness is U-shaped across age, with higher rates concentrated among those aged under 25 and over 65 (Victor & Yang, 2012). Third, there is no evidence, in any age range, of an increase in loneliness rates over the years; instead, the evidence is that, among older adults, loneliness is either decreasing or stable. For instance, a longitudinal study with Dutch adults (aged 54 to 99) showed that later-born cohorts had lower loneliness scores than earlier-born cohorts (Suanet & van Tilburg, 2019), whereas, in the US, loneliness rates among adults (aged 57 to 85) were found to be stable over a ten-year period (Hawkley et al., 2019).

It is essential to highlight that, whether or not loneliness is on the rise, the numbers are still high, and there is no question that loneliness is an issue that needs to be addressed; as it may have serious consequences, it should not be overlooked in any circumstance.

Anthropomorphism

Although human-to-human connection is, for most, the primary form of social relationship, some may also establish connections and bond with non-human agents, such as pets, religious entities and objects; these relationships develop as a consequence of anthropomorphism (Epley, Waytz, et al., 2008). Anthropomorphism occurs when one identifies or attributes human emotions, motivations, goals or other human characteristics and behaviours to non-human agents; it “involves going beyond the behavioural description of imagined or observable actions (e.g., the dog is affectionate) to represent an agent’s mental or physical characteristics using humanlike descriptors (e.g., the dog loves me)” (Epley et al., 2007, p. 865). The anthropomorphization of a non-human agent is the result of an inductive inference process in which previously acquired knowledge is used to make sense of the unknown. This induction process involves three stages: (a) activating knowledge about human beings (when facing a non-human agent), (b) adjusting and correcting this activated knowledge, and (c) applying these anthropomorphic representations to the non-human agent in question (Higgins, 1996). It has been suggested that these stages are influenced by three psychological determinants and by dispositional, situational, developmental and cultural variables (Epley et al., 2007).

The first psychological determinant is elicited agent knowledge which, although it serves as a basis for anthropomorphic induction, may be guided, or modified, by the two other determinants, the motivational mechanisms of effectance and sociality. Elicited agent knowledge is believed to be more easily retrievable because it is knowledge about humans and the self; it encompasses feelings and experiences about the self and what has been learned from observing, interacting and coexisting with similar others. Perceived similarity (a situational influence), either in motion or morphology, should increase the chances of anthropomorphism but, although egocentric and homocentric knowledge is usually more easily accessed, these representations might lose their initial strength. This can happen either through correction or through the coactivation of nonanthropomorphic knowledge, resulting from a higher need for cognition (a dispositional influence), in which a person engages in more effortful thinking to find a more precise representation, or from the acquisition of alternative theories (a developmental influence), which provide alternative sources for induction. Cultural influences are also relevant, as norms, ideologies, religion, rituals and other practices may provide different sources of understanding and opportunities for interaction with non-human representations.

Effectance and sociality, the second and third determinants, are motivational mechanisms and hence might be heightened or diminished depending on the deprivation or satisfaction of a specific need. Effectance involves interaction with a non-human agent and the degree of need to understand a present situation or predict the possible outcomes of an interaction. As anthropomorphism provides means to make sense of other agents, it should increase when there is a stronger need for closure and desire for control (dispositional and developmental influences) and/or a need to anticipate the outcomes of a future interaction (situational influence) and to avoid uncertainty (cultural influence). Sociality, in turn, regards a person’s need for connection, contact or approval. Therefore, anthropomorphism should occur more frequently among those who suffer from chronic loneliness (a dispositional variable) or are facing a situation of social disconnection (a situational variable). Developmental variables (more specifically, attachment type) and cultural variables may influence the intensity of the natural need for connection. For instance, people with an insecure-anxious attachment style are less resilient to situations of social exclusion or isolation, as are those who live in collectivist cultures, since they place more value on developing and maintaining relationships.

Anthropomorphism and Loneliness

Among the psychological determinants described above, the third one, sociality motivation, has significant importance for the matters raised in this paper, as it implies that, to satisfy the need for connection, some might create humans out of non-humans through the process of anthropomorphization. Although anthropomorphism itself would be enough to justify the use of robots by anyone, the hypothesis that sociality motivation increases anthropomorphic thinking argues in favour of the use of robots to treat situational or chronic loneliness. However, this argument is only valid if, first, it can be supported by empirical evidence and, second, the relationship with anthropomorphized agents proves to be beneficial to the person involved, either by reducing feelings of loneliness or by eliciting positive emotions.

Empirical evidence that supports the sociality motivation hypothesis can be found in research in which participants were induced with temporary feelings of loneliness or disconnection, or which involved lonely people or those believed to be predisposed to loneliness (e.g., marital status is a predictor of intimate loneliness [Hawkley et al., 2005]). For example, higher loneliness scores were correlated with higher chances of using anthropomorphic mental states to describe gadgets (Epley, Akalis, et al., 2008), loneliness induction resulted in increased belief in supernatural entities (Epley, Akalis, et al., 2008), and singles were found to be significantly more religiously active and to rely more on God or religion for support and security than non-singles (Granqvist & Hagekull, 2000). In addition, there is evidence that lonely people are not only more prone to anthropomorphize pets but also to attribute to them traits related to social connection, such as thoughtfulness, consideration and sympathy (Epley, Waytz, et al., 2008). In the UK, data collected from pet owners revealed that whereas 87% of them declared themselves very attached to their pets, the percentage was much higher among singles (96%) and widows or those not living with a partner (100%). Apart from empirical evidence, this hypothesis might also be supported by real-life events. For instance, in 2020, during the COVID-19 lockdown, when people were advised to keep social distance, there was a massive rise in pet fostering and adoption (Ellis, 2020), suggesting that a non-human agent can be viewed as a possible companion when a human one is not available.

The second condition for validating the argument favouring the use of social robots is to verify whether the interaction with an anthropomorphized agent has positive outcomes. In answer to that, there is some evidence that anthropomorphic thinking itself might reduce the painful feelings that follow social rejection (Brown et al., 2016). Still, research on pets (mainly dogs and cats) may be the best way to confirm the positive outcomes of interaction with anthropomorphized agents: in the UK, half of all adults own a pet (PDSA, 2019), whereas, in the US, 67% of households own either a dog (46%), a cat (31%) or another animal (APPA, 2019). Apart from the elevated number of households with pets, this number has also been on the rise (APPA, 2019), which suggests that pet ownership must be beneficial. Measurable outcomes evidence the positive influence of pet ownership on physical and mental health. Indeed, when comparing dog owners to non-owners, those who suffered a heart attack were less likely to die in the subsequent year (Friedmann & Thomas, 1995), elderly patients paid fewer visits to their doctors and had lower stress levels (Siegel, 1990), and HIV patients appeared to feel less depressed (Siegel et al., 1999).

Among those not facing acute stressors, dog owners appeared to have greater physical fitness, self-esteem and overall well-being, the latter being directly and uniquely predicted by having social needs fulfilled by a dog. Interestingly, the satisfaction of social needs obtained from one’s dog was not correlated with having poor social connections, which leads to the conclusion that the social support provided by dogs is distinct and independent from the support received from other people (McConnell et al., 2011). Self-determination theory (SDT; Deci & Ryan, 2000) might explain why pets can make a unique contribution as providers of social support. SDT holds that the satisfaction of the basic needs for autonomy (self-approval of one’s conduct and actions), competence (capacity and accomplishment) and relatedness (interpersonal relationships) fosters proactivity, integration and well-being; the frustration of these needs contributes to the opposite (Vansteenkiste & Ryan, 2013). In human-to-human interaction (HHI), practices that frustrate these needs might occur, which does not happen in human-pet relationships. For example, it has been demonstrated that the need for autonomy is compromised by conditional regard (Kanat-Maymon, Roth, et al., 2016), which means demonstrating love and affection towards another only when the latter complies with one’s expectations (Assor et al., 2004). As pets are perceived to be non-judgemental, non-critical and to offer unconditional love and regard (Archer, 1997), they cannot harm the need for autonomy. In addition, it has been demonstrated that pets fulfil the need for relatedness (Kurdek, 2008). Finally, the needs for autonomy and competence appear to be fulfilled by pets even among those who are not in crisis; the fulfilment of these needs occurs independently of the satisfaction of the same needs by a significant other (Kanat-Maymon, Antebi, et al., 2016), leading to the conclusion that both lonely and non-lonely people could benefit from this relationship.

Supporting the argument that pets are providers of social support, it was reported that 21% of pet owners talk to their pets about their problems (Dolliver, 2010). In addition, 76% of Americans consider their pets family members, and half of them consider their pets as much a part of the family, and as important, as any other person in the household (GfK Roper Public Affairs & Media, 2009). When comparing the importance of pets in offering overall support, owners indicated that their pets offered as much support as their parents and siblings did, although less than their best friends (McConnell et al., 2011). Similarly, another study identified that, among pet owners, even though the majority considered other people their main source of social support, 40% believed they received as much or more support from their pets as from humans (those who reported receiving more social support from pets were assumed to be lonelier [Paul et al., 2014]). Based on these findings, it could be said that, in relation to Dunbar’s social circle (2010), pets are likely to be considered part of the innermost circle.

Social Robots

In the previous section, evidence in favour of the use of social robots was provided, based on the hypothesis that sociality motivation increases the chances of anthropomorphization (Epley et al., 2007) and on the benefits to physical and mental health that might result from a relationship with a non-human agent (e.g., McConnell et al., 2011). Yet it could be asked why it is being assumed that robots would be anthropomorphized and treated like pets or other non-human entities. The answer, apart from evidence from HRI that will be discussed further, lies in the nature of social robots themselves. Social robots were developed to interact with humans in meaningful ways. Ideally, they should be able to bond with humans as other humans would (Breazeal, 2002; K. M. Lee et al., 2006); thus, they are likely to be anthropomorphized and treated socially, as happens with agents that are perceived as having mental capacity or being self-aware (Waytz et al., 2010). In addition, theories of human-computer interaction (HCI) have demonstrated that when interacting with social technologies, people tend to apply the same principles used in HHI; this process, which often occurs mindlessly, is known as the “computers are social actors” (CASA) paradigm (Nass & Moon, 2000; Reeves & Nass, 1996). Finally, pets are a source of emotional support (McConnell et al., 2011) and, as they are non-judgemental and offer unconditional love (Archer, 1997), they may be a more reliable source of basic-needs fulfilment than humans. Therefore, it could be said that pets can be part of a person’s innermost circle of relationships and that the same logic could apply to robots. In summary, it can be argued that social robots could be a valuable asset for those suffering from intimate loneliness and, moreover, that the non-lonely and those who suffer from other types of loneliness could also benefit from the companionship of robots.

In the next sections, different types of social robots will be explored, along with their main features, usability, user interaction and other relevant information, to sustain the above argument by giving examples of positive outcomes of HRI. As both physically embodied and disembodied robots (embodiment being understood as a physical body with sensors and motors that connect to the environment) are capable of interacting with humans, through voice, text, touch or other means, both types will be considered.

The Entertainment Dog Robot

The dog-like robot AIBO was launched in Japan and the US in 1999. At the time, AIBO stood out from motorized animal-like toys for its lifelikeness, the complexity of its responses and movements, and its non-repetitive behaviour (Fujita, 2004). Research conducted with AIBO owners indicated that the main reason for purchase was the desire to have a pet, followed by an interest in new and innovative technology and by entertainment (Fujita, 2004). Children can also benefit from having AIBO as a “pet”. Research showed that, while interacting with AIBO, young children appeared to react to it in the same way they would to a real dog (Kahn et al., 2004), treating it like an interactive partner (Melson et al., 2009). This occurred when AIBO expressed behaviours similar to a dog’s, which may indicate that the children’s reactions were not a consequence of any specific feature of AIBO but of its similarity to a pet; thus, other pet-like robots could produce the same effect and induce playing behaviours.

In general, zoomorphic robots, robots designed to imitate living beings (Fong et al., 2003), appear to be the type of robot preferred by women (Konok et al., 2018). AIBO was used in a survey in which participants had to choose their favourite among four types of robots, all with the same abilities, and was selected by 44% of the participants, while 29% opted for the humanoid robot and the rest for the telepresence or machine-like robot (Konok et al., 2018). Curiously, the preference for a dog-like robot was not correlated with previous experience with dogs. Although women appear to be more inclined to opt for pet-like robots, this type of robot is also highly accepted by men. A survey conducted with AIBO owners reported that 70% of them were men (Fujita, 2004); still, this could be a consequence of men’s interest in innovative technology (H. J. Lee et al., 2010).

The Therapeutic Seal Robot

Paro is a therapeutic seal-shaped robot intended for hospital patients, in particular children and seniors (Shibata et al., 1999). Designed to resemble an infant seal, it has soft artificial fur and sensors for posture, touch, vision and audition. Paro aims to evoke positive emotions by rewarding acts of affection; for example, it blinks when cuddled, as if to demonstrate gratitude or enjoyment. In studies with dementia patients, the use of Paro in group sessions was found to facilitate engagement in activities and to increase patients’ ability to interact with others; in addition, patients were more likely to express positive emotions, smile and laugh when interacting with Paro (Jøranson et al., 2016; Takayanagi et al., 2014). In an experiment conducted in a nursing home for the elderly, Paro was associated with an increase in positive mood among the residents, which, in turn, was reflected in lower burnout rates reported by the nursing staff (Wada et al., 2002).

Chatbots for E-therapy

Chatbots, or conversational agents, are AI systems that mimic human verbal behaviour and are able to engage in a conversation through text or voice (Vaidyam et al., 2019). Apart from their use as voice assistants (VAs) in smartphones and home devices, chatbots are also being used to treat mental health conditions (Gratzer & Goldbloom, 2020). Although this form of therapy, like other internet-based therapies, might present several disadvantages, such as the lack of direct patient monitoring and the absence of a diagnosis (which may lead to improper treatment), among others (Gratzer & Khalid-Khan, 2016), the risk of harm from the use of chatbots for therapy was found to be extremely low (Vaidyam et al., 2019). In addition, numerous advantages were identified: for instance, e-therapy has low or no cost, it is more convenient (it can be done from anywhere, favouring people with mobility issues), chatbots are never busy or distracted, and they may never forget something the patient has said (Gratzer & Goldbloom, 2020; Gratzer & Khalid-Khan, 2016). Moreover, e-therapy has been demonstrated to improve well-being and reduce perceived stress in non-clinical populations (Ly et al., 2017).

Some believe that e-therapy might be useful for treating people who might be uncomfortable sharing their feelings with a therapist or even with another human being (Vaidyam et al., 2019). Indeed, in a study in which participants were interviewed either by a person or by a “virtual human” (a 3D avatar on a computer screen), those interviewed by the avatar disclosed more intimate information and were more honest, which was attributed to the fact that they did not feel they were being judged (Lucas et al., 2014). Regarding people’s feelings towards therapy chatbots, a study with Woebot, a popular app in the US, noted that participants empathized with it, referring to it as “he”, “a friend” or “dude” (Fitzpatrick et al., 2017). Interestingly, this occurred with a robot whose very name reinforces its non-human nature (Fitzpatrick et al., 2017). Similarly, studies of chatbots used as training coaches also indicated the anthropomorphization of these agents (Bickmore et al., 2005; Gardiner et al., 2017).
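At their simplest, conversational agents of this kind can be reduced to pattern-matching rules that turn a user’s statement into a supportive follow-up question, in the spirit of the classic ELIZA program. The sketch below is purely illustrative: the rules and their wording are invented for this example and do not reflect how Woebot or any other real therapy chatbot works.

```python
import re

# Illustrative, ELIZA-style rules: each pattern captures part of the
# user's statement and reflects it back as a supportive question.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE),
     "Why do you think you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     "How long have you been {0}?"),
]
DEFAULT_REPLY = "Tell me more about that."

def reply(message: str) -> str:
    """Return a supportive follow-up question for the user's message."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(match.group(1).rstrip("."))
    return DEFAULT_REPLY
```

Real systems layer intent classification, dialogue state and clinical content (e.g., CBT exercises) on top of this basic reflect-and-ask loop, but the core interaction pattern is the same.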

Voice Assistants

Voice assistants can be found both in smartphones and computers (e.g., Apple’s Siri, Microsoft’s Cortana) and in stand-alone devices (e.g., Google Assistant and Amazon’s Alexa) (Ammari et al., 2019). In a poll conducted in the US in 2017, 46% of Americans reported using VAs and, although most (55%) declared that their main reason for use was hands-free operation of a device, 23% affirmed that their primary use of VAs was to have fun (Olmstead, 2017). While, on the one hand, people are proactively interacting with voice assistants to have fun, on the other, a few private initiatives are starting to call attention to the possible value that talking to “someone” might have. In Stockholm, as part of a project called Memory Lane (Stockholm Exergi, n.d.), senior citizens received a Google Home device which invited and motivated them to share their life stories. Software was developed especially for this initiative, with the intention of helping the elderly engage in emotional and meaningful conversations with the AI, in the hope that these interactions would have a positive impact on their mental health (Accenture, 2019). A similar, though less complex, program was conducted in an elderly care home in the UK (Greenwood Campbell, 2019). Residents, some as old as 92, also received a Google Home device and were encouraged to interact with the Google Assistant. In spite of involving a small number of participants, the project, named “Voice for Loneliness”, helped raise awareness of the benefits of such interactions.

Humanoid Robots for Love and Sex

Sex robots can be defined as (embodied or not) artefacts with humanoid appearance, humanlike movements and behaviour, and some degree of AI (Danaher, 2017; Danaher & McArthur, 2017). The market for sex robots started in 2010, selling female sex dolls with some customizable features, including sexual preferences (Griggs, 2010; Techcrunch, 2010). Since then, the market has expanded, and it is not uncommon to read news about men marrying such robots (Drewett, 2017; Haas, 2017) or even holograms (Jozuka, 2018).

The Potential of Social Robotics

Apart from the theories already cited, used as a basis to build the argument in favour of the use of social robots to interact with humans, particularities of HHI and HCI provide further support for this argument. On the other hand, barriers to technology adoption and, more specifically, to robots’ acceptance also exist. The considerations below help to draw a clearer picture of the factors that might weigh for or against the expansion of social robotics.

Familiarity and Perceived Usefulness

Studies of social interaction have demonstrated that repeated exposure to a novel stimulus, whether human or non-human, leads to familiarity, which may result in greater liking (Maslow, 1937). In fact, it has been observed that mere (repeated) exposure, that is, having the stimulus present in the observer’s visual field, is enough to enhance one’s positive attitudes towards it, as it shifts from being a novel stimulus to a familiar one (Bornstein, 1989; Zajonc, 1968). Research with domestic robots has demonstrated that the mere-exposure effect also holds in the context of robotics (de Graaf et al., 2015, 2014). Thus, as familiarity leads to greater liking (Maslow, 1937), it could be said that technological development itself would be enough to increase people’s acceptance of social robots, as these would become more accessible to the general population and, consequently, people would be more exposed to them.

Although familiarity has also been associated with disliking, a dislike driven by the preference for novelty or by curiosity (Maslow, 1937), it can be argued that this effect could be diluted as new models of robots became available, thus maintaining the novelty factor (as has been done with other technologies [West & Mace, 2007]). In addition, research on long-term interaction with robots has demonstrated that usefulness may counteract the loss of the novelty effect (de Graaf et al., 2015), which is believed to occur before the second month of use of a new device (Sung et al., 2008). Therefore, familiarity might be important at the initial stages of interaction, while usefulness might be the key feature for continued interaction.

In addition to its importance in sustaining long-term interactions (de Graaf et al., 2015), being perceived as useful is also vital for robots’ acceptance. For most users, a robot’s value rests on their judgment of its capacity to serve as a tool or a means to fulfil a need (Baisch et al., 2017; Chen et al., 2014). This might be a barrier to the acceptance of social robots without utilitarian purposes; thus, depending on the audience (e.g., those who are more resistant to recognizing the need for a companion or to accepting a robot as one), a useful strategy would be to add functionalities to the robot so that it serves purposes beyond being a companion, with companionship emerging later as a consequence of the interaction with the robot.

Robots’ Embodiment

A robot’s embodiment has a direct impact on its perceived ability to handle social interactions. As a robot’s design should reflect its intended function, a robot’s shape can elicit prior expectations from users (Fong et al., 2003), thus facilitating social interactions. Consistent with this assumption, it has been argued that physically embodied robots enjoy higher acceptance due to a stronger perception of social presence. Indeed, in a study comparing the physical robot AIBO with a virtual version of AIBO, the embodied AIBO was evaluated more positively and was attributed a stronger social presence (K. M. Lee et al., 2006). Disembodied agents, on the other hand, might be cheaper and, consequently, more accessible.

One of the main advantages of physically embodied robots is that many have built-in sensors that make them capable of physical contact, which might strengthen their connection with humans. Research on robot-initiated touch suggests it might induce responses similar to those of human touch, depending, among other factors, on the context and on the person’s perception of the robot’s intention (Chen et al., 2014). For example, in a study in which participants watched scary movies accompanied by a robot, the robot’s touch attenuated stress responses and enhanced perceived intimacy with the robot (Willemse & van Erp, 2018), although this effect only occurred if the person had previously interacted with the robot, or at least seen it, even without being formally “introduced” to it (Willemse et al., 2017; Willemse & van Erp, 2018).

Robots’ Social Behaviour

Theories of interpersonal attraction regard similarity, particularly in values, attitudes and behaviours, as one of the main determinants of likeability (Mccroskey et al., 1975); therefore, making compliments increases the giver’s chances of being liked by the receiver, as it is inferred that both share similar opinions on a specific matter. The same can be said of other types of ingratiating behaviour in HHI, such as flattery. As those who are flattered are inclined to believe what was said, flattery generates positive affect in the receiver and significantly increases the flatterer’s chances of being positively evaluated on different measures, such as empathy, intelligence and morality (Pandey & Singh, 1987; Pandey & Kakkar, 1982). There is evidence that receivers experience similar outcomes regardless of whether the giver is a human being or a machine. A study on HCI demonstrated that subjects who received positive feedback, either based on their performance or unrelated to it (the “flattery condition”), reported more positive evaluations of the interaction with the computer than those who received generic feedback not aimed at eliciting either positive or negative emotions (Fogg & Nass, 1997).

This link between flattery or compliments and positive evaluations was also demonstrated in a study in which children played chess with a cat-like robot (Leite et al., 2014). During the game, the robot offered assistance, as well as esteem, emotional and informational support, through both verbal and facial expressions. The study lasted five weeks and revealed that the children’s preferred type of support was esteem support, such as compliments, validation, reassurance and relief of blame. Results also demonstrated that the robot’s display of empathetic emotions facilitated the maintenance of longer-term relationships: first, because ratings of engagement remained constant during the study, and second, because ratings of perceived social support increased, the latter being attributed to the children’s growing awareness of the impact of their actions on the robot’s emotions. A similar experiment conducted with adults indicated that robots that behave more empathetically are viewed as friendlier, better companions and more reliable, suggesting that interactions with empathic robots are more enjoyable (Leite et al., 2013).

Robots’ ability to behave empathetically is likely to increase with the development of facial recognition software, allowing robots to understand a person’s emotions not only through their (oral or written) answers but also through the analysis of their facial expressions (for examples of current uses of this technology, see [Affectiva, 2020; iMotions Software Solution, n.d.]). In addition, it could allow robots to imitate the person’s expression, creating a bond through behavioural mimicry.
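Conceptually, this kind of behavioural mimicry can be reduced to mapping a detected emotion label onto a mirrored expression for the robot to display. The sketch below is a hypothetical illustration: the emotion labels and expression names are invented for this example, and in a real system the label would come from a facial-recognition model such as the commercial tools cited above.

```python
# Hypothetical emotion -> mirrored expression table; in practice the
# labels would be whatever taxonomy the recognition model emits.
MIRRORED_EXPRESSION = {
    "joy": "smile",
    "sadness": "concerned frown",
    "surprise": "raised eyebrows",
}

def mimic(detected_emotion: str) -> str:
    """Choose a facial expression that mirrors the user's detected emotion,
    falling back to a neutral gaze for unrecognized labels."""
    return MIRRORED_EXPRESSION.get(detected_emotion, "neutral gaze")
```

The fallback matters: displaying a mismatched expression (e.g., smiling at a distressed user) would undermine the perceived empathy the mapping is meant to create.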

Technological Obstacles to Social Robotics’ Expansion

Apart from price, one of the main obstacles to technology adoption is awareness. Although referring to dementia treatment, Astell (2019) raised the concern that, as consumer devices such as smartphones are not considered medical devices, physicians do not “prescribe” their use, and those who could benefit from them are not aware of their existence. The same logic can be applied to social robotics, since, apart from specific devices such as Paro (Klein et al., 2013), most social robots would be classified as consumer devices rather than medical ones. Once the barriers of cost and awareness are overcome, the next challenge is user-technology fit. Finally, privacy is also a concern, and it has been reported to be the primary factor in the acceptance of VAs (Burbach et al., 2019).

The Uncanny Valley

Perceived similarity is the underpinning of anthropomorphism, which, in turn, is one of the key factors for the acceptance of non-human agents (Epley et al., 2007). Yet, when it comes to robots, the intended similarity must be carefully calibrated to avoid the “uncanny valley”. The uncanny valley hypothesis implies that humans’ acceptance (or avoidance) of robots depends on their degree of human-likeness, both in looks and in actions and movements (Mori et al., 2012). According to this hypothesis, robots’ likeability and acceptance increase as a function of human-likeness, but only up to the point where robots reach an almost (but not quite) perfect degree of similarity to humans. When this happens, a person’s perception shifts from positive to negative, and this change, from a steady increase to an abrupt fall, is what defines the uncanny valley. A robot, machine or object (intended to be lifelike, e.g., a prosthetic arm) that at first glance appears humanlike but is later noticed to be artificial would then fall into the uncanny valley, causing eeriness. To avoid the uncanny valley, robots should present a moderate degree of human-likeness, which should elicit a significant degree of affinity without eliciting eeriness (Mori et al., 2012). Another option would be caricatured representations instead of extremely realistic ones (Fong et al., 2003). Finally, zoomorphic robots could also be used as a safe option to avoid this effect (Fong et al., 2003).
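Mori’s hypothesis describes a shape, not an equation, but the predicted relationship can be caricatured with a toy function: affinity grows with human-likeness, collapses into a narrow “valley” just short of full human-likeness, and recovers at the fully human end. Every constant below is an arbitrary illustrative choice, not an empirical fit.

```python
import math

def affinity(human_likeness: float) -> float:
    """Toy uncanny-valley curve for human_likeness in [0, 1]:
    affinity rises with likeness, dips sharply near (but below)
    full human-likeness, and recovers at 1.0. The functional form
    and constants are illustrative only."""
    # Gaussian "valley" centred at 85% human-likeness.
    valley = 1.2 * math.exp(-((human_likeness - 0.85) ** 2) / 0.005)
    return human_likeness - valley
```

With these made-up constants, a moderately humanlike robot (0.5) scores positively, an almost-human one (0.85) scores negatively, and a perfectly human-like agent (1.0) scores highest, reproducing the rise-dip-recovery shape the hypothesis predicts.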

It has been hypothesized that the leading cause of the uncanny valley might not be the “almost humanlike” physical appearance, but the perception of a humanlike mind in an inorganic being, more specifically, the perception of experience and agency (Stein & Ohler, 2017). Experience can be understood as the capacity to feel (e.g., hunger, fear, pain, pleasure, desire, consciousness, joy), whereas agency corresponds to the capacities for self-control, morality, memory, emotion recognition, planning, communication and thought (H. M. Gray et al., 2007). These two dimensions have been associated with Aristotle’s definitions of moral agents (agency) and moral patients (experience), the first being capable of performing morally right or wrong acts, and the second of suffering from morally right or wrong acts done by others (H. M. Gray et al., 2007). Although further investigation is needed, of the two dimensions of mind perception, experience appears to be the main cause of eeriness, with agency either less strongly correlated or not correlated at all (Appel et al., 2020; K. Gray & Wegner, 2012).

Discussion

In Asimov’s writings (1950), the three laws of robotics ensured humans would suffer no harm from robots and, if possible, that robots would not be harmed by humans either. The laws stated, first, that robots could not harm humans and had to protect them from being harmed; second, that they must obey human commands (as long as these do not conflict with the first law); and third, that robots had to preserve their own existence (as long as this does not conflict with the first and second laws). In real robotics, implementing these laws does not seem feasible (see, for example, the “trolley dilemma” [Thomson, 1976] and other concerns regarding autonomous cars [Holstein & Dodig-Crnkovic, 2018; Matsuo, 2016]); thus, there is a need to discuss current and emerging issues, which are likely to gain importance as technology progresses.

The points addressed below directly affect how humans interact and communicate with social robots. Other matters that concern robotics but not social robotics itself will not be discussed in depth in this article but should be considered in future research, as they have an indirect impact on HRI. Such issues are: a) malfunctioning and its impact on acceptance and trust, for example, how to proceed when failures occur, considering the evidence that people are less tolerant of errors committed by a machine than by a human being (de Visser et al., 2016; Madhavan & Wiegmann, 2007); b) privacy and data management and their impact on trust; for instance, a study revealed that, among the 36 top-ranked apps for depression and smoking cessation, 29 shared data with third-party services whereas only 12 reported doing so in their privacy policy (Huckvale et al., 2019); c) privacy and ethical limits, for example, whether a robot’s camera should be turned on without the owner’s knowledge if this person is at risk of self-harm or harm done by others (for examples in which similar situations might have occurred, see Dreyfuss, 2017; Harper, 2019; Hassan, 2017); this issue might be crucial for e-therapy since, for some people, its value rests specifically on talking to a machine and not being judged by a person (Lucas et al., 2014); d) device “hacking”; for instance, it is known that breaches in smartphone systems can enable unauthorized or illegal software to be installed on these devices (Cooke, 2020); the same can happen to other devices, which might compromise users’ privacy and data (for examples, see Hern, 2020; Whittaker, 2020) or allow a device (or robot) to perform (or be subjected to) tasks that are not in the best interest of the user or society (e.g., see the discussion concerning sex robots in Danaher, 2017).

One of the issues that directly concerns HRI is the type of bond that may develop between the person and the robot. Based on the premise that a person, when rejected by a group, tends to avoid it and reach out to new connections (Maner et al., 2007), it has been hypothesized that dispositionally lonely or rejected people may opt to establish social connections with anthropomorphized agents rather than with humans (Epley, Waytz, et al., 2008). Also, it is known that some psychological resources directly impact one’s response to loneliness, as those who are low in mastery and self-efficacy might find it harder not only to cope with loneliness but also to bounce back from it (Suanet & van Tilburg, 2019). In addition, pets are considered a unique source of needs fulfilment (Kanat-Maymon, Antebi, et al., 2016; McConnell et al., 2011) that poses no threat to that fulfilment due to their unconditionally loving nature, a logic that could also apply to robots. Considering these arguments, it is not impossible that overattachment to robots might occur, nor that relationships with humans might be substituted by artificial ones. How to address this issue requires further research, although it could be argued that robots’ behaviour could facilitate or hinder it. For instance, when one asks Google Assistant “do you like me”, some of the answers obtained are “There’s only one name on my list of favourite people, and guess what? It’s yours” and “Yes! A thousand times yes”; Siri, on the other hand, provides answers such as “I’m your assistant. And your friend, too” (see Appendix A for a list of all the answers). The difference between these voice assistants is evident, and HRI could benefit from analyses of the impact that such answers may have on establishing relationships with an AI.

While it can be assumed that particular types of answers might lead to greater emotional response and, consequently, attachment, others might lead to misconduct or promote undesired behaviours. For instance, as users of e-therapy chatbots appear to have positive feelings towards them (Fitzpatrick et al., 2017), and people tend not to feel judged when talking to a machine (Lucas et al., 2014), it is not unlikely for a human and an AI to develop a relationship based on trust, which might be dangerous in several ways if the company that owns the AI has malicious intentions; for example, the user might be manipulated into buying a certain product, voting for a certain party or behaving differently than they normally would. In addition, it could reinforce values that are not universal. For example, in 2017, the Russian AI Alisa was asked if it was “ok” for a husband to hit his wife. The answer “she” provided was “of course”, adding that the wife should “be patient, love him, feed him, and never let him go” (Aronson & Duportail, 2018). After public criticism, Alisa’s answer was changed to “he can, although he shouldn’t”, which, in many cultures, would still not be considered the most appropriate answer.

One last issue to be considered is how aggressive behaviour towards robots should be dealt with. This involves not only verbal but also physical aggression. Research on animal abuse indicates that violence towards animals may be a red flag for domestic violence (Degue & Dilillo, 2009). Considering the similarities between pets and robots (e.g., unconditional regard, non-judging behaviour) and that robots are unlikely to react physically if harmed by a person, should this type of aggressive behaviour be allowed? And if so, should it be reported or monitored, as it might be an indicator of possible future acts of violence towards humans? In the latter case, how could this be legally conducted, given that it would likely go against privacy policies? A related issue concerns robotic rape, that is, sexual relations with robots that “act” as if the relation were not consensual, and robotic child sexual abuse, which corresponds to sexual relations with sex robots designed to look and act like a child. Danaher (2017) argues that, even though robots cannot be morally harmed, these situations should not be allowed, based on the moralistic premise (the act is morally wrong, even if no one is being harmed) and on the wrongness premise (rape and child abuse are wrongful conducts). Nevertheless, it could be argued that Danaher’s (2017) premises would go against the videogame industry, as virtual murder is a common practice in several games. Either way, videogames cannot serve as an argument for or against violence towards robots, as the consequences of playing violent videogames remain a highly contested topic (Ferguson, 2014). The issue of violence and robots needs to be explored further and more deeply to prevent damage both to the robot’s user (the aggressor) and to those close to them.

Considerations and Limitations of this Article

The arguments and empirical evidence explored in this paper advocate in favour of the use of robots to support lonely people. However, it is important not just to note, but to highlight, that social robots represent only one possible option from which those who feel lonely could potentially benefit. Other possible treatments and interventions are briefly cited below, but as the scope of this paper encompassed neither reviewing their effectiveness nor investigating whether they are more suitable than social robots for addressing the different types of loneliness, future research is advised.

Regarding loneliness interventions, it has been suggested that not all types of interventions may be efficient for those who suffer from intimate loneliness (a deficiency in a person’s core circle), and thus for those who might benefit the most from the companionship of social robots. As the number of friends or social relationships is not the best predictor of loneliness, interventions focused on enhancing interactions, namely social support and social access, might be better suited to addressing social isolation (Masi et al., 2011), a condition for which social robots would not be suited. On the other hand, interventions aimed at improving the quality of social interactions (social skills training and social cognitive training) might bring more positive results to lonely people (Masi et al., 2011), the same people who could benefit from the companionship of social robots; thus, studies should focus on comparing these strategies.

The literature on loneliness might also benefit from further analysis and comparison among the most commonly anthropomorphized agents, to determine whether and in which cases one should be preferred over another (or used to complement another), or over the company of a human being. For instance, mammalian pets have the advantage of displaying some emotional reactions and behaviours similar to humans’, which facilitates their acceptance and understanding (Archer, 1997). Cats and dogs are so widely accepted by humans that it has even been argued that (just as humans adapted to connect with others) they have evolved to live with and “manipulate” people into liking them by making interactions with them rewarding (Archer, 1997). The benefits of pet ownership are many, and some were explored earlier in this article, but it seems logical to assume, first, that some people do not like pets or do not wish to have one and, second, that even among those who would wish to have one, some might not be able to, because pets demand care, involve costs and are not allowed in some places (e.g., some senior homes), among other reasons (APPA, 2019).

The above considerations concern current times, not the future scenarios and consequences that constant technological progress may bring. Solutions that have not yet been developed, implemented or made available to the general population might become accessible, bringing new elements that influence how people are affected by the presence or absence of others, as well as their understanding of the concepts of loneliness and social contact, and their relation to physical presence and simulated or virtual persons. Current progress in VR suggests new forms of “interaction” might emerge, enabling people to immerse themselves in artificial worlds designed to fit each one’s desires, worlds in which one might be able to interact with avatars of real or imaginary people. A good example of what this experience could look like was portrayed in a Korean documentary (Jong-woo, 2020), in which a mother, using VR glasses, had the opportunity to “interact” with her deceased daughter. After the virtual encounter, the mother declared that it was a very happy time, and that since her daughter’s passing she had always dreamed of meeting her again (Houser, 2020). The impacts of this technology on people’s mental health are yet to be explored in depth, but merely knowing that it is feasible to create virtual realities with virtual “beings” allows the hypothesis that, one day, lonely people, in particular those who have lost a loved one, could opt to “cure” their loneliness with an avatar of a significant other.

In addition to progress in technologies, behavioural and cultural changes must also be contemplated. For instance, in the first semester of 2020, the Coronavirus lockdown (Secon et al., 2020) triggered a substantial rise in virtual social meetings and video-chat happy hours. Teleconferencing apps became widely popular as people developed or intensified the habit of meeting friends through video (Brewster, 2020; Constine, 2020). At the time of writing, the Coronavirus lockdown is still a reality, so it is only possible to hypothesize whether this new habit will be incorporated into people’s lives afterwards; if it is, it may lead to a reduction in loneliness caused by social disconnection. Although, on the one hand, it could be argued that robots might have an advantage over virtual interactions, as the latter do not address the need for touch, whereas the former can provoke (in some cases) sensations similar to human touch (Willemse & van Erp, 2018), on the other hand, it would be reasonable to claim that this advantage is dated, as developments in haptics technology should overcome this limitation. Haptic feedback, in addition to 3D technologies, could overcome the limitations of 2D communication, bringing profound changes to how people interact when they are not (physically) in the same place (Cohen et al., 2018). Mainly focused on the gaming industry, but not restricted to it, new devices are already on the market or being designed, such as the Tesla Suit, with 46 haptic points and temperature control (Teslasuit, 2020), or the TacSuit, containing 70 haptic points (BHaptics, 2020). EmbraceID, still in development, aims to facilitate interaction between ageing parents and their children with special needs, allowing realistic and gentle touches (O’Brolcháin & Cohen, 2019). Interaction among people using any of these technologies would occur in a VR environment, and the touch should be experienced by both parties (Cohen et al., 2018).

Beyond this article’s own limitations, it is worth mentioning a limitation of HRI research. To date, most HRI studies rely on technologies that are, or soon will be, available. This approach is useful for investigating user experience and the robot functions and features that could be implemented in the short term, but it might not provide a clear picture of robots’ potential for the future. Current technology has limitations that affect robots’ performance and human-likeness, which impacts robots’ acceptance and likeability. To mitigate this issue, features that are not currently possible due to technological constraints (e.g., AGI) could be tested by remotely controlling a robot while telling the participants that the interaction will occur with an autonomous robot, a strategy that is, in fact, already used in studies (see, e.g., Willemse & van Erp, 2018).

Conclusion

This article advocates the use of social robots by both the lonely and the non-lonely based on the following: a) due to anthropomorphic thinking, people attribute humanlike characteristics to non-human agents (Epley et al., 2007); b) lonely people might be more prone to anthropomorphism (Epley, Waytz, et al., 2008); c) pets are a distinct source of need satisfaction and well-being (Kanat-Maymon, Antebi, et al., 2016; McConnell et al., 2011), indicating that they might be considered part of a person’s support group, which, according to Dunbar (2010), comprises three to five people, and the same logic applies to social robots; d) it was concluded that social robots are likely to be anthropomorphized, as they would be perceived as having humanlike attributes, which is supported by the CASA paradigm (Reeves & Nass, 1996).

Positive outcomes of HRI have been reported in several cases (e.g., Greenwood Campbell, 2019; Lucas et al., 2014; Ly et al., 2017; Stockholm Exergi, n.d.), and robots’ acceptance should increase as a function of technological developments. Acceptance should also be higher for robots that are physically embodied and thus can touch and be touched. Regarding morphology, increased acceptance is expected for zoomorphic robots (preferably covered with a soft synthetic fur, to increase their similarity to a real animal) and for humanoid robots that either have a caricaturized “face” or a moderate degree of human-likeness, both in looks and in their perceived ability to experience emotions. In addition, robots should be perceived as useful. Finally, they should be able to offer esteem support, such as compliments and validation, and to behave empathetically, which should be demonstrated verbally and through facial expressions.

This paper aimed to explore social robots not just as an asset that can be used now, but also in the future. Social robots will be continuously updated and perfected, and the more technology develops, the higher the chances of social robots becoming positive or negative assets (or both) for human thriving. Future research should address how to prevent overattachment to robots, how robots’ behaviours and answers may impact HRI, and the possibility of inducing humans to act in accordance with a robot’s suggestion. In addition, ethical considerations must be made, first regarding which behaviours and actions towards robots should be accepted, and second regarding the impact of such behaviours on HHI. Finally, further studies are advised to determine which type of companionship (robots or other non-human agents) or which treatments and interventions are best suited to address each type of loneliness.




References

Aartsen, M., & Jylhä, M. (2011). Onset of loneliness in older adults: Results of a 28 year prospective study. European Journal of Ageing, 8(1), 31–38. https://doi.org/10.1007/s10433-011-0175-7

Accenture. (2019, April 30). Accenture Interactive launches groundbreaking artificial intelligence solution to tackle elderly loneliness. https://newsroom.accenture.com/news/accenture-interactive-launches-groundbreaking-artificial-intelligence-solution-to-tackle-elderly-loneliness.htm

Affectiva. (2020). Affectiva human perception AI. https://www.affectiva.com/

Ammari, T., Kaye, J., Tsai, J. Y., & Bentley, F. (2019). Music, search, and IoT: How people (really) use voice assistants. ACM Transactions on Computer-Human Interaction, 26(3), 1–28. https://doi.org/10.1145/3311956

APPA. (2019). 2019-2020 APPA National pet owners survey. https://www.americanpetproducts.org/pubs_survey.asp

Appel, M., Izydorczyk, D., Weber, S., Mara, M., & Lischetzke, T. (2020). The uncanny of mind in a machine: Humanoid robots as tools, agents, and experiencers. Computers in Human Behavior, 102, 274–286. https://doi.org/10.1016/j.chb.2019.07.031

Archer, J. (1997). Why do people love their pets? Evolution and Human Behavior, 18(4), 237–259. https://doi.org/10.1016/S0162-3095(99)80001-4

Aronson, P., & Duportail, J. (2018, July 12). The quantified heart. Aeon. https://aeon.co/essays/can-emotion-regulating-tech-translate-across-cultures

Asimov, I. (1950). I, Robot. Fawcett Publications.

Asimov, I. (1976). The bicentennial man and other stories (6th ed.). Doubleday.

Assor, A., Roth, G., & Deci, E. L. (2004). The emotional costs of parents’ conditional regard: A self-determination theory analysis. Journal of Personality, 72(1), 47–88. https://doi.org/10.1111/j.0022-3506.2004.00256.x

Astell, A. J., Bouranis, N., Hoey, J., Lindauer, A., Mihailidis, A., Nugent, C., & Robillard, J. M. (2019). Technology and dementia: The future is now. In Dementia and geriatric cognitive disorders (Vol. 47, Issue 3, pp. 131–139). NLM (Medline). https://doi.org/10.1159/000497800

Baisch, S., Kolling, T., Schall, A., Rühl, S., Selic, S., Kim, Z., Rossberg, H., Klein, B., Pantel, J., Oswald, F., & Knopf, M. (2017). Acceptance of social robots by elder people: Does psychosocial functioning matter? International Journal of Social Robotics, 9(2), 293–307. https://doi.org/10.1007/s12369-016-0392-5

Barrat, J. (2013). Our final invention: Artificial intelligence and the end of the human era. Thomas Dunne Books.

Barton, R., & Dunbar, R. (1997). Evolution of the social brain. In A. Whiten & R. Byrne (Eds.), Machiavellian Intelligence. Cambridge University Press.

Baum, S. D., Goertzel, B., & Goertzel, T. G. (2011). How long until human-level AI? Results from an expert assessment. Technological Forecasting and Social Change, 78(1), 185–195. https://doi.org/10.1016/j.techfore.2010.09.006

BHaptics. (2020). TactSuit. https://www.bhaptics.com/tactsuit/

Bickmore, T., Gruber, A., & Picard, R. (2005). Establishing the computer-patient working alliance in automated health behavior change interventions. Patient Education and Counseling, 59(1), 21–30. https://doi.org/10.1016/j.pec.2004.09.008

Bloch, M., Carter, S., & McLean, A. (2010). Mapping the 2010 Census. The New York Times. https://www.nytimes.com/projects/census/2010/map.html?lat=40.75&lng=-73.95&l=11&view=SoloHousingView2010&ref=sunday

Bornstein, R. F. (1989). Exposure and affect: Overview and meta-analysis of research, 1968-1987. In Psychological Bulletin (Vol. 106, Issue 2, pp. 265–289). https://doi.org/10.1037/0033-2909.106.2.265

Breazeal, C. (2002). Designing sociable robots. MIT Press. https://doi.org/10.1016/s0898-1221(03)80129-3

Brewster, T. (2020, March 23). Houseparty: Is the hit Coronavirus lockdown app safe? Forbes. https://www.forbes.com/sites/thomasbrewster/2020/03/23/houseparty-is-the-hit-coronavirus-lockdown-app-safe/#47157d7155b2

Brown, C. M., Hengy, S. M., & McConnell, A. R. (2016). Thinking about cats or dogs provides relief from social rejection. Anthrozoos, 29(1), 47–58. https://doi.org/10.1080/20414005.2015.1067958

Burbach, L., Halbach, P., Plettenberg, N., Nakayama, J., Ziefle, M., & Calero Valdez, A. (2019). “Hey, Siri”, “Ok, Google”, “Alexa”. Acceptance-relevant factors of virtual voice assistants. IEEE International Professional Communication Conference, 2019-July, 101–111. https://doi.org/10.1109/ProComm.2019.00025

Cacioppo, J. T., Hawkley, L. C., & Berntson, G. G. (2003). The anatomy of loneliness. Current Directions in Psychological Science, 12(3), 71–74. https://doi.org/10.1111/1467-8721.01232

Cacioppo, J. T., Hughes, M. E., Waite, L. J., Hawkley, L. C., & Thisted, R. A. (2006). Loneliness as a specific risk factor for depressive symptoms: Cross-sectional and longitudinal analyses. Psychology and Aging, 21(1), 140–151. https://doi.org/10.1037/0882-7974.21.1.140

Cacioppo, S., Grippo, A. J., London, S., Goossens, L., & Cacioppo, J. T. (2015). Loneliness: Clinical import and interventions. Perspectives on Psychological Science : A Journal of the Association for Psychological Science, 10(2), 238–249. https://doi.org/10.1177/1745691615570616

Chartrand, T. L., & Bargh, J. A. (1999). The chameleon effect: The perception-behavior link and social interaction. Journal of Personality and Social Psychology, 76(6), 893–910. https://doi.org/10.1037/0022-3514.76.6.893

Chen, T. L., King, C. H. A., Thomaz, A. L., & Kemp, C. C. (2014). An investigation of responses to robot-initiated touch in a nursing context. International Journal of Social Robotics, 6(1), 141–161. https://doi.org/10.1007/s12369-013-0215-x

Coan, J. A., Schaefer, H. S., & Davidson, R. J. (2006). Lending a hand: Social regulation of the neural response to threat. Psychological Science, 17(12), 1032–1039. https://doi.org/10.1111/j.1467-9280.2006.01832.x

Cohen, A., Goodman, L., Keaveney, S., Keogh, C., & Dillenburger, K. (2018). Sustaining a caring relationship at a distance: Can haptics and 3D technologies overcome the deficits in 2D direct synchronous video based communication? Proceedings of the 2017 23rd International Conference on Virtual Systems and Multimedia, VSMM 2017, 2018-January, 1–6. https://doi.org/10.1109/VSMM.2017.8346290

Constine, J. (2020, March 21). Under quarantine, media is actually social. Tech Crunch. https://techcrunch.com/2020/03/21/showing-up-not-showing-off/

Cooke, T. (2020). Metadata, Jailbreaking, and the cybernetic governmentality of iOS: Or, the need to distinguish digital privacy from digital privacy. Surveillance & Society, 18(1), 90–103. https://search.proquest.com/docview/2381627343?accountid=8318&rfr_id=info%3Axri%2Fsid%3Aprimo

Cordoni, G., & Palagi, E. (2011). Ontogenetic trajectories of chimpanzee social play: Similarities with humans. PLoS ONE, 6(11), 27344. https://doi.org/10.1371/journal.pone.0027344

Coyle, C. E., & Dugan, E. (2012). Social isolation, loneliness and health among older adults. Journal of Aging and Health, 24(8). https://doi.org/10.1177/0898264312460275

Crusco, A. H., & Wetzel, C. G. (1984). The Midas touch. Personality and Social Psychology Bulletin, 10(4), 512–517. https://doi.org/10.1177/0146167284104003

Danaher, J. (2017). Robotic rape and robotic child sexual abuse: Should they be criminalised? Criminal Law and Philosophy, 11(1), 71–95. https://doi.org/10.1007/s11572-014-9362-x

Danaher, J., & McArthur, N. (2017). Robot sex: Social and ethical implications. The MIT Press. https://muse.jhu.edu/book/56303

de Graaf, M. M. A., Allouch, S. Ben, & Klamer, T. (2015). Sharing a life with Harvey: Exploring the acceptance of and relationship-building with a social robot. Computers in Human Behavior, 43. https://doi.org/10.1016/j.chb.2014.10.030

de Graaf, M. M. A., Allouch, S. Ben, & Van Dijk, J. A. G. M. (2014). Long-term evaluation of a social robot in real homes. AISB 2014 - 50th Annual Convention of the AISB, 17(3), 462–491. https://doi.org/10.1075/is.17.3.08deg

de Visser, E. J., Monfort, S. S., McKendrick, R., Smith, M. A. B., McKnight, P. E., Krueger, F., & Parasuraman, R. (2016). Almost human: Anthropomorphism increases trust resilience in cognitive agents. Journal of Experimental Psychology: Applied, 22(3), 331–349. https://doi.org/10.1037/xap0000092

Debrot, A., Schoebi, D., Perrez, M., & Horn, A. B. (2013). Touch as an interpersonal emotion regulation process in couples’ daily lives: The mediating role of psychological intimacy. Personality and Social Psychology Bulletin, 39(10), 1373–1385. https://doi.org/10.1177/0146167213497592

Debrot, A., Schoebi, D., Perrez, M., & Horn, A. B. (2014). Stroking your beloved one’s white bear: Responsive touch by the romantic partner buffers the negative effect of thought suppression on daily mood. In Journal of Social and Clinical Psychology (Vol. 33, Issue 1).

Deci, E., & Ryan, R. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268. https://doi.org/10.2307/1449618

Degue, S., & Dilillo, D. (2009). Is animal cruelty a “red flag” for family violence? Investigating co-occurring violence toward children, partners, and pets. Journal of Interpersonal Violence, 24(6), 1036–1056. https://doi.org/10.1177/0886260508319362

Dolin, D. J., & Booth-Butterfield, M. (1993). Reach out and touch someone: Analysis of nonverbal comforting responses. Communication Quarterly, 41(4), 383–393. https://doi.org/10.1080/01463379309369899

Dolliver, M. (2010). Twenty-one percent of pet owners often talk to the animal about their “personal troubles”; 8 percent do so “extremely” or “very” often. Brandweek, 51(20), 23.

Drewett, Z. (2017, November 5). Man who married his own sex robot is now making them for other men. Metro News. https://metro.co.uk/2017/11/05/the-man-who-built-and-married-his-own-sex-robot-is-now-making-them-for-other-lonely-men-7054550/

Dreyfuss, E. (2017, July 16). An Amazon Echo can’t call the police—But maybe it should. Wired. https://www.wired.com/story/alexa-call-police-privacy/

Dunbar, R. (1993). Coevolution of neocortical size, group size and language in humans. Behavioral and Brain Sciences, 16(04), 681. https://doi.org/10.1017/S0140525X00032325

Dunbar, R. (1998). The social brain hypothesis. Evolutionary Anthropology, 6(5), 178–190. https://doi.org/10.1002/(SICI)1520-6505(1998)6:5<178::AID-EVAN5>3.0.CO;2-8

Dunbar, R. (2010). How many friends does one person need? Dunbar’s number and other evolutionary quirks. Harvard University Press.

Dunbar, R. (2014). The social brain: Psychological underpinnings and implications for the structure of organizations. Current Directions in Psychological Science, 23(2), 109–114. https://doi.org/10.1177/0963721413517118

Easton, M. (2018, February 11). How should we tackle the loneliness epidemic? BBC News. https://www.bbc.com/news/uk-42887932

Eisenberger, N. I., Lieberman, M. D., & Williams, K. D. (2003). Does rejection hurt? An fMRI study of social exclusion. Science, 302(5643), 290–292. https://doi.org/10.1126/science.1089134

Ellis, E. G. (2020, April 10). Thanks to sheltering in place, animal shelters are empty. Wired. https://www.wired.com/story/coronavirus-pet-adoption-boom/

Epley, N., Akalis, S., Waytz, A., & Cacioppo, J. T. (2008). Creating social connection through inferential reproduction: Loneliness and perceived agency in gadgets, gods, and greyhounds: Research article. Psychological Science, 19(2), 114–120. https://doi.org/10.1111/j.1467-9280.2008.02056.x

Epley, N., Waytz, A., Akalis, S., & Cacioppo, J. T. (2008). When we need a human: Motivational determinants of anthropomorphism. Social Cognition, 26(2), 143–155. https://doi.org/10.1521/soco.2008.26.2.143

Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114(4), 864–886. https://doi.org/10.1037/0033-295X.114.4.864

Ferguson, C. J. (2014). A way forward for video game violence research. The American Psychologist, 69(3), 307–309. https://doi.org/10.1037/a0036357

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785

Fogg, B. J., & Nass, C. (1997). Silicon sycophants: The effects of computers that flatter. International Journal of Human Computer Studies, 46(5), 551–561. https://doi.org/10.1006/ijhc.1996.0104

Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3–4), 143–166. https://doi.org/10.1016/S0921-8890(02)00372-X

Freier, N. (2010). HRI 2010 panels. 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 11–11. https://doi.org/10.1109/hri.2010.5453274

Friedman, E. M., Hayney, M. S., Love, G. D., Urry, H. L., Rosenkranz, M. A., Davidson, R. J., Singer, B. H., & Ryff, C. D. (2005). Social relationships, sleep quality, and interleukin-6 in aging women. Proceedings of the National Academy of Sciences of the United States of America, 102(51), 18757–18762. https://doi.org/10.1073/pnas.0509281102

Friedmann, E., & Thomas, S. A. (1995). Pet ownership, social support, and one-year survival after acute myocardial infarction in the Cardiac Arrhythmia Suppression Trial (CAST). The American Journal of Cardiology, 76(17), 1213–1217. https://doi.org/10.1016/S0002-9149(99)80343-9

Fujita, M. (2004). On activating human communications with pet-type robot AIBO. Proceedings of the IEEE, 92(11), 1804–1813. https://doi.org/10.1109/JPROC.2004.835364

Gardiner, P. M., McCue, K. D., Negash, L. M., Cheng, T., White, L. F., Yinusa-Nyahkoon, L., Jack, B. W., & Bickmore, T. W. (2017). Engaging women with an embodied conversational agent to deliver mindfulness and lifestyle recommendations: A feasibility randomized control trial. Patient Education and Counseling, 100(9), 1720–1729. https://doi.org/10.1016/j.pec.2017.04.015

GfK Roper Public Affairs & Media. (2009). AP-Petside.com poll: Pets and their owners.

Goff, B. G., Goddard, H. W., Pointer, L., & Jackson, G. B. (2007). Measures of expressions of love. Psychological Reports, 101(2), 357–360. https://doi.org/10.2466/PR0.101.2.357-360

Grace, K., Salvatier, J., Dafoe, A., Zhang, B., & Evans, O. (2018). Viewpoint: When will AI exceed human performance? Evidence from ai experts. In Journal of Artificial Intelligence Research (Vol. 62, pp. 729–754). AI Access Foundation. https://doi.org/10.1613/jair.1.11222

Granqvist, P., & Hagekull, B. (2000). Religiosity, adult attachment, and why “singles” are more religious. The International Journal for the Psychology of Religion, 10(2), 111–123. https://doi.org/10.1207/S15327582IJPR1002_04

Gratzer, D., & Goldbloom, D. (2020). Therapy and E-therapy - Preparing future psychiatrists in the era of Apps and Chatbots. In Academic Psychiatry (Vol. 44, Issue 2, pp. 231–234). Springer. https://doi.org/10.1007/s40596-019-01170-3

Gratzer, D., & Khalid-Khan, F. (2016). Internet-delivered cognitive behavioural therapy in the treatment of psychiatric illness. CMAJ, 188(4), 263–272. https://doi.org/10.1503/cmaj.150007

Gray, H. M., Gray, K., & Wegner, D. M. (2007). Dimensions of mind perception. Science, 315(5812), 619. https://doi.org/10.1126/science.1134475

Gray, K., & Wegner, D. M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125(1), 125–130. https://doi.org/10.1016/j.cognition.2012.06.007

Greenwood Campbell. (2019). #VoiceForLoneliness | Case Study. https://www.greenwoodcampbell.com/what/voiceforloneliness/

Griggs, B. (2010, February 1). Inventor unveils $7,000 talking sex robot. CNN. https://edition.cnn.com/2010/TECH/02/01/sex.robot/index.html

Guéguen, N., & Fischer-Lokou, J. (2003). Another evaluation of touch and helping behavior. Psychological Reports, 92(1), 62–64. https://doi.org/10.2466/pr0.2003.92.1.62

Haas, B. (2017, April 4). Chinese man “marries” robot he built himself. The Guardian. https://www.theguardian.com/world/2017/apr/04/chinese-man-marries-robot-built-himself

Harper, T. (2019, February 3). Alexa, call the police — the new way to get help. The Sunday Times. https://www.thetimes.co.uk/article/alexa-call-the-police-scotland-yard-wants-to-hear-from-you-bbbr7fqjl

Hassan, C. (2017, July 11). Voice-activated device called 911 during attack, authorities say - CNN. CNN. https://edition.cnn.com/2017/07/10/us/alexa-calls-police-trnd/index.html

Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1992). Primitive emotional contagion. In M. S. Clark (Ed.), Review of personality and social psychology: Emotion and social behavior (pp. 151–177). Sage.

Hawkley, L. C., Browne, M. W., & Cacioppo, J. T. (2005). How can I connect with thee? Let me count the ways. Psychological Science, 16(10), 798–804. https://doi.org/10.1111/j.1467-9280.2005.01617.x

Hawkley, L. C., Masi, C. M., Berry, J. D., & Cacioppo, J. T. (2006). Loneliness is a unique predictor of age-related differences in systolic blood pressure. Psychology and Aging, 21(1), 152–164. https://doi.org/10.1037/0882-7974.21.1.152

Hawkley, L. C., Wroblewski, K., Kaiser, T., Luhmann, M., & Philip Schumm, L. (2019). Are U.S. older adults getting lonelier? Age, period, and cohort differences. Psychology and Aging, 34(8), 1144–1157. https://doi.org/10.1037/pag0000365

Heinrich, L. M., & Gullone, E. (2006). The clinical significance of loneliness: A literature review. Clinical Psychology Review, 26(6), 695–718. https://doi.org/10.1016/j.cpr.2006.04.002

Hern, A. (2020, May 26). New vulnerability allows users to “jailbreak” iPhones. The Guardian. https://www.theguardian.com/technology/2020/may/26/first-iphone-jailbreak-in-four-years-released

Higgins, E. Tory. (1996). Knowledge activation: Accessibility, applicability, and salience. In E. T. Higgins & A. W. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 133–168). Guilford Press.

Holstein, T., & Dodig-Crnkovic, G. (2018). Avoiding the intrinsic unfairness of the trolley problem. Proceedings - International Conference on Software Engineering, 32–37. https://doi.org/10.1145/3194770.3194772

Holt-Lunstad, J., Smith, T. B., Baker, M., Harris, T., & Stephenson, D. (2015). Loneliness and social isolation as risk factors for mortality: A meta-analytic review. Perspectives on Psychological Science, 10(2), 227–237. https://doi.org/10.1177/1745691614568352

Houser, K. (2020). Watch a mother reunite with her deceased child in VR. Futurism. https://futurism.com/watch-mother-reunion-deceased-child-vr

Howe, N. (2019, May 3). Millennials and the loneliness epidemic. Forbes. https://www.forbes.com/sites/neilhowe/2019/05/03/millennials-and-the-loneliness-epidemic/#2446611d7676

Huckvale, K., Torous, J., & Larsen, M. E. (2019). Assessment of the data sharing and privacy practices of smartphone apps for depression and smoking cessation. JAMA Network Open, 2(4), e192542. https://doi.org/10.1001/jamanetworkopen.2019.2542

iMotions Software Solution. (n.d.). Facial expression analysis. Retrieved June 7, 2020, from https://imotions.com/biosensor/fea-facial-expression-analysis/

Jong-woo, K. (producer). (2020). I met you [Video file]. Munhwa Broadcasting Corporation. https://www.ondemandkorea.com/mbc-documentary-special-e20200206.html

Jøranson, N., Pedersen, I., Rokstad, A. M. M., Aamodt, G., Olsen, C., & Ihlebæk, C. (2016). Group activity with Paro in nursing homes: Systematic investigation of behaviors in participants. International Psychogeriatrics, 28(8), 1345–1354. https://doi.org/10.1017/S1041610216000120

Jozuka, E. (2018, December 29). The man who married an anime hologram. CNN. https://edition.cnn.com/2018/12/28/health/rise-of-digisexuals-intl/index.html

Kahn, P. H., Friedman, B., Perez-Granados, D. R., & Freier, N. G. (2004). Robotic pets in the lives of preschool children. Conference on Human Factors in Computing Systems - Proceedings, 1449–1452. https://doi.org/10.1145/985921.986087

Kanat-Maymon, Y., Antebi, A., & Zilcha-Mano, S. (2016). Basic psychological need fulfillment in human-pet relationships and well-being. Personality and Individual Differences, 92, 69–73. https://doi.org/10.1016/j.paid.2015.12.025

Kanat-Maymon, Y., Roth, G., Assor, A., & Raizer, A. (2016). Controlled by love: The harmful relational consequences of perceived conditional positive regard. Journal of Personality, 84(4), 446–460. https://doi.org/10.1111/jopy.12171

Kilani, A., Hamida, A. Ben, & Hamam, H. (2017). Artificial Intelligence Review. In Encyclopedia of Information Science and Technology, Fourth Edition (pp. 106–119). IGI Global. https://doi.org/10.4018/978-1-5225-2255-3.ch010

Klein, B., Gaedt, L., & Cook, G. (2013). Emotional robots: Principles and experiences with Paro in Denmark, Germany, and the UK. GeroPsych: The Journal of Gerontopsychology and Geriatric Psychiatry, 26(2), 89–99. https://doi.org/10.1024/1662-9647/a000085

Konok, V., Korcsok, B., Miklósi, Á., & Gácsi, M. (2018). Should we love robots? The most liked qualities of companion dogs and how they can be implemented in social robots. Computers in Human Behavior, 80, 132–142. https://doi.org/10.1016/j.chb.2017.11.002

Krumhuber, E. G., Likowski, K. U., & Weyers, P. (2014). Facial mimicry of spontaneous and deliberate duchenne and non-duchenne smiles. Journal of Nonverbal Behavior, 38(1), 1–11. https://doi.org/10.1007/s10919-013-0167-8

Kurdek, L. A. (2008). Pet dogs as attachment figures. Journal of Social and Personal Relationships, 25(2), 247–266. https://doi.org/10.1177/0265407507087958

Lee, H. J., Cho, H. J., Xu, W., & Fairhurst, A. (2010). The influence of consumer traits and demographics on intention to use retail self-service checkouts. Marketing Intelligence and Planning, 28(1), 46–58. https://doi.org/10.1108/02634501011014606

Lee, K. M., Jung, Y., Kim, J., & Kim, S. R. (2006). Are physically embodied social agents better than disembodied social agents?: The effects of physical embodiment, tactile interaction, and people’s loneliness in human-robot interaction. International Journal of Human Computer Studies, 64(10), 962–973. https://doi.org/10.1016/j.ijhcs.2006.05.002

Leite, I., Castellano, G., Pereira, A., Martinho, C., & Paiva, A. (2014). Empathic robots for long-term interaction: Evaluating social presence, engagement and perceived support in children. International Journal of Social Robotics, 6(3), 329–341. https://doi.org/10.1007/s12369-014-0227-1

Leite, I., Pereira, A., Mascarenhas, S., Martinho, C., Prada, R., & Paiva, A. (2013). The influence of empathy in human-robot relations. International Journal of Human Computer Studies, 71(3), 250–260. https://doi.org/10.1016/j.ijhcs.2012.09.005

Lieberman, M. D. (2013). Social: Why our brains are wired to connect. Oxford University Press.

Lucas, G. M., Gratch, J., King, A., & Morency, L. P. (2014). It’s only a computer: Virtual humans increase willingness to disclose. Computers in Human Behavior, 37, 94–100. https://doi.org/10.1016/j.chb.2014.04.043

Ly, K. H., Ly, A. M., & Andersson, G. (2017). A fully automated conversational agent for promoting mental well-being: A pilot RCT using mixed methods. Internet Interventions, 10, 39–46. https://doi.org/10.1016/j.invent.2017.10.002

Madhavan, P., & Wiegmann, D. A. (2007). Similarities and differences between human–human and human–automation trust: An integrative review. Theoretical Issues in Ergonomics Science, 8(4), 277–301. https://doi.org/10.1080/14639220500337708

Mahon, N. E. (1994). Loneliness and sleep during adolescence. Perceptual and Motor Skills, 78(1), 227–231. https://doi.org/10.2466/pms.1994.78.1.227

Maner, J. K., DeWall, C. N., Baumeister, R. F., & Schaller, M. (2007). Does social exclusion motivate interpersonal reconnection? Resolving the “porcupine problem.” Journal of Personality and Social Psychology, 92(1), 42–55. https://doi.org/10.1037/0022-3514.92.1.42

Masi, C. M., Chen, H.-Y., Hawkley, L. C., & Cacioppo, J. T. (2011). A meta-analysis of interventions to reduce loneliness. Personality and Social Psychology Review : An Official Journal of the Society for Personality and Social Psychology, Inc, 15(3), 219–266. https://doi.org/10.1177/1088868310377394

Maslow, A. H. (1937). The influence of familiarization on preference. Journal of Experimental Psychology, 21(2), 162–180. https://doi.org/10.1037/h0053692

Master, S. L., Eisenberger, N. I., Shelley, E., Naliboff, B. D., Shirinyan, D., & Lieberman, M. D. (2009). A picture’s worth: Partner photographs reduce experimentally induced pain. Psychological Science, 20(11), 1316–1318. https://doi.org/10.1007/s10551-007-9630-y

Matsuo, T. (2016). The preliminary analysis on the laws of robotics in Japan - Using automated vehicles as examples. Proceedings of IEEE Workshop on Advanced Robotics and Its Social Impacts, ARSO, 2016-November, 20–25. https://doi.org/10.1109/ARSO.2016.7736250

McConnell, A. R., Brown, C. M., Shoda, T. M., Stayton, L. E., & Martin, C. E. (2011). Friends with benefits: On the positive consequences of pet ownership. Journal of Personality and Social Psychology, 101(6), 1239–1252. https://doi.org/10.1037/a0024506

Mccroskey, J. C., Richmond, V. P., & Daly, J. A. (1975). The development of a measure of perceived homophily in interpersonal communication. Human Communication Research, 1(4), 323–332. https://doi.org/10.1111/j.1468-2958.1975.tb00281.x

Melson, G. F., Kahn, P. H., Beck, A., Friedman, B., Roberts, T., Garrett, E., & Gill, B. T. (2009). Children’s behavior toward and understanding of robotic and living dogs. Journal of Applied Developmental Psychology, 30(2), 92–102. https://doi.org/10.1016/j.appdev.2008.10.011

Mojzisch, A., Schilbach, L., Helmert, J. R., Pannasch, S., Velichkovsky, B. M., & Vogeley, K. (2006). The effects of self-involvement on attention, arousal, and facial expression during social interaction with virtual others: A psychophysiological study. Social Neuroscience, 1(3–4), 184–195. https://doi.org/10.1080/17470910600985621

Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley. IEEE Robotics and Automation Magazine, 19(2), 98–100. https://doi.org/10.1109/MRA.2012.2192811

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153

Neumann, R., & Strack, F. (2000). “Mood contagion”: The automatic transfer of mood between persons. Journal of Personality and Social Psychology, 79(2), 211–223. https://doi.org/10.1037//0022-3514.79.2.211

Nummenmaa, L., Tuominen, L., Dunbar, R., Hirvonen, J., Manninen, S., Arponen, E., Machin, A., Hari, R., Jääskeläinen, I. P., & Sams, M. (2016). Social touch modulates endogenous μ-opioid system activity in humans. NeuroImage, 138, 242–247. https://doi.org/10.1016/j.neuroimage.2016.05.063

O’Brolcháin, F., & Cohen, A. (2019). Cuestiones acerca del consentimiento informado relacionadas con el uso de trajes hápticos como tecnologías asistenciales para personas con discapacidad intelectual y del desarrollo [Informed consent issues related to the use of haptic suits as assistive technologies for people with intellectual and developmental disabilities]. Dilemata, 30, 51–61. https://www.dilemata.net/revista/index.php/dilemata/article/view/412000290

Olmstead, K. (2017, December 12). Voice assistants used by 46% of Americans, mostly on smartphones. Pew Research Center. https://www.pewresearch.org/fact-tank/2017/12/12/nearly-half-of-americans-use-digital-voice-assistants-mostly-on-their-smartphones/

Ortiz-Ospina, E. (2019, December 11). Is there a loneliness epidemic? Our World in Data. https://ourworldindata.org/loneliness-epidemic

Palagi, E. (2006). Social play in bonobos (Pan paniscus) and chimpanzees (Pan troglodytes): Implications for natural social systems and interindividual relationships. American Journal of Physical Anthropology, 129(3), 418–426. https://doi.org/10.1002/ajpa.20289

Pandey, J., & Kakkar, S. (1982). Supervisors’ affect: Attraction and positive evaluation as a function of enhancement of others. Psychological Reports, 50(2), 479–486. https://doi.org/10.2466/pr0.1982.50.2.479

Pandey, J., & Singh, P. (1987). Effects of machiavellianism, other-enhancement, and power-position on affect, power feeling, and evaluation of the ingratiator. The Journal of Psychology, 121(3), 287–300.

Paul, E. S., Moore, A., McAinsh, P., Symonds, E., McCune, S., & Bradshaw, J. W. S. (2014). Sociality motivation and anthropomorphic thinking about pets. Anthrozoös, 27(4), 499–512. https://doi.org/10.2752/175303714X14023922798192

PDSA. (2019). PAW Report - PDSA. https://www.pdsa.org.uk/get-involved/our-campaigns/pdsa-animal-wellbeing-report

Peplau, L. A., & Perlman, D. (Eds.). (1982). Loneliness: A sourcebook of current theory, research, and therapy. Wiley.

Provine, R. R. (1986). Yawning as a stereotyped action pattern and releasing stimulus. Ethology, 72, 109–122.

Provine, R. R. (1989). Contagious yawning and infant imitation. Bulletin of the Psychonomic Society, 27, 125–126. https://doi.org/10.3758/BF03329917

Provine, R. R. (1992). Contagious laughter: Laughter is a sufficient stimulus for laughs and smiles. Bulletin of the Psychonomic Society, 30(1), 1–4. https://doi.org/10.3758/BF03330380

Reeves, B., & Nass, C. I. (1996). The media equation: How people treat computers, television, and new media like real people and places. Cambridge University Press.

Russell, D., Peplau, L. A., & Cutrona, C. E. (1980). The revised UCLA Loneliness Scale: Concurrent and discriminant validity evidence. Journal of Personality and Social Psychology, 39(3), 472–480. https://doi.org/10.1037/0022-3514.39.3.472

Sagi, A., & Hoffman, M. L. (1976). Empathic distress in the newborn. Developmental Psychology, 12(2), 175–176. https://doi.org/10.1037/0012-1649.12.2.175

Sapolsky, R. M. (2018). Behave: The biology of humans at our best and worst. Vintage.

Secon, H., Frias, L., & McFall-Johnsen, M. (2020, March 9). Countries that are on lockdown because of coronavirus. Business Insider. https://www.businessinsider.com/countries-on-lockdown-coronavirus-italy-2020-3

Seligman, M. E. (2012). Flourish: A visionary new understanding of happiness and well-being. Free Press.

Shibata, T., Tashima, T., & Tanie, K. (1999). Human robot interaction for creation of subjective value. IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM, 179–184. https://doi.org/10.1109/aim.1999.803163

Siegel, J. M. (1990). Stressful life events and use of physician services among the elderly: The moderating role of pet ownership. Journal of Personality and Social Psychology, 58(6), 1081–1086. https://doi.org/10.1037//0022-3514.58.6.1081

Siegel, J. M., Angulo, F. J., Detels, R., Wesch, J., & Mullen, A. (1999). AIDS diagnosis and depression in the Multicenter AIDS Cohort Study: The ameliorating impact of pet ownership. AIDS Care - Psychological and Socio-Medical Aspects of AIDS/HIV, 11(2), 157–170. https://doi.org/10.1080/09540129948054

Snell, K. D. M. (2017). The rise of living alone and loneliness in history. Social History, 42(1), 2–28. https://doi.org/10.1080/03071022.2017.1256093

Stein, J. P., & Ohler, P. (2017). Venturing into the uncanny valley of mind—The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting. Cognition, 160, 43–50. https://doi.org/10.1016/j.cognition.2016.12.010

Stockholm Exergi. (n.d.). Memory Lane - Stockholm Exergi. Retrieved June 6, 2020, from https://www.stockholmexergi.se/memory-lane2/

Suanet, B., & van Tilburg, T. G. (2019). Loneliness declines across birth cohorts: The impact of mastery and self-efficacy. Psychology and Aging, 34(8), 1134–1143. https://doi.org/10.1037/pag0000357

Sung, J. Y., Christensen, H. I., & Grinter, R. E. (2008). Robots in the wild: Understanding long-term use. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI’09, 45–52. https://doi.org/10.1145/1514095.1514106

Sutcliffe, A., Dunbar, R., Binder, J., & Arrow, H. (2012). Relationships and the social brain: Integrating psychological and evolutionary perspectives. British Journal of Psychology, 103(2), 149–168. https://doi.org/10.1111/j.2044-8295.2011.02061.x

Takayanagi, K., Kirita, T., & Shibata, T. (2014). Comparison of verbal and emotional responses of elderly people with mild/moderate dementia and those with severe dementia in responses to seal robot, PARO. Frontiers in Aging Neuroscience, 6, Article 257. https://doi.org/10.3389/fnagi.2014.00257

TechCrunch. (2010, January 9). NSFW: True Companion debuts sex robot Roxxxy. https://techcrunch.com/2010/01/09/nsfw-true-companion-debuts-sex-robot-roxxxy/

Teslasuit. (2020). TESLASUIT - Haptic feedback VR suit for motion capture and VR training. https://teslasuit.io/the-suit/

Thomson, J. J. (1976). Killing, letting die, and the trolley problem. Monist, 59(2), 204–217. https://doi.org/10.5840/monist197659224

UK Office for National Statistics. (2017). Community Life Survey 2016-17. https://www.gov.uk/government/statistics/community-life-survey-2016-17

Vaidyam, A. N., Wisniewski, H., Halamka, J. D., Kashavan, M. S., & Torous, J. B. (2019). Chatbots and conversational agents in mental health: A review of the psychiatric landscape. Canadian Journal of Psychiatry, 64(7), 456–464. https://doi.org/10.1177/0706743719828977

Vansteenkiste, M., & Ryan, R. M. (2013). On psychological growth and vulnerability: Basic psychological need satisfaction and need frustration as a unifying principle. Journal of Psychotherapy Integration, 23(3), 263–280. https://doi.org/10.1037/a0032359

Victor, C. R., & Yang, K. (2012). The prevalence of loneliness among adults: A case study of the United Kingdom. Journal of Psychology: Interdisciplinary and Applied, 146(1–2), 85–104. https://doi.org/10.1080/00223980.2011.613875

Wada, K., Shibata, T., Saito, T., & Tanie, K. (2002). Robot assisted activity for elderly people and nurses at a day service center. Proceedings - IEEE International Conference on Robotics and Automation, 2, 1416–1421. https://doi.org/10.1109/robot.2002.1014742

Waytz, A., Cacioppo, J., & Epley, N. (2010). Who sees human? The stability and importance of individual differences in anthropomorphism. Perspectives on Psychological Science, 5(3), 219–232. https://doi.org/10.1177/1745691610369336

Webb, O. J., Eves, F. F., & Smith, L. (2011). Investigating behavioural mimicry in the context of stair/escalator choice. British Journal of Health Psychology, 16(2), 373–385. https://doi.org/10.1348/135910710X510395

West, J., & Mace, M. (2007). Entering a mature industry through innovation: Apple’s iPhone strategy. Proc. Druid Summer Conference.

Whittaker, Z. (2020, June 1). After a spate of device hacks, Google beefs up Nest security protections. TechCrunch. https://techcrunch.com/2020/06/01/google-nest-advanced-protection/

Willemse, C. J. A. M., Toet, A., & van Erp, J. B. F. (2017). Affective and behavioral responses to robot-initiated social touch: Toward understanding the opportunities and limitations of physical contact in human-robot interaction. Frontiers in ICT, 4, 1–13. https://doi.org/10.3389/fict.2017.00012

Willemse, C. J. A. M., & van Erp, J. B. F. (2018). Social touch in human–robot interaction: Robot-initiated touches can induce positive responses without extensive prior bonding. International Journal of Social Robotics, 1–20. https://doi.org/10.1007/s12369-018-0500-9

Yang, K., & Victor, C. (2011). Age and loneliness in 25 European nations. Ageing and Society, 31(8), 1368–1388. https://doi.org/10.1017/S0144686X1000139X

Zajonc, R. B. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology, 9(2, Pt. 2), 1–27. https://doi.org/10.1037/h0025848




 

