Defining the status of a virtual assistant: can a chatbot be called a subject of legal relations?

23 June 2017

Can a chatbot be considered a subject of legal relations? Are bots merely services, or official representatives of companies? Who should bear responsibility for the mistakes of virtual assistants? Anton Tarasyuk and Mikhail Vertepa, representatives of Legal IT Group, set out to answer these questions.

 

Bots are services, not company representatives

 

According to the experts, a chatbot today cannot be considered a fully-fledged subject of legal relations, i.e. it cannot hold rights and obligations. Virtual assistants are simply part of a company's services, and the company bears legal responsibility for recommendations that cause significant losses to the user, as well as for illegal actions such as processing and using a customer's personal data without notifying them.

In fact, a chatbot is a combination of software and content. Even when self-learning AI algorithms are integrated into a chatbot, giving it a degree of emotional intelligence, the virtual assistant can hardly be regarded as a fully-fledged employee or company representative, at least at the current stage of technological development.

Accordingly, when discussing complex chatbots with artificial intelligence, the following theses apply:

  • a chatbot is a communication tool, and the company that develops it is responsible for it;
  • a bot's online activity entails consequences for its creator;
  • a chatbot and its individual elements are objects of intellectual property.

 

Cases in which a company must take responsibility for a chatbot's words

 

The experts reviewed several examples showing how a company is responsible for a chatbot's words. One of them involved ordering pizza via a chatbot from a service that promises 20-minute delivery: if the company is late, it refunds half of the order price. So, a customer orders and pays for the pizza in the chat. From that moment, the company must fulfil its obligation or incur liability for its promise.
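
To see how concrete such an obligation is, the refund rule from this example could be encoded directly in the bot's order-handling logic. The sketch below is purely illustrative: the function and the order timestamps are assumptions, not code from any real delivery service.

```python
from datetime import datetime, timedelta

DELIVERY_PROMISE = timedelta(minutes=20)  # the promise made in the chat
LATE_REFUND_SHARE = 0.5                   # half of the order price

def settle_order(price: float, ordered_at: datetime, delivered_at: datetime) -> float:
    """Return the refund owed to the customer under the 20-minute promise."""
    if delivered_at - ordered_at > DELIVERY_PROMISE:
        return price * LATE_REFUND_SHARE
    return 0.0

# Example: a pizza ordered at 18:00 and delivered at 18:27 is 7 minutes late,
# so the bot's operator owes the customer half of the price back.
refund = settle_order(200.0, datetime(2017, 6, 23, 18, 0), datetime(2017, 6, 23, 18, 27))
```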

In addition, the company must obtain the user's consent before the chatbot processes their personal data, accepts payments, and so on. Otherwise, it violates the laws protecting personal data and consumer rights, as well as e-commerce regulations.
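
In practice, this means the bot should not touch personal data or payment details before the user has explicitly agreed. Below is a minimal sketch of such a consent gate; the message text, function names, and `send_message`/`charge` callbacks are assumptions standing in for a hypothetical messenger platform and payment API.

```python
def handle_payment_request(user: dict, order: dict, send_message, charge) -> None:
    """Charge the customer only after explicit consent to data processing."""
    if not user.get("consented_to_data_processing"):
        # Ask for consent first and stop; the payment is processed only
        # after the user explicitly agrees.
        send_message(
            user["id"],
            "To accept your payment we need to process your personal data. "
            "Reply YES to agree (see our privacy policy for details)."
        )
        return
    charge(user["id"], order["amount"])
```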

Another example: a chatbot provides consultancy services at an additional cost. If a customer follows the virtual assistant's recommendations and suffers documented losses, they can file a suit against the company that developed the bot. The resolution of such disputes will depend mainly on the terms of the public offer agreement.

It is important to note that bots can use only publicly available information to provide consultations.

Conclusion:

The experts concluded that qualified specialists should back up chatbots; otherwise, companies risk significant losses.
