VR Chatbot - Pandorabots

VR Chatbot Application

Integrating custom chatbots into VR and other applications via Unity3D


The Challenge

Develop a Chatbot VR demo application for Pandorabots and create Unity3D packages for developers to integrate into their own projects.


My Approach

The project was broken up into three smaller projects: text input/output, audio input/output, and a VR demo with audio input/output. Splitting the work this way lets a developer pull as much or as little of the project as they like into their own. To keep the design modular, I built a Scriptable Object event system architecture; with this system, a developer can easily swap out components of the design for their specific needs. The event system also gives a clear visual picture of how the code is wired together, as shown in the Unity Hierarchy window.
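A minimal sketch of a ScriptableObject event channel in the spirit of this architecture (the names `GameEvent` and `GameEventListener` are illustrative, not from the project):

```csharp
// A ScriptableObject event asset: any component can raise it, any listener
// can subscribe to it via the Inspector, with no direct references between them.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;

[CreateAssetMenu(menuName = "Events/GameEvent")]
public class GameEvent : ScriptableObject
{
    private readonly List<GameEventListener> listeners = new List<GameEventListener>();

    public void Raise()
    {
        // Iterate backwards so a listener may unregister itself mid-raise.
        for (int i = listeners.Count - 1; i >= 0; i--)
            listeners[i].OnEventRaised();
    }

    public void Register(GameEventListener l)   { if (!listeners.Contains(l)) listeners.Add(l); }
    public void Unregister(GameEventListener l) { listeners.Remove(l); }
}

public class GameEventListener : MonoBehaviour
{
    public GameEvent gameEvent;  // Assigned in the Inspector.
    public UnityEvent response;  // E.g. "send text to bot" or "play TTS clip".

    private void OnEnable()  { gameEvent.Register(this); }
    private void OnDisable() { gameEvent.Unregister(this); }
    public void OnEventRaised() { response.Invoke(); }
}
```

Because each channel is an asset rather than a hard-coded reference, swapping one implementation for another is just a matter of reassigning fields in the Inspector.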

Microphone data is converted to text by Watson Speech-to-Text. The transcribed text is then sent to Pandorabots, and the Pandorabots text response is converted to speech by Google Cloud Text-to-Speech; that audio is saved to disk and played back. To avoid frame-rate drops, the audio is written to disk on a background thread.
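A sketch of the final step of that pipeline, writing the synthesized audio off the main thread so file I/O never stalls a frame (class and method names here are illustrative, not the project's actual API):

```csharp
// Save TTS bytes on a thread-pool thread, then load and play the clip
// back on the main thread once the write has completed.
using System.IO;
using System.Threading.Tasks;
using UnityEngine;
using UnityEngine.Networking;

public class AudioFileWriter : MonoBehaviour
{
    public async void SaveAndPlay(byte[] wavBytes, AudioSource source)
    {
        string path = Path.Combine(Application.persistentDataPath, "tts_response.wav");

        // File.WriteAllBytes runs on a background thread here,
        // so the render loop is not blocked by disk I/O.
        await Task.Run(() => File.WriteAllBytes(path, wavBytes));

        // Back on the main thread: load the written file as an AudioClip.
        using (var req = UnityWebRequestMultimedia.GetAudioClip("file://" + path, AudioType.WAV))
        {
            var op = req.SendWebRequest();
            while (!op.isDone) await Task.Yield();
            source.clip = DownloadHandlerAudioClip.GetContent(req);
            source.Play();
        }
    }
}
```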

Text Input/Output

The first step was to integrate the Pandorabots API into Unity. A simple UI was created for typing and sending messages to Pandorabots and receiving a text response from the developer's custom chatbot.
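A sketch of such a request using Unity's `UnityWebRequest` (the endpoint URL and form-field names below are assumptions for illustration; the Pandorabots API documentation defines the exact request format and authentication):

```csharp
// Send user input to a Pandorabots bot and hand the raw JSON response
// to a callback. Endpoint and field names are placeholders.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class PandorabotsClient : MonoBehaviour
{
    [SerializeField] private string talkUrl = "https://api.pandorabots.com/talk"; // assumption
    [SerializeField] private string userKey = "YOUR_USER_KEY";                    // assumption

    public IEnumerator Talk(string input, System.Action<string> onResponse)
    {
        var form = new WWWForm();
        form.AddField("user_key", userKey);
        form.AddField("input", input);

        using (UnityWebRequest req = UnityWebRequest.Post(talkUrl, form))
        {
            yield return req.SendWebRequest();
            if (req.result == UnityWebRequest.Result.Success)
                onResponse(req.downloadHandler.text); // JSON containing the bot's reply
            else
                Debug.LogWarning("Pandorabots request failed: " + req.error);
        }
    }
}
```

The UI's send button simply starts this coroutine with the input field's text and routes the parsed reply back into the chat window.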

Audio Input/Output

To set up audio input and chatbot audio output, I used IBM's Watson Speech-to-Text (STT) API to stream microphone data and the Google Cloud Text-to-Speech (TTS) API to convert the chatbot's text response to speech. To accompany the chatbot's audio response, I created a visual animation that could be adapted for lip syncing depending on the bot avatar. In our case the bot avatar was a simple sphere, so the animation consisted of a flare radiating according to the output audio's amplitude.
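A sketch of driving that flare from the playing audio's amplitude via `AudioSource.GetOutputData` (the field names and tuning constants are assumptions):

```csharp
// Scale a "flare" transform with the RMS amplitude of the bot's speech,
// as a simple stand-in for lip sync on a spherical avatar.
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class AudioFlare : MonoBehaviour
{
    public Transform flare;             // The radiating flare mesh.
    public float scaleMultiplier = 4f;  // Tuned by eye (assumption).

    private readonly float[] samples = new float[256];
    private AudioSource source;

    private void Awake() { source = GetComponent<AudioSource>(); }

    private void Update()
    {
        // Sample the current output buffer and compute its RMS amplitude.
        source.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // Grow the flare with amplitude, smoothed to avoid flicker.
        float target = 1f + rms * scaleMultiplier;
        flare.localScale = Vector3.Lerp(flare.localScale, Vector3.one * target, 0.2f);
    }
}
```

The same amplitude signal could instead drive blend shapes on a humanoid avatar's mouth, which is why the animation layer was kept separate from the audio pipeline.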

Chatbot VR Demo

With Text-to-Speech and Speech-to-Text set up, the last step was to create an immersive experience in VR. A simple room with free textures and furniture made up the virtual environment. VRTK was used for locomotion due to its familiarity across the community. The interactable objects consisted of a ball and a service bell: the ball on the desk could be picked up and passed between hands, and the service bell could be rung by making a pointing gesture with an index finger and touching the bell with the user's virtual fingertip.
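The bell interaction can be sketched as a trigger collider on the bell top that responds to the finger's collider (the `"FingerTip"` tag and cooldown value are assumptions for illustration):

```csharp
// Ring the service bell when the pointing finger's collider enters the
// bell's trigger volume, with a short cooldown to prevent double-rings.
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class ServiceBell : MonoBehaviour
{
    public float cooldown = 0.3f;  // Seconds between rings (assumption).

    private AudioSource ding;
    private float lastRing;

    private void Awake() { ding = GetComponent<AudioSource>(); }

    private void OnTriggerEnter(Collider other)
    {
        // "FingerTip" is a hypothetical tag on the index-finger collider,
        // enabled only while the hand is making a pointing gesture.
        if (other.CompareTag("FingerTip") && Time.time - lastRing > cooldown)
        {
            lastRing = Time.time;
            ding.Play();
        }
    }
}
```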