Deployment options¶
The Alan AI Platform is designed to be adaptable and accommodate a variety of deployment scenarios. This page describes the configurations that system architects can employ to meet the specific goals and requirements of their organizations.
The Alan AI Platform offers the following deployment options, in ascending order of complexity and cost:
Cloud deployment¶
The Alan AI Platform is delivered as Platform-as-a-Service by default. All components, tools and services required for conversational UX implementation are hosted and maintained in the Alan AI Cloud fully managed by Alan AI. Users can access the Alan AI Platform over the Internet.
Cloud deployment is the easiest deployment option, as it is immediately available and requires no additional preparation and maintenance.
The cloud deployment option involves the following roles and components:
Script developer (1): person responsible for:
Writing scripts that define the dialog and business logic of the conversational experience. Dialog scripts are edited and managed in Alan AI Studio (4), which is part of the Alan AI Cloud (3).
Managing script deployment.
Analyzing conversational data and customers’ behavior in interactions with an AI assistant.
Application users (2): users who interact with an application that has the Alan AI SDK embedded. The Alan AI SDK communicates with a dedicated virtual machine in the Alan AI Cloud (3), transferring voice/text data and events over a TLS-encrypted TCP/IP connection.
Alan AI Cloud (3): cloud-based environment that hosts Alan AI Studio (4), orchestrates VMs (5), manages storage (6) and performs other functions.
Alan AI Studio (4): interface to the system, a web-based IDE comprising a script editor, version and deployment control, dialog script testing tools, logs and analytics.
SLU, Dialog VM (5): virtual machine allocated for each conversational project. The VM is responsible for voice processing, dialog management and execution of the business logic defined in the dialog script. Dialog scripts are loaded from the Alan AI Studio storage (6).
Storage (6): Alan AI Studio storage containing dialog scripts with version history, user behavior analytics and statistics.
Conversational services (7): component responsible for speech recognition and generation. Conversational services may be hosted in the Alan AI Cloud (3) or provided as a third-party API.
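The dialog scripts mentioned above are JavaScript executed by the Dialog VM (5). The following is a minimal, self-contained sketch of the idea; the `intent()` runtime API is stubbed here so the snippet can run anywhere, and both the stub and the sample phrases are illustrative, not the Alan AI implementation:

```javascript
// Stub of the dialog-script runtime, for illustration only:
// in Alan AI Studio, intent() is a global provided by the Dialog VM (5).
const handlers = [];
function intent(phrase, callback) {
  handlers.push({ phrase, callback });
}

// The dialog script itself: match a user phrase and play a response.
intent('What can you do?', p => {
  p.play('I can answer questions about this app.');
});

// Simulate the Dialog VM dispatching a recognized phrase to the script.
function dispatch(phrase) {
  let reply = null;
  for (const h of handlers) {
    if (h.phrase === phrase) {
      h.callback({ play: text => { reply = text; } });
    }
  }
  return reply;
}

console.log(dispatch('What can you do?')); // prints the scripted reply
```

In the cloud deployment, the script developer (1) never runs this dispatch loop themselves: the Dialog VM (5) loads the script from storage (6) and routes recognized phrases to the matching `intent()` handler.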
Minimal on-premises deployment¶
This deployment scenario features a hybrid infrastructure, with the majority of components hosted in the cloud.
This deployment option requires the client (organization) to run a Docker image (9) with a Dialog VM (5) on premises. In this scenario, Alan AI Studio is not used; therefore, the client is responsible for script management and for setting up a third-party voice API (if one is used).
The minimal on-premises deployment option involves the following roles and components:
Script developer (1) writes the dialog script. The dialog script can be stored on the same machine that runs Docker (9) and the Dialog VM (5).
Application users (2) communicate directly with the Dialog VM (5), passing through the client’s DMZ (8).
Alan AI Cloud (3) hosts Conversational services (7).
Dialog VM (5) mounts the script folder from the host machine and connects to the Conversational services (7).
Conversational services (7) may reside in the Alan AI Cloud (3) or in a third-party cloud (10).
DMZ (8): client’s demilitarized zone. The DMZ should have an externally visible IP address with a DNS name through which the Alan AI SDK (2) will connect to the Dialog VM (5) in Docker (9).
Docker (9) runs the Dialog VM image (5) on the client’s machine.
Third-party voice API service (10) can be used instead of Conversational services in the Alan AI Cloud (3).
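As a rough illustration, the pieces above could be wired together with a configuration like the one below, expressed as a Docker Compose file for readability (this scenario uses plain Docker, where the equivalent `docker run` flags would be `-p` and `-v`). The image name, port, paths and environment variable are assumptions; Alan AI supplies the actual Dialog VM image and its settings.

```yaml
# Hypothetical sketch only: image name, port, paths and variables are assumptions.
services:
  dialog-vm:                          # Dialog VM (5) run by Docker (9)
    image: example/alan-dialog-vm
    ports:
      - "443:443"                     # reachable through the DMZ (8) via a public DNS name
    volumes:
      - ./scripts:/alan/scripts:ro    # dialog scripts mounted from the host machine
    environment:
      CONVERSATIONAL_API_URL: "https://voice.example.com"  # Alan AI Cloud (3) or third-party (10)
```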
On-premises deployment with cloud management¶
This deployment scenario features a hybrid infrastructure. Here, a Docker image (9) with the Dialog VM (5) runs on premises, but the script developer (1) can still use Alan AI Studio (4) hosted in the Alan AI Cloud (3) to:
Write and manage scripts
Get access to conversational analytics
Control the Dialog VM running on premises
This deployment option is available with a Kubernetes cluster or Docker Compose.
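For the Kubernetes variant, the on-premises Dialog VM might be described with a manifest along the following lines. The image name, environment variable names and their values are assumptions, not official Alan AI settings; the point is only that the Dialog VM on premises connects back to Alan AI Studio in the cloud for script management and analytics.

```yaml
# Hypothetical Kubernetes sketch: image, env var names and values are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: dialog-vm
spec:
  replicas: 1
  selector:
    matchLabels:
      app: dialog-vm
  template:
    metadata:
      labels:
        app: dialog-vm
    spec:
      containers:
        - name: dialog-vm
          image: example/alan-dialog-vm        # Dialog VM (5) on premises
          env:
            - name: STUDIO_URL                 # connect back to Alan AI Studio (4) in the cloud (3)
              value: "https://studio.example.com"
            - name: PROJECT_KEY                # hypothetical key linking the VM to a Studio project
              value: "<your-project-key>"
```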
On-premises deployment with cloud services¶
In this deployment scenario, all services are hosted on premises: Alan AI Studio (4), Dialog VM (5) and storage (6). Conversational services can run either in the Alan AI Cloud (3) or as a third-party voice API, such as Google or Microsoft (10).
This deployment option is available with a Kubernetes cluster or Docker Compose.
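For the Docker Compose variant, the on-premises stack could be sketched roughly as follows. Every service name and image here is a placeholder assumption; only the topology (Studio, Dialog VM and storage on premises, voice services outside) reflects the scenario described above.

```yaml
# Hypothetical sketch: service names and images are placeholders, not Alan AI artifacts.
services:
  alan-studio:                        # Alan AI Studio (4), the web IDE
    image: example/alan-studio
    ports:
      - "443:443"
  dialog-vm:                          # SLU, Dialog VM (5)
    image: example/alan-dialog-vm
    environment:
      CONVERSATIONAL_API_URL: "https://voice.example.com"  # cloud (3) or third-party API (10)
  storage:                            # Storage (6): scripts, versions, analytics
    image: example/alan-storage
    volumes:
      - ./data:/var/lib/alan
```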
On-premises deployment¶
In the on-premises deployment scenario, all Alan AI Cloud components are hosted on premises.
This deployment option is available with a Kubernetes cluster or Docker Compose.