Hybrid deployment

The Alan AI Platform offers the following options for hybrid deployment, in ascending order of complexity and cost:

Minimal on-premises deployment

This deployment scenario features a hybrid infrastructure, with the majority of components hosted in the Alan AI Cloud.

This deployment option requires the organization to run a Docker image with the Action Transformer VM at the on-premises location (a minimal launch sketch follows the component list below). In this scenario, Alan AI Studio is not used; the organization is responsible for script management and third-party conversational API setup (if applicable).

Image: Minimal on-premises deployment (minimal-on-premises.svg)

The minimal on-premises deployment option involves the following roles and components:

  1. Script developer: a person who writes scripts that define the dialog and business logic of the conversational experience, manages script deployment, and analyzes conversational data and customers’ behavior in interactions with the AI assistant.

    The dialog scripts can be stored on the same machine that runs Docker (7) and the Action Transformer VM (4).

  2. Application users: users interacting with the application that has the Alan AI SDK embedded. The application users communicate directly with the Action Transformer VM (4) through the client’s DMZ (6).

  3. Alan AI Cloud: a cloud-based environment that hosts Conversational services (5).

  4. Action Transformer VM: a VM allocated for each AI assistant project. This VM is responsible for dialog management and execution of the business logic defined in the dialog script. The Action Transformer VM mounts the script folder from the host machine and connects to the Conversational services (5).

  5. Conversational services: components responsible for speech recognition and generation. The Conversational services may reside in the Alan AI Cloud (3) or in a third-party cloud (8).

  6. DMZ: client’s demilitarized zone. The DMZ must have an externally visible IP address with a DNS name through which the Alan AI SDK (2) will connect to the Action Transformer (4) in Docker (7).

  7. Docker: an environment running the Action Transformer image (4) on the client’s machine.

  8. Third-party API services: conversational services that can be used instead of Conversational services (5) in the Alan AI Cloud (3).
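The sketch below shows one way to start the Action Transformer container on the on-premises host with the Python Docker SDK, mirroring the description above: the dialog script folder is mounted from the host machine and a port is published for the DMZ (6). The image name, script path and port are illustrative assumptions, not values defined in this document; use the ones supplied with your Alan AI deployment package.

```python
# Minimal sketch: run the Action Transformer image on the on-premises host.
# The image tag, script path and port are placeholders assumed for illustration;
# substitute the values provided with your Alan AI deployment package.
import docker

client = docker.from_env()

container = client.containers.run(
    "alan-ai/action-transformer:latest",       # placeholder image name
    name="alan-action-transformer",
    detach=True,
    restart_policy={"Name": "unless-stopped"},
    # Mount the dialog scripts stored on the host so the Action Transformer VM (4)
    # can load them, as described in the component list above.
    volumes={"/opt/alan/scripts": {"bind": "/alan/scripts", "mode": "ro"}},
    # Publish the port the Alan AI SDK (2) reaches through the DMZ (6).
    ports={"8443/tcp": 8443},
)
print(f"Action Transformer container started: {container.short_id}")
```

In this option, the container started here is the only Alan AI component running on the client side; everything else stays in the Alan AI Cloud or a third-party cloud.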

On-premises deployment with cloud management

This deployment scenario features a hybrid infrastructure. Here, a Docker image with the Action Transformer VM runs at the on-premises location, but the script developer can use the power of Alan AI Studio hosted in the Alan AI Cloud to:

  • Write and manage scripts

  • Get access to conversational analytics

  • Control the Action Transformer running at the on-premises location

This deployment option is available with a Kubernetes cluster or Docker Compose.

Image: On-premises deployment with cloud management (cloud-management.svg)

The on-premises deployment with cloud management option involves the following roles and components:

  1. Script developer: a person who writes scripts that define the dialog and business logic of the conversational experience, manages script deployment, and analyzes conversational data and customers’ behavior in interactions with the AI assistant.

  2. Application users: users interacting with the application that has the Alan AI SDK embedded. The application users communicate directly with the Action Transformer VM (5) through the client’s DMZ (8).

  3. Alan AI Cloud: a cloud-based environment that hosts Alan AI Studio (4), administers storage (6) and performs other functions.

  4. Alan AI Studio: an interface to the Alan AI Platform, a web-based IDE comprising a script editor, version and deployment control, dialog script testing tools, logs and analytics.

  5. Action Transformer VM: a VM allocated for each AI assistant project. This VM is responsible for dialog management and execution of the business logic defined in the dialog script. The Action Transformer VM mounts the script folder from the host machine and connects to the Conversational services (7).

  6. Storage: the Alan AI Studio storage containing dialog scripts with version history, user behavior analytics and statistics.

  7. Conversational services: components responsible for speech recognition and generation.

  8. DMZ: client’s demilitarized zone. The DMZ must have an externally visible IP address with a DNS name through which the Alan AI SDK (2) will connect to the Action Transformer (5) in Docker (9); a reachability check for this endpoint is sketched after this list.

  9. Docker: an environment running the Action Transformer image (5) on the client’s machine.
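Because the Alan AI SDK (2) connects through the DMZ (8), it is worth verifying that the externally visible DNS name actually resolves and that the published port accepts connections before rolling out client applications. A minimal check is sketched below; the hostname and port are hypothetical examples standing in for your own values.

```python
# Minimal reachability check for the DMZ endpoint the Alan AI SDK will use.
# The hostname and port are hypothetical; use the externally visible DNS name
# and the port you published for the Action Transformer.
import socket

DMZ_HOST = "assistant.example.com"   # placeholder DNS name of the DMZ endpoint
DMZ_PORT = 8443                      # placeholder port published by Docker

def check_dmz_endpoint(host: str, port: int, timeout: float = 5.0) -> None:
    # 1. The DNS name must resolve to an externally visible address.
    addresses = {info[4][0] for info in socket.getaddrinfo(host, port)}
    print(f"{host} resolves to: {', '.join(sorted(addresses))}")

    # 2. The Action Transformer behind the DMZ must accept TCP connections.
    with socket.create_connection((host, port), timeout=timeout):
        print(f"{host}:{port} accepts connections")

if __name__ == "__main__":
    check_dmz_endpoint(DMZ_HOST, DMZ_PORT)
```

If the name does not resolve publicly or the connection is refused, the DMZ configuration (DNS record, firewall rules, published port) is the first place to look, rather than the Action Transformer itself.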

On-premises deployment with cloud services

In this deployment scenario, all services (Alan AI Studio, the Action Transformer VM and storage) are hosted at the on-premises location. Conversational services can run either in the Alan AI Cloud or through third-party APIs such as Google or Microsoft.

This deployment option is available with a Kubernetes cluster or Docker Compose; a Docker Compose skeleton for the on-premises components is sketched after the component list below.

Image: On-premises deployment with cloud services (voice-services.svg)

The on-premises deployment with cloud services option involves the following roles and components:

  1. Script developer: a person who writes scripts that define the dialog and business logic of the conversational experience, manages script deployment, and analyzes conversational data and customers’ behavior in interactions with the AI assistant.

  2. Application users: users interacting with the application that has the Alan AI SDK embedded. The application users communicate directly with the Action Transformer VM (5) through the client’s DMZ (8).

  3. Alan AI Cloud: a cloud-based environment that can host the Conversational services (7) when they are not provided by third-party API services (10).

  4. Alan AI Studio: an interface to the Alan AI Platform, a web-based IDE comprising a script editor, version and deployment control, dialog script testing tools, logs and analytics.

  5. Action Transformer VM: a VM allocated for each AI assistant project. This VM is responsible for dialog management and execution of the business logic defined in the dialog script. The Action Transformer VM mounts the script folder from the host machine and connects to the Conversational services (7).

  6. Storage: the Alan AI Studio storage containing dialog scripts with version history, user behavior analytics and statistics.

  7. Conversational services: components responsible for speech recognition and generation.

  8. DMZ: client’s demilitarized zone. The DMZ must have an externally visible IP address with a DNS name through which the Alan AI SDK (2) will connect to the Action Transformer (5) in Docker (9).

  9. Docker: an environment running Alan AI component images on the client’s machine.

  10. Third-party API services: conversational services that can be used instead of Conversational services (7) in the Alan AI Cloud (3).
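The sketch below generates a Docker Compose skeleton for the on-premises components described above (Alan AI Studio, storage and the Action Transformer VM). All service names, image tags, ports and paths are assumptions made for illustration; replace them with the values from your Alan AI deployment package.

```python
# Sketch: generate a docker-compose.yml skeleton for the fully on-premises stack.
# Service names, image tags, ports and volume paths are placeholders assumed for
# illustration; the actual images and settings come with the Alan AI deployment
# package.
import yaml  # PyYAML

compose = {
    "services": {
        "alan-studio": {                        # component 4: web-based IDE
            "image": "alan-ai/studio:latest",   # placeholder image name
            "ports": ["8080:8080"],
            "depends_on": ["alan-storage"],
        },
        "alan-storage": {                       # component 6: scripts, analytics
            "image": "alan-ai/storage:latest",  # placeholder image name
            "volumes": ["alan-data:/data"],
        },
        "action-transformer": {                 # component 5: dialog management
            "image": "alan-ai/action-transformer:latest",   # placeholder
            "ports": ["8443:8443"],             # port exposed through the DMZ (8)
            "volumes": ["/opt/alan/scripts:/alan/scripts:ro"],
            "depends_on": ["alan-studio", "alan-storage"],
        },
    },
    "volumes": {"alan-data": {}},
}

with open("docker-compose.yml", "w") as f:
    yaml.safe_dump(compose, f, sort_keys=False)

print("Wrote docker-compose.yml; review and replace the placeholder values.")
```

The same structure maps onto a Kubernetes cluster: each service becomes a Deployment plus a Service, with the script folder and storage provided as volumes.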