Artificial Intelligence

We have recently introduced a dedicated Artificial Intelligence section within the Settings interface. This new area enables you to integrate Large Language Models (LLMs) to access AI-powered functionality. Our newly released auto-translation feature significantly simplifies the creation of multilingual flows. Continue reading to learn how to configure your LLM and develop automatically translated multilingual flows.

Beyond LLM-powered automatic translation of multilingual flows, you can also call ChatGPT directly from your flows by using Webhook actions; a sketch of such a request follows below. We will continue introducing additional AI-enhanced features throughout the coming year, so stay informed about upcoming developments!
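To make the Webhook approach concrete, here is a minimal Python sketch of the kind of request a Webhook action could send to OpenAI's Chat Completions endpoint. The model name, prompt text, and the OPENAI_API_KEY environment variable are illustrative assumptions rather than settings prescribed by the platform; in a flow, you would place the same URL, Authorization header, and JSON body in the Webhook action's configuration.

```python
import os
import requests

# Minimal sketch of the request a Webhook action could send to OpenAI's
# Chat Completions endpoint. The key is read from an environment variable
# here; in a flow you would put it in the webhook's Authorization header.
API_KEY = os.environ["OPENAI_API_KEY"]  # assumption: key stored in an env var

payload = {
    "model": "gpt-4o-mini",  # any chat-capable model you have access to
    "messages": [
        {"role": "system", "content": "You are a helpful support assistant."},
        # In a flow, the user content would come from the contact's last
        # message; a sample message is hard-coded here for illustration.
        {"role": "user", "content": "How do I reset my password?"},
    ],
}

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()

# The assistant's reply, which the flow could send back to the contact.
print(response.json()["choices"][0]["message"]["content"])
```

The same pattern works for the other supported providers; only the endpoint URL, authentication header, and payload fields change.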

AI Applications Within Flows

The potential applications are virtually limitless. Consider these implementation ideas:
🌐 Automated translation of flows into multiple languages

🧠 Language acquisition: Facilitate conversational practice, translation exercises, or vocabulary assessment

🧑🏽‍🤝‍🧑🏾 Customer support: Manage inquiries, address complaints, or process refund requests through conversational interfaces

🗂️ Internal support assistant: Help staff locate documents, policies, or human resources information

📋 Onboarding automation: Guide new users or team members through processes with step-by-step instructions

👱🏽 Sales support: Qualify potential leads, schedule appointments, or recommend service packages

📅 Task automation: Integrate with external APIs to manage reservations, notifications, or daily reporting

Model Integration Process

Begin by accessing your workspace settings and navigating to the AI section in the left navigation panel.

Select your preferred provider from the dropdown menu, choosing from OpenAI, DeepSeek, Google AI, or Anthropic.

Enter your API key, which you can obtain from your selected AI service's dashboard. This example demonstrates OpenAI integration.
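If you want to confirm the key is valid before pasting it in, a quick sanity check against the provider's API can help. The snippet below is an illustrative Python sketch using OpenAI's official client; the environment variable name is an assumption, not part of the platform's configuration.

```python
import os
from openai import OpenAI  # official OpenAI Python client

# Assumption: the key you are about to paste into the AI settings page
# is also exported as OPENAI_API_KEY in your shell.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Listing models is a lightweight call that fails fast on an invalid key.
models = client.models.list()
print([m.id for m in models.data][:5])
```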

After key configuration, you will be prompted to select from available models.

Assign a distinctive name to your model configuration.

Configuration complete! Your LLM is now successfully integrated with your Rapidpro App workspace.

For additional assistance, please use the support widget located in the lower right corner of your browser interface.