An AI development platform for
context-aware conversations

On Premise

Run Servo on a local server when high security and reliability are needed


Cloud

Run conversations and AI automation with Servo from the cloud

Edge Device

Run conversations on IoT devices with Servo deployed directly on the device

An open-source, cross-platform development framework for conversations and AI, built to support large-scale projects:

  • Develop, maintain and re-use complex conversations through a visual architectural editor
  • Augment NLU classifiers with context-based understanding
  • Mix and manage different models from IBM Watson, Microsoft Cognitive Services, Amazon Lex and more
  • Use a simulator and debugger to test and debug bots internally
  • Easily extend functionality with a fully open-source runtime
  • Drag-and-drop modules for seamless back-end integration
  • Automated testing harness for end-to-end dialogues
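
The mix-and-match idea above can be pictured as a provider-agnostic classifier interface. This is a minimal sketch in plain Python; the class and method names are illustrative, not Servo's actual API, and the keyword matcher stands in for a real cloud NLU service:

```python
from abc import ABC, abstractmethod


class IntentClassifier(ABC):
    """Common interface so NLU providers are interchangeable (illustrative)."""

    @abstractmethod
    def classify(self, utterance: str) -> str: ...


class KeywordClassifier(IntentClassifier):
    """Stand-in for a cloud NLU service (e.g. Watson or Lex) in this sketch."""

    def __init__(self, keyword_map: dict[str, str]):
        self.keyword_map = keyword_map

    def classify(self, utterance: str) -> str:
        for keyword, intent in self.keyword_map.items():
            if keyword in utterance.lower():
                return intent
        return "unknown"


# A bot can route each utterance through whichever classifier fits best.
greetings = KeywordClassifier({"hello": "greet", "bye": "farewell"})
print(greetings.classify("Hello there"))  # → greet
```

Because every provider hides behind the same interface, swapping one model for another is a one-line change rather than a rewrite.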

Full Development Interface

Servo is made by developers, for developers. We know what a developer needs to build real AI systems. Servo is built on classic design patterns: an abstracted database access layer, templating, caching and more, as presented in the videos below:

Visual Flow Control

Simulator & Debugger

AI Learner

Advanced Context Engine

Context is a very powerful tool for AI systems. A system’s behavior can be influenced by location, time, a specific user, or correlations within a conversation. With Servo’s context-detection algorithms, the same conversations are modeled with far fewer nodes and less complexity. Apply it to speech, text, location, time, or anything else you can think of.
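
One way to picture context-based modeling (a minimal sketch, not Servo's implementation): instead of duplicating a dialogue branch for every situation, a single node consults a context object, so one node covers what would otherwise be many near-identical branches:

```python
from dataclasses import dataclass
from datetime import time


@dataclass
class Context:
    """Conversation context: any signal can influence behavior (illustrative fields)."""
    location: str
    local_time: time
    user_name: str


def greeting_node(ctx: Context) -> str:
    """One node replaces separate morning/evening/per-location branches."""
    part = "morning" if ctx.local_time.hour < 12 else "evening"
    return f"Good {part}, {ctx.user_name}! Welcome back to {ctx.location}."


ctx = Context(location="Tel Aviv", local_time=time(9, 30), user_name="Dana")
print(greeting_node(ctx))  # → Good morning, Dana! Welcome back to Tel Aviv.
```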


Build AI abilities as sub-processes

As any experienced developer knows, when projects grow in functionality and team size, the risk of bugs and critical crashes grows exponentially. In Servo, this risk is significantly reduced by applying two important principles, both embodied in sub-processes: isolation and re-use.
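
The two principles can be sketched in plain code (hypothetical names, not Servo's API): a sub-process owns its own state and dependencies, so it stays isolated from the parent flow, and any flow can re-use it:

```python
def collect_address(ask):
    """Reusable sub-process: isolated from the parent dialogue's state.
    The `ask` callable is injected, so the same sub-process works in any flow."""
    city = ask("Which city?")
    street = ask("Which street?")
    return {"city": city, "street": street}


def shipping_flow(ask):
    # The parent flow re-uses the sub-process; a bug inside it stays contained
    # and a fix benefits every flow that calls it.
    address = collect_address(ask)
    return f"Shipping to {address['street']}, {address['city']}"


# Simulated user answers stand in for a live conversation.
answers = iter(["Berlin", "Main St 5"])
print(shipping_flow(lambda prompt: next(answers)))  # → Shipping to Main St 5, Berlin
```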

Simplify tree structure

Reuse sub-processes within the same project or across different projects

Reduce development time

Trusted By

Contact us: