
Comwork AI




This feature aims to expose AI1 models, such as NLP2 or LLM3 models, as an API based on this open source project.

Enabling this API​

In the SaaS version, you can request access via the support system.

If you're an admin of the instance, you can grant users access like this:


UI chat​

Once your access is enabled, you can try the CWAI API using this chat web UI:


Use the API​

Of course, the main purpose is to interact with those models using very simple HTTP endpoints:


Here's how to get all the available models:

$ curl -X 'GET' '' -H 'accept: application/json' -H 'X-Auth-Token: XXXXXX'


"models": [
"status": "ok"

Then prompting with one of the available models:

curl -X 'POST' \
  '' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -H 'X-Auth-Token: XXXXXX' \
  -d '{
    "model": "nlptownsentiment",
    "message": "This is bad !",
    "settings": {}
  }'

The answer would be:

"response": [
"The predicted emotion is: Anger"
"score": 1,
"status": "ok"


  • you have to replace the XXXXXX value with your own token, generated with this procedure.
  • you can replace the empty URL with your API instance's URL, which can be set with the CWAI_API_URL environment variable. For Tunisian customers, for example, it would be
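In scripts, that environment variable can be read once with a small helper. This is a minimal sketch with no default URL, since the instance address depends on your deployment:

```python
import os

def cwai_base_url() -> str:
    """Return the CWAI instance URL from the CWAI_API_URL environment variable."""
    url = os.environ.get("CWAI_API_URL")
    if not url:
        raise RuntimeError("CWAI_API_URL is not set")
    # Drop a trailing slash so paths can be appended consistently.
    return url.rstrip("/")
```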

Use the CLI​

You can use the cwc CLI, which provides an ai subcommand:

$ cwc ai
This command lets you call the CWAI endpoints

Usage:
  cwc ai
  cwc ai [command]

Available Commands:
  models      Get the available models
  prompt      Send a prompt

Flags:
  -h, --help   help for ai

Use "cwc ai [command] --help" for more information about a command.

List the available models​

$ cwc ai models
[gpt2 nlptownsentiment nltksentiment textblobsentiment mock]
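If you script around the CLI, the bracketed output can be split into a plain list. A tiny sketch, assuming the format is simply whitespace-separated names between brackets:

```python
def parse_models(output: str) -> list[str]:
    """Split the bracketed list printed by `cwc ai models` into model names."""
    return output.strip().strip("[]").split()
```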

Send a prompt to an available model​

$ cwc ai prompt
Error: required flag(s) "message", "model" not set
Usage:
  cwc ai prompt [flags]

Flags:
  -h, --help             help for prompt
  -m, --message string   The message input
  -t, --model string     The chosen model
$ cwc ai prompt --model nltksentiment --message "This is bad"
Status Response Score
ok [The predicted sentiment is: negative, score: -0.5423] -0.5423

Driver interface​

If you fork the CWAI API, you can implement your own driver that loads models and generates answers, by implementing this abstract class:

from abc import ABC, abstractmethod

class ModelDriver(ABC):
    @abstractmethod
    def load_model(self): ...

    @abstractmethod
    def generate_response(self, prompt: Prompt): ...

Then add your model to the ENABLED_MODELS environment variable.
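To make the contract concrete, here is a hypothetical driver for a canned model, in the spirit of the mock model listed earlier. The abstract base is repeated so the sketch is self-contained, the project's Prompt type is replaced by a plain string, and the return shape simply mirrors the HTTP response format shown above; none of this is the project's actual code.

```python
from abc import ABC, abstractmethod

class ModelDriver(ABC):
    @abstractmethod
    def load_model(self): ...

    @abstractmethod
    def generate_response(self, prompt): ...

class MockDriver(ModelDriver):
    """Hypothetical driver that returns a canned answer (in the spirit of 'mock')."""

    def load_model(self):
        # A real driver would load model weights or a pipeline here.
        self.loaded = True

    def generate_response(self, prompt):
        # Mirror the response shape of the HTTP examples above.
        return {
            "response": [f"echo: {prompt}"],
            "score": 1,
            "status": "ok",
        }
```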

In the future, we'll propose a web GUI to upload your own drivers as serverless functions in the web console. In the meantime, you can ask Comwork via the support system.

  1. Artificial intelligence
  2. Natural language processing
  3. Large language model