AI Models


Intro

AI Models allow you to use any of HuggingFace's transformers models in your API.

This can be useful for tasks such as natural language processing, machine translation, and question answering, and it allows you to connect multiple models to each other.


How AI models work

You will find a wide range of available AI nodes, but they all work the same way.

AI nodes use the HuggingFace Inference API to run the model matching the model ID you choose, and they save the response so it can be accessed from any other node using the following syntax: koxy.res.{NODE_NAME}.


You can use an AI node's response from any other node with koxy.res.{NODE_NAME} as described above. If the response is JSON or an object, you can access nested values like this (example): koxy.res.gpt2.outputs (the value of the outputs field the gpt2 model returned).
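To make the dotted-path idea concrete, here is a minimal sketch of how a reference like koxy.res.gpt2.outputs could resolve against saved node responses. The resolve function and the sample data are assumptions for illustration, not Koxy's actual implementation.

```python
def resolve(path: str, responses: dict):
    """Walk a dotted path like 'gpt2.outputs' through nested responses."""
    value = responses
    for key in path.split("."):
        value = value[key]
    return value

# Hypothetical saved responses, keyed by node name (what koxy.res might hold).
responses = {"gpt2": {"outputs": "Hello, world!"}}

print(resolve("gpt2.outputs", responses))  # → Hello, world!
```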


Add AI nodes to your flow

To add AI nodes to your flow, open the nodes list (check the flow builder documentation to learn more), choose the Artificial intelligence tab, choose the AI node you want to use, and then fill in the following properties:

  • name: node names should be unique and can't be changed.

  • model: the ID of the model you want to use. Click on the model input and a list of all available models will open up (based on the AI node you chose).

  • body: the parameters you want to send to the model.

  • next: the trigger the node should call after running the model. Click on the next input to get a list of available triggers.
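Put together, the four properties above might look like the sketch below. The dictionary shape and values are hypothetical, chosen to mirror the list above rather than Koxy's internal node format.

```python
# Hypothetical AI node configuration (illustrative only).
ai_node = {
    "name": "model1",              # unique; can't be changed later
    "model": "bert-base-uncased",  # HuggingFace model ID
    "body": '{"inputs": "The goal of life is [MASK]"}',  # parameters sent to the model
    "next": "stop",                # trigger called after the model runs
}

print(ai_node["name"])  # → model1
```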


Requirements and limitations

  • You will need to add your HuggingFace token: AI models will have low rate limits if you don't add your own HuggingFace token.

  • Not all models are supported: Only models supported in the Inference API can be used in this node.

  • Rate limits: if you suddenly send 10k requests, you're likely to receive 503 errors saying models are loading. To prevent that, ramp your queries up gradually from 0 to 10k over the course of a few minutes.
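One way to spread requests out is to precompute evenly spaced send times. The helper below is a sketch under that assumption; the exact pacing you need depends on your workload and limits.

```python
def ramp_schedule(total_requests: int, duration_s: float) -> list:
    """Evenly space `total_requests` send times across `duration_s` seconds,
    so requests ramp up smoothly instead of arriving in one burst."""
    if total_requests <= 1:
        return [0.0]
    step = duration_s / (total_requests - 1)
    return [i * step for i in range(total_requests)]

# e.g. 10 requests spread over 9 seconds: one request per second.
schedule = ramp_schedule(10, 9.0)
print(schedule)
```

In a real client you would `time.sleep()` until each scheduled offset before firing the corresponding request.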


Valid body parameters

The body format is left up to you, since it depends on the model type. We recommend you check this documentation to learn more about how to set your body's parameters.

Example:

'{"inputs": "The answer to the universe is [MASK]."}'

After all, how you set your model's parameters changes based on the model you are using. Again, you can click on the Deploy to Inference API button in HuggingFace to see how they set their parameters for every model.
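For comparison, here is a sketch of sending the same kind of body straight to the HuggingFace Inference API, the same service the AI nodes use under the hood. The token value is a placeholder, and the request itself is left commented out since it needs a real token and network access.

```python
import json

MODEL_ID = "bert-base-uncased"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"
HF_TOKEN = "hf_..."  # placeholder: your HuggingFace token

# The body follows the fill-mask task format shown above.
body = {"inputs": "The answer to the universe is [MASK]."}
print(json.dumps(body))

# Sending it would look like this (requires the `requests` package):
# import requests
# resp = requests.post(
#     API_URL,
#     headers={"Authorization": f"Bearer {HF_TOKEN}"},
#     json=body,
# )
# predictions = resp.json()
```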


Running private models

When you add your HuggingFace token, your private models will be available in the models list, so you can use any of your models in your API.


Example

In this example we will use the bert-base-uncased model to fill the [MASK].

So first you need to give your node a name; in this example we will name it model1.

We will click on the model input, enter our HuggingFace token, choose the bert-base-uncased model, and click on Use model.

Now let's set our body or parameters:

'{"inputs": "The goal of life is [MASK]"}'

And we will set the next trigger to stop.

Of course, we will connect the start node with our model1 node so the flow starts running from it.

When we run this flow, we will get something like this:

{
  "run_number": "run_1690108458839",
  "responses": {
    "model1": [
      {
        "score": 0.0801406130194664,
        "token": 1012,
        "token_str": ".",
        "sequence": "' { \" inputs \" : \" the goal of life is. \" } '"
      },
      {
        "score": 0.029869617894291878,
        "token": 1010,
        "token_str": ",",
        "sequence": "' { \" inputs \" : \" the goal of life is, \" } '"
      },
      {
        "score": 0.02789161540567875,
        "token": 1025,
        "token_str": ";",
        "sequence": "' { \" inputs \" : \" the goal of life is ; \" } '"
      },
      {
        "score": 0.022287525236606598,
        "token": 1024,
        "token_str": ":",
        "sequence": "' { \" inputs \" : \" the goal of life is : \" } '"
      },
      {
        "score": 0.021031850948929787,
        "token": 2673,
        "token_str": "everything",
        "sequence": "' { \" inputs \" : \" the goal of life is everything \" } '"
      }
    ]
  },
  "errors": [],
  "logs": [
    "Started (main) flow...",
    "Running start",
    "Running model1",
    "✓ Done"
  ],
  "available_runs": 999957
}
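A downstream node would typically want the highest-scoring prediction out of a run response like the one above. This sketch mirrors that example's structure (responses, model1, score, token_str); the run data here is a trimmed copy for illustration.

```python
# Trimmed copy of the example run response above.
run = {
    "responses": {
        "model1": [
            {"score": 0.0801406130194664, "token_str": "."},
            {"score": 0.029869617894291878, "token_str": ","},
            {"score": 0.021031850948929787, "token_str": "everything"},
        ]
    }
}

predictions = run["responses"]["model1"]
best = max(predictions, key=lambda p: p["score"])
print(best["token_str"])  # → .
```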

You can use Dynamic variables in the following properties:

  • Body (parameters)