# Fun fun functions!

#### UPDATE 17-Oct-2024: LLM-enabled Functions are live. Your AI prompts can now call a Function as a tool in the LLM pipeline.

So you’ve got your Gooey.AI workflow up and running, everything is working fine and you're happy with the outputs. But is it REALLY ready? To be truly production-ready, your AI workflow also needs to connect to your data, servers, and other APIs. With our new Functions feature, you can do just that.

### What are Gooey.AI Functions? <a href="#aecwpo98db66" id="aecwpo98db66"></a>

**Gooey.AI Functions are sandboxed JavaScript functions & API calls inside your Gooey.AI recipes.**

LLMs can be non-deterministic, so you often need deterministic functions to ensure that users receive the right information, or that a user's response triggers something important in your tech stack.

### Why functions in Gooey.AI Matter <a href="#id-6obccrfhun54" id="id-6obccrfhun54"></a>

Functions came up as a feature request from our customers and partners. With minimal setup for the development team, the Functions workflow can:

1. **Call your APIs**\
   Call your APIs BEFORE or AFTER a run, or from LLM-enabled prompts. This gives you quick iteration and the flexibility to pull data from, or push data to, your servers and tech stack. See the example at the end of this page.
2. [**Call the APIs of others**](https://gooey.ai/compare-large-language-models/functions-make-a-haiku-with-iss-coordinates-k4vuehh6hhvo/)\
   Add API functions from any other projects and processes that support your product.
3. [**Perform logic in JavaScript**](https://gooey.ai/functions/log-variables-sg8xvlg206ss/)\
   Use simple JS logic and operators in your workflow.
4. **Chain Gooey.AI workflows together**\
   Easily chain Gooey.AI Workflows, passing one workflow's output into the next.
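Chaining works because every Gooey.AI workflow is also an HTTP API. As a hedged sketch (the recipe path, payload field, and API key handling below are placeholders; each workflow's own API tab shows its real endpoint and schema), a Function could call a second workflow like this:

```javascript
// Hypothetical sketch: call another Gooey.AI workflow's HTTP API from a Function.
// The endpoint path and payload fields are placeholders, not a documented schema.

// Pure helper: build the request options for the chained call.
function buildChainRequest(inputText, apiKey) {
  return {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ input_prompt: inputText }),
  };
}

// Fire the chained workflow and return its JSON response.
async function runNextWorkflow(inputText, apiKey) {
  const res = await fetch(
    "https://api.gooey.ai/v2/compare-large-language-models", // placeholder path
    buildChainRequest(inputText, apiKey)
  );
  return res.json();
}
```

Splitting the request construction into a pure helper keeps the network call thin and the payload easy to inspect while iterating.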

At the core, Gooey.AI Functions add the power of JavaScript to any of your AI Workflows. You create small JS snippets that can be mixed and matched with any Gooey workflow. This means you can:&#x20;

* chain all the best parts of our abstraction layers
* customize and chain GenAI tools to your existing systems
* deploy code snippets directly with Gooey (no setup, no servers needed!)
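A snippet is just plain JavaScript. As a minimal sketch (the variable names here are hypothetical; in a deployed Function, workflow variables are injected for you, while here we hard-code them for illustration):

```javascript
// Hypothetical sketch of a Gooey Function body.
// In a real Function, workflow variables like these would come from the run.
const variables = { user_name: "Asha", query_count: 3 };

function buildGreeting({ user_name, query_count }) {
  // Plain JS logic: branch on how many queries the user has made.
  return query_count > 1
    ? `Welcome back, ${user_name}!`
    : `Hello, ${user_name}!`;
}

console.log(buildGreeting(variables)); // → Welcome back, Asha!
```

The returned or logged value can then feed back into the rest of the workflow.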

### How does it work? <a href="#tof9h8jstsh4" id="tof9h8jstsh4"></a>

Functions can be used in three ways in Gooey.AI:&#x20;

1. **BEFORE**: executed **before** a Gooey.AI run. This means a function is executed and its response is used as part of the Gooey.AI workflow. [Example here](https://gooey.ai/compare-large-language-models/functions-make-a-haiku-with-iss-coordinates-k4vuehh6hhvo/).&#x20;
2. **AFTER**: executed **after** a Gooey.AI run. This means a function is executed once the Gooey.AI workflow completes, with the AI run's response available as a variable inside the function. [Example here](https://gooey.ai/copilot/).
3. **PROMPT**: The LLM understands the user's query and decides if and when the function should be executed. [Example here.](https://gooey.ai/copilot/barebones-gpt-4o-v1xm6uhp/)
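A BEFORE function, for instance, can fetch live data and hand it to the workflow, as in the ISS haiku example linked above. A hedged sketch (the `iss_position` response shape and the returned variable name are assumptions for illustration; `api.open-notify.org` is the public Open Notify ISS endpoint):

```javascript
// Sketch of a BEFORE-style Function: fetch data before the LLM run
// so the workflow can use it as a prompt variable.

// Pure helper: turn the API payload into a prompt-friendly string.
function formatCoordinates(issPosition) {
  return `lat ${issPosition.latitude}, lon ${issPosition.longitude}`;
}

// The value returned here would be available to the rest of the workflow.
async function beforeRun() {
  const res = await fetch("http://api.open-notify.org/iss-now.json");
  const data = await res.json();
  return { iss_location: formatCoordinates(data.iss_position) };
}
```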

<figure><img src="/files/7j4rAfdrQmqrI49lFbCS" alt=""><figcaption></figcaption></figure>

### How can I use Gooey.AI functions? <a href="#qnigw1qx1hbf" id="qnigw1qx1hbf"></a>

In this scenario, imagine you have an AI agent on Gooey.AI whose Analysis script (under Agent Deployments) uses GPT-4 to output:

1. The category of the user’s query
2. Whether the assistant could find responses in the vectorDB and answer the user

We will use Functions to create a POST request and send the GPT-4 output to a support CRM like HubSpot.
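Sketched as an AFTER function (the CRM URL, field names, and analysis shape below are placeholders for illustration, not HubSpot's real API):

```javascript
// Hypothetical sketch: forward the analysis output to a CRM after a run.
// URL and field names are placeholders; adapt them to your CRM's API.

// Pure helper: map the GPT-4 analysis output onto CRM ticket fields.
function buildCrmPayload(analysis) {
  return {
    category: analysis.category,
    answered: analysis.answer_found,
    source: "gooey-copilot",
  };
}

// Runs AFTER the Gooey.AI workflow, with the analysis passed in as a variable.
async function afterRun(analysis) {
  await fetch("https://example-crm.invalid/api/tickets", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCrmPayload(analysis)),
  });
}
```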

<figure><img src="/files/4MCmuWpsV6WFsQ8GAhCm" alt=""><figcaption></figcaption></figure>

### Who's using it? <a href="#id-3zd7gvtihypa" id="id-3zd7gvtihypa"></a>

#### **Scenario:** Ulangizi AI Agent (Opportunity.org)

Our customer Opportunity.org has deployed several AI agents in Africa. Each agent must send new users a disclaimer about their data policy, and they want to ensure the message goes out *only* to new users. In this scenario, a BEFORE Function calls their API with the “Custom Message”. If the user is new, they receive the “Custom Message” followed by the Assistant's reply; returning users, or users with a longer conversation history, do not receive the “Custom Message”.
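The gating logic itself is a few lines of JS. A hedged sketch (the message text and the zero-message threshold are illustrative, not Opportunity.org's actual implementation):

```javascript
// Hypothetical sketch of the new-user check inside a BEFORE Function.
const CUSTOM_MESSAGE =
  "Disclaimer: your data is handled according to our data policy.";

// New users (no prior messages) get the disclaimer prepended;
// returning users get the assistant's reply alone.
function prefixForUser(messageCount) {
  return messageCount === 0 ? CUSTOM_MESSAGE + "\n\n" : "";
}
```

The real message count would come from their API; here it is simply a function argument.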

<figure><img src="/files/Gyr8utgHuShJ5DrV1EEY" alt=""><figcaption></figcaption></figure>

### What's coming?

* Secret management - This will allow you to include secret API keys and tokens in your Function calls.&#x20;

### How do I get started? <a href="#id-5en4zepmi4k2" id="id-5en4zepmi4k2"></a>

Check out the links below:

<table data-view="cards"><thead><tr><th></th><th data-hidden data-card-target data-type="content-ref"></th><th data-hidden data-card-cover data-type="files"></th></tr></thead><tbody><tr><td><strong>How to use Gooey.AI functions?</strong></td><td><a href="https://docs.gooey.ai/guides/how-to-use-gooey-functions">https://docs.gooey.ai/guides/how-to-use-gooey-functions</a></td><td><a href="/files/cGJknqvKaL2exH0zaOvN">/files/cGJknqvKaL2exH0zaOvN</a></td></tr><tr><td><p><strong>EXAMPLE:</strong> </p><p><strong>Fetch WEATHER API</strong></p></td><td><a href="https://gooey.ai/functions/current-weather-rxmquy60p1vq/">https://gooey.ai/functions/current-weather-rxmquy60p1vq/</a></td><td><a href="/files/EjpuuFtzzW7uwB4jwQ1t">/files/EjpuuFtzzW7uwB4jwQ1t</a></td></tr><tr><td><p><strong>EXAMPLE:</strong> </p><p><strong>Connect Fetch API with GOOEY.AI LLM Generator</strong></p></td><td><a href="https://gooey.ai/compare-large-language-models/functions-make-a-haiku-with-iss-coordinates-k4vuehh6hhvo/">https://gooey.ai/compare-large-language-models/functions-make-a-haiku-with-iss-coordinates-k4vuehh6hhvo/</a></td><td><a href="/files/nzBy9PMfGIWXVTbRLdCW">/files/nzBy9PMfGIWXVTbRLdCW</a></td></tr></tbody></table>


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://blog.gooey.ai/fun-fun-functions.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
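In JavaScript, the same GET request can be issued with `fetch` (the URL pattern comes from the instructions above; `encodeURIComponent` keeps multi-word questions URL-safe):

```javascript
// Sketch: query this page's documentation endpoint with the ?ask= parameter.

// Pure helper: build the ask URL for a natural-language question.
function buildAskUrl(question) {
  return `https://blog.gooey.ai/fun-fun-functions.md?ask=${encodeURIComponent(question)}`;
}

// Returns the answer plus relevant excerpts and sources, per the docs above.
async function askDocs(question) {
  const res = await fetch(buildAskUrl(question));
  return res.text();
}
```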
