🧩 Fun fun functions!
UPDATE 17-Oct-2024: LLM-enabled functions are live. Your AI prompts can now call a Function as a tool in the LLM pipeline.
So you’ve got your Gooey.AI workflow up and running: everything works and you're happy with the outputs. But is it REALLY ready? Chances are you still need to connect your AI workflow to your data, servers, and other APIs. With our new Functions feature, you can do just that.
What are Gooey.AI Functions?
Gooey.AI Functions are sandboxed JavaScript functions & API calls inside your Gooey.AI recipes.
LLMs can be non-deterministic, so you often need deterministic functions to ensure the user receives the right information, or that a user response triggers something important in your tech stack.
Why Functions in Gooey.AI matter
Functions came up as a feature request from our customers and partners. With minimal setup from your development team, the Functions workflow can:
Call your APIs: Use your APIs in BEFORE, AFTER, and PROMPT (LLM-enabled) requests. This gives you quick iterations and the flexibility to pull or push data to your servers and tech stack. See the example at the end of this page.
Call the APIs of others: Add API functions from any other projects and processes that support your product.
Perform logic in JavaScript: Use simple JS logic and operators in your workflow.
Chain Gooey.AI workflows together: Easily chain one Gooey.AI workflow into another.
At its core, Gooey.AI Functions adds the power of JavaScript to any of your AI workflows. You create small JS snippets that can be mixed and matched with any Gooey workflow. This means you can:
chain all the best parts of our abstraction layers
customize and chain GenAI tools to your existing systems
deploy code snippets directly with Gooey (no setup, no servers needed!)
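For example, here is a minimal sketch of what such a snippet might look like. It is illustrative only: the endpoint is hypothetical, and it assumes `fetch` is available in the sandbox and that a workflow variable like `user_query` would normally be injected (we stub it here so the sketch runs standalone).

```javascript
// Stand-in for a workflow variable the sandbox would inject.
const user_query = "Where is my delivery?";

async function lookupOrderStatus(query) {
  // Plain JS logic: skip the API call if the query isn't order-related.
  if (!/order|delivery|shipping/i.test(query)) return { relevant: false };

  // Call your own API (hypothetical endpoint) to pull live data.
  const resp = await fetch("https://api.example.com/orders/status", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  return { relevant: true, status: await resp.json() };
}

// Top-level await works in ES modules / Deno-style sandboxes.
console.log(await lookupOrderStatus(user_query));
```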
How does it work?
Functions can be used in three ways in Gooey.AI:
BEFORE: executed before a Gooey.AI run. The function runs first, and its response is used as part of the Gooey.AI workflow. Example here.
AFTER: executed after a Gooey.AI run. The function runs once the workflow has finished, with the AI run's response available as a variable inside the function. Example here.
PROMPT: The LLM understands the user's query and decides if and when the function should be executed. Example here.
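To make the BEFORE and AFTER shapes concrete, here is a hedged sketch. The endpoints and the `outputText` variable name are hypothetical stand-ins, not the exact names Gooey injects.

```javascript
// BEFORE sketch: runs before the AI workflow and hands it fresh context.
// `fetch` is assumed to be available in the sandbox; the endpoint is hypothetical.
async function beforeRun() {
  const resp = await fetch("https://api.example.com/user/profile?id=42");
  const profile = await resp.json();
  // The returned object would be exposed to the workflow,
  // e.g. for interpolation into the prompt as variables.
  return { user_name: profile.name, user_language: profile.language };
}

// AFTER sketch: runs after the AI workflow, consuming its response.
// `outputText` stands in for whatever variable holds the AI run's answer.
async function afterRun(outputText) {
  await fetch("https://api.example.com/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ answer: outputText, loggedAt: Date.now() }),
  });
}

// beforeRun() and afterRun(...) would be wired to the BEFORE/AFTER hooks.
```

In the PROMPT case, the function is instead exposed to the LLM as a tool that it can choose to call mid-conversation.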
How can I use Gooey.AI functions?
In this scenario, imagine you have an AI Copilot on Gooey.AI. The Copilot has an Analysis script in the Copilot integrations that uses GPT-4 to output:
Category of user’s query
Whether the assistant could find responses in the vectorDB and answer the user.
We will use Functions to create a POST request and send the GPT-4 output to a support CRM like HubSpot.
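Here is a hedged sketch of that AFTER function. The CRM endpoint, token, and payload shape are hypothetical stand-ins for your real HubSpot (or other CRM) API, and `analysis` stands in for the variable holding the GPT-4 Analysis output.

```javascript
// AFTER-function sketch: forward the Analysis output to a support CRM.
// Stand-in for the GPT-4 Analysis result injected by the workflow.
const analysis = {
  category: "billing",
  answered_from_vectordb: false,
};

const resp = await fetch("https://api.example-crm.com/tickets", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Hard-coded here only for illustration; see Secret management below.
    Authorization: "Bearer YOUR_CRM_TOKEN",
  },
  body: JSON.stringify({
    category: analysis.category,
    needs_human: !analysis.answered_from_vectordb,
    source: "gooey-copilot",
  }),
});
console.log("CRM responded with status", resp.status);
```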
Who's using it?
Scenario: Ulangizi AI Copilot (Opportunity.org)
Our customer Opportunity.org has been deploying several Copilots in Africa. Each Copilot must send a disclaimer about their data policy to all new users, and they want to ensure the message goes out only to new users. In this scenario, we can use a BEFORE Function to call their API with the “Custom Message”. If the user is new, they receive the “Custom Message” plus the assistant's response; if they are returning, or are deeper into a conversation with the Copilot, the “Custom Message” is skipped.
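A hedged sketch of that BEFORE function (the user-status endpoint and variable names are hypothetical; the real check would call Opportunity.org's own API):

```javascript
// BEFORE-function sketch: only new users get the data-policy disclaimer.
async function buildGreeting(userId, customMessage) {
  // Hypothetical endpoint that reports whether this user is new.
  const resp = await fetch(`https://api.example.org/users/${userId}/status`);
  const { isNewUser } = await resp.json();
  // New users: "Custom Message" + assistant reply; returning users skip it.
  return isNewUser ? customMessage : "";
}

console.log(await buildGreeting("user-123", "Welcome! Here is how we handle your data: ..."));
```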
What's coming?
Secret management - This will allow you to include secret API keys and tokens in your Function calls.
How do I get started?
Check out the links below: