LLM-enabled Functions
@Dara.network / Gooey.AI / support@gooey.ai
There are many scenarios where a Function does not need to run for every Copilot query. For example:
The user wants to calculate something in the middle of the conversation (like an HVAC CFM calculation)
The LLM needs to run a Google search to respond to the query
The LLM needs to look up a weather API to answer the query
When the user sends a query in natural language, the LLM determines the following:
does the query require a function?
which part of the text should be passed as an argument to the function?
In the example below, the query is about CFM calculations, which are commonly used in HVAC installations.
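As a rough illustration, here is a minimal sketch of what such a CFM function might look like, assuming the Function is written in JavaScript. The function and parameter names are hypothetical; descriptive names make it easier for the LLM to decide when to call the function and which parts of the user's query to pass as arguments.

```javascript
// Sketch only: a simple airflow calculation,
// CFM = (room volume in cubic feet * air changes per hour) / 60
function calculateCfm(lengthFt, widthFt, heightFt, airChangesPerHour) {
  const roomVolumeCuFt = lengthFt * widthFt * heightFt; // cubic feet
  const cfm = (roomVolumeCuFt * airChangesPerHour) / 60; // cubic feet per minute
  return { roomVolumeCuFt, airChangesPerHour, cfm };
}

// Example: a 20 ft x 15 ft room with 8 ft ceilings and 6 air changes per hour
// => volume 2400 cu ft, CFM = 2400 * 6 / 60 = 240
calculateCfm(20, 15, 8, 6);
```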
Using LLM-enabled Functions works exactly the same way as "BEFORE" and "AFTER" functions.
Head over to the Functions workflow
Create your PROMPT Function. For example (a sketch of both follows this list):
create a basic fetch call for the weather of any location
create a Serper.dev call to run a Google search
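Below is a minimal sketch of both examples, assuming the Function runs JavaScript with fetch available. The weather call uses Open-Meteo (a free, key-less API) purely for illustration, and the Serper.dev call requires your own API key; endpoints and parameter names are assumptions, not the only way to write these Functions.

```javascript
// 1. Fetch the current weather for a location (illustrative: Open-Meteo).
async function getWeather(latitude, longitude) {
  const url =
    `https://api.open-meteo.com/v1/forecast?latitude=${latitude}` +
    `&longitude=${longitude}&current_weather=true`;
  const response = await fetch(url);
  return await response.json(); // current temperature, wind speed, etc.
}

// 2. Run a Google search via Serper.dev (requires an API key).
async function googleSearch(query, apiKey) {
  const response = await fetch("https://google.serper.dev/search", {
    method: "POST",
    headers: { "X-API-KEY": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify({ q: query }),
  });
  return await response.json(); // organic results, knowledge graph, etc.
}
```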
You can find more examples here
Hit Submit. If your code runs successfully, you will see its outputs on the right side. Use the "Save as New" button and update the run name.
Now head over to the Gooey workflow where you want to add the saved functions.
Head over to the example below:
Check the Functions option, choose "PROMPT" from the dropdown, and add your saved example. Then hit "Submit".
You can check your Function's output in the "Details" section at the end of the Workflow page.