Available on plans: Free, Pro, Team, Enterprise
The Split Text into Chunks for AI Prompts action is in beta and is subject to change.
When using large language models (LLMs) with large data sets, you may reach token limits. Large data sets can be anything from documents to website pages.
You can use Formatter’s Split Text into Chunks for AI Prompts action to break up your data into smaller chunks to use with AI apps. Chunks are sized based on your input data, model type, prompt, and response size. You can then run each chunk separately in your Zap.
- The Split Text into Chunks for AI Prompts action supports English text only and may behave unpredictably with non-English text.
1. Add a Formatter step
- In the Zap editor, click the Action step. A dialog box will open.
- In the search box, search for and select Formatter. The right sidebar will open to the Setup tab.
- Click the Action event field to open the dropdown menu and select Text.
- Click Continue to proceed to the Configure tab.
2. Set up your transform
- Click the Transform field to open the dropdown menu, then select Split Text into Chunks for AI Prompts.
- In the Input field, manually enter the string you want to transform, or map a field from a previous step.
- (Optional) In the Prompt field, enter the prompt you intend to use in your AI step.
- This will help Formatter estimate how large each chunk should be.
- The prompt is not included in the chunks.
- (Optional) In the Response Size field, select the maximum response size.
- This is the expected size of your response in tokens.
- Use 0 to apply no limit.
- In the Chunk Size field, select a chunk size based on the token limits of popular AI models.
- Alternatively, use a custom value to set a custom token limit.
- For reference, about 75 words make up 100 tokens.
- In the Output Style field, select a text format for your output:
- Line items: each chunk is returned as a single item in a series of line items. This is the default.
- Separate fields: each chunk is returned as a separate, mappable field.
- Click Continue to proceed to the Test tab.
OpenAI’s best practices recommend a maximum response of 256 tokens.
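To see how the chunk, prompt, and response settings interact, here is a minimal sketch of word-based chunking using the rule of thumb above (about 75 words per 100 tokens). This is an illustration only, not Zapier's actual implementation; the function names and the splitting strategy are assumptions.

```python
# Illustrative sketch only: Zapier's actual chunking logic is not public.
# Rule of thumb from the docs: roughly 75 words make up 100 tokens.

def estimate_tokens(text: str) -> int:
    """Rough token estimate: 100 tokens per 75 words."""
    return round(len(text.split()) * 100 / 75)

def split_into_chunks(text: str, chunk_size: int,
                      prompt: str = "", response_size: int = 0) -> list[str]:
    """Split text so each chunk, plus the prompt and expected response,
    fits within the chunk_size token budget."""
    budget = chunk_size - estimate_tokens(prompt) - response_size
    chunks, current = [], []
    for word in text.split():
        current.append(word)
        if estimate_tokens(" ".join(current)) >= budget:
            chunks.append(" ".join(current))
            current = []
    if current:
        chunks.append(" ".join(current))
    return chunks
```

A larger Prompt or Response Size leaves less of the budget for input text, so the chunks come out smaller.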
3. Test your action
- Click Test step.
- The Data out tab will display the transform output so you can confirm your string was formatted correctly.
- If the new data is not what you expect, you can adjust the transform settings and retest the step.
4. Loop your text chunks
Use Looping by Zapier to run each chunk through your AI prompt. Looping by Zapier creates a loop that runs a set of actions once per item, so each chunk is processed in its own iteration.
Learn more about using Looping by Zapier.
After you’ve set up your looping step, you can send your text chunks to the AI app of your choice.
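Conceptually, the looping step behaves like a for-loop over your chunk line items, with one iteration per chunk. A minimal sketch, where `ai_step` stands in for your AI action (none of this is Zapier's actual API):

```python
# Illustrative only: Looping by Zapier runs a set of actions once per line item.
def run_loop(chunks, ai_step):
    """Run the AI step once for each chunk, collecting the results."""
    results = []
    for chunk in chunks:
        # Each pass corresponds to one loop iteration in Looping by Zapier.
        results.append(ai_step(chunk))
    return results
```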
Chunk example: summarize web page content
You can use this Zap template to learn how to set up chunking in a Zap. In this template:
- The Zap is triggered by an email that contains the web page URL.
- Digest by Zapier stores the chunk summaries.
- Delay by Zapier ensures the chunks are summarized in order.
- The Delay and Digest steps use an ID field (the trigger's Raw Date) to reference the Digest and Delay for that specific Zap run.
- If you change your trigger, you must replace the ID field with one that is unique each time the Zap runs.
- The OpenAI steps are set to default parameters, and the prompt is for summarizing text.
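The Digest pattern above can be sketched as simple bookkeeping: each loop iteration appends its chunk summary to a store keyed by a unique per-run ID, and the combined summary is released at the end. This is a hypothetical illustration, not Zapier's implementation; `run_id` stands in for the Raw Date ID field.

```python
from collections import defaultdict

# Stands in for Digest by Zapier's stored entries, keyed by a per-run ID.
digests = defaultdict(list)

def append_summary(run_id: str, summary: str) -> None:
    """Each loop iteration appends its chunk summary to this run's digest."""
    digests[run_id].append(summary)

def release_digest(run_id: str) -> str:
    """Combine the collected summaries in order and clear the digest."""
    return "\n".join(digests.pop(run_id, []))
```

Keying by a unique run ID is what keeps summaries from two overlapping Zap runs from mixing together, which is why the ID field must change every run.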