Use Formatter’s newest transform to avoid token limits when using large data sets with AI (Beta)

Note

This information was accurate at the time of publication. Please check out the latest product update notes for any updates or changes.

Hey Zapier users!

We're excited to introduce a new feature to help you work with AI large language models (LLMs) and large data sets more efficiently: Formatter's Split Text into Chunks for AI Prompts transform.

Use it in your Zaps when you're working with large data sets like documents or website pages. It intelligently breaks your input into chunks based on the input data, model type, prompt, and response size.

Then, you can run each chunk separately in your Zap to avoid hitting token limits.
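If you're curious about the general idea behind this kind of chunking, here's a rough Python sketch. It is purely illustrative and not Zapier's implementation: the function name, parameters, and the 4-characters-per-token estimate are assumptions, while the transform itself handles token budgeting for you based on your model, prompt, and expected response.

```python
# Illustrative sketch only: shows one way text can be split into chunks
# that each fit within a model's context window once the prompt and the
# expected response are accounted for. Token counts are approximated with
# a rough 4-characters-per-token heuristic.

def split_into_chunks(text, model_context_tokens=8000,
                      prompt_tokens=500, response_tokens=1000,
                      chars_per_token=4):
    """Split `text` into chunks sized so that chunk + prompt + response
    stays within the model's context window (approximate)."""
    budget_tokens = model_context_tokens - prompt_tokens - response_tokens
    budget_chars = budget_tokens * chars_per_token

    chunks, current, current_len = [], [], 0
    for paragraph in text.split("\n\n"):
        # Start a new chunk when adding this paragraph would exceed the budget.
        # (A single oversized paragraph still becomes its own chunk here.)
        if current and current_len + len(paragraph) > budget_chars:
            chunks.append("\n\n".join(current))
            current, current_len = [], 0
        current.append(paragraph)
        current_len += len(paragraph) + 2  # account for the "\n\n" separator
    if current:
        chunks.append("\n\n".join(current))
    return chunks


# Example: each chunk could then be sent to the LLM in its own step.
sample = "\n\n".join(
    f"Paragraph {n}: " + "lorem ipsum " * 600 for n in range(10)
)
for i, chunk in enumerate(split_into_chunks(sample), 1):
    print(f"Chunk {i}: ~{len(chunk) // 4} tokens")
```

In a Zap, the transform does this sizing work for you; you simply map its output chunks into later steps so each LLM call stays under the token limit.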

This feature makes it easier than ever to handle large data sets in your Zaps, giving you more control and flexibility. Give it a try and see how it streamlines your workflows!


Learn more about using Split Text into Chunks for AI Prompts.
