This repository was archived by the owner on Jul 16, 2025. It is now read-only.
When processing large images or documents, the file is sent as encoded data with every request. This increases token usage, since the file has to be ingested again each time. Especially with tool calling, this leads to high token consumption.
To mitigate this issue, some platforms provide a way to cache input files. It would be nice to find a way to enable such caching.
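For context, one example of such platform-level caching is Anthropic's prompt caching, where individual content blocks can carry a `cache_control` marker so an ingested file is reused across requests instead of being re-processed in full each time. A minimal Python sketch of that platform mechanism (not the LLM Chain API; the file name and model name are placeholders):

```python
import base64
import anthropic

client = anthropic.Anthropic()  # expects ANTHROPIC_API_KEY in the environment

# Load a large PDF once and send it as a base64 document block.
with open("large-report.pdf", "rb") as f:  # placeholder path
    pdf_data = base64.standard_b64encode(f.read()).decode("utf-8")

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "document",
                    "source": {
                        "type": "base64",
                        "media_type": "application/pdf",
                        "data": pdf_data,
                    },
                    # Mark the document block as a cache breakpoint: follow-up
                    # requests that repeat this prefix read it from the prompt
                    # cache instead of re-ingesting the encoded file.
                    "cache_control": {"type": "ephemeral"},
                },
                {"type": "text", "text": "Summarize this document."},
            ],
        }
    ],
)
print(response.content[0].text)
```

Within the cache lifetime, repeated requests that reuse this prefix are billed at a reduced cached-read rate, which is the kind of saving this issue is after for repeated tool-calling turns.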
Activity
Changed the title from "Add Image / Document caching for supported Platforms" to "Provide a way for Image / Document caching for supported Platforms" on May 19, 2025
bjalt commented on May 19, 2025
See also #317. That issue might be related, though it specifically provides an example for tool calling. I'm more interested in caching large portions of prompt content.
chr-hertel commented on May 22, 2025
Sounds like a good addition - it would be great if LLM Chain could handle that right away 👍