
"usage" field missing, then we crash and the error is due to we want to read nothing value. #296


Open · wants to merge 1 commit into base: main

Conversation

Cvikli

@Cvikli Cvikli commented Jun 17, 2025

We have to handle the case where there is no "usage" field.
I don't know if my solution is the best way, but it spares us the immediate error, so we can handle the issue somewhere else.
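The guard could look roughly like the sketch below. This is a hypothetical illustration, not the actual PR diff: the function name `extract_token_counts` and the field names `:input_tokens`/`:output_tokens` are assumptions based on the shape of Anthropic API responses, not PromptingTools internals.

```julia
# Hypothetical sketch of guarding against a missing or `nothing` "usage" field.
# `body` is assumed to be the parsed JSON response (a Dict with Symbol keys).
function extract_token_counts(body::AbstractDict)
    usage = get(body, :usage, nothing)   # `nothing` if the key is absent
    isnothing(usage) && return (0, 0)    # avoids getindex(::Nothing, ::Symbol)
    return (get(usage, :input_tokens, 0), get(usage, :output_tokens, 0))
end
```

The key point is to never index into the usage object before checking it for `nothing`, which is exactly what triggers the `MethodError` below.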

The error we got:
┌ Error: Task failed
│ exception =
│ MethodError: no method matching getindex(::Nothing, ::Symbol)
│ The function getindex exists, but no method is defined for this combination of argument types.
└ @ TODO4AI ~/repo/todoforai/agent/src/interfaces/clientAPI/handlers_agent.jl:109

Stacktrace:
[1] aigenerate(prompt_schema::PromptingTools.AnthropicSchema, prompt::Vector{PromptingTools.AbstractChatMessage}; verbose::Bool, api_key::String, model::String, return_all::Bool, dry_run::Bool, conversation::Vector{PromptingTools.AbstractMessage}, streamcallback::StreamCallbacksExt.StreamCallbackWithHooks, no_system_message::Bool, aiprefill::Nothing, http_kwargs::@NamedTuple{}, api_kwargs::@NamedTuple{top_p::Float64, temperature::Float64, max_tokens::Int64, stop_sequences::Vector{String}}, cache::Symbol, betas::Nothing, kwargs::@kwargs{}) @ PromptingTools ~/.julia/packages/PromptingTools/XSraD/src/llm_anthropic.jl:520
[2] aigenerate(prompt::Vector{PromptingTools.AbstractChatMessage}; model::String, kwargs::@kwargs{cache::Symbol, api_kwargs::@NamedTuple{top_p::Float64, temperature::Float64, max_tokens::Int64, stop_sequences::Vector{String}}, streamcallback::StreamCallbacksExt.StreamCallbackWithHooks, verbose::Bool}) @ PromptingTools ~/.julia/packages/PromptingTools/XSraD/src/llm_interface.jl:511
[3] work(agent::EasyContext.FluidAgent{EasyContext.SysMessageV1}, conv::EasyContext.Session{EasyContext.Message}; cache::Symbol, no_confirm::Bool, highlight_enabled::Bool, process_enabled::Bool, on_error::Function, on_done::typeof(EasyContext.noop), on_start::Function, io::TODO4AI.TodoIOWrapper, tool_kwargs::Dict{String, String}, thinking::Nothing, MAX_NUMBER_OF_TOOL_CALLS::Int64) @ EasyContext ~/repo/EasyContext.jl/src/agents/FluidAgent.jl:171

Or what would be the best way to handle these cases?

@svilupp
Owner

svilupp commented Jun 17, 2025

Do you have an MWE that makes this fail? I remember Tamas asking for some changes in the StreamingCallbacks in this vein, but we could never replicate it.

@Cvikli
Author

Cvikli commented Jun 17, 2025

It was totally random. It could even be an issue on the other side. Not sure how to replicate it.

It actually happened during a workflow, and I couldn't capture any data from the case, since it fails before we could get any data. :D
