Returns detailed information about the most recent LLM call, including
the prompt sent, the response received, and metadata such as token counts
and cost. Works with both dsp() calls and module-based calls.
Value
A dsprrr_prompt_inspection object containing:
prompt: The full prompt sent to the LLM
response: The LLM's response
model: The model used
tokens_in: Input tokens used
tokens_out: Output tokens generated
cost: Cost in USD (if available)
timestamp: When the call was made
source: Where the call originated ("dsp()" or module name)
Returns NULL if no LLM calls have been made.
Examples
if (FALSE) { # \dontrun{
# Make an LLM call
dsp("question -> answer", question = "What is 2+2?")
# Inspect what happened
get_last_prompt()
#> ─── Last Prompt ───────────────────────────────────
#> System: Given the fields `question`, produce the fields `answer`.
#>
#> User: question: What is 2+2?
#>
#> ─── Response ──────────────────────────────────────
#> Assistant: {"answer": "4"}
#>
#> ─── Metadata ──────────────────────────────────────
#> Model: gpt-4o-mini | Tokens: 45 in, 12 out | Cost: $0.0001
} # }
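Beyond printing, the fields listed under Value can be read programmatically. A minimal sketch, assuming the returned dsprrr_prompt_inspection object supports standard $ extraction (not confirmed by this page):

```r
# Sketch: programmatic access to the last-call inspection object.
# Assumes `$` extraction works on dsprrr_prompt_inspection objects.
info <- get_last_prompt()

if (!is.null(info)) {
  cat("Model:", info$model, "\n")
  cat("Total tokens:", info$tokens_in + info$tokens_out, "\n")
} else {
  message("No LLM calls have been made yet.")
}
```

The NULL check mirrors the documented behavior: get_last_prompt() returns NULL when no LLM call has been recorded.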
