
This function lets you chat with a chatbot that answers questions based on the provided context and chat history. Responses are generated by the selected service and model; the default is OpenAI's GPT-4.

Usage

chat_with_context(
  query,
  service = "openai",
  model = "gpt-4",
  index = NULL,
  add_context = TRUE,
  chat_history = NULL,
  history_name = "chat_history",
  session_history = NULL,
  add_history = TRUE,
  task = "Context Only",
  k_context = 4,
  k_history = 4,
  save_history = TRUE,
  overwrite = FALSE,
  local = FALSE,
  embedding_model = NULL
)

Arguments

query

The input query to be processed.

service

Name of the AI service to use. Defaults to "openai".

model

Name of the OpenAI model to use. Defaults to "gpt-4".

index

Index to search for relevant context.

add_context

Whether to add retrieved context to the query. Default is TRUE.

chat_history

Data frame of prior chat history to use for reference.

history_name

Name of the file where chat history is stored.

session_history

Session history data for reference.

add_history

Whether to add chat history to the query. Default is TRUE.

task

Task type, either "Context Only" or "Permissive Chat". Default is "Context Only". A usage sketch follows this argument list.

k_context

Number of top context matches to consider. Default is 4.

k_history

Number of top chat history matches to consider. Default is 4.

save_history

Whether to save the chat history. Default is TRUE.

overwrite

Whether to overwrite the history file. Default is FALSE.

local

Whether to use a local model instead of a remote service. Default is FALSE.

embedding_model

A model object to use for embedding. Only needed if local is TRUE. Default is NULL.
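
The hedged sketch below is not from the package documentation; it shows how the task, local, and embedding_model arguments might combine. Here my_index and my_embedder are hypothetical placeholders for a prebuilt context index and a local embedding model object.

# Hedged sketch: `my_index` and `my_embedder` are hypothetical placeholders.
result <- chat_with_context(
  query = "How do I configure the package?",
  task = "Permissive Chat",      # presumably permits answers beyond the indexed context
  index = my_index,
  local = TRUE,
  embedding_model = my_embedder  # only needed when local = TRUE
)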

Value

A list containing the prompt, context, and answer.
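
As a sketch, assuming the list elements are named after their contents (an assumption, not confirmed by the documentation), the pieces can be extracted like so:

# Assumption: element names follow the description above.
result$prompt   # the prompt that was sent to the model
result$context  # the context that was retrieved
result$answer   # the generated answer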

Examples

if (FALSE) { # rlang::is_interactive()
# `index` here is a placeholder for a prebuilt context index
# (see the `index` argument above).
query <- "What is the capital of France?"
result <- chat_with_context(query = query, index = index)
}
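
A further sketch, grounded only in the argument descriptions above, shows how a conversation might be persisted; the history name is illustrative:

# Hedged sketch: save the conversation under a custom history name,
# overwriting any existing file of that name.
result <- chat_with_context(
  query = "And what about Germany?",
  history_name = "capitals_history",
  save_history = TRUE,
  overwrite = TRUE
)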