Package: tidyprompt 0.0.1

Luka Koning

tidyprompt: Prompt Large Language Models and Enhance Their Functionality

Easily construct prompts and associated logic for interacting with large language models (LLMs). 'tidyprompt' introduces the concept of prompt wraps, which are building blocks that you can use to quickly turn a simple prompt into a complex one. Prompt wraps do not just modify the prompt text, but also add extraction and validation functions that will be applied to the response of the LLM. This ensures that the user gets the desired output. 'tidyprompt' can add various features to prompts and their evaluation by LLMs, such as structured output, automatic feedback, retries, reasoning modes, autonomous R function calling, and R code generation and evaluation. It is designed to be compatible with any LLM provider that offers chat completion.
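The wrap-stacking workflow the description outlines might look like the following minimal sketch. It uses only functions listed under Exports below (`tidyprompt`, `add_text`, `answer_as_integer`, `send_prompt`, `llm_provider_ollama`); the argument names `min` and `max` are assumptions based on the help entry "Make LLM answer as an integer (between min and max)", not verified against the manual:

```r
library(tidyprompt)

# Stack prompt wraps onto a base prompt (no LLM call yet);
# each wrap can modify the prompt text and attach extraction
# and validation logic for the eventual response
prompt <- "How many planets does our solar system have?" |>
  tidyprompt() |>
  add_text("Answer only with a number.") |>
  answer_as_integer(min = 0, max = 100)  # argument names assumed

# Evaluation sends the prompt, applies each wrap's extraction and
# validation functions, and retries with feedback on failure.
# Requires a running provider, e.g. a local Ollama server:
# result <- prompt |> send_prompt(llm_provider_ollama())
```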

Authors: Luka Koning [aut, cre, cph], Tjark Van de Merwe [aut, cph], Kennispunt Twente [fnd]

tidyprompt_0.0.1.tar.gz
tidyprompt_0.0.1.tar.gz (r-4.5-noble) | tidyprompt_0.0.1.tar.gz (r-4.4-noble)
tidyprompt_0.0.1.tgz (r-4.4-emscripten) | tidyprompt_0.0.1.tgz (r-4.3-emscripten)
tidyprompt.pdf | tidyprompt.html
tidyprompt/json (API)
NEWS

# Install 'tidyprompt' in R:
install.packages('tidyprompt', repos = c('https://cran.r-universe.dev', 'https://cloud.r-project.org'))

Bug tracker: https://github.com/tjarkvandemerwe/tidyprompt/issues

Pkgdown site: https://tjarkvandemerwe.github.io

3.18 score · 9 scripts · 47 exports · 25 dependencies

Last updated 3 days ago from: 513d3caee2. Checks: 2 OK. Indexed: no.

Target           Result   Latest binary
Doc / Vignettes  OK       Jan 08 2025
R-4.5-linux      OK       Jan 08 2025

Exports: add_msg_to_chat_history, add_text, answer_as_boolean, answer_as_integer, answer_as_json, answer_as_key_value, answer_as_list, answer_as_named_list, answer_as_regex_match, answer_as_text, answer_by_chain_of_thought, answer_by_react, answer_using_r, answer_using_sql, answer_using_tools, chat_history, construct_prompt_text, df_to_string, extract_from_return_list, get_chat_history, get_prompt_wraps, is_tidyprompt, llm_break, llm_feedback, llm_provider_google_gemini, llm_provider_groq, llm_provider_mistral, llm_provider_ollama, llm_provider_openai, llm_provider_openrouter, llm_provider_xai, llm_provider-class, llm_verify, persistent_chat-class, prompt_wrap, quit_if, r_json_schema_to_example, send_prompt, set_chat_history, set_system_prompt, skim_with_labels_and_levels, tidyprompt, tidyprompt-class, tools_add_docs, tools_get_docs, user_verify, vector_list_to_string

Dependencies: askpass, cli, curl, dplyr, fansi, generics, glue, httr2, jsonlite, lifecycle, magrittr, openssl, pillar, pkgconfig, R6, rappdirs, rlang, stringi, stringr, sys, tibble, tidyselect, utf8, vctrs, withr

Creating prompt wraps

Rendered from creating_prompt_wraps.Rmd using knitr::rmarkdown on Jan 08 2025.

Last update: 2025-01-08
Started: 2025-01-08

Getting started

Rendered from getting_started.Rmd using knitr::rmarkdown on Jan 08 2025.

Last update: 2025-01-08
Started: 2025-01-08

Sentiment analysis in R with an LLM and 'tidyprompt'

Rendered from sentiment_analysis.Rmd using knitr::rmarkdown on Jan 08 2025.

Last update: 2025-01-08
Started: 2025-01-08

Readme and manuals

Help Manual

Help page | Topics
Add a message to a chat history | add_msg_to_chat_history
Add text to a tidyprompt | add_text
Make LLM answer as a boolean (TRUE or FALSE) | answer_as_boolean
Make LLM answer as an integer (between min and max) | answer_as_integer
Make LLM answer as JSON (with optional schema) | answer_as_json
Make LLM answer as a list of key-value pairs | answer_as_key_value
Make LLM answer as a list of items | answer_as_list
Make LLM answer as a named list | answer_as_named_list
Make LLM answer match a specific regex | answer_as_regex_match
Make LLM answer as a constrained text response | answer_as_text
Set chain of thought mode for a prompt | answer_by_chain_of_thought
Set ReAct mode for a prompt | answer_by_react
Enable LLM to draft and execute R code | answer_using_r
Enable LLM to draft and execute SQL queries on a database | answer_using_sql
Enable LLM to call R functions | answer_using_tools
Create or validate a 'chat_history' object | chat_history
Construct prompt text from a tidyprompt object | construct_prompt_text
Convert a dataframe to a string representation | df_to_string
Extract a specific element from a list | extract_from_return_list
Get the chat history of a tidyprompt object | get_chat_history
Get prompt wraps from a tidyprompt object | get_prompt_wraps
Check if an object is a tidyprompt object | is_tidyprompt
Create an 'llm_break' object | llm_break
Create an 'llm_feedback' object | llm_feedback
Create a new Google Gemini LLM provider | llm_provider_google_gemini
Create a new Groq LLM provider | llm_provider_groq
Create a new Mistral LLM provider | llm_provider_mistral
Create a new Ollama LLM provider | llm_provider_ollama
Create a new OpenAI LLM provider | llm_provider_openai
Create a new OpenRouter LLM provider | llm_provider_openrouter
Create a new XAI (Grok) LLM provider | llm_provider_xai
LlmProvider R6 class | llm_provider-class
Have LLM check the result of a prompt (LLM-in-the-loop) | llm_verify
PersistentChat R6 class | persistent_chat-class
Wrap a prompt with functions for modification and handling the LLM response | prompt_wrap
Make evaluation of a prompt stop if LLM gives a specific response | quit_if
Generate an example object from a JSON schema | r_json_schema_to_example
Send a prompt to an LLM provider | send_prompt
Set the chat history of a tidyprompt object | set_chat_history
Set the system prompt of a tidyprompt object | set_system_prompt
Skim a dataframe and include labels and levels | skim_with_labels_and_levels
Create a tidyprompt object | tidyprompt
Tidyprompt R6 class | tidyprompt-class
Add tidyprompt function documentation to a function | tools_add_docs
Extract documentation from a function | tools_get_docs
Have user check the result of a prompt (human-in-the-loop) | user_verify
Convert a named or unnamed list/vector to a string representation | vector_list_to_string
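The `prompt_wrap` entry above is the extension point for writing custom wraps, as covered in the "Creating prompt wraps" vignette. A minimal sketch follows; it assumes `prompt_wrap` accepts `modify_fn`, `extraction_fn`, and `validation_fn` arguments and that returning an `llm_feedback` object from validation triggers a retry (argument names and retry behavior are inferred from the help entries, not verified against the manual):

```r
library(tidyprompt)

# Hypothetical custom wrap: demand a single uppercase word as the answer
answer_in_uppercase <- function(prompt) {
  prompt_wrap(
    prompt,
    # Append an instruction to the prompt text
    modify_fn = function(text) {
      paste0(text, "\n\nRespond with a single uppercase word.")
    },
    # Clean up the raw LLM response
    extraction_fn = function(response) trimws(response),
    # Send feedback (and retry) if the answer is not uppercase
    validation_fn = function(answer) {
      if (!identical(answer, toupper(answer)))
        return(llm_feedback("Your answer must be in uppercase; try again."))
      TRUE
    }
  )
}

# Usage (requires a running provider, e.g. a local Ollama server):
# "What is the capital of France?" |>
#   answer_in_uppercase() |>
#   send_prompt(llm_provider_ollama())
```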