Package: tidyllm 0.3.0

Eduard Brüll

tidyllm: Tidy Integration of Large Language Models

A tidy interface for integrating large language model (LLM) APIs such as 'Claude', 'OpenAI', 'Groq', 'Mistral' and local models via 'Ollama' into R workflows. The package supports text- and media-based interactions, interactive message history, batch request APIs, and a tidy, pipeline-oriented interface for streamlined integration into data workflows. Web services are available at <https://www.anthropic.com>, <https://openai.com>, <https://groq.com>, <https://mistral.ai/> and <https://ollama.com>.
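A minimal sketch of the pipeline-oriented interface described above, assuming a working local Ollama server; the model name is only illustrative and not prescribed by the package:

library(tidyllm)

# Build a message, send it to a provider via the chat() verb, and read the reply.
# "llama3.2" is an illustrative model name for a local Ollama installation.
conversation <- llm_message("Explain what a pipe operator does in R, in one sentence.") |>
  chat(ollama(.model = "llama3.2"))

get_reply(conversation)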

Authors: Eduard Brüll [aut, cre], Jia Zhang [ctb]

tidyllm_0.3.0.tar.gz
tidyllm_0.3.0.tar.gz (r-4.5-noble) | tidyllm_0.3.0.tar.gz (r-4.4-noble)
tidyllm_0.3.0.tgz (r-4.4-emscripten) | tidyllm_0.3.0.tgz (r-4.3-emscripten)
tidyllm.pdf | tidyllm.html
tidyllm/json (API)
NEWS

# Install 'tidyllm' in R:
install.packages('tidyllm', repos = c('https://cran.r-universe.dev', 'https://cloud.r-project.org'))
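Remote providers are authenticated through API keys read from environment variables. The variable names below follow each provider's usual convention and are shown only as an illustration; verify them against the tidyllm documentation for your version:

# Set provider API keys as environment variables before calling chat().
# Variable names follow the providers' usual conventions; check the tidyllm docs.
Sys.setenv(OPENAI_API_KEY    = "sk-...")
Sys.setenv(ANTHROPIC_API_KEY = "sk-ant-...")
Sys.setenv(MISTRAL_API_KEY   = "...")
# Local models via Ollama need no key, only a running Ollama server.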


Bug tracker: https://github.com/edubruell/tidyllm/issues

Pkgdown: https://edubruell.github.io

3.18 score | 20 scripts | 740 downloads | 62 exports | 33 dependencies

Last updated 9 days ago from: 44ab8e6d03. Checks: OK: 2. Indexed: no.

Target | Result | Date
Doc / Vignettes | OK | Dec 09 2024
R-4.5-linux | OK | Dec 09 2024

Exports: azure_openai, azure_openai_chat, azure_openai_embedding, cancel_openai_batch, chat, chatgpt, check_batch, check_claude_batch, check_mistral_batch, check_openai_batch, claude, claude_chat, df_llm_message, embed, fetch_batch, fetch_claude_batch, fetch_mistral_batch, fetch_openai_batch, gemini, gemini_chat, gemini_delete_file, gemini_embedding, gemini_file_metadata, gemini_list_files, gemini_upload_file, get_metadata, get_reply, get_reply_data, get_user_message, groq, groq_chat, groq_transcribe, last_metadata, last_reply, last_reply_data, last_user_message, list_batches, list_claude_batches, list_mistral_batches, list_openai_batches, llm_message, LLMMessage, mistral, mistral_chat, mistral_embedding, ollama, ollama_chat, ollama_download_model, ollama_embedding, ollama_list_models, openai, openai_chat, openai_embedding, pdf_page_batch, perplexity, perplexity_chat, rate_limit_info, send_batch, send_claude_batch, send_mistral_batch, send_openai_batch, tidyllm_schema
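The verb-based exports above (chat(), embed(), the batch functions) pair with provider functions such as openai(), claude() and ollama(). A sketch of how the same message history can be routed to different providers, assuming valid credentials for the remote services:

library(tidyllm)

# One message history, three providers: only the provider function changes.
msg <- llm_message("Name two advantages of tidy data.")

openai_run <- msg |> chat(openai())   # remote, needs an OpenAI API key
claude_run <- msg |> chat(claude())   # remote, needs an Anthropic API key
local_run  <- msg |> chat(ollama())   # local, needs a running Ollama server

last_reply(local_run)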

Dependencies: askpass, base64enc, cli, cpp11, curl, fansi, generics, glue, httr2, jsonlite, lifecycle, lubridate, magrittr, openssl, pdftools, pillar, pkgconfig, png, purrr, qpdf, R6, rappdirs, Rcpp, rlang, S7, stringi, stringr, sys, tibble, timechange, utf8, vctrs, withr

Get Started

Rendered from tidyllm.Rmd using knitr::rmarkdown on Dec 09 2024.

Last update: 2024-12-08
Started: 2024-11-07

Readme and manuals

Help Manual

Help page | Topics
Azure-OpenAI Endpoint Provider Function | azure_openai
Send LLM Messages to an OpenAI Chat Completions endpoint on Azure | azure_openai_chat
Generate Embeddings Using OpenAI API on Azure | azure_openai_embedding
Cancel an In-Progress OpenAI Batch | cancel_openai_batch
Chat with a Language Model | chat
Alias for the OpenAI Provider Function | chatgpt
Check Batch Processing Status | check_batch
Check Batch Processing Status for Claude API | check_claude_batch
Check Batch Processing Status for Mistral Batch API | check_mistral_batch
Check Batch Processing Status for OpenAI Batch API | check_openai_batch
Provider Function for Claude models on the Anthropic API | claude
Interact with Claude AI models via the Anthropic API | claude_chat
Convert a Data Frame to an LLMMessage Object | df_llm_message
Generate text embeddings | embed
Fetch Results from a Batch API | fetch_batch
Fetch Results for a Claude Batch | fetch_claude_batch
Fetch Results for a Mistral Batch | fetch_mistral_batch
Fetch Results for an OpenAI Batch | fetch_openai_batch
Google Gemini Provider Function | gemini
Send LLMMessage to Gemini API | gemini_chat
Delete a File from Gemini API | gemini_delete_file
Generate Embeddings Using the Google Gemini API | gemini_embedding
Retrieve Metadata for a File from Gemini API | gemini_file_metadata
List Files in Gemini API | gemini_list_files
Upload a File to Gemini API | gemini_upload_file
Retrieve Metadata from Assistant Replies | get_metadata, last_metadata
Retrieve Assistant Reply as Text | get_reply, last_reply
Retrieve Assistant Reply as Structured Data | get_reply_data, last_reply_data
Retrieve a User Message by Index | get_user_message, last_user_message
Groq API Provider Function | groq
Send LLM Messages to the Groq Chat API | groq_chat
Transcribe an Audio File Using the Groq Transcription API | groq_transcribe
List all Batch Requests on a Batch API | list_batches
List Claude Batch Requests | list_claude_batches
List Mistral Batch Requests | list_mistral_batches
List OpenAI Batch Requests | list_openai_batches
Create or Update a Large Language Model Message Object | llm_message
Large Language Model Message Class | LLMMessage
Mistral Provider Function | mistral
Send LLMMessage to Mistral API | mistral_chat
Generate Embeddings Using Mistral API | mistral_embedding
Ollama API Provider Function | ollama
Interact with local AI models via the Ollama API | ollama_chat
Download a model from the Ollama API | ollama_download_model
Generate Embeddings Using Ollama API | ollama_embedding
Retrieve and return model information from the Ollama API | ollama_list_models
OpenAI Provider Function | openai
Send LLM Messages to the OpenAI Chat Completions API | openai_chat
Generate Embeddings Using OpenAI API | openai_embedding
Batch Process PDF into LLM Messages | pdf_page_batch
Perplexity Provider Function | perplexity
Send LLM Messages to the Perplexity Chat API | perplexity_chat
Get the current rate limit information for all or a specific API | rate_limit_info
Send a batch of messages to a batch API | send_batch
Send a Batch of Messages to Claude API | send_claude_batch
Send a Batch of Requests to the Mistral API | send_mistral_batch
Send a Batch of Messages to OpenAI Batch API | send_openai_batch
Create a JSON schema for structured outputs | tidyllm_schema
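As a usage sketch for the batch functions listed above: send a list of messages, poll the batch status, and fetch the replies once processing has finished. The Claude-specific functions are shown here; the generic send_batch()/check_batch()/fetch_batch() verbs follow the same pattern. Return shapes and defaults are assumptions based on this index; see the individual help pages.

library(tidyllm)
library(purrr)

# Prepare a named list of messages for asynchronous batch processing.
questions <- list(
  q1 = llm_message("What is the capital of France?"),
  q2 = llm_message("Who wrote 'Faust'?")
)

batch <- send_claude_batch(questions)   # submit the batch to the Anthropic batch API
check_claude_batch(batch)               # inspect the current processing status

# Later, once the batch has completed:
results <- fetch_claude_batch(batch)
map_chr(results, get_reply)             # extract each assistant reply as text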