# Supabase Tutorial
[Supabase](https://supabase.com/) is an open source Firebase alternative.
Start your project with a Postgres database, Authentication, instant APIs, Edge Functions, Realtime subscriptions, Storage, and Vector embeddings.
## Use Supabase to log requests and see total spend across all LLM Providers (OpenAI, Azure, Anthropic, Cohere, Replicate, PaLM)
liteLLM provides `success_callback` and `failure_callback`, making it easy for you to send data to a particular logging integration depending on the status of your responses.
In this case, we want to log requests to Supabase in both scenarios - when a request succeeds and when it fails.
### Create a Supabase table
Go to your Supabase project > open the [Supabase SQL Editor](https://supabase.com/dashboard/projects) and create a new table with this configuration.
Note: You can change the table name. Just don't change the column names.
```sql
create table
  public.request_logs (
    id bigint generated by default as identity,
    created_at timestamp with time zone null default now(),
    model text null default ''::text,
    messages json null default '{}'::json,
    response json null default '{}'::json,
    end_user text null default ''::text,
    error json null default '{}'::json,
    response_time real null default '0'::real,
    total_cost real null,
    additional_details json null default '{}'::json,
    constraint request_logs_pkey primary key (id)
  ) tablespace pg_default;
```
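Optionally, you can sanity-check that the table is reachable before wiring up liteLLM. This is just a sketch using the separate supabase-py client (`pip install supabase`), and it assumes the default `request_logs` table name and the same `SUPABASE_URL` / `SUPABASE_KEY` values you'll set below:

```python
import os
from supabase import create_client  # pip install supabase

# same credentials you'll give liteLLM in the next step
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

# should return an empty list until liteLLM logs its first request
rows = supabase.table("request_logs").select("*").limit(1).execute()
print(rows.data)
```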
### Use Callbacks
Use just 2 lines of code to instantly see costs and log your responses **across all providers** with Supabase:
```python
litellm.success_callback=["supabase"]
litellm.failure_callback=["supabase"]
```
Complete code:
```python
import os
import litellm
from litellm import completion

## set env variables
### SUPABASE
os.environ["SUPABASE_URL"] = "your-supabase-url"
os.environ["SUPABASE_KEY"] = "your-supabase-key"

## LLM API KEY
os.environ["OPENAI_API_KEY"] = ""

# set callbacks
litellm.success_callback=["supabase"]
litellm.failure_callback=["supabase"]

# openai call
response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}])

# bad call to test error logging
response = completion(model="chatgpt-test", messages=[{"role": "user", "content": "Hi 👋 - i'm a bad call to test error logging"}])
```
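Once requests are being logged, you can read the spend back out of the table. Here's a minimal sketch (again using the supabase-py client, and assuming the default `request_logs` table name from the schema above) that sums the `total_cost` column:

```python
import os
from supabase import create_client  # pip install supabase

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

# fetch the cost of every logged request and sum it client-side
rows = supabase.table("request_logs").select("model", "total_cost").execute().data
total_spend = sum(row["total_cost"] or 0 for row in rows)
print(f"total spend across all providers: ${total_spend:.4f}")
```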
### Additional Controls
**Different Table name**
If you modified your table name, here's how to pass the new name:
```python
litellm.modify_integration("supabase", {"table_name": "litellm_logs"})
```
**Identify end-user**
Here's how to map your LLM call to an end-user:
```python
litellm.identify({"end_user": "krrish@berri.ai"})
```
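As a rough end-to-end sketch, the two controls above can be combined with the callback setup from the complete code, so logs land in your renamed table with the calling user attached. The table name and email are just the example values used earlier:

```python
import os
import litellm
from litellm import completion

os.environ["SUPABASE_URL"] = "your-supabase-url"
os.environ["SUPABASE_KEY"] = "your-supabase-key"
os.environ["OPENAI_API_KEY"] = ""

# log successes and failures to Supabase
litellm.success_callback = ["supabase"]
litellm.failure_callback = ["supabase"]

# log to the renamed table and tag each request with the end user making it
litellm.modify_integration("supabase", {"table_name": "litellm_logs"})
litellm.identify({"end_user": "krrish@berri.ai"})

response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hi 👋 - which user made this call?"}])
```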