# Build on Stripe with LLMs

Use LLMs in your Stripe integration workflow.

You can use large language models (LLMs) to assist in building Stripe integrations. We provide a set of tools and best practices if you use LLMs during development.

## Plain text docs

You can access all of our documentation as plain text markdown files by adding `.md` to the end of any URL. For example, you can find the plain text version of this page itself at [https://docs.stripe.com/building-with-llms.md](https://docs.stripe.com/building-with-llms.md). This helps AI tools and agents consume our content and allows you to copy and paste the entire contents of a doc into an LLM.

This format is preferable to scraping or copying from our HTML and JavaScript-rendered pages because:

- Plain text contains fewer formatting tokens.
- Content that isn't rendered in the default view of a given page (for example, because it's hidden in a tab) is rendered in the plain text version.
- LLMs can parse and understand markdown hierarchy.

We also host an [/llms.txt file](https://docs.stripe.com/llms.txt.md) which instructs AI tools and agents how to retrieve the plain text versions of our pages. The `/llms.txt` file is an [emerging standard](https://llmstxt.org/) for making websites and content more accessible to LLMs.
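As a quick illustration, fetching the plain text version of a page from the command line only requires appending `.md` to its URL. The second request below assumes the index is also served at the standard `/llms.txt` path described by llmstxt.org:

```bash
# Fetch the plain text (markdown) version of a docs page by appending .md
curl https://docs.stripe.com/building-with-llms.md

# Fetch the llms.txt index (assumed to be served at the standard /llms.txt path)
curl https://docs.stripe.com/llms.txt
```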
## Model Context Protocol (MCP)

The Stripe Model Context Protocol (MCP) defines a set of tools that AI agents can use to interact with the Stripe API and search Stripe's knowledge base (including documentation and support articles). Code editor agents automatically detect these tools and call the relevant one. Learn more in our [MCP documentation](https://docs.stripe.com/mcp.md).

## VS Code AI Assistant

If you're a Visual Studio Code user, you can install the [Stripe VS Code extension](https://docs.stripe.com/stripe-vscode.md) to access our AI Assistant.

With the Stripe AI Assistant, you can:

- Get immediate answers about the Stripe API and products
- Receive code suggestions tailored to your integration
- Ask follow-up questions for more detailed information
- Access knowledge from the Stripe documentation and the Stripe developer community

To get started with the Stripe AI Assistant:

1. Make sure you have the Stripe VS Code extension installed.
2. Navigate to the Stripe extension UI.
3. Under **AI Assistant**, click **Ask a question**.
   - If you're a Copilot user, this opens the Copilot chat where you can @-mention `@stripe`. In the input field, talk to the Stripe-specific assistant using `@stripe` followed by your question.
   - If you're not a Copilot user, it opens a chat UI where you can talk to the Stripe LLM directly.

## Stripe Agent Toolkit SDK

If you're building agentic software, we provide an SDK for adding Stripe functionality to your agent's capabilities. For example, using the SDK you can:

- Create Stripe objects
- Charge for agent usage
- Use it with popular frameworks such as OpenAI's Agent SDK, Vercel's AI SDK, Langchain, and CrewAI

Learn more in our [agents documentation](https://docs.stripe.com/agents.md).

## See also

- [Stripe for Visual Studio Code](https://docs.stripe.com/stripe-vscode.md)
- [Add Stripe to your agentic workflows](https://docs.stripe.com/agents.md)

---

Excellent! The DeepSite custom development setup has completed successfully. You've cloned DeepStudio (the BYOK Agentic version), a feature-rich variant with custom capabilities.

🎯 What's Ready:

1. Project Structure: DeepStudio with BYOK (Bring Your Own Keys) functionality
2. Dependencies: All npm packages installed (649 packages)
3. Environment Setup: `.env` file created with template variables
4. Custom Development Scripts: Enhanced development tools
5. Documentation: DEV_README.md with comprehensive guides

🚀 Quick Start Commands:

```bash
# Start in custom development mode (recommended)
npm run dev:custom

# Or use standard development mode
npm run dev

# Build for production
npm run build

# Start production server
npm run start
```

🔑 Immediate Next Steps:

1. Edit the `.env` file with your actual API keys (a sketch of typical entries follows at the end of this message):

   ```bash
   nano .env
   ```

2. Add your Hugging Face token (required for inference):
   - Get a token from: https://huggingface.co/settings/tokens
   - Ensure it has inference permissions
3. Optional API keys to add:
   - OPENAI_API_KEY - for OpenAI models
   - ANTHROPIC_API_KEY - for Claude models
   - Other provider keys as needed

📁 Project Location:

Your project is located at: /data/data/com.termux/files/home/deepstudio

🔧 Development Features Enabled:

- Hot reload for rapid development
- Debug mode with detailed logging
- Custom development scripts
- Enhanced error reporting

🐛 If You Encounter Issues:

1. Dependency issues: Run `npm install` again
2. Port conflicts: Change PORT in the `.env` file
3. API errors: Verify your tokens have the correct permissions

The setup looks clean, with only 1 moderate severity vulnerability (which is common in development dependencies). You can run `npm audit fix` to address it if needed.

Ready to start developing?
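For reference, here is a minimal sketch of what the `.env` might contain once filled in. Only OPENAI_API_KEY, ANTHROPIC_API_KEY, and PORT are named above; `HF_TOKEN` is an assumed variable name for the Hugging Face token, so confirm the exact names against the generated template or DEV_README.md:

```bash
# .env — illustrative sketch; confirm variable names against the project's template
HF_TOKEN=hf_xxxxxxxxxxxxxxxxxxxx      # Hugging Face token with inference permissions (assumed name)
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx    # optional: for OpenAI models
ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxx   # optional: for Claude models
PORT=3000                             # change if the default port conflicts
```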
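If you run into API errors, a quick sanity check is to confirm that the Hugging Face token authenticates at all before debugging the app itself. The call below uses the Hub's `whoami-v2` endpoint and the assumed `HF_TOKEN` variable name from the sketch above; it verifies authentication, not inference permissions specifically:

```bash
# Should return your account details as JSON if the token is valid
curl -H "Authorization: Bearer $HF_TOKEN" https://huggingface.co/api/whoami-v2
```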