How to Build a Custom GPT: Complete Guide to Knowledge Files, Actions API & Sharing
Custom GPTs let you create specialized AI assistants tailored to your workflows. Whether you need a customer support bot trained on your documentation or an internal tool that connects to your APIs, this guide walks you through every step — from initial creation to public deployment.
Prerequisites
- ChatGPT Plus, Team, or Enterprise subscription (Custom GPTs are not available on the free plan)
- Your knowledge base files (PDF, DOCX, TXT, CSV, JSON — max 20 files, 512 MB each)
- (Optional) An external API endpoint with OpenAPI 3.0+ schema for Actions
Step-by-Step: Creating Your Custom GPT
Step 1: Open the GPT Builder
- Navigate to chat.openai.com and log in.
- Click your profile icon → My GPTs → Create a GPT.
- You’ll see two tabs: Create (conversational builder) and Configure (manual setup). Use Configure for full control.
Step 2: Define Identity & Instructions
Fill in the following fields under the Configure tab:
| Field | Description | Example |
|---|---|---|
| Name | Public-facing name of your GPT | DevOps Assistant Pro |
| Description | Short summary shown in the GPT Store | Helps teams troubleshoot CI/CD pipelines |
| Instructions | System prompt that governs behavior | See example below |
Write clear, structured instructions using this template:
You are "DevOps Assistant Pro", an expert in CI/CD, Docker, Kubernetes, and cloud infrastructure.
RULES:
- Always provide actionable CLI commands.
- When referencing documentation, cite the specific file name from your knowledge base.
- If you don't know the answer, say so; do not fabricate commands.
- Format outputs in markdown with code blocks.
PERSONALITY:
- Professional but approachable.
- Prefer concise answers with expandable details.
CONTEXT:
- Users are mid-to-senior engineers on Linux/macOS.
- Primary stack: AWS, Terraform, GitHub Actions.
Step 3: Upload Knowledge Files
- Scroll to the Knowledge section in Configure.
- Click Upload files and select your documents.
- Supported formats: PDF, DOCX, PPTX, TXT, MD, CSV, JSON, XML, HTML.
- The GPT uses retrieval-augmented generation (RAG) to search these files at query time.
**Best practices for knowledge files:**
- Use descriptive file names (e.g., aws-ec2-troubleshooting-2025.pdf instead of doc1.pdf).
- Break large documents into topic-specific files for better retrieval accuracy.
- Include a table-of-contents.md file that maps topics to file names; this helps the GPT route queries.
- Stay well under the 20-file limit: fewer, focused files retrieve more reliably than many overlapping ones.
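As a concrete illustration of the table-of-contents suggestion, a minimal table-of-contents.md might look like the sketch below. The file names are hypothetical and should match whatever you actually upload:

```markdown
# Knowledge Base Index

| Topic | File |
|---|---|
| EC2 troubleshooting | aws-ec2-troubleshooting-2025.pdf |
| Terraform module standards | terraform-standards.md |
| GitHub Actions workflows | github-actions-playbook.md |
```

Keeping this index current whenever you swap files in or out gives the retrieval step an explicit routing table instead of relying on file-name matching alone.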
Step 4: Configure Actions (API Integration)
Actions allow your GPT to call external APIs. You need an OpenAPI 3.0 or later specification; the example below uses 3.1.0.
4a. Write Your OpenAPI Schema
{
  "openapi": "3.1.0",
  "info": {
    "title": "Ticket Management API",
    "version": "1.0.0",
    "description": "Create and query support tickets"
  },
  "servers": [
    { "url": "https://api.yourcompany.com/v1" }
  ],
  "paths": {
    "/tickets": {
      "get": {
        "operationId": "listTickets",
        "summary": "List open support tickets",
        "parameters": [
          {
            "name": "status",
            "in": "query",
            "required": false,
            "schema": { "type": "string", "enum": ["open", "closed", "pending"] }
          }
        ],
        "responses": {
          "200": {
            "description": "Array of tickets",
            "content": {
              "application/json": {
                "schema": {
                  "type": "array",
                  "items": {
                    "type": "object",
                    "properties": {
                      "id": { "type": "string" },
                      "title": { "type": "string" },
                      "status": { "type": "string" }
                    }
                  }
                }
              }
            }
          }
        }
      },
      "post": {
        "operationId": "createTicket",
        "summary": "Create a new support ticket",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "required": ["title", "description"],
                "properties": {
                  "title": { "type": "string" },
                  "description": { "type": "string" },
                  "priority": { "type": "string", "enum": ["low", "medium", "high"] }
                }
              }
            }
          }
        },
        "responses": {
          "201": { "description": "Ticket created successfully" }
        }
      }
    }
  }
}
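Before pasting a schema into the builder, it can save a round trip to lint it locally. The sketch below is a hypothetical helper, not part of any OpenAI tooling: it parses the JSON and flags operations missing an operationId, a frequent cause of schema validation errors.

```python
import json

HTTP_METHODS = {"get", "post", "put", "patch", "delete", "head", "options"}

def check_schema(raw: str) -> list[str]:
    """Return a list of problems found in an OpenAPI JSON document."""
    problems = []
    try:
        spec = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    if not str(spec.get("openapi", "")).startswith("3."):
        problems.append("missing or non-3.x 'openapi' version field")
    if not spec.get("servers"):
        problems.append("no 'servers' entry: the builder needs a base URL")
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            if method in HTTP_METHODS and "operationId" not in op:
                problems.append(f"{method.upper()} {path} has no operationId")
    return problems

# Minimal spec with one valid and one broken operation.
demo = json.dumps({
    "openapi": "3.1.0",
    "info": {"title": "Demo", "version": "1.0.0"},
    "servers": [{"url": "https://api.example.com/v1"}],
    "paths": {
        "/tickets": {
            "get": {"operationId": "listTickets"},
            "post": {"summary": "missing operationId"},
        }
    },
})
print(check_schema(demo))  # → ['POST /tickets has no operationId']
```

This only catches structural slips; the builder's own Test button remains the authoritative check.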
4b. Add the Action in GPT Builder
- Click Create new action in the Configure tab.
- Paste your OpenAPI schema into the schema editor.
- Set Authentication: choose API Key, OAuth 2.0, or None.
- For API Key auth, select the header name (typically
Authorization) and enterBearer YOUR_API_KEY. - Set your Privacy policy URL (required for public GPTs).
- Click Test to validate each endpoint.
4c. Validate with cURL Before Connecting
# Test your API independently before adding to GPT
curl -X GET "https://api.yourcompany.com/v1/tickets?status=open" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json"
# Test POST endpoint
curl -X POST "https://api.yourcompany.com/v1/tickets" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"title": "Test Ticket", "description": "Created via API", "priority": "low"}'
Step 5: Enable Capabilities
Toggle the built-in capabilities your GPT needs:
- Web Browsing — for real-time information lookup
- Code Interpreter — for data analysis, chart generation, and running Python
- DALL·E — for generating visuals on demand
Step 6: Set Sharing & Publishing Options
- Click Save in the top-right corner.
- Choose your sharing level:
| Option | Visibility | Use Case |
|---|---|---|
| Only me | Private — only you can use it | Personal productivity tools |
| Anyone with the link | Unlisted — accessible via direct URL | Team tools, client demos |
| Everyone | Public — listed in the GPT Store | Community tools, lead generation |
Pro Tips for Power Users
- Version your instructions: Keep a local copy of your system prompt in a Git repo. This lets you track changes and roll back if a prompt update degrades performance.
- Use conversation starters: Add 4 starter prompts that showcase your GPT’s best capabilities — these appear as clickable buttons for new users.
- Chain Actions strategically: Your GPT can call multiple Actions in sequence. Design your API endpoints to be composable (e.g., search → get detail → update).
- Test with edge cases: Before publishing, test ambiguous queries, empty API responses, and large file references to ensure graceful handling.
- Monitor usage: In the GPT Store analytics dashboard, track conversation counts and user retention to iterate on your instructions.
- Structure knowledge files with headers: Use clear H1/H2 markdown headers in your docs — the retrieval system uses these as semantic anchors.
Troubleshooting Common Errors
| Error | Cause | Fix |
|---|---|---|
| "Could not connect to the API" | Server URL mismatch or CORS issue | Verify the servers.url in your schema matches your actual endpoint. Ensure your API accepts requests from https://chat.openai.com. |
| "Authentication failed" | Invalid or expired API key | Regenerate your API key and update it in the Action authentication settings. Check the header format (Bearer vs raw key). |
| Knowledge file not referenced | File name is generic or content lacks structure | Rename files descriptively and add a TOC file. Mention file names explicitly in your instructions. |
| "Schema validation error" | OpenAPI spec has syntax issues | Validate your schema at editor.swagger.io before pasting. Common issues: missing operationId, invalid $ref paths. |
| GPT ignores instructions | System prompt is too long or contradictory | Keep instructions under 1500 words. Use clear sections with headers. Remove conflicting rules. |
Frequently Asked Questions
Can I update my Custom GPT’s knowledge files after publishing?
Yes. Open your GPT in the builder, navigate to the Knowledge section, remove outdated files, and upload new ones. Changes take effect immediately — no need to republish. Users in active conversations may need to start a new chat to access the updated knowledge base.
How do I secure sensitive data when using Actions with an external API?
Use OAuth 2.0 authentication instead of static API keys for production environments. Never expose internal-only endpoints; create a dedicated proxy or gateway layer with scoped permissions. Add rate limiting on your API server, and always validate incoming request headers to confirm they originate from OpenAI’s infrastructure.
What are the file size and quantity limits for Knowledge uploads?
Each file can be up to 512 MB, and you can upload up to 20 files per GPT. Supported formats include PDF, DOCX, TXT, CSV, JSON, MD, PPTX, and HTML. For best retrieval performance, keep individual files focused on a single topic and under 50 pages or 100,000 tokens of text content.
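These limits can be audited locally before an upload session. The sketch below assumes your knowledge files live in one flat directory and that the format list above is current; adjust the constants if OpenAI's limits change:

```python
from pathlib import Path

MAX_FILES = 20
MAX_BYTES = 512 * 1024 * 1024  # 512 MB per file
ALLOWED = {".pdf", ".docx", ".txt", ".csv", ".json", ".md", ".pptx", ".html"}

def audit_knowledge_dir(directory: str) -> list[str]:
    """List problems that would block or degrade a knowledge upload."""
    files = [p for p in Path(directory).iterdir() if p.is_file()]
    problems = []
    if len(files) > MAX_FILES:
        problems.append(f"{len(files)} files exceeds the {MAX_FILES}-file limit")
    for p in files:
        if p.suffix.lower() not in ALLOWED:
            problems.append(f"{p.name}: unsupported format {p.suffix or '(none)'}")
        if p.stat().st_size > MAX_BYTES:
            problems.append(f"{p.name}: larger than 512 MB")
    return problems
```

Running this before opening the builder, for example `audit_knowledge_dir("./knowledge")`, catches oversized or mis-formatted files without burning an upload attempt.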