Documentation Index

Fetch the complete documentation index at: https://rendobar.com/docs/llms.txt

Use this file to discover all available pages before exploring further.

Rendobar exposes its API as a Model Context Protocol server so AI agents can submit jobs, upload files, and check status without writing HTTP code. There are two servers — pick by client.

Pick a transport

Local stdio

For Claude Desktop, Cursor, Cline, Windsurf, Zed, VS Code, Claude Code, and Continue. Install: npx -y @rendobar/mcp. Reads local files. Streams uploads.
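As an illustration, many stdio clients (Claude Desktop, Cursor, and others) register servers through an mcpServers config entry. A sketch of what that might look like for the local server, assuming the common config shape — the exact file location and top-level key vary by client, and the key value is a placeholder:

```json
{
  "mcpServers": {
    "rendobar": {
      "command": "npx",
      "args": ["-y", "@rendobar/mcp"],
      "env": { "RENDOBAR_API_KEY": "rb_live_..." }
    }
  }
}
```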

Remote HTTP

For claude.ai, ChatGPT Apps, and hosted MCP gateways. URL: https://api.rendobar.com/mcp. No file system access.

Why two servers

The MCP spec has no primitive for streaming bytes from a client to a remote server. A remote MCP can submit jobs but cannot read a file off your laptop — the agent has to dictate a curl command and wait for you to paste back the URL. The local server runs on your machine, has access to your files, and uploads them in a single tool call. That is the difference. For agents inside a browser (claude.ai) or a phone (ChatGPT mobile), the remote server is the only option.

Tools

Both servers expose the same five job-related tools. Local adds upload_file for disk reads; remote keeps upload_media for curl-based uploads.
| Tool | Local | Remote | Purpose |
| --- | --- | --- | --- |
| submit_job | ✓ | ✓ | Submit any active job type |
| get_job | ✓ | ✓ | Poll status, fetch result |
| list_jobs | ✓ | ✓ | Recent jobs with filters |
| cancel_job | ✓ | ✓ | Cancel a waiting/dispatched job |
| get_account | ✓ | ✓ | Balance, plan, limits |
| upload_file | ✓ |  | Read local file, upload, return URL |
| upload_media |  | ✓ | Return upload endpoint + curl command |
Full schemas: Tools reference.
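To make the tool surface concrete, here is a minimal sketch of the JSON-RPC tools/call messages an MCP client sends under the hood when it submits and then polls a job. The tools/call shape comes from the MCP spec; the argument names (type, input_url, job_id) and values are illustrative assumptions, not taken from the actual schemas:

```python
import json

def tool_call(request_id: int, name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Submit a job, then poll its status (argument names are illustrative).
submit = tool_call(1, "submit_job", {"type": "render", "input_url": "https://example.com/in.mp4"})
poll = tool_call(2, "get_job", {"job_id": "job_123"})

print(submit)
print(poll)
```

The transport differs (stdin/stdout framing locally, HTTP POST remotely), but the message payload is the same in both cases — which is why the two servers can share one tool surface.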

Authentication

Both servers use a Rendobar API key (rb_live_* or rb_test_*). Get one at app.rendobar.com → Settings → API Keys.
  • Local: RENDOBAR_API_KEY env var, --api-key flag, or credentials file
  • Remote: Authorization: Bearer header

See also