Upload Guide
This guide explains the different ways to provide media files to the Redner B2B API and helps you choose the best method for your use case.
Overview
The Redner API supports three methods for providing media files:
- Direct HTTP/HTTPS URLs - Simple, one-step approach for publicly accessible files
- Presigned URL Upload - Two-step process for private files or better control
- Multipart Upload - For large files over 100 MB
Method 1: Direct HTTP/HTTPS URLs
The simplest approach - just provide a URL to your media file in the API request.
When to Use
- Files are already hosted and publicly accessible
- File size is under 3500 MB
- You want the simplest integration (single API call)
- You don't mind the API downloading from your server
Supported URL Types
HTTP/HTTPS URLs:
https://example.com/video.mp4
https://cdn.example.com/media/audio.mp3
https://storage.example.com/files/presentation.mp4
Requirements
- Must use HTTP or HTTPS protocol
- File must be publicly accessible (or include authentication in URL)
- URL must include a valid file extension
- Maximum file size: 3500 MB
Not Supported
- localhost or private IP addresses (192.168.x.x, 10.x.x.x, 172.16-31.x.x)
- FTP, SFTP, or other protocols
- URLs without file extensions
- YouTube, Vimeo, or other streaming platform URLs
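These rules can be pre-checked client-side before making the API call. A minimal sketch in Python (the extension list mirrors the supported formats below; hostname resolution is skipped for brevity, so a private IP hidden behind a domain name is not caught here and will still be rejected server-side):

```python
import ipaddress
from urllib.parse import urlparse

SUPPORTED_EXTENSIONS = {"mp4", "webm", "ogg", "mp3", "wav", "m4a", "flac", "amr", "srt"}

def validate_media_url(url: str) -> bool:
    """Reject URLs the API will refuse, before making the request."""
    parsed = urlparse(url)
    # Only HTTP and HTTPS protocols are accepted
    if parsed.scheme not in ("http", "https"):
        return False
    # localhost and missing hosts are rejected
    if parsed.hostname in (None, "localhost"):
        return False
    # The path must end in a supported file extension
    ext = parsed.path.rsplit(".", 1)[-1].lower() if "." in parsed.path else ""
    if ext not in SUPPORTED_EXTENSIONS:
        return False
    # Literal private/loopback IPs are not allowed
    try:
        if ipaddress.ip_address(parsed.hostname).is_private:
            return False
    except ValueError:
        pass  # hostname is a domain name, not a literal IP
    return True
```

Running this check first saves a round trip to the API for URLs that would fail validation anyway.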
Supported File Formats
Video: mp4, webm, ogg
Audio: mp3, wav, m4a, flac, amr
Subtitles: srt (for Translate and Voice-Over APIs)
Example
curl -X POST https://api.rednerapp.com/v1/transcribe \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"mediaUrl": "https://example.com/video.mp4",
"sourceLanguage": "en-US"
}'
How It Works
- You provide the URL in your API request
- Our server downloads the file from your URL
- Processing begins automatically
Processing Time
Total processing time includes:
- Download time (depends on file size and your server speed)
- Processing time (transcription, translation, etc.)
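The download component can be estimated with simple arithmetic: file size in megabits divided by the link speed between your server and ours. A sketch (the numbers are illustrative):

```python
def estimated_download_seconds(file_size_mb: float, link_mbps: float) -> float:
    """Rough download-time estimate: megabytes -> megabits, divided by link speed."""
    return (file_size_mb * 8) / link_mbps

# e.g. a 1000 MB file over a 100 Mbps link:
print(round(estimated_download_seconds(1000, 100)))  # 80 seconds
```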
Error Handling
Common errors and solutions:
| Error | Cause | Solution |
|---|---|---|
| "Invalid URL format" | URL doesn't start with http:// or https:// | Check URL protocol |
| "File not found" | URL returns 404 | Verify URL is accessible |
| "Download timed out" | File too large or slow network | Use upload mechanism instead |
| "Unsupported format" | Invalid file extension | Use supported format |
| "File too large" | File exceeds 3500 MB | Use upload mechanism |
| "URLs to local networks not allowed" | URL points to private IP | Use public URL or upload mechanism |
Method 2: Presigned URL Upload
A two-step process that gives you more control and works with private files.
When to Use
- Files are private or not publicly accessible
- File size is under 3500 MB
- You need better error handling and retry logic
How It Works
- Initialize upload - Get a presigned URL
- Upload file - Upload using the presigned URL
- Confirm upload - Trigger processing
Step 1: Initialize Upload
Request a presigned URL for uploading your file:
curl -X POST https://api.rednerapp.com/v1/upload/init \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"apiProduct": "transcribe",
"sourceLanguage": "en-US",
"fileName": "video.mp4"
}'
Request Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| apiProduct | string | Yes | API product: "transcribe", "translate", "voiceover", or "dub" |
| sourceLanguage | string | Conditional | Required for transcribe and dub |
| targetLanguage | string | Conditional | Required for translate, voiceover, and dub |
| fileName | string | Yes | Original file name with extension |
Response:
{
"jobId": "job_abc123",
"uploadUrl": "https://s3.amazonaws.com/bucket/path?signature=...",
"expiresAt": "2025-12-04T12:30:00Z"
}
The presigned URL expires in 15 minutes. You must complete the upload before expiration.
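Because of the 15-minute expiry, it is worth checking expiresAt before starting a slow upload and requesting a fresh URL when too little time remains. A sketch with a configurable safety margin:

```python
from datetime import datetime, timedelta, timezone

def is_url_usable(expires_at_iso: str, margin: timedelta = timedelta(minutes=2)) -> bool:
    """True if the presigned URL is still valid with some safety margin left."""
    # The API returns ISO 8601 timestamps with a trailing Z (UTC)
    expires_at = datetime.fromisoformat(expires_at_iso.replace("Z", "+00:00"))
    return datetime.now(timezone.utc) + margin < expires_at
```

If this returns False, call /v1/upload/init again and use the new uploadUrl.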
Step 2: Upload File
Upload your file using the presigned URL:
curl -X PUT "https://s3.amazonaws.com/bucket/path?signature=..." \
-H "Content-Type: video/mp4" \
--upload-file video.mp4
Important:
- Use PUT method (not POST)
- Set appropriate Content-Type header
- Upload the entire file in one request
- Complete upload before URL expires (15 minutes)
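To set an appropriate Content-Type without hard-coding it per file, the standard-library mimetypes module can guess it from the file name, falling back to a generic type:

```python
import mimetypes

def guess_content_type(file_name: str) -> str:
    """Guess the Content-Type from the file extension, defaulting to octet-stream."""
    content_type, _ = mimetypes.guess_type(file_name)
    return content_type or "application/octet-stream"

print(guess_content_type("video.mp4"))  # video/mp4
```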
Step 3: Confirm Upload
After upload completes, trigger processing:
curl -X POST https://api.rednerapp.com/v1/upload/confirm \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"jobId": "job_abc123"
}'
Response:
{
"jobId": "job_abc123",
"status": "pending",
"message": "Processing started successfully"
}
Complete Example (Python)
import requests
API_KEY = "rdn_live_abc123..."
BASE_URL = "https://api.rednerapp.com/v1"
# Step 1: Initialize upload
init_response = requests.post(
f"{BASE_URL}/upload/init",
headers={
"Authorization": f"Bearer {API_KEY}",
"Content-Type": "application/json"
},
json={
"apiProduct": "transcribe",
"sourceLanguage": "en-US",
"fileName": "video.mp4"
}
)
data = init_response.json()
job_id = data["jobId"]
upload_url = data["uploadUrl"]
# Step 2: Upload file
with open("video.mp4", "rb") as f:
upload_response = requests.put(
upload_url,
headers={"Content-Type": "video/mp4"},
data=f
)
if upload_response.status_code == 200:
# Step 3: Confirm upload
confirm_response = requests.post(
f"{BASE_URL}/upload/confirm",
headers={
"Authorization": f"Bearer {API_KEY}",
"Content-Type": "application/json"
},
json={
"jobId": job_id
}
)
print(f"Job started: {job_id}")
else:
print(f"Upload failed: {upload_response.status_code}")
Complete Example (JavaScript)
const API_KEY = 'rdn_live_abc123...';
const BASE_URL = 'https://api.rednerapp.com/v1';
async function uploadAndProcess(file) {
// Step 1: Initialize upload
const initResponse = await fetch(`${BASE_URL}/upload/init`, {
method: 'POST',
headers: {
'Authorization': `Bearer ${API_KEY}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({
apiProduct: 'transcribe',
sourceLanguage: 'en-US',
fileName: file.name
})
});
const { jobId, uploadUrl } = await initResponse.json();
// Step 2: Upload file
const uploadResponse = await fetch(uploadUrl, {
method: 'PUT',
headers: {
'Content-Type': file.type
},
body: file
});
if (uploadResponse.ok) {
// Step 3: Confirm upload
const confirmResponse = await fetch(`${BASE_URL}/upload/confirm`, {
method: 'POST',
headers: {
'Authorization': `Bearer ${API_KEY}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({
jobId
})
});
const result = await confirmResponse.json();
console.log('Job started:', result.jobId);
return result;
} else {
throw new Error('Upload failed');
}
}
Method 3: Multipart Upload
For files over 100 MB, use multipart upload to send the file in chunks.
When to Use
- File size is over 100 MB
- You need to upload very large files (up to 3500 MB)
- You want to resume failed uploads
- Network connection is unreliable
How It Works
- Start multipart upload - Initialize and get upload ID
- Upload parts - Upload file in chunks (5 MB - 100 MB each)
- Complete upload - Finalize the upload and automatically start processing
Step 1: Start Multipart Upload
curl -X POST https://api.rednerapp.com/v1/upload/multipart/start \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"apiProduct": "transcribe",
"sourceLanguage": "en-US",
"fileName": "large-video.mp4"
}'
Request Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| apiProduct | string | Yes | API product type (transcribe, translate, voiceover, dub) |
| sourceLanguage | string | Conditional | Required for transcribe, translate, and dub |
| targetLanguage | string | Conditional | Required for translate, voiceover, and dub |
| fileName | string | Yes | File name with extension |
| partCount | integer | No | Number of parts (default: 10, max: 10000) |
| outputFormat | string | No | Output format for voiceover (same or audio_only) |
| srtUrl | string | No | SRT file URL (for translate/voiceover) |
| srtContent | string | No | SRT file content (for translate/voiceover) |
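If you supply partCount yourself, it should be the ceiling of the file size divided by your intended part size (each part between 5 MB and 100 MB, capped at 10000 parts). A small helper:

```python
import math

def part_count(file_size_bytes: int, part_size_bytes: int = 10 * 1024 * 1024) -> int:
    """Number of parts needed to cover the file at the given part size (max 10000)."""
    return min(math.ceil(file_size_bytes / part_size_bytes), 10000)

# A 250 MB file with 10 MB parts needs 25 parts:
print(part_count(250 * 1024 * 1024))  # 25
```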
Response:
{
"jobId": "job_abc123",
"uploadId": "upload_xyz789",
"partUrls": [
"https://s3.amazonaws.com/bucket/path?partNumber=1&uploadId=...",
"https://s3.amazonaws.com/bucket/path?partNumber=2&uploadId=...",
"https://s3.amazonaws.com/bucket/path?partNumber=3&uploadId=..."
],
"partSize": 10485760,
"expiresAt": "2025-12-04T12:30:00Z"
}
Step 2: Upload Parts
Upload each part using the provided URLs:
import requests
def upload_file_in_parts(file_path, part_urls, part_size=10 * 1024 * 1024):
# part_size should equal the partSize value returned by the start call
etags = []
with open(file_path, 'rb') as f:
for i, part_url in enumerate(part_urls):
# Read part data
part_data = f.read(part_size)
# Upload part
response = requests.put(
part_url,
data=part_data,
headers={'Content-Type': 'application/octet-stream'}
)
# Save ETag for completion (S3 wraps the value in quotes)
etag = response.headers['ETag'].strip('"')
etags.append({
'partNumber': i + 1,
'etag': etag
})
print(f"Uploaded part {i + 1}/{len(part_urls)}")
return etags
Step 3: Complete Upload
After all parts are uploaded, complete the multipart upload. This automatically starts job processing (no need to call /v1/upload/confirm):
curl -X POST https://api.rednerapp.com/v1/upload/multipart/complete \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"jobId": "job_abc123",
"uploadId": "upload_xyz789",
"parts": [
{"partNumber": 1, "etag": "abc123..."},
{"partNumber": 2, "etag": "def456..."},
{"partNumber": 3, "etag": "ghi789..."}
]
}'
Response:
{
"jobId": "job_abc123",
"status": "processing",
"location": "https://s3.amazonaws.com/bucket/path",
"executionArn": "arn:aws:states:us-east-1:123456789012:execution:TranscribeWorkflow:exec-123"
}
Abort Upload
If you need to cancel a multipart upload:
curl -X POST https://api.rednerapp.com/v1/upload/multipart/abort \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"jobId": "job_abc123",
"uploadId": "upload_xyz789"
}'
Choosing the Right Method
| Scenario | Recommended Method | Why |
|---|---|---|
| Small files (< 100 MB), public URL | Direct HTTP/HTTPS URL | Simplest, one API call |
| Small files (< 100 MB), private | Presigned URL Upload | Direct upload, secure |
| Medium files (100 MB - 1 GB), public URL | Direct HTTP/HTTPS URL | Simple, acceptable download time |
| Medium files (100 MB - 1 GB), private | Presigned URL Upload | Faster than download |
| Large files (1 GB - 3500 MB) | Multipart Upload | Reliable, resumable |
| Very large files (> 3500 MB) | Not supported | Contact support for alternatives |
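The decision table above can be sketched as a small helper (thresholds in MB, mirroring the table; the function name and return strings are illustrative):

```python
def choose_upload_method(file_size_mb: float, publicly_accessible: bool) -> str:
    """Pick an upload method based on file size and accessibility."""
    if file_size_mb > 3500:
        return "not supported - contact support"
    if file_size_mb > 1000:
        return "multipart upload"
    if publicly_accessible:
        return "direct url"
    return "presigned url upload"
```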
Best Practices
Security
- Never expose API keys in client-side code
- Use presigned URLs for private files
- Implement proper error handling
- Validate file types before upload
Performance
- Use multipart upload for files over 100 MB
- Upload from servers close to AWS regions
- Implement retry logic with exponential backoff
- Monitor upload progress and handle timeouts
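The retry-with-exponential-backoff advice above can be sketched as a small wrapper (an illustrative helper, not part of any official client):

```python
import random
import time

def retry_with_backoff(operation, max_attempts=5, base_delay=1.0):
    """Run operation(), doubling the delay (plus jitter) after each failure."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the last error
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)
```

Wrap each network call, e.g. retry_with_backoff(lambda: upload_part(url, data)), so transient failures on a single part do not fail the whole upload.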
Error Handling
Always implement proper error handling:
try {
const result = await uploadAndProcess(file);
console.log('Success:', result);
} catch (error) {
// Assumes your HTTP client (e.g. axios) attaches a response to thrown
// errors; plain fetch does not, so adapt these checks to your client
if (error.response?.status === 401) {
console.error('Authentication failed - check API key');
} else if (error.response?.status === 400) {
console.error('Invalid request:', error.response.data);
} else if (error.response?.status === 413) {
console.error('File too large - use multipart upload');
} else {
console.error('Upload failed:', error.message);
}
}
Monitoring
- Set up webhooks to receive job status updates
- Monitor upload success rates
- Track processing times
- Log errors for debugging
Troubleshooting
Upload Fails with 403 Forbidden
Cause: Presigned URL expired or invalid
Solution:
- Presigned URLs expire in 15 minutes
- Request a new URL and retry upload
- Ensure you're using PUT method, not POST
Upload Succeeds but Processing Doesn't Start
Cause: Forgot to call confirm endpoint
Solution:
- Always call POST /v1/upload/confirm after upload
- Verify the jobId matches the upload
Multipart Upload Fails on Last Part
Cause: Part size mismatch or missing parts
Solution:
- Ensure all parts except the last are the same size
- Last part can be smaller
- Verify all parts uploaded successfully before completing
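Before calling the complete endpoint, it is cheap to sanity-check the parts list you collected (a sketch; assumes the list-of-dicts shape returned by the Step 2 example):

```python
def verify_parts(parts, expected_count):
    """Check every part is present, numbered sequentially, and carries an ETag."""
    if len(parts) != expected_count:
        return False
    numbers = sorted(p["partNumber"] for p in parts)
    if numbers != list(range(1, expected_count + 1)):
        return False  # a part number is missing or duplicated
    return all(p.get("etag") for p in parts)
```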
File Too Large Error
Cause: File exceeds 3500 MB limit
Solution:
- Compress the file before upload
- Use a lower bitrate for video encoding
- Contact support if you need higher limits
Network Timeout During Upload
Cause: Slow network or large file
Solution:
- Use multipart upload for better reliability
- Implement retry logic
- Upload from a server with better connectivity
Rate Limits
All API endpoints are subject to two tiers of rate limiting:
Tier 1: Global Rate Limits (All Endpoints)
All API calls are subject to global rate limits per client:
- 100 requests per minute
- 10,000 requests per day
These limits apply to ALL endpoints including job creation, management, and upload endpoints.
Tier 2: Per-Product Rate Limits (Job Creation Only)
Job creation endpoints have additional per-product rate limits:
| Endpoint | Rate Limit |
|---|---|
| POST /v1/transcribe | 4 requests per minute |
| POST /v1/translate | 32 requests per minute |
| POST /v1/voiceover | 8 requests per minute |
| POST /v1/dub | 1 request per minute |
Important: These are in addition to the global limits. A request must pass both checks to be allowed.
Each upload operation (init, upload part, complete, confirm) counts as one request toward the global limit but is not subject to per-product limits.
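To stay under the 100-requests-per-minute global limit, a simple client-side sliding-window throttle can gate outgoing calls (a sketch, not an official client feature):

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Allow at most max_calls calls per window_seconds, sleeping when over."""

    def __init__(self, max_calls=100, window_seconds=60.0):
        self.max_calls = max_calls
        self.window = window_seconds
        self.calls = deque()  # timestamps of recent calls

    def acquire(self):
        now = time.monotonic()
        # Drop timestamps that have left the window
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call ages out of the window
            time.sleep(self.window - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

Call limiter.acquire() before each API request; it returns immediately while under the limit and blocks just long enough otherwise.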
Next Steps
- Set up Webhooks to receive job completion notifications
- Explore the other API endpoints for detailed documentation
- Check Error Handling for comprehensive error codes