Batch Execution
Execute multiple code snippets in different languages in a single API request. This is useful when you need to run multiple independent code executions and want to minimize HTTP overhead.

Headers
| Header | Type | Required | Description |
|---|---|---|---|
| X-API-Key | string | Yes | Your Sandbox API key |
| Content-Type | string | Yes | Must be application/json |
Request Body
Array of execution requests (maximum 10):

| Parameter | Type | Required | Description |
|---|---|---|---|
| code | string | Yes | The code to execute |
| language | string | Yes | Programming language |
| stdin | string | No | Standard input |
| environment | object | No | Environment variables |
| timeout | integer | No | Execution timeout in seconds |
| session_id | string | No | Optional session ID |
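For illustration, a batch body using the parameters above might look like this (the code and values are placeholders):

```json
[
  {
    "code": "print(\"hello\")",
    "language": "python",
    "timeout": 10
  },
  {
    "code": "console.log(\"hello\");",
    "language": "javascript",
    "stdin": "",
    "environment": { "MODE": "test" }
  }
]
```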
Response
Array of execution results in the same order as requests.

Error Handling

If an individual execution fails, it returns an error result with `exit_code: -1` instead of failing the entire batch:
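A sketch of a batch response where the second execution failed. The `exit_code` field is documented above; the `stdout`/`stderr` field names are assumptions about the result shape:

```json
[
  { "exit_code": 0, "stdout": "hello\n", "stderr": "" },
  { "exit_code": -1, "stdout": "", "stderr": "SyntaxError: invalid syntax" }
]
```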
Limitations

A batch may contain at most 10 executions; submitting more returns a 400 Bad Request.
Examples
Multi-Language Batch
Execute code in Python, JavaScript, and Go.

Test Cases Batch
Run multiple test cases for the same code.

With Different Configurations
Different timeout and environment per execution.

Python Client Example
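A minimal Python client sketch using only the standard library. The endpoint URL is an assumption (check your API reference); the headers and body fields follow the tables above:

```python
import json
import urllib.request

# Hypothetical endpoint -- substitute your actual batch URL.
API_URL = "https://sandbox.example.com/v1/execute/batch"
API_KEY = "your-api-key"

# Up to 10 executions per batch (see Limitations above).
batch = [
    {"code": "print('hello from python')", "language": "python"},
    {"code": "console.log('hello from js')", "language": "javascript", "timeout": 10},
]

def build_batch_request(url: str, key: str, payload: list) -> urllib.request.Request:
    """Build a POST request with the documented X-API-Key and Content-Type headers."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"X-API-Key": key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_batch_request(API_URL, API_KEY, batch)
# To actually send the request:
# with urllib.request.urlopen(req) as resp:
#     results = json.load(resp)  # one result per execution, in request order
```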
JavaScript Client Example
Execution Order
Important: Executions in a batch run sequentially, not in parallel. This ensures:

- Consistent resource usage
- Predictable execution order
- Easier debugging and troubleshooting
Performance Considerations
Sequential Execution Time
Total execution time is the sum of the individual execution times. For example, three executions of ~5 seconds each take:

- Batch: ~15 seconds total (sequential)
- 3 async calls: ~5 seconds total (parallel)
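The trade-off can be simulated with scaled-down sleeps standing in for ~5-second executions (a sketch, not a call to the real API):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_execution(seconds: float) -> float:
    """Stand-in for one sandboxed execution."""
    time.sleep(seconds)
    return seconds

# Three executions, scaled down from ~5 s each to keep the demo fast.
durations = [0.05, 0.05, 0.05]

# Batch-style: sequential, total ~= sum of the durations.
start = time.perf_counter()
for d in durations:
    fake_execution(d)
sequential_total = time.perf_counter() - start

# Async-style: parallel, total ~= the longest single duration.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(durations)) as pool:
    list(pool.map(fake_execution, durations))
parallel_total = time.perf_counter() - start

print(f"sequential: {sequential_total:.2f}s, parallel: {parallel_total:.2f}s")
```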
When to Use Batch
Batch execution is ideal for:

- ✅ Code comparison: testing the same code in different languages
- ✅ Test suites: running multiple test cases sequentially
- ✅ Reducing HTTP overhead: many quick executions (< 1 s each)
- ✅ Order matters: when executions must run in a specific order
- ✅ Atomic operations: all executions succeed, or you track which failed

When NOT to Use Batch

Avoid batch execution for:

- ❌ Long-running code: use async execution instead
- ❌ Parallel processing: submit separate async requests
- ❌ Independent operations: run separately for better error isolation

Best Practices
- Limit batch size: Keep batches small (3-5 executions) for faster response times
- Use consistent timeouts: Set appropriate timeouts for each execution
- Handle partial failures: Check `exit_code` for each result
- Group related executions: Use `session_id` to track related executions
- Monitor total time: The sum of execution times should be well under the HTTP timeout
- Consider alternatives: For > 5 executions, consider async execution or multiple batches
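Checking `exit_code` per result can look like the following (the `results` list is a hand-written stand-in for a parsed batch response; the `stdout`/`stderr` field names are assumptions):

```python
# Stand-in for a parsed batch response: one result per execution, in order.
results = [
    {"exit_code": 0, "stdout": "42\n", "stderr": ""},
    {"exit_code": -1, "stdout": "", "stderr": "TimeoutError"},
    {"exit_code": 0, "stdout": "ok\n", "stderr": ""},
]

# Indices of failed executions; exit_code -1 marks an error result.
failed = [i for i, r in enumerate(results) if r["exit_code"] != 0]

for i in failed:
    print(f"execution {i} failed: {results[i]['stderr']}")
```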
Rate Limiting
Batch requests count as multiple requests for rate-limiting purposes: a batch of 10 executions counts as 10 requests against your rate limit. Batching reduces HTTP overhead, but each execution still counts individually toward the limit.

Error Scenarios
Individual Execution Failure
One execution fails, the others continue.

Invalid Language
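An execution requesting an unsupported language might come back as an error result like this (`exit_code: -1` is documented above; the `error` field is a hypothetical shape):

```json
{ "exit_code": -1, "stdout": "", "stderr": "", "error": "unsupported language: cobol" }
```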
Batch Size Exceeded
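Submitting more than 10 executions fails the entire request with a 400 Bad Request; the response body might look like this (a hypothetical shape, check the API's error reference):

```json
{ "error": "batch size exceeded", "detail": "A batch may contain at most 10 executions" }
```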
Comparison with Other Methods
| Feature | Batch | Synchronous | Asynchronous |
|---|---|---|---|
| Multiple executions | ✅ Yes | ❌ No | ✅ Yes (multiple calls) |
| Execution order | Sequential | N/A | Parallel |
| HTTP calls | 1 | Multiple | Multiple |
| Wait for results | ✅ Yes | ✅ Yes | ❌ No (polling) |
| Max executions | 10 per batch | 1 | Unlimited |
| Best for | Quick multi-language tests | Single execution | Long-running tasks |