executor abstraction, graphene to strawberry
@@ -68,6 +68,47 @@ aws s3 cp video.mp4 s3://mpr-media-in/
aws s3 sync /local/media/ s3://mpr-media-in/
```

## GCP Production (GCS via S3 compatibility)

GCS exposes an S3-compatible API. The same `core/storage.py` boto3 code works
with no changes — only the endpoint and credentials differ.
### GCS HMAC Keys

Generate under **Cloud Storage → Settings → Interoperability** in the GCP console.
These act as `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY`.
### Configuration

```bash
S3_ENDPOINT_URL=https://storage.googleapis.com
S3_BUCKET_IN=mpr-media-in
S3_BUCKET_OUT=mpr-media-out
AWS_ACCESS_KEY_ID=<GCS HMAC access key>
AWS_SECRET_ACCESS_KEY=<GCS HMAC secret>

# Executor
MPR_EXECUTOR=gcp
GCP_PROJECT_ID=my-project
GCP_REGION=us-central1
CLOUD_RUN_JOB=mpr-transcode
CALLBACK_URL=https://mpr.mcrn.ar/api
CALLBACK_API_KEY=<secret>
```

### Upload Files to GCS

```bash
gcloud storage cp video.mp4 gs://mpr-media-in/

# Or with the aws CLI via compat endpoint
aws --endpoint-url https://storage.googleapis.com s3 cp video.mp4 s3://mpr-media-in/
```

### Cloud Run Job Handler

`task/gcp_handler.py` is the Cloud Run Job entrypoint. It reads the job payload
from `MPR_JOB_PAYLOAD` (injected by `GCPExecutor`), uses `core/storage` for all
GCS access (S3 compat), and POSTs the completion callback to the API.

Set the Cloud Run Job command to: `python -m task.gcp_handler`
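The payload-reading step can be sketched like this (a hypothetical snippet, not the project's actual handler; the payload field names are illustrative and the payload is assumed to be JSON):

```python
import json
import os


def read_job_payload():
    """Decode the job payload injected by GCPExecutor.

    Cloud Run Jobs have no per-execution argv payload by default, so the
    executor passes one through the MPR_JOB_PAYLOAD env var as JSON.
    """
    raw = os.environ.get("MPR_JOB_PAYLOAD")
    if not raw:
        raise RuntimeError("MPR_JOB_PAYLOAD is not set")
    return json.loads(raw)


# Demo with an illustrative payload shape
os.environ["MPR_JOB_PAYLOAD"] = json.dumps(
    {"job_id": "demo", "source_key": "video.mp4"}
)
payload = read_job_payload()
print(payload["job_id"])
```

Failing loudly when the variable is missing keeps a misconfigured job execution from running FFmpeg against nothing.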
## Storage Module

`core/storage.py` provides all S3 operations:

@@ -114,7 +155,14 @@ mutation { scanMediaFolder { found registered skipped files } }
4. Uploads result to `S3_BUCKET_OUT`
5. Calls back to API with result

Both paths use the same S3 buckets and key structure.
### Cloud Run Job Mode (GCP)

1. `GCPExecutor` triggers Cloud Run Job with payload in `MPR_JOB_PAYLOAD`
2. `task/gcp_handler.py` downloads source from `S3_BUCKET_IN` (GCS S3 compat)
3. Runs FFmpeg in container
4. Uploads result to `S3_BUCKET_OUT` (GCS S3 compat)
5. Calls back to API with result

All three paths use the same S3-compatible bucket names and key structure.

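Step 5, the completion callback, can be sketched as follows (the callback path and the `X-API-Key` header name are assumptions for illustration; match whatever the API actually expects):

```python
import json
import urllib.request


def build_callback_request(callback_url, api_key, result):
    """Prepare the completion callback POST to the API.

    The path "/jobs/complete" is illustrative, not taken from the docs;
    the result dict is serialized as the JSON request body.
    """
    return urllib.request.Request(
        callback_url + "/jobs/complete",
        data=json.dumps(result).encode(),
        headers={"Content-Type": "application/json", "X-API-Key": api_key},
        method="POST",
    )


req = build_callback_request(
    "https://mpr.mcrn.ar/api", "secret", {"job_id": "demo", "status": "done"}
)
print(req.get_method(), req.full_url)
# Sending would be: urllib.request.urlopen(req)  (network call, omitted here)
```

Keeping callback construction separate from sending makes the handler easy to unit-test without a live API.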
## Supported File Types