major refactor

2026-03-13 01:07:02 -03:00
parent eaaf2ad60c
commit 3eeedebb15
61 changed files with 441 additions and 242 deletions


@@ -70,7 +70,7 @@ aws s3 sync /local/media/ s3://mpr-media-in/
## GCP Production (GCS via S3 compatibility)
-GCS exposes an S3-compatible API. The same `core/storage.py` boto3 code works
+GCS exposes an S3-compatible API. The same `core/storage/s3.py` boto3 code works
with no changes — only the endpoint and credentials differ.
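The "only the endpoint and credentials differ" point can be sketched as follows. This is a hedged illustration, not the repo's actual `core/storage` code: the helper name `s3_client_kwargs` is hypothetical, and it assumes the credentials arrive via the standard `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` environment variables holding a GCS HMAC key pair.

```python
import os

# Hypothetical helper (not from the repo): builds boto3 client kwargs so the
# same S3 code path can target AWS or GCS. Only the endpoint URL and the
# credential pair (a GCS HMAC key, for GCS) change; all bucket calls stay the same.
def s3_client_kwargs() -> dict:
    return {
        "service_name": "s3",
        # Default shown for GCS; unset (or AWS's endpoint) for AWS itself.
        "endpoint_url": os.environ.get("S3_ENDPOINT_URL", "https://storage.googleapis.com"),
        "aws_access_key_id": os.environ["AWS_ACCESS_KEY_ID"],          # GCS HMAC access ID
        "aws_secret_access_key": os.environ["AWS_SECRET_ACCESS_KEY"],  # GCS HMAC secret
    }

# client = boto3.client(**s3_client_kwargs())  # identical call for AWS and GCS
```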
### GCS HMAC Keys
@@ -103,15 +103,15 @@ aws --endpoint-url https://storage.googleapis.com s3 cp video.mp4 s3://mpr-media
```
### Cloud Run Job Handler
-`task/gcp_handler.py` is the Cloud Run Job entrypoint. It reads the job payload
+`core/task/gcp_handler.py` is the Cloud Run Job entrypoint. It reads the job payload
from `MPR_JOB_PAYLOAD` (injected by `GCPExecutor`), uses `core/storage` for all
GCS access (S3 compat), and POSTs the completion callback to the API.
-Set the Cloud Run Job command to: `python -m task.gcp_handler`
+Set the Cloud Run Job command to: `python -m core.task.gcp_handler`
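Reading the payload from `MPR_JOB_PAYLOAD` might look like the sketch below. The function name and the assumption that the payload is JSON are mine, not taken from `core/task/gcp_handler.py`:

```python
import json
import os

# Minimal sketch of a Cloud Run Job entrypoint's first step: GCPExecutor
# injects the job description into the MPR_JOB_PAYLOAD environment variable,
# and the handler decodes it before touching storage. JSON encoding is assumed.
def load_job_payload() -> dict:
    raw = os.environ.get("MPR_JOB_PAYLOAD")
    if raw is None:
        raise RuntimeError("MPR_JOB_PAYLOAD not set; was this started by GCPExecutor?")
    return json.loads(raw)
```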
## Storage Module
-`core/storage.py` provides all S3 operations:
+`core/storage/` package provides all S3 operations:
```python
from core.storage import (
@@ -157,7 +157,7 @@ mutation { scanMediaFolder { found registered skipped files } }
### Cloud Run Job Mode (GCP)
1. `GCPExecutor` triggers Cloud Run Job with payload in `MPR_JOB_PAYLOAD`
-2. `task/gcp_handler.py` downloads source from `S3_BUCKET_IN` (GCS S3 compat)
+2. `core/task/gcp_handler.py` downloads source from `S3_BUCKET_IN` (GCS S3 compat)
3. Runs FFmpeg in container
4. Uploads result to `S3_BUCKET_OUT` (GCS S3 compat)
5. Calls back to API with result
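Step 5, the completion callback, could be built roughly as below. This is a hedged stdlib-only sketch: the callback URL shape and the result fields (`job_id`, `status`, `output_key`) are assumptions, not the API's actual contract.

```python
import json
import urllib.request

# Hypothetical illustration of the handler's final step: POST a JSON
# completion payload back to the API. Field names are assumed, not the
# repo's real callback schema.
def build_callback_request(api_url: str, job_id: str, output_key: str) -> urllib.request.Request:
    body = json.dumps(
        {"job_id": job_id, "status": "done", "output_key": output_key}
    ).encode("utf-8")
    return urllib.request.Request(
        api_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# urllib.request.urlopen(build_callback_request(api_url, job_id, output_key))
```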