
Cloud Composer runs an Airflow DAG on a schedule (or via an external trigger). Tasks connect to the vendor's SFTP server using key-based authentication, pull any new CSV/JSON files, land them in a raw GCS bucket, validate them (file naming, row counts, optional schema checks), then load them into BigQuery staging/partitioned tables. On success, files move to an archive location; failures raise alerts. Parameterizing the DAG (SFTP path, GCS prefix, BQ table) makes the same flow reusable across vendors.
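The validation step in this flow could be sketched as a plain Python callable (e.g. run from a PythonOperator task). This is a minimal, hypothetical sketch: the filename pattern, expected columns, and row-count threshold below are illustrative assumptions, not the vendor's actual contract.

```python
import csv
import io
import re

# Assumed vendor conventions -- adjust per vendor via DAG parameters.
FILENAME_RE = re.compile(r"^(?P<vendor>\w+)_(?P<date>\d{8})\.csv$")  # hypothetical pattern
EXPECTED_COLUMNS = ["id", "amount", "currency"]  # hypothetical schema


def validate_csv(filename: str, content: str, min_rows: int = 1) -> list[str]:
    """Return a list of validation errors; an empty list means the file passes."""
    errors = []
    # Naming check: file must match the agreed vendor_YYYYMMDD.csv pattern.
    if not FILENAME_RE.match(filename):
        errors.append(f"bad filename: {filename}")
    rows = list(csv.reader(io.StringIO(content)))
    if not rows:
        errors.append("empty file")
        return errors
    header, body = rows[0], rows[1:]
    # Optional schema check: header must match the expected column list.
    if header != EXPECTED_COLUMNS:
        errors.append(f"unexpected header: {header}")
    # Row-count check: require at least min_rows data rows.
    if len(body) < min_rows:
        errors.append(f"too few rows: {len(body)} < {min_rows}")
    return errors
```

A task wrapping this would raise (failing the DAG run and triggering alerts) when the returned list is non-empty, and let the load and archive tasks proceed otherwise.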