GCP Data Engineer

Job Title: Data Engineer – GCP & Python Development
Location: Houston, TX
Employment Type: Contract
Level: Mid-Level
Key Responsibilities
  • Design, develop, and maintain reliable and scalable data pipelines on GCP using tools like Dataflow, BigQuery, Pub/Sub, and Cloud Composer.
  • Write efficient and reusable Python scripts and modules for ETL/ELT workflows and data transformations.
  • Collaborate with data scientists, analysts, and other engineers to integrate data from various sources, ensure data quality, and optimize performance.
  • Build and manage data lake and data warehouse solutions leveraging BigQuery and Cloud Storage.
  • Automate data validation and monitoring workflows to ensure data integrity and reliability.
  • Implement CI/CD pipelines for data engineering workflows using tools like Cloud Build, GitHub Actions, or Jenkins.
  • Monitor and optimize job performance, cost efficiency, and error handling across GCP services.
  • Maintain proper documentation of data flows, schemas, and transformation logic.

Requirements
Technical Skills
  • Strong proficiency in Python, with experience in writing scalable, modular, and testable code.
  • Solid experience with Google Cloud Platform – especially BigQuery, Cloud Functions, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer (Airflow).
  • Experience with SQL, including writing optimized queries for large-scale data processing.
  • Hands-on experience with data orchestration tools such as Apache Airflow (Cloud Composer preferred).
  • Knowledge of data modeling, data warehousing concepts, and data governance best practices.
  • Familiarity with Docker, Terraform, or Kubernetes is a plus.