We are looking for an experienced Google Cloud Platform (GCP) Dataflow expert to help us build streaming and batch data pipelines to ingest, transform and load data into our BigQuery data warehouse.
The successful candidate should have extensive hands-on experience designing, developing, and optimizing Dataflow pipelines that ingest data from Pub/Sub and other sources for both real-time and batch processing use cases. They should be proficient in Java or Python and have a deep understanding of Dataflow concepts such as windowing, triggers, and side inputs; a rough sketch of the kind of streaming pipeline we have in mind follows.
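For illustration only (not part of the requirements), here is a minimal sketch in Python with the Apache Beam SDK; the project, topic, table, and schema names are placeholders rather than our actual resources:

    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resource names -- the real project/topic/table differ.
    TOPIC = "projects/example-project/topics/events"
    TABLE = "example-project:analytics.events"

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw message payloads (bytes) from the Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            # Decode and parse each message as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Group elements into one-minute fixed windows (illustrating windowing).
            | "WindowIntoFixed" >> beam.WindowInto(beam.window.FixedWindows(60))
            # Stream rows into BigQuery; assumes messages already match this schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )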
Experience designing and architecting scalable data warehousing solutions on BigQuery is essential. The pipelines need to support ingesting millions of records per day from various APIs and services. Experience integrating Dataflow with other GCP services such as Cloud Functions, Datastore, and Cloud Storage is preferred; a comparable batch-mode sketch appears below.
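As a rough illustration of the batch side (again with hypothetical bucket, table, and schema names), such a pipeline might load newline-delimited JSON exports from Cloud Storage into BigQuery:

    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical locations -- the real bucket and table names will differ.
    SOURCE_PATTERN = "gs://example-bucket/exports/*.json"
    TABLE = "example-project:analytics.api_records"

    with beam.Pipeline(options=PipelineOptions()) as pipeline:
        (
            pipeline
            # Read text files from Cloud Storage, one JSON object per line.
            | "ReadJsonLines" >> beam.io.ReadFromText(SOURCE_PATTERN)
            | "ParseJson" >> beam.Map(json.loads)
            # Batch-load parsed rows into the destination BigQuery table.
            | "LoadIntoBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="record_id:STRING,source:STRING,ingested_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )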
This is a short-term contract role in which you will work closely with our in-house team of developers and data scientists. By leveraging your strong Dataflow expertise, you will help us build a robust data ingestion and ETL layer to serve the analytics and reporting needs of our fast-growing SaaS platform. Experience with PostgreSQL and Snowflake is a plus.
The ideal candidate should have at least three years of relevant hands-on experience developing complex Dataflow pipelines and data warehousing solutions on Google Cloud Platform. Proficiency in a programming language such as Java or Python is mandatory.