Data Engineer
Position: Data Engineer – Google Cloud Platform
Position Responsibilities:
- Design, develop, and implement data engineering solutions in the Google Cloud Platform (GCP), adhering to standards, templates, patterns, and best practices.
- Create full stack data engineering solutions encompassing development, ingestion, curation, implementation, deployment, automation, and monitoring of both structured and unstructured data.
- Collaborate with various teams including Data Factory Engineering Organization, Data Architecture, PEM, GDIA, Information Technology, and Data Consumers to drive data engineering capabilities, product design, proof of concepts, and MVPs.
- Develop high-quality data engineering solutions focusing on cloud-first principles, encapsulation, repeatability, automation, and auditability.
- Work independently and as part of a team to build, test, maintain, and troubleshoot data solutions.
- Continuously integrate and deploy data solutions via CI/CD pipelines.
- Utilize Test Driven Development and Code Pairing Practices.
- Develop and maintain comprehensive documentation for data engineering processes, systems, and pipelines, ensuring clarity, transparency, and alignment with customer requirements.
Skills Required:
- Proficient in technical documentation, translating business requirements into technical specifications.
- Understanding of the GCP ecosystem with emphasis on BigQuery and Dataflow.
- Ability to design and code analytical solutions for data collections.
- Proficiency in developing data quality and validation routines.
- Capable of testing data products in the development process.
Skills Preferred:
- Strong oral and written communication skills.
- Ability to write complex SQL queries for data analysis.
- Aptitude for communicating complex solution concepts in simple terms.
- Ability to apply multiple solutions to business problems.
- Quick comprehension of functions and capabilities of new technologies.
Experience Required:
- 1+ years of academic/work experience in:
  - Data design, data architecture, and data modeling.
  - Building Big Data pipelines for operational and analytical solutions.
  - Running and tuning queries in databases including BigQuery, SQL Server, Hive, or equivalent platforms.
  - Data management, including running queries and compiling data for analytics.
  - Developing code in languages such as Java, Python, and SQL.
Experience Preferred:
- 2+ years of experience in GCP data implementation projects (Dataflow, Airflow, BigQuery, Cloud Storage, Cloud Build, Cloud Run, etc.).
- Experience with Agile methodologies and tools such as Rally or Jira.
- Certification: Google Professional Data Engineer.
- Experience programming and producing working models or transformations with modern programming languages.
- Knowledge or experience in designing and deploying data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Teradata, Tableau, Qlik, or others.
- Strong team player with excellent collaboration skills.
- Demonstrated customer focus and problem-solving abilities.
- Resourceful, quick learner, and highly self-motivated.
If you're passionate about leveraging cutting-edge technologies to drive data-driven insights and solutions, and if you meet the qualifications above, we'd love to hear from you! Apply now to be part of our innovative team revolutionizing data engineering in the cloud.