DetroitRecruiter Since 2001
the smart solution for Detroit jobs

GCP DATA ENGINEER

Company: ApTask
Location: Detroit
Posted on: November 26, 2022

Job Description:

Job summary
You will be responsible for developing scalable big data pipeline solutions in the Ford GCP Data Factory.

Experience: 8 to 12 years

Required skills:
• Technical: ANSI SQL, Google Bigtable, Apache Hadoop, MapReduce
• Domain: Automotive industry

Nice-to-have skills:
• Technical: Hive, Core Java

Technology: Data Management
Shift: Day

Roles & Responsibilities
You will be responsible for developing scalable big data pipeline solutions in the Ford GCP Data Factory.

In addition, you will:
" Advanced knowledge of the GCP ecosystem with a focus on Big Query.
" Designing and coding Big Query to analyze data collections
" Analyze user needs to determine how software should be built or if existing software should be modified.
" Participate in design, delivery estimates and code reviews
" Develop and/or perform software automated testing procedures, solutions, and frameworks to ensure software functions as needed
" Translate business requirements and specifications into usable and scalable software "
Process and understand capabilities and limitations of data outputs from the software
" Understand and assist with the technical infrastructure of an application or system
" Determine and execute the software deployment process and troubleshoot performance issues.
" Develop data quality and validation routines " Build distributed reliable and scalable data pipelines to ingest and process data in real-time and other unstructured data.

Skills Required:
• Data design, data architecture, and data modeling (both transactional and analytic)
• Building big data pipelines for operational and analytical solutions
• Running and tuning queries in databases including BigQuery, SQL Server, and Hive
• Data management, including running queries and compiling data for analytics
• Experience developing code in one or more languages such as Java, Python, and SQL
• GCP cloud data implementation project experience (Dataflow, Airflow, BigQuery, Cloud Storage, Cloud Build, Cloud Run, etc.)
• Agile methodologies
Experience Required: 5
Experience Preferred: 0
Education Required: Bachelor's degree in Computer Science, Computer Engineering, Information Technology, or equivalent experience
Education Preferred:
" Masters degree in Data Science, Computer Science, or Data Engineering
" Certification: Google Professional Data Engineer
Skills Preferred:
7 years of experience

Keywords: ApTask, Detroit, GCP DATA ENGINEER, Engineering, Detroit, Michigan

Click here to apply!
