Data Engineer Job at On-Demand Group, Minneapolis, MN

  • On-Demand Group
  • Minneapolis, MN

Job Description

Job Title: Data Engineer

Job Location: On-site

Job Type: Contract to Hire

USC or GC holders only for this contract-to-hire need; no sponsorship available.

  • Must-have requirements:
  • GCP, SQL, Python, Airflow
  • System design mindset
  • Communication – able to articulate what they are doing and what/how they are achieving in their work. Accents are not an issue as long as they are comprehensible.
  • Healthcare experience is not required, but is a nice-to-have.
  • Location: onsite at any of four office locations – Minneapolis, MN (primary focus); Arlington, VA; Portland, OR; Raleigh, NC
  • 100% onsite initially, then switching to a 2-3x/week hybrid schedule for strong performers

Job Summary:

The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and other engineers to ensure timely access to high-quality data for data-driven decision-making across the organization.

The Senior Cloud Data Engineer is a highly technical person who has mastered hands-on coding of data processing solutions and scalable data pipelines that support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that enable data integrity and quality, and best support the BI and analytic capability needs that power decision-making. This includes building data acquisition programs that handle the business's growing data volume as part of the Data Lake in the GCP BigQuery ecosystem and maintaining a robust data catalog.

This is a Senior Data Engineering role within Data & Analytics' Data Core organization, working closely with leaders of the Data & Analytics organization. The incumbent will continually improve the business's data and analytic solutions, processes, and data engineering capabilities. The incumbent embraces industry best practices and trends and, through acquired knowledge, drives process and system improvement opportunities.

Responsibilities:

• Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading.

• Optimize data pipelines for performance, scalability, and cost-efficiency.

• Ensure data quality through data cleansing, validation, and monitoring processes.

• Develop and maintain data models and schemas in BigQuery to support various data analysis needs.

• Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.

• Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.

• Leverage Terraform (Infrastructure as Code) within DevOps practices to ensure seamless integration of data pipelines with CI/CD workflows.

• Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.

• Stay up to date with the latest advancements in GCP BigQuery and other related technologies.

• Document data pipelines and technical processes for future reference and knowledge sharing.
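To illustrate the data cleansing and validation duties listed above, here is a minimal, hypothetical sketch of a row-level quality check of the kind a pipeline task might run before loading data into BigQuery. The field names and rules are illustrative assumptions, not taken from this posting.

```python
# Hypothetical row-level data-quality gate: split incoming rows into
# rows safe to load and rows to quarantine, recording why each row
# was rejected. Field names ("member_id", "claim_date") are examples.

def validate_rows(rows, required_fields=("member_id", "claim_date")):
    """Return (valid, rejected) lists; rejected entries carry the
    names of the missing/empty required fields."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            rejected.append({"row": row, "errors": missing})
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"member_id": "M1", "claim_date": "2024-01-05"},
    {"member_id": "", "claim_date": "2024-01-06"},  # empty member_id
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # → 1 1
```

In a real pipeline this kind of check would typically run as an Airflow task, with quarantined rows written to a reject table and surfaced through monitoring rather than silently dropped.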

Basic Requirements:

• Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.

• 5+ years of solid experience as a data engineer.

• Strong understanding of data warehousing / data lake concepts and data modeling principles.

• Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.

• Strong SQL skills and proficiency in scripting languages like Python (or similar).

• Experience with data quality tools and techniques.

• Ability to work independently and as part of a team.

• Strong problem-solving and analytical skills.

• Passion for data and a desire to learn and adapt to new technologies.

• Experience with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub.

• Experience with cloud deployment and automation tools like Terraform.

• Experience with data visualization tools like Tableau, Power BI, or Looker.

• Experience with healthcare data.

• Familiarity with machine learning, artificial intelligence, and data science concepts.

• Experience with data governance and healthcare PHI data security best practices.

• Ability to work independently on tasks and projects to deliver data engineering solutions.

• Ability to communicate effectively and convey complex technical concepts as well as task/project updates.

The projected hourly range for this position is $78 to $89.

On-Demand Group (ODG) provides employee benefits which includes healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
