Data Cloud Engineer - New York, New York | STAND 8 Careers | Stand8

Data Cloud Engineer
Date Posted:  9/14/2021
Job ID:  Job #3333
Employment Type: Contract
Location: New York, New York

We are seeking a Data Engineer for one of our top technology partners, a leader in digital innovation. The team values growth, collaboration, and a team-focused culture, balanced with cutting-edge technology and solutions.
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in LA, Atlanta, New York, Raleigh, and more.

The Senior Data Engineer is responsible for building and maintaining optimized, highly available data pipelines that facilitate deeper analysis and reporting. This engineer monitors existing metrics, analyzes data, and leads partnerships with other Data and Analytics teams to identify and implement systems and process improvements. The Data Engineer also designs, architects, implements, and supports key datasets.


Required Skills:

  • GCP or AWS Cloud experience
  • Airflow experience
  • SQL experience
  • Python experience 


Preferred Skills:

  • Google Cloud Certified - Professional Data Engineer certification is a plus
  • Knowledge of Jinja2, Docker, and Bamboo
  • Familiar with a NoSQL database such as MongoDB
  • Familiar with version control systems (Git and Bitbucket)
  • Familiar with Atlassian products (Jira and Confluence)
  • Hands-on experience with Apache Airflow or Google Cloud Composer


Responsibilities:

  • Design and develop highly scalable and reliable data engineering pipelines to process large volumes of data across many data sources in the cloud
  • Identify, design and implement internal process improvements by automating manual processes and optimizing data delivery
  • Develop and promote best practices in data engineering
  • Develop real-time data processing applications using Google Cloud
  • Be part of the on-call rotation supporting our SLAs
  • Participate in design and code reviews 


Qualifications:

  • Bachelor's degree in Computer Science or equivalent experience in a related field
  • Hands-on experience working in data warehousing or data engineering environment
  • Advanced SQL programming skills
  • Experience developing data solutions on GCP or AWS
  • Experience in ingestion of data from external APIs and data stores
  • Experience designing, building, and operationalizing big data pipelines on cloud technologies
  • Strong problem-solving skills, quality focus, and ability to execute
  • Strong experience authoring, scheduling, and monitoring workflows (Apache Airflow and related technologies)
  • Strong communication & interpersonal skills
Location: Remote, able to work EST hours