DBT/Snowflake/Azure Data Engineer w/ Retail Experience

Job Description:

This is an Architect- or Senior Engineer-level role, but the person hired will be doing hands-on development work; we are not looking for a manager.

•             Cloud data warehousing: Snowflake is primary, alongside Azure; AWS/GCP experience is secondary

•             Python and SQL

•             dbt experience

•             Previous retail experience (prior Albertsons experience would be great!)

•             Strong communication skills: speaking with stakeholders, explaining implementations to teams, and presenting to leadership

Interview process: a Python/SQL technical round covering Snowflake. Candidates will be asked to explain how their code works and how they would approach a given problem, with an emphasis on clear communication.
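For context on the technical round, below is a minimal, purely illustrative Python sketch of the kind of pipeline code a candidate might be asked to walk through. All schema, table, and column names (e.g., STAGING.STG_ORDERS, ANALYTICS.DIM_ORDERS) are hypothetical; the function simply renders the Snowflake MERGE statement behind a typical incremental staging-to-warehouse load, roughly the pattern a dbt incremental model automates.

# Illustrative only: renders the Snowflake MERGE (upsert) statement behind a
# typical incremental load. Table and column names are hypothetical.

from typing import Sequence


def build_merge_sql(
    target: str,
    staging: str,
    key_columns: Sequence[str],
    update_columns: Sequence[str],
) -> str:
    """Render a MERGE that upserts rows from a staging table into a target."""
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_columns)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_columns)
    insert_cols = ", ".join([*key_columns, *update_columns])
    insert_vals = ", ".join(f"s.{c}" for c in [*key_columns, *update_columns])
    return (
        f"MERGE INTO {target} AS t\n"
        f"USING {staging} AS s\n"
        f"  ON {on_clause}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals});"
    )


if __name__ == "__main__":
    # Hypothetical staging-to-dimension load for a retail orders feed.
    print(
        build_merge_sql(
            target="ANALYTICS.DIM_ORDERS",
            staging="STAGING.STG_ORDERS",
            key_columns=["ORDER_ID"],
            update_columns=["ORDER_STATUS", "ORDER_TOTAL", "UPDATED_AT"],
        )
    )

In an interview discussion of a sketch like this, the focus would be on explaining the upsert logic, the choice of key columns, and how the same pattern maps onto dbt's incremental materialization on Snowflake.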

 

QUALIFICATIONS AND SPECIAL SKILLS REQUIRED:


•             12+ years of in-depth data engineering experience, including execution of data pipelines, DataOps, scripting, and SQL queries

•             5+ years of proven data architecture experience, with demonstrable accountability for data standards and for designing data models for data warehousing and modern analytics use cases (e.g., from operational data stores to semantic models)

•             At least 3 years of experience with modern data architectures that support advanced analytics, including Snowflake, Azure, etc. Experience with Snowflake and other cloud data warehousing / data lake platforms preferred

•             Expert in engineering data pipelines using various data technologies (ETL/ELT, big data technologies such as Hive and Spark) on large-scale data sets, demonstrated through years of experience

•             5+ years of hands-on data warehouse design, development, and data modeling best practices for modern data architectures

•             Highly proficient in at least one of these programming languages: Java, Python

•             Experience with modern data modeling and data preparation tools

•             Experience adding data lineage and technical glossary entries from data pipelines to data catalog tools

•             Highly proficient in data analysis: analyzing SQL, Python scripts, and ETL/ELT transformation scripts

•             Highly skilled in data orchestration, with experience in tools such as Ctrl-M and Apache Airflow; hands-on DevOps/DataOps experience required

•             Knowledge of or working experience with reporting tools such as MicroStrategy and Power BI would be a plus

•             Self-driven individual with the ability to work independently or as part of a project team

•             Experience working in an Agile environment preferred; familiarity with the retail domain preferred

•             Experience with StreamSets and dbt preferred

•             Strong communication skills are required, with the ability to give and receive information, explain complex information in simple terms, and maintain a strong customer service approach with all users

•             Bachelor’s degree in Computer Science, Information Systems, Engineering, Business Analytics, or Business Management required