Carex is working on an outstanding opportunity for a Data Engineer in Wisconsin or Iowa. In this role, you will be responsible for building end-to-end data projects using a modern cloud data stack, with the opportunity to take on many exciting and dynamic data engineering challenges. The ideal candidate is an experienced pipeline builder who enjoys building data systems from the ground up and will support our internal stakeholders through modern data practices, tools, and platforms, including Snowflake Data Cloud, Python, Informatica IICS, and Azure.
Excellent verbal and written communication skills are required, along with the ability to work both in a team environment and on self-directed tasks.

Job Responsibilities
• Designs and develops data pipeline (ELT/ETL) architectures to load data from a wide range of sources into Snowflake and Azure.
• Plans, analyzes, designs, codes, tests, implements, and maintains moderately complex data projects.
• Assists team members with design and acts as a resource for particularly difficult design projects.
• Responsible for translating operational requirements for moderately complex systems. Recommends and may assist in leading the definition of user stories, design workshops, data modeling, and prototyping. Develops documentation as required.
• Performs and may lead the testing of complex systems to ensure system reliability prior to implementation. Develops and advises on innovative or blended techniques to meet customer needs. Recommends application development team standards for testing. Recommends and may develop metrics to evaluate process and system performance improvements.
• Understands data pipelines and modern ways of automating them using cloud-based implementations. Tests and clearly documents requirements to create technical and functional specs.
• Identifies, designs, and implements internal process improvements, such as automating manual processes and optimizing data processing.
• Helps set up and maintain CI/CD pipelines.
• Maintains current knowledge of information systems and technologies. Recommends changes to design, development, and implementation standards when appropriate. Responsible for technical application development for all stages in the development lifecycle.
• All other duties as assigned

Job Qualifications

Required Qualifications
• 5 years of related experience.

Preferred Qualifications
• Bachelor's degree with an emphasis in information technology or a related area.

Knowledge, Skills, and Abilities
• Experience and proficiency using Python for data transformation activities.
• Hands-on development experience with the Snowflake data platform, including Snowpipe, tasks, stored procedures, streams, resource monitors, Snowpark, virtual warehouse sizing, query performance tuning, cloning, Time Travel, and data sharing, and an understanding of how to use these features.
• Solid experience working with Informatica data tools, with a focus on Informatica IICS, PowerCenter, and DEI/BDM.
• Solid experience with unit testing, system integration testing, and user acceptance testing.
• Experience designing and developing data loading processes to load data from a wide range of sources, such as Oracle, flat files, web services, and the Azure cloud.
• Expertise in developing and implementing business logic through SQL.
• An active learner who is passionate about data and new technologies and comfortable recommending new and better ways to do things.
• Experience with Git or a similar version control/source code management tool is desired.
• Experience developing reusable code, transformations, and templates to reduce redundant coding, shorten development time, and improve loading performance at both the mapping and session level.
• Ability to work in a diverse work environment.

#LI-LL1