Our partner is hiring a Master Data Management Consultant. Ideal candidates will have implementation experience with SAP ERP (specifically SAP ECC and/or S/4HANA).
What you’ll do:
- Act as an SME for a global SAP ERP project
- Utilize your Data Governance and Data Management expertise to guide the project
What you’ll bring:
- SAP systems knowledge
- Data Governance and Data Management experience
- Domain expertise: Material Master, Business Partner, and/or Finance Master
#LI-TB1
Our health-tech partner is seeking a Senior Data Engineer to join their team! This is an exciting new role with a fast-growing, funded company in Wisconsin.
Their data engineers design and develop data pipelines that integrate disparate systems and curate data for machine learning. This role will leverage your strong understanding of data modeling principles and modern data platforms to develop intuitive data model architectures, extract-load-transform (ELT) processes, and testing infrastructure to ensure reliable, robust data. Their engineers are motivated not only to research new solutions but also to own problems end-to-end.
The ideal candidate has 4+ years of experience working as a Data Engineer supporting ELT data pipelines and workflow orchestration tools. Experience with healthcare data will set you apart, as will experience working in a startup environment.
JOB RESPONSIBILITIES
- Design, build, and implement generalized, large-scale, sophisticated, high-volume data pipelines for downstream analytics and data science.
- Perform continuous integration to ensure that every step of a data pipeline is testable and automated
- Lead technical design of scalable and flexible data architectures
- Re-architect existing code bases when transitioning to new technologies and frameworks
- Document code, provide progress reports and perform code review and peer feedback
- Track milestones, activities and interdependencies across projects and tasks, with frequent status updates to stakeholders
- Continuously evaluate and identify improvements in the system processes and architecture
- Assist in maintaining data quality and fidelity in production systems
- Participate in Agile planning around data feature requests and advocate for the best data engineering projects in priority planning
- Mentor data engineering team members on coding, architecture, and data engineering processes
QUALIFICATIONS
- Minimum of a Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or a related field; ideally, a Master's in a related field
- 6+ years of experience in a data engineering role
- 6+ years of programming experience, preferably in Python
- 6+ years of experience in cloud data engineering on the AWS stack; AWS certification is a plus
- 6+ years of experience in working with large data sets and pipelines (e.g., Hadoop, Spark, HBase)
- 6+ years of SQL experience, including performance tuning and query optimization
- Experience with healthcare data is strongly preferred, specifically Epic, HL7, EDI, and CDI
- Expertise in creating and maintaining production data pipelines using Airflow
- 3+ years of experience with schema design and data modeling
- Expertise in code versioning and collaboration tools/repositories such as GitHub
- Experience with container orchestration and deployment frameworks (e.g., Kubernetes, Docker) is preferred
- Strong debugging, critical thinking, and technical design skills with the ability to learn and apply new technologies quickly
- Excellent communication skills, presenting recommendations, design ideas, and analysis to the team and stakeholders
- Experience in leading and mentoring other data engineers
Our partner is seeking a collaborative Senior Delivery Engineer to join their rapidly growing team. They make transformative ROI possible for their clients and partners by meeting them right where they are and automating their existing business rules – that’s where you come in. In this highly visible role, you will bring our partner's clients and partners into their automated future, from project discovery and implementation through testing and, finally, go-live.
Reporting to the Lead Delivery Engineer, you will configure the core product for new deployments and design new modules that connect to the core product to automate their clients' more specialized business rules and support their regulatory requirements. You’ll work closely with their Customer Success Team to triage, investigate, and close support tickets, while also working closely with the core Product Engineering Team to design new modules and protocols.
About You:
- You have direct experience in the design, build, deployment, version control, configuration, and testing of applications using modern best practices.
- You enjoy providing solutions to business problems in a fast-paced team, and have a background in support, delivery or implementation.
- You’re opinionated about tooling and curious about new trends and technologies in the software development world.
- You like to work collaboratively with developers, data scientists and UX teams to improve the quality and resiliency of the products you’re using and supporting.
- You are familiar with serverless, virtualization, and containerization strategies and technologies (Docker, Lambda, etc.) and have a passion for learning more about them.
Required Qualifications:
- 4+ years of demonstrated hands-on software development experience in Python.
- Experience working with AWS cloud architecture, including products such as RDS, S3, ECS, Lambda, DynamoDB, API Gateway, and CodeDeploy.
- Experience working in highly regulated industries like finance, health care or defense.
- A working knowledge of several development languages.
- Experience working on an Agile team as a Lead or Senior Engineer.
- Excellent verbal and written communication skills, with a focus on good technical writing.
#LI-TB1