Carex is recruiting for a Software Engineer with experience in application development and/or data engineering.
What you’ll do:
Participate in tactical initiatives such as designing, developing, and reviewing code, with the opportunity to contribute to strategic visioning, the introduction of new technologies, design, improved metrics, and process improvements
Be part of the full application lifecycle (design, develop, test, deploy and maintain), innovating in each step
Design and implement highly complex technical solutions for data engineering and analytic systems using new methodologies and emerging technologies
Design, build, manage and optimize data pipelines for data structures encompassing data transformation, data models, schemas, metadata, data quality, and workload management
Design APIs and work with other teams to build integrations
Develop full slices of the application from the UI to the data store
Share responsibility with your teammates for occasional after-hours on-call rotation for support of production level systems
Design, develop, execute, and maintain complex automated test code, scripts, data, and associated drivers per recognized SDLC methodology
Participate in and lead design and code reviews
What you’ll bring:
Required:
Must be presently authorized to work in the U.S. without a requirement for work authorization sponsorship by our company for this position now or in the future
Minimum of a high school diploma or GED
Must be at least 18 years of age
Must be located in one of the following locations: AZ, CA, FL, IL, LA, MD, MI, MN, MO, NJ, NV, NY, OH, OR, TN, TX, VA or WI
For flexible or fully remote work-from-home positions, a reliable high-speed internet connection and a dedicated workspace are required
Occasional travel to company offices or meetings as required
3+ years of professional (post-graduate) experience in an application development or data engineering role at the enterprise level, with experience in the following:
Experience in building data lakes and data marts
Experience with data pipeline and workflow management tools
Experience building data and ML models and transformation rules
Experience with data visualization tools such as Tableau, Power BI, or Looker
Experience building processes supporting data transformation, data structures and metadata
Advanced SQL knowledge and experience working with relational databases, NoSQL databases, and SQL query authoring
1+ years of Data Warehouse, Data Lake, BI experience
1+ years of experience with an object-oriented programming language, preferably Scala, Java, or Python (including Spark development)
Experience with Google Cloud or another public cloud platform
Experience with Docker, Kubernetes, or other containerization and cloud-native platforms
Experience developing software in a SaaS environment using CI/CD and DevOps methodology
Experience with automated test development and execution (for example, REST Assured, Selenium)
Experience in an Agile working environment
Preferred:
Bachelor's or Master's degree in Computer Science, Computer Information Systems, Management Information Systems, or a related field
Experience creating and consuming RESTful and/or SOAP APIs
Hands-on experience implementing a data transformation layer using modern technologies such as Spark, Hive, and Hadoop
Experience with BigQuery, Snowflake, or other cloud or on-premises data warehousing technologies
Experience with Apache Kafka or Confluent Cloud
Experience with Machine Learning
Experience designing and implementing data pipelines