
Data Architect (Remote)

  • Location: Remote, Wisconsin
  • Type: Direct Hire
  • Job #2865

Carex is currently on the hunt for an Enterprise Data Architect. This person will work closely with IT Engineering Leadership, Product, and Business Development to understand our client partner's vast data footprint and the value that data offers to the target market. You will collaborate with IT Engineering Leadership and Product Leadership on a future-state vision for their data portfolio, including the software, resources, and consultants needed to develop industry-leading data practices. You will work closely with software and security architects and engineers to ensure our overall IT delivery strategy is clearly understood. You must be able to maintain effective relationships with all levels of company management, as well as with internal and external customers. You will also be required to establish and monitor key performance metrics to ensure compliance with established standards, processes, and procedures.

Our client partner provides a SaaS business-to-business solution that enhances the customer experience from the moment after an accident to the completion of the claim. Their integrated software platform enables communication and collaboration among tens of thousands of collision repair shops, insurance providers, and other industry professionals around the world. They are a remote-first organization with remote and hybrid employees aligned to flexible teams from office locations in California, Missouri, Wisconsin, and the United Kingdom. They are a global organization that strives to provide an inclusive environment where all employees can thrive. Their products reflect the diversity of the team, and they work to ensure that their products meet the needs of all customers. They recognize the value of diverse perspectives in everything they do and strive to ensure employees of all backgrounds feel empowered to voice their ideas and bring their authentic selves to work. They achieve these priorities through inclusive programs, benefits, and initiatives that are integrated into the fabric of how they work every day. They refine and challenge their agile mindset through various communities of practice, days of development, and innovation days. Their culture values diversity, engagement, and discovery.

What you’ll do:

  • Place emphasis on strategic visioning: overall design, introduction of new technologies, improved metrics, and process improvements in data warehouse design, data modeling, and data governance
  • Build and operationalize a data strategy that aligns with our multiple cloud-driven microservices architectures
  • Drive adoption of the target-state architecture by executing on the strategy, bringing the leadership qualities necessary to drive change and adoption
  • Work with our Product Managers to define the technical strategy for our data-driven products and applications
  • Develop architectures for highly scalable, fault-tolerant products using technologies such as Confluent Cloud and various databases, including SQL Server, MySQL, Postgres, and MongoDB
  • Provide technical and architectural oversight of data pipelines and systems that must be reliable, massively scalable, highly available, and maintainable
  • Work with Security Governance to develop actionable data privacy policies, including segmentation, encryption, and categorization of data elements
  • Work with the product organization to develop features in our product pipeline that transition our development teams to our longer-term platform strategy
  • Collaborate and coordinate with multiple departments, stakeholders, partners, and external vendors on the overall strategy
  • Define solution-level architecture for agile teams, including guidance on development tools, target platforms, operations, and security
  • Analyze and improve the efficiency, scalability, and stability of databases and the APIs delivering data while driving impactful business value
  • Analyze, design, and develop data models; design and optimize ETL processes

What you’ll bring:

Required:

  • 10+ years of experience setting design patterns for application integration, data governance or analytics development
  • 5+ years of experience in a data architect, data analyst, or data modeler role for a large, complex business
    • producing data deliverables for BI or analytics solutions
    • direct hands-on experience designing and developing data solutions and data modeling, including BI solutions with multi-tenancy
  • 3+ years of experience planning, estimating, and sizing cloud-based data opportunities in a global delivery model (Google Cloud, Azure, or AWS)
  • 3+ years of experience working with various go-to-market channels, supporting sales and driving business
  • Experience in an Agile (Scrum or Kanban) working environment
  • Experience with global data protection and governance
  • Experience with various design and architectural patterns
  • Must be presently authorized to work in the U.S. without a requirement for work authorization sponsorship by our company for this position now or in the future
  • Must be at least 18 years of age
  • High school diploma or GED, at minimum
  • For flexible or fully remote work-from-home positions, a reliable high-speed internet connection and a dedicated workspace are required
  • Must be located in AZ, CA, FL, IL, LA, MD, MI, MN, MO, NJ, NV, NY, OH, OR, TN, TX, VA, WA or WI

Preferred:

  • Bachelor's or Master's degree in Computer Science, Computer Information Systems, Management Information Systems, or a related field of study, or equivalent technical experience in a professional environment
  • Microsoft-certified Azure Data, Cloud, or Security Architect
  • Google Cloud Platform (GCP) certification (or comparable experience)
  • Experience with private and public cloud architectures
  • Experience in cloud native computing (Docker, Kubernetes)
  • Experience in Azure cloud computing (PaaS and IaaS)
  • Experience in Data Science
  • Experience with organizing and structuring data to utilize ML (machine learning) and other predictive analytics
  • Experience with SQL and NoSQL database structures and with DB migration scripts (SQL Server, MySQL, MongoDB)
  • Experience with technology stacks available in the industry for data cataloging, data ingestion, capture, processing and curation: Kafka, StreamSets, Attunity, Collibra, Map Reduce, Hadoop, Spark, Flume, Hive, Impala, SparkSQL

#LI-TB1
