Data Engineer


StudioC is led by experts who understand church communication, and we are on a mission to bring clarity and purpose to church member engagement. Our team is passionate about equipping church leaders with the resources and insights necessary to deliver personalized member communication. We do this by delivering innovative solutions that help our ministry partners engage their members in transformational next steps. These solutions include our Marketing & Strategy Consulting, Member Engagement Software (MES), marketing campaign development, and analytics and insights.


The Data Engineer plays a key role in integrating clients' church management system(s) (ChMS) with the StudioC data warehouse and Member Engagement Software (MES), with a particular focus on clients that use Rock RMS. Logic mapping and data integration enable personalized messaging, data hygiene, and data analytics that facilitate member engagement and improve organizational growth. The Data Engineer will identify and apply strategies to troubleshoot integrations and provide subject matter expertise on Rock RMS data structures and integrations.

Our ideal candidate has an internal drive for growth, self-initiative, extreme attention to detail, and a strong desire to solve real business and data problems with repeatable, supportable processes.


  1. Connect client ChMS and other data sources to the StudioC data warehouse, normalizing data for the most effective data aggregation.
  2. Serve as a resident expert in Rock RMS integrations.
  3. Ensure client data is secure and integrated in the most efficient and effective manner.
  4. Identify and confirm data/logic mapping to meet business requirements.
  5. Create, monitor and improve Extract, Transform and Load (ETL) processes to meet business requirements.
  6. Be a team player in a way that builds confidence, creates value, and increases the team's dependability as a whole.

Essential Job Functions:

  1. Manage the Extract, Transform, and Load (ETL) process team; these resources create, monitor, and manage an environment that runs several hundred ETL processes per day.
  2. Ensure all our ETL processes are secure, scalable, responsive, and reliable.
  3. Collaborate with on-call engineers to troubleshoot issues and minimize downtime.
  4. Benchmark, analyze, report on, and make recommendations for the improvement and growth of cloud operations and infrastructure.
  5. Develop and implement instrumentation for monitoring the health and availability of services, including fault detection, alerting, triage, and recovery.
  6. Implement appropriate logging, monitoring, and reporting capabilities.
  7. Support AWS RDS for PostgreSQL and Microsoft SQL Server database performance, maintenance, and troubleshooting.
  8. Code and support all Rock RMS ETL processes, including research, data mapping, coding API data retrieval, creating data transformations, and managing all Rock RMS data-loading programs.
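To illustrate the shape of item 8, here is a minimal extract-and-transform sketch in Python. The `Authorization-Token` header and the `People` endpoint follow Rock RMS's REST API conventions, but the base URL, warehouse schema, and field handling below are assumptions for illustration, not StudioC's actual integration code.

```python
# Minimal extract/transform sketch for Rock RMS person records.
# ROCK_BASE_URL is a hypothetical instance; the warehouse row
# layout in transform_person() is likewise an assumed schema.
import json
import urllib.request

ROCK_BASE_URL = "https://rock.example.org/api"  # hypothetical instance


def fetch_people(api_key, page_size=100):
    """Extract: pull person records from the Rock RMS REST API.

    Rock's REST API is typically authenticated with an
    Authorization-Token header; confirm against your instance.
    """
    req = urllib.request.Request(
        f"{ROCK_BASE_URL}/People?$top={page_size}",
        headers={"Authorization-Token": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def transform_person(record):
    """Transform: normalize a Rock person record into a flat,
    hygienic warehouse row (hypothetical schema)."""
    return {
        "person_id": record["Id"],
        "first_name": (record.get("NickName")
                       or record.get("FirstName") or "").strip(),
        "last_name": (record.get("LastName") or "").strip(),
        # Lowercase and trim email for de-duplication downstream.
        "email": (record.get("Email") or "").strip().lower(),
    }
```

The load step would then bulk-insert the transformed rows into the warehouse (e.g., via an RDS PostgreSQL `COPY` or batched `INSERT`), which is omitted here.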


Education: Bachelor’s degree in Computer Science, Engineering, Management Information Systems, or equivalent education and/or work experience

Experience & Competencies Desired 

  1. 7+ years of relevant experience with cloud technologies, especially AWS.
  2. Excellent team player who works with multidisciplinary teams (developers, architects, operations, and testers) to fully automate and build out new application environments in the cloud, spearheading automation and high-performance system projects.
  3. 3+ years of relevant experience working with Rock RMS.
  4. Understanding of Rock RMS, specifically the Rock RMS API.

Basic Qualifications

  1. Bachelor's degree or 5+ years of professional or military experience.
  2. 7+ years of experience as a technical specialist.
  3. 5+ years of hands-on programming experience in languages including .NET, .NET Core, C#, .NET MVC, Vue, and JavaScript.
  4. 3+ years of hands-on experience implementing Rock RMS.
  5. Experience developing cloud-native CI/CD workflows and tools, such as AWS CodeDeploy and/or GitLab.
  6. Experience with the full software development lifecycle and delivery using Agile practices.

Preferred Qualifications

  1. Experience with AWS Elastic Beanstalk in production environments.
  2. AWS certification(s) such as Solutions Architect Professional, DevOps Engineer Professional, SysOps Administrator, or Developer Associate.
  3. Strong presentation, verbal communication, and written communication skills.
  4. Ability to lead effectively across organizations and engagements, preferably from a professional services organization or similar.

Character & Chemistry

  1. Personal integrity in all areas of life

General Information

  1. Average Hrs/Week: 40
  2. FLSA Status: Exempt
  3. Department: Engineering
  4. Reports to: Director of Engineering
  5. Location: Phoenix, AZ
  6. Date Revised: 2022.06

The above statements are intended to describe the general nature and level of work being performed by individuals assigned to this position. They are not intended to be an exhaustive list of all duties, responsibilities, and skills required of employees.
