- Design, develop, and maintain end-to-end data solutions using open source, modern data lake, and enterprise data warehouse technologies (Hadoop, Spark, Cloud, etc.).
- Contribute to multiple data solutions throughout their entire lifecycle (conception to launch).
- Partner with business stakeholders to understand and meet their data requirements.
- Provide ongoing maintenance and enhancements to existing data solutions.
- Maintain security in accordance with client security policies.
- Participate in an Agile development environment.
Education + Experience
- Bachelor's degree in Computer Science, Engineering, or Information Management (or equivalent).
- 5+ years of relevant work experience.
- Professional experience designing, building, and maintaining scalable data pipelines.
- Hands-on experience with a variety of big data technologies (Hadoop / Cloudera, Spark, Cloud, etc.).
- Experience with object-oriented programming languages: Java (required), Python, etc.
- Advanced knowledge of SQL and experience with relational databases.
- Experience with UNIX shell scripts and commands.
- Experience with version control (Git), issue tracking (Jira), and code reviews.
- Proficient in agile development practices.