NB: Python, Django, PostgreSQL, MySQL, cloud experience, and REST services are essential.
You have the following technical competencies:
- Minimum of 5 years' experience in software development, data warehousing, big data, or DevOps.
- Experience with Apache Airflow.
- Experience with cloud technologies.
- Some experience with the following stack: Kubernetes, Docker, rkt, Apache Beam, PySpark, Traefik, BigQuery, Bigtable, Cloud Spanner, or whatever else is the best tool to get the job done.
You have the following track record:
- Bachelor's degree in Computer Science or a related field.
- Experience as a team leader.
- Experience in taking responsibility for delivery and quality.
- Proven ability to lead.
- Proven track record of delivering projects successfully.
- Experience in breaking down high-level epics into manageable stories for teams.
- Experience in the architectural design of platform components.
- The ability to provide input for architectural decisions based on the practical implications of the technologies, not only their theoretical benefits.
What other personal competencies would you need?
- The ability to solve problems.
- The ability to approach a problem from different angles, to see whether solutions can be found in other ways.
- The ability to work in an ever-changing, unstructured environment.
- The ability to work as part of a team, with vastly differing skill sets and opinions.
- The ability to contribute ideas to the group.
- The ability to mentor and provide guidance for other team members.
- A systems approach to thinking, as opposed to a siloed approach. The candidate needs to understand how their work affects the greater system.
- The ability to work without supervision, and take accountability for the work they deliver.
- The ability to liaise with clients, sifting through the fluff and extracting the actual requirements.