IT - Application & Software Development
Apr 13, 2021
Primary responsibilities include (but are not limited to):
- Proficient in SQL, Python, Power BI, and Agile practices
- Experience in understanding business challenges and translating them into technical requirements
- Experience with data modeling, data warehousing, and building end-to-end ETL pipelines
- Experience in Business Intelligence (BI) tools and visualization software (e.g., Power BI, Tableau) to empower non-technical, internal customers to drive their own analytics and reporting
- Create and manage large datasets by extracting, transforming, combining, and loading data from various heterogeneous data sources
- Comfortable with ambiguity and motivated to work in a fast-paced environment; enjoys the challenge of unfamiliar tasks and is able to learn on the go and deliver solutions
- Experience in writing complex, highly-optimized SQL queries across large data sets
- Experience with at least one relational database technology such as MS SQL, Snowflake, Oracle or Teradata
- Good to have (optional): Experience with Azure DevOps, Docker, Terraform, and orchestration tools such as Airflow or Luigi; any web-app development experience and comfort working with APIs; development experience in a CPG/FMCG company
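To illustrate the kind of SQL work the requirements above describe (joining heterogeneous sources and aggregating across them), here is a minimal, hypothetical sketch using Python's standard-library sqlite3 module; all table and column names are invented for the example:

```python
import sqlite3

# Hypothetical sources: an orders table and a product reference table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, product_id INTEGER, qty INTEGER, unit_price REAL);
    CREATE TABLE products (product_id INTEGER, category TEXT);
    INSERT INTO orders VALUES (1, 10, 2, 5.0), (2, 11, 1, 20.0), (3, 10, 3, 5.0);
    INSERT INTO products VALUES (10, 'snacks'), (11, 'beverages');
""")

# A join-and-aggregate query of the kind hinted at in the requirements:
# combine two sources and compute revenue per category.
rows = conn.execute("""
    SELECT p.category, SUM(o.qty * o.unit_price) AS revenue
    FROM orders o
    JOIN products p ON p.product_id = o.product_id
    GROUP BY p.category
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # → [('snacks', 25.0), ('beverages', 20.0)]
```

In production the same pattern would run against a relational store such as MS SQL, Snowflake, Oracle, or Teradata rather than an in-memory database.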
We require a Developer to build ETL pipelines, automations, and desktop applications to optimise current business processes. The ideal candidate will be a self-starter: highly analytical, comfortable diving deep, and focused on making data-driven decisions with a strong customer focus. They will multi-task, navigate ambiguity, adapt quickly to changing business requirements, learn new systems, gain new skills, and find creative, scalable solutions to difficult problems.
In this role, you will primarily build ETL solutions from initial experimentation to application deployment, including but not limited to the following:
- Identify gaps and improvement opportunities in existing dataflows
- Design, implement, and maintain ETL pipelines for large datasets
- Deliver major and minor enhancements to existing data pipelines
- Maintain data integrity, availability, and auditability
- Drive operational excellence through process automation and the adoption of new technologies and best practices
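The extract-transform-load responsibilities above can be sketched as a minimal Python flow with a basic integrity check; the source data, table name, and validation rule here are hypothetical placeholders, not a prescribed implementation:

```python
import csv
import io
import sqlite3

# Hypothetical raw input: CSV rows as they might arrive from an upstream system.
RAW = "region,units\nNorth,10\nSouth,\nNorth,5\n"

def extract(text):
    """Read raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Apply a simple integrity check: reject rows with missing units, cast types."""
    clean, rejected = [], []
    for r in rows:
        if r["units"]:
            clean.append((r["region"], int(r["units"])))
        else:
            rejected.append(r)  # retained, not dropped silently, for auditability
    return clean, rejected

def load(clean, conn):
    """Load cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, units INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract(RAW))
load(clean, conn)
total = conn.execute("SELECT SUM(units) FROM sales").fetchone()[0]
print(total, len(rejected))  # → 15 1
```

Keeping rejected rows alongside loaded ones is one simple way to preserve the data-integrity and auditability properties the role calls for; a real pipeline would typically write them to a quarantine table or log.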