DW1233 - Data Integration Backend Developer

IT - Development
Kitchener or Remote
Salary: Not Disclosed
CLT or PJ

Job Description

Recruiting for an information technology company headquartered in Kitchener, Canada.

About the area:

  • The role centers on cloud-based Java & Python applications for data integration: maintaining, enhancing, and developing apps in AWS. Hands-on development knowledge of Java and Python is required, along with Step Functions and the core AWS services used for application development (DynamoDB, S3, API Gateway, CloudWatch, CloudFormation, ...).
  • The new hire will work on data integration tools, collaborating with both the software engineering and data engineering teams. The role is to develop the necessary connectors in our ETL tool, ensure connectivity with data sources is established, and work with the AWS Data Engineers to make sure ingestion runs smoothly. Candidates should be able to explain what these services are used for and describe how they have applied them in their own development work, e.g., for integrating data.
  • The teams will work together to define the functionality the ETL tool needs for all project ingestions and will plan the delivery sprints as a team. Software engineering best practices will be communicated by the Software Engineering Managers. Our two ETL / data integration tools run in AWS: one in Java, the other in Python/CDK.
  • A candidate who knows Java & Python but not AWS is not a fit for the job; a candidate who knows AWS but cannot program in both Java and Python is not suitable either.

Project Length:

  • 8 months  

Main responsibilities:

  • Know the main AWS services we use in our serverless and non-serverless architectures, with a proven track record of working with them (S3, Lambda, Step Functions, EC2, DynamoDB, SNS, API Gateway, CloudWatch, IAM, ...).
  • Comfortable with DevOps practices and Git workflows.
  • Good understanding of how to move data from a source into a data lake: for example, pulling data from an API (e.g., Salesforce) and sending it to the data lake, or extracting it from any standard database technology (Oracle, MySQL, SQL, ...).
  • Able to coordinate and communicate in English with team members.
  • Troubleshooting: support and accountability for issues, roughly 50% of the time;
  • Explain to end users what is and is not available in the APIs;
  • Understand and fix issues in the pipeline;
  • Deploy implementations into production.
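To illustrate the kind of API-to-data-lake work the responsibilities above describe, here is a minimal Python sketch of the landing step: shaping API records into a date-partitioned S3 key and newline-delimited JSON. All names in it (`to_landing_key`, `serialize_records`, the `landing` prefix) are illustrative assumptions, not details from this posting or the company's actual tooling.

```python
import json
from datetime import date


def to_landing_key(source: str, run_date: date, prefix: str = "landing") -> str:
    """Build a date-partitioned S3 key for the data lake landing zone.

    Partitioning by year/month/day is a common lake layout; the exact
    scheme used by the team may differ.
    """
    return (
        f"{prefix}/{source}/year={run_date.year}"
        f"/month={run_date.month:02d}/day={run_date.day:02d}/records.json"
    )


def serialize_records(records: list[dict]) -> str:
    """Serialize API records as newline-delimited JSON (one object per line)."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)
```

With boto3, the output of these helpers would typically be written to the lake with something like `s3.put_object(Bucket=bucket, Key=to_landing_key("salesforce", date.today()), Body=serialize_records(records))`.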

Mandatory Technical Requirements (include years of experience):

  • REST APIs (integration with various web-based APIs)
  • Java 11
  • AWS (CDK)
  • Python   

Desirable Technical Requirements:

  • JDBC  

Soft Skills:

  • Communication: deal with vendors, other team members, and stakeholders to understand the data. Adaptability and a hands-on attitude: the candidate must not mind performing support work 50% of the time. A more senior person with many years of experience might find this frustrating.