We're excited to offer this opportunity to join our team. Take a look at the details of this position and apply if you feel you're a good fit. We look forward to receiving your application.
Design and implement innovative data pipeline solutions.
Maintain and manage data warehouse technologies.
Participate in the development of ETL scripts from multiple sources.
Work with the client and the team to understand requirements, evaluate new features, and help design solutions.
Write project documentation.
5+ years of professional experience as a Data Engineer.
5+ years of professional experience with SQL, data warehousing, and building ETL (Extract, Transform, Load) pipelines that ingest data from multiple sources.
2+ years of professional experience using Spark.
3+ years of professional experience with at least one cloud platform (Azure, AWS, or Google Cloud Platform).
2+ years of professional experience developing and maintaining big data pipelines.
3+ years of professional experience with at least one programming language such as Python, Java, or Scala.
You don't have to meet every requirement to apply.
If you think you're a good fit, apply and we'll take it from there.