Job Description
The position focuses on the design, development, and maintenance of data pipelines and on integrating data from different systems. The role also includes building solutions for operational analysis and reporting, maintaining technical documentation, and collaborating with business teams.
Responsibilities:
Integrate data from internal and external systems such as CRM, LMS, marketing platforms and databases.
Develop and maintain integrations based on REST APIs, JSON, and OAuth.
Automate recurring data processes using Python or similar scripting languages.
Design and maintain optimized data models for operational analysis and reporting.
Work with cloud data warehouses such as BigQuery or Snowflake to ensure efficient storage and retrieval.
Implement data validation and monitoring mechanisms to ensure accuracy and reliability.
Proactively monitor and resolve data pipeline failures.
Create dashboards and data apps using low-code or no-code platforms such as Power BI, Retool, Coda, Tableau, or Google Sheets.
Translate data sets into clear, actionable information for business stakeholders.
Collaborate with analysts and business teams to define KPIs and reporting requirements.
Suggest improvements to existing data flows and business processes.
Maintain structured and complete technical documentation related to pipelines, integrations, data definitions, and architecture.
Participate in interdisciplinary discussions to align technical implementation with business needs.
Requirements:
Proficiency in Python or similar scripting language for data manipulation and automation.
Knowledge of SQL and experience in data modeling.
Hands-on experience with REST API, JSON, and OAuth integrations, and with related tools such as Postman.
Experience working with cloud platforms such as GCP or AWS and with cloud data warehouses.
Experience with orchestration tools such as Airflow, dbt or similar frameworks.
Familiarity with Git and CI/CD processes.
Experience with low-code or no-code platforms for data visualization or workflow automation.
Understanding of business processes and KPI-based decision making.
2 to 4 years of experience in data engineering or a data-focused role.
Nice to have:
Experience developing and customizing internal tools using Retool with knowledge of JavaScript.
Experience optimizing query performance and cost in cloud environments.
Exposure to real-time data streaming solutions.
Experience supporting AI or ML data pipelines.
Data storytelling and communication skills.
Salary:
To be agreed.