On behalf of our client, a global technology and engineering company, we are seeking a skilled and motivated Data Engineer. In this role, you will design, build, and maintain scalable data pipelines and infrastructure that support data analytics, reporting, and business intelligence. You will collaborate closely with data scientists, analysts, and software engineers to ensure that data is accessible, reliable, and processed efficiently.
Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines to ingest and transform data from multiple sources.
- Build and manage scalable data infrastructure (e.g., data warehouses, data lakes).
- Ensure the quality, integrity, security, and compliance of data across all systems and pipelines.
- Collaborate with stakeholders to understand data requirements and support data-driven initiatives.
- Optimize data workflows and processing for improved performance and cost efficiency.
- Develop and maintain data models, schemas, and metadata.
- Monitor and troubleshoot data-related issues in production environments.
Requirements:
- At least 2 years of experience as a Data Engineer or in a similar role.
- Strong expertise in SQL and experience with both relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB).
- Experience with data pipeline tools and frameworks (e.g., Apache Airflow, Spark, Kafka, dbt).
- Proficiency in programming languages such as Python, Scala, or Java.
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and cloud data services (e.g., Redshift, BigQuery, Snowflake).
- Familiarity with media-processing tools such as FFmpeg is a plus.
- Solid understanding of data warehousing concepts and industry best practices.
- Strong problem-solving skills and keen attention to detail.
- Excellent communication and collaboration skills.