Key facts:
- Role: Data Engineer
- Experience: 3+ years (open to various seniority levels depending on fit)
- Location: Remote
- Contract type: Retainer with a pre-agreed daily rate
Company introduction:
DataSow is a data consultancy firm focused on bridging the gap between business strategy and technical implementation. We are growing rapidly and looking for talented individuals to join our journey. We pride ourselves on combining professionalism with a fun, supportive work environment, where everyone has the space to thrive and make an impact.
Job description:
We are looking for Data Engineers who can help us deliver end-to-end data solutions in cloud-based environments. You'll be responsible for building robust data pipelines, enabling analytics and machine learning use cases, and translating business needs into scalable technical implementations.
We're open to a range of seniorities—from strong junior engineers who want to grow quickly, to experienced professionals ready to take the lead on architectural decisions. What matters most is your ability to work hands-on and your motivation to learn and deliver value.
You will work on diverse projects across cloud platforms such as Azure and AWS, using tools like Databricks, Snowflake, Kafka, and dbt.
Main tasks & responsibilities:
- Design and build modern data pipelines (ETL/ELT) and transformation workflows
- Integrate data from structured and unstructured sources (e.g. APIs, databases, files)
- Work with cloud platforms like Azure and AWS
- Develop solutions using modern tools such as Databricks, Snowflake, Kafka, and dbt
- Collaborate with client teams to understand requirements and deliver business value
- Ensure high standards for performance, security, and maintainability
Required skills & qualifications:
We’re looking for people who have experience with some of the following technologies and are eager to grow in others:
- At least 3 years of hands-on experience with cloud platforms (Azure, AWS, or GCP)
- Experience with data transformation tools (e.g. dbt, Databricks, or custom Python-based solutions)
- Familiarity with orchestration tools (e.g. Airflow, Azure Data Factory, or similar)
- Strong understanding of data warehousing and data modeling concepts, including batch processing, storage, and transformation of structured datasets
- A strong sense of ownership and ability to work independently
- Effective communication skills and proficiency in English
Why us?
💸 Competitive compensation: we pay at or above market rates
🧘 Flexible and remote-first: work from anywhere, on your schedule
🎯 Outcome-driven environment: what matters is what you deliver
💬 Supportive team culture: we collaborate, give feedback, and grow together
🚀 Learning and growth: you’ll work with senior professionals and have access to challenging, impactful projects
🧠 Room to shape your role: whether you’re early in your journey or more experienced, we support your growth path