This role follows a hybrid working arrangement with our client: 3 days per week onsite in the Nonthaburi area. There is a strong possibility of conversion to a full-time position with us.
In this role, you will get to:
- Design, build, and maintain scalable and efficient data pipelines and ETL processes
- Develop and implement data models, data warehouses, and data lakes
- Monitor and optimize the performance of our data infrastructure
- Collaborate with cross-functional teams to identify and gather data requirements
- Ensure data quality and integrity by implementing data validation and cleansing procedures
- Troubleshoot and resolve data-related issues in a timely manner
- Stay up-to-date with new technologies and best practices in data engineering
You’ll be successful if you have:
- At least 2 years of hands-on experience in data engineering
- Extensive experience with Databricks, including designing and managing scalable pipelines and hands-on PySpark development in Databricks notebooks
- Proven experience with ETL tools or frameworks (e.g., Airflow, dbt, Informatica, Talend)
- Solid understanding of data warehousing and lakehouse architecture
- Proficiency in SQL and one or more programming languages (e.g., Python, Scala)
- Experience with cloud platforms such as AWS, GCP, or Azure, and cloud data warehouses (e.g., BigQuery, Redshift, Snowflake)
- Comfortable working independently and able to manage priorities in a contract role
- A good command of English (reading, writing, and speaking) is preferred
APPLY NOW!

Be a part of the Data Engineering team
“Let's power industry leaders with our AI engine”
Interview process: up to 5 steps, depending on the department.
This is your chance to build your career in a growing, data-driven industry.
Because we cultivate intelligence and learning and help people grow beyond their potential.
We seek passion and ambition to turn data into action!
Copyright © 2026 Sertis Co.,Ltd. - All rights reserved.





