Big Data Engineer

Department: Technology
Employment Type: Full Time
Location: Malta
Reporting To: Vronsky Bartolo

Description

As part of GiG's growth strategy, we are establishing a dedicated Migrations Team to lead the migration of brands from legacy systems and other platforms onto our modern GiG CoreX platform. As a Migration Engineer, you will own the technical execution of these migrations: designing, implementing, validating, and executing complex data pipelines to transfer and transform customer and operational data. You will maintain high standards of data integrity, perform rigorous testing and verification, provide support throughout cutover phases, and partner closely with stakeholders (product, infrastructure, operations, compliance) to ensure clients are successfully onboarded onto CoreX.

Key Responsibilities

- Data Pipeline Design & Implementation: Design, develop, and maintain robust data pipelines for the extraction, transformation, and loading (ETL) of large volumes of structured and unstructured data, ensuring scalability and efficiency.
- Migration Execution & Validation: Lead the execution of data migrations, including data extraction, transformation, schema mapping, validation, and cutover, ensuring data integrity and consistency.
- Collaboration & Stakeholder Engagement: Work closely with product, technology, delivery, and compliance teams to gather migration requirements, align on data definitions, and ensure successful migration outcomes.
- Automation & Optimization: Implement automation for ETL and migration processes to improve efficiency and reduce manual intervention.
- Monitoring & Troubleshooting: Monitor and troubleshoot migration pipelines, addressing issues promptly to keep operations running smoothly.
- Documentation & Process Improvement: Document migration processes, contribute to process improvement initiatives, and ensure adherence to best practices.
- Technical Support & Incident Resolution: Provide technical support during the resolution of incidents related to migration pipelines and data discrepancies.
- Agile Participation: Actively participate in agile scrum ceremonies such as daily stand-ups, refinement sessions, and retrospectives.
- Innovation & Continuous Improvement: Propose and implement new ideas to improve existing products and services, fostering a culture of innovation.
- Code Review & Mentorship: Review other engineers' code, ensuring code quality and knowledge sharing within the team.
- Stakeholder Communication: Communicate effectively with stakeholders, ensuring that information is conveyed accurately and reaches the right audience.

Requirements

Essential Skills

- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering, ETL processes, or data migration.
- Strong SQL skills and experience working with structured and unstructured datasets.
- Knowledge of data governance, compliance, and data quality assurance.
- Solution-oriented mindset with the ability to troubleshoot and resolve issues.
- Strong experience with Big Data and streaming technologies, including ClickHouse, Kafka, and NiFi.
- Strong knowledge of microservices, APIs, and GraphQL.

Preferred Skills

- Experience in the iGaming industry or regulated markets.
- Experience with automation of ETL/migration processes.

Benefits

- Great career development opportunities
- Hybrid working model
- International health insurance
- Health and wellbeing package (350 EUR per year)
- Birthday day off
- Me Time - 1 day off per year