The IC4 - Sr Data Engineer role at SILMC leads the design, development, and optimization of advanced data solutions, ensuring scalability, reliability, and alignment with business objectives. This position is critical for defining data architectures, mentoring team members, and promoting best practices in data engineering. The Senior Data Engineer fosters a data-driven culture and delivers measurable business value, with a strong focus on performance, security, and governance.
Responsibilities:
- Lead the design and implementation of robust and scalable data pipelines.
- Collaborate with cross-functional teams to gather requirements and define technical solutions.
- Develop and optimize data models using industry-standard techniques.
- Implement advanced data processing workflows (batch, microbatch, near real-time, and real-time).
- Drive the adoption of data governance and privacy strategies.
- Design and maintain data architectures for data lakes, warehouses, and real-time streaming platforms.
- Develop monitoring processes and data quality metrics.
- Contribute to all phases of the data engineering lifecycle, adhering to Agile methodologies.
- Mentor and guide junior and mid-level engineers.
- Maintain and document data architectures, processes, and technical workflows.
- Lead the integration of APIs and external data sources.
- Manage version control and repository practices.
- Promote an autonomous work culture.
- Serve as a Spin Culture Ambassador.
Requirements:
- Minimum of 5–6 years of experience in data engineering or related fields.
- Advanced technical expertise in software development life cycle (SDLC) methodologies.
- Advanced understanding of design patterns.
- Advanced experience in Python, Java, and data-related frameworks.
- Advanced knowledge of dimensional data modeling and of storage techniques for structured, semi-structured, and unstructured data (including NoSQL), as well as ETL and ELT construction.
- Advanced knowledge of file formats and processing: CSV, JSON, Parquet, YAML, images, and video.
- Advanced knowledge in data processing: batch, microbatch, near real-time, and real-time.
- Advanced understanding of, and experience contributing to and implementing, data privacy strategies.
- Advanced understanding and experience in implementing and maintaining robust data governance strategies.
- Advanced knowledge of the data engineering lifecycle.
- Advanced knowledge of processing internal and external data sources of different types, including API management.
- Advanced understanding of architecture (business, solution, data, etc.).
- Advanced mastery of version control with Git, GitHub, and GitLab.
- Advanced business vision.
- Advanced mastery of the Google Cloud Platform (GCP) data stack, evidenced by delivered projects.
- Advanced mastery of the Amazon Web Services (AWS) data stack, evidenced by delivered projects.
- Advanced experience in developing and optimizing SQL and related queries.
- Able to effectively socialize the design of data solutions without supervision.
- Advanced experience in Project Management with Agile methodology (Scrum or Kanban).
- Experience collaborating on projects to achieve OKRs, flagging risks early, and delivering business value.
- Advanced competence in transparently communicating project status.