Tech Lead - Data Engineering

16 09 2024
Pune, India
IS&Digital
Regular


Design and impact analysis

  • Contributes to defining feasibility (technical options, etc.), functional design, and solution validation.
  • Performs impact analysis related to data capture (system performance in production, security, etc.).
  • Identifies the source of physical data.
  • Enforces security and confidentiality rules for data within the role's perimeter.

Implementation and Deployment

  • Completes and executes data collection procedures.
  • Builds data collection infrastructure (IoT sensors, databases, files, etc.).
  • Deploys data collection infrastructure and procedures.
  • Measures the impact on the technical chain implemented (from the connected object to the storage system).
  • Completes the necessary tests to validate the solution.
  • Documents the implemented solutions.

Support and troubleshooting

  • Analyzes and understands the origin of a complex malfunction, incident, or bug.
  • Adopts a proactive approach to avoid problems or identify their root causes.
  • Provides technical assistance to users.

  • Overall responsibility for technical product quality, including the data, visualization, and ML subject areas, spanning platforms such as CDL, PBI, and AICP.
  • Coordinates with the Data Science Lead, QA Lead, and Architects on technical quality elements.
  • She/He is the guarantor of quality access to data sources.
  • She/He is responsible for managing the data and guarantees the quality of its use (referencing, standardization, and qualification) in order to facilitate its use by the teams (Data Analysts and Data Scientists).
  • She/He also contributes to the elaboration of the data policy and the structuring of its life cycle within the regulatory framework in force, in collaboration with the Chief Data Officer.
  • Her/His scope of intervention centres on application systems in the data management and processing domain, and on platforms such as Big Data, IoT, etc.
  • She/He is responsible for overseeing and integrating data of various types originating from these different sources, and confirms the quality of the data entering the Data Lake (she/he receives data, deletes duplicates, etc.).
  • Supports the Scrum Master and Project Manager with technical clarifications, validations, and contributions to planning as needed.
  • She/He must also be able to work as an individual contributor (as and when needed) and can perform the following operations:
    • Captures the structured and unstructured data produced within different applications or outside the entity.
    • Integrates the components.
    • Structures the data (semantics, etc.).
    • Maps the available components.
    • Cleans up the data (deleting duplicates, etc.).
    • Validates the data.
    • Where appropriate, creates the data repository.
  • Makes suggestions to the architecture group on infrastructure requirements for data acquisition from disparate data sources.
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for better scalability, etc.
  • Creates and maintains robust and stable data pipeline architecture.
  • Builds, tests, and deploys data/AI products.
  • Is involved in the development of AI-based APIs.
  • Collaborates with various stakeholders for end-to-end delivery of technical pipelines:
    • Partners with the Data Science group to integrate AI/ML algorithms (written by another team) into an enterprise product.
    • Partners with Product Management, Data Architects, and the Platform team to ensure alignment between AI work and business objectives.
  • Has a fair understanding of DevOps/MLOps and of setting up CI/CD pipelines.
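Two of the individual-contributor operations listed above, deleting duplicates and validating records before they enter the Data Lake, can be sketched in plain Python. This is a minimal illustration only; the record shape and field names (`id`, `value`) are hypothetical and not part of the role's actual stack:

```python
def deduplicate(records):
    """Keep the first occurrence of each record id, drop later duplicates."""
    seen = set()
    unique = []
    for record in records:
        if record["id"] not in seen:
            seen.add(record["id"])
            unique.append(record)
    return unique


def validate(records, required_fields=("id", "value")):
    """Keep only records whose required fields are all present and non-None."""
    return [r for r in records
            if all(r.get(f) is not None for f in required_fields)]


# Hypothetical incoming batch: one duplicate and one incomplete record.
raw = [
    {"id": 1, "value": 10},
    {"id": 1, "value": 10},    # duplicate, dropped by deduplicate()
    {"id": 2, "value": None},  # missing value, rejected by validate()
    {"id": 3, "value": 30},
]
clean = validate(deduplicate(raw))
```

In a real pipeline these checks would typically run inside an ingestion framework rather than as standalone functions, but the logic, keyed deduplication followed by schema-level validation, is the same.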
