Be yourself. Grow your own way. Work on interesting projects.

Informatica (ETL) Developer

Contract Type: Brick and Mortar
Location: Hyderabad - TS
Date Published: 04-16-2026
Job ID: REF41565B

Company Description:

Sutherland is seeking a reliable and technical person to join us as an Informatica (ETL) Developer who will play a key role in driving our continued product growth and innovation. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!

Job Description:

The Informatica (ETL) Developer designs, implements, and maintains complex data engineering solutions on the Business Intelligence and Analytics team.

Responsible for the design, development, implementation, testing, documentation, and support of analytical and data solutions/projects requiring data aggregation, data pipelines, and ETL/ELT from multiple sources into an efficient reporting mechanism and database/data warehouse, using appropriate tools such as Informatica, Azure Data Factory, and SSIS. This includes interacting with the business to gather requirements, analysis, creation of functional and technical specs, testing, training, escalation, and follow-up.
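
As an illustration of the kind of ETL flow described above, here is a minimal sketch in plain Python using SQLite, with hypothetical table names (orders, daily_sales); the actual work in this role would be built with Informatica, Azure Data Factory, or SSIS against enterprise sources:

    import sqlite3

    def run_etl(source_path: str, target_path: str) -> None:
        """Aggregate daily order totals from an operational store
        into a reporting table (all names here are assumptions)."""
        src = sqlite3.connect(source_path)
        tgt = sqlite3.connect(target_path)
        try:
            # Extract: pull raw rows from the operational system.
            rows = src.execute(
                "SELECT order_date, amount FROM orders").fetchall()

            # Transform: aggregate to one total per day.
            totals = {}
            for order_date, amount in rows:
                totals[order_date] = totals.get(order_date, 0.0) + amount

            # Load: rebuild the reporting table in one transaction.
            with tgt:
                tgt.execute(
                    "CREATE TABLE IF NOT EXISTS daily_sales "
                    "(order_date TEXT PRIMARY KEY, total REAL)")
                tgt.execute("DELETE FROM daily_sales")
                tgt.executemany(
                    "INSERT INTO daily_sales VALUES (?, ?)",
                    totals.items())
        finally:
            src.close()
            tgt.close()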

Support of the applications includes resolving issues reported by users. Issues could be caused by application bugs, user errors, or programming errors. The resolution process includes, but is not limited to, investigating known bugs on the software vendor's support website, creating tickets or service requests with the vendor, developing scripts to fix data issues, making program changes, testing fixes, and applying the changes to production.
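
The "scripts to fix data issues" mentioned above might look like the following sketch, which assumes a hypothetical orders table and repair rule; it applies the fix in a single transaction and verifies it before committing:

    import sqlite3

    def fix_negative_amounts(db_path: str) -> int:
        """Zero out negative order amounts (an assumed data bug);
        return how many rows were repaired."""
        conn = sqlite3.connect(db_path)
        try:
            with conn:  # one transaction: rolls back on any exception
                bad = conn.execute(
                    "SELECT COUNT(*) FROM orders WHERE amount < 0"
                ).fetchone()[0]
                conn.execute(
                    "UPDATE orders SET amount = 0 WHERE amount < 0")
                # Verify the fix before the transaction commits.
                remaining = conn.execute(
                    "SELECT COUNT(*) FROM orders WHERE amount < 0"
                ).fetchone()[0]
                assert remaining == 0, "fix did not repair all rows"
            return bad
        finally:
            conn.close()

Testing such a script in Development and Quality before applying it to production mirrors the resolution process described above.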

These tasks and activities will be completed with the help and under the guidance of the supervisor. Participation in team and/or project meetings, to schedule work and discuss status, will be required.

The position also requires staying abreast of changes in technology, programming languages, and software development tools.

Responsibilities:

  • Data Pipeline / ETL (40%): Designs and implements data stores, ETL data flows, and data pipelines to connect and prepare operational systems' data for analytics and business intelligence (BI) systems.
  • Support & Operations (10%): Manages production deployments and automation, monitoring, job control, and production support. Works with business users to test programs in the Development and Quality environments. Investigates issues using vendor support website(s).
  • Data Modeling / Designing Datasets (10%): Reviews and understands business requirements for assigned development tasks and applies standard data modeling and design techniques based upon a detailed understanding of requirements.
  • Data Architecture and Technical Infrastructure (10%): Plans and drives the development of data engineering solutions ensuring that solutions balance functional and non-functional requirements. Monitors application of data standards and architectures including security and compliance.
  • SDLC Methodology & Project Management (5%): Contributes to technical transitions between development, testing, and production phases of solutions' lifecycle, and the facilitation of the change control, problem management, and communication processes.
  • Data Governance and Data Quality (5%): Identifies and investigates data quality/integrity problems, determines impact, and provides solutions (see the sketch following this list).
  • Metadata Management & Documentation (5%): Documents all processes and mappings related to Data Pipelines work and follows development best practices as adopted by the BIA team.
  • End-User Support, Education and Enablement (5%): Contributes to training and data literacy initiatives within the team and the end-user community.
  • Innovation, Continuous Improvement & Optimization (5%): Continuously improves and optimizes existing Data Engineering assets/processes.
  • Partnership and Community Building (5%): Collaborates with other IT teams, the business community, data scientists, and other architects to meet business requirements. Interacts with DBAs on data designs optimal for data engineering solution performance.
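
The data-quality work referenced in the Data Governance bullet could be sketched as follows; the tables (orders, customers) and rules are assumptions, and each rule is simply a query that returns violating rows:

    import sqlite3

    # Hypothetical integrity rules: rule name -> query returning bad rows.
    CHECKS = {
        "null_customer_id":
            "SELECT id FROM orders WHERE customer_id IS NULL",
        "negative_amount":
            "SELECT id FROM orders WHERE amount < 0",
        "orphan_customer":
            "SELECT o.id FROM orders o "
            "LEFT JOIN customers c ON c.id = o.customer_id "
            "WHERE c.id IS NULL",
    }

    def run_checks(db_path: str) -> dict:
        """Return a map of rule name -> count of violating rows."""
        conn = sqlite3.connect(db_path)
        try:
            return {name: len(conn.execute(sql).fetchall())
                    for name, sql in CHECKS.items()}
        finally:
            conn.close()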

Required Skills:

  • Experience: 4 to 7 years of proven experience in the development, maintenance, and enhancement of data pipelines (ETL/ELT) and processes, with thorough knowledge of star/snowflake schemas
  • Experience developing complex SQL queries and performing SQL optimization.
  • Experience with other cloud platforms (e.g., AWS, Google Cloud) and multi-cloud environments
  • Development experience must span the full life cycle, including business requirements gathering, data sourcing, testing/data reconciliation, and deployment within a Business Intelligence/Data Warehousing architecture.
  • Skills:
    • Proficiency in Informatica / PowerCenter / IDMC tools (4+ Years).
    • Informatica IDMC (Cloud Data Integration / Application Integration) (3+ Years)
    • Data pipeline development using cloud platforms
    • Snowflake data warehousing (2+ Years)
    • Salesforce (SFDC) integration (2+ Years)
    • Informatica Salesforce Data Connector configuration (SFDC)
    • Real-time and batch data integration from Salesforce to Snowflake and from Snowflake to Salesforce (see the batch sketch following this list).
    • Data migration and ETL/ELT processes
    • API-based integrations and data orchestration
    • Strong SQL and data modeling skills
    • Performance tuning and troubleshooting of data pipelines
  • Certifications: Snowflake, Salesforce connector, or IDMC tool certifications are a plus
  • Understanding of Data Architecture
  • Knowledge of ETL and data engineering standards and best practices for the design and development of data pipelines and data extract, transform and load processes
  • Ability to design, build, and test data products based on feeds from multiple systems, using a range of different storage technologies, access methods, or both
  • Knowledge of data warehousing concepts, including multi-dimensional models and ETL logic for maintaining star-schemas
  • Good understanding of the concepts and principles of data modeling.
  • Ability to produce, maintain, and update relevant data models for specific needs.
  • Ability to reverse-engineer data models from a live system.
  • SQL programming desirable (e.g., stored procedure development).
  • Proficient in data analysis, defect identification, and resolution.
  • Strong professional verbal and written communication skills.
  • Ability to work with little supervision and within changing priorities.
  • Ability to analyze requirements and troubleshoot problems.
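
For the Salesforce-to-Snowflake skills listed above, the following is a hedged batch-sync sketch; it assumes the simple-salesforce and snowflake-connector-python packages, placeholder credentials, and a hypothetical staging table, whereas production pipelines in this role would typically run through Informatica IDMC's SFDC connector:

    import snowflake.connector
    from simple_salesforce import Salesforce

    def sync_accounts() -> None:
        # Extract: pull Account records over the SFDC REST API.
        sf = Salesforce(username="user@example.com",
                        password="...", security_token="...")
        records = sf.query_all(
            "SELECT Id, Name, Industry FROM Account")["records"]
        rows = [(r["Id"], r["Name"], r["Industry"]) for r in records]

        # Load: bulk-insert into an assumed Snowflake staging table.
        conn = snowflake.connector.connect(
            user="...", password="...", account="...",
            warehouse="WH", database="DB", schema="STAGE")
        try:
            cur = conn.cursor()
            cur.execute(
                "CREATE TABLE IF NOT EXISTS sfdc_account "
                "(id STRING, name STRING, industry STRING)")
            cur.executemany(
                "INSERT INTO sfdc_account VALUES (%s, %s, %s)", rows)
        finally:
            conn.close()

An incremental variant would filter the SOQL query on SystemModstamp and MERGE into the target table instead of re-inserting everything.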

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Hybrid work model: In-office on Monday, Wednesday, and Friday.
  • Working Time: India shift, with hours until 11:30 PM IST
  • Work Location: Pune / Hyderabad

Additional Information:

All your information will be kept confidential according to EEO guidelines.
