Better experiences. Made possible by you.

Be yourself. Grow your own way. Work on interesting projects.

Informatica (ETL) Solution Architect

Contract Type:

Brick and Mortar

Location:

Hyderabad - TS

Date Published:

04-16-2026

Job ID:

REF41566X

Company Description:

Sutherland is seeking a reliable, technically skilled professional to join us as an Informatica (ETL) Solution Architect who will play a key role in driving our continued product growth and innovation. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!

Job Description:

The Informatica (ETL) Solution Architect is responsible for designing, developing, implementing, and maintaining complex data engineering solutions within the Business Intelligence and Analytics team.

Responsible for the design, development, implementation, testing, documentation, and support of analytical and data solutions/projects requiring data aggregation, data pipelines, and ETL/ELT from multiple sources into an efficient reporting mechanism and database/data warehouse, using appropriate tools such as Informatica, Azure Data Factory, and SSIS. This includes interacting with the business to gather requirements, performing analysis, creating functional and technical specs, and handling testing, training, escalation, and follow-up.

Supporting the applications includes resolving issues reported by users. Issues may be caused by application bugs, user errors, or programming errors. The resolution process includes, but is not limited to, investigating known bugs on the software vendor's support website, creating tickets or service requests with the vendor, developing scripts to fix data issues, making program changes, testing fixes, and applying the changes to production.

These tasks and activities will be completed with the help, and under the guidance, of the supervisor. Participation in team and/or project meetings to schedule work and discuss status will be required.

The position also requires staying abreast of changes in technology, programming languages, and software development tools.

Responsibilities:

  • Data Architecture and Technical Infrastructure (40%): Plans and drives the development of data engineering solutions, ensuring that they balance functional and non-functional requirements. Monitors the application of data standards and architectures, including security and compliance.
  • Innovation, Continuous Improvement & Optimization (15%): Continuously improves and optimizes existing Data Engineering assets/processes.
  • Data Pipeline/ETL (15%): Designs and implements data stores and ETL data flows and data pipelines to connect and prepare operational systems data for analytics and business intelligence (BI) systems.
  • Data Modeling/Designing Datasets (10%): Reviews and understands business requirements for development tasks assigned and applies standard data modelling and design techniques based upon a detailed understanding of requirements.
  • Support & Operations (5%): Manages production deployments and automation, monitoring, job control and production support. Works with business users to test programs in Development and Quality. Investigates issues using vendor support website(s).
  • SDLC Methodology & Project Management (5%): Contributes to technical transitions between development, testing, and production phases of solutions' lifecycle, and the facilitation of the change control, problem management, and communication processes.
  • Data Governance and Data Quality (5%): Identifies and investigates data quality/integrity problems, determines impact, and provides solutions.
  • Metadata Management & Documentation (5%): Documents all processes and mappings related to data pipeline work and follows development best practices as adopted by the BIA team.
  • End-User Support, Education and Enablement (5%): Contributes to training and data literacy initiatives within the team and the end-user community.
  • Partnership and Community Building (5%): Collaborates with other IT teams, the business community, data scientists, and other architects to meet business requirements. Interacts with DBAs on data designs optimal for data engineering solutions.

Required Skills:

  • Experience: 9+ years of proven experience in the development, maintenance, and enhancement of data pipelines (ETL/ELT) and related processes, with thorough knowledge of star/snowflake schemas.
  • Developing complex SQL queries and performing SQL optimization.
  • Development experience must span the full life cycle, including business requirements gathering, data sourcing, testing/data reconciliation, and deployment within a Business Intelligence/Data Warehousing architecture.
  • Skills:
    • Strong experience in designing end-to-end data integration and cloud architecture solutions (4+ years).
    • Informatica IDMC (Cloud Data Integration / Application Integration) (5+ years).
    • Strong expertise in Informatica IDMC (Cloud Data Integration and Application Integration) for building scalable integration frameworks (6+ years).
    • Proven experience in architecting data pipelines between cloud systems, Snowflake, and Salesforce (SFDC).
    • Hands-on knowledge of Informatica Salesforce Connector configuration and integration design.
    • Experience in real-time and batch data integration patterns and architecture.
    • Strong understanding of data warehousing concepts, preferably with Snowflake.
    • Ability to define solution architecture, data flow design, and integration standards across enterprise systems.
    • Experience in stakeholder engagement, requirement gathering, and translating business needs into technical solutions.
    • Strong knowledge of APIs, microservices, and cloud integration best practices.
    • Ability to guide development teams and ensure alignment with architectural standards and governance.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Hybrid work model: In-office on Monday, Wednesday, and Friday.
  • Working Time: India shift hours, until 11:30 PM IST.
  • Work Location: Pune / Hyderabad

Additional Information:

All your information will be kept confidential according to EEO guidelines.

Apply Now