9.0 - 10.0 years
14 - 15 Lacs
Pune
Work from Office
HSBC Electronic Data Processing India Pvt Ltd is looking for a DataStage / Senior Consultant Specialist to join our dynamic team and embark on a rewarding career journey.
- Advises clients on complex business and technology challenges
- Leads solution design, implementation, and delivery
- Provides subject-matter expertise across multiple domains
- Drives strategic initiatives and stakeholder alignment
Posted 2 weeks ago
6.0 - 10.0 years
12 - 16 Lacs
Jaipur
Work from Office
ABOUT HAKKODA
Hakkoda, an IBM Company, is a modern data consultancy that empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics, and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment where everyone's input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India and Europe. If you have the desire to be a part of an exciting, challenging, and rapidly-growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!

We are looking for people experienced with data architecture and the design and development of database mapping and migration processes. This person will have direct experience optimizing new and current databases and data pipelines and implementing advanced capabilities while ensuring data integrity and security. Ideal candidates will have strong communication skills and the ability to guide clients and project team members, acting as a key point of contact for direction and expertise.

Key Responsibilities
- Design, develop, and optimize database architectures and data pipelines.
- Ensure data integrity and security across all databases and data pipelines.
- Lead and guide clients and project team members, acting as a key point of contact for direction and expertise.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Manage and support large-scale technology programs, ensuring they meet business objectives and compliance requirements.
- Develop and implement migration, DevOps, and ETL/ELT ingestion pipelines using tools such as DataStage, Informatica, and Matillion.
- Utilize project management skills to work effectively within Scrum and Agile development methods.
- Create and leverage metrics to develop actionable and measurable insights, influencing business decisions.

Qualifications
- 7+ years of proven work experience in data warehousing, business intelligence (BI), and analytics.
- 3+ years of experience as a Data Architect.
- 3+ years of experience working on cloud platforms (AWS, Azure, GCP).
- Bachelor's Degree (BA/BS) in Computer Science, Information Systems, Mathematics, MIS, or a related field.
- Strong understanding of migration processes, DevOps, and ETL/ELT ingestion pipelines.
- Proficient in tools such as DataStage, Informatica, and Matillion.
- Excellent project management skills and experience with Scrum and Agile development methods.
- Ability to develop actionable and measurable insights and create metrics to influence business decisions.
- Previous consulting experience managing and supporting large-scale technology programs.

Nice to Have
- 6-12 months of experience working with Snowflake.
- Understanding of Snowflake design patterns and migration architectures.
- Knowledge of Snowflake roles, user security, and capabilities like Snowpipe.
- Proficiency in SQL scripting.
- Cloud experience on AWS (Azure and GCP are also beneficial).
- Python scripting skills.
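To illustrate the kind of Snowflake SQL and Python scripting the items above refer to, here is a minimal, hedged sketch that runs a COPY INTO load with the Snowflake Python connector. The account, credentials, stage, and table names are placeholders invented for the example, not details from the posting.

```python
# Minimal sketch of a Snowflake ELT load step (all names are hypothetical).
# Requires: pip install snowflake-connector-python
import snowflake.connector

def load_staged_orders() -> None:
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder account identifier
        user="etl_user",           # placeholder credentials
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # COPY INTO loads files already uploaded to a stage; Snowpipe automates
        # the same COPY logic for continuous loading.
        cur.execute("""
            COPY INTO RAW.ORDERS
            FROM @ORDERS_STAGE
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        print(cur.fetchall())  # per-file load results
    finally:
        conn.close()

if __name__ == "__main__":
    load_staged_orders()
```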
Benefits:
- Health Insurance
- Paid leave
- Technical training and certifications
- Robust learning and development opportunities
- Incentive
- Toastmasters
- Food Program
- Fitness Program
- Referral Bonus Program

Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture. We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive. Ready to take your career to the next level? Apply today and join a team that's shaping the future!

Hakkoda is an IBM subsidiary that has been acquired by IBM and will be integrated into the IBM organization. Hakkoda will be the hiring entity. By proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with application design and development methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Noida
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of the development process, ensuring that applications are robust and user-friendly.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Noida office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: BE or BTECH

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to ensure application functionality.
- Conduct code reviews and provide technical guidance to team members.
- Troubleshoot and resolve application issues in a timely manner.
- Stay updated on industry trends and best practices to enhance application development.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and tools.
- Hands-on experience in designing and developing applications.
- Knowledge of database management systems and SQL.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A BE or BTECH degree is required.

Qualification: BE or BTECH
Posted 2 weeks ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the development and implementation of data engineering solutions.
- Optimize and maintain data pipelines for optimal performance.
- Collaborate with data scientists and analysts to understand data requirements.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Engineering.
- Strong understanding of ETL processes.
- Experience with cloud-based data platforms such as AWS or Azure.
- Knowledge of data modeling and database design.
- Experience with big data technologies like Hadoop or Spark.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Engineering.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
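As a rough illustration of the ETL and Spark skills listed above, the sketch below shows a small PySpark batch job that reads a CSV, applies simple transformations, and writes partitioned Parquet. The file paths and column names are assumptions made up for the example, not part of the role description.

```python
# Minimal PySpark batch ETL sketch (paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run_daily_batch(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

    orders = (
        spark.read.option("header", "true")
        .option("inferSchema", "true")
        .csv(input_path)
    )

    cleaned = (
        orders
        .dropDuplicates(["order_id"])                     # basic de-duplication
        .withColumn("order_date", F.to_date("order_ts"))  # derive a partition column
        .filter(F.col("amount") > 0)                      # drop invalid rows
    )

    # Write Parquet partitioned by date, overwriting any previous run.
    cleaned.write.mode("overwrite").partitionBy("order_date").parquet(output_path)
    spark.stop()

if __name__ == "__main__":
    run_daily_batch("/data/raw/orders.csv", "/data/curated/orders")
```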
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Python (Programming Language), Apache Airflow
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-To-Have Skills: Experience with Python (Programming Language) and Apache Airflow.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management best practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to maintain alignment and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior team members to foster their professional growth.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
4.0 - 7.0 years
10 - 19 Lacs
Bengaluru
Hybrid
Job Description
Experience: 4 to 7 years.

Requirements
- Experience in any ETL tool (e.g. DataStage) with implementation experience in a large Data Warehouse.
- Proficiency in programming languages such as Python.
- Experience with data warehousing solutions (e.g., Snowflake, Redshift) and big data technologies (e.g., Hadoop, Spark).
- Strong knowledge of SQL and database management systems.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and data pipeline orchestration tools (e.g., Airflow).
- Proven ability to lead and develop high-performing teams, with excellent communication and interpersonal skills.
- Strong analytical and problem-solving abilities, with a focus on delivering actionable insights.

Responsibilities
- Design, develop, and maintain advanced data pipelines and ETL processes using niche technologies.
- Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions.
- Ensure data quality and integrity by implementing robust data validation and monitoring processes.
- Optimize data systems for performance, scalability, and reliability.
- Develop comprehensive documentation for data engineering processes and systems.
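For the pipeline-orchestration requirement above, here is a minimal Airflow DAG sketch with placeholder task logic. The DAG id, schedule, and task functions are assumptions made for illustration only, not part of this role description.

```python
# Minimal Airflow DAG sketch (DAG id, schedule, and task bodies are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")       # placeholder extract step

def transform():
    print("apply business rules")   # placeholder transform step

def load():
    print("load into warehouse")    # placeholder load step

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",     # Airflow 2.x; newer versions use schedule=
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load   # run the steps in sequence
```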
Posted 2 weeks ago
6.0 - 10.0 years
5 - 11 Lacs
Hyderabad
Work from Office
Greetings from NCG! We have an opening for a Snowflake Developer role in our Hyderabad office. The JD is below for your reference.

Job Description: We are hiring an experienced Senior Data Engineer with strong expertise in IBM DataStage, Azure Data Platform, and Power BI. The ideal candidate will be responsible for end-to-end data integration, transformation, and reporting solutions that drive business decisions.

Key Responsibilities:
- Design and develop robust ETL solutions using IBM DataStage for data extraction, transformation, and loading.
- Manage and orchestrate data workflows in Azure using Azure Data Factory, Azure SQL, Data Lake, etc.
- Build intuitive and dynamic Power BI dashboards and reports for various business stakeholders.
- Optimize data models and ensure performance tuning across DataStage and Power BI platforms.
- Collaborate with business users, analysts, and data engineers to gather reporting requirements.
- Ensure adherence to data governance, security policies, and quality standards.
- Conduct unit testing and support UAT cycles for data pipelines and reports.
- Document ETL designs, data mappings, and visualization structures.

Required Skills:
- 6+ years of hands-on experience in ETL development using IBM DataStage.
- Strong experience with Azure data services: Azure Data Factory, Azure SQL DB, Blob Storage, etc.
- Advanced knowledge of Power BI, including DAX, Power Query, data modeling, and report publishing.
- Proficient in writing SQL queries and managing large datasets.
- Experience in data warehousing concepts and data architecture design.
- Strong problem-solving skills and attention to detail.

For further information, please contact HR:
Harshini - 9663082098
Swathi - 9972784663

Thanks and regards,
Chiranjeevi Nanjunda
Talent Acquisition Lead - NCG
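As a small, hedged example of the unit-testing and data-validation work mentioned above, the sketch below runs a row-count reconciliation against Azure SQL with pyodbc. The connection string and table names are placeholders, and a real check would likely live inside the pipeline's test stage rather than a standalone script.

```python
# Minimal row-count reconciliation between a staging and a target table.
# Connection string and table names are placeholders, not details from the posting.
# Requires: pip install pyodbc
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # placeholder Azure SQL server
    "DATABASE=analytics;UID=etl_user;PWD=***"
)

def count_rows(cursor: pyodbc.Cursor, table: str) -> int:
    # Table names are fixed constants here, not user input.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def reconcile(staging: str = "stg.orders", target: str = "dw.fact_orders") -> None:
    conn = pyodbc.connect(CONN_STR)
    try:
        cur = conn.cursor()
        stg_count = count_rows(cur, staging)
        tgt_count = count_rows(cur, target)
        if stg_count != tgt_count:
            raise ValueError(
                f"Row count mismatch: {staging}={stg_count}, {target}={tgt_count}"
            )
        print(f"Counts match: {stg_count} rows")
    finally:
        conn.close()

if __name__ == "__main__":
    reconcile()
```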
Posted 2 weeks ago
10.0 - 15.0 years
30 - 35 Lacs
Chennai, Bengaluru
Work from Office
Principal Architect (Data and Cloud) - Neoware Technology Solutions Private Limited

Principal Architect (Data and Cloud)

Requirements
- More than 10 years of experience in Technical, Solutioning, and Analytical roles.
- 5+ years of experience in building and managing Data Lakes, Data Warehouse, Data Integration, Data Migration and Business Intelligence/Artificial Intelligence solutions on Cloud (GCP/AWS/Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience, etc.
- Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
- Experience of having worked in distributed computing and enterprise environments like Hadoop and GCP/AWS/Azure Cloud.
- Well versed with various Data Integration and ETL technologies on Cloud like Spark, PySpark/Scala, Dataflow, DataProc, EMR, etc.
- Experience of having worked with traditional ETL tools like Informatica / DataStage / OWB / Talend, etc.
- Deep knowledge of one or more Cloud and On-Premise Databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Experience in architecting and designing scalable data warehouse solutions on cloud on BigQuery or Redshift.
- Experience of having worked on one or more data integration, storage, and data pipeline tool sets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Preferred experience of having worked on Machine Learning frameworks like TensorFlow, PyTorch, etc.
- Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers and Microservices Architecture and Design.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
- Good understanding of BI Reporting and Dashboarding and one or more of the associated tool sets like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Understanding of Security features and Policies in one or more Cloud environments like GCP/AWS/Azure.
- Experience of having worked in business transformation projects for movement of On-Premise data solutions to Clouds like GCP/AWS/Azure.
- Be a trusted technical advisor to customers and provide solutions for complex Cloud & Data related technical challenges.
- Be a thought leader in architecture design and development of cloud data analytics solutions.
- Liaise with internal and external stakeholders to design optimized data analytics solutions.
- Partner with SMEs and Solutions Architects from leading cloud providers to present solutions to customers.
- Support Sales and GTM teams from a technical perspective in building proposals and SOWs.
- Lead discovery and design workshops with potential customers across the globe.
- Design and deliver thought leadership webinars and tech talks alongside customers and partners.

Responsibilities
- Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of Schedule, Quality, and Customer Satisfaction.
- Responsible for design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
- Work with the Pre-Sales team on RFPs and RFIs and help them by creating solutions for data.
- Mentor young talent within the team, and define and track their growth parameters.
- Contribute to building assets and accelerators.
Posted 2 weeks ago
2.0 - 4.0 years
7 - 11 Lacs
Gurugram
Work from Office
A Software Engineer is curious and self-driven to build and maintain multi-terabyte operational marketing databases and integrate them with cloud technologies. Our databases typically house millions of individuals and billions of transactions and interact with various web services and cloud-based platforms. Once hired, the qualified candidate will be immersed in the development and maintenance of multiple database solutions to meet global client business objectives.

Job Description / Key responsibilities:
- Has 2-4 years of experience.
- Will work under close supervision of Tech Leads / Lead Devs.
- Should be able to understand detailed design with minimal explanation.
- Individual contributor; will be able to perform mid-to-complex-level tasks with minimal supervision. Senior team members will peer review assigned tasks.
- Build and configure our Marketing Database / Data environment platform by integrating feeds as per detailed design / transformation logic.
- Good knowledge of Unix scripting and/or Python.
- Must have strong knowledge of SQL.
- Good understanding of ETL tools (Talend, Informatica, DataStage, Ab Initio, etc.) as well as database skills (Oracle, SQL Server, Teradata, Vertica, Redshift, Snowflake, BigQuery, Azure DW, etc.).
- Fair understanding of relational databases, stored procs, etc.
- Experience in cloud computing (one or more of AWS, Azure, GCP) will be a plus.
- Less supervision and guidance from senior resources will be required.
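To illustrate the feed-integration work described above, here is a small, self-contained Python sketch that loads a delimited feed into a staging table according to a simple transformation spec. SQLite is used only so the example runs anywhere; a real marketing database would be Oracle, Teradata, Snowflake, or similar, and the feed layout is invented for the example.

```python
# Self-contained sketch: load a pipe-delimited feed into a staging table.
# SQLite and the feed layout are illustrative stand-ins, not the real platform.
import csv
import io
import sqlite3

FEED = """customer_id|email|signup_date
101|a@example.com|2024-01-15
102|b@example.com|2024-02-02
"""

def load_feed(conn: sqlite3.Connection, feed_text: str) -> int:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_customers ("
        "customer_id INTEGER, email TEXT, signup_date TEXT)"
    )
    reader = csv.DictReader(io.StringIO(feed_text), delimiter="|")
    # Simple transformation logic: cast ids and normalize email case.
    rows = [
        (int(r["customer_id"]), r["email"].strip().lower(), r["signup_date"])
        for r in reader
    ]
    conn.executemany("INSERT INTO stg_customers VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(f"Loaded {load_feed(conn, FEED)} rows into stg_customers")
```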
Posted 2 weeks ago
3.0 - 7.0 years
5 - 9 Lacs
Gurugram
Work from Office
A Software Engineer is curious and self-driven to build and maintain multi-terabyte operational marketing databases and integrate them with cloud technologies. Our databases typically house millions of individuals and billions of transactions and interact with various web services and cloud-based platforms. Once hired, the qualified candidate will be immersed in the development and maintenance of multiple database solutions to meet global client business objectives.

Job Description / Key responsibilities:
- Has 2-4 years of experience.
- Will work under close supervision of Tech Leads / Lead Devs.
- Should be able to understand detailed design with minimal explanation.
- Individual contributor; will be able to perform mid-to-complex-level tasks with minimal supervision. Senior team members will peer review assigned tasks.
- Build and configure our Marketing Database / Data environment platform by integrating feeds as per detailed design / transformation logic.
- Good knowledge of Unix scripting and/or Python.
- Must have strong knowledge of SQL.
- Good understanding of ETL tools (Talend, Informatica, DataStage, Ab Initio, etc.) as well as database skills (Oracle, SQL Server, Teradata, Vertica, Redshift, Snowflake, BigQuery, Azure DW, etc.).
- Fair understanding of relational databases, stored procs, etc.
- Experience in cloud computing (one or more of AWS, Azure, GCP) will be a plus.
- Less supervision and guidance from senior resources will be required.

Location: DGS India - Gurugram - Golf View Corporate Towers
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 2 weeks ago
5.0 - 10.0 years
11 - 15 Lacs
Gurugram
Work from Office
Position Summary
This is the requisition for the Employee Referrals Campaign, and the JD is generic. We are looking for Associates with 5+ years of experience in delivering solutions around Data Engineering, Big Data analytics and data lakes, MDM, BI, and data visualization. Experienced in integrating and standardizing structured and unstructured data to enable faster insights using cloud technology, enabling data-driven insights across the enterprise.

Job Responsibilities
- Should be able to design, implement, and deliver complex Data Warehousing/Data Lake, Cloud Data Management, and Data Integration project assignments.
- Technical Design and Development - Expertise in any of the following skills:
  - Any ETL tool (Informatica, Talend, Matillion, DataStage); hosting technologies like the AWS stack (Redshift, EC2) are mandatory.
  - Any BI tool among Tableau, Qlik, Power BI, and MSTR.
  - Informatica MDM, Customer Data Management.
  - Expert knowledge of SQL, with the capability to performance-tune complex SQL queries in traditional and distributed RDBMS systems, is a must.
  - Experience across Python, PySpark and Unix/Linux Shell Scripting.
- Project Management is a must-have. Should be able to create simple to complex project plans in Microsoft Project Plan and think in advance about potential risks and mitigation plans as per the project plan.
- Task Management - Should be able to onboard the team on the project plan and delegate tasks to accomplish milestones as per plan. Should be comfortable discussing and prioritizing work items with team members in an onshore-offshore model.
- Handle Client Relationship - Manage client communication and client expectations independently or with support of the reporting manager. Should be able to deliver results back to the Client as per plan. Should have excellent communication skills.

Education
Bachelor of Technology
Master's Equivalent - Engineering

Work Experience
Overall, 5-7 years of relevant experience in Data Warehousing and Data Management projects, with some experience in the Pharma domain. We are hiring for the following roles across Data Management tech stacks:
- ETL tools among Informatica, IICS/Snowflake, Python & Matillion and other Cloud ETL.
- BI tools among Power BI and Tableau.
- MDM - Informatica/Reltio, Customer Data Management.
- Azure cloud Developer using Data Factory and Databricks.
- Data Modeler - modelling of data: understanding source data, creating data models for landing and integration.
- Python/PySpark - Spark/PySpark Design, Development, and Deployment.
Posted 2 weeks ago
6.0 - 11.0 years
12 - 22 Lacs
Chennai
Work from Office
Job Summary: We are looking for a seasoned ETL Engineer with hands-on experience in Talend or IBM DataStage, preferably both, to lead data integration efforts in the mortgage domain. The ideal candidate will play a key role in designing, developing, and managing scalable ETL solutions that support critical mortgage data processing and analytics workloads.

Key Responsibilities:
- End-to-end ETL solution development using Talend or DataStage.
- Design and implement robust data pipelines for mortgage origination, servicing, and compliance data.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver optimized solutions.
- Perform code reviews, mentor junior team members, and ensure adherence to data quality and performance standards.
- Manage job orchestration, scheduling, and error handling mechanisms.
- Document ETL workflows, data dictionaries, and system processes.
- Ensure data privacy and compliance requirements are embedded in all solutions.

Required Skills:
- Strong experience in ETL tools: Talend (preferred) or IBM DataStage.
- Solid understanding of the mortgage lifecycle and related data domains.
- Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, Snowflake).
- Familiarity with job scheduling tools, version control, and CI/CD pipelines.
- Excellent problem-solving, leadership, and communication skills.
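As a hedged illustration of the "job orchestration, scheduling, and error handling" responsibility above, the sketch below wraps an ETL step with retries and logging in plain Python. The step itself is a placeholder; in practice this logic would typically be configured inside Talend/DataStage or an external scheduler rather than hand-rolled.

```python
# Minimal retry-with-logging wrapper for an ETL step (the step is a placeholder).
import logging
import time
from typing import Callable

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retries(
    step: Callable[[], None], name: str, attempts: int = 3, delay_s: float = 30.0
) -> None:
    for attempt in range(1, attempts + 1):
        try:
            log.info("Starting step %s (attempt %d/%d)", name, attempt, attempts)
            step()
            log.info("Step %s succeeded", name)
            return
        except Exception:
            log.exception("Step %s failed on attempt %d", name, attempt)
            if attempt == attempts:
                raise               # surface the failure to the scheduler
            time.sleep(delay_s)     # back off before retrying

def load_servicing_extract() -> None:
    print("load mortgage servicing extract")   # placeholder ETL step

if __name__ == "__main__":
    run_with_retries(load_servicing_extract, "load_servicing_extract",
                     attempts=2, delay_s=1.0)
```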
Posted 2 weeks ago
3.0 - 8.0 years
7 - 17 Lacs
Hyderabad
Work from Office
Job Title: Database Engineer Analytics – L

Responsibilities
As a Database Engineer supporting the bank's Analytics platforms, you will be part of a centralized team of database engineers who are responsible for the maintenance and support of Citizens' most critical databases. A Database Engineer will be responsible for:
• Conceptual knowledge of database practices and procedures such as DDL, DML and DCL.
• Basic SQL skills, including SELECT, FROM, WHERE and ORDER BY.
• Ability to code SQL joins, subqueries, aggregate functions (AVG, SUM, COUNT), and use data manipulation techniques (UPDATE, DELETE).
• Understanding basic data relationships and schemas.
• Developing basic Entity-Relationship diagrams.
• Conceptual understanding of cloud computing.
• Solving routine problems using existing procedures and standard practices.
• Looking up error codes and opening tickets with vendors.
• Ability to execute explains and identify poorly written queries.
• Reviewing data structures to ensure they adhere to database design best practices.
• Developing a comprehensive backup plan.
• Understanding the different cloud models (IaaS, PaaS, SaaS), service models, and deployment options (public, private, hybrid).
• Solving standard problems by analyzing possible solutions using experience, judgment and precedents.
• Troubleshooting database issues, such as integrity issues, blocking/deadlocking issues, log shipping issues, connectivity issues, security issues, memory issues, disk space, etc.
• Understanding cloud security concepts, including data protection, access control, and compliance.
• Managing risks that are associated with the use of information technology.
• Identifying, assessing, and treating risks that might affect the confidentiality, integrity, and availability of the organization's assets.
• Ability to design and implement highly performing databases using partitioning and indexing that meet or exceed the business requirements.
• Documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
• Ability to code complex SQL.
• Performing effective backup management and periodic database restoration testing.
• General DB cloud networking skills - VPCs, SGs, KMS keys, private links.
• Ability to develop stored procedures and use at least one scripting language for reusable code and improved performance. Know how to import and export data into and out of databases using ETL tools, code, or migration tools like DMS or scripts.
• Knowledge of DevOps principles and tools, such as CI/CD.
• Attention to detail and a customer-centric approach.
• Solving complex problems by taking a new perspective on existing solutions; exercising judgment based on the analysis of multiple sources of information.
• Ability to optimize queries for performance and resource efficiency.
• Reviewing database metrics to identify performance issues.

Required Qualifications
• 2-10+ years of experience with database management/administration, Redshift, Snowflake or Neo4j.
• 2-10+ years of experience working with incident, change and problem management processes and procedures.
• Experience maintaining and supporting large-scale critical database systems in the cloud.
• 2+ years of experience working with AWS cloud hosted databases.
• An understanding of at least one programming language, such as Python 3, Java, JavaScript, Ruby, Golang, C, C++, etc., including at least one front-end framework (Angular/React/Vue).
• Experience with cloud computing, ETL and streaming technologies - OpenShift, DataStage, Kafka.
• Experience with agile development methodology.
• Strong SQL performance and tuning skills.
• Excellent communication and client interfacing skills.
• Strong team collaboration skills and capacity to prioritize tasks efficiently.

Desired Qualifications
• Experience working in an agile development environment.
• Experience working in the banking industry.
• Experience working in cloud environments such as AWS, Azure or Google.
• Experience with CI/CD pipelines (Jenkins, Liquibase or equivalent).

Education and Certifications
• Bachelor's degree in computer science or a related discipline.
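To make the "execute explains and identify poorly written queries" skill above concrete, here is a self-contained sketch that compares query plans before and after adding an index. SQLite is used only so the example runs anywhere; the table and query are invented, and production work would use the equivalent EXPLAIN facilities in Redshift, Snowflake, or SQL Server.

```python
# Self-contained sketch: show how an index changes a query plan.
# SQLite stands in for a production database; plan formats differ per engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(10_000)],
)

QUERY = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

def show_plan(label: str) -> None:
    plan = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchall()
    print(label, [row[-1] for row in plan])   # last column holds the plan text

show_plan("Before index:")   # expect a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
show_plan("After index:")    # expect an index search on customer_id
```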
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Chennai
Hybrid
Role & responsibilities

Job Summary: We are looking for a seasoned ETL Engineer with hands-on experience in IBM DataStage to lead data integration efforts in the mortgage domain. The ideal candidate will play a key role in designing, developing, and managing scalable ETL solutions that support critical mortgage data processing and analytics workloads.

Key Responsibilities:
- End-to-end ETL solution development using DataStage.
- Design and implement robust data pipelines for mortgage origination, servicing, and compliance data.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver optimized solutions.
- Perform code reviews, mentor junior team members, and ensure adherence to data quality and performance standards.
- Manage job orchestration, scheduling, and error handling mechanisms.
- Document ETL workflows, data dictionaries, and system processes.
- Ensure data privacy and compliance requirements are embedded in all solutions.

Required Skills:
- Strong experience in ETL tools: IBM DataStage.
- Solid understanding of the mortgage lifecycle and related data domains.
- Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, Snowflake).
- Familiarity with job scheduling tools, version control, and CI/CD pipelines.
Posted 2 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Pune
Work from Office
- 5 years of DataStage experience.
- Strong data warehousing knowledge; ready to provide guidance to junior members of the team.
- Must have good communication skills, as the role demands a lot of interaction with the US business team as well as IT stakeholders.
- Must be able to work independently and handle multiple concurrent tasks, with an ability to prioritize and manage tasks effectively.
- Experience in developing DataStage jobs and deploying the jobs through the SDLC cycle.
- Knowledge of data modeling, database design and the data warehousing ecosystem.
- Ability to work independently and collaborate with others at all levels of technical understanding.
- Analyzing organizational data requirements and reviewing/understanding logical and physical Data Flow Diagrams and Entity Relationship Diagrams using tools such as Visio and Erwin.
- Designing and building scalable DataStage solutions.
- Updating data within repositories and data warehouses.
- Assisting project leaders in determining project timelines and objectives.
- Monitoring jobs and identifying bottlenecks in the data processing pipeline.
- Testing and troubleshooting problems in system designs and processes.
- Proficiency in SQL or another relevant coding language.
- Great communication and reasoning skills, including the ability to make a strong case for technology choices.
- 5+ years of experience in testing, debugging, and troubleshooting support and development issues.
Posted 2 weeks ago
5.0 - 10.0 years
2 - 5 Lacs
Bengaluru
Work from Office
- Contract duration: 6 months
- Experience: 5+ years
- Location: WFH (should have a good internet connection)
- Snowflake knowledge (must have)
- Autonomous person
- SQL knowledge (must have)
- Data modeling (must have)
- Data warehouse concepts and DW design best practices (must have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Informatica IDMC (good to have)
- Good communication skills, team player, self-motivated and strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner).
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly to support business operations. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with organizational goals.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Continuously assess and improve application development processes to increase efficiency.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with data modeling and database design.
- Familiarity with performance tuning and optimization techniques.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
3 - 6 Lacs
Chennai
Work from Office
Programming languages/Tools: SQL, DataStage, Teradata.
- Design complex ETL jobs in IBM DataStage to load data into the DWH as per business logic.
- Work experience in Teradata Database as a Developer.
- Understand and analyse ERP reports and document the logic.
- Identify gaps in the existing solutions to accommodate new business processes introduced by the merger.
- Work on designing TAS workflows to replicate data from SAP into the DWH.
- Prepare test cases and technical specifications for the new solutions.
- Interact with other upstream and downstream application teams and EI teams to build robust data transfer mechanisms between various systems.

Essential Skills Required
- Sound interpersonal communication skills.
- Coordinate with customers and Business Analysts to understand business and reporting requirements.
- Support the development of business intelligence standards to meet business goals.
- Ability to understand data warehousing concepts and implement reports based on user inputs.
- Areas of expertise include Teradata SQL, DataStage, Teradata, and Shell Scripting.
- Demonstrated focus on driving for results.
- Ability to work with a cross-functional team.

Employment Experience Required
- Minimum 3+ years of technical experience in data warehousing concepts and as an ETL developer.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while keeping abreast of the latest technologies and methodologies in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Good-To-Have Skills: Experience with SAP HANA and data warehousing concepts.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with reporting tools and dashboard creation.
- Experience in performance tuning and optimization of data models.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with performance tuning and optimization of applications.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago