4.0 - 6.0 years
6 - 8 Lacs
Mumbai
Work from Office
Design and implement data integration solutions using IBM InfoSphere DataStage. Develop ETL jobs, write PL/SQL scripts, and use Unix Shell Scripting for text processing to manage large datasets efficiently.
Posted 2 days ago
3.0 - 5.0 years
15 - 18 Lacs
Noida
Work from Office
Responsibilities:
- Design, develop, and maintain ETL pipelines: create, optimize, and manage Extract, Transform, Load (ETL) processes using Python scripts and Pentaho Data Integration (Kettle) to move and transform data from various sources into target systems (e.g., data warehouses, data lakes).
- Data quality assurance: implement rigorous data validation, cleansing, and reconciliation procedures to ensure the accuracy, completeness, and consistency of data.
- Data sourcing and integration: work with diverse data sources, including relational databases (SQL Server, MySQL, PostgreSQL), flat files (CSV, Excel), APIs, and cloud platforms.
- Performance optimization: identify and implement improvements for existing ETL processes to enhance data load times, efficiency, and scalability.
- Troubleshooting and support: diagnose and resolve data-related issues, ensuring data integrity and timely availability for reporting and analysis.
- Documentation: create and maintain comprehensive documentation for all ETL processes, data flows, and data dictionaries.
- Collaboration: work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver robust data solutions.
- Ad-hoc analysis: perform ad-hoc data analysis and provide insights to support business decisions as needed.

About the Role: We are looking for a skilled and passionate Data Engineer with 3 to 4 years of experience in building robust ETL pipelines using both visual ETL tools (preferably Kettle/Pentaho) and Python-based frameworks. You will be responsible for designing, developing, and maintaining high-quality data workflows that support our data platforms and reporting environments.

Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Kettle (Pentaho) or similar tools.
- Build data ingestion workflows using Python (pandas, SQLAlchemy, psycopg2).
- Extract data from relational and non-relational sources (APIs, CSV, databases).
- Perform complex transformations and ensure high data quality.
- Load processed data into target systems such as PostgreSQL, Snowflake, or Redshift.
- Implement monitoring, error handling, and logging for all ETL jobs.
- Maintain job orchestration via shell scripts, cron, or workflow tools (e.g., Airflow).
- Work with stakeholders to understand data needs and deliver accurate, timely data.
- Maintain documentation for pipelines, data dictionaries, and metadata.

Requirements:
- 3 to 4 years of experience in Data Engineering or ETL development.
- Hands-on experience with Kettle (Pentaho Data Integration) or similar ETL tools.
- Strong proficiency in Python (including pandas, requests, datetime, etc.).
- Strong SQL knowledge and experience with relational databases (PostgreSQL, SQL Server, etc.).
- Experience with source control (Git), scripting (Shell/Bash), and config-driven ETL pipelines.
- Good understanding of data warehousing concepts, performance optimization, and incremental loads.
- Familiarity with REST APIs, JSON, XML, and flat file processing.

Good to Have:
- Experience with job scheduling tools (e.g., Airflow, Jenkins).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Knowledge of data lakes, big data, or real-time streaming tools is a plus.
- Experience working in Agile/Scrum environments.

Soft Skills:
- Strong analytical and problem-solving skills.
- Self-motivated and able to work independently and in a team.
- Good communication skills with technical and non-technical stakeholders.

Industry: Software Development
Employment Type: Full-time
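Below is a minimal sketch (not part of the posting) of the config-driven Python ingestion pattern this role describes: pandas plus SQLAlchemy performing an incremental load into PostgreSQL. All connection strings, table names, and columns are hypothetical placeholders.

```python
# Hypothetical incremental-load step: extract changed rows from a source
# database, apply light cleansing in pandas, append to a warehouse staging
# table. Assumes SQLAlchemy with the psycopg2 driver.
import pandas as pd
from sqlalchemy import create_engine, text

SOURCE_URL = "postgresql+psycopg2://etl_user:secret@source-db/sales"      # placeholder
TARGET_URL = "postgresql+psycopg2://etl_user:secret@warehouse/analytics"  # placeholder

def run_incremental_load(watermark: str) -> int:
    src = create_engine(SOURCE_URL)
    tgt = create_engine(TARGET_URL)

    # Extract only rows changed since the last successful run (incremental load).
    df = pd.read_sql(
        text(
            "SELECT order_id, customer_id, amount, updated_at "
            "FROM orders WHERE updated_at > :wm"
        ),
        src,
        params={"wm": watermark},
    )

    # Transform: basic validation and cleansing before loading.
    df = df.dropna(subset=["order_id", "customer_id"])
    df["amount"] = df["amount"].astype(float).round(2)

    # Load: append into the warehouse staging table.
    df.to_sql("stg_orders", tgt, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    print(run_incremental_load("2024-01-01 00:00:00"))
```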
Posted 2 days ago
5.0 - 10.0 years
8 - 14 Lacs
Hyderabad
Work from Office
Employment Type: Contract
1. 5+ years in ETL domain development (including 3+ years in Talend)
2. Strong SQL query writing (mandatory)
3. Hands-on SQL query troubleshooting (mandatory)
4. Hands-on Talend development and deployments (mandatory)
5. Strong DWH concepts (mandatory)
Posted 2 days ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Urgent requirement for an Ab Initio Developer. Experience: 4+ years of relevant experience. Location: Pan India. Hands-on with Ab Initio GDE (graphs, plans, and psets); experience with Ab Initio development and code promotion; experience with Unix shell scripting and Unix commands.
Posted 2 days ago
3.0 - 7.0 years
11 - 16 Lacs
Gurugram
Work from Office
Project description: We are looking for a star Python Developer who is not afraid of work and challenges! Partnering with a famous financial institution, we are gathering a team of professionals with a wide range of skills to successfully deliver business value to the client.

Responsibilities:
- Analyse existing SAS DI pipelines and SQL-based transformations.
- Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark (illustrated in the sketch below).
- Develop and maintain scalable ETL pipelines using Python on AWS EMR.
- Implement data transformation, cleansing, and aggregation logic to support business requirements.
- Design modular and reusable code for distributed data processing tasks on EMR clusters.
- Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau.
- Develop Tableau reports for business reporting.

Skills (must have):
- 6+ years of experience in ETL development, with at least 5 years working with AWS EMR.
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
- Proficiency in Python for data processing and scripting.
- Proficiency in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica).
- Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue.
- Familiarity with data storage systems such as Snowflake or RDS.
- Excellent communication skills and ability to work collaboratively in a team environment.
- Strong problem-solving skills and ability to work independently.

Nice to have: N/A
Other Languages: English, B2 (Upper Intermediate)
Seniority: Senior
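A hypothetical illustration of the core migration task above: SAS PROC SQL-style aggregation logic rewritten as a PySpark job suitable for AWS EMR. Bucket paths and column names are invented for the example.

```python
# Rewrite of a SAS SQL GROUP BY summary as a PySpark DataFrame job.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-to-pyspark-migration").getOrCreate()

# Source data staged on S3 (placeholder path).
txns = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Equivalent of a SAS PROC SQL summary:
# SELECT account_id, txn_month, SUM(amount), COUNT(*) ... GROUP BY ...
summary = (
    txns.filter(F.col("status") == "POSTED")
        .groupBy("account_id", "txn_month")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"),
        )
)

# Land curated output where downstream Snowflake loads or Tableau extracts
# can pick it up (placeholder path).
summary.write.mode("overwrite").partitionBy("txn_month").parquet(
    "s3://example-bucket/curated/txn_summary/"
)
```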
Posted 2 days ago
8.0 - 10.0 years
10 - 12 Lacs
Karnataka
Remote
Role Overview:
- Design and development of ETL and BI applications in DataStage.
- Design/develop testing processes to ensure end-to-end performance, data integrity, and usability.
- Carry out performance testing, integration testing, and system testing.
- Good SQL knowledge is mandatory.
- Basic Unix knowledge is required.
- Should be able to communicate with clients and work on technical requirements.

About NTT DATA: At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here. NTT DATA, Inc. currently seeks a DataStage Developer to join our team in Bangalore (currently remote).
Posted 2 days ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Minimum 3 to 5 years of Talend developer experience. Work on user stories and develop Talend jobs following best practices. Create detailed technical design documents for Talend job development work. Work with the SIT team and assist with defect fixing for Talend components. Note: knowledge of the IBM Maximo tool would be an advantage for ConEd, but is not required.
Posted 2 days ago
2.0 - 7.0 years
7 - 17 Lacs
Bengaluru
Work from Office
In this role, you will:
- Consult with business line and enterprise functions on less complex research.
- Use functional knowledge to assist in non-model quantitative tools that support strategic decision making.
- Perform analysis of findings and trends using statistical analysis, and document the process.
- Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance.
- Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency.
- Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing.
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff.
- Understand compliance and risk management requirements for the supported area.
- Ensure adherence to data management and data governance regulations and policies.
- Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals.
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups.

Required Qualifications:
- 2+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- 2+ years of experience in data, reporting, or analytics; or an MS/MA degree or higher in a quantitative field such as applied math, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis.
- 2+ years of hands-on experience in SQL, especially in Oracle, SQL Server, and Teradata environments.
- 2+ years of experience in Python, Tableau, or a BI reporting tool, creating solutions with the aid of data visualization. This includes, but is not limited to, developing and creating BI dashboards, working on end-to-end reports, and deriving insights from data.
- Hands-on experience in ETL development using Alteryx, SSIS, or any other ETL tool.
- Excellent verbal, written, and interpersonal communication skills.
- Working experience in NLP: topic modeling, sentiment analysis.
- Good data interpretation and presentation skills.
- Willingness and ability to take ownership of a project.
- Exceptionally fast learner, able to reach a non-trivial SME level under tight deadlines.
- High-energy, can-do, self-directed, proactive self-starter who excels in an environment with limited supervision.
- Lead in managing multiple complex exercises in various stages simultaneously.
- Lead in managing internal customer expectations while adhering to internal SLA timeframes.
- Extensive knowledge and understanding of research and analysis.
- Strong analytical skills with high attention to detail and accuracy.
- Collaborative, team-focused attitude.
- 2+ years of experience in Customer/Marketing/Sales Analytics.

Job Expectations:
- Provide business and technical leadership to develop the reporting analytics team delivering for the Customer & Employee Experience team.
- Maintain partner relationships, ensuring high-quality team deliverables and SLAs.
- Work closely with US partners on a daily basis, interacting with multiple business partners and program managers.
- Work independently and foster a culture of healthy and efficient working for the team.
- Design and solve complex business problems using analytical techniques and tools.
- Be involved directly in the technical build-out and/or support of databases, query tools, reporting tools, BI tools, dashboards, etc., that enable analysis, modeling, and/or advanced data visualization, including development of Business Objects reports using multiple databases.
- Recommend potential data sources; compile and mine data from multiple cross-business sources; work with typically very large data sets, both structured and unstructured, and from multiple sources.
- Develop specific, customized reports, ad hoc analyses, and/or data visualizations formatted with business-user-friendly techniques to drive adoption, such as (1) PowerPoint slides and presentations and (2) clear verbal and e-mail communications.
- Work with senior consultants or directly with partners to identify and define business requirements and translate business needs into moderately complex analyses and recommendations.
- Work with local and international colleagues and with internal customers to identify and define business requirements and cater to business needs for the team.
- Ensure adherence to data management/data governance regulations and policies.
Posted 2 days ago
3.0 - 8.0 years
5 - 10 Lacs
Mumbai, Ahmedabad
Work from Office
We are seeking a skilled SAS Developer Team Lead with strong expertise in SAS Data Integration (DI), SAS Visual Analytics (VA), and SAS Management Console (SMC). The ideal candidate will have hands-on experience in ETL development, data visualization, and metadata management, along with a proven track record of leading technical teams. Experience in delivering projects within the BFSI and Government sectors is essential.

Key Responsibilities:
- Lead and mentor a team of SAS developers to achieve project goals.
- Design, develop, and optimize ETL workflows using SAS DI Studio.
- Create and manage interactive dashboards and reports using SAS Visual Analytics (VA).
- Oversee user roles, libraries, and metadata through SAS Management Console (SMC).
- Coordinate with clients and internal stakeholders to gather requirements and ensure timely project execution.
- Conduct code reviews to maintain coding standards and promote best practices.
- Monitor performance, troubleshoot issues, and ensure data quality and system stability.

Qualifications:
- 3 to 5 years of hands-on experience in SAS DI, VA, and SMC.
- Strong understanding of data warehousing concepts and ETL design.
- Experience in BFSI and/or Government projects is a must.
- Prior experience in leading and mentoring a team.
- Excellent communication and client-handling skills.

Preferred: SAS certifications are an added advantage.
Posted 2 days ago
7.0 - 12.0 years
30 - 40 Lacs
Indore, Pune, Bengaluru
Hybrid
- Support enhancements to the MDM platform.
- Develop pipelines using Snowflake, Python, SQL, and Airflow (see the sketch below).
- Track system performance, troubleshoot issues, and resolve production issues.

Required candidate profile:
- 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools like Airflow.
- Good understanding of the investment domain.
- Experience with dbt; cloud experience (AWS, Azure); DevOps.
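The posting names Snowflake, Python, SQL, and Airflow together; below is a minimal sketch of that pattern, assuming Airflow 2.x and the snowflake-connector-python package. The account details, warehouse, and MERGE statement are placeholders, not the employer's actual pipeline.

```python
# Daily Airflow DAG that merges staged MDM records into a Snowflake master
# table. Credentials and object names are placeholders.
from datetime import datetime, timedelta

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_party_master():
    conn = snowflake.connector.connect(
        account="example_account",  # placeholder credentials
        user="etl_user",
        password="secret",
        warehouse="ETL_WH",
        database="MDM",
    )
    try:
        conn.cursor().execute("""
            MERGE INTO mdm.party tgt
            USING mdm.party_staging src ON tgt.party_id = src.party_id
            WHEN MATCHED THEN UPDATE SET tgt.name = src.name
            WHEN NOT MATCHED THEN INSERT (party_id, name)
                VALUES (src.party_id, src.name)
        """)
    finally:
        conn.close()

with DAG(
    dag_id="mdm_snowflake_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="refresh_party_master", python_callable=refresh_party_master)
```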
Posted 3 days ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. As a member of this team, you get the opportunity to learn the fintech space, which is one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; managing the central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building the customer feature repository; building cost-optimization solutions like EMR optimizers; performing automation; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you've got the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA (see the boto3 sketch below).
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.

For Managers:
- Customer centricity and obsession for the customer.
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working.
- Ability to structure and organize teams, and streamline communication.
- Prior work experience executing large-scale data engineering projects.
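One concrete building block of the AWS platform work described above, sketched with boto3: submitting a Spark step to an existing EMR cluster. The cluster ID, region, and job script path are hypothetical; this is an illustration of the pattern, not Kotak's actual setup.

```python
# Submit a spark-submit step to a running EMR cluster via boto3.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")  # placeholder region

response = emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLECLUSTER",  # placeholder cluster id
    Steps=[
        {
            "Name": "daily-ingest",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",  # standard EMR step runner
                "Args": [
                    "spark-submit",
                    "--deploy-mode", "cluster",
                    "s3://example-bucket/jobs/daily_ingest.py",  # placeholder script
                ],
            },
        }
    ],
)
print(response["StepIds"])  # step ids to poll for completion
```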
Posted 3 days ago
6.0 - 10.0 years
1 - 1 Lacs
Bengaluru
Remote
We are looking for a highly skilled Senior ETL Consultant with strong expertise in Informatica Intelligent Data Management Cloud (IDMC) components such as IICS, CDI, CDQ, IDQ, CAI, along with proven experience in Databricks.
Posted 3 days ago
7.0 - 11.0 years
12 - 22 Lacs
Bengaluru
Work from Office
Role: Lead SQL Server Developer
Location: Bangalore (Ashok Nagar)
Experience: 8-10 years of prior experience, including 3+ years of team lead experience.
Education: Bachelor's/Master's degree in Technology.
Salary: Negotiable
Job Type: Full Time (On Role)
Mode of Work: Work from Office

Job Description. What you will be doing:
- Working on Microsoft SQL Server 2012 and above server-side development.
- Designing schemas that are usable across multiple customers.
- Designing and developing T-SQL (Transact-SQL) stored procedures, functions, triggers, and SSIS packages.
- Developing underlying data models and databases.
- Developing, managing, and maintaining the data dictionary and/or metadata.
- Performance tuning and query optimization.
- Ensuring compliance with standards and conventions in developing programs.
- Analysing and resolving complex issues without oversight from other people.
- Performing quality checks on reports and exports to ensure exceptional quality.
- Working in a Scrum environment.

Preferred Skills:
- Knowledge of tools like Qlik/Qlik Sense and/or other data visualization tools.
- Understanding of .NET code/jQuery experience is a plus.
- Knowledge of the Microsoft reporting tool (SSRS).
- Experience in database administration activities.

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com:
1) Present CTC (Fixed + VP)
2) Expected CTC
3) No. of years of experience
4) Notice period
5) Offer in hand
6) Reason for change
7) Present location
Posted 4 days ago
4.0 - 9.0 years
5 - 9 Lacs
Gurugram
Work from Office
Key Responsibilities:

ETL Development and Maintenance:
- Design, develop, and implement ETL processes using SSIS to support data integration and warehousing requirements.
- Maintain and enhance existing ETL workflows to ensure data accuracy and integrity.
- Collaborate with data analysts, data architects, and other stakeholders to understand data requirements and translate them into technical specifications.
- Extract, transform, and load data from various source systems into the data warehouse.
- Perform data profiling, validation, and cleansing to ensure high data quality.
- Monitor ETL processes to ensure timely and accurate data loads.
- Write and optimize complex SQL queries to extract and manipulate data.
- Work with SQL Server to manage database objects, indexes, and performance tuning.
- Ensure data security and compliance with industry standards and regulations.

Business Intelligence and Reporting:
- Develop and maintain interactive dashboards and reports using Power BI or SSRS.
- Collaborate with business users to gather requirements and create visualizations that provide actionable insights.
- Integrate Power BI with other data sources and platforms for comprehensive reporting.

Scripting and Automation (a sketch follows below):
- Utilize Python for data manipulation, automation, and integration tasks.
- Develop scripts to automate repetitive tasks and improve efficiency.

Insurance Domain Expertise:
- Leverage knowledge of insurance industry processes and terminology to effectively manage and interpret insurance data.
- Work closely with business users and stakeholders within the insurance domain to understand their data needs and provide solutions.

Required Skills and Qualifications:

Technical Skills:
- Proficient in SQL and experienced with SQL Server.
- Strong experience with SSIS for ETL development and data integration.
- Proficiency in Python for data manipulation and scripting.
- Experience with Power BI/SSRS for developing interactive dashboards and reports.
- Knowledge of data warehousing concepts and best practices.

Domain Knowledge:
- Solid understanding of insurance industry processes, terminology, and data structures.
- Experience working with insurance-related data, such as policies, claims, underwriting, and actuarial data.

Additional Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.

Job Location
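As an illustration of the Scripting and Automation duties above, here is a hedged Python sketch of a pre-load data quality gate that could sit alongside SSIS. The file name, columns, and thresholds are invented for the example.

```python
# Validate a staged insurance claims extract before the warehouse load;
# a non-zero exit code lets the scheduler or SSIS step register failure.
import sys

import pandas as pd

def validate_claims_extract(path):
    df = pd.read_csv(path, parse_dates=["claim_date"])
    errors = []

    if df["claim_id"].duplicated().any():
        errors.append("duplicate claim_id values found")
    if df["paid_amount"].lt(0).any():
        errors.append("negative paid_amount values found")
    null_rate = df["policy_number"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing policy numbers
        errors.append("policy_number null rate too high: {:.2%}".format(null_rate))
    return errors

if __name__ == "__main__":
    problems = validate_claims_extract("claims_extract.csv")  # placeholder file
    if problems:
        print("\n".join(problems))
        sys.exit(1)
```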
Posted 4 days ago
4.0 - 9.0 years
8 - 12 Lacs
Hyderabad, Pune
Work from Office
Sr. MuleSoft Developer
- Design and implement MuleSoft solutions using Anypoint Studio, Mule ESB, and other related technologies.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality ETL processes.
- Develop and maintain APIs using RAML and other industry standards.
- Strong understanding of RAML (RESTful API Modeling Language) and its usage in API design.
- Develop complex integrations between various systems, including cloud-based applications such as Snowflake.
- Ensure seamless data flow by troubleshooting issues and optimizing existing integrations.
- Provide technical guidance on best practices for data warehousing, ETL development, and the PL/SQL programming language.
- Strong understanding of SQL concepts, including database schema design, query optimization, and performance tuning.
- Proficiency in developing complex ETL processes using various technologies such as cloud platforms and data warehousing tools (Snowflake).
- Experience working with multiple databases and ability to write efficient PL/SQL code snippets.
Posted 4 days ago
15.0 - 20.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Assist in the documentation of data architecture and integration processes.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and methodologies.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data modeling concepts and practices.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 4 days ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Shift: (GMT+05:30) Asia/Kolkata (IST)

Must-have skills: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL Development, Data Lake Architecture, Python, Scala, GCP (BigQuery, Dataproc, Dataflow, Cloud Composer), AWS Big Data Stack, Azure

Wayfair is looking for:

About the job: The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience with all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages, and product carousels. You will build and scale data platforms that measure the effectiveness of Wayfair's ad costs and media attribution, informing both day-to-day and major marketing spend decisions.

About the Role: As a Data Engineer, you will be part of the Data Engineering team, with this role being inherently multi-functional; the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, an ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
- Build and launch data pipelines and data products focused on the SMART org.
- Help teams push the boundaries of insights, creating new product features using data, and powering machine learning models.
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What you'll need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 3+ years of relevant work experience in the Data Engineering field with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, with query performance tuning skills for large data sets.
- Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies on other cloud platforms like AWS or Azure (see the BigQuery sketch below).
- Be a team player and introduce/follow best practices in the data engineering space.
- Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.

Good to have:
- Understanding of NoSQL databases and Pub/Sub architecture setup.
- Familiarity with BI tools like Looker, Tableau, AtScale, PowerBI, or similar.
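Since the role highlights GCP BigQuery pipelines, here is a small sketch assuming the google-cloud-bigquery client library. The project, dataset, and table names are invented; it shows the general pattern of materializing an aggregate into a destination table, not Wayfair's actual jobs.

```python
# Aggregate ad events into a daily channel metrics table in BigQuery.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

query = """
    SELECT event_date, channel, COUNT(*) AS clicks, SUM(cost) AS spend
    FROM `example-project.marketing.ad_events`
    GROUP BY event_date, channel
"""

job_config = bigquery.QueryJobConfig(
    destination="example-project.marketing.daily_channel_metrics",
    write_disposition="WRITE_TRUNCATE",  # rebuild the table on each run
)

client.query(query, job_config=job_config).result()  # block until done
```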
Posted 4 days ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai, Ahmedabad
Work from Office
We are seeking a skilled and detail-oriented SAS Developer with 3-5 years of experience, proficient in SAS Visual Analytics (VA), Visual Investigator (VI), and Data Integration (DI). The candidate will work on high-impact projects for international clients, supporting solutions across business domains such as banking, financial services, and insurance. The ideal candidate should be open to working in international time zones when assigned to remote projects.

Key Responsibilities:
- Develop, enhance, and maintain SAS solutions using SAS VA, SAS VI, and SAS DI.
- Perform data extraction, transformation, and loading (ETL) processes using SAS DI Studio.
- Create interactive dashboards and reports using SAS Visual Analytics.
- Collaborate with business analysts, project managers, and end users to gather requirements and deliver technical solutions.
- Troubleshoot and optimize existing SAS code and processes for performance and scalability.
- Ensure data quality and integrity in reporting and analysis tasks.
- Support deployment, testing, and validation of SAS components.
- Work independently or as part of a team for global delivery in international client engagements.
- Follow best practices in documentation, version control, and development standards.

Qualifications:
- 3 to 5 years of hands-on experience in SAS development.
- Strong experience in SAS VA (Visual Analytics), SAS VI (Visual Investigator), and SAS DI (Data Integration).
- Good understanding of data warehousing concepts and ETL development.
- Familiarity with SQL and database platforms like Oracle, Teradata, or SQL Server.
- Excellent problem-solving skills and attention to detail.
- Strong communication and client interaction skills.
- Ability to work in international time zones (e.g., US, UK, or Middle East) when assigned to remote projects.
- Bachelor's degree in Computer Science, Information Systems, or a related field.

Good to Have:
- Experience working in banking or credit risk domains.
- Exposure to cloud-based SAS solutions (e.g., SAS Viya).

Remote: Open for international client projects (must be flexible with working hours).
Joining: Immediate to 30 days preferred.
Posted 5 days ago
8.0 - 10.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Position: Solution Architect (ETL)
Location: Bangalore
Experience: 8 years
CTC: As per industry standards
Immediate joiners

# Job Summary
We are seeking an experienced Solution Architect (ETL) to design and implement data integration solutions using ETL (Extract, Transform, Load) tools. The ideal candidate will have a strong background in data warehousing, ETL, and data architecture.

# Key Responsibilities
1. Design and Implement ETL Solutions: Design and implement ETL solutions using tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator.
2. Data Architecture: Develop and maintain data architectures that meet business requirements and ensure data quality and integrity.
3. Data Warehousing: Design and implement data warehouses that support business intelligence and analytics.
4. Data Integration: Integrate data from various sources, including databases, files, and APIs.
5. Data Quality and Governance: Ensure data quality and governance by implementing data validation, data cleansing, and data standardization processes (a Python sketch of such a step appears below).
6. Collaboration: Collaborate with cross-functional teams, including business stakeholders, data analysts, and IT teams.
7. Technical Leadership: Provide technical leadership and guidance to junior team members.

# Requirements
1. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Experience: Minimum 8 years of experience in ETL development, data warehousing, and data architecture.
3. Technical Skills: ETL tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator; data warehousing and business intelligence tools such as Oracle, Microsoft, or SAP; programming languages such as Java, Python, or C#; data modeling and data architecture concepts.
4. Soft Skills: Excellent communication and interpersonal skills; strong problem-solving and analytical skills; ability to work in a team environment and lead junior team members.

# Nice to Have
1. Certifications: Certifications in ETL tools, data warehousing, or data architecture.
2. Cloud Experience: Experience with cloud-based data integration and data warehousing solutions.
3. Big Data Experience: Experience with big data technologies such as Hadoop, Spark, or NoSQL databases.

# What We Offer
1. Competitive Salary: Competitive salary and benefits package.
2. Opportunities for Growth: Opportunities for professional growth and career advancement.
3. Collaborative Work Environment: Collaborative work environment with a team of experienced professionals.
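To make the data quality and governance responsibility concrete, here is a minimal Python sketch of a validation, cleansing, and standardization step. Field names and rules are hypothetical; in practice an ETL tool such as Informatica or SSIS would express the same logic declaratively.

```python
# Standardize, validate, and deduplicate a customer extract with pandas.
import pandas as pd

def standardize_customers(df):
    out = df.copy()
    # Standardization: trim whitespace and normalize case.
    out["email"] = out["email"].str.strip().str.lower()
    out["full_name"] = out["full_name"].str.strip().str.title()
    # Validation/cleansing: drop records failing basic rules.
    valid_email = out["email"].str.contains("@", na=False)
    out = out[valid_email & out["customer_id"].notna()]
    # Deduplication on the business key, keeping the latest record.
    return out.sort_values("updated_at").drop_duplicates("customer_id", keep="last")

sample = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "full_name": ["  ada lovelace ", "Ada Lovelace", "alan turing"],
    "email": ["ADA@example.com ", "ada@example.com", "bad-email"],
    "updated_at": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-03-01"]),
})
print(standardize_customers(sample))  # one cleaned row per customer
```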
Posted 5 days ago
5.0 - 8.0 years
9 - 13 Lacs
Mumbai
Work from Office
Skill required: Data Management - AWS Architecture
Designation: Data Eng, Mgmt & Governance Sr Analyst
Qualifications: BE/BTech
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do? Data & AI: In this role, you will be responsible for designing, developing, implementing, and managing distributed applications and systems on the AWS platform. You will be responsible for ETL development, data analysis, technical design, and testing in the AWS environment.

What are we looking for?
- AWS
- Python (programming language)
- PySpark
- Adaptable and flexible
- Ability to work well in a team
- Strong analytical skills
- Commitment to quality
- Agility for quick learning

Roles and Responsibilities:
- In this role you are required to analyse and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture.
- You are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions you make impact your own work and may impact the work of others.
- In this role you would be an individual contributor and/or oversee a small work effort and/or team.

Qualification: BE/BTech
Posted 5 days ago
5.0 - 10.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes.
- Experience with data integration and data warehousing.
- Knowledge of data quality and data governance principles.
- Hands-on experience with Ab Initio GDE and EME tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Chennai office.
- A 15 years full time education is required.
Posted 5 days ago
2.0 - 7.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Travel Designer Group: Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring with us over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as:
* Rez.Tez
* Affiliate.Travel
* Designer Voyages
* Designer Indya
* RezRewards
* RezVault
With a presence in 32+ countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/

Profile: ETL Developer

ETL tools (any one): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow and orchestration (any one; good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
Programming and scripting: SQL (advanced); Python (mandatory); Bash/Shell (mandatory); Java or Scala (optional, for Spark)
Databases and data warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
Cloud and data storage (any one or two): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub

Interested candidates may also share their resume at shivani.p@rezlive.com
Posted 5 days ago
7.0 - 9.0 years
12 - 15 Lacs
Hyderabad
Work from Office
We are seeking an experienced ETL Developer with a strong background in Python and Airflow to join our dynamic team in Hitech City, Hyderabad. The ideal candidate will have over 7 years of experience in ETL processes and data integration, with a focus on optimizing and enhancing data pipelines. While expertise in Snowflake is not mandatory, a strong understanding of RDBMS and SQL is essential.
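A hedged sketch of the Python + Airflow pipeline work this role centres on, using the Airflow 2.x TaskFlow API and the Postgres provider package. The connection IDs, SQL, and table names are placeholders, not a known production pipeline.

```python
# Hourly extract-transform-load of recent orders between two databases.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook

@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract():
        hook = PostgresHook(postgres_conn_id="source_db")  # placeholder conn id
        return hook.get_records(
            "SELECT order_id, amount FROM orders "
            "WHERE updated_at > now() - interval '1 hour'"
        )

    @task
    def transform(rows):
        # Drop null amounts and round to two decimals.
        return [(oid, round(float(amt), 2)) for oid, amt in rows if amt is not None]

    @task(retries=3)  # retries make transient DB failures self-healing
    def load(rows):
        hook = PostgresHook(postgres_conn_id="warehouse_db")  # placeholder conn id
        hook.insert_rows(table="stg_orders", rows=rows)

    load(transform(extract()))

orders_etl()
```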
Posted 5 days ago
5.0 - 8.0 years
25 - 30 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Title: Senior Data Developer with Strong MS/Oracle SQL and Python Skills and Critical Thinking

Description: The EDA team seeks a dedicated and detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle repetitive technical tasks, such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. Additionally, the role requires monitoring various daily and weekly jobs, which may include generation of revenue cycle reports and data delivery to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic health systems, and other relevant technologies.

Overview: As a Senior Developer I on the NYU EDA team, you will play a vital role in improving the operation of our data load and management processes. Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and verify that all scheduled jobs complete successfully. You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.

Responsibilities:
- Manage and perform Healthy Planet file loads into a data warehouse.
- Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary (see the sketch below).
- Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports.
- Collaborate with the data engineering team to streamline data processing workflows.
- Develop automation scripts to reduce manual intervention in repetitive tasks using SQL and Python.
- Ensure all data-related tasks are performed accurately and on time.
- Investigate and resolve data discrepancies and processing issues.
- Prepare and maintain documentation for processes and workflows.
- Conduct periodic data audits to ensure data integrity and compliance with defined standards.

Skillset Requirements:
- MS/Oracle SQL
- Python
- Data warehousing and ETL processes
- Monitoring tools such as Apache Airflow
- Data quality and integrity assurance
- Strong analytical and problem-solving abilities
- Excellent written and verbal communication

Additional Skillset:
- Familiarity with monitoring and managing Apache Airflow DAGs.

Experience: Minimum of 5 years of experience in a similar role, with a focus on data management and process automation. Proven track record of successfully managing complex data processes and meeting deadlines.

Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Certifications: Epic Cogito, MS/Oracle SQL, Python, or data management certifications are a plus.
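The monitor-and-rerun duty above can be scripted; here is a hedged sketch against the Airflow 2.x stable REST API using requests. The host, credentials, and DAG ID are placeholders, and a real deployment would use proper authentication and alerting rather than this minimal loop.

```python
# Find failed runs of a DAG and clear their failed task instances so the
# scheduler re-executes them.
import requests

AIRFLOW = "http://airflow.example.internal:8080/api/v1"  # placeholder host
AUTH = ("monitor_user", "secret")                        # placeholder credentials

def failed_runs(dag_id):
    resp = requests.get(
        f"{AIRFLOW}/dags/{dag_id}/dagRuns",
        params={"state": "failed", "limit": 25},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return [run["dag_run_id"] for run in resp.json()["dag_runs"]]

def rerun(dag_id, dag_run_id):
    # Clearing failed task instances prompts the scheduler to retry them.
    resp = requests.post(
        f"{AIRFLOW}/dags/{dag_id}/clearTaskInstances",
        json={"dag_run_id": dag_run_id, "only_failed": True, "dry_run": False},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()

for run_id in failed_runs("healthy_planet_file_load"):  # placeholder DAG id
    rerun("healthy_planet_file_load", run_id)
```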
Posted 5 days ago
3.0 - 8.0 years
20 - 30 Lacs
Hyderabad, Pune
Hybrid
Job Summary: Join our team and see what we'll accomplish together. As an MDM Developer, you will be responsible for implementing and managing Master Data Management (MDM) projects. The ideal candidate will have extensive experience with Informatica MDM and proficiency in configuring MDM tools and integrating them with cloud environments. You will utilize your expertise in data engineering to build and maintain data pipelines, optimize performance, and ensure data quality. You will work as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practice in development, security, accessibility, and design to achieve the highest quality of service for our customers.

Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js. You will be part of the team building data pipelines that support our marketing, finance, campaign, and executive leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you'll be building pipelines that support insights to enable our business partners' analytics and campaigns. You are a fast-learning, highly technical, passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.

Here's how:
- Learn new skills and advance your data development practice
- Analyze and profile data
- Design, develop, test, deploy, maintain, and improve batch and real-time data pipelines
- Assist with design and development of solution prototypes
- Support consumers with understanding the data outcomes and technical design
- Collaborate closely with multiple teams in an agile environment

What you bring:
- You are a senior developer with 3+ years of experience in IT platform implementation in a technical capacity
- Bachelor of Computer Science, Engineering, or equivalent
- Extensive experience with Informatica MDM (Multi-Domain Edition) version 10
- Proficiency in MDM configuration, including the Provisioning Tool, Business Entity Services, Customer 360, data modeling, match rules, cleanse rules, and metadata analysis
- Expertise in configuring data models, match and merge rules, database schemas, and trust and validation settings
- Understanding of data warehouse/cloud architectures and ETL processes
- Working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
- Experience with the Google Cloud Platform (GCP) and its related technologies (Kubernetes, CloudSQL, PubSub, Storage, Logging, Dashboards, Airflow, BigQuery, BigTable, Python, BQ SQL, Dataplex, Datastream, etc.)
- Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi, and other related ETL tools
- Experience with Informatica MDM preferred, though strong skills in other MDM tools are still an asset
- Experience working with message queues like JMS, Kafka, and PubSub
- A passion for data quality

Great-to-haves:
- Experience with Informatica MDM SaaS
- Experience with Python and software engineering best practices
- API development using Node.js and testing using Postman/SoapUI
- Understanding of TMF standards
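Given the streaming stack named above (NiFi, Kafka, PubSub), here is a minimal sketch assuming the google-cloud-pubsub client library: publishing an MDM change event for a downstream pipeline to consume. The project, topic, and payload shape are invented for the example.

```python
# Publish a master-data change event to a Pub/Sub topic.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "mdm-party-changes")  # placeholders

def publish_change_event(party_id, change_type):
    payload = json.dumps({"party_id": party_id, "change": change_type}).encode("utf-8")
    future = publisher.publish(topic_path, payload)  # returns a future
    future.result(timeout=30)  # block until the broker acknowledges

publish_change_event("P-1001", "MERGE")
```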
Posted 5 days ago