3.0 - 5.0 years
22 - 25 Lacs
Bengaluru
Work from Office
Job Description: We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on enterprise products built on AI and Big Data engineering, leveraging the AWS/Databricks tech stack. You will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration specialists.

Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain on the Databricks/AWS tech stack
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL
- Experience writing and interpreting SQL queries, designing data models and data standards
- Experience with SQL Server, Oracle, and/or cloud databases
- Experience with data warehouses and data marts, Star and Snowflake models
- Experience loading data into databases from databases and files
- Experience analyzing and drawing design conclusions from data profiling results
- Understanding of business processes and the relationships between systems and applications
- Must be comfortable conversing with end users
- Must be able to manage multiple projects/clients simultaneously
- Excellent analytical, verbal, and communication skills

Role and Responsibilities:
- Work with business stakeholders to build data solutions that address analytical and reporting requirements
- Work with application developers and business analysts to implement and optimise Databricks/AWS-based implementations that meet data requirements
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow (see the sketch below)
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation
- Develop and optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases
- Conduct root cause analysis and resolve production problems and data issues
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings
- Provide support for production problems and daily batch processing
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta Tables, Parquet), and views to ensure data integrity and performance
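For illustration only, a minimal PySpark sketch of the kind of Databricks pipeline this posting describes. The storage paths, column names, and filter logic are hypothetical placeholders, not details from the role.

```python
# Minimal extract-transform-load sketch in PySpark (illustrative; paths are invented)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files landed in cloud storage (hypothetical path)
raw = (spark.read.option("header", "true")
       .option("inferSchema", "true")
       .csv("s3://example-bucket/raw/orders/"))

# Transform: basic cleansing and type normalization
clean = (raw.dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_date"))
         .filter(F.col("amount") > 0))

# Load: write as a Delta table (requires the Delta Lake runtime/package)
clean.write.format("delta").mode("overwrite").save("s3://example-bucket/curated/orders/")
```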
Posted 5 days ago
4.0 - 8.0 years
0 - 1 Lacs
Bengaluru
Remote
Offshore Senior Developer:
- Performs detailed design of complex applications and complex architecture components
- May lead a small group of developers in configuring, programming, and testing
- Fixes medium to complex defects and resolves performance problems
- Accountable for service commitments at the individual request level for in-scope applications
- Monitors, tracks, and participates in ticket resolution for assigned tickets
- Manages code reviews and mentors other developers

Skill/Experience/Education - Mandatory Skills: Google BigQuery development; ETL; SQL; Linux (preferable); SSIS package building and troubleshooting; advanced data modeling
Posted 5 days ago
5.0 - 8.0 years
10 - 20 Lacs
Chennai, Bengaluru
Work from Office
ODI Developer - Chennai/Bangalore, WFO only. 5-8 years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). ODI expertise is a must; profiles with ETL experience only in Informatica or tools other than ODI will be rejected. Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL development. Experience in data integration, transformation, and loading from heterogeneous data sources. Strong understanding of data modeling concepts and ETL best practices. Familiarity with performance tuning and troubleshooting of ETL processes. Knowledge of scripting languages (e.g., Python, JavaScript) for automation is a plus. Excellent analytical and problem-solving skills. Strong communication skills to work effectively with cross-functional teams. Please call Varsha (7200847046) for more information.
Posted 1 week ago
5.0 - 10.0 years
7 - 17 Lacs
Chennai, Bengaluru
Work from Office
5+ years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL. Experience in data integration, transformation, and loading from heterogeneous data sources.
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Pune
Work from Office
Develop and optimize Big Data solutions using Apache Spark. Work extensively with PySpark and data engineering tools. Handle real-time data processing using Kafka and Spark Streaming (see the sketch below). Design and implement ETL pipelines and migrate workflows to Spark.

Required Candidate profile: Hands-on experience with Hadoop, HDFS, and YARN. Strong programming skills in Scala, Java, and Python. Exposure to CI/CD automation for Big Data workflows.
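As a rough illustration of the Kafka + Spark Streaming work mentioned above, here is a minimal Structured Streaming sketch. The broker address, topic name, and checkpoint path are assumptions, and the job needs the spark-sql-kafka connector package on the classpath.

```python
# Illustrative Kafka -> Spark Structured Streaming read (broker/topic are placeholders)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_stream").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
          .option("subscribe", "events")                      # hypothetical topic
          .load())

# Kafka delivers bytes; cast the value column to string before parsing downstream
parsed = events.select(F.col("value").cast("string").alias("payload"))

query = (parsed.writeStream.format("console")
         .option("checkpointLocation", "/tmp/chk/events")     # hypothetical path
         .start())
query.awaitTermination()
```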
Posted 1 week ago
6.0 - 11.0 years
30 - 35 Lacs
Hyderabad, Delhi / NCR
Hybrid
Support enhancements to the MDM and Performance platform. Track system performance, troubleshoot issues, and resolve production issues.

Required Candidate profile: 5+ years in Python and advanced SQL, including profiling and refactoring. Experience with REST APIs and hands-on AWS Glue, EMR, etc. Experience with Markit EDM, Semarchy, or another MDM tool is a plus.
Posted 1 week ago
8.0 - 13.0 years
18 - 27 Lacs
Hyderabad
Hybrid
We are looking for an experienced ETL developer who will be responsible for the whole lifecycle of assigned ETL pipeline creation and maintenance tasks, from concept to PROD release, including data analysis and requirements elicitation, implementing data pipelines, testing, gathering approvals, and migrating code. The ideal fit is a self-organized, results-oriented person who can work without supervision. The ideal candidate has 10+ years of experience across the ETL/DB/DWH/Business Intelligence space.

Required skills:
- Strong SQL skills: the ability to read, modify, and create complex queries, combined with good business analysis skills to interpret data, judge data quality, and summarize information accordingly
- Experience developing ETL / building data pipelines with ETL tools or programming languages
- Preferably experienced in multiple DW/BI aspects, from ETL, data access control, and data quality to metadata management and data governance
- Excellent independent decision-making capabilities and a solution-oriented attitude; ability to prioritize own tasks and manage dates

Nice to have:
- Advanced knowledge of Google Cloud BigQuery
- Profound knowledge of Google Cloud Platform
- Passing knowledge of Python + Apache Beam/Google Cloud Dataflow (a minimal sketch follows below)
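A minimal Apache Beam sketch of the "nice to have" Python + Beam/Dataflow skill, for illustration only. The file names and parsing logic are invented, and on GCP the same pipeline would be submitted to Dataflow via pipeline options (e.g., the DataflowRunner).

```python
# Tiny Beam pipeline: read lines, parse a numeric field, filter, write results.
# Runs locally with the DirectRunner; file names are hypothetical.
import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | "Read" >> beam.io.ReadFromText("input.csv")
     # Naive CSV parsing for illustration; real pipelines would use a schema
     | "ParseAmount" >> beam.Map(lambda line: float(line.split(",")[1]))
     | "KeepPositive" >> beam.Filter(lambda amount: amount > 0)
     | "Format" >> beam.Map(str)
     | "Write" >> beam.io.WriteToText("output"))
```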
Posted 1 week ago
4.0 - 6.0 years
10 - 16 Lacs
Kolkata, Pune, Mumbai (All Areas)
Hybrid
JOB TITLE: Software Developer II: Oracle Data Integrator (ODI)

OVERVIEW OF THE ROLE: We are looking for an experienced Oracle Data Integrator (ODI) and Oracle Analytics Cloud (OAC) Consultant to join our dynamic team. You will be responsible for designing, implementing, and optimizing cutting-edge data integration and analytics solutions. Your contributions will be pivotal in enhancing data-driven decision-making and delivering actionable insights across the organization.

Key Responsibilities:
- Develop robust data integration solutions using Oracle Data Integrator (ODI).
- Create, optimize, and maintain ETL/ELT workflows and processes.
- Configure and manage Oracle Analytics Cloud (OAC) to provide interactive dashboards and advanced analytics.
- Integrate and transform data from various sources to generate meaningful insights using OAC.
- Monitor and troubleshoot data pipelines and analytics solutions to ensure optimal performance.
- Ensure data quality, accuracy, and integrity across integration and reporting systems.
- Provide training and support to end users for OAC and ODI solutions.
- Analyze, design, develop, fix, and debug software programs for commercial or end-user applications; write code, complete programming, and perform testing and debugging of applications.

Technical Skills:
- Expertise in ODI components such as Topology, Designer, Operator, and Agent.
- Experience in Java and WebLogic development.
- Proficiency in developing OAC dashboards, reports, and KPIs.
- Strong knowledge of SQL and PL/SQL for advanced data manipulation.
- Familiarity with Oracle databases and Oracle Cloud Infrastructure (OCI).
- Experience in data modeling and designing data warehouses.
- Strong analytical and problem-solving abilities.
- Excellent communication and client-facing skills.
- Hands-on, end-to-end DWH implementation experience using ODI.
- Experience developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.; able to implement reusability, parameterization, workflow design, etc.
- Expertise in the Oracle ODI tool set and Oracle PL/SQL; knowledge of the ODI Master and Work repositories.
- Knowledge of data modelling and ETL design.
- Setting up Topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc.
- Packaging components and database operations like Aggregate, Pivot, Union, etc. using ODI mappings; error handling; automation using ODI; Load Plans; migration of objects.
- Design and develop complex mappings, process flows, and ETL scripts.
- Experience in performance tuning of mappings.
- Ability to design ETL unit test cases and debug ETL mappings.
- Expertise in developing Load Plans and scheduling jobs.
- Ability to design a data quality and reconciliation framework using ODI.
- Integrate ODI with multiple sources/targets.
- Experience in error recycling/management using ODI and PL/SQL.
- Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications.
- Experience creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
- Experience in data migration using SQL*Loader and import/export.
- Experience in SQL tuning and optimization using explain plans and SQL trace files.
- Strong knowledge of ELT/ETL concepts, design, and coding.
- Partitioning and indexing strategy for optimal performance.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
- Ability to work with minimal guidance or supervision in a time-critical environment.

Experience:
- 4-6 years of overall experience in industry.
- 3+ years of experience with Oracle Data Integrator (ODI) in data integration projects.
- 2+ years of hands-on experience with Oracle Analytics Cloud (OAC).

Preferred Skills:
- Knowledge of Oracle Autonomous Data Warehouse (ADW) and Oracle Integration Cloud (OIC).
- Familiarity with other analytics tools like Tableau or Power BI.
- Experience with scripting languages such as Python or shell scripting.
- Understanding of data governance and security best practices.

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.

ABOUT HASHEDIN: We are software engineers who solve business problems with a product mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme-ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US? With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make? Visit us @ https://hashedin.com
Posted 1 week ago
6.0 - 8.0 years
15 - 18 Lacs
Bengaluru
Work from Office
Role Overview: For this newly created role, we are seeking a technically proficient Online Data Analyst to join our team. You will be responsible for extracting, transforming, and loading (ETL) data from various online sources using APIs, managing and querying large datasets with SQL, and ensuring the reliability and performance of our data systems through Azure monitoring and alerts. Your analytical skills will be essential in uncovering actionable insights from complex datasets, and you will work closely with cross-functional teams to support data-driven decision-making.

Key Responsibilities:
- Data Collection and Management: Extract, transform, and load (ETL) data from various sources into our online applications using tools and processes. Utilize APIs to automate data integration from diverse platforms (see the sketch below). Maintain and enhance existing data pipelines, ensuring data integrity and consistency.
- Data Analysis: Conduct in-depth data analysis to uncover trends, patterns, and actionable insights. Utilize SQL for querying, managing, and manipulating large datasets. Create and maintain interactive dashboards and reports to present data insights to stakeholders.
- Monitoring and Alerts: Implement and manage Azure monitoring and alerting systems to ensure data workflows and applications are functioning optimally. Proactively identify and troubleshoot issues in data processes, ensuring minimal downtime and maximum reliability.
- Collaboration and Communication: Collaborate with cross-functional teams including marketing, product development, and IT to understand data needs and provide analytical support. Communicate complex data findings and recommendations to both technical and non-technical audiences. Contribute to continuous improvement of data processes, analytical methodologies, and best practices.

Critical Competencies for Success:
- Educational Background: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.
- Technical Skills: Proficient in SQL for data querying and manipulation. Experience with ETL processes and tools. Strong understanding of API integration and data automation. Hands-on experience with Azure monitoring and alerting tools. Knowledge of languages such as HTML and JavaScript is a plus.
- Experience: Proven experience in a data analyst role or similar position. Demonstrated experience with online data sources and web analytics. Experience with cloud platforms, particularly Azure, is required.
- Analytical Skills: Strong problem-solving skills and attention to detail. Ability to analyze large datasets and generate meaningful insights. Excellent statistical and analytical capabilities.
- Soft Skills: Strong communication and presentation skills. Ability to work independently and collaboratively in a team environment. Good organizational skills and ability to manage multiple projects simultaneously.

Perks and benefits: Industry standards.
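Purely as a hedged illustration of the API-driven extraction this role centers on: pull JSON from an online source with Python, flatten it with pandas, and stage it for SQL-based analysis. The URL and field names are invented placeholders.

```python
# Illustrative API extraction and staging step (endpoint and fields are hypothetical)
import requests
import pandas as pd

resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()

# Assumes the endpoint returns a JSON array of flat records
df = pd.DataFrame(resp.json())
df["order_date"] = pd.to_datetime(df["order_date"])

# Stage locally; a real pipeline would load to a database (e.g., via DataFrame.to_sql)
df.to_csv("orders_staged.csv", index=False)
```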
Posted 1 week ago
3.0 - 6.0 years
7 - 15 Lacs
Gurugram
Work from Office
Dear Candidate, Greetings! Hiring for SSIS Developer - Gurgaon (WFO). Responsibilities: (1) must have experience with SSIS packages for ETL processes; (2) end-to-end data migration; (3) must have experience with Oracle Cloud. Share resume at abhishek@xinoe.com. Regards,
Posted 1 week ago
8.0 - 12.0 years
10 - 14 Lacs
Pune
Work from Office
Job Summary: This position provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/she participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements; provides input to applications development project plans and integrations; collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives; and provides knowledge and support for applications development, integration, and maintenance, including input to department and project teams on decisions supporting projects.

Responsibilities:
- Full stack developer with Java, Oracle, and Angular; DevOps and Agile project management experience is a plus.
- Plans, develops, and manages the organization's information software, applications, systems, and networks.
- Application containerization (Kubernetes, Red Hat OpenShift).
- Experience with public cloud (e.g., Google, Azure).
- Performs systems analysis and design.
- Designs and develops moderate to highly complex applications.
- Develops application documentation.
- Produces integration builds.
- Performs maintenance and support.
- Supports emerging technologies and products.
- Ensures UPS's business needs are met through continual upgrades and development of new technical solutions.

Qualifications: 8-12 years of experience; Bachelor's Degree or international equivalent.
Posted 1 week ago
5.0 - 8.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners: KPI Partners is a leading provider of technology consulting and solutions, specializing in delivering high-quality services that enable organizations to optimize their operations and achieve their strategic objectives. We are committed to empowering businesses through innovative solutions and a strong focus on customer satisfaction.

Job Description: We are seeking an experienced and detail-oriented ODI Developer to join our dynamic team. The ideal candidate will have a strong background in Oracle Data Integration and ETL processes, possess excellent problem-solving skills, and demonstrate the ability to work collaboratively within a team environment. As an ODI Developer at KPI Partners, you will play a crucial role in designing, implementing, and maintaining data integration solutions that support our clients' analytics and reporting needs.

Key Responsibilities:
- Design, develop, and implement data integration processes using Oracle Data Integrator (ODI) to extract, transform, and load (ETL) data from various sources.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Optimize ODI processes and workflows for performance improvements and ensure data quality and accuracy.
- Troubleshoot and resolve technical issues related to ODI and data integration processes.
- Maintain documentation related to data integration processes, including design specifications, integration mappings, and workflows.
- Participate in code reviews and ensure adherence to best practices in ETL development.
- Stay updated with the latest developments in ODI and related technologies to continuously improve solutions.
- Support production deployments and provide maintenance and enhancements as needed.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ODI Developer or in a similar ETL development role.
- Strong knowledge of Oracle Data Integrator and its components (repositories, models, mappings, etc.).
- Proficient in SQL and PL/SQL for querying and manipulating data.
- Experience with data warehousing concepts and best practices.
- Familiarity with other ETL tools is a plus.
- Excellent analytical and troubleshooting skills.
- Strong communication skills, both verbal and written.
- Ability to work independently and in a team-oriented environment.

Why Join KPI Partners?
- Opportunity to work with a talented and diverse team on cutting-edge projects.
- Competitive salary and comprehensive benefits package.
- Continuous learning and professional development opportunities.
- A culture that values innovative thinking and encourages collaboration.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 1 week ago
4.0 - 6.0 years
8 - 12 Lacs
Bengaluru
Remote
JOB TITLE: Software Engineer I
REPORTS TO: Lead Software Engineer

POSITION SUMMARY: The Software Engineer will be responsible for developing and maintaining ETL integrations with telecom carriers, ensuring the seamless exchange of billing data, service orders, and inventory updates. The ideal candidate will bring knowledge of EDI and other electronic data standards, a strong understanding of telecom billing formats, and experience working with TEM platforms. Working knowledge of Java for integration scripting and backend automation is a plus.

ESSENTIAL FUNCTIONS:
- Design, develop, and maintain ETL integrations between telecom carriers, TEM platforms, and internal systems.
- Utilize ETL tools and techniques to extract, transform, and load data from EDI and other electronic-format transactions into internal databases and applications, ensuring data accuracy and consistency across systems.
- Map and transform different electronic formats to meet both internal and partner requirements.
- Troubleshoot and resolve electronic transformation and transmission issues, ensuring timely and accurate data processing and exchange.
- Collaborate with TEM vendors and telecom providers to onboard new carriers and maintain data quality.
- Support end-to-end invoice processing workflows, from data ingestion to system reconciliation.
- Document technical specifications, mapping guidelines, and electronic-format process flows.
- Monitor EDI and other electronic-format system performance and proactively resolve issues or errors.
- Work cross-functionally with IT, telecom operations, and other teams to implement and optimize processes.
- Identify automation opportunities to improve workflow efficiency and data accuracy.
- Stay current on telecom industry trends and billing standards.

REQUIREMENTS:
- 4+ years of SQL experience, preferably on Oracle, or other database querying skills for data validation and troubleshooting; familiarity with PL/SQL or T-SQL.
- Solid experience on Unix, including basic shell scripting; user-level experience on Linux and Microsoft operating systems.
- Prior experience with telecom billing formats and invoice data structures, or other financial experience.
- Experience using ticket management systems (e.g., JIRA, ServiceNow).
- Diagnostic knowledge of FTP/SFTP for secure file transfers and batch job automation.
- Strong troubleshooting, analytical, and documentation skills; good organizational skills; ability to manage complex activities with a high level of detail.
- A well-organized and self-directed individual who can work with minimal supervision; must be a quick learner of new technologies and adaptable to change.

Good to Have:
- Familiarity with Java for backend integrations, data processing, or EDI/electronic-format middleware enhancements.
- Ability to independently identify, research, and resolve issues; ability to multi-task continuously; extreme attention to detail and accuracy; ability to communicate effectively with all levels within the organization.

ADDITIONAL RESPONSIBILITIES: Performs all other related duties as required or directed. Follow all safety rules and regulations. Work assigned hours as required.
Posted 2 weeks ago
8.0 - 13.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Must-Have Skills:
- Azure Databricks / PySpark (hands-on)
- SQL/PL-SQL (advanced level)
- Snowflake - 2+ years
- Spark/data pipeline development - 2+ years
- Azure Repos / GitHub, Azure DevOps
- Unix shell scripting
- Cloud technology experience

Key Responsibilities:
1. Design, build, and manage data pipelines using Azure Databricks, PySpark, and Snowflake (a Snowflake sketch follows below).
2. Analyze and resolve production issues (Tier 2 support with weekend/on-call rotation).
3. Write and optimize complex SQL/PL-SQL queries.
4. Collaborate on low-level and high-level design for data solutions.
5. Document all project deliverables and support deployment.

Good to Have: Knowledge of Oracle, Qlik Replicate, GoldenGate, Hadoop; job scheduler tools like Control-M or Airflow.

Behavioral: Strong problem-solving and communication skills.
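A small, assumption-laden sketch of querying Snowflake from Python with the official connector, as one piece of the pipeline stack listed above. The account, credentials, and object names are placeholders; a real job would pull them from a secrets manager.

```python
# Illustrative Snowflake query via the official Python connector (values are placeholders)
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    cur.execute("SELECT COUNT(*) FROM orders")  # hypothetical table
    print(cur.fetchone())
finally:
    cur.close()
    conn.close()
```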
Posted 2 weeks ago
3.0 - 5.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Company name: PulseData labs Pvt Ltd (captive unit for URUS, USA)

About URUS: We are the URUS family (US), a global leader in products and services for agritech.

SENIOR DATA ENGINEER: This role is responsible for the design, development, and maintenance of data integration and reporting solutions. The ideal candidate will possess expertise in Databricks, strong skills in SQL Server, SSIS, and SSRS, and experience with other modern data engineering tools such as Azure Data Factory. This position requires a proactive and results-oriented individual with a passion for data and a strong understanding of data warehousing principles.

Responsibilities:
- Data Integration: Design, develop, and maintain robust and efficient ETL pipelines and processes on Databricks. Troubleshoot and resolve Databricks pipeline errors and performance issues. Maintain legacy SSIS packages for ETL processes; troubleshoot and resolve SSIS package errors and performance issues. Optimize data flow performance and minimize data latency. Implement data quality checks and validations within ETL processes.
- Databricks Development: Develop and maintain Databricks pipelines and datasets using Python, Spark, and SQL. Migrate legacy SSIS packages to Databricks pipelines (an illustrative upsert sketch follows below). Optimize Databricks jobs for performance and cost-effectiveness. Integrate Databricks with other data sources and systems. Participate in the design and implementation of data lake architectures.
- Data Warehousing: Participate in the design and implementation of data warehousing solutions. Support data quality initiatives and implement data cleansing procedures.
- Reporting and Analytics: Collaborate with business users to understand data requirements for department-driven reporting needs. Maintain the existing library of complex SSRS reports, dashboards, and visualizations. Troubleshoot and resolve SSRS report issues, including performance bottlenecks and data inconsistencies.
- Collaboration and Communication: Comfortable in an entrepreneurial, self-starting, and fast-paced environment, working both independently and with our highly skilled teams. Collaborate effectively with business users, data analysts, and other IT teams. Communicate technical information clearly and concisely, both verbally and in writing. Document all development work and procedures thoroughly.
- Continuous Growth: Keep abreast of the latest advancements in data integration, reporting, and data engineering technologies. Continuously improve skills and knowledge through training and self-learning.

This job description reflects management's assignment of essential functions; it does not prescribe or restrict the tasks that may be assigned.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2+ years of experience in data integration and reporting.
- Extensive experience with Databricks, including Python, Spark, and Delta Lake.
- Strong proficiency in SQL Server, including T-SQL, stored procedures, and functions.
- Experience with SSIS (SQL Server Integration Services) development and maintenance.
- Experience with SSRS (SQL Server Reporting Services) report design and development.
- Experience with data warehousing concepts and best practices.
- Experience with the Microsoft Azure cloud platform and Microsoft Fabric desirable.
- Strong analytical and problem-solving skills; excellent communication and interpersonal skills.
- Ability to work independently and as part of a team; experience with Agile methodologies.
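For illustration, one common pattern when re-platforming SSIS-style incremental loads onto Databricks is a Delta Lake MERGE (upsert). This sketch assumes hypothetical table paths and a customer_id key, and requires the Delta Lake libraries that ship with the Databricks runtime.

```python
# Illustrative Delta Lake upsert, a typical replacement for an SSIS incremental load
# (paths and key column are assumptions for the example)
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ssis_migration").getOrCreate()

# New/changed records staged by an upstream extract
updates = spark.read.format("delta").load("/mnt/staging/customers")
target = DeltaTable.forPath(spark, "/mnt/curated/customers")

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()      # update existing rows
 .whenNotMatchedInsertAll()   # insert new rows
 .execute())
```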
Posted 2 weeks ago
7.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Hybrid
Company name: PulseData labs Pvt Ltd (captive unit for URUS, USA)

About URUS: We are the URUS family (US), a global leader in products and services for agritech.

SENIOR DATA ENGINEER: This role is responsible for the design, development, and maintenance of data integration and reporting solutions. The ideal candidate will possess expertise in Databricks, strong skills in SQL Server, SSIS, and SSRS, and experience with other modern data engineering tools such as Azure Data Factory. This position requires a proactive and results-oriented individual with a passion for data and a strong understanding of data warehousing principles.

Responsibilities:
- Data Integration: Design, develop, and maintain robust and efficient ETL pipelines and processes on Databricks. Troubleshoot and resolve Databricks pipeline errors and performance issues. Maintain legacy SSIS packages for ETL processes; troubleshoot and resolve SSIS package errors and performance issues. Optimize data flow performance and minimize data latency. Implement data quality checks and validations within ETL processes.
- Databricks Development: Develop and maintain Databricks pipelines and datasets using Python, Spark, and SQL. Migrate legacy SSIS packages to Databricks pipelines. Optimize Databricks jobs for performance and cost-effectiveness. Integrate Databricks with other data sources and systems. Participate in the design and implementation of data lake architectures.
- Data Warehousing: Participate in the design and implementation of data warehousing solutions. Support data quality initiatives and implement data cleansing procedures.
- Reporting and Analytics: Collaborate with business users to understand data requirements for department-driven reporting needs. Maintain the existing library of complex SSRS reports, dashboards, and visualizations. Troubleshoot and resolve SSRS report issues, including performance bottlenecks and data inconsistencies.
- Collaboration and Communication: Comfortable in an entrepreneurial, self-starting, and fast-paced environment, working both independently and with our highly skilled teams. Collaborate effectively with business users, data analysts, and other IT teams. Communicate technical information clearly and concisely, both verbally and in writing. Document all development work and procedures thoroughly.
- Continuous Growth: Keep abreast of the latest advancements in data integration, reporting, and data engineering technologies. Continuously improve skills and knowledge through training and self-learning.

This job description reflects management's assignment of essential functions; it does not prescribe or restrict the tasks that may be assigned.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 7+ years of experience in data integration and reporting.
- Extensive experience with Databricks, including Python, Spark, and Delta Lake.
- Strong proficiency in SQL Server, including T-SQL, stored procedures, and functions.
- Experience with SSIS (SQL Server Integration Services) development and maintenance.
- Experience with SSRS (SQL Server Reporting Services) report design and development.
- Experience with data warehousing concepts and best practices.
- Experience with the Microsoft Azure cloud platform and Microsoft Fabric desirable.
- Strong analytical and problem-solving skills; excellent communication and interpersonal skills.
- Ability to work independently and as part of a team; experience with Agile methodologies.
Posted 2 weeks ago
12.0 - 17.0 years
3 - 7 Lacs
Kolkata
Work from Office
Project Role: Data Management Practitioner
Project Role Description: Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum experience required: 12 years
Educational Qualification: any graduate

Summary: As a Data Management Practitioner, you will be responsible for maintaining the quality and compliance of an organization's data assets. Your role involves designing and implementing data strategies, ensuring data integrity, enforcing governance policies, and optimizing data usage within the organization.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform; responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Design and advise on data quality rules
- Set up effective data compliance policies
- Ensure data integrity and enforce governance policies
- Optimize data usage within the organization

Professional & Technical Skills:
- Must have: proficiency in Data Architecture Principles
- Strong understanding of data management best practices
- Experience in designing and implementing data strategies
- Knowledge of data governance and compliance policies
- Ability to optimize data usage for organizational benefit

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles
- This position is based at our Kolkata office
- A graduate degree in any discipline is required
Posted 2 weeks ago
2.0 - 4.0 years
8 - 11 Lacs
Chennai
Hybrid
This is an operational role responsible for providing data analysis and management support. The incumbent may seek an appropriate level of guidance and advice to ensure delivery of quality outcomes.

Responsibilities:
- Gather and prepare relevant data for use in analytics applications.
- Acquire data from primary or secondary data sources and support the maintenance of databases.
- Identify, analyze, and interpret trends or patterns in data sets.
- Filter and clean data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems.
- Develop and support ETL jobs, schedule batch jobs via cron, and perform database modeling for RDBMS.
- Gather data requirements, follow Scrum methodology, and take ownership from development to deployment.

Minimum qualification & experience: 6 months - 2 years of DB programming and any ETL tool (Pentaho preferred) / data engineering.

Desired skill sets:
- Database programming with an RDBMS such as Oracle, MySQL, or MariaDB
- ETL tool experience (Informatica; Pentaho preferred)
- Production support experience is desirable
- DB design
- Python skills are a value-add
- UNIX commands, job monitoring, and debugging skills
Posted 2 weeks ago
5.0 - 6.0 years
9 - 16 Lacs
Gurugram
Hybrid
Role Summary: We are seeking an experienced ETL Developer with strong expertise in Informatica PowerCenter, Oracle SQL/PL-SQL, and data warehousing concepts. The ideal candidate will play a key role in developing, optimizing, and maintaining ETL workflows, ensuring seamless data integration and transformation to support business-critical applications. Experience in Snowflake and job scheduling tools such as Control-M is a plus.

Key Responsibilities:
- Collaborate with technical leads, business analysts, and subject matter experts to understand data models and business requirements.
- Design, develop, and implement ETL solutions using Informatica PowerCenter.
- Develop, optimize, and maintain complex SQL/PL-SQL scripts to support data processing in Oracle databases.
- Provide accurate development estimates and deliver high-quality solutions within agreed timelines.
- Ensure data integrity, reconciliation, and exception handling by following best practices and development standards.
- Participate in cross-functional team meetings to coordinate dependencies and deliverables.
- Implement procedures for data maintenance, monitoring, and performance optimization.

Essential Skills & Experience:

Technical:
- Minimum 3+ years of hands-on experience with Informatica PowerCenter in ETL development.
- Experience with the Snowflake data warehouse platform.
- Familiarity with source control tools (e.g., Git, SVN).
- Proficiency in job scheduling tools like Control-M.
- Strong skills in UNIX shell scripting for automation.
- Solid experience (minimum 2 years) in SQL/PL-SQL development, including query tuning and optimization.
- In-depth understanding of data warehousing, datamart, and ODS concepts.
- Knowledge of data normalization, OLAP techniques, and Oracle performance optimization.
- Experience working with Oracle or SQL Server databases (3+ years) along with Windows/UNIX environment expertise.

Functional:
- Minimum 3 years of experience in the financial services sector or related industries.
- Sound understanding of data distribution, modeling, and physical database design.
- Ability to engage and communicate effectively with business stakeholders and data stewards.
- Strong problem-solving, analytical, interpersonal, and communication skills.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 17 Lacs
Hyderabad
Work from Office
About this role: Wells Fargo is seeking a Lead Analytics Consultant - People Analytics. As a consultant, you will work as an analytics professional in the HR People Analytics and Business Insights delivery team and will be responsible for effective delivery of projects per business priority. The incumbent is expected to be an expert in executive summaries, people strategy, HR consulting, HR advisory, advanced analytics and data science, and adding value to projects.

In this role, you will:
- Advise lines of business and companywide functions on business strategies based on research of performance metrics, trends in population distributions, and other complex data analysis to maximize profits and asset growth and minimize operating losses within risk and other operating standards
- Provide influence and leadership in the identification of new tools and methods to analyze data
- Ensure adherence to compliance and legal regulations and policies on all projects managed
- Provide updates on project logs, monthly budget forecasts, monthly newsletters, and operations reviews
- Assist managers in building quarterly and annual plans and forecast future market research needs for business partners supported
- Strategically collaborate and consult with peers, colleagues, and mid-level to senior managers to resolve issues and achieve goals
- Lead projects and teams, or serve as a peer mentor to staff, interns, and external contractors

Required Qualifications:
- 5+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- 5+ years of experience working with Tableau/Power BI/SQL, including working with complex datasets using SQL
- 5+ years of experience creating visualizations; dashboarding experience involving multiple views that all respond to navigation/filters/etc.; ability to publish assets that can be reused across dashboards/workbooks and used for self-service by other analysts working on the same domain (and/or to reuse cubes created by others where expedient)
- Comprehensive understanding of the HR business and related processes; domain understanding of HR and its complete product lifecycle (hire to retire) is an added advantage
- Ability to collaborate with cross-functional teams to address servicing challenges and optimize processes
- Able to work as an individual contributor and deliver end-to-end product development
- Good experience working with SQL/PL-SQL
- Experience working with SAS programming; knowledge of Tableau Prep and/or Alteryx is a plus
- Experience working with Python or any data science tools is an added advantage
- Hands-on experience in ETL development using any ETL tool; certifications in BI reporting tools, data management, or data engineering are good to have
- Expected to learn the business aspects quickly, multitask, and prioritize between projects
- Dedicated, enthusiastic, driven, and performance-oriented; possesses a strong work ethic and is a good team player

Job Expectations:
- Detail-oriented, results-driven, and able to navigate a quickly changing, high-demand environment while balancing multiple priorities
- Simple work documentation skills: requirements, query documentation, testing
- Consultative skills: the ability to rationalize business needs and solution design when stakeholders do not know how to ask precisely for what they need
- Strong written and verbal communication, presentation, and interpersonal skills
- Ability to perform analysis, build hypotheses, draw conclusions, and communicate clear, actionable recommendations to business leaders and partners
- Ability to interact with integrity and a high level of professionalism with all levels of team members and management
Posted 2 weeks ago
8.0 - 10.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Job Track Description: The ETL Data Architect will be responsible for driving data migration strategy and execution within a complex enterprise landscape. This role will bring in the best practices of data migration and integration with Salesforce: best practices for Salesforce data migration and integration, a data migration strategy for Salesforce implementations, and a template/uniform file format for migrating data into Salesforce.

Must-have skills:
- Data Architect with 8-10 years of ETL experience and 5+ years of Informatica Cloud (IICS, ICRT) experience
- 5+ years of experience with Salesforce systems
- Develop comprehensive data mapping and transformation plans to align data with the Salesforce data model and software solution
- Good understanding of the Salesforce data model and Schema Builder
- Excellent understanding of relational database concepts and how best to implement database objects in Salesforce
- Experience integrating large sets of data into Salesforce from multiple data sources
- Experience with EDI transactions
- Experience in design and development of ETL/data pipelines
- Excellent understanding of SOSL and SOQL and the Salesforce security model
- Full understanding of the project life cycle and development methodologies
- Ability to interact with technical and functional teams
- Excellent oral and written communication and presentation skills
- Should be able to work in an offshore/onsite model

Experience:
- Expert in ETL development with Informatica Cloud using various connectors
- Experience with real-time integrations and batch scripting
- Expert in implementing business rules by creating various transformations, working with multiple data sources (flat files, relational and cloud databases, etc.), and developing mappings
- Experience using ICS workflow tasks: Session, Control Task, Command tasks, Decision tasks, Event Wait, Email tasks, pre-session, post-session, and pre/post commands
- Ability to migrate objects through all phases (DEV, QA/UAT, and PRD) following standard defined processes
- Performance analysis with large data sets
- Experience writing technical specifications based on conceptual design and stated business requirements
- Experience designing and maintaining logical and physical data models and communicating them to peers and junior associates using flowcharts, unified data language, and data flow diagrams
- Good knowledge of SQL, PL/SQL, and data warehousing concepts
- Experience using Salesforce SOQL is a plus

Responsibilities:
- Excellent troubleshooting and debugging skills in Informatica Cloud
- Significant knowledge of PL/SQL, including tuning, triggers, ad hoc queries, and stored procedures
- Strong analytical skills
- Works under minimal supervision with some latitude for independent judgement
- Prepare and package scripts and code across development, test, and QA environments
- Participate in change control planning for production deployments
- Conduct tasks and assignments as directed
Posted 2 weeks ago
3.0 - 5.0 years
5 - 10 Lacs
Chennai
Work from Office
Job Summary: We are seeking a SAS Data Integration Developer to design, develop, and maintain Campaign Management Data Mart (CMDM) solutions, integrate multiple data sources, and ensure data quality for marketing analytics and campaign execution.

Key Responsibilities:
- Data Integration & ETL Development: Develop data ingestion, transformation, and deduplication pipelines. Standardize, cleanse, and validate large-scale customer data. Work with GaussDB, SAS ESP, APIs, and SAS DI Studio for data processing.
- Master Data Management (CMDM) Configuration: Implement unification and deduplication logic for a single customer view. Develop and manage data masking and encryption for security compliance.
- API & CI360 Integration: Integrate CMDM with SAS CI360 for seamless campaign execution. Ensure API connectivity and data flow across platforms.
- Testing & Deployment: Conduct unit, integration, and UAT testing. Deploy CMDM solutions to production and provide knowledge transfer.

Key Skills Required:
- SAS Data Integration Studio (SAS DI Studio): design, develop, and maintain the Campaign Management Data Mart (CMDM)
- Data management (SAS Base, SQL, data cleansing)
- SAS ESP, GaussDB, and API integration
- Data governance (RBAC, GDPR, PII compliance)
- Data masking and encryption techniques
Posted 2 weeks ago
3.0 - 8.0 years
35 - 50 Lacs
Bengaluru
Work from Office
About the Role: As a Data Engineer, you will be part of the Data Engineering team. The role is inherently multi-functional: the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, the ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
- Build and launch data pipelines and data products focused on the SMART org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models.
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What you'll need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 3+ years of relevant work experience in the data engineering field with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, and query performance tuning skills on large data sets.
- Industry experience as a big data engineer working alongside cross-functional teams such as software engineering, analytics, and data science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including using GCP technologies such as BigQuery, Dataproc, GCS, Cloud Composer, and Dataflow, or related big data technologies on other cloud platforms like AWS or Azure (a minimal BigQuery sketch follows below).
- A team player who introduces and follows best practices in the data engineering space.
- Ability to effectively communicate (both written and verbal) technical information and the results of engineering design at all levels of the organization.

Good to have: understanding of NoSQL databases and pub-sub architecture setup; familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or similar.

PS: This role is with one of our clients, a leading name in the retail industry.
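A minimal, hedged example of the BigQuery work named above, using the google-cloud-bigquery client. The project, dataset, and query are placeholders, and credentials are assumed to come from the environment.

```python
# Illustrative BigQuery query via the official Python client (identifiers are placeholders)
from google.cloud import bigquery

client = bigquery.Client()  # picks up credentials/project from the environment

sql = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `example_project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""
# query() submits the job; result() blocks until rows are available
for row in client.query(sql).result():
    print(row["order_date"], row["revenue"])
```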
Posted 2 weeks ago
10.0 - 15.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Req ID: 322003. We are currently seeking Sr. ETL Developers to join our team in Bangalore, Karnataka (IN-KA), India (IN). Strong hands-on experience in SQL and PL/SQL (procedures, functions). Expert-level knowledge of ETL flows and jobs (ADF pipeline experience preferred). Experience with MS SQL (preferred), Oracle DB, PostgreSQL, and MySQL. Good knowledge of data warehouses/data marts. Good knowledge of data structures/models, integrity constraints, performance tuning, etc. Good knowledge of the insurance domain (preferred). Total experience: 7-10 years.
Posted 2 weeks ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Req ID: 321918. We are currently seeking an ETL and BDX developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
- Develop and maintain Power BI dashboards, reports, and datasets.
- Collaborate with stakeholders to gather and analyse business requirements.
- Design robust and scalable data models using Power BI and underlying data sources.
- Write complex DAX expressions for calculated columns, measures, and KPIs.
- Optimize performance of Power BI reports and data models.
- Integrate Power BI with other data sources (SQL Server, Excel, Azure, SharePoint, etc.).
- Implement row-level security and data access control.
- Automate data refresh schedules and troubleshoot refresh failures.
- Mentor junior developers and conduct code reviews.
- Work closely with data engineering teams to ensure data accuracy and integrity.
- Experience working with Power Query and data flows.
- Strong SQL query-writing skills.
Total experience: 7-10 years.
Posted 2 weeks ago