4.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Job Area: Miscellaneous Group, Miscellaneous Group > Data Analyst

Qualcomm Overview: Qualcomm is a company of inventors that unlocked 5G, ushering in an age of rapid acceleration in connectivity and new possibilities that will transform industries, create jobs, and enrich lives. But this is just the beginning. It takes inventive minds with diverse skills, backgrounds, and cultures to transform 5G's potential into world-changing technologies and products. This is the Invention Age - and this is where you come in.

General Summary:

About the Team
Qualcomm's People Analytics team plays a crucial role in transforming data into strategic workforce insights that drive HR and business decisions. As part of this lean but high-impact team, you will have the opportunity to analyze workforce trends, ensure data accuracy, and collaborate with key stakeholders to enhance our data ecosystem. This role is ideal for a generalist who thrives in a fast-paced, evolving environment - someone who can independently conduct data analyses, communicate insights effectively, and work cross-functionally to enhance our People Analytics infrastructure.

Why Join Us
- End-to-End Impact: Work on the full analytics cycle - from data extraction to insight generation - driving meaningful HR and business decisions.
- Collaboration at Scale: Partner with HR leaders, IT, and other analysts to ensure seamless data integration and analytics excellence.
- Data-Driven Culture: Be a key player in refining our data lake, ensuring data integrity, and influencing data governance efforts.
- Professional Growth: Gain exposure to multiple areas of people analytics, including analytics, storytelling, and stakeholder engagement.

Key Responsibilities

People Analytics & Insights
- Analyze HR and workforce data to identify trends, generate insights, and provide recommendations to business and HR leaders.
- Develop thoughtful insights to support ongoing HR and business decision-making.
- Present findings in a clear and compelling way to stakeholders at various levels, including senior leadership.

Data Quality & Governance
- Ensure accuracy, consistency, and completeness of data when pulling from the data lake and other sources.
- Identify and troubleshoot data inconsistencies, collaborating with IT and other teams to resolve issues.
- Document and maintain data definitions, sources, and reporting standards to drive consistency across analytics initiatives.

Collaboration & Stakeholder Management
- Work closely with other analysts on the team to align methodologies, share best practices, and enhance analytical capabilities.
- Act as a bridge between People Analytics, HR, and IT teams to define and communicate data requirements.
- Partner with IT and data engineering teams to improve data infrastructure and expand available datasets.

Qualifications

Required: 4-7 years of experience in a People Analytics focused role.

Analytical & Technical Skills
- Strong ability to analyze, interpret, and visualize HR and workforce data to drive insights.
- Experience working with large datasets and ensuring data integrity.
- Proficiency in Excel and at least one data visualization tool (e.g., Tableau, Power BI).

Communication & Stakeholder Management
- Ability to communicate data insights effectively to both technical and non-technical audiences.
- Strong documentation skills to define and communicate data requirements clearly.
- Experience collaborating with cross-functional teams, including HR, IT, and business stakeholders.
Preferred:

Technical Proficiency
- Experience with SQL, Python, or R for data manipulation and analysis.
- Familiarity with HR systems (e.g., Workday) and cloud-based data platforms.

People Analytics Expertise
- Prior experience in HR analytics, workforce planning, or related fields.
- Understanding of key HR metrics and workforce trends (e.g., turnover, engagement, diversity analytics; a sample turnover query follows below).

Additional Information: This is an office-based position (4 days a week onsite); possible locations include India and Mexico.
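To make the turnover metric concrete, here is a minimal SQL sketch of a monthly turnover calculation. The terminations and monthly_headcount tables and their columns are hypothetical, not from the posting; the exact DATE_TRUNC syntax varies by engine.

```sql
-- Hypothetical tables: terminations(employee_id, termination_date),
-- monthly_headcount(month, headcount) where month is a first-of-month DATE.
SELECT
    h.month,
    COUNT(t.employee_id)                                  AS terminations,
    h.headcount                                           AS month_headcount,
    ROUND(COUNT(t.employee_id) * 100.0 / h.headcount, 2)  AS turnover_pct
FROM monthly_headcount h
LEFT JOIN terminations t
       ON DATE_TRUNC('month', t.termination_date) = h.month
GROUP BY h.month, h.headcount
ORDER BY h.month;
```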
Posted 1 month ago
4.0 - 9.0 years
9 - 18 Lacs
Pune, Gurugram
Work from Office
The first Data Engineer specializes in traditional ETL with SAS DI and Big Data (Hadoop, Hive). The second is more versatile, skilled in modern data engineering with Python, MongoDB, and real-time processing.
Posted 1 month ago
7.0 - 12.0 years
22 - 37 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Hybrid
Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead
Hybrid (3 days from office) | Shift: 2 PM to 11 PM IST | Experience: 5 to 12+ years (based on role & grade)

Open Grades/Roles:
- Senior Software Engineer: 5-8 Years
- Tech Lead: 7-10 Years
- Senior Tech Lead: 10-12+ Years

Job Description - Data Engineering Team

Core Responsibilities (Common to All Levels):
- Design, build, and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar
- Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB)
- Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs
- Participate in data modeling (ER/DW/star schema; see the star-schema sketch below), data quality checks, and data integration
- Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M)
- Ensure code versioning and documentation standards are followed (Git/Bitbucket)

Additional Responsibilities by Grade

Senior Software Engineer (5-8 Yrs):
- Focus on hands-on development of ETL pipelines, data models, and data inventory
- Assist in architecture discussions and POCs
- Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure

Tech Lead (7-10 Yrs):
- Lead mid-sized data projects and small teams
- Decide on ETL strategy (push down/push up) and performance tuning
- Strong working knowledge of orchestration tools, resource management, and agile delivery

Senior Tech Lead (10-12+ Yrs):
- Drive data architecture, infrastructure decisions, and internal framework enhancements
- Oversee large-scale data ingestion, profiling, and reconciliation across systems
- Mentor junior leads and own stakeholder delivery end-to-end
- Advantageous: Experience with AdTech/Marketing data, Hadoop ecosystem (Hive, Spark, Sqoop)

Must-Have Skills (All Levels):
- ETL Tools: Pentaho / Talend / SSIS / Informatica
- Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery
- Orchestration: Airflow / Autosys / Control-M / JAMS
- Modeling: Dimensional modeling, ER diagrams
- Scripting: Python or Perl (preferred)
- Agile environment, Git-based version control
- Strong communication and documentation
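For reference, a minimal sketch of the star-schema pattern the dimensional-modeling requirement points at: one central fact table joined to conformed dimensions on surrogate keys. The fact_sales, dim_date, and dim_product names are hypothetical.

```sql
-- Aggregate a fact table across two dimensions of a star schema.
SELECT
    d.calendar_month,
    p.product_category,
    SUM(f.sales_amount) AS total_sales
FROM fact_sales f
JOIN dim_date    d ON f.date_key    = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY
    d.calendar_month,
    p.product_category
ORDER BY
    d.calendar_month;
```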
Posted 1 month ago
11.0 - 16.0 years
10 - 20 Lacs
Noida, Greater Noida, Delhi / NCR
Work from Office
Data Architect
Department: Data & Analytics

The Data Architect (more than 14 years of experience) will play a pivotal role in designing, developing, and governing scalable data architectures to support enterprise-wide data integration, analytics, and reporting. This role will focus on creating unified data models, optimizing data pipelines, and ensuring compliance with regulatory standards (GDPR) using cloud-based platforms. The ideal candidate is a strategic thinker with deep expertise in data modeling, cloud data platforms, and governance.

Key Responsibilities:
- Design and implement logical and physical data models to support diverse data sources (e.g., relational databases).
- Develop scalable architectures integrating data lakes, data warehouses, and master data management (MDM) solutions to create unified views (e.g., customer 360; see the survivorship sketch below).
- Leverage cloud services to build ETL/ELT pipelines and ensure data consistency.
- Establish data governance frameworks using data governance and catalog tooling to ensure metadata management, lineage tracking, and data discoverability.
- Design models and processes to comply with regulatory requirements (e.g., GDPR, HIPAA), including encryption, data masking, and access controls.
- Define and enforce data quality standards through profiling, cleansing, and validation.
- Architect solutions to handle high-volume, high-velocity data environments, leveraging cloud platforms with auto-scaling capabilities.
- Optimize data models and pipelines for query performance, using indexing, denormalization, and caching strategies.
- Partner with data engineers, analysts, and business stakeholders to translate requirements into technical designs.

Mandatory Skills:
- Data warehousing, data modelling, data migration projects
- ETL tools (SSIS, Informatica)
- SQL scripting

Connect: Aarushi.Shukla@coforge.com
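As a rough illustration of the customer-360 idea above, a hedged ANSI SQL sketch of a simple MDM survivorship rule: pick one "golden" record per customer from multiple source records. Table and column names (customer_source_records, customer_natural_key, last_updated) are hypothetical.

```sql
-- Survivorship rule: the most recently updated source record wins.
WITH ranked AS (
    SELECT
        s.*,
        ROW_NUMBER() OVER (
            PARTITION BY s.customer_natural_key
            ORDER BY s.last_updated DESC
        ) AS rn
    FROM customer_source_records s
)
SELECT *            -- one golden record per customer_natural_key
FROM ranked
WHERE rn = 1;
```

Real MDM survivorship is usually attribute-level (trusted source per field) rather than whole-record, but the windowed-dedup pattern is the common starting point.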
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Hello everyone! PwC India is inviting applicants for the role below; kindly apply if found suitable.

JD: ETL Developer
Work Location: Mumbai
Years of experience: Around 5 to 10 years
Level: Senior Associate / Tech Lead / Manager
Work Model: Work from office
Mandatory Skill Set: ETL - Informatica IICS or PowerCenter

Job Description:
Skill: Informatica PowerCenter or IICS or ODI or Talend
- 5-10 years of professional experience with working knowledge in a Data and Analytics role with a global organization
- Mandatory hands-on development experience in an ETL tool: Informatica PowerCenter, IICS, ODI, or Talend
- Should have implemented the end-to-end ETL life cycle, preparing ETL design frameworks and execution
- Must have rich experience building operational data stores, data marts, and enterprise data warehouses
- Must have very good SQL skills (specifically in Oracle, MySQL, PostgreSQL)
- Should be able to create and execute ETL designs and test cases; should be able to write complex SQL queries for testing/analysis depending upon the functional/technical requirement
- Should have worked on performance optimization, error handling, writing stored procedures, etc.
- Demonstrated ability to communicate effectively with both technical and business stakeholders
- Should have good knowledge of SCD Type 1 and SCD Type 2 concepts (see the sketch below)
- Should have an understanding of data modelling and data warehousing concepts

Please fill in the application form below.
https://lnkd.in/gVTyzjmu
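For candidates brushing up: a minimal sketch of the SCD concepts the posting names, written in PostgreSQL-style SQL against hypothetical dim_customer and stg_customer tables. Type 1 simply overwrites; Type 2 closes the old row and inserts a new version (the two are shown as alternatives on the same tables).

```sql
-- SCD Type 1: overwrite in place, no history kept.
UPDATE dim_customer d
SET    city = s.city
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id;

-- SCD Type 2, step 1: expire the current version when a tracked
-- attribute (here, city) has changed.
UPDATE dim_customer d
SET    effective_to = CURRENT_DATE,
       is_current   = 'N'
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = 'Y'
  AND  d.city       <> s.city;

-- SCD Type 2, step 2: insert the new version as the open row.
INSERT INTO dim_customer
       (customer_id, city, effective_from, effective_to, is_current)
SELECT s.customer_id, s.city, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  EXISTS (SELECT 1
               FROM   dim_customer d
               WHERE  d.customer_id  = s.customer_id
                 AND  d.effective_to = CURRENT_DATE
                 AND  d.is_current   = 'N');
```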
Posted 1 month ago
1.0 - 4.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Req ID: 321498. We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
- Work closely with the Lead Data Engineer to understand business requirements, and analyse and translate these requirements into technical specifications and solution design.
- Work closely with the data modeller to ensure data models support the solution design.
- Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures (see the stored-procedure sketch below).
- Analyse the data and ETL for defects and service tickets raised (for solutions in production).
- Develop documentation and artefacts to support projects.

Minimum Skills Required:
- ADF
- Fivetran (orchestration & integration)
- SQL
- Snowflake DWH
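Since the posting pairs Snowflake SQL with stored procedures, here is a minimal Snowflake Scripting sketch of a load procedure. The raw.orders table and @raw.orders_stage stage are hypothetical, and the real load logic would follow the project's own design.

```sql
-- Hypothetical objects: raw.orders table and @raw.orders_stage stage.
CREATE OR REPLACE PROCEDURE raw.load_orders()
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
    row_total INTEGER;
BEGIN
    -- Ingest newly staged CSV files into the raw table.
    COPY INTO raw.orders
    FROM @raw.orders_stage
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Capture a simple post-load metric for the caller.
    SELECT COUNT(*) INTO :row_total FROM raw.orders;
    RETURN 'raw.orders now holds ' || TO_VARCHAR(row_total) || ' rows';
END;
$$;

CALL raw.load_orders();
```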
Posted 1 month ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Req ID: 319341. We are currently seeking an MS Fabric Architect - Support to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Key Responsibilities:
- Provide technical support and troubleshooting for Microsoft Fabric services.
- Assist in the implementation, configuration, and maintenance of Microsoft Fabric environments.
- Monitor system performance and resolve issues proactively.
- Collaborate with cross-functional teams to optimize data workflows and analytics solutions.
- Document support procedures, best practices, and troubleshooting steps.
- Assist in user training and onboarding for Microsoft Fabric-related tools and applications.
- Stay up to date with the latest Microsoft Fabric updates and best practices.

Required Qualifications:
- 6+ years of experience in IT support, with a focus on Microsoft Fabric or related technologies.
- Strong knowledge of Microsoft Fabric, Power BI, Azure Synapse, and data integration tools.
- Experience with troubleshooting and resolving issues in a cloud-based environment.
- Familiarity with SQL, data pipelines, and ETL processes.
- Excellent problem-solving and communication skills.
- Ability to work independently and collaboratively in a team environment.

Preferred Qualifications:
- Microsoft certifications related to Fabric, Azure, or Power BI.
- Experience with automation and scripting (PowerShell, Python, etc.).
- Understanding of security and compliance considerations in cloud-based data platforms.
Posted 1 month ago
5.0 - 8.0 years
10 - 15 Lacs
Pune, Chennai
Work from Office
Required Skills & Qualifications:
- 5+ years of experience in report development and migration
- Strong hands-on experience with Oracle Reports (6i/10g/11g)
- Proficient in JasperReports, Jaspersoft Studio, and JRXML templates
- Strong knowledge of SQL and PL/SQL
- Working knowledge of Java and the JasperReports API
- Experience configuring JDBC data sources and working with complex datasets
- Familiarity with JasperReports Server: deployment, user management, and scheduling
- Experience with Git or version control tools
- Good communication skills and ability to work with business stakeholders

Preferred Qualifications:
- Experience with iReport Designer (legacy support)
- Exposure to CI/CD for report deployment
- Knowledge of ETL tools or data transformation processes
- Oracle and/or JasperReports certification is a plus

Roles and Responsibilities

**Job Title:** Jaspersoft Developer - 002
**Job Location:** Chennai, Tamil Nadu, India / Pune, Maharashtra, India
**Experience:** 5 to 8 years

### Roles and Responsibilities:
1. **Jaspersoft Development:**
   - Design, develop, and maintain Jaspersoft reporting solutions using JRXML templates and the Jasper API to meet specific business requirements.
   - Collaborate with business stakeholders to gather requirements and translate them into clear and actionable reporting specs.
2. **Database Management:**
   - Utilize PL/SQL for data extraction, transformation, and loading (ETL) processes to support reporting needs.
   - Design efficient SQL queries and optimize database performance for generating timely reports.
3. **Integration:**
   - Integrate Jaspersoft reports with existing applications and databases, leveraging JDBC for seamless data connectivity and reporting.
4. **Testing and Validation:**
   - Conduct thorough testing and validation of reports to ensure data accuracy and adherence to requirements before deployment.
   - Support end-users in validating reports for operational effectiveness and troubleshoot any issues that arise.
5. **Documentation:**
   - Create and maintain comprehensive documentation for developed reports, including frameworks, data sources, and design methodologies.
   - Ensure all specifications and design documents comply with organizational standards.
6. **Performance Optimization:**
   - Monitor report performance and identify areas for optimization to enhance efficiency and user experience.
   - Implement best practices in report development to streamline reporting processes.
7. **Collaboration:**
   - Work closely with cross-functional teams including analysts, project managers, and other developers to ensure alignment and timely delivery of projects.
   - Participate in code reviews and contribute to shared knowledge within the development team.
8. **Continuous Improvement:**
   - Stay updated with the latest Jaspersoft features and enhancements to implement improvements in solutions.
   - Provide training and support to junior developers and team members as needed.

### Desired Skills:
- Proficiency in Java, JDBC, JRXML templates, and the Jasper API for effective report design and development.
- Strong command of PL/SQL for data manipulation and reporting purposes.

### Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Between 5 and 8 years of relevant experience in Jaspersoft development and database management.

**A proactive approach and strong problem-solving skills will be essential for success in this role.**
Posted 1 month ago
5.0 - 10.0 years
10 - 15 Lacs
Pune, Chennai
Work from Office
Required Skills & Qualifications:
- 5+ years of experience in report development and migration
- Strong hands-on experience with Oracle Reports (6i/10g/11g)
- Proficient in JasperReports, Jaspersoft Studio, and JRXML templates
- Strong knowledge of SQL and PL/SQL
- Working knowledge of Java and the JasperReports API
- Experience configuring JDBC data sources and working with complex datasets
- Familiarity with JasperReports Server: deployment, user management, and scheduling
- Experience with Git or version control tools
- Good communication skills and ability to work with business stakeholders

Preferred Qualifications:
- Experience with iReport Designer (legacy support)
- Exposure to CI/CD for report deployment
- Knowledge of ETL tools or data transformation processes
- Oracle and/or JasperReports certification is a plus

Roles and Responsibilities

**Job Title:** Jaspersoft Developer - 005
**Location:** Chennai, Tamil Nadu, India / Pune, Maharashtra, India
**Experience Required:** 5 to 10 years

**Roles and Responsibilities:**
1. **Jasper Reports Development:**
   - Design, develop, and maintain reports using Jasper Reports.
   - Create and manage JRXML templates to meet business requirements.
2. **Jaspersoft Maintenance:**
   - Monitor, troubleshoot, and optimize Jaspersoft server performance.
   - Ensure that Jaspersoft reports are seamlessly integrated into applications.
3. **Database Interaction:**
   - Utilize JDBC connectors to establish and manage database connections.
   - Write and optimize PL/SQL queries to fetch the required data for report generation.
4. **API Integrations:**
   - Leverage the Jasper API for report generation and management.
   - Implement and customize report data sources using customized API calls.
5. **Collaboration:**
   - Work closely with business analysts and stakeholders to gather reporting requirements.
   - Collaborate with cross-functional teams to integrate reports into applications and verify their alignment with business needs.
6. **Testing & Quality Assurance:**
   - Conduct thorough testing of reports to ensure accuracy and efficiency.
   - Implement best practices for version control and documentation of reporting processes.
7. **Training and Support:**
   - Provide support and training to end-users on report usage and troubleshooting.
   - Prepare technical documentation and user manuals for Jasper Reports.
8. **Continuous Improvement:**
   - Stay updated with the latest trends and technology advancements in reporting tools and databases.
   - Suggest and implement enhancements to improve report generation performance and user experience.

**Desired Skills:**
- Proficiency in Java and JDBC connectors.
- Strong experience with the Jasper API and JRXML templates.
- Hands-on experience with Jaspersoft Studio and the Jaspersoft suite.
- Knowledge of PL/SQL for effective database interaction.

**Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Jaspersoft Developer with a focus on report development and database management.

**Important Note:** Candidates with between 5 and 10 years of experience and a robust skill set as described will be considered for this role, ensuring the organization can leverage deep industry knowledge and technical expertise in Jaspersoft development.
Posted 1 month ago
10.0 - 12.0 years
15 - 25 Lacs
Pune, Chennai
Work from Office
Required Skills & Qualifications:
- 5+ years of experience in report development and migration
- Strong hands-on experience with Oracle Reports (6i/10g/11g)
- Proficient in JasperReports, Jaspersoft Studio, and JRXML templates
- Strong knowledge of SQL and PL/SQL
- Working knowledge of Java and the JasperReports API
- Experience configuring JDBC data sources and working with complex datasets
- Familiarity with JasperReports Server: deployment, user management, and scheduling
- Experience with Git or version control tools
- Good communication skills and ability to work with business stakeholders

Preferred Qualifications:
- Experience with iReport Designer (legacy support)
- Exposure to CI/CD for report deployment
- Knowledge of ETL tools or data transformation processes
- Oracle and/or JasperReports certification is a plus

Roles and Responsibilities

### Job Title: Jaspersoft Developer - 006
**Job Location:** Chennai, Tamil Nadu, India / Pune, Maharashtra, India
**Experience Required:** 10 to 12 years

#### Roles and Responsibilities:
1. **Jaspersoft Development:** Design, develop, and maintain Jaspersoft reports and dashboards using Jasper Reports, Jaspersoft Studio, and JRXML templates, ensuring high-quality outputs that meet client specifications.
2. **Database Interaction:** Implement JDBC connectors to establish connections with various databases, ensuring seamless data retrieval and integration for reporting purposes.
3. **PL/SQL Development:** Write and optimize PL/SQL queries to fetch and manipulate data as required for reporting, ensuring data accuracy and efficiency.
4. **API Integration:** Utilize the Jasper API to integrate Jaspersoft reports with external applications, ensuring smooth data flow and functionality across platforms.
5. **Technical Documentation:** Create and maintain comprehensive documentation of developed reports, including technical specifications, deployment procedures, and user manuals.
6. **Collaboration:** Work closely with business analysts, stakeholders, and other team members to gather requirements and provide solutions that meet business needs.
7. **Troubleshooting and Support:** Provide ongoing support for deployed reports, troubleshoot any issues, and implement enhancements as required to improve functionality and performance.
8. **Performance Optimization:** Analyze and optimize report performance, ensuring minimal load times and efficient resource utilization in the reporting environment.
9. **Best Practices Implementation:** Stay updated with the latest trends and best practices in Jaspersoft development and reporting solutions, applying this knowledge to enhance existing processes.
10. **Mentorship:** Guide and mentor junior developers on Jaspersoft best practices, report development techniques, and database management to foster a collaborative and productive team environment.

**Desired Set of Skills:**
- Proficient in Java programming
- Experience using JDBC connectors
- Familiarity with the Jasper API and JRXML templates

This role is suited for a seasoned Jaspersoft Developer with extensive experience in report design and development, database interaction, and team collaboration, aimed at delivering effective reporting solutions for various business needs.
Posted 1 month ago
6.0 - 11.0 years
18 - 25 Lacs
Hyderabad
Work from Office
Position: Informatica Developer
Location: India
Experience: 6+ Years

About the Role: We are looking for an experienced Informatica Developer (IDMC) to design, build, and manage cloud-based data integration solutions. This role focuses on managing batch processes and ensuring seamless data exchange across systems.

Key Responsibilities:
- Design and implement data integration workflows using Informatica Intelligent Data Management Cloud (IDMC).
- Develop and maintain Salesforce-centric data pipelines, including data extraction, transformation, and loading (ETL).
- Manage and optimize batch processes to handle large volumes of Salesforce data.
- Ensure accurate and efficient data movement between Salesforce and other applications/databases.
- Identify and resolve integration-related issues and optimize performance.
- Collaborate with business and technical teams to gather integration requirements.
- Maintain detailed technical documentation for integration components and processes.

Key Skills & Experience:
- Hands-on experience in Informatica Cloud (IDMC) development.
- Mandatory experience in Salesforce data integration and understanding of Salesforce objects, the data model, API usage, and best practices.
- Proficiency in designing cloud data integration workflows and batch processes.
- Good understanding of various data sources: cloud databases, APIs, flat files.
- Strong analytical, debugging, and troubleshooting abilities.
- Ability to work both independently and within cross-functional teams.
Posted 1 month ago
1.0 - 3.0 years
2 - 5 Lacs
Chennai
Work from Office
Create test case documents/plans for testing the data pipelines.
- Check the mappings for the fields that support data staging and data marts, and the data type constraints of the fields present in Snowflake.
- Verify that non-null fields are populated.
- Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process.
- Verify stored procedure calculations and data mappings.
- Verify that data transformations are correct based on the business rules.
- Verify successful execution of data loading workflows.
A couple of these checks are sketched in SQL below.
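A minimal sketch of such checks as SQL a tester might run against Snowflake; stg.customer and mart.customer_dim, and the business rule in the last query, are hypothetical names used only for illustration.

```sql
-- Non-null check: expect zero violations if the pipeline is correct.
SELECT COUNT(*) AS null_name_violations
FROM   mart.customer_dim
WHERE  customer_name IS NULL;

-- Reconciliation: staged rows should match current rows in the mart.
SELECT
    (SELECT COUNT(*) FROM stg.customer)        AS staged_rows,
    (SELECT COUNT(*)
     FROM   mart.customer_dim
     WHERE  is_current = 'Y')                  AS current_mart_rows;

-- Transformation rule check: every row must satisfy the business rule
-- (here, a hypothetical "status is derived from end_date" rule).
SELECT COUNT(*) AS rule_violations
FROM   mart.customer_dim
WHERE  (end_date IS NULL     AND status <> 'ACTIVE')
   OR  (end_date IS NOT NULL AND status <> 'INACTIVE');
```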
Posted 1 month ago
3.0 - 5.0 years
13 - 17 Lacs
Chennai
Work from Office
InfoCepts is looking for a Data Architect - Snowflake & DBT to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Design and Development: Create and implement data warehouse solutions using Snowflake, including data modeling, schema design, and ETL (Extract, Transform, Load) processes.
- Performance Optimization: Optimize queries, performance-tune databases, and ensure efficient use of Snowflake resources for faster data retrieval and processing.
- Data Integration: Integrate data from various sources, ensuring compatibility, consistency, and accuracy.
- Security and Compliance: Implement security measures and ensure compliance with data governance and regulatory requirements, including access control and data encryption (see the masking-policy sketch below).
- Monitoring and Maintenance: Monitor system performance, troubleshoot issues, and perform routine maintenance tasks to ensure system health and reliability.
- Collaboration: Collaborate with other teams, such as data engineers, analysts, and business stakeholders, to understand requirements and deliver effective data solutions.

Skills and Qualifications:
- Snowflake Expertise: In-depth knowledge and hands-on experience working with Snowflake's architecture, features, and functionalities.
- SQL and Database Skills: Proficiency in SQL querying and database management, with a strong understanding of relational databases and data warehousing concepts.
- Data Modeling: Experience in designing and implementing effective data models for optimal performance and scalability.
- ETL Tools and Processes: Familiarity with ETL tools and processes to extract, transform, and load data into Snowflake.
- Performance Tuning: Ability to identify and resolve performance bottlenecks, optimize queries, and improve overall system performance.
- Data Security and Compliance: Understanding of data security best practices, encryption methods, and compliance standards (such as GDPR, HIPAA, etc.).
- Problem-Solving and Troubleshooting: Strong analytical and problem-solving skills to diagnose and resolve issues within the Snowflake environment.
- Communication and Collaboration: Good communication skills to interact with cross-functional teams and effectively translate business requirements into technical solutions.
- Scripting and Automation: Knowledge of scripting languages (like Python) and experience in automating processes within Snowflake.
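On the security point, a small sketch of Snowflake's native dynamic data masking, one common way to implement a column-level masking requirement; the role, table, and policy names are hypothetical.

```sql
-- Only HR_ADMIN sees real values; every other role gets a redaction.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
RETURNS STRING ->
    CASE
        WHEN CURRENT_ROLE() IN ('HR_ADMIN') THEN val
        ELSE '***MASKED***'
    END;

-- Attach the policy to a sensitive column (hypothetical table).
ALTER TABLE hr.employees
    MODIFY COLUMN email
    SET MASKING POLICY email_mask;
```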
Posted 1 month ago
3.0 - 8.0 years
3 - 6 Lacs
Chennai
Work from Office
Programming languages/tools: SQL, DataStage, Teradata.
- Design complex ETL jobs in IBM DataStage to load data into the DWH as per business logic
- Work experience in Teradata Database as a developer
- Understand and analyse ERP reports and document the logic
- Identify gaps in the existing solutions to accommodate new business processes introduced by the merger
- Work on designing TAS workflows to replicate data from SAP into the DWH
- Prepare test cases and technical specifications for the new solutions
- Interact with other upstream and downstream application teams and EI teams to build robust data transfer mechanisms between various systems

Essential Skills Required:
- Sound interpersonal communication skills
- Coordinate with customers and business analysts to understand business and reporting requirements
- Support the development of business intelligence standards to meet business goals
- Ability to understand data warehousing concepts and implement reports based on user inputs
- Areas of expertise include Teradata SQL, DataStage, Teradata, and shell scripting
- Demonstrated focus on driving for results
- Ability to work with a cross-functional team

Employment Experience Required: Minimum 3+ years of technical experience with data warehousing concepts and as an ETL developer.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while keeping abreast of the latest technologies and methodologies in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Good To Have Skills: Experience with SAP HANA and data warehousing concepts.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with reporting tools and dashboard creation.
- Experience in performance tuning and optimization of data models.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Pune office.
- A 15 years full time education is required.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Business Intelligence (BI) Publisher
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to gather requirements, developing application features, and ensuring that the applications align with business objectives. You will also engage in problem-solving activities and contribute to the overall improvement of application performance and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Oracle Business Intelligence (BI) Publisher.
- Strong understanding of data modeling and reporting tools.
- Experience with SQL and database management.
- Ability to troubleshoot and resolve application issues effectively.
- Familiarity with application development methodologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Oracle Business Intelligence (BI) Publisher.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 month ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)

Informatica ETL role:
- Informatica IDMC (must have)
- SQL knowledge (must have)
- Data warehouse concepts and ETL design best practices (must have)
- Data modeling (must have)
- Snowflake knowledge (good to have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Good communication skills, team player, self-motivated, strong work ethic
- Flexibility in working hours: 12 pm Central Time (overlap with the US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tool/expertise gaps (fast learner)
Posted 1 month ago
6.0 - 11.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We're looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.

Responsibilities:
- Lead the design of data warehouses, lakes, and ETL workflows.
- Collaborate with teams to gather requirements and build scalable solutions.
- Ensure data governance, security, and optimal performance of systems.
- Mentor junior engineers and drive end-to-end project delivery.

Requirements:
- 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
- Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
- Expertise in big data tools (e.g., Apache Spark, Kafka).
- Excellent communication skills and leadership abilities.

Preferred: Experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.
Posted 1 month ago
7.0 - 8.0 years
15 - 16 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will be responsible for:
- Developing and supporting new feed ingestion: understanding the existing framework and carrying out development as per the business rules and requirements.
- Development and maintenance of new changes/enhancements in Data Ingestion / Juniper, and promoting and supporting those in the production environment within the stipulated timelines.
- Getting familiar with the Data Ingestion / Data Refinery / Common Data Model / Compdata frameworks quickly and contributing to application development as soon as possible.

You should bring:
- A methodical and measured approach with a keen eye for attention to detail
- Ability to work under pressure and remain calm in the face of adversity
- Ability to collaborate, interact and engage with different business, technical and subject matter experts
- Good, concise written and verbal communication
- Ability to manage workload from multiple requests and to balance priorities
- A pro-active, can-do mind-set and attitude
- Good documentation skills

Requirements: To be successful in this role, you should meet the following requirements.

Experience (1 = essential, 2 = very useful, 3 = nice to have):
1 - Hadoop / Hive / GCP
2 - Agile / Scrum
3 - LINUX

Technical skills (1 = essential, 2 = useful, 3 = nice to have):
1 - Any ETL tool
1 - Analytical troubleshooting
2 - Hive QL (see the HiveQL sketch below)
1 - On-Prem / Cloud infra knowledge
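For the Hive QL item, a minimal HiveQL sketch of a partitioned external landing table of the kind a feed-ingestion framework like the one described might use; the database, table, column, and path names are hypothetical.

```sql
-- HiveQL: external landing table, partitioned by ingestion date.
CREATE EXTERNAL TABLE IF NOT EXISTS landing.customer_feed (
    customer_id  BIGINT,
    full_name    STRING,
    updated_ts   TIMESTAMP
)
PARTITIONED BY (ingest_date STRING)
STORED AS ORC
LOCATION '/data/landing/customer_feed';

-- Register a new day's partition once the feed files have arrived.
ALTER TABLE landing.customer_feed
    ADD IF NOT EXISTS PARTITION (ingest_date = '2024-01-15');
```

Partitioning by ingest date keeps daily feed loads additive: a failed day can be dropped and re-registered without touching earlier data.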
Posted 1 month ago
3.0 - 7.0 years
10 - 15 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will be responsible for:
- Developing and supporting new feed ingestion: understanding the existing framework and carrying out development as per the business rules and requirements.
- Development and maintenance of new changes/enhancements in Data Ingestion / Juniper, and promoting and supporting those in the production environment within the stipulated timelines.
- Getting familiar with the Data Ingestion / Data Refinery / Common Data Model / Compdata frameworks quickly and contributing to application development as soon as possible.

You should bring:
- A methodical and measured approach with a keen eye for attention to detail
- Ability to work under pressure and remain calm in the face of adversity
- Ability to collaborate, interact and engage with different business, technical and subject matter experts
- Good, concise written and verbal communication
- Ability to manage workload from multiple requests and to balance priorities
- A pro-active, can-do mind-set and attitude
- Good documentation skills

Requirements: To be successful in this role, you should meet the following requirements.

Experience (1 = essential, 2 = very useful, 3 = nice to have):
1 - Hadoop / Hive / GCP
2 - Agile / Scrum
3 - LINUX

Technical skills (1 = essential, 2 = useful, 3 = nice to have):
1 - Any ETL tool
1 - Analytical troubleshooting
2 - Hive QL
1 - On-Prem / Cloud infra knowledge
Posted 1 month ago
7.0 - 8.0 years
15 - 16 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will be responsible for:
- Developing and supporting new feed ingestion: understanding the existing framework and carrying out development as per the business rules and requirements.
- Development and maintenance of new changes/enhancements in Data Ingestion / Juniper, and promoting and supporting those in the production environment within the stipulated timelines.
- Getting familiar with the Data Ingestion / Data Refinery / Common Data Model / Compdata frameworks quickly and contributing to application development as soon as possible.

You should bring:
- A methodical and measured approach with a keen eye for attention to detail
- Ability to work under pressure and remain calm in the face of adversity
- Ability to collaborate, interact and engage with different business, technical and subject matter experts
- Good, concise written and verbal communication
- Ability to manage workload from multiple requests and to balance priorities
- A pro-active, can-do mind-set and attitude
- Good documentation skills

Requirements: To be successful in this role, you should meet the following requirements.

Experience (1 = essential, 2 = very useful, 3 = nice to have):
1 - Hadoop / Hive / GCP
2 - Agile / Scrum
3 - LINUX

Technical skills (1 = essential, 2 = useful, 3 = nice to have):
1 - Any ETL tool
1 - Analytical troubleshooting
2 - Hive QL
1 - On-Prem / Cloud infra knowledge

You'll achieve more when you join HSBC.
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Bachelor's degree plus at least 5-7 years of experience, with a minimum of 3+ years in SQL development.
- Strong working knowledge of advanced SQL capabilities such as analytic and windowing functions (see the sketch below)
- Working knowledge of 3+ years on some RDBMS database is a must-have
- Exposure to shell scripts for invoking SQL calls
- Exposure to ETL tools would be good to have
- Working knowledge of Snowflake is good to have
Location: Hyderabad, Pune, Bangalore
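By way of illustration, a short ANSI SQL sketch of the analytic/windowing functions the posting asks about, over a hypothetical employees table.

```sql
-- Per-department salary rank plus a department-level running total.
SELECT
    employee_id,
    department,
    salary,
    RANK() OVER (PARTITION BY department
                 ORDER BY salary DESC)             AS dept_salary_rank,
    SUM(salary) OVER (PARTITION BY department
                      ORDER BY salary DESC
                      ROWS UNBOUNDED PRECEDING)    AS running_dept_payroll
FROM employees;
```

Unlike GROUP BY, window functions keep every input row while computing the aggregate alongside it, which is what makes ranks and running totals possible in a single pass.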
Posted 1 month ago
10.0 - 12.0 years
4 - 7 Lacs
Mumbai, Pune, Chennai
Work from Office
Locations: Mumbai, Pune, Chennai, Bangalore (a remote opportunity is also available)

- Experience in SAP technical development with a focus on SAP MDK, Fiori, and ABAP
- Experience in SAP Asset Manager; maintaining and developing mobile apps in S/4HANA
- SAP BTP, Mobile Services, and SAP Mobile Platform (SMP)
- Native mobile app development for iOS and Android
- Java, JavaScript, HTML5, and CSS

Desirable Technical Skills:
- Experience provisioning custom app access to mobile devices and providing support
- Hands-on experience developing in-sync and offline functionalities for mobile apps
- Experience in customizing/enhancing standard mobile apps
- Good understanding of OData and the ability to consume complex deep entities and change sets in the front end
- Good understanding of and experience with APIs and how to use them effectively on the front end
- Hands-on experience in developing intuitive mobile apps
Posted 1 month ago
6.0 - 9.0 years
7 - 14 Lacs
Hyderabad
Work from Office
Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad, with responsibilities that include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
1. Proficiency in ETL, batch, and streaming processes
2. Experience with BigQuery, Cloud Storage, and CloudSQL (see the BigQuery sketch below)
3. Strong programming skills in Python, SQL, and Apache Beam for data processing
4. Understanding of data modeling and schema design for analytics
5. Knowledge of data governance, security, and compliance in GCP
6. Familiarity with machine learning workflows and integration with GCP ML tools
7. Ability to optimize performance within data pipelines

Functional Requirements:
1. Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop data product features
2. Experience in leading and mentoring peers within an existing development team
3. Strong communication skills to craft and communicate robust solutions
4. Proficiency in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
5. Willingness to work on contemporary data architecture in public and private cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging unparalleled experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Grad / Postgraduate

Criteria:
1. Proficient in ETL, Python, and Apache Beam for data processing efficiency
2. Demonstrated expertise in BigQuery, Cloud Storage, and CloudSQL utilization
3. Strong collaboration skills with cross-functional teams for data product development
4. Comprehensive knowledge of data governance, security, and compliance in GCP
5. Experienced in optimizing performance within data pipelines for efficiency
6. Relevant experience: 6-9 years

Connect at 9993809253
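To ground the BigQuery item, a small DDL sketch showing partitioning and clustering, the usual first lever for pipeline performance on BigQuery; the dataset, table, and column names are hypothetical.

```sql
-- Partition pruning by event date plus clustering for selective scans.
CREATE TABLE IF NOT EXISTS analytics.events (
    event_id    STRING,
    customer_id STRING,
    event_ts    TIMESTAMP,
    payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id;

-- Queries that filter on the partition column scan far less data.
SELECT customer_id, COUNT(*) AS events
FROM   analytics.events
WHERE  DATE(event_ts) = '2024-01-15'
GROUP  BY customer_id;
```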
Posted 1 month ago
2.0 - 7.0 years
3 - 7 Lacs
Kolkata
Work from Office
SSIS Developers Wanted - Work from Office in Kolkata!
Rapid Care is hiring BI Analysts for our Kolkata (Salt Lake, Sector 5) office. If you have 2+ years of experience and want to work on impactful projects, we would love to hear from you!

Experience: 2+ years
Key Skills:
- Must-have: SSIS (mandatory), Tableau, MySQL, business analysis
- Recommended: SSRS, ETL processes
Location: Kolkata (Salt Lake, Sector 5) - Work from office (5 days a week, 9-hour shifts)

Responsibilities:
- Design, develop, and maintain SSIS packages for robust ETL processes.
- Extract data from multiple sources like SQL Server, Excel, flat files, and APIs.
- Transform raw data using data cleansing, validation, and business logic rules.
- Load processed data into data warehouses, data marts, or target databases.
- Optimize ETL workflows for performance, scalability, and fault tolerance.
- Monitor and troubleshoot ETL jobs to ensure data accuracy and timely delivery.
- Implement logging, error handling, and package configuration frameworks.
- Collaborate with DBAs, analysts, and application teams to understand data needs.
- Document ETL design, data flows, and metadata for transparency and maintenance.
- Schedule and automate SSIS jobs using SQL Server Agent or other tools.

Must be based in Kolkata.
Interested? Reach out to me at:
Email: adutta@imedx.com.au
Call / WhatsApp: 7001986900
Posted 1 month ago