5.0 - 10.0 years
3 - 7 Lacs
Hyderabad
Work from Office
5+ years of experience developing Snowflake data models, data ingestion, views, stored procedures, and complex queries
Good experience in SQL
Experience in Informatica PowerCenter / IICS ETL tools
Test and clearly document implementations so that others can easily understand the requirements, implementation, and test conditions
Provide production support for Data Warehouse issues such as data load and transformation/translation problems
Ability to facilitate and coordinate discussions and to manage the expectations of multiple stakeholders
Candidates must have good communication and facilitation skills
Work in an onsite-offshore model involving daily interactions with onshore teams to ensure on-time, quality deliverables
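To make the SQL side of these requirements concrete, here is a small, hedged sketch of a view wrapping an aggregate join query of the kind the role describes; sqlite3 stands in for Snowflake so the example stays runnable, and every table and column name is invented.

```python
import sqlite3

# Hypothetical schema: the posting names Snowflake views and complex queries;
# sqlite3 is used as a stand-in so the sketch is self-contained and runnable.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 250.0), (2, 10, 100.0), (3, 20, 75.0);
    INSERT INTO customers VALUES (10, 'APAC'), (20, 'EMEA');

    -- A view encapsulating a join + aggregate, as a warehouse data model might
    CREATE VIEW regional_sales AS
    SELECT c.region, SUM(o.amount) AS total_amount
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.region;
""")
rows = cur.execute(
    "SELECT region, total_amount FROM regional_sales ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 350.0), ('EMEA', 75.0)]
```

In Snowflake the same pattern would typically be a secure or materialized view; only the dialect details differ.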
Posted 3 days ago
8.0 - 13.0 years
8 - 14 Lacs
Bengaluru
Work from Office
8+ years of experience as a project manager, utilizing project management methodologies, disciplines, best practices, and artifacts in both Waterfall and Scrum environments * Experience in leading and managing data warehousing / Informatica / ETL projects is essential; responsible for the overall systems development life cycle * Ability to facilitate and coordinate discussions and to manage the expectations of multiple stakeholders, with good communication and facilitation skills
Posted 3 days ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics. We are seeking an experienced Business Intelligence Developer with 2+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL (SSIS, Informatica, Dell Boomi, etc.) processes and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations; expertise in using Excel & PowerPoint. Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 days ago
5.0 - 10.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Upgrade the Automic Workload Automation (AWA) platform from V12.2 to V21.3. This upgrade will be performed in the following environments:
i. Non-Production - One
ii. Production - One
The following key services will be performed to carry out this upgrade:
A. Discovery, Planning and Onboarding
a. Kick-off meeting between the Supplier and stakeholders
b. Understand the current landscape
c. Discuss and agree on the upgrade approach, including version (Version 21.2 or Version 21.3)
d. Discuss and agree on the plan
e. Discuss and agree on the test approach, including key jobs / all jobs to be tested
f. Start documenting the migration process
B. Non-Production - Upgrade and Deploy
a. Secure the internal approvals to kick-start the upgrade process
b. Install the agreed version of Automic Workload Automation and its components
c. Upgrade / deploy the 45 agents
d. Fix issues / configurations
e. Update the migration document
f. Validate the system features
C. Non-Production - Test and Validate
a. Assist and support the team in testing
b. Sign-off by the team on the successful upgrade of the non-Production environment
D. Production - Upgrade and Deploy
a. Secure the internal approvals to kick-start the upgrade process
b. Install the agreed version of Automic Workload Automation and its components
c. Upgrade / deploy the 12 agents
d. Fix issues / configurations
e. Update the migration document
f. Validate the system features
E. Production - Validation
a. Assist and support the team in testing
b. Sign-off by the team on the successful upgrade of the Production environment
c. Provide hypercare / warranty for two weeks
F. Handover to Support Teams / Operations Team
If you are interested, please share the details below and an updated resume:
First Name
Last Name
Date of Birth
Passport No. and Expiry Date
Alternate Contact Number
Total Experience
Relevant Experience
Current CTC
Expected CTC
Current Location
Preferred Location
Current Organization
Payroll Company
Notice Period
Holding any offer
Posted 3 days ago
6.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
SFDC Einstein Analytics skills - a core end-to-end skill set needed for the development of EA dashboards
6-8 years of IT experience with at least 5 years of Salesforce experience
2-3 years of Salesforce Analytics experience
Salesforce certification or Einstein Analytics accreditation will be a huge advantage
Knowledge of Einstein Discovery and predictive modelling will be a plus
Data preparation: able to understand source data and advise on ETL transformations to achieve the desired data structures for EA
Dataflow build to manipulate and join data from Salesforce and external systems in the SFDC native dataflow editor
Salesforce Analytics Query Language (SAQL) - advanced experience with SAQL
Working knowledge of at least one ETL platform that can be used to load data into EA datasets, such as Informatica Cloud
Working knowledge of Salesforce CRM architecture and security model
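SAQL itself has no runnable stand-in here, but the kind of group-and-aggregate pipeline it expresses (`group ... foreach`) can be sketched in plain Python; the record fields and values below are invented for the example.

```python
from collections import defaultdict

# Hypothetical opportunity records of the kind an EA dataset might hold;
# the grouping below mirrors what a SAQL `group by stage` would compute.
rows = [
    {"stage": "Won", "amount": 100.0},
    {"stage": "Won", "amount": 50.0},
    {"stage": "Lost", "amount": 30.0},
]

totals = defaultdict(float)
for r in rows:
    totals[r["stage"]] += r["amount"]

result = sorted(totals.items())
print(result)  # [('Lost', 30.0), ('Won', 150.0)]
```

In SAQL the equivalent would be a `group` statement followed by a `foreach` projecting `sum(amount)`; the logic, not the syntax, is what this sketch shows.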
Posted 3 days ago
6.0 - 11.0 years
2 - 6 Lacs
Hyderabad
Work from Office
5+ years of experience in Teradata. Good in Teradata; should know BTEQ scripts, FastLoad, and MultiLoad. Informatica PowerCenter experience. Good at writing SQL queries using joins. Essential to have proficient knowledge of the Dataiku tool. Proficient in Python scripting: can read logs, can write and automate scripts, and can build servers.
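The scripting expectations above (read logs, automate checks) can be illustrated with a short Python sketch; the log format, table names, and messages below are all invented for the example.

```python
import re
from collections import Counter

# Hypothetical load-job log; the format and messages are illustrative only.
LOG = """\
2024-05-01 10:00:01 INFO  load job started
2024-05-01 10:00:09 ERROR MLoad failed on table STG_SALES
2024-05-01 10:00:12 WARN  retrying
2024-05-01 10:00:20 ERROR BTEQ step returned rc=8
"""

def summarize(log_text):
    """Count log lines by severity and collect the error messages."""
    levels = Counter()
    errors = []
    for line in log_text.splitlines():
        m = re.match(r"\S+ \S+ (\w+)\s+(.*)", line)  # date time LEVEL message
        if m:
            level, msg = m.groups()
            levels[level] += 1
            if level == "ERROR":
                errors.append(msg)
    return levels, errors

levels, errors = summarize(LOG)
print(levels["ERROR"], errors)
```

A real automation would run this over files on the server and alert or rerun jobs on failure; the parsing step is the same.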
Posted 3 days ago
6.0 - 10.0 years
4 - 8 Lacs
Pune
Work from Office
Position Overview Summary: The Data Engineer will expand and optimize the data and data pipeline architecture, as well as optimize data flow and collection for cross-functional teams. The Data Engineer will perform data architecture analysis, design, development and testing to deliver data applications, services, interfaces, ETL processes, reporting and other workflow and management initiatives. The role will also follow modern SDLC principles, test-driven development, source code reviews and change control standards in order to maintain compliance with policies. This role requires a highly motivated individual with strong technical ability, data capability, excellent communication and collaboration skills, including the ability to develop and troubleshoot a diverse range of problems. Responsibilities: Design and develop enterprise data architecture solutions using Hadoop and other data technologies such as Spark and Scala.
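The pipeline work this role describes can be reduced to a minimal extract-transform-load sketch; plain Python stands in for Spark/Scala here, and the records and field names are made up.

```python
# Minimal ETL sketch: each stage is a function, as in a real pipeline DAG.
def extract():
    # Stand-in for reading from a source system
    return [
        {"id": 1, "amount": "100.5", "country": "in"},
        {"id": 2, "amount": "bad", "country": "us"},   # dirty record
        {"id": 3, "amount": "75", "country": "in"},
    ]

def transform(rows):
    # Cleanse types, normalize values, drop records that fail validation
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # in a real pipeline, route to a reject/error table
        out.append({"id": r["id"], "amount": amount, "country": r["country"].upper()})
    return out

def load(rows, target):
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0]["country"])  # 2 IN
```

In Spark the same shape appears as a read, a chain of DataFrame transformations, and a write; the separation of stages is the point of the sketch.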
Posted 3 days ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
4+ years of hands-on experience using Azure Cloud, ADLS, ADF & Databricks
Finance domain data stewardship
Finance data reconciliation with SAP downstream systems
Run/monitor pipelines and validate Databricks notebooks
Able to interface with onsite/business stakeholders
Python, SQL hands-on
Knowledge of Snowflake/DW is desirable.
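The reconciliation duty named above can be illustrated with a hedged sketch: compare per-account totals between a source extract and a downstream copy. The account numbers and amounts are invented, and the function is ours, not a real SAP or Databricks API.

```python
# Sketch of a finance reconciliation between a source extract (e.g. SAP)
# and a downstream warehouse copy; all figures are illustrative.
def reconcile(source, downstream, tolerance=0.01):
    """Report accounts that are missing downstream or whose totals differ."""
    issues = []
    for account, amount in source.items():
        if account not in downstream:
            issues.append((account, "missing downstream"))
        elif abs(downstream[account] - amount) > tolerance:
            issues.append((account, "amount mismatch"))
    return issues

sap = {"4000": 1250.00, "4100": 300.00, "4200": 99.99}
dwh = {"4000": 1250.00, "4100": 305.00}
print(reconcile(sap, dwh))
```

In practice the two inputs would come from pipeline outputs (e.g. Databricks tables), and the mismatch report would feed a stewardship workflow rather than a print statement.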
Posted 3 days ago
5.0 - 10.0 years
3 - 7 Lacs
Chennai
Work from Office
POSITION OVERVIEW: Informatica Developer
POSITION GENERAL DUTIES AND TASKS:
3+ years of strong experience in Snowflake
Experience in ETL and in building data ingestion pipelines
Good knowledge of Oracle, SQL, PL/SQL
Exposure to Agile methodology and usage of the JIRA tool
Good understanding of functional requirements and business processes
Candidates must have good communication and facilitation skills
Work in an onsite-offshore model involving daily interactions with onshore teams to ensure on-time, quality deliverables
3+ years of relevant experience in SharePoint, with a minimum of 2+ years on Office 365 / SharePoint Online; hands-on experience in coding using Angular, HTML, JavaScript, CSS, React JS, SPFx, and REST
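A common shape for the data ingestion pipelines this posting names is stage-then-merge: land rows in a staging table, then merge them into the target. The sketch below uses sqlite3's UPSERT as a runnable stand-in for a Snowflake MERGE; all table names are invented.

```python
import sqlite3

# Stage-then-merge ingestion pattern; sqlite3's UPSERT stands in for a
# Snowflake MERGE so the sketch stays self-contained and runnable.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_customers (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO dim_customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO stg_customers VALUES (2, 'Globex Ltd'), (3, 'Initech');
""")
# Merge staged rows into the target: update on key collision, insert otherwise.
# (The `WHERE true` is required by SQLite's parser for INSERT..SELECT upserts.)
cur.execute("""
    INSERT INTO dim_customers (id, name)
    SELECT id, name FROM stg_customers WHERE true
    ON CONFLICT(id) DO UPDATE SET name = excluded.name
""")
rows = cur.execute("SELECT id, name FROM dim_customers ORDER BY id").fetchall()
print(rows)  # [(1, 'Acme'), (2, 'Globex Ltd'), (3, 'Initech')]
```

In Snowflake the merge step would be a `MERGE INTO ... USING ... WHEN MATCHED / WHEN NOT MATCHED` statement; the staging-table discipline is the same.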
Posted 3 days ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics. We are seeking an experienced Business Intelligence Developer with 2+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL (SSIS, Informatica, Dell Boomi, etc.) processes and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations; expertise in using Excel & PowerPoint. Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 days ago
6.0 - 11.0 years
8 - 18 Lacs
Hyderabad
Work from Office
Immediate Job Openings on #Informatica / BI Admin - Pan India - Contract
Experience: 6+ years
Skill: Informatica / BI Admin
Location: Pan India
Notice Period: Immediate
Employment Type: Contract
Informatica / BI Admin:
BI administration support (SAS, Tableau and reporting services)
Claims Center Guidewire administration support for Informatica
Informatica Admin, ILM support for Claims BAU and Service Ops
Claims Center ODS administration support for Informatica
Policy Center Informatica administration and support
Alteryx, Sapiens, Celonis administration and support
Informatica support for Apex.
Posted 3 days ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Urgent requirement for Ab Initio Developer
Experience: 4+ years
Location: Pan India
Ab Initio GDE hands-on (worked on graphs, plans and psets)
Experience with Ab Initio development and code promotion
Experience with Unix shell scripting and Unix commands
4+ years of relevant experience
Posted 3 days ago
3.0 - 5.0 years
14 - 19 Lacs
Mumbai, Pune
Work from Office
Company: Marsh McLennan Agency
Description: Marsh McLennan is seeking candidates for the following position based in the Pune office. Senior Engineer / Principal Engineer
What can you expect? We are seeking a skilled Data Engineer with 3 to 5 years of hands-on experience in building and optimizing data pipelines and architectures. The ideal candidate will have expertise in Spark, AWS Glue, AWS S3, Python, complex SQL, and AWS EMR.
What is in it for you?
Holidays (as per the location)
Medical & insurance benefits (as per the location)
Shared transport (provided the address falls in the service zone)
Hybrid way of working
Diversify your experience and learn new skills
Opportunity to work with stakeholders globally to learn and grow
We will count on you to: Design and implement scalable data solutions that support our data-driven decision-making processes.
What you need to have:
SQL and RDBMS knowledge - 5/5. Postgres. Should have extensive hands-on experience with database systems carrying tables, schemas, views, and materialized views.
AWS knowledge: core and data engineering services. Glue / Lambda / EMR / DMS / S3 - services in focus.
ETL knowledge: any ETL tool, preferably Informatica. Data warehousing.
Big data: Hadoop - concepts. Spark - 3/5. Hive - 5/5. Python / Java.
Interpersonal skills: excellent communication skills and team lead capabilities. Understanding of data systems in large organizational setups. Passion for deep diving into data and delivering value out of it.
What makes you stand out:
Databricks knowledge.
Any reporting tool experience; MicroStrategy preferred.
Marsh McLennan (NYSE: MMC) is the world's leading professional services firm in the areas of risk, strategy and people.
The Company's more than 85,000 colleagues advise clients in over 130 countries. With annual revenue of $23 billion, Marsh McLennan helps clients navigate an increasingly dynamic and complex environment through four market-leading businesses. Marsh provides data-driven risk advisory services and insurance solutions to commercial and consumer clients. Guy Carpenter develops advanced risk, reinsurance and capital strategies that help clients grow profitably and pursue emerging opportunities. Mercer delivers advice and technology-driven solutions that help organizations redefine the world of work, reshape retirement and investment outcomes, and unlock health and well-being for a changing workforce. Oliver Wyman serves as a critical strategic, economic and brand advisor to private sector and governmental clients. For more information, visit marshmclennan.com, or follow us on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people regardless of their sex/gender, marital or parental status, ethnic origin, nationality, age, background, disability, sexual orientation, caste, gender identity or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
Posted 3 days ago
3.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of the devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future. What We Offer: Location: Bangalore, IND. At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied. Key Responsibilities: Supports the design and development of program methods, processes, and systems to consolidate and analyze structured and unstructured, diverse "big data" sources.
Interfaces with internal customers for requirements analysis and compiles data for scheduled or special reports and analysis. Supports project teams in developing analytical models, algorithms and automated processes, applying SQL understanding and Python programming, to cleanse, integrate and evaluate large datasets. Supports the timely development of products for manufacturing and process information by applying sophisticated data analytics. Able to quickly understand a requirement and turn it into executive-level presentation slides. Participates in the design, development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. used to drive key business decisions. Strong business & financial (P&L) acumen; able to understand key themes, financial terms and data points to create appropriate summaries. Works with the business intelligence manager and other staff to assess various reporting needs. Analyzes reporting needs and requirements, assesses current reporting in the context of strategic goals and devises plans for delivering the most appropriate reporting solutions to users. Qualification: Bachelor's/Master's degree and 7-12 years of relevant experience as a data analyst. Required technical skills in SQL, Azure, Python, Databricks, Tableau (good to have); PowerPoint and Excel expertise. Experience in the Supply Chain domain. Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline and basic knowledge of related disciplines. Business Expertise: Has knowledge of best practices and how own area integrates with others; is aware of the competition and the factors that differentiate them in the market. Leadership: Acts as a resource for colleagues with less experience; may lead small projects with manageable risks and resource requirements. Problem Solving: Solves complex problems; takes a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information.
Impact: Impacts a range of customer, operational, project or service activities within own team and other related teams; works within broad guidelines and policies. Interpersonal Skills: Explains difficult or sensitive information; works to build consensus. Additional Information - Time Type: Full time. Employee Type: Assignee / Regular. Travel: Yes, 20% of the time. Relocation Eligible: Yes. Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
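The cleanse-and-integrate responsibility this role describes (normalize keys, drop duplicates, join datasets) can be sketched briefly; all records, part numbers, and vendor names below are invented.

```python
# Cleanse two small "datasets": normalize keys, dedupe, then enrich by lookup.
supply = [
    {"part": " ABC-1 ", "qty": 5},
    {"part": "abc-1", "qty": 5},      # duplicate after normalization
    {"part": "XYZ-9", "qty": 2},
]
vendors = {"ABC-1": "Vendor A", "XYZ-9": "Vendor B"}

def normalize(part):
    """Canonical key: trimmed, upper-cased."""
    return part.strip().upper()

seen, cleaned = set(), []
for row in supply:
    key = normalize(row["part"])
    if key in seen:
        continue  # drop duplicate records after key normalization
    seen.add(key)
    cleaned.append({"part": key, "qty": row["qty"], "vendor": vendors.get(key)})

print(cleaned)
```

At scale the same steps would run in SQL or a Databricks notebook, but the logic (canonicalize, dedupe, join) is unchanged.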
Posted 3 days ago
8.0 - 10.0 years
25 - 27 Lacs
Bengaluru
Work from Office
Key Skills: Data Engineer, Data Integration, Informatica, PySpark, Informatica MDM. Roles and Responsibilities: Utilize Informatica IDMC tools including Data Profiling, Data Quality, and Data Integration modules to support data initiatives. Implement and maintain robust data quality frameworks, ensuring data accuracy, consistency, and reliability. Work on ETL (Extract, Transform, Load) processes to support business intelligence and analytics needs. Participate in agile product teams to design, develop, and deliver data solutions aligned with business requirements. Perform data validation, cleansing, and profiling to meet data governance standards. Collaborate with cross-functional teams to understand business needs and translate them into technical solutions. Assist in the design and execution of QA processes to maintain high standards in data pipelines. Support integration with cloud platforms such as MS Azure and utilize DevOps tools for deployment. Contribute to innovation and improvement initiatives, including the development of new features with modern data tools like Databricks. Maintain clear documentation and ensure alignment with data governance and data management principles. Optionally, develop visualizations using Power BI for data storytelling and reporting. Experience Requirement: 8-10 years of experience working on data quality projects. At least 3 years of hands-on experience with Informatica Data Quality modules. Strong understanding of data profiling, validation, cleansing, and overall data quality concepts. Experience with ETL processes and QA/testing frameworks. Basic knowledge of the Microsoft Azure platform and services. Exposure to data governance and management practices. Experience in agile environments using tools like DevOps. Strong analytical, problem-solving, and troubleshooting skills. Proficient in English with excellent communication and collaboration abilities. Nice to have: experience developing with Power BI. Education: Any graduation.
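The profiling, validation, and cleansing concepts listed here can be illustrated with a small hedged sketch of the checks a quality framework computes per column (null rate, uniqueness, range). The fields, thresholds, and records are invented; this is plain Python, not an Informatica API.

```python
# Minimal data-profiling sketch: compute per-field quality metrics.
def profile(rows, field, min_val=None, max_val=None):
    """Return null rate, uniqueness, and optional range check for one field."""
    values = [r.get(field) for r in rows]
    non_null = [v for v in values if v is not None]
    checks = {
        "null_rate": 1 - len(non_null) / len(values),
        "unique": len(set(non_null)) == len(non_null),
    }
    if min_val is not None and max_val is not None:
        checks["in_range"] = all(min_val <= v <= max_val for v in non_null)
    return checks

data = [
    {"id": 1, "age": 34},
    {"id": 2, "age": None},   # null to be counted
    {"id": 3, "age": 150},    # out-of-range value
    {"id": 4, "age": 28},
]
print(profile(data, "age", min_val=0, max_val=120))
```

A tool like Informatica Data Quality produces the same class of metrics as scorecards; failed checks would feed cleansing or exception-handling rules rather than a print statement.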
Posted 3 days ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are seeking a skilled Informatica BDM Engineer with a strong background in Big Data Management to join our team. The ideal candidate will be responsible for designing, developing, and implementing Informatica solutions to manage and analyze large volumes of data effectively. You will play a crucial role in ensuring data integrity, security, and compliance within our organization. Overall Responsibilities Design and Development: Design, develop, and implement solutions using Informatica Big Data Management. Ensure quality and performance of technical and application architecture and design across the organization. Data Management: Work extensively with Oozie scheduling, HQL, Hive, HDFS, and data partitioning to manage large datasets. Collaborate with teams to ensure effective data integration and transformation processes using SQL and NoSQL databases. Security Implementation: Design and implement security systems, identifying gaps in existing architectures and recommending enhancements. Adhere to established policies and best practices regarding data security and compliance. Monitoring and Troubleshooting: Actively monitor distributed services and troubleshoot issues in production environments. Implement resiliency and monitoring solutions to ensure continuous service availability. Agile and DevOps Practices: Participate in Agile methodology, ensuring timely delivery of projects while adhering to CI/CD principles using tools like GitHub and Jenkins. Collaboration and Influence: Work collaboratively with multiple teams to share knowledge and improve productivity. Effectively research and benchmark technologies against best-in-class solutions. Technical Skills Core Skills Informatica BDM: Minimum 5 years of development and design experience. Data Technologies: Extensive knowledge of Oozie, HQL, Hive, HDFS, and data partitioning. Databases: Proficient in SQL and NoSQL databases. Operating Systems: Strong Linux OS configuration skills, including shell scripting. 
Security and Compliance: Knowledge of designing security controls for data transfers and ETL processes. Understanding of compliance and regulatory requirements, including encryption and data integrity. Networking: Basic understanding of networking concepts including DNS, proxy, ACL, and policy troubleshooting. DevOps & Agile: Familiar with Agile methodologies, CI/CD practices, and tools (GitHub, Jenkins). Experience with distributed services resiliency and monitoring. Experience: Minimum 5 years of experience in Informatica Big Data Management. Experience in the Banking, Financial, and Fintech sectors preferred. Proven ability to implement design patterns and security measures in large-scale infrastructures. Qualifications - Education: Bachelor's or Master's degree in Computer Science or a related field (or equivalent industry experience). Soft Skills: Excellent interpersonal and communication skills to effectively present ideas to teams and stakeholders. Strong listening skills with the ability to speak clearly and confidently in front of management and peers. Positive attitude towards work, fostering a climate of trust and collaboration within the team. Enthusiastic and passionate about technology, creating a motivating environment for the team. SYNECHRON'S DIVERSITY & INCLUSION STATEMENT: Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply.
We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law. Candidate Application Notice
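The Hive/HDFS data partitioning this listing emphasizes follows a simple convention: each distinct partition-key value becomes its own directory, so loads and queries can prune whole partitions. As a rough illustration (the table, column, and path names below are invented, and a real deployment would use Informatica BDM or HQL rather than Python):

```python
from collections import defaultdict

def partition_records(records, key="event_date", base="/warehouse/events"):
    """Group rows by partition key and map each group to an HDFS-style
    partition directory, mimicking how Hive lays out partitioned tables."""
    parts = defaultdict(list)
    for row in records:
        parts[row[key]].append(row)
    return {f"{base}/{key}={value}": rows for value, rows in parts.items()}

def add_partition_ddl(table, key, value):
    """Build the HQL statement that registers one partition with the metastore."""
    return f"ALTER TABLE {table} ADD IF NOT EXISTS PARTITION ({key}='{value}');"

if __name__ == "__main__":
    rows = [
        {"event_date": "2024-01-01", "user": "a"},
        {"event_date": "2024-01-02", "user": "b"},
        {"event_date": "2024-01-01", "user": "c"},
    ]
    layout = partition_records(rows)
    for path in sorted(layout):
        print(path, len(layout[path]))
    print(add_partition_ddl("events", "event_date", "2024-01-01"))
```

The payoff is partition pruning: a query filtered on `event_date` touches only the matching directories instead of scanning the full dataset.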
Posted 3 days ago
3.0 - 5.0 years
14 - 19 Lacs
Pune
Work from Office
Job Summary
Synechron is seeking a skilled Qlik Sense Developer to design, develop, and optimize business intelligence solutions that enhance data-driven decision-making across the organization. The successful candidate will collaborate with cross-functional teams to understand business requirements, craft scalable dashboards, and enable stakeholders to derive insights efficiently. This role plays a vital part in supporting digital transformation and strategic initiatives through effective data visualization and analytics.

Software Requirements
Required Software Skills:
- Strong proficiency in Qlik Sense (version 12 or later)
- Experience with QlikView or other BI tools (preferred)
- Knowledge of SQL and data querying techniques
- Familiarity with data modeling concepts
- Working knowledge of scripting languages (e.g., Python, R) is a plus
- Experience with cloud-based data platforms (Azure, AWS, GCP)

Preferred Software Skills:
- Integration with other BI/analytics tools
- Knowledge of ETL processes and tools (Informatica, Talend, etc.)
- Experience with scripting for automation and data manipulation

Overall Responsibilities
- Collaborate with business users and technical teams to gather reporting and dashboard requirements
- Design, develop, and deploy interactive, user-friendly Qlik Sense dashboards and visualizations
- Optimize data models and scripts to improve performance and scalability
- Conduct thorough testing to ensure accuracy and usability of reports
- Implement best practices for security, version control, and documentation
- Stay updated with the latest features, techniques, and trends in BI and data visualization
- Provide ongoing support and enhancements to existing dashboards, resolving issues proactively
- Participate in cross-functional meetings to align data solutions with strategic goals

Performance Outcomes & Expectations:
- Timely delivery of high-quality BI solutions that meet user needs
- Improved data accessibility and insights derived from dashboards
- Reduction in report generation time through performance tuning
- Active contribution to knowledge sharing and best practices

Technical Skills (By Category)
Programming Languages:
- Essential: Qlik Sense scripting language, SQL
- Preferred: Python, R (for advanced data manipulation and automation)
Databases/Data Management:
- Essential: SQL Server, Oracle, PostgreSQL
- Preferred: NoSQL (MongoDB, Cassandra)
Cloud Technologies:
- Preferred: Cloud platforms (Azure, AWS, GCP) with experience integrating with BI tools
Frameworks and Libraries:
- Qlik Sense, QlikView, Qlik DataMarket
- Data modeling tools (dimensional modeling, star schemas)
Development Tools and Methodologies:
- Qlik Sense, QlikView, SQL Management Studio
- Version control (Git, Bitbucket)
- Agile methodologies (Scrum, Kanban)
- Data integration and ETL tools
Security Protocols:
- User access management within Qlik Sense
- Data security standards and encryption practices

Experience
- Minimum of 3-5 years of hands-on experience designing and deploying Qlik Sense dashboards
- Proven track record of translating business requirements
into visual insights
- Experience with data modeling, scripting, and data querying
- Exposure to cloud and on-premises environments
- Experience collaborating with cross-functional teams in dynamic settings

Day-to-Day Activities
- Participate in daily stand-up meetings and sprint planning
- Gather dashboard requirements through stakeholder engagement
- Develop, test, and publish Qlik Sense visualizations
- Optimize data load scripts and data models for performance
- Troubleshoot and resolve data discrepancies or performance bottlenecks
- Conduct user training and documentation for dashboards
- Collaborate with data engineers, analysts, and business users for continuous improvement
- Keep abreast of the latest BI trends and features, recommending innovations

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field
- Professional certification in Qlik Sense or BI tools is a plus
- Continuous professional development in BI trends and data visualization techniques

Professional Competencies
- Strong analytical and problem-solving skills
- Excellent written and verbal communication abilities
- Ability to work independently and collaboratively
- Detail-oriented with a focus on data accuracy
- Adaptability to changing requirements and technological updates
- Time management skills to prioritize deliverables

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company.
We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law. Candidate Application Notice
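The dimensional modeling (star schemas) this listing asks for boils down to splitting a flat extract into a dimension table with surrogate keys and a slim fact table that references them. A minimal sketch in plain Python (the table and column names are invented for illustration; in practice this would be done in the Qlik load script or SQL):

```python
def build_star(flat_rows, dim_cols, measure_cols):
    """Split a flat table into one dimension table (unique attribute combos
    with surrogate keys) and a fact table referencing those keys."""
    dim, fact = {}, []
    for row in flat_rows:
        attrs = tuple(row[c] for c in dim_cols)
        key = dim.setdefault(attrs, len(dim) + 1)  # assign surrogate key on first sight
        fact.append({"dim_key": key, **{c: row[c] for c in measure_cols}})
    dim_table = [dict(zip(dim_cols, attrs), dim_key=key) for attrs, key in dim.items()]
    return dim_table, fact

if __name__ == "__main__":
    sales = [
        {"region": "EMEA", "product": "X", "amount": 100},
        {"region": "EMEA", "product": "X", "amount": 250},
        {"region": "APAC", "product": "Y", "amount": 75},
    ]
    dim, fact = build_star(sales, ["region", "product"], ["amount"])
    print(len(dim), "dimension rows;", len(fact), "fact rows")
```

The performance benefit mirrors what the posting's "optimize data models" bullet is after: repeated descriptive attributes are stored once in the dimension, so the fact table stays narrow.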
Posted 3 days ago
5.0 - 9.0 years
11 - 15 Lacs
Bengaluru
Work from Office
- BI Tools or Data Acceleration/Data Processing deployment and administration
- Previous experience administering in-memory columnar databases such as Exasol, Greenplum, Vertica, or Snowflake
- Strong analytical and problem-solving skills
- Ability to communicate orally and in writing in a clear and straightforward manner with a broad range of technical and non-technical users and stakeholders
- Proactive and focused on results and success; conveys a sense of urgency and drives issues to closure
- Should be a team player and leader; flexible, hardworking, and self-motivated, with a positive outlook and the ability to take on difficult initiatives and challenges
- Ability to handle multiple concurrent projects

Candidate details to share when applying: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding any offer
Posted 3 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In This Role, Your Responsibilities May Include
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter
- Knowledge of Cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred Technical And Professional Experience
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
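The "cleanse and integrate data in an efficient and reusable manner" responsibility above can be sketched as a pipeline of small, composable row-level transforms. This is a hypothetical illustration, not IBM's or Informatica's approach; the field names and rules are invented:

```python
def clean(rows, transforms):
    """Apply an ordered list of row transforms; drop rows a transform
    rejects (signalled by returning None)."""
    out = []
    for row in rows:
        for t in transforms:
            row = t(row)
            if row is None:
                break
        else:
            out.append(row)
    return out

def strip_fields(row):
    """Trim stray whitespace from every string value."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}

def require_id(row):
    """Reject rows missing the join key."""
    return row if row.get("id") else None

if __name__ == "__main__":
    raw = [{"id": " 1 ", "name": " Ada "}, {"id": "", "name": "ghost"}]
    print(clean(raw, [strip_fields, require_id]))  # the empty-id row is dropped
```

Keeping each rule as its own function is what makes the pipeline reusable: the same transforms can be recombined per source system instead of rewriting a monolithic cleansing script.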
Posted 3 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter
- Knowledge of Cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred Technical And Professional Experience
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
Posted 3 days ago
10.0 - 12.0 years
12 - 15 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Urgent requirement for Pentaho DI.
Experience: 10+ years
Employment: C2H
Notice Period: Immediate
Mandatory Skills: ETL; Pentaho Data Integration (Kettle and Spoon); designing, troubleshooting, and performance tuning. Should also have AWS cloud, XML, JavaScript, Java, and Unix shell scripts.
Good-to-have skills: Integration with Salesforce components.
Posted 3 days ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Requirements
Description and Requirements
- 7+ years of experience in quality assurance, with at least 3+ years in a Test Data Management (TDM) lead or senior role.
- Proven experience in designing and implementing test data management strategies, data masking, and test data provisioning for large-scale software projects.
- Lead the development and implementation of comprehensive test data management strategies to support functional, regression, performance, security, and other types of testing.
- Establish governance processes and best practices for handling, managing, and securing test data across multiple projects and environments.
- Ensure that test data complies with legal, regulatory, and organizational security policies (e.g., GDPR, HIPAA).
- Design and oversee the creation of high-quality, realistic, and representative test data to meet the needs of different types of testing.
- Use data generation tools and techniques to produce test data that mirrors real-world data while maintaining privacy and security.
- Develop automated processes for generating and refreshing test data in line with project and release timelines.
- Implement and manage data masking, anonymization, and sanitization techniques to ensure sensitive information is protected while retaining data integrity for testing purposes.
- Develop and enforce data security practices related to the use and storage of test data.
- Work closely with QA, development, and DevOps teams to understand the specific test data requirements for different testing phases (e.g., unit, integration, performance, UAT).
- Collaborate with business and IT teams to ensure that required test data is available when needed and meets quality expectations.
- Support the creation of data models and mapping to align test data with application requirements.
- Implement strategies for efficient storage and retrieval of test data to ensure high performance and reduce resource consumption during testing.
- Continuously assess and optimize test data strategies to improve test execution time, resource allocation, and overall testing efficiency.
- Manage large-scale data sets and ensure their availability across multiple environments (development, testing, staging, production).
- Lead the evaluation, implementation, and continuous improvement of test data management tools and automation platforms (e.g., Informatica TDM, Delphix, IBM InfoSphere Optim).
- Leverage automation to streamline test data creation, management, and refresh cycles, ensuring quick access to the latest data for testing.
- Drive the adoption of self-service tools to enable teams to generate, refresh, and manage their own test data securely.
- Monitor and manage test data usage to ensure compliance with internal standards and external regulations.
- Provide regular reporting on test data quality, availability, and utilization to key stakeholders, highlighting any risks or issues.
- Track and resolve test data issues (e.g., missing data, incorrect data) and provide solutions to improve data availability and accuracy.
- Lead and mentor a team of test data management professionals, providing guidance, training, and support to enhance team capabilities.
- Establish clear goals, KPIs, and performance metrics for the team and ensure that projects are completed on time and to a high standard.
- Foster a culture of continuous improvement, encouraging the team to innovate and apply new test data management techniques.
- Stay up-to-date with emerging trends, technologies, and best practices in test data management and data privacy.
- Evaluate and recommend new tools, technologies, and methods to improve the test data management process, increase efficiency, and reduce manual effort.
- Experience with AI and automation tools for test data generation and data management.

Additional Job Description
- Expertise in test data management tools and platforms (e.g., Delphix, Informatica TDM, IBM InfoSphere Optim, CA TDM).
- Strong knowledge of data security, privacy, and compliance standards (e.g., GDPR, HIPAA) as they relate to test data.
- Proficient in database management and query languages (e.g., SQL, PL/SQL) for data manipulation, extraction, and analysis.
- Experience with test automation frameworks and integration of TDM tools into CI/CD pipelines.
- Familiarity with cloud-based test data management solutions (e.g., AWS, Azure, Google Cloud).

EEO Statement
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent.

Equal Opportunity Employer
At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants' qualifications, merits, competence and performance without regard to any characteristic related to diversity.
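The masking and anonymization work this role describes usually hinges on one property: masking must be deterministic, so foreign keys still join after sensitive values are replaced. A minimal stdlib sketch of that idea (the key handling and output format are illustrative assumptions, not how Informatica TDM or Delphix actually work):

```python
import hashlib
import hmac

SECRET = b"rotate-me"  # hypothetical per-environment masking key

def mask_email(email: str) -> str:
    """Deterministically pseudonymize an email: the same input always maps
    to the same output, so masked datasets still join across tables."""
    digest = hmac.new(SECRET, email.lower().encode(), hashlib.sha256).hexdigest()[:12]
    return f"user_{digest}@example.test"

if __name__ == "__main__":
    a = mask_email("Jane.Doe@corp.com")
    b = mask_email("jane.doe@corp.com")  # case-insensitive match
    print(a == b, a)
```

A keyed HMAC (rather than a plain hash) matters here: without the secret, an attacker cannot rebuild the mapping by hashing a dictionary of known emails.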
Posted 3 days ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Minimum 3 years of experience working in Test Data Management (TDM). Hands-on experience with tools such as CA Fast Data Masker, Informatica, and IBM Optim. Exposure to data masking/obfuscation. Hands-on experience in SQL, along with multiple databases such as Oracle, SQL Server, Greenplum, etc. Hands-on experience in data profiling, data masking, and reporting. Experience in training and mentoring juniors. Experience handling a team of up to 3-4 associates. Hands-on experience working in an offshore-onshore model, with good communication skills. Hands-on experience in Java and Python will be a plus.
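The data profiling mentioned in this listing is typically the first pass before deciding what to mask: per-column null counts, distinct counts, and value ranges. A small sketch of that idea (column names are invented for illustration):

```python
def profile(rows):
    """Compute simple per-column stats over a list of dict rows:
    null count, distinct count, and min/max of the populated values."""
    cols = rows[0].keys() if rows else []
    report = {}
    for col in cols:
        values = [r.get(col) for r in rows]
        present = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(present),
            "distinct": len(set(present)),
            "min": min(present) if present else None,
            "max": max(present) if present else None,
        }
    return report

if __name__ == "__main__":
    rows = [
        {"age": 30, "city": "Pune"},
        {"age": None, "city": "Pune"},
        {"age": 45, "city": "Delhi"},
    ]
    print(profile(rows))
```

In a TDM tool the same statistics drive masking decisions: a column with few distinct values may need shuffling rather than hashing to stay realistic.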
Posted 3 days ago
6.0 - 11.0 years
8 - 14 Lacs
Gurugram
Work from Office
As a BI ETL Test Engineer, you will be responsible for testing BI systems. This includes validating the business data flow, ETL components, data lineage, and ETL architecture, and analyzing defects found during data validation. It also includes setting up the testing strategy, recommending tools, and performing technical feasibility and risk assessments.

Primary Skills
As a BI ETL Test Specialist, you are expected to be a subject matter expert in this area of specialised testing: understanding the business data flow, ETL components, data lineage, and ETL architecture, and analyzing defects during data validation. You have good technical knowledge of databases, Unix/Linux, and ETL and BI tools. You are expected to develop the testing strategy, recommend tools, perform technical feasibility assessments, conduct risk assessments, and build business cases (ROI). You are expected to own delivery of specialised testing projects and to work independently, providing technical support and guidance.
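The data-validation work described here usually starts with source-to-target reconciliation: compare row counts, then an order-independent checksum per table, since ETL loads rarely preserve row order. A stdlib sketch under the assumption that both extracts fit in memory as lists of dicts (column names are illustrative):

```python
import hashlib

def table_checksum(rows, cols):
    """Order-independent checksum: hash each row's column values, XOR the
    hashes together so row order does not affect the result."""
    acc = 0
    for row in rows:
        canon = "|".join(f"{c}={row[c]}" for c in cols)
        acc ^= int(hashlib.sha256(canon.encode()).hexdigest(), 16)
    return acc

def reconcile(source, target, cols):
    """Return (counts_match, checksums_match) between source and target extracts."""
    return (len(source) == len(target),
            table_checksum(source, cols) == table_checksum(target, cols))

if __name__ == "__main__":
    src = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
    tgt = [{"id": 2, "amt": 20}, {"id": 1, "amt": 10}]  # same rows, different load order
    print(reconcile(src, tgt, ["id", "amt"]))
```

In practice the same counts and checksums would be computed inside each database (e.g., via SQL aggregates) rather than pulled into Python, but the validation logic is the same.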
Posted 3 days ago
0 years
0 Lacs
India
Remote
About Us
Our leading SaaS-based Global Employment Platform™ enables clients to expand into over 180 countries quickly and efficiently, without the complexities of establishing local entities. At G-P, we're dedicated to breaking down barriers to global business and creating opportunities for everyone, everywhere. Our diverse, remote-first teams are essential to our success. We empower our Dream Team members with flexibility and resources, fostering an environment where innovation thrives and every contribution is valued and celebrated. The work you do here will positively impact lives around the world. We stand by our promise: Opportunity Made Possible. In addition to competitive compensation and benefits, we invite you to join us in expanding your skills and helping to reshape the future of work. At G-P, we assist organizations in building exceptional global teams in days, not months, streamlining the hiring, onboarding, and management process to unlock growth potential for all.

About The Position
As a Senior Engineering Manager at Globalization Partners, you will be responsible for both technical leadership and people management. This includes contributing to architectural discussions, decisions, and execution, as well as managing and developing a team of Data Engineers (of different experience levels).

What You Can Expect To Do
- Own the strategic direction and execution of initiatives across our Data Platform, aligning technical vision with business goals. Guide teams through architectural decisions, delivery planning, and execution of complex programs that advance our platform capabilities.
- Lead and grow high-performing engineering teams responsible for the full data and analytics stack, from ingestion (ETL and streaming) through transformation, storage, and consumption, ensuring quality, reliability, and performance at scale.
- Partner cross-functionally with product managers, architects, engineering leaders, and stakeholders from Cloud Engineering and other business domains to shape product and platform capabilities, translating business needs into actionable engineering plans.
- Drive delivery excellence by setting clear expectations, removing blockers, and ensuring engineering teams are progressing efficiently towards milestones while maintaining technical integrity.
- Ensure adoption and consistency of platform standards and best practices, including shared components, reusable libraries, and scalable data patterns.
- Support technical leadership across teams by fostering a strong culture of engineering excellence, security, and operational efficiency. Guide technical leads in maintaining high standards in architecture, development, and testing.
- Contribute to strategic planning, including the evolution of the data platform roadmap, migration strategies, and long-term technology investments aligned with company goals.
- Champion agile methodologies and DevOps practices, driving continuous improvement in team collaboration, delivery cycles, and operational maturity.
- Mentor and develop engineering talent, creating an environment where individuals can thrive through coaching, feedback, and growth opportunities. Promote a culture of innovation, accountability, and psychological safety.
- Challenge data platform quality and performance by building and monitoring quality KPIs and fostering a quality-first culture.

What We Are Looking For
- Proven experience leading geographically distributed engineering teams in the design and delivery of complex data and analytics platforms.
- Strong technical foundation with hands-on experience in modern data architectures, handling structured and unstructured data, and programming in Python, with the ability to guide teams and review design and code at a high level when necessary.
- Proficiency in SQL and relational database technologies, with the ability to guide data modeling and performance optimization discussions.
- In-depth understanding of ETL processes and data integration strategies, with practical experience overseeing data ingestion (batch and streaming), transformation, and quality assurance initiatives.
- Familiarity with commercial data platforms (e.g., Databricks, Snowflake) and cloud-native data warehouses (e.g., Redshift, BigQuery), including trade-offs and best practices in enterprise environments.
- Working knowledge of data governance and cataloging solutions, such as Atlan, Alation, Informatica, or Collibra, and experience supporting enterprise data stewardship efforts.
- Deep understanding of data quality, experience in building quality processes, and usage of tools like Monte Carlo.
- Understanding of machine learning and AI workloads, including the orchestration of data pipelines for model training and deployment in both batch and streaming contexts.
- Strong analytical and problem-solving skills, with the ability to drive root-cause analysis, evaluate architectural trade-offs, and support decision-making in ambiguous or fast-changing environments.
- Exceptional communication skills, with a track record of clear and effective collaboration across technical and non-technical stakeholders. Fluent in English, both verbal and written, with the ability to influence at all levels of the organization.
- Bachelor's degree in Computer Science or a related field; advanced degrees or equivalent professional experience are a plus.

We will consider for employment all qualified applicants who meet the inherent requirements for the position. Please note that background checks are required, and this may include criminal record checks.

G-P. Global Made Possible.

G-P is a proud Equal Opportunity Employer, and we are committed to building and maintaining a diverse, equitable and inclusive culture that celebrates authenticity.
We prohibit discrimination and harassment against employees or applicants on the basis of race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth, and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other legally protected status. G-P also is committed to providing reasonable accommodations to individuals with disabilities. If you need an accommodation due to a disability during the interview process, please contact us at careers@g-p.com.
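The quality-KPI monitoring this role calls for (the kind of check tools like Monte Carlo automate) can be reduced to two basic KPIs, completeness and freshness. A hedged sketch with invented thresholds and column names:

```python
from datetime import datetime, timedelta, timezone

def completeness(rows, col):
    """Completeness KPI: fraction of rows where `col` is populated."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(col) is not None) / len(rows)

def is_fresh(last_loaded, max_age_hours=24, now=None):
    """Freshness KPI: was the table loaded within the SLA window?"""
    now = now or datetime.now(timezone.utc)
    return now - last_loaded <= timedelta(hours=max_age_hours)

if __name__ == "__main__":
    rows = [{"email": "a@x"}, {"email": None}, {"email": "c@x"}]
    now = datetime(2024, 1, 2, tzinfo=timezone.utc)
    print(round(completeness(rows, "email"), 2),
          is_fresh(datetime(2024, 1, 1, 12, tzinfo=timezone.utc), now=now))
```

A quality-first culture then amounts to wiring such checks into the pipeline so a breach fails the run or pages the owning team, rather than being discovered by a downstream consumer.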
Posted 3 days ago