3.0 - 7.0 years
12 - 20 Lacs
Mumbai Suburban
Work from Office
Job Role:
• Minimum 3 years of prior industry work experience preferred.
• In-depth understanding of database structure principles.
• Knowledge of data mining and segmentation techniques; expertise in SQL and Oracle.
• Familiarity with data visualization and a data-oriented mindset.
• Ability to document complex business processes and handle all types of customer requests.
• Good communication skills in English; math and statistical analysis, with the ability to interpret and collate relevant data.
• Working experience with on-premises and cloud-based data infrastructure handling large and diverse datasets.
• Experience in one or more of the following technologies is preferred:
• AWS/GCP/Azure
• Kubernetes/Docker Swarm
• Apache Hadoop and Apache Spark
• Elastic Stack (ELK)
• Airflow/Prefect
• MongoDB, Cassandra, Redis, Memcached, and DynamoDB
• MySQL, Cassandra, and Oracle SQL
• Power BI/Tableau/QlikView
Posted 1 month ago
2.0 - 6.0 years
13 - 17 Lacs
Mumbai
Work from Office
At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world's energy systems. Their spirit fuels our mission. Our culture is defined by caring, agile, respectful, and accountable individuals. We value excellence of any kind. Sounds like you?

Software Developer - Data Integration Platform - Mumbai or Pune, Siemens Energy, Full Time

Looking for a challenging role? If you really want to make a difference - make it with us. We make real what matters.

About the role

Technical Skills (Mandatory)
- Python (Data Ingestion Pipelines): Proficiency in building and maintaining data ingestion pipelines using Python.
- Blazegraph: Experience with Blazegraph technology.
- Neptune: Familiarity with Amazon Neptune, a fully managed graph database service.
- Knowledge Graph (RDF, Triples): Understanding of RDF (Resource Description Framework) and triple stores for knowledge graph management.
- AWS Environment (S3): Experience working with AWS services, particularly S3 for storage solutions.
- Git: Proficiency in using Git for version control.

Optional and good-to-have skills
- Azure DevOps (optional): Experience with Azure DevOps for CI/CD pipelines and project management (optional but preferred).
- Metaphactory by Metaphacts (very optional): Familiarity with Metaphactory, a platform for knowledge graph management.
- LLM / Machine Learning experience: Experience with Large Language Models (LLMs) and machine learning techniques.
- Big Data solutions (optional): Experience with big data solutions is a plus.
- SnapLogic / Alteryx / ETL know-how (optional): Familiarity with ETL tools like SnapLogic or Alteryx is optional but beneficial.

We don't need superheroes, just super minds. A degree in Computer Science, Engineering, or a related field is preferred.
- Professional software development: Demonstrated experience in professional software development practices.
- Years of experience: 3-5 years of relevant experience in software development and related technologies.

Soft Skills
- Strong problem-solving skills.
- Excellent communication and teamwork abilities.
- Ability to work in a fast-paced and dynamic environment.
- Strong attention to detail and commitment to quality.
- Fluent in English (spoken and written).

We've got quite a lot to offer. How about you? This role is based in Pune or Mumbai, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We're Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at: www.siemens.com/careers
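The mandatory stack above centres on Python ingestion pipelines feeding an RDF triple store (Blazegraph or Neptune). As a rough, tool-agnostic illustration of the data shape involved, and not of any Siemens Energy code, the sketch below models RDF triples in plain Python and serializes them to N-Triples; all URNs and names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    """A single RDF statement: subject, predicate, object."""
    subject: str
    predicate: str
    object: str

def to_ntriples(triples):
    """Serialize triples to N-Triples text, one statement per line."""
    return "\n".join(
        f"<{t.subject}> <{t.predicate}> <{t.object}> ." for t in triples
    )

# Hypothetical asset data on its way into a triple store.
triples = [
    Triple("urn:asset:42", "urn:rel:locatedIn", "urn:site:mumbai"),
    Triple("urn:asset:42", "urn:rel:type", "urn:class:Turbine"),
]
print(to_ntriples(triples))
```

In a real pipeline the serialized text would be staged to S3 and bulk-loaded into the graph database rather than printed.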
Posted 1 month ago
8.0 - 10.0 years
15 - 20 Lacs
Gurugram
Work from Office
Position Summary: We are looking for an experienced Microsoft 365 Specialist to join our dynamic team to streamline enterprise project data. The ideal candidate will possess strong proficiency in Microsoft 365 applications and Generative AI tools, along with extensive knowledge of data governance principles. This role will focus on data aggregation, integration, and the development of a robust data architecture to ensure data integrity and accessibility across multiple digital projects in the organization. The candidate should be capable of acting as a developer to build a future-proof architecture that connects various data storage options in our Digital business groups. This would make our digital projects future-proof and AI-implementation-ready with respect to data flow and data quality, and lead to overall operational excellence.

A Snapshot of your Day

How You'll Make an Impact (responsibilities of role)
- Utilize the full suite of Microsoft 365 applications to streamline data and workflows across different digital projects and segments, customizing them as required.
- Act as a developer to build a future-proof architecture that connects various data storage options, including applications, cloud services, drives, and SharePoint.
- The designed architecture shall consolidate fragmented data from various sources to create a single, reliable source of truth for accurate reporting and analysis.
- Integrate and leverage Generative AI tools, such as Copilot, to improve data analysis and reporting capabilities.
- Implement data governance policies, workflows, and practices to ensure data quality, security, and compliance with relevant regulations.
- Apply data integration and transformation techniques, including ETL (Extract, Transform, Load) processes, to ensure data consistency and accuracy.
- Collaborate with stakeholders to identify data needs and ensure accurate reporting and analysis.
- Ensure data integrity and accessibility across the organization, enabling informed decision-making.
- Communicate effectively with cross-functional teams and stakeholders to understand data requirements and deliver solutions that meet business needs.
- Provide training and support to team members on data governance policies, procedures, and the required operability of Microsoft 365 tools.
- Keep abreast of new features and capabilities in Microsoft 365 related to data governance.

What You Bring
- Bachelor's/Master's degree in Information Technology, Computer Science, or a related field.
- 8 to 10 years of experience in developing architectures for data governance.
- Proven experience with Microsoft 365 applications and Generative AI tools, like Copilot.
- Strong understanding of data governance principles, practices, and policies.
- Experienced in utilizing a variety of database management systems and data exchange formats to optimize data storage, retrieval, and interoperability.
- Knowledge of relevant industry regulations and standards.
- Proficiency in data architecture design.
- Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders.
- Strong problem-solving skills and the ability to work collaboratively in a dynamic team environment across the globe.
Posted 1 month ago
5.0 - 8.0 years
12 - 15 Lacs
Chennai
Work from Office
Job Summary
We are seeking a skilled Informatica DGDQ Developer with over 5 years of experience to manage data governance and data quality processes. The successful candidate will play a crucial role in ensuring data accuracy, consistency, and quality across various systems. You will work closely with cross-functional teams to design and implement solutions that ensure data integrity and compliance with governance policies.

Mandatory Skills
- Proficiency in Informatica PowerCenter, including mappings, workflows, and transformations.
- Strong SQL and PL/SQL skills for data manipulation and querying.
- Experience with ETL processes and data warehousing concepts.
- Knowledge of data integration, data quality, and data migration techniques.
- Experience in performance tuning and troubleshooting of Informatica jobs.
- Knowledge of version control systems (e.g., Git) and CI/CD processes.

Roles and Responsibilities
- Design, develop, and implement Informatica DGDQ solutions to manage data quality and governance.
- Collaborate with business teams to gather data quality and governance requirements.
- Perform data profiling, cleansing, and validation using Informatica tools.
- Develop and maintain data quality rules, workflows, and dashboards.
- Monitor and troubleshoot data quality issues, ensuring timely resolution.
- Ensure compliance with data governance policies and frameworks.
- Work with data stewards and stakeholders to define data standards and best practices.
- Document data governance processes, data lineage, and metadata management.
- Provide training and support for data quality tools and processes to the organization.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience with Informatica DGDQ (Data Governance and Data Quality).
- Strong understanding of data governance frameworks and data quality management.
- Experience with data profiling, cleansing, and validation tools.
- Informatica Data Governance and Data Quality certification is a plus.

Technical Skills
- Expertise in Informatica DGDQ tools for data quality and governance management.
- Strong knowledge of SQL and data modeling techniques.
- Experience with data profiling, cleansing, validation, and enrichment.
- Knowledge of data governance best practices, including data stewardship and metadata management.
- Experience working with large datasets and complex data environments.
- Familiarity with data security, compliance, and regulatory requirements.

Soft Skills
- Excellent communication and collaboration skills.
- Ability to work closely with cross-functional teams and stakeholders.
- Strong problem-solving and analytical abilities.
- Detail-oriented with a focus on data accuracy and consistency.
- Adaptability and a proactive approach to addressing data governance challenges.

Good to Have
- Experience with cloud platforms like AWS or Azure for data governance and quality solutions.
- Knowledge of Big Data tools and technologies.
- Experience with REST APIs for integrating data governance tools with third-party systems.

Work Experience
- 5 to 8 years of experience in Informatica development and data integration.
- Proven ability to deliver high-quality data solutions in a fast-paced environment.
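The data-quality-rule work described above would normally live inside Informatica DGDQ itself. As a tool-agnostic sketch of the underlying idea (profile records against named validity rules and count failures per rule), the hypothetical Python below uses only the standard library; the rule names and fields are illustrative, not from any Informatica API.

```python
import re

# Each rule: (name, predicate over a record dict). Rules are illustrative.
RULES = [
    ("customer_id present", lambda r: bool(r.get("customer_id"))),
    ("email well-formed",
     lambda r: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email", "")) is not None),
    ("age in range", lambda r: 0 <= r.get("age", -1) <= 120),
]

def profile(records):
    """Return, per rule, how many records fail it."""
    failures = {name: 0 for name, _ in RULES}
    for r in records:
        for name, pred in RULES:
            if not pred(r):
                failures[name] += 1
    return failures

records = [
    {"customer_id": "C1", "email": "a@example.com", "age": 34},
    {"customer_id": "",   "email": "not-an-email",  "age": 150},
]
# Second record fails every rule, so each counter ends at 1.
print(profile(records))
```

A real DGDQ deployment would attach remediation workflows and dashboards to these counts; here the profile is just returned for inspection.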
Posted 1 month ago
4.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: ServiceNow ITOM Developer (Event Management)
Job Location: Bangalore / Mumbai (Onsite)
Experience: 4+ years
Job Type: Full-Time

About the Role :
We are seeking a highly skilled ServiceNow ITOM Developer with a strong background in Event Management. The ideal candidate will have at least 4 years of hands-on experience in ServiceNow ITOM (IT Operations Management) modules, specifically focusing on Event Management and its integration with other ServiceNow modules. You will be responsible for designing, developing, and implementing ServiceNow Event Management solutions that align with our IT infrastructure and support operations.

Key Responsibilities :
- Design, develop, and implement ServiceNow Event Management solutions to meet business requirements.
- Develop and customize Event Management policies, thresholds, and alert rules for automatic event processing.
- Work closely with the ServiceNow ITOM team to integrate Event Management with other modules like Incident Management, Problem Management, Change Management, and Discovery.
- Troubleshoot and resolve issues related to ServiceNow Event Management, including event rules, alerting, and notification processes.
- Perform regular system updates, maintenance, and upgrades of the ServiceNow Event Management platform.
- Develop and maintain integrations with monitoring tools to process incoming events into ServiceNow.
- Create and maintain technical documentation for all customizations and integrations.
- Ensure that the Event Management process is optimized for efficiency, performance, and scalability.
- Collaborate with cross-functional teams to ensure effective implementation and alignment with organizational goals.
- Participate in on-call support and provide incident resolution related to ITOM and Event Management.

Requirements :
- Experience: Minimum 4 years of experience in ServiceNow ITOM, specifically in Event Management.
- ServiceNow modules: Hands-on experience in Event Management, Discovery, Service Mapping, and Orchestration.
- Strong knowledge of ServiceNow ITOM and its Event Management processes.
- Proficiency in ServiceNow scripting: Client Scripts, Business Rules, UI Actions, UI Pages, Script Includes.
- In-depth understanding of event management best practices, including event correlation, suppression, and alert rules.
- Integration experience: Knowledge of integrating ServiceNow Event Management with third-party monitoring tools (e.g., Nagios, Zabbix, SolarWinds).
- Strong experience with ServiceNow APIs, Web Services, and data integration processes.
- Familiarity with ITIL processes and CMDB concepts.
- Experience in configuring and customizing ServiceNow Event Management in an enterprise environment.
- Ability to work effectively in an agile environment and manage multiple tasks.
- Strong communication and problem-solving skills.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
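Event correlation and deduplication, mentioned in the requirements above, are configured inside ServiceNow rather than hand-coded. Purely to illustrate the concept (and not the ServiceNow API), the hypothetical Python below groups raw monitoring events into alerts by a correlation key and counts duplicates instead of raising new alerts.

```python
def correlate(events):
    """Group raw monitoring events into alerts keyed by (node, metric);
    duplicate events bump a counter and escalate severity instead of
    creating a new alert."""
    alerts = {}
    for e in events:
        key = (e["node"], e["metric"])
        if key in alerts:
            alerts[key]["count"] += 1  # deduplicate: same source, same metric
            alerts[key]["severity"] = max(alerts[key]["severity"], e["severity"])
        else:
            alerts[key] = {"count": 1, "severity": e["severity"]}
    return alerts

# Three raw events collapse into two alerts; the repeated web01 CPU
# event raises the existing alert's count and severity.
events = [
    {"node": "web01", "metric": "cpu",  "severity": 3},
    {"node": "web01", "metric": "cpu",  "severity": 4},
    {"node": "db01",  "metric": "disk", "severity": 2},
]
print(correlate(events))
```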
Posted 1 month ago
4.0 - 9.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Role: SQL Developer
Location: Bangalore
Experience: 4+ years
Employment Type: Full Time, Working Mode: Regular
Notice Period: Immediate - 15 Days

Job Summary :
As a Software Engineer / Senior Software Engineer (Database), you will play a pivotal role in designing, developing, and maintaining the database infrastructure for our core product. You will collaborate closely with the development team to ensure that our database solutions are scalable, efficient, and aligned with our business objectives.

Key Responsibilities :
1. Database Design and Development :
- Develop and implement database models, views, tables, stored procedures, and functions to support product development.
- Design and maintain SSIS packages, T-SQL scripts, and SQL jobs.
- Optimize database performance through query tuning, indexing, and partitioning.
2. Data Integration :
- Develop complex stored procedures for loading data into staging tables from various sources.
- Ensure data integrity and consistency across different systems.
3. Data Analytics :
- Collaborate with data analysts to design and implement data analytics solutions using tools like SQL Server, SSIS, SSRS, and Excel Power Pivot/View/Map.
4. Documentation :
- Document complex processes, business requirements, and specifications.
5. Database Administration :
- Provide authentication and authorization for database access.
- Develop and enforce best practices for database design and development.
- Manage database migration activities.

Required Skills :
Technical Skills :
- Strong proficiency in MS SQL Server (query tuning, stored procedures, functions, views, triggers, indexes, columnstore indexes, query execution plans).
- Experience with database design, normalization, and performance optimization.
- Knowledge of data warehousing and ETL processes.
- Experience with SSIS, SSRS, and Excel Power Pivot/View/Map.
Soft Skills :
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and as part of a team.
- Attention to detail and commitment to quality.

Benefits :
- Competitive salary and benefits package
- Opportunities for professional growth and development
- Remote work flexibility
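The staging-table load described under Data Integration above would be written as T-SQL stored procedures in production. As a self-contained stand-in, the sketch below uses Python's built-in sqlite3 to show the same pattern: land raw feed rows in a staging table, then de-duplicate on the way into the target table. Table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table mirrors the raw feed; target table holds clean rows.
cur.execute("CREATE TABLE staging_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE INDEX ix_staging_order_id ON staging_orders(order_id)")

cur.executemany(
    "INSERT INTO staging_orders VALUES (?, ?)",
    [(1, 99.5), (2, 10.0), (2, 10.0)],  # the feed contains a duplicate row
)

# Load step: collapse duplicates while moving staging rows into the target.
cur.execute("""
    INSERT INTO orders (order_id, amount)
    SELECT order_id, MIN(amount) FROM staging_orders GROUP BY order_id
""")
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 2 clean rows
```

In SQL Server the same flow would typically be an SSIS package feeding the staging table plus a stored procedure performing the de-duplicating insert.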
Posted 1 month ago
5.0 - 7.0 years
4 - 8 Lacs
Bengaluru
Work from Office
We are seeking an experienced SQL Developer with expertise in SQL Server Analysis Services (SSAS) and AWS to join our growing team. The successful candidate will be responsible for designing, developing, and maintaining SQL Server-based OLAP cubes and SSAS models for business intelligence purposes. You will work with multiple data sources, ensuring data integration, optimization, and performance of the reporting models. This role offers an exciting opportunity to work in a hybrid work environment, collaborate with cross-functional teams, and apply your skills in SSAS, SQL, and AWS to design scalable and high-performance data solutions.

Key Responsibilities :
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models, including both Multidimensional and Tabular models, to support business intelligence and reporting solutions.
- Create and manage OLAP cubes that are optimized for fast query performance and used for analytical reporting and decision-making.
- Develop and implement multidimensional and tabular data models for various business needs, ensuring the models are flexible, scalable, and optimized for reporting.
- Work on performance tuning and optimization of SSAS solutions, ensuring efficient query processing and high performance even with large data sets.
- Integrate data from various sources, including SQL Server databases, flat files, and cloud-based storage, into SSAS models for seamless and accurate reporting.
- Integrate and manage data from AWS services (e.g., S3, Redshift) into the SQL Server database and SSAS models for hybrid cloud and on-premises data solutions.
- Leverage SQL Server PolyBase to access and integrate data from external data sources like AWS S3, Azure Blob Storage, or other systems for data processing.
- Ensure data integrity, consistency, and accuracy within the data models and reporting systems. Work closely with data governance teams to maintain high-quality data standards.
- Work in an agile team environment with BI developers, data engineers, and business analysts to align data models and solutions with business requirements.
- Provide support for production systems, troubleshoot issues with SSAS models, queries, and reporting solutions, and implement fixes when necessary.
- Maintain clear and detailed technical documentation for SSAS model designs, ETL processes, and best practices for data integration.

Required Skills & Experience :
- 5+ years of experience as a SQL Developer with strong hands-on expertise in SSAS.
- In-depth experience in creating and managing SSAS models, including multidimensional (OLAP) and tabular models.
- Proficiency in SQL Server (T-SQL, SSIS, SSRS) for data integration, data transformation, and reporting.
- Strong understanding of SSAS performance tuning, query optimization, and processing.
- Experience with AWS services, particularly AWS S3 and AWS Redshift, and their integration with SQL Server-based solutions.
- Knowledge of SQL Server PolyBase for data integration and access from external data sources.
- Experience in business intelligence solutions and creating reports using tools like Power BI or SSRS.
- Familiarity with cloud data integration, ensuring seamless integration between on-premises SQL Server databases and cloud-based storage (AWS).
- Strong problem-solving skills and the ability to troubleshoot and resolve issues in data models and data warehouses.
- Excellent communication skills, both verbal and written, with the ability to effectively collaborate with cross-functional teams.
Posted 1 month ago
7.0 - 10.0 years
1 - 5 Lacs
Pune
Work from Office
Responsibilities :
- Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes.
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python.
- Work with Delta Lake and other advanced features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Develop and optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Experience working with Parquet files for data storage and processing.
- Experience with data integration from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
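A central Delta Lake operation in pipelines like those above is MERGE: update target rows whose key already exists and insert the rest. The plain-Python sketch below illustrates only the upsert semantics, not the Spark or Delta Lake API; the row shapes are hypothetical.

```python
def merge_upsert(target, updates, key="id"):
    """Plain-Python sketch of MERGE/upsert semantics as used with Delta
    tables: rows in `updates` overwrite matching keys in `target`,
    and unmatched keys are inserted."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        by_key.setdefault(row[key], {}).update(row)  # update-or-insert
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 7}]
updates = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
# id 2 is updated, id 3 is inserted, id 1 is untouched.
print(merge_upsert(target, updates))
```

On Databricks itself this would be expressed as a `MERGE INTO` statement or the Delta Lake merge builder rather than dictionary manipulation.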
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Gurugram
Work from Office
Role Description :
As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.

Responsibilities :
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests.
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL.
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.

Qualifications :
Experience :
- 5 to 8 years of experience as an Informatica PL/SQL Developer or similar role.
- Hands-on experience with data models and database performance tuning in PostgreSQL.
- Experience in implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
Posted 1 month ago
5.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About The Role :
As an SSAS Developer focused on SSAS and OLAP, you will play a critical role in our data warehousing and business intelligence initiatives. You will work closely with data engineers, business analysts, and other stakeholders to ensure the delivery of accurate and timely data insights. Your expertise in SSAS development, performance optimization, and data integration will be essential to your success.

Responsibilities :
- Design, develop, and maintain SQL Server Analysis Services (SSAS) models (multidimensional and tabular).
- Create and manage OLAP cubes to support business intelligence reporting and analytics.
- Implement best practices for data modeling and cube design.
- Optimize the performance of SSAS solutions for efficient query processing and data retrieval.
- Tune SSAS models and cubes to ensure optimal performance.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources (relational databases, flat files, APIs) into SQL Server databases and SSAS models.
- Develop and implement ETL (Extract, Transform, Load) processes for data integration.
- Ensure data quality and consistency across integrated data sources.
- Support the development of business intelligence reports and dashboards.
- Collaborate with business analysts to understand reporting requirements and translate them into SSAS solutions.
- Provide technical support and troubleshooting for SSAS-related issues.
- Preferably have knowledge of AWS S3 and SQL Server PolyBase for data integration and cloud-based data warehousing.
- Integrate data from AWS S3 into SSAS models using PolyBase or other appropriate methods.

Required Skills & Qualifications :
Experience :
- 5-8 years of experience as a SQL Developer with a focus on SSAS and OLAP.
- Proven experience in designing and developing multidimensional and tabular SSAS models.
Technical Skills :
- Strong expertise in SQL Server Analysis Services (SSAS) and OLAP cube development.
- Proficiency in writing MDX and DAX queries.
- Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Experience with SQL Server databases and related tools.
- Preferably knowledge of AWS S3 and SQL Server PolyBase.
Posted 1 month ago
5.0 - 7.0 years
10 - 14 Lacs
Gurugram
Work from Office
Role: Power BI Developer
Location: Bangalore, Pune, Gurgaon, Noida & Hyderabad
Work Model: Hybrid
Experience Range: 5+ years
Notice: Immediate

Must-have skills :
- Experience in the Investments domain.
- 5+ years' experience with data analysis and visualization.
- 5+ years' hands-on experience with Power BI development.
- 5+ years' hands-on experience with SQL/Snowflake.
- Working knowledge of Power BI Report Builder.
- Power BI modeling and SQL querying skills.
- Design Power BI models and metadata for reporting.
- Share and collaborate within the Power Platform ecosystem.
- Experienced in tuning and troubleshooting issues.
- Ability to connect to multiple data sources including file systems, databases, and cloud sources: SQL, Oracle, Snowflake, SharePoint.
- High-level understanding of Scrum methodologies, ceremonies, and metrics.
- Create technical stories and convert business needs and inputs into technical solutions and designs.
- Experience with end-to-end delivery of reporting solutions.

Good-to-have skills :
- Experience with other BI/reporting/visualization tools like Tableau or Cognos.
- Scrum / SAFe / agile methodologies; task and story estimation.
- Experience with Power Apps.
- Data integration skills.
- ETL, data mapping, massaging, and transformation.

Experience: 5-7 years
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Hubli, Mangaluru, Mysuru
Work from Office
About the Team
You will be joining the newly formed AI, Data & Analytics team, primarily responsible as a Data Engineer leading various projects within the new Data Platform team. The new team is focused on driving increased value from the data InvestCloud captures to enable a smarter financial future for our clients, with a particular focus on "enhanced intelligence". Ensuring we have fit-for-purpose modern capabilities is a key goal for the team.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines to support diverse analytics and machine learning needs.
- Optimize and manage data architectures for reliability, scalability, and performance.
- Implement and support data integration solutions from our data partners, including ETL/ELT processes, ensuring seamless data flow across platforms.
- Collaborate with data scientists, analysts, and product teams to define and support data requirements.
- Manage and maintain data platforms such as Oracle, Snowflake, and/or Databricks, ensuring high availability and performance while optimizing for cost.
- Ensure data security and compliance with company policies and relevant regulations.
- Monitor and troubleshoot data systems to identify and resolve performance issues.
- Develop and maintain datasets and data pipelines to support machine learning model training and deployment.
- Analyze large datasets to identify patterns, trends, and insights that can inform business decisions.
- Work with third-party providers of data and data platform products to evaluate and implement solutions achieving InvestCloud's business objectives.
- Lead a small team, as part of the global team, based in India, working closely with co-located data scientists as well as the broader global team.

Required Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- Minimum of 5 years of professional experience in data engineering or a related role.
- Proficiency in database technologies, including Oracle and PostgreSQL.
- Hands-on experience with Snowflake and/or Databricks, with a solid understanding of their ecosystems.
- Expertise in programming languages such as Python or SQL.
- Strong knowledge of ETL/ELT tools and data integration frameworks.
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with containerization and CI/CD tools (e.g., Docker, Git).
- Excellent problem-solving skills and the ability to handle complex datasets.
- Outstanding communication skills to collaborate with technical and non-technical stakeholders globally.
- Knowledge of data preprocessing, feature engineering, and model evaluation metrics.
- Excellent proficiency in English.
- Ability to work in a fast-paced environment across multiple projects simultaneously.
- Ability to lead a small team, ensuring a highly productive, collaborative, and positive environment.
- Ability to collaborate effectively as a team player, fostering a culture of open communication and mutual respect.

Preferred Skills
- Experience with real-time data processing and streaming platforms (e.g., Apache Kafka).
- Knowledge of data warehousing and data lake architectures.
- Familiarity with governance frameworks for data management and security.
- Knowledge of machine learning frameworks (TensorFlow, PyTorch, scikit-learn) and LLM frameworks (e.g., LangChain).

What do we offer
Join our diverse and international cross-functional team, comprising data scientists, product managers, business analysts, and software engineers. As a key member of our team, you will have the opportunity to implement cutting-edge technology to create a next-generation advisor and client experience.

Location and Travel
The ideal candidate will be expected to work from the office (with some flexibility). Occasional travel may be required.
Posted 1 month ago
8.0 - 13.0 years
20 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Zetwerk is looking for an Architect & Data Governance specialist to join our dynamic team and embark on a rewarding career journey. Responsibilities include: collaborating with clients, engineers, and other stakeholders to determine project requirements and goals; developing and presenting design concepts, plans, and models to clients for approval; conducting site surveys and analyzing data to determine the best design solutions for a particular location and purpose; preparing detailed drawings and specifications; staying current with relevant building codes, regulations, and industry trends; managing budgets, schedules, and other project-related activities; and ensuring that projects are completed within budget, on time, and to the satisfaction of clients and stakeholders. An Architect must possess a combination of technical, creative, and interpersonal skills.
Posted 1 month ago
6.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Diverse Lynx is looking for a Datastage Developer to join our dynamic team and embark on a rewarding career journey. Responsibilities include: analyzing business requirements and translating them into technical specifications; designing and implementing data integration solutions using Datastage; extracting, transforming, and loading data from various sources into target systems; developing and testing complex data integration workflows, including the use of parallel processing and data quality checks; collaborating with database administrators, data architects, and stakeholders to ensure the accuracy and consistency of data; monitoring performance and optimizing Datastage jobs to ensure they run efficiently and meet SLAs; and troubleshooting issues and resolving problems related to data integration. Candidates need knowledge of data warehousing, data integration, and data processing concepts; strong problem-solving skills and the ability to think creatively and critically; and excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.
Posted 1 month ago
4.0 - 5.0 years
6 - 10 Lacs
Kochi, Bengaluru
Work from Office
4+ years' experience. Work from office: 1st preference Kochi, 2nd preference Bangalore. Good experience in any ETL tool. Good knowledge of Python. Integration experience. Good attitude and cross-skilling ability.
Posted 1 month ago
0.0 - 2.0 years
4 - 7 Lacs
Navi Mumbai
Work from Office
Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company’s success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you. Overview Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Responsibilities Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools (dbt, Azure Data Factory, SSIS); Design, develop, enhance and support business intelligence systems primarily using Microsoft Power BI; Collect, analyze and document user requirements; Participate in software validation process through development, review, and/or execution of test plan/cases/scripts; Create software applications by following software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; Communicate with team members regarding projects, development, tools, and procedures; and Provide end-user support including setup, installation, and maintenance for applications. Qualifications Bachelor's Degree in Computer Science, Data Science, or a related field; 5+ years of experience in Data Engineering; Knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake schema designs; Solid ETL development, reporting knowledge based on an intricate understanding of business processes and measures; Knowledge of Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred; Knowledge of Python is preferred; Knowledge of REST APIs; Basic knowledge of SQL Server databases is required; Knowledge of C#, Azure development is a bonus; and Excellent analytical, written and oral communication skills. People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we’ve done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas.
The work we do today will improve the lives of people living with illness and disease in the future. Medpace Perks Flexible work environment Competitive compensation and benefits package Competitive PTO packages Structured career paths with opportunities for professional growth Company-sponsored employee appreciation events Employee health and wellness initiatives Awards Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024 Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility What to Expect Next A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps. EO/AA Employer M/F/Disability/Vets
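The Star Schema knowledge asked for in the qualifications above comes down to one recurring move: resolving a business key from staging against a dimension's surrogate key during the fact load. A minimal sketch, with all table and column names invented for illustration:

```python
# Minimal star-schema fact-load sketch. Names (customer_code, customer_sk,
# amount) are hypothetical, not taken from any specific warehouse.

def build_dim_lookup(dim_rows):
    """Map a dimension's business key to its surrogate key."""
    return {row["customer_code"]: row["customer_sk"] for row in dim_rows}

def load_fact(staging_rows, dim_lookup, unknown_sk=-1):
    """Resolve surrogate keys for staged rows; misses get an 'unknown' member."""
    return [
        {"customer_sk": dim_lookup.get(row["customer_code"], unknown_sk),
         "amount": row["amount"]}
        for row in staging_rows
    ]

dim = [{"customer_code": "C001", "customer_sk": 1},
       {"customer_code": "C002", "customer_sk": 2}]
staging = [{"customer_code": "C001", "amount": 100.0},
           {"customer_code": "C999", "amount": 5.0}]  # key absent from dimension

facts = load_fact(staging, build_dim_lookup(dim))
print(facts)
```

Routing unmatched keys to a dedicated "unknown" dimension member (here `-1`) rather than dropping the row is a common dimensional-modeling convention.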
Posted 1 month ago
0.0 - 1.0 years
3 - 6 Lacs
Navi Mumbai
Work from Office
Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company’s success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you. Overview Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.
Responsibilities Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools (dbt, Azure Data Factory, SSIS); Design, develop, enhance and support business intelligence systems primarily using Microsoft Power BI; Collect, analyze and document user requirements; Participate in software validation process through development, review, and/or execution of test plan/cases/scripts; Create software applications by following software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; Communicate with team members regarding projects, development, tools, and procedures; and Provide end-user support including setup, installation, and maintenance for applications. Qualifications Bachelor's Degree in Computer Science, Data Science, or a related field; 3+ years of experience in Data Engineering; Knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake schema designs; Solid ETL development, reporting knowledge based on an intricate understanding of business processes and measures; Knowledge of Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred; Knowledge of Python is preferred; Knowledge of REST APIs; Basic knowledge of SQL Server databases is required; Knowledge of C#, Azure development is a bonus; and Excellent analytical, written and oral communication skills. People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we’ve done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas.
The work we do today will improve the lives of people living with illness and disease in the future. Medpace Perks Flexible work environment Competitive compensation and benefits package Competitive PTO packages Structured career paths with opportunities for professional growth Company-sponsored employee appreciation events Employee health and wellness initiatives Awards Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024 Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility What to Expect Next A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps. EO/AA Employer M/F/Disability/Vets
Posted 1 month ago
6.0 - 11.0 years
15 - 25 Lacs
Chennai
Work from Office
Hi, wishes from GSN! Pleasure connecting with you! We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT / non-IT clients in India, and we have been successfully delivering on our clients' needs for the last 20 years. At present, GSN is hiring ETL - Oracle ODI developers for one of our leading MNC clients. Please find below the details: ******* Looking for SHORT JOINERS ******* WORK LOCATION: CHENNAI Job Role: ETL ODI Developer EXPERIENCE: 5+ yrs CTC Range: 15 LPA to 25 LPA Work Type: WFO Mandatory Skills: Hands-on development experience in ETL using ODI 11G/12C is a MUST. Proficient in data migration techniques and data integration. Oracle SQL and PL/SQL programming experience. Experience in Data Warehouses and/or Data Marts. ******* Looking for SHORT JOINERS ******* Appreciate your valuable references, if any. Thanks & Regards, Sathya K, GSN Consulting. Mob: 8939666794. Mail ID: sathya@gsnhr.net; Web: https://g.co/kgs/UAsF9W
Posted 1 month ago
6.0 - 8.0 years
12 - 16 Lacs
Mumbai, Navi Mumbai
Work from Office
Duration: 6 Months. Notice Period: Immediate Joiners Only. Job Description: Hands-on experience in Salesforce development & customization. Strong knowledge of Apex, Visualforce & Lightning Web Components (LWC). Skilled in data integration and migration with a focus on data integrity. Driving Results: Capable of working independently and in a team. Flexible approach based on project needs. Proactively identify and communicate issues or risks. Personal Traits: Self-driven, dynamic, and adaptive. Comfortable with ambiguity and change. Strong collaboration and analytical skills. Confident yet humble, with a mindset for continuous learning.
Posted 1 month ago
7.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Collibra Certified Rangers (preferable). Analyze current architecture, data flow, data dependencies, and related documentation. Develop Collibra integration solutions. Trace data from source system, across the various contact points of the data landscape, to the final destination system. Use Lineage Harvester and adhere to industry best practices. Experience in metadata extraction and building business and technical lineage among assets. Design, develop, and test Collibra integrations and workflows. Take advantage of the depth and breadth of integrations to connect the data ecosystem to the Collibra Data Intelligence Platform. Access Collibra-supported integrations, partner and other pre-built integrations, and APIs to gain visibility. Understand and share insights. Understand Data Governance, Metadata Management, Reference Data Management, Data Modeling, Data Integration, and Data Analysis. Work as a solution provider. Collibra API development.
Posted 1 month ago
7.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Ab Initio: Graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis. Database: proficient in SQL; expert in DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred). UNIX: shell scripting (must); Unix utilities such as sed, awk, perl, python. Informatica IICS: good experience designing and developing ETL mappings using IICS. Should be familiar with bulk loading concepts, Change Data Capture (CDC), data profiling, and data validation concepts. Should have prior experience working with different types of data sources/targets. Understanding of configuration, migration, and deployment of ETL mappings. Teradata: assist in the design, development, and testing of Teradata databases and ETL processes to support data integration and reporting. Collaborate with data analysts and other team members to understand data requirements and provide solutions. DataStage: overall experience of 5 years in DW/BI technologies and minimum 5 years of development experience in the ETL DataStage 8.x/9.x tool. Worked extensively with parallel jobs, sequences, and preferably routines. Good conceptual knowledge of data warehousing and various methodologies.
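Change Data Capture, listed among the IICS concepts in the posting above, can be illustrated at its simplest as a full-snapshot comparison. (Production CDC tools usually read database transaction logs instead; the key and field names here are invented.)

```python
# Hedged CDC sketch: classify rows as inserts, updates, or deletes by
# comparing two snapshots of a table keyed on a hypothetical "id" column.

def cdc_diff(previous, current, key="id"):
    """Return (inserts, updates, deletes) between two row snapshots."""
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    inserts = [curr[k] for k in curr.keys() - prev.keys()]
    deletes = [prev[k] for k in prev.keys() - curr.keys()]
    updates = [curr[k] for k in curr.keys() & prev.keys() if curr[k] != prev[k]]
    return inserts, updates, deletes

before = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
after = [{"id": 2, "name": "B"}, {"id": 3, "name": "c"}]  # 1 deleted, 2 changed, 3 new
ins, upd, dels = cdc_diff(before, after)
print(len(ins), len(upd), len(dels))
```

Snapshot diffing is simple but scans everything; log-based CDC avoids that cost, which is why tools expose it as a first-class feature.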
Posted 1 month ago
10.0 - 20.0 years
10 - 19 Lacs
Karur
Remote
Greetings from MindPro Technologies Pvt Ltd (www.mindprotech.com). Job Description for Informatica Lead Position. Experience Required: 10+ Years. Mode: Remote. Key Responsibilities Design & Development Lead the design, development, and implementation of ETL processes using Informatica products. Create, optimize, and maintain mappings, sessions, and workflows to ensure high performance and reliability. API Data Integration Coordinate with development teams to design and manage data ingestion from APIs into the Informatica environment. Develop strategies for real-time or near-real-time data processing, ensuring secure and efficient data flow. Collaborate on API specifications, error handling, and data validation requirements. Data Integration & Warehousing Integrate data from diverse sources (e.g., relational databases, flat files, cloud-based systems, APIs) into target data warehouses or data lakes. Ensure data quality by implementing best practices, validation checks, and error handling. Project Leadership Provide technical oversight and guidance to the ETL development team. Work with and expand development standards, processes, and coding practices to maintain a consistent and high-quality codebase. Coordinate with product owners, project managers, and onshore teams to track progress and meet milestones. Solution Architecture Work with business analysts and stakeholders to gather requirements and translate them into technical solutions. Propose improvements, optimizations, and best practices to enhance existing data integration solutions. Performance Tuning & Troubleshooting Identify bottlenecks in mappings or workflows; recommend and implement performance tuning strategies. Troubleshoot ETL or other data ingestion related issues, perform root cause analysis, and provide solutions in a timely manner. Collaboration & Communication Collaborate with onshore business and technical teams to ensure smooth project execution and knowledge sharing.
Communicate project status, potential risks, and technical details to stakeholders and leadership. Participate in regular meetings with onshore teams, aligning on priorities and resolving issues. Documentation & Reporting Maintain comprehensive technical documentation including design specifications, test cases, and operational procedures. Generate reports or dashboards as needed to keep stakeholders informed of project progress and data pipeline performance.
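The API data-integration responsibilities above (ingestion, pagination, error handling, validation) can be sketched as follows. `fetch_page` is a stand-in stub for a real HTTP call, and the payload shape (`records`/`has_more`) is an assumption for illustration:

```python
# Hedged sketch of paginated API ingestion into a staging list.
# fetch_page stands in for a real HTTP client call (urllib, requests, etc.).

def fetch_page(page):
    """Stub: a real implementation would issue an HTTP GET here."""
    data = {1: [{"id": 1}, {"id": 2}], 2: [{"id": 3}]}
    return {"records": data.get(page, []), "has_more": page < 2}

def ingest_all(fetch, max_pages=100):
    """Pull pages until the source reports no more data, validating each record."""
    staged, page = [], 1
    while page <= max_pages:
        body = fetch(page)
        for rec in body["records"]:
            if "id" not in rec:          # basic validation before staging
                raise ValueError(f"record missing id on page {page}")
            staged.append(rec)
        if not body["has_more"]:
            break
        page += 1
    return staged

rows = ingest_all(fetch_page)
print(len(rows))
```

A production version would add retries with backoff and checkpoint the last page ingested, so a failed run can resume rather than restart.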
Posted 1 month ago
6.0 - 8.0 years
0 - 0 Lacs
Bengaluru
Remote
Azure DE Primary Responsibilities: Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Create data models for analytics purposes. Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Use Azure Data Factory and Databricks to assemble large, complex data sets. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Ensure data security and compliance. Collaborate with data engineers and other stakeholders to understand requirements and translate them into scalable and reliable data platform architectures. Required skills: a blend of technical expertise, analytical problem-solving, and collaboration with cross-functional teams; Azure DevOps; Apache Spark; Python; SQL proficiency; Azure Databricks knowledge; big data technologies. Data engineers should be well versed in coding, Spark core, and data ingestion using Azure, should have good communication skills, and should have core Azure DE and coding skills (PySpark, Python, and SQL).
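The validation-and-cleansing step described above can be sketched in plain Python; the column name and rule are illustrative assumptions, and in Databricks the same pattern would typically be expressed over Spark DataFrames:

```python
# Minimal validate-and-cleanse sketch: good rows pass through normalized,
# bad rows are quarantined for review. "email" and its rule are hypothetical.

def cleanse(rows):
    """Split rows into (clean, quarantined); clean rows get a normalized email."""
    clean, quarantine = [], []
    for row in rows:
        email = (row.get("email") or "").strip().lower()
        if "@" not in email:             # reject rows failing the rule
            quarantine.append(row)
            continue
        clean.append({**row, "email": email})
    return clean, quarantine

raw = [{"id": 1, "email": "  A@Example.com "},
       {"id": 2, "email": "not-an-email"},
       {"id": 3, "email": None}]
clean, bad = cleanse(raw)
print(len(clean), len(bad))
```

Quarantining instead of silently dropping keeps failures auditable; a real pipeline would land the bad rows in an error table with the rule that rejected them.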
Posted 1 month ago
5.0 - 8.0 years
16 - 20 Lacs
Pune
Work from Office
We are looking for an Anaplan Model Builder/Architect to design, build, and support financial, sales, and service planning models using the Anaplan platform. This is an opportunity to work in a fast-paced, challenging environment managing critical data for performance insights. Key Responsibilities Design, develop, and maintain Anaplan models aligned to business requirements. Collaborate with end users, consultants, and third-party vendors to support the Anaplan platform. Ensure high-quality delivery of finance/corporate system applications. Analyze existing model performance and implement enhancements. Translate complex problems into streamlined, efficient modeling solutions. Work with agile, cross-functional, and geographically dispersed teams. Manage and deliver user stories and requirements within sprint timelines. Requirements: 4+ years of Anaplan experience; Level 3 Anaplan Model Builder certification; data visualization on the Anaplan cloud-based platform; basic data integration and scripting concepts. Good to have: Anaplan Solution Architect experience; ETL technologies.
Posted 1 month ago
5.0 - 10.0 years
8 - 14 Lacs
Hyderabad
Work from Office
Job Title: Azure Synapse Developer. Position Type: Permanent. Experience: 5+ Years. Location: Hyderabad (Work From Office / Hybrid). Shift Timings: 2 PM to 11 PM. Mode of Interview: 3 rounds (Virtual/In-person). Notice Period: Immediate to 15 days. Job Description: We are looking for an experienced Azure Synapse Developer to join our growing team. The ideal candidate should have a strong background in Azure Synapse Analytics, SSRS, and Azure Data Factory (ADF), with a solid understanding of data modeling, data movement, and integration. As an Azure Synapse Developer, you will work closely with cross-functional teams to design, implement, and manage data pipelines, ensuring the smooth flow of data across platforms. The candidate must have a deep understanding of SQL and ETL processes, and ideally, some exposure to Power BI for reporting and dashboard creation. Key Responsibilities : - Develop and maintain Azure Synapse Analytics solutions, ensuring scalability, security, and performance. - Design and implement data models for efficient storage and retrieval of data in Azure Synapse. - Utilize Azure Data Factory (ADF) for ETL processes, orchestrating data movement, and integrating data from various sources. - Leverage SSIS/SSRS/SSAS to build, deploy, and maintain data integration and reporting solutions. - Write and optimize SQL queries for data manipulation, extraction, and reporting. - Collaborate with business analysts and other stakeholders to understand reporting needs and create actionable insights. - Perform performance tuning on SQL queries, pipelines, and Synapse workloads to ensure high performance. - Provide support for troubleshooting and resolving data integration and performance issues. - Assist in setting up automated data processes and create reusable templates for data integration. - Stay updated on Azure Synapse features and tools, recommending improvements to the data platform as appropriate. 
Required Skills & Qualifications : - 5+ years of experience as a Data Engineer or Azure Synapse Developer. - Strong proficiency in Azure Synapse Analytics (Data Warehouse, Data Lake, and Analytics). - Solid understanding and experience in data modeling for large-scale data architectures. - Expertise in SQL for writing complex queries, optimizing performance, and managing large datasets. - Hands-on experience with Azure Data Factory (ADF) for data integration, ETL processes, and pipeline creation. - SSRS (SQL Server Reporting Services) and SSIS (SQL Server Integration Services) expertise. - Power BI knowledge (basic to intermediate) for reporting and data visualization. - Familiarity with SSAS (SQL Server Analysis Services) and OLAP concepts is a plus. - Experience in troubleshooting and optimizing complex data processing tasks. - Strong communication and collaboration skills to work effectively in a team-oriented environment. - Ability to quickly adapt to new tools and technologies in the Azure ecosystem.
Posted 1 month ago