3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake. Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities, as will working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments. You will optimize resource utilization, monitor workloads, and manage the cost-effectiveness of the platform, staying current with Snowflake features, cloud vendor offerings, and best practices to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.
Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
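The masking-policy responsibility mentioned in this posting can be illustrated with a small sketch. The snippet below builds the kind of `CREATE MASKING POLICY` statement Snowflake accepts for dynamic data masking; the policy and role names are illustrative assumptions, not part of the listing.

```python
# Sketch: generate a Snowflake dynamic-masking-policy statement in Python.
# Policy name, role name, and mask text are invented for illustration.

def masking_policy_ddl(policy: str, allowed_role: str, mask: str = "***MASKED***") -> str:
    """Return DDL for a simple string masking policy: the allowed role
    sees the raw value; every other role sees a fixed mask."""
    return (
        f"CREATE OR REPLACE MASKING POLICY {policy} AS (val STRING) "
        f"RETURNS STRING ->\n"
        f"  CASE WHEN CURRENT_ROLE() IN ('{allowed_role}') THEN val "
        f"ELSE '{mask}' END"
    )

ddl = masking_policy_ddl("email_mask", "PII_ANALYST")
print(ddl)
```

In practice the generated statement would be executed once per sensitive column type and then attached to columns with `ALTER TABLE ... SET MASKING POLICY`.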
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PwC, our team in infrastructure is dedicated to designing and implementing secure and robust IT systems that facilitate business operations. We focus on ensuring the smooth functioning of networks, servers, and data centers to enhance performance and reduce downtime. As part of the infrastructure engineering team at PwC, your role will involve creating and implementing scalable technology infrastructure solutions for our clients, encompassing tasks such as network architecture, server management, and cloud computing.

We are currently seeking a Data Modeler with a solid background in data modeling, metadata management, and optimizing data systems. In this role, you will be responsible for analyzing business requirements, developing long-term data models, and maintaining the efficiency and consistency of our data systems.

Key Responsibilities:
- Analyze business needs and translate them into long-term data model solutions.
- Evaluate existing data systems and suggest enhancements.
- Define rules for data translation and transformation across different models.
- Collaborate with the development team to design conceptual data models and data flows.
- Establish best practices for data coding to ensure system consistency.
- Review modifications to existing systems to ensure cross-compatibility.
- Implement data strategies and create physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to improve system efficiency.
- Evaluate implemented data systems for discrepancies, variances, and efficiency.
- Troubleshoot and optimize data systems to achieve optimal performance.

Key Requirements:
- Proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Strong skills in SQL and database management systems like Oracle, SQL Server, MySQL, and PostgreSQL.
- Familiarity with NoSQL databases such as MongoDB and Cassandra, including their data structures.
- Hands-on experience with data warehouses and BI tools like Snowflake, Redshift, BigQuery, Tableau, and Power BI.
- Knowledge of ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or related areas.
- 4+ years of practical experience in dimensional and relational data modeling.
- Expertise in metadata management and relevant tools.
- Proficiency in data modeling tools like Erwin, PowerDesigner, or Lucid.
- Understanding of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions, such as AWS, Azure, and GCP.
- Knowledge of big data technologies like Hadoop, Spark, and Kafka.
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Strong communication and presentation skills.
- Excellent interpersonal skills to collaborate effectively with diverse teams.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As an AWS Solution Architect with 8 years of experience, your primary responsibility will be managing solution architecture, providing consulting services, overseeing software development, integration, and business processes. Your tasks will include designing, developing, and implementing scalable and reliable AWS cloud solutions in alignment with best practices. Collaboration with cross-functional teams to achieve project objectives and meet customer needs will be essential. Your technical skills should include proficiency in working with relational databases like PostgreSQL, Microsoft SQL Server, and Oracle, with experience in database schema design, optimization, and management. You must also have strong knowledge of AWS services such as S3, AWS DMS (Database Migration Service), and AWS Redshift Serverless. Setting up and managing data pipelines using AWS DMS, creating and managing data storage solutions using AWS S3, and expertise in data integration techniques and tools are required. Experience in designing and implementing ETL processes, performing data mapping, transformation, and data cleansing activities, as well as setting up and managing data warehouses, particularly AWS Redshift Serverless, is necessary. Proficiency in creating and managing views in AWS Redshift, scripting languages like Python or SQL for automating data integration tasks, and familiarity with automation tools and frameworks are essential. Your analytical and problem-solving skills will be crucial, including the ability to analyze and interpret complex data sets, identify and resolve data integration issues, and troubleshoot data integration and migration issues effectively. Strong collaboration skills to work with database administrators and stakeholders, excellent communication skills for documenting data integration processes, and adaptability to stay updated with the latest data integration tools and technologies are expected. 
Knowledge of data security and privacy regulations, ensuring adherence to data security and privacy standards during data integration processes, and proven experience in similar data integration projects are important. Familiarity with the requirements and challenges of integrating production relational databases with AWS services, along with AWS certifications such as AWS Certified Solutions Architect or AWS Certified Database - Specialty, will be advantageous. Your qualifications should include experience in solution architecture and consulting, proficiency in software development and integration, knowledge of business process optimization and automation, and preferably AWS certification or equivalent experience. This role requires a proactive individual with a strong technical background, excellent communication skills, and a collaborative mindset to deliver effective AWS cloud solutions and data integration services.
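The data mapping, transformation, and cleansing activities this posting describes follow a common ETL pattern: rename source fields to target names, normalize values, and reject records that fail basic checks. A minimal sketch, with invented field names:

```python
# Sketch of an ETL mapping/cleansing step: rename source columns to the
# warehouse schema, trim and normalize strings, and drop rows missing a
# business key. All field names here are illustrative assumptions.

FIELD_MAP = {"cust_id": "customer_id", "cust_nm": "customer_name", "amt": "amount"}

def cleanse(rows):
    out = []
    for row in rows:
        # Map source field names onto target names, passing unknowns through.
        mapped = {FIELD_MAP.get(k, k): v for k, v in row.items()}
        if not mapped.get("customer_id"):   # reject rows without a key
            continue
        if isinstance(mapped.get("customer_name"), str):
            mapped["customer_name"] = mapped["customer_name"].strip().title()
        out.append(mapped)
    return out

rows = [
    {"cust_id": "C1", "cust_nm": "  alice smith ", "amt": 100},
    {"cust_id": None, "cust_nm": "ghost", "amt": 5},
]
print(cleanse(rows))
```

In a real AWS pipeline this logic would typically live inside a Glue job or a transformation rule in AWS DMS rather than standalone Python.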
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Senior Power BI Consultant at MathCo, you will be responsible for leading the design, development, and deployment of impactful dashboards and data visualizations. Your role will require a deep understanding of Power BI capabilities, strong data modeling skills, and the ability to translate business requirements into actionable insights. Working across cross-functional teams, you will contribute to strategic initiatives while ensuring high standards of delivery and performance. Your key responsibilities will include leading the end-to-end development of Power BI dashboards, ensuring visual appeal, functionality, and business relevance. You will gather and interpret reporting requirements from stakeholders, architect data models, and design dashboard solutions accordingly. Leveraging advanced Power BI capabilities such as Power Query, DAX, M language, and API integrations, you will build robust reports while optimizing dashboards for performance and user experience. In addition, you will play a crucial role in ensuring strong data governance and security within Power BI by managing user access and role-based permissions. Collaboration with data engineering and business teams to manage source data connections and ensure data quality will be essential. You will also drive technical discussions, provide consulting expertise, and offer recommendations aligned with business objectives. To excel in this role, you should have 4-6 years of hands-on experience with Power BI, including proficiency in complex DAX queries and Power Query transformations. Strong SQL skills and a solid understanding of data warehousing, BI concepts, and dashboard performance optimization are also required. Experience with data integration from multiple sources, handling complex data relationships, and designing intuitive and interactive dashboards for business storytelling is essential. 
Certifications in Power BI or related visualization tools, familiarity with agile methodologies, and experience working in a fast-paced consulting environment are preferred. Exposure to big data technologies, modern data stacks, DevOps practices, CI/CD pipelines, and version control in analytics workflows will be advantageous. High emotional intelligence, cultural adaptability, and effective communication skills are key attributes for success in this role. As a MathCo team member, you are expected to embody MathCo's culture and way of working, demonstrate ownership, strive for excellence in delivering results, actively engage in initiatives fostering company growth, and support diversity while appreciating different perspectives. Join us at MathCo to leverage your expertise, drive innovation, and "Leave a Mark" as a Mathemagician in the world of Enterprise AI and Analytics.
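The DAX measures this role calls for are, at bottom, aggregations over a model's tables. As a rough analogue, a measure like "total sales by region" can be expressed in plain Python to show the computation a simple `SUM`-style measure performs; the data and column names below are invented.

```python
from collections import defaultdict

# Pure-Python analogue of a simple DAX SUM measure sliced by a column.
# The rows and column names are invented for illustration only.
sales = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 50.0},
]

def total_by(rows, key, value="amount"):
    """Sum `value` grouped by `key`, like a measure evaluated per slicer value."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

print(total_by(sales, "region"))
```

Real DAX adds filter context on top of this (e.g. `CALCULATE` modifying which rows participate), which is where most dashboard complexity lives.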
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
nagpur, maharashtra
On-site
The Data & Analytics Team is looking for a Data Engineer with a hybrid skillset in data integration and application development. In this role, you will play a crucial part in designing, engineering, governing, and enhancing our entire Data Platform. This platform serves customers, partners, and employees by providing self-service access. You will showcase your expertise in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles. To be successful in this role, you should have at least 5 years of experience in system/data integration, development, or implementation of enterprise and/or cloud software. You must have strong experience with Web APIs (RESTful and SOAP) and be proficient in setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud). Demonstrated proficiency in Python, data wrangling, and query authoring in SQL and NoSQL environments is essential. Experience in a cloud-based computing environment, specifically GCP, is preferred. You should also excel in creating Business Requirement, Functional & Technical documentation, writing Unit & Functional Test Cases, Test Scripts & Run Books, and working with incident management systems like Jira, ServiceNow, etc. Working knowledge of Agile Software development methodology is required. As a Data Engineer, you will be responsible for leading system/data integration, development, or implementation efforts for enterprise and/or cloud software. You will design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes. Performing extensive data wrangling and authoring complex queries in both SQL and NoSQL environments for structured and unstructured data will be part of your daily tasks. You will develop and integrate applications, leveraging strong proficiency in Python and Web APIs (RESTful and SOAP).
Providing operational support for the data platform and applications, including incident management, will also be a key responsibility. Additionally, you will create comprehensive Business Requirement, Functional, and Technical documentation, develop Unit & Functional Test Cases, Test Scripts, and Run Books, and manage incidents effectively using systems like Jira, ServiceNow, etc. At GlobalLogic, we prioritize a culture of caring. We put people first, offering an inclusive culture where you can build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development, providing various opportunities to grow personally and professionally. You'll have the chance to work on interesting and meaningful projects that make an impact. We believe in balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a perfect work-life balance. Join us in a high-trust organization where integrity is key, and trust is a cornerstone of our values to employees and clients. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest companies, helping create innovative digital products and experiences. You'll have the opportunity to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
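The Web API integration and data wrangling work described here usually boils down to flattening nested API payloads into tabular records a warehouse can load. A minimal sketch, with an invented payload shape:

```python
import json

# Sketch: flatten a nested REST API payload into flat records ready for
# warehouse loading. The payload shape and field names are invented.
payload = json.loads("""
{"orders": [
  {"id": 1, "customer": {"name": "Acme"}, "total": 250.0},
  {"id": 2, "customer": {"name": "Globex"}, "total": 99.5}
]}
""")

def flatten_orders(doc):
    """Pull nested fields up into one flat dict per order."""
    return [
        {"order_id": o["id"],
         "customer_name": o["customer"]["name"],
         "total": o["total"]}
        for o in doc["orders"]
    ]

records = flatten_orders(payload)
print(records)
```

In production the `payload` would come from an authenticated HTTP call and the records would be written to a staging table rather than printed.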
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As the MLOps Engineering Director at Horizontal Data Science Enablement Team within SSO Data Science, you will play a crucial role in managing the Databricks platform for the entire organization and leading best practices in MLOps. Your responsibilities will include overseeing the administration, configuration, and maintenance of Databricks clusters and workspaces. You will continuously monitor the clusters for high workloads or excessive usage costs, ensuring the overall health of the clusters and addressing any issues promptly. Implementing and managing security protocols to safeguard sensitive information and facilitating the integration of various data sources into Databricks will be key aspects of your role. Collaborating closely with data engineers, data scientists, and stakeholders, you will provide support for data processing and analytics needs. Maintaining comprehensive documentation of Databricks configurations, processes, and best practices, as well as leading participation in security and architecture reviews, will be part of your responsibilities. Additionally, you will bring MLOps expertise to the table by focusing on areas such as model monitoring, feature catalog/store, model lineage maintenance, and CI/CD pipelines. To excel in this role, you should possess a Master's degree in computer science or a related field, along with strong experience in Databricks management, cloud technologies, and MLOps solutions like MLFlow. Your background should include hands-on experience with industry-standard CI/CD tools, data governance processes, and coding proficiency in languages such as Python, Java, and C++. A systematic problem-solving approach, excellent communication skills, and a sense of ownership and drive are essential qualities for success in this position. Moreover, your ability to set yourself apart will be demonstrated through your experience in SQL tuning, automation, data observability, and supporting highly scalable systems. 
Experience operating in a 24x7 environment, self-motivation, creativity in solving software problems, and the ability to ensure system availability across global time zones will further enhance your profile for this role. In alignment with Mastercard's corporate security responsibility, you will be expected to adhere to security policies and practices, maintain the confidentiality and integrity of accessed information, report any security violations, and complete mandatory security trainings. By taking on this role, you will contribute to ensuring the efficiency and security of Mastercard's data science operations.
Posted 1 month ago
13.0 - 17.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a highly skilled and experienced Business Analytics and Insights Manager to drive data-driven business strategies, empower decision-making, and enhance customer experiences for our enterprise products. Your role will involve leveraging advanced analytics, predictive modeling, and data integration to provide actionable insights aligned with business objectives. You will be responsible for predicting business trends, forecasting performance, and providing data-backed recommendations to improve revenue attainment, customer retention, and sales performance. Additionally, you will lead efforts to establish seamless integrations between data sources, data catalogue systems, and analytics tools to drive insights and innovation.

You will be expected to:
- Deliver actionable insights and dashboards using tools like Power BI, KNIME, Tableau, and similar analytics platforms
- Communicate insights to Success Managers, Product Leaders, and Customer Experience Executives through engaging visualizations and data stories
- Support leadership with data-driven recommendations to enhance decision-making and strategic planning

You will also:
- Analyze historical and real-time data to develop predictive models and forecasts for business value growth, revenue attainment, and customer experience improvement
- Identify renewal trends, churn risks, and upsell/cross-sell opportunities through data-driven insights
- Provide forward-looking recommendations to mitigate risks and optimize performance

Furthermore, you will:
- Establish and manage integrations across data sources, ensuring a unified data view
- Build, maintain, and optimize data pipelines, warehouses, and catalog systems for scalable analytics
- Implement robust data governance frameworks to ensure data integrity, accuracy, and security

You will leverage advanced analytics platforms and AI/ML capabilities to generate predictive and prescriptive insights. Your role will involve developing forecasting models to anticipate business trends, revenue outcomes, and customer behavior, as well as driving corrective and preventive strategies based on analytics. Collaboration with cross-functional teams to understand business challenges, providing strategic insights to senior leadership, and fostering a culture of data-driven decision-making will be key aspects of your role.

Qualifications and Skills:
- Education: Bachelor's or Master's degree in Statistics, Computer Science, Data Science, Business Analytics, or a related field
- Experience: 13-15 years in data analytics, business intelligence, or related domains, with expertise in predictive analytics and forecasting, preferably in a B2B SaaS environment
- Technical Expertise: Proficiency in analytics and visualization tools, expertise in AI/ML platforms, and strong understanding of data management and governance frameworks
- Functional Knowledge: Deep understanding of customer lifecycle management, business performance metrics, and ability to translate data into actionable insights for strategy and performance improvement
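The forecasting work this role describes starts, in its simplest form, with fitting a trend to historical values and projecting forward. A minimal sketch of a least-squares trend forecast; real revenue forecasting would use a proper time-series library, and the numbers here are invented:

```python
# Minimal sketch of trend forecasting: fit a least-squares line to a
# series of monthly values and project the next period. Input data is
# invented for illustration.

def forecast_next(series):
    """Fit y = intercept + slope*t by least squares, return value at t = n."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return intercept + slope * n   # projected value for the next time step

revenue = [100.0, 110.0, 120.0, 130.0]
print(forecast_next(revenue))   # a perfectly linear series projects to 140.0
```

Churn-risk and upsell models build on the same idea but replace the linear trend with classifiers or survival models over many features.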
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
Greetings from Teknikoz. With over 6 years of experience, you should possess mandatory skills in Power BI, Tableau, BI Solutions Architecture, and Dimensional Data Modeling. Your responsibilities will include proven experience working with SAP HANA Cloud, including HANA Cloud services like HANA Database XS and HANA Application Cloud. You must have hands-on experience in SAP Business Application Studio (BAS) for SAP Cloud Platform development. Additionally, a strong background in creating and optimizing calculation views, stored procedures, and SQL-based solutions is required. You should have in-depth understanding and experience with Smart Data Integration (SDI) for data integration and transformation. Familiarity with Cloud Connector for secure integration of on-premise systems with SAP Cloud solutions is crucial. Knowledge about database triggers, constraints, and maintaining data integrity is necessary. Solid understanding of SAP HANA development concepts and best practices is expected. Experience with SAP SDI and cloud-based data connectivity is preferred. Proficiency in version control, CI/CD pipelines, and deployment processes in cloud environments is an advantage. Excellent analytical and problem-solving skills are essential. Being a team player with excellent communication skills is key to success in this role.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms, Celonis believes in unlocking productivity through data and intelligence at the core of business processes. To achieve this goal, Celonis is seeking a Technical Solution Architect specializing in OEM & Technology Partnerships to join the Celonis Garage team. Celonis Garage, an independent research and development unit within Celonis, is dedicated to pioneering new business models, exploring emerging technologies, and developing prototypes to enhance the Celonis platform. The role of the Technical Solution Architect involves driving the successful technical implementation of Celonis within OEM partner solutions and strategic technology partnerships. The focus is on developing usable, go-to-market ready Celonis applications that accelerate adoption and growth through embedded Process Intelligence. The Technical Solution Architect will provide technical expertise during pre-sales activities, design solutions and application architectures for OEM partner integration, and develop technical presentations, demos, and proof of concepts. Additionally, the role involves guiding and supporting partners throughout the technical implementation of Celonis, providing hands-on assistance with configuration, data integration, app building, and customization of the Celonis platform. Collaboration with internal product and engineering teams, development of connectors to the Celonis platform, and support in launching go-to-market ready Celonis applications for partners' end customer base are key responsibilities of the Technical Solution Architect. The role also includes conducting initial customer POCs, evolving partner apps towards robust offerings, and aligning technical solutions with business objectives.
The ideal candidate should hold a Bachelor's degree in Computer Science or a related technical field, possess extensive experience with the Celonis platform, and have a strong technical background in data integration, API development, cloud technologies, and application development. Proficiency in Python, SQL, REST APIs, product management expertise, excellent communication skills, and the ability to work independently and as part of a team are essential qualifications for this role. Joining Celonis offers the opportunity to work with award-winning process mining technology, benefit from career growth opportunities, receive exceptional benefits, prioritize well-being, and be part of a company driven by strong values. If you are passionate about driving technical enablement and implementation supporting strategic technology partnerships, Celonis welcomes you to join their dynamic, international team in Bangalore, India.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As a BlueYonder Demand Planning professional with over 6 years of experience, you will be responsible for implementing and managing the data hub to ensure seamless integration with Blue Yonder tools. Your role will involve leading supply chain planning and inventory optimization projects, collaborating with IT and business teams to enhance data quality and efficiency, and utilizing SQL and programming skills to improve system functionalities. It will be essential for you to stay updated on the latest trends in supply chain management and data integration. You will work closely with clients to understand their challenges, opportunities, and risks, sharing industry best practices while remaining adaptable to their unique circumstances. Your responsibilities will include executing each client's design within the BY TMS application, utilizing established project management processes to track progress, communicate updates, and manage tasks effectively. Collaborating with internal and external stakeholders, you will ensure the timely completion of project deliverables and maintain strong relationships throughout the project lifecycle. With a minimum of 10 years of experience in supply chain management and data integration, you should possess a deep understanding and hands-on experience in Blue Yonder Warehouse Management and Planning modules. Proficiency in integrating JDA ESP and JDA SCPO solutions, along with experience in ERP application integrations with WMS, will be advantageous. Technical expertise in Blue Yonder supply chain planning tools, Planning and WMS platforms, and hands-on developer experience with JDA Integrator are key qualifications for this role. Your strong understanding of supply chain functions and best practices, coupled with experience in SQL and programming languages, will be valuable assets in this position. 
Familiarity with IT standards, processes, and methodologies, including Quality Assurance, Project Management Life Cycle, Software Delivery Life Cycle, and Agile methodologies, will be essential for your success in this role. As an accountable professional, you will ensure the quality, completeness, and timely delivery of assigned projects, resolving complex support issues and leading technical project teams when necessary.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The Healthcare Analytics Specialist position based in Gurugram requires 3 to 5 years of experience. You will be responsible for developing, maintaining, and optimizing data pipelines, including ingestion, transformation, and loading of internal and external sources. Collaboration with Operations is essential to design scalable, secure, and high-performing data workflows. Best practices in data governance, version control, security, and documentation must be implemented. Analytical models for cost, quality, and utilization metrics will be built and maintained using tools like Python, R, or SQL-based BI tools. Reports will be developed to communicate findings to stakeholders across the organization. Ingesting and preprocessing third-party data from various sources, ensuring compliance with transparency requirements and designing automated workflows for data validation are key responsibilities. Data quality assurance through validation checks, audits, and anomaly detection frameworks is crucial. Compliance with healthcare regulations such as HIPAA, HITECH, and data privacy requirements must be maintained. Participation in internal and external audits of data processes is expected. Continuous improvement and thought leadership are encouraged to enhance data processes, adopt new technologies, and promote a data-driven culture within the organization. Mentoring junior analysts and sharing best practices in data analytics, reporting, and pipeline development are part of the role. Required qualifications include a Bachelor's degree in health informatics, Data Science, Computer Science, Statistics, or a related field. Proficiency in data integration & ETL, databases & cloud, BI & visualization, and healthcare domain expertise is necessary. Analytical & problem-solving skills, soft skills, and strong project management abilities are also required. 
Preferred qualifications include advanced degrees, experience with healthcare cost transparency regulations, familiarity with DataOps or DevOps practices, certification in BI or data engineering, and experience in data stewardship programs & leading data governance initiatives. Joining this position offers opportunities for innovation through advanced analytics projects, growth by leading initiatives and mentoring others, and working in a supportive culture that values open communication, knowledge sharing, and continuous learning.
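The automated validation and anomaly-detection workflows this posting mentions often start with a simple robust outlier check on incoming values. A minimal sketch using the median-absolute-deviation rule, which tolerates the very outliers it is hunting (the data and threshold are invented):

```python
import statistics

# Sketch of a data-quality anomaly check: flag values far from the
# median using the robust MAD rule. Claims data and the 3.5 threshold
# are invented for illustration.

def mad_outliers(values, threshold=3.5):
    """Return values whose modified z-score exceeds the threshold."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return []   # all values identical near the median; nothing to flag
    # 0.6745 rescales MAD to be comparable to a standard deviation.
    return [v for v in values if 0.6745 * abs(v - med) / mad > threshold]

claims = [210.0, 195.0, 205.0, 199.0, 202.0, 5000.0]
print(mad_outliers(claims))
```

A plain mean/standard-deviation z-score would miss this outlier because the outlier itself inflates the standard deviation; the median-based rule does not have that weakness.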
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
Wipro Limited is a leading technology services and consulting company dedicated to creating innovative solutions that cater to the complex digital transformation needs of clients. With a comprehensive range of capabilities in consulting, design, engineering, and operations, we assist clients in achieving their most ambitious goals and establishing future-ready, sustainable businesses. Our global presence spans over 65 countries with a workforce of more than 230,000 employees and business partners, committed to supporting our customers, colleagues, and communities in adapting to an ever-evolving world. For more information, please visit our website at www.wipro.com. As a Data Engineer with a minimum of 7 years of experience, including at least 2 years of project delivery experience in Dataiku platforms, you will be responsible for configuring and optimizing Dataiku's architecture. This includes managing data connections, security settings, and workflow optimization to ensure seamless operations. Your expertise in Dataiku recipes, Designer nodes, API nodes, and Automation nodes will be instrumental in deploying custom workflows and scripts using Python. Collaboration is key in this role, as you will work closely with data analysts, business stakeholders, and clients to gather requirements and translate them into effective solutions within the Dataiku environment. Your ability to independently navigate a fast-paced environment and apply strong analytical and problem-solving skills will be crucial in meeting project timelines and objectives. Additionally, familiarity with agile development methodologies and experience with Azure DevOps for CR/Production deployment implementation are highly desirable. Join us in reinventing the digital landscape at Wipro, where we encourage constant evolution and empower individuals to shape their professional growth. We welcome applications from individuals with disabilities to contribute to our diverse and inclusive workforce.
Posted 1 month ago
6.0 - 10.0 years
35 - 37 Lacs
Bengaluru
Remote
Role & responsibilities:
• Design and implement end-to-end SAP ECC, BW, and HANA data architectures, ensuring scalable and robust solutions.
• Develop and optimize data models, ETL processes, and reporting frameworks across SAP landscapes.
• Lead integration efforts, defining and applying best practices for connecting SAP systems with external platforms and cloud services.
• Collaborate with business stakeholders to translate requirements into technical solutions, focusing on data quality and governance.
• Provide technical leadership and mentorship to project teams, ensuring alignment with enterprise integration patterns and standards.
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Noida, India
Work from Office
Key Responsibilities:
1. Architect and design end-to-end data pipelines, from source systems to the data warehouse.
2. Lead the development of scalable Python/Spark-based data processing workflows.
3. Define and implement data modeling standards for the DWH, including fact/dimension schemas and historical data handling.
4. Oversee performance tuning of Python, Spark, and ETL loads.
5. Ensure robust data integration with Tableau reporting by designing data structures optimized for BI consumption.
6. Mentor junior engineers and drive engineering best practices.
7. Work closely with business stakeholders, developers, and product teams to align data initiatives with business goals.
8. Define SLAs, error handling, logging, monitoring, and alerting mechanisms across pipelines.

Must Have:
1. Strong Oracle SQL expertise and deep Oracle DWH experience.
2. Proficiency in Python and Spark, with experience handling large-scale data transformations.
3. Experience in building batch data pipelines and managing dependencies.
4. Solid understanding of data warehousing principles and dimensional modeling.
5. Experience working with reporting tools like Tableau.
6. Good to have: experience with cloud-based DWHs (like Snowflake) for future-readiness.

Mandatory Competencies:
- ETL: DataStage, Ab Initio
- Behavioral: Communication and collaboration
- BI and Reporting Tools: Tableau
- QA/QE: QA Analytics - Data Analysis
- Database: Database Programming - SQL
- Big Data: Spark
- Programming Language: Python, Python Shell
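The fact/dimension modeling standards mentioned in the responsibilities can be illustrated with a minimal star-schema sketch. This is a toy example, not this role's actual warehouse: the table and column names are hypothetical, and SQLite stands in for the Oracle DWH named in the posting.

```python
import sqlite3

# In-memory database standing in for the warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product, keyed by a surrogate key.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per sale, referencing the dimension by surrogate key.
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical BI-style rollup: join fact to dimension, then aggregate.
cur.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""")
print(cur.fetchall())  # [('gadget', 7.5), ('widget', 15.0)]
```

The same join shape is what the posting's "data structures optimized for BI consumption" usually means: narrow fact tables aggregated against descriptive dimensions.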
Posted 1 month ago
10.0 - 15.0 years
10 - 14 Lacs
Gurugram
Work from Office
Experience: minimum 10 years; pharmaceutical domain is a must. Work mode: Hybrid/Remote.

Job Overview
We are seeking an experienced MDM (Master Data Management) Architect to design, implement, and govern an enterprise-wide MDM strategy. The ideal candidate will have deep expertise in MDM platforms, data governance, and data quality management, ensuring accurate and consistent master data across the organization.

Key Responsibilities
MDM Strategy & Design: Define and implement an enterprise MDM architecture aligned with business objectives. Design scalable and sustainable MDM solutions for domains such as Customer, Product, and Pricing data. Evaluate and recommend MDM tools (e.g., Reltio, Informatica MDM).
Implementation & Integration: Lead end-to-end MDM implementation, including data modeling, workflows, match-merge rules, and survivorship. Integrate MDM with ERP (SAP), CRM (Salesforce), and other enterprise systems. Ensure seamless data flow across transactional and analytical systems.
Data Governance & Quality: Establish data governance policies, stewardship, and ownership frameworks. Define data quality rules, KPIs, and remediation processes. Collaborate with business stakeholders to enforce data standards.
Technical Leadership: Provide thought leadership on MDM best practices and emerging trends. Mentor data teams and drive adoption of MDM principles. Troubleshoot and optimize MDM performance issues.
Compliance & Security: Ensure MDM solutions comply with global regulations (GDPR, HIPAA, etc.). Implement role-based access controls and data security measures.

Qualifications & Skills
Experience: 10+ years in MDM architecture, data management, or related roles. Hands-on experience with leading MDM tools (Reltio, Informatica MDM). Strong knowledge of data modeling, ETL, APIs, and cloud-based MDM solutions. Experience in life sciences/pharmaceuticals.
Skills: Expertise in SQL, data integration, and master data governance. Strong analytical and problem-solving abilities. Excellent communication and stakeholder management skills.
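The match-merge and survivorship concepts named in the responsibilities can be sketched in a few lines of plain Python. This is an illustrative toy, not the behavior of Reltio or Informatica MDM; the record fields, the fuzzy-match threshold, and the recency-wins survivorship rule are all assumptions.

```python
from difflib import SequenceMatcher

def is_match(a, b, threshold=0.7):
    """Toy match rule: fuzzy-compare customer names (hypothetical threshold)."""
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= threshold

def merge(records):
    """Toy survivorship rule: the most recently updated non-empty value wins per field."""
    survivor = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value:  # later (newer) records overwrite earlier ones
                survivor[field] = value
    return survivor

a = {"name": "Acme Corp", "phone": "", "updated": "2023-01-01"}
b = {"name": "ACME Corporation", "phone": "555-0101", "updated": "2024-06-01"}
if is_match(a, b):
    golden = merge([a, b])
    print(golden["name"], golden["phone"])  # ACME Corporation 555-0101
```

Production MDM platforms replace both functions with configurable rule engines (deterministic and probabilistic matching, per-attribute survivorship), but the merge-into-a-golden-record shape is the same.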
Posted 1 month ago
8.0 - 13.0 years
9 - 13 Lacs
Mumbai, India
Work from Office
At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world’s energy systems. Their spirit fuels our mission.

Teamcenter PLM Data Migration Specialist, Mumbai or Pune, Siemens Energy, Full Time

Looking for a challenging role? If you really want to make a difference, make it with us. We make real what matters.

About the role
A Teamcenter PLM Data Migration Specialist specializes in the movement of data from one system or platform to another. They understand Teamcenter PLM data structures and related PLM processes, and play a crucial role in ensuring the successful transfer and integration of data. Teamcenter PLM Data Migration Specialists are responsible for analyzing data structures, mapping data fields, validating data integrity, and implementing efficient migration strategies. The main data formats are metadata, documents, and Creo CAD data. Source data systems are SAP, PTC Windchill PDMLink, Oracle Agile CADIM, SharePoint, and file storage systems. The role includes ensuring that the target system meets the requirements of the data to be migrated; gaps will be identified and worked out to a solution with the respective system teams.

Scope of the role:
- Specializes in the movement of data from one system or platform to another
- Developing and adapting the migration tools
- Migration dry runs
- Assessment of the migration result (with the help of the respective roll-out project)
- Data cleansing (with the help of the respective roll-out project)
- Goal: identify and rectify inconsistencies, gaps, and errors within the data to meet the Teamcenter data model: in source systems, during the migration (as part of the transformation in the ETL process), and in the target system

Main responsibilities of a data migration specialist:
- Analyze data structures and the target system data model
- Analyze, prepare, and execute data cleansing services
- Analyze, specify, and document data migration requirements with clients and internal teams
- Collaborate with cross-functional teams
- Collaborate with data architects to design data migration solutions
- Conduct post-migration data validation
- Conduct testing on migrated data to ensure client requirements are fulfilled
- Coordinate with stakeholders to gather requirements
- Create spreadsheets or use other data analysis tools with large numbers of figures without mistakes
- Develop and execute test plans on migrated data
- Develop templates for data migration objects that can be leveraged for multiple rollouts
- Document data migration processes and procedures
- Extract, transform, and load (ETL) data as part of the migration process
- Handle escalated client complaints and concerns as needed; bug/issue tracking for data subjects
- Identify and mitigate data risks
- Identify data migration requirements
- Maintain appropriate levels of data security and privacy relating to customer data
- Maintain data migration documentation
- Map data fields between source and target systems
- Optimize data migration processes for efficiency within the data and migration team
- Profile data results from legacy data sources
- Troubleshoot and resolve data migration issues

We don’t need superheroes, just super minds.
- BE or BTech degree in Information Technology or Computer Science
- Minimum 8+ years of support experience in Teamcenter 11.5 or higher; administration of the Teamcenter application / 4-tier architecture, Teamcenter deployments, and code quality reviews
- Ability to create deployment scripts on Windows-based client/server platforms
- Knowledge of Windows batch, PowerShell, and Python scripting languages is essential
- Good understanding of Active Workspace, T4x, and SWIM, in addition to Teamcenter server and client administration
- Good understanding of the Teamcenter data model, including BMIDE code-full and codeless customization
- Well versed with Teamcenter modules (Query Builder, Structure Manager, Multi-BOM Manager, PLMXML, Workflow Designer, Product Configurator)
- Installation, configuration, administration, and maintenance of Teamcenter, including integration deployments
- Ability to document architecture / technical specifications
- Exposure to migration projects / rollouts
- Highly proficient in server-side customization (ITK)
- Highly proficient in client-side customization (RAC)
- Highly proficient in AWC customization, including style sheets and web services
- Highly proficient in SOA development
- Highly proficient in workflow handler development

We’ve got quite a lot to offer. How about you? This role is based in Mumbai or Pune, where you’ll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We’re Siemens. A collection of over 379,000 minds building the future, one day at a time, in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at:
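The field-mapping and validation duties above can be sketched as a tiny extract-transform-validate pass. The source and target field names and the mandatory-field rule are assumptions for illustration only, not the actual Teamcenter or Windchill data model.

```python
# Hypothetical mapping from a legacy PDM export to Teamcenter-style attributes.
FIELD_MAP = {"part_no": "item_id", "descr": "object_name", "rev": "revision"}
MANDATORY = {"item_id", "revision"}  # assumed target-side required fields

def transform(source_row):
    """Map source fields to target fields, dropping anything unmapped."""
    return {tgt: source_row[src] for src, tgt in FIELD_MAP.items() if src in source_row}

def validate(target_row):
    """Return the mandatory target fields that are missing or empty."""
    return sorted(f for f in MANDATORY if not target_row.get(f))

legacy = [
    {"part_no": "P-100", "descr": "Bracket", "rev": "A"},
    {"part_no": "P-101", "descr": "Housing"},  # missing revision -> cleansing candidate
]
for row in legacy:
    mapped = transform(row)
    print(mapped.get("item_id"), "gaps:", validate(mapped))
```

Running the validation before loading is what turns the "data cleansing" bullet into an actionable worklist: each non-empty gaps list is a record the roll-out project must fix in the source or during transformation.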
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft SQL Server Integration Services (SSIS)
Good to have skills: Microsoft SQL Server
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Microsoft SQL Server is a mandatory skill (SQL Server & SSIS).
- Must Have Skills: Proficiency in Microsoft SQL Server Integration Services (SSIS).
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) processes and tools.
- Familiarity with database management and optimization techniques.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft SQL Server Integration Services (SSIS).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP Data Services Development
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to discuss progress and address any roadblocks.

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP Data Services Development.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Data Services Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 25.0 years
13 - 18 Lacs
Kolkata
Work from Office
Project Role: Application Architect
Project Role Description: Provide functional and/or technical expertise to plan, analyze, define and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
Must have skills: Customer Data Platform & Integration, Google Cloud Data Services
Good to have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Architect, you will provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. You will also assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the design and implementation of complex application solutions.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Provide guidance and expertise on application architecture and integration strategies.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Customer Data Platform & Integration and Google Cloud Data Services.
- Strong understanding of data integration and data management principles.
- Experience in designing and implementing scalable and secure data platforms.
- Knowledge of cloud-based technologies and services.
- Hands-on experience with data modeling and database design.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Customer Data Platform & Integration.
- This position is based at our Kolkata office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process while maintaining a focus on quality and efficiency. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Scala
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with Scala.
- Strong understanding of application development methodologies.
- Experience with software development life cycle and agile practices.
- Familiarity with database management and data integration techniques.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python (Programming Language).
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also be responsible for troubleshooting issues and providing guidance to team members, fostering a collaborative environment that encourages innovation and efficiency in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Engage in continuous improvement initiatives to optimize application performance.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Informatica MDM.
- Strong understanding of data integration and management processes.
- Experience with data quality and governance frameworks.
- Familiarity with application lifecycle management tools.
- Ability to analyze and resolve complex technical issues.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Microsoft Power Apps
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct thorough testing and debugging of applications to ensure optimal performance and user experience.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Power Apps.
- Experience with Microsoft Power Automate for workflow automation.
- Strong understanding of application lifecycle management.
- Familiarity with data integration techniques and APIs.
- Ability to create user-friendly interfaces and enhance user experience.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Power Apps.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Stibo Product Master Data Management
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously assess and improve application performance and user experience.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Stibo Product Master Data Management.
- Strong understanding of application development methodologies.
- Experience with database management and data integration techniques.
- Familiarity with software development life cycle and agile practices.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Stibo Product Master Data Management.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
About The Role
This is an internal document.

Job Title: Senior Data Engineer

As a Senior Data Engineer, you will play a key role in designing and implementing data solutions at Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities
1. Data Architecture and Design
a. Design and develop scalable, high-performance data architecture and data models.
b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading, for real-time and batch use cases.
b. Implement ETL processes to integrate data from various sources into data storage systems.
c. Optimise data pipelines for performance, scalability, and reliability:
   i. Identify and resolve performance bottlenecks in data pipelines and analytical systems.
   ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions.
   iii. Optimise database performance, including query tuning, indexing, and partitioning strategies.
d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
b. Design and enforce data management policies and standards.
c. Develop and maintain documentation, data dictionaries, and metadata repositories.
d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Model Deployment & Management (a plus)
a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
c. Optimise model performance and latency for real-time inference in consumer applications.
d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
a. Lead data engineering projects, providing technical guidance and expertise to team members; conduct code reviews and ensure adherence to coding standards and best practices.
b. Mentor and coach junior data engineers, fostering their professional growth and development.
c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team; participate in the evaluation and selection of data engineering tools and technologies.

Qualifications
1. 3-5 years' experience, with a Bachelor's degree in Computer Science, Engineering, Technology, or a related field required.
2. Good understanding of streaming technologies like Kafka and Spark Streaming.
3. Experience with enterprise business intelligence platform / data platform sizing, tuning, optimization, and system landscape integration in large-scale enterprise deployments.
4. Proficiency in at least one programming language, preferably Java, Scala, or Python.
5. Good knowledge of Agile, SDLC, and CI/CD practices and tools.
6. Proven experience with Hadoop, MapReduce, Hive, Spark, and Scala programming; in-depth knowledge of performance tuning and optimizing data processing jobs, and debugging time-consuming jobs.
7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse), and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in a multi-petabyte DW environment.
10. Experience in engineering large-scale systems in a product environment.
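The query tuning and indexing work described under pipeline optimization can be illustrated with a minimal sketch, with SQLite standing in for the warehouse engine and a hypothetical events table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")
cur.executemany("INSERT INTO events VALUES (?, ?, 'x')",
                [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN shows whether the optimizer scans the table or uses an index.
    return " ".join(row[-1] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
print(plan(query))  # full table scan before the index exists

cur.execute("CREATE INDEX idx_events_user ON events (user_id)")
print(plan(query))  # the same query now resolves via the index
```

Production engines expose the same workflow under different names (Oracle's `EXPLAIN PLAN`, Spark's `df.explain()`): inspect the plan, add an index or partition on the filtered column, and confirm the scan becomes a seek.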
Posted 1 month ago