6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Country: India
Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102
Role: Data Engineer
Location: Gurgaon
Full/Part-time: Full Time

Build a career with confidence.

Summary
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Established Data Science & Analytics professional, creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets.

About The Role
- Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
- Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
- Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
- Be an advocate for best practices and continued learning

Key Responsibilities
- Expert coding proficiency on Snowflake
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools such as NiFi, SnapLogic, and dbt
- Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as integrating common technologies that work with Snowflake (code CI/CD, orchestration, data quality, monitoring)
- Designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark, and Control-M
- Establish strategies for data extraction, ingestion, transformation, automation, and consumption.
Role Responsibilities
- Experience with data lake concepts covering structured, semi-structured, and unstructured data
- Experience with strategies for data testing, data quality, code quality, and code coverage
- Hands-on expertise with Snowflake, preferably with SnowPro Core certification
- Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes
- Act as the interface between business and development teams to guide the solution end-to-end
- Define tools used for design specifications, data modelling, and data management capabilities, with exploration into standard tools
- Good understanding of data technologies, including RDBMS and NoSQL databases

Requirements
- A minimum of 6 years of prior relevant experience
- Strong exposure to data modelling, data access patterns, and SQL
- Knowledge of data storage fundamentals and networking

Good to Have
- Exposure to AWS tools/services
- Ability to conduct testing at different levels and stages of the project
- Knowledge of scripting languages such as Java and Python

Education
Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Programme

Our Commitment to You
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers.
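The role-based access controls mentioned above can be pictured with a small, stdlib-only sketch: roles hold privilege grants on objects, and access is checked against the role's grant set. This is only an illustration under invented role and object names; real Snowflake RBAC is configured with GRANT statements, not Python.

```python
# Hedged sketch of role-based access control: each role maps to a set of
# (privilege, object) grants, and an access check is simple set membership.
# Role names, privileges, and object names below are invented examples.
grants = {
    "analyst": {("SELECT", "sales_db.orders")},
    "engineer": {("SELECT", "sales_db.orders"), ("INSERT", "sales_db.orders")},
}

def can(role: str, privilege: str, obj: str) -> bool:
    """Return True if the role has been granted the privilege on the object."""
    return (privilege, obj) in grants.get(role, set())

allowed = can("analyst", "SELECT", "sales_db.orders")
denied = can("analyst", "INSERT", "sales_db.orders")
```

In a real warehouse the same idea is expressed declaratively (grant privileges to roles, grant roles to users), which keeps permissions auditable in one place.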
We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class. See the Job Applicant's Privacy Notice for details on how applicant data is handled.
Posted 2 months ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Country: India
Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102
Role: Data Engineer
Location: Gurgaon
Full/Part-time: Full Time

Build a career with confidence.

Summary
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Established Data Science & Analytics professional, creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets.

About The Role
- Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
- Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
- Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
- Be an advocate for best practices and continued learning

Key Responsibilities
- Expert coding proficiency on Snowflake
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools such as NiFi, SnapLogic, and dbt
- Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as integrating common technologies that work with Snowflake (code CI/CD, orchestration, data quality, monitoring)
- Designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark, and Control-M
- Establish strategies for data extraction, ingestion, transformation, automation, and consumption.
Role Responsibilities
- Experience with data lake concepts covering structured, semi-structured, and unstructured data
- Experience with strategies for data testing, data quality, code quality, and code coverage
- Hands-on expertise with Snowflake, preferably with SnowPro Core certification
- Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes
- Act as the interface between business and development teams to guide the solution end-to-end
- Define tools used for design specifications, data modelling, and data management capabilities, with exploration into standard tools
- Good understanding of data technologies, including RDBMS and NoSQL databases

Requirements
- A minimum of 3 years of prior relevant experience
- Strong exposure to data modelling, data access patterns, and SQL
- Knowledge of data storage fundamentals and networking

Good to Have
- Exposure to AWS tools/services
- Ability to conduct testing at different levels and stages of the project
- Knowledge of scripting languages such as Java and Python

Education
Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Programme

Our Commitment to You
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers.
We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class. See the Job Applicant's Privacy Notice for details on how applicant data is handled.
Posted 2 months ago
2.0 - 6.0 years
13 - 17 Lacs
Mumbai
Work from Office
At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world's energy systems. Their spirit fuels our mission. Our culture is defined by caring, agile, respectful, and accountable individuals. We value excellence of any kind. Sounds like you?

Software Developer - Data Integration Platform - Mumbai or Pune, Siemens Energy, Full Time

Looking for a challenging role? If you really want to make a difference - make it with us. We make real what matters.

About the role

Technical Skills (Mandatory)
- Python (Data Ingestion Pipelines): Proficiency in building and maintaining data ingestion pipelines using Python.
- Blazegraph: Experience with Blazegraph technology.
- Neptune: Familiarity with Amazon Neptune, a fully managed graph database service.
- Knowledge Graph (RDF, Triple): Understanding of RDF (Resource Description Framework) and triple stores for knowledge graph management.
- AWS Environment (S3): Experience working with AWS services, particularly S3 for storage solutions.
- Git: Proficiency in using Git for version control.

Optional and Good-to-Have Skills
- Azure DevOps (optional): Experience with Azure DevOps for CI/CD pipelines and project management (optional but preferred).
- Metaphactory by Metaphacts (very optional): Familiarity with Metaphactory, a platform for knowledge graph management.
- LLM / Machine Learning Experience: Experience with Large Language Models (LLMs) and machine learning techniques.
- Big Data Solutions (optional): Experience with big data solutions is a plus.
- SnapLogic / Alteryx / ETL Know-How (optional): Familiarity with ETL tools like SnapLogic or Alteryx is optional but beneficial.

We don't need superheroes, just super minds. A degree in Computer Science, Engineering, or a related field is preferred. Professional Software Development: Demonstrated experience in professional software development practices.
Years of Experience: 3-5 years of relevant experience in software development and related technologies.

Soft Skills
- Strong problem-solving skills
- Excellent communication and teamwork abilities
- Ability to work in a fast-paced and dynamic environment
- Strong attention to detail and commitment to quality
- Fluent in English (spoken and written)

We've got quite a lot to offer. How about you? This role is based in Pune or Mumbai, where you'll get the chance to work with teams impacting entire cities, countries - and the shape of things to come. We're Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at: www.siemens.com/careers
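The RDF/triple-store skills this role asks for boil down to one data model: every fact is a (subject, predicate, object) triple, and a graph is a set of triples you can pattern-match. The stdlib-only sketch below illustrates that model without Blazegraph or Neptune; the `EX` namespace and turbine facts are invented for illustration.

```python
# Minimal illustration of the RDF triple model: a "graph" is a set of
# (subject, predicate, object) tuples, queried by pattern matching where
# None acts as a wildcard. All URIs below are made-up examples.
EX = "http://example.org/"  # hypothetical namespace prefix

graph = {
    (EX + "turbine-1", EX + "type", EX + "GasTurbine"),
    (EX + "turbine-1", EX + "locatedIn", EX + "Mumbai"),
    (EX + "turbine-2", EX + "type", EX + "GasTurbine"),
}

def match(graph, s=None, p=None, o=None):
    """Return the triples matching a (s, p, o) pattern; None is a wildcard."""
    return {
        t for t in graph
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    }

# All subjects typed as GasTurbine:
turbines = {t[0] for t in match(graph, p=EX + "type", o=EX + "GasTurbine")}
```

Production triple stores (Blazegraph, Neptune) apply the same pattern-matching idea at scale through SPARQL, with indexes over the triple positions.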
Posted 2 months ago
5.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
About The Role:
As a Python Developer, you will play a critical role in our software development and data engineering initiatives. You will work closely with data engineers, architects, and other developers to build and maintain our applications and data pipelines. Your expertise in Python development, API design, and cloud technologies will be essential to your success.

Responsibilities:
- Design, develop, and maintain applications using the latest Python frameworks and technologies (Django, Flask, FastAPI).
- Utilize Python libraries and tools (Pandas, NumPy, SQLAlchemy) for data manipulation and analysis.
- Develop and maintain RESTful APIs, ensuring security, authentication, and authorization (OAuth, JWT).
- Deploy, manage, and scale applications on AWS services (EC2, S3, RDS, Lambda).
- Utilize infrastructure-as-code tools (Terraform, CloudFormation) for infrastructure management (good to have).
- Design and develop database solutions using PL/SQL (packages, functions, ref cursors).
- Implement data normalization and Oracle performance optimization techniques.
- Design and develop data warehouse solutions, including data marts and ODS concepts.
- Implement low-level design of warehouse solutions.
- Work with Kubernetes for container orchestration, deploying, managing, and scaling applications on Kubernetes clusters.
- Utilize the SnapLogic cloud-native integration platform for designing and implementing integration pipelines.

Required Skills:
- Expertise in Python frameworks (Django, Flask, FastAPI).
- Proficiency in Python libraries (Pandas, NumPy, SQLAlchemy).
- Strong experience in designing, developing, and maintaining RESTful APIs.
- Familiarity with API security, authentication, and authorization mechanisms (OAuth, JWT).
- Good experience and hands-on knowledge of PL/SQL (packages/functions/ref cursors).
- Knowledge of data normalization and Oracle performance optimization techniques.
- Experience in development and low-level design of warehouse solutions.
- Familiarity with data warehouse, data mart, and ODS concepts.
- Proficiency in AWS services (EC2, S3, RDS, Lambda).

Good to Have Skills:
- Kubernetes: Hands-on experience with Kubernetes for container orchestration.
- Infrastructure as Code: Experience with infrastructure-as-code tools (Terraform, CloudFormation).
- Integration Platforms: Experience with the SnapLogic cloud-native integration platform.

Experience:
- 5 to 8 years of experience as a Python Developer.

Location:
- Bangalore or Gurgaon
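The JWT-based authorization mentioned in the requirements follows a simple recipe: base64url-encode a JSON header and payload, then sign them with HMAC-SHA256. The stdlib sketch below shows that recipe for the HS256 case; in production you would use a maintained library such as PyJWT, and the secret here is a placeholder.

```python
# Hedged sketch of how an HS256 JWT is assembled, stdlib only.
# Do not roll your own JWT handling in production; this only
# illustrates the header.payload.signature structure.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

# The claim names and secret below are invented for the example.
token = make_jwt({"sub": "user-42", "role": "developer"}, b"demo-secret")
```

An API server verifying such a token recomputes the signature over the first two segments and compares it in constant time (`hmac.compare_digest`) before trusting the claims.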
Posted 2 months ago
6.0 - 11.0 years
18 - 33 Lacs
Pune
Hybrid
Job Requirements: Candidates with any of the following SnapLogic development skills are welcome; not all are mandatory.
1. Strong knowledge of SnapLogic pipeline development and architecture.
2. Hands-on experience using the various Snaps available in SnapLogic, such as REST Snaps, Transform Snaps, Database Snaps, and the Script Snap.
3. Knowledge of task creation, such as scheduled tasks and triggered tasks.
4. Experience working in Agile.
5. Pipeline monitoring and troubleshooting experience.
6. Knowledge of integration development using AWS or other cloud technologies. Work with Microsoft Dynamics (schema/connect browser), JDBC, ServiceNow, Google BigQuery Snaps, Oracle, REST, and SOAP.
7. Building complex mappings with JSON path expressions and Python scripting.

Qualifications:
- 6-10 years of overall IT experience
- 2-3 years of development experience building SnapLogic pipelines, error handling, scheduling tasks, and alerts
- Analyze and translate functional specifications/user stories into technical specifications
- Experience with end-to-end implementations in SnapLogic (develop/test/implement)
- Integration experience working with third-party/external vendors across all modules and providing solutions for SnapLogic design
- Good written and verbal communication capabilities
- Strong experience coordinating with business analysts to understand business and functional requirements, and converting business rules into technical specifications
- Proven ability to work independently or as part of a team.
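The JSON path expressions in point 7 address nested payload fields with dotted keys and array indexes (for example `order.items[0].sku`). SnapLogic evaluates these in its expression language; the stdlib Python sketch below is only an analogue showing the idea, supporting just dotted keys and `[n]` indexing, with an invented sample document.

```python
# Hedged analogue of a JSON path lookup: walk a nested dict/list
# structure using a simplified dotted-path syntax with [n] indexing.
# The payload document below is invented for illustration.
import re

def json_path(doc, path: str):
    """Resolve a simplified path such as 'order.items[0].sku' against nested data."""
    for token in path.split("."):
        m = re.fullmatch(r"(\w+)(?:\[(\d+)\])?", token)
        if m is None:
            raise ValueError(f"bad path segment: {token!r}")
        key, index = m.group(1), m.group(2)
        doc = doc[key]                  # descend into the mapping
        if index is not None:
            doc = doc[int(index)]       # then into the list, if indexed
    return doc

payload = {"order": {"id": 7, "items": [{"sku": "A-100"}, {"sku": "B-200"}]}}
sku = json_path(payload, "order.items[1].sku")  # "B-200"
```

Real SnapLogic expressions are richer (wildcards, filters, functions), but the descend-one-segment-at-a-time mechanic is the same.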
Posted 2 months ago
5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Client: Our client is a market-leading company with over 30 years of experience in the industry. As one of the world's leading professional services firms, with $19.7B in revenue and 333,640 associates worldwide, it helps clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Their specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.

Hi! We are hiring for the below position.
Job Title: SnapLogic
Key Skills: SnapLogic, NetSuite, ETL, SAP
Job Locations: Pan India
Experience: 5-14 Years
Budget: 1-23 LPA
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 15 Days

Job Description:
- Hands-on experience in SnapLogic of at least 5+ years.
- Experience building integrations with Salesforce, NetSuite, SAP, etc. Working in Ultra would be an added advantage.
- Implementation experience in ETL/ELT.
- Experience with the SnapLogic public API and creation of APIs with SnapLogic.
- In-depth knowledge of JSON/XML/XSD/XPath/XSLT. Working with APIM would be an added advantage.
- Hands-on experience with web services such as REST/SOAP.
- Hands-on experience with JavaScript.
- Working experience in Agile methodology.
- Good to have: basic core Java concepts.
- Prior experience handling customers and managing a team.
- Should have worked on at least one of the following design patterns: ETL, API, ESB, or Cloud Native.

Interested candidates, please share your CV to sushma.n@people-prime.com
Posted 2 months ago
14 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Client: Our client is a market-leading company with over 30 years of experience in the industry. As one of the world's leading professional services firms, with $19.7B in revenue and 333,640 associates worldwide, it helps clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Their specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.

Hi! We are hiring for the below position.
Job Title: SnapLogic Developer
Key Skills: SnapLogic, ETL, Salesforce, NetSuite, API, JSON, XML, SOAP, JavaScript, Agile Methodology
Job Locations: Pan India
Experience: 5-14 Years
Budget: Based on current CTC; will give a 30-40% hike
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 15 Days
Interview Mode: Virtual

Job Description:
- Hands-on experience in SnapLogic of at least 5+ years.
- Experience building integrations with Salesforce, NetSuite, SAP, etc. Working in Ultra would be an added advantage.
- Implementation experience in ETL/ELT.
- Experience with the SnapLogic public API and creation of APIs with SnapLogic.
- In-depth knowledge of JSON/XML/XSD/XPath/XSLT. Working with APIM would be an added advantage.
- Hands-on experience with web services such as REST/SOAP.
- Hands-on experience with JavaScript.
- Working experience in Agile methodology.
- Good to have: basic core Java concepts.
- Prior experience handling customers and managing a team.
- Should have worked on at least one of the following design patterns: ETL, API, ESB, or Cloud Native.

Interested candidates, please share your CV to vamsi.v@people-prime.com
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra
Remote
Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy employee-centric, collaborative culture is the essential mix of ingredients for success in business. That's why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don't need to hide who you are. You can feel the energy and the passion, it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here. At Workday, we value our candidates' privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday.

About the Team
The vision of our Business Technology Organization is to be the trusted partner that fuels Workday's business technology innovations and products to enable company growth at scale. Our team cultivates relationships built on collaboration and trust to support a rapidly growing business. We strive to improve efficiency and operational effectiveness through technology, innovation and inspiration.
About the Role

Responsibilities:
- Software application troubleshooting at the application, database, network, and integration layers
- Lead and deliver automation of tasks/service requests
- Perform trend analysis, and develop action plans for improving SLAs and reducing case volume and problems
- Incident troubleshooting, resolution, and technical root cause analysis to address problems permanently
- Identify business risks, inefficiencies, issues, and opportunities related to the Salesforce platform
- Document, maintain standardization, and look for ways to constantly improve processes and procedures
- Develop expertise in Workday Go-To-Market business applications end-to-end
- Develop domain expertise in Workday's enterprise applications, including integrations

About You

Basic Qualifications:
- 5+ years in enterprise software application development
- Bachelor's or equivalent experience in Computer Science, Information Technology, or a related field

Other Qualifications:
- Deep technical knowledge of enterprise software application development and enterprise application integrations (Salesforce, Apttus, MuleSoft/SnapLogic)
- Hands-on experience troubleshooting technical issues on the Salesforce platform end-to-end (application, database, network, and integration layers)
- Knowledge of IT service management tools and best practices (preferred)
- Self-motivated, flexible teammate with proven multi-tasking, time management, and organization expertise, with the ability to handle multiple and often changing priorities
- Attention to detail with the ability to analyze and tackle sophisticated problems, as well as provide documentation, mentorship, and instruction to users
- Proven ability to learn and embrace new technologies, applications, and solutions

Our Approach to Flexible Work
With Flex Work, we're combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work.
We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
Posted 2 months ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, Data Build Tool (dbt), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Key Responsibilities
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
- SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
- Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
- Database Management: Manage and maintain SQL Server and PostgreSQL databases.
- ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
- Troubleshooting: Identify and resolve data-related issues and discrepancies.
- Python Scripting: Utilize Python for data manipulation, automation, and integration tasks.

Qualifications
- Experience: Minimum of 9 years of experience in data engineering.

Technical Skills
- Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
- Strong SQL skills with the ability to write and optimize complex queries.
- Knowledge of Python for data manipulation and automation.
- Knowledge of data governance frameworks and best practices.

Soft Skills
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Positive attitude and ability to work well in a team environment.

Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.
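The SQL optimization responsibility above often comes down to reading the query plan before and after a change. The sketch below uses stdlib sqlite3 (standing in for SQL Server/PostgreSQL/Snowflake) to show the same filter switching from a full table scan to an index lookup once an index exists; the table and column names are invented.

```python
# Hedged illustration of SQL query optimization with an index,
# using sqlite3's EXPLAIN QUERY PLAN. The schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql: str) -> str:
    """Concatenate the 'detail' column of the query plan rows."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # with the index: an index search
```

The same discipline applies on the larger engines: compare `EXPLAIN` output before and after adding indexes, clustering keys, or rewritten predicates, rather than guessing.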
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: MuleSoft Anypoint Platform
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: BE

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications using the MuleSoft Anypoint Platform. Your typical day will involve collaborating with cross-functional teams, analyzing business process and application requirements, and delivering high-quality solutions.

Roles & Responsibilities:
- Design, build, and configure applications using the MuleSoft Anypoint Platform to meet business process and application requirements.
- Collaborate with cross-functional teams to analyze business process and application requirements, and deliver high-quality solutions.
- Develop and maintain integration solutions using the MuleSoft Anypoint Platform, including API development, data integration, and ETL processes.
- Ensure the performance, quality, and responsiveness of applications; identify and correct bottlenecks and fix bugs.

Professional & Technical Skills:
- Must-have skills: Experience in designing and developing integration solutions using the MuleSoft Anypoint Platform.
- Good-to-have skills: Experience with other integration platforms such as Dell Boomi, Informatica, or SnapLogic.
- Strong understanding of integration patterns, RESTful APIs, and web services.
- Experience with Java, XML, JSON, and other related technologies.
- Experience with Agile development methodologies and DevOps practices.
- Solid grasp of software development best practices, including coding standards, code reviews, source control management, and testing methodologies.

Additional Information: The candidate should have a minimum of 2 years of experience with the MuleSoft Anypoint Platform.
The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering high-quality integration solutions. This position is based at our Hyderabad office. Alternate locations can be Pune, Bangalore, Mumbai, Chennai, NCR, or Kolkata. Qualifications: BE
Posted 2 months ago
3 - 5 years
5 - 10 Lacs
Chennai, Bengaluru
Work from Office
Integration Design and Development:
- Develop integration solutions using SnapLogic to automate data workflows between Snowflake, APIs, Oracle, and other data sources.
- Design, implement, and maintain data pipelines to ensure reliable and timely data flow across systems.
- Develop API integrations to facilitate seamless data exchange with internal master data management systems.
- Monitor and optimize data integration processes to ensure high performance and reliability.
- Provide support for existing integrations, troubleshoot issues, and suggest improvements to streamline operations.
- Work closely with cross-functional teams, including data analysts, data scientists, and IT, to understand integration needs and develop solutions.
- Maintain detailed documentation of integration processes and workflows.

Experience:
- 3-4 years of proven experience as a SnapLogic Integration Engineer.
- Experience with the Snowflake cloud data platform is preferred.
- Experience in API integration and development.
- Familiarity with RESTful API design and integration.
- Strong understanding of ETL/ELT processes.
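The integration work described above follows the classic extract-transform-load shape: pull records from a source, normalize them, and load them into a target store. The sketch below keeps that shape with stdlib sqlite3 standing in for Snowflake/Oracle; the source records and `sales` schema are invented, and a real `extract()` would call a REST API or database rather than return literals.

```python
# Hedged ETL sketch: extract (stubbed source), transform (type and
# casing normalization), load (sqlite3 standing in for a warehouse).
import sqlite3

def extract():
    # Stub: a real pipeline would call a REST API or read from Oracle/Snowflake.
    return [
        {"id": 1, "amount": "120.50", "region": "south"},
        {"id": 2, "amount": "80.00", "region": "north"},
    ]

def transform(records):
    # Normalize string amounts to floats and upper-case the region codes.
    return [(r["id"], float(r["amount"]), r["region"].upper()) for r in records]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In a SnapLogic pipeline each of these stages maps to a Snap (a REST Snap for extract, a Mapper for transform, a database Snap for load), which is what makes the skill transferable between code and the visual designer.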
Posted 2 months ago
2 - 7 years
6 - 10 Lacs
Bengaluru
Work from Office
Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a Data Engineer (AWS, Confluent & SnapLogic)
- Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
- Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
- Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
- Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
- Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
- ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
- Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as:
- Experience: 3+ relevant years of experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
Technical Skills: Proficiency in AWS services, particularly AWS Glue. Experience with Iceberg tables and Snowflake. Knowledge of Confluent Kafka for real-time data streaming. Familiarity with SnapLogic for ETL processes. Experience with Apache Airflow for workflow management. Understanding of Splunk for monitoring and logging.
Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
Data Modeling: Experience with data modeling and database design.
Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred Qualities:
Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
Team Player: Strong team player with a collaborative mindset.
Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
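The pipeline work described in this posting centers on transforming records before they are loaded into Snowflake. A minimal sketch of one such record-level transformation step, with entirely hypothetical field names (`plant_id`, `event_time`), might look like:

```python
from datetime import datetime, timezone

def normalize_record(raw: dict) -> dict:
    """Lowercase and trim keys, strip string values, and convert an
    ISO-8601 event timestamp to UTC. Field names are illustrative."""
    rec = {k.strip().lower(): v for k, v in raw.items()}
    for key, value in rec.items():
        if isinstance(value, str):
            rec[key] = value.strip()
    if "event_time" in rec:
        rec["event_time"] = (
            datetime.fromisoformat(rec["event_time"])
            .astimezone(timezone.utc)
            .isoformat()
        )
    return rec

if __name__ == "__main__":
    raw = {" Plant_ID ": " MUC-01 ", "Event_Time": "2024-05-01T10:00:00+02:00"}
    print(normalize_record(raw))
```

In a real pipeline this kind of logic would typically live inside an AWS Glue job or a SnapLogic pipeline snap rather than a standalone script.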
Posted 2 months ago
3 - 7 years
9 - 13 Lacs
Bengaluru
Work from Office
Mission/Position Headline: OIC (Oracle Integration Cloud) with OTM (Oracle Transportation Management) or other Oracle SaaS applications.

Areas of Responsibility:
Design and develop (deliver) product extensions, integrations, reports, etc.
Stage/migrate customer data as needed.
Lead/support knowledge transition to the client.
Support all formal documentation of solutions, including requirements for product extensions, etc.
Lead/support all solution activities, including testing and verification.
Contribute to the training and development of client technical staff.
System troubleshooting and debugging.
Be willing to provide after-hours support for the USA and EU (levels 1-3). After-hours for the US will coincide with daytime/evening working hours in India: availability is needed between 8am and noon EST (meaning work would fall between 3am and 12pm EST, or 12:30pm and 9:30pm IST).

Desired Experience:
Must have development knowledge/experience on OIC (Oracle Integration Cloud).
Must have experience with the OIC service role; adapters and connections; connection and security properties; agents and architecture.
Must have experience with the data transformation module: integration actions, file processing, error handling, and OIC administrator tasks.
Detailed technical knowledge of Oracle Integration Cloud (OIC) and Oracle Cloud Infrastructure (OCI), including integrations, connections, securing connections, mapping, and APIs.
Experience building integrations in OIC using REST/SOAP services.
Configure lookups and manage certificates and keys within OIC.
Data model: XML Schema and JSON.
Integration design and development; data migration and loading; integration with other applications.
Performance tuning, scalability configuration, and troubleshooting.
Experience with SnapLogic or other integration software is a plus.

Qualification and Experience: Overall, 8-12 years of experience. Bachelor's or Master's degree in Computer Science/Electronics Engineering, or equivalent, required.
Capabilities: Good communication skills; self-motivated, quality- and result-oriented. Strong analytical and problem-solving skills.
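The integration work above amounts to mapping source records into target payloads (for example, an order into an OTM-style shipment). A minimal sketch of such a mapping function, where the field names on both sides are hypothetical and not OTM's actual schema:

```python
def order_to_shipment(order: dict) -> dict:
    """Map a hypothetical source order record to a shipment payload.
    All field names are illustrative placeholders, not Oracle's schema."""
    return {
        "shipmentXid": f"SHP-{order['order_id']}",
        "sourceLocation": order["ship_from"],
        "destLocation": order["ship_to"],
        "totalWeight": {
            # Aggregate line-level weights into a single shipment weight.
            "value": sum(line["weight_kg"] for line in order["lines"]),
            "unit": "KG",
        },
    }

if __name__ == "__main__":
    order = {"order_id": "1001", "ship_from": "GGN", "ship_to": "PNQ",
             "lines": [{"weight_kg": 2.5}, {"weight_kg": 7.5}]}
    print(order_to_shipment(order))
```

In OIC itself this transformation would normally be expressed in the visual mapper or XSLT rather than Python; the sketch only shows the shape of the mapping.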
Posted 2 months ago
8 - 13 years
18 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
Location: Hyderabad, Bangalore Function: HD HR Requisition ID: 1032970

Our Company: We're Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We're crucial to the company's strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don't expect you to 'fit' every requirement; your life experience, character, perspective, and passion for achieving great things in the world are equally important to us.

The team: Our Global HR Technology team is responsible for the HR technology stack strategy and execution across the Hitachi Digital operating companies, which include GlobalLogic, Hitachi Digital Services and Hitachi Vantara and comprise more than 50,000 employees in 52 countries across the globe. We are an innovative, driven, and dynamic team that is passionate about people and technology, currently leading several critical transformation initiatives, including a global re-implementation of Workday to incorporate new functionalities, global tech stack optimization, and the introduction of AI capability across HR.

What you'll be doing: The Workday Integration Lead will be responsible for leading the design, delivery, and management of integrations across all operating companies (OpCos) within the organization.
This role focuses on overseeing the lifecycle of Workday and HR technology integrations, ensuring seamless connectivity between systems, data integrity, and process efficiency. As the Integration Lead, you will collaborate with internal stakeholders and external partners to deliver high-quality, scalable integration solutions. This role will drive strategic initiatives, including ensuring compliance with global data governance and security standards, while managing ongoing optimization of the integration landscape post-implementation.

You will:
Serve as a primary representative of the HR engineering function, collaborating with implementation teams and internal stakeholders to translate business requirements into technical designs, and develop, test, and deploy HR integrations.
Define and execute the product vision and roadmap for Workday and related HR technology integrations, with consideration for integration architecture best practice, scalability, governance, and business continuity.
Lead the design and delivery of integration solutions, collaborating with IT, HR, and external vendors to develop scalable solutions for critical business processes, such as payroll, benefits, and finance.
Oversee data mapping, conversion, and validation processes to ensure data accuracy and consistency across systems.
Ensure compliance with data governance, privacy regulations, and security protocols during integration design and development.
Manage the lifecycle of Workday and other HR system integrations, monitoring integration performance through operational dashboards and ensuring stability and continuous improvement.
Build strong relationships with OpCo stakeholders, ensuring integration solutions meet local and global business requirements.
Facilitate communication and training to ensure stakeholders understand and can effectively use integration solutions.
Ensure all integration solutions adhere to global compliance and regulatory requirements, including GDPR, CCPA, and other data privacy standards.
Collaborate with IT and data governance teams to address security vulnerabilities and ensure alignment with enterprise policies.

What you bring to the team:
Bachelor's degree in Computer Science, Information Systems, or a related field.
Minimum of 8 years of experience in HR technology integrations, preferably with Workday or similar ERP systems.
Proven experience managing complex integration projects in global organizations.
Strong expertise in Workday integration tools, including Workday Studio, EIBs, Core Connectors, and Workday Web Services (REST/SOAP APIs).
Familiarity with middleware solutions (e.g., Dell Boomi, SnapLogic) and technologies such as XML and XSLT.
Comprehensive understanding of HR functional areas (HCM, Payroll, Benefits, etc.) and associated data models.
Experience with global data governance, security frameworks, and compliance standards.
Strong project management skills with the ability to lead multiple initiatives and balance competing priorities.
Exceptional communication and stakeholder management abilities, with a collaborative approach.
Analytical and problem-solving skills, with a proactive mindset toward innovation and improvement.
Workday Integration Certification and/or Workday HCM Certification are preferred but not essential.

About us: We're a global, 1,000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We're curious, passionate and empowered, blending our legacy of 110 years of innovation with our vision for shaping the future. Here you're not just another employee; you're part of a tradition of excellence and a community working towards creating a digital future.

Our Values: We strive to create an inclusive environment for all and are open to considering home working, compressed/flexible hours and flexible arrangements.
Get in touch with us to explore how we might be able to accommodate your specific needs.We are proud to say we are an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. Championing diversity, equity, and inclusion
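The Workday role above calls for experience with Workday Web Services (REST/SOAP APIs). A minimal sketch of assembling a Workday-style REST request is below; note that the `/ccx/api/v1` path, the `user@tenant` Basic-auth convention, and the `workers` resource are assumptions to be verified against the tenant's own API documentation:

```python
import base64

def workday_get_request(host: str, tenant: str, resource: str,
                        user: str, password: str) -> dict:
    """Assemble the URL and Basic-auth header for a Workday-style
    REST call. Path layout and credential format are assumptions."""
    token = base64.b64encode(f"{user}@{tenant}:{password}".encode()).decode()
    return {
        "url": f"https://{host}/ccx/api/v1/{tenant}/{resource}",
        "headers": {
            "Authorization": f"Basic {token}",
            "Accept": "application/json",
        },
    }

if __name__ == "__main__":
    req = workday_get_request("impl.workday.example", "acme",
                              "workers", "isu_user", "secret")
    print(req["url"])
```

In practice, production integrations of this kind are more often built with Workday Studio or Core Connectors than raw HTTP calls; the sketch only illustrates the request shape.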
Posted 2 months ago
3 - 6 years
7 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Key Responsibilities:
Integration Design and Development: Develop integration solutions using SnapLogic to automate data workflows between Snowflake, APIs, Oracle, and other data sources.
Design, implement, and maintain data pipelines to ensure reliable and timely data flow across systems.
Develop API integrations to facilitate seamless data exchange with internal master data management systems.
Monitor and optimize data integration processes to ensure high performance and reliability.
Provide support for existing integrations, troubleshoot issues, and suggest improvements to streamline operations.
Work closely with cross-functional teams, including data analysts, data scientists, and IT, to understand integration needs and develop solutions.
Maintain detailed documentation of integration processes and workflows.

Experience:
3-4 years of proven experience as a SnapLogic Integration Engineer.
Experience with the Snowflake cloud data platform is preferred.
Experience in API integration and development; familiarity with RESTful API design and integration.
Strong understanding of ETL/ELT processes.
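SnapLogic pipelines can be exposed as triggered tasks and invoked over HTTPS. A minimal sketch of assembling such a call, where the task URL and the `run_date` parameter are hypothetical placeholders:

```python
def snaplogic_trigger_request(task_url: str, token: str, params: dict) -> dict:
    """Build the pieces of an HTTPS call that invokes a SnapLogic
    triggered task. The URL and parameters are placeholders; the
    bearer-token scheme should be checked against the org's setup."""
    return {
        "method": "POST",
        "url": task_url,
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "params": params,
    }

if __name__ == "__main__":
    spec = snaplogic_trigger_request(
        "https://integration.example.com/triggered-task/load-orders",
        "tok123",
        {"run_date": "2024-05-01"},
    )
    print(spec["url"])
```

A caller would pass this spec to an HTTP client such as `requests.request(**spec)` after renaming `params` appropriately for the chosen library.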
Posted 2 months ago
2 - 5 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are looking for a skilled and experienced SnapLogic Developer with expertise in analysing, developing, and deploying integration solutions. This role involves end-to-end project delivery and close collaboration with stakeholders to ensure the successful execution of integration solutions.

Responsibilities:
Design, develop, and maintain scalable integration solutions using SnapLogic.
Manage and oversee integration projects from initiation to completion, with timely delivery and high-quality outcomes.
Collaborate with stakeholders to translate business requirements into efficient technical solutions.
Provide technical guidance and support throughout all phases of the project lifecycle.
Develop and maintain technical documentation, adhering to established processes and best practices.
Troubleshoot and resolve integration challenges, ensuring system reliability and effectiveness.
Continuously assess current processes to identify opportunities for improvement and optimization.
Monitor system performance and ensure compliance with organizational and project-specific standards.

Requirements:
4-5 years of working experience in SnapLogic development and end-to-end integration delivery.
Knowledge of SnapLogic Designer, SnapLogic Manager, and pipelines for handling integration tasks.
Expertise in integrating systems such as databases, SaaS applications, and REST APIs using SnapLogic.
Background in ETL processes, data flows, and data transformation capabilities.
Familiarity with cloud platforms like AWS, Azure, or GCP and their integration capabilities.
Understanding of error handling, debugging, and best practices to ensure seamless integrations.
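The error-handling requirement above usually boils down to retrying transient integration failures. A minimal retry-with-exponential-backoff sketch (the attempt count and delay are arbitrary example values):

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying on any exception with exponential backoff.
    Re-raises the last exception once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            # Back off 1s, 2s, 4s, ... before the next attempt.
            time.sleep(base_delay * (2 ** attempt))
```

SnapLogic itself offers retry settings on many snaps; a wrapper like this is only needed when orchestrating calls from custom code around the platform.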
Posted 2 months ago
2 - 5 years
0 Lacs
Mumbai Metropolitan Region
On-site
Data Quality & Governance: Strong understanding of data validation, testing (e.g., dbt tests), and lineage tracking, with an emphasis on maintaining data trust across pipelines and models.
Stakeholder Management: Partner with business and technical stakeholders to define data needs and deliver insights; ability to explain complex data concepts in clear, non-technical terms.
Documentation & Communication: Maintain clear documentation for models, metrics, and data transformations (using dbt docs or similar); strong verbal and written communication skills; able to work cross-functionally across teams.
Problem-Solving & Ownership: Proactive in identifying and resolving data gaps or issues; self-starter with a continuous improvement mindset and a focus on delivering business value through data.
Infrastructure as Code (IaC): Deploy scalable, secure, and high-performing Snowflake environments in line with data governance and security requirements using Terraform and other automation scripts; automate infrastructure provisioning, testing, and deployment for seamless operations.

Requirements:
Strong SQL & dbt Expertise: Experience building and maintaining scalable data models in dbt; proficient in modular SQL, Jinja templating, testing strategies, and dbt best practices.
Data Warehouse Proficiency: Hands-on experience with Snowflake, including dimensional and data vault modeling (star/snowflake schemas), performance optimization and query tuning, and role-based access and security management.
Data Pipeline & Integration Tools: Experience with Kafka (or similar event streaming tools) for ingesting real-time data; familiarity with SnapLogic for ETL/ELT workflow design, orchestration, and monitoring.
Version Control & Automation: Proficient in Git and GitHub for code versioning and collaboration; experience with GitHub Actions or other CI/CD tools to automate dbt model testing, deployment, and documentation updates.
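The dbt tests mentioned above (declared in YAML, e.g. `unique` and `not_null` on a column) reduce to simple assertions over a model's rows. A Python sketch of the same two checks, useful for reasoning about what the tests verify:

```python
def check_unique(rows, column):
    """Analogue of dbt's `unique` test: return values that repeat."""
    seen, dupes = set(), set()
    for row in rows:
        value = row[column]
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

def check_not_null(rows, column):
    """Analogue of dbt's `not_null` test: return offending row indexes."""
    return [i for i, row in enumerate(rows) if row[column] is None]

if __name__ == "__main__":
    rows = [{"id": 1, "email": "a@x.com"},
            {"id": 2, "email": None},
            {"id": 1, "email": "c@x.com"}]
    print(check_unique(rows, "id"), check_not_null(rows, "email"))
```

In dbt itself these checks run as generated SQL against the warehouse; the Python version only illustrates the semantics.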
Posted 2 months ago
3 years
0 Lacs
Bengaluru, Karnataka
Work from Office
ServiceNow QA: As a Senior ServiceNow Consultant, you will be responsible for leading and executing ServiceNow projects and initiatives for clients. You will work closely with clients to understand their requirements, identify opportunities for improvement, and develop innovative solutions leveraging the ServiceNow platform.

Key Responsibilities:
Serve as the primary ServiceNow subject matter expert for clients, advising on best practices, platform capabilities, and potential solutions.
Lead all aspects of ServiceNow project delivery, including requirements gathering, design, development, testing, and deployment.
Conduct product demonstrations and educate stakeholders on ServiceNow functionality.
Collaborate with clients to understand their business requirements and how ServiceNow can support them.
Develop and configure custom ServiceNow applications, workflows, forms, reports, and integrations.
Customize and extend ServiceNow ITSM, SPM, and ITOM modules to meet specific client needs.
Develop and maintain project plans, budgets, and schedules.
Communicate project status, risks, and issues to clients and project stakeholders.

Requirements:
Bachelor's degree in Computer Science, Information Technology, or a related field.
5+ years of hands-on experience with the ServiceNow platform, including architecture, design, development, and deployment across multiple modules.
3+ years of experience developing and managing integrations between ServiceNow and other systems and technologies.
In-depth technical knowledge of integration protocols and technologies, including REST APIs, SOAP APIs, JDBC, LDAP, and others.
Experience with other integration toolsets such as Workato, Apttus, or SnapLogic is a big plus.
Proven experience leading successful ServiceNow implementations from start to finish.
Strong problem-solving and analytical skills, with the ability to develop creative solutions to complex problems.
Excellent verbal and written communication skills, with the ability to effectively communicate technical information to both technical and non-technical stakeholders.
Experience with multiple ServiceNow modules, such as ITSM, SPM, ITOM, HRSD, or CSM.
Experience with Performance Analytics is a plus.
Experience with newer ServiceNow modules such as WSD, LSD, or SLM is a big plus.

Preferred Certifications:
ServiceNow Certified Implementation Specialist (ITSM); consulting experience preferred.
Hardware Asset Management (HAM) Fundamentals.
Software Asset Management (SAM) Fundamentals.
ServiceNow Certified Implementation Specialist (Discovery).

Mandatory Skills: ServiceNow ITSM, SPM & ITOM; Workato, Apttus, or SnapLogic; integration protocols and technologies, including REST APIs, SOAP APIs, JDBC, LDAP, and others.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
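The REST integration experience this role requires commonly involves ServiceNow's Table API (`GET /api/now/table/{table}` with `sysparm_*` query parameters). A minimal sketch of building such a request URL; the instance name and query string are placeholders:

```python
from urllib.parse import urlencode

def table_api_url(instance: str, table: str, query: str, limit: int = 10) -> str:
    """Build a ServiceNow Table API GET URL. The /api/now/table path
    and sysparm_* parameters follow ServiceNow's documented REST
    interface; instance, table, and query values are examples."""
    params = urlencode({"sysparm_query": query, "sysparm_limit": limit})
    return f"https://{instance}.service-now.com/api/now/table/{table}?{params}"

if __name__ == "__main__":
    # Fetch up to 5 active incidents from a hypothetical dev instance.
    print(table_api_url("dev123", "incident", "active=true", limit=5))
```

A real call would add authentication (Basic auth or OAuth) and an `Accept: application/json` header before issuing the GET with an HTTP client.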
Posted 2 months ago
5 - 10 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Pune. You will play a crucial role in developing innovative solutions to enhance business operations and efficiency.

Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform; responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for the immediate team and across multiple teams.
Lead the development and implementation of SnapLogic solutions.
Conduct code reviews and ensure adherence to coding standards.
Troubleshoot and resolve technical issues in SnapLogic integrations.

Professional & Technical Skills:
Must-have skills: Proficiency in SnapLogic.
Strong understanding of ETL processes.
Experience with API integrations.
Knowledge of cloud platforms such as AWS or Azure.
Hands-on experience in developing and maintaining SnapLogic pipelines.

Additional Information: The candidate should have a minimum of 5 years of experience in SnapLogic. This position is based at our Pune office. 15 years of full-time education is required.
Posted 2 months ago