87468 Integration Jobs - Page 31

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0.0 - 1.0 years

3 - 5 Lacs

Salt Lake, Kolkata, West Bengal

On-site

Job Description: Flutter Developer
Location: Kolkata
Experience: 1-3 years

Role Description: We are looking for a skilled Flutter Developer with at least 3 years of hands-on experience to join our team. You will build cross-platform mobile applications with pixel-perfect UI and robust business logic. The ideal candidate has a strong grasp of state management techniques, particularly BLoC, and can efficiently translate Figma designs into responsive, production-ready Flutter UIs.

Key Responsibilities:
- Develop and maintain high-performance, reusable Flutter applications.
- Convert Figma/UI designs into responsive Flutter layouts.
- Implement business logic using the BLoC pattern or an equivalent.
- Work closely with product managers and designers to deliver high-quality features.
- Optimize app performance and ensure cross-platform consistency (Android and iOS).
- Write clean, maintainable, and testable code.
- Integrate REST APIs and GraphQL services.
- Write test cases; working knowledge of TDD.
- Use Git, GitHub, and CI/CD pipelines.
- Deploy apps to the Play Store and App Store.
- Troubleshoot and debug application issues.

Requirements:
- 3+ years of hands-on experience with Flutter and Dart.
- Solid understanding of state management (preferably BLoC, Provider, or Riverpod).
- Strong ability to translate Figma designs into responsive UIs.
- Good understanding of REST APIs and integration techniques.
- Familiarity with mobile app deployment processes (Play Store, App Store).
- Experience with Git, CI/CD tools, and agile workflows.
- Strong problem-solving and debugging skills.

Nice to Have:
- Experience with Firebase, local storage (Hive/SharedPreferences), or GraphQL.
- Knowledge of animations, custom widgets, and performance optimization techniques.
- Exposure to native Android/iOS development.

Job Type: Full-time
Pay: ₹300,000.00 - ₹500,000.00 per year
Ability to commute/relocate: Salt Lake, Kolkata, West Bengal: Reliably commute or plan to relocate before starting work (Required)
Application Question(s): If selected, within how many days can you join us?
Education: Bachelor's (Preferred)
Experience: Flutter: 1 year (Required); BLoC: 1 year (Required)
Work Location: In person
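The posting above asks for test cases and TDD familiarity. A minimal sketch of the test-first style, shown here in Python rather than Dart purely for illustration (the function and its behavior are invented for the example):

```python
# TDD illustration: the assertions below are written first and drive the
# implementation. The example function parses an Indian salary range string
# like "3 - 5 Lacs" into numeric bounds.

def parse_lacs_range(text):
    """Parse an 'X - Y Lacs' range into a (low, high) tuple of floats."""
    cleaned = text.replace("Lacs", "").strip()
    low, high = (part.strip() for part in cleaned.split("-"))
    return float(low), float(high)

# The "red" tests, made "green" by the implementation above.
assert parse_lacs_range("3 - 5 Lacs") == (3.0, 5.0)
assert parse_lacs_range("0.0 - 1.0 Lacs") == (0.0, 1.0)
```

In Dart the same cycle would be driven by the `test` package, but the discipline (write the failing assertion first, then the code) is identical.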

Posted 17 hours ago

Apply

3.0 - 7.0 years

0 Lacs

Mysore, Karnataka, India

On-site

Department: Middle Office
Location: Mysore

ThoughtFocus is a privately held global technology and consulting firm founded in 2004, headquartered in Brookfield, Wisconsin, USA. The company specializes in digital services and technology-enabled operations, primarily serving the financial services, manufacturing, higher education, and public sectors.

What You'll Do:
- Handle daily transaction management; create and manage security masters across different asset classes; manage reference data and asset servicing.
- Help project POCs onboard clients onto the client's platform from a middle-office perspective: define scope, integrate with the client's OMS, test with proper sign-off, and help establish workflows.
- Perform day-to-day operations, including exception resolution and timely responses to close the loop with stakeholders.
- Coordinate with client ops teams, investment managers, and data vendors to resolve day-to-day exceptions, as well as develop more strategic initiatives.
- Consult with other internal functions, such as pricing and trade accounting, to resolve discrepancies and respond to internal queries.
- Create and maintain proper documentation of new and established workflows.
- Work with technology teams as needed, assisting with special projects, bespoke reports, specifications, product implementation, and UAT.

Hands-on Experience In:
- Trade life cycle management
- Trade booking
- Trade affirmation
- Trade reconciliation
- P&L reporting

Strong understanding of financial products:
- Equities
- Fixed Income
- Derivatives (futures, options, swaps)

What You'll Need:
- 3-7 years of experience in the financial services industry and very strong knowledge of financial products across asset classes: equities, fixed income, commodities, FX, and credit
- An MBA and/or CFA, preferably
- An in-depth understanding of the various stages of the trade life cycle
- An understanding of data sources such as Refinitiv, Bloomberg, IDC, and Markit
- Experience integrating with different trade execution systems
- Strong analytical skills, critical thinking, and attention to detail
- The ability to take initiative and think independently, along with solid organizational skills
- Strong people skills
- Effective oral and written communication skills
- The ability to work under pressure and take on additional operational responsibilities
- Proficiency in Microsoft Office applications and SQL
- Experience in VBA/Python is a big plus
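Trade reconciliation, one of the core duties listed above, amounts to matching two trade records and surfacing breaks as exceptions. A simplified sketch (the record fields and tolerance are hypothetical, not from any specific platform):

```python
# Simplified trade reconciliation: match internal trades against counterparty
# records by trade id and flag quantity or price discrepancies as exceptions.

def reconcile(internal, counterparty):
    """Return a list of (trade_id, reason) exceptions; an empty list means clean."""
    cp_by_id = {t["id"]: t for t in counterparty}
    exceptions = []
    for trade in internal:
        match = cp_by_id.get(trade["id"])
        if match is None:
            exceptions.append((trade["id"], "missing at counterparty"))
        elif trade["qty"] != match["qty"]:
            exceptions.append((trade["id"], "quantity break"))
        elif abs(trade["price"] - match["price"]) > 1e-9:
            exceptions.append((trade["id"], "price break"))
    return exceptions

internal = [{"id": "T1", "qty": 100, "price": 101.5},
            {"id": "T2", "qty": 50, "price": 99.0}]
counterparty = [{"id": "T1", "qty": 100, "price": 101.5},
                {"id": "T2", "qty": 60, "price": 99.0}]
print(reconcile(internal, counterparty))  # [('T2', 'quantity break')]
```

Production middle-office tooling adds settlement dates, FX conversion, and tolerance bands, but the exception-listing shape is the same.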

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum experience required: 5 years
Educational qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Must-have skills: Google BigQuery. Good-to-have skills: SSI: No Technology Specialization; NON SSI: none listed.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX, i.e., non-negotiable)

Job Requirements:
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL (No FLEX)
3. Experience with data integration and migration projects
4. Proficiency in BigQuery SQL (No FLEX)
5. Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes; experience with cloud solutions, mainly data platform services; GCP certifications
6. Experience in shell scripting, Python (No FLEX), Oracle, and SQL

Technical Experience:
1. Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools used to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision, or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and work as an individual contributor.
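The requirements above name Python data-structure fluency (dictionary, list, tree) alongside pytest. A small sketch of a dictionary-based tree with assert-style checks of the kind pytest collects (the tree shape and field names are invented for the example):

```python
# A minimal tree built from nested dicts, with a recursive depth-first sum --
# the sort of data-structure exercise the requirements above allude to.

def tree_sum(node):
    """Sum the 'value' fields of a dict-based tree, depth first."""
    total = node["value"]
    for child in node.get("children", []):
        total += tree_sum(child)
    return total

tree = {"value": 1, "children": [
    {"value": 2, "children": [{"value": 4}]},
    {"value": 3},
]}
assert tree_sum(tree) == 10  # under pytest this would live in a test_ function
```

In a real pipeline the same traversal pattern shows up when walking nested JSON from Pub/Sub messages or BigQuery `STRUCT`/`ARRAY` results.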

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and of configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating-system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 (basic and intermediate) troubleshooting.
Must-have skills: ServiceNow
Good-to-have skills: ServiceNow IT Service Management
Minimum experience required: 5 years
Educational qualification: 15 years of full-time education

Summary: We are seeking a skilled ServiceNow Developer to support ServiceNow implementation and upgrade projects for end customers. This is a remote position with occasional travel (up to 25%) for meetings. The ideal candidate is a self-motivated, solution-oriented professional with a strong technical background in ServiceNow, committed to maintaining platform stability and adhering to development best practices. You will join a growing ServiceNow Elite Partner with a global footprint and a strong focus on ITSM and ITOM implementation and services.

Roles & Responsibilities:
- Design, configure, develop, and implement baseline and custom applications in ServiceNow
- Collaborate with stakeholders to customize core applications, including Incident, Problem, Change, Service Request, and CMDB
- Develop and maintain advanced modules such as Customer Service Management, Field Service Management, and Service Portal
- Create and manage integrations using Integration Hub and REST/SOAP web services
- Provide architectural guidance and ensure adherence to ServiceNow development standards
- Perform administrative tasks on the platform, including user/group management, ACLs, catalog items, and UI actions
- Keep up to date with the latest ServiceNow features and best practices
- Deliver scalable and maintainable solutions aligned with customer requirements
- Participate in client meetings and work with cross-functional teams

Professional & Technical Skills

Required Skills:
- 5+ years of hands-on experience with ServiceNow development, configuration, and administration
- Experience with core ITSM modules and ServiceNow apps such as CMDB, Incident, Problem, Change, and Request Management
- Expertise in ServiceNow integration using REST/SOAP web services and Integration Hub
- Solid understanding of user administration, ACLs, UI Policies, Business Rules, and Catalog Items
- Strong communication and problem-solving skills
- Ability to work independently and manage priorities

Preferred Skills:
- ServiceNow Certified System Administrator (CSA) (required)
- Certified Implementation Specialist and Certified Application Developer (CAD) (preferred)
- 5+ years of experience with web technologies such as JavaScript, Java, XML, HTML, CSS, AJAX, and HTTP
- Experience developing orchestration and automation workflows
- Expertise in relational databases such as MS SQL Server or Oracle
- Experience working in medium to large enterprises (7,500+ employees)
- Industry experience in healthcare, public sector, financial services, retail, or manufacturing

Additional Information:
- 15 years of full-time education
- Opportunity to work with a globally distributed team
- Part of a ServiceNow Elite Partner organization
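The role above centers on REST integrations with ServiceNow. A sketch of how a client would address the ServiceNow Table API (the `api/now/table` path is the platform's standard REST endpoint; the instance name and credentials here are placeholders, and the request is built but deliberately not sent):

```python
# Construct, but do not send, a ServiceNow Table API request for incidents.
import base64
import urllib.request

instance = "dev00000"  # hypothetical instance name
url = f"https://{instance}.service-now.com/api/now/table/incident?sysparm_limit=5"
token = base64.b64encode(b"user:password").decode()  # basic-auth placeholder

request = urllib.request.Request(url, headers={
    "Accept": "application/json",
    "Authorization": f"Basic {token}",
})
# urllib.request.urlopen(request) would return JSON with the records under
# a top-level "result" key.
print(request.get_full_url())
```

Integration Hub wraps this kind of call in reusable spokes, but understanding the raw endpoint and `sysparm_*` query parameters helps when debugging integrations.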

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

[This posting's description is verbatim identical to the Data Engineer (Google BigQuery) listing above: minimum 5 years' experience, Shift B, individual contributor.]

Posted 17 hours ago

Apply

0 years

0 Lacs

Nellore, Andhra Pradesh, India

On-site

Company Description: Codinglimits brings ideas to reality, shaping future businesses and helping achieve the greatest ambitions. Our work is guided by the belief that combining the best technology and expertise with proper methodology ensures meaningful outcomes. We are committed to providing an exceptional experience for our employees, fostering innovation, and driving success.

Role Description: This is a full-time, on-site role for a NestJS Developer located in Nellore. The NestJS Developer will be responsible for developing and maintaining server-side application logic. The role includes working on API integration, ensuring high performance and responsiveness, and collaborating with front-end developers to integrate user-facing elements. Daily tasks will involve writing reusable, testable, and efficient code, debugging and resolving software defects, and participating in code reviews and team meetings.

Qualifications:
- Proficiency in NestJS and JavaScript/TypeScript
- Experience with API integration and RESTful services
- Understanding of front-end technologies, ideally HTML, CSS, and Angular or React
- Strong knowledge of SQL and database design principles
- Excellent problem-solving skills and attention to detail
- Ability to work collaboratively in a team-oriented environment
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience with agile development methodologies is a plus

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

[This posting's description is verbatim identical to the Data Engineer (Google BigQuery) listing above: minimum 5 years' experience, Shift B, individual contributor.]

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

[This posting's description is verbatim identical to the Data Engineer (Google BigQuery) listing above: minimum 5 years' experience, Shift B, individual contributor.]

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

[This posting's description is identical to the Data Engineer (Google BigQuery) listing above, except the minimum experience required is 3 years.]

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

[This posting's description is identical to the Data Engineer (Google BigQuery) listing above, except the minimum experience required is 3 years.]

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Duck Creek Claims
Good-to-have skills: NA
Minimum experience required: 3 years
Educational qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications function seamlessly to support organizational goals. You will also participate in testing and refining applications to enhance user experience and efficiency, while staying updated on industry trends and best practices to continuously improve your contributions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have: proficiency in Duck Creek Claims.
- Strong understanding of application development methodologies.
- Experience with software testing and debugging techniques.
- Familiarity with database management and data integration processes.
- Ability to work with version control systems and collaborative development tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Duck Creek Claims.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Google BigQuery Good to have skills : Google Cloud Data Services, Microsoft SQL Server Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes. Project Role : Analytics and Modeler Project Role Description : Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Must Have Skills: Google BigQuery Good to Have Skills: No Technology Specialization Job Requirements: Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX) 1: Proven track record of delivering data integration and data warehousing solutions 2: Strong hands-on SQL (No FLEX) 3: Experience with data integration and migration projects 4: Proficient in the BigQuery SQL language (No FLEX) 5: Understanding of cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications 6: Experience in shell scripting, Python (No FLEX), Oracle, SQL Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX) and of Python programming with Pandas and NumPy; deep understanding of data structures such as dictionaries, arrays, lists, and trees; experience with pytest and code coverage preferred 2: Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (No FLEX) 3: Proficiency with tools to automate Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence 4: Open mindset and the ability to quickly adopt new technologies 5: Performance tuning of BigQuery SQL scripts 6: GCP certification preferred 7: Experience working in an agile environment Professional Attributes: 1: Good communication skills 2: Ability to collaborate with different teams and suggest solutions 3: Ability to work independently with little supervision or as part of a team 4: Good analytical and problem-solving skills 5: Good team-handling skills Educational Qualification: 15 years of full-time education Additional Information: Candidate should be ready for Shift B and to work as an individual contributor

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Google BigQuery Good to have skills : Google Cloud Data Services, Microsoft SQL Server Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes. Project Role : Analytics and Modeler Project Role Description : Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Must Have Skills: Google BigQuery Good to Have Skills: No Technology Specialization Job Requirements: Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX) 1: Proven track record of delivering data integration and data warehousing solutions 2: Strong hands-on SQL (No FLEX) 3: Experience with data integration and migration projects 4: Proficient in the BigQuery SQL language (No FLEX) 5: Understanding of cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications 6: Experience in shell scripting, Python (No FLEX), Oracle, SQL Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX) and of Python programming with Pandas and NumPy; deep understanding of data structures such as dictionaries, arrays, lists, and trees; experience with pytest and code coverage preferred 2: Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (No FLEX) 3: Proficiency with tools to automate Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence 4: Open mindset and the ability to quickly adopt new technologies 5: Performance tuning of BigQuery SQL scripts 6: GCP certification preferred 7: Experience working in an agile environment Professional Attributes: 1: Good communication skills 2: Ability to collaborate with different teams and suggest solutions 3: Ability to work independently with little supervision or as part of a team 4: Good analytical and problem-solving skills 5: Good team-handling skills Educational Qualification: 15 years of full-time education Additional Information: Candidate should be ready for Shift B and to work as an individual contributor

Posted 17 hours ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. Join PwC US - Acceleration Center as a Manager of GenAI Data Science to lead innovative projects and drive significant advancements in GenAI solutions. We offer a competitive compensation package, a collaborative work environment, and ample opportunities for professional growth and impact. Years of Experience: Candidates with 8+ years of hands-on experience Responsibilities Lead and mentor a team of data scientists in understanding business requirements and applying GenAI technologies to solve complex problems. Oversee the development, implementation, and optimization of machine learning models and algorithms for various GenAI projects. Direct the data preparation process, including data cleaning, preprocessing, and feature engineering, to ensure data quality and readiness for analysis. Collaborate with data engineers and software developers to streamline data processing and integration into machine learning pipelines. Evaluate model performance rigorously using advanced metrics and testing methodologies to ensure robustness and effectiveness. Spearhead the deployment of production-ready machine learning applications, ensuring scalability and reliability. Apply expert programming skills in Python, R, or Scala to develop high-quality software components for data analysis and machine learning. Utilize Kubernetes for efficient container orchestration and deployment of machine learning applications.
Design and implement innovative data-driven solutions such as chatbots using the latest GenAI technologies. Communicate complex data insights and recommendations to senior stakeholders through compelling visualizations, reports, and presentations. Lead the adoption of cutting-edge GenAI technologies and methodologies to continuously improve data science practices. Champion knowledge sharing and skill development within the team to foster an environment of continuous learning and innovation. Requirements 8-10 years of relevant experience in data science, with significant expertise in GenAI projects. Advanced programming skills in Python, R, or Scala, and proficiency in machine learning libraries like TensorFlow, PyTorch, or scikit-learn. Extensive experience in data preprocessing, feature engineering, and statistical analysis. Strong knowledge of cloud computing platforms such as AWS, Azure, or Google Cloud, and data visualization techniques. Demonstrated leadership in managing data science teams and projects. Exceptional problem-solving, analytical, and project management skills. Excellent communication and interpersonal skills, with the ability to lead and collaborate effectively in a dynamic environment. Preferred Qualifications Experience with object-oriented programming languages such as Java, C++, or C#. Proven track record of developing and deploying machine learning applications in production environments. Understanding of data privacy and compliance regulations in a corporate setting. Relevant advanced certifications in data science or GenAI technologies. Nice To Have Skills Experience with specific tools such as Azure AI Search, Azure Document Intelligence, Azure OpenAI, AWS Textract, AWS Open Search, and AWS Bedrock. Familiarity with LLM backed agent frameworks like Autogen, Langchain, Semantic Kernel, and experience in chatbot development. Professional And Educational Background Any graduate /BE / B.Tech / MCA / M.Sc / M.E / M.Tech /Master’s Degree /MBA
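The preprocessing and feature-engineering work this role describes can be illustrated with a minimal sketch. The function below is a hypothetical example (not PwC code, and written with the standard library only): z-score standardization, a routine feature-engineering step that rescales a numeric column to mean 0 and unit variance before model training.

```python
import math

# Hypothetical illustration of one preprocessing step named in the posting:
# z-score standardization, which rescales a feature so that models treat
# features measured on different scales comparably.

def standardize(values):
    """Return values rescaled to mean 0 and unit variance (population std)."""
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    std = math.sqrt(variance)
    if std == 0:           # constant feature: nothing to rescale
        return [0.0 for _ in values]
    return [(v - mean) / std for v in values]

scaled = standardize([10.0, 20.0, 30.0])
# the rescaled values sum to ~0, and the middle value maps exactly to 0.0
```

In practice this is the same operation scikit-learn's `StandardScaler` performs across a feature matrix.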

Posted 17 hours ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

This position is ON-SITE, working in the office 5 days/week. You'll be well placed in our dynamic and friendly Development Team, situated in the prestigious Silver Utopia Building, Chakala, Andheri East, Mumbai – 400099, Maharashtra. You will make significant contributions to the mobile payments industry by bringing engineering excellence in innovating new features and functions in customer-centric applications. You will analyse product requirements, design implementations, and participate in integration testing. You will be part of a very innovative, dedicated, and high-performance team and need to match their level of excellence and thoroughness in delivering world-class banking applications. Requirements of the position The incumbent is expected to: Always consider Ziksu’s objectives first when undertaking all duties. Communicate effectively, model integrity and respect in all interactions. Operate within Ziksu’s framework, policies and procedures and ensure effective transparency and accountability in all activity. Collaborate with staff in other departments to achieve common goals and best practice. Facilitate business improvements as appropriate. Demonstrate analytical problem-solving skills, customer focus and alignment with Ziksu’s strategic objectives. Support organisational change and continuous improvement by actively contributing to achieving Ziksu’s vision, mission, and priorities. Comply with the requirements of Ziksu’s Code of Conduct and all relevant legislation including EEO, OSH and Records Management. Role specific responsibilities Design, develop, and maintain high-performance, secure backend applications using Java and Spring Boot. Build and manage RESTful APIs and microservices for fintech and payment systems, ensuring scalability, reliability, and compliance. Lead the design of system architecture and database schema for complex, data-driven applications.
Continuously optimize code and infrastructure for performance, scalability, maintainability, and cost-efficiency. Collaborate cross-functionally with frontend, DevOps, QA, UI/UX, and product teams to ensure smooth integration and delivery. Oversee integration with third-party services (payment gateways, banks, KYC/AML APIs), including sandbox/live environment testing and compliance validation. Lead security planning , ensure adherence to industry security standards (OWASP, PCI-DSS) , and participate in code audits and threat modelling. Perform and oversee code reviews , ensuring best practices, code quality, maintainability, and knowledge sharing across the team. Take ownership of CI/CD pipeline setup and maintenance , ensuring fast, safe, and automated release cycles. Proactively monitor production systems , manage logs, and coordinate incident responses to minimize downtime and improve system observability. Mentor junior developers, delegate tasks, and assist in managing timelines, technical blockers, and quality deliverables. Document and maintain detailed technical specifications including system architecture, design decisions, API contracts , and technical how-to’s. Required Skills 7–10 years of strong experience with Java , Spring Boot , and REST API development . Proficiency with Oracle DB , MySQL , or PostgreSQL , including complex queries and database performance optimization. Deep understanding of backend architecture , design patterns , data structures , and algorithms . Strong hands-on experience with Docker , Jenkins , and modern CI/CD pipelines; Kubernetes experience is a plus. Experience in cloud-based deployments , ideally on Oracle Cloud or similar platforms (AWS, Azure, GCP). Expertise in Git , version control workflows, and collaboration tools. Proven experience in debugging , profiling , and performance tuning of backend applications. 
Knowledge of security best practices , encryption , authentication/authorization flows , and regulatory compliance in fintech. Practical understanding of Agile methodologies , including Scrum or Kanban. Strong written and verbal communication skills, with the ability to write technical documentation and communicate effectively with stakeholders. Bonus: Exposure to test automation tools (e.g., Selenium , JUnit , Postman ) and experience with DevSecOps practices. Qualifications & experience: Bachelor’s / master’s degree in engineering (Computer Science, Computer Engineering or related) Minimum 7–10 years of experience in Java programming You MUST ANSWER the QUESTIONS attached and PROVIDE YOUR RESUME. Please note: Preference will be given to availability for an IMMEDIATE START (within 7 Days). For any queries, please reach out to work@ziksu.com
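The production-monitoring duty above (monitoring systems, managing logs, coordinating incident response) can be sketched briefly; the snippet is a hypothetical illustration in Python for compactness (the posting's stack is Java/Spring Boot), with an assumed access-log format that is not from the posting: count HTTP 5xx responses to spot elevated server-error rates.

```python
# Hypothetical monitoring helper: count HTTP 5xx responses in access-log
# lines. Assumed line format: "<method> <path> <status>".

def count_server_errors(lines):
    """Return how many log lines record a 5xx (server error) status."""
    errors = 0
    for line in lines:
        parts = line.split()
        if len(parts) == 3 and parts[2].startswith("5"):
            errors += 1
    return errors

log = [
    "GET /api/payments 200",
    "POST /api/payments 502",
    "GET /health 200",
    "POST /api/kyc 500",
]
# two of the sample lines are 5xx responses
```

A real deployment would feed such a check from centralized logging and alert when the error count crosses a threshold.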

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Location: Mumbai—locals only. Experience: 3+ years Budget: Open Competitive Market rate [always keep it low] Interview Mode: 1st Round - Virtual, 2nd/3rd - compulsory face-to-face; may have more than 3 rounds. Required Details Total Experience Relevant Experience Designation Current Company Current CTC Expected CTC Notice Period Current Location Expected Location Offer In Hand PAN Number DOB Reason for Job Change Education Passed-out Percentage Highest Degree Interview Availability: Any 3 slots University UAN Number mandate JD Experience: 3+ years This person will be working with the Infrastructure Engineering team as part of the Netops team and will be supporting numerous critical services like alert management and BAU L2 tasks, including incidents, firewall requests, load balancer requests, changes, troubleshooting and on-call support. Desired Experience Provide technical expertise in cloud provisioning for networking and platforms. Drive technical discussions across the infrastructure, application development and support teams toward cloud infrastructure deployments (IaaS, PaaS, SaaS). Interface with cloud service providers to remediate integration-related technical challenges. Effectively communicate and consult with business users, developers and the infrastructure team to design innovative cloud-based infrastructure solutions, including re-architecture and migration of existing on-prem applications, which affordably meet business SLAs. Create and present professional-grade deliverables such as timeline and project deliverable documents, technical drawings, and presentations. Keep pace with emerging tools, techniques, and products for cloud platforms, hosting Windows and Linux based web applications, Kubernetes, and elastic compute environments. Provide support for level 1 and 2 incident and problem management and perform advanced troubleshooting. Ensure that all change requests and incidents are processed within the agreed SLA.
Core expertise in Routing & Switching Intermediate knowledge of firewall troubleshooting (Cisco + Palo Alto). Experience in one of the cloud technologies: Microsoft Azure/AWS/GCP. Excellent organizational, interpersonal, and communication skills, including fluency in English. Demonstrated success in defining and executing migrations to cloud-based infrastructure. Experience implementing and integrating existing processes with IaaS, PaaS, SaaS offerings from major cloud providers (Azure, AWS etc.) Experience with at least one of the scripting languages: PowerShell, Bash, Python Experience with IaC such as ARM, Terraform, CloudFormation would be a plus. Requires knowledge of network concepts, standards, and protocols Requires knowledge of operating system, database and application management Independent worker with proven ability to simultaneously execute multiple projects in a demanding environment. Experience in application and network security design is a plus.
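The SLA-tracking duty above ("ensure that all change requests and incidents are processed within the agreed SLA") is the kind of check an L2 engineer often scripts. The sketch below is a hypothetical illustration, not from the posting; the severity-to-hours mapping is an assumption.

```python
from datetime import datetime, timedelta

# Assumed SLA windows per incident severity (illustrative values only).
SLA_HOURS = {"P1": 4, "P2": 8, "P3": 24}

def breached(incidents, now):
    """Return ids of incidents open longer than their severity's SLA."""
    out = []
    for inc in incidents:
        limit = timedelta(hours=SLA_HOURS[inc["severity"]])
        if now - inc["opened"] > limit:
            out.append(inc["id"])
    return out

now = datetime(2024, 1, 1, 12, 0)
incidents = [
    {"id": "INC-1", "severity": "P1", "opened": datetime(2024, 1, 1, 6, 0)},
    {"id": "INC-2", "severity": "P3", "opened": datetime(2024, 1, 1, 0, 0)},
]
# INC-1 has been open 6h against a 4h P1 window, so only it is flagged
```

In practice the incident list would come from the ticketing system's API rather than a hard-coded sample.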

Posted 17 hours ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Management Level: 07 - Manager Location: Bangalore/Gurgaon/Pune/Mumbai Must have skills: Business Process Consulting Additional Skills: Problem definition, Architecture, Design, R&D, Innovation management, PLM, BOM, Digital Twin and Thread space, Process Excellence, Digital Transformations, SAP PLM Packages (PDM functional knowledge and configuration, PPM, Master Data, EBOM, PS) with strong functional and implementation knowledge Job Summary Looking for a self-driven and seasoned Senior Manager/Manager with exceptional skills in coordinating, organizing and supporting execution of transformations/improvements in PLM programs for our clients, and to build and grow the Engineering and R&D Digitization team. As Senior Manager/Manager in Engineering and R&D Digitization, you will need to work closely with leadership to define and deliver in the areas of PLM Enablement, BOM Management, Master Data Management and Digital Twin & Thread. Roles & Responsibilities Key responsibilities include: Lead Engineering and R&D transformation programs to drive innovation and process enablement for clients Lead and curate relevant assets and offerings in the PLM Enablement, Integrated BOM, Product & Engineering Master Data Management and Digital Twin and Thread areas, and develop and execute Go To Market for the same along with leadership In-depth understanding of Product Data Management, able to drive the product journey with capabilities in defining PLM roadmap, process design, value realization and PLM maturity assessment Experience in Master/Material Data Management and data migration tools and solutions that meet our clients’ needs in innovative ways Enabling transformation in R&D utilizing SAP PLM capabilities by creating business processes for package/product design, Bill of Material management, Engineering Change Management, product research, simulations, prototyping, product testing (qualitative & quantitative) and supplier integration.
Professional & Technical Skills At least 10 years of experience in Business Process Consulting, problem definition, Architecture/Design/Detailing of Processes At least 7 years of experience in SAP PLM Packages (PDM functional knowledge and configuration, PPM, Master Data, EBOM, PS) with strong functional and implementation knowledge as well as general Project Management and Customer Management skills. At least 6 years of industry experience with SAP PLM package implementations, including strong knowledge of configuration, Agile architecture and all its components. Experience in Classification Migration, Master Data Cleansing and Engineering Master Data is preferred. At least 5 years of experience in Configuration/solutions evaluation/Validation and deployment Project Management experience with strong communication and teamwork skills Ability to work in a Global Environment using an Onshore Offshore model Sensitivity and skill at working with different cultures and styles Rapidly learn and apply new engineering technologies and exposure to other PLM tools Additional Information Experience of working in the PLM, BOM, Master Data Management and Digital Twin and Thread space Expert in SAP PLM, Process Excellence, Data Governance, Digital Transformations and shaping end-to-end Engineering Transformations Concrete experience leading complex PLM Solution Design across multiple industries Ability to work in a rapidly changing environment where continuous innovation is desired. Analytical and quantitative skills and the ability to use hard data and metrics to back up assumptions and develop business cases. Ability to clearly communicate these data insights to others. General Manager/owner mentality, working closely with the team to deliver About Our Company | Accenture

Posted 17 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Oracle SFP (Student Financial Planning) Professionals in the following areas: Experience 5-8 Years Job Description We are looking to hire an Oracle Student Financial Planning (SFP) Implementation/Support Consultant with 5 to 10 years of experience. The Oracle Student Financial Planning (SFP) Implementation Consultant will support the deployment and optimization of Oracle’s cloud-based financial aid solution for US universities. This role involves working closely with financial aid officers, IT teams, and university leadership to ensure the system meets institutional and regulatory requirements. The consultant is also expected to support the existing SFP system. Roles Implementation Specialist: Leads or supports the deployment of Oracle SFP. Functional Consultant: Translates university financial aid processes into Oracle SFP configurations. Advisor to Stakeholders: Collaborates with financial aid officers, IT teams, and leadership. Trainer and Supporter: Provides training and post-implementation support to users. Responsibilities Collaborate with university departments to gather financial aid requirements. Configure Oracle SFP to align with institutional policies and federal regulations. Assist in migrating legacy financial aid data into Oracle SFP. Conduct system testing, troubleshoot issues, and validate configurations. Ensure compliance with U.S. federal financial aid regulations.
Create documentation for configurations, processes, and training materials. Support change management during the transition from legacy systems. Maintain clear communication with project stakeholders. Skills Expertise in Oracle Cloud Applications, especially Oracle SFP. Knowledge of U.S. federal financial aid regulations (e.g., FAFSA, Pell Grants). Strong project management and organizational skills. Analytical thinking and problem-solving abilities. Effective verbal and written communication skills. Good understanding of data integration and cloud infrastructure. Adaptability to dynamic university environments. Customer Management Required Technical/Functional Competencies Has working knowledge of customers' business domains and technology suite. Uses the latest technology, proactively suggests solutions to increase business, and understands the customer's business. Projects Documentation Has in-depth understanding of the documentation involved in a project, like BBP & Solution Design, FS, etc. Able to build the required project documentation and can do a peer review of team members' documents. Domain/Industry Knowledge Working knowledge of customers' business processes and relevant technology platforms/products. Ability to prepare process maps, workflows, and business cases with application of industry standards and practices. Creation of medium to complex business models. Functional Design Working knowledge of high-level scope analysis, solution design processes, implementation and integration approaches, as well as cross-functional processes. Able to understand and design processes, identify key business drivers, translate use cases into diagrams, update design specifications, and design modular, flexible solutions meeting business requirements. Understand the overall solution's integrity, application behavior, and business rules, providing input for technical components, data design, and prototype creation. Can coordinate process playbacks and prototype reviews with stakeholders and configure applications for realizing business solutions. Requirement Gathering And Analysis Working knowledge of requirement management processes and requirement analysis processes, tools & methodologies. Able to analyze the impact of a change request/enhancement/defect fix and identify dependencies or interrelationships among requirements and transition requirements for the engagement. Test Management Able to perform unit testing, perform comparison testing for rehosting, report testing status, and create iteration and system integration test plans and develop integration test cases as required. Execute automation test scripts/manual test cases as per the test plan, record findings, and run DIT test cases against the baseline code provided. Identify, report and document defects, and perform defect fixes, documenting deviations from expected results. Create test cases, test scenarios and test data and perform Development Integration Testing. Accountability Required Behavioral Competencies Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team. Collaboration Shares information within the team, participates in team activities, asks questions to understand other points of view. Agility Demonstrates readiness for change, asking questions and determining how changes could impact own work. Customer Focus Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations. Communication Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision. Drives Results Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets. Resolves Conflict Displays sensitivity in interactions and strives to understand others’ views and concerns.
Certifications Mandatory At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles Flexible work arrangements, Free spirit, and emotional positivity Agile self-determination, trust, transparency, and open collaboration All Support needed for the realization of business goals, Stable employment with a great atmosphere and ethical corporate culture

Posted 17 hours ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description:

Role Purpose
The purpose of this role is to provide solutions and bridge the gap between technology and business know-how to deliver any client solution.

Do
1. Bridging the gap between project and support teams through techno-functional expertise
- For a new business implementation project, drive the end-to-end process from business requirement management to integration & configuration and production deployment
- Check the feasibility of new change requirements and provide an optimal solution to the client with clear timelines
- Provide techno-functional solution support for all new business implementations while building the entire system from scratch
- Support the solutioning team through architectural design, coding, testing, and implementation
- Understand the functional design as well as the technical design and architecture to be implemented on the ERP system
- Customize, extend, modify, localize, or integrate with the existing product by means of coding, testing, and production
- Implement the business processes and requirements on the underlying ERP technology to translate them into ERP solutions
- Write code as per the development standards and decide upon the implementation methodology
- Provide product support and maintenance to clients for a specific ERP solution and resolve the day-to-day queries/technical problems that may arise
- Create and deploy automation tools/solutions to ensure process optimization and increased efficiency
- Sync the technical and functional requirements of the project and provide solutioning/advice to the client or internal teams accordingly
- Support the on-site manager with the necessary details w.r.t. any change and off-site support

2. Skill upgradation and competency building
- Clear Wipro exams and internal certifications from time to time to upgrade skills
- Attend trainings and seminars to sharpen knowledge in the functional/technical domain
- Write papers, articles, and case studies and publish them on the intranet

Deliver
No. | Performance Parameter | Measure
1. | Contribution to customer projects | Quality, SLA, ETA, no. of tickets resolved, problems solved, no. of change requests implemented, zero customer escalations, CSAT
2. | Automation | Process optimization, reduction in processes/steps, reduction in no. of tickets raised
3. | Skill upgradation | No. of trainings & certifications completed, no. of papers and articles written in a quarter

Mandatory Skills: Oracle Apps SCM Functional
Experience: 8-10 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention – of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must Have Skills: SAP CX Service Cloud C4C
Good to Have Skills: SAP Sales and Distribution (SD)
Minimum 5 Year(s) of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications align with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Strong SAP technical, configuration, and business area knowledge in the S/4HANA CS module (Service Management) and the S/4HANA Service solution.
- In-depth knowledge of repairs processing, goods receipt, storage location and tracking, integration with SAP Sales Order processing, and the Finance aspects of tracking repair costs, repair details, and the final invoice to the customer.
- Equipment master setup and tracking, integration with warranty tracking, and coordination with the SD and FI modules for reporting of warranty recognition.
- Very good knowledge of service work orders and their integration throughout the repairs process.
- Knowledge of service labor posting and integration with a timekeeping system; good knowledge of material issues and postings.
- Knowledge of S/4HANA integration with Field Service Management is strongly desired.
- Translate users' requests into application system solutions.
- Resolve business issues by working with various groups within the company.
- Redesign procedures to suggested best business practices in S/4HANA.
- We are also looking for people with integration capabilities and cross-modular capability in SAP SD and MM/LE.
- Strong understanding of and experience in integrating CS and SD with the Material Management (MM), Logistics Execution (LE), and Financial Accounting (FI) modules.

Professional & Technical Skills:
- Excellent communication skills.
- Ability to work in a customer environment and in coordination with a remote project team.
- Ability to build good training documentation.
- Analytical skills to frame problem statements and arrive at solution statements.
- Experience managing tasks and deliverables across the different phases of an SAP implementation project.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP CX Service Cloud C4C.
- This position is based in Pune.
- A 15 years full time education is required.

Posted 17 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the team and your role
We are currently looking for integration consultants who are passionate about integration and understand what it takes to deliver top-quality integration solutions to our clients and partners. You have an eagle eye for identifying integration challenges and the ability to translate those business challenges into the best integration solutions. You can listen, stay calm under pressure, and be the linking pin between business and IT. You have seen integrations in different shapes, sizes, and colours; you can integrate any-to-any, either on-premise or cloud. You advise our clients on the best integration strategies based on best practices, analysts' recommendations, and architecture patterns. Last but not least, to be successful in this position we expect you to apply your strong consultancy and integration skills. Part of your responsibilities is to support the different development teams during the entire lifecycle of an interface: requirements gathering, analysis, design, development, testing, and handover to operational support.

We are looking for experienced Enterprise Integration Consultants to join our team. The ideal candidate has:
- Strong knowledge of integration principles and the consultancy skills to translate business requirements into IT requirements.
- Hands-on experience integrating SAP and non-SAP systems in A2A and B2B scenarios.
- Deep expertise and hands-on experience in Boomi as the primary integration platform.
- Working knowledge of API Management platforms (e.g., SAP API Management, Apigee, or others).
- Familiarity with event-driven architecture and distributed streaming platforms like Solace or Confluent Kafka.
- In-depth technical, functional, and architectural expertise in integrating applications using technologies such as, but not limited to, REST, SOAP, ALE-IDoc, EDI, RFC, XI, HTTP, JDBC, File/FTP, Mail, and JMS.
- Solid middleware knowledge and web service skills.
- Good understanding of REST APIs and Web Services.
- Extensive experience integrating third-party applications using REST-based services.
- Experience using tools like Postman and SoapUI for service testing.
- Proven experience with full life cycle integration implementation or rollout projects.
- Demonstrated experience with deliverables planning, client-facing roles, and high-pace environments.

What is Rojo all about?
Founded in 2011, Rojo Integrations has transformed from a consulting firm into a comprehensive SAP integration leader, partnering with top software vendors like SAP, Coupa, SnapLogic, and Solace. As the leading SAP integration partner and ultimate expert, we provide seamless enterprise integration and data analytics solutions, enabling real-time insights and empowering digital transformation. Trusted by global blue-chip companies such as Heineken and Siemens, we deliver tailored services to meet unique business needs. Rojo is headquartered in the Netherlands and operates globally from its offices in the Netherlands, Spain, and India. We specialize in SAP integration modernization and business processes, improving data integration and business strategies. Our 360-degree portfolio includes consultancy, software development, and managed services to streamline integration, enhance observability, and drive growth.

Requirements to succeed in this role
- Experience using Boomi, SAP PO, SAP Cloud Integration, SnapLogic, and/or API Management.
- A quick learner, able to adapt to new tools and technologies and evaluate their applicability.
- A team player with good technical, analytical, and communication skills and a client-driven mindset.
- A bright mind and the ability to understand a complex platform.
- Ability to understand technical/engineering concepts and to learn integration product functionality and applications.
- Demonstrated user-focused technical writing ability; must be able to communicate complex technical concepts clearly and effectively.
- Strong analytical and problem-solving skills.
- Ability to work independently in a dynamic environment.
- Ability to work on multiple complex projects simultaneously.
- Strong interpersonal communication skills; communicates effectively in one-to-one and group situations.
- At least three years of previous experience in a similar role.

Additional desired skills:
- At least a Bachelor's degree in computer engineering or a related field.
- Experience with any API Management platform.
- Experience with distributed streaming platforms and event-based integration architectures such as Kafka or Solace.
- Extensive experience integrating SAP and non-SAP systems in A2A and B2B scenarios using SAP Integration Suite or Cloud Integration (CPI).
- Experience integrating with the main SAP backend systems (SAP ERP, SAP S/4HANA, SAP S/4HANA Cloud).
- SAP PO experience programming UDFs, modules, look-ups (RFC, SOAP, JDBC), BPM, and inbound and outbound ABAP proxies.
- Extensive knowledge of Java, JavaScript, and/or Groovy script.
- Good understanding of CI/CD concepts.
- Fluent spoken and written English.
- Affinity and experience with integration platforms/software like Boomi, SAP Cloud Integration, or SnapLogic is desirable.

What do we offer?
- The chance to gain work experience in a dynamic and inspiring environment and launch your career.
- Plenty of growth opportunities while working in a high-energy, fun environment.
- The opportunity to work on innovative projects with colleagues who are genuinely proud of their contribution.
- Training and mentoring to support your professional development, with a yearly education budget.
- An international atmosphere with a multicultural environment (around 20 nationalities).
- A global, inclusive, and diverse working climate within a world-conscious organization.
- Plus other exciting benefits specific to each region.

Rojo is committed to achieving diversity & inclusion in terms of gender, caste, race, religion, nationality, ethnic origin, sexual orientation, disability, age, pregnancy, or other status. All qualified candidates are encouraged to apply. No one fits a job description perfectly, and there is no such thing as the perfect candidate. If you don't meet all the criteria, we'd still love to hear from you. Does that spark your interest? Apply now.
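The A2A and B2B scenarios described in this role usually come down to payload mapping between formats such as IDoc XML and REST JSON. A minimal sketch in Python of that kind of transformation (the segment and field names follow the SAP ORDERS IDoc convention, but the payload itself is hypothetical, not a real interface):

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical, simplified ORDERS IDoc-style payload: one header
# segment (E1EDK01) and repeating item segments (E1EDP01).
IDOC_XML = """
<ORDERS>
  <E1EDK01><BELNR>4500012345</BELNR><CURCY>EUR</CURCY></E1EDK01>
  <E1EDP01><POSEX>10</POSEX><MENGE>5</MENGE></E1EDP01>
  <E1EDP01><POSEX>20</POSEX><MENGE>2</MENGE></E1EDP01>
</ORDERS>
"""

def idoc_to_json(xml_text: str) -> str:
    """Map the IDoc-style XML to the JSON body a REST endpoint might expect."""
    root = ET.fromstring(xml_text)
    header = root.find("E1EDK01")
    body = {
        "orderNumber": header.findtext("BELNR"),
        "currency": header.findtext("CURCY"),
        "items": [
            {"line": item.findtext("POSEX"),
             "quantity": int(item.findtext("MENGE"))}
            for item in root.findall("E1EDP01")
        ],
    }
    return json.dumps(body)

print(idoc_to_json(IDOC_XML))
```

In a real project the same mapping would typically live in a Boomi map shape or an SAP Cloud Integration Groovy script; the Python here only illustrates the shape of the transformation an integration consultant designs and tests.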

Posted 17 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description:

Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is free of errors, bugs, and test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report to the concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally for a proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests on time, with no instances of complaints either internally or externally

Deliver
No. | Performance Parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Oracle Apps OTM Functional
Experience: 3-5 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention – of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must Have Skills: SAP EWM
Good to Have Skills: NA
Minimum 5 Year(s) of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions to enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Create the functional design document in discussion with the client.
- Collaborate with team members to design and optimize applications.
- Troubleshoot and resolve technical issues in applications.
- Perform functional tests to validate requirements and support integration testing, user testing, and defect resolution.
- Stay updated with industry trends and technologies.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in advanced SAP EWM.
- Strong understanding of advanced SAP EWM functionalities.
- Experience in Yard Management, Labor Management, Dock Appointment Scheduling, Slotting and Rearrangement, and Interlink Management.
- Experience in advanced SAP EWM implementation and customization.
- Knowledge of advanced SAP EWM integration with other SAP modules.
- Hands-on experience in advanced SAP EWM configuration and testing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP EWM.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 17 hours ago

Apply