5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, the problem-solving steps taken, and the total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Snowflake
Experience: 5-8 Years
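The posting's call-log duty (documenting and analyzing call logs to spot the most frequently occurring trends, and tracking successful vs. unsuccessful resolutions) can be sketched in a few lines of Python. The record fields below are hypothetical stand-ins, not taken from any actual ticketing system:

```python
from collections import Counter

# Hypothetical call-log records; in practice these would come from the
# ticketing system (field names are illustrative, not from the posting).
call_logs = [
    {"issue": "login failure", "resolved": True},
    {"issue": "report timeout", "resolved": False},
    {"issue": "login failure", "resolved": True},
    {"issue": "login failure", "resolved": False},
]

# Count the most frequently occurring issue types.
trends = Counter(log["issue"] for log in call_logs)
top_issue, count = trends.most_common(1)[0]

# Track successful vs. unsuccessful resolutions, as the role requires.
resolved = sum(log["resolved"] for log in call_logs)
unresolved = len(call_logs) - resolved

print(top_issue, count)        # most common problem and its frequency
print(resolved, unresolved)    # resolution tally for the SLA report
```

In a real support queue the same `Counter` pattern scales to thousands of entries and feeds directly into the "identify and document the most common problems" step.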
Posted 5 days ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content
b. Design and carry out surveys and analyze survey data as per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with clients as per their requirements

Mandatory Skills: Database Architecting
Experience: 5-8 Years
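Item 1.h (produce and track key performance indicators) is typically a small aggregation step before any dashboarding tool. Here is a minimal pandas sketch; the regions, months, and revenue figures are invented for illustration:

```python
import pandas as pd

# Hypothetical monthly sales records (column names are illustrative).
sales = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "revenue": [120.0, 150.0, 90.0, 110.0],
})

# KPI 1: total revenue per region.
kpi = sales.groupby("region")["revenue"].sum()

# KPI 2: month-over-month revenue growth per region.
pivot = sales.pivot(index="region", columns="month", values="revenue")[["Jan", "Feb"]]
growth = (pivot["Feb"] - pivot["Jan"]) / pivot["Jan"]

print(kpi.to_dict())              # {'North': 270.0, 'South': 200.0}
print(growth.round(3).to_dict())  # growth per region, Jan -> Feb
```

The same two-step shape (aggregate, then derive a ratio or delta) covers most of the KPIs a reporting tool would visualize.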
Posted 5 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Ciklum is looking for a Senior Data Scientist to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts, and product owners, we engineer technology that redefines industries and shapes the way people live. About the role: As a Senior Data Scientist, you will become part of a cross-functional development team working for a healthcare technology company that provides platforms and solutions to improve the management and access of cost-effective pharmacy benefits. Our technology helps enterprise and partnership clients simplify their businesses and helps consumers save on prescriptions. Our client is a leader in SaaS technology for healthcare. They offer innovative solutions with integrated intelligence on a single enterprise platform that connects the pharmacy ecosystem. With their expertise and modern, modular platform, our partners use real-time data to transform their business performance and optimize their innovative models in the marketplace.
Responsibilities:
- Develop prototype solutions, mathematical models, algorithms, machine learning techniques, and robust analytics to support analytic insights and visualization of complex data sets
- Perform exploratory data analysis so you can navigate a dataset and draw broad conclusions from initial appraisals
- Provide optimization recommendations that drive KPIs established by product, marketing, operations, PR, and other teams
- Interact with engineering teams and ensure that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
- Work directly with business analysts and data engineers to understand and support their use cases
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions
- Drive innovation by exploring new experimentation methods and statistical techniques that could sharpen or speed up our product decision-making processes
- Cross-train other team members on technologies being developed, while continuously learning new technologies from other team members
- Contribute to the unit's activities and community building, participate in conferences, and share best practices

Requirements:
We know that sometimes you can't tick every box. We would still love to hear from you if you think you're a good fit!
- 5+ years of development of Data Science solutions, with a proven track record of leveraging analytics to drive significant business impact
- Bachelor's/Master's degree in Mathematics, Statistics, Computer Science, Operations Research, Econometrics, or a related field
- Proven ability to frame and solve business problems through machine learning and statistics
- 4+ years of experience applying various machine learning techniques: regression, classification, clustering, dimensionality reduction, time series prediction, outlier detection, and/or recommendation systems
- Understanding of the advantages and drawbacks of machine learning algorithms, as well as their usage constraints, including performance
- 4+ years of experience in Python development of machine learning solutions and statistical analysis: Pandas, SciPy, Scikit-learn, XGBoost, LightGBM, statsmodels, and/or imbalanced-learn; ML libraries such as TensorFlow and PyTorch; data wrangling and visualization (e.g., Pandas, NumPy, Matplotlib, Seaborn)
- Experience working with large-scale datasets, including time series and healthcare data
- Experience with NLP, deep learning, and GenAI
- Experience diving into data to uncover hidden patterns, and conducting error analysis
- 2+ years of experience in data visualization: Power BI, Tableau, and/or Python libraries like Matplotlib and Seaborn
- Experience with SQL for data processing, data manipulation, sampling, and reporting
- 3+ years of experience creating/maintaining OOP Machine Learning solutions
- Understanding of the CRISP-ML(Q) / TDSP concepts
- 1+ year of experience with MLOps: integration of reliable Machine Learning pipelines in production, Docker, containerization, orchestration
- 2+ years of experience with clouds (AWS, Azure, GCP) and cloud AI and ML services (e.g., Amazon SageMaker, Azure ML)
- Excellent time and project management skills, with the ability to manage detailed work and communicate project status effectively at all levels

Desirable:
- Probability Theory & Statistics knowledge and intuition, as well as understanding of the mathematics behind Machine Learning
- 1+ year of experience in Deep Learning solution development with TensorFlow or PyTorch
- Data Science / Machine Learning certifications, or research experience with published papers
- Experience with Kubernetes
- Experience with the Databricks and Snowflake platforms
- 1+ year of Big Data experience, i.e. Hadoop / Spark
- Experience with NoSQL and/or columnar/graph databases

What's in it for you?
- Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy license, language courses, and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally, and fulfil your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company, and regular team-building events

About us:
India is a strategic growth market for Ciklum. Be a part of a big story created right now. Let's grow our delivery center in India together! Boost your skills and knowledge: create and innovate with like-minded professionals, all within a global company with a local spirit and start-up soul.
Supported by Recognize Partners and expanding globally, we will engineer the experiences of tomorrow! Be bold, not bored! Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.
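To give a flavor of the Python ML stack the requirements name (Scikit-learn for classification, held-out evaluation), here is a minimal, self-contained sketch. The data is synthetic and purely illustrative; nothing here reflects the client's actual healthcare datasets:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic, roughly linearly separable toy data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Standard hold-out split for honest evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit a simple baseline classifier and score it on the held-out split.
model = LogisticRegression().fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"hold-out accuracy: {acc:.2f}")
```

On real problems this baseline is the starting point before the heavier tools the posting lists (XGBoost, LightGBM, deep learning) are justified.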
Posted 5 days ago
5.0 - 8.0 years
7 - 10 Lacs
Kolkata
Work from Office
Role:
The purpose of the role is to provide assurance on the quality of deployment for the assigned accounts and to support in establishing mechanisms that enhance and sustain customer satisfaction levels. The role is expected to support in enhancing customer advocacy by predicting and preventing customer escalations and dissatisfaction, and to drive a culture of continuous improvement in the assigned accounts.

Do:
Implement deployment quality strategy for the assigned accounts
- Provide inputs in the development of strategy for the assigned accounts while considering the quality standards, client expectations, and monitoring mechanisms
- Review and reallocate priorities to align with the overall strategy of the line of business / business unit

Quality control and customer satisfaction
- Support the completion of the Annual Customer Satisfaction Survey by ensuring completion of the survey by the account customers and representatives for the various projects within the account; ensure completion of the survey and address any queries in a timely manner
- Support in conceptualizing the action planning by communicating with clients and interacting with Delivery Managers, vertical delivery heads, and service delivery heads
- Drive the account-wise tracking of action planning identified for sustained CSAT across projects
- Drive the quarterly pulse survey for selected accounts or projects for periodic check-ins
- Support the account leadership teams in tracking and managing client escalations to closure

Early warnings and business partnership
- Drive the implementation of mechanisms for preventing client escalations/dissatisfaction by creating an early warning system in DigiQ covering aspects like delivery quality, delivery schedule, resource constraints, financial issues (overloading of effort / over-run potential), productivity, and slippages on milestones
- Participate in monthly and quarterly business reviews along with business and account leadership to ensure adherence to defined quality processes, define new life cycle models, and ensure gating processes are followed by the projects within the accounts
- Drive the upskilling of delivery teams on quality management tools and knowledge management, and create mechanisms for sharing best practices
- Support the collection of metrics on the performance/health of processes and the regular publishing of compliance and metrics dashboards

Continuous improvement
- Drive a culture of continuous improvement in the assigned accounts to enhance the efficiency and productivity of resources
- Create mechanisms between the projects in the account for sharing knowledge, quality issues, and risk mitigation methods to drive continuous improvement
- Plan and drive year-on-year improvement goals in various projects by way of process streamlining, improvements, and automation, leading to cost savings and/or efficiency
- Support the collection of metrics to show the improvements in efficiency/productivity

Team management
- Clearly define the expectations for the team
- Assign goals for the team, conduct timely performance reviews, and provide constructive feedback to own direct reports
- Guide team members in acquiring relevant knowledge and developing their professional competence
- Drive geography-specific trainings for the quality team, designed basis the statutory norms that apply in different countries
- Ensure that Performance Nxt is followed for the entire team

Employee satisfaction and engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Big Data Consulting
Experience: 5-8 Years
Posted 5 days ago
40.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Job Title: Senior QA Automation Consultant

About Oracle FSGIU - Finergy:
The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Mandatory Skillset:
- Programming languages: proficiency in SQL and Python (Robot Framework) for writing automated tests and conducting data validation
- Experience with test automation frameworks (e.g., Selenium or Postman)
- Experience with Jira or another test management tool
- Cloud knowledge: familiarity with AWS services (e.g., S3, EC2); basic understanding of CI/CD tools like Jenkins, GitLab, and AWS CodePipeline required

Responsibilities:
- Develop and maintain automated test scripts using SQL and Python to validate data accuracy across banking platforms
- Implement and manage test automation frameworks with tools like Selenium and Postman to streamline testing processes and improve efficiency
- Utilize AWS services (e.g., S3, EC2) for testing and ensure applications perform optimally in cloud environments, focusing on scalability and reliability
- Conduct security testing, ensuring adherence to secure SDLC practices and compliance with industry regulations

Good to have:
- Security testing: knowledge of SCAS, SAST, DAST/WAS, and experience within secure SDLC frameworks
- Knowledge of Hadoop and its ecosystem
- Data-oriented testing skills: experience with data warehousing and ETL processes, as well as quality management on platforms like Databricks
- Knowledge of BI tools: Qlik Sense / IBM Cognos
- Test strategy design: expertise in creating test strategies for data governance, data lineage, and compliance testing
- Cross-functional collaboration: demonstrated experience collaborating with data engineering, DevOps, and application development teams

Qualifications
Career Level - IC2

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry-leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process.
If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
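The first responsibility above (automated SQL/Python scripts that validate data accuracy) can be sketched with the standard-library sqlite3 module. The table, columns, and validation rule are hypothetical stand-ins for a real banking platform, not Oracle's actual schema:

```python
import sqlite3

# In-memory stand-in for a banking data store (schema is illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER, amount REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(1, 250.0, "INR"), (2, -75.5, "INR"), (3, 0.0, None)],
)

# Data-quality rule: every row must carry a currency code.
missing = conn.execute(
    "SELECT COUNT(*) FROM transactions WHERE currency IS NULL"
).fetchone()[0]
rows_checked = conn.execute("SELECT COUNT(*) FROM transactions").fetchone()[0]

# An automated test script would assert on `missing` and report violations.
print(f"{missing} of {rows_checked} rows violate the currency rule")
```

In practice the same pattern runs against the real database driver, with each rule wrapped in a test-framework assertion (pytest or Robot Framework, per the skillset above).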
Posted 5 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Oracle Cloud Infrastructure (OCI) is a strategic growth area for Oracle. It is a comprehensive cloud service offering in the enterprise software industry, spanning Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). OCI is committed to providing the best in cloud products to meet the needs of our customers, who are tackling some of the world's biggest challenges.

About The Network Monitoring (NM) Team:
Networking is a mission-critical part of the OCI cloud. Our customers want higher availability, more visibility, greater network security, better network performance and throughput, better capacity planning, root cause analysis, and prediction of failures. We help Oracle Cloud Infrastructure (OCI) build best-in-class cloud monitoring solutions that provide performance monitoring, what-if analysis, root cause analysis, prediction, and capacity planning for Oracle's global cloud network infrastructure. Our mission is to build monitoring services that comprehensively view, analyze, plan, and optimize to scale and operate our networks.

Responsibilities:
We are looking for a Consulting Member of Technical Staff for the OCI Network Monitoring team who has the expertise and passion for solving difficult problems in globally distributed systems, building cloud-native, at-scale network monitoring and analytics solutions, and improving the operational posture of Network Monitoring. You should be comfortable building complex distributed systems that handle huge amounts of data: collecting metrics, building data pipelines, and analytics for real-time, online, and batch processing. If you are passionate about designing, developing, testing, and delivering cloud services, and are excited to learn and thrive in a fast-paced environment, the NM team is the place for you.
Required/Desired Qualifications:
- BS/MS or equivalent in CS or a relevant area
- 10+ years of experience in software development
- 5+ years of experience in developing large-scale distributed services/applications
- Proficiency with Java/Python/C++ and object-oriented programming
- Networking protocol knowledge such as TCP/IP, Ethernet, BGP, OSPF
- Network management technologies such as SNMP, NetFlow, BGP Monitoring Protocol, gNMI
- Excellent knowledge of data structures and search/sort algorithms
- Excellent organizational, verbal, and written communication skills
- Knowledge of cloud computing and networking technologies, including monitoring services
- Operational experience running and troubleshooting large networks
- Experience developing service-oriented systems
- Exposure to Hadoop, Spark, Kafka, Storm, OpenTSDB, Elasticsearch, or other distributed compute platforms
- Experience developing automated test suites
- Experience with Jira, Confluence, Bitbucket
- Knowledge of Scrum and Agile methodologies

Qualifications
Career Level - IC5

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry-leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process.
If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
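The metric work this team describes (collecting samples and computing real-time analytics over them) often reduces to windowed aggregation. A minimal sliding-window sketch follows; the window size and the latency metric are illustrative assumptions, not OCI internals:

```python
from collections import deque

class SlidingWindowAverage:
    """Rolling average over the last `size` samples of a metric
    (e.g., per-interface latency in a network monitoring pipeline)."""

    def __init__(self, size: int):
        # deque with maxlen drops the oldest sample automatically.
        self.window = deque(maxlen=size)

    def add(self, sample: float) -> float:
        """Record one sample and return the current windowed average."""
        self.window.append(sample)
        return sum(self.window) / len(self.window)

# Illustrative latency samples in milliseconds.
avg = SlidingWindowAverage(size=3)
for latency in [10.0, 20.0, 30.0, 100.0]:
    current = avg.add(latency)
print(current)  # average of the last 3 samples: (20 + 30 + 100) / 3 = 50.0
```

Production monitoring systems replace the in-process deque with a streaming platform (Kafka, Spark, etc., per the qualifications list), but the windowing logic is the same idea.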
Posted 5 days ago
1.0 - 2.0 years
3 - 4 Lacs
Gurugram, Bengaluru
Work from Office
About the Role:
Grade Level (for internal use): 08

S&P Global Mobility
The Role: Data Engineer

The Team:
We are the Research and Modeling team, driving innovation by building robust models and tools to support the Vehicle & Powertrain Forecast team. Our work includes all aspects of development of, and ongoing support for, our business line data flows, analyst modelling solutions and forecasts, new apps, new client-facing products, and many other work areas besides. We value ownership, adaptability, and a passion for learning, while fostering an environment where diverse perspectives and mentorship fuel continuous growth.

The Impact:
We are seeking a motivated and talented Data Engineer to be a key player in building the robust data infrastructure and flows that support our advanced forecasting models. Your initial focus will be to create a robust data factory to ensure smooth collection and refresh of actual data, a critical component that feeds our forecast. Additionally, you will assist in developing mathematical models and support the work of ML engineers and data scientists. Your work will significantly impact our ability to deliver timely and insightful forecasts to our clients.

What's in it for you:
- Opportunity to build foundational data infrastructure that directly impacts advanced forecasting models and client delivery
- Exposure to, and support for, the development of sophisticated mathematical models, Machine Learning, and Data Science applications
- Contribute significantly to delivering timely and insightful forecasts, influencing client decisions in the automotive sector
- Work in a collaborative environment that fosters continuous learning, mentorship, and professional growth in data engineering and related analytical fields
Responsibilities:
- Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines for efficient data ingestion, processing, and storage, primarily focusing on creating a data factory for our core forecasting data
- Data Quality and Integrity: Implement robust data quality checks and validation processes to ensure the accuracy and consistency of data used in our forecasting models
- Mathematical Model Support: Collaborate with other data engineers to develop and refine the mathematical logic and models that underpin our forecasting methodologies
- ML and Data Science Support: Provide data support to our Machine Learning Engineers and Data Scientists
- Collaboration and Communication: Work closely with analysts, developers, and other stakeholders to understand data requirements and deliver effective solutions
- Innovation and Improvement: Continuously explore and evaluate new technologies and methodologies to enhance our data infrastructure and forecasting capabilities

What We're Looking For:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- Minimum of 1-2 years of experience in data engineering, with a proven track record of building and maintaining data pipelines
- Strong proficiency in SQL and experience with relational and non-relational databases
- Strong Python programming skills, with experience in data manipulation and processing libraries (e.g., Pandas, NumPy)
- Experience with mathematical modelling and supporting ML and data science teams
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data services
- Strong communication and collaboration skills, with the ability to work effectively in a team environment
- Experience in the automotive sector is a plus

Statement:
S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies, and expertise they need to move ahead.
As part of our team, you'll help solve complex challenges that equip businesses, governments, and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data into help for our clients to understand today's market, reach more customers, and shape the future of automotive mobility.

About S&P Global Mobility:
At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility .

What's In It For You?

Our Purpose:
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business.
We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
---- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ---- 203 - Entry Professional (EEO Job Group) (inactive), 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH203 - Entry Professional (EEO Job Group)
Posted 5 days ago
6.0 - 11.0 years
11 - 16 Lacs
Noida
Work from Office
Data Engineering - Technical Lead
Paytm is India's leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway, where payment aggregation is done through PPI and also other banks' financial instruments. To further enhance merchants' business, Paytm offers merchants commerce services through advertising and the Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners.
About the Role: This position requires someone to work on complex technical projects and closely with peers in an innovative and fast-paced environment. For this role, we require someone with a strong product design sense who is specialized in Hadoop and Spark technologies.
Requirements: Minimum 6+ years of experience in Big Data technologies.
The position: Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day. Brainstorm and create new platforms that can help in our quest to make data available to cluster users in all shapes and forms, with low latency and horizontal scalability. Make changes to our platform while diagnosing any problems across the entire technical stack. Design and develop a real-time events pipeline for data ingestion for real-time dashboarding. Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake. Design and implement new components and various emerging technologies in the Hadoop ecosystem, and ensure successful execution of various projects.
Be a brand ambassador for Paytm: Stay Hungry, Stay Humble, Stay Relevant!
Skills that will help you succeed in this role: Strong hands-on experience with Hadoop, MapReduce, Hive, Spark, PySpark etc. Excellent programming/debugging skills in Python/Java/Scala. Experience with any scripting language such as Python, Bash etc. Good to have experience of working with NoSQL databases like HBase, Cassandra. Hands-on programming experience with multithreaded applications. Good to have experience in databases, SQL, and messaging queues like Kafka. Good to have experience in developing streaming applications e.g. Spark Streaming, Flink, Storm, etc. Good to have experience with AWS and cloud technologies such as S3. Experience with caching architectures like Redis etc.
Why join us: Because you get an opportunity to make a difference, and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. You should work with us if you think seriously about what technology can do for people. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.
Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants, and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!
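The real-time pipeline work described above (windowed aggregation feeding live dashboards via Spark Streaming, Flink, or similar) can be sketched in plain Python. This is a minimal, stdlib-only illustration of a tumbling-window count; the event schema and window size are hypothetical, and a production job would express the same logic in Spark Structured Streaming rather than hand-rolled code:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size_s):
    """Group (timestamp_s, key) events into fixed tumbling windows
    and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # every event falls into exactly one non-overlapping window
        window_start = (ts // window_size_s) * window_size_s
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# hypothetical payment events: (epoch seconds, payment instrument)
events = [(0, "upi"), (3, "wallet"), (7, "upi"), (12, "upi"), (14, "fastag")]
print(tumbling_window_counts(events, 10))
```

The same grouping is what `groupBy(window(col("ts"), "10 seconds"), col("key")).count()` expresses declaratively in Spark, with the engine handling late data and state for you.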
Posted 5 days ago
2.0 - 6.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job Title - Retail Specialized Data Scientist Level 9 SnC GN Data & AI Management Level: 09 - Consultant Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata Must have skills: A solid understanding of retail industry dynamics, including key performance indicators (KPIs) such as sales trends, customer segmentation, inventory turnover, and promotions. Strong ability to communicate complex data insights to non-technical stakeholders, including senior management, marketing, and operational teams. Meticulous in ensuring data quality, accuracy, and consistency when handling large, complex datasets. Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns. Strong proficiency in Python for data manipulation, statistical analysis, and machine learning (libraries like Pandas, NumPy, Scikit-learn). Expertise in supervised and unsupervised learning algorithms. Use advanced analytics to optimize pricing strategies based on market demand, competitor pricing, and customer price sensitivity. Good to have skills: Familiarity with big data processing platforms like Apache Spark, Hadoop, or cloud-based platforms such as AWS or Google Cloud for large-scale data processing. Experience with ETL (Extract, Transform, Load) processes and tools like Apache Airflow to automate data workflows. Familiarity with designing scalable and efficient data pipelines and architecture. Experience with tools like Tableau, Power BI, Matplotlib, and Seaborn to create meaningful visualizations that present data insights clearly. Job Summary: The Retail Specialized Data Scientist will play a pivotal role in utilizing advanced analytics, machine learning, and statistical modeling techniques to help our retail business make data-driven decisions.
This individual will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights. Roles & Responsibilities: Leverage Retail Knowledge: Utilize your deep understanding of the retail industry (merchandising, customer behavior, product lifecycle) to design AI solutions that address critical retail business needs. Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns. Apply machine learning algorithms, such as classification, clustering, regression, and deep learning, to enhance predictive models. Use AI-driven techniques for personalization, demand forecasting, and fraud detection. Use advanced statistical methods to help optimize existing use cases and build new products to serve new challenges and use cases. Stay updated on the latest trends in data science and retail technology. Collaborate with executives, product managers, and marketing teams to translate insights into business actions. Professional & Technical Skills: Strong analytical and statistical skills. Expertise in machine learning and AI. Experience with retail-specific datasets and KPIs. Proficiency in data visualization and reporting tools. Ability to work with large datasets and complex data structures. Strong communication skills to interact with both technical and non-technical stakeholders. A solid understanding of the retail business and consumer behavior.
Programming Languages: Python, R, SQL, Scala. Data Analysis Tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras. Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn. Big Data Technologies: Hadoop, Spark, AWS, Google Cloud. Databases: SQL, NoSQL (MongoDB, Cassandra). Additional Information: - Qualification Experience: Minimum 3 year(s) of experience is required. Educational Qualification: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
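The customer-segmentation work this role describes often starts with RFM (Recency, Frequency, Monetary) scoring over the transaction log. A minimal, stdlib-only sketch follows; the customer IDs, dates, and amounts are hypothetical, and in practice this aggregation would typically be a Pandas `groupby` or a SQL query:

```python
from datetime import date

def rfm_scores(transactions, today):
    """Aggregate (customer_id, purchase_date, amount) rows into
    Recency (days since last purchase), Frequency (purchase count)
    and Monetary (total spend) per customer."""
    agg = {}
    for cust, d, amt in transactions:
        last, freq, mon = agg.get(cust, (d, 0, 0.0))
        if d > last:
            last = d  # keep the most recent purchase date
        agg[cust] = (last, freq + 1, mon + amt)
    return {c: ((today - last).days, f, round(m, 2))
            for c, (last, f, m) in agg.items()}

# hypothetical transaction log: (customer, date, basket value)
txns = [("c1", date(2024, 1, 1), 50.0),
        ("c1", date(2024, 1, 20), 30.0),
        ("c2", date(2023, 12, 1), 200.0)]
print(rfm_scores(txns, today=date(2024, 2, 1)))
```

The resulting (R, F, M) triples are then binned into quantile scores to form segments such as "loyal high spenders" or "at-risk customers".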
Posted 5 days ago
5.0 - 10.0 years
8 - 12 Lacs
Kochi
Work from Office
Job Title - + + Management Level: Location: Kochi, Coimbatore, Trivandrum Must have skills: Big Data, Python or R Good to have skills: Scala, SQL Job Summary: A Data Scientist is expected to be hands-on to deliver end-to-end projects undertaken in the Analytics space. They must have a proven ability to drive business results with their data-based insights. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes. Roles and Responsibilities: Identify valuable data sources and collection processes. Supervise preprocessing of structured and unstructured data. Analyze large amounts of information to discover trends and patterns for the insurance industry. Build predictive models and machine-learning algorithms. Combine models through ensemble modeling. Present information using data visualization techniques. Collaborate with engineering and product development teams. Hands-on knowledge of implementing various AI algorithms and best-fit scenarios. Has worked on Generative AI based implementations. Professional and Technical Skills: 3.5-5 years' experience in Analytics systems/program delivery; at least 2 Big Data or Advanced Analytics project implementations. Experience using statistical computer languages (R, Python, SQL, PySpark, etc.) to manipulate data and draw insights from large data sets; familiarity with Scala, Java or C++. Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks. Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.)
and experience with applications. Hands-on experience in Azure/AWS analytics platforms (3+ years). Experience using variations of Databricks or similar analytical applications in AWS/Azure. Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop). Strong mathematical skills (e.g. statistics, algebra). Excellent communication and presentation skills. Deploying data pipelines in production based on Continuous Delivery practices. Additional Information: Multi-industry domain experience. Expert in Python, Scala, SQL. Knowledge of Tableau/Power BI or similar self-service visualization tools. Interpersonal and team skills should be top-notch. Prior leadership experience is nice to have. Qualification Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation
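The "combine models through ensemble modeling" responsibility above can be illustrated with the simplest ensembling scheme: averaging the predictions of several base models. This stdlib-only sketch uses hypothetical risk scores; real work would typically use scikit-learn's `VotingRegressor`/`VotingClassifier` or stacking:

```python
def ensemble_average(predictions):
    """predictions: list of per-model prediction lists (same length).
    Returns the element-wise mean, i.e. a simple model-averaging ensemble."""
    n_models = len(predictions)
    return [sum(scores) / n_models for scores in zip(*predictions)]

# three hypothetical claim-risk models scoring the same four policies
m1 = [0.2, 0.8, 0.5, 0.1]
m2 = [0.4, 0.6, 0.5, 0.3]
m3 = [0.3, 0.7, 0.5, 0.2]
print(ensemble_average([m1, m2, m3]))
```

Averaging reduces variance when the base models make partially uncorrelated errors, which is why ensembles usually beat any single constituent model.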
Posted 5 days ago
5.0 - 8.0 years
5 - 8 Lacs
Chennai
Work from Office
Kafka Admin
Consult with inquiring teams on how to leverage Kafka within their pipelines. Architect, build and support existing and new Kafka clusters via IaC. Partner with Splunk teams to route traffic through Kafka by utilizing open-source agents and collectors deployed via Chef. Remediate any health issues within Kafka. Automate (where possible) any operational processes on the team. Create new and/or update monitoring dashboards and alerts as needed. Manage a continuous integration / continuous delivery (CI/CD) pipeline. Perform PoCs on new components to expand/enhance the team's Kafka offerings.
Preferred Qualifications: Knowledge and experience with Splunk, Elastic, Kibana and Grafana. Knowledge and experience with log collection agents such as OpenTelemetry, Fluent Bit, Fluentd, Beats and Logstash. Knowledge and experience with Kubernetes / Docker. Knowledge and experience with Kafka Connect. Knowledge and experience with AWS or Azure. Knowledge and experience with streaming analytics.
Mandatory Skills: API Microservice Integration. Experience: 5-8 Years.
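A core metric behind the monitoring dashboards and alerts this role maintains is consumer lag: for each partition, the log-end offset minus the consumer group's committed offset. The broker-query step is omitted here (it would come from `kafka-consumer-groups.sh` or an admin client); this stdlib-only sketch shows just the lag arithmetic and a hypothetical alert threshold:

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition lag: log-end offset minus committed offset.
    A partition with no committed offset counts as fully lagging."""
    return {p: end - committed_offsets.get(p, 0)
            for p, end in end_offsets.items()}

def total_lag_alert(end_offsets, committed_offsets, threshold):
    """Fire an alert when the summed lag across partitions exceeds threshold."""
    lag = consumer_lag(end_offsets, committed_offsets)
    return sum(lag.values()) > threshold

# hypothetical offsets for a two-partition topic
print(consumer_lag({0: 100, 1: 50}, {0: 90}))
```

Steadily growing lag usually means the consumer group is under-provisioned or stuck, which is exactly the condition a Grafana alert on this metric would surface.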
Posted 5 days ago
5.0 - 8.0 years
4 - 7 Lacs
Mumbai
Work from Office
Excellent knowledge of Spark; the professional must have a thorough understanding of the Spark framework, performance tuning, etc. Excellent knowledge and hands-on experience of at least 4+ years in Scala and PySpark. Excellent knowledge of the Hadoop ecosystem; knowledge of Hive is mandatory. Strong Unix and shell scripting skills. Excellent interpersonal skills and, for experienced candidates, excellent leadership skills. Good knowledge of any of the CSPs like Azure, AWS or GCP is mandatory; certifications on Azure will be an additional plus. Mandatory Skills: PySpark. Experience: 5-8 Years.
Posted 5 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Data Engineer: The Data Engineer is responsible for building data engineering solutions using next-generation data techniques. The individual will be working directly with product owners, customers and technologists to deliver data products/solutions in a collaborative and agile environment. Responsibilities: Responsible for design and development of big data solutions. Partner with domain experts, product managers, analysts, and data scientists to build solutions using PySpark and Python. Work with data scientists to build client pipelines using heterogeneous sources and provide engineering services for data science applications. Ensure automation through CI/CD across platforms both in cloud and on-premises. Define needs around maintainability, testability, performance, security, quality and usability for the data platform. Drive implementation, consistent patterns, reusable components, and coding standards for data engineering processes. Convert Talend-based pipelines into languages like PySpark and Python to execute on Hadoop and non-Hadoop ecosystems. Tune big data applications on Hadoop and non-Hadoop platforms for optimal performance. Evaluate new IT developments and evolving business requirements and recommend appropriate systems alternatives and/or enhancements to current systems by analyzing business processes, systems and industry standards. Applies in-depth understanding of how data analytics collectively integrate within the sub-function as well as coordinates and contributes to the objectives of the entire function. Produces detailed analysis of issues where the best course of action is not evident from the information available, but actions must be recommended/taken.
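The ingestion and transformation pipelines described above boil down to the same pattern whether written in Talend, PySpark, or Python: parse raw input, validate it, normalize it, and drop what fails. A minimal stdlib-only sketch with a hypothetical three-column row format:

```python
def transform_records(raw_rows):
    """Minimal ingestion transform: parse comma-separated rows,
    validate field count and types, normalize casing, and silently
    drop malformed input (a real pipeline would route rejects
    to a quarantine/dead-letter store instead)."""
    out = []
    for row in raw_rows:
        parts = [p.strip() for p in row.split(",")]
        if len(parts) != 3:
            continue  # wrong shape: drop the row
        rec_id, name, amount = parts
        try:
            out.append({"id": int(rec_id),
                        "name": name.lower(),
                        "amount": float(amount)})
        except ValueError:
            continue  # untypeable values: drop the row
    return out

print(transform_records(["1, Alice, 10.5", "bad row", "2,Bob,3"]))
```

In PySpark the same logic would be a `map` plus `filter` over an RDD, or typed casts with `dropna`/`filter` on a DataFrame, which is essentially what a Talend-to-PySpark conversion produces.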
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 4-8 years of total IT experience 4+ years of relevant experience with PySpark and Python Experience in designing and developing Data Pipelines for Data Ingestion or Transformation using Python. Experience with Spark programming (PySpark or Python) Hands-on experience with Python/PySpark and basic libraries for machine learning is required; Exposure to containerization and related technologies (e.g. Docker, Kubernetes) Exposure to aspects of DevOps (source control, continuous integration, deployments, etc.) Can-do attitude on solving complex business problems, good interpersonal and teamwork skills Possess team management experience and have led a team of data engineers and analysts. Experience in Oracle performance tuning, SQL, Autosys and basic Unix scripting. Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills: Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 5 days ago
5.0 - 8.0 years
4 - 7 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Mandatory Skills: Data Analysis. Experience: 5-8 Years.
Posted 5 days ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Company Description WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees. Job Description Min Exp - 8+ years Domain - Pharmaceutical Location - Pan India Overview We are looking for a Deputy Manager/Group Manager in Advanced Analytics for the Lifesciences/Pharma domain. The person will lead a dynamic team focused on assisting clients in Marketing, Sales, and Operations through advanced data analytics. Proficiency in ML & DL algorithms, NLP, Generative AI, Omni-Channel Analytics and Python/R/SAS is essential. Roles and Responsibilities summary: Partner with the Clients’ Advanced Analytics team to identify, scope, and execute advanced analytics efforts that answer business questions, solve business needs, and add business value. Examples include estimating marketing channel effectiveness or estimating sales force sizing. Maintain a broad understanding of pharmaceutical sales, marketing and operations and develop analytical solutions in these areas. Stay current with respect to statistical/mathematical/informatics modeling methodology, to maintain proficiency in applying new and varied methods, and to be competent in justifying methods selected.
POC development for building internal capabilities and standardization of common modeling processes. Lead and guide the team independently or with little support to implement and deliver complex project assignments. Provide strategic leadership to the team by building new capabilities within the group and identifying business opportunities. Provide thought leadership by contributing to whitepapers and articles at the BU and organization level. Developing and delivering formal presentations to senior clients in both delivery and sales situations. Additional Information: Interpersonal communication skills for effective customer consultation. Teamwork and leadership skills. Self-management skills with a focus on results for timely and accurate completion of competing deliverables. Make the impossible possible in the quest to make life better. Bring Analytics to life by giving it zeal and making it applicable to business. Know, learn, and keep up-to-date on the statistical and scientific advances to maximize your impact. Bring an insatiable desire to learn, to innovate, and to challenge yourself for the benefit of patients. Technical Skills: Proficient in Python or R for statistical and machine learning applications. Expertise in a wide range of techniques, including Regression, Classification, Decision Trees, Text Mining, Natural Language Processing, Bayesian Models, and more. Build and train neural network architectures such as CNN, RNN, LSTMs, Transformers. Experience in Omni-Channel Analytics for predicting the Next Best Action using advanced ML/DL/RL algorithms and Pharma CRM data. Hands-on experience in NLP & NLG, covering topic modeling, Q&A, chatbots, and document summarization. Proficient in LLMs (e.g., GPT, LangChain, LlamaIndex) and open-source LLMs. Cloud Platforms: Hands-on experience in Azure, AWS, GCP, with application development skills in Python, Docker, and Git. Good to have Skills: Exposure to big data technologies such as Hadoop, Hive, MapReduce, etc.
Qualifications Basic Qualifications: B.Tech / Master's in a quantitative discipline (e.g. Applied Mathematics, Computer Science, Bioinformatics, Statistics, Operations Research, Econometrics)
Posted 5 days ago
5.0 - 8.0 years
7 - 10 Lacs
Kolkata
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Mandatory Skills: Data Analysis. Experience: 5-8 Years.
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Acceldata is reimagining the way companies observe their data! Acceldata is the pioneer and leader in data observability, revolutionizing how enterprises manage and observe data by offering comprehensive insights into various key aspects of data, data pipelines and data infrastructure across various environments. Our platform empowers data teams to manage products effectively by ensuring data quality, preventing failures, and controlling costs. As a Senior Platform Engineer you'll develop internal tools, create and enhance Linux CLI applications and extend Go-based tools with plugins. You'll also build web applications to integrate these tools with customer APIs and services. A day in the life of a Senior Platform Engineer: Primarily developing/extending Linux command-line (CLI) applications for the various product offerings at Acceldata (Example: tools similar to kubectl, docker-cli and aws-cli). Extending our existing internal tools developed in the Go programming language by developing various plugins (Example: plugins that can extend the functionalities of Telegraf, Filebeat, etc.). Developing web applications that will chain the internal tools with the various APIs/services at the customer's end. You are a great fit for this role if you have: 2-6 years of total experience. Hands-on experience in Go programming language paradigms, constructs and idioms. Experience with Go frameworks such as Cobra, Gin or HashiCorp's go-plugin. Understanding of Go-specific data structures & algorithms, RESTful web services. Experience in relational & non-relational databases. Knowledge of DevOps tools such as Docker/Kubernetes and practices is an advantage. Experience in Linux and system application development. Bonus points for proficiency in Big Data systems or Hadoop components.
We care for our team Mentorship & Growth ESOPs Medical and Life Insurance Paid Maternity & Parental Leave Corporate Uber Program Learning & Development Support Acceldata for All We are a fast-growing company, solving complex data problems at scale. We are driven by strong work ethics, high standards of excellence, and a spirit of collaboration. We promote innovation, commitment, and accountability. Our goal is to cultivate a healthy work environment that fosters a sense of belonging, encourages teamwork, and brings out the best in every individual. Why Acceldata? Acceldata is redefining data observability for enterprise data systems. Founded by experts who recognized the need for innovative monitoring and management solutions in a cloud-first, AI-driven environment, our platform empowers data teams to effectively manage data products. We address common challenges such as scaling and performance issues, cost overruns, and data quality problems by providing operational visibility, proactive alerts, and monitoring reliability across the various environments. Delivered as a SaaS product, Acceldata's solutions have been embraced by global customers, such as HPE, HSBC, Visa, Freddie Mac, Manulife, Workday, ZoomInfo, GSK, Oracle, PubMatic, PhonePe (Walmart), Hershey's, Dun & Bradstreet, and many more. Acceldata is a Series-C funded company and its investors include Insight Partners, March Capital, Lightspeed, Sorenson Ventures, Industry Ventures, and Emergent Ventures.
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology Job Family Group: IT&S Group Job Description: You will work with You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact. Let me tell you about the role As a Senior Data Platform Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will be collaborating with engineers, architects, and business partners, working to establish robust governance models, technology roadmaps, and innovative security frameworks to safeguard critically important enterprise applications. What You Will Deliver Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform. Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements. Help drive operational excellence by supporting system performance, availability, and scalability. Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems. Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions. Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency. Support disaster recovery planning and high availability for enterprise platforms. Collaborate with engineering and operations teams to ensure platform solutions align with business needs. 
- Provide guidance on platform investments, security risks, and operational improvements.
- Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications)

Technical Skills We Need From You
- Bachelor’s degree in technology, engineering, or a related technical discipline.
- 3–5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
- Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
- Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
- Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
- Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
- Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
- Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential Skills
- Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
- Understanding of data ingestion pipelines, governance, security, and data visualization.
- Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
- Familiarity with performance tuning, data indexing, and distributed query optimization.
- Exposure to both real-time and batch data streaming architectures.

Skills That Set You Apart
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with
You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact.

Let me tell you about the role
As a Senior Data Tooling Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks to safeguard critically important enterprise applications.

What You Will Deliver
- Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform.
- Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements.
- Help drive operational excellence by supporting system performance, availability, and scalability.
- Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems.
- Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions.
- Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency.
- Support disaster recovery planning and high availability for enterprise platforms.
- Collaborate with engineering and operations teams to ensure platform solutions align with business needs.
- Provide guidance on platform investments, security risks, and operational improvements.
- Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications)

Technical Skills We Need From You
- Bachelor’s degree in technology, engineering, or a related technical discipline.
- 3–5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
- Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
- Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
- Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
- Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
- Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
- Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential Skills
- Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
- Understanding of data ingestion pipelines, governance, security, and data visualization.
- Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
- Familiarity with performance tuning, data indexing, and distributed query optimization.
- Exposure to both real-time and batch data streaming architectures.

Skills That Set You Apart
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 5 days ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
PFB the detailed JD:
🔹 Experience: 6 to 8+ years (hands-on)
🔹 Location: Pune (WFO)
🔹 Notice Period: 0–30 days

Must Have:
- Proficiency in at least one of the following programming languages: Java/Scala/Python
- Good understanding of SQL
- Experience developing and deploying at least one end-to-end data storage/processing pipeline
- Strong experience in Spark development with batch and streaming
- Intermediate-level expertise in HDFS and Hive
- Experience with PySpark and data engineering
- ETL implementation and migration to Spark
- Experience working with a Hadoop cluster
- Python, PySpark, Databricks development with knowledge of cloud
- Experience with Kafka and Spark streaming (DStream and Structured Streaming)
- Experience with Jupyter notebooks or other developer tools
- Experience with Airflow or other workflow engines
- Good communication and logical skills

Good to Have Skills:
- Prior experience writing Spark jobs in Java is highly appreciated
- Prior experience with Cloudera Data Platform (CDP)
- Hands-on experience with NoSQL databases like HBase, Cassandra, Elasticsearch, etc.
- Experience using Maven and Git
- Agile Scrum methodologies
- Flink and Kudu streaming
- Automation of workflows, CI/CD
- NiFi streaming and transformation
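The Spark batch and streaming requirements above revolve around windowed aggregation over event streams. As a rough illustration of that pattern in plain Python (not actual PySpark; the event data and function name are invented for the sketch), a tumbling-window count can be written as:

```python
from collections import defaultdict


def tumbling_window_counts(events, window_seconds):
    """Count events per key within fixed-size (tumbling) time windows.

    Mimics, in plain Python, the windowed groupBy/count pattern that
    Spark Structured Streaming applies to a live event stream.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Each event falls into exactly one window [start, start + window).
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)


# Hypothetical clickstream events as (epoch_seconds, event_type) pairs.
events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

In real Spark Structured Streaming the same grouping would be expressed declaratively over an unbounded DataFrame, with the engine handling late data and state.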
Posted 5 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a skilled Data Engineer to join our growing data team in India. You will be responsible for designing, building, and maintaining scalable data infrastructure and pipelines that enable data-driven decision making across our organization and client projects. This role offers the opportunity to work with cutting-edge technologies and contribute to innovative data solutions for global clients.

What you do

Technical Skills
- Minimum 3 years of experience in data engineering or a related field
- Strong programming skills in Python and/or Scala/Java
- Experience with SQL and database technologies (PostgreSQL, MySQL, MongoDB)
- Hands-on experience with data processing frameworks: Apache Spark and the Hadoop ecosystem; Apache Kafka for streaming data; Apache Airflow or similar workflow orchestration tools
- Knowledge of data warehouse concepts and technologies
- Experience with containerization (Docker, Kubernetes)
- Understanding of data modeling principles and best practices

Cloud & Platform Experience
- Experience with at least one major cloud platform (AWS, Azure, or GCP)
- Familiarity with cloud-native data services: data lakes, data warehouses, and analytics services; serverless computing and event-driven architectures; identity and access management for data systems
- Knowledge of Infrastructure as Code (Terraform, CloudFormation, ARM templates)

Data & Analytics
- Understanding of data governance and security principles
- Experience with data quality frameworks and monitoring
- Knowledge of dimensional modeling and data warehouse design
- Familiarity with business intelligence and analytics tools
- Understanding of data privacy regulations (GDPR, CCPA)

Preferred Qualifications

Advanced Technical Skills
- Experience with modern data stack tools (dbt, Fivetran, Snowflake, Databricks)
- Knowledge of machine learning pipelines and MLOps practices
- Experience with event-driven architectures and microservices
- Familiarity with data mesh and data fabric concepts
- Experience with graph databases (Neo4j, Amazon Neptune)

Industry Experience
- Experience in a digital agency or consulting environment
- Background in financial services, e-commerce, retail, or customer experience platforms
- Knowledge of marketing technology and customer data platforms
- Experience with real-time analytics and personalization systems

Soft Skills
- Strong problem-solving and analytical thinking abilities
- Excellent communication skills for client-facing interactions
- Ability to work independently and manage multiple projects
- Adaptability to a rapidly changing technology landscape
- Experience mentoring junior team members

What we ask

Data Infrastructure & Architecture
- Design and implement robust, scalable data architectures and pipelines
- Build and maintain ETL/ELT processes for batch and real-time data processing
- Develop data models and schemas optimized for analytics and reporting
- Ensure data quality, consistency, and reliability across all data systems

Platform-Agnostic Development
- Work with multiple cloud platforms (AWS, Azure, GCP) based on client requirements
- Implement data solutions using various technologies and frameworks
- Adapt quickly to new tools and platforms as project needs evolve
- Maintain expertise across different cloud ecosystems and services

Data Pipeline Development
- Create automated data ingestion pipelines from various sources (APIs, databases, files, streaming)
- Implement data transformation logic using modern data processing frameworks
- Build monitoring and alerting systems for data pipeline health
- Optimize pipeline performance and cost-efficiency

Collaboration & Integration
- Work closely with data scientists, analysts, and business stakeholders
- Collaborate with DevOps teams to implement CI/CD for data pipelines
- Partner with client teams to understand data requirements and deliver solutions
- Participate in architecture reviews and technical decision-making

What we offer
You’ll join an international network of data professionals within our organisation.
We support continuous development through our dedicated Academy. If you're looking to push the boundaries of innovation and creativity in a culture that values freedom and responsibility, we encourage you to apply. At Valtech, we’re here to engineer experiences that work and reach every single person. To do this, we are proactive about creating workplaces that work for every person at Valtech. Our goal is to create an equitable workplace which gives people from all backgrounds the support they need to thrive, grow and meet their goals (whatever they may be). You can find out more about what we’re doing to create a Valtech for everyone here. Please do not worry if you do not meet all of the criteria or if you have some gaps in your CV. We’d love to hear from you and see if you’re our next member of the Valtech team!
Posted 5 days ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Tasks

Experience
- Experience in building and managing data pipelines
- Experience with development and operations of data pipelines in the cloud (preferably Azure)
- Experience with distributed data/computing tools: MapReduce, Hadoop, Hive, Spark
- Deep expertise in architecting data pipelines in the cloud using cloud-native technologies
- Good experience with both ETL and ELT ingestion patterns
- Hands-on experience working on large volumes of data (petabyte scale) with distributed compute frameworks
- Good understanding of container platforms: Kubernetes and Docker
- Excellent knowledge of and experience with object-oriented programming
- Familiarity with developing against RESTful API interfaces
- Experience with markup languages such as JSON and YAML
- Proficient in relational database design and development
- Good knowledge of data warehousing concepts
- Working experience with Agile Scrum methodology

Technical Skills
- Strong skills in distributed cloud data analytics platforms like Databricks, HDInsight, EMR clusters, etc.
- Strong programming skills: Python/Java/R/Scala, etc.
- Experience with stream-processing systems: Kafka, Apache Storm, Spark Streaming, Apache Flink, etc.
- Hands-on working knowledge of cloud data lake stores like Azure Data Lake Storage
- Data pipeline orchestration with Azure Data Factory, AWS Data Pipeline
- Good knowledge of file formats like ORC, Parquet, Delta, Avro, etc.
- Good experience using SQL and NoSQL databases like MySQL, Elasticsearch, MongoDB, PostgreSQL, and Cassandra running huge volumes of data
- Strong experience in networking and security measures
- Proficiency with CI/CD automation, specifically with DevOps build and release pipelines
- Proficiency with Git, including branching/merging strategies, pull requests, and basic command-line functions
- Good data modelling skills

Job Responsibilities
- Cloud analytics, storage, security, resiliency, and governance
- Building and maintaining the data architecture for data engineering and data science projects
- Extract, transform, and load data from source systems to a data lake or data warehouse, leveraging a combination of various IaaS or SaaS components
- Perform compute on huge volumes of data using open-source projects like Databricks/Spark or Hadoop
- Define table schemas and quickly adapt the pipeline
- Work with high-volume unstructured and streaming datasets
- Manage NoSQL databases on cloud (AWS, Azure, etc.)
- Architect solutions to migrate projects from on-premises to cloud
- Research, investigate, and implement newer technologies to continually evolve security capabilities
- Identify valuable data sources and automate collection processes
- Implement adequate networking and security measures for the data pipeline
- Implement a monitoring solution for the data pipeline
- Support the design and implementation of data engineering solutions
- Maintain excellent documentation for understanding and accessing data storage
- Work independently as well as in teams to deliver transformative solutions to clients
- Be proactive and constantly pay attention to the scalability, performance, and availability of our systems
- Establish privacy/security hierarchy and regulate access
- Collaborate with engineering and product development teams
- Systematic problem-solving approach, with strong communication skills and a sense of ownership and drive

Qualifications
- Bachelor's or Master's degree in Computer Science or relevant streams
- Any relevant cloud data engineering certification
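The extract-transform-load responsibility described in this listing can be sketched in plain Python. This is a minimal, hypothetical illustration (in-memory lists stand in for the source system and warehouse; the function names and sample records are invented), not any particular platform's API:

```python
def extract(source_rows):
    """Extract: read raw records (a list stands in for a source system)."""
    return list(source_rows)


def transform(rows):
    """Transform: drop records failing a basic quality check and normalise fields."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:  # reject records with missing amounts
            continue
        cleaned.append({
            "customer": row["customer"].strip().lower(),
            "amount": float(row["amount"]),
        })
    return cleaned


def load(rows, warehouse):
    """Load: append transformed records to the target store; return rows loaded."""
    warehouse.extend(rows)
    return len(rows)


warehouse = []
raw = [
    {"customer": " Alice ", "amount": "120.50"},
    {"customer": "Bob", "amount": None},  # rejected by the quality check
]
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)
# 1 [{'customer': 'alice', 'amount': 120.5}]
```

In a production pipeline each stage would typically be a task in an orchestrator (e.g., Azure Data Factory or Airflow) reading from and writing to durable storage rather than Python lists.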
Posted 5 days ago
2.0 - 5.0 years
6 - 8 Lacs
Navi Mumbai
Work from Office
Job Summary: We are seeking a Data Scientist with strong data engineering skills to bridge the gap between raw data and actionable insights. In this hybrid role, you will work on building robust data pipelines, enabling large-scale data processing, and applying machine learning models to solve real-world business problems.

Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines to process structured and unstructured data from various sources.
- Collaborate with data scientists and analysts to build data infrastructure that supports modeling, experimentation, and advanced analytics.
- Clean, transform, and organize large datasets for use in machine learning models and statistical analysis.
- Build and deploy ML pipelines in production environments using MLOps principles.
- Optimize data workflows for scalability, performance, and cost-efficiency.
- Monitor and troubleshoot data quality issues, implementing solutions to ensure reliability and accuracy.
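The "clean and transform large datasets" responsibility usually starts with handling missing values. A minimal sketch of mean imputation in plain Python (the function name and sample column are invented; scikit-learn's SimpleImputer offers the same idea at scale):

```python
def impute_missing_with_mean(column):
    """Replace None entries in a numeric column with the mean of observed values.

    A common cleaning step before feeding a feature to a machine learning model.
    """
    observed = [v for v in column if v is not None]
    if not observed:
        raise ValueError("column has no observed values to average")
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in column]


print(impute_missing_with_mean([10.0, None, 14.0]))
# [10.0, 12.0, 14.0]
```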
Posted 5 days ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overall IT Experience: 10+ years
Relevant Experience: 4–6 years
Preferred Qualifications: B.Tech / M.Tech or other equivalent degree
Notice Period: Immediate to 15 days only
Job Type: Full-time
Work Timings: 12:00 PM to 9:00 PM IST
Location: Chennai
Model: Work From Office

Roles & Responsibilities
- Experience as a Data Architect designing data warehouses using Microsoft technologies and big data technologies (Hive, Hadoop, Spark SQL)
- Experience in GCP / AWS is a must
- Design and implement effective database solutions and models to store and retrieve company data
- Examine and identify database structural necessities by evaluating client operations, applications, and programming
- Understand requirements and create technical specifications
- Responsible for business analysis, systems design, architecture principles design, data engineering, data modernization, database design, security practices, data migration, solutions modelling, and systems integration
- Exposure to creating an integration layer using any ETL tool
- Recommend solutions to improve new and existing database systems
- Oversee the migration of data from legacy systems to new solutions
- Monitor system performance by performing regular tests, troubleshooting, and integrating new features
- Excellent communication skills
Posted 5 days ago
6.0 - 10.0 years
8 - 12 Lacs
Navi Mumbai
Work from Office
Title: Lead Data Scientist (Python)

Required Technical Skillset:
- Languages: Python, PySpark
- Frameworks: Scikit-learn, TensorFlow, Keras, PyTorch
- Libraries: NumPy, Pandas, Matplotlib, SciPy, boto3
- Databases: relational (PostgreSQL), NoSQL (MongoDB)
- Cloud: AWS
- Other tools: Jenkins, Bitbucket, JIRA, Confluence

A machine learning engineer is responsible for designing, implementing, and maintaining machine learning systems and algorithms that allow computers to learn from and make predictions or decisions based on data. The role typically involves working with data scientists and software engineers to build and deploy machine learning models in a variety of applications such as natural language processing, computer vision, and recommendation systems.

The key responsibilities of a machine learning engineer include:
- Collecting and preprocessing large volumes of data, cleaning it up, and transforming it into a format that can be used by machine learning models.
- Model building, including designing and building machine learning models and algorithms using techniques such as supervised and unsupervised learning, deep learning, and reinforcement learning.
- Evaluating model performance using metrics such as accuracy, precision, recall, and F1 score.
- Deploying machine learning models in production environments and integrating them into existing systems using CI/CD pipelines and AWS SageMaker.
- Monitoring the performance of machine learning models and making adjustments as needed to improve their accuracy and efficiency.
- Working closely with software engineers, product managers, and other stakeholders to ensure that machine learning models meet business requirements and deliver value to the organization.

Requirements and Skills:
- Mathematics and Statistics: A strong foundation in mathematics and statistics is essential, including familiarity with linear algebra, calculus, probability, and statistics to understand the underlying principles of machine learning algorithms.
- Programming Skills: Proficiency in programming languages such as Python; the candidate should be able to write efficient, scalable, and maintainable code to develop machine learning models and algorithms.
- Machine Learning Techniques: A deep understanding of various machine learning techniques, such as supervised learning, unsupervised learning, and reinforcement learning, and familiarity with different types of models such as decision trees, random forests, neural networks, and deep learning.
- Data Analysis and Visualization: Ability to analyze and manipulate large data sets; familiarity with data cleaning, transformation, and visualization techniques to identify patterns and insights in the data.
- Deep Learning Frameworks: Familiarity with deep learning frameworks such as TensorFlow, PyTorch, and Keras, with the ability to build and train deep neural networks for various applications.
- Big Data Technologies: Experience working with big data technologies such as Hadoop, Spark, and NoSQL databases; familiarity with distributed computing and parallel processing to handle large data sets.
- Software Engineering: A good understanding of software engineering principles such as version control, testing, and debugging; able to work with software development tools such as Git, Jenkins, and Docker.
- Communication and Collaboration: Good communication and collaboration skills to work effectively with cross-functional teams such as data scientists, software developers, and business stakeholders.
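The evaluation metrics this listing names (accuracy, precision, recall, F1) follow directly from confusion-matrix counts. A minimal sketch for binary classification in plain Python (the sample labels are invented; scikit-learn's metrics module computes the same quantities):

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}


print(classification_metrics([1, 1, 0, 0], [1, 0, 1, 0]))
# {'accuracy': 0.5, 'precision': 0.5, 'recall': 0.5, 'f1': 0.5}
```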
Posted 5 days ago