
15684 Spark Jobs - Page 33

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Python Developer
Experience Level: 5-7 Years
Location: Hyderabad

Job Description
We are seeking an experienced Lead Python Developer with a proven track record of building scalable and secure applications, specifically in the travel and tourism industry. The ideal candidate should possess in-depth knowledge of Python, modern development frameworks, and expertise in integrating third-party travel APIs. This role demands a leader who can foster innovation while adhering to industry standards for security, scalability, and performance.

Roles and Responsibilities
- Application Development: Architect and develop robust, high-performance applications using Python frameworks such as Django, Flask, and FastAPI.
- API Integration: Design and implement seamless integration with third-party APIs, including GDS, CRS, OTA, and airline-specific APIs, to enable real-time data retrieval for booking, pricing, and availability (a minimal sketch follows this posting).
- Data Management: Develop and optimize complex data pipelines to manage structured and unstructured data, utilizing ETL processes, data lakes, and distributed storage solutions.
- Microservices Architecture: Build modular applications using microservices principles to ensure scalability, independent deployment, and high availability.
- Performance Optimization: Enhance application performance through efficient resource management, load balancing, and faster query handling to deliver an exceptional user experience.
- Security and Compliance: Implement secure coding practices, manage data encryption, and ensure compliance with industry standards such as PCI DSS and GDPR.
- Automation and Deployment: Leverage CI/CD pipelines, containerization, and orchestration tools to automate testing, deployment, and monitoring processes.
- Collaboration: Work closely with front-end developers, product managers, and stakeholders to deliver high-quality, user-centric solutions aligned with business goals.

Requirements
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Technical Expertise:
  - At least 4 years of hands-on experience with Python frameworks like Django, Flask, and FastAPI.
  - Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
  - Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (e.g., Spark, Kafka).
  - Experience with cloud platforms (AWS, Azure, Google Cloud), containerization (Docker, Kubernetes), and CI/CD tools (e.g., Jenkins, GitLab CI).
  - Familiarity with testing tools such as PyTest, Selenium, and SonarQube.
  - Expertise in travel APIs, booking flows, and payment gateway integrations.
- Soft Skills:
  - Excellent problem-solving and analytical abilities.
  - Strong communication, presentation, and teamwork skills.
  - A proactive attitude with a willingness to take ownership and perform under pressure.
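For illustration only: a minimal FastAPI sketch of the kind of async third-party API integration this role describes. The supplier endpoint, response shape, and auth handling are invented placeholders, not details from the posting.

```python
# Hypothetical proxy for a third-party availability API (sketch, not a spec).
import os

import httpx
from fastapi import FastAPI, HTTPException

app = FastAPI()
SUPPLIER_URL = "https://api.example-gds.com/v1/availability"  # hypothetical endpoint

@app.get("/availability/{origin}/{destination}")
async def availability(origin: str, destination: str, date: str):
    params = {"origin": origin, "destination": destination, "date": date}
    headers = {"Authorization": f"Bearer {os.environ.get('SUPPLIER_API_KEY', '')}"}
    # Async HTTP call keeps the service responsive while the supplier responds.
    async with httpx.AsyncClient(timeout=10.0) as client:
        resp = await client.get(SUPPLIER_URL, params=params, headers=headers)
    if resp.status_code != 200:
        raise HTTPException(status_code=502, detail="Upstream supplier error")
    return resp.json()
```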

Posted 2 days ago

Apply

9.0 - 13.0 years

0 Lacs

Karnataka

On-site

As a Senior ML Scientist on the Advertising Optimization & Automation Science team at Wayfair, you will play a crucial role in leveraging machine learning and generative AI to streamline campaign workflows. Based in Bangalore, India, you will contribute to building intelligent systems that drive personalized recommendations and campaign automation within Wayfair's advertising platform. Your responsibilities will include designing and implementing intelligent budget, tROAS, and SKU recommendations, as well as simulation-driven decisioning to enhance advertiser outcomes and unlock substantial commercial value.

In collaboration with cross-functional teams, you will lead the development of GenAI-powered creative optimization and automation to drive incremental ad revenue and improve supplier outcomes. Additionally, you will elevate technical standards by promoting best practices in ML system design and development across the team.

The ideal candidate possesses a Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, or a related field, along with 9+ years of experience building large-scale machine learning algorithms. You should have a strong theoretical understanding of statistical models and ML algorithms, proficiency in programming languages such as Python, and experience with relevant ML libraries like TensorFlow and PyTorch. Strategic thinking, a customer-centric mindset, and a desire for creative problem-solving are essential qualities for success in this position. You should also be adept at influencing senior-level stakeholders, possess excellent communication skills, and demonstrate the ability to shape technical roadmaps through cross-functional partnerships.

Nice-to-have qualifications include experience with GCP, Airflow, and containerization, as well as familiarity with Generative AI and agentic workflows. Knowledge of Bayesian Learning, Multi-armed Bandits, or Reinforcement Learning is also advantageous (a toy bandit sketch follows this posting).

By joining Wayfair, you will be part of a dynamic and innovative team dedicated to reinventing the way people shop for their homes. If you are looking for a fast-paced environment that offers continuous learning and growth opportunities, Wayfair may be the perfect place to advance your career.
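As a loose illustration of the "Multi-armed Bandits" nice-to-have above, here is a toy epsilon-greedy bandit in Python. Arms, reward probabilities, and parameters are all invented for the example; this is the flavor of exploration/exploitation logic, not Wayfair's method.

```python
# Toy epsilon-greedy multi-armed bandit (illustrative only).
import random

def epsilon_greedy(n_arms: int, rounds: int, reward_fn, epsilon: float = 0.1):
    counts = [0] * n_arms
    values = [0.0] * n_arms  # running mean reward per arm
    for _ in range(rounds):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)                      # explore
        else:
            arm = max(range(n_arms), key=lambda a: values[a])   # exploit
        reward = reward_fn(arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]     # incremental mean
    return values

# Example: arm 2 pays off most often, so its estimated value should end highest.
pay = [0.2, 0.5, 0.8]
print(epsilon_greedy(3, 1000, lambda a: 1.0 if random.random() < pay[a] else 0.0))
```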

Posted 2 days ago

Apply

0.0 - 3.0 years

0 Lacs

Vadodara, Gujarat

On-site

Analytics | Posted on Jul 22, 2025 | Vadodara, Gujarat
Minimum Required Experience: 3 years | Full Time
Skills: Machine Learning, TensorFlow, PyTorch, NLP

Designation/Role Name: Machine Learning Engineer
Org Structure: Data Science & AI Team
Work Schedule: According to business needs

Job Description
With excellent analytical and problem-solving skills, you should understand customers' business problems and translate them into scope of work and technical specifications for Data Science projects. You will efficiently utilize cutting-edge AI technologies (Machine Learning, NLP, Computer Vision) to develop solutions for business problems. Good exposure to technology platforms for Data Science, AI, Gen AI, and cloud, with implementation experience, is expected, along with the ability to understand data and requirements and to design and develop a Machine Learning model for them.

This job requires the following:
- Designing, developing, and implementing end-to-end machine learning production pipelines (data exploration, sampling, training data generation, feature engineering, model building, and performance evaluation); see the sketch after this posting.
- Experience in predictive analytics and statistical modeling.
- Experience successfully applying: Logistic Regression, Multivariate Regression, Support Vector Machines, Stochastic Processes, Decision Trees, lifetime analysis, common clustering algorithms, Optimization, and CNNs.

Essential Qualifications
B.Tech or BE in Computer/IT, MCA, or M.Sc. Computer Science; relevant certifications are preferred.

Technical Qualifications (Essential)
- Hands-on programming experience
- Hands-on technical design experience
- Hands-on prompt engineering experience
- Design and development of at least 3 Data Science/AI projects, including design and development of Machine Learning models
- 1 Generative AI project designed, developed, and delivered to production is desirable

Primary Skills
- Hands-on coding experience in Python, PyTorch, Spark/PySpark, SQL, TensorFlow, NLP frameworks, and similar tools
- Good understanding of the business and domain of the applications
- Hands-on experience in design and development of Gen AI applications using open-source LLMs and cloud platforms
- Hands-on experience in design and development of API-based applications for AI and Data Science projects
- Understanding of GenAI concepts, RAG, and model fine-tuning techniques is desirable
- Understanding of the concepts of major AI models such as OpenAI, Llama, Hugging Face, Mistral AI, etc.
- Understanding of DevOps pipelines for deployment
- Good understanding of the Data Engineering lifecycle: data pipelines, data warehouse, data lake

Secondary Skills
- Experience using Databricks and the Azure Data platform
- Knowledge of any configuration management tools is desirable
- Familiarity with containerization and container orchestration services like Docker and Kubernetes

Experience
3+ years in Machine Learning model development in Data Science/AI projects. Awareness of LLM integrations/development is desirable.

Description of Responsibility
- Understand customer requirements (business, functional, non-functional, etc.) and design and develop Machine Learning models.
- Design and implement Machine Learning models using major technology and computing platforms (open source and cloud).
- Possess excellent analytical and problem-solving skills; understand various forms of data and patterns and derive insights.
- Collaborate with internal and external stakeholders to derive solutions that require cross-functional teams and to ensure smoother execution of projects.
- Knowledge of data modeling and understanding of different data structures.
- Experience with design of AI/ML solutions, either standalone or integrated with other applications.
- Experience in Generative AI solutions for business/automation requirements using open-source LLMs (OpenAI, Llama, Mistral, etc.) is desirable.

Skills / Competencies
Research Orientation, Proactive & Clear Communication, Collaboration, Solution Orientation, Solution Articulation, Accountability, Adaptability/Flexibility, Analytical Skills, Listening Skills, Customer Service Orientation
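A minimal sketch of the end-to-end flow the responsibilities above describe (feature engineering, model building, performance evaluation), using scikit-learn on synthetic data. The dataset and model choice are placeholders; Logistic Regression is used because the posting names it.

```python
# End-to-end ML pipeline sketch on synthetic data (illustrative only).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

pipeline = Pipeline([
    ("scale", StandardScaler()),       # simple feature-engineering step
    ("model", LogisticRegression()),   # one of the algorithms the posting lists
])
pipeline.fit(X_train, y_train)
print(classification_report(y_test, pipeline.predict(X_test)))  # performance evaluation
```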

Posted 2 days ago

Apply

2.0 - 3.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Location: Ahmedabad, India
Experience: 2-3 years
Job Type: Full Time

Job Description
Designation: AI/ML Developer
Location: Ahmedabad
Department: Technical

Job Summary:
We are looking for an enthusiastic AI/ML Developer with 2 to 3 years of relevant experience in machine learning and artificial intelligence. The candidate should be well-versed in designing and developing intelligent systems and have a solid grasp of data handling and model deployment.

Key Responsibilities:
- Develop and implement machine learning models tailored to business needs.
- Develop and fine-tune Generative AI models (e.g., LLMs, diffusion models, VAEs) using platforms like Hugging Face, LangChain, or OpenAI.
- Conduct data collection, cleaning, and pre-processing for model readiness.
- Train, test, and optimize models to improve accuracy and performance.
- Work closely with cross-functional teams to deploy AI models in production environments.
- Perform data exploration, visualization, and feature selection.
- Stay up to date with the latest trends in AI/ML and experiment with new approaches.
- Design and implement Multi-Agent Systems (MAS) for distributed intelligence, autonomous collaboration, or decision-making.
- Integrate and orchestrate agentic workflows using tools like Agno, CrewAI, or LangGraph.
- Ensure scalability and efficiency of deployed solutions.
- Monitor model performance and perform necessary updates or retraining.

Requirements:
- Strong programming skills in Python and experience with libraries like TensorFlow, PyTorch, Scikit-learn, and Keras.
- Experience working with vector databases (Pinecone, Weaviate, Chroma) for RAG systems (a minimal retrieval sketch follows this posting).
- Good understanding of machine learning concepts, including classification, regression, clustering, and deep learning.
- Knowledge of knowledge graphs, semantic search, or symbolic reasoning.
- Proficiency in working with tools such as Pandas, NumPy, and data visualization libraries.
- Hands-on experience deploying models using REST APIs with frameworks like Flask or FastAPI.
- Familiarity with cloud platforms (AWS, Google Cloud, or Azure) for ML deployment.
- Knowledge of version control systems like Git.
- Experience with Natural Language Processing (NLP), computer vision, or predictive analytics.
- Exposure to MLOps tools and workflows (e.g., MLflow, Kubeflow, Airflow).
- Basic familiarity with big data frameworks like Apache Spark or Hadoop.
- Understanding of data pipelines and ETL processes.

What We Offer:
- Opportunity to work on live projects and client interactions.
- A vibrant, learning-driven work culture.
- 5-day work week and flexible work timings.

About the Company
Techify is a fast-growing tech company with a talented, passionate, learning-oriented team. Techify's DNA is about solutions and technologies; we are here to help our customers grow their business. Our vision is to become one of the best product engineering companies in India. We put client relationships first, so our mission is to build software solutions that help clients transform their business by unleashing hidden potential with technology. Our success mantra: customer first, team second, and we are third. Our main focus is our customers' and partners' success. Our visionary and experienced team turns innovative ideas into efficient products and software, and our well-defined processes ensure on-time delivery, giving us an edge over our competitors. The most important pillar in achieving our goals is our dedicated team; to keep them encouraged and motivated, we have built a culture that rewards self-development and innovation.

Our services include intensive research and analysis to identify the technology that achieves the best performance at the lowest possible cost. We take a studied approach to cost, performance, and feature trade-offs to help companies surmount the challenges of delivering high-quality, timely products and services to the marketplace. We can take up a product at any stage: defining, designing, verifying, or realizing. Among our recognitions: winner of the Grand Challenge at the Vibrant Gujarat Summit 2018, the "Trend Setter" award from the Gujarat Innovation Society, and coverage in the Times Coffee Table Book's "Gujarat: The Inspiring Edge" edition. We are also an Amazon Web Services consulting and networking partner.
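To make the RAG requirement above concrete, here is a minimal retrieval sketch using Chroma, one of the vector stores the posting names. The documents, collection name, and query are invented; in a real system the retrieved chunk would be passed to an LLM as context.

```python
# Minimal vector-store retrieval for a RAG system (sketch, assumed data).
import chromadb

client = chromadb.Client()  # in-memory instance; uses Chroma's default embedder
collection = client.create_collection("product_docs")
collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "Refunds are processed within 5 business days.",
        "Premium support is available 24/7 via chat.",
    ],
)
results = collection.query(query_texts=["how long do refunds take?"], n_results=1)
print(results["documents"][0][0])  # best-matching chunk to feed into the LLM prompt
```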

Posted 2 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka | Job ID 30181671 | Job Category: Digital Technology
Job Title: Sr. Architect (Data and Integration)
Preferred Location: Bangalore, India
Full Time/Part Time: Full Time

Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Responsibilities:

Enterprise Data & Integration Strategy
- Define and drive the enterprise data and integration vision, ensuring alignment with business and IT objectives.
- Establish best practices for API-led connectivity, data pipelines, and cloud-native architectures.
- Lead the implementation of standardized integration patterns, including data lakes, event-driven processing, and distributed computing (a bare-bones event sketch follows this posting).
- Ensure all data solutions are resilient, secure, and compliant with industry regulations.

Technical Leadership & Execution
- Architect and oversee the deployment of scalable, high-performance data platforms and integration solutions.
- Partner with engineering, analytics, and IT teams to design and implement data-driven capabilities.
- Optimize data processing workflows for security, performance, and cost-efficiency.
- Assess and integrate emerging technologies, including AI/ML and advanced analytics frameworks, into the data strategy.

Governance, Security & Compliance
- Establish enterprise-wide data governance frameworks to ensure data accuracy, security, and compliance.
- Implement advanced monitoring, logging, and alerting strategies to maintain high availability of data services.
- Drive automation in data quality management, security enforcement, and integration testing using DevOps methodologies.
- Work closely with risk and compliance teams to ensure adherence to data privacy regulations (GDPR, CCPA, etc.).

Role Purpose:
The Data & Integration Architect (technical leadership) will be responsible for shaping the enterprise-wide data and integration strategy, driving innovation, and overseeing the implementation of large-scale data solutions. The role requires deep technical expertise in data engineering, API integrations, and real-time data processing to enable seamless interoperability across enterprise applications. The successful candidate will provide strategic direction, mentor technical teams, and work closely with business leaders to implement data frameworks that support analytics, automation, and digital transformation at scale.

Minimum Requirements:
- 12+ years of experience in enterprise data architecture, integration, or software engineering leadership roles.
- Proven expertise in designing and managing complex data architectures, including data lakes, data warehouses, and real-time streaming platforms.
- Hands-on experience with enterprise integration tools such as Boomi, MuleSoft, Kafka, AWS Glue, or equivalent.
- Deep understanding of API management, authentication mechanisms (OAuth2, SAML), and data security best practices.
- Strong experience integrating large-scale ERP, CRM, and HR systems (SAP, Salesforce, Workday, etc.).
- Proficiency in DevOps, CI/CD pipelines, and cloud infrastructure (AWS, Azure, GCP).
- Experience with AI/ML-driven data solutions and predictive analytics.
- Hands-on expertise with big data technologies such as Spark, Flink, or Databricks.
- Strong programming skills in Python, Java, or Scala for data transformation and automation.
- Industry certifications in cloud computing, data management, or integration technologies.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
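As a rough sketch of the event-driven integration pattern this role covers, the snippet below publishes and consumes a JSON event with the kafka-python client. The broker address, topic, and payload are assumptions for illustration.

```python
# Event-driven integration sketch with kafka-python (placeholder broker/topic).
import json

from kafka import KafkaConsumer, KafkaProducer

# Publish a JSON integration event.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("erp-events", {"entity": "invoice", "id": 42, "status": "posted"})
producer.flush()

# Consume the same topic; downstream systems react to each event.
consumer = KafkaConsumer(
    "erp-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,  # stop iterating if no messages arrive
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)
    break
```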

Posted 2 days ago

Apply

0.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka | Job ID 30181213 | Job Category: Digital Technology
Job Title: Data Lakehouse Platform Architect/Engineer
Preferred Location: Bangalore/Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Responsibilities:
- Lead the architecture and engineering of data lakehouse platforms using Apache Iceberg on AWS S3, enabling scalable storage and multi-engine querying (a minimal sketch follows this posting).
- Design and build infrastructure-as-code solutions using AWS CDK or Terraform to support repeatable, automated deployments.
- Deliver and optimize ELT/ETL data pipelines for real-time and batch workloads with AWS Glue, Apache Spark, Kinesis, and Airflow.
- Enable compute engines (Athena, EMR, Redshift, Trino, Snowflake) through efficient schema design, partitioning, and metadata strategies.
- Champion observability and operational excellence across the platform by implementing robust monitoring, alerting, and logging practices.
- Drive automation through CI/CD pipelines using GitHub Actions, CircleCI, or AWS CodePipeline, improving deployment speed and reliability.
- Partner cross-functionally with data engineering, DevOps, security, and FinOps teams to align platform features with evolving business needs.
- Provide thought leadership on open standards, cost optimization, and scaling data platform capabilities to support AI/ML and analytics initiatives.

Minimum Requirements:
- 14+ years of experience in data engineering, cloud infrastructure, or platform engineering roles, with at least 3 years in a senior or lead capacity.
- Expert-level experience with AWS services (S3, Glue, Kinesis, IAM, CloudWatch, EMR).
- Strong working knowledge of Apache Iceberg or similar open table formats (e.g., Delta Lake, Hudi).
- Proficiency in Python, with the ability to build infrastructure, automation, and data workflows.
- Demonstrated experience designing data lakehouse architectures supporting large-scale analytics and ML use cases.
- Hands-on experience with CI/CD pipelines, infrastructure as code, and cloud-native automation tooling.
- Strong understanding of data governance principles, schema evolution, partitioning, and access controls.

Preferred Qualifications:
- Familiarity with AWS Lake Formation, Snowflake, Databricks, or Trino.
- Experience optimizing cloud cost and performance through FinOps practices.
- Prior experience contributing to platform strategy or mentoring junior engineers.
- Understanding of security, compliance, and operational controls in regulated enterprise environments.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
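A minimal sketch of the core pattern this role describes: creating and appending to a partitioned Apache Iceberg table from Spark. The catalog name, warehouse bucket, and schema are assumptions, and the iceberg-spark runtime package must be on the Spark classpath.

```python
# Iceberg-on-S3 lakehouse sketch (assumed catalog/bucket/schema).
from pyspark.sql import SparkSession
from pyspark.sql.functions import to_date

spark = (
    SparkSession.builder.appName("lakehouse-demo")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

spark.sql("CREATE NAMESPACE IF NOT EXISTS lake.sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.sales.orders (
        order_id BIGINT, region STRING, amount DOUBLE, order_date DATE
    ) USING iceberg PARTITIONED BY (region, days(order_date))
""")

df = (
    spark.createDataFrame(
        [(1, "apac", 99.0, "2025-07-01")],
        ["order_id", "region", "amount", "order_date"],
    )
    .withColumn("order_date", to_date("order_date"))
)
df.writeTo("lake.sales.orders").append()  # any Iceberg-aware engine can query this table
```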

Posted 2 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- Experience building data and AI solutions and working with technical customers.
- Experience designing cloud enterprise solutions and supporting customer projects to completion.
- Ability to communicate in English fluently to support client relationship management in this region.

Preferred qualifications:
- Experience working with Large Language Models, data pipelines, and data analytics and visualization techniques.
- Experience with core data ETL techniques.
- Experience leveraging LLMs to deploy multimodal solutions encompassing text, image, video, and voice.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments (Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume); a minimal Beam sketch follows this posting.
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.

About the job
The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you'll play a key role in ensuring that strategic customers have the best experience moving to the Google Cloud GenAI and Agentic AI suite of products. You will design and implement solutions for customer use cases, leveraging core Google products. You'll work with customers to identify opportunities to transform their business with GenAI, and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product issues, and address customer and partner needs. In this role, you will lead the timely execution of adopting Google Cloud Platform solutions for the customer.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
- Deliver effective big data and GenAI solutions and solve complex technical customer challenges.
- Act as a trusted technical advisor to Google's strategic customers.
- Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
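For a taste of one of the ETL environments named in the preferred qualifications above, here is a minimal Apache Beam pipeline runnable locally with the DirectRunner. The input records and aggregation are invented for the example.

```python
# Minimal Apache Beam pipeline: parse key,value lines and sum per key.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["alpha,3", "beta,5", "alpha,2"])
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "ToKV" >> beam.Map(lambda kv: (kv[0], int(kv[1])))
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)  # swap for beam.io.WriteToText in practice
    )
```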

Posted 2 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka | Job ID 30181207 | Job Category: Digital Technology
Job Title: Data Platform Operations Lead
Preferred Location: Bangalore/Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Responsibilities:

Platform Development & Enablement
- Build and maintain scalable, modular services and frameworks for ELT pipelines, data lakehouse processing, integration orchestration, and infrastructure provisioning.
- Enable self-service capabilities for data engineers, platform operators, and integration teams through tools, documentation, and reusable patterns.
- Lead the platform architecture and development of core components such as data pipelines, observability tooling, infrastructure as code (IaC), and DevOps automation.

Technical Leadership
- Champion platform-first thinking: identify common needs and abstract solutions into shared services that reduce duplication and accelerate delivery.
- Own the technical roadmap for platform capabilities across domains such as Apache Iceberg on S3, AWS Glue, Airflow/MWAA, Kinesis, CDK, and Kubernetes-based services (a skeletal DAG sketch follows this posting).
- Promote design patterns that support real-time and batch processing, schema evolution, data quality, and integration at scale.

Collaboration & Governance
- Collaborate with Data Engineering, Platform Operations, and Application Integration leaders to ensure consistency, reliability, and scalability across the platform.
- Contribute to FinOps and data governance initiatives by embedding controls and observability into the platform itself.
- Work with Architecture and Security to align with cloud, data, and compliance standards.

Minimum Requirements:
- 12+ years of experience in software or data platform engineering, with 2+ years in a team leadership or management role.
- Strong hands-on expertise with AWS cloud services (e.g., Glue, Kinesis, S3), data lakehouse architectures (Iceberg), and orchestration tools (Airflow, Step Functions).
- Experience developing infrastructure as code using AWS CDK, Terraform, or CloudFormation.
- Proven ability to design and deliver internal platform tools, services, or libraries that enable cross-functional engineering teams.
- Demonstrated expertise in Python for building internal tools, automation scripts, and platform services that support ELT, orchestration, and infrastructure provisioning workflows.
- Proven experience leading DevOps teams and implementing CI/CD pipelines using tools such as GitHub Actions, CircleCI, or AWS CodePipeline to support rapid, secure, and automated delivery of platform capabilities.

Preferred Qualifications:
- Experience with Nexla, Kafka, Spark, or Snowflake.
- Familiarity with data mesh or product-based data architecture principles.
- Track record of promoting DevOps, automation, and CI/CD best practices across engineering teams.
- AWS certifications or equivalent experience preferred.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
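A skeletal Airflow DAG illustrating the orchestration pattern this role centers on, written against the Airflow 2.x API. Task bodies, IDs, and the schedule are placeholders rather than anything from the posting.

```python
# Skeletal ELT DAG (Airflow 2.x; placeholder tasks and schedule).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")       # placeholder task body

def load():
    print("write to lakehouse")     # placeholder task body

with DAG(
    dag_id="elt_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract runs before load
```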

Posted 2 days ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka | Job ID 30181204 | Job Category: Digital Technology
Job Title: Technical Specialist
Preferred Location: Bangalore/Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Responsibilities:
- Oversee the day-to-day operations of data platforms, ensuring high availability and optimal performance.
- Incident Response: Lead the resolution of complex technical issues, conducting root cause analyses and implementing preventive measures.
- System Optimization: Analyze system performance metrics to identify areas for improvement, implementing solutions to enhance efficiency and reduce costs.
- Collaboration: Work closely with data engineering, DevOps, and security teams to coordinate deployments, upgrades, and maintenance activities.
- Documentation: Maintain comprehensive documentation of system configurations, operational procedures, and troubleshooting guides.
- Compliance and Security: Ensure that data platforms adhere to organizational policies and industry regulations, implementing necessary controls and audits.
- Innovation: Stay abreast of emerging technologies and industry trends, recommending and implementing new tools and practices to enhance platform operations.

Requirements:
- Experience: 8+ years in platform operations, system administration, or related roles, with a focus on data platforms and cloud environments.
- Technical Proficiency: Strong knowledge of cloud services (e.g., AWS, Azure), containerization (e.g., Docker, Kubernetes), and infrastructure as code (e.g., Terraform, CloudFormation).
- Scripting Skills: Proficiency in scripting languages such as Python, Bash, or PowerShell for automation purposes (a small example follows this posting).
- Problem-Solving: Demonstrated ability to diagnose complex system issues and implement effective solutions.
- Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- Certifications: Relevant certifications such as AWS Certified SysOps Administrator, Microsoft Certified: Azure Administrator, or equivalent are highly desirable.

Additional Qualifications:
- Data Platform Expertise: Experience with data processing frameworks (e.g., Apache Spark), data orchestration tools (e.g., Airflow), and database management systems (e.g., PostgreSQL, MySQL).
- Monitoring Tools: Familiarity with monitoring and logging tools such as Prometheus, Grafana, ELK Stack, or Splunk.
- Agile Methodologies: Experience working in Agile environments, participating in sprint planning, retrospectives, and daily stand-ups.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
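A small illustration of the operations scripting this role calls for: polling a service endpoint, logging latency, and flagging failures. The endpoint URL and threshold are hypothetical, and a production version would feed the alerting stack instead of just logging.

```python
# Minimal health-check script (hypothetical endpoint; sketch only).
import logging
import time

import requests

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
ENDPOINT = "https://example.internal/health"  # hypothetical health endpoint

def check_once(timeout_s: float = 5.0) -> bool:
    start = time.monotonic()
    try:
        resp = requests.get(ENDPOINT, timeout=timeout_s)
        latency_ms = (time.monotonic() - start) * 1000
        logging.info("status=%s latency=%.1fms", resp.status_code, latency_ms)
        return resp.ok
    except requests.RequestException as exc:
        logging.error("health check failed: %s", exc)
        return False  # in production this would page via the alerting stack

if __name__ == "__main__":
    check_once()
```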

Posted 2 days ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- Experience building data and AI solutions and working with technical customers.
- Experience designing cloud enterprise solutions and supporting customer projects to completion.
- Ability to communicate in English fluently to support client relationship management in this region.

Preferred qualifications:
- Experience working with Large Language Models, data pipelines, and data analytics and visualization techniques.
- Experience with core data ETL techniques.
- Experience leveraging LLMs to deploy multimodal solutions encompassing text, image, video, and voice.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments (Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume).
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.

About the job
The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.

As a Cloud Engineer, you'll play a key role in ensuring that strategic customers have the best experience moving to the Google Cloud GenAI and Agentic AI suite of products. You will design and implement solutions for customer use cases, leveraging core Google products. You'll work with customers to identify opportunities to transform their business with GenAI, and deliver workshops designed to educate and empower customers to realize the full potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product issues, and address customer and partner needs. In this role, you will lead the timely execution of adopting Google Cloud Platform solutions for the customer.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
- Deliver effective big data and GenAI solutions and solve complex technical customer challenges.
- Act as a trusted technical advisor to Google's strategic customers.
- Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of key business and technical stakeholders.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Executive - SAP Support position at House of Shipping involves designing, implementing, and managing data solutions using Azure Data services to facilitate data-driven decision-making. Your primary responsibility will be to develop robust data pipelines and architectures for efficient extraction, transformation, and loading (ETL) of data from various sources into the data warehouse or data lakehouse.

Your key duties will include designing and developing data solutions on Azure, managing data integration workflows using Azure Data Factory, implementing data storage solutions using Azure Synapse SQL Pool and other Azure services, and monitoring and optimizing the performance of data pipelines and storage. You will also collaborate with data analysts, developers, and business stakeholders to understand data requirements, and troubleshoot data-related issues to ensure data accuracy, reliability, and availability.

To excel in this role, you should be proficient in Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake Storage, with experience in SQL, RDBMS systems, data modeling, ELT batch design, data integration techniques, Python programming, and serverless architecture using Azure Functions (a minimal sketch follows this posting). Experience with Spark, streaming services, Azure AI services, Power BI, and data lakehouse architecture will be advantageous.

Preferred certifications for this role include Microsoft Certified: Azure Data Engineer Associate or Microsoft Certified: Fabric Analytics Engineer Associate. If you are passionate about data solutions and enjoy working in a collaborative environment to deliver effective solutions, we invite you to join our team at House of Shipping.
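As a sketch of the serverless piece of this stack, here is an HTTP-triggered Azure Function in the Python v2 programming model. The route, query handling, and response are invented; in practice such a function might query Data Factory or Synapse for pipeline run metadata.

```python
# HTTP-triggered Azure Function sketch (Python v2 model; invented route/response).
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="pipeline-status")
def pipeline_status(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("pipeline", "unknown")
    # A real implementation would look up run metadata for this pipeline.
    return func.HttpResponse(f"pipeline {name}: ok", status_code=200)
```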

Posted 2 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About Us:
Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills such as public speaking. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation, you're in the right place.

Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week
Target Joiners: Any (Bachelor's or Master's)

🔥 What You'll Be Owning (Your Impact):
- Lead Activation: Engage daily with high-intent leads through dynamic channels: calls, video consults, and more.
- Sales Funnel Pro: Own the full sales journey, from first hello to successful enrollment.
- Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
- Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark's growth engine.
- Client Success: Ensure a smooth onboarding experience and transition for every new learner.
- Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.

💡 Why Join Sales at Planet Spark?
- Only Warm Leads: Skip the cold calls; our leads already know us and have completed a demo session.
- High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
- Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
- Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
- Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
- Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.

🎯 What You Bring to the Table:
- Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
- Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
- Empathy First: You genuinely care about clients' goals and tailor your approach to meet them.
- Goal-Oriented: You're self-driven, proactive, and hungry for results.
- Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.

✨ What's in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation

Posted 2 days ago

Apply

2.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Tiger Analytics is a global AI and analytics consulting firm with a team of over 2800 professionals focused on using data and technology to solve complex problems that impact millions of lives worldwide. Our culture is centered around expertise, respect, and a team-first mindset. Headquartered in Silicon Valley, we have delivery centers globally and offices in various cities across India, the US, UK, Canada, and Singapore, along with a significant remote workforce. Tiger Analytics is certified as a Great Place to Work. Joining our team means being at the forefront of the AI revolution, working with innovative teams that push boundaries and create inspiring solutions.

We are currently looking for an Azure Big Data Engineer to join our team in Chennai, Hyderabad, or Bangalore. As a Big Data Engineer (Azure), you will build and implement analytics solutions and platforms on Microsoft Azure using a range of open source, big data, and cloud technologies. A typical day might involve designing and building scalable data ingestion pipelines, processing structured and unstructured data, orchestrating pipelines, collaborating with teams and stakeholders, and making critical tech-related decisions (a streaming-ingestion sketch follows this posting).

To be successful in this role, we expect you to have 4 to 9 years of total IT experience, with at least 2 years in big data engineering and Microsoft Azure. You should be proficient in technologies such as Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Strong coding skills in SQL and Python or Scala/Java are essential, as is experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, and Elasticsearch. Knowledge of file formats such as Delta Lake, Avro, Parquet, JSON, and CSV is also required. Ideally, you should have experience building REST APIs, working on Data Lake or Lakehouse projects, supporting BI and Data Science teams, and following Agile and DevOps processes. Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) would be a valuable addition to your profile.

At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with different skills and qualities to apply, even if they do not meet all the criteria for the role. We are committed to providing equal opportunities and fostering a culture of listening, trust, respect, and growth. Please note that the job designation and compensation will be based on your expertise and experience; our compensation packages are competitive within the industry. If you are passionate about leveraging data and technology to drive impactful solutions, we would love to stay connected with you.
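One way to picture the streaming side of this stack: a Spark Structured Streaming job reading a Kafka-compatible source (Event Hubs exposes a Kafka endpoint) and landing Parquet files. Broker, topic, and paths are placeholders, and the spark-sql-kafka package is assumed to be on the classpath.

```python
# Streaming ingestion sketch: Kafka-compatible source -> Parquet sink.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("stream-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "clickstream")                 # placeholder topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "abfss://container@account.dfs.core.windows.net/bronze/clickstream")
    .option("checkpointLocation", "/tmp/checkpoints/clickstream")
    .trigger(processingTime="1 minute")  # micro-batch every minute
    .start()
)
query.awaitTermination()
```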

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Site Reliability Engineering (SRE) Technical Leader on the Network Assurance Data Platform (NADP) team at Cisco ThousandEyes, you will be responsible for the reliability, scalability, and security of the cloud and big data platforms. Your role will involve representing the NADP SRE team, contributing to the technical roadmap, and collaborating with cross-functional teams to design, build, and maintain SaaS systems operating at multi-region scale. Your efforts will be crucial in supporting machine learning (ML) and AI initiatives by ensuring the platform infrastructure is robust, efficient, and aligned with operational excellence.

You will design, build, and optimize cloud and data infrastructure to guarantee high availability, reliability, and scalability of big data and ML/AI systems, implementing SRE principles such as monitoring, alerting, error budgets, and fault analysis (see the error-budget sketch after this posting). Additionally, you will collaborate with various teams to create secure and scalable solutions, troubleshoot technical problems, lead the architectural vision, and shape the technical strategy and roadmap. Your role will also encompass mentoring and guiding teams, fostering a culture of engineering and operational excellence, engaging with customers and stakeholders to understand use cases and feedback, and applying your strong programming skills to integrate software and systems engineering. Furthermore, you will develop strategic roadmaps, processes, plans, and infrastructure to efficiently deploy new software components at enterprise scale while enforcing engineering best practices.

To be successful in this role, you should have relevant experience (8-12 years) and a bachelor's degree in computer science or its equivalent. You should be able to design and implement scalable solutions, with hands-on experience in cloud (preferably AWS), infrastructure-as-code skills, experience with observability tools, proficiency in programming languages such as Python or Go, and a good understanding of Unix/Linux systems and client-server protocols. Experience building cloud, big data, and/or ML/AI infrastructure is essential, along with a sense of ownership and accountability in architecting software and infrastructure at scale.

Additional qualifications that would be advantageous include experience with the Hadoop ecosystem, certifications in cloud and security domains, and experience building or managing a cloud-based data platform. Cisco encourages individuals from diverse backgrounds to apply, as the company values the perspectives and skills that emerge from employees with varied experiences. Cisco believes in unlocking potential and creating diverse teams that are better equipped to solve problems, innovate, and make a positive impact.
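To make the "error budgets" principle above concrete, a tiny worked example: the downtime an availability SLO allows for a period, and what remains after observed incidents. All numbers are illustrative.

```python
# Error-budget arithmetic: a 99.9% SLO over 30 days allows 43.2 minutes of downtime.
def error_budget_minutes(slo: float, period_days: int = 30) -> float:
    """Total allowed downtime for the period under an availability SLO."""
    return period_days * 24 * 60 * (1 - slo)

budget = error_budget_minutes(slo=0.999)   # 43.2 minutes for 30 days
consumed = 12.5                            # minutes of downtime observed so far
print(f"budget={budget:.1f}min consumed={consumed}min remaining={budget - consumed:.1f}min")
```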

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

At ClearTrail, you will be part of a team dedicated to developing solutions that empower those focused on ensuring the safety of individuals, locations, and communities. For over 23 years, ClearTrail has been a trusted partner of law enforcement and federal agencies worldwide, committed to safeguarding nations and enhancing lives. We are leading the way in the future of intelligence gathering through innovative artificial intelligence and machine learning-based lawful interception and communication analytics solutions aimed at addressing the world's most complex challenges.

We are currently looking for a Big Data Java Developer with 2-4 years of experience to join our team in Indore. Your responsibilities will include:
- Designing and developing high-performance, scalable applications using Java and big data technologies.
- Building and maintaining efficient data pipelines for processing large volumes of structured and unstructured data.
- Developing microservices, APIs, and distributed systems.
- Working with Spark, HDFS, Ceph, Solr/Elasticsearch, Kafka, and Delta Lake.
- Mentoring and guiding junior team members.

If you are a problem-solver with strong analytical skills, excellent verbal and written communication abilities, and a passion for developing cutting-edge solutions, we invite you to join our team at ClearTrail and be part of our mission to make the world a safer place.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for designing and building scalable, efficient data warehouses to support analytics and reporting needs. You will develop and optimize ETL pipelines, writing complex SQL queries and automating data pipelines with tools like Apache Airflow. The role also involves query optimization, performance tuning, and database management with MySQL, PostgreSQL, and Spark for structured and semi-structured data.

Ensuring data quality and governance will be a key part of your responsibilities: you will validate, monitor, and enforce practices to maintain data accuracy, consistency, and completeness (a small validation sketch follows this posting), and implement data governance best practices by defining data standards, access controls, and policies for a well-governed data ecosystem. Data modeling, ETL best practices, BI dashboarding, and proposing and implementing improvements to existing systems will also be part of your day-to-day tasks.

Collaboration and problem-solving are essential in this role: you will work independently, collaborate with cross-functional teams, and proactively troubleshoot data challenges. Experience with dbt for data transformations is considered a bonus.

To qualify, you should have 5-7 years of experience in the data domain, with expertise in data engineering and BI. Strong SQL skills, hands-on experience with data warehouse concepts and ETL best practices, and proficiency in MySQL, PostgreSQL, and Spark are required, as are experience with Apache Airflow, data modeling techniques, BI tools like Power BI, Tableau, and Apache Superset, and familiarity with data quality frameworks and governance policies. The ability to work independently, identify problems, and propose effective solutions is crucial.

If you are looking to join a dynamic team at Zenda and have the required experience and skills, we encourage you to apply.
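A lightweight sketch of the data-quality validation the role describes, using pandas. The column names and rules are invented; in a pipeline, a failure like this would typically fail the Airflow task and alert the owner.

```python
# Minimal data-quality checks over a DataFrame (invented columns/rules).
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount": [10.0, None, 5.5, -3.0],
})

checks = {
    "no_null_amount": df["amount"].notna().all(),
    "unique_order_id": df["order_id"].is_unique,
    "non_negative_amount": (df["amount"].dropna() >= 0).all(),
}

failures = [name for name, passed in checks.items() if not passed]
if failures:
    # Surfacing the failure stops bad data from propagating downstream.
    raise ValueError(f"data quality checks failed: {failures}")
```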

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

Are you a skilled professional with experience in SQL, Python (Pandas & SQLAlchemy), and data engineering? We have an exciting opportunity for an ETL Developer to join our team!

As an ETL Developer, you will work with MS SQL, Python, and various databases to extract, transform, and load data in support of insights and business goals. You should have a Bachelor's degree in Computer Science or a related field, or equivalent work experience, plus at least 5 years of experience working with MS SQL, 3 years of experience with Python (Pandas, SQLAlchemy), and 3 years of experience supporting on-call challenges.

Key responsibilities include running SQL queries on multiple disparate databases, working with large datasets using Python and Pandas (see the sketch after this posting), tuning MS SQL queries, debugging data using Python and SQLAlchemy, collaborating in an agile environment, managing source control with GitLab and GitHub, creating and maintaining databases, and interpreting complex data for insights. Familiarity with Azure, ADF, Spark, and Scala concepts is also expected.

If you're passionate about data, possess a strong problem-solving mindset, and thrive in a collaborative environment, we encourage you to apply. For more information or to apply, please send your resume to samdarshi.singh@mwidm.com or contact us at +91 62392 61536. Join us in this exciting opportunity to contribute to our data engineering team!

#ETLDeveloper #DataEngineer #Python #SQL #Pandas #SQLAlchemy #Spark #Azure #Git #TechCareers #JobOpportunity #Agile #DataAnalysis #SQLTuning #OnCallSupport
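A compact, hypothetical example of the posting's core stack: extract with SQLAlchemy, transform with pandas, and load the result back. The connection string and table names are placeholders.

```python
# Extract-transform-load sketch with SQLAlchemy + pandas (placeholder names).
import pandas as pd
from sqlalchemy import create_engine

# Placeholder MS SQL connection string; requires a working ODBC driver.
engine = create_engine(
    "mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server"
)

# Extract: run a SQL query into a DataFrame.
orders = pd.read_sql("SELECT order_id, amount, region FROM dbo.orders", engine)

# Transform: aggregate with pandas.
summary = orders.groupby("region", as_index=False)["amount"].sum()

# Load: write the result to a reporting table.
summary.to_sql("region_sales_summary", engine, if_exists="replace", index=False)
```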

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Apache Spark
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-to-Have Skills: Experience with data processing frameworks.
- Strong understanding of distributed computing principles.
- Familiarity with cloud platforms and services.
- Experience in developing and deploying applications in a microservices architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Technical Lead with over 8 years of experience in Data Engineering, Analytics, and Python development, including at least 3 years in a Technical Lead / Project Management role, you will drive data engineering and analytics projects for our clients. Your client-facing skills will be essential to successful project delivery and effective communication between technical and business stakeholders.

Your responsibilities will include designing and implementing secure, scalable data architectures on cloud platforms such as AWS, Azure, or GCP. You will lead the development of cloud-based data engineering solutions covering data ingestion, transformation, and storage, while defining best practices for securely integrating diverse data sources. Overseeing the security aspects of integrations and ensuring compliance with organizational and regulatory requirements will be part of your role.

In addition, you will develop and manage robust ETL/ELT pipelines using Python, SQL, and modern orchestration tools, and integrate real-time streaming data using technologies such as Apache Kafka, Spark Structured Streaming, or cloud-native services. Collaborating with data scientists to integrate AI models into production pipelines and cloud infrastructure will also be a key responsibility.

Furthermore, you will perform advanced data analysis to generate actionable insights for business use cases, design intuitive Tableau dashboards and data visualizations, and define data quality checks and validation frameworks to ensure high-integrity data pipelines. Your expertise in REST API development, backend services, and secure API integration will be crucial in developing and deploying data products and integrations.

To excel in this role, you must have deep hands-on experience with cloud platforms; expertise in Python, SQL, Spark, Kafka, and streaming integration; proven ability with data warehousing solutions such as BigQuery, Snowflake, and Redshift; and a strong understanding of integration security principles. Proficiency in data visualization with Tableau, REST API development, and AI/ML integration is also essential.

Preferred qualifications include prior experience managing enterprise-scale data engineering projects, familiarity with DevOps practices, and an understanding of regulatory compliance requirements for data handling. Your ability to lead technical teams, ensure project delivery, and drive innovation in data engineering and analytics will be key to your success in this role.
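To illustrate the streaming integration this listing mentions, here is a minimal Spark Structured Streaming sketch that reads from Kafka. The broker, topic, event schema, and paths are assumptions, not details from the posting.

```python
# A minimal sketch of real-time ingestion: Kafka -> parsed events -> Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Hypothetical event schema for the JSON payloads on the topic.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers raw bytes; parse the JSON payload into typed columns.
events = raw.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/raw/events/")         # hypothetical
    .option("checkpointLocation", "s3a://example-bucket/chk/")  # needed for fault tolerance
    .start()
)
query.awaitTermination()
```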

Posted 2 days ago

Apply

4.0 years

0 Lacs

Uttar Pradesh, India

On-site

Job Description

Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking a Testing and Training Analyst - Oracle Fusion (Finance & SCM) to join our team based in Noida.

About us:
Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

Global Business Services India
At Technip Energies, we are continually looking for ways to become more efficient and to improve our quality, customer focus, and cost competitiveness. The Global Business Services (GBS) organization is key to executing this strategy by standardizing our processes and centralizing our services. Our Vision: a customer-focused, cost-efficient, innovative, and high-performing organization that drives functional excellence. GBS provides streamlined and consistent services to our internal customers in the domains of Finance and Accounting, Human Resources, Business Functional Support, Procurement, and Legal. Our services fit our global organization and allow us to focus on business strategy and priorities. GBS also maintains continuous improvement plans to enhance our customer-oriented service culture.

About the opportunity we offer:
Carrying out user regression testing (quarterly or as per requirement)
Performing end-user training for global users
Updating ERP-related documents
Operating as an interface between IDS functions and business end users
All other duties as reasonably requested

About you:
Minimum 4-6 years of experience in the business domain
Functional knowledge of the Finance module of Oracle Fusion ERP (Finance – GL and Cash Management & Procurement)
Experience in ERP testing and training
Experience preparing and updating quality documents
Fluent English

Must have:
Experience in writing and updating procedures, and in process evaluation and documentation
Demonstrated knowledge of ERP testing and training
Excellent communication skills

Your career with us:
Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What's next?
Once we receive your application, our Talent Acquisition professionals will screen and match your profile against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the personal account created during your application. We invite you to get to know more about our company by visiting www.ten.com and following us on LinkedIn, Instagram, Facebook, X and YouTube for company updates.

Posted 2 days ago

Apply

13.0 years

0 Lacs

Uttar Pradesh, India

On-site

Job Description

Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking a Joint Manager - Accounts Payable (Indian Accounting) to join our team based in Noida.

About us:
Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

Global Business Services India
At Technip Energies, we are continually looking for ways to become more efficient and to improve our quality, customer focus, and cost competitiveness. The Global Business Services (GBS) organization is key to executing this strategy by standardizing our processes and centralizing our services. Our Vision: a customer-focused, cost-efficient, innovative, and high-performing organization that drives functional excellence. GBS provides streamlined and consistent services to our internal customers in the domains of Finance and Accounting, Human Resources, Business Functional Support, Procurement, and Legal. Our services fit our global organization and allow us to focus on business strategy and priorities. GBS also maintains continuous improvement plans to enhance our customer-oriented service culture.

About the opportunity we offer:
Responsible for the day-to-day management of the Accounts Payable function for Indian customers.
Administer all AP activities, including invoice reception management, OCR, invoice processing, AP helpdesk, travel & expense, payments, intercompany reconciliations, and month-end including accruals.
Responsible for SLA/KPI management and regular customer governance.
In-depth knowledge and work experience of Indian accounting, Indian taxation and compliance (GST/TDS/MSME/input credit), customs duty, advance payments, etc. is a must, along with strong client-management and team-management skills.
Supervise a staff of payables specialists, with overall responsibility for vendor and employee account management.
Supervise Accounts Payable duties including processing vendor invoices, matching receipts to invoices, monitoring the Accounts Payable mailbox, and ensuring that POs and proper payment approvals are provided.
Ensure vendor invoices and check requests are paid in accordance with the company's cash flow and authorization policies.
Ensure intercompany invoice booking and compliance with respect to tax documentation (No PE, TRC, 10F) as per the DTAA treaties with the respective countries.
Ensure compliance of foreign invoices as per FEMA guidelines.
Ensure PO, non-PO, and subcontract invoices meet compliance requirements (TDS, GST, WCT, LTDC).
Effective stakeholder management: resolve supplier billing discrepancies and related inquiries in collaboration with Business Procurement, MDM, and requestors, with adequate follow-ups in place.
Ensure advances are booked as per the terms agreed in the PO/subcontract.
Maintain vigilance on critical vendor payments, such as MSME vendors, as per the MSME Act.
Manage payments as per agreed PO/subcontract terms, ensuring adjustment of advances and TDS and payment against agreed milestones.
Track advance TDS deduction against invoices and recovery from vendor invoices (TDS deduction provisioned on a yearly basis).
Ensure adequate quality monitoring is in place for transactions processed in the system.
Monitor purchase orders and liaise with the procurement team for PO receipting/GRN and resolving PO discrepancies.
Manage monthly/quarterly/yearly accruals (booking and reversing accruals) for project and non-project activities.
Perform intercompany reconciliation confirmation activities on a quarterly basis, ensuring balance confirmation within threshold limits for group accounts.
Ensure necessary reconciliations are in place, for example vendor reconciliation, contractor status review, and the LTDC tracker.
Perform monthly GST reconciliation, close all related queries, and ensure GST data is ready for return filing.
Honor stringent deadlines, with due coordination and handshakes with stakeholders.
Monitor the performance of direct reports; provide prompt and objective coaching in accomplishing goals; conduct performance reviews, recommend salary increases, and be actively involved in recognition and employee development strategies.
Ensure effective communication is maintained within the department and externally; where appropriate, inform employees of company and department plans and progress; conduct staff meetings at regular intervals.
Drive team development activities and mentor the team on process, people, and stakeholder management.
Build solid relationships across all units at various accounting levels.
Implement best practices and process improvements, and ensure quality and process gaps are reviewed at regular intervals.
Lead and follow through to completion any assigned special projects and cost-effectiveness initiatives.
All other duties as needed or required per business requirements.

About you:
13+ years of finance and accounts experience (including accounts payable) with an Indian BPO/KPO/SSO
At least 3 years of formal people-management experience
Master's degree in commerce, business administration, accounting, finance, or a related field
Strong written and verbal skills, analytical skills, and the ability to compose and initiate correspondence

Key Skills:
Good working knowledge of Indian accounting
Good exposure to overall F&A operations, including AP, AR, and GL
Good working knowledge of an accounting ERP - IFS/Jeevan or Oracle/Oracle Fusion (preferred)
Good written and verbal communication skills
Excellent customer management skills
Knowledge of Indian and international accounting standards
Must be flexible and able to work in 24x7 shifts

Your career with us:
Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What's next?
Once we receive your application, our Talent Acquisition professionals will screen and match your profile against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the personal account created during your application. We invite you to get to know more about our company by visiting www.ten.com and following us on LinkedIn, Instagram, Facebook, X and YouTube for company updates.

Posted 2 days ago

Apply

0.0 years

0 Lacs

Uttar Pradesh, India

On-site

Job Description

Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking a Graduate Trainee – Compliance Support to join our Digiteam based in Noida.

About us:
Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

Global Business Services India
At Technip Energies, we are continually looking for ways to become more efficient and to improve our quality, customer focus, and cost competitiveness. The Global Business Services (GBS) organization is key to executing this strategy by standardizing our processes and centralizing our services. Our Vision: a customer-focused, cost-efficient, innovative, and high-performing organization that drives functional excellence. GBS provides streamlined and consistent services to our internal customers in the domains of Finance and Accounting, Human Resources, Business Functional Support, Procurement, and Legal. Our services fit our global organization and allow us to focus on business strategy and priorities. GBS also maintains continuous improvement plans to enhance our customer-oriented service culture.

About the mission we offer you:
Coordinating with key stakeholders to collect and consolidate evidence
Monitoring quarterly ERP access review campaigns, which are executed in a tool for audit purposes

Key Interactions:
Group Internal Control Team
ERP Support Team
Business Stakeholders

About you:
Graduate (BCA/BSc./B.Com/BBA)
0-1 year of experience
Good verbal and written communication skills
Knowledge of basic accounting and its principles
Knowledge of Office 365

You are meant for this position if you:
Have the zeal to learn and can support existing processes
Adapt to new systems and technology very quickly
Enjoy working in a fast-paced environment
Can work flexible hours during peak business periods

Your career with us:
Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program and the Graduate Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What's next?
Once we receive your application, our recruiting team will screen and match your skills, experience, and potential team fit against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the personal account created during your application. We invite you to get to know more about our company by visiting www.ten.com and following us on LinkedIn, Instagram, Facebook, X and YouTube for company updates.

Posted 2 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About The Role

The Core Analytics & Science Team (CAS) is Uber's primary science organisation, covering both our main lines of business and the underlying platform technologies on which those businesses are built. We are a key part of Uber's cross-functional product development teams, helping to drive every stage of product development through data-analytic, statistical, and algorithmic expertise. CAS owns the experience and algorithms powering Uber's global Mobility and Delivery products. We optimise and personalise the rider experience, target incentives, and introduce customizations for routing and matching for products and use cases that go beyond the core Uber capabilities.

What the Candidate Will Do:
Refine ambiguous questions, generate new hypotheses, and design ML-based solutions that benefit the product through a deep understanding of the data, our customers, and our business
Deliver end-to-end solutions rather than algorithms, working closely with the engineers on the team to productionize, scale, and deploy models worldwide
Use statistical techniques to measure success, and develop north-star metrics and KPIs to provide a more rigorous data-driven approach in close partnership with Product and other subject areas such as engineering, operations, and marketing
Design experiments and interpret the results to draw detailed and impactful conclusions
Collaborate with data scientists and engineers to build and improve the availability, integrity, accuracy, and reliability of data logging and data pipelines
Develop data-driven business insights and work with cross-functional partners to find opportunities and recommend prioritisation of product, growth, and optimisation initiatives
Present findings to senior leadership to drive business decisions

Basic Qualifications:
Undergraduate and/or graduate degree in Math, Economics, Statistics, Engineering, Computer Science, or another quantitative field
4+ years of experience as a Data Scientist, Machine Learning Engineer, or in another data science-focused function
Knowledge of the mathematical foundations underlying machine learning, statistics, optimization, economics, and analytics
Hands-on experience building and deploying ML models
Ability to use a language like Python or R to work efficiently at scale with large data sets
Significant experience in setting up and evaluating complex experiments
Experience with exploratory data analysis, statistical analysis and testing, and model development
Knowledge of modern machine learning techniques applicable to marketplaces and platforms
Proficiency in one or more of the following technologies: SQL, Spark, Hadoop

Preferred Qualifications:
Advanced SQL expertise
Proven track record of wrangling large datasets, extracting insights from data, and summarising learnings/takeaways
Proven aptitude for data storytelling and root cause analysis using data
Advanced understanding of statistics, causal inference, and machine learning
Experience designing and analyzing large-scale online experiments
Ability to deliver on tight timelines and prioritise multiple tasks while maintaining quality and detail
Ability to work in a self-guided manner
Ability to mentor, coach, and develop junior team members
Superb communication and organisation skills
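Since the role leans on designing and reading out experiments, here is a minimal sketch of the kind of analysis such a readout might use: a two-proportion z-test comparing conversion between control and treatment arms. The counts are invented, and statsmodels is an assumed dependency rather than anything the listing names.

```python
# A minimal experiment readout: absolute lift and a two-proportion z-test.
from statsmodels.stats.proportion import proportions_ztest

conversions = [1320, 1415]   # successes in control, treatment (hypothetical)
exposures = [20000, 20000]   # users exposed per arm (hypothetical)

stat, p_value = proportions_ztest(conversions, exposures)
lift = conversions[1] / exposures[1] - conversions[0] / exposures[0]

print(f"absolute lift: {lift:.4%}  z = {stat:.2f}  p = {p_value:.4f}")
```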

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

The Software Engineering team at Dell Technologies is dedicated to delivering next-generation application enhancements and new products to meet the evolving needs of the world. As a Software Principal Engineer in Bangalore, you will be at the forefront of designing and developing software using cutting-edge technologies, tools, and methodologies in collaboration with both internal and external partners.

Your primary responsibility will be to develop sophisticated systems and software solutions aligned with our customers' business goals and requirements. You will work closely with business stakeholders and independently carry out data analysis, ETL tasks, and data administration. Additionally, you will design, develop, and maintain scalable ETL pipelines, collaborating with various teams to ensure data requirements are met. Staying current with industry trends and mentoring junior data engineers with technical guidance will also be part of the role.

To excel in this role, you should have a minimum of 8 years of industry experience with a focus on advanced ETL skills, including proficiency in tools like Airflow, ControlM, or Informatica. Strong Python programming skills for data manipulation, along with a solid understanding of SQL and NoSQL databases, are required. Experience with big data technologies such as Hadoop, Spark, or Kafka, as well as familiarity with cloud platforms like AWS, Azure, ADF Synapse, and SQL Server, will be beneficial. Desirable qualifications include experience with product data and product lifecycle management, as well as working knowledge of data visualization tools like Tableau or Power BI.

At Dell Technologies, we believe in the power of each team member to make a positive impact. We prioritize our team members' growth and development, offering opportunities to work with cutting-edge technology and some of the industry's best minds. If you are seeking a rewarding career where you can contribute to a future that benefits everyone, we invite you to join us.

Application closing date: 30 July 2025

Dell Technologies upholds the principle of equal employment opportunity and is committed to providing a work environment free of discrimination and harassment for all employees. If you are ready to take the next step in your career with us, we look forward to welcoming you to our team.
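As one illustration of the ETL work described here, a minimal, self-contained sketch of an idempotent load step; sqlite3 stands in for a real warehouse connection, and the table and rows are hypothetical.

```python
# A minimal idempotent ETL load: the upsert makes the step safe to re-run
# after a pipeline retry, a common requirement in orchestrated ETL.
import sqlite3

# Hypothetical extracted batch: (sale_date, sku, amount).
rows = [("2024-01-01", "SKU-1", 120.0), ("2024-01-01", "SKU-2", 80.5)]

with sqlite3.connect("warehouse.db") as conn:
    conn.execute(
        """CREATE TABLE IF NOT EXISTS daily_sales (
               sale_date TEXT, sku TEXT, amount REAL,
               PRIMARY KEY (sale_date, sku))"""
    )
    conn.executemany(
        """INSERT INTO daily_sales (sale_date, sku, amount)
           VALUES (?, ?, ?)
           ON CONFLICT(sale_date, sku) DO UPDATE SET amount = excluded.amount""",
        rows,
    )
```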

Posted 2 days ago

Apply

1.0 - 3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Working with Us

Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us.

Position Summary

The GPS Data & Analytics Software Engineer I role is accountable for developing data solutions: building pipelines for data enablement projects, handling production/application support, and delivering enhancements. Additional responsibilities include data analysis, data operations processes and tools, data cataloguing, and developing data SME skills in the Global Product Development and Supply - Analytics and AI Enablement organization.

Key Responsibilities

The Data Engineer will be responsible for designing, building, delivering, and maintaining high-quality data products and analytics-ready data solutions for GPS Cell Therapy:
Develop cloud-based (AWS) data pipelines using DBT and Glue (a sketch follows this listing)
Optimize data storage and retrieval to ensure efficient performance and scalability
Collaborate with data architects, data analysts, and stakeholders to understand their data needs and ensure that the data infrastructure supports their requirements
Ensure data quality and protection through validation, testing, and security protocols
Implement and maintain security protocols to protect sensitive data
Stay up to date with emerging trends and technologies in data engineering, analytics engineering, and analytics, and adapt to new technologies
Participate in the analysis, design, build, manage, and operate lifecycle of the enterprise data lake and analytics-focused digital capabilities
Work in an agile environment, debugging issues as they arise
Understand existing models where required and take them forward
Use JIRA for effort estimation, task tracking, and communication about tasks
Use GIT for version control, quality checks, and reviews
Proficiency in Python, Spark, SQL, AWS Redshift, DBT, AWS S3, Glue/Glue Studio, Athena, IAM, and other native AWS services; familiarity with Domino and data lake principles
Good to have: hands-on knowledge of React.js for creating analytics dashboards if required
Good to have: knowledge of AWS CloudFormation templates
Partner with other data, platform, and cloud teams to identify opportunities for continuous improvement

Required:
1-3 years of experience in the information technology field developing AWS cloud-native data lakes and ecosystems
Understanding of cloud technologies, preferably AWS, and related services for delivering and supporting data and analytics solutions/data lakes
Working knowledge of GIT and version control good practices
Proficiency in Python, Spark, SQL, and AWS services
Good to have: experience with React.js and full-stack technologies
Good to have: experience in an agile development environment using JIRA or similar task tracking and management tools
Good to have: experience/knowledge of working with DBT
Knowledge of data security and privacy best practices

Ideal Candidates Would Also Have:
Prior experience in global life sciences, especially in the GPS functional area
Experience working internationally with a globally dispersed team, including diverse stakeholders and management of offshore technical development teams
Strong communication and presentation skills

Other Qualifications:
Bachelor's degree in Computer Science, Information Systems, Computer Engineering, or equivalent is preferred

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers

With a single vision as inspiring as Transforming patients' lives through science™, every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol

BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/

Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
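To make the Glue pipeline work referenced above concrete, here is a minimal AWS Glue (PySpark) job skeleton. The catalog database, table, and S3 path are hypothetical, and the awsglue modules are provided only inside the Glue job runtime.

```python
# A minimal AWS Glue job: read from the Data Catalog, write curated Parquet.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table names).
batches = glue_context.create_dynamic_frame.from_catalog(
    database="cell_therapy_raw", table_name="manufacturing_batches"
)

# Write curated output to the lake as Parquet (hypothetical S3 path).
glue_context.write_dynamic_frame.from_options(
    frame=batches,
    connection_type="s3",
    connection_options={"path": "s3://example-lake/curated/batches/"},
    format="parquet",
)
job.commit()
```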

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies