
15886 Spark Jobs - Page 49

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 6.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Responsibilities:
- Design, develop, and test components/modules in TOP (Trade Open Platform) involving Spark, Java, Hive, and related big-data technologies in a data lake architecture.
- Contribute to the design, development, and deployment of new features and components in the Azure public cloud.
- Contribute to the evolution of REST APIs in TOP: enhancement, development, and testing of new APIs.
- Ensure TOP processes deliver optimal performance; assist in performance tuning and optimization.
- Release deployment: deploy using CI/CD practices and tools across environments (development, UAT, and production) and follow production processes. Ensure craftsmanship practices are followed.
- Follow the Agile-at-Scale process: participate in PI planning and follow-up, sprint planning, and backlog maintenance in Jira.
- Organize training sessions on the core platform and related technologies for the Tribe/Business line so relevant stakeholders stay up to date on platform evolution.

Profile required:
- Around 4-6 years of experience in the IT industry, preferably in the banking domain.
- Expertise and experience in Java 8 (building APIs, threads, collections, streams, dependency injection/inversion) and JUnit; big data (Spark, Oozie, Hive); and Azure (AKS, CLI, Event, Key Vault). Should have been part of digital transformation initiatives, with knowledge of Unix, SQL/RDBMS, and monitoring.
- Development experience with REST APIs.
- Experience with tools: Git/Bitbucket, Jenkins, NPM, Docker/Kubernetes, Jira, Sonar.
- Knowledge of Agile practices and Agile at Scale.
- Good communication and collaboration skills.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Overview

We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers, and history makers, located around the world and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI overview: With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS will also lead, facilitate, and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support "pre-engagement" activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad

The role will develop Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big-analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark/Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning services and pipelines.

Responsibilities:
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope.
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities.
- Ensure on-time and on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Use big-data technologies to help process data and build scaled data pipelines (batch to real time).
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines.
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure.
- Automate ML model deployments.

Qualifications:
- Minimum 3 years of hands-on work experience in data science / machine learning.
- Minimum 3 years of SQL experience.
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers.
- BE/BS in Computer Science, Math, Physics, or other technical fields.
- Data science: hands-on experience and strong knowledge of building machine learning models, both supervised and unsupervised.
- Programming skills: hands-on experience in statistical programming languages like Python and database query languages like SQL.
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud: experience in Databricks and ADF is desirable.
- Familiarity with Spark, Hive, and Pig is an added advantage.
- Model deployment experience is a plus.
- Experience with version control systems like GitHub and CI/CD tools.
- Experience in exploratory data analysis.
- Knowledge of MLOps/DevOps and deploying ML models is required.
- Experience using MLflow, Kubeflow, etc. is preferred.
- Experience executing and contributing to MLOps automation infrastructure is good to have.
- Exceptional analytical and problem-solving skills.
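As a minimal illustration of the supervised modelling this listing asks for, the sketch below fits a one-feature linear model with closed-form least squares and evaluates it on held-out data. This is a stdlib-only toy with invented data, not the actual toolset named above (in practice this would be scikit-learn or Spark MLlib on Databricks, with runs tracked in MLflow).

```python
# Minimal supervised-learning sketch: fit y = w*x + b by closed-form least
# squares on training data, then evaluate mean squared error on a test split.

def fit_linear(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return w, my - w * mx

def mse(xs, ys, w, b):
    """Mean squared error of predictions w*x + b against ys."""
    return sum((y - (w * x + b)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Toy data generated by y = 2x + 1 exactly, so the fit should recover w=2, b=1.
train_x, train_y = [0, 1, 2, 3], [1, 3, 5, 7]
test_x, test_y = [4, 5], [9, 11]

w, b = fit_linear(train_x, train_y)
print(w, b, mse(test_x, test_y, w, b))  # -> 2.0 1.0 0.0
```

The train/test split mirrors the basic discipline behind the "exploratory data analysis" and model-evaluation skills listed: a model is judged on data it has not seen.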

Posted 4 days ago

Apply

0.0 - 3.0 years

0 - 0 Lacs

Pune, Maharashtra

Remote

We’re seeking an experienced and confident cold-calling expert who can make high-quality outbound calls to businesses and prospects in the USA market. Your goal is to generate qualified leads, set appointments, and spark meaningful business conversations.

Key Responsibilities (female candidates only):
- Make outbound cold calls to US-based leads (B2B).
- Engage prospects, introduce services, and handle objections.
- Qualify leads and schedule meetings for the sales team.
- Maintain CRM records and update call outcomes.
- Meet weekly call-volume and conversion targets.

Requirements:
- Proven cold-calling or outbound sales experience (USA market preferred).
- Fluent English communication skills, both spoken and written.
- Comfortable with US business culture and etiquette.
- Strong interpersonal and persuasive skills.
- Self-motivated, reliable, and target-oriented.
- Tech-savvy, with experience using CRMs (e.g., HubSpot, Zoho, Salesforce).
- Quiet work environment and a noise-canceling headset.

Preferred Qualifications:
- Previous BPO, VA agency, or lead-generation experience.
- Experience calling for real estate, SaaS, medical, or service-based companies.
- Familiarity with tools like Dialpad, Aircall, RingCentral, etc.

What We Offer:
- Competitive hourly rate or commission-based model (based on experience).
- Flexible working hours.
- Work from the comfort of your home.
- Ongoing projects with long-term opportunities.
- Supportive team and training (if required).

Job Types: Full-time, Part-time
Pay: ₹15,000.00 - ₹40,000.00 per month
Benefits: Health insurance, internet reimbursement, paid sick time
Education: Bachelor's (Preferred)
Experience: Total: 3 years (Required)
Language: English (Preferred), US English (Required)
Location: Pune, Maharashtra (Required)
Shift availability: Night Shift (Required)
Work Location: Remote
Speak with the employer: +91 7841970552

Posted 4 days ago

Apply

6.0 - 8.0 years

1 - 4 Lacs

Chennai

Hybrid

- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design.
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH).

Posted 4 days ago

Apply

12.0 years

0 Lacs

Greater Chennai Area

On-site

Job Description

The Data Engineering team within the AI, Data, and Analytics (AIDA) organization is the backbone of our data-driven sales and marketing operations. We provide the essential foundation for transformative insights and data innovation. By focusing on integration, curation, quality, and data expertise across diverse sources, we power world-class solutions that advance Pfizer’s mission. Join us in shaping a data-driven organization that makes a meaningful global impact.

Role Summary

We are seeking a technically adept and experienced Data Solutions Engineering Senior Manager who is passionate about and skilled in designing and developing robust, scalable data models. This role focuses on optimizing the consumption of data sources to generate unique insights from Pfizer’s extensive data ecosystems. A strong technical design and development background is essential to ensure effective collaboration with engineering and developer team members.

As a Senior Data Solutions Engineer in our data lake/data warehousing team, you will play a crucial role in designing and building data pipelines and processes that support data transformation, workload management, data structures, dependencies, and metadata management. Your expertise will be pivotal in creating and maintaining the data capabilities that enable advanced analytics and data-driven decision-making.

In this role, you will work closely with stakeholders to understand their needs and collaborate with them to create end-to-end data solutions. This process starts with designing data models and pipelines and establishing robust CI/CD procedures. You will work with complex and advanced data environments, design and implement the right architecture to build reusable data products and solutions, and support various analytics use cases, including business reporting, production data pipelines, machine learning, optimization models, statistical models, and simulations.

As the Data Solutions Engineering Senior Manager, you will develop sound data quality and integrity standards and controls. You will enable data engineering communities with standard protocols to validate and cleanse data, resolve data anomalies, implement data quality checks, and conduct system integration testing (SIT) and user acceptance testing (UAT). The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data-driven solutions for the pharmaceutical industry.

Role Responsibilities:
- Project solutioning, including scoping and estimation.
- Data sourcing, investigation, and profiling.
- Prototyping and design thinking.
- Designing and developing data pipelines and complex data workflows.
- Creating standard procedures to ensure efficient CI/CD.
- Project documentation and playbooks, including but not limited to physical models, conceptual models, data dictionaries, and data cataloging.
- Technical issue debugging and resolution.
- Engineering development of both internal- and external-facing data solutions, conforming to EDSE and Digital technology standards.
- Partnering with internal/external partners to design, build, and deliver best-in-class data products globally, improving the quality of our customer analytics and insights and supporting Commercial's role in helping patients.
- Demonstrating outstanding collaboration and operational excellence.
- Driving best practices and world-class product capabilities.

Qualifications:
- Bachelor’s degree in a technical area such as computer science, engineering, or management information science; a master’s degree is preferred.
- 12 to 16 years of combined data warehouse/data lake experience as a data lake/warehouse developer or data engineer.
- 12 to 16 years developing data products and data features servicing analytics and AI use cases.
- Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred.
- Domain knowledge of the pharmaceutical industry is preferred.
- Good knowledge of data governance and data cataloging best practices.

Technical Skillset:
- 9+ years of hands-on experience working with SQL, Python, and object-oriented or scripting languages (e.g., Java, C++) to build data pipelines and processes.
- Proficiency in SQL programming, including the ability to create and debug stored procedures, functions, and views.
- 9+ years of hands-on experience designing and delivering data lake/data warehousing projects.
- Minimum of 5 years of hands-on experience designing data models.
- Proven ability to help the team resolve technical issues.
- Proficiency with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable.
- Experience with AWS services (EC2, EMR, RDS, Spark) is preferred.
- Solid understanding of Scrum/Agile is preferred, along with working knowledge of CI/CD, GitHub, and MLflow.
- Familiarity with data privacy standards, governance principles, data protection, pharma industry practices, and GDPR compliance is preferred.
- Great communication skills.
- Great business-influencing and stakeholder-management skills.

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.

Information & Business Tech
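The data-quality checks this role describes can be expressed as parameterized SQL assertions run against a staging table. Below is a hypothetical stdlib-only sketch using sqlite3 as a stand-in for a warehouse such as Snowflake; the table, column names, and checks are invented for illustration, not Pfizer's actual standards.

```python
# Hypothetical data-quality-check sketch: each check is a SQL query counting
# offending rows (0 = pass), run against an in-memory staging table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (id INTEGER, region TEXT, amount REAL);
    INSERT INTO stg_sales VALUES (1, 'NA', 100.0), (2, 'EU', 55.5),
                                 (2, 'EU', 55.5), (3, NULL, 10.0);
""")

def run_checks(conn):
    """Return a dict of check name -> number of offending rows."""
    checks = {
        "null_region": "SELECT COUNT(*) FROM stg_sales WHERE region IS NULL",
        "duplicate_ids": """SELECT COUNT(*) FROM (SELECT id FROM stg_sales
                            GROUP BY id HAVING COUNT(*) > 1)""",
        "negative_amount": "SELECT COUNT(*) FROM stg_sales WHERE amount < 0",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = run_checks(conn)
print(results)  # -> {'null_region': 1, 'duplicate_ids': 1, 'negative_amount': 0}
```

In a pipeline, a non-zero count would either fail the load or route the offending rows to a quarantine table for anomaly resolution, matching the validate/cleanse protocol described above.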

Posted 4 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description

Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking an Assistant Manager / Manager - Planning to join our Planning team based in Noida, India.

About us: Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

About the job - Responsibilities:
- Defines project breakdown structures.
- Develops a schedule compliant with the selected execution strategy.
- Arranges schedule reviews, identifies key milestones, and highlights critical paths.
- Supports the client approval process for the schedule baseline, progress claimed, or milestone achievement.
- Monitors and controls progress of work in terms of: physical progress and analysis; deliverables and man-hours spent; schedule of work; project status and forecast.
- Monitors project status to detect any delay and propose corrective actions.
- Well versed with planning package deliverables, and with engineering discipline deliverables and their associated linkages.
- Prepares periodic reports to the client/project management.
- Coordinates and follows up with other departments and disciplines on progress measurement and the forecast plan.
- Highlights areas of concern and solutions to help achieve successful and timely completion of projects.
- Operationally manages the project planning team as provided.
- Plans to mitigate risks and reduce costs.
- Defines and optimizes the necessary resources (E-P-C-C).
- Defines the Planning & Scheduling system (methods, procedures, time schedules, physical progress, dashboards, etc.).
- Defines, for Management, the Proposal Manager, and the Estimation Department, a contract duration that is realistic and feasible for execution.
- Increases project team members’ awareness of the project milestones, main durations, and critical paths.
- Estimates delays linked to changes and claims, and prepares the Extension of Time analysis in case of delay due to the client.
- Provides project feedback and lessons learned to improve Planning & Scheduling methods, and provides input to the Planning Department’s statistics.

About you:
- Engineering degree or equivalent diploma.
- 5-15 years of experience in project execution of Detailed Engineering / EPC / EPCM projects.
- At least one full EPCC project lifecycle experience.
- Good understanding of business principles for the energy (preferably oil & gas) industry.
- Good knowledge of scheduling tools: Primavera (must) / MS Project.
- Thorough understanding of and experience in project delivery processes and principles, and activity linkages.
- Good data-handling capabilities with spreadsheets, databases, and business intelligence tools.
- Professional English.
- Solid analytical and problem-solving skills; accuracy and attention to detail.
- Able to present data and facts in a clear and consistent manner; effective communication skills.
- Knowledge of construction-associated activities.
- Knowledge of tools like Acumen Fuse / Power BI is an additional advantage.

Your career with us: Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What’s next? Once we receive your application, our Talent Acquisition professionals will screen your profile against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the candidate profile you created when applying. We invite you to get to know more about our company and to follow us on LinkedIn, Instagram, Facebook, X, and YouTube for company updates.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Description

Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking a Manager - Planning to join our Planning team based in Gandhinagar, Gujarat.

About us: Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

About the job - Responsibilities:
- Defines project breakdown structures.
- Develops a schedule compliant with the selected execution strategy.
- Arranges schedule reviews, identifies key milestones, and highlights critical paths.
- Supports the client approval process for the schedule baseline, progress claimed, or milestone achievement.
- Monitors and controls progress of work in terms of: physical progress and analysis; deliverables and man-hours spent; schedule of work; project status and forecast.
- Monitors project status to detect any delay and propose corrective actions.
- Well versed with planning package deliverables, and with engineering discipline deliverables and their associated linkages.
- Prepares periodic reports to the client/project management.
- Coordinates and follows up with other departments and disciplines on progress measurement and the forecast plan.
- Highlights areas of concern and solutions to help achieve successful and timely completion of projects.
- Operationally manages the project planning team as provided.
- Plans to mitigate risks and reduce costs.
- Defines and optimizes the necessary resources (E-P-C-C).
- Defines the Planning & Scheduling system (methods, procedures, time schedules, physical progress, dashboards, etc.).
- Defines, for Management, the Proposal Manager, and the Estimation Department, a contract duration that is realistic and feasible for execution.
- Increases project team members’ awareness of the project milestones, main durations, and critical paths.
- Estimates delays linked to changes and claims, and prepares the Extension of Time analysis in case of delay due to the client.
- Provides project feedback and lessons learned to improve Planning & Scheduling methods, and provides input to the Planning Department’s statistics.

About you:
- Engineering degree or equivalent diploma.
- 10+ years of experience in project execution of Detailed Engineering / EPC / EPCM projects.
- At least one full EPCC project lifecycle experience.
- Good understanding of business principles for the energy (preferably oil & gas) industry.
- Good knowledge of scheduling tools: Primavera (must) / MS Project.
- Thorough understanding of and experience in project delivery processes and principles, and activity linkages.
- Good data-handling capabilities with spreadsheets, databases, and business intelligence tools.
- Professional English.
- Solid analytical and problem-solving skills; accuracy and attention to detail.
- Able to present data and facts in a clear and consistent manner; effective communication skills.
- Knowledge of construction-associated activities.
- Knowledge of tools like Acumen Fuse / Power BI is an additional advantage.

Your career with us: Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What’s next? Once we receive your application, our Talent Acquisition professionals will screen your profile against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the candidate profile you created when applying. We invite you to get to know more about our company and to follow us on LinkedIn, Instagram, Facebook, X, and YouTube for company updates.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Uttar Pradesh, India

On-site

Job Description

Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking a Cybersecurity Risk Analyst to join our team based in Noida. The IT Risk Analyst reports directly to the IT Risk Manager and is in charge of risk identification, assessment, mitigation, and follow-up processes for both IT and OT environments.

About us: Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

Global Business Services India: At Technip Energies, we are continually looking for ways to become more efficient, and ways to improve our quality, customer focus, and cost competitiveness. The Global Business Services (GBS) organization is key to executing this strategy, by standardizing our processes and centralizing our services. Our vision: a customer-focused, cost-efficient, innovative, and high-performing organization that drives functional excellence. GBS provides streamlined and consistent services to our internal customers in the domains of Finance and Accounting, Human Resources, Business Functional Support, Procurement, and Legal. Our services fit our global organization and allow us to focus on business strategy and priorities. GBS also maintains continuous improvement plans to enhance our customer-oriented service culture.

Responsibilities:
- Responsible for Digiteam and cybersecurity risk identification, assessment, mitigation, and follow-up.
- Maintains the documentation relating to risk management processes.
- Responsible for maintaining a risk register at group level.
- Responsible for calculating and communicating key risk indicators (KRIs) for the whole cybersecurity department.
- Collaborates with Security Operations Center (SOC) teams to analyze incident trends and integrate findings into risk assessments.
- Supports the development and implementation of risk treatment plans, including technical controls and compensating measures.

About you:
- At least 5 years of experience in IT risk management.
- Certifications (preferred but not mandatory): ITIL, CRISC (Certified in Risk and Information Systems Control), CISM (Certified Information Security Manager), ISO 27005 Risk Manager, or equivalent field experience.
- Hands-on experience conducting cybersecurity risk assessments in hybrid environments (on-premises and cloud).
- Experience working with DevSecOps teams to integrate risk management into CI/CD pipelines.
- Familiarity with incident response processes and post-incident risk re-evaluation.

Technical skills:
- Strong understanding of cyber threat intelligence and its application in risk management.
- Familiarity with GRC platforms (e.g., SureCloud, ServiceNow GRC) for risk tracking and reporting.
- Experience with vulnerability management tools (e.g., Tenable, Qualys, Rapid7) and interpreting scan results.
- Knowledge of cloud security frameworks (e.g., CSA CCM, Azure Security Benchmark, AWS Well-Architected Framework).
- Understanding of secure architecture principles and the ability to review system designs for risk exposure.
- Familiarity with compliance frameworks such as GDPR, SOX, and industry-specific standards (e.g., IEC 62443 for OT).
- Familiarity with ISO 2700x, NIST, and CIS frameworks.
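The KRI calculation and scan-result interpretation mentioned above might look like the following toy roll-up of open vulnerability findings into a per-asset score and rating. Everything here is invented for illustration (the severity weights, thresholds, asset names, and the scoring scheme itself are not a standard).

```python
# Hypothetical KRI sketch: weight open findings by severity, sum per asset,
# and map the score to a traffic-light rating for reporting.

SEVERITY_WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def kri_score(findings):
    """findings: list of (asset, severity). Returns asset -> weighted score."""
    scores = {}
    for asset, severity in findings:
        scores[asset] = scores.get(asset, 0) + SEVERITY_WEIGHTS[severity]
    return scores

def risk_rating(score, amber=10, red=20):
    """Map a numeric KRI score to a traffic-light rating."""
    return "red" if score >= red else "amber" if score >= amber else "green"

findings = [("web-01", "critical"), ("web-01", "high"),
            ("db-01", "medium"), ("db-01", "low"), ("db-01", "medium")]
scores = kri_score(findings)
print({asset: (s, risk_rating(s)) for asset, s in scores.items()})
# -> {'web-01': (15, 'amber'), 'db-01': (5, 'green')}
```

In practice, the weights and thresholds would come from the organization's risk methodology, and the inputs would be pulled from a scanner or GRC platform rather than hard-coded.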
Your career with us: Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What’s next? Once we receive your application, our Talent Acquisition professionals will screen your profile against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the candidate profile you created when applying. We invite you to get to know more about our company and to follow us on LinkedIn, Instagram, Facebook, X, and YouTube for company updates.

Posted 4 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies, and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths, and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
Responsibilities
- Lead software application backend design, development, delivery, and maintenance.
- Evaluate and select alternative technical solutions for identified requirements, with knowledge of backend and J2EE application development.
- Work with an onshore team to clarify business requirements into product features, acting as a liaison between business and technical teams.
- Resolve technical issues and provide technical support.
- Provide technical guidance and assistance to other software engineers.
- Prepare the staffing plan and allocation of resources.
- Assist project managers in resolving issues and conflicts within their projects.
- Improve customer relations through effective communication, managing expectations, and meeting commitments.
- Keep abreast of technical and organizational developments in your own professional field.

Required Qualifications
- Bachelor's degree in computer science, information technology, or a related area (equivalent work experience will be considered).
- 1+ years' experience developing business applications across a full software development life cycle using web technologies.
- 1+ years' experience in software development analysis and design (UML).
- Advanced experience with Node.js, ReactJS, JavaScript, TypeScript, HTML5, CSS3, SASS, Python and web service integration.
- Solid technical background in J2EE, Struts, Spring, Hibernate, and MuleSoft.
- Experience with PostgreSQL, Microsoft SQL Server, Nginx, Docker, Redis, Spring Boot and Spring Cloud, Web Services, and WebSphere/JBoss/WebLogic.
- Experience using at least one of the following cloud platforms: Azure, AWS, GCP.
- A deep understanding of Azure DevOps, Azure Synapse Analytics, Databricks, Delta Lake and the Lakehouse architecture is preferred.
- Experience designing, developing, and optimizing data processing applications using Apache Spark in Databricks.
- Capable of writing efficient Spark jobs in languages such as Scala, Python (PySpark) and Spark SQL.
- Familiarity with the application and integration of Generative AI, Prompt Engineering and Large Language Models (LLMs) in enterprise solutions.
- Ability to independently design and implement the backend of an entire business module.
- Excellent interpersonal skills, particularly in balancing requirements, managing expectations, collaborating with team members, and driving effective results.
- Proactive attitude, ability to work independently, and a desire to continuously learn new skills and technologies.
- Excellent written and verbal communication skills in English.

Additional Or Preferred Qualifications
- Master’s degree in computer science, information technology, or related majors.
- Technical lead experience.
- 3+ years’ experience developing business applications across a full software development life cycle using web technologies.
- Experience using Azure and either AWS or GCP.
- Experience with data visualization tools such as Power BI or Tableau.
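As a hypothetical illustration (not part of the posting): the Spark-in-Databricks work this role describes often reduces to window-style transformations such as "keep the latest record per key", which in Spark SQL is a `ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) = 1` filter. The same logic is mirrored below in plain Python so it runs without a cluster; the field names (`id`, `updated_at`, `qty`) are invented for the example.

```python
# Plain-Python mirror of a Spark "latest record per key" window filter.
# In PySpark the equivalent uses Window.partitionBy("id").orderBy(desc("updated_at")).

def latest_per_key(rows, key="id", order="updated_at"):
    """Keep the most recent row per key (mirrors a ROW_NUMBER window filter)."""
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[order] > best[k][order]:
            best[k] = row
    return sorted(best.values(), key=lambda r: r[key])

trades = [
    {"id": 1, "updated_at": "2024-01-01", "qty": 100},
    {"id": 1, "updated_at": "2024-01-03", "qty": 150},
    {"id": 2, "updated_at": "2024-01-02", "qty": 200},
]
print(latest_per_key(trades))
```

The string timestamps compare correctly here only because they are ISO-8601 formatted; real pipelines would use proper timestamp types.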

Posted 4 days ago

Apply

170.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Summary
Processes
- Good verbal and written communication skills
- Analytical and structured problem-solving skills
- Ability to learn and adapt to new technologies and frameworks
- Good programming and debugging skills
- Ability to handle raw and unstructured data
- Good understanding of the software development life cycle (Agile and Waterfall models)
- Understanding of coding standards
- Understanding of source control, versioning, branching, etc.
- Hands-on with a big data toolset such as Hadoop, HDFS, Hive, Spark, Bash scripting
- Hands-on SQL
- Hands-on experience with a reporting tool (e.g. Tableau, Dataiku, MSTR) is a plus
- Familiarity with Enterprise Data Warehouse and Reference Data Management is a plus
- Familiarity with Control-M (or another job orchestration tool) is a plus
- Familiarity with building ELT/ETL pipelines in Hadoop is a plus
- Familiarity with Azure DevOps is a plus

Key Responsibilities
Regulatory & Business Conduct
- Display exemplary conduct and live by the Group’s Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
- Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
- Lead to achieve the outcomes set out in the Bank’s Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
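As a hypothetical illustration (not from the posting): the "building ELT/ETL pipelines in Hadoop" skill above usually means landing raw files, validating them, and writing the clean rows partition-wise, e.g. a Hive `INSERT ... PARTITION (dt)`. The same shape is sketched in plain Python so it runs anywhere; the columns and data-quality rule are invented for the example.

```python
# Minimal ETL sketch: parse raw CSV text, reject malformed or invalid rows,
# and bucket the clean rows by date partition (like a Hive partitioned load).
import csv
import io
from collections import defaultdict

RAW = "2024-01-05,alice,100\n2024-01-05,bob,-3\nbad line\n2024-01-06,carol,70\n"

def etl(raw_text):
    """Parse, validate, and bucket rows by date partition; collect rejects."""
    partitions, rejects = defaultdict(list), []
    for row in csv.reader(io.StringIO(raw_text)):
        if len(row) != 3:
            rejects.append(row)          # malformed record
            continue
        dt, user, amount = row
        try:
            amount = int(amount)
        except ValueError:
            rejects.append(row)          # non-numeric amount
            continue
        if amount < 0:                   # toy data-quality rule
            rejects.append(row)
            continue
        partitions[dt].append({"user": user, "amount": amount})
    return dict(partitions), rejects

parts, bad = etl(RAW)
print(sorted(parts), len(bad))
```

In a real Hadoop pipeline the reject stream would land in its own table for reconciliation rather than being silently dropped.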
- Serve as a Director of the Board; exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent).

Key Stakeholders
- FCSO development teams and FCSO Business

Skills And Experience
- Hadoop
- Apache Hive
- PySpark
- SQL
- Azure DevOps
- Control-M

Qualifications
- Education: Diploma/Degree

Competencies
Action Oriented; Collaborates; Customer Focus; Gives Clarity & Guidance; Manages Ambiguity; Develops Talent; Drives Vision & Purpose; Nimble Learning; Decision Quality; Courage; Instills Trust; Strategic Mindset.
Technical Competencies: a generic competency to evaluate the candidate on role-specific technical skills and requirements.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together
- We do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- We never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- We are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combined to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Recruitment Assessments
Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news. It means your application has progressed to an important stage of our recruitment process.
Visit our careers website www.sc.com/careers

Posted 4 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

LinkedIn was built to help professionals achieve more in their careers, and every day millions of people use our products to make connections, discover opportunities and gain insights. Our global reach means we get to make a direct impact on the world’s workforce in ways no other company can. We are much more than a digital resume – we transform lives through innovative products and technology. At LinkedIn, our approach to flexible work is centered on trust and optimized for culture, connection, clarity, and the evolving needs of our business. The work location of this role is hybrid, meaning it will be performed both from home and from a LinkedIn office on select days, as determined by the business needs of the team. Productivity Engineering is a team at LinkedIn that builds products that power LinkedIn’s business. We drive technology vision, architecture, and design systems that help the company deliver major business processes (go-to-market, sales, finance, customer support, etc.). We deliver applications and products that let our customers do business with us in a seamless way, help grow our top line and increase our efficiency. As a Staff Software Engineer, you will play a pivotal role in the design and development of business processes, systems, applications, information and data. While working on this team, you’ll get a chance to make key technology decisions, solve massively complex problems, and develop new skills in the process. You will be results-driven, coaching teams to deliver software products at the highest levels of quality and craftsmanship. This position is full-time and based in our Bengaluru, India office.
Responsibilities:
The ideal candidate will help scale LinkedIn’s Productivity Engineering team as we continue to experience dramatic growth in our members and the usage of our products, with a focus on one or more of the areas below:
- Applications and Automation: Design and develop software applications and automation tools to power the Sales & Operations team and help develop software to boost business as well as sales productivity.
- Microservices: Develop end-to-end microservices and work on defining the next-generation platform for internal applications at LinkedIn.
- Enterprise Software: Build and operate the platform that powers enterprise applications at LinkedIn. The goal is to provide and run, in a 24/7 production environment, a platform that enables our sales users and employees to be more productive, while remaining constantly available and performant.
- Metrics and Analytics: Work on our enterprise metrics and analytics tool, building the frontend and backend of our enterprise metrics platform.
- Service: Provide a technical platform for the Enterprise Software team to build services, which are the essential unit of development and deployment.
- Integrations and Data Infrastructure: Focus on building integrations with both cloud and on-premises applications, and support large-scale systems and tools that enable the generation of insights and data products on LinkedIn’s internal data via self-serve computing, reporting solutions, and interactive querying.
- Provide technical leadership, driving and performing best engineering practices to initiate, plan, and execute large-scale, cross-functional, and company-wide critical programs.
- Identify, leverage, and successfully evangelize opportunities to improve engineering productivity.

Basic Qualifications:
- BTech/MCA or equivalent in Computer Science or a related technical discipline.
- 10+ years of experience in software architecture, design, and development.
- 10+ years of experience writing clean and efficient code in Java, Python, and/or C#.

Preferred Qualifications:
- Expert knowledge of computer science, with strong competencies in data structures, algorithms, and software design.
- Expertise in designing and architecting solutions for distributed and scalable systems.
- Experience working with cloud technology such as Azure.
- Experience writing scalable and efficient Java for an enterprise product.
- Experience using message queues such as Kafka.
- Experience using NoSQL databases such as Cosmos DB.
- Experience working with big data technologies like Hadoop and Apache Spark.
- Experience with AI technologies and frameworks and a strong understanding of machine learning algorithms and data processing techniques.
- Skilled in developing AI agents that deconstruct prompts into structured workflows, adapt to real-time feedback, and iteratively refine outputs.
- Expert in designing autonomous, context-aware systems that simulate human judgment and adapt to dynamic environments using advanced ML and NLP.
- Experienced in extending AI tools with custom or pre-built skills to support tailored workflows and real-time collaboration, enabling intelligent assistance across tasks like automation, code generation, troubleshooting, and cross-functional communication.
- Knowledge of (and a passion for) current trends and best practices in full-stack architecture, including performance, security, and scalability.
- Familiarity and comfort with command-line applications, git source control and other aspects of developing in large, distributed software teams.
- A strong written and verbal communicator who is highly organized and able to think entrepreneurially.
Suggested Skills: - Full stack development - API Development - Software design - Java or C#, Python - Tools: LangChain, Semantic Kernel, OpenAI GPT, Azure OpenAI India Disability Policy LinkedIn is an equal employment opportunity employer offering opportunities to all job seekers, including individuals with disabilities. For more information on our equal opportunity policy, please visit https://legal.linkedin.com/content/dam/legal/Policy_India_EqualOppPWD_9-12-2023.pdf Global Data Privacy Notice for Job Candidates This document provides transparency around the way in which LinkedIn handles personal data of employees and job applicants: https://legal.linkedin.com/candidate-portal

Posted 4 days ago

Apply

4.0 - 9.0 years

14 - 18 Lacs

Noida

Work from Office

We are looking for smart, curious and highly motivated engineers from an Application Support or Test Automation background with solid experience in Python and the SDLC. The ideal candidate is keen to work on Gen AI projects, has enrolled in Gen AI coursework, and has done some independent reading and exploration. Candidates should have 4-10 years of practical work experience in areas such as integrating and managing APIs, async programming frameworks/libraries, state management, concurrency, containerization and telemetry.

Mandatory Competencies
- Data Science and Machine Learning - Gen AI
- Python
- UI - TypeScript
- Behavioural - Communication
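As a hypothetical illustration (not part of the posting): the async programming and concurrency skills above, shown as a minimal `asyncio` fan-out that calls several stand-in APIs concurrently and gathers the results. The API names and delays are invented; real code would await an HTTP client instead of `asyncio.sleep`.

```python
# Minimal asyncio fan-out: three concurrent "API calls" gathered in order.
import asyncio

async def fetch(name, delay):
    """Stand-in for an async API call; real code would use an HTTP client."""
    await asyncio.sleep(delay)
    return {"api": name, "ok": True}

async def main():
    # Fan out concurrently; total wall time ~ max(delay), not the sum.
    # gather() returns results in the order the awaitables were passed.
    return await asyncio.gather(
        fetch("users", 0.01), fetch("orders", 0.02), fetch("billing", 0.01)
    )

results = asyncio.run(main())
print([r["api"] for r in results])
```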

Posted 4 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Role
We are seeking a motivated and detail-oriented Mid-Level Data Engineer with 2–3 years of experience designing, developing, and optimizing data pipelines within the healthcare domain. The ideal candidate will have hands-on experience with Databricks, strong SQL skills, and a solid understanding of healthcare data standards (e.g., HL7, EDI X12 – 837/835, HCC, CPT/ICD codes).

Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Databricks, PySpark, and Delta Lake for large-scale healthcare datasets.
- Collaborate with data scientists, analysts, and product managers to understand data requirements and deliver clean, reliable data.
- Ingest, process, and transform healthcare-related data such as claims (837/835), EHR/EMR, provider/member, and clinical datasets.
- Implement data quality checks, validations, and transformations to ensure high data integrity and compliance with healthcare regulations.
- Optimize data pipeline performance, reliability, and cost in cloud environments (preferably Azure or AWS).
- Maintain documentation of data sources, data models, and transformations.
- Support analytics and reporting teams with curated datasets and data marts.
- Adhere to HIPAA and organizational standards for handling PHI and sensitive data.
- Assist in troubleshooting data issues and root cause analysis across systems.

Required Qualifications
- 2–3 years of experience in a data engineering role, preferably in the healthcare or healthtech sector.
- Hands-on experience with Databricks, Apache Spark (PySpark), and SQL.
- Familiarity with Delta Lake, data lakes, and modern data architectures.
- Solid understanding of healthcare data standards: EDI 837/835, CPT, ICD-10, DRG, or HCC.
- Experience with version control (e.g., Git), CI/CD workflows, and task orchestration tools (e.g., Airflow, Azure Data Factory, dbt).
- Ability to work with both structured and semi-structured data (JSON, Parquet, Avro, etc.).
- Strong communication skills and ability to collaborate in cross-functional teams.

Education
Bachelor’s degree in Business Administration, Healthcare Informatics, Information Systems, or a related field.
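As a hypothetical illustration (not from the posting): the "data quality checks" responsibility above, sketched as simple validation rules over claim-like records. The field names (`claim_id`, `cpt`, `billed`) and the tiny CPT subset are invented for the example; in a Databricks pipeline the same rules would typically run as PySpark column expressions or Delta Lake constraints.

```python
# Toy data-quality validator for claim-like records.
VALID_CPT = {"99213", "99214", "93000"}   # invented subset of CPT codes

def validate_claim(claim):
    """Return a list of data-quality violations for one claim record."""
    errors = []
    if not claim.get("claim_id"):
        errors.append("missing claim_id")
    if claim.get("cpt") not in VALID_CPT:
        errors.append(f"unknown CPT code: {claim.get('cpt')}")
    billed = claim.get("billed")
    if not isinstance(billed, (int, float)) or billed <= 0:
        errors.append("billed amount must be a positive number")
    return errors

good = {"claim_id": "C1", "cpt": "99213", "billed": 125.0}
bad = {"claim_id": "", "cpt": "00000", "billed": -5}
print(validate_claim(good), len(validate_claim(bad)))
```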

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description

YOUR IMPACT
Are you passionate about developing mission-critical, high-quality software solutions, using cutting-edge technology, in a dynamic environment?

OUR IMPACT
We are Compliance Engineering, a global team of more than 300 engineers and scientists who work on the most complex, mission-critical problems. We:
- build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm,
- have access to the latest technology and to massive amounts of structured and unstructured data,
- leverage modern frameworks to build responsive and intuitive UX/UI and Big Data applications.

Compliance Engineering is looking to fill several big data software engineering roles. Your first deliverable and success criteria will be the deployment, in 2025, of new complex data pipelines and surveillance models to detect inappropriate trading activity.

How You Will Fulfill Your Potential
As a member of our team, you will:
- partner globally with sponsors, users and engineering colleagues across multiple divisions to create end-to-end solutions,
- learn from experts,
- leverage various technologies including Java, Spark, Hadoop, Flink, MapReduce, HBase, JSON, Protobuf, Presto, Elastic Search, Kafka and Kubernetes,
- be able to innovate and incubate new ideas,
- have an opportunity to work on a broad range of problems, including negotiating data contracts, capturing data quality metrics, processing large-scale data, and building surveillance detection models,
- be involved in the full life cycle: defining, designing, implementing, testing, deploying, and maintaining software systems across our products.

Qualifications
A successful candidate will possess the following attributes:
- A Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study.
- Expertise in Java, as well as proficiency with databases and data manipulation.
- Experience in end-to-end solutions, automated testing and SDLC concepts.
- The ability (and tenacity) to clearly express ideas and arguments in meetings and on paper.

Experience in some of the following is desired and can set you apart from other candidates:
- developing in large-scale systems, such as MapReduce on Hadoop/HBase,
- data analysis using tools such as SQL, Spark SQL, Zeppelin/Jupyter,
- API design, such as to create interconnected services,
- knowledge of the financial industry and compliance or risk functions,
- the ability to influence stakeholders.

About Goldman Sachs
Goldman Sachs is a leading global investment banking, securities and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world.
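As a hypothetical illustration (not Goldman Sachs' actual method): a "surveillance model to detect inappropriate trading activity" can start life as a simple statistical screen, flagging traders whose daily volume deviates far from the peer median. The sketch below uses a median-absolute-deviation (MAD) robust z-score, which resists the very outliers it is hunting; all names and numbers are invented.

```python
# Robust outlier screen over per-trader volumes (MAD-based z-score).
from statistics import median

def flag_outliers(volumes, threshold=3.0):
    """Flag ids whose volume deviates from the peer median by more than
    `threshold` robust z-scores (1.4826 * MAD approximates one std dev)."""
    vals = list(volumes.values())
    med = median(vals)
    mad = median(abs(v - med) for v in vals)
    if mad == 0:
        return []                        # no spread: nothing to flag
    return sorted(t for t, v in volumes.items()
                  if abs(v - med) / (1.4826 * mad) > threshold)

volumes = {"t1": 100, "t2": 110, "t3": 95, "t4": 105, "t5": 5000}
print(flag_outliers(volumes))
```

A mean/std-dev z-score would fail here: the single extreme value inflates the standard deviation so much that its own z-score stays near 2, which is why the robust variant is sketched instead.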

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

What this job involves: Being the go-to MEP person Are you considered to be the go-to person for all MEP matters? That’s what you’ll be in this role. You’ll manage all activities related to mechanical, engineering and plumbing in terms of planning, designing, procurement, construction, testing and commissioning, and final handover. Your task is to fully understand, collect and deliver clients’ MEP requirements. The design manager in MEP will depend on you to help schedule or plan establishment, value engineer, and design change management. You’ll also assist the contract manager in MEP-related procurement and VO management. On top of that, you’ll support the construction manager in MEP-related installation, site inspection and contractor management. Making visions come true You’ll develop big ideas that will spark the effective management and successful execution of all phases of a project—from initiating, designing, planning, controlling, executing, monitoring, and closing. You’ll need to carefully identify and take note of our clients’ needs, and figure out what exactly needs to be done. This involves defining the scope of the work and expected outcome, while also detailing all the necessary objectives to get there. While you do all of these, you’ll need to keep tabs on company resources used in the projects, and to allocate these resources to complete the project within the budget. You’ll also need to help clients organize and analyze all tender and procurement for all contractors and suppliers; and represent them from the beginning to the end of a project. Building strong teams and business reputation One of your priorities will be to produce high-performing teams that drive successful project execution. You’ll also represent and promote the company throughout projects and in pursuit of more project opportunities. Keeping risks at bay How do you deal with risks? You’ll need to identify any potential risks in the MEP field and report them to the Project Manager. 
It will be critical to design risk management and solution provisions, particularly to identify health & safety issues. You will understand why this is your responsibility. Sound like you? To apply you need to be: An MEP pro You have a degree in an MEP engineering-related discipline or related field, and five years of combined educational and work experience. You also need to have sufficient experience in construction site management, as well as a strong understanding of all aspects of development management including financial appraisal, risk management, negotiation, etc. Do you have a strong background in all aspects of MEP-related management, including the development of MEP project plans and procedures and construction schedules? Are you familiar with HVAC, electrical engineering, and BMS? Do you have knowledge of security systems, AV systems, and IT systems? If your answers are yes, let’s talk. A business savvy leader who can walk the talk You understand the business well enough, particularly in terms of the systems and tools to use, the best practices and the safety requirements. You’re also knowledgeable about key industries and the local market, with the real estate and construction business above all. You also have a basic understanding of the key drivers that push the projects forward, while also considering the client’s business requirements. You’ll back up your business know-how with the necessary communication skills, as you need to regularly give business development presentations to potential clients in both English and Chinese. You’ll also manage site activities, negotiate with contractors, review the legal aspects of contracts, contribute to market analysis, and manage change orders. A flexible leader with superb interpersonal skills Are you a people person with superb interpersonal skills?
You’ll need to create a proactive working environment that not only motivates your employees, but also encourages them to maintain good relationship with clients, communicate effectively with each other, and contribute enthusiastically to the project. You also need to be a results-oriented leader with good problem-solving skills, as well as someone who can nurture positive relationships with all stakeholders involved, including your team members and clients. What we can do for you: At JLL, we make sure that you become the best version of yourself by helping you realise your full potential in an entrepreneurial and inclusive work environment. We will empower your ambitions through our dedicated Total Rewards Program, competitive pay and benefits package.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Role
As a Sr. Data Engineer on the Sales Automation Engineering team, you will work across several areas of Data Engineering & Data Architecture, including:
- Data Migration - from Hive/other DBs to Salesforce/other DBs and vice versa
- Data Modeling - understand existing sources and data models, identify the gaps, and build the future-state architecture
- Data Pipelines - build data pipelines for several Data Mart/Data Warehouse and reporting requirements
- Data Governance - build the framework for data governance and data quality profiling & reporting

What the Candidate Will Do
- Demonstrate strong knowledge of, and the ability to operationalize, leading data technologies and best practices.
- Collaborate with internal business units and data teams on business requirements, data access, processing/transformation and reporting needs, and leverage existing and new tools to provide solutions.
- Build dimensional data models to support business requirements and reporting needs.
- Design, build and automate the deployment of data pipelines and applications to support reporting and data requirements.
- Research and recommend technologies and processes to support rapid scale and future-state growth initiatives on the data front.
- Prioritize business needs, leadership questions, and ad-hoc requests for on-time delivery.
- Collaborate on architecture and technical design discussions to identify and evaluate high-impact process initiatives.
- Work with the team to implement data governance and access control, and identify and reduce security risks.
- Perform and participate in code reviews, peer inspections and technical design/specifications.
- Develop performance metrics to establish process success, and work cross-functionally to consistently and accurately measure success over time.
- Deliver measurable business process improvements while re-engineering key processes and capabilities, mapping to the future-state vision.
- Prepare documentation and specifications on detailed design.
- Work in a globally distributed team in an Agile/Scrum approach.

Basic Qualifications
- Bachelor's degree in computer science or a similar technical field of study, or equivalent practical experience.
- 8+ years of professional software development experience, including experience in the Data Engineering & Architecture space.
- Ability to interact with product managers and business stakeholders to understand data needs, and help build data infrastructure that scales across the company.
- Very strong SQL skills - advanced SQL coding (window functions, CTEs, dynamic variables, hierarchical queries, materialized views, etc.).
- Experience with data-driven architecture and systems design; knowledge of Hadoop-related technologies such as HDFS, Apache Spark, Apache Flink, Hive, and Presto.
- Good hands-on experience with object-oriented programming languages like Python.
- Proven experience in large-scale distributed storage and database systems (SQL or NoSQL, e.g. Hive, MySQL, Cassandra) and data warehousing architecture and data modeling.
- Working experience with cloud technologies like GCP, AWS, Azure.
- Knowledge of reporting tools like Tableau and/or other BI tools.

Preferred Qualifications
- Python libraries (Apache Spark, Scala)
- Working experience with cloud technologies like GCP, AWS, Azure
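As a hypothetical illustration (not from the posting): the "advanced SQL" skills above include window functions such as `SUM(amount) OVER (PARTITION BY region ORDER BY day)`. The sketch below mirrors that query in plain Python so the windowing semantics (partition, order, cumulative frame) are explicit and runnable anywhere; the column names are invented.

```python
# Plain-Python mirror of a per-partition running total window function.
from itertools import groupby
from operator import itemgetter

def running_total(rows):
    """Cumulative sum per region, ordered by day - like
    SUM(amount) OVER (PARTITION BY region ORDER BY day)."""
    out = []
    rows = sorted(rows, key=itemgetter("region", "day"))
    for region, group in groupby(rows, key=itemgetter("region")):
        total = 0
        for r in group:
            total += r["amount"]
            out.append({**r, "running": total})
    return out

rows = [
    {"region": "east", "day": 1, "amount": 10},
    {"region": "east", "day": 2, "amount": 5},
    {"region": "west", "day": 1, "amount": 7},
]
print(running_total(rows))
```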

Posted 4 days ago

Apply

14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the Company: We are a leading technology firm dedicated to providing innovative data solutions that empower businesses to harness the full potential of their data. Our mission is to drive success through cutting-edge technology and exceptional service, fostering a culture of collaboration, integrity, and continuous improvement.

About the Role: We are hiring a Director of Databricks Engineering to lead multiple client engagements, drive project delivery, and manage high-performing engineering teams.

Experience:
- 14+ years in data engineering, including 3+ years in leadership/director-level roles
- Proven experience with Databricks, Delta Lake, and cloud data architecture
- Strong track record of project delivery, team management, and client success
- Excellent communication and leadership skills in fast-paced environments

Responsibilities:
- Oversee and deliver multiple Databricks-based data engineering projects
- Manage project budgets, costing, staffing, and client expectations
- Lead and mentor engineering teams across engagements
- Collaborate with clients on architecture, strategy, governance, and reporting
- Ensure high-quality delivery aligned with best practices and business value

Qualifications:
- Databricks - full-platform expertise for scalable data solutions: strong hands-on experience with Databricks for building and managing ETL pipelines, Delta Lake, notebooks, and job orchestration; skilled in cluster optimization, workspace management, and integrating Databricks with Azure services.
- Cloud - Azure (preferred) or similar cloud environments: deep hands-on experience with Azure data services such as Azure Data Lake, Azure Synapse, Azure Data Factory, and their integration with Databricks; ability to design and deploy cloud-native data architectures.
- Data Engineering - Spark, PySpark, and Python for scalable data processing: strong background in building scalable, high-performance ETL/ELT pipelines using Spark and PySpark; ability to write optimized, production-grade Python code for data transformation, orchestration, and automation in distributed environments.
- Data Warehousing & SQL - designing and querying enterprise data models: proven experience in designing data warehouses or lakehouses, dimensional modeling, and writing complex SQL queries for analytics and reporting.
- Governance - implementation and management of Unity Catalog: hands-on experience implementing Unity Catalog for managing metadata, access control, and data lineage in Databricks.
- Reporting Tools - Power BI or similar (Tableau, Looker, etc.): ability to work with business teams to build insightful dashboards and visualizations using Power BI.

Required Skills:
- Strong hands-on experience with Databricks
- Deep hands-on experience with Azure data services
- Strong background in building scalable ETL/ELT pipelines
- Proven experience in designing data warehouses or lakehouses
- Hands-on experience implementing Unity Catalog
- Ability to work with business teams to build dashboards

Preferred Skills:
- Experience with additional cloud environments
- Familiarity with other reporting tools

Pay range and compensation package: Competitive salary based on experience and qualifications.

Equal Opportunity Statement: We are committed to creating a diverse and inclusive workplace. We encourage applications from all qualified individuals, regardless of race, gender, age, sexual orientation, disability, or any other characteristic protected by law.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Data Engineer - Investment
Experience: 6-10 years
Location: Hyderabad
Primary Skills: ETL, Informatica, SQL, Python and the Investment domain
Please share your resume with jyothsna.g@technogenindia.com.

Job Description:
- 7-9 years of experience with data analytics, data modeling, and database design.
- 3+ years of coding and scripting (Python, Java, Scala) and design experience.
- 3+ years of experience with the Spark framework.
- 5+ years of experience with ELT methodologies and tools.
- 5+ years' mastery in designing, developing, tuning and troubleshooting SQL.
- Knowledge of Informatica PowerCenter and Informatica IDMC.
- Knowledge of distributed, column-oriented technology used to create high-performance databases such as Vertica and Snowflake.
- Strong data analysis skills for extracting insights from financial data.
- Proficiency in reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications
Technical Skills:
- Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing.
- Familiarity with regulatory requirements and compliance standards in the investment management industry.
- Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
- Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Exceptional communication and interpersonal skills.
- Ability to influence and motivate teams without direct authority.
- Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: AI/ML Engineer
Mandatory Skills: Python, ML libraries (PyTorch, TensorFlow), Gen AI, Kubernetes, NLP
Good-to-have skill: MLOps
Location: Hyderabad only
Work Type: Work from Office (5 days a week)
Experience: 4 to 8 Yrs

Skills Required:
Strong programming skills in Python, Java, Spring Boot, or Scala.
Experience with ML frameworks such as TensorFlow, PyTorch, XGBoost, or LightGBM.
Familiarity with information retrieval techniques (BM25, vector search, learning-to-rank).
Knowledge of embedding models, user/item vectorization, or session-based personalization.
Experience with large-scale distributed systems (e.g., Spark, Kafka, Kubernetes).
Hands-on experience with real-time ML systems.
Background in NLP, graph neural networks, or sequence modeling.
Experience with A/B testing frameworks and metrics such as NDCG, MAP, or CTR.
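The vector search and embedding skills listed above amount to ranking candidate items by similarity to a query vector. A toy sketch with cosine similarity - the vectors and names are hypothetical, and production systems use high-dimensional embeddings with approximate-nearest-neighbor indexes rather than a linear scan:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def top_k(query, items, k=2):
    """Rank candidate embeddings by similarity to the query vector."""
    ranked = sorted(items.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [name for name, _ in ranked[:k]]

# Toy 2-d "embeddings"; real ones come from a trained model.
catalog = {"a": [1.0, 0.0], "b": [0.9, 0.1], "c": [0.0, 1.0]}
print(top_k([1.0, 0.05], catalog))  # ['a', 'b']
```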

Posted 4 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Data Engineer (Indore)
🕒 Experience: 5+ years
💼 Type: Full-time

Are you passionate about building scalable data pipelines and working with real-time streaming platforms? Join our growing team as a Data Engineer and help power next-gen data solutions!

🔧 Key Responsibilities:
Design and maintain real-time data pipelines using Apache Kafka
Write efficient and optimized SQL queries for data extraction and transformation
Build robust ETL/ELT processes for structured & unstructured data
Collaborate with analysts, data scientists & devs to deliver insights
Ensure data quality, security & performance optimization
Integrate with tools like Spark, Airflow, or Snowflake (as applicable)

🧠 Skills We Value:
Proficient in Apache Kafka, Kafka Streams or Kafka Connect
Strong in SQL, Python/Scala, and cloud platforms (AWS/GCP/Azure)
Experience with data lakes, message queues, and large-scale systems
A problem-solving mindset and passion for clean, efficient code

✨ Why Work With Us?
Exciting projects with global clients
Collaborative, innovation-driven environment
Flexible working options
Competitive compensation

📩 Apply now at yukta.sengar@in.spundan.com or tag someone perfect for this role!
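The real-time pipeline work described above usually includes a validate-and-normalize transform applied to each incoming message. A minimal sketch of that step - field names are hypothetical, and it is shown on plain strings rather than a live Kafka consumer:

```python
import json

def transform(raw_events):
    """Validate and normalize raw JSON event strings, dropping malformed ones.

    In a real Kafka consumer these strings would be the message values;
    malformed payloads would typically go to a dead-letter topic instead
    of being silently skipped.
    """
    out = []
    for raw in raw_events:
        try:
            event = json.loads(raw)
        except json.JSONDecodeError:
            continue  # unparseable payload
        if "user_id" not in event or "amount" not in event:
            continue  # missing required fields
        out.append({"user_id": str(event["user_id"]), "amount": float(event["amount"])})
    return out

events = ['{"user_id": 7, "amount": "12.5"}', 'not json', '{"user_id": 8}']
print(transform(events))  # [{'user_id': '7', 'amount': 12.5}]
```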

Posted 4 days ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

About the Role
We are looking for a seasoned Engineering Manager to lead the development of our internal Risk, Fraud and Operations Platform. This platform plays a critical role in ensuring smooth business operations, detecting anomalies, managing fraud workflows, and supporting internal teams with real-time visibility and control. As an Engineering Manager, you'll be responsible for leading a cross-functional team of backend engineers working on high-throughput systems, real-time data pipelines, and internal tools that power operational intelligence and risk management. You will own delivery, architecture decisions, team growth, and collaboration with stakeholders.

Key Responsibilities
Lead and grow a team of software engineers building internal risk and ops platforms.
Oversee the design and development of scalable microservices and real-time data pipelines.
Collaborate with stakeholders from Risk, Ops, and Product to define technical roadmaps and translate them into delivery plans.
Ensure high system reliability, data accuracy, and low-latency access to risk signals and ops dashboards.
Drive architectural decisions, code quality, testing, and deployment best practices.
Contribute to hands-on design, reviews, and occasional coding when required.
Optimize performance and cost-efficiency of services deployed on AWS.
Mentor team members and foster a culture of ownership, innovation, and continuous learning.

Tech Stack You'll Work With
Languages: Node.js, Python, Java
Data & Messaging: Kafka, OpenSearch, MongoDB, MySQL, Apache Spark, Apache Flink, Apache Druid
Architecture: Microservices, REST APIs
Infrastructure: AWS (EC2, ECS/EKS, Lambda, RDS, CI/CD, etc.)

Requirements
8+ years of software engineering experience with backend and distributed systems.
2+ years of people management or tech leadership experience.
Strong experience with Node.js and Python; familiarity with Java is a plus.
Hands-on experience with event-driven architecture using Kafka or similar.
Exposure to OpenSearch, MongoDB, and relational databases like MySQL.
Exposure to Spark, Flink, and data pipeline ETL.
Deep understanding of cloud-native architecture and services on AWS.
Proven ability to manage timelines, deliver features, and drive cross-functional execution.
Strong communication and stakeholder management skills.

Preferred Qualifications
Prior experience in risk, fraud detection, operations tooling, or internal platforms.
Experience with observability, alerting, and anomaly detection systems.
Comfortable working in fast-paced environments with rapidly evolving requirements.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

OLIVER+ is a global team of creative thinkers, tech-savvy trendsetters, and production pros specialising in film, CGI, automation, AI, motion design, and digital/print content. We partner with over 300 clients in 40+ countries and counting. Our focus is to connect clients with high-quality solutions, talent and ambitious opportunities worldwide. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results.

OUTLINE OF ROLE
Manage the Purchase Order processes for UK finance teams. Undertake intercompany, management accounts, and ad hoc tasks as assigned by the UK central Finance Team.

Description Of Role

Administration of the PO Process (both External and Intercompany):
Process any Purchase Order (PO) requests within the accounting system(s);
Manage the process from request (via email) to issue of the PO to the requester;
Maintain the register of active POs and ensure accurate reporting of POs held in the system;
Ensure that controls around the PO system are maintained, including ensuring that the correct information is provided and the correct authorisation has been given or requested;
Respond to all requests via email (specific account to be set up) within 48 hours;
Provide weekly/monthly reporting on POs;
Manage the process to request POs for UK outgoing intercompany recharges, ensuring that once received the requests are provided to the AR team to raise invoices before the intercompany cut-off.

Intercompany
Roll forward all intercompany files and make changes as required;
Confirm intercompany balances (Balance Sheet and Overhead) for UK entities with other parties both within IIG/Oliver and the Brandtech Group.

Recharges
Assist the finance team in requesting POs for the intercompany finance recharges and ensure these are received in adequate time for the AR team to raise invoices on a monthly/quarterly basis.

Bank Reconciliation
Reconciliation of UK bank accounts within the accounting system, working with the AP and AR finance teams to ensure correct postings.

Ad Hoc
Provide support to members of the UK finance teams as required, including analysis, report running, projects and audit.

Required Skills/Experience
Qualified Accountant (ACA/ACCA/CIMA)/Part Qualified/Finalist - at least 3 years' accounting experience.
Highly computer literate - advanced Excel skills; comfortable dealing with complex formulae.
Adaptable to change in a fast-paced environment.
Ability to work well under pressure and meet deadlines.
Excellent problem-solving skills.
Good communication and interpersonal skills.
High attention to detail.

Our Values Shape Everything We Do
Be Ambitious to succeed
Be Imaginative to push the boundaries of what's possible
Be Inspirational to do groundbreaking work
Be always learning and listening to understand
Be Results-focused to exceed expectations
Be actively pro-inclusive and anti-racist across our community, clients and creations

OLIVER+, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws.
OLIVER+ has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary
Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users.
Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships.
Participates in optimizing, testing, and troubleshooting of data pipelines.
Designs, develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB and others).
Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
Assists with renovating the data management infrastructure to drive automation in data integration and management.
Ensures the timeliness and success of critical analytics initiatives by using agile development practices such as DevOps, Scrum and Kanban.
Coaches and develops less experienced team members.

Responsibilities
Competencies:
System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
Customer focus - Building strong customer relationships and delivering customer-centric solutions.
Decision quality - Making good and timely decisions that keep the organization moving forward.
Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
Problem Solving - Solves problems, and may mentor others on effective problem solving, by using a systematic analysis process leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
Background in processing and managing large data sets
Design and development for a Big Data platform using open source and third-party tools
Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
SQL query language
Clustered compute cloud-based implementation experience
Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
Experience in building analytical solutions

Intermediate experience in the following is preferred:
Experience with IoT technology
Experience in Agile software development

Qualifications
Strong programming skills in SQL, Python and PySpark for data processing and automation.
Experience with Databricks and Snowflake (preferred) for building and maintaining data pipelines.
Understanding of machine learning and AI techniques, especially for data quality and anomaly detection.
Experience with cloud platforms such as Azure and AWS, and familiarity with Azure Web Apps.
Knowledge of Data Quality and Data Governance concepts (preferred).
Nice to have: Power BI dashboard development experience.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417177
Relocation Package: No
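The data-quality monitoring responsibility described in this posting often starts with simple rule-based checks (missing fields, out-of-range values) before any ML-based anomaly detection is layered on. A minimal sketch, with hypothetical field names and thresholds:

```python
def quality_report(rows, required, bounds):
    """Count rows with missing required fields or out-of-range values.

    A rule-based sketch of a pipeline data-quality gate; real pipelines
    would emit these counts to a monitoring system and alert on thresholds.
    """
    report = {"missing": 0, "out_of_range": 0, "ok": 0}
    lo, hi = bounds
    for row in rows:
        if any(row.get(field) is None for field in required):
            report["missing"] += 1
            continue
        if not (lo <= row["value"] <= hi):
            report["out_of_range"] += 1
            continue
        report["ok"] += 1
    return report

rows = [{"id": 1, "value": 10}, {"id": 2, "value": None}, {"id": 3, "value": 999}]
print(quality_report(rows, required=("id", "value"), bounds=(0, 100)))
# {'missing': 1, 'out_of_range': 1, 'ok': 1}
```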

Posted 4 days ago

Apply

30.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Publicis Groupe is the third largest communications group in the world, a leader in marketing, communication and digital business transformation. Publicis Groupe offers its clients seamless access to the expertise of its 80,000 talents across four Solution hubs: creative with Publicis Communications (Publicis Worldwide, Saatchi & Saatchi, Leo Burnett, BBH, Marcel, Fallon, MSL, Prodigious), media services with Publicis Media (Starcom, Zenith, Spark Foundry, Blue 449, Performics, Digitas), digital business transformation with Publicis.Sapient, and health & wellness communications with Publicis Health. Publicis Groupe's agencies are present in over 100 countries around the world.

The PGD team in India boasts over 600+ specialists across Mumbai, Gurgaon, Pune and Bengaluru, certified across all major platforms - Amazon, Facebook Blueprint, Google Adwords, Google Shopping, SA 360, Data Studio and more. With solid expertise in Search, Programmatic, Data Engineering, Data Sciences, Ecommerce, Consulting and Development, the team supports a host of Publicis operations across the US, Europe and Asia and their multinational clients.

Founded in 1926 by Marcel Bleustein-Blanchet, the father of French advertising, Publicis Groupe is today led by the third CEO in its history, Arthur Sadoun, Chairman & CEO. Maurice Lévy, who led the company for 30 years until June 2017, is today the Chairman of the Groupe's Supervisory Board.

Job Locations: Gurgaon, Bangalore, Mumbai, Hyderabad & Pune

Sounds like you? Our Multichannel Platform is looking for a TV Buyer/Optimizer who is responsible for KPI-based advertising spot placement on German TV channels.

Responsibilities:
Operational buying of TV advertising time / responsibility for optimal advertising spot placement
Translate Excel buying approvals from the media planner into our buying tool MediaWizard (MW)
Provide the system with all given parameters such as budget, GRP, CpGRP, time-zone share, channel mix, net reach or other relevant KPIs
Verify that all given KPIs are reachable with a simulation by the MW system
Book selected ad breaks via automatic buying at the vendors and transmit all bookings to our data-management tool DAP
Booking management in DAP: create and manage orders, synchronize bookings with the vendors, apply and maintain discounts

If you can communicate well and work methodically as part of the team, we'd like to meet you. The goal is to inspire and attract the target audience.

Requirements
Any graduate/B.E/B.Tech/MBA
Must be able to understand and interpret various data (absolute numbers, indices, percentages)
Proactive and communicative
Accuracy
Excel skills
Analytical understanding
English- or German-speaking/-writing
Willingness to translate and understand German texts independently and proactively
Understanding of the German/European way of communication and culture (having stayed in Europe or worked for a European company for a couple of months is an A++)

Interested candidates can share their profiles at neha.bist@publicisgroupe.com

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies