
15886 Spark Jobs - Page 50

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

15.0 years

0 Lacs

India

Remote

Description: Director of Digital Demand, India
Location: Remote (Mumbai)

EGNYTE YOUR CAREER. SPARK YOUR PASSION.

Role
Egnyte is a place where we spark opportunities for amazing people. We believe that every role has meaning, and every Egnyter should be respected. With 22,000+ customers worldwide and growing, you can make an impact by protecting their valuable data. When joining Egnyte, you’re not just landing a new career — you’re becoming part of a team of doers, thinkers, and collaborators who live by our core values: Invested Relationships, Fiscal Prudence, Candid Conversations.

About Egnyte
Egnyte is a secure multi-cloud platform for content security and governance that enables organizations to better protect and collaborate on their most valuable content. Established in 2008, Egnyte has democratized cloud content security for more than 22,000 organizations. We help customers improve data security, maintain compliance, prevent and detect ransomware threats, and boost productivity — on any app, any cloud, anywhere. Visit www.egnyte.com for more.

About The Role
We are seeking a future-focused, hands-on Director of Digital Demand based in India to lead a high-performing team of digital specialists across SEM, SEO, Paid Social, Website Acquisition, and Trial Engagement Marketing. This leader will drive full-funnel digital performance while evaluating and integrating AI-powered tools and emerging technologies to scale growth, improve personalization, and drive measurable impact. You’ll collaborate cross-functionally across Web, Product Marketing, Marketing Ops, and Analytics teams to ensure Egnyte’s digital journey is optimized for conversion, relevance, and innovation.

What You’ll Do

📈 Digital Performance Strategy & Innovation
- Build and execute a scalable global digital acquisition strategy to exceed MQL, trial, and pipeline targets.
- Champion AI and automation to improve content creation, media targeting, personalization, and campaign optimization.
- Continuously assess and adopt emerging technologies (e.g., AI-powered SEO tools, predictive analytics, generative content) to drive competitive advantage.
- Lead experimentation frameworks across the full funnel — from visit to conversion to pipeline.

🚀 Channel Leadership & Optimization
- Manage and mentor channel experts across Paid Search, SEO, Paid Social, and Web Engagement.
- Oversee multi-platform execution (Google Ads, Performance Max, LinkedIn, Meta, Programmatic, etc.) to drive trial and conversion outcomes.
- Partner with creative and content teams to leverage AI for scalable, brand-aligned assets and messaging.
- Lead an ethical first-party data strategy to enable privacy-first personalization across channels.

📅 Planning & Budget Ownership
- Lead quarterly and semi-annual planning for all digital demand channels.
- Deliver performance-driven plans to leadership, secure budgets, track utilization, and optimize investments to maximize pipeline impact.
- Use predictive analytics and real-time insights to dynamically reallocate spend for maximum ROI.

📊 Analytics & Performance Insight
- Define KPIs and success metrics for digital programs; work with Marketing Ops and Analytics to deliver insightful dashboards and reporting.
- Drive cohort, funnel, and attribution analysis to optimize performance and trial-to-paid conversion.
- Translate data into actionable insights to improve spend efficiency and pipeline contribution.

🤝 Cross-Functional Leadership
- Collaborate closely with Product Marketing, Web, Creative, and BDR teams to ensure full-funnel alignment.
- Act as the senior-most digital marketing leader in India and a key contributor to global demand strategy.
- Partner with RevOps, Martech, and Data teams to shape a forward-looking marketing tech stack.

Your Qualifications

✅ Must-Have Experience
- 15+ years in B2B SaaS digital marketing, including 3+ years in a Director or senior leadership role.
- Proven success driving pipeline through SEM, SEO, Paid Social, and digital conversion programs.
- Deep understanding of full-funnel KPIs, trial-based acquisition models, and conversion rate optimization.
- Expertise in GA4, Performance Max, attribution models, lead scoring, and ROI tracking.
- Experience building and scaling AI-enhanced campaigns or integrating AI tools into workflows.

💡 Preferred Qualifications
- Prior leadership experience with globally distributed or hybrid marketing teams.
- Familiarity with modern Martech (Marketo, Salesforce, HubSpot, Dynamic Yield, etc.).
- Demonstrated ability to translate AI and data innovations into real business outcomes.
- Strong grasp of ethical data practices and compliance (GDPR, CCPA, etc.).
- Strategic thinker with the agility to experiment, learn fast, and scale what works.

Why Join Us?
- Join a $250M+ high-growth SaaS company solving real-world data security and collaboration challenges.
- Lead and innovate within a global digital demand engine at scale.
- Work alongside a results-driven, collaborative team that embraces innovation, experimentation, and customer impact.

Posted 4 days ago

Apply

2.0 - 4.0 years

0 Lacs

Greater Hyderabad Area

On-site

Required Skills
- Expertise in AWS services like EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc.
- Expertise in Hadoop/EMR/Databricks with good debugging skills to resolve Hive- and Spark-related issues.
- Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc.
- Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
- Experience in programming languages such as Python/PySpark.
- Excellent written and verbal communication skills.

Key Responsibilities
- Work closely with the data lake engineers to provide technical guidance, consultation, and resolution of their queries.
- Assist in the development of simple and advanced analytics best practices, processes, technology and solution patterns, and automation (including CI/CD).
- Work closely with various stakeholders in the US team with a collaborative approach.
- Develop data pipelines in Python/PySpark to be executed in the AWS cloud.
- Set up analytics infrastructure in AWS using CloudFormation templates.
- Develop mini/micro-batch and streaming ingestion patterns using Kinesis/Kafka.
- Seamlessly upgrade applications to higher versions, e.g., Spark/EMR upgrades.
- Participate in code reviews of the developed modules and applications.
- Provide input for the formulation of best practices for ETL processes/jobs written in programming languages such as PySpark, and for BI processes.
- Work with column-oriented data storage formats such as Parquet, interactive query services such as Athena, and the event-driven compute service Lambda.
- Perform R&D on the latest big data tools in the market, perform comparative analysis, and provide recommendations to choose the best tool as per the current and future needs of the enterprise.

Required Qualifications
- Bachelor's or Master's degree in Computer Science or a similar field
- 2-4 years of strong experience in big data development
- Expertise in AWS services like EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc.
- Expertise in Hadoop/EMR/Databricks with good debugging skills to resolve Hive- and Spark-related issues.
- Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc.
- Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
- Experience in programming languages such as Python/PySpark.
- Excellent written and verbal communication skills.

Preferred Qualifications
- Cloud certification (AWS, Azure, or GCP)

About Our Company
Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm’s focus areas include Asset Management and Advice, Retirement Planning, and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions, and work with other talented individuals who share your passion for doing great work. You’ll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven, and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP.

Ameriprise India LLP is an equal opportunity employer. We consider all qualified applicants without regard to race, color, religion, sex, genetic information, age, sexual orientation, gender identity, disability, veteran status, marital status, family status, or any other basis prohibited by law.

Full-Time/Part-Time: Full time
Timings: 2:00 pm - 10:30 pm
India Business Unit: AWMPO AWMP&S President's Office
Job Family Group: Technology
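The mini/micro-batch ingestion pattern mentioned in the responsibilities can be illustrated with a small sketch. This is plain Python with hypothetical event records, not code from the posting; a real pipeline would consume `events` from Kinesis/Kafka and hand each batch to Spark or write it to S3 as Parquet:

```python
from itertools import islice
from typing import Dict, Iterable, Iterator, List

def micro_batches(events: Iterable[Dict], batch_size: int) -> Iterator[List[Dict]]:
    """Group an event stream into fixed-size micro-batches.

    A consumer would feed `events` from Kinesis/Kafka; each yielded
    batch is then processed downstream (e.g., by a Spark job).
    """
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:  # stream exhausted
            return
        yield batch

# Seven hypothetical events grouped into batches of three -> sizes 3, 3, 1
stream = ({"event_id": i} for i in range(7))
print([len(b) for b in micro_batches(stream, 3)])  # [3, 3, 1]
```

A production version would also flush partial batches on a time trigger, not just on size, so slow streams still land regularly.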

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Vadodara MPS manufactures FHT and LPT equipment with a capacity of ~300k production hours. The SC Planning team supports production by providing a monthly plan and milestone schedule. This scheduling is not sufficient, and we are not able to track the complete production process. The production team needs support for activity-level scheduling of equipment manufacturing, which will help them measure real-time progress and identify process bottlenecks. This job will be done by two Production Schedulers: one for the FHT business and one for the LPT business.

Responsibilities / Tasks
- Responsible for production planning and materials management, with the aim of meeting customer delivery schedules with optimum lead time, inventory, and utilization of workshop capacity.
- Release milestone schedules for key equipment fabrication according to manufacturing sequence and lead time.
- Advance planning of long-lead items (raw materials/plates, FIM/essential parts, bought-outs, etc.) according to the customer delivery schedule.
- Study the build package/drawing set, define the procurement strategy for all materials, and accordingly define the material master in SAP.
- Study the build package/drawing set and create a multi-level manufacturing bill of materials (BOM) in SAP according to the procurement strategy and manufacturing sequence.
- Create projects and WBS structures and generate demands in SAP.
- Run material requirements planning (MRP) and generate purchase requisitions and planned orders.
- Release production orders for in-house manufactured items.
- Allocate materials from free stock to WBS as per project demands.
- Enter plate-cutting parts and allocate material into the respective production orders as per nesting layouts.
- Create additional purchase requisitions for MOQ items as per procurement requests.
- Plan and procure sub-contracting as per the delivery schedule.
- Monitor and align availability of inputs (drawings and materials) as per the workshop loading plan.
- Provide material requirement dates and project shortage lists to procurement.
- Regularly review projects against the plan, monitor progress, and define proactive actions for deviations wherever required.
- Participate in daily workshop GEMBA meetings to discuss progress and issues arising during execution.
- Organize and manage review meetings with internal stakeholders and group customers, and manage correspondence.
- Manage revisions of the build package, update the schedule, BOM, and production orders accordingly, and communicate changes to all stakeholders in a timely manner.
- Plan packing and dispatch and prepare the related documents.
- Contribute to various organization initiatives related to Lean, SOC, ISO, Sustainability, Global SAP, New Product Development, Lead Time Reduction, etc.
- Coordinate with cross-functional teams for smooth execution of assigned projects.

Your Profile / Qualifications
- Degree or Diploma in Mechanical/Fabrication/Production Engineering with 5 to 10 years of experience, preferably in production planning and scheduling in fabrication industries.
- Broad knowledge and understanding of production planning and materials management in a project-driven, make-to-order manufacturing environment.
- Working knowledge of the project planning software MS Project and of the SAP PP, PS, and MM modules.
- Familiarity with operational excellence tools like Lean, 5S, Gemba, Kaizen, and ISO 9001, 14001, and 45001.
- Ability to manage assigned projects/tasks independently.
- Positive mindset, quick learner, team player, and customer-centric approach.
- Strong analytical and problem-solving skills.
- Strong communication skills in English.

Did we spark your interest? Then please click apply above to access our guided application process.

Posted 4 days ago

Apply

0 years

1 - 1 Lacs

Calicut

On-site

Job Title: Content Creator
Location: Kozhikode, Hilite Business Park
Job Type: Full Time
Experience Level: Minimum 6 months' experience

Job Summary:
We are looking for a creative and detail-oriented Content Creator to produce engaging and high-quality content for our brand. The ideal candidate should be skilled in content writing, social media management, and video presentation.

Key Responsibilities:
- Develop, write, and edit content for blogs, social media, the website, newsletters, and marketing materials.
- Create visually appealing and brand-aligned posts (images, videos, reels, infographics).
- Manage and grow our social media platforms by posting regularly, engaging with the audience, and monitoring performance.
- Collaborate with the design and marketing teams to brainstorm campaign ideas and storytelling strategies.
- Stay updated on trends in social media, content, and audience behavior.
- Use analytics tools to track content performance and suggest improvements.

Requirements:
- Proven experience as a Content Creator or in a similar role.
- Excellent writing and editing skills in English (Malayalam or other languages is a plus).
- Basic knowledge of design tools (Canva, Adobe Spark, Photoshop, etc.).
- Creativity, attention to detail, and ability to meet deadlines.
- Degree/diploma in Marketing, Communications, Journalism, or a related field preferred.

Preferred Skills:
- Ability to create memes, trending content, and reels
- Good communication and teamwork skills

Perks & Benefits:
- Flexible working environment
- Exposure to real-time content trends
- Opportunities for career growth

Job Types: Full-time, Permanent
Pay: ₹10,000.00 - ₹15,000.00 per month
Work Location: In person

Posted 4 days ago

Apply

0.0 - 5.0 years

0 Lacs

Mohali, Punjab

On-site

About Net Spark Solutions:
Net Spark Solutions is a leading digital solutions provider delivering innovative web design and development services. With a team of skilled industry experts, we specialize in creating fully functional digital solutions that help businesses grow, reach global audiences, and boost revenue.

Experience: 5 to 8 years
Job Type: Full-time
Mode: Work from Office
Location: Mohali, Punjab

Job Overview
We are looking for a Digital Marketing Team Lead with strong expertise in SEO, SMO, PPC, Facebook Ads, and CRO. The candidate should have excellent communication skills, be able to handle client calls, manage SEO, PPC, and Facebook Ads projects independently, and lead the team effectively. Experience in client conversion, preparing impactful proposals, and helping the company close deals is essential.

Key Responsibilities:
- Lead and manage digital marketing campaigns across SEO, PPC, SMO, Facebook Ads, and CRO.
- Handle client calls and maintain effective communication with clients.
- Independently manage and execute SEO, PPC, and Facebook Ads projects.
- Supervise and guide the digital marketing team to achieve targets.
- Prepare impactful proposals to support client acquisition efforts.
- Assist in client conversion and help close business deals.

Requirements:
- Strong expertise in SEO (on-page, off-page, technical, and local SEO), SMO, PPC, Facebook Ads, and Conversion Rate Optimization (CRO).
- GMB and content marketing skills.
- Well versed in the best practices of website optimization.
- Excellent communication skills.
- Ability to handle client calls confidently.
- Experience managing SEO, PPC, and Facebook Ads projects independently.
- Proven leadership skills to effectively lead and manage a team.
- Experience in client conversion and preparing impactful proposals.
- Ability to assist in closing deals and contribute to business growth.

Why Join Net Spark Solutions?
- Work on international projects: gain valuable hands-on experience by contributing to global digital solutions.
- Flexible work timings: enjoy a better work-life balance with flexible scheduling options.
- Positive and friendly environment: be part of a supportive team culture that values collaboration and growth.
- Learning and development: access mentorship, training resources, and professional courses to grow your skills and career.

Job Type: Full-time
Pay: Up to ₹660,000.00 per year
Schedule: Day shift
Ability to commute/relocate: Mohali, Punjab: Reliably commute or plan to relocate before starting work (Required)
Education: Bachelor's (Required)
Experience: SEO: 5 years (Required); Client Handling: 5 years (Required); SMO: 5 years (Required); Team Handling: 5 years (Required); CRO: 5 years (Required)
Language: English (Required)
Work Location: In person

Posted 4 days ago

Apply

7.0 years

1 - 4 Lacs

Hyderābād

On-site

We are seeking a Senior Data Services Engineer with Databricks experience to join our innovative development team. The ideal candidate will have a robust background in Java development, distributed computing, and big data technologies, with a focus on Databricks. Proficiency in cloud platforms like Azure and data warehousing solutions such as Snowflake is essential. This role offers the opportunity to lead the design and implementation of cutting-edge data solutions that drive business intelligence and analytics.

Key Responsibilities:
- Lead the design, development, and deployment of scalable applications integrated with Databricks for big data processing.
- Architect and optimize distributed computing solutions using Apache Spark within the Databricks environment to handle large-scale datasets efficiently.
- Implement and manage a data lakehouse architecture using Databricks, ensuring data integrity, security, and accessibility.
- Develop and maintain ETL pipelines using Databricks and Apache Spark, automating data workflows to improve processing efficiency.
- Provide technical leadership in system architecture design, making informed decisions to meet business requirements and scalability needs.
- Optimize cloud resource utilization on platforms like Azure, ensuring cost-effective and reliable data processing solutions.

Must-Have Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Minimum of 7 years of hands-on experience in Java development, with a deep understanding of object-oriented principles
- Extensive experience working with Databricks, including designing and implementing data pipelines, managing clusters, and utilizing Databricks notebooks
- Proficiency in distributed computing frameworks, particularly Apache Spark, within the Databricks environment
- Strong experience with cloud platforms, especially Microsoft Azure, including services like Azure Data Lake Storage (ADLS) and Azure Data Factory
- Solid understanding of data warehousing concepts and hands-on experience with Snowflake
- Experience with version control systems such as Git and familiarity with CI/CD pipelines
- Excellent problem-solving skills, attention to detail, and the ability to work effectively in a collaborative team environment

Good-to-Have Skills:
- Experience with additional programming languages such as Python or Scala
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes
- Understanding of agile development methodologies and experience working in Agile teams
- Familiarity with monitoring and logging tools to ensure application reliability and performance
- Contributions to open-source projects or active participation in relevant technical communities
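As a rough illustration of the kind of transformation such ETL pipelines perform, here is a deduplication step sketched in plain Python with hypothetical field names (`id`, `updated_at`); in Databricks the same logic would typically be expressed against Spark DataFrames (e.g., a window over `id` ordered by `updated_at`):

```python
from typing import Dict, Iterable, List

def latest_by_id(rows: Iterable[Dict]) -> List[Dict]:
    """Keep only the most recent row per 'id' and drop rows missing
    required fields -- a typical cleaning step before data is written
    to a curated lakehouse table."""
    latest: Dict = {}
    for row in rows:
        if row.get("id") is None or row.get("updated_at") is None:
            continue  # drop malformed rows
        kept = latest.get(row["id"])
        if kept is None or row["updated_at"] > kept["updated_at"]:
            latest[row["id"]] = row
    return sorted(latest.values(), key=lambda r: r["id"])

rows = [
    {"id": 1, "updated_at": "2024-01-01", "v": "old"},
    {"id": 1, "updated_at": "2024-02-01", "v": "new"},  # supersedes the row above
    {"id": 2, "updated_at": "2024-01-15", "v": "x"},
    {"id": None, "updated_at": "2024-01-20"},           # malformed, dropped
]
print(latest_by_id(rows))  # two rows: id 1 keeps v="new", id 2 keeps v="x"
```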

Posted 4 days ago

Apply

10.0 years

10 Lacs

Hyderābād

On-site

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category: Software Engineering

About Salesforce
We’re Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And, we empower you to be a Trailblazer, too — driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good, you’ve come to the right place.

We’re building a product data platform to bring Salesforce’s product signals into the agentic era — powering smarter, adaptive, and self-optimizing product experiences. As a Senior Manager, you’ll lead a team of talented engineers in designing and building trusted, scalable systems that capture, process, and surface rich product signals for use across analytics, AI/ML, and customer-facing features. You’ll guide architectural decisions, drive cross-functional alignment, and shape strategy around semantic layers, knowledge graphs, and metrics frameworks that help teams publish and consume meaningful insights with ease. We’re looking for a strategic, systems-minded leader who thrives in ambiguity, excels at cross-org collaboration, and has a strong technical foundation to drive business and product impact.

What You’ll Do
- Lead and grow a high-performing engineering team focused on batch and streaming data pipelines using technologies like Spark, Trino, Flink, and DBT
- Define and drive the vision for intuitive, scalable metrics frameworks and a robust semantic signal layer
- Partner closely with product, analytics, and engineering stakeholders to align schemas, models, and data usage patterns across the org
- Set engineering direction and best practices for building reliable, observable, and testable data systems
- Mentor and guide engineers in both technical execution and career development
- Contribute to long-term strategy around data governance, AI-readiness, and intelligent system design
- Serve as a thought leader and connector across domains to ensure data products deliver clear, trusted value

What We’re Looking For
- 10+ years of experience in data engineering or backend systems, with at least 2+ years in technical leadership or management roles
- Strong hands-on technical background, with deep experience in big data frameworks (e.g., Spark, Trino/Presto, DBT)
- Familiarity with streaming technologies such as Flink or Kafka
- Solid understanding of semantic layers, data modeling, and metrics systems
- Proven success leading teams that build data products or platforms at scale
- Experience with cloud infrastructure (especially AWS: S3, EMR, ECS, IAM)
- Exposure to modern metadata platforms, Snowflake, or knowledge graphs is a plus
- Excellent communication and stakeholder management skills
- A strategic, pragmatic thinker who is comfortable making high-impact decisions amid complexity

Why Join Us
This is your opportunity to shape how Salesforce understands and uses its product data. You’ll be at the forefront of transforming raw product signals into intelligent, actionable insights — powering everything from internal decision-making to next-generation AI agents. If you're excited by the challenge of leading high-impact teams and building trusted systems at scale, we'd love to talk to you.

Accommodations
If you require assistance due to a disability applying for open positions, please submit a request via this Accommodations Request Form.

Posting Statement
Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that’s inclusive, and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications — without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.

Posted 4 days ago

Apply

5.0 years

6 - 10 Lacs

Hyderābād

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

Job Title: AWS Senior Data Engineer
Experience Required: Minimum 5+ years

Job Summary:
We are seeking a skilled Data Engineer with a strong background in data ingestion, processing, and storage. The ideal candidate will have experience working with various data sources and technologies, particularly in a cloud environment. You will be responsible for designing and implementing data pipelines, ensuring data quality, and optimizing data storage solutions.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for data ingestion and processing using Python, Spark, and AWS services.
- Work with on-prem Oracle databases, batch files, and Confluent Kafka for data sourcing.
- Implement and manage ETL processes using AWS Glue and EMR for batch and streaming data.
- Develop and maintain data storage solutions using the Medallion Architecture in S3, Redshift, and Oracle.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Monitor and optimize data workflows using Airflow and other orchestration tools.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement CI/CD practices for data pipeline deployment using Terraform and other tools.
- Utilize monitoring and logging tools such as CloudWatch, Datadog, and Splunk to ensure system reliability and performance.
- Communicate effectively with stakeholders to gather requirements and provide updates on project status.

Technical Skills Required:
- Proficient in Python for data processing and automation.
- Strong experience with Apache Spark for large-scale data processing.
- Familiarity with AWS S3 for data storage and management.
- Experience with Kafka for real-time data streaming.
- Knowledge of Redshift for data warehousing solutions.
- Proficient in Oracle databases for data management.
- Experience with AWS Glue for ETL processes.
- Familiarity with Apache Airflow for workflow orchestration.
- Experience with EMR for big data processing.

Mandatory: Strong AWS data engineering skills.

Good Additional Skills:
- Familiarity with Terraform for infrastructure as code.
- Experience with messaging services such as SNS and SQS.
- Knowledge of monitoring and logging tools like CloudWatch, Datadog, and Splunk.
- Experience with AWS DataSync, DMS, Athena, and Lake Formation.

Communication Skills: Excellent verbal and written communication skills are mandatory for effective collaboration with team members and stakeholders.

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
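The Medallion Architecture referenced above organizes a data lake into bronze (raw), silver (cleaned), and gold (aggregated) layers. As a minimal sketch of one common path convention, with a hypothetical bucket name and date-partitioning scheme (both are assumptions, not details from the posting):

```python
def layer_path(bucket: str, layer: str, dataset: str, run_date: str) -> str:
    """Build the S3 prefix for a dataset in a given medallion layer.

    bronze = raw ingested data, silver = cleaned/conformed data,
    gold = business-level aggregates.
    """
    allowed = ("bronze", "silver", "gold")
    if layer not in allowed:
        raise ValueError(f"layer must be one of {allowed}, got {layer!r}")
    return f"s3://{bucket}/{layer}/{dataset}/dt={run_date}/"

# A pipeline stage reads from one layer and writes to the next:
src = layer_path("analytics-lake", "bronze", "orders", "2024-06-01")
dst = layer_path("analytics-lake", "silver", "orders", "2024-06-01")
print(src)  # s3://analytics-lake/bronze/orders/dt=2024-06-01/
print(dst)  # s3://analytics-lake/silver/orders/dt=2024-06-01/
```

Keeping the convention in one function makes it easy for Glue/Spark jobs and Airflow tasks to agree on where each layer lives.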

Posted 4 days ago

Apply

2.0 - 3.0 years

0 Lacs

Telangana

On-site

About Chubb Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com . About Chubb India At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning. 
Role : ML Engineer (Associate / Senior) Experience : 2-3 Years (Associate) 4-5 Years (Senior) Mandatory Skill: Python/MLOps/Docker and Kubernetes/FastAPI or Flask/CICD/Jenkins/Spark/SQL/RDB/Cosmos/Kafka/ADLS/API/Databricks Location: Bangalore Notice Period: less than 60 Days Job Description: Other Skills: Azure/LLMOps/ADF/ETL We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof-of-concept to production, ensuring they deliver real-world impact and solve critical business challenges. Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions. Experience of having deployed ML models to production Create high performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders. Integrate machine learning models seamlessly into existing production systems. Continuously monitor and evaluate model performance and retrain the models automatically or periodically Streamline existing ML pipelines to increase throughput. Identify and address security vulnerabilities in existing applications proactively. Design, develop, and implement machine learning models for preferably insurance related applications. Well versed with Azure ecosystem Knowledge of NLP and Generative AI techniques. Relevant experience will be a plus. Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) will be a plus. Stay up-to-date on the latest advancements in machine learning and contribute to ongoing innovation within the team. Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results. 
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026.
- Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
- Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including dental and vision care. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
- Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits and car lease that help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like education reimbursement programs, certification programs and access to global learning programs.
- Health and welfare benefits: We care about our employees’ well-being in and out of work, with benefits like an Employee Assistance Program (EAP), yearly free health campaigns and comprehensive insurance benefits.

Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey.

Apply Now: Chubb External Careers

Posted 4 days ago

Apply

0 years

2 - 5 Lacs

Hyderābād

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Key Responsibilities
- Develop, deploy, and monitor machine learning models in production environments.
- Automate ML pipelines for model training, validation, and deployment.
- Optimize ML model performance, scalability, and cost efficiency.
- Implement CI/CD workflows for ML model versioning, testing, and deployment.
- Manage and optimize data processing workflows for structured and unstructured data.
- Design, build, and maintain scalable ML infrastructure on cloud platforms.
- Implement monitoring, logging, and alerting solutions for model performance tracking.
- Collaborate with data scientists, software engineers, and DevOps teams to integrate ML models into business applications.
- Ensure compliance with best practices for security, data privacy, and governance.
- Stay updated with the latest trends in MLOps, AI, and cloud technologies.

Mandatory Technical Skills
- Programming languages: Proficiency in Python (3.x) and SQL.
- ML frameworks & libraries: Extensive knowledge of ML frameworks (TensorFlow, PyTorch, Scikit-learn), data structures, data modeling, and software architecture.
- Databases: Experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Mathematics & algorithms: Strong understanding of mathematics, statistics, and algorithms for machine learning applications.
- ML modules & REST APIs: Experience in developing and integrating ML modules with RESTful APIs.
- Version control: Hands-on experience with Git and best practices for version control.
- Model deployment & monitoring: Experience in deploying and monitoring ML models using:
  - MLflow (for model tracking, versioning, and deployment)
  - WhyLabs (for model monitoring and data drift detection)
  - Kubeflow (for orchestrating ML workflows)
  - Airflow (for managing ML pipelines)
  - Docker & Kubernetes (for containerization and orchestration)
  - Prometheus & Grafana (for logging and real-time monitoring)
- Data processing: Ability to process and transform unstructured data into meaningful insights (e.g., auto-tagging images, text-to-speech conversions).

Preferred Cloud & Infrastructure Skills
- Cloud platforms: Knowledge of AWS Lambda, AWS API Gateway, AWS Glue, Athena, S3, Iceberg, and Azure AI Studio for model hosting, GPU/TPU usage, and scalable infrastructure.
- Infrastructure as Code: Hands-on experience with Terraform or CloudFormation for cloud automation.
- CI/CD pipelines: Experience integrating ML models into continuous integration/continuous delivery workflows; we mostly use Git-based CI/CD methods.
- Feature stores: Experience with feature stores (Feast, Tecton) for managing ML features.
- Big data: Knowledge of big data processing tools (Spark, Hadoop, Dask, Apache Beam).

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
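One simple statistic behind the data-drift monitoring that tools like WhyLabs provide is the Population Stability Index (PSI), which compares the feature distribution seen at training time against the live serving distribution. A stdlib-only sketch (the bin edges, sample data, and alert threshold here are illustrative, not from any particular tool):

```python
import math
from typing import List, Sequence

def histogram(values: Sequence[float], edges: Sequence[float]) -> List[float]:
    """Proportion of values falling into each bin defined by `edges`."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1] or (i == len(edges) - 2 and v == edges[-1]):
                counts[i] += 1
                break
    total = max(1, len(values))
    return [c / total for c in counts]

def psi(expected: Sequence[float], actual: Sequence[float],
        edges: Sequence[float], eps: float = 1e-6) -> float:
    """Population Stability Index between a reference and a live sample.

    0 means identical binned distributions; larger values mean more drift.
    `eps` guards against empty bins in the log ratio.
    """
    e = histogram(expected, edges)
    a = histogram(actual, edges)
    return sum((ai - ei) * math.log((ai + eps) / (ei + eps))
               for ei, ai in zip(e, a))

if __name__ == "__main__":
    train = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
    shifted = [v + 0.4 for v in train]           # simulated serving-time drift
    edges = [0.0, 0.25, 0.5, 0.75, 1.0, 1.5]
    print("no drift:", round(psi(train, list(train), edges), 4))
    print("drift   :", round(psi(train, shifted, edges), 4))
```

In a monitoring pipeline this check would run per feature on a schedule, raising an alert (and possibly a retraining job) when PSI crosses a chosen threshold.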

Posted 4 days ago

Apply

8.0 years

12 - 24 Lacs

Hyderābād

On-site

Senior Java / Spark Developer

Required Skills & Experience:
- 8 years of Java experience
- Strong Spring Boot experience
- Experience using Java with Spark; writing complex SQL and PL/SQL queries for data analysis, data lineage and reconciliation, preferably in PostgreSQL/Oracle
- Experience creating data lineage documents, source-to-target mapping (STM) documents and low-level technical specification documents
- Experience designing and implementing ETL/ELT frameworks for complex warehouses/data marts
- Hands-on development mentality, with a willingness to troubleshoot and solve complex problems

Desired Skills/Experience:
- Life insurance industry experience a huge plus
- Keen ability to prioritize and handle multiple assignments
- Experience working in an on-site/off-site development model

Job Type: Full-time
Pay: ₹100,000.00 - ₹200,000.00 per month
Experience: Java: 8 years (Required); Spark: 8 years (Required); Spring Boot: 8 years (Required)
Work Location: In person
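The data reconciliation work described in this posting typically boils down to SQL that compares a source table with its target, flagging rows that are missing or whose values disagree. A sketch using Python's stdlib sqlite3 as a stand-in for PostgreSQL/Oracle (the `src_policy`/`tgt_policy` tables and their contents are hypothetical):

```python
import sqlite3

# Hypothetical source and target tables for a reconciliation check.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_policy (policy_id TEXT PRIMARY KEY, premium NUMERIC);
    CREATE TABLE tgt_policy (policy_id TEXT PRIMARY KEY, premium NUMERIC);
    INSERT INTO src_policy VALUES ('P1', 100), ('P2', 250), ('P3', 300);
    INSERT INTO tgt_policy VALUES ('P1', 100), ('P2', 999);
""")

# LEFT JOIN keeps every source row; the WHERE clause keeps only discrepancies.
RECON_SQL = """
SELECT s.policy_id,
       s.premium AS src_premium,
       t.premium AS tgt_premium,
       CASE
         WHEN t.policy_id IS NULL THEN 'MISSING_IN_TARGET'
         WHEN s.premium <> t.premium THEN 'VALUE_MISMATCH'
       END AS issue
FROM src_policy s
LEFT JOIN tgt_policy t ON t.policy_id = s.policy_id
WHERE t.policy_id IS NULL OR s.premium <> t.premium
ORDER BY s.policy_id;
"""

issues = cur.execute(RECON_SQL).fetchall()
for row in issues:
    print(row)
```

The same query shape works in PostgreSQL or Oracle; a full reconciliation suite would also run the join in the other direction to catch rows present only in the target.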

Posted 4 days ago

Apply

0 years

1 - 1 Lacs

Mohali

On-site

About the Role
We are looking for a passionate Data Science fresher who has completed at least 6 months of practical training, internship, or project experience in the data science field. This is an exciting opportunity to apply your analytical and problem-solving skills to real-world datasets while working closely with experienced data scientists and engineers.

Key Responsibilities
- Assist in data collection, cleaning, and preprocessing from various sources.
- Support the team in building, evaluating, and optimizing ML models.
- Perform exploratory data analysis (EDA) to derive insights and patterns.
- Work on data visualization dashboards and reports using tools like Power BI, Tableau, or Matplotlib/Seaborn.
- Collaborate with senior data scientists and domain experts on ongoing projects.
- Document findings, code, and models in a structured manner.
- Continuously learn and adopt new techniques, tools, and frameworks.

Required Skills & Qualifications
- Education: Bachelor’s degree in Computer Science, Statistics, Mathematics, Engineering, or a related field.
- Experience: Minimum 6 months internship/training in data science, analytics, or machine learning.
- Technical skills: Proficiency in Python (Pandas, NumPy, Scikit-learn, etc.); understanding of machine learning algorithms (supervised/unsupervised); knowledge of SQL and database concepts; familiarity with data visualization tools/libraries; basic understanding of statistics and probability.
- Soft skills: Strong analytical thinking and problem-solving ability; good communication and teamwork skills; eagerness to learn and grow in a dynamic environment.

Good to Have (Optional)
- Exposure to cloud platforms (AWS, GCP, Azure).
- Experience with big data tools (Spark, Hadoop).
- Knowledge of deep learning frameworks (TensorFlow, PyTorch).

What We Offer
- Opportunity to work on real-world data science projects.
- Mentorship from experienced professionals in the field.
- A collaborative, innovative, and supportive work environment.
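The cleaning-and-EDA responsibilities above follow a common pattern: drop or repair incomplete records, then summarize each numeric column. A stdlib-only sketch (the records and the drop-missing policy are illustrative; real work would use Pandas):

```python
import statistics as stats

# Toy records with a missing value, standing in for a real dataset.
raw = [{"age": 34, "income": 52000},
       {"age": 41, "income": 61000},
       {"age": None, "income": 48000},
       {"age": 29, "income": 45000}]

# Cleaning: drop records with any missing field (one simple, common policy;
# imputation is the usual alternative when data is scarce).
clean = [r for r in raw if all(v is not None for v in r.values())]

def summarize(rows, col):
    """Basic exploratory summary for one numeric column."""
    vals = [r[col] for r in rows]
    return {"n": len(vals),
            "mean": stats.mean(vals),
            "median": stats.median(vals),
            "stdev": round(stats.stdev(vals), 2)}

print(summarize(clean, "age"))
print(summarize(clean, "income"))
```

In Pandas the equivalent is roughly `df.dropna().describe()`; the point of the sketch is the order of operations: clean first, then summarize.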
- Growth path to become a full-time Data Scientist with us.

Job Types: Full-time, Permanent, Fresher
Pay: ₹10,000.00 - ₹15,000.00 per month
Benefits: Health insurance
Schedule: Day shift; fixed shift; Monday to Friday
Application Question(s): Have you completed your 6-month training?
Education: Bachelor's (Preferred)
Language: English (Preferred)
Work Location: In person

Posted 4 days ago

Apply

5.0 - 6.0 years

8 - 15 Lacs

India

On-site

We are seeking a highly skilled Python Developer with expertise in Machine Learning and Data Analytics to join our team. The ideal candidate should have 5-6 years of experience in developing end-to-end ML-driven applications and handling data-driven projects independently. You will be responsible for designing, developing, and deploying Python-based applications that leverage data analytics, statistical modeling, and machine learning techniques.

Key Responsibilities:
- Design, develop, and deploy Python applications for data analytics and machine learning.
- Work independently on machine learning model development, evaluation, and optimization.
- Develop ETL pipelines and process large-scale datasets for analysis.
- Implement scalable and efficient algorithms for predictive analytics and automation.
- Optimize code for performance, scalability, and maintainability.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Integrate APIs and third-party tools to enhance functionality.
- Document processes, code, and best practices for maintainability.

Required Skills & Qualifications:
- 5-6 years of professional experience in Python application development.
- Strong expertise in Machine Learning, Data Analytics, and AI frameworks (TensorFlow, PyTorch, Scikit-learn, etc.).
- Proficiency in Python libraries such as Pandas, NumPy, SciPy, and Matplotlib.
- Experience with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
- Hands-on experience with big data technologies (Apache Spark, Delta Lake, Hadoop, etc.).
- Strong experience in developing APIs and microservices using FastAPI, Flask, or Django.
- Good understanding of data structures, algorithms, and software development best practices.
- Strong problem-solving and debugging skills.
- Ability to work independently and handle multiple projects simultaneously.
- Good to have: working knowledge of cloud platforms (Azure/AWS/GCP) for deploying ML models and data applications.
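The ETL-pipeline responsibility above has a standard three-stage shape: extract from a source, transform (validate and normalize), and load into a store. A self-contained sketch using only the stdlib (the CSV feed, the currency rates, and the `orders` schema are invented for illustration; production pipelines would read real files or APIs and load a real warehouse):

```python
import csv
import io
import sqlite3

# Extract: parse a CSV feed (an in-memory string stands in for a file or API).
RAW_CSV = "order_id,amount,currency\n1,100,USD\n2,80,EUR\n3,,USD\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: drop incomplete rows and normalize amounts to USD.
# These conversion rates are illustrative constants, not live data.
RATES = {"USD": 1.0, "EUR": 1.1}

def transform(rows):
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # skip rows with a missing amount
        out.append((int(r["order_id"]),
                    float(r["amount"]) * RATES[r["currency"]]))
    return out

# Load: write to a warehouse table (in-memory SQLite for the sketch).
def load(rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount_usd REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
print(conn.execute("SELECT COUNT(*), SUM(amount_usd) FROM orders").fetchone())
```

Keeping extract/transform/load as separate pure functions, as here, is what makes each stage independently testable and swappable (e.g., replacing the loader with a Spark or Delta Lake writer).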
Job Type: Full-time Pay: ₹800,000.00 - ₹1,500,000.00 per year Schedule: Day shift Experience: Python: 5 years (Required) Work Location: In person Expected Start Date: 01/08/2025

Posted 4 days ago

Apply

7.0 years

1 - 9 Lacs

Bengaluru

On-site

Organization: At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things. Job Title: Senior Software Engineer – Data Modernization (GenAI) Location: Manyata Tech Park, Bangalore (Hybrid) Business & Team: CommSec is Australia's largest online retail stockbroker. It is one of the most highly visible and visited online assets in Australian financial services. CommSec’s systems utilise a variety of technologies and support a broad range of investors. Engineers within CommSec are offered regular opportunities to work on some of the finest IT systems in Australia, as well as having opportunity to develop careers across different functions and teams within the wider Bank. Impact & Contribution: Apply core concepts, technology and domain expertise to effectively develop software solutions to meet business needs. You will contribute to building the brighter future for all by ensuring that our team builds the best solutions possible using modern development practices that ensure both functional and non-functional needs are met. If you have a history of building a culture of empowerment and know what it takes to be a force multiplier within a large organization, then you’re the kind of person we are looking for. You will report to the Lead Engineer within Business Banking Technology. Roles & Responsibilities: Build scalable agentic AI solutions that integrate with existing systems and support business objectives. Implement MLOps pipelines Design and conduct experiments to evaluate model performance and iteratively refine models based on findings. 
Hands-on experience in automated LLM outcome validation and metrication of AI outputs. Good knowledge of ethical AI practices and tools to implement them. Hands-on experience with AWS cloud services such as SNS, SQS and Lambda. Experience with big data platform technologies such as the Spark framework and vector databases. Collaborate with software engineers to deploy AI models in production environments, ensuring robustness and scalability. Participate in research initiatives to explore new AI models and methodologies that can be applied to current and future products. Develop and implement monitoring systems to track the performance of AI models in production. Hands-on DevSecOps experience, including continuous integration/continuous deployment and security practices.

Essential Skills:
The AI Engineer will be involved in the development and deployment of advanced AI and machine learning models. The ideal candidate is highly skilled in MLOps and software engineering, with a strong track record of developing AI models and deploying them in production environments.
- 7+ years' experience
- RAG, prompt engineering
- Vector DB, DynamoDB, Redshift
- Spark framework, Parquet, Iceberg
- Python
- MLOps
- Langfuse, LlamaIndex, MLflow, GLEU, BLEU
- AWS cloud services such as SNS, SQS, Lambda
- Traditional machine learning

Education Qualifications: Bachelor’s or Master's degree in engineering in Information Technology.

If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696.

Advertising End Date: 06/08/2025
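The RAG and prompt-engineering skills listed above combine two steps: retrieve the documents most similar to a query, then assemble a prompt that grounds the model's answer in that context. A toy, dependency-free sketch (the bag-of-words "embedding" stands in for a real embedding model and vector DB, and the sample documents are invented):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; real systems use a model plus a vector DB."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Rank documents by similarity to the query and keep the top-k."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    """Prompt assembly: instruct the model to answer only from retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\nQuestion: {query}"

docs = ["Margin loans let investors borrow against holdings.",
        "Exchange trading hours are 10am to 4pm Sydney time."]
print(build_prompt("when are trading hours", docs))
```

Frameworks named in the posting, such as LlamaIndex, package exactly this retrieve-then-prompt loop, with proper embeddings, chunking, and vector indexes in place of the toy pieces here.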

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru

On-site

Data Engineer - 1 (Experience: 0-2 years)

What we offer
Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak’s Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and look ahead to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high concurrency use cases; building connectors for different sources; building a customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak’s data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled in sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer/SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
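The "config-based and programmatic" data modelling mentioned for the Data Engineering vertical can be sketched in miniature as a declarative pipeline definition executed by a generic runner, so new datasets are onboarded by adding configuration rather than code. The operations and config schema below are invented purely for illustration:

```python
# A declarative pipeline definition: each step is data, not code.
PIPELINE = {
    "source": "accounts",
    "steps": [
        {"op": "filter", "column": "status", "equals": "active"},
        {"op": "rename", "from": "bal", "to": "balance"},
        {"op": "select", "columns": ["id", "balance"]},
    ],
}

# A registry of reusable operations the runner knows how to apply.
OPS = {
    "filter": lambda rows, s: [r for r in rows if r[s["column"]] == s["equals"]],
    "rename": lambda rows, s: [{(s["to"] if k == s["from"] else k): v
                                for k, v in r.items()} for r in rows],
    "select": lambda rows, s: [{k: r[k] for k in s["columns"]} for r in rows],
}

def run(config, rows):
    """Apply each configured step in order; rows pass through unchanged otherwise."""
    for step in config["steps"]:
        rows = OPS[step["op"]](rows, step)
    return rows

data = [{"id": 1, "status": "active", "bal": 500},
        {"id": 2, "status": "closed", "bal": 0}]
print(run(PIPELINE, data))
```

At bank scale the same idea is applied with Spark DataFrames instead of lists of dicts and with the config stored and versioned centrally, which is what lets one data model serve many teams.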

Posted 4 days ago

Apply

8.0 years

5 - 10 Lacs

Bengaluru

On-site

We help the world run better
At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

What you'll do:
We are looking for a Senior Software Engineer – Java to join and strengthen the App2App Integration team within SAP Business Data Cloud. This role is designed to accelerate the integration of SAP’s application ecosystem with its unified data fabric, enabling low-latency, secure and scalable data exchange. You will take ownership of designing and building core integration frameworks that enable real-time, event-driven data flows between distributed SAP systems. As a senior contributor, you will work closely with architects to drive the evolution of SAP’s App2App integration capabilities, with hands-on involvement in Java, ETL and distributed data processing, Apache Kafka, DevOps, SAP BTP and Hyperscaler platforms.

Responsibilities:
- Design and develop App2App integration components and services using Java, RESTful APIs and messaging frameworks such as Apache Kafka.
- Build and maintain scalable data processing and ETL pipelines that support real-time and batch data flows.
- Integrate data engineering workflows with tools such as Databricks, Spark or other cloud-based processing platforms (experience with Databricks is a strong advantage).
- Accelerate the App2App integration roadmap by identifying reusable patterns, driving platform automation and establishing best practices.
- Collaborate with cross-functional teams to enable secure, reliable and performant communication across SAP applications.
- Build and maintain distributed data processing pipelines, supporting large-scale data ingestion, transformation and routing.
- Work closely with DevOps to define and improve CI/CD pipelines, monitoring and deployment strategies using modern GitOps practices.
- Guide cloud-native, secure deployment of services on SAP BTP and major Hyperscalers (AWS, Azure, GCP).
- Collaborate with SAP’s broader Data Platform efforts including Datasphere, SAP Analytics Cloud and BDC runtime architecture.

What you bring:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering or a related field.
- 8+ years of hands-on experience in backend development using Java, with strong object-oriented design and integration patterns.
- Hands-on experience building ETL pipelines and working with large-scale data processing frameworks.
- Experience or experimentation with tools such as Databricks, Apache Spark or other cloud-native data platforms is highly advantageous.
- Familiarity with SAP Business Technology Platform (BTP), SAP Datasphere, SAP Analytics Cloud or HANA is highly desirable.
- Experience designing CI/CD pipelines, and with containerization (Docker), Kubernetes and DevOps best practices.
- Working knowledge of Hyperscaler environments such as AWS, Azure or GCP.
- Passion for clean code, automated testing, performance tuning and continuous improvement.
- Strong communication skills and ability to collaborate with global teams across time zones.

Meet your Team:
SAP is the market leader in enterprise application software, helping companies of all sizes and industries run at their best. As part of the Business Data Cloud (BDC) organization, the Foundation Services team is pivotal to SAP’s Data & AI strategy, delivering next-generation data experiences that power intelligence across the enterprise.
Located in Bangalore, India, our team drives cutting-edge engineering efforts in a collaborative, inclusive and high-impact environment, enabling innovation and integration across SAP’s data platforms #DevT3 Bring out your best SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best. We win with inclusion SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. 
If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com. For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training. EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their race, religion, national origin, ethnicity, age, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability. Successful candidates might be required to undergo a background verification with an external vendor. Requisition ID: 426958 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.

Posted 4 days ago

Apply

5.0 - 7.0 years

4 - 10 Lacs

Bengaluru

On-site

5 - 7 Years | 1 Opening | Bengaluru

Role description
Job Title: Java Spark Developer
Experience: 5 to 7 Years
Location: Bangalore

Job Summary: We are seeking a skilled Java Spark Developer to design, develop, and maintain big data applications leveraging Apache Spark and Java. The ideal candidate will have a strong background in Core Java, experience with DataFrames and Spark SQL, and a solid understanding of relational databases and orchestration frameworks.

Primary Responsibilities:
- Design, develop, and maintain Java-based big data applications using Apache Spark.
- Work extensively with DataFrames to process and analyze large datasets.
- Integrate and manage relational databases such as MySQL, PostgreSQL, or Oracle.
- Utilize orchestration frameworks to automate and manage data workflows.
- Collaborate with data engineers to define and implement robust data processing pipelines.
- Write clean, maintainable, and efficient code following best practices.
- Conduct code reviews and provide constructive feedback to peers.
- Troubleshoot performance issues and resolve software defects promptly.
- Stay up to date with the latest trends and technologies in Java development and big data.

Required Skills & Qualifications:
- Strong proficiency in Core Java.
- 5 to 7 years of hands-on experience with Apache Spark, including Spark DataFrames and Spark SQL.
- Experience with relational databases (e.g., Db2, PostgreSQL, MySQL).
- Familiarity with orchestration frameworks for managing data workflows.
- Solid understanding of big data processing and analytics.
- Excellent problem-solving skills and keen attention to detail.
- Strong communication and team collaboration skills.

Preferred Skills (Nice to Have):
- Familiarity with distributed file systems (e.g., HDFS).
- Experience with CI/CD tools like Jenkins, GitLab CI.
- Understanding of data warehousing concepts.
- Knowledge of Agile/Scrum methodologies.

Why Join Us? Opportunity to work on cutting-edge big data projects.
Collaborative and growth-focused work culture. Competitive compensation and benefits package.

Skills: Core Java, Apache Spark, Relational Databases

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team
Roku is the No. 1 TV streaming platform in the U.S., Canada, and Mexico with 70+ million active accounts. Roku pioneered streaming to the TV and continues to innovate and lead the industry. We believe Roku’s continued success relies on its investment in our machine learning (ML) recommendation engine. Roku enables our users to access millions of pieces of content, including movies, episodes, news, sports, music and channels from all around the world.

About the role
We’re on a mission to build cutting-edge advertising technology that empowers businesses to run sustainable and highly profitable campaigns. The Ad Performance team owns server technologies, data, and cloud services aimed at improving the ad experience. We're looking for seasoned engineers with a background in machine learning to aid in this mission. Examples of problems include improving ad relevance, inferring demographics, yield optimisation, and many more. Employees in this role are expected to apply knowledge of experimental methodologies, statistics, optimisation, probability theory, and machine learning using both general purpose software and statistical languages.
What you’ll be doing
- ML infrastructure: Help build a first-class machine learning platform from the ground up that manages the entire model lifecycle - feature engineering, model training, versioning, deployment, online serving/evaluation, and monitoring of prediction quality.
- Data analysis and feature engineering: Apply your expertise to identify and generate features that can be leveraged by multiple use cases and models.
- Model training with batch and real-time prediction scenarios: Use machine learning and statistical modelling techniques such as Decision Trees, Logistic Regression, Neural Networks, Bayesian Analysis, and others to develop and evaluate algorithms for improving product/system performance, quality, and accuracy.
- Production operations: Low-level systems debugging, performance measurement, and optimisation on large production clusters.
- Collaboration with cross-functional teams: Partner with product managers, data scientists, and other engineers to deliver impactful solutions.
- Staying ahead of the curve: Continuously learn and adapt to emerging technologies and industry trends.
We’re excited if you have
- A Bachelor's, Master's, or PhD in Computer Science, Statistics, or a related field.
- Experience in applied machine learning on real use cases (bonus points for ad tech-related use cases).
- Great coding skills and strong software development experience (we use Spark, Python, Java).
- Familiarity with real-time evaluation of models with low latency constraints.
- Familiarity with distributed ML frameworks such as Spark MLlib, TensorFlow, etc.
- Ability to work with large-scale computing frameworks, data analysis systems, and modelling environments. Examples include Spark, Hive, and NoSQL stores such as Aerospike and ScyllaDB.
- Ad tech background is a plus.
#LI-PS2
Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families.
Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter. The Roku Culture Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a small number of very talented people can do more, at lower cost, than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
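As a concrete illustration of the modelling techniques the listing above names (logistic regression for problems like ad-relevance prediction), here is a minimal, dependency-free sketch. The click-prediction features and training data are invented for illustration; they are not Roku's actual stack or data.

```python
import math

# Minimal logistic regression trained by batch gradient descent, in the
# spirit of the ad-relevance models named in the listing. Feature values
# (e.g. "watch-time score", "genre-match score") are hypothetical.

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, lr=0.5, epochs=2000):
    """Fit weights (w1, w2, bias) by batch gradient descent."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        grads = [0.0, 0.0, 0.0]
        for (x1, x2), y in zip(xs, ys):
            p = sigmoid(w[0] * x1 + w[1] * x2 + w[2])
            err = p - y  # gradient of log-loss w.r.t. the logit
            grads[0] += err * x1
            grads[1] += err * x2
            grads[2] += err
        w = [wi - lr * g / len(xs) for wi, g in zip(w, grads)]
    return w

def predict(w, x1, x2) -> float:
    """Predicted click probability for one impression."""
    return sigmoid(w[0] * x1 + w[1] * x2 + w[2])

# Toy "clicked / not clicked" training set: higher feature values -> click.
X = [(0.9, 0.8), (0.8, 0.9), (0.7, 0.9), (0.1, 0.2), (0.2, 0.1), (0.1, 0.1)]
y = [1, 1, 1, 0, 0, 0]
weights = train(X, y)
```

At production scale this kind of model would be trained with Spark MLlib or TensorFlow and served under the low-latency constraints the listing mentions; the mathematics is the same.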

Posted 4 days ago

Apply

5.0 - 7.0 years

4 - 10 Lacs

Bengaluru

On-site

5 - 7 Years | 1 Opening | Bengaluru
Role description
Job Title: Data Engineer
Experience Required: 5 to 7 Years
Location: Bangalore
Primary Responsibilities:
- Design, develop, and maintain scalable ETL pipelines for processing and transforming large datasets.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective data solutions.
- Optimize and tune data processing workflows to improve performance and efficiency.
- Implement data quality checks to ensure integrity and consistency across data sources.
- Manage and maintain relational databases and data warehouses.
- Leverage cloud-based data platforms (e.g., Snowflake, Databricks) for data storage and processing.
- Monitor and troubleshoot data pipelines to ensure reliability and minimize downtime.
- Create and maintain documentation for data engineering processes and best practices.
Required Skills & Qualifications:
- 5 to 7 years of experience as a Data Engineer or in a similar role.
- Proficiency in Apache Spark for large-scale data processing.
- Strong programming skills in Python.
- Advanced knowledge of SQL for data querying and manipulation.
- Experience working with relational databases and building scalable ETL pipelines.
- Familiarity with cloud data platforms such as Snowflake or Databricks.
- Strong problem-solving skills and high attention to detail.
- Excellent communication and collaboration abilities.
Desired Skills:
- Experience with big data technologies such as Kafka.
- Understanding of data modeling and data warehousing concepts.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Experience with version control systems like Git.
- Knowledge of data governance and compliance requirements.
Skills: Data Engineering, Apache Spark, Python, SQL
About UST
UST is a global digital transformation solutions provider.
For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
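The "implement data quality checks" responsibility above can be sketched in a few lines of dependency-free Python. Column names, rules, and rows are hypothetical; a real pipeline would express the same rules in Spark or SQL against production tables.

```python
# A minimal sketch of rule-based data quality checks (null checks, range
# validation, key uniqueness). All names and data are illustrative.

def check_quality(rows, rules):
    """Return a list of human-readable violations found in `rows`."""
    violations = []
    seen_keys = set()
    for i, row in enumerate(rows):
        for col, rule in rules.items():
            value = row.get(col)
            if rule.get("required") and value is None:
                violations.append(f"row {i}: {col} is null")
            if value is not None and "min" in rule and value < rule["min"]:
                violations.append(f"row {i}: {col}={value} below min {rule['min']}")
        key = row.get("order_id")
        if key in seen_keys:
            violations.append(f"row {i}: duplicate order_id {key}")
        seen_keys.add(key)
    return violations

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},   # fails the range check
    {"order_id": 2, "amount": 30.0},   # duplicate key
    {"order_id": 3, "amount": None},   # null required column
]
rules = {"amount": {"required": True, "min": 0.0}}
issues = check_quality(rows, rules)
```

In practice such checks run as a pipeline stage and fail the job (or quarantine bad rows) when violations exceed a threshold.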

Posted 4 days ago

Apply

8.0 - 13.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Essential Responsibilities:
As a Senior Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Using technologies such as Python, Spark, Airflow, Snowflake, Hive, and FastAPI daily
- Maintaining data quality and accuracy across production data systems
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the roadmap, and taking ownership of and responsibility for new projects
- Participating in the on-call rotation in their respective time zones (being available by phone or email in case something goes wrong)
Desired Characteristics:
- Minimum 8 years of software engineering experience.
- An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired.
- 2+ years of experience/fluency in Python
- Proficiency with relational databases and advanced SQL
- Expertise with services like Spark and Hive; experience working with container-based solutions is a plus.
- Experience with schedulers such as Apache Airflow, Luigi, or Chronos
- Experience using cloud services (AWS) at scale
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale, and eagerness to learn new things.
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments.
- Exposure to the whole software development lifecycle, from inception to production and monitoring.
- Experience in the Advertising Attribution domain is a plus
- Experience in agile software development processes
- Excellent interpersonal and communication skills
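Schedulers like Airflow, mentioned in the listing above, fundamentally execute ETL tasks in dependency order. That core idea can be sketched without Airflow itself, using the standard library; the task names here are made up for illustration.

```python
from graphlib import TopologicalSorter

# An Airflow-style DAG reduced to its essence: tasks mapped to their
# upstream dependencies, executed in a valid topological order.
# Task names are illustrative, not from any real pipeline.

dag = {
    "extract_events": set(),
    "extract_users": set(),
    "transform": {"extract_events", "extract_users"},
    "load_warehouse": {"transform"},
    "data_quality_check": {"load_warehouse"},
}

# static_order() yields each task only after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

A real Airflow DAG adds scheduling, retries, and operators on top, but the dependency resolution it performs is exactly this topological ordering.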

Posted 4 days ago

Apply

15.0 years

6 - 8 Lacs

Bengaluru

On-site

Company Description
Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT and Business Solutions. With 28,200+ associates, it is the largest software development center of Bosch outside Germany, making it the Technology Powerhouse of Bosch in India, with a global footprint and presence in the US, Europe and the Asia Pacific region.
Job Description
Job Summary - Bosch Research is seeking a highly accomplished and technically authoritative Software Expert in AI/ML Architecture to define, evolve, and lead the technical foundations of enterprise-grade, AI-driven systems. This is a technical leadership role without people management responsibilities, intended for professionals with deep expertise in software architecture, AI/ML systems, and large-scale engineering applications and their end-to-end delivery. You will own the architecture and technical delivery of complex software solutions—ensuring they are robust, scalable, and capable of serving diverse business domains and datasets. The ideal candidate demonstrates mastery in cloud-native engineering, MLOps, Azure ML, and the integration of AI algorithms (computer vision, text, time series, ML, etc.), LLMs, agentic AI, and other advanced AI capabilities into secure and high-performing software environments.
Roles & Responsibilities:
Technical Architecture and Solution Ownership
- Define, evolve, and drive software architecture for AI-centric platforms across industrial and enterprise use cases.
- Architect for scalability, security, availability, and multi-domain adaptability, accommodating diverse data modalities and system constraints.
- Embed non-functional requirements (NFRs) - latency, throughput, fault tolerance, observability, security, and maintainability - into all architectural designs.
- Incorporate LLM, agentic AI, and foundation model design patterns where appropriate, ensuring performance and operational compliance in real-world deployments.
Enterprise Delivery and Vision
- Lead the translation of research and experimentation into production-grade solutions with measurable impact on business KPIs (both top-line growth and bottom-line efficiency).
- Perform deep-dive gap analysis of existing software and data pipelines, and develop long-term architectural solutions and migration strategies.
- Build architectures that thrive under enterprise constraints, such as regulatory compliance, resource limits, multi-tenancy, and lifecycle governance.
AI/ML Engineering and MLOps
- Design and implement scalable MLOps workflows, integrating CI/CD pipelines, experiment tracking, automated validation, and model retraining loops.
- Operationalize AI pipelines using Azure Machine Learning (Azure ML) services and ensure seamless collaboration with data science and platform teams.
- Ensure architectures accommodate responsible AI, model explainability, and observability layers.
Software Quality and Engineering Discipline
- Champion software engineering best practices with rigorous attention to:
  - Code quality through static/dynamic analysis and automated quality metrics
  - Code reviews, pair programming, and technical design documentation
  - Unit, integration, and system testing, backed by frameworks like pytest, unittest, or Robot Framework
  - Code quality tools such as SonarQube, CodeQL, or similar
- Drive a culture of traceability, testability, and reliability, embedding quality gates into the development lifecycle.
- Own the technical validation lifecycle, ensuring reproducibility and continuous monitoring post-deployment.
Cloud-Native AI Infrastructure
- Architect AI services with cloud-native principles, including microservices, containers, and service mesh.
- Leverage Azure ML, Kubernetes, Terraform, and cloud-specific SDKs for full lifecycle management.
- Ensure compatibility with hybrid-cloud/on-premise environments and support the constraints typical of engineering and industrial domains.
Qualifications
Educational qualification: Master's or Ph.D. in Computer Science, AI/ML, Software Engineering, or a related technical discipline
Experience: 15+ years in software development, including:
- Deep experience in AI/ML-based software systems
- Strong architectural leadership in enterprise software design
- Delivery experience in engineering-heavy and data-rich environments
Mandatory/Required Skills:
- Programming: Python (required), Java, JS, frontend/backend technologies, databases; C++ (bonus)
- AI/ML: TensorFlow, PyTorch, ONNX, scikit-learn, MLflow (or equivalents)
- LLM/GenAI: Knowledge of transformers, attention mechanisms, fine-tuning, prompt engineering
- Agentic AI: Familiarity with planning frameworks, autonomous agents, and orchestration layers
- Cloud Platforms: Azure (preferred), AWS or GCP; experience with Azure ML Studio and SDKs
- Data & Pipelines: Airflow, Kafka, Spark, Delta Lake, Parquet, SQL/NoSQL
- Architecture: Microservices, event-driven design, API gateways, gRPC/REST, secure multi-tenancy
- DevOps/MLOps: GitOps, Jenkins, Azure DevOps, Terraform, containerization (Docker, Helm, K8s)
What You Bring
- Proven ability to bridge research and engineering in the AI/ML space with strong architectural clarity.
- Ability to translate ambiguous requirements into scalable design patterns.
- Deep understanding of the enterprise SDLC, including review cycles, compliance, testing, and cross-functional alignment.
- A mindset focused on continuous improvement, metrics-driven development, and transparent technical decision-making.
Additional Information
Why Bosch Research?
At Bosch Research, you will be empowered to lead the architectural blueprint of AI/ML software products that make a tangible difference in industrial innovation.
You will have the autonomy to architect with vision, scale with quality, and deliver with rigor—while collaborating with a global community of experts in AI, engineering, and embedded systems.

Posted 4 days ago

Apply

4.0 years

8 - 10 Lacs

Bengaluru

On-site

As passionate about our people as we are about our mission. Why Join Q2? Q2 is a leading provider of digital banking and lending solutions to banks, credit unions, alternative finance companies, and fintechs in the U.S. and internationally. Our mission is simple: build strong and diverse communities through innovative financial technology—and we do that by empowering our people to help create success for our customers. What Makes Q2 Special? Being as passionate about our people as we are about our mission. We celebrate our employees in many ways, including our “Circle of Awesomeness” award ceremony and day of employee celebration among others! We invest in the growth and development of our team members through ongoing learning opportunities, mentorship programs, internal mobility, and meaningful leadership relationships. We also know that nothing builds trust and collaboration like having fun. We hold an annual Dodgeball for Charity event at our Q2 Stadium in Austin, inviting other local companies to play, and community organizations we support to raise money and awareness together. Company Overview: PrecisionLender’s pricing and profitability platform helps commercial bank relationship managers make smart, real-time pricing decisions and deliver superior customer service. Andi®, our virtual pricing analyst, uses artificial intelligence to glean and deliver insights from the thousands of deals priced daily in the platform. Using PrecisionLender, banks grow faster with stronger and more profitable relationships. Our product is used globally by 200+ banks and 10,000+ relationship managers to price more than $1 trillion in commercial loans. What You’ll Do Here: As a Software Engineer in Test, you will be responsible for building solutions to testing problems. The primary focus of your work will be automation. You will be automating tests at all levels (unit, integration, end-to-end) for the PrecisionLender web app. 
These tests will run in our continuous integration environment, so they must be efficient, robust, and scalable. Your tests will enable code to be pushed to production with a high level of quality and the end user in mind. You will work closely with Software Development teams to understand their needs and create test coverage and scenarios that ensure we are building quality into our growing product base. If you like to solve problems, streamline operations, and see the result of the work you put in, then we have a place for you on our team! You will be expected to take individual responsibility for delivering value, work with more senior members on larger efforts, and take part in continuously improving our product and company.
RESPONSIBILITIES:
- Creates and maintains automated test cases, executes test suites, reviews and diagnoses reported bugs, and ensures overall system quality prior to a customer release.
- Designs, develops, maintains, and troubleshoots automated suites of tests through continuous integration for value-added feedback.
- Works with the engineering teams to derive testing requirements throughout the development cycle.
- Reproduces, debugs, and isolates problems and verifies fixes.
- Works closely with software developers to create software artefacts including test plans, test cases, test procedures, and test reports.
- Works across functional areas with internal partner engineering teams in a disciplined agile environment.
- Estimates own testing tasks and works productively with minimum supervision while showing an excellent team attitude.
- Participates in the performance testing and analysis framework for a web services architecture.
EXPERIENCE AND KNOWLEDGE:
- 4-year college degree in Software Engineering, Computer Science, or a related technical discipline
- 5+ years of experience, preferably in a Software Development Engineer in Test or Software Quality Engineering role.
- Solid experience in testing web-based applications, with hands-on expertise in UI automation scripting using Selenium (C#/Python/Java) and Selenium Grid.
- Strong expertise in testing web services (SOAP and RESTful APIs), with a focus on API automation.
- Proficiency in debugging complex web application issues through code reviews and log analysis.
- Good knowledge of OOP principles and data structures.
- Competence in working with one or two databases such as Postgres, MySQL, Oracle, or NoSQL systems.
- Good to have: experience with BDD and the Gherkin language.
- Experience with Azure DevOps pipelines, Jenkins, or other continuous integration systems.
- Experience with tools & applications such as JIRA, Confluence, BrowserStack/qTest, GitHub/Bitbucket.
- Must be a detail-oriented, analytical, and creative thinker.
- This position requires fluent written and oral communication in English.
Health & Wellness
- Hybrid Work Opportunities
- Flexible Time Off
- Career Development & Mentoring Programs
- Health & Wellness Benefits, including competitive health insurance offerings and generous paid parental leave for eligible new parents
- Community Volunteering & Company Philanthropy Programs
- Employee Peer Recognition Programs – “You Earned it”
Our Culture & Commitment: We’re proud to foster a supportive, inclusive environment where career growth, collaboration, and wellness are prioritized. And our benefits go beyond healthcare—offering resources for physical, mental, and professional well-being. Click here to find out more about the benefits we offer. Q2 employees are encouraged to give back through volunteer work and nonprofit support through our Spark Program (see more). We believe in making an impact—in the industry and in the community. We are an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, genetic information, or veteran status.
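The automation responsibilities in the listing above lend themselves to table-driven tests: one function under test, many input/expected pairs, a single loop. Here is a minimal, dependency-free sketch; the pricing function and the cases are invented for illustration and are not PrecisionLender logic.

```python
# Table-driven test automation in miniature. In the real suite the "function
# under test" would be a Selenium or API interaction running in CI.

def loan_rate(amount: float, term_months: int) -> float:
    """Toy pricing rule standing in for the application logic under test."""
    base = 5.0
    if amount >= 100_000:
        base -= 0.5   # volume discount
    if term_months > 60:
        base += 0.25  # long-term premium
    return round(base, 2)

# Each case: (amount, term in months, expected rate)
CASES = [
    (50_000, 36, 5.0),
    (150_000, 36, 4.5),
    (150_000, 72, 4.75),
]

def run_cases():
    """Run every case and collect any mismatches instead of stopping early."""
    failures = []
    for amount, term, expected in CASES:
        got = loan_rate(amount, term)
        if got != expected:
            failures.append((amount, term, expected, got))
    return failures

failures = run_cases()
```

Frameworks like pytest (`@pytest.mark.parametrize`) or BDD/Gherkin scenario outlines formalize the same pattern, which is why the data-driven mindset transfers directly to the tooling the role lists.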

Posted 4 days ago

Apply

10.0 years

2 - 11 Lacs

Bengaluru

On-site

Join our Team
About this opportunity: We are looking for a Senior Machine Learning Engineer with 10+ years of experience to design, build, and deploy scalable machine learning systems in production. This is not a data science role: we are seeking an engineering-focused individual who can partner with data scientists to productionize models, own ML pipelines end-to-end, and drive the reliability, automation, and performance of our ML infrastructure. You’ll work on mission-critical systems where robustness, monitoring, and maintainability are key. You should be experienced with modern MLOps tools, cloud platforms, containerization, and model serving at scale.
What you will do:
- Design and build robust ML pipelines and services for training, validation, and model deployment.
- Work closely with data scientists, solution architects, DevOps engineers, etc. to align components and pipelines with project goals and requirements; communicate any deviation from the target architecture.
- Cloud Integration: Ensure compatibility with AWS and Azure cloud services for enhanced performance and scalability.
- Build reusable infrastructure components using best practices in DevOps and MLOps.
- Security and Compliance: Adhere to security standards and regulatory compliance, particularly in handling confidential and sensitive data.
- Network Security: Design an optimal network plan for the given cloud infrastructure under the E// network security guidelines.
- Monitor model performance in production and implement drift detection and retraining pipelines.
- Optimize models for performance, scalability, and cost (e.g., batching, quantization, hardware acceleration).
- Documentation and Knowledge Sharing: Create detailed documentation and guidelines for the use and modification of the developed components.
The skills you bring:
- Strong programming skills in Python
- Deep experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn, XGBoost).
- Hands-on with MLOps tools like MLflow, Airflow, TFX, Kubeflow, or BentoML.
- Experience deploying models using Docker and Kubernetes.
- Strong knowledge of cloud platforms (AWS/GCP/Azure) and ML services (e.g., SageMaker, Vertex AI).
- Proficiency with data engineering tools (Spark, Kafka, SQL/NoSQL).
- Solid understanding of CI/CD, version control (Git), and infrastructure as code (Terraform, Helm).
- Experience with monitoring/logging (Prometheus, Grafana, ELK).
Good-to-Have Skills
- Experience with feature stores (Feast, Tecton) and experiment tracking platforms.
- Knowledge of edge/embedded ML, model quantization, and optimization.
- Familiarity with model governance, security, and compliance in ML systems.
- Exposure to on-device ML or streaming ML use cases.
- Experience leading cross-functional initiatives or mentoring junior engineers.
Why join Ericsson? At Ericsson, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build solutions never seen before to some of the world’s toughest problems. You'll be challenged, but you won’t be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like.
Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer. Learn more.
Primary country and city: India (IN) || Bangalore
Req ID: 770160
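The drift detection responsibility in the listing above can be reduced to a simple idea: compare a production feature distribution against its training baseline and flag a shift. The threshold and data below are illustrative only; production systems typically use tests like PSI or Kolmogorov-Smirnov on real feature logs.

```python
import statistics

# A bare-bones sketch of feature-drift detection for a retraining trigger.
# Baseline/live values and the threshold are invented for illustration.

def drift_score(baseline, live):
    """Absolute shift in mean, in units of baseline standard deviation."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(live) - mu) / sigma

def needs_retraining(baseline, live, threshold=2.0):
    """Flag drift when the live mean moves more than `threshold` sigmas."""
    return drift_score(baseline, live) > threshold

baseline = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8]  # feature values at training time
stable   = [10.1, 9.9, 10.4]                   # recent production values, no drift
shifted  = [14.0, 14.5, 15.0]                  # recent production values, drifted
```

In an MLOps stack, a check like this would run on a schedule (e.g. an Airflow task), emit a metric to Prometheus/Grafana, and kick off the retraining pipeline when it fires.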

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

Job Description: Senior/Azure Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai
- At least 5 years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies like ADB, ADF, SQL (including the ability to write complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on in Python, PySpark, or Spark SQL
- Hands-on in Azure Analytics and DevOps
- Taking part in Proofs of Concept (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions
At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We’re committed to fostering an inclusive environment where everyone can thrive.
Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may ask recipients to provide personal information or to make payments as part of an illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
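The "complex SQL" skill in the listing above usually means things like window functions. A small self-contained example follows, with SQLite standing in for Synapse/Databricks SQL; the table and columns are invented for illustration. It picks each customer's most recent order via `ROW_NUMBER()`.

```python
import sqlite3

# Windowed ranking query: latest order per customer. SQLite is used here
# only so the example runs anywhere; the SQL is standard window-function
# syntax, and the schema/data are hypothetical.

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-03', 120.0),
  ('alice', '2024-02-10', 80.0),
  ('bob',   '2024-01-15', 200.0);
""")
rows = conn.execute("""
SELECT customer, order_date, amount FROM (
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY customer
                            ORDER BY order_date DESC) AS rn
  FROM orders
) WHERE rn = 1
ORDER BY customer
""").fetchall()
```

The same query runs essentially unchanged as Spark SQL over a Delta table, which is why window-function fluency transfers directly to the Azure stack the role lists.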

Posted 4 days ago

Apply

4.0 years

6 - 10 Lacs

Bengaluru

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Azure Data Engineer + Power BI Senior – Consulting
As part of our GDS Consulting team, you will join the NCLC team delivering specifically to the Microsoft account. You will work on the latest Microsoft BI technologies and collaborate with other teams within Consulting services.
The opportunity
We’re looking for candidates with expertise in Microsoft BI, Power BI, Azure Data Factory, and Databricks to join our Data Insights team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of our service offering.
Your key responsibilities
- Responsible for managing multiple client engagements.
- Understand and analyse business requirements by working with various stakeholders, and create the appropriate information architecture, taxonomy, and solution approach.
- Work independently on requirement gathering and the cleansing, extraction, and loading of data.
- Translate business and analyst requirements into technical code.
- Create interactive and insightful dashboards and reports using Power BI, connecting to various data sources and implementing DAX calculations.
- Design and build complete ETL/Azure Data Factory processes moving and transforming data for ODS, Staging, and Data Warehousing.
- Design and develop solutions in Databricks, Scala, Spark, and SQL to process and analyze large datasets, perform data transformations, and build data models.
- Design SQL schemas, database schemas, stored procedures, functions, and T-SQL queries.
Skills and attributes for success
- Collaborating with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
- Able to manage senior stakeholders.
- Experience in leading teams to execute high-quality deliverables within stipulated timelines.
- Skills in Power BI, Azure Data Factory, Databricks, Azure Synapse, data modelling, DAX, Power Query, and Microsoft Fabric.
- Strong proficiency in Power BI, including data modelling, DAX, and creating interactive visualizations.
- Solid experience with Azure Databricks, including working with Spark, PySpark (or Scala), and optimizing big data processing.
- Good understanding of various Azure services relevant to data engineering, such as Azure Blob Storage, ADLS Gen2, and Azure SQL Database/Synapse Analytics.
- Strong SQL skills and experience with one of the following: Oracle, SQL, Azure SQL.
- Good to have: experience in SSAS or Azure SSAS, and Agile project management.
- Basic knowledge of Azure Machine Learning services.
- Excellent written and communication skills, and the ability to deliver technical demonstrations.
- Quick learner with a “can do” attitude.
- Demonstrating and applying strong project management skills, inspiring teamwork and responsibility among engagement team members.
To qualify for the role, you must have
- A bachelor's or master's degree
- A minimum of 4-7 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.
Ideally, you’ll also have
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.
What working at EY offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects.
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply