
16161 Spark Jobs - Page 34

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Haryana

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes for clients across various industries. With a workforce of over 125,000 professionals in more than 30 countries, we are driven by curiosity, agility, and a commitment to creating lasting value. Our purpose is to pursue a world that works better for people, transforming leading enterprises worldwide, including Fortune Global 500 companies. We leverage our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI to drive innovation and success.

We are currently inviting applications for the position of Assistant Manager, Azure Data Engineer. In this role, you will design, develop, and maintain data integration and provisioning solutions for data analytics and reporting teams.

Key responsibilities:
- Design, develop, and maintain data pipelines using Azure Data Factory, Databricks, and other Azure services.
- Monitor and optimize Azure data pipelines for high performance and reliability.
- Orchestrate dataflows and develop real-time and batch data processing solutions using Azure Synapse Analytics, Azure Data Lake, or equivalent platforms.
- Implement data validation and cleansing procedures to ensure data quality and integrity.
- Collaborate with data teams to provide data for analytics and reporting.
- Automate data pipelines for scalability and ease of monitoring using tools like Azure Logic Apps or Azure Automation.

Minimum qualifications/skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Relevant experience in Azure cloud-based data engineering or similar roles.
- Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and Azure Synapse Analytics.
- Experience in ETL development and data orchestration, and in Python, SQL, and Spark for data engineering tasks.
- Familiarity with CI/CD pipeline development using Azure DevOps or similar tools.

Preferred qualifications/skills:
- Strong problem-solving skills and attention to detail.
- Excellent communication skills for collaborating with cross-functional teams and business stakeholders.
- Microsoft Azure Data Engineer or Azure Solutions Architect certification.

If you are looking to join a dynamic team in a fast-paced environment, this role offers the opportunity to contribute to the success of our data operations and to shape the future of data analytics and reporting at Genpact.

Job details:
- Position: Assistant Manager
- Location: India-Gurugram
- Schedule: Full-time
- Education level: Bachelor's/Graduation/Equivalent
- Job posting: May 1, 2025, 6:23:46 AM
- Unposting date: Oct 28, 2025, 2:23:46 AM
- Master skills list: Operations
- Job category: Full Time
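The data validation and cleansing step such a pipeline performs can be sketched in plain Python. The record shape, field names (`member_id`, `admit_date`), and accepted date formats below are illustrative assumptions, not details from the posting; in production this logic would typically run inside Databricks/Spark rather than a local loop.

```python
from datetime import datetime

def clean_records(records):
    """Reject incomplete rows and normalise dates to ISO 8601.

    Illustrative only: the field names and the accepted date
    formats are assumptions, not taken from the posting.
    """
    cleaned = []
    for row in records:
        if not row.get("member_id") or not row.get("admit_date"):
            continue  # reject rows missing required fields
        for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
            try:
                parsed = datetime.strptime(row["admit_date"], fmt)
            except ValueError:
                continue  # try the next accepted format
            cleaned.append({**row, "admit_date": parsed.date().isoformat()})
            break  # stop after the first format that parses
        # rows whose date matches no known format are dropped
    return cleaned
```

The same rejection/normalisation rules would map naturally onto DataFrame filters and column expressions in a Spark job.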

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You will be responsible for developing and maintaining Python-based REST APIs, with a strong emphasis on adhering to OpenAPI (Swagger) specifications and writing clean, testable code. It will be crucial to collaborate effectively with internal teams to ensure alignment on data structures, endpoints, versioning strategies, and deployment timelines. You will use tools such as Postman and Swagger UI to validate and document API endpoints. Monitoring and improving the performance, reliability, and security of deployed APIs will be a key part of your role. Additionally, you will support API consumers by maintaining clear documentation and assisting with technical queries. Your contributions will extend to continuous improvement initiatives in development practices, code quality, and system observability, including logging and error handling. Version control and CI/CD workflows will be managed using tools like GitHub, Azure DevOps, or similar platforms.

The ideal candidate has a minimum of 3 years of experience in backend development using Python, with familiarity with frameworks like FastAPI and Flask. A solid understanding of REST API design, versioning, authentication, and documentation, particularly OpenAPI/Swagger, is required. Proficiency with tools such as Postman, VS Code, GitHub, and SQL databases is essential. Knowledge of Azure Functions or cloud-based deployment patterns is advantageous, and experience with Azure is preferred but not mandatory. Troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes will be part of your day-to-day responsibilities. Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus. You should be a team player with a collaborative mindset, proactive in sharing knowledge, and adept at problem-solving. Proficiency in English, both written and spoken, is necessary for effective communication within the team.

If you do not find a suitable role among the current openings but are a passionate and skilled engineer, we encourage you to reach out to us at careers@hashagile.com. Our company is growing rapidly, and we are always looking for enthusiastic individuals to join our team.
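The spec-first workflow this role describes (payloads checked against an OpenAPI contract) can be illustrated with a stripped-down checker. The schema here is a deliberately simplified stand-in, a map of field name to expected Python type, not the real OpenAPI document format that Swagger tooling consumes.

```python
def validate_response(spec_properties, payload):
    """Check a JSON-like payload against a minimal schema fragment.

    `spec_properties` maps field name -> expected Python type. This is
    a toy stand-in for an OpenAPI schema, for illustration only; real
    validation would use the full spec and a library built for it.
    """
    errors = []
    for field, expected_type in spec_properties.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors
```

A check like this would typically sit in the API's test suite, so a drifting endpoint fails CI before consumers notice.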

Posted 5 days ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We're looking for an Associate Software Engineer. This role is office-based at our Pune office.

As a Software Engineer, you will design and deliver solutions that scale to meet the needs of some of the largest and most innovative organizations in the world. You will work with team members to understand and exceed the expectations of users, constantly pushing the technical envelope and helping Cornerstone deliver great results. Working in an agile software development framework focused on development sprints and regular release cycles, you'll own the complete feature story and mentor juniors.

In this role, you will:
- Design, develop, and enhance .NET applications and services for legacy and cloud platforms, using ASP.NET, C#, .NET, React, and CI/CD tools.
- Analyze product and technical user stories and convey technical specifications in a concise and effective manner.
- Code and deliver working deliverables with a "first time right" approach.
- Contribute to architectural decisions and participate in designing robust, scalable solutions.
- Troubleshoot and resolve complex production issues, deliver detailed root cause analysis (RCA), and collaborate with global Engineering, Product, and Release teams.
- Participate in sprint planning and technical design reviews, providing input as appropriate.
- Partner with engineers, product managers, and other team members as appropriate.
- Continuously expand and maintain deep knowledge of our products and technologies.

You've got what it takes if you have:
- A Bachelor's/Master's in Computer Science or a related field.
- 1-2 years' hands-on experience with ASP.NET, C#, and .NET.
- Basic exposure to Gen AI and familiarity with AI tools and their applications.
- A strong grounding in OOP and SOLID design principles.
- Strong skills in analyzing and debugging/troubleshooting functional and technical issues.
- Proficient experience with relational databases such as Microsoft SQL Server or Postgres, with the ability to optimize designs and queries for scale.
- Proven experience in developing microservices and RESTful services.
- Strong TDD skills with experience in unit testing frameworks like NUnit or xUnit.
- Proficiency with ORMs such as Entity Framework or NHibernate.
- A good understanding of secure development practices; you proactively code to avoid security issues and can resolve security findings.
- Excellent analytical, quantitative, and problem-solving abilities.
- Conversance with algorithms, software design patterns, and their best usage.
- A good understanding of concurrency and parallel work streams.
- Self-motivation, requiring minimal oversight.
- Effective teamwork, strong communication skills, and the ability to manage multiple priorities.
- A passion for continuous learning and technology improvement.

Good to have:
- Exposure to modern JavaScript frameworks like Angular or React.
- Exposure to non-relational DBs like MongoDB.
- Experience developing RESTful services, or other SOA development experience (preferably AWS).

Our Culture: Spark Greatness. Shatter Boundaries. Share Success. Are you ready? Because here, right now, is where the future of work is happening. Where curious disruptors and change innovators like you are helping communities and customers enable everyone, anywhere, to learn, grow, and advance. To be better tomorrow than they are today.

Who We Are: Cornerstone powers the potential of organizations and their people to thrive in a changing world. Cornerstone Galaxy, the complete AI-powered workforce agility platform, meets organizations where they are. With Galaxy, organizations can identify skills gaps and development opportunities, retain and engage top talent, and provide multimodal learning experiences to meet the diverse needs of the modern workforce. More than 7,000 organizations and 100 million+ users in 180+ countries and nearly 50 languages use Cornerstone Galaxy to build high-performing, future-ready organizations and people today.

Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!

Posted 5 days ago

Apply

12.0 - 18.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for an experienced Manager, Data Engineering with expertise in Databricks or the Apache data stack to lead complex data platform implementations. In this role, you will spearhead high-impact data engineering projects for global clients, delivering scalable solutions and catalyzing digital transformation.

You should have 12-18 years of total experience in data engineering, with at least 3-5 years in a leadership or managerial capacity. Hands-on experience with Databricks or core Apache stack components such as Spark, Kafka, Hive, Airflow, and NiFi is essential. Proficiency in one or more cloud platforms (AWS, Azure, or GCP) is preferred, ideally with Databricks on the cloud. Strong programming skills in Python, Scala, and SQL are required, along with experience building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is advantageous.

Your responsibilities will include leading the architecture, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks, with delivery accountability for data engineering programs across industries. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key aspect of the role. You will also ensure best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance, and manage and mentor a team of 10-25 engineers, including performance reviews, capability building, and coaching.

At GlobalLogic, we prioritize a culture of caring where people come first. You will have opportunities for continuous learning and development, engaging in interesting and meaningful work that makes an impact. We believe in providing balance and flexibility to help you integrate your work and life effectively. GlobalLogic is a high-trust organization built on integrity and ethical values, providing a safe and reliable environment for your professional growth and success. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with leading companies worldwide to create innovative digital products and experiences. Join us to be part of transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 5 days ago

Apply

3.0 - 8.0 years

0 Lacs

Haryana

On-site

As a key member of the Data consulting team, you will work directly with partners and senior stakeholders of clients to design and implement big data and analytics solutions. The role requires excellent communication and organizational skills, along with a problem-solving attitude. You will have the opportunity to collaborate with a world-class team of business consultants and engineers to solve complex business problems using data and analytics techniques. Working in a highly entrepreneurial environment, you can expect fast-track career growth and a best-in-industry remuneration package.

Your primary responsibilities will include developing data solutions within big data Azure and/or other cloud environments, working with diverse data sets to meet the requirements of Data Science and Data Analytics teams, and designing and building data architectures using tools such as Azure Data Factory, Databricks, Data Lake, and Synapse. You will liaise with the CTO, product owners, and other operations teams to deliver engineering roadmaps, perform data mapping activities, assist the Data Analyst team in developing KPIs and reporting, and maintain relevant documentation and knowledge bases. Additionally, you will research and suggest new database products, services, and protocols.

To excel in this role, you must have technical expertise in big data technologies such as Python, Spark, Hadoop, Clojure, Git, SQL, and Databricks, as well as experience with visualization tools like Tableau and Power BI. Proficiency in cloud, container, and microservice infrastructures, data modeling, query techniques, and complexity analysis is essential. Experience with or knowledge of agile methodologies like Scrum, and of working with development teams and product owners, will be beneficial. Certifications in any of the mentioned areas are preferred.

You should be able to work independently, communicate effectively with remote teams, and demonstrate curiosity to learn and apply emerging technologies to solve business problems. Timely communication and escalation of issues or dependencies to higher management are crucial aspects of this role. If you are interested in this opportunity, please send your resume to sakshi.vohra@invokhr.com and careers@invokhr.com.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

You should have around 3 years of experience as an NLP Engineer or in a similar role, with a demonstrated understanding of NLP techniques for text representation and semantic extraction. Your expertise should also extend to data structures and modeling, enabling you to design software architecture effectively. A deep understanding of text representation techniques such as n-grams, bag of words, and sentiment analysis, as well as of statistics and classification algorithms, is essential for this role. Proficiency in Python is crucial, along with a solid grasp of data structures and algorithms. Experience with machine learning libraries such as scikit-learn, PyTorch, and TensorFlow will be highly beneficial. As a Python AI/ML developer, you should bring an analytical mind and strong problem-solving abilities to excel in this position.
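A toy version of the bag-of-words and n-gram text representations this posting names can be written in plain Python; real projects would reach for scikit-learn's `CountVectorizer` instead, and the whitespace tokenizer here is a simplifying assumption.

```python
from collections import Counter

def bag_of_words(text, n=1):
    """Return token n-gram counts for a single document.

    n=1 gives the classic bag-of-words representation; n=2 gives
    bigram counts. Tokenization is naive (lowercase + whitespace
    split) purely for illustration.
    """
    tokens = text.lower().split()
    grams = [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return Counter(grams)
```

Stacking these count vectors over a corpus yields the document-term matrix that classifiers such as those in scikit-learn train on.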

Posted 5 days ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As a Medicare Risk Adjustment Data Analyst, you'll play a crucial role in supporting the development and enhancement of new analytical applications related to Medicare risk adjustment, as well as supporting existing applications such as Coding Transformation Modernization, Attribution Analytics, and the Financial Risk Adjustment and Management Engine.

Primary responsibilities: This position is with the OI Clinical Solutions - Decision Intelligence team. Upon selection, you will be part of a dynamic team developing and delivering best-in-class analytics for end users. Your work will focus on understanding the CMS Medicare Advantage business and developing analytics according to business and technical requirements. Key responsibilities include:
- Gather and analyze business and/or functional requirements from one or more client business teams.
- Validate requirements with stakeholders and the day-to-day project team, providing suggestions and recommendations in line with industry best practices.
- Develop and deliver best-in-class analytics for end users using Big Data and cloud platforms.
- Document, discuss, and resolve business, data, data processing, and BI/reporting issues within the team, across functional teams, and with business stakeholders.
- Present written and verbal data analysis findings to both the project team and business stakeholders, as required, to support the requirements-gathering phase and issue-resolution activities.
- Manage changing business priorities and scope, and work on multiple projects concurrently.
- Be self-motivated and proactive, with the ability to work in a fast-paced environment.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any directives (such as, but not limited to, transfer and/or reassignment to different work locations, changes in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives at its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required qualifications:
- Undergraduate degree or equivalent experience.
- 1+ years of work experience with Python, Spark, and Hive, including solid experience developing analytics at scale.
- 1+ years of work experience developing end-to-end analytics pipelines on Hadoop/big data platforms.
- 1+ years of work experience in SQL or associated languages.
- 1+ years of experience converting business requirements into technical requirements and developing best-in-class code to those requirements.
- Proven interpersonal, collaboration, diplomatic, influencing, planning, and organizational skills.
- Consistently clear and concise written and verbal communication.
- Proven ability to effectively use complex analytical, interpretive, and problem-solving techniques.
- Proven relationship management skills to partner and influence across organizational lines.
- Demonstrated ability to work under pressure and meet tight deadlines with proactivity, decisiveness, and flexibility.

Preferred qualifications:
- AWS/GCP or other cloud-platform development experience.
- Understanding of Medicare risk adjustment programs.
- Understanding of CMS datasets such as MMR/MOR/EDPS.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
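The SQL aggregation work such an analytics pipeline performs can be sketched with SQLite as a local stand-in for Spark SQL/Hive; the `claims` table, its columns, and the per-member average are hypothetical illustrations, not actual CMS dataset structures.

```python
import sqlite3

# Hypothetical claims table; in production, a query like this would
# run on Spark SQL or Hive over the Hadoop platform the posting names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id TEXT, risk_score REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [("A", 0.8), ("A", 1.2), ("B", 0.5)],
)

# Average risk score per member: a typical aggregation step in a
# risk-adjustment analytics pipeline.
rows = conn.execute(
    "SELECT member_id, AVG(risk_score) FROM claims "
    "GROUP BY member_id ORDER BY member_id"
).fetchall()
conn.close()
```

Because the query is plain ANSI SQL, the same statement moves largely unchanged between SQLite, Hive, and Spark SQL.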

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As a Data Engineering Specialist, you will be responsible for assessing, capturing, and translating complex business issues into structured technical tasks for the data engineering team. This includes designing, building, launching, optimizing, and extending full-stack data and business intelligence solutions. Your role will involve supporting the build-out of big data environments, with a focus on improving data pipelines and data quality, and working with stakeholders to meet business needs. You will create data access tools for the analytics and data science team, conduct code reviews, assist other developers, and train team members as required. Additionally, you will ensure that developed systems comply with industry standards and best practices while meeting project requirements.

To excel in this role, you should possess a Bachelor's degree in Computer Science Engineering or equivalent, or relevant experience. Certification in cloud technologies, especially Azure, would be beneficial. You should have 2-3+ years of development experience building and maintaining ETL/ELT pipelines over varied sources, along with operational programming tasks. Experience with Apache data projects or cloud-platform equivalents and proficiency in programming languages such as Python, Scala, R, Java, Golang, Kotlin, C, or C++ is required.

Your work will involve collaborating closely with data scientists, machine learning engineers, and stakeholders to understand requirements and develop data-driven solutions. Troubleshooting, debugging, and resolving issues within generative AI system development, as well as documenting processes, specifications, and training procedures, will also be part of your responsibilities.

In summary, this role requires a strong background in data engineering, proficiency in cloud technologies, experience with data projects and programming languages, and the ability to collaborate effectively with stakeholders to deliver high-quality data solutions.

Posted 5 days ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are seeking a Senior Staff Engineer to contribute to the development of innovative programs and products tailored to the requirements of Experian's clients, particularly those in the financial services sector. Our focus includes critical questions such as enhancing the robustness and scalability of our loan origination modeling approach, and crucial aspects of the lending industry such as bias, fairness, and explainable AI. As an evolving team, we embody a startup mindset within the framework of a larger organization. Our emphasis lies on agility, impact, and transforming the organizational culture around us through our outcomes and operational methodologies.

Your responsibilities will include taking on challenging assignments within the development teams, offering substantial technical expertise across the entire development cycle, and guiding junior team members. You will play a pivotal role in executing technical and business strategies, ensuring the achievement of functional objectives, and comprehensively supporting products by grasping how their various components interconnect. You will support complex software development projects by contributing to planning and system design and by mentoring junior developers, architect solutions for intricate technical issues or system enhancements, and steer the technical direction for product development, including technology selection and enhancement plans. Your tasks will involve developing Java and Scala components for our analytics product platforms on AWS, actively engaging with the platform and applications, and collaborating with geographically dispersed cross-functional teams to elevate the value of our Analytics offerings. Additionally, you will help optimize the product for cost efficiency while maximizing scalability and stability.

You will report to a Senior Manager. Your primary workplace is Hyderabad, with a hybrid model requiring two days a week in the office.

Key skills required:
- Proficiency in distributed data processing frameworks like Spark
- Familiarity with public cloud platforms such as AWS, Azure, and GCP (preferably AWS)
- Experience with Docker, Kubernetes, CI/CD pipelines, and observability tools
- Hands-on expertise in Scala, Java, and Python

Qualifications:
- Over 10 years of industry experience in object-oriented and asynchronous programming
- Bachelor's degree in Computer Science or a related field

We welcome individuals who are passionate about leveraging their technical expertise to drive innovation and contribute to the growth and success of our dynamic team.

Posted 5 days ago

Apply

0.0 - 31.0 years

6 - 16 Lacs

Navi Mumbai

On-site

🌟 Golden Job Opportunity for ITI Electricians and Wiremen – Utility Sector 🌟

📍 *Job Locations (Local):* a nearby location will be provided – Vasai | Nalasopara | Virar | Thane | Kalyan | Dombivali | Badlapur | Ulhasnagar | Navi Mumbai | Vashi | Palghar | Boisar | Dahanu | Saphale | Bhandup | Mulund | South Mumbai
🗓️ *Interview Date:* 23-07-2025 (meet Parvesh)
⏰ *Time:* 11:00 AM – 4:00 PM
🔗 Apply Now: https://forms.gle/N29orFYh6mwWqsyP9

💼 *Salary & Incentives for ITI Meter Installers:*
Urban area incentives (₹90 per meter):
- 200 meters: 200 x 90 = ₹18,000
- 400 meters: 400 x 90 = ₹36,000
- 600 meters: 600 x 90 = ₹54,000
Example: install 700 meters and earn ₹63,000/month! 💸
💰 *Potential monthly earnings: up to ₹1,00,000!*

Rural area incentives (₹120 per meter):
- 200 meters: 200 x 120 = ₹24,000
- 400 meters: 400 x 120 = ₹48,000
- 600 meters: 600 x 120 = ₹72,000
Example: install 700 meters and earn ₹84,000/month! 💸
Potential monthly earnings: up to ₹1 lakh! 😎

🔧 *Job Roles:*
- Electric meter installation
- Replacing DT, feeder & CT meters

🎓 *Eligibility:*
- ITI in Electrical (freshers & experienced welcome)
- 10th-pass candidates with meter installation experience are also eligible

📞 *For more information, contact:*
📱 Parvesh: 9920266168
📱 Aditya: 970271208

🚀 *Apply now and spark your career in the power sector!*
**Don't miss out – limited slots available!**
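The incentive arithmetic in the flyer reduces to a single multiplication (meters installed x per-meter rate), sketched here for clarity using the flyer's own rates:

```python
def monthly_earnings(meters_installed, area="urban"):
    """Incentive pay per the flyer: Rs. 90 per meter in urban areas,
    Rs. 120 per meter in rural areas."""
    rate_per_meter = {"urban": 90, "rural": 120}[area]
    return meters_installed * rate_per_meter
```

This reproduces the flyer's examples: 700 urban meters gives ₹63,000 and 700 rural meters gives ₹84,000.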

Posted 5 days ago

Apply

0.0 - 31.0 years

3 - 14 Lacs

Mumbai/Bombay

On-site

**🌟 Golden Job Opportunity for ITI Electricians and Wiremen – Utility Sector 🌟**

📍 **Job Locations (Local):** Thane
🗓️ **Interview Date:** Saturday, *26th July 2025*
⏰ **Time:** 9:00 AM – 4:00 PM
📌 **Venue:** **Quess Corp Limited**, Ahura Centre, B-Wing, 5th Floor, Mahakali Caves Rd, Andheri East, Mumbai, Maharashtra 400093
🔗 **Google Map:** [Click here](https://g.co/kgs/ZuMdtYc)

💼 **Salary & Incentives for ITI Meter Installers:**
*Urban area incentives (₹90 per meter):*
- 200 meters: 200 x 90 = ₹18,000
- 400 meters: 400 x 90 = ₹36,000
- 600 meters: 600 x 90 = ₹54,000
*Example: install 700 meters and earn ₹63,000/month!* 💸
💰 **Potential monthly earnings: up to ₹1,00,000!**

*Rural area incentives (₹120 per meter):*
- 200 meters: 200 x 120 = ₹24,000
- 400 meters: 400 x 120 = ₹48,000
- 600 meters: 600 x 120 = ₹72,000
*Example: install 700 meters and earn ₹84,000/month!* 💸
*Potential monthly earnings: up to ₹1 lakh! 😎*

🔧 **Job Roles:**
- Electric meter installation
- Survey & technical support
- Replacing DT, feeder & CT meters

🎓 **Eligibility:**
- ITI in Electrical (freshers & experienced welcome)
- Bike & driving license required for the installer role

📞 **For More Information, Contact:** Amit 📞 9702835982

🚀 **Apply now and spark your career in the power sector!**
***Don't miss out – limited slots available!***

Posted 5 days ago

Apply

0.0 - 31.0 years

3 - 12 Lacs

Dombivali

On-site

**🌟 Golden Job Opportunity for ITI Electricians and Wiremen – Utility Sector 🌟**

📍 **Job Locations (Local):** Kalyan, Dombivali
🗓️ **Interview Date:** Saturday, *26th July 2025*
⏰ **Time:** 9:00 AM – 4:00 PM
📌 **Venue:** **Quess Corp Limited**, Ahura Centre, B-Wing, 5th Floor, Mahakali Caves Rd, Andheri East, Mumbai, Maharashtra 400093
🔗 **Google Map:** [Click here](https://g.co/kgs/ZuMdtYc)

💼 **Salary & Incentives for ITI Meter Installers:**
*Urban area incentives (₹90 per meter):*
- 200 meters: 200 x 90 = ₹18,000
- 400 meters: 400 x 90 = ₹36,000
- 600 meters: 600 x 90 = ₹54,000
*Example: install 700 meters and earn ₹63,000/month!* 💸
💰 **Potential monthly earnings: up to ₹1,00,000!**

*Rural area incentives (₹120 per meter):*
- 200 meters: 200 x 120 = ₹24,000
- 400 meters: 400 x 120 = ₹48,000
- 600 meters: 600 x 120 = ₹72,000
*Example: install 700 meters and earn ₹84,000/month!* 💸
*Potential monthly earnings: up to ₹1 lakh! 😎*

🔧 **Job Roles:**
- Electric meter installation
- Survey & technical support
- Replacing DT, feeder & CT meters

🎓 **Eligibility:**
- ITI in Electrical (freshers & experienced welcome)
- Bike & driving license required for the installer role

📞 **For More Information, Contact:** Amit 📞 9702835982

🚀 **Apply now and spark your career in the power sector!**
***Don't miss out – limited slots available!***

Posted 5 days ago

Apply

1.0 - 31.0 years

1 - 3 Lacs

Somnath Nagar, Mysore/Mysuru

On-site

We are looking for an enthusiastic Business Development Associate to join our team. The ideal candidate will be responsible for making outbound calls to potential customers, introducing our services, and ensuring excellent customer engagement.

Responsibilities:

Outbound calling:
- Make outbound calls to individuals or businesses to promote services.
- Engage with prospects in a friendly, persuasive, and professional manner.
- Present detailed information about products or services to potential customers.
- Highlight key features and benefits to spark interest and generate leads.
- Ensure customer inquiries are addressed with appropriate information and solutions.

Customer relationship building:
- Build and maintain strong relationships with customers.
- Maintain accurate records of customer interactions and feedback.
- Follow up on leads and potential opportunities for service engagement.

Performance metrics:
- Meet or exceed outbound call targets and customer engagement goals.
- Achieve individual performance metrics related to lead conversion and customer satisfaction.

Collaboration & reporting:
- Work closely with team members and management to achieve team goals.
- Report on daily activities, progress, and challenges as required.

Skills & qualifications:
- Strong verbal communication skills.
- Persuasive and confident attitude.
- Ability to handle customer inquiries and objections professionally.
- Previous experience in a voice process or customer service is a plus.
- Basic knowledge of computer applications.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The company believes in conducting business every day based on the core values of Inclusion, Innovation, Collaboration, and Wellness, ensuring a global team works together with customers at the center.

As part of the team, you will have the opportunity to impact the business by identifying AI/ML opportunities and building solutions that drive results. You will lead ML projects, conduct research to discover new ML techniques, and innovate to enhance team and business efficiencies. Collaborating closely with engineers, analysts, and leaders, you will implement and optimize ML models, establish best practices for model management, deployment, and monitoring, and integrate ML models into products and services. Additionally, you will assist in troubleshooting technical issues and maintain documentation, project tracking, and quality controls.

The ideal candidate will have a degree in engineering, science, statistics, or mathematics and a strong technical background in machine learning. Excellent communication skills, an analytical mindset, and a passion for problem-solving are essential. Candidates should have at least 3 years of hands-on experience in problem-solving using machine learning, proficiency in Python or Java, and familiarity with technologies like Spark, Hadoop, BigQuery, and SQL. Deep knowledge of machine learning algorithms, explainable AI methods, GenAI, and NLP is required, along with experience with cloud frameworks such as GCP and AWS. Experience in lending and financial services is considered a plus.

The company offers a range of benefits and is committed to Diversity and Inclusion. To learn more about the company's culture and community, visit https://about.pypl.com/who-we-are/default.aspx. If you are interested in joining the Talent Community or have questions related to your skills, please don't hesitate to apply; the company values all candidates and aims to bridge the confidence gap and imposter syndrome.

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be responsible for fetching and transforming data from various systems, conducting in-depth analyses to identify gaps, opportunities, and insights, and providing recommendations that support strategic business decisions. Your key responsibilities will include data extraction and transformation, data analysis and insight generation, visualization and reporting, collaboration with cross-functional teams, and building strong working relationships with external stakeholders. You will report to the VP of Business Growth and work closely with clients.

To excel in this role, you should be proficient in SQL for data querying and Python for data manipulation and transformation. Experience with data engineering tools such as Spark and Kafka, as well as orchestration tools like Apache NiFi and Apache Airflow, will be essential for ETL processes and workflow automation. Expertise in data visualization tools such as Tableau and Power BI, along with strong analytical skills including statistical techniques, will be crucial.

In addition to technical skills, you should possess soft skills such as flexibility, excellent communication, business acumen, and the ability to work independently as well as within a team. Your academic qualifications should include a Bachelor's or Master's degree in Applied Mathematics, Management Science, Data Science, Statistics, Econometrics, or Engineering. Extensive experience in Data Lake architecture, building data pipelines using AWS services, proficiency in Python and SQL, and experience in the banking domain will be advantageous. Overall, you should demonstrate high motivation, a strong work ethic, maturity, personal initiative, and strong oral and written communication skills to succeed in this role.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

coimbatore, tamil nadu

On-site

As a Python REST API Developer at our Coimbatore location, you will be responsible for developing and maintaining Python-based REST APIs, with a strong emphasis on adhering to OpenAPI (Swagger) specifications and producing clean, testable code. Your role will involve collaborating with internal teams to ensure alignment on data structures, endpoints, versioning strategies, and deployment timelines. Using tools such as Postman and Swagger UI, you will validate and document API endpoints effectively. Monitoring and enhancing the performance, reliability, and security of deployed APIs will be a key part of your responsibilities, as will supporting API consumers by maintaining clear documentation and assisting with technical queries.

To excel in this role, you should have a minimum of 3 years of strong experience in backend development using Python frameworks like FastAPI and Flask. A solid understanding of REST API design, versioning, authentication, and documentation, particularly OpenAPI/Swagger, is crucial. Proficiency in tools such as Postman, VS Code, and GitHub, and experience working with SQL-based databases, are required. While experience with Azure Functions or cloud-based deployment patterns is advantageous, it is not mandatory. You should be comfortable troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes. Additionally, experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus, but not a requirement. A team player with a collaborative mindset and a proactive approach to sharing knowledge and problem-solving will thrive in our environment. Fluency in written and spoken English is necessary for effective communication within our team.
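To illustrate the schema-driven validation an OpenAPI-described endpoint performs, here is a minimal, framework-free sketch. The endpoint, field names, and rules are invented for illustration; in FastAPI the schema, validation, and the 422 error responses come automatically from type-hinted models.

```python
# Minimal sketch of schema-driven request validation, mirroring what
# FastAPI derives automatically from type hints and an OpenAPI spec.
# The "UserCreate" schema and its rules are illustrative only.

from dataclasses import dataclass

@dataclass
class UserCreate:
    name: str
    age: int

def validate_payload(payload: dict) -> UserCreate:
    """Validate a JSON payload against the UserCreate 'schema'."""
    if not isinstance(payload.get("name"), str) or not payload["name"]:
        raise ValueError("name: non-empty string required")
    if not isinstance(payload.get("age"), int) or payload["age"] < 0:
        raise ValueError("age: non-negative integer required")
    return UserCreate(name=payload["name"], age=payload["age"])

def create_user(payload: dict) -> dict:
    """A stand-in for a POST /users handler returning a JSON response."""
    try:
        user = validate_payload(payload)
    except ValueError as exc:
        return {"status": 422, "detail": str(exc)}  # FastAPI-style 422
    return {"status": 201, "user": {"name": user.name, "age": user.age}}
```

Tools like Postman and Swagger UI exercise exactly this contract: a well-formed payload yields a 201, while a payload violating the documented schema yields a 422 with a machine-readable reason.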

Posted 5 days ago

Apply

0.0 - 4.0 years

0 Lacs

pune, maharashtra

On-site

We are seeking exceptional Python Developers interested in collaborating with a US startup. If you have a genuine passion for creating and implementing machine learning solutions using Python, desire a job offering complete flexibility to work from any location, and are eager to gain experience in a startup environment, then this role is tailored for you. Whether you prefer working from your dream vacation spot or a serene countryside setting, as long as you have a stable internet connection, you can work effectively from your chosen destination. Bid farewell to long commutes and the hassle of rushing to in-person meetings. Are you ready to dedicate yourself to hard work while also enjoying some well-deserved downtime? Let's team up.

As part of the application process, interested candidates must complete the pre-screening behavioral assessment form. Without this essential step, candidates will not be considered for this position.

**Requirements**

- Bachelor's or Master's degree in Statistics/Math, Computer Science, Finance/Economics, Computer Engineering, or a related quantitative field (Ph.D. candidates are encouraged to apply)
- Proficiency in Python web development, particularly Flask
- Familiarity with SQL, Unix, Docker, Git, and relational databases
- Strong analytical, design, problem-solving, and troubleshooting/debugging abilities
- Capability to work independently in a home office without the need for constant supervision
- Proficiency in DevOps and deployment pipelines for software deployment to servers (both on-premise hosting and Azure)
- Experience in Analytics/Machine Learning projects, including a solid understanding of how Scikit-learn/Spark libraries and other machine learning packages function in web servers
- Knowledge of software design patterns and software engineering best practices
- Flexible schedule, with a focus on evening work post-college hours (approximately 3-5 hours daily)

**What You'll Do, But Not Limited To:**

- Develop Python code with emphasis on scalability, supportability, and maintainability
- Engage in software development, configuration, and customization
- Identify and resolve issues in production
- Enhance and expand all components of the company's technology suite through collaboration with development teams to determine application requirements
- Evaluate and prioritize client feature requests

**Who You Are:**

- Reliable, independent, and adept at multitasking
- An honest individual who values transparency
- A team player who enjoys collaborative work
- An effective communicator capable of translating goals to team members
- A self-starter who takes ownership of projects and tasks
- A builder with a strong commitment to delivering superior products/experiences to customers and taking responsibility for their work
- Experimental in mindset, always willing to explore new tools, techniques, and approaches, even if failures occur

**Nice To Have:**

- Pursuing/completed an MS/Ph.D. in Computing, Physics, or other STEM fields
- Curious and enthusiastic about learning Analytics/Machine Learning
- Prior experience in Financial Services is advantageous

**Benefits**

- Remote-first company with 100% remote work to accommodate your schedule
- Flexible hours
- Competitive stipend/salary

Please note:

- Strict intolerance of plagiarism during the screening test; any evidence of AI-generated solutions such as ChatGPT will result in immediate disqualification.
- Submit your assignment only as a zip attachment via email; other forms of submission will be automatically rejected.
- Preference will be given to candidates from top schools at Pune University, Mumbai University, NIT, IISER, TIFR, IIT, ISI, or leading schools in the USA/UK.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

As a Java with Hadoop Developer at Airlinq in Gurgaon, India, you will play a vital role in collaborating with the Engineering and Development teams to establish and maintain a robust testing and quality program for Airlinq's products and services. Your responsibilities will include, but are not limited to:

- Being part of a team focused on creating end-to-end IoT solutions using Hadoop to address various industry challenges.
- Building quick prototypes and demonstrations to showcase the value of technologies such as IoT, Machine Learning, Cloud, Micro-Services, DevOps, and AI to management.
- Developing reusable components, frameworks, and accelerators to streamline the development cycle of future IoT projects.
- Operating effectively with minimal supervision and guidance.
- Configuring Cloud platforms for specific use cases.

To excel in this role, you should have a minimum of 3 years of IT experience, with at least 2 years dedicated to working with Cloud technologies such as AWS or Azure. You must possess expertise in designing and implementing highly scalable enterprise applications and establishing continuous integration environments on the targeted cloud platform. Proficiency in Java and the Spring Framework, along with strong knowledge of IoT principles, connectivity, security, and data streams, is essential. Familiarity with emerging technologies such as Big Data, NoSQL, Machine Learning, AI, and Blockchain is also required.

Additionally, you should be adept at utilizing Big Data technologies like Hadoop, Pig, Hive, and Spark, with hands-on experience in at least one Hadoop platform. Experience in workload migration between on-premise and cloud environments, programming with MapReduce and Spark, as well as Java (core Java), J2EE technologies, Python, Scala, Unix, and Bash scripts is crucial. Strong analytical, problem-solving, and research skills are necessary, along with the ability to think innovatively and independently.

This position requires 3-7 years of relevant work experience and is based in Gurgaon. The ideal educational background includes a B.E./B.Tech. or M.E./M.Tech. in Computer Science or Electronics Engineering, or an MCA.

Posted 5 days ago

Apply

4.0 - 9.0 years

8 - 13 Lacs

Hyderabad

Work from Office

About ValGenesis

ValGenesis is a leading digital validation platform provider for life sciences companies. The ValGenesis suite of products is used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance, and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ

About the Role:

We are looking for experienced product development engineers/experts to join our cloud product engineering team and build the next generation of applications for our global customers. If you are a technology enthusiast with a passion for developing enterprise cloud products with quality, security, and performance, we are eager to discuss the potential role with you.

Responsibilities:

- Understand the business requirements and technical constraints, and architect/design/develop solutions accordingly.
- Participate in the complete development life cycle.
- Review the architecture/design/code of self and others.
- Develop enterprise application features/services using Azure cloud services, C# .NET Core, ReactJS, etc., implementing DevSecOps principles.
- Own and be accountable for the quality, performance, security, and sustenance of the respective product deliverables.
- Strive for self-excellence while enabling the success of the team and stakeholders.

Requirements

- 4 to 10 years of experience in developing enterprise software products
- Strong knowledge of C#, .NET Core, Azure DevOps
- Working knowledge of JS frameworks, preferably ReactJS
- Experience in container-based development, AKS, Service Fabric, etc.
- Experience with messaging queues like RabbitMQ and Kafka
- Experience with Azure services like Azure Logic Apps and Azure Functions
- Experience with databases like SQL Server and PostgreSQL
- Knowledge of reporting solutions like Power BI and Apache Superset
- Knowledge of Micro-Services and/or Micro-Frontend architecture
- Knowledge of code quality, code monitoring, performance engineering, and test automation tools

We're on a Mission

In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join

- Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity's quality of life, and we honor that mission.
- We work together. We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done.
- We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward.
- We're in it to win it. We're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.

How We Work

Our Chennai, Hyderabad, and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company.

ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

Sundew is a leading digital transformation company with a 17-year legacy of excellence that helps businesses leverage data-driven insights to drive innovation and optimize operations. We are seeking a talented Data Engineer with expertise in Robotic Process Automation (RPA) and Artificial Intelligence/Machine Learning (AI/ML) to join our dynamic team.

As a Data Engineer with RPA and AI/ML expertise, you will play a crucial role in designing, implementing, and maintaining data pipelines, workflows, and automation solutions. Your responsibilities will include developing and fine-tuning LLM models, with expertise in RAG (Retrieval-Augmented Generation) and content extraction, and familiarity with the concept of local LLMs. Additionally, you will collaborate closely with our Data Engineering team to innovate and optimize LLM models, and work with cross-functional teams to integrate RPA infrastructure, enabling clients to automate existing processes.

Role & Responsibilities

- Develop and optimize LLM models using state-of-the-art techniques, focusing on RAG (Retrieval-Augmented Generation).
- Implement content extraction methods to enhance the accuracy and relevance of generated responses.
- Collaborate with cross-functional teams to identify and address LLM development and deployment challenges.
- Stay updated with the latest advancements in natural language processing (NLP) and machine learning research.
- Provide technical guidance and mentorship to junior team members.
- Contribute to documenting and disseminating best practices in LLM development and deployment.
- Implement RPA solutions to automate repetitive tasks and improve operational efficiency.
- Collaborate with data scientists and business analysts to understand data requirements and translate them into technical solutions.
- Monitor performance, troubleshoot issues, and optimize data workflows for scalability and reliability.
- Stay up to date with the latest trends and technologies in data engineering, RPA, and AI/ML.

Required Skill Sets

- Bachelor's or higher degree in Computer Science, Engineering, or a related field.
- Proven experience in developing and fine-tuning LLM models, preferably with a focus on RAG.
- Hands-on experience with RPA tools such as UiPath, Automation Anywhere, Blue Prism, or Power Automate.
- Strong proficiency in Python programming and familiarity with relevant libraries/frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Solid understanding of natural language processing (NLP) fundamentals and machine learning principles.
- Experience with content extraction techniques such as Named Entity Recognition (NER), text summarization, and information retrieval.
- Familiarity with the concept of local LLMs and their applications in specific domains or languages.
- Excellent problem-solving skills and the ability to work effectively in a collaborative team environment.
- Strong communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.

Preferred Qualifications

- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Prior working knowledge of implementing RPA and AI/ML methods.
- Certification in RPA or AI/ML technologies.
- Contributions to open-source projects related to natural language processing or machine learning.
- Knowledge of big data technologies (e.g., Hadoop, Spark, Kafka).

How To Apply

Interested candidates should submit their resume and a cover letter detailing their relevant experience and explaining how they meet the specified requirements. Sundew is an equal opportunity employer; we encourage candidates of all backgrounds to apply.
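The RAG workflow this listing centers on can be sketched in a few lines. This is a toy illustration, not any particular implementation: retrieval here is keyword overlap and the "prompt" is a plain string, where a production system would use embeddings, a vector store, and an actual LLM call.

```python
# Illustrative sketch of the Retrieval-Augmented Generation (RAG) flow:
# retrieve the most relevant documents, then assemble them into the
# prompt an LLM would receive. Documents and query are made up.

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query; keep the top k."""
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the query with retrieved context before calling the model."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Invoices are processed within 30 days of receipt.",
    "The cafeteria opens at 8 am on weekdays.",
    "Late invoices require manager approval before payment.",
]
prompt = build_prompt("How are late invoices handled?", docs)
```

The payoff of RAG is visible even in this sketch: the irrelevant cafeteria document never reaches the model, so its answer is grounded only in the retrieved policy text.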

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

The Applications Development Manager position is an intermediate management role in which you will lead and direct a team of employees to establish and implement new or improved application systems and programs, in collaboration with the Technology team. Your main objective will be to oversee applications systems analysis and programming activities.

Your responsibilities will include conducting feasibility studies, time and cost estimates, IT planning, risk technology assessment, and applications development, and establishing new or revised applications systems and programs to meet specific business needs. You will also monitor and control all phases of the development process, provide user and operational support on applications, and utilize your specialty knowledge to analyze complex problems and provide evaluative judgments. Additionally, you will recommend and develop security measures post-implementation, consult with users/clients and technology groups, recommend advanced programming solutions, and ensure essential procedures are followed. You will also serve as an advisor to new or lower-level analysts, operate with a limited level of direct supervision, and act as a subject matter expert to senior stakeholders and team members.

Qualifications for this role include at least 8 to 11 years of hands-on application development experience. The primary skill set required includes Java 17, Spring Boot 3.0, PL/SQL (Oracle 19c), DB design and optimization, microservices (Spring Boot), web services, design patterns, Kafka, and Elasticsearch. It would be beneficial to have knowledge of Spark, Node.js, Jenkins, OpenShift, Autosys, Angular 14+, and JavaScript. Experience in systems analysis, software application programming, microservice development, distributed system design, managing successful projects, and working under pressure is essential. The ideal candidate will possess a Bachelor's degree or equivalent experience.

This job description is a high-level overview of the responsibilities involved; other duties may be assigned as needed.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Manager at Autodesk, you will lead the BI and Data Engineering Team to develop and implement business intelligence solutions. Your role is crucial in empowering decision-makers through trusted data assets and scalable self-serve analytics. You will oversee the design, development, and maintenance of data pipelines, databases, and BI tools to support data-driven decision-making across the CTS organization. Reporting to the leader of the CTS Business Effectiveness department, you will collaborate with stakeholders to define data requirements and objectives. Your responsibilities will include leading and managing a team of data engineers and BI developers, fostering a collaborative team culture, managing data warehouse plans, ensuring data quality, and delivering impactful dashboards and data visualizations. You will also collaborate with stakeholders to translate technical designs into business-appropriate representations, analyze business needs, and create data tools for analytics and BI teams. Staying up to date with data engineering best practices and technologies is essential to ensure the company remains ahead of the industry. To qualify for this role, you should have 3 to 5 years of experience managing data teams and a BA/BS in Data Science, Computer Science, Statistics, Mathematics, or a related field. Proficiency in Snowflake, Python, SQL, Airflow, Git, and big data environments like Hive, Spark, and Presto is required. Experience with workflow management, data transformation tools, and version control systems is preferred. Additionally, familiarity with Power BI, AWS environment, Salesforce, and remote team collaboration is advantageous. The ideal candidate is a data ninja and leader who can derive insights from disparate datasets, understand Customer Success, tell compelling stories using data, and engage business leaders effectively. At Autodesk, we are committed to creating a culture where everyone can thrive and realize their potential. 
Our values and ways of working help our people succeed, leading to better outcomes for our customers. If you are passionate about shaping the future and making a meaningful impact, join us in our mission to turn innovative ideas into reality. Autodesk offers a competitive compensation package based on experience and location; in addition to base salaries, we provide discretionary annual cash bonuses, commissions, stock grants, and a comprehensive benefits package. If you are interested in a sales career at Autodesk or want to learn more about our commitment to diversity and belonging, please visit our website for more information.

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

ahmedabad, gujarat

On-site

We are looking for a Data Engineer with over 5 years of experience to join our team in Ahmedabad. As a Data Engineer, you will play a key role in transforming raw data into valuable insights and creating scalable data infrastructure. Your responsibilities will include designing data pipelines, optimizing data systems, and supporting data-driven decision-making.

Key responsibilities of the role include:

- Architecting, building, and maintaining scalable data pipelines from various sources.
- Designing effective data storage, retrieval mechanisms, and data models for analytics.
- Implementing data validation, transformation, and quality monitoring processes.
- Collaborating with cross-functional teams to deliver data-driven solutions.
- Identifying bottlenecks, optimizing workflows, and providing mentorship to junior engineers.

We are looking for a candidate with:

- 4+ years of hands-on experience in Data Engineering.
- Proficiency in Python and data pipeline design.
- Experience with Big Data tools like Hadoop, Spark, and Hive.
- Strong skills in SQL, NoSQL databases, and data warehousing solutions.
- Knowledge of cloud platforms, especially Azure.
- Familiarity with distributed computing, data modeling, and performance tuning.
- Understanding of DevOps, Power Automate, and Microsoft Fabric (a plus).
- Strong analytical thinking, collaboration skills, excellent communication skills, and the ability to work independently or as part of a team.

Qualifications required for this position include a Bachelor's degree in Computer Science, Data Science, or a related field. If you are passionate about data engineering and have the necessary expertise, we encourage you to apply and be a part of our innovative team in Ahmedabad.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Title: Python / AI-ML Developer
Experience Required: 2–5 Years
Location: Kolkata
Job Type: Full-time

Job Summary:

We are looking for a passionate and skilled Python / AI-ML Developer with 2–5 years of experience to join our data science and engineering team. The ideal candidate will be responsible for building intelligent solutions, designing machine learning models, and developing scalable Python-based systems. This role offers the opportunity to work on real-world data-driven projects with a focus on innovation and performance.

Key Responsibilities:

- Design, build, and deploy machine learning models and AI-driven solutions.
- Write clean and efficient Python code for data preprocessing, model training, and evaluation.
- Work with large datasets to extract insights and create predictive models.
- Collaborate with cross-functional teams including data scientists, engineers, and business stakeholders.
- Deploy models into production using REST APIs or cloud services.
- Monitor and retrain models to ensure accuracy and efficiency over time.
- Research and experiment with new algorithms and AI techniques.

Required Skills and Qualifications:

- 2–5 years of hands-on experience in Python development with a focus on AI/ML.
- Solid understanding of machine learning algorithms (supervised, unsupervised, deep learning).
- Experience with ML libraries such as Scikit-learn, TensorFlow, PyTorch, Keras, or XGBoost.
- Proficiency in data handling using Pandas and NumPy, and in data visualization tools like Matplotlib or Seaborn.
- Experience in training, evaluating, and tuning models using appropriate metrics.
- Strong foundation in mathematics and statistics (linear algebra, probability, optimization).
- Familiarity with REST APIs and model deployment strategies.
- Experience with Git and collaborative development.

Preferred Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Science, AI, Engineering, or a related field.
- Experience with cloud platforms (AWS, Azure, GCP) and MLOps tools.
- Exposure to Natural Language Processing (NLP), Computer Vision, or Recommendation Systems.
- Knowledge of Big Data tools (Spark, Hadoop) is a plus.
- Understanding of CI/CD pipelines and containerization using Docker/Kubernetes.

Posted 5 days ago

Apply

10.0 - 15.0 years

17 - 30 Lacs

Pune

Work from Office

Dear Candidate,

This is with reference to an opportunity for Senior Tech Lead (Databricks) professionals. Please find below the job description.

Responsibilities:

- Lead the design and implementation of Databricks-based data solutions.
- Architect and optimize data pipelines for batch and streaming data.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and deliverables.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in Databricks environments.
- Stay updated on the latest Databricks features and industry trends.

Key Technical Skills & Responsibilities

- Experience in data engineering using Databricks or Apache Spark-based platforms.
- Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion.
- Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse.
- Proficiency in programming languages such as Python, Scala, and SQL for data processing and transformation.
- Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing.
- Familiarity with Delta Lake, Delta Live Tables, and the medallion architecture for data lakehouse implementations.
- Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation.
- Ability to design and implement Azure Key Vault and scoped credentials.
- Knowledge of Git for source control and CI/CD integration for Databricks workflows, cost optimization, and performance tuning.
- Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups.
- Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus.
- Ability to define best practices, support multiple projects, and mentor junior engineers is a plus.
- Must have experience working with streaming data sources; Kafka preferred.

Eligibility Criteria:

- Bachelor's degree in Computer Science, Data Engineering, or a related field
- Extensive experience with Databricks, Delta Lake, PySpark, and SQL
- Databricks certification (e.g., Certified Data Engineer Professional)
- Experience with machine learning and AI integration in Databricks
- Strong understanding of cloud platforms (AWS, Azure, or GCP)
- Proven leadership experience in managing technical teams
- Excellent problem-solving and communication skills
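The medallion (bronze/silver/gold) layering named in this listing can be illustrated without a Spark cluster. The records and field names below are invented; in Databricks each layer would be a Delta table and each function a Spark transformation, but ordinary lists of dicts make the flow visible.

```python
# Plain-Python sketch of the medallion (bronze -> silver -> gold) pattern.
# Bronze holds data as ingested (duplicates and bad rows included),
# silver is cleaned and typed, gold is a business-level aggregate.

raw_events = [  # bronze layer: raw ingested records
    {"order_id": 1, "amount": "100.0", "country": "IN"},
    {"order_id": 1, "amount": "100.0", "country": "IN"},  # duplicate
    {"order_id": 2, "amount": None, "country": "US"},     # bad record
    {"order_id": 3, "amount": "250.5", "country": "IN"},
]

def to_silver(bronze: list[dict]) -> list[dict]:
    """Silver: deduplicate on order_id, drop invalid rows, cast types."""
    seen, silver = set(), []
    for row in bronze:
        if row["order_id"] in seen or row["amount"] is None:
            continue
        seen.add(row["order_id"])
        silver.append({**row, "amount": float(row["amount"])})
    return silver

def to_gold(silver: list[dict]) -> dict:
    """Gold: business-level aggregate, revenue per country."""
    totals: dict = {}
    for row in silver:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(raw_events))  # {"IN": 350.5}
```

The design point the pattern makes is that each layer is reproducible from the one below it, so a bug in the silver cleaning logic can be fixed and replayed from bronze without re-ingesting anything.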

Posted 5 days ago

Apply