3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Database Designer / Senior Data Engineer at VE3, you will be responsible for architecting and designing modern, scalable data platforms on AWS and/or Azure, ensuring best practices for security, cost optimization, and performance. You will develop detailed data models and document data dictionaries and lineage to support data solutions. Additionally, you will build and optimize ETL/ELT pipelines using languages such as Python, SQL, and Scala, services like AWS Glue and Azure Data Factory, and open-source frameworks like Spark and Airflow.

Collaboration is key in this role, as you will work closely with data analysts, BI teams, and stakeholders to translate business requirements into data solutions and dashboards. You will also partner with DevOps/Cloud Ops to automate CI/CD for data code and infrastructure, ensuring governance, security, and compliance standards such as GDPR and ISO 27001 are met. Monitoring, alerting, and data quality frameworks will be implemented to maintain data integrity. As a mentor, you will guide junior engineers and stay updated on emerging big data and streaming technologies to enhance our toolset.

The ideal candidate should have a Bachelor's degree in Computer Science, Engineering, IT, or a similar field, with at least 3 years of hands-on experience in a Database Designer / Data Engineer role within a cloud environment. Technical skills required include expertise in SQL, proficiency in Python or Scala, and familiarity with cloud services like AWS (Glue, S3, Kinesis, RDS) or Azure (Data Factory, Data Lake Storage, SQL Database). Strong communication skills are essential, along with an analytical mindset to address performance bottlenecks and scaling challenges. A collaborative attitude in agile/scrum settings is highly valued.

Nice-to-have qualifications include certifications in AWS or Azure data analytics, exposure to data science workflows, experience with containerized workloads, and familiarity with DataOps practices and tools. At VE3, we are committed to fostering a diverse and inclusive environment where every voice is heard and every idea can contribute to tomorrow's breakthrough.
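As a rough sketch of the kind of batch ETL/ELT step this role describes (not part of the posting), the following minimal PySpark example assumes hypothetical S3 paths and column names:

```python
# Minimal PySpark ELT sketch: read raw CSV, clean and type it, write a partitioned curated table.
# The S3 paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_elt").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-raw/orders/")  # hypothetical landing zone

clean = (
    raw.dropDuplicates(["order_id"])                        # de-duplicate on the business key
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalise timestamps
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))     # partition key for the curated zone
       .filter(F.col("order_id").isNotNull())
)

(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated/orders/"))             # hypothetical curated zone
```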
Posted 1 day ago
5.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Data Analytics and Quality Engineer with 7 to 14 years of experience, you will play a crucial role in ensuring the quality of analytics products within our organization. Your responsibilities will include designing and documenting testing scenarios, creating test plans, and reviewing quality specifications and technical designs for both existing and new analytics products. You will collaborate closely with the Data & Analytics team to drive data quality programs and implement automated test frameworks within an agile team structure.

Your expertise in QA processes, mentoring, ETL testing, data validation, data quality, and knowledge of RCM or US Healthcare will be essential in this role. Proficiency in SQL (T-SQL or PL/SQL) is a must, while knowledge of Python is a plus. Hands-on experience with tools like SSMS, Toad, BI tools (Tableau, Power BI), SSIS, ADF, and Snowflake will be beneficial. Familiarity with data testing tools like Great Expectations, Deequ, dbt, and Pytest for data scripts is desirable.

Your educational background should include a Bachelor's degree in Computer Science, Information Technology, Data Science, Math, Finance, or a related field, along with a minimum of 5 years of experience as a quality assurance engineer or data analyst with a strong focus on data quality. Preferred qualifications include QA-related certifications and a strong understanding of US healthcare revenue cycle and billing.

In this role, you will be responsible for test execution for healthcare analytics, creation of detailed test plans and test cases, and ensuring that production system defects are documented and resolved promptly. Your ability to design testing procedures, write testing scripts, and monitor testing results according to best practices will be crucial in ensuring that our analytics meet established quality standards. Your knowledge of test case management tools, Agile development tools, data quality frameworks, and automated testing tools will be valuable assets in this position. Additionally, your proficiency in SQL, ability to test data systems for performance and scalability, and strong analytical skills will contribute to the success of our analytics products. Strong communication skills, process improvement abilities, and time management skills are also essential for this role.

If you are looking to join a growing and innovative organization where you can work with new technology in both manual and automation testing environments, this Senior Quality Assurance Engineer position is an ideal opportunity for you.
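As a rough illustration of the "Pytest for data scripts" and SQL data-validation skills listed above (not part of the posting), here is a minimal reconciliation test; the connection string and table names are hypothetical:

```python
# Minimal pytest sketch for reconciling a reporting table against its staging source.
# The connection string and table names are hypothetical placeholders.
import sqlalchemy as sa

ENGINE = sa.create_engine("mssql+pyodbc://user:pass@example_dsn")  # hypothetical SQL Server DSN


def scalar(query: str):
    """Run a query and return its single scalar result."""
    with ENGINE.connect() as conn:
        return conn.execute(sa.text(query)).scalar()


def test_row_counts_match():
    # The load should move every staged claim into the reporting layer.
    assert scalar("SELECT COUNT(*) FROM stg.claims") == scalar("SELECT COUNT(*) FROM rpt.claims")


def test_no_null_claim_ids():
    # The business key must never be null after the load.
    assert scalar("SELECT COUNT(*) FROM rpt.claims WHERE claim_id IS NULL") == 0
```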
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Sr. Data Analytics Engineer at Ajmera Infotech Private Limited (AIPL) in Bengaluru/Bangalore, you will play a crucial role in building planet-scale software for NYSE-listed clients, enabling decisions that are critical and must not fail. With 5-9 years of experience, you will join a 120-engineer team specializing in highly regulated domains such as HIPAA, FDA, and SOC 2. The team delivers production-grade systems that transform data into a strategic advantage.

You will have the opportunity to make end-to-end impact by building full-stack analytics solutions ranging from lakehouse pipelines to real-time dashboards. Fail-safe engineering practices such as TDD, CI/CD, DAX optimization, Unity Catalog, and cluster tuning will be part of your daily routine. You will work with a modern stack including Databricks, PySpark, Delta Lake, Power BI, and Airflow. As part of a mentorship culture, you will lead code reviews, share best practices, and grow as a domain expert. You will operate in a mission-critical context, helping enterprises migrate legacy analytics into cloud-native, governed platforms with a compliance-first mindset in HIPAA-aligned environments.

Your key responsibilities will include building scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks, orchestrating workflows with Databricks Workflows or Airflow, designing dimensional models with Unity Catalog and Great Expectations validation, delivering robust Power BI solutions, migrating legacy SSRS reports to Power BI, optimizing compute and cost, and collaborating cross-functionally to convert product analytics needs into resilient BI assets.

To excel in this role, you must have 5+ years of experience in analytics engineering, with at least 3 years in production Databricks/Spark contexts. Advanced skills in SQL, PySpark, Delta Lake, Unity Catalog, and Power BI are essential. Experience with SSRS-to-Power BI migration, Git, CI/CD, and cloud platforms like Azure/AWS, along with strong communication skills, is also required. Nice-to-have skills include certifications such as Databricks Data Engineer Associate, experience with streaming pipelines, data quality frameworks like dbt and Great Expectations, familiarity with BI platforms like Tableau and Looker, and cost governance knowledge.

Ajmera offers competitive compensation, flexible hybrid schedules, and a deeply technical culture where engineers lead the narrative. If you are passionate about building reliable, audit-ready data products and want to take ownership of systems from raw ingestion to KPI dashboards, apply now and engineer insights that matter.
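As a hedged sketch of the "Delta Live Tables on Databricks" pipelines described above (not part of the posting): a minimal DLT fragment using DLT's own built-in expectations (not the Great Expectations library). It only runs inside a Databricks DLT pipeline, and the landing path, table names, and rule are hypothetical.

```python
# Minimal Delta Live Tables sketch: a bronze ingest table and a silver table guarded by a quality rule.
# Runs only inside a Databricks DLT pipeline; the landing path, names, and rule are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw events landed as-is (bronze).")
def events_bronze():
    # `spark` is provided by the DLT runtime.
    return spark.read.json("/mnt/example/raw/events/")


@dlt.table(comment="Typed, validated events (silver).")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # rows failing the rule are dropped
def events_silver():
    return (
        dlt.read("events_bronze")
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .select("event_id", "event_ts", "payload")
    )
```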
Posted 5 days ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant, Data Engineer. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and should be willing to learn new technologies.

Responsibilities
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
- Generate, present, and develop ideas for progressing the data environment, such as common frameworks or methodologies, which the IM IT division will then use.
- Possess technical curiosity to explore new features in existing tools and technologies, as well as new methodologies, features, and tools that can be adopted by the IM IT division.
- Share, instruct, and coach the Investment Technology division on data topics such as best practices, design methodologies, and query optimization.
- Organize, liaise, and work with other development teams to onboard applications and processes onto the new data architecture (cloud technologies, replication, etc.).
- Learn and become the subject matter expert on tools and technologies.
- Act as a liaison to the various development teams and other data teams to market and proliferate the data architecture doctrines, principles, and standards.
- Help devise and implement pragmatic data governance principles and methodology.
- Perform detailed data analysis and validation.
- Assist with preparation, coordination, and execution of User Acceptance Testing.

Qualifications we seek in you!
Minimum Qualifications
- BE/B Tech/MCA
- Excellent written and verbal communication skills

Preferred Qualifications/Skills
- Python, SQL, Spark/PySpark, AWS Glue, AWS Aurora (preferably Postgres), AWS S3, dbt.
- Strong relational database design.
- Experience with multi-temporal data.
- Great Expectations, Jasper Reports, workflow experience (BPMN), .NET.
- Excellent verbal and written skills.
- Ability to work in a team environment.
- Ability to work effectively and efficiently with supervision.
- Capable of managing multiple tasks with tight deadlines.
- Possess strong analytical ability and excellent attention to detail.
- Maintain a strong commitment to quality.
- Strong Excel skills.
- Strong analytical skills.
- Tools/technologies used: SQL, SQL Server, ETL tools, Autosys; Snowflake/cloud/Azure experience a plus.

Why join Genpact?
- Lead AI-first transformation - build and scale AI solutions that redefine industries.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best - learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI - work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
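Great Expectations appears in the preferred skills above; as a hedged sketch (not from the posting), here is the older pandas-style API from 0.x releases (newer releases use a context-based API instead), with a hypothetical file and column names:

```python
# Minimal Great Expectations sketch using the classic 0.x pandas-style API.
# The CSV path and column names are hypothetical; newer GX releases use a context-based API.
import great_expectations as ge

positions = ge.read_csv("positions.csv")  # returns a DataFrame wrapped with expectation methods

checks = [
    positions.expect_column_values_to_not_be_null("position_id"),
    positions.expect_column_values_to_be_unique("position_id"),
    positions.expect_column_values_to_be_between("quantity", min_value=0),
]

# Each result exposes a boolean `success` flag plus details about any failing values.
for check in checks:
    print(check.expectation_config.expectation_type, check.success)
```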
Posted 1 week ago
2.0 - 7.0 years
7 - 17 Lacs
Bengaluru
Remote
Job Title: AI Engineer - Quality & Performance

Job Description: We are seeking a highly skilled AI Engineer - Quality & Performance to ensure the accuracy, robustness, and ethical integrity of AI-generated responses. In this role, you will leverage AI validation frameworks, adversarial testing tools, and responsible AI principles to assess and enhance AI systems. If you are passionate about AI security, performance evaluation, and responsible AI practices, we'd love to hear from you!

Key Responsibilities:
- AI Response Validation: Evaluate AI-generated responses for accuracy, coherence, relevance, and alignment with defined standards.
- AI Red Teaming: Utilize tools such as Pyrit, NVIDIA Garak, Giskard AI, Attest, and TruLens to conduct adversarial testing and identify vulnerabilities, biases, or security risks in AI models.
- AI Ground Truth Check: Develop and refine validation datasets using frameworks like PromptFlow, Cerberus, CheckList, and FairBench to ensure factual integrity and reliability of AI outputs.
- AI Performance Evaluation: Implement benchmarking strategies using tools such as OpenAI Evals, DeepEval, Dynabench, and LM Harness to measure AI model efficiency, robustness, and responsiveness.
- AI Quality Assurance: Apply automated validation frameworks such as DeepChecks, Great Expectations, and Robustness Gym to maintain high standards in AI response generation.
- Responsible AI Validation: Leverage tools like Fairlearn, Aequitas, and AI Fairness 360 to assess AI fairness, mitigate biases, and enforce ethical AI principles.
- Prompt Rewriting & Validation: Analyze and refine AI prompts using PromptFoo and Helium to optimize response quality and consistency across diverse applications.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, AI, Engineering, or a related field (or equivalent practical experience).
- Experience: 2-8 years of experience in AI engineering, software development, or automation testing; minimum 1 year of experience working with red teaming frameworks (e.g., Pyrit, Attest) and AI validation tools (e.g., Fairlearn, DeepChecks); strong understanding of AI security, fairness principles, and adversarial testing methodologies.
- Technical Skills: Proficiency in programming languages such as Python, JavaScript, or relevant AI evaluation scripting; experience in AI bias detection, robustness testing, and prompt engineering; familiarity with machine learning workflows, AI policy frameworks, and ethical AI implementation.

Key Competencies:
- Strong analytical and problem-solving skills in AI validation and red teaming.
- Attention to detail when assessing AI-generated responses.
- Creativity and critical thinking in designing and refining AI prompts.
- Excellent written and verbal communication skills for reporting AI evaluation results.
- Ability to work in a fast-paced, collaborative environment with multidisciplinary teams.

Why Join Us?
- Work with cutting-edge AI validation frameworks to shape the future of responsible AI.
- Engage in AI security, fairness assessments, and red teaming to enhance AI resilience.
- Be part of an innovative and dynamic team at the forefront of AI model evaluation.
- Competitive compensation package with strong career growth opportunities.
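As a toy, hedged illustration of the Fairlearn-based fairness assessment named above (not part of the posting); the labels, predictions, and sensitive-feature values below are made up:

```python
# Minimal Fairlearn sketch: compare selection rates across a sensitive feature on toy data.
# All labels, predictions, and group values below are made-up placeholders.
from fairlearn.metrics import MetricFrame, demographic_parity_difference, selection_rate

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
group = ["a", "a", "a", "a", "b", "b", "b", "b"]  # hypothetical sensitive feature

frame = MetricFrame(metrics=selection_rate, y_true=y_true, y_pred=y_pred, sensitive_features=group)
print(frame.by_group)  # selection rate per group
print(demographic_parity_difference(y_true, y_pred, sensitive_features=group))  # gap between groups
```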
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Title: Senior Data Engineer (4-6 Years Experience)
Location: Kotak Life HO
Department: Data Science & Analytics
Employment Type: Full-Time

About the Role: We are seeking a highly skilled Data Engineer with 4-6 years of hands-on experience in designing and developing scalable, reliable, and efficient data solutions. The ideal candidate will have a strong background in cloud platforms (AWS or Azure), experience in building both batch and streaming data pipelines, and familiarity with modern data architectures, including event-driven and medallion architectures.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines (batch and streaming) to process structured and unstructured data from various sources.
- Develop and implement solutions based on event-driven architectures using technologies like Kafka, Event Hubs, or Kinesis.
- Architect and manage data workflows based on the Medallion architecture (Bronze, Silver, Gold layers); a minimal illustration of this ingestion pattern follows the posting.
- Work with cloud platforms (AWS or Azure) to manage data infrastructure, storage, compute, and orchestration services.
- Leverage cloud-native or open-source tools for data transformation, orchestration, monitoring, and quality checks.
- Collaborate with data scientists, analysts, and product managers to deliver high-quality data solutions.
- Ensure best practices in data governance, security, lineage, and observability.

Required Skills & Qualifications:
- 4-6 years of professional experience in data engineering or related roles.
- Strong experience in cloud platforms: AWS (e.g., S3, Glue, Lambda, Redshift) or Azure (e.g., Data Lake, Synapse, Data Factory, Functions).
- Proven expertise in building batch and streaming pipelines using tools like Spark, Flink, Kafka, Kinesis, or similar.
- Practical knowledge of event-driven architectures and experience with message/event brokers.
- Hands-on experience implementing Medallion architecture or similar layered data architectures.
- Familiarity with data orchestration tools (e.g., Airflow, Azure Data Factory, AWS Step Functions).
- Proficiency in SQL, Python, or Scala for data processing and pipeline development.
- Exposure to open-source tools in the modern data stack (e.g., dbt, Delta Lake, Apache Hudi, Great Expectations).

Preferred Qualifications:
- Experience with containerization and CI/CD for data workflows (Docker, GitHub Actions, etc.).
- Knowledge of data quality frameworks and observability tooling.
- Experience with Delta Lake or Lakehouse implementations.
- Strong problem-solving skills and ability to work in fast-paced environments.

What We Offer:
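As referenced in the responsibilities above (and not part of the posting itself), a hedged sketch of landing an event stream into the Bronze layer of a Medallion-style lakehouse; the broker address, topic, and storage paths are hypothetical, and the Kafka source requires the spark-sql-kafka connector on the cluster.

```python
# Minimal Structured Streaming sketch: land a Kafka topic into a bronze Delta table (Medallion entry point).
# Broker, topic, and paths are hypothetical; the Kafka source requires the spark-sql-kafka connector.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_bronze").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
         .option("subscribe", "policy-events")               # hypothetical topic
         .option("startingOffsets", "earliest")
         .load()
)

# Bronze layer: keep the raw payload untouched so Silver/Gold tables can always be rebuilt from it.
(events.selectExpr("CAST(key AS STRING) AS event_key",
                   "CAST(value AS STRING) AS raw_payload",
                   "timestamp AS ingest_ts")
       .writeStream.format("delta")
       .option("checkpointLocation", "/lake/_checkpoints/events_bronze")
       .start("/lake/bronze/events"))
```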
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant - PySpark/Python Data Engineer!

We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also be working on data engineering problems and building data pipelines. You will get ample opportunities to work on challenging and innovative projects, using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Responsibilities
- Develop, test, and maintain high-quality solutions using PySpark/Python.
- Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines.
- Collaborate with cross-functional teams to identify and solve complex problems.
- Write clean and reusable code that can be easily maintained and scaled.
- Keep up to date with emerging trends and technologies in Python development.

Qualifications we seek in you!
Minimum qualifications
- years of experience as a Python Developer with a strong portfolio of projects.
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF.
- In-depth understanding of the Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch.
- Experience with data platforms such as Databricks/Snowflake.
- Experience with front-end development using HTML or Python.
- Familiarity with database technologies such as SQL and NoSQL.
- Excellent problem-solving ability with solid communication and collaboration skills.

Preferred skills and qualifications
- Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid.
- Knowledge of GenAI concepts and LLMs.
- Contributions to open-source Python projects or active involvement in the Python community.

Why join Genpact?
- Lead AI-first transformation - build and scale AI solutions that redefine industries.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best - learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI - work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
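Since pipeline development with AWS Glue is called out in the qualifications above, here is a hedged sketch (not from the posting) of a minimal Glue job script; it only runs inside a Glue job environment, and the catalog database, table, filter, and output path are hypothetical.

```python
# Minimal AWS Glue job sketch: read a catalogued table, filter it, and write Parquet to S3.
# Runs only as a Glue job; the database, table, filter, and output path are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog as a DynamicFrame, then continue with the Spark DataFrame API.
orders = glue_context.create_dynamic_frame.from_catalog(database="example_db", table_name="orders")
completed = orders.toDF().filter("order_status = 'COMPLETE'")

completed.write.mode("overwrite").parquet("s3://example-curated/orders_complete/")

job.commit()
```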
Posted 1 month ago
0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - PySpark/Python Data Engineer!

We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also be working on data engineering problems and building data pipelines. You will get ample opportunities to work on challenging and innovative projects, using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Responsibilities
- Develop, test, and maintain high-quality solutions using PySpark/Python.
- Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines.
- Collaborate with cross-functional teams to identify and solve complex problems.
- Write clean and reusable code that can be easily maintained and scaled.
- Keep up to date with emerging trends and technologies in Python development.

Qualifications we seek in you!
Minimum qualifications
- years of experience as a Python Developer with a strong portfolio of projects.
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF.
- In-depth understanding of the Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch.
- Experience with data platforms such as Databricks/Snowflake.
- Experience with front-end development using HTML or Python.
- Familiarity with database technologies such as SQL and NoSQL.
- Excellent problem-solving ability with solid communication and collaboration skills.

Preferred skills and qualifications
- Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid.
- Knowledge of GenAI concepts and LLMs.
- Contributions to open-source Python projects or active involvement in the Python community.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
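Dask is one of the libraries listed in the qualifications above; as a toy sketch (not from the posting) of an out-of-core aggregation, with hypothetical file names and columns:

```python
# Minimal Dask sketch: lazily read many CSV files and compute an aggregate out of core.
# The file pattern and column names are hypothetical placeholders.
import dask.dataframe as dd

trades = dd.read_csv("data/trades_*.csv", parse_dates=["trade_date"])

daily_notional = (
    trades.assign(notional=trades.price * trades.quantity)  # derived column, evaluated lazily
          .groupby("trade_date")["notional"]
          .sum()
)

print(daily_notional.compute())  # .compute() triggers the actual parallel work
```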
Posted 1 month ago
10.0 - 15.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About the Role: We are looking for a Senior Engineering Manager with 10+ years of experience, including 2+ years of people management experience, to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting.
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink; a minimal CDC-parsing sketch follows the posting.
- Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset.
- Maintain a good understanding of open table formats like Delta and Iceberg.
- Scale data quality frameworks to ensure data accuracy and reliability.
- Build data lineage tracking solutions for governance, access control, and compliance.
- Collaborate with engineering, analytics, and business teams to identify opportunities and build or enhance self-serve data platforms.
- Improve system stability, monitoring, and observability to ensure high availability of the platform.
- Work with open-source communities and facilitate contributions to OSS projects aligned with Myntra's tech stack.
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Management Responsibilities:
- Technical guidance: This role will play the engineering lead role for teams within Myntra Data Platform. You will provide technical leadership to a team of excellent data engineers; this requires the technical depth to make complex design decisions and the hands-on ability to lead by example.
- Execution and delivery: You will be expected to instill and follow good software development practices and ensure timely delivery of high-quality products. You should be familiar with agile practices and able to adapt them to the needs of the business, with a constant focus on product quality.
- Team management: You will be responsible for hiring and mentoring your team, helping individuals grow in their careers, having constant dialogue about their aspirations, and sharing prompt, clear, and actionable feedback about performance.

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
- 10+ years of experience in building large-scale data platforms.
- 2+ years of people management experience.
- Expertise in big data architectures using Databricks, Trino, and Debezium.
- Strong experience with streaming platforms, including Confluent Kafka.
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment.
- Experience implementing data quality checks using Great Expectations.
- Deep understanding of data lineage, metadata management, and governance practices.
- Strong knowledge of query optimization, cost efficiency, and scaling architectures.
- Familiarity with OSS contributions and keeping up with industry trends in data engineering.
Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to lead large-scale projects in a fast-paced, dynamic environment.
- Passion for continuous learning, open-source collaboration, and building best-in-class data products.
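As noted in the responsibilities above (and not part of the posting itself), a hedged sketch of parsing Debezium change events from Kafka with Spark Structured Streaming; the broker, topic, and row schema are hypothetical, and the payload layout assumes Debezium's common op/before/after envelope.

```python
# Minimal sketch: parse Debezium CDC events from Kafka into typed rows with Structured Streaming.
# Broker, topic, and row schema are hypothetical; assumes Debezium's usual op/before/after envelope
# and the spark-sql-kafka connector on the cluster.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import LongType, StringType, StructField, StructType

spark = SparkSession.builder.appName("orders_cdc").getOrCreate()

row_schema = StructType([
    StructField("id", LongType()),
    StructField("status", StringType()),
])
envelope = StructType([
    StructField("op", StringType()),      # c = create, u = update, d = delete, r = snapshot read
    StructField("before", row_schema),
    StructField("after", row_schema),
    StructField("ts_ms", LongType()),
])

changes = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
         .option("subscribe", "mysql.shop.orders")            # hypothetical Debezium topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), envelope).alias("c"))
         .where("c.op != 'd'")                                # upserts only; deletes handled elsewhere
         .select("c.op", "c.ts_ms", "c.after.*")
)

(changes.writeStream.format("delta")
        .option("checkpointLocation", "/lake/_checkpoints/orders_cdc")
        .start("/lake/bronze/orders_cdc"))
```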
Posted 1 month ago