2.0 - 6.0 years
0 Lacs
karnataka
On-site
At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization, delivering high-quality services to clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services; managing delivery teams, programs, commercials, performance, and delivery risk; and continuously improving and optimizing managed services processes, tools, and services.

As an Associate at PwC, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. At this level you will use feedback and reflection to develop self-awareness, demonstrate critical thinking, and bring order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. You will also seek exposure to different situations, environments, and perspectives; uphold the firm's code of ethics; demonstrate leadership capabilities; and work in a team environment that includes client interactions and cross-team collaboration.
Required Skills:
- AWS Cloud Engineer
- Minimum 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms
- Minimum 1-3 years of Operate/Managed Services/Production Support experience
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines
- Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS
- Building efficient ETL/ELT processes using industry-leading tools such as AWS services, PySpark, SQL, and Python
- Implementing data validation and cleansing procedures
- Monitoring and troubleshooting data pipelines
- Implementing and maintaining data security and privacy measures
- Strong communication, problem-solving, quantitative, and analytical abilities

Nice to Have:
- AWS certification

In our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients' enterprise through technology and human-enabled experiences. We focus on empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.
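The data validation and cleansing procedures this role calls for can be illustrated with a minimal, self-contained Python sketch. The schema (order_id, amount, order_date) and the `clean_orders` helper are hypothetical stand-ins, not part of the posting or any specific AWS service:

```python
# Illustrative validation/cleansing step of the kind that sits inside an
# ETL/ELT pipeline before loading into a warehouse. All names are hypothetical.
REQUIRED_FIELDS = ("order_id", "amount", "order_date")

def clean_orders(rows):
    """Drop incomplete records, coerce amounts to float, and
    de-duplicate on order_id (keeping the first occurrence)."""
    seen, cleaned = set(), []
    for row in rows:
        if any(row.get(f) in (None, "") for f in REQUIRED_FIELDS):
            continue  # reject records missing a required field
        if row["order_id"] in seen:
            continue  # drop duplicate keys
        seen.add(row["order_id"])
        cleaned.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "order_date": row["order_date"],
        })
    return cleaned

raw = [
    {"order_id": 1, "amount": "19.99", "order_date": "2024-01-05"},
    {"order_id": 1, "amount": "19.99", "order_date": "2024-01-05"},  # duplicate
    {"order_id": 2, "amount": "", "order_date": "2024-01-06"},       # missing amount
    {"order_id": 3, "amount": "5.50", "order_date": "2024-01-07"},
]
print(clean_orders(raw))  # only order_id 1 and 3 survive
```

In a production AWS pipeline the same checks would typically run inside a PySpark job or a Glue transform rather than plain Python, but the logic is the same.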
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
As an AI and Machine Learning Engineer at Dailoqa, you will play a crucial role in shaping the future of Financial Services clients and the company as a whole. Working directly with the founding team, you will apply the latest AI techniques to real-world problems faced by Financial Services clients. Your responsibilities will include designing, constructing, and enhancing datasets to evaluate and continually improve our solutions, as well as taking part in strategy and product ideation sessions that shape our product and solution roadmap.

Key Responsibilities:
- Agentic AI Development: Build scalable multi-modal Large Language Model (LLM) based AI agents using frameworks such as LangGraph, Microsoft AutoGen, or CrewAI.
- AI Research and Innovation: Research and develop innovative solutions for AI challenges such as Retrieval-Augmented Generation (RAG), semantic search, knowledge representation, tool usage, fine-tuning, and reasoning in LLMs.
- Technical Expertise: Demonstrate proficiency in a technology stack comprising Python, LlamaIndex / LangChain, PyTorch, HuggingFace, FastAPI, Postgres, SQLAlchemy, Alembic, OpenAI, Docker, Azure, TypeScript, and React.
- LLM and NLP Experience: Hands-on experience with LLMs, RAG architectures, Natural Language Processing (NLP), or applying Machine Learning to real-world problems.
- Dataset Development: A strong track record of constructing datasets for training and/or evaluating machine learning models.
- Customer Focus: Dive deep into the domain, understand the problem, and concentrate on delivering value to the customer.
- Adaptability: Thrive in a fast-paced environment and bring enthusiasm for joining an early-stage venture.
- Model Deployment and Management: Automate model deployment, monitoring, and retraining processes.
- Collaboration and Optimization: Collaborate with data scientists to review, refactor, and optimize machine learning code.
- Version Control and Governance: Implement version control and governance for models and data.

Required Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 4-8 years of experience in MLOps, DevOps, or similar roles.
- Strong programming experience and familiarity with Python-based deep learning frameworks such as PyTorch, JAX, and TensorFlow.
- Proficiency in cloud platforms (AWS, Azure, or GCP) and infrastructure-as-code tools like Terraform.

Desired Skills:
- Experience with experiment tracking and model versioning tools.
- Proficiency with the technology stack: Python, LlamaIndex / LangChain, PyTorch, HuggingFace, FastAPI, Postgres, SQLAlchemy, Alembic, OpenAI, Docker, Azure, TypeScript, React.
- Knowledge of data pipeline orchestration tools like Apache Airflow or Prefect.
- Familiarity with software testing and test automation practices.
- Understanding of ethical considerations in machine learning deployments.
- Strong problem-solving skills and ability to work in a fast-paced environment.
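The retrieval step at the heart of the RAG architectures this role mentions can be sketched with a toy in-memory corpus. The three-dimensional "embeddings", document texts, and the `retrieve` helper below are hypothetical stand-ins for a real embedding model and vector store:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, k=2):
    """Return the texts of the k documents most similar to the query —
    the 'R' in RAG; the retrieved texts would then be stuffed into the
    LLM prompt as grounding context."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

corpus = [
    {"text": "loan eligibility rules", "vec": [0.9, 0.1, 0.0]},
    {"text": "fraud detection alerts", "vec": [0.1, 0.9, 0.1]},
    {"text": "mortgage rate tables",   "vec": [0.8, 0.2, 0.1]},
]
print(retrieve([1.0, 0.0, 0.0], corpus))
```

In practice the vectors come from an embedding model and the search runs against an indexed store (for example via LlamaIndex or LangChain, both named in the stack above), but the ranking idea is the same.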
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be joining an innovative company that is revolutionizing retail checkout experiences by using cutting-edge Computer Vision technology to replace traditional barcodes. Our platform creates seamless, faster, and smarter checkout processes, enhancing the shopping experience for both retailers and consumers. As we grow rapidly, we are seeking an experienced Senior Data Engineer to join our team and help shape the future of retail technology.

As a Senior Data Engineer, you will be an integral part of our expanding data team. Your primary responsibilities will involve building and optimizing data infrastructure, pipelines, and tooling to support analytics, machine learning, and product development. This role requires a strong background in cloud-native data engineering, a passion for scalable systems, and the ability to work independently with minimal supervision.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows using tools such as Kestra or Prefect.
- Architect and manage cloud-based data infrastructure on platforms such as Snowflake, MySQL, and LanceDB.
- Implement and uphold data quality, lineage, and governance best practices.
- Collaborate with analytics, BI, and product teams to establish data models for reporting, experimentation, and operational use cases.
- Optimize query performance, storage costs, and data reliability across platforms.
- Oversee data ingestion from internal and external systems through APIs, CDC, or streaming technologies such as Kafka and MQTT.
- Develop automated data validation, testing, and monitoring frameworks to ensure data integrity.
- Contribute to infrastructure-as-code and deployment processes using CI/CD pipelines and version control systems such as Git.
- Work independently and drive projects forward with minimal supervision.
Skills and Qualifications:
- 5+ years of experience as a data engineer or software engineer in large-scale data systems.
- Proficiency in SQL, Python, and modern data transformation frameworks.
- Hands-on experience building and maintaining production-level ETL/ELT pipelines.
- Familiarity with cloud data warehouses such as Snowflake and Redpanda Cloud.
- Expertise in workflow orchestration tools such as Airflow, Kestra, or Prefect.
- Understanding of data modeling techniques such as dimensional modeling and normalization.
- Experience with cloud platforms such as AWS and Azure for data infrastructure and services.
- Ability to work independently and lead projects with minimal guidance.

Nice to Have:
- Experience with streaming data technologies, specifically Redpanda.
- Knowledge of data security, privacy, and compliance practices, including GDPR and HIPAA.
- Background in DevOps for data, including containerization and observability tools.
- Previous experience in a retail or e-commerce data environment.

Software Qualifications:
- Languages: Python, SQL, Rust
- Data Warehousing: Snowflake, MySQL
- ETL/ELT Orchestration Tools: Kestra, Prefect
- Version Control & CI/CD: Git, GitHub Actions
- Orchestration & Infrastructure: Docker, Kubernetes, Redpanda, Cloudflare
- Monitoring: OpenobserveAI, Keep

Why Join Us:
- Become part of a forward-thinking company shaping the future of retail technology.
- Collaborate with a dynamic and innovative team that values creativity.
- Contribute to cutting-edge projects and grow your skills.
- Competitive salary and benefits package.
- Flexible work environment with opportunities for career growth.
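The core idea behind the workflow orchestration tools this role names (Kestra, Prefect, Airflow) is running tasks in dependency order. Below is a deliberately tiny pure-Python stand-in, not the API of any of those tools; the task names and `run_pipeline` helper are hypothetical:

```python
def run_pipeline(tasks, deps):
    """Toy orchestrator: run each task only after all of its upstream
    dependencies have completed, and return the execution order."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)  # recurse into dependencies first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

# A minimal extract -> transform -> load chain.
results = {}
tasks = {
    "extract":   lambda: results.setdefault("raw", [3, 1, 2]),
    "transform": lambda: results.setdefault("clean", sorted(results["raw"])),
    "load":      lambda: results.setdefault("loaded", len(results["clean"])),
}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_pipeline(tasks, deps))  # ['extract', 'transform', 'load']
```

Real orchestrators add what this sketch omits: scheduling, retries, parallelism, logging, and lineage, which is why the posting treats them as core tooling rather than something to hand-roll.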
Posted 5 days ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Key Skills: PostgreSQL, Cron Jobs, Databricks, Azure, SSIS, Prefect, Data Pipelines, Cloud Data Migration, MSSQL

Roles and Responsibilities:
- Design and implement data models in PostgreSQL databases on cloud environments.
- Build and manage transformation pipelines using Databricks for data migration from MSSQL to PostgreSQL.
- Schedule and manage automation using cron jobs.
- Mentor and guide junior team members.
- Work in Azure or any cloud-based environment.
- Ensure successful and optimized data migration from MSSQL to PostgreSQL.

Experience Requirement:
- 5-10 years of experience in database engineering and data migration.
- Hands-on experience in PostgreSQL, cron jobs, Databricks, and Azure.
- Experience with data pipelines using SSIS or Prefect is preferred.

Education: B.E., B.Tech.
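The MSSQL-to-PostgreSQL migration work described above is usually done in keyset-paginated batches so that large tables can be copied incrementally and restarted safely. The sketch below illustrates that pattern using the standard-library sqlite3 module as a stand-in for both databases; the `customers` table and batch size are hypothetical:

```python
import sqlite3

def migrate_in_batches(src, dst, batch_size=2):
    """Copy rows in batches ordered by primary key, remembering the last
    id copied (keyset pagination). In a real migration, src would be an
    MSSQL connection and dst a PostgreSQL one."""
    last_id = 0
    while True:
        rows = src.execute(
            "SELECT id, name FROM customers WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break  # no rows left to copy
        dst.executemany("INSERT INTO customers VALUES (?, ?)", rows)
        dst.commit()
        last_id = rows[-1][0]  # resume point for the next batch

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c"), (4, "d"), (5, "e")])
migrate_in_batches(src, dst)
print(dst.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # 5
```

Because the loop tracks the last copied id, the same job can be re-run from where it stopped, which is what makes it safe to schedule via cron as the posting describes.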
Posted 1 week ago
3.0 - 8.0 years
9 - 19 Lacs
Hyderabad
Work from Office
Advantum Health Pvt. Ltd., a US healthcare MNC, is looking for a Senior AI/ML Engineer. Advantum Health Private Limited is a leading RCM and Medical Coding company, operating since 2013. Our head office is in Hyderabad, with branch operations in Chennai and Noida. We are proud to be a Great Place to Work certified organization and a recipient of the Telangana Best Employer Award. Our office spans 35,000 sq. ft. in Cyber Gateway, Hitech City, Hyderabad.

Job Title: Senior AI/ML Engineer
Location: Hitech City, Hyderabad, India (work from office)
Ph: 9177078628, 7382307530, 9059683624
Address: Advantum Health Private Limited, Cyber Gateway, Block C, 4th Floor, Hitech City, Hyderabad
Map: https://www.google.com/maps/place/Advantum+Health+India/@17.4469674,78.3747158,289m/data=!3m2!1e3!5s0x3bcb93e01f1bbe71:0x694a7f60f2062a1!4m6!3m5!1s0x3bcb930059ea66d1:0x5f2dcd85862cf8be!8m2!3d17.4467126!4d78.3767566!16s%2Fg%2F11whflplxg?entry=ttu&g_ep=EgoyMDI1MDMxNi4wIKXMDSoASAFQAw%3D%3D

Job Summary: We are seeking a highly skilled and motivated Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support analytics, machine learning, and business intelligence initiatives. You will work closely with data analysts, scientists, and engineers to ensure data availability, reliability, and quality across the organization.
Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines for ingesting and transforming large volumes of structured and unstructured data
- Build and optimize data infrastructure for scalability, performance, and reliability
- Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
- Implement data quality checks, monitoring, and alerting mechanisms
- Manage and optimize data storage solutions (data warehouses, data lakes, databases)
- Ensure data security, compliance, and governance across all platforms
- Automate data workflows and optimize data delivery for real-time and batch processing
- Participate in code reviews and contribute to best practices for data engineering

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field
- 3+ years of experience in data engineering or related roles
- Strong programming skills in Python, Java, or Scala
- Proficiency with SQL and relational databases (e.g., PostgreSQL, MySQL)
- Experience with data pipeline and workflow orchestration tools (e.g., Airflow, Prefect, Luigi)
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and cloud data services (e.g., Redshift, BigQuery, Snowflake)
- Familiarity with distributed data processing tools (e.g., Spark, Kafka, Hadoop)
- Solid understanding of data modeling, warehousing concepts, and data governance

Preferred Qualifications:
- Experience with CI/CD and DevOps practices for data engineering
- Knowledge of data privacy regulations such as GDPR, HIPAA, etc.
- Experience with version control systems like Git
- Familiarity with containerization (Docker, Kubernetes)

Follow us on LinkedIn, Facebook, Instagram, YouTube and Threads for all updates:
LinkedIn: https://www.linkedin.com/showcase/advantum-health-india/
Facebook: https://www.facebook.com/profile.php?id=61564435551477
Instagram: https://www.instagram.com/reel/DCXISlIO2os/?igsh=dHd3czVtc3Fyb2hk
YouTube: https://youtube.com/@advantumhealthindia-rcmandcodi?si=265M1T2IF0gF-oF1
Threads: https://www.threads.net/@advantum.health.india

HR Dept, Advantum Health Pvt Ltd
Cyber Gateway, Block C, Hitech City, Hyderabad
Ph: 9177078628, 7382307530, 9059683624
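The data quality checks, monitoring, and alerting this listing asks for often start as simple batch-level assertions before graduating to dedicated tooling. A minimal sketch, with a hypothetical `quality_report` helper, column names, and threshold:

```python
def quality_report(rows, required, max_null_rate=0.1):
    """Run simple data-quality checks on a batch: the batch must be
    non-empty, and each required column's null rate must stay under
    the threshold. Returns a report a monitoring job could alert on."""
    report = {"row_count": len(rows), "passed": len(rows) > 0, "null_rates": {}}
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows) if rows else 1.0
        report["null_rates"][col] = rate
        if rate > max_null_rate:
            report["passed"] = False  # flag the batch for alerting
    return report

batch = [
    {"id": 1, "mrn": "A1"},
    {"id": 2, "mrn": None},   # a missing medical record number
    {"id": 3, "mrn": "A3"},
]
print(quality_report(batch, ["id", "mrn"]))
```

In a real pipeline a failed report would page the on-call engineer or block the downstream load, rather than just being printed.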
Posted 2 weeks ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Delhi / NCR
Hybrid
- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
- Experience with DBT for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
Posted 1 month ago
5.0 - 9.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are looking for a Senior Data Engineer who will design, build, and maintain scalable data pipelines and ingestion frameworks. The ideal candidate must have experience with DBT, orchestration tools like Airflow or Prefect, and cloud platforms such as AWS. Responsibilities include developing ELT pipelines, optimizing queries, implementing CI/CD, and integrating with AWS services. Strong SQL, Python, and data modeling skills are essential. The role also involves working with real-time and batch processing, ensuring high performance and data integrity.
Posted 1 month ago