
2162 Indexing Jobs - Page 5

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Linkedin logo

About The Job
The Red Hat Quality Engineering team is looking for a Quality Engineer to join us in Pune, India. In this role, you will write new test automation scripts and frameworks and optimize existing ones. You'll also analyze non-deterministic failures, work on ways to reduce them, and report bugs and defects. This team is home to some of the most well-known faces in the international QE community and will set you up for an exciting and rewarding SDET journey. Our team culture encourages you to come up with innovative solutions to problems and allows you to work with some of the brightest engineers in the open-source industry.

What will you do?
Test new features and ensure fast feedback for developers
Create and maintain testing artifacts
Write automated test scripts, analyze test results, and identify and file bugs accordingly
Test bugs and write bug reproducers
Write API and performance test scripts

What will you bring?
3+ years of experience in Quality Engineering
Hands-on experience writing code in at least one programming language, preferably Python, Java, or JavaScript
Strong understanding of HTTP communication protocols
Expertise in testing and working with RESTful APIs
Proficiency in API test automation
Experience building and testing systems designed to handle large datasets, with attention to performance and scalability
Familiarity with distributed systems and an understanding of challenges related to consistency, availability, and partitioning
Solid grasp of core testing concepts, including functional, sanity, and regression testing
Excellent observation skills with strong attention to detail
Self-motivation with the ability to take direction and collaborate effectively within a team
Eagerness to learn new technologies and apply problem-solving skills
Decision-making skills in day-to-day development
Intermediate written and verbal communication skills in English
Apache Solr, indexing, RestAssured, Python, and AI/ML experience is good to have

About Red Hat
Red Hat is the world’s leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.

Inclusion at Red Hat
Red Hat’s culture is built on the open source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. When this is realized, it empowers people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation. Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated. We hope you will join our celebration, and we welcome and encourage applicants from all the beautiful dimensions that compose our global village.

Equal Opportunity Policy (EEO)
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer.
We review applications for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law. Red Hat does not seek or accept unsolicited resumes or CVs from recruitment agencies. We are not responsible for, and will not pay, any fees, commissions, or any other payment related to unsolicited resumes or CVs except as required in a written contract between Red Hat and the recruitment agency or party requesting payment of a fee. Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email application-assistance@redhat.com. General inquiries, such as those regarding the status of a job application, will not receive a reply.
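For context on the kind of API test automation this posting describes, here is a minimal, illustrative sketch in Python using pytest and requests. The base URL, payload, and token are hypothetical placeholders, not part of the listing; the same arrange/act/assert structure would apply equally to a RestAssured test in Java.

```python
# Illustrative only: endpoint, payload, and token are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def create_item(session, payload):
    """POST a new item and return the parsed JSON response."""
    resp = session.post(f"{BASE_URL}/items", json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()

def test_create_item_returns_id():
    # Arrange: a session lets us share headers (e.g., auth) across calls.
    session = requests.Session()
    session.headers.update({"Authorization": "Bearer <token>"})  # placeholder

    # Act
    body = create_item(session, {"name": "widget", "qty": 3})

    # Assert: verify contract-level expectations, not implementation details.
    assert "id" in body
    assert body["name"] == "widget"
```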

Posted 2 days ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

Develop and maintain database applications using PL/SQL, focusing on stored procedures, triggers, and database performance optimization. Work with teams to integrate database solutions into broader application frameworks.

Posted 2 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Develop and maintain Oracle databases, writing efficient SQL queries and PL/SQL scripts. Focus on performance tuning, data integrity, and ensuring seamless data integration within enterprise systems.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

What You’ll Do
Manage and maintain PostgreSQL databases in development, staging, and production environments.
Write and optimize SQL queries, stored procedures, functions, and triggers to support application logic.
Design, implement, and maintain logical and physical database schemas.
Monitor database performance and implement performance tuning strategies.
Ensure data integrity, security, and availability through regular maintenance and backups.
Collaborate with application developers to understand requirements and provide efficient database solutions.
Handle database migrations, versioning, and deployment as part of CI/CD pipelines.
Perform regular database health checks, index analysis, and query optimization.
Troubleshoot and resolve database issues, including slow queries, locking, and replication errors.

What We Seek In You
Proven experience with PostgreSQL databases and hands-on SQL development.
Strong knowledge of PL/SQL and writing efficient stored procedures and functions.
Experience with database schema design, normalization, and data modeling.
Solid understanding of PostgreSQL internals, indexing strategies, and performance tuning.
Experience with backup and recovery tools, pg_dump, pg_restore, replication, and monitoring tools.
Proficient in Linux/Unix command-line tools for database management.
Familiar with version control systems (e.g., Git) and CI/CD practices.

Life At Next
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks Of Working With Us
Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
Abundant opportunities for engagement with customers, product managers, and leadership.
You'll be guided by progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions.
Cultivate and leverage robust connections within diverse communities of interest.
Choose your mentor to navigate your current endeavors and steer your future trajectory.
Embrace continuous learning and upskilling opportunities through Nexversity.
Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
Embrace a hybrid work model promoting work-life balance.
Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
Embark on accelerated career paths to actualize your professional aspirations.

Who We Are
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
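As an illustration of the index-analysis and slow-query work this role describes, here is a minimal sketch using psycopg2 against PostgreSQL's standard pg_stat_user_indexes view and EXPLAIN. The connection details and the orders table are assumptions for the example only.

```python
# Minimal sketch: connection details and the 'orders' table are placeholders; assumes psycopg2.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb", user="dba", password="***")

with conn.cursor() as cur:
    # Indexes that have never been scanned are candidates for review or removal.
    cur.execute("""
        SELECT schemaname, relname, indexrelname, idx_scan
        FROM pg_stat_user_indexes
        WHERE idx_scan = 0
        ORDER BY schemaname, relname;
    """)
    for schema, table, index, scans in cur.fetchall():
        print(f"unused index: {schema}.{table}.{index} (scans={scans})")

    # Inspect the execution plan of a suspect query before tuning it.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM orders WHERE customer_id = %s;", (42,))
    for (line,) in cur.fetchall():
        print(line)

conn.close()
```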

Posted 2 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description: Key Responsibilities: Data Engineering & Architecture: Design, develop, and maintain high-performance data pipelines for structured and unstructured data using Azure Data Bricks and Apache Spark. Build and manage scalable data ingestion frameworks for batch and real-time data processing. Implement and optimize data lake architecture in Azure Data Lake to support analytics and reporting workloads. Develop and optimize data models and queries in Azure Synapse Analytics to power BI and analytics use cases. Cloud-Based Data Solutions: Architect and implement modern data lakehouses combining the best of data lakes and data warehouses. Leverage Azure services like Data Factory, Event Hub, and Blob Storage for end-to-end data workflows. Ensure security, compliance, and governance of data through Azure Role-Based Access Control (RBAC) and Data Lake ACLs. ETL/ELT Development: Develop robust ETL/ELT pipelines using Azure Data Factory, Data Bricks notebooks, and PySpark. Perform data transformations, cleansing, and validation to prepare datasets for analysis. Manage and monitor job orchestration, ensuring pipelines run efficiently and reliably. Performance Optimization: Optimize Spark jobs and SQL queries for large-scale data processing. Implement partitioning, caching, and indexing strategies to improve performance and scalability of big data workloads. Conduct capacity planning and recommend infrastructure optimizations for cost-effectiveness. Collaboration & Stakeholder Management: Work closely with business analysts, data scientists, and product teams to understand data requirements and deliver solutions. Participate in cross-functional design sessions to translate business needs into technical specifications. Provide thought leadership on best practices in data engineering and cloud computing. Documentation & Knowledge Sharing: Create detailed documentation for data workflows, pipelines, and architectural decisions. Mentor junior team members and promote a culture of learning and innovation. Required Qualifications: Experience: 7+ years of experience in data engineering, big data, or cloud-based data solutions. Proven expertise with Azure Data Bricks, Azure Data Lake, and Azure Synapse Analytics. Technical Skills: Strong hands-on experience with Apache Spark and distributed data processing frameworks. Advanced proficiency in Python and SQL for data manipulation and pipeline development. Deep understanding of data modeling for OLAP, OLTP, and dimensional data models. Experience with ETL/ELT tools like Azure Data Factory or Informatica. Familiarity with Azure DevOps for CI/CD pipelines and version control. Big Data Ecosystem: Familiarity with Delta Lake for managing big data in Azure. Experience with streaming data frameworks like Kafka, Event Hub, or Spark Streaming. Cloud Expertise: Strong understanding of Azure cloud architecture, including storage, compute, and networking. Knowledge of Azure security best practices, such as encryption and key management. Preferred Skills (Nice to Have): Experience with machine learning pipelines and frameworks like MLFlow or Azure Machine Learning. Knowledge of data visualization tools such as Power BI for creating dashboards and reports. Familiarity with Terraform or ARM templates for infrastructure as code (IaC). Exposure to NoSQL databases like Cosmos DB or MongoDB. 
Experience with data governance.

Weekly Hours: 40
Time Type: Regular
Location: Hyderabad, Andhra Pradesh, India
Job Category: Big Data

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
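To illustrate the kind of Databricks/PySpark pipeline and partitioning work described above, here is a minimal sketch. The ADLS paths, container names, and columns are hypothetical, and a Databricks-style Spark session with Delta support is assumed.

```python
# Illustrative sketch: paths, container names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"          # placeholder
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders/"  # placeholder

# Batch ingestion: read raw JSON, then apply basic cleansing and validation.
orders = (
    spark.read.json(raw_path)
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Partition by date so downstream Synapse/BI queries can prune files efficiently.
(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(curated_path)
)
```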

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: Senior Consultant - Data & Analytics Career Level: D2 Introduction to Role Are you ready to make a significant impact in the world of data and analytics? We are looking for a highly skilled Senior Data & Analytics Engineer to join our offshore Knowledge Engineering Team. In this pivotal role, you will support various projects and initiatives across AstraZeneca by developing and managing the data ingestion, clean-up, and enrichment processes. Your expertise in data engineering and analytics, coupled with hands-on experience with AWS DevOps tools, will be crucial in driving our projects forward. Accountabilities Collaborate with project teams across diverse domains to understand their data needs and provide expertise in data ingestion and enrichment processes. Design, develop, and maintain scalable data pipelines and ETL workflows for the Knowledge Graph Team. Implement advanced data engineering techniques to ensure optimal performance and reliability of data systems. Work closely with data scientists and analysts to ensure high-quality data for knowledge graph construction and advanced analytics. Troubleshoot and resolve complex issues related to data pipelines, ensuring efficient data flow. Optimize data storage and processing for performance, scalability, and cost-efficiency. Stay updated with the latest trends in data engineering, analytics, and AWS DevOps to drive innovation. Provide DevOps/CloudOps support for the Knowledge Graph Team as needed. Essential Skills/Experience Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. Strong expertise in data engineering, ETL workflows, and data pipeline development. Proficiency in programming languages such as Python, Java, or Scala. Experience with advanced data engineering techniques and best practices. Proven experience in DevOps/CloudOps support, particularly in AWS environment. Excellent troubleshooting and problem-solving skills. Web-services development and consumption (e.g., RESTful, GraphQL). Strong communication and collaboration skills, with the ability to lead and work effectively with cross-functional teams, present findings, and influence decision-making. Desirable Skills/Experience Experience in a senior or leadership role, guiding and mentoring junior data and analytics engineers. Familiarity with knowledge graph construction and advanced analytics is a plus. Familiarity with Object Graph Mapping libraries and/or Aspect-Oriented Programming. Expertise in data engineering using modern data platforms to build, deploy, and maintain data applications. Application deployment technologies (e.g., containerised workflows, Kubernetes) and cloud provisioning tools (e.g., CloudFormation, Terraform). Experience with full-text search and indexing (e.g., Solr/Lucene, Elastic Search). Knowledge of testing frameworks. AWS certification(s) is highly desirable. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, your work directly impacts patients by transforming our ability to develop life-changing medicines. 
We empower our business to perform at its peak by combining cutting-edge science with leading digital technology platforms. Join us at a crucial stage of our journey as we become a digital and data-led enterprise. Here, you will have the opportunity to innovate, take ownership, and explore new solutions in a dynamic environment that values diverse minds working inclusively together. Are you ready to shape the future of the pharmaceutical industry? Apply now to join our team! Date Posted 25-Jun-2025 Closing Date AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Company Description About Sopra Steria Sopra Steria, a major Tech player in Europe with 56,000 employees in nearly 30 countries, is recognized for its consulting, digital services and software development. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organizations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a fully collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2023, the Group generated revenues of €5.8 billion. Job Description The world is how we shape it. We are looking for an experienced SSIS and MS SQL Developer to design, develop, and support SQL Server databases and SSIS packages. The role involves working on development projects and providing production support to ensure smooth database operations and optimal performance. SQL Server Development: Design and develop complex T-SQL queries, stored procedures, views, functions, and triggers. Optimize database queries for improved performance and efficiency. Develop scripts for data migration, import/export, and transformation tasks. SSIS Package Development: Design, develop, and maintain ETL (Extract, Transform, Load) processes using SSIS. Create, modify, and troubleshoot SSIS packages for data integration, migration, and transformation. Scheduling of SSIS packages. Production Support: Monitor and support production databases to ensure high availability and performance. Troubleshoot and resolve database-related issues in production environments. Identify and optimize long-running queries, deadlocks, and other performance bottlenecks. Provide 24/7 production support as part of an on-call rotation for critical issues. Perform root cause analysis and provide resolutions for database incidents and outages. Required Technical Skills: Hands-on experience with MS SQL Server. Proficiency in T-SQL programming (queries, stored procedures, functions, triggers). Strong experience with SQL Server Integration Services (SSIS) for ETL processes. Experience in query optimization, indexing, and performance tuning. Good to have: Knowledge or work experience in SSRS, C#.net/Java, Python, Zendesk, JIRA, BizTalk, RTI, Jenkins, Splunk, CICD, New Relic, Autosys scheduler, CA7 scheduler. Total Experience Expected: 04-06 years Qualifications Relevant experience in database maintenance and support. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Additional Information Must be willing to work in shifts and participate in on-call support as required. At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
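For consistency with the other sketches on this page, here is a small Python example (via pyodbc) of the two sides of this role: invoking a scheduled T-SQL load procedure and a production-support check for blocking sessions. The server, database, procedure, and credentials are placeholders; the role itself centres on T-SQL and SSIS rather than Python.

```python
# Sketch only: DSN, procedure name, and credentials are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlprod01;DATABASE=SalesDW;UID=etl_user;PWD=***"
)
cur = conn.cursor()

# Run a (hypothetical) stored procedure that an SSIS package would normally schedule.
cur.execute("EXEC dbo.usp_LoadDailySales @RunDate = ?", "2025-06-25")
conn.commit()

# Production-support check: list currently blocked sessions and their wait types.
cur.execute("""
    SELECT session_id, blocking_session_id, wait_type, wait_time
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;
""")
for row in cur.fetchall():
    print(row)

conn.close()
```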

Posted 2 days ago

Apply

0.0 - 4.0 years

0 - 0 Lacs

Delhi, Delhi

On-site

Indeed logo

Job Title: Full Stack PHP Developer (CodeIgniter & Laravel)
Location: Janak Puri (West Delhi); 5-day work week, Saturday and Sunday off
Experience Level: Mid to Senior (4+ years preferred)
Salary: up to ₹50,000 in hand (negotiable for deserving candidates)

Job Overview: We are seeking a talented and experienced Full Stack PHP Developer who is proficient in both the CodeIgniter and Laravel frameworks. The ideal candidate will be responsible for developing and maintaining robust web applications, handling both front-end and back-end development, and collaborating closely with the product and design teams.

Key Responsibilities: Design, develop, and maintain scalable web applications using PHP, Laravel, and CodeIgniter. Work on both front-end and back-end development, ensuring responsive and efficient user interfaces. Build and integrate RESTful APIs and third-party services. Optimize application performance and troubleshoot issues across the stack. Write clean, maintainable, and secure code following best practices. Collaborate with UI/UX designers, QA engineers, and other developers. Manage database operations (MySQL), including query optimization. Use Git for version control.

Required Skills & Qualifications: 3+ years of hands-on experience in PHP development. Strong experience with the Laravel and CodeIgniter frameworks. Solid knowledge of HTML5, CSS3, JavaScript, and jQuery. Familiarity with front-end frameworks such as Vue.js or React is a plus. Good understanding of MySQL databases, indexing, and performance tuning. Experience using Git for version control. Knowledge of MVC architecture and REST API integration. Ability to work independently and manage multiple projects.

Preferred Qualifications (Nice to Have): Experience with cloud platforms (AWS, DigitalOcean). Familiarity with Docker or container-based development. Knowledge of automated testing (e.g., PHPUnit). Basic understanding of SEO and accessibility principles.

Contact: WhatsApp 9354220033
Job Types: Full-time, Permanent
Pay: ₹30,000.00 - ₹40,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Morning shift
Supplemental Pay: Performance bonus, yearly bonus
Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Required)
Education: Bachelor's (Required)
Experience: Total work experience as a PHP Developer: 4 years (Required); total work experience in Laravel, CodeIgniter, and PHP: 4 years (Required)
Language: English (Required)
Work Location: In person

Posted 2 days ago

Apply

3.0 - 8.0 years

6 - 9 Lacs

Navi Mumbai

Work from Office

Naukri logo

3-5 years of experience in MySQL database development and administration. Strong knowledge of relational database concepts, design, and indexing. Expertise in SQL query optimization and performance tuning. Deep understanding of database security best practices and encryption methods. Experience with ETL processes and tools. Ability to write complex SQL queries, stored procedures, and functions. Strong problem-solving skills, attention to detail, and the ability to work independently or within a team. Excellent communication skills for cross-functional collaboration. Bachelor's degree in Computer Science, Information Technology, or a related field. Certification in MySQL administration and knowledge of Oracle or PostgreSQL is a plus! If you're looking to take your career to the next level, apply now and become part of a dynamic and innovative team!

Posted 2 days ago

Apply

7.0 years

0 Lacs

India

Remote

Linkedin logo

About Valorant Valorant is a fast-growing procurement consulting firm helping mid-market and PE-backed companies transform operations. We’re now launching our next chapter: building AI products to radically automate and augment procurement workflows. This isn’t about chatbot demos. We’re building real enterprise software with real client data, solving real problems. About the Role As our Full-Stack AI Engineer / Technical Product Lead , you’ll drive the design, development, and launch of intelligent agentic systems that blend LLMs, vector search, structured data, and enterprise workflows. You’ll work closely with domain experts, iterate fast, and own the tech stack end to end—from backend services to frontend interfaces. This is a zero-to-one opportunity to build production-grade AI tools that work at scale and matter to real businesses. What You’ll Do Architect and build AI-powered products for procurement and supply chain use cases Develop LLM features using RAG (Retrieval-Augmented Generation), prompt engineering, and custom context pipelines Implement semantic document search using vector databases (Chroma, FAISS, etc.) Build Python backend services for data ingestion, transformation, and orchestration of AI pipelines Work with structured enterprise data (e.g., ledgers, SaaS exports, CSVs) to extract insights and power analytics Design or collaborate on frontend development for dashboards, chat interfaces, and user-facing tools (React or similar) Translate complex workflows into clean, intuitive UX with strong usability principles Ensure enterprise-grade reliability, explainability, and data privacy Contribute to product roadmap, feature planning, and fast iteration cycles with consultants and PMs Take ownership of the full stack and help shape a modern, scalable AI-first architecture What We’re Looking For 5–7+ years of experience in software engineering, full-stack development, or AI/ML product engineering Hands-on experience shipping LLM features in production (OpenAI, Claude, LLaMA, Mistral, etc.) Strong Python skills; experience with LangChain, LLaMA Index, or similar frameworks Experience with vector search, semantic indexing, and chunking strategies Backend engineering experience: designing modular APIs, microservices, and orchestration layers (FastAPI, Flask, Django, etc.) Proficiency in frontend development using React, Vue, or similar frameworks Understanding of UI/UX principles and ability to turn workflows into usable interfaces Familiarity with structured data workflows: pandas, SQL, and validation pipelines Exposure to cloud environments and dev tooling (Docker, GitHub Actions, AWS/GCP) Pragmatic, product-focused mindset — values useful outputs over academic perfection Bonus: Domain experience in procurement, supply chain, legal tech, or enterprise SaaS Bonus: Experience mentoring junior engineers or contributing to team scaling Why Join Us? Build meaningful AI products that solve real problems — not just tech showcases Collaborate with domain experts and access rich real-world data from day one Operate with autonomy, fast iteration cycles, and strong strategic backing Shape the tech foundation of an ambitious, AI-native product company Competitive pay and flexible remote work.
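To make the RAG and vector-search portion of this role concrete, here is a minimal retrieval sketch using the chromadb package with its default embedding model. The collection name, document text, chunk sizes, and query are all made up for illustration; a production pipeline would also persist the store and feed the retrieved chunks into an LLM prompt.

```python
# Minimal RAG-retrieval sketch: collection name, document, and query are made up.
# Assumes the chromadb package, which applies a default embedding model to the text.
import chromadb

def chunk(text: str, size: int = 200, overlap: int = 40) -> list[str]:
    """Naive fixed-size character chunking with overlap."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

client = chromadb.Client()  # in-memory instance for illustration
collection = client.create_collection(name="procurement_docs")

doc = "Supplier payment terms are net 45 days. Early-payment discounts of 2% apply within 10 days."
chunks = chunk(doc, size=60, overlap=15)
collection.add(documents=chunks, ids=[f"doc1-{i}" for i in range(len(chunks))])

# Retrieve the chunks most relevant to a question; these would be stuffed into the LLM prompt.
hits = collection.query(query_texts=["What are the payment terms?"], n_results=2)
for text in hits["documents"][0]:
    print(text)
```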

Posted 2 days ago

Apply

6.0 years

0 Lacs

India

Remote

Linkedin logo

Position: Search Engineer
Location: Remote
Experience: 6+ years

Job Description: We are looking for a highly skilled Search Engineer with deep expertise in designing, implementing, and optimizing search solutions using Apache Solr, Elasticsearch, and Apache Spark. The role requires substantial experience handling big data search and document-based retrieval, with a strong focus on writing complex queries and indexing strategies for large-scale systems.

Key Responsibilities:
· Design and implement robust, scalable search architectures using Solr and Elasticsearch.
· Write, optimize, and maintain complex search queries (including full-text, faceted, fuzzy, geospatial, and nested queries) using the Solr Query Parser and Elasticsearch DSL.
· Work with business stakeholders to understand search requirements and translate them into performant and accurate queries.
· Build and manage custom analyzers, tokenizers, filters, and index mappings/schemas tailored to domain-specific search needs.
· Develop and optimize indexing pipelines using Apache Spark for processing large-scale structured and unstructured datasets.
· Perform query tuning and search relevance optimization based on precision, recall, and user engagement metrics.
· Create and maintain query templates and search APIs for integration with enterprise applications.
· Monitor, troubleshoot, and improve search performance and infrastructure reliability.
· Conduct evaluations and benchmarking of search quality, query latency, and index refresh times.

Required Skills and Qualifications:
· 4 to 5 years of hands-on experience with Apache Solr and/or Elasticsearch in production environments.
· Proven ability to write and optimize complex Solr queries (standard, dismax, edismax parsers) and Elasticsearch Query DSL, including: full-text search with analyzers; faceted and filtered search; Boolean and range queries; aggregations and suggesters; nested and parent/child queries.
· Strong understanding of indexing principles, Lucene internals, and relevance scoring mechanisms (BM25, TF-IDF).
· Proficiency with Apache Spark for custom indexing workflows and large-scale data processing.
· Experience with document parsing and extraction (JSON, XML, PDFs, etc.) for search indexing.
· Experience integrating search into web applications or enterprise software platforms.
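As an illustration of the Elasticsearch Query DSL work this posting emphasizes, here is a minimal sketch combining a full-text clause, a non-scoring filter, and a terms aggregation. It uses the elasticsearch Python client (8.x keyword-argument style); the host, index name, and field names are assumptions.

```python
# Illustrative only: host, index name, and field names are hypothetical.
# Uses the elasticsearch-py 8.x client, which accepts query/aggs as keyword arguments.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="products",
    query={
        "bool": {
            "must": [{"match": {"title": "wireless headphones"}}],   # full-text clause
            "filter": [{"range": {"price": {"lte": 5000}}}],          # non-scoring filter
        }
    },
    aggs={"by_brand": {"terms": {"field": "brand.keyword", "size": 5}}},  # faceting
    size=10,
)

for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
for bucket in resp["aggregations"]["by_brand"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```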

Posted 2 days ago

Apply

4.0 years

0 Lacs

India

On-site

Linkedin logo

This role is for one of Weekday's clients Min Experience: 4 years Location: India JobType: full-time Requirements About the Role: We are seeking a highly skilled Backend Engineer with expertise in ClickHouse and Java to join our high-performing engineering team. This is an exciting opportunity for professionals who are passionate about data-intensive systems and building scalable backend solutions for real-time analytics platforms. You will be responsible for architecting, developing, and maintaining data pipelines, APIs, and database integrations that power mission-critical analytics services. If you are a backend developer who thrives in a fast-paced environment and has a strong foundation in Java and ClickHouse , this role is for you. Key Responsibilities: Backend Development: Design, implement, and optimize backend services and APIs using Java to support our real-time analytics platform and business logic. ClickHouse Integration: Build and manage large-scale ClickHouse database clusters. Optimize queries, schemas, and ingestion pipelines to ensure high performance and scalability. Data Pipeline Management: Collaborate with data engineers and DevOps teams to create and maintain efficient data pipelines from ingestion to querying using ClickHouse. System Architecture & Design: Contribute to the architectural design of backend services, ensuring robustness, maintainability, and scalability of the system. Monitoring and Optimization: Proactively monitor system performance and take measures to improve reliability, latency, and throughput, especially around ClickHouse-driven components. Code Quality & Best Practices: Write clean, efficient, and well-documented code. Conduct code reviews and promote best practices in backend development and database usage. Troubleshooting and Debugging: Quickly identify and resolve backend-related issues, particularly around data inconsistencies or performance bottlenecks in ClickHouse. Collaboration: Work closely with frontend developers, data scientists, product managers, and other stakeholders to translate requirements into technical solutions. Required Skills & Qualifications: 4+ years of experience in backend development, with at least 1-2 years of hands-on experience with ClickHouse. Strong proficiency in Java and understanding of multi-threaded applications and performance optimization. Deep understanding of ClickHouse internals, data modeling, indexing, and performance tuning. Experience with database design, schema optimization, and writing complex queries. Familiarity with RESTful API development and microservices architecture. Exposure to cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker, Kubernetes) is a plus. Experience with CI/CD, Git, and agile development methodologies. Excellent problem-solving and analytical thinking skills. Bachelor's or Master's degree in Computer Science, Engineering, or related field. Nice to Have: Experience integrating ClickHouse with data ingestion tools like Kafka, Spark, or Flink. Familiarity with monitoring tools such as Grafana, Prometheus, or similar. Working knowledge of additional backend languages or frameworks (e.g., Golang, Spring Boot) is a bonus.
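To ground the ClickHouse modeling and ingestion ideas mentioned in this listing, here is a small sketch shown in Python with the clickhouse-connect driver for brevity; the role itself calls for Java, and the host, table schema, and sample data are invented. The key points carried over are MergeTree ordering/partitioning and batch inserts.

```python
# Sketch only: host, table schema, and sample data are hypothetical; the role uses Java,
# but the same ideas (MergeTree ordering, partitioning, batch inserts) carry over.
from datetime import date
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost", port=8123)

# ORDER BY drives ClickHouse's sparse primary index; PARTITION BY enables partition pruning.
client.command("""
    CREATE TABLE IF NOT EXISTS events (
        event_date Date,
        user_id    UInt64,
        event_type LowCardinality(String),
        value      Float64
    ) ENGINE = MergeTree
    PARTITION BY toYYYYMM(event_date)
    ORDER BY (event_type, user_id, event_date)
""")

# Batch inserts are strongly preferred over row-by-row writes in ClickHouse.
client.insert(
    "events",
    [[date(2025, 6, 25), 1, "click", 1.0], [date(2025, 6, 25), 2, "view", 0.5]],
    column_names=["event_date", "user_id", "event_type", "value"],
)

rows = client.query(
    "SELECT event_type, count() AS c FROM events GROUP BY event_type ORDER BY c DESC"
).result_rows
print(rows)
```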

Posted 2 days ago

Apply

6.0 - 11.0 years

7 - 11 Lacs

Gurugram

Hybrid

Naukri logo

Skills: Oracle Database, Postgres, Database design Secondary Skills: Data modelling, Performance Tuning, ETL processes, Automating Backup and Purging Processes Skill Justification Database Designing, Data Modelling, and Core Component Implementation: These are fundamental skills for a DBA. Database designing involves creating the structure of the database, data modelling is about defining how data is stored, accessed, and related, and core component implementation ensures that the database is set up correctly and efficiently. Data Integration and Relational Data Modelling: Data integration is crucial for combining data from different sources into a unified view, which is essential for accurate reporting and analysis. Relational data modelling helps in organizing data into tables and defining relationships, which is a core aspect of managing relational databases. Optimization and Performance Tuning: Optimization and performance tuning are critical for ensuring that the database runs efficiently. This involves analyzing and improving query performance, indexing strategies, and resource allocation to prevent bottlenecks and ensure smooth operation. Automating Backup and Purging Processes: Automating backup and purging processes is vital for data integrity and storage management. Regular backups protect against data loss, while purging old or unnecessary data helps maintain database performance and manage storage costs.
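Since this listing highlights automating backup and purging processes, here is a minimal sketch of such a job for the PostgreSQL side of the stack. The backup directory, database name, and 14-day retention window are assumptions, and pg_dump is assumed to be available on PATH; an Oracle equivalent would wrap RMAN or Data Pump instead.

```python
# Minimal backup-and-purge sketch; paths, database name, and retention are assumptions.
import subprocess
import time
from datetime import datetime
from pathlib import Path

BACKUP_DIR = Path("/var/backups/postgres")   # placeholder
RETENTION_DAYS = 14

def take_backup(dbname: str) -> Path:
    """Create a compressed custom-format dump named with a timestamp."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = BACKUP_DIR / f"{dbname}_{datetime.now():%Y%m%d_%H%M%S}.dump"
    subprocess.run(["pg_dump", "--format=custom", f"--file={target}", dbname], check=True)
    return target

def purge_old_backups() -> None:
    """Delete dump files older than the retention window."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for f in BACKUP_DIR.glob("*.dump"):
        if f.stat().st_mtime < cutoff:
            f.unlink()

if __name__ == "__main__":
    take_backup("appdb")   # hypothetical database name
    purge_old_backups()
```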

Posted 2 days ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Position Summary: We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and / or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization. Who we are looking for: Primary Responsibilities: Key Responsibilities Architecture & Design: Design and implement comprehensive data architectures using AWS or GCP services Develop data models, schemas, and integration patterns for structured and unstructured data Create solution blueprints, technical documentation, architectural diagrams, and best practice guidelines Implement data governance frameworks and ensure compliance with security standards Design disaster recovery and business continuity strategies for data systems Technical Leadership: Lead cross-functional teams in implementing data solutions and migrations Provide technical guidance on cloud data services selection and optimization Collaborate with stakeholders to translate business requirements into technical solutions Drive adoption of cloud-native data technologies and modern data practices Platform Implementation: Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.) 
Configure and optimize data lakes and data warehouses (S3 / Redshift, GCS / BigQuery) Set up real-time streaming data processing solutions (Kafka, Airflow, Pub / Sub) Implement automated data quality monitoring and validation processes Establish CI/CD pipelines for data infrastructure deployment Performance & Optimization: Monitor and optimize data pipeline performance and cost efficiency Implement data partitioning, indexing, and compression strategies Conduct capacity planning and scaling recommendations Troubleshoot complex data processing issues and performance bottlenecks Establish monitoring, alerting, and logging for data systems Skill: Bachelor’s degree in Computer Science, Data Engineering, or related field 9+ years of experience in data architecture and engineering 5+ years of hands-on experience with AWS or GCP data services Experience with large-scale data processing and analytics platforms AWS Redshift, S3, Glue, EMR, Kinesis, Lambda AWS Data Pipeline, Step Functions, CloudFormation BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub GCP Cloud Functions, Cloud Composer, Deployment Manager IAM, VPC, and security configurations SQL and NoSQL databases Big data technologies (Spark, Hadoop, Kafka) Programming languages (Python, Java, SQL) Data modeling and ETL/ELT processes Infrastructure as Code (Terraform, CloudFormation) Container technologies (Docker, Kubernetes) Data warehousing concepts and dimensional modeling Experience with modern data architecture patterns Real-time and batch data processing architectures Data governance, lineage, and quality frameworks Business intelligence and visualization tools Machine learning pipeline integration Strong communication and presentation abilities Leadership and team collaboration skills Problem-solving and analytical thinking Customer-focused mindset with business acumen Preferred Qualifications: Master’s degree in relevant field Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer) Experience with multiple cloud platforms Knowledge of data privacy regulations (GDPR, CCPA) Work location: Hyderabad, India Work pattern: Full time role. Work mode: Hybrid. Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. 
At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

🧩 Junior Software Engineer (Java / SQL / Angular / Flutter) Overview: We are looking for enthusiastic Junior Software Engineers (1–3 years experience) with skills in Java/SQL/ Angular/Flutter to join our development team. The ideal candidate should be a quick learner with a solid foundation in coding, problem-solving, and team collaboration. Key Responsibilities: Develop and maintain applications using Java (Spring Boot), SQL, Angular, or Flutter. Build and integrate RESTful APIs; manage front-end and back-end components. Write efficient, scalable, and reusable code and queries. Participate in debugging, performance tuning, code reviews, and Agile sprints. Collaborate with UI/UX teams and ensure best practices in development. Requirements: 1–3 years of hands-on experience in one or more of the following: Java : Core Java, Spring Boot, OOP principles, Maven/Gradle. SQL : MSSQL/PostgreSQL/Oracle, stored procedures, indexing, optimization. Angular : Angular v2+, HTML5, CSS3, TypeScript, JavaScript. Flutter : Dart, widget-based UI design, cross-platform mobile app development. Strong understanding of REST APIs, version control (Git), and Agile methodologies.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description Experience in designing and developing a new test automation framework from scratch using a test automation tool like Playwright/JavaScript/TypeScript or /Selenium/Cucumber/Java or Selenium/C#.net/Spec Flow or similar frameworks. Hands-on experience in software testing and writing test automation code in at least one programming or scripting language (JavaScript, TypeScript, Java, C# and/or Python) and guide team members on technical front. Hands-on experience in developing and executing test scripts for REST or SOAP based APIs. Experience in intercepting and mocking complex HTTP API calls in UI and API tests. Experience in messaging systems (AMQ, WMQ or Kafka). Experience in writing and guiding teams on complex end-to-end tests automation. Experience on test management and defect tracking tools like Jira, Azure DevOps, HP ALM, etc. Experience in driving defect management strategy (create, set priority/severity, retest, close, traceability, reporting etc.) Experience in driving the team to implement automated test suites for all testing types (unit, component, API, UI, UAT, E2E, etc.) Hands-on experience in writing and executing complex SQL queries and understands concepts like indexing, schemas, views, etc. Experience in training team members to understand version control concepts. Perform lead-level automation code review as a required approver. Experience in guiding the team on version control concepts, tools and hands-on experience on commands and operations (like commit, fetch, push, pull, squash, resolve merge conflicts etc.) Experience in establishing branching strategy and best practices for automation code review process (in coordinate with Engg. lead, DevSecOps) Hands-on experience on at least one of the performance testing tools (e.g., JMeter, K6, LoadRunner, NeoLoad, etc.). Experience in analyzing performance testing requirements and implementation of load, stress, endurance, volume testing etc. Experience in leading the team on CI/CD pipeline implementation for automated tests suites. (in coordination with DevSecOps team) Experience in creating test plan for accessibility and security testing. Hands-on experience on cloud platforms (e.g., Azure, AWS, GCP) Understanding of Gen AI, Gen AI tools (e.g., GitHub CoPilot) and experience in leveraging Gen AI in quality engineering space. Language requirement: English ________________________________________________________________________ The Lead Software Quality Engineer establishes plans and objectives for the Quality Assurance (QA) team, leads and develops staff, and ensures resources are effectively utilized for meeting business goals. He/She drives team results, including quality of work, timeliness, and budgetary goals. This position serves as a technical resource and provides expertise in key UPS business functions and supporting technologies. He/She collaborates with management to plan, coordinate, schedule, and manage QA resources. 
This position leads the improvement of work procedures and processes. He/She monitors project budget, timelines, and resource allocation (e.g., team members, contractors, vendors, etc.). The Lead Software Quality Engineer interacts with staff, customers, Information Services (I.S.) management, internal and external networks, QA professionals, and vendors to ensure effective integration between different functions, units, and teams. He/She contributes to initial project design phases, provides testing expertise, and develops test plans and strategies for projects. This position leads testing and development staff within the testing organization. Employee Type Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
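One of the more specific requirements above is intercepting and mocking HTTP API calls inside UI tests; here is a minimal Playwright (Python) sketch of that technique. The application URL, route pattern, and JSON payload are hypothetical.

```python
# Illustrative sketch: the URL, route pattern, and JSON payload are hypothetical.
# Demonstrates intercepting and mocking an HTTP API call in a Playwright UI test.
import json
from playwright.sync_api import sync_playwright

def mock_orders(route):
    """Fulfill the request with a canned response instead of hitting the real backend."""
    route.fulfill(
        status=200,
        content_type="application/json",
        body=json.dumps([{"id": 1, "status": "DELIVERED"}]),
    )

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.route("**/api/orders*", mock_orders)      # intercept matching calls
    page.goto("https://app.example.com/orders")    # placeholder application URL
    page.wait_for_selector("text=DELIVERED")       # UI should render the mocked data
    browser.close()
```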

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Summary: Experience using a test automation tool like Playwright/JavaScript/TypeScript or Selenium/Cucumber/Java or Selenium/C#.net/Spec Flow or similar frameworks. Hands-on experience in software testing and writing test automation code in at least one programming or scripting language (JavaScript, TypeScript, Java, C# and/or Python) Hands-on experience in developing and executing test scripts for REST or SOAP based APIs. Experience in intercepting and mocking complex HTTP API calls in UI and API tests. Experience in messaging systems (AMQ, WMQ or Kafka). Experience in functional and end to end test scripts development. Experience with at least one test management and defect tracking tool like Jira, Azure DevOps, HP ALM, etc. Experience in defect management strategy (create, set priority/severity, retest, close) Coordinate, contribute to and facilitate implementation of different testing types (Unit, integration/component, UI, API, E2E, etc.) Hands-on experience in writing and executing complex SQL queries and understands concepts like indexing, schemas, views, etc. Experience understanding and following advanced level version control concepts (add, delete, branching strategy) Experience on version control tool and hands-on experience on commands and operations (like commit, fetch, push, pull, squash, resolve merge conflicts etc.) Perform peer-reviews of automation code as a required approver. Hands-on experience on at least one of the performance testing tools (e.g., JMeter, K6, LoadRunner, NeoLoad, etc.). Able to understand and explain CI/CD concepts. Able to understand and explain accessibility and security testing. Experience on a cloud platform (e.g., Azure, AWS, GCP) Fundamental understanding of Gen AI, Gen AI tools (e.g., GitHub CoPilot) Language requirement: English ___________________________ This position provides mentorship and expertise in technologies and processes for Information Services Management (ISM) and Quality Assurance (QA). He/She maintains an awareness of emerging technologies to ensure a competitive advantage. This position automates test scenarios and expected outcomes. He/She provides expertise for UPS key business functions and supporting technologies. This position applies a comprehensive knowledge of technical skills, principles, practices, and procedures of testing methodologies and working knowledge in planning, designing, and conducting QA reviews and inspections. This position conducts comprehensive testing and risk-based assessments of the testing objects. He/She uses source documentation as input and contributes to the planning and implementation of testing activities.
This position leads testing components of large and complex projects, assigns tasks, provides direction to resources, and reports progress to project stakeholders. He/She creates and selects tools and methodologies for review and approval by management. Responsibilities Conducts quality assessment (QA) development processes. Develops test solutions. Provides expertise in testing across the QA organization. Develops and implements new practices and testing standards. Contributes to project design. Qualifications Bachelor's Degree or International equivalent Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics or related field - Preferred Experience with both web and client/server based testing Contract Type: Permanent (CDI). At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description Job Summary Experience using a test automation tool like Playwright/JavaScript/TypeScript or /Selenium/Cucumber/Java or Selenium/C#.net/Spec Flow or similar frameworks. Hands-on experience in software testing and writing test automation code in at least one programming or scripting language (JavaScript, TypeScript, Java, C# and/or Python) Hands-on experience in developing and executing test scripts for REST or SOAP based APIs. Experience in intercepting and mocking complex HTTP API calls in UI and API tests. Experience in messaging systems (AMQ, WMQ or Kafka). Experience in functional and end to end test scripts development. Experience with at least one test management and defect tracking tool like Jira, Azure DevOps, HP ALM, etc. Experience in defect management strategy (create, set priority/severity, retest, close) Coordinate, contribute to and facilitate implementation of different testing types (Unit, integration/component, UI, API, E2E, etc.) Hands-on experience in writing and executing complex SQL queries and understands concepts like indexing, schemas, views, etc. Experience understanding and following advanced level version control concepts (add, delete, branching strategy) Experience on version control tool and hands-on experience on commands and operations (like commit, fetch, push, pull, squash, resolve merge conflicts etc.) Perform peer-reviews of automation code as a required approver. Hands-on experience on at least one of the performance testing tools (e.g., JMeter, K6, LoadRunner, NeoLoad, etc.). Able to understand and explain CI/CD concepts. Able to understand and explain accessibility and security testing. Experience on a cloud platform (e.g., Azure, AWS, GCP) Fundamental understanding of Gen AI, Gen AI tools (e.g., GitHub CoPilot) Language requirement: English ___________________________ This position provides mentorship and expertise in technologies and processes for Information Services Management (ISM) and Quality Assurance (QA). He/She maintains an awareness of emerging technologies to ensure a competitive advantage. This position automates test scenarios and expected outcomes. He/She provides expertise for UPS key business functions and supporting technologies. This position applies a comprehensive knowledge of technical skills, principles, practices, and procedures of testing methodologies and working knowledge in planning, designing, and conducting QA reviews and inspections. This position conducts comprehensive testing and risk-based assessments of the testing objects. He/She uses source documentation as input and contributes to the planning and implementation of testing activities. This position leads testing components of large and complex projects, assigns tasks, provides direction to resources, and reports progress to project stakeholders. He/She creates and selects tools and methodologies for review and approval by management. 
Responsibilities Conducts quality assessment (QA) development processes. Develops test solutions. Provides expertise in testing across the QA organization. Develops and implements new practices and testing standards. Contributes to project design. Qualifications Bachelor's Degree or International equivalent Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics or related field - Preferred Experience with both web and client/server based testing Employee Type Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description: Experience in designing and developing a new test automation framework from scratch using a test automation tool like Playwright/JavaScript/TypeScript or Selenium/Cucumber/Java or Selenium/C#.net/Spec Flow or similar frameworks. Hands-on experience in software testing and writing test automation code in at least one programming or scripting language (JavaScript, TypeScript, Java, C# and/or Python) and guide team members on technical front. Hands-on experience in developing and executing test scripts for REST or SOAP based APIs. Experience in intercepting and mocking complex HTTP API calls in UI and API tests. Experience in messaging systems (AMQ, WMQ or Kafka). Experience in writing and guiding teams on complex end-to-end tests automation. Experience on test management and defect tracking tools like Jira, Azure DevOps, HP ALM, etc. Experience in driving defect management strategy (create, set priority/severity, retest, close, traceability, reporting etc.) Experience in driving the team to implement automated test suites for all testing types (unit, component, API, UI, UAT, E2E, etc.) Hands-on experience in writing and executing complex SQL queries and understands concepts like indexing, schemas, views, etc. Experience in training team members to understand version control concepts. Perform lead-level automation code review as a required approver. Experience in guiding the team on version control concepts, tools and hands-on experience on commands and operations (like commit, fetch, push, pull, squash, resolve merge conflicts etc.) Experience in establishing branching strategy and best practices for automation code review process (in coordination with Engg. lead, DevSecOps) Hands-on experience on at least one of the performance testing tools (e.g., JMeter, K6, LoadRunner, NeoLoad, etc.). Experience in analyzing performance testing requirements and implementation of load, stress, endurance, volume testing etc. Experience in leading the team on CI/CD pipeline implementation for automated tests suites. (in coordination with DevSecOps team) Experience in creating test plan for accessibility and security testing. Hands-on experience on cloud platforms (e.g., Azure, AWS, GCP) Understanding of Gen AI, Gen AI tools (e.g., GitHub CoPilot) and experience in leveraging Gen AI in quality engineering space. Language requirement: English ________________________________________________________________________ The Lead Software Quality Engineer establishes plans and objectives for the Quality Assurance (QA) team, leads and develops staff, and ensures resources are effectively utilized for meeting business goals. He/She drives team results, including quality of work, timeliness, and budgetary goals.
This position serves as a technical resource and provides expertise in key UPS business functions and supporting technologies. He/She collaborates with management to plan, coordinate, schedule, and manage QA resources. This position leads the improvement of work procedures and processes. He/She monitors project budget, timelines, and resource allocation (e.g., team members, contractors, vendors, etc.). The Lead Software Quality Engineer interacts with staff, customers, Information Services (I.S.) management, internal and external networks, QA professionals, and vendors to ensure effective integration between different functions, units, and teams. He/She contributes to initial project design phases, provides testing expertise, and develops test plans and strategies for projects. This position leads testing and development staff within the testing organization. Contract type: Permanent (CDI). At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
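The interception and mocking requirement above is easiest to picture with a small example. The sketch below uses Playwright's Python binding (the posting also lists JavaScript/TypeScript and Selenium stacks); the app URL, API path, and response payload are hypothetical placeholders, not details from the posting.

```python
from playwright.sync_api import sync_playwright

def mock_quotes(route):
    # Fulfil the intercepted request with a canned JSON body instead of
    # letting it reach the real backend.
    route.fulfill(
        status=200,
        content_type="application/json",
        body='{"items": [{"id": 1, "price": 42.0}]}',
    )

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    # Any request matching the pattern is routed to the mock handler.
    page.route("**/api/quotes*", mock_quotes)
    page.goto("https://app.example.com/dashboard")   # hypothetical app URL
    page.wait_for_selector("text=42.0")              # UI should render the mocked price
    browser.close()
```

Mocking at the network layer like this keeps UI tests deterministic even when the real API is slow or unavailable.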

Posted 2 days ago

Apply

4.0 years

0 Lacs

India

On-site

GlassDoor logo

At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You’ll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world. A Day in the Life At Medtronic, we push the limits of what technology can do to make tomorrow better than yesterday, and that makes it an exciting and rewarding place to work. We value what makes you unique. Be a part of a company that thinks differently to solve problems, make progress, and deliver meaningful innovations. As Sr. Database Architect, you will be part of our Global DBA team responsible for providing design and support. The Database Architect role is pivotal in establishing our Database as a Service model for cloud-based and on-premises database deployments. The Database Architect works as part of a team of database architects, solution architects, engineers, and business customers to bring industry best practices to database design, provisioning, automation, security, reliability, and availability. This role will work with a global IT team to engineer solutions to complex business problems while leveraging open source and traditional databases. Your commitment to driving and managing high-quality execution, operational excellence, and delivering technology solutions that meet business needs and optimize customer experience will have a direct impact on the organization and, ultimately, affect the lives of millions. We believe that when people from different cultures, genders, and points of view come together, innovation is the result — and everyone wins. Medtronic walks the walk, creating an inclusive culture where you can thrive. Responsibilities may include the following, and other duties may be assigned: Participate in a global team of Database Architects, Engineers, and Administrators to provide technical solutions to projects that engage MDT database platforms. Provide hands-on technical support across the various on-premises and cloud-based database offerings. Participate in the design of our Database as a Service solutions for cloud-based database offerings. Participate in the design of our automated provisioning solutions for on-premises and cloud-based database offerings using modern automation tools. Partner with various IT Infrastructure teams to fulfill project needs while providing exceptional customer outcomes. Provide technical leadership and governance for the big data team, ensuring the implementation of solution architecture within the Mongo & Hadoop ecosystem. Design and implement scalable MongoDB architecture (replica sets, sharding, high availability; illustrated in the sketch below). Ensure data security and compliance with industry standards, including Kerberos integration and encryption. Provide architectural oversight of the administration, configuration, and maintenance of Mongo & Hadoop clusters and associated databases, both on-premises and cloud-based (AWS). Apply experience with database monitoring technologies for on-premises and cloud-based databases, including Prometheus, Grafana, SQL Studio, SolarWinds, RDS Console, CloudWatch, Ambari, Cloudera Manager, or custom scripts, to track performance metrics and detect issues. Apply experience with desktop client tools including MongoDB Database Tools, pgAdmin, DBeaver, HeidiSQL, Navicat, SQL Developer, SQL Studio, Toad, etc. Apply experience with database security products and techniques including auditing, encryption, virtual private databases, row-level security, etc. 
Apply experience with enterprise backup and recovery tools and techniques for providing full, incremental, online, offline, and transaction log backups. Work well with IT teams and business partners to identify and implement opportunities to improve database performance, reliability, scalability, and availability. Willingness to contribute, learn, and grow as a member of a team that supports a variety of technologies. Required Knowledge and Experience: Bachelor’s Degree Minimum of 4-5 years as a MongoDB DBA/Architect, with strong knowledge of MongoDB internals, the aggregation framework, and indexing, and experience with MongoDB Atlas. Minimum of 5+ years as a Hadoop DBA/Architect, with experience with Hadoop distributions (Cloudera, Hortonworks, or Apache) and strong knowledge of Hadoop ecosystem components: HDFS, YARN, Hive, HBase, Spark, Oozie, ZooKeeper, etc. Minimum of 2 years’ experience with open source or cloud-based databases. Nice to Have Recent experience with Apache Hadoop versions 3.4 and 3.3 Proficiency in Hadoop ecosystem tools like Pig, Hive, HBase, and Oozie Experience integrating MongoDB with big data ecosystems (Kafka, Spark, etc.) is a plus. Recent experience with MariaDB, PostgreSQL, Snowflake, SQL Server, or Oracle databases in addition to Hadoop & MongoDB. Recent experience with capacity planning and estimating the requirements for Mongo & Hadoop clusters. Recent experience deploying Hadoop clusters from Apache source or distributions such as Cloudera. Recent experience managing the size of Hadoop clusters based on the data to be stored in HDFS. Recent experience deploying MongoDB clusters using MongoDB Ops Manager or the Atlas CLI. Ability to manage and review Hadoop & Mongo log files. Recent experience with IAM tools and automated security provisioning for on-premises and cloud-based database technologies. Recent experience with automated database provisioning using Terraform, CloudFormation, etc. Recent experience with scripting languages like MongoDB Shell, Perl, PowerShell, BASH, KSH, etc. Recent experience with coding languages Java, Python, JavaScript, and SQL. Recent experience provisioning databases on Microsoft Azure, Amazon EC2, or Amazon RDS. Recent experience with modern backup approaches and tools, including NetBackup or Data Domain, for on-premises and cloud-based databases. Significant database development or support experience with application development and implementation teams. Recent experience with DevOps and software engineering technologies. Proficiency in Unix, Linux, and Windows Server operating systems Knowledge of SQL commands for data retrieval and manipulation Understanding of various DBMS types (i.e., relational, columnar, non-relational) Proficiency in installation, configuration, and maintenance of different DBMS platforms Proficiency in data migration between different database systems Physical Job Requirements The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position. Benefits & Compensation Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage. 
This position is eligible for a short-term incentive called the Medtronic Incentive Plan (MIP). About Medtronic We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. We are engineers at heart— putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.
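As a rough illustration of the MongoDB architecture work described above (replica sets, sharding, indexing), here is a minimal sketch using pymongo against a sharded cluster; the host, database, collection, and shard key are assumptions for illustration, not Medtronic specifics.

```python
from pymongo import ASCENDING, MongoClient

# Assumes a mongos router in front of an already-configured sharded cluster;
# the connection string is a placeholder.
client = MongoClient("mongodb://mongos.example.internal:27017")

# Enable sharding for the database, then shard the collection on a hashed
# key so inserts distribute evenly across shards.
client.admin.command("enableSharding", "telemetry")
client.admin.command("shardCollection", "telemetry.events",
                     key={"device_id": "hashed"})

# Compound index supporting the most common query pattern
# (lookups by device over a time range).
client["telemetry"]["events"].create_index(
    [("device_id", ASCENDING), ("recorded_at", ASCENDING)]
)
print(client["telemetry"]["events"].index_information())
```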

Posted 2 days ago

Apply

6.0 years

3 - 15 Lacs

India

On-site

GlassDoor logo

Immediate Joiners Only (Less than 15 days’ Notice Period) Location: Madhapur, Hyderabad Contact: hr@happivet.ai No of Positions: 2 We are seeking an experienced Senior Software Engineer to join our Vet Healthcare Technology team. In this role, you will design, develop, and maintain cloud-native applications on Azure that power our Practice Management platform. You’ll collaborate closely with cross-functional teams—clinical SMEs, architects, QA, and DevOps—to deliver robust, scalable, and secure solutions utilizing .NET 8, React, and modern Azure services. Key Responsibilities Architecture & Design Lead design discussions and apply proven design patterns (e.g., CQRS, Repository, Factory) to ensure clean, maintainable code. Define microservice boundaries and integration strategies (APIs, messaging) for HL7 and FHIR data flows. Development & Integration Build backend services in .NET 8, leveraging Azure Functions, Logic Apps, Service Bus, API Gateway, and Storage services. Develop responsive front-end interfaces using React, TypeScript, and state-management libraries (e.g., Redux or Context API). Implement data persistence layers for SQL Server and PostgreSQL, including schema design, stored procedures, and performance tuning. Integrate with healthcare standards (HL7 v2/v3, FHIR R4) and third-party systems via secure, high-throughput interfaces. Quality & Compliance Write unit and integration tests to ensure code quality; participate in code reviews and pair-programming sessions. Follow best practices for security, privacy, and compliance in healthcare (HIPAA, GDPR, etc.). Mentorship & Collaboration Mentor mid-level engineers, drive knowledge-sharing sessions, and contribute to technical roadmaps. Work in an Agile/Scrum environment: estimate user stories, attend sprint ceremonies, and deliver incremental value. Required Qualifications B.Tech / BE (CSE/IT/ECE/EEE) 6+ years of professional experience developing enterprise applications with .NET (latest versions, ideally .NET 8). Proficiency in React (with hooks and modern toolchains), HTML5/CSS3, and JavaScript/TypeScript. Hands-on experience with Azure cloud services: Functions, Logic Apps, Storage Accounts, Service Bus, Key Vault, API Gateway, OAuth2/OpenID. Strong SQL skills: T-SQL for SQL Server, PL/pgSQL for PostgreSQL, indexing, and query optimization. Experience integrating with EHR/EMR systems and healthcare messaging standards: HL7 (v2/v3) and FHIR (R4). Solid understanding of Docker and container-based deployment workflows. Deep grasp of software design principles, SOLID, and common design patterns. Excellent problem-solving skills, communication, and the ability to work collaboratively in distributed teams. Preferred Qualifications Prior experience in the healthcare industry. Contributions to open-source projects or active participation in developer communities. Job Type: Full-time Pay: ₹337,562.45 - ₹1,522,451.85 per year Benefits: Health insurance Provident Fund Location Type: In-person Schedule: Day shift Work Location: In person Speak with the employer +91 6309841855
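For the HL7/FHIR integration called out above, a minimal sketch of a FHIR R4 REST interaction might look like the following; the base URL, token handling, and identifier system are placeholders, since real EHR/EMR integrations will differ.

```python
import requests

# Hypothetical FHIR R4 endpoint; production systems typically require an
# OAuth2 bearer token (e.g., SMART on FHIR or client-credentials flow).
FHIR_BASE = "https://fhir.example-clinic.com/r4"
HEADERS = {
    "Authorization": "Bearer <access-token>",
    "Accept": "application/fhir+json",
}

# Search for a Patient by a (placeholder) medical-record-number identifier.
resp = requests.get(f"{FHIR_BASE}/Patient",
                    params={"identifier": "urn:example:mrn|12345"},
                    headers=HEADERS)
resp.raise_for_status()
bundle = resp.json()

# For each matching patient, pull their Encounter resources.
for entry in bundle.get("entry", []):
    patient_id = entry["resource"]["id"]
    encounters = requests.get(f"{FHIR_BASE}/Encounter",
                              params={"patient": patient_id},
                              headers=HEADERS).json()
    print(patient_id, "encounters:", encounters.get("total"))
```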

Posted 2 days ago

Apply

5.0 - 9.0 years

9 - 10 Lacs

Hyderābād

On-site

GlassDoor logo

About the Role: Grade Level (for internal use): 10 The Role: Senior Scrum Master The Team: The team is focused on agile product development offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution. The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances the overall team performance. What’s in it for you: Opportunity to lead and drive Agile transformation within a leading global organization. Engage with a dynamic team committed to delivering high-quality solutions. Access to professional development and growth opportunities within S&P Global. Work in a collaborative and innovative environment that values continuous improvement. Responsibilities and Impact: Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews. Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery. Manage scope changes, risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions. Support the team in defining and achieving sprint goals and objectives. Foster a culture of collaboration and transparency within the team and across stakeholders. Encourage and support the development of team members, mentoring them in Agile best practices. Conduct data analysis and create and interpret metrics for team performance tracking and improvement. Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs. Collaborate with stakeholders to help translate business requirements into technical specifications. Ensure adherence to Agile best practices and participate in Scrum events. Lead initiatives to improve team efficiency and effectiveness in project delivery. What We’re Looking For: Basic Required Qualifications: Bachelor's degree in a relevant field or equivalent work experience. Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team. Strong understanding of Agile methodologies, particularly Scrum and Kanban. Excellent communication and interpersonal skills. Proficiency in business analysis: Experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions. Requirement gathering expertise: Ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance. Basic understanding of SQL queries: Ability to comprehend and analyze existing SQL queries to identify areas for performance improvement. Fundamental understanding of database structure: Awareness of database concepts including normalization, indexing, and schema design to assess query performance. Additional Preferred Qualifications: Certified Scrum Master (CSM) or similar Agile certification. Experience with Agile tools such as Azure DevOps, JIRA, or Trello. Proven ability to lead and influence teams in a dynamic environment. 
Familiarity with the software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud. Experience in project management and stakeholder engagement. Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family-Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. 
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316176 Posted On: 2025-06-25 Location: Hyderabad, Telangana, India
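To make the query-analysis and indexing qualifications listed above concrete, here is a tiny, self-contained sketch using Python's built-in sqlite3 module (any RDBMS with an EXPLAIN facility behaves similarly); the table and column names are invented for illustration only.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, ticker TEXT, "
            "traded_on TEXT, qty INTEGER)")

query = ("SELECT SUM(qty) FROM trades "
         "WHERE ticker = 'ACME' AND traded_on >= '2025-01-01'")

# Without a supporting index the planner reports a full table scan.
for row in con.execute("EXPLAIN QUERY PLAN " + query):
    print(row)   # e.g. (..., 'SCAN trades')

# A composite index on the filtered columns lets the planner switch to an
# index search, which is the kind of improvement query analysis surfaces.
con.execute("CREATE INDEX idx_trades_ticker_date ON trades (ticker, traded_on)")
for row in con.execute("EXPLAIN QUERY PLAN " + query):
    print(row)   # e.g. (..., 'SEARCH trades USING INDEX idx_trades_ticker_date ...')
```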

Posted 2 days ago

Apply

7.0 years

8 - 10 Lacs

Hyderābād

On-site

GlassDoor logo

Job Description: Key Responsibilities: Data Engineering & Architecture: Design, develop, and maintain high-performance data pipelines for structured and unstructured data using Azure Databricks and Apache Spark. Build and manage scalable data ingestion frameworks for batch and real-time data processing. Implement and optimize data lake architecture in Azure Data Lake to support analytics and reporting workloads. Develop and optimize data models and queries in Azure Synapse Analytics to power BI and analytics use cases. Cloud-Based Data Solutions: Architect and implement modern data lakehouses combining the best of data lakes and data warehouses. Leverage Azure services like Data Factory, Event Hub, and Blob Storage for end-to-end data workflows. Ensure security, compliance, and governance of data through Azure Role-Based Access Control (RBAC) and Data Lake ACLs. ETL/ELT Development: Develop robust ETL/ELT pipelines using Azure Data Factory, Databricks notebooks, and PySpark. Perform data transformations, cleansing, and validation to prepare datasets for analysis. Manage and monitor job orchestration, ensuring pipelines run efficiently and reliably. Performance Optimization: Optimize Spark jobs and SQL queries for large-scale data processing. Implement partitioning, caching, and indexing strategies to improve the performance and scalability of big data workloads (illustrated in the sketch below). Conduct capacity planning and recommend infrastructure optimizations for cost-effectiveness. Collaboration & Stakeholder Management: Work closely with business analysts, data scientists, and product teams to understand data requirements and deliver solutions. Participate in cross-functional design sessions to translate business needs into technical specifications. Provide thought leadership on best practices in data engineering and cloud computing. Documentation & Knowledge Sharing: Create detailed documentation for data workflows, pipelines, and architectural decisions. Mentor junior team members and promote a culture of learning and innovation. Required Qualifications: Experience: 7+ years of experience in data engineering, big data, or cloud-based data solutions. Proven expertise with Azure Databricks, Azure Data Lake, and Azure Synapse Analytics. Technical Skills: Strong hands-on experience with Apache Spark and distributed data processing frameworks. Advanced proficiency in Python and SQL for data manipulation and pipeline development. Deep understanding of data modeling for OLAP, OLTP, and dimensional data models. Experience with ETL/ELT tools like Azure Data Factory or Informatica. Familiarity with Azure DevOps for CI/CD pipelines and version control. Big Data Ecosystem: Familiarity with Delta Lake for managing big data in Azure. Experience with streaming data frameworks like Kafka, Event Hub, or Spark Streaming. Cloud Expertise: Strong understanding of Azure cloud architecture, including storage, compute, and networking. Knowledge of Azure security best practices, such as encryption and key management. Preferred Skills (Nice to Have): Experience with machine learning pipelines and frameworks like MLflow or Azure Machine Learning. Knowledge of data visualization tools such as Power BI for creating dashboards and reports. Familiarity with Terraform or ARM templates for infrastructure as code (IaC). Exposure to NoSQL databases like Cosmos DB or MongoDB. 
Experience with data governance. Weekly Hours: 40 Time Type: Regular Location: Hyderabad, Andhra Pradesh, India It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
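As a loose sketch of the partitioning and caching strategies mentioned above, the PySpark snippet below reads raw JSON from a placeholder ADLS path, caches an intermediate frame that feeds two aggregations, and writes date-partitioned Parquet; paths, columns, and container names are assumptions, not the actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

# Hypothetical lake layout: raw JSON lands in one container, curated Parquet in another.
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

cleaned = (raw.withColumn("event_date", F.to_date("event_ts"))
              .dropDuplicates(["event_id"]))

# Cache once because two separate aggregations reuse the same cleaned frame.
cleaned.cache()

daily = cleaned.groupBy("event_date", "event_type").count()
errors = cleaned.filter(F.col("status") == "ERROR").groupBy("event_date").count()

# Partitioning the output by date keeps downstream date-range queries cheap,
# because the engine can prune whole partitions instead of scanning everything.
base = "abfss://curated@examplelake.dfs.core.windows.net"
daily.write.mode("overwrite").partitionBy("event_date").parquet(f"{base}/daily_counts/")
errors.write.mode("overwrite").partitionBy("event_date").parquet(f"{base}/daily_errors/")

cleaned.unpersist()
```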

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Cyara is the world’s leading Automated CX Assurance Platform provider, enabling leading brands across the globe to build better customer experiences faster. Through automated testing and monitoring, Cyara accelerates the delivery of flawless customer journeys across digital and voice channels while reducing the risk of customer-facing defects. Every day, the most recognizable brands, including Airbnb and NAB, trust Cyara to deliver customer smiles at scale. Our promise is Customer Smiles - Delivered at Scale, and as a member of Cyara’s team, you’ll be given the opportunity to bring that mission to fruition alongside our amazing community of fun-loving forward thinkers. Interested to find out more about us? Check out: www.cyara.com Want to know what it’s really like to join Cyara? Check out this link to meet some real Cyarans and read about their individual career journeys with us: https://cyara.com/employee-profiles/ Cyara’s Values: At Cyara, our values shape everything we do. We're passionate about Delivering Excellence by putting the customer first, collaborating globally, and always striving to improve. We take smart risks and Innovate Boldly, setting new standards and learning from every experience. Integrity First is our cornerstone—we value humility, authenticity, and respect for diversity, building trust in all we do. We Embrace Curiosity by empowering you to experiment, learn, and grow in a dynamic environment. At Cyara, our values drive us forward, shaping a culture where innovation and excellence thrive. Cyara’s Diversity, Equity, Inclusion and Belonging: At Cyara, we are dedicated to fostering a workplace that embodies equal opportunity and champions diversity, equity, inclusion, and belonging (DEIB). We strive to cultivate an environment where every individual feels valued, respected, and empowered to bring their whole selves to work, contributing unique perspectives and talents. Our commitment includes continuously evaluating and enhancing our policies, practices, and culture to align with our DEIB principles. We ensure a discrimination-free environment where individuals are evaluated solely on their merits and abilities, regardless of legally protected statuses such as sex, race, color, ethnicity, national origin, age, religion, disability, sexual orientation, gender identity, veteran status, or medical condition. By celebrating our differences and championing inclusivity, we enrich our organization, make more thoughtful decisions, and drive collective success. Roles & responsibilities Position: We are currently hiring a Back-End Developer to design, build, and maintain a backend API that provides various functionality to end users. The backend is built on CockroachDB and Python Flask. The successful candidate will be responsible for designing and developing new API endpoints based on input from different teams and projects. In addition, you will also work on integrating, maintaining, and training AI/ML models focused on audio quality and the telephony environment. Technical Requirements: Strong command of Python (preferably 3.x) with experience in writing clean, modular, and maintainable code. Familiarity with Python libraries commonly used in Flask projects (e.g., requests and others relevant to your stack). Proficiency with SQLAlchemy. Experience designing and managing database schemas and relationships (e.g., one-to-many, many-to-many) and performing migrations using Alembic. 
Strong understanding of relational databases (e.g., MySQL) and the ability to write efficient SQL queries. Experience with Redis for caching, session management, or real-time data processing. Experience with machine learning workflows, including model training, evaluation, and deployment. Knowledge of database optimization, indexing, and transaction management. Understanding of Flask concepts such as routing and request handling. Solid grasp of HTTP protocols, RESTful API development, and web application architecture. Experience with API testing tools (e.g., Postman, curl) and debugging. Familiarity with version control using Git (GitHub). Familiarity with deployment tools and platforms (e.g., Docker, AWS). Knowledge of testing frameworks like pytest and unittest for unit and integration testing. Experience required for this role: 5+ years of experience in backend development 3+ years of experience with Python Strong problem-solving skills and the ability to work collaboratively in a team environment Hands-on machine learning experience, with a focus on NLP and deep learning. Ability to work with AI/ML models and run them in a Docker environment. Understanding of audio quality parameters and audio measurement techniques. Strong knowledge of databases, including performance tuning and optimization. Proficiency with version control systems, particularly Git (GitHub). Solid understanding of Docker. Experience working in Unix/Linux environments. Excellent written and verbal communication skills in English. Why you should join us: At Cyara you’ll have the opportunity to work with a group of people who share common goals, are driven by a similar passion, and value the expertise of their peers. Cyara is committed to being an equal opportunity employer, focused on building and maintaining a diverse, inclusive and authentic workplace, and a work environment that is free from discrimination and harassment based upon any legally protected status or protected characteristic, including but not limited to an individual's sex, race, color, ethnicity, national origin, age, religion, disability, sexual orientation, veteran status, gender identity, or pregnancy. At Cyara we appreciate and welcome the fact that our culture is living and growing as we continue to evolve over time. With this opportunity comes the chance to enjoy a flexible work environment, competitive compensation, and a work culture that's results-oriented, fast-paced, and focused on continuous improvement, whilst maintaining a family-first, team-oriented, and ever-positive atmosphere. Cyara cares for its own - you’ll feel that on your first day - and you'll get the chance to work for a global, growing company and an all-inclusive team of innovators. Interested? Know someone who might be? Apply online now. Cyara is a Global Circle Back Initiative Employer - we commit to responding to every applicant. Agencies: Thanks, but we’ve got this one! Please, no phone calls or emails to any employees of Cyara outside of the Talent Acquisition team. Cyara’s policy is to only accept resumes from agencies via the Cyara Agency Portal. Agencies must have a valid fee agreement in place, and they must have been assigned the specific requisition to which they submit resumes by the Cyara Talent Acquisition team before submitting any CVs. Any resume submitted outside of this process will be deemed the sole property of Cyara and, in the event a candidate submitted outside of this policy is hired, no fee or payment of any kind will be paid.
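A minimal sketch of the kind of Flask + SQLAlchemy + Redis endpoint this role describes is shown below; the model, route, connection strings, and the CockroachDB dialect package (sqlalchemy-cockroachdb) are assumptions made for illustration, not details of Cyara's actual service.

```python
import json

import redis
from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# Placeholder URI; connecting to CockroachDB assumes the sqlalchemy-cockroachdb dialect.
app.config["SQLALCHEMY_DATABASE_URI"] = "cockroachdb://app@db.example.internal:26257/calls"
db = SQLAlchemy(app)
cache = redis.Redis(host="redis.example.internal", port=6379, decode_responses=True)

class CallRecord(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    campaign = db.Column(db.String(64), index=True)   # indexed filter column
    mos_score = db.Column(db.Float)

@app.get("/api/campaigns/<campaign>/quality")
def campaign_quality(campaign):
    cache_key = f"quality:{campaign}"
    cached = cache.get(cache_key)
    if cached is not None:
        return jsonify(json.loads(cached))             # serve the Redis-cached result
    rows = CallRecord.query.filter_by(campaign=campaign).all()
    payload = {
        "campaign": campaign,
        "avg_mos": sum(r.mos_score for r in rows) / len(rows) if rows else None,
    }
    cache.setex(cache_key, 60, json.dumps(payload))    # cache for 60 seconds
    return jsonify(payload)
```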

Posted 2 days ago

Apply

7.0 years

2 - 9 Lacs

Hyderābād

On-site

GlassDoor logo

Job Description: Key Responsibilities: Data Engineering & Architecture: Design, develop, and maintain high-performance data pipelines for structured and unstructured data using Azure Databricks and Apache Spark. Build and manage scalable data ingestion frameworks for batch and real-time data processing. Implement and optimize data lake architecture in Azure Data Lake to support analytics and reporting workloads. Develop and optimize data models and queries in Azure Synapse Analytics to power BI and analytics use cases. Cloud-Based Data Solutions: Architect and implement modern data lakehouses combining the best of data lakes and data warehouses. Leverage Azure services like Data Factory, Event Hub, and Blob Storage for end-to-end data workflows. Ensure security, compliance, and governance of data through Azure Role-Based Access Control (RBAC) and Data Lake ACLs. ETL/ELT Development: Develop robust ETL/ELT pipelines using Azure Data Factory, Databricks notebooks, and PySpark. Perform data transformations, cleansing, and validation to prepare datasets for analysis. Manage and monitor job orchestration, ensuring pipelines run efficiently and reliably. Performance Optimization: Optimize Spark jobs and SQL queries for large-scale data processing. Implement partitioning, caching, and indexing strategies to improve the performance and scalability of big data workloads. Conduct capacity planning and recommend infrastructure optimizations for cost-effectiveness. Collaboration & Stakeholder Management: Work closely with business analysts, data scientists, and product teams to understand data requirements and deliver solutions. Participate in cross-functional design sessions to translate business needs into technical specifications. Provide thought leadership on best practices in data engineering and cloud computing. Documentation & Knowledge Sharing: Create detailed documentation for data workflows, pipelines, and architectural decisions. Mentor junior team members and promote a culture of learning and innovation. Required Qualifications: Experience: 7+ years of experience in data engineering, big data, or cloud-based data solutions. Proven expertise with Azure Databricks, Azure Data Lake, and Azure Synapse Analytics. Technical Skills: Strong hands-on experience with Apache Spark and distributed data processing frameworks. Advanced proficiency in Python and SQL for data manipulation and pipeline development. Deep understanding of data modeling for OLAP, OLTP, and dimensional data models. Experience with ETL/ELT tools like Azure Data Factory or Informatica. Familiarity with Azure DevOps for CI/CD pipelines and version control. Big Data Ecosystem: Familiarity with Delta Lake for managing big data in Azure. Experience with streaming data frameworks like Kafka, Event Hub, or Spark Streaming (illustrated in the sketch below). Cloud Expertise: Strong understanding of Azure cloud architecture, including storage, compute, and networking. Knowledge of Azure security best practices, such as encryption and key management. Preferred Skills (Nice to Have): Experience with machine learning pipelines and frameworks like MLflow or Azure Machine Learning. Knowledge of data visualization tools such as Power BI for creating dashboards and reports. Familiarity with Terraform or ARM templates for infrastructure as code (IaC). Exposure to NoSQL databases like Cosmos DB or MongoDB. 
Experience with data governance. Weekly Hours: 40 Time Type: Regular Location: Hyderabad, Andhra Pradesh, India It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
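Since this posting also asks for streaming frameworks such as Kafka, Event Hub, or Spark Streaming, here is a rough Structured Streaming sketch; the broker, topic, schema, and checkpoint path are invented, the spark-sql-kafka connector is assumed to be available, and a production job would typically write to a Delta table rather than the console.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("event-stream-counts").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Placeholder broker/topic; Azure Event Hubs also exposes a Kafka-compatible endpoint.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker.example.internal:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers raw bytes; parse the JSON value into typed columns.
parsed = (stream
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Windowed counts per event type; the checkpoint makes the query restartable.
query = (parsed
         .withWatermark("event_ts", "10 minutes")
         .groupBy(F.window("event_ts", "5 minutes"), "event_type")
         .count()
         .writeStream
         .outputMode("update")
         .format("console")
         .option("checkpointLocation", "/tmp/checkpoints/event_counts")
         .start())

query.awaitTermination()
```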

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies