69353 SQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

8 - 15 Lacs

Pune

Hybrid

Naukri logo

Role: Developer
Location: Pune (Hybrid)
Experience: 3 to 9 yrs
Notice Period: Immediate joiners to 1 month (only candidates currently serving their notice period should apply)
Excellent communication skills required.
Mandatory skills (must appear in the roles and responsibilities): Data Platform, Java, Python, Spark, Kafka, cloud technologies (Azure/AWS), Databricks
Interested candidates, share your resume at dipti.bhaisare@in.experis.com

Posted 1 hour ago

Apply

6.0 - 10.0 years

13 - 18 Lacs

Chennai

Hybrid

Role: Cloud DB Administrator
Location: Chennai (Hybrid)
Experience: 5 to 8 yrs
Notice Period: Immediate joiners to 15 days (who can join in the month of June)
Primary skills: Azure and SQL; excellent communication skills
Role & responsibilities:
- Azure SQL installation and configuration on IaaS and PaaS
- High availability options and data migration options
- Security management, backup and recovery options
- DR planning, execution, and SQL scripting
- Monitoring and alerts
- Experience with database technologies such as PostgreSQL, MySQL, SQL Server, and Oracle
- Experience with database management tools for backups, recovery, snapshot management, sharding, and partitioning, as well as database performance tuning
- Good oral and written communication skills in English
Interested candidates, share your resume at dipti.bhaisare@in.experis.com

Posted 1 hour ago

Apply

3.0 - 8.0 years

12 - 19 Lacs

Pune

Hybrid

This role is only for Pune local candidates (not for relocation candidates).
Role: Data Engineer (C2H role)
Experience: 3-8 yrs
Location: Kharadi, Pune
Excellent communication skills
Notice Period: Immediate joiner to 1 month
Primary skills: Python, document intelligence, NLP, unstructured data extraction (OpenAI and prompt engineering desirable)
Secondary skills: Azure infrastructure experience and Databricks
Mandatory skills:
1. Data Infrastructure & Engineering: designing, building, productionizing, and maintaining scalable and reliable data infrastructure and data products; experience with data modeling, pipeline idempotency, and operational observability.
2. Programming languages: proficiency in one or more object-oriented languages such as Python, Scala, Java, or C#.
3. Database technology: strong experience with SQL and NoSQL databases, query structures and design best practices, and scalability, readability, and reliability in database design.
4. Distributed systems: experience implementing large-scale distributed systems in collaboration with senior team members.
5. Software engineering best practices: technical design and reviews; unit testing, monitoring, and alerting; code versioning, code reviews, and documentation; CI/CD pipeline development and maintenance.
6. Security & compliance: deploying secure and well-tested software and data assets; meeting privacy and compliance requirements.
7. Site reliability engineering: service reliability, on-call rotations, defining and maintaining SLAs; infrastructure as code and containerized deployments.
Job description: able to enrich data through transformation and joins with other datasets; analyze data and derive statistical insights; convey a story through data visualization; build data pipelines for diverse interfaces; good understanding of API workflows.
Technical skills: AWS Data Lake, AWS Data Hub, and the AWS cloud platform.
Interested candidates, share your resume at dipti.bhaisare@in.experis.com
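The "pipeline idempotency" requirement in this listing can be illustrated with a minimal sketch (hypothetical table and column names, using SQLite's upsert syntax): re-running the same load batch leaves the data unchanged instead of duplicating rows, which is what makes pipeline retries safe.

```python
import sqlite3

def load_events(conn, rows):
    """Idempotent load: replaying the same batch does not duplicate data."""
    conn.executemany(
        # ON CONFLICT turns the insert into an upsert keyed on the natural key,
        # so re-running a batch after a pipeline retry is harmless.
        "INSERT INTO events (event_id, amount) VALUES (?, ?) "
        "ON CONFLICT(event_id) DO UPDATE SET amount = excluded.amount",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id TEXT PRIMARY KEY, amount REAL)")
batch = [("e1", 10.0), ("e2", 20.0)]
load_events(conn, batch)
load_events(conn, batch)  # replayed batch: still 2 rows, not 4
count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # → 2
```

The same pattern applies to production warehouses (MERGE in most SQL dialects); the key design choice is keying every write on a stable natural or surrogate key.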

Posted 1 hour ago

Apply

3.0 - 5.0 years

6 - 8 Lacs

Gurugram

Hybrid

Role & responsibilities:
- Understand business requirements from the assigned PBI and get clarifications
- Break work down into simpler code units
- Develop code using AWS services such as EC2, IAM, KMS keys, Lambda, Batch, Terraform/CFT, EventBridge, Managed Kafka, Kinesis, Glue, and PySpark
- Ensure software systems are scalable, reliable, and efficient
- Protect data and applications by implementing robust security measures
- Create and execute unit test cases
- Promote code to higher environments using CI/CD releases; support QA and UAT testing
- Fix bugs; handle production deployment and post-deployment support
- Support production issues, troubleshooting, and optimizing for performance and efficiency
Preferred candidate profile:
- Education: Bachelor's (BE/B.Tech/MCA); English required
- 3-5 years of AWS ETL development experience
- Must have experience with AWS cloud, EC2, IAM, KMS keys, Lambda, Batch, Terraform/CFT, EventBridge, Managed Kafka, Kinesis, Glue, and PySpark
- Understanding of data modelling concepts
- Knowledge of Python and other programming languages
- Knowledge of different SQL/NoSQL data storage techniques
- SQL competence (query performance tuning, index management, etc.) and a grasp of database structure
- Experience working in Agile methodology and its ceremonies
- Passionate about sophisticated data structures and problem solving
- Ability to analyze and troubleshoot complicated data sets
- Design, automate, and support sophisticated data extraction, transformation, and loading applications
- Analytical abilities; good oral and written communication
- Experience with data warehousing and business intelligence (BI)
- Knowledge of industry technology trends
Perks and benefits:
- Transportation services: convenient and reliable commute options for a hassle-free journey to and from work
- Meal facilities: nutritious and delicious meals to keep you energized throughout the day
- Career growth opportunities: clear pathways for professional development and advancement within the organization
- Captive unit advantage: work in a stable, secure environment with long-term projects and consistent workflow
- Continuous learning: access to training programs, workshops, and resources to support your personal and professional growth
APPLY NOW: https://encore.wd1.myworkdayjobs.com/externalnew/job/Gurgaon---Candor-Tech-Space-IT---ITES-SEZ/Software-Engineer_HR-18559-1 and share your CV at Anjali.panchwan@mcmcg.com

Posted 1 hour ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Ahmedabad

Work from Office

Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as Rez.Tez, Affiliate.Travel, Designer Voyages, Designer Indya, RezRewards, and RezVault. With a presence in over 32 countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/
Profile: ETL Developer
- ETL tools (any one): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
- Workflow & orchestration (any one; good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
- Programming & scripting: SQL (advanced), Python (mandatory), Bash/Shell (mandatory), Java or Scala (optional, for Spark)
- Databases & data warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
- Cloud & data storage (any one or two): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub
Interested candidates, share your resume at shivani.p@rezlive.com

Posted 1 hour ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Noida

Work from Office

Salary: no bar for the right candidate. Greetings from Plasma Softech Pvt Ltd! We are hiring a Full Stack Developer (.NET Core + Angular).
Note: minimum 3 years' experience with good technical skills in .NET Core, Angular 2+, Web API, C#, and SQL; immediate joiners preferred.
Responsibilities: This is a full-stack developer role. You will work on front-end as well as back-end (web) applications implementing various role-based user flows and dashboards with a very intuitive UI. A strong Microsoft-based development background is required: .NET Core, ASP.NET Core, C#, Web API, SQL (MySQL & MS SQL), and web services.
Desired skillset:
- Strong knowledge of OOP, design patterns, and SOLID principles
- Deep knowledge of the .NET 3.5/4.0/4.5 Framework (.NET Core is good to have), including Visual Studio 2015/2017/2019 and NuGet Package Manager
- Strong experience in LINQ and ADO.NET/Entity Framework
- Experience with Angular 2 and above, and similar technologies
- Experience with XML, HTML5/CSS3, JSON, and HTTP/REST services
- Strong knowledge of software implementation best practices and code analysis tools
- Must have experience with Angular 2 and above and .NET Core
For more information, please visit our website, mentioned below the signature.
Thanks & Regards,
Tanu Kumari
HR Executive | Plasma Softech Pvt. Ltd.
p: 8287237618 e: tanuk@plasmacomp.com
B-151, First Floor, Sector 6, Noida 201301
www.plasmacomp.com | www.c2m.net

Posted 1 hour ago

Apply

5.0 - 10.0 years

15 - 17 Lacs

Hyderabad

Hybrid

Job Description: We are looking for a highly skilled and experienced .NET Full Stack Developer with strong expertise in Angular to join our team in Hyderabad. The ideal candidate should have a minimum of 5 years of hands-on experience in full-stack development using .NET technologies and modern front-end frameworks. Key Responsibilities: Develop, test, and maintain robust and scalable web applications using .NET Core and Angular. Design and implement RESTful APIs and Web APIs to support front-end functionality. Work extensively on Angular components, services, and modules for dynamic and responsive UI. Write optimized and complex SQL queries for data access and manipulation in SQL Server. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions. Participate in code reviews, technical discussions, and software architecture decisions. Work with CI/CD pipelines and GitLab for version control and deployment automation. Troubleshoot, debug, and upgrade existing software and suggest improvements. Required Skills: Strong proficiency in .NET Core and C#. Hands-on experience with Angular (version 8 or above). Excellent understanding of REST API/Web API development. Solid experience in SQL Server, including writing queries, stored procedures, and functions. Familiarity with DevOps tools, especially GitLab CI/CD pipelines. Good understanding of software development life cycle (SDLC) and agile methodologies.

Posted 1 hour ago

Apply

6.0 - 11.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Primary responsibilities:
- Develop visual reports, dashboards, and KPI scorecards using Power BI Desktop
- Build Analysis Services reporting models
- Connect to data sources, importing and transforming data for business intelligence
- Implement row-level security on data and understand application security layer models in Power BI
- Integrate Power BI reports into other applications using embedded analytics such as Power BI service (SaaS) or API automation
- Use advanced calculations on the dataset
- Design and develop Azure-based data-centric applications to manage a large healthcare data application
- Design, build, test, and deploy streaming pipelines for data processing in real time and at scale
- Create ETL packages
- Use Azure cloud services for ingestion and data processing
- Own feature development using Microsoft Azure native services such as App Service, Azure Functions, Azure Storage, Service Bus queues, Event Hubs, Event Grid, Application Gateway, Azure SQL, Azure Databricks, etc.
- Identify opportunities to fine-tune and optimize applications running on Microsoft Azure: cost reduction, adoption of cloud best practices, and data and application security covering scalability and high availability
- Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting aspects of Microsoft Azure
- Focus on automation using Infrastructure as Code (IaC), Jenkins, Azure DevOps, Terraform, etc.
- Communicate effectively with other engineers and QA
- Establish, refine, and integrate development and test environment tools and software as needed
- Identify production and non-production application issues
This is a Senior Cloud Data Engineer position requiring about 7+ years of hands-on technical experience in data processing, reporting, and cloud technologies, with working knowledge of executing projects in Agile methodologies.
Required skills:
1. Envision the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
2. Design and develop the framework of the system and be able to explain the choices made; write and review design documents explaining the overall architecture, framework, and high-level design of the application.
3. Create, understand, and validate the design and estimated effort for a given module/task, and be able to justify it.
4. Define in-scope and out-of-scope items and the assumptions taken while creating effort estimates.
5. Identify and integrate all integration points in the context of the project as well as other applications in the environment.
6. Understand the business requirements and develop data models.
Technical skills:
1. Strong proficiency as a Cloud Data Engineer utilizing Power BI and Azure Databricks to support, design, develop, and deploy requested updates to new and existing cloud-based services.
2. Experience developing, implementing, monitoring, and troubleshooting applications in the Azure public cloud.
3. Proficiency in data modeling and reporting.
4. Design and implementation of database schemas.
5. Design and development of well-documented source code.
6. Development of both unit-testing and system-testing scripts to be incorporated into the QA process.
7. Automation of all deployment steps with Infrastructure as Code (IaC) and Jenkins Pipeline as Code (JPaC) concepts.
8. Definition of guidelines and benchmarks for NFR considerations during project implementation.
9. Required POCs to ensure the suggested design/technologies meet the requirements.
Required experience:
- 5 to 10+ years of professional experience developing SQL, Power BI, SSIS, and Azure Databricks solutions
- 5 to 10+ years of professional experience utilizing SQL Server for data storage in large-scale .NET solutions
- Strong technical writing skills
- Strong knowledge of build/deployment/unit-testing tools
- Highly motivated team player and self-starter
- Excellent verbal, phone, and written communication skills
- Knowledge of cloud-based architecture and concepts
Required qualifications: Graduate or Postgraduate in Computer Science/Engineering/Science/Mathematics or a related field with around 10 years of experience executing data reporting solutions; cloud certification, preferably Azure

Posted 1 hour ago

Apply

3.0 - 7.0 years

20 - 35 Lacs

Bengaluru

Hybrid

Key responsibilities:
- Develop and deploy personalised pricing models using historical behaviour, purchase intent, segmentation, and contextual data
- Apply advanced statistical and machine learning techniques to estimate demand curves and user-level price sensitivity
- Design and execute pricing A/B tests, analyzing lift, revenue impact, and user experience trade-offs
- Develop dynamic pricing frameworks that adjust in real time based on inputs such as location, time, inventory, and user cohorts
- Collaborate with engineering teams to integrate models into pricing engines and user-facing platforms
- Communicate findings clearly to business stakeholders and make data-backed pricing recommendations
Must-have qualifications:
- 3-6 years of experience in data science, pricing, or quantitative strategy roles
- Strong programming skills in Python and SQL; experience with libraries like scikit-learn, statsmodels, or XGBoost
- Deep knowledge of pricing analytics, revenue management, and behavioral economics
- Experience building predictive models for conversion, elasticity, or revenue uplift
- Ability to synthesize complex data into actionable strategies with business impact
- Strong experimentation mindset with familiarity in causal inference and A/B testing methodologies
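As one concrete illustration of the "estimate demand curves and user-level price sensitivity" responsibility, a constant-elasticity demand curve can be fit by ordinary least squares on log price vs. log quantity. This is a minimal stdlib-only sketch with made-up observations; real work would use scikit-learn or statsmodels as the listing notes.

```python
import math

# Hypothetical (price, units sold) observations for one product segment.
observations = [(10.0, 500), (12.0, 410), (15.0, 300), (20.0, 200), (25.0, 140)]

# Fit log(q) = a + b*log(p); the slope b is the price elasticity of demand.
xs = [math.log(p) for p, _ in observations]
ys = [math.log(q) for _, q in observations]
n = len(observations)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# b comes out negative here, as expected: demand falls as price rises.
print(round(b, 2))

# Predicted demand at a candidate price point on the fitted curve:
pred = math.exp(a + b * math.log(18.0))
print(round(pred))
```

With the elasticity in hand, revenue impact of a price change can be projected directly from the fitted curve, which is the input a pricing A/B test would then validate.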

Posted 1 hour ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Hybrid

We are seeking a highly skilled and experienced Cloud Data Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure on GCP/AWS/Azure, ensuring data is accessible, reliable, and available for business use.
Key responsibilities:
- Data pipeline development: design, develop, and maintain data pipelines using GCP/AWS/Azure services such as Dataflow, Dataproc, BigQuery, and Cloud Storage
- Data integration: integrate data from various sources (structured, semi-structured, and unstructured) into GCP/AWS/Azure environments
- Data modeling: develop and maintain efficient data models in BigQuery to support analytics and reporting needs
- Data warehousing: implement data warehousing solutions on GCP, optimizing performance and scalability
- ETL/ELT processes: build and manage ETL/ELT processes using tools like Apache Airflow, Data Fusion, and Python
- Data quality & governance: implement data quality checks, data lineage, and data governance best practices to ensure high data integrity
- Automation: automate data pipelines and workflows to reduce manual effort and improve efficiency
- Collaboration: work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs
- Optimization: continuously monitor and optimize the performance of data pipelines and queries for cost and efficiency
- Security: ensure data security and compliance with industry standards and best practices
Required skills & qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
- Experience: 8+ years in data engineering, with at least 2 years working with GCP/Azure/AWS
- Technical skills: strong programming skills in Python, SQL, and PySpark, with familiarity with Java/Scala; experience with orchestration tools like Apache Airflow; knowledge of ETL/ELT processes and tools; experience with data modeling and designing data warehouses in BigQuery; familiarity with CI/CD pipelines and version control systems like Git; understanding of data governance, security, and compliance
- Soft skills: excellent problem-solving and analytical skills; strong communication and collaboration abilities; ability to work in a fast-paced environment and manage multiple priorities
Preferred qualifications:
- Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect
- Domain knowledge: experience in the finance, e-commerce, or healthcare domain is a plus

Posted 2 hours ago

Apply

3.0 - 5.0 years

4 - 9 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

3 years of experience in testing (preferably in enterprise software). Writing test cases, executing them, providing test results, and analyzing and detailing issues. Verification of issues, appropriate logging, and subsequent post-fix testing.

Posted 2 hours ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Mumbai

Work from Office

Perform manual user access reviews and access control matrix reviews periodically. Reconcile and validate data and rectify observations before sharing them with any auditors. Publish periodic dashboards on reviews and ongoing audit updates to management. Execute and deliver automation projects end to end. Face various internal and external auditors, providing evidence, data, and resolutions on observations and process improvements to avoid further observations. A team player with good communication skills, proficient in Excel and PowerPoint, and adaptive in learning new skills. Python knowledge will be an added advantage.

Posted 2 hours ago

Apply

5.0 - 7.0 years

12 - 14 Lacs

Hyderabad

Work from Office

Strong Node.js development skills: ability to develop and execute scalable APIs and applications in Node.js. Strong database skills: ability to write complex queries in SQL (MySQL / Oracle / PostgreSQL). Experience with Azure services (integration services such as Logic Apps and Function Apps) is a big plus but not required. Experience or knowledge of Java is a big plus but not required.

Posted 2 hours ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Duration: 6 months. Timings: general IST. Notice period: within 15 days, or immediate joiner.
About the role: As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting the data analytics and reporting infrastructure for business growth.
Note: Must be ready for a face-to-face interview in Bangalore (last round). Should be working with Azure as the cloud technology.
Key responsibilities:
ETL development:
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.
Data pipeline architecture:
- Architect, build, and maintain scalable, high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.
Data modeling:
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.
Data quality and governance:
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.
Documentation and communication:
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business users.
Cross-functional collaboration:
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.
Essential requirements:
Basic qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques (minimum years).
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts and architectures.
Preferred qualifications:
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.

Posted 2 hours ago

Apply

6.0 - 11.0 years

18 - 22 Lacs

Noida, Gurugram, Bengaluru

Work from Office

Requirement: Java Developer
Experience: 6+ years
Location: Gurgaon/Bangalore/Noida
Job description:
- 6+ years of hands-on experience in Java
- Experience building order and execution management or trading systems is required
- Financial experience and exposure to trading
- In-depth understanding of concurrent programming and experience designing high-throughput, high-availability, fault-tolerant distributed applications is required
- Experience building distributed applications using NoSQL technologies like Cassandra, coordination services like ZooKeeper, and caching technologies like Apache Ignite and Redis strongly preferred
- Experience building microservices architecture / SOA is required
- Experience with message-oriented streaming middleware architecture is preferred (Kafka, MQ, NATS, AMPS)
- Experience with orchestration, containerization, and building cloud-native applications (AWS, Azure) is a plus
- Experience with modern web technology such as Angular, React, and TypeScript is a plus
- Strong analytical and software architecture design skills with an emphasis on test-driven development
- Experience in programming languages such as Scala or Python would be a plus
- Experience using project management methodologies such as Agile/Scrum
- Effective communication and presentation skills (written and verbal) are required
- Bachelor's or Master's degree in computer science or engineering

Posted 2 hours ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Kolkata

Work from Office

Design and implement data architecture solutions that align with business requirements. Develop and maintain data models, data dictionaries, and data flow diagrams.

Posted 2 hours ago

Apply

0.0 - 1.0 years

0 Lacs

Chennai

Work from Office

Job description: We are pleased to offer you the opportunity to join our Manufacturing Operations Excellence team as an intern for a six-month period based in Chennai. This position is aimed at individuals passionate about both manufacturing and digital innovation. You will actively contribute to our operational excellence and digital transformation programs, supporting initiatives related to Lean Six Sigma, data visualization, and emerging technologies. You will also help develop interactive Power BI dashboards, automate reporting processes, and support the rollout of tools like AI-powered assistance, smart boards, and remote collaboration platforms. Your role will focus on improving process visibility and manufacturing KPIs, enhancing ERP automation, and contributing to Industry 4.0 initiatives. You will work closely with teams across Dubai and Abu Dhabi, supporting strategic initiatives driven by the VP of Manufacturing.
Responsibilities:
- Lean rollout support: support the global rollout of Lean and related campaigns
- Power BI dashboarding: design and build Power BI dashboards to monitor quality, efficiency, and productivity KPIs
- Data modeling & preparation: clean, prepare, and model manufacturing data to support decision-making
- Programming & automation: use scripting (e.g., Python or SQL) to support automation and data workflows
- Smart technology deployment: assist in deploying smart boards, AI-based remote assistance, and other new tools
- ERP integration support: collaborate with IT and manufacturing to improve ERP data flows
- Sustainability initiatives: contribute to eco-efficiency and green manufacturing projects
- Value stream mapping: draft current- and future-state maps to identify performance gaps
- Reporting & presentations: prepare presentations and reports summarizing project progress
Span of communication:
- Internal: with all departments as necessary in the fulfilment of the job requirements
- External: as guided and requested by the VP of Manufacturing
Job-relevant information (boundaries & decision-making authority): The jobholder has no decision-making authority.
Qualifications: Recent graduate in Engineering, Manufacturing, Computer Science, or a related discipline.
Job-specific skills:
- Proficiency in Power BI and data visualization
- Working knowledge of Python, SQL, or other programming languages
- Familiarity with Lean Manufacturing, Six Sigma, and value stream mapping
- Understanding of Industry 4.0 concepts: ERP systems, smart boards, AI tools
- Advanced Excel and PowerPoint skills
- Strong communication and analytical thinking

Posted 2 hours ago

Apply

8.0 - 13.0 years

20 - 30 Lacs

Chennai

Remote

Job summary: We are seeking a highly skilled Azure Solution Architect to design, implement, and oversee cloud-based solutions on Microsoft Azure. The ideal candidate will have a deep understanding of cloud architecture, a strong technical background, and the ability to align Azure capabilities with business needs. You will lead the architecture and design of scalable, secure, and resilient Azure solutions across multiple projects.
Role & responsibilities:
- Design end-to-end data architectures on Azure using Microsoft Fabric, Data Lake (ADLS Gen2), Azure SQL/Synapse, and Power BI
- Lead the implementation of data integration and orchestration pipelines using Azure Data Factory and Fabric Data Pipelines
- Architect lakehouse/data warehouse solutions for both batch and real-time processing, ensuring performance, scalability, and cost optimization
- Establish data governance, lineage, and cataloging frameworks using Microsoft Purview and other observability tools
- Enable data quality, classification, and privacy controls aligned with compliance and regulatory standards
- Drive adoption of event-driven data ingestion patterns using Event Hubs, Event Grid, or Stream Analytics
- Provide architectural oversight on reporting and visualization solutions using Power BI integrated with Fabric datasets and models
- Define architecture standards, data models, and reusable components to accelerate project delivery
- Collaborate with data stewards, business stakeholders, and engineering teams to define functional and non-functional requirements
- Support CI/CD, infrastructure as code, and DevOps for data pipelines using Azure DevOps or GitHub Actions
- Lead proofs of concept (PoCs) and performance evaluations for emerging Azure data services and tools
- Monitor system performance, data flow, and health using Azure Monitor and Fabric observability capabilities
Required qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- 5+ years of experience as a data architect or solution architect in cloud data environments
- 3+ years of hands-on experience designing and implementing data solutions on Microsoft Azure
- Strong hands-on expertise with Azure Data Factory; Microsoft Fabric (Data Engineering, Data Warehouse, Real-Time Analytics, Power BI); Azure Data Lake (ADLS Gen2), Azure SQL, and Synapse Analytics; and Power BI for enterprise reporting and data modeling
- Experience with data governance and cataloging tools, ideally Microsoft Purview
- Proficiency in data modeling techniques (dimensional, normalized, or data vault)
- Strong understanding of security, RBAC, data encryption, Key Vault, and privacy requirements in Azure
Preferred qualifications:
- Microsoft Certified: Azure Solutions Architect Expert (AZ-305) or Azure Enterprise Data Analyst Associate (DP-500)
- Hands-on experience with end-to-end Microsoft Fabric implementation
- Familiarity with medallion architecture, Delta Lake, and modern lakehouse principles
- Experience in Agile/Scrum environments and stakeholder engagement across business and IT
- Strong communication skills, with the ability to explain complex concepts to both technical and non-technical audiences

Posted 2 hours ago

Apply

2.0 - 5.0 years

6 - 11 Lacs

Pune, Chennai, Bengaluru

Hybrid


Immediate joiners preferred.

Job Title: Data & Analytics Strategy Executive
Experience: Minimum 2+ Years
Location: Remote [Hybrid later] (Pune / Coimbatore / Chennai / Mumbai / Gurugram / Bengaluru)
Type: Full-Time
Shift: 12:30 PM to 9:30 PM (afternoon shift)

Role Overview: The analyst will use in-depth technical analytics skills and an understanding of media-channel KPIs to support campaign reporting and measurement, and to provide insights and recommendations.

Key Responsibilities:
• Analyze media spend, ROI, and performance metrics across key digital platforms such as Google Ads, Facebook/Instagram, DCM, and DV360, translating data into actionable insights.
• Develop and maintain advanced dashboards and reports that provide actionable insights and visualizations to stakeholders at different levels of the organization.
• Maintain, create, and review QA plans for deliverables to align with requirements, identify any discrepancies, and troubleshoot issues.
• Work with various internal and external stakeholders to develop project plans, manage day-to-day tasks, and meet project deadlines.

Skills and experience:
• Minimum 2-5 years of experience as a Marketing Analyst.
• Strong experience with digital media data across Search, Social, Display, and Programmatic media, and a strong understanding of digital KPIs.
• Understanding of marketing funnels and relevant KPIs.
• Knowledge of selected programming languages (e.g., SQL).
• In-depth knowledge of relational databases (e.g., Snowflake, GCP).
• Analytical and problem-solving skills.
• Strong written and oral communication skills.
• Expertise in designing new dashboards and identifying the right metrics and layout.
• Exposure to Tableau.

Good to have: other visualization tools such as Power BI and Looker Studio, GCP, Datorama, etc.

Interested candidates, share your resume at dhanashree.chitre@weareams.com

Posted 2 hours ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office


The IZOT product line includes BMC's Intelligent Z Optimization & Transformation products, which help the world's largest companies monitor and manage their mainframe systems. The modernization of the mainframe is the beating heart of our product line, and we achieve this goal by developing products that improve the developer experience, mainframe integration, the speed of application development, code quality, and application security, while reducing operational costs and risks. We have acquired several companies along the way.

BMC is looking for a talented Python Developer to join our family, working on complex and distributed software: developing and debugging software products, implementing features, and assisting the firm in assuring product quality.

Here is how, through this exciting role, YOU will contribute to BMC's and your own success: We are seeking a Python with AI/ML Developer to join a highly motivated team responsible for developing and maintaining innovation for mainframe capacity and cost management.

As an Application Developer at BMC, you will be responsible for:
• Developing and integrating AI/ML models with a focus on Generative AI (GenAI), Retrieval-Augmented Generation (RAG), and vector databases to enhance intelligent decision-making.
• Building scalable AI pipelines for real-time and batch inference, optimizing model performance, and deploying AI-driven applications.
• Implementing RAG-based architectures using LLMs (Large Language Models) for intelligent search, chatbot development, and knowledge management.
• Utilizing vector databases (e.g., FAISS, ChromaDB, Weaviate, Pinecone) to enable efficient similarity search and AI-driven recommendations.
• Developing modern web applications using Angular to create interactive and AI-powered user interfaces.
To ensure you're set up for success, you will bring the following skillset & experience:
• 7+ years of experience in designing and implementing AI/ML-driven applications.
• Strong proficiency in Python and AI/ML frameworks such as TensorFlow, PyTorch, Hugging Face Transformers, and LangChain.
• Experience with vector databases (FAISS, ChromaDB, Weaviate, Pinecone) for semantic search and embeddings.
• Hands-on expertise in LLMs (GPT, LLaMA, Mistral, Claude, etc.) and fine-tuning/customizing models.
• Proficiency in Retrieval-Augmented Generation (RAG) and prompt engineering for AI-driven applications.
• Experience with Angular for developing interactive web applications.
• Experience with RESTful APIs and FastAPI, Flask, or Django for AI model serving.
• Working knowledge of SQL and NoSQL databases for AI/ML applications.
• Hands-on experience with Git/GitHub, Docker, and Kubernetes for AI/ML model deployment.
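The retrieval step that underpins a RAG architecture reduces, at its core, to nearest-neighbour search over embedding vectors. As a minimal illustrative sketch (not BMC's implementation; the documents and three-dimensional "embeddings" below are invented), the brute-force cosine-similarity search that a vector database performs can be written in a few lines of plain Python:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, index, top_k=2):
    """Return the top_k document ids ranked by cosine similarity.

    `index` maps document id -> embedding vector; in a real RAG system
    this lookup is delegated to a vector database such as FAISS or Pinecone,
    and the retrieved passages are then fed to the LLM as context.
    """
    scored = sorted(index.items(),
                    key=lambda kv: cosine_similarity(query_vec, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

# Toy corpus: ids and vectors are purely hypothetical.
index = {
    "doc_mainframe": [0.9, 0.1, 0.0],
    "doc_capacity":  [0.7, 0.6, 0.1],
    "doc_recipes":   [0.0, 0.2, 0.9],
}
print(retrieve([0.8, 0.3, 0.0], index))  # ['doc_mainframe', 'doc_capacity']
```

Production systems replace the linear scan with an approximate index (HNSW, IVF) so that search stays fast over millions of vectors, but the ranking principle is the same.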

Posted 2 hours ago

Apply

3.0 - 4.0 years

3 - 4 Lacs

Gurugram

Work from Office


• Develop and maintain backend services using the PHP Laravel framework.
• Design, implement, and manage databases using Oracle and SQL.
• Create RESTful APIs and ensure smooth integration with frontend interfaces.
• Debug, test, and document software applications.

Posted 2 hours ago

Apply

2.0 - 7.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Shift: (GMT+05:30) Asia/Kolkata (IST)

Must-have skills: React, ReactJS, Node.js, DSA

Role Overview: We are looking for a talented Full Stack Developer with 2-5 years of experience to join our dynamic team in Hyderabad. As a Senior Full Stack Developer at Apty, you will contribute to the design, development, and maintenance of cutting-edge web applications and browser extensions that serve enterprise clients worldwide. You should have strong problem-solving skills and be proficient in both front-end and back-end technologies.

Key Responsibilities:
• Develop responsive and user-friendly web interfaces using HTML, CSS, and React.js.
• Build robust server-side applications and APIs using Node.js.
• Design and manage relational databases using SQL.
• Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
• Debug, troubleshoot, and enhance existing applications for optimal performance and scalability.
• Ensure code quality through reviews, testing, and adherence to best practices.
• Stay updated with the latest trends and technologies.

Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, or a related field.
• Strong programming and problem-solving skills.
• Proficiency in HTML, CSS, and React.js for front-end development.
• Experience with Node.js for back-end development.
• Good understanding of relational databases, including SQL.
• Familiarity with RESTful APIs and web application architecture.

Posted 2 hours ago

Apply

9.0 - 14.0 years

32 - 37 Lacs

Bengaluru

Work from Office


Shift: (GMT+05:30) Asia/Kolkata (IST)

Must-have skills: Python, Golang, Rust, GCP, Airflow, Docker, containerization, Hadoop, Hive, SQL, Spark, Generative AI, agentic workflows, machine learning (ML)

About the job: Candidates for this position are preferred to be based in Bangalore, India, and will be expected to comply with their team's hybrid work schedule requirements.

We are seeking an experienced Senior Machine Learning Manager to lead the Notifications Science team, focused on building intelligent, ML-driven systems for personalized notifications. These systems ensure that we send the right message to the right customer, at the right time, through the right channel (Push, Email, SMS), and at the right cadence, while balancing incremental revenue with customer engagement health.

In this role, you'll be accountable for the technical roadmap, driving innovation to build the next generation of Wayfair's communications ML stack. You'll work closely with a high-performing team of ML scientists and engineers to solve some of Wayfair's most complex challenges in personalization, latency, and scale, with direct impact on customer experience and company revenue.

What You'll Do:
• Own the strategy, roadmap, and execution of notification intelligence and automation solutions.
• Lead the development of GenAI-powered content generation, send-time optimization, and cross-channel orchestration systems.
• Build intelligent systems that drive significant incremental revenue while minimizing customer fatigue and unsubscribes.
• Develop and grow technical leadership within the team, modeling a culture of continuous research and innovation.
• Collaborate with Engineering and Product teams to scale decisioning systems to millions of notifications daily.
• Act as a subject matter expert, providing mentorship and technical guidance across the broader Data Science and Engineering organization.
We Are a Match Because You Have:
• Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, or a related field.
• 9+ years of industry experience, with at least 1-2 years of experience managing teams and 5+ years as an individual contributor working on production ML systems.
• Strategic thinking with a customer-centric mindset and a desire for creative problem-solving, looking to make a big impact in a growing organization.
• Demonstrated success influencing senior-level stakeholders on strategic direction based on recommendations backed by in-depth analysis, and excellent written and verbal communication skills.
• Ability to partner cross-functionally to own and shape technical roadmaps and the organizations required to drive them.
• Proficiency in one or more programming languages, e.g., Python, Golang, or Rust.

Nice to have:
• Experience with GCP, Airflow, and containerization (Docker).
• Experience building scalable data processing pipelines with big data tools such as Hadoop, Hive, SQL, and Spark.
• Experience in Bayesian learning, multi-armed bandits, or reinforcement learning.
• Familiarity with Generative AI and agentic workflows.
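The multi-armed-bandit methods listed above map naturally onto problems like channel selection and send-time optimization: each channel is an "arm", and a click is a reward. A minimal epsilon-greedy sketch (the channel names and reward numbers are invented for illustration, not Wayfair's system) looks like this:

```python
import random

def epsilon_greedy(rewards_by_arm, epsilon=0.1, rng=random):
    """Pick an arm: explore with probability epsilon, otherwise exploit
    the arm with the highest observed mean reward.

    `rewards_by_arm` maps arm name -> list of observed rewards
    (e.g. 1 if a notification was clicked, 0 otherwise).
    """
    arms = list(rewards_by_arm)
    if rng.random() < epsilon:
        return rng.choice(arms)  # explore a random arm

    def mean(xs):
        return sum(xs) / len(xs) if xs else 0.0

    return max(arms, key=lambda a: mean(rewards_by_arm[a]))  # exploit

# Hypothetical click history per channel.
history = {"push": [1, 0, 1, 1], "email": [0, 0, 1], "sms": [0]}
# With epsilon=0 the choice is purely greedy: "push" has the best mean (0.75).
print(epsilon_greedy(history, epsilon=0.0))  # push
```

In practice, teams often prefer Thompson sampling or contextual bandits so the explore/exploit trade-off adapts to per-customer features, but epsilon-greedy is the standard starting point.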

Posted 2 hours ago

Apply

2.0 - 7.0 years

10 - 12 Lacs

Hyderabad

Work from Office


Skills:
• 2+ years of experience as a UX Designer, Interaction Designer, or Data Visualization Designer, ideally working directly within product and engineering organizations on technical or enterprise-grade products.
• A strong portfolio showcasing your design process and demonstrating your ability to translate complex data into clear and actionable user interfaces, preferably within cloud engineering & infrastructure.
• Proven experience conducting user research with technical users (e.g., engineers, SREs) and the ability to translate their specialized needs into effective design solutions for data-intensive applications.
• Solid understanding of interaction design principles, information architecture, and usability best practices, with a strong emphasis on data visualization principles for technical audiences analyzing complex systems.
• Proficiency in industry-standard design and prototyping tools such as Figma, Sketch, or similar.
• Excellent communication, presentation, and collaboration skills, with the ability to articulate design rationale clearly and persuasively to technical and product stakeholders.
• Proven experience working directly within agile product and engineering teams, embedding design into the development lifecycle.
• A deep empathy for technical users and a passion for creating tools that empower them to understand, diagnose, and manage complex systems effectively.
• Ability to work independently, manage multiple projects simultaneously, and thrive in a fast-paced, technically driven environment.

Posted 2 hours ago

Apply

12.0 - 17.0 years

7 - 12 Lacs

Chandigarh

Work from Office


Job Summary: As a key contributor to our ERP Transformation Services team, the Senior ETL Data Migration Analyst is responsible for owning the design, development, and execution of enterprise-wide data migration activities. This role is instrumental in the success of global ERP implementations (primarily Oracle EBS and SAP ECC) by ensuring consistent, auditable, and high-quality data migration processes using industry-standard tools and frameworks.

In This Role, Your Responsibilities Will Be:

Pre-Go-Live: Planning & Development
• Design and implement global data migration strategies for Oracle and SAP ERP projects.
• Develop ETL processes using Syniti DSP / SKP or an equivalent tool to support end-to-end migration.
• Collaborate with legacy system teams to extract and analyze source data.
• Build workflows for data profiling, cleansing, enrichment, and transformation.
• Ensure auditability and traceability of migrated data, aligned with compliance and governance standards.

Go-Live & Cutover Execution
• Support mock loads, cutover rehearsals, and production data loads.
• Monitor data load progress and resolve issues related to performance, mapping, or data quality.
• Maintain a clear log of data migration actions and reconcile with source systems.

Post-Go-Live: Support & Stewardship
• Monitor data creation and updates to ensure business process integrity post go-live.
• Provide data extract/load services for ongoing master data maintenance.
• Contribute to legacy data archiving strategies, tools, and execution.

Tools, Documentation & Collaboration
• Maintain documentation of ETL procedures, technical specifications, and data lineage.
• Partner with implementation teams to translate business requirements into technical solutions.
• Contribute to the development and refinement of ETL frameworks and reusable components.
Travel Requirements: Willingness to travel up to 20% for project needs, primarily during key implementation phases.

Who You Are: You show a tremendous amount of initiative in tough situations, and you have strong analytical and problem-solving skills. You are self-motivated, accountable, and proactive in learning and applying new technologies. You possess superb communication and collaboration skills across global teams.

For This Role, You Will Need:
• 12+ years of IT experience with a focus on ETL, data management, and ERP data migration.
• Strong hands-on experience with Oracle EBS or SAP ECC implementations.
• Proficiency in Syniti DSP, Informatica, Talend, or similar enterprise ETL tools.
• Proficient SQL skills; ability to write and optimize queries for large datasets.
• Demonstrable track record in data profiling, cleansing, and audit trail maintenance.
• Academic background in MCA / BE / BSC - Computer Science, Engineering, Information Systems, or Business Administration.
• Proven application development experience in .NET, ABAP, or scripting languages.
• Familiarity with data migration implementations and data modeling principles.
• Knowledge of project management methodologies (Agile, PMP, etc.).

Performance Indicators:
• Successful execution of data migration cutovers with minimal errors.
• Complete data traceability and audit compliance from source to target.
• Timely delivery of ETL solutions and reports per project phases.
• Continuous improvement and reuse of ETL frameworks and standard processes.
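The data profiling and cleansing step that precedes any cutover load can be illustrated with a minimal sketch (the field names and records here are hypothetical, not a Syniti DSP workflow or any real ERP dataset): before loading, count duplicate business keys and rows with missing keys so they can be remediated at the source.

```python
def profile_records(records, key_field):
    """Tiny data-profiling pass: count unique business keys, duplicate
    keys, and rows where the key is missing, before any load is attempted."""
    seen, duplicates, missing = set(), [], []
    for row in records:
        key = row.get(key_field)
        if key in (None, ""):
            missing.append(row)       # cannot be loaded without a key
        elif key in seen:
            duplicates.append(row)    # would violate uniqueness on target
        else:
            seen.add(key)
    return {"unique": len(seen),
            "duplicates": len(duplicates),
            "missing_key": len(missing)}

# Hypothetical legacy extract.
legacy_rows = [
    {"customer_id": "C001", "name": "Acme"},
    {"customer_id": "C002", "name": "Globex"},
    {"customer_id": "C001", "name": "Acme Corp"},    # duplicate key
    {"customer_id": "",     "name": "Unknown Ltd"},  # missing key
]
print(profile_records(legacy_rows, "customer_id"))
# {'unique': 2, 'duplicates': 1, 'missing_key': 1}
```

Real migrations run the same checks in the ETL tool against full extracts, and the counts feed the reconciliation log that is audited at go-live.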

Posted 2 hours ago

Apply

Exploring SQL Jobs in India

SQL (Structured Query Language) is a crucial skill in the field of data management and analysis. In India, the demand for professionals with SQL expertise is on the rise, with numerous job opportunities available across various industries. Job seekers looking to break into the IT sector or advance their careers in data-related roles can benefit greatly from acquiring SQL skills.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Chennai

These cities are known for their thriving IT industries and are hotspots for SQL job openings.

Average Salary Range

In India, the average salary range for SQL professionals varies based on experience levels. Entry-level positions can expect to earn around ₹3-5 lakhs per annum, while experienced professionals with 5+ years of experience can earn anywhere from ₹8-15 lakhs per annum.

Career Path

A typical career progression in the SQL domain may include roles such as:

  • Junior SQL Developer
  • SQL Developer
  • Senior SQL Developer
  • Database Administrator
  • Data Analyst
  • Data Scientist

Advancing to higher roles like Tech Lead or Data Architect is possible with increased experience and expertise.

Related Skills

In addition to SQL proficiency, job seekers in India may benefit from having skills such as:

  • Data analysis and visualization tools (e.g., Tableau, Power BI)
  • Programming languages (e.g., Python, R)
  • Knowledge of database management systems (e.g., MySQL, Oracle)
  • Understanding of data warehousing concepts

Interview Questions

Here are 25 SQL interview questions to help you prepare for job interviews in India:

  • What is SQL and why is it important? (basic)
  • Differentiate between SQL and NoSQL databases. (basic)
  • Explain the difference between INNER JOIN and OUTER JOIN. (medium)
  • What is a subquery in SQL? (medium)
  • Write a query to find the second highest salary in a table. (medium)
  • How do you optimize SQL queries for better performance? (advanced)
  • What is normalization in databases? (basic)
  • Explain the ACID properties of a transaction. (medium)
  • What is the difference between CHAR and VARCHAR data types? (basic)
  • Write a query to calculate the total number of orders for each customer. (medium)
  • What is a correlated subquery? (advanced)
  • How do you handle duplicates in a SQL query result? (medium)
  • What is a stored procedure in SQL? (basic)
  • Explain the difference between UNION and UNION ALL operators. (medium)
  • Write a query to find the third highest salary in a table. (advanced)
  • How do you implement transactions in SQL? (medium)
  • What is the difference between TRUNCATE and DELETE statements? (basic)
  • Explain the concept of indexing in databases. (medium)
  • Write a query to retrieve all employees who do not report to anyone. (medium)
  • How do you handle NULL values in SQL queries? (basic)
  • What is a self-join in SQL? (advanced)
  • Explain the concept of data integrity in databases. (basic)
  • Write a query to calculate the average salary of employees by department. (medium)
  • How do you perform data migration in SQL? (medium)
  • What is a view in SQL and why is it used? (basic)
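Several of these questions can be practiced against a scratch database. As an example, here is the classic "second highest salary" query (and a quick look at how DISTINCT collapses duplicates) run through Python's built-in sqlite3 module; the table and salary figures are invented for illustration:

```python
import sqlite3

# In-memory scratch database for practicing interview queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("Asha", 90000), ("Ravi", 120000),
                  ("Meera", 120000), ("Kiran", 75000)])

# Second-highest salary: the maximum among salaries strictly below the maximum.
second = conn.execute("""
    SELECT MAX(salary) FROM employees
    WHERE salary < (SELECT MAX(salary) FROM employees)
""").fetchone()[0]
print(second)  # 90000

# Duplicate handling: DISTINCT collapses the two equal 120000 salaries.
distinct_count = conn.execute(
    "SELECT COUNT(DISTINCT salary) FROM employees").fetchone()[0]
print(distinct_count)  # 3
```

The subquery pattern generalizes to the "third highest salary" question by combining DISTINCT with ORDER BY and LIMIT/OFFSET.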

Closing Remark

As you explore SQL job opportunities in India, remember to not only focus on mastering SQL but also to develop related skills that can make you a well-rounded professional in the data management field. Prepare thoroughly for interviews by practicing common SQL questions and showcase your expertise confidently. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
