3.0 - 5.0 years
22 - 25 Lacs
Bengaluru
Work from Office
We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on extraordinary enterprise products based on AI and big data engineering, leveraging the AWS/Databricks tech stack. He/she will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration specialists.

Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack
- 2+ years of experience in building data pipelines with Databricks/PySpark/SQL
- Experience in writing and interpreting SQL queries, designing data models and data standards
- Experience in SQL Server databases, Oracle, and/or cloud databases
- Experience in data warehousing and data marts, star and snowflake models
- Experience in loading data into a database from databases and files
- Experience in analyzing and drawing design conclusions from data profiling results
- Understanding of business processes and the relationships of systems and applications
- Must be comfortable conversing with end-users
- Must have the ability to manage multiple projects/clients simultaneously
- Excellent analytical, verbal, and communication skills

Role and Responsibilities:
- Work with business stakeholders and build data solutions to address analytical & reporting requirements
- Work with application developers and business analysts to implement and optimize Databricks/AWS-based implementations meeting data requirements
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation
- Develop/optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases
- Conduct root cause analysis and resolve production problems and data issues
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings
- Provide support for production problems and daily batch processing
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta Tables, Parquet), and views to ensure data integrity and performance
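The data-mart work described above (loading file-sourced data into star-schema tables and serving reporting queries) can be sketched in miniature. This is an illustrative example only, using the stdlib sqlite3 module as a lightweight stand-in for a managed warehouse; all table and column names are hypothetical.

```python
import sqlite3

# Star-schema load sketch: sqlite3 stands in for the warehouse,
# and the table/column names are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension and fact tables of a minimal star schema.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL)""")

# "Extract" step: rows as they might arrive from source files.
products = [(1, "widget"), (2, "gadget")]
sales = [(1, 1, 9.99), (2, 1, 19.98), (3, 2, 5.00)]

# "Load" step into dimension first, then fact.
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", products)
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", sales)

# Reporting query: revenue per product via the dimension join.
revenue = dict(cur.execute("""
    SELECT p.name, ROUND(SUM(f.amount), 2)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name""").fetchall())
print(revenue)
```

The same fact/dimension separation scales up to the Delta Lake tables mentioned in the posting; only the storage layer changes.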
Posted 3 days ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
What you will be doing:
- 5-8 years of experience in SQL and working with stored procedures
- Experience in writing and interpreting SQL queries, designing data models and data standards
- Experience with Azure B2C / Entra would be an added advantage
- Experience in a cloud platform - Azure
- Experience in SQL Server databases, Oracle, and/or cloud databases
- Experience in data warehousing and data marts, star and snowflake models
- Experience in loading data into a database from databases and files
- Experience in analyzing and drawing valuable conclusions from data profiling results
- Understanding of business processes and the relationships of systems and applications
- Must be comfortable conversing with end-users
- Must have the ability to manage multiple projects/clients simultaneously
- Excellent analytical, verbal, and communication skills
- Well versed in advanced stored procedure writing, construing query plans, altering indexes, and troubleshooting performance holdups
- Skilled at optimizing large, complicated SQL statements
- Familiar with SSRS reporting tools
- Nice to have: knowledge of .NET for developing/debugging APIs/web services

Additional Job Description:
- Work with project stakeholders to provide solutions addressing business data requirements
- Work with application developers and business analysts to determine new and changed database requirements and provide database solutions accordingly
- Design, implement, and support solutions using SQL Server
- Develop SQL scripts, including queries, views, stored procedures, functions, and triggers, to create a new product with feature enhancements
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings
- Participate in software releases using the Agile development and Scrum model
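The "construing query plans, altering indexes" skill above can be demonstrated end to end even without a SQL Server instance. The sketch below uses sqlite3 (stdlib) as a portable stand-in, with EXPLAIN QUERY PLAN playing the role of SQL Server's execution plans; the table and index names are hypothetical.

```python
import sqlite3

# Index-driven tuning sketch: compare the query plan before and after
# adding an index on the filtered column. sqlite3 is a stand-in here;
# the principle (scan vs. index seek) carries over to SQL Server/Oracle.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, f"cust{i % 100}", float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return " ".join(row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer = 'cust7'"
before = plan(query)   # no index on customer yet: plan reports a scan

cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
after = plan(query)    # now a search using idx_orders_customer

print(before)
print(after)
```

Reading the plan before and after the index change is exactly the troubleshooting loop the posting describes, just on a toy scale.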
Posted 5 days ago
2.0 - 5.0 years
2 - 5 Lacs
Pune, Maharashtra, India
On-site
Key Skills and Responsibilities:
- Snowflake Architecture: Deep understanding of Snowflake's architecture, components, and functionalities
- Snowflake Cost Model: Understanding of how the Snowflake cost model operates, and the ability to improve efficiency and track costs across the platform
- Database Management: Expertise in creating, managing, and optimizing databases, schemas, tables, and views
- Data Modeling: Knowledge of dimensional modeling (star and snowflake schemas) for data warehousing
- Data Warehousing: Knowledge of data warehousing concepts, methodologies, and best practices
- SQL: Proficiency in SQL for querying, manipulating, and analyzing data
- Performance Optimization: Understanding of performance tuning techniques (clustering, materialized views, warehouse sizing) to improve query performance
- Security: Expertise in implementing security measures (IAM, role-based access control, data encryption, password rotation processes) to protect sensitive data
- ETL/ELT Tools: Familiarity with ETL/ELT tools (e.g., HVR, SnapLogic) for data loading and transformation
- Cloud Computing: Understanding of cloud concepts (e.g., scalability, elasticity, resource management) and experience with cloud platforms (AWS, Azure)
- Scripting: Proficiency in scripting languages (e.g., Python, Bash) for automation and data processing tasks
- Reporting Platform: Experience in setting up Power BI workspaces and Cognos packages with secure data models ready for reporting
- Project Management: Knowledge of project management methodologies (e.g., Agile, Waterfall) and tools

Other Skills:
- Problem-Solving: Ability to troubleshoot and resolve technical issues efficiently
- Communication: Effective communication skills to interact with stakeholders and team members
- Analytical Thinking: Ability to analyze data and identify patterns and trends
- Attention to Detail: Meticulous approach to ensure data accuracy and integrity
- Adaptability: Willingness to learn new technologies and flexibility to adapt to changing requirements
- Project Management: Ability to plan, execute, and monitor projects effectively
- Leadership: Ability to lead and motivate project teams
- Time Management: Effective time management skills to meet project deadlines
- Stakeholder Management: Ability to manage expectations and build relationships with stakeholders
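The "Snowflake cost model" and "warehouse sizing" skills above come down to simple arithmetic: credits per hour double with each warehouse size step, and compute is billed per second with a 60-second minimum. The sketch below makes that concrete; the price per credit is an assumed placeholder, since the real rate varies by Snowflake edition and region.

```python
# Back-of-the-envelope Snowflake compute-cost estimator.
# Credits/hour follow Snowflake's published doubling scheme per size;
# price_per_credit is an ASSUMED placeholder, not a quoted rate.
CREDITS_PER_HOUR = {
    "XSMALL": 1, "SMALL": 2, "MEDIUM": 4, "LARGE": 8, "XLARGE": 16,
}

def estimate_cost(size, runtime_seconds, price_per_credit=3.0):
    """Estimate the cost of one warehouse run
    (per-second billing with a 60-second minimum)."""
    billed = max(runtime_seconds, 60)
    credits = CREDITS_PER_HOUR[size.upper()] * billed / 3600
    return credits * price_per_credit

# A MEDIUM warehouse running for 30 minutes at the assumed $3/credit:
# 4 credits/h * 0.5 h * $3 = $6.0
print(estimate_cost("MEDIUM", 30 * 60))
```

Tracking costs "across the platform", as the posting asks, is then a matter of summing such estimates per warehouse and comparing against query-history runtimes.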
Posted 1 week ago
2.0 - 5.0 years
2 - 5 Lacs
Noida, Uttar Pradesh, India
On-site
Key Skills and Responsibilities:
- Snowflake Architecture: Deep understanding of Snowflake's architecture, components, and functionalities
- Snowflake Cost Model: Understanding of how the Snowflake cost model operates, and the ability to improve efficiency and track costs across the platform
- Database Management: Expertise in creating, managing, and optimizing databases, schemas, tables, and views
- Data Modeling: Knowledge of dimensional modeling (star and snowflake schemas) for data warehousing
- Data Warehousing: Knowledge of data warehousing concepts, methodologies, and best practices
- SQL: Proficiency in SQL for querying, manipulating, and analyzing data
- Performance Optimization: Understanding of performance tuning techniques (clustering, materialized views, warehouse sizing) to improve query performance
- Security: Expertise in implementing security measures (IAM, role-based access control, data encryption, password rotation processes) to protect sensitive data
- ETL/ELT Tools: Familiarity with ETL/ELT tools (e.g., HVR, SnapLogic) for data loading and transformation
- Cloud Computing: Understanding of cloud concepts (e.g., scalability, elasticity, resource management) and experience with cloud platforms (AWS, Azure)
- Scripting: Proficiency in scripting languages (e.g., Python, Bash) for automation and data processing tasks
- Reporting Platform: Experience in setting up Power BI workspaces and Cognos packages with secure data models ready for reporting
- Project Management: Knowledge of project management methodologies (e.g., Agile, Waterfall) and tools

Other Skills:
- Problem-Solving: Ability to troubleshoot and resolve technical issues efficiently
- Communication: Effective communication skills to interact with stakeholders and team members
- Analytical Thinking: Ability to analyze data and identify patterns and trends
- Attention to Detail: Meticulous approach to ensure data accuracy and integrity
- Adaptability: Willingness to learn new technologies and flexibility to adapt to changing requirements
- Project Management: Ability to plan, execute, and monitor projects effectively
- Leadership: Ability to lead and motivate project teams
- Time Management: Effective time management skills to meet project deadlines
- Stakeholder Management: Ability to manage expectations and build relationships with stakeholders
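The role-based access control item in the security list above follows the same shape in Snowflake as anywhere else: privileges are granted to roles, and roles are granted to users. A minimal sketch of that model, with invented role and privilege names:

```python
# Minimal role-based access control sketch, mirroring the grant-to-role,
# assign-role-to-user model used by Snowflake and most warehouses.
# All role, user, and privilege names here are illustrative.
from collections import defaultdict

grants = defaultdict(set)       # role -> set of privileges
memberships = defaultdict(set)  # user -> set of roles

def grant(privilege, role):
    grants[role].add(privilege)

def assign(role, user):
    memberships[user].add(role)

def authorized(user, privilege):
    # A user holds a privilege if any of their roles was granted it.
    return any(privilege in grants[r] for r in memberships[user])

grant("SELECT:sales_db", "ANALYST")
grant("INSERT:sales_db", "ETL_ENGINEER")
assign("ANALYST", "priya")

print(authorized("priya", "SELECT:sales_db"))   # True
print(authorized("priya", "INSERT:sales_db"))   # False
```

Keeping the user-to-role and role-to-privilege mappings separate is what makes audits and password/credential rotation tractable: access changes by editing role grants, not per-user permissions.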
Posted 1 week ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer!

In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities:
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products
- Work with architects and lead engineers on solutions to meet functional and non-functional requirements
- Demonstrate knowledge of relevant industry trends and standards
- Demonstrate strong analytical and technical problem-solving skills
- Must have experience in the Data Engineering domain

Qualifications we seek in you!

Minimum qualifications:
- Bachelor's degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline) or equivalent work experience
- Must have excellent coding skills in either Python or Scala, preferably Python
- Must have implemented at least 2 projects end-to-end in Databricks
- Must have experience with Databricks components such as Delta Lake, dbConnect, DB API 2.0, and Databricks Workflows orchestration
- Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments
- Must have a good understanding of how to create complex data pipelines
- Must have good knowledge of data structures & algorithms
- Must be strong in SQL and Spark SQL
- Must have strong performance optimization skills to improve efficiency and reduce cost
- Must have worked on both batch and streaming data pipelines
- Must have extensive knowledge of the Spark and Hive data processing frameworks
- Must have worked on any cloud (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases
- Must be strong in writing unit tests and integration tests
- Must have strong communication skills and have worked on teams of size 5 plus
- Must have a great attitude towards learning new skills and upskilling existing skills

Preferred qualifications:
- Good to have Unity Catalog and basic governance knowledge
- Good to have an understanding of Databricks SQL Endpoints
- Good to have CI/CD experience to build pipelines for Databricks jobs
- Good to have worked on a migration project to build a unified data platform
- Good to have knowledge of dbt
- Good to have knowledge of Docker and Kubernetes

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube.
Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.
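The "unit test case" requirement in the posting above is straightforward to illustrate in plain Python. The transform and its field names below are hypothetical; on Databricks, the same pattern applies with the function operating on a Spark DataFrame instead of a list of dicts.

```python
# Sketch of unit-testing a pipeline transformation, in the spirit of the
# posting's unit-test requirement. Function and field names are invented.
def normalize_amounts(rows):
    """Drop records with missing amounts and standardize currency to cents."""
    return [
        {**r, "amount_cents": int(round(r["amount"] * 100))}
        for r in rows
        if r.get("amount") is not None
    ]

def test_normalize_amounts():
    rows = [{"id": 1, "amount": 12.34}, {"id": 2, "amount": None}]
    out = normalize_amounts(rows)
    assert len(out) == 1            # the null-amount record is dropped
    assert out[0]["amount_cents"] == 1234

test_normalize_amounts()
print("ok")
```

Keeping the transformation a pure function of its input rows is what makes it testable without a cluster; the integration test then only has to verify the read/write plumbing around it.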
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Navi Mumbai, Maharashtra, India
On-site
Your role and responsibilities:
As a Consultant, you are responsible for developing application designs and providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements
- Strive for continuous improvement by testing the built solution and working under an agile framework
- Discover and implement the latest technology trends to maximize and build creative solutions

Required technical and professional expertise:
- Configure DataStax Cassandra as per the requirements of the project solution
- Design the database system specific to Cassandra in consultation with the data modelers, data architects, and ETL specialists, as well as the microservices/functional specialists, thereby producing an effective database system in Cassandra according to the solution and the client's needs and specifications
- Interface with functional & data teams to ensure the integrations with other functional and data systems are working correctly and as designed
- Participate in responsible or supporting roles in different tests or UAT that involve the DataStax Cassandra database
- Ensure that the Cassandra database is performant and error free; this involves troubleshooting errors and performance issues, resolving them, and planning for further database improvements
- Ensure the database documentation & operation manual is up to date and usable

Preferred technical and professional experience:
- Expertise, experience, and deep knowledge in the configuration, design, and troubleshooting of NoSQL server software and related products on cloud, specifically DataStax Cassandra
- Knowledge/experience in other NoSQL/cloud databases
- Installs, configures, and upgrades RDBMS or NoSQL server software and related products on cloud
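Configuring a Cassandra cluster, as described above, always involves the quorum arithmetic behind its tunable consistency levels. A short sketch of the standard rules: with replication factor RF, a QUORUM read or write needs floor(RF/2) + 1 replicas, and a read is strongly consistent when read replicas plus write replicas exceed RF.

```python
# Quorum arithmetic behind Cassandra's tunable consistency levels.
def quorum(replication_factor):
    """Replicas required by the QUORUM consistency level."""
    return replication_factor // 2 + 1

def strongly_consistent(read_replicas, write_replicas, replication_factor):
    """Reads see the latest write when R + W > RF."""
    return read_replicas + write_replicas > replication_factor

rf = 3
print(quorum(rf))                                       # 2
print(strongly_consistent(quorum(rf), quorum(rf), rf))  # True: 2 + 2 > 3
```

This is why QUORUM reads combined with QUORUM writes are the usual starting point when a client needs read-your-writes behavior without paying for consistency level ALL.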
Posted 2 weeks ago
5.0 - 15.0 years
6 - 19 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Description:
We are seeking an experienced SQL Developer to join our team in India. The ideal candidate will have 5-15 years of experience in SQL development, with a strong focus on optimizing database performance and ensuring data integrity.

Responsibilities:
- Develop and maintain SQL queries and procedures to support business applications
- Optimize SQL queries for performance improvements and efficiency
- Create and manage database schemas, tables, and relationships
- Collaborate with software developers and data analysts to gather requirements and deliver data solutions
- Troubleshoot database issues and provide support to end-users as needed
- Ensure data integrity and security across all databases

Skills and Qualifications:
- Proficient in SQL and PL/SQL
- Strong understanding of database design and normalization principles
- Experience with SQL Server, MySQL, or Oracle databases
- Ability to write complex queries, stored procedures, and triggers
- Knowledge of data warehousing concepts and ETL processes
- Familiarity with database performance tuning and optimization techniques
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities
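The trigger-writing skill listed above can be shown compactly. The example below uses the stdlib sqlite3 module as a portable stand-in for SQL Server or Oracle (trigger syntax differs slightly across engines, and the schema names are hypothetical): an AFTER UPDATE trigger that records every price change in an audit table, a common data-integrity pattern.

```python
import sqlite3

# Audit-trigger sketch: every price update is logged automatically,
# without the application having to remember to write the audit row.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, price REAL)")
cur.execute("CREATE TABLE price_audit (product_id INTEGER, old_price REAL, new_price REAL)")
cur.execute("""
    CREATE TRIGGER trg_price_audit AFTER UPDATE OF price ON products
    BEGIN
        INSERT INTO price_audit VALUES (OLD.id, OLD.price, NEW.price);
    END""")

cur.execute("INSERT INTO products VALUES (1, 10.0)")
cur.execute("UPDATE products SET price = 12.5 WHERE id = 1")

audit = cur.execute("SELECT * FROM price_audit").fetchall()
print(audit)  # [(1, 10.0, 12.5)]
```

Because the trigger runs inside the same transaction as the UPDATE, the audit row and the price change commit or roll back together, which is the integrity guarantee application-side logging cannot give.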
Posted 1 month ago