15.0 - 20.0 years
100 - 200 Lacs
Bengaluru
Hybrid
What You'll Do:
- Play a key role in developing and driving a multi-year technology strategy for a complex platform
- Directly and indirectly manage several senior software engineers (architects) and managers, providing coaching, guidance, and mentorship to grow the team as well as individuals
- Lead multiple software development teams, architecting solutions at scale to empower the business and owning all aspects of the SDLC: design, build, deliver, and maintain
- Inspire, coach, mentor, and support your team members in their day-to-day work and their long-term professional growth
- Attract, onboard, develop, and retain diverse top talent, while fostering an inclusive and collaborative team and culture (our latest DEI Report)
- Lead your team and peers by example; as a senior member of the team, your methodologies, technical and operational excellence practices, and system designs will help continuously improve our domain
- Identify, propose, and drive initiatives to advance the technical skills, standards, practices, architecture, and documentation of our engineering teams
- Facilitate technical debate and decision-making with an appreciation for trade-offs
- Continuously rethink and push the status quo, even when it challenges your/our established ideas

Preferred candidate profile:
- Results-oriented, collaborative, pragmatic, continuous-improvement mindset
- Hands-on experience driving software transformations within high-growth environments (think complex, cross-continentally owned products)
- 15+ years of experience in engineering, of which at least 10 years spent leading high-performing teams and their managers (please note that a minimum of 5 years leading fully fledged managers is required)
- Experience making architectural and design decisions for large-scale platforms, understanding the trade-offs between time-to-market and flexibility
- Significant experience and vocation in managing and enabling people's growth and performance
- Experience designing and building high-scale, generalizable products with outstanding user experience
- Practical experience in hiring and developing engineering teams and culture, and in leading interdisciplinary teams in a fast-paced agile environment
- Capability to communicate and collaborate across the wider organization, influencing decisions with and without direct authority, always with inclusive, adaptable, and persuasive communication
- Analytical and decision-making skills that integrate technical and business requirements
Posted 2 months ago
3.0 - 5.0 years
6 - 8 Lacs
Chandigarh
Work from Office
Role Overview
We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines on Google Cloud Platform (GCP). This role collaborates closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and downstream Power BI reporting layers.

Key Responsibilities
- Data Ingestion & Landing: Architect and implement landing zones in Cloud Storage for raw data. Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC).
- ETL Pipeline Development: Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion. Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments.
- Data Modeling: Design and maintain fact and dimension tables using Star and Snowflake schemas. Collaborate on semantic-layer definitions to support downstream reporting.
- Load & Orchestration: Load curated datasets into BigQuery across different zones (raw, staging, curated). Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting).
- Performance & Quality: Optimize ETL jobs for throughput, cost, and reliability. Implement monitoring, error handling, and data quality checks.
- Collaboration & Documentation: Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI. Maintain clear documentation of pipeline designs, data lineage, and operational runbooks.

Required Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of hands-on experience building ETL pipelines in GCP
- Proficiency with Cloud Data Fusion, including Wrangler transformations
- Strong command of SQL, including performance tuning in BigQuery
- Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats
- Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas
- Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization
- Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries)
- Excellent problem-solving skills and attention to detail

Preferred (Good to Have)
- Exposure to Power BI data modeling and DAX
- Experience with other GCP services (Dataflow, Dataproc)
- Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform)
- Knowledge of Python for custom transformations or orchestration scripts
- Understanding of data governance best practices and metadata management
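The dimensional-modeling requirement above (fact vs. dimension tables in a Star schema) can be illustrated with a minimal, hypothetical example; the table and column names are invented, and SQLite stands in for BigQuery purely so the sketch is self-contained:

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, month TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Asha', 'Pune'), (2, 'Ravi', 'Chennai');
INSERT INTO dim_date VALUES (20240101, '2024-01-01', '2024-01'),
                            (20240102, '2024-01-02', '2024-01');
INSERT INTO fact_sales VALUES (1, 1, 20240101, 100.0),
                              (2, 1, 20240102, 50.0),
                              (3, 2, 20240101, 75.0);
""")

# A typical reporting query: join the fact table to a dimension and aggregate.
rows = cur.execute("""
    SELECT c.city, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.city
    ORDER BY c.city
""").fetchall()
print(rows)  # [('Chennai', 75.0), ('Pune', 150.0)]
```

In a Snowflake schema, `dim_customer` would itself be normalized (e.g., city split into its own table); the fact-to-dimension join pattern stays the same.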
Posted 2 months ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Gurgaon/Pune/Hyderabad/Bengaluru/Chennai
Work Mode: Hybrid (2-3 days in office per week)

Job Description:
- 5-14 years of experience in Big Data and related data technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience with Apache Spark
- Hands-on programming with Python
- Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming
- Good understanding of Big Data querying tools such as Hive and Impala
- Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native cloud data services (AWS/Azure/GCP)
- Ability to lead a team efficiently
- Experience designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER
- Opportunity to work on technical challenges that may have impact across geographies
- Vast opportunities for self-development: online university, global knowledge-sharing opportunities, and learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
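The distributed-computing fundamentals this posting asks for (MapReduce, Spark) boil down to a map, shuffle, reduce pattern. A minimal sketch in plain Python (no Hadoop or Spark involved; on a real cluster the framework would partition each stage across nodes):

```python
from collections import defaultdict
from itertools import chain

lines = ["big data and big pipelines", "data pipelines at scale"]

# Map: emit (key, 1) pairs, as a MapReduce mapper or a Spark flatMap would.
mapped = chain.from_iterable(
    ((word, 1) for word in line.split()) for line in lines
)

# Shuffle: group values by key (the framework does this across the cluster).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each key's values independently.
counts = {key: sum(values) for key, values in groups.items()}
print(counts["big"], counts["data"], counts["pipelines"])  # 2 2 2
```

The same word-count is a one-liner in PySpark (`rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(add)`), which is why interviewers often use it to probe the concepts behind the API.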
Posted 2 months ago
4 - 9 years
5 - 12 Lacs
Pune
Work from Office
Night Shift: 9:00 PM to 6:00 AM
Hybrid Mode: 3 days WFO & 2 days WFH

Job Overview
We are looking for a savvy Data Engineer to manage in-progress and upcoming data infrastructure projects. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder using Python and a data wrangler who enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements using Python and SQL/AWS/Snowflake.
- Identify, design, and implement internal process improvements: automating manual processes using Python, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL/AWS/Snowflake technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.

Desired Skillset
- 2+ years of experience in a Python scripting and data-specific role, with a Bachelor's degree.
- Experience with data processing and cleaning libraries (e.g., Pandas, NumPy), web scraping/web crawling for process automation, and APIs and how they work.
- Ability to debug failing code and find the solution; basic knowledge of SQL Server job activity monitoring and of Snowflake.
- Experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra.
- Experience with most or all of the following cloud services: AWS, Azure, Snowflake, Google Cloud.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
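The cleaning work this role describes (type conversion, filling gaps) can be sketched with the standard library alone; this is a hypothetical example of the kind of transformation a Pandas-based pipeline would apply to tabular records, with invented field names:

```python
from statistics import mean

# Raw rows as they might arrive from a CSV extract: all strings, with gaps.
raw = [
    {"order_id": "1", "amount": "100.5"},
    {"order_id": "2", "amount": ""},        # missing value to impute
    {"order_id": "3", "amount": "99.5"},
]

# Type conversion: cast fields, mapping empty strings to None.
rows = [
    {"order_id": int(r["order_id"]),
     "amount": float(r["amount"]) if r["amount"] else None}
    for r in raw
]

# Imputation: fill missing amounts with the mean of the observed values.
observed = [r["amount"] for r in rows if r["amount"] is not None]
fill = mean(observed)  # (100.5 + 99.5) / 2 = 100.0
for r in rows:
    if r["amount"] is None:
        r["amount"] = fill

print([r["amount"] for r in rows])  # [100.5, 100.0, 99.5]
```

In Pandas the same two steps collapse to roughly `df["amount"] = pd.to_numeric(df["amount"]).fillna(df["amount"].mean())`; the stdlib version just makes each stage explicit.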
Posted 3 months ago
5.0 - 8.0 years
15 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job description:
Job Title: Developer - Data Engineer
Position: Senior Software Engineer
Experience: 5+ Years
Category: Software Developer
Main location: Bangalore/Chennai/Hyderabad/Pune
Position ID: J0825-0242
Employment Type: Full Time

Summary:
- 5+ years of total experience
- 3+ years of direct hands-on experience with GCP BigQuery and Apache Airflow
- Experience designing cloud solutions
- Experience developing scripts for DWH platforms using Hadoop, Python, PySpark
- Experience in migration activities from DWH to cloud platforms
- Excellent communication and presentation skills
- Strong analytical and troubleshooting skills

Your future duties and responsibilities:
- Develop scripts for DWH platforms using Hadoop, Python, PySpark
- Carry out migration activities from DWH to cloud platforms

Required qualifications to be successful in this role:
- Must-have skills: GCP BigQuery, Apache Airflow, Hadoop, Python, PySpark
- Good-to-have skills: communication and presentation skills, strong analytical and troubleshooting skills
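The BigQuery-plus-Airflow stack this role names is typically wired together as a DAG definition. A minimal configuration sketch, assuming Airflow 2.x with the `apache-airflow-providers-google` package; the DAG id, project, datasets, and SQL are all invented for illustration:

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical daily DAG that runs one BigQuery transformation step.
with DAG(
    dag_id="dwh_to_bq_daily",          # invented id, for illustration
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_curated = BigQueryInsertJobOperator(
        task_id="load_curated",
        configuration={
            "query": {
                "query": "SELECT * FROM `project.staging.orders`",  # placeholder SQL
                "destinationTable": {
                    "projectId": "project",
                    "datasetId": "curated",
                    "tableId": "orders",
                },
                "useLegacySql": False,
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```

A migration pipeline would typically chain several such tasks (extract from the legacy DWH, land in Cloud Storage, load to staging, transform to curated) with `>>` dependencies between them.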
Posted Date not available
5.0 - 10.0 years
12 - 22 Lacs
Bengaluru
Remote
Job Title: Lead Backend Developer - GCP BigQuery
Location: Remote
Experience: 5+ Years
Type: Full-Time

Job Summary:
We are seeking a highly skilled Lead Backend Developer with strong experience in Google Cloud Platform (GCP) and BigQuery. The ideal candidate will have a solid background in backend systems, cloud-based data solutions, and microservice development. This role does not require front-end skills and is fully focused on backend engineering and data operations.

Key Responsibilities:
- Design, develop, and deploy backend systems using GCP and BigQuery
- Lead and mentor a team of backend developers through project execution
- Manage code versions, pipelines, and CI/CD using Git and Azure DevOps
- Participate actively in Scrum ceremonies and Agile project delivery
- Develop and optimize microservices, data flows, and integration components
- Troubleshoot issues, monitor backend performance, and resolve technical bottlenecks
- Collaborate with cross-functional teams on architecture planning and documentation

Required Skills & Experience:
- Strong hands-on experience with GCP and Google BigQuery
- Proficiency with Git, Scrum, and Azure DevOps
- Experience working with Power BI, microservices, and backend integration
- Solid understanding of backend architecture and cloud-based data workflows
- Excellent communication, leadership, and problem-solving skills

Good to Have:
- Familiarity with SQL Server, DB2, Spring Boot, Java, C#, and JSON
- Experience with Azure AD, HTTP protocols, AMQP messaging, and technical documentation (e.g., README files)
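The microservice and JSON-over-HTTP work described in this role can be sketched with only the standard library; the endpoint path and payload shape are invented for illustration, and a real service would query BigQuery where the in-memory dict stands in:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a BigQuery lookup; a real backend would run a query here.
FAKE_STORE = {"42": {"order_id": "42", "status": "shipped"}}

def build_response(path: str) -> tuple[int, dict]:
    """Pure routing/serialization logic, kept separate so it is testable."""
    if path.startswith("/orders/"):
        order = FAKE_STORE.get(path.rsplit("/", 1)[-1])
        return (200, order) if order else (404, {"error": "not found"})
    return (404, {"error": "unknown route"})

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        status, body = build_response(self.path)
        payload = json.dumps(body).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To serve: HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()
```

Keeping the routing logic in a pure function (rather than inside the handler) is the design choice that makes a microservice like this unit-testable without a running server.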
Posted Date not available