Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Azure Infrastructure as Code (IaC)
Good to have skills: Microsoft Azure DevOps
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are delivered on time and meet quality standards. Your role will require effective communication and coordination with stakeholders to align project goals and expectations, fostering a collaborative environment that encourages innovation and efficiency.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Team player with good communication skills and critical thinking.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- 6 years of experience as a Platform Engineer or similar role, focusing on deploying, optimizing, and maintaining large-scale analytics platforms.
- Expertise in Azure DevOps, including experience with CI/CD (preferably in Azure DevOps).
- Experience with Terraform and Ansible.
- Knowledge of containerization technologies such as Docker and Kubernetes, particularly for deploying and managing data services.
- Good to know: Python and shell scripting.
- Knowledge of Role-Based Access Control and Identity and Access Management (IAM).
- Proactive attitude towards learning new technologies and staying updated with industry trends.
Additional Information:
- Real-world experience in deploying, optimizing, and maintaining large-scale analytics platforms in production environments.
- Good understanding of network architecture and security.
Qualification: 15 years full time education
Posted 3 weeks ago
8.0 - 12.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary
Are you a customer-focused professional with a passion for advocacy, leadership, and excellence? Do you thrive in a high-achieving, growth-oriented culture where you can be confident and have a clear sense of purpose? Would you enjoy building camaraderie with a team of diverse people from around the world who share the drive to be trusted strategic advisors to executives in high-impact situations? If so, we invite you to consider joining NetApp’s Customer Assurance Program (CAP) team! CAP is a global autonomous team of advocacy champions whose primary purpose is to sustain customer confidence and trust in NetApp. CAP serves as the highest level of escalation within NetApp, activated by exception when standard channels have proven inadequate. Once CAP accepts an engagement, the CAP Manager advocates for the best interests of both customer and NetApp while owning, managing, and resolving critical situations with a holistic ownership mindset, striving to turn risks into high-impact opportunities. In addition to escalation management, a CAP Manager will also have opportunities to lead or actively participate in continuous improvement projects and initiatives, championing the voice of our organization and customers. This provides further opportunities to lead global cross-functional teams, proposing new ideas, identifying root causes and systemic issues, recommending and implementing process improvements, and driving organizational change to enhance NetApp's support quality.
Job Requirements
Key Responsibilities
- Own and drive the resolution of critical customer escalations with end-to-end accountability, ensuring alignment with NetApp’s strategic goals.
- Conduct holistic situation appraisals and problem analyses to uncover specific customer pain points and drive resolution strategies that deliver long-term value.
- Lead diverse, cross-functional virtual teams across geographies, time zones, and cultures to resolve complex technical and business challenges, ensuring alignment with organizational priorities.
- Build trusted and sustained relationships with stakeholders across NetApp, customers, and partners, securing their commitment to expedite resolution and drive systemic improvements.
- Act as a strategic advisor to executives, confidently managing expectations, providing actionable insights, and serving as the primary point of contact throughout the CAP engagement.
- Develop and execute resolution plans that balance time, cost, and customer satisfaction, while identifying risks and implementing mitigation strategies to achieve sustainable outcomes.
- Communicate effectively with multicultural, cross-functional audiences at all levels, delivering clear, concise updates in both verbal and written formats.
- Simultaneously manage multiple high-impact escalations, projects, and initiatives, driving outcomes that align with NetApp’s strategic objectives.
- Document processes and resolutions with precision and clarity, ensuring insights are leveraged for continuous improvement and alignment with organizational goals.
- Identify systemic trends and root causes of escalations, driving improvements across “People, Processes, and Products” to enhance customer satisfaction and operational efficiency.
- Lead or contribute to CAP's continuous improvement initiatives and strategic projects, aligning outcomes with NetApp’s organizational vision and goals.
Required Skills and Attributes
- Be flexible and adaptable in fast-paced, volatile situations, quickly understanding escalation landscapes and adjusting to evolving changes and customer expectations.
- Develop and maintain strong relationships with key cross-functional stakeholders.
- Exhibit executive presence with excellent verbal and written communication skills, consistently delivering high-quality outputs.
- Communicate effectively under pressure, regardless of audience or issue complexity.
- Embody a "whatever it takes" attitude to remove obstacles, gain buy-in, and convey urgency in any situation, executing tasks with efficacy, accuracy, and consistency.
- Demonstrate strong situational and cross-functional leadership when managing escalations, projects, and initiatives, owning and driving resolutions.
- Think and act as an owner, taking initiative and personal responsibility for your own work and holding others accountable for theirs as the situation requires.
- Maintain diplomacy, assurance, and calm under pressure, balancing empathy and assertiveness when conveying needs and making challenging decisions.
- Embrace a growth mindset, viewing challenges as opportunities to learn and grow.
- Collaborate and leverage the strengths of others to achieve better outcomes.
- Foster an environment where others feel inspired to be their best selves.
- Possess strong time management skills to handle multiple complex issues simultaneously across various time zones.
- Holistically evaluate and clearly communicate the implications of decisions made during CAP engagements.
- Learn core technical knowledge of NetApp products and solutions.
Desired Experience and Education
- History of at least 8-10 years' work experience in the high-tech industry*.
- Proven record of leading globally distributed teams in support of Fortune 500 companies.
- Demonstrated ability in restoring and/or maintaining trust with customers at all levels.
- Extensive experience owning and resolving complex and/or critical situations.
- Strong background in delivering high-quality output to executive audiences.
- Skilled at influencing senior leadership and customers towards win-win agreements for successful outcomes.
- Proficiency in Microsoft 365 Suite and fundamental knowledge of storage infrastructure technologies.
*A combination of relevant education, training, and/or certifications, along with industry experience, may be considered instead of the required 8-10 years of work experience.
About NetApp
NetApp is the global Intelligent Data Infrastructure leader enabling organizations to manage any data, for any application, anywhere it’s needed - optimized, secured, and protected by intelligence. Only NetApp provides a silo-less approach combining unified data storage with enterprise-grade storage services natively embedded in the world’s biggest clouds. We offer integrated data services with built-in data resilience, policy-based governance, and CloudOps solutions with AI-powered optimization of on-prem and cloud infrastructure. Our company values – put the customer at the center, care for each other and our communities, build belonging every day, embrace a growth mindset, think and act like owners – inform every decision we make, from customer interactions and social causes to designing solutions and supporting our employees.
What NetApp Offers:
Impactful Work: Be part of a team that directly influences customer satisfaction, organizational development, and loyalty, making a real difference in customers’ experience with NetApp.
Career Growth: Opportunities for professional development and career advancement within a global, innovative company.
Collaborative Culture: A supportive and inclusive work environment where your ideas and contributions are valued.
Work-Life Balance: Flexible work arrangements and a commitment to work-life balance.
Posted 3 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Job Description:
- Develop and maintain high-performance Rust applications.
- Collaborate with cross-functional teams and product managers to design, develop, test, and implement software features and complex enterprise applications using the Rust programming language.
- Participate in code reviews, testing, and debugging to ensure high-quality code.
- Integrate it with Rust applications.
- Optimize and improve the existing codebase.
- Develop and maintain scalable backend architectures for high-performance systems.
- Develop and maintain SQL and NoSQL databases.
- Write clean, maintainable, and efficient code that meets the project's technical objectives.
- Keep up to date with new trends and emerging technologies in Rust and blockchain development.
- Work with development teams and product managers to ideate software solutions.
- Build the product as per the product specification defined by the product architects.
- Build back-end data services.
- Bring expertise and leadership to shape our Rust engineering division.
- Develop and manage well-functioning databases and applications.
- Work closely with the solution architect and product architect to implement the requirements.
- Troubleshoot, debug, and upgrade software.
- Must be proficient in Linux systems and able to create, update, and maintain Linux services.
Posted 3 weeks ago
4.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We are looking for a skilled SAP ABAP Developer with a strong programming background and hands-on experience in modern SAP technologies. The ideal candidate should have solid technical expertise across ABAP and related frameworks, along with a strong educational foundation.
Responsibilities:
- Design, develop, and implement SAP applications using ABAP/ABAP OO
- Develop SAP Core Data Services (CDS), OData services, and Fiori/UI5 applications
- Work with technologies such as BOPF, RAP, and HANA
- Integrate SOAP APIs and work with frontend scripting using JavaScript
- Collaborate with functional teams to translate business requirements into technical solutions
Required Skills:
- 4 to 10 years of experience in software development
- Strong educational background: Bachelor's degree in Engineering or MCA from reputed institutes
- Expertise in: ABAP/ABAP OO, CDS Views/OData Services, Fiori/UI5, BOPF/HANA/RAP, SOAP API/JavaScript, Migration, Implementation, SAP ABAP
Posted 3 weeks ago
5.0 - 10.0 years
10 - 14 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Opportunity for SAP Basis Senior Consultant
Experience: 5+ years relevant with SAP Basis
Notice Period: Immediate to 30 days
Location: Pune (Panchshil Business Park), Hyderabad (Orbit by Auro Realty), Bengaluru (Prestige Shantiniketan, Whitefield)
Mandate: 2 years of S/4 experience, 2 end-to-end implementations, 1 S/4 implementation
JD:
- S/4HANA migrations for ECC and add-on applications.
- System checks post work steps, certificates, interface connectivity.
- Process monitoring on ERP, NW, and HANA.
- Simplification item check runs for S/4HANA support and maintenance.
- Cloud tenant support for cloud systems, S/4HANA, Data Services.
- Support of application departments in problem analysis and resolution (RTR, EHS, ITF, PTP, RTP).
- Remote function calls, ACL, gateway, monitoring of work processes and transports.
- Solution Manager: analysis of usability for managed system configuration, monitoring, alerting.
- SNotes; certificate/key understanding and support.
- Opening and working on internal tickets and OSS incidents via SAP for Me, RISE with SAP.
- General SAP Basis tasks, monitoring of current ERP systems, TMS maintenance.
- Content Server updates and migrations.
- Maintenance (security patch cycles, housekeeping, job scheduling, printer support) of R/3; SPDD and SPAU adjustment of ABAP objects.
- BTP administration.
- Database administration: troubleshooting, upgrades, performance tuning.
- Planning and execution of SAP upgrades.
- S/4HANA migration/conversion.
- SAP performance and error analysis, system optimization, system refresh.
- SAP TMS.
- SAP Fiori implementation (at least 1 lifecycle).
Posted 3 weeks ago
2.0 - 5.0 years
12 - 16 Lacs
Pune
Work from Office
Overview
We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams.
Responsibilities
- Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks.
- Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization.
- Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues.
- Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability.
- Ensure high standards of data quality and reliability via robust validation and cleansing processes.
- Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management.
- Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly.
- Mentor junior team members and guide best practices across the team.
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 6+ years of experience in data engineering or related roles.
- Advanced proficiency in Python, PySpark, and SQL.
- Strong experience with Databricks, BigQuery, and modern data lakehouse design.
- Hands-on knowledge of Azure or GCP data services.
- Proven experience in performance tuning and large-scale data processing.
Strong communication skills and the ability to work independently in uncertain or evolving contexts. What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. 
We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
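The "data quality and reliability via robust validation and cleansing" responsibility above can be illustrated with a minimal sketch. Field names and rules here are invented for illustration and are not an actual ratings schema:

```python
# Hypothetical row-level validation/cleansing step for a data pipeline.
# The schema (entity_id, rating, as_of_date) and rules are illustrative
# assumptions, not a real production contract.

def validate_rating_row(row: dict) -> tuple[bool, dict]:
    """Return (is_valid, cleansed_row); reject rows missing required fields."""
    required = ("entity_id", "rating", "as_of_date")
    if any(row.get(k) in (None, "") for k in required):
        return False, row
    cleansed = dict(row)
    cleansed["rating"] = str(row["rating"]).strip().upper()  # normalize
    return True, cleansed

rows = [
    {"entity_id": "E1", "rating": " aa ", "as_of_date": "2024-01-31"},
    {"entity_id": "E2", "rating": None, "as_of_date": "2024-01-31"},
]
valid = [r for ok, r in map(validate_rating_row, rows) if ok]
```

In a real PySpark pipeline the same rules would typically be expressed as DataFrame filters or Delta Lake constraints rather than per-row Python.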
Posted 3 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Data Software Engineer with 5 to 12 years of experience in Big Data and related technologies. The ideal candidate will have expertise in distributed computing principles, Apache Spark, and hands-on programming with Python.
Roles and Responsibility:
- Design and implement Big Data solutions using Apache Spark and other relevant technologies.
- Develop and maintain large-scale data processing systems, including stream-processing systems.
- Collaborate with cross-functional teams to integrate data from multiple sources, such as RDBMS, ERP, and files.
- Optimize performance of Spark jobs and troubleshoot issues.
- Lead a team efficiently and contribute to the development of Big Data solutions.
- Experience with native cloud data services, such as AWS or Azure Databricks.
Requirements:
- Expert-level understanding of distributed computing principles and Apache Spark.
- Hands-on programming experience with Python and proficiency with Hadoop v2, MapReduce, HDFS, and Sqoop.
- Experience building stream-processing systems using technologies like Apache Storm or Spark Streaming.
- Good understanding of Big Data querying tools, such as Hive and Impala.
- Knowledge of ETL techniques and frameworks, along with experience with NoSQL databases like HBase, Cassandra, and MongoDB.
- Ability to work in an Agile environment and lead a team efficiently.
- Strong understanding of SQL queries, joins, stored procedures, and relational schemas.
- Experience integrating data from multiple sources, including RDBMS (SQL Server, Oracle), ERP, and files.
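The MapReduce model named in the requirements can be sketched in-process with plain Python; a real job would of course run distributed on Hadoop or Spark, and the word-count task here is just the classic toy example:

```python
# Toy in-process illustration of the MapReduce pattern:
# map emits (key, value) pairs, shuffle groups by key, reduce aggregates.
from collections import defaultdict
from itertools import chain

def map_phase(doc: str):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    return [(w.lower(), 1) for w in doc.split()]

def reduce_phase(pairs):
    # Shuffle step: group pairs by key, then sum each group's values.
    groups = defaultdict(int)
    for word, count in pairs:
        groups[word] += count
    return dict(groups)

docs = ["spark and hadoop", "spark streaming"]
counts = reduce_phase(chain.from_iterable(map_phase(d) for d in docs))
```

Because mappers are independent per document and reducers independent per key, both phases parallelize across a cluster, which is the distributed computing principle the posting refers to.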
Posted 3 weeks ago
4.0 - 6.0 years
5 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Application Developer with 4 to 6 years of experience to design, build, and configure applications that meet business process and application requirements. This position is based at our Hyderabad office.
Roles and Responsibility:
- Design, build, and configure applications to meet business process and application requirements.
- Collaborate with team members to understand project needs and develop application features.
- Ensure solutions align with business objectives and engage in testing and troubleshooting to enhance application performance and user experience.
- Continuously seek opportunities for improvement and innovation in application development processes.
- Participate in code reviews to ensure quality and adherence to best practices.
- Assist in documenting application processes and workflows.
Requirements:
- Proficiency in SAP CPI for Data Services.
- Strong understanding of application development methodologies.
- Experience with integration tools and techniques.
- Familiarity with data management and data transformation processes.
- Ability to troubleshoot and resolve application issues efficiently.
- Minimum 3 years of experience in SAP CPI for Data Services.
- 15 years of full-time education is required.
Posted 3 weeks ago
6.0 - 8.0 years
5 - 8 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd. (Employment Firms/Recruitment Services Firms)
Experience: 6 to 8 years
Title: SAP BusinessObjects Data Services (ref: 6566344)
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require a balance of technical expertise and leadership skills to drive successful project outcomes and foster a collaborative team environment.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.
Professional & Technical Skills:
- Must-have skills: proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) processes and data warehousing concepts.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve technical issues related to data services.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 3 weeks ago
8.0 - 13.0 years
2 - 30 Lacs
Bengaluru
Work from Office
A Snapshot of Your Day
On a typical day, you will lead the design and implementation of scalable ETL/ELT data pipelines using Python or C#, while managing cloud-based data architectures on platforms like Azure and AWS. You'll collaborate with data scientists and analysts to ensure seamless data integration for analysis and reporting, and mentor junior engineers on standard processes. Additionally, you will supervise and optimize data pipelines for performance and cost efficiency, while ensuring compliance with data security and governance regulations.
How You'll Make An Impact
For our Onshore Execution Digital Product Development team, we are looking for a highly skilled Data Engineer with 8-10 years of experience to join our team. In this role, you will take ownership of designing and implementing data pipelines, optimizing data workflows, and supporting the data infrastructure. You will work with large datasets and cloud technologies while ensuring data quality, performance, and scalability.
- Lead the design and implementation of scalable ETL/ELT data pipelines using Python or C# for efficient data processing.
- Architect data solutions for large-scale batch and real-time processing using cloud services (AWS, Azure, Google Cloud).
- Craft and manage cloud-based data architectures with services like AWS Redshift, Google BigQuery, Azure Data Lake, and Snowflake.
- Implement cloud data solutions using Azure services such as Azure Data Lake, Blob Storage, SQL Database, Synapse Analytics, and Data Factory.
- Develop and automate data workflows for seamless integration into Azure platforms for analysis and reporting.
- Manage and optimize Azure SQL Database, Cosmos DB, and other databases for high availability and performance.
- Supervise and optimize data pipelines for performance and cost efficiency.
- Implement data security and governance practices in compliance with regulations (GDPR, HIPAA) using Azure security features.
- Collaborate with data scientists and analysts to deliver data solutions that meet business analytics needs.
- Mentor junior data engineers on standard processes in data engineering and pipeline design.
- Set up supervising and alerting systems for data pipeline reliability.
- Ensure data accuracy and security through strong governance policies and access controls.
- Maintain documentation for data pipelines and workflows for transparency and onboarding.
What You Bring
- 8-10 years of proven experience in data engineering with a focus on large-scale data pipelines and cloud infrastructure.
- Strong expertise in Python (Pandas, NumPy, ETL frameworks) or C# for efficient data processing solutions.
- Extensive experience with cloud platforms (AWS, Azure, Google Cloud) and their data services.
- Sophisticated knowledge of relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra).
- Familiarity with big data technologies (Apache Spark, Hadoop, Kafka).
- Strong background in data modeling and ETL/ELT development for large datasets.
- Experience with version control (Git) and CI/CD pipelines for data solution deployment.
- Excellent problem-solving skills for troubleshooting data pipeline issues.
- Experience in optimizing queries and data processing for speed and cost-efficiency.
- Preferred: Experience integrating data pipelines with machine learning or AI models.
- Preferred: Knowledge of Docker, Kubernetes, or containerized services for data workflows.
- Preferred: Familiarity with automation tools (Apache Airflow, Luigi, DBT) for managing data workflows.
- Preferred: Understanding of data privacy regulations (GDPR, HIPAA) and governance practices.
About The Team
Who is Siemens Gamesa? Siemens Gamesa is part of Siemens Energy, a global leader in energy technology with a rich legacy of innovation spanning over 150 years. Together, we are committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. As a leading player in the wind industry and manufacturer of wind turbines, we are passionate about driving the energy transition and providing innovative solutions that meet the growing energy demand of the global community. At Siemens Gamesa, we are always looking for dedicated individuals to join our team and support our focus on energy transformation.
Our Commitment to Diversity
Lucky for us, we are not all the same. Through diversity, we generate power. We run on inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.
Rewards/Benefits
All employees are automatically covered under the Medical Insurance: a company-paid, considerable family floater cover covering the employee, spouse, and 2 dependent children up to 25 years of age. Siemens Gamesa provides all its employees the option to opt for a Meal Card, as per the terms and conditions prescribed in the company policy, as a part of CTC and a tax-saving measure. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 3 weeks ago
5.0 - 10.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
JD:
- 5+ years of experience in software engineering
- Strong proficiency in SQL, with a deep understanding of query optimization and performance tuning
- Experience in implementing automated SQL code review using AI/ML techniques to identify performance bottlenecks and suggest query optimizations
- Experience working with GCP services
- Solid hands-on experience with Python for scripting
- Experience with automation of GitHub Actions
- Hands-on experience in designing, developing, and deploying microservices
- Experience in building APIs in FastAPI/Flask for data services and system integration
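A first step in the query optimization work described above is inspecting the query plan to confirm an index is used instead of a full table scan. A minimal sketch, using SQLite's `EXPLAIN QUERY PLAN` from the Python standard library as a stand-in for whatever database the role actually uses:

```python
# Check that an equality filter on an indexed column uses the index.
# Table and index names are illustrative, not from the posting.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = ?", (42,)
).fetchall()
# Each plan row's last column is a human-readable detail string;
# with the index present it should report an index search, not a scan.
uses_index = any("idx_events_user" in row[-1] for row in plan)
```

Production databases expose richer equivalents (`EXPLAIN ANALYZE` in PostgreSQL, BigQuery's execution details), but the workflow of reading the plan before and after adding an index is the same.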
Posted 3 weeks ago
0.0 - 2.0 years
3 - 7 Lacs
Noida
Work from Office
Understanding of Azure data services, including Azure SQL Database, Azure Data Lake Storage, and Azure Databricks. Knowledge of data integration and ETL concepts. Familiarity with SQL and programming languages such as Python or Scala. Basic understanding of data modeling and database design. Good communication skills.
Experience: 0-2 years
Skills:
- Primary Skill: Data Engineering
- Sub Skill(s): Data Engineering
- Additional Skill(s): Databricks, Azure Data Factory, PySpark
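The "data integration and ETL concepts" expected here boil down to an extract-transform-load round trip. A minimal pure-Python sketch (the CSV layout and unit conversion are invented for illustration; a real Azure pipeline would use Data Factory or Databricks):

```python
# Minimal ETL round trip: extract from a CSV source, transform a column,
# load into a toy sink. All data below is made up.
import csv
import io

raw = "city,temp_c\nNoida,31\nPune,24\n"          # extract: pretend source file
rows = list(csv.DictReader(io.StringIO(raw)))
transformed = [                                    # transform: derive Fahrenheit
    {"city": r["city"], "temp_f": int(r["temp_c"]) * 9 / 5 + 32}
    for r in rows
]
warehouse = {r["city"]: r["temp_f"] for r in transformed}  # load: toy sink
```

The same three stages map directly onto Azure Data Factory activities (copy, data flow, sink) or a PySpark job's read/transform/write steps.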
Posted 3 weeks ago
2.0 - 6.0 years
5 - 8 Lacs
Pune
Work from Office
Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with Business and IT teams to understand requirements and best leverage technologies to enable agile data delivery at scale. Note:- Although the role category in the GPP is listed as Remote, the requirement is for a Hybrid work model. Key Responsibilities: Oversee the development and deployment of end-to-end data ingestion pipelines using Azure Databricks, Apache Spark, and related technologies. Design high-performance, resilient, and scalable data architectures for data ingestion and processing. Provide technical guidance and mentorship to a team of data engineers. Collaborate with data scientists, business analysts, and stakeholders to integrate various data sources into the data lake/warehouse. Optimize data pipelines for speed, reliability, and cost efficiency in an Azure environment. Enforce and advocate for best practices in coding standards, version control, testing, and documentation. Work with Azure services such as Azure Data Lake Storage, Azure SQL Data Warehouse, Azure Synapse Analytics, and Azure Blob Storage. Implement data validation and data quality checks to ensure consistency, accuracy, and integrity. Identify and resolve complex technical issues proactively. Develop reliable, efficient, and scalable data pipelines with monitoring and alert mechanisms. Use agile development methodologies, including DevOps, Scrum, and Kanban. External Qualifications and Competencies Technical Skills: Expertise in Spark, including optimization, debugging, and troubleshooting. Proficiency in Azure Databricks for distributed data processing. Strong coding skills in Python and Scala for data processing. Experience with SQL for handling large datasets. Knowledge of data formats such as Iceberg, Parquet, ORC, and Delta Lake. 
Understanding of cloud infrastructure and architecture principles, especially within Azure. Leadership & Soft Skills: Proven ability to lead and mentor a team of data engineers. Excellent communication and interpersonal skills. Strong organizational skills with the ability to manage multiple tasks and priorities. Ability to work in a fast-paced, constantly evolving environment. Strong problem-solving, analytical, and troubleshooting abilities. Ability to collaborate effectively with cross-functional teams. Competencies: System Requirements Engineering: Uses appropriate methods to translate stakeholder needs into verifiable requirements. Collaborates: Builds partnerships and works collaboratively to meet shared objectives. Communicates Effectively: Delivers clear, multi-mode communications tailored to different audiences. Customer Focus: Builds strong customer relationships and delivers customer-centric solutions. Decision Quality: Makes good and timely decisions to keep the organization moving forward. Data Extraction: Performs ETL activities and transforms data for consumption by downstream applications. Programming: Writes and tests computer code, version control, and build automation. Quality Assurance Metrics: Uses measurement science to assess solution effectiveness. Solution Documentation: Documents information for improved productivity and knowledge transfer. Solution Validation Testing: Ensures solutions meet design and customer requirements. Data Quality: Identifies, understands, and corrects data flaws. Problem Solving: Uses systematic analysis to address and resolve issues. Values Differences: Recognizes the value that diverse perspectives bring to an organization. Preferred Knowledge & Experience: Exposure to Big Data open-source technologies (Spark, Scala/Java, Map-Reduce, Hive, HBase, Kafka, etc.). Experience with SQL and working with large datasets. Clustered compute cloud-based implementation experience. 
Familiarity with developing applications requiring large file movement in a cloud-based environment. Exposure to Agile software development and analytical solutions. Exposure to IoT technology. Additional Responsibilities Unique to this Position Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field. Experience: 3 to 5 years of experience in data engineering or a related field. Strong hands-on experience with Azure Databricks, Apache Spark, Python/Scala, CI/CD, Snowflake, and Qlik for data processing. Experience working with multiple file formats like Parquet, Delta, and Iceberg. Knowledge of Kafka or similar streaming technologies. Experience with data governance and data security in Azure. Proven track record of building large-scale data ingestion and ETL pipelines in cloud environments. Deep understanding of Azure Data Services. Experience with CI/CD pipelines, version control (Git), Jenkins, and agile methodologies. Familiarity with data lakes, data warehouses, and modern data architectures. Experience with Qlik Replicate (optional).
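The data validation and data quality checks this role calls for can be sketched in plain Python. This is a minimal, framework-free illustration rather than the employer's actual pipeline; in the Databricks stack the same predicates would typically be expressed as Spark column filters, and the field names and rules below are hypothetical:

```python
def validate(rows):
    """Partition rows into valid/invalid using simple quality rules.

    Illustrative rules: 'id' must be present, 'amount' must be a
    non-negative number. Invalid rows are kept for quarantine and
    review rather than silently dropped.
    """
    valid, invalid = [], []
    for row in rows:
        ok = (
            row.get("id") is not None
            and isinstance(row.get("amount"), (int, float))
            and row["amount"] >= 0
        )
        (valid if ok else invalid).append(row)
    return valid, invalid

batch = [
    {"id": 1, "amount": 10.5},
    {"id": None, "amount": 3.0},   # missing key -> quarantined
    {"id": 2, "amount": -4},       # negative amount -> quarantined
]
good, bad = validate(batch)
```

Keeping rejected rows rather than discarding them mirrors the quarantine-table pattern commonly used in ingestion pipelines, so consistency issues stay auditable.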
Posted 3 weeks ago
1.0 - 3.0 years
16 - 19 Lacs
Bengaluru
Work from Office
About The Position: Chevron invites applications for the role of Cloud Engineer Data Hosting within our team in India. This position supports Chevron's data hosting environment by delivering modern digital data hosting capabilities in a cost-competitive, reliable, and secure manner. This position will provide broad exposure to the application of technology to enable business, with many opportunities for growth and professional development for the candidate. Key Responsibilities: Design, implement, and manage scalable and secure data hosting solutions on Azure. Develop and maintain data architectures, including data models, data warehouses, and data lakes. Refine data storage and extraction procedures to enhance performance and cost-effectiveness. Uphold stringent data security measures and ensure adherence to relevant industry standards and regulatory requirements. Collaborate with data scientists, analysts, and other stakeholders to understand and address their data needs. Monitor and troubleshoot data hosting environments to ensure high availability and reliability. Streamline data workflows and operations through the automation capabilities of Azure Data Factory and comparable technologies. Design, develop, and deploy modular cloud-based systems. Develop and maintain cloud solutions in accordance with best practices. Required Qualifications: Must have a bachelor's degree in Computer Science, Engineering, or a related discipline. 0-5 years' experience. At least 2 years of experience in data hosting for both on-premises and Azure environments. Microsoft AZ-900 certification. Proficient in utilizing Azure data services, including Azure SQL Database, Azure Data Lake Storage, and Azure Data Factory. In-depth understanding of cloud infrastructure, encompassing virtual networks, storage solutions, and compute resources within Azure. Extensive hands-on experience with Azure services such as Azure SQL Database, Azure Blob Storage, Azure Data Lake, and Azure Synapse Analytics. Well-versed in on-premises
storage systems from vendors like NetApp, Dell, and others. Skilled in scripting with PowerShell, Python, and the Azure CLI, and in automation tooling such as Ansible, for automation and management tasks. Comprehensive knowledge of Azure security best practices, including identity and access management, encryption, and compliance standards. Preferred Qualifications: Demonstrated proficiency in architecting, deploying, and managing secure and scalable data hosting solutions on the Azure platform. Extensive experience in developing and maintaining robust data architectures, including data models, data warehouses, and data lakes, utilizing Azure services. Expertise in optimizing data storage and retrieval processes for superior performance and cost efficiency within Azure environments. In-depth knowledge of data security protocols and compliance with industry standards and regulations, with a focus on Azure cloud compliance. Proven ability to collaborate effectively with data scientists, analysts, and other stakeholders to address their data needs using Azure's capabilities. Strong track record of monitoring and troubleshooting Azure data hosting environments to ensure high availability and system reliability. Skilled in automating data workflows and processes using Azure Data Factory and other Azure-based automation tools. Experience in designing, developing, and deploying modular, cloud-based systems, with a particular emphasis on Azure solutions. Commitment to maintaining cloud solutions in alignment with Azure best practices and continuously integrating Azure's latest updates and features. Possession of Azure certifications, such as Azure Data Engineer Associate or Azure Database Administrator Associate, with a preference for candidates holding the Azure Solutions Architect Expert certification or equivalent advanced credentials. Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support
business requirements. The standard work week will be Monday to Friday. Working hours are 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
Posted 3 weeks ago
11.0 - 13.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Principal AWS Data Engineer Location : Bangalore Experience : 9 - 12 years Job Summary: In this key leadership role, you will lead the development of foundational components for a Lakehouse architecture on AWS and drive the migration of existing data processing workflows to the new Lakehouse solution. You will work across the Data Engineering organisation to design and implement scalable data infrastructure and processes using technologies such as Python, PySpark, EMR Serverless, Iceberg, Glue and Glue Data Catalog. The main goal of this position is to ensure successful migration and establish robust data quality governance across the new platform, enabling reliable and efficient data processing. Success in this role requires deep technical expertise, exceptional problem-solving skills, and the ability to lead and mentor within an agile team. Must Have Tech Skills: Prior Principal Engineer experience, leading team best practices in design, development, and implementation, mentoring team members, and fostering a culture of continuous learning and innovation Extensive experience in software architecture and solution design, including microservices, distributed systems, and cloud-native architectures. Expert in Python and Spark, with a deep focus on ETL data processing and data engineering practices. 
Deep technical knowledge of AWS data services and engineering practices, with demonstrable experience of implementing data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, Athena Experience of delivering Lakehouse solutions/architectures Nice To Have Tech Skills: Knowledge of additional programming languages and development tools to provide flexibility and adaptability across varied data engineering projects A master’s degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous Key Accountabilities: Lead complex projects autonomously, fostering an inclusive and open culture within development teams. Mentor team members and lead technical discussions. Provides strategic guidance on best practices in design, development, and implementation. Leads the development of high-quality, efficient code and develops necessary tools and applications to address complex business needs Collaborates closely with architects, Product Owners, and Dev team members to decompose solutions into Epics, leading the design and planning of these components. Drive the migration of existing data processing workflows to a Lakehouse architecture, leveraging Iceberg capabilities. Serves as an internal subject matter expert in software development, advising stakeholders on best practices in design, development, and implementation Key Skills: Deep technical knowledge of data engineering solutions and practices. Expert in AWS services and cloud solutions, particularly as they pertain to data engineering practices Extensive experience in software architecture and solution design Specialized expertise in Python and Spark Ability to provide technical direction, set high standards for code quality and optimize performance in data-intensive environments. Skilled in leveraging automation tools and Continuous Integration/Continuous Deployment (CI/CD) pipelines to streamline development, testing, and deployment. 
Exceptional communicator who can translate complex technical concepts for diverse stakeholders, including engineers, product managers, and senior executives. Provides thought leadership within the engineering team, setting high standards for quality, efficiency, and collaboration. Experienced in mentoring engineers, guiding them in advanced coding practices, architecture, and strategic problem-solving to enhance team capabilities. Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential. Bonus Skills: Financial Services expertise preferred, working with Equity and Fixed Income asset classes and a working knowledge of Indices.
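One recurring step in the migration work this role describes is mapping legacy records onto the target Lakehouse schema while routing failed casts to a quarantine path instead of aborting the batch. A pure-Python sketch of that idea (the field names and legacy-to-target mapping are invented for illustration; a production implementation would express this with Spark/Glue transforms and Iceberg table writes):

```python
# Hypothetical mapping from legacy column names to target schema names.
LEGACY_TO_TARGET = {"cust_id": "customer_id", "amt": "amount_usd"}

def migrate_record(rec):
    """Map a legacy record onto the target schema.

    Returns (record, None) on success, or (None, error) when a
    required field is missing or a cast fails, so callers can send
    failures to a quarantine table rather than failing the batch.
    """
    out = {}
    for old, new in LEGACY_TO_TARGET.items():
        if old not in rec:
            return None, f"missing field: {old}"
        out[new] = rec[old]
    try:
        out["amount_usd"] = float(out["amount_usd"])
    except (TypeError, ValueError):
        return None, "amount_usd not numeric"
    return out, None

row, err = migrate_record({"cust_id": "C42", "amt": "19.99"})
bad, why = migrate_record({"cust_id": "C1"})  # no 'amt' field
```

Returning an explicit (value, error) pair keeps each record's outcome inspectable, which is what makes per-row quarantining possible in a larger pipeline.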
Posted 3 weeks ago
6.0 - 7.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary: Experience : 5 - 8 Years Location : Bangalore Contribute to building state-of-the-art data platforms in AWS, leveraging Python and Spark. Be part of a dynamic team, building data solutions in a supportive and hybrid work environment. This role is ideal for an experienced data engineer looking to step into a leadership position while remaining hands-on with cutting-edge technologies. You will design, implement, and optimize ETL workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires technical expertise, strong problem-solving skills, and the ability to collaborate effectively within an agile team. Must Have Tech Skills: Demonstrable experience as a senior data engineer. Expert in Python and Spark, with a deep focus on ETL data processing and data engineering practices. Experience of implementing data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, Athena Experience with data services in Lakehouse architecture. Good background and proven experience of data modelling for data platforms Nice To Have Tech Skills: A master’s degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous Key Accountabilities: Provides guidance on best practices in design, development, and implementation, ensuring solutions meet business requirements and technical standards. Works closely with architects, Product Owners, and Dev team members to decompose solutions into Epics, leading design and planning of these components. Drive the migration of existing data processing workflows to the Lakehouse architecture, leveraging Iceberg capabilities. Communicates complex technical information clearly, tailoring messages to the appropriate audience to ensure alignment. Key Skills: Deep technical knowledge of data engineering solutions and practices. 
Implementation of data pipelines using AWS data services and Lakehouse capabilities. Highly proficient in Python, Spark and familiar with a variety of development technologies. Skilled in decomposing solutions into components (Epics, stories) to streamline development. Proficient in creating clear, comprehensive documentation. Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation. Previous Financial Services experience delivering data solutions against financial and market reference data. Solid grasp of Data Governance and Data Management concepts, including metadata management, master data management, and data quality. Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential. Bonus Skills: A working knowledge of Indices, Index construction and Asset Management principles.
Posted 3 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the required standards and functionality. You will also be responsible for developing new features and addressing any issues that arise, contributing to the overall success of the projects you are involved in. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge.- Continuously evaluate and improve development processes to increase efficiency. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong understanding of data integration and ETL processes.- Experience with data quality and data profiling techniques.- Familiarity with database management systems and SQL.- Ability to troubleshoot and resolve technical issues effectively. Additional Information:- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 3 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Project Role : Software Development Lead Project Role Description : Develop and configure software systems either end-to-end or for a specific stage of product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Software Development Lead, you will engage in the development and configuration of software systems, either managing the entire process or focusing on specific stages of the product lifecycle. Your day will involve collaborating with team members, applying your knowledge of various technologies and methodologies, and ensuring that the software solutions meet client needs effectively and efficiently. You will also be responsible for troubleshooting issues and implementing improvements to enhance system performance and user experience. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Facilitate knowledge sharing sessions to enhance team capabilities.- Mentor junior team members to foster their professional growth. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong understanding of data integration and transformation processes.- Experience with ETL (Extract, Transform, Load) methodologies.- Familiarity with database management systems and SQL.- Ability to troubleshoot and optimize data workflows. Additional Information:- The candidate should have a minimum of 3 years of experience in SAP BusinessObjects Data Services.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 3 weeks ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role : Business Process Architect Project Role Description : Analyze and design new business processes to create the documentation that guides the implementation of new processes and technologies. Partner with the business to define product requirements and use cases to meet process and functional requirements. Participate in user and task analysis to represent business needs. Must have skills : SAP CPI for Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Business Architect, you will define opportunities to create tangible business value for the client by leading current state assessments and identifying high-level customer requirements. Your typical day will involve collaborating with various stakeholders to understand their needs, analyzing existing processes, and designing innovative solutions that align with the client's strategic goals. You will also be responsible for developing comprehensive business cases that outline the necessary steps to achieve the envisioned outcomes, ensuring that all proposed solutions are practical and beneficial for the organization. Key Responsibilities: 1. Design, build, and configure IBP CI-DS and RTI applications to meet business process and application requirements. 2. Play the role of stream lead for individual IBP integration module processes. 3. Drive discussions with the client and conduct workshops. 4. Drive IBP project deliverables and liaise with other teams. 5. Effectively communicate with internal and external stakeholders. 6. Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. 7. Collaborate with cross-functional teams to create the process blueprint and establish business process requirements to drive out application requirements and metrics. 8.
Assist in quality management reviews, ensuring all business and design requirements are met. 9. Educate stakeholders to ensure a complete understanding of the designs. Functional Expertise: 1. Must To Have Skills: Proficiency in SAP CPI for Data Services. 2. Good To Have Skills: Knowledge of SAP IBP functional modules. 3. Strong understanding of integration patterns and data transformation techniques. 4. Experience with process mapping and business process modeling. 5. Ability to communicate complex concepts clearly to diverse audiences. 6. Familiarity with project management methodologies and tools. Additional Information: 1. The candidate should have a minimum of 3.5 years of experience in SAP CPI for Data Services. 2. A 15 years full time education is required. Qualification 15 years full time education
Posted 3 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Pune
Work from Office
Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Support Engineer, you will act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring the smooth operation of essential applications. You will engage in problem-solving activities, analyze system performance, and implement solutions to enhance system reliability and efficiency, all while maintaining a focus on delivering exceptional service to stakeholders. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Assist in the documentation of processes and solutions to enhance team knowledge.- Engage in continuous learning to stay updated with the latest technologies and best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong analytical skills to diagnose and resolve software issues.- Experience with data integration and transformation processes.- Familiarity with database management systems and SQL.- Ability to work collaboratively in a team-oriented environment. Additional Information:- The candidate should have a minimum of 3 years of experience in SAP BusinessObjects Data Services.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 3 weeks ago
15.0 - 20.0 years
17 - 22 Lacs
Navi Mumbai
Work from Office
Project Role : Software Development Lead Project Role Description : Develop and configure software systems either end-to-end or for a specific stage of product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA. Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support projects and clients effectively. You will engage in problem-solving activities, guiding your team through challenges while ensuring that project goals are met efficiently and effectively. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure alignment with strategic objectives. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong understanding of data integration and ETL processes.- Experience with data quality management and data profiling.- Familiarity with database management systems and SQL.- Ability to troubleshoot and optimize data workflows. Additional Information:- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services.- This position is based at our Mumbai office.- A 15 years full time education is required.
Qualification 15 years full time education
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Chennai
Work from Office
Project Role : Software Development Lead Project Role Description : Develop and configure software systems either end-to-end or for a specific stage of product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity. Must have skills : SAP BusinessObjects Data Services Good to have skills : Life Sciences. Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Software Development Lead, you will be responsible for developing and configuring software systems, applying knowledge of technologies, methodologies, and tools to support clients or projects in Chennai. You will lead the software development process from end-to-end or for specific product lifecycle stages. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Lead the software development process from end-to-end.- Implement innovative solutions to enhance software systems.- Ensure timely delivery of high-quality software products. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.- Good To Have Skills: Experience with Life Sciences.- Strong understanding of data integration and ETL processes.- Experience in designing and implementing data migration strategies.- Knowledge of data quality management and data governance practices. Additional Information:- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services.- This position is based at our Chennai office.- A 15 years full-time education is required. Qualification 15 years full time education
Posted 3 weeks ago
15.0 - 20.0 years
17 - 22 Lacs
Navi Mumbai
Work from Office
Project Role : Data Platform Architect Project Role Description : Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Microsoft Azure Data Services Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure seamless integration between systems and data models, while also addressing any challenges that arise during the implementation process. You will engage in discussions with stakeholders to align the data architecture with business objectives, ensuring that the data platform meets the needs of the organization effectively. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure alignment with architectural standards. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Azure Data Services.- Good To Have Skills: Experience with data governance frameworks.- Strong understanding of data modeling techniques.- Familiarity with cloud-based data storage solutions.- Experience in implementing data integration strategies.
Additional Information:- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.- This position is based at our Mumbai office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 3 weeks ago
4.0 - 5.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant international team. We provide a variety of competitive, innovative, and reliable Record-to-Report services, from maintaining financial records to financial closing and reporting. We process, compile, and deliver relevant financial information covering Accounting & Closing, Commercial, Tax and Master Data Services. We deliver maximum value to the business by driving Record-to-Report optimization and digitalization using our entrepreneurial approach. We also support our customers' current and future business requirements with the help of our high level of process and automation competency. You'll make a difference by: You will be responsible for performing billing, journal postings, reporting, and controlling activities. You will coordinate with front-office colleagues/customers and the internal team. Reconcile transactions by comparing and correcting data. Drive process improvements that impact the function. Perform other commercial/Hub-related activities. Your success is grounded in: You are a graduate with 4-5 years of experience. Very good communication skills. Strong accounting knowledge, including P&L and Balance Sheet. You are proficient in MS Office (advanced MS Excel). Flexible with shifts and overtime. Interpersonal skills. Analytical ability. Join us and be yourself! This role is based in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We're Siemens, a collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in.
All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow.
Posted 3 weeks ago
7.0 - 12.0 years
14 - 18 Lacs
Noida
Work from Office
Who We Are: Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you. What you need * BS in an Engineering or Science discipline, or equivalent experience * 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role * Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL) * Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments * Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW * Experience working on larger initiatives building and rationalizing large-scale data environments with a large variety of data pipelines, possibly with internal and external partner integrations, would be a plus * Willingness to experiment and learn new approaches and technology applications * Knowledge and experience with various relational databases and demonstrable proficiency in SQL and supporting analytics uses and users * Knowledge of software engineering and agile development best practices * Excellent written and verbal communication skills The Brightly culture We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive.
Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.
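Of the integration patterns this posting lists, replication/CDC is the simplest to show in a few lines: apply an ordered stream of change events to a keyed table, with later events winning. A simplified in-memory sketch (the event shape is an assumption for illustration, not any specific tool's format):

```python
def apply_cdc(table, events):
    """Apply ordered CDC events to an in-memory keyed table.

    Each event is {'op': 'upsert' | 'delete', 'key': ..., 'row': ...}.
    Later events overwrite earlier ones, mirroring the
    last-write-wins semantics of replication/CDC pipelines.
    """
    for ev in events:
        if ev["op"] == "upsert":
            table[ev["key"]] = ev["row"]
        elif ev["op"] == "delete":
            table.pop(ev["key"], None)
    return table

state = apply_cdc({}, [
    {"op": "upsert", "key": 1, "row": {"v": "a"}},
    {"op": "upsert", "key": 1, "row": {"v": "b"}},  # update wins
    {"op": "upsert", "key": 2, "row": {"v": "c"}},
    {"op": "delete", "key": 2},                     # row removed
])
```

Production CDC tooling adds ordering guarantees, schema handling, and durable storage, but the core apply step reduces to this keyed last-write-wins merge.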
Posted 3 weeks ago