
4851 Hadoop Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office

Source: Naukri

Project Role: Data Platform Architect

Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must-have skills: Snowflake Data Warehouse; AWS & Python
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to gather requirements and provide insights that shape the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Data Platform Engineer

Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the implementation of data platform components.
- Ensure data platform scalability and performance.
- Conduct regular data platform audits.
- Stay updated on emerging data platform technologies.

Professional & Technical Skills:
- Must-have: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data platforms.
- Experience with data integration and data modeling.
- Hands-on experience with data pipeline orchestration tools.
- Knowledge of data security and compliance standards.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

Source: Naukri

Snowflake Data Engineer

We're looking for a candidate with a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have:
- Exposure to DevOps pipelines within a data engineering context
- At least a high-level understanding of AWS services and how they fit into modern data architectures
- A proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks
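To make the SQL-plus-Python expectation concrete, here is a minimal ETL sketch using only the standard library's sqlite3 module; the table and column names are invented for illustration and stand in for a real SQL Server or Snowflake source:

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform in Python, load a clean table.
# Table and column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, " 120.50", "west"), (2, "80", "WEST"), (3, None, "east")],
)

# Transform: drop rows with missing amounts, coerce types, normalize casing.
clean = [
    (rid, float(amount), region.lower())
    for rid, amount, region in conn.execute("SELECT * FROM raw_orders")
    if amount is not None
]

conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

total_west = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'west'"
).fetchone()[0]
print(total_west)  # 200.5
```

The same extract-transform-load shape carries over to Snowflake; only the connector and SQL dialect change.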

Posted 2 days ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune

Work from Office

Source: Naukri

Snowflake Data Engineer

For this role, we're looking for a candidate with a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have:
- Exposure to DevOps pipelines within a data engineering context
- At least a high-level understanding of AWS services and how they fit into modern data architectures
- A proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks

Posted 2 days ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities

As a Data Engineer at IBM, you'll play a vital role in development and application design, and provide regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience with AWS, Azure, or GCP, including cloud storage systems.

Preferred Technical and Professional Experience
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products and platform- and customer-facing systems.
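PySpark itself needs a Spark runtime, so as a hedged illustration the canonical Spark exercise, word count, is sketched below in plain Python, with comments showing the RDD transform each step mirrors; the log lines are made up:

```python
from collections import Counter

# Word count, the canonical Spark example, in plain Python.
# Each step mirrors a PySpark RDD transform (shown in the comments).
lines = ["error disk full", "info job started", "error network timeout"]

# rdd.flatMap(lambda line: line.split())
words = [w for line in lines for w in line.split()]

# rdd.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
counts = Counter(words)

print(counts["error"])  # 2
```

In real PySpark the same shape runs distributed across partitions; the local version is only meant to show the transform pipeline the role description assumes familiarity with.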

Posted 2 days ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

As a Data Engineer, you are required to:
- Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire pipeline.
- Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
- Design the structure of databases and data storage systems, including schemas, tables, and relationships between datasets, to enable efficient querying.
- Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
- Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies; evaluate and implement new tools to improve data engineering processes.

Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.

Experience level: At least 3-5 years of hands-on experience in Data Engineering.

Desired Knowledge & Experience
- Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming; knowledge of Spark internals (Catalyst/Tungsten/Photon)
- Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
- IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
- Test: pytest, Great Expectations
- CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
- Big data design: Lakehouse/Medallion architecture, Parquet/Delta, partitioning, distribution, data skew, compaction
- Languages: Python/functional programming (FP); SQL: T-SQL/Spark SQL/HiveQL
- Storage: data lake and big data storage design

Additionally, it is helpful to know the basics of:
- Data pipelines: ADF/Synapse Pipelines/Oozie/Airflow
- Languages: Scala, Java
- NoSQL: Cosmos, Mongo, Cassandra
- Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
- SQL Server: T-SQL, stored procedures
- Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
- Data catalog: Azure Purview, Apache Atlas, Informatica

Required Soft Skills & Other Capabilities
- Great attention to detail and good analytical abilities
- Good planning and organizational skills
- Collaborative approach to sharing ideas and finding solutions
- Ability to work independently as well as in a global team environment
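The Lakehouse/Medallion design named above is just layered refinement: bronze keeps raw ingests, silver is typed and deduplicated, gold is aggregated for consumption. A minimal sketch in plain Python (no Spark or Delta, and the sensor records are invented) of that bronze-to-silver-to-gold flow:

```python
# Medallion-layer sketch: bronze keeps raw events, silver cleans and
# deduplicates, gold aggregates for consumption. All data is illustrative.
bronze = [
    {"id": 1, "temp": "21.5", "sensor": "A"},
    {"id": 1, "temp": "21.5", "sensor": "A"},  # duplicate ingest
    {"id": 2, "temp": "bad", "sensor": "B"},   # unparseable reading
    {"id": 3, "temp": "19.0", "sensor": "A"},
]

def parse(row):
    try:
        return {**row, "temp": float(row["temp"])}
    except ValueError:
        return None  # in a real pipeline this row would be quarantined

# silver: typed, deduplicated by id, invalid rows dropped
seen, silver = set(), []
for row in bronze:
    clean = parse(row)
    if clean and clean["id"] not in seen:
        seen.add(clean["id"])
        silver.append(clean)

# gold: per-sensor average temperature
gold = {}
for row in silver:
    gold.setdefault(row["sensor"], []).append(row["temp"])
gold = {sensor: sum(v) / len(v) for sensor, v in gold.items()}
print(gold)  # {'A': 20.25}
```

On Databricks each layer would be a Delta table and the loop a Spark job, but the layering logic is the same.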

Posted 2 days ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Noida

Work from Office

Source: Naukri

Join Us in Transforming Healthcare with the Power of Data & AI

At Innovaccer, we're on a mission to build the most advanced Healthcare Intelligence Platform ever created. Grounded in an AI-first design philosophy, our platform turns complex health data into real-time intelligence, empowering healthcare systems to make faster, smarter decisions.

We are building a unified, end-to-end data platform that spans Data Acquisition & Integration, Master Data Management, Data Classification & Governance, Advanced Analytics & AI Studio, App Marketplace, AI-as-BI capabilities, and more. All of this is powered by an agent-first approach, enabling customers to build solutions dynamically and at scale.

You'll have the opportunity to define and develop platform capabilities that help healthcare organizations tackle some of the industry's most pressing challenges, such as kidney disease management, clinical trials optimization for pharmaceutical companies, supply chain intelligence for pharmacies, and many more real-world applications.

We're looking for talented engineers and platform thinkers who thrive on solving large-scale, complex, and meaningful problems. If you're excited about working at the intersection of healthcare, AI, and cutting-edge platform engineering, we'd love to hear from you.

About the Role

We are looking for a Staff Engineer to design and develop highly scalable, low-latency data platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare.

A Day in the Life
- Architect, design, and build scalable data tools and frameworks.
- Collaborate with cross-functional teams to ensure data compliance, security, and usability.
- Lead initiatives around metadata management, data lineage, and data cataloging.
- Define and evangelize standards and best practices across data engineering teams.
- Own the end-to-end lifecycle of tooling, from prototyping to production deployment.
- Mentor and guide junior engineers and contribute to technical leadership across the organization.
- Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.

What You Need
- 8+ years of experience in software engineering, with strong experience building distributed systems.
- Proficiency in backend development (Python, Java, Scala, or Go) and familiarity with RESTful API design.
- Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc.
- Experience with open-source data governance frameworks such as Apache Atlas, Amundsen, or DataHub is a big plus.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Posted 2 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

The Applications Support Senior Analyst is responsible for providing technical and functional support for business-critical applications. This role involves troubleshooting application issues, managing incidents, collaborating with IT teams, and ensuring smooth system operations to minimize business disruptions.

Responsibilities:
- Provide Level 2/3 support for business applications, troubleshooting technical and functional issues.
- Monitor daily Autosys job executions and system performance, ensuring uptime and reliability.
- Investigate and diagnose production issues in a timely manner to provide root cause analysis and solutions that prevent future issues.
- Collaborate with cross-functional teams to identify and resolve production issues.
- Create and maintain documentation related to production issues and resolutions.
- Coordinate with development and release management teams to ensure smooth deployment of sprint releases with minimal business interruption.
- Coordinate with development, release management, and business teams during the UAT testing phase of the sprint.
- Coordinate with SA and EAP teams on system updates, green zones, and COB activities.
- Monitor and manage inbound and outbound feed SLA breaches.
- Provide timely updates on system performance and issue status to senior management.
- Participate in the on-call rotation and respond to production issues after business hours.
- Work with the production team and engineers to ensure product quality, reliability, and performance meet specifications.

Qualifications:
- 10+ years of experience in an L2 & L3 Application Support role.
- Experience installing, configuring, or supporting business applications.
- Experience with some programming languages and willingness/ability to learn.
- Advanced execution capabilities and ability to adjust quickly to changes and re-prioritization.
- Effective written and verbal communication, including the ability to explain technical issues in simple terms that non-IT staff can understand.
- Demonstrated analytical skills; issue tracking and reporting using tools; knowledge of and experience with problem-management tools.
- Good all-round technical skills.
- Effectively share information with other support team members and with other technology teams.
- Ability to plan and organize workload.
- Consistently demonstrates clear and concise written and verbal communication skills; communicates appropriately to relevant stakeholders.
- Experience developing and optimizing Unix shell scripts for automation and issue resolution.
- Proficient in creating and debugging complex SQL queries for data analysis and troubleshooting.
- Experience with big data technologies, including Hadoop, Hive, Impala, and Spark.
- Experience managing job orchestration using tools like Autosys and other schedulers.

Education:
- Bachelor's/University degree or equivalent experience
- Master's degree preferred

Job Family Group: Technology
Job Family: Applications Support
Time Type: Full time

Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
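Support work like this often begins by triaging scheduler output. Below is a small sketch assuming a hypothetical plain-text job report (real Autosys `autorep` output is formatted differently and would need its own parser), flagging jobs in a failed state:

```python
# Triage sketch for a scheduler report. The report format and job names are
# hypothetical; real Autosys output differs and would need its own parser.
report = """\
nightly_load   SU   02:14
risk_calc      FA   00:03
feed_outbound  RU   -
"""

failed = []
for line in report.splitlines():
    job, status, runtime = line.split()
    if status == "FA":  # failed jobs get escalated first
        failed.append((job, runtime))

print(failed)  # [('risk_calc', '00:03')]
```

In practice a script like this would feed an alert or ticketing tool rather than print to stdout.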

Posted 2 days ago

Apply

7.0 - 12.0 years

14 - 18 Lacs

Noida

Work from Office

Source: Naukri

Who We Are

Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community, and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.

What you need
* BS in an Engineering or Science discipline, or equivalent experience
* 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years of experience in a data-focused role
* Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL)
* Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lakes/warehouses in production environments
* Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW
* Experience working in larger initiatives building and rationalizing large-scale data environments with a wide variety of data pipelines, possibly with internal and external partner integrations, would be a plus
* Willingness to experiment and learn new approaches and technology applications
* Knowledge of and experience with various relational databases, and demonstrable proficiency in SQL and supporting analytics uses and users
* Knowledge of software engineering and agile development best practices
* Excellent written and verbal communication skills

The Brightly culture

We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.

Posted 2 days ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Source: Naukri

Overview

The Gen AI Engineer will be part of a team which designs, builds, and operates the AI services of Siemens Healthineers (SHS). The ideal candidate for this role will have experience with AI services. The role requires the candidate to develop and maintain artificial intelligence systems and applications that help businesses and organizations solve complex problems, along with expertise in machine learning, deep learning, natural language processing, computer vision, and other AI technologies.

Tasks and Responsibilities

The Generative AI Engineer is responsible for designing, architecting, and developing an AI product/service. The main responsibilities include:
- Designing, developing, and deploying Azure-based AI solutions, including machine learning models, cognitive services, and data analytics solutions.
- Collaborating with cross-functional teams, such as data scientists, business analysts, and developers, to design and implement AI solutions that meet business requirements.
- Building and training machine learning models using Azure Machine Learning, and tuning models for optimal performance.
- Developing and deploying custom AI models using Azure Cognitive Services, such as speech recognition, language understanding, and computer vision.
- Creating data pipelines to collect, process, and prepare data for analysis and modeling using Azure data services, such as Azure Data Factory and Azure Databricks.
- Implementing data analytics solutions using Azure Synapse Analytics or other Azure data services.
- Deploying and managing Azure services and resources using Azure DevOps or other deployment tools.
- Monitoring and troubleshooting deployed solutions to ensure optimal performance and reliability.
- Ensuring compliance with security and regulatory requirements related to AI solutions.
- Staying up to date with the latest Azure AI technologies and industry developments, and sharing knowledge and best practices with the team.

Qualifications
- Overall 5+ years of combined experience in IT, with the most recent 3 years as an AI engineer.
- Bachelor's or master's degree in computer science, information technology, or a related field.
- Experience in designing, developing, and delivering successful AI services.
- Experience with cloud computing technologies, such as Azure.
- Relevant industry certifications, such as Microsoft Certified: Azure AI Engineer or Azure Solutions Architect, are a plus.
- Excellent written and verbal communication skills to collaborate with cross-functional teams and communicate technical information to non-technical stakeholders.

Technical skills
- Programming languages: proficiency in programming languages such as Python and R.
- Azure AI services: experience with Azure AI services such as Azure Machine Learning, Azure Cognitive Services, and Azure Databricks.
- Data handling and processing: proficiency in techniques such as data cleaning, data normalization, and feature extraction. Knowledge of SQL, NoSQL, and big data technologies such as Hadoop and Spark is also beneficial.
- Cloud platform: a good understanding of cloud computing concepts and experience working with Azure services such as containers, Kubernetes, Web Apps, Azure Front Door, CDN, web application firewalls, etc.
- DevOps and CI/CD: familiarity with DevOps practices and CI/CD pipelines, including tools such as Azure DevOps and Git.
- Security and compliance: awareness of security and compliance considerations when building and deploying AI models in the cloud, including Azure security services, compliance frameworks such as HIPAA and GDPR, and best practices for securing data and applications in the cloud.
- Machine learning algorithms and frameworks: knowledge of frameworks such as TensorFlow, Keras, PyTorch, and Scikit-learn is a plus.
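Of the data-handling techniques listed (cleaning, normalization, feature extraction), normalization is the easiest to show in a few lines. A minimal min-max scaling sketch using only the standard library, with illustrative values:

```python
# Min-max normalization of one feature column; the readings are invented.
readings = [12.0, 18.0, 30.0]

lo, hi = min(readings), max(readings)
scaled = [(x - lo) / (hi - lo) for x in readings]

# The minimum maps to 0.0 and the maximum to 1.0.
print(scaled)
```

Libraries like scikit-learn wrap the same arithmetic (e.g., a min-max scaler) with handling for multiple columns and unseen data.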

Posted 2 days ago

Apply

4.0 - 5.0 years

10 - 15 Lacs

Pune

Work from Office

Source: Naukri

Hello Visionary!

We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make an outstanding addition to our vibrant team.

Siemens Mobility is an independently run company of Siemens AG. Its core business includes rail vehicles, rail automation and electrification solutions, turnkey systems, intelligent road traffic technology, and related services. In Mobility, we help our customers meet the need for hard-working mobility solutions. We're making the lives of people who travel easier and more enjoyable while constantly developing new, intelligent mobility solutions!

We are looking for: Embedded Linux Engineer - Train IT

You'll make a difference by:
You will be part of the engineering team for new and exciting software applications in our trains. Your mission will be to customize the Linux image of our Train IT platform for a specific train and integrate applications such as the train server, train-to-ground communication, passenger information, passenger counting, or CCTV. This role requires a wide range of technical skills and a desire to find out how things work and why.
- Be a member of the international engineering team
- Configure and customize the Debian Linux image for deployment to the train
- Customize applications and configure devices such as network switches and special devices according to the system architecture of the train
- Integrate these applications and devices with other systems in the train
- Cooperate with the software test team
- Provide technical support in your area of expertise

Desired skills:
- Minimum 4-5 years of experience in software development
- Experience with Linux as a power user or administrator
- Experience with configuration of managed switches
- Good knowledge of TCP/IP
- Understanding of network protocols like DHCP, RADIUS, DNS, multicast, SSL/TLS
- Experience with issue tracking tools such as JIRA or Redmine
- Highly organized and self-motivated
- Hands-on, problem-solving mentality
- Experience in the railway industry
- Long-term interest in the IT domain, passion for IT
- German language
- Python programming
- Fluent English

Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Pune. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting the shape of things to come. Find out more about Mobility at https://new.siemens.com/global/en/products/mobility.html and about Siemens careers at
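Since the role combines TCP/IP knowledge with Python, a small sketch using the standard library's ipaddress module to reason about subnet membership; the train network address plan shown is invented:

```python
import ipaddress

# Subnet reasoning with the stdlib; the address plan is illustrative only.
consist_net = ipaddress.ip_network("10.128.0.0/22")
cctv_cam = ipaddress.ip_address("10.128.3.17")

print(consist_net.num_addresses)   # 1024
print(cctv_cam in consist_net)     # True
```

Checks like this are handy when configuring managed switches and VLANs, before touching any real device.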

Posted 2 days ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Developer

Project Role Description: Design, build, and configure applications to meet business process and application requirements.

Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: proficiency in Data Warehouse ETL Testing.
- Strong understanding of data integration processes and methodologies.
- Experience with various ETL tools and frameworks.
- Ability to design and execute test cases for data validation.
- Familiarity with database management systems and SQL.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Developer

Project Role Description: Design, build, and configure applications to meet business process and application requirements.

Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will engage in problem-solving discussions, contribute innovative ideas, and refine application functionalities to enhance user experience and operational efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with industry trends and technologies.

Professional & Technical Skills:
- Must-have: proficiency in Data Warehouse ETL Testing.
- Good to have: experience with data integration tools and methodologies.
- Strong understanding of data quality and validation processes.
- Familiarity with database management systems and SQL.
- Experience in performance testing and optimization of ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Educational Qualification: Bachelor of Engineering
Service Line: Information Systems

Responsibilities:
- Design and implement real-time data streaming pipelines, using Apache Kafka expertise to build event-driven architectures and stream processing.
- Experience in a cloud platform such as Azure.

Preferred Skills:
- Technology - Java - Apache Kafka
- Technology - Java - Core Java
- Technology - Big Data - Data Processing - Spark - Apache Flink
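The event-driven pattern described reduces to a consume-filter-aggregate loop. Since a real Kafka client needs a running broker, this sketch stands in a stdlib queue for the consumer; the topic contents and event shapes are invented:

```python
import json
import queue

# Event-driven processing loop, with a stdlib queue standing in for a
# Kafka consumer. Event shapes and values are illustrative only.
topic = queue.Queue()
for event in [
    {"type": "order", "amount": 40},
    {"type": "ping"},
    {"type": "order", "amount": 60},
]:
    topic.put(json.dumps(event))  # producers serialize events onto the topic

revenue = 0
while not topic.empty():
    event = json.loads(topic.get())
    if event["type"] == "order":   # filter the event stream
        revenue += event["amount"] # stateful aggregation

print(revenue)  # 100
```

With Kafka (or Flink, also named above) the loop becomes a long-running consumer with offsets, partitions, and fault-tolerant state, but the processing shape is the same.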

Posted 2 days ago

Apply

9.0 - 11.0 years

13 - 17 Lacs

Pune

Work from Office

Source: Naukri

Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Design, develop, and maintain scalable data pipelines on Databricks using PySpark
- Collaborate with data analysts and scientists to understand data requirements and deliver solutions
- Optimize and troubleshoot existing data pipelines for performance and reliability
- Ensure data quality and integrity across various data sources
- Implement data security and compliance best practices
- Monitor data pipeline performance and conduct necessary maintenance and updates
- Document data pipeline processes and technical specifications

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India: Bangalore, Pune, Hyderabad, Mysore, Kolkata, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional Requirements:
- 9+ years of experience in data engineering
- Proficiency with Databricks and Apache Spark
- Strong SQL skills and experience with relational databases
- Experience with big data technologies (e.g., Hadoop, Kafka)
- Knowledge of data warehousing concepts and ETL processes
- Experience with CI/CD tools, particularly Jenkins
- Excellent problem-solving and analytical skills
- Solid understanding of big data fundamentals and experience with Apache Spark
- Familiarity with cloud platforms (e.g., AWS, Azure)
- Experience with version control systems (e.g., Bitbucket)
- Understanding of DevOps principles and tools (e.g., CI/CD, Jenkins)
- Databricks certification is a plus

Preferred Skills: Technology-Big Data-Big Data - ALL; Technology-Cloud Integration-Azure Data Factory (ADF); Technology-Cloud Platform-AWS Data Analytics-AWS Data Exchange
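The "ensure data quality and integrity" responsibility above usually means a validation gate between extract and load. A minimal, hypothetical sketch in plain Python, using a list of dicts as a stand-in for a PySpark DataFrame (function and field names here are illustrative, not from any real project):

```python
# Illustrative data-quality gate for a pipeline step: split incoming records
# into valid rows (loaded downstream) and rejected rows (quarantined).
# Pure-Python stand-in for what a PySpark filter would do at scale.

def validate_records(records, required_fields):
    """Return (valid, rejected) based on required fields being non-null."""
    valid, rejected = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            valid.append(rec)
        else:
            rejected.append(rec)
    return valid, rejected

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},    # fails the null check
    {"id": None, "amount": 3.0},  # fails the null check
]
good, bad = validate_records(rows, ["id", "amount"])
print(len(good), len(bad))  # 1 2
```

Keeping rejects instead of silently dropping them is what makes the pipeline auditable: the quarantined rows can be reported back to the data owners.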

Posted 2 days ago

2.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Databricks Unified Data Analytics Platform
Good to Have Skills: NA
Minimum Experience: 2 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Develop high-quality software design and architecture.
- Identify, prioritize, and execute tasks in the software development life cycle.
- Conduct software analysis, programming, testing, and debugging.
- Create technical documentation for reference and reporting.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data analytics platforms.
- Experience with big data technologies such as Apache Spark and Hadoop.
- Knowledge of programming languages like Python, Scala, or SQL.
- Hands-on experience in developing and deploying data pipelines.
- Familiarity with data modeling and database design principles.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Snowflake Data Warehouse
Good to Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated on industry trends and best practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and Data Vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Snowflake Data Warehouse
Good to Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and Data Vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize data warehouse performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Posted 2 days ago

7.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Technical and Professional: Technology-Big Data - Data Processing-Spark (Scala developer)
Preferred Skills: Technology-Functional Programming-Scala

Posted 2 days ago

5.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Educational Qualification: Bachelor of Engineering, BCom, BSc, MSc, MCA
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Technical and Professional: Primary skills: Technology-Big Data - Data Processing-Spark
Preferred Skills: Technology-Big Data - Data Processing-Spark-SparkSQL
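SparkSQL, the preferred skill above, exposes ordinary declarative SQL over distributed data. The query pattern itself can be shown without a Spark cluster; a hedged sketch using the stdlib sqlite3 module as a stand-in engine (the table and columns are invented for illustration):

```python
import sqlite3

# The same GROUP BY aggregation one would submit via spark.sql(...),
# executed here on an in-memory SQLite database instead of a cluster.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 2.0)],
)
rows = conn.execute(
    "SELECT user, SUM(amount) AS total FROM events "
    "GROUP BY user ORDER BY user"
).fetchall()
print(rows)  # [('a', 15.0), ('b', 2.0)]
```

The point of SparkSQL is that this exact query text scales out unchanged: the engine, not the SQL, is what differs between a laptop test and a production cluster.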

Posted 2 days ago

3.0 - 5.0 years

5 - 8 Lacs

Pune

Work from Office


Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs to solution design based on your areas of expertise. You will plan the configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India: Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional: As a Snowflake Data Vault developer, the individual is responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. The candidate should have at least one end-to-end Data Vault implementation. Detailed skill requirements:
- Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models
- Suggesting optimization techniques for existing Data Vault models using ghost entries, bridge and PIT tables, reference tables, satellite splits/merges, identification of the correct business key, etc.
- Designing and administering repeating design patterns for quick turnaround
- Engaging and collaborating with customers effectively to understand the Data Vault use cases, and briefing the technical team with technical specifications
- Working knowledge of Snowflake is desirable
- Working knowledge of DBT is desirable

Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake
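The Data Vault 2.0 constructs named above (hubs, satellites, PIT and bridge tables) all hang off deterministic hash keys derived from business keys. A minimal sketch of that derivation; hashlib is real, but the normalization rules and delimiter shown are one common convention, not a mandated standard:

```python
import hashlib

def hub_hash_key(*business_keys, delimiter="||"):
    """Data Vault 2.0-style hash key: trim and upper-case each business key,
    join with a delimiter, and hash. MD5 shown as the commonly used choice."""
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same business key always yields the same hub key, which is what makes
# loads idempotent and lets hubs, links, and satellites join without lookups.
print(hub_hash_key("cust-001"))
print(hub_hash_key("  cust-001 "))  # identical after normalization
```

In Snowflake the equivalent is typically computed in SQL (e.g. an MD5 over concatenated, trimmed business keys) so every loading pattern produces the same key.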

Posted 2 days ago

3.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job ID: Pyt-ETP-Pun-1075
Location: Pune, Bangalore, Other

Company Overview
Bridgenext is a global consulting company that provides technology-empowered business solutions for world-class organizations. Our global workforce of over 800 consultants provides best-in-class services to our clients to realize their digital transformation journey. Our clients span the emerging, mid-market, and enterprise space. With multiple offices worldwide, we are uniquely positioned to deliver digital solutions to our clients, leveraging Microsoft, Java, and open source with a focus on Mobility, Cloud, Data Engineering, and Intelligent Automation. Emtec's singular mission is to create "Clients for Life" – long-term relationships that deliver rapid, meaningful, and lasting business value. At Bridgenext, we have a unique blend of corporate and entrepreneurial cultures. This is where you would have an opportunity to drive business value for clients while you innovate, continue to grow, and have fun while doing it. You would work with team members who are vibrant, smart, and passionate, and they bring their passion to all that they do, whether it's learning, giving back to our communities, or always going the extra mile for our clients.

Position Description
We are looking for members with hands-on data engineering experience who will work on internal and customer-based projects for Bridgenext. We are looking for someone who cares about the quality of code, who is passionate about providing the best solution to meet client needs, and who anticipates their future needs based on an understanding of the market; someone who has worked on Hadoop projects, including processing and data representation using various AWS services.

Must Have Skills
- 3-4 years of overall experience
- Strong programming experience with Python
- Experience with unit testing, debugging, and performance tuning
- Experience with Docker, Kubernetes, and cloud platforms (AWS preferred)
- Experience with CI/CD pipelines and DevOps best practices
- Familiarity with workflow management tools like Airflow
- Experience with DBT is a plus
- Good to have: experience with infrastructure-as-code technologies such as Terraform and Ansible
- Good to have: experience in Snowflake modelling (roles, schemas, databases)

Professional Skills
- Solid written, verbal, and presentation communication skills
- Strong team and individual player
- Maintains composure in all types of situations and is collaborative by nature
- High standards of professionalism, consistently producing high-quality results
- Self-sufficient and independent, requiring very little supervision or intervention
- Demonstrates flexibility and openness to bring creative solutions to address issues

Posted 2 days ago

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Apache Spark
Good to Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address business needs and enhance user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Conduct code reviews and ensure coding standards are met.
- Implement best practices for application design and development.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Spark.
- Strong understanding of distributed computing principles.
- Experience in developing scalable applications.
- Knowledge of data processing frameworks like Hadoop.
- Hands-on experience with real-time data processing.
- Good to Have Skills: Experience with the Scala programming language.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
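The "distributed computing principles" this role asks for reduce, at their core, to map-and-merge over partitioned data. A pure-Python word-count sketch of the pattern that Spark parallelizes across executors (stdlib only, no Spark dependency; the partitioning here is simulated):

```python
from collections import Counter
from functools import reduce

def map_partition(lines):
    """Map step: count words within one partition, independently."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def merge(a, b):
    """Reduce step: merge two partial counts into one."""
    a.update(b)
    return a

# Two "partitions" that a cluster would process on separate workers.
partitions = [["spark spark hadoop"], ["hadoop etl", "spark"]]
total = reduce(merge, (map_partition(p) for p in partitions), Counter())
print(total["spark"])  # 3
```

Because each partition is counted independently and the merge is associative, the same logic scales from this loop to a cluster; that independence is exactly what Spark's RDD and DataFrame APIs exploit.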

Posted 2 days ago

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Databricks Unified Data Analytics Platform
Good to Have Skills: NA
Minimum Experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

15.0 - 20.0 years

5 - 9 Lacs

Gurugram

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Informatica PowerCenter, Oracle Procedural Language Extensions to SQL (PL/SQL)
Good to Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Informatica PowerCenter and Oracle Procedural Language Extensions to SQL (PL/SQL).
- Strong understanding of ETL processes and data integration.
- Experience in developing complex data mappings and transformations.
- Knowledge of data warehousing concepts and best practices.
- Hands-on experience in performance tuning and optimization of ETL processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Drive innovation and continuous improvement in application development.

Additional Information:
- The candidate should have a minimum of 5 years of experience.
- This position is based at our Gurugram office; visiting the client office twice a week is a must.
- A 15 years full-time education is required.

Posted 2 days ago