
9546 Kafka Jobs - Page 38

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

What you’ll do
- With moderate supervision, manage project progress, metadata collection, development, and management.
- Investigate internal/external stakeholder queries with high-level direction from the Team Leader.
- Analyze problems, identify root causes, formulate findings and observations, suggest resolutions, and communicate them to internal/external stakeholders with moderate guidance from the Team Leader.
- Maintain current knowledge of industry regulatory requirements such as reporting mandates, compliance requirements, and regulatory frameworks and structures; support internal/external queries on data standards.
- Enter and maintain information in the documentation repository.
- Follow established security protocols; identify and report potential vulnerabilities.
- Perform intermediate-level data quality checks, following established procedures.

What experience you need
- BS degree in a STEM major or equivalent discipline; Master’s degree strongly preferred
- 2+ years of experience as a data engineer or in a related role
- Cloud certification strongly preferred
- Intermediate skills in programming languages such as Python and SQL (BigQuery) or scripting languages
- Basic understanding of and experience with Google Cloud Platform, and an overall understanding of cloud computing concepts
- Experience building and maintaining simple data pipelines: following guidelines, transforming data, and loading it into a pipeline so the content is digestible and usable for future projects
- Experience supporting the design and implementation of basic data models
- Proficient Git usage and contributions to team repositories

What could set you apart
- Master’s degree
- Experience with GCP (cloud certification strongly preferred)
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with data pipeline and workflow management tools: Airflow, GCP Dataflow, etc.
- Experience with AI or machine learning
- Experience with data visualisation tools such as Tableau or Looker
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
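The data-quality duties this listing describes (completeness and consistency checks run inside a pipeline) can be sketched as a small Python routine. This is a minimal illustration only; the field names and rules are hypothetical, not from the listing.

```python
def run_quality_checks(rows, required_fields):
    """Return a list of (row_index, problem) pairs for basic data-quality issues.

    Two common intermediate-level checks are shown: completeness
    (required fields present and non-empty) and uniqueness of ids.
    """
    problems = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                problems.append((i, f"missing {field}"))
        # Uniqueness: record ids must not repeat across the batch.
        rid = row.get("id")
        if rid in seen_ids:
            problems.append((i, "duplicate id"))
        seen_ids.add(rid)
    return problems

rows = [
    {"id": 1, "name": "alice"},
    {"id": 1, "name": ""},   # duplicate id and empty name
]
print(run_quality_checks(rows, ["id", "name"]))
# [(1, 'missing name'), (1, 'duplicate id')]
```

In a real pipeline these checks would typically run as a validation stage before load, with failures routed to a quarantine table rather than just reported.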

Posted 4 days ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

About This Role

Job Overview
Are you interested in building innovative technology that shapes the financial markets? Do you like working at the speed of a startup and solving some of the world’s most exciting challenges? Do you want to work with, and learn from, hands-on leaders in technology and finance?

At BlackRock, we are looking for Software Engineers who like to innovate and solve sophisticated problems. We recognize that strength comes from diversity, and will embrace your outstanding skills, curiosity, and passion while giving you the opportunity to grow technically and as an individual. We invest and protect over $9 trillion (USD) of assets and have an extraordinary responsibility to our clients all over the world. Our technology empowers millions of investors to save for retirement, pay for college, buy a home, and improve their financial well-being. Being a technologist at BlackRock means you get the best of both worlds: working for one of the most sophisticated financial companies and being part of a software development team responsible for next-generation technology and solutions.

What are Aladdin and Aladdin Engineering?
You will be working on BlackRock's investment operating system, Aladdin. Aladdin is used both internally within BlackRock and externally by many financial institutions. It combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform to power informed decision-making and create a connective tissue for thousands of users investing worldwide. Our development teams reside inside the Aladdin Engineering group. We collaboratively build the next generation of technology that changes the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and support millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users every day worldwide!

As a member of Aladdin Engineering, you will be:
- Tenacious: work in a fast-paced and highly complex environment.
- A creative thinker: analyse multiple solutions and deploy technologies in a flexible way.
- A great teammate: think and work collaboratively and communicate effectively.
- A fast learner: pick up new concepts and apply them quickly.

Responsibilities include:
- Collaborate with team members in a multi-office, multi-country environment.
- Deliver high-efficiency, high-availability, concurrent and fault-tolerant software systems.
- Contribute significantly to the development of Aladdin’s global, multi-asset trading platform.
- Work with product management and business users to define the roadmap for the product.
- Design and develop innovative solutions to complex problems, identifying issues and roadblocks.
- Apply proven quality software engineering practices through all phases of development.
- Ensure resilience and stability through quality code reviews; unit, regression and user acceptance testing; DevOps; and level-two production support.
- Be a leader with vision and a partner in brainstorming solutions for team productivity and efficiency, guiding and motivating others.
- Drive a strong culture by bringing principles of inclusion and diversity to the team, and set the tone through specific recruiting and management actions and employee engagement.
- Lead individual projects’ priorities, deadlines and deliverables using Agile methodologies.

Qualifications
- B.E./B.Tech./MCA or another relevant engineering degree from a reputed university.
- 4+ years of proven experience.

Skills and Experience
- A proven foundation in core Java and related technologies, with OO skills and design patterns.
- Track record of building high-quality software with design-focused and test-driven approaches.
- Good hands-on object-oriented programming knowledge in Java.
- Strong knowledge of the open-source technology stack (Spring, Hibernate, Maven, JUnit, etc.).
- Experience with relational and/or NoSQL databases (e.g., Apache Cassandra).
- Great analytical, problem-solving and communication skills.
- Some experience or a real interest in finance and investment processes, and/or an ability to translate business problems into technical solutions.
- Experience leading development teams or projects, or being responsible for the design and technical quality of a significant application, system, or component.
- Ability to form positive relationships with partnering teams, sponsors, and user groups.
- Experience building microservices and APIs, ideally with REST, Kafka or gRPC.
- Experience with high-scale distributed technology such as Kafka, Mongo, Ignite, Redis.
- Experience with DevOps and tools like Azure DevOps.

Nice to have, and opportunities to learn:
- Experience working in an agile development team or on open-source development projects.
- Experience with optimization, algorithms or related quantitative processes.
- Experience with cloud platforms like Microsoft Azure, AWS, Google Cloud.
- Experience with AI-related projects/products, or experience working in an AI research environment.
- A degree, certifications or open-source track record that shows mastery of software engineering principles.

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment: the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other attributes protected at law.

Posted 4 days ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Chennai, Bengaluru

Hybrid

JAVA FULLSTACK
Experience: 4 to 9 yrs
Location: Chennai & Bangalore
- Java 17, Spring Boot 3, Kafka, REST services on the AWS cloud platform, microservices, SQL
- Working knowledge of Angular 14+, RxJS, HTML5, CSS; Angular experience is mandatory
- Thorough understanding of and hands-on experience with the above technologies, with a commitment to quality and high standards
- Excellent oral and written communication skills

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Spark Java Developer at Barclays, helping us build, maintain and support all first-line-of-controls applications. The successful candidate will be accountable for the technical operation of this control during Asia hours. The role requires a high degree of communication with the global leads in the US and India.

To be successful as a Spark Java Developer, you should have experience with:
- Capturing functional requirements by talking to the business team and leads in the US and India.
- Converting functional requirements into design and code.
- Efficiently writing regression and unit test cases for the developed functionality.
- Coordinating with the L1 support team and proactively getting involved in production support activities if required.
- Contributing to the maturity of DevOps on the application.
- Providing timely status updates to relevant stakeholders.
- A graduate degree in Computer Science and computer application skills.
- Proficiency in technologies such as Java 17, Spark, Spring Boot, microservices and SQL.
- It would be a great advantage to know Kafka, Apache Ignite, the Cucumber framework, or React.
- Awareness of Agile Scrum/Kanban methodology.

Some other highly valued skills may include:
- Partnering very closely with the business operations team.
- Working closely with the global team to deliver on agreed outcomes.
- Experience in GitLab, Autosys and DevOps.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation’s technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President expectations
- Advise and influence decision-making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions.
- Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L: Listen and be authentic; E: Energise and inspire; A: Align across the enterprise; D: Develop others.
- For an individual contributor: lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete them; identify new directions for assignments and/or projects, combining cross-functional methodologies or practices to meet required outcomes.
- Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues.
- Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership of managing risk and strengthening controls in relation to the work done.
- Perform work that is closely related to that of other areas, which requires an understanding of how areas coordinate and contribute to the objectives of the organisation sub-function. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and strategy.
- Engage in complex analysis of data from multiple internal and external sources (procedures and practices in other areas, teams, companies, etc.) to solve problems creatively and effectively.
- Communicate complex information. 'Complex' information could include sensitive information, or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset (Empower, Challenge and Drive), the operating manual for how we behave.

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Full Stack Developer at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Full Stack Developer, you should have experience with the following technology skills.

Back end
- Recent Java with exposure to Java 8+; Spring/Spring Boot, Spring Data, JPA/Hibernate, Spring MVC
- Java messaging: Spring Cloud Stream with Solace/Kafka/ActiveMQ or similar
- SQL: MS SQL Server, Postgres, or another RDBMS; MongoDB
- JUnit, Mockito, JMeter and other testing tools/frameworks
- Kubernetes / Docker / OpenShift / AWS
- Git, TeamCity/Jenkins, Sonar, Maven/Gradle, Chef
- Knowledge of microservices and message-driven architectures

Web
- ReactJS/Redux; Grunt/Gulp/Webpack
- HTML5/CSS/JavaScript, Google Chrome DevTools
- TypeScript, jQuery, RequireJS
- Karma/Jasmine/Mocha, JSHint, Node/NPM, LESS/SASS
- Material Design/Bootstrap or a similar web UI library

Preferred qualifications
- Experience with DevOps and test automation using Selenium is preferred.
- Finance experience preferred; loan origination/syndication experience strongly preferred.
- Experience on a team using Agile project management strongly preferred.
- Good experience with automated unit testing and TDD strongly preferred.
- Demonstrated problem-solving skills and excellent communication.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation’s technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Analyst expectations
- Execute work requirements as identified in processes and procedures, collaborating with and impacting the work of closely related teams.
- Check the work of colleagues within the team to meet internal and stakeholder requirements.
- Provide specialist advice and support pertaining to your own work area.
- Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams.
- Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative/operational expertise.
- Make judgements based on practice and previous experience. Assess the validity and applicability of previous or similar experiences, and evaluate options under circumstances that are not covered by procedures.
- Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements. Build relationships with stakeholders/customers to identify and address their needs.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset (Empower, Challenge and Drive), the operating manual for how we behave.

Posted 5 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Design and implement search solutions using Elasticsearch, focusing on optimizing search performance, indexing, and scaling to handle large datasets. Ensure fast, relevant search results for various applications.
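Search design work of this kind largely comes down to two artifacts: an index mapping that controls how fields are analyzed and sharded, and query bodies in the Elasticsearch query DSL. A minimal sketch follows, built as plain Python dicts (the shapes are standard Elasticsearch 7+ DSL; the index fields and the product-search use case are hypothetical, not from the listing):

```python
# Hypothetical mapping: full-text search on "title", exact filtering on
# "category" (keyword fields are not analyzed), numeric "price".
index_mapping = {
    "settings": {"number_of_shards": 3, "number_of_replicas": 1},
    "mappings": {
        "properties": {
            "title":    {"type": "text", "analyzer": "standard"},
            "category": {"type": "keyword"},
            "price":    {"type": "float"},
        }
    },
}

def build_search(text, category=None, size=10):
    """Build a bool query: scored full-text match, plus an optional
    non-scoring (and cacheable) term filter on the category."""
    query = {"bool": {"must": [{"match": {"title": text}}]}}
    if category:
        query["bool"]["filter"] = [{"term": {"category": category}}]
    return {"query": query, "size": size}

body = build_search("wireless headphones", category="audio")
```

With the official `elasticsearch` client these dicts would be passed to `indices.create` and `search` respectively; putting exact-match conditions in `filter` rather than `must` is a common performance choice, since filters skip scoring and are cached.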

Posted 5 days ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai

Work from Office

Design and implement data architecture and models for Big Data solutions using MapR and Hadoop ecosystems. You will optimize data storage, ensure data scalability, and manage complex data workflows. Expertise in Big Data, Hadoop, and MapR architecture is required for this position.

Posted 5 days ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Kolkata

Work from Office

Design, build, and maintain data pipelines on Google Cloud Platform, using tools like BigQuery and Dataflow. Focus on optimizing data storage, processing, and analytics to support business intelligence initiatives.
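The core of such a pipeline is a parse-then-aggregate transform. Below is a toy, dependency-free Python sketch of that stage; in production this logic would typically live in a Dataflow (Apache Beam) job writing its results to BigQuery. The CSV layout and field names are illustrative assumptions only.

```python
def parse(line):
    # Hypothetical input format: "user,amount" CSV lines from a raw source.
    user, amount = line.split(",")
    return {"user": user, "amount": float(amount)}

def total_per_user(lines):
    """Aggregate spend per user, the kind of rollup a GROUP BY in
    BigQuery or a CombinePerKey in Dataflow would perform at scale."""
    totals = {}
    for rec in map(parse, lines):
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

print(total_per_user(["a,1.5", "b,2.0", "a,0.5"]))
# {'a': 2.0, 'b': 2.0}
```

The design point the listing hints at ("optimizing data storage, processing, and analytics") is usually about where this aggregation runs: pushing it into BigQuery SQL or a Dataflow combiner avoids moving raw rows around.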

Posted 5 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Chennai

Work from Office

Design and implement Big Data solutions using the Hadoop and MapR ecosystems. You will work with data processing frameworks like Hive, Pig, and MapReduce to manage and analyze large data sets. Expertise in Hadoop and MapR is required.

Posted 5 days ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

Design, implement, and optimize Big Data solutions using Hadoop technologies. You will work on data ingestion, processing, and storage, ensuring efficient data pipelines. Strong expertise in Hadoop, HDFS, and MapReduce is essential for this role.
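The MapReduce model this role centers on can be illustrated with the classic word-count example, written here in dependency-free Python so the three phases (map, shuffle, reduce) are explicit. This is a conceptual sketch of the programming model, not Hadoop API code:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Mapper: emit a (word, 1) pair for every word in the input split.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    # Shuffle/sort: the framework groups all values by key across mappers.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reducer: combine each key's values; here, sum the counts.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data big pipelines", "big data"]
mapped = chain.from_iterable(map_phase(d) for d in docs)
result = reduce_phase(shuffle(mapped))
print(result)  # {'big': 3, 'data': 2, 'pipelines': 1}
```

In Hadoop proper, the mappers and reducers run as distributed tasks over HDFS blocks and the shuffle happens over the network; the per-function logic, however, is exactly this shape.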

Posted 5 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Design, develop, and implement Java-based applications. Work on complex software systems, ensuring high performance, scalability, and security. Mentor junior developers and lead project initiatives.

Posted 5 days ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai

Work from Office

Design and implement big data solutions using Hadoop ecosystem tools like MapR. Develop data models, optimize data storage, and ensure seamless integration of big data technologies into enterprise systems.

Posted 5 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What you’ll do
- With moderate supervision, manage project progress, metadata collection, development, and management.
- Investigate internal/external stakeholder queries with high-level direction from the Team Leader.
- Analyze problems, identify root causes, formulate findings and observations, suggest resolutions, and communicate them to internal/external stakeholders with moderate guidance from the Team Leader.
- Maintain current knowledge of industry regulatory requirements such as reporting mandates, compliance requirements, and regulatory frameworks and structures; support internal/external queries on data standards.
- Enter and maintain information in the documentation repository.
- Follow established security protocols; identify and report potential vulnerabilities.
- Perform intermediate-level data quality checks, following established procedures.

What experience you need
- BS degree in a STEM major or equivalent discipline; Master’s degree strongly preferred
- 2+ years of experience as a data engineer or in a related role
- Cloud certification strongly preferred
- Intermediate skills in programming languages such as Python and SQL (BigQuery) or scripting languages
- Basic understanding of and experience with Google Cloud Platform, and an overall understanding of cloud computing concepts
- Experience building and maintaining simple data pipelines: following guidelines, transforming data, and loading it into a pipeline so the content is digestible and usable for future projects
- Experience supporting the design and implementation of basic data models
- Proficient Git usage and contributions to team repositories

What could set you apart
- Master’s degree
- Experience with GCP (cloud certification strongly preferred)
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with data pipeline and workflow management tools: Airflow, GCP Dataflow, etc.
- Experience with AI or machine learning
- Experience with data visualisation tools such as Tableau or Looker
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 5 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

NPS Prism
Title: Data Engineer
Location: India (Hybrid)
Experience: 3-6 years
Employment Type: Full-time

Company Profile:
NPS Prism is a market-leading, cloud-based CX benchmarking and operational improvement platform owned by Bain & Company. It provides customers with actionable insights and analysis that guide the creation of game-changing customer experiences. Based on rock-solid sampling, research, and analytic methodology, it lets customers see how they compare to their competitors on overall NPS®, and on every step of the customer journey. With NPS Prism you can see where you’re strong, where you lag, and how customers feel about doing business with you and your competitors, in their own words. The result: prioritize the customer interactions that matter most. NPS Prism customers use our customer experience benchmarks and insights to propel their growth and outpace the competition. Launched in 2019, NPS Prism has rapidly grown to a team of over 200, serving dozens of clients around the world. NPS Prism is 100% owned by Bain & Company, one of the top management consulting firms in the world and a company consistently recognized as one of the world’s best places to work. We believe that diversity, inclusion, and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally.

Position Summary:
We are seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have strong expertise in Python, SQL and PySpark, with proven experience working on Databricks and cloud platforms such as Azure and AWS. A solid understanding of ETL tooling, as well as basic knowledge of DevOps practices and CI/CD pipelines, will be advantageous. This is a unique opportunity to work in a dynamic and fast-paced environment, designing and implementing robust data solutions for scalable business needs.

Key Responsibilities:
- Data pipeline development: design, build, and optimize ETL/ELT workflows using tools like Databricks, SQL, Python/PySpark and Alteryx (good to have); develop and maintain robust, scalable, and efficient data pipelines for processing large datasets, from source systems onward.
- Cloud data engineering: work on cloud platforms (Azure, AWS) to build and manage data lakes, data warehouses, and scalable data architectures; utilize cloud services such as Azure Data Factory and AWS Glue for data processing and orchestration.
- Databricks and big data solutions: use Databricks for big data processing, analytics, and real-time data processing; leverage Apache Spark for distributed computing and complex data transformations.
- Data management: create and manage SQL-based data solutions, ensuring high availability, scalability, and performance; develop and enforce data quality checks and validation mechanisms.
- Collaboration and stakeholder engagement: collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver impactful data solutions; understand business requirements and translate them into technical solutions.
- DevOps and CI/CD: leverage CI/CD pipelines to streamline development, testing, and deployment of data engineering workflows; work with DevOps tools like Git, Jenkins, or Azure DevOps for version control and automation.
- Documentation and optimization: maintain clear documentation for data workflows, pipelines, and processes; optimize data systems for performance, scalability, and cost-efficiency.

Required Qualifications, Experience and Skills
Educational qualifications: Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
Experience: 3-6 years of experience in data engineering or related roles, with hands-on experience in big data processing frameworks, data lakes, and cloud-native services.

Skills:
- Core skills: proficiency in Python, SQL, and PySpark for data processing and manipulation; proven experience with Databricks and Apache Spark; expertise in working with cloud platforms like Azure and AWS; sound knowledge of ETL processes and tools like Alteryx (good to have).
- Data engineering expertise: leveraging data lakes, data warehouses, and data pipelines; ability to build a data pipeline from scratch; strong understanding of distributed systems and big data technologies.
- DevOps and CI/CD: basic understanding of DevOps principles and familiarity with CI/CD pipelines; hands-on experience with tools like Git, Jenkins, or Azure DevOps, including Git-based versioning.
- Additional skills: familiarity with data visualization tools like Power BI or Tableau is a plus; knowledge of streaming technologies such as Kafka or Event Hubs is desirable; strong problem-solving skills and a knack for optimizing data solutions; excellent communication (oral and written) skills.

Posted 5 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Develop and maintain data-driven applications using Scala and PySpark. Work with large datasets, performing data analysis, building data pipelines, and optimizing performance.

Posted 5 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Hyderabad

Work from Office

Design, develop, and maintain data pipelines and data management solutions. Ensure efficient data collection, transformation, and storage for analysis and reporting.

Posted 5 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Hyderabad

Work from Office

Specializes in Public Key Infrastructure (PKI) implementation and certificate management. Responsibilities include configuring digital certificates, managing encryption protocols, and ensuring secure communication channels. Expertise in SSL/TLS, HSMs, and identity management is required.

Posted 5 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Develop and manage data pipelines using Snowflake. Optimize performance and data warehousing strategies.

Posted 5 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Develop and maintain applications using Kafka for event streaming, Kubernetes for container orchestration, Angular 13 for front-end development, and Core Java for back-end functionality.

Posted 5 days ago

Apply

12.0 - 15.0 years

30 - 40 Lacs

Bengaluru

Hybrid

Java 11 and above. Database development, SQL and stored procedures in Sybase or Oracle. Hibernate, Spring framework, JUnit, and mocking techniques (e.g., EasyMock, jMock, Mockito). Experience with architectural paradigms like DDD (division by domain/sub-domain), API First, and Data Driven design. Knowledge of integration patterns and Event-Driven Architecture. Knowledge of Apache Camel/Kafka. Experience with deployment models: modular, monolith, and microservice. Experience with continuous integration (e.g., Jenkins) and version control tools. Knowledge of performance tuning of Java applications. Knowledge of Agile methodologies, e.g., TDD, XP, Scrum. Excellent communication skills. Able to understand complex systems and take the initiative to seek out and accept new responsibilities. Drive, self-motivation, good interpersonal skills, and the ability to work in a team.
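The mocking frameworks this posting names (EasyMock, jMock, Mockito) all support the same stub-and-verify technique: replace a dependency with a fake, script its return values, then assert on the interactions. A minimal sketch of that technique, shown here in Python with the standard library's unittest.mock (the service and repository names are invented for the example):

```python
from unittest.mock import Mock

# A service with an external dependency (e.g. a repository or DAO).
def settle_account(account_id: str, repo) -> bool:
    """Settle an account only if it carries a positive balance."""
    account = repo.find(account_id)
    if account["balance"] <= 0:
        return False
    repo.mark_settled(account_id)
    return True

# Stub the dependency instead of hitting a real database.
repo = Mock()
repo.find.return_value = {"balance": 120.0}

assert settle_account("A-1", repo) is True
repo.mark_settled.assert_called_once_with("A-1")  # interaction verification
```

In Mockito the equivalent steps would be `when(repo.find("A-1")).thenReturn(...)` followed by `verify(repo).markSettled("A-1")`; the pattern is the same.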

Posted 5 days ago

Apply

12.0 - 15.0 years

0 - 3 Lacs

Bengaluru

Work from Office

Dear candidate, greetings from Teamware Solutions. We have an opportunity with one of our product-based clients. Experience: 12+ yrs. Mode of interview: virtual. Notice: immediate to 15 days. Skills: Java full stack, Java 11, SQL, Sybase or Oracle, Spring, Hibernate, JUnit, Kafka, Agile. Interested candidates can share their updated profile to punyamurthi.m@twsol.com

Posted 5 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

About Fusemachines
Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

About The Role
This is a remote, contract position responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization, and advanced analytics). We are looking for a skilled Senior Data Engineer with a strong background in Python, SQL, PySpark, Azure, Databricks, Synapse, Azure Data Lake, DevOps, and cloud-based large-scale data applications, with a passion for data quality, performance, and cost optimization. The ideal candidate will develop in an Agile environment, contributing to the architecture, design, and implementation of data products in the aviation industry, including migration from Synapse to Azure Data Lake. This role involves hands-on coding, mentoring junior staff, and collaboration with multi-disciplined teams to achieve project objectives.

Qualification & Experience
Must have a full-time Bachelor's degree in Computer Science or similar.
At least 5 years of experience as a data engineer with strong expertise in Databricks, Azure, DevOps, or other hyperscalers.
5+ years of experience with Azure DevOps and GitHub.
Proven experience delivering large-scale data and analytics projects and products as a data engineer, including migrations.
Certifications: Databricks Certified Associate Developer for Apache Spark; Databricks Certified Data Engineer Associate; Microsoft Certified: Azure Fundamentals; Microsoft Certified: Azure Data Engineer Associate; Microsoft Exam: Designing and Implementing Microsoft DevOps Solutions (nice to have).

Required Skills/Competencies
Strong programming skills in one or more languages such as Python (must have) and Scala, with proficiency in writing efficient, optimized code for data integration, migration, storage, processing, and manipulation.
Strong understanding of and experience with SQL, including writing advanced SQL queries.
Thorough understanding of big data principles, techniques, and best practices.
Strong experience with scalable and distributed data processing technologies such as Spark/PySpark (must have: experience with Azure Databricks), DBT, and Kafka, to handle large volumes of data.
Solid Databricks development experience with significant Python, PySpark, Spark SQL, Pandas, and NumPy work in an Azure environment.
Strong experience designing and implementing efficient ELT/ETL processes in Azure and Databricks, using open-source solutions and developing custom integration solutions as needed.
Skilled in data integration from different sources such as APIs, databases, flat files, and event streaming.
Expertise in data cleansing, transformation, and validation.
Proficiency with relational databases (Oracle, SQL Server, MySQL, Postgres, or similar) and NoSQL databases (MongoDB or Table).
Good understanding of data modeling and database design principles; able to design and implement efficient database schemas that meet the requirements of the data architecture and support data solutions.
Strong experience designing and implementing data warehousing, data lake, and data lakehouse solutions in Azure and Databricks.
Good experience with Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT).
Strong understanding of the software development lifecycle (SDLC), especially Agile methodologies.
Strong knowledge of SDLC tools and technologies on Azure DevOps and GitHub, including project management software (Jira, Azure Boards, or similar), source code management (GitHub, Azure Repos, or similar), CI/CD systems (GitHub Actions, Azure Pipelines, Jenkins, or similar), and binary repository managers (Azure Artifacts or similar).
Strong understanding of DevOps principles, including continuous integration and continuous delivery (CI/CD), infrastructure as code (IaC: hands-on Terraform and ARM experience), configuration management, automated testing, performance tuning, and cost management and optimization.
Strong knowledge of cloud computing, specifically Microsoft Azure services related to data and analytics, such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake, Azure Stream Analytics, SQL Server, Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database, etc.
Experience in orchestration using technologies like Databricks Workflows and Apache Airflow.
Strong knowledge of data structures and algorithms and good software engineering practices.
Proven experience migrating from Azure Synapse to Azure Data Lake or other technologies.
Strong analytical skills to identify and address technical issues, performance bottlenecks, and system failures.
Proficiency in debugging and troubleshooting issues in complex data and analytics environments and pipelines.
Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent.
Experience with BI solutions, including Power BI, is a plus.
Strong written and verbal communication skills, able to collaborate with cross-functional teams (business users, data architects, DevOps engineers, data analysts, data scientists, developers, and operations teams) and articulate complex situations concisely.
Ability to document processes, procedures, and deployment configurations.
Understanding of security practices, including network security groups, Azure Active Directory, encryption, and compliance standards; able to implement security controls and best practices within data and analytics solutions, with working knowledge of common cloud security vulnerabilities and ways to mitigate them.
Self-motivated, works well in a team, and experienced in mentoring and coaching team members.
Willingness to stay updated with the latest services, data engineering trends, and best practices in the field.
Comfortable picking up new technologies independently and working in a rapidly changing environment with ambiguous requirements.
Cares about architecture, observability, testing, and building reliable infrastructure and data pipelines.

Responsibilities
Architect, design, develop, test, and maintain high-performance, large-scale, complex data architectures supporting data integration (batch and real-time, ETL and ELT patterns from heterogeneous data systems: APIs and platforms), storage (data lakes, warehouses, data lakehouses, etc.), processing, orchestration, and infrastructure, ensuring the scalability, reliability, and performance of data systems, with a focus on Databricks and Azure.
Contribute to detailed design, architectural discussions, and customer requirements sessions.
Actively participate in the design, development, and testing of big data products.
Construct and fine-tune Apache Spark jobs and clusters within the Databricks platform.
Migrate out of Azure Synapse to Azure Data Lake or other technologies.
Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive).
Design and implement data models and schemas that support efficient data processing and analytics.
Design and develop clear, maintainable code with automated testing using pytest, unittest, integration tests, performance tests, regression tests, etc.
Collaborate with cross-functional teams (Product, Engineering, Data Scientists, and Analysts) to understand data requirements and develop data solutions, including reusable components meeting product deliverables.
Evaluate and implement new technologies and tools to improve data integration, processing, storage, and analysis.
Evaluate, design, implement, and maintain data governance solutions (cataloging, lineage, data quality, and governance frameworks) suitable for a modern analytics solution, following industry-standard best practices and patterns.
Continuously monitor and fine-tune workloads and clusters to achieve optimal performance.
Provide guidance and mentorship to junior team members, sharing knowledge and best practices.
Maintain clear and comprehensive documentation of the solutions, configurations, and best practices implemented.
Promote and enforce best practices in data engineering, data governance, and data quality, and ensure data quality and accuracy.
Design, implement, and maintain data security and privacy measures.
Be an active member of an Agile team, participating in all ceremonies and continuous improvement activities, able to work both independently and collaboratively.

Fusemachines is an Equal Opportunities Employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.
Powered by JazzHR YyCZFu6HUv
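The posting above repeatedly stresses data-quality checks that keep data "accurate, complete, and consistent". One common shape for such checks is a rule-based report over incoming records; the sketch below uses invented rules and field names purely as an illustration of that pattern:

```python
def quality_report(records, required_fields):
    """Count rows failing basic completeness/validity rules.

    The two rules here (required fields present, amount non-negative)
    are illustrative stand-ins for real business rules.
    """
    report = {"missing_field": 0, "negative_amount": 0, "ok": 0}
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required_fields):
            report["missing_field"] += 1   # completeness check
        elif rec.get("amount", 0) < 0:
            report["negative_amount"] += 1  # validity check
        else:
            report["ok"] += 1
    return report

rows = [
    {"id": "1", "amount": 10.0},
    {"id": "2", "amount": -5.0},  # fails the validity rule
    {"id": "", "amount": 3.0},    # fails the completeness rule
]
print(quality_report(rows, required_fields=("id", "amount")))
# → {'missing_field': 1, 'negative_amount': 1, 'ok': 1}
```

In a Databricks pipeline the same idea would typically run as expectations on a Delta Live Tables pipeline or as a scheduled validation job, feeding monitoring rather than stdout.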

Posted 5 days ago

Apply

7.0 - 11.0 years

27 - 42 Lacs

Gurugram

Work from Office

Hi, Cvent is hiring for a Lead Senior Engineer - Java role. Interested candidates can apply or share their resume at anjali.singh@cvent.com

Job Description
What You Will Be Doing:
Work on Internet-scale applications, where performance, reliability, scalability, and security are critical design goals, not afterthoughts.
Create intuitive, interactive, and easy-to-use web applications using rich client-side and REST-based server-side code.
Implement the nuts and bolts of Microservices Architecture, Service-Oriented Architecture (SOA), and Event-Driven Architecture (EDA) in real-life applications.
Gain experience with different database technologies, ranging from traditional relational to the latest NoSQL products such as Couchbase and AWS DynamoDB.
Collaborate with some of the best engineers in the industry on complex Software as a Service (SaaS) applications.

What You Will Need for this Position:
A strong passion for software development and pride in designing and coding. Great analytical skills and the ability to handle complex, modular software development in a collaborative, team-based environment.

Posted 5 days ago

Apply

175.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

hackajob is collaborating with American Express to connect them with exceptional tech professionals for this role.

Description
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
Responsible for contacting clients with overdue accounts to secure settlement of the account, and for preventive work to avoid future overdues on high-exposure accounts.
The eCRMS organization is looking for a hands-on Software Engineer for the Customer 360 (C360) Engineering team. The C360 Platform is a critical platform in American Express that provides a holistic view of customers' relationships with various American Express products, manages customer demographics, and provides intelligent insights about customers' contact preferences. This platform is an integral part of all critical user journeys and is at the forefront of the new initiatives the company is undertaking. In C360, we build and operate highly available and scalable services using event-driven reactive architecture to provide real-time services that power critical use cases across the company. We perform data analysis, work on anomaly detection, and create new data insights. This Engineer role will be an integral part of a team that builds large-scale, cloud-native, event-driven reactive applications to create a 360-degree view of the customer.

Specifically, you will:
Help lead the build of new microservices that manage our rapidly growing data hub.
Lead the build of services that perform real-time data processing at scale for relational, analytical queries across multi-dimensional data.
Lead the build of services that generalize stream processing to make it trivial to source, sink, and stream-process data.
Improve the efficiency, reliability, and scalability of our data pipelines.
Work on cross-functional initiatives and collaborate with engineers across organizations.
Influence team members with creative changes and improvements by challenging the status quo and demonstrating risk-taking.
Be a productivity multiplier for your team by analyzing your workflow and helping the team become more effective and productive, delivering faster and stronger results.

Are you up for the challenge?
5+ years of experience building large-scale distributed applications with object-oriented design using a Java-related stack.
Master's or bachelor's degree in Computer Science, Information Systems, or another related field (or equivalent work experience).
Ability to implement scalable, high-performing, secure, highly available solutions.
Proficient in developing solution architectures for business problems and communicating them to large teams.
Proficient in weighing the pros and cons of different solution options and gaining alignment on the preferred option with multiple stakeholders.
Experience with NoSQL technologies such as Cassandra, Couchbase, etc.
Experience with web services and API development on enterprise platforms using REST, GraphQL, and gRPC.
Expertise in big data technologies like Hive, MapReduce, and Spark.
Experience with event-driven microservice architecture (Vert.x, Kafka, etc.).
Experience with automated release management using Maven, Git, and Jenkins.
Experience with Vert.x and event-driven architecture is a plus.
Experience with Postgres is a plus.
Experience with Docker/OpenShift-based deployment is a plus.

Compliance Language
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
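The C360 role above centres on event-driven architecture: services subscribe to topics and react to published events. The toy sketch below shows that publish/subscribe shape with a tiny in-process bus standing in for a broker such as Kafka; the topic name, event shape, and "profile" consumer are all invented for the example:

```python
from collections import defaultdict

class EventBus:
    """Tiny in-process stand-in for a message broker: topics,
    subscribers, and publish fan-out. Purely illustrative."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subs[topic]:
            handler(event)

# A consumer that folds events into a running customer profile
# (a toy version of a "360-degree view").
profiles = defaultdict(dict)

def update_profile(event):
    profiles[event["customer_id"]].update(event["attributes"])

bus = EventBus()
bus.subscribe("customer-updates", update_profile)
bus.publish("customer-updates", {"customer_id": "c1", "attributes": {"email": "a@b.c"}})
bus.publish("customer-updates", {"customer_id": "c1", "attributes": {"tier": "gold"}})
print(profiles["c1"])  # → {'email': 'a@b.c', 'tier': 'gold'}
```

A real deployment of the kind the posting describes would replace the in-memory bus with Kafka topics and the handler with a consumer group, but the subscribe/publish/fold structure is the same.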

Posted 5 days ago

Apply

5.0 - 10.0 years

0 - 3 Lacs

Bengaluru

Hybrid

Openings for Java Developer
Exp: 5+ years' experience
Job Location: Bangalore
Notice Period: immediate to max 15 days
Work Mode: Hybrid
Shift: general shifts
Interview Mode: virtual
Skills Needed: Strong experience in Java programming; Spring, Spring Boot, Spring MVC, Hibernate; application servers, servlet containers, JMS, JPA, REST API; Kafka; MySQL or NoSQL DB; cloud technology; CI/CD, Kubernetes.
Interested? Drop resumes on mail to: shubha.s@twsol.com

Posted 5 days ago

Apply