7.0 - 12.0 years
15 - 27 Lacs
Chennai
Hybrid
Senior Big Data Developer (GCP: BigQuery, Dataflow, Dataproc, Spanner). Very good communication skills; a self-starter and quick learner, willing to work from the office in hybrid mode.
Posted 1 day ago
3.0 - 7.0 years
20 - 25 Lacs
Pune
Work from Office
About the Role:
We are looking for a highly motivated Senior Software Engineer, Data Analytics to join our fast-paced engineering team. The ideal candidate takes full ownership of their work, thrives in cross-functional collaboration, and is passionate about building scalable, fault-tolerant big data systems. In this role, you will design and develop high-performance data platforms, mentor junior engineers, and contribute to delivering impactful analytics solutions that drive strategic business decisions.

What You'll Do:
- Design, build, and optimize scalable, fault-tolerant Big Data pipelines for batch and streaming workloads.
- Develop real-time streaming applications using Apache Spark Streaming or Flink.
- Work with Snowflake, Hadoop, Kafka, and Spark for large-scale data processing and analytics.
- Implement workflow orchestration using tools such as Apache Airflow, Oozie, or Luigi.
- Develop backend services and REST APIs to serve analytics and data products.
- Collaborate with product managers, stakeholders, and cross-functional teams to deliver data-driven solutions.
- Ensure data quality, governance, and security across the data ecosystem.
- Guide and mentor junior engineers, providing technical leadership and best-practice recommendations.
- Perform code reviews, tune performance, and troubleshoot distributed-system issues.
- Drive innovation by evaluating and implementing new tools, frameworks, and approaches for data engineering.

We'd Love for You to Have:
- 4-7 years of experience in Big Data & Analytics engineering.
- Strong programming skills in Java, Scala, or Python.
- Hands-on experience with Apache Spark, Hadoop, Kafka, and distributed data systems.
- Proficiency in SQL and experience with Snowflake (preferred) or other cloud data warehouses.
- Practical experience with workflow orchestration tools such as Airflow, Oozie, or Luigi.
- A strong foundation in data structures, algorithms, and distributed-system design.
- Familiarity with cloud platforms (AWS, GCP, Azure) and related data services.
- Experience with containerization and orchestration (Docker, Kubernetes).
- Exposure to data observability, monitoring tools, and AI/ML integration with data pipelines.
- Experience mentoring and guiding team members.
- A proven track record on cross-team collaboration projects.
- Strong problem-solving skills and the ability to take ownership and deliver end-to-end solutions.

Qualifications:
A bachelor's degree in engineering (CS/IT) or an equivalent degree from a well-known institute/university.
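The orchestration tools named above (Airflow, Oozie, Luigi) all reduce to the same idea: run pipeline tasks in dependency order. Below is a minimal, framework-free sketch of that idea using only the standard library; the task names are invented, and a real pipeline would use Airflow operators rather than a plain dict.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load_warehouse": {"aggregate"},
    "publish_report": {"load_warehouse"},
}

# Resolve a valid execution order, as an orchestrator would before scheduling.
order = list(TopologicalSorter(dag).static_order())
print(order)

# Every task comes after all of its dependencies.
assert order.index("extract") < order.index("clean") < order.index("load_warehouse")
```

An orchestrator adds retries, scheduling, and backfills on top of this, but the dependency graph is the core contract.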
Posted 2 days ago
5.0 - 9.0 years
13 - 23 Lacs
Bengaluru
Hybrid
The Role:
Develops and programs methods, automated processes, and systems to cleanse, integrate, and analyze structured and unstructured, diverse big data sources to generate actionable insights and solutions using machine learning and advanced analytics. Interprets and communicates insights and findings from analyses and experiments to other analysts, data scientists, team members, and business partners.

The Main Responsibilities:
- Support the development of end-to-end analytics solutions by assisting in the design and implementation of solutions that cover the entire data science lifecycle: data discovery, cleaning, exploratory data analysis, model building, and deployment.
- Assist with operationalizing models and participate in the iterative process of refining models and insights based on feedback and business requirements.
- Analyze data and build predictive, prescriptive, and advanced analytical models in areas including capacity planning, effect/anomaly detection, predictive asset failure/maintenance, workload optimization, customer segmentation, and business performance.
- Gain direct experience with modeling techniques such as clustering, regression, and time-series forecasting, applying them to generate actionable insights and recommendations.
- Mine information for previously unknown patterns and insights hidden in these assets and leverage them for competitive advantage.
- Create compelling data visualizations and dashboards to communicate findings to both technical and non-technical audiences, presenting insights in a clear, concise, and actionable manner.
- Collaborate within and across cross-functional teams, working closely with data engineers, data scientists, and business stakeholders to understand business problems, gather requirements, and communicate insights effectively. Contribute to collaborative problem-solving sessions and agile development processes.
- Develop and operationalize end-to-end machine learning pipelines on Databricks, including feature engineering, model training, evaluation, and deployment.
- Implement and manage MLOps practices, integrating Git for version control, CI/CD pipelines for model deployment, and automated monitoring of models in production.
- Develop and consume RESTful APIs for data integration, enabling seamless connectivity between analytics applications and external systems.
- Ensure reproducibility, auditability, and governance of data science models by adhering to enterprise MLOps standards and frameworks.
- Support analytics democratization by packaging models as reusable components and APIs for consumption across the enterprise.

What We Look for in a Candidate:
- Able to apply techniques such as classification, clustering, regression, deep learning, association, anomaly detection, time-series forecasting, Hidden Markov models, and Bayesian inference to solve pragmatic business problems.
- Able to design working models and implement them on Big Data systems using MapReduce or Spark frameworks.
- Familiar with Hadoop, Pig, Hive, SCOPE, Cosmos, or similar technologies.
- Able to work within an agile, iterative DevOps development process.

Experience:
- 3+ years of experience delivering Machine Learning and Advanced Analytics solutions.
- Experience with statistical programming environments such as Python, R, SPSS, or IBM Watson Studio.
- Experience building data models and performing complex queries using SQL.
- Experience performance-tuning large datasets.
- Experience building large data pipelines and/or web services.
- Experience developing visualizations and dashboards using Power BI or similar tools.
- Fluency in one or more object-oriented languages such as C#, C++, Scala, or Java, and scripting languages such as Python or Ruby.

"We are an equal opportunity employer committed to fair and ethical hiring practices. We do not charge any fees or accept any form of payment from candidates at any stage of the recruitment process. If anyone claims to offer employment opportunities in our company in exchange for money or any other benefit, please treat it as fraudulent and report it immediately."
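Of the modeling techniques listed above (clustering, regression, time-series forecasting, anomaly detection), anomaly detection has the shortest illustrative sketch: flag points that sit far from the series mean, measured in standard deviations. The data and threshold below are invented for illustration; real systems would use seasonal baselines or learned models.

```python
import statistics

def zscore_anomalies(series, threshold=2.5):
    """Flag indices whose value lies more than `threshold`
    population standard deviations from the series mean."""
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # constant series: nothing stands out
    return [i for i, x in enumerate(series) if abs(x - mean) / stdev > threshold]

# A steady metric with one obvious spike at index 6.
readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.0, 42.0, 10.1, 9.9, 10.0]
print(zscore_anomalies(readings))  # → [6]
```

The same shape (score each point, compare to a threshold) carries over to the capacity-planning and asset-failure use cases mentioned in the posting.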
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
We are seeking a seasoned Senior Developer & Tech Lead who is enthusiastic about writing clean, efficient code, building scalable systems, promoting engineering excellence, and supervising a team of skilled developers in a fast-paced, Agile environment. This position is well suited to developers with extensive hands-on experience in Java and Apache Spark, coupled with a solid understanding of object-oriented design principles.

Your responsibilities will include conducting detailed impact analysis for code changes; designing and implementing scalable, high-performance code using Java and Big Data/Apache Spark; and ensuring the code is high quality, maintainable, modular, and adheres to industry-standard design patterns and SOLID principles. You will also write robust unit tests using JUnit, lead code reviews to enforce clean design and best engineering practices, foster an environment of ownership and accountability, and mentor a team of developers through technical challenges.

As a Senior Developer & Tech Lead, you will collaborate closely with Architects, Quality Engineers, DevOps, and Product Owners to deliver high-quality code at speed. You will work in a cross-functional Agile team, participating in daily stand-ups, sprint planning, retrospectives, and backlog grooming. Additionally, you will translate user stories into technical tasks and ensure timely delivery of high-quality solutions.

The ideal candidate should possess at least 8 years of development experience with a strong background in Java, Big Data/Apache Spark, and object-oriented programming. Experience with REST APIs, RDBMS databases, and Kafka messaging systems is required, along with exposure to microservices architecture and containerization tools such as Docker and Kubernetes. Proven experience in leading teams and mentoring developers in a fast-paced development environment is essential. A solid understanding of the software development lifecycle (SDLC) and Agile methodologies, excellent problem-solving skills, and the ability to think critically under pressure are also crucial. Strong communication skills and the ability to collaborate effectively in cross-functional teams are highly valued. A Bachelor's degree or equivalent experience is required; a Master's degree is preferred.

If you are a person with a disability and require a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. You can also view Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
At Citi, we are not just building technology, we are building the future of banking. Encompassing a broad range of specialties, roles, and cultures, our teams are creating innovations used across the globe. Citi is constantly growing and progressing through our technology, with a laser focus on evolving the ways of doing things. As one of the world's most global banks, we are changing how the world does business. Shape your career with Citi.

We are currently looking for a high-caliber professional to join our team as Officer, Tableau Developer (C11) - Hybrid, based in Pune, India. Being part of our team means that we will provide you with the resources to meet your unique needs, empower you to make healthy decisions, and manage your financial well-being to help plan for your future. For instance:
- We provide programs and services for your physical and mental well-being, including access to telehealth options, health advocates, confidential counseling, and more. Coverage varies by country.
- We empower our employees to manage their financial well-being and help them plan for the future.
- We provide access to an array of learning and development resources to help broaden and deepen your skills and knowledge as your career progresses.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code.
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; and install and support customer exposure systems.
- Apply fundamental knowledge of programming languages for design specifications.
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging.
- Serve as an advisor or coach to new or lower-level analysts.
- Identify problems, analyze information, and make evaluative judgments to recommend and implement solutions.
- Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Operate with a limited level of direct supervision, exercising independence of judgment and autonomy.
- Act as a subject matter expert to senior stakeholders and/or other team members.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations; adhering to Policy; applying sound ethical judgment regarding personal behavior, conduct, and business practices; and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-8 years of relevant experience as a Tableau developer
- Strong in SQL
- Reporting tool: Tableau
- Programming skill: Python
- Database: Oracle/Big Data
- Good to have: experience in the Financial Services industry
- Intermediate-level experience in an Applications Development role
- Consistently demonstrates clear and concise written and verbal communication
- Demonstrated problem-solving and decision-making skills
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
15.0 - 19.0 years
0 Lacs
Pune, Maharashtra
On-site
The Financial Crimes & Fraud Prevention Analytics team at Citi is looking for a skilled individual to join as a C14 (people manager) reporting to the Director/Managing Director, AIM. This role involves leading a team of data scientists based in Pune/Bangalore, focusing on the development and implementation of Machine Learning (ML)/AI/Gen AI models for fraud prevention. The successful candidate will be responsible for designing, developing, and deploying generative-AI-based solutions, analyzing data to understand fraud patterns, and developing models to achieve overall business goals. Additionally, the individual will collaborate with the model implementation team, ensure model documentation, and address questions from model risk management (MRM) while adapting to changing business needs.

Key Responsibilities:
- Lead as Subject Matter Expert (SME) in ML/AI/Gen AI, demonstrating strong AI and ML concepts and the ability to articulate complex concepts to diverse audiences.
- Lead a team of data scientists in the development and implementation of ML/AI/Gen AI models, providing technical leadership and mentorship and ensuring 100% execution accuracy.
- Customize and fine-tune existing RAG frameworks, or design new frameworks, to meet project requirements.
- Establish governance frameworks for model development, deployment, and monitoring to meet MRM and Fair Lending guidelines.
- Oversee the end-to-end model development lifecycle and ensure timely deployment with high quality and no errors.
- Manage a team of 15+ data scientists, providing career development, conflict management, performance management, coaching, mentorship, and technical guidance.

Requirements:
- 15+ years of analytics experience in core model development using ML/AI/Gen AI techniques.
- Strong knowledge of current state-of-the-art ML/AI/Gen AI algorithms and their pros and cons.
- Experience with Big Data environments, Python, and SQL.
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field; a Ph.D. is a plus.
- At least 8 years of people management experience.
- Proven track record of building and deploying generative-model-based solutions in production environments.
- Excellent verbal and written communication skills, with the ability to influence business outcomes and decisions.
- Strong project management skills and the ability to define business requirements and create robust technical documentation.
- Strategic thinking and the ability to frame business problems, with excellent analytical and statistical skills.

If you are a person with a disability and need a reasonable accommodation to use Citi's search tools and/or apply for a career opportunity, please review Accessibility at Citi. For more information on Citi's EEO Policy Statement and the Know Your Rights poster, visit the Citi website.
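The RAG frameworks mentioned above pair a retriever with a generative model; the retrieval half can be sketched in plain Python with bag-of-words cosine similarity. The corpus and query below are invented, and a production system would use learned embeddings and a vector store rather than word counts.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k documents most similar to the query."""
    qv = Counter(query.lower().split())
    return sorted(docs,
                  key=lambda d: cosine(qv, Counter(d.lower().split())),
                  reverse=True)[:k]

corpus = [
    "card fraud often shows bursts of small test transactions",
    "model documentation must satisfy model risk management",
    "loan applications are scored nightly in batch",
]
print(retrieve("how do we detect card fraud transactions", corpus))
```

The retrieved passages are then placed into the generator's prompt; fine-tuning a RAG framework largely means improving this retrieval step and how its output is injected.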
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
The candidate will be responsible for guiding the full lifecycle of a Hadoop solution, including requirements analysis, platform selection, technical architecture design, infrastructure design and build, testing, and deployment. This position involves building the Big Data/Hadoop/NoSQL DB capability in Infrastructure Services under the Technology Office, known as the Big Data CoE. The location for this position is Bengaluru. The ideal candidate should have at least 6 years of experience and a graduate qualification.

ISquaresoft offers a wide range of benefits to its employees, including opportunities for career growth and various healthcare plans. The company is open to new ideas, opinions, and perspectives that can help take the organization to the next level. At ISquaresoft, employees are highly valued, the company has an effective organizational structure in place, and equal opportunities and environment-friendly policies are implemented throughout the organization. If you are an exceptional and passionate individual, ISquaresoft is always looking for talented people to join the team and help make ISquaresoft even better.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As an individual adept in data analysis techniques, you will be responsible for handling large datasets and performing data modeling, validation, and statistical analyses using various data analysis tools. Your role will involve driving project deliveries and managing a team of data professionals while liaising effectively with business stakeholders.

Your key accountabilities and responsibilities will include converting business problems into analytical problems, providing high-quality analysis and recommendations, managing and delivering projects efficiently, conceptualizing data-driven solutions for multiple businesses/regions, and focusing on driving efficiency gains and process enhancements. You will also be expected to use data to improve customer outcomes and drive business benefits through self-initiated projects.

In terms of leadership and teamwork, you should be a positive team player while collaborating with a diverse global team. This may involve liaising with stakeholders such as business teams, analysts, product owners, data engineers, and platform developers. Additionally, you will mentor junior team members, work with an Agile mindset, and demonstrate stakeholder management and collaboration skills while ensuring adherence to risk management, internal controls, and compliance.

Your functional knowledge should include proficiency in Python, SQL, SAS, Excel, and PySpark. Prior project management experience, Agile methodology knowledge, and familiarity with tools like Jira, Confluence, and GitHub are essential. Exposure to the Credit Risk Management framework; Big Data/Hadoop data lakes; cloud platforms such as GCP, Azure, and AWS; and visualization tools such as QlikSense, Tableau, and Power BI will be advantageous in this role.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
You should have a UG/PG degree in Computer Science Engineering or Electronics and Communication Engineering, with hands-on experience in Computer Science domains including Big Data, Cloud Computing, Data Mining, Dependable and Secure Computing, and Image Processing, and domain knowledge in Mobile Computing, Network Security, Web Mining, and Software Engineering. Knowledge of Embedded Systems, VLSI, IoT, Power Electronics, and Wireless Communication is also required. You should be comfortable reading research articles and possess an excellent attitude and passion for engineering: innovative, with good analytical, quantitative, and data interpretation skills. Good written communication, documentation, and presentation skills are a must. As a self-starter and self-paced learner, you are expected to contribute within an agile framework. Your responsibilities will include contributing to requirement/product/design document specification, troubleshooting, and serving as a critical member of the validation team. You should be versatile in learning the technical and soft skills necessary to accomplish assigned tasks successfully, follow established processes, suggest improvements to existing processes, and help define new ones.
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
We are seeking a talented Neo4j Developer (Knowledge Graphs) to join our team as a key member of the data engineering group. Your primary responsibility will be the design, implementation, and optimization of graph databases, aimed at efficiently storing and retrieving highly connected data. You will have the opportunity to work with cutting-edge technologies in locations such as Hyderabad, Pune, Gurugram, and Bangalore. The major skills required for this role are expertise in Neo4j, Cypher, Python, and Big Data tools such as Hadoop, Hive, and Spark.

Your responsibilities will include designing, building, and enhancing the client's online platform. You will leverage Neo4j to create and manage knowledge graphs, ensuring optimal performance and scalability. Additionally, you will research, propose, and implement new technology solutions following best practices and standards. The role also involves developing and maintaining knowledge graphs using Neo4j, integrating graph databases with existing infrastructure, and providing support for query optimization and data modeling.

To excel in this position, you should have 5-10 years of experience in data engineering, proficiency in graph query languages such as Cypher or Gremlin, and a strong foundation in graph theory. Experience with Big Data tools is essential, along with excellent written and verbal communication skills, superior analytical and problem-solving abilities, and a preference for working in dual-shore engagement setups. If you are interested in this opportunity, please share your updated resume with us at francis@lorventech.com.
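A knowledge graph of the kind this role builds is simply entities linked by typed relationships, and Cypher expresses pattern matches over those links (e.g. `MATCH (a)-[:IS_A]->(b) RETURN a, b`). The same idea in miniature, with invented triples, using plain Python:

```python
# A toy in-memory knowledge graph as (subject, relation, object) triples.
# Entity and relation names are invented for illustration; Neo4j stores
# the same shape as nodes connected by typed, directed relationships.
triples = {
    ("neo4j", "IS_A", "graph_database"),
    ("cypher", "QUERIES", "neo4j"),
    ("spark", "IS_A", "compute_engine"),
    ("hive", "RUNS_ON", "hadoop"),
}

def match(rel):
    """Return all (subject, object) pairs connected by `rel`,
    like a single-hop Cypher MATCH on one relationship type."""
    return sorted((s, o) for s, r, o in triples if r == rel)

print(match("IS_A"))
# → [('neo4j', 'graph_database'), ('spark', 'compute_engine')]
```

Where this sketch scans every triple, a graph database indexes relationships so multi-hop traversals stay fast as the graph grows — which is the query-optimization work the posting mentions.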
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III - Big Data/Java/Scala at JPMorgan Chase within the Liquidity Risk (LRI) team, you will design and implement the next-generation build-out of a cloud-native liquidity risk management platform for JPMC. The Liquidity Risk technology organization aims to provide comprehensive solutions for managing the firm's liquidity risk and meeting our regulatory reporting obligations across 50+ markets. The program will include the strategic build-out of advanced liquidity calculation engines, incorporate AI and ML into our liquidity risk processes, and bring digital-first reporting capabilities. The target platform must process 40-60 million transactions and positions daily, calculate the risk presented by the current actual as well as model-based what-if state of the market, build a multidimensional picture of the corporate risk profile, and provide the ability to analyze it in real time.

Job Responsibilities:
- Execute standard software solutions, design, development, and technical troubleshooting.
- Apply knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation.
- Gather, analyze, and draw conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development.
- Learn and apply system processes, methodologies, and skills for the development of secure, stable code and systems.
- Add to a team culture of diversity, equity, inclusion, and respect.
- Contribute to the team's drive for continual improvement of the development process and innovative solutions to meet business needs.
- Apply appropriate dedication to supporting business goals through technology solutions.

Required Qualifications, Capabilities, and Skills:
- Formal training or certification in software engineering concepts and 2+ years of applied experience.
- Hands-on development experience and in-depth knowledge of Java, Scala, Spark, and Big Data-related technologies.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience with cloud technologies (AWS).
- Experience across the whole Software Development Life Cycle.
- Experience with agile methodologies such as CI/CD, Application Resiliency, and Security.
- Emerging knowledge of software applications and technical processes within a technical discipline.
- Ability to work closely with stakeholders to define requirements, interacting with partners across feature teams to collaborate on reusable services that meet solution requirements.

Preferred Qualifications, Capabilities, and Skills:
- Experience working on big data solutions, with evidence of the ability to analyze data to drive solutions.
- Exposure to complex computing using the JVM and Big Data.
- Ability to find issues and optimize existing workflows.
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills. To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential to interact with cross-functional teams and document best practices. Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. 
As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives. Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
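Pipelines like the ones this role describes often consume newline-delimited JSON events from a stream (Kafka, Kinesis) and gate each record on a schema check before loading. A stdlib-only sketch of that quality gate — the field names and records are invented, and a real Glue or Lambda step would carry far richer validation:

```python
import json

REQUIRED = {"event_id", "amount"}  # hypothetical minimal schema

def parse_events(lines):
    """Split raw JSON lines into (valid, rejected) records,
    the kind of quality gate applied before loading downstream."""
    valid, rejected = [], []
    for line in lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            rejected.append(line)       # malformed payload
            continue
        if REQUIRED <= rec.keys():
            valid.append(rec)
        else:
            rejected.append(line)       # missing required fields
    return valid, rejected

raw = [
    '{"event_id": "e1", "amount": 12.5}',
    '{"event_id": "e2"}',          # missing amount
    'not-json-at-all',             # malformed
]
valid, rejected = parse_events(raw)
print(len(valid), len(rejected))  # → 1 2
```

Routing rejects to a dead-letter location instead of dropping them is what makes the pipeline auditable.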
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Sonipat, Haryana
On-site
As a Data Engineer and Subject Matter Expert in Data Mining at Newton School of Technology, you will play a crucial role in revolutionizing technology education and empowering students to bridge the employability gap in the tech industry. You will have the opportunity to develop and deliver engaging lectures, mentor students, and contribute to the academic and research environment of the Computer Science Department.

Your key responsibilities will include developing comprehensive lectures for the Data Mining, Big Data, and Data Analytics courses, covering everything from foundational concepts to advanced techniques. You will guide students through the complete data lifecycle, including preprocessing, cleaning, transformation, and feature engineering, and teach a wide range of algorithms for classification, association rule mining, clustering, and anomaly detection. You will also design practical lab sessions, grade assessments, mentor students on projects, and stay updated with the latest advancements in data engineering and machine learning to keep the curriculum cutting-edge.

For this role, you need a Ph.D., or a Master's degree with significant industry experience, in Computer Science, Data Science, Artificial Intelligence, or a related field. Expertise in data engineering and machine learning concepts, proficiency in Python and its data science ecosystem, experience teaching complex topics at the undergraduate level, and excellent communication skills are essential qualifications. Preferred qualifications include a record of academic publications, industry experience as a Data Scientist or in a similar role, familiarity with big data technologies and deep learning frameworks, and experience mentoring student teams for data science competitions or hackathons.

By joining Newton School of Technology, you will receive a competitive salary package, access to advanced labs and facilities, and the opportunity to be part of a forward-thinking academic team shaping the future of tech education. If you are passionate about transforming technology education, empowering students, and staying at the forefront of data engineering and machine learning, we would be excited to have you join our team. For more information about our university, please visit the Newton School of Technology website.
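The preprocessing and feature-engineering stage of the data lifecycle described above can be illustrated with the simplest transformation taught in such a course: min-max scaling, which rescales a numeric feature to [0, 1] before distance-based algorithms like clustering. The data below is invented for illustration:

```python
def min_max_scale(values):
    """Rescale a numeric feature to the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # constant feature: nothing to scale
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

ages = [20, 35, 50, 65]
print(min_max_scale(ages))  # endpoints map to 0.0 and 1.0
```

Without this step, a feature measured in thousands would dominate one measured in units when a clustering algorithm computes distances.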
Posted 1 month ago
10.0 - 15.0 years
20 - 30 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Role: Lead Data Engineer
Experience: 10+ years
Location: Pune, Bengaluru, Hyderabad, Chennai, Gurugram, Noida
Work Mode: Hybrid (3 days work from office)
Key Skills: Snowflake, SQL, Data Engineering, ETL, Any Cloud (GCP/AWS/Azure)

Must-Have Skills:
- Proficiency in Snowflake and SQL: 4+ years of experience in Snowflake and 8+ years in SQL
- At least 10 years of experience on data engineering development projects
- At least 6 years of data engineering experience on cloud technology
- Strong expertise with the Snowflake data warehouse platform, including its architecture, features, and best practices
- Hands-on experience with ETL and data engineering tools
- Design, develop, and maintain efficient ETL/ELT pipelines using Snowflake and related data engineering tools
- Optimize Snowflake data warehouses for performance, cost, and scalability
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver data solutions
- Implement data modeling and schema design best practices in Snowflake
- Good communication skills are a must

Good-to-Have Skills:
- Knowledge of the DNA (Fiserv) core banking system
- Knowledge of data governance, security, and compliance standards
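The ELT pipelines described above typically stage incoming rows and then merge them into a warehouse table; in Snowflake that is a `MERGE ... WHEN MATCHED / WHEN NOT MATCHED` statement. The same pattern, sketched with the standard library's SQLite driver and an upsert — the table and data are invented stand-ins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 250.0)")

# Incoming batch: an update for id 2 and a brand-new id 3.
batch = [(2, 300.0), (3, 50.0)]

# SQLite's ON CONFLICT upsert plays the role of Snowflake's
# MERGE ... WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT.
conn.executemany(
    """INSERT INTO accounts (id, balance) VALUES (?, ?)
       ON CONFLICT(id) DO UPDATE SET balance = excluded.balance""",
    batch,
)

rows = conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall()
print(rows)
# → [(1, 100.0), (2, 300.0), (3, 50.0)]
```

Making the merge idempotent like this is what lets a pipeline safely reprocess a batch after a failure.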
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an Infoscion, your primary responsibility is to interact with clients to address quality assurance issues and ensure the utmost customer satisfaction. You will be involved in understanding requirements, creating and reviewing designs, validating architecture, and delivering high-quality service offerings in the technology domain. Participating in project estimation, providing inputs for solution delivery, conducting technical risk planning, and performing code reviews and unit test plan reviews are crucial aspects of your role. Leading and guiding your teams toward developing optimized code deliverables, continuous knowledge management, and adherence to organizational guidelines and processes are also key responsibilities. You will play a significant role in building efficient programs and systems. If you believe you have the skills to assist clients in their digital transformation journey, this is the ideal place for you to thrive.

In addition to the primary responsibilities, you are expected to have knowledge of multiple technologies, a basic understanding of architecture and design fundamentals, and familiarity with testing tools and agile methodologies. Understanding project life cycle activities, estimation methodologies, quality processes, and business domains is essential. Analytical abilities, strong technical skills, good communication skills, and a deep understanding of technology and domains are also required. Furthermore, you should be able to demonstrate a solid understanding of software quality assurance principles, SOLID design principles, and modeling methods. Keeping abreast of the latest technologies and trends and possessing excellent problem-solving, analytical, and debugging skills are highly valued.

Preferred Skills:
- Technology: Functional Programming - Scala
Posted 1 month ago
6.0 - 11.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! We are reaching out to you regarding a permanent opportunity.

Job Description:
Exp: 6-12 yrs
Location: Hyderabad/Bangalore/Pune/Gurgaon
Skill: GCP Data Engineer

Interested candidates can share their resume with sangeetha.spstaffing@gmail.com, including the details below inline:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in GCP:
- Relevant Exp in BigQuery:
- Relevant Exp in Big Data:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN Number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for virtual interview on weekdays between 10 AM-4 PM (please mention time):
- Current Residential Location:
- Preferred Job Location:
- Is your educational percentage in 10th std, 12th std, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 1 month ago
6.0 - 11.0 years
0 - 0 Lacs
Bengaluru
Work from Office
- Experience across Enterprise BI/Big Data/DW/ETL technologies such as Teradata, Hadoop, Tableau, SAS, Hyperion, or Business Objects
- Experience with data modelling patterns
- Experience working within a Data Delivery Life Cycle framework
- Experience leading discussions and presentations
- Experience driving decisions across groups of stakeholders
- Extensive experience in large enterprise environments handling large volumes of data with high Service Level Agreements
- 5+ years' experience gained in a financial institution or insurance provider is desirable
- Experience in development projects using Ab Initio, Snowflake, and AWS/cloud services
- Recognized industry certifications
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Salesforce is currently seeking software developers who are passionate about creating impactful solutions for users, the company, and the industry. Join a team of talented engineers to design and develop innovative features that enhance our CRM platform's stability and scalability. As a software engineer at Salesforce, you will be involved in architecture, design, implementation, and testing to ensure the delivery of high-quality products to our customers. We take pride in writing maintainable code that strengthens product stability and simplifies our work processes. Our team values individual strengths and encourages personal growth. By empowering autonomous teams, we aim to foster a culture of innovation and excellence that benefits both our employees and customers.

**Your Impact**
As a Senior Backend Software Engineer at Salesforce, your responsibilities will include:
- Building new components to enhance our technology offerings in a dynamic market
- Developing high-quality code for our cloud platform used by millions of users
- Designing, implementing, and optimizing APIs and API framework features for scalability
- Contributing to all phases of the software development life cycle in a Hybrid Engineering model
- Creating efficient components for a multi-tenant SaaS cloud environment
- Conducting code reviews, mentoring junior engineers, and providing technical guidance

**Required Skills:**
- Proficiency in multiple programming languages and platforms
- 5+ years of experience in backend software development, including designing distributed systems
- Deep knowledge of object-oriented programming and scripting languages such as Java, Python, Scala, C#, Go, Node.js, and C++
- Strong skills in PostgreSQL/SQL and experience with relational and non-relational databases
- Understanding of software development best practices and leadership abilities
- Degree or equivalent experience with relevant competencies

**Preferred Skills:**
- Experience developing SaaS products on public cloud platforms like AWS, Azure, or GCP
- Knowledge of Big Data/ML, S3, Kafka, Elastic Search, Terraform, Kubernetes, and Docker
- Previous experience in a fast-paced, multinational organization

**Benefits & Perks**
- Comprehensive benefits package including well-being reimbursement, parental leave, adoption assistance, and more
- Access to training resources on Trailhead.com
- Mentorship opportunities with leadership and executive thought leaders
- Volunteer programs and community engagement initiatives as part of our giving-back model

For further details, please visit the [Salesforce Benefits Page](https://www.salesforcebenefits.com/).
Posted 1 month ago
5.0 - 8.0 years
0 - 3 Lacs
Pune, Chennai
Hybrid
Hello Connections, exciting opportunity alert! We're on the hunt for passionate individuals to join our dynamic team as Data Engineers.
Job Profile: Data Engineer
Experience: Minimum 5 to maximum 8 years
Location: Chennai / Pune
Mandatory Skills: Big Data | Hadoop | PySpark | Spark | Spark SQL | Hive
Qualification: B.Tech / B.E / MCA / Computer Science background - any specialization
How to apply? Send your CV to sipriyar@sightspectrum.in
Contact Number: 6383476138
Don't miss out on this amazing opportunity to accelerate your professional career!
#bigdata #dataengineer #hadoop #spark #python #hive #pyspark
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data. To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, IAM roles, and more. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential.

You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving. Continuous learning and staying updated with new technologies are key attributes for success in this role. Design experience across diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Additional skills in developing enterprise-level software solutions, knowledge of file formats such as JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience mentoring team members are also important aspects of this role.

Your accountabilities will include the construction and maintenance of data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.

Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
About Impetus: Impetus Technologies is a digital engineering company dedicated to offering expert services and products that support enterprises in accomplishing their transformation objectives. Specializing in solving analytics, AI, and cloud challenges, we empower businesses to foster unparalleled innovation and expansion. Established in 1991, we stand out as leaders in cloud and data engineering, delivering cutting-edge solutions to Fortune 100 corporations. Our headquarters are located in Los Gatos, California, while our development centers span NOIDA, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, with a global team of over 3000 professionals. Additionally, we have operational offices in Canada and Australia and maintain collaborative relationships with renowned organizations such as American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Skills Required:
- Big Data
- PySpark
- Hive
- Spark optimization

Good to have:
- GCP
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Salesforce has immediate opportunities for software developers who want their lines of code to have a significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high-quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive.

As a Senior Backend Software Engineer, your job responsibilities will include:
- Building new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency.
- Developing high-quality, production-ready code that millions of users of our cloud platform can use.
- Designing, implementing, and tuning robust APIs and API framework-related features that perform and scale in a multi-tenant environment.
- Working in a Hybrid Engineering model and contributing to all phases of the SDLC, including design, implementation, code reviews, automation, and testing of features.
- Building efficient components/algorithms in a microservice multi-tenant SaaS cloud environment.
- Conducting code reviews, mentoring junior engineers, and providing technical guidance to the team (depending on the seniority level).

Required Skills:
- Mastery of multiple programming languages and platforms.
- 5+ years of backend software development experience, including designing and developing distributed systems at scale.
- Deep knowledge of object-oriented programming and other scripting languages: Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries.
- A deep understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).

Preferred Skills:
- Experience developing SaaS products over public cloud infrastructure: AWS/Azure/GCP.
- Experience with Big Data/ML and S3.
- Hands-on experience with streaming technologies like Kafka.
- Experience with Elastic Search.
- Experience with Terraform, Kubernetes, and Docker.
- Experience working in a high-paced and rapidly growing multinational organization.

Benefits & Perks:
- Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more.
- World-class enablement and on-demand training with Trailhead.com.
- Exposure to executive thought leaders and regular 1:1 coaching with leadership.
- Volunteer opportunities and participation in our 1:1:1 model for giving back to the community.

For more details, visit [Salesforce Benefits](https://www.salesforcebenefits.com/).
Posted 1 month ago
5.0 - 12.0 years
0 Lacs
haryana
On-site
As a GCP Data Developer specializing in Big Data and ETL, you will be an integral part of our technology services client's team in Bangalore and Gurugram. With 5-12 years of experience in the field, your role will involve creating and maintaining database standards and policies, managing database availability and performance, defining and implementing event triggers for performance or integrity issues, and carrying out database housekeeping tasks. Monitoring usage, transaction volumes, response times, and concurrency levels will also be among your responsibilities. If you find this opportunity compelling, please send your updated resume to hema.g@s3staff.com.
Posted 1 month ago
8.0 - 13.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Roles and Responsibilities:
- Lead Agile project management for data product initiatives, ensuring timely delivery and alignment with strategic business objectives.
- Collaborate with stakeholders across the organization to identify business needs and translate them into clear data product requirements and user stories.
- Facilitate Agile ceremonies (daily stand-ups, sprint planning, retrospectives) to maintain team focus and momentum.
- Manage and prioritize the product backlog in coordination with product owners and data experts to maximize value delivery.
- Ensure data quality, governance, and compliance standards are met throughout the product lifecycle.
- Foster cross-functional collaboration among data engineers, data scientists, analysts, and business teams to resolve impediments and steer delivery.
- Develop and maintain product roadmaps that reflect evolving business priorities and data capabilities.
- Track project progress using Agile metrics and provide transparent communication to stakeholders.
- Support continuous improvement by coaching the team on Agile best practices and adapting processes as needed.
- Define and track KPIs that transparently reflect the status of key initiatives.
- Direct a team of 5 or more people comprising leads, principals, etc., and indirectly coordinate with more people as part of cross-functional teams with varied roles and functions.

Required Skills and Experience:
- Bachelor's degree or above, preferably in Software Engineering.
- Strong understanding of Agile frameworks such as Scrum or Kanban, and experience facilitating Agile teams; should have experience leading all Agile ceremonies.
- Knowledge of data product management principles, including requirements definition, data quality, and governance.
- Excellent communication and stakeholder management skills to bridge technical and business perspectives.
- Strong business communication, presentation, and conflict management skills.
- Experience working with data professionals (data engineers, data scientists, data quality engineers) and understanding data pipelines.
- Proficiency with Agile project management tools like ADO, Jira, or equivalent.
- Ability to manage competing priorities and adapt plans based on feedback and changing requirements.
- Proficient in delivery and quality metrics, burn-down charts, and progress and status reporting.
- Knowledge of 2 or more effort and cost estimation methodologies/frameworks.
- Proficient in scope (requirements)/backlog management, quality management, defect prevention, and risks and issues management.

Nice to Have Qualities & Skills:
- Flexibility to learn and apply new methodologies.
- Mortgage industry experience/knowledge.
- Strong commercial acumen, i.e. understanding of pricing models, delivery P&L, budgeting, etc.
- Basic-level understanding of contracts.
- Relevant certifications such as Certified Scrum Master (CSM), PMP, Agile Project Management, etc.
- Knowledge of compliance frameworks like RESPA, TILA, CFPB, and data security standards.
- Knowledge of Azure Cloud.
Posted 1 month ago
3.0 - 7.0 years
3 - 8 Lacs
Chennai
Work from Office
Job Title: Senior Programmer - AI & Data Engineering
Location: Work from Office
Experience Required: 3+ Years
Job Type: Full-Time
Department: Technology / Engineering

Job Summary: We are seeking a highly skilled and motivated Senior Programmer with a strong background in AI development, Python programming, and data engineering. The ideal candidate will have hands-on experience with OpenAI models, machine learning, prompt engineering, and libraries such as NLTK, Pandas, and NumPy. You will work on developing intelligent systems, integrating APIs, and deploying scalable solutions using modern data and cloud technologies.

Key Responsibilities:
- Design, develop, and optimize intelligent applications using OpenAI APIs and machine learning models.
- Create and refine prompts (prompt engineering) to extract desired outputs from LLMs (Large Language Models).
- Build and maintain scalable, reusable, and secure REST APIs for AI and data applications.
- Work with large datasets using Pandas, NumPy, and SQL, and integrate text analytics using NLTK.
- Collaborate with cross-functional teams to understand requirements and translate them into technical solutions.
- Use the Function Framework to encapsulate business logic and automate workflows.
- Apply basic knowledge of cloud platforms (AWS, Azure, or GCP) for deployment and scaling.
- Assist in data integration, processing, and transformation for Big Data systems.
- Write clean, maintainable, and efficient Python code.
- Conduct code reviews, mentor junior developers, and lead small projects as needed.

Required Skills & Qualifications:
- Minimum 3 years of experience in Python development with a strong focus on AI and ML.
- Proven expertise in OpenAI tools and APIs.
- Hands-on experience with machine learning models and prompt engineering techniques.
- Solid programming skills in Python, along with libraries like Pandas, NumPy, and NLTK.
- Experience developing and integrating REST APIs.
- Working knowledge of SQL and relational database systems.
- Familiarity with function frameworks and modular design patterns.
- Basic understanding of cloud platforms (AWS/GCP/Azure) and Big Data concepts.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Exposure to Docker, Kubernetes, or similar container orchestration tools.
- Understanding of MLOps, data pipelines, or cloud-based AI deployments.
- Experience with version control systems like Git and CI/CD pipelines.
Posted 1 month ago