2.0 - 6.0 years
9 - 13 Lacs
Pune
Work from Office
Company Description: As a leading global investment management firm, AB fosters diverse perspectives and embraces innovation to help our clients navigate the uncertainty of capital markets. Through high-quality research and diversified investment services, we serve institutions, individuals, and private wealth clients in major markets worldwide. Our ambition is simple: to be our clients' most valued asset-management partner. Group Description: We are seeking a Pune-based senior quantitative researcher to join our buy-side Index and Derivative Solutions unit. The job will focus on research and development of tools and processes that support all aspects of the systematic investment process and research, working in tandem with our quantitative research analysts and portfolio managers. The candidate will leverage and contribute to our state-of-the-art options investment platform, including modules for simulation, portfolio construction, automatic trade routing, reporting, and integration utilities. Work will include quantitative analysis and modeling; portfolio construction and reporting; and building various dashboards and reports. All software is developed in Python and SQL. While most of the users and staff are in New York, the candidate will join a growing team in Pune and Nashville. Specific Responsibilities: Participate in development of the next-generation platform for options strategies. Assist in factor research and risk-management approaches unique to option investment strategies. Develop front-end tools to aid portfolio optimization, monitoring, trade building, and trade routing. Onboard data from various internal and external sources. Simplify, automate, and support existing manual processes. What makes this role unique or interesting? AB's investment units are making a significant investment in building out a platform to improve our ability to implement systematic investment strategies. This project is still in its early stages, and the candidate will be given significant opportunities to make ground-level contributions. The candidate will have an opportunity to work alongside an established team of developers, quantitative analysts, and portfolio managers to create a derivatives strategies platform encompassing quantitative modeling, portfolio construction, performance attribution, data consolidation, and quality control. Qualifications, Experience, Education: Undergraduate degree in Engineering (required); Master's in Engineering, Economics, or Finance (preferred). CFA/FRM (pursuing or completed) is a plus. Proficiency in Python and one other programming language (MATLAB preferred, or R/C++) is required. Strong understanding of statistical analysis and techniques (hypothesis testing, distributions, regression modeling) is required. Experience with machine learning and natural language processing methods is a strong advantage. Academic-level exposure to option pricing methods is required. High attention to detail, accuracy, and the ability to work independently and as part of a team. Experience in quantitative analysis with exposure to multiple asset classes (equities, fixed income, commodities, currencies) is a plus. Experience with large data sets and SQL for data processing. Knowledge of macroeconomics and broad capital markets (equities, credit, rates, volatility) is a plus. Experience in asset management and factor research is an advantage. Familiarity with financial databases and tools (FactSet, Bloomberg) is required.
Ability to quickly learn new tools, multi-task, and thrive in a fast-paced environment. Special Knowledge (if applicable): Nice to have: experience with Git/GitHub; experience with risk models/attribution tools such as BarraOne; experience with market data vendors (Bloomberg, QADirect, Barclays POINT, etc.); experience with machine learning or big data; experience working in the finance industry and demonstrable curiosity in quantitative research and investment. Location: Pune, India
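For candidates gauging the required "academic-level exposure to option pricing methods," a minimal sketch of the classic Black-Scholes call-price formula is shown below. The input values are hypothetical examples, not anything specific to AB's platform:

```python
# Illustrative Black-Scholes European call pricer.
# Inputs below are hypothetical example values.
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Call price: C = S*N(d1) - K*exp(-rT)*N(d2)."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Example: spot 100, strike 105, 6 months to expiry, 5% rate, 20% vol.
print(round(bs_call(100, 105, 0.5, 0.05, 0.20), 4))
```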
Posted 4 days ago
2.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Key Responsibilities: Big Data Architecture: Design, build, and implement scalable Big Data solutions to process and analyze vast datasets in a timely and efficient manner. Data Pipeline Development: Develop ETL (Extract, Transform, Load) pipelines for large-scale data processing. Ensure data pipelines are automated, scalable, and robust to handle high volumes of data. Distributed Systems: Work with distributed computing frameworks (e.g., Apache Hadoop, Apache Spark, Flink) to process big datasets across multiple systems and clusters. Data Integration: Integrate data from multiple sources (structured, semi-structured, and unstructured) into a unified data architecture.
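As an illustration of the ETL and distributed-processing work described above, a minimal PySpark batch pipeline might look like the following sketch; the bucket paths, column names, and aggregation logic are hypothetical placeholders, not a prescribed design:

```python
# Minimal PySpark ETL sketch: read raw events, clean and aggregate, write partitioned output.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Extract: load semi-structured JSON events from a landing zone.
raw = spark.read.json("s3a://example-bucket/landing/events/")

# Transform: drop malformed rows, normalize types, aggregate per user per day.
daily = (
    raw.filter(F.col("user_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("user_id", "event_date")
       .agg(F.count("*").alias("event_count"),
            F.sum("amount").alias("total_amount"))
)

# Load: write back as partitioned Parquet for downstream analytics.
daily.write.mode("overwrite").partitionBy("event_date") \
     .parquet("s3a://example-bucket/curated/daily_user_events/")

spark.stop()
```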
Posted 4 days ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Total Experience: 10+ Years. Relevant Experience: 10+ Years. Rate: 11000 INR/day. Interview Mode: one face-to-face (F2F) round is mandatory; kindly avoid candidates who cannot attend it. Candidates should be ready to join as a subcontractor. Profiles whose relevant and total years of experience do not match the above, or whose rate exceeds the rate card, will be rejected outright. Please treat this requirement as critical and share 2 quality profiles who are genuinely interested in the subcon role. Instructions for vendors: Go through the instructions carefully before submitting. Check the requirement clearly and do not send profiles based on keyword search alone. Confirm the availability and interest of the resource to join as a subcontractor. Submit profiles within the rate card. Do not submit ex-Infosys employee profiles, as we have a 6-month cooling period. We need only the top 1-2 quality profiles; avoid multiple mail threads on profile submission. ECMS Req #: 514780. Number of Openings: 1. Duration of Hiring: 12 Months. Relevant and Total Years of Experience: 10+. Detailed job description - Skill Set: Create, test, and implement enterprise-level apps with Snowflake. Design and implement features for identity and access management. Create authorization frameworks for better access control. Implement client query optimization and core security competencies, including encryption. Solve performance and scalability issues in the system. Handle transaction management with distributed data processing algorithms. Take ownership right from start to finish. Migrate solutions from on-premises setups to cloud-based platforms. Understand and implement the latest delivery approaches based on data architecture. Maintain project documentation and tracking based on an understanding of user requirements. Perform data integration with third-party tools, including the architecting, designing, coding, and testing phases. Manage documentation of data models, architecture, and maintenance processes. Continually review and audit data models for enhancement. Handle performance tuning, user acceptance training, and application support. Maintain confidentiality of data. Prepare risk assessment, management, and mitigation plans. Engage regularly with teams for status reporting and routine activities. Carry out migration activities from one database to another or from on-premises to cloud. Mandatory Skills (only 2 or 3): Snowflake Developer. Vendor Billing Range (per day, local currency): 11000 INR/day. Work Location: Chennai, Hyderabad, Bangalore, or Mysore (Infosys location). WFO/WFH/Hybrid: WFO. Working in shifts outside standard daylight hours (to avoid confusion post-onboarding): No. Mode of Interview: F2F. BG Check Before or After Onboarding: Post Onboarding.
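To make the "authorization frameworks for better access control" item concrete, here is a minimal sketch of creating a read-only Snowflake role through the snowflake-connector-python client; the account, warehouse, database, schema, and user names are hypothetical placeholders:

```python
# Sketch of a Snowflake authorization step: create a read-only role and grant
# it scoped access. All names and credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_admin", password="***", account="example-account"
)
cur = conn.cursor()

statements = [
    "CREATE ROLE IF NOT EXISTS reporting_reader",
    "GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE reporting_reader",
    "GRANT USAGE ON DATABASE analytics TO ROLE reporting_reader",
    "GRANT USAGE ON SCHEMA analytics.curated TO ROLE reporting_reader",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE reporting_reader",
    "GRANT ROLE reporting_reader TO USER report_user",
]
for sql in statements:
    cur.execute(sql)  # CREATE ... IF NOT EXISTS and GRANT are safe to re-run

cur.close()
conn.close()
```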
Posted 4 days ago
6.0 - 8.0 years
30 - 35 Lacs
Bengaluru
Work from Office
The Professional designs, builds, and maintains moderately complex data systems that enable data analysis and reporting. With limited supervision, this job collaborates to ensure that large sets of data are efficiently processed and made accessible for decision-making. This includes designing and implementing scalable data pipelines, optimizing workflows, and ensuring data quality and integrity to support enterprise applications and analytics. The role also involves working with a wide range of tools and technologies including Snowflake, Python, PL/SQL, and Power BI to deliver robust and efficient data solutions. Key Accountabilities: DATA & ANALYTICAL SOLUTIONS: Designs and implements scalable and robust data engineering solutions using advanced technologies and cloud platforms. Applies best practices to ensure sustainability and performance of data systems. Demonstrates strong problem-solving abilities and delivers practical solutions in fast-paced environments. DATA PIPELINES: Builds and maintains efficient ETL processes and data pipelines for batch and streaming data, enabling seamless ingestion and transformation of large datasets. Proficient in handling big data using tools such as Hadoop, HDFS, MapReduce, and Hive, as well as Snowflake tools like Snowpark, SnowPy, etc. DATA SYSTEMS: Reviews and optimizes backend systems and data architectures to improve performance and reliability of enterprise data solutions. Applies performance optimization techniques to enhance data workflows. DATA INFRASTRUCTURE: Prepares and manages infrastructure components to support efficient data storage, retrieval, and processing. Ensures data quality and integrity across systems. DATA FORMATS: Implements appropriate data formats and structures to enhance usability and accessibility across analytics and reporting platforms. STAKEHOLDER MANAGEMENT: Collaborates with cross-functional teams and business stakeholders to gather requirements and deliver data solutions aligned with business needs. Known for effective communication and teamwork. DATA FRAMEWORKS: Develops prototypes and implements data engineering frameworks to support analytics initiatives and improve data processing capabilities. Leads strategic initiatives to enhance data engineering practices. AUTOMATED DEPLOYMENT PIPELINES: Implements automated deployment pipelines to streamline code releases and ensure governance and compliance. DATA MODELING: Performs data modeling aligned with technologies such as Snowflake and PL/SQL to ensure performance, scalability, and accessibility. Experienced in data design, development, and documentation. Qualifications: Minimum requirement of 6-8 years of relevant work experience.
Posted 4 days ago
2.0 - 4.0 years
20 - 25 Lacs
Bengaluru
Work from Office
In HPE Hybrid Cloud, we lead the innovation agenda and technology roadmap for all of HPE. This includes managing the design, development, and product portfolio of our next-generation cloud platform, GreenLake. Working with customers, we help them reimagine their information technology needs to deliver a simple, consumable solution that helps them drive their business results. Join us to redefine what's next for you. Job Family Definition: The Cloud Developer builds from the ground up to meet the needs of mission-critical applications, and is always looking for innovative approaches to deliver end-to-end technical solutions to solve customer problems. Brings technical thinking to break down complex data and to engineer new ideas and methods for solving, prototyping, designing, and implementing cloud-based solutions. Collaborates with project managers and development partners to ensure effective and efficient delivery, deployment, operation, monitoring, and support of Cloud engagements. The Cloud Developer provides business value expertise to drive the development of innovative service offerings that enrich HPE's Cloud Services portfolio across multiple systems, platforms, and applications. Management Level Definition: Contributions include applying an intermediate level of subject matter expertise to solve common technical problems. Acts as an informed team member providing analysis of information and recommendations for appropriate action. Works independently within an established framework and with moderate supervision. What you will do: Designs simple to moderate cloud application features as per specifications. Develops and maintains cloud application modules adhering to security policies. Designs test plans; develops, executes, and automates test cases for assigned portions of the developed code. Deploys code and troubleshoots issues in application modules and the deployment environment. Shares and reviews innovative technical ideas with peers, high-level technical contributors, technical writers, and managers. Analyzes science, engineering, business, and other data processing problems to develop and implement solutions to complex application problems, system administration issues, or network concerns. What you will need: Bachelor's degree in computer science, engineering, information systems, or a closely related quantitative discipline; Master's desirable. Typically 2-4 years of experience. Knowledge and Skills: Strong programming skills in Python or Golang. Expertise in development of microservices and deploying them in a Kubernetes environment. Understanding and work experience in GitOps, DevOps, CI/CD tooling, concepts of package management software, software deployment, and life cycle management. Experience in architecting software deployments, scripting, deployment tools like Chef, Puppet, and Ansible, and orchestration tools like Terraform. Good to have: Enterprise Data Center infrastructure knowledge (servers, storage, networking). Experience with design methodologies, cloud-native applications, developer tools, managed services, and next-generation databases. Good written and verbal communication skills. Ability to quickly learn new skills and technologies and work well with other team members. Understanding of DevOps practices like continuous integration/deployment and orchestration with Kubernetes.
Additional Skills: Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Release Management, Security-First Mindset, User Experience (UX). What We Can Offer You: Health & Wellbeing - We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial, and emotional wellbeing. Personal & Professional Development - We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.
Posted 4 days ago
0.0 - 1.0 years
2 - 3 Lacs
Chennai
Work from Office
You make sound judgments and promote an associate- and candidate-focused environment. You optimize execution and results. You inspire commitment through communication and influence. You demonstrate adaptability while thinking and acting strategically. You build and sustain internal and external relationships. Flexible to work in US-hours shifts. What you'll do: As a Customer Care Senior Resolution Coordinator, you will take a high volume of incoming calls, chats, and emails from customers, stores, and associates while navigating multiple systems to aid in answering questions and resolving issues. All Customer Care Coordinators must have the ability to communicate professionally in a conversational manner while utilizing all available resources to ensure customer satisfaction. To exceed our customers' needs, our associates must be punctual and reliable, solve problems, act with integrity, and be dedicated to making a difference. What you'll bring: 0-12 months of relevant customer service experience. Excellent written and verbal communication skills. Able to interact professionally with customers. Ability to manage multiple tasks simultaneously. Customer-focused mindset with a high level of urgency; a role model for delivering Extraordinary Customer Care. In this role, you may be asked to switch between any support channel of phone, chat, and email based on business requirements. Review, analyze, and process critical customer queries with accuracy to provide customer satisfaction. Adhere to quality and compliance guidelines and SLAs. Must be willing to take continuous voice calls. Must type a minimum of 25 WPM. Proficient with Microsoft Office programs (Outlook, Word). Successful completion of mandatory training. Should be flexible to work in a 24/7 environment with rotating weekly time off. Should be able to work in permanent night shifts or any assigned shifts on a rotational basis. Any graduation. Preferred Qualifications... basic computer processing/data entry software
Posted 4 days ago
5.0 - 10.0 years
4 - 8 Lacs
Mohali
Work from Office
We are seeking a detail-oriented and technically skilled Assistant Project Manager (Technical / CIR) to support our Document-Based Review (DBR) projects. The ideal candidate will have a legal support or technical background and be comfortable using automation and scripting tools to streamline large-scale data processes. This is an on-site position based in Chandigarh/Mohali, and candidates must be willing to work from the office daily. Key Responsibilities: - Execute PDF automation scripts to extract structured data from large files, including OCR-based extraction from scanned PDFs. - Ensure seamless data transfer to Excel, applying formatting and logic as per project needs. - Use Python, MySQL, and Excel to create and run data deduplication scripts, cleaning datasets and removing duplicates. - Perform data grouping and classification tasks, particularly on platforms other than iConect. - Manage Excel consolidation and data normalization, conducting all standard data integrity checks in line with project specifications. - Automate daily task reports using Python to provide insights on resource allocation and productivity. Requirements: Experience: 5-10 years in a legal support, data processing, or technical project role. - Technical Skills: o Proficiency in Python, MySQL, and Excel o Experience with OCR tools, SharePoint, and automation scripts - Soft Skills: o Excellent verbal and written communication in English o Strong attention to detail and problem-solving ability
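As a rough illustration of the deduplication work described above, a minimal pandas sketch might look like this; the connection string, table, and column names are hypothetical placeholders:

```python
# Minimal deduplication sketch: pull review records from MySQL, normalize key
# fields, drop duplicates, and export to Excel. Names below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://user:***@localhost/dbr_project")
df = pd.read_sql("SELECT doc_id, custodian, file_name, doc_date FROM documents", engine)

# Normalize before comparing so 'Report.PDF ' and 'report.pdf' match.
for col in ("custodian", "file_name"):
    df[col] = df[col].str.strip().str.lower()

deduped = df.drop_duplicates(subset=["custodian", "file_name", "doc_date"])
deduped.to_excel("deduped_documents.xlsx", index=False)
print(f"Removed {len(df) - len(deduped)} duplicate rows")
```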
Posted 4 days ago
0.0 - 1.0 years
1 - 3 Lacs
Kolkata
Work from Office
Responsibilities: * Manage back office operations with focus on data entry & processing * Provide clerical support for banking processes * Execute non-voice tasks efficiently * Maintain accurate records through typing skills Contact- 7003551682(HR)
Posted 4 days ago
2.0 - 5.0 years
4 - 7 Lacs
Chennai
Work from Office
The ideal candidate will have a strong background in data engineering and excellent problem-solving skills. Roles and Responsibilities: Design and develop large-scale data pipelines and architectures to support business intelligence and analytics. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain complex data models and databases to ensure data integrity and consistency. Implement data quality checks and validation processes to ensure accuracy and reliability. Optimize data processing workflows to improve performance and efficiency. Troubleshoot and resolve technical issues related to data engineering projects. Job Requirements: Strong understanding of data engineering principles and practices. Experience with data modeling, database design, and development. Proficiency in programming languages such as Python or Java. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills. Familiarity with industry-standard tools and technologies used in data engineering. Educational qualification: Any Graduate.
Posted 4 days ago
3.0 - 8.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Are you interested in building high-performance, globally scalable financial systems that support Amazon's current and future growth? Are you seeking an environment where you can drive innovation, leveraging the scalability of Amazon's AWS cloud services? Do you have a passion for ensuring a positive customer experience? This is the job for you. Amazon's Finance Technology organization (FinTech) is responsible for building and maintaining the critical finance technology applications that enable new business growth, ensure compliance with financial and tax reporting obligations, and provide deep analysis of Amazon's financial data. This function is of paramount importance to the company, as it underpins Amazon's ability to effectively manage its finances and drive continued expansion. At the heart of FinTech's mission is the General Ledger team, which builds and operates the technologies to account for and post millions of financial transactions daily to support accurate internal and external financial reporting. This team processes on average 371MM+ transactions per month, servicing the accounting needs of Finance, Accounting, and Tax teams worldwide. The work of the General Ledger team is absolutely essential to meeting Amazon's critical close timelines and maintaining the integrity of the company's financial data. Amazon's Financial Technology team is looking for a results-oriented, driven software development engineer who can help us create the next generation of distributed, scalable financial systems. Our ideal candidate thrives in a fast-paced environment and enjoys the challenge of highly complex business contexts that are typically being defined in real time. We need someone to design and develop services that facilitate global financial transactions worth billions (USD) annually. This is a unique opportunity to be part of a mission-critical initiative with significant organizational visibility and impact. Design Foundational Greenfield Services: You will collaborate with your team to architect and implement the core services that will form the backbone of this new accounting software. Your technical expertise and innovative thinking will be instrumental in ensuring the foundational services are designed with scalability, reliability, and performance in mind for Amazon. Adopting the Latest Technology: You will have the chance to work with the latest technologies, frameworks, and tools to build these foundational services. This includes leveraging advancements in areas such as cloud computing, distributed systems, data processing, and real-time analytics. Solving High-Scale Processing Challenges: This project will involve handling millions of transactions per day, presenting you with the unique challenge of designing and implementing robust, high-performance solutions that can handle this volume efficiently. You will be challenged to tackle complex problems related to data processing, queuing, and real-time analytics. Cross-Functional and Senior Engineer Collaboration: You will work closely with cross-functional teams, including product managers, data engineers, and accountants. You will also work directly with multiple Principal Engineers and present your work to Senior Principal Engineers. This experience will give you the opportunities and visibility to help build the leadership skills needed to advance your career. Define high-level and low-level designs for software solutions using the latest AWS technology in a large distributed environment.
Take the lead on defining and implementing engineering best practices and using data to define and improve operational best practices. Help drive the architecture and technology choices for FinTech accounting products. Design, develop, and deploy medium to large software solutions for Amazon accounting needs. Raise the bar on code quality, including security, readability, consistency, and maintainability. 3+ years of non-internship professional software development experience. 2+ years of non-internship design or architecture (design patterns, reliability, and scaling) experience with new and existing systems. Experience programming with at least one software programming language. Bachelor's degree. 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations. Bachelor's degree in computer science or equivalent.
Posted 4 days ago
5.0 - 10.0 years
11 - 15 Lacs
Pune
Work from Office
Support Specialist at N Consulting Ltd. Job Title: Support Specialist - Eagle Platform (Portfolio Management). Location: Riyadh, Saudi Arabia. Type: Full-time / Contract. Industry: Banking / Investment Management / FinTech. Experience Required: 5+ years. We are seeking a highly skilled Support Specialist with hands-on experience working on BNY Mellon's Eagle Investment Systems, particularly the Eagle STAR, PACE, and ACCESS modules used for portfolio accounting, data management, and performance reporting. The ideal candidate will have supported the platform in banking or asset management environments, preferably with experience at Bank of America, BNY Mellon, or institutions using Eagle for middle- and back-office operations. Key Responsibilities: Provide day-to-day technical and functional support for the Eagle Platform, including the STAR, PACE, and Performance modules. Troubleshoot and resolve user issues related to portfolio accounting, performance calculation, and reporting. Act as a liaison between business users and technical teams for change requests, data corrections, and custom reports. Monitor batch jobs, data feeds (security, pricing, transaction data), and system interfaces. Work closely with front-office, middle-office, and operations teams to ensure accurate data processing and reporting. Manage SLA-driven incident resolution and maintain support documentation. Support data migrations, upgrades, and new release rollouts of Eagle components. Engage in root cause analysis and implement preventive measures. Required Skills and Experience: 5+ years of experience in financial systems support, with a strong focus on Eagle Investment Systems. Strong knowledge of portfolio management processes, NAV calculations, and financial instruments (equities, fixed income, derivatives). Prior work experience at Bank of America, BNY Mellon, or with asset managers using Eagle is highly preferred. Proficiency in SQL and ETL tools, and an understanding of data architecture in financial environments. Familiarity with upstream/downstream systems such as Bloomberg, Aladdin, or CRD is a plus. Strong analytical skills and attention to detail. Excellent communication skills in English (Arabic is a plus). Preferred Qualifications: Bachelor's degree in Computer Science, Finance, or a related field. ITIL Foundation or similar certification in service management. Prior experience working in a banking or asset management firm in the GCC is a bonus.
Posted 4 days ago
6.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
No. of years of experience: 8-10 years. Detailed job description - Skill Set: 6+ years of industry experience, including cloud technologies. Very strong hands-on experience in Databricks with AWS cloud services for data engineering/data processing. Hands-on experience in AWS cloud-based development and integration. Proficiency in Scala and the Spark DataFrame API for data processing and application development. Practical experience with data engineering and data ingestion/orchestration with Apache Airflow, and the accompanying DevOps with CI/CD tools. Strong knowledge of Spark and Databricks SQL for data engineering pipelines. Experience in the offshore/onshore model and agile methodology. Gathering requirements, understanding the business need, and holding regular discussions with the tech team on design and development activities. Should have good experience working with the client architect/design team to understand the architecture and requirements and work on development. Experience working in the financial industry. Certification in Databricks and AWS will be an added advantage. Mandatory Skills: Databricks, AWS, Scala, Spark DataFrame, Spark SQL, Apache Airflow, SQL
Posted 4 days ago
8.0 - 13.0 years
10 - 15 Lacs
Chennai
Work from Office
Overview: Key Responsibilities: Lead the GDPR implementation and compliance efforts across the organization. Conduct thorough GDPR risk assessments and impact analyses. Develop and implement GDPR compliance policies and procedures. Train and educate staff on GDPR requirements and best practices. Liaise with internal and external stakeholders to ensure compliance. Monitor data protection compliance and data processing activities. Manage data breach response and reporting procedures. Perform regular audits to ensure ongoing GDPR compliance. Stay updated with the latest developments in data protection laws and regulations. Collaborate with IT and security teams to ensure the implementation of technical safeguards. Document and maintain records of processing activities. Qualifications: Bachelor's degree in Law, Information Technology, Business, or a related field. A master's degree is preferred. At least 10 years of experience in data protection, privacy laws, or GDPR compliance. Professional certification such as CIPP/E, CIPM, or similar is mandatory. Strong analytical and problem-solving skills. Excellent communication and interpersonal skills. Ability to work independently and manage multiple projects simultaneously. Experience with data protection impact assessments, data protection principles, and data breach management. Skills and Competencies: Deep understanding of data protection and privacy regulations along with project management skills. Ability to communicate complex regulatory requirements in a clear and concise manner. Strong ethical standards and commitment to privacy and data protection. Proficiency in using data protection management tools and software. Preferred Experience: Experience working in a similar role within a multinational organization. Familiarity with other global data protection regulations such as CCPA, HIPAA, APPI, DPDPA, POPIA, PIPEDA, etc. Experience in a legal advisory role or as a Data Protection Officer (DPO).
Posted 4 days ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job Description: We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization. Key Responsibilities: Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks Notebooks, Jobs, and Workflows. Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability. Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data. Implement distributed data processing solutions using Apache Spark / PySpark for large-scale data transformation. Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured. Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem. Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure. Conduct code reviews, define coding standards, and promote engineering excellence across the team. Mentor and guide junior data engineers, fostering a culture of technical growth and innovation. Qualifications / Requirements: 8+ years of experience in data engineering with proven leadership in managing data projects and teams. Expertise in Python
Posted 4 days ago
8.0 - 12.0 years
16 - 18 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Scientist. Location: Bangalore. Experience: 8-12 Years. Job Summary: We are seeking a highly skilled and experienced Senior Data Scientist to join our team in Bangalore. The ideal candidate will have a deep understanding of Machine Learning (ML) and Artificial Intelligence (AI), with a strong focus on Azure Fabric within the banking and finance domain. In this role, you will develop and implement advanced data-driven solutions that enhance decision-making, optimise processes, and contribute to the success of our client's financial objectives. Mandatory Skills: Proven experience in traditional Machine Learning (ML) and Artificial Intelligence (AI). Strong experience in Azure Fabric and its integration with various banking systems. Expertise in Data Science methodologies, predictive modelling, and statistical analysis. Solid understanding of the finance domain with a focus on banking processes and challenges. Hands-on experience with big data technologies and cloud platforms (Azure, AWS). Proficiency in Python data science libraries (e.g., Pandas, NumPy, Scikit-learn). Experience in data processing, ETL pipelines, and data engineering. Familiarity with SQL and NoSQL databases. Key Responsibilities: Design and implement Machine Learning (ML) and Artificial Intelligence (AI) models to solve complex business problems in the finance sector. Work closely with business stakeholders to understand requirements and translate them into data-driven solutions. Develop and deploy ML models on Azure Fabric, ensuring their scalability and efficiency. Analyze large datasets to identify trends, patterns, and insights to support decision-making. Collaborate with cross-functional teams to integrate AI/ML solutions into business processes and banking systems. Maintain and optimise deployed models and ensure their continuous performance. Keep up to date with industry trends, technologies, and best practices in AI and ML, specifically within the finance industry. Qualifications: Education: Bachelor's/Master's degree in Computer Science, Data Science, Engineering, or a related field. Certifications: Relevant certifications in Data Science, Azure AI, or Machine Learning are a plus. Technical Skills: Expertise in Machine Learning (ML) algorithms (supervised and unsupervised). Strong experience with Azure Fabric and related Azure cloud services. Proficient in Python, R, and data science libraries (Pandas, Scikit-learn, TensorFlow). Experience in AI and Deep Learning models, including neural networks. Working knowledge of big data technologies such as Spark, Hadoop, and Databricks. Experience with version control systems (Git, GitHub, etc.). Soft Skills: Excellent problem-solving and analytical skills. Strong communication skills, with the ability to present complex data insights clearly to non-technical stakeholders. Ability to work effectively in a collaborative, cross-functional environment. Strong attention to detail and ability to manage multiple tasks simultaneously. A passion for continuous learning and staying updated on new technologies. Experience in the banking or financial services industry. Familiarity with DevOps practices for ML/AI model deployment. Knowledge of cloud-native architecture and containerization (Docker, Kubernetes). Familiarity with Deep Learning and Natural Language Processing (NLP) techniques. 8-12 years of experience in Data Science, with hands-on experience in ML and AI and work within the finance or banking industry.
Proven track record of designing and deploying machine learning models and working with Azure Fabric. Experience with client-facing roles and delivering solutions that impact business decision-making. Benefits: Competitive salary and annual performance-based bonuses. Comprehensive health and optional parental insurance. Retirement savings plans and tax savings plans. Work-life balance: flexible work hours. Key performance indicators: Timely and effective delivery of ML/AI models that solve complex business problems. Continuous improvement and optimisation of deployed models. High-quality insights and data-driven solutions delivered for business stakeholders. Client satisfaction with AI/ML solutions implemented within the banking domain. Number of successful ML/AI models deployed and their performance post-deployment. Model accuracy and predictive capability (based on business goals). Client feedback on AI-driven solutions. Completion time for delivering actionable data-driven insights. Team collaboration and mentoring effectiveness with junior data scientists.
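For a flavor of the predictive-modelling work this role centres on, here is a toy supervised-learning sketch with scikit-learn; the synthetic features stand in for real banking data, which is not shown here:

```python
# Toy supervised-learning sketch for a banking-style classification task
# (e.g., default risk). Feature semantics and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))  # e.g., balance, utilization, tenure, income
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)

# Evaluate ranking quality on the held-out set.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.3f}")
```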
Posted 4 days ago
2.0 - 4.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Cloud Developer. This role has been designed as Onsite, with an expectation that you will primarily work from an HPE office. Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE. Job Description: In HPE Hybrid Cloud, we lead the innovation agenda and technology roadmap for all of HPE. This includes managing the design, development, and product portfolio of our next-generation cloud platform, GreenLake. Working with customers, we help them reimagine their information technology needs to deliver a simple, consumable solution that helps them drive their business results. Join us to redefine what's next for you. Job Family Definition: The Cloud Developer builds from the ground up to meet the needs of mission-critical applications, and is always looking for innovative approaches to deliver end-to-end technical solutions to solve customer problems. Brings technical thinking to break down complex data and to engineer new ideas and methods for solving, prototyping, designing, and implementing cloud-based solutions. Collaborates with project managers and development partners to ensure effective and efficient delivery, deployment, operation, monitoring, and support of Cloud engagements. The Cloud Developer provides business value expertise to drive the development of innovative service offerings that enrich HPE's Cloud Services portfolio across multiple systems, platforms, and applications. Management Level Definition: Contributions include applying an intermediate level of subject matter expertise to solve common technical problems. Acts as an informed team member providing analysis of information and recommendations for appropriate action. Works independently within an established framework and with moderate supervision. What you will do: Designs simple to moderate cloud application features as per specifications. Develops and maintains cloud application modules adhering to security policies. Designs test plans; develops, executes, and automates test cases for assigned portions of the developed code. Deploys code and troubleshoots issues in application modules and the deployment environment. Shares and reviews innovative technical ideas with peers, high-level technical contributors, technical writers, and managers. Analyzes science, engineering, business, and other data processing problems to develop and implement solutions to complex application problems, system administration issues, or network concerns. What you will need: Bachelor's degree in computer science, engineering, information systems, or a closely related quantitative discipline; Master's desirable. Typically 2-4 years of experience. Knowledge and Skills: Strong programming skills in Python or Golang.
Expertise in development of microservices and deploying them in a Kubernetes environment. Understanding and work experience in GitOps, DevOps, CI/CD tooling, concepts of package management software, software deployment, and life cycle management. Experience in architecting software deployments, scripting, deployment tools like Chef, Puppet, and Ansible, and orchestration tools like Terraform. Good to have: Enterprise Data Center infrastructure knowledge (servers, storage, networking). Experience with design methodologies, cloud-native applications, developer tools, managed services, and next-generation databases. Good written and verbal communication skills. Ability to quickly learn new skills and technologies and work well with other team members. Understanding of DevOps practices like continuous integration/deployment and orchestration with Kubernetes. Additional Skills: Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Release Management, Security-First Mindset, User Experience (UX). What We Can Offer You: Health & Wellbeing - We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial, and emotional wellbeing. Personal & Professional Development - We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division. Unconditional Inclusion - We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture, and tech at HPE. #india #hybridcloud Job: Engineering Job Level: TCP_02 HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
Posted 4 days ago
1.0 - 5.0 years
15 - 17 Lacs
Mumbai
Work from Office
Your mission: As a Software Engineer II - AI, you will be at the forefront of building agent-based systems and intelligent pipelines that combine structured geospatial data with unstructured content. You'll collaborate with AI researchers, data scientists, and product engineers to deliver scalable, high-performing solutions. If you thrive on pushing the boundaries of what AI can do in the real world, especially in dynamic environments like map making, this is the role for you. Your Tasks: Design and develop agentic systems using Python and frameworks like LangChain or Haystack. Build and optimize scalable Retrieval-Augmented Generation (RAG) pipelines using LLMs and vector stores. Integrate AI reasoning engines with data sources (SQL, NoSQL, REST APIs, file systems). Enhance system observability and monitoring for intelligent agents and workflows. Implement testable, modular, and well-documented code with a focus on production-readiness. Collaborate with ML and backend engineers to tune performance and cost-efficiency. Stay up to date with the latest developments in LLMs, multi-agent systems, and semantic retrieval. What you should bring along: Bachelor's or Master's in Computer Science, Artificial Intelligence, or equivalent. 3+ years of experience in software development, with at least 2+ years in AI/ML systems. Strong programming skills in Python, including async and multiprocessing capabilities. Deep understanding of LLM integration patterns (OpenAI, Hugging Face, etc.). Experience building scalable RAG architectures with vector databases like FAISS, Weaviate, or Pinecone. Familiarity with prompt engineering, semantic search, and knowledge graphs. Proficiency in designing backend services with RESTful APIs and microservices. Working knowledge of containerization (Docker), CI/CD pipelines, and cloud platforms (preferably AWS). Excellent communication and documentation skills. What you will work on: Develop, extend, and maintain AI-powered software products in an innovative and iteratively growing environment. Implement tools to enhance both automated and semi-automated map data processing, combining backend/service-based software stacks and AI-based agentic workflows. Build dashboards or monitoring systems to visualize agent reasoning and RAG system metrics.
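To make the RAG responsibility concrete, here is a minimal retrieval-step sketch using FAISS; the embed function is a random-vector stand-in for a real embedding model, and the documents are invented map-style snippets:

```python
# Minimal retrieval step of a RAG pipeline using FAISS. `embed` is a stand-in
# for a real embedding model (e.g., a Hugging Face sentence encoder); here it
# returns random vectors purely so the sketch runs end to end.
import numpy as np
import faiss

DIM = 384
rng = np.random.default_rng(0)

def embed(texts):
    """Placeholder embedding function; swap in a real sentence-embedding model."""
    return rng.normal(size=(len(texts), DIM)).astype("float32")

documents = [
    "road segment closed for repairs",
    "new roundabout added at junction 12",
    "speed limit changed to 50 km/h",
]
index = faiss.IndexFlatL2(DIM)  # exact L2 search over document vectors
index.add(embed(documents))

# Retrieve the top-2 documents for a query; these would then be passed
# to an LLM as grounding context in a full RAG pipeline.
distances, ids = index.search(embed(["which junctions changed?"]), 2)
context = [documents[i] for i in ids[0]]
print(context)
```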
Posted 4 days ago
2.0 - 8.0 years
18 - 20 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate. Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities: This role is accountable for: Designing and Implementing Integrations - Build, deploy, and maintain integrations using cloud-native services (AWS, GCP) and middleware tools. API and Event-Driven Architectures - Develop and manage APIs, API gateways, and Kafka-based event streaming solutions for real-time data processing. Scalability and Performance Optimization - Ensure integrations are robust, secure, and optimized for performance and cost-effectiveness. Monitoring and Troubleshooting - Proactively identify and resolve integration failures, ensuring system reliability and minimal downtime. Collaboration and Documentation - Work with cross-functional teams to understand integration needs and maintain clear technical documentation. Mandatory skill sets (must-have knowledge, skills, and experience): Strong understanding of CI/CD pipelines and Infrastructure-as-Code principles such as Terraform. Experience with CI/CD tooling such as GitHub, Jenkins, Codefresh, Docker, and Kubernetes. Experienced in building RESTful APIs using Java (Spring Boot). Experienced in the AWS development environment and ecosystem. Cloud-native and digital solutioning leveraging emerging technologies, including containers, serverless, data, APIs, and microservices. Experience with measuring, analysing, monitoring, and optimizing cloud performance, including cloud system reliability and availability. Understanding of storage solutions, networking, and security. Strong familiarity with cloud-platform-specific Well-Architected Frameworks. Production experience of running services in Kubernetes. Preferred skill sets (good-to-have knowledge, skills, and experience): Multi-Cloud Experience - Exposure to both AWS and GCP integration services and the ability to work across different cloud providers.
Message Queue Systems - Experience with messaging systems like RabbitMQ, ActiveMQ, or Google Cloud Pub/Sub. Serverless Architectures - Hands-on experience with serverless computing and event-driven workflows. Observability and Logging - Knowledge of monitoring and logging tools such as Prometheus, Grafana, AWS CloudWatch, or GCP Stackdriver. Security Best Practices - Understanding of cloud security, encryption, and compliance frameworks (e.g., IAM policies, SOC 2, GDPR). Testing and Automation - Knowledge of API testing frameworks (Postman, Karate, or REST Assured) and automation testing for integrations. Kubernetes and Containerization - Experience deploying and managing integrations in containerized environments using Kubernetes (EKS/GKE). Networking Concepts - Understanding of VPCs, private links, service mesh, and hybrid cloud connectivity. Knowledge of ANZ Retail and Commercial Banking Systems - Familiarity with ANZ banking technology ecosystems, core banking platforms, payment systems, and regulatory requirements. Years of experience required: 7 to 8 years (2-3 years relevant). Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above). Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering. Required Skills: Power BI. Other skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis (+ 16 more)
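As an illustration of the Kafka-based event streaming named in the responsibilities, a minimal producer/consumer sketch with the kafka-python client might look like this; the broker address, topic name, and payload are hypothetical:

```python
# Minimal Kafka event-streaming sketch using the kafka-python client.
# Broker address, topic, and payload are hypothetical placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

# Producer: publish a payment event as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("payments", {"txn_id": "T1001", "amount": 250.0, "currency": "AUD"})
producer.flush()

# Consumer: read events from the same topic for downstream processing.
consumer = KafkaConsumer(
    "payments",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # real-time enrichment / routing would go here
    break                 # stop after one message in this sketch
consumer.close()
```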
Posted 4 days ago
1.0 - 4.0 years
3 - 6 Lacs
Gurugram
Work from Office
About this role: BlackRock is seeking a highly skilled and motivated Analyst to support its growing and dynamic Client Data function! In this role, you will be responsible for driving the accuracy, quality, and consistent use of the most impactful, globally relevant data fields, facilitating scale and efficiency across BLK's global sales and service ecosystem. You will work closely with cross-functional teams, including business stakeholders and technical teams for Client Data, to establish standards for the entry and maintenance of client data, implement exception monitoring to identify data inconsistencies, and complete high-risk updates where required. At BlackRock, we are dedicated to encouraging an inclusive environment where every team member can thrive and contribute to our world-class success. This is your chance to be part of a firm that is not only ambitious but also committed to delivering flawless and proven investment strategies. Key Responsibilities: As a Data Analyst, you will play a pivotal role in ensuring the accuracy and efficiency of our client data. Your responsibilities will include: Data Governance & Quality: Monitor data health and integrity, and ensure data products meet strict standards for accuracy, completeness, and consistency. Conduct regular assessments to identify deficiencies and opportunities for improvement. Data Management: Maintain, cleanse, and update records within the Client Relationship Management systems. This may include researching information across a variety of data sources, working with internal client support groups to create data structures that mimic client asset pools, and connecting client information across data sources. Process Improvement and Efficiency: Identify and complete process improvements from initial ideation to implementation. Collaborate with cross-functional teams (product managers, engineers, and business stakeholders) to plan, design, and deliver data products. Quality Assurance: Collaborate with teams to test new CRM features, ensuring tools function accurately and identifying defects for resolution. Collaboration & Communication: Prioritize effectively with various collaborators across BlackRock. Ensure efficient and timely data governance and maintenance in an agile environment. Qualifications & Requirements: We seek candidates who are ambitious, diligent, and have a proven track record in data management.
The ideal candidate will possess the following qualifications: Experience: MBA or equivalent experience required; a major in Business, Finance, MIS, Computer Science, or related fields preferred. 1 to 4 years of experience in data management or data processing. Financial services industry experience is a plus but not required. Skills and Qualifications: Proficiency in SQL; Python experience a plus. Proficiency in data management and reporting tools and technologies such as Power BI a plus. Experience with business applications including Excel and PowerPoint. Experience working with CRM platforms; Microsoft Dynamics experience a plus. Organized and detail-oriented with strong time management skills. Self-motivated with a strong focus on service and the ability to liaise with many groups across the company. Excellent online research skills. Exceptional written and verbal communication skills. Our benefits: To help you stay energized, engaged, and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents, and Flexible Time Off (FTO) so you can relax, recharge, and be there for the people you care about. About BlackRock: This mission would not be possible without our smartest investment - the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued, and supported with networks, benefits, and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation, and other protected attributes at law. #EarlyCareers
Posted 4 days ago
2.0 - 3.0 years
4 - 5 Lacs
Mumbai
Work from Office
Job Overview
We are looking for an experienced Data Engineer with at least 2-3 years of hands-on experience in designing and maintaining scalable data infrastructure. You will work with cross-functional teams to ensure high-quality, accessible, and well-structured data systems that support business intelligence, analytics, and other data-driven needs.
Key Responsibilities
Design, develop, and maintain robust ETL/ELT pipelines for ingesting, transforming, and loading data.
Build and manage scalable data architectures on cloud platforms such as AWS, GCP, or Azure.
Ensure data quality, integrity, and consistency through validation, auditing, and monitoring.
Collaborate with analytics and product teams to gather data requirements and deliver optimized data sets.
Implement and manage data lakes and data warehouses using tools like Redshift, BigQuery, Azure Synapse, or Snowflake.
Develop automated workflows using orchestration tools like Apache Airflow or AWS Step Functions (a minimal sketch follows this listing).
Optimize data processing workflows for performance and cost-efficiency.
Document data models, data flows, and transformation logic to ensure transparency and maintainability.
Enforce data security, privacy, and governance best practices.
Perform regular data audits and troubleshooting for data issues.
Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Minimum 2-3 years of experience in data engineering roles.
Strong hands-on experience with cloud data services (e.g., AWS S3, Glue, Athena, Redshift; or equivalents in GCP/Azure).
Proficiency in Python, PySpark, and SQL for data manipulation and scripting.
Experience with big data tools such as Spark or Hadoop.
Solid understanding of data modeling, warehousing concepts, and distributed systems.
Strong problem-solving skills with a focus on debugging data quality and performance issues.
Experience with data visualization tools or platforms is a plus (e.g., Power BI, Tableau, Looker).
Good to Have
Experience with version control and CI/CD for data pipelines.
Familiarity with message queue systems like Kafka or Kinesis.
Exposure to real-time data processing.
Knowledge of infrastructure-as-code tools like Terraform or CloudFormation.
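To make the orchestration bullet concrete: below is a minimal sketch of a daily three-step ETL expressed as an Apache Airflow DAG, assuming Airflow 2.4 or later. The DAG id, task bodies, and schedule are invented for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative task bodies; in a real pipeline these would call out to
# object storage, a transformation engine, and the warehouse.
def extract():
    ...  # pull raw files from the landing zone

def transform():
    ...  # clean and conform the raw records

def load():
    ...  # write the result to the warehouse

with DAG(
    dag_id="daily_sales_etl",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency chain: extract, then transform, then load
```

The `>>` operator is Airflow's way of declaring task dependencies; the scheduler then runs the chain once per day and retries or alerts on failure according to the DAG's configuration.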
Posted 4 days ago
8.0 - 15.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Associate Director - Data Engineering
About Junglee Games: With over 140 million users, Junglee Games is a leader in the online skill gaming space. Founded in San Francisco in 2012 and part of the Flutter Entertainment Group, we are revolutionizing how people play games. Our notable games include Howzat, Junglee Rummy, and Junglee Poker. Our team comprises over 900 talented individuals who have worked on internationally acclaimed AAA titles like Transformers and Star Wars: The Old Republic and contributed to Hollywood hits such as Avatar. Junglee's mission is to build entertainment for millions of people around the world and connect them through games. Junglee Games is not just a gaming company but a blend of innovation, data science, cutting-edge tech, and, most importantly, a values-driven culture that is creating the next set of conscious leaders.
Job overview: As our Associate Director, Data Engineering, you will be responsible for leading a highly qualified team and solving complex problems. You will partner with multiple stakeholders across the organization to efficiently deliver data infrastructure, data modeling, and Generative AI solutions, and you will also play the role of an architect in the team. You will have a very good understanding of data visualisation and analytics.
Job Location: Bangalore
Key responsibilities:
Highly developed verbal and written communication skills, with the ability to work up and down within the organization to influence others and achieve results.
Design and implement Generative AI solutions (e.g., LLMs, diffusion models, transformers) for real-world gaming applications, including fraud detection, recommender systems, responsible gaming, and conversational AI (chatbots, virtual assistants).
Build robust pipelines for model training, inference, and continuous learning.
Partner with key stakeholders across all levels to drive solutions that meet business needs.
Line-manage a team of 7 direct-report Data and Analytics Engineers.
Be accountable for the technical delivery of features that achieve business outcomes.
Mentor a team of data engineers, fostering a culture of innovation.
Drive visualisation strategy, ensuring that business users have access to clear, actionable dashboards and reports.
Participate in solution approaches/designs and operating principles.
Act as the initial point of escalation for the team to remove blockers.
Manage the development of efficient ETL processes to gather, clean, and transform data from various sources.
Stay abreast of emerging technologies and tools, evaluating their potential to enhance our data capabilities.
Foster an engineering mindset.
Demonstrate a commitment and passion for associate development, driving the talent agenda.
Set clear goals and expectations around performance, providing timely feedback and stretch targets.
Stay current with the latest in GenAI research and tooling, bringing innovative approaches to production.
Ensure high data availability, reliability, and performance across the stack.
Qualifications & skills required:
Typically 8-15 years of prior experience in data engineering, with at least 1-2 years focused on Generative AI or LLMs.
3-5 years of team management experience.
Experience deploying ML models in production environments (REST APIs, microservices, etc.).
Hands-on experience with frameworks such as LangChain, LlamaIndex, or Retrieval-Augmented Generation (RAG) systems (a minimal retrieval sketch follows this listing).
Experience managing multiple concurrent projects and development teams.
Experience with stakeholder management.
Deep understanding of ETL/ELT workflows, batch and real-time data processing.
Proficient with AWS Cloud technologies (i.e., S3, Lambda, DynamoDB, EC2), Python, and Spark.
Strong proficiency in data warehousing and ETL processes.
Be a part of Junglee Games to:
Value Customers & Data - Prioritize customers, use data-driven decisions, master KPIs, and leverage ideation and A/B testing to drive impactful outcomes.
Inspire Extreme Ownership - We embrace ownership, collaborate effectively, and take pride in every detail to ensure every game becomes a smashing success.
Lead with Love - We reject micromanagement and fear, fostering open dialogue, mutual growth, and a fearless yet responsible work ethic.
Embrace Change - Change drives progress, and our strength lies in adapting swiftly and recognizing when to evolve to stay ahead.
Play the Big Game - We think big, challenge norms, and innovate boldly, driving impactful results through fresh ideas and inventive problem-solving.
Avail a comprehensive benefits package that includes paid gift coupons, fitness plans, gadget allowances, fuel costs, family healthcare, and much more.
Know more about us: Explore the world of Junglee Games through our website, www.jungleegames.com. Get a glimpse of what Life at Junglee Games looks like on LinkedIn.
Liked what you saw so far? Be A Junglee
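To keep the RAG reference concrete without pinning a fast-moving framework API, here is a framework-free sketch of the retrieval step. The `embed` function is a toy stand-in for a real embedding model (e.g., a sentence-transformer encoder), and the documents and query are invented:

```python
import numpy as np

# Toy deterministic "embedding": a real system would call an embedding model.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)  # unit-normalize so dot product = cosine similarity

docs = [
    "Players may set daily deposit limits in account settings.",
    "Rummy tournaments run every evening at 8 pm IST.",
    "Contact support to self-exclude for 30, 90, or 180 days.",
]
doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vecs @ embed(query)            # cosine similarity against each document
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved passages are then prepended to the LLM prompt.
context = retrieve("How do I limit how much I deposit?")
prompt = "Answer using only this context:\n" + "\n".join(context)
```

Frameworks like LangChain or LlamaIndex wrap exactly this pattern (embed, index, retrieve, stuff into the prompt) with production concerns such as vector stores, chunking, and caching layered on top.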
Posted 4 days ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Job Announcement: Remote Sensing and GIS Specialist (Sep 3, 2023)
Join Our Team as a Remote Sensing and GIS Specialist
At BlueEnergy Build Private Limited (BEBPL), we're a dynamic company specializing in multi-sector investigations. Our primary focus involves conducting comprehensive studies through hydrological, geological, geophysical, and hydrogeological investigations. We take immense pride in our commitment to providing clients with accurate and insightful data. We understand that the success of any project hinges on a profound understanding of the subsurface. That's why we've assembled a team of experts from diverse fields, each contributing their unique expertise to ensure our investigations are thorough and precise. Whether we're assessing groundwater resources for agriculture, deciphering geological formations for infrastructure projects, or conducting hydrogeological studies for environmental conservation, BEBPL is dedicated to delivering high-quality services tailored to our clients' specific needs. We value excellence, innovation, and collaboration and are driven by a shared passion for advancing knowledge in the field of subsurface investigations. Committed to the highest standards of professionalism and ethics, BEBPL strives to be a trusted partner for projects spanning various sectors.
Position: Remote Sensing and GIS Specialist
Location: Hyderabad
Qualifications: M.Sc. in Remote Sensing, Geographic Information Systems, Geology, Hydrology, Spatial Information Technology, or any other relevant course. Minimum 5 years of professional experience in image processing, GIS mapping, and related work.
Responsibilities:
Utilize remote sensing and GIS tools to process and analyze geospatial data.
Conduct image processing, interpretation, and analysis of satellite and aerial imagery (an illustrative snippet follows this listing).
Create and maintain detailed GIS maps, databases, and spatial models.
Collaborate with multidisciplinary teams to provide geospatial solutions for various projects.
Implement advanced spatial analysis techniques to derive meaningful insights from geospatial data.
Manage and maintain geospatial databases, ensuring data accuracy and quality.
Stay updated with the latest advancements in remote sensing and GIS technologies and recommend their application to improve processes.
Effectively communicate findings and present geospatial information to stakeholders.
Requirements:
Strong academic background with an M.Sc. in a relevant field.
Proficiency in remote sensing and GIS software such as ArcGIS, QGIS, ENVI, or similar tools.
Demonstrated experience in image processing, data analysis, and geospatial modeling.
Familiarity with various types of geospatial data sources, including satellite and aerial imagery.
Excellent problem-solving skills and attention to detail.
Effective communication skills, both written and verbal.
Ability to work independently and as part of a team.
Strong organizational skills and the ability to manage and prioritize multiple tasks.
Experience: 5+ years in image processing, GIS mapping, and related work.
Salary: Competitive, commensurate with qualifications and experience.
How to Apply: Interested candidates who meet the qualifications are invited to apply by email. Please include "RSGIS Application [Your Name]" in the email subject line.
Deadline for Application: 30 days from the date of posting.
We are an equal opportunity employer and welcome applications from candidates of all backgrounds.
Join our team in Hyderabad and contribute your expertise to exciting geophysical projects. Your skills and commitment will play a key role in advancing our mission. Apply today!
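As a small illustration of the image-processing work described above: a common first task in vegetation and groundwater studies is computing NDVI from red and near-infrared bands. This sketch assumes `rasterio` is installed and that the band files exist; all paths are placeholders, and for Sentinel-2 the red and NIR bands would typically be bands 4 and 8.

```python
import numpy as np
import rasterio

# Placeholder paths to single-band GeoTIFFs (red and near-infrared).
with rasterio.open("red_band.tif") as red_src, rasterio.open("nir_band.tif") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")
    profile = red_src.profile  # reuse georeferencing and metadata for the output

# NDVI = (NIR - Red) / (NIR + Red); guard against division by zero with NaN.
denom = nir + red
ndvi = (nir - red) / np.where(denom == 0, np.nan, denom)

profile.update(dtype="float32", count=1)
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```

The same computation is available point-and-click in ArcGIS, QGIS, or ENVI; scripting it is what makes it repeatable across large image archives.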
Posted 4 days ago
4.0 - 6.0 years
5 - 9 Lacs
Ahmedabad, Bengaluru
Work from Office
Sr Software Engineer at PierSight
Location: Bangalore / Ahmedabad, India
Compensation: As per industry standards
Posted: February 21st, 2025
Role: Sr Software Engineer
Industry Type: Space Technology
Employment Type: Full-time
Job Description: Are you ready to join the pioneering team at PierSight Space? We're a Space-Tech company with teams in Ahmedabad, California, and Bangalore on a mission to build the world's largest constellation of Synthetic Aperture Radar and AIS satellites for comprehensive ocean surveillance. With backing from prestigious institutional investors like Alphawave Global, Elevation Capital, All in Capital, and Techstars, we're set to make a significant impact.
We are seeking a highly skilled and experienced Software Engineer to lead the development of our Maritime Tracking and Analysis software. This role requires a deep understanding of full-stack web development, geospatial data, Docker, Kubernetes, and product management. Cloud certifications are a plus.
Key Responsibilities:
Architect the full Maritime Analysis and Tracking System, ensuring it meets business needs and technical requirements.
Manage the product development lifecycle, from initial concept to final delivery.
Lead and manage a team of developers, providing technical guidance and support.
Ensure the product is delivered on time, within scope, and meets quality standards.
Collaborate with cross-functional teams, including engineering, design, and marketing, to ensure seamless integration and alignment with business objectives.
Create detailed product specifications and technical documentation.
Make decisions on technology stacks and architectural patterns.
Oversee the implementation of product features and ensure quality control.
Key Skills and Qualifications:
4-6 years of full-stack web development experience.
Strong understanding of building and managing geospatial data (a small illustrative snippet follows this listing).
Proficiency in Docker and Kubernetes.
Solid understanding of product management principles.
Cloud certifications (AWS, Azure, Google Cloud) are a plus.
Excellent problem-solving capabilities and analytical thinking.
Strong communication and leadership abilities.
Ability to work effectively in a fast-paced, dynamic environment.
Preferred Qualifications:
Experience in maritime or geospatial software development.
Knowledge of cloud-based image production pipelines and data processing workflows.
Why Join Us:
Be part of a cutting-edge technology startup with a mission to revolutionize maritime surveillance.
Work with a dynamic and passionate team of professionals.
Opportunity to lead and shape the development of innovative software solutions.
Competitive salary and benefits package.
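For a flavor of the geospatial-data skill this listing names, here is a small, hypothetical sketch of loading AIS vessel position reports into a GeoDataFrame and filtering them to a surveillance bounding box. It assumes `geopandas` is available; the MMSI numbers and coordinates are made up.

```python
import geopandas as gpd
import pandas as pd

# Hypothetical AIS pings (vessel position reports); values are invented.
pings = pd.DataFrame({
    "mmsi": [244660000, 244660000, 636012345],
    "lon": [72.83, 72.90, 73.10],
    "lat": [18.94, 18.99, 18.80],
})
gdf = gpd.GeoDataFrame(
    pings,
    geometry=gpd.points_from_xy(pings.lon, pings.lat),
    crs="EPSG:4326",  # plain WGS84 lon/lat
)

# Vessels inside a hypothetical surveillance box (lon 72.5-73.0, lat 18.5-19.5),
# using geopandas' coordinate-based .cx indexer.
in_area = gdf.cx[72.5:73.0, 18.5:19.5]
print(in_area["mmsi"].unique())
```

In a real tracking system this filtering would run continuously against a spatially indexed store rather than an in-memory frame, but the data model is the same.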
Posted 4 days ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Job Description
You will be a part of our Data Engineering team and will be focused on delivering exceptional results for our clients. A large portion of your time will be in the weeds working alongside your team architecting, designing, implementing, and optimizing data solutions. You'll work with the team to deliver, migrate, and/or scale cloud data solutions; build pipelines and scalable analytic tools using leading technologies including AWS, Azure, GCP, Spark, Hadoop, etc.
What you'll be doing:
Develop data pipelines to move and transform data from various sources to data warehouses (a minimal PySpark sketch follows this listing).
Ensure the quality, reliability, and scalability of the organization's data infrastructure.
Optimize data processing and storage for performance and cost-effectiveness.
Collaborate with data scientists, analysts, and other stakeholders to understand their requirements and develop solutions to meet their needs.
Continuously monitor and troubleshoot data pipelines to ensure their reliability and availability.
Stay up to date with the latest trends and technologies in data engineering and apply them to improve our data capabilities.
Qualifications
Bachelor's degree in Computer Science, Software Engineering, or a related field.
6+ years of experience in data engineering or a related field.
Strong programming skills in Python or Scala.
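As a minimal sketch of the pipeline work above, assuming PySpark and an accessible source path (the bucket names, columns, and cleansing rules are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Hypothetical raw zone of a data lake.
orders = spark.read.json("s3://raw-bucket/orders/2024/*.json")

clean = (
    orders
    .dropDuplicates(["order_id"])                        # de-duplicate on the business key
    .filter(F.col("amount") > 0)                         # drop invalid rows
    .withColumn("order_date", F.to_date("created_at"))   # conform the date type
)

# Partitioned Parquet write into the curated zone, ready for the warehouse.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/orders/"
)
```

The same read-cleanse-write shape scales from a laptop to a cluster, which is why Spark shows up in so many of these postings.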
Posted 4 days ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
In the IBM Chief Information Office, you will be part of a dynamic team driving the future of AI and data science in large-scale enterprise transformations. We offer a collaborative environment where your technical expertise will be valued, and your professional development will be supported. Join us to work on challenging projects, leverage the latest technologies, and make a tangible impact on leading organisations.
As a Data Scientist within IBM's Chief Information Office, you will support AI-driven projects across the enterprise. You will apply your technical skills in AI, machine learning, and data analytics to assist in implementing data-driven solutions that align with business goals. This role involves working with team members to translate data insights into actionable recommendations.
Key Responsibilities:
Technical Execution and Leadership: Develop and deploy AI models and data analytics solutions. Support the implementation and optimisation of AI-driven strategies per business stakeholder requirements. Help refine data-driven methodologies for transformation projects.
Data Science and AI: Design and implement machine learning solutions and statistical models, from problem formulation through deployment, to analyse complex datasets and generate actionable insights. Learn and utilise cloud platforms to ensure the scalability of AI solutions. Leverage reusable assets and apply IBM standards for data science and development.
Project Support: Lead and contribute to various stages of AI and data science projects, from data exploration to model development. Monitor project timelines and help resolve technical challenges. Design and implement measurement frameworks to benchmark AI solutions, quantifying business impact through KPIs.
Collaboration: Ensure alignment to stakeholders' strategic direction and tactical needs. Work with data engineers, software developers, and other team members to integrate AI solutions into existing systems. Contribute technical expertise to cross-functional teams.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Bachelor's or Master's in Computer Science, Data Science, Statistics, or a related field is required; an advanced degree is strongly preferred.
Experience: 5+ years of experience in data science, AI, or analytics with a focus on implementing data-driven solutions. Experience with data cleaning, data analysis, A/B testing, and data visualization. Experience with AI technologies through coursework or projects.
Technical Skills: Proficiency in SQL and Python for performing data analysis and developing machine learning models. Knowledge of common machine learning algorithms and frameworks: linear regression, decision trees, random forests, gradient boosting (e.g., XGBoost, LightGBM), neural networks, and deep learning frameworks such as TensorFlow and PyTorch (a short illustration follows this listing). Experience with cloud-based platforms and data processing frameworks. Understanding of large language models (LLMs). Familiarity with IBM's watsonx product suite. Familiarity with object-oriented programming.
Analytical Skills: Strong problem-solving abilities and eagerness to learn.
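As a short worked example of the gradient-boosting skills this listing names, the snippet below trains scikit-learn's GradientBoostingClassifier on synthetic data and benchmarks it with a hold-out AUC, echoing the measurement-framework responsibility. Everything here is illustrative; a production model would be evaluated against real business KPIs.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an enterprise dataset.
X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)

# Benchmark on held-out data before any talk of deployment.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out AUC: {auc:.3f}")
```

Libraries such as XGBoost or LightGBM expose an almost identical fit/predict interface, so the evaluation discipline transfers directly.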
Posted 4 days ago
The data processing job market in India is thriving with opportunities for job seekers in the field. With the growing demand for data-driven insights in various industries, the need for professionals skilled in data processing is on the rise. Whether you are a fresh graduate looking to start your career or an experienced professional looking to advance, there are ample opportunities in India for data processing roles.
Major cities across India, including Pune, Mumbai, Bengaluru, Hyderabad, and Ahmedabad (the locations listed throughout these postings), are actively hiring for data processing roles, with a multitude of job opportunities available for job seekers.
The average salary range for data processing professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakh per annum, while experienced professionals can earn upwards of INR 10 lakh per annum.
A typical career path in data processing may include roles such as Data Analyst, Data Engineer, Data Scientist, and Data Architect. As professionals gain experience and expertise in the field, they may progress from Junior Data Analyst to Senior Data Analyst, and eventually to roles such as Data Scientist or Data Architect.
In addition to data processing skills, professionals in this field are often expected to have knowledge of programming languages such as Python, SQL, and R. Strong analytical and problem-solving skills are also essential for success in data processing roles.
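As a toy illustration of the Python-plus-SQL combination mentioned above (the data is made up), a routine data-processing pass might look like:

```python
import sqlite3

import pandas as pd

# Load and clean a small dataset, then summarise it with SQL.
df = pd.DataFrame({
    "city": ["Pune", "Mumbai", "Pune", None],
    "salary_lakh": [9.0, 4.5, 13.0, 7.0],
})
df = df.dropna(subset=["city"])  # basic cleaning: drop rows with no city

with sqlite3.connect(":memory:") as conn:
    df.to_sql("jobs", conn, index=False)
    summary = pd.read_sql(
        "SELECT city, AVG(salary_lakh) AS avg_salary FROM jobs GROUP BY city",
        conn,
    )

print(summary)
```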
As you explore opportunities in the data processing job market in India, remember to prepare thoroughly for interviews and showcase your skills and expertise confidently. With the right combination of skills and experience, you can embark on a successful career in data processing in India. Good luck!