Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11

The Team: We are looking for a highly motivated, enthusiastic, and skilled engineering lead for Commodity Insights. We strive to deliver solutions that are sector-specific, data-rich, and hyper-targeted for evolving business needs. Our software development leaders are involved in the full product life cycle, from design through release. You would be joining a strong, innovative team working on the content management platforms that support a large revenue stream for S&P Commodity Insights. Working very closely with the Product Owner and Development Manager, teams are responsible for developing user enhancements and maintaining good technical hygiene. The successful candidate will assist in the design, development, release, and support of content platforms. Skills required include ReactJS, Spring Boot, RESTful microservices, AWS services (S3, ECS, Fargate, Lambda, etc.), CSS/HTML, AJAX/JSON, XML, and SQL (PostgreSQL/Oracle). The candidate should be aware of GenAI and LLM models such as OpenAI and Claude, should be enthusiastic about prompt building for GenAI and business-related prompts, and should be able to develop and optimize prompts for AI models to improve accuracy and relevance (a minimal prompt-building sketch in Python follows at the end of this listing). The candidate must be able to work well with a distributed team, demonstrate an ability to articulate technical solutions for business requirements, have experience with content management/packaging solutions, and embrace a collaborative approach to the implementation of solutions.

Responsibilities: Lead and mentor a team through all phases of the software development lifecycle, adhering to agile methodologies (analyze, design, develop, test, debug, and deploy). Ensure high-quality deliverables and foster a collaborative environment. Be proficient with developer tools supporting the CI/CD process, including configuring and executing automated pipelines to build and deploy software components. Actively contribute to team planning and ceremonies, and commit to team agreements and goals. Ensure code quality and security by understanding vulnerability patterns, running code scans, and remediating issues. Mentor junior developers. Make sure code review tasks on all user stories are added and completed on time. Perform reviews and integration testing to assure the quality of project development efforts. Design database schemas, conceptual data models, UI workflows, and application architectures that fit into the enterprise architecture. Support the user base, assisting with tracking down issues and analyzing feedback to identify product improvements. Understand and commit to the culture of S&P Global: the vision, purpose, and values of the organization.

Basic Qualifications: 10+ years' experience in an agile team development role, delivering software solutions using Scrum. Java, J2EE, JavaScript, CSS/HTML, AJAX. ReactJS, Spring Boot, microservices, RESTful services, OAuth. XML, JSON, data transformation. SQL and NoSQL databases (Oracle, PostgreSQL). Working knowledge of Amazon Web Services (Lambda, Fargate, ECS, S3, etc.). Experience with GenAI or LLM models such as OpenAI and Claude is preferred. Experience with agile workflow tools (e.g. VSTS, JIRA). Experience with source code management tools (e.g. Git), build management tools (e.g. Maven), and continuous integration/delivery processes and tools (e.g.
Jenkins, Ansible). Self-starter able to work to achieve objectives with minimum direction. Comfortable working independently as well as in a team. Excellent verbal and written communication skills.

Preferred Qualifications: Analysis of business information patterns, data analysis, and data modeling. Working with user experience designers to deliver end-user-focused benefits realization. Familiarity with containerization (Docker, Kubernetes). Messaging/queuing solutions (Kafka, etc.). Familiarity with application security development/operations best practices (including static/dynamic code analysis tools).

About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture, and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit . Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
----------------------------------------------------------- S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
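By way of illustration of the prompt-building work this role describes, here is a minimal Python sketch of constructing a business-related classification prompt for an LLM. The template wording, field names, and sector list are assumptions for the example, not S&P's actual prompts, and the model API call itself is deliberately omitted.

```python
from string import Template

# Hypothetical prompt template for tagging commodity news content;
# the instructions and sector list are illustrative only.
TAG_PROMPT = Template(
    "You are a commodities content analyst.\n"
    "Classify the article below into one sector "
    "(oil and gas, power, chemicals, metals, agriculture, shipping)\n"
    "and list up to five keywords. Answer as JSON with keys "
    "'sector' and 'keywords'.\n\n"
    "Article:\n$article_text"
)

def build_tagging_prompt(article_text: str, max_chars: int = 4000) -> str:
    """Truncate the article to fit a model context budget and fill the template."""
    return TAG_PROMPT.substitute(article_text=article_text[:max_chars])

if __name__ == "__main__":
    print(build_tagging_prompt("Brent crude rose 2% on supply concerns..."))
```

Iterating on the template text and measuring answer accuracy against labeled examples is the essence of the prompt-optimization responsibility the posting mentions.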
Posted 2 months ago
4.0 - 7.0 years
5 - 16 Lacs
Hyderabad, Bengaluru
Work from Office
Roles and Responsibilities: Design, develop, test, deploy, and maintain Snowflake data warehouses for clients. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions. Develop ETL processes using Python scripts to extract data from various sources and load it into Snowflake tables. Troubleshoot issues related to Snowflake performance tuning, query optimization, and data quality. Job Requirements: 4-7 years of experience in developing large-scale data warehouses on AWS using Snowflake. Strong understanding of lambda expressions in the Snowflake SQL language. Experience with the Python programming language for ETL development.
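As an illustration of the Python-to-Snowflake ETL work described above, the following sketch bulk-loads a local extract into a Snowflake table using the snowflake-connector-python package. The account, credentials, file path, and table name are placeholders; real credentials would come from a secrets manager.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection parameters below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # Stage the extract into the table's internal stage, then bulk-load it.
    cur.execute("PUT file:///tmp/orders_extract.csv @%ORDERS_RAW OVERWRITE = TRUE")
    cur.execute(
        "COPY INTO ORDERS_RAW "
        "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1)"
    )
finally:
    cur.close()
    conn.close()
```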
Posted 2 months ago
7.0 - 12.0 years
15 - 27 Lacs
Bengaluru
Hybrid
Labcorp is hiring a Senior Data Engineer. This person will be an integrated member of the Labcorp Data and Analytics team and work within the IT team, playing a crucial role in designing, developing and maintaining data solutions using Databricks, Fabric, Spark, PySpark and Python. The role is responsible for reviewing business requests and translating them into technical solutions and technical specifications. In addition, the role works with team members to mentor fellow developers to grow their knowledge and expertise, in a fast-paced and high-volume processing environment where quality and attention to detail are vital.

RESPONSIBILITIES: Design and implement end-to-end data engineering solutions by leveraging the full suite of Databricks and Fabric tools, including data ingestion, transformation, and modeling. Design, develop and maintain end-to-end data pipelines using Spark, ensuring scalability, reliability, and cost-optimized solutions. Conduct performance tuning and troubleshooting to identify and resolve any issues. Implement data governance and security best practices, including role-based access control, encryption, and auditing. Work in a fast-paced environment and perform effectively in an agile development environment.

REQUIREMENTS: 8+ years of experience in designing and implementing data solutions, with at least 4+ years of experience in data engineering. Extensive experience with Databricks and Fabric, including a deep understanding of their architecture, data modeling, and real-time analytics. Minimum 6+ years of experience in Spark, PySpark and Python. Must have strong experience in SQL, Spark SQL, data modeling and RDBMS concepts. Strong knowledge of Data Fabric services, particularly Data Engineering, Data Warehouse, Data Factory, and Real-Time Intelligence. Strong problem-solving skills, with the ability to multi-task. Familiarity with security best practices in cloud environments, Active Directory, encryption, and data privacy compliance. Communicate effectively, both orally and in writing. Experience in AGILE development, SCRUM and Application Lifecycle Management (ALM). Preference given to current or former Labcorp employees.

EDUCATION: Bachelor's in Engineering or MCA.
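To make the pipeline work above concrete, here is a minimal PySpark sketch of an ingest-clean-write step targeting a Delta table, as might run on Databricks. The source path, column names, and table name are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Landing path and schema are hypothetical.
raw = spark.read.json("/mnt/landing/orders/2024-06-01/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write as a partitioned Delta table; Delta is the default table format on Databricks.
(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("order_date")
        .saveAsTable("silver.orders"))
```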
Posted 2 months ago
10.0 - 15.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Work from Office
Roles and responsibilities: Work closely with the Product Owners and stakeholders to design the technical architecture for the data platform to meet the requirements of the proposed solution. Work with the leadership to set the standards for software engineering practices within the machine learning engineering team and support across other disciplines. Play an active role in leading team meetings and workshops with clients. Choose and use the right analytical libraries, programming languages, and frameworks for each task. Help the Data Engineering team produce high-quality code that allows us to put solutions into production. Create and own the technical product backlogs for products, and help the team close backlog items on time. Refactor code into reusable libraries, APIs, and tools. Help us to shape the next generation of our products.

What We're Looking For: 10+ years' total experience in the data management area, with experience implementing modern data ecosystems on AWS/cloud platforms. Strong experience with AWS ETL/file movement tools (Glue, Athena, Lambda, Kinesis and the wider AWS integration stack). Strong experience with Agile development and SQL. Strong experience with two or three AWS database technologies (Redshift, Aurora, RDS, S3 and other AWS data services), covering security, policies, and access management. Strong programming experience with Python and Spark. Quick to learn new technologies. Experience with Apache Airflow and other automation stacks. Excellent data modeling skills. Excellent oral and written communication skills. A high level of intellectual curiosity, external perspective, and innovation interest. Strong analytical, problem solving and investigative skills. Experience in applying quality and compliance requirements. Experience with security models and development on large data sets.
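As a sketch of the AWS orchestration this role touches (Glue plus automation stacks like Airflow or Step Functions), the following Python snippet starts a Glue job with boto3 and polls for completion. The job name, argument, and region are hypothetical; a production setup would delegate the polling to Airflow or Step Functions rather than a loop.

```python
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Job name and arguments are placeholders for illustration.
run = glue.start_job_run(
    JobName="daily-ingest-job",
    Arguments={"--run_date": "2024-06-01"},
)

# Poll until the run reaches a terminal state.
while True:
    state = glue.get_job_run(
        JobName="daily-ingest-job", RunId=run["JobRunId"]
    )["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue run finished with state: {state}")
```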
Posted 2 months ago
4.0 - 7.0 years
12 - 20 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Role and Responsibilities: Managing the complete software development process from conception to deployment. Maintaining and upgrading the software following deployment. Managing the end-to-end life cycle of software and applications. Overseeing and guiding the analysis, writing, building, and deployment of software. Overseeing automated testing and providing feedback to management during the development process. Modifying and testing changes to previously developed programs.

Skills and Experience: 3+ years of experience in developing enterprise-level applications using React, JavaScript, TypeScript, HTML, CSS. 1+ years of experience in AWS services (Lambda, EC2, etc.). Strong proficiency in JavaScript, the object model, DOM manipulation and event handlers, and data structures, and a complete understanding of the virtual DOM, component lifecycle, REST API integration, etc. Experience in writing UI test cases. Excellent verbal and written communication and collaboration skills to effectively communicate with both business and technical teams. Comfortable working in a fast-paced, result-oriented environment. Experience leading a team.
Posted 2 months ago
9.0 - 14.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Qualifications/Skill Sets: Experience: 8+ years of experience in software engineering, with at least 3 years at the Staff Engineer or Technical Lead level. Architecture Expertise: Proven track record designing and building large-scale, multi-tenant SaaS applications on cloud platforms (e.g., AWS, Azure, GCP). Tech Stack: Expertise in modern backend languages (e.g., Java, Python, Go, Node.js), frontend frameworks (e.g., React, Angular), and database systems (e.g., PostgreSQL, MySQL, NoSQL). Cloud & Infrastructure: Strong knowledge of containerization (Docker, Kubernetes), serverless architectures, CI/CD pipelines, and infrastructure-as-code (e.g., Terraform, CloudFormation). End-to-end development and deployment experience in cloud applications. Distributed Systems: Deep understanding of event-driven architecture, message queues (e.g., Kafka, RabbitMQ), and microservices. Security: Strong focus on secure coding practices and familiarity with identity management (OAuth2, SAML) and data encryption. Communication: Excellent verbal and written communication skills with the ability to present complex technical ideas to stakeholders. Problem Solving: Strong analytical mindset and a proactive approach to identifying and solving system bottlenecks.
Posted 2 months ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
What you will do: In this vital role you will be responsible for designing, building, and maintaining scalable, secure, and reliable AWS cloud infrastructure. This is a hands-on engineering role requiring deep expertise in Infrastructure as Code (IaC), automation, cloud networking, and security. The ideal candidate should have strong AWS knowledge and be capable of writing and maintaining Terraform, CloudFormation, and CI/CD pipelines to streamline cloud deployments. Please note, this is an onsite role based in Hyderabad.

Roles & Responsibilities:
AWS Infrastructure Design & Implementation: Architect, implement, and manage highly available AWS cloud environments. Design VPCs, subnets, security groups, and IAM policies to enforce security standard processes. Optimize AWS costs using reserved instances, savings plans, and auto-scaling.
Infrastructure as Code (IaC) & Automation: Develop, maintain, and enhance Terraform and CloudFormation templates for cloud provisioning. Automate deployment, scaling, and monitoring using AWS-native tools and scripting. Implement and manage CI/CD pipelines for infrastructure and application deployments.
Cloud Security & Compliance: Enforce standard processes in IAM, encryption, and network security. Ensure compliance with SOC2, ISO27001, and NIST standards. Implement AWS Security Hub, GuardDuty, and WAF for threat detection and response.
Monitoring & Performance Optimization: Set up AWS CloudWatch, Prometheus, Grafana, and logging solutions for proactive monitoring. Implement auto-scaling, load balancing, and caching strategies for performance optimization. Resolve cloud infrastructure issues and conduct root cause analysis.
Collaboration & DevOps Practices: Work closely with software engineers, SREs, and DevOps teams to support deployments. Maintain GitOps standard processes for cloud infrastructure versioning. Support on-call rotation for high-priority cloud incidents.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 4 to 6 years of experience in computer science, IT, or a related field with hands-on cloud experience; OR Bachelor's degree and 6 to 8 years of experience in computer science, IT, or a related field with hands-on cloud experience; OR Diploma and 10 to 12 years of experience in computer science, IT, or a related field with hands-on cloud experience.

Must-Have Skills: Deep hands-on experience with AWS (EC2, S3, RDS, Lambda, VPC, IAM, ECS/EKS, API Gateway, etc.). Expertise in Terraform and CloudFormation for AWS infrastructure automation. Strong knowledge of AWS networking (VPC, Direct Connect, Transit Gateway, VPN, Route 53). Experience with Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, CodePipeline, etc.). Strong troubleshooting and debugging skills in cloud networking, storage, and security.

Preferred Qualifications: Good-to-Have Skills: Experience with Kubernetes (EKS) and service mesh architectures. Knowledge of AWS Lambda and event-driven architectures. Familiarity with AWS CDK, Ansible, or Packer for cloud automation. Exposure to multi-cloud environments (Azure, GCP). Familiarity with HPC, DGX Cloud.

Professional Certifications (preferred): AWS Certified Solutions Architect Associate or Professional. AWS Certified DevOps Engineer Professional. Terraform Associate Certification.

Soft Skills: Strong analytical and problem-solving skills.
Ability to work effectively with global, virtual teams. Effective communication and collaboration with cross-functional teams. Ability to work in a fast-paced, cloud-first environment.
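For a flavor of the security-standards enforcement mentioned above (VPCs, security groups, IAM), here is a small boto3 sketch that flags security groups allowing inbound traffic from anywhere. The region and output format are illustrative; a real audit would feed findings into Security Hub or a ticketing workflow.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Flag security groups with ingress open to the whole internet,
# a common check when enforcing network-security standards.
for sg in ec2.describe_security_groups()["SecurityGroups"]:
    for perm in sg.get("IpPermissions", []):
        for ip_range in perm.get("IpRanges", []):
            if ip_range.get("CidrIp") == "0.0.0.0/0":
                print(f"{sg['GroupId']} ({sg['GroupName']}): "
                      f"open ingress on port(s) {perm.get('FromPort', 'all')}")
```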
Posted 2 months ago
1.0 - 3.0 years
3 - 7 Lacs
Hyderabad
Work from Office
The role is responsible for designing, developing, and maintaining software solutions for research scientists. Additionally, it involves automating operations, monitoring system health, and responding to incidents to minimize downtime. You will join a multi-functional team of scientists and software professionals that enables technology and data capabilities to evaluate drug candidates and assess their abilities to affect the biology of drug targets. This team implements scientific software platforms that enable the capture, analysis, storage, and reporting for our Large Molecule Discovery Research team (Design, Make, Test and Analyze processes). The team also interfaces heavily with teams supporting our in vitro assay management systems and our compound inventory platforms. The ideal candidate possesses experience in the pharmaceutical or biotech industry, strong technical skills, and full stack software engineering experience (spanning SQL, back-end, front-end web technologies, and automated testing).

Roles & Responsibilities: Work closely with the product team, the business team including scientists, and other collaborators. Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications. Design, develop, and implement applications and modules, including custom reports, interfaces, and enhancements. Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software. Conduct code reviews to ensure code quality and adherence to standard methodologies. Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations. Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently. Stay updated with the latest technology and security trends and advancements.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The professional we seek has the following qualifications.

Basic Qualifications: Master's degree with 1 - 3 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field; OR Bachelor's degree with 4 - 6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field; OR Diploma with 7 - 9 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field.

Preferred Qualifications and Experience: 1+ years of experience in implementing and supporting biopharma scientific software platforms.

Functional Skills: Proficient in Java or Python. Proficient in at least one JavaScript UI framework (e.g. ExtJS, React, or Angular). Proficient in SQL (e.g. Oracle, PostgreSQL, Databricks).

Preferred Qualifications: Experience with event-based architecture and serverless AWS services such as EventBridge, SQS, Lambda or ECS.
Experience with Benchling. Hands-on experience with full stack software development. Strong understanding of software development methodologies, mainly Agile and Scrum. Working experience with DevOps practices and CI/CD pipelines. Experience with infrastructure as code (IaC) tools (Terraform, CloudFormation). Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk). Experience with automated testing tools and frameworks. Experience with big data technologies (e.g., Spark, Databricks, Kafka). Experience leveraging AI assistants (e.g. GitHub Copilot) to accelerate software development and improve code quality.

Professional Certifications: AWS Certified Cloud Practitioner preferred.

Soft Skills: Excellent problem solving, analytical, and troubleshooting skills. Strong communication and interpersonal skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to learn quickly and work independently. Team-oriented, with a focus on achieving team goals. Ability to manage multiple priorities successfully. Strong presentation and public speaking skills.
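Since this posting highlights event-based architecture with EventBridge, SQS, Lambda, or ECS, here is a minimal Python Lambda handler for an SQS-triggered function. The message payload shape (an assay-result record) is a hypothetical stand-in for whatever the scientific platform would actually emit.

```python
import json

def handler(event, context):
    """AWS Lambda entry point for an SQS-triggered function.

    The 'assay_result_id' field is an assumed payload shape for
    illustration, not the platform's actual message schema.
    """
    records = event.get("Records", [])
    for record in records:
        body = json.loads(record["body"])  # SQS delivers the message body as a string
        result_id = body.get("assay_result_id")
        # Real code would persist or forward the result; here we just log it.
        print(f"processing assay result {result_id}")
    return {"processed": len(records)}
```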
Posted 2 months ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Looking for a skilled Senior Data Science Engineer with 6-12 years of experience to lead the development of advanced computer vision models and systems. The ideal candidate will have hands-on experience with state-of-the-art architectures and a deep understanding of the complete ML lifecycle. This position is based in Bengaluru.

Roles and Responsibilities: Lead the development and implementation of computer vision models for tasks such as object detection, tracking, image retrieval, and scene understanding. Design and execute end-to-end pipelines for data preparation, model training, evaluation, and deployment. Perform fine-tuning and transfer learning on large-scale vision-language models to meet application-specific needs. Optimize deep learning models for edge inference (NVIDIA Jetson, TensorRT, OpenVINO) and real-time performance. Develop scalable and maintainable ML pipelines using tools such as MLflow, DVC, and Kubeflow. Automate experimentation and deployment processes using CI/CD workflows. Collaborate cross-functionally with MLOps, backend, and product teams to align technical efforts with business needs. Monitor, debug, and enhance model performance in production environments. Stay up-to-date with the latest trends in CV/AI research and rapidly prototype new ideas for real-world use.

Job Requirements: 6-7+ years of hands-on experience in data science and machine learning, with at least 4 years focused on computer vision. Strong experience with deep learning frameworks: PyTorch (preferred), TensorFlow, Hugging Face Transformers. In-depth understanding and practical experience with class-incremental learning and lifelong learning systems. Proficient in Python, including data processing libraries like NumPy, Pandas, and OpenCV. Strong command of version control and reproducibility tools (e.g., MLflow, DVC, Weights & Biases). Experience with training and optimizing models for GPU inference and edge deployment (Jetson, Coral, etc.). Familiarity with ONNX, TensorRT, and model quantization/conversion techniques. Demonstrated ability to analyze and work with large-scale visual datasets in real-time or near-real-time systems. Experience working in fast-paced startup environments with ownership of production AI systems. Exposure to cloud platforms such as AWS (SageMaker, Lambda), GCP, or Azure for ML workflows. Experience with video analytics, real-time inference, and event-based vision systems. Familiarity with monitoring tools for ML systems (e.g., Prometheus, Grafana, Sentry). Prior work in domains such as retail analytics, healthcare, or surveillance/IoT-based CV applications. Contributions to open-source computer vision libraries or publications in top AI/ML conferences (e.g., CVPR, NeurIPS, ICCV). Comfortable mentoring junior engineers and collaborating with cross-functional stakeholders.
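As a concrete instance of the fine-tuning and transfer-learning work described above, this PyTorch sketch freezes an ImageNet-pretrained backbone and retrains only a new classification head. The class count and the single dummy training step are illustrative assumptions.

```python
import torch
from torch import nn
from torchvision import models

# Transfer-learning sketch: reuse ImageNet features, retrain only the head.
num_classes = 10  # placeholder for the application-specific label set
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)

for param in model.parameters():  # freeze the pretrained backbone
    param.requires_grad = False

model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head

optimizer = torch.optim.AdamW(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a random dummy batch.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```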
Posted 2 months ago
7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Key Responsibilities: Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines. Design and implement pipelines adhering to 2NF/3NF normalization standards. Develop and maintain ETL processes for integrating data from multiple ERP and source systems. Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs. Raise CAB requests via Carrier's change process and manage production deployments. Provide UAT support and ensure smooth transition of finalized pipelines to support teams. Create and maintain comprehensive technical documentation for traceability and handover. Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration. Optimize complex SQL queries, perform performance tuning, and ensure data ops best practices.

Requirements: Strong hands-on experience with Snowflake. Expert-level SQL skills and deep understanding of data transformation. Solid grasp of data architecture and 2NF/3NF normalization techniques. Experience with cloud-based data platforms and modern data pipeline design. Exposure to AWS data services like S3, Glue, Lambda, Step Functions (preferred). Proficiency with ETL tools and working in Agile environments. Familiarity with the Carrier CAB process or similar structured deployment frameworks. Proven ability to debug complex pipeline issues and enhance pipeline scalability. Strong communication and collaboration skills.
Posted 2 months ago
9.0 - 12.0 years
35 - 40 Lacs
Bengaluru
Work from Office
We are seeking an experienced AWS Architect with a strong background in designing and implementing cloud-native data platforms. The ideal candidate should possess deep expertise in AWS services such as S3, Redshift, Aurora, Glue, and Lambda, along with hands-on experience in data engineering and orchestration tools. Strong communication and stakeholder management skills are essential for this role.

Key Responsibilities: Design and implement end-to-end data platforms leveraging AWS services. Lead architecture discussions and ensure scalability, reliability, and cost-effectiveness. Develop and optimize solutions using Redshift, including stored procedures, federated queries, and the Redshift Data API. Utilize AWS Glue and Lambda functions to build ETL/ELT pipelines. Write efficient Python code and data frame transformations, along with unit testing. Manage orchestration tools such as AWS Step Functions and Airflow. Perform Redshift performance tuning to ensure optimal query execution. Collaborate with stakeholders to understand requirements and communicate technical solutions clearly.

Required Skills & Qualifications: Minimum 9 years of IT experience with proven AWS expertise. Hands-on experience with AWS services: S3, Redshift, Aurora, Glue, and Lambda. Mandatory experience working with AWS Redshift, including stored procedures and performance tuning. Experience building end-to-end data platforms on AWS. Proficiency in Python, especially working with data frames and writing testable, production-grade code. Familiarity with orchestration tools like Airflow or AWS Step Functions. Excellent problem-solving skills and a collaborative mindset. Strong verbal and written communication and stakeholder management abilities.

Nice to Have: Experience with CI/CD for data pipelines. Knowledge of AWS Lake Formation and Data Governance practices.
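To illustrate the Redshift Data API usage this posting calls out, the following boto3 sketch invokes a stored procedure asynchronously and polls for completion. The cluster identifier, database, user, and procedure name are placeholders for the example.

```python
import time
import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

# Cluster, database, user, and SQL below are hypothetical.
resp = rsd.execute_statement(
    ClusterIdentifier="analytics-cluster",
    Database="dev",
    DbUser="etl_user",
    Sql="CALL refresh_daily_sales(:run_date)",
    Parameters=[{"name": "run_date", "value": "2024-06-01"}],
)

# The Data API is asynchronous: poll describe_statement until a terminal status.
while True:
    desc = rsd.describe_statement(Id=resp["Id"])
    if desc["Status"] in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(2)

print(f"statement status: {desc['Status']}")
```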
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Remote
As a Lead Engineer, you will play a critical role in shaping the technical direction of our projects. You will be responsible for leading a team of developers undertaking Creditsafe's digital transformation to our cloud infrastructure on AWS. Your expertise in data engineering, Python and AWS will be crucial in building and maintaining high-performance, scalable, and reliable systems.

Key Responsibilities: Technical Leadership: Lead and mentor a team of engineers, providing guidance and support to ensure high-quality code and efficient project delivery. Software Design and Development: Collaborate with cross-functional teams to design and develop data-centric applications, microservices, and APIs that meet project requirements. AWS Infrastructure: Design, configure, and manage cloud infrastructure on AWS, including services like EC2, S3, Lambda, and RDS. Performance Optimization: Identify and resolve performance bottlenecks, optimizing code and AWS resources to ensure scalability and reliability. Code Review: Conduct code reviews to ensure code quality, consistency, and adherence to best practices. Security: Implement and maintain security best practices within the codebase and cloud infrastructure. Documentation: Create and maintain technical documentation to facilitate knowledge sharing and onboarding of team members. Collaboration: Collaborate with product managers, architects, and other stakeholders to deliver high-impact software solutions. Research and Innovation: Stay up to date with the latest Python, data engineering and AWS technologies, and propose innovative solutions that can enhance our systems. Troubleshooting: Investigate and resolve technical issues and outages as they arise.

Qualifications: Bachelor's or higher degree in Computer Science, Software Engineering, or a related field. Proven experience as a Data Engineer with a strong focus on AWS services. Solid experience in leading technical teams and project management. Proficiency in Python, including deep knowledge of data engineering implementation patterns. Strong expertise in AWS services and infrastructure setup. Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus. Excellent problem-solving skills and the ability to troubleshoot complex technical issues. Strong communication and teamwork skills. A passion for staying updated with the latest industry trends and technologies.
Posted 2 months ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Ahmedabad
Work from Office
We are seeking a seasoned Lead Platform Engineer with a strong background in platform development and a proven track record of leading technology design and teams. The ideal candidate will have at least 8 years of overall experience, with a minimum of 5 years in relevant roles. This position entails owning module design and spearheading the implementation process alongside a team of talented platform engineers. Job Title: Lead Platform Engineer Job Location: Ahmedabad/Pune (Work from Office) Required Experience: 7+ Years Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree Key Responsibilities: Lead the design and architecture of robust, scalable platform modules, ensuring alignment with business objectives and technical standards. Drive the implementation of platform solutions, collaborating closely with platform engineers and cross-functional teams to achieve project milestones. Mentor and guide a team of platform engineers, fostering an environment of growth and continuous improvement. Stay abreast of emerging technologies and industry trends, incorporating them into the platform to enhance functionality and user experience. Ensure the reliability and security of the platform through comprehensive testing and adherence to best practices. Collaborate with senior leadership to set technical strategy and goals for the platform engineering team. Requirements: Minimum of 8 years of experience in software or platform engineering, with at least 5 years in roles directly relevant to platform development and team leadership. Expertise in Python programming, with a solid foundation in writing clean, efficient, and scalable code. Proven experience in serverless application development, designing and implementing microservices, and working within event-driven architectures. Demonstrated experience in building and shipping high-quality SaaS platforms/applications on AWS, showcasing a portfolio of successful deployments. Comprehensive understanding of cloud computing concepts, AWS architectural best practices, and familiarity with a range of AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway. Exceptional problem-solving skills, with a proven ability to optimize complex systems for efficiency and scalability. Excellent communication skills, with a track record of effective collaboration with team members and successful engagement with stakeholders across various levels. Previous experience leading technology design and engineering teams, with a focus on mentoring, guiding, and driving the team towards achieving project milestones and technical excellence. Good to Have: AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications. Experience with the AWS Boto3 SDK for Python. Exposure to other cloud platforms such as Azure or GCP. Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
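Given this posting's emphasis on serverless SaaS platforms and the AWS Boto3 SDK, here is a brief sketch of a common multi-tenant DynamoDB access pattern, where the partition key isolates each tenant's items. The table name, key schema, and item fields are assumptions for the example.

```python
import boto3
from boto3.dynamodb.conditions import Key

# Table and key names below are illustrative, not an actual schema.
table = boto3.resource("dynamodb", region_name="us-east-1").Table("tenants")

# Multi-tenant SaaS pattern: the partition key scopes every item to one tenant.
table.put_item(Item={
    "tenant_id": "acme",      # partition key
    "sk": "USER#42",          # sort key encodes the entity type and id
    "email": "dev@example.com",
})

# Queries are automatically confined to a single tenant's partition.
resp = table.query(KeyConditionExpression=Key("tenant_id").eq("acme"))
print(resp["Items"])
```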
Posted 2 months ago
8.0 - 12.0 years
20 - 27 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are looking for an experienced Python Architect to lead the design and development of scalable, high-performance applications. The ideal candidate will have strong expertise in Python, particularly with Flask or Django frameworks, and a solid background in AWS services including Lambda, ECS, S3, and RDS. You will design microservices and REST APIs while providing architectural leadership and mentoring to the development team.

Key Responsibilities: Architect and develop Python-based applications using Flask or Django. Design and implement microservices and RESTful APIs. Lead cloud deployments leveraging AWS services such as Lambda, ECS, S3, and RDS. Provide technical guidance and mentorship to development teams. Collaborate with cross-functional teams to ensure scalable and robust solutions.

Requirements: Strong Python programming skills with hands-on experience in Flask/Django. Proficient in AWS cloud services (Lambda, ECS, S3, RDS, etc.). Experience designing microservices and REST APIs. Proven ability in architectural leadership and mentoring.

Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad. Education: Preferred degree in Computer Science or a related field.
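As a minimal sketch of the Flask-based REST API work described above, the following self-contained service exposes two endpoints over an in-memory store; a real deployment would back it with RDS or DynamoDB as the posting suggests, and the resource name is hypothetical.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store keeps the sketch self-contained;
# production code would use a real database.
ITEMS = {}

@app.route("/items/<item_id>", methods=["GET"])
def get_item(item_id):
    item = ITEMS.get(item_id)
    return (jsonify(item), 200) if item else (jsonify(error="not found"), 404)

@app.route("/items", methods=["POST"])
def create_item():
    payload = request.get_json(force=True)
    ITEMS[payload["id"]] = payload
    return jsonify(payload), 201

if __name__ == "__main__":
    app.run(port=8080)
```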
Posted 2 months ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Join our team! We're building a world where Identity belongs to you. We are looking for a Senior Full Stack Engineer to join our growing team in Business Technology (BT) in India and to help scale our business solutions while providing an extra focus on security, enabling Okta to be the most efficient, scalable, and reliable company. In this role, you will be responsible for designing and developing customizations, extensions, configurations, and integrations required to meet the company's strategic business objectives. Candidates will work collaboratively with business stakeholders, business analysts, and engineers on different infrastructure layers, from proposal development to deployment and support. Therefore, a commitment to collaborative problem-solving and delivering high-quality solutions is essential. In addition, your product owner will look to you to provide all technical services: design, config, software development, and testing.

Qualifications: 5+ years of robust experience with hands-on development and design. Experience working with the following technologies: Java, NodeJs, TypeScript, AWS (Lambda, EventBridge, SQS, SNS, API Gateway, DynamoDB, Secrets Manager/Parameter Store, EC2 instances, AppFlow, Step Functions, Kinesis), React, scripting languages (Python, Shell, Kotlin), databases (DynamoDB, PostgreSQL), Terraform, serverless architecture, unit testing frameworks (JUnit, Mockito). Experience working on the latest AI technologies is a big plus. Provide leadership and have influence over the design, implementation and support of all the POCs built for the business. Experience coaching and developing individuals for increased effectiveness and working with a geographically dispersed workforce is a plus. Willingness to learn and master unfamiliar technologies and/or concepts. Excellent verbal and written technical documentation skills.

Responsibilities: Translate business requirements into well-architected solutions that best leverage the AWS infrastructure and technologies. Provide detailed level-of-effort estimates for proposed solutions. Articulate the benefits and risks of a solution's feasibility and functionality. Collaborate with business stakeholders and product managers to find the most suitable solution for their needs. Own the deliverables from discovery to deployment with appropriate documentation. Create and execute unit, integration, and functional tests.

What you can look forward to as a Full-Time Okta employee! Amazing Benefits. Making Social Impact. Fostering Diversity, Equity, Inclusion and Belonging at Okta.
Posted 2 months ago
9.0 - 14.0 years
20 - 30 Lacs
Bengaluru
Hybrid
My profile: linkedin.com/in/yashsharma1608. Hiring manager profile: on payroll of https://www.nyxtech.in/. Client: Brillio (payroll).

AWS Architect. Primary skills: AWS (Redshift, Glue, Lambda, ETL and Aurora), advanced SQL and Python, PySpark. Note: the Aurora database is a mandatory skill. Experience: 9+ years. Notice period: immediate joiner. Location: any Brillio location (preferred is Bangalore). Budget: 30 LPA.

Job Description: 9+ years of IT experience with deep expertise in S3, Redshift, Aurora, Glue and Lambda services. At least one instance of proven experience in developing a data platform end to end using AWS. Hands-on programming experience with Data Frames, Python, and unit testing the Python as well as Glue code. Experience in orchestrating mechanisms like Airflow, Step Functions etc. Experience working on AWS Redshift is mandatory. Must have experience writing stored procedures, understanding of the Redshift Data API and writing federated queries. Experience in Redshift performance tuning. Good communication and problem solving. Very good stakeholder communication and management.
Posted 2 months ago
6.0 - 11.0 years
16 - 20 Lacs
Gurugram, sector 20
Work from Office
Conduct API testing using REST Assured. Perform automation testing using Selenium WebDriver. Carry out performance and load testing with JMeter. Ensure the quality and reliability of applications integrating through pub/sub mechanisms, AWS API Gateway, and REST APIs. Work with publisher/subscriber event-based integrations, AWS Glue, AWS EventBridge, and Lambda functions. Collaborate with cross-functional teams to identify and resolve integration and testing challenges. Proven experience in API testing, automation testing (Selenium WebDriver), and performance testing (JMeter). Strong understanding of integration patterns such as pub/sub and REST-based messaging. Hands-on experience with AWS technologies including AWS Glue, AWS EventBridge, and Lambda functions. Ability to work effectively in a hybrid setting and commute to the office as required. Skills: Automation Testing, Java, Selenium WebDriver, BDD Cucumber, API Testing, AWS (min 6 months).
Posted 2 months ago
5.0 - 8.0 years
6 - 10 Lacs
Chennai
Work from Office
The Opportunity: Primary responsibilities will include: Understanding the design for enhancements in the product and developing accordingly; participating actively in design discussions. Analyzing business requirements, discussing impacted areas, and suggesting solutions to resolve issues/areas of concern. Coding and unit testing of enhancements in the Product Suite. Stabilizing and maintaining the Product Suite. Actively participating in SCRUM ceremonies, providing constructive suggestions and inputs. Developing testable, reusable, efficient, legible code for enhancements in the Product Suite. Analyzing the root cause of issues and suggesting areas for improvement. Actively contributing to meet team commitments.

The Candidate: Required skills/qualifications: 5-8 years of relevant experience. Hands-on experience with AWS services such as EC2, S3, Lambda, DynamoDB, API Gateway, and CloudFormation. Strong proficiency in Java (Spring Boot, Hibernate, or other modern Java frameworks). Experience with Angular (including Angular CLI, RxJS, Angular forms, and component-based architecture). Experience in RESTful API development and integration with both front-end and back-end systems. Solid understanding of databases (SQL and NoSQL databases like MySQL, MongoDB, DynamoDB). Familiarity with CI/CD pipelines, DevOps practices, and cloud infrastructure management using tools like Jenkins, Git, Docker, and Terraform. Understanding of microservices architecture and experience building or maintaining microservices-based applications. Fluency in written and spoken English.

Preferred skills/qualifications: Experience developing, building, testing, deploying, and operating applications. Familiarity with working with cloud technologies. Agile/Scrum methodologies, with the ability to manage multiple tasks in a fast-paced environment.
Posted 2 months ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science, Engineering, or a related field. 4-6 years of professional experience with AWS Lambda and serverless architecture. Proficiency in Python programming. Strong experience with shell scripting and SQL. Experience working in a production environment and well versed in ITIL processes. Excellent communication and interpersonal skills. Experience with Oracle BRM is an advantage but not mandatory. Familiarity with other AWS services (e.g., S3, DynamoDB, API Gateway) is desirable. Ability to work independently and in a team environment.
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Pune
Hybrid
Role Overview: The Senior Tech Lead - AWS Data Engineering leads the design, development and optimization of data solutions on the AWS platform. The jobholder has a strong background in data engineering, cloud architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities: Lead the design and implementation of AWS-based data architectures and pipelines. Architect and optimize data solutions using AWS services such as S3, Redshift, Glue, EMR, and Lambda. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and ensure alignment with business goals. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in AWS data environments. Stay updated on the latest AWS technologies and industry trends.

Key Technical Skills & Responsibilities: Overall 10+ years of experience in IT. Minimum 5-7 years in design and development of cloud data platforms using AWS services. Must have experience in design and development of data lake / data warehouse / data analytics solutions using AWS services like S3, Lake Formation, Glue, Athena, EMR, Lambda, Redshift. Must be aware of the AWS access control and data security features like VPC, IAM, Security Groups, KMS etc. Must be good with Python and PySpark for data pipeline building. Must have data modeling experience, including S3 data organization. Must have an understanding of Hadoop components, NoSQL databases, graph databases and time-series databases, and the AWS services available for those technologies. Must have experience working with structured, semi-structured and unstructured data. Must have experience with streaming data collection and processing; Kafka experience is preferred. Experience migrating data warehouse / big data applications to AWS is preferred. Must be able to use Gen AI services (like Amazon Q) for productivity gain.

Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with AWS data services and tools. AWS certification (e.g., AWS Certified Data Analytics - Specialty). Experience with machine learning and AI integration in AWS environments. Strong understanding of data modeling, ETL/ELT processes, and cloud integration. Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture.
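To ground the streaming-ingestion requirement above (Kafka experience preferred), here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and lands raw events in an S3 data lake. The broker address, topic, and S3 paths are illustrative, and the spark-sql-kafka connector package is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Broker and topic are placeholders; requires the spark-sql-kafka package.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers bytes; cast the value to string before landing it.
parsed = stream.select(F.col("value").cast("string").alias("event_json"))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "s3://data-lake/raw/events/")          # hypothetical bucket
         .option("checkpointLocation", "s3://data-lake/checkpoints/events/")
         .start())
```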
Posted 2 months ago
6.0 - 9.0 years
8 - 11 Lacs
Mumbai, Hyderabad, Chennai
Work from Office
About the Role: Grade Level (for internal use): 10

S&P Dow Jones Indices. The Role: S&P Dow Jones Indices, a global leader in providing investable and benchmark indices to the financial markets, is looking for a Java Application Developer to join our technology team.

The Location: Mumbai/Hyderabad/Chennai.

The Team: You will be part of a global technology team comprising Dev, QA and BA teams, and will be responsible for analysis, design, development and testing.

The Impact: You will be working on one of the core technology platforms responsible for the end-of-day calculation as well as dissemination of index values.

What's in it for you: You will have the opportunity to work on enhancements to the existing index calculation system as well as implement new methodologies as required.

Responsibilities: Design and development of Java applications for SPDJI web sites and their feeder systems. Participate in multiple software development processes including coding, testing, debugging and documentation. Develop software applications based on clear business specifications. Work on new initiatives and support existing index applications. Perform application and system performance tuning and troubleshoot performance issues. Develop web-based applications and build rich front-end user interfaces. Build applications with object-oriented concepts and apply design patterns. Integrate in-house applications with various vendor software platforms. Set up the development environment / sandbox for application development. Check application code changes into the source repository. Perform unit testing of application code and fix errors. Interface with databases to extract information and build reports. Effectively interact with customers, business users and IT staff.

What we're looking for: Basic Qualification: Bachelor's degree in Computer Science, Information Systems or Engineering is required, or, in lieu, a demonstrated equivalence in work experience. 6 to 9 years of IT experience in application development and support. Strong experience with Java, J2EE, JMS & EJBs. Advanced SQL and basic PL/SQL programming. Basic networking knowledge / Unix scripting. Exposure to UI technologies like ReactJS. Basic understanding of AWS cloud (EC2, EMR, Lambda, S3, Glue, etc.). Excellent communication and interpersonal skills are essential, with strong verbal and writing proficiencies.

Preferred Qualification: Experience working with large datasets in Equity, Commodities, Forex, Futures and Options asset classes. Experience with Index/Benchmarks or Asset Management or Trading platforms. Basic knowledge of user interface design and development using jQuery, HTML5 and CSS.
Posted 2 months ago
8.0 - 12.0 years
18 - 22 Lacs
Thane, Navi Mumbai, Mumbai (All Areas)
Hybrid
Must be proficient in AWS with 3+ years of AWS serverless development experience, i.e. Lambda, SQS, SNS, API Gateway. Expertise in frameworks like Express.js. Relational databases, e.g. MySQL, PostgreSQL, and NoSQL databases, e.g. MongoDB, DynamoDB.
Posted 2 months ago
5.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
S&P Dow Jones Indices is seeking a Python/Big Data developer to be a key player in the implementation and support of data platforms for S&P Dow Jones Indices. This role requires a seasoned technologist who contributes to application development and maintenance. The candidate should actively evaluate new products and technologies to build solutions that streamline business operations. The candidate must be delivery-focused with solid financial applications experience, and will assist in day-to-day support and operations functions, design, development, and unit testing.

Responsibilities and Impact: Lead the design and implementation of EMR Spark workloads using Python, including data access from relational databases and cloud storage technologies. Implement new, powerful functionalities using Python, PySpark, AWS and Delta Lake. Independently come up with optimal designs for business use cases and implement them using big data technologies. Enhance existing functionalities in Oracle/Postgres procedures and functions. Performance tuning of existing Spark jobs. Implement new functionalities in Python, Spark, Hive. Collaborate with cross-functional teams to support data-driven initiatives. Mentor junior team members and promote best practices. Respond to technical queries from the operations and product management team.

What We're Looking For: Basic Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent work experience. 5 - 8 years of IT experience in application support or development. Hands-on development experience writing effective and scalable Python programs. Deep understanding of OOP concepts and development models in Python. Knowledge of popular Python libraries/ORM libraries and frameworks. Exposure to unit testing frameworks like Pytest. Good understanding of Spark architecture, as the system involves data-intensive operations. Good amount of work experience in Spark performance tuning. Experience/exposure with the Kafka messaging platform. Experience with build technologies like Maven, PyBuilder. Exposure to AWS offerings such as EC2, RDS, EMR, Lambda, S3, Redis. Hands-on experience with at least one relational database (Oracle, Sybase, SQL Server, PostgreSQL). Hands-on experience with SQL queries and writing stored procedures and functions. A strong willingness to learn new technologies. Excellent communication skills, with strong verbal and writing proficiencies.

Additional Preferred Qualifications: Proficiency in building data analytics solutions on AWS Cloud. Experience with microservice and serverless architecture implementation.
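As a sketch of the EMR Spark workloads described above, accessing relational databases and cloud storage, the following PySpark job reads a Postgres table over JDBC and writes it to Delta Lake on S3. Connection details, table, and paths are placeholders, and the Postgres JDBC driver plus the Delta Lake package are assumed to be on the cluster classpath.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("index_constituents_load").getOrCreate()

# Read a relational source over JDBC; URL, table, and credentials are hypothetical.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/indices")
      .option("dbtable", "public.constituents")
      .option("user", "reader")
      .option("password", "...")
      .load())

# Persist to Delta Lake on S3 for downstream analytics jobs.
(df.write.format("delta")
   .mode("overwrite")
   .save("s3://indices-data/delta/constituents/"))
```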
Posted 2 months ago
5.0 - 7.0 years
25 - 32 Lacs
Noida
Work from Office
- 5-7 years of experience in software/application development/enhancement and handling high-priority customer escalations.
- Rich experience in Node.js, JavaScript, Angular, AWS (S3, Lambda, EC2, DynamoDB, CloudFront, ALB).
- Good experience in Redis, DynamoDB, SQL databases.
- Good experience with microservices.
- Strong analytical, communication and interpersonal skills.
Posted 2 months ago
5.0 - 9.0 years
7 - 11 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11

The Role: Lead Software Engineering.

The Team: Our team is responsible for the architecture, design, development, and maintenance of technology solutions to support the Sustainability business unit within Market Intelligence and other divisions. Our program is built on a foundation of inclusivity, enablement, adaptability, and respect, which fosters an environment of open communication and trust. We take pride in each team member's accountability and responsibility to move us forward in our strategic initiatives. Our work is collaborative; we work transparently with others within our business unit and others across the entire organization.

The Impact: As a Lead, Cloud Engineering at S&P Global, you will be instrumental in streamlining the software development and deployment of our applications to meet the needs of our business. Your work ensures seamless integration and continuous delivery, enhancing the platform's operational capabilities to support our business units. You will collaborate with software engineers and data architects to automate processes, improve system reliability, and implement monitoring solutions. Your contributions will be vital in maintaining high availability, security, and performance standards, ultimately leading to the delivery of impactful, data-driven solutions.

What's in it for you: Career Development: Build a meaningful career with a leading global company at the forefront of technology. Dynamic Work Environment: Work in an environment that is dynamic and forward-thinking, directly contributing to innovative solutions. Skill Enhancement: Enhance your software development skills on an enterprise-level platform. Versatile Experience: Gain full-stack experience and exposure to cloud technologies. Leadership Opportunities: Mentor peers and influence the product's future as part of a skilled team.

Key Responsibilities: Design and develop scalable cloud applications using various cloud services. Collaborate with cross-functional teams to define, design, and deliver new features. Implement cloud security best practices and ensure compliance with industry standards. Monitor and optimize application performance and reliability in the cloud environment. Troubleshoot and resolve issues related to our applications and services. Stay updated with the latest cloud technologies and trends. Manage our cloud instances and their lifecycle, to guarantee a high degree of reliability, security, scalability, and confidence at any given time. Design and implement CI/CD pipelines to automate software delivery and infrastructure changes. Collaborate with development and operations teams to improve collaboration and productivity. Manage and optimize cloud infrastructure and services. Implement configuration management tools and practices. Ensure security best practices are followed in the deployment process.

What We're Looking For: Bachelor's degree in Computer Science or a related field. Minimum of 10+ years of experience in a cloud engineering or related role. Proven experience in cloud development and deployment. Proven experience in agile and project management. Expertise with cloud services (AWS, Azure, Google Cloud). Experience in EMR, EKS, Glue, Terraform, and cloud security. Proficiency in programming languages such as Python, Java, Scala, Spark. Strong implementation experience in AWS services (e.g. EC2, ECS, ELB, RDS, EFS, EBS, VPC, IAM, CloudFront, CloudWatch, Lambda, S3).
Proficiency in scripting languages such as Bash, Python, or PowerShell. Experience with CI/CD tools like Azure CI/CD. Experience in SQL and MS SQL Server. Knowledge of containerization technologies like Docker and Kubernetes. Nice to have: knowledge of GitHub Actions, Redshift and machine learning frameworks. Excellent problem-solving and communication skills. Ability to quickly, efficiently, and effectively define and prototype solutions with continual iteration within aggressive product deadlines. Demonstrate strong communication and documentation skills for both technical and non-technical audiences.
Posted 2 months ago