4.0 - 8.0 years
0 Lacs
karnataka
On-site
You will be responsible for performing full-stack development activities using the MERN stack. This will involve translating UX designs into functional web applications using React JS. You will collaborate with the technical architecture design team, system architect, and product manager to ensure the seamless integration of various components. Your role will also include writing effective business logic, implementing algorithms for system modules, and designing databases for scalable and secure systems using NoSQL (MongoDB) or RDBMS (MySQL or PostgreSQL). In addition to development tasks, you will be involved in proof-of-concept development with other engineers and providing effort estimations in coordination with the Product Manager and Engineering Head. Testing software to ensure responsiveness and efficiency, writing unit tests for a robust system, and creating technical documentation will also be part of your responsibilities. To qualify for this role, you should hold a Bachelor's degree in computer science or a related field, with a postgraduate degree considered advantageous. Strong knowledge of algorithms and data structures is essential, along with hands-on experience in the MERN stack. Proficiency in developing REST/GraphQL APIs using Node.js with TypeScript and working with AWS services such as EC2, SQS, SES, and Lambda is required. Familiarity with Docker containerized application development is a plus. Moreover, experience working with US clients is a must, and exposure to various projects and business models will be beneficial. Experience in SaaS product development and proficiency in other JS frameworks like Next.js, React Native, and Expo will be advantageous for this role.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
jodhpur, rajasthan
On-site
The ideal candidate for this position should have the following qualifications and experience:

Backend Requirements:
- Possess at least 5 years of experience working with Python.
- Demonstrated hands-on experience with at least one of the following frameworks: Flask, Django, or FastAPI.
- Proficient in utilizing various AWS services, including Lambda, S3, SQS, and CloudFormation.
- Skilled in working with relational databases such as PostgreSQL or MySQL.
- Familiarity with testing frameworks like Pytest or NoseTest.
- Expertise in developing REST APIs and implementing JWT authentication.
- Proficient in using version control tools such as Git.

Frontend Requirements:
- Have a minimum of 3 years of experience with ReactJS.
- Thorough understanding of ReactJS and its core principles.
- Experience in working with state management tools like Redux Thunk, Redux Saga, or Context API.
- Familiarity with RESTful APIs and modern front-end build pipelines and tools.
- Proficient in HTML5, CSS3, and pre-processing platforms like SASS/LESS.
- Experience in implementing modern authorization mechanisms, such as JSON Web Tokens (JWT).
- Knowledge of front-end testing libraries like Cypress, Jest, or React Testing Library.
- Bonus points for experience in developing shared component libraries.

If you meet the above criteria and are looking to work in a dynamic environment where you can utilize your backend and frontend development skills effectively, we encourage you to apply for this position.
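The backend requirements above call for REST APIs with JWT authentication. As a rough, stdlib-only sketch of what an HS256 JWT involves (a real service would use a maintained library such as PyJWT and would also validate registered claims like `exp`):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with the padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode_jwt(payload: dict, secret: str) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str):
    """Return the payload if the signature checks out, else None."""
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return None
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(expected, sig):
        return None
    padded = body + "=" * (-len(body) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(padded))
```

The constant-time `hmac.compare_digest` matters here: a naive `==` comparison can leak signature bytes through timing.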
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Manager at Autodesk, you will lead the BI and Data Engineering Team to develop and implement business intelligence solutions. Your role is crucial in empowering decision-makers through trusted data assets and scalable self-serve analytics. You will oversee the design, development, and maintenance of data pipelines, databases, and BI tools to support data-driven decision-making across the CTS organization. Reporting to the leader of the CTS Business Effectiveness department, you will collaborate with stakeholders to define data requirements and objectives. Your responsibilities will include leading and managing a team of data engineers and BI developers, fostering a collaborative team culture, managing data warehouse plans, ensuring data quality, and delivering impactful dashboards and data visualizations. You will also collaborate with stakeholders to translate technical designs into business-appropriate representations, analyze business needs, and create data tools for analytics and BI teams. Staying up to date with data engineering best practices and technologies is essential to ensure the company remains ahead of the industry. To qualify for this role, you should have 3 to 5 years of experience managing data teams and a BA/BS in Data Science, Computer Science, Statistics, Mathematics, or a related field. Proficiency in Snowflake, Python, SQL, Airflow, Git, and big data environments like Hive, Spark, and Presto is required. Experience with workflow management, data transformation tools, and version control systems is preferred. Additionally, familiarity with Power BI, AWS environment, Salesforce, and remote team collaboration is advantageous. The ideal candidate is a data ninja and leader who can derive insights from disparate datasets, understand Customer Success, tell compelling stories using data, and engage business leaders effectively. At Autodesk, we are committed to creating a culture where everyone can thrive and realize their potential. 
Our values and ways of working help our people succeed, leading to better outcomes for our customers. If you are passionate about shaping the future and making a meaningful impact, join us in our mission to turn innovative ideas into reality. Autodesk offers a competitive compensation package based on experience and location. In addition to base salaries, we provide discretionary annual cash bonuses, commissions, stock grants, and a comprehensive benefits package. If you are interested in a sales career at Autodesk or want to learn more about our commitment to diversity and belonging, please visit our website for more information.
Posted 2 weeks ago
6.0 - 11.0 years
18 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities

JD for Java + React + AWS

Experience: 6 - 10 years

Required Skills: Java, Spring, Spring Boot, React, microservices, JMS, ActiveMQ, Tomcat, Maven, GitHub, Jenkins, Linux/Unix, Oracle and PL/SQL, AWS (EC2, S3, API Gateway, Lambda, Route53, Secrets Manager, CloudWatch)

Nice to have skills:
- Experience rewriting legacy Java applications using Spring Boot & React
- Building serverless applications
- Ocean Shipping domain knowledge
- AWS CodePipeline

Responsibilities:
- Develop and implement front-end and back-end solutions using Java, Spring, Spring Boot, React, microservices, Oracle and PL/SQL, and AWS services.
- Work with business users to define processes and translate them into technical specifications.
- Design and develop user-friendly interfaces and ensure seamless integration between front-end and back-end components.
- Write efficient code following best practices and coding standards.
- Perform thorough testing and debugging of applications to ensure high-quality deliverables.
- Optimize application performance and scalability through performance tuning and code optimization techniques.
- Integrate third-party APIs and services to enhance application functionality.
- Build serverless applications.
- Deploy applications in the AWS environment.
- Perform code reviews.
- Pick up the production support engineer role when needed.
- Demonstrate an excellent grasp of application security concerns and remediation techniques.
- Bring a well-rounded technical background in current web and microservice technologies.
- Serve as an expert resource for architects in developing target architectures, ensuring they can be properly designed and implemented through best practices.
- Work effectively in a fast-paced environment.
- Stay updated with the latest industry trends and emerging technologies to continuously improve skills and knowledge.
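The responsibilities above include building serverless applications behind API Gateway. Although this posting's stack is Java/Spring, the request/response contract of an API Gateway proxy-integrated Lambda is easiest to sketch in a few lines of Python; treat this as illustrative only (the event fields follow the proxy integration convention, and the payload keys are invented):

```python
import json

def handler(event, context=None):
    """Minimal Lambda-style handler: API Gateway's proxy integration
    delivers the request body as a JSON string under event["body"]."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}
    name = body.get("name", "world")
    # The response must echo the same shape back: statusCode + string body.
    return {"statusCode": 200, "body": json.dumps({"message": f"hello {name}"})}
```

Because the handler is a plain function of a dict, it can be unit-tested locally without any AWS infrastructure, which is also how teams typically test serverless code in CI.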
Posted 2 weeks ago
5.0 - 10.0 years
17 - 20 Lacs
Hyderabad, Bengaluru
Work from Office
Job Title: Snowflake Developer
Location: Hyderabad or Bangalore (Hybrid Working)
Experience: 5+ Years

Responsibilities:
- Use data mappings and models provided by the data modeling team to build robust pipelines in Snowflake.
- Design and implement data pipelines with proper 2NF/3NF normalization standards.
- Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
- Create scalable and secure data architecture in Snowflake that supports DQ needs.
- Raise CAB requests through Carrier's change process and manage deployment to production.
- Provide UAT support and transition finalized pipelines to support teams.
- Document all technical artifacts for traceability and handover.
- Collaborate with data modelers, business stakeholders, and governance teams for seamless DQ integration.
- Optimize queries, manage performance tuning, and ensure best practices in data operations.

Requirements:
- Strong hands-on experience with Snowflake.
- Expert-level SQL and experience with data transformation.
- Familiarity with data architecture and normalization techniques (2NF/3NF).
- Experience with cloud-based data platforms and pipeline design.
- Prior experience with AWS data services (e.g., S3, Glue, Lambda, Step Functions) is a strong advantage.
- Experience with ETL tools and working in agile delivery environments.
- Understanding of the Carrier CAB process or similar structured deployment workflows.
- Ability to debug complex issues and optimize pipelines for scalability.
- Strong communication and collaboration skills.
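The posting asks for pipelines built to 2NF/3NF normalization standards. A minimal illustration of what that means in practice, using Python's sqlite3 as a stand-in for Snowflake (table and column names are invented): a denormalized feed where customer attributes repeat on every order row is split so each customer fact is stored exactly once.

```python
import sqlite3

# Denormalized source feed: name and city repeat for every order (violates 3NF,
# since they depend on customer_id, not on the order key).
rows = [
    ("O-1", "C-1", "Asha", "Hyderabad", 250.0),
    ("O-2", "C-1", "Asha", "Hyderabad", 120.0),
    ("O-3", "C-2", "Ravi", "Bangalore", 90.0),
]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (customer_id TEXT PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (
        order_id TEXT PRIMARY KEY,
        customer_id TEXT REFERENCES customer(customer_id),
        amount REAL
    );
""")
for order_id, cust_id, name, city, amount in rows:
    # Each customer is inserted once; orders keep only the foreign key.
    conn.execute("INSERT OR IGNORE INTO customer VALUES (?, ?, ?)", (cust_id, name, city))
    conn.execute("INSERT INTO orders VALUES (?, ?, ?)", (order_id, cust_id, amount))

def customer_count(conn):
    return conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
```

Snowflake's DDL differs in details (warehouses, stages, `COPY INTO`), but the modeling decision, which attributes belong to which key, is the same.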
Posted 2 weeks ago
10.0 - 15.0 years
16 - 27 Lacs
Pune
Work from Office
Dear Candidate,

This is with reference to an opportunity for Lead - AWS Data Engineering professionals. Please find below the job description.

Responsibilities:
- Lead the design and implementation of AWS-based data architectures and pipelines.
- Architect and optimize data solutions using AWS services such as S3, Redshift, Glue, EMR, and Lambda.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in AWS data environments.
- Stay updated on the latest AWS technologies and industry trends.

Key Technical Skills & Responsibilities:
- Experience in design and development of cloud data platforms using AWS services.
- Must have experience designing and developing data lake / data warehouse / data analytics solutions using AWS services like S3, Lake Formation, Glue, Athena, EMR, Lambda, and Redshift.
- Must be aware of AWS access control and data security features like VPC, IAM, Security Groups, and KMS.
- Must be good with Python and PySpark for data pipeline building.
- Must have data modeling experience, including S3 data organization.
- Must have an understanding of Hadoop components, NoSQL databases, graph databases, and time series databases, and the AWS services available for those technologies.
- Must have experience working with structured, semi-structured, and unstructured data.
- Must have experience with streaming data collection and processing; Kafka experience is preferred.
- Experience migrating data warehouse / big data applications to AWS is preferred.
- Must be able to use Gen AI services (like Amazon Q) for productivity gains.

Eligibility Criteria:
- Bachelor's degree in computer science, Data Engineering, or a related field.
- Extensive experience with AWS data services and tools.
- AWS certification (e.g., AWS Certified Data Analytics - Specialty).
- Experience with machine learning and AI integration in AWS environments.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Proven leadership experience in managing technical teams.
- Excellent problem-solving and communication skills.

Skill: Senior Tech Lead - AWS Data Engineering
Location: Pune
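Among the must-haves above is S3 data organization experience. One widely used convention is a Hive-style partitioned key layout, which lets engines like Athena and Glue prune partitions instead of scanning the whole bucket; a small sketch (the `datalake/` prefix and dataset names are made up for illustration):

```python
from datetime import date

def s3_partition_key(dataset: str, run_date: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 key: year=/month=/day= segments
    let query engines prune partitions by date predicate."""
    return (
        f"datalake/{dataset}/"
        f"year={run_date.year}/month={run_date.month:02d}/day={run_date.day:02d}/"
        f"{filename}"
    )
```

A query filtering on `year = 2024 AND month = 3` then only reads objects under those prefixes, which is often the single biggest cost and latency lever in an S3-backed lake.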
Posted 2 weeks ago
6.0 - 11.0 years
0 - 1 Lacs
Bengaluru
Work from Office
Job Requirements

Please find the JD below.

Location: Whitefield, Bangalore
Employment Type: Full-Time
Experience Level: Senior Level (9 to 14 years)
Work Mode: Work from Office

Job Description: We are seeking a highly skilled and experienced Senior Java and AWS Developer to join our dynamic team. The ideal candidate will have a strong background in developing scalable server-side applications and cloud solutions using Java and AWS services.

Responsibilities:
- Design, develop, and maintain scalable microservices using Java and Spring Boot.
- Develop and optimize cloud-based applications on AWS, leveraging services like Lambda, S3, RDS, and EC2.
- Write unit and integration tests to maintain software quality.
- Create and maintain RESTful APIs to support front-end functionality.
- Ensure application performance, scalability, and security.
- Implement best practices for cloud architecture and infrastructure.
- Collaborate with front-end developers, designers, and other stakeholders.
- Write and maintain technical documentation.
- Monitor and optimize application performance.
- Troubleshoot and resolve issues in a timely manner.

Qualifications:
- Bachelor's degree in computer science or a related field.
- Proven experience as a Java Developer.
- Proficiency in Spring Boot and microservices architecture.
- Hands-on experience with AWS services such as EC2, S3, Lambda, and RDS.
- Hands-on experience with Angular.
- Strong understanding of RESTful API design and development.
- Familiarity with containerization technologies like Docker.
- Experience with version control systems, especially GitLab.
- Ability to work collaboratively in a team environment.
- Excellent problem-solving skills and attention to detail.

Skills: Java, Microservices, Spring Boot, Angular, AWS (EC2, S3, Lambda, RDS), RESTful APIs, CFT, Git, JavaScript/TypeScript

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer)
- Experience with serverless architectures
- Knowledge of DevOps practices and CI/CD pipelines
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
ahmedabad, gujarat
On-site
At WT Technologies Pvt. Ltd., we are redefining how the event industry embraces digital transformation. With two pioneering products, WeTales and Wowsly, we offer a seamless blend of creativity and technology to elevate every celebration.

WeTales is our creative powerhouse, specializing in bespoke digital invitations that blend storytelling, animation, 3D design, and personalization to deliver unforgettable first impressions. From save-the-dates to full wedding invite suites, WeTales brings every love story to life visually and emotionally.

Wowsly, our event-tech platform, empowers hosts and planners with cutting-edge digital solutions like QR-based check-ins, live RSVP tracking, automated communication, and real-time guest coordination, making event management smarter, faster, and smoother. Together, WeTales + Wowsly form a complete ecosystem for modern events, offering design to delivery to digital management, crafted with precision, passion, and innovation.

Key Responsibilities and Accountabilities:

Technology Leadership:
- Define the technology vision and strategy for Wowsly.com and Wetales.in.
- Ensure scalability, security, and high performance across all platforms.
- Evaluate and implement cutting-edge technologies to enhance product offerings.

Development & Architecture:
- Oversee front-end development using React.js and ensure seamless user experiences.
- Manage back-end development in PHP Laravel, optimizing database interactions and API performance.
- Architect scalable and secure solutions on AWS, leveraging cloud-native services.

Team Management:
- Lead and mentor a team of developers and engineers, fostering a culture of innovation and collaboration.
- Set development processes, code reviews, and quality standards.
- Recruit, onboard, and retain top technical talent as the team grows.

Operations & Infrastructure:
- Manage AWS infrastructure, ensuring cost optimization, uptime, and reliability.
- Oversee CI/CD pipelines, DevOps processes, and automated testing.
- Handle system monitoring, debugging, and issue resolution.

Collaboration & Stakeholder Management:
- Work with the CEO and product managers to define and prioritize product roadmaps.
- Communicate technical challenges and opportunities to non-technical stakeholders.
- Ensure alignment between technical capabilities and business objectives.

Required Skills & Qualifications:

Technical Expertise:
- Strong experience in front-end development using React.js.
- Proven expertise in back-end development using PHP Laravel.
- Hands-on experience with AWS services, including EC2, S3, RDS, Lambda, and CloudFront.
- Knowledge of database systems like MySQL, PostgreSQL, or MongoDB.
- Proficiency in DevOps tools and processes, including Docker, Kubernetes, Jenkins, etc.
- Understanding of web performance optimization, security protocols, and API integrations.

Leadership & Management:
- 5+ years of experience in technology roles, with at least 3 years in a leadership capacity.
- Excellent team management, mentoring, and delegation skills.
- Ability to align technology initiatives with business goals and product requirements.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal abilities.
- Entrepreneurial mindset with a passion for innovation.

Preferred Qualifications:
- Experience with SaaS platforms and B2B products.
- Familiarity with event-tech solutions and industry trends.
- Prior experience in a start-up or fast-paced work environment.
- Understanding of mobile app development (React Native, Flutter, etc.).
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Senior Backend Developer at Talent500, you will be a key player in building and maintaining robust, scalable backend systems. With 8-12 years of experience, you will lead the entire backend development lifecycle, ensuring high-performing, secure applications that meet user needs. Leveraging your expertise in Python frameworks like Django and Django REST Framework, along with AWS and Kubernetes, you will contribute to the design, implementation, testing, and deployment of backend APIs. Your responsibilities will include designing and developing backend APIs using Django REST Framework, creating efficient database models with MySQL and Elasticsearch, implementing caching strategies using Redis, and utilizing message queues like RabbitMQ and Kafka for asynchronous communication. You will also manage backend tasks and workflows with Celery, contribute to the design and architecture of new features and microservices, and ensure high availability, scalability, and performance of backend systems. In terms of infrastructure and deployment, you will deploy and manage backend applications on AWS using technologies like EC2, ECS, Lambda, and EKS, implement containerization using Docker, and orchestrate deployments with Kubernetes. Security and monitoring will be crucial aspects of your role, as you will implement robust security practices, monitor system health and performance using AWS tools like CloudWatch and CloudTrail, and proactively identify and troubleshoot issues to minimize downtime. Collaboration and communication will be key to your success, as you will collaborate effectively with frontend developers, product managers, and other stakeholders, participate in code reviews and knowledge-sharing sessions, and provide technical guidance and mentorship to junior developers. 
Staying updated with the latest advancements in Python frameworks, AWS services, and Kubernetes technologies will also be essential, as you continuously learn and expand your skillset to adapt to evolving requirements.

Required Skills:
- 8-12 years of experience in backend development
- Proficiency in Python, Django, Django REST Framework, Celery, uWSGI
- Strong understanding of database technologies like MySQL and Elasticsearch
- Experience with caching solutions like Redis and message queues like RabbitMQ
- Familiarity with JavaScript and React.js
- Proven experience with AWS cloud technologies, including EC2, ECS, Lambda, and EKS
- Understanding of containerization and Kubernetes
- Excellent problem-solving and analytical skills
- Strong communication and collaboration skills

Preferred Additional Skills:
- Experience with CI/CD pipelines (e.g., Jenkins, GitLab CI)
- Experience with infrastructure as code (IaC) tools like Terraform or CloudFormation
- Experience with security best practices and compliance standards
- Experience in DevOps methodologies and practices

This role offers a challenging and rewarding opportunity for a senior backend developer passionate about building high-quality, scalable applications. If you have the skills and experience we are looking for, we encourage you to apply and be part of our dynamic team at Talent500.
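The skills above include caching with Redis. The pattern behind that requirement is usually cache-aside: check the cache, fall back to the database on a miss, then populate the cache with a TTL. A self-contained, in-process sketch (a real deployment would call a Redis client such as redis-py rather than this toy class):

```python
import time

class TTLCache:
    """In-process stand-in for a TTL cache normally backed by Redis."""
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy expiry on read
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

def get_profile(cache, db_fetch, user_id):
    """Cache-aside read: hit the cache first, fall back to the DB, backfill."""
    cached = cache.get(user_id)
    if cached is not None:
        return cached
    value = db_fetch(user_id)
    cache.set(user_id, value)
    return value
```

The TTL bounds staleness; the trade-off between freshness and database load is exactly what the interview question behind this bullet usually probes.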
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Hybrid
We're looking for a talented and results-oriented Cloud Solutions Architect to work as a key member of Sureify's engineering team. You'll help build and evolve our next-generation cloud-based compute platform for digitally-delivered life insurance. You'll consider many dimensions such as strategic goals, growth models, opportunity cost, talent, and reliability. You'll collaborate closely with the product development team on platform feature architecture such that the architecture aligns with operational needs and opportunities. With our customer base growing steadily, it's time for us to mature the fabric our software runs on. This is your opportunity to make a large impact at a high-growth enterprise software company.

Key Responsibilities:
- Collaborate with key stakeholders across our product, delivery, data and support teams to design scalable and secure application architectures on AWS using services like EC2, ECS, EKS, Lambda, VPC, RDS, and ElastiCache, provisioned via Terraform
- Design and implement CI/CD pipelines using GitHub, Jenkins, Spinnaker, and Helm to automate application deployment and updates, with a key focus on container management, orchestration, scaling, optimizing performance and resource utilization, and deployment strategies
- Design and implement security best practices for AWS applications, including Identity and Access Management (IAM), encryption, container security, and secure coding practices
- Design and implement application observability using CloudWatch and New Relic, with a key focus on monitoring, logging, and alerting to provide insights into application performance and health
- Design and implement key integrations of application components and external systems, ensuring smooth and efficient data flow
- Diagnose and resolve issues related to application performance, availability and reliability
- Create, maintain and prioritise a quarter-over-quarter backlog by identifying key areas of improvement such as cost optimization, process improvement, security enhancements etc.
- Create and maintain comprehensive documentation outlining the infrastructure design, integrations, deployment processes, and configuration
- Work closely with the DevOps team as a guide, mentor, and enabler to ensure that the practices you design and implement are followed and imbibed by the team

Required Skills:
- Proficiency in AWS services such as EC2, ECS, EKS, S3, RDS, VPC, Lambda, SES, SQS, ElastiCache, Redshift, EFS
- Strong programming skills in languages such as Groovy, Python, and Bash shell scripting
- Experience with CI/CD tools and practices including Jenkins, Spinnaker, ArgoCD
- Familiarity with IaC tools like Terraform or CloudFormation
- Understanding of AWS security best practices, including IAM and KMS
- Familiarity with Agile development practices and methodologies
- Strong analytical skills with the ability to troubleshoot and resolve complex issues
- Proficiency in using observability, monitoring and logging tools like AWS CloudWatch, New Relic, Prometheus
- Knowledge of container orchestration tools and concepts including Kubernetes and Docker
- Strong teamwork and communication skills with the ability to work effectively with cross-functional teams

Nice to have:
- AWS Certified Solutions Architect - Associate or Professional
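One of the observability responsibilities above is alerting on application health. CloudWatch-style alarms typically fire only when a metric breaches a threshold for N consecutive evaluation periods, which suppresses one-off spikes; a minimal sketch of that rule (the threshold and period values are arbitrary examples):

```python
def evaluate_alarm(datapoints, threshold, periods):
    """CloudWatch-style alarm rule: return True once the metric exceeds
    `threshold` for `periods` consecutive datapoints."""
    breached = 0
    for value in datapoints:
        breached = breached + 1 if value > threshold else 0
        if breached >= periods:
            return True
    return False
```

Tuning `periods` is the standard lever against alert fatigue: a single 95% CPU sample is noise, three in a row is a page.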
Posted 2 weeks ago
3.0 - 6.0 years
0 - 3 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Role & responsibilities:
- Implement and manage AIOps platforms for intelligent monitoring, alerting, anomaly detection, and root cause analysis (RCA).
- Possess end-to-end knowledge of vLLM model hosting and inferencing.
- Advanced knowledge of public cloud platforms such as AWS and Azure.
- Build and maintain machine learning pipelines and models for predictive maintenance, anomaly detection, and noise reduction.
- Experience in production support and real-time issue handling.
- Design dashboards and visualizations to provide operational insights to stakeholders.
- Working knowledge of Bedrock, SageMaker, EKS, Lambda, etc.
- 1 to 2 years of experience with Jenkins and GoCD to create build/deploy pipelines.
- Hands-on experience with open-source and self-hosted model APIs using SDKs.
- Drive data-driven decisions by analyzing operational data and generating reports on system health, performance, and availability.
- Basic knowledge of KServe and Ray Serve inferencing.
- Good knowledge of high-level scaling using Karpenter, KEDA, and system-based vertical/horizontal scaling.
- Strong knowledge of the Linux operating system, or Linux certified.
- Previous experience with Helm chart deployments and Terraform template and module creation is highly recommended.

Secondary Responsibilities:
- Proven experience in AIOps and DevOps, with a strong background in cloud technologies (AWS, Azure, Google Cloud).
- Proficiency in tools such as Kubeflow, KServe, ONNX, and containerization technologies (Docker, Kubernetes).
- Experience with enterprise-level infrastructure, including tools like Terraform and Helm, and on-prem server hosting.
- Previous experience in fintech or AI-based tech companies is highly desirable.
- Demonstrates the ability to manage workloads effectively in a production environment.
- Possesses excellent communication and collaboration skills, with a strong focus on cross-functional teamwork.
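The role above centers on anomaly detection and noise reduction. A classic baseline, far simpler than what a production AIOps platform would run, is z-score filtering over a metric series: flag points that sit several standard deviations from the mean.

```python
import statistics

def anomalies(series, z_threshold=2.0):
    """Return indices of points whose z-score exceeds the threshold.
    A toy noise filter; production systems use seasonal/trend-aware models."""
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # flat series: nothing can be anomalous by this measure
    return [i for i, v in enumerate(series) if abs(v - mean) / stdev > z_threshold]
```

The obvious limitation, and the reason real platforms go further, is that a single large spike inflates the stdev it is judged against; robust variants use the median and MAD instead.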
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will be working as a Business Intelligence Engineer III in Pune on a 6-month Contract basis with the TekWissen organization. Your primary responsibility will be to work on Data Engineering on AWS, including designing and implementing scalable data pipelines using AWS services such as S3, AWS Glue, Redshift, and Athena. You will also focus on Data Modeling and Transformation by developing and optimizing dimensional data models to support various business intelligence and analytics use cases. Additionally, you will collaborate with stakeholders to understand reporting and analytics requirements and build interactive dashboards and reports using visualization tools like the client's QuickSight. Your role will also involve implementing data quality checks and monitoring processes to ensure data integrity and reliability. You will be responsible for managing and maintaining the AWS infrastructure required for the data and analytics platform, optimizing performance, cost, and security of the underlying cloud resources. Collaboration with cross-functional teams and sharing knowledge and best practices will be essential for identifying data-driven insights. As a successful candidate, you should have at least 3 years of experience as a Business Intelligence Engineer or Data Engineer, with a strong focus on AWS cloud technologies. Proficiency in designing and implementing data pipelines using AWS services like S3, Glue, Redshift, Athena, and Lambda is mandatory. You should also possess expertise in data modeling, dimensional modeling, data transformation techniques, and experience in deploying business intelligence solutions using tools like QuickSight and Tableau. Strong SQL and Python programming skills are required for data processing and analysis. Knowledge of cloud architecture patterns, security best practices, and cost optimization on AWS is crucial. Excellent communication and collaboration skills are necessary to effectively work with cross-functional teams. 
Hands-on experience with Apache Spark, Airflow, or other big data technologies, as well as familiarity with AWS DevOps practices and tools, agile software development methodologies, and AWS certifications, will be considered preferred skills. The position requires a candidate with a graduate degree. TekWissen Group is an equal opportunity employer supporting workforce diversity.
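The role above calls for data quality checks and monitoring to ensure integrity and reliability. Two of the most common checks in any BI pipeline are nulls in required columns and duplicate keys; a minimal, framework-free sketch (the column names are invented for illustration):

```python
def run_quality_checks(rows, required, unique_key):
    """Minimal DQ gate over a list of dict rows: report null values in
    required columns and duplicates of the unique key, as (row_index, reason)."""
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                failures.append((i, f"null:{col}"))
        key = row.get(unique_key)
        if key in seen:
            failures.append((i, f"duplicate:{unique_key}"))
        seen.add(key)
    return failures
```

In practice the same checks would run as assertions in the pipeline (e.g. a Glue job step or a dbt test) and fail the load before bad rows reach the dashboards.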
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
At our organization, we prioritize people and are dedicated to providing cutting-edge AI solutions with integrity and passion. We are currently seeking a Senior AI Developer who is proficient in AI model development, Python, AWS, and scalable tool-building. In this role, you will play a key part in designing and implementing AI-driven solutions, developing AI-powered tools and frameworks, and integrating them into enterprise environments, including mainframe systems. Your responsibilities will include developing and deploying AI models using Python and AWS for enterprise applications, building scalable AI-powered tools, designing and optimizing machine learning pipelines, implementing NLP and GenAI models, developing Retrieval-Augmented Generation (RAG) systems, maintaining AI frameworks and APIs, architecting cloud-based AI solutions using AWS services, writing high-performance Python code, and ensuring the scalability, security, and performance of AI solutions in production. To qualify for this role, you should have at least 5 years of experience in AI/ML development, expertise in Python and AWS, a strong background in machine learning and deep learning, experience in LLMs, NLP, and RAG systems, hands-on experience in building and deploying AI models, proficiency in cloud-based AI solutions, experience in developing AI-powered tools and frameworks, knowledge of mainframe integration and enterprise AI applications, and strong coding skills with a focus on software development best practices. Preferred qualifications include familiarity with MLOps, CI/CD pipelines, and model monitoring, a background in developing AI-based enterprise tools and automation, and experience with vector databases and AI-powered search technologies. Additionally, you will benefit from health insurance, accident insurance, and a competitive salary based on various factors including location, education, qualifications, experience, technical skills, and business needs. 
You will also be expected to actively participate in monthly team meetings, team-building efforts, technical discussions, and peer reviews, contribute to the OP-Wiki/Knowledge Base, and provide status reports to OP Account Management as required. OP is a technology consulting and solutions company that offers advisory and managed services, innovative platforms, and staffing solutions across various fields such as AI, cybersecurity, and enterprise architecture. Our team is comprised of dynamic, creative thinkers who are dedicated to delivering quality work. As a member of the OP team, you will have access to industry-leading consulting practices, strategies, technologies, innovative training, and education. We are looking for a technology leader with a strong track record of technical excellence and a focus on process and methodology.
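The responsibilities above include developing Retrieval-Augmented Generation (RAG) systems. The retrieval step can be illustrated with a toy word-overlap ranker; a production system would use embeddings and a vector store instead, but the contract is the same: given a query, return the top-k most relevant passages to ground the model's answer.

```python
def retrieve(query, documents, top_k=2):
    """Toy RAG retrieval: rank documents by word overlap with the query.
    Stand-in for embedding similarity search against a vector store."""
    q_terms = set(query.lower().split())
    scored = []
    for doc in documents:
        score = len(q_terms & set(doc.lower().split()))
        scored.append((score, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]
```

The retrieved passages would then be concatenated into the LLM prompt, which is what lets the model answer from enterprise data it was never trained on.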
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Principal Data Engineer (Associate Director) at Fidelity in Bangalore, you will be an integral part of the ISS Data Platform Team. This team plays a crucial role in building and maintaining the platform that supports the ISS business operations. You will have the opportunity to lead a team of senior and junior developers, providing mentorship and guidance, while taking ownership of delivering a subsection of the wider data platform. Your role will involve designing, developing, and maintaining scalable data pipelines and architectures to facilitate data ingestion, integration, and analytics. Collaboration will be a key aspect of your responsibilities as you work closely with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress. Your innovative mindset will drive technical advancements within the department, focusing on enhancing code reusability, quality, and developer productivity. By challenging the status quo and incorporating the latest data engineering practices and techniques, you will contribute to the continuous improvement of the data platform. Your expertise in leveraging cloud-based data platforms, particularly Snowflake and Databricks, will be essential in creating an enterprise lake house. Additionally, your advanced proficiency in the AWS ecosystem and experience with core AWS data services like Lambda, EMR, and S3 will be highly valuable. Experience in designing event-based or streaming data architectures using Kafka, along with strong skills in Python and SQL, will be crucial for success in this role. Furthermore, your role will involve implementing data access controls to ensure data security and performance optimization in compliance with regulatory requirements. Proficiency in CI/CD pipelines for deploying infrastructure and pipelines, experience with RDBMS and NOSQL offerings, and familiarity with orchestration tools like Airflow will be beneficial. 
Your soft skills, including problem-solving, strategic communication, and project management, will be key in leading problem-solving efforts, engaging with stakeholders, and overseeing project lifecycles. By joining our team at Fidelity, you will not only receive a comprehensive benefits package but also support for your wellbeing and professional development. We are committed to creating a flexible work environment that prioritizes work-life balance and motivates you to contribute effectively to our team. To explore more about our work culture and opportunities for growth, visit careers.fidelityinternational.com.,
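The event-based architecture this role calls for can be illustrated with a minimal sketch. The event shape, the `event_id` idempotency key, and the in-memory sink are assumptions for illustration only; a production consumer would read from Kafka and write to the data platform, but the deduplication logic for at-least-once delivery looks the same:

```python
# Illustrative sketch (not Fidelity's actual pipeline): deduplicating
# consumer logic for an event-driven ingest, written against a plain
# iterable so it runs without a Kafka broker.
def consume_events(events, sink, seen=None):
    """Apply at-least-once-delivery dedup before writing to a sink."""
    seen = set() if seen is None else seen
    written = 0
    for event in events:
        key = event["event_id"]          # assumed idempotency key
        if key in seen:
            continue                     # duplicate delivery: skip it
        seen.add(key)
        sink.append(event)
        written += 1
    return written

events = [
    {"event_id": "a1", "value": 10},
    {"event_id": "a2", "value": 20},
    {"event_id": "a1", "value": 10},     # redelivered duplicate
]
sink = []
count = consume_events(events, sink)
```

Tracking seen keys in an external store (rather than an in-process set) is what makes this pattern survive consumer restarts.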
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
kochi, kerala
On-site
As an AWS Cloud Engineer at our company based in Kerala, you will play a crucial role in designing, implementing, and maintaining scalable, secure, and highly available infrastructure solutions on AWS. Your primary responsibility will be to collaborate closely with developers, DevOps engineers, and security teams to support cloud-native applications and business services. Your key responsibilities will include designing, deploying, and maintaining cloud infrastructure using various AWS services such as EC2, S3, RDS, Lambda, and VPC. Additionally, you will be tasked with building and managing CI/CD pipelines, automating infrastructure provisioning using tools like Terraform or AWS CloudFormation, and monitoring and optimizing cloud resources through CloudWatch, CloudTrail, and other third-party tools. Furthermore, you will be responsible for managing user permissions and security policies using IAM, ensuring compliance, implementing backup and disaster recovery plans, troubleshooting infrastructure issues, and responding to incidents promptly. It is essential that you stay updated with AWS best practices and new service releases to enhance our overall cloud infrastructure. To be successful in this role, you should possess a minimum of 3 years of hands-on experience with AWS cloud services, a solid understanding of networking, security, and Linux system administration, as well as experience with DevOps practices and Infrastructure as Code (IaC). Proficiency in scripting languages such as Python and Bash, familiarity with containerization tools like Docker and Kubernetes (EKS preferred), and holding an AWS Certification (e.g., AWS Solutions Architect Associate or higher) would be advantageous. It would be considered a plus if you have experience with multi-account AWS environments, exposure to serverless architecture (Lambda, API Gateway, Step Functions), familiarity with cost optimization, and the Well-Architected Framework. 
Any previous experience in a fast-paced startup or SaaS environment would also be beneficial. Your expertise in AWS CloudFormation, Kubernetes (EKS), AWS services (EC2, S3, RDS, Lambda, VPC), CloudTrail, scripting (Python, Bash), CI/CD pipelines, CloudWatch, Docker, IAM, Terraform, and related cloud services will be invaluable in fulfilling the responsibilities of this role effectively.
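Much of the resource-governance work described above reduces to small scripts against the AWS APIs. The sketch below audits EC2 instances for a required tag; `FakeEC2` stands in for `boto3.client("ec2")` so the logic runs without AWS credentials, and the required tag name is an assumption:

```python
def find_untagged_instances(ec2_client, required_tag="Owner"):
    """Return IDs of instances missing a required tag.

    ec2_client only needs a describe_instances() method, so a real
    boto3 EC2 client or a test stub both work.
    """
    missing = []
    resp = ec2_client.describe_instances()
    for reservation in resp.get("Reservations", []):
        for inst in reservation.get("Instances", []):
            tags = {t["Key"] for t in inst.get("Tags", [])}
            if required_tag not in tags:
                missing.append(inst["InstanceId"])
    return missing

class FakeEC2:
    """Stub returning the describe_instances response shape."""
    def describe_instances(self):
        return {"Reservations": [{"Instances": [
            {"InstanceId": "i-1", "Tags": [{"Key": "Owner", "Value": "ops"}]},
            {"InstanceId": "i-2", "Tags": []},
        ]}]}

untagged = find_untagged_instances(FakeEC2())
```

Injecting the client this way keeps the audit logic unit-testable, which matters once scripts like this run unattended in CI or a scheduled Lambda.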
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
delhi
On-site
As a DevOps Engineer at AuditorsDesk, you will be responsible for designing, deploying, and maintaining AWS infrastructure using Terraform for provisioning and configuration management. Your role will involve implementing and managing EC2 instances, application load balancers, and AWS WAF to ensure the security and efficiency of web applications. Collaborating with development and operations teams, you will integrate security practices throughout the software development lifecycle and automate testing and deployment processes using CI/CD pipelines. You should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5 to 10 years of experience working with AWS services and infrastructure. Proficiency in infrastructure as code (IaC) using Terraform, hands-on experience with load balancers, and knowledge of containerization technologies like Docker and Kubernetes are required. Additionally, familiarity with networking concepts, security protocols, scripting languages for automation, and troubleshooting skills are essential for this role. Preferred qualifications include AWS certifications like AWS Certified Solutions Architect or AWS Certified DevOps Engineer, experience with infrastructure monitoring tools such as Prometheus and knowledge of compliance frameworks like PCI-DSS and GDPR. Excellent communication skills and the ability to collaborate effectively with cross-functional teams are key attributes for success in this position. This is a permanent, on-site position located in Delhi with compensation based on industry standards. If you are a proactive and detail-oriented professional with a passion for ensuring high availability and reliability of systems, we invite you to join our team at AuditorsDesk and contribute to making audit work paperless and efficient.,
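One common CI/CD guardrail for Terraform-managed infrastructure of the kind this role describes is scanning the JSON plan (from `terraform show -json plan.out`) before apply. A minimal Python sketch, assuming the standard `resource_changes` plan format; the sample plan is invented for illustration:

```python
import json

def destructive_changes(plan_json):
    """List resource addresses that a Terraform plan would delete
    (including delete+create replacements)."""
    plan = json.loads(plan_json)
    flagged = []
    for rc in plan.get("resource_changes", []):
        actions = rc.get("change", {}).get("actions", [])
        if "delete" in actions:
            flagged.append(rc["address"])
    return flagged

# Invented sample mirroring the `terraform show -json` structure.
sample_plan = json.dumps({
    "resource_changes": [
        {"address": "aws_instance.web",
         "change": {"actions": ["update"]}},
        {"address": "aws_s3_bucket.logs",
         "change": {"actions": ["delete", "create"]}},
    ]
})
```

A pipeline step can fail the build whenever this returns a non-empty list, forcing a human review before anything is destroyed.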
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
As a Software Developer 2 (FSD), you will be responsible for leading the design and delivery of complex end-to-end features across frontend, backend, and data layers. Your role will involve making strategic architectural decisions, reviewing and approving pull requests, enforcing clean-code guidelines, SOLID principles, and design patterns. Additionally, you will build and maintain shared UI component libraries and backend service frameworks for team reuse. Identifying and eliminating performance bottlenecks in both browser rendering and server throughput will be a crucial part of your responsibilities. You will also be instrumental in instrumenting services with metrics and logging, defining and enforcing comprehensive testing strategies, and owning CI/CD pipelines for automating builds, deployments, and rollback procedures. Ensuring OWASP Top-10 mitigations, WCAG accessibility, and SEO best practices will be key aspects of your role. Furthermore, you will partner with Product, UX, and Ops teams to translate business objectives into technical roadmaps. Facilitating sprint planning, estimation, and retrospectives for predictable deliveries will be part of your routine. Mentoring and guiding SDE-1s and interns, as well as participating in hiring processes, will also be part of your responsibilities. To qualify for this role, you should have at least 3-5 years of experience building production Full stack applications end-to-end with measurable impact. Strong leadership skills in Agile/Scrum environments, proficiency in React (or Angular/Vue), TypeScript, and modern CSS methodologies are required. You should be proficient in Node.js (Express/NestJS) or Python (Django/Flask/FastAPI) or Java (Spring Boot). Expertise in designing RESTful and GraphQL APIs, scalable database schemas, as well as knowledge of MySQL/PostgreSQL indexing, NoSQL databases, and caching are essential. 
Experience with containerization (Docker) and AWS services such as Lambda, EC2, S3, API Gateway is preferred. Skills in unit/integration and E2E testing, frontend profiling, backend tracing, and secure coding practices are also important. Strong communication skills, the ability to convey technical trade-offs to non-technical stakeholders, and experience in providing constructive feedback are assets for this role. In addition to technical skills, we value qualities such as a commitment to delivering high-quality software, collaboration abilities, determination, creative problem-solving, openness to feedback, eagerness to learn and grow, and strong communication skills. This position is based in Hyderabad.,
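As a small illustration of the caching skill mentioned above: in-process memoization is the simplest caching tier, and Python's `functools.lru_cache` provides it in one line, turning the exponential naive Fibonacci into a linear computation (the example values are ours, not the posting's):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n):
    """Naive recursion made fast: repeated subproblems hit the cache."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

result = fib(20)            # 6765
info = fib.cache_info()     # hits/misses counters for the cache
```

The same look-aside idea scales out to Redis or Memcached when the cache must be shared across processes.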
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a minimum of 10 years of experience and be proficient in setting up, configuring, and integrating API gateways in AWS. Your expertise should include API frameworks, XML/JSON, REST, and data protection in software design, build, test, and documentation. It is essential to have practical experience with various AWS services such as Lambda, S3, CDN (CloudFront), SQS, SNS, EventBridge, API Gateway, Glue, and RDS. You must be able to effectively articulate and implement projects using these AWS services to enhance business processes through integration solutions. The job is located in Bangalore, Chennai, Mumbai, Noida, and Pune, and requires an immediate joiner. If you meet the requirements and are looking for an opportunity to contribute your skills in AWS integration and API management, we encourage you to apply for this position. To apply, please fill out the form below with your full name, email, phone number, attach your CV/Resume in .pdf, .doc, or .docx format, and include a cover letter. By submitting this form, you agree to the storage and handling of your data as per the website's privacy policy.,
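For API Gateway work of this kind, the Lambda proxy integration contract is the central detail: the function receives the HTTP request as a JSON event and must return `statusCode`, `headers`, and a string `body`. A minimal sketch (the `name` query parameter is an invented example, not from the posting):

```python
import json

def handler(event, context=None):
    """Minimal API Gateway (REST API, proxy integration) handler.

    Returns the response shape API Gateway expects; body must be a
    string, so JSON payloads are serialized explicitly.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Invoke locally with a fake event; no AWS account required.
resp = handler({"queryStringParameters": {"name": "aws"}})
```

Because the handler is a plain function of a dict, it can be exercised in unit tests long before the gateway is configured.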
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be a valuable member of the data engineering team, contributing to the development of data pipelines, data transformations, and exploring new data patterns through proof of concept initiatives. Your role will also involve optimizing existing data feeds and implementing enhancements to improve data processes. Your primary skills should include a strong understanding of RDBMS concepts, hands-on experience with the AWS Cloud platform and its services such as IAM, EC2, Lambda, RDS, Timestream, and Glue. Additionally, proficiency in data streaming tools like Kafka, hands-on experience with ETL/ELT tools, and familiarity with databases like Snowflake or Postgres are essential. It would be beneficial if you have an understanding of data modeling techniques, as this knowledge would be considered a bonus for this role.,
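The transformation step of such a pipeline is easiest to optimize and test when written as a pure function, before wiring it into Glue jobs or Kafka consumers. The record schema (`device_id`, `reading`) below is an assumption for illustration:

```python
def transform(records):
    """Validate raw rows and aggregate totals per device.

    Malformed rows are dropped here; a real pipeline would route
    them to a dead-letter path instead of silently discarding them.
    """
    totals = {}
    for row in records:
        reading = row.get("reading")
        if "device_id" not in row or not isinstance(reading, (int, float)):
            continue  # invalid row: skip (dead-letter in production)
        totals[row["device_id"]] = totals.get(row["device_id"], 0) + reading
    return totals

raw = [
    {"device_id": "d1", "reading": 5},
    {"device_id": "d1", "reading": 7},
    {"device_id": "d2", "reading": "bad"},   # malformed row
]
result = transform(raw)
```

Keeping validation and aggregation side-effect-free is what makes proof-of-concept pipelines cheap to evolve into production feeds.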
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled .NET Architect with expertise in AWS to join the team in Bangalore. As a .NET Architect, you will be responsible for designing and implementing scalable and secure .NET-based solutions, utilizing AWS cloud services effectively. Your role will involve collaborating with cross-functional teams, evaluating AWS services, and maintaining comprehensive documentation. A strong background in .NET software development, architecture, and AWS services is essential. Your problem-solving skills, communication abilities, and experience in leading software development teams will be crucial for this role. If you have the required experience and expertise and are ready to take on this challenge, we would like to hear from you.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 professionals spanning more than 30 countries, we are fueled by our innate curiosity, entrepreneurial agility, and commitment to creating lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are currently seeking applications for the position of Principal Consultant - Databricks Lead Developer. As a Databricks Developer in this role, you will be tasked with solving cutting-edge real-world problems to meet both functional and non-functional requirements.
Responsibilities:
- Keep abreast of new and emerging technologies and assess their potential application for service offerings and products.
- Collaborate with architects and lead engineers to devise solutions that meet functional and non-functional requirements.
- Demonstrate proficiency in understanding relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess experience in the Data Engineering domain.
Qualifications we are looking for:
Minimum qualifications:
- Bachelor's Degree or equivalency in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- <<>> years of experience in IT.
- Familiarity with new and emerging technologies and their possible applications for service offerings and products.
- Collaboration with architects and lead engineers to develop solutions meeting functional and non-functional requirements.
- Understanding of industry trends and standards.
- Strong analytical and technical problem-solving abilities.
- Proficiency in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.
Preferred qualifications:
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL Endpoint.
- Experience with CI/CD for building Databricks job pipelines.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.
If you are a proactive individual with a passion for innovation and a strong commitment to continuous learning and upskilling, we invite you to apply for this exciting opportunity to join our team at Genpact.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
Join our team at Fortinet, a leading cybersecurity company dedicated to shaping the future of cybersecurity and redefining the intersection of networking and security. We are on a mission to protect people, devices, and data worldwide. Currently, we are looking for a dynamic Staff Software Development Engineer to join our rapidly growing business. As a Staff Software Development Engineer at Fortinet, you will play a crucial role in enhancing and expanding our product capabilities. Your responsibilities will include designing and implementing core services, as well as defining the system architecture. We are seeking a highly motivated individual who excels in a fast-paced environment and can contribute effectively to the team. The ideal candidate will possess a can-do attitude, a passion for technology, extensive development experience, and a quick learning ability.
Your key responsibilities as a Staff Software Development Engineer will include:
- Developing enterprise-grade backend components to improve performance, responsiveness, server-side logic, and the platform
- Demonstrating a strong understanding of technology selection, with well-justified study to support decisions
- Troubleshooting, debugging, and ensuring timely resolution of software defects
- Participating in functional spec, design, and code reviews
- Adhering to standard practices for application code development and maintenance
- Actively working towards reducing technical debt in various codebases
- Creating high-quality, secure, scalable software solutions based on technical requirements specifications and design artifacts within set timeframes and budgets
We are seeking candidates with:
- 8-12 years of experience in Software Engineering
- Proficiency in Python programming and frameworks like Flask/FastAPI
- Solid knowledge of RDBMS (e.g., MySQL, PostgreSQL), MongoDB, queueing systems, and the ES stack
- Experience in developing REST API-based microservices
- Strong grasp of data structures and multi-threading/multi-processing programming
- Experience in building high-performing, distributed, scalable, enterprise-grade applications
- Familiarity with AWS services (ECS, ELB, Lambda, SQS, VPC, EC2, IAM, S3), Docker, and Kubernetes (preferred)
- Excellent problem-solving and troubleshooting skills
- Ability to effectively communicate technical topics to both technical and business audiences
- Self-motivation and the capability to complete tasks with minimal direction
- Experience in cybersecurity engineering is a plus
About Our Team: Our team culture is centered around collaboration, continuous improvement, customer-centricity, innovation, and accountability. These values are ingrained in our ethos and culture, fostering a dynamic and supportive environment that promotes excellence and innovation while prioritizing our customers' needs and satisfaction.
Why Join Us: We welcome candidates from diverse backgrounds and identities to apply. We offer a supportive work environment and a competitive Total Rewards package designed to enhance your overall health and financial well-being. Embark on a challenging, fulfilling, and rewarding career journey with Fortinet. Join us in delivering solutions that have a meaningful and lasting impact on our 660,000+ customers worldwide.
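The multi-threading requirement can be illustrated with the standard producer/consumer pattern from Python's standard library: `queue.Queue` gives thread-safe hand-off, and `None` sentinels shut the workers down cleanly (the pool size and the squaring workload are arbitrary choices for the sketch):

```python
import queue
import threading

def worker(q, results):
    """Consume items until a None sentinel arrives."""
    while True:
        item = q.get()
        if item is None:          # sentinel: this worker shuts down
            q.task_done()
            break
        results.append(item * item)   # stand-in for real work
        q.task_done()

q = queue.Queue()
results = []
threads = [threading.Thread(target=worker, args=(q, results))
           for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):               # producer side
    q.put(n)
for _ in threads:                 # one sentinel per worker
    q.put(None)
q.join()                          # wait until every item is processed
for t in threads:
    t.join()
```

The same shape, with processes instead of threads, is how CPU-bound work sidesteps the GIL via `multiprocessing`.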
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
The ideal candidate for this position should have a Bachelor's or Master's degree in Computer Science or Computer Engineering, or an equivalent field. You should possess at least 2-6 years of experience in server side development using technologies such as GoLang, Node.JS, or Python. You should demonstrate proficiency in working with AWS services like Lambda, DynamoDB, Step Functions, and S3. Additionally, you should have hands-on experience in deploying and managing Serverless service environments. Experience with Docker, containerization, and Kubernetes is also required for this role. A strong background in database technologies including MongoDB and DynamoDB is preferred. You should also have experience with CI/CD pipelines and automation processes. Any experience in Video Transcoding / Streaming on Cloud would be considered a plus. Problem-solving skills are essential for this role as you may encounter various challenges while working on projects.,
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
indore, madhya pradesh
On-site
As a Senior Data Scientist with 5+ years of experience, you will play a crucial role in our team based in Indore/Pune. Your responsibilities will involve designing and implementing models, extracting insights from data, and interpreting complex data structures to facilitate business decision-making. You should have a strong background in Machine Learning areas such as Natural Language Processing, Machine Vision, Time Series, etc. Your expertise should extend to Model Tuning, Model Validation, Supervised and Unsupervised Learning. Additionally, hands-on experience with model development, data preparation, and deployment of models for training and inference is essential. Proficiency in descriptive and inferential statistics, hypothesis testing, and data analysis and exploration are key skills required for this role. You should be adept at developing code that enables reproducible data analysis. Familiarity with AWS services like SageMaker, Lambda, Glue, Step Functions, and EC2 is expected. Knowledge of data science code development and deployment IDEs such as Databricks, the Anaconda distribution, and similar tools is essential. You should also possess expertise in ML algorithms related to time-series, natural language processing, optimization, object detection, topic modeling, clustering, and regression analysis. Your skills should include proficiency in Hive/Impala, Spark, Python, Pandas, Keras, SKLearn, StatsModels, TensorFlow, and PyTorch. Experience with end-to-end model deployment and production for at least 1 year is required. Familiarity with model deployment in the Azure ML platform, Anaconda Enterprise, or AWS SageMaker is preferred. Basic knowledge of deep learning algorithms like Mask R-CNN and YOLO, and of visualization and analytics/reporting tools such as Power BI, Tableau, and Alteryx, would be advantageous for this role.
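Two of the time-series basics implied above, a trailing moving average and a chronological (never shuffled) train/test split, can be sketched in plain Python; the window size and data are illustrative:

```python
def moving_average(series, window):
    """Trailing moving average: returns len(series) - window + 1 points."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

def train_test_split(series, test_frac=0.25):
    """Chronological split: future points must never leak into training."""
    cut = int(len(series) * (1 - test_frac))
    return series[:cut], series[cut:]

series = [10, 12, 14, 16, 18, 20, 22, 24]
smoothed = moving_average(series, window=3)
train, test = train_test_split(series)
```

The split function is the part that differs from ordinary ML workflows: random shuffling, which is standard for i.i.d. data, invalidates time-series backtests.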
Posted 2 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Ahmedabad
Work from Office
As a Senior Platform Engineer, you are expected to design and develop key components that power our platform. You will be building a secure, scalable, and highly performant distributed platform that connects multiple cloud platforms like AWS, Azure, and GCP.
Job Title: Sr. Platform Engineer
Location: Ahmedabad/Pune
Experience: 5+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree
Responsibilities:
- Take full ownership of developing, maintaining, and enhancing specific modules of our cloud management platform, ensuring they meet our standards for scalability, efficiency, and reliability.
- Design and implement serverless applications and event-driven systems that integrate seamlessly with AWS services, driving the platform's innovation forward.
- Work closely with cross-functional teams to conceptualize, design, and implement advanced features and functionalities that align with our business goals.
- Utilize your deep expertise in cloud architecture and software development to provide technical guidance and best practices to the engineering team, enhancing the platform's capabilities.
- Stay ahead of the curve by researching and applying the latest trends and technologies in the cloud industry, incorporating these insights into the development of our platform.
- Solve complex technical issues, providing advanced support and guidance to both internal teams and external stakeholders.
Requirements:
- A minimum of 5 years of relevant experience in platform or application development, with a strong emphasis on Python and AWS cloud services.
- Proven expertise in serverless development and event-driven architecture design, with a track record of developing and shipping high-quality SaaS platforms on AWS.
- Comprehensive understanding of cloud computing concepts, architectural best practices, and AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
- Solid knowledge of object-oriented programming (OOP), SOLID principles, and experience with relational and NoSQL databases.
- Proficiency in developing and integrating RESTful APIs and familiarity with source control systems like Git.
- Exceptional problem-solving skills, capable of optimizing complex systems.
- Excellent communication skills, capable of effectively collaborating with team members and engaging with stakeholders.
- A strong drive for continuous learning and staying updated with industry developments.
Nice to Have:
- AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms such as Azure or GCP.
- Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
Experience:
- 5 years of relevant experience in platform or application development, with a strong emphasis on Python and AWS cloud services.
- 1+ years of experience working on applications built using Serverless architecture.
- 1+ years of hands-on experience with Microservices Architecture in live projects.
- 1+ years of experience applying Domain-Driven Design principles in projects.
- 1+ years of experience working with Event-Driven Architecture in real-world applications.
- 1+ years of experience integrating, consuming, and maintaining AWS services.
- 1+ years of experience working with Boto3 in Python.
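One concrete serverless, event-driven pattern relevant to roles like this is an SQS-triggered Lambda that reports partial batch failures, so only failed messages are redelivered instead of the whole batch. The sketch assumes the `ReportBatchItemFailures` setting on the event source mapping, and the `order_id` business rule is invented for illustration:

```python
import json

def handler(event, context=None):
    """SQS-triggered Lambda reporting partial batch failures.

    Messages listed under batchItemFailures are returned to the queue;
    the rest are deleted automatically.
    """
    failures = []
    for record in event["Records"]:
        try:
            payload = json.loads(record["body"])
            if "order_id" not in payload:   # assumed business rule
                raise ValueError("missing order_id")
        except ValueError:                  # JSONDecodeError subclasses it
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}

# Invoke locally with a fake SQS event; no AWS account required.
event = {"Records": [
    {"messageId": "m1", "body": json.dumps({"order_id": 1})},
    {"messageId": "m2", "body": "not json"},
]}
resp = handler(event)
```

Without this response contract, one bad message would force redelivery (and reprocessing) of every message in its batch.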
Posted 2 weeks ago