0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Overview
Bandgi Technologies is a SaaS product development company that provides niche technology skills and helps organizations innovate. We specialize in Industry 4.0/IIoT and provide solutions to clients in the US, Canada, and Europe. We are Innovation Enablers with offices in India (Hyderabad) and the UK (Maidenhead).

We are looking for an experienced DevOps Engineer with an AWS background to join a growing enterprise organization. You will work within a growing AWS cloud team building on and maintaining its cloud infrastructure. The cloud engineer will split their time between supporting the transition of code from software development through pipelines into a live state, and evolving and maintaining cloud infrastructure alongside project/service introduction activities.

Skill Sets - Must Have
Solid experience (5+ years) in Terraform, shell scripting, VPC creation, DevSecOps, sst.dev, and GitHub Actions is mandatory. Working knowledge of and experience on Linux operating systems, and experience building CI/CD pipelines using the following:
AWS: VPC, Security Groups, IAM, S3, RDS, Lambda, EC2 (Auto Scaling groups, Elastic Beanstalk), CloudFormation, and AWS stacks.
Containers: Docker, Kubernetes, Helm, Terraform.
CI/CD pipelines: GitHub Actions (mandatory).
Databases: SQL and NoSQL (MySQL, Postgres, DynamoDB).
Observability best practices (Prometheus, Grafana, Jaeger, Elasticsearch).

Good To Have
Learning attitude. API gateways. Microservices best practices (including design patterns). Authentication and authorization (OIDC/SAML/OAuth 2.0).

Your Responsibilities Will Include
Operation and control of cloud infrastructure (Docker platform services, network services, and data storage). Preparation of new or changed services. Operation of the change/release process. Application of cloud management tools to automate the provisioning, testing, deployment, and monitoring of cloud components. Designing cloud services and capabilities using appropriate modelling techniques.
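Since GitHub Actions and Terraform are both called out as mandatory, a minimal pull-request workflow of the kind this role would build might look like the following sketch. The workflow name, trigger paths, and `infra/` directory layout are assumptions for illustration, not the company's actual pipeline.

```yaml
name: terraform-ci   # hypothetical workflow name
on:
  pull_request:
    paths: ["infra/**"]   # assumes Terraform code lives under infra/
jobs:
  plan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Format check
        run: terraform -chdir=infra fmt -check
      - name: Validate
        run: |
          terraform -chdir=infra init -backend=false
          terraform -chdir=infra validate
```

In practice a workflow like this would be extended with a `terraform plan` step against a real backend and a gated `apply` on merge.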
Posted 2 weeks ago
6.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities
Design, develop, and optimize large-scale data pipelines using PySpark and Apache Spark. Build scalable and robust ETL workflows leveraging AWS services such as EMR, S3, Lambda, and Glue. Collaborate with data scientists, analysts, and other engineers to gather requirements and deliver clean, well-structured data solutions. Integrate data from various sources, ensuring high data quality, consistency, and reliability. Manage and schedule workflows using Apache Airflow. Work on ML model deployment pipelines using tools like SageMaker and Anaconda. Write efficient and optimized SQL queries for data processing and validation. Develop and maintain technical documentation for data pipelines and architecture. Participate in Agile ceremonies, sprint planning, and code reviews. Troubleshoot and resolve issues in production environments with minimal supervision.

Required Skills And Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 6-8 years of experience in data engineering with a strong focus on Python, PySpark, SQL, and AWS (EMR, EC2, S3, Lambda, Glue). Experience in developing and orchestrating pipelines using Apache Airflow. Familiarity with SageMaker for ML deployment and Anaconda for environment management. Proficiency in working with large datasets and optimizing Spark jobs. Experience in building data lakes and data warehouses on AWS. Strong understanding of data governance, data quality, and data lineage. Excellent documentation and communication skills. Comfortable working in a fast-paced Agile environment. Experience with Kafka or other real-time streaming platforms. Familiarity with DevOps practices and tools (e.g., Terraform, CloudFormation). Exposure to NoSQL databases such as DynamoDB or MongoDB. Knowledge of data security and compliance standards (GDPR, HIPAA).

Work with cutting-edge technologies in a collaborative and innovative environment. Opportunity to influence large-scale data infrastructure. Competitive salary, benefits, and professional development support. Be part of a growing team solving real-world data challenges.
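The posting pairs pipeline work with "efficient and optimized SQL queries for data processing and validation". As a hedged, minimal illustration of that idea (the table and column names are invented), Python's built-in sqlite3 module can stand in for a warehouse engine:

```python
import sqlite3

# In-memory database standing in for a warehouse table (names are invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 120.0, "south"), (2, None, "south"), (3, 80.0, "north")],
)

# Validation query: count rows failing a NOT NULL expectation on amount.
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL"
).fetchone()[0]

# Aggregation query: total amount per region (SUM skips NULLs).
totals = dict(
    conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
    ).fetchall()
)
```

The same pattern scales up in Glue or EMR: a cheap count query expressing a data-quality expectation, run before the aggregation that downstream consumers see.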
Posted 2 weeks ago
15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We are seeking a Senior Full Stack Engineer with deep expertise in modern JavaScript ecosystems and cloud architecture. You'll be working on complex application modernization initiatives, focusing on transforming legacy systems into scalable, cloud-native applications.

Core Technical Stack
Frontend: React.js (with Hooks, Context API), Next.js 14+, Redux/RTK, TypeScript, Tailwind CSS, Material-UI/Chakra UI
Backend: Node.js, NestJS, Express.js, GraphQL (Apollo Server), WebSocket
Cloud & Infrastructure
AWS Services: ECS, Lambda, API Gateway, S3, CloudFront, RDS, DynamoDB, SQS/SNS, ElastiCache
Infrastructure as Code: Terraform, CloudFormation
Containerization: Docker, Kubernetes, ECS
Databases & Caching: MongoDB, PostgreSQL, Redis, Elasticsearch
Authentication & Security: OAuth 2.0/OIDC, JWT, AWS Cognito, SAML 2.0
Testing & Quality: Jest, React Testing Library, Cypress
CI/CD & Monitoring: GitHub Actions, Jenkins, AWS CloudWatch, DataDog

Key Technical Responsibilities
System Architecture & Development (70%): Design and implement microservices architectures using Node.js/NestJS, focusing on scalability and performance. Build reusable component libraries and establish frontend architecture patterns using React.js and Next.js. Implement real-time features using WebSocket/Socket.io for live data updates and notifications. Design and optimize database schemas, write complex queries, and implement caching strategies. Develop CI/CD pipelines with automated testing, deployment, and monitoring. Create and maintain infrastructure as code using Terraform and CloudFormation. Implement security best practices and compliance requirements (SOC 2, GDPR).

Examples Of Current Projects
Modernizing a monolithic PHP application into microservices using NestJS and React. Implementing event-driven architecture using AWS EventBridge and SQS. Building a real-time analytics dashboard using WebSocket and time-series databases. Optimizing application performance through caching strategies and CDN implementation. Developing custom hooks and components for shared functionality across applications.

Technical Leadership (30%): Conduct code reviews and provide technical mentorship. Contribute to technical decision-making and architecture discussions. Document technical designs and maintain development standards. Collaborate with product teams to define technical requirements. Guide junior developers through complex technical challenges.

Required Technical Experience
Expert-level proficiency in JavaScript/TypeScript and full-stack development. Deep understanding of React.js internals, hooks, and performance optimization. Extensive experience with Node.js backend development and microservices. Strong background in cloud architecture and AWS services. Hands-on experience with container orchestration and infrastructure automation. Proven track record of implementing authentication and authorization systems. Experience with monitoring, logging, and observability tools.

Preferred Qualifications
Technical Expertise: Advanced degree in Computer Science, Engineering, or a related field. Experience with cloud-native development and distributed systems patterns. Proficiency in additional programming languages (Rust, Go, Python). Deep understanding of browser internals and web performance optimization. Experience with streaming data processing and real-time analytics.
Architecture & System Design: Experience designing event-driven architectures at scale. Knowledge of DDD (Domain-Driven Design) principles. Background in implementing CQRS and Event Sourcing patterns. Experience with high-throughput, low-latency systems. Understanding of distributed caching strategies and implementation.
Cloud & DevOps: AWS Professional certifications (Solutions Architect, DevOps). Experience with multi-region deployments and disaster recovery. Knowledge of service mesh implementations (Istio, Linkerd). Familiarity with GitOps practices and tools (ArgoCD, Flux). Experience with chaos engineering practices.
Security & Compliance: Understanding of OWASP security principles. Experience with PCI-DSS compliance requirements. Knowledge of cryptography and secure communication protocols. Background in implementing Zero Trust architectures. Experience with security automation and DevSecOps practices.
Development & Testing: Experience with TDD/BDD methodologies. Knowledge of performance testing tools (k6, JMeter). Background in implementing continuous testing strategies. Experience with contract testing (Pact, Spring Cloud Contract). Familiarity with mutation testing concepts.

About Us
TechAhead is a global digital transformation company with a strong presence in the USA and India. We specialize in AI-first product design thinking and bespoke development solutions. With over 15 years of proven expertise, we have partnered with Fortune 500 companies and leading global brands to drive digital innovation and deliver excellence. At TechAhead, we are committed to continuous learning, growth, and crafting tailored solutions that meet the unique needs of our clients. Join us to shape the future of digital innovation worldwide and drive impactful results with cutting-edge AI tools and strategies!
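Because the role covers JWT alongside OAuth 2.0/OIDC, here is a minimal, hedged sketch of HS256 JWT signing and verification using only the standard library. In production a vetted JWT library would be used instead; the secret and claims below are invented for illustration.

```python
import base64
import hashlib
import hmac
import json

def _b64url_encode(data: bytes) -> str:
    # JWT uses unpadded base64url segments.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def _b64url_decode(segment: str) -> bytes:
    # Restore the stripped padding before decoding.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def sign_hs256(claims: dict, secret: bytes) -> str:
    """Build a header.payload.signature token signed with HMAC-SHA256."""
    header = _b64url_encode(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url_encode(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}"
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url_encode(sig)}"

def verify_hs256(token: str, secret: bytes) -> dict:
    """Verify the signature and return the claims, or raise ValueError."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(
        secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256
    ).digest()
    # Constant-time comparison avoids timing side channels.
    if not hmac.compare_digest(expected, _b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(_b64url_decode(payload_b64))
```

A quick round trip: `verify_hs256(sign_hs256({"sub": "u1"}, b"k"), b"k")` returns `{"sub": "u1"}`, while verifying with the wrong secret raises `ValueError`.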
Posted 2 weeks ago
2.0 years
0 Lacs
Karnataka, India
On-site
Who You’ll Work With
You will be a part of the larger Global Technology organization working on Nike’s internal product creation tools and report to the team’s Engineering Leadership. You will work day-to-day with a team of engineers, the team’s Product Manager, and Principal Engineers in the organization on software projects to achieve Nike’s business objectives. You will also engage with other Global Technology functions and teams on organizational and technical goals.

Who We Are Looking For
We’re looking for a Software Engineer to solve complex software engineering problems supporting Nike’s pursuit of delivering state-of-the-art tools to our Consumer Product & Innovation community. The candidate needs to be highly collaborative with peers, productive in a fast-paced development environment, and have deep native cloud software engineering experience.

What You’ll Work On
You will be part of a team of engineers building out tooling for our internal Consumer Product & Innovation team members. We are investing in building modular, configurable, and “API-First” capabilities which will be consumed by modern web applications built with the most recent SPA frameworks.

Bachelor’s degree in Computer Science or Engineering, Information Systems, or a similar field, or relevant professional experience in lieu of a degree. 2+ years of hands-on industry software development experience. Experience with front-end frameworks like React, Angular, or Vue.js. Experience with cloud architecture, modern DevOps, infrastructure as code, CI/CD, and related tools. Experience with AWS products including Lambda, Step Functions, DynamoDB, Elasticsearch, and S3. Modern testing methodologies and frameworks such as Mocha, Jasmine, and Jest. Strong understanding of architectural design patterns and computer-science fundamentals. Experience with implementing and integrating AI, Machine Learning, and related data solutions preferred. Hands-on experience implementing and supporting modern software architectural principles and patterns (REST, domain-driven design, microservices, etc.). Excellent verbal and written communication skills. Demonstrated ability to build and maintain relationships with multiple peers and cross-functional partners.
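For the "API-First" serverless style this posting describes (Lambda behind an API), a handler might look like the following minimal sketch. The event shape follows the common API Gateway proxy convention; the route and field names are assumptions, not Nike's actual code.

```python
import json

def handler(event: dict, context: object = None) -> dict:
    """Hypothetical Lambda-style handler for GET /products/{id}."""
    product_id = (event.get("pathParameters") or {}).get("id")
    if not product_id:
        # Proxy integrations expect a statusCode and a JSON string body.
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}
    # A real implementation would fetch the record from DynamoDB here;
    # this stub simply echoes the id back.
    return {"statusCode": 200, "body": json.dumps({"id": product_id})}
```

Keeping the handler a pure function of its event makes it straightforward to unit-test with Jest-style assertions, which matches the testing expectations listed above.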
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions. Deliver on critical business priorities while ensuring alignment with the wider architectural vision. Identify and help address potential risks in the data supply chain. Follow and contribute to technical standards. Design and develop analytical data models.

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course). 9 to 11 years’ experience implementing data-intensive solutions using agile methodologies. Experience of relational databases and using SQL for data querying, transformation, and manipulation. Experience of modelling data for analytical consumers. Ability to automate and streamline the build, test, and deployment of data pipelines. Experience in cloud-native technologies and patterns. A passion for learning new technologies and a desire for personal growth, through self-study, formal classes, or on-the-job training. Excellent communication and problem-solving skills. An inclination to mentor; an ability to lead and deliver medium-sized components independently.

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines, and proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica.
Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
Data Warehousing & Database Management: Expertise in data warehousing concepts, and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, and automated quality control management.
Data Governance: A strong grasp of principles and practice, including data quality, security, privacy, and compliance.

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, and Continuous>Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Experience of using a job scheduler, e.g., Autosys. Exposure to Business Intelligence tools, e.g., Tableau and Power BI.
Certification on any one or more of the above topics would be an advantage.
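To make the "Data Quality & Controls" bullet concrete, here is a minimal, hedged sketch of row-level validation checks in plain Python. The field names and rules are invented; platforms like Ab Initio or Spark express the same idea declaratively at scale.

```python
# Sample records standing in for a staged warehouse feed (names invented).
ROWS = [
    {"id": 1, "country": "IN", "balance": 100.5},
    {"id": 2, "country": "",   "balance": -3.0},
    {"id": 1, "country": "US", "balance": 42.0},   # duplicate id
]

def run_checks(rows):
    """Return (row id, rule name) pairs for every failed data-quality rule."""
    issues = []
    seen_ids = set()
    for row in rows:
        if row["id"] in seen_ids:
            issues.append((row["id"], "duplicate_id"))
        seen_ids.add(row["id"])
        if not row["country"]:
            issues.append((row["id"], "missing_country"))
        if row["balance"] < 0:
            issues.append((row["id"], "negative_balance"))
    return issues
```

Emitting issues as data rather than failing fast lets a pipeline route rejects to a quarantine table and keep an audit trail, which fits the governance emphasis above.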
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
5.0 years
0 Lacs
Karnataka, India
On-site
Who You’ll Work With
You will be a part of the larger Global Technology organization working on Nike’s internal product creation tools and report to the team’s Engineering Manager. You will work day-to-day with a team of engineers, the team’s Product Manager, and Principal Engineers in the organization on software projects to achieve Nike’s business objectives. You will also engage with other Global Technology functions and teams on organizational and technical goals.

Who We Are Looking For
We’re looking for a Software Engineer III to solve complex software engineering problems supporting Nike’s pursuit of delivering state-of-the-art tools to our Consumer Product & Innovation community. The candidate needs to be highly collaborative with peers, productive in a fast-paced development environment, and have deep native cloud software engineering experience.

A minimum of 5 years of software development experience. Experience working in a highly collaborative, multi-discipline development team environment. Experience working in a cloud-based environment (e.g., AWS services). Experience in microservice architecture, domain-driven design, and RESTful services using Java. Experience working with Java and the Spring Boot framework (mandatory). Experience in test-driven development with JUnit, Mockito, Cucumber, etc. Experience with monitoring and tracing tools like Splunk, SignalFx, and Kibana/Grafana. Experience working with NoSQL data stores like Cassandra, DynamoDB, and MongoDB. Experience working with relational data stores like MySQL and Oracle. Experience with continuous integration, unit testing, static analysis, and automated integration tests. Continuous delivery experience preferred. Working knowledge of Scrum and agile principles. Comfortable working in a fast-paced, results-oriented environment. Comfortable working with globally distributed and diverse teams. Demonstrated end-to-end ownership of projects/capabilities, including requirement evaluation, design, implementation, observability, and maintenance. Excellent verbal and written communication and collaboration skills to effectively communicate with both business and technical teams. Bachelor’s degree in Computer Science, Information Systems, or another relevant subject area, or equivalent experience.

What You’ll Work On
You will contribute to the development of new features and identify technical areas for improvement, proactively collaborating with product team members at Nike to design, develop, and deploy a highly scalable solution. You will participate in peer code reviews, sharing and receiving constructive feedback to maintain high-quality code standards. You will stay informed about the latest industry trends, bringing new ideas and improvements to the team. You will support the team’s agile practices, promoting continuous improvement and collaboration to deliver top-notch consumer experiences at scale.
Posted 2 weeks ago
5.0 years
0 Lacs
Karnataka, India
On-site
Who You’ll Work With
You will be a part of the larger Global Technology organization working on Nike’s internal product creation tools and report to the team’s Engineering Manager. You will work day-to-day with a team of engineers, the team’s Product Manager, and Principal Engineers in the organization on software projects to achieve Nike’s business objectives. You will also engage with other Global Technology functions and teams on organizational and technical goals.

Who We Are Looking For
We’re looking for a Senior Software Engineer to solve complex software engineering problems supporting Nike’s pursuit of delivering state-of-the-art tools to our product developers and broader creation community. The candidate needs to be highly collaborative with peers, productive in a fast-paced development environment, and have deep native cloud software engineering experience.

What You’ll Work On
You will be part of a team of engineers building out tooling for our internal Innovation & Consumer Creation team members. We are investing in building modular, configurable, and “API-First” capabilities which will be consumed by modern web applications built with the most recent SPA frameworks.

Bachelor’s degree in Computer Science or Engineering, Information Systems, or a similar field, or relevant professional experience in lieu of a degree. 5+ years of hands-on industry software development experience. Demonstrated expertise in Node.js, Python, or similar languages. Experience with cloud architecture, infrastructure as code, CI/CD, and related tools. Experience with AWS products including Lambda, Step Functions, DynamoDB, Elasticsearch, and S3. Modern testing methodologies and frameworks such as Mocha, Jasmine, and Jest. Strong understanding of architectural design patterns and computer-science fundamentals. Experience with HTML, CSS, SASS, and JavaScript, including HTML5 (Flexbox, semantic markup, accessibility), CSS2 (cross-browser compliance), and concepts such as OOCSS, BEM, and ATOMIC. Experience with implementing and integrating AI, Machine Learning, and related data solutions preferred. Experience with browser APIs, including a working knowledge of event lifecycles and performance optimization. 2+ years of hands-on experience with technologies such as Vue.js, Vuetify.js, Vuex, React (experience with other frameworks such as Angular is a plus), Redux, template technologies, microservices, and REST web services.
Posted 2 weeks ago
5.0 years
0 Lacs
India
Remote
Now Hiring: Senior Data Engineer (Database Specialist)
Location: Remote - India (candidates in India only, please)
Job Type: Full-Time

Role Description
This is a full-time remote role for a Data Engineer (Database Specialist). The Data Engineer will be responsible for designing and implementing data models, developing and maintaining data pipelines, and extracting, transforming, and loading (ETL) data. In addition, the role includes designing and managing data warehouses and performing data analytics to support business decisions.

Key Responsibilities
Design, optimize, and manage databases that support large-scale data. Develop reliable ETL/ELT pipelines for transforming structured and unstructured device data. Ensure data consistency, traceability, and performance across systems. Work with time-series and event-based data. Collaborate with QA, engineering, and product teams to integrate analytics into the product. Implement data quality checks, audit trails, and compliance safeguards. Support cloud data infrastructure in AWS, with tools like Redshift, RDS, and S3.

Must-Have Skills
5+ years in data engineering or backend-focused development. Strong command of SQL and schema design (PostgreSQL, MySQL). Hands-on experience with NoSQL systems (MongoDB, DynamoDB). Python-based ETL scripting (Pandas, PySpark preferred). Experience with cloud-based data platforms (AWS). Familiarity with SaaS application data workflows. Understanding of security, privacy, and data compliance practices.

Nice to Have
Certifications in AWS data engineering.

Apply now or refer someone in your network! Drop your resume at Rikhi@Sachhsoft.com
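Since the role highlights time-series and event-based device data, here is a minimal, hedged sketch of windowed aggregation in plain Python: events are floored to fixed 5-minute windows and averaged per window. The window size and event shape are assumptions for illustration; in production this would typically run in SQL or PySpark.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def bucket_events(events, window=timedelta(minutes=5)):
    """Average (timestamp, value) readings per fixed-size time window."""
    buckets = defaultdict(list)
    for ts, value in events:
        # Floor the timestamp to the start of its window: integer-divide the
        # offset from a fixed epoch by the window size, then scale back up.
        start = datetime.min + ((ts - datetime.min) // window) * window
        buckets[start].append(value)
    return {start: sum(vals) / len(vals) for start, vals in buckets.items()}

# Invented sample readings: two fall in the 12:00 window, one in 12:05.
readings = [
    (datetime(2024, 1, 1, 12, 1), 10.0),
    (datetime(2024, 1, 1, 12, 4), 20.0),
    (datetime(2024, 1, 1, 12, 7), 30.0),
]
summary = bucket_events(readings)
```

Flooring to a shared epoch keeps window boundaries stable across batches, which matters for the consistency and traceability goals listed above.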
Posted 2 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Mantas Scenario Developer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions. Deliver on critical business priorities while ensuring alignment with the wider architectural vision. Identify and help address potential risks in the data supply chain. Follow and contribute to technical standards. Design and develop analytical data models.

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course). 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies. Experience of relational databases and using SQL for data querying, transformation, and manipulation. Experience of modelling data for analytical consumers. Hands-on Mantas (Oracle FCCM) expertise throughout the full development life cycle, including requirements analysis, functional design, technical design, programming, testing, documentation, implementation, and ongoing technical support. Ability to translate business needs (BRD) into effective technical solutions and documents (FRD/TSD). Ability to automate and streamline the build, test, and deployment of data pipelines. Experience in cloud-native technologies and patterns. A passion for learning new technologies and a desire for personal growth, through self-study, formal classes, or on-the-job training. Excellent communication and problem-solving skills.

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines, and proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica.
Mantas: Expert in Oracle Mantas/FCCM, Scenario Manager, and scenario development; thorough knowledge and hands-on experience in Mantas FSDM, DIS, and Batch Scenario Manager.
Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
Data Warehousing & Database Management: Understanding of data warehousing concepts, and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, and automated quality control management.

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, and Continuous>Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Basics of job schedulers like Autosys. Basics of entitlement management.
Certification on any of the above topics would be an advantage.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data driven decision making. A Mantas Scenario Developer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology (4-year graduate course) 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Hands on Mantas ( Oracle FCCM ) expert throughout the full development life cycle, including: requirements analysis, functional design, technical design, programming, testing, documentation, implementation, and on-going technical support Translate business needs (BRD) into effective technical solutions and documents (FRD/TSD) Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills T echnical Skills (Must Have) ETL: Hands on experience of building data pipelines. 
Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Mantas: Expert in Oracle Mantas/FCCM, Scenario Manager, Scenario Development, thorough knowledge and hands on experience in Mantas FSDM, DIS, Batch Scenario Manager Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstratable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure in working on Event/File/Table Formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Basics of Job scheduler like Autosys. Basics of Entitlement management Certification on any of the above topics would be an advantage. 
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
3.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are looking for a hands-on Data Engineer who is passionate about solving business problems through innovation and engineering practices. The candidate will leverage deep technical knowledge and apply knowledge of data architecture standards, data warehousing, data structures, and business intelligence to drive the creation of high-quality data products for data-driven decision making.

Required Qualifications
3-6 years’ experience implementing data-intensive solutions using agile methodologies
Code-contributing member of Agile teams, working to deliver sprint goals
Writes clean, efficient, and maintainable code that meets the highest standards of quality
Very strong coding skills in Python/PySpark and UNIX shell scripting
Experience in cloud-native technologies and patterns
Ability to automate and streamline the build, test and deployment of data pipelines

Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines. Proficiency in data integration platforms such as Apache Spark
Experienced in writing PySpark code to handle large data sets and perform data transformations; familiarity with PySpark integration with other Apache Spark components, such as Spark SQL; understanding of PySpark optimization techniques
Strong proficiency in working with relational databases and using SQL for data querying, transformation, and manipulation.
Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Iceberg for data storage and processing
Data Warehousing & Database Management: Understanding of data warehousing concepts, and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, PySpark or UNIX shell scripting
DevOps: Exposure to concepts and enablers - CI/CD platforms, Bitbucket/GitHub, JIRA, Jenkins, Tekton, Harness

Technical Skills (Valuable)
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls; framework libraries like Deequ
Federated Query: Starburst, Trino
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes and OpenShift
File Formats: Exposure to working with file/table formats such as Avro, Parquet, Iceberg and Delta
Schedulers: Basics of job schedulers like Autosys and Airflow
Cloud: Experience in cloud-native technologies and patterns (AWS, Google Cloud)
Nice to have: Java, for REST API development

Other Skills
Strong project management and organizational skills. Excellent problem-solving, communication, and organizational skills. Proven ability to work independently and with a team. Experience in managing and implementing successful projects. Ability to adjust priorities quickly as circumstances dictate. Consistently demonstrates clear and concise written and verbal communication.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above.
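The data-quality exposure called for above (validation, cleansing, controls; framework libraries like Deequ) can be sketched framework-free in plain Python. This is not Deequ's API, only an illustration of completeness/uniqueness-style checks over hypothetical records.

```python
# Hypothetical records standing in for a batch awaiting data-quality controls.
records = [
    {"id": 1, "email": "a@x.com", "amount": 10.5},
    {"id": 2, "email": None,      "amount": 99.0},
    {"id": 2, "email": "c@x.com", "amount": -1.0},
]

def completeness(rows, field):
    """Fraction of rows where `field` is not null."""
    return sum(r[field] is not None for r in rows) / len(rows)

def uniqueness(rows, field):
    """Fraction of distinct values over total rows for `field`."""
    return len({r[field] for r in rows}) / len(rows)

checks = {
    "email_completeness": completeness(records, "email"),            # 2/3
    "id_uniqueness": uniqueness(records, "id"),                      # 2/3
    "amount_non_negative": all(r["amount"] >= 0 for r in records),   # False
}
print(checks)
```

Libraries such as Deequ express the same idea as declarative constraints evaluated over a Spark DataFrame; the metrics themselves are the same.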
Posted 2 weeks ago
3.0 years
0 Lacs
Greater Ahmedabad Area
On-site
Company Profile

Infocusp Innovations is an IT firm working in the broad fields of Machine Learning, Artificial Intelligence (AI), Computer Science, Software Engineering, Mobile and Web App Development, QA, and Signal Processing. With a global presence in India and the United States of America, and offices in Ahmedabad, Pune and California, we make businesses smart and systems smarter to make people's lives easier. Our teams work closely with the research and product engineering teams to develop next-gen robust solutions that improve the quality of life of users and enhance the user experience. We have worked on high-impact projects in the domains of Healthcare, FinTech, IoT, Law and Humanity, AgroTech and Horticulture, Molecular Chemistry, GeoPhysics, Biology, Energy, Logistics, Recruitment and Gaming. Transparent communication and healthy, constructive discussions, along with an open and welcoming work culture, are what we prefer at Infocusp. We make conscious decisions to facilitate our work with favorable working conditions and amenities that enable our employees to give their best.

About the Role

A Senior Software Engineer is a highly proficient professional who excels in designing, developing, and maintaining complex software systems. They demonstrate expertise in multiple programming languages, possess a deep understanding of software architecture, and often lead significant projects or mentor junior engineers. This role involves making critical technical decisions, optimizing software performance, resolving intricate challenges, and contributing to innovative solutions. Senior Software Engineers play a pivotal role in shaping software strategies, driving technical excellence, and staying current with industry trends to deliver robust and advanced software applications.

Responsibilities

Own the design, development, evaluation, and deployment of highly scalable software products involving front-end and back-end development.
Maintain quality, responsiveness, and stability of the system. Evaluate and make decisions on the use of new tools and technologies. Design and develop memory-efficient, compute-optimized solutions for the software. Delegate tasks and mentor junior engineers. Prioritize and distribute tasks amongst the team members. Design and administer automated testing tools and continuous integration tools. Produce comprehensive and usable software documentation. Follow secure development, testing, and deployment guidelines and practices in order to adhere to the overall security of the system under consideration.

Requirements

B.E./B.Tech/M.E./M.S./M.Tech/PhD candidates with significant prior experience in the fields above will be considered. 3+ years of relevant industry experience with TypeScript or Python. Mastery of one or more back-end programming languages (JavaScript, Java, Scala, C++, etc.). Experience in using cloud services (AWS, GCP, Microsoft Azure) and distributed systems (Hadoop, Spark, Beam). Deep understanding of various relational (SQL, PostgreSQL), non-relational (Mongo, DynamoDB, Cassandra) and time-series (InfluxDB) database systems. Knowledge of automated and continuous integration testing tools (Jenkins, TeamCity, CircleCI, etc.). Good to have: experience in front-end programming paradigms and libraries (for example, advanced JavaScript libraries and frameworks such as Angular and React). Ability to plan and design software system architecture. Proven experience in platform development for large-scale systems.

Location: Ahmedabad/Pune

Contact us to apply: If you would like to apply for this role, send your resume to careers@infocusp.com.
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
Tamil Nadu, India
Remote
Job Title: Senior Software Engineer
Location: Remote, must be based within Tamil Nadu
Employment Type: Full-time

About Profice: Profice is a forward-thinking software development company specializing in custom solutions, data management, and AI-powered HR & Payroll innovations. We empower businesses with cutting-edge technology, seamless automation, and a culture of agility, flexibility, and continuous learning. Learn more about us at www.profice.co.uk.

About this role: We are seeking a talented .NET full stack engineer with 4-6 years of experience in developing software applications, and expertise in .NET full stack development and the AWS Cloud platform. The candidate should have excellent communication skills and a passion for working in a start-up culture in the UK time zone. It is a full-time, remote-based role, with preference given to those who live within Tamil Nadu state.

What you’ll do:
Business & Solution Development: Understand business requirements and translate them into technical solutions. Collaborate with the Tech Lead / Architect to design solutions when needed. Develop and perform unit testing to ensure quality and reliability.
Issue Analysis & Resolution: Investigate production issues, directly engaging with client users for clarity. Deliver timely fixes based on the severity and impact of the issue.
Code Review & Technical Guidance: Review pull requests (PRs) from junior developers on GitHub. Provide technical assistance on items that do not require Tech Lead / Architect involvement.
Data Automation & Optimization: Work extensively with HR & Payroll data. Identify opportunities to automate data reconciliation and synchronization using appropriate tools.

What you’ll need: A software professional with 4-6 years of experience in software development using Microsoft technologies such as .NET Core, JavaScript / TypeScript, background services and REST APIs.

Must-Have Skills:
Strong Coding Foundations: Solid understanding of coding fundamentals, data structures, and algorithms.
Backend Development: Hands-on experience with .NET Core, RESTful APIs, and Entity Framework. Proficiency in unit and integration testing.
Frontend Development: Experience with Angular 17 or an equivalent front-end framework.
Cloud & Troubleshooting: Hands-on experience with AWS serverless services such as API Gateway, Lambda, SQS, CloudWatch, etc. Strong troubleshooting skills in cloud environments.
Software Design & Best Practices: Excellent understanding of design principles and patterns. Ability to design solutions with minimal assistance from architects. Strong grasp of coding standards and version control using GitHub.
Database Expertise: Ability to write complex SQL queries in PostgreSQL or an equivalent database. Good understanding of NoSQL databases like MongoDB, DynamoDB, etc.
Cloud Platform Knowledge: Good understanding of AWS or similar cloud platforms.
Communication: Fluent in English, both written and verbal.

Other Desirable Skills:
Data Processing & Automation: Experience with data validations, transformations, and synchronizations using Power Automate, Python, Gen AI, etc.
DevOps & CI/CD: Familiarity with SAM, Jenkins, or equivalent CI/CD tools. Experience working with cron jobs for scheduling tasks.
Performance Optimization: Experience in identifying and resolving software performance issues.

Who you are:
Strong Communicator: Fluent in English (mandatory). Spanish proficiency is a plus.
Proactive & Adaptable: Comfortable working independently as well as collaborating in a team. Thrives in a fast-paced, agile environment.
Startup Mindset: Adaptable and comfortable with a start-up culture, taking ownership and initiative.
Location & Availability: Must work in the UK time zone. Must reside within Tamil Nadu, India.

Who are we? Here at Profice, we build intelligent, scalable, and people-centric software solutions that simplify the management of workforces in businesses.
Rooted in our passion and interest for both technology and HR transformation, our mission is to bridge the gap between raw HR data and operational efficiency. With extensive expertise across software development, data integration, and global HR system rollouts, we empower and inspire organizations to make their people data work smarter.

What makes Profice a great place to work? At Profice, we strongly believe in work-life balance and the continuous growth of our employees. Our employees are entirely remote-based, focusing on delivering high-quality results on time without compromising standards. Our agile culture fosters open communication, eliminating rigid hierarchies and enabling direct access to leadership. We provide abundant opportunities to learn, innovate, and thrive in a dynamic environment.

What we offer: 24 days paid leave per calendar year. Indian public holidays as applicable. Medical insurance. Yearly bonus based on performance.
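The HR & Payroll data-reconciliation work mentioned in this role can be sketched in plain Python: compare two exports keyed by employee id and report what is missing or mismatched on each side. The keys, values, and system names here are purely illustrative assumptions.

```python
# Hypothetical exports: HR system vs. payroll, keyed by employee id.
hr_system = {"E1": 50000, "E2": 62000, "E3": 48000}
payroll   = {"E1": 50000, "E2": 60000, "E4": 55000}

def reconcile(source, target):
    """Return keys missing from each side and keys whose values differ."""
    missing_in_target = sorted(source.keys() - target.keys())
    missing_in_source = sorted(target.keys() - source.keys())
    mismatched = sorted(
        k for k in source.keys() & target.keys() if source[k] != target[k]
    )
    return missing_in_target, missing_in_source, mismatched

print(reconcile(hr_system, payroll))  # (['E3'], ['E4'], ['E2'])
```

In practice the same set-difference logic would run over database extracts or API payloads, with the mismatch report feeding a synchronization job.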
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description & Requirements Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen. The Server Engineer will report to the Technical Director. Responsibilities Design, develop, and run a fast, scalable, highly available game service all the way from conception to delivery to live service operations Work with designers, client engineering, and production teams to achieve gameplay goals Implement security best practices and original techniques to keep user data secure and prevent cheating Create and run automated testing, readiness testing, and deployment plans Monitor the performance and costs of the server infrastructure to improve our game Design and implement data transformation layers using Java/Spring/AWS/Protobuf Collaborate with game server and web frontend teams to define API contracts Manage Release Ops / Live Ops of web services Qualifications We encourage you to apply if you can meet most of the requirements and are comfortable opening a dialogue to be considered. 
4+ years of experience developing scalable back-end services
BS degree in Computer Science or equivalent work experience
Proficiency in PHP and Java
Experience with cloud services like Amazon Web Services or Google Cloud Platform
Experience with Redis
Experience with database design and usage of large datasets in both relational (MySQL, Postgres) and NoSQL (Couchbase, DynamoDB) environments
Experience defining API contracts and collaborating with cross-functional teams

Bonus
3+ years of experience developing games using cloud services like AWS, Azure, Google Cloud Platform, or similar
Proficient in technical planning, solution research, proposal, and implementation
Background in using metrics and analytics to determine quality or priority
Comfortable working across client and server codebases
Familiar with profiling, optimising, and debugging scalable data systems
Passion for making and playing games

About Electronic Arts

We’re proud to have an extensive portfolio of games and experiences, locations around the world, and opportunities across EA. We value adaptability, resilience, creativity, and curiosity. From leadership that brings out your potential, to creating space for learning and experimenting, we empower you to do great work and pursue opportunities for growth. We adopt a holistic approach to our benefits programs, emphasizing physical, emotional, financial, career, and community wellness to support a balanced life. Our packages are tailored to meet local needs and may include healthcare coverage, mental well-being support, retirement savings, paid time off, family leaves, complimentary games, and more. We nurture environments where our teams can always bring their best to what they do. Electronic Arts is an equal opportunity employer.
All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. We will also consider for employment qualified applicants with criminal records in accordance with applicable law. EA also makes workplace accommodations for qualified individuals with disabilities as required by applicable law.
Posted 2 weeks ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary

Rust Support Lead. The candidate is required to have knowledge of the Rust programming language and strong hands-on experience with AWS serverless technologies (Lambda, API Gateway, DynamoDB, S3, etc.). Experience with Predict Spring or any other Cloud POS solution is good to have. The candidate should have experience troubleshooting issues in a production environment, along with strong verbal and written communication skills.

Responsibilities

8 years of experience in production support. Lead the production support team effectively. Identify, diagnose and resolve issues in the Rust production environment. Implement monitoring and logging solutions to proactively detect and address potential problems. Develop high-performance, secure and efficient applications using Rust. Write clean, maintainable and well-documented code in Rust. Communicate effectively with team members, stakeholders and clients to understand requirements and provide updates. Document technical specifications, processes and solutions clearly and concisely. Participate in code reviews, team meetings and knowledge-sharing sessions to foster a culture of continuous improvement.

Good to have: Has worked on any cloud POS. Integrate and enhance point-of-sale functionalities using Predict Spring or other Cloud POS solutions. Collaborate with cross-functional teams to ensure seamless integration and operation of POS systems.
Posted 2 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity

We are the only professional services organization that has a separate business dedicated exclusively to the financial services marketplace. Join the Digital Engineering Team and you will work with multi-disciplinary teams from around the world to deliver a global perspective. Aligned to key industry groups including Asset Management, Banking and Capital Markets, Insurance and Private Equity, Health, Government, Power and Utilities, we provide integrated advisory, assurance, tax, and transaction services. Through diverse experiences, world-class learning and individually tailored coaching, you will experience ongoing professional development. That’s how we develop outstanding leaders who team to deliver on our promises to all of our stakeholders, and in so doing, play a critical role in building a better working world for our people, for our clients and for our communities. Sound interesting? Well, this is just the beginning. Because whenever you join, however long you stay, the exceptional EY experience lasts a lifetime.

We’re seeking a versatile Full Stack Architect with hands-on experience in Python (including multithreading and popular libraries), GenAI and AWS cloud services. The ideal candidate should be proficient in backend development using NodeJS, ExpressJS, Python Flask/FastAPI, and RESTful API design, with strong frontend skills in Angular, ReactJS, TypeScript, etc.

EY Digital Engineering is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional capability and product knowledge.
The Digital Engineering (DE) practice works with clients to analyse, formulate, design, mobilize and drive digital transformation initiatives. We advise clients on their most pressing digital challenges and opportunities surrounding business strategy, customer, growth, profit optimization, innovation, technology strategy, and digital transformation. We also have a unique ability to help our clients translate strategy into actionable technical design and transformation planning/mobilization. Through our unique combination of competencies and solutions, EY’s DE team helps our clients sustain competitive advantage and profitability by developing strategies to stay ahead of the rapid pace of change and disruption and supporting the execution of complex transformations.

Your Key Responsibilities

Application Development: Design and develop cloud-native applications and services using AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, Glue, Redshift and EMR.
Deployment and Automation: Implement CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy to automate application deployment and updates.
Architecture Design: Collaborate with architects and other engineers to design scalable and secure application architectures on AWS.
Performance Tuning: Monitor application performance and implement optimizations to enhance reliability, scalability, and efficiency.
Security: Implement security best practices for AWS applications, including identity and access management (IAM), encryption, and secure coding practices.
Container Services Management: Design and deploy containerized applications using AWS services such as Amazon ECS (Elastic Container Service), Amazon EKS (Elastic Kubernetes Service), and AWS Fargate. Configure and manage container orchestration, scaling, and deployment strategies. Optimize container performance and resource utilization by tuning settings and configurations.
Application Observability: Implement and manage application observability tools such as AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and ELK Stack (Elasticsearch, Logstash, Kibana). Develop and configure monitoring, logging, and alerting systems to provide insights into application performance and health. Create dashboards and reports to visualize application metrics and logs for proactive monitoring and troubleshooting.
Integration: Integrate AWS services with application components and external systems, ensuring smooth and efficient data flow.
Troubleshooting: Diagnose and resolve issues related to application performance, availability, and reliability.
Documentation: Create and maintain comprehensive documentation for application design, deployment processes, and configuration.

Skills And Attributes For Success

Required Skills:
AWS Services: Proficiency in AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, S3, RDS, Glue, Redshift and EMR.
Backend: Python (multithreading, Flask, FastAPI), NodeJS, ExpressJS, REST APIs
Frontend: Angular, ReactJS, TypeScript
Cloud Engineering: Development with AWS (Lambda, EC2, S3, API Gateway, DynamoDB), Docker, Git, etc.
Proven experience in developing and deploying AI solutions with Python and JavaScript
Strong background in machine learning, deep learning, and data modelling
Good to have: CI/CD pipelines, full-stack architecture, unit testing, API integration
Security: Understanding of AWS security best practices, including IAM, KMS, and encryption.
Observability Tools: Proficiency in using observability tools like AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and ELK Stack.
Container Orchestration: Knowledge of container orchestration concepts and tools, including Kubernetes and Docker Swarm.
Monitoring: Experience with monitoring and logging tools such as AWS CloudWatch, CloudTrail, or ELK Stack.
Collaboration: Strong teamwork and communication skills with the ability to work effectively with cross-functional teams.

Preferred Qualifications

Certifications: AWS Certified Solutions Architect – Associate or Professional, AWS Certified Developer – Associate, or similar certifications.
Experience: At least 8 years of experience in an application engineering role with a focus on AWS technologies.
Agile Methodologies: Familiarity with Agile development practices and methodologies.
Problem-Solving: Strong analytical skills with the ability to troubleshoot and resolve complex issues.
Education: Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field, or equivalent practical experience.

What We Offer

EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.
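As a sketch of the serverless backend development described above, here is a minimal Python Lambda-style handler. The event shape assumes the API Gateway proxy integration; no real AWS resources are touched, so it can be exercised locally. The payload fields are hypothetical.

```python
import json

def handler(event, context=None):
    """Minimal Lambda handler for an assumed API Gateway proxy event."""
    try:
        body = json.loads(event.get("body") or "{}")
        name = body.get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"hello, {name}"}),
        }
    except json.JSONDecodeError:
        # Malformed request body -> client error, not an unhandled exception.
        return {"statusCode": 400, "body": json.dumps({"error": "bad json"})}

# Invoked locally with a fabricated event, the way a unit test would call it.
resp = handler({"body": json.dumps({"name": "EY"})})
print(resp["statusCode"], resp["body"])
```

Keeping the handler a pure function of its event makes it testable without deploying, which in turn keeps the CI/CD pipelines mentioned above fast.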
EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities

Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework, using Python or Scala and Big Data technologies for various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Preferred Education: Master's Degree

Required Technical And Professional Expertise

Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on Cloud Data Platforms on AWS; experience in AWS EMR / AWS Glue / Databricks, AWS Redshift and DynamoDB. Good to excellent SQL skills. Exposure to streaming solutions and message brokers like Kafka.

Preferred Technical And Professional Experience

Certification in AWS and Databricks, or Cloudera Spark Certified developers.
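The streaming-pipeline experience called for above can be illustrated framework-free with a tumbling-window count in plain Python; engines like Spark Structured Streaming or Kafka Streams express the same aggregation declaratively over unbounded input. The event shape is a hypothetical assumption.

```python
from collections import defaultdict

# Hypothetical event stream: each event carries an integer timestamp.
events = [
    {"ts": 3, "user": "u1"}, {"ts": 7, "user": "u2"},
    {"ts": 12, "user": "u1"}, {"ts": 14, "user": "u3"},
    {"ts": 21, "user": "u2"},
]

def tumbling_counts(stream, window=10):
    """Count events per non-overlapping [k*window, (k+1)*window) bucket."""
    counts = defaultdict(int)
    for e in stream:
        counts[e["ts"] // window * window] += 1
    return dict(sorted(counts.items()))

print(tumbling_counts(events))  # {0: 2, 10: 2, 20: 1}
```

A real pipeline adds what this sketch omits: out-of-order events, watermarks, and state that outlives the process, which is precisely what the streaming frameworks named above manage.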
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing...

You will be part of a world-class Container Platform team that builds and operates highly scalable Kubernetes-based container platforms (EKS, OCP, OKE and GKE) at large scale for Global Technology Solutions at Verizon, a top-20 Fortune 500 company. This individual will have a high level of technical expertise and daily hands-on implementation, working in a product team developing services in two-week sprints using agile principles. This entails programming and orchestrating the deployment of feature sets into the Kubernetes CaaS platform, along with building Docker containers via a fully automated CI/CD pipeline utilizing AWS, Jenkins, Ansible playbooks, CI/CD tools and processes (Jenkins, JIRA, GitLab, ArgoCD), Python, shell scripts or other scripting technologies. You will have autonomous control over day-to-day activities allocated to the team as part of agile development of new services.

Automation and testing of different platform deployments, maintenance and decommissioning
Full stack development
Participate in POC (proof of concept) technical evaluations for new technologies for use in the cloud

What we’re looking for...

You’ll Need To Have

Bachelor’s degree or four or more years of experience.
Experience with GitOps CI/CD workflows (ArgoCD, Flux) and working in an Agile ceremonies model
Address Jira tickets opened by platform customers
Strong expertise in the SDLC and Agile development
Experience designing, developing and implementing scalable React/Node-based applications (full stack developer)
Experience with HTTP/RESTful APIs and microservices
Experience with serverless Lambda development, AWS EventBridge, AWS Step Functions, DynamoDB, Python
Database experience (RDBMS, NoSQL, etc.)
Familiarity integrating with existing web application portals
Strong backend development experience with languages including Golang (preferred), Spring Boot and Python
Experience with GitLab CI/CD, Jenkins, Helm, Terraform, Artifactory
Strong development of K8s tools/components, which may include standalone utilities/plugins, cert-manager plugins, etc.
Development and working experience with Service Mesh lifecycle management, and with configuring and troubleshooting applications deployed on Service Mesh and Service Mesh related issues
Strong Terraform and/or Ansible and Bash scripting experience
Effective code review, quality and performance tuning experience; test-driven development
Certified Kubernetes Application Developer (CKAD)
Excellent cross-collaboration and communication skills

Even better if you have one or more of the following:
Working experience with security tools such as Sysdig, Crowdstrike, Black Duck, Xray, etc.
Experience with OWASP rules and mitigating security vulnerabilities using security tools like Fortify, SonarQube, etc.
Experience with monitoring tools like New Relic (NRDOT), OTLP
Certified Kubernetes Administrator (CKA)
Certified Kubernetes Security Specialist (CKS)
Red Hat Certified OpenShift Administrator
Development experience with the Operator SDK
Experience creating validating and/or mutating webhooks
Familiarity with creating custom EnvoyFilters for Istio service mesh, and with cost optimization tools like Kubecost and CloudHealth to implement right-sizing recommendations

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.

Where you’ll be working

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity

Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
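The serverless portion of the stack this posting describes (Lambda, DynamoDB, Python) can be sketched with a minimal handler. Everything concrete here — the table name, attribute names, and event shape — is a hypothetical illustration, not taken from the posting; a real handler would pass the request dict to boto3's `put_item` and be deployed behind API Gateway.

```python
import json

def build_put_item(event):
    """Validate an API Gateway-style event and shape a DynamoDB PutItem request.

    The table and attribute names below are hypothetical examples; a real
    handler would hand this dict to boto3's client.put_item().
    """
    body = json.loads(event["body"])
    if "ticket_id" not in body:
        raise ValueError("ticket_id is required")
    return {
        "TableName": "platform-tickets",  # hypothetical table name
        "Item": {
            "ticket_id": {"S": body["ticket_id"]},
            "status": {"S": body.get("status", "open")},
        },
    }

def handler(event, context):
    # Lambda entry point: return an API Gateway proxy-style response.
    try:
        request = build_put_item(event)
        saved = request["Item"]["ticket_id"]["S"]
        return {"statusCode": 201, "body": json.dumps({"saved": saved})}
    except (ValueError, KeyError) as exc:
        return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}
```

The validation logic is kept in a pure function (`build_put_item`) separate from the Lambda entry point, which makes the request-shaping testable without any AWS dependency.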
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description

Amazon Prime is a program that provides millions of members with unlimited one-day delivery, unlimited streaming of video and music, secure online photo storage, access to Kindle e-books, as well as Prime special deals on Prime Day. In India, Prime members get unlimited free One-Day and Two-Day delivery, video streaming, and early and exclusive access to deals. After the launch in 2016, the Amazon Prime team is now looking for a detail-oriented business intelligence engineer to lead business intelligence for Prime and drive member insights.

At Amazon, we're always working to be the most customer-centric company on earth. To get there, we need exceptionally talented, bright, and driven people. We are looking for a dynamic, organized, and customer-focused analytics expert to join our Amazon Prime Analytics team. The team supports the Amazon India Prime organization by producing and delivering metrics, data, models and strategic analyses. This is a highly challenging role that requires an individual with excellent team leadership skills, business acumen, and the breadth to work across multiple Amazon Prime business, data engineering, machine learning and software development teams. A successful candidate will be a self-starter comfortable with ambiguity, with strong attention to detail and a proven ability to work in a fast-paced and ever-changing environment.

Key job responsibilities

Developing a long-term analytical strategy and driving the implementation of that strategy.
Using analytics to influence multiple departments, increasing their productivity and effectiveness to achieve strategic goals.
Identify, develop, manage, and execute analyses to uncover areas of opportunity and present written business recommendations that help shape our business roadmap.
Large-scale data mining to find trends and problems, then communicate and drive corrective action.
Ability to develop experimental and analytic plans for data modeling processes, use strong baselines, and accurately determine cause-and-effect relations.
Apply statistical and machine learning methods to specific business problems.
Partner with the Machine Learning team in model building with respect to variable definition, model validation and creation of a targeting strategy.
Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions across Amazon Prime business units.

Basic Qualifications

3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, QuickSight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience with statistical analysis packages such as R, SAS and MATLAB
Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications

Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2890009
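The core skill pairing named above (SQL to pull data, Python to process it for modeling) can be sketched in a few lines. This is an illustrative toy only, with an in-memory SQLite database standing in for a warehouse such as Redshift; the table and column names are invented, not Amazon's.

```python
import sqlite3

def weekly_active_members(rows):
    """Load (member_id, week) events into an in-memory table and count
    distinct active members per week -- the shape of metric query a BI
    role like this produces. Schema names are hypothetical."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE prime_events (member_id TEXT, week TEXT)")
    conn.executemany("INSERT INTO prime_events VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT week, COUNT(DISTINCT member_id) FROM prime_events "
        "GROUP BY week ORDER BY week"
    )
    result = dict(cur.fetchall())
    conn.close()
    return result
```

Against a real warehouse only the connection object changes; the `GROUP BY` / `COUNT(DISTINCT ...)` pattern is the same.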
Posted 2 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Job Title: Java/Python Developer (NoSQL & AWS Cloud with QuickSight Experience Preferred)
Experience: 4–6 Years
Location: Hyderabad
Job Type: Full-Time

Job Summary: We are looking for a skilled Java or Python Developer with strong experience in NoSQL databases and AWS Cloud technologies. Candidates with exposure to AWS QuickSight or other AWS services will be given preference. This role involves developing and maintaining scalable systems with a focus on performance, reliability, and cloud-native development.

Key Responsibilities:
• Design, develop, and maintain applications using Java or Python
• Work extensively with NoSQL databases like DynamoDB, MongoDB, or Cassandra
• Leverage AWS services (EC2, S3, Lambda, RDS, QuickSight, etc.) to build and scale applications
• Build robust backend systems with a focus on performance and scalability
• Develop RESTful APIs and integrate with third-party services
• Optimize database queries and backend services for high throughput
• Collaborate with cross-functional teams including frontend, DevOps, and QA
• Create technical documentation and maintain coding standards
• Provide production support and troubleshooting of deployed services

Key Skills and Qualifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field
• 4–6 years of hands-on development experience with Java or Python
• Strong understanding and practical experience with NoSQL databases
• Proficiency in AWS Cloud services, including EC2, S3, Lambda, and QuickSight
• Experience in building scalable, high-performance APIs
• Familiarity with microservices architecture and containerization (Docker/Kubernetes)
• Solid knowledge of CI/CD pipelines and tools like Git, Jenkins, or similar
• Strong debugging, analytical, and problem-solving skills
• Ability to work in Agile development environments

Preferred Qualifications:
• AWS certifications (Developer Associate or similar)
• Experience working with AWS QuickSight for data visualization and dashboards
• Familiarity with Infrastructure as Code (e.g., CloudFormation or Terraform)
• Exposure to full-stack development is a plus
• Experience with message brokers like Kafka, SQS, or SNS
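"Optimize database queries" against DynamoDB usually starts with querying on a partition key rather than scanning. A minimal sketch of shaping the parameters for DynamoDB's low-level Query API follows; the table and key names are hypothetical, and a real call would pass this dict to boto3's `client.query()`.

```python
def build_query_params(table, pk_value, limit=25):
    """Shape parameters for DynamoDB's low-level Query API.

    Key-condition queries read only the matching partition, unlike Scan,
    which reads the whole table. Attribute names here are illustrative."""
    if limit < 1:
        raise ValueError("limit must be positive")
    return {
        "TableName": table,
        "KeyConditionExpression": "#pk = :pk",
        "ExpressionAttributeNames": {"#pk": "customer_id"},  # hypothetical key
        "ExpressionAttributeValues": {":pk": {"S": pk_value}},
        "Limit": limit,
    }
```

Building the request as a plain dict keeps the query shape testable without an AWS connection, and `Limit` plus pagination keeps individual reads cheap for the high-throughput services the posting describes.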
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Position

Senior Engineer - Data Engineer

Job Description

Position - Data Engineer

What You Will Be Doing

Design and development of real-time software and Cloud/Web/mobile-based software applications.
Analyze domain-specific technical, high-level or low-level requirements, and modify them as per end-customer or system requirements.
Perform software testing at unit, functional and system level, both manual and automated.
Perform code reviews following coding guidelines and static code analysis, and troubleshoot software problems of limited difficulty.
Document technical deliverables like software specifications, design documents, code comments, test cases and test reports, release notes, etc. throughout the project life cycle.
Develop software solutions from established programming languages or by learning new languages required for specific projects.

What Are We Looking For

Experience: 4 to 8 years in software/data engineering.
Data Technologies: Proficiency in SQL, NoSQL databases (e.g., DynamoDB, MongoDB), ETL tools, and data warehousing solutions.
Programming Languages: Proficiency in Python is a must.
Cloud Platforms: Azure, AWS (e.g., EC2, S3, RDS) or GCP.
Visualization Tools: Experience with data visualization tools (e.g., Tableau, Power BI, Looker).
Data Governance: Knowledge of data governance and security practices.
CI/CD: Experience with DevOps practices, including CI/CD pipelines and containerization (Docker, Kubernetes).
Communication Skills: Excellent verbal and written communication skills in English.
Agile Methodologies: Experience working in Agile development environments.
AI/ML Awareness: Understanding of AI and ML concepts, frameworks (e.g., TensorFlow, PyTorch), and practical applications.
Generative AI Awareness: Familiarity with Generative AI technologies and their potential use cases.
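The ETL work described above can be illustrated with a toy extract-transform step in Python. The input format and column names are invented for the example; real pipelines would read from the source systems named in the posting rather than an inline CSV string.

```python
import csv
import io

def transform_orders(raw_csv):
    """A toy extract-transform step of the kind an ETL pipeline performs:
    parse raw CSV, drop malformed rows, and normalise amounts to floats.
    Column names are illustrative only."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    clean = []
    for row in reader:
        try:
            clean.append({
                "order_id": row["order_id"],
                "amount": round(float(row["amount"]), 2),
            })
        except (KeyError, TypeError, ValueError):
            continue  # skip rows that fail validation
    return clean
```

Dropping (or quarantining) bad rows at the transform stage, rather than letting them reach the warehouse, is the usual pattern; a production version would also log the rejects for the data-governance practices the posting mentions.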
Location - Indore/Ahmedabad

Location: IN-GJ-Ahmedabad, India-Ognaj (eInfochips)
Time Type: Full time
Job Category: Engineering Services
Posted 2 weeks ago
6.0 years
0 Lacs
Chandigarh, India
On-site
Job Profile: Senior Node.js Developer with AWS Experience

Location: Chandigarh, Kochi

Job Overview: We are seeking an experienced Senior Node.js Developer with a strong background in AWS to join our dynamic development team. The ideal candidate will have over 6 years of professional experience in designing, developing, and deploying scalable web applications and services using Node.js and AWS.

Key Responsibilities:
• Develop and Maintain Applications: Design, build, and maintain scalable Node.js applications, ensuring high performance and responsiveness.
• Cloud Infrastructure: Architect and manage AWS services such as EC2, Lambda, S3, RDS, DynamoDB, and others to support application requirements.
• Integration: Develop and implement APIs and integrate with third-party services and internal systems.
• Code Quality: Write clean, maintainable, and efficient code. Conduct code reviews and mentor junior developers.
• Optimization: Monitor and optimize application performance and scalability in a cloud environment.
• Collaboration: Work closely with cross-functional teams, including product managers, designers, and other developers to deliver high-quality solutions.
• Security: Implement security best practices to ensure application and data security.
• Troubleshooting: Diagnose and resolve complex technical issues related to Node.js applications and AWS infrastructure.
• Continuous Improvement: Stay up-to-date with the latest industry trends and technologies, and recommend improvements to existing systems.

Requirements:
• Experience: Minimum of 6 years of professional experience in Node.js development and working with AWS services.
• Technical Skills: Deep expertise in Node.js and JavaScript (ES6+). Proven experience with AWS services (EC2, Lambda, S3, RDS, DynamoDB, etc.). Strong knowledge of RESTful APIs, microservices architecture, and asynchronous programming. Experience with version control systems such as Git.
Familiarity with containerization technologies like Docker and orchestration tools such as Kubernetes is a plus.
• Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
• Soft Skills: Excellent problem-solving skills and attention to detail. Strong communication skills and ability to work effectively in a collaborative team environment. Proven ability to manage multiple tasks and projects simultaneously.

Preferred Qualifications:
• Experience with serverless architecture and AWS serverless services.
• Knowledge of front-end technologies (e.g., React, Angular, Vue.js) is a plus.
• Familiarity with CI/CD pipelines and DevOps practices.

What We Offer:
• Competitive salary and benefits package.
• Opportunities for career growth and professional development.
• A collaborative and innovative work environment.
• Flexible working hours
Posted 2 weeks ago
0 years
0 Lacs
Jaipur, Rajasthan, India
Remote
Life at UiPath

The people at UiPath believe in the transformative power of automation to change how the world works. We’re committed to creating category-leading enterprise software that unleashes that power. To make that happen, we need people who are curious, self-propelled, generous, and genuine. People who love being part of a fast-moving, fast-thinking growth company. And people who care—about each other, about UiPath, and about our larger purpose. Could that be you?

Your mission

To help build the Peak platform - a new system of intelligence that allows companies to quickly harness the power of AI. Based in Jaipur, you will be working in a collaborative team on cutting-edge technologies in a supportive and dynamic environment. Ultimately, you are responsible for building the Peak platform and onboarding new clients.

What You'll Do At UiPath

Continually update your technical knowledge and skills to ensure Peak is using the most relevant tools and techniques.
Utilize your extensive experience with TypeScript, Node.js and AWS to contribute to the development of high-quality software solutions.
Work closely with product managers, designers, and other stakeholders to ensure alignment and successful delivery of projects.
Effectively communicate technical concepts and project updates to both technical and non-technical audiences.
Support the onboarding of new clients to the Peak platform.
Create technical specifications and test plans.
Write and test code, refining and rewriting as necessary.
Explore new technologies and advancements in cloud tech.
Coach and mentor junior team members, fostering a culture of learning and continuous improvement.

What You'll Bring To The Team

We are building a team of world-class engineers in Jaipur; essentially, we are looking for bright, talented engineers who want to work at the cutting edge of practical AI.

Experience in React, TypeScript, Node.js, AWS, Docker.
Preferred experience in Kafka and Kubernetes.
Experience with one or more database systems, such as PostgreSQL or DynamoDB.
Strong understanding of full-stack development, including front-end, back-end, and database management.
Ability to write clean, well-tested, and extendable code that is easy to maintain.
Strong problem-solving skills and the ability to troubleshoot complex technical issues.
Excellent verbal and written communication skills, with the ability to articulate complex technical topics clearly and concisely.

Maybe you don’t tick all the boxes above—but still think you’d be great for the job? Go ahead, apply anyway. Please. Because we know that experience comes in all shapes and sizes—and passion can’t be learned.

Many of our roles allow for flexibility in when and where work gets done. Depending on the needs of the business and the role, the number of hybrid, office-based, and remote workers will vary from team to team. Applications are assessed on a rolling basis and there is no fixed deadline for this requisition. The application window may change depending on the volume of applications received or may close immediately if a qualified candidate is selected.

We value a range of diverse backgrounds, experiences and ideas. We pride ourselves on our diversity and inclusive workplace that provides equal opportunities to all persons regardless of age, race, color, religion, sex, sexual orientation, gender identity and expression, national origin, disability, neurodiversity, military and/or veteran status, or any other protected classes. Additionally, UiPath provides reasonable accommodations for candidates on request and respects applicants' privacy rights. To review these and other legal disclosures, visit our privacy policy.
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
About Zeller

At Zeller, we’re champions for businesses of all sizes, and proud to be a fast-growing Australian scale-up taking on the ambitious goal of reimagining business banking and payments. We believe in a level playing field, where all businesses benefit from access to smarter payments and financial services solutions that accelerate their cash flow, help them get paid faster, and give them a better understanding of their finances. So we’re hard at work building the tools to make it happen.

Zeller is growing fast, backed by leading VCs, and brings together a global team of passionate payment and tech industry professionals. With an exciting roadmap of innovative new products under development, we are building a high-performing team to take on outdated banking solutions. If you are passionate about innovation, thrive in fast-paced environments, embrace a challenge, hate bureaucracy, and can’t think of anything more exciting than disrupting the status quo, then read on to learn more.

About The Role

The Zeller product engineering team owns the software, infrastructure and customer experience that enables more than 85,000 Australian businesses to accept payments and access the financial services they need to run their businesses. As a Senior Application Support Engineer you will be a leading member of the team that shapes and owns Zeller’s commitment to excellent and highly available service delivery.

What you’ll be doing

Deliver projects that improve the service delivery of Zeller’s Application Support team. We are looking for someone to be a senior member of a small team who is still principally involved in hands-on service delivery.
Be a primary point of contact for escalated product issues from Zeller’s account and customer success teams.
Own and orchestrate the triage, investigation and resolution of complex technical issues, driving the pace of resolution and communicating well-thought-out and reliable direction.
Be an expert in the products and workflows you support, and promote and share that knowledge with our partner teams.
Using your technical expertise, participate in application monitoring using logs, data stores, internal tools and dashboards.
Be a part of our incident response team, responding to alerts and bearing some on-call responsibilities.

What Skills And Experience We Are Looking For

Zeller is a product-driven startup with a deep care for the quality of service we provide. Experience in software companies with a customer-facing product is highly valued.
You have the ability to manage multiple, competing tasks and priorities with ease in a fast-moving environment.
A strong technical background with excellent troubleshooting, analytical and data skills. This should include familiarity with AWS services (or similar), an active SQL skill set, and experience with release management toolsets and service reporting tools (Datadog or similar).
Excellent communication skills and the ability to build strong partnerships with engineering, QA, and customer-facing teams.
Demonstrated experience participating in change management and incident response processes.
Payments experience is highly valued but not required.
Excitement and drive to work in a product company that delivers mission-critical financial services.

The tools Zeller uses to get the work done

Familiarity with these services or close equivalents is appreciated, but we do not expect you to have used all of them.
HubSpot is our principal CRM and where we track our support tickets. We also use Jira in conjunction with our engineering teams.
The systems we support run in browsers, mobile applications, and payment terminals.
The backend systems we support use AWS and are principally written in TypeScript on a Lambda, Postgres, DynamoDB stack using an event-driven architecture.
We monitor our products using tools and dashboards in products like Datadog and Sentry.
Zeller’s payment services integrate with many third parties, particularly point-of-sale systems. Familiarity with POS, or with managing issues with third-party partners, is valued.

Like the rest of our team, you will benefit from

Competitive remuneration
A balanced, progressive, and supportive work environment
Excellent parental leave and other leave entitlements
Fully remote role
Annual get-together with the team
Endless learning and development opportunities
Plenty of remote-friendly fun and social opportunities - we love to come together as a team
An ability to influence and shape the future of Zeller as our company scales both domestically and globally
Being part of one of Australia’s most exciting scale-ups
Posted 2 weeks ago
3.0 - 6.0 years
8 - 10 Lacs
Hyderābād
On-site
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients’ needs and exceeding their expectations.

Your Team's Impact

FactSet is seeking an experienced software development engineer with proven proficiency in deploying software adhering to best practices, and with fluency in the development environment and related tools, code libraries and systems. You will be responsible for the entire development process and will collaborate to create a theoretical design. We expect a demonstrated ability to critique code and production for improvement, as well as to receive and apply feedback effectively, and a proven ability to maintain expected levels of productivity while becoming increasingly independent as a software developer, requiring less direct day-to-day engagement and oversight from one's manager.

The focus is on developing applications, testing and maintaining software, and the implementation details of development; on increasing the volume of work accomplished (with consistent quality, stability and adherence to best practices); and on gaining mastery of the products to which one is contributing while beginning to participate in forward design discussions for how to improve based on one's observations of the code, systems and production involved. Software developers provide project leadership and technical guidance along every stage of the software development life cycle.
What You'll Do

Work on the Data Lake platform handling millions of documents annually, built on a NoSQL architecture. Focus on developing new features while supporting and maintaining existing systems, ensuring the platform's continuous improvement.
Develop innovative solutions for feature additions and bug fixes, optimizing existing functionality as needed to enhance system efficiency and performance.
Engage with Python, frontend and C#.NET repositories to support ongoing development and maintenance, ensuring robust integration and functionality across the application stack.
Participate in weekly on-call support to address urgent queries and issues in common communication channels, ensuring operational reliability and user satisfaction.
Create comprehensive design documents for major architectural changes and facilitate peer reviews to ensure quality and alignment with best practices.
Utilize object-oriented programming principles to develop low-level designs that effectively support high-level architectural frameworks, contributing to scalable solutions.
Collaborate with product managers and key stakeholders to thoroughly understand requirements and propose strategic solutions, leveraging cross-functional insights.
Actively participate in technical discussions with principal engineers and architects to support proposed design solutions, fostering a collaborative engineering environment.
Accurately estimate key development tasks and share insights with architects and engineering directors to align on priorities and resource allocation.
Operate within an agile framework, collaborating with engineers and product developers using tools like Jira and Confluence. Engage in test-driven development and elevate team practices through coaching and reviews.
Create and review documentation and test plans to ensure thorough validation of new features and system modifications.
Work effectively as part of a geographically diverse team, coordinating with other departments and offices for seamless project progression.

These responsibilities aim to highlight the complexity of managing a platform that ingests millions of documents, underscoring the importance of innovative solutions, technical proficiency, and collaborative efforts to ensure the Data Lake platform's success.

What We're Looking For

Bachelor's or master's degree in computer science, engineering, or a related field is required.
3-6 years of experience in software development, with a focus on systems handling large-scale data operations.
In-depth understanding of data structures and algorithms to optimize software performance and efficiency. Proficiency in object-oriented design principles is essential.
Strong skills in Python, AWS, frontend and C#.NET to comprehend and contribute to existing applications.
Experience with non-relational databases, specifically DynamoDB, MongoDB and Elasticsearch, for optimal query development and troubleshooting.
Experience with frontend technologies like Angular, React or Vue.js to support development of key interfaces.
Software Development: Familiarity with GitHub-based development processes, facilitating seamless collaboration and version control. Experience in building and deploying production-level services, demonstrating the ability to deliver reliable and efficient solutions.
API and System Integration: Proven experience working with APIs, ensuring robust connectivity and integration across the system.
AWS Expertise: Working experience with AWS services such as Lambda, EC2, S3, and AWS Glue is beneficial for cloud-based operations and deployments.
Problem-Solving and Analysis: Strong analytical and problem-solving skills are critical for developing innovative solutions and optimizing existing platform components.
Communication and Collaboration: Excellent collaborative and communication skills, enabling effective interaction with geographically diverse teams and key stakeholders.
On-Call and Operational Support: Capability to address system queries and provide weekly on-call support, ensuring system reliability and user satisfaction.
Organizational Skills: Ability to prioritize and manage work effectively in a fast-paced environment, demonstrating self-direction and resourcefulness.

Required Skills:

Python Proficiency: Experience with Python and relevant libraries like Pandas and NumPy is beneficial for data manipulation and analysis tasks.
Jupyter Notebooks: Familiarity with Jupyter Notebooks is a plus for supporting data visualization and interactive analysis.
Agile Methodologies: Understanding of Agile software development is advantageous, with experience in Scrum as a preferred approach for iterative project management.
Linux/Unix Experience: Exposure to Linux/Unix environments is desirable, enhancing versatility in system operations and development.

What's In It for You

At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:

The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
Career progression planning with dedicated time each month for learning and development.
Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.

Learn more about our benefits here.

Salary is just one component of our compensation package and is based on several factors, including but not limited to education, work experience, and certifications.

Company Overview:

FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees’ Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.

At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Posted 2 weeks ago