10.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Senior PostgreSQL Database Administrator
Location: Noida, India
Experience Required: 10 to 12 years
Education Qualification: B.Tech/B.E. in Computer Science, IT, or MCA

Job Summary: We are seeking an experienced and highly skilled PostgreSQL Database Administrator to join our team in Noida. The ideal candidate will be responsible for the performance, integrity, and security of our PostgreSQL database systems. This role requires deep expertise in database architecture, performance tuning, indexing, and backup/recovery strategies.

Key Responsibilities:
- Install, configure, and maintain PostgreSQL databases across multiple environments (Dev/Test/Prod).
- Manage database security, integrity, and backup procedures.
- Design and implement robust backup and recovery solutions.
- Monitor database performance and proactively tune SQL queries and server configurations.
- Create and maintain database objects including tables, indexes, views, stored procedures, and triggers.
- Implement effective indexing and partitioning strategies to improve performance and scalability.
- Collaborate with development teams on schema design, query optimization, and application support.
- Plan and execute database migrations, upgrades, and patch management.
- Maintain high-availability and disaster-recovery setups.
- Automate routine DBA tasks using shell scripts or Python.
- Prepare and maintain documentation including SOPs, architecture diagrams, and incident reports.
- Provide on-call support and resolve critical database issues as needed.

Required Skills and Experience:
- 10–12 years of proven experience as a PostgreSQL DBA in large-scale enterprise environments.
- Strong knowledge of PostgreSQL architecture and internals.
- Deep understanding of database design, performance tuning, indexing, and query optimization.
- Hands-on experience implementing backup and disaster-recovery solutions.
- Expertise in performance monitoring tools (e.g., pg_stat_statements, pgBadger).
- Proficiency with scripting languages (Bash, Python, or similar) for automation.
- Experience with replication (logical/streaming), partitioning, and connection pooling.
- Familiarity with Linux/Unix systems and cloud environments (AWS/GCP/Azure).
- Strong problem-solving skills and the ability to handle high-pressure situations.

Preferred Qualifications:
- Certification in PostgreSQL administration or cloud-based database services.
- Experience with tools like Ansible, Terraform, or other infrastructure-as-code technologies.
- Exposure to other database technologies such as MySQL or Oracle is a plus.
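As a hedged illustration of the indexing and query-optimization skills this role calls for, the sketch below shows how adding an index changes a query plan. It uses Python's stdlib sqlite3 purely as a portable stand-in (the table and column names are invented); in PostgreSQL the same experiment would use EXPLAIN (ANALYZE) alongside tools like pg_stat_statements.

```python
import sqlite3

def index_changes_plan(rows=1000):
    """Show how adding an index changes the query plan (SQLite stand-in;
    in PostgreSQL you would compare EXPLAIN ANALYZE output instead)."""
    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
    )
    con.executemany(
        "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
        [(i % 50, float(i)) for i in range(rows)],
    )
    query = "SELECT total FROM orders WHERE customer_id = 7"
    # Plan before the index exists: a full table scan.
    before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
    # Plan afterwards: an index search on customer_id.
    after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    con.close()
    # Last column of each plan row is the human-readable detail string.
    return before[0][-1], after[0][-1]
```

The point generalizes: the same query flips from a scan to an index search once a suitable index exists, which is the core of the tuning work described above.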
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As a Fullstack SDE - II at NxtWave, you
- Build applications at scale and see them released quickly to NxtWave learners (within weeks)
- Take ownership of the features you build and work closely with the product team
- Work in a great culture that continuously empowers you to grow in your career
- Enjoy the freedom to experiment and learn from mistakes (Fail Fast, Learn Faster)
- Get first-hand experience scaling the features you build as the company grows rapidly; NxtWave is one of the fastest-growing edtech startups
- Build in a world-class developer environment by applying clean coding principles, code architecture, etc.

Responsibilities
- Lead design and delivery of complex end-to-end features across frontend, backend, and data layers.
- Make strategic architectural decisions on frameworks, datastores, and performance patterns.
- Review and approve pull requests, enforcing clean-code guidelines, SOLID principles, and design patterns.
- Build and maintain shared UI component libraries and backend service frameworks for team reuse.
- Identify and eliminate performance bottlenecks in both browser rendering and server throughput.
- Instrument services with metrics and logging, driving SLIs, SLAs, and observability.
- Define and enforce comprehensive testing strategies: unit, integration, and end-to-end.
- Own CI/CD pipelines, automating builds, deployments, and rollback procedures.
- Ensure OWASP Top-10 mitigations, WCAG accessibility, and SEO best practices.
- Partner with Product, UX, and Ops to translate business objectives into technical roadmaps.
- Facilitate sprint planning, estimation, and retrospectives for predictable deliveries.
- Mentor and guide SDE-1s and interns; participate in hiring.

Qualifications & Skills
- 3–5 years building production full-stack applications end-to-end with measurable impact.
- Proven leadership in Agile/Scrum environments with a passion for continuous learning.
- Deep expertise in React (or Angular/Vue) with TypeScript and modern CSS methodologies.
- Proficient in Node.js (Express/NestJS), Python (Django/Flask/FastAPI), or Java (Spring Boot).
- Expert in designing RESTful and GraphQL APIs and scalable database schemas.
- Knowledge of MySQL/PostgreSQL indexing, NoSQL (Elasticsearch/DynamoDB), and caching (Redis).
- Knowledge of containerization (Docker) and commonly used AWS services such as Lambda, EC2, S3, API Gateway, etc.
- Skilled in unit/integration testing (Jest, pytest) and E2E testing (Cypress, Playwright).
- Frontend profiling (Lighthouse) and backend tracing for performance tuning.
- Secure coding: OAuth2/JWT, XSS/CSRF protection, and familiarity with compliance regimes.
- Strong communicator able to convey technical trade-offs to non-technical stakeholders.
- Experience reviewing pull requests and providing constructive feedback to the team.

Qualities we'd love to find in you:
- The attitude to always strive for the best outcomes and an enthusiasm to deliver high-quality software
- Strong collaboration abilities and a flexible, friendly approach to working with teams
- Strong determination with a constant eye on solutions
- Creative ideas with a problem-solving mindset
- Openness to receiving objective criticism and improving upon it
- Eagerness to learn and zeal to grow
- Strong communication skills are a huge plus

Work Location: Hyderabad

About NxtWave
NxtWave is one of India's fastest-growing ed-tech startups, revolutionizing the 21st-century job market. NxtWave is transforming youth into highly skilled tech professionals through its CCBP 4.0 programs, regardless of their educational background. NxtWave was founded by Rahul Attuluri (ex-Amazon, IIIT Hyderabad), Sashank Reddy (IIT Bombay) and Anupam Pedarla (IIT Kharagpur). Supported by Orios Ventures, Better Capital, and Marquee Angels, NxtWave raised $33 million in 2023 from Greater Pacific Capital. As an official partner for NSDC (under the Ministry of Skill Development & Entrepreneurship, Govt. of India) and recognized by NASSCOM, NxtWave has earned a reputation for excellence.

Some of its prestigious recognitions include:
- Technology Pioneer 2024 by the World Economic Forum, one of only 100 startups chosen globally
- 'Startup Spotlight Award of the Year' by T-Hub in 2023
- 'Best Tech Skilling EdTech Startup of the Year 2022' by Times Business Awards
- 'The Greatest Brand in Education' in a research-based listing by URS Media
- NxtWave founders Anupam Pedarla and Sashank Gujjula were honoured in the 2024 Forbes India 30 Under 30 for their contributions to tech education

NxtWave breaks learning barriers by offering vernacular content for better comprehension and retention. NxtWave now has paid subscribers from 650+ districts across India. Its learners are hired by over 2000 companies including Amazon, Accenture, IBM, Bank of America, TCS, Deloitte and more.

Know more about NxtWave: https://www.ccbp.in
Read more about us in the news – Economic Times | CNBC | YourStory | VCCircle
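To make the "Secure coding: OAuth2/JWT" requirement above concrete, here is a minimal sketch of HMAC-signed, JWT-style tokens using only the Python standard library. The function names are invented for illustration; a production system should use a vetted JWT library and also validate claims such as expiry.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    """Build a compact JWS-style token: header.payload.signature (HS256)."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_token(token: str, secret: bytes) -> bool:
    """Recompute the signature and compare in constant time; any tampering fails."""
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return False
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

The constant-time comparison (`hmac.compare_digest`) is the detail interviewers tend to probe: a naive `==` leaks timing information about how many signature bytes matched.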
Posted 1 week ago
0 years
0 Lacs
India
Remote
Job Title: SQL Developer Intern
Company: Enerzcloud Solutions
Location: Remote
Job Type: Internship (Full-Time)
Duration: 1–3 Months
Stipend: ₹25,000/month
Department: Data Engineering / Development

About the Company: Enerzcloud Solutions is a forward-thinking technology company focused on delivering smart, scalable software and data solutions. We help businesses make better decisions through automation, data analysis, and cutting-edge development practices.

Job Summary: We are seeking a dedicated and detail-oriented SQL Developer Intern to join our remote development team. This internship offers real-world exposure to writing SQL queries, managing databases, and supporting business intelligence and analytics processes.

Key Responsibilities:
- Write and optimize SQL queries for data extraction and reporting
- Assist in designing, creating, and maintaining relational databases
- Perform data validation, transformation, and troubleshooting tasks
- Work on ETL processes and support data pipeline development
- Collaborate with data analysts and developers to fulfill data needs
- Document queries, schemas, and workflow processes

Requirements:
- Pursuing or recently completed a degree in Computer Science, IT, or a related field
- Strong foundational knowledge of SQL and relational databases
- Familiarity with MySQL, PostgreSQL, SQL Server, or similar platforms
- Understanding of normalization, joins, indexing, and query optimization
- Basic knowledge of Excel or BI tools is a plus
- Eager to learn and adapt in a remote work environment

Perks & Benefits:
- ₹25,000/month stipend
- Real-world data and development project exposure
- Internship certificate upon successful completion
- Mentorship and learning support
- Flexible remote working
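The join-and-aggregate pattern implied by "normalization, joins, indexing, and query optimization" can be sketched with Python's built-in sqlite3; the schema and data here are invented for illustration only.

```python
import sqlite3

def top_customers_by_spend():
    """Join two normalized tables and aggregate: the bread-and-butter
    reporting query an SQL intern would practice (SQLite for portability)."""
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (
            id INTEGER PRIMARY KEY,
            customer_id INTEGER REFERENCES customers(id),
            amount REAL
        );
        INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
        INSERT INTO orders (customer_id, amount) VALUES (1, 120.0), (1, 80.0), (2, 50.0);
    """)
    rows = con.execute("""
        SELECT c.name, SUM(o.amount) AS total
        FROM customers c
        JOIN orders o ON o.customer_id = c.id
        GROUP BY c.id
        ORDER BY total DESC
    """).fetchall()
    con.close()
    return rows
```

Keeping customers and orders in separate tables (normalization) and joining on the foreign key avoids duplicating customer data in every order row.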
Posted 1 week ago
2.0 - 3.0 years
3 - 6 Lacs
New Delhi, Gurugram
Work from Office
Hiring for US/UK Travel BPO with Meta/PPC call experience. Cruise/flight sales experience is a must. Fluent English communication required. Must be open to immediate joining and rotational shifts. Candidates with experience in other processes need not apply. Call Shristi: 7838882457.
Required Candidate profile (current profile and salary brackets):
- Customer Support: 30 to 45k
- Sales (PPC/Meta/Cruise): 40 to 65k
- SEO: 30k
- QA: up to 35k
Perks and benefits: both-side transport, meal, incentive.
Posted 1 week ago
1.0 - 3.0 years
2 - 3 Lacs
Gurugram
Work from Office
Hiring SEO from US Travel BPO with a minimum of 6 months' experience. Only candidates with relevant experience can apply. Rotational shift. Salary up to 35k. Cab + meal + PF. Only immediate joiners; call or WhatsApp your updated resume to Shristi @ 7838882457.
Posted 1 week ago
10.0 - 31.0 years
15 - 17 Lacs
Gurgaon/Gurugram
On-site
Job Title: Database Architect

Role Purpose
The Database Architect is responsible for defining and implementing high-level database strategies aligned with enterprise business objectives. This includes architecting scalable, secure, and sustainable database systems, ensuring efficient access and performance while driving innovation in database technologies.

Key Responsibilities

Strategic Planning & Architecture
- Define strategic database requirements and develop architectural strategies at the modeling, design, and implementation stages to meet enterprise-wide needs.
- Design scalable database systems capable of handling high transaction loads and supporting data growth beyond 60 TB.

Database Design & Development
- Create robust database solutions that ensure system reliability, including physical structure, functional capabilities, performance, security, backup, and recovery protocols.
- Design efficient database applications, including data transfer mechanisms, temporary tables, partitions, and indexing strategies to optimize performance.

Installation & Maintenance
- Install and configure database systems using optimal access techniques, while maintaining detailed documentation of installation actions and configurations.
- Monitor and maintain system performance, troubleshoot production and development issues, and perform necessary maintenance activities.

Performance Tuning & Monitoring
- Analyze system resource utilization and optimize parameters to enhance performance.
- Develop and implement processes for database load balancing, system upgrades, and migrations with minimal downtime.

Collaboration & Governance
- Collaborate with system architects, software engineers, and stakeholders to translate business needs into technical database requirements.
- Ensure compliance with database development standards and enforce regular process documentation aligned with internal policies.

Data Recovery & Security
- Establish and maintain high-availability clusters, disaster recovery procedures, and secure access controls.
- Monitor system consumption trends to ensure uptime and scalability, recommending hardware enhancements when needed.

Innovation & Best Practices
- Research and introduce innovative technologies to future-proof database systems.
- Develop and enforce best practices for data migrations, upgrades, and integration with enterprise architecture.

Key Skills & Competencies
- Proven expertise in database architecture, optimization, and performance tuning for large-scale systems.
- Deep understanding of relational and non-relational databases (e.g., Oracle, SQL Server, PostgreSQL, MongoDB).
- Strong knowledge of data modeling, indexing strategies, backup/recovery methods, and high-availability architectures.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and stakeholder management capabilities.
- Experience with database migration, version upgrades, and system integration in a complex enterprise environment.

Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 8–12 years of experience in database design and administration, with at least 3–5 years in a lead or architect role.
- Experience with enterprise-scale systems exceeding 60 TB in size.
- Hands-on experience with clustering, backup/restore strategies, and automation tools.
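Partitioning at the 60 TB scale mentioned above usually starts with a routing function that maps a row key to a partition deterministically. A minimal sketch in Python, with the partition count and key format invented for illustration:

```python
import hashlib

def partition_for(key: str, num_partitions: int = 8) -> int:
    """Route a row key to a partition with a stable hash. We use md5 so the
    mapping is identical across processes and runs, unlike Python's builtin
    hash(), which is salted per process."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_partitions

def spread(keys, num_partitions: int = 8):
    """Count rows per partition to eyeball skew before committing to a scheme."""
    counts = [0] * num_partitions
    for k in keys:
        counts[partition_for(k, num_partitions)] += 1
    return counts
```

The `spread` helper matters in practice: a key with low cardinality (say, country code) hashes fine but still concentrates rows in a few partitions, so skew should be measured before a scheme is fixed.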
Posted 1 week ago
2.0 - 31.0 years
2 - 3 Lacs
Meerut
On-site
Dear candidate,

We are looking for a Bath Sales Associate for one of our renowned clients for the Meerut location. Below are the details:

Job Title: Bath Associate

Job Purpose: Sales of various products of the Bath division, such as faucets, sanitaryware, and accessories, in a specified geography.

Main Responsibilities:
- Achieving a target of Rs. 500,000 of secondary sales per month.
- Meeting retail consumers and customers such as plumbers, contractors, and architects in the field, as well as walk-in customers at the store.
- Explaining product features, advantages, and benefits, with demonstrations wherever necessary, to sell the products.
- Creating demand for the product at the consumer level and directing consumers and influencers (plumbers, contractors, architects) to the AP Home store.
- Completing the sales process by ensuring billing to the end consumers.
- Attending to consumers' complaints about use of the products and suggesting remedial measures.
- Collecting information regarding opportunities for sale, such as construction activity.
- Spending 70% of the time in the field catering to customers, APH Store walk-ins, architects, and contractors, and 30% of the time in the store attending to customers.
- Reporting: for generating leads, maintaining a pipeline, and daily work plans, reporting will be to the APH SSO; for business generation through the leads, reporting will be to the Bath SH.
- Updating consumer and site details regularly in the LEAD App for visibility on indexing and business potential generated.

Skills Required: Excellent communication and people skills

Qualifications: Essential: MBA – Sales & Marketing

Previous Experience:
- Essential: Sales experience of a minimum of 2–3 years
- Preferred: Having worked in a market development or sales role in a similar industry such as building materials, plumbing, or bath fittings
Posted 1 week ago
7.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Project Role: Data & Document Mgmt Processor
Project Role Description: Perform end-to-end document management services according to service level agreements. This includes data digitization, data indexing, document scanning and maintenance, etc. Support initiatives with a focus on continuous improvement.
Must have skills: Business Requirements Analysis
Good to have skills: AWS Architecture
Minimum 7.5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As a Data and Document Management Processor, you will engage in a variety of tasks that ensure the effective management of documents and data. Your typical day will involve performing end-to-end document management services, including data digitization, indexing, scanning, and maintenance of documents. You will also support initiatives aimed at continuous improvement, ensuring that all processes align with established service level agreements. Collaboration with various teams will be essential to enhance operational efficiency and drive improvements in document management practices.

- 7+ years of experience
- Essential skills: process modelling, excellent stakeholder management (across business and technical), and solution thought leadership, with the ability to translate the technical into business and vice versa
- Experience in capital markets
- Desirable: experience in Agile ways of working
- Core BA skills: requirement elicitation, impact analysis, requirement documentation, user stories creation, DoD, working with the PO on finalizing the PB, test support, business readiness
- JIRA + Confluence know-how
- Agile methodology experience
- Soft skills: business and stakeholder management
- Process flow: conversant with Visio or draw.io
- MS Office: proficient with Excel, PowerPoint, and Word

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Business Requirements Analysis.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 week ago
5.0 years
0 Lacs
India
On-site
Staff Software Engineer, Data Ingestion

The Staff Software Engineer, Data Ingestion will be a critical individual contributor responsible for designing collection strategies and developing and maintaining robust, scalable data pipelines. This role is at the heart of our data ecosystem, delivering new analytical software solutions that provide timely, accurate, and complete data for insights, products, and operational efficiency.

Key Responsibilities:
- Design, develop, and maintain high-performance, fault-tolerant data ingestion pipelines using Python.
- Integrate with diverse data sources (databases, APIs, streaming platforms, cloud storage, etc.).
- Implement data transformation and cleansing logic during ingestion to ensure data quality.
- Monitor and troubleshoot data ingestion pipelines, identifying and resolving issues promptly.
- Collaborate with database engineers to optimize data models for fast consumption.
- Evaluate and propose new technologies or frameworks to improve ingestion efficiency and reliability.
- Develop and implement self-healing mechanisms for data pipelines to ensure continuity.
- Define and uphold SLAs and SLOs for data freshness, completeness, and availability.
- Participate in on-call rotation as needed for critical data pipeline issues.

Key Skills:
- 5+ years of experience, ideally with a background in computer science, working in software product companies.
- Extensive Python expertise: experience developing robust, production-grade applications with Python.
- Data collection & integration: proven experience collecting data from various sources (REST APIs, OAuth, GraphQL, Kafka, S3, SFTP, etc.).
- Distributed systems & scalability: strong understanding of distributed systems concepts, designing for scale, performance optimization, and fault tolerance.
- Cloud platforms: experience with major cloud providers (AWS or GCP) and their data-related services (e.g., S3, EC2, Lambda, SQS, Kafka, Cloud Storage, GKE).
- Database fundamentals: solid understanding of relational databases (SQL, schema design, indexing, query optimization). OLAP database experience (e.g., Hadoop) is a plus.
- Monitoring & alerting: experience with monitoring tools (e.g., Prometheus, Grafana) and setting up effective alerts.
- Version control: proficiency with Git.
- Containerization (plus): experience with Docker and Kubernetes.
- Streaming technologies (plus): experience with real-time data processing using Kafka, Flink, or Spark Streaming.
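The fault-tolerant, self-healing pipelines this role describes typically start with retry-and-exponential-backoff around a flaky source. A minimal Python sketch, where the exception type and the `fetch` callable are hypothetical stand-ins for a real connector:

```python
import time

class SourceUnavailable(Exception):
    """Stand-in for a transient connector error (timeout, 503, throttling)."""

def fetch_with_retry(fetch, max_attempts=4, base_delay=0.01, sleep=time.sleep):
    """Call fetch() with exponential backoff between attempts; re-raise the
    last error once max_attempts is exhausted. `sleep` is injectable so tests
    don't actually wait."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except SourceUnavailable:
            if attempt == max_attempts - 1:
                raise
            # 0.01s, 0.02s, 0.04s, ... between attempts.
            sleep(base_delay * (2 ** attempt))
```

Real pipelines would add jitter to the delay (to avoid thundering herds) and distinguish retryable from permanent errors, but the shape is the same.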
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

What We Do
At Goldman Sachs, our Engineers don't just make things – we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets. Engineering, which is comprised of our Technology Division and global strategists groups, is at the critical centre of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here.

Who We Look For
Goldman Sachs Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

About Data Engineering SRE
Data plays a critical role in every facet of the Goldman Sachs business. The Data Engineering group is at the core of that offering, focusing on providing the platform, processes, and governance for enabling the availability of clean, organized, and impactful data to scale, streamline, and empower our core businesses. Within Data Engineering, we run and operate some of Goldman Sachs' largest platforms; our clients are engineers and analysts across all business units who depend on our platforms for daily business deliverables. As a Site Reliability Engineer (SRE) on the Data Engineering team, you will be responsible for observability, cost, and capacity, with operational accountability for some of Goldman Sachs' largest data platforms. We are engaged in the full lifecycle of platforms, from design to demise, with an SRE strategy adapted to the lifecycle.

Who We Are Looking For
You have a background as a developer and can express yourself in code. You have a focus on reliability, observability, capacity management, DevOps, and SDLC (Software Development Lifecycle). You are a self-leader who is comfortable taking on problem statements with n degrees of freedom and structuring them into data-driven deliverables. You drive strategy with "skin in the game": you are on the rota with the team, you drive postmortems, and you have an attitude that the problem stops with you.

How You Will Fulfil Your Potential
- Drive adoption of cloud technology for data processing and warehousing
- Drive SRE strategy for some of GS's largest platforms, including Lakehouse and Data Lake
- Engage with data consumers and producers to match reliability and cost requirements
- Drive strategy with data

Relevant Technologies: Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, GitLab

Basic Qualifications
- A Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline)
- 1–4+ years of relevant work experience in a team-focused environment
- 1–2 years of hands-on developer experience at some point in your career
- Understanding and experience of DevOps and SRE principles and automation, and of managing technical and operational risk
- Experience with cloud infrastructure (AWS, Azure, or GCP)
- Proven experience in driving strategy with data
- Deep understanding of the multi-dimensionality of data, data curation, and data quality, such as traceability, security, performance latency, and correctness across supply and demand processes
- In-depth knowledge of relational and columnar SQL databases, including database design
- Expertise in data warehousing concepts (e.g. star schema, entitlement implementations, SQL vs NoSQL modelling, milestoning, indexing, partitioning)
- Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts
- Independent thinker, willing to engage, challenge, or learn
- Ability to stay commercially focused and to always push for quantifiable commercial impact
- Strong work ethic, a sense of ownership and urgency
- Strong analytical and problem-solving skills
- Ability to build trusted partnerships with key contacts and users across business and engineering teams

Preferred Qualifications
- Understanding of Data Lake / Lakehouse technologies, incl. Apache Iceberg
- Experience with cloud databases (e.g. Snowflake, BigQuery)
- Understanding of data modelling concepts
- Working knowledge of open-source tools such as AWS Lambda and Prometheus
- Experience coding in Java or Python
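The reliability and observability focus of this SRE role is commonly quantified with availability SLOs and error budgets. A small sketch, assuming a simple availability SLO over a rolling window (the numbers are illustrative, not from the posting):

```python
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Total allowed downtime (in minutes) for an availability SLO over a window.
    E.g. a 99.9% SLO over 30 days allows (1 - 0.999) * 30 * 24 * 60 = 43.2 min."""
    return (1.0 - slo) * window_days * 24 * 60

def budget_remaining(slo: float, downtime_minutes: float, window_days: int = 30) -> float:
    """Fraction of the error budget still unspent; negative means the SLO is breached."""
    budget = error_budget_minutes(slo, window_days)
    return (budget - downtime_minutes) / budget
```

Teams typically gate risky changes on `budget_remaining`: plenty of budget left means ship freely; a nearly spent budget means freeze features and pay down reliability work.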
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. Oracle Data Integrator (ODI) Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions, migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all the areas listed below: Engage with clients to c onduct workshops, understand business requirements and identify business problems to solve with integrations. Lead and build Proof-of-concept to showcase value of ODI vs other platforms. Socialize solution design and enable knowledge transfer. Drive train-the trainer sessions to drive adoption of ODI. Partner with clients to drive outcome and deliver value. Collaborate with cross functional teams. Understand source applications and how it can be integrated. 
- Analyze data sets to understand functional and business context.
- Create Data Warehousing data models and integration designs.
- Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), and Project to Complete (PTC).
- Communicate development status and risks to key stakeholders.
- Lead the team to design, build, test, and deploy.
- Support client needs by delivering ODI jobs and frameworks.
- Merge, customize, and deploy the ODI data model per client business requirements.
- Deliver large/medium DWH programs; demonstrate expert core consulting skills, an advanced level of ODI, SQL, and PL/SQL knowledge, and industry expertise to support delivery to clients.
- Focus on designing, building, and documenting reusable code artifacts.
- Track, report, and optimize ODI job performance to meet client SLAs.
- Design and architect ODI projects, including upgrades/migrations to the cloud.
- Design and implement security in ODI.
- Identify risks and suggest mitigation plans.
- Lead the team and mentor junior practitioners.
- Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts.
- Perform system analysis, follow technical designs, and work on development activities.
- Participate in design meetings, daily standups, and backlog grooming.
- Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities.
- Review and evaluate designs and project activities for compliance with systems design and development guidelines and standards; provide tangible feedback to improve product quality and mitigate failure risk.
- Develop the environment strategy, build the environment, and execute migration plans.
- Validate that the environment meets all security and compliance controls.
- Lead testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders.
- Contribute to sales pursuits by helping the pursuit team understand the client request and propose robust solutions.

Skills:
- Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications.
- Experience designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views, and analytical functions.
- Working knowledge of Git or a similar source code control system.
- Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
- Experience in SQL tuning and optimization using explain plans and SQL trace files.
- Partitioning and indexing strategies for optimal performance.
- Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.

Analytics & Cognitive

Our Analytics & Cognitive team focuses on enabling our clients’ end-to-end journey from on-premises to cloud, with opportunities in the areas of Cloud Strategy, Op Model Transformation, Cloud Development, Cloud Integration & APIs, Cloud Migration, Cloud Infrastructure & Engineering, and Cloud Managed Services. We help our clients see the transformational capabilities of cloud as an opportunity for business enablement and competitive advantage. The Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling cloud. We accelerate our clients toward a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators.
Technical Requirements
- Education: B.E./B.Tech/M.C.A./M.Sc (CS)
- 6+ years of ETL lead/developer experience, with a minimum of 3-4 years' experience in Oracle Data Integrator (ODI)
- Expertise in the Oracle ODI toolset and Oracle PL/SQL
- A minimum of 2-3 end-to-end DWH implementations
- Experience developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.
- Able to implement reusability, parameterization, workflow design, etc.
- Knowledge of the ODI master and work repositories
- Knowledge of data modelling and ETL design
- Design and develop complex mappings, process flows, and ETL scripts
- Well versed and hands-on in using and customizing Knowledge Modules (KMs)
- Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, agents, etc.
- Packaging components and database operations such as aggregate, pivot, union, etc.
- Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects
- Experience performance-tuning mappings
- Ability to design ETL unit test cases and debug ETL mappings
- Expertise in developing load plans and scheduling jobs
- Integrating ODI with multiple sources/targets
- Experience in data migration using SQL*Loader and import/export

Consulting Requirements
- 6-10 years of relevant consulting, industry, or technology experience
- Proven experience assessing clients' workloads and technology landscapes for cloud suitability
- Experience defining new architectures and the ability to drive a project from an architecture standpoint
- Ability to quickly establish credibility and trustworthiness with key stakeholders in the client organization
- Strong problem-solving and troubleshooting skills
- Strong communicator
- Willingness to travel if the project requires it
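The "ETL control tables, error logging, auditing" item above names a common ETL pattern. A minimal sketch in Python with SQLite follows (SQLite stands in for the Oracle tables an ODI project would actually use; the table and column names are hypothetical, not from any specific project):

```python
import sqlite3

# Hypothetical ETL control table: one row per job run, recording outcome.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE etl_control (
    run_id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_name TEXT, status TEXT, rows_loaded INTEGER, error_msg TEXT)""")

def run_job(job_name, rows):
    """Load rows, auditing success or failure in the control table."""
    try:
        if any(r is None for r in rows):
            raise ValueError("null row encountered")
        conn.execute(
            "INSERT INTO etl_control (job_name, status, rows_loaded) "
            "VALUES (?, 'SUCCESS', ?)", (job_name, len(rows)))
    except ValueError as exc:
        # Error logging: the failure reason is kept for later auditing.
        conn.execute(
            "INSERT INTO etl_control (job_name, status, rows_loaded, error_msg) "
            "VALUES (?, 'FAILED', 0, ?)", (job_name, str(exc)))
    conn.commit()

run_job("load_orders", [1, 2, 3])
run_job("load_customers", [1, None])
rows = conn.execute(
    "SELECT job_name, status FROM etl_control ORDER BY run_id").fetchall()
print(rows)  # [('load_orders', 'SUCCESS'), ('load_customers', 'FAILED')]
```

The same idea scales to real pipelines: every job run writes an audit row, so failed loads can be found and restarted from the control table.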
Preferred
- Experience in Oracle BI Apps
- Exposure to one or more of the following: Python, R, or UNIX shell scripting
- Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications
- Experience designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views, and analytical functions
- Working knowledge of Git or a similar source code control system
- Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle
- Experience in SQL tuning and optimization using explain plans and SQL trace files
- Partitioning and indexing strategies for optimal performance
- Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
- A systematic problem-solving approach, coupled with strong communication skills
- Ability to debug and optimize code and automate routine tasks
- Experience writing scripts in one or more languages such as Python, UNIX shell scripting, or similar
- Experience working with technical customers

How You’ll Grow

At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning.
DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits.

Deloitte’s culture

Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives.

Corporate citizenship

Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities.

Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively.
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302894
Posted 1 week ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Who We Are

At Kyndryl, we design, build, manage, and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers, and our communities.

The Role

As an ELK (Elasticsearch, Logstash & Kibana) Data Engineer, you will be responsible for developing, implementing, and maintaining ELK-stack-based solutions for Kyndryl’s clients. This role is responsible for developing efficient and effective data and log ingestion, processing, indexing, and visualization for monitoring, troubleshooting, and analysis purposes.

Key Responsibilities:
- Configure Logstash to receive, filter, and transform logs from diverse sources (e.g., servers, applications, AppDynamics, storage, databases, and so on) before sending them to Elasticsearch.
- Configure ILM policies, index templates, etc.
- Develop Logstash configuration files to parse, enrich, and filter log data from various input sources (e.g., APM tools, databases, storage, and so on).
- Apply techniques like grok patterns, regular expressions, and plugins to handle complex log formats and structures.
- Ensure efficient and reliable data ingestion by optimizing Logstash performance, handling high data volumes, and managing throughput.
- Use Kibana to create visually appealing dashboards, reports, and custom visualizations.
- Collaborate with business users to understand their data integration and visualization needs and translate them into technical solutions.
- Establish correlations within the data and develop visualizations to detect the root cause of issues.
- Integrate with ticketing tools such as ServiceNow.
- Hands-on with ML and Watcher functionalities.
- Monitor Elasticsearch clusters for health, performance, and resource utilization.
- Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides.

Who You Are

Education, Experience, and Certification Requirements:
- BS or MS degree in Computer Science or a related technical field
- 5+ years of overall IT industry experience
- 3+ years of development experience with Elasticsearch, Logstash, and Kibana, designing, building, and maintaining log and data processing systems
- 3+ years of Python or Java development experience
- 4+ years of SQL experience (NoSQL experience is a plus)
- 4+ years of experience with schema design and dimensional data modelling
- Experience working with machine learning models is a plus
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes) is a plus
- “Elastic Certified Engineer” certification is preferable

Being You

Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value.
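The grok-pattern parsing named in the responsibilities above can be illustrated outside Logstash. Below is a minimal Python sketch of a grok-style regex filter that turns a raw log line into a structured event; the pattern and field names (timestamp, level, message) are illustrative assumptions, not a real Logstash configuration:

```python
import re

# Grok-style named-group pattern for a simple "<timestamp> <LEVEL> <message>" line.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<message>.*)"
)

def parse_line(line):
    """Return a structured event dict, or None for lines that do not match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

event = parse_line("2024-05-01 12:30:00 ERROR disk quota exceeded")
print(event["level"], event["message"])  # ERROR disk quota exceeded
```

In a real pipeline the equivalent pattern would live in a Logstash grok filter, and the resulting fields would be indexed into Elasticsearch for Kibana dashboards.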
Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that, together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 1 week ago
0.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Bengaluru, Karnataka. Job ID 30185740. Job Category: Digital Technology.

Role: SQL Developer with data modeling and AWS/Azure
Location: Bangalore
Full/Part-time: Full time

Build a career with confidence: Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

About the Role: We are looking for a SQL Developer with an ETL background and AWS or Azure cloud platform experience.

Job Description:
- Design, develop, and implement scalable and efficient data warehouse solutions on cloud platforms using Azure Fabric, AWS Redshift, etc.
- Create and optimize data models to support business reporting and analytical needs.
- Integrate data using ETL tools like Azure Data Factory.
- Write complex SQL queries, stored procedures, and functions for data manipulation and analysis.
- Implement data quality checks and validation processes to ensure data accuracy and integrity.
- Monitor and optimize data warehouse performance, including query tuning, indexing, and data partitioning strategies.
- Identify and troubleshoot data-related issues, ensuring data availability and reliability.
- Collaborate with data architects, data engineers, business analysts, and other stakeholders to understand data requirements and translate them into technical solutions.

Analytical Skills: Strong problem-solving, analytical, and critical thinking skills.
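The "data quality checks and validation" duty above typically means profiling a staging table before loading it. A minimal sketch using Python and SQLite follows (the table and column names are hypothetical, not an actual Carrier schema; a production check would run the same SQL against the warehouse):

```python
import sqlite3

# Hypothetical staging table with two common defects: a null key and a duplicate key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                 [(1, 10.0), (2, 5.5), (2, 7.0), (None, 3.0)])

# Check 1: rows whose business key is missing.
null_keys = conn.execute(
    "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL").fetchone()[0]

# Check 2: business keys that appear more than once.
dup_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT order_id FROM stg_orders "
    "WHERE order_id IS NOT NULL GROUP BY order_id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_keys, dup_keys)  # 1 1
```

If either count is nonzero, the load would be failed or the offending rows routed to a reject table before the data reaches reporting.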
Preferred Skills & Tools for this role (7 to 10 years of experience in the skill sets below):
- Cloud platforms: Azure (Data Factory, Azure Fabric, SQL DB, Data Lake) or AWS (Redshift) – any Azure tools or AWS
- Databases: PostgreSQL or MSSQL
- ETL tools: Azure Data Factory or any ETL tool experience
- Languages: expert-level proficiency in T-SQL and Python
- BI tools: Power BI, Tableau, or Spotfire
- Version control & DevOps: Azure DevOps, Git – any of these is preferred

Benefits: We are committed to offering competitive benefits programs for all our employees and enhancing our programs when necessary. Make yourself a priority with flexible schedules and parental leave. Drive your career forward through professional development opportunities. Achieve your personal goals with our Employee Assistance Programme.

Our commitment to you: Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
Posted 1 week ago
1.0 - 2.0 years
3 - 6 Lacs
New Delhi, Gurugram
Work from Office
Hiring for US/UK travel BPO with Meta/PPC call experience. Cruise/flight sales experience is a must. Fluent English communication. Open for immediate joining; rotational shift is a must. No other process experience can apply. Call Shristi: 7838882457. Required Candidate profile: Below-mentioned current profile and salary brackets. Customer Support: 30 to 45k. Sales: 40 to 65k (PPC/Meta/Cruise). SEO: 30k. QA: up to 35k. Perks and benefits: Both-side transport. Meal incentive.
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Must have: Strong in programming languages like Python and Java. Hands-on experience with one cloud (GCP preferred). Experience working with Docker. Environment management (e.g., venv, pip, poetry). Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. Understanding of the full ML cycle end-to-end. Data engineering and feature engineering techniques. Experience with ML modelling and evaluation metrics. Experience with TensorFlow, PyTorch, or another framework. Experience with model monitoring. Advanced SQL knowledge. Aware of streaming concepts like windowing, late arrival, triggers, etc. Good to Have: Hyperparameter tuning experience. Proficient in Apache Spark, Apache Beam, or Apache Flink. Should have hands-on experience with distributed computing. Should have working experience in data architecture design. Should be aware of storage and compute options and when to choose what. Should have a good understanding of cluster optimisation and pipeline optimisation strategies. Should have exposure to GCP tools to develop end-to-end data pipelines for various scenarios (including ingesting data from traditional databases as well as integration of API-based data sources). Should have a business mindset to understand data and how it will be used for BI and analytics purposes. Should have working experience with CI/CD pipelines, deployment methodologies, and infrastructure as code (e.g., Terraform). Hands-on experience with Kubernetes. Vector databases like Qdrant. LLM experience (embeddings generation, embeddings indexing, RAG, Agents, etc.). Key Responsibilities Design, develop, and implement AI models and algorithms using Python and Large Language Models (LLMs). Collaborate with data scientists, engineers, and business stakeholders to define project requirements and deliver impactful AI-driven solutions. Optimize and manage data pipelines, ensuring efficient data storage and retrieval with PostgreSQL.
Continuously research emerging AI trends and best practices to enhance model performance and capabilities. Deploy, monitor, and maintain AI applications in production environments, adhering to industry best standards. Document technical designs, workflows, and processes to facilitate clear knowledge transfer and project continuity. Communicate technical concepts effectively to both technical and non-technical team members. Skills and Qualifications: Proven expertise in Python programming for AI/ML applications. (ref:hirist.tech)
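Among the streaming concepts this posting lists, windowing is easy to illustrate without any engine. Below is a sketch of a tumbling event-time window in plain Python; real engines such as Apache Beam or Flink add watermarks and triggers to cope with the late arrivals the posting also mentions, which this sketch deliberately ignores:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Assign each (timestamp, key) event to a fixed-size tumbling window
    and count events per (window_start, key): the batch analogue of a
    streaming GROUP BY over event time."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_size) * window_size
        counts[(window_start, key)] += 1
    return dict(counts)

# Toy event stream: (event_time_seconds, key).
events = [(0, "a"), (3, "a"), (5, "b"), (12, "a")]
print(tumbling_window_counts(events, window_size=10))
# {(0, 'a'): 2, (0, 'b'): 1, (10, 'a'): 1}
```

The window assignment `(ts // window_size) * window_size` is the core idea; sliding and session windows differ only in how an event maps to (possibly several) windows.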
Posted 1 week ago
7.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title : Senior Python Developer Key Responsibilities Software Development : Design, develop, test, and deploy high-quality Python applications and services. API Development : Build and maintain robust and scalable APIs using frameworks like FastAPI or Flask. Database Management : Design database schemas, write complex SQL queries, and optimize database performance for PostgreSQL. System Design : Contribute to the architectural design of new features and systems, ensuring scalability, reliability, and maintainability. Containerization & Orchestration : Implement and manage applications within containerized environments using Docker and orchestrate deployments with Kubernetes. CI/CD Implementation : Work with CI/CD pipelines to ensure automated testing, deployment, and continuous integration. Troubleshooting & Debugging : Identify, diagnose, and resolve complex technical issues in production and development environments. Code Quality : Ensure code quality through rigorous testing, code reviews, and adherence to best practices. Project Ownership : Take ownership of projects, driving them independently from conception to successful deployment and maintenance. Collaboration : Collaborate effectively with cross-functional teams, including product managers, other engineers, and QA. Required Skills & Experience Python Expertise : 7+ years of professional experience in Python development, with a strong understanding of Pythonic principles and best practices. Web Frameworks : Strong experience with FastAPI (or Flask, with a willingness to quickly adapt and switch to FastAPI). Database Proficiency : Proficiency in PostgreSQL, including advanced SQL querying, database design, indexing strategies, and performance tuning. Containerization & Orchestration : Solid understanding and hands-on experience with Kubernetes for container orchestration and microservices deployment. 
Development Tools : Experience with Docker for containerization, Git for version control, and implementing/managing CI/CD pipelines (e.g., Jenkins, GitLab CI/CD, GitHub Actions). Data Structures & Algorithms : Strong background in data structures, algorithms, and their practical application in solving complex problems. System Design : Proven ability in designing scalable, resilient, and performant software systems. Independent Work : Demonstrated ability to work independently, take initiative, and drive projects end-to-end with minimal supervision. Communication : Good communication skills, both written and verbal, with the ability to articulate technical concepts clearly and concisely. Education & Certifications Bachelor's degree in Computer Science, Software Engineering, or a related technical field. Master's degree is a plus. Relevant certifications in Python, cloud platforms, or container technologies are a plus. (ref:hirist.tech)
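The "indexing strategies and performance tuning" requirement above boils down to reading query plans before and after adding an index. A sketch using stdlib sqlite3 as a portable stand-in (in PostgreSQL you would use `EXPLAIN` / `EXPLAIN ANALYZE` instead); the table and index names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO users (email) VALUES (?)",
                 [(f"user{i}@example.com",) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN is SQLite's analogue of PostgreSQL's EXPLAIN;
    # the human-readable detail is the last column of each plan row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM users WHERE email = 'user500@example.com'"
before = plan(query)                                    # full table scan
conn.execute("CREATE INDEX idx_users_email ON users(email)")
after = plan(query)                                     # index search
print(before)
print(after)
```

Before the index the plan reports a scan of `users`; afterwards it reports a search using `idx_users_email`, which is the behaviour an index on a selective equality predicate should produce.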
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have a strong algorithm and logic-building capability along with the ability to prototype rapidly. You must be fluent in MSSQL and have a deep understanding of entity-relationship concepts, normalization/denormalization, indexing, and performance monitoring and tuning. Your role will involve writing optimized, effective, reusable, and scalable code, analyzing existing SQL queries for performance improvements, and testing and debugging with refactoring capabilities. Implementing security and data protection solutions, knowledge of RESTful APIs and microservice environments, understanding of Agile, and creating technical documentation are also key responsibilities. Additionally, you should possess the soft skills to work in a team environment and excel in a startup environment with a high level of ownership and commitment. Writing unit and integration test cases is also expected. As for qualifications, you should hold a Bachelor's degree in EE, CS, ME, or equivalent, with a minimum of 2+ years of experience. Demonstrated strong written and verbal communication skills are necessary. Hands-on experience with MSSQL is a must, along with experience in one of AWS, GCP, or Azure cloud. Some understanding of building scalable and reliable products in the cloud, the ability to prioritize end-to-end, debug, and develop modular code, and thinking outside the box to discover innovative solutions for complex data management issues are also required.
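Two of the expectations above, writing reusable/scalable code and writing unit test cases, combine naturally in a parameterized query wrapped in a small test. A minimal sketch with sqlite3 standing in for MSSQL; the `orders` schema and `top_customers` helper are invented for illustration:

```python
import sqlite3

def top_customers(conn, limit):
    """Reusable, parameterized query: values are bound with placeholders,
    never string-formatted, which also covers the SQL-injection side of
    'security and data protection'."""
    return conn.execute(
        "SELECT customer, SUM(amount) AS total FROM orders "
        "GROUP BY customer ORDER BY total DESC LIMIT ?", (limit,)
    ).fetchall()

# A minimal unit test against an in-memory fixture.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 50.0), ("zeta", 120.0)])
assert top_customers(conn, 1) == [("acme", 150.0)]
print("ok")
```

The in-memory fixture keeps the test fast and self-contained; against a real server the same function runs unchanged with a different connection object.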
Posted 1 week ago
7.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Project Role: Data & Document Mgmt Processor. Project Role Description: Perform end-to-end document management services according to service level agreements. This includes data digitization, data indexing, document scanning and maintenance, etc. Support initiatives with a focus on continuous improvement. Must-have skills: Business Requirements Analysis. Good-to-have skills: AWS Architecture. Minimum 7.5 years of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data and Document Management Processor, you will engage in a variety of tasks that ensure the effective management of documents and data. Your typical day will involve performing end-to-end document management services, adhering to service level agreements. This includes activities such as data digitization, data indexing, document scanning, and maintenance. You will also support initiatives aimed at continuous improvement, collaborating with various teams to enhance processes and outcomes. JD: 7+ years of experience. Essential skills: process modelling, excellent stakeholder management (across business and technical), and solution thought leadership with the ability to translate the technical into business and vice versa. Experience in capital markets. Desirable: experience in Agile ways of working. Core BA skills: requirement elicitation, impact analysis, requirement documentation, user story creation, DoD, working with the PO on finalizing the PB, test support, and business readiness. JIRA + Confluence know-how. Agile methodology experience. Soft skills: business and stakeholder management. Process flow: conversant with Visio or draw.io. MS Office: proficient with Excel, PowerPoint, and Word. Additional Information: The candidate should have a minimum of 7.5 years of experience in Business Requirements Analysis. This position is based at our Bengaluru office. A 15-year full-time education is required.
Posted 1 week ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
BI Architect About Amgen Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. About The Role Role Description: We are seeking an experienced BI Architect with expertise in Databricks, Spotfire (Tableau and Power BI secondary), AWS, and enterprise business intelligence (BI) solutions to design and implement scalable, high-performance BI architectures. This role will focus on data modeling, visualization, governance, self-service BI enablement, and cloud-based BI solutions, ensuring efficient, data-driven decision-making across the organization. The ideal candidate will have strong expertise in BI strategy, data engineering, data warehousing, semantic layer modeling, dashboarding, and performance optimization, working closely with data engineers, business stakeholders, and leadership to drive BI adoption and enterprise analytics excellence. Preferred Candidate would have extensive Spotfire experience followed by Power BI or Tableau. Roles & Responsibilities: Design and develop enterprise BI architectures and implement the architectural vision for TIBCO Spotfire at the enterprise level hosted in AWS Partner with data engineers and architects to ensure optimal data modeling, caching, and query performance in Spotfire Design scalable, secure, and high-performance Spotfire environments, including multi-node server setups and hybrid cloud integrations. Develop reusable frameworks and templates for dashboards, data models, and automation processes. Optimize BI query performance, indexing, partitioning, caching, and report rendering to enhance dashboard responsiveness and data refresh speed. 
Implement real-time and batch data integration strategies, ensuring smooth data flow from APIs, ERP/CRM systems (SAP, Salesforce, Dynamics 365), cloud storage, and third-party data sources into BI solutions. Establish and enforce BI governance best practices, including data cataloging, metadata management, access control, data lineage tracking, and compliance standards. Troubleshoot interactive dashboards, paginated reports, and embedded analytics solutions that deliver actionable insights. Implement DataOps and CI/CD pipelines for BI, leveraging Deployment Pipelines, Git integration, and Infrastructure as Code (IaC) to enable version control and automation. Stay up to date with emerging BI technologies, cloud analytics trends, and AI/ML-powered BI solutions to drive innovation. Collaborate with business leaders, data analysts, and engineering teams to ensure BI adoption, self-service analytics enablement, and business-aligned KPIs. Provide mentorship and training to BI developers, analysts, and business teams, fostering a data-driven culture across the enterprise. Must-Have Skills: Experience in BI architecture, data analytics, AWS, and enterprise BI solution development Strong expertise in Spotfire including information links, Spotfire Analyst, Spotfire Server, and Spotfire Web Player Hands-on experience with Databricks (Apache Spark, Delta Lake, SQL, PySpark) for data processing, transformation, and analytics. Experience in scripting and extensions Python or R Expertise in BI strategy, KPI standardization, and enterprise data modeling, including dimensional modeling, star schema, and data virtualization. Hands-on experience with cloud BI solutions and enterprise data warehouses, such as Azure Synapse, AWS Redshift, Snowflake, Google BigQuery, or SQL Server Analysis Services (SSAS). Experience with BI governance, access control, metadata management, data lineage, and regulatory compliance frameworks. 
Expertise in Agile BI development, Scaled Agile (SAFe), DevOps for BI, and CI/CD practices for BI deployments. Ability to collaborate with C-level executives, business units, and engineering teams to drive BI adoption and data-driven decision-making. Good-to-Have Skills: Experience with TIBCO Spotfire Lead Discovery. Knowledge of AI-powered BI, natural language processing (NLP) in BI, and automated machine learning (AutoML) for analytics. Experience with multi-cloud BI architectures and federated query solutions using Power BI or Tableau. Understanding of GraphQL, REST APIs, and data mesh principles for enterprise data access in BI. Knowledge of AI/ML pipeline integration within enterprise data architectures. Education and Professional Certifications: Bachelor's degree with 9-13 years of experience in Computer Science, IT, or a related field. TIBCO Spotfire certifications. Power BI certifications. Tableau certifications. Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized, and be detail-oriented. Strong presentation and public speaking skills. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
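The "dimensional modeling, star schema" expertise named in this role reduces to facts (measures plus foreign keys) joined to dimensions (descriptive attributes). A toy sketch with stdlib sqlite3 standing in for Databricks SQL or a warehouse; the `fact_sales`/`dim_product` schema is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension table: one row per product, descriptive attributes only.
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
-- Fact table: measures plus foreign keys into the dimensions.
CREATE TABLE fact_sales (product_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# The canonical star-schema query: aggregate the fact, slice by a dimension.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('gadgets', 7.5), ('widgets', 15.0)]
```

Every dashboard measure-by-attribute breakdown in Spotfire or Power BI is ultimately a query of this shape, which is why getting the key structure right up front matters for BI performance.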
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description About Oracle Customer Success Services (CSS): As a key member of Oracle Customer Success Services, you will join an international network of experts dedicated to driving customer success through innovation and expertise. Our One Oracle approach ensures you will be part of a team delivering comprehensive, end-to-end services and solutions that accelerate the entire customer journey. Work alongside certified and experienced professionals, gaining exposure to cutting-edge technologies and methodologies, and enhancing your skills and credentials. Engage with a diverse range of customers, managing the full lifecycle of delivery and services, and ensuring each project has a tangible impact. Benefit from robust team support in a collaborative environment that prioritizes teamwork and mutual success. Join us in Oracle Customer Success Services and elevate your career with a company that values innovation, expertise, and customer-centric solutions. Career Level - IC4 Mandatory Skills: Should have 8+ years of experience in Oracle SQL query tuning. Strong expertise in writing and understanding Oracle SQL/PLSQL. Functional knowledge of Oracle Fusion Middleware products, WebLogic, and database products. Interpreting explain plans, and able to read AWRs and ASH reports to identify SQL query bottlenecks. Ability to interpret explain plans for access paths and predicate filtering, with strong knowledge of join methods and join order in table access. Advanced knowledge of SQLHC and parallel query. Should have rich experience in factored/scalar subqueries and the ability to utilise them in SQL queries. Strong decision-making and convincing skills to bring complex discussions to closure, supported by the right arguments and facts. Understanding of database initialization parameters, logs, and traces.
Experience working on complex software development/integration projects. Experience in scripting tools like Unix shell, Python, and Perl. Optional: hands-on troubleshooting of Oracle OTBI and BIP report issues and performance-related techniques on components like data models/sets, LOVs, parameters, bursting, flex fields, and scheduling. Good to Have Skills: Understanding of AI and machine learning will be an added advantage. Oracle SQL performance diagnostic tools like SQLHC. Experience in using monitoring tools like OEM. Building synthetic test cases. Responsibilities: Optimize Oracle SQL: Diagnose and resolve individual SQL performance issues (you would need to apply hints, define new indexes, and investigate optimiser/environment issues). Early Adopter of Innovation: Explore and adopt new technologies like Oracle 23ai, Autonomous DB, auto indexing, and AI-related tools. Analyze and Optimize Oracle BIP Reports: Should be able to help customers identify performance issues with Oracle BIP report functionality and the SQL queries involved with them. Optional knowledge of various components of BIP reports like data models, parameters, LOVs, flex fields, and bursting. Should be able to advise customers to use the right SQL queries as per Oracle best practices, which aids in completing the report optimally. OTBI and BIP reports: Knowledge of troubleshooting OTBI reports to advise customers on performance-enhancing configuration parameters. Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes.
We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
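One tuning technique this role names, scalar subqueries, often shows up as a rewrite opportunity: a correlated scalar subquery can be expressed as a join, giving the optimizer more freedom. A toy sketch with stdlib sqlite3 as a stand-in for Oracle (where you would compare the two via `EXPLAIN PLAN`); the `emp`/`dept` schema is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE emp (id INTEGER, dept_id INTEGER);
CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO dept VALUES (1, 'eng'), (2, 'ops');
INSERT INTO emp VALUES (1, 1), (2, 1), (3, 2);
""")

# Correlated scalar subquery: the inner SELECT may execute once per emp row.
scalar = conn.execute("""
    SELECT e.id, (SELECT d.name FROM dept d WHERE d.id = e.dept_id)
    FROM emp e ORDER BY e.id
""").fetchall()

# Equivalent join: lets the optimizer choose join method and join order.
joined = conn.execute("""
    SELECT e.id, d.name FROM emp e JOIN dept d ON d.id = e.dept_id
    ORDER BY e.id
""").fetchall()

assert scalar == joined  # same rows; the join form typically plans better
print(joined)  # [(1, 'eng'), (2, 'eng'), (3, 'ops')]
```

The two queries are only interchangeable when the subquery returns at most one row per outer row, which is exactly the case a tuner has to verify before rewriting.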
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary: We are seeking a highly skilled and motivated SQL Developer with strong expertise in either PostgreSQL or Oracle databases, coupled with proficiency in Java and/or Python. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable database solutions, as well as integrating these solutions with applications built using Java and/or Python. You will play a crucial role in optimizing database performance, ensuring data integrity, and collaborating with cross-functional teams to deliver high-quality software. Responsibilities: Design, develop, and implement complex SQL queries, stored procedures, functions, and triggers for PostgreSQL or Oracle databases. Optimize database performance through indexing, query tuning, and database schema improvements. Work closely with application developers to integrate database solutions with Java and/or Python applications. Develop and maintain ETL processes (Extract, Transform, Load) for data migration and integration. Collaborate with business analysts to understand data requirements and translate them into technical specifications. Ensure data integrity, security, and availability. Perform database performance monitoring, troubleshooting, and tuning. Participate in database design reviews and provide recommendations for best practices. Develop and maintain documentation for database designs, processes, and procedures. Support existing database systems and applications, including on-call rotation as needed. Stay up-to-date with the latest database technologies and best practices in PostgreSQL, Oracle, Java, and Python. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent practical experience). 3+ years of experience as an SQL Developer. Strong proficiency in either PostgreSQL or Oracle databases, including: Expertise in writing complex SQL queries, stored procedures, functions, and triggers. 
Experience with database design, normalization, and optimization techniques. Understanding of database architecture and administration concepts. Proficiency in at least one of the following programming languages: Java: Experience with JDBC, ORM frameworks (e.g., Hibernate, JPA), and Spring Boot is a plus. Python: Experience with database connectors (e.g., psycopg2, cx_Oracle), ORM frameworks (e.g., SQLAlchemy), and data manipulation libraries (e.g., Pandas) is a plus. Experience with version control systems (e.g., Git). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical stakeholders. Ability to work independently and as part of a team in a fast-paced environment. Preferred Skills (Bonus Points): Experience with both PostgreSQL and Oracle databases. Familiarity with cloud platforms (AWS, Azure, GCP) and their database services. Experience with CI/CD pipelines and DevOps practices. Knowledge of data warehousing concepts and tools. Experience with data visualization tools (e.g., Tableau, Power BI). Familiarity with NoSQL databases (e.g., MongoDB, Cassandra). About Finacle Finacle is an industry leader in digital banking solutions. We partner with emerging and established financial institutions to inspire better banking. Our cloud-native solution suite and SaaS services help banks to engage, innovate, operate, and transform better. We are a business unit of EdgeVerve Systems, a wholly-owned product subsidiary of Infosys – a global technology leader with over USD 15 billion in annual revenues. We are differentiated by our functionally-rich solution suite, composable architecture, culture, and entrepreneurial spirit of a start-up. We are also known for an impeccable track record of helping financial institutions of all sizes drive digital transformation at speed and scale. 
Today, financial institutions in more than 100 countries rely on Finacle to help more than a billion people and millions of businesses to save, pay, borrow, and invest better. Finacle website (https://www.edgeverve.com/finacle/solutions/) Disclaimer :- Edgeverve Systems does not engage with external manpower agencies or charge any fees from candidates for recruitment. If you encounter such scams, please report them immediately.
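The responsibilities in this posting include developing triggers for PostgreSQL or Oracle. A minimal audit-trigger sketch, using stdlib sqlite3 so it runs without a server (PostgreSQL would pair `CREATE FUNCTION ... LANGUAGE plpgsql` with `CREATE TRIGGER`); the `accounts`/`audit_log` schema is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);
-- Audit trigger: record every balance change automatically, so callers
-- cannot forget to log; OLD/NEW expose the row before and after the update.
CREATE TRIGGER trg_balance_audit AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;
INSERT INTO accounts VALUES (1, 100.0);
""")

conn.execute("UPDATE accounts SET balance = 75.0 WHERE id = 1")
print(conn.execute("SELECT * FROM audit_log").fetchall())  # [(1, 100.0, 75.0)]
```

Putting the audit in the database rather than the application layer is the usual data-integrity argument for triggers, at the cost of some hidden write amplification that performance tuning then has to account for.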
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Position: JavaScript/Node Developer (React-Focused) About Us We are a team of extremely dedicated developers working on an online platform that could transform communication in the workplace. We are committed to improving productivity and streamlining information for all our users. The application is in beta, and we iterate on it based on user feedback as soon as it is received. The user experience is our top priority, and we consistently deliver to our users' expectations. If you're a driven MERN developer who thrives in dynamic environments and is passionate about building tools that enhance how teams work, we'd love to hear from you. Position Overview We are seeking a skilled MERN Stack Developer with a strong emphasis on React to design, develop, and maintain a high-performance, scalable web application. The ideal candidate will have deep expertise in building dynamic, responsive front-end interfaces using React, along with proficiency in the full MERN stack (MongoDB, Express.js, React, Node.js) to deliver end-to-end solutions. Technical Skills Requirements Core MERN Stack Skills React (Primary Focus) Advanced proficiency in React.js (v16.8+) and its core principles (components, props, state, lifecycle methods). Expertise in functional components and React Hooks (e.g., useState, useEffect, useContext, useReducer). Experience with state management libraries (e.g., Redux, Zustand, or Context API). Proficiency in React Router for client-side routing. Knowledge of performance optimization techniques (e.g., memoization, lazy loading, code splitting). Familiarity with React best practices, including component reusability, modularity, and clean code. Experience with UI libraries/frameworks (e.g., Material-UI, Ant Design, Tailwind CSS, or Chakra UI). Understanding of React's virtual DOM and reconciliation process. Ability to implement responsive and adaptive designs for cross-device compatibility.
Version Control Proficient with Git (e.g., GitHub, GitLab, Bitbucket) for version control, branching, and collaboration. MongoDB Proficiency in designing and managing NoSQL databases using MongoDB. Experience with schema design, indexing, and querying (e.g., aggregation pipelines). Knowledge of MongoDB Atlas or other cloud-based MongoDB services. Familiarity with ORMs/ODMs like Mongoose for schema validation and modeling. Node.js Proficiency in server-side JavaScript development using Node.js. Experience with asynchronous programming (e.g., Promises, async/await). Knowledge of building scalable backend services and microservices. Familiarity with Node.js frameworks and tools (e.g., NestJS, Fastify). Front-End Development Expertise in HTML5, CSS3/Tailwind, and JavaScript (ES6+). Experience with CSS preprocessors (e.g., SASS, LESS) and CSS-in-JS (e.g., styled-components, Emotion). Familiarity with front-end build tools (e.g., Webpack, Vite, Parcel). Additional Technical Skills Testing : Experience with testing frameworks and tools: Unit testing: Jest, React Testing Library. API Integration : Proficiency with RestAPI Security : Understanding of web security best practices (e.g., CORS, CSRF, XSS prevention). Experience with secure authentication and authorization (e.g., OAuth2, JWT). Performance Optimization : Ability to optimize front-end and back-end performance (e.g., lazy loading, caching, database indexing). Familiarity with tools like Lighthouse, Web Vitals, or Chrome DevTools. Optional but Desirable Skills Experience with Next.js for server-side rendering (SSR) and static site generation (SSG). Knowledge of GraphQL. Familiarity with real-time features using WebSockets or libraries like Socket.io. Experience with mobile-first development or React Native for cross-platform apps. Experience and Qualifications Professional Experience : Experience in full-stack development with a focus on React. Proven track record of building and deploying MERN stack applications. 
Portfolio or GitHub repository showcasing relevant projects (React-heavy projects preferred). Project Experience : Hands-on experience building scalable, production-ready MERN applications. Experience with e-commerce, SaaS, or data-driven applications is a plus. Responsibilities Develop and maintain high-quality, responsive front-end interfaces using React. Build and integrate RESTful or GraphQL APIs with Express.js and Node.js. Design and manage MongoDB databases for efficient data storage and retrieval. Collaborate with backend developers to deliver cohesive solutions. Write clean, modular, and testable code following industry best practices. Optimize application performance and ensure scalability. Participate in code reviews, testing, and deployment processes. Stay updated on the latest trends in React and MERN stack technologies. What to expect from the role Hybrid Work Model: The role requires working from our Hyderabad office 3–4 times a week , with flexibility for remote collaboration on other days. Team Collaboration: You’ll work closely with our development team in a fast-paced, agile environment. Deadline Commitment: Strong emphasis on timely delivery and accountability. You must be comfortable working with short iteration cycles and clear milestones. Apply now and help us shape the future of workplace communication.
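Of the performance techniques this posting names, memoization has the simplest contract: cache a computed value and only recompute when its inputs change. The contract is language-agnostic, so here it is sketched in Python with `functools.lru_cache` to match the other examples on this page (in React itself this is `useMemo(fn, [deps])`); the `expensive_summary` function is hypothetical:

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def expensive_summary(dataset_version: int) -> int:
    """Stands in for an expensive derivation; once memoized it reruns
    only when its input (the 'dependency') changes."""
    global calls
    calls += 1
    return sum(range(dataset_version * 1000))

expensive_summary(1)
expensive_summary(1)   # cache hit: not recomputed
expensive_summary(2)   # dependency changed: recomputed
print(calls)  # 2
```

The same caveat applies in both languages: memoization only pays off when the computation is genuinely expensive and the dependencies change rarely, otherwise the cache bookkeeping is pure overhead.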
Posted 1 week ago
4.0 years
0 Lacs
India
On-site
Required Skills & Qualifications: 4+ years of backend development experience with Node.js. Strong expertise in MongoDB (aggregation pipelines, indexing, sharding, etc.). Good understanding of relational databases (SQL) and schema optimization. Experience with Angular/React.js is an added advantage. Ensure code quality: ensure that code is of high quality, meets coding standards, and is well-documented. Resolve technical issues: expertise in code review; identify and resolve technical issues that arise during development and guide the team to create effective and reusable code. Sound knowledge in creating a semi-automated approach for code review, functional checks, file-handling operations, database queries, global variables, etc. Must have strong debugging and troubleshooting skills, with a solid understanding of logging and exception handling. Knowledge of AI/ML/computer vision-based applications. Strong communication skills. Job Types: Full-time, Permanent. Location Type: In-person. Schedule: Day shift. Ability to commute/relocate: Palayam, Thiruvananthapuram, Kerala: Reliably commute or planning to relocate before starting work (Required). Application Question(s): Current monthly salary? Least expected monthly salary? How early can you join? Experience: Node.js: 4 years (Required). MongoDB: 4 years (Required). MySQL: 4 years (Required). Microservices: 4 years (Preferred). React.js: 1 year (Preferred). Work Location: In person. Speak with the employer: +91 9072049595
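The MongoDB aggregation pipelines this posting asks for are just declarative group-and-reduce stages. Below, a hypothetical `$group`/`$sum` stage is shown as the dict you would pass to `collection.aggregate(...)`, next to a server-free plain-Python equivalent so the semantics are clear; the `orders` documents and their fields are invented:

```python
from collections import defaultdict

# The aggregation stage this mirrors (would be passed to collection.aggregate):
group_stage = {"$group": {"_id": "$status", "total": {"$sum": "$qty"}}}

orders = [
    {"status": "shipped", "qty": 2},
    {"status": "pending", "qty": 1},
    {"status": "shipped", "qty": 3},
]

# Server-free equivalent of the $group/$sum stage above: bucket documents
# by the _id expression, then reduce each bucket with the accumulator.
totals = defaultdict(int)
for doc in orders:
    totals[doc["status"]] += doc["qty"]
print(dict(totals))  # {'shipped': 5, 'pending': 1}
```

On a real deployment the server-side pipeline wins because it runs where the data lives and can use indexes for the match/sort stages that usually precede the `$group`.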
Posted 1 week ago