
726 Normalization Jobs - Page 7

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Make an impact with NTT DATA Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it’s a place where you can grow, belong and thrive. Your day at NTT DATA We are seeking an experienced Solution Architect/Business Development Manager with expertise in AI/ML to drive business growth and deliver innovative solutions. The successful candidate will be responsible for assessing client business requirements, designing technical solutions, recommending AI/ML approaches, and collaborating with delivery organizations to implement end-to-end solutions. What You'll Be Doing Key Responsibilities: Business Requirement Analysis: Assess client's business requirements and convert them into technical specifications that meet business outcomes. AI/ML Solution Design: Recommend the right AI/ML approaches to meet business requirements and design solutions that drive business value. Opportunity Sizing: Size the opportunity and develop business cases to secure new projects and grow existing relationships. Solution Delivery: Collaborate with delivery organizations to design end-to-end AI/ML solutions, ensuring timely and within-budget delivery. Costing and Pricing: Develop costing and pricing strategies for AI/ML solutions, ensuring competitiveness and profitability. Client Relationship Management: Build and maintain strong relationships with clients, understanding their business needs and identifying new opportunities. Technical Leadership: Provide technical leadership and guidance to delivery teams, ensuring solutions meet technical and business requirements. Knowledge Sharing: Share knowledge and expertise with the team, contributing to the development of best practices and staying up-to-date with industry trends. Collaboration: Work closely with cross-functional teams, including data science, engineering, and product management, to ensure successful project delivery. Requirements: Education: Master's degree in Computer Science, Engineering, or related field Experience: 10+ years of experience in AI/ML solution architecture, business development, or a related field Technical Skills: Strong technical expertise in AI/ML, including machine learning algorithms, deep learning, and natural language processing. 
Technical Skills: Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity (see the sketch after this listing). Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
Hyperscaler: Experience with cloud-based AI/ML platforms and tools (e.g., AWS SageMaker, Azure Machine Learning, Google Cloud AI Platform).
Soft Skills: Excellent business acumen and understanding of business requirements and outcomes; strong communication and interpersonal skills, with the ability to work with clients and delivery teams.
Business Acumen: Experience with solution costing and pricing strategies; strong analytical and problem-solving skills, with the ability to think creatively and drive innovation.
Nice to Have: Experience with Agile development methodologies; knowledge of industry-specific AI/ML applications (e.g., healthcare, finance, retail); certification in AI/ML or a related field (e.g., AWS Certified Machine Learning – Specialty).
Location: Delhi or Bangalore. Workplace type: Hybrid working.
About NTT DATA: NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.
Equal Opportunity Employer: NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
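For context on the data munging and modelling skills listed above, here is a minimal, illustrative sketch assuming pandas and scikit-learn; the input file, columns, and target are hypothetical and not part of the posting.

```python
# Minimal sketch: clean, normalize, and fit a logistic regression (hypothetical columns).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("customers.csv")                            # hypothetical input
df = df.drop_duplicates()
df["income"] = df["income"].fillna(df["income"].median())    # basic cleaning

features = ["income", "tenure_months", "num_products"]       # hypothetical features
X = StandardScaler().fit_transform(df[features])             # normalization
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression().fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```

The same pattern (clean, normalize, then fit) extends to the other listed algorithms by swapping the estimator.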

Posted 1 week ago

Apply

5.0 years

0 Lacs

Vadodara, Gujarat, India

On-site


Xylem is a Fortune 500 global water solutions company dedicated to advancing sustainable impact and empowering the people who make water work every day. As a leading water technology company with 23,000 employees operating in over 150 countries, Xylem is at the forefront of addressing the world's most critical water challenges. We invite passionate individuals to join our team, dedicated to exceeding customer expectations through innovative and sustainable solutions. As a Data Engineer, you will design, develop, and optimize scalable data pipelines and workflows to support advanced analytics and business intelligence needs. You will collaborate with cross-functional teams to ensure data accessibility, integrity, and security. Core Responsibilities Design, develop, and implement robust data pipelines for data collection, transformation, and integration. Collaborate with senior engineers to architect scalable data solutions using Azure services, including Azure Data Factory and Databricks. Integrate data from SAP ERP systems and other enterprise platforms into modern cloud-based data ecosystems. Leverage Databricks for big data processing and workflow optimization. Work with stakeholders to understand data requirements, ensuring data quality and consistency. Maintain data governance practices to support compliance and security protocols. Support analytics teams by providing well-structured, reliable data for reporting and machine learning projects. Troubleshoot and resolve data pipeline and workflow issues. Qualifications Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field. 3–5 years of experience in data engineering or a related role. Proficiency in Azure technologies, including Azure Data Factory, Azure SQL Database, and Databricks. Experience with SAP data integration is a plus. Strong SQL and Python programming skills for data engineering tasks. Familiarity with data modeling concepts (e.g., star and snowflake schemas) and best practices. Experience with CI/CD pipelines for deploying data workflows and infrastructure. Knowledge of distributed file systems like Azure Data Lake or equivalent cloud storage solutions. Basic understanding of Apache Spark for distributed data processing. Strong problem-solving skills and a collaborative mindset. Technical Knowledge Deep understanding of Azure cloud infrastructure and services, particularly those related to data management (e.g., Azure Data Lake, Azure Blob Storage, Azure SQL Database). Experience with Azure Data Factory (ADF) for orchestrating ETL pipelines and automating data workflows. Familiarity with Azure Databricks for big data processing, machine learning, and collaborative analytics. Expertise in Apache Spark for distributed data processing and large-scale analytics. Familiarity with Databricks, including managing clusters and optimizing performance for big data workloads. Understanding of Databricks Bronze, Silver, and Gold Model. Understanding of distributed file systems like HDFS and cloud-based equivalents like Azure Data Lake. Proficiency in SQL and NoSQL databases, including designing schemas, query optimization, and managing large datasets. Experience with data warehousing solutions like Databricks, Azure Synapse Analytics or Snowflake. Familiarity with connecting data Lakehouse’s with Power BI. Understanding of OLAP (Online Analytical Processing) and OLTP (Online Transaction Processing) systems. 
Strong grasp of data modeling techniques, including conceptual, logical, and physical data models. Experience with star schema, snowflake schema, and normalization for designing scalable, performant databases. Knowledge of data architecture best practices, ensuring efficient data flow, storage, and retrieval. Knowledge of CI/CD pipelines for automating the deployment of data pipelines, databases, and infrastructure. Experience with infrastructure-as-code tools like Terraform or Azure Resource Manager to manage cloud resources.
Preferred Qualifications: Familiarity with tools like Apache Airflow or other workflow orchestration tools. Knowledge of Azure Monitor or similar tools for system performance tracking. Certifications in Azure Data Engineering or related cloud platforms.
Join the global Xylem team to be a part of innovative technology solutions transforming water usage, conservation, and re-use. Our products impact public utilities, industrial sectors, residential areas, and commercial buildings, with a commitment to providing smart metering, network technologies, and advanced analytics for water, electric, and gas utilities. Partner with us in creating a world where water challenges are met with ingenuity and dedication; where we recognize the power of inclusion and belonging in driving innovation and allowing us to compete more effectively around the world.
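As an illustration of the Bronze/Silver/Gold (medallion) layering mentioned above, here is a minimal PySpark sketch of a Bronze-to-Silver step; the mount paths, table, and column names are assumptions, not Xylem's actual pipeline.

```python
# Sketch of a Bronze -> Silver step in a Databricks-style medallion layout (names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.format("delta").load("/mnt/lake/bronze/meter_readings")

silver = (
    bronze
    .dropDuplicates(["meter_id", "reading_ts"])                # de-duplicate raw events
    .filter(F.col("reading_value").isNotNull())                # drop incomplete records
    .withColumn("reading_date", F.to_date("reading_ts"))       # derive partition column
)

(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("reading_date")
       .save("/mnt/lake/silver/meter_readings"))
```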

Posted 1 week ago

Apply

10.0 years

0 Lacs

Delhi Cantonment, Delhi, India

On-site


Make an impact with NTT DATA Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it’s a place where you can grow, belong and thrive. Your day at NTT DATA We are seeking an experienced Solution Architect/Business Development Manager with expertise in AI/ML to drive business growth and deliver innovative solutions. The successful candidate will be responsible for assessing client business requirements, designing technical solutions, recommending AI/ML approaches, and collaborating with delivery organizations to implement end-to-end solutions. What You'll Be Doing Key Responsibilities: Business Requirement Analysis: Assess client's business requirements and convert them into technical specifications that meet business outcomes. AI/ML Solution Design: Recommend the right AI/ML approaches to meet business requirements and design solutions that drive business value. Opportunity Sizing: Size the opportunity and develop business cases to secure new projects and grow existing relationships. Solution Delivery: Collaborate with delivery organizations to design end-to-end AI/ML solutions, ensuring timely and within-budget delivery. Costing and Pricing: Develop costing and pricing strategies for AI/ML solutions, ensuring competitiveness and profitability. Client Relationship Management: Build and maintain strong relationships with clients, understanding their business needs and identifying new opportunities. Technical Leadership: Provide technical leadership and guidance to delivery teams, ensuring solutions meet technical and business requirements. Knowledge Sharing: Share knowledge and expertise with the team, contributing to the development of best practices and staying up-to-date with industry trends. Collaboration: Work closely with cross-functional teams, including data science, engineering, and product management, to ensure successful project delivery. Requirements: Education: Master's degree in Computer Science, Engineering, or related field Experience: 10+ years of experience in AI/ML solution architecture, business development, or a related field Technical Skills: Strong technical expertise in AI/ML, including machine learning algorithms, deep learning, and natural language processing. 
Technical Skills: Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
Hyperscaler: Experience with cloud-based AI/ML platforms and tools (e.g., AWS SageMaker, Azure Machine Learning, Google Cloud AI Platform).
Soft Skills: Excellent business acumen and understanding of business requirements and outcomes; strong communication and interpersonal skills, with the ability to work with clients and delivery teams.
Business Acumen: Experience with solution costing and pricing strategies; strong analytical and problem-solving skills, with the ability to think creatively and drive innovation.
Nice to Have: Experience with Agile development methodologies; knowledge of industry-specific AI/ML applications (e.g., healthcare, finance, retail); certification in AI/ML or a related field (e.g., AWS Certified Machine Learning – Specialty).
Location: Delhi or Bangalore. Workplace type: Hybrid working.
About NTT DATA: NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.
Equal Opportunity Employer: NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface integrate seamlessly with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros. Good-to-have skills: NA. Minimum 3 years of experience is required. Educational qualification: 15 years of full-time education.
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines (see the sketch below).
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
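To illustrate the SAS-to-Python migration described in the responsibilities, here is a rough sketch of how a simple SAS data-prep job (a filtered DATA step plus a PROC MEANS-style aggregation) might translate to pandas; the dataset and columns are hypothetical.

```python
# Rough pandas equivalent of a simple SAS data-prep job (hypothetical dataset and columns):
#   data work.active;  set src.accounts;  where status = 'A';  run;
#   proc means data=work.active noprint;  class region;  var balance;
#   output out=work.summary mean=avg_balance sum=total_balance;  run;
import pandas as pd

accounts = pd.read_parquet("accounts.parquet")          # replaces the SAS source library
active = accounts[accounts["status"] == "A"]            # WHERE status = 'A'

summary = (
    active.groupby("region", as_index=False)
          .agg(avg_balance=("balance", "mean"),
               total_balance=("balance", "sum"))
)
summary.to_parquet("summary.parquet", index=False)
```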

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

Remote


Job Role: Sr. Cyber Security Engineer (L3)
Type: Full Time
Location: Remote
Intraedge is seeking a seasoned Cybersecurity Engineer on behalf of its financial domain client to support advanced threat detection, data-driven defense, and automation within a cloud-first, consumer-centric environment. This role will lead the development and implementation of intelligent security solutions using SIEM, SOAR, and machine learning to enhance detection, response, and operational efficiency across the enterprise.
Key Responsibilities
* Design, implement, and manage enterprise SIEM (Splunk) solutions for centralized log analysis and real-time event monitoring.
* Develop and fine-tune correlation rules, alerts, dashboards, and use cases to detect anomalous and malicious activity.
* Lead data ingestion and normalization from varied enterprise systems (e.g., cloud workloads, endpoints, network devices); a normalization sketch follows this listing.
* Develop and maintain SOAR playbooks to automate incident detection, triage, response, and recovery.
* Optimize SOAR workflows and integrations with security infrastructure to reduce MTTD/MTTR.
* Build and apply machine learning models to identify security anomalies, enrich event context, and predict threats.
* Collaborate with the Security Operations Center (SOC), DevOps, IT, and business units to align security automation with business goals.
* Analyze incident data to uncover trends and provide recommendations for improving controls and detection.
* Maintain detailed documentation for playbooks, integrations, automation processes, and incident response protocols.
* Stay abreast of industry trends and emerging tools to continually advance detection and automation strategies.
* Mentor junior engineers and assist in promoting SOAR and SIEM best practices across the team.
Required Qualifications
* 4+ years of experience in cybersecurity engineering, including SIEM (Splunk), SOAR, and machine learning-based threat detection.
* 3+ years of experience in security automation using platforms such as Splunk SOAR, XSOAR, Swimlane, or similar.
* 3+ years in cyber data engineering or analytics: log processing, enrichment, and telemetry pipelines.
* Expertise in scripting languages like Python and PowerShell, and in using REST APIs for integrations.
* Proven experience designing and deploying security automation workflows in enterprise environments.
* Bachelor's degree in Computer Science, Information Security, Engineering, or a related field, or equivalent experience.
* Ability to troubleshoot complex security issues and integrate with diverse platforms.
* Strong communication and collaboration skills to work with technical and non-technical stakeholders.
Preferred Qualifications
* Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
* Familiarity with cloud-native security tooling, telemetry pipelines, and serverless security design patterns.
* Experience working within Agile environments and cross-functional DevSecOps teams.
* Knowledge of change management processes, compliance frameworks (e.g., NIST, ISO), and regulatory constraints in financial services.
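As a small illustration of the log ingestion and normalization responsibility above, here is a hedged Python sketch that maps events from two assumed sources onto one common schema before SIEM indexing; the field names are illustrative, not a Splunk or client standard.

```python
# Sketch: normalize events from different sources into one flat schema (field names are assumptions).
import json
from datetime import datetime, timezone

def normalize(raw: dict, source: str) -> dict:
    """Map a raw event from a given source onto a common schema."""
    if source == "firewall":
        return {
            "timestamp": raw["ts"],
            "src_ip": raw["src"],
            "dst_ip": raw["dst"],
            "action": raw["disposition"].lower(),
        }
    if source == "cloudtrail":
        return {
            "timestamp": raw["eventTime"],
            "src_ip": raw.get("sourceIPAddress"),
            "dst_ip": None,
            "action": raw["eventName"],
        }
    raise ValueError(f"unknown source: {source}")

event = {"ts": datetime.now(timezone.utc).isoformat(), "src": "10.0.0.5",
         "dst": "203.0.113.7", "disposition": "BLOCKED"}
print(json.dumps(normalize(event, "firewall"), indent=2))
```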

Posted 1 week ago

Apply

5.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site


Job Description It is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Summary Database Engineer/ Developer - Core Skills Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles. Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous. A good understanding of data security measures and compliance is also required. Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes, and knowledge of cloud-based databases like AWS RDS and Google BigQuery. Min 5 years of experience. JD Database Engineer - Data Research Engineering Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. 
Work on data security and compliance, by implementing access controls, encryption, and compliance standards. Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. 
Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. Eagerness to develop import workflows and scripts to automate data import processes (see the sketch below). Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Effective communication skills. Comfortable with autonomy and the ability to work independently.
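As an illustration of the import workflows mentioned in this listing, here is a minimal sketch using pandas and SQLAlchemy (both named in the posting); the connection string, sheet, and table names are placeholders.

```python
# Sketch: automate a spreadsheet-to-PostgreSQL import with pandas + SQLAlchemy
# (connection string, sheet, and table names are placeholders).
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/research")

df = pd.read_excel("rates.xlsx", sheet_name="latest")
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]   # normalize headers
df = df.dropna(subset=["provider", "rate"])                              # basic validation

df.to_sql("provider_rates", engine, schema="staging",
          if_exists="replace", index=False, chunksize=1000)
```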

Posted 1 week ago

Apply

15.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Project Role: Responsible AI Engineer
Project Role Description: Assess AI systems for adherence to predefined thresholds and benchmarks related to responsible, ethical and sustainable practices. Design and implement technology mitigation strategies for systems to ensure ethical and responsible standards are achieved.
Must-have skills: Responsible AI. Good-to-have skills: NA. Educational qualification: 15 years of full-time education.
Summary: As a Responsible AI Engineer, you will assess AI systems for adherence to predefined thresholds and benchmarks related to responsible, ethical, and sustainable practices (an illustrative check follows this listing). You will design and implement technology mitigation strategies for systems to ensure ethical and responsible standards are achieved.
Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Develop and implement responsible AI frameworks.
- Conduct audits and assessments of AI systems for ethical compliance.
- Collaborate with cross-functional teams to ensure responsible AI practices are integrated.
Professional & Technical Skills:
- Must-have skills: Proficiency in Responsible AI.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
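As one example of assessing an AI system against a predefined threshold, here is an illustrative Python sketch of a disparate-impact style check; the data, groups, and 0.8 threshold are invented for illustration and are not a prescribed method from the posting.

```python
# Illustrative check: compare positive-outcome rates across groups against a threshold.
import pandas as pd

preds = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   0,   0],
})

rates = preds.groupby("group")["approved"].mean()
impact_ratio = rates.min() / rates.max()          # disparate impact ratio
THRESHOLD = 0.8                                   # illustrative benchmark (four-fifths rule)

print(f"approval rates:\n{rates}")
print(f"impact ratio = {impact_ratio:.2f} -> {'PASS' if impact_ratio >= THRESHOLD else 'REVIEW'}")
```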

Posted 1 week ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities
Be able to align data models with business goals and enterprise architecture
Collaborate with Data Architects, Engineers, Business Analysts, and Leadership teams
Lead data modelling, governance discussions and decision-making across cross-functional teams
Proactively identify data inconsistencies, integrity issues, and optimization opportunities
Design scalable and future-proof data models
Define and enforce enterprise data modelling standards and best practices
Experience working in Agile environments (Scrum, Kanban)
Identify impacted applications, size capabilities, and create new capabilities
Lead complex initiatives with multiple cross-application impacts, ensuring seamless integration
Drive innovation, optimize processes, and deliver high-quality architecture solutions
Understand business objectives, review business scenarios, and plan acceptance criteria for proposed solution architecture
Discuss capabilities with individual applications, resolve dependencies and conflicts, and reach agreements on proposed high-level approaches and solutions
Participate in Architecture Review, present solutions, and review other solutions
Work with Enterprise architects to learn and adopt standards and best practices
Design solutions adhering to applicable rules and compliances
Stay updated with the latest technology trends to solve business problems with minimal change or impact
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
Undergraduate degree or equivalent experience
8+ years of proven experience in a similar role, leading and mentoring a team of architects and technical leads
Extensive experience with relational, dimensional, and NoSQL data modelling
Experience in driving innovation, optimizing processes, and delivering high-quality solutions
Experience in large-scale OLAP, OLTP, and hybrid data processing systems
Experience in complex initiatives with multiple cross-application impacts
Expert in Erwin for conceptual, logical, and physical data modelling
Expertise in relational databases, SQL, indexing and partitioning for databases like Teradata, Snowflake, Azure Synapse or traditional RDBMS
Expertise in ETL/ELT architecture, data pipelines, and integration strategies
Expertise in data normalization, denormalization and performance optimization
Exposure to cloud platforms, tools, and AI-based solutions
Solid knowledge of 3NF, star schema, snowflake schema, and Data Vault (a small star-schema sketch follows this listing)
Exposure to Java, Python, Spring, the Spring Boot framework, SQL, MongoDB, Kafka, React JS, Dynatrace, and Power BI
Knowledge of Azure Platform as a Service (PaaS) offerings (Azure Functions, App Service, Event Grid)
Good knowledge of the latest happenings in the technology world
Advanced SQL skills for complex queries, stored procedures, indexing, partitioning, macros, recursive queries, query tuning and OLAP functions
Understanding of data privacy regulations, Master Data Management, and data quality
Proven excellent communication and leadership skills
Proven ability to think from a long-term perspective and arrive at intentional and strategic architecture
Proven ability to provide consistent solutions across Lines of Business (LOB)
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
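To make the star-schema requirement above concrete, here is a small sketch of one fact table and two dimensions declared with SQLAlchemy Core; the table and column names are invented for illustration.

```python
# Sketch of a small star schema (one fact, two dimensions) declared with SQLAlchemy Core;
# table and column names are illustrative only.
from sqlalchemy import MetaData, Table, Column, Integer, Numeric, String, Date, ForeignKey

metadata = MetaData()

dim_member = Table(
    "dim_member", metadata,
    Column("member_key", Integer, primary_key=True),
    Column("member_id", String(20)),
    Column("plan_type", String(30)),
)

dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),
    Column("calendar_date", Date),
)

fact_claims = Table(
    "fact_claims", metadata,
    Column("claim_key", Integer, primary_key=True),
    Column("member_key", Integer, ForeignKey("dim_member.member_key")),
    Column("date_key", Integer, ForeignKey("dim_date.date_key")),
    Column("paid_amount", Numeric(12, 2)),
)
```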

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Skillsets And Attitudes Must have : Educational Background : Bachelors degree in computer science, Information Technology, or a related field. Technical Proficiency Hands-on experience with database technologies (e.g., Oracle, SQL Server, MySQL, PostgreSQL) with ability to work with large data sets Expertise in writing complex SQL queries for data manipulation and analysis Experience with one or more programming languages (e.g., Python, Java, C++, etc.). Strong understanding of data architecture principles. Skills in tuning database performance, including indexing, partitioning, and query optimization. Experience in implementing robust backup and recovery strategies Familiarity with cloud database services (AWS RDS, Azure SQL Database) is a must. Experience with data warehouses, distributed data platforms, and data lakes. Good To Have Certifications (Preferred, but not mandatory) : Certifications such as Oracle Certified Professional, Microsoft Certified Database Administrator, or equivalent are advantageous. Problem-Solving Skills : Strong analytical and problem-solving abilities. Adaptability : Ability to adapt to new technologies and changing requirements. Proficiency in data analytics and visualization tools Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders. Excellent business and technical communication Your Core Role We're looking for a skilled Data Engineer to enhance our data systems. You will design and build the foundation of the data/analytics architecture for the organization. Your contributions will be vital in maintaining efficient and reliable database operations to support our organization's data needs. Key Responsibilities Design, build, and optimize the data architecture and extract, transform, and load (ETL) pipelines to make them accessible for Business Data Analysts, Data Scientists, and business users. Drive standards in data reliability, data integrity, and data governance, enabling accurate, consistent, and trustworthy data sets, business intelligence products, and analyses. Database Normalization to reduce redundancy, improve data integrity and design scalable database schemas Performance Tuning including indexing, partitioning, and query optimization. Review data organization and implement archival strategies to improve overall DB performance Database Security best practices and implement security measures Implement robust backup and recovery strategies. Troubleshooting to diagnose and resolve database issues effectively. Scripting and Automation (e.g., PowerShell, Python) for automating database tasks. What You Can Expect Five-Day workweek. Fair pay. The team will support you and push you. We will debate and question you. We will help you find what you are good at and let you take unilateral decisions there. We will prod you to get better at the things you are not good at. You will interact with our coordinators and field agents on the ground. You will also interact with decision-makers from within the social impact ecosystem. You will enable data driven business decisions, create aha moments through insights you generate, and create new opportunities through deep analysis using internal/ external data. You will use these insights to create better touchpoints that get the job done. You will need to get your hands dirty. You will be a part of discussion with teams like program delivery team, program management team, product team, CXOs and Engineering Team which means you will be working across disciplines. 
(ref:hirist.tech)
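As a small illustration of the database normalization responsibility above (reducing redundancy before load), here is a hedged pandas sketch that splits a denormalized extract into separate customer and order tables; the columns and values are invented.

```python
# Sketch: split a denormalized orders extract into normalized customer and order tables
# (column names are hypothetical).
import pandas as pd

flat = pd.DataFrame({
    "order_id":      [101, 102, 103],
    "customer_name": ["Asha", "Asha", "Ravi"],
    "customer_city": ["Pune", "Pune", "Mumbai"],
    "amount":        [2500, 1200, 900],
})

customers = (flat[["customer_name", "customer_city"]]
             .drop_duplicates()
             .reset_index(drop=True))
customers["customer_id"] = customers.index + 1            # surrogate key

orders = flat.merge(customers, on=["customer_name", "customer_city"])
orders = orders[["order_id", "customer_id", "amount"]]     # redundancy removed

print(customers)
print(orders)
```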

Posted 1 week ago

Apply

2.0 - 31.0 years

0 - 1 Lacs

Paldi, Ahmedabad Region

Remote


Key Responsibilities: Common: Develop and maintain custom ERP modules in Angular + Node.js architecture. Translate business logic into scalable RESTful APIs. Work with relational and NoSQL databases for ERP data structuring. Integrate intuitive UI/UX for internal and client-facing modules. Write clean, maintainable code with strong documentation. Understand ERP workflows: sales, purchase, HR, accounts, CRM, warehouse, etc. Perform testing, debugging, and deployment of ERP modules. Senior Developer: Design ERP architecture and lead module ownership. Mentor and review junior team members. Create microservice-based backend for scalable ERP logic. Suggest optimization for speed, performance, and usability. Handle API versioning, DevOps deployments, and database normalization. Junior Developer: Assist in ERP module development under senior guidance. Convert Figma/UI mockups into Angular components. Work on validations, UI logic, and frontend state management. Participate in testing, bug-fixing, and API consumption. Technical Skills – Must Have: Node.js with Express Angular 10+ with RxJS Experience in ERP modules (min. 1 live project) REST API & JSON workflows MongoDB / MySQL / PostgreSQL UI/UX design understanding (Figma/XD to HTML conversion) Git, GitHub / Bitbucket Agile / Scrum methodology

Posted 1 week ago

Apply

10.0 years

0 Lacs

India

Remote


Company Description Staffbee Solutions INC. is a company that focuses on providing quality staffing solutions by finding individuals with strong character attributes, educational backgrounds, practical skills, specialized knowledge, or work experience. The company aims to fulfill requirements with great quality and satisfaction for their clients. Role Overview: We are looking for a highly skilled and experienced ServiceNow professional (10+ years) to join our freelance technical interview panel . As a Panelist, you’ll play a critical role in assessing candidates for ServiceNow Developer, Admin, and Architect roles by conducting deep technical interviews and evaluating hands-on expertise, problem-solving skills, and platform knowledge. This is an excellent opportunity for technically strong freelancers who enjoy sharing their expertise, influencing hiring decisions, and working flexible hours remotely. Key Responsibilities: Conduct live technical interviews and evaluations over video calls (aligned to EST hours) Assess candidates’ practical expertise in: Core ServiceNow modules (ITSM, CMDB, Discovery, Incident/Change/Problem) Custom application development & configuration Client/Server-side scripting (JavaScript, Business Rules, UI Policies, Script Includes) Integrations (REST/SOAP APIs, Integration Hub) Flow Designer, Service Portal, ACLs, ATF, and CI/CD practices Review coding tasks and scenario-based architecture questions Provide detailed, structured feedback and recommendations to the hiring team Collaborate on refining technical evaluation criteria if needed Required Skills & Experience (Advanced Technical Expertise): 10+ years of extensive hands-on experience with the ServiceNow platform in enterprise-grade environments Strong command over ServiceNow Core Modules : ITSM, ITOM, CMDB, Asset & Discovery, Incident/Change/Problem/Knowledge Management Proven expertise in custom application development using scoped apps, App Engine Studio, and Now Experience UI Framework Deep proficiency in ServiceNow scripting , including: Server-side : Business Rules, Script Includes, Scheduled Jobs, GlideRecord, GlideAggregate Client-side : UI Policies, Client Scripts, UI Actions, GlideForm/GlideUser APIs Middleware logic for cross-platform communication and custom handlers Experience implementing Access Control Lists (ACLs) with dynamic filters and condition-based restrictions Expert in Service Portal customization using AngularJS widgets, Bootstrap, and custom REST endpoints Proficient in Integration Hub , Custom REST/SOAP APIs , OAuth 2.0 authentication, MID Server integrations, external system integration (e.g., SAP, Azure, Jira, Dynatrace, etc.) 
Hands-on with Flow Designer, Orchestration, and Event Management
Expertise in ServiceNow CMDB, CI class modeling, reconciliation rules, identification/normalization strategies, and dependency mappings
Familiarity with ServiceNow performance tuning: scheduled job optimization, lazy loading, database indexing, client/server execution efficiency
Working knowledge of the Automated Test Framework (ATF) and integration with CI/CD pipelines (Jenkins, Git, Azure DevOps)
Understanding of ServiceNow DevOps, version control, scoped app publishing, and update set migration best practices
Knowledge of Security Operations (SecOps) and Governance, Risk & Compliance (GRC) is a plus
Experience guiding architectural decisions, governance models, and platform upgrade strategies
Prior experience conducting technical interviews, design evaluations, or acting as a technical SME/panelist
Excellent communication and feedback documentation skills, with the ability to clearly explain technical rationale and candidate assessments
Comfortable working independently and engaging with global stakeholders during USA EST hours (after 8 PM IST)

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Title: ServiceNow Architect Location: Noida Experience: 7+ Job Summary We are seeking a highly skilled ServiceNow professional with deep expertise in Hardware Asset Management (HAM) , Software Asset Management (SAM) , and Configuration Management Database (CMDB) . The ideal candidate will play a key role in designing, implementing, and optimizing asset and configuration management solutions on the ServiceNow platform. This role requires both strong technical acumen and functional understanding of IT asset lifecycle and configuration management best practices. Key Responsibilities Design and configure ServiceNow modules including HAM, SAM, and CMDB to align with business goals and ITIL processes. Implement best practices for asset discovery, normalization, license compliance, and reconciliation using ServiceNow Discovery and IntegrationHub . Ensure CMDB data integrity and health through effective class models, data normalization, and relationship mapping. Define asset lifecycle workflows for hardware and software, from procurement to retirement. Integrate ServiceNow with third-party systems (e.g., SCCM, JAMF, Tanium, Flexera, AWS, Azure) for accurate asset and configuration data ingestion. Lead workshops with stakeholders to gather requirements and translate them into technical solutions. Establish and enforce governance, data quality, and reconciliation policies for CMDB and Asset Management. Collaborate with ITSM, ITOM, Security Ops, and Procurement teams to ensure data alignment across the platform. Mentor junior developers and provide technical oversight for asset and CMDB-related enhancements. Drive the roadmap for HAM/SAM/CMDB capabilities in alignment with ServiceNow's latest releases. Required Skills & Experience 5+ years of hands-on experience in ServiceNow with focus on HAM, SAM, and CMDB . Deep knowledge of ServiceNow Discovery , Asset Management Lifecycle , Software License Management , and CMDB design principles . Proficiency in JavaScript , Glide API , Flow Designer , and REST/SOAP integrations . Experience implementing ServiceNow SAM Professional and managing vendor software models, entitlements, and compliance. Familiarity with data ingestion sources and normalization techniques using ILMT , SCCM , BigFix , etc. Understanding of ITIL v3/v4 framework, especially around Asset, Configuration, and Change Management. Strong analytical and problem-solving skills, with attention to detail. Excellent communication and stakeholder management skills. Certifications- Would be great – Not Mandatory ServiceNow Certified System Administrator ServiceNow Certified Implementation Specialist – HAM / SAM / CMDB / Discovery ITIL v3 or v4 Foundation Certification ServiceNow Certified Technical Architect (a plus). Work on enterprise-scale ServiceNow implementations. Join a high-performing, collaborative ITSM/ITAM team. Opportunity to lead digital transformation initiatives using ServiceNow’s latest technologies. Flexible working environment and continuous learning culture. Show more Show less

Posted 1 week ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


JOB_POSTING-3-71493-1
Job Description
Role Title: AVP, Enterprise Logging & Observability (L11)
Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 of India’s Best Workplaces in Building a Culture of Innovation by GPTW and in the Top 25 Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and among Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.
Organizational Overview: Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, spanning Engineering, Development, Operations, Onboarding, and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.
Role Summary/Purpose: The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization’s centralized logging and observability platform. The role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. It leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. The position bridges the gap between technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, governance, audit and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.
Key Responsibilities
Splunk Development & Platform Management: Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions. Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform. Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.
Splunk ITSI Implementation & Management: Develop and configure ITSI services, entities, and correlation searches. Implement notable event aggregation policies and automate response actions.
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring. Security & Compliance Enablement Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI). Enable visibility for encryption events, access anomalies, secrets management, and audit trails. Support security control mapping and automation through observability. Stakeholder Engagement Act as a strategic advisor and point of contact for business units, application, infrastructure, security stakeholders and business teams leveraging Splunk. Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment. Maintain clear and timely communications across all levels of the organization. Process & Governance Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies. Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness. Ensure alignment with enterprise architecture and data classification models. Lead improvements in logging onboarding lifecycle time, automation pipelines, and selfservice ingestion tools. Mentor junior team members and guide engineering teams on secure, standardized logging practices. Required Skills/Knowledge Bachelor's degree with Minimum of 6+ years of experience in Technology ,or in lieu of a degree 8+ years of Experience in Technology Minimum of 3+ years of experience in leading development team or equivalent role in observability, logging, or security platforms. Splunk Subject Matter Expert (SME) Strong hands-on understanding of Splunk architecture, pipelines, dashboards, and alerting, data ingestion, search optimization, and enterprise-scale operations. Experience supporting security use cases, encryption visibility, secrets management, and compliance logging. Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement & Process & Governance Experience with Splunk Premium Apps - ITSI and Enterprise Security (ES) minimally Experience with Data Streaming Platforms & tools like Cribl, Splunk Edge Processor. Proven ability to work in Agile environments using tools such as JIRA or JIRA Align. Strong communication, leadership, and stakeholder management skills. Familiarity with security, risk, and compliance standards relevant to BFSI. Proven experience leading product development teams and managing cross-functional initiatives using Agile methods. Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud. Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking. Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm Develop scripts (python, JavaScript, etc.) as needed in support of data collection or integration Develop new applications leveraging Splunk’s analytic and Machine Learning tools to maximize performance, availability and security improving business insight and operations. Support senior engineers in analyzing system issues and performing root cause analysis (RCA). 
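As a stand-alone illustration of the anomaly-detection work described above (outside Splunk MLTK), here is a minimal Python sketch that flags hours whose ingest volume deviates sharply from a rolling baseline; the numbers and the 50% threshold are invented.

```python
# Stand-alone sketch of the anomaly-detection idea described above: flag hours whose
# ingest volume deviates sharply from a rolling baseline (numbers are invented).
import pandas as pd

volumes = pd.Series(
    [98, 102, 97, 101, 99, 400, 100, 103],            # events per hour; 400 is the anomaly
    index=pd.date_range("2024-01-01", periods=8, freq="h"),
)

baseline = volumes.rolling(window=3, min_periods=1).median()
deviation = (volumes - baseline).abs() / baseline

anomalies = volumes[deviation > 0.5]                  # illustrative 50% threshold
print(anomalies)
```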
Desired Skills/Knowledge Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations. Exposure to SIEM integration, security orchestration, or SOAR platforms. Knowledge of cloud-native observability (e.g. AWS/GCP/Azure logging). Experience in BFSI or regulated industries with high-volume data handling. Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging. Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling. Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage . Awareness of data classification, retention, and masking/anonymization strategies. Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty) Experience with Version Control tools – Git, Bitbucket Eligibility Criteria Bachelor's degree with Minimum of 6+ years of experience in Technology ,or in lieu of a degree 8+ years of Experience in Technology Minimum of 3+ years of experience in leading development team or equivalent role in observability, logging, or security platforms. Demonstrated success in managing large-scale logging platforms in regulated environments. Excellent communication, leadership, and cross-functional collaboration skills. Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes. Prior experience in large-scale, security-driven logging or observability platform development. Excellent problem-solving skills and the ability to work independently or as part of a team. Strong communication and interpersonal skills to interact effectively with team members and stakeholders. Knowledge of IT Service Management (ITSM) and monitoring tools. Knowledge of other data analytics tools or platforms is a plus. WORK TIMINGS : 01:00 PM to 10:00 PM IST This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time – 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details. For Internal Applicants Understand the criteria or mandatory skills required for the role, before applying Inform your manager and HRM before applying for any role on Workday Ensure that your professional profile is updated (fields such as education, prior experience, other skills) and it is mandatory to upload your updated resume (Word or PDF format) Must not be any corrective action plan (First Formal/Final Formal, PIP) L9+ Employees who have completed 18 months in the organization and 12 months in current role and level are only eligible. L09+ Employees can apply. Level / Grade : 11 Job Family Group Information Technology Show more Show less

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Data Engineer Location: Bangalore About US FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world. The Opportunity “As a Data Engineer on our newly formed Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform to include fraud investigation, decision automation, process flow automation, and optimization. You will play a critical role in the implementation of Data Warehousing and Data Lake solutions. You will have the opportunity to make a meaningful impact on FICO’s platform by infusing it with next-generation AI capabilities. You’ll work with a dedicated team, leveraging your skills in the data engineering area to build solutions and drive innovation forward. ”. What You’ll Contribute Perform hands-on analysis, technical design, solution architecture, prototyping, proofs-of-concept, development, unit and integration testing, debugging, documentation, deployment/migration, updates, maintenance, and support on Data Platform technologies. Design, develop, and maintain robust, scalable data pipelines for batch and real-time processing using modern tools like Apache Spark, Kafka, Airflow, or similar. Build efficient ETL/ELT workflows to ingest, clean, and transform structured and unstructured data from various sources into a well-organized data lake or warehouse. Manage and optimize cloud-based data infrastructure on platforms such as AWS (e.g., S3, Glue, Redshift, RDS) or Snowflake. Collaborate with cross-functional teams to understand data needs and deliver reliable datasets that support analytics, reporting, and machine learning use cases. Implement and monitor data quality, validation, and profiling processes to ensure the accuracy and reliability of downstream data. Design and enforce data models, schemas, and partitioning strategies that support performance and cost-efficiency. Develop and maintain data catalogs and documentation, ensuring data assets are discoverable and governed. Support DevOps/DataOps practices by automating deployments, tests, and monitoring for data pipelines using CI/CD tools. Proactively identify data-related issues and drive continuous improvements in pipeline reliability and scalability. Contribute to data security, privacy, and compliance efforts, implementing role-based access controls and encryption best practices. Design scalable architectures that support FICO’s analytics and decisioning solutions Partner with Data Science, Analytics, and DevOps teams to align architecture with business needs. What We’re Seeking 7+ years of hands-on experience as a Data Engineer working on production-grade systems. Proficiency in programming languages such as Python or Scala for data processing. 
Strong SQL skills, including complex joins, window functions, and query optimization techniques. Experience with cloud platforms such as AWS, GCP, or Azure, and relevant services (e.g., S3, Glue, BigQuery, Azure Data Lake). Familiarity with data orchestration tools like Airflow, Dagster, or Prefect. Hands-on experience with data warehousing technologies like Redshift, Snowflake, BigQuery, or Delta Lake. Understanding of stream processing frameworks such as Apache Kafka, Kinesis, or Flink is a plus. Knowledge of data modeling concepts (e.g., star schema, normalization, denormalization). Comfortable working in version-controlled environments using Git and managing workflows with GitHub Actions or similar tools. Strong analytical and problem-solving skills, with the ability to debug and resolve pipeline and performance issues. Excellent written and verbal communication skills, with an ability to collaborate across engineering, analytics, and business teams. Demonstrated technical curiosity and passion for learning, with the ability to quickly adapt to new technologies, development platforms, and programming languages as needed. Bachelor’s in computer science or related field Exposure to MLOps pipelines MLflow, Kubeflow, Feature Stores is a plus but not mandatory Engineers with certifications will be preferred Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others. The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences. Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so. An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie. Show more Show less
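The listing above centers on batch ETL with Spark into a data lake or warehouse. A minimal, hedged PySpark sketch of that shape follows; the S3 paths, column names, and partition key are illustrative assumptions, not details of FICO's platform.

```python
# Minimal PySpark batch ETL sketch: read raw CSV, clean and type the data,
# write partitioned Parquet. Paths and column names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                        # basic de-duplication
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize types
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())                 # drop unusable rows
       .withColumn("order_date", F.to_date("order_ts"))     # partition key
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/orders/"))

spark.stop()
```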

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Data Engineer About US FICO, originally known as Fair Isaac Corporation, is a leading analytics and decision management company that empowers businesses and individuals around the world with data-driven insights. Known for pioneering the FICO® Score, a standard in consumer credit risk assessment, FICO combines advanced analytics, machine learning, and sophisticated algorithms to drive smarter, faster decisions across industries. From financial services to retail, insurance, and healthcare, FICO's innovative solutions help organizations make precise decisions, reduce risk, and enhance customer experiences. With a strong commitment to ethical use of AI and data, FICO is dedicated to improving financial access and inclusivity, fostering trust, and driving growth for a digitally evolving world. The Opportunity “As a Data Engineer on our newly formed Generative AI team, you will work at the frontier of language model applications, developing novel solutions for various areas of the FICO platform to include fraud investigation, decision automation, process flow automation, and optimization. You will play a critical role in the implementation of Data Warehousing and Data Lake solutions. You will have the opportunity to make a meaningful impact on FICO’s platform by infusing it with next-generation AI capabilities. You’ll work with a dedicated team, leveraging your skills in the data engineering area to build solutions and drive innovation forward. ”. What You’ll Contribute Perform hands-on analysis, technical design, solution architecture, prototyping, proofs-of-concept, development, unit and integration testing, debugging, documentation, deployment/migration, updates, maintenance, and support on Data Platform technologies. Design, develop, and maintain robust, scalable data pipelines for batch and real-time processing using modern tools like Apache Spark, Kafka, Airflow, or similar. Build efficient ETL/ELT workflows to ingest, clean, and transform structured and unstructured data from various sources into a well-organized data lake or warehouse. Manage and optimize cloud-based data infrastructure on platforms such as AWS (e.g., S3, Glue, Redshift, RDS) or Snowflake. Collaborate with cross-functional teams to understand data needs and deliver reliable datasets that support analytics, reporting, and machine learning use cases. Implement and monitor data quality, validation, and profiling processes to ensure the accuracy and reliability of downstream data. Design and enforce data models, schemas, and partitioning strategies that support performance and cost-efficiency. Develop and maintain data catalogs and documentation, ensuring data assets are discoverable and governed. Support DevOps/DataOps practices by automating deployments, tests, and monitoring for data pipelines using CI/CD tools. Proactively identify data-related issues and drive continuous improvements in pipeline reliability and scalability. Contribute to data security, privacy, and compliance efforts, implementing role-based access controls and encryption best practices. Design scalable architectures that support FICO’s analytics and decisioning solutions Partner with Data Science, Analytics, and DevOps teams to align architecture with business needs. What We’re Seeking 7+ years of hands-on experience as a Data Engineer working on production-grade systems. Proficiency in programming languages such as Python or Scala for data processing. Strong SQL skills, including complex joins, window functions, and query optimization techniques. 
Experience with cloud platforms such as AWS, GCP, or Azure, and relevant services (e.g., S3, Glue, BigQuery, Azure Data Lake). Familiarity with data orchestration tools like Airflow, Dagster, or Prefect. Hands-on experience with data warehousing technologies like Redshift, Snowflake, BigQuery, or Delta Lake. Understanding of stream processing frameworks such as Apache Kafka, Kinesis, or Flink is a plus. Knowledge of data modeling concepts (e.g., star schema, normalization, denormalization). Comfortable working in version-controlled environments using Git and managing workflows with GitHub Actions or similar tools. Strong analytical and problem-solving skills, with the ability to debug and resolve pipeline and performance issues. Excellent written and verbal communication skills, with an ability to collaborate across engineering, analytics, and business teams. Demonstrated technical curiosity and passion for learning, with the ability to quickly adapt to new technologies, development platforms, and programming languages as needed. Bachelor’s in computer science or related field Exposure to MLOps pipelines MLflow, Kubeflow, Feature Stores is a plus but not mandatory Engineers with certifications will be preferred Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others. The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences. Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so. An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie. Show more Show less
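This posting also lists orchestration tools such as Airflow. Below is a hedged sketch of a daily Airflow 2.x DAG wiring an extract task to a load task; the DAG id, schedule, and task bodies are placeholders, and on older Airflow releases the `schedule` argument is `schedule_interval`.

```python
# Hedged sketch of a daily Airflow DAG: extract -> load.
# Connection IDs, table names, and the callables are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the day's records from a source API or database.
    print("extracting for", context["ds"])


def load(**context):
    # Placeholder: write transformed records into the warehouse.
    print("loading for", context["ds"])


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```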

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

🚀 Job Title: Engineering Lead Company: Darwix AI Location: Gurgaon (On-site) Type: Full-Time Experience: 5–10 Years Compensation: Competitive + Performance-based incentives + Meaningful ESOPs 🧠 About Darwix AI Darwix AI is one of India’s fastest-growing AI startups, building the future of enterprise revenue intelligence. We offer a GenAI-powered conversational intelligence and real-time agent assist suite that transforms how large sales teams interact, close deals, and scale operations. We’re already live with enterprise clients across India, the UAE, and Southeast Asia , and our platform enables multilingual speech-to-text, AI-driven nudges, and contextual conversation coaching—backed by our proprietary LLMs and cutting-edge voice infrastructure. With backing from top-tier VCs and over 30 angel investors, we’re now hiring an Engineering Lead who can architect, own, and scale the core engineering stack as we prepare for 10x growth. 🌟 Role Overview As the Engineering Lead at Darwix AI , you’ll take ownership of our platform architecture, product delivery, and engineering quality across the board. You’ll work closely with the founders, product managers, and the AI team to convert fast-moving product ideas into scalable features. You will: Lead backend and full-stack engineers across microservices, APIs, and real-time pipelines Architect scalable systems for AI/LLM deployments Drive code quality, maintainability, and engineering velocity This is a hands-on, player-coach role —perfect for someone who loves building but is also excited about mentoring and growing a technical team. 🎯 Key Responsibilities🛠️ Technical Leadership Own technical architecture across backend, frontend, and DevOps stacks Translate product roadmaps into high-performance, production-ready systems Drive high-quality code reviews, testing practices, and performance optimization Make critical system-level decisions around scalability, security, and reliability 🚀 Feature Delivery Work with the product and AI teams to build new features around speech recognition, diarization, real-time coaching, and analytics dashboards Build and maintain backend services for data ingestion, processing, and retrieval from Vector DBs, MySQL, and MongoDB Create clean, reusable APIs (REST & WebSocket) that power our web-based agent dashboards 🧱 System Architecture Refactor monoliths into microservice-based architecture Optimize real-time data pipelines with Redis, Kafka, and async queues Implement serverless modules using AWS Lambda, Docker containers, and CI/CD pipelines 🧑‍🏫 Mentorship & Team Building Lead a growing team of engineers—guide on architecture, code design, and performance tuning Foster a culture of ownership, documentation, and continuous learning Mentor junior developers, review PRs, and set up internal coding best practices 🔄 Collaboration Act as the key technical liaison between Product, Design, AI/ML, and DevOps teams Work directly with founders on roadmap planning, delivery tracking, and go-live readiness Contribute actively to investor tech discussions, client onboarding, and stakeholder calls ⚙️ Our Tech Stack Languages: Python (FastAPI, Django), PHP (legacy support), JavaScript, TypeScript Frontend: HTML, CSS, Bootstrap, Mustache templates; (React.js/Next.js optional) AI/ML Integration: LangChain, Whisper, RAG pipelines, Transformers, Deepgram, OpenAI APIs Databases: MySQL, PostgreSQL, MongoDB, Redis, Pinecone/FAISS (Vector DBs) Cloud & Infra: AWS EC2, S3, Lambda, CloudWatch, Docker, GitHub Actions, Nginx DevOps: Git, 
Docker, CI/CD pipelines, Jenkins/GitHub Actions, load testing Tools: Jira, Notion, Slack, Postman, Swagger 🧑‍💼 Who You Are 5–10 years of professional experience in backend/full-stack development Proven experience leading engineering projects or mentoring junior devs Comfortable working in high-growth B2B SaaS startups or product-first orgs Deep expertise in one or more backend frameworks (Django, FastAPI, Laravel, Flask) Experience working with AI products or integrating APIs from OpenAI, Deepgram, HuggingFace is a huge plus Strong understanding of system design, DB normalization, caching strategies, and latency optimization Bonus: exposure to working with voice pipelines (STT/ASR), NLP models, or real-time analytics 📌 Qualities We’re Looking For Builder-first mindset – you love launching features fast and scaling them well Execution speed – you move with urgency but don’t break things Hands-on leadership – you guide people by writing code, not just processes Problem-solver – when things break, you own the fix and the root cause Startup hunger – you thrive on chaos, ambiguity, and shipping weekly 🎁 What We Offer High Ownership : Directly shape the product and its architecture from the ground up Startup Velocity : Ship fast, learn fast, and push boundaries Founding Engineer Exposure : Work alongside IIT-IIM-BITS founders with full transparency Compensation : Competitive salary + meaningful equity + performance-based incentives Career Growth : Move into an EM/CTO-level role as the org scales Tech Leadership : Own features end-to-end—from spec to deployment 🧠 Final Note This is not just another engineering role. This is your chance to: Own the entire backend for a GenAI product serving global enterprise clients Lead technical decisions that define our future infrastructure Join the leadership team at a startup that’s shipping faster than anyone else in the category If you're ready to build a product with 10x potential, join a high-output team, and be the reason why the tech doesn’t break at scale , this role is for you. 📩 How to Apply Send your resume to people@darwix.ai with the subject line: “Application – Engineering Lead – [Your Name]” Attach: Your latest CV or LinkedIn profile GitHub/portfolio link (if available) A short note (3–5 lines) on why you're excited about Darwix AI and this role Show more Show less
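The stack above names FastAPI and REST APIs that power agent dashboards. As a hedged illustration only, here is a minimal FastAPI service with one create and one read endpoint; the route names, fields, and in-memory store are assumptions, not Darwix AI's actual API.

```python
# Minimal FastAPI sketch of a REST endpoint of the kind described above.
# Route names, fields, and the in-memory store are illustrative assumptions.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="call-insights-api")

class CallSummary(BaseModel):
    call_id: str
    agent_id: str
    sentiment: float        # e.g. -1.0 .. 1.0
    nudges: list[str] = []

_store: dict[str, CallSummary] = {}   # stand-in for MySQL/MongoDB

@app.post("/calls", status_code=201)
def create_call(summary: CallSummary) -> CallSummary:
    _store[summary.call_id] = summary
    return summary

@app.get("/calls/{call_id}")
def get_call(call_id: str) -> CallSummary:
    if call_id not in _store:
        raise HTTPException(status_code=404, detail="call not found")
    return _store[call_id]

# Run locally with:  uvicorn app:app --reload   (assuming this file is app.py)
```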

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

This role will be a part of our growing Platform Solutions team. The primary responsibility would involve working on Oxane's proprietary platforms in the Private Credit+ space. The incumbent is expected to take the lead on assigned client projects, working directly with top investment banks, asset management firms, and investment firms. This role is at the intersection of finance & technology (FinTech) and provides a steep learning curve in the evolving landscape of Private Credit+. The candidate will gain exposure to diverse asset classes and master the complexities of deal structures. Work directly with clients on various investment transactions for performing asset-backed loan portfolios, real estate, and specialty financing. Analyse and comprehend modelling inputs from deal documents such as information memorandums, servicer reports, facility agreements, and other legal reports, and help onboard the deals on Oxane's tech platform. Create comprehensive working files in Excel, then transform and load the data into the database via a standard ETL process. Use SQL to query data and create views to support report implementation. Configure report components using facts, dimensions, and configurator parameters in JavaScript. Assist the client with their specific needs for ad-hoc analytics, portfolio monitoring, surveillance, and reporting. Act as a business analyst for platform features, client changes, and issues, working closely with the development team. Act as an extended client team for asset management reporting, financial due diligence, post-deal analysis, and business planning.
Requirements: B.E./B.Tech in Computer Science or IT along with an MBA/PGDM in Finance is mandatory. Strong acumen for engineering technology-driven solutions will be preferred. Good understanding of financial and lending/debt concepts and advanced Excel functions, with SQL (must have, advanced level), along with an understanding of data architecture, storage, and normalization. Prior experience with similar engagements will be given preference. Good attention to detail and a logical thought process to analyse large amounts of qualitative and quantitative data. Strong written and verbal communication skills. Self-starter personality and ability to work well under pressure in a fast-paced environment.
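The responsibilities above describe loading Excel working files into a database via ETL and then exposing SQL views for reporting. A small, hedged Python sketch of that flow follows; the workbook name, sheet, columns, and the SQLite target are illustrative assumptions rather than Oxane's actual platform.

```python
# Hedged sketch of the Excel -> database -> SQL view flow described above.
# File name, sheet name, columns, and the SQLite target are assumptions;
# a production setup would point at the platform's actual database.
import sqlite3
import pandas as pd

loans = pd.read_excel("deal_workbook.xlsx", sheet_name="loan_tape")   # hypothetical workbook
loans.columns = [c.strip().lower().replace(" ", "_") for c in loans.columns]

conn = sqlite3.connect("portfolio.db")
loans.to_sql("loan_tape", conn, if_exists="replace", index=False)

# A simple reporting view aggregating balances by borrower.
conn.executescript("""
DROP VIEW IF EXISTS v_borrower_exposure;
CREATE VIEW v_borrower_exposure AS
SELECT borrower_id,
       COUNT(*)              AS loan_count,
       SUM(current_balance)  AS total_exposure
FROM loan_tape
GROUP BY borrower_id;
""")

print(pd.read_sql("SELECT * FROM v_borrower_exposure ORDER BY total_exposure DESC", conn))
conn.close()
```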

Posted 1 week ago

Apply

0 years

3 - 9 Lacs

Hyderābād

On-site

GlassDoor logo

- Experience programming in Java, C++, Python or a related language
- Experience with SQL and an RDBMS (e.g., Oracle) or a Data Warehouse
Customer addresses, geospatial information, and the road network play a crucial role in Amazon Logistics' Delivery Planning systems. We own exciting science problems in the areas of Address Normalization, Geocode learning, Maps learning, and Time estimations (route-time, delivery-time, and transit-time predictions), which are key inputs in delivery planning. As part of the Geospatial science team within Last Mile, you will partner closely with other scientists and engineers in a collegial environment to develop enterprise ML solutions with a clear path to business impact. The setting also gives you an opportunity to think about a complex, large-scale problem over multiple years and build increasingly sophisticated solutions year over year. In the process there will be opportunities to innovate, explore SOTA, and publish the research at internal and external ML conferences. Successful candidates will have deep knowledge of competing machine learning methods for large-scale predictive modelling, natural language processing, and semi-supervised & graph-based learning. We also look for experience graduating prototype models to production and the communication skills to explain complex technical approaches to stakeholders of varied technical expertise.
Key job responsibilities
As an Applied Scientist I, your responsibility will be to deliver on a well-defined but complex business problem, explore SOTA technologies including GenAI, and customize large models as suitable for the application. Your job will be to work on an end-to-end business problem from design to experimentation and implementation. There is also an opportunity to work on open-ended ML directions within the space and publish the work at prestigious ML conferences.
About the team
The LMAI team owns the worldwide charter for address and location learning solutions, which are crucial for efficient Last Mile delivery planning, and also owns problems in the space of maps learning and travel-time estimation.
Experience implementing algorithms using both toolkits and self-developed code. Publications at top-tier peer-reviewed conferences or journals.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
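Address normalization, one of the science problems named above, is often bootstrapped with a simple rule-based canonicalization step before any learned matching. The sketch below is purely illustrative; the abbreviation map and example address are assumptions, not Amazon's production logic.

```python
# Illustrative baseline for address normalization: canonicalize case, punctuation,
# and common abbreviations before any ML-based matching. The abbreviation map
# is an assumption for demonstration only.
import re

ABBREVIATIONS = {
    "st": "street", "rd": "road", "ave": "avenue",
    "apt": "apartment", "blk": "block", "nr": "near",
}

def normalize_address(raw: str) -> str:
    text = raw.lower()
    text = re.sub(r"[^\w\s]", " ", text)        # drop punctuation
    tokens = [ABBREVIATIONS.get(tok, tok) for tok in text.split()]
    return " ".join(tokens)

if __name__ == "__main__":
    print(normalize_address("Flat 4B, 12 MG Rd., Nr Metro Stn"))
    # -> "flat 4b 12 mg road near metro stn"
```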

Posted 1 week ago

Apply

8.0 years

0 Lacs

Orissa

Remote

GlassDoor logo

No. of Positions: 1 Position: Data Integration Technical Lead Location: Hybrid or Remote Total Years of Experience: 8+ years Experience: 8+ years of experience in data integration, cloud technologies, and API-based integrations. At least 3 years in a technical leadership role overseeing integration projects. Proven experience in integrating cloud-based systems, on-premise systems, databases, and legacy platforms. Informatica Cloud (IICS) or Mulesoft certifications preferable. Technical Expertise: Expertise in designing and implementing integration workflows using IICS, Mulesoft, or other integration platforms. Proficient in integrating cloud and on-premise systems, databases, and legacy platforms using API integrations, REST/SOAP, and middleware tools. Strong knowledge of Salesforce CRM, Microsoft Dynamics CRM, and other enterprise systems for integration. Experience in creating scalable, secure, and high-performance data integration solutions. Deep understanding of data modelling, transformation, and normalization techniques for integrations. Strong experience in troubleshooting and resolving integration issues. Key Responsibilities: Work with architects and client stakeholders to design data integration solutions that align with business needs and industry best practices. Lead the design and implementation of data integration pipelines, frameworks, and cloud integrations. Lead and mentor a team of data integration professionals, conducting code reviews and ensuring high-quality deliverables. Design and implement integrations with external systems using APIs, middleware, and cloud services. Develop data transformation workflows and custom scripts to integrate data between systems. Stay updated on new integration technologies and recommend improvements as necessary. Excellent verbal and written communication skills to engage with both technical and non-technical stakeholders. Proven ability to explain complex technical concepts clearly and concisely. Don’t see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
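The role above revolves around API-based integrations between cloud and on-premise systems. As a hedged, minimal illustration of the extract-map-load pattern such platforms implement, the Python sketch below pulls records from a source REST API, maps fields, and posts them to a target; the URLs, auth, and field names are hypothetical, and tools like IICS or Mulesoft would model the same flow declaratively.

```python
# Hedged sketch of a point-to-point REST integration: extract from a source
# API, map fields onto the target schema, push to a target API.
# URLs and field names are hypothetical placeholders.
import requests

SOURCE_URL = "https://source.example.com/api/v1/accounts"     # hypothetical
TARGET_URL = "https://target.example.com/api/v1/customers"    # hypothetical

def transform(record: dict) -> dict:
    """Map source fields onto the target system's schema."""
    return {
        "externalId": record["id"],
        "name": record["account_name"].strip().title(),
        "country": record.get("country_code", "IN"),
    }

def sync(batch_size: int = 100) -> None:
    resp = requests.get(SOURCE_URL, params={"limit": batch_size}, timeout=30)
    resp.raise_for_status()
    for record in resp.json().get("items", []):
        out = requests.post(TARGET_URL, json=transform(record), timeout=30)
        out.raise_for_status()

if __name__ == "__main__":
    sync()
```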

Posted 1 week ago

Apply

0 years

4 - 9 Lacs

Noida

On-site

GlassDoor logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant – Java Developer In this role, you will be responsible for Developing Microsoft Access Databases, including tables, queries, forms and reports, using standard IT processes, with data normalization and referential integrity. Responsibilities Responsible to collaborate with businesspeople to have a real time understanding of business problems and expected to focus on agile methodology of development. Struts 6 (Good to have worked on Struts 6.0 version but even if worked on Struts 2.0 and knowledge of Struts 6 should work. Struts is Mandatory). Deliver high quality change within the deadlines. In this role, you will be responsible for coding, testing and delivering high quality deliverables along with the reviews of the team members. Should be willing to learn new technologies. Understand and effectively communicate interactions between the front end and back-end systems. Qualifications we seek in you! Minimum Qualifications BE /B.Tech/M.Tech/MCA Preferred qualifications Java (1.8 or higher), Spring Boot framework (Core, AOP, Batch, JMS), Web Services (SOAP/REST), Oracle PL/SQL, Microservices, SQL Experienced working on Java Script (ExtJs framework), J2EE, Spring Boot, REST, JSON, Micro Services. Experience in TCF Framework (This is Homegrown Java framework from CVS so the Resources may not have experience in this. But experience in any similar MVC Framework like Struts, JSF other MVC framework should be good) Experience with IBM WebSphere server Experience with version control tools like Dimensions. Experience with HTML, XML & XSLT. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com . Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Lead Consultant Primary Location India-Noida Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jun 17, 2025, 5:42:29 AM Unposting Date Ongoing Master Skills List Consulting Job Category Full Time

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About Position: Are you a passionate backend engineer looking to make a significant impact? Join our cross-functional, distributed team responsible for building and maintaining the core backend functionalities that power our customers. You’ll be instrumental in developing scalable and robust solutions, directly impacting the efficiency and reliability of our platform. This role offers a unique opportunity to work on cutting-edge technologies and contribute to a critical part of our business, all within a supportive and collaborative environment.
Role: Junior .Net Engineer
Location: Hyderabad
Experience: 3 to 5 years
Job Type: Full Time Employment
What You'll Do: Implement features/modules as per the design and requirements shared by the Architect, Leads, and BA/PM, using coding best practices. Develop and maintain microservices using C# and .NET Core, and perform unit testing as per the code coverage benchmark. Support testing and deployment activities. Microservices: containerized microservices (Docker/Kubernetes/Ansible, etc.). Create and maintain RESTful APIs to facilitate communication between microservices and other components. Analyze and fix defects to develop high-standard, stable code as per design specifications. Utilize version control systems (e.g., Git) to manage source code.
Requirement Analysis: Understand and analyze functional/non-functional requirements and seek clarifications from the Architect/Leads for a better understanding of requirements. Participate in estimation activity for given requirements.
Coding and Development: Write clean and maintainable code using software development best practices. Make use of different code analyzer tools. Follow a TDD approach for any implementation. Perform coding and unit testing as per design.
Problem Solving/Defect Fixing: Investigate and debug any defect raised. Find root causes, find solutions, explore alternate approaches, and then fix defects with appropriate solutions. Fix defects identified during functional/non-functional testing and during UAT within agreed timelines. Perform estimation for defect fixes for self and the team.
Deployment Support: Provide prompt response during production support.
Expertise You'll Bring: Language – C#; Visual Studio Professional; Visual Studio Code; .NET Core 3.1 onwards; Entity Framework with code-first approach; Dependency Injection; Error Handling and Logging; SDLC; Object-Oriented Programming (OOP) Principles; SOLID Principles; Clean Coding Principles; Design patterns; API: REST API with token-based Authentication & Authorization, Postman, Swagger; Database: Relational databases (SQL Server/MySQL/PostgreSQL), Stored Procedures and Functions, Relationships, Data Normalization & Denormalization, Indexes and Performance Optimization techniques.
Preferred Skills: Development exposure to Cloud: Azure/GCP/AWS. Code Quality Tool – Sonar. Exposure to the CI/CD process and tools like Jenkins. Good understanding of Docker and Kubernetes. Exposure to Agile software development methodologies and ceremonies.
Benefits: Competitive salary and benefits package. Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.
Inclusive Environment: Persistent Ltd.
is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a value-driven and people-centric work environment that enables our employees to: Accelerate growth, both professionally and personally Impact the world in powerful, positive ways, using the latest technologies Enjoy collaborative innovation, with diversity and work-life wellbeing at the core Unlock global opportunities to work and learn with the industry’s best Let’s unleash your full potential at Persistent “Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.” Show more Show less
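The expertise list above includes relationships, data normalization, and indexing on relational databases. The short sketch below illustrates those ideas generically; it uses Python with SQLite only so it runs anywhere, whereas the role itself targets SQL Server/MySQL/PostgreSQL with .NET. Table and column names are assumptions.

```python
# Self-contained illustration of normalization, referential integrity, and indexing.
# SQLite is used purely so the example runs anywhere; names are assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # enforce referential integrity

conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);

-- Orders reference customers instead of repeating customer details (3NF-style).
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);

-- Index the foreign key that common lookups filter on.
CREATE INDEX idx_orders_customer ON orders(customer_id);
""")

conn.execute("INSERT INTO customers VALUES (1, 'Asha')")
conn.execute("INSERT INTO orders VALUES (10, 1, 499.0)")

rows = conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.name
""").fetchall()
print(rows)   # [('Asha', 499.0)]
conn.close()
```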

Posted 1 week ago

Apply

15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.
Your Role And Responsibilities
The candidate is responsible for: DB2 installation and configuration in the following environments: on-premises, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF. Migration of other databases to Db2 (e.g., Teradata / Snowflake / SAP / Cloudera to Db2 migration). Create high-level and detailed designs and maintain product roadmaps that cover both modernization and leveraging cloud solutions. Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML. Perform health checks of the databases, make recommendations, and deliver tuning at the database and system level. Deploy DB2 databases as containers within Red Hat OpenShift clusters. Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability. Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with overall enterprise data strategy and business objectives. Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources). Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse. Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams. Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms. Participate in the development of architecture governance processes and promote best practices across the organization. Communicate complex technical concepts to both technical and non-technical stakeholders.
Required Technical And Professional Expertise
15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms. Strong proficiency in DB2, SQL, and Python. Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark). Database migration project experience from one database to another (target database Db2). Experience in deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
Excellent communication, collaboration, problem-solving, and leadership skills.
Preferred Technical And Professional Experience
Experience with machine learning environments and LLMs. Certification in IBM watsonx.data or related IBM data and AI technologies. Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake). Exposure to implementing, or an understanding of, DB replication processes. Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Experience with NoSQL databases (e.g., MongoDB, Cassandra). Experience in data modeling tools (e.g., ER/Studio, ERwin). Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA). Strong soft skills.
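Given the DB2 and Python emphasis above, a hedged sketch of a basic connectivity and health check using the ibm_db driver is shown below; the connection string, credentials, and the choice of catalog queries are placeholder assumptions, not part of the posting.

```python
# Hedged sketch of a basic DB2 connectivity/health check from Python via ibm_db.
# Hostname, credentials, and database name are placeholders.
import ibm_db

conn_str = (
    "DATABASE=SAMPLEDB;HOSTNAME=db2.example.com;PORT=50000;"
    "PROTOCOL=TCPIP;UID=db2inst1;PWD=********;"
)

conn = ibm_db.connect(conn_str, "", "")

# Trivial liveness check against the dummy catalog table.
stmt = ibm_db.exec_immediate(conn, "SELECT 1 FROM SYSIBM.SYSDUMMY1")
print("alive:", bool(ibm_db.fetch_tuple(stmt)))

# Rough inventory of user tables, e.g. to compare before/after a migration.
stmt = ibm_db.exec_immediate(
    conn,
    "SELECT COUNT(*) FROM SYSCAT.TABLES WHERE TYPE = 'T' AND TABSCHEMA NOT LIKE 'SYS%'",
)
print("user tables:", ibm_db.fetch_tuple(stmt)[0])

ibm_db.close(conn)
```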

Posted 1 week ago

Apply


0.0 - 1.0 years

0 Lacs

Thergaon, Pune, Maharashtra

On-site

Indeed logo

PHP Developer
Company Name: SiGa Systems Pvt. Ltd.
SiGa Systems is the fastest-growing IT software development company that enables successful technology-based digital transformation initiatives for enterprises, to create a business that is connected, open, intelligent, and scalable. We are an offshore web development company with clients all across the globe. Since our inception in the year 2016, we have provided web and application development services for varied business domains.
Job Description in Brief: We are looking for candidates with 0 to 6 months of experience, proficient in PHP/WordPress/Laravel/CodeIgniter, to develop websites and web applications in core PHP. The desired candidate would be involved in the full software/website development life cycle, starting from requirement analysis through testing. The candidate should be able to work in a team or handle projects independently.
Company Address: Office No. 101, Metropole, Near BRT Bus Stop, Dange Chowk, Thergaon, Pune, Maharashtra – 411 033
Company Website: https://sigasystems.com/
Qualification: BE/B.Tech/M.Tech/MCA/MCS/MCM
Work Experience: 0 to 6 months
Annual CTC Range: As per company norms & market standard
Technical Key skills: · Expertise in MVC, PHP frameworks (Laravel, CodeIgniter), WCF, Web API, and Entity Framework. · Proficient in jQuery, AJAX, and Bootstrap. · Good knowledge of HTML5, CSS3, JavaScript, SQL Server, WordPress, and MySQL. · Hands-on core PHP along with experience in AJAX, jQuery, Bootstrap, and APIs. · Experience with project management systems like Jira, Trello, Click, BugHerd, Basecamp, etc. · High proficiency with Git. · Experience with RESTful APIs. · Able to work with a team. · Must have good communication skills.
Desired Competencies: Bachelor’s degree in Computer Science or a related field. Good expertise in core PHP along with working exposure to HTML, HTML5, JavaScript, CSS, Ajax, jQuery, Bootstrap, and APIs. PHP scripting with MVC architecture frameworks like CodeIgniter and Laravel. Knowledge of Linux, web application development, and quality software development. Optimizing MySQL queries and databases to improve performance. Excellent conceptual, analytical, and programming skills. Knowledge of Object-Oriented Programming (OOPS) concepts with Smarty and AJAX. Should be well-versed with OS: Linux/UNIX, Windows (LAMP and WAMP). Knowledge of Relational Database Management Systems, database design, and normalization. Preference will be given if you hold working knowledge of open-source systems like WordPress, Shopify, and other open-source e-commerce systems. Good communication skills (spoken/written) will be a plus. Must be technically and logically strong.
Industry: IT-Software / Software Services
Functional Area: IT Software – Design & Developer
Role Category: Developer
Role: PHP Developer/Laravel
Employment Type: Permanent Job, Full Time
Roles & Responsibilities: Should be responsible for developing websites and web-based applications using open-source systems. Monitor, manage, and maintain the server environments where PHP Laravel applications are hosted, ensuring optimal performance, security, and availability. Integrate third-party APIs and services as needed. Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment. Actively participate in quality assurance activities including design and code reviews, unit testing, defect fixes, and operational readiness.
Diagnose and resolve server-related issues, including those impacting the performance of Laravel applications. This includes debugging server errors, analyzing logs, and identifying root causes of downtime or slow response times. Manage development projects from inception to completion autonomously and independently Provide administrative support, tools, and documentation for specific development projects. Design applications and database structures for performance and scalability. Deliver accurate project requirements and timeline estimates, providing regular feedback and consistently meeting project deadlines. Designing and implementing web-based back-end components that are high-performing and scalable. Participating in and improving development processes and tools for other development teams. Contribute ideas and efforts towards the project and work as part of a team to find solutions to various problems. If this opportunity feels like the perfect match for you, don’t wait—apply now! Reach out to us via email or WhatsApp using the details below. Let’s connect and create something extraordinary together! Contact Person Name: HR Riddhi Email: hr@sigasystems.com WhatsApp: +91 8873511171 Job Type: Full-time Pay: ₹12,500.00 - ₹14,000.00 per month Benefits: Paid sick time Paid time off Schedule: Rotational shift Education: Bachelor's (Preferred) Experience: total work: 1 year (Preferred) PHP/LARAVEL: 1 year (Preferred) Language: English (Preferred) Expected Start Date: 16/07/2025

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Linkedin logo

Role : Database Engineer Location : Remote Notice Period : 30 Days Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Show more Show less
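The skills above include relational database administration (PostgreSQL/MySQL), Python with SQLAlchemy, and real-time monitoring and health checks. A hedged sketch of one such health check, listing long-running queries on a PostgreSQL instance, follows; the connection URL and the five-minute threshold are arbitrary placeholder assumptions.

```python
# Hedged sketch of a simple database health check: list long-running queries
# on a PostgreSQL instance. The connection URL is a placeholder.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://monitor:********@db.example.com:5432/appdb")

LONG_RUNNING_SQL = text("""
    SELECT pid, usename, state, now() - query_start AS runtime, query
    FROM pg_stat_activity
    WHERE state <> 'idle'
      AND now() - query_start > interval '5 minutes'
    ORDER BY runtime DESC
""")

with engine.connect() as conn:
    for row in conn.execute(LONG_RUNNING_SQL):
        print(f"pid={row.pid} user={row.usename} runtime={row.runtime} query={row.query[:80]}")
```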

Posted 1 week ago

Apply