
733 Normalization Jobs - Page 8

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Orissa

Remote

Source: Glassdoor

No. of Positions: 1
Position: Data Integration Technical Lead
Location: Hybrid or Remote
Total Years of Experience: 8+ years

Experience:
- 8+ years of experience in data integration, cloud technologies, and API-based integrations.
- At least 3 years in a technical leadership role overseeing integration projects.
- Proven experience in integrating cloud-based systems, on-premise systems, databases, and legacy platforms.
- Informatica Cloud (IICS) or MuleSoft certifications preferable.

Technical Expertise:
- Expertise in designing and implementing integration workflows using IICS, MuleSoft, or other integration platforms.
- Proficient in integrating cloud and on-premise systems, databases, and legacy platforms using API integrations, REST/SOAP, and middleware tools.
- Strong knowledge of Salesforce CRM, Microsoft Dynamics CRM, and other enterprise systems for integration.
- Experience in creating scalable, secure, and high-performance data integration solutions.
- Deep understanding of data modelling, transformation, and normalization techniques for integrations.
- Strong experience in troubleshooting and resolving integration issues.

Key Responsibilities:
- Work with architects and client stakeholders to design data integration solutions that align with business needs and industry best practices.
- Lead the design and implementation of data integration pipelines, frameworks, and cloud integrations.
- Lead and mentor a team of data integration professionals, conducting code reviews and ensuring high-quality deliverables.
- Design and implement integrations with external systems using APIs, middleware, and cloud services.
- Develop data transformation workflows and custom scripts to integrate data between systems.
- Stay updated on new integration technologies and recommend improvements as necessary.
- Excellent verbal and written communication skills to engage with both technical and non-technical stakeholders.
- Proven ability to explain complex technical concepts clearly and concisely.

Don’t see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
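To make the "data transformation and normalization" responsibility above concrete, here is a minimal sketch in Python: it pulls records from a REST endpoint and maps them onto a common, normalized schema. The endpoint URL, field names, and mapping are assumptions for illustration, not details from the posting.

```python
import requests

# Hypothetical source endpoint; a real integration would use the
# target system's documented API plus proper authentication.
SOURCE_URL = "https://api.example-crm.local/v1/accounts"

# Map source field names onto a common, normalized schema (assumed names).
FIELD_MAP = {"AccountName": "name", "Phone_Number": "phone", "Country__c": "country"}

def fetch_accounts(url: str) -> list[dict]:
    """Pull raw records from the (assumed) REST endpoint."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json().get("records", [])

def normalize(record: dict) -> dict:
    """Rename fields, trim whitespace, and standardize casing."""
    out = {target: str(record.get(source, "")).strip()
           for source, target in FIELD_MAP.items()}
    out["country"] = out["country"].upper()
    return out

if __name__ == "__main__":
    normalized = [normalize(r) for r in fetch_accounts(SOURCE_URL)]
    print(f"normalized {len(normalized)} records")
```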

Posted 1 week ago

Apply

0 years

4 - 9 Lacs

Noida

On-site

Source: Glassdoor

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant – Java Developer. In this role, you will be responsible for developing Microsoft Access databases, including tables, queries, forms and reports, using standard IT processes, with data normalization and referential integrity.

Responsibilities:
- Collaborate with businesspeople to gain a real-time understanding of business problems; follow an agile methodology of development.
- Struts 6 (good to have worked on Struts 6.0, but Struts 2.0 experience with knowledge of Struts 6 should work; Struts is mandatory).
- Deliver high-quality change within the deadlines.
- Responsible for coding, testing and delivering high-quality deliverables, along with reviews of team members' work.
- Should be willing to learn new technologies.
- Understand and effectively communicate interactions between the front-end and back-end systems.

Qualifications we seek in you!
Minimum Qualifications: BE/B.Tech/M.Tech/MCA

Preferred qualifications:
- Java (1.8 or higher), Spring Boot framework (Core, AOP, Batch, JMS), Web Services (SOAP/REST), Oracle PL/SQL, Microservices, SQL.
- Experienced working on JavaScript (ExtJS framework), J2EE, Spring Boot, REST, JSON, Microservices.
- Experience in the TCF Framework (a homegrown Java framework from CVS, so candidates may not have experience in it; experience in any similar MVC framework such as Struts or JSF should be good).
- Experience with IBM WebSphere server.
- Experience with version control tools like Dimensions.
- Experience with HTML, XML & XSLT.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant
Primary Location: India-Noida
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 17, 2025, 5:42:29 AM
Unposting Date: Ongoing
Master Skills List: Consulting
Job Category: Full Time
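The "data normalization and referential integrity" requirement above can be illustrated with a small, self-contained sketch using SQLite from Python's standard library; the customer/order schema is invented for the example and is not from the posting.

```python
import sqlite3

# Illustrative only: a customer/order design split into two tables
# (normalization) with a foreign key enforcing referential integrity.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this enabled explicitly
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date  TEXT NOT NULL,
    amount      REAL NOT NULL
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme Ltd', 'Noida')")
conn.execute("INSERT INTO orders VALUES (100, 1, '2025-06-17', 2500.0)")

# This insert violates referential integrity (no customer 99) and is rejected.
try:
    conn.execute("INSERT INTO orders VALUES (101, 99, '2025-06-18', 100.0)")
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```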

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Position:
Are you a passionate backend engineer looking to make a significant impact? Join our cross-functional, distributed team responsible for building and maintaining the core backend functionalities that power our customers. You'll be instrumental in developing scalable and robust solutions, directly impacting the efficiency and reliability of our platform. This role offers a unique opportunity to work on cutting-edge technologies and contribute to a critical part of our business, all within a supportive and collaborative environment.

Role: Junior .NET Engineer
Location: Hyderabad
Experience: 3 to 5 years
Job Type: Full Time Employment

What You'll Do:
- Implement features/modules as per design and requirements shared by Architects, Leads, and BA/PM, using coding best practices.
- Develop and maintain microservices using C# and .NET Core; perform unit testing as per the code coverage benchmark; support testing and deployment activities.
- Microservices: containerized microservices (Docker/Kubernetes/Ansible etc.).
- Create and maintain RESTful APIs to facilitate communication between microservices and other components.
- Analyze and fix defects to develop stable, high-standard code as per design specifications.
- Utilize version control systems (e.g., Git) to manage source code.
- Requirement Analysis: Understand and analyze functional/non-functional requirements and seek clarifications from Architects/Leads for a better understanding of requirements. Participate in estimation activity for given requirements.
- Coding and Development: Write clean and maintainable code using software development best practices. Make use of different code analyzer tools. Follow a TDD approach for any implementation. Perform coding and unit testing as per design.
- Problem Solving / Defect Fixing: Investigate and debug any defect raised; find root causes and solutions, explore alternate approaches, and fix defects with appropriate solutions. Fix defects identified during functional/non-functional testing and UAT within agreed timelines. Perform estimation for defect fixes for self and the team.
- Deployment Support: Provide prompt response during production support.

Expertise You'll Bring:
- Language: C#, Visual Studio Professional, Visual Studio Code.
- .NET Core 3.1 onwards, Entity Framework with code-first approach, Dependency Injection, Error Handling and Logging.
- SDLC, Object-Oriented Programming (OOP) Principles, SOLID Principles, Clean Coding Principles, Design patterns.
- API: REST API with token-based Authentication & Authorization, Postman, Swagger.
- Database: relational databases (SQL Server/MySQL/PostgreSQL), stored procedures and functions, relationships, data normalization & denormalization, indexes and performance optimization techniques.

Preferred Skills:
- Development exposure to cloud: Azure/GCP/AWS.
- Code quality tool: Sonar.
- Exposure to the CI/CD process and tools like Jenkins.
- Good understanding of Docker and Kubernetes.
- Exposure to Agile software development methodologies and ceremonies.

Benefits:
- Competitive salary and benefits package.
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications.
- Opportunity to work with cutting-edge technologies.
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
- Annual health check-ups.
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment:
Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a value-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally.
- Impact the world in powerful, positive ways, using the latest technologies.
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core.
- Unlock global opportunities to work and learn with the industry's best.

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."

Posted 1 week ago

Apply

15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction
Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- DB2 installation and configuration on the following environments: on-premises, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF.
- Migration of other databases to Db2 (e.g. Teradata, Snowflake, SAP or Cloudera to Db2 migration).
- Creating high-level and detailed-level designs, and maintaining product roadmaps covering both modernization and leveraging cloud solutions.
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML.
- Performing health checks of the databases, making recommendations, and delivering tuning at the database and system level.
- Deploying DB2 databases as containers within Red Hat OpenShift clusters; configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
- Leading the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives.
- Defining and optimizing the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources).
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse.
- Acting as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams.
- Mentoring junior architects and engineers, fostering their growth and knowledge in modern data platforms.
- Participating in the development of architecture governance processes and promoting best practices across the organization.
- Communicating complex technical concepts to both technical and non-technical stakeholders.

Required Technical And Professional Expertise
- 15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms.
- Strong proficiency in DB2, SQL and Python.
- Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark).
- Database migration project experience from one database to another (target database Db2).
- Experience deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
- Excellent communication, collaboration, problem-solving, and leadership skills.

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs.
- Certification in IBM watsonx.data or related IBM data and AI technologies.
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake).
- Exposure to, or understanding of, DB replication processes.
- Experience integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures).
- Experience with NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data modeling tools (e.g., ER/Studio, ERwin).
- Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA).
- Soft skills.

Posted 1 week ago

Apply


0.0 - 1.0 years

0 Lacs

Thergaon, Pune, Maharashtra

On-site

Source: Indeed

PHP Developer
Company Name: SiGa Systems Pvt. Ltd.
SiGa Systems is the fastest-growing IT software development company that enables successful technology-based digital transformation initiatives for enterprises, to create a business that is connected, open, intelligent, and scalable. We are an offshore web development company with clients all across the globe. Since our inception in the year 2016, we have provided web and application development services for varied business domains.

Job Description in Brief:
We are looking for candidates with 0 to 6 months of experience, proficient in PHP/WordPress/Laravel/CodeIgniter, to develop websites and web applications in core PHP. The desired candidate would be involved in the full software/website development life cycle, from requirement analysis through to testing. The candidate should be able to work in a team or handle projects independently.

Company Address: Office No. 101, Metropole, Near BRT Bus Stop, Dange Chowk, Thergaon, Pune, Maharashtra – 411 033
Company Website: https://sigasystems.com/
Qualification: BE/B.Tech/M.Tech/MCA/MCS/MCM
Work Experience: 0 to 6 months
Annual CTC Range: As per company norms & market standard

Technical Key Skills:
- Expertise in MVC, PHP frameworks (Laravel, CodeIgniter), WCF, Web API, and Entity Framework.
- Proficient in jQuery, AJAX, Bootstrap.
- Good knowledge of HTML5, CSS3, JavaScript, SQL Server, WordPress, MySQL.
- Hands-on core PHP along with experience in AJAX, jQuery, Bootstrap, APIs.
- Experience with project management systems like Jira, Trello, ClickUp, BugHerd, Basecamp, etc.
- High proficiency with Git.
- Experience with RESTful APIs.
- Able to work with a team.
- Must have good communication skills.

Desired Competencies:
- Bachelor's degree in Computer Science or a related field.
- Good expertise in core PHP along with working exposure in HTML, HTML5, JavaScript, CSS, Ajax, jQuery, Bootstrap, and APIs.
- PHP scripting with MVC-architecture frameworks like CodeIgniter and Laravel.
- Knowledge of Linux, web application development, and quality software development.
- Optimizing MySQL queries and databases to improve performance.
- Excellent conceptual, analytical, and programming skills.
- Knowledge of Object-Oriented Programming (OOP) concepts with Smarty and AJAX.
- Should be well-versed with OS: Linux/UNIX, Windows (LAMP and WAMP).
- Knowledge of relational database management systems, database design, and normalization.
- Preference will be given to working knowledge of open-source platforms like WordPress, Shopify, and other open-source e-commerce systems.
- Good communication skills (spoken/written) will be a plus.
- Must be technically and logically strong.

Industry: IT-Software / Software Services
Functional Area: IT Software – Design & Development
Role Category: Developer
Role: PHP Developer/Laravel
Employment Type: Permanent Job, Full Time

Roles & Responsibilities:
- Responsible for developing websites and web-based applications using open-source systems.
- Monitor, manage, and maintain the server environments where PHP Laravel applications are hosted, ensuring optimal performance, security, and availability.
- Integrate third-party APIs and services as needed.
- Strong communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.
- Actively participate in quality assurance activities including design and code reviews, unit testing, defect fixes, and operational readiness.
- Diagnose and resolve server-related issues, including those impacting the performance of Laravel applications; this includes debugging server errors, analyzing logs, and identifying root causes of downtime or slow response times.
- Manage development projects from inception to completion autonomously and independently.
- Provide administrative support, tools, and documentation for specific development projects.
- Design applications and database structures for performance and scalability.
- Deliver accurate project requirements and timeline estimates, providing regular feedback and consistently meeting project deadlines.
- Design and implement web-based back-end components that are high-performing and scalable.
- Participate in and improve development processes and tools for other development teams.
- Contribute ideas and effort towards the project and work as part of a team to find solutions to various problems.

If this opportunity feels like the perfect match for you, don't wait—apply now! Reach out to us via email or WhatsApp using the details below. Let's connect and create something extraordinary together!
Contact Person Name: HR Riddhi
Email: hr@sigasystems.com
WhatsApp: +91 8873511171

Job Type: Full-time
Pay: ₹12,500.00 - ₹14,000.00 per month
Benefits: Paid sick time, paid time off
Schedule: Rotational shift
Education: Bachelor's (Preferred)
Experience: total work: 1 year (Preferred); PHP/Laravel: 1 year (Preferred)
Language: English (Preferred)
Expected Start Date: 16/07/2025

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Source: LinkedIn

Role: Database Engineer
Location: Remote
Notice Period: 30 Days

Skills And Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and ability to work independently.
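As a rough illustration of the ETL and Pandas/SQLAlchemy skills listed above, the sketch below cleans a small set of records and loads them into a database. The column names and the SQLite target are assumptions chosen to keep the example self-contained; a PostgreSQL or MySQL target would only change the connection URL.

```python
import pandas as pd
from sqlalchemy import create_engine

# Extract: in practice this would come from pd.read_csv, an API, or another database.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "email": [" A@Example.com", "b@example.com", "b@example.com", None],
})

# Transform: basic cleanup before loading.
raw["email"] = raw["email"].str.strip().str.lower()
clean = raw.dropna(subset=["email"]).drop_duplicates(subset="email")

# Load: SQLite keeps the demo self-contained; for PostgreSQL the URL would be
# something like "postgresql+psycopg2://user:pass@host/db".
engine = create_engine("sqlite:///warehouse.db")
clean.to_sql("customers", engine, if_exists="replace", index=False)
print(f"loaded {len(clean)} rows into customers")
```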

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job title: Data Engineer
Experience: 5–8 Years
Location: Remote
Shift: IST (Indian Standard Time)
Contract Type: Short-Term Contract

Job Overview
We are seeking an experienced Data Engineer with deep expertise in Microsoft Fabric to join our team on a short-term contract basis. You will play a pivotal role in designing and building scalable data solutions and enabling business insights in a modern cloud-first environment. The ideal candidate will have a passion for data architecture, strong hands-on technical skills, and the ability to translate business needs into robust technical solutions.

Key Responsibilities
- Design and implement end-to-end data pipelines using Microsoft Fabric components (Data Factory, Dataflows Gen2).
- Build and maintain data models, semantic layers, and data marts for reporting and analytics.
- Develop and optimize SQL-based ETL processes integrating structured and unstructured data sources.
- Collaborate with BI teams to create effective Power BI datasets, dashboards, and reports.
- Ensure robust data integration across various platforms (on-premises and cloud).
- Implement mechanisms for data quality, validation, and error handling.
- Translate business requirements into scalable and maintainable technical solutions.
- Optimize data pipelines for performance and cost-efficiency.
- Provide technical mentorship to junior data engineers as needed.

Required Skills
- Hands-on experience with Microsoft Fabric: Dataflows Gen2, Pipelines, OneLake.
- Strong proficiency in Power BI, including semantic modeling and dashboard/report creation.
- Deep understanding of data modeling techniques: star schema, snowflake schema, normalization, denormalization.
- Expertise in SQL, stored procedures, and query performance tuning.
- Experience integrating data from diverse sources: APIs, flat files, databases, and streaming.
- Knowledge of data governance, lineage, and data catalog tools within the Microsoft ecosystem.
- Strong problem-solving skills and ability to manage large-scale data workflows.
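The star-schema and denormalization techniques named above can be pictured with a toy example: a fact table joined to its dimension to produce a flat reporting table of the kind a Power BI dataset might consume. The tables and column names are invented for illustration.

```python
import pandas as pd

# Toy dimension and fact tables in star-schema form.
dim_product = pd.DataFrame({
    "product_key": [1, 2],
    "product_name": ["Widget", "Gadget"],
    "category": ["Hardware", "Hardware"],
})
fact_sales = pd.DataFrame({
    "date_key": [20250601, 20250601, 20250602],
    "product_key": [1, 2, 1],
    "units": [10, 4, 7],
    "revenue": [100.0, 80.0, 70.0],
})

# Denormalize: join the fact to its dimension to build a flat reporting table.
report = fact_sales.merge(dim_product, on="product_key", how="left")
print(report.groupby("product_name")["revenue"].sum())
```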

Posted 1 week ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Source: LinkedIn

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: PL/SQL, SQL Writing, mSQL
Education: Graduate
Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Primary Responsibilities:
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have)
- Experience with ETL processes and data warehousing.
- Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
- Familiarity with database performance tuning and indexing strategies.
- Exposure to Python or other scripting languages for database automation.
- Experience with business intelligence (BI) tools like Power BI or Tableau.

What We Offer
- Fully remote internship with flexible working hours.
- Hands-on experience with real-world database projects.
- Mentorship from experienced database professionals.
- Certificate of completion and potential for a full-time opportunity based on performance.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities apart from this one on the portal; depending on your skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

0.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: Indeed

You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Responsibilities
Requisition ID: R-10363280
Date posted: 06/17/2025
End Date: 06/30/2025
City: Pune
State/Region: Maharashtra
Country: India
Additional Locations: Noida, Uttar Pradesh
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Tech Lead, Data Architecture

What does a great Data Architect do at Fiserv?
We are seeking a seasoned Data Architect with extensive experience in data modeling and architecting data solutions, particularly with Snowflake. The ideal candidate will have 8-12 years of hands-on experience in designing, implementing, and optimizing data architectures to meet the evolving needs of our organization. As a Data Architect, you will play a pivotal role in ensuring the robustness, scalability, and efficiency of our data systems.

What you will do:
- Data Architecture Design: Develop, optimize, and oversee conceptual and logical data systems, ensuring they meet both current and future business requirements.
- Data Modeling: Create and maintain data models using Snowflake, ensuring data integrity, performance, and security.
- Solution Architecture: Design and implement end-to-end data solutions, including data ingestion, transformation, storage, and access.
- Stakeholder Collaboration: Work closely with business stakeholders, data scientists, and engineers to understand data requirements and translate them into technical specifications.
- Performance Optimization: Monitor and improve data system performance, addressing any issues related to scalability, efficiency, and data quality.
- Governance and Compliance: Ensure data architectures comply with data governance policies, standards, and industry regulations.
- Technology Evaluation: Stay current with emerging data technologies and assess their potential impact and value to the organization.
- Mentorship and Leadership: Provide technical guidance and mentorship to junior data architects and engineers, fostering a culture of continuous learning and improvement.

What you will need to have:
- 8-12 years of experience in data architecture and data modeling in Snowflake.
- Proficiency in the Snowflake data warehousing platform.
- Strong understanding of data modeling concepts, including normalization, denormalization, star schema, and snowflake schema.
- Experience with ETL/ELT processes and tools.
- Familiarity with data governance and data security best practices.
- Knowledge of SQL and performance tuning for large-scale data systems.
- Experience with cloud platforms (AWS, Azure, or GCP) and related data services.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to translate technical concepts for non-technical stakeholders.
- Demonstrated ability to lead and mentor technical teams.

What would be nice to have:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certifications: Snowflake certifications or other relevant industry certifications.
- Industry Experience: Experience in the Finance/Cards/Payments industry.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion:
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Port Blair, Andaman and Nicobar Islands, India

On-site

Source: LinkedIn

Job Title: Database Developer
Location: Calicut, Kerala (On-site)
Experience: Minimum 2 Years
Job Type: Full-time
Notice: Immediate/15 days
Candidates from Kerala are highly preferred.

Job Summary:
We are hiring a skilled and detail-oriented Database Developer with at least 2+ years of experience to join our team in Calicut. The ideal candidate will have hands-on expertise in SQL and PostgreSQL, with a strong understanding of database design, development, and performance optimization. Experience with Azure cloud services is a plus.

Key Responsibilities:
- Design, develop, and maintain database structures, stored procedures, functions, and triggers.
- Write optimized SQL queries for integration with applications and reporting tools.
- Ensure data integrity, consistency, and security across platforms.
- Monitor and tune database performance for high availability and scalability.
- Collaborate with developers and DevOps teams to support application development.
- Maintain and update technical documentation related to database structures and processes.
- Assist in data migration and backup strategies.
- Work with cloud-based databases and services (preferably on Azure).

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 2 years of experience as a Database Developer or in a similar role.
- Strong expertise in SQL and PostgreSQL database development.
- Solid understanding of relational database design and normalization.
- Experience in writing complex queries, stored procedures, and performance tuning.
- Familiarity with version control systems like Git.
- Strong analytical and troubleshooting skills.

Preferred Qualifications:
- Experience with Azure SQL Database, Data Factory, or related services.
- Knowledge of data warehousing and ETL processes.
- Exposure to NoSQL or other modern database technologies is a plus.

Posted 1 week ago

Apply

9.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Job Description
- 9+ years of working experience in Data Engineering and Data Analytics projects, implementing Data Warehouse, Data Lake and Lakehouse solutions and the associated ETL/ELT patterns.
- Worked as a Data Modeller in one or two implementations, creating and implementing data models and database designs using Dimensional and ER models.
- Good knowledge and experience in modelling complex scenarios like many-to-many relationships, SCD types, late-arriving facts and dimensions, etc.
- Hands-on experience in at least one data modelling tool such as Erwin, ER/Studio, Enterprise Architect or SQLDBM.
- Experience working closely with business stakeholders/business analysts to understand functional requirements and translate them into data models and database designs.
- Experience creating conceptual and logical models and translating them into physical models to address both functional and non-functional requirements.
- Strong knowledge of SQL; able to write complex queries and profile the data to understand relationships and data quality issues.
- Very strong understanding of database modelling and design principles like normalization, denormalization, and isolation levels.
- Experience in performance optimization through database design (physical modelling).
- Good communication skills.
(ref:hirist.tech)
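One of the scenarios mentioned above, slowly changing dimensions (SCD Type 2), can be sketched in a few lines of pandas: the current dimension row is expired and a new versioned row is appended. This is a minimal illustration with invented columns, not a production pattern.

```python
import pandas as pd

# Current dimension rows, each with validity dates and a current-row flag.
dim = pd.DataFrame([
    {"customer_id": 1, "city": "Kolkata", "valid_from": "2024-01-01",
     "valid_to": "9999-12-31", "is_current": True},
])

def apply_scd2(dim: pd.DataFrame, customer_id: int, new_city: str, change_date: str) -> pd.DataFrame:
    """Expire the current row and append a new version (SCD Type 2)."""
    mask = (dim["customer_id"] == customer_id) & dim["is_current"]
    if mask.any() and dim.loc[mask, "city"].iloc[0] != new_city:
        dim.loc[mask, ["valid_to", "is_current"]] = [change_date, False]
        new_row = {"customer_id": customer_id, "city": new_city,
                   "valid_from": change_date, "valid_to": "9999-12-31",
                   "is_current": True}
        dim = pd.concat([dim, pd.DataFrame([new_row])], ignore_index=True)
    return dim

dim = apply_scd2(dim, 1, "Pune", "2025-06-17")
print(dim)
```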

Posted 1 week ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

JD for Data Modeler

Key Requirements:
- Total 8+ years of experience, with 8 years of hands-on experience in data modelling.
- Expertise in conceptual, logical, and physical data modeling.
- Proficient in tools such as Erwin, SQLDBM, or similar.
- Strong understanding of data governance and database design best practices.
- Excellent communication and collaboration skills.
- 8+ years of working experience in Data Engineering and Data Analytics projects, implementing Data Warehouse, Data Lake and Lakehouse solutions and the associated ETL/ELT patterns.
- Worked as a Data Modeller in one or two implementations, creating and implementing data models and database designs using Dimensional and ER models.
- Good knowledge and experience in modelling complex scenarios like many-to-many relationships, SCD types, late-arriving facts and dimensions, etc.
- Hands-on experience in at least one data modelling tool such as Erwin, ER/Studio, Enterprise Architect or SQLDBM.
- Experience working closely with business stakeholders/business analysts to understand functional requirements and translate them into data models and database designs.
- Experience creating conceptual and logical models and translating them into physical models to address both functional and non-functional requirements.
- Strong knowledge of SQL; able to write complex queries and profile the data to understand relationships and data quality issues.
- Very strong understanding of database modelling and design principles like normalization, denormalization, and isolation levels.
- Experience in performance optimization through database design (physical modelling).
(ref:hirist.tech)

Posted 1 week ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential.

What's In It For You
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, inclusive and diverse team culture.
- Gender-neutral policy.
- Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, dental and OPD benefits.
- Commitment to the overall development of an employee through a comprehensive learning & development framework.

Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, basis targets set for resolution, normalization, rollback/absolute recovery and ROR.

Role Accountability
- Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers.
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR.
- Ensure various critical segments as defined by the business are reviewed and performance is driven on them.
- Ensure judicious use of hardship tools and adherence to the settlement waivers, both on rate and value.
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers.
- Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines.
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies.
- Ensure 100% data security using secured data transfer modes and data purging as per policy.
- Ensure all customer complaints received are closed within the time frame.
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocating.
- Ensure agencies raise invoices in a timely manner.
- Monitor NFTE ACR CAPE as per the collection strategy.

Measures of Success
Portfolio coverage, resolution rate, normalization/rollback rate, settlement waiver rate, absolute recovery, rupee collected, NFTE CAPE, DRA certification of NFTEs, absolute customer complaints, absolute audit observations, and process adherence as per MOU.

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes.

Competencies critical to the role
Analytical ability, stakeholder management, problem solving, result orientation, process orientation.

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description
Customer addresses, geospatial information and the road network play a crucial role in Amazon Logistics' delivery planning systems. We own exciting science problems in the areas of address normalization, geocode learning, maps learning, and time estimations (including route-time, delivery-time and transit-time predictions), which are key inputs in delivery planning. As part of the Geospatial science team within Last Mile, you will partner closely with other scientists and engineers in a collegial environment to develop enterprise ML solutions with a clear path to business impact. The setting also gives you an opportunity to think about a complex, large-scale problem over multiple years, building increasingly sophisticated solutions year over year. In the process there will be opportunities to innovate, explore SOTA approaches and publish the research at internal and external ML conferences. Successful candidates will have deep knowledge of competing machine learning methods for large-scale predictive modelling, natural language processing, and semi-supervised & graph-based learning. We also look for the experience to graduate prototype models to production and the communication skills to explain complex technical approaches to stakeholders of varied technical expertise.

Key job responsibilities
As an Applied Scientist I, your responsibility will be to deliver on a well-defined but complex business problem, exploring SOTA technologies, including GenAI, and customizing large models as suitable for the application. Your job will be to work on an end-to-end business problem from design to experimentation and implementation. There is also an opportunity to work on open-ended ML directions within the space and publish the work at prestigious ML conferences.

About The Team
The LMAI team owns the worldwide charter for address and location learning solutions, which are crucial for efficient Last Mile delivery planning; the team also owns problems in the space of maps learning and travel time estimation.

Basic Qualifications
- Experience programming in Java, C++, Python or a related language.
- Experience with SQL and an RDBMS (e.g., Oracle) or a Data Warehouse.

Preferred Qualifications
- Experience implementing algorithms using both toolkits and self-developed code.
- Publications at top-tier peer-reviewed conferences or journals.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI HYD 13 SEZ
Job ID: A3009636
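Address normalization, one of the problem areas named above, can be illustrated with a deliberately simple rule-based sketch; the production systems described in the posting use learned models, and the abbreviation map below is a hypothetical example.

```python
import re

# A few common address abbreviations; a production system would learn
# such mappings rather than hard-code them.
ABBREVIATIONS = {r"\brd\b": "road", r"\bapt\b": "apartment",
                 r"\bopp\b": "opposite", r"\bnr\b": "near"}

def normalize_address(raw: str) -> str:
    """Lowercase, collapse whitespace/punctuation, expand abbreviations."""
    addr = raw.lower()
    addr = re.sub(r"[.,#]", " ", addr)
    addr = re.sub(r"\s+", " ", addr).strip()
    for pattern, full in ABBREVIATIONS.items():
        addr = re.sub(pattern, full, addr)
    return addr

print(normalize_address("Flat 12, Opp. City Mall, MG Rd., Hyderabad"))
# -> "flat 12 opposite city mall mg road hyderabad"
```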

Posted 1 week ago

Apply

5.0 - 31.0 years

0 - 1 Lacs

Nagole, Hyderabad

Remote

Source: Apna

🔹 Job Title: Python Backend & Middleware Developer with Database Expertise
📍 Location: Hyderabad
🕒 Experience: 3–6 Years
🧾 Employment Type: Full-time

🔧 Key Responsibilities:

🔸 Python Backend Development:
- Design, build, and maintain scalable RESTful APIs using Python (FastAPI/Django/Flask).
- Write clean, efficient, and testable code.
- Implement backend logic, data processing, and third-party API integrations.
- Use asynchronous programming paradigms where required (e.g., asyncio, aiohttp).

🔸 Middleware Development:
- Develop and maintain middleware components to handle cross-cutting concerns like logging, authentication, and request/response handling.
- Ensure smooth communication between different systems, services, and microservices.
- Optimize inter-service communication using message brokers (RabbitMQ, Kafka, etc.).
- Implement caching and rate-limiting mechanisms where applicable.

🔸 Database Development:
- Design and manage relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis) databases.
- Write complex SQL queries, stored procedures, and views for efficient data retrieval.
- Ensure database normalization, indexing, performance tuning, and optimization.
- Implement data backup, recovery strategies, and migration scripts.

🧠 Required Skills:
- Strong proficiency in Python 3.x and experience with frameworks like FastAPI, Django, or Flask.
- Experience with middleware architecture, API gateways, or microservice orchestration.
- Expertise in SQL and hands-on experience with PostgreSQL/MySQL.
- Familiarity with NoSQL databases like MongoDB or Redis.
- Knowledge of RESTful APIs, OAuth2/JWT, and API security best practices.
- Hands-on experience with Docker, Git, and CI/CD pipelines.
- Familiarity with cloud platforms like AWS, GCP, or Azure is a plus.
- Good understanding of software design patterns and architecture principles.

✅ Preferred Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Technology, or related fields.
- Experience working in Agile/Scrum teams.
- Exposure to Kafka, RabbitMQ, or similar messaging systems.
- Experience with Unit Testing, Integration Testing, and Load Testing tools.

🧩 Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Ability to manage time effectively and deliver tasks within deadlines.
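Since the posting names FastAPI and middleware for cross-cutting concerns such as logging, here is a minimal sketch of that pattern; the endpoint and log format are illustrative assumptions, not requirements from the posting.

```python
import logging
import time

from fastapi import FastAPI, Request

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("api")

app = FastAPI()

@app.middleware("http")
async def log_requests(request: Request, call_next):
    """Cross-cutting middleware: time every request and log method/path/status."""
    start = time.perf_counter()
    response = await call_next(request)
    elapsed_ms = (time.perf_counter() - start) * 1000
    logger.info("%s %s -> %s (%.1f ms)", request.method, request.url.path,
                response.status_code, elapsed_ms)
    return response

@app.get("/health")
async def health() -> dict:
    """Trivial endpoint so the middleware has something to wrap."""
    return {"status": "ok"}

# Run with: uvicorn app:app --reload   (assuming this file is saved as app.py)
```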

Posted 1 week ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area are a plus.

Apprentice Analyst

Roles and responsibilities:
- Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources like the internet, specific websites, databases, etc.
- Data quality check and correction.
- Data profiling and reporting (basic).
- Email communication with the client on request acknowledgment, project status and responses to queries.
- Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective.
- Provide technical consulting to the customer's category managers around industry best practices of product data enhancement.

Technical and Functional Skills:
- Bachelor's degree in Engineering from the Electrical, Mechanical or Electronics stream.
- Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications.
- Intermediate knowledge of MS Office/Internet.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Purpose of Role
Chubb is seeking a highly skilled and experienced Deep Learning Engineer with Generative AI experience to develop and scale our Generative AI capabilities. The ideal candidate will be responsible for designing, fine-tuning and training large language models and developing Generative AI systems that can create and improve the conversational abilities and decision-making skills of our machines.

Key Accountabilities & Responsibilities
- Develop and improve Generative AI systems to enable high-quality decision making, to refine answers for queries, and to enhance automated communication capabilities.
- Own the entire process of data collection, training, and deploying machine learning models.
- Continuously research and implement cutting-edge techniques in deep learning, NLP and Generative AI to build state-of-the-art models.
- Work closely with Data Scientists and other Machine Learning Engineers to design and implement end-to-end solutions.
- Optimize and streamline deep learning training pipelines.
- Develop performance metrics to track the efficiency and accuracy of deep learning models.

Skills & Experience
- Minimum of 4 years of industry experience in developing deep learning models with a focus on NLP and Generative AI.
- Expertise in deep learning frameworks such as TensorFlow, PyTorch and Keras.
- Experience working with cloud-based services such as Azure for training and deployment of deep learning models.
- Experience with Hugging Face's Transformers libraries.
- Expertise in developing and scaling Generative AI systems.
- Experience in large dataset processing, including pre-processing, cleaning and normalization.
- Proficiency in programming languages such as Python (preferred) or R.
- Experience with natural language processing (NLP) techniques and libraries.
- Excellent analytical and problem-solving skills.
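The "pre-processing, cleaning and normalization" requirement above typically includes feature scaling; a minimal NumPy sketch of min-max scaling and z-score standardization is shown below with toy data invented for the example.

```python
import numpy as np

# Toy feature matrix: rows are samples, columns are features.
X = np.array([[10.0, 200.0],
              [12.0, 180.0],
              [11.0, 220.0]])

# Min-max scaling to [0, 1].
x_min, x_max = X.min(axis=0), X.max(axis=0)
X_minmax = (X - x_min) / (x_max - x_min)

# Z-score standardization (zero mean, unit variance per feature).
X_zscore = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_minmax)
print(X_zscore)
```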

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Source: LinkedIn

Project Role: Application Tech Support Practitioner
Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary:
As an Application Tech Support Practitioner, you will act as the ongoing interface between the client and the system or application. You will be dedicated to quality, using exceptional communication skills to keep our world-class systems running. With your deep product knowledge, you will accurately define client issues and design resolutions. Your typical day will involve troubleshooting client issues, providing technical support, and collaborating with cross-functional teams to ensure smooth operations.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Ensure timely resolution of client issues and technical support.
- Collaborate with cross-functional teams to troubleshoot and resolve complex technical problems.
- Manage and prioritize multiple tasks to meet client expectations.
- Continuously improve product knowledge and stay updated with the latest industry trends.

Professional & Technical Skills:
- Must-have skills: proficiency in Python (programming language).
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (programming language).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

We are looking for Collection Managers to handle our Retail and BB portfolio in the Pune and Nashik region. The role holder is responsible for meeting the targets assigned for the portfolio and the vendors managed. Key responsibilities include achievement of set targets along with 100% process adherence.

The Collection Manager/Agency Manager is to ensure that the vendor adheres to the regulatory norms and process requirements communicated as part of the COC/agreement during empanelment.
Ensure 100% allocation of the portfolio allocated for field coverage and payment follow-up.
Ensure an adequate workforce is deployed by the vendor for field coverage of accounts allocated across pin codes/geographies.
The field workforce has to be DRA certified as per regulatory norms.
All resources managing the YBL portfolio must have valid YBL ID cards and receipt books issued.
All payments collected in the field are to be deposited/applied within the TAT.
Timely communication of targets across portfolio parameters of resolution, rollback, and normalization to the agency proprietor and agency manager.
Ensure the resolution, rollback, and normalization targets assigned on the allocated portfolio are achieved.
Timely submission of used receipts and ensuring payment application within the approved TAT. Receipt reconciliation to be done within the approved cut-off date.
Start-of-month audits to be conducted to ensure 100% process adherence. Any deviations to be reported to the location head/RCM as appropriate.
Field follow-up to be updated in the DCR, and trails to be sent to the Central Team for updating in V+ for all cases referred/allocated.
Communication on pickups for accounts where pickups have been generated.
Ensure vendor payouts are raised on time and commission is subsequently paid to vendors.
The agency is to operate within the regulatory framework with 100% compliance with regulatory norms.
Direct field follow-up on difficult cases by the Agency Manager, along with the agency proprietor/agency supervisor, for personal traction of high-risk accounts.
Ensure findings from the necessary investigation of complaints received across channels are shared after review, to prevent any reputational/financial loss for the Bank.
Identify and recommend cases where legal recourse needs to be sought, and ensure legal coverage of all allocated accounts as appropriate.
Ensure regular exchange with the agency proprietor on key deliverables and process norms for any variance/process gap.
Ensure quality resources are hired and deployed by the agency proprietor on the YBL portfolio.
Ensure regular agency visits and reviews with the field team.
Ensure there is no financial/non-financial exchange with any third party (customer, agency resources, etc.).
All communication to the agencies is to be sent only to registered email IDs.
Ensure all customer interactions are updated in the DCR for future reference.
Ensure all customer issues involving judicial bodies are highlighted to all internal stakeholders (Legal/Compliance).
Ensure rightful representation of the Bank in any customer/third-party interaction.
Timely closure of receipt reconciliation and providing confirmation of data destruction.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Linkedin logo

Job Description
The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all key stakeholders, highlighting opportunities for improvement and any concerns. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus.

Apprentice Analyst

Roles and responsibilities:
Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data through research across different sources such as the internet, specific websites, and databases.
Data quality checks and correction.
Data profiling and reporting (basic).
Email communication with the client on request acknowledgment, project status, and responses to queries.
Help customers enhance their product data quality (electrical, mechanical, electronics) from a technical specification and description perspective.
Provide technical consulting to the customer's category managers on industry best practices for product data enhancement.

Technical and Functional Skills:
Bachelor's degree in Engineering (Electrical, Mechanical, or Electronics stream).
Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications.
Intermediate knowledge of MS Office/Internet.
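A small, hypothetical example of the product-data standardization work described in this listing: normalizing free-text voltage specifications into a consistent format with pandas. The column names, the output format, and the normalize_voltage helper are invented for illustration and are not the client's actual schema.

```python
# Sketch: standardize messy product-spec strings into one consistent representation.
import re
import pandas as pd

df = pd.DataFrame({
    "product": ["Pump A", "Motor B", "Fan C"],
    "voltage_raw": ["230 V", "415VOLTS", "110 v"],
})

def normalize_voltage(value: str):
    """Extract the numeric value and report it in a standard 'NNN V' format."""
    match = re.search(r"(\d+(?:\.\d+)?)", str(value))
    return f"{float(match.group(1)):g} V" if match else None

df["voltage"] = df["voltage_raw"].map(normalize_voltage)
print(df[["product", "voltage"]])  # all rows now share the same unit convention
```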

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Calcutta

On-site

GlassDoor logo

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary
A career in our Microsoft Dynamics team will provide the opportunity to help our clients transform their technology landscape across Front, Back and Mid-Office functions leveraging Microsoft Dynamics. We focus on contributing to PwC's value proposition of "strategy led and technology enabled", by aligning our Consulting Solutions' industry focus with Microsoft technologies such as Dynamics 365, Azure, Power Platform and Power BI.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Experience as a Data Analyst with high proficiency
· Expertise in writing and optimizing SQL queries in SQL Server and/or Oracle
· Experience in data extraction, transformation, and loading (ETL) using SSIS and ADF
· Experience in Power BI and/or Tableau for visualizing and analyzing data
· Knowledge of database normalization for optimum performance
· Excellent command of MS Excel, with proficiency in VLOOKUPs, pivot tables, and VBA macros
· Knowledge of data warehousing concepts
· Performance optimization and troubleshooting capabilities
· Good project management skills: client meetings, stakeholder engagement
· Familiarity with Agile methodology
· Strong knowledge of Azure DevOps Boards, sprints, queries, pipelines (CI/CD), etc.

Mandatory skill sets: ADF, Power BI
Preferred skill sets: DevOps/CI/CD
Years of experience required: 3-7 years
Education qualification: B.Tech/B.E.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Power BI
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
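To illustrate the database-normalization knowledge this listing asks for, here is a hedged pandas sketch that splits a denormalized orders extract into separate customer and order tables linked by a key. The table and column names are assumptions for the example, not an actual client data model.

```python
# Sketch: normalize a flat orders extract by moving repeated customer attributes
# into their own table and keeping only a foreign key on the orders side.
import pandas as pd

orders_flat = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer_name": ["Acme Ltd", "Acme Ltd", "Globex"],
    "customer_city": ["Kolkata", "Kolkata", "Pune"],
    "amount": [2500, 1200, 980],
})

# Customer attributes become a dimension-style table with a surrogate key...
customers = (orders_flat[["customer_name", "customer_city"]]
             .drop_duplicates()
             .reset_index(drop=True))
customers["customer_id"] = customers.index + 1

# ...and each order keeps only that key, removing the duplicated customer data.
orders = orders_flat.merge(customers, on=["customer_name", "customer_city"])
orders = orders[["order_id", "customer_id", "amount"]]

print(customers)
print(orders)
```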

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Linkedin logo

We are looking to hire a Data or Business Analyst to join our data team. You will take responsibility for managing our master data set, developing reports, and troubleshooting data issues. To do well in this role you need a very fine eye for detail, experience as a data analyst, and a deep understanding of the popular data analysis tools and databases.

Responsibilities:
Managing master data, including creation, updates, and deletion.
Managing users and user roles.
Providing quality assurance of imported data, working with quality assurance analysts if necessary.
Commissioning and decommissioning of data sets.
Processing confidential data and information according to guidelines.
Helping develop reports and analyses.
Managing and designing the reporting environment, including data sources, security, and metadata.
Supporting the data warehouse in identifying and revising reporting requirements.
Supporting initiatives for data integrity and normalization.
Assessing, testing, and implementing new or upgraded software, and assisting with strategic decisions on new systems.
Generating reports from single or multiple systems.
Troubleshooting the reporting database environment and reports.
Evaluating changes and updates to source production systems.
Training end-users on new reports and dashboards.
Providing technical expertise in data storage structures, data mining, and data cleansing.

Requirements:
Bachelor's degree in computer science from an accredited university or college.
Work experience as a Data or Business Analyst or in a related field.
Ability to work with stakeholders to assess potential risks.
Ability to analyze existing tools and databases and provide software solution recommendations.
Ability to translate business requirements into non-technical, lay terms.
High-level experience in methodologies and processes for managing large-scale databases.
Demonstrated experience in handling large data sets and relational databases.
Understanding of addressing and metadata standards.
High-level written and verbal communication skills.
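As a hedged illustration of the data-integrity and cross-system reporting responsibilities above, the sketch below reconciles records pulled from two hypothetical systems and flags orphan references; all table and column names are invented for the example.

```python
# Sketch: check that every billing record points at a valid master-data account.
import pandas as pd

crm_accounts = pd.DataFrame({"account_id": [1, 2, 3], "name": ["A", "B", "C"]})
billing = pd.DataFrame({"invoice_id": [10, 11, 12], "account_id": [1, 2, 5]})

# Left-join billing onto the master account list; unmatched rows indicate an
# integrity problem that should be reported and corrected.
merged = billing.merge(crm_accounts, on="account_id", how="left", indicator=True)
orphans = merged[merged["_merge"] == "left_only"]

print(f"{len(orphans)} invoice(s) reference accounts missing from master data")
print(orphans[["invoice_id", "account_id"]])
```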

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Role: AI/ML Engineer
Relevant experience: 3+ years
Location: Bangalore/Mysore/Coimbatore
Work mode: From office, 5 days a week

Job Summary: We are looking for an innovative AI/ML Engineer to join our team and work on cutting-edge machine learning and artificial intelligence projects. The ideal candidate will have experience in building, deploying, and optimizing AI/ML models, along with a strong foundation in data science, programming, and algorithms. You will help drive the development of intelligent systems that leverage machine learning to solve real-world problems and improve business outcomes.

Key Responsibilities:
Data preparation and analysis: ability to understand large datasets, preprocess them, and extract features.
Data preprocessing techniques: knowledge of normalization, feature encoding, and handling missing values.
Data cleaning: identifying and rectifying errors, outliers, and missing values within datasets.
Design, develop, and implement machine learning and deep learning (FNN, CNN, RNN) models, with a focus on LLMs, generative AI, and fraud detection systems.
Deploy and maintain ML models in AWS or other cloud environments.
Optimize model performance and scalability.
Collaborate with cross-functional teams to integrate AI solutions into existing applications.
Develop and maintain APIs (RESTful) for AI model integration.
Implement MLOps best practices to streamline the ML lifecycle.
Stay up to date with the latest advancements in AI/ML and incorporate new techniques into our workflow.
Develop and implement fraud detection models to identify and prevent fraudulent activities.
Evaluate model performance using appropriate metrics and techniques, ensuring high accuracy and reliability.
Experience with machine learning libraries and frameworks: familiarity with tools like TensorFlow, PyTorch, scikit-learn, and Keras.
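The preprocessing steps named in this listing (normalization, feature encoding, handling missing values) could be wired together roughly as in the sketch below, assuming scikit-learn; the feature names, sample data, and imputation strategies are illustrative choices rather than the employer's actual pipeline.

```python
# Sketch: impute missing values, scale numeric features, one-hot encode categoricals.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

data = pd.DataFrame({
    "amount": [120.0, np.nan, 87.5, 430.0],
    "channel": ["web", "pos", "web", np.nan],
})

numeric = Pipeline([
    ("impute", SimpleImputer(strategy="median")),
    ("scale", StandardScaler()),          # normalization of numeric features
])
categorical = Pipeline([
    ("impute", SimpleImputer(strategy="most_frequent")),
    ("encode", OneHotEncoder(handle_unknown="ignore")),  # feature encoding
])

preprocess = ColumnTransformer([
    ("num", numeric, ["amount"]),
    ("cat", categorical, ["channel"]),
])

features = preprocess.fit_transform(data)
print(features.shape)  # rows x (1 scaled numeric column + one-hot channel columns)
```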

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

We are seeking an experienced PHP Developer to design, develop, and maintain high-performance web applications. This role involves collaborating with cross-functional teams, optimizing application performance, and ensuring secure and scalable solutions. If you have a strong foundation in PHP development, frameworks, database management, and cloud technologies, we invite you to apply and contribute to cutting-edge projects.

Key Responsibilities:
Core PHP & frameworks: strong expertise in core PHP and PHP web frameworks (preferably Symfony, Laravel, or CodeIgniter).
Object-oriented programming (OOP): deep understanding of OOP principles and MVC design patterns in PHP.
Third-party integrations: experience with third-party API integrations, authentication, and authorization mechanisms.
Database management: strong proficiency in MySQL, knowledge of database normalization and ORM, and experience working with SQL/NoSQL databases.
Web development & front end: familiarity with JavaScript, jQuery, VueJS, ReactJS, HTML5, CSS3, and other front-end technologies.
Security & compliance: knowledge of security best practices and compliance standards such as HIPAA and GDPR.
Application design & scalability: understanding of scalable application architecture and secure authentication between systems.
Cloud & DevOps: hands-on experience with AWS cloud services, Docker containers, CI/CD pipelines, and automation scripts.
Testing & debugging: proficiency in test-driven development (TDD) and strong debugging skills.
Version control & collaboration: proficient with Git and working in a collaborative Agile/Scrum environment.

Requirements:
Education & experience: Bachelor's degree in Computer Science, Information Technology, or a related field, with proven PHP development experience.
PHP frameworks: strong expertise in Symfony, Laravel, or CodeIgniter.
Front-end development: familiarity with HTML, CSS, JavaScript, and jQuery.
Database & API management: experience with MySQL, PostgreSQL, RESTful APIs, and web services.
Version control & CI/CD: proficient in Git, CI/CD pipelines, and automation using shell scripts.
Team collaboration & communication: ability to work collaboratively, solve complex problems, and pay attention to detail.

Preferred Qualifications:
Agile & Scrum: experience working in Agile/Scrum environments.
Multi-tech expertise: knowledge of additional programming languages (e.g., Python, JavaScript frameworks).
Cloud & DevOps: familiarity with AWS, Google Cloud, Docker, and Kubernetes.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies