
1137 Normalization Jobs - Page 18

Set up a job alert
JobPe aggregates job results for easy access, but you apply directly on the original job portal.

1.0 years

7 - 8 Lacs

Hyderabad

On-site

Greetings from Star Secutech Pvt Ltd! A huge welcome to immediate joiners!

Job Title: Sr. Executive
Reporting to: Team Leader/AM/DM
Location: Hyderabad
Working Hours/Days: 9 hours / 5 days a week
Shift: U.S. shift (5:30 PM – 2:30 AM)
Salary: 7-8 LPA (negotiable)

Job Role:
- Data types: Identify the data types of each data set and ensure compatibility.
- Harmonization process: Develop a harmonization process that outlines the steps required to harmonize data, such as data cleansing, normalization, and validation.
- Disparate data sources: Consider data sources that may have different formats, such as databases, spreadsheets, and APIs. Develop methods to integrate and harmonize data from various sources.
- Harmonization tools: Utilize tools and technologies such as extract, transform, load (ETL) tools, data integration platforms, and data cleansing software to streamline the harmonization process.
- Harmonization schema: Define a harmonization schema that standardizes the data structure, format, and terminology across different data sets.

Interested candidates, don't wait: call or DM 9087726632 to proceed further with the interview and start working! All the best!

Job Types: Full-time, Permanent
Pay: ₹700,000.00 - ₹800,000.00 per year
Benefits: Health insurance, leave encashment, paid sick time, paid time off, Provident Fund
Schedule: Evening shift, fixed shift, Monday to Friday, night shift, UK shift, US shift
Supplemental Pay: Performance bonus, shift allowance, yearly bonus
Education: Bachelor's (Required)
Experience: Pharmacovigilance: 1 year (Required)
Location: Hyderabad, Telangana (Required)
Shift availability: Night Shift (Required)
Work Location: In person
Application Deadline: 29/07/2025
Expected Start Date: 07/07/2025
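The posting above describes a cleanse → normalize → validate pipeline for harmonizing disparate data. A minimal sketch of those three steps in Python; the field names, unit mapping, and validation rules are hypothetical examples, not taken from the posting.

```python
def cleanse(record):
    """Trim whitespace and drop empty fields."""
    return {k: v.strip() for k, v in record.items() if v and v.strip()}

def normalize(record):
    """Map source-specific values onto a common schema/terminology."""
    unit_map = {"mg": "milligram", "milligrams": "milligram"}  # assumed mapping
    out = dict(record)
    if "unit" in out:
        out["unit"] = unit_map.get(out["unit"].lower(), out["unit"].lower())
    return out

def validate(record, required=("id", "unit")):
    """Reject records missing required fields."""
    missing = [f for f in required if f not in record]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return record

raw = {"id": " A-001 ", "unit": "Milligrams ", "note": "  "}
harmonized = validate(normalize(cleanse(raw)))
print(harmonized)  # {'id': 'A-001', 'unit': 'milligram'}
```

Each step stays a pure function, so the same pipeline can be applied record-by-record to data pulled from databases, spreadsheets, or APIs.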

Posted 1 month ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
- Experience with dbt for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
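The star schema named in the requirements above is one fact table keyed into surrounding dimension tables. A self-contained sketch using Python's sqlite3 for illustration (the posting targets Snowflake, whose DDL differs in details); table and column names are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
-- Fact table: measures plus foreign keys into the dimensions.
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
""")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
con.execute("INSERT INTO dim_product VALUES (1, 'widget')")
con.execute("INSERT INTO fact_sales VALUES (20240101, 1, 9.5)")

# A typical star-schema query: join the fact to its dimensions.
row = con.execute("""
    SELECT d.iso_date, p.name, f.amount
    FROM fact_sales f
    JOIN dim_date d USING (date_key)
    JOIN dim_product p USING (product_key)
""").fetchone()
print(row)  # ('2024-01-01', 'widget', 9.5)
```

A "snowflake" variant would further normalize the dimensions (e.g., split a category table out of `dim_product`), trading join depth for less redundancy.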

Posted 1 month ago

Apply

0.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

We’re Hiring | PostgreSQL Developer (2-4 Years Experience)
Location: Jayanagar, Bangalore
Company: Agile Labs
Work Type: Full-Time | Work from Office
Experience: 2 to 4 Years

About Us:
At Agile Labs, we build innovative software solutions using our proprietary low-code, no-code platform. We serve a wide range of industries by helping them digitize and streamline operations quickly and efficiently. We are passionate about clean code, scalable architecture, and performance-driven development.

Role Overview:
We are looking for a skilled and detail-oriented PostgreSQL Developer to join our growing team. The ideal candidate will have hands-on experience in designing, optimizing, and maintaining PostgreSQL databases in a fast-paced application environment.

Key Responsibilities:
- Design, implement, and optimize complex PostgreSQL database queries and stored procedures.
- Analyze existing SQL queries for performance improvements and suggest optimizations.
- Develop and maintain database structures to support evolving application and business requirements.
- Ensure data integrity and consistency across the platform.
- Work closely with application developers to understand data needs and deliver effective database solutions.
- Perform database tuning, indexing, and maintenance.
- Create, review, and optimize database scripts, views, and functions.
- Implement backup and recovery plans, data security protocols, and access control mechanisms.

Required Skills & Experience:
- 2 to 4 years of hands-on experience in PostgreSQL development.
- Strong knowledge of SQL, PL/pgSQL, functions, triggers, and stored procedures.
- Experience in database design, normalization, and data modeling.
- Proficiency in writing efficient queries and performance tuning.
- Good understanding of indexing strategies and query execution plans.
- Experience with version control systems like Git.
- Familiarity with Linux-based environments and scripting is a plus.
- Strong problem-solving skills and the ability to work independently or in a team.

Nice to Have:
- Experience working in Agile development environments.
- Exposure to cloud platforms (AWS, Azure, or GCP) and DBaaS solutions.
- Understanding of NoSQL/other databases.
- Familiarity with data warehousing and analytics tools.

Why Join Agile Labs?
- Opportunity to work on innovative, impactful software solutions.
- Collaborative and learning-driven culture.
- Flexible and transparent work environment.
- Be part of a company that is redefining how software is built.

Job Types: Full-time, Permanent
Pay: Up to ₹600,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Education: Bachelor's (Required)
Experience: PostgreSQL: 2 years (Required)
Location: Bengaluru, Karnataka (Required)
Work Location: In person
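The posting above asks for an understanding of indexing strategies and query execution plans. A runnable illustration using Python's sqlite3 as a stand-in (in PostgreSQL itself you would use `EXPLAIN`/`EXPLAIN ANALYZE`); the schema and data are made up.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
con.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                [(f"cust{i % 100}", float(i)) for i in range(1000)])

def plan(sql):
    """Return the query-plan detail strings for a statement."""
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

q = "SELECT * FROM orders WHERE customer = 'cust7'"
print(plan(q))   # without an index: a full table scan of orders

con.execute("CREATE INDEX idx_orders_customer ON orders(customer)")
print(plan(q))   # now a search using idx_orders_customer
```

The same workflow (inspect the plan, add or adjust an index, re-inspect) is the core loop of the query tuning this role describes.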

Posted 1 month ago

Apply

15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview
Job Title: Lead Engineer
Location: Pune, India

Role Description
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities:
- Hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
- Champion engineering best practices and guide/mentor the team to achieve high performance
- Work closely with business stakeholders, Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes
- Acquire functional knowledge of the business capability being digitized/re-engineered
- Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success

Your Skills & Experience:
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing – Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience with databases – Oracle, PostgreSQL, MongoDB, Redis/Hazelcast; should understand data modeling, normalization, and performance optimization
- Experience in message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud – GCP preferred, AWS or Azure
- Knowledge of various distributed/multi-tiered architecture styles – microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality – experience with TDD, BDD, stress and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.

Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
- Prior experience in the Banking/Finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

10.0 years

0 Lacs

India

On-site

About Fresh Gravity:
Founded in 2015, Fresh Gravity helps businesses make data-driven decisions. We are driven by data and its potential as an asset to drive business growth and efficiency. Our consultants are passionate innovators who solve clients' business problems by applying best-in-class data and analytics solutions. We provide a range of consulting and systems integration services and solutions to our clients in the areas of Data Management, Analytics and Machine Learning, and Artificial Intelligence. In the last 10 years, we have put together an exceptional team and have delivered 200+ projects for over 80 clients ranging from startups to several Fortune 500 companies. We are on a mission to solve some of the most complex business problems for our clients using some of the most exciting new technologies, providing the best of learning opportunities for our team. We are focused and intentional about building a strong corporate culture in which individuals feel valued, supported, and cared for. We foster an environment where creativity thrives, paving the way for groundbreaking solutions and personal growth. Our open, collaborative, and empowering work culture is the main reason for our growth and success. To know more about our culture and employee benefits, visit our website https://www.freshgravity.com/employee-benefits/ . We promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. We are data driven. We are passionate. We are innovators. We are Fresh Gravity.

Requirements
What you'll do:
- Solid hands-on experience with Talend Open Studio for Data Integration, Talend Administration Centre, and Talend Data Quality
- ETL process design: able to develop and design ETL jobs, ensuring they meet business requirements and follow best practices; knowledge of SCD and normalization jobs
- Talend configuration: proficiency in configuring Talend Studio, Job Server, and other Talend components
- Data mapping: proficiency in creating and refining Talend mappings for data extraction, transformation, and loading
- SQL: strong knowledge of SQL; able to develop complex SQL queries for data extraction and loading, especially when working with databases like Oracle, Redshift, and Snowflake
- Custom scripting: knowledge to implement custom Talend components using scripting languages like Python or Java; shell scripting to automate tasks
- Reusable joblets: working knowledge to design and create reusable joblets for various ETL tasks
- ESB integration and real-time data integration: able to implement and manage integrations with ESB (Enterprise Service Bus) systems – Kafka/Azure Event Hub – including REST and SOAP web services

Desirable Skills and Experience:
- Experience with ETL/ELT, data transformation, data mapping, and data profiling
- Strong analytical and problem-solving skills
- Ability to work independently and as part of a team
- Ability to work with cross-functional teams to understand business requirements and design data integration solutions that meet those requirements
- Troubleshoot and resolve data integration issues in a timely manner
- Mentor junior team members and help them improve their Talend development skills
- Stay up to date with the latest Talend and data integration trends and technologies

Benefits
In addition to a competitive package, we promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. In keeping with Fresh Gravity's challenger ethos, we have developed the 5Dimensions (5D) benefits program. This program recognizes the multiple dimensions within each of us and seeks to provide opportunities for deep development across these dimensions: Enrich Myself; Enhance My Client; Build My Company; Nurture My Family; and Better Humanity.

Posted 1 month ago

Apply

3.0 years

0 Lacs

India

On-site

Job Title: Oracle Product Data Hub (PDH) Technical Consultant – Product Master Data Specialist
Location: India
Job Type: Full-Time Consultant
Experience Level: Mid to Senior-Level
Industry: ERP / Master Data Management / Manufacturing / Retail / Supply Chain

Job Summary:
We are seeking a skilled Oracle Product Data Hub (PDH) Technical Consultant with deep expertise in Product Master Data Management to support the end-to-end lifecycle of finished goods, raw materials, and pricing data in Oracle Fusion PDH. The ideal candidate will have hands-on experience in data cleansing, enrichment, transformation, validation, and mass data loading into Oracle Cloud PDH using best practices and tools such as FBDI, REST/SOAP APIs, and data import templates. This role requires strong technical knowledge of Oracle PDH, a problem-solving mindset, and experience collaborating with functional teams and business users to ensure clean, standardized, and accurate product data is maintained across systems.

Key Responsibilities:
- Lead technical efforts in product data onboarding, including finished goods, raw materials, and pricing structures, into Oracle Fusion Product Data Hub.
- Perform data cleansing, de-duplication, normalization, and transformation activities using industry best practices and custom rulesets.
- Develop and execute data migration strategies using Oracle FBDI templates, import maps, REST/SOAP APIs, and spreadsheets.
- Create and maintain scripts or tools for mass upload, update, and validation of product data.
- Collaborate with business analysts, data stewards, and IT to define and implement product data governance, data quality rules, and workflows.
- Conduct data validation and reconciliation activities post-load, ensuring accuracy, completeness, and compliance with business rules.
- Troubleshoot and resolve technical issues related to PDH data imports, validations, and integrations.
- Support product hierarchy setup, item class configuration, attribute groups, catalogs, and data quality scorecards.
- Document technical specifications, data load procedures, and configuration guides.

Required Skills and Experience:
- 3+ years of hands-on technical experience with Oracle Fusion Product Data Hub (PDH).
- Proven experience in mass loading and maintaining product data, including finished goods, raw materials, and pricing.
- Strong experience with Oracle FBDI templates, REST/SOAP web services, and Excel-based data load tools.
- Proficiency in SQL and PL/SQL for data analysis and transformation.
- Solid understanding of Oracle Fusion Product Hub structures: item classes, templates, catalogs, attributes, and change orders.
- Knowledge of item lifecycle management, global product definitions, and cross-functional data dependencies.
- Familiarity with Oracle SCM modules (Inventory, Costing, Pricing) is a plus.
- Experience in large-scale data migration, cleansing, and conversion projects.
- Excellent analytical, communication, and stakeholder engagement skills.

Preferred Qualifications:
- Oracle Cloud Certification in Product Data Management or SCM.
- Experience with data governance frameworks or MDM tools.
- Exposure to tools like Oracle Integration Cloud (OIC), OACS, or Informatica MDM.
- Experience in the manufacturing, apparel, or retail industries preferred.
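The de-duplication and normalization work described above usually starts by grouping candidate records under a normalized key and keeping one survivor per group. A small sketch in plain Python; the item-number format and survivorship rule ("first record wins") are invented for illustration, not Oracle PDH specifics.

```python
def norm_key(item):
    """Normalize an item number into a comparison key."""
    return item["item_number"].strip().upper().replace("-", "")

items = [
    {"item_number": "AB-100", "desc": "Widget"},
    {"item_number": " ab100 ", "desc": "widget (duplicate)"},
    {"item_number": "CD-200", "desc": "Gadget"},
]

survivors = {}
for item in items:
    survivors.setdefault(norm_key(item), item)  # first record wins

print(sorted(survivors))  # ['AB100', 'CD200']
```

In a real migration the survivorship rule would be richer (most recently updated, most complete, trusted source system), but the key-then-collapse shape stays the same.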

Posted 1 month ago

Apply

5.0 - 31.0 years

9 - 15 Lacs

Bengaluru/Bangalore

On-site

Job Title: NoSQL Database Administrator (DBA)
Department: IT / Data Management

Job Purpose:
The NoSQL Database Administrator will be responsible for designing, deploying, securing, and optimizing NoSQL databases to ensure high availability, reliability, and scalability of mission-critical applications. The role involves close collaboration with developers, architects, and security teams, especially in compliance-driven environments such as UIDAI.

Key Responsibilities:
- Collaborate with developers and solution architects to design and implement efficient and scalable NoSQL database schemas.
- Ensure database normalization (and denormalization where appropriate) and implement indexing strategies to optimize performance.
- Evaluate and deploy replication architectures to support high availability and fault tolerance.
- Monitor and analyze database performance using tools like NoSQL Enterprise Monitor and custom monitoring scripts.
- Troubleshoot performance bottlenecks and optimize queries using query analysis, index tuning, and rewriting techniques.
- Fine-tune NoSQL server parameters, buffer pools, caches, and system configurations to improve throughput and minimize latency.
- Implement and manage Role-Based Access Control (RBAC), authentication, authorization, and auditing to maintain data integrity, confidentiality, and compliance.
- Act as a liaison with UIDAI-appointed GRCP and security audit agencies, ensuring all security audits are conducted in a timely manner, and provide the necessary documentation and artifacts to address risks and non-conformities.
- Participate in disaster recovery planning, backup management, and failover testing.

Key Skills & Qualifications:
Educational Qualifications: Bachelor's or Master's Degree in Computer Science, Information Technology, or a related field.

Technical Skills:
- Proficiency in NoSQL databases such as MongoDB, Cassandra, Couchbase, DynamoDB, or similar.
- Strong knowledge of database schema design, data modeling, and performance optimization.
- Experience in setting up replication, sharding, clustering, and backup strategies.
- Familiarity with performance monitoring tools and writing custom scripts for health checks.
- Hands-on experience with database security, RBAC, encryption, and auditing mechanisms.
- Strong troubleshooting skills related to query optimization and server configurations.

Compliance & Security:
- Experience with data privacy regulations and security standards, particularly in compliance-driven sectors like UIDAI.
- Ability to coordinate with government and regulatory security audit teams.

Behavioral Skills:
- Excellent communication and stakeholder management.
- Strong analytical, problem-solving, and documentation skills.
- Proactive and detail-oriented with a focus on system reliability and security.

Key Interfaces:
- Internal: Developers, Solution Architects, DevOps, Security Teams, Project Managers.
- External: UIDAI-appointed GRCP, third-party auditors, security audit agencies.

Key Challenges:
- Maintaining optimal performance and uptime in a high-demand, compliance-driven environment.
- Ensuring security, scalability, and availability of large-scale NoSQL deployments.
- Keeping up with evolving data security standards and audit requirements.
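The normalization vs. denormalization trade-off named in the responsibilities above is concrete in document stores: reference a record by id, or embed a copy of it. A sketch using plain Python dicts shaped like NoSQL documents; collection and field names are invented.

```python
# Normalized: orders reference customers by id (one copy of customer data,
# but reads need an extra lookup, i.e. a client-side "join").
customers = {"c1": {"_id": "c1", "name": "Asha", "city": "Hyderabad"}}
orders_ref = [{"_id": "o1", "customer_id": "c1", "total": 250}]

def order_with_customer(order):
    """Resolve the customer reference at read time."""
    return {**order, "customer": customers[order["customer_id"]]}

# Denormalized: customer data embedded in each order (single-document reads,
# but an update to the customer must touch every embedded copy).
orders_embedded = [{"_id": "o1", "total": 250,
                    "customer": {"name": "Asha", "city": "Hyderabad"}}]

print(order_with_customer(orders_ref[0])["customer"]["name"])  # Asha
print(orders_embedded[0]["customer"]["name"])                  # Asha
```

Which shape to choose is exactly the read-pattern/write-pattern analysis this DBA role is asked to do, alongside the matching indexing strategy.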

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

What You’ll Do
- Manage and maintain PostgreSQL databases in development, staging, and production environments.
- Write and optimize SQL queries, stored procedures, functions, and triggers to support application logic.
- Design, implement, and maintain logical and physical database schemas.
- Monitor database performance and implement performance tuning strategies.
- Ensure data integrity, security, and availability through regular maintenance and backups.
- Collaborate with application developers to understand requirements and provide efficient database solutions.
- Handle database migrations, versioning, and deployment as part of CI/CD pipelines.
- Perform regular database health checks, index analysis, and query optimization.
- Troubleshoot and resolve database issues, including slow queries, locking, and replication errors.

What We Seek In You
- Proven experience as a PostgreSQL database developer with hands-on SQL development experience.
- Strong knowledge of PL/SQL and writing efficient stored procedures and functions.
- Experience with database schema design, normalization, and data modeling.
- Solid understanding of PostgreSQL internals, indexing strategies, and performance tuning.
- Experience with backup and recovery tools (pg_dump, pg_restore), replication, and monitoring tools.
- Proficient in Linux/Unix command-line tools for database management.
- Familiar with version control systems (e.g., Git) and CI/CD practices.

Life At Next
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks Of Working With Us
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
- Abundant opportunities for engagement with customers, product managers, and leadership.
- Guidance along progressive paths, with insightful feedback from managers through ongoing feedforward sessions.
- Cultivate and leverage robust connections within diverse communities of interest.
- Choose your mentor to navigate your current endeavors and steer your future trajectory.
- Embrace continuous learning and upskilling opportunities through Nexversity.
- Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
- Embrace a hybrid work model promoting work-life balance.
- Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
- Embark on accelerated career paths to actualize your professional aspirations.

Who We Are
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet the unique needs of our customers. Join our passionate team and tailor your growth with us!
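The PostgreSQL responsibilities above include backups and recovery (pg_dump/pg_restore territory). A runnable stand-in for the same round-trip idea using Python's sqlite3 online backup API, since a live PostgreSQL server isn't available here; the table and data are made up.

```python
import sqlite3

# Source database with some data to protect.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
src.execute("INSERT INTO t (v) VALUES ('hello')")
src.commit()

# Take a full online copy -- conceptually what pg_dump does for PostgreSQL.
backup_db = sqlite3.connect(":memory:")   # stands in for a dump file
src.backup(backup_db)

# "Restore": read the data back out of the copy.
restored = backup_db.execute("SELECT v FROM t").fetchone()[0]
print(restored)  # hello
```

The point a DBA interview probes is not the copy itself but the round trip: a backup only counts once a restore from it has been exercised, which is why the posting pairs backups with recovery.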

Posted 1 month ago

Apply

6.0 years

25 - 30 Lacs

India

Remote

Job Title: Enterprise Data Modeler – Snowflake Specialist
Location: Remote
Experience Required: 6+ Years
Employment Type: Contract
Start Date: Immediate

Role Overview
We are seeking a highly skilled Enterprise Data Modeler with deep expertise in Snowflake and modern data modeling techniques. This role requires end-to-end experience in designing scalable, robust, and production-ready data models that align with complex business requirements and support analytics, reporting, and KPI logic. The ideal candidate has practical, hands-on experience, not just theoretical knowledge, of enterprise-level data modeling in real client projects.

Key Responsibilities
- Lead the design and development of conceptual, logical, and physical data models using Snowflake.
- Collaborate with business and technical stakeholders to understand KPIs, metrics, and data flows to drive appropriate data architecture.
- Create technically sound, scalable data models using best-practice design patterns (e.g., dimensional, Data Vault, normalized).
- Translate complex and ambiguous business problems into structured Snowflake-ready models, including schema-, table-, and column-level specs.
- Develop schema objects such as views, constraints, partitions, and clustering, and leverage Snowflake features like Time Travel and Zero-Copy Cloning.
- Support downstream Power BI and data mart readiness, ensuring models are optimized for semantic and reporting layers.
- Design for historical and incremental loading (e.g., SCDs, CDC, audit columns, soft deletes).
- Produce clear and concise documentation, including data flow diagrams, ER diagrams, lineage maps, and model architecture visuals.
- Collaborate closely with data architects and engineers to ensure model fitment within the larger data warehouse architecture.
- Ensure models support governance and metadata frameworks and comply with enterprise data standards.

Required Skills And Qualifications
- 6+ years of enterprise-level experience in data modeling across client-facing or production projects.
- Deep hands-on expertise with Snowflake SQL and schema design, including performance optimization.
- Strong understanding of data warehousing concepts, including: dimensions, facts, and surrogate keys; star vs. snowflake schema; normalization, fact grains, and SCD types; ELT vs. ETL, semantic layers, and Data Vault.
- Proficiency in tools like Lucidchart, SQLDBM, dbt docs, or similar to create ERDs and architecture visuals.
- Ability to confidently present and defend data model decisions in technical reviews and stakeholder walkthroughs.
- Strong verbal and written communication skills in English.
- Ability to work independently, lead discussions with minimal handholding, and resolve ambiguity in business requirements.

Nice to Have
- Experience with metadata-driven modeling and data governance initiatives.
- Exposure to modeling strategies that support Power BI, KPI tracking, and cross-platform analytics.
- Knowledge of data lineage mapping, version control of models, and model lifecycle management.

Skills: data warehousing, Snowflake, Snowflake SQL, data modeling, dimensional modeling, Data Vault, normalization, ELT, ETL, performance optimization, metadata frameworks, data governance, architecture, Lucidchart, SQLDBM, dbt docs
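Among the concepts the qualifications above list are SCD types. A minimal Type 2 (history-keeping) sketch in plain Python: close the current dimension row and insert a new current version. Column names and the city change shown are hypothetical.

```python
from datetime import date

dim_customer = [
    {"sk": 1, "customer_id": "c1", "city": "Pune",
     "valid_from": date(2023, 1, 1), "valid_to": None, "current": True},
]

def scd2_update(dim, customer_id, new_city, as_of):
    """Expire the current row, then append a new current version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["current"]:
            row["valid_to"] = as_of
            row["current"] = False
    dim.append({"sk": max(r["sk"] for r in dim) + 1,
                "customer_id": customer_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "current": True})

scd2_update(dim_customer, "c1", "Mumbai", date(2024, 6, 1))
print([(r["city"], r["current"]) for r in dim_customer])
# [('Pune', False), ('Mumbai', True)]
```

Type 1 would instead overwrite `city` in place (no history); the audit columns and soft deletes mentioned in the responsibilities are the same valid-from/valid-to machinery applied more broadly.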

Posted 1 month ago

Apply

2.0 - 3.0 years

0 Lacs

India

On-site

Flexera saves customers billions of dollars in wasted technology spend. A pioneer in Hybrid ITAM and FinOps, Flexera provides award-winning, data-oriented SaaS solutions for technology value optimization (TVO), enabling IT, finance, procurement and cloud teams to gain deep insights into cost optimization, compliance and risks for each business service. Flexera One solutions are built on a set of definitive customer, supplier and industry data, powered by our Technology Intelligence Platform, that enables organizations to visualize their Enterprise Technology Blueprint™ in hybrid environments—from on-premises to SaaS to containers to cloud. We’re transforming the software industry. We’re Flexera. With more than 50,000 customers across the world, we’re achieving that goal. But we know we can’t do any of that without our team. Ready to help us re-imagine the industry during a time of substantial growth and ambitious plans? Come and see why we’re consistently recognized by Gartner, Forrester and IDC as a category leader in the marketplace. 
Learn more at flexera.com

Responsibilities
- Respond to requests to investigate content-related issues, identify root cause, and remediate
- Conduct research, investigate and collect data on software and hardware products from various sources, and curate the data into the Data Platform
- Relate data points from 3rd-party vendors and product suppliers and ensure consistency, quality, and accuracy in our normalized database
- Advise how data drives customer value, business use cases, and decision making
- Identify, analyze, and interpret trends or patterns in complex data sets
- Track the latest information from the IT market and several other vertical markets (Medical, Finance and Banking) and update/maintain a comprehensive reference catalog with the most up-to-date information
- Operate with consistency, quality, and accuracy in relation to our Content Operations standards
- Communicate effectively with Support, Engineering and Product Management regarding enrichment, defects, data alignments and gap-fill requests
- Contribute to continuous improvement initiatives and update articles on our Flexera Knowledge Base
- Confidently promote our team's principles across the organization
- Create tooling and systems for better maintenance and monitoring of the content services
- Understand the system, tool, and application workflows and adapt to the same

Requirements
To be successful, the Content Ops analyst will need to have some (if not all) of the following attributes:
- Engineering / MCA graduate with 2-3 years' experience in content ops
- Firm understanding of IT Asset Management (ITAM); software and hardware assemblies; and version, edition, and release management
- Familiarity with software licensing, including open source and vulnerabilities
- Authoritative knowledge of at least one mainstream development language such as Java, Python, Go, or JavaScript
- Familiar with stored procedures in SQL Server/Oracle (should be clear on data normalization: collect, interpret, analyze, and report)
- Familiarity with SaaS, PaaS, IaaS and cloud products
- Strong research skills, able to investigate, locate, and collect specific information quickly and accurately
- Strong reading comprehension skills, able to understand information found on web pages, product documentation, and technical and marketing articles, and extract specific content quickly and accurately
- Comfortable dealing with huge amounts of data and able to organize them according to specific rules and patterns
- Familiar with ticketing tools such as ServiceNow, Salesforce and JIRA
- Knowledge of databases, APIs, SQL, scripting languages and process automation is advantageous
- Strong interpersonal skills; a team player

The Following Personal Qualities Are Desired
- Passionate about self-learning and active in the (local or online) tech community
- Highly motivated with attention to detail and strong problem-solving skills
- Self-driven and prepared to go the extra mile when the team is up against it
- Achieves the right balance of confidence and respect
- Ability to communicate effectively and efficiently within and between teams

Flexera is proud to be an equal opportunity employer. Qualified applicants will be considered for open roles regardless of age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by local/national laws, policies and/or regulations. Flexera understands the value that results from employing a diverse, equitable, and inclusive workforce.
We recognize that equity necessitates acknowledging past exclusion and that inclusion requires intentional effort. Our DEI (Diversity, Equity, and Inclusion) council is the driving force behind our commitment to championing policies and practices that foster a welcoming environment for all. We encourage candidates requiring accommodations to please let us know by emailing careers@flexera.com.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

JOB DESCRIPTION “While we have our offices in Bangalore, Chennai, Hyderabad, Nagpur and Pune, this position is hybrid, with you being able to report to the location nearest to your current location, if the need arises.” We are looking for a highly experienced NLP Analyst with deep expertise in linguistic data analysis, annotation design, and production-scale NLP model evaluation. This role requires a blend of linguistic acumen, analytical rigor, and real-world application experience. You will drive the design and execution of NLP initiatives across diverse domains and guide cross-functional teams on best practices in language data handling and annotation quality. Key Responsibilities Manage large-scale text annotation and labeling pipelines for supervised and semi-supervised learning. Conduct advanced linguistic analysis of unstructured content (e.g., clinical notes, legal contracts, customer communications, claim documents) to identify patterns, gaps, and modeling opportunities. Define and enforce annotation schemas and QA protocols for complex NLP tasks (e.g., NER, relation extraction, coreference resolution, sentiment/intent classification). Evaluate and improve the performance of NLP models through rigorous error analysis and metric-driven feedback loops. Collaborate with ML/NLP engineers, data scientists, and domain experts to build robust NLP pipelines that scale across use cases. Lead internal research efforts on emerging NLP methodologies, including LLM prompt engineering, hybrid rule-learning approaches, and few-shot learning. Provide mentorship to junior analysts and contribute to developing internal NLP knowledge repositories and annotation standards. Required Qualifications Master’s in Computational Linguistics, NLP, Data Science, Computer Science, or a related field. 5+ years of professional experience in NLP, with a strong track record of hands-on work in data annotation, language model evaluation, and NLP pipeline development. 
Expertise in Python and key NLP libraries (spaCy, NLTK, Scikit-learn, Hugging Face Transformers, etc.). Advanced proficiency in building and managing annotation workflows using tools like Prodigy, doccano, Brat, or in-house platforms. Deep understanding of linguistic structures (syntax, semantics, pragmatics) and their application to real-world NLP challenges. Experience evaluating ML/NLP models using metrics like F1, ROUGE, BLEU, precision/recall, and embedding-based similarity. Solid grasp of vectorization methods (TF-IDF, embeddings, transformer-based encodings) and modern language models (e.g., BERT, GPT, LLaMA). Preferred Qualifications Experience with domain-specific NLP (e.g., clinical/biomedical, legal, fintech). Knowledge of knowledge graph construction, relation extraction, and entity linking. Experience integrating structured/unstructured data for downstream AI/ML applications. Familiarity with prompt engineering for LLMs and tuning foundation models. Strong data querying and visualization skills (SQL, pandas, seaborn, Power BI/Tableau). Perficient is always looking for the best and brightest talent and we need you! We’re a quickly-growing, global digital consulting leader, and we’re transforming the world’s largest enterprises and biggest brands. You’ll work with the latest technologies, expand your skills, and become a part of our global community of talented, diverse, and knowledgeable colleagues. RESPONSIBILITIES Key Responsibilities Preprocess and clean raw text data for downstream NLP applications (e.g., tokenization, normalization, entity recognition). Annotate and label datasets for supervised learning tasks (e.g., intent classification, sentiment analysis, NER). Analyze and visualize linguistic patterns and insights from textual data. Work with data scientists to evaluate and improve model performance. Support the development of rule-based and machine learning-based NLP pipelines. 
Document and maintain guidelines for annotation and linguistic QA. Collaborate with stakeholders to understand domain-specific language challenges and requirements QUALIFICATIONS Required Qualifications Bachelor’s degree in Linguistics, Computer Science, Data Science, or a related field. 3+ years of hands-on experience with text analysis or NLP tasks. Proficiency in Python and common NLP libraries (e.g., spaCy, NLTK, pandas). Experience working with annotation tools (e.g., Prodigy, Labelbox, doccano). Strong understanding of language structure and linguistic features. Ability to apply regular expressions and text parsing techniques effectively. Familiarity with data visualization tools and basic statistics. Preferred Qualifications Experience in domain-specific NLP (e.g., clinical/biomedical, legal, financial). Knowledge of vectorization methods (TF-IDF, word2vec, BERT embeddings). Exposure to ML model evaluation metrics (e.g., precision, recall, F1 score). Experience with SQL and working with large datasets. Familiarity with LLMs (e.g., OpenAI, Hugging Face Transformers). Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work. WHO WE ARE Perficient is a leading global digital consultancy. We imagine, create, engineer, and run digital transformation solutions that help our clients exceed customers’ expectations, outpace competition, and grow their business. 
With unparalleled strategy, creative, and technology capabilities, our colleagues bring big thinking and innovative ideas, along with a practical approach to help our clients – the world’s largest enterprises and biggest brands succeed. WHAT WE BELIEVE At Perficient, we promise to challenge, champion, and celebrate our people. You will experience a unique and collaborative culture that values every voice. Join our team, and you’ll become part of something truly special. We believe in developing a workforce that is as diverse and inclusive as the clients we work with. We’re committed to actively listening, learning, and acting to further advance our organization, our communities, and our future leaders… and we’re not done yet. Perficient, Inc. proudly provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, genetic information, marital status, amnesty, or status as a protected veteran in accordance with applicable federal, state and local laws. Perficient, Inc. complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including, but not limited to, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. Perficient, Inc. expressly prohibits any form of unlawful employee harassment based on race, color, religion, gender, sexual orientation, national origin, age, genetic information, disability, or covered veterans. Improper interference with the ability of Perficient, Inc. employees to perform their expected job duties is absolutely not tolerated. 
Disability Accommodations: Perficient is committed to providing a barrier-free employment process with reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or accommodation due to a disability, please contact us. Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time. ABOUT US Perficient is always looking for the best and brightest talent and we need you! We’re a quickly growing, global digital consulting leader, and we’re transforming the world’s largest enterprises and biggest brands. You’ll work with the latest technologies, expand your skills, experience work-life balance, and become a part of our global community of talented, diverse, and knowledgeable colleagues. Select work authorization questions to ask when applicants apply 1. Are you legally authorized to work in the United States? 2. Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
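The preprocessing steps (tokenization, normalization) and evaluation metrics (precision, recall, F1) listed in the qualifications above can be sketched with the standard library alone; a production pipeline would use spaCy, NLTK, or scikit-learn instead, and the sample sentence and label sets below are invented for illustration:

```python
import re

def tokenize(text):
    """Lowercase the text and split on non-alphanumeric runs: a crude
    normalization step standing in for a real spaCy/NLTK tokenizer."""
    return [tok for tok in re.split(r"[^a-z0-9]+", text.lower()) if tok]

def precision_recall_f1(gold, predicted):
    """Set-based precision/recall/F1, as used for label-level evaluation."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)  # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

tokens = tokenize("Patient denies chest pain; BP 120/80.")
# Hypothetical gold vs. predicted entity labels for one document.
p, r, f1 = precision_recall_f1({"pain", "bp"}, {"pain", "bp", "chest"})
```

Here the extra predicted label "chest" costs precision (2/3) while recall stays perfect, giving an F1 of 0.8; error analysis of exactly this kind of gap is what the metric-driven feedback loop above refers to.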

Posted 1 month ago

Apply

0.0 - 1.0 years

7 - 8 Lacs

Hyderabad, Telangana

On-site

Greetings from Star Secutech Pvt Ltd! A huge welcome to immediate joiners! Job Title: SR Executive Reporting to: Team Leader/AM/DM Location: Hyderabad Working Hours/Days: 9 hours / 5 days a week. Shift: U.S. shift (5:30 PM – 2:30 AM) Salary: 7-8 LPA (negotiable) Job Role: • Data types: Identify the data types of each data set and ensure compatibility. • Harmonization process: Develop a harmonization process that outlines the steps required to harmonize data, such as data cleansing, normalization, and validation. • Disparate data sources: Consider data sources that may have different formats, such as databases, spreadsheets, and APIs. Develop methods to integrate and harmonize data from various sources. • Harmonization tools: Utilize various tools and technologies, such as extract, transform, load (ETL) tools, data integration platforms, and data cleansing software, to streamline the harmonization process. • Harmonization schema: Define a harmonization schema that standardizes the data structure, format, and terminology across different data sets. Interested candidates, don't wait: call or DM 9087726632 to proceed with the interview and start working! All the best! Job Types: Full-time, Permanent Pay: ₹700,000.00 - ₹800,000.00 per year Benefits: Health insurance Leave encashment Paid sick time Paid time off Provident Fund Schedule: Evening shift Fixed shift Monday to Friday Night shift UK shift US shift Supplemental Pay: Performance bonus Shift allowance Yearly bonus Education: Bachelor's (Required) Experience: Pharmacovigilance: 1 year (Required) Location: Hyderabad, Telangana (Required) Shift availability: Night Shift (Required) Work Location: In person Application Deadline: 29/07/2025 Expected Start Date: 07/07/2025
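The harmonization steps this listing describes (mapping disparate source formats onto a shared schema, cleansing values, then validating) might look roughly like the sketch below; the source records, field mappings, and date formats are entirely hypothetical:

```python
from datetime import datetime

# Illustrative records from two hypothetical sources that disagree on
# field names and date formats.
source_a = [{"case_id": "A-1", "onset": "29/07/2025", "drug": "Aspirin "}]
source_b = [{"id": "B-7", "onset_date": "2025-07-29",
             "medicinal_product": "aspirin"}]

def harmonize(record, mapping, date_format):
    """Map source fields onto a shared schema, then cleanse values:
    ISO-format the date and normalize the drug name."""
    out = {target: record[src] for target, src in mapping.items()}
    out["onset"] = datetime.strptime(out["onset"], date_format).date().isoformat()
    out["drug"] = out["drug"].strip().lower()
    return out

def validate(record):
    """Minimal validation step: every harmonized field must be non-empty."""
    return all(record.values())

harmonized = (
    [harmonize(r, {"case_id": "case_id", "onset": "onset", "drug": "drug"},
               "%d/%m/%Y") for r in source_a]
    + [harmonize(r, {"case_id": "id", "onset": "onset_date",
                     "drug": "medicinal_product"}, "%Y-%m-%d") for r in source_b]
)
assert all(validate(r) for r in harmonized)
```

After harmonization, both records carry the same field names, one date format, and one drug spelling, so downstream reporting can treat them as one data set.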

Posted 1 month ago

Apply

14.0 years

0 Lacs

India

Remote

At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You’ll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world. A Day in the Life Careers that Change Lives Principal Data Software Engineer in the Cardiac Rhythm Disease Management (CRDM) R&D Software Organization developing software supporting Medtronic implantable cardiac devices. The individual will operate in all phases and contribute to all activities of the software development process. Candidates must be willing to work in a fast-paced, multi-tasking, team environment. A Day in the Life Design, develop, and test high-integrity software for Class II and III medical devices. Learn and understand software standards for medical devices, e.g., IEC 62304. Define and implement software requirements and designs, and review software developed by other team members. Contribute and apply advanced technical principles, theories, and concepts to solve complex technical problems. Participate in process improvement initiatives for the software team. This includes recognizing areas for improvement as well as working with others to develop and document process improvements. Demonstrate ownership of a software feature/module and drive development of the feature/module through the SDLC. Provide hands-on leadership, coaching, mentoring, and software engineering best practices to junior software engineers. Develop reusable patterns and encourage innovation that will increase team velocity. Maintain, improve, and design new software tools. These tools use either scripting languages (Perl, Python), programming languages (Java, C, C#), or web technology (HTML5, JavaScript). Work under general direction and collaboratively with internal and external partners. 
Continuously keep updated with the latest technology trends and channel that learning into Medtronic product development. Must Have Job Responsibilities Experience in software design for medical devices. Hands-on experience in developing implantable system software components related to data acquisition, real-time data processing, and data presentation. Experience in defining a control system state machine for processing real-time data and synchronizing real-time data across different inputs. Applying industry-standard best practices to develop system software complying with security requirements to ensure patient privacy and safety. Experience in developing firmware and device drivers for embedded peripherals. Experience in developing simulators for simulating implantable device behavior through design patterns and architecture patterns. Hands-on experience in Bluetooth-enabled device communication. Hands-on experience in SVG graphics-based development. Hands-on experience in mobile operating system app development targeted at Class III medical systems. Strong oral and written communication skills. Experience with configuration management tools. Proficiency working in a team environment. Demonstrated skills in writing engineering documents (specifications, project plans, etc.). Must Have Minimum Qualification B.E./B.Tech. in Computer Science Engineering and 14+ years of experience (or M.E./M.Tech. in Computer Science and 12+ years). Strong programming skills in C#, .NET and/or C/C++. Strong knowledge of software design, development, debug, and test practices. Apply best practices to develop software driven by a test-first approach. Create automation protocols to test a complex software stack for behavior and coverage. Provide design guidance for designing networking services (web services, SOAP and REST services) for communicating over TCP/UDP between a tablet and external servers. 
Perform thorough analysis and synthesis of the data at hand to apply relevant software engineering algorithms and provide the best user experience for real-time data representation. Should be able to design systems that comply with object-oriented design patterns for scalability and extensibility. Should be able to analyze system requirements, map them to subsystem requirements, create designs and design artifacts using UML diagrams, and provide traceability into requirements. Should be able to understand operating system thread priorities and thread scheduling concepts, and apply those concepts to realize efficient and optimal flow of data through the system for real-time data processing. Apply software engineering principles for requirement analysis, requirement prioritization, and life cycle models such as waterfall and Agile. Should be able to understand web-based application design, remote procedure calls, and distributed computing, and apply those concepts to product development. Should be able to understand concepts of relational database management and normalization of tables, and design well-normalized database tables. Should be able to understand socket communication and the design/development of applications involving socket communication across process boundaries. Should be able to perform build system management through a thorough understanding of compiler optimization and compiler design. Principal Working Relationship Reports to the Engineering Manager. The Principal Data Software Engineer frequently interacts with the Product Owner, Tech Lead, other developers, V&V engineers, and internal partners and stakeholders concerning estimations, design, implementation, or requirement clarifications, and works closely with global sites. Nice to Haves 5+ years of experience in software design for medical devices Strong leadership skills and mentoring capabilities Experience in mobile software development, e.g., iOS, Android Experience in web-based technologies, e.g. 
HTML5, JavaScript, CSS or Cordova Experience in Microsoft Visual Studio development platforms/TFS/tools Experience in Open Source development platforms/tools, e.g., Eclipse Effectively communicate and operate within a cross-functional work environment (Mechanical Engineering, Systems Engineering, Firmware Development, Software Development, Test Development, Manufacturing). Experience leading a software development team. Physical Job Requirements The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position. Benefits & Compensation Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage. About Medtronic We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. We are engineers at heart, putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.
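The "control system state machine for processing real time data" this listing mentions can be illustrated with a toy sketch; the states, thresholds, and fault behavior below are invented for illustration and bear no relation to any actual device specification:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()       # not yet acquiring
    ACQUIRING = auto()  # accepting samples
    FAULT = auto()      # out-of-range sample seen; stop until reset

class SampleProcessor:
    """Toy state machine: stream in samples, accept in-range values,
    and latch into FAULT on the first out-of-range reading."""

    def __init__(self, low=60, high=100):  # hypothetical valid range
        self.state = State.IDLE
        self.low, self.high = low, high
        self.accepted = []

    def start(self):
        if self.state is State.IDLE:
            self.state = State.ACQUIRING

    def feed(self, sample):
        if self.state is not State.ACQUIRING:
            return  # samples are ignored outside the ACQUIRING state
        if self.low <= sample <= self.high:
            self.accepted.append(sample)
        else:
            self.state = State.FAULT

    def reset(self):
        self.state = State.IDLE

proc = SampleProcessor()
proc.start()
for s in (72, 85, 140, 90):  # 140 trips the FAULT transition; 90 is ignored
    proc.feed(s)
```

Latching into a FAULT state and requiring an explicit reset is one common way to make real-time pipelines fail safe rather than silently process suspect data.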

Posted 1 month ago

Apply

5.0 - 9.0 years

9 - 10 Lacs

Hyderābād

On-site

About the Role: Grade Level (for internal use): 10 The Role: Senior Scrum Master The Team: The team is focused on agile product development offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution. The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances the overall team performance. What’s in it for you: Opportunity to lead and drive Agile transformation within a leading global organization. Engage with a dynamic team committed to delivering high-quality solutions. Access to professional development and growth opportunities within S&P Global. Work in a collaborative and innovative environment that values continuous improvement. Responsibilities and Impact: Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews. Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery. Manage scope changes, risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions. Support the team in defining and achieving sprint goals and objectives. Foster a culture of collaboration and transparency within the team and across stakeholders. Encourage and support the development of team members, mentoring them in Agile best practices. Conduct data analysis and create and interpret metrics for team performance tracking and improvement. Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs. Collaborate with stakeholders to help translate business requirements into technical specifications. 
Ensure adherence to Agile best practices and participate in Scrum events. Lead initiatives to improve team efficiency and effectiveness in project delivery. What We’re Looking For: Basic Required Qualifications: Bachelor's degree in a relevant field or equivalent work experience. Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team. Strong understanding of Agile methodologies, particularly Scrum and Kanban. Excellent communication and interpersonal skills. Proficiency in business analysis: Experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions. Requirement gathering expertise: Ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance. Basic understanding of SQL queries: Ability to comprehend and analyze existing SQL queries to identify areas for performance improvement. Fundamental understanding of database structure: Awareness of database concepts including normalization, indexing, and schema design to assess query performance. Additional Preferred Qualifications: Certified Scrum Master (CSM) or similar Agile certification. Experience with Agile tools such as Azure DevOps, JIRA, or Trello. Proven ability to lead and influence teams in a dynamic environment. Familiarity with software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud. Experience in project management and stakeholder engagement. Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights. 
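The query-performance and indexing concepts named in the basic qualifications above can be demonstrated with SQLite (standing in here for whatever RDBMS the team actually uses); the trades table and data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades (symbol, qty) VALUES (?, ?)",
                 [("AAPL", 10), ("MSFT", 5), ("AAPL", 7)])

def plan(sql):
    """Return SQLite's query-plan description for a statement.
    EXPLAIN QUERY PLAN rows are (id, parent, notused, detail)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT qty FROM trades WHERE symbol = 'AAPL'"
before = plan(query)  # full table scan: every row examined
conn.execute("CREATE INDEX idx_trades_symbol ON trades(symbol)")
after = plan(query)   # index search: only matching rows touched
```

Comparing the plan before and after adding the index is exactly the kind of analysis the "areas for performance improvement" bullet describes: the filter column gains an index, and the plan switches from a scan to an index search.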
About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. 
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316176 Posted On: 2025-06-25 Location: Hyderabad, Telangana, India

Posted 1 month ago

Apply

4.0 - 6.0 years

2 - 6 Lacs

Hyderābād

On-site

About NationsBenefits: At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization — transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain. We are seeking an experienced PHP Developer to design, develop, and maintain high-performance web applications. This role involves collaborating with cross-functional teams, optimizing application performance, and ensuring secure and scalable solutions. If you have a strong foundation in PHP development, frameworks, database management, and cloud technologies, we invite you to apply and contribute to cutting-edge projects. Key Responsibilities: Core PHP & Frameworks: Strong expertise in Core PHP and PHP web frameworks (preferably Symfony, Laravel, or CodeIgniter). Object-Oriented Programming (OOP): Deep understanding of OOP principles and MVC design patterns in PHP. Third-Party Integrations: Experience with third-party API integrations, authentication, and authorization mechanisms. Database Management: Strong proficiency in MySQL, knowledge of database normalization, ORM, and experience working with SQL/NoSQL databases. Web Development & Front-End: Familiarity with JavaScript, jQuery, VueJS, ReactJS, HTML5, CSS3, and front-end technologies. Security & Compliance: Knowledge of security best practices and compliance standards like HIPAA and GDPR. Application Design & Scalability: Understanding of scalable application architecture and secure authentication between systems. Cloud & DevOps: Hands-on experience with AWS cloud services, Docker containers, CI/CD pipelines, and automation scripts. 
Testing & Debugging: Proficiency in Test-Driven Development (TDD) and strong debugging skills. Version Control & Collaboration: Proficient with Git and working in a collaborative Agile/Scrum environment. Requirements: Education & Experience: Bachelor's degree in Computer Science, Information Technology, or a related field, with 4-6 years of proven PHP development experience. PHP Frameworks: Strong expertise in Symfony, Laravel, or CodeIgniter. Front-End Development: Familiarity with HTML, CSS, JavaScript, jQuery. Database & API Management: Experience with MySQL, PostgreSQL, RESTful APIs, and web services. Version Control & CI/CD: Proficient in Git, CI/CD pipelines, and automation using shell scripts. Team Collaboration & Communication: Ability to work collaboratively, solve complex problems, and pay attention to detail. Preferred Qualifications: Agile & Scrum: Experience working in Agile/Scrum environments. Multi-Tech Expertise: Knowledge of additional programming languages (e.g., Python, JavaScript frameworks). Cloud & DevOps: Familiarity with AWS, Google Cloud, Docker, and Kubernetes.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Telangana

On-site

8+ years of experience in data engineering or a related field. Strong expertise in Snowflake including schema design, performance tuning, and security. Proficiency in Python for data manipulation and automation. Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.). Experience with DBT for data transformation and documentation. Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect). Strong SQL skills and experience with large-scale data sets. Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
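The star-schema and normalization concepts this listing asks for can be sketched with a toy fact/dimension pair; SQLite stands in for Snowflake here, and the tables and data are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: one row per product, holding descriptive attributes.
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
    -- Fact table: one row per sale, referencing the dimension by key
    -- instead of repeating the product attributes (normalization).
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product,
        amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'),
                                   (2, 'Gizmo', 'Hardware');
    INSERT INTO fact_sales VALUES (1, 1, 9.5), (2, 1, 3.0), (3, 2, 4.25);
""")
# Typical star-schema query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category
""").fetchall()
```

A snowflake schema would further normalize `category` out into its own table; the trade-off either way is fewer repeated attributes versus more joins at query time.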

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

Department Information Technology Job posted on Jun 26, 2025 Employee Type Permanent Experience range (Years) 5 years - 10 years Job Location Hyderabad Role Title Oracle DBA Role Purpose The purpose of the Oracle DBA role is to keep the organization's database estate reliable, secure, and performant by administering Oracle and MySQL environments, tuning queries, minimizing downtime, and supporting the teams and applications that depend on these systems. Key Accountability Area Database Administration: Good knowledge of Oracle 11g, 12c, and 19c databases. Good knowledge of Structured Query Language (SQL). Comprehensive knowledge and hands-on experience in managing Oracle and MySQL databases. Skill in optimizing database queries for better performance and understanding the importance of indexing, normalization, and denormalization. Minimize database downtime and manage parameters to provide fast query responses. Monitor databases and related systems to ensure optimized performance. Monitor database performance, implement changes, and apply new patches and versions when required. Exposure to Middleware (Oracle Forms and Reports) applications would be a significant plus. System Monitoring and Maintenance: Perform regular system monitoring, verify the integrity and availability of the database, server resources, systems, and key processes, and review system and application logs. Patch Management: Apply DB and OS patches and upgrades regularly, and upgrade administrative tools and utilities. Configure and add new services as necessary. 
Troubleshooting and Support: Provide technical support and troubleshooting for server-related issues, ensuring minimal downtime and disruption. Backup and Recovery: Manage backup and recovery solutions for servers to ensure data integrity and availability. Documentation and Reporting: Maintain comprehensive documentation of systems, configurations, procedures, and changes. Provide regular reports on system performance and incidents. Reports to Lead - DBA No. of Reportees Individual Contributor Qualification Bachelor’s degree in Computer Science, Information Technology, or a related field. Work Experience Minimum of 2+ years of experience in database administration. Proven expertise in managing complex database environments. Experience with Linux and Windows server OS. Technical / Functional Competencies Proficiency in Linux server operating systems and technologies (RHEL, CentOS, Oracle Linux). Proficiency in Windows Server operating systems and technologies (Windows Server 2016, 2019, 2022). Exposure to Oracle and AWS cloud platforms. Behavioral Competencies Excellent problem-solving and troubleshooting skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Attention to detail and strong organizational skills.
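The posting's point about indexing for fast query responses can be demonstrated concretely. This sketch uses stdlib sqlite3 instead of Oracle (the mechanics are analogous); the table and data are invented, and the exact plan wording varies by SQLite version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, f"cust{i % 100}", float(i)) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer = ?"

# Without an index the planner falls back to a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, ("cust7",)).fetchall()

# Adding an index on the filtered column changes the access path to an
# index search, which is what keeps query responses fast at scale.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, ("cust7",)).fetchall()

print(plan_before[0][-1])
print(plan_after[0][-1])
```

Oracle exposes the same idea through `EXPLAIN PLAN FOR` and `DBMS_XPLAN`.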

Posted 1 month ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Company Description Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Job Description Key Responsibilities Design, build, and maintain scalable and secure relational and cloud-based database systems. 
Migrate data from spreadsheets or third-party sources into databases (PostgreSQL, MySQL, BigQuery). Create and maintain automated workflows and scripts for reliable, consistent data ingestion. Optimize query performance and indexing to improve data retrieval efficiency. Implement access controls, encryption, and data security best practices to ensure compliance. Monitor database health and troubleshoot issues proactively using appropriate tools. Collaborate with full-stack developers and data researchers to align data architecture with application needs. Uphold data quality through validation rules, constraints, and referential integrity checks. Keep up-to-date with emerging technologies and propose improvements to data workflows. Leverage tools like Python (Pandas, SQLAlchemy, PyDrive), and version control (Git). Support Agile development practices and CI/CD pipelines where applicable. Required Skills And Experience Strong SQL skills and understanding of database design principles (normalization, indexing, relational integrity). Experience with relational databases such as PostgreSQL or MySQL. Working knowledge of Python, including data manipulation and scripting (e.g., using Pandas, SQLAlchemy). Experience with data migration and ETL processes, including integrating data from spreadsheets or external sources. Understanding of data security best practices, including access control, encryption, and compliance. Ability to write and maintain import workflows and scripts to automate data ingestion and transformation. Experience with cloud-based databases, such as Google BigQuery or AWS RDS. Familiarity with cloud services (e.g., AWS Lambda, GCP Dataflow) and serverless data processing. Exposure to data warehousing tools like Snowflake or Redshift. Experience using monitoring tools such as Prometheus, Grafana, or the ELK Stack. Good analytical and problem-solving skills, with strong attention to detail. 
Team collaboration skills, especially with developers and analysts, and the ability to work independently. Proficiency with version control systems (e.g., Git). Strong communication skills — written and verbal. Preferred / Nice-to-Have Skills Bachelor’s degree in Computer Science, Information Systems, or a related field. Experience working with APIs for data ingestion and third-party system integration. Familiarity with CI/CD pipelines (e.g., GitHub Actions, Jenkins). Python experience using modules such as gspread, PyDrive, PySpark, or object-oriented design patterns. Experience in Agile/Scrum teams or working with product development cycles. Experience using Tableau and Tableau Prep for data visualization and transformation. Why Join Us Monthly long weekends — every third Friday off Wellness reimbursement to support your health and balance Paid parental leave Remote-first with flexibility and trust Work with a world-class data and marketing team inside a globally recognized brand Qualifications 5+ years of experience in Database Engineering. Additional Information Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves
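The import workflow this role describes (migrating spreadsheet data into a relational database with validation) can be sketched with the stdlib alone. The posting's actual stack is Pandas/SQLAlchemy with PostgreSQL or BigQuery; this minimal version uses csv and sqlite3, and the column names are invented.

```python
import csv, io, sqlite3

# A minimal import workflow: parse a "spreadsheet" (CSV), validate each
# row, and load only the clean rows into a relational table.
raw = io.StringIO("name,price\nwidget,9.50\ngadget,not-a-number\nbolt,0.25\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT NOT NULL, price REAL CHECK (price >= 0))")

rejected = []
for row in csv.DictReader(raw):
    try:
        conn.execute("INSERT INTO products VALUES (?, ?)",
                     (row["name"], float(row["price"])))
    except ValueError:
        rejected.append(row)  # quarantine rows that fail validation

loaded = conn.execute("SELECT COUNT(*) FROM products").fetchone()[0]
print(loaded, len(rejected))  # 2 1
```

In production the quarantined rows would be logged or written to a dead-letter table rather than silently kept in a list.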

Posted 1 month ago

Apply

0 years

4 - 9 Lacs

Noida

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant – Java & GCP Developer. In this role, you will be responsible for developing Microsoft Access databases, including tables, queries, forms, and reports, using standard IT processes, with data normalization and referential integrity. Responsibilities Experience with Spring Boot Must have GCP experience Experience with microservices development Extensive experience working with Java APIs with Oracle is critical. Extensive experience in Java 11 SE Experience with unit testing frameworks such as JUnit or Mockito Experience with Maven/Gradle Experience in API design, troubleshooting, and tuning for performance Experience designing and troubleshooting Java API services and microservices Professional, precise communication skills Qualifications we seek in you! Minimum Qualifications BE/B.Tech/M.Tech/MCA Preferred Qualifications Experience with Oracle 11g or 12c PL/SQL is preferred Experience in health care or pharmacy-related industries is preferred. 
Familiarity with Toad and/or SQL Developer tools Experience working with Angular and Spring Boot frameworks Experience with Kubernetes and Azure cloud Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Lead Consultant Primary Location India-Noida Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jun 25, 2025, 12:03:20 PM Unposting Date Ongoing Master Skills List Consulting Job Category Full Time

Posted 1 month ago

Apply

0 years

0 Lacs

Greater Chennai Area

On-site

Job Title : Snowflake Data Engineer Location : Chennai Job Type : Full Time Job Summary: We are looking for a skilled and detail-oriented Snowflake Data Engineer to join our data engineering team. The ideal candidate should have hands-on experience with Snowflake, DBT, SQL, and any one of the cloud platforms (AWS, Azure, or GCP). Experience or exposure to Python for data transformation or scripting is a plus. Required Skills: Strong experience with Snowflake data warehousing architecture and features. Hands-on expertise in DBT (Data Build Tool) for transformation and modelling. Proficiency in SQL – complex joins, window functions, performance tuning. Experience in at least one major cloud platform: AWS, Azure, or GCP. Knowledge of data modelling (dimensional/star schema, normalization, etc.) Familiarity with CI/CD pipelines for data deployments.
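The SQL window functions this posting calls out follow the same `OVER (PARTITION BY ... ORDER BY ...)` shape in Snowflake and in SQLite. A minimal sketch using stdlib sqlite3 (requires SQLite 3.25+, bundled with recent Python builds); the table and values are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 10.0), ("east", 30.0), ("west", 20.0), ("west", 5.0)])

# Rank rows within each region by amount -- a classic window-function
# pattern (top-N per group) asked about in SQL interviews.
rows = conn.execute("""
    SELECT region, amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
    FROM sales ORDER BY region, rn
""").fetchall()
print(rows)
```

Filtering on `rn = 1` in an outer query then yields the top sale per region without a self-join.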

Posted 1 month ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job Description The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and the willingness to take up responsibility beyond the assigned work area is a plus. Apprentice_Analyst Roles and responsibilities: Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc. Data quality checks and correction Data profiling and reporting (basic) Email communication with the client on request acknowledgment, project status, and responses to queries Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective Provide technical consulting to the customer category managers around industry best practices for product data enhancement Technical and Functional Skills: Bachelor’s Degree in Engineering from the Electrical, Mechanical, or Electronics stream Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications Intermediate knowledge of MS Office/Internet.
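The standardization/normalization work described here often amounts to collapsing many spellings of the same technical specification into one canonical form. A hypothetical sketch (the `normalize_voltage` helper and the sample values are invented for illustration):

```python
import re

# Hypothetical spec-normalization step: collapse the many ways a voltage
# rating can be written into one canonical "<number> V" form before
# loading it into a product catalogue.
def normalize_voltage(spec: str) -> str:
    m = re.search(r"(\d+(?:\.\d+)?)\s*(?:v|volts?)\b", spec.lower())
    return f"{m.group(1)} V" if m else spec

samples = ["230V", "230 v", "230 Volts", "rated 230 volt supply"]
print([normalize_voltage(s) for s in samples])  # ['230 V', '230 V', '230 V', '230 V']
```

Real cataloguing pipelines keep one such rule per attribute (voltage, flow rate, horsepower) and flag values no rule matches for manual review.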

Posted 1 month ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

JOB_POSTING-3-71879-1 Job Description Role Title : AVP, Enterprise Logging & Observability (L11) Company Overview Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles. Organizational Overview Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services, and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, including Engineering, Development, Operations, Onboarding, and Monitoring, maintains Splunk and provides solutions to teams across Synchrony. 
Role Summary/Purpose The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization’s centralized logging and observability platform. This role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. This role leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. This position bridges the gap between technology teams (applications, platforms, cloud, cybersecurity, infrastructure, DevOps), governance, audit, and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence. Key Responsibilities Splunk Development & Platform Management Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions. Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform. Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness. Splunk ITSI Implementation & Management - Develop and configure ITSI services, entities, and correlation searches. Implement notable events aggregation policies and automate response actions. Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring. Security & Compliance Enablement Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI). 
Enable visibility for encryption events, access anomalies, secrets management, and audit trails. Support security control mapping and automation through observability. Stakeholder Engagement Act as a strategic advisor and point of contact for business units, application, infrastructure, and security stakeholders and business teams leveraging Splunk. Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment. Maintain clear and timely communications across all levels of the organization. Process & Governance Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies. Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness. Ensure alignment with enterprise architecture and data classification models. Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools. Mentor junior team members and guide engineering teams on secure, standardized logging practices. Required Skills/Knowledge Bachelor's degree with a minimum of 6+ years of experience in technology, or in lieu of a degree, 8+ years of experience in technology. Minimum of 3+ years of experience leading a development team or an equivalent role in observability, logging, or security platforms. Splunk Subject Matter Expert (SME) Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations. Experience supporting security use cases, encryption visibility, secrets management, and compliance logging. Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement & Process & Governance Experience with Splunk Premium Apps - ITSI and Enterprise Security (ES) minimally Experience with Data Streaming Platforms & tools like Cribl, Splunk Edge Processor. 
Proven ability to work in Agile environments using tools such as JIRA or JIRA Align. Strong communication, leadership, and stakeholder management skills. Familiarity with security, risk, and compliance standards relevant to BFSI. Proven experience leading product development teams and managing cross-functional initiatives using Agile methods. Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud. Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking. Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm. Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration. Develop new applications leveraging Splunk’s analytic and machine learning tools to maximize performance, availability, and security, improving business insight and operations. Support senior engineers in analyzing system issues and performing root cause analysis (RCA). Desired Skills/Knowledge Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations. Exposure to SIEM integration, security orchestration, or SOAR platforms. Knowledge of cloud-native observability (e.g. AWS/GCP/Azure logging). Experience in BFSI or regulated industries with high-volume data handling. Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging. Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling. Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage. Awareness of data classification, retention, and masking/anonymization strategies. 
Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty) Experience with Version Control tools – Git, Bitbucket Eligibility Criteria Bachelor's degree with a minimum of 6+ years of experience in technology, or in lieu of a degree, 8+ years of experience in technology. Minimum of 3+ years of experience leading a development team or an equivalent role in observability, logging, or security platforms. Demonstrated success in managing large-scale logging platforms in regulated environments. Excellent communication, leadership, and cross-functional collaboration skills. Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes. Prior experience in large-scale, security-driven logging or observability platform development. Excellent problem-solving skills and the ability to work independently or as part of a team. Strong communication and interpersonal skills to interact effectively with team members and stakeholders. Knowledge of IT Service Management (ITSM) and monitoring tools. Knowledge of other data analytics tools or platforms is a plus. WORK TIMINGS : 01:00 PM to 10:00 PM IST This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time – 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details. 
For Internal Applicants Understand the criteria or mandatory skills required for the role, before applying Inform your manager and HRM before applying for any role on Workday Ensure that your professional profile is updated (fields such as education, prior experience, other skills) and it is mandatory to upload your updated resume (Word or PDF format) Must not be on any corrective action plan (First Formal/Final Formal, PIP) Only L09+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply. Level / Grade : 11 Job Family Group Information Technology
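The anomaly detection on logs and metrics that this role mentions (via Splunk MLTK in practice) reduces, in its simplest form, to flagging intervals whose value deviates sharply from the series. A toy z-score sketch in plain Python; the per-minute log counts are invented.

```python
from statistics import mean, stdev

# Toy anomaly check on per-minute log counts: flag any interval whose
# z-score against the whole series exceeds a threshold of 2.
counts = [100, 98, 103, 101, 99, 400, 102]

mu, sigma = mean(counts), stdev(counts)
anomalies = [(i, c) for i, c in enumerate(counts) if abs(c - mu) / sigma > 2]
print(anomalies)  # [(5, 400)]
```

Production systems replace the global mean with a rolling baseline (and often seasonal decomposition) so a daily traffic cycle is not itself flagged as anomalous.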

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site

Job Overview Plan A Technologies is looking for an MS SQL Server Database Developer. This is a fast-paced job with room for significant career growth. Please note: you must have at least 5+ years of experience as an MS SQL Server Developer or Database Developer to be considered for this role. JOB RESPONSIBILITY Develop, maintain, and optimize database solutions using SQL Server. Write efficient T-SQL queries, stored procedures, triggers, and functions. Perform database schema design, normalization, and optimization. Collaborate with developers, analysts, and stakeholders to understand database requirements. Optimize database performance through query optimization and indexing. Troubleshoot and resolve database issues such as performance bottlenecks and data corruption. Participate in code reviews, testing, and deployment activities. Stay updated on emerging database technologies and trends. Experience 5-7 years of experience as an MS SQL Server Developer or Database Developer. Proficiency in T-SQL and experience with SQL Server versions (2012/2014/2016/2019). Strong understanding of database concepts including normalization, indexing, and transactions. Experience with database administration tasks such as backup, recovery, and security. Familiarity with ETL tools for data integration (e.g., SSIS, Azure Data Factory). Knowledge of SSRS is an advantage. Excellent problem-solving skills and attention to detail. Excellent communication skills: must have at least Upper-Intermediate-level English (both verbal and written) Advanced problem-solving abilities, research, and learning skills Ability to work with engineers in multiple countries Must have an organized and analytical working style, and the ability to plan your own work Initiative and drive to do great things About The Company/Benefits Plan A Technologies is an American software development and technology advisory firm that brings top-tier engineering talent to clients around the world. 
Our software engineers tackle custom product development projects, staff augmentation, major integrations and upgrades, and much more. The team is far more hands-on than the giant outsourcing shops, but still big enough to handle major enterprise clients. Read more about us here: www.PlanAtechnologies.com. Location: Chennai, India – Hybrid work schedule: you will be required to work from our Chennai office for a minimum of 2 weeks per month. Great colleagues and an upbeat work environment: You'll join an excellent team of supportive engineers and project managers who work hard but don't ever compete with each other. Benefits: Vacation, Brand New Laptop, and More: You’ll get a generous vacation schedule and other goodies. If this sounds like you, we'd love to hear from you!
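The normalization this posting expects can be shown with a tiny example: a repeated attribute is factored out of the transaction table into its own table. Sketched with stdlib sqlite3 rather than SQL Server, and the customer/order data is invented.

```python
import sqlite3

# Denormalized input: the customer's city is repeated on every order row.
denormalized = [("alice", "Chennai", 250.0), ("alice", "Chennai", 75.0),
                ("bob", "Mumbai", 120.0)]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (name TEXT PRIMARY KEY, city TEXT);
    CREATE TABLE orders (customer TEXT REFERENCES customers(name), total REAL);
""")

# Normalization factors the repeated attribute into its own table, so a
# city change becomes one UPDATE instead of an update on every order row.
for name, city, total in denormalized:
    conn.execute("INSERT OR IGNORE INTO customers VALUES (?, ?)", (name, city))
    conn.execute("INSERT INTO orders VALUES (?, ?)", (name, total))

n_customers = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
n_orders = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(n_customers, n_orders)  # 2 3
```

The T-SQL equivalent uses the same two-table layout with a FOREIGN KEY constraint enforcing the relationship.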

Posted 1 month ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Project Role : Program/Project Management Representative Project Role Description : Deliver business and technology outcomes for assigned program, project, or contracted service. Leverage standard tools, methodologies and processes to deliver, monitor, and control service level agreements. Must have skills : Laboratory Information and Execution Systems Good to have skills : Life Sciences Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a LabVantage developer, design, develop, and maintain software applications using a Laboratory Information Management System (LIMS). Collaborate with cross-functional teams to ensure seamless integration with other IT components. Conduct rigorous system testing and troubleshooting to optimize the performance of software applications. Provide expert technical guidance and support to project teams throughout the implementation lifecycle. Ensure compliance with software development standards and best practices. Roles & Responsibilities: - As a LabVantage Application Developer, your day-to-day activities will revolve around leveraging your advanced proficiency in Laboratory Information Management System (LIMS) to develop and maintain software applications. - You'll be responsible for designing, coding, testing, and debugging software applications. - You'll be entrusted with the task of ensuring seamless integration with other IT components, thus playing a significant role in contributing to the organization's overall success. - You must have advanced proficiency in Laboratory Information Management System (LIMS). - Having intermediate proficiency in Configuration & Release Management and advanced proficiency in Design & Build Enablement will be advantageous. - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. 
- Collaborate with stakeholders to define project objectives and scope. - Develop and maintain project plans, including timelines, budgets, and resource allocation. - Monitor project progress and ensure adherence to timelines and deliverables. - Identify and mitigate project risks and issues. Professional & Technical Skills: - Must-Have Skills: Proficiency in Laboratory Information and Execution Systems. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Laboratory Information and Execution Systems. - This position is based at our Bengaluru office. - A 15 years full-time education is required.
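The data-munging skills listed above (cleaning, transformation, normalization) can be illustrated with a small helper. A minimal sketch in plain Python, with invented values; real pipelines would typically use Pandas or similar.

```python
# Min-max normalization rescales a numeric column to [0, 1]; None marks
# missing values, which the cleaning step must tolerate rather than crash on.
def min_max(values):
    present = [v for v in values if v is not None]
    lo, hi = min(present), max(present)
    return [None if v is None else (v - lo) / (hi - lo) for v in values]

print(min_max([10.0, None, 20.0, 15.0]))  # [0.0, None, 1.0, 0.5]
```

A production version would also decide how missing values are handled downstream (imputation, row drop, or a separate missingness indicator).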

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Details Category: Data Science Location : Bangalore Experience Level: 4-8 Years Position Description We are looking for a Data Engineer who will play a pivotal role in transforming raw data into actionable intelligence through sophisticated data pipelines and machine learning deployment frameworks. They will collaborate across functions to understand business objectives, engineer data solutions, and ensure robust AI/ML model deployment and monitoring. This role is ideal for someone passionate about data science, MLOps, and building scalable data ecosystems in cloud environments. Key Responsibilities Data Engineering & Data Science: Preprocess structured and unstructured data to prepare for AI/ML model development. Apply strong skills in feature engineering, data augmentation, and normalization techniques. Manage and manipulate data using SQL, NoSQL, and cloud-based data storage solutions such as Azure Data Lake. Design and implement efficient ETL pipelines, data wrangling, and data transformation strategies. Model Deployment & MLOps Deploy ML models into production using Azure Machine Learning (Azure ML) and Kubernetes. Implement MLOps best practices, including CI/CD pipelines, model versioning, and monitoring frameworks. Design mechanisms for model performance monitoring, alerting, and retraining. Utilize containerization technologies (Docker/Kubernetes) to support deployment and scalability Business & Analytics Insights Work closely with stakeholders to understand business KPIs and decision-making frameworks. Analyze large datasets to identify trends, patterns, and actionable insights that inform business strategies. Develop data visualizations using tools like Power BI, Tableau, and Matplotlib to communicate insights effectively. Conduct A/B testing and evaluate model performance using metrics such as precision, recall, F1-score, MSE, RMSE, and model validation techniques. 
Desired Profile Proven experience in data engineering, AI/ML data preprocessing, and model deployment. Strong expertise in working with both structured and unstructured datasets. Hands-on experience with SQL, NoSQL databases, and cloud data platforms (e.g., Azure Data Lake). Deep understanding of MLOps practices, containerization (Docker/Kubernetes), and production-level model deployment. Technical Skills Proficient in ETL pipeline creation, data wrangling, and transformation methods. Strong experience with Azure ML, Kubernetes, and other cloud-based deployment technologies. Excellent knowledge of data visualization tools (Power BI, Tableau, Matplotlib). Expertise in model evaluation and testing techniques, including A/B testing and performance metrics. Soft Skills Strong analytical mindset with the ability to solve complex data-related problems. Ability to collaborate with cross-functional teams to understand business needs and provide actionable insights. Clear communication skills to convey technical details to non-technical stakeholders. If you are passionate about working in a collaborative and challenging environment, apply now!
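The evaluation metrics this role lists (precision, recall, F1-score) have short closed forms worth knowing from scratch. A minimal sketch for the binary case; the labels are invented, and real work would use scikit-learn's equivalents.

```python
# Precision = TP / (TP + FP), recall = TP / (TP + FN), and F1 is their
# harmonic mean -- computed here from scratch for a binary classifier.
def prf1(y_true, y_pred):
    tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

p, r, f1 = prf1([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
print(round(p, 3), round(r, 3), round(f1, 3))  # 0.667 0.667 0.667
```

Precision answers "of the positives I predicted, how many were right?", recall answers "of the true positives, how many did I find?", and F1 penalizes an imbalance between the two.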

Posted 1 month ago

Apply