
45 SQL Proficiency Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 years

1 - 1 Lacs

Bhubaneswar, Odisha, India

Remote

Description
We are seeking freshers/entry-level candidates for the GIS position on our team in India. This role offers an exciting opportunity to work with geographic information systems and contribute to projects that use spatial data for effective decision-making.

Responsibilities
- Assist in the collection, analysis, and interpretation of geographic data.
- Support the development and maintenance of GIS databases and applications.
- Prepare detailed maps, reports, and presentations for various stakeholders.
- Conduct spatial analysis and modeling to support decision-making processes.
- Collaborate with team members on GIS projects and initiatives.
- Stay updated with the latest GIS technologies and methodologies.

Skills and Qualifications
- Proficiency in GIS software such as ArcGIS, QGIS, or similar.
- Strong analytical and problem-solving skills.
- Familiarity with spatial data formats and databases (e.g., shapefiles, GeoJSON).
- Basic knowledge of remote sensing and cartography principles.
- Ability to work collaboratively in a team environment.
- Strong communication skills, both written and verbal.
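Candidates unfamiliar with the spatial data formats mentioned above can get a feel for GeoJSON using nothing beyond Python's standard library. The sketch below parses a minimal, hypothetical point feature (the feature content is illustrative, not from any real dataset):

```python
import json

# A minimal GeoJSON Feature: a geometry plus free-form properties.
feature_text = '''
{
  "type": "Feature",
  "geometry": {"type": "Point", "coordinates": [85.8245, 20.2961]},
  "properties": {"name": "Bhubaneswar"}
}
'''

feature = json.loads(feature_text)
# GeoJSON coordinates are ordered [longitude, latitude].
lon, lat = feature["geometry"]["coordinates"]
print(feature["properties"]["name"], lon, lat)
```

Real GIS work would typically load whole FeatureCollections with a library such as GeoPandas, but the underlying structure is the same JSON shown here.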

Posted 5 days ago

Apply

10.0 - 15.0 years

0 Lacs

Maharashtra

On-site

You will be responsible for leading the architectural design and modernization of defined components in the software engineering domain. Collaborating with product owners in the business, you will design and build solutions for IBD and GCM. Regular communication with product leads across the technology organization to identify opportunities for improving existing and future technology solutions will be essential.

As a Technical Lead, you will be expected to act as a hands-on engineer, actively addressing the most challenging problems. Additionally, providing technical mentorship and leadership to a squad of developers will be a key part of your role. Your expertise in Java EE, microservices, web service development, REST, Service-Oriented Architecture, Object-Oriented Design, design patterns, architecture, application integration, databases, Spring Boot, JUnit, BDD, and Unix/Linux will be crucial for success in this position. Experience with a web UI JS framework such as AngularJS, a NoSQL database like MongoDB, and managing data through vendor feeds will be considered advantageous.

If you are looking to lead and drive technological innovation while mentoring a team of developers, this role offers a rewarding opportunity to make a significant impact. For further details on this opportunity, please reach out to 70454 59739 or email kajal@mmcindia.biz.

Posted 1 week ago

Apply

6.0 - 9.0 years

6 - 9 Lacs

Delhi, India

On-site

Your Role:
Lead OTM systems (IS) expertise across projects and operations, aligning with regional direction and working with internal and external stakeholders to ensure effective system deployment and support.

Your Responsibilities:
- Ensure deployment of only approved IS applications such as Transport Management Systems (TMS), visibility tools, and reports.
- Coordinate IS input for tender/proposal requests and ensure smooth TMS implementation during customer onboarding.
- Maintain ongoing TMS application support for existing operations.
- Support continuous improvement initiatives and innovation in logistics systems.
- Enforce TMS governance and compliance practices.
- Upskill team members through structured training and development activities.

Your Skills and Experiences:
- Minimum 6 years of implementation and configuration experience in Oracle Transportation Management (OTM), with both technical and functional expertise, especially in the distribution industry.
- Strong functional and techno-functional experience with OTM or Global Transportation Management implementations.
- Familiarity with OTM Release 6.5.X and above, including OTM Cloud.
- In-depth knowledge of key OTM application modules: order management, shipment management, OTM finance, automation agents, and interfaces.
- Ability to prepare mapping documents to interface OTM with EDI, WMS, Order Management, and Finance systems.
- Capable of translating operational requirements into technical design specifications for offshore delivery.
- Proficiency in SQL for automation agents and technical configurations.
- Experience with JSPX/XSL (preferred).
- Strong understanding of the end-to-end OTM lifecycle, including system architecture and implementations.

Posted 1 week ago

Apply

5.0 - 9.0 years

2 - 10 Lacs

Hyderabad, Telangana, India

On-site

Roles & Responsibilities:
- Collaborate with Product Teams and System Architects to understand business strategy, needs, and problems.
- Convert Epics into Features and granular User Stories with clear Acceptance Criteria and a Definition of Done.
- Translate user stories into functional Anaplan model designs, ensuring alignment with best practices and Amgen architectural standards.
- Develop and maintain Anaplan modules, dashboards, and integrations.
- Create and validate proof-of-concepts (POCs) to test assumptions, validate solutions, or propose new features.
- Maintain up-to-date documentation of Anaplan model architecture, business logic, data integrations, and process configurations.
- Produce end-user guides, functional specs, and technical documentation to support user enablement and organisational change.
- Conduct impactful demos of Anaplan features internally to Product Teams and partners.
- Find opportunities to improve existing Anaplan models and processes.
- Stay current with Anaplan releases, features, and community best practices; proactively recommend enhancements.
- Support the scaling of Anaplan across business units through templated solutions and reusable components.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's or Bachelor's degree with 5-9 years of experience in Computer Science, IT, or a related field.

Must-Have Skills:
- Programming experience in at least one modern language (e.g., Python, JavaScript, R) for scripting, data transformation, or integration.
- Excellent problem-solving skills and a passion for tackling complex challenges with technology.
- Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA.

Good-to-Have Skills:
- Experience in managing product features for PI planning and developing product roadmaps and user journeys.
- Familiarity with low-code/no-code test automation software.
- Able to communicate technical or complex subject matter in business terms.
- Experience in Agile/Scrum and DevOps environments.

Professional Certifications:
- Anaplan Certified Model Builder (incl. L1 and L2 MB) (required)
- Cloud certifications (AWS Certified Solutions Architect, DevOps Engineer, etc.) (preferred)
- Databricks certifications (Data Engineer Professional) (preferred)

Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
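The "scripting, data transformation, or integration" requirement above is the bread and butter of roles like this one. A minimal Python sketch of the pattern, with entirely hypothetical field names standing in for a real planning-system extract:

```python
# Hypothetical raw export rows, e.g. from a planning-system extract.
raw_rows = [
    {"region": "EU", "month": "2024-01", "units": "1200"},
    {"region": "EU", "month": "2024-02", "units": "950"},
    {"region": "US", "month": "2024-01", "units": "2100"},
]

def transform(rows):
    """Cast numeric strings to ints and aggregate units per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0) + int(row["units"])
    return totals

print(transform(raw_rows))  # {'EU': 2150, 'US': 2100}
```

In practice the same shape of transformation feeds an Anaplan import action or an integration pipeline; the point is comfort with casting, grouping, and reshaping tabular data in code.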

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Bhubaneswar

On-site

The Informatica Master Data Management (MDM) Expert plays a critical role in the organization by ensuring the integrity, consistency, and accuracy of master data across all business units. This position is essential for driving data governance initiatives and for supporting various data integration and management processes. As an MDM Expert, you will leverage your knowledge of Informatica tools to develop and implement MDM strategies that align with organizational goals. You will collaborate with cross-functional teams, providing expertise in data modeling, quality management, and ETL processes. This role requires a deep understanding of master data concepts as well as the ability to address complex data challenges, ensuring reliable data inputs for analytical and operational needs. In addition, you'll drive improvements in data processes, lead troubleshooting efforts for MDM-related incidents, and train other team members in best practices. Your contributions will not only enhance data quality but will also support strategic decision-making and business outcomes across the organization.

Key Responsibilities
- Design and implement Informatica MDM solutions according to business requirements.
- Lead the development of data governance frameworks and best practices.
- Integrate MDM with existing data management and analytics solutions.
- Collaborate with IT and business stakeholders to gather requirements.
- Perform data profiling and analysis to ensure governance standards are met.
- Develop and maintain data quality metrics and KPIs.
- Document data management processes, data flows, and MDM-related architecture.
- Provide troubleshooting support for MDM incidents and data discrepancies.
- Facilitate data model design and validation with stakeholders.
- Conduct training sessions for users on MDM tools and procedures.
- Stay current with industry trends and best practices in MDM.
- Coordinate with ETL teams to ensure smooth data integration.
- Manage ongoing MDM projects, ensuring timely delivery and quality.
- Support audit and compliance efforts related to data governance.
- Enhance and optimize existing MDM processes for efficiency.

Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data management, with a focus on MDM.
- Proven expertise in Informatica MDM and the Informatica toolset.
- Strong understanding of data governance principles and practices.
- Proficiency in SQL and relational database management.
- Experience with data modeling concepts and best practices.
- Knowledge of ETL processes and tools, particularly Informatica PowerCenter.
- Familiarity with XML and data transformation techniques.
- Prior experience with cloud-based data solutions is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal abilities.
- Ability to train and mentor junior team members.
- Hands-on experience with data quality tools and methodologies.
- Strong organizational skills with the ability to manage multiple projects.
- Experience in agile project management methodologies.
- Relevant certifications in Informatica or data governance are desirable.

Skills: data management, data governance, data modeling, data integration, data quality, data profiling, data transformation techniques, master data, MDM, Informatica, Informatica MDM, ETL processes, relational database management, SQL proficiency, cloud-based data solutions, data quality metrics, agile project management methodologies, organizational skills, analytical and problem-solving skills, communication and interpersonal skills.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Embark on a transformative journey as a Data Test Lead at Barclays, where the vision is clear: to redefine the future of banking and craft innovative solutions. In this role, you will be responsible for creating and enhancing the data that drives the bank's financial transactions, placing data quality at the forefront of all operations. This presents a unique opportunity to shape the organization's data usage and be a part of an exciting transformation in the banking sector.

To excel as a Data Test Lead, you should have experience with a diverse range of solutions, including Fraud Detection, Fraud Servicing & IDV, Application Fraud, and Consumption BI patterns. Strong test automation skills are essential, along with the ability to create frameworks for regression packs. Providing technical guidance and driving the test automation team is crucial, emphasizing proactive automation to ensure alignment with the development lifecycle. Collaborating on the DevOps agenda, configuring Jenkins/GitLab pipelines, and maturing automation capabilities through proper documentation are key responsibilities.

Additional valued skills for this role include collaborating with development teams to ensure testability and quality throughout the SDLC, identifying opportunities for test optimization, and mentoring junior QA engineers on automation best practices. Effective communication skills, SQL proficiency, working knowledge of Oracle, Hadoop, PySpark, Ab Initio, and other ETL tools, as well as experience with metadata, domain maintenance, and JIRA, are also highly advantageous.

The purpose of this role is to design, develop, and execute testing strategies to validate functionality, performance, and user experience, while working closely with cross-functional teams to identify and resolve defects. The accountabilities include developing and implementing comprehensive test plans, executing automated test scripts, analysing requirements, conducting root cause analysis, and staying informed of industry technology trends.

As an Assistant Vice President, you are expected to advise and influence decision-making, contribute to policy development, and lead a team performing complex tasks with professionalism and expertise. People Leaders are also expected to demonstrate leadership behaviours that create an environment for colleagues to excel. Colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their daily interactions and work.

Posted 3 weeks ago

Apply

10.0 - 20.0 years

15 - 21 Lacs

Remote, India

On-site

Description
We are seeking a Senior Business Analyst to join our dynamic team in India. The ideal candidate will have significant experience in business analysis, with a proven track record of delivering data-driven insights and strategic recommendations. This role requires strong analytical skills, the ability to work collaboratively with various stakeholders, and a passion for driving business improvement initiatives.

Responsibilities
- Work across business units to gather and analyze business requirements for cross-departmental projects.
- Interpret requirements (oral and written) into technical program specifications.
- Apply industry experience in the insurance domain (Auto, Home, Property & Casualty), including experience with Policy Admin Systems.
- Create complex software requirements; document and manage them throughout the software development lifecycle.
- Evaluate proposed system changes on complex applications to determine effort, impact, and project timeline.
- Consider system capacity, limitations, and operating time while completing assignments.
- Participate in analytical activities throughout the software development lifecycle.
- Maintain full technical knowledge of all phases of application systems analysis, including program design, testing, debugging, documenting, configuring, and installing.
- Self-monitor assignments and report status to project teams and management.
- Take responsibility for project completion and user satisfaction on complex assignments.
- Use system and business knowledge to optimize delivery and quickly resolve issues.
- Translate complex business and system needs for developers.
- Drive quality improvements: measure, monitor, and analyze production quality trends.
- Ensure complex system solutions provided meet business needs.

Skills and Qualifications
- 10-20 years of experience in business analysis or a related field.
- Proficient in data analysis tools such as SQL, Excel, and Tableau.
- Strong understanding of business processes and project management methodologies.
- Excellent communication and interpersonal skills to work with cross-functional teams.
- Ability to analyze complex data sets and extract meaningful insights.
- Experience with Agile methodologies and tools like JIRA or Trello.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Tosca Automation Engineer with 6-8 years of experience, you will be responsible for conducting functional and ETL testing across various systems to ensure accuracy and data integrity. Your primary focus will be on automation, ETL testing, SQL proficiency, and hands-on experience with Azure Databricks (ADB). You should have excellent skills in developing automation frameworks using Tosca and a solid understanding of Tosca's tools and architecture.

Key responsibilities:
- Develop and maintain automation frameworks for GUI and API testing using Tosca.
- Work with Tosca TDM/TDS for test data management.
- Conduct hands-on testing with Azure Databricks.
- Create custom automation reports using Tosca Custom Reporting.
- Collaborate with development teams to ensure proper test coverage and automation integration.
- Set up and maintain Tosca DEX and Jenkins pipelines for continuous testing.

Primary skills:
- Strong SQL proficiency for writing and understanding SQL queries.
- Hands-on experience in ETL testing.
- Proficiency in automating GUIs and APIs using Tosca.
- Knowledge of Tosca TDM/TDS.
- Experience in developing automation frameworks using Tosca.

Secondary skills:
- Experience with Azure Databricks.
- Knowledge of Tosca components and server architecture.
- Ability to set up and maintain Tosca DEX environments.
- Experience with ADO or Jenkins for continuous integration and delivery pipelines.
- Skills in configuring and customizing Tosca automation reports.

Overall, as a Tosca Automation Engineer, you will play a crucial role in ensuring the quality and efficiency of testing processes, automation frameworks, and test data management while collaborating with cross-functional teams to achieve successful automation integration and testing outcomes.
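The ETL testing and SQL proficiency this posting asks for often come down to simple reconciliation queries between source and target tables. A minimal sketch using in-memory SQLite stand-ins (table and column names are hypothetical; a real check would run against the actual warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source and target tables for a post-load reconciliation check.
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])

# Row-count and checksum reconciliation: both should match after a clean load.
src = cur.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt = cur.execute("SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()
assert src == tgt

# Key-level check: ids present in source but missing from target.
missing = cur.execute(
    "SELECT id FROM src_orders WHERE id NOT IN (SELECT id FROM tgt_orders)"
).fetchall()
print("missing ids:", missing)  # an empty list means no rows were dropped
```

The same count/sum/anti-join trio scales up to Databricks SQL; only the connection layer changes.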

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 12 Lacs

Gurgaon, Haryana, India

On-site

Role & responsibilities
- Should be well conversant with IRAC, exposure, large exposure norms, Resolution Plan implementation, sensitive sector monitoring, etc.
- Exposure to preparation of DSB, CRILC, PSL, MSME, SLBC, Defaulter, NPA, bureau reporting (CIBIL), NeSL submission, quarterly disclosure preparation, Risk Based Supervision submission, potential NPA tracking, exception reports, SCOD tracking, etc.
- Facilitate half-yearly review of all process notes and QC checklists to align with extant circulars.
- Excellent oral and written communication.
- Excellent project management skills to manage multiple automation projects with circular implementations.
- Able to provide assistance in evaluation and review of BRDs, as well as facilitating UATs for automation projects.
- Ability to manage work with minimum supervision.
- Ability to drive all process improvement initiatives.
- Ability to work under pressure and manage stakeholders' expectations.
- Exposure to handling Regulatory/Statutory/Concurrent audits and tracking open audit points till closure.
- Strong understanding of RBI circulars and experience leading a regulatory reporting team for a mid-sized/large bank.

Preferred candidate profile
- Should have worked in the BFSI domain (Indian private banks).
- Hands-on with credit regulatory reporting.
- Well versed with IRAC, large exposure norms, and credit reporting on the asset side.
- Experience in preparation of advances & exposure data, DSB 4 (RAQ), CRILC, LEF, PSL, MSME, SLBC, Defaulter, NPA identification and reporting, bureau reporting (CIBIL), and NeSL submission.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

5 - 10 Lacs

Mumbai, Maharashtra, India

On-site

Role & responsibilities
- Assist in preparation of the Compliance Review plan to be put up to the Board.
- Prepare and update Compliance Review checklists in line with regulatory guidelines/internal policies.
- Undertake field work and ensure that each review is completed within the prescribed timelines.
- Prepare the draft Compliance Review report and discuss it with the relevant stakeholders.
- Obtain responses from the stakeholders and prepare the final Compliance Review report.
- Ensure proper documentation and working papers are maintained for all compliance reviews conducted.
- Prepare an open-issue tracker for Compliance Review reports issued.
- Follow up with management towards closure of compliance review open issues.

Essential competencies
- In-depth knowledge of the regulatory environment for banks in India, particularly the RBI.
- Good research capabilities and intelligent interpretation of regulatory guidelines.
- Good knowledge of all the elements (commercial, operational) of banking areas like trade, remittances, treasury, etc.
- Good interpersonal skills.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Data Platform Support Engineer, your main responsibility will be to ensure the smooth execution of data pipelines and the prompt resolution of issues to uphold business continuity. You will play a crucial role in maintaining the health and reliability of data systems by conducting root cause analysis, implementing proactive measures, and minimizing disruptions.

Key accountabilities:
- Monitor and manage the performance of Azure Data Factory pipelines, Databricks workflows, and SQL databases to guarantee seamless data processing.
- Troubleshoot and resolve production incidents in Azure-based data pipelines, conduct root cause analysis, and implement preventive measures.
- Oversee and optimize the performance of Databricks notebooks and clusters to support efficient data transformations and analytics.
- Ensure the reliability and scalability of data integration workflows by utilizing Azure-native monitoring tools and alerts.
- Collaborate with development teams to deploy and support new Azure Data Factory pipelines, SQL scripts, and Databricks jobs in production.
- Maintain compliance with data governance, security, and backup policies across the Azure platform.
- Coordinate with stakeholders to provide clear updates on production incidents, resolutions, and performance improvements.
- Plan and execute disaster recovery and failover strategies for Azure Data Factory, Databricks, and SQL components to ensure business continuity.
- Document operational processes, troubleshooting steps, and best practices for the Azure platform to build a comprehensive knowledge base.

Your technical skills should include expertise in Azure Data Factory and Databricks, proficiency in SQL, experience in monitoring and alerting using Azure Monitor and Log Analytics, strong incident management skills, knowledge of data governance and security standards, experience in process improvement, and proficiency in documentation. Your expertise across these areas will be key to your success in this role.

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

NTT DATA is looking for a Business Consulting - Functional DB Consultant to join the team in Pune, Maharashtra (IN-MH), India. As part of our inclusive and forward-thinking organization, you will be responsible for key tasks related to database management in the Capital Markets - Wealth Management domain.

Your primary responsibilities will include gathering and analyzing requirements from stakeholders and translating them into database needs. You will design and implement database schemas, data models, and structures to support business processes effectively. Additionally, you will focus on optimizing database performance, efficiency, and scalability. Data migration, integration, security maintenance, troubleshooting, documentation, and collaboration with technical teams will also be crucial aspects of your role.

To excel in this position, you must have strong expertise in database concepts, design principles, and systems such as Oracle, SQL Server, and PostgreSQL. Proficiency in SQL and data modeling, business acumen, communication and problem-solving skills, and experience with cloud platforms (AWS, Azure, Google Cloud) will be highly beneficial. Project management skills to lead database projects from planning to execution will also be required.

NTT DATA is a trusted global innovator of business and technology services, serving Fortune Global 100 companies with a commitment to innovation and long-term success. With a diverse team and a wide range of services, including consulting, data and artificial intelligence, industry solutions, and application management, we are dedicated to helping organizations navigate the digital future confidently and sustainably. Join us in our mission to innovate, optimize, and transform for success. Visit us at us.nttdata.com.

Posted 1 month ago

Apply

10.0 - 12.0 years

4 - 8 Lacs

Kolkata, West Bengal, India

On-site

Job Description
Project Role: Data Management Practitioner
Project Role Description: Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum experience required: 12 years
Educational qualification: any graduate

The Data Management Practitioner role described here falls under the Data Management or Data Governance function.

Key Responsibilities:
- Designing data strategies: developing and implementing strategies that ensure data integrity and compliance while optimizing data usage across the organization.
- Data quality & governance: designing data quality rules, setting up compliance policies, and enforcing governance frameworks to ensure that data is accurate, secure, and used optimally.
- Team leadership: as an SME, managing teams, contributing to decision-making processes, and ensuring that all practices align with organizational goals and regulations.

Professional & Technical Skills:
- Must-have: proficiency in Data Architecture Principles, ensuring that data is structured, organized, and governed effectively.
- Deep understanding of data management best practices, including how to implement them across different systems within the organization.
- Experience with data governance and compliance policies, critical for ensuring that data management adheres to regulatory standards.
- Additional: ability to optimize data usage and derive value for the organization by ensuring the right access, quality, and utility of data across departments.

Required Experience: a minimum of 12 years of experience with Data Architecture Principles; this role calls for an experienced individual with a deep understanding of how to manage large data sets and integrate them into business operations.
Educational Requirements: a graduate degree in any field, indicating flexibility regarding educational background.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Bangalore, Karnataka

On-site

As a Senior Tableau Developer located in Bangalore (working from the office), you will utilize your 8+ years of Tableau expertise to develop, publish, and administer dashboards using Tableau Desktop and Tableau Server (or Tableau Cloud). The role requires advanced Tableau skills, including calculated fields, LOD expressions, data blending, actions, Tableau Prep, and advanced chart types.

Your proficiency in SQL will be essential for data extraction, manipulation, and analysis; you should have experience with complex queries, joins, and subqueries. Additionally, your consulting experience will be crucial as you interact with business stakeholders to translate complex requirements into actionable solutions. You will work with diverse data sources such as relational databases, cloud storage (AWS, Azure), and APIs, with familiarity with ETL processes considered a plus.

Project management skills are necessary to handle multiple projects concurrently while meeting deadlines. Your organizational skills and ability to work both independently and as part of a team will be instrumental to your success. Strong communication and presentation skills are required to explain complex data concepts to non-technical stakeholders, and your problem-solving abilities, coupled with a focus on data accuracy and integrity, will be key in addressing analytical challenges.
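The combination of skills this posting names (joins, subqueries, and LOD expressions) meet in one common pattern: attaching a group-level aggregate to each detail row. A small SQLite sketch with hypothetical schema and data; the join to a grouped subquery is the SQL analogue of a Tableau FIXED LOD expression such as {FIXED [Region]: AVG([Amount])}:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
INSERT INTO sales VALUES
  ('North', 'Asha', 100), ('North', 'Ravi', 300),
  ('South', 'Meena', 50), ('South', 'Karan', 150);
""")

# Join each detail row to a subquery of per-region averages, so every
# sale carries its region-level benchmark alongside it.
rows = conn.execute("""
SELECT s.region, s.rep, s.amount, r.avg_amount
FROM sales s
JOIN (SELECT region, AVG(amount) AS avg_amount
      FROM sales GROUP BY region) r
  ON s.region = r.region
ORDER BY s.region, s.rep
""").fetchall()

for row in rows:
    print(row)  # first row: ('North', 'Asha', 100.0, 200.0)
```

Pushing this aggregation into the database, rather than blending it in the dashboard, is often the difference between a slow and a fast Tableau workbook.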

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The Information Services Group (ISG) Data Solutions organization at Citi is dedicated to fostering a data-driven culture by providing innovative technology solutions and simplifying access to quality data and business intelligence. Collaborating closely with data clients across Citi's business lines, the ISG organization strives to enhance competitiveness by restructuring business processes, facilitating information access, and offering business intelligence related to critical data. This role is part of the Reference Data Management group within ISG Data Solutions.

As a Senior Business Analyst at an intermediate level, you will play a crucial role in facilitating communication between business users and technologists, exchanging information in a concise, logical, and understandable manner in coordination with the Technology team. The primary objective of this position is to contribute to continuous, iterative exploration and investigation of business performance and other metrics to gain insights and drive business planning.

The Execution BA Team, where this role sits, focuses on a global strategic technology platform known as the Global Corporate Action Event Repository. Your key tasks will include processing vendor data feeds, mapping data, analyzing and researching data, working on SWIFT messages, developing user interfaces, and ensuring timely and accurate data publishing to downstream clients. Collaboration with stakeholders, including IT, PM, and Operations partners, will be essential to create business and functional requirements, develop interfaces, and define EPICs and user stories based on client requirements. The role also involves data modeling, mapping, data mining, transformation, SQL proficiency, working with data provider vendors (internal and external), and documentation and user training.

Strong analytical and writing skills, experience in writing functional and technical specifications, and the ability to deliver high-quality work under pressure are crucial in this position. A college degree, specialized training, or equivalent work experience is required, alongside attention to detail and a data-oriented mindset. To excel in this role, you should possess 8+ years of relevant experience; strong analytical, interpretive, and problem-solving skills; and interpersonal, management, and prioritization skills. Clear and concise communication, self-motivation, and the ability to work methodically under tight deadlines are essential. A Bachelor's degree or equivalent experience is required.

Citi is an equal opportunity and affirmative action employer, encouraging all qualified interested applicants to apply for career opportunities. If you require a reasonable accommodation due to a disability when using our search tools or applying for a career opportunity, please review the Accessibility at Citi guidelines.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The AIML Architect - Dataflow, BigQuery position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and the technical vision.

You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization.

Key Responsibilities
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms for extracting insights from large datasets.
- Optimize data storage and retrieval processes to improve performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Work closely with cross-functional teams to align data workflows with business objectives.
- Conduct technical evaluations and assessments of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship and guidance to junior data engineers and analysts.
- Stay updated on industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, especially BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience in implementing machine learning solutions in cloud environments.
- Solid programming skills in Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Experience with big data workloads and distributed computing.
- Familiarity with modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Proven track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.

Skills: Cloud Computing, SQL Proficiency, Dataflow, AIML, Scala, Data Governance, ETL Processes, Python, Machine Learning, Java, Google Cloud Platform, Data Architecture, Data Modeling, BigQuery, Data Engineering, Data Visualization Tools
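As a rough illustration of the batch-analytics work this role describes, here is a minimal plain-Python sketch of the kind of per-key aggregation a Dataflow pipeline stage typically performs before results are loaded into a BigQuery table. The event schema and field names are illustrative assumptions, not taken from the posting:

```python
from collections import defaultdict

def aggregate_events(events):
    """Group raw events by key and compute per-key totals,
    mimicking a GroupByKey + Combine step in a batch pipeline."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for event in events:
        totals[event["user_id"]] += event["amount"]
        counts[event["user_id"]] += 1
    # Emit one summary row per key, ready to load into a warehouse table.
    return [
        {"user_id": k, "total": totals[k], "events": counts[k]}
        for k in sorted(totals)
    ]

rows = aggregate_events([
    {"user_id": "a", "amount": 10.0},
    {"user_id": "b", "amount": 5.0},
    {"user_id": "a", "amount": 2.5},
])
```

In a real Dataflow job the same shape would be expressed with Apache Beam transforms and the results written out with a BigQuery sink; the pure-Python version above only shows the aggregation logic.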

Posted 1 month ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Hyderabad, Telangana, India

On-site

5+ years of experience designing, building, and maintaining complex ELT/ETL jobs that deliver business value. Responsibilities include:
- Extract, transform, and load data from various sources, including databases, APIs, and flat files, using IICS or Python/SQL.
- Translate high-level business requirements into technical specs.
- Conduct unit testing, integration testing, and system testing of data integration solutions to ensure accuracy and quality.
- Ingest data from disparate sources into the data lake and data warehouse.
- Cleanse and enrich data and apply adequate data quality controls.
- Provide technical expertise and guidance to team members on Informatica IICS/IDMC and data engineering best practices to guide the future development of the client's Data Platform.
- Develop reusable tools to help streamline the delivery of new projects.
- Collaborate closely with other developers and provide mentorship.
- Evaluate and recommend tools, technologies, processes, and reference architectures.
- Work in an Agile development environment, attending daily stand-up meetings and delivering incremental improvements.
- Participate in code reviews, ensure all solutions are aligned to architectural and requirement specifications, and provide feedback on code quality, design, and performance.

Mandatory skills: Informatica/IICS, SQL
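The extract-transform-load flow this posting describes can be sketched in plain Python. The field names and the cleansing rules below are illustrative assumptions; a production job would run in IICS or a SQL engine rather than in-process lists:

```python
def extract(raw_lines):
    """Extract: parse delimited source records into dicts."""
    header, *body = [line.split(",") for line in raw_lines]
    return [dict(zip(header, fields)) for fields in body]

def transform(records):
    """Transform: cleanse and enrich - drop rows missing an id,
    normalize names, and add a simple quality flag."""
    out = []
    for r in records:
        if not r.get("id"):
            continue  # basic data-quality control: reject incomplete rows
        r["name"] = r["name"].strip().title()
        r["valid"] = r["amount"].replace(".", "", 1).isdigit()
        out.append(r)
    return out

def load(records, target):
    """Load: append cleansed rows to the target store (a list here)."""
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract([
    "id,name,amount",
    "1, alice ,10.5",
    ",bob,3",          # rejected: missing id
    "2,carol,oops",    # kept, but flagged invalid
])), warehouse)
```

Each stage is kept separate so it can be unit-tested on its own, mirroring the unit/integration/system testing the role calls for.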

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Cross Markets Software Engineer at Barclays, you will play a crucial role in spearheading the evolution of the digital landscape, driving innovation and excellence. Your primary responsibility will be to harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences.

To be successful in this role, you should have experience building interactive dashboards and visualizations using Tableau Desktop and Tableau Server, along with advanced SQL skills for querying, transforming, and analyzing large datasets. You will also be expected to translate complex business requirements into intuitive visual analytics. Desirable skills include experience with cloud platforms such as AWS or Azure, as well as a background in financial services (preferred but not essential).

Your responsibilities will include designing, developing, and improving software using various engineering methodologies to provide business, platform, and technology capabilities for customers and colleagues. You will collaborate cross-functionally with product managers, designers, and other engineers to define software requirements and ensure seamless integration with business objectives. Furthermore, you will be accountable for developing high-quality software solutions, adhering to secure coding practices, and implementing effective unit testing practices to ensure proper code design, readability, and reliability. You will also need to stay informed about industry technology trends and actively contribute to the organization's technology communities.

As an Analyst, you will be expected to consistently drive continuous improvement, demonstrate in-depth technical knowledge in your area of expertise, and lead and supervise a team if applicable. You will also be responsible for collaborating with other functions, advising on decision-making, and managing risk within your work area.

Overall, all colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behavior and actions.

Posted 1 month ago

Apply

3.0 - 6.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Minimum 5 years of relevant, hands-on experience in Teradata.

Detailed skill set: Teradata, SQL proficiency, data modeling, Teradata utilities, scripting languages, and data warehousing.

Good communication and problem-solving skills.

Mandatory skills: Teradata, SQL
Desired skills: Development, configuration, solution evaluation, validation, and deployment
Domain: Risk and Compliance

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

coimbatore, tamil nadu

On-site

As a Lead Database Administrator (DBA) at our organization, you will play a crucial role in managing and optimizing our database infrastructure. Your primary responsibilities will include designing, implementing, and overseeing scalable and high-performance database solutions across various environments. You will be working with relational database management systems such as MySQL, MSSQL, and PostgreSQL, as well as MongoDB, and with cloud database management on AWS.

In this leadership position, you will lead database migrations, both on-premise and to the cloud, ensuring minimal downtime and a smooth transition. You will also be responsible for implementing best practices for database backup, disaster recovery, and security across multiple database systems. Your expertise in database performance tuning and query optimization will be essential in enhancing application performance.

Additionally, you will be involved in capacity planning to ensure that our database environments are adequately scaled to meet application demands. Implementing automation tools for database monitoring, reporting, and health checks will also be part of your responsibilities. You will be required to develop and enforce database policies, procedures, and documentation while staying up to date on industry trends and emerging technologies in database management and cloud platforms.

The ideal candidate will possess a strong background in database migrations, AWS cloud services, and various database technologies. Proficiency in database design, optimization, backup, recovery, and high availability is essential, along with strong knowledge of database security best practices, automation and scripting, leadership and collaboration, and problem-solving. Relevant certifications such as AWS Certified Database Specialty or AWS Solutions Architect are preferred.

Additional qualifications that would be beneficial for this role include experience with big data technologies, CI/CD pipelines, database monitoring tools, and DevOps methodologies. Your ability to work with monitoring tools, SQL proficiency, high availability solutions, data security, performance tuning, and team leadership will be key to succeeding in this position.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

NTT DATA is looking for a Business Consulting - Functional DB Consultant to join their team in Pune, Maharashtra (IN-MH), India. As part of this inclusive and forward-thinking organization, you will take on key responsibilities related to database management in the Capital Markets - Wealth Management domain.

Your primary responsibilities will include gathering and analyzing requirements from stakeholders to translate them into database needs. You will be responsible for designing and implementing database schemas, data models, and structures to support business processes. Additionally, optimizing database performance, efficiency, and scalability through various techniques will be a crucial part of your role. You will also work on data migration, integration, security, access control, troubleshooting, support, documentation, training, and collaboration with other teams, while staying updated with the latest database technologies and standards.

To excel in this role, you should possess strong database expertise with knowledge of systems such as Oracle, SQL Server, and PostgreSQL. Proficiency in SQL, data modeling, business acumen, communication skills, problem-solving skills, and experience with cloud technologies are essential. Project management skills for managing database projects will be an added advantage.

NTT DATA is a trusted global innovator offering business and technology services to Fortune Global 100 companies. With a commitment to innovation and long-term success, NTT DATA has experts in over 50 countries and a strong partner ecosystem. Their services include consulting, data and AI, industry solutions, and the development of applications, infrastructure, and connectivity. As part of the NTT Group, they invest significantly in R&D to support organizations and society in navigating the digital future confidently and sustainably. Visit us at us.nttdata.com.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

NTT DATA is looking for a dedicated Business Consulting - Functional DB Consultant to join their team in Pune, Maharashtra (IN-MH), India. As part of NTT DATA, you will work with stakeholders to understand business needs and effectively translate them into database requirements. Your primary focus will be on database design, implementation, optimization, and tuning to support various business processes efficiently.

Your key responsibilities will include gathering and analyzing requirements, designing database schemas, data models, and structures, optimizing database performance, and ensuring data security and access control. You will also be responsible for data migration, integration, troubleshooting, and providing necessary support to end users.

As a Functional DB Consultant, you will collaborate with developers, system administrators, and other stakeholders to ensure seamless database integration with other systems. It is essential to stay updated with the latest database technologies, best practices, and security standards to deliver high-quality solutions.

The ideal candidate should have strong expertise in database concepts, design principles, and systems such as Oracle, SQL Server, and PostgreSQL. Proficiency in SQL, data modeling, business acumen, communication, and problem-solving skills are essential. Experience with cloud-based database solutions and project management skills would be advantageous.

If you are passionate about database management and want to be part of an innovative and forward-thinking organization like NTT DATA, apply now and join our inclusive and adaptable team.

NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts in over 50 countries, NTT DATA provides business and technology consulting, data and artificial intelligence, industry solutions, and digital infrastructure services. As part of the NTT Group, we invest significantly in research and development to empower organizations and society in their digital transformation journey. Visit us at us.nttdata.com to learn more about our global initiatives and career opportunities.

Posted 1 month ago

Apply

5.0 - 6.0 years

3 - 14 Lacs

Bengaluru, Karnataka, India

On-site

About the Role
Uber sends billions of messages to its users across channels such as Email, Push, SMS, WhatsApp, and in-app surfaces, through an internally built CRM system. We're looking for a Product Manager to lead the development of marketing measurement and insight-generation tools. This role will focus on enabling clear performance tracking, consistent measurement, and data-driven decision-making, empowering teams across Uber to optimize marketing efforts with confidence and speed.

What the Candidate Will Do
- Partner with Marketing, Data Science, Engineering, and other cross-functional teams to deeply understand business needs and define measurement strategies.
- Drive the product vision, roadmap, and execution.
- Build and refine underlying data processes and pipelines to ensure reliable, high-quality datasets that power measurement and insight generation.
- Collaborate with Engineering to design, implement, and maintain scalable data systems (e.g., data lakes, ETL frameworks) supporting marketing workflows.
- Develop intuitive dashboards and analytics tools that surface actionable insights on campaign performance, audience engagement, channel effectiveness, and overall marketing impact.
- Establish frameworks for consistent marketing measurement, including attribution, incrementality, and experimentation, ensuring alignment across diverse teams and markets.
- Collaborate with stakeholders to define KPIs, track impact, and foster continuous improvement in data-driven marketing decisions.
- Champion data governance and best practices so that marketers can trust and confidently act on insights.

Basic Qualifications
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related technical or analytical field.
- 5+ years of product management experience with a focus on data platforms, analytics, or business intelligence.
- Strong understanding of marketing measurement, data modeling, and reporting best practices.
- Experience working with large-scale data infrastructure and tools (e.g., SQL, Looker, BigQuery, Airflow).
- Ability to translate complex data requirements into simple, user-centric products.
- Strong cross-functional collaboration and communication skills.

Preferred Qualifications
- Master's degree in a technical field.
- Experience in digital marketing, CRM, or MarTech environments.
- Familiarity with experimentation and incrementality testing.
- Interest in applying AI/ML to enhance marketing analytics and insights.
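For context on the incrementality testing mentioned in this posting, here is a minimal sketch of computing lift from a simple treatment/holdout experiment. The function name and the numbers are illustrative assumptions, not Uber's actual methodology:

```python
def incremental_lift(treated_conv, treated_n, control_conv, control_n):
    """Estimate incrementality: how much of the treated group's
    conversion rate is attributable to the campaign vs. baseline."""
    treated_rate = treated_conv / treated_n
    control_rate = control_conv / control_n
    absolute_lift = treated_rate - control_rate
    relative_lift = absolute_lift / control_rate if control_rate else float("inf")
    return {
        "treated_rate": treated_rate,
        "control_rate": control_rate,
        "absolute_lift": absolute_lift,
        "relative_lift": relative_lift,
    }

# 12% conversion among users who saw the campaign vs. 8% in the holdout.
result = incremental_lift(treated_conv=120, treated_n=1000,
                          control_conv=80, control_n=1000)
```

A production framework would also report confidence intervals and correct for sample-size imbalance; this sketch only shows the core point-estimate calculation.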

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The AIML Architect - Dataflow, BigQuery plays a crucial role within the organization by focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. Your primary responsibility will involve combining advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that improve decision-making processes across various departments. Building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in data workflows will be key. Collaboration with data engineers, data scientists, and application developers is essential to align the technical vision with business goals.

Your expertise in cloud-native architectures will be instrumental in driving innovation, efficiency, and insights from vast datasets. The ideal candidate will have a strong background in data processing and AI/ML methodologies and be adept at translating complex technical requirements into scalable solutions that meet the organization's evolving needs.

Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms to extract insights from large datasets.
- Optimize data storage and retrieval processes to enhance performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Align data workflows with business objectives through collaboration with cross-functional teams.
- Conduct technical evaluations of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship to junior data engineers and analysts.
- Stay updated with industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, specifically BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Proficient in programming languages like Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Familiarity with big data workloads and distributed computing.
- Knowledge of modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

The Information Services Group (ISG), Data Solutions organization at Citi is dedicated to fostering a data-driven culture by providing innovative technology solutions and simplifying access to quality data and business intelligence. As part of the Reference Data Management group within ISG Data Solutions, the Senior Business Analyst plays a crucial role in facilitating communication between business users and technologists. This intermediate-level position focuses on exchanging information in a clear and logical manner to support continuous exploration of business performance and drive business planning initiatives.

The Execution BA Team, operating on a global strategic technology platform - the Global Corporate Action Event Repository - carries out key tasks such as processing vendor data feeds, mapping data, analyzing and researching data, working on Swift messages, developing user interfaces, publishing data to downstream clients, establishing rules and governance, ensuring accuracy and timeliness of information, and collaborating with stakeholders worldwide. The team's efforts center on meeting client and user requirements to shape the product roadmap effectively.

As a Senior Business Analyst, your responsibilities will include creating business and functional requirements, data modeling and mapping, data mining and transformation, SQL proficiency, collaborating closely with IT, project management, and operations teams, developing interfaces such as web GUIs and APIs, working with data provider vendors, defining EPICs and user stories based on client requirements, engaging with stakeholders, and documentation and user training.

Key skills required for this role include working with clients to gather business requirements; strong analytical and writing skills; experience in writing functional and technical specifications; validation and testing in collaboration with development, quality assurance, and operations teams; understanding the data vendor/data source hierarchy for the Golden Copy; familiarity with business process reengineering and business modeling concepts; and the ability to deliver high-quality work under pressure with a keen eye for detail.

The ideal candidate should have a bachelor's degree or equivalent experience, at least 8 years of relevant experience, proven ability in using complex analytical and problem-solving techniques, strong interpersonal and management skills, clear written and verbal communication, self-motivation, and the ability to work under pressure to meet deadlines methodically and with attention to detail.

Citi is an equal opportunity and affirmative action employer, offering a full-time position within the Technology Management job family group. If you are passionate about leveraging data-driven insights to drive business strategies and possess the required skills and experience, we invite you to apply and explore career opportunities at Citi.
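The data-mapping work this role describes - translating vendor feed fields into a canonical schema - can be sketched as follows. The field names and mapping are hypothetical examples for illustration; the actual Golden Copy schema is internal to Citi:

```python
# Hypothetical mapping from a vendor's corporate-action feed
# to an internal canonical schema.
FIELD_MAP = {
    "CAEV": "event_type",
    "ISIN": "security_id",
    "XDTE": "ex_date",
}

def map_vendor_record(vendor_record, field_map=FIELD_MAP):
    """Rename vendor fields to canonical names and report
    any source fields the mapping does not cover."""
    mapped, unmapped = {}, []
    for key, value in vendor_record.items():
        if key in field_map:
            mapped[field_map[key]] = value
        else:
            unmapped.append(key)
    return mapped, unmapped

record, leftovers = map_vendor_record(
    {"CAEV": "DVCA", "ISIN": "US0378331005",
     "XDTE": "2024-05-10", "MISC": "x"}
)
```

Tracking the unmapped fields gives the analyst a concrete list of gaps to resolve with the data provider vendor, which is the kind of validation-and-testing loop the posting calls for.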

Posted 1 month ago

Apply