7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
We are seeking a highly skilled Senior Snowflake Developer with expertise in Python, SQL, and ETL tools to join our dynamic team. The ideal candidate will have a proven track record of designing and implementing robust data solutions on the Snowflake platform, along with strong programming skills and experience with ETL processes.

Key Responsibilities:
- Design and develop scalable data solutions on the Snowflake platform to support business needs and analytics requirements.
- Lead the end-to-end development lifecycle of data pipelines, including data ingestion, transformation, and loading.
- Write efficient SQL queries and stored procedures to perform complex data manipulations and transformations within Snowflake.
- Implement automation scripts and tools in Python to streamline data workflows and improve efficiency.
- Collaborate with cross-functional teams to gather requirements, design data models, and deliver high-quality solutions.
- Tune and optimize Snowflake databases and queries to ensure optimal performance and scalability.
- Implement best practices for data governance, security, and compliance within Snowflake environments.
- Mentor junior team members and provide technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience working with the Snowflake data warehouse.
- Strong proficiency in SQL, with the ability to write complex queries and optimize performance.
- Extensive experience developing data pipelines and ETL processes using Python and ETL tools such as Apache Airflow, Informatica, or Talend.
- Strong Python coding experience (minimum 2 years).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving and analytical skills with keen attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Relevant certifications in Snowflake or related technologies are a plus.
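As an illustration of the SQL-plus-Python automation work this role describes (not part of the posting itself), a common pattern is generating a Snowflake-style MERGE (upsert) statement from table metadata. This is a minimal sketch; the table and column names are hypothetical, and a real pipeline would also validate identifiers and bind data through a driver:

```python
# Illustrative sketch only: build a Snowflake-style MERGE (upsert) statement
# from a target table, a staging table, and column lists. All names below
# are hypothetical examples, not from the posting.

def build_merge_sql(target: str, staging: str, key_cols: list[str], cols: list[str]) -> str:
    """Generate a MERGE statement that upserts rows from a staging table."""
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    update_set = ", ".join(f"t.{c} = s.{c}" for c in cols if c not in key_cols)
    insert_cols = ", ".join(cols)
    insert_vals = ", ".join(f"s.{c}" for c in cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on_clause} "
        f"WHEN MATCHED THEN UPDATE SET {update_set} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

sql = build_merge_sql(
    "dim_customer", "stg_customer",
    ["customer_id"], ["customer_id", "name", "city"],
)
```

Generating statements this way keeps key-matching and column lists in one place instead of hand-editing repetitive SQL for each table.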
Posted 1 week ago
5.0 - 10.0 years
7 - 20 Lacs
Noida, Uttar Pradesh, India
On-site
We are looking for an experienced Data Analyst to join our team in India. The ideal candidate will have a strong background in data analysis and be proficient in various data analysis tools and techniques. You will be responsible for transforming raw data into actionable insights that drive business decisions.

Responsibilities
- Collecting, processing, and analyzing large datasets to identify trends and insights.
- Creating detailed reports and visualizations to communicate findings to stakeholders.
- Collaborating with cross-functional teams to define data requirements and develop data solutions.
- Developing and maintaining dashboards and automated reporting tools.
- Ensuring data integrity and accuracy by conducting regular data audits.

Skills and Qualifications
- Bachelor's degree in Data Science, Statistics, Mathematics, or a related field.
- 5-10 years of experience in data analysis or a related field.
- Proficiency in SQL for querying databases.
- Experience with data visualization tools such as Tableau, Power BI, or similar.
- Strong knowledge of statistical analysis techniques and tools (e.g., R, Python, SAS).
- Familiarity with data manipulation libraries (e.g., Pandas, NumPy).
- Excellent problem-solving skills and attention to detail.
- Strong communication skills to convey complex data insights to non-technical stakeholders.
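To illustrate the kind of trend analysis this role involves (an illustrative sketch, not part of the posting), here is a stdlib-only least-squares slope over a made-up monthly series:

```python
# Illustrative sketch: a least-squares trend over monthly totals, the kind of
# quick check an analyst might run before building a dashboard. The data
# below is invented for demonstration.
from statistics import mean

def trend_slope(values):
    """Least-squares slope of values against their index (units per period)."""
    xs = range(len(values))
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

monthly_orders = [120, 135, 150, 165]   # steadily rising series
slope = trend_slope(monthly_orders)     # 15.0 orders per month
```

In practice the same computation is usually done with Pandas or R, but the arithmetic underneath is exactly this.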
Posted 1 week ago
3.0 - 4.0 years
5 - 9 Lacs
Ahmedabad
Work from Office
Roles and Responsibilities:
- Design and implement efficient data models in Power BI to support business requirements; ensure data models are scalable, optimized, and maintainable.
- Develop and optimize DAX formulas for creating calculated columns, measures, and calculated tables; implement complex calculations and business logic in DAX.
- Design and implement ETL processes using Power Query to transform and load data into Power BI; ensure data quality and integrity through effective data cleansing techniques.
- Integrate Power BI with Power Apps and Power Automate to create end-to-end solutions; develop custom applications and automate business processes.
- Design and manage data flows to streamline data integration and transformation; optimize data flow processes for improved performance.
- Write and optimize SQL queries for data extraction and manipulation; collaborate with the database team to ensure efficient data retrieval.
- Create and maintain comprehensive documentation for Power BI solutions, data models, and processes; ensure documentation is up to date and accessible to the team.
- Collaborate with cross-functional teams to understand business requirements and deliver effective BI solutions; provide guidance and support to junior team members.
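The data-modeling work described here follows a star-schema pattern: a fact table of measures keyed to dimension tables of attributes. As an illustrative sketch (in plain Python rather than DAX, with invented table contents), slicing a measure by a dimension attribute looks like this:

```python
# Illustrative sketch, not Power BI/DAX: the same star-schema lookup that a
# DAX measure sliced by a dimension attribute performs, in plain Python.
# Table contents are hypothetical examples.

dim_product = {  # dimension table keyed by product_id
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Electronics"},
}
fact_sales = [  # fact table rows referencing the dimension key
    {"product_id": 1, "amount": 250.0},
    {"product_id": 2, "amount": 400.0},
    {"product_id": 1, "amount": 100.0},
]

# Aggregate the measure (amount) by the dimension attribute (category)
sales_by_category: dict[str, float] = {}
for row in fact_sales:
    category = dim_product[row["product_id"]]["category"]
    sales_by_category[category] = sales_by_category.get(category, 0.0) + row["amount"]
# sales_by_category == {"Hardware": 350.0, "Electronics": 400.0}
```

Keeping facts and dimensions separate like this is what makes the Power BI models the posting asks for "scalable, optimized, and maintainable".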
Posted 1 week ago
3.0 - 10.0 years
4 - 23 Lacs
Ahmedabad, Gujarat, India
On-site
Description
We are seeking a skilled Data Analyst to join our dynamic team in India. The ideal candidate will be responsible for collecting and analyzing data to support our business objectives and drive decision-making.

Responsibilities
- Collecting, processing, and analyzing data to provide actionable insights.
- Creating detailed reports and visualizations to communicate findings to stakeholders.
- Collaborating with cross-functional teams to understand their data needs and deliver solutions.
- Performing data cleansing and validation to ensure data quality and integrity.
- Utilizing statistical methods to analyze data trends and patterns.

Skills and Qualifications
- Proficiency in SQL and experience with databases.
- Strong knowledge of data visualization tools such as Tableau or Power BI.
- Experience with programming languages such as Python or R for data analysis.
- Familiarity with statistical analysis and methodologies.
- Ability to communicate complex data insights in a clear and concise manner.
- Bachelor's degree in Data Science, Statistics, Mathematics, or a related field.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Req ID: 318519

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Salesforce Data Cloud Specialist to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Title: Salesforce Data Cloud Specialist
Location: Hybrid
Experience: 3-5 years

Job Summary:
We are seeking a skilled Salesforce Data Cloud Specialist to manage and optimize customer data platforms, ensuring seamless integration and data orchestration across Salesforce ecosystems. The ideal candidate will have strong expertise in managing Data Cloud, configuring customer data platforms, and aligning data models with business needs to provide actionable insights.

Key Responsibilities:
- Implement and configure Salesforce Data Cloud to unify and segment customer data effectively.
- Design and manage data models that integrate seamlessly with Salesforce platforms, ensuring high-quality data ingestion and transformation.
- Work closely with stakeholders to understand business requirements and translate them into technical solutions.
- Build and manage data pipelines to aggregate and cleanse data from multiple sources.
- Develop rules for data normalization, identity resolution, and deduplication.
- Ensure data compliance, security, and privacy standards are maintained.
- Collaborate with marketing, sales, and analytics teams to leverage Data Cloud capabilities to improve customer engagement and personalization.
- Troubleshoot and optimize Data Cloud performance, ensuring timely resolution of issues.

Required Skills and Qualifications:
- Strong hands-on experience with Salesforce Data Cloud (formerly Customer Data Platform).
- Proficiency in data modeling, ETL processes, and data integration with Salesforce ecosystems.
- Knowledge of Salesforce CRM, Marketing Cloud, and related modules.
- Experience with API integrations and data connectors.
- Familiarity with identity resolution and customer segmentation techniques.
- Strong understanding of data governance, privacy, and compliance requirements.
- Analytical mindset with the ability to derive actionable insights from data.
- Excellent communication and collaboration skills.

Preferred Skills:
- Salesforce certifications such as Salesforce Certified Data Cloud Specialist or related certifications.
- Hands-on experience with SQL, Python, or other data manipulation tools.
- Familiarity with AI/ML models for predictive analytics in customer data.

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

#Salesforce

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
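The normalization, identity resolution, and deduplication rules this role calls for can be sketched in a few lines of plain Python (an illustrative example, not Data Cloud's own rule engine; the match key and sample records are hypothetical):

```python
# Illustrative sketch of normalization + identity resolution: records that
# share a normalized email are merged into a single unified profile.
# The match key (lower-cased email) and the sample data are hypothetical.

def normalize_email(email: str) -> str:
    return email.strip().lower()

def resolve_identities(records):
    """Merge records sharing a normalized email into one profile each."""
    profiles = {}
    for rec in records:
        key = normalize_email(rec["email"])
        profile = profiles.setdefault(key, {"email": key, "sources": []})
        profile["sources"].append(rec["source"])
    return list(profiles.values())

records = [
    {"email": "Ana@Example.com ", "source": "crm"},
    {"email": "ana@example.com", "source": "marketing"},
    {"email": "raj@example.com", "source": "crm"},
]
profiles = resolve_identities(records)   # two unified profiles, not three
```

Real customer data platforms use richer match rules (fuzzy name matching, phone, address), but the core idea is the same: normalize, key, merge.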
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description:
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Data Engineering Professionals in the following areas:

Experience: 10-12 Years
Location: Pune

Job Summary:
We are seeking a detail-oriented and technically proficient Technical Project Manager (TPM) with a strong background in data engineering, analytics, or data science. The TPM will be responsible for leading cross-functional teams to deliver data-centric projects on time, within scope, and within budget. This role bridges the gap between business needs and technical execution, ensuring alignment across stakeholders.

Key Responsibilities:
- Lead end-to-end project management for data and engineering initiatives, including planning, execution, and delivery of data-related projects (e.g., data platform migrations, analytics implementations, ML model deployments).
- Collaborate with data engineers, analysts, and business stakeholders to define project scope, goals, and deliverables.
- Develop detailed project plans, timelines, and resource allocations.
- Manage project risks, issues, and changes to ensure successful delivery.
- Ensure data quality, governance, and compliance standards are met.
- Facilitate communication across technical and non-technical teams.
- Track project performance using appropriate tools and techniques.
- Conduct post-project evaluations and implement lessons learned.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- 5+ years of experience in project management, with at least 2 years managing data-focused projects.
- Strong understanding of data pipelines, ETL processes, cloud platforms (e.g., AWS, Azure), and data governance.
- Proficiency with project management tools (e.g., Jira, MS Project).
- Excellent communication, leadership, and stakeholder management skills.
- Familiarity with BI tools (e.g., Power BI, Tableau).
- PMP or Agile/Scrum certification is a plus.

Required Technical/Functional Competencies:
- Change Management: Specialized in overcoming resistance to change and helping organizations achieve their Agile goals. Able to guide teams in driving change management projects or requirements.
- Customer Management: Specialized knowledge of the customer's business domain and technology suite. Uses the latest technology, communicates effectively, demonstrates leadership, presents technical offerings, and proactively suggests solutions.
- Delivery Management: Specialized knowledge of deal modeling and commercial and pricing models. Creates an integrated pricing model across service lines, guides team members in applying pricing techniques, grows the account, forecasts revenues, and analyzes complex internal reports. Manages at least one complex account (>10m) or multiple small accounts independently.
- Domain/Industry Knowledge: Specialized knowledge of the customer's business processes and relevant technology platform or product. Able to forecast business requirements and market trends, manage project issues, and validate the customer's strategy roadmap.
- Product/Technology Knowledge: In-depth knowledge of the platform/product and associated technologies. Reviews product-specific solutions for a specific project/client/organization and conducts product demos, walkthroughs, and presentations to prospects if required.
- Profitability Management: Demonstrates competence in applying profitability and cost management techniques. Can develop project budgets, monitor actual costs against the budget, and identify potential cost overruns or deviations. Uses established processes and tools to track and control project expenses.
- Project Management: Extensive experience in managing projects; can handle complex projects with minimal supervision. Deep understanding of project management concepts and methodologies, applied effectively to achieve project goals.
- Scheduling and Resource Planning: Prepares independent global delivery models covering skill levels, skill mix, and onsite/offshore work allocation. Creates an accurate resource plan for people, space, and infrastructure for the given requirements. Forecasts people and skill requirements to align with plans. Optimizes the schedule for complex projects.
- Service Support and Maintenance: Plans and executes transitions for large/complex activities. Defines standards in transition management based on industry trends and contributes to building tools and accelerators for the KT process. Optimizes resource utilization based on customer demand. Selects and defines SLAs, tracks service levels, and analyzes the impact of SLAs on complex processes and deliverables.
- Risk Management: Good understanding of risk management principles and techniques. Identifies, assesses, and documents risks independently, prioritizes risks based on their potential impact, and assists in developing risk mitigation plans and monitoring risk responses.

Required Behavioral Competencies:
- Accountability: Being a role model for taking initiative and ensuring others take initiative, removing obstacles for others, taking ownership of results and deadlines for self and others, and acting as a role model for being responsible.
- Agility: Works with a diverse set of situations, people, and groups and adapts and motivates self and team to thrive in a changing environment.
- Collaboration: Reaches out to others in the team to ensure connections are made and team members are working together. Looks for ways to integrate work with other teams, identifying similarities and opportunities and making necessary changes in work to ensure successful integration.
- Customer Focus: Engages in executive customer discovery to predict future needs of customers, drives customer relationships with a long-term focus, and takes actions to enhance customer loyalty.
- Communication: Communicates and presents complex ideas, information, and data to multiple, broad, and demanding stakeholders internal and/or external to the organization. Helps others communicate better with their audience. Demonstrates honest, direct, and transparent communication and facilitates conversations within the team and its close collaborators.
- Drives Results: Proactively seeks challenging and differentiated opportunities and drives and motivates team members to take on more responsibility.
- Resolves Conflict: Balances the business interests of all stakeholders and manages any conflicts, offering mutually beneficial options.

Certifications: PMP (Project Management Professional), PRINCE2 (Projects in Controlled Environments)

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a Data Engineer (AWS, Confluent & SnapLogic).

Responsibilities:
- Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
- Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
- Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
- Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
- Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
- ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
- Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as having:
- Experience: 3+ years of relevant experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
- Technical Skills: Proficiency in AWS services, particularly AWS Glue; experience with Iceberg tables and Snowflake; knowledge of Confluent Kafka for real-time data streaming; familiarity with SnapLogic for ETL processes; experience with Apache Airflow for workflow management; understanding of Splunk for monitoring and logging.
- Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
- Data Modeling: Experience with data modeling and database design.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred Qualities:
- Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
- Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
- Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
- Team Player: Strong team player with a collaborative mindset.
- Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.
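The orchestration idea behind tools like Apache Airflow, running tasks in dependency order, can be shown in a few lines of plain Python. This is an illustrative sketch only (it is not the Airflow API, omits cycle detection, and uses invented task names):

```python
# Illustrative sketch only: Airflow defines DAGs declaratively, but the core
# idea -- run each task after all of its upstream dependencies -- looks like
# this. The dependency map and task names are hypothetical.

def run_in_order(deps):
    """Return a task execution order respecting 'task depends on' edges."""
    order, done = [], set()

    def visit(task):
        if task in done:
            return
        for upstream in deps.get(task, []):   # run dependencies first
            visit(upstream)
        done.add(task)
        order.append(task)

    for task in deps:
        visit(task)
    return order

# ingest must run before transform, transform before load
pipeline = {"ingest": [], "transform": ["ingest"], "load": ["transform"]}
order = run_in_order(pipeline)   # ['ingest', 'transform', 'load']
```

In Airflow the same dependencies would be declared with operators and `>>` chaining; the scheduler then performs this topological ordering for you.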
Posted 1 week ago
2.0 - 4.0 years
4 - 6 Lacs
Pune
Work from Office
Role Description
Responsible for the day-to-day maintenance of the application systems in operation, including identifying and troubleshooting application issues and resolving or escalating them. Responsibilities also include root cause analysis, management communication, and client relationship management in partnership with Infrastructure Service Support team members. Ensures all production changes are made in accordance with life-cycle methodology and risk guidelines. Responsible for coaching and mentoring less experienced team members and/or acting as a subject matter expert.

- Has in-depth functional knowledge of the application(s) supported and their interdependencies.
- Is an experienced and detail-oriented engineer capable of integrating product knowledge, research, and testing to answer complex questions about product behavior and provide end-to-end solutions that permanently fix issues.
- Assists customer teams and other team members in understanding how customers can achieve desired outcomes using the applications as they exist today. The output could range from FAQs and knowledge base articles that describe how to operate the product, to end-to-end coding solutions for reported issues.
- Liaises with global stakeholders and vendors to deliver technology solutions as part of the yearly book of work.
- Understands the functional requirements and expectations of the various stakeholders and works toward an appropriate plan of action.
- Works with product vendors and leads upgrades as applicable.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy.

Your key responsibilities
- Researching, designing, implementing, and managing software programs.
- Testing and evaluating new programs.
- Identifying areas for modification in existing programs and subsequently developing these modifications.
- Overseeing resolution of technical issues coming from customer teams; fixing and delivering on customer issues.
- Following ITIL processes, including incident management, change management, release management, problem management, and knowledge management.
- Strong problem-solving skills with good communication skills; ability to work under pressure with a high sense of urgency.
- Proactively identifying potential incidents, problems, and availability issues.
- Managing any IT security incidents that may occur in the application.
- Identifying risks and issues and contributing to Service Management-related audits.
- Performing environment maintenance and management; deploying software tools, processes, and metrics.
- Performing standard recurring activities such as data and environment refreshes.
- Being a liaison between the customer-facing teams and the Product and Engineering organization for management and resolution of all technical questions and issues.
- Working closely with other developers and business and systems analysts.
- Maintaining detailed documentation, ranging from knowledge base articles to live logging of incidents for post-mortems.
- Ensuring delivery timelines and SLA obligations established with internal and external stakeholders are observed and met; escalating as necessary using judgment and discretion.
- Developing a deep understanding of the application platform across all product lines and clearly articulating support decisions and findings.
- Working closely with internal teams to stay up to date on product features, changes, and issues.

Your skills and experience
- 6+ years of total experience, with at least 5 years in software development/support engineering.
- Advanced knowledge of Java / C# / .NET debugging and scripting (PowerShell, Unix shell, or similar).
- Advanced knowledge of MS SQL Server, SSIS, Tableau, and ETL processes.
- Working knowledge of SDLC and Agile processes.
- Demonstrable experience in leading projects to successful conclusions.
- Strong customer focus with experience working with cross-functional/cross-department teams.
- A self-starter with strong organization skills, resolution management, and superior written and verbal communication skills.

Educational/Qualifications:
- B.E. / B.Tech. / Master's degree in computer science or equivalent.
- ITIL Certification is good to have.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Mumbai
Hybrid
Employment Type: Full-Time

About the role
The Intermediate Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery.

Key Responsibilities
- Requirements Elicitation and Analysis: Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis. Analyze complex data flows and business processes to define clear and concise requirements. Create detailed requirement specifications, user stories, and acceptance criteria for both web application and data engineering components.
- Business Process Design and Improvement: Define and document business processes, workflows, and data models. Identify areas for process optimization and automation within web and data solutions. Collaborate with stakeholders to design solutions that align with business objectives.
- Stakeholder Communication and Collaboration: Serve as a liaison between business stakeholders, development teams, and data engineering teams. Facilitate communication and collaboration to ensure stakeholder alignment and understanding. Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions.
- Solution Validation and Quality Assurance: Ensure requirements traceability throughout the project lifecycle. Validate and test solutions to ensure they meet business needs and objectives. Collaborate with QA teams to define testing strategies and acceptance criteria.

Primary Skills
- Business Analysis: Requirement gathering, process modeling, and gap analysis.
- Documentation: User stories, functional specifications, and acceptance criteria.
- Agile Methodologies: Experience in Agile/Scrum environments.
- Mainframe Environment: Conversant with logging in, reviewing file layouts, and analyzing EDI layout mappings.
- Stakeholder Management: Effective communication and collaboration with cross-functional teams.
- Data Analysis: Ability to analyze and interpret complex data flows and business processes.

Secondary Skills
- Data Engineering: Understanding of data pipelines in Azure DevOps, ETL processes, and data modeling.
- Database: DB2.
- Query Languages: SQL, PL/SQL.
- Communication Skills: Excellent verbal and written communication for stakeholder engagement.

Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams; relationship building and prioritization.

Educational and Preferred Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications such as Certified Business Analysis Professional (CBAP) or PMI Professional in Business Analysis (PMI-PBA).
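The mainframe file-layout analysis mentioned here usually means fixed-width records: each field lives at a fixed offset with a fixed length. As an illustrative sketch (the layout and sample record are made up, not an actual EDI specification):

```python
# Illustrative sketch: parsing one fixed-width record the way a BA might
# verify a mainframe file layout. The (field, offset, length) layout and the
# sample line are hypothetical.

LAYOUT = [
    ("record_type", 0, 2),
    ("account_id", 2, 8),
    ("amount", 10, 7),
]

def parse_record(line: str) -> dict:
    """Slice a fixed-width line into named, stripped fields."""
    return {name: line[off:off + length].strip() for name, off, length in LAYOUT}

row = parse_record("01ACC12345 123.45")
# row == {"record_type": "01", "account_id": "ACC12345", "amount": "123.45"}
```

Documenting a layout as (name, offset, length) triples like this makes the mapping between the copybook and downstream ETL fields explicit and testable.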
Posted 1 week ago
4.0 - 9.0 years
5 - 12 Lacs
Pune
Remote
We are urgently hiring an Ab Initio Admin.

Experience: 4+ years
Work Mode: Remote
Notice Period: Immediate / Serving

Role & responsibilities
- Strong knowledge of Ab Initio tools, including Co>Operating System (Co>Op), EME, GDE, and other components.
- Understanding of data warehousing concepts, ETL processes, and data integration best practices.
- Manage all Ab Initio environments, including security, cluster setup, performance tuning, and continuous monitoring.
- Perform cluster maintenance activities such as patching, Co>Op upgrades, EME backup and recovery, user provisioning, and automation of routine tasks.
- Troubleshoot failed jobs and configure/maintain security policies effectively.
- Knowledge of SQL for querying and manipulating data.

Note: Interested candidates, kindly share your resume at priyanka.bhosale@talentsketchers.com
Posted 1 week ago
2.0 - 5.0 years
2 - 5 Lacs
Mumbai, Maharashtra, India
On-site
AMK Group is looking for a talented SQL Developer to join our dynamic team and play a crucial role in our product rollout initiatives. You'll be responsible for designing, developing, and maintaining robust SQL databases, ensuring data integrity, performance, and seamless data flow. If you're passionate about data, skilled in optimization, and thrive in a collaborative environment, we encourage you to apply! Key Responsibilities SQL Database Design and Development: Design and develop SQL databases and data models to support product rollout initiatives. Create efficient and scalable database structures, tables, views, and stored procedures, ensuring data integrity and performance. Data Migration and ETL: Extract, transform, and load (ETL) data from various sources into target databases for new product rollouts. Develop SQL scripts and procedures to migrate data accurately and efficiently, ensuring data quality and consistency. Data Analysis and Reporting: Analyze data using complex SQL queries to derive insights and generate reports for stakeholders. Develop advanced SQL queries and views to extract and aggregate data for comprehensive reporting and analysis purposes. Performance Optimization: Identify and optimize SQL queries, indexes, and database structures to significantly improve query performance and overall system efficiency. Monitor and tune SQL performance, ensuring efficient data retrieval and processing. Database Maintenance and Administration: Perform routine database maintenance tasks, such as backups, data archiving, database monitoring, and user access management. Ensure data security, implement data retention policies, and perform regular health checks and optimizations. Quality Assurance and Testing: Collaborate with testing teams to ensure the quality and reliability of SQL scripts and database changes. Develop and execute SQL-based test cases to validate data integrity, accuracy, and functionality. 
Documentation and Knowledge Sharing: Create and maintain thorough documentation related to database design, SQL scripts, data migration processes, and other relevant information. Share knowledge and best practices with the team to ensure efficient collaboration and knowledge transfer. Collaboration and Communication: Work closely with cross-functional teams, including developers, product managers, and business stakeholders, to understand requirements, provide technical guidance, and support the product rollout process. Communicate effectively to provide updates, address concerns, and ensure successful project delivery. Continuous Improvement: Stay updated with the latest SQL and database technologies, trends, and best practices. Identify opportunities for process improvement, automation, or innovation to enhance the efficiency and effectiveness of the product rollout process. Required Skills & Experience Proven experience as a SQL Developer. Strong expertise in SQL database design and development. Hands-on experience with Data Migration and ETL processes using SQL scripts. Proficiency in data analysis and reporting using complex SQL queries. Demonstrated ability in performance optimization of SQL queries and database structures. Experience with database maintenance and administration tasks. Familiarity with quality assurance and testing methodologies for SQL databases. Excellent communication and collaboration skills. A proactive approach to continuous improvement in database technologies.
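The index-driven performance tuning this posting describes can be illustrated with a small, self-contained sketch. It uses Python's built-in sqlite3 module purely for demonstration; the table and index names are invented, and production tuning would of course target the actual database engine in use:

```python
import sqlite3

# In-memory database with a sample table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()
print(plan_before)  # the plan detail typically reports a full SCAN of orders

# Adding an index lets the engine seek directly to the matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()
print(plan_after)  # the plan now reports a SEARCH using idx_orders_customer
```

Comparing the two query plans before and after index creation is the same workflow a SQL Developer applies with `EXPLAIN` output in SQL Server, Oracle, or MySQL.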
Posted 1 week ago
3.0 - 8.0 years
2 - 6 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. We have 4,500+ talented professionals operating across 45 countries, including India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets such as Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, and open culture that prioritizes flexible work-life balance, diverse and inclusive teams, and equal opportunities for all. What you will be doing The Evalueserve Data Operations Analyst is responsible for performing various data management and analysis functions. Candidates should have 5-7 years of overall industry experience, primarily providing the analytical and technical skills necessary to innovate, build, and maintain well-managed data solutions, and 4-6 years of experience in the banking and financial data domain, especially in data analysis and data/regulatory reconciliation, solving data quality and business problems within a risk-and-controls data governance framework. Key data deliverables: Strong experience in BFSI with knowledge of banking products and services. Duties include monitoring of data quality exceptions, data quality controls, data quality remediation, issue management, handling user inquiries, execution of DQ reports, dashboards, and metrics, and creating sufficient documentation by facilitating data certification. Also responsible for demonstrating thought leadership and coaching junior team members in the performance of their duties.
Work with business and technology teams to monitor, triage, and prioritize exceptions, data/financial variances, data or product instruments, and data quality, remediation, and reporting problems. Triage and prioritize data issues, engaging partners/vendors as needed to initiate problem resolution. Work closely with operational/business owners and the IT team on regular BAU activities to track, fix, and maintain balance variances and the causes of data breaches. Open service requests to technology teams for bulk or process remediations. Work closely with SOX and audit teams to walk through data controls, necessary preparations, and data certification sign-off. Sound knowledge of or experience in SQL, data lineage, source systems, and BRD/FRD. Entrusted with improving, and maintaining a high level of, data quality; identifies any issues or anomalies in the quality of data within the bank's systems. Experience in DQ/DG tools like IDQ and Collibra. Support project planning, management, and requirements to ensure best practices are followed with regard to data governance. Definition and implementation of data quality controls, and experience in the development of dashboards and metrics using Tableau or Power BI. Makes recommendations to ensure adherence to consistent definitions, business naming standards, development of standard calculations, security requirements, etc. Prioritization of data quality issues for remediation. What we're looking for Preferred: a data analyst with a bachelor's degree in Analytics, Technology, Information Management, Computer Science, Data Science, or a Mathematics-related field; 5-10 years of overall industry experience; and 4-6 years of experience in data analysis in the banking and finance industries using data governance or data quality skills. Demonstrated experience in an Enterprise Data Office under a data governance framework, defining KDEs/CDEs and writing, analyzing, and executing data quality rules.
Strong experience in understanding databases and SQL query execution for data extraction and framing the data lineage document. Experience with regulated systems in the financial industry is a plus. Knowledge of data element review in a data governance glossary, data dictionary, or any other data governance tool or framework. Disclaimer: This job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modification to align with evolving circumstances.
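The "writing, analyzing and executing data quality rules" work described in this posting can be sketched generically. The rule names, field names, and thresholds below are illustrative assumptions, not tied to IDQ, Collibra, or any particular bank's data model:

```python
# Minimal sketch of executing data-quality rules against records and
# collecting exceptions for triage, in the spirit of the DQ monitoring
# described above. All field names and rules are hypothetical.
records = [
    {"account_id": "A1", "balance": 1200.0, "currency": "USD"},
    {"account_id": "A2", "balance": None, "currency": "USD"},
    {"account_id": "", "balance": -50.0, "currency": "EUR"},
]

rules = {
    "account_id_not_blank": lambda r: bool(r["account_id"]),
    "balance_present": lambda r: r["balance"] is not None,
    "balance_non_negative": lambda r: r["balance"] is not None and r["balance"] >= 0,
}

def run_dq_rules(records, rules):
    """Return a list of (record_index, rule_name) exceptions for triage."""
    exceptions = []
    for i, record in enumerate(records):
        for name, check in rules.items():
            if not check(record):
                exceptions.append((i, name))
    return exceptions

exceptions = run_dq_rules(records, rules)
print(exceptions)
# Record 1 fails both balance rules; record 2 fails the blank-id
# and negative-balance rules.
```

In a real DQ framework the exception list would feed the dashboards, metrics, and remediation queues the posting mentions, rather than a simple print.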
Posted 1 week ago
4.0 - 10.0 years
6 - 12 Lacs
Pune, Maharashtra, India
On-site
Responsible for detailed analysis; creation of operations reporting, dashboards, and macros; automation of reports and the work necessary to feed data into our reporting tools; and helping to visualize that data in ways that allow managers to take action on a daily basis. Position Responsibilities and Essential Duties (other duties may be assigned): Automate as much of operations reporting and metrics as possible. Create management-level dashboards for key metrics. Provide accurate data analysis and reporting to support production operations. Manage operational or special projects assigned for business improvement and process efficiency. Coordinate with on/offshore personnel, counterparts, and different departments to ensure alignment. Produce analysis relating to contract review, pricing review, billing, and performance evaluation data and reports as needed. Excellent advanced Excel and Google Sheets skills, including the ability to work with Google Data Studio. Who you will be partnering with: Pune Operations AVPs and their teams; Internal Tools Development; Manila Research and Analytics team; Executive Leadership Team
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Overview: We are seeking a highly skilled Senior ODI Developer with strong hands-on experience in SQL, PL/SQL, and Oracle Data Integrator (ODI) projects. The ideal candidate will design, implement, and optimize ETL processes to meet evolving business needs. Prior experience in FLEXCUBE or any other core banking projects is a significant advantage. Skills and Qualifications: Mandatory Skills: Strong hands-on experience with Oracle Data Integrator (ODI) development and administration. Proficiency in SQL and PL/SQL for complex data manipulation and query optimization. Experience deploying and managing ODI solutions. Deep understanding of ETL processes, data warehousing concepts, and data integration. Preferred Experience: Hands-on experience in banking domain projects, with knowledge of domain-specific data structures. Experience in integrating on-premise data sources. Other Skills: Strong problem-solving and debugging skills. Excellent communication and teamwork abilities. Knowledge of Agile methodologies and DevOps practices. Education and Experience: Bachelor's degree in Computer Science, Information Technology, or a related field. 4 to 6 years of experience in ODI development projects. Domain experience in FLEXCUBE or any other core banking projects is an added advantage. Responsibilities: Design, develop, and deploy ETL processes using Oracle Data Integrator (ODI). Configure and manage ODI instances, ensuring optimal performance and scalability. Develop and optimize complex SQL and PL/SQL scripts for data extraction, transformation, and loading. Implement data integration solutions connecting diverse data sources: on-premise systems, APIs, and flat files. Monitor and troubleshoot ODI jobs to ensure seamless data flow and resolve any issues promptly. Collaborate with data architects and business analysts to understand integration requirements and deliver robust solutions. Conduct performance tuning of ETL processes, SQL queries, and PL/SQL procedures.
Prepare and maintain detailed technical documentation for developed solutions. Adhere to data security and compliance standards. Provide guidance and best practices for ODI projects. Career Level - IC2
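The extract-transform-load cycle at the heart of this role can be sketched in plain Python with the built-in sqlite3 module. ODI implements these stages declaratively through mappings and knowledge modules rather than hand-written loops, and every table and column name below is invented for illustration:

```python
import sqlite3

# Source (staging) and target connections; in-memory for the sketch.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE staging_customers (id INTEGER, name TEXT, country TEXT)")
src.executemany(
    "INSERT INTO staging_customers VALUES (?, ?, ?)",
    [(1, "  Asha  ", "in"), (2, "Ravi", "IN"), (3, None, "in")],
)

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, country TEXT)")

def transform(row):
    """Cleanse: trim names, upper-case country codes, reject rows without a name."""
    cid, name, country = row
    if name is None:
        return None  # rejected row; a real flow would route this to an error table
    return (cid, name.strip(), country.upper())

# Extract, transform, load.
rejected = 0
for row in src.execute("SELECT id, name, country FROM staging_customers"):
    cleaned = transform(row)
    if cleaned is None:
        rejected += 1
        continue
    tgt.execute("INSERT INTO dim_customer VALUES (?, ?, ?)", cleaned)
tgt.commit()

loaded = tgt.execute("SELECT id, name, country FROM dim_customer ORDER BY id").fetchall()
print(loaded)   # [(1, 'Asha', 'IN'), (2, 'Ravi', 'IN')]
print(rejected) # 1
```

The rejected-row count is the kind of signal the posting's "monitor and troubleshoot ODI jobs" duty would surface through an operator console instead of a counter variable.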
Posted 1 week ago
0.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Principal Consultant - Test Automation Lead / Data Transformation/ETL Program! Overview: We are currently looking for a highly skilled and motivated Test Automation Lead to lead the automation testing efforts for our Data Transformation/ETL program. The successful candidate will be responsible for leading the design, development, and implementation of automation frameworks and tests to ensure the completeness, accuracy, and timeliness of our data processing systems. This pivotal role involves close collaboration with the test management team, developers, and business analysts to create efficient, scalable, and reliable test automation solutions. Key Responsibilities Automation Strategy & Framework Development: Lead the development and implementation of a comprehensive test automation strategy tailored to the Data Transformation/ETL program's specific needs. Design, develop, and maintain robust automation frameworks that facilitate efficient script creation, execution, and maintenance. Identify opportunities for automation within the software development process and implement solutions that enhance testing efficiency and coverage. Test Automation & Execution: Develop and maintain automated test scripts for data validation, ETL processes, and end-to-end data workflows. Ensure automated tests are integrated into the CI/CD pipeline for continuous testing and feedback.
Monitor and analyze automated test results, identifying issues and areas for improvement in data quality and process efficiency. Team Collaboration & Leadership: Work closely with the test management team to align automation efforts with overall testing strategies and objectives. Collaborate with developers and business analysts to understand system requirements and ensure automated tests accurately reflect business logic and data integrity needs. Mentor and support team members in automation best practices and tools, fostering a culture of quality and continuous improvement. Quality Assurance & Reporting: Provide detailed reports on automation metrics, including test coverage, defect detection rates, and automation ROI. Communicate automation progress, challenges, and outcomes to stakeholders, ensuring transparency and alignment with program goals. Qualifications we seek in you! Minimum Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. Experience in test automation, with a focus on data and ETL processes. Strong experience in designing and implementing test automation frameworks and strategies (not UI/API test automation frameworks, but data-related frameworks). Proficiency in SQL and experience with database technologies. Hands-on experience with test automation tools and scripting languages (e.g., Python). Excellent analytical, problem-solving, and communication skills. Ability to work collaboratively in a team environment and manage multiple priorities. Preferred Skills: Experience with cloud-based data warehousing solutions, such as AWS Redshift, Google BigQuery, or Azure Synapse Analytics. Knowledge of Agile methodologies and experience working in an Agile environment. Certifications in Quality Assurance or Test Automation (e.g., ISTQB Advanced Level Test Automation Engineer).
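The "completeness and accuracy" checks such a data-focused automation framework runs can be sketched as follows. The reconciliation logic (row counts plus an order-independent content checksum between source and target) is a common pattern; the sample data and function names are illustrative assumptions:

```python
import hashlib

# Sketch of automated ETL reconciliation checks: row-count comparison
# plus a simple order-independent content checksum between source and
# target extracts. The rows below stand in for query results.
source_rows = [("1", "alice", "2024-01-01"), ("2", "bob", "2024-01-02")]
target_rows = [("2", "bob", "2024-01-02"), ("1", "alice", "2024-01-01")]

def row_checksum(rows):
    """Order-independent checksum: hash each row, XOR the digests together."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(row).encode()).hexdigest()
        acc ^= int(digest, 16)
    return acc

def validate(source, target):
    """Return named pass/fail results, as a test framework would report them."""
    return {
        "row_count_match": len(source) == len(target),
        "checksum_match": row_checksum(source) == row_checksum(target),
    }

results = validate(source_rows, target_rows)
print(results)  # both checks pass despite the differing row order
```

Wrapped in pytest assertions and pointed at real source and warehouse queries, checks like these are what get wired into the CI/CD pipeline the posting describes.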
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 week ago
4.0 - 7.0 years
6 - 9 Lacs
Pune
Work from Office
The IT Applications Analyst Senior provides comprehensive application functionality, configuration, and support expertise for software solutions related to Procurement IT, including Procurement Analytics and applications. This role partners with business analysts, architects, technical teams, and software vendors to understand business requirements and deliver scalable solutions across Direct and Indirect Procurement. The role emphasizes analytics and reporting in Source-to-Contract (S2C) and Procure-to-Pay (P2P) processes, while supporting continuous improvements and innovation across procurement IT systems. Key Responsibilities: Analyze procurement analytics requirements and identify internal and external data sources. Collaborate with product owners to design analytical dashboards, business rules, and wireframes. Manage data intake and transformation from multiple sources to support procurement analytics. Lead or support the configuration, design, implementation, and enhancement of application software solutions. Evaluate application functionality and recommend increased utilization of standard capabilities. Oversee application setup, configuration, testing, and optimization to meet business requirements. Serve as the subject matter expert on procurement applications, dashboards, and reporting tools. Work closely with vendors and internal stakeholders to resolve application issues and improve system performance. Create and maintain functional specifications and technical documentation for systems and processes. Provide functional support for system upgrades, maintenance, and testing activities. External Qualifications and Competencies Experience: 4-7 years of relevant work experience required. Core Technical Skills: Strong experience in Procurement Analytics with an understanding of S2C and P2P supply chain processes. Hands-on expertise with SQL and BI tools such as Power BI; experience with Snowflake and the Power BI Service is good to have.
Proficiency in SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and data visualization/reporting. Experience designing, building, and maintaining dashboards and reports, including wireframing. Strong understanding of data modeling, ETL processes, and data engineering tools (e.g., SSIS, Databricks - preferred). Exposure to AI/ML concepts and tools is a plus. Preferred Skills: Exposure to procurement operations and systems. Familiarity with ERP platforms like Oracle or SAP. Strong technical writing and documentation skills. Skilled in requirements gathering and cross-functional collaboration. Relevant certifications or equivalent work experience in BI tools, data analytics, or procurement systems are an advantage. Qualifications, Skills, and Experience: Education, Licenses, Certifications: College, university, or equivalent degree in Computer Science, Information Technology, Business, or a related field required.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Ajmer
Work from Office
This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals. Responsibilities: Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security. Team Leadership: Provide training and development to enhance the team's skills in data management and reporting. Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies. Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives. Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility. Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy. Qualifications: Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role. Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM). Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions.
Knowledge of ETL processes and familiarity with cloud data platforms is a plus. Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights. Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders. Strong strategic thinking and problem-solving skills. Enthusiasm for working across cultures, functions, and time zones.
Posted 1 week ago
4.0 - 9.0 years
42 - 78 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
We are looking for a highly skilled Ab Initio Developer to join our data engineering team. The ideal candidate will have extensive experience in ETL development, data integration, and performance optimization using Ab Initio. Key Responsibilities: ETL Development: Design, develop, and optimize ETL processes using Ab Initio's Graphical Development Environment (GDE) to ensure data accuracy and availability. Data Integration: Build, maintain, and optimize data workflows to enable seamless data flow across multiple systems. Data Transformation: Implement data cleansing, enrichment, and transformation logic within Ab Initio graphs. Metadata Management: Utilize Ab Initio's metadata features to document data lineage, transformations, and definitions for compliance and transparency. Performance Optimization: Monitor and optimize ETL processes for efficiency, scalability, and performance, resolving bottlenecks when necessary. Error Handling: Develop error tracking and logging mechanisms to manage ETL job failures effectively. Collaboration: Work closely with data engineers, analysts, and business stakeholders to understand requirements and deliver data integration solutions. Version Control: Use Git or similar version control systems to manage Ab Initio code and collaborate with team members. Documentation: Maintain comprehensive documentation of Ab Initio graphs, processes, best practices, and standards. Troubleshooting & Support: Diagnose and resolve ETL-related issues, conduct root cause analysis, and support the team as needed. Join us and be a part of a dynamic data engineering team that drives high-performance data solutions using Ab Initio! Apply now!
Posted 1 week ago
5.0 - 10.0 years
17 - 30 Lacs
Chennai, Tamil Nadu, India
On-site
About This Role Bounteous x Accolite is a premier end-to-end digital transformation consultancy dedicated to partnering with ambitious brands to create digital solutions for today's complex challenges and tomorrow's opportunities. With uncompromising standards for technical and domain expertise, we deliver innovative and strategic solutions in Strategy, Analytics, Digital Engineering, Cloud, Data & AI, Experience Design, and Marketing. Our Co-Innovation methodology is a unique engagement model designed to align interests and accelerate value creation. Our clients worldwide benefit from the skills and expertise of over 4,000+ expert team members across the Americas, APAC, and EMEA. By partnering with leading technology providers, we craft transformative digital experiences that enhance customer engagement and drive business success. Information Security Responsibilities Promote and enforce awareness of key information security practices, including acceptable use of information assets, malware protection, and password security protocols Identify, assess, and report security risks, focusing on how these risks impact the confidentiality, integrity, and availability of information assets Understand and evaluate how data is stored, processed, or transmitted, ensuring compliance with data privacy and protection standards (GDPR, CCPA, etc.) 
Ensure data protection measures are integrated throughout the information lifecycle to safeguard sensitive information. Roles and Responsibilities Serve as a platform expert in Adobe Experience Platform Real-Time CDP Provide deep domain expertise in client business, technology stack, and data infrastructure, with a broad knowledge base in digital marketing Assess and audit the current state of client's marketing technology stack Strategize, architect, and document a scalable RTCDP implementation, tailored to client business needs Translate business requirements into technical specifications Lead technical delivery; provide guidance to Data Engineers Support the implementation of proper data governance 5+ years of experience architecting and building data pipelines 3+ years of experience working in an agency environment; strong consulting, client-facing skills Hands-on experience configuring Adobe Experience Platform RTCDP, including creating schemas, ingesting data from a variety of sources, configuring identity resolution, and connecting destinations for activation Experience with Adobe Tags (formerly called Launch) and AEP WebSDK implementation Experience working with APIs Strong understanding of customer data platforms and the modern data infrastructure Experience working with cloud technologies such as AWS, Google Cloud, Azure, etc. Experience working with data warehouse solutions such as Amazon Redshift, Google BigQuery, Snowflake, etc. Experience with Adobe Target is a plus. Experience with data visualization tools such as Tableau, Power BI, etc. is a plus. Experience with marketing automation tools is a plus. A degree in Computer Science, Data Science, Analytics, or a related field, or equivalent work experience, is preferred.
Posted 1 week ago
7.0 - 15.0 years
20 - 36 Lacs
Chennai, Tamil Nadu, India
On-site
About Bounteous (https://www.bounteous.com/): Founded in 2003 in Chicago, Bounteous is a leading digital experience consultancy that co-innovates with the world's most ambitious brands to create transformative digital experiences. With services in Strategy, Experience Design, Technology, Analytics and Insight, and Marketing, Bounteous elevates brand experiences through technology partnerships and drives superior client outcomes. For more information, please visit www.bounteous.com
Preferred Qualifications 7+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. Working knowledge of ETL technology - Talend / Apache NiFi / AWS Glue. Experience with relational SQL and NoSQL databases. Experience with big data tools: Hadoop, Spark, Kafka, etc. (nice to have). Advanced Alteryx Designer (mandatory at this point - relaxing that would be tough). Tableau dashboarding. AWS (familiarity with Lambda, EC2, AMI). Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (nice to have). Experience with cloud services: EMR, RDS, Redshift, or Snowflake. Experience with stream-processing systems: Storm, Spark Streaming, etc. (nice to have). Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc. Responsibilities Work with Project Managers, Senior Architects, and other team members from Bounteous and client teams to evaluate data systems and project requirements. In cooperation with platform developers, develop scalable and fault-tolerant Extract Transform Load (ETL) and integration systems for various data platforms which can operate at appropriate scale, meeting security, logging, fault-tolerance, and alerting requirements. Work on data migration projects. Effectively communicate data requirements of various data platforms to team members. Evaluate and document existing data ecosystems and platform capabilities. Configure CI/CD pipelines. Implement proposed architecture and assist in infrastructure setup.
Posted 1 week ago
3.0 - 7.0 years
7 - 18 Lacs
Chennai, Tamil Nadu, India
On-site
As an AEP Developer, you will play a key role in driving the successful implementation and optimization of Adobe Experience Platform (AEP) solutions. By collaborating with a multidisciplinary team of Consultants, Solution Architects, Data Scientists, Digital Marketers, and IT teams, you will integrate diverse data sources, develop comprehensive customer profiles, and design innovative, data-driven marketing strategies. Your contributions will be instrumental in unlocking the full potential of customer data platforms and enhancing overall marketing effectiveness.
Role and Responsibilities Design and implement effective data ingestion pipelines within Adobe Experience Platform to facilitate seamless integration of a variety of data sources. Work with the Solution Architect to extract, transform, and load marketing and customer data into the platform in an automated and scalable manner. Proactively monitor and troubleshoot AEP implementations, offering support and resolving technical issues as they arise. Work together with cross-functional teams to establish data strategies and promote best practices for data management and governance. Develop deep expertise in our clients' data infrastructure and partner with the respective teams. Provide valuable training and support to internal teams on AEP functionalities and best practices, fostering a culture of continuous learning. Keep abreast of updates and enhancements to Adobe Experience Platform, leveraging new features to refine and improve current processes. Preferred Qualifications College degree in Computer Science, Data Science, Analytics, or a related field. 3+ years of experience working with Adobe Experience Platform, including hands-on experience in schema creation, identity configuration, data ingestion, data transformation, data debugging and troubleshooting, and custom development and scripting. Strong knowledge of data modeling, customer segmentation, and analytics principles. Proficiency in SQL and at least one programming language such as Python, Java, or JavaScript. Familiarity with Adobe Analytics, Customer Journey Analytics, WebSDK and MobileSDK, Adobe Target, and other Adobe Experience Cloud products is advantageous. Experience working with cloud technologies (AWS, Google Cloud, Azure, etc.), especially products for building and managing data pipelines.
Strong understanding of customer data platforms Exposure to Spark, Hadoop, and other big data technologies is a plus Excellent analytical skills with the capability to interpret complex data sets effectively. Strong communication and interpersonal skills to foster collaboration with various stakeholders. A problem-solving mindset with a keen attention to detail, making you proactive in identifying and addressing challenges.
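The ingestion responsibilities above boil down to mapping source records onto a target schema before loading them in batches. As a loose, hypothetical illustration (plain Python, stdlib only; the field names and mapping are invented and are not the actual AEP XDM schema or API), a pre-ingestion mapping step might look like:

```python
import json

# Hypothetical source-to-target field mapping; real AEP ingestion maps
# source attributes onto XDM schema fields defined in the platform.
FIELD_MAP = {"email_addr": "email", "fname": "firstName", "lname": "lastName"}

def map_record(source: dict) -> dict:
    """Rename source fields to target names, dropping anything unmapped."""
    return {target: source[src] for src, target in FIELD_MAP.items() if src in source}

def to_batch_lines(records: list) -> str:
    """Serialize mapped records as newline-delimited JSON for a batch upload."""
    return "\n".join(json.dumps(map_record(r), sort_keys=True) for r in records)

rows = [{"email_addr": "a@example.com", "fname": "Asha", "ignored": 1}]
print(to_batch_lines(rows))  # {"email": "a@example.com", "firstName": "Asha"}
```

The same rename-validate-serialize shape recurs whatever the actual target schema is; only the mapping table changes.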
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Job Description:
Major Accountabilities of Position:
Designing, developing, and maintaining business intelligence solutions.
Data modelling, and crafting and executing queries.
Good knowledge of and experience with data integration tools such as SSIS.
Data analysis through reports and visualization.
Building and deploying Power BI apps for multiple customers.

Knowledge / Experience / Competencies Required:
Technical Skills:
Power BI DAX and Power Query M-code.
Building data models in Power BI.
Preferably experience in an ETL tool such as SSIS.
Knowledge of the Microsoft MSBI stack: SSAS (tabular model), SSIS, SSRS.
Experience deploying reports, dashboards, and apps in Power BI.
Solid understanding of data warehousing and BI concepts.
Working knowledge of Office 365 groups/workspaces.
Knowledge of databases such as MySQL, Postgres, MS SQL, and Oracle, and hands-on experience writing SQL queries.
Experience with another visualization tool such as Tableau or QlikView would be an added advantage.
Knowledge of .NET, Python, or JavaScript would be an added advantage.

People Skills:
Engagement as a proactive member of the workgroup and a team player.
Professional and open communication with all internal and external stakeholders.
Ability to take instruction and perform tasks accurately with minimal guidance.
Reports to management accurately, in a timely and effective manner.

Other:
Ability to quickly and accurately troubleshoot complex, multi-component issues, providing appropriate cost-effective resolutions and solutions.
Ability to abstract application-specific components into a generic solution, enabling re-use.

Education: BE, BTech, MCA, or MTech only
Interview Rounds: 2 or 3 technical rounds
Experience Level: 5 to 6 Years
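The SQL requirement above is about the kind of aggregate queries that feed a report or a Power BI data model. A minimal sketch, with Python's stdlib sqlite3 standing in for the MySQL/MS SQL/Oracle databases the posting names, and an invented schema:

```python
import sqlite3

# In-memory sqlite3 stands in for MySQL/MS SQL/Oracle here; schema is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 50.0)],
)

# A typical aggregate a report or Power BI data model might consume.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('North', 200.0), ('South', 50.0)]
```

The GROUP BY/ORDER BY shape is portable across all the engines listed; only connection details differ.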
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Delhi, India
On-site
Job Description:
As a Senior Database Administrator, you will be responsible for managing and maintaining highly available EDB PostgreSQL and MySQL database environments. Your role will include database architecture design, performance tuning, replication setup, backup and recovery, and ensuring data security. You will also provide expertise in cloud migrations and ensure database environments meet high availability and disaster recovery requirements.

Key Responsibilities:
Database Management:
Install, configure, and maintain EDB PostgreSQL and MySQL databases.
Manage database schemas, perform upgrades, and apply patches.
Monitor database health, availability, and performance across multi-node clusters.
Ensure high availability and disaster recovery setups using replication, clustering, and failover mechanisms.

Performance Tuning and Optimization:
Conduct performance tuning and optimization for SQL queries, indexes, and overall database design.
Monitor and optimize resource utilization (CPU, memory, storage) for both EDB PostgreSQL and MySQL.
Troubleshoot and resolve performance bottlenecks and database-related application issues.

Backup and Recovery:
Implement robust backup strategies, including full, incremental, and point-in-time backups.
Plan and perform database recovery operations to ensure minimal data loss.
Use tools such as pgBackRest, WAL archiving, mysqldump, and physical and logical backup techniques.

Replication and Clustering:
Set up and maintain replication (e.g., streaming replication, logical replication) and clustering solutions.
Configure and manage failover solutions to enhance database resilience and availability.
Administer and troubleshoot EDB Postgres Advanced Server tools such as EFM (EDB Failover Manager), Pgpool, and PgBouncer.

Database Security and Compliance:
Implement security best practices, including role-based access control and encryption.
Ensure compliance with regulatory and security standards.
Manage user access and permissions securely.

Database Cloud Migration:
Lead cloud migration projects to AWS, GCP, or Azure.
Design and implement strategies for minimal-downtime migrations.
Leverage cloud-native features to optimize performance and scalability.

Automation and Scripting:
Develop and maintain scripts for database automation tasks using Bash, Python, or SQL.
Use monitoring tools and write custom scripts to proactively identify issues.

Collaboration and Documentation:
Work closely with development teams to optimize database interactions.
Document database architectures, processes, configurations, and troubleshooting guidelines.
Provide training and guidance to junior DBAs.

Technical Skills:
Database Platforms: EDB PostgreSQL, PostgreSQL, MySQL.
Tools: pgBackRest, WAL archiving, PgBouncer, Pgpool, PEM, MySQL Workbench, MySQL replication, EFM, and monitoring tools such as Zabbix, Nagios, or Prometheus.
Scripting: Proficiency in SQL, Bash, Python, or other automation languages.
Cloud Platforms: AWS, GCP, Azure (experience with cloud-specific database services preferred).
Security: Strong knowledge of database security protocols and compliance standards.
Operating Systems: Linux (preferably Ubuntu, CentOS, or RHEL) and Windows Server.

Soft Skills:
Strong problem-solving and analytical skills.
Effective communication and teamwork abilities.
Proactive approach to learning and applying new technologies.
Occasional on-call availability for emergency troubleshooting and support.

Education: BE, BTech, MCA, or MTech only
Interview Rounds: 2 or 3 technical rounds
Experience Level: 6 to 10 Years
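The query-and-index tuning duty above follows one loop everywhere: read the plan, add an index, read the plan again. A minimal sketch of that loop, with stdlib sqlite3 standing in for PostgreSQL/MySQL (where you would use EXPLAIN or EXPLAIN ANALYZE rather than SQLite's EXPLAIN QUERY PLAN), on an invented table:

```python
import sqlite3

# sqlite3 stands in for PostgreSQL/MySQL here; the workflow, not the engine,
# is the point: inspect the plan, add an index, confirm the plan changed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

def plan(sql: str) -> str:
    """Return the query plan as a single string."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan: no usable index yet
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # now an index search

print("SCAN" in before, "USING INDEX" in after)  # True True
```

On PostgreSQL the same check would compare a Seq Scan node against an Index Scan node in EXPLAIN output.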
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Analyze, design, develop, fix, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

Career Level - IC2

Minimum 4-5 years of hands-on, end-to-end DWH implementation experience using ODI.
Experience developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.
Able to implement reusability, parameterization, workflow design, etc.
Expertise in the Oracle ODI tool set and Oracle PL/SQL; knowledge of the ODI Master and Work repositories.
Knowledge of data modelling and ETL design.
Setting up topology, building objects in Designer, monitoring in Operator, different types of KMs, Agents, etc.
Packaging components and database operations such as Aggregate, Pivot, Union, etc. using ODI mappings, error handling, automation using ODI, Load Plans, and migration of objects.
Design and develop complex mappings, process flows, and ETL scripts.
Well versed and hands-on in using and customizing Knowledge Modules (KMs).
Experience in performance tuning of mappings.
Ability to design ETL unit test cases and debug ETL mappings.
Expertise in developing Load Plans and scheduling jobs.
Ability to design data quality and reconciliation frameworks using ODI.
Integrate ODI with multiple sources/targets.
Experience with error recycling/management using ODI and PL/SQL.
Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications.
Experience creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
Experience in data migration using SQL*Loader and import/export.
Experience in SQL tuning and optimization using explain plans and SQL trace files.
Strong knowledge of ELT/ETL concepts, design, and coding.
Partitioning and indexing strategy for optimal performance.
Good verbal and written communication in English, and good interpersonal, analytical, and problem-solving abilities.
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
Experience understanding sophisticated source system data structures, preferably in the Financial Services industry (preferred).
Ability to work with minimal guidance or supervision in a time-critical environment.

Work with Oracle's world-class technology to develop, implement, and support Oracle's global infrastructure. As a member of the IT organization, assist with the design, development, modification, debugging, and evaluation of programs for use in internal systems within a specific functional area. Duties and tasks are standard with some variation. Completes own role largely independently within defined policies and procedures. BS or equivalent experience in programming on enterprise or department servers or systems.

Life at Oracle and Equal Opportunity:
An Oracle career can span industries, roles, countries, and cultures, giving you the opportunity to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical coverage, life insurance, access to retirement planning, and much more.
We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and in potential roles, to perform crucial job functions. That's why we're committed to creating a workforce where all individuals can do their best work. It's when everyone's voice is heard and valued that we're inspired to go beyond what's been done before.

Disclaimer: Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. This includes being a United States Affirmative Action Employer. https://www.oracle.com/corporate/careers/diversity-inclusion/
Posted 1 week ago
5.0 - 8.0 years
8 - 12 Lacs
Mumbai City, Maharashtra, India
On-site
Description
We are seeking an experienced Data Analyst to join our team in India. The ideal candidate will have a strong background in data analysis, with the ability to extract meaningful insights from complex datasets to drive business decisions.

Responsibilities
Collect, process, and analyze large datasets to identify trends and insights.
Develop and maintain dashboards and reports to support data-driven decision making.
Collaborate with cross-functional teams to understand data needs and provide analytical support.
Utilize statistical techniques to interpret data and provide actionable recommendations.
Ensure data integrity and accuracy by performing data validation and cleansing activities.

Skills and Qualifications
Bachelor's degree in Mathematics, Statistics, Computer Science, or a related field.
5-8 years of experience in data analysis or a related field.
Proficiency in data analysis tools such as SQL, Python, or R.
Experience with data visualization tools like Tableau, Power BI, or similar.
Strong understanding of statistical methods and data modeling techniques.
Excellent problem-solving skills and attention to detail.
Ability to communicate complex data insights in a clear and concise manner.
Familiarity with big data technologies (Hadoop, Spark) is a plus.
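The data validation and cleansing responsibility named above usually means separating parseable rows from malformed ones before any analysis. A minimal sketch in plain Python (stdlib csv only; the column names and sample data are invented for illustration):

```python
import csv
import io

# Hypothetical raw extract; column names are invented for illustration.
raw = """id,revenue
1,100.5
2,not_a_number
3,
4,250.0
"""

def clean(reader):
    """Keep rows whose revenue parses as a float; collect the rest for review."""
    good, bad = [], []
    for row in reader:
        try:
            row["revenue"] = float(row["revenue"])
            good.append(row)
        except (TypeError, ValueError):
            bad.append(row)
    return good, bad

good, bad = clean(csv.DictReader(io.StringIO(raw)))
total = sum(r["revenue"] for r in good)
print(len(good), len(bad), total)  # 2 2 350.5
```

Keeping the rejected rows, rather than silently dropping them, is what makes the cleansing auditable.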
Posted 1 week ago