7.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
7+ years of experience in ETL Testing, Snowflake, and data warehouse (DWH) concepts. Strong SQL knowledge and debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and Xray defect management tools is good to have
- Exposure to the financial domain is considered a plus
- Test data readiness (data quality) and address code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution
- Prior experience with State Street and Charles River Development (CRD) is considered a plus
- Experience with tools such as PowerPoint, Excel, and SQL
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus
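The reconciliation and data-readiness testing described above usually starts by comparing row counts and column totals between a source and a warehouse target. A minimal sketch of that check, using SQLite as a stand-in for Snowflake (table and column names are invented for illustration, not taken from the posting):

```python
import sqlite3

# In-memory database standing in for a source system and a warehouse target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (id INTEGER, amount REAL);
    CREATE TABLE tgt_trades (id INTEGER, amount REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.5);  -- row 3 missing
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount totals between two tables."""
    src_count, src_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}").fetchone()
    tgt_count, tgt_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {tgt}").fetchone()
    return {"count_diff": src_count - tgt_count, "sum_diff": src_sum - tgt_sum}

result = reconcile(conn, "src_trades", "tgt_trades")
print(result)  # {'count_diff': 1, 'sum_diff': 75.0}
```

A non-zero difference flags the load for investigation; real Snowflake testing would run the same shape of query against source and target schemas.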
Posted 1 day ago
10.0 - 18.0 years
0 - 3 Lacs
Hyderabad
Work from Office
Greetings from Cognizant! We have an exciting opportunity for Azure infrastructure skills with Cognizant. If you match the criteria below, apply with us immediately. Skill: Azure Data Factory. Experience: 11 to 18 years. Location: Hyderabad. Notice period: immediate to 30 days. Interview mode: virtual. Required Qualifications: Azure Data Engineer profiles, strong in Azure ADF, Snowflake, SQL, and DWH concepts.
Posted 1 week ago
6.0 - 8.0 years
22 - 25 Lacs
Bengaluru
Work from Office
We are looking for energetic, self-motivated, and exceptional data engineers to work on extraordinary enterprise products based on AI and big data engineering, leveraging the AWS/Databricks tech stack. You will work with a star team of architects, data scientists/AI specialists, data engineers, and integration specialists. Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL
- Experience writing and interpreting SQL queries, designing data models and data standards
- Experience with SQL Server, Oracle, and/or cloud databases
- Experience with data warehouses and data marts, star and snowflake models
- Experience loading data into databases from databases and files
- Experience analyzing and drawing design conclusions from data profiling results
- Understanding of business processes and the relationships between systems and applications
- Must be comfortable conversing with end users
- Must be able to manage multiple projects/clients simultaneously
- Excellent analytical, verbal, and communication skills
Role and Responsibilities:
- Work with business stakeholders to build data solutions that address analytical and reporting requirements
- Work with application developers and business analysts to implement and optimize Databricks/AWS-based implementations that meet data requirements
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation
- Develop and optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases
- Conduct root cause analysis and resolve production problems and data issues
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings
- Provide support for production problems and daily batch processing
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta Tables, Parquet), and views to ensure data integrity and performance
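The pipeline work in this listing follows the usual extract-transform-load pattern. A toy stdlib sketch of that flow; the posting's actual stack is Databricks/PySpark and AWS Glue, and the feed, columns, and table name here are made up for illustration:

```python
import csv
import io
import sqlite3

# Extract: raw CSV as it might arrive from an upstream feed (illustrative data).
raw = "order_id,amount,region\n1,100,NA\n2,,EMEA\n3,300,APAC\n"

# Transform: parse rows, drop records with missing amounts, cast types.
rows = [
    (int(r["order_id"]), float(r["amount"]), r["region"])
    for r in csv.DictReader(io.StringIO(raw))
    if r["amount"]  # data-quality gate: skip incomplete records
]

# Load: write the cleaned rows into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(loaded)  # (2, 400.0)
```

In a Databricks pipeline the same three stages would map to reading from cloud storage, a Spark DataFrame transformation, and a Delta table write, orchestrated by Airflow or Glue.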
Posted 1 week ago
7.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Data Analyst (Corporate Technology Data Engineering & Analytics) (Full-Time, Hyderabad) The Opportunity: Join our dynamic team as a Data Analyst, Corporate Technology Data Engineering & Analytics, where you'll play a pivotal role in driving the execution of our data strategy. This role is crucial in driving digital transformation and operational efficiency across Investment Management. You will lead efforts to extract value from data by facilitating the creation of high-quality data solutions that drive decision-making and operational efficiency. You'll use your skills to provide subject matter expertise and complete in-depth data analysis, which contributes to the strategic efforts of the team. • Analyze data related to Investment Management operations, including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing, to generate actionable insights. • Develop and maintain comprehensive data mapping documents and work closely with data engineering teams to ensure accurate data integration and transformation. • Partner with business analysts, architects, and data engineers to validate datasets, optimize queries, and perform reconciliation. • Support the design and delivery of investment data and reporting solutions, including data pipelines and reporting dashboards. • Collaborate with data engineers, data architects, and BI developers to ensure the design and development of scalable data solutions aligned with business goals. • Manage and oversee investment data, ensuring its accuracy, consistency, and completeness. The Minimum Qualifications - Education: Bachelor's or Master's degree in Finance, Computer Science, Information Systems, or a related field. Experience: • 7-9 years of experience as a Data Analyst or in a similar role supporting data analytics projects. • 5+ years of mastery in SQL. • 5+ years of experience in financial services, insurance, or a related industry.
• Experience with data manipulation using Python. • Domain knowledge of Investment Management operations, including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing. • Investment Operations exposure: Critical Data Elements (CDE), data traps, and other data recons. • Familiarity with data engineering concepts: ETL/ELT, data lakes, data warehouses. • Experience with BI tools like Power BI, MicroStrategy, and Tableau. • Excellent communication, problem-solving, and stakeholder management skills. • Experience in Agile/Scrum and working with cross-functional delivery teams. • Proficiency in financial reporting tools (e.g., Power BI, Tableau). The Ideal Qualifications - Technical Skills: • Familiarity with regulatory requirements and compliance standards in the investment management industry. • Ability to lead cross-functional teams and manage complex projects. • Hands-on experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart. • Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion. • Experience with cloud data platforms like Snowflake and Databricks. • Background in data governance, metadata management, and data lineage frameworks. Soft Skills: • Exceptional communication and interpersonal skills. • Ability to influence and motivate teams without direct authority. • Excellent time management and organizational skills, with the ability to prioritize multiple initiatives. What you'll get: • Regular meetings with the Corporate Technology leadership team. • Focused one-on-one meetings with your manager. • Access to mentorship opportunities. • Access to learning content on Degreed and other informational platforms. • Your ethics and integrity will be valued by a company with a strong and stable ethical business and industry-leading pay and benefits.
Posted 1 week ago
8.0 - 13.0 years
40 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are looking for a Sr. Snowflake Developer/Specialist/Architect with a minimum of 8 years' experience. Contact: Atchaya (95001 64554). Required candidate profile: Snowflake + advanced SQL (5+ yrs) + DWH (data warehouse). 8+ years of industry experience with hands-on management of projects in data warehousing. Minimum 4 years of experience working on Snowflake.
Posted 1 week ago
4.0 - 7.0 years
1 - 4 Lacs
Noida
Hybrid
Position Overview: The role will help architect, build, and maintain a robust, scalable, and sustainable business intelligence platform. Assisted by the Data Team, this role will work with highly scalable systems, complex data models, and a large amount of transactional data. Company Overview: BOLD is an established and fast-growing product company that transforms work lives. Since 2005, we've helped more than 10,000,000 folks from all over America (and beyond!) reach higher and do better. A career at BOLD promises great challenges, opportunities, culture, and environment. With our headquarters in Puerto Rico and offices in San Francisco and India, we're a global organization on a path to change the career industry. Key Responsibilities: Architect, develop, and maintain a highly scalable data warehouse and build/maintain ETL processes. Utilize Python and Airflow to integrate data from across the business into the data warehouse. Integrate third-party data such as Google Analytics, Google Ads, and Iterable into the data warehouse. Required Skills: Experience working as an ETL developer in a Data Engineering, Data Warehousing, or Business Intelligence team. Understanding of data integration/data engineering architecture and awareness of ETL standards, methodologies, guidelines, and techniques. Hands-on experience with the Python programming language and its packages such as Pandas and NumPy. Strong understanding of SQL queries, aggregate functions, complex joins, and performance tuning. Good exposure to databases such as Snowflake, SQL Server, Oracle, or PostgreSQL (any one of these). Broad understanding of data warehousing and dimensional modelling concepts.
Posted 1 week ago
9.0 - 14.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
The Snowflake Data Specialist will manage projects in Data Warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.
Posted 2 weeks ago
6.0 - 9.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
Experience: 6 - 9 years. Location: Pune / Chennai / Mumbai / Bangalore.
- Strong in-depth knowledge of databases and database concepts to support the design/development of data warehouse/data lake applications
- Analyze business requirements and work with the business and data modeler to support data flow from source to target destination
- Follow release and change processes: distribution of software builds and releases to development, test, and production environments
- Adhere to the project's SDLC process (Agile), participate in team discussions and scrum calls, and work collaboratively with internal and external team members
- Develop ETL using Ab Initio; databases: MS SQL Server, big data, text/Excel files
- Engage in the intake/release/change/incident/problem management processes
- Prioritize and drive all relevant support priorities (incident, change, problem, knowledge, engagement with projects)
- Develop and document a high-level conceptual data process design for review by the architect and data analysts, which will serve as a basis for writing ETL code and designing test plans
- Thoroughly unit test ETL code to ensure error-free, efficient delivery
- Analyze several aspects of code prior to release to ensure it will run efficiently and can be supported in the production environment
- Provide data modeling solutions
Posted 3 weeks ago
6.0 - 10.0 years
6 - 16 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role: AWS Redshift Ops + PL/SQL + Unix. Years of experience: 6+. Detailed job description - skill set: incident management, troubleshooting issues, contributing to development, collaborating with other teams, suggesting improvements, enhancing system performance, training new employees. Mandatory skills: AWS Redshift, PL/SQL, Apache Airflow, Unix, ETL, DWH.
Posted 3 weeks ago
13.0 - 20.0 years
40 - 45 Lacs
Bengaluru
Work from Office
Principal Architect - Platform & Application Architect. Experience: 15+ years in software/data platform architecture, including 5+ years in architectural leadership roles. Education: Bachelor's/Master's in CS, Engineering, or a related field. Location: Onsite, Bangalore. Role Overview: We are seeking a Platform & Application Architect to lead the design and implementation of a next-generation, multi-domain data platform and its ecosystem of applications. In this strategic and hands-on role, you will define the overall architecture, select and evolve the technology stack, and establish best practices for governance, scalability, and performance. Your responsibilities will span the full data lifecycle (ingestion, processing, storage, and analytics) while ensuring the platform is adaptable to diverse and evolving customer needs. This role requires close collaboration with product and business teams to translate strategy into actionable, high-impact platforms and products. Key Responsibilities: 1. Architecture & Strategy: Design the end-to-end architecture for an on-prem/hybrid data platform (data lake/lakehouse, data warehouse, streaming, and analytics components). Define and document data blueprints, data domain models, and architectural standards. Lead build-vs-buy evaluations for platform components and recommend best-fit tools and technologies. 2. Data Ingestion & Processing: Architect batch and real-time ingestion pipelines using tools like Kafka, Apache NiFi, Flink, or Airbyte. Oversee scalable ETL/ELT processes and orchestrators (Airflow, dbt, Dagster). Support diverse data sources: IoT, operational databases, APIs, flat files, unstructured data. 3. Storage & Modeling: Define strategies for data storage and partitioning (data lakes, warehouses, Delta Lake, Iceberg, or Hudi). Develop efficient data strategies for both OLAP and OLTP workloads. Guide schema evolution, data versioning, and performance tuning. 4. Governance, Security, and Compliance: Establish data governance, cataloging, and lineage-tracking frameworks. Implement access controls, encryption, and audit trails to ensure compliance with DPDPA, GDPR, HIPAA, etc. Promote standardization and best practices across business units. 5. Platform Engineering & DevOps: Collaborate with infrastructure and DevOps teams to define CI/CD, monitoring, and DataOps pipelines. Ensure observability, reliability, and cost efficiency of the platform. Define SLAs, capacity planning, and disaster recovery plans. 6. Collaboration & Mentorship: Work closely with data engineers, scientists, analysts, and product owners to align platform capabilities with business goals. Mentor teams on architecture principles, technology choices, and operational excellence. Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 12+ years of experience in software engineering, including 5+ years in architectural leadership roles. Proven expertise in designing and scaling distributed systems, microservices, APIs, and event-driven architectures using Java, Python, or Node.js. Strong hands-on experience building scalable data platforms in on-premise/hybrid/cloud environments. Deep knowledge of modern data lake and warehouse technologies (e.g., Snowflake, BigQuery, Redshift) and table formats like Delta Lake or Iceberg. Familiarity with data mesh, data fabric, and lakehouse paradigms. Strong understanding of system reliability, observability, DevSecOps practices, and platform engineering principles. Demonstrated success leading large-scale architectural initiatives across enterprise-grade or consumer-facing platforms. Excellent communication, documentation, and presentation skills, with the ability to simplify complex concepts and influence at executive levels.
Certifications such as TOGAF or AWS Solutions Architect (Professional) and experience in regulated domains (e.g., finance, healthcare, aviation) are desirable.
Posted 3 weeks ago
5.0 - 10.0 years
13 - 22 Lacs
Pune
Work from Office
Role & responsibilities:
- Solid understanding of testing principles, testing types, and methodologies
- Strong knowledge of SQL and ETL testing
- Strong SQL skills: ability to write and interpret complex SQL queries
- Good understanding of, or proficiency in, one programming language (Python or Java)
- Extensive experience in ETL/DW backend testing and BI report testing
- Hands-on experience identifying bugs in complex ETL data processing
- Experience working with test management tools such as ALM, Rally, and JIRA
- Experience with scheduling tools, with preference given to those familiar with Cloud Composer (Airflow) or CTM
- Proficiency in GCP, AWS, Azure, or other cloud platforms
- Frontend automation using Java/Selenium or another tool
- Familiarity with security testing
- Strong comprehension, analytical, and problem-solving skills
- Good working experience with Agile methodologies and tools like Rally and JIRA
- Excellent written and verbal communication skills
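Finding bugs in ETL output of the kind this role describes often starts with generic data-quality probes: null checks on mandatory columns and duplicate checks on keys expected to be unique. A hedged sketch against SQLite (the schema and values are invented for illustration):

```python
import sqlite3

# Tiny target table with two deliberate quality defects.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (cust_id INTEGER, email TEXT);
    INSERT INTO customers VALUES (1, 'a@x.com'), (2, NULL), (3, 'a@x.com');
""")

# Null check: key attributes should be populated after the load.
nulls = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL").fetchone()[0]

# Duplicate check: emails are expected to be unique in the target.
dupes = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT email FROM customers
        WHERE email IS NOT NULL
        GROUP BY email HAVING COUNT(*) > 1)
""").fetchone()[0]

print(nulls, dupes)  # 1 1
```

In practice the same queries would be parameterized per table/column and wired into the test management tooling so a non-zero result raises a defect.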
Posted 3 weeks ago
4.0 - 6.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
Experience: 4 - 6 years. Location: Chennai / Pune / Mumbai / Bangalore.
- Strong in-depth knowledge of databases and database concepts to support the design/development of data warehouse/data lake applications
- Analyze business requirements and work with the business and data modeler to support data flow from source to target destination
- Follow release and change processes: distribution of software builds and releases to development, test, and production environments
- Adhere to the project's SDLC process (Agile), participate in team discussions and scrum calls, and work collaboratively with internal and external team members
- Develop ETL using Ab Initio; databases: MS SQL Server, big data, text/Excel files
- Engage in the intake/release/change/incident/problem management processes
- Prioritize and drive all relevant support priorities (incident, change, problem, knowledge, engagement with projects)
- Develop and document a high-level conceptual data process design for review by the architect and data analysts, which will serve as a basis for writing ETL code and designing test plans
- Thoroughly unit test ETL code to ensure error-free, efficient delivery
- Analyze several aspects of code prior to release to ensure it will run efficiently and can be supported in the production environment
- Provide data modeling solutions
Kindly share your updated resume to AISHWARYAG5@hexaware.com
Posted 3 weeks ago
7.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
7+ years of experience in ETL Testing, Snowflake, and data warehouse (DWH) concepts. Strong SQL knowledge and debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and Xray defect management tools is good to have
- Exposure to the financial domain is considered a plus
- Test data readiness (data quality) and address code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution
- Prior experience with State Street and Charles River Development (CRD) is considered a plus
- Experience with tools such as PowerPoint, Excel, and SQL
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus
Posted 3 weeks ago
6.0 - 11.0 years
20 - 27 Lacs
Pune
Work from Office
Mandatory primary skills: Python, PySpark & SQL. Secondary skills: any cloud experience, DWH, BI tools (Qlik, Power BI, etc.)
Posted 3 weeks ago
8.0 - 10.0 years
10 - 14 Lacs
Hyderabad
Work from Office
What you will do: Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles. Roles & Responsibilities: Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives. Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities. Establish and enforce architectural standards, policies, and governance frameworks. Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture. Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient. Maintain comprehensive documentation of the architecture, including principles, standards, and models. Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency. Work with collaborators to gather and analyze requirements, ensuring that solutions meet both business and technical needs. Evaluate and recommend technologies and tools that best fit the solution requirements. Ensure seamless integration between systems and platforms, both within the organization and with external partners. Design systems that can scale to meet growing business needs and performance demands. Develop and maintain logical, physical, and conceptual data models to support business needs. Establish and enforce data standards, governance policies, and best practices. Design and manage metadata structures to enhance information retrieval and usability. Contribute to a program vision while advising and articulating program/project strategies on enabling technologies. Provide guidance on application and integration development best practices, Enterprise Architecture standards, functional and technical solution architecture and design, environment management, testing, and platform education. Drive the creation of application and technical design standards that leverage best practices and effectively integrate Salesforce into Amgen's infrastructure. Troubleshoot key product team implementation issues and demonstrate the ability to drive them to successful resolution. Lead the evaluation of business and technical requirements from a senior level. Review releases and roadmaps from Salesforce and evaluate the impacts on current applications, orgs, and solutions. Proactively identify and manage risk areas, with a commitment to seeing an issue through to complete resolution. Negotiate solutions to complex problems with both the product teams and third-party service providers. Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project. What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree with 8-10 years of experience in Computer Science, IT, or a related field; OR Bachelor's degree with 10-14 years of experience in Computer Science, IT, or a related field; OR Diploma with 14-18 years of experience in Computer Science, IT, or a related field. Experience with SFDC Service Cloud/Health Cloud in a call center environment. Strong architectural design and modeling skills. Extensive knowledge of enterprise architecture frameworks and methodologies. Experience with system integration and IT infrastructure. Experience directing solution design, business process redesign, and aligning business requirements to technical solutions in a regulated environment. Experience working in agile methodology, including Product Teams and Product Development models. Extensive hands-on technical and solution implementation experience with the Salesforce Lightning Platform, Sales Cloud, and Service Cloud, demonstrating positions of increasing responsibility and management/mentoring of more junior technical resources. Demonstrable experience and ability to develop custom configured, Visualforce, and Lightning applications on the platform. Demonstrable knowledge of the capabilities and features of Service Cloud and Sales Cloud. Demonstrable ability to analyze, design, and optimize business processes via technology and integration, including leadership in guiding customers and colleagues in rationalizing and deploying emerging technology for business use cases. A thorough understanding of web services, data modeling, and enterprise application integration concepts, including experience with enterprise integration tools (ESBs and/or ETL tools) and common integration design patterns with enterprise systems (e.g. CMS, ERP, HRIS, DWH/DM). Demonstrably excellent, context-specific, and adaptive communication and presentation skills across a variety of audiences and situations; an established habit of proactive thinking and behavior and the desire and ability to self-start, learn, and apply new technologies. Preferred Qualifications: Strong solution design and problem-solving skills. Solid understanding of technology, function, or platform. Experience in developing differentiated and deliverable solutions. Ability to analyze client requirements and translate them into solutions. Professional Certifications: Salesforce Admin, Advanced Admin, Platform Builder, Salesforce Application Architect (mandatory). Skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated awareness of presentation skills.
Posted 3 weeks ago
4.0 - 8.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
Primary skills: MS SQL Server, T-SQL, performance tuning, SSIS, and DWH. Experience: 6+ years. Location: Pune / Mumbai / Chennai / Bangalore. Notice period: immediate joiners. Proven hands-on experience as a Microsoft BI Developer (SSIS). Expert in SQL; should be able to write complex, nested queries and stored procedures. Background in data warehouse design (e.g. dimensional modelling) and data mining. Added advantage: knowledge of Master Data Services, Power BI, and Tableau. Basic understanding of the Agile process is good to have. Extensive experience with SQL Server databases. Good knowledge of stored procedures, views, complex queries, and functions.
Posted 3 weeks ago
4.0 - 9.0 years
5 - 10 Lacs
Chennai, Bengaluru
Work from Office
Job Purpose: We are seeking an experienced Azure Data Engineer with 4 to 13 years of proven expertise in data lakes, lakehouse, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehouses, with working experience in ETL, data catalogs, metadata, DWH, MPP systems, OLTP, and OLAP systems, and with strong communication skills. Key Responsibilities: Create data lakes from scratch, configure existing systems, and provide user support. Understand different datasets and storage elements to bring in data. Good knowledge and work experience in ADF and Synapse data pipelines. Good knowledge of Python, PySpark, and Spark SQL. Implement data security at the DB and data movement layers. Experience with CI/CD data pipelines. Work with internal teams to design, develop, and maintain software. Qualifications & Key skills required: Expertise in data lakes, lakehouse, Synapse Analytics, Databricks, T-SQL, SQL Server, Synapse DB, and data warehouses. Hands-on experience in ETL and ELT, handling large volumes of data and files. Working knowledge of JSON, Parquet, CSV, Excel, structured, unstructured, and other data sets. Exposure to a source control management system such as TFS, Git, or SVN. Understanding of non-functional requirements. Proficiency in data catalogs, metadata, DWH, MPP systems, OLTP, and OLAP systems. Experience with Azure Data Fabric, MS Purview, and MDM tools is an added advantage. A good team player and excellent communicator.
Posted 4 weeks ago
6.0 - 11.0 years
15 - 20 Lacs
Hyderabad, Pune, Chennai
Work from Office
Hiring for a top IT company. Designation: ETL Tester. Skills: ETL testing + data warehouse + Snowflake + Azure. Location: Bangalore/Hyderabad/Pune/Chennai. Exp: 5-10 yrs. Call: Nisha: 8875876654, Afreen: 9610352987, Garima: 8875813216, Kajal: 8875831472. Team Converse
Posted 4 weeks ago
8.0 - 13.0 years
22 - 37 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Senior Professional Scrum Master. Experience: 8 to 12 years of IT experience. Location: Pune, Nagpur, Hyderabad, Bangalore, Mumbai.
- Certified Scrum Master
- Excellent communication skills
- Well versed in Agile methodology to plan, manage, and deliver solutions
- Conducts all scrum ceremonies and aids in story/task creation and estimation
- Identifies and manages issues, risks, and action items
- Schedules and facilitates all scrum events and decision-making processes
- Monitors progress and helps teams make improvements
- Strong technical background, preferably in the data warehouse and healthcare domains
Posted 4 weeks ago
7.0 - 12.0 years
10 - 20 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Hexaware Technologies is hiring for a Senior Data Engineer. Primary skills: ETL, SSIS, SQL, DWH. Notice period: immediate/early joiners preferred. Location: Chennai, Mumbai, Pune, Bangalore. Total experience required: 6 to 12 yrs. If interested, kindly share your updated resume with the details below. Full name: Total IT exp: Rel exp in (ETL, SSIS, DWH): Current location: Current CTC: Exp CTC: Notice period (mention LWD if serving): Job Description: 5+ years of experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards and offer high performance. 5+ years of experience writing and optimizing SQL queries and stored procedures for data processing and data analysis. 5+ years of experience designing and building complete data pipelines, moving and transforming data for ODS, staging, data warehousing, and data marts using SQL Server Integration Services (ETL) or other related technologies. 5+ years of experience implementing data warehouse solutions (star schema, snowflake schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies. 5+ years of experience with large-scale data processing and query optimization techniques using T-SQL. 5+ years of experience implementing audit, balance, and control mechanisms in data solutions. 3+ years of experience with source control repos such as Git, TFVC, or Azure DevOps, including branching and merging, and implementing CI/CD pipelines for database and ETL workloads. 2+ years of experience working with Python Pandas libraries to process semi-structured data sets and load them to a SQL Server DB.
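The star-schema warehouse solutions this listing asks for pair a central fact table with dimension tables; reports then join the two and aggregate. A minimal sketch in SQLite standing in for SQL Server (the fact/dimension names and figures are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: one row per product, with descriptive attributes.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    -- Fact table: one row per sale, referencing the dimension by key.
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Toys');
    INSERT INTO fact_sales VALUES (1, 2, 20.0), (1, 1, 10.0), (2, 5, 50.0);
""")

# Typical reporting query: aggregate the fact, grouped by a dimension attribute.
report = conn.execute("""
    SELECT d.category, SUM(f.qty), SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(report)  # [('Books', 3, 30.0), ('Toys', 5, 50.0)]
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables; the fact table and the join-then-aggregate pattern stay the same.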
Posted 4 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!! Job Description: Exp: 5-10 yrs. Location: Gurugram/Bangalore/Pune. Skill: Azure Data Engineer. Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the inline details below. Full name as per PAN: Mobile no: Alt no/WhatsApp no: Total exp: Relevant exp in Databricks: Rel exp in PySpark: Rel exp in DWH: Rel exp in Python: Current CTC: Expected CTC: Notice period (official): Notice period (negotiable)/reason: Date of birth: PAN number: Reason for job change: Offer in pipeline (current status): Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time): Current residential location: Preferred job location: Is your educational % in 10th std, 12th std and UG all above 50%? Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 1 month ago
8.0 - 13.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
Consultant Data Engineer. Tools & Technology: Snowflake, SnowSQL, AWS, DBT, Snowpark, Airflow, DWH, Unix, SQL, Shell Scripting, PySpark, Git, Visual Studio, ServiceNow. Duties and Responsibilities: Act as Consultant Data Engineer. Understand business requirements and design, develop & maintain scalable automated data pipelines & ETL processes to ensure efficient data processing and storage. Create a robust, extensible architecture to meet client/business requirements. Develop Snowflake objects integrated with AWS services and DBT. Work on different types of data ingestion pipelines as per requirements. Develop in DBT (Data Build Tool) for data transformation as per requirements. Integrate multiple AWS services with Snowflake. Work with integration of structured & semi-structured data sets. Work on performance tuning and cost optimization. Implement CDC or SCD Type 2. Design and build solutions for near real-time stream as well as batch processing. Implement best practices for data management, data quality, and data governance. Responsible for data collection, data cleaning & pre-processing using Snowflake and DBT. Investigate production issues and fine-tune data pipelines. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery. Coordinate with and support software developers, database architects, data analysts and data scientists on data initiatives. Orchestrate pipelines using Airflow. Suggest improvements to processes, products and services. Interact with users, management, and technical personnel to clarify business issues, identify problems, and suggest changes/solutions to business and developers. Create technical documentation on Confluence to aid knowledge sharing.
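The SCD Type 2 duty listed above can be sketched minimally in plain Python. The customer dimension, its attribute names, and the load dates are all hypothetical; a real Snowflake/DBT implementation would typically use a MERGE statement or DBT snapshots rather than in-memory records:

```python
# Hypothetical current dimension rows: each version of a business key
# carries a validity window and a current-row flag (classic SCD Type 2).
dim = [
    {"cust_id": "C1", "city": "Pune", "valid_from": "2023-01-01",
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, cust_id, new_city, load_date):
    """Close the current row if the tracked attribute changed, then insert a new version."""
    for row in dim:
        if row["cust_id"] == cust_id and row["is_current"]:
            if row["city"] == new_city:
                return dim  # attribute unchanged: no new version needed
            row["is_current"] = False
            row["valid_to"] = load_date
    dim.append({"cust_id": cust_id, "city": new_city,
                "valid_from": load_date, "valid_to": None, "is_current": True})
    return dim

apply_scd2(dim, "C1", "Hyderabad", "2024-06-01")
current = [r for r in dim if r["is_current"]]
print(current[0]["city"])  # Hyderabad
```

After the change, the old row survives with a closed validity window, preserving the full attribute history for point-in-time reporting.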
- Associate Data Engineer. Tools & Technology: Snowflake, DBT, AWS, Airflow, ETL, Data Warehouse, Shell Scripting, SQL, Git, Confluence, Python. Duties and Responsibilities: Act as offshore data engineer for enhancement & testing. Design and build solutions for near real-time stream processing as well as batch processing. Develop Snowflake objects using their distinctive features. Implement data integration and transformation workflows using DBT. Integrate AWS services with Snowflake. Participate in implementation planning and respond to production issues. Responsible for data collection, data cleaning & pre-processing. Experience in developing UDFs, Snowflake procedures, streams, and tasks. Troubleshoot customer data issues: manually load any missed data, and check and handle data duplication with RCA. Investigate production job failures through to root cause. Develop ETL processes and data integration solutions. Understand the business needs of the client and provide technical solutions. Monitor the overall functioning of processes, identify improvement areas and implement them with the help of scripting. Handle major outages effectively, with effective communication to business, users & development partners. Define and create run book entries and knowledge articles based on incidents experienced in production. - Associate Engineer. Tools and Technology: Unix, Oracle, Shell scripting, ETL, Hadoop, Spark, Sqoop, Hive, Control-M, Tectia, SQL, Jira, HDFS, Snowflake, DBT, AWS. Duties and Responsibilities: Worked as a Senior Production/Application Support Engineer. Worked as a production support member for loading, processing and reporting of files and generating reports. Monitored multiple batches, jobs and processes, analyzing issues related to job failures and handling FTP failures and connectivity issues behind batch/job failures. Performed data analysis on files, generated files and sent them to destination servers depending on job functionality. Created shell scripts to automate daily tasks or as requested by the service owner. Tuned jobs to improve performance and performed daily checks. Coordinated with Middleware, DWH, CRM and other teams in case of any issue or CRQ. Monitored the overall functioning of processes, identified improvement areas and implemented them with the help of scripting. Raised PBIs after approval from the service owner. Worked on performance-improvement automation activities to decrease manual workload. Ingested data from RDBMS systems to HDFS/Hive through Sqoop. Understood customer problems and provided appropriate technical solutions. Handled major outages effectively, with effective communication to business, users & development partners. Coordinated with the client and on-site teams, joining bridge calls for any issues. Handled daily issues based on application and job performance.
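The data-duplication check mentioned in the support duties above can be sketched as follows. The natural key (load date plus transaction id) and the sample records are hypothetical:

```python
from collections import Counter

# Hypothetical loaded records keyed by a natural key (load_date, txn_id);
# any key appearing more than once would need investigation before load.
records = [("2024-06-01", "TXN1"), ("2024-06-01", "TXN2"), ("2024-06-01", "TXN1")]
counts = Counter(records)
duplicates = [key for key, n in counts.items() if n > 1]
print(duplicates)  # [('2024-06-01', 'TXN1')]
```

The same idea expressed in SQL is a `GROUP BY key HAVING COUNT(*) > 1` query against the staging table.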
Posted 1 month ago
10.0 - 16.0 years
0 - 1 Lacs
Noida
Work from Office
Job Title / Designation: Database Manager. Location: Noida. Job Summary: The role involves designing and developing solutions to support business needs. Optimizing and tuning existing programs and developing new routines will be an integral part of the profile. Key Responsibility Areas: • Architect, design and develop solutions to support business requirements. • Analyze and manage a variety of database environments such as Oracle, Postgres, Cassandra, MySQL, graph databases, etc. • Provide optimal design of database environments, analyze complex distributed production deployments, and make recommendations to optimize performance. • Work closely with programming teams to deliver high-quality software. • Provide innovative solutions to complex business and technology problems. • Propose the best solution in logical and physical data modelling. • Perform administration tasks including DB resource planning and DB tuning. • Mentor and train junior developers; lead & manage teams. Qualification: B.Tech. Experience: 10+ years of experience in a Data Engineering role. Bachelor's degree in Computer Science or related experience. Skill Sets / Requirements: • Experience designing/architecting database solutions. • Experience with multiple RDBMS and NoSQL databases of TB data size, preferably Oracle, PostgreSQL, Cassandra and graph databases. • Must be well versed in PL/SQL & PostgreSQL, with strong query optimization skills. • Expert knowledge of DB installation, configuration, replication, upgrades, security and HA/DR setup. • Experience in database deployment, performance tuning and troubleshooting. • Knowledge of scripting languages (such as Unix shell, PHP). • Advanced knowledge of PostgreSQL preferred. • Experience working with cloud platforms and services. • Experience migrating database environments from one platform to another. • Ability to work well under pressure. • Experience with big data technologies and DWH is a plus.
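The query-optimization skill called for above can be illustrated with a small sketch showing how adding an index changes a query plan from a full scan to an index search. SQLite is used here for portability (the posting's databases expose the same idea via `EXPLAIN`/`EXPLAIN ANALYZE`), and the table name is hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust TEXT, amount REAL)")

# Without an index on cust, an equality filter must scan the whole table.
plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE cust = 'C1'").fetchall()

con.execute("CREATE INDEX idx_orders_cust ON orders(cust)")

# With the index, the planner switches to an index search on cust.
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE cust = 'C1'").fetchall()

# The human-readable plan detail is the last column of each plan row.
print(plan_before[0][3])
print(plan_after[0][3])
```

Reading plans like these (scan vs. index search, join order, temporary sorts) is the day-to-day core of the query-optimization work the posting describes.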
Posted 1 month ago
10.0 - 16.0 years
25 - 27 Lacs
Chennai
Work from Office
We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai with a shift from 2.00pm to 11.00pm IST. Key qualifications we seek in candidates include: - Solid understanding of SQL and data modeling - Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schema - Experience in designing and building data pipelines on the Azure cloud stack - Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse - Knowledge of Azure DevOps and CI/CD pipelines - Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects - Exposure to on-premise transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres - Ability to lead enterprise data strategies, including data lake delivery - Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python - Strong problem-solving skills with a track record of deriving business insights from large datasets - Excellent communication skills and the ability to provide strategic direction to technical and business teams - Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory - Capability to explain complex data solutions clearly to senior management - Experience in implementing, managing, and supporting data warehouse projects or applications - Track record of leading full-cycle implementation projects related to Business Intelligence - Strong team and stakeholder management skills - Attention to detail, accuracy, and ability to meet tight deadlines - Knowledge of application development, APIs, microservices, and integration components. Tools & Technology Experience Required: - Strong hands-on experience in SQL or PL/SQL - Proficiency in Python - SSIS or Informatica (one of the tools mandatory) - BI: Power BI or Tableau (one of the tools mandatory)
Posted 1 month ago