2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
As a Data Governance and Management Developer at Assent, you will play a crucial role in ensuring the quality and reliability of critical data across systems and domains. Your responsibilities will include defining and implementing data quality standards, developing monitoring pipelines to detect data issues, conducting data profiling assessments, and designing data quality dashboards. You will collaborate with cross-functional teams to resolve data anomalies and drive continuous improvement in data quality.
Key Requirements & Responsibilities:
- Define and implement data quality rules, validation checks, and metrics for critical business domains.
- Develop Data Quality (DQ) monitoring pipelines and alerts to proactively detect data issues.
- Conduct regular data profiling and quality assessments to identify gaps, inconsistencies, duplicates, and anomalies.
- Design and maintain data quality dashboards and reports for visibility into trends and issues.
- Utilize generative AI to automate workflows, enhance data quality, and support responsible prompt usage.
- Collaborate with data owners, stewards, and technical teams to resolve data quality issues.
- Develop and document standard operating procedures (SOPs) for issue management and escalation workflows.
- Support root cause analysis (RCA) for recurring or high-impact data quality problems.
- Define and monitor key data quality KPIs and drive continuous improvement through insights and analysis.
- Evaluate and recommend data quality tools that scale with the enterprise.
- Provide recommendations for enhancing data processes, governance practices, and quality standards.
- Ensure compliance with internal data governance policies, privacy standards, and audit requirements.
- Adhere to corporate security policies and procedures set by Assent.
Qualifications:
- 2-5 years of experience in a data quality, data analyst, or similar role.
- Degree in Computer Science, Information Systems, Data Science, or a related field.
- Strong understanding of data quality principles.
- Proficiency in SQL, GitHub, R, Python, SQL Server, and BI tools like Tableau, Power BI, or Sigma.
- Experience with cloud data platforms (e.g., Snowflake, BigQuery) and data transformation tools (e.g., dbt).
- Exposure to graph databases and GenAI tools.
- Ability to interpret dashboards and communicate data quality findings effectively.
- Understanding of data governance frameworks and regulatory considerations.
- Strong problem-solving skills, attention to detail, and familiarity with agile work environments.
- Excellent verbal and written communication skills.
Join Assent and be part of a dynamic team that values wellness, financial benefits, lifelong learning, and diversity, equity, and inclusion. Make a difference in supply chain sustainability and contribute to meaningful work that impacts the world. Contact talent@assent.com for assistance or accommodation during the interview process.
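The posting above centers on data quality rules and validation checks. As an illustration only (not part of the posting), here is a minimal sketch of the kind of completeness and uniqueness checks such a role implements; the field names and rule labels are hypothetical:

```python
from collections import Counter

def run_dq_checks(rows, required_fields):
    """Apply two simple data quality rules: completeness and key uniqueness."""
    issues = []
    # Completeness: required fields must be present and non-empty
    for i, row in enumerate(rows):
        for field in required_fields:
            if not row.get(field):
                issues.append(("missing_value", i, field))
    # Uniqueness: the primary key must not repeat
    key_counts = Counter(row.get("id") for row in rows)
    for key, count in key_counts.items():
        if count > 1:
            issues.append(("duplicate_key", key, count))
    return issues

rows = [
    {"id": 1, "name": "acme", "country": "IN"},
    {"id": 2, "name": "", "country": "IN"},
    {"id": 2, "name": "beta", "country": "US"},
]
issues = run_dq_checks(rows, ["name", "country"])
# issues flags the empty name on row 1 and the duplicated id 2
```

In practice such rules would be expressed in a DQ tool or as SQL checks over the warehouse; the structure (rule, offending record, detail) is what feeds the alerting and dashboard layers the posting describes.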
Posted 1 week ago
5.0 - 9.0 years
35 - 37 Lacs
Bengaluru
Remote
Role & responsibilities: Snowflake / SQL Architect
• Architect and manage scalable data solutions using Snowflake and advanced SQL, optimizing performance for analytics and reporting.
• Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data ingestion and transformation.
• Develop best practices for data security, access control, and compliance within cloud-based data environments.
• Collaborate with cross-functional teams to understand business needs and translate them into robust data architectures.
• Evaluate and integrate third-party tools and technologies to enhance the Snowflake ecosystem and overall data strategy.
Posted 1 week ago
9.0 - 14.0 years
30 - 40 Lacs
Pune, Chennai
Work from Office
Designing, implementing, and optimizing data solutions using both Azure and Snowflake. Experience working with the Matillion tool and with Azure and Snowflake, including data modeling, ETL processes, and data warehousing. Proficiency in SQL and data integration tools.
Posted 1 week ago
3.0 - 5.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Position summary: We are seeking a Senior Software Development Engineer - Data Engineering with 3-5 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.
Key Responsibilities:
Work with cloud-based data solutions (Azure, AWS, GCP).
Implement data modeling and warehousing solutions.
Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
Design and optimize data storage solutions, including data warehouses and data lakes.
Ensure data quality and integrity through data validation, cleansing, and error handling.
Collaborate with data analysts, data architects, and software engineers to understand data requirements and deliver relevant data sets (e.g., for business intelligence).
Implement data security measures and access controls to protect sensitive information.
Monitor and troubleshoot issues in data pipelines, notebooks, and SQL queries to ensure seamless data processing.
Develop and maintain Power BI dashboards and reports.
Work with DAX and Power Query to manipulate and transform data.
Basic Qualifications
Bachelor's or Master's degree in Computer Science or Data Science.
3-5 years of experience in data engineering, big data processing, and cloud-based data platforms.
Proficient in SQL, Python, or Scala for data manipulation and processing.
Proficient in developing data pipelines using Azure Synapse, Azure Data Factory, and Microsoft Fabric.
Experience with Apache Spark, Databricks, and Snowflake is highly beneficial for handling big data and cloud-based analytics solutions.
Preferred Qualifications
Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
Experience in BI and analytics tools (Tableau, Power BI, Looker).
Familiarity with data observability tools (Monte Carlo, Great Expectations).
Contributions to open-source data engineering projects.
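The extraction, transformation, and loading steps this posting describes can be sketched in a few lines. This is an illustration only, with hypothetical record fields; in the role itself these stages would run on Spark/Databricks against real source systems:

```python
def extract(source):
    """Extract: read raw records (an in-memory list stands in for a source system)."""
    return list(source)

def transform(records):
    """Transform: normalize fields and drop records that fail validation."""
    clean = []
    for rec in records:
        try:
            clean.append({"sku": rec["sku"].strip().upper(), "qty": int(rec["qty"])})
        except (KeyError, ValueError):
            pass  # a real pipeline would route bad records to an error queue
    return clean

def load(records, target):
    """Load: append validated records to the target store; return rows loaded."""
    target.extend(records)
    return len(records)

source = [{"sku": " ab-1 ", "qty": "5"}, {"sku": "cd-2", "qty": "oops"}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
# loaded == 1: the malformed record is rejected during transformation
```

The validation-inside-transform pattern is one common way to meet the "data quality and integrity through data validation, cleansing, and error handling" requirement.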
Posted 1 week ago
8.0 - 13.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Hi All, please find the JD below.
SKILL: DBT Developer (Data Engineer)
Location: Bangalore
Experience: 8 to 15 years
Position: Contract to hire
Education: Engineering or equivalent (BTech/MTech/MCA)
Job Description
Mandatory Skills: Looking for a data tester with dbt (Data Build Tool) experience for a core conversion project.
8+ years of experience in data engineering, analytics engineering, or similar roles.
Proven expertise in dbt (Data Build Tool) and modern data transformation practices.
Advanced proficiency in SQL and a deep understanding of dimensional modeling, medallion architecture, and ELT principles.
Strong hands-on experience with Snowflake, including query optimization.
Proficient with Azure cloud services, including Azure Data Factory and Blob Storage.
Strong communication and collaboration skills.
Familiarity with data governance, metadata management, and data quality frameworks.
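A core dbt pattern this role relies on is the incremental model: on each run, only source rows newer than the high-water mark already in the target are transformed. In dbt this is SQL with an `is_incremental()` filter; the following plain-Python sketch (names hypothetical, for illustration only) shows the same logic:

```python
def incremental_rows(source_rows, target_rows, ts_field="updated_at"):
    """Mimic dbt's is_incremental() filter: keep only source rows whose
    timestamp is newer than the max timestamp already loaded."""
    if not target_rows:
        # first run behaves like a full refresh
        return list(source_rows)
    high_water = max(r[ts_field] for r in target_rows)
    return [r for r in source_rows if r[ts_field] > high_water]

target = [{"id": 1, "updated_at": "2024-01-01"}]
source = [{"id": 1, "updated_at": "2024-01-01"},
          {"id": 2, "updated_at": "2024-02-01"}]
new = incremental_rows(source, target)
# only the id-2 row passes the high-water-mark filter
```

In actual dbt the equivalent filter would live in the model's SQL, and Snowflake would evaluate it; the point of the sketch is the ELT principle the posting names, not the tool syntax.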
Posted 1 week ago
5.0 - 10.0 years
7 - 14 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Company Name: LTIMindtree
Experience: 5+ Years
Location: Pan India (Hybrid Model)
Interview Mode: Virtual
Interview Rounds: 2 Rounds
Notice Period: Immediate to 30 days
Job description:
Roles & Responsibilities:
Responsibilities will include expanding and updating the production runbook as new functionality is added and processes are fine-tuned.
Establish and maintain runbooks for the data processes.
Establish and maintain data quality and data technical controls.
Identify data process performance improvements.
Interpret data and analyze results.
Perform incident management activities.
Stakeholder communication and SLA management.
Monitor data integrity and data processing, and coordinate corrective actions when necessary.
Ensure data integrity by verifying and cleaning data.
Perform root cause analysis on production failures in data processes.
Posted 1 week ago
4.0 - 9.0 years
15 - 27 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Location: Kolkata, Hyderabad, Bangalore
Exp: 4 to 17 years
Band: 4B, 4C, 4D
Skill set: Snowflake, AWS/Azure, Python, ETL
Job Description:
Experience in the IT industry.
Working experience with building productionized data ingestion and processing data pipelines in Snowflake.
Strong understanding of Snowflake architecture.
Fully well-versed with data warehousing concepts.
Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing systems.
Able to create data pipelines for ETL/ELT.
Excellent presentation and communication skills, both written and verbal.
Ability to problem-solve and architect in an environment with unclear requirements.
Able to create high-level and low-level design documents based on requirements.
Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
Awareness of data visualisation tools and methodologies.
Work independently on business problems and generate meaningful insights.
Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
Should have experience in implementing Snowflake best practices.
Snowflake SnowPro Core Certification will be an added advantage.
Roles and Responsibilities:
Requirement gathering, creating design documents, providing solutions to customers, working with the offshore team, etc.
Writing SQL queries against Snowflake and developing scripts to Extract, Load, and Transform data.
Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
Should have some experience with Snowflake RBAC and data security.
Should have good experience in implementing CDC or SCD type-2.
In-depth understanding of Data Warehouse and ETL concepts and Data Modelling.
Experience in requirement gathering, analysis, design, development, and deployment.
Should have experience building data ingestion pipelines.
Optimize and tune data pipelines for performance and scalability.
Able to communicate with clients and lead a team.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.
Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark/DBT, AWS/Azure, ETL concepts, Data Warehousing concepts, Data Modeling, Design patterns
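One requirement above, "implementing CDC or SCD type-2", has a precise meaning: changed dimension rows are never overwritten; the old version is closed out with an end date and a new current version is inserted. In Snowflake this is typically a MERGE statement; the following plain-Python sketch (hypothetical field names, illustration only) captures the logic:

```python
from datetime import date

def scd2_merge(dim, incoming, today):
    """Apply SCD type-2: expire changed current rows, insert new current versions."""
    current = {r["key"]: r for r in dim if r["end_date"] is None}
    for rec in incoming:
        existing = current.get(rec["key"])
        if existing is None:
            # brand-new key: insert as the current version
            dim.append({**rec, "start_date": today, "end_date": None})
        elif existing["attr"] != rec["attr"]:
            # attribute changed: close the old version, open a new one
            existing["end_date"] = today
            dim.append({**rec, "start_date": today, "end_date": None})
    return dim

dim = [{"key": "C1", "attr": "Pune", "start_date": date(2023, 1, 1), "end_date": None}]
dim = scd2_merge(dim, [{"key": "C1", "attr": "Mumbai"}], date(2024, 6, 1))
# dim now holds the expired Pune row and a current Mumbai row
```

CDC tooling (e.g. Snowflake Streams) supplies the `incoming` changed rows; the merge pattern above is what turns them into history-preserving dimension records.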
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Noida, India
Work from Office
Key Responsibilities:
1. Architect and design end-to-end data pipelines from source systems to the data warehouse.
2. Lead the development of scalable Python/Spark-based data processing workflows.
3. Define and implement data modeling standards for the DWH, including fact/dimension schemas and historical handling.
4. Oversee performance tuning of Python, Spark, and ETL loads.
5. Ensure robust data integration with Tableau reporting by designing data structures optimized for BI consumption.
6. Mentor junior engineers and drive engineering best practices.
7. Work closely with business stakeholders, developers, and product teams to align data initiatives with business goals.
8. Define SLAs, error handling, logging, monitoring, and alerting mechanisms across pipelines.
Must Have:
1. Strong Oracle SQL expertise and deep Oracle DWH experience.
2. Proficiency in Python and Spark, with experience handling large-scale data transformations.
3. Experience in building batch data pipelines and managing dependencies.
4. Solid understanding of data warehousing principles and dimensional modeling.
5. Experience working with reporting tools like Tableau.
6. Good to have: experience in cloud-based DWHs (like Snowflake) for future-readiness.
Mandatory Competencies:
ETL - Data Stage
Beh - Communication and collaboration
BI and Reporting Tools - Tableau
QA/QE - QA Analytics - Data Analysis
Database - Database Programming - SQL
Big Data - SPARK
Programming Language - Python - Python Shell
ETL - Ab Initio
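The "fact/dimension schema" standard this posting names means splitting raw transactional records into a dimension table (one row per entity, with a surrogate key) and a fact table of measures referencing those keys. A minimal illustrative sketch, with hypothetical record fields:

```python
def build_star(sales):
    """Split raw sales records into a product dimension and a fact table
    keyed by surrogate IDs, as in a star-schema load."""
    dim_product, product_ids = [], {}
    fact_sales = []
    for s in sales:
        name = s["product"]
        if name not in product_ids:
            # assign a surrogate key the first time a product is seen
            product_ids[name] = len(dim_product) + 1
            dim_product.append({"product_id": product_ids[name], "name": name})
        fact_sales.append({"product_id": product_ids[name], "amount": s["amount"]})
    return dim_product, fact_sales

dims, facts = build_star([
    {"product": "widget", "amount": 10},
    {"product": "gadget", "amount": 7},
    {"product": "widget", "amount": 3},
])
# two dimension rows, three fact rows referencing them by surrogate key
```

Structures like `fact_sales` joined to small dimensions are exactly what makes downstream Tableau consumption fast, which is the design goal stated in responsibility 5.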
Posted 1 week ago
2.0 - 4.0 years
10 - 14 Lacs
Bengaluru
Hybrid
Job Description
Job Summary
The Business Intelligence (BI) Developer II is responsible for supporting the current production BI platform along with the development of new business intelligence capabilities, leveraging data transformation best practices. The BI Developer is required to have a deep understanding of the BI architecture and processes to provide technical guidance on the optimal solution for business logic. The developer is seen as the subject matter expert (SME) on data warehousing and ELT processes leveraging SQL, Python, and Java, ideally on platforms including Snowflake and Matillion. The developer is required to communicate effectively, both orally and in writing.
Responsibilities
Implement new logic and/or transformation workflows to build new data products within our BI platform
Manage the existing code base and make required logic updates and/or technical debt cleanup
Develop and support QA processes for our BI platform
Provide consultation to the internal product team requesting new BI features
Contribute to Data Governance policies and standards including data quality, data management, business process management, privacy, and security
Troubleshoot integration/build failures to determine root cause and provide guidance on possible solutions, including writing code to resolve an identified issue
Create process models and data flow diagrams
Participate in identifying and maintaining team best practices
Participate in the Agile SCRUM process, including managing tasks and test cases
Qualifications, Skills, and Experience
2-4 years of experience in a BI Developer role or related position
2-4 years of experience using SQL to query data
2-4 years of experience using SQL, Python, and Java to develop data warehouse and ELT processes
B.S. in Computer Science or equivalent business experience
Problem analysis and solving skills: ability to identify root causes of problems and differentiate between perceived and actual problems
Experience leveraging Snowflake and Matillion preferred
Demonstrated proficiency with the Software Development Lifecycle (SDLC)
Demonstrated experience working in a virtual team environment as well as the ability to work independently
Strong technical, organizational, and communication (written and verbal) skills that enable the ability to take requirements from a business user and implement them within the SDLC process
Must be flexible, with an ability to handle multiple tasks, projects, or work items concurrently while meeting prioritized deadlines
Intermediate to advanced knowledge of SQL
Basic to intermediate knowledge of Python and Java scripting
Must have an eye for data quality and experience enforcing data governance across a vast volume of data
Aptitude for learning new technologies and learning on the fly
Flexibility and adaptability outside the assigned core responsibilities
Staffing industry experience is a plus
Posted 1 week ago
7.0 - 12.0 years
30 - 37 Lacs
Hyderabad
Work from Office
Required Skills and Qualifications:
8+ years of experience in data engineering or a related field.
Strong expertise in Snowflake, including schema design, performance tuning, and security.
Proficiency in Python for data manipulation and automation.
Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
Experience with DBT for data transformation and documentation.
Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
Strong SQL skills and experience with large-scale data sets.
Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
Posted 1 week ago
3.0 - 6.0 years
14 - 18 Lacs
Mumbai
Work from Office
Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical data points to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With an increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation, and it is committed to providing self-serve tools to our internal customers. The position is based in the Mumbai, India office.
Responsibilities
Build and maintain ETL pipelines for Snowflake.
Manage Snowflake objects and data models.
Integrate data from various sources.
Optimize performance and query efficiency.
Automate and schedule data workflows.
Ensure data quality and reliability.
Collaborate with cross-functional teams.
Document processes and data flows.
Qualifications
Self-motivated, collaborative individual with a passion for excellence.
B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with databases.
Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool.
Good working knowledge of Snowflake, YAML, and Python.
Experience managing Snowflake databases, schemas, tables, and other objects.
Proficient in Snowflake SQL, including CTEs, window functions, and stored procedures.
Familiar with Snowflake performance tuning and cost optimization tools.
Skilled in building ETL/ELT pipelines using dbt, Airflow, or Python.
Able to work with various data sources, including RDBMSs, APIs, and cloud storage.
Understanding of incremental loads, error handling, and scheduling best practices.
Strong SQL skills and intermediate Python proficiency for data processing.
Familiar with Git for version control and collaboration.
Basic knowledge of the Azure or GCP cloud platforms.
Capable of integrating Snowflake with APIs and cloud-native services.
What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
Flexible working arrangements, advanced technology, and collaborative workspaces.
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.
MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.
MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed.
Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: In this role, you will work to increase the domain data coverage and adoption of the Data Platform by promoting a connected user experience through data. You will increase data literacy and trust by leading our Data Governance and Master Data Management initiatives. You will contribute to the vision and roadmap of self-serve capabilities through the Data Platform. The senior data engineer develops data pipelines extracting and transforming data as governed assets into the data platform; improves system quality by identifying issues and common patterns and developing standard operating procedures; and enhances applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
Roles and responsibilities:
(MUST HAVE) Extensive experience with cloud data warehouses like Snowflake and AWS Athena, and SQL databases like PostgreSQL and MS SQL Server.
Experience with NoSQL databases like AWS DynamoDB and Azure Cosmos is a plus.
(MUST HAVE) Solid experience and clear understanding of DBT.
(MUST HAVE) Experience working with AWS and/or Azure CI/CD DevOps technologies, and extensive debugging experience.
Good understanding of data modeling, ETL, data curation, and big data performance tuning.
Experience with data ingestion tools like Fivetran is a big plus.
Experience with Data Quality and Observability tools like Monte Carlo is a big plus.
Experience working and integrating with an Event Bus like Pulsar is a big plus.
Experience integrating with a Data Catalog like Atlan is a big plus.
Experience with Business Intelligence tools like PowerBI is a plus.
An understanding of unit testing, test-driven development, functional testing, and performance testing.
Knowledge of at least one shell scripting language.
Ability to network with key contacts outside own area of expertise.
Must possess strong interpersonal, organizational, presentation and facilitation skills.
Must be results oriented and customer focused.
Must possess good organizational skills.
Technical experience & Professional attributes:
Prepare technical design specifications based on functional requirements and analysis documents.
Implement, test, maintain and support software based on technical design specifications.
Improve system quality by identifying issues and common patterns and developing standard operating procedures.
Enhance applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
Maintain and improve existing codebases and peer review code changes.
Liaise with colleagues to implement technical designs.
Investigate and use new technologies where relevant.
Provide written knowledge transfer material.
Review functional requirements, analysis, and design documents and provide feedback.
Assist customer support with technical problems and questions.
Ability to work independently with wide latitude for independent decision making.
Experience in leading the work of others and mentoring less experienced developers in the context of a project is a plus.
Ability to listen and understand information and communicate the same.
Participate in architecture and code reviews.
Lead or participate in other projects or duties as the need arises.
Education qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field, or an equivalent combination of education/experience. Master's degree is a plus.
5 years or more of extensive experience developing mission-critical and low-latency solutions.
At least 3 years of experience with developing and debugging distributed systems and data pipelines in the cloud.
Additional Information:
The Winning Way behaviors that all employees need in order to meet the expectations of each other, our customers, and our partners:
Communicate with Clarity - Be clear, concise and actionable. Be relentlessly constructive. Seek and provide meaningful feedback.
Act with Urgency - Adopt an agile mentality - frequent iterations, improved speed, resilience. 80/20 rule - better is the enemy of done. Don't spend hours when minutes are enough.
Work with Purpose - Exhibit a "We Can" mindset. Results outweigh effort. Everyone understands how their role contributes. Set aside personal objectives for team results.
Drive to Decision - Cut the swirl with defined deadlines and decision points. Be clear on individual accountability and decision authority. Guided by a commitment to and accountability for customer outcomes.
Own the Outcome - Defined milestones, commitments and intended results. Assess your work in context; if you're unsure, ask. Demonstrate unwavering support for decisions.
COMMENTS: The above statements are intended to describe the general nature and level of work being performed by individuals in this position. Other functions may be assigned, and management retains the right to add or change the duties at any time.
Qualification: 15 years full time education
Posted 1 week ago
4.0 - 8.0 years
20 - 27 Lacs
Chennai
Hybrid
Key Responsibilities
Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources like flat files, APIs, Oracle, DB2, etc.
Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic.
Participate in code reviews, performance tuning, and defect triage sessions.
Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
Adhere to agile delivery practices, sprint planning, and documentation requirements.
Required Skills and Experience
4+ years of experience in ETL development, with at least 1-2 years in IBM DataStage (preferably the CP4D version).
Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
Experience working with Snowflake: loading strategies, streams/tasks, zero-copy cloning, and performance tuning.
Proficiency in SQL, Unix scripting, and basic Python for data handling or automation.
Familiarity with S3, version control systems (Git), and job orchestration tools.
Experience with data profiling, cleansing, and quality validation routines.
Understanding of data lake/data warehouse architectures and DevOps practices.
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection
Posted 1 week ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).
The impact you will have in this role:
The Enterprise Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL, and data analysis on the bespoke Snowflake data warehouse for Enterprise Intelligence, and will be responsible for managing a growing team of consultants and employees and running development and production support teams for the Enterprise Intelligence team at DTCC.
Your Primary Responsibilities:
Working on and leading engineering and development focused projects from start to finish with minimal supervision.
Providing technical and operational support for our customer base as well as other technical areas within the company that utilize Claw.
Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk and audit related objectives.
Ensuring incident, problem, and change tickets are addressed in a timely fashion, as well as escalating technical and managerial issues.
Following DTCC's ITIL process for incident, change, and problem resolution.
Manage delivery and production support teams.
Drive delivery independently and autonomously within the team and vendor teams.
Liaise with onshore peers to drive seamless quality of service to stakeholders.
Conduct working sessions with users and SMEs to align reporting and reduce the use of offline spreadsheets and potentially stale data sources.
Provide technical leadership for projects.
Work closely with other project managers and scrum masters to create and update project plans.
Work closely with peers to improve workflow processes and communication.
Qualifications:
8+ years of related experience.
Bachelor's degree (preferred) or equivalent experience.
Talents Needed for Success:
Minimum of 12 years of related experience.
Minimum of 8 years of experience in managing data warehousing, SQL, and Snowflake.
Minimum of 5 years of experience in people management and team leadership.
Ability to manage distributed teams with an employee/vendor mix.
Strong understanding of snowflake schemas and data integration methods and tools.
Strong knowledge of managing data warehouses in a production environment, including all phases of lifecycle management: planning, design, deployment, upkeep, and retirement.
Ability to meet deadlines, goals, and objectives.
Ability to facilitate educational and working sessions with stakeholders and users.
Self-starter, continually striving to improve the team's service offerings and one's own skillset.
Must have a problem-solving and innovative mindset to meet a wide variety of challenges.
Willingness and ability to learn all aspects of our operating model as well as new tools.
Developed competencies around essential project management, communication (oral and written), and personal effectiveness.
Good SQL skills and good knowledge of relational databases, specifically Snowflake.
Ability to manage agile development cycles within the DTCC SDLC (SDP) methodology.
Good knowledge of the technical components of Claw (i.e., Snowflake, Talend, PowerBI, PowerShell, Autosys).
Posted 1 week ago
5.0 - 9.0 years
12 - 17 Lacs
Hyderabad
Work from Office
The Impact you will have in this role: The Enterprise Test Engineering ("ETE") family is responsible for ensuring that all applications and systems meet defined quality standards. The ETE family encompasses three major areas: (a) functional testing, (b) non-functional testing, and (c) test architecture and enablement. Other key focuses include regression testing, browser testing, performance testing, capacity and stress testing, resiliency testing, environment management services, and infrastructure testing. The family develops, conducts, and evaluates testing processes, working closely with developers to remediate identified system defects, and brings in-depth knowledge of automated testing tools and quality control and assurance approaches, including the creation of a reusable foundational test automation framework for the entire organization. The Lead Test Engineer independently leads Test Engineering teams. You will develop test plans and implement them against the corresponding test procedures, and you will be accountable for the development, release, and maintenance of test procedures.

Your Primary Responsibilities:
- Responsible for system integration testing, including automation, of newly developed or enhanced applications
- Play an active role in translating business and functional requirements into concrete results
- Lead, develop, and advise on test automation strategies, and provide critical feedback in requirements, design, implementation, and execution phases
- Partner with collaborators: Product Management, Application Development, DevOps, and other technical teams
- Track test execution milestones and report on issues and risks with the potential to affect project timelines
- Construct appropriate end-to-end business scenarios through the application of a broad understanding of business objectives and goals
- Responsible for Delivery Pipeline adoption
- Identify dependencies for environmental and data requirements
- Contribute to a standard framework of reusable functions
- Develop a thorough understanding of the product(s) being delivered
- Responsible for process compliance and associated documentation
- Align risk and control processes into day-to-day responsibilities to supervise and mitigate risk; escalate appropriately
- Work closely with business and AD domain experts to continually improve depth and breadth of knowledge for assigned applications/systems
- Responsible for project coordination and technical management tasks

NOTE: The Primary Responsibilities of this role are not limited to the details above.

Qualifications:
- 5-7 years of related experience delivering software solutions with hands-on automated testing
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- Experience in Agile/Waterfall and onsite/offshore work models and coordination
- In-depth knowledge of the software implementation lifecycle (specifically the testing model, methodology, and processes)
- Experience with Test Engineering methodologies and test automation frameworks
- Proficient in automation at all software layers (e.g. UI, services, APIs) as well as CI/CD technologies (e.g. CloudBees, Jenkins, Cucumber, Git, JUnit, Jira)
- Sophisticated Java/Selenium development skills with significant experience applying those skills in test environments
- Ability to track test execution milestones and report on issues and risks with the potential to affect project timelines
- Extensive experience with testing modern scripting-language-based components
- Proven expertise in frontend test automation using Selenium WebDriver
- Expert and hands-on with backend test automation using Rest Assured/Karate for API testing, and JDBC/JPA for database testing (Oracle/DB2/Snowflake)
- Experience writing sophisticated SQL queries
- Experience with JIRA, ALM, Bitbucket, Git, and Jenkins
- Ability to work with both business clients and the technical team, and to work independently, both individually and as part of a team
- Experience mentoring junior test engineers, verifying work products, and providing mentorship as needed
- Unix, Python, and AWS experience is a plus
- Accountable for process compliance and associated documentation
- Aligns risk and control processes into day-to-day responsibilities to supervise and mitigate risk; escalates appropriately
- Excellent problem-solving, collaboration, and written and verbal communication skills
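The backend API testing the posting describes (Rest Assured/Karate in Java) comes down to the same given/when/then pattern in any language: issue a request, then assert on the status and payload. A minimal Python analogue under stated assumptions: the endpoint, trade fields, and stub client below are hypothetical, and the HTTP layer is stubbed so the sketch is self-contained.

```python
def get_trade(trade_id, client):
    """Fetch a trade record; `client` abstracts the HTTP layer so a
    test can substitute a stub (endpoint and fields are hypothetical)."""
    return client("/trades/" + str(trade_id))

def stub_client(path):
    # Stand-in for a real HTTP client, keyed by path.
    fixtures = {"/trades/42": {"status": 200, "body": {"id": 42, "state": "SETTLED"}}}
    return fixtures.get(path, {"status": 404, "body": None})

# Given/when/then style checks, mirroring a Rest Assured or Karate scenario.
response = get_trade(42, stub_client)
assert response["status"] == 200
assert response["body"]["state"] == "SETTLED"
print("trade 42 checks passed")
```

In a real suite the stub would be replaced by an HTTP client and the assertions would live in a test framework; the shape of the checks stays the same.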
Posted 1 week ago
5.0 - 10.0 years
14 - 18 Lacs
Hyderabad
Work from Office
The Impact you will have in this role: The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, crafting, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities:
- Act as a technical expert on one or more applications used by DTCC
- Work with the Business System Analyst to ensure designs satisfy functional requirements
- Partner with Infrastructure to identify and deploy optimal hosting environments
- Tune application performance to eliminate and reduce issues
- Research and evaluate technical solutions consistent with DTCC technology standards
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately
- Apply different software development methodologies depending on project needs
- Contribute expertise to the design of components or individual programs, and participate in construction and functional testing
- Support development teams with testing, troubleshooting, and production support
- Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements
- Work with peers to mature ways of working, continuous integration, and continuous delivery

Qualifications:
- Minimum of 8 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- Expertise in the Snowflake database and its architecture principles and capabilities
- Experience with data warehousing, data architecture, ETL data pipeline, and/or data engineering environments at enterprise scale built on Snowflake
- Ability to create strong SQL procedures in Snowflake and build data pipelines in a cost-optimizing and performance-efficient way
- Proficient understanding of code versioning tools: Git, Mercurial, SVN
- Knowledge of SDLC, testing, and CI/CD aspects such as Jenkins, BB, and JIRA
- Fosters a culture where integrity and transparency are encouraged
- Stays ahead of changes in their own specialist area and seeks out learning opportunities to ensure knowledge is up-to-date
- Invests effort to individually coach others
- Builds collaborative teams across the organization
- Communicates openly, keeping everyone across the organization advised
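The cost-optimizing pipeline pattern this posting alludes to is usually an incremental (watermark-based) load: copy only rows newer than the last processed timestamp instead of re-scanning the full source table. A runnable sketch under stated assumptions, with SQLite standing in for Snowflake and made-up table and column names; in Snowflake the same logic would typically live in a stored procedure or a MERGE task.

```python
import sqlite3

def incremental_load(conn, watermark):
    """Copy only source rows newer than the last processed timestamp.

    Scanning just the delta is the usual way to keep warehouse compute
    costs down; SQLite stands in for Snowflake here so the sketch runs.
    """
    cur = conn.execute(
        "INSERT INTO target (id, amount, updated_at) "
        "SELECT id, amount, updated_at FROM source WHERE updated_at > ?",
        (watermark,),
    )
    conn.commit()
    # New watermark = max timestamp now in target (unchanged if no rows).
    row = conn.execute("SELECT MAX(updated_at) FROM target").fetchone()
    return cur.rowcount, row[0] if row[0] is not None else watermark

# Demo data: three source rows, one already loaded into the target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER, amount REAL, updated_at TEXT)")
conn.execute("CREATE TABLE target (id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO source VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-02"), (3, 30.0, "2024-01-03")],
)
conn.execute("INSERT INTO target VALUES (1, 10.0, '2024-01-01')")

loaded, new_watermark = incremental_load(conn, "2024-01-01")
print(loaded, new_watermark)  # 2 2024-01-03
```

Persisting the returned watermark between runs is what makes each subsequent load touch only new data.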
Posted 1 week ago
5.0 - 10.0 years
14 - 19 Lacs
Hyderabad
Work from Office
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact you will have in this role: The SLM and JDM applications monitor all jobs, processes, and output of systems run within DTCC. Our goal is to identify deviation from historical trends and known patterns of execution that may lead to issues later in the process cycle that would interrupt our clients' business interactions.

Your Primary Responsibilities:
- Working with the current Power BI based dashboards, design new Java based dashboards to replace and enhance the functionality of the JDM system
- Help develop specifications for new dashboards and the required application changes to support the new design
- Build and deploy the new dashboards, and work with the application and business support teams to train them in the usage of the application
- Utilize feedback to design further enhancements to the application

Qualifications:
- Minimum of 4 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 4+ years of active development experience/expertise in Java/J2EE based applications
- Must have strong frontend experience with Angular
- Experience in web-based UI development and SPA development
- Experience with CI/CD technologies like Git, Jenkins, and Maven
- Experience with containers like OpenShift is a plus
- Experience with messaging, ETL, or reporting tools is a plus
- Database and PL/SQL skills (Snowflake preferred) are a plus
- Knowledge of Python is a plus
- Familiarity with Agile development methodology
Posted 1 week ago
5.0 - 9.0 years
14 - 19 Lacs
Chennai
Work from Office
The Impact you will have in this role: The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, designing, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities:
- Lead technical processes and designs considering reliability, data integrity, maintainability, reuse, extensibility, usability, and scalability
- Review the development team's code to ensure quality and adherence to best practices and standards
- Mentor junior developers to develop their skills and build strong talent
- Collaborate with Infrastructure partners to identify and deploy optimal hosting environments
- Define scalability and performance criteria for assigned applications
- Ensure applications meet performance, privacy, and security requirements
- Verify test plans to ensure compliance with performance and security requirements
- Support business and technical presentations in relation to technology platforms and business solutions
- Mitigate risk by following established procedures and monitoring controls
- Help develop solutions that balance cost and delivery while meeting business requirements
- Implement technology-specific best practices that are consistent with corporate standards
- Partner with multi-functional teams to ensure the success of product strategy and project deliverables
- Manage the software development process
- Drive new technical and business process improvements
- Estimate total costs of modules/projects covering both hours and expenses
- Research and evaluate specific technologies and applications, and contribute to the solution design
- Construct application architecture encompassing end-to-end designs
- Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior

Qualifications:
- Minimum of 7+ years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 7+ years of strong frontend experience with jQuery and JavaScript
- 7+ years of active development experience/expertise in Java/J2EE based applications
- Proven ability with Hibernate, Spring, and Spring MVC
- Experience in web-based UI development
- Experience with CSS, HTML, JavaScript, and similar UI frameworks (jQuery, React)
- Familiarity with microservices-based architecture and distributed systems
- Hands-on experience with AI tools such as Amazon Q is a plus
- Ability to develop and work with REST APIs using the Spring Boot framework
- Hands-on experience with AWS technologies and Snowflake is a plus
- Strong database and PL/SQL skills (Oracle, Postgres preferred)
- Experience with messaging, ETL, or reporting tools is a plus
- Knowledge of Python is a plus
- Familiarity with Agile development methodology
- Ability to collaborate with multiple collaborators such as product management, application development, DevOps, and other technical groups
Posted 1 week ago
5.0 - 9.0 years
14 - 19 Lacs
Hyderabad, Chennai
Work from Office
The Impact you will have in this role: The role involves developing and maintaining control functions for the GTR application. The role also works closely with the required development teams, our Enterprise Infrastructure partners, and our internal business clients to resolve and escalate technical support incidents where necessary.

Your Primary Responsibilities:
- Develop and maintain Python based control functions
- Develop data models for various applications based on the Snowflake database
- Work with Streams and Streamlit in Snowflake for GUI-based development
- Work with support teams like EAS GTR to resolve Production and PSE related incidents

Qualifications:
- Minimum of 6 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 5+ years of active development experience/expertise in Python based applications
- Experience in ticket tracking tools like ServiceNow (SNOW), Jira, etc.
- Database and PL/SQL skills (Snowflake preferred) are a plus
- Experience with Bitbucket and Jenkins tools
- Experience with messaging, ETL, or reporting tools is a plus
- Familiarity with Agile development methodology
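A "control function" in this sense is typically an automated check over data in the warehouse, e.g. reconciling row counts between a source system and its Snowflake copy and flagging breaches. A deliberately simplified, self-contained sketch: the table names and tolerance are hypothetical, and plain dicts stand in for query results.

```python
def reconcile_counts(source_counts, target_counts, tolerance=0):
    """Compare per-table row counts between a source system and the
    warehouse copy, flagging any table whose difference exceeds the
    tolerance. A stand-in for the kind of control function the role
    describes; in practice the counts would come from SQL queries.
    """
    breaches = {}
    for table, expected in source_counts.items():
        actual = target_counts.get(table, 0)
        if abs(expected - actual) > tolerance:
            breaches[table] = {"expected": expected, "actual": actual}
    return breaches

# Hypothetical counts: `positions` is short by two rows in the target.
source = {"trades": 1_000, "positions": 250, "parties": 40}
target = {"trades": 1_000, "positions": 248, "parties": 40}

breaches = reconcile_counts(source, target, tolerance=1)
print(breaches)  # {'positions': {'expected': 250, 'actual': 248}}
```

A real control would log the breaches and raise an incident ticket rather than just return them, but the detection logic is the core.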
Posted 1 week ago
9.0 - 14.0 years
30 - 35 Lacs
Chennai
Work from Office
DTCC Digital Assets: DTCC Digital Assets is at the forefront of driving institutional adoption of digital assets technology, with a steadfast commitment to innovation anchored in security and stability. As the financial services industry's trusted technology partner, we pride ourselves on empowering a globally interconnected and efficient ecosystem. Our mission is to provide secure and compliant infrastructure for digital assets, enabling financial institutions to unlock the full potential of blockchain technology. We are seeking an experienced and highly skilled Principal Data Engineer to join our dynamic team. As a Principal Data Engineer, you will play a crucial role in designing, building, and growing our greenfield Snowflake Data Platform for Digital Assets.

The Impact you will have in this role: The Principal Data Engineer role is substantial in shaping the data infrastructure and strategic direction of the Digital Assets department. By leading the design and implementation of a greenfield Snowflake Data Platform, this role directly influences the organization's ability to manage and leverage data for operational efficiency and risk assessment. The Associate Director ensures that data systems are scalable, secure, and aligned with business goals, enabling faster decision-making and innovation. Their leadership in managing cross-functional teams and collaborating with stakeholders ensures that technical solutions are not only robust but also responsive to evolving business needs. Beyond technical execution, this role plays a pivotal part in fostering a culture of accountability, growth, and inclusion. By mentoring team members, driving employee engagement, and promoting best practices in agile development and data governance, the Associate Director helps build a resilient and high-performing engineering organization. Their contributions to incident management, platform adoption, and continuous improvement efforts ensure that the data platform remains reliable and future-ready, positioning the company to stay competitive in the rapidly evolving digital assets landscape.

Role description:
- Lead engineering and development focused projects from start to finish with minimal supervision
- Provide technical and operational support for our customer base as well as other technical areas within the company
- Review and supervise the system design and architecture
- Interact with stakeholders to understand requirements and provide solutions
- Perform risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk and audit related objectives
- Refine and prioritize the backlog for the team in partnership with product management
- Groom and guide the team of employees and consultants; responsible for employee engagement, growth, and appraisals
- Participate in user training to increase awareness of the platform
- Ensure incident, problem, and change tickets are addressed in a timely fashion, escalating technical and managerial issues as needed
- Ensure quality and consistency of data from source systems, and align with data product managers on facilitating resolution of these issues in a consistent manner
- Follow DTCC's ITIL process for incident, change, and problem resolution

Talents Needed for Success:
- Bachelor's degree in Computer Science, Information Technology, Engineering (any), or a related field
- 8 years of experience in the job or a related position, including:
- 5 years of experience managing data warehouses in a production environment, covering all phases of lifecycle management: planning, design, deployment, upkeep, and retirement
- 5 years leading development teams with a mix of onshore and offshore members
- Experience designing and architecting data warehousing applications
- Warehousing concepts involving facts and dimensions, star/snowflake schemas, and data integration methods and tools
- Deep understanding of the Snowflake platform
- Designing data pipelines
- SQL and relational databases
- Development in agile scrum teams
- Development following CI/CD processes
- Demonstrable experience with data streaming technologies like Kafka for data ingestion
- Knowledge of blockchain technologies, smart contracts, and financial services is a plus
- Designing low latency data platforms is a plus
- Knowledge of data governance principles is a plus
- Ability to optimize/tune source streams, queries, and Power BI (or equivalent) dashboards

Leadership competencies:
- Champion Inclusion: Embrace individual differences and create an environment of support, belonging, and trust
- Communicate Clearly: Listen to understand; ask questions for clarity and deliver messages with purpose
- Cultivate Relationships: Show care and compassion for others and authentically build networks across functions
- Instill Ownership: Ensure accountability, manage execution, and mitigate risk to deliver results
- Inspire Growth: Develop yourself and others through coaching, feedback, and mentorship to meet career goals
- Propel Change: Think critically, respectfully challenge, and create innovative ways to drive growth
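The fact-and-dimension concept this posting names boils down to one move during fact loading: resolving each record's natural key to the dimension's surrogate key. A tiny illustrative sketch, with hypothetical keys and fields; unknown members get a placeholder key (-1) rather than being dropped, the common way to handle late-arriving dimension rows.

```python
def load_fact(events, dim_customer):
    """Resolve natural keys to surrogate keys while building fact rows,
    the core step in star-schema loading. `dim_customer` maps natural
    key -> surrogate key; all names here are made up for illustration.
    """
    fact_rows = []
    for event in events:
        # Unknown customer -> placeholder surrogate key for later repair.
        sk = dim_customer.get(event["customer_id"], -1)
        fact_rows.append({"customer_sk": sk, "amount": event["amount"]})
    return fact_rows

dim_customer = {"C001": 1, "C002": 2}
events = [
    {"customer_id": "C001", "amount": 100.0},
    {"customer_id": "C999", "amount": 55.0},  # late-arriving dimension member
]
facts = load_fact(events, dim_customer)
print(facts)
```

In a warehouse the lookup would be a join against the dimension table, but the surrogate-key substitution is the same.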
Posted 1 week ago
5.0 - 10.0 years
14 - 19 Lacs
Hyderabad, Chennai
Work from Office
The Impact you will have in this role: The role involves developing and maintaining control functions for the GTR application. The role also works closely with the required development teams, our Enterprise Infrastructure partners, and our internal business clients to resolve and escalate technical support incidents where necessary.

Your Primary Responsibilities:
- Develop and maintain Python based control functions
- Develop data models for various applications based on the Snowflake database
- Work with Streams and Streamlit in Snowflake for GUI-based development
- Work with support teams like EAS GTR to resolve Production and PSE related incidents

Qualifications:
- Minimum of 6 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 5+ years of active development experience/expertise in Python based applications
- Experience in ticket tracking tools like ServiceNow (SNOW), Jira, etc.
- Database and PL/SQL skills (Snowflake preferred) are a plus
- Experience with Bitbucket and Jenkins tools
- Experience with messaging, ETL, or reporting tools is a plus
- Familiarity with Agile development methodology
Posted 1 week ago
5.0 - 10.0 years
18 - 22 Lacs
Hyderabad
Work from Office
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee).

The impact you will have in this role: The Lead Platform Engineer is responsible for design analysis, documentation, testing, installation, implementation, optimization, maintenance, and support for the z/OS operating system, third-party products, the UNIX System Services environment, Mainframe WebSphere Application Server, and WebSphere Liberty. You will collaborate with application developers, middleware support, database administrators, and other IT professionals. The role requires experience with z/OS, JES2, USS internals, SMP/E installations, and mainframe vendor product knowledge. Skills in creating and managing web sites using both common and advanced web programming languages are advantageous.

What You'll Do:
- Perform design analysis, documentation, testing, implementation, and support for the mainframe infrastructure environment
- Install and manage mainframe software deployments in a highly granular SYSPLEX environment
- Install and maintain WASz and Liberty
- Enhance reporting and automation using supported mainframe tools such as JCL, REXX, SAS, SQL, Python, and Java/JavaScript
- Complete assignments by due dates without detailed supervision
- Responsible for incident, problem, and change management for all assigned products
- Ensure incidents and problems are closed according to domain standards, and all change management requirements are strictly followed
- Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior
- Participate in team on-call coverage rotation, which includes tactical systems administration, and provide weekend support
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately
- Participate in disaster recovery tests (on weekends)
- Actively engage in strategic goals for mainframe engineering, the department, and the organization
- Provide input and follow-through for continuous improvement to mainframe systems engineering processes and procedures
- Perform level 1 network troubleshooting for mainframe applications

Education:
- Bachelor's degree or equivalent experience

Talents Needed for Success:
- Accountability: Demonstrates reliability by taking necessary actions to continuously meet required deadlines and goals
- Global Collaboration: Applies a global perspective when working within a team by being aware of one's own style and ensuring all relevant parties are involved in key team tasks and decisions
- Communication: Articulates information clearly and presents information effectively and confidently when working with others
- Influencing: Convinces others by making a strong case, bringing others along to their viewpoint; maintains strong, trusting relationships while being comfortable challenging ideas
- Innovation and Creativity

Additional Qualifications:
- A minimum of 6+ years of system programming experience in an IBM z/OS environment
- REXX programming experience preferred
- HTML, XML, Java, and JavaScript programming experience is preferred
- Experience with mainframe system automation (BMC AMI Ops) is a plus
- Understanding of VTAM and TCP/IP is a plus
- Knowledge of Ansible, Splunk, Snowflake, ZOWE, and SAS is a plus
- Knowledge of Bitbucket, Jira, and DevOps orchestration tools is a plus
- Excellent written and verbal skills; the ability to multitask and work in a team environment is a must
- Excellent customer service skills, to develop mutually beneficial relationships with a diverse set of customers
- Knowledge of Infrastructure as Code (IaC) standards is a plus
- Experience in a 24x7 global environment with knowledge of system high availability (HA) design and industry standard disaster recovery practices
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Database Administrator
Project Role Description: Design, implement, and maintain databases. Install database management systems (DBMS). Develop procedures for day-to-day maintenance and problem resolution.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Database Administrator, you will design, implement, and maintain databases to ensure optimal performance and reliability. Your typical day will involve installing database management systems, developing procedures for daily maintenance, and resolving any issues that arise. You will work collaboratively with team members to enhance database functionality and support various applications, ensuring that data is accessible and secure for users across the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work related problems
- Monitor database performance and implement improvements as necessary
- Ensure data integrity and security through regular audits and updates

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse
- Good-to-have skills: Experience with data modeling and ETL processes
- Strong understanding of database management systems and their architecture
- Familiarity with SQL and query optimization techniques
- Experience in troubleshooting and resolving database issues

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15 years full-time education is required
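The query optimization skill this posting asks for usually starts with reading the engine's plan and checking whether a filter hits an index or forces a full scan. A runnable sketch under stated assumptions: SQLite's `EXPLAIN QUERY PLAN` stands in for Snowflake's query profile, and the `orders` table is made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, f"cust{i % 100}", float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite will execute the statement;
    # the human-readable detail is the last column of each row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer = 'cust7'"
before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer)")
after = plan(query)    # with the index: an indexed search
print(before)
print(after)
```

The same workflow applies in any warehouse: inspect the plan, add or adjust an index (or, in Snowflake, clustering and pruning), and confirm the plan changed.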
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Database Administrator
Project Role Description: Design, implement, and maintain databases. Install database management systems (DBMS). Develop procedures for day-to-day maintenance and problem resolution.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Database Administrator, you will design, implement, and maintain databases to ensure optimal performance and reliability. Your typical day will involve installing database management systems, developing procedures for daily maintenance, and resolving any issues that arise. You will work collaboratively with team members to enhance database functionality and support various applications, ensuring that data is accessible and secure for users across the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work related problems
- Monitor database performance and implement improvements as necessary
- Ensure data integrity and security through regular audits and updates

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse
- Good-to-have skills: Experience with data modeling and ETL processes
- Strong understanding of database management systems and their architecture
- Experience in performance tuning and optimization of database queries
- Familiarity with backup and recovery strategies to safeguard data

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Posted 1 week ago