15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated on industry trends and best practices.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must have: proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and Data Vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 16 hours ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.
Professional & Technical Skills:
- Must have: proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and Data Vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize data warehouse performance.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 16 hours ago
3.0 - 5.0 years
5 - 8 Lacs
Pune
Work from Office
Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services
Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management
Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India: Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.
Technical and Professional Requirements:
As a Snowflake Data Vault developer, the individual is responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. The candidate should have at least one end-to-end Data Vault implementation experience. Detailed skill requirements:
- Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models (a minimal sketch of the core structures follows below)
- Suggesting optimization techniques in existing Data Vault models using ghost entries, bridge and PIT tables, reference tables, satellite splits/merges, identification of the correct business key, etc.
- Designing and administering repeatable design patterns for quick turnaround
- Engaging and collaborating with customers effectively to understand Data Vault use cases and briefing the technical team with technical specifications
- Working knowledge of Snowflake is desirable
- Working knowledge of DBT is desirable
Preferred Skills: Technology-Data on Cloud-DataStore-Snowflake
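For illustration, a minimal sketch of the core Data Vault structures named above (hub, link, satellite) in Snowflake-style SQL. All table and column names are hypothetical, not from any specific implementation:

```sql
-- Hypothetical sketch: core Data Vault 2.0 structures for a "customer places order" relationship.
-- Hub: the unique list of business keys.
CREATE TABLE hub_customer (
    hub_customer_hk   BINARY(32)    NOT NULL,  -- hash of the business key
    customer_bk       VARCHAR(50)   NOT NULL,  -- business key from the source system
    load_dts          TIMESTAMP_NTZ NOT NULL,
    record_source     VARCHAR(100)  NOT NULL,
    CONSTRAINT pk_hub_customer PRIMARY KEY (hub_customer_hk)
);

-- Link: a relationship between two hubs.
CREATE TABLE link_customer_order (
    link_customer_order_hk BINARY(32)    NOT NULL,
    hub_customer_hk        BINARY(32)    NOT NULL,
    hub_order_hk           BINARY(32)    NOT NULL,
    load_dts               TIMESTAMP_NTZ NOT NULL,
    record_source          VARCHAR(100)  NOT NULL,
    CONSTRAINT pk_link_customer_order PRIMARY KEY (link_customer_order_hk)
);

-- Satellite: descriptive attributes, tracked over time via load_dts and a hash diff.
CREATE TABLE sat_customer_details (
    hub_customer_hk BINARY(32)    NOT NULL,
    load_dts        TIMESTAMP_NTZ NOT NULL,
    hash_diff       BINARY(32)    NOT NULL,  -- detects attribute changes between loads
    customer_name   VARCHAR(200),
    customer_email  VARCHAR(200),
    record_source   VARCHAR(100)  NOT NULL,
    CONSTRAINT pk_sat_customer_details PRIMARY KEY (hub_customer_hk, load_dts)
);
```

A PIT (point-in-time) table, one of the optimization structures the posting mentions, would then snapshot the latest satellite load_dts per hub key at regular intervals so consumers can join hubs and satellites without repeated window functions.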
Posted 16 hours ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Tech stack: GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager, Git, Ansible Tower / Ansible scripts, Jenkins, Java, Python, Terraform, Cloud Composer/Airflow
Experience and Skills
Must Have:
- Proven (3+ years) hands-on experience in designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV-, JSON-, and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc. (an illustrative JSON-parsing sketch follows below)
- In-depth understanding of modern data contract best practices, with proven experience (3+ years) independently directing, negotiating, and documenting best-in-class data contracts
- Java (2+ years) experience in development, testing, and deployment (ideally custom plugins for Data Fusion)
- Proficiency with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools, ideally for cloud-based data solutions
- Experience working in an Agile environment and toolset
- Strong problem-solving and analytical skills
- Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently, as needs require
- Strong organisational and multi-tasking skills
- Good team player who embraces teamwork and mutual support
Nice to Have:
- Hands-on experience in Cloud Composer/Airflow, Cloud Run, Pub/Sub
- Hands-on development in Python, Terraform
- Strong SQL skills for data transformation, querying, and optimization in BigQuery, with a focus on cost- and time-effective SQL coding and concurrency/data integrity (ideally in the BigQuery dialect)
- Data transformation/ETL/ELT pipeline development, testing, and implementation, ideally in BigQuery
- Experience working in a DataOps model
- Experience in Data Vault modelling and usage
- Proficiency in Git for version control and collaboration
- Proficiency with designing, creating, and maintaining CI/CD processes/pipelines in DevOps tools like Ansible/Jenkins for cloud-based applications (ideally GCP)
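As a small illustration of the JSON parsing and wrangling this posting describes, here is a hedged BigQuery SQL sketch. The landing table `raw_zone.api_responses` and all field paths are hypothetical assumptions; a Data Fusion pipeline would typically land the raw payloads before a step like this:

```sql
-- Hypothetical sketch: flatten raw JSON responses from a REST API into typed columns (BigQuery dialect).
-- Assumes a landing table raw_zone.api_responses with a JSON string column `payload`.
SELECT
  JSON_VALUE(payload, '$.order.id')                      AS order_id,
  CAST(JSON_VALUE(payload, '$.order.amount') AS NUMERIC) AS order_amount,
  TIMESTAMP(JSON_VALUE(payload, '$.order.created_at'))   AS created_at,
  item                                                   AS line_item_json
FROM raw_zone.api_responses,
  -- Explode the repeated line_items array into one row per element.
  UNNEST(JSON_QUERY_ARRAY(payload, '$.order.line_items')) AS item
WHERE JSON_VALUE(payload, '$.order.status') = 'COMPLETED';
```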
Posted 16 hours ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Design, build, test, and deploy Google Cloud data models and transformations in a BigQuery environment (SQL, stored procedures, indexes, clusters, partitions, triggers; a partitioning/clustering sketch follows below). Deliver a data warehouse and pipelines that follow abstraction and database-refactoring best practice in order to support evolutionary development and continual change. Protect the solution with appropriate authorization and authentication models, data encryption, and other security components; this will include consumer registration, storage of identification, and change-management considerations. Review, refine, interpret, and implement business and technical requirements. Contribute to ongoing productivity and priorities by refining User Stories, Epics, and Backlogs in Jira. Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secret Manager, etc. Estimate, commit, and deliver requirements to scope, quality, and time expectations. Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable, and cost effective. Write well-commented, maintainable, and self-documenting code. Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code, and support responsibilities to the support team.
Essential Experience:
- Expert in database design, development, and administration, with an understanding of relational and dimensional data models (and preferably Data Vault)
- Expertise in on-prem or cloud databases, warehouses, and lakes
- Excellent understanding of GCP architecture and solution design
- Proven experience and solid knowledge in developing and optimizing SQL/T-SQL procedures in traditional or cloud databases
- Coding and development of DDL and DML database components
- Excellent knowledge of DevOps tools like Ansible, Jenkins, GitHub, Puppet, Chef, etc.
- IT methodology/procedural knowledge: Agile/Scrum, DevOps, and ITIL principles
- BS/MS degree in Computer Science, Engineering, or a related subject
- Excellent communication and interpersonal skills in English; proficiency in verbal, listening, and written English is crucial
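A minimal sketch of the partition and cluster design work described above, in the BigQuery dialect. The dataset, table, and column names are hypothetical:

```sql
-- Hypothetical sketch: a partitioned, clustered BigQuery table to control scan cost and improve pruning.
CREATE TABLE analytics.fct_transactions
(
  transaction_id STRING    NOT NULL,
  customer_id    STRING,
  transaction_ts TIMESTAMP NOT NULL,
  amount         NUMERIC,
  channel        STRING
)
PARTITION BY DATE(transaction_ts)   -- daily partitions: date-filtered queries scan only matching partitions
CLUSTER BY customer_id, channel     -- clustering co-locates rows for cheaper selective filters
OPTIONS (
  partition_expiration_days = 730,  -- automatic housekeeping of old partitions
  require_partition_filter  = TRUE  -- force callers to filter on the partition column
);
```

Requiring a partition filter is a common cost-control choice: it makes an accidental full-table scan a query error rather than a surprise bill.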
Posted 16 hours ago
15.0 - 20.0 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Data Modeling Techniques and Methodologies; Snowflake Data Vault 2.0 modeling
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education
Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and industry standards, facilitating seamless data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and design processes.
Professional & Technical Skills:
- Must have: proficiency in data modeling techniques and methodologies.
- Good to have: experience with data governance frameworks.
- Strong understanding of relational and non-relational database systems.
- Familiarity with data warehousing concepts and ETL processes.
- Experience with data modeling tools such as Erwin or IBM InfoSphere Data Architect.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in data modeling techniques and methodologies.
- This position is based in Pune.
- 15 years of full-time education is required.
Posted 16 hours ago
4.0 - 9.0 years
25 - 30 Lacs
Pune, Thiruvananthapuram
Work from Office
Mandatory: Data modelling, SQL, Erwin or ER/Studio, data architecture, Data Vault, dimensional modelling.
Work mode: Currently remote (WFH), but not permanently; once the business asks the candidate to come to the office, they must relocate.
Required Candidate Profile:
- Data Vault 2.0 certification
- Experience with data modeling tools such as SQLDBMS, ERwin, or similar
- Strong understanding of database management systems (DBMS) and SQL
Posted 1 day ago
5.0 - 8.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Looking for Use Case Specialists with 5+ years of experience to implement use cases in data systems using Snowflake, Power BI, WhereScape, and Data Vault 2.0. Strong stakeholder communication is essential.
Required Candidate Profile: Experienced data professional skilled in Snowflake, Power BI, and WhereScape; strong in implementing data use cases, stakeholder communication, and analytical problem-solving.
Posted 2 days ago
15.0 - 20.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Cloud Data Architecture
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education
Data Architect
Kemper is seeking a Data Architect to join our team. You will work as part of a distributed team and with Infrastructure, Enterprise Data Services, and Application Development teams to coordinate the creation, enrichment, and movement of data throughout the enterprise. Your central responsibility as an architect will be improving the consistency, timeliness, quality, security, and delivery of data as part of Kemper's Data Governance framework. In addition, the architect must streamline data flows and optimize cost management in a hybrid cloud environment. Your duties may include assessing architectural models and supervising data migrations across IaaS, PaaS, SaaS, and on-premises systems, as well as data platform selection and onboarding of data management solutions that meet the technical and operational needs of the company. To succeed in this role, you should know how to examine new and legacy requirements and define cost-effective patterns to be implemented by other teams. You must then be able to represent the required patterns during implementation projects. The ideal candidate will have proven experience in cloud (Snowflake, AWS, and Azure) architectural analysis and management.
Responsibilities:
- Define architectural standards and guidelines for data products and processes. Assess and document when and how to use existing and newly architected producers and consumers, the technologies to be used for various purposes, and models of selected entities and processes. The guidelines should encourage reuse of existing data products, as well as address issues of security, timeliness, and quality.
- Work with Information & Insights, Data Governance, Business Data Stewards, and implementation teams to define standard and ad hoc data products and data product sets.
- Work with Enterprise Architecture, Security, and implementation teams to define the transformation of data products throughout hybrid cloud environments, ensuring that both functional and non-functional requirements are addressed. This includes the ownership, the frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregation or calculations.
- Work with Data Governance and project teams to model and map data sources, including descriptions of the business meaning of the data, its uses, its quality, the applications that maintain it, and the technologies in which it is stored. Documentation of a data source must describe the semantics of the data so that occasional subtle differences in meaning are understood.
- Define integrative views of data to draw together data from across the enterprise. Some views will use data stores of extracted data and others will bring together data in near real time. Solutions must consider data currency, availability, response times, data volumes, etc.
- Work with modeling and storage teams to define conceptual, logical, and physical data views, limiting technical debt as data flows through transformations.
- Investigate and lead participation in POCs of emerging technologies and practices.
- Leverage and evolve existing core data products and patterns.
- Communicate and lead understanding of data architectural services across the enterprise.
- Ensure a focus on data quality by working effectively with data and system stewards.
Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, or equivalent experience.
- A minimum of 3 years' experience in a similar role.
- Demonstrable knowledge of secure DevOps and SDLC processes.
- Must have AWS or Azure experience.
- Experience with Data Vault 2.0 required; Snowflake a plus.
- Familiarity with system concepts and tools within an enterprise architecture framework, including cataloging, MDM, RDM, data lakes, storage patterns, etc.
- Excellent organizational and analytical abilities; outstanding problem solver.
- Good written and verbal communication skills.
Posted 3 days ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role will serve as a primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (Generative) AI solutions.
Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies.
- Write optimized SQL queries for data extraction, transformation, and loading.
- Utilize Python for advanced data processing, automation tasks, and system integration.
- Act as an advisor, drawing on your in-depth knowledge of Snowflake architecture, features, and best practices.
- Develop and maintain complex data pipelines and ETL processes in Snowflake (a minimal Streams-and-Tasks sketch follows below).
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions.
- Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions.
- Ensure data quality, integrity, and compliance throughout the data lifecycle.
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements.
- Document data models, processes, and workflows clearly for future reference and knowledge sharing.
- Build data tests, unit tests, and mock data frameworks.
Who You Are:
- Master's degree in Computer Science, Information Technology, or a related field.
- At least 3+ years of proven experience as a Snowflake Developer and a minimum of 8+ years of total experience with data modelling (OLAP and OLTP).
- Extensive hands-on experience in writing complex SQL queries and advanced Python, demonstrating proficiency in data manipulation and analysis for large data volumes.
- Strong understanding of data warehousing concepts, methodologies, and technologies, with in-depth experience in data modelling techniques (OLTP, OLAP, Data Vault 2.0).
- Experience building data pipelines using DBT (Data Build Tool) for data transformation.
- Familiarity with advanced performance-tuning methodologies in Snowflake, including query optimization.
- Strong knowledge of CI/CD pipelines, preferably in Azure DevOps.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication, collaboration, and interpersonal skills.
- Knowledge of additional data technologies (e.g., AWS, Azure, GCP) is a plus.
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform or CloudFormation is a plus.
- Experience in leading projects or mentoring junior developers is advantageous.
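To illustrate the kind of Snowflake pipeline automation this role involves, a minimal Streams-and-Tasks sketch. All object names (staging.orders, core.orders, transform_wh) are hypothetical:

```sql
-- Hypothetical sketch: incremental pipeline using a Stream (change capture) driven by a scheduled Task.
-- The stream records inserts/updates/deletes on the staging table since its last consumption.
CREATE OR REPLACE STREAM stg_orders_stream ON TABLE staging.orders;

-- The task wakes on a schedule, runs only when the stream has data, and merges changes downstream.
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('STG_ORDERS_STREAM')
AS
  MERGE INTO core.orders AS tgt
  USING (
    -- Updates surface in a stream as DELETE + INSERT pairs; keep only the INSERT image.
    SELECT * FROM stg_orders_stream WHERE METADATA$ACTION = 'INSERT'
  ) AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET tgt.status = src.status, tgt.updated_at = src.updated_at
  WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
                        VALUES (src.order_id, src.status, src.updated_at);

ALTER TASK load_orders_task RESUME;  -- tasks are created suspended by default
```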
Posted 1 week ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role will serve as a primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (Generative) AI solutions.
Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies.
- Write optimized SQL queries for data extraction, transformation, and loading.
- Utilize Python for advanced data processing, automation tasks, and system integration.
- Act as an advisor, drawing on your in-depth knowledge of Snowflake architecture, features, and best practices.
- Develop and maintain complex data pipelines and ETL processes in Snowflake.
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions.
- Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions.
- Ensure data quality, integrity, and compliance throughout the data lifecycle.
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements.
- Document data models, processes, and workflows clearly for future reference and knowledge sharing.
- Build data tests, unit tests, and mock data frameworks.
Who You Are:
- Bachelor's or master's degree in computer science, mathematics, or related fields.
- At least 8 years of experience as a data warehouse expert, data engineer, or data integration specialist.
- In-depth knowledge of Snowflake components, including security and governance.
- Proven experience in implementing complex data models (e.g., OLTP, OLAP, Data Vault).
- A strong understanding of ETL, including end-to-end data flows, from ingestion to data modeling and solution delivery.
- Proven industry experience with DBT and Jinja scripts.
- Strong proficiency in SQL, with additional knowledge of Python (i.e., pandas and PySpark) being advantageous.
- Familiarity with data and analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have.
- Experience working with Azure DevOps and warehouse automation tools (e.g., Coalesce) is a plus.
- Experience with healthcare R&D is a plus.
- Excellent English communication skills, with the ability to effectively engage both with R&D scientists and software engineers.
- Experience working in virtual and agile teams.
Posted 1 week ago
9.0 - 14.0 years
35 - 55 Lacs
Noida
Hybrid
Looking for a better opportunity? Join us and make things happen with DMI, an Encora company, now!
Encora is seeking a full-time Lead Data Engineer with logistics domain expertise to support our large-scale manufacturing client in digital transformation. The Lead Data Engineer is responsible for the day-to-day leadership and guidance of the local, India-based data team. This role will be the primary interface with the management team of the client and will work cross-functionally with various IT functions to streamline project delivery.
Minimum Requirements:
- 8+ years of experience overall in IT
- Current: 5+ years of experience on Azure Cloud as a Data Engineer
- Current: 3+ years of hands-on experience on Databricks / Azure Databricks
- Proficient in Python/PySpark
- Proficient in SQL/T-SQL
- Proficient in data warehousing concepts (ETL/ELT, Data Vault modelling, dimensional modelling, SCD, CDC)
Primary Skills: Azure Cloud, Databricks, Azure Data Factory, Azure Synapse Analytics, SQL/T-SQL, PySpark, Python, plus logistics domain expertise
Work Location: Noida, India (candidates who are open to relocation on an immediate basis can also apply)
Interested candidates can apply at nidhi.dubey@encora.com along with their updated resume, specifying:
1. Total experience
2. Relevant experience in Azure Cloud
3. Relevant experience in Azure Databricks
4. Relevant experience in Azure Synapse
5. Relevant experience in SQL/T-SQL
6. Relevant experience in PySpark
7. Relevant experience in Python
8. Relevant experience in the logistics domain
9. Relevant experience in data warehousing
10. Current CTC
11. Expected CTC
12. Official notice period (if serving, please specify LWD)
Posted 1 week ago
9.0 - 14.0 years
20 - 35 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & Responsibilities
JD: Primary skills: Data Vault 2.0, Snowflake, and DBT. Secondary skill: data modelling.
Responsibilities:
- Collaborate with business analysts and stakeholders to understand the business needs and data requirements
- Analyze business processes and identify critical data elements for modeling
- Conduct data profiling and analysis to identify data quality issues and potential data modeling challenges
- Develop conceptual data models to capture high-level business entities, attributes, and relationships within the database structure
- Use Data Vault to create data models that detail the different parts of a business, such as hubs, satellites, and links
- Build automated loading processes and patterns to minimize development expenses and operational costs
- Good knowledge of dimensional modeling and design patterns
- Lead and manage data operations projects, demonstrating strong experience in managing end-to-end data analytics workflows
- Lead the data engineering team in designing, building, and maintaining a data lake on Snowflake using ADF
- Analyze and interpret data from the Snowflake data warehouse to identify key trends and insights
- Document work processes and maintain clear documentation for future reference
- Solid understanding of data modeling concepts, including dimensional and fact modeling
- Bring hands-on experience from at least one implementation project related to reporting and visualization
- Apply best practices for dimensional and fact modeling for optimal performance and scalability
Qualifications:
- 8+ years of experience in dimensional modeling and design patterns
- Proven experience in designing and developing data models that detail business parts such as hubs, satellites, and links
- Familiarity with Snowflake and designing models on top of it
- Strong SQL skills for data retrieval and manipulation
- Experience with data visualization best practices and principles
- Ability to work independently and collaboratively within a team
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & Responsibilities
Role: Lead Data Engineer
Location: Chennai, Bangalore, Pune, Mumbai, Ahmedabad
Primary Skills: Snowflake, DBT (mandatory)
Secondary Skills: Experience with / good knowledge of data modeling, database design, data quality, and governance principles.
Responsibilities:
- Design and implement data pipelines using Azure Data Factory for data ingestion.
- Design and implement data pipelines using DBT for transformation and loading processes into Snowflake Data Vault 2.0.
- Execute ETL processes using DBT to load data from the Snowflake Business Data Vault to the Snowflake Consumption Layer, ensuring the smooth implementation of data integration strategies (a minimal dbt model sketch follows below).
- Demonstrate a good understanding of and hands-on experience in DevOps, actively participating in troubleshooting issues within the data operations environment.
- Participate in data quality checks and monitoring, implementing appropriate metrics and alerts.
- Work with the team to automate data pipelines and deploy code using CI/CD tools.
- Monitor and optimize data platform performance and resource utilization.
- Document data pipelines and processes for maintainability and knowledge sharing.
Qualifications:
- Overall 10+ years of experience in data engineering
- Demonstrated experience of 5+ years in the primary skills in a similar engineering role, showcasing proficiency in data transformation and integration projects
- Solid understanding of and hands-on experience in the Azure Cloud environment, particularly DBT, ADF, ADLS, Python, and Snowflake
- Proficient in troubleshooting data pipelines built using DBT and ADF
- Good experience in ETL processes and data integration
- Ability to work collaboratively in a team-oriented environment
- Excellent problem-solving and analytical skills
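For illustration, a minimal dbt model sketch of the vault-to-consumption loading pattern the posting describes: an incremental model reading from hypothetical business-vault tables (the hub/link/satellite names echo the hypothetical Data Vault sketch earlier in this listing set and are not from any real project):

```sql
-- Hypothetical dbt model: models/consumption/fct_customer_orders.sql
-- Incrementally materializes consumption-layer facts from business-vault entities.
{{
  config(
    materialized = 'incremental',
    unique_key   = 'order_hk',
    on_schema_change = 'append_new_columns'
  )
}}

SELECT
    l.link_customer_order_hk AS order_hk,
    h.customer_bk            AS customer_id,
    s.order_amount,
    s.order_status,
    s.load_dts
FROM {{ ref('link_customer_order') }} AS l
JOIN {{ ref('hub_customer') }}        AS h ON h.hub_customer_hk = l.hub_customer_hk
JOIN {{ ref('sat_order_details') }}   AS s ON s.link_customer_order_hk = l.link_customer_order_hk

{% if is_incremental() %}
  -- Only pick up satellite rows loaded since the last run of this model.
  WHERE s.load_dts > (SELECT MAX(load_dts) FROM {{ this }})
{% endif %}
```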
Posted 1 week ago
16.0 - 22.0 years
40 - 55 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & Responsibilities
- Minimum 15 years of experience
- Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies
- Experience in data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases
- Solid experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server)
- Good understanding of relational, dimensional, and Data Vault modelling
- Experience in implementing two or more data models in a database with data security and access controls
- Good experience in OLTP and OLAP systems
- Excellent data analysis skills, with demonstrable knowledge of standard datasets and sources
- Good experience with one or more cloud data warehouses (e.g., Snowflake, Redshift, Synapse)
- Experience on one or more cloud platforms (e.g., AWS, Azure, GCP)
- Understanding of DevOps processes
- Hands-on experience in one or more data modelling tools
- Good understanding of one or more ETL tools and data ingestion frameworks
- Understanding of data quality and data governance
- Good understanding of NoSQL databases and modeling techniques
- Good understanding of one or more business domains
- Understanding of the Big Data ecosystem
- Understanding of industry data models
- Hands-on experience in Python
- Experience in leading large and complex teams
- Good understanding of agile methodology
- Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization
- Excellent communication skills
- Understand the business requirements and translate them into conceptual, logical, and physical data models
- Work as a principal advisor on data architecture across various data requirements: aggregation, data lake, data models, data warehouse, etc.
- Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling
- Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects
- Suggest the best modelling approach to the client based on their requirements and target architecture
- Analyze and understand the datasets and guide the team in creating source-to-target mappings and data dictionaries, capturing all relevant details
- Profile the datasets to generate relevant insights
- Optimize the data models and work with the data engineers to define the ingestion logic, ingestion frequency, and data consumption patterns
- Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance
- Drive automation in modeling activities
- Collaborate with business stakeholders, data owners, business analysts, and architects to design and develop the next-generation data platform
- Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
- Guide and mentor team members, and review artifacts
- Contribute to the overall data strategy and roadmaps
- Propose and execute technical assessments and proofs of concept to promote innovation in the data space
Posted 1 week ago
7.0 - 10.0 years
15 - 30 Lacs
Pune
Hybrid
Looking for 7–10 years' experience (4+ in data modeling, 2–3 in Data Vault 2.0). Must know DBT, Dagster/Airflow, and GCP (BigQuery, CloudSQL). Hands-on Data Vault 2.0 is a must; Docker is a plus.
Posted 1 week ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Chennai
Work from Office
Data Modeler
Primary Skills: Data modeling (conceptual, logical, physical), relational/dimensional/data vault modeling, ERwin/IBM InfoSphere, SQL (Oracle, SQL Server, PostgreSQL, DB2), banking domain data knowledge (Retail, Corporate, Risk, Compliance), data governance (BCBS 239, AML, GDPR), data warehouse/lake design, Azure cloud.
Secondary Skills: MDM, metadata management, real-time modeling (payments/fraud), big data (Hadoop, Spark), streaming platforms (Confluent), M&A data integration, data cataloguing, documentation, regulatory trend awareness.
Soft Skills: Attention to detail, documentation, time management, and teamwork.
Posted 2 weeks ago
6.0 - 10.0 years
3 - 8 Lacs
Noida
Work from Office
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida / Bangalore
Education: B.E. / B.Tech. / MCA
Primary Skills: Snowflake, Snowpipe, SQL, data modelling, DV 2.0, data quality, AWS, Snowflake security
Good-to-have Skills: Snowpark, Data Build Tool, finance domain
Preferred Skills:
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).
Key Responsibilities:
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging (a minimal sketch of these controls follows below).
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.
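A minimal sketch of the governance controls listed above (RBAC, a masking policy, a resource monitor) in Snowflake SQL. Role, schema, table, and warehouse names are hypothetical:

```sql
-- Hypothetical sketch: role-based access plus column masking in Snowflake.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE  ON DATABASE edw              TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA   edw.marts        TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA edw.marts TO ROLE analyst_role;

-- Masking policy: only a privileged role sees raw email addresses.
CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE REGEXP_REPLACE(val, '.+@', '*****@')
  END;

ALTER TABLE edw.marts.dim_customer
  MODIFY COLUMN email SET MASKING POLICY mask_email;

-- Resource monitor: cap monthly credits and suspend the warehouse at the limit.
CREATE RESOURCE MONITOR IF NOT EXISTS monthly_cap
  WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90  PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_cap;
```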
Posted 2 weeks ago
7.0 - 12.0 years
25 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role: Data Modeler / Senior Data Modeler
Experience: 5 to 12 years
Locations: Hyderabad, Pune, Bengaluru
Position: Permanent
Must-have skills:
- Strong SQL
- Strong data warehousing skills
- ER/relational/dimensional data modeling
- Data Vault modeling
- OLAP, OLTP
- Schemas and data marts
Good-to-have skills:
- Data Vault
- ERwin / ER/Studio
- Cloud platforms (AWS or Azure)
Posted 2 weeks ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.
Key Responsibilities:
- Design and maintain enterprise data warehouse architecture
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas; a minimal star-schema sketch follows below)
- Ensure data quality, security, and performance
- Work with BI teams to support analytics and reporting needs
Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.)
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.)
- Strong understanding of dimensional modeling and OLAP
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow)
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa
Delivery Manager, Integra Technologies
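To make the dimensional-modeling requirement concrete, a minimal star-schema sketch in generic SQL. Table and column names are hypothetical:

```sql
-- Hypothetical sketch: a simple star schema - one fact table joined to conformed dimensions.
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
    full_date   DATE NOT NULL,
    month_name  VARCHAR(20),
    year_number INTEGER
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,  -- surrogate key
    product_code VARCHAR(50) NOT NULL, -- natural/business key
    product_name VARCHAR(200),
    category     VARCHAR(100)
);

CREATE TABLE fct_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
    quantity    INTEGER,
    net_amount  DECIMAL(18, 2)
);

-- Typical analytical query: aggregate the fact across dimension attributes.
SELECT d.year_number, p.category, SUM(f.net_amount) AS revenue
FROM fct_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.year_number, p.category;
```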
Posted 3 weeks ago
9.0 - 14.0 years
20 - 32 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Data Architect / Data Modeler role
Scope & Responsibilities:
- Enterprise data architecture for a functional domain or for a product group.
- Designs and governs delivery of the domain data architecture; ensures delivery as per design.
- Ensures consistency of approach in the data modelling of the domain's different solutions.
- Designs and ensures delivery of data integration across the solutions of the domain.
General Expertise:
- Critical: methodology expertise on data architecture and modeling, from business requirements and functional specifications to data modeling.
- Critical: data warehousing and business intelligence data product modeling (Inmon/Kimball/Data Vault/Codd modeling patterns).
- Business/functional knowledge of the domain. This requires an understanding of business terminology, knowledge of business processes related to the domain, and awareness of key principles and objectives, business trends, and evolution.
- Awareness of master data management and of data management and stewardship processes.
- Knowledge of data persistency technologies: SQL (ANSI SQL:2003 for structured relational data querying and SQL:2023 for XML, JSON, and property graph querying), Snowflake specifics, and database structures for performance optimization.
- NoSQL: awareness of other data persistency technologies.
- Proficient level of business English and technical writing.
- Nice to have: project delivery expertise through an agile approach, with experience/knowledge of methodologies such as Scrum, SAFe 5.0, and product-based organization.
Technical Stack Expertise:
- SAP PowerDesigner modeling (CDM, LDM, PDM)
- Snowflake general concepts, specifically DDL & DML, Snowsight, and Data Exchange / Data Sharing concepts
- AWS S3 & Athena (as a query user)
- Confluence & Jira (as a contributing user)
- Nice to have: Bitbucket (as a basic user)
Posted 3 weeks ago
7.0 - 12.0 years
20 - 22 Lacs
Bengaluru
Remote
Collaborate with senior stakeholders to gather requirements, address constraints, and craft adaptable data architectures. Convert business needs into blueprints, guide agile teams, maintain quality data pipelines, and drive continuous improvements.
Required Candidate Profile: 7+ years in data roles (Data Architect/Engineer). Skilled in modelling (incl. Data Vault 2.0), Snowflake, SQL/Python, ETL/ELT, CI/CD, data mesh, governance, and APIs. Agile; strong stakeholder and communication skills.
Perks and Benefits: As per industry standards.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Are you ready to play a key role in transforming Thomson Reuters into a truly data-driven company? Join our Data & Analytics (D&A) function and be part of the strategic ambition to build, embed, and mature a data-driven culture across the entire organization. The Data Architecture organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.
About the Role
In this opportunity as a Data Architect, you will:
- Lead architecture design and data platform evolution: Spearhead the conceptual, logical, and physical architecture design for our enterprise data platform (encompassing areas like our data lake, data warehouse, streaming services, and master data management systems). You will define and enforce data modeling standards, data flow patterns, and integration strategies to serve a diverse audience, from data engineers to AI/ML practitioners and BI analysts.
- Technical standards and best practices: Research and recommend technical standards, ensuring the architecture aligns with overall technology and product strategy. Be hands-on in implementing core components reusable across applications.
- Hands-on prototyping and framework development: While this is a strategic role, maintain a hands-on approach by designing and implementing proofs of concept and core reusable components/frameworks for the data platform. This includes developing best practices and templates for data pipelines, particularly leveraging dbt for transformations, and ensuring efficient data processing and quality.
- Champion data ingestion strategies: Design and oversee the implementation of robust, scalable, and automated cloud data ingestion pipelines from a variety of sources (e.g., APIs, databases, streaming feeds) into our AWS-based data platform, utilizing services such as AWS Glue, Kinesis, Lambda, and S3, and potentially third-party ETL/ELT tools.
- Design and optimize solutions using our core cloud data stack, including deep expertise in Snowflake (e.g., architecture, performance tuning, security, data sharing, Snowpipe, Streams, Tasks) and a broad range of AWS data services (e.g., S3, Glue, EMR, Kinesis, Lambda, Redshift, DynamoDB, Athena, Step Functions, MWAA/Managed Airflow) to build and automate end-to-end analytics and data science workflows.
- Data-driven decision-making: Make quick and effective data-driven decisions, demonstrating strong problem-solving and analytical skills. Align strategies with company goals.
- Stakeholder collaboration: Collaborate closely with external and internal stakeholders, including business teams and product managers. Define roadmaps, understand functional requirements, and lead the team through the end-to-end development process.
- Team collaboration: Work in a collaborative, team-oriented environment, sharing information and diverse ideas, and partnering with cross-functional and remote teams.
- Quality and continuous improvement: Focus on quality, continuous improvement, and technical standards. Keep the service focused on reliability, performance, and scalability while adhering to industry best practices.
- Technology advancement: Continuously update yourself with next-generation technology and development tools. Contribute to process development practices.
About You
You're a fit for the role of Data Architect, Data Platform if your background includes:
- Educational background: Bachelor's degree in Information Technology.
- Experience: 10+ years of IT experience, with at least 5 years in a lead design or architectural capacity.
- Technical expertise: Broad knowledge of and experience with cloud-native software design, microservices architecture, and data warehousing, and proficiency in Snowflake.
- Cloud and data skills: Experience building and automating end-to-end analytics pipelines on AWS; familiarity with NoSQL databases.
- Data pipeline and ingestion mastery: Extensive experience in designing, building, and automating robust and scalable cloud data ingestion frameworks and end-to-end data pipelines on AWS, including various ingestion patterns (batch, streaming, CDC) and tools.
- Advanced data modeling: Proficiency with data modeling concepts and demonstrable expertise in designing and implementing various data models (e.g., relational, dimensional, Data Vault, NoSQL schemas) for transactional, analytical, and operational workloads. Strong understanding of the data development lifecycle, from requirements gathering to deployment and maintenance.
- Leadership: Proven ability to lead architectural discussions, influence technical direction, and mentor data engineers, effectively balancing complex technical decisions with user needs and overarching business constraints.
- Programming skills: Strong programming skills in languages such as Python or Java for data manipulation, automation, and API development.
- Data governance and security acumen: Deep understanding of and practical experience in designing and implementing solutions compliant with robust data governance principles, data security best practices (e.g., encryption, access controls, masking), and relevant privacy regulations (e.g., GDPR, CCPA).
- Containerization and orchestration: Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
What's in it For You?
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 4 weeks ago
1.0 - 4.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Develop/enhance data warehousing functionality, including the use and management of the Snowflake data warehouse and the surrounding entitlements, pipelines, and monitoring, in partnership with Data Analysts and Architects and with guidance from the lead Data Engineer.
About the Role
In this opportunity as Data Engineer, you will:
- Develop/enhance data warehousing functionality, including the use and management of the Snowflake data warehouse and the surrounding entitlements, pipelines, and monitoring, in partnership with Data Analysts and Architects and with guidance from the lead Data Engineer
- Innovate with new approaches to meeting data management requirements
- Effectively communicate and liaise with other data management teams embedded across the organization and with data consumers in data science and business analytics teams
- Analyze existing data pipelines and assist in enhancing and re-engineering the pipelines as per business requirements
- Bachelor's degree or equivalent required; Computer Science or related technical degree preferred
About You
You're a fit for the role if your background includes:
- Mandatory skills: data warehousing, data models, data processing (good to have), SQL, Power BI / Tableau, Snowflake (good to have), Python
- 3.5+ years of relevant experience in the implementation of data warehouses and data management technologies for large-scale organizations
- Experience in building and maintaining optimized and highly available data pipelines that facilitate deeper analysis and reporting
- Experience analyzing data pipelines
- Knowledgeable about data warehousing, including data models and data processing
- Broad understanding of the technologies used to build and operate data and analytics systems
- Excellent critical thinking, communication, presentation, documentation, troubleshooting, and collaborative problem-solving skills
- Beginner to intermediate knowledge of AWS, Snowflake, and Python
- Hands-on experience with programming and scripting languages
- Knowledge of and hands-on experience with Data Vault 2.0 is a plus
Also have experience in and comfort with some of the following skills/concepts:
- Good at writing SQL and performance tuning
- Data integration tools like DBT, Informatica, etc.
- Intermediate skills in a programming language like Python/PySpark/Java/JavaScript
- AWS services and management, including serverless, container, queueing, and monitoring services
- Consuming and building APIs
The "What's in it For You" benefits and "About Us" sections for this role are the same as listed for the Thomson Reuters Data Architect posting above.
Posted 4 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Mumbai
Work from Office
#JobOpening Data Engineer (Contract | 6 Months)
Location: Hyderabad | Chennai | Remote flexibility possible
Type: Contract | Duration: 6 months
We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.
#KeyResponsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory
- Monitor and support production ETL jobs
- Develop and maintain data lineage documentation for all systems
- Design data mapping and documentation to aid QA/UAT testing
- Evaluate and recommend modern data integration tools
- Optimize shared data workflows and batch schedules
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows
- Participate in performance tuning and improvement recommendations
- Support BI/MDM initiatives, including Data Vault and data lakes
#RequiredSkills
- 7+ years of experience in data engineering roles
- Strong command of SQL, with 5+ years of hands-on development
- Deep experience with Snowflake, Azure Data Factory, and dbt
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
- Bachelor's in CS, Engineering, Math, or a related field
- Experience in the healthcare domain (working with PHI/PII data)
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
- Excellent communication and documentation skills
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical-thinking abilities
#NiceToHave
- Experience with data lakes and data vaults
- QA & UAT alignment with clear development documentation
- Multi-cloud experience (especially Azure, AWS)
#ContractDetails
- Role: Data Engineer
- Contract Duration: 6 months
- Location Options: Hyderabad / Chennai (remote flexibility available)
Posted 1 month ago