6.0 - 10.0 years
0 Lacs
Karnataka
On-site
NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with the organization. If you aspire to be part of an inclusive, adaptable, and forward-thinking workplace, we encourage you to apply now.

We are currently looking for a Data Engineer proficient in SQL, Snowflake, AWS, Git, and Jenkins to join our team in Bangalore, Karnataka, India. As a Data Engineer, you will deploy code using Git and Jenkins, work with large-scale data sets, and draw on exposure to relational and NoSQL databases and ETL tools. Knowledge of Snowflake, AWS, Python, data warehousing, and data modeling is essential for this role.

Key Skills:
- Proficiency in SQL, Snowflake, and AWS
- Experience with Git and Jenkins for code deployment
- Exposure to large-scale data sets
- Familiarity with relational and NoSQL databases and ETL tools
- Knowledge of Python, data warehousing, and data modeling

Good-to-Have Skills:
- Passion for data-driven enterprise business strategy
- Strong communication skills for both technical and business interactions
- Ability to build trust and collaborate with cross-functional teams
- Self-directed and capable of managing complex projects independently
- Understanding of continuous integration techniques
- Experience in Agile/Scrum methodologies
- Strong analytical, diagnostic, and problem-solving abilities
- Results-oriented with a focus on delivering business value
- Experience in the financial sector is a plus

Minimum Experience Required: 6-9 years

General Expectations:
1) Excellent communication skills
2) Willingness to work a 10:30 AM to 8:30 PM shift
3) Flexibility to work at client locations in Bangalore
4) Readiness to work in a hybrid office environment
5) Full return to the office expected by 2025

Pre-Requisites:
1) Genuine, digitally signed Form 16 for all employments
2) Employment history details present in UAN/PPF statements
3) Candidates must undergo video screening to verify their authenticity and work setup
4) Real work experience on the mandatory skills is required
5) Notice period of 0 to 3 weeks
6) Screening for any gaps in education or employment

About NTT DATA: NTT DATA is a trusted global innovator in business and technology services, serving 75% of the Fortune Global 100. Committed to helping clients innovate and succeed in the long term, NTT DATA offers diverse expertise across more than 50 countries. As a Global Top Employer, we provide business and technology consulting, data and artificial intelligence, industry solutions, and application development. We are a leading provider of digital and AI infrastructure, investing in R&D to support organizations in their digital transformation journey. For more information, visit us at us.nttdata.com.
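By way of illustration (not part of the posting's requirements), the SQL-plus-Snowflake-plus-Python combination this role asks for often looks like the minimal sketch below, which runs an aggregate query through Snowflake's official Python connector. The account, credentials, warehouse, and table names are placeholders, not details of NTT DATA's environment.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder, e.g. "xy12345.ap-south-1"
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="SALES_DB",       # hypothetical database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Aggregate server-side instead of pulling raw rows over the wire.
    cur.execute(
        "SELECT region, COUNT(*) AS orders "
        "FROM orders GROUP BY region ORDER BY orders DESC"
    )
    for region, orders in cur.fetchall():
        print(region, orders)
finally:
    conn.close()
```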
Posted 1 week ago
5.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Software Architect at Adobe, you will play a crucial role in defining and evolving the architectural vision and roadmap for our products. Your responsibilities will include:
- Ensuring alignment with business goals and providing proactive thought leadership
- Designing and overseeing the implementation of highly scalable distributed systems
- Driving the technical delivery of AI-powered features and exploring and implementing AI solutions
- Fostering a culture of technical excellence and collaboration within the engineering organization

You will need a passion for what you do, along with 14+ years of experience in software development and 5+ years in a software architect role. Deep expertise in designing and implementing highly scalable architectures; proficiency in Java, Spring Boot, REST services, MySQL or Postgres, MongoDB, and Kafka; and experience with cloud technologies such as AWS and/or Azure are essential. A strong understanding of Artificial Intelligence (AI), particularly Generative AI (GenAI) and agents, is also required.

The ideal candidate will be ambitious, thrive in a fast-paced environment, demonstrate a strong bias to action, and possess excellent interpersonal, analytical, problem-solving, and conflict-resolution skills. Strong business acumen, self-motivation, and the ability to mentor a team toward high-quality deliverables are also key attributes. A Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field is necessary for this role.

If you are looking to join a team of passionate engineers at Adobe, drive technical excellence, and help shape the technology stack of next-gen products and offerings, this Software Architect position is for you. Join us in our mission to transform how companies interact with customers across every screen, and be part of a culture that values innovation, collaboration, and continuous learning.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. You will design, develop, and lead the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. You will also develop and manage ETL/ELT workflows to ingest, transform, and load data from various sources into Snowflake, integrating data from diverse systems such as databases, APIs, flat files, and cloud storage.

Monitoring and tuning Snowflake performance, you will manage caching, clustering, and partitioning to enhance efficiency while analyzing and resolving query performance bottlenecks. You will work closely with data analysts, data engineers, and business users to understand reporting and analytics needs, ensuring seamless integration with BI tools like Power BI. Your role will also involve collaborating with the DevOps team on automation, deployment, and monitoring, as well as planning and executing strategies for scaling Snowflake environments as data volume grows. Keeping up to date with emerging trends and technologies in data warehousing and data management is essential, along with providing technical support, troubleshooting, and guidance to users accessing the data warehouse.

To be successful in this role, you must have 8 to 10 years of experience with data management tools such as Snowflake, StreamSets, and Informatica. Experience with monitoring tools like Dynatrace and Splunk, Kubernetes cluster management, and Linux is required. Familiarity with containerization technologies, cloud services, and CI/CD pipelines, as well as banking or financial services experience, would be advantageous.

Thank you for considering employment with Fiserv. To apply, please use your legal name, complete the step-by-step profile, and attach your resume. Fiserv is committed to diversity and inclusion and does not accept resume submissions from agencies outside of existing agreements. Beware of fraudulent job postings not affiliated with Fiserv; protect your personal information and financial security.
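As a hedged illustration of the caching, clustering, and partitioning work this role describes, the sketch below issues common Snowflake tuning statements through the Python connector. Every object name and threshold is hypothetical, not taken from Fiserv's environment.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    role="SYSADMIN", warehouse="ADMIN_WH",  # placeholders
)
tuning_statements = [
    # Cluster a large fact table on its most common filter columns.
    "ALTER TABLE analytics.public.fact_payments CLUSTER BY (event_date, region)",
    # Auto-suspend an idle warehouse after 60 seconds to cut credit spend.
    "ALTER WAREHOUSE reporting_wh SET AUTO_SUSPEND = 60",
    # Check how well clustered the table is after a large reload.
    "SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.public.fact_payments')",
]
try:
    cur = conn.cursor()
    for stmt in tuning_statements:
        cur.execute(stmt)
        print(stmt, "->", cur.fetchone())
finally:
    conn.close()
```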
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
You are a skilled Data Engineer with expertise in data modeling, SQL, Snowflake, Python, AWS, and NoSQL. Your primary responsibility will be designing and implementing scalable data solutions that ensure efficient data storage, retrieval, and processing. Experience in NoSQL data modeling is an additional advantage.

Your key responsibilities will include:
- Designing and implementing data models to support analytical and operational workloads
- Developing and managing SQL queries for data extraction, transformation, and loading (ETL)
- Working extensively with Snowflake to build scalable data pipelines and warehouses
- Developing Python scripts for data processing and automation
- Implementing AWS services for cloud-based data solutions
- Working with NoSQL databases to handle semi-structured and unstructured data
- Ensuring data accuracy, consistency, and security across storage systems
- Collaborating with data scientists, analysts, and software engineers to deliver business insights

You must possess strong experience in data modeling for both relational and NoSQL databases, proficiency in SQL with practical experience in database technologies, hands-on experience with Snowflake for data warehousing, strong Python programming skills for data processing, expertise in AWS cloud services for data infrastructure, and experience working with NoSQL databases. Familiarity with NoSQL data-modeling best practices is a plus.

Location: Bangalore
Experience: 6-9 years
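To make the relational-versus-NoSQL modeling contrast above concrete, here is a small illustrative sketch: the same order entity modeled as a normalized relational table (DDL kept as a string) and as an embedded MongoDB document via pymongo. All names and values are invented.

```python
from pymongo import MongoClient

# Relational modeling: orders and line items live in separate normalized
# tables joined by order_id (DDL shown as a string purely for illustration).
ORDER_ITEMS_DDL = """
CREATE TABLE order_items (
  order_id   INT REFERENCES orders(order_id),
  sku        VARCHAR,
  quantity   INT,
  unit_price NUMERIC(10, 2)
);
"""

# Document modeling: line items are embedded inside the order, trading
# join flexibility for single-read access patterns.
client = MongoClient("mongodb://localhost:27017")  # placeholder URI
orders = client["shop"]["orders"]
orders.insert_one({
    "order_id": 1001,
    "customer": {"id": 42, "region": "IN-KA"},
    "items": [
        {"sku": "A-100", "quantity": 2, "unit_price": 499.00},
        {"sku": "B-250", "quantity": 1, "unit_price": 1299.00},
    ],
})
```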
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer for our data-rich e-commerce platform serving the life sciences sector, you will support infrastructure, develop data pipelines, and deploy pricing logic. You will play a crucial role in ensuring the usability and interface design of internal tools that facilitate experimentation, pricing configuration, and real-time monitoring.

Your key responsibilities will include:
- Building and maintaining ETL pipelines for pricing, shipping, and behavioral datasets
- Collaborating with data scientists and product managers to facilitate model development and experimentation
- Developing APIs or backend logic to implement dynamic pricing algorithms
- Creating internal dashboards or tools with a strong focus on usability and performance
- Ensuring data quality, reliability, and documentation across all systems
- Performing feature engineering to support predictive and optimization algorithms
- Aggregating and transforming high-dimensional datasets at scale to enhance modeling efficiency and robustness
- Optimizing algorithm performance for real-time, large-scale deployment

To excel in this role, you must have:
- Flexibility to thrive in a dynamic, startup-like environment and tackle diverse tasks with innovative solutions
- 3+ years of experience in data engineering or backend development
- Proficiency in Databricks and distributed data-processing frameworks
- Strong skills in Python, SQL, and cloud-based platforms such as AWS, BigQuery, and Snowflake
- Demonstrated expertise in designing user-friendly internal tools and interfaces
- Familiarity with experimentation systems and monitoring infrastructure
- Experience efficiently handling large-scale, high-dimensional datasets
- Domain knowledge in e-commerce preferred, with familiarity with the pharmaceutical or scientific supply sector a strong advantage

This is a contract role with potential for conversion to full-time, running from August to December 2025. The preferred location is Bangalore, with Mumbai and Kathmandu as alternatives. If you are looking to contribute to a cutting-edge platform and drive impactful change in the life sciences industry, we welcome your application.
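For illustration only, the "APIs or backend logic to implement dynamic pricing algorithms" responsibility could look something like the toy FastAPI endpoint below. The demand-based adjustment rule, SKU, and prices are fabricated placeholders, not the platform's actual pricing logic.

```python
from fastapi import FastAPI

app = FastAPI()

BASE_PRICES = {"reagent-kit-01": 120.0}  # hypothetical catalog entry

@app.get("/price/{sku}")
def price(sku: str, demand_index: float = 1.0) -> dict:
    """Return a dynamically adjusted price for a SKU.

    demand_index > 1 means demand is above baseline; this toy rule nudges
    the price up or down by at most 10%.
    """
    base = BASE_PRICES.get(sku)
    if base is None:
        return {"sku": sku, "error": "unknown sku"}
    adjustment = max(0.9, min(1.1, demand_index))
    return {"sku": sku, "price": round(base * adjustment, 2)}

# Run locally with: uvicorn pricing_service:app --reload
```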
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
DXFactor is a US-based tech company working with customers globally, and a certified Great Place to Work. We are currently seeking candidates for the role of Data Engineer with 4 to 6 years of experience. Our presence spans the US and India, specifically Ahmedabad. As a Data Engineer at DXFactor, you will specialize in Snowflake, AWS, and Python.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for both batch and streaming workflows
- Implement robust ETL/ELT processes to extract data from diverse sources and load it into data warehouses
- Build and optimize database schemas following best practices in normalization and indexing
- Create and update documentation for data flows, pipelines, and processes
- Collaborate with cross-functional teams to translate business requirements into technical solutions
- Monitor and troubleshoot data pipelines to ensure optimal performance
- Implement data quality checks and validation processes
- Develop and manage CI/CD workflows for data engineering projects
- Stay current with emerging technologies and suggest enhancements to existing systems

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 4+ years of experience in data engineering roles
- Proficiency in Python programming and SQL query writing
- Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra)
- Familiarity with data warehousing technologies such as Snowflake, Redshift, and BigQuery
- Demonstrated ability to build efficient, scalable data pipelines
- Practical knowledge of batch and streaming data-processing methods
- Experience implementing data validation, quality checks, and error-handling mechanisms
- Experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight)
- Understanding of data architectures including data lakes, data warehouses, and data mesh
- Proven ability to debug complex data flows and optimize underperforming pipelines
- Strong documentation skills and effective communication of technical concepts
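As a sketch of the "data quality checks and validation processes" bullet above, the snippet below shows a minimal pandas-based gate of the kind a pipeline might run before loading a warehouse. The column names and thresholds are assumptions for illustration.

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures (empty list = pass)."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        failures.append("negative order amounts")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing customer ids
        failures.append(f"customer_id null rate {null_rate:.1%} exceeds 1%")
    return failures

# Toy frame that trips all three checks.
df = pd.DataFrame({
    "order_id": [1, 2, 2],
    "amount": [250.0, -10.0, 99.0],
    "customer_id": [7, None, 9],
})
print(validate_orders(df))
```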
Posted 1 week ago
9.0 - 13.0 years
0 Lacs
Hyderabad, Telangana
On-site
You have a great opportunity to join our team as a Data Architect with 9+ years of experience. In this role, you will design, implement, and manage cloud-based solutions on AWS and Snowflake. Your main tasks will include working with stakeholders to gather requirements, designing solutions, developing and executing test plans, and overseeing the information architecture for the data warehouse.

To excel in this role, you must have strong skills in Snowflake and DBT, along with data architecture design experience in data warehousing. Informatica (or other ETL) knowledge or hands-on experience and an understanding of Databricks would be beneficial. You should have 9-11 years of IT experience, including 3+ years of data architecture experience in data warehousing and 4+ years in Snowflake.

As a Data Architect, you will optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency. You should have a deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (extract, transform, load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.

In addition to your technical responsibilities, you will maintain detailed documentation for data solutions and processes, provide training and leadership to share expertise and best practices with the team, and collaborate with the data engineering team to ensure that data solutions are developed according to best practices.

If you have 10+ years of overall experience architecting and building large-scale, distributed big data products, expertise in designing and implementing highly scalable, highly available cloud services and solutions, experience with AWS and Snowflake, and a strong understanding of data warehousing and data engineering principles, then this role is perfect for you.

This is a full-time position based in Hyderabad, Telangana, with a Monday to Friday work schedule; you must be able to reliably commute or plan to relocate before starting work. As part of the application process, we would like to know your notice period, years of experience in Snowflake, data architecture experience in data warehousing, current location, willingness to work from the office in Hyderabad, current CTC, and expected CTC. If you meet the requirements and are excited about this opportunity, we look forward to receiving your application.

(Note: A total of 9 years of work experience is required for this position.)
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Project Manager at Myridius, you will lead projects from start to finish, ensuring successful delivery within budget and timeline constraints. You will define project scope, objectives, and milestones; communicate them effectively to stakeholders; and manage project resources, including team members, budget, and technology stack. Proactively identifying and mitigating project risks will be a key part of your role, along with tracking project progress and providing regular status updates to stakeholders.

In terms of team management, you will build and lead a high-performing team of data engineers and analysts. Creating a collaborative and productive work environment by promoting open communication and effective problem-solving will be essential. You will assign tasks based on team members' strengths and workload capacity, and provide regular feedback, coaching, and support for team members' growth.

Your technical skills should include experience managing data warehousing projects on platforms like Snowflake and a basic understanding of cloud computing infrastructure and platforms such as AWS, Azure, or GCP. Collaborating with the data architect to ensure the data architecture can handle continuous data growth and complexity will also be part of your responsibilities.

Maintaining clear and consistent communication with stakeholders, facilitating collaboration across cross-functional teams, and resolving data-related issues effectively are crucial aspects of your role. You will contribute to Cloud Center of Excellence initiatives, share knowledge and best practices within the cloud solutioning domain, and foster a culture of continuous learning and collaborative problem-solving within the team. Recruiting and onboarding new talent to strengthen the cloud practice, implementing coaching and mentoring programs to upskill the team, and fostering cross-functional collaboration to achieve project goals and milestones will also be among your responsibilities.

Motivating and inspiring the project team, making clear and informed decisions under pressure, managing project costs effectively, and ensuring timely project completion within defined timelines and budget constraints are key components of the role. If you are passionate about driving innovation and excellence in data project management and thrive in a dynamic, collaborative work environment, this role at Myridius is the perfect opportunity to make a significant impact in the rapidly evolving landscapes of technology and business. Visit www.myridius.com to learn more and be part of our transformative journey in helping businesses thrive in a world of continuous change.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Senior Analyst-Qlik Sense Developer at Alexion, you will leverage your expertise in data analysis and data visualization tools to transform raw data into actionable insights. Your role will be crucial in designing, developing, and maintaining the data reporting solutions and analytics platforms that drive informed decision-making and support strategic initiatives within the organization.

Your primary accountability will be supporting the Alexion team with field-force reporting: designing, developing, validating, and maintaining Qlik Sense dashboards for various business units and indications. You will work closely with stakeholders to understand business objectives, data sources, and key performance indicators in order to design effective solutions. Additionally, you will design and implement data models in Qlik Sense, including data extraction, transformation, and loading processes using SQL.

In this role, you will integrate data from multiple sources to ensure accuracy, consistency, and optimal performance of the analytics platforms. You will develop interactive dashboards, reports, and visualizations using Qlik Sense, identifying and addressing performance bottlenecks to optimize the user experience. Collaborating with cross-functional teams, conducting thorough testing of Qlik applications, and communicating project status and recommendations to stakeholders will also be essential aspects of your responsibilities.

To excel in this role, you should possess an advanced understanding of and experience with SQL, Snowflake, and Veeva CRM, along with expertise in Qlik scripting, data modeling concepts, and data visualization tools. Desirable qualifications include a background in computer science or a related field, 5-6 years of experience with reporting and visualization applications, and proficiency in web development technologies such as JavaScript and CSS. Strong analytical, problem-solving, communication, and interpersonal skills are also key requirements for this position.

Join us at AstraZeneca's Alexion division, where your work is not just a job but a mission to make a real difference in the lives of patients worldwide. We offer a dynamic and inclusive environment where you can grow both personally and professionally, supported by exceptional leaders who value diversity and innovation. If you are ready to make an impact and contribute to our mission, apply now to join our team and be a part of our unique and ambitious world.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf, drawing on skills in project management, technology, and process optimization to deliver high-quality services. Those in managed service management and strategy at PwC are responsible for transitioning and running services; managing delivery teams, programs, commercials, performance, and delivery risk; and continuously improving and optimizing managed services processes, tools, and services.

As an Associate at PwC, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. Professional skills and responsibilities at this level include using feedback and reflection to develop self-awareness, demonstrating critical thinking, and bringing order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. Additionally, you will seek exposure to different situations, environments, and perspectives; uphold the firm's code of ethics; demonstrate leadership capabilities; and work in a team environment that includes client interactions and cross-team collaboration.

Required Skills:
- AWS Cloud Engineer
- Minimum 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms
- 1-3 years of operate/managed services/production support experience
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines
- Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS
- Building efficient ETL/ELT processes using industry-leading tools such as AWS services, PySpark, SQL, and Python
- Implementing data validation and cleansing procedures
- Monitoring and troubleshooting data pipelines
- Implementing and maintaining data security and privacy measures
- Strong communication, problem-solving, quantitative, and analytical abilities

Nice to Have:
- AWS certification

Through our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients' enterprises through technology and human-enabled experiences, empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help-desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have a minimum of 3 years of experience in a similar role and be proficient in the Java and Python programming languages. A strong understanding of, and working experience with, Solidatus is required, along with a solid understanding of the XML and JSON data formats. Knowledge of relational SQL and NoSQL databases such as Oracle, MSSQL, and Snowflake is essential.

Preferred qualifications include exposure to NLP and LLM technologies and approaches, experience with machine learning and data mining techniques, familiarity with data security and privacy concerns, knowledge of data warehousing and business intelligence concepts, and an advanced degree in Computer Science, Engineering, or a related field. At minimum, the ideal candidate will have a Bachelor's degree in Computer Science, Engineering, or a related field.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer specializing in Databricks, your primary responsibility will be to develop, support, and drive end-to-end business intelligence solutions using Databricks. You will collaborate with business analysts and data architects to transform requirements into technical implementations. Your role will involve designing, developing, implementing, and maintaining PySpark code through the Databricks UI to enable data and analytics use cases for the client. You will code, test, and document new or enhanced data systems to build robust and scalable applications for data analytics; dig into performance, scalability, capacity, and reliability issues to identify and address emerging challenges; and engage in research projects and proofs of concept to enhance data processing capabilities.

Key Requirements:
- 3+ years of hands-on experience with Databricks and PySpark
- Proficiency in SQL and adept data-manipulation skills
- Sound understanding of data warehousing concepts and technologies
- Familiarity with Google Pub/Sub, Kafka, or MongoDB is a plus
- Knowledge of ETL processes and tools for data extraction, transformation, and loading is beneficial
- Experience with cloud platforms such as Databricks, Snowflake, or Google Cloud
- Understanding of data governance and data quality best practices

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field
- Continuous learning demonstrated through technical certifications or related methods
- 3+ years of relevant experience in data analytics, preferably within the retail domain

Desired Qualities:
- Self-motivated and dedicated to achieving outcomes for a rapidly growing team and organization
- Effective communication through verbal, written, and client presentations

Location: India
Years of Experience: 3 to 5 years

In this role, your expertise in Databricks and data engineering will play a crucial part in delivering impactful business intelligence solutions and contributing to the growth and success of the organization.
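A minimal sketch of the day-to-day PySpark work described above: reading raw data, deriving a daily rollup, and writing a partitioned output. Paths and column names are placeholders; on Databricks the Spark session is provided for you.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` already exists; the builder
# line is only needed when running elsewhere.
spark = SparkSession.builder.appName("retail-sales-rollup").getOrCreate()

sales = spark.read.parquet("/mnt/raw/sales")  # hypothetical mount point

# Roll raw events up to one row per store per day.
daily = (
    sales
    .withColumn("sale_date", F.to_date("sold_at"))
    .groupBy("store_id", "sale_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("orders"),
    )
)

daily.write.mode("overwrite").partitionBy("sale_date").parquet("/mnt/curated/daily_sales")
```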
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As a skilled Data Engineer with 7-10 years of experience, you will be a valuable addition to our dynamic team in India. Your primary focus will be designing and optimizing data pipelines that handle large datasets efficiently and extract valuable business insights.

Your responsibilities will include designing, building, and maintaining scalable data pipelines and architecture, and developing and enhancing ETL processes for data ingestion and transformation. You will collaborate closely with data scientists and analysts to understand data requirements and deliver effective solutions. Monitoring data integrity through data quality checks and ensuring compliance with data governance and security policies will also be part of your role, as will leveraging cloud-based data technologies and services for storage and processing.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proficiency in SQL and practical experience with databases such as MySQL, PostgreSQL, or Oracle is essential. Expertise in programming languages like Python, Java, or Scala is highly valuable, along with hands-on experience with big data technologies like Hadoop, Spark, or Kafka. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is preferred. An understanding of data warehousing concepts and tools such as Redshift and Snowflake, coupled with experience in data modeling and architecture design, will further strengthen your candidacy.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
You will work as a Data Engineer with expertise in Python and PySpark programming. You should have a strong background in cloud services such as Snowflake, Databricks, Informatica, Azure, AWS, and GCP, as well as proficiency in reporting technologies like Power BI, Tableau, Spotfire, Alteryx, and MicroStrategy. Your responsibilities will include developing and maintaining data pipelines, optimizing data workflows, and ensuring the efficiency and reliability of data integration processes.

You are expected to possess strong programming skills in Python and PySpark, along with a deep understanding of SQL. Experience with Snowflake, Databricks, Power BI, MicroStrategy, Tableau, and Spotfire is essential, and familiarity with Informatica and Azure/AWS services would be advantageous.

The interview process will be conducted virtually, and the work model for this position is remote. If you have 7-10 years of experience in this field and are available to start within 15 days, please apply by sending your resume to netra.s@twsol.com.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are a highly skilled Architect with expertise in Snowflake data modeling and cloud data solutions. With over 12 years of experience in data modeling/data warehousing and 5+ years specifically in Snowflake, you will lead Snowflake optimizations at the warehouse and database levels. Your role involves setting up, configuring, and deploying Snowflake components efficiently for various projects.

You will work with a passionate team of engineers at ValueMomentum's Engineering Center, focused on transforming the P&C insurance value chain through innovative solutions. The team specializes in cloud engineering, application engineering, data engineering, core engineering, quality engineering, and domain expertise. As part of the team, you will have opportunities for role-specific skill development and contribute to impactful projects.

Key Responsibilities:
- Work on Snowflake optimizations at the warehouse and database levels
- Set up, configure, and deploy Snowflake components such as databases, warehouses, and roles
- Set up and monitor data shares and Snowpipes for Snowflake projects
- Implement Snowflake cloud management frameworks for monitoring, alerting, governance, budgets, change management, and cost optimization
- Develop cloud usage reporting for cost-related insights, metrics, and KPIs
- Build and enhance Snowflake forecasting processes and explore cloud spend trends

Requirements:
- 12+ years of experience in data modeling/data warehousing
- 5+ years of experience in Snowflake data modeling and architecture, including expertise in cloning, data sharing, and search optimization
- Proficiency in Python, PySpark, and complex SQL for analysis
- Experience with cloud platforms like AWS, Azure, and GCP
- Knowledge of Snowflake performance management and cloud-based database role management

ValueMomentum is a leading solutions provider for the global property and casualty insurance industry, focused on helping insurers achieve sustained growth, high performance, and stakeholder value. The company has served over 100 insurers and is dedicated to fostering resilient societies.

Benefits at ValueMomentum include a competitive compensation package, career advancement opportunities through coaching and mentoring programs, comprehensive training and certification programs, and performance management with goal setting, continuous feedback, and rewards for exceptional performers.
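To illustrate the "budgets and cost optimization" responsibility, the sketch below bootstraps a Snowflake resource monitor and binds a right-sized warehouse to it via the Python connector. All names, quotas, and thresholds are invented, not ValueMomentum's actual configuration.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    role="ACCOUNTADMIN",  # resource monitors require elevated privileges
)
setup = [
    # Alert at 80% of the monthly credit budget, hard-stop at 100%.
    """CREATE RESOURCE MONITOR IF NOT EXISTS monthly_budget
       WITH CREDIT_QUOTA = 500 FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    # A small, auto-suspending warehouse for BI workloads.
    """CREATE WAREHOUSE IF NOT EXISTS bi_wh
       WITH WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE""",
    "ALTER WAREHOUSE bi_wh SET RESOURCE_MONITOR = monthly_budget",
]
try:
    cur = conn.cursor()
    for stmt in setup:
        cur.execute(stmt)
finally:
    conn.close()
```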
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a Data Product Analyst at Wells Fargo, you will participate in data product initiatives of low to moderate complexity. You will identify opportunities for data roadmap improvements within your scope of responsibility to drive data enablement and capabilities across platforms and utilities, and review and analyze basic business, operational, or technical assignments that require research and evaluation. You will present recommendations for resolving data product situations, collaborate with stakeholders to understand business requirements, and manage datasets with a focus on consumer needs and data governance standards. You will also participate in the creation and maintenance of data product roadmaps, gather data requirements, and communicate data problems and initiatives effectively to all audiences.

Your responsibilities will further involve participating in analysis to identify and remediate data quality issues, adhering to data governance standards, and designing data governance and data quality policies. You will support regulatory analysis and reporting requirements, work with business and technology partners to document metadata about systems, and assess the current state of data quality. You will be expected to assist in implementing data processes, monitor data flows, ensure consistent data definitions across systems, collaborate with data engineers, and resolve data quality issues.

Required qualifications: 2+ years of data product or data management experience, or equivalent demonstrated expertise in maintaining and improving data quality across the organization.

Desired qualifications include experience in large enterprise data initiatives, managing data entry processes, resolving data quality issues, banking business or technology experience, and familiarity with BI tools and cloud concepts. Knowledge of T-SQL, databases, data warehousing, ETL concepts, BI solutions, Agile principles, and related technical skills is preferred for this position.

The posting end date for this job is 17 Jul 2025, though it may close early due to the volume of applicants. Wells Fargo values equal opportunity and encourages applications from all qualified candidates. The company maintains a drug-free workplace and requires candidates to represent their own experiences during the recruiting and hiring process. If you require a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo for assistance. Third-party recordings are prohibited unless authorized by Wells Fargo, and candidates should adhere to the company's recruitment and hiring requirements.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
As a Senior Data Scientist on the Global Data Science & Advanced Analytics team at Colgate-Palmolive, you will lead projects within the Analytics Continuum. You will conceptualize and develop machine learning, predictive modeling, simulation, and optimization solutions that address business questions with clear dollar objectives. Your work will have a significant impact on revenue growth management, price elasticity, promotion analytics, and marketing mix modeling.

Your responsibilities will include:
- Conceptualizing and building predictive modeling solutions to address business use cases
- Applying machine learning and AI algorithms to develop scalable solutions for business deployment
- Developing end-to-end business solutions, from data extraction to statistical modeling
- Conducting model validations and continuously improving algorithms
- Deploying models using Airflow and Docker on Google Cloud Platform
- Leading pricing, promotion, and marketing mix initiatives from scoping to delivery
- Studying large datasets to discover trends and patterns
- Presenting insights in a clear, interpretable manner to business teams
- Developing visualizations using frameworks like Looker, PyDash, Flask, Plotly, and Streamlit
- Collaborating closely with business partners across different geographies

To qualify for this position, you should have:
- A degree in Computer Science, Information Technology, Business Analytics, Data Science, Economics, or Statistics
- 5+ years of experience building statistical models and deriving insights
- Proficiency in Python and SQL for coding and statistical modeling
- Hands-on experience with statistical models such as linear regression, random forest, SVM, logistic regression, clustering, and Bayesian regression
- Knowledge of GitHub, Airflow, and visualization frameworks
- An understanding of Google Cloud and related services like Kubernetes and Cloud Build

Preferred qualifications include experience with revenue growth management, pricing, marketing mix models, and third-party data. Knowledge of machine learning techniques and Google Cloud products will be advantageous for this role.

Colgate-Palmolive is committed to fostering an inclusive environment where diversity is valued and every individual is treated with respect. As an Equal Opportunity Employer, we encourage applications from candidates with diverse backgrounds and perspectives. If you require accommodation during the application process due to a disability, please complete the request form provided. Join us in building a brighter, healthier future for all.
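As a toy illustration of the price-elasticity work mentioned above (and not Colgate-Palmolive's actual models), a log-log linear regression is a standard starting point: the slope of ln(quantity) on ln(price) estimates the own-price elasticity. The data below is fabricated.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Fabricated weekly observations: price (INR) and units sold.
prices = np.array([90, 95, 100, 105, 110, 120], dtype=float)
units = np.array([1500, 1380, 1260, 1150, 1030, 900], dtype=float)

# In ln(Q) = a + b * ln(P), the slope b is the own-price elasticity.
X = np.log(prices).reshape(-1, 1)
y = np.log(units)

model = LinearRegression().fit(X, y)
print(f"estimated elasticity: {model.coef_[0]:.2f}")  # roughly -1.8 here
```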
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Pune, Maharashtra
On-site
Our world is currently undergoing transformation, and PTC is at the forefront of this evolution. Our software serves as a bridge between the physical and digital realms, empowering companies to improve their operations, develop superior products, and equip people across all facets of their business.

The driving force behind our success is our talented workforce. We are a diverse global team of nearly 7,000 people. Our goal is to give team members opportunities to expand their horizons, acquire new knowledge, and grow personally. We value the realization of their ideas and embrace the unique qualities that define us, enabling us to achieve our collective objectives.

Life at PTC is about more than using cutting-edge technologies to transform the physical world. It means embracing your true self and collaborating with some of the industry's foremost experts to effect positive change. If you share our passion for solving challenges through innovation, you are likely to find the PTC experience as rewarding as we do. Are you prepared to embark on your next career endeavor with us?

We hold individual privacy rights in high regard and are dedicated to handling Personal Information ethically and in compliance with all relevant privacy and data protection regulations. Please refer to our Privacy Policy for further details.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a highly skilled Salesforce Developer with over 3 years of experience and comprehensive end-to-end business process knowledge, working on enhancement and support projects. Your key responsibilities include managing the data migration process, developing best practices and protocols, evaluating different source systems, coordinating with clients to understand their data needs, establishing testing procedures, providing technical support for the data migration process, and documenting the migration process for future projects.

To be successful in this role, you must have a minimum of 2 years of experience in data migration, expertise in Snowflake, knowledge of ETL processes and data deduplication, proficiency in SQL, XML, and JSON, experience with REST and SOAP APIs, strong problem-solving skills, attention to detail, and excellent communication and coordination skills. Knowledge of sales processes such as quoting and opportunity management in Salesforce is an added advantage.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
As an Associate in the Data Transfer, Integration & Quality II role at BNY, you will be a key part of the Wealth Management Data Governance team. Based in Pune, MH (hybrid), you will play a crucial role in the transformation of data within Wealth Management, ensuring information is accessible and actionable for our business partners. Your responsibilities will include learning industry best practices for data management and data quality, refining your data wrangling skills, and providing data-driven insights to our front-line business partners.

Your main focus will be implementing a collaborative data platform to streamline the movement, transformation, analysis, and communication of information. This will involve building relationships with key stakeholders, understanding the data needs of internal clients, and collaborating with IT to deliver data solutions. You will use tools such as Collibra, CDQ, and Dataiku to perform your duties effectively, including connecting to databases, using SQL functions, and creating reports on request.

To excel in this role, we are looking for candidates with a BTech/BE/BS degree, preferably in statistics, math, or engineering. You should have at least 3 years of experience in data quality and data management, along with 2+ years of experience with Collibra and CDQ. Strong interpersonal skills, SQL proficiency, knowledge of Snowflake, and a passion for helping others and learning new skills are essential. Experience with Dataiku, the financial industry, Excel, and Agile methodologies would be advantageous.

Joining BNY means becoming part of a culture recognized with numerous awards, including being named one of America's Most Innovative Companies and one of the World's Most Admired Companies by Fortune. We are committed to diversity and inclusivity, as demonstrated by our high scores in the Corporate Equality Index and Best Places to Work for Disability Inclusion, and we have been recognized for our sustainability efforts and gender equality initiatives. BNY provides equal employment opportunities and affirmative action, with a focus on supporting underrepresented groups, females, individuals with disabilities, and protected veterans.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
You will be responsible for leading and managing the delivery of projects and for achieving project and team goals. Your tasks will include building and supporting data ingestion and processing pipelines, designing and maintaining machine learning infrastructure, and leading client engagement on technical projects. You will define project scopes, track progress, and allocate work to the team. It will be essential to stay updated on big data technologies and conduct pilots to design scalable data architecture. Collaborating with software engineering teams to drive multi-functional projects to completion will also be a key aspect of your role.

To excel in this position, you should have a minimum of 6 years of experience in data engineering, with at least 2 years in a leadership role, and experience working with global teams and remote clients. Hands-on experience building data pipelines across various infrastructures, knowledge of statistical and machine learning techniques, and the ability to integrate machine learning into data pipelines are essential. Proficiency in advanced SQL, data warehousing concepts, and data mart design is necessary, as is strong familiarity with modern data platform components like Spark and Python. Experience with data warehouses (e.g., Google BigQuery, Redshift, Snowflake) and data lakes (e.g., GCS, AWS S3) is expected, along with experience setting up and maintaining data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow, and working with relational SQL and NoSQL databases. Excellent problem-solving and communication skills are essential for this role.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Maharashtra
On-site
The Staff Engineer - Data role in SonyLIV's Digital Business leads the data engineering strategy: architecting scalable data infrastructure, driving innovation in data processing, ensuring operational excellence, and building a high-performance team to enable data-driven insights for OTT content and user engagement. The position is based in Mumbai and requires a minimum of 8 years of experience.

Responsibilities include defining the technical vision for scalable data infrastructure using modern technologies such as Spark, Kafka, Snowflake, and cloud services; leading innovation in data processing and architecture through real-time data processing and streaming analytics; ensuring operational excellence in data systems by setting and enforcing standards for data reliability and privacy; building and mentoring a high-caliber data engineering team; collaborating with cross-functional teams; and driving data quality and business insights through automated quality frameworks and BI dashboards.

The successful candidate should have 8+ years of experience in data engineering, business intelligence, and data warehousing, with expertise in high-volume, real-time data environments. They should have a proven track record of building and managing large data engineering teams, experience designing and implementing scalable data architectures, proficiency in SQL, experience with object-oriented programming languages, and knowledge of A/B testing methodologies and statistical analysis. Preferred qualifications include a degree in a related technical field, experience managing the end-to-end data engineering lifecycle, experience with large-scale infrastructure, familiarity with automated data lineage and auditing tools, and expertise with BI and visualization tools and advanced processing frameworks.

Joining SonyLIV offers the opportunity to drive the future of data-driven entertainment: collaborating with industry professionals, working with comprehensive data sets, leveraging cutting-edge technology, and making a tangible impact on product delivery and user engagement. The ideal candidate will bring a strong foundation in data infrastructure, experience leading and scaling data teams, and a focus on operational excellence to enhance efficiency.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
NTT DATA is looking for an Informatica Admin to join our team in Bangalore, Karnataka, India. In this role, you will design and implement scalable ETL solutions using Informatica PowerCenter/IICS/IDMC for structured and semi-structured data. You will define, implement, and monitor data quality rules and scorecards using Informatica Data Quality (IDQ), and work on data governance, implementing data policies, lineage, and metadata management using Axon and Enterprise Data Catalog (EDC) or a cloud metadata catalog/data governance tool.

You will integrate on-premises and cloud-based applications through Informatica Application Integration and Cloud Services (IICS), and design and consume REST/SOAP APIs within Informatica Cloud for real-time data exchange. Additionally, you will translate business rules into technical specifications, implement profiling, manage rule-based transformation logic, and optimize ETL workflows and data mappings for performance and scalability.

Key Responsibilities:
- ETL development using Informatica PowerCenter/IICS/IDMC
- Data quality implementation using IDQ
- Data governance with Axon and EDC/MCC
- Application integration with IICS
- API and web services design and consumption
- Rule specifications and occurrence handling
- Performance tuning for ETL workflows

Technical Skills:
- Informatica tools: PowerCenter, IICS/IDMC, IDQ, Axon/Cloud, EDC/MCC
- Integration: REST/SOAP API development, Informatica Cloud Application Integration, JSON/XML transformations
- Database and data warehousing: strong knowledge of SQL and PL/SQL; experience with Oracle, SQL Server, Snowflake, or similar DW platforms
- Data governance: understanding of data stewardship, lineage, and metadata standards; exposure to frameworks like DAMA DMBOK is a plus
- Other tools/technologies: Git, Jira, ServiceNow, Unix/Linux scripting
- Cloud platforms: AWS, Azure, or GCP preferred

NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100 and committed to helping clients innovate, optimize, and transform for long-term success. With diverse experts in more than 50 countries and a robust partner ecosystem, NTT DATA provides business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure and is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently into the digital future. Visit us at us.nttdata.com.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
The ideal candidate for this role will have at least 4 years of experience as an ETL/Informatica developer, a minimum of 1 year of experience working with Snowflake, and 1 year of experience with IICS. It is essential that you have hands-on experience developing specifications, test scripts, and code coverage for all integrations, and that you can support the migration of integration code from lower environments to higher ones, such as production.

In this role, you will be responsible for full and incremental ETL using Informatica PowerCenter, drawing on your expertise in developing ETL/Informatica for data warehouse integration from various data sources. You should also have experience supporting integration configurations with iPaaS through connected apps or web services, and be able to work within an Agile framework. The successful candidate should be willing to be on call during selected off-shift hours.

If you meet the requirements and are interested in this onsite position located in Hyderabad, please share your resume with bhavana@ketsoftware.com or call 91828 22519.
Posted 1 week ago
5.0 - 8.0 years
6 - 10 Lacs
Pune
Hybrid
Mandatory Skills: Cloud PaaS - GCP (Google Cloud Platform)
Position: Cloud Data Engineer
Work Location: Wipro, PAN India
Work Arrangement: Hybrid, 3 days per week in a Wipro office
Experience Required: 5-8 years (additional experience band: 8-13 years)

Job Description:
- Strong SQL
- Strong Python
- Excellent command of at least one cloud technology (AWS, Azure, GCP, etc.); GCP preferred
- PySpark preferred
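A minimal sketch of the SQL + Python + GCP combination this JD asks for: running a BigQuery aggregate from Python. The project ID is a placeholder; the query uses a real BigQuery public dataset and assumes google-cloud-bigquery is installed with application-default credentials configured.

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # placeholder project ID

# Aggregate a public dataset server-side; only the top rows come back.
query = """
    SELECT start_station_id, COUNT(*) AS trips
    FROM `bigquery-public-data.austin_bikeshare.bikeshare_trips`
    GROUP BY start_station_id
    ORDER BY trips DESC
    LIMIT 5
"""
for row in client.query(query).result():
    print(row.start_station_id, row.trips)
```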
Posted 1 week ago