Jobs
Interviews

2769 Snowflake Jobs - Page 33

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the employer's job portal.

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

Propel operational success with your expertise in technology support and a commitment to continuous improvement. As a Technology Support II team member within JPMorgan Chase, you will play a vital role in ensuring the operational stability, availability, and performance of our production application flows. You will be responsible for troubleshooting, maintaining, identifying, escalating, and resolving production service interruptions for all internally and externally developed systems, supporting a seamless user experience and fostering a culture of continuous improvement.

**Job Responsibilities:**
- Analyze and troubleshoot production application flows to ensure end-to-end application or infrastructure service delivery supporting the business operations of the firm.
- Improve operational stability and availability through participation in problem management.
- Monitor production environments for anomalies and address issues using standard observability tools.
- Assist in the escalation and communication of issues and solutions to business and technology stakeholders.
- Identify trends and assist in the management of incidents, problems, and changes in support of full-stack technology systems, applications, or infrastructure.
- Provide on-call coverage during weekends as required.

**Required qualifications, capabilities, and skills:**
- 2+ years of experience, ideally working with Data/Python applications in a production environment.
- Experience in a programming or scripting language (Python).
- Experience working with containers and container orchestration (Kubernetes).
- Experience working with orchestration tools (Control-M).
- Experience with cloud platforms (AWS), ideally provisioning infrastructure using Terraform.
- Exposure to observability and monitoring tools and techniques.
- Good communication and collaboration skills, with the ability to work effectively in a fast-paced, dynamic environment.

**Preferred qualifications, capabilities, and skills:**
- Experience supporting applications on platforms such as Databricks, Snowflake, or AWS EMR (Databricks preferred) is a significant advantage.
- Actively self-educates, evaluates new technologies, and recommends suitable ones.
- Knowledge of virtualization, cloud architecture, services, and automated deployments.
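Purely as an illustration of the kind of monitoring automation this role describes (not part of the posting), here is a minimal sketch using the official Kubernetes Python client to flag pods that are not in the Running phase; the namespace name is a hypothetical placeholder:

```python
# Minimal health-check sketch using the Kubernetes Python client.
# The namespace "payments-prod" is a hypothetical example.
from kubernetes import client, config

def report_unhealthy_pods(namespace: str) -> list[str]:
    """Return names of pods that are not currently in the Running phase."""
    config.load_kube_config()  # use config.load_incluster_config() inside a cluster
    v1 = client.CoreV1Api()
    unhealthy = []
    for pod in v1.list_namespaced_pod(namespace).items:
        if pod.status.phase != "Running":
            unhealthy.append(f"{pod.metadata.name}: {pod.status.phase}")
    return unhealthy

if __name__ == "__main__":
    for line in report_unhealthy_pods("payments-prod"):
        print(line)
```

In practice a check like this would feed an alerting channel rather than print to stdout, but it shows the shape of the observability scripting the posting calls for.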

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Join us as a Data Engineer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions.

To be successful as a Data Engineer, you should have hands-on experience in PySpark and a strong knowledge of DataFrames, RDDs, and SparkSQL. You should also have hands-on experience in developing, testing, and maintaining applications on AWS Cloud. A strong hold on the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena) is essential. Additionally, you should be able to design and implement scalable and efficient data transformation/storage solutions using Snowflake. Experience in data ingestion to Snowflake for storage formats such as Parquet, Iceberg, JSON, and CSV is required, as is familiarity with using DBT (Data Build Tool) with Snowflake for ELT pipeline development. Advanced SQL and PL/SQL programming skills are a must. Experience in building reusable components using Snowflake and AWS tools/technology is highly valued. Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage. Knowledge of orchestration tools such as Apache Airflow or Snowflake Tasks is beneficial, and familiarity with the Ab Initio ETL tool is a plus.

Some other highly valued skills may include the ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components; a good understanding of infrastructure setup and the ability to provide solutions either individually or working with teams; knowledge of Data Marts and Data Warehousing concepts, along with good analytical and interpersonal skills; and experience implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. The role is based out of Chennai.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Meet the needs of stakeholders/customers through specialist advice and support.
- Perform prescribed activities in a timely manner and to a high standard which will impact both the role itself and surrounding roles.
- Likely to have responsibility for specific processes within a team.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
- Demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
- Manage own workload, take responsibility for the implementation of systems and processes within own work area, and participate in projects broader than the direct team.
- Execute work requirements as identified in processes and procedures, collaborating with and impacting on the work of closely related teams.
- Provide specialist advice and support pertaining to own work area.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams.
- Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative/operational expertise.
- Make judgements based on practice and previous experience.
- Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures.
- Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements.
- Build relationships with stakeholders/customers to identify and address their needs.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, which serve as our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
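As a rough sketch of the PySpark-plus-Parquet work this role describes (illustrative only, not part of the posting), the following reads raw CSV, aggregates with SparkSQL, and writes partitioned Parquet of the kind Snowflake can ingest; the bucket paths and column names are hypothetical:

```python
# Minimal PySpark ETL sketch: CSV in, SparkSQL aggregation, Parquet out.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("trades_daily_etl").getOrCreate()

raw = spark.read.csv("s3://example-bucket/raw/trades/", header=True, inferSchema=True)
raw.createOrReplaceTempView("trades")

# Aggregate notional per counterparty per day with SparkSQL.
daily = spark.sql("""
    SELECT trade_date, counterparty, SUM(notional) AS total_notional
    FROM trades
    GROUP BY trade_date, counterparty
""")

# Write partitioned Parquet, a format Snowflake ingests directly via COPY.
daily.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/trades_daily/"
)
```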

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

This Coimbatore-based role requires 5 to 8 years of experience. As a developer, you will be responsible for designing, implementing, coding, testing, and documenting new programs, as well as debugging, troubleshooting, and maintaining source code for various programs. It is essential to work closely with business analysts, architects, and stakeholders to gather and analyze requirements, and to collaborate with designers and content producers. You will develop specifications, prototypes, or initial user guides and create visual models or diagrams of current and proposed workflows. Additionally, you will assist in developing processes and procedures to streamline work and increase efficiency, integrate with enterprise products/systems, and perform code reviews while mentoring junior developers. Keeping up to date on Salesforce platform enhancements, new releases, and best practices is crucial.

You must have a minimum of 4 years of hands-on experience in a programming language such as Java or PHP, along with strong working experience in structured database systems such as MySQL and MSSQL, including solid SQL skills. Experience in understanding and analyzing requirements is essential, as are good written and oral communication skills. A good working knowledge of front-end languages like JavaScript, HTML, and CSS is beneficial, and sound knowledge of REST APIs is required. Experience with tools such as VS Code, Bitbucket, and JIRA is preferred. Experience with cloud data platforms like Snowflake would be an added advantage.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer at Intelex Technologies, you will have the opportunity to bring passion, craftsmanship, and innovation to the development and delivery of Intelex's software products. Your role involves empowering customers with industry-leading capabilities and user experiences by working within Agile practices and a DevOps culture. You will contribute to building the next-generation platform using modern patterns such as microservices and micro frontends.

Your responsibilities will include collaborating with Product Managers, UX designers, and Engineers to translate business requirements into actionable technical work. You will design and implement new features and solutions to support key customer use cases in both existing and new software products, and participate in agile sprint planning, proactively removing blockers and delivering on sprint commitments. You will build scalable web applications using technologies like ReactJS, Webpack, .NET, SQL Server, and AWS, applying a "security-first" mindset across development activities and CI/CD pipelines. You will collaborate cross-functionally with Architecture, QA, Product Management, and other stakeholders to deliver best-in-class user experiences, and mentor junior engineers, sharing knowledge and encouraging continuous learning.

Your technical skills should include 5+ years of experience in full-stack or front-end software development, a solid understanding of .NET Framework/.NET Core and relational database design, and experience with front-end frameworks like React. Proficiency in JavaScript/TypeScript, CSS, and bundlers, as well as familiarity with RESTful APIs, GraphQL, and state management libraries, will be necessary. Experience in cloud environments, particularly AWS, is preferred.

Your qualifications should include a Bachelor's Degree in Computer Science, Engineering, or a related field, with a Master's Degree or relevant certifications considered an asset. Prior experience in a software product or SaaS organization is preferred. This role also requires a satisfactory Criminal Background Check and Public Safety Verification.

Join us at Intelex Technologies, a global leader in environmental, health, safety, and quality management software, and be part of a dynamic, inclusive culture that values growth, innovation, and collaboration. Visit intelex.com/careers to explore opportunities and become a #ProudIntelexian.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Software Engineer, Technical Lead at Zywave, you will play a crucial role in developing cutting-edge SaaS applications that disrupt and innovate our market space. Zywave is dedicated to continuous improvement and growth, and we are looking for individuals who can lead our Product Development team in supporting our major company initiatives.

Your responsibilities will include taking ownership of development efforts, mentoring junior engineers, and collaborating with team members to develop, test, troubleshoot, and maintain Zywave's web-based applications. You will contribute to all aspects of the product development lifecycle, ensuring that our multi-tenant SaaS application remains best in class. To excel in this role, you should be able to develop rich client web applications using the latest .NET technologies, lead team members in sprint cycle planning and software development practices, and implement unit tests and build scripts for bug-free releases. Additionally, you should have a strong technical background in .NET development technologies, Microsoft SQL Server, Snowflake, and ELT processes.

To be considered a strong fit for this position, you should possess a Bachelor's degree in information systems technology or computer science, along with at least 5 years of relevant experience. You should also demonstrate out-of-the-box thinking, problem-solving skills, and excellent communication abilities. Familiarity with Agile methodologies, web service programming, and Internet design methodologies is highly desirable.

At Zywave, you will have the opportunity to work in a dynamic environment where you can learn, grow, and contribute to making Zywave the best in the business. If you are looking to be part of a team that values innovation, collaboration, and continuous improvement, Zywave is the place for you.

Join Zywave, a leader in the insurtech industry, and be part of a company that powers the modern insurance lifecycle. With over 15,000 insurers, agencies, and brokerages worldwide using Zywave solutions, you will be part of a team that accelerates digitalization, distribution, and profitability in the insurance sector. Learn more about Zywave at www.zywave.com.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

The position requires an experienced Snowflake Developer with a demonstrated track record in designing, implementing, and optimizing data solutions utilizing Snowflake's cloud data platform. The ideal candidate should possess extensive expertise in data loading processes from AWS S3 to Snowflake and should be well-versed in AWS services and DevOps practices.

**Required Qualifications:**
- Minimum 4 years of professional experience as a Snowflake Developer or in a similar data engineering role
- Strong proficiency in Snowflake's architecture, features, and best practices
- Demonstrated experience in loading data from AWS S3 to Snowflake using methods such as COPY and Snowpipe
- Proficiency in writing optimized SQL queries for Snowflake
- Experience with AWS services (S3, Lambda, IAM, etc.)
- Knowledge of CI/CD pipelines and AWS CloudFormation
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience)

**Preferred Qualifications:**
- Snowflake certification
- Experience with Snowflake DevOps practices
- Familiarity with version control tools such as Git and GitHub
- Proficiency in Python or another programming language for scripting and automation
- Understanding of data governance and security principles

**Key Responsibilities:**
- Design and implement efficient data loading processes from S3 to Snowflake
- Create and maintain Snowflake objects including warehouses, databases, schemas, tables, views, and stored procedures
- Collaborate with data engineers, analysts, and business stakeholders
- Assist in establishing and maintaining CI/CD pipelines for Snowflake deployments
- Document processes, configurations, and implementations
- Support Snowflake maintenance activities such as user management and resource monitoring
- Troubleshoot and resolve data loading and processing issues

**Skills Required:**
- Advanced SQL knowledge
- Proficiency in AWS services (S3, Lambda, IAM, CloudFormation)
- Understanding of Snowflake architecture and features
- Experience in data integration and ETL processes
- Familiarity with CI/CD and DevOps practices
- Strong problem-solving and analytical thinking skills
- Effective communication and collaboration abilities

About UST: UST is a global digital transformation solutions provider partnering with clients worldwide to drive real impact through transformation. With over 30,000 employees in 30 countries, UST empowers organizations with deep domain expertise, innovation, and agility. UST's philosophy revolves around embedding innovation and agility into client organizations, ensuring boundless impact and touching billions of lives.
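As an illustration of the S3-to-Snowflake loading this role centers on (not part of the posting), here is a minimal COPY-based load run through the snowflake-connector-python package; the connection parameters, stage, and table names are hypothetical placeholders:

```python
# Minimal S3-to-Snowflake load sketch via COPY INTO.
# All names and credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="loader_user",     # placeholder
    password="***",         # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # @raw_stage is assumed to be an external stage pointing at the S3 bucket;
    # COPY INTO pulls the staged Parquet files into the target table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # per-file load results returned by COPY
finally:
    conn.close()
```

For continuous rather than batch ingestion, the same stage would typically back a Snowpipe definition instead of an explicit COPY statement.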

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

About the Role: We are looking for a motivated and detail-oriented Junior Data Engineer with 1 to 2 years of hands-on experience in either Snowflake or Databricks. As a Junior Data Engineer, you will collaborate closely with our data engineering and analytics teams to assist in the design, development, and maintenance of robust data pipelines and scalable data solutions.

Key Responsibilities:
- Design, build, and maintain ETL/ELT data pipelines using Snowflake or Databricks.
- Collaborate with data analysts and stakeholders to understand data requirements.
- Optimize data workflows to ensure data quality, performance, and security.
- Write efficient SQL queries for data transformation and cleansing.
- Monitor and troubleshoot data issues in pipelines and storage systems.
- Document solutions, data models, and processes for future reference.

Required Skills and Qualifications:
- 1-2 years of experience in data engineering or data platform development.
- Hands-on experience with either Snowflake (data warehousing, SnowSQL, Snowpipe, etc.) or Databricks (PySpark/Scala/SQL notebooks, Delta Lake, MLflow, etc.).
- Proficiency in SQL and data modeling techniques.
- Working knowledge of cloud platforms such as AWS, Azure, or GCP.
- Experience with version control tools like Git.
- Strong understanding of data warehousing concepts and performance tuning.

Good to Have:
- Familiarity with orchestration tools like Airflow or dbt.
- Exposure to CI/CD pipelines in data environments.
- Basic understanding of data privacy, compliance, and security principles.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance, and delivery risk. Your work will involve the continuous improvement and optimization of managed services processes, tools, and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership, and consistently deliver quality work that drives value for our clients and success as a team.

As a skilled AWS and Snowflake Production Support Specialist, you will join our dynamic IT team. In this role, you will be responsible for ensuring the availability, performance, and security of our AWS cloud infrastructure and Snowflake data platform. You will play a critical role in monitoring, troubleshooting, and resolving incidents to minimize downtime and support our business operations.

Responsibilities:
- Monitor and manage AWS cloud resources such as EC2 instances, S3 buckets, RDS databases, and Lambda functions.
- Configure and optimize AWS services for scalability, reliability, and cost-efficiency.
- Implement infrastructure as code (IaC) using tools like CloudFormation or Terraform.
- Monitor and maintain Snowflake data warehouses and data pipelines.
- Perform performance tuning and optimization of Snowflake queries and data loading processes.
- Respond to alerts and incidents related to AWS and Snowflake environments.
- Diagnose and troubleshoot issues, collaborating with internal teams and vendors as needed.
- Design and implement backup strategies for AWS resources and Snowflake data.
- Test and maintain disaster recovery plans to ensure business continuity.
- Implement and enforce AWS and Snowflake security best practices.
- Coordinate and execute changes to AWS and Snowflake configurations following change management processes.
- Collaborate effectively with development teams, infrastructure teams, and business stakeholders to support application deployments and releases.
- Communicate effectively with internal teams and external vendors, including AWS and Snowflake support.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Proven experience in AWS cloud infrastructure management and Snowflake data platform administration.
- Strong knowledge of AWS services (EC2, S3, RDS, Lambda, etc.) and Snowflake features (warehouses, data loading, security).
- Experience with infrastructure as code and automation tools.
- Familiarity with data warehousing concepts, ETL processes, and SQL querying.
- Experience in incident management, troubleshooting, and root cause analysis.
- Solid understanding of cloud security best practices and compliance requirements.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
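As a small illustration of the alerting setup such a support role maintains (not part of the posting), here is a boto3 sketch that creates a CloudWatch CPU alarm on an EC2 instance; the instance ID, alarm name, and threshold are hypothetical placeholders:

```python
# Sketch: create a CloudWatch alarm for sustained high CPU on an EC2 host.
# Instance ID, alarm name, region, and threshold are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

cloudwatch.put_metric_alarm(
    AlarmName="prod-app-high-cpu",  # placeholder
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,               # 5-minute datapoints
    EvaluationPeriods=3,      # alarm after 15 minutes above threshold
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmDescription="Sustained high CPU on the production app host",
)
```

In a real setup this definition would live in CloudFormation or Terraform rather than an ad hoc script, consistent with the IaC responsibility listed above.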

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a member of the Security Solutions, Platform and Analytics (SPA) team at Snowflake, you will play a crucial role in developing custom solutions to enhance the security of Snowflake's Data Cloud. Your expertise in SQL, Python, and the security domain will be instrumental in analyzing security logs and event data and translating security requirements into effective technical solutions. You will have the opportunity to use advanced analytics techniques to identify patterns, anomalies, and trends in security data.

Your responsibilities will include developing and optimizing data pipelines, data models, and visualization dashboards for security analytics. You will collaborate with various security teams to design and develop scalable automated solutions, taking ownership of database management tasks such as data modeling and performance optimization. Using tools like DBT, you will streamline data transformations to ensure data quality, and you will propose innovative approaches to enhance security posture in line with organizational goals.

To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Security, or a related field, along with 5-8 years of experience in data analytics with strong SQL and Python skills. Experience with data visualization, DBT, and data pipeline development is essential, and hands-on experience with Snowflake and Cortex functions would be a plus. A strong understanding of databases, data modeling, and data warehousing is required, as is security domain knowledge including SIEM systems and threat intelligence platforms. Your proven ability to analyze complex security events and communicate findings effectively will be critical.

Joining the team at Snowflake will give you the opportunity to work with cutting-edge technology and contribute to the security of a rapidly growing data platform. The team values innovation, continuous learning, and the chance to make a significant impact on enterprise-scale security solutions. Snowflake is growing rapidly and is looking for individuals who share its values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. If you are interested in making a meaningful impact on enterprise-scale security solutions, consider joining Snowflake and being part of a team that actively uses its own products to strengthen internal security practices.

For more information on job opportunities, including salary and benefits information for positions in the United States, please visit the Snowflake Careers Site at careers.snowflake.com.
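To give a flavor of the security-log analytics described above (illustrative only), here is a minimal Python sketch that counts recent failed Snowflake logins per user from the ACCOUNT_USAGE.LOGIN_HISTORY view; the connection parameters and the anomaly threshold are hypothetical:

```python
# Sketch: flag users with repeated failed logins in the last 24 hours,
# using Snowflake's ACCOUNT_USAGE.LOGIN_HISTORY view. Credentials and the
# threshold of 5 attempts are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="sec_analyst", password="***",
    warehouse="SECURITY_WH",
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT user_name, COUNT(*) AS failed_attempts
        FROM snowflake.account_usage.login_history
        WHERE is_success = 'NO'
          AND event_timestamp >= DATEADD('day', -1, CURRENT_TIMESTAMP())
        GROUP BY user_name
        HAVING COUNT(*) >= 5          -- simple anomaly threshold
        ORDER BY failed_attempts DESC
    """)
    for user, attempts in cur.fetchall():
        print(f"{user}: {attempts} failed logins in 24h")
finally:
    conn.close()
```

A production version of this check would more likely live in a DBT model feeding a dashboard or SIEM alert, as the responsibilities above suggest.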

Posted 2 weeks ago

Apply

5.0 - 12.0 years

0 Lacs

Delhi

On-site

You will support the Analytics Solutions team in ramping up the F&A analytics and reporting practice using the Dataiku platform. Your primary responsibilities will include partnering with internal stakeholders and clients to identify, analyze, and deliver analytics and automation solutions using Dataiku. You will be expected to translate business requirements into technical solutions and manage the end-to-end delivery of Dataiku-based projects. Additionally, you will need to communicate the technical infrastructure requirements for deploying automation solutions and convert solutions into tools and products. A key aspect of your role will be to lead and mentor a team of junior resources, enabling skill development in Dataiku, data engineering, and machine learning workflows. It is essential that you can identify F&A automation opportunities in the client environment and perform end-to-end automation operations.

As a Senior Dataiku Developer with over 6 years of experience, you should have a proven track record in building dynamic workflows, models, and pipelines. You must have experience developing custom formulas, applications, and plugins within the Dataiku DSS environment and integrating with Snowflake. A good understanding of SQL is required, along with experience integrating Dataiku with enterprise systems such as SAP, Oracle, or cloud data platforms. A balance of analytical problem-solving and strong interpersonal and relationship-development skills is a must.

Your technical skills should include hands-on experience in Dataiku, Alteryx, SQL, Power BI, and Snowflake. You should be proficient in creating data pipelines for data ingestion, transformation, and output within the Dataiku platform (a small recipe sketch follows below). An understanding of Python and R scripting within Dataiku is considered a strong plus, and a solid working knowledge of JIRA for agile project and task tracking is expected.

Desired soft skills for this role include excellent presentation, verbal, and written communication skills, as well as strong analytical skills and an aptitude for problem-solving, including data analysis and validation. You should be able to work effectively both independently and as part of a team.

To be considered for this position, you should have 5-12 years of total analytics experience, with at least 6 years specifically in Dataiku. Additionally, 1-2 years of working experience in insurance analytics would be beneficial.
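As referenced above, here is a minimal sketch of a Dataiku DSS Python recipe of the kind this role builds: read an input dataset, apply a pandas transformation, write the result. The dataset names and the amount column are hypothetical, and this assumes the script runs inside DSS where the `dataiku` package is available:

```python
# Sketch of a Dataiku DSS Python recipe. Dataset names and the 100,000
# high-value threshold are hypothetical placeholders.
import dataiku

invoices = dataiku.Dataset("fa_invoices_raw").get_dataframe()

# Simple F&A-style cleanup: drop duplicate invoices and flag high values.
invoices = invoices.drop_duplicates(subset=["invoice_id"])
invoices["high_value"] = invoices["amount"] > 100_000

dataiku.Dataset("fa_invoices_prepared").write_with_schema(invoices)
```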

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Maharashtra

On-site

NTT DATA is looking for a Data & AI Technical Solution Architect to join their team in Pune, Maharashtra, India. As a Data & AI Architect, you will be responsible for delivering multi-technology consulting services to clients, providing strategies and solutions for infrastructure and related technology components. Your role will involve collaborating with stakeholders to develop architectural approaches for solutions and working on strategic projects to ensure the optimal functioning of clients' technology infrastructure.

Key Responsibilities:
- Engage in conversations with CEOs, business owners, and CTOs/CDOs
- Analyze complex business challenges and propose effective solutions focused on client needs
- Develop high-level, innovative solution approaches for complex business problems
- Utilize best practices and creativity to address challenges
- Conduct market research, formulate perspectives, and communicate insights to clients
- Build strong client relationships and ensure client satisfaction
- Contribute to internal effectiveness by improving methodologies, processes, and tools

Minimum Skills Required:
- Academic qualifications: BE/BTech or equivalent in Information Technology and/or Business Management
- Scaled Agile certification is desirable
- Relevant consulting and technical certifications, such as TOGAF
- 12-15 years of experience in a similar role within a large-scale technology services environment
- Proficiency in Data, AI, Gen AI, and Agentic AI
- Experience in data architecture and solutioning, end-to-end data architecture, and GenAI solution design
- Ability to work on Data & AI RFP responses as a Solution Architect
- Experience in solution architecture for Data & Analytics, AI/ML, and Gen AI
- Proficiency in Snowflake, Databricks, Azure, AWS, GCP cloud, and data engineering and AI tools
- Experience in large-scale consulting and program execution engagements in AI and data
- Expertise in multi-technology infrastructure design and client engagement

Additional Career Level Description:
- Seasoned professional with complete knowledge and understanding of the specialization area
- Solves diverse problems using judgment and interpretation
- Enhances relationships with senior partners and suggests variations in approach

About NTT DATA: NTT DATA is a global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. With experts in over 50 countries, NTT DATA offers business and technology consulting, data and artificial intelligence solutions, industry solutions, and application, infrastructure, and connectivity management. They are a leading provider of digital and AI infrastructure and are part of the NTT Group, which invests significantly in R&D to support organizations in their digital transformation journey. Visit us at us.nttdata.com.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced professional in the field, you will be responsible for finalizing the solution architecture, system architecture, and technology architecture for the Databricks migration program/projects. Your role will involve finalizing the conversion methodology, including ETL, data pipelines, and visualization tools, using third-party accelerators and Databricks tools like Lakebridge. Additionally, you will provide technical guidance to a group of data engineers on converting code and database objects to Databricks-compliant formats, data loading, and data reconciliation between the old and new systems.

Your primary skills in Databricks, Lakebridge, AWS or other cloud platforms, and technical team management will be crucial to carrying out the responsibilities of this role. Knowledge of other cloud analytics platforms such as Redshift, BigQuery, and Snowflake is also beneficial, and an advanced Databricks certification will be an added advantage.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Job Description: As an MSBI Developer, you will be responsible for leveraging your expertise in Microsoft Business Intelligence tools to drive data analysis and reporting solutions. With a strong background in Power BI, SQL Server, and Microsoft SSIS/SSRS, you will play a key role in developing and maintaining complex data models and ETL processes. Your responsibilities will include creating and debugging SSIS packages, deploying to the SSIS Catalog, and generating reports using SSRS. Additionally, you are expected to have a good understanding of Star/Snowflake schema modeling and basic knowledge of Oracle PL/SQL. Your proficiency in tools such as Power BI Desktop, the DAX language, Power Query, the M language, and Power BI Report Server will be crucial to your success in this role.

Requirements:
- Bachelor's degree in any field
- Minimum of 5 years of experience in MSBI development
- Advanced T-SQL skills in SQL Server Management Studio (SSMS)
- Familiarity with Microsoft Visual Studio data tools
- Ability to work effectively in a fast-paced environment

Key Skills: Microsoft Power BI, Power BI Desktop, DAX, Power Query, M language, Power BI Report Server, RLS, SSMS, PL/SQL, T-SQL, SSRS, SSIS, Snowflake

This is a full-time, permanent position based in Mumbai, offering a challenging opportunity for a talented MSBI Developer in the IT/Computers - Software industry. Join our team and be part of a dynamic work environment where your skills and expertise will be valued and recognized.

Job Code: GO/JC/040/2025

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You should possess expert-level proficiency in Python and Python frameworks, or Java. Additionally, you must have hands-on experience with AWS development, PySpark, Lambda, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR. Your deep experience should cover key AWS services across Compute (PySpark, Lambda, ECS), Storage (S3), Databases (DynamoDB, Snowflake), Networking (VPC, Route 53, CloudFront, API Gateway), DevOps/CI-CD (CloudFormation, CDK), Security (IAM, KMS, Secrets Manager), and Monitoring (CloudWatch, X-Ray, CloudTrail).

You should also be proficient in databases such as Cassandra and PostgreSQL, with strong hands-on knowledge of using Python for integrations between systems across different data formats. Your expertise should extend to deploying and maintaining applications in AWS, with hands-on experience in Kinesis streams and auto-scaling. Designing and implementing distributed systems and microservices, and applying best practices for scalability, high availability, and fault tolerance, are also key aspects of this role.

You should have strong problem-solving and debugging skills, with the ability to lead technical discussions and mentor junior engineers. Excellent communication skills, both written and verbal, are essential. You should be comfortable working in agile teams with modern development practices, collaborating with business and other teams to understand business requirements and deliver on project commitments. Participation in requirements gathering, designing solutions based on available frameworks and code, and experience with data engineering tools or ML platforms (e.g., Pandas, Airflow, SageMaker) are expected. An AWS certification (AWS Certified Solutions Architect or Developer) would be advantageous.

This position is based in multiple locations in India, including Indore, Mumbai, Noida, Bangalore, and Chennai. To qualify, you should hold a Bachelor's degree or a foreign equivalent from an accredited institution; alternatively, three years of progressive experience in the specialty can be considered in lieu of each year of education. A minimum of 8+ years of Information Technology experience is required for this role.
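As a small illustration of the Lambda-plus-Kinesis work listed above (not part of the posting), here is a minimal Lambda handler that decodes Kinesis records; the event shape follows the standard Kinesis-to-Lambda integration, and the `priority` field is a hypothetical element of the producer's message schema:

```python
# Sketch of an AWS Lambda handler consuming a Kinesis stream.
# Kinesis record payloads arrive base64-encoded; 'priority' is hypothetical.
import base64
import json

def handler(event, context):
    """Decode each Kinesis record and route high-priority events."""
    processed = 0
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        message = json.loads(payload)
        if message.get("priority") == "high":
            print(f"high-priority event: {message}")
        processed += 1
    return {"processed": processed}
```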

Posted 2 weeks ago

Apply

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in data engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.

Your responsibilities will include:
- Developing and maintaining data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes across various data sources.
- Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2) using DBT (sketched below), and designing high-performance data architectures.
- Collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions.
- Performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards.
- Working closely with cross-functional teams to integrate data solutions and creating clear documentation for data processes and models.

Your qualifications should include:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Proficiency in using Power BI for data visualization and business intelligence reporting.
- Familiarity with Sigma Computing, Tableau, Oracle, DBT, and cloud services like Azure, AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (knowledge of other languages like Java or Scala is a plus).

The education required for this role is a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. This position will be based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.
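The posting names DBT (typically its snapshot feature) for the SCD Type-2 modeling above. Purely to illustrate the underlying logic, here is a plain-SQL version run from Python: close out changed current rows, then insert new versions. Table and column names, including the `attributes_hash` change-detection column, are hypothetical:

```python
# Sketch of the SCD Type-2 pattern as two plain Snowflake SQL statements.
# All names are hypothetical; DBT snapshots would generate equivalent logic.
import snowflake.connector

SCD2_CLOSE = """
UPDATE dim_customer d
SET valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
WHERE d.is_current
  AND EXISTS (
      SELECT 1 FROM stg_customer s
      WHERE s.customer_id = d.customer_id
        AND s.attributes_hash <> d.attributes_hash  -- attributes changed
  );
"""

SCD2_INSERT = """
INSERT INTO dim_customer (customer_id, name, segment, attributes_hash,
                          valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, s.attributes_hash,
       CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;          -- no current row: new or just closed
"""

conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                   password="***", warehouse="ETL_WH",
                                   database="ANALYTICS", schema="MART")
try:
    cur = conn.cursor()
    for stmt in (SCD2_CLOSE, SCD2_INSERT):
        cur.execute(stmt)
finally:
    conn.close()
```

Running the close-out first means any customer whose attributes changed has no current row left, so the insert step naturally creates the new version alongside genuinely new customers.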

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Marketing Reporting Specialist at Monotype, you will play a pivotal role in visualizing the effectiveness of marketing initiatives through comprehensive reporting. Leveraging your experience with complex marketing data, you'll uncover valuable insights for key stakeholders, driving informed decisions. Your exceptional analytical skills and strategic mindset will empower you to identify and illustrate trends that reveal consumer behavior patterns. In this position, you will be integral to Monotype's marketing operations, collaborating closely with senior leaders to ensure data-driven decision-making.

Responsibilities:
- Leverage marketing data sources to ensure an accurate, comprehensive view of all metrics, behavior data, and KPIs.
- Develop, build, and maintain marketing reports and dashboards using visualization tools and platforms to clearly present key metrics and trends.
- Garner actionable insights from complex datasets by identifying anomalies, patterns, and correlations that present opportunities for optimization and growth.
- Partner with multiple teams to interpret and translate their needs into compelling reports and presentations that communicate complex data insights in a clear and impactful way.
- Champion, create, and lead initiatives and methodologies to inform, optimize, and expand marketing reporting and analysis.
- Continuously learn and explore new marketing technologies and data analysis tools to stay ahead of the curve.

What we're looking for:
- 5+ years of experience with BI, analytics tools, SQL, and insight delivery.
- Proficiency in SQL and data warehousing (AWS & Snowflake).
- Experience with marketing analytics platforms (e.g., Adobe Analytics, CJA, Marketo, Salesforce).
- Expertise in BI tools like Power BI and Tableau.
- Experience with data migration and connection.
- Strong analytical thinking, problem-solving, and attention to detail.
- Excellent communication and presentation skills to engage diverse audiences.
- Process-driven mindset with a keen eye for data accuracy and consistency.
- Additional knowledge of Python and familiarity with general marketing tech stacks is a plus.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As an Informatica IDMC Developer at Coforge, your primary responsibility will be to design, develop, and maintain robust ETL pipelines using Informatica Intelligent Data Management Cloud (IDMC/IICS). You will collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements. Your role will involve integrating data from various sources, including databases, APIs, and flat files, and optimizing data workflows for performance, scalability, and reliability. Monitoring and troubleshooting ETL jobs to address data quality issues will be part of your daily tasks, as will implementing data governance and security best practices and maintaining detailed documentation of data flows, transformations, and architecture. Your contribution to code reviews and continuous improvement initiatives will be valued.

The ideal candidate has strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools. Proficiency in SQL and prior experience with relational databases such as Oracle, SQL Server, and PostgreSQL are essential. Additionally, familiarity with cloud platforms such as AWS, Azure, or GCP and knowledge of data warehousing concepts and tools like Snowflake, Redshift, or BigQuery are required. Excellent problem-solving skills and effective communication abilities are highly desirable.

Preferred qualifications include experience with CI/CD pipelines and version control systems, as well as knowledge of data modeling and metadata management. Certifications in Informatica or cloud platforms will be considered a plus.

If you have 5-8 years of relevant experience and the skill set described above, we encourage you to apply by sending your CV to Gaurav.2.Kumar@coforge.com. This position is based in Greater Noida.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a talented data engineer, you will design, develop, and maintain scalable data pipelines using Snowflake. Your expertise will be crucial in optimizing SQL queries and data models for performance and efficiency. Implementing data security and governance best practices within Snowflake will be a key responsibility, ensuring data integrity and compliance. Collaborating with data scientists and analysts to understand and fulfill their data requirements effectively will be a significant aspect of the job, and your problem-solving skills will be put to the test as you troubleshoot and resolve data-related issues promptly to keep data operations running smoothly.

If you are passionate about applying your data engineering skills in a dynamic environment, this opportunity offers a platform to grow and contribute meaningfully to the organization's data infrastructure. Join us to be part of a team that values innovation, collaboration, and continuous learning.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

At Medtronic, you can embark on a rewarding career dedicated to exploration and innovation, all while contributing to the advancement of healthcare access and equity for all. As a Digital Engineer at our new MiniMed India Hub, you will play a crucial role in leveraging technology to enhance healthcare solutions on a global scale. Specifically, as a PySpark Data Engineer, you will design, develop, and maintain data pipelines using PySpark, collaborating with data scientists, analysts, and stakeholders to ensure the efficient processing and analysis of large datasets while handling complex transformations and aggregations.

This role offers an exciting opportunity to work within Medtronic's Diabetes business. As the Diabetes division prepares for separation to foster future growth and innovation, you will have the chance to operate with increased speed and agility. Working as a separate entity will sharpen the focus on driving meaningful innovation and enhancing the impact on patient care.

Your responsibilities will include designing, developing, and maintaining scalable and efficient ETL pipelines using PySpark; working with structured and unstructured data from various sources; optimizing PySpark applications for performance and scalability; collaborating with data scientists and analysts to understand data requirements; implementing data quality checks; monitoring and troubleshooting data pipeline issues; documenting technical specifications; and staying updated on the latest trends and technologies in big data and distributed computing. A sketch of this kind of PySpark work follows below.

To excel in this role, you should possess a Bachelor's degree in computer science, engineering, or a related field, along with 4-5 years of experience in data engineering focusing on PySpark. Proficiency in Python and Spark, strong coding and debugging skills, knowledge of SQL and relational databases, hands-on experience with cloud platforms, familiarity with data warehousing solutions, experience with big data technologies, problem-solving abilities, and effective communication and collaboration skills are essential. Preferred skills include experience with Databricks, orchestration tools like Apache Airflow, knowledge of machine learning workflows, an understanding of data security and governance best practices, familiarity with streaming data platforms, and knowledge of CI/CD pipelines and version control systems.

Medtronic offers a competitive salary and flexible benefits package, along with a commitment to recognizing and supporting employees at every stage of their career and life. As part of the Medtronic team, you will contribute to the mission of alleviating pain, restoring health, and extending life by tackling the most challenging health problems facing humanity. Join us in engineering solutions that make a real difference in people's lives.
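As referenced above, here is an illustrative PySpark sketch of the transformation-and-aggregation work described: deduplicating device readings with a window function and computing a daily average per device. The paths and column names (including the glucose metric) are hypothetical:

```python
# Sketch: dedupe device readings, then compute a daily average per device.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("device_readings_etl").getOrCreate()

readings = spark.read.parquet("s3://example-bucket/raw/device_readings/")

# Keep the most recently ingested record per (device_id, reading_ts).
latest = Window.partitionBy("device_id", "reading_ts").orderBy(
    F.col("ingested_at").desc()
)
deduped = (readings
           .withColumn("rn", F.row_number().over(latest))
           .filter(F.col("rn") == 1)
           .drop("rn"))

# Daily average reading per device.
daily = (deduped
         .groupBy("device_id", F.to_date("reading_ts").alias("reading_date"))
         .agg(F.avg("glucose_mg_dl").alias("avg_glucose")))

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_glucose/")
```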

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

As the Technical Lead for Data Engineering at Assent, you will collaborate with other team members to identify opportunities and assess the feasibility of solutions. Your role will involve providing technical guidance, influencing decision-making, and aligning data engineering initiatives with business goals. You will drive the technical strategy, team execution, and process improvements needed to build resilient and scalable data systems, while mentoring a growing team and building robust data infrastructure.

Your key responsibilities will include driving the technical execution of data engineering projects, working closely with Architecture members to design and implement scalable data pipelines, and providing technical guidance to ensure best practices in data engineering. You will collaborate cross-functionally with various teams to define and execute data initiatives, plan and prioritize work with the team manager, and stay current with emerging technologies to drive their adoption. A sketch of the kind of pipeline orchestration involved follows below.

To be successful in this role, you should have 10+ years of experience in data engineering or related fields; expertise in cloud data platforms such as AWS; proficiency in modern data technologies like Spark, Airflow, and Snowflake; and a deep understanding of distributed systems and data pipeline design. Strong programming skills in languages like Python, SQL, or Scala, experience with infrastructure as code and DevOps best practices, and the ability to influence technical direction and advocate for best practices are also necessary. Strong communication and leadership skills, a learning mindset, and experience in security, compliance, and governance related to data systems will be advantageous.

At Assent, we value your talent, energy, and passion, and offer various benefits to support your well-being, financial health, and personal growth. Our commitment to diversity, equity, and inclusion ensures that all team members are included, valued, and provided with equal opportunities for success. If you require any assistance or accommodation during the interview process, please contact us at talent@assent.com.
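As referenced above, here is a minimal Airflow DAG sketch of the Spark/Airflow/Snowflake orchestration this role oversees. The task bodies are stubs, the names and schedule are hypothetical, and the `schedule=` keyword assumes Airflow 2.4 or later:

```python
# Sketch of a daily extract-then-load Airflow DAG. Task bodies are stubbed;
# dag_id, task ids, and schedule are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3():
    print("extracting source data to S3...")  # stub

def load_snowflake():
    print("running COPY INTO on Snowflake...")  # stub

with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_snowflake", python_callable=load_snowflake)
    extract >> load  # load runs only after a successful extract
```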

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a data engineer at Infiligence, you will play a crucial role in designing, developing, and maintaining scalable data pipelines for batch and real-time data processing. You will work with cutting-edge technologies on cloud platforms such as Azure or AWS to ensure efficient data flow across systems, and collaborate with cross-functional teams to translate business requirements into robust data models and solutions. Your responsibilities will also include implementing data quality, governance, and security standards throughout the data lifecycle; developing and maintaining documentation; and conducting code reviews, unit testing, and peer reviews to ensure code quality and compliance. Troubleshooting, monitoring, and resolving data pipeline and infrastructure issues will be essential to minimize business impact.

To succeed in this role, you should have a minimum of 5-7 years of experience in data engineering, with a focus on building and managing large-scale data pipelines. Hands-on experience with Azure data services is preferred, but experience with AWS data services is also acceptable. Proficiency in SQL, Python, or Scala for data processing and transformation is required, along with familiarity with data warehousing and real-time databases. A strong understanding of data architecture, ingestion, curation, and consumption patterns is essential, as is knowledge of data quality management, metadata management, data lineage, and data security best practices. Excellent communication skills and the ability to work collaboratively with global teams are highly valued.

Preferred skills include experience with CI/CD processes and source control for data engineering workflows, data observability and self-testing pipelines, and exposure to business intelligence and reporting platforms.

In return, Infiligence offers comprehensive insurance coverage, a competitive salary, opportunities for professional growth, and an inclusive, collaborative work culture. If you are interested in joining our team, please submit your updated CV and a cover letter via the job URL. Shortlisted candidates will undergo an HR and technical assessment and interviews. For any queries regarding the position or application process, please contact our Talent team at Infiligence's US or Chennai offices through careers@infiligence.com.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

At Goldman Sachs, our Engineers don't just make things - we make things possible. We change the world by connecting people and capital with ideas, solving the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low-latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets.

Engineering, which comprises our Technology Division and global strategists groups, is at the critical center of our business. Our dynamic environment requires innovative strategic thinking and immediate, real solutions. If you want to push the limit of digital possibilities, start here. Goldman Sachs Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile, and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

Data plays a critical role in every facet of the Goldman Sachs business. The Data Engineering group is at the core of that offering, providing the platform, processes, and governance that make clean, organized, and impactful data available to scale, streamline, and empower our core businesses. As a Site Reliability Engineer (SRE) on the Data Engineering team, you will be responsible for observability, cost, and capacity, with operational accountability for some of Goldman Sachs's largest data platforms. We engage in the full lifecycle of platforms from design to demise, with an SRE strategy adapted to that lifecycle.

We are looking for individuals with a background as a developer who can express themselves in code, with a focus on reliability, observability, capacity management, DevOps, and the SDLC (software development lifecycle). As a self-leader comfortable with open-ended problem statements, you will structure them into data-driven deliverables, drive strategy with skin in the game, participate in the team's activities, drive postmortems, and hold the attitude that the problem stops with you.

**How You Will Fulfil Your Potential**
- Drive adoption of cloud technology for data processing and warehousing
- Drive SRE strategy for some of GS's largest platforms, including Lakehouse and Data Lake
- Engage with data consumers and producers to match reliability and cost requirements
- Drive strategy with data

**Relevant Technologies**: Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, GitLab

**Basic Qualifications**
- A Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline)
- 1-4+ years of relevant work experience in a team-focused environment
- 1-2 years of hands-on developer experience at some point in your career
- Understanding and experience of DevOps and SRE principles and automation, managing technical and operational risk
- Experience with cloud infrastructure (AWS, Azure, or GCP)
- Proven experience in driving strategy with data
- Deep understanding of the multi-dimensionality of data, data curation, and data quality
- In-depth knowledge of relational and columnar SQL databases, including database design
- Expertise in data warehousing concepts
- Excellent communication skills
- Independent thinker, willing to engage, challenge, or learn
- Ability to stay commercially focused and to always push for quantifiable commercial impact
- Strong work ethic, a sense of ownership and urgency
- Strong analytical and problem-solving skills
- Ability to build trusted partnerships with key contacts and users across business and engineering teams

**Preferred Qualifications**
- Understanding of Data Lake / Lakehouse technologies, incl. Apache Iceberg
- Experience with cloud databases (e.g., Snowflake, BigQuery)
- Understanding of data modeling concepts
- Working knowledge of open-source tools such as AWS Lambda and Prometheus
- Experience coding in Java or Python

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

The Senior Data Analyst (Supply Chain) will play a crucial role in developing the technology platform that supports all supply chain integrations and solutions. You will drive insights in collaboration with the analytics team to define system performance and identify opportunities, and you will collect business specifications and requirements from partner integration solutions and portal product management.

You should have a Bachelor's degree in Supply Chain Management, Business Management, Engineering, or a related field, or equivalent work experience. Intermediate to advanced skills with data analysis tools such as MySQL and Snowflake are required, along with familiarity with visualization tools like Power BI, Tableau, or Sigma. Experience with integration technologies such as API, EDI, and other forms of communication is essential; an understanding of coding languages is preferred but not mandatory. With 3-5+ years of relevant experience, you should be able to define problems, collect data, and draw valid conclusions, and communicate effectively to drive projects and insights for the betterment of the business. You will coordinate with core systems project management for strategic alignment and implementation, document workflows, and ensure governance over system solutions.

The ideal candidate is adaptable, resourceful, and a creative problem-solver who works with a sense of urgency, operates independently with minimal supervision, and thrives in a fast-paced, evolving environment. A passion for achieving industry-leading performance and breaking established norms is crucial, along with the organizational skills and high attention to detail needed to manage projects effectively and expeditiously.

At HNR Tech, you will have access to an inspiring work environment, a performance-driven work culture, opportunities to learn new technologies, and guidance for growth within the company and sector. You will be exposed to complex and challenging projects within an international context and work alongside driven, passionate colleagues who strive for top quality. This position is based in Matunga, Mumbai, Maharashtra, India, with a hybrid work model. Benefits include a flexible working style, diversity and inclusion, opportunities for learning and growth, a balanced working life, flexible work hours, health insurance, and fixed days off on Saturdays and Sundays.

HNR Tech is committed to creating a workplace and global community where inclusion is prioritized. As an equal opportunity employer, we seek to foster a welcoming and diverse environment. All qualified applicants will receive consideration regardless of non-merit-based or legally protected grounds.

Posted 2 weeks ago

Apply

3.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

You are invited to apply for the position of Lead / Senior ETL & Data Migration QA Engineer at our company in Hyderabad, India (mandatory five days working from the office). With 4 to 12 years of experience, you will be a key member of our Quality Assurance team, focusing on a high-impact data migration project. Your responsibilities will include ETL testing, data validation, and cloud migration, drawing on your expertise in SQL, ETL tools (preferably Talend), and cloud platforms like Snowflake. This role requires leading QA efforts across global teams to ensure the accuracy of large-scale data transformations.

Your main duties will involve designing and implementing robust test strategies and plans for data migration and ETL processes, and developing and executing detailed test cases, scripts, and plans to validate data accuracy, completeness, and consistency. You will conduct advanced SQL-based data validation and transformation testing, use ETL tools such as Talend, Informatica PowerCenter, or DataStage to validate data pipelines, and test semi-structured data formats like JSON and XML. Additionally, you will lead QA activities for cloud data migration projects, particularly to Snowflake, and coordinate testing activities across onshore and offshore teams to ensure timely, high-quality delivery. Documenting test results and defects, collaborating with development teams on resolution, and contributing to automated testing frameworks for ETL processes will also be part of your responsibilities. You will be expected to promote QA best practices and drive continuous improvement initiatives.

To be eligible for this position, you should have at least 3 years of experience in QA with a focus on ETL testing, data validation, and data migration. Proficiency in SQL for complex queries and data validation is essential, along with hands-on experience in Talend (preferred), Informatica PowerCenter, or DataStage. Experience with cloud data platforms, especially Snowflake, and a strong understanding of semi-structured data formats (JSON, XML) are required. Your excellent analytical and problem-solving skills, along with experience working in distributed teams and leading QA efforts, will be highly valuable in this role.

Preferred skills include experience with automated testing tools for ETL processes, knowledge of data governance and data quality standards, familiarity with AWS or other cloud ecosystems, and an ISTQB or equivalent certification in software testing.

If you are passionate about quality assurance, data migration, and ETL processes, and possess the required qualifications and skills, we encourage you to apply for this challenging and rewarding opportunity.
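To make the "SQL-based data validation" this role describes concrete, here is a minimal sketch of post-migration reconciliation using the Snowflake Python connector. The connection parameters and table names (SRC_ORDERS, TGT_ORDERS) are hypothetical assumptions, not details from the posting:

```python
# A minimal sketch of source-to-target reconciliation checks in Snowflake;
# connection parameters and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="qa_user", password="***", account="my_account",  # placeholders
    warehouse="QA_WH", database="MIGRATION", schema="PUBLIC",
)
cur = conn.cursor()

# 1. Row-count parity between source and target tables.
cur.execute(
    "SELECT (SELECT COUNT(*) FROM SRC_ORDERS) AS src_rows, "
    "(SELECT COUNT(*) FROM TGT_ORDERS) AS tgt_rows"
)
src_rows, tgt_rows = cur.fetchone()
assert src_rows == tgt_rows, f"Row-count mismatch: {src_rows} vs {tgt_rows}"

# 2. Column-level checksum to catch silently corrupted values.
cur.execute("SELECT SUM(HASH(order_id, amount)) FROM SRC_ORDERS")
src_hash = cur.fetchone()[0]
cur.execute("SELECT SUM(HASH(order_id, amount)) FROM TGT_ORDERS")
tgt_hash = cur.fetchone()[0]
assert src_hash == tgt_hash, "Checksum mismatch between source and target"

conn.close()
```

Checksum comparison of this kind scales better than row-by-row diffing for large tables, at the cost of not pinpointing which rows differ.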

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

As a Data Governance and Management Developer at Assent, you will play a crucial role in ensuring the quality and reliability of critical data across systems and domains. Your responsibilities will include defining and implementing data quality standards, developing monitoring pipelines to detect data issues, conducting data profiling assessments, and designing data quality dashboards. You will collaborate with cross-functional teams to resolve data anomalies and drive continuous improvement in data quality.

**Key Requirements & Responsibilities:**

- Define and implement data quality rules, validation checks, and metrics for critical business domains.
- Develop Data Quality (DQ) monitoring pipelines and alerts to proactively detect data issues.
- Conduct regular data profiling and quality assessments to identify gaps, inconsistencies, duplicates, and anomalies.
- Design and maintain data quality dashboards and reports for visibility into trends and issues.
- Utilize generative AI to automate workflows, enhance data quality, and support responsible prompt usage.
- Collaborate with data owners, stewards, and technical teams to resolve data quality issues.
- Develop and document standard operating procedures (SOPs) for issue management and escalation workflows.
- Support root cause analysis (RCA) for recurring or high-impact data quality problems.
- Define and monitor key data quality KPIs and drive continuous improvement through insights and analysis.
- Evaluate and recommend data quality tools that scale with the enterprise.
- Provide recommendations for enhancing data processes, governance practices, and quality standards.
- Ensure compliance with internal data governance policies, privacy standards, and audit requirements.
- Adhere to corporate security policies and procedures set by Assent.

**Qualifications:**

- 2-5 years of experience in a data quality, data analyst, or similar role.
- Degree in Computer Science, Information Systems, Data Science, or a related field.
- Strong understanding of data quality principles.
- Proficiency in SQL, GitHub, R, Python, SQL Server, and BI tools like Tableau, Power BI, or Sigma.
- Experience with cloud data platforms (e.g., Snowflake, BigQuery) and data transformation tools (e.g., dbt).
- Exposure to graph databases and GenAI tools.
- Ability to interpret dashboards and communicate data quality findings effectively.
- Understanding of data governance frameworks and regulatory considerations.
- Strong problem-solving skills, attention to detail, and familiarity with agile work environments.
- Excellent verbal and written communication skills.

Join Assent and be part of a dynamic team that values wellness, financial benefits, lifelong learning, and diversity, equity, and inclusion. Make a difference in supply chain sustainability and contribute to meaningful work that impacts the world. Contact talent@assent.com for assistance or accommodation during the interview process.
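As a concrete picture of the data profiling described in this role (gaps, duplicates, anomalies), here is a minimal pandas sketch of rule-based quality checks. The column names and rules are illustrative assumptions, not Assent's actual standards:

```python
# A minimal sketch of rule-based data-quality profiling with pandas;
# column names and rules are illustrative assumptions.
import pandas as pd

def profile(df: pd.DataFrame, key: str = "record_id") -> dict:
    return {
        # Completeness: share of missing values per column.
        "null_rate": df.isna().mean().to_dict(),
        # Uniqueness: duplicate keys that would break downstream joins.
        "duplicate_keys": int(df[key].duplicated().sum()),
        # A simple validity rule: negative quantities are invalid.
        "negative_qty": int((df["quantity"] < 0).sum()),
    }

if __name__ == "__main__":
    sample = pd.DataFrame(
        {"record_id": [1, 2, 2], "quantity": [5, -1, None]}
    )
    print(profile(sample))  # feeds a DQ dashboard or alerting threshold
```

In a production pipeline, checks like these would run on a schedule, with results written to a dashboard and alerts fired when a metric crosses an agreed threshold.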

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies