14.0 - 18.0 years
0 Lacs
Karnataka
On-site
Changing the world through digital experiences is what Adobe is all about. At Adobe, we provide everyone - from emerging artists to global brands - with everything they need to design and deliver exceptional digital experiences. We are passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We are on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We believe that new ideas can come from anywhere in the organization, and we value the contribution of every individual.

Digital Experience (DX) is a USD 4B+ business that serves the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Marketo Engage, within Adobe DX, is the world's largest marketing automation platform. It is a comprehensive solution that enables enterprises to attract, segment, and nurture customers from discovery to becoming their biggest fans. It helps enterprises effectively engage with customers through various surfaces and touchpoints.

We are looking for dedicated and enthusiastic engineers to join our team as we expand the business by developing next-generation products and enhancing our current offerings. If you are passionate about innovative technology, we would be thrilled to have a conversation with you!

**What you'll do:**

- Be an inspiring leader in building next-generation multi-cloud services.
- Deliver high-performance services that are adaptable to multifaceted business needs, and influence ideation and outstanding problem-solving.
- Build secure cloud services that provide very high availability, reliability, and security to our customers and their assets.
- Lead the technical design, vision, and implementation strategy.
- Define and apply best practices to build maintainable and modular solutions with high quality.
- Partner with global product management, UX, engineering, and operations teams to help shape technical product architecture and practices, roadmap, and release plans.
- Develop and evolve engineering processes and teamwork models, applying creative problem-solving to optimize team efficiency.
- Create technical specifications, prototypes, and presentations to communicate your ideas.
- Mentor and guide a high-performing engineering team.
- Craft a positive winning culture built on collaboration and shared accomplishments.
- Lead discussions on emerging industry technologies and trends, and work with the team and leadership to use this knowledge to influence product direction.

**What you need to succeed:**

- Passion and love for what you do!
- 14+ years of experience in software development, with 5+ years as an architect.
- Strong design, coding, and architectural skills along with problem-solving and analytical abilities.
- Expertise in architecting, designing, and building scalable and performant frontend applications.
- Expertise in Java, Spring Boot, REST services, MySQL or Postgres, MongoDB, and Kafka.
- Experience in developing and building solutions with cloud technologies (AWS and/or Azure).
- Good understanding of working with Cassandra, Solr, Elasticsearch, and Snowflake.
- Experience with API design and the ability to architect and implement an intuitive customer and third-party integration story.
- Exceptional coding skills, including an excellent understanding of optimization, the performance ramifications of coding decisions, and object-oriented design.
- Proven track record of working with, coaching, and mentoring software engineers.
- Ambitious and not afraid to tackle unknowns; demonstrates a strong bias for action.
- Self-motivated, with a desire to mentor a team and the ability to drive the team to accomplish high-quality work.
- Strong interpersonal, analytical, problem-solving, and conflict-resolution skills.
- Excellent speaking, writing, and presentation skills, as well as the ability to persuade, encourage, and empower others.
- BS or MS in Computer Science or a related field, or equivalent experience.

Adobe aims to make Adobe.com accessible to all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, please email accommodations@adobe.com or call (408) 536-3015.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Maharashtra
On-site
As the Manager, Data Platform Engineering at Assent, you will play a crucial role in leading and coaching a team of data platform engineers. Your primary responsibility will be to keep your team organized, motivated, and engaged in delivering high-quality, scalable software solutions that address key business challenges. By recruiting, developing, and retaining a high-performing team, you will ensure that Assent's products meet the needs of its customers and align with the company's business objectives.

Your role will involve coordinating and allocating work among the team members, driving delivery, and championing product quality. You will collaborate closely with other teams in the product value chain, such as Product Management, User Experience, Quality Assurance, Customer Success, and Infrastructure. Additionally, you will visualize upcoming work, manage product releases, and ensure that the software developed follows Assent's guidelines and standards.

To excel in this role, you should have at least 10 years of progressive experience in a data-focused role, with a proven track record in a leadership position. Strong mentoring and coaching skills are essential to keep your team engaged and motivated. A solid technical background, familiarity with AWS, Snowflake, dbt, Python, SQL, and Kafka, and experience working with large volumes of unstructured data are also required.

As a strategic and business-minded individual, you should possess strong analytical skills and be able to manage short-term projects as well as long-term strategies effectively. Adaptability, flexibility, and a growth mindset are key attributes that will help you thrive in Assent's dynamic environment. Your contributions will not only impact the success of Assent but will also contribute to addressing global challenges related to supply chain sustainability.

At Assent, we value your talent, energy, and passion. In addition to competitive financial benefits, we offer wellness programs, flexible work options, volunteer opportunities, and a commitment to diversity, equity, and inclusion. Join us in our mission to create a sustainable future and be part of a team that values inclusivity, respect, and continuous learning.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Maharashtra
On-site
At PwC, the focus in data and analytics is on leveraging data to drive insights and make informed business decisions. Utilizing advanced analytics techniques helps clients optimize operations and achieve strategic goals. In business intelligence at PwC, the emphasis is on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. The role involves developing and implementing innovative solutions to optimize business performance and enhance competitive advantage.

As a Senior Associate at PwC, you will work as part of a team of problem solvers, assisting in solving complex business issues from strategy to execution. The PwC Professional skills and responsibilities for this level include but are not limited to:

- Using feedback and reflection to develop self-awareness, personal strengths, and address development areas.
- Delegating to others to provide stretch opportunities and coaching them to deliver results.
- Demonstrating critical thinking and the ability to bring order to unstructured problems.
- Reviewing your work and that of others for quality, accuracy, and relevance.
- Knowing how and when to use the tools available for a given situation and explaining the reasons for this choice.
- Seeking and embracing opportunities that provide exposure to different situations, environments, and perspectives.
- Using straightforward communication in a structured way when influencing and connecting with others.
- Upholding the firm's code of ethics and business conduct.

**Job Description & Summary**

**Data Visualization: Senior Associate**

**Experience:** 6+ years

**Qualification:** Full-time graduate

**Mandatory Skills:**

- Designing, developing, and maintaining Power BI reports and dashboards to ensure data accuracy, reliability, and performance optimization.
- Building and managing data models in Power BI, leveraging Power Query for data transformation and optimization.
- Conducting detailed data analysis to uncover insights and presenting findings visually using Power BI features such as DAX, custom visuals, and interactive elements.
- Proficiency in creating data models, setting up relationships, and structuring data for optimal performance in Power BI.
- Strong hands-on knowledge of advanced SQL or PL/SQL.
- Strong analytical and problem-solving skills to interpret complex data and deliver actionable insights.
- Maintaining quality control standards compliant with policies.
- Creating and maintaining process documentation.
- Analyzing current and future technology/process needs and recommending enhancements and solutions.

**Additional Good To Have Skills:**

- Familiarity with building simple applications in Power Apps to support data entry and user interaction.
- Basic experience creating automated workflows using Power Automate and integrating with Power BI, Power Apps, and other Microsoft 365 tools.
- Experience with database technologies such as Oracle, Snowflake, and Azure.
- Optimizing data workflows using Alteryx to prepare, cleanse, and analyze data, ensuring seamless integration with Power BI.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
NTT DATA is looking for an experienced Data Engineer (Offshore) to join their team in Bangalore, Karnataka, India. As a Data Engineer, you will be responsible for designing and implementing tailored data solutions to meet customer needs across various technical platforms. You will work with tools and languages such as Qlik and Python to build and deploy data replication solutions. Collaboration across different technical stacks, including Snowflake, AWS, Oracle, dbt, and SQL, is essential. You will also need to produce comprehensive solution documentation and adhere to Agile practices throughout the development process.

The ideal candidate should have a minimum of 3+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects, along with 2+ years of experience leading a team in data-related projects. Demonstrated production experience in core data platforms such as Qlik, Snowflake, AWS, dbt, and SQL is preferred, as is a strong understanding of data integration technologies. Additionally, excellent written and verbal communication skills are required to effectively convey complex technical concepts. An undergraduate or graduate degree is preferred.

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. With a commitment to helping clients innovate, optimize, and transform for long-term success, NTT DATA has a diverse team of experts in over 50 countries and a strong partner ecosystem. Their services range from business and technology consulting to data and artificial intelligence solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. As a leading provider of digital and AI infrastructure globally, NTT DATA is part of the NTT Group, which invests significantly in R&D each year to support organizations and society in moving confidently into the digital future. Visit us at us.nttdata.com.
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
Delhi
On-site
You will support the Analytics solutions team in ramping up the F&A analytics and reporting practice using the Dataiku platform. Your role will involve partnering with internal stakeholders and clients to identify, analyze, and deliver analytics and automation solutions using Dataiku. You will be responsible for translating business requirements into technical solutions and managing the end-to-end delivery of Dataiku-based projects. Additionally, you will communicate technical infrastructure requirements to deploy automation solutions and convert solutions into tools and products. Leading and mentoring a team of junior resources to enable skill development in Dataiku, data engineering, and machine learning workflows will also be part of your responsibilities. Essential duties include identifying F&A automation opportunities in the client environment and performing end-to-end automation operations.

As a Senior Dataiku developer with over 6 years of experience, you should be proficient in building dynamic workflows, models, and pipelines. You should have experience developing custom formulas, applications, and plugins within the Dataiku DSS environment, as well as integrating and working with Snowflake. A good understanding of SQL and experience integrating Dataiku with enterprise systems such as SAP, Oracle, or cloud data platforms is required. You must possess a balance of analytical problem-solving skills and strong interpersonal and relationship-development abilities.

In terms of technical skills, you should have hands-on experience in Dataiku, Alteryx, SQL, Power BI, and Snowflake. Proficiency in creating data pipelines for data ingestion, transformation, and output within the Dataiku platform is essential. An understanding of Python and R scripting within Dataiku is a strong advantage, along with solid working knowledge of JIRA for agile project and task tracking.

Desired soft skills for this role include excellent presentation, verbal, and written communication skills, as well as excellent analytical skills and an aptitude for problem-solving, including data analysis and validation. The ability to work independently and as part of a team is also highly valued.

To be considered for this position, you should have 5-12 years of total analytics experience, with at least 6 years of experience specifically in Dataiku. Additionally, 1-2 years of working experience in Insurance Analytics would be beneficial.
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools. With over 5 years of experience, you will be responsible for designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. Your expertise will contribute to efficient ELT processes using Snowflake, Fivetran, and dbt for data integration and pipeline development.

You will write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Additionally, you will implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using dbt, and design high-performance data architectures. Collaboration with business stakeholders to understand data needs, troubleshooting data-related issues, ensuring high data quality standards, and documenting data processes will also be part of your responsibilities.

Your qualifications include expertise in Snowflake for data warehousing and ELT processes, strong proficiency in SQL for relational databases, experience with Informatica PowerCenter for data integration and ETL development, and familiarity with tools such as Power BI for data visualization, Fivetran for automated ELT pipelines, Sigma Computing, Tableau, Oracle, and dbt. You possess strong data analysis, requirement gathering, and mapping skills and are familiar with cloud services such as Azure, AWS, or GCP, along with workflow management tools like Airflow, Azkaban, or Luigi. Proficiency in Python for data processing is required, and knowledge of other languages such as Java and Scala is a plus. You hold a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. Your skills include data modeling, business intelligence, Python, dbt, Power BI, ETL, DWH, Fivetran, data quality, Snowflake, and SQL.

This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.
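The SCD Type-2 modeling this listing asks for is normally implemented declaratively (for example as a dbt snapshot); as a rough illustration of the underlying row-versioning logic, here is a minimal sketch in plain Python. The field names (`key`, `attrs`, `valid_from`, `valid_to`, `is_current`) are assumptions for the sketch, not an actual dbt schema:

```python
from datetime import date

def scd2_upsert(history, incoming, today=None):
    """Apply one incoming record to an SCD Type-2 history list.

    Each history row is a dict with 'key', 'attrs', 'valid_from',
    'valid_to' (None = still open), and 'is_current'.
    """
    today = today or date.today()
    current = next((r for r in history
                    if r["key"] == incoming["key"] and r["is_current"]), None)
    if current and current["attrs"] == incoming["attrs"]:
        return history  # no change: keep the open row as-is
    if current:
        # expire the old version instead of overwriting it
        current["valid_to"] = today
        current["is_current"] = False
    history.append({"key": incoming["key"], "attrs": incoming["attrs"],
                    "valid_from": today, "valid_to": None, "is_current": True})
    return history
```

The key design point is that updates never destroy history: a changed attribute closes the old row and opens a new one, which is exactly what a dbt snapshot's `dbt_valid_from`/`dbt_valid_to` columns record.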
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Maharashtra
On-site
As the Technical Lead of Data Engineering at Assent, you will collaborate with various stakeholders, including Product Managers, Product Designers, and Engineering team members, to identify opportunities and evaluate the feasibility of solutions. Your role will involve offering technical guidance, influencing decision-making, and aligning data engineering initiatives with business objectives as part of Assent's roadmap development. You will be responsible for driving the technical strategy, overseeing team execution, and implementing process improvements to construct resilient and scalable data systems. In addition, you will lead data engineering efforts, mentor a growing team, and establish robust and scalable data infrastructure.

Key Requirements & Responsibilities:

- Lead the technical execution of data engineering projects to ensure high-quality and timely delivery, covering discovery, delivery, and adoption stages.
- Collaborate with Architecture team members to design and implement scalable, high-performance data pipelines and infrastructure.
- Provide technical guidance to the team, ensuring adherence to best practices in data engineering, performance optimization, and system reliability.
- Work cross-functionally with teams such as Product Managers, Software Development, Analysts, and AI/ML teams to define and implement data initiatives.
- Partner with the team manager to plan and prioritize work, striking a balance between short-term deliverables and long-term technical enhancements.
- Keep abreast of emerging technologies and methodologies, advocating for their adoption to accelerate the team's objectives.
- Ensure compliance with corporate security policies and follow the established guidelines and procedures of Assent.

Qualifications (Your Knowledge, Skills and Abilities):

- 10+ years of experience in data engineering, software development, or related fields.
- Proficiency in cloud data platforms, particularly AWS.
- Expertise in modern data technologies such as Spark, Airflow, dbt, Snowflake, Redshift, or similar.
- Deep understanding of distributed systems and data pipeline design, with specialization in ETL/ELT processes, data warehousing, and real-time streaming.
- Strong programming skills in Python, SQL, Scala, or similar languages.
- Experience with infrastructure-as-code tools such as Terraform and CloudFormation, and knowledge of DevOps best practices.
- Ability to influence technical direction and promote best practices across teams.
- Excellent communication and leadership skills, with a focus on fostering collaboration and technical excellence.
- A learning mindset, continuously exploring new technologies and best practices.
- Experience in security, compliance, and governance related to data systems is a plus.

This is not an exhaustive list of duties, and responsibilities may be modified or added as needed to meet business requirements.

Life at Assent: At Assent, we are dedicated to cultivating an inclusive environment where team members feel valued, respected, and heard. Our diversity, equity, and inclusion practices are guided by our Diversity and Inclusion Working Group and Employee Resource Groups (ERGs), ensuring that team members from diverse backgrounds are recruited, retained, and provided opportunities to contribute to business success.

If you need assistance or accommodation during any stage of the interview and selection process, please reach out to talent@assent.com, and we will be happy to assist you.
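One building block of the ETL/ELT pipeline design this role centers on is incremental (high-watermark) extraction: pull only the rows that changed since the last run, so re-runs are cheap and idempotent. A minimal sketch in plain Python, where the `updated_at` field and integer watermarks are illustrative assumptions rather than any specific Assent system:

```python
def incremental_extract(source_rows, watermark):
    """Return rows newer than the last processed watermark,
    plus the new watermark to persist for the next run.

    Rows are dicts with an 'updated_at' field (illustrative).
    """
    fresh = [r for r in source_rows if r["updated_at"] > watermark]
    # if nothing is new, the watermark stays where it was
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark
```

Re-running with the returned watermark yields an empty batch, which is what makes a scheduled pipeline (e.g., in Airflow) safe to retry.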
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
The People Services & Technology team at Apple is currently seeking an experienced analytics developer to join our team. As a professional data analytics developer, you will have the opportunity to work alongside a team of experts with varied strengths at Apple. This role offers a chance to be part of a dynamic HR analytics team dedicated to enhancing the employee hire-to-retire experience, enabling individuals at Apple to excel in their work.

You will collaborate with a global people analytics team to deliver insights and analytics for various people-related products, including recruitment, candidate care, employee relations, people surveys, talent planning, and other strategic focus areas within the People Services & Technology organization. As a people analytics developer, you will be integral to the team, utilizing data from multiple systems across the Apple HRIS landscape to derive powerful insights for leadership and develop analytical products that enhance the employee experience at Apple.

Your responsibilities will include designing, developing, and deploying reports, analytics, and dashboards for the Source-to-Hire-to-Retire cycle to support data-driven decision-making. You will collaborate with business partners, regional coordinators, and leaders to identify reporting requirements, use tools like Tableau to create dashboards and reports, and re-model database architecture to meet business needs. Additionally, you will maintain existing analytics products, ensure successful delivery of new features, and enhance the reporting ecosystem for improvement, simplification, standardization, and security while adhering to Apple Data Compliance and Security standards.

Minimum Qualifications:

- Bachelor's degree or equivalent experience in Computer Science, Information Management Systems, Data Science, or a related field.
- 6+ years of experience developing and maintaining reports and analytics using analytical reporting tools, with a focus on Tableau.
- Proficiency in data visualization and charting tools to generate dashboards for data insights.

Preferred Qualifications:

- Technical expertise in advanced analytics techniques, SQL development, data visualization, and statistical analysis.
- Familiarity with HR analytics and reporting tools for measuring HR performance and employee satisfaction.
- Strong data engineering skills, including expertise in SQL and knowledge of Snowflake, MySQL, and Postgres databases.
- Ability to translate high-level business requirements into tangible prototypes and drive discussions toward optimal solutions.
- Proactive approach to task assessment, priority setting, and resolution.
- Comfort managing multiple priorities and setting customer expectations.
- Ability to influence technology direction to enhance the toolset and skills.
- Effective communication with partner teams, managers, and leadership.
- Fostering a positive, open culture and exhibiting the Apple values of inclusion, trust, and respect.
- Certifications in Data Science or Reporting & Analytics tools.
- Knowledge of advanced data processing techniques using Python libraries such as NumPy and Pandas, and data science libraries for statistical analysis.

If you meet the qualifications and are excited about this opportunity, please submit your CV for consideration.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Data Quality Integration Engineer, your primary responsibility will be to incorporate data quality capabilities into enterprise data landscapes. You will play a crucial role in integrating advanced data quality tools such as Ataccama and Collibra with cloud data platforms such as Snowflake and SQL databases. Your role is essential in ensuring that data governance standards are met through robust, scalable, and automated data quality processes.

In this role, you will need to develop scalable applications using appropriate technical options and optimize application development, maintenance, and performance. You will implement the integration of data quality tools with Snowflake and SQL-based platforms, develop automated pipelines and connectors for data profiling, cleansing, monitoring, and validation, and configure data quality rules aligned with governance policies and KPIs. Troubleshooting integration issues, monitoring performance, and collaborating with various teams to align solutions with business needs will also be part of your responsibilities.

You will need to adhere to coding standards, perform peer reviews, write optimized code, create and review design documents, templates, test cases, and checklists, and develop and review unit and integration test cases. Additionally, you will estimate efforts for project deliverables, track timelines, perform defect RCA and trend analysis, propose quality improvements, and mentor team members while managing aspirations and keeping the team engaged.

To excel in this role, you must have strong experience with data quality tools such as Ataccama and Collibra, hands-on experience with Snowflake and SQL databases, proficiency in SQL scripting and data pipeline development (preferably Python or Scala), and a sound understanding of data profiling, cleansing, enrichment, and monitoring. Knowledge of REST APIs, metadata integration techniques, and cloud platforms such as AWS and Azure would be advantageous.

Furthermore, soft skills such as strong analytical and problem-solving abilities, effective communication and presentation skills, and the ability to manage high-pressure environments and multiple priorities are essential. Certification in Ataccama, Collibra, Snowflake, AWS, or Azure, along with domain knowledge in enterprise data architecture and the financial services, insurance, or asset management domains, would be beneficial for this role.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
Are you passionate about the intersection of data, technology, and science, and excited by the potential of Real-World Data (RWD) and AI? Do you thrive in collaborative environments and aspire to contribute to the discovery of groundbreaking medical insights? If so, join the data42 team at Novartis!

At Novartis, we reimagine medicine by leveraging state-of-the-art analytics and our extensive internal and external data resources. Our data42 platform grants access to high-quality, multi-modal preclinical and clinical data, along with RWD, creating the optimal environment for developing advanced AI/ML models and generating health insights. Our global team of data scientists and engineers utilizes this platform to uncover novel insights and guide drug development decisions.

As an RWD SME / RWE Execution Data Scientist, you will focus on executing innovative methodologies and AI models to mine RWD on the data42 platform. You will be the go-to authority for leveraging patterns across diverse RWD modalities that are crucial to understanding patient populations, biomarkers, and drug targets, accelerating the development of life-changing medicines.

Duties and Responsibilities:

- Collaborate with R&D stakeholders to co-create and implement innovative, repeatable, scalable, and automated data and technology solutions in line with the data42 strategy.
- Be a data Subject Matter Expert (SME): understand Real-World Data (RWD) of different modalities and vocabularies (LOINC, ICD, HCPCS, etc.) as well as non-traditional RWD (patient-reported outcomes, wearables, and mobile health data), and know where and how they can be used, including in conjunction with clinical data, omics data, pre-clinical data, and commercial data.
- Contribute to data strategy implementation, such as federated learning, tokenization, data quality frameworks, regulatory requirements (conversion of submission data to HL7 FHIR formats, the Sentinel initiative), conversion to common data models and standards (OMOP, FHIR, SEND, etc.), FAIR principles, and integration with the enterprise catalog.
- Define and execute advanced, integrated, and scalable analytical approaches and research methodologies (including industry trends) in support of exploratory and regulatory use of AI models for RWD analysis across the Research-Development-Commercial continuum by facilitating research questions.
- Stay current with emerging applications and trends, driving the development of advanced analytic capabilities for data42 across the real-world evidence generation lifecycle, from ideation to study design and execution.
- Demonstrate high agility working with cross-located and cross-functional associates across business domains (Commercial, Development, Biomedical Research) and therapeutic-area divisions for our priority disease areas, executing complex and critical business problems with quantified business impact/ROI.

Ideal Candidate Profile:

- PhD or MSc in a quantitative discipline (e.g., but not restricted to, Computer Science, Physics, Statistics, or Epidemiology) with proven expertise in artificial intelligence / machine learning.
- 8+ years of relevant experience in Data Science (or 4+ years post-qualification in the case of a PhD).
- Extensive experience in statistical and machine learning techniques: regression, classification, clustering, design of experiments, Monte Carlo simulations, statistical inference, feature engineering, time series forecasting, text mining, natural language processing, LLMs, and multi-modal generative AI.
- Good-to-have skills: stochastic models, Bayesian models, Markov chains, optimization techniques (including dynamic programming), deep learning techniques on structured and unstructured data, and recommender systems.
- Proficiency in tools and packages: Python, R (optional), SQL; exposure to dashboard or web-app building using Power BI, R Shiny, Flask, or other open-source or proprietary software and packages is an advantage.
- Knowledge of data standards, e.g., OHDSI OMOP and other data standards, FHIR HL7 for regulatory use, and best practices.
- Good to have: Foundry, big data programming, and working knowledge of executing data science on AWS, Databricks, or Snowflake.
- Strong in matrix collaboration environments, with good communication and collaboration skills with country, regional, and global stakeholders in an individual-contributor capacity.

Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you, collaborating, supporting, and inspiring each other, and combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together?

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up.

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally.
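Among the techniques listed above, Monte Carlo simulation is the easiest to show self-contained: draw random samples, count how often an event occurs, and use the frequency as an estimate. The classic textbook example (estimating pi from points in the unit square) sketched with only the Python standard library — illustrative, not a data42-specific method:

```python
import random

def estimate_pi(n_samples=100_000, seed=42):
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square that land inside the quarter circle, times 4."""
    rng = random.Random(seed)  # seeded for reproducibility
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples
```

The same draw-count-estimate pattern underlies the simulation work in pharma analytics, e.g., propagating uncertainty through a trial-design or cost model; the error of the estimate shrinks as 1/sqrt(n_samples).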
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
The role of AWS and Snowflake Production Support Specialists at PwC involves ensuring the availability, performance, and security of the AWS cloud infrastructure and Snowflake data platform. You will be responsible for monitoring, troubleshooting, and resolving incidents to minimize downtime and support business operations. As a member of the dynamic IT team, you will play a critical role in managing AWS cloud resources such as EC2 instances, S3 buckets, RDS databases, and Lambda functions. Additionally, you will configure and optimize AWS services for scalability, reliability, and cost-efficiency, and implement infrastructure as code using tools like CloudFormation or Terraform. In terms of Snowflake Data Platform Support, your responsibilities will include monitoring and maintaining Snowflake data warehouses and data pipelines, performing performance tuning and optimization of Snowflake queries and data loading processes, and responding to alerts and incidents related to AWS and Snowflake environments. You will diagnose and troubleshoot issues, collaborate with internal teams and vendors, and implement resolutions and preventive measures. Furthermore, you will design and implement backup strategies for AWS resources and Snowflake data, test and maintain disaster recovery plans, implement and enforce security best practices, conduct security assessments and audits to ensure compliance, and coordinate and execute changes to AWS and Snowflake configurations following change management processes. Effective collaboration and communication with development teams, infrastructure teams, business stakeholders, and external vendors are essential aspects of this role. To qualify for this position, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience. 
This role requires proven experience in AWS cloud infrastructure management and Snowflake data platform administration; strong knowledge of AWS services and Snowflake features; experience with infrastructure as code and automation tools; familiarity with data warehousing concepts, ETL processes, and SQL querying; incident management experience; cloud security knowledge; and excellent communication and collaboration skills.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Engineer specializing in Snowflake, you will leverage your 10+ years of experience to design, build, optimize, and maintain robust and scalable data solutions on the Snowflake platform. Your expertise in cloud data warehouse principles will be utilized to collaborate with stakeholders in translating business requirements into efficient data pipelines and models. Passionate about unlocking data-driven insights, you will work with a team to drive business value through Snowflake's capabilities. Key Skills: - Proficient in Snowflake architecture and features such as virtual warehouses, storage, data sharing, and data governance. - Advanced SQL knowledge for complex queries, stored procedures, and performance optimization in Snowflake. - Experience in ETL/ELT development using Snowflake tools, third-party ETL tools, and scripting languages. - Skilled in data modelling methodologies and performance tuning specific to Snowflake. - Deep understanding of Snowflake security features and data governance frameworks. - Proficient in scripting languages like Python for automation and integration. - Familiarity with cloud platforms like Azure and data analysis tools for visualization. - Experience in version control using Git and working in Agile methodologies. Responsibilities: - Collaborate with the Data and ETL team to review, improve, and maintain data pipelines and models on Snowflake. - Optimize SQL queries for data extraction, transformation, and loading within Snowflake. - Ensure data quality, integrity, and security in the Snowflake environment. - Participate in code reviews and contribute to development standards. Education: - Bachelor's degree in Computer Science, Data Science, Information Technology, or equivalent. - Relevant Snowflake certifications (e.g., Snowflake Certified Pro / Architecture / Advanced) are a plus.
If you are a proactive Senior Data Engineer with a strong background in Snowflake, eager to drive business value through data-driven insights, this full-time opportunity in Pune awaits you. Join us at Arthur Grand Technologies Inc and be a part of our dynamic team.
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
karnataka
On-site
You should have 2 - 7 years of experience in Python with a good understanding of big data ecosystems and frameworks such as Hadoop, Spark, etc. Your experience should include developing data processing tasks using PySpark and expertise in at least one popular cloud provider, preferably AWS. Additionally, you should possess good knowledge of any RDBMS/NoSQL database with strong SQL writing skills. Experience with data warehouse tools like Snowflake and any ETL tool would be a plus. Strong analytical and problem-solving capabilities are essential, along with excellent verbal and written communication skills. Client-facing skills are required, as you will be working directly with clients to build trusted relationships with stakeholders. Ability to collaborate effectively across global teams is crucial. You should have a strong understanding of data structures, algorithms, object-oriented design, and design patterns. Experience in multi-dimensional data, data curation processes, and data quality improvement is desired. General knowledge of business processes, data flows, and quantitative models is expected. An independent thinker who is willing to engage, challenge, and learn new technologies would be an ideal fit for this role. Role & Responsibilities: - Maintain high-quality coding standards and deliver work within the stipulated time frame. - Review the work of team members and occasionally provide guidance. - Develop an understanding of the Work Breakdown Structure and assist the manager in delivering the same. - Develop sector initiatives like credential building and knowledge management. - Act as a team lead and proficiently deliver key responsibilities aligned with the project plan.
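The PySpark data-processing work described above typically boils down to reading raw records, cleaning and deduplicating them, and passing the result onward. A minimal sketch of that pattern in plain Python (the PySpark equivalents are noted in comments; all field names here are illustrative, not taken from the posting):

```python
# Illustrative sketch of a typical data-cleaning task. In PySpark the same
# steps would be df.dropna(...), a window over id ordered by ts descending,
# and a withColumn cast to DoubleType.
def clean_records(records):
    """Drop rows missing an id, deduplicate on id keeping the latest
    timestamp, and normalise the amount field to float."""
    # keep only rows that have an id (PySpark: df.dropna(subset=["id"]))
    valid = [r for r in records if r.get("id") is not None]
    # deduplicate on id, keeping the row with the greatest ts
    latest = {}
    for r in valid:
        if r["id"] not in latest or r["ts"] > latest[r["id"]]["ts"]:
            latest[r["id"]] = r
    # normalise amount to float (PySpark: col("amount").cast("double"))
    return [{**r, "amount": float(r["amount"])} for r in latest.values()]
```

The same shape scales from an in-memory list to a Spark DataFrame because each step is a pure transformation over the rows.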
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are invited to join InfoBeans Technologies as a Data Engineer with a minimum of 5 years of experience in the field. This is a full-time position and we are looking for individuals who are proficient in Snowflake, along with expertise in either Azure Data Factory (ADF) and Python or Power BI and Data Modeling. As a Data Engineer at InfoBeans Technologies, you will be required to have hands-on experience with tools such as WhereScape RED + 3D, DataVault 2.0, SQL, and data transformation pipelines. A strong understanding of Data Management & Analytics principles is essential for this role. Additionally, excellent communication skills and the ability to engage in requirements engineering are highly valued. The successful candidate will be responsible for delivering and supporting production-ready data systems at an expert level of proficiency. The primary skill areas required for this role include Data Engineering & Analytics. If you are passionate about building robust data pipelines, modeling enterprise data, and visualizing meaningful insights, we would love to connect with you. Immediate availability or joining within 15 days is preferred for this position. To apply for this exciting opportunity, please send your resume to mradul.khandelwal@infobeans.com or reach out to us directly. Join us in shaping the future of data analytics and engineering at InfoBeans Technologies.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Snowflake Developer at UST in Trivandrum, you will be responsible for designing, implementing, and optimizing data solutions using Snowflake's cloud data platform. With a minimum of 4 years of professional experience in a similar role, you will leverage your expertise in Snowflake's architecture, features, and best practices to ensure efficient data loading processes from AWS S3 to Snowflake. Your key responsibilities will include designing and implementing data loading processes, creating and maintaining Snowflake objects, collaborating with data engineers and business stakeholders, and establishing CI/CD pipelines for Snowflake deployments. You will also document processes, configurations, and implementations, support Snowflake maintenance activities, and troubleshoot data loading and processing issues. To excel in this role, you should have advanced SQL knowledge, hands-on experience with AWS services (S3, Lambda, IAM, CloudFormation), a deep understanding of Snowflake architecture and features, expertise in data integration and ETL processes, and familiarity with CI/CD and DevOps practices. Your problem-solving skills, analytical thinking, effective communication, and collaboration abilities will be essential in delivering high-quality solutions. UST is a global digital transformation solutions provider with a track record of partnering with leading companies to drive impactful transformations. Joining UST means being part of a dynamic team that embraces innovation, agility, and purpose to deliver solutions that touch billions of lives worldwide. If you are passionate about leveraging your skills to make a real impact in the digital transformation space, this role offers a rewarding opportunity to grow and contribute to UST's mission of boundless impact. Apply now and be a part of our journey towards shaping the future through technology and purpose.
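Loading data from S3 into Snowflake, as this role describes, is usually done with a COPY INTO statement run against an external stage. A minimal sketch that composes such a statement (the stage, table, and file-format names are illustrative placeholders, not values from the posting):

```python
def build_copy_statement(table, stage, prefix, file_format="parquet_fmt"):
    """Compose a Snowflake COPY INTO statement that loads files exposed
    by an external stage (typically backed by an S3 bucket). All object
    names passed in are assumed to already exist in Snowflake."""
    return (
        f"COPY INTO {table}\n"
        f"  FROM @{stage}/{prefix}\n"
        f"  FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"  ON_ERROR = 'ABORT_STATEMENT';"
    )
```

In practice the generated SQL would be executed through a Snowflake session (e.g. the Python connector) and scheduled by the surrounding pipeline; parameterising it this way keeps the load step reusable across tables and date prefixes.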
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As an Engineer, IT Data at American Airlines, you will be part of a diverse and high-performing team dedicated to technical excellence. Your main focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you will be working in refers to the area within Information Technology that focuses on managing and leveraging data as a strategic asset. This includes data management, storage, integration, and governance, leaning into Machine Learning, AI, Data Science, and Business Intelligence. In this role, you will work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights to make better decisions. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, etc., as well as traditional data warehouse tools. Your responsibilities will involve multiple aspects of the development lifecycle including design, cloud engineering, ingestion, preparation, data modeling, testing, CI/CD pipelines, performance tuning, deployments, consumption, BI, alerting, and production support. You will provide technical leadership, collaborate within a team environment, and work independently. Additionally, you will be part of a DevOps team that completely owns and supports the product, implementing batch and streaming data pipelines using cloud technologies. As an essential member of the team, you will lead the development of coding standards, best practices, privacy, and security guidelines. You will also mentor others on technical and domain skills to create multi-functional teams.
Your success in this role will require a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering, or a related technical discipline, or equivalent experience/training. To excel in this position, you should have at least 3 years of software solution development experience using agile, DevOps, and operating in a product model. Moreover, you should have 3+ years of data analytics experience using SQL and cloud development and data lake experience, preferably with Microsoft Azure. Preferred qualifications include 5+ years of software solution development experience, 5+ years of data analytics experience using SQL, 3+ years of full-stack development experience, and familiarity with Azure technologies. Additionally, skills, licenses, and certifications required for success in this role include expertise with the Azure Technology stack, practical direction within Azure Native cloud services, Azure Development Track Certification, Spark Certification, and a combination of Development, Administration & Support experience with various tools/platforms such as Scripting (Python, Spark, Unix, SQL), Data Platforms (Teradata, Cassandra, MongoDB, Oracle, SQL Server, ADLS, Snowflake), Azure Cloud Technologies, CI/CD tools (GitHub, Jenkins, Azure DevOps, Terraform), BI Analytics Tool Stack (Cognos, Tableau, Power BI, Alteryx, Denodo, Grafana), and Data Governance and Privacy tools (Alation, Monte Carlo, Informatica, BigID). Join us at American Airlines, where you can explore a world of possibilities, travel the world, grow your expertise, and become the best version of yourself while contributing to the transformation of technology delivery for our customers and team members worldwide.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The Data Warehouse Engineer will be responsible for managing and optimizing data processes in an Azure environment using Snowflake. The ideal candidate should have solid SQL skills and a basic understanding of data modeling. Experience with CI/CD processes and Azure ADF is preferred. Additionally, expertise in ETL/ELT frameworks and ER/Studio would be a plus. As a Senior Data Warehouse Engineer, in addition to the core requirements, you will oversee other engineers while also being actively involved in data modeling and Snowflake SQL optimization. You will be responsible for conducting design reviews, code reviews, and deployment reviews with the engineering team. Familiarity with medallion architecture and experience in the Healthcare or life sciences industry will be highly advantageous. At Myridius, we are committed to transforming the way businesses operate by offering tailored solutions in AI, data analytics, digital engineering, and cloud innovation. With over 50 years of expertise, we aim to drive organizations through the rapidly evolving landscapes of technology and business. Our integration of cutting-edge technology with deep domain knowledge enables businesses to seize new opportunities, drive significant growth, and maintain a competitive edge in the global market. We go beyond typical service delivery to craft transformative outcomes that help businesses not just adapt, but thrive in a world of continuous change. Discover how Myridius can elevate your business to new heights of innovation by visiting us at www.myridius.com and start leading the change.
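The medallion architecture mentioned above organises a warehouse into bronze (raw), silver (cleaned), and gold (business-ready) layers. A minimal configuration sketch of that layering (the schema names are invented for illustration, not prescribed by the posting):

```python
# Hypothetical layer map for a medallion-style warehouse; schema names
# are illustrative. Pipelines resolve their write target through this map
# so that layer conventions live in one place.
MEDALLION_LAYERS = {
    "bronze": {"schema": "raw", "purpose": "as-landed source data, append-only"},
    "silver": {"schema": "curated", "purpose": "cleaned, conformed, deduplicated"},
    "gold": {"schema": "analytics", "purpose": "business-level aggregates and marts"},
}

def target_schema(layer):
    """Resolve the schema a pipeline should write to for a given layer."""
    return MEDALLION_LAYERS[layer]["schema"]
```

Centralising the layer-to-schema mapping like this is one common way to keep ingestion, transformation, and serving jobs consistent as the warehouse grows.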
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
pune, maharashtra
On-site
Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow, all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible. Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
As a Senior Data Modeller, you will be responsible for leading the design and development of conceptual, logical, and physical data models for enterprise and application-level databases. Your expertise in data modeling, data warehousing, and data governance, particularly in cloud environments, Databricks, and Unity Catalog, will be crucial for the role. You should have a deep understanding of business processes related to master data management in a B2B environment and experience with data governance and data quality concepts. Your key responsibilities will include designing and developing data models, translating business requirements into structured data models, defining and maintaining data standards, collaborating with cross-functional teams to implement models, analyzing existing data systems for optimization, creating entity relationship diagrams and data flow diagrams, supporting data governance initiatives, and ensuring compliance with organizational data policies and security requirements. To be successful in this role, you should have at least 12 years of experience in data modeling, data warehousing, and data governance. Strong familiarity with Databricks, Unity Catalog, and cloud environments (preferably Azure) is essential. Additionally, you should possess a background in data normalization, denormalization, dimensional modeling, and schema design, along with hands-on experience with data modeling tools like ERwin. Experience in Agile or Scrum environments, proficiency in integration, databases, data warehouses, and data processing, as well as a track record of successfully selling data and analytics software to enterprise customers are key requirements. Your technical expertise should cover Big Data, streaming platforms, Databricks, Snowflake, Redshift, Spark, Kafka, SQL Server, PostgreSQL, and modern BI tools. 
Your ability to design and scale data pipelines and architectures in complex environments, along with excellent soft skills including leadership, client communication, and stakeholder management, will be valuable assets in this role.
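The dimensional modeling and schema-design work this role centres on usually produces a star schema: one fact table keyed to surrounding dimension tables. A minimal sketch of that shape (all table and column names are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Column:
    name: str
    dtype: str

@dataclass
class Table:
    name: str
    columns: list

    def ddl(self):
        """Render the table as a CREATE TABLE statement."""
        cols = ",\n  ".join(f"{c.name} {c.dtype}" for c in self.columns)
        return f"CREATE TABLE {self.name} (\n  {cols}\n);"

# The classic star shape: a fact table carrying measures plus surrogate
# keys that point at its dimensions.
dim_customer = Table("dim_customer", [
    Column("customer_sk", "INT"),
    Column("name", "STRING"),
])
fact_sales = Table("fact_sales", [
    Column("customer_sk", "INT"),    # FK to dim_customer
    Column("date_sk", "INT"),        # FK to a date dimension
    Column("amount", "DECIMAL(18,2)"),
])
```

Modeling tools like ERwin generate equivalent DDL from the diagrammed entities; expressing the model in code like this is just one lightweight way to make schema reviews diffable.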
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Engineer at Ethoca, a Mastercard Company in Pune, India, you will play a crucial role in driving data enablement and exploring big data solutions within our technology landscape. Your responsibilities will include designing, developing, and optimizing batch and real-time data pipelines using tools such as Snowflake, Snowpark, Python, and PySpark. You will also be involved in building data transformation workflows, implementing CI/CD pipelines, and administering the Snowflake platform to ensure performance tuning, access management, and platform scalability. Collaboration with stakeholders to understand data requirements and deliver reliable data solutions will be a key part of your role. Your expertise in cloud-based database infrastructure, SQL development, and building scalable data models using tools like Power BI will be essential in supporting business analytics and dashboarding. Additionally, you will be responsible for real-time data streaming pipelines, data observability practices, and planning and executing deployments, migrations, and upgrades across data platforms while minimizing service impacts. To be successful in this role, you should have a strong background in computer science or software engineering, along with deep hands-on experience with Snowflake, Snowpark, Python, PySpark, and CI/CD tooling. Familiarity with Schema Change, Java JDK, Spring & Springboot framework, Databricks, and real-time data processing is desirable. You should also possess excellent problem-solving and analytical skills, as well as effective written and verbal communication abilities for collaborating across technical and non-technical teams. You will be part of a high-performing team that is committed to making systems resilient and easily maintainable on the cloud. 
If you are looking for a challenging role that allows you to leverage cutting-edge software and development skills while working with massive data volumes, this position at Ethoca may be the right fit for you.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Manager, Data Engineering at Assent, you will play a crucial role in leading a team of specialists in data engineering to bring key products and features to life for Assent and its customers. Your responsibilities will include recruiting, developing, and retaining a high-performing and engaged team, coaching them to deliver high-quality, scalable software that addresses key business problems. You will be responsible for coordinating and allocating work among the team, driving delivery, and championing product quality. Your role will involve establishing and maintaining rapport with other teams in the product value chain such as Product Management, User Experience, Quality Assurance, Customer Success, and Infrastructure. You will visualize upcoming work, manage product releases by monitoring progress and making necessary adjustments to meet delivery schedule and requirements. Additionally, you will ensure that the team develops software that follows Assent's Design Guidelines, Coding Standards, Security Guidelines, and any other relevant guidelines provided by the Architecture team. To excel in this role, you should have at least 10 years of progressive experience in a data-focused role with a proven track record in a leadership position. Strong mentoring and coaching skills are essential to keep your team engaged and motivated. A strong technical background with experience in AWS, Snowflake, dbt, Python, SQL, and handling large volumes of unstructured data is required. You should possess a growth mindset, be highly organized, adaptable, and strategic in your approach to business challenges. At Assent, we value your talent, energy, and passion. In addition to competitive financial benefits, we offer wellness programs, flexible work options, professional development opportunities, and a commitment to diversity, equity, and inclusion. Join us in our mission to create a sustainable future and make a global impact with your expertise in data engineering.
Posted 1 week ago
2.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
You are a highly skilled and experienced professional tasked with leading and supporting data warehousing and data center architecture initiatives. Your expertise in Data Warehousing, Data Lakes, Data Integration, and Data Governance, along with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake, will be crucial for this role. You are expected to have strong presales experience, technical leadership capabilities, and the ability to manage complex enterprise deals across various geographies. Your main responsibilities will include architecting and designing scalable Data Warehousing and Data Lake solutions, leading presales engagements, creating and presenting proposals and solution designs to clients, collaborating with cross-functional teams, estimating efforts and resources for customer requirements, driving Managed Services opportunities and enterprise deal closures, engaging with clients globally, ensuring alignment of solutions with business goals and technical requirements, and maintaining high standards of documentation and presentation for client-facing materials. To excel in this role, you must possess a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certifications in AWS, Azure, GCP, or Snowflake are advantageous. You should have experience working in consulting or system integrator environments, a strong understanding of Data Warehousing, Data Lakes, Data Integration, and Data Governance, hands-on experience with ETL tools, exposure to cloud environments, a minimum of 2 years of presales experience, experience in enterprise-level deals and Managed Services, the ability to handle multi-geo engagements, excellent presentation and communication skills, and a solid grasp of effort estimation techniques for customer requirements.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a member of our team, you will be responsible for designing, developing, and maintaining scalable data pipelines utilizing Snowflake. Your role will involve optimizing SQL queries and data models to enhance performance and efficiency. Furthermore, you will play a crucial part in implementing data security and governance best practices within the Snowflake platform. Collaboration is key in this position, as you will work closely with data scientists and analysts to address and fulfill their data requirements effectively. Additionally, you will be expected to troubleshoot and resolve any data-related issues promptly to ensure smooth operations. If you are passionate about working with cutting-edge technologies and are eager to contribute to a dynamic team environment, this opportunity is perfect for you. Join us and be a part of our exciting journey towards achieving excellence in data management and analysis. #CareerOpportunities #JobVacancy #WorkWithUs
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You are a Data Engineer with 3-7 years of experience, currently based in Mumbai and available for face-to-face interaction. Your responsibilities will include building and managing data pipelines using Snowflake and Azure Data Factory (ADF), writing optimized SQL for large-scale data analysis, and monitoring and enhancing Snowflake performance. To excel in this role, you should have a strong background in data engineering and SQL with a focus on Snowflake and ADF. Additionally, familiarity with data quality, governance, and Python will be beneficial. Possessing a Snowflake certification will be considered a plus. If you meet these requirements and are passionate about working in a dynamic environment, where you can utilize your skills in Snowflake and SQL, we encourage you to apply for this position. Please send your CV to shruthi.pu@andortech.com to be considered for this opportunity. #Hiring #DataEngineer #Snowflake #AzureDataFactory #SQL #NowHiring #JobOpening
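Monitoring and enhancing Snowflake performance, as this role calls for, often starts with the QUERY_HISTORY table function. A sketch that composes such a diagnostic query (the threshold and limit are arbitrary illustrative defaults, not requirements from the posting):

```python
def slow_query_sql(min_seconds=60, limit=20):
    """Return SQL that lists the slowest recent queries from Snowflake's
    INFORMATION_SCHEMA.QUERY_HISTORY table function. total_elapsed_time
    is reported in milliseconds, hence the conversion."""
    return (
        "SELECT query_id, query_text, total_elapsed_time / 1000 AS seconds\n"
        "FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())\n"
        f"WHERE total_elapsed_time > {min_seconds * 1000}\n"
        "ORDER BY total_elapsed_time DESC\n"
        f"LIMIT {limit};"
    )
```

Running this periodically (or pointing it at the longer-retention ACCOUNT_USAGE views) gives the candidate queries for tuning work such as clustering, warehouse sizing, or query rewrites.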
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You have a fantastic opportunity to join our team as a Senior Database Developer with a focus on SQL, data analytics, and Snowflake. With over 7 years of experience, you will be responsible for designing and developing data warehouse & data integration projects at the SSE / TL level. Your expertise in working in an Azure environment will be beneficial as you develop ETL pipelines in and out of the data warehouse using Python and Snowflake's SnowSQL. While writing SQL queries against Snowflake is a plus, having a good understanding of database design concepts such as Transactional, Datamart, and Data warehouse is essential. You will excel in loading data from various sources, translating complex requirements into detailed designs, and analyzing vast data stores to uncover valuable insights. As a Snowflake data engineer, you will play a pivotal role in architecting and implementing large-scale data intelligence solutions around the Snowflake Data Warehouse. Your solid experience in architecting, designing, and operationalizing data & analytics solutions on the Snowflake Cloud Data Warehouse is crucial. Additionally, strong articulation skills and a willingness to learn new skills are highly valued in this role. If you are ready to take on this exciting challenge, we look forward to welcoming you to our team at Bangalore's Global Village Tech Park, Mysore Road, Kengeri.
Posted 1 week ago