4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The opportunity You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies. Your Key Responsibilities As a Senior AI/ML Engineer you will: Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build a comprehensive Risk & Controls monitoring mechanism. Create executive reporting, proposals, and marketing material, ensuring the highest quality deliverables. Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform. Skills and Summary of Accountabilities: Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines. Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models, Vector Databases, comprehensive DevOps services and full stack application development. Lead the design and development of bespoke machine learning algorithms to achieve business objectives. Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes. Good understanding of machine learning, NLP, LLM, deep learning, and generative AI techniques. Apply advanced machine learning techniques (like SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting, etc.) to complex problems. Ensure adherence to ethical AI guidelines and data governance policies. Utilize expertise in Prompt Engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform. Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI and analytics-driven improvements in the bank's operations and services. Stay up to date with the latest advancements in AI, analytics, and Prompt Engineering, ensuring EY remains at the forefront of innovation and maintains a competitive edge in the industry. Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results. Intellectual strength and flexibility to resolve complex problems and rationalise these into a workable solution which can then be delivered. Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives. Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions. To qualify for the role, you must have 4+ years of working experience in large-scale AI/ML models and data science. Deep understanding of statistics, AI/ML algorithms, and predictive modeling. Proficiency in AI/ML programming languages like Python, R, SQL. Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras. 
Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and Deep Learning Algorithms). Implement monitoring and logging tools to ensure AI model performance and reliability. Strong programming skills in Python, including libraries for machine learning such as Scikit-learn, Pandas, NumPy, etc. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse, ML, AI, ADF, etc. Should have a good understanding of Git, JIRA, Change / Release management, build/deploy, CI/CD, Azure DevOps & SharePoint. Strong understanding of Large Enterprise Applications like SAP, Oracle ERP & Microsoft Dynamics, etc. Ideally, you'll also have a Bachelor's Degree or above in mathematics, information systems, statistics, computer science, Data Science, or related disciplines. Relevant certifications are considered a plus. Self-driven and creative problem-solver who enjoys the fast-paced world of software development and can perform well in a team. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
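For illustration only (not part of the posting): a minimal scikit-learn sketch of the kind of gradient-boosting workflow the role's "advanced machine learning techniques" and "Scikit-learn, Pandas, NumPy" requirements point to. The dataset is synthetic and all names are hypothetical.

```python
# Minimal sketch of a supervised-learning workflow of the kind the posting
# references (scikit-learn, gradient boosting); the dataset here is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Stand-in for a risk-and-controls dataset (hypothetical features and labels).
X, y = make_classification(n_samples=5_000, n_features=20,
                           weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)

# Evaluate with ROC AUC, a common choice for imbalanced control-breach style labels.
scores = model.predict_proba(X_test)[:, 1]
print(f"Test ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```

The same pattern extends to the other estimators named in the listing (SVMs, random forests, neural networks) by swapping the model class.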
Posted 3 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Gartner IT Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team. About The Role The Sr. Data Engineer will provide technical expertise in designing and building a modern data warehouse in Azure Cloud to meet the data needs of various BUs in Gartner. You will be part of the Ingestion Team, bringing data from multiple sources into the data warehouse. Collaborate with the Dashboard, Analytics & Business teams to build end-to-end scalable data pipelines. What You Will Do Responsible for review and analysis of business requirements and design of technical mapping documents Build new ETL pipelines using Azure Data Factory and Synapse Help define best practices & processes Collaborate on data warehouse architecture and technical design discussions Perform and participate in code reviews, peer inspections and technical design and specifications, as well as document and review detailed designs Provide status reports to higher management Maintain service levels and department goals for problem resolution What You Will Need 4-6 years of experience in data warehouse design & development Experience in ETL using Azure Data Factory (ADF) Experience writing complex T-SQL procedures in Synapse / SQL Data Warehouse Experience analyzing complex code and performance-tuning pipelines Experience crafting, building, and deploying applications in a DevOps environment utilizing CI/CD tools Good knowledge of Azure cloud technology and exposure to Azure cloud components Good understanding of business processes and analyzing underlying data Understanding of dimensional and relational modeling Who You Are Effective time management skills and ability to meet deadlines Excellent communication skills interacting with technical and business audiences Excellent organization, multitasking, and prioritization skills Must possess a willingness and aptitude to embrace new technologies/ideas and master concepts rapidly. Intellectual curiosity, passion for technology and keeping up with new trends Delivering project work on time, within budget, and with high quality Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here. What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. 
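For illustration only: the posting centres on building ETL pipelines with Azure Data Factory and Synapse. Below is a hedged sketch of triggering and monitoring an ADF pipeline run from Python with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, pipeline name, and parameter are placeholders, not Gartner's actual environment.

```python
# Hypothetical subscription and resource names; a sketch, not the team's actual setup.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-dw-ingestion"      # placeholder
FACTORY_NAME = "adf-ingestion"          # placeholder
PIPELINE_NAME = "pl_load_sales_stage"   # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a pipeline run, passing a runtime parameter the pipeline defines.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-06-01"},
)

# Poll until the run finishes, then report the outcome.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    if status.status not in ("InProgress", "Queued"):
        break
    time.sleep(30)
print(f"Pipeline {PIPELINE_NAME} finished with status: {status.status}")
```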
We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 101014 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
Posted 3 weeks ago
0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: Data,Analytics,Snowflake,Sales,Cloud,AWS,Azure
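For illustration only (not part of the posting): a minimal sketch of the Streams-and-Tasks ELT pattern this listing names, issued through the Snowflake Python connector. Account details, warehouse, schemas, tables, and the schedule are hypothetical.

```python
# Illustrative only: object names, credentials, and schedules are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="WH_ELT", database="COMMERCIAL", schema="RAW",
)
cur = conn.cursor()

# Capture inserts/updates on a raw sales table with a stream...
cur.execute("CREATE STREAM IF NOT EXISTS RAW.SALES_STREAM ON TABLE RAW.SALES")

# ...and schedule a task that merges new changes into a curated table.
cur.execute("""
CREATE TASK IF NOT EXISTS RAW.MERGE_SALES
  WAREHOUSE = WH_ELT
  SCHEDULE = '60 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW.SALES_STREAM')
AS
  MERGE INTO CURATED.SALES t
  USING RAW.SALES_STREAM s ON t.SALE_ID = s.SALE_ID
  WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
  WHEN NOT MATCHED THEN INSERT (SALE_ID, AMOUNT) VALUES (s.SALE_ID, s.AMOUNT)
""")
cur.execute("ALTER TASK RAW.MERGE_SALES RESUME")  # tasks are created suspended
conn.close()
```

In practice the MERGE logic would typically live in a stored procedure or dbt model, but the stream-plus-scheduled-task shape is the core of the incremental pipeline the listing describes.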
Posted 3 weeks ago
0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: aws,analytics,sales,sql,data,snowflake,etl/elt optimization,python,data warehousing,azure,data modeling,data governance,cloud
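For illustration only: the governance bullet in this posting calls for fine-grained access controls and masking under HIPAA/GDPR. Below is a hedged sketch of a Snowflake dynamic data masking policy; the role, table, and column names are hypothetical.

```python
# Illustrative sketch of Snowflake dynamic data masking for PII/PHI columns;
# role names, tables, and columns are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="WH_GOV", database="COMMERCIAL", schema="CURATED",
)
cur = conn.cursor()

# Mask patient identifiers for everyone except an approved analyst role.
cur.execute("""
CREATE MASKING POLICY IF NOT EXISTS CURATED.MASK_PATIENT_ID AS (val STRING)
RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PHI_ANALYST') THEN val ELSE '***MASKED***' END
""")

# Attach the policy to the column so masking is enforced at query time.
cur.execute("""
ALTER TABLE CURATED.PATIENT_SERVICES
  MODIFY COLUMN PATIENT_ID SET MASKING POLICY CURATED.MASK_PATIENT_ID
""")
conn.close()
```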
Posted 3 weeks ago
0 years
20 - 25 Lacs
Thane, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: Data,Analytics,Snowflake,Sales,Cloud,AWS,Azure
Posted 3 weeks ago
0 years
20 - 25 Lacs
Mumbai Metropolitan Region
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: Data,Analytics,Snowflake,Sales,Cloud,AWS,Azure
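For illustration only: a hedged sketch of the star-schema modeling this posting's data-modeling bullet describes, expressed as Snowflake DDL executed from Python. The dimension and fact tables, columns, and names are hypothetical, not a client's actual schema.

```python
# Hypothetical star-schema DDL for sales reporting, executed via the Python
# connector; a sketch of the modeling approach only.
import snowflake.connector

ddl = [
    """CREATE TABLE IF NOT EXISTS MART.DIM_PRODUCT (
         PRODUCT_KEY INTEGER IDENTITY PRIMARY KEY,
         NDC_CODE STRING, BRAND_NAME STRING, THERAPY_AREA STRING)""",
    """CREATE TABLE IF NOT EXISTS MART.DIM_HCP (
         HCP_KEY INTEGER IDENTITY PRIMARY KEY,
         NPI STRING, SPECIALTY STRING, TERRITORY STRING)""",
    """CREATE TABLE IF NOT EXISTS MART.FACT_SALES (
         DATE_KEY DATE,
         PRODUCT_KEY INTEGER REFERENCES MART.DIM_PRODUCT (PRODUCT_KEY),
         HCP_KEY INTEGER REFERENCES MART.DIM_HCP (HCP_KEY),
         UNITS NUMBER(18,2), NET_SALES NUMBER(18,2))""",
]

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="WH_ELT", database="COMMERCIAL", schema="MART",
)
with conn.cursor() as cur:
    for stmt in ddl:
        cur.execute(stmt)  # one dimension or fact table per statement
conn.close()
```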
Posted 3 weeks ago
0 years
20 - 25 Lacs
Nashik, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: Data,Analytics,Snowflake,Sales,Cloud,AWS,Azure
Posted 3 weeks ago
0 years
20 - 25 Lacs
Solapur, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: Data,Analytics,Snowflake,Sales,Cloud,AWS,Azure
Posted 3 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Join us as an “Associate” at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as an “Associate”, you should have experience with: About India Corporate Operations About Regulatory Reporting department As part of the regulatory and supervisory functions bestowed on them, the Regulators in India collect various fixed format data (called 'Returns') from commercial banks, financial institutions, authorised dealers and non-banking financial institutions. This department is responsible for timely and accurate filing of Operations Returns to the Regulator, either directly or indirectly. This department is also accountable for preparation and oversight of various exposure reports for local and group Credit risk. Overall purpose of role The purpose of this role is to lead the Regulatory Reporting team in preparation, submission and automation of Corporate & Investment Banking Regulatory returns for Corporate and Investment Bank Operations, as well as exposure reports for the local and group Credit risk team. This role envisages team management, stakeholder management and maintaining a robust control environment. Managing and leading the team in delivering solutions and effective decision making. Liaise with respective Stakeholders (Finance, Credit, Coverage, BIU, Compliance, Legal, Internal & External Auditors, Risk Control Unit, Technology, Vendor partners, etc.) on an ongoing basis to meet Barclays deliverables and internal and external customer requirements. To act as a role model for all our values as well as inspire, motivate the team, drive for results, and communicate powerfully and prolifically. To conduct periodic assessments of the Control environment by analysing existing controls and issues around timeliness, accuracy and completeness of risk information. Identify missing or weak controls, and work with risk reporting teams and other infrastructure teams to improve the control environment. Key Accountabilities Credit Reporting: Management of Operations support activities: Timely follow-up with internal stakeholders for data input and timely escalation. Timely contribution to decks submitted to the bank's Governance forums. Maintain effective and standard operational processes and documentation. Assist in preparing any other documentation as may be required from time to time. Partner with support functions to drive excellence, continuous improvement, and simplification of processes in a timely and professional manner. Regulatory Reporting: Ensure that all returns and reports are delivered on time and accurately, and that SLAs are met, measured, and reported to stakeholders at the agreed frequency. Accountable for preparation and production of 100+ Regulatory Returns like CRILC, RLC, LEF, RAQ, DSB XII, PSL, Non Resident Guarantee and Invocation, CIC Reporting, FTD, GPB, LCR Reporting, DSB Return - I, DEAF Form I and II, DEAF Form III, DEAF Form IV, BAL Statement, R Return, DEAF Form V, FC-TRS form, Quarterly Investment Reconciliation Certificate, Short Sale Reporting, Pvt Placement Data, Basel III Liquidity Return (BLR6), Quarterly Review of Investment, RBS – (Tranche I, IA, IB, IC, ID, IE, IF, IG, IH, II, III, Bank Profile), Half Yearly Review of Investment, LRA2, DICGC Premium, QCCP Exposure Report, Cross currency derivative statement, Past Performance Report, Commodity Hedging and any other return as assigned from time to time. 
Timely issue management. Escalate open and aging issues as per the bank's escalation metrics and follow up for resolution. Timely contribution to decks submitted to the bank's various Governance forums. Ensuring that the regulatory filings are in line with the Regulatory guidelines and Barclays standards and policy. Manage the RBI ADF automation project for the returns owned by Operations. Clearly understanding the Returns automation requirements, interacting with the Stakeholders, and preparing BRDs. User Acceptance Testing from a functional point of view, raising defects, if any, and following up for closure. Collaborating with stakeholders like Credit Risk, Compliance, Finance, Technology teams and vendor partners in the automation cycle. Serve as an in-house subject matter expert in issues arising out of functional areas. Maintain effective and standard operational processes and documentation. Assist in preparing any other documentation as may be required from time to time. Partner with support functions to drive excellence, continuous improvement, and simplification of processes in a timely and professional manner. Contribute to the regulatory reporting compliance framework. Stakeholder management and leadership. Stakeholder Management and Leadership skills are critical components to the successful delivery of many activities required within this role. Stakeholder Management Liaising with Technology on automation of Regulatory returns, preparation of BRDs and defining of logics. Liaising with the Credit Risk and Coverage teams, catering to various data and information requirements. Liaising with the BIU team for obtaining various reports for internal or regulatory requirements. Liaising with the Compliance and Legal teams towards new Regulations and changes in process notes, regulatory submissions, and compliance requirements. Liaising with Corporate & Investment Operation teams. Liaising with RCU for assistance on recording their borrowers' static data in CFMS & Regulatory submissions. Liaising with internal Audit teams for any audit requirements / changes in existing processes. Liaising with external vendors (IT support / Auditors) as and when the requirement arises. Work with the wider risk reporting and risk management teams to ensure controls are fit for purpose, with an agreed schedule to implement missing or weak controls. Leadership: Being proactive and providing a strong sense of ownership, to be demonstrated by the team. Decision making and problem solving. Effective problem-solving skills with a deeper, broader, and clear understanding of key concerns challenging the team, and driving control improvements. Ensure efficiency by highlighting areas that could cause potential risk to the bank and developing solutions to enhance current ongoing processes and controls. Create strong partnerships with the Monitoring team within RCU, Trade Ops, Payments Ops, Investment Bank Ops and other divisions within Operations. Support business areas in deciphering upcoming regulatory & reporting changes and help them implement appropriate controls to meet these requirements. Strong analytical skills to enable good decision making. The incumbent should be able to provide guidance to other team members/colleagues on specific areas of expertise. Demonstrate ability to manage, motivate and develop the team by way of proper planning and execution thereof. Flexibility to adapt to rapidly changing business events; ability to work well under pressure, working accurately with attention to detail, and meeting deadlines. 
Active multi-tasking skills to analyse in detail and react quickly to problems, performance-related issues, coordination with other teams and task prioritization conflicts. Risk and Control Objective Take ownership for managing risk and strengthening controls in relation to the work you do. Skills Skills and Qualifications will include: Basic understanding of Group Policy Guidelines, Credit Risk, Country Grades and Exposure Guidelines. General knowledge and understanding of the Bank’s Products and Services is required to assist with proposed or existing transactions. IT Skills are required to extract and analyse a wide variety of reports. Management & Leadership skills, including people development. Person Specification This position requires an analytics professional specializing in Regulatory reporting and Credit reporting in the financial services industry, especially related to Corporate and Investment banking products and Operations. Sound knowledge of financial accounting concepts and banking applications. Experience working in a Regulatory Reporting and Reconciliation function. Clear understanding of Regulatory reporting guidelines and Change Management principles within a banking environment. Highly motivated, results-oriented, stakeholder-focused with strong people management skills. Good communication skills – should have fluent oral and written English skills. Strong analytical skills and the ability to correlate general ledger, data and reporting impacts across different interfacing applications and data flows. Should be able to visualize, implement and generate improvements in the current process, deliver efficiencies, and strengthen the process framework and controls while making sure that the quality of reporting is immaculate. Ability to analyse and interpret large volumes of data, and to aggregate and analyse data in MS Excel to produce reports. Understand key performance measures and indicators that drive reporting and analytics. Proficient in MS Office. Strong interpersonal, analytical, facilitating, decision-making and organization skills. Proactive, independent, and self-managing; organized, detail-oriented & results-driven. Change and transformation experience will be a plus. Desirable Skills/Preferred Qualifications: Fluent written and spoken English. Eye for detail in Document Vetting and Facility documentation. Customer-centric attitude. Relationship Management Skills. Communication Skills. Personal Organisation. Information Gathering Ability. Problem Solving/Decision Making Skills. Proactive Person with high Integrity. Essential Skills/Basic Qualifications: Experience in Ops support function-related activities like preparation of various regulatory returns, MIS, and system knowledge. MBA/Post-Graduate/Graduate. Desirable Skills/Preferred Qualifications: Knowledge of Barclays business areas, key priorities, and challenges. Banking and Financial sector experience and knowledge of the types of activities that the Ops function does. Job location is Mumbai. Purpose of the role To support business areas with day-to-day processing, reviewing, reporting, trading and issue resolution. Accountabilities Support various business areas with day-to-day initiatives including processing, reviewing, reporting, trading, and issue resolution. Collaboration with teams across the bank to align and integrate operational processes. Identification of areas for improvement and providing recommendations in operational processes. 
Development and implementation of operational procedures and controls to mitigate risks and maintain operational efficiency. Development of reports and presentations on operational performance and communication of findings to internal senior stakeholders. Identification of industry trends and developments to implement best practice in banking operations. Participation in projects and initiatives to improve operational efficiency and effectiveness. Analyst Expectations To meet the needs of stakeholders/customers through specialist advice and support. Perform prescribed activities in a timely manner and to a high standard which will impact both the role itself and surrounding roles. Likely to have responsibility for specific processes within a team. They may lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they manage own workload, take responsibility for the implementation of systems and processes within own work area and participate on projects broader than direct team. Execute work requirements as identified in processes and procedures, collaborating with and impacting on the work of closely related teams. Check work of colleagues within team to meet internal and stakeholder requirements. Provide specialist advice and support pertaining to own work area. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how all teams in area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams. Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative / operational expertise. Make judgements based on practice and previous experience. Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures. Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements. Build relationships with stakeholders/customers to identify and address their needs. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
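For illustration only (not part of the posting): the role stresses aggregating and reconciling large data volumes, currently in MS Excel, for regulatory returns. A minimal pandas sketch of the aggregation and general-ledger reconciliation step behind a large-exposure-style return follows; the file names, column names, and threshold are hypothetical and do not reflect actual RBI return formats or Barclays' process.

```python
# Hypothetical column names and threshold; a sketch of the aggregation /
# reconciliation step behind a large-exposure style return only.
import pandas as pd

exposures = pd.read_csv("exposures.csv")          # e.g. one row per facility
gl_balances = pd.read_csv("gl_balances.csv")      # general-ledger control totals

# Aggregate facility-level exposure to one row per borrower.
per_borrower = (
    exposures.groupby(["borrower_id", "borrower_name"], as_index=False)
    [["funded_amount", "non_funded_amount"]].sum()
)
per_borrower["total_exposure"] = (
    per_borrower["funded_amount"] + per_borrower["non_funded_amount"]
)

# Keep only borrowers above a reporting threshold (illustrative figure).
THRESHOLD = 5_00_00_000  # INR 5 crore, as an example
reportable = per_borrower[per_borrower["total_exposure"] >= THRESHOLD]

# Reconcile the aggregate against the general ledger before filing.
diff = exposures["funded_amount"].sum() - gl_balances["funded_balance"].sum()
print(f"{len(reportable)} reportable borrowers; GL difference: {diff:,.2f}")
```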
Posted 3 weeks ago
4.0 - 7.0 years
3 - 5 Lacs
Pune
Work from Office
Position: SQL Developer
Employment Type: Full Time
Location: Pune, India
Salary: TBC
Work Experience: Applicants for this position should have 4+ years working as a SQL developer.
Project Overview: The project will use a number of Microsoft SQL Server technologies and include development and maintenance of reports, APIs and other integrations with external financial systems. The successful applicant will liaise with other members of the team and will be expected to work on projects where they are the sole developer as well as part of a team on larger projects. The applicant will report to the SQL Development Manager.
Job Description: Ability to understand requirements clearly and communicate technical ideas to both technical stakeholders and business end users. Investigate and resolve issues quickly. Communication with end users. Working closely with other team members to understand business requirements. Complete structure analysis and systematic testing of the data.
Skills: Microsoft SQL Server 2016–2022. T-SQL programming experience (4+ years). Query/stored procedure performance tuning. SQL Server Integration Services (SSIS). SQL Server Reporting Services (SSRS). Experience in database design. Experience with source control. Knowledge of the software engineering life cycle. Previous experience in designing, developing, testing, implementing and supporting software. 3rd level IT qualification. Microsoft certification such as MCSA or MCSE preferable. Knowledge of data technologies such as Snowflake, Airflow, ADF desirable.
Other skills: Ability to work on own initiative and as part of a team. Excellent time management and decision-making skills. Excellent communication skills in English, both written and verbal. Background in the financial industry preferable.
Academic Qualification: Any graduate or post-graduate degree. Any specialisation in IT.
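To illustrate the kind of integration work this role mentions (reports, APIs and other integrations with external financial systems), here is a minimal Python sketch that calls a SQL Server stored procedure via pyodbc. It is an illustrative example only; the server, database, and procedure names are hypothetical placeholders and not part of the posting.

```python
# Minimal sketch: calling a SQL Server stored procedure from Python via pyodbc.
# The server, database, and procedure names below are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-sql-server;DATABASE=FinanceDB;"
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

# A parameterised call keeps the query plan reusable and avoids SQL injection.
cursor.execute("EXEC dbo.usp_GetDailyPositions @AsOfDate = ?", "2024-01-31")
for row in cursor.fetchall():
    print(row)

conn.close()
```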
Posted 3 weeks ago
10.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Designation: Data Architect Location: Pune Experience: 10-15 years Skills Azure Expertise: The architect should have experience in architecting large scale analytics solutions using native services such as Azure Synapse, Data Lake, Data Factory, HDInsight, Databricks, Azure Cognitive Services, Azure ML, Azure Event Hub. Architecture Creation: Assist with creation of a robust, sustainable architecture that supports requirements and provides for expansion with secured access. BFSI Experience: Experience in building/running large data environment for BFSI clients. Collaboration: Work with customers, end users, technical architects, and application designers to define the data requirements and data structure for BI/Analytic solutions. Data Modeling: Designs conceptual and logical models for the data lake, data warehouse, data mart, and semantic layer (data structure, storage, and integration). Lead the database analysis, design, and build effort. Communication: Communicates physical database designs to lead data architect/database administrator. Data Model Evolution: Evolves data models to meet new and changing business requirements. Business Analysis: Work with business analysts to identify and understand requirements and source data systems. Big Data Technologies: Expert in big data technologies on Azure/GCP. ETL Platforms: Experience with ETL platforms like ADF, Glue, Ab Initio, Informatica, Talend, Airflow. Data Visualization: Experience in data visualization tools like Tableau, Power BI, etc. Data Engineering & Management: Experience in a data engineering, metadata management, database modeling and development role. Streaming Data Handling: Strong experience in handling streaming data with Kafka. Data API Understanding: Understanding of Data APIs, Web services. Data Security: Experience in Data security and Data Archiving/Backup, Encryption and define the standard processes for same. DataOps/MLOps: Experience in setting up DataOps and MLOps. Database Design: Ensure that the database designs fulfill the requirements, including data volume, frequency, and long-term BI/Analytics growth requirements. Integration: Work with other architects to ensure that all components work together to meet objectives and performance goals as defined in the requirements. System Performance: Improve system performance by conducting tests, troubleshooting, and integrating new elements. Data Science Coordination: Coordinate with the Data Science Teams to identify future data needs and requirements and creating pipelines for them. Soft Skills: Soft skills such as communication, leading the team, taking ownership and accountability to successful engagement. Quality Management: Participate in quality management reviews. Customer Management: Managing customer expectation and business user interactions. Research and Development: Deliver key research (MVP, POC) with an efficient turn-around time to help make strong product decisions. Mentorship: Demonstrate key understanding and expertise on modern technologies, architecture, and design. Mentor the team to deliver modular, scalable, and high-performance code. Innovation: Be a change agent on key innovation and research to keep the product, team at the cutting edge of technical and product innovation.
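As a concrete illustration of the streaming-data skill mentioned in this role, below is a minimal PySpark Structured Streaming sketch that lands a Kafka topic in a data lake. The broker address, topic, and paths are hypothetical, and it assumes the spark-sql-kafka connector package is available on the cluster; it is a sketch, not a prescribed design.

```python
# Minimal sketch: consuming a Kafka topic with Spark Structured Streaming.
# Broker address, topic name, and output paths are hypothetical; assumes the
# spark-sql-kafka-0-10 connector package is available on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "transactions")
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
)

# Land the raw stream in the data lake; checkpointing makes the job restartable.
query = (
    events.writeStream.format("parquet")
    .option("path", "/mnt/datalake/raw/transactions")
    .option("checkpointLocation", "/mnt/datalake/_checkpoints/transactions")
    .start()
)
query.awaitTermination()
```

Keeping the raw landing step this thin is a common design choice: downstream modelling (star/snowflake schemas, semantic layers) then runs as separate batch or micro-batch jobs over the landed data.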
Posted 3 weeks ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a Data Engineer to design, develop, and maintain data ingestion processes for a data platform built using Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills.
Responsibilities: Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. ETL development experience in the Microsoft data track is required. Work with the business team to translate business requirements into technical requirements. Demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.
Mandatory skill sets:
· Strong proficiency in Azure Databricks, including Spark and Delta Lake.
· Experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL Database.
· Proficiency in data integration and ETL processes and T-SQL.
· Experienced working in Python for data engineering.
· Experienced working with Postgres databases.
· Experienced working with graph databases.
· Experienced in architecture design and data modelling.
Good to have skill sets:
· Unity Catalog / Purview
· Familiarity with Fabric/Snowflake service offerings
· Visualization tool – Power BI
Preferred skill sets: Hands-on knowledge of Python and PySpark, and strong SQL knowledge. ETL and data warehousing experience is a must.
Relevant certifications (any one) are mandatory, e.g., Databricks Data Engineer Associate, Microsoft Certified: Azure Data Engineer Associate, or Azure Solutions Architect.
Years of experience required: 5+ years.
Education qualification: Bachelor's degree in Computer Science, IT, or a related field.
Degrees/Field of Study required: Bachelor Degree.
Required Skills: Data Engineering.
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}.
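To ground the Databricks and Delta Lake skills listed above, here is a minimal PySpark sketch of an upsert (MERGE) into a Delta table. It is an illustrative sketch only: the paths and key column are hypothetical, and it assumes an environment where Delta Lake is available (for example a Databricks runtime or the delta-spark package).

```python
# Minimal sketch: idempotent upsert (MERGE) into a Delta table with PySpark.
# Paths and column names are hypothetical; assumes Delta Lake is available
# (e.g., a Databricks runtime or the delta-spark package).
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# New or changed records landed by the ingestion layer (e.g., an ADF copy activity).
updates = spark.read.parquet("/mnt/landing/customers/2024-01-31/")

target = DeltaTable.forPath(spark, "/mnt/curated/customers")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Using a merge rather than a blind append keeps reloads idempotent, which matters when an orchestrator retries a failed run.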
Posted 3 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Us
CodeVyasa is a mid-sized product engineering company working with top-tier product and solutions companies like McKinsey, Walmart, RazorPay, Swiggy, and others. We are a growing team of 550+ professionals, delivering cutting-edge solutions across Agentic AI, RPA, Full-stack development, Data Engineering, and various other GenAI areas. If you’re passionate about coding, problem-solving, innovation, and leading impactful data projects, we would love to hear from you!
Key Responsibilities
• Design, build, and manage scalable, high-performance data pipelines using Azure Data Factory and PySpark.
• Lead data warehousing, data lake, and lakehouse architecture initiatives to enable advanced analytics and BI solutions.
• Collaborate closely with business stakeholders to translate complex data requirements into effective technical solutions.
• Build and maintain impactful dashboards and reports using Power BI.
• Provide technical leadership, mentorship, and guidance to junior and mid-level engineers across data projects.
• Oversee workflow management, job monitoring, pipeline health, and troubleshooting to ensure seamless data operations.
• Ensure best practices in data governance, quality, security, and performance tuning.
• Manage project timelines, task prioritization, and cross-team collaboration to ensure timely and high-quality delivery.
Must-Have Skills
• 5+ years of cumulative experience in data engineering or similar roles.
• Strong hands-on experience with: Azure Data Factory (ADF); PySpark, Spark SQL, and Python; SQL Server, SSIS, SSRS; Databricks; Power BI.
• Deep understanding of: Data Warehousing, Data Lake, and Data Lakehouse architecture; Reference and Master Data Management, Data Governance, MLOps, and AI/ML solutions.
• Proficient in: ETL/ELT processes, data modeling, performance tuning, and pipeline optimization.
• Strong experience with: workflow documentation, Jira, Confluence, and ServiceNow.
• Excellent skills in: problem-solving, time management, stakeholder communication, task prioritization, and team collaboration.
• Ability to thrive under pressure while maintaining quality and accuracy.
Why Join CodeVyasa?
• Opportunity to work on innovative, high-impact projects alongside a team of top-tier professionals.
• Exposure to cutting-edge technologies and global clients.
• Continuous learning and professional growth opportunities.
• Flexible work environment and supportive company culture.
• Competitive salary, comprehensive benefits, and free healthcare coverage.
📩 You can reach out to me at kumkum@codevyasa.com for more details.
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a Data Engineer to design, develop, and maintain data ingestion processes for a data platform built using Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills.
Responsibilities: Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. ETL development experience in the Microsoft data track is required. Work with the business team to translate business requirements into technical requirements. Demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.
Mandatory skill sets:
· Strong proficiency in Azure Databricks, including Spark and Delta Lake.
· Experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL Database.
· Proficiency in data integration and ETL processes and T-SQL.
· Experienced working in Python for data engineering.
· Experienced working with Postgres databases.
· Experienced working with graph databases.
· Experienced in architecture design and data modelling.
Good to have skill sets:
· Unity Catalog / Purview
· Familiarity with Fabric/Snowflake service offerings
· Visualization tool – Power BI
Preferred skill sets: Hands-on knowledge of Python and PySpark, and strong SQL knowledge. ETL and data warehousing experience is a must.
Relevant certifications (any one) are mandatory, e.g., Databricks Data Engineer Associate, Microsoft Certified: Azure Data Engineer Associate, or Azure Solutions Architect.
Years of experience required: 5+ years.
Education qualification: Bachelor's degree in Computer Science, IT, or a related field.
Degrees/Field of Study required: Bachelor of Engineering.
Required Skills: Data Engineering.
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}.
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
Mulshi, Maharashtra, India
On-site
Area(s) of responsibility
Experience: 12 to 16 Years
Develop and maintain scalable architecture, data warehouse design and data pipelines, and build out new data source integrations to support continuing increases in data volume and complexity.
Assist in designing end-to-end data and analytics solution architecture and perform POCs within Azure.
Drive the design, sizing, estimation and POC activities for Azure data environments and related services for the use cases and solutions.
Review the solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies.
Provide technical guidance, mentoring, code review, and design-level technical best practices.
Cloud Architect with experience in Azure ADF, Databricks, and PySpark; responsible for designing and implementing secure, scalable, and highly available cloud-based solutions and estimation on AWS and Azure Cloud.
Experience in Azure Databricks and ADF, Azure Synapse and PySpark.
Experience with integration of different data sources with Data Warehouse and Data Lake is required.
Experience in creating data warehouses and data lakes for reporting, AI and Machine Learning.
Understanding of data modelling and data architecture concepts.
Able to clearly articulate the pros and cons of various technologies and platforms.
Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage AWS and Azure cloud platforms.
Define and implement cloud governance and best practices.
Identify and implement automation opportunities to increase operational efficiency.
Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderābād
On-site
Job Title: Senior Data Engineer (Full Stack)
Experience Required: 8+ Years
Work Mode: Hybrid
Locations: Hyderabad / Noida / Bangalore / Indore
Must have experience with: PySpark, Databricks, ADF, Big Data, Hadoop, Hive
Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in big data technologies and full-stack data pipeline development. The ideal candidate will have deep expertise in PySpark, Databricks, ADF, and the Hadoop ecosystem. This is a hybrid role requiring on-site presence in one of our listed locations.
Key Responsibilities: Design, build, and manage scalable and reliable data pipelines using PySpark and Databricks. Develop and orchestrate data workflows using Azure Data Factory (ADF). Work with large datasets in distributed environments using Hadoop, Hive, and other Big Data tools. Optimize data solutions for performance, scalability, and reliability. Collaborate with cross-functional teams including Data Scientists, Analysts, and Software Engineers. Ensure data quality and integrity across various stages of the data lifecycle. Participate in code reviews, troubleshooting, and performance tuning.
Required Skills & Qualifications: Minimum 8 years of experience in Data Engineering. Strong programming skills in PySpark and working experience with Databricks. Hands-on experience with ADF for pipeline orchestration. Proficiency with Hadoop, Hive, and other big data tools. Experience working in hybrid or distributed teams. Solid understanding of data architecture, ETL processes, and performance tuning. Excellent problem-solving and communication skills.
Good to Have: Azure cloud experience. Familiarity with DevOps practices and CI/CD for data pipelines. Experience with Delta Lake or similar data lake architectures.
Interested candidates may share their resume at humanresource[dot]professional693[at]gmail[dot]com
Job Type: Contract. Contract length: 6 months.
Application Question(s): What is your notice period in days?
Experience: PySpark: 8 years (Required); Databricks: 8 years (Required); ADF: 8 years (Required); Big Data: 8 years (Required); Hadoop: 8 years (Required); Apache Hive: 8 years (Required)
Work Location: In person
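For context on the Hadoop/Hive side of this role, the following is a minimal PySpark sketch of reading a Hive table and writing a partitioned aggregate back to the metastore. It is illustrative only: the database and table names are hypothetical, and it assumes a cluster configured with a Hive metastore.

```python
# Minimal sketch: reading a Hive table with PySpark and writing a partitioned
# aggregate back. Database/table names are hypothetical; assumes the cluster
# is configured with a Hive metastore.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("hive-aggregation-demo")
    .enableHiveSupport()
    .getOrCreate()
)

orders = spark.table("sales_db.orders")

daily_totals = (
    orders.groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Partitioning by date keeps downstream reads selective on large volumes.
(
    daily_totals.write.mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics_db.daily_order_totals")
)
```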
Posted 3 weeks ago
7.0 - 9.0 years
0 Lacs
Hyderābād
On-site
Category: Software Development/Engineering
Main location: India, Andhra Pradesh, Hyderabad
Position ID: J0625-0925
Employment Type: Full Time
Position Description: Azure Databricks developer with 7–9 years of experience. We are seeking a skilled Azure Databricks Developer to design, develop, and optimize big data pipelines using Databricks on Azure. The ideal candidate will have strong expertise in PySpark, Azure Data Lake, and data engineering best practices in a cloud environment.
Key Responsibilities: Design and implement ETL/ELT pipelines using Azure Databricks and PySpark. Work with structured and unstructured data from diverse sources (e.g., ADLS Gen2, SQL DBs, APIs). Optimize Spark jobs for performance and cost-efficiency. Collaborate with data analysts, architects, and business stakeholders to understand data needs. Develop reusable code components and automate workflows using Azure Data Factory (ADF). Implement data quality checks, logging, and monitoring. Participate in code reviews and adhere to software engineering best practices.
Required Skills & Qualifications: 3-5 years of experience in Apache Spark / PySpark. 3-5 years working with Azure Databricks and Azure Data Services (ADLS Gen2, ADF, Synapse). Strong understanding of data warehousing, ETL, and data lake architectures. Proficiency in Python and SQL. Experience with Git, CI/CD tools, and version control practices.
Skills: ETL, SQL
What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
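As an illustration of the pipeline work this role describes (reading from ADLS Gen2 and keeping Spark jobs cost-efficient), here is a minimal PySpark sketch. The storage account, container, and column names are hypothetical, and it assumes cluster access to the storage account is already configured; it is a sketch, not the posting's prescribed design.

```python
# Minimal sketch: reading Parquet from ADLS Gen2 in Databricks and trimming the
# work Spark has to do. Storage account, container, and paths are hypothetical;
# assumes cluster credentials for the storage account are already configured.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

source = "abfss://raw@examplestorageacct.dfs.core.windows.net/trades/"

trades = (
    spark.read.parquet(source)
    # Column pruning and an early filter reduce the data Spark has to shuffle.
    .select("trade_id", "trade_date", "symbol", "quantity", "price")
    .where(col("trade_date") >= "2024-01-01")
)

# Right-size the output layout instead of emitting thousands of tiny files.
(
    trades.repartition("trade_date")
    .write.mode("overwrite")
    .partitionBy("trade_date")
    .parquet("abfss://curated@examplestorageacct.dfs.core.windows.net/trades/")
)
```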
Posted 3 weeks ago
0 years
0 Lacs
India
On-site
Primary Skills: ETL, C, Azure Cloud, Python, API, Azure Functions, ADF, Spark, Scala, Azure Databricks, Snowflake, SQL Server. Secondary Skills: C#. Job Type: Full-time. Schedule: Day shift. Work Location: In person.
Posted 3 weeks ago
15.0 years
3 - 9 Lacs
Indore
On-site
Date: Jun 23, 2025
Job Requisition Id: 59383
Location: Hyderabad, TG, IN; Indore, MP, IN, 452001
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Project Management Professionals in the following areas:
Technical skills
Should have 15+ years of working experience handling end-to-end DWH projects. Experience handling ETL Migration/Visualization projects, including technologies like AWS Glue/Redshift, Power BI/Tableau, and Azure ADF/Databricks. Lead technical design and architecture discussions across cross-functional teams. Oversee software requirements (including design, architecture, and testing). Manage through agile methodologies, such as Scrum. Decipher technical needs of other departments within the organization and translate them across stakeholder groups.
Leadership skills
Act as a communications liaison between technical and non-technical audiences. Develop and maintain productive internal relationships. Facilitate cross-collaboration and understanding between IT and other departments. Generate targeted reports for different internal and/or external audiences. Stay current on the latest news, information, and trends about program management and the organization’s industry.
Business responsibilities
Organize and track jobs, clarify project scopes, proactively manage risks, deal with project escalations, ruthlessly prioritize tasks and dependencies, and problem solve. Meet specific business objectives and metrics. Support the roadmap planning process. Develop strategies and implement tactics to follow through on those strategies. Solve complex business problems within allocated timelines and budget. Represent company management to technical teams and vice versa.
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
Posted 3 weeks ago
10.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Job Summary: We are seeking an experienced and hands-on Data Lead with deep expertise in Microsoft Azure Data Analytics ecosystem. The ideal candidate will lead the design, development, and implementation of scalable data pipelines and analytics solutions using Azure Data Factory (ADF), Synapse Analytics, Microsoft Fabric, Apache Spark, and modern data modeling techniques. A strong grasp of CDC mechanisms, performance tuning, and cloud-native architecture is essential. Key Responsibilities: Lead the architecture and implementation of scalable data integration and analytics solutions in Azure. Design and build end-to-end data pipelines using ADF, Azure Synapse Analytics, Azure Data Lake, and Microsoft Fabric. Implement and manage large-scale data processing using Apache Spark within Synapse or Fabric. Develop and maintain data models using Star and Snowflake schema for optimal reporting and analytics performance. Implement Change Data Capture (CDC) strategies to ensure near real-time or incremental data processing. Collaborate with stakeholders to translate business requirements into technical data solutions. Manage and mentor a team of data engineers and analysts. Monitor, troubleshoot, and optimize performance of data workflows and queries. Ensure best practices in data governance, security, lineage, and documentation. Stay updated with the latest developments in the Azure data ecosystem and recommend enhancements. Required Skills and Qualifications: 8–10 years of overall experience in data engineering and analytics, with at least 3+ years in a data lead role. Strong expertise in Azure Data Factory, Azure Synapse, and Microsoft Fabric. Hands-on experience with Apache Spark for large-scale data processing. Proficient in SQL, Python, or PySpark for data transformation and automation. Solid experience with CDC patterns (e.g., using ADF, or SQL-based approaches). In-depth understanding of data warehousing concepts and data modeling (Star, Snowflake). Knowledge of Power BI integration with Synapse/Fabric is a plus. Familiarity with DevOps for data pipelines, version control (Git), and CI/CD for data solutions. Strong problem-solving skills and ability to lead architecture discussions and POCs. Excellent communication and stakeholder management skills. Preferred Qualifications: Microsoft Certifications in Azure Data Engineering or Analytics. Experience with Delta Lake, Databricks, or Snowflake (as source/target). Knowledge of data privacy and compliance standards like GDPR, HIPAA. What We Offer: Opportunity to lead strategic data initiatives on the latest Azure stack. A dynamic and collaborative work environment. Access to continuous learning, certifications, and upskilling programs. Competitive compensation and benefits package.
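The CDC requirement above is commonly met with a watermark-based incremental load. The sketch below shows one illustrative PySpark approach, not a prescribed implementation: the control table, source and target table names, and columns are all hypothetical, and in practice the watermark is often tracked by ADF or a dedicated metadata store.

```python
# Minimal sketch: watermark-based incremental (CDC-style) load in PySpark.
# Table names, the watermark control table, and column names are hypothetical;
# a real pipeline would typically persist the watermark via ADF or a metadata store.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# 1. Read the last successfully processed high-water mark for this table.
last_watermark = (
    spark.table("control.load_watermarks")
    .where(F.col("table_name") == "orders")
    .agg(F.max("last_modified").alias("wm"))
    .collect()[0]["wm"]
)

# 2. Pull only rows changed since that watermark from the staged extract.
changed = spark.table("staging.orders").where(F.col("last_modified") > last_watermark)

# 3. Apply the changes downstream and record the new watermark for the next run.
changed.write.mode("append").saveAsTable("curated.orders_changes")

new_wm = changed.agg(F.max("last_modified")).collect()[0][0]
if new_wm is not None:
    spark.createDataFrame(
        [("orders", new_wm)], ["table_name", "last_modified"]
    ).write.mode("append").saveAsTable("control.load_watermarks")
```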
Posted 3 weeks ago
5.0 years
0 Lacs
Vishakhapatnam, Andhra Pradesh, India
On-site
Position: Azure Data Engineer
Experience: 5+ Years
Shift Timings: Should be flexible for UK timings
Skills Required: Azure Synapse Analytics, Azure Data Factory (ADF) and Databricks
Work Location: Rushikonda, Visakhapatnam
Key Responsibilities: Design, build, and maintain efficient and scalable data pipelines using Azure Synapse, ADF, and Databricks. Implement data modeling and transformation logic using DBT (Data Build Tool) to meet reporting and analytics needs. Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions. Optimize data workflows for performance and cost-efficiency within the Azure ecosystem. Monitor and troubleshoot data pipelines to ensure accuracy, completeness, and timeliness of data. Maintain data quality, documentation, and governance standards. Participate in code reviews, best practices, and performance tuning of complex SQL and PySpark workflows. Automate workflows and support CI/CD processes for data engineering deployments. Develop Power BI reports and dashboards. Develop scalable data models for Power BI reporting.
Required Skills and Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with Azure Synapse Analytics and Azure Data Factory (ADF). Proven experience with Databricks, including development in PySpark or Scala. Proficiency in DBT for data modeling and transformation. Power BI expert in analytics and reporting who can develop Power BI models, build interactive BI reports, and set up RLS (row-level security) in Power BI reports. Expertise in SQL and performance tuning techniques. Solid understanding of data warehousing concepts and ETL/ELT design patterns. Experience working in Agile environments and familiarity with Git-based version control. Strong communication and collaboration skills.
Preferred Qualifications: Experience with CI/CD tools and DevOps for data engineering. Familiarity with Delta Lake and Lakehouse architecture. Exposure to other Azure services such as Azure Data Lake Storage (ADLS), Azure Key Vault, and Azure DevOps. Experience with data quality frameworks or tools.
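The DBT transformation step called out in this role is usually wired into a CI/CD pipeline or an orchestrator rather than run by hand. The following is a minimal, illustrative Python sketch of such an invocation; the project directory and model selector are hypothetical and not taken from the posting.

```python
# Minimal sketch: invoking dbt transformations from an orchestration script,
# e.g. as a step in a CI/CD job or an orchestrator-triggered task.
# The project directory and model selector are hypothetical placeholders.
import subprocess

def run_dbt(project_dir: str, select: str) -> None:
    """Run a dbt build for the selected models and fail loudly on errors."""
    result = subprocess.run(
        ["dbt", "run", "--select", select, "--project-dir", project_dir],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"dbt run failed:\n{result.stderr}")

if __name__ == "__main__":
    run_dbt("/opt/pipelines/analytics_dbt", "marts.sales")
```

Wrapping the CLI call this way keeps the orchestration layer thin: the transformation logic stays in the dbt project, and the pipeline only decides when and which models to run.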
Posted 3 weeks ago
6091 Jobs | Paris,France