1.0 - 4.0 years
9 - 13 Lacs
Pune
Work from Office
Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market and other critical datapoints to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100+ TB of data and is expected to run 24x7. With an increased focus on automation around systems development and operations, Data Science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation, and is committed to providing self-serve tools to our internal customers.

Responsibilities
- Implement & Maintain Data Catalogs: Deploy and manage the data catalog tool Collibra to improve data discoverability and governance.
- Metadata & Lineage Management: Automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
- Enable Data Governance: Collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support Self-Service & Adoption: Promote catalog usage across teams through training, documentation, and continuous support.
- Cross-Team Collaboration: Work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling & Automation: Build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health; leverage AI tools to automate cataloging activities (a metadata-ingestion sketch follows this posting).
- Reporting & Documentation: Maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.

Qualifications
- Self-motivated, collaborative individual with a passion for excellence
- B.E. in Computer Science or equivalent with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Tools: Experience with data catalog platforms (e.g., Collibra, Alation, DataHub)
- Metadata & Lineage: Understanding of metadata management and data lineage
- Scripting: Proficient in SQL and Python for automation and integration
- APIs & Integration: Ability to connect catalog tools with data sources using APIs
- Cloud Knowledge: Familiar with cloud data services (Azure, GCP)
- Data Governance: Basic knowledge of data stewardship, classification, and compliance
- Collaboration: Strong communication skills to work across data and business teams

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
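As an illustration of the metadata-ingestion automation described in the Tooling & Automation responsibility above, here is a minimal Python sketch. It pulls table metadata from Snowflake's INFORMATION_SCHEMA and pushes it to a generic catalog REST endpoint; the endpoint, token and connection values are placeholder assumptions (Collibra's actual Import API differs), not part of the posting.

```python
import requests
import snowflake.connector

CATALOG_URL = "https://catalog.example.com/api/assets"  # hypothetical endpoint
CATALOG_TOKEN = "..."                                   # placeholder credential

# Pull basic table metadata from Snowflake's information schema.
conn = snowflake.connector.connect(
    account="my_account", user="svc_catalog", password="...",
    warehouse="WH_XS", database="ANALYTICS",
)
cur = conn.cursor()
cur.execute("""
    SELECT table_schema, table_name, row_count, comment
    FROM information_schema.tables
    WHERE table_type = 'BASE TABLE'
""")

# Register each table in the catalog via its REST API.
for schema, table, rows, comment in cur.fetchall():
    payload = {
        "name": f"{schema}.{table}",
        "type": "Table",
        "attributes": {"rowCount": rows, "description": comment or ""},
    }
    resp = requests.post(
        CATALOG_URL,
        json=payload,
        headers={"Authorization": f"Bearer {CATALOG_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()

cur.close()
conn.close()
```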
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers and the client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Snowflake. Experience: 5-8 Years.
Posted 1 week ago
4.0 - 9.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Must-have technical skills:
- 4+ years on Snowflake: advanced SQL expertise
- 4+ years of data warehouse experience: hands-on knowledge of the methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schema, normalization/denormalization, dimensions, aggregations etc.
- 4+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning etc.
- 3+ years on Python: advanced Python expertise
- 3+ years on any cloud platform (AWS preferred): hands-on experience with AWS Lambda, S3, SNS/SQS, EC2 is the bare minimum (a Lambda-to-Snowflake sketch follows this posting)
- 3+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, dbt etc.
- 3+ years developing functional metrics in a specific business vertical (finance, retail, telecom etc.)

Must-have soft skills:
- Clear written and verbal communication, especially around time off, delays in delivery etc.
- Team player: works in the team and works with the team
- Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation etc.

Nice to have:
- Technical certifications from AWS, Microsoft, Azure, GCP or any other recognized software vendor
- 4+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, dbt etc.
- 4+ years developing functional metrics in a specific business vertical (finance, retail, telecom etc.)
- 4+ years of team lead experience
- 3+ years in a large-scale support organization supporting thousands of users
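Since the posting names AWS Lambda, S3 and Snowflake together, here is a minimal, hedged sketch of the event-driven load such a stack implies: a Lambda handler that reacts to an S3 upload and COPYs the file into Snowflake. The table, stage and environment-variable names are illustrative assumptions, not part of the posting.

```python
import os
import snowflake.connector

def handler(event, context):
    """AWS Lambda entry point for S3 ObjectCreated events."""
    # Pull the object key from the standard S3 event shape.
    record = event["Records"][0]["s3"]
    key = record["object"]["key"]

    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # raw_stage is an assumed external stage pointing at the source bucket.
        cur.execute(
            f"COPY INTO raw.orders FROM @raw_stage/{key} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        return {"loaded": key}
    finally:
        conn.close()
```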
Posted 1 week ago
8.0 - 12.0 years
15 - 30 Lacs
Gurugram
Work from Office
Role description
- Lead and mentor a team of data engineers to design, develop, and maintain high-performance data pipelines and platforms.
- Architect scalable ETL/ELT processes, streaming pipelines, and data lake/warehouse solutions (e.g., Redshift, Snowflake, BigQuery).
- Own the roadmap and technical vision for the data engineering function, ensuring best practices in data modeling, governance, quality, and security.
- Drive adoption of modern data stack tools (e.g., Airflow, Kafka, Spark) and foster a culture of continuous improvement (a minimal Airflow sketch follows this posting).
- Ensure the platform is reliable, scalable, and cost-effective across batch and real-time use cases.
- Champion data observability, lineage, and privacy initiatives to ensure trust in data across the org.

Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 8+ years of hands-on experience in data engineering with at least 2 years in a leadership or managerial role.
- Proven experience with distributed data processing frameworks such as Apache Spark, Flink, or Kafka.
- Strong SQL skills and experience in data modeling, data warehousing, and schema design.
- Proficiency with cloud platforms (AWS/GCP/Azure) and their native data services (e.g., AWS Glue, Redshift, EMR, BigQuery).
- Solid grasp of data architecture, system design, and performance optimization at scale.
- Experience working in an agile development environment and managing sprint-based delivery cycles.
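For readers unfamiliar with the "modern data stack tools" the posting lists, a minimal Airflow DAG sketch is shown below; the DAG id, task names and callables are illustrative assumptions, not part of the posting.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative callables -- a real pipeline would import these from a package.
def extract():
    print("pull raw files from source systems")

def transform():
    print("clean and model the raw data")

def load():
    print("load curated tables into the warehouse")

with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```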
Posted 1 week ago
8.0 - 10.0 years
25 - 30 Lacs
Gurugram
Work from Office
Role description
- Lead and mentor a team of data engineers to design, develop, and maintain high-performance data pipelines and platforms.
- Architect scalable ETL/ELT processes, streaming pipelines, and data lake/warehouse solutions (e.g., Redshift, Snowflake, BigQuery).
- Own the roadmap and technical vision for the data engineering function, ensuring best practices in data modeling, governance, quality, and security.
- Drive adoption of modern data stack tools (e.g., Airflow, Kafka, Spark) and foster a culture of continuous improvement.
- Ensure the platform is reliable, scalable, and cost-effective across batch and real-time use cases.
- Champion data observability, lineage, and privacy initiatives to ensure trust in data across the org.

Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 8+ years of hands-on experience in data engineering with at least 2 years in a leadership or managerial role.
- Proven experience with distributed data processing frameworks such as Apache Spark, Flink, or Kafka.
- Strong SQL skills and experience in data modeling, data warehousing, and schema design.
- Proficiency with cloud platforms (AWS/GCP/Azure) and their native data services (e.g., AWS Glue, Redshift, EMR, BigQuery).
- Solid grasp of data architecture, system design, and performance optimization at scale.
- Experience working in an agile development environment and managing sprint-based delivery cycles.
Posted 1 week ago
10.0 - 15.0 years
17 - 22 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics & Insights (GAI) team is seeking a Data & Analytics Engineering Manager to lead our team in designing, developing, and maintaining data pipelines and analytics infrastructure. As a Data & Analytics Engineering Manager, you will play a pivotal role in empowering a team of engineers to build and enhance analytics applications and a modern data platform using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will become an expert in Avalara's financial, marketing, sales, and operations data. The ideal candidate will have deep SQL experience, an understanding of modern data stacks and technology, demonstrated leadership and mentoring experience, and an ability to drive innovation and manage complex projects. This position will report to a Senior Manager.

What Your Responsibilities Will Be
- Mentor a team of data engineers, providing guidance and support to ensure a high level of quality and career growth
- Lead a team of data engineers in the development and maintenance of data pipelines, data modelling, code reviews and data products
- Collaborate with cross-functional teams to understand requirements and translate them into scalable data solutions
- Drive innovation and continuous improvements within the data engineering team
- Build maintainable and scalable processes and playbooks to ensure consistent delivery and quality across projects
- Drive adoption of best practices in data engineering and data modelling
- Be the visible lead of the team: coordinate communication, releases, and status to various stakeholders

What You'll Need to be Successful
- Bachelor's degree in Computer Science, Engineering, or a related field
- 10+ years of experience in the data engineering field, with deep SQL knowledge
- 2+ years of management experience, including direct technical reports
- 5+ years of experience with data warehousing concepts and technologies
- 4+ years working with Git, with demonstrated experience using these tools to facilitate growth of engineers
- 4+ years working with Snowflake
- 3+ years working with dbt (dbt Core preferred)

Preferred Qualifications:
- Snowflake, dbt, AWS certified
- 3+ years working with Infrastructure as Code, preferably Terraform
- 2+ years working with CI/CD, with demonstrated ability to build and operate pipelines
- Experience and understanding of Snowflake administration and security principles
- Demonstrated experience with Airflow
Posted 1 week ago
6.0 - 11.0 years
8 - 12 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics & Insights (GAI) team is looking for a Senior Data Engineer to lead our build of the data infrastructure for Avalara's core data assets, empowering us with accurate data to make data-backed decisions. As a Senior Data Engineer, you will help architect, implement, and maintain our data infrastructure using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will immerse yourself in our financial, marketing, and sales data to become an expert in Avalara's domain. You will have deep SQL experience, an understanding of modern data stacks and technology, a desire to build things the right way using modern software principles, and experience with data and all things data-related.

What Your Responsibilities Will Be
- Architect repeatable, reusable solutions to keep our technology stack DRY
- Conduct technical and architecture reviews with engineers, ensuring all contributions meet quality expectations
- Develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools (a dbt orchestration sketch follows this posting)
- Implement and maintain scalable data orchestration and transformation, ensuring data accuracy and consistency
- Collaborate with cross-functional teams to understand complex requirements and translate them into technical solutions
- Build scalable, complex dbt models
- Demonstrate ownership of complex projects and calculations of core financial metrics and processes
- Work with Data Engineering teams to define and maintain scalable data pipelines
- Promote automation and optimization of reporting processes to improve efficiency
- You will report to a Senior Manager

What You'll Need to be Successful
- Bachelor's degree in Computer Science, Engineering, or a related field
- 6+ years of experience in the data engineering field, with advanced SQL knowledge
- 4+ years working with Git, with demonstrated experience collaborating with other engineers across repositories
- 4+ years working with Snowflake
- 3+ years working with dbt (dbt Core)
- 3+ years working with Infrastructure as Code (Terraform)
- 3+ years working with CI/CD, with demonstrated ability to build and operate pipelines
- AWS certified
- Terraform certified
- Experience working with complex Salesforce data
- Snowflake, dbt certified
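The posting centers on dbt-driven pipelines. As a minimal sketch of how such runs are often orchestrated from Python, the snippet below shells out to the dbt Core CLI; the model selector finance_metrics and the target name are assumptions, not part of the posting.

```python
import subprocess
import sys

def run_dbt(*args: str) -> None:
    """Invoke the dbt Core CLI and fail loudly on errors."""
    result = subprocess.run(["dbt", *args], capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        raise RuntimeError(f"dbt {' '.join(args)} failed")

# Build the assumed finance models, then run their tests.
run_dbt("run", "--select", "finance_metrics", "--target", "prod")
run_dbt("test", "--select", "finance_metrics", "--target", "prod")
```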
Posted 1 week ago
8.0 - 13.0 years
10 - 15 Lacs
Pune
Work from Office
The Data Science Engineering team is looking for a Lead Data Analytics Engineer to join our team! You should be able to gather our requirements, understand complex product, business, and engineering challenges, compose and prioritize research projects, and then build them in partnership with cloud engineers and architects, using the work of our data engineering team. You have deep SQL experience, an understanding of modern data stacks and technology, experience with data and all things data-related, and experience guiding a team through technical and design challenges. You will report into the Sr. Manager, Cloud Software Engineering and be part of the larger Data Engineering team.

What Your Responsibilities Will Be
- Avalara is looking for a data analytics engineer who can solve and scale real-world big data challenges.
- Bring end-to-end analytics experience and the ability to tell a complex data story with data models and reliable, applicable metrics.
- Build and deploy data science models using complex SQL, Python, dbt data modelling and reusable visualization components (Power BI).
- Expert-level experience in Power BI, SQL and Snowflake.
- Solve needs on a large scale by applying your software engineering and complex-data skills.
- Lead and help develop a roadmap for the area and the team.
- Analyze fault tolerance and high availability issues, performance and scale challenges, and solve them.
- Lead programs and collaborate with engineers, product managers, and technical program managers across teams.
- Understand the trade-offs between consistency, durability, and costs to build solutions that can meet the demands of growing services.
- Ensure the operational readiness of the services and meet the commitments to our customers regarding availability and performance.
- Manage end-to-end project plans and ensure on-time delivery.
- Communicate the status and big picture to the project team and management.
- Work with business and engineering teams to identify scope, constraints, dependencies, and risks.
- Identify risks and opportunities across the business and guide solutions.

What You'll Need to be Successful
- Bachelor's degree in Computer Science or a related engineering field.
- 8+ years of enterprise-class experience with large-scale cloud solutions in data science/analytics projects and engineering projects.
- Expert-level experience in Power BI, SQL and Snowflake.
- Experience with data visualization, Python, data modeling and data storytelling.
- Experience architecting complex data marts applying dbt.
- Architect and build data solutions that use data quality and anomaly detection best practices (a minimal sketch follows this posting).
- Experience building production analytics using the Snowflake data platform.
- Experience with AWS and Snowflake tools and services.

Good to have:
- A Snowflake certificate is a plus.
- Relevant certifications in data warehousing or cloud platforms.
- Experience architecting complex data marts applying dbt and Airflow.
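Because the posting calls out data quality and anomaly detection practices, here is a minimal rolling z-score sketch in pandas; the metric name, window and threshold are illustrative assumptions.

```python
import pandas as pd

def flag_anomalies(df: pd.DataFrame, metric: str = "daily_revenue",
                   window: int = 28, z_threshold: float = 3.0) -> pd.DataFrame:
    """Flag rows where the metric deviates more than z_threshold
    standard deviations from its trailing-window mean."""
    rolling = df[metric].rolling(window, min_periods=window)
    zscore = (df[metric] - rolling.mean()) / rolling.std()
    return df.assign(zscore=zscore, is_anomaly=zscore.abs() > z_threshold)

# Example usage with synthetic data: a flat series with one spike at the end.
data = pd.DataFrame({"daily_revenue": [100.0] * 40 + [400.0]})
print(flag_anomalies(data).tail(3))  # the final row is flagged as an anomaly
```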
Posted 1 week ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Hybrid
Job Title: Snowflake Developer
Experience: 6-8 Years
Location: Chennai - Hybrid

Job Description:
- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design.
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH); a masking-policy sketch follows this posting.
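Since the posting pairs Snowflake with HIPAA/HITECH compliance, here is a minimal sketch of a Snowflake dynamic data masking policy applied from Python; the role, table, column and connection names are illustrative assumptions.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin", password="...",
    database="HEALTHCARE", schema="CLAIMS", warehouse="WH_XS",
)
cur = conn.cursor()

# Mask member SSNs for everyone except an assumed PHI-cleared role.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS mask_ssn AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PHI_ANALYST' THEN val
           ELSE 'XXX-XX-XXXX'
      END
""")

# Attach the policy to the assumed members.ssn column.
cur.execute("ALTER TABLE members MODIFY COLUMN ssn SET MASKING POLICY mask_ssn")

conn.close()
```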
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We assist organizations in streamlining their operations, cutting costs, and enhancing efficiency by managing key processes and functions on their behalf. Our expertise in project management, technology, and process optimization enables us to deliver top-notch services to our clients. If you join our managed service management and strategy team at PwC, your focus will involve transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your responsibilities will include working on continuous improvement and optimizing managed services processes, tools, and services.

As a Specialist in the Data Analytics & Insights Managed Service tower, you will work alongside a team of problem solvers to address complex business issues from strategy to execution using your skills in Data, Analytics, and Insights. Your role at this management level will require you to use feedback and reflection to enhance self-awareness, demonstrate critical thinking, and maintain high standards of quality in your work. You will also be responsible for reviewing ticket quality and deliverables, status reporting for projects, adherence to SLAs, incident management, change management, problem management, and more.

To excel in this position, you should possess primary skills in ETL/ELT, SQL, Informatica, and Python, along with secondary skills in Azure/AWS/GCP, Talend, DataStage, etc. As a Data Engineer, you must have a minimum of 1 year of experience in Operate/Managed Services/Production Support. Your role will involve designing and implementing data pipelines, building ETL/ELT processes, monitoring data pipelines, ensuring data security and privacy, and optimizing schema and performance tuning. Additionally, you should have experience with ITIL processes, strong communication skills, problem-solving abilities, and analytical skills. Certifications in Cloud Technology and experience with visualization tools like Power BI, Tableau, Qlik, etc., are considered nice-to-have qualifications for this role.

Our Managed Services in Data, Analytics & Insights focus on delivering integrated solutions that add value to our clients through technology and human-enabled experiences. By joining our team, you will be part of a group dedicated to empowering clients to optimize operations, accelerate outcomes, and drive transformational journeys. We prioritize a consultative approach to operations, leveraging industry insights and world-class talent to achieve sustained client outcomes. Our goal is to provide clients with flexible access to business and technology capabilities that align with the demands of the dynamic business environment.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
As a BI Developer at NiCE, you will play a crucial role in developing reports for a multi-region, multi-tenant SaaS product. Collaborating with the core R&D team, you will be responsible for creating high-performance reports that cater to the use cases of various applications within the suite.

Your impact will be significant as you take ownership of the software development lifecycle, encompassing design, development, unit testing, and deployment. You will work closely with QA teams to ensure the consistent implementation of architectural concepts across the product. Additionally, you will act as a product expert within R&D, understanding the product's requirements and market positioning. Collaboration with cross-functional teams such as Product Managers, Sales, Customer Support, and Services will be essential to ensure successful product delivery.

Key responsibilities include designing and building reports based on given requirements, creating design documents and test cases, developing SQL to address ad-hoc report requirements, conducting analyses, and creating visualizations and reports as per specifications. You will also be involved in executing unit testing, functional and performance testing, documenting results, conducting peer reviews, and ensuring quality standards are met throughout all stages.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Electronic Engineering, or an equivalent discipline from a reputable institute. With 2-4 years of BI report development experience, you should possess expertise in SQL and cloud-based databases, along with proficiency in BI tools such as Tableau, Power BI, or MicroStrategy. Experience with enterprise data warehouse/data lake systems, analytical databases, and ETL frameworks is essential. Familiarity with Snowflake, database management systems, OLAP, and working in Agile environments is highly desirable.

Joining NiCE offers you the opportunity to be part of an ever-growing, market-disrupting global company where the best talents collaborate in a fast-paced, innovative environment. As a NiCE team member, you will have access to endless internal career opportunities across various roles, disciplines, domains, and locations. If you are passionate, innovative, and eager to push boundaries, NiCE might just be the perfect fit for you.

At NiCE, we operate according to the NiCE-FLEX hybrid model, allowing maximum flexibility: 2 days working from the office and 3 days of remote work each week. Office days focus on face-to-face meetings that foster teamwork, collaboration, innovation, and a vibrant atmosphere. Reporting directly to the Tech Manager, this role is classified as an Individual Contributor position at NiCE.

NiCE Ltd. (NASDAQ: NICE) is a renowned provider of software products used by over 25,000 global businesses, including 85 Fortune 100 corporations. With a focus on delivering exceptional customer experiences, combatting financial crime, and ensuring public safety, NiCE software manages more than 120 million customer interactions and monitors over 3 billion financial transactions daily. Recognized for innovation in AI, cloud, and digital solutions, NiCE employs over 8,500 professionals across 30+ countries.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

In this role, you should have developed or worked on at least one Gen AI project and have experience in data pipeline implementation with cloud providers such as AWS, Azure, or GCP. You should also be familiar with cloud storage, cloud databases, cloud data warehousing, and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, and S3. Additionally, a good understanding of cloud compute services, load balancing, identity management, authentication, and authorization in the cloud is essential.

Your profile should include good knowledge of infrastructure capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling. You should be able to contribute to making architectural choices using various cloud services and solution methodologies. Proficiency in programming using Python is required, along with expertise in cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on the cloud. Understanding networking, security, design principles, and best practices in the cloud is also important.

At Capgemini, we value flexible work arrangements to provide support for maintaining a healthy work-life balance. You will have opportunities for career growth through various career growth programs and diverse professions tailored to support you in exploring a world of opportunities. Additionally, you can equip yourself with valuable certifications in the latest technologies such as Generative AI.

Capgemini is a global business and technology transformation partner with a rich heritage of over 55 years. We have a diverse team of 340,000 members in more than 50 countries, working together to accelerate the dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. Trusted by clients to unlock the value of technology, we deliver end-to-end services and solutions leveraging strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and a partner ecosystem. Our global revenues in 2023 were reported at €22.5 billion.
Posted 1 week ago
8.0 - 12.0 years
0 - 0 Lacs
bangalore
On-site
Role: Data Engineer
Experience: 8-12 Years
Location: Bangalore

Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines that ingest, transform, and load data from various sources into the data warehouse (a queue-consumer sketch follows this posting).
- Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
- Implement data quality checks and monitoring to ensure data accuracy and integrity.
- Optimize data pipelines for performance and efficiency.
- Troubleshoot and resolve data pipeline issues.
- Stay up-to-date with emerging technologies and trends in data engineering.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data pipeline tools and frameworks.
- Experience with cloud-based data warehousing solutions (Snowflake).
- Experience with AWS Kinesis, SNS, SQS.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.

Desired Skills & Experience:
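The posting names SQS among the AWS messaging services; below is a minimal boto3 long-polling consumer sketch. The queue URL and the processing step are illustrative assumptions, not part of the posting.

```python
import json
import boto3

QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/ingest-events"  # assumed

sqs = boto3.client("sqs")

def process(message_body: dict) -> None:
    # Placeholder for the real pipeline step (validate, transform, load).
    print("processing", message_body)

while True:
    # Long-poll for up to 10 messages at a time.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,
    )
    for msg in resp.get("Messages", []):
        process(json.loads(msg["Body"]))
        # Delete only after successful processing (at-least-once semantics).
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```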
Posted 1 week ago
3.0 - 5.0 years
7 - 11 Lacs
Gurugram
Work from Office
Job Summary
Synechron is seeking a detail-oriented Data Analyst to leverage advanced data analysis, visualization, and insights to support our business objectives. The ideal candidate will have a strong background in creating interactive dashboards, performing complex data manipulations using SQL and Python, and automating workflows to drive efficiency. Familiarity with cloud platforms such as AWS is a plus, enabling optimization of data storage and processing solutions. This role will enable data-driven decision-making across teams, contributing to strategic growth and operational excellence.

Software Requirements
Required:
- PowerBI (or equivalent visualization tools like Streamlit, Dash)
- SQL (for data extraction, manipulation, and querying)
- Python (for scripting, automation, and advanced analysis)
- Data management tools compatible with cloud platforms (e.g., AWS S3, Redshift, or similar)

Preferred:
- Cloud platform familiarity, especially AWS services related to data storage and processing
- Knowledge of other visualization platforms (Tableau, Looker)
- Familiarity with source control systems (e.g., Git)

Overall Responsibilities
- Develop, redesign, and maintain interactive dashboards and visualization tools to provide actionable insights (a Streamlit sketch follows this posting).
- Perform complex data analysis, transformations, and validation using SQL and Python.
- Automate data workflows, reporting, and visualizations to streamline processes.
- Collaborate with business teams to understand data needs and translate them into effective visual and analytical solutions.
- Support data extraction, cleaning, and validation from various sources, ensuring data accuracy.
- Maintain and enhance understanding of cloud environments, especially AWS, to optimize data storage, processing pipelines, and scalability.
- Document technical procedures and contribute to best practices for data management and reporting.

Performance Outcomes:
- Timely, accurate, and insightful dashboards and reports.
- Increased automation reducing manual effort.
- Clear communication of insights and data-driven recommendations to stakeholders.

Technical Skills (By Category)
- Programming Languages. Essential: SQL, Python. Preferred: R, additional scripting languages.
- Databases/Data Management. Essential: Relational databases (SQL Server, MySQL, Oracle). Preferred: NoSQL databases like MongoDB, cloud data warehouses (AWS Redshift, Snowflake).
- Cloud Technologies. Essential: Basic understanding of AWS cloud services (S3, EC2, RDS). Preferred: Experience with cloud-native data solutions and deployment.
- Frameworks and Libraries. Python: Pandas, NumPy, Matplotlib, Seaborn, Plotly, Streamlit, Dash. Visualization: PowerBI, Tableau (preferred).
- Development Tools and Methodologies. Version control: Git; automation tools for workflows and reporting; familiarity with Agile methodologies.
- Security Protocols. Awareness of data security best practices and compliance standards in cloud environments.

Experience Requirements
- 3-5 years of experience in data analysis, visualization, or related data roles.
- Proven ability to deliver insightful dashboards, reports, and analysis.
- Experience working across teams and communicating complex insights clearly.
- Knowledge of cloud environments like AWS or other cloud providers is desirable.
- Experience in a business environment, not necessarily as a full-time developer, but as an analytical influencer.

Day-to-Day Activities
- Collaborate with stakeholders to gather requirements and define data visualization strategies.
- Design and maintain dashboards using PowerBI, Streamlit, Dash, or similar tools.
- Extract, transform, and analyze data using SQL and Python scripts.
- Automate recurring workflows and report generation to improve operational efficiencies.
- Troubleshoot data issues and derive insights to support decision-making.
- Monitor and optimize cloud data storage and processing pipelines.
- Present findings to business units, translating technical outputs into actionable recommendations.

Qualifications
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field. A Master's degree is a plus.
- Relevant certifications (e.g., PowerBI, AWS Data Analytics) are advantageous.
- Demonstrated experience with data visualization and scripting tools.
- Continuous learning mindset to stay updated on new data analysis trends and cloud innovations.

Professional Competencies
- Strong analytical and problem-solving skills.
- Effective communication, with the ability to explain complex insights clearly.
- Collaborative team player with stakeholder management skills.
- Adaptability to rapidly changing data or project environments.
- Innovative mindset to suggest and implement data-driven solutions.
- Organized, self-motivated, and capable of managing multiple priorities efficiently.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
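Given the posting's Streamlit/Dash mention, a minimal Streamlit dashboard sketch is shown below; the CSV path and column names are illustrative assumptions. Run it with `streamlit run dashboard.py`.

```python
import pandas as pd
import streamlit as st

st.title("Sales Overview")  # illustrative dashboard title

@st.cache_data
def load_data() -> pd.DataFrame:
    # Assumed extract with date, region and revenue columns.
    return pd.read_csv("sales.csv", parse_dates=["date"])

df = load_data()
region = st.selectbox("Region", sorted(df["region"].unique()))
filtered = df[df["region"] == region]

st.metric("Total revenue", f"{filtered['revenue'].sum():,.0f}")
st.line_chart(filtered.set_index("date")["revenue"])
```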
Posted 1 week ago
5.0 - 7.0 years
3 - 7 Lacs
Gurugram
Work from Office
About the Opportunity
Job Type: Application (29 July 2025)
Title: Senior Analyst Programmer
Department: FIL India Technology - GPS
Location: Gurugram
Level: Software Engineer 3

Fidelity International offers investment solutions and services and retirement expertise to more than 2.52 million customers globally. As a privately-held, purpose-driven company with a 50-year heritage, we think generationally and invest for the long term. Operating in more than 25 locations and with $750.2 billion in total assets, our clients range from central banks, sovereign wealth funds, large corporates, financial institutions, insurers and wealth managers, to private individuals. We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our GPS Data Platform team and feel like you're part of something bigger.

About your team
The GPS Lakehouse & Reporting team is a team of around 100 people whose role is to develop and maintain the data warehouse and reporting platforms that we use to administer the pensions and investments of our workplace and retail customers across the world. In doing this we are critical to the delivery of our core product and value proposition to these clients, today and in the future.

About your role
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.

Below are the key responsibilities:
- Work with Delivery Managers, System/Business Analysts and other subject matter experts to understand the requirements
- Implement Informatica mappings between inbound and target data models
- Produce technical specifications and unit test cases for the interfaces under development
- Provide support through all phases of implementation
- Adhere to the source code control policies of the project
- Implement and use appropriate Change Management processes
- Develop capability to implement Business Intelligence tools

About you
Must-have technical skills:
- Strong understanding of the standard ETL tool Informatica PowerCenter, with a minimum of 3 years of experience
- Strong Oracle SQL/PLSQL and stored procedure experience
- Knowledge of DevOps and configuration management tools like SVN, and CI tools
- Experience using job scheduling tools (Control-M preferred)
- Experience in UNIX or Python scripting

Good-to-have technical skills:
- Familiarity with Data Warehouse, Data Mart and ODS concepts
- Exposure to Agile (Scrum) development practices
- Knowledge of data normalisation and Oracle performance optimisation techniques
- Cloud technologies like AWS and Snowflake

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
Posted 1 week ago
5.0 - 7.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About the Opportunity
Job Type: Application (31 July 2025)
Title: Investment Management and Risk Data Product Owner - ISS Data (Associate Director)
Department: Technology
Location: Bangalore (hybrid / flexible working permitted)
Reports To: Data Analysis Chapter Lead
Level: Associate Director

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation. The ISS Technology group is responsible for providing technology solutions to the Investment Solutions & Services (ISS) business (which covers the Investment Management, Asset Management Operations & Distribution business units globally). The ISS Technology team supports and enhances existing applications as well as designs, builds and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy.

About your role
The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future-state design for investment and risk data across Fidelity's key business areas. The successful candidate will have in-depth knowledge of all data domains that service investment management, risk and attribution capabilities within the asset management industry. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future-state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Key Responsibilities
Leadership and Management:
- Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme.
- Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead data product documentation, enable peer reviews, estimate analysis effort, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for improving efficiencies and innovation.

Data Quality and Integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture and engineering.

Coordination and Communication:
- Senior-management-level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps; an advocate for the ISS Data Programme.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- Collaborate closely with Data Governance, Business Architecture, and Data Owners.
- Conduct workshops within the scrum teams and across business teams; effectively document the minutes and drive the actions.

About you
- Strong leadership and senior-management-level communication, internal and external client management and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet etc.
- In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI, FactSet support Investment, Risk, Performance and Attribution business needs.
- Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk and attribution.
- In-depth expertise in data and calculations across the investment industry, covering:
  - Financial data: information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions.
  - Asset-specific data: data related to financial instruments, and reference data like asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: data like security prices, exchange rates, index constituents and the licensing restrictions on them.
  - Risk data: data related to risk factors such as market risk, credit risk, operational risk, and compliance risk.
  - Performance & Attribution data: data on fund performance returns and attribution using various methodologies like time-weighted returns and transaction-based performance attribution (a worked sketch follows this posting).
- Should possess problem-solving skills, attention to detail, and critical thinking.
- Technical skills: hands-on SQL, advanced Excel, Python, ML (optional) and knowledge of end-to-end tech solutions involving data platforms; knowledge of data management, data governance and data engineering practices; hands-on experience with data modelling techniques like dimensional, data vault etc.
- Willingness to own and drive things; collaboration across business and tech stakeholders.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
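To make the time-weighted return methodology mentioned above concrete, here is a short Python sketch. It chain-links sub-period returns between external cash flows; the valuation and flow values are made-up example numbers.

```python
def time_weighted_return(valuations, flows):
    """Chain-linked time-weighted return (TWR).

    valuations[i] is the portfolio value at the end of sub-period i
    (valuations[0] is the starting value); flows[i] is the external
    cash flow received at the start of sub-period i (flows[0] unused).
    """
    growth = 1.0
    for i in range(1, len(valuations)):
        begin_value = valuations[i - 1] + flows[i]  # value after the flow lands
        growth *= valuations[i] / begin_value       # sub-period growth factor
    return growth - 1.0

# Example: start at 100, grow to 112, receive a 10 inflow, end at 105.8.
twr = time_weighted_return([100.0, 112.0, 105.8], [0.0, 0.0, 10.0])
print(f"{twr:.2%}")  # about -2.87%: the inflow itself does not count as performance
```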
Posted 1 week ago
5.0 - 7.0 years
2 - 6 Lacs
Gurugram
Work from Office
About the Opportunity
Job Type: Application (29 July 2025)
Title: Analyst Programmer
Department: WPFH
Location: Gurgaon
Level: 2

Intro
We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our Data team and feel like you're part of something bigger.

About your team
The successful candidate would join the Data team, and would be responsible for building data integration and distribution experience to work within the Distribution Data and Reporting team and its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions.

About your role
This role would be responsible for liaising with the technical leads, business analysts, and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica / ETL tools. The successful candidate will be able to demonstrate an innovative and enthusiastic approach to technology and problem solving, will display good interpersonal skills and show confidence and ability to interact professionally with people at all levels, and will exhibit a high level of ownership within a demanding working environment.

Key Responsibilities
- Work with technical leads, business analysts and other subject matter experts
- Understand the data model/design and develop the ETL jobs
- Sound technical knowledge of Informatica, to take ownership of allocated development activities and work independently
- Working knowledge of the Oracle database, to take ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads)
- Provide development estimates
- Implement standards, procedures and best practices for data maintenance, reconciliation and exception management
- Interact with cross-functional teams to coordinate dependencies and deliverables

Essential Skills
Technical:
- Deep knowledge and experience of using the Informatica PowerCenter tool set (min 3 yrs)
- Experience in Snowflake
- Experience of source control tools
- Experience of using job scheduling tools such as Control-M
- Experience in UNIX scripting
- Strong SQL or PL/SQL experience, with a minimum of 2 years of experience
- Experience in Data Warehouse, Data Mart and ODS concepts
- Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques
- 3+ years of experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows

Functional:
- 3+ years of experience working within financial organisations, with broad-based business process, application and technology architecture experience
- Experience with data distribution and access concepts, with the ability to utilise these concepts in realising a proper physical model from a conceptual one
- Business-facing, with the ability to work alongside data stewards in systems and the business
- Strong interpersonal, communication and client-facing skills
- Ability to work closely with cross-functional teams

About you
- B.E./B.Tech/MBA/M.C.A or any other bachelor's degree
- At least 3+ years of experience in data integration and distribution
- Experience in building web services and APIs
- Knowledge of Agile software development life-cycle methodologies
Posted 1 week ago
9.0 - 12.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Snowflake Requirements

Administration Experience:
- Managing user access, roles, and security protocols
- Configuring and optimizing virtual warehouses for different workloads
- Setting up and maintaining database replication and failover procedures
- Setting up programmatic access

Platform-Level Expertise:
- Performance tuning and query optimization
- Data clustering and micro-partitioning strategies
- Time travel and zero-copy cloning implementation (a sketch follows this posting)
- Experience with Snowflake's architecture (storage/compute separation)
- Managing Snowflake integrations with other cloud services

Security and Governance:
- Implementing row/column level security
- Setting up and managing data masking policies
- Configuring network policies and private connectivity options
- Audit logging and compliance monitoring
- Managing encryption and key rotation

AWS Requirements

OpenSearch Experience:
- Deploying and scaling OpenSearch domains
- Implementing search solutions with complex query requirements
- Performance optimization for large-scale search operations
- Managing security and access controls
- Setting up monitoring and alerting
- Experience with OpenSearch Dashboards

Specialty/Analytical Databases:
- DynamoDB:
- Neptune:

Additional Database Experience:
- Amazon MemoryDB
- Amazon DocumentDB (MongoDB compatibility)

General AWS Skills:
- Infrastructure as Code (CloudFormation, CDK)
- VPC and networking configuration
- Security best practices and IAM management
- Monitoring and logging (CloudWatch, CloudTrail)
- Cost optimization strategies
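To ground the time travel and zero-copy cloning items above, here is a short sketch of the relevant Snowflake statements executed from Python; the table name, one-hour offset and connection values are illustrative assumptions.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin", password="...",
    database="ANALYTICS", schema="PUBLIC", warehouse="WH_XS",
)
cur = conn.cursor()

# Zero-copy clone: instant, shares micro-partitions until data diverges.
cur.execute("CREATE TABLE orders_dev CLONE orders")

# Time travel: query the table as it was one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)")
print(cur.fetchone())

# Combine the two to restore a pre-incident copy of the table.
cur.execute("CREATE TABLE orders_restored CLONE orders AT (OFFSET => -3600)")

conn.close()
```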
Posted 1 week ago
6.0 - 11.0 years
3 - 7 Lacs
Karnataka
Hybrid
PF detection is mandatory.

- Looking for a candidate with over 6 years of hands-on involvement in Snowflake.
- The primary expertise required is in Snowflake; the candidate must be capable of creating complex SQL queries for manipulating data and should excel at implementing complex scenarios within Snowflake.
- The candidate should possess a strong foundation in Informatica PowerCenter, demonstrating proficiency in executing ETL processes.
- Strong hands-on experience in SQL and RDBMS.
- Strong hands-on experience in Unix shell scripting.
- Knowledge of data warehousing and cloud data warehousing.
- Should have good communication skills.
Posted 1 week ago
5.0 - 10.0 years
4 - 7 Lacs
Mumbai
Hybrid
PF detection is mandatory.

1. Minimum 5 years of experience in database development and ETL tools.
2. Strong expertise in SQL and database platforms (e.g., SQL Server, Oracle, PostgreSQL).
3. Proficiency in ETL tools (e.g., Informatica, SSIS, Talend, DataStage) and scripting languages (e.g., Python, Shell).
4. Experience with data modeling and schema design.
5. Familiarity with cloud databases and ETL tools (e.g., AWS Glue, Azure Data Factory, Snowflake).
6. Understanding of data warehousing concepts and best practices.
Posted 1 week ago
6.0 - 11.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Strong experience in:
- 7+ years in the ETL domain (Mandatory)
- Within those 7 years, 3+ years in Talend ETL (Mandatory)
- Talend deployments and development (Mandatory)
- Writing SQL queries (Mandatory)
- Troubleshooting SQL queries (Mandatory)
- Data Warehouse (DWH) concepts and Data Warehouse ETL (Mandatory)
- Any cloud database experience (Mandatory), preferably Redshift, AWS Aurora, PostgreSQL, Snowflake etc.
- Dimensional data modeling (Optional)
Posted 1 week ago
5.0 - 10.0 years
5 - 8 Lacs
Chennai
Work from Office
We are looking immediately for Snowflake Data Warehouse Engineers (Contract) - Chennai

Role: Snowflake Data Warehouse Engineers
Experience: 5+ years
Location: Chennai
Period: Immediate
Type: Contract

Description:
- We need experienced, collaborative Snowflake Data Warehouse Engineers with 5+ years of experience in developing Snowflake data models, data ingestion, views, stored procedures, and complex queries
- Good experience in SQL
- Experience in Informatica PowerCenter / IICS ETL tools
- Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions
- Provide production support for data warehouse issues such as data load problems and transformation/translation problems
- Ability to facilitate and coordinate discussion and to manage the expectations of multiple stakeholders
- Candidates must have good communication and facilitation skills
- Work in an onsite-offshore model involving daily interactions with onshore teams to ensure on-time quality deliverables
Posted 1 week ago
10.0 - 15.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Responsibilities
- Architecting and designing analytics platform solutions for clients
- Identifying data sources and defining data pipelines, segmentation, and performance engineering for scale
- Managing a team of Data Analysts/Data Engineers, assigning work and reviewing the quality of deliverables
- Experience in building business reports and dashboards with the latest data warehousing and BI tools such as Snowflake/Cognos/Tableau/Superset/Power BI
- Strong understanding of data modeling, data warehousing concepts and various analytical data models
- Familiarity with business domains such as healthcare
- Migration experience highly preferred
- Azure Cloud experience highly preferred
- Perform research with new technologies and third-party vendor technologies as required, and prototyping
- Design, develop and implement analytics based on discussions with the team
- Programming skills on generic analytic platforms using Python/Java/Scala

Skills & Experience:
- A minimum of 10+ years of experience with complex/large enterprise BI/ETL/Data Warehouse projects, Big Decisions Sales Analytics a.k.a. Redshift, and the Databricks platform
- Experience in designing and architecting data and BI solutions related to one or more of the following: data quality/data governance assessments, data strategy and roadmaps, integration scoping (requirements, design and mapping), and Data Warehouse and Data Lake design and implementation
- Ability to perform information management assessments (review of needs, data quality, data security, etc.) and accurately estimate the implementation effort
- Awareness of emerging data warehouse design approaches and their potential future impact on data warehouse architecture
- Advanced knowledge of data warehouse dimensional modelling
- Hands-on knowledge of at least one key ETL market tool, such as Informatica PowerCenter, Oracle Data Integrator, Microsoft SSIS etc.
- Hands-on knowledge of at least one key BI market tool (IBM Cognos, SAP BO, Oracle BI, Tableau, MS Power BI, QlikView, SSRS etc.)
- Hands-on knowledge of enterprise repository tools, data modelling tools, data mapping tools, and data profiling tools (e.g. SAS, R etc.)
- Knowledge of the latest market trends (Big Data, Blockchain, AI, Cloud, Mobile, etc.)
- Knowledge of master data management and exposure to MDM tools
- Strong communication, networking & interpersonal skills
- Strong business analysis, planning, monitoring, requirement management and related knowledge
- Experience with at least one of the data quality technologies (i.e. Informatica Master Data Management, SAS Data Management, Talend Data Quality)
- Solid experience in developing data governance practices at the enterprise level: metadata, data lineage, data classification, data security & data life cycle
- Experience with normalized data stores, operational data stores, dimensional data stores & enterprise data lakes
- Understanding of predictive and prescriptive modelling
- Coordinate with the Data Science department to identify future needs and requirements
- Excellent interpersonal and teamwork skills
- Experience driving solution/enterprise-level architecture and collaborating with other tech leads
- Strong problem solving, troubleshooting and analysis skills
- Experience working in a geographically distributed team
- Experience with leading and mentoring other team members
- Good knowledge of Agile Scrum
- Good communication skills
Posted 1 week ago
5.0 - 9.0 years
11 - 15 Lacs
Bengaluru
Work from Office
BI Tools or Data Acceleration/Data Processing deployment and administration. Any previous experience administering in-memory columnar databases such as Exasol, Greenplum, Vertica, or Snowflake. Strong analytical and problem-solving skills. Ability to communicate orally and in writing in a clear and straightforward manner with a broad range of technical and non-technical users and stakeholders. Proactive and focused on results and success; conveys a sense of urgency and drives issues to closure. Should be a team player and leader: flexible, hardworking, and self-motivated, with a positive outlook and the ability to take on difficult initiatives and challenges. Ability to handle multiple concurrent projects.
If interested, please share the following details: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding Any Offer.
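As a flavor of the administration work named here, below is a minimal, hypothetical health-check sketch against Snowflake (one of the columnar databases listed), querying its documented SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view for the slowest recent queries. Connection parameters are placeholders.

```python
import os
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; replace with real account details.
conn = snowflake.connector.connect(
    account="my_account",
    user="admin_user",
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
cur = conn.cursor()

# Find the ten slowest queries of the last 24 hours.
cur.execute("""
    SELECT query_id, warehouse_name, total_elapsed_time
    FROM   SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE  start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER  BY total_elapsed_time DESC
    LIMIT  10
""")
for query_id, warehouse, elapsed_ms in cur.fetchall():
    print(f"{query_id} on {warehouse}: {elapsed_ms} ms")

cur.close()
conn.close()
```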
Posted 1 week ago
5.0 - 10.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Stellantis is seeking a passionate, innovative, results-oriented Information Communication Technology (ICT) Manufacturing AWS Cloud Architect to join the team. As a Cloud Architect, the selected candidate will leverage business analysis, data management, and data engineering skills to develop sustainable data tools supporting Stellantis's Manufacturing Portfolio Planning. This role will collaborate closely with data analysts and business intelligence developers within the Product Development IT Data Insights team.
Job responsibilities include, but are not limited to: deep expertise in the design, creation, management, and business use of large datasets across a variety of data platforms; assembling large, complex sets of data that meet non-functional and functional business requirements; identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes; building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS, cloud, and other SQL technologies (a minimal PySpark sketch of such a pipeline follows this description); working with stakeholders to support their data infrastructure needs while assisting with data-related technical issues; maintaining high-quality ontology and metadata of data systems; establishing a strong relationship with the central BI/data engineering COE to ensure alignment in leveraging corporate-standard technologies, processes, and reusable data models; ensuring data security and developing traceable procedures for user access to data systems.
Qualifications, Experience and Competency
Education: Bachelor's or Master's degree in Computer Science or a related IT-focused degree.
Experience (Essential): Overall 10-15 years of IT experience. Develop, automate, and maintain the build of AWS components and operating systems. Work with application and architecture teams to conduct proofs of concept (POC) and implement the design in a production environment in AWS. Migrate and transform existing workloads from on-premises to AWS. Minimum 5 years of experience in data engineering or data architecture: concepts, approach, data lakes, data extraction, data transformation. Proficient in ETL optimization, designing, coding, and tuning big data processes using Apache Spark or similar technologies. Experience operating very large data warehouses or data lakes. Investigate and develop new microservices and features using the latest technology stacks from AWS. Self-starter with the desire and ability to quickly learn new technologies. Strong interpersonal skills with the ability to communicate and build relationships at all levels. Hands-on experience with AWS cloud technologies such as S3, AWS Glue, Glue Catalog, Athena, AWS Lambda, AWS DMS, PySpark, and Snowflake. Experience building data pipelines and applications to stream and process large datasets at low latencies. Identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
Desirable: Familiarity with data analytics, engineering processes, and technologies. Ability to work successfully within a global and cross-functional team. A passion for technology: we are looking for someone who is keen to leverage their existing skills while trying new approaches, and to share that knowledge with others to help grow the data and analytics teams at Stellantis to their full potential!
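The sketch below is a minimal, hypothetical example of the Spark-on-AWS ETL this posting describes: extract raw CSVs from S3, transform, and load as partitioned Parquet for downstream Athena/Glue Catalog queries. Bucket paths and column names are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files from a hypothetical S3 landing bucket.
raw = spark.read.option("header", True).csv("s3://my-raw-bucket/orders/")

# Transform: enforce types, drop malformed rows, normalize the date column.
cleaned = (
    raw
    .withColumn("order_total", F.col("order_total").cast("double"))
    .filter(F.col("order_total").isNotNull())
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Load: write Parquet partitioned by date so queries can prune partitions.
cleaned.write.mode("overwrite").partitionBy("order_date") \
       .parquet("s3://my-curated-bucket/orders/")

spark.stop()
```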
Specific Skill Requirements: AWS services (Glue, DMS, EC2, RDS, S3, VPCs and all core services, Lambda, API Gateway, CloudFormation, CloudWatch, Route 53, Athena, IAM), SQL, Qlik Sense, Python/Spark, and ETL optimization.
If you are interested, please share the below details and your updated resume: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding Any Offer.
Posted 1 week ago