8.0 - 10.0 years
16 - 21 Lacs
Hyderabad
Work from Office
Responsibilities: * Design, develop, and maintain Power BI solutions using DAX, ETL tools, SQL, and Snowflake. * Collaborate with cross-functional teams on data modeling and reporting requirements.
Posted 1 week ago
5.0 - 9.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Overview: We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and is expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development, and maintenance of business intelligence and analytics solutions.

Responsibilities:
• Develop reports, dashboards, and advanced visualizations. Work closely with product managers, business analysts, clients, etc. to understand needs/requirements and develop the visualizations needed.
• Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
• Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation.
• Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
• Collaborate in design reviews and code reviews to ensure standards are met. Recommend new standards for visualizations.
• Build and reuse templates/components/web services across multiple dashboards.
• Support presentations to customers and partners.
• Advise on new technology trends and possible adoption to maintain competitive advantage.
• Mentor associates.

Experience Needed:
• 8+ years of related experience is required.
• A Bachelor's or Master's degree in Computer Science or a related technical discipline is required.
• Highly skilled in data visualization tools like Power BI, Tableau, QlikView, etc.
• Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets.
• Strong SQL coding experience, with performance optimization experience for data queries.
• Understands different data models: normalized, de-normalized, star, and snowflake models.
• Has worked in big data environments, cloud data stores, and with different RDBMS and OLAP solutions.
• Experience in design, development, and deployment of BI systems.
• Candidates with ETL experience preferred.
• Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery.
• Has a strong technical background and stays evergreen with technology and industry developments.

Additional Requirements:
• Demonstrated ability to have successfully completed multiple, complex technical projects.
• Prior experience with application delivery using an onshore/offshore model.
• Experience with business processes across multiple master data domains in a services-based company.
• Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality.
• Demonstrates high standards of professional behavior in dealings with clients, colleagues, and staff.
• Strong written communication skills; effective and persuasive in both written and oral communication.
• Experience with gathering end-user requirements and writing technical documentation.
• Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.
• May require occasional travel.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by submitting their request through this form that must be downloaded: click here to access or download the form. Complete the form and then email it as an attachment to FTADAAA@conduent.com. You may also click here to access Conduent's ADAAA Accommodation Policy. At Conduent we value the health and safety of our associates, their families and our community. For US applicants, while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We're looking for a Senior Data Analyst to join our data-driven team at an ad-tech company that thrives on turning complexity into clarity. Our analysts play a critical role in transforming raw, noisy data into accurate, actionable signals that drive real-time decision-making and long-term strategy. You'll work closely with product, engineering, and business teams to uncover insights, shape KPIs, and guide performance optimization.

Responsibilities: Analyze large-scale datasets from multiple sources to uncover actionable insights and drive business impact. Design, monitor, and maintain key performance indicators (KPIs) across ad delivery, bidding, and monetization systems. Partner with product, engineering, and operations teams to define metrics, run deep-dive analyses, and influence strategic decisions. Develop and maintain dashboards, automated reports, and data pipelines to ensure data accessibility and accuracy. Lead investigative analysis of anomalies or unexpected trends in campaign performance, traffic quality, or platform behavior.

Requirements: BA/BSc in Industrial Engineering and Management, Information Systems Engineering, Economics, Statistics, Mathematics, or a similar background. 3+ years of experience in data analysis and interpretation (marketing/business/product). High proficiency in SQL. Experience with data visualization of large data sets using BI systems (Qlik Sense, Sisense, Tableau, Looker, etc.). Experience working with data warehouse/data lake tools like Athena/Redshift/Snowflake/BigQuery. Knowledge of Python - an advantage. Experience building ETL processes - an advantage. Fluent in English, both written and spoken - must.
Posted 1 week ago
5.0 - 10.0 years
19 - 25 Lacs
Bengaluru
Hybrid
5+ years of experience in data analysis, reporting, or business intelligence. Advanced proficiency in Tableau – ability to create complex, interactive dashboards. Strong SQL skills – experience with Snowflake.
Posted 1 week ago
3.0 - 8.0 years
10 - 13 Lacs
Bengaluru
Hybrid
Can share resume to: sowmya.v@acesoftlabs.com

Position: Data Engineer
Experience Range: 3-8 years

Key Responsibilities:
Data Pipeline Development - Design, build, and optimize scalable data pipelines; ingest, transform, and load data from multiple sources; use Azure Databricks, Snowflake, and DBT for pipeline orchestration.
Data Architecture & Modeling - Develop and manage data models within Snowflake; ensure efficient data organization, accessibility, and quality.
Data Transformation - Implement standardized data transformations using DBT.
Performance Optimization - Monitor pipeline performance, troubleshoot and resolve issues, and optimize workflows for efficiency.
Collaboration - Work with data scientists, analysts, and business stakeholders to ensure access to reliable, well-structured data for analytics and reporting.

Required Qualifications: Bachelor's degree in Computer Science, Data Engineering, or a related field. Proficiency in Azure Databricks for data processing. Experience with Snowflake as a data warehouse platform. Hands-on expertise with DBT for data transformations. Strong SQL skills and understanding of data modeling principles. Ability to troubleshoot and optimize complex data workflows.

Preferred / Additional Skills: Experience in MS Azure, Snowflake, DBT, and Big Data (Hadoop ecosystem). Knowledge of Hadoop architecture and storage frameworks. Hands-on with Hadoop, Spark, Hive, and Databricks. Experience with Data Lake solutions using Scala and Python. Experience with Azure Data Factory (ADF) for orchestration. Familiarity with CI/CD tools such as Jenkins, Azure DevOps, and GitHub. Strong programming skills in Python or Scala.
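Since the role centers on standardized DBT transformations against Snowflake, here is a minimal sketch of a dbt Python model (dbt 1.3+ running on Snowflake/Snowpark). The source, table, and column names are hypothetical, and most dbt projects would express the same logic as a SQL model; this is an illustration of the skill set, not this employer's codebase.

```python
# models/stg_orders.py -- a minimal dbt *Python* model sketch (dbt >= 1.3
# on Snowflake/Snowpark). Source, table, and column names are hypothetical.

def model(dbt, session):
    # Materialize the transformation as a table in the warehouse
    dbt.config(materialized="table")

    # Reference an upstream raw table registered as a dbt source
    raw_orders = dbt.source("raw", "orders")  # returns a Snowpark DataFrame

    # Standardize: drop test records and rename a timestamp column
    cleaned = (
        raw_orders
        .filter(raw_orders["ORDER_STATUS"] != "TEST")
        .with_column_renamed("ORDER_TS", "ORDER_TIMESTAMP")
    )
    return cleaned
```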
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Hyderabad, Chennai
Work from Office
Key Responsibilities: Design, develop, and maintain data pipelines and data models using Snowflake. Validate and optimize ETL processes to ensure accurate and efficient data movement. Collaborate with data engineers and analysts to understand business requirements and deliver scalable solutions. Perform data quality checks and resolve data issues in the Snowflake environment. Implement best practices for data warehousing, performance tuning, and security in Snowflake. Document ETL processes, data flows, and architectural decisions.
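To illustrate the kind of data quality check in a Snowflake environment this posting describes, here is a minimal sketch using the official snowflake-connector-python package; the connection parameters, table, and rule are hypothetical placeholders, not anything specified by the employer.

```python
# A minimal sketch of a post-load data quality check in Snowflake using
# snowflake-connector-python. All names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",        # hypothetical
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Fail fast if an upstream load introduced NULL business keys
    cur.execute("SELECT COUNT(*) FROM fact_sales WHERE customer_id IS NULL")
    null_keys = cur.fetchone()[0]
    if null_keys > 0:
        raise ValueError(f"data quality check failed: {null_keys} NULL customer_ids")
finally:
    conn.close()
```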
Posted 1 week ago
9.0 - 14.0 years
35 - 50 Lacs
Chennai
Hybrid
Hiring: Power BI Architect
Location: Chennai (Hybrid - 3 days office)
Experience: 9+ years
Notice Period: Immediate to 30 days

About the Role: We are looking for an experienced and visionary Power BI Architect to lead the design, development, and optimization of business intelligence solutions across enterprise data platforms. This role demands deep expertise in Power BI, data modeling, ETL frameworks, and cloud data platforms like Snowflake or Data Lake. You will play a key role in architecting scalable and high-performing BI systems while collaborating closely with stakeholders, data engineers, and analysts.

Key Responsibilities: Architect end-to-end Power BI solutions including semantic models, ETL pipelines, and secure data access layers. Design and manage data modeling strategies (star/snowflake schemas), leveraging DAX, Power Query, and best practices. Lead ETL development for data ingestion from various sources (cloud/on-prem) into Power BI or data lakes. Define and enforce reporting standards, performance optimization guidelines, and governance frameworks. Collaborate with engineering teams on data quality, integration, and automation using tools like Azure Data Factory, Dataflows, or Databricks. Implement advanced security measures such as Row-Level Security (RLS) and data classification. Mentor Power BI developers and contribute to code reviews, documentation, and solution templates. Stay updated on the Power BI roadmap and proactively evaluate new features and tools for integration. Participate in requirement gathering, solution design discussions, and architectural reviews.

Required Qualifications: Bachelor's/Master's degree in Computer Science, Information Systems, or a related field. 9+ years of experience in Business Intelligence, with at least 5 years hands-on in Power BI. Expertise in DAX, Power Query (M), and building optimized data models. Strong command of SQL, data warehousing, and dimensional modeling. Proven experience with Snowflake or Azure Data Lake implementations. Experience in designing and maintaining enterprise BI architecture and ETL frameworks. Familiarity with the Azure ecosystem (ADF, Azure SQL DB, Azure Synapse, Databricks) is highly desirable. Exposure to version control tools like Git/GitHub or Azure DevOps. Strong communication, stakeholder management, and leadership skills.

Nice to Have: Experience in Power BI Embedded, Deployment Pipelines, or Paginated Reports. Knowledge of CI/CD for BI solutions. Certifications: Microsoft Certified: Power BI Data Analyst Associate or Azure Data Engineer Associate.

Why Join Us? Work with a dynamic and collaborative team building cutting-edge analytics solutions. Flexible hybrid work setup. Opportunity to architect enterprise-scale BI platforms in a modern data stack environment.
Posted 1 week ago
5.0 - 10.0 years
25 - 30 Lacs
Chennai
Work from Office
Job Summary: We are seeking a highly skilled Data Engineer to design, develop, and maintain robust data pipelines and architectures. The ideal candidate will transform raw, complex datasets into clean, structured, and scalable formats that enable analytics, reporting, and business intelligence across the organization. This role requires strong collaboration with data scientists, analysts, and cross-functional teams to ensure timely and accurate data availability and system performance.

Key Responsibilities: Design and implement scalable data pipelines to support real-time and batch processing. Develop and maintain ETL/ELT processes that move, clean, and organize data from multiple sources. Build and manage modern data architectures that support efficient storage, processing, and access. Collaborate with stakeholders to understand data needs and deliver reliable solutions. Perform data transformation, enrichment, validation, and normalization for analysis and reporting. Monitor and ensure the quality, integrity, and consistency of data across systems. Optimize workflows for performance, scalability, and cost-efficiency. Support cloud and on-premise data integrations, migrations, and automation initiatives. Document data flows, schemas, and infrastructure for operational and development purposes. Apply best practices in data governance, security, and compliance.

Required Qualifications & Skills: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Proven 6+ years of experience in data engineering, ETL development, or data pipeline management. Proficiency with tools and technologies such as: SQL, Python, Spark, Scala; ETL tools (e.g., Apache Airflow, Talend); cloud platforms (e.g., AWS, GCP, Azure); Big Data tools (e.g., Hadoop, Hive, Kafka); data warehouses (e.g., Snowflake, Redshift, BigQuery). Strong understanding of data modeling, data architecture, and data lakes. Experience with CI/CD, version control, and working in Agile environments.

Preferred Qualifications: Experience with data observability and monitoring tools. Knowledge of data cataloging and governance frameworks. AWS/GCP/Azure data certification is a plus.
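As Apache Airflow is named among the ETL tools above, a minimal sketch of a daily Airflow (2.x) DAG follows; the DAG id, schedule, and task body are illustrative placeholders, not part of the posting.

```python
# A minimal Apache Airflow 2.x DAG sketch for a daily batch ETL job.
# The DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transform_load():
    # Placeholder: pull from a source system, transform, load to the warehouse
    print("extract, transform, load")


with DAG(
    dag_id="daily_sales_etl",        # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="etl", python_callable=extract_transform_load)
```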
Posted 1 week ago
9.0 - 14.0 years
30 - 37 Lacs
Hyderabad
Hybrid
SQL & Database Management: Deep knowledge of relational databases (SQL/PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
ETL/ELT Tools: Extensive experience building and maintaining data pipelines with tools such as SnapLogic, StreamSets, or DBT.
Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
Data Warehousing: Experience managing large datasets and data marts, and optimizing databases for performance.
Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.
Important: The candidate should have a strong data engineering background, with hands-on experience handling large volumes of data, data pipelines, and cloud-based data systems, and this should be reflected in the profile.
Posted 1 week ago
1.0 - 4.0 years
12 - 16 Lacs
Gurugram
Hybrid
Primary Role Responsibilities: Develop and maintain data ingestion and transformation pipelines across on-premise and cloud platforms. Develop scalable ETL/ELT pipelines that integrate data from a variety of sources (e.g., form-based entries, SQL databases, Snowflake, SharePoint). Collaborate with data scientists, data analysts, simulation engineers, and IT personnel to deliver data engineering and predictive data analytics projects. Implement data quality checks, logging, and monitoring to ensure reliable operations. Follow and maintain data versioning, schema evolution, and governance controls and guidelines. Help administer Snowflake environments for cloud analytics. Work with more senior staff to improve solution architectures and automation. Stay updated with the latest data engineering technologies and trends. Participate in code reviews and knowledge-sharing sessions. Participate in and plan new data projects that impact business and technical domains.

Required Qualifications: Bachelor's or Master's degree in computer science, data engineering, or a related field. 1-3 years of experience in data engineering, ETL/ELT development, and/or backend software engineering. Demonstrated expertise in Python and SQL. Demonstrated experience working with data lakes and/or data warehouses (e.g., Snowflake, Databricks, or similar). Familiarity with source control and development practices (e.g., Git, Azure DevOps). Strong problem-solving skills and eagerness to work with cross-functional, globalized teams.

Preferred Qualifications: The required qualifications, plus: working experience and knowledge of scientific and R&D workflows, including simulation data and LIMS systems; demonstrated ability to balance operational support and longer-term project contributions; experience with Java; strong communication and presentation skills; a motivated and self-driven learner.
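As an illustration of the Python-plus-Snowflake ingestion work this role describes, a minimal sketch using the connector's write_pandas helper follows; the connection parameters, table, and column names are all hypothetical placeholders chosen to echo the R&D theme.

```python
# A minimal sketch of loading a DataFrame into Snowflake with the official
# connector's write_pandas helper. All names below are hypothetical.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Hypothetical form-based entries already parsed into a DataFrame
df = pd.DataFrame({"SAMPLE_ID": [1, 2], "RESULT": [0.42, 0.37]})

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # hypothetical
    warehouse="ETL_WH", database="RND", schema="LAB",
)
try:
    result = write_pandas(conn, df, table_name="LAB_RESULTS",
                          auto_create_table=True)
    print(f"write_pandas success flag: {result[0]}")
finally:
    conn.close()
```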
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Your Impact: As a developer you will work as part of a highly skilled team of professionals responsible for architecting, designing, and developing cost-effective and sustainable solutions for the Security Products business of OpenText. Strong organizational skills, technical expertise, and attention to detail are key in this customer-focused role.

What the role offers: Translate business requirements using complex methods/models to determine appropriate system solutions. Work within a cross-functional team to provide technical expertise in the design and planning of system solutions. Research, identify, test, certify, and select technology required for solution delivery. Maximize the performance, uptime, and supportability of the product. Develop highly scalable security products using technologies such as Java, J2EE, REST, Azure, AWS, GCP, and Snowflake. Work with the team to design solutions to security problems, and monitor and analyze the security vulnerabilities reported in bundled 3rd-party products. Design and implement new interface components in collaboration with the product owner and other OpenText development teams. Collaborate with engineering and development partners to develop reliable, cost-effective, and high-quality software solutions. Maintain the existing components and resolve problems reported by customers. Enhance existing components with new capabilities whilst maintaining compatibility. Provide feedback on test plans, test cases, and test methodologies. Research new technologies for product improvements and the future roadmap. Communicate with stakeholders, provide project progress, and highlight any risks involved along with a mitigation plan.

What you need to succeed: Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent. 2-5 years of software development experience building large-scale and highly distributed applications. Experience developing highly scalable security products using technologies such as Java, J2EE, REST/SOAP, AWS, GCP, Snowflake, and Azure. Demonstrated ability to have completed multiple, complex technical projects. Strong programming skills in Java, J2EE. Experience in cloud (AWS, GCP, or Azure) is a must. Experience working in a DevOps, continuous integration environment. Excellent communication skills and ability to interact effectively with both technical and non-technical staff. In-depth technical experience in the IT infrastructure area; understanding of the operational challenges involved in managing complex systems. Previous experience being part of complex integration projects. Technical execution of project activities and responsibility for on-time delivery and results. Interfacing with customer-facing functions to gather project requirements and performing due diligence as required. Providing technical guidance for troubleshooting and issue resolution when needed. Familiarity with Agile software development (preferably Scrum). Unit testing and mocking frameworks like Mockito.

Desired Skills: Understanding of the security domain. Experience in Azure, AWS, GCP, and Hadoop. Working knowledge of Linux. Cloud technologies and cloud application development. Good knowledge of security threat models and various security encryption techniques. Knowledge of different types of security vulnerabilities, attack vectors, and common types of cyberattacks.
Posted 1 week ago
4.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Ahmedabad
Hybrid
Snowflake Data Engineer
Experience: 5-10 years
Location: Hyderabad

Role & Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines in Snowflake to support data migration from legacy systems. Leverage Python for data transformation, automation, and orchestration of migration workflows. Optimize and refactor complex SQL queries to ensure efficient data processing and reporting in Snowflake. Collaborate on data modeling and schema design to align with Snowflake architecture and performance best practices. Monitor and troubleshoot data pipeline performance during and after migration phases. Work closely with data analysts, scientists, and business stakeholders to ensure accurate and timely data delivery. Implement and enforce data governance, security policies, and access controls within Snowflake. Collaborate with DevOps teams to integrate data engineering workflows into broader CI/CD frameworks.

Required Skills: 4-5 years of experience in data engineering, with proven expertise in Snowflake and Python. Strong command of Snowflake features such as scripting, time travel, virtual warehouses, and query optimization. Hands-on experience with ETL tools, data integration strategies, and migration methodologies. Solid understanding of data warehousing principles, normalization techniques, and performance optimization. Familiarity with cloud platforms (AWS, Azure, or GCP) and orchestration tools. Excellent problem-solving skills and ability to work independently in a dynamic, fast-paced environment.
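Since Snowflake Time Travel is called out among the required skills, a small illustrative sketch follows: comparing a table's row count now versus one hour ago, e.g. to sanity-check a migration backfill. The connection parameters and table name are hypothetical, not taken from the posting.

```python
# An illustrative sketch of Snowflake Time Travel: row count now vs. one
# hour ago, e.g. to verify a migration backfill. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # hypothetical
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # AT(OFFSET => -3600) reads the table as it existed 3600 seconds ago
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    before = cur.fetchone()[0]
    cur.execute("SELECT COUNT(*) FROM orders")
    after = cur.fetchone()[0]
    print(f"rows one hour ago: {before}, rows now: {after}")
finally:
    conn.close()
```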
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
punjab
On-site
About Us: We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!

Job Description / Responsibilities: We are looking for Data Engineers. The candidate must have a minimum of 10 years of experience in a Data Engineer role, including the following tools/technologies:
Experience with relational (SQL) databases.
Experience with data warehouses like Oracle, SQL, and Snowflake.
Technical expertise in data modeling, data mining, and segmentation techniques.
Experience building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration (PDI), ADF (Azure Data Factory), Snowpipe, Fivetran, and DBT.
Experience with batch and real-time data ingestion and processing frameworks.
Experience with languages like Python, Java, etc.
Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark, and Scala, is a plus.

Develops code and solutions that transfer/transform data across various systems. Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools. Ensures data is transformed and stored in efficient methods for retrieval and use. Maintains data systems to ensure optimal performance. Develops a deep understanding of underlying business systems involved with analytical systems. Follows standard software development lifecycle, code control, code standards and process standards. Maintains and develops technical knowledge by self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills.

Systems Analysis: Works with key stakeholders to understand business needs and capture functional and technical requirements. Offers ideas that simplify the design and complexity of solutions delivered. Effectively communicates any expectations required of stakeholders or other resources during solution delivery. Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.

Service Management: Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery. Defines and manages promised delivery dates. Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate. Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery.

Education / Job-Related Technical Skills: Bachelor's degree in Computer Science/Information Technology or equivalent. Ability to effectively communicate with others at all levels of the company, both verbally and in writing.
Demonstrates a courteous, tactful, and professional approach with employees and others. Ability to work in a large, global corporate structure.

Our Commitment to Our People: Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future - for our generation and all those to come. Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial. Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal - to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally. Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time-off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. Together, we have the opportunity and the power to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!

Our Commitment to Diversity, Equity & Inclusion: At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live. Equal Opportunity Employer.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant, Power BI Developer. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and should be willing to learn new technologies.

Responsibilities: Understand the fundamentals of data preparation/data modeling necessary for visualization purposes. Develop reports and visualizations using Power BI. Work experience in stored procedures and SQL. Strong knowledge of database concepts and internals. Work experience in Snowflake and MS SQL. Strong understanding of Power BI user and group security configuration.

Qualifications we seek in you!
Minimum Qualifications: BE/B.Tech/MCA. Excellent written and verbal communication skills.
Preferred Qualifications/Skills: Good conceptual knowledge and experience in other BI tools is an added advantage. Experienced Power BI developer, including experience in the Azure environment. Experience in implementing Power BI reports reading data from different data sources, including on-premise data servers, cloud services, and several file formats. Experience in creating reports, dashboards, and visualizations in Power BI. Competent with Power BI Desktop, Gateway, Service, and Report Server. Good experience in paginated reports using Report Builder. Experience in migrating SSRS/tabular reports to Power BI paginated reports. Good understanding of Power Query M and hands-on experience in building sophisticated DAX queries. Experience in implementing static and dynamic Row Level Security; extensive experience with dataset design, data cleansing, and data aggregation. Understanding of relational database structures, theories, principles, and practices. Solid SQL reporting skills, with the ability to create SQL views and write SQL queries to build custom datasets for reporting or analysis. Able to nurture robust working relationships with the team, peers, and clients; scheduling flexibility required. Overall, the candidate should have problem-solving skills, a macro-level research and analytic approach, and be good with numbers.

Job: Principal Consultant
Primary Location: India-Bangalore
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Mar 19, 2025, 10:19:53 PM
Unposting Date: Apr 18, 2025, 11:59:00 PM
Master Skills List: Consulting
Job Category: Full Time
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Position Type: Full time
Type of Hire: Experienced (relevant combo of work and education)
Education Desired: Bachelor of Computer Science
Travel Percentage: 0%

Principal/Sr. Lead Engineer - Automation

Are you curious, motivated, and forward-thinking? At FIS you'll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and above all fun.

About the team: FIS Protegent helps compliance officers cope with increasing data volumes, understand more sophisticated product risks, stay ahead of evolving regulations and rapidly respond to ongoing demands from auditors and regulators. The FIS Compliance Suite of products helps firms meet their compliance and regulatory obligations, from providing comprehensive surveillance for insider trading and market manipulation to assisting in supervisory controls and supporting management reporting.

What you will be doing: You will be a part of the FIS Compliance Suite (formerly Protegent) next-generation product development initiative, joining a team of highly motivated and focused developers to develop our next-generation compliance product. As a Senior Automation Engineer at FIS Global, you will play a key role in automation for cloud-based applications by understanding data flow and configurations for multiple environments to expedite release/build verifications and improve quality and stability. Leveraging your expertise in scripting, AWS, Jenkins and Snowflake, you will collaborate with cross-functional teams to solve challenging problems and drive strategic automation initiatives. You will be responsible for devising and utilizing an automation framework and DevOps best practices to set up an end-to-end automation pipeline to improve build and release quality; reviewing and validating data for uniformity and accuracy; analyzing results for failures; and interpreting them with clear objectives in mind. You will get an opportunity to work with multiple products and businesses and develop a good understanding of the interesting world of trading and compliance.

What you bring (Knowledge / Experience):
- Minimum of five years of experience in AWS, Snowflake and DevOps automation work, with a proven track record of delivering impactful solutions.
- Strong SQL skills.
- Proficiency in programming languages such as Python and Unix scripting, as well as experience with Jenkins build pipelines and release deployment automation.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into analytical solutions.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts.
- Experience working with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, GCP) is a plus.
- Demonstrated ability to work effectively in a collaborative team environment and manage multiple priorities in a fast-paced, dynamic setting.
- Proven ability to automate data creation work to reduce manual effort.

Responsibilities: Define an automation plan and own it end to end for release verification and CI/CD pipeline setup. Understand product architecture and workflow to build an optimized automation pipeline for continuous delivery. Work closely with product and solution management teams to understand business use cases and convert them into efficient automation setup and execution to reduce time to market.
Stay abreast of the latest advancements in DevOps and AWS automation to leverage the latest concepts and methodologies. Contribute to the development and implementation of best practices for DevOps, automation, and release deployment/verification. Set up new and upgrade existing environments for the automation pipeline, and monitor them for failure analysis in daily sanity and regression verification.

Qualifications: Bachelor's or Master's degree in computer science or a related field.

Competencies: Fluent in English. Excellent communicator - ability to discuss initiatives for automation and provide optimized solutions. Attention to detail and quality focus. Organized approach - manage and adapt priorities according to client and internal requirements. Self-starter but team mindset - works autonomously and as part of a global team.

What we offer you: A multifaceted job with a high degree of responsibility, visibility, and ownership. An opportunity to work with one of the fastest growing segments and products in the capital markets space, with great opportunity for growth. Strong exposure to the exciting trading and compliance space, with excellent learning opportunities. A broad range of professional education and personal development possibilities - FIS is your final career step! A competitive salary and benefits. A variety of career development tools, resources and opportunities.

With a 50-year history rooted in the financial services industry, FIS is the world's largest global provider dedicated to financial technology solutions. We champion clients from banking to capital markets, retail to corporate and everything touched by financial services. Headquartered in Jacksonville, Florida, our 53,000 worldwide employees help serve more than 20,000 clients in over 130 countries. Our technology powers billions of transactions annually that move over $9 trillion around the globe. FIS is a Fortune 500 company and is a member of the Standard & Poor's 500 Index.

Privacy Statement: FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the FIS Online Privacy Notice.

Sourcing Model: Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
This position at Iris Software in Noida, UP, India calls for a candidate with 3-4 years of working experience. The ideal candidate must have a strong background in Python Django and should also possess expertise in SQL, Snowflake, and DBT. As a part of the Iris Software team, you will be working on complex, mission-critical applications using cutting-edge technologies such as Python, Django, SQL, Snowflake, and more. The company's vision is to be the most trusted technology partner for its clients and create an environment where professionals can realize their full potential. Iris Software values its employees and offers a supportive work culture where individuals are encouraged to grow both professionally and personally. The company's Employee Value Proposition focuses on enabling employees to excel in their careers, be challenged by inspiring work, and be part of a culture that recognizes and nurtures talent. Joining Iris Software comes with a range of benefits aimed at supporting the financial, health, and well-being needs of its employees. From competitive salaries to comprehensive health insurance and flexible work arrangements, the company is committed to providing a rewarding work environment that fosters personal and professional growth. If you are looking to be a part of one of India's Top 25 Best Workplaces in the IT industry and want to work with a rapidly growing IT services company, Iris Software could be the place for you to do your best work and thrive in an award-winning work culture.
Posted 1 week ago
4.0 - 9.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Looking for a Data Engineer with 4+ years of experience. Skills: Azure functionalities, AWS Lambda, serverless, Python, APIs, Snowflake. Work from office - Bangalore (Yeshvanthpur), India.
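As a small illustration of the serverless/Python/API combination listed here, a minimal AWS Lambda handler sketch follows, using the standard API Gateway proxy event shape; the business logic is a hypothetical placeholder, not anything from the posting.

```python
# A minimal AWS Lambda handler sketch for an API Gateway proxy integration.
# The response shape follows the API Gateway convention; the logic is a
# hypothetical placeholder.
import json


def lambda_handler(event, context):
    # API Gateway proxy integration passes query parameters here
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```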
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing and inclusion and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst Best in Healthcare, Top 100 Best Companies for Women by Avtar & Seramount, and amongst Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India, with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Position Title: Senior Specialist
Reports to: Program Manager - Analytics BI

Position Summary: A Specialist shall work with the development team and be responsible for development tasks as an individual contributor. He/she should be able to mentor the team and help resolve issues, should be technically sound, and should communicate clearly with the client.

Key Duties & Responsibilities: Work as lead developer on the data engineering project for E2E analytics. Ensure project delivery on time. Mentor other teammates and guide them. Gather requirements from the client and handle client communication. Ensure timely creation of documents for the knowledge base, user guides, and other communication systems. Ensure delivery against business needs, team goals and objectives, i.e., meeting commitments and coordinating the overall schedule. Work with large datasets in various formats, integrity/QA checks, and reconciliation for accounting systems. Lead efforts to troubleshoot and solve process or system related issues. Understand, support, enforce and comply with company policies, procedures and Standards of Business Ethics and Conduct. Experience working with Agile methodology.

Experience, Skills and Knowledge: Bachelor's degree in Computer Science or equivalent experience is required; B.Tech/MCA preferable. Minimum 5-7 years of experience. Excellent communication skills and a strong commitment to delivering the highest level of service.

Technical Skills: Expert knowledge and experience working with Spark and Scala. Experience in Azure Data Factory, Azure Databricks, and Data Lake. Experience working with SQL and Snowflake. Experience with data integration tools such as SSIS and ADF. Experience with programming languages such as Python. Expert in Astronomer Airflow. Experience or exposure to Microsoft Azure Data Fundamentals.

Key Competency Profile: Own your development by implementing and sharing your learnings. Motivate each other to perform at our highest level. Work the right way by acting with integrity and living our values every day. Succeed by proactively identifying problems and solutions for yourself and others. Communicate effectively when there is any challenge. Demonstrate accountability and responsibility.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions.
Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit: r1rcm.com. Visit us on Facebook.
Posted 1 week ago
6.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Design and implement Snowflake Cortex solutions for advanced analytics and AI use cases. Optimize data pipelines, ensure security compliance, and support scalable deployments across cloud environments.
Posted 1 week ago
3.0 - 10.0 years
18 - 22 Lacs
Hyderabad
Work from Office
WHAT YOU'LL DO Lead the development of scalable data infrastructure solutions Leverage your data engineering expertise to support data stakeholders and mentor less experienced Data Engineers. Design and optimize new and existing data pipelines Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs WHAT YOU'LL NEED 8 to 10 years of hands-on experience building and maintaining large scale ETL systems Deep understanding of database design and data structures: SQL & NoSQL. Fluency in Python. Experience working with cloud-based data platforms (we use AWS) SQL and data warehousing skills -- able to write clean and efficient queries Ability to make pragmatic engineering decisions in a short amount of time Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results WHAT WILL MAKE YOU STAND OUT Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, & Looker. Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB. Comfort with containerization technologies: Docker, Kubernetes, etc.
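Since FastAPI appears in the stack listed above, here is a minimal FastAPI sketch of the kind of small read endpoint a data platform team might expose; the route and payload are hypothetical, not Codecademy's actual API. Run locally with `uvicorn app:app --reload`, assuming the file is named app.py.

```python
# A minimal FastAPI sketch: a hypothetical metrics read endpoint.
from fastapi import FastAPI

app = FastAPI()


@app.get("/metrics/{metric_name}")
def read_metric(metric_name: str, days: int = 7):
    # Placeholder: in practice this would query the warehouse (e.g., Snowflake)
    return {"metric": metric_name, "window_days": days, "values": []}
```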
Posted 1 week ago
5.0 - 10.0 years
10 - 12 Lacs
Navi Mumbai
Work from Office
Hello Candidates, we are hiring!

Job Position: Data Engineer
Experience: 5+ years
Location: Navi Mumbai (Juinagar)
Work Mode: WFO

Job Description: We are looking for an experienced and results-driven Senior Data Engineer to join our Data Engineering team. In this role, you will design, develop, and maintain robust data pipelines and infrastructure that enable efficient data flow across our systems. As a senior contributor, you will also help define best practices, mentor junior team members, and contribute to the long-term vision of our data platform. You will work closely with cross-functional teams to deliver reliable, scalable, and high-performance data systems that support critical business intelligence and analytics initiatives.

Responsibilities: Design, build, and maintain scalable ETL/ELT pipelines to support analytics, the data warehouse, and business operations. Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions. Develop and manage data models, data lakes, and data warehouse solutions in cloud environments (e.g., AWS, Azure, GCP). Monitor and optimize the performance of data pipelines and storage systems. Ensure data quality, integrity, and security across all platforms. Optimize and tune SQL queries and ETL jobs for performance and scalability. Collaborate with business analysts, data scientists, and stakeholders to understand requirements and deliver data solutions. Contribute to architectural decisions and development standards across the data engineering team. Participate in code reviews and provide guidance to junior developers. Leverage tools such as Airflow, Spark, Kafka, dbt, or Snowflake to build modern data infrastructure. Ensure data accuracy, completeness, and integrity across systems. Implement best practices in data governance, security, and compliance (e.g., GDPR, HIPAA). Mentor junior developers and participate in peer code reviews. Create and maintain detailed technical documentation.

Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree is a plus. 5+ years of experience in data warehousing, ETL development, and data modeling. Strong hands-on experience with one or more databases: Snowflake, Redshift, SQL Server, Oracle, Postgres, Teradata, BigQuery. Proficiency in SQL and scripting languages (e.g., Python, Shell). Deep knowledge of data modeling techniques and ETL frameworks. Excellent communication, analytical thinking, and troubleshooting skills.

Preferred Qualifications: Experience with modern data stack tools like dbt, Fivetran, Stitch, Looker, Tableau, or Power BI. Knowledge of data lakes, lakehouses, and real-time data streaming (e.g., Kafka). Agile/Scrum project experience and version control using Git.

NOTE: Candidates can share their resume at shruti.a@talentsketchers.com
Posted 1 week ago
8.0 - 10.0 years
36 - 60 Lacs
Pune
Work from Office
We are seeking a data modeling professional with strong hands-on experience designing scalable, efficient data solutions and integrating them within enterprise systems. Benefits: Provident fund, health insurance.
Posted 1 week ago
2.0 - 7.0 years
12 - 16 Lacs
Hyderabad
Work from Office
WHAT YOU'LL DO Build scalable data infrastructure solutions Design and optimize new and existing data pipelines Integrate new data sources into our existing data architecture Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs WHAT YOU'LL NEED 3 to 5 years of hands-on experience building and maintaining large scale ETL systems Deep understanding of database design and data structures: SQL & NoSQL. Fluency in Python. Experience working with cloud-based data platforms (we use AWS) SQL and data warehousing skills -- able to write clean and efficient queries Ability to make pragmatic engineering decisions in a short amount of time Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results WHAT WILL MAKE YOU STAND OUT Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, & Looker. Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB. Comfort with containerization technologies: Docker, Kubernetes, etc.
Posted 1 week ago
6.0 - 10.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Responsibilities: Identify and understand business requirements for reporting; design and implement future-proof solutions, optimizing data delivery. Apply solution-based thinking in DWH design and in the integration/optimization of existing DWH designs. Together with the business end users and the contact persons of a specific domain/capability, be responsible for building a standard, sustainable DWH solution. Think along and be involved in the end-to-end DWH design (everything needed for it) to help build reports and dashboards for our partners in an insightful and efficient way. Support the ETL engineers in translating functional requirements into technical designs. Be responsible for the delivery and follow-up of solutions within lifecycle management. Coach/train team members on various processes - BI analysis. Profile: Should have at least 6+ years of experience in business requirements analysis (reporting) for BI projects. Should be able to understand and convert business (functional) requirements into logical and physical DWH models (star, snowflake) using the Kimball methodology. Should be proficient in DWH databases, with expertise in analyzing and writing complex SQL efficiently. Strong working experience in data modeling tools - erwin Data Modeler or SAP PowerDesigner - is a must. Assemble large, complex data sets that meet functional/non-functional business requirements.
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), Apache Spark, Databricks Unified Data Analytics Platform
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to identify data needs and optimize data workflows, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support business needs.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Good-to-Have Skills: Experience with Databricks Unified Data Analytics Platform, Apache Spark, Python (Programming Language).
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud data warehousing solutions and big data technologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 week ago