7.0 - 12.0 years
18 - 22 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Role: Data Engineer
Work Mode: Work from home
Total Experience: 7+ years
Relevant Experience: 5+ years
Primary Skills:
- Design & Build Data Pipelines: Develop scalable ETL/ELT workflows to ingest, transform, and load data into Snowflake using SQL, Python, or data integration tools (see the sketch below).
- Data Modeling: Create and optimize Snowflake schemas, tables, views, and materialized views to support business analytics and reporting needs.
- Performance Optimization: Tune Snowflake compute resources (warehouses), optimize query performance, and manage clustering and partitioning strategies.
- Data Quality & Validation
- Security & Access Control
- Automation & CI/CD
- Monitoring & Troubleshooting
- Documentation
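As a rough illustration of the kind of ELT step described above, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, stage, and table names are placeholders, not details from the posting.

```python
# Minimal ELT sketch: load staged files into Snowflake and upsert into a curated table.
# Assumes snowflake-connector-python; all identifiers below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load new files from an external stage into a raw staging table.
    cur.execute("""
        COPY INTO RAW.ORDERS_STAGING
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Upsert the staged rows into the curated table.
    cur.execute("""
        MERGE INTO CURATED.ORDERS AS t
        USING RAW.ORDERS_STAGING AS s
        ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.UPDATED_AT = s.UPDATED_AT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
            VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT)
    """)
finally:
    conn.close()
```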
Posted 5 days ago
6.0 - 10.0 years
15 - 30 Lacs
Gurugram
Work from Office
We are specifically looking for candidates with strong SQL skills, along with experience in Snowflake or Looker.
Posted 5 days ago
3.0 - 5.0 years
3 - 7 Lacs
New Delhi, Bengaluru
Work from Office
As an Offshore Sales and Marketing Professional at XO Tek, you will play a pivotal role in accelerating our growth by leveraging your expertise in IT software professional services sales. You will be responsible for generating leads, nurturing relationships, and closing deals to meet a sales target of $1MM USD. Your deep understanding of the technology landscape and existing network will be crucial in offering our specialized services to a broader audience.

Responsibilities:
- Develop and implement effective sales strategies to achieve a sales target of $1MM USD.
- Utilize CRM tools, such as LinkedIn Sales Navigator, to manage and grow sales pipelines efficiently.
- Create and maintain a sales run book for replicable success in lead generation and conversion.
- Leverage existing relationships and networks to offer XO Tek's services, identifying new business opportunities.
- Work closely with the marketing team to develop targeted campaigns that align with sales strategies.
- Provide detailed sales forecasting and track sales activities to ensure targets are met.
- Stay abreast of industry trends and competitive landscapes to position XO Tek as a leader in IT software and services.

Requirements:
- Proven track record of generating and meeting sales targets of at least $1MM USD, with evidence to support this achievement.
- Established relationships and networks within the IT and technology sectors, with a focus on software and professional services sales.
- Demonstrated experience in generating leads that convert into sales, with a strategic approach to business development.
- Proficiency in utilizing CRM tools, specifically LinkedIn Sales Navigator, for sales management and lead generation.
- Experience in creating and utilizing a sales run book to drive lead generation and sales efforts.
- Excellent communication and interpersonal skills, with the ability to engage effectively with clients and team members.
- Self-motivated with a results-driven approach, capable of working independently in a remote setting.
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities

Key Responsibilities:
- Design, develop, and optimize stored procedures, functions, and database triggers.
- Perform regular database backups, recovery, and maintenance tasks.
- Ensure data integrity, security, and performance tuning of databases.
- Manage structured data using Oracle and unstructured data using MongoDB.
- Develop and maintain data models, database schemas, and ETL pipelines in Snowflake.
- Collaborate with application developers and DevOps teams for seamless integration.
- Troubleshoot database issues and support performance analysis.

Must-Have Skills:
- Strong experience in Oracle (SQL/PLSQL), including stored procedure development.
- Proficiency in MongoDB for handling unstructured datasets.
- Experience in Snowflake for data warehousing and cloud-based analytics.
- Solid understanding of database administration and backup/recovery concepts.
- Excellent problem-solving and communication skills.
Posted 5 days ago
4.0 - 8.0 years
15 - 27 Lacs
Pune
Hybrid
Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools.
- Build and automate data integration workflows that extract, transform, and load data from various sources, including Oracle EBS and other enterprise systems.
- Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods.
- Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders.
- Work with IT and Business teams to gather reporting requirements and translate them into scalable technical solutions.
- Participate in data modeling and storage architecture using star and snowflake schema designs.
- Contribute to the implementation of data governance, metadata management, and access control mechanisms.
- Maintain documentation for solutions and participate in testing and validation activities.
- Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture.
- Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes.

Must-have skills:
- 5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment.
- Proficient in ETL tools, SQL, and data warehouse development.
- Proficient in Snowflake, Power BI, and OBIEE reporting platforms; must have implementation experience with these tools and technologies.
- Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases.
- Working knowledge of Oracle databases and Oracle EBS structures.

Preferred Skills:
- Experience with Qlik Replicate, data replication, or data migration tools.
- Familiarity with data governance, data quality frameworks, and metadata management.
- Exposure to cloud-based architectures, Big Data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB).
- Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.

Why Join Cummins?
- Opportunity to work with a global leader in power solutions and digital transformation.
- Be part of a collaborative and inclusive team culture.
- Access to cutting-edge data platforms and tools.
- Exposure to enterprise-scale data challenges and finance domain expertise.
- Drive impact through data innovation and process improvement.
Posted 5 days ago
5.0 - 10.0 years
10 - 16 Lacs
Navi Mumbai, Mumbai (All Areas)
Work from Office
Designation: Senior Data Engineer
Experience: 5+ years
Location: Navi Mumbai (Juinagar) - WFO
Immediate joiners preferred.
Interview: Face-to-face (only a 1-day process)

Job Description
We are looking for an experienced and results-driven Senior Data Engineer to join our Data Engineering team. In this role, you will design, develop, and maintain robust data pipelines and infrastructure that enable efficient data flow across our systems. As a senior contributor, you will also help define best practices, mentor junior team members, and contribute to the long-term vision of our data platform. You will work closely with cross-functional teams to deliver reliable, scalable, and high-performance data systems that support critical business intelligence and analytics initiatives.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree is a plus.
- 5+ years of experience in data warehousing, ETL development, and data modeling.
- Strong hands-on experience with one or more databases: Snowflake, Redshift, SQL Server, Oracle, Postgres, Teradata, BigQuery.
- Proficiency in SQL and scripting languages (e.g., Python, Shell).
- Deep knowledge of data modeling techniques and ETL frameworks.
- Excellent communication, analytical thinking, and troubleshooting skills.

Preferred Qualifications:
- Experience with modern data stack tools like dbt, Fivetran, Stitch, Looker, Tableau, or Power BI.
- Knowledge of data lakes, lakehouses, and real-time data streaming (e.g., Kafka).
- Agile/Scrum project experience and version control using Git.

Sincerely,
Sonia TS
Posted 5 days ago
10.0 - 15.0 years
22 - 37 Lacs
Bengaluru
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.

As an AWS Data Engineer at Kyndryl, you will be responsible for designing, building, and maintaining scalable, secure, and high-performing data pipelines using AWS cloud-native services. This role requires extensive hands-on experience with both real-time and batch data processing, expertise in cloud-based ETL/ELT architectures, and a commitment to delivering clean, reliable, and well-modeled datasets.

Key Responsibilities:
- Design and develop scalable, secure, and fault-tolerant data pipelines utilizing AWS services such as Glue, Lambda, Kinesis, S3, EMR, Step Functions, and Athena.
- Create and maintain ETL/ELT workflows to support both structured and unstructured data ingestion from various sources, including RDBMS, APIs, SFTP, and streaming.
- Optimize data pipelines for performance, scalability, and cost-efficiency.
- Develop and manage data models, data lakes, and data warehouses on AWS platforms (e.g., Redshift, Lake Formation).
- Collaborate with DevOps teams to implement CI/CD and infrastructure as code (IaC) for data pipelines using CloudFormation or Terraform.
- Ensure data quality, validation, lineage, and governance through tools such as AWS Glue Data Catalog and AWS Lake Formation.
- Work in concert with data scientists, analysts, and application teams to deliver data-driven solutions.
- Monitor, troubleshoot, and resolve issues in production pipelines.
- Stay abreast of AWS advancements and recommend improvements where applicable.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Skills and Experience:
- Bachelor's or master's degree in computer science, engineering, or a related field
- Over 8 years of experience in data engineering
- More than 3 years of experience with the AWS data ecosystem
- Strong experience with Java, PySpark, SQL, and Python
- Proficiency in AWS services: Glue, S3, Redshift, EMR, Lambda, Kinesis, CloudWatch, Athena, Step Functions (see the Athena sketch after this posting)
- Familiarity with data modelling concepts, dimensional models, and data lake architectures
- Experience with CI/CD, GitHub Actions, CloudFormation/Terraform
- Understanding of data governance, privacy, and security best practices
- Strong problem-solving and communication skills

Preferred Skills and Experience:
- Experience working as a Data Engineer and/or in cloud modernization.
- Experience with AWS Lake Formation and Data Catalog for metadata management.
- Knowledge of Databricks, Snowflake, or BigQuery for data analytics.
- AWS Certified Data Engineer or AWS Certified Solutions Architect is a plus.
- Strong problem-solving and analytical thinking.
- Excellent communication and collaboration abilities.
- Ability to work independently and in agile teams.
- A proactive approach to identifying and addressing challenges in data workflows.

Being You
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
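For illustration of the AWS data-lake querying skills listed above, here is a hedged sketch of running an Athena query from Python with boto3; the region, database, table, and S3 output locations are placeholders, not values from the posting.

```python
# Sketch: run an Athena query against a data-lake table and poll for completion.
# boto3 is assumed; database, table, and bucket names are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="SELECT order_date, SUM(amount) AS total FROM sales GROUP BY order_date",
    QueryExecutionContext={"Database": "analytics_lake"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query finishes, then fetch the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```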
Posted 5 days ago
7.0 - 11.0 years
15 - 30 Lacs
Noida
Remote
Job Title: IoT Solutions Architect (MQTT/HiveMQ) Consultant
Location: 100% remote
Notes: Consumer goods and manufacturing experience are highly preferred. Comfortable working in the US time zone.

Job Description
The consultant will work on a new MQTT/HiveMQ setup for an IoT smart manufacturing project. Cloud platform: Azure. Must have 3-4 years of experience in manufacturing solutions and a minimum of 2 years of experience with HiveMQ, SQL, and cloud/edge integrations.

Required skills:
- Expertise in MQTT Protocols: Deep understanding of MQTT 3.1.1 and MQTT 5.0, including advanced features like QoS levels, retained messages, session expiry, and shared subscriptions.
- HiveMQ Platform Proficiency: Hands-on experience with HiveMQ broker setup, configuration, clustering, and deployment (on-premises, cloud, or Kubernetes).
- Edge-to-Cloud Integration: Ability to design and implement solutions that bridge OT (Operational Technology) and IT systems using MQTT.
- Sparkplug B Knowledge: Familiarity with Sparkplug B for contextual MQTT data in IIoT environments.
- Enterprise Integration: Experience with HiveMQ Enterprise Extensions (e.g., Kafka, Google Cloud Pub/Sub, AWS Kinesis, PostgreSQL, MongoDB, Snowflake).
- Security Implementation: Knowledge of securing MQTT deployments using the HiveMQ Enterprise Security Extension (authentication, authorization, TLS, etc.).
- Custom Extension Development: Ability to develop and deploy custom HiveMQ extensions using the open-source SDK.

Development & Scripting:
- MQTT Client Libraries: Proficiency in using MQTT client libraries (e.g., Eclipse Paho, HiveMQ MQTT Client) in languages like Java, Python, or JavaScript (see the sketch below).
- MQTT CLI: Familiarity with the MQTT Command Line Interface for testing and debugging.
- Scripting & Automation: Ability to automate deployment and testing using tools like HiveMQ Swarm.

Soft Skills & Experience:
- IoT/IIoT Project Experience: Proven track record in implementing MQTT-based IoT solutions.
- Problem Solving & Debugging: Strong analytical skills to troubleshoot MQTT communication and broker issues.
- Communication & Documentation: Ability to clearly document architecture, configurations, and best practices for clients.

Interested candidates can apply: dsingh15@fcsltd.com
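The posting names Eclipse Paho among the MQTT client libraries. As a hedged illustration, here is a minimal publish/subscribe sketch with paho-mqtt; the broker host, topic, and payload are placeholders, and the paho-mqtt 1.x constructor style is assumed.

```python
# Minimal MQTT sketch with Eclipse Paho (paho-mqtt 1.x constructor style assumed;
# 2.x additionally takes a callback-API-version argument). Broker, topic, and
# payload values are placeholders, not details from the posting.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"   # hypothetical HiveMQ broker endpoint
TOPIC = "plant1/line3/temperature"

def on_connect(client, userdata, flags, rc):
    print(f"Connected with result code {rc}")
    client.subscribe(TOPIC, qos=1)

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(client_id="demo-edge-gateway")
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)

# Publish a reading with QoS 1 so the broker acknowledges delivery.
client.publish(TOPIC, json.dumps({"value": 72.4, "unit": "C"}), qos=1)
client.loop_forever()
```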
Posted 5 days ago
7.0 - 12.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Skill: Data Engineer
Experience: 7+ years
Location: Warangal, Bangalore, Chennai, Hyderabad, Mumbai, Pune, Delhi, Noida, Gurgaon, Kolkata, Jaipur, Jodhpur
Notice Period: Immediate - 15 days

Job Description:
- Design & Build Data Pipelines: Develop scalable ETL/ELT workflows to ingest, transform, and load data into Snowflake using SQL, Python, or data integration tools.
- Data Modeling: Create and optimize Snowflake schemas, tables, views, and materialized views to support business analytics and reporting needs.
- Performance Optimization: Tune Snowflake compute resources (warehouses), optimize query performance, and manage clustering and partitioning strategies.
- Data Quality & Validation
- Security & Access Control
- Automation & CI/CD
- Monitoring & Troubleshooting
- Documentation
Posted 5 days ago
5.0 - 10.0 years
16 - 20 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Role & responsibilities: Talend, Snowflake
Posted 5 days ago
7.0 - 12.0 years
15 - 27 Lacs
Pune
Hybrid
Notice Period: Immediate joiner

Responsibilities:
- Lead, develop and support analytical pipelines to acquire, ingest and process data from multiple sources.
- Debug, profile and optimize integrations and ETL/ELT processes.
- Design and build data models to conform to our data architecture.
- Collaborate with various teams to deliver effective, high-value reporting solutions by leveraging an established DataOps delivery methodology.
- Continually recommend and implement process improvements and tools for data collection, analysis, and visualization.
- Address production support issues promptly, keeping stakeholders informed of status and resolutions.
- Partner closely with on- and offshore technical resources.
- Provide on-call support outside normal business hours as needed.
- Provide status updates to stakeholders; identify obstacles and seek assistance with enough lead time to ensure on-time delivery.
- Demonstrate technical ability, thoroughness, and accuracy in all assignments.
- Document and communicate proper operations, standards, policies, and procedures.
- Keep abreast of all new tools and technologies related to our enterprise data architecture.
- Foster a positive work environment by promoting teamwork and open communication.

Skills/Qualifications:
- Bachelor's degree in computer science with a focus on data engineering preferable.
- 6+ years of experience in data warehouse development, building and managing data pipelines in cloud computing environments.
- Strong proficiency in SQL and Python.
- Experience with Azure cloud services, including Azure Data Lake Storage, Data Factory, and Databricks.
- Expertise in Snowflake or similar cloud warehousing technologies.
- Experience with GitHub, including GitHub Actions.
- Familiarity with data visualization tools, such as Power BI or Spotfire.
- Excellent written and verbal communication skills.
- Strong team player with interpersonal skills to interact at all levels.
- Ability to translate technical information for both technical and non-technical audiences.
- Proactive mindset with a sense of urgency and initiative.
- Adaptability to changing priorities and needs.

If you are interested, share your updated resume at recruit5@focusonit.com. Also, please spread this message across your network and contacts.
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
As a highly motivated and experienced Data Engineer, you will be responsible for designing, developing, and implementing solutions that enable seamless data integration across multiple cloud platforms. Your expertise in data lake architecture, Iceberg tables, and cloud compute engines like Snowflake, BigQuery, and Athena will ensure efficient and reliable data access for various downstream applications. Your key responsibilities will include collaborating with stakeholders to understand data needs and define schemas, designing and implementing data pipelines for ingesting, transforming, and storing data. You will also be developing data transformation logic to make Iceberg tables compatible with the data access requirements of Snowflake, BigQuery, and Athena, as well as designing and implementing solutions for seamless data transfer and synchronization across different cloud platforms. Ensuring data consistency and quality across the data lake and target cloud environments will be crucial in your role. Additionally, you will be analyzing data patterns and identifying performance bottlenecks in data pipelines, implementing data optimization techniques to improve query performance and reduce data storage costs, and monitoring data lake health to proactively address potential issues. Collaboration and communication with architects, leads, and other stakeholders to ensure data quality meet specific requirements will also be an essential part of your role. To be successful in this position, you should have a minimum of 4+ years of experience as a Data Engineer, strong hands-on experience with data lake architectures and technologies, proficiency in SQL and scripting languages, and experience with data governance and security best practices. Excellent problem-solving and analytical skills, strong communication and collaboration skills, and familiarity with cloud-native data tools and services are also required. Additionally, certifications in relevant cloud technologies will be beneficial. In return, GlobalLogic offers exciting projects in industries like High-Tech, communication, media, healthcare, retail, and telecom. You will have the opportunity to collaborate with a diverse team of highly talented individuals in an open, laidback environment. Work-life balance is prioritized with flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional development opportunities include Communication skills training, Stress Management programs, professional certifications, and technical and soft skill trainings. GlobalLogic provides competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS(National Pension Scheme), extended maternity leave, annual performance bonuses, and referral bonuses. Fun perks such as sports events, cultural activities, food on subsidized rates, corporate parties, dedicated GL Zones, rooftop decks, and discounts for popular stores and restaurants are also part of the vibrant office culture at GlobalLogic. About GlobalLogic: GlobalLogic is a leader in digital engineering, helping brands design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, GlobalLogic helps clients accelerate their transition into tomorrow's digital businesses. 
Operating under Hitachi, Ltd., GlobalLogic contributes to driving innovation through data and technology for a sustainable society with a higher quality of life.
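For the Iceberg-table access work described in this posting, here is a hedged sketch of reading an Apache Iceberg table from Python with PyIceberg, assuming a Glue-backed catalog; the catalog configuration, namespace, and table names are placeholders.

```python
# Sketch: read an Apache Iceberg table with PyIceberg (pyiceberg package assumed).
# The Glue catalog configuration and the table identifier are hypothetical.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "lake",
    **{
        "type": "glue",            # assumes an AWS Glue-backed Iceberg catalog
        "warehouse": "s3://my-data-lake/warehouse/",
    },
)

table = catalog.load_table("analytics.orders")

# Scan a subset of columns and materialize to pandas for downstream validation.
df = table.scan(selected_fields=("order_id", "status", "updated_at")).to_pandas()
print(df.head())
```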
Posted 5 days ago
6.0 - 12.0 years
0 Lacs
karnataka
On-site
Your role as a Supervisor at Koch Global Services India (KGSI) will involve being part of a global team dedicated to creating new solutions and enhancing existing ones for Koch Industries. With over 120,000 employees worldwide, Koch Industries is a privately held organization engaged in manufacturing, trading, and investments. KGSI is being established in India to expand its IT operations and serve as an innovation hub within the IT function. This position offers the chance to join at the inception of KGSI and play a pivotal role in its development over the coming years. You will collaborate closely with international colleagues, providing valuable global exposure to the team. In this role, you will lead a team responsible for developing innovative solutions for KGS and its customers. You will oversee the performance and growth of data engineers at KGSI, ensuring the delivery of application solutions. Collaboration with global counterparts will be essential for enterprise-wide delivery success. Your responsibilities will include mentoring team members, providing feedback, and coaching them for their professional growth. Additionally, you will focus on understanding individual career aspirations, addressing challenges, and facilitating relevant training opportunities. Ensuring compensation aligns with Koch's philosophy and maintaining effective communication with HR will be key aspects of your role. Timely delivery of projects is crucial, and you will be responsible for identifying and addressing delays proactively. By fostering knowledge sharing and best practices within the team, you will contribute to the overall success of KGSI. Staying updated on market trends, talent acquisition, and talent retention strategies will be vital for your role. Your ability to lead by example, communicate effectively, and solve problems collaboratively will be essential in driving team success. To qualify for this role, you should hold a Bachelor's or Master's degree in computer science or information technology with a minimum of 12 years of IT experience, including leadership roles in integration teams. A solid background in data engineering, AWS cloud migration, and team management is required. Strong communication skills, customer focus, and a proactive mindset towards innovation are essential for success in this position. Experience with AWS Lambda, Glue, ETL projects, Python, SQL, and BI tools will be advantageous. Familiarity with manufacturing business processes and exposure to Scrum Master practices would be considered a plus. Join Koch Global Services (KGS) to be part of a dynamic team that creates solutions to support various business functions worldwide. With a global presence in India, Mexico, Poland, and the United States, KGS empowers employees to make a significant impact on a global scale.,
Posted 5 days ago
1.0 - 5.0 years
0 Lacs
pune, maharashtra
On-site
As a Software Engineer at Mastercard, you will play a crucial role in supporting applications software by utilizing your programming, analysis, design, development, and delivery skills to provide innovative software solutions. You will be involved in designing highly scalable, fault-tolerant, and performant systems in the cloud, ensuring that project implementations and technical deliveries align with solution architectural design and best practices. Your responsibilities will include collaborating with stakeholders to understand business needs, evaluating emerging technologies, providing technical guidance to project teams, and supporting services before and after they go live. You will be expected to analyze ITSM activities, maintain system health, scale systems sustainably through automation, and evolve systems for improved reliability and velocity. Your role will involve explaining technical issues and solution strategies to stakeholders, assisting with project scoping, planning, and estimation, and staying up to date with new technologies through self-study and participation in relevant events. This position requires a minimum bachelor's degree in information technology, Computer Science, or equivalent work experience, along with at least 2 years of hands-on software development experience and familiarity with software and microservices architecture. The ideal candidate should have a current understanding of best practices in application and system security, experience in data analytics, ETL, data modeling, and pattern analysis, and be willing to learn new technology stacks. Strong domain knowledge of Java 8 (or later), experience with relational and NoSQL databases, and proficiency in user interface development frameworks, particularly Angular, are desired. Excellent communication skills, a collaborative mindset, and the ability to work effectively in a global team across different time zones are essential for success in this role. At Mastercard, corporate security responsibility is paramount, and it is expected that every individual takes ownership of information security by abiding by security policies, ensuring confidentiality and integrity of accessed information, reporting any security violations or breaches, and completing mandatory security trainings. If you are a motivated software engineer who thrives in a collaborative and innovative environment, eager to tackle challenging business problems using cutting-edge technologies, and keen on contributing to the growth of a dynamic company, we invite you to be part of our team and drive our solutions to the next level.,
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Location: Pune (Hybrid)
Experience: 5+ years

Key Responsibilities:
- Data Pipeline Architecture: Build and optimize large-scale data ingestion pipelines from multiple sources.
- Scalability & Performance: Ensure low-latency, high-throughput data processing for real-time and batch workloads.
- Cloud Infrastructure: Design and implement cost-effective, scalable data storage solutions.
- Automation & Monitoring: Implement observability tools for pipeline health, error handling, and performance tracking.
- Security & Compliance: Ensure data encryption, access control, and regulatory compliance in the data platform.

Ideal Candidate Profile:
- Strong experience in Snowflake, dbt, and AWS for large-scale data processing.
- Expertise in Python, Airflow, and Spark for orchestrating pipelines (see the sketch below).
- Deep understanding of data architecture principles for real-time and batch workloads.
- Hands-on experience with Kafka, Kinesis, or similar streaming technologies.
- Ability to work on cloud cost optimizations and infrastructure-as-code (Terraform, CloudFormation).
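As a rough illustration of the orchestration stack named above (Airflow driving dbt transformations against Snowflake), here is a minimal DAG sketch; the dbt project paths, connection setup, schedule, and task layout are assumptions, not details from the posting.

```python
# Minimal Airflow 2.x DAG sketch: ingest raw data, then run dbt models against Snowflake.
# Paths, schedules, and the ingest step are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def ingest_raw_files(**context):
    # Placeholder for an extraction step (e.g., pull files from S3 or an API).
    print("ingesting raw files for", context["ds"])

with DAG(
    dag_id="snowflake_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_files", python_callable=ingest_raw_files)

    # Run dbt models that build curated Snowflake tables from the raw layer.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    ingest >> dbt_run >> dbt_test
```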
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
delhi
On-site
As a Snowflake DBT Lead at Pyramid Consulting, you will be responsible for overseeing Snowflake data transformation and validation processes in Delhi, India. Your role will include ensuring efficient data handling, maintaining data quality, and collaborating closely with cross-functional teams. To excel in this role, you should have strong expertise in Snowflake, DBT, and SQL. Your experience in data transformation, modeling, and validation will be crucial for success. Proficiency in ETL processes and data warehousing is essential to meet the job requirements. Your excellent problem-solving and communication skills will enable you to effectively address challenges and work seamlessly with team members. As a candidate for this position, you should hold a Bachelor's degree in Computer Science or a related field. Your ability to lead and collaborate within a team environment will be key to delivering high-quality solutions and driving impactful results for our clients.
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
Optum is a global organization dedicated to delivering care using technology to improve the lives of millions of people. Your work with our team will directly enhance health outcomes by providing individuals with access to care, pharmacy benefits, data, and resources necessary for their well-being. Our culture is defined by diversity and inclusion, alongside talented colleagues, comprehensive benefits, and opportunities for career development. Join us in making a positive impact on the communities we serve while contributing to the advancement of global health equity through caring, connecting, and growing together. In this role, your primary responsibilities will include analyzing client requirements and complex business scenarios, designing innovative and fully automated products and solutions, serving as a BI Developer for key projects, ensuring high-quality execution of products, providing consulting to teammates, leaders, and clients, and offering extensive solutions in ETL strategies. You should possess an undergraduate degree or equivalent experience, along with expertise in ETL processes and data integration using Azure Data Factory. Proficiency in Power BI semantic model creation, report development, and data visualization is required, with Snowflake and Azure Data Warehouse as primary data sources. Additionally, you should have a strong understanding of data modeling concepts, relational database systems, Snowflake, and Azure Data Warehouse. Familiarity with Databricks for data engineering, advanced analytics, and machine learning tasks is preferred, as well as proficiency in Azure Cloud services such as Azure Data Factory, Azure SQL Data Warehouse, Azure Data Lake Storage, and Azure Analytics. Solid programming skills in Python, SQL, and other scripting languages are essential, along with proven problem-solving abilities, effective communication and collaboration skills, and the capacity to manage multiple tasks simultaneously. Microsoft certifications in Power BI, Azure Cloud, Snowflake, or related fields are a plus. The role is based in Hyderabad, Telangana, IN.,
Posted 5 days ago
8.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
At Techwave, we are committed to fostering a culture of growth and inclusivity. We ensure that every individual associated with our brand is challenged at every step and provided with the necessary opportunities to excel in their professional and personal lives. People are at the core of everything we do. Techwave is a leading global IT and engineering services and solutions company dedicated to revolutionizing digital transformations. Our mission is to enable clients to maximize their potential and achieve a greater market share through a wide array of technology services, including Enterprise Resource Planning, Application Development, Analytics, Digital solutions, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to help businesses accelerate their growth. We are a team of dreamers and doers who constantly push the boundaries of what's possible, and we want YOU to be a part of it. Job Title: Data Lead Experience: 10+ Years Mode of Hire: Full-time Key Skills: As a senior-level ETL developer with 10-13 years of experience, you will be responsible for building relational and data warehousing applications. Your primary role will involve supporting the existing EDW, designing and developing various layers of our data, and testing, documenting, and optimizing the ETL process. You will collaborate within a team environment to design and develop frameworks and services according to specifications. Your responsibilities will also include preparing detailed system documentation, performing unit and system tests, coordinating with Operations staff on application deployment, and ensuring that all activities are performed with quality and compliance standards. Additionally, you will design and implement ETL batches that meet SLAs, develop data collection, staging, movement, quality, and archiving strategies, and design automation processes to control data access and movement. To excel in this role, you must have 8-10 years of ETL/ELT experience, strong SQL skills, and proficiency in Stored Procedures and database development. Experience in Azure Data Lake, Synapse, Azure Data Factory, and Databricks, as well as Snowflake, is essential. You should possess a good understanding of data warehouse ETL and ELT design best practices, be able to work independently, and have a strong database experience with DB2, SQL Server, and Azure. Furthermore, you should be adept at designing Relational and Dimensional Data models, have a good grasp of Enterprise reporting (particularly Power BI), and understand Agile practices and methodologies. Your role will also involve assisting in analyzing and extracting relevant information from historical business data to support Business Intelligence initiatives and conducting Proof of Concept for new technology selection and proposing data warehouse architecture enhancements. If you are a self-starter with the required skills and experience, we invite you to join our dynamic team at Techwave and be a part of our journey towards innovation and excellence.,
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Data Engineer with 8-12 years of experience, you will be responsible for handling a variety of tasks for one of our clients in Hyderabad. This is a full-time position with an immediate start date. Your proficiency in Python, Spark, SQL, Snowflake, Airflow, AWS, and DBT will be essential for this role. In this role, you will be expected to work on a range of data engineering tasks using the specified skill set. Your expertise in these technologies will be crucial in developing efficient data pipelines, ensuring data quality, and optimizing data workflows. Furthermore, you will collaborate with cross-functional teams to understand data requirements, design and implement data solutions, and provide technical guidance on best practices. Your ability to communicate effectively and work well in a team setting will be key to your success in this role. If you are interested in this opportunity and possess the required skill set, please share your profile with us at srujanat@teizosoft.com. We look forward to potentially having you join our team in Hyderabad.
Posted 5 days ago
3.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Talend Data Engineer specializing in Talend and Snowflake, you will be responsible for developing and maintaining ETL workflows using Talend and Snowflake. Your role will involve designing and optimizing data pipelines to ensure performance and scalability. Collaboration with various teams to address data integration and reporting requirements will be crucial. Your focus will also include ensuring data quality, governance, and security protocols. To excel in this role, you should possess 3 to 9 years of experience working with Talend ETL and Snowflake. Proficiency in SQL, Python, and working knowledge of cloud platforms such as AWS, Azure, and Google Cloud is essential. Previous experience in constructing end-to-end data pipelines and familiarity with data warehouses are key requirements for this position. Experience in Snowflake performance tuning and an understanding of Agile methodologies are considered advantageous for this role. This is a full-time position that follows a day shift schedule from Monday to Friday, requiring your presence at the office in Nagpur, Pune, Bangalore, or Chennai. Join us in this dynamic opportunity to leverage your expertise in Talend and Snowflake to drive impactful data solutions while collaborating with cross-functional teams to meet business objectives effectively.,
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you. As a Director of Software Engineering at JPMorgan Chase within the Asset and Wealth Management LOB, you lead a data technology area and drive impact within teams, technologies, and deliveries. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex initiatives, while serving as a primary decision maker for your teams and a driver of engineering innovation and solution delivery. The current role focuses on delivering data solutions for some of the Wealth Management businesses.

Job responsibilities:
- Leads engineering and delivery of data and analytics solutions
- Makes decisions that influence teams' resources, budget, tactical operations, and the execution and implementation of processes and procedures
- Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership & maintainability
- Delivers technical solutions that can be leveraged across multiple businesses and domains
- Influences and collaborates with peer leaders and senior stakeholders across the business, product, and technology teams
- Champions the firm's culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
- Experience managing data solutions across a large, global consumer community in the Financial Services domain
- Experience hiring, developing and leading cross-functional teams of technologists
- Experience handling multiple, global stakeholders across business, technology and product
- Appreciation of the data product: modeling, sourcing, quality, lineage, discoverability, access management, visibility, purging, etc.
- Experience researching and upgrading to the latest technologies in the continuously evolving data ecosystem
- Practical hybrid cloud-native experience, preferably AWS
- Experience using current technologies, such as GraphQL, Glue, Spark, Snowflake, SNS, SQS, Kinesis, Lambda, ECS, EventBridge, QlikSense, etc.
- Experience with Java and/or Python programming languages
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field

Preferred qualifications, capabilities, and skills:
- Comfortable being hands-on as required to drive solutions and solve challenges for the team
- Exposure and appreciation of the continuously evolving data science space
- Exposure to the Wealth Management business
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Location: Pune/Nagpur. Immediate joiners only.

Job Description
Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake and other cloud technologies.
- Work with large datasets, ensuring their availability, quality, and performance across systems.
- Implement data models and optimize storage and query performance on Snowflake.
- Write complex SQL queries for data extraction, transformation, and reporting purposes.
- Develop, test, and implement data solutions leveraging Python scripting and Snowflake's native features.
- Collaborate with data scientists, analysts, and business stakeholders to deliver scalable data solutions.
- Monitor and troubleshoot data pipelines to ensure smooth operation and efficiency.
- Perform data migration, integration, and processing tasks across cloud platforms.
- Stay updated with the latest developments in Snowflake, SQL, and cloud technologies.

Required Skills:
- Snowflake: Expertise in building, optimizing, and managing data warehousing solutions on Snowflake.
- SQL: Strong knowledge of SQL for querying and managing relational databases, writing complex queries, stored procedures, and performance tuning.
- Python: Proficiency in Python for scripting, automation, and integration within data pipelines.
- Experience in developing and managing ETL processes and ensuring data accuracy and performance.
- Hands-on experience with data migration and integration processes across cloud platforms.
- Familiarity with data security and governance best practices.
- Strong problem-solving skills with the ability to troubleshoot and resolve data-related issues.

(ref:hirist.tech)
Posted 6 days ago
10.0 - 17.0 years
0 Lacs
hyderabad, telangana
On-site
We have an exciting opportunity for an ETL Data Architect position with an AI-ML driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access needs to the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems. In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving. The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, Document DB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders. Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in computer science, Information Systems, or related field is necessary, with a Master's degree being preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Stay updated on emerging trends in data technology, particularly in AI/ML applications for finance. Industry: IT Services and IT Consulting,
Posted 6 days ago
7.0 - 10.0 years
30 - 32 Lacs
Hyderabad
Work from Office
- 6+ years of Java development
- Strong knowledge of SQL
- Agile development methodologies
- Working experience with Snowflake and its native features (Snowpark, data shares) - illustrated in the sketch below
- Python
- Understanding of core AWS services and cloud infrastructure
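Since the posting calls out Snowpark among Snowflake's native features, here is a hedged sketch of a Snowpark for Python session and a pushed-down aggregation; the connection parameters and table names are placeholders, not details from the posting.

```python
# Snowpark for Python sketch (snowflake-snowpark-python package assumed).
# Connection parameters and table names are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

params = {
    "account": "my_account",
    "user": "svc_user",
    "password": "***",
    "warehouse": "ANALYTICS_WH",
    "database": "SALES",
    "schema": "PUBLIC",
}

session = Session.builder.configs(params).create()

# Push the aggregation down to Snowflake rather than pulling rows into Python.
daily_totals = (
    session.table("ORDERS")
    .filter(col("STATUS") == "SHIPPED")
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)

daily_totals.show()
session.close()
```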
Posted 6 days ago
10.0 - 20.0 years
20 - 30 Lacs
Pune
Remote
Role & responsibilities:
- Minimum 10+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for the Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques (see the sketch below).
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects using Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.
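For the clustering and query-optimization work mentioned above, here is a hedged sketch of defining a clustering key and inspecting clustering health through the Python connector; the table, column, and connection identifiers are placeholders.

```python
# Sketch: set a clustering key on a large Snowflake table and check clustering depth.
# Uses snowflake-connector-python; identifiers below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TUNING_WH", database="ANALYTICS", schema="CURATED",
)
cur = conn.cursor()

# Cluster the fact table on the columns most queries filter on.
cur.execute("ALTER TABLE FACT_ORDERS CLUSTER BY (ORDER_DATE, REGION)")

# Inspect how well micro-partitions align with the clustering key.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_ORDERS', '(ORDER_DATE, REGION)')"
)
print(cur.fetchone()[0])

conn.close()
```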
Posted 6 days ago