4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
DXFactor is a US-based tech company working with customers globally. We are a certified Great Place to Work and are currently seeking candidates for the role of Data Engineer with 4 to 6 years of experience. Our presence spans the US and India, specifically Ahmedabad. As a Data Engineer at DXFactor, you will be expected to specialize in Snowflake, AWS, and Python.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for both batch and streaming workflows.
- Implement robust ETL/ELT processes to extract data from diverse sources and load it into data warehouses (see the illustrative sketch below).
- Build and optimize database schemas following best practices in normalization and indexing.
- Create and update documentation for data flows, pipelines, and processes.
- Collaborate with cross-functional teams to translate business requirements into technical solutions.
- Monitor and troubleshoot data pipelines to ensure optimal performance.
- Implement data quality checks and validation processes.
- Develop and manage CI/CD workflows for data engineering projects.
- Stay updated with emerging technologies and suggest enhancements to existing systems.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- At least 4 years of experience in data engineering roles.
- Proficiency in Python programming and SQL query writing.
- Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Familiarity with data warehousing technologies such as Snowflake, Redshift, and BigQuery.
- Demonstrated ability in constructing efficient and scalable data pipelines.
- Practical knowledge of batch and streaming data processing methods.
- Experience in implementing data validation, quality checks, and error handling mechanisms.
- Work experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight).
- Understanding of various data architectures, including data lakes, data warehouses, and data mesh.
- Proven ability to debug complex data flows and optimize underperforming pipelines.
- Strong documentation skills and effective communication of technical concepts.
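For illustration only, here is a minimal sketch of the kind of batch ELT step this posting describes: loading staged files into Snowflake with Python. The environment variables, warehouse, database, table, and stage names are hypothetical placeholders, not details from the listing.

```python
# Illustrative only: a minimal batch ELT step loading staged files into
# Snowflake. All object names here are hypothetical.
import os

import snowflake.connector  # pip install snowflake-connector-python


def load_orders_batch() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # COPY INTO pulls files from a pre-configured external stage (e.g. S3).
        cur.execute(
            """
            COPY INTO raw.orders
            FROM @raw.orders_stage
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
            """
        )
        # Basic data quality check: confirm rows actually landed.
        cur.execute("SELECT COUNT(*) FROM raw.orders")
        print("row count after load:", cur.fetchone()[0])
    finally:
        conn.close()


if __name__ == "__main__":
    load_orders_batch()
```

In practice the file format, error-handling policy, and validation queries would follow the team's own standards.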
Posted 17 hours ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years' experience in Big Data technologies on-prem and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. The role requires strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools such as Snowflake, BigQuery, or Redshift and BI tools such as Tableau, QuickSight, or Power BI (experience with at least one of these is a must-have). You must also have experience with orchestration tools like Airflow and the transformation tool dbt.

Your responsibilities will include implementing ETL/ELT processes and building data pipelines, along with workflow management, job scheduling, and monitoring. You should have a good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a healthy balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees. Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
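As a hedged illustration of the Airflow-plus-dbt orchestration this role calls for, the sketch below defines a daily DAG that builds dbt models and then runs dbt tests. The dag_id, schedule, and project path are assumptions for the example, not details from the posting.

```python
# Illustrative Airflow 2 DAG orchestrating dbt: run models daily, then test.
# dag_id, schedule, and the dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )
    # Tests execute only after all models have built successfully.
    dbt_run >> dbt_test
```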
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Technical Lead of Data Engineering at Assent, you will collaborate with various stakeholders, including Product Managers, Product Designers, and Engineering team members, to identify opportunities and evaluate the feasibility of solutions. Your role will involve offering technical guidance, influencing decision-making, and aligning data engineering initiatives with business objectives as part of Assent's roadmap development. You will be responsible for driving the technical strategy, overseeing team execution, and implementing process improvements to construct resilient and scalable data systems. In addition, you will lead data engineering efforts, mentor a growing team, and establish robust and scalable data infrastructure.

Key Requirements & Responsibilities:
- Lead the technical execution of data engineering projects to ensure high-quality and timely delivery, covering discovery, delivery, and adoption stages.
- Collaborate with Architecture team members to design and implement scalable, high-performance data pipelines and infrastructure (see the illustrative sketch below).
- Provide technical guidance to the team, ensuring adherence to best practices in data engineering, performance optimization, and system reliability.
- Work cross-functionally with teams such as Product Management, Software Development, Analytics, and AI/ML to define and implement data initiatives.
- Partner with the team manager to plan and prioritize work, striking a balance between short-term deliverables and long-term technical enhancements.
- Keep abreast of emerging technologies and methodologies, advocating for their adoption to accelerate the team's objectives.
- Ensure compliance with corporate security policies and follow Assent's established guidelines and procedures.

Qualifications: Your Knowledge, Skills and Abilities:
- 10+ years of experience in data engineering, software development, or related fields.
- Proficiency with cloud data platforms, particularly AWS.
- Expertise in modern data technologies like Spark, Airflow, dbt, Snowflake, Redshift, or similar.
- Deep understanding of distributed systems and data pipeline design, with specialization in ETL/ELT processes, data warehousing, and real-time streaming.
- Strong programming skills in Python, SQL, Scala, or similar languages.
- Experience with infrastructure-as-code tools like Terraform and CloudFormation, and knowledge of DevOps best practices.
- Ability to influence technical direction and promote best practices across teams.
- Excellent communication and leadership skills, with a focus on fostering collaboration and technical excellence.
- A learning mindset, continuously exploring new technologies and best practices.
- Experience in security, compliance, and governance related to data systems is a plus.

This is not an exhaustive list of duties, and responsibilities may be modified or added as needed to meet business requirements.

Life at Assent: At Assent, we are dedicated to cultivating an inclusive environment where team members feel valued, respected, and heard. Our diversity, equity, and inclusion practices are guided by our Diversity and Inclusion Working Group and Employee Resource Groups (ERGs), ensuring that team members from diverse backgrounds are recruited, retained, and provided opportunities to contribute to business success. If you need assistance or accommodation during any stage of the interview and selection process, please reach out to talent@assent.com, and we will be happy to assist you.
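To make the pipeline-design expectations concrete, here is a minimal PySpark batch step of the sort the qualifications mention (Spark feeding a partitioned warehouse layout). The bucket paths, column names, and table layout are hypothetical.

```python
# Illustrative PySpark batch step: read raw events, aggregate, write a
# date-partitioned table. All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/events/")

daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Partitioning by date keeps downstream scans cheap and loads incremental.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```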
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer at Perch Energy, you will be a key player in the design, development, and maintenance of our data infrastructure and pipelines. Your collaboration with the Data and Analytics Engineering team, as well as engineering and operations teams, will ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be crucial in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support the company's growth. You will have the opportunity to engage with cutting-edge technology, influence the development of a world-class data ecosystem, and work in a fast-paced environment as part of a small, high-impact team.

The core data stack at Perch Energy includes Snowflake and dbt Core, orchestrated with Prefect and Argo within our AWS-based ecosystem. Data from a wide range of sources is loaded using Fivetran or Segment, with custom Python used when necessary. Your responsibilities will include designing, developing, and maintaining scalable and efficient data pipelines in an AWS environment, focusing on the Snowflake instance and utilizing tools such as Fivetran, Prefect, Argo, and dbt. Collaboration with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions will be essential. Additionally, you will design, build, and maintain tooling that facilitates interaction with the data platform, including CI/CD pipelines, testing frameworks, and command-line tools.

To succeed in this role, you should have at least 3 years of experience as a Data Engineer, data-adjacent Software Engineer, or member of a small data team, with a strong focus on building and maintaining data pipelines. Proficiency in Python, SQL, and database management and design is required, along with familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts. Experience with data orchestration tools like Argo, Prefect, or Airflow is a must, along with excellent problem-solving skills and attention to detail. While not mandatory, an undergraduate or graduate degree in a technical field; experience with AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions; and familiarity with DevOps practices would be advantageous. Previous experience managing enterprise-level data pipelines, working with large datasets, or knowledge of the energy sector would also be beneficial.

Perch Energy offers competitive compensation, a remote-first policy, a flexible leave policy, medical insurance, an annual performance cycle, team engagement activities, L&D programs, and a supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency. Perch Energy is committed to providing reasonable accommodations for individuals with disabilities throughout the job application process, the interview process, and employment tenure.
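As an illustrative sketch of the Prefect orchestration named in this stack, the example below wires extract, transform, and load tasks into a flow with retries. The API URL, validation rule, and table name are invented for the example; a real pipeline here would load into Snowflake rather than print.

```python
# Illustrative Prefect 2 flow: ELT with retries. The endpoint, field names,
# and target table are hypothetical.
import requests
from prefect import flow, task


@task(retries=3, retry_delay_seconds=60)
def extract() -> list[dict]:
    resp = requests.get("https://api.example.com/meter-readings", timeout=30)
    resp.raise_for_status()
    return resp.json()


@task
def transform(rows: list[dict]) -> list[dict]:
    # Drop obviously bad readings before loading.
    return [r for r in rows if r.get("kwh") is not None]


@task
def load(rows: list[dict]) -> None:
    # Placeholder for a Snowflake load step.
    print(f"would load {len(rows)} rows into raw.meter_readings")


@flow(name="meter-readings-elt")
def meter_readings_elt() -> None:
    load(transform(extract()))


if __name__ == "__main__":
    meter_readings_elt()
```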
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
You should possess a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 8 years of work experience in Data First systems. Additionally, you should have a minimum of 4 years of experience on Data Lake/Data Platform projects, specifically on AWS or Azure. Extensive knowledge and hands-on experience with data warehousing tools such as Snowflake, BigQuery, or Redshift is crucial, and proficiency in SQL for managing and querying data is a must-have skill for this role. You are expected to have experience with relational databases like Azure SQL and AWS RDS, as well as an understanding of NoSQL databases like MongoDB for handling various data formats and structures. Familiarity with orchestration tools like Airflow and dbt would be advantageous, and experience building stream-processing systems using solutions such as Kafka or Azure Event Hub is desirable.

Your responsibilities will include designing and implementing ETL/ELT processes using tools like Azure Data Factory to ingest and transform data into the data lake. You should also have expertise in data migration and processing with AWS (S3, Glue, Lambda, Athena, RDS Aurora) or Azure (ADF, ADLS, Azure Synapse, Databricks). Data cleansing and enrichment skills are crucial to ensure data quality for downstream processing and analytics. Furthermore, you must be capable of managing schema evolution and metadata for the data lake, with experience in tools like Azure Purview for data discovery and cataloging. Proficiency in creating and managing APIs for data access, preferably with JDBC/ODBC experience, is required. Knowledge of data governance practices, data privacy laws like GDPR, and implementing security measures in the data lake are essential aspects of this role.

Strong programming skills in languages like Python, Scala, or SQL are necessary for data engineering tasks, as are experience with automation and orchestration tools, familiarity with CI/CD practices, and the ability to optimize data storage and retrieval for analytical queries. Collaboration with the Principal Data Architect and other team members to align data solutions with architectural and business goals is crucial. As a lead, you will be responsible for critical system design changes and software projects, and for ensuring timely project deliverables. You will collaborate with stakeholders to translate business needs into efficient data infrastructure systems, review design proposals, conduct code review sessions, and promote best practices. Experience working in an Agile model, delivering quality deliverables on time, and translating complex requirements into technical solutions is also expected.
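For the data cleansing and enrichment responsibility, a minimal pandas-style sketch is shown below; the column names and quarantine path are assumptions, and production logic would live in the ADF/Databricks tooling the posting names.

```python
# Illustrative cleansing step: normalize, deduplicate, and quarantine bad
# rows instead of silently dropping them. Column names are hypothetical.
import pandas as pd


def cleanse_customers(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    # Normalize identifiers and drop exact duplicates.
    df["email"] = df["email"].str.strip().str.lower()
    df = df.drop_duplicates(subset=["customer_id"])
    # Quarantine rows that fail basic validation for later inspection.
    invalid = df["email"].isna() | ~df["email"].str.contains("@", na=False)
    df.loc[invalid].to_parquet("quarantine/customers_invalid.parquet")
    return df.loc[~invalid]
```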
Posted 3 days ago
8.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
Job Description: As a Data Scientist at Hitachi Solutions India Pvt Ltd in Pune, India, you will be a valuable member of our dynamic team. Your primary responsibility will be to extract valuable insights from complex datasets, develop advanced analytical models, and drive data-driven decision-making across the organization. With 8-14 years of experience, your primary skill should be Data Science, with secondary skills in Data Engineering/Data Analytics. You will work on cutting-edge AI applications with a focus on Natural Language Processing (NLP) and Time Series Forecasting, along with a working knowledge of Computer Vision (CV) techniques, collaborating with a diverse team of engineers, analysts, and domain experts to build holistic, multi-modal solutions.

Your expertise in Python and libraries like Pandas, NumPy, Scikit-learn, HuggingFace Transformers, and Prophet/ARIMA will be essential. Additionally, you should have a strong understanding of the model development lifecycle, from data ingestion to deployment, and hands-on experience with SQL and data visualization tools like Seaborn, Matplotlib, and Tableau. Experience handling retail-specific data, familiarity with cloud platforms like AWS, GCP, or Azure, and exposure to API development (FastAPI, Flask) for ML model deployment will be beneficial. Knowledge of MLOps practices, previous experience fine-tuning language models, and expertise in data engineering using Azure technologies are desirable skills for this role.

Key responsibilities will include applying NLP techniques to extract insights from text data, analyzing historical demand data for time series forecasting, and potentially contributing to computer vision projects. Collaboration with cross-functional teams and developing scalable ML components for production environments will be crucial aspects of your role.

Qualifications required for this position include a Master's degree in Computer Science, Data Science, Statistics, or a related field; proven experience in data science or machine learning; strong proficiency in Python and SQL; and familiarity with cloud technologies like Azure Databricks and MLflow. Excellent problem-solving skills, strong communication abilities, and the capability to work independently and collaboratively in a fast-paced environment are essential for success in this role.

Please be cautious of potential scams during the recruitment process; all official communication regarding your application and interview requests will come from our @hitachisolutions.com domain email address.
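As a small, hedged example of the time series forecasting work described (using one of the posting's named techniques), the sketch below fits a statsmodels ARIMA model to a weekly demand series. The CSV path, column names, and ARIMA order are hypothetical.

```python
# Illustrative demand forecast with statsmodels ARIMA. The data file,
# column names, and model order are hypothetical placeholders.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Weekly demand history indexed by date.
demand = (
    pd.read_csv("demand_history.csv", parse_dates=["week"])
    .set_index("week")["units_sold"]
    .asfreq("W")
)

# In practice the (p, d, q) order would come from diagnostics such as
# ACF/PACF plots or information-criterion search.
model = ARIMA(demand, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=8)  # eight weeks ahead
print(forecast)
```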
Posted 4 days ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Technical Lead for Data Engineering at Assent, you will collaborate with other team members to identify opportunities and assess the feasibility of solutions. Your role will involve providing technical guidance, influencing decision-making, and aligning data engineering initiatives with business goals. You will drive the technical strategy, team execution, and process improvements to build resilient and scalable data systems. Mentoring a growing team and building robust, scalable data infrastructure will be essential aspects of your responsibilities.

Your key requirements and responsibilities will include driving the technical execution of data engineering projects, working closely with Architecture members to design and implement scalable data pipelines, and providing technical guidance to ensure best practices in data engineering. You will collaborate cross-functionally with various teams to define and execute data initiatives, plan and prioritize work with the team manager, and stay updated with emerging technologies to drive their adoption.

To be successful in this role, you should have 10+ years of experience in data engineering or related fields, expertise in cloud data platforms such as AWS, proficiency in modern data technologies like Spark, Airflow, and Snowflake, and a deep understanding of distributed systems and data pipeline design. Strong programming skills in languages like Python, SQL, or Scala, experience with infrastructure as code and DevOps best practices, and the ability to influence technical direction and advocate for best practices are also necessary. Strong communication and leadership skills, a learning mindset, and experience in security, compliance, and governance related to data systems will be advantageous.

At Assent, we value your talent, energy, and passion, and offer various benefits to support your well-being, financial health, and personal growth. Our commitment to diversity, equity, and inclusion ensures that all team members are included, valued, and provided with equal opportunities for success. If you require any assistance or accommodation during the interview process, please feel free to contact us at talent@assent.com.
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Technical Solution Analyst, you will be instrumental in creating, enhancing, and automating solutions to streamline operational procedures. Your responsibilities will include configuring TSA & TSE solutions, managing product setups, and ensuring data accuracy for reporting and analytics. By leveraging your analytical skills and technical proficiency, you will identify and resolve technical challenges, align solutions with business goals, and implement automation tools to boost efficiency. You will collaborate with stakeholders to understand requirements, develop SQL queries, utilize BI tools such as Tableau, Power BI, and Looker, and work on ETL/ELT pipelines for data transformation. Your role will also involve integrating APIs, scripting for automation, and documenting technical solutions for debugging and future reference.

To excel in this role, you should possess 2-5 years of relevant experience in data management, automation, and solution development. Strong expertise in data analysis, SQL, BI tools, and programming languages like Python, R, or JavaScript is essential. Additionally, hands-on experience with API integration, problem-solving abilities, and effective communication skills are crucial for success in this position. Previous experience in UI development would be advantageous.

At ThoughtSpot, we value diversity, inclusion, and continuous learning. We believe that a diverse team with varied perspectives and experiences leads to innovative solutions. We encourage individuals from all backgrounds to apply, regardless of whether they meet 100% of the criteria listed. If you are passionate about working in a dynamic environment with talented individuals and contributing to groundbreaking products, we invite you to explore our mission and consider joining our team.
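To illustrate the API-integration and automation scripting this role involves, here is a minimal hedged sketch that pulls records from a REST endpoint and stages them in a local SQLite table. The endpoint and schema are invented for the example.

```python
# Illustrative automation script: pull records from a REST API and stage
# them locally. The endpoint and table schema are hypothetical.
import sqlite3

import requests

resp = requests.get("https://api.example.com/v1/tickets", timeout=30)
resp.raise_for_status()
tickets = resp.json()

conn = sqlite3.connect("staging.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS tickets (id INTEGER PRIMARY KEY, status TEXT)"
)
# Upsert so reruns of the script stay idempotent.
conn.executemany(
    "INSERT OR REPLACE INTO tickets (id, status) VALUES (?, ?)",
    [(t["id"], t["status"]) for t in tickets],
)
conn.commit()
conn.close()
```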
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As an Azure Data Engineering Director, you will play a pivotal role in leading the data strategy and operations for our EDP Cloud Fabric. Your expertise will be essential in establishing resilience through a multi-cloud model and enabling key capabilities such as Power BI and OpenAI from Microsoft. Collaborating with leads across GTIS, CSO, and CTO, you will accelerate the introduction and adoption of new designs on Azure.

Your key responsibilities will include defining and executing a comprehensive data strategy aligned with business objectives, leveraging Azure services for innovation in data processing, analytics, and insights delivery. You will architect and manage large-scale data platforms using Azure tools like Azure Data Factory, Azure Synapse Analytics, Databricks, and Cosmos DB, optimizing data engineering pipelines for performance, scalability, and cost-efficiency. Furthermore, you will establish robust data governance frameworks to ensure compliance with industry regulations; oversee data quality, security, and consistency across all platforms; and build, mentor, and retain a high-performing data engineering team. Collaboration with cross-functional stakeholders to bridge technical and business objectives will be a key aspect of your role. You will also ensure data readiness for AI/ML initiatives, drive the adoption of real-time insights through event-driven architectures, streamline ETL/ELT processes for faster data processing and reduced downtime, and identify and implement cutting-edge Azure technologies to create new revenue streams through data-driven innovation.

In this role, you will be accountable for building and maintaining data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to build and deploy machine learning models. You will manage a business function, provide input to strategic initiatives, and lead a large team or sub-function, embedding a performance culture aligned with the organization's values. Additionally, you will provide expert advice to senior management, manage resourcing and budgeting, and foster compliance within the function.

As a Senior Leader, you are expected to demonstrate a clear set of leadership behaviors: listening and authenticity, energizing and inspiring others, aligning across the enterprise, and developing colleagues. Upholding the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, along with the Barclays Mindset of Empower, Challenge, and Drive, will be essential in creating an environment for colleagues to thrive and deliver to an excellent standard.
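As a hedged sketch of the event-driven ingestion pattern mentioned above, the example below publishes a record to Azure Event Hubs using the v5 Python SDK. The connection-string variable, hub name, and payload are hypothetical placeholders.

```python
# Illustrative Event Hubs publish step for an event-driven pipeline.
# The connection string, hub name, and payload are hypothetical.
import json
import os

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str=os.environ["EVENTHUB_CONN_STR"],
    eventhub_name="trades",  # hypothetical hub
)

# Batch events so a single send respects the hub's size limits.
batch = producer.create_batch()
batch.add(EventData(json.dumps({"trade_id": 1, "qty": 100})))
producer.send_batch(batch)
producer.close()
```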
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
vadodara, gujarat
On-site
Job Title: Data Architect
Experience: 3 to 4 years
Location: Vadodara, Gujarat
Contact: 9845135287

Job Summary: We are seeking a highly skilled and experienced Data Architect to join our team. As a Data Architect, you will play a crucial role in assessing the current state of our data landscape and working closely with the Head of Data to develop a comprehensive data strategy that aligns with our organisational goals. Your primary responsibility will be to understand and map our current data environments and then help develop a detailed roadmap that will deliver a data estate enabling our business to meet its core objectives.

Main Duties & Responsibilities: The role's core duties include, but are not limited to:
- Assess the current state of our data infrastructure, including data sources, storage systems, and data processing pipelines.
- Collaborate with the Data Ops Director to define and refine the data strategy, taking into account business requirements, scalability, and performance.
- Design and develop a cloud-based data architecture, leveraging Azure technologies such as Azure Data Lake Storage, Azure Synapse Analytics, and Azure Data Factory (see the illustrative ingestion sketch at the end of this posting).
- Define data integration and ingestion strategies to ensure smooth and efficient data flow from various sources into the data lake and warehouse.
- Develop data modelling and schema designs to support efficient data storage, retrieval, and analysis.
- Implement data governance processes and policies to ensure data quality, security, and compliance.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and provide architectural guidance.
- Conduct performance tuning and optimization of the data infrastructure to meet business and analytical needs.
- Stay updated with the latest trends and advancements in data management, cloud technologies, and industry best practices.
- Provide technical leadership and mentorship to junior team members.

Key Skills:
- Proven work experience as a Data Architect or in a similar role, with a focus on designing and implementing cloud-based data solutions using Azure technology.
- Strong knowledge of data architecture principles, data modelling techniques, and database design concepts.
- Experience with cloud platforms, particularly Azure, and a solid understanding of their data-related services and tools.
- Proficiency in SQL and one or more programming languages commonly used for data processing and analysis (e.g., Python, R, Scala).
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
- Knowledge of data governance, data security, and compliance practices.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into scalable and efficient data solutions.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously.
Working Relationships:
- Liaison with stakeholders at all levels of the organisation

Communication:
- Communicate with leadership and colleagues in relation to all business activities
- Highly articulate and able to explain complex concepts in bite-size chunks
- Strong ability to provide clear written reporting and analysis

Personal Qualities:
- Ability to work to deadlines, with good time management skills
- Commercially mindful and able to deliver solutions that maximise value
- Strong analytical skills
- Accurate, with excellent attention to detail
- Personal strength and resilience
- Adaptable and embraces change
- Reliable, conscientious, and hardworking
- Approachable and professional
- Willingness to learn, while recognising the limits of one's ability and when to seek advice

Knowledge / Key Skills:
Essential:
- Experience of Azure development and design principles
- Enterprise-level data warehousing design and implementation
- Architecture principles
- Proficiency in SQL development
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks
- Knowledge of data governance, data security, and compliance practices
- Strong experience mapping an existing data landscape and developing a roadmap to deliver business requirements
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously

Desirable:
- Knowledge of Enterprise Architecture frameworks (e.g., TOGAF)
- Programming languages such as R, Python, Scala, etc.

Job Type: Full-time
Experience: total work: 1 year (Preferred)
Work Location: In person
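As referenced in the duties above, here is an illustrative sketch of landing a file in Azure Data Lake Storage Gen2 with the Python SDK. The account URL, credential variable, container, and paths are hypothetical.

```python
# Illustrative ADLS Gen2 ingestion step. Account, container, and paths are
# hypothetical placeholders.
import os

from azure.storage.filedatalake import (  # pip install azure-storage-file-datalake
    DataLakeServiceClient,
)

service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",
    credential=os.environ["ADLS_KEY"],
)
fs = service.get_file_system_client("raw")          # container / file system
file = fs.get_file_client("sales/2024/orders.csv")  # hypothetical lake path

# Upload a local extract into the raw zone of the lake.
with open("orders.csv", "rb") as data:
    file.upload_data(data, overwrite=True)
```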
Posted 1 week ago