5.0 - 10.0 years
3 - 6 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Sr Data Engineer - Detailed JD (Roles and Responsibilities)
Education: Bachelor's degree in Computer Science or Engineering
• Candidate should have 5+ years of experience as a Data Engineer or in a related data solutions role.
• Hands-on experience solutioning and implementing analytical capabilities using the Azure Data Analytics platform, including Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Storage, Azure SQL Data Warehouse/Synapse, and Azure Data Lake.
• Candidate should be capable of supporting all phases of analytical development, from identification of key business questions through data collection and ETL.
• Good experience developing data solutions on lakehouse platforms such as Dremio is an added benefit.
• Strong knowledge of data modelling and data design is a plus.
• Microsoft data certification is a plus.
Mandatory skills: Azure Data Factory
Desired skills: Azure Data Factory, Data Modeling
Domain: Financial Services
Work location: Any location
WFO/WFH/Hybrid: WFO - Hybrid
Working in shifts outside standard daylight hours (to avoid confusion post onboarding): No shifts
Location: PAN India
Years of experience: 5
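For illustration, a minimal sketch of the kind of ADF automation this role involves: triggering and monitoring a pipeline run with the Azure SDK for Python. The subscription, resource group, factory, pipeline, and parameter names are placeholders, not details from this posting.

```python
# A minimal sketch (hypothetical names) of triggering and monitoring an
# Azure Data Factory pipeline run with the azure-mgmt-datafactory SDK.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

# Kick off a pipeline run, passing runtime parameters (illustrative names).
run = adf_client.pipelines.create_run(
    resource_group_name="rg-analytics",
    factory_name="adf-financial-services",
    pipeline_name="pl_ingest_daily",
    parameters={"load_date": "2024-01-31"},
)

# Poll the run status until the pipeline leaves the Queued/InProgress states.
while True:
    status = adf_client.pipeline_runs.get(
        "rg-analytics", "adf-financial-services", run.run_id
    ).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status}")
```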
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Support Engineer - AI & Data
Experience: 5-8 Years
Work Mode: Onsite
Location: Pune or Mohali

Job Overview
We are seeking a motivated and talented Support Engineer to join our AI & Data team. This role offers a unique opportunity to gain hands-on experience with the latest tools and technologies, quality documentation preparation, and software development lifecycle responsibilities. If you are passionate about technology and eager to apply your academic knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities
Collaborate with the AI & Data team to support various projects.
Utilize MS Office tools for documentation and project management tasks.
Assist in the development, testing, deployment, and support of BI solutions.
Participate in ITIL process management.
Prepare and maintain high-quality documentation for various processes and projects.
Stay updated with the latest industry trends and technologies to contribute innovative ideas.

Essential Requirements
Experience in SQL, Azure Data Factory (ADF), and data modeling is a must.
Experience in Logic Apps and Azure integrations is nice to have.
Good communication skills; the role involves connecting with stakeholders directly.
Strong critical thinking and problem-solving skills.
Certification in any industry-relevant skills is an advantage.

Preferred Skills and Qualifications
Strong understanding of software development and testing principles.
Familiarity with data warehousing concepts and technologies.
Excellent written and verbal communication skills.
Ability to work both independently and as part of a team.
Attention to detail and strong organizational skills.

What We Offer
Hands-on experience with the latest digital tools and technologies.
Exposure to real-world projects and industry best practices.
Opportunities to prepare and contribute to quality documentation.
Experience in SDET responsibilities, enhancing your software testing and development skills.
Mentorship from experienced professionals in the field.

Skills: management, development, AI, MS Office, data modeling, Azure, testing, data, software development lifecycle, documentation, ITIL process management, Azure Data Factory, ITIL, SQL, data warehousing, Logic Apps, Azure integrations
Posted 1 month ago
6.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
Job Description:
We are looking for a highly skilled and experienced Data Engineer with 4–6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
· Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
· Collaborate with data analysts, data scientists, and product teams to understand data needs
· Optimize queries and data models for performance and reliability
· Integrate data from various sources, including APIs, internal databases, and third-party systems
· Monitor and troubleshoot data pipelines to ensure data quality and integrity
· Document processes, data flows, and system architecture
· Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
· 4–6 years of experience in data engineering, data architecture, or backend development with a focus on data
· Strong command of SQL for data transformation and performance tuning
· Experience with Python (e.g., pandas, Spark, ADF)
· Solid understanding of ETL/ELT processes and data pipeline orchestration
· Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
· Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
· Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
· Basic programming skills
· Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
· Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
· Exposure to enterprise solutions (e.g., Databricks, Synapse)
· Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
· Background in real-time data streaming and event-driven architectures
· Understanding of data governance, security, and compliance best practices
· Prior experience working in an agile development environment

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
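As an illustration of the pipeline work described above, a minimal ETL sketch in Python using pandas and SQLAlchemy; the API endpoint, connection string, table, and column names are hypothetical.

```python
# A minimal ETL sketch: extract from a source API, transform with pandas,
# load into a relational store. All endpoints and names are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine


def extract(url: str) -> pd.DataFrame:
    """Pull raw JSON records from a source API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Light cleansing: deduplicate, enforce types, derive a column."""
    df = df.drop_duplicates(subset="order_id")
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["net_amount"] = df["gross_amount"] - df["discount"]
    return df


def load(df: pd.DataFrame, conn_str: str, table: str) -> None:
    """Append the cleaned frame to a warehouse staging table."""
    engine = create_engine(conn_str)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    raw = extract("https://api.example.com/orders")
    load(transform(raw), "postgresql://user:pass@host/dw", "stg_orders")
```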
Posted 1 month ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Role: Senior Dot Net Developer
Experience: 8+ years
Notice period: Immediate
Location: Trivandrum / Kochi

Introduction
Candidates should have 8+ years of experience in the IT industry with strong .Net/.Net Core/Azure Cloud Service/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours are 8 hours, with a 4-hour overlap during the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.

Responsibilities include:
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
• Integrate and support third-party APIs and external services
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
• Participate in Agile/Scrum ceremonies and manage tasks using Jira
• Understand technical priorities, architectural dependencies, risks, and implementation challenges
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
• Microsoft Certified: Azure Fundamentals
• Microsoft Certified: Azure Developer Associate
• Other relevant certifications in Azure, .NET, or Cloud technologies

Primary Skills: 8+ years of hands-on development experience with:
• C#, .NET Core 6/8+, Entity Framework / EF Core
• JavaScript, jQuery, REST APIs
• Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
• Skilled in unit testing with XUnit, MSTest
• Strong in software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
• Strong problem-solving and debugging capabilities
• Ability to write reusable, testable, and efficient code
• Develop and maintain frameworks and shared libraries to support large-scale applications
• Excellent technical documentation, communication, and leadership skills
• Microservices and Service-Oriented Architecture (SOA)
• Experience in API integrations

2+ years of hands-on experience with Azure Cloud Services, including:
• Azure Functions
• Azure Durable Functions
• Azure Service Bus, Event Grid, Storage Queues
• Blob Storage, Azure Key Vault, SQL Azure
• Application Insights, Azure Monitoring

Secondary Skills:
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps - CI/CD pipelines (Classic / YAML)
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Head the Production Support division, ensuring uninterrupted operations of critical banking systems including Oracle FLEXCUBE, OFSAA, Oracle Financials, and regulatory platforms.
Lead and manage a team of 30+ professionals across L1 to L3 support tiers, overseeing incident management, root cause analysis, and service improvement initiatives.
Ensure high system availability and strict adherence to SLAs in coordination with business, compliance, and infrastructure teams.
Manage cross-departmental support including Corporate Banking, Finance, Risk, and Compliance.
Handle incident escalations, prioritize issues based on business impact, and oversee timely resolution.
Coordinate release and change management activities to minimize disruption to business operations.
Regularly engage with business heads to align system capabilities with operational requirements.
Define and enforce robust monitoring, alerting, and reporting frameworks for production systems.
Champion automation and optimization initiatives to reduce manual dependencies and improve turnaround times.

Corporate Banking: Managed and supported applications for Corporate Lending and Trade Finance (LC, BC, Bank Guarantees).
Finance Department: Supported Oracle Financials (GL, AP, AR) including GL consolidation and monthly GST filings.
Compliance Systems: Oversaw OFSAA AML/KYC, enabling AML alert generation and KYC scoring with reverse feeds to CBS.
Risk Systems: Ensured seamless operation of LOS, LMS, MF, and Oracle systems.
Regulatory Reporting: Managed ADF for centralized regulatory report generation and NACH for mandate registration and processing.

Key Skills and Technical Expertise:
Platforms: Oracle FLEXCUBE, OFSAA (AML/KYC, BASEL, ALM, LRM), Oracle Financials (GL/AP/AR), ADF, NACH
Databases: Oracle 10g/9i/8i
Tools: PL/SQL Developer, SQL Navigator, Toad, OBIEE
Operating Systems: Windows, HP-UX, IBM AIX
Methodologies: Agile, Waterfall, ITIL (preferred)

Preferred Qualifications:
Bachelor's Degree in Engineering (Electronics/Computer Science preferred)
Scrum Master Certification or equivalent Agile training
Strong understanding of regulatory frameworks and compliance obligations in banking

Leadership & Soft Skills:
Proven ability to lead large, diverse teams (30+ members)
Exceptional communication and stakeholder management
Strong problem-solving and critical thinking skills
High accountability, operational discipline, and performance focus
Comfortable working in high-pressure production environments
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Data Engineer (Azure)
Experience: 5-8 Years
Work Mode: Onsite
Location: Pune or Mohali

Must Have
Data Warehousing, Data Lake, Azure Cloud Services, Azure DevOps
ETL-SSIS, ADF, Synapse, SQL Server, Azure SQL
Data transformation, modelling, ingestion, and integration
Microsoft Certified: Azure Data Engineer Associate

Required Skills and Experience
5-8 years of experience as a Data Engineer, focusing on Azure cloud services
Bachelor's degree in Computer Science, Information Technology, or a related field
Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Storage
Strong SQL skills, including experience with data modeling, complex queries, and performance optimization
Ability to work independently and manage multiple tasks simultaneously
Familiarity with version control systems (e.g., Git) and CI/CD pipelines (Azure DevOps)
Knowledge of Data Lake architecture, data warehousing, and data modeling principles
Experience with RESTful APIs, data APIs, and event-driven architecture
Familiarity with data governance, lineage, security, and privacy best practices
Strong problem-solving, communication, and collaboration skills

Skills: event-driven architecture, data transformation, modeling, data governance, Azure DevOps, Azure, Azure cloud services, RESTful APIs, SQL, data warehousing, Azure SQL, Azure Data Factory, data modeling, data security, ingestion, data lake architecture, data privacy, Synapse, ETL-SSIS, data APIs, integration, data, data lake, ADF, SQL Server, data lineage
Posted 1 month ago
0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Key Responsibilities
Graph Database Development: Design, develop, and maintain graph database schemas using Neo4j.
Query Optimization: Optimize Neo4j queries for performance and efficiency.
Data Processing & Analysis: Utilize Python, PySpark, or Spark SQL for data transformation and analysis.
User Acceptance Testing (UAT): Conduct UAT to ensure data accuracy and overall system functionality.
Data Pipeline Management: Develop and manage scalable data pipelines using Databricks and Azure Data Factory (ADF).
Cloud Integration: Work with Azure cloud services and be familiar with Azure data engineering components.

Desired Skills
Strong experience with Neo4j and Cypher query language
Proficient in Python and/or PySpark
Hands-on experience with Databricks and Azure Data Factory
Familiarity with data engineering tools and best practices
Good understanding of database performance tuning
Ability to work in fast-paced, client-driven environments

Skills: azure, data engineering tools, neo4j, pyspark, azure data factory, spark sql, databricks, cloud, database performance tuning, cypher query language, python
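To make the Neo4j portion concrete, a minimal sketch using the official Python driver to run a parameterized Cypher query. The connection details and graph model (Author/Paper nodes, WROTE relationships) are illustrative assumptions, not part of the posting.

```python
# A minimal sketch of querying Neo4j from Python with a parameterized
# Cypher statement. URI, credentials, and the graph model are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))


def top_coauthors(tx, author_name: str, limit: int = 10):
    # Parameterized Cypher keeps the query plan cacheable and avoids injection.
    query = (
        "MATCH (a:Author {name: $name})-[:WROTE]->(:Paper)<-[:WROTE]-(co:Author) "
        "RETURN co.name AS coauthor, count(*) AS papers "
        "ORDER BY papers DESC LIMIT $limit"
    )
    return [record.data() for record in tx.run(query, name=author_name, limit=limit)]


with driver.session() as session:
    # execute_read retries the unit of work on transient cluster errors.
    for row in session.execute_read(top_coauthors, "Ada Lovelace"):
        print(row)

driver.close()
```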
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Key Responsibilities:
• Designs, implements and maintains reliable and scalable data infrastructure
• Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data
• Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud
• Mentors and shares knowledge with the team to provide design reviews, discussions and prototypes
• Works with customers to deploy, manage, and audit standard processes for cloud products
• Adheres to and advocates for software & data engineering standard processes (e.g. technical design and review, unit testing, monitoring, alerting, source control, code review & documentation)
• Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains and improves CI/CD pipelines
• Service reliability and site-reliability engineering standard processes: on-call rotations for services they maintain, responsible for defining and maintaining SLAs; designs, builds, deploys and maintains infrastructure as code; containerizes server deployments
• Part of a cross-disciplinary team working closely with other data engineers, software engineers, data scientists, data managers and business partners in a Scrum/Agile setup

Job Requirements:
Education: Bachelor or higher degree in Computer Science, Engineering, Information Systems or other quantitative fields
Experience:
1) Years of experience: 8 to 12 years relevant experience
2) Deep and hands-on experience designing, planning, productionizing, maintaining and documenting reliable and scalable data infrastructure and data products in complex environments
3) Hands-on experience with:
a) Spark for data processing (batch and/or real-time)
b) Configuring Delta Lake on Azure Databricks
c) Languages: SQL, PySpark, Python
d) Cloud platforms: Azure
e) Azure Data Factory (must), Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), Databricks (good to have)
f) Designing data solutions in Azure, including data distributions and partitions, scalability, cost management, disaster recovery and high availability
g) Azure DevOps (or similar tools) for source control & building CI/CD pipelines
4) Experience designing and implementing large-scale distributed systems
5) Customer management and front-ending, and the ability to lead large organizations through influence

Desirable Criteria:
• Strong customer management: own the delivery for the Data track with customer stakeholders
• Continuous learning and improvement attitude

Key Behaviors:
• Empathetic: Cares about our people, our community and our planet
• Curious: Seeks to explore and excel
• Creative: Imagines the extraordinary
• Inclusive: Brings out the best in each other

Mandatory skill sets ('must have'): Synapse, ADF, Spark, SQL, PySpark, Spark-SQL
Preferred skill sets ('good to have'): Cosmos DB, data modeling, Databricks, PowerBI, experience of having built analytics solutions with SAP as the data source for ingestion pipelines
Depth: The candidate should have in-depth hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities. He/she should have sound knowledge of optimizing workloads.
Years of experience required: 8 to 12 years relevant experience
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology, Master of Engineering
Required Skills: Apache Synapse
Optional Skills: Microsoft Power Business Intelligence (BI)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
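As a concrete illustration of the hands-on skills this posting lists (Spark and Delta Lake on Azure Databricks), a minimal PySpark sketch of a bronze-to-silver conformance step; the paths, schema, and column names are assumptions for illustration only.

```python
# A minimal PySpark sketch, assuming a Databricks cluster where Delta Lake
# is available by default; table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Ingest raw landing-zone files (the bronze layer of a medallion architecture).
bronze = spark.read.json("/mnt/datalake/landing/transactions/")

# Basic conformance: typed columns, deduplication, a partition-friendly date key.
silver = (
    bronze
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("txn_date", F.to_date("txn_timestamp"))
    .dropDuplicates(["txn_id"])
)

# Write a partitioned Delta table for downstream Synapse/BI consumption.
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("txn_date")
    .save("/mnt/datalake/silver/transactions")
)
```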
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Advanced Energy
Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy's employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary:
We are seeking a highly skilled Data Engineer with 5-10 years of experience to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate should have extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. Additionally, the candidate should be well-versed in data engineering, data product, and data platform concepts, data mesh, medallion architecture, and establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. The candidate should also have proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.

Key Responsibilities:
Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Ensure data integrity, quality, and security across all data platforms.
Provide expertise in data engineering, data product, and data platform concepts.
Implement data mesh principles and medallion architecture to build scalable data platforms.
Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
Implement data quality practices using tools like Great Expectations, Deequ, etc.
Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
Develop and maintain documentation for data solutions, data flows, and data models.
Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer or in a similar role.
In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
Strong data warehousing skills, including ETL processes, data modeling, and reporting.
Familiarity with manufacturing and supply chain domains.
Proficiency in data engineering, data product, and data platform concepts, data mesh, and medallion architecture.
Experience in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
Proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Ability to work independently and as part of a team.

Preferred Qualifications:
Master's degree in a related field.
Experience with cloud-based data platforms and tools.
Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate.

Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities.

We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
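To illustrate the data quality practice this posting names, a minimal sketch using Great Expectations' classic pandas-dataset interface. Note that the Great Expectations API has changed significantly across versions, so this reflects older releases only, and the data and expectations are invented for the example.

```python
# A minimal data-quality sketch using Great Expectations' classic pandas
# interface (older releases; newer versions use a different API).
import pandas as pd
import great_expectations as ge

# Illustrative batch with two deliberate defects: a duplicate key and a
# negative quantity.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 3],
    "qty": [5, -1, 2, 2],
})

df = ge.from_pandas(orders)

# Each expectation validates immediately and returns a result object.
checks = [
    df.expect_column_values_to_not_be_null("order_id"),
    df.expect_column_values_to_be_unique("order_id"),         # fails: 3 repeats
    df.expect_column_values_to_be_between("qty", 0, 10_000),  # fails: -1
]

failed = [c for c in checks if not c.success]
if failed:
    # In a real pipeline this would raise or alert the on-call channel.
    print(f"{len(failed)} expectation(s) failed")
```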
Posted 1 month ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Mars Data is hiring for full-time Dot Net Developer positions in Trivandrum/Kochi.

Skills: .Net/.Net Core 6/8+, T-SQL, Azure Cloud Services, Azure DevOps, React.JS/Angular.JS, C#, X-Unit, MS-Test, RDBMS, AWS, CI/CD, SDLC, RESTful API, PowerShell, Agile/Scrum/Jira

Job Title: Dot Net Developer
Location: Trivandrum/Kochi
Job type: Full Time
Working hours: 8 hours, mid shift
Notice Period: Immediate
Relevant Experience: 10+ years

Introduction
Candidates should have 10+ years of experience in the IT industry with strong .Net/.Net Core/Azure Cloud Service/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client and the resource should be hands-on, with experience in coding and Azure Cloud.

Responsibilities include
• Develop, enhance, document, and maintain application features in .Net Core 6/8+, C#, REST API, T-SQL and AngularJS/React JS
• Application support & API integrations with third-party solutions/services
• Understand technical project priorities, implementation dependencies, risks and issues
• Participate and develop code as part of a unified development group, working the whole technology stack
• Identify, prioritize and execute tasks in the software development life cycle
• Work with the team to define, design, and deliver on new features
• Broad and extensive knowledge of the software development life cycle (SDLC) with software development models like Agile and Scrum, and tools like Jira

Primary Skills
• Develop high-quality software design and architecture
• 10+ years of development experience in C#, .Net technologies and SQL, with at least 2 years working with Azure Cloud Services
• Expertise in C#, .Net Core 6.0/8.0 or higher, Entity Framework, EF Core, microservices, Azure Cloud services, Azure DevOps and SOA
• Ability to lead, inspire and motivate teams through effective communication and established credibility
• Guide the team to write reusable, testable, performant and efficient code
• Proficient in writing unit test cases using X-Unit and MS-Test
• Build standards-based frameworks and libraries to support a large-scale application
• Expertise in RDBMS including MS SQL Server, with thorough knowledge of writing SQL queries, stored procedures, views, functions, packages, cursors, tables and object types
• Experience in large-scale software development
• Prior experience in application support & API integrations
• Knowledge of architectural styles and design patterns, with experience in designing solutions
• Strong debugging and problem-solving skills
• Effective communication, technical documentation, leadership and ownership qualities

Azure Skills
• Azure messaging services: Service Bus or Event Grid, Event Hub
• Azure Storage Account: Blobs, Tables, Queues, etc.
• Azure Functions / Durable Functions
• Azure ADF and Logic Apps
• Azure DevOps: CI/CD pipelines (Classic / YAML)
• Application Insights, Azure Monitoring, Key Vault and SQL Azure

Secondary Skills
• Good knowledge of JavaScript, React JS, jQuery, Angular and other front-end technologies
• API Management (APIM)
• Azure containerization and container orchestration

Contact: 8825984917; send your resume to hr@marsdata.in
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Result Areas and Activities
Design, develop and deploy ETL/ELT solutions on premise or in the cloud
Transform data with stored procedures
Report development (MicroStrategy/Power BI)
Create and maintain comprehensive documentation for data pipelines, configurations, and processes
Ensure data quality and integrity through effective data management practices
Monitor and optimize data pipeline performance
Troubleshoot and resolve data-related issues

Technical Experience
Must Have
Good experience in Azure Synapse
Good experience in ADF
Good experience in Snowflake and stored procedures
Experience with ETL/ELT processes, data warehousing, and data modelling
Experience with data quality frameworks, monitoring tools, and job scheduling
Knowledge of data formats like JSON, XML, CSV, and Parquet
Fluent English (strong written, verbal, and presentation skills)
Agile methodology and tools like JIRA
Good communication and formal skills

Good to Have
Good experience in MicroStrategy and Power BI
Experience in scripting languages such as Python, Java, or shell scripting
Familiarity with Azure cloud platforms and cloud data services

Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
3+ years of experience in Azure Synapse

Qualities
Experience with or knowledge of Agile software development methodologies
Can influence and implement change; demonstrates confidence, strength of conviction and sound decisions
Believes in dealing with a problem head-on; approaches it in a logical and systematic manner; is persistent and patient; can independently tackle the problem; is not over-critical of the factors that led to a problem and is practical about it; follows up with developers on related issues
Able to consult, write, and present persuasively
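For illustration, the "transformation with stored procedures" on Snowflake mentioned above could be orchestrated from Python roughly as below, assuming the Snowflake Python connector; the account, credentials, stage, table, and procedure names are all hypothetical.

```python
# A minimal sketch: bulk-load a staged extract, then invoke a transformation
# stored procedure. All object names and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load the day's extract from an internal stage into a raw table.
    cur.execute("COPY INTO RAW_SALES FROM @DAILY_STAGE FILE_FORMAT = (TYPE = CSV)")
    # Hand transformation off to a stored procedure that owns the business logic.
    cur.execute("CALL SP_TRANSFORM_SALES(%s)", ("2024-01-31",))
    print(cur.fetchone())  # the procedure's return value
finally:
    conn.close()
```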
Posted 1 month ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users and management teams.

You Will:
Design and build resilient and efficient data pipelines for batch and real-time streaming
Architect and design data infrastructure on cloud using Infrastructure-as-Code tools
Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools
Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns and implementation approaches
Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations
Lead a team of engineers to deliver impactful results at scale
Execute projects with an Agile mindset
Build software frameworks to solve data problems at scale

Technical Requirements:
7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using DBT and Power BI will be a plus
3+ years of experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads
Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services like firewall, storage, key vault, etc.
Strong programming/scripting experience using SQL, Python and Spark
Strong data modeling and data lakehouse concepts
Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, Bitbucket
Experience with Agile development methods in data-oriented projects

Other Requirements:
Highly motivated self-starter and team player with demonstrated success in prior roles
Track record of success working through technical challenges within enterprise organizations
Ability to prioritize deals, training, and initiatives through highly effective time management
Excellent problem-solving, analytical, presentation, and whiteboarding skills
Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
Certifications in Azure Data Engineering and related technologies
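As a sketch of the batch-plus-real-time pattern this role describes, a minimal Spark Structured Streaming job follows; the Kafka-compatible broker, topic, and output paths are assumptions for illustration, and the Kafka connector package is assumed to be on the cluster classpath.

```python
# A minimal Structured Streaming sketch: read from a Kafka-compatible
# endpoint, aggregate per minute with late-data bounds, land incrementally.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clickstream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "clickstream")                # placeholder topic
    .load()
)

# Parse the Kafka payload and aggregate per one-minute window; the watermark
# bounds how late events may arrive before a window is finalized.
clicks = (
    events.select(F.col("value").cast("string").alias("raw"), F.col("timestamp"))
    .withWatermark("timestamp", "10 minutes")
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

# Checkpointing makes the job restartable with exactly-once file output.
query = (
    clicks.writeStream.outputMode("append")
    .format("parquet")
    .option("path", "/mnt/lake/curated/clicks")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/clicks")
    .start()
)
query.awaitTermination()
```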
Posted 1 month ago
0 years
0 Lacs
India
On-site
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines, architectures, and databases. The ideal candidate will have hands-on experience in Azure Data Factory (ADF), Azure Synapse, and SQL, and will support reporting teams by ensuring reliable, curated datasets are available for downstream analysis.

Key Responsibilities:
Design, develop, and manage scalable data pipelines to ingest and transform data from multiple source systems using Azure Data Factory (ADF) and Synapse Analytics.
Build and maintain data integration workflows that enable downstream analytics, ensuring data is clean, consistent, and timely.
Partner with Power BI developers and analysts, within the team and cross-functionally, to develop data models optimized for reporting and machine learning.
Implement data validation and quality checks to detect anomalies or pipeline failures early, ensuring high data reliability (see the sketch after this list).
Maintain and improve existing ETL frameworks, enabling incremental data loads, performance tuning, and reusability of components.
Collaborate with the Power BI development team to ensure data availability in curated formats that support self-service analytics.
Write and maintain advanced SQL scripts, stored procedures, and views to transform and manipulate large datasets efficiently.
Monitor pipeline performance and implement logging, alerting, and retry mechanisms to handle failures.
Stay updated on Azure tools and recommend newer approaches (e.g., Dataflows, Delta Lakes) where beneficial.
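A minimal sketch of the validation-and-alerting idea referenced above: reusable post-load checks that fail fast so the orchestrator (e.g., an ADF pipeline step) can retry or alert. The file path, threshold, and column names are hypothetical.

```python
# Reusable post-load data checks; raising on failure surfaces the problem to
# the scheduler, which can then alert or retry. All names are placeholders.
import pandas as pd


def validate_load(df: pd.DataFrame, min_rows: int, key: str) -> list[str]:
    """Return human-readable failures; an empty list means the load passed."""
    failures = []
    if len(df) < min_rows:
        failures.append(f"row count {len(df)} below threshold {min_rows}")
    if df[key].isna().any():
        failures.append(f"null values found in key column '{key}'")
    if df[key].duplicated().any():
        failures.append(f"duplicate values found in key column '{key}'")
    return failures


batch = pd.read_parquet("/data/curated/customers/latest.parquet")
problems = validate_load(batch, min_rows=1_000, key="customer_id")
if problems:
    raise RuntimeError("data quality checks failed: " + "; ".join(problems))
```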
Posted 1 month ago
6.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
Designs, implements and maintains reliable and scalable data infrastructure
Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data
Develops and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud
Mentors and shares knowledge with the team to provide design reviews, discussions and prototypes
Works with customers to deploy, manage, and audit standard processes for cloud products
Adheres to and advocates for software & data engineering standard processes (e.g. data engineering pipelines, unit testing, monitoring, alerting, source control, code review & documentation)
Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains and improves CI/CD pipelines
Service reliability and site-reliability engineering standard processes: on-call rotations for services they maintain, responsible for defining and maintaining SLAs; designs, builds, deploys and maintains infrastructure as code; containerizes server deployments
Part of a cross-disciplinary team working closely with other data engineers, architects, software engineers, data scientists, data managers and business partners in a Scrum/Agile setup

Mandatory Skill Sets ('must have'): Synapse, ADF, Spark, SQL, PySpark, Spark-SQL
Preferred Skill Sets ('good to have'): Cosmos DB, data modeling, Databricks, PowerBI, experience of having built analytics solutions with SAP as the data source for ingestion pipelines
Depth: The candidate should have in-depth hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse and Databricks, with excellent coding skills in PySpark and SQL and logic-building capabilities. He/she should have sound knowledge of optimizing workloads.
Years of Experience Required: 6 to 9 years relevant experience
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Expected Joining: 3 weeks
Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering, Master of Engineering
Required Skills: Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications. Write code, complete programming and perform testing and debugging of applications. As a member of the software engineering division, you will perform high-level design based on provided external specifications, specify, design and implement minor changes to existing software architecture, build highly complex enhancements and resolve complex bugs, and build and execute unit tests and unit plans. You will review integration and regression test plans created by QA and communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area. 4 years of software engineering or related experience.

Career Level: IC3

Responsibilities
Our Procurement Cloud is the key offering from the Oracle Applications Cloud suite. Procurement Cloud is a fast-growing division within Oracle Cloud Applications and has a variety of customers, from a leading fast-food joint to the world's largest furniture maker. Procurement Cloud Development works on a range of sophisticated areas, from a complex search engine to a time-critical auctions/bidding process to core business functionalities like bulk order processing, to name a few. As a member of our team, you will use the latest technologies, including JDeveloper, ADF, Oracle 12c Database, Oracle SQL, BPEL, Oracle Text, BC4J, web services, and service-oriented architectures (SOA). In addition to gaining this technical experience, you will also be exposed to the business side of the industry. Developers are involved in the entire development cycle, so you will have the chance to take part in activities such as working with the product management team to define the product's functionality and interacting with customers to resolve issues. So are you looking to be technically challenged and gain business experience? Do you want to be part of a team of upbeat, hard-working developers who know how to work and have fun at the same time? Well, look no further. Join us and be the newest member of Fusion Procurement Development!

Skills/languages:
1-8 years of experience in building Java-based applications. Good programming skills and excellent analytical/logical skills. Able to craft a feature from end to end. Can think out of the box, has practical knowledge of the given technologies, and can apply logic to tackle a technical problem even without prior background in it. Should be persistent in their efforts. Experience in BPEL, Workflow System, ADF, REST implementation, AI/ML, and Scrum processes is a plus.
Required: Java, OOPS concepts, JavaScript/VBCS/JET
Optional: JDBC, XML, SQL, PL/SQL, Unix/Linux, REST, ADF, AI/ML, Scrum

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
The main role of a Support Engineer is to troubleshoot and resolve highly complex techno-functional problems. The key skills put to use on a daily basis are a high level of techno-functional skill, Oracle products knowledge, problem-solving skills, and customer interaction/service expertise.

Education & Experience:
BE, BTech, MCA, CA or equivalent preferred. Other qualifications with adequate experience may be considered.
5+ years relevant working experience

Functional/Technical Knowledge & Skills:
Must have a good understanding of Oracle Cloud Financials version 12+ capabilities. We are looking for a techno-functional person who has real-time hands-on functional/product and/or technical experience, and/or has worked with L2 or L3 level support, and/or has equivalent knowledge.

We expect the candidate to have:
Strong business process knowledge and concepts.
Implementation/support experience in any of the following areas: ERP - Cloud Financial modules like GL, AP, AR, FA, IBY, PA, CST, ZX and PSA; or HCM - Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management; or SCM - Inventory, OM, Procurement. The candidate must have hands-on experience in a minimum of any 5 modules across the above pillars.
Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials.
Technically strong, with expert skills in SQL, PL/SQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, payment integration and personalization.
Strong problem-solving skills.
Strong customer interaction and service orientation, so you can understand customers' critical situations, provide the appropriate response, and mobilize organizational resources, while setting realistic expectations with customers.
Strong operations management and innovation orientation, so you can continually improve the processes, methods, tools, and utilities.
Strong team player, so you leverage each other's strengths; you will often collaborate with peers within and across teams.
Strong learning orientation, so you keep abreast of emerging business models/processes, applications product solutions, product features and technology features, and use this learning to deliver value to customers on a daily basis.
High flexibility, so you remain agile in a fast-changing business and organizational environment.
Create and maintain appropriate documentation for architecture, design, technical, implementation, support and test activities.

Personal Attributes:
Self-driven and result-oriented
Strong problem-solving/analytical skills
Strong customer support and relationship skills
Effective communication (verbal and written)
Focus on relationships (internal and external)
Strong willingness to learn new things and share them with others
Influencing/negotiating
Team player
Customer focused
Confident and decisive
Values expertise (maintaining professional expertise in own discipline)
Enthusiasm
Flexibility
Organizational skills
Values and enjoys coaching/knowledge transfer
Values and enjoys teaching technical courses

Note: Shift working is mandatory. The candidate should be open to working evening and night shifts on a rotation basis.

Career Level: IC3

Responsibilities
As a Sr. Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs) and Value-Added Resellers (VARs) for resolution of problems related to the installation, recommended maintenance and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms, be highly experienced in multiple platforms, and be able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

#KeyResponsibilities
Build scalable ETL pipelines and implement robust data solutions in Azure.
Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
Design and maintain secure and efficient data lake architecture.
Work with stakeholders to gather data requirements and translate them into technical specs.
Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
Monitor data quality, performance bottlenecks, and scalability issues.
Write clean, organized, reusable PySpark code in an Agile environment.
Document pipelines, architectures, and best practices for reuse.

#MustHaveSkills
Experience: 6+ years in Data Engineering
Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
Agile, SDLC, Containerization (Docker), clean coding practices

#GoodToHaveSkills
Event Hubs, Logic Apps
Power BI
Strong logic building and competitive programming background

#ContractDetails
Role: Senior Data Engineer
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, India
Duration: 6 Months
Email to Apply: navaneeta@suzva.com
Contact: 9032956160
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Mumbai, Hyderabad, Pune
Work from Office
Job type: Contract to hire

1. 4-5 years of hands-on experience with the GenRocket tool.
2. Strong SQL knowledge with the ability to write complex queries (e.g., left and right joins).
3. Strong knowledge of SQL Server, MSSQL, Microsoft Azure, ADF and Synapse for database validation.
4. Intermediate knowledge of ETL transformations, workflows, and STTM mappings (source-to-target data mappings).
5. Strong knowledge of PowerShell scripting.
6. Ability to test data validation and data transformation from source to target.
7. Data validation: validating data sources, extracting data, and applying transformation logic.
8. Test planning & execution: defining testing scope, preparing test cases and test conditions, and test data preparation.
9. Coordinating test activities with Dev, BA & DBA and conducting defect triage for resolution of issues.
10. Test quality: ensuring the quality of their own work and the work of the development team.
11. QA test documentation: creating and maintaining test plans and test deliverable documents such as QA estimates, RTM (requirement traceability matrix), peer reviews, and QA sign-off documents.
12. Hands-on experience working with ADO/JIRA for test management, defect reporting and dashboard creation.
13. Ability to identify and report risks and provide mitigation plans. Coordinate with internal and external teams for completion of activities.

If you are interested, please share your updated profile with the details below:
Current CTC
Expected CTC
Notice Period
Total Experience
Relevant Experience
Location: Mumbai, Pune, Bangalore, Hyderabad
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Azure Data Engineer – Databricks

Required Technical Skill Set: Data Lake architecture, Azure services – ADLS, ADF, Azure Databricks, Synapse

Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components. The following technology skills are required:
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
Experience with ADF and Dataflow.
Experience with big data tools like Delta Lake and Azure Databricks.
Experience with Synapse.
Skills in designing an Azure data solution.
Ability to assemble large, complex data sets that meet functional and non-functional business requirements.
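To make the Delta Lake skill concrete, here is a minimal, hedged sketch of an incremental upsert on Azure Databricks using the Delta MERGE API. The storage paths and matching key are assumptions for the example, not details from the listing.

```python
# Illustrative Delta Lake upsert (MERGE) on Azure Databricks.
# Storage paths and the join condition are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.format("parquet").load(
    "abfss://landing@examplelake.dfs.core.windows.net/customers/"        # hypothetical
)

target = DeltaTable.forPath(
    spark,
    "abfss://curated@examplelake.dfs.core.windows.net/customers_delta/"  # hypothetical
)

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()     # update rows that changed
    .whenNotMatchedInsertAll()  # insert rows that are new
    .execute()
)
```

MERGE is the usual choice here because it makes the load idempotent: re-running the same batch does not create duplicates.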
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring that an end-to-end Oracle NetSuite implementation seamlessly adapts to the changing business environment is equally crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
Lead a team of NetSuite developers, providing guidance, mentorship, and technical expertise to ensure high-quality deliverables and project success.
Define technical architecture and design standards for NetSuite solutions, ensuring scalability, performance, and maintainability.
Stay updated on emerging technologies and best practices in NetSuite development, driving innovation and continuous improvement within the team.
Manage end-to-end technical projects for NetSuite implementations, upgrades, and customizations, ensuring adherence to scope, budget, and timeline.
Develop project plans, resource allocation strategies, and risk mitigation plans, and monitor project progress to identify and address issues proactively.
Lead the development and customization of NetSuite solutions, including SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development.
Collaborate with functional consultants to translate business requirements into technical solutions, ensuring alignment with best practices and industry standards.
Serve as a technical liaison between the development team and clients, providing technical expertise, addressing concerns, and managing expectations.
Participate in client meetings and workshops to understand their technical requirements, propose solutions, and provide updates on project status.

Mandatory Skill Sets: NetSuite
Preferred Skill Sets: NetSuite
Years of Experience Required: Minimum 2 years of NetSuite expertise
Education Qualification: Graduate / Post Graduate
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study Required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology
Degrees/Field of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle NetSuite
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
Posted 1 month ago
3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Sr. Executive / Assistant Manager Data Engineer, Godrej Agrovet, Mumbai, Maharashtra, India

Job Title: Sr. Executive / Assistant Manager Data Engineer
Job Type: Permanent, Full-time
Function: Digital
Business: Godrej Agrovet
Location: Mumbai, Maharashtra, India

About Godrej Agrovet:
Godrej Agrovet Limited (GAVL) is a diversified, Research & Development focused agri-business company dedicated to improving the productivity of Indian farmers by innovating products and services that sustainably increase crop and livestock yields. GAVL holds leading market positions in the different businesses it operates - Animal Feed, Crop Protection, Oil Palm, Dairy, Poultry and Processed Foods. GAVL has a pan-India presence with sales of over a million tons annually of high-quality animal feed and cutting-edge nutrition products for cattle, poultry, aqua feed and specialty feed. Our teams have worked closely with Indian farmers to develop large Oil Palm plantations, which is helping bridge the demand and supply gap of edible oil in India. In the crop protection segment, the company meets the niche requirements of farmers through innovative agrochemical offerings. GAVL, through its subsidiary Astec Life Sciences Limited, is also a business-to-business (B2B) focused bulk manufacturer of fungicides & herbicides. In Dairy, Poultry and Processed Foods, the company operates through its subsidiaries Creamline Dairy Products Limited and Godrej Tyson Foods Limited. Apart from this, GAVL also has a joint venture with the ACI group of Bangladesh for the animal feed business in Bangladesh. For more information on the Company, please log on to www.godrejagrovet.com

Roles & Responsibilities:
Data Pipeline Development: Design, develop, and optimize data pipelines to ingest, process, and transform data from various sources (e.g., APIs, databases, files) into the data warehouse.
Data Integration: Integrate data from various structured and unstructured sources into the Databricks Lakehouse environment, ensuring data accuracy and reliability.
Data Lakehouse & Storage Management: Design and maintain data warehouse solutions using medallion architecture practices, optimizing storage, cloud utilization, costs and query performance.
Collaboration with Data Teams: Work closely with data scientists and analysts to understand requirements, translate them into technical solutions, and implement data solutions.
Data Quality and Monitoring: Cleanse, transform, and enrich data. Implement data quality checks and establish monitoring processes to ensure data integrity and accuracy. Implement monitoring for data pipelines and troubleshoot any issues or failures promptly to ensure data reliability.
Optimization and Performance Tuning: Optimize data processing workflows for performance, reliability, and scalability, including tuning Spark jobs, caching, and partitioning data appropriately.
Data Security and Privacy: Manage and organize data lakes using Unity Catalog, ensuring proper governance, security, role-based access and compliance with data management policies.

Key Skills:
Technical Skills: Proficiency with the Databricks Lakehouse platform, Delta Lake, Genie, and MLflow; certification (e.g., Databricks Certified Data Engineer Associate) is a plus.
SQL and NoSQL: Experience working with both SQL and NoSQL data sources (e.g., MySQL, PostgreSQL, MongoDB).
Strong knowledge of Spark, especially PySpark or Scala, for data transformation.
Proficiency in Python, R and other programming languages used in data processing.
Experience with cloud platforms like Azure and AWS, particularly Azure storage & services.
Knowledge of ML pipelines and data streaming platforms (e.g., Apache Kafka, AWS Kinesis).
Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).

Educational Qualification:
Bachelor's degree in Computer Science, Engineering/MCA, or a related field (Master's preferred). 3+ years of experience as a Data Engineer, with hands-on experience in Databricks.

Experience: We are seeking a skilled Data Engineer with expertise in Databricks on Azure (ADF, ADLS) to join our data team. As a Data Engineer, you will work with both structured and unstructured data. You will be responsible for designing, building, and maintaining scalable and reliable data pipelines that support business intelligence, data analytics, and machine learning efforts. You will collaborate closely with data scientists, analysts, and cross-functional teams to ensure data is available, accurate, and optimized for processing and storage.

What's in it for you?
Be an equal parent: maternity support, including paid leave ahead of statutory guidelines, and flexible work options on return; paternity support, including paid leave; new mothers can bring a caregiver and children under a year old on work travel; adoption support that is gender neutral and based on the primary caregiver, with paid leave options.
No place for discrimination at Godrej: gender-neutral anti-harassment policy; same-sex partner benefits at par with married spouses; gender transition support.
We are selfish about your wellness: comprehensive health insurance plans, as well as accident coverage for you and your family, with top-up options; uncapped sick leave; mental wellness and self-care programmes, resources and counselling.
Celebrating wins, the Godrej Way: structured recognition platforms for individual, team and business-level achievements; performance-based earning opportunities. https://www.godrejcareers.com/benefits/

An inclusive Godrej
Before you go, there is something important we want to highlight. There is no place for discrimination at Godrej. Diversity is the philosophy of who we are as a company, and has been for over a century. It's not just in our DNA or a nice-to-do. Being more diverse, especially having our team members reflect the diversity of our businesses and communities, helps us innovate better and grow faster. We hope this resonates with you. We take pride in being an equal opportunities employer. We recognise merit and encourage diversity. We do not tolerate any form of discrimination on the basis of nationality, race, colour, religion, caste, gender identity or expression, sexual orientation, disability, age, or marital status, and we ensure equal opportunities for all our team members. If this sounds like a role for you, apply now! We look forward to meeting you.
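As a rough sketch of the medallion-architecture and data-quality responsibilities listed in this role, the snippet below promotes records from a bronze to a silver Delta table with a simple quality gate. The Unity Catalog names, columns, and 5% rejection threshold are all hypothetical.

```python
# Illustrative bronze -> silver step in a medallion-style Databricks pipeline,
# with a simple data-quality gate. Catalog, schema, column names, and the
# 5% rejection threshold are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw ingested records, addressed via a Unity Catalog three-level name
bronze = spark.table("main.bronze.sales_feed")  # hypothetical
bronze_count = bronze.count()

# Silver: cleansed, conformed records
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())      # mandatory-key check
    .dropDuplicates(["event_id"])
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("processed_ts", F.current_timestamp())
)
silver_count = silver.count()

# Monitoring hook: fail fast if too many rows were rejected
if bronze_count > 0 and (bronze_count - silver_count) / bronze_count > 0.05:
    raise ValueError(f"Quality gate failed: {bronze_count - silver_count} rows rejected")

silver.write.mode("overwrite").saveAsTable("main.silver.sales")  # hypothetical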
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
As a Data Engineer you will work independently or with a team of data engineers on cloud technology products, projects, and initiatives. You will work with all customers, both internal and external, to design and implement features, and you will collaborate with other technical teams across the organization as required to deliver proposed solutions. You will also be responsible for collaborating with tech leads, architects and managers.

Responsibilities
Works with Scrum masters, product owners, and others to identify new features for digital products.
Takes responsibility for the maintenance and security of existing solutions, platforms and frameworks, designing new ones and recommending relevant upgrades and fixes.
Troubleshoots production issues of all levels and severities, and tracks progress from identification through resolution.
Maintains a culture of open communication, collaboration, mutual respect and productive behavior.
Identifies risks, barriers, efficiencies and opportunities when thinking through the development approach; presents possible platform-wide architectural solutions based on facts, data, and best practices.
Explores all technical options when considering a solution, including homegrown coding, third-party subsystems, enterprise platforms, and existing technology components.
Actively participates in collaborative effort through all phases of the software development life cycle (SDLC), including requirements analysis, technical design, coding, testing, release, and customer technical support.
Develops technical documentation, such as system context diagrams, design documents, release procedures, and other pertinent artifacts.
Collaborates accordingly while working on cross-functional projects or production issues.

Qualifications
5-8 years preferred experience in a data engineering role.
Minimum of 4 years of preferred experience in Azure data services (Data Factory, Databricks, ADLS, SQL DB, etc.).
Minimum bachelor's degree in Computer Science, Computer Engineering or "STEM" majors (Science, Technology, Engineering, and Math).
Strong working knowledge of Databricks and ADF.
Expertise working with databases and SQL.
Strong working knowledge of code management and continuous integration systems (Azure DevOps or GitHub) preferred.
Familiarity with Agile delivery methodologies.
Familiarity with NoSQL databases (such as MongoDB) preferred.
Any experience with IoT data standards like Project Haystack, Brick Schema, or Real Estate Core is an added advantage.
Ability to multi-task and reprioritize in a dynamic environment.
Outstanding written and verbal communication skills.

About The Team
At Wesco, we build, connect, power and protect the world. As a leading provider of business-to-business distribution, logistics services and supply chain solutions, we create a world that you can depend on. Our Company's greatest asset is our people. Wesco is committed to fostering a workplace where every individual is respected, valued, and empowered to succeed. We promote a culture that is grounded in teamwork and respect. With a workforce of over 20,000 people worldwide, we embrace the unique perspectives each person brings. Through comprehensive benefits and active community engagement, we create an environment where every team member has the opportunity to thrive. Learn more about Working at Wesco here and apply online today! Founded in 1922 and headquartered in Pittsburgh, Wesco is a publicly traded (NYSE: WCC) FORTUNE 500® company.
Wesco International, Inc., including its subsidiaries and affiliates ("Wesco"), provides equal employment opportunities to all employees and applicants for employment. Employment decisions are made without regard to race, religion, color, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, or other characteristics protected by law. For US applicants, we are an Equal Opportunity Employer. Los Angeles Unincorporated County candidates only: qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance and the California Fair Chance Act.
Posted 1 month ago
6.0 - 11.0 years
8 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Opening: Senior Data Engineer (Remote, Contract, 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

#KeyResponsibilities
Build scalable ETL pipelines and implement robust data solutions in Azure.
Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
Design and maintain secure and efficient data lake architecture.
Work with stakeholders to gather data requirements and translate them into technical specs.
Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
Monitor data quality, performance bottlenecks, and scalability issues.
Write clean, organized, reusable PySpark code in an Agile environment.
Document pipelines, architectures, and best practices for reuse.

#MustHaveSkills
Experience: 6+ years in Data Engineering
Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
Agile, SDLC, Containerization (Docker), Clean coding practices

#GoodToHaveSkills
Event Hubs, Logic Apps
Power BI
Strong logic building and competitive programming background

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
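Since this listing stresses clean, organized, reusable PySpark code, here is one possible pattern, sketched under assumptions: the transformation lives in a pure function over DataFrames and is exercised with a pytest unit test. The function name, columns, and threshold are hypothetical.

```python
# Illustrative pattern for clean, reusable PySpark code: the transformation
# is a pure function over DataFrames, exercised by a pytest unit test.
# The function name, columns, and threshold are hypothetical.
import pytest
from pyspark.sql import DataFrame, SparkSession, functions as F


def flag_high_value_orders(orders: DataFrame, threshold: float = 1000.0) -> DataFrame:
    """Add a boolean column marking orders above the given amount."""
    return orders.withColumn("is_high_value", F.col("amount") > threshold)


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_flag_high_value_orders(spark):
    df = spark.createDataFrame([("o1", 1500.0), ("o2", 200.0)], ["order_id", "amount"])
    result = {
        row["order_id"]: row["is_high_value"]
        for row in flag_high_value_orders(df).collect()
    }
    assert result == {"o1": True, "o2": False}
```

Keeping the logic out of notebooks and behind small functions like this is what makes CI/CD pipelines in Azure DevOps able to run meaningful tests before deployment.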
Posted 1 month ago
5.0 - 14.0 years
0 Lacs
Kochi, Kerala, India
On-site
Skill: Databricks Admin
Experience: 5 to 14 years
Location: Kochi (walk-in on 14th June)

Key Skills
Expertise in Databricks administration, Azure Data Factory (ADF), Synapse, and Azure Data Lake Storage (ADLS).
Strong knowledge of SQL, performance tuning, and data pipeline optimization.
Experience with Python for data engineering and scripting tasks.
Nice-to-haves: knowledge of Power BI, SSIS, and advanced analytics.

Roles & Responsibilities
Lead the architecture and delivery of Databricks-based data solutions for the Data Office.
Design, implement, and optimize scalable data pipelines and workflows in Databricks.
Collaborate with stakeholders to onboard and engineer data for analytics, dashboards, and reporting needs.
Review and troubleshoot Databricks notebooks, workflows, and pipelines for quality assurance.
Promote best practices for big data processing using Databricks.
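As a sketch of the performance-tuning and pipeline-optimization work this role mentions, the snippet below runs routine Delta table maintenance on Databricks. The table and column names are assumed for illustration.

```python
# Illustrative Delta table maintenance on Databricks: compact small files,
# Z-order by a commonly filtered column, and clean up stale files.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate data for faster selective reads
spark.sql("OPTIMIZE main.silver.sales ZORDER BY (customer_id)")

# Remove files no longer referenced by the table (default 7-day retention)
spark.sql("VACUUM main.silver.sales")

# Inspect file counts and sizes after maintenance
spark.sql("DESCRIBE DETAIL main.silver.sales").show(truncate=False)
```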
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
🚀 Now Hiring: Senior Data Engineers – Azure & DBT/Snowflake
📍 Location: Chennai (Hybrid)
🕒 Experience: 5+ Years
🔢 Openings: 4
📌 Employment Type: Full-time

About the Role
We're looking for an experienced Senior Data Engineer who's passionate about building modern data platforms and solving complex data challenges. You'll work across cloud technologies like Azure, Snowflake, DBT, and Databricks to architect, develop, and scale our data infrastructure. If you're excited about optimizing data pipelines, enabling better analytics, and shaping data architecture, this role is for you!

What You'll Do
✅ Design and build scalable data pipelines using Azure Data Factory, Databricks, and Snowflake
✅ Develop modular and reusable data models with DBT
✅ Transform and load data from diverse sources into cloud-based data lakes and warehouses
✅ Ensure data accuracy, security, and performance at every step
✅ Collaborate with data analysts, product teams, and business stakeholders
✅ Maintain clean code with unit tests and clear documentation
✅ Drive automation, CI/CD, and process improvement initiatives

What We're Looking For
🔹 Strong hands-on experience with Azure Data Services (Data Lake, ADF, Synapse, Databricks)
🔹 Proficiency in DBT and Snowflake for modeling and transformation
🔹 Advanced SQL and Python skills
🔹 Understanding of data warehouse concepts (dimensional modeling, SCD, CDC)
🔹 Familiarity with Airflow, Fivetran, Glue, or similar tools
🔹 Comfortable working in Agile environments and cross-functional teams
🔹 Great communication skills and a proactive mindset

#DataEngineer #Azure #Snowflake #DBT #Databricks #Hiring #TechJobs #DataJobs #RemoteJobs #SQL #Python
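For a flavor of the DBT-on-Snowflake modeling this posting calls for, here is a minimal, hedged sketch of a dbt Python model (dbt supports Python models on Snowflake via Snowpark). The upstream model name and columns are hypothetical.

```python
# Illustrative dbt Python model for Snowflake (executed via Snowpark).
# It would live at models/marts/fct_daily_orders.py in a dbt project.
# The upstream ref and column names are hypothetical placeholders.
import snowflake.snowpark.functions as F


def model(dbt, session):
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")  # hypothetical staging model

    # Aggregate to one row per order date
    return (
        orders
        .group_by("order_date")
        .agg(
            F.count("order_id").alias("order_count"),
            F.sum("amount").alias("total_amount"),
        )
    )
```

Most dbt models on such a stack would stay in SQL; a Python model like this is typically reserved for logic that is awkward to express in SQL alone.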
Posted 1 month ago