8.0 - 12.0 years
13 - 20 Lacs
Patna
Work from Office
Key Responsibilities:
OSI PI System Development: Design, configure, and implement OSI PI System solutions, including PI Asset Framework (AF), PI Data Archive, PI Vision, and PI Integrators.
Asset Framework (AF) Modeling: Develop hierarchical asset models, templates, and calculations to standardize data across industrial operations.
Real-time Data Integration: Work with SCADA, DCS, PLCs, and IoT systems to integrate real-time and historical data into OSI PI.
Scripting & Automation: Develop scripts using PowerShell, Python, or PI SDK (AF SDK, PI Web API, or PI SQL DAS) to automate data processes.
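The scripting duties above often come down to pulling point data out of the PI Data Archive over PI Web API. A minimal sketch of that pattern, with the live HTTP call replaced by a hard-coded sample payload so it runs offline; the base URL and WebId are hypothetical, and the response shape follows PI Web API's "recorded values" convention:

```python
import json
from urllib.parse import urlencode

def recorded_values_url(base_url: str, web_id: str, start: str, end: str) -> str:
    """Build a PI Web API 'recorded values' request URL for a stream.

    base_url and web_id are placeholders; real WebIds come from a prior
    point lookup by path against the same server.
    """
    query = urlencode({"startTime": start, "endTime": end})
    return f"{base_url}/streams/{web_id}/recorded?{query}"

# Sample payload shaped like a PI Web API response: Items of Timestamp/Value.
sample_response = json.loads("""
{"Items": [
  {"Timestamp": "2024-01-01T00:00:00Z", "Value": 71.2, "Good": true},
  {"Timestamp": "2024-01-01T00:05:00Z", "Value": 72.8, "Good": true},
  {"Timestamp": "2024-01-01T00:10:00Z", "Value": -999.0, "Good": false}
]}
""")

def good_values(payload: dict) -> list:
    """Keep only values flagged Good, as (timestamp, value) pairs."""
    return [(i["Timestamp"], i["Value"]) for i in payload["Items"] if i["Good"]]

url = recorded_values_url("https://pi.example.com/piwebapi", "F1DPabc123", "*-1h", "*")
print(url)
print(good_values(sample_response))
```

In a real automation script the URL would be fetched with an authenticated HTTP client and the bad-quality filtering would feed downstream cleansing, but the filter-on-quality step itself looks much like this.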
Posted 4 weeks ago
8.0 - 12.0 years
13 - 20 Lacs
Vadodara
Work from Office
Key Responsibilities:
OSI PI System Development: Design, configure, and implement OSI PI System solutions, including PI Asset Framework (AF), PI Data Archive, PI Vision, and PI Integrators.
Asset Framework (AF) Modeling: Develop hierarchical asset models, templates, and calculations to standardize data across industrial operations.
Real-time Data Integration: Work with SCADA, DCS, PLCs, and IoT systems to integrate real-time and historical data into OSI PI.
Scripting & Automation: Develop scripts using PowerShell, Python, or PI SDK (AF SDK, PI Web API, or PI SQL DAS) to automate data processes.
Posted 4 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Job Title: Senior Oracle PL/SQL Developer
Location: Hyderabad
Experience: 6-8 Years
Primary Skills: Oracle Database, Oracle SQL Developer

Job Summary:
The Database Lead PL/SQL role requires an experienced professional with 8-10 years in database management and administration. This position focuses on designing, implementing, and optimizing databases, specifically with Oracle and MS SQL Server, ensuring efficient data storage, high performance, and robust data security. The ideal candidate will excel in SQL, T-SQL, and PL/SQL, have a deep understanding of database design and schema modeling, and possess advanced skills in database administration, including backup, recovery, and performance tuning. Additionally, experience with data integration and reporting tools such as SSIS and SSRS is essential. This role involves close collaboration with cross-functional teams to meet business requirements and maintain compliance standards.

Key Responsibilities:
SQL Language Proficiency: A strong understanding of Structured Query Language (SQL) is essential, including querying, data manipulation, and data definition.
Database Design and Modeling: Proficiency in designing and modeling databases is crucial. Should be able to create efficient and well-structured database schemas, tables, indexes, and relationships.
Oracle & MS SQL Server: In-depth knowledge and experience working with Oracle DB and Microsoft SQL Server, including the latest versions. Familiarity with features such as stored procedures, triggers, functions, and views is important.
T-SQL: Thorough knowledge of Transact-SQL (T-SQL), the Microsoft SQL Server implementation of SQL. Should be proficient in writing complex queries, optimizing query performance, and understanding T-SQL programming constructs.
Database Administration: Understanding of database administration tasks, including security management, user management, backup and recovery, monitoring, and performance tuning.
Query Optimization: Ability to analyze and optimize SQL queries for improved performance. Familiarity with query execution plans, indexing strategies, and performance tuning techniques is valuable.
Data Integration: Experience in integrating data from various sources using SQL Server Integration Services (SSIS) or other ETL (Extract, Transform, Load) tools.
Reporting and Analysis: Knowledge of reporting and analysis tools such as SQL Server Reporting Services (SSRS), Power BI, or similar tools to create meaningful reports and visualizations.
Troubleshooting and Debugging: Proficiency in identifying and resolving database-related issues, debugging stored procedures, and optimizing database performance.
Version Control: Familiarity with version control systems, such as Git, to manage and track database schema changes and scripts.
Security and Compliance: Understanding of database security best practices, including authentication, authorization, and data encryption. Awareness of regulatory compliance requirements, such as IRDAI, is beneficial.
Scripting and Automation: Ability to write scripts using PowerShell, Python, or other scripting languages to automate routine tasks, perform data transformations, or deploy database changes.
Collaboration and Communication: Strong collaboration and communication skills to work effectively in a team, gather requirements, and communicate technical concepts to non-technical stakeholders.
Continuous Learning: Proactive in keeping up with the latest developments, trends, and best practices in PL/SQL, SQL Server, and database technologies.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced certification in SQL or Oracle databases is preferred.
6-8 years of hands-on experience with SQL, PL/SQL, Oracle, and MS SQL Server.

Technical Skills:
Strong foundation in SQL, data manipulation, and data definition.
Expertise in database schema design, indexes, and relational database structures.
Extensive experience with Oracle and MS SQL Server, including advanced features like stored procedures, functions, triggers, and views.
Proficiency in T-SQL, query execution plans, indexing strategies, and optimization techniques.
Knowledge of security, user management, backup/recovery, monitoring, and performance tuning.
Familiarity with SQL Server Integration Services (SSIS) and other ETL tools for data extraction, transformation, and loading.
Experience with SQL Server Reporting Services (SSRS), Power BI, or similar tools for reporting and data visualization.
Strong problem-solving skills in identifying and debugging database issues.
Understanding of version control practices, especially with Git.
Awareness of database security, encryption, and regulatory standards like IRDAI.
Ability to automate processes using PowerShell, Python, or similar scripting languages.

Soft Skills: Collaboration & Teamwork, Communication, Problem Solving, Attention to Detail, Adaptability & Continuous Learning.

Good to Have: Cloud Platforms, Additional Scripting Knowledge, Database Migration Experience, CI/CD for Databases, Data Governance, Advanced Data Analytics Tools.

Compensation & Benefits: Competitive salary and annual performance-based bonuses; comprehensive health and optional parental insurance; retirement savings and tax-saving plans.

Key Performance Indicators (KPI):
Database Uptime: Maintain database availability with 99.9% uptime.
Query Performance: Average query execution time under a set threshold.
Data Recovery Time: Meet RTO and RPO targets for data recovery.
Incident Resolution Time: Average time to identify, troubleshoot, and resolve database issues.
Data Integration Efficiency: Percentage of ETL processes completed without failure or delay.
Compliance Adherence: Ensure all database practices adhere to regulatory and compliance standards.
Project Turnaround: Timely delivery of database design, modeling, or optimization tasks within project deadlines.

Key Responsibility Areas (KRA): Database Design and Architecture; Database Administration and Maintenance; Performance Optimization; Data Integration and ETL Processes; Compliance and Security; Documentation and Version Control; Reporting and Visualization; Collaboration with Stakeholders; Continuous Improvement. (ref:hirist.tech)
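The indexing and query-plan skills this role calls for can be exercised without an Oracle or SQL Server instance. A small illustrative sketch using Python's built-in sqlite3 as a stand-in: SQLite's EXPLAIN QUERY PLAN plays the role of a real engine's execution plan, and the table and index names are made up for the example (syntax and plan output differ on Oracle and MS SQL Server):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_no TEXT, holder TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?, ?)",
    [(f"P{i:05d}", f"holder{i}", 1000.0 + i) for i in range(1000)],
)

def plan(sql: str) -> str:
    # Concatenate the 'detail' column of EXPLAIN QUERY PLAN output.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)

# Without an index, the point lookup scans the whole table.
before = plan("SELECT * FROM policies WHERE policy_no = 'P00042'")

# Adding an index turns the full scan into an index search.
conn.execute("CREATE INDEX idx_policy_no ON policies(policy_no)")
after = plan("SELECT * FROM policies WHERE policy_no = 'P00042'")

print("before:", before)
print("after: ", after)
```

The same investigation on Oracle would use EXPLAIN PLAN plus DBMS_XPLAN, and on SQL Server the graphical or SET SHOWPLAN output, but the workflow (read the plan, add or adjust an index, re-check the plan) is the one the posting describes.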
Posted 4 weeks ago
6.0 - 10.0 years
15 - 16 Lacs
Pune
Work from Office
Role Description
This is a full-time on-site role for an Appian Developer located in Pune. The Appian Developer will be responsible for developing and maintaining Appian-based applications. Day-to-day tasks include designing and coding software solutions, collaborating with cross-functional teams, debugging issues, and implementing new features. The role also involves ensuring applications are efficiently integrated with backend systems and comply with best practices in software development.

Experience Required: 6 to 10 years as an Appian developer with RPA exposure

Requirements:
1) Process Modeling and Workflow Automation: Candidates must showcase their understanding of Appian's Process Modeler and demonstrate the ability to optimize execution paths and manage exceptions.
2) Data Integration and Management Expertise: Ensures that candidates can create secure API integrations and manage external data sources, crucial for maintaining data integrity and connectivity.
3) Scripting and Expression Rule Development: Evaluates the candidate's capability to write expressions and scripts that extend application functionality.
4) Security and Access Control Management: Assesses the ability to configure security roles and access permissions, ensuring that applications comply with organizational policies and protect sensitive data.
5) Appian Deployment and Performance Optimization: Ensures that applications are deployed efficiently and perform optimally.

These skills are essential for designing, implementing, and managing scalable and efficient applications using Appian's low-code platform.
Posted 4 weeks ago
1.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer
Experience: 5-8 Years
Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable)
Time Zone: Aligned with UK Time Zone
Notice Period: Immediate Joiners Only

Role Overview:
We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more.

Key Responsibilities:
Design, build, and manage scalable and reliable data pipelines for real-time and batch processing.
Implement robust data processing solutions using GCP services and open-source technologies.
Create efficient data models and write high-performance analytics queries.
Optimize pipelines for performance, scalability, and cost-efficiency.
Collaborate with data scientists, analysts, and engineering teams to ensure smooth data integration and transformation.
Maintain high data quality, enforce validation rules, and set up monitoring and alerting.
Participate in code reviews, deployment activities, and provide production support.

Technical Skills Required:
Cloud Platforms: GCP (Google Cloud Platform) - mandatory
Key GCP Services: Dataproc, BigQuery, Dataflow
Programming Languages: Python, Java, PySpark
Data Engineering Concepts: Data ingestion, Change Data Capture (CDC), ETL/ELT pipeline design
Strong understanding of distributed computing, data structures, and performance tuning

Required Qualifications & Attributes:
5-8 years of hands-on experience in data engineering roles
Proficiency in building and optimizing distributed data pipelines
Solid grasp of data governance and security best practices in cloud environments
Strong analytical and problem-solving skills
Effective verbal and written communication skills
Proven ability to work independently and in cross-functional teams
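Change Data Capture, listed among the skills above, usually reduces to replaying a stream of keyed change events onto a current-state table: keep the latest event per key, honor deletes, and ignore out-of-order duplicates. A minimal, illustrative sketch of that merge logic in plain Python; in this stack it would typically run as a BigQuery MERGE or a Dataflow job, and the event shape here is an assumption:

```python
from typing import Iterable

def apply_cdc(current: dict, events: Iterable) -> dict:
    """Replay CDC events onto a keyed snapshot.

    Each event carries a key, a monotonically increasing sequence number,
    an op ('upsert' or 'delete'), and a row payload. Stale events
    (sequence <= what was already applied for that key) are ignored.
    """
    state = dict(current)
    applied_seq = {k: v["_seq"] for k, v in state.items()}
    for ev in events:
        key, seq = ev["key"], ev["seq"]
        if seq <= applied_seq.get(key, -1):
            continue  # out-of-order or duplicate delivery; skip
        applied_seq[key] = seq
        if ev["op"] == "delete":
            state.pop(key, None)
        else:
            state[key] = {**ev["row"], "_seq": seq}
    return state

snapshot = {"u1": {"name": "Asha", "_seq": 1}}
events = [
    {"key": "u2", "seq": 2, "op": "upsert", "row": {"name": "Ben"}},
    {"key": "u1", "seq": 3, "op": "upsert", "row": {"name": "Asha K"}},
    {"key": "u1", "seq": 0, "op": "delete", "row": None},  # stale, ignored
    {"key": "u2", "seq": 4, "op": "delete", "row": None},
]
print(apply_cdc(snapshot, events))
# {'u1': {'name': 'Asha K', '_seq': 3}}
```

The sequence-number guard is the part that makes the merge idempotent under at-least-once delivery, which is the usual contract for CDC streams feeding a warehouse.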
Posted 4 weeks ago
1.0 - 5.0 years
9 - 13 Lacs
Pune
Work from Office
Data Integration Lead (Only US Citizens and Green Card Holders)

Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

Key Responsibilities:
Data Integration Strategy & Design: Lead the design and implementation of data integration strategies that align with business goals and data architecture.
Project Management: Lead and manage data integration projects, ensuring the successful deployment of integration processes and tools.
Technical Leadership: Provide technical leadership and mentoring to a team of data engineers and analysts. Troubleshoot and resolve complex data integration issues across various systems and platforms.
Data Pipeline Development: Design, implement, and optimize robust data pipelines to integrate data from multiple internal and external sources (e.g., APIs, databases, cloud services).
Data Quality & Governance: Ensure that data integrations maintain high standards of data quality and governance.
Documentation & Reporting: Maintain detailed documentation of integration processes, workflows, and data sources. Provide regular progress reports to senior management on integration activities and performance.

Key Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, Engineering, or a related field.
Experience: 12+ years of experience in data integration, data engineering, or related fields, with a strong background in leading data integration projects.
Posted 4 weeks ago
4.0 - 7.0 years
10 - 14 Lacs
Chennai
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

The Role in Brief: The Data Integration Analyst is responsible for implementing, maintaining, and supporting HL7 interfaces between customers, both external and internal, and Optum's integration platforms. The Analyst will work in a team, but will have individual assignments to work on independently. Analysts are expected to work under aggressive schedules, be self-sufficient, work within established standards, and be able to work on multiple assignments simultaneously. Candidates must be willing to work in a 24/7 environment and will be on-call as needed for critical issues.
Primary Responsibilities:

Interface Design and Development:
Interface Analysis: HL7 message investigation to determine gaps or remediate issues
Interface Design, Development, and Delivery: Interface planning, filtering, transformation, and routing
Interface Validation: Review, verification, and monitoring to ensure the delivered interface passes acceptance testing
Interface Go-Live and Transition to Support: Completing cutover events with teams/partners and executing turnover procedures for hand-off
Provider Enrollments: Provisioning and documentation of all integrations

Troubleshooting and Support:
Issue Resolution: Troubleshoot issues raised by alarms, support, or project managers, from root cause identification to resolution
Support Requests: Handle tier 2/3 support requests and provide timely solutions to ensure client satisfaction
Enhancements / Maintenance: Ensure stable and continuous data delivery

Collaboration and Communication:
Stakeholder Interaction: Work closely with clients, project managers, product managers, and other stakeholders to understand requirements and deliver solutions
Documentation: Contribute to technical documentation of specifications and processes
Communication: Effectively communicate complex concepts, both verbally and in writing, to team members and clients

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Basic Qualifications:
Education: Bachelor's degree in Computer Science or any engineering field
Experience: 2+ years of experience working with HL7 data and integration engines or platforms
Technical Aptitude: Ability to learn new technologies
Skills: Proven solid analytical and problem-solving skills

Required Qualifications:
Undergraduate degree or equivalent experience
HL7 Standards knowledge: HL7 v2, v3, CDA
Integration Tools knowledge: InterSystems IRIS, Infor Cloverleaf, NextGen Mirth Connect, or equivalent
Cloud Technology knowledge: Azure or AWS
Scripting and Structure: Proficiency in T-SQL and procedural scripting, XML, JSON

Preferred Qualifications:
HL7 Standards knowledge: HL7 FHIR, US Core
Integration Tools knowledge: InterSystems Ensemble or IRIS, Caché
Scripting and Structure knowledge: ObjectScript, Perl, Tcl, JavaScript
US Health Care knowledge
Health Information Systems: Working knowledge
Clinical Data Analysis knowledge
Clinical Processes: Understanding of clinical processes and vocabulary

Soft Skills:
Analytical and Creative: Highly analytical, curious, and creative
Organized: Proven solid organization skills and attention to detail
Ownership: Takes ownership of responsibilities

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
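The HL7 message investigation at the heart of this role starts with splitting a v2 message into segments and fields. A bare-bones parsing sketch using only the standard delimiters (| for fields, ^ for components); real interface work would run through an engine like Mirth Connect or IRIS, and the ADT message below is fabricated for illustration, not real patient data:

```python
def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into segments keyed by segment ID.

    Segments are separated by carriage returns, fields by '|',
    components by '^'. Repetition and escape handling are omitted.
    """
    segments: dict = {}
    for raw in filter(None, message.strip().split("\r")):
        fields = raw.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

# A fabricated ADT^A01 (admit) message.
adt = (
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|202401011200||ADT^A01|MSG001|P|2.5\r"
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F\r"
    "PV1|1|I|ICU^101^A\r"
)

segs = parse_hl7(adt)
msh = segs["MSH"][0]
pid = segs["PID"][0]
# Note: because MSH-1 is the field separator itself, list index 8 is MSH-9.
print("message type:", msh[8])                 # ADT^A01
print("patient name:", pid[5])                 # DOE^JANE
print("given name:  ", pid[5].split("^")[1])   # JANE
```

Gap analysis on a feed then becomes systematic: compare which segments and fields a sender actually populates against what the receiving interface expects, exactly the kind of investigation the responsibilities above describe.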
Posted 4 weeks ago
4.0 - 7.0 years
12 - 17 Lacs
Noida
Work from Office
The CMDB Analyst is essential in maintaining the integrity and accuracy of the ServiceNow Configuration Management Database (CMDB). This role focuses on enhancing data integration and management through robust ETL processes and detailed data analysis.

Primary Responsibilities:
Oversee the ServiceNow CMDB operations and data integrity
Develop and refine ETL processes for optimal data integration
Regularly audit CMDB data to ensure accuracy and reliability
Collaborate with IT teams to manage configuration item relationships in the CMDB
Provide training and support to internal customers on CMDB functions
Align CMDB management processes with organizational goals
Required Qualifications:
Bachelor's degree or equivalent
Proven experience in managing ServiceNow CMDB
Expertise in SQL and PL/SQL, focusing on ETL process optimization
Solid background in database development and data analysis within large-scale environments
Familiarity with modern, open-source technologies and their integration into existing systems
Demonstrated ability to develop and deploy database solutions in distributed systems
Proven excellent customer service skills and the ability to influence successful outcomes
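Auditing CMDB data, as this role requires, often means checking that every configuration item's declared relationships point at CIs that actually exist. A toy sketch of such an orphan check over an in-memory CI table; the record shape and names are assumptions for illustration (in ServiceNow this data lives in the cmdb_ci and cmdb_rel_ci tables and would be queried via the platform's APIs):

```python
def find_orphan_relationships(cis: dict, rels: list) -> list:
    """Return relationships whose parent or child CI is missing.

    cis maps sys_id -> CI record; rels are (parent_id, type, child_id) tuples.
    """
    return [r for r in rels if r[0] not in cis or r[2] not in cis]

cis = {
    "ci_app": {"name": "billing-app", "class": "cmdb_ci_appl"},
    "ci_db":  {"name": "billing-db",  "class": "cmdb_ci_database"},
    "ci_srv": {"name": "srv01",       "class": "cmdb_ci_server"},
}
rels = [
    ("ci_app", "Depends on", "ci_db"),
    ("ci_db",  "Runs on",    "ci_srv"),
    ("ci_app", "Runs on",    "ci_retired"),  # dangling reference: the audit should flag it
]
orphans = find_orphan_relationships(cis, rels)
print(orphans)  # [('ci_app', 'Runs on', 'ci_retired')]
```

Scheduled checks like this, feeding a remediation queue, are one concrete form the "regularly audit CMDB data" responsibility takes.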
Posted 4 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Primary Responsibilities:
Support the full data engineering lifecycle including research, proof of concepts, design, development, testing, deployment, and maintenance of data management solutions
Utilize knowledge of various data management technologies to drive data engineering projects
Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate the client data warehouse and power analytics across numerous health care domains
Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading data marts and reporting aggregates
Eliminate unwarranted complexity and unneeded interdependencies
Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges
Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
Effectively create data transformations that address business requirements and other constraints
Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
Support the implementation of a modern data framework that facilitates business intelligence reporting and advanced analytics
Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation, and data movement
Leverage DevOps tools to enable code versioning and code deployment
Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
Leverage processes and diagnostic tools to troubleshoot, maintain, and optimize solutions and respond to customer and production issues
Continuously support technical debt reduction, process transformation, and overall optimization
Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability

Required Qualifications:
Bachelor's Degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
3+ years of experience with Microsoft Azure Cloud, Azure Data Factory, Databricks, Spark, Scala/Python, ADO
5+ years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage
5+ years of combined experience working with industry-standard relational, dimensional, or non-relational data storage systems
5+ years of experience designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS, PL/SQL, T-SQL, etc.
5+ years of experience managing data assets using SQL, Python, Scala, VB.NET, or other similar querying/coding languages
3+ years of experience working with healthcare data or data to support healthcare organizations
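"Detect data quality issues, identify root causes" in the responsibilities above is commonly implemented as rule-based checks that run inside the pipeline before load. A framework-free, illustrative sketch; the rule names and record shape are assumptions, and in this stack the same checks would typically run as a Databricks or Data Factory validation step:

```python
def run_quality_checks(rows: list) -> dict:
    """Run simple data-quality rules; return rule name -> offending row indexes."""
    failures = {"missing_member_id": [], "negative_amount": [], "duplicate_claim": []}
    seen_claims = set()
    for i, row in enumerate(rows):
        if not row.get("member_id"):
            failures["missing_member_id"].append(i)
        if row.get("amount", 0) < 0:
            failures["negative_amount"].append(i)
        claim = row.get("claim_id")
        if claim in seen_claims:
            failures["duplicate_claim"].append(i)
        seen_claims.add(claim)
    return failures

batch = [
    {"claim_id": "C1", "member_id": "M1", "amount": 120.0},
    {"claim_id": "C2", "member_id": "",   "amount": 80.0},   # missing member id
    {"claim_id": "C1", "member_id": "M3", "amount": -5.0},   # duplicate key + negative amount
]
print(run_quality_checks(batch))
# {'missing_member_id': [1], 'negative_amount': [2], 'duplicate_claim': [2]}
```

Recording the offending row indexes rather than just a pass/fail flag is what makes the root-cause step tractable: failed rows can be quarantined and traced back to the source system that produced them.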
Posted 4 weeks ago
7.0 - 11.0 years
18 - 22 Lacs
Noida
Work from Office
The Senior CMDB Architect plays a critical senior role in overseeing the strategic design and maintenance of the ServiceNow Configuration Management Database (CMDB). This position is responsible for architecting robust ETL processes and comprehensive data analysis frameworks, and for ensuring the overall integrity and effectiveness of the CMDB within large-scale enterprise environments.

Primary Responsibilities:
Lead the strategic design and continuous enhancement of the ServiceNow CMDB architecture
Design, implement, and oversee complex ETL processes tailored for optimal data integration and system efficiency
Conduct high-level audits and develop advanced data validation techniques to ensure utmost accuracy and reliability of CMDB data
Spearhead collaboration with senior IT leadership and cross-functional teams to ensure alignment of the CMDB with business objectives and IT infrastructure changes
Provide expert guidance and mentorship to CMDB analysts and other IT staff on best practices, advanced troubleshooting, and complex issue resolution
Drive innovation in CMDB processes through the integration of cutting-edge technologies and methodologies
Develop and enforce governance policies to maintain data integrity and compliance across the CMDB lifecycle

Required Qualifications:
10+ years of experience in a similar role, with at least 5 years in a leadership or architectural capacity
Extensive experience in ServiceNow CMDB architecture and administration, with a proven track record of managing a CMDB in a large-scale, distributed environment
Deep expertise in SQL, PL/SQL, and ETL process design, with a solid emphasis on performance optimization and scalability
Advanced knowledge of modern, open-source technologies and their deployment in enterprise settings
Proven analytical skills and the ability to synthesize complex data into actionable insights
Demonstrated leadership in developing, integrating, and deploying sophisticated database solutions
Exceptional problem-solving abilities and the capacity to work on multiple projects and issues simultaneously
Solid communication and interpersonal skills, with experience influencing C-suite executives and fostering a collaborative team environment

Preferred Qualification:
Relevant certifications in ServiceNow, Database Management, or a related field

This senior role demands a visionary leader who can maintain technological excellence and drive the strategic goals of our CMDB initiatives while ensuring alignment with the broader IT and business strategies.
Posted 4 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Primary Responsibilities:
Design, develop, and maintain scalable data/code pipelines using Azure Databricks, Apache Spark, and Scala
Collaborate with data engineers, data scientists, and business stakeholders to understand data requirements and deliver high-quality data solutions
Optimize and tune Spark applications for performance and scalability
Implement data processing workflows, ETL processes, and data integration solutions
Ensure data quality, integrity, and security throughout the data lifecycle
Troubleshoot and resolve issues related to data processing and pipeline failures
Stay updated with the latest industry trends and best practices in big data technologies
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Undergraduate degree or equivalent experience. 6+ years of proven experience with Azure Databricks, Apache Spark, and Scala. 6+ years of experience with Microsoft Azure. Experience with data warehousing solutions and ETL tools. Solid understanding of distributed computing principles and big data processing. Proficiency in writing complex SQL queries and working with relational databases. Proven excellent problem-solving skills and attention to detail. Solid communication and collaboration skills. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
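The extract-transform-load pipeline shape this role centers on can be sketched in plain Python; this is a minimal illustration with made-up stage names and sample records, not the posting's actual stack (a real Databricks job would express the same stages with Spark DataFrames in Scala or PySpark):

```python
def extract(rows):
    """Simulate reading raw records from a source system."""
    return list(rows)

def transform(rows):
    """Drop invalid records and normalize field names and types."""
    out = []
    for r in rows:
        if r.get("id") is None:
            continue  # data-quality rule: records without an id are rejected
        out.append({"id": int(r["id"]), "amount": round(float(r.get("amount", 0)), 2)})
    return out

def load(rows, sink):
    """Write transformed records to a destination (here, just a list)."""
    sink.extend(rows)
    return len(rows)

raw = [{"id": "1", "amount": "10.5"}, {"id": None}, {"id": "2"}]
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded)  # 2 records survive the validity check
```

The same composition of small, independently testable stages is what makes pipeline failures easy to isolate and resolve.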
Posted 4 weeks ago
4.0 - 7.0 years
10 - 14 Lacs
Gurugram
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: As a Senior Data Engineering Analyst, you will be instrumental in driving our data initiatives and enhancing our data infrastructure to support strategic decision-making and business operations. You will lead the design, development, and optimization of complex data pipelines and architectures, ensuring the efficient collection, storage, and processing of large volumes of data from diverse sources. Leveraging your advanced expertise in data modeling and database management, you will ensure that our data systems are scalable, reliable, and optimized for high performance. A core aspect of your role will involve developing and maintaining robust ETL (Extract, Transform, Load) processes to facilitate seamless data integration and transformation, thereby supporting our analytics and reporting efforts. You will implement best practices in data warehousing and data lake management, organizing and structuring data to enable easy access and analysis for various stakeholders across the organization.
Ensuring data quality and integrity will be paramount; you will establish and enforce rigorous data validation and cleansing procedures to maintain high standards of accuracy and consistency within our data repositories. In collaboration with cross-functional teams, including data scientists, business analysts, and IT professionals, you will gather and understand their data requirements, delivering tailored technical solutions that align with business objectives. Your ability to communicate complex technical concepts to non-technical stakeholders will be essential in fostering collaboration and ensuring alignment across departments. Additionally, you will mentor and provide guidance to junior data engineers and analysts, promoting a culture of continuous learning and professional growth within the data engineering team. You will take a proactive role in performance tuning and optimization of our data systems, identifying and resolving bottlenecks to enhance efficiency and reduce latency. Staying abreast of the latest advancements in data engineering technologies and methodologies, you will recommend and implement innovative solutions that drive our data capabilities forward. Your strategic input will be invaluable in planning and executing data migration and integration projects, ensuring seamless transitions between systems with minimal disruption to operations. Maintaining comprehensive documentation of data processes, architectural designs, and technical specifications will be a key responsibility, supporting knowledge sharing and maintaining organizational standards. You will generate detailed reports on data quality, system performance, and the effectiveness of data engineering initiatives, providing valuable insights to inform strategic decisions.
Additionally, you will oversee data governance protocols, ensuring compliance with relevant data protection regulations and industry standards, thereby safeguarding the integrity and security of our data assets. Your leadership and expertise will contribute significantly to the enhancement of our data infrastructure, enabling the organization to leverage data-driven insights for sustained growth and competitive advantage. By fostering innovation, ensuring data excellence, and promoting best practices, you will play a critical role in advancing our data engineering capabilities and supporting the overall success of the business. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field. Experience: 5+ years in data engineering, data analysis, or a similar role with a proven track record. Technical Skills: Advanced proficiency in SQL and experience with relational databases (Oracle, MySQL, SQL Server). Expertise in ETL processes and tools. Solid understanding of data modeling, data warehousing, and data lake architectures. Proficiency in programming languages such as Python or Java. Familiarity with cloud platforms (Azure) and their data services. Knowledge of data governance principles and data protection regulations (GDPR, HIPAA, CCPA). Soft Skills: Proven excellent analytical and problem-solving abilities. Solid communication and collaboration skills. Leadership experience and the ability to mentor junior team members. A proactive mindset with a commitment to continuous learning and improvement. Preferred Qualifications: Relevant certifications. Experience with version control systems (Git). At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
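The validation-and-cleansing step the role describes usually boils down to a rule-driven checker run before data lands in the repository. A minimal stdlib sketch (the field names and rules below are illustrative, not from the posting):

```python
# Each rule maps a field name to a predicate the value must satisfy.
RULES = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def validate(record):
    """Return the names of the rules the record violates (empty = clean)."""
    return [field for field, ok in RULES.items()
            if field in record and not ok(record[field])]

good = {"email": "a@b.com", "age": 34}
bad = {"email": "not-an-email", "age": 200}
print(validate(good))  # []
print(validate(bad))   # ['email', 'age']
```

Records failing any rule can then be quarantined and reported, which is the basis of the data-quality reporting the posting mentions.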
Posted 4 weeks ago
4.0 - 7.0 years
7 - 11 Lacs
Kolkata
Work from Office
Subject matter experts in Marketing and Comms provide business stakeholders with specialized advice on their subjects, and act as advisors leveraging specific M&C expertise. He/she is a person with in-depth, unique knowledge and expertise on a specific subject or in a particular industry, e.g., digital marketing, internal comms, telecom, etc. Requirements: Experience working with Product Information Management (PIM) systems and familiarity with other marketing technology stacks. Basic knowledge of data integration and flow automation is a plus. Strong focus on maintaining data accuracy and consistency across systems, with the ability to identify and resolve data issues. Primary Skills: Minimum of 4-6 years of experience in product data management, marketing operations, or related fields, with a strong focus on managing product data in a digital marketing or e-commerce context. Previous experience in the Consumer-Packaged Goods (CPG) sector or a similar industry, with a strong understanding of how product data supports marketing efforts. Secondary Skills: Ability to quickly learn new tools and processes and adapt to evolving product data management needs within a fast-paced, dynamic environment. Bachelor's degree in Marketing, Information Technology, Business Administration, or a related field (or equivalent experience).
Posted 4 weeks ago
1.0 - 5.0 years
10 - 13 Lacs
Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 08 The Role: Data Transformation Analyst I - Well Logs. The Team: The global subsurface operations team is responsible for regional formation tops studies, structural maps, international data, pressure data and directional surveys. Specifically, the Well Logs team is responsible for digitizing, log editing, and petrophysical data analysis. The team also manages log data collection and publication, accuracy, customer feedback, and digital and raster sales. We value shared contributions, client satisfaction and being part of the team. Responsibilities and Impact: The Data Transformation Analyst will be responsible for sourcing, analysis, digitizing, data entry, maintenance and quality control of the exploration and production well log data within the S&P Global US Energy database. Well log identification, splicing, scoring and quality assurance of high business-value well-log curves into a composite log curve set, then distributing the data to interpretation applications. Petrophysical processing of well log data using different log software, including Powerlog and Kingdom, covering editing of bad data and depth alignment. Resolving well log escalations and providing solutions. Managing historical entries in the database. Participating in data improvement projects through global, country, basin or area reviews conducted by the team. Ensuring consistency, currency and correctness of the data captured from various sources. Supporting the team in day-to-day activities to achieve the set goals. What We're Looking For: Basic Required Qualifications: Bachelor's or master's degree in Geology/Applied Geology/Petroleum Engineering/Earth Science. Good computer skills and basic knowledge of the MS Office suite. Good understanding of petroleum geology and well logging. Experience in the oil and gas industry. Additional Preferred Qualifications: Experience with Powerlog or Kingdom software is preferred.
Interest in managing and handling geological information. Ability to convert technical information into a usable format for entry into databases. Confident user of MS Excel's main functions. Good written and oral communication skills in English. Good team player with proactive behavior. About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights. What's In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective.
Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent.
By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), DTMGOP203 - Entry Professional (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
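The curve splicing described in the Well Logs responsibilities, preferring the highest-scored curve at each depth when building a composite, can be sketched in a few lines. The curve names, depth steps, and scoring order below are hypothetical; real work would use log software such as Powerlog or Kingdom:

```python
def splice(curves):
    """Merge several {depth: value} curves into one composite set,
    taking the value from the first curve (best-scored) that covers
    each depth, and falling back to later curves for the gaps."""
    composite = {}
    for curve in curves:  # curves are ordered best score first
        for depth, value in curve.items():
            composite.setdefault(depth, value)
    return dict(sorted(composite.items()))

run1 = {100: 2.31, 101: 2.35}             # preferred logging run
run2 = {101: 2.40, 102: 2.38, 103: 2.41}  # fallback run filling the gap
print(splice([run1, run2]))
# {100: 2.31, 101: 2.35, 102: 2.38, 103: 2.41}
```

Note that at depth 101 the preferred run's value wins, which is exactly the quality-assurance decision the scoring step encodes.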
Posted 4 weeks ago
4.0 - 8.0 years
14 - 18 Lacs
Gurugram
Work from Office
About the Role: We are seeking a highly analytical and data-driven individual to join our team as a Central Analytics Lead. In this role, you will play a critical role in driving data-driven decision-making across the organization. You will be responsible for building and scaling our central analytics function, identifying key business opportunities, and enabling data-informed strategies across all departments. Key Responsibilities: - Develop and implement robust data analysis frameworks and methodologies. - Build and maintain data dashboards and reports to track key business metrics and monitor performance. - Conduct in-depth data analysis to identify trends, uncover insights, and inform strategic decisions. - Develop and present data-driven presentations to stakeholders across the organization. - Collaborate closely with cross-functional teams (product, marketing, sales, operations) to understand their data needs and provide actionable insights. - Build and mentor a high-performing team of analysts (future expectation). - Identify opportunities for data-driven innovation and process improvement. - Ensure data quality and integrity across all systems. - Stay abreast of the latest advancements in data analytics and visualization. Qualifications: - 4+ years of experience in an analytics role, preferably in a high-growth startup environment. - Exceptional SQL skills and experience with data warehousing and ETL processes. - Strong experience with data visualization tools (e.g., Tableau, Power BI, Looker). - Excellent communication, presentation, and storytelling skills. - Ability to translate complex data into actionable insights and recommendations. - Strong analytical and problem-solving skills with the ability to think critically and independently. - Growth mindset with a strong desire to learn and adapt to new challenges. - Bachelor's degree in Statistics, Mathematics, Economics, Computer Science, or a related field. 
Bonus Points: - Experience building and scaling data infrastructure and systems. - Experience with machine learning and predictive modeling techniques.
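The kind of metric query that sits behind a dashboard tile can be illustrated with the standard-library sqlite3 module; the orders table, columns, and figures below are invented for the sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (day TEXT, region TEXT, revenue REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("2024-01-01", "north", 100.0),
    ("2024-01-01", "south", 50.0),
    ("2024-01-02", "north", 75.0),
])

# Daily revenue rollup: the aggregated shape a BI tool would chart.
rows = conn.execute(
    "SELECT day, SUM(revenue) FROM orders GROUP BY day ORDER BY day"
).fetchall()
print(rows)  # [('2024-01-01', 150.0), ('2024-01-02', 75.0)]
```

In practice the same GROUP BY would run in the warehouse and feed Tableau, Power BI, or Looker; the analyst's job is choosing the grain (here, day) that the business question needs.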
Posted 4 weeks ago
8.0 - 10.0 years
8 - 12 Lacs
Pune
Work from Office
Mandatory Skills: Power BI and Qlik. We are seeking a highly skilled BI Lead with 8-10 years of experience to oversee our BI initiatives. The ideal candidate will have significant hands-on experience with both Qlik and Power BI and will be responsible for leading the migration of existing reports to Power BI. This role requires a deep understanding of BI tools, data modeling, reporting, and data visualization best practices, as well as the ability to technically lead and mentor a BI development team. - Lead BI Projects: Oversee the end-to-end design, development, and deployment of business intelligence solutions, ensuring alignment with business objectives and requirements. - Migration Leadership: Lead and manage the migration of reports and dashboards, ensuring a smooth transition with minimal disruption. - Power BI & Qlik Expertise: Provide hands-on expertise in designing and developing complex dashboards and reports in Power BI and Qlik to meet business needs. - Data Integration & Modeling: Design and implement data models, integrating data from multiple sources for reporting and analytics, ensuring optimal performance and scalability. - Team Leadership & Mentoring: Lead and mentor a team of BI developers, providing technical guidance and supporting their growth. - Collaboration: Work closely with stakeholders across various departments to understand their reporting needs and provide data-driven solutions. - Performance Optimization: Optimize existing BI infrastructure, improving performance and ensuring data accuracy and security. - Process Improvement: Identify opportunities to enhance existing BI processes, recommending and implementing best practices for data governance and reporting. - Presales: Responsible for presales activities including effort estimation, solution design, and proposal development to support business acquisition and client engagements.
- Experience with different databases and environments - MongoDB, PostgreSQL, AWS, NoSQL, Redshift - Development of advanced Tableau dashboards and BOBJ reports (complex formulas, variables, chart types; use of multiple data sources such as HANA, SQL, Excel) - Lead: Responsible for leading BI development projects - Improve: Work with the team to pragmatically and continually reduce the gap between getting things done fast and getting things done right - Plans and manages team resources through scheduling and load balancing to achieve team objectives and SLAs - Communicates to all team members any new requirements, changes, and updates on all support-related and company information - Meets and collaborates with Duke Health stakeholders to understand the EHR data, information, and analytic needs of their business units - Works collaboratively with Business Analysts to identify and gather detailed business requirements - Designs and develops SSIS packages for ETL processes to integrate data from various source systems - Performs or assists developers in the analysis and debugging of application issues quickly and effectively - Designs Tableau dashboards for optimal use on laptops, mobile devices, and mobile phones, and in-application integration such as Salesforce and SF1 mobile - Performs data source analysis of source data to be used in content development to optimize builds for end users - Provides oversight for the Production Support team in reporting applications - Collaborates with business analysts on requirements for new requests and enhancements and provides level-of-effort estimates - Proposes prototypes or wireframes based on business requirements - Creates and manages Tableau data sources on Tableau Server that are optimized for the back-end database environment and front-end usability - Analyzes, designs, constructs, tests, and implements system or application enhancements - Creates and maintains system documentation according to the IS methodology (proposals, designs, unit test documentation and other deliverables as necessary) - Maintains collaborative working relationships with IS project teams, IS services (database, data center, servers, desktop, networking, security), and software vendors, as needed - Adheres to project plans, tasks, and deliverables
Posted 4 weeks ago
8.0 - 13.0 years
25 - 35 Lacs
Kolkata, Hyderabad, Bengaluru
Work from Office
We are seeking a highly skilled ETL Architect Powered by AI (Apache NiFi/Kafka) to join our team. The ideal candidate will have expertise in managing, automating, and orchestrating data flows using Apache NiFi. In this role, you will design, implement, and maintain scalable data pipelines that handle real-time and batch data processing. The role also involves integrating NiFi with various data sources, performing data transformation tasks, and ensuring data quality and governance. Key Responsibilities: Real-Time Data Integration (Apache NiFi & Kafka): Design, develop, and implement real-time data pipelines leveraging Apache NiFi for seamless data flow. Build and maintain Kafka producers and consumers for effective streaming data management across systems. Ensure the scalability, reliability, and performance of data streaming platforms using NiFi and Kafka. Monitor, troubleshoot, and optimize data flow within Apache NiFi and Kafka clusters. Manage schema evolution and support data serialization formats such as Avro, JSON, and Protobuf. Set up, configure, and optimize Kafka topics, partitions, and brokers for high availability and fault tolerance. Implement backpressure handling, prioritization, and flow control strategies in NiFi data flows. Integrate NiFi flows with external services (e.g., REST APIs, HDFS, RDBMS) for efficient data movement. Establish and maintain secure data transmission, access controls, and encryption mechanisms in NiFi and Kafka environments. Develop and maintain batch ETL pipelines using tools like Informatica, Talend, and custom Python/SQL scripts. Continuously optimize and refactor existing ETL workflows to improve performance, scalability, and fault tolerance. Implement job scheduling, error handling, and detailed logging mechanisms for data pipelines. Conduct data quality assessments and design frameworks to ensure high-quality data integration.
Design and document both high-level and low-level data architectures for real-time and batch processing. Lead technical evaluations of emerging tools and platforms for potential adoption into existing systems. Qualifications we seek in you: Minimum Qualifications / Skills: Bachelor's degree in Computer Science, Information Technology, or a related field. Significant experience in IT with a focus on data architecture and engineering. Proven experience in technical leadership, driving data integration projects and initiatives. Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Data Engineer) are a plus. Strong analytical skills and the ability to translate business requirements into effective technical solutions. Proficiency in communicating complex technical concepts to non-technical stakeholders. Preferred Qualifications / Skills: Extensive hands-on experience as a Data Architect. In-depth experience with Apache NiFi, Apache Kafka, and related ecosystem components (e.g., Kafka Streams, Schema Registry). Ability to develop and optimize NiFi processors to handle various data sources and formats. Proficient in creating reusable NiFi templates for common data flows and transformations. Familiarity with integrating NiFi and Kafka with big data technologies like Hadoop, Spark, and Databricks. At least 2 end-to-end implementations of data integration solutions in a real-world environment. Experience in metadata management frameworks and scalable data ingestion processes. Solid understanding of data platform design patterns and best practices for integrating real-time data systems. Knowledge of ETL processes, data integration tools, and data modeling techniques. Demonstrated experience in Master Data Management (MDM) and data privacy standards. Experience with modern data platforms such as Snowflake, Databricks, and big data tools.
Proven ability to troubleshoot complex data issues and implement effective solutions. Strong project management skills with the ability to lead data initiatives from concept to delivery. Familiarity with AI/ML frameworks and their integration with data platforms is a plus. Excellent communication and interpersonal skills, with the ability to collaborate effectively across cross-functional teams. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
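The backpressure handling named in the responsibilities is, at its core, a bounded buffer: the producer blocks when the consumer falls behind instead of flooding it. Here is a stdlib stand-in for a Kafka topic using queue.Queue; the queue size, messages, and sentinel convention are illustrative, not Kafka's actual protocol:

```python
import queue
import threading

topic = queue.Queue(maxsize=2)  # bounded buffer: put() blocks when full
consumed = []

def producer():
    for i in range(5):
        topic.put(i)   # blocks here (backpressure) if the consumer lags
    topic.put(None)    # sentinel marking end of stream

def consumer():
    while True:
        msg = topic.get()
        if msg is None:
            break
        consumed.append(msg)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(consumed)  # [0, 1, 2, 3, 4]
```

Kafka achieves the same effect differently (consumer offsets plus bounded producer buffers, and NiFi with queue thresholds on connections), but the design pressure is identical: cap in-flight data so a slow consumer degrades throughput, not correctness.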
Posted 4 weeks ago
5.0 - 8.0 years
7 - 11 Lacs
Gurugram
Work from Office
Role Description: As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable. Responsibilities : - Design, develop, and maintain ETL processes using Informatica and PL/SQL. - Implement ETL processes using DBT with Jinja and automated unit tests. - Develop and maintain data models and schemas. - Ensure adherence to best development practices. - Perform database performance tuning in PostgreSQL. - Optimize SQL queries and stored procedures. - Identify and resolve performance bottlenecks. - Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure). - Ensure data consistency and accuracy across integrated systems. - Work within an agile environment, participating in all agile ceremonies. - Contribute to sprint planning, daily stand-ups, and retrospectives. - Collaborate with cross-functional teams to deliver high-quality solutions. - Troubleshoot and resolve data integration and database issues. - Provide technical support to stakeholders. - Create and maintain technical documentation for ETL processes and database designs. - Clearly articulate complex technical issues to stakeholders. Qualifications : Experience : - 5 to 8 years of experience as an Informatica PL/SQL Developer or similar role. - Hands-on experience with Data Models and DB Performance tuning in PostgreSQL. - Experience in implementing ETL processes using DBT with Jinja and automated Unit Tests. - Strong proficiency in PL/SQL and Informatica. - Experience with Kafka/MQ and cloud platforms (Azure). - Familiarity with ETL processes using DataStage is a plus. - Strong SQL skills.
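The "DBT with Jinja and automated unit tests" requirement rests on a simple idea: SQL models are templates rendered with variables, and the rendered output is assertable. As a rough stdlib stand-in (real DBT uses Jinja with ref() and source() macros; the model text and variables here are invented):

```python
from string import Template

# A templated SQL model, in the spirit of a DBT model file.
MODEL = Template(
    "SELECT id, amount FROM $schema.orders WHERE created_at >= '$start_date'"
)

def render(schema, start_date):
    """Render the model for a target environment and load window."""
    return MODEL.substitute(schema=schema, start_date=start_date)

sql = render("analytics", "2024-01-01")
print(sql)
# SELECT id, amount FROM analytics.orders WHERE created_at >= '2024-01-01'

# Automated unit tests in the spirit of DBT's test suite:
assert "analytics.orders" in sql
assert sql.endswith("'2024-01-01'")
```

Swapping the schema variable is how the same model deploys unchanged across dev, test, and production environments, which is the point of templating.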
Posted 1 month ago
5.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking an experienced SQL Developer with expertise in SQL Server Analysis Services (SSAS) and AWS to join our growing team. The successful candidate will be responsible for designing, developing, and maintaining SQL Server-based OLAP cubes and SSAS models for business intelligence purposes. You will work with multiple data sources, ensuring data integration, optimization, and performance of the reporting models. This role offers an exciting opportunity to work in a hybrid work environment, collaborate with cross-functional teams, and
Posted 1 month ago
5.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
As an SSAS Developer focused on SSAS and OLAP, you will play a critical role in our data warehousing and business intelligence initiatives. You will work closely with data engineers, business analysts, and other stakeholders to ensure the delivery of accurate and timely data insights. Your expertise in SSAS development, performance optimization, and data integration will be essential to your success. Responsibilities: - Design, develop, and maintain SQL Server Analysis Services (SSAS) models (multidimensional and tabular). - Create and manage OLAP cubes to support business intelligence reporting and analytics. - Implement best practices for data modeling and cube design. - Optimize the performance of SSAS solutions for efficient query processing and data retrieval. - Tune SSAS models and cubes to ensure optimal performance. - Identify and resolve performance bottlenecks. - Integrate data from various sources (relational databases, flat files, APIs) into SQL Server databases and SSAS models. - Develop and implement ETL (Extract, Transform, Load) processes for data integration. - Ensure data quality and consistency across integrated data sources. - Support the development of business intelligence reports and dashboards. - Collaborate with business analysts to understand reporting requirements and translate them into SSAS solutions. - Provide technical support and troubleshooting for SSAS-related issues. - Preferably have knowledge of AWS S3 and SQL Server PolyBase for data integration and cloud-based data warehousing. - Integrate data from AWS S3 into SSAS models using PolyBase or other appropriate methods. Required Skills & Qualifications: Experience: - 5-8 years of experience as a SQL Developer with a focus on SSAS and OLAP. - Proven experience in designing and developing multidimensional and tabular SSAS models. Technical Skills: - Strong expertise in SQL Server Analysis Services (SSAS) and OLAP cube development. - Proficiency in writing MDX and DAX queries.
- Experience with data modeling and database design. - Strong understanding of ETL processes and data integration techniques. - Experience with SQL Server databases and related tools. - Preferably knowledge of AWS S3 and SQL Server PolyBase.
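For candidates newer to OLAP, the kind of rollup an SSAS cube precomputes can be sketched in plain Python. All table, dimension, and measure names below are illustrative, not taken from any specific engagement:

```python
from collections import defaultdict

# Illustrative fact rows: (region, product, sales) -- a tiny star-schema fact table.
fact_sales = [
    ("North", "Widget", 100.0),
    ("North", "Gadget", 50.0),
    ("South", "Widget", 75.0),
]

def rollup_by(dim_index, rows):
    """Aggregate the sales measure along one dimension, like a cube rollup."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dim_index]] += row[2]
    return dict(totals)

by_region = rollup_by(0, fact_sales)   # {"North": 150.0, "South": 75.0}
by_product = rollup_by(1, fact_sales)  # {"Widget": 175.0, "Gadget": 50.0}
```

A cube engine materializes and indexes such aggregations ahead of time so that MDX or DAX queries over large fact tables return quickly.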
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Data Platform Engineer to build scalable infrastructure for data ingestion, processing, and analysis.
Key Responsibilities:
- Architect distributed data systems.
- Enable data discoverability and quality.
- Develop data tooling and platform APIs.
Required Skills & Qualifications:
- Experience with Spark, Kafka, and Delta Lake.
- Proficiency in Python, Scala, or Java.
- Familiarity with cloud-based data platforms.
Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies
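The ingestion-processing-storage split this role covers can be sketched, at a whiteboard level, as a three-stage pipeline. The generators below are hedged stand-ins for the real components (a Kafka consumer, a Spark job, a Delta Lake write); all field names are hypothetical:

```python
def ingest(events):
    """Stand-in for a Kafka consumer: yields raw event dicts."""
    yield from events

def transform(stream):
    """Stand-in for a Spark job: drop malformed records, normalise fields."""
    for event in stream:
        if "user_id" in event:
            yield {"user_id": event["user_id"], "value": float(event.get("value", 0))}

def load(stream):
    """Stand-in for a Delta Lake write: collect into an in-memory 'table'."""
    return list(stream)

raw = [{"user_id": 1, "value": "2.5"}, {"bad": True}, {"user_id": 2}]
table = load(transform(ingest(raw)))
# table == [{"user_id": 1, "value": 2.5}, {"user_id": 2, "value": 0.0}]
```

The real systems add partitioning, fault tolerance, and ACID storage, but the ingest/transform/load decomposition is the same.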
Posted 1 month ago
3.0 - 6.0 years
5 - 12 Lacs
Chennai
Remote
Job Description
We are seeking a skilled Talend Developer with 3 to 6 years of hands-on experience in designing, developing, and optimizing ETL pipelines. The ideal candidate will be proficient in working with Talend, AWS, APIs, and databases, and can join on immediate to 15 days' notice.
Key Responsibilities:
- Design, develop, and maintain ETL workflows to extract data from AWS S3, transform it per business rules, load it into APIs, and retrieve results.
- Analyze existing ETL workflows and identify areas for performance and design improvements.
- Build scalable, dynamic ETL pipelines from scratch with capacity for future enhancement.
- Collaborate with data engineering and data science teams to ensure data consistency and integrity.
- Conduct comprehensive unit testing of ETL pipelines and troubleshoot performance issues.
- Deploy Talend pipelines across different environments using best practices and context variables.
- Create clear and comprehensive documentation of ETL processes, pipelines, and methodologies.
Required Skills & Experience:
- Minimum 3 years of Talend development experience
- Strong expertise in Talend components for File, Database, and API (GET & POST) integration
- Experience with AWS services and incorporating them in Talend workflows
- Proven experience in pipeline migration and multi-environment deployment
- Proficiency in SQL and relational databases
- Working knowledge of Java or Python for automation and logic handling
- Familiarity with Git for version control and Nexus for artifact management
- Strong debugging and troubleshooting skills for ETL workflows
- Excellent attention to detail and an analytical mindset
- Effective communication and collaboration skills
Benefits:
- Competitive salary and performance bonuses
- Work on cutting-edge data engineering projects
- Collaborative work culture
- Learning and growth opportunities
How to Apply:
Interested candidates who meet the criteria and are ready to join within 15 days, please apply directly via Naukri or send your updated resume to hanifarsangeetha@sightspectrum.com
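The extract-transform-load-to-API flow described above (and the "Java or Python for logic handling" requirement) can be illustrated with a minimal, self-contained sketch. The business rule, endpoint URL, and field names are hypothetical, and the request is built but not sent:

```python
import json
import urllib.request

def transform(record):
    """Apply a hypothetical business rule: trim and title-case names, tag the source."""
    return {"name": record["name"].strip().title(), "source": "s3"}

def build_post(api_url, record):
    """Build (but do not send) the POST a Talend API-output step would issue."""
    body = json.dumps(transform(record)).encode("utf-8")
    return urllib.request.Request(
        api_url,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

req = build_post("https://api.example.com/v1/records", {"name": "  ada lovelace "})
# req.get_method() == "POST"; req.data carries the transformed JSON payload
```

In Talend itself, the same shape appears as a tS3Get/tFileInput component feeding a tMap transform into a REST output component, with the endpoint held in a context variable per environment.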
Posted 1 month ago
5.0 - 10.0 years
22 - 27 Lacs
Navi Mumbai
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate the seamless flow of information across systems.
Responsibilities:
- Experience in data architecture and engineering
- Proven expertise with the Snowflake data platform
- Strong understanding of ETL/ELT processes and data integration
- Experience with data modeling and data warehousing concepts
- Familiarity with performance tuning and optimization techniques
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Cloud & Data Architecture: AWS, Snowflake
- ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
- Big Data & Analytics: Athena, Presto, Hadoop
- Database & Storage: SQL, SnowSQL
- Security & Compliance: IAM, KMS, Data Masking
Preferred technical and professional experience:
- Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization)
- Data Transformation: DBT (Data Build Tool) for ELT pipeline management
- Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance)
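As an illustration of the DBT-based ELT work listed above, an incremental Snowflake model in dbt might look like the sketch below. The model, source, and column names are hypothetical, not from any real project:

```sql
-- models/fct_orders.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    loaded_at
from {{ source('raw', 'orders') }}
{% if is_incremental() %}
  -- only pull rows newer than what is already in the target table
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

On each run after the first, dbt compiles the `is_incremental()` branch in, so only new rows are merged into the warehouse table keyed by `order_id`.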
Posted 1 month ago
2.0 - 7.0 years
4 - 7 Lacs
Bengaluru
Work from Office
As a member of RSM's Application Development, Data Migration, and Data Integration (AppDev) team, you will help clients on their digital transformation journey by designing, developing, deploying, and supporting data and software solutions.
Responsibilities:
- Support the design and implementation of data migration and system integration projects
- Collaborate with ERP, CRM, and HCM teams to gather and review business and technical requirements
- Develop, test, and deploy integrations and data pipelines using modern integration platforms
- Conduct unit testing and assist with QA to ensure technical solutions meet client needs and follow best practices
- Take ownership of individual tasks and workstreams, delivering high-quality results within established timelines
- Assist with preparing documentation related to design, testing, and implementation for client-facing and internal use
- Participate in solution/code reviews and knowledge-sharing sessions with peers
- Assist in developing internal tools and accelerators to improve project delivery
- Stay up to date on trends in integration platforms, data architecture, and system optimization
Basic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Systems Engineering, or a related field
- Minimum 2 years of experience in data migration, application development, or system integration
- Experience with integration tools such as Boomi, Azure Integration Services (Azure Data Factory, Logic Apps, etc.), MuleSoft, SSIS, or Celigo
- Strong analytical, problem-solving, and data wrangling skills, including the ability to clean, transform, and prepare data for integration, migration, or reporting
- Familiarity with ERP, CRM, HCM, or CPM systems and their data structures
- Solid understanding of software development principles and good documentation habits
- Strong communication skills and ability to work collaboratively with cross-functional teams
Preferred Qualifications:
- Experience developing or consuming APIs (REST/SOAP)
- Knowledge of Microsoft Dynamics 365, NetSuite, Salesforce, or Intacct
- Exposure to data architecture, system performance tuning, or DevOps practices
- Platform certifications (Boomi, Azure, MuleSoft, Alteryx, etc.) are a plus
- Interest in growing into a lead role over time
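The "data wrangling" qualification above (cleaning and preparing records before a migration) can be sketched in a few lines of Python. The required key, normalisation rules, and sample rows are all hypothetical:

```python
def clean(records):
    """Hypothetical pre-migration cleanup: lower-case and trim string fields,
    drop rows missing the required 'email' key, and de-duplicate by email."""
    seen = set()
    out = []
    for rec in records:
        if "email" not in rec:
            continue  # required field missing -> route to an error file in practice
        normalised = {
            k: v.strip().lower() if isinstance(v, str) else v
            for k, v in rec.items()
        }
        if normalised["email"] in seen:
            continue  # duplicate of a row already accepted
        seen.add(normalised["email"])
        out.append(normalised)
    return out

rows = [{"email": " A@x.com "}, {"email": "a@x.com"}, {"name": "no email"}]
# clean(rows) == [{"email": "a@x.com"}]
```

Integration platforms such as Boomi or Azure Data Factory express the same filter/normalise/de-duplicate steps as configured pipeline stages rather than hand-written code.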
Posted 1 month ago
3.0 - 5.0 years
3 - 8 Lacs
Chennai
Work from Office
Total experience: 4 to 6 years
Work location: Chennai (Siruseri SIPCOT IT Park)
Work mode: Hybrid (3 days at office every week)
Shift timings: UK shift and UK holidays
- 4+ years with retail domain knowledge and application experience; experienced in SSAS and cube reports, able to quickly understand business processes and onboard to application support
- Work with an experienced product and architecture team to build IT solutions that support merchandising processes
- Be an expert on certain functions, processes, or tools
- Use SSAS and SSIS for reporting and be able to write complex queries in SQL
- Strong SQL skills, experience with data warehousing concepts, proficiency in the respective toolset (SSAS for data modeling and analysis, SSIS for data extraction, transformation, and loading), and understanding of data integration best practices
- Experience with data visualization tools like Power BI or Tableau (preferred)
- Performance optimization techniques for SSAS cubes
- Knowledge of ETL processes (especially if working closely with SSIS)
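The "complex queries in SQL" expectation above is the kind of grouped aggregate a merchandising report starts from. Here is a minimal, runnable sketch using SQLite in place of SQL Server; the store codes, SKUs, and threshold are made up for illustration:

```python
import sqlite3

# In-memory retail sample to illustrate a typical revenue-per-store aggregate.
con = sqlite3.connect(":memory:")
con.execute("create table sales (store text, sku text, qty int, price real)")
con.executemany("insert into sales values (?, ?, ?, ?)", [
    ("CHN-01", "TEE", 3, 9.99),
    ("CHN-01", "MUG", 1, 4.50),
    ("CHN-02", "TEE", 5, 9.99),
])

# Revenue per store, keeping only stores above an arbitrary threshold.
rows = con.execute("""
    select store, round(sum(qty * price), 2) as revenue
    from sales
    group by store
    having revenue > 20
    order by revenue desc
""").fetchall()
# rows == [("CHN-02", 49.95), ("CHN-01", 34.47)]
```

In the actual role, the same fact data would live in SQL Server, be loaded via SSIS, and feed SSAS cubes so that such aggregates are precomputed rather than recomputed per query.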
Posted 1 month ago