7.0 - 9.0 years
14 - 18 Lacs
Pune
Hybrid
The SQL + Power BI Lead is responsible for designing, developing, and maintaining complex data solutions using SQL and Power BI. They serve as a technical lead, guiding the team in implementing best practices and efficient data architectures, and play a key role in translating business requirements into effective data and reporting solutions.
Responsibilities:
Design and develop advanced SQL queries, stored procedures, and other database objects to support data extraction, transformation, and loading.
Create dynamic, interactive Power BI dashboards and reports to visualize data and provide insights.
Provide technical leadership and mentorship to junior team members on SQL and Power BI best practices.
Collaborate with business stakeholders to understand requirements and translate them into data solutions.
Optimize database performance and implement security measures to ensure data integrity.
Automate data integration, extraction, and reporting processes where possible.
Participate in data architecture planning and decision-making.
Troubleshoot and resolve complex data-related issues.
Stay up to date with the latest trends, technologies, and best practices in data analytics.
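As a rough illustration of the SQL extraction work described in this posting (not part of the original listing), the sketch below pulls a reporting dataset for a Power BI report with a parameterized query; the server, database, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: extract a reporting dataset for Power BI with a parameterized SQL query.
# Connection details, table, and column names are hypothetical placeholders.
import pyodbc
import pandas as pd

def load_sales_summary(region: str) -> pd.DataFrame:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-server;DATABASE=SalesDW;Trusted_Connection=yes;"
    )
    query = """
        SELECT OrderDate, ProductCategory, SUM(NetAmount) AS TotalSales
        FROM dbo.FactSales
        WHERE Region = ?
        GROUP BY OrderDate, ProductCategory
        ORDER BY OrderDate;
    """
    # Parameterized query keeps the SQL safe and reusable for different regions.
    return pd.read_sql(query, conn, params=[region])

if __name__ == "__main__":
    print(load_sales_summary("APAC").head())
```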
Posted 3 days ago
6.0 - 9.0 years
14 - 18 Lacs
Hyderabad
Work from Office
The ideal candidate will have a strong background in IT Services & Consulting, with expertise in Oracle Data Science.
Roles and Responsibilities:
Design and implement data science solutions using Oracle technologies.
Collaborate with cross-functional teams to identify business problems and develop data-driven solutions.
Develop and maintain large-scale data systems and architectures.
Work closely with stakeholders to understand requirements and deliver high-quality results.
Stay up to date with industry trends and emerging technologies.
Lead the development of data science projects from concept to delivery.
Job Requirements:
Strong knowledge of Oracle Data Science and related technologies.
Experience working with large datasets and developing predictive models.
Excellent communication and collaboration skills.
Ability to work in a fast-paced environment and meet deadlines.
Strong problem-solving skills and attention to detail.
Bachelor's degree in Computer Science or a related field.
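The posting asks for experience developing predictive models on large datasets. As a generic, hedged illustration only (not specific to Oracle Data Science, with an invented dataset and made-up column names), a baseline model might be trained like this:

```python
# Generic baseline predictive model; dataset path and column names are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("customer_events.csv")  # hypothetical dataset
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
model = GradientBoostingClassifier(random_state=42).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```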
Posted 3 days ago
1.0 - 6.0 years
3 - 8 Lacs
Hyderabad
Work from Office
What you will do
In this vital role, as a Sr Associate IS Analyst, you will join a collaborative team implementing and supporting the next generation of safety platforms and supporting technologies. You will analyze and resolve issues with adverse event data and file transmissions across integrated systems, leveraging data analytics to identify trends, optimize workflows, and prevent future incidents. Collaborating closely with various teams, you will develop insights and implement solutions to improve system performance, ensuring reliable and efficient data flow critical to safety operations.
Roles & Responsibilities:
Monitor, troubleshoot, and resolve issues related to adverse event data processing across multiple systems.
Conduct detailed investigations into system disruptions, data anomalies, or processing delays and implement corrective and preventive measures.
Work closely with internal teams, external vendors, and business partners to address dependencies and resolve bottlenecks for critical issues.
Design and maintain dashboards, reports, and analytics to monitor system performance and identify trends or areas for improvement.
Present findings and recommendations to leadership, ensuring data-driven decision-making and clear transparency into system operations.
Identify inefficiencies and propose data-driven solutions to optimize processes and enhance reliability.
Collaborate on the development of test plans and scenarios to ensure robust validation of system updates, patches, and new features.
Perform regression testing to verify that changes do not negatively impact existing system functionality.
Support the creation and implementation of automated testing frameworks to improve efficiency and consistency.
Support compliance with Key Control Indicators (KCI) and contribute to overall process governance.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications and Experience:
Master's degree and 1 to 3 years of experience in Computer Science, IT, or a related field; OR Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, or a related field; OR Diploma and 7 to 9 years of experience in Computer Science, IT, or a related field.
Functional Skills:
Must-Have Skills:
Demonstrated expertise in monitoring, troubleshooting, and resolving data and system issues.
Proficiency in data analytics, with experience in dashboarding and reporting tools such as Tableau or Power BI.
Familiarity with database technologies and querying tools, including SQL (Oracle SQL, PL/SQL preferred).
Understanding of API integrations and middleware platforms (e.g., MuleSoft).
Experience with testing methodologies, tools, and automation practices.
Experience with Agile methodology.
Good-to-Have Skills:
Experience with API integration platforms such as MuleSoft.
Solid grasp of one or more general-purpose programming languages, including but not limited to Java or Python.
Experience with cloud-based technologies and modern data architectures.
Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients.
Sharp learning agility, problem solving, and analytical thinking.
Experience with GxP systems and implementing GxP projects.
Extensive expertise in SDLC, including requirements, design, testing, data analysis, and change control.
Experience with Signal platforms is a plus.
Professional Certifications:
SAFe for Teams certification (preferred).
Soft Skills:
Excellent analytical and troubleshooting skills.
Excellent leadership and strategic thinking abilities.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to deal with ambiguity and think on their feet.
Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
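To give a flavor of the monitoring and troubleshooting work this role describes, here is a minimal, hypothetical sketch of a check for stalled adverse event transmissions; the table, columns, thresholds, and connection details are invented for illustration and are not from the posting.

```python
# Hypothetical check for adverse event transmissions stuck in a processing state.
# Table and column names are illustrative only.
import oracledb

STALLED_SQL = """
    SELECT case_id, target_system, status, last_updated
    FROM ae_transmissions
    WHERE status = 'IN_PROGRESS'
      AND last_updated < SYSTIMESTAMP - INTERVAL '4' HOUR
    ORDER BY last_updated
"""

def report_stalled_transmissions(dsn: str, user: str, password: str) -> None:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(STALLED_SQL)
            rows = cur.fetchall()
    if rows:
        print(f"{len(rows)} transmissions stalled for more than 4 hours:")
        for case_id, target, status, updated in rows:
            print(f"  case {case_id} -> {target} ({status}) last updated {updated}")
    else:
        print("No stalled transmissions detected.")
```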
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom, BSc
Service Line: Data & Analytics Unit
Responsibilities:
Expert proficiency in Spark, with the ability to design and implement efficient data processing workflows.
Experience with Spark SQL and DataFrames.
Good exposure to Big Data architectures and a good understanding of the Big Data ecosystem.
Some framework-building experience on Hadoop.
Good database knowledge with SQL tuning experience.
Good to have: experience with Python, APIs, and exposure to Kafka.
Additional Responsibilities:
Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data.
Awareness of the latest technologies and trends.
Logical thinking and problem-solving skills, along with an ability to collaborate.
Ability to assess current processes, identify improvement areas, and suggest technology solutions.
Knowledge of one or two industry domains.
Technical and Professional Requirements:
Primary skills: Technology -> Big Data - Data Processing -> Spark
Preferred skills: Technology -> Big Data - Data Processing -> Spark
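As an illustration of the Spark SQL and DataFrame work this posting expects, the following is a minimal sketch (not from the listing) of a batch workflow that reads raw data, cleans it, and aggregates it; the input path and column names are hypothetical.

```python
# Minimal PySpark sketch: read raw orders, clean them, and aggregate with Spark SQL.
# Input path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-order-summary").getOrCreate()

orders = (
    spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
)

orders.createOrReplaceTempView("orders")
daily_summary = spark.sql("""
    SELECT order_date, country, COUNT(*) AS order_count, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date, country
""")

daily_summary.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_order_summary/"
)
```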
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction
A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.
Your role and responsibilities
Develop, test, and support future-ready data solutions for customers across industry verticals.
Develop, test, and support end-to-end batch and near real-time data flows/pipelines.
Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies.
Communicate risks and ensure understanding of these risks.
Graduate with a minimum of 5+ years of related experience required.
Experience in modelling and business system design.
Good hands-on experience with DataStage and cloud-based ETL services.
Strong expertise in writing T-SQL code.
Well versed with data warehouse schemas and OLAP techniques.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Ability to manage and make decisions about competing priorities and resources.
Ability to delegate where appropriate.
Must be a strong team player/leader.
Ability to lead data transformation projects with multiple junior data engineers.
Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization.
Ability to communicate complex business problems and technical solutions.
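As a loose illustration of the batch pipeline and T-SQL work mentioned above (not IBM- or DataStage-specific), this sketch upserts a staging extract into a warehouse dimension table; the server, schema, and column names are hypothetical.

```python
# Hypothetical incremental load: MERGE a staging table into a warehouse dimension.
# Server, database, schema, and column names are placeholders.
import pyodbc

MERGE_SQL = """
MERGE dw.DimCustomer AS target
USING stg.Customer AS source
    ON target.CustomerKey = source.CustomerKey
WHEN MATCHED AND target.Email <> source.Email THEN
    UPDATE SET target.Email = source.Email, target.UpdatedAt = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, Email, UpdatedAt)
    VALUES (source.CustomerKey, source.Email, SYSUTCDATETIME());
"""

def run_incremental_load() -> int:
    with pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-dw;DATABASE=Warehouse;Trusted_Connection=yes;"
    ) as conn:
        cursor = conn.cursor()
        cursor.execute(MERGE_SQL)
        affected = cursor.rowcount  # rows inserted or updated by the MERGE
        conn.commit()
        return affected

if __name__ == "__main__":
    print(f"Rows affected: {run_incremental_load()}")
```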
Posted 4 weeks ago
2.0 - 4.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction
A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.
Your role and responsibilities
Develop, test, and support future-ready data solutions for customers across industry verticals.
Develop, test, and support end-to-end batch and near real-time data flows/pipelines.
Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies.
Communicate risks and ensure understanding of these risks.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Minimum of 2+ years of related experience required.
Experience in modeling and business system design.
Good hands-on experience with DataStage and cloud-based ETL services.
Strong expertise in writing T-SQL code.
Well versed with data warehouse schemas and OLAP techniques.
Preferred technical and professional experience:
Ability to manage and make decisions about competing priorities and resources.
Ability to delegate where appropriate.
Must be a strong team player/leader.
Ability to lead data transformation projects with multiple junior data engineers.
Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization.
Ability to clearly communicate complex business problems and technical solutions.
Posted 1 month ago
3.0 - 8.0 years
10 - 18 Lacs
Faridabad
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
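This posting (repeated below for several other locations) centers on ETL pipelines with SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery. As a small, hedged illustration of that kind of work, the sketch below loads a transformed extract into BigQuery; the project, dataset, table, and file names are invented.

```python
# Hypothetical batch ETL step: transform a CSV extract and load it into BigQuery.
# Project, dataset, table, and file names are placeholders.
import pandas as pd
from google.cloud import bigquery

def load_daily_orders(csv_path: str) -> None:
    df = pd.read_csv(csv_path, parse_dates=["order_date"])
    df["revenue"] = df["quantity"] * df["unit_price"]  # simple derived column

    client = bigquery.Client(project="example-project")
    job = client.load_table_from_dataframe(
        df,
        "example-project.analytics.daily_orders",
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # wait for the load job to finish
    print(f"Loaded {job.output_rows} rows.")

if __name__ == "__main__":
    load_daily_orders("orders_2024_01_01.csv")
```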
Posted 1 month ago
3.0 - 8.0 years
10 - 18 Lacs
Vadodara
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 1 month ago
10.0 - 14.0 years
10 - 16 Lacs
Pune
Work from Office
Role Overview:
The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.
Responsibilities:
Lead the design and implementation of GCP-based data architectures and pipelines.
Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
Provide technical leadership and mentorship to a team of data engineers.
Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
Ensure best practices in data security, governance, and compliance.
Troubleshoot and resolve complex technical issues in GCP data environments.
Stay updated on the latest GCP technologies and industry trends.
Key Technical Skills & Responsibilities:
Overall 10+ years of experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging.
Experience as an architect on GCP implementation or migration data projects.
Understanding of data lakes and data lake architectures, and best practices for storing, loading, and retrieving data from data lakes.
Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools.
Working knowledge of Hadoop and Python/Java is an added advantage.
Experience in designing and planning BI solutions, debugging, monitoring and troubleshooting BI solutions, creating and deploying reports, and writing relational and multidimensional database queries.
Any experience in a NoSQL environment is a plus.
Must be good with Python and PySpark for data pipeline building.
Must have experience working with streaming data sources and Kafka.
GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.
Eligibility Criteria:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Extensive experience with GCP data services and tools.
GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect).
Experience with machine learning and AI integration in GCP environments.
Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
Proven leadership experience in managing technical teams.
Excellent problem-solving and communication skills.
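To illustrate the PySpark-with-Kafka streaming skill this posting asks for, here is a minimal, hypothetical Structured Streaming sketch; the broker address, topic, and storage paths are invented, and a real deployment (for example on Dataproc) would also need the Spark Kafka connector package, schema evolution handling, and error management.

```python
# Minimal PySpark Structured Streaming sketch: read JSON events from Kafka,
# parse them, and append the result to cloud storage. All names are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-events-stream").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "payment-events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://example-bucket/streams/payment_events/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/payment_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```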
Posted 1 month ago
3.0 - 8.0 years
10 - 18 Lacs
Ludhiana
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 1 month ago
3.0 - 8.0 years
10 - 18 Lacs
Coimbatore
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 1 month ago
3.0 - 8.0 years
10 - 18 Lacs
Jaipur
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 1 month ago
4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
The Sustainability Data and Technology Program is a bank-wide program to deliver a strategic solution for Environmental, Social and Governance (ESG) data across Deutsche Bank. The Program is part of the Sustainability Strategy Key Deliverable. As a Business Analyst, you will be part of the Data Team. You will be responsible for reviewing business use cases from stakeholders, gathering and documenting requirements, defining high-level implementation steps, and creating business user stories. You will work closely with the Product Owner and development teams and bring business and functional analysis skills into the development team to ensure that the implementation of requirements aligns with our business needs and technical quality standards.
Your key responsibilities
Work with business and technology stakeholders to define, agree, and socialise requirements for ESG data sourcing and transformation, needed for the consumer base within the bank.
Work with architects and engineers to ensure that both functional and non-functional requirements can be realised in the design and delivery in a way which respects the architecture strategy.
Analyse complex datasets to derive insights that support requirement definition by completing data profiling of vendor data.
Define and document business requirements for review by senior stakeholders, in JIRA and other documentation tools such as Confluence and Draw.IO.
Define acceptance criteria with stakeholders and support user acceptance testing to ensure quality product delivery, including defect management.
Review user stories along with test cases based on appropriate interpretation of business requirements.
Liaise with business teams and development teams in Agile ceremonies such as Product Backlog refinements to review user stories and prioritise the Product Backlog, supporting requirements on their path to release in the production environment.
Act as a point of contact for the development teams for any business requirement clarifications.
Support the functional analysts within the development teams in producing analysis artifacts.
Design and specify data mappings to transform source system data into a format which can be consumed by other business areas within the bank.
Support the design and conceptualization of new business solution options and articulate identified impacts and risks.
Monitor and track issues, risks, and dependencies on analysis and requirements work.
Your skills and experience
Mandatory skills:
4+ years of business analyst experience in the banking industry across the full project life cycle, with broad domain knowledge and understanding of core business processes, systems, and data flows.
Experience of specifying ETL processes within data projects.
Experience of a large system implementation project across multiple business units and multiple geographies; awareness of the sort of issues that may arise with a central implementation across different locations is essential.
Strong knowledge of business analysis methods (e.g. best practices in requirements management and UAT).
Maturity and persuasiveness required to engage in business dialogue and support stakeholders.
Excellent analysis skills and good problem-solving skills.
Ability to communicate and interpret stakeholders' needs and requirements.
An understanding of systems delivery lifecycles and Agile delivery methodologies.
A good appreciation of systems and data architectures.
Strong discipline in data reconciliation, data integrity, controls, and documentation.
Understanding of controls around software development to manage business requirements.
Ability to work in virtual teams and matrixed organizations.
Good team player, facilitator, negotiator, and networker; able to lead senior managers towards common goals and build consensus across a diverse group.
Ability to share information and transfer knowledge and expertise to team members.
Ability to commit to and prioritise work duties and tasks.
Ability to work in a fast-paced environment with competing and ever-changing priorities, while maintaining a constant focus on delivery.
Willingness to cover multiple roles when required, such as covering for Project Managers, assisting architecture, performing testing, and writing up meeting minutes.
Expertise in Microsoft Office applications (Word, Excel, Visio, PowerPoint).
Proficient ability to query large datasets (e.g. SQL, Hue, Impala, Python) with a view to testing/analysing content and data profiling.
Desirable skills:
In-depth understanding of the aspects of ESG reporting.
Knowledge of ESG data vendors.
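Since the role calls for data profiling of vendor data using Python or SQL, the following is a small, generic profiling sketch added for illustration; the file name and column names are hypothetical and not taken from the posting.

```python
# Generic data-profiling sketch for a vendor ESG data extract.
# File name and columns are hypothetical placeholders.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return per-column type, completeness, and cardinality statistics."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "non_null": df.notna().sum(),
        "null_pct": (df.isna().mean() * 100).round(2),
        "distinct": df.nunique(),
    })

if __name__ == "__main__":
    vendor_df = pd.read_csv("esg_vendor_extract.csv")
    print(profile(vendor_df))
    # Example rule check: entity identifiers should be unique and non-null.
    dupes = vendor_df["entity_id"].duplicated().sum()
    print(f"Duplicate entity_id rows: {dupes}")
```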
Posted 1 month ago