5.0 - 10.0 years
6 - 14 Lacs
Hyderabad, Chennai, Coimbatore
Hybrid
4 to 6 years of Python-based automation testing experience
- Strong experience in data validation using Python and API automation with the requests library (a minimal sketch follows below)
- Good knowledge of the object-oriented programming approach
- Basic knowledge of a CI/CD tool such as Jenkins or GitHub Actions
- Good knowledge of SQL and the ability to frame advanced SQL queries
- Exposure to the pandas data science library, including data analysis and validation
- ETL knowledge is an added advantage
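To make the requirement concrete, here is a minimal sketch of the kind of test this listing describes: hitting an API with the requests library and validating the payload with pandas. The endpoint URL and column names are hypothetical placeholders, not details from the employer.

```python
# A minimal pytest sketch combining requests (API automation) with
# pandas (data validation). BASE_URL and the expected columns are
# invented for illustration.
import pandas as pd
import requests

BASE_URL = "https://api.example.com"  # hypothetical endpoint


def test_users_endpoint_schema_and_quality():
    resp = requests.get(f"{BASE_URL}/users", timeout=10)
    assert resp.status_code == 200

    df = pd.DataFrame(resp.json())

    # Schema check: the payload must expose the expected columns.
    expected = {"id", "name", "email"}
    assert expected.issubset(df.columns)

    # Data quality checks: no duplicate keys, no null emails.
    assert df["id"].is_unique
    assert df["email"].notna().all()
```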
Posted 6 days ago
8.0 - 13.0 years
20 - 35 Lacs
Mumbai
Remote
Interview Process: 1 Virtual Technical Round, 1 Face-to-Face Round (Mumbai), 1 HR Round

Position: Associate
Work Mode: 100% Remote
Work Hours: Preferred overlap with New York, 6 PM to 3 AM IST
Holidays: Follows US holidays
Experience: Min 8 years
Start Date: ASAP

Key Technical Requirements:
Python: Experience in web applications; Python + SQL with an ETL background; familiarity with pandas and NumPy; ability to judge when to use Python versus SQL (the sketch below illustrates a typical split).
SQL: Performance optimization and best practices; data storage and retrieval strategies; writing stored procedures and functions; understanding of views, indexes, stored procedures, and functions; ability to optimize logic and analyze implemented solutions.
ETL & Data Warehousing: Strong understanding of data warehousing concepts; ability to handle large datasets efficiently.

Other details: Individual contributor role with the ability to work independently; ability to gather requirements and interact with stakeholders.

Team & Reporting Structure: The hired candidate will report to a US-based manager. Sole developer for the role; part of a global team of 12 (including 3 in India).

Interview Process: Round 1: Technical assessment. Round 2: Advanced technical evaluation.

Key Responsibilities:
Work with development teams and product managers to design and implement software solutions.
Develop, test, and maintain backend code using Python.
Develop and manage well-functioning databases and applications.
Create and manage ETL processes to ensure efficient data integration and transformation.
Develop and maintain APIs to facilitate communication between different systems.
Collaborate with cross-functional teams to design and implement scalable backend solutions.
Optimize and troubleshoot backend code to ensure high performance and reliability.
Participate in code reviews and provide constructive feedback to peers.
Stay updated with the latest industry trends and technologies to ensure best practices.

Qualifications:
Proven experience as a Backend Developer with expertise in Python.
Strong proficiency in database development with SQL (NoSQL a plus).
Experience with ETL processes and tools.
Expertise in developing and maintaining APIs.
Excellent problem-solving skills and attention to detail.
Ability to work independently and as part of a team.
Strong communication skills and the ability to collaborate effectively with stakeholders.

Preferred Qualifications:
Experience with cloud platforms, preferably Azure.
Databricks experience a plus.
Knowledge of data warehousing and big data technologies.
Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
Experience working with investment teams, particularly structured credit.
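As an illustration of the Python-versus-SQL judgment the listing asks for, here is a hedged sketch of a small ETL step: filtering and aggregation pushed down to SQL, derivations done in pandas, results loaded back. The connection string, table, and column names are assumptions, not details from the employer.

```python
# A hedged sketch of a Python + SQL ETL step: extract with SQL,
# transform with pandas, load back. DSN and schema are invented.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@host/db")  # hypothetical DSN

# Extract: push heavy filtering down to SQL, where the database excels.
orders = pd.read_sql(
    """
    SELECT customer_id, order_date, amount
    FROM orders
    WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
    """,
    engine,
)

# Transform: row-wise derivations and reshaping are often simpler in pandas.
orders["month"] = pd.to_datetime(orders["order_date"]).dt.to_period("M").astype(str)
monthly = orders.groupby(["customer_id", "month"], as_index=False)["amount"].sum()

# Load: write the aggregate to a reporting table.
monthly.to_sql("customer_monthly_totals", engine, if_exists="replace", index=False)
```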
Posted 1 week ago
9.0 - 14.0 years
25 - 40 Lacs
Pune
Work from Office
Cilicant Private Limited is a fast-growing, innovation-led pharma packaging company working towards a vision of becoming a fully digitized and lean organization. With SAP B1 at the core of our business systems, integrated with Salesforce and a recently implemented Warehouse Management System (WMS), we are now looking to scale up our IT and automation capabilities across all business functions.

We are looking for a Techno-Functional SAP B1 & IT Lead to drive our digital transformation initiatives. The ideal candidate will lead both Business Applications (SAP B1, Salesforce, WMS, custom solutions) and IT Infrastructure & Security, while working closely with cross-functional teams to digitize and automate core business processes. This is a hands-on leadership role with team responsibility.

Key Responsibilities
1. SAP B1 & Business Application Management
Lead implementation, customization, and optimization of SAP B1 (SQL) modules across departments.
Design and manage integrations between SAP B1, Salesforce, WMS, barcoding systems, and in-house applications.
Develop and maintain add-ons using the SAP B1 SDK, SQL queries, and Crystal Reports.
Collaborate with business users to identify process gaps and recommend automation and digital solutions.
2. Application Development & Automation
Develop in-house web and desktop applications using .NET technologies to support unique business needs.
Identify automation opportunities across Production, Finance, Stores, Quality, Purchase, Sales, and Dispatch.
Lead automation initiatives with external vendors or internal developers, ensuring security and scalability.
3. IT Infrastructure & Cybersecurity
Oversee the company's IT architecture, including server setup, cloud/data storage, networks, and endpoints.
Define and implement IT policies for access, security, disaster recovery, procurement, and asset management.
Ensure cybersecurity by implementing tools such as firewalls, endpoint security, and user access control.
4. User Support & Training
Lead a support structure for all business users of SAP B1, Salesforce, WMS, and in-house systems.
Create SOPs and conduct user training sessions.
Troubleshoot system issues and act as the point of escalation for all IT-related concerns.
5. Documentation & Compliance
Maintain updated documentation of configurations, integrations, custom modules, and IT policies.
Ensure compliance with industry regulations and internal data protection policies.
Stay informed of SAP B1 updates, best practices, and industry trends.

Candidate Profile
Technical Skills:
Proficiency in the SAP B1 SDK, advanced SQL, and .NET (C# / VB.NET) for application development.
Experience in integrating SAP B1 with third-party systems like Salesforce and WMS.
Experience with Crystal Reports, stored procedures, and database optimization.
Good understanding of IT infrastructure, networking, and cybersecurity protocols.
Functional Knowledge:
Strong understanding of manufacturing business processes: Sales, Purchase, Inventory, Production, Quality, Finance, Dispatch.
Experience in process mapping and automation for lean operations.
Ability to translate business needs into technical solutions.
Behavioral & Leadership Competencies:
Self-driven with a problem-solving mindset.
Ability to manage cross-functional teams and external vendors.
Strong project management and execution capability.
Excellent communication skills to collaborate with business and tech stakeholders.
Posted 1 week ago
1.0 - 2.0 years
3 - 4 Lacs
Bengaluru
Work from Office
Headquartered in Noida, India, Paytm Insurance Broking Private Limited (PIBPL), a wholly owned subsidiary of One97 Communications (OCL), is an online insurance marketplace that offers insurance products across all leading insurance companies, with products across auto, life, and health insurance, and provides policy management and claim services for our customers.

Expectations / Responsibilities:
1. Using automated tools to extract data from primary and secondary sources
2. Removing corrupted data and fixing coding errors and related problems
3. Developing and maintaining databases and data systems - reorganizing data in a readable format
4. Preparing reports for the management stating trends, patterns, and predictions using relevant data
5. Preparing final analysis reports for the stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends
6. Supporting the data warehouse in identifying and revising reporting requirements
7. Setting up robust automated dashboards to drive performance management
8. Deriving business insights from data with a focus on driving business-level metrics
9. 1-2 years of experience in business analysis or a related field

Superpowers / Skills that will help you succeed in this role:
1. Problem solving - assess what data is required to prove hypotheses and derive actionable insights
2. Analytical skills - top-notch Excel skills are necessary
3. Strong communication and project management skills
4. Hands-on with SQL, Hive, and Excel, and comfortable handling very large-scale data
5. Ability to interact with and convince business stakeholders
6. Experience working with web analytics platforms is an added advantage
7. Experimentative mindset with attention to detail
8. Proficiency in advanced SQL, MS Excel, and Python or R is a must
9. Exceptional analytical and conceptual thinking skills
10. The ability to influence stakeholders and work closely with them to determine acceptable solutions
11. Advanced technical skills
12. Excellent documentation skills
13. Fundamental analytical and conceptual thinking skills
14. Experience creating detailed reports and giving presentations
15. Competency in Microsoft applications including Word, Excel, and Outlook
16. A track record of following through on commitments
17. Excellent planning, organizational, and time management skills
18. Experience leading and developing top-performing teams
19. A history of leading and supporting successful projects

Preferred Industry - Fintech / E-commerce / Data Analytics
Education - Any graduate; a graduate from a premium institute is preferred.

Why join us:
1. We give immense opportunities to make a difference, and have a great time doing that.
2. You are challenged and encouraged here to do meaningful work for yourself and customers/clients.
3. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.
Posted 1 week ago
4.0 - 7.0 years
5 - 15 Lacs
Mumbai
Hybrid
Role: Sr Python FastAPI Developer
Location: Mumbai
Experience: 4 to 7 years
Technologies / Skills: Python (FastAPI), Advanced SQL, Postgres, DynamoDB, Docker

Responsibilities:
- Build high-performance REST APIs and WebSockets to power web applications (a minimal endpoint sketch follows below).
- Design, develop, and maintain scalable and efficient backend services using FastAPI for web applications.
- Coordinate with development teams to determine application requirements and integration points.
- Understand the fundamental design principles behind a scalable application and write scalable code.
- Implement security best practices to safeguard sensitive data and ensure compliance with privacy regulations.
- Own and manage all phases of the software development lifecycle: planning, design, implementation, deployment, and support.
- Build reusable, high-quality code and libraries for future use that are high-performance and can be used across multiple projects.
- Conduct code reviews and provide constructive feedback to team members.
- Stay up to date with emerging technologies and trends in Python development and the FastAPI framework.
- Ensure the reliability and correctness of FastAPI applications using pytest.
- Define and document business requirements for complex system development or testing.
- Comfortable working with agile / scrum / kanban.
- Willingness to join a distributed team operating across different time zones.

Required Qualifications for Sr Python FastAPI Developer:
- Bachelor's degree in IT, computer science, computer engineering, or similar
- Min. 3+ years of experience in Python (FastAPI) development
- Strong understanding of asynchronous programming and background tasks
- Knowledge of Pydantic, CRON job schedulers, and Swagger UI for endpoints
- Proficiency in database management systems (e.g., DynamoDB, PostgreSQL)
- Familiarity with containerization technologies such as Docker
- Excellent verbal and written communication skills
- Experience with version control systems (e.g., Git, GitHub Actions) is a plus
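A minimal sketch of the stack this role names: an async FastAPI endpoint with a Pydantic model and a background task. All names are illustrative, not from an actual codebase. One nicety worth noting: FastAPI serves the Swagger UI mentioned above at /docs automatically, with no extra configuration.

```python
# A minimal FastAPI sketch: async REST endpoint, Pydantic validation,
# and a background task. Run with: uvicorn main:app --reload
from fastapi import BackgroundTasks, FastAPI
from pydantic import BaseModel

app = FastAPI()


class Order(BaseModel):
    id: int
    amount: float


def audit_log(order_id: int) -> None:
    # Stand-in for a real audit write (DB insert, queue message, ...).
    print(f"order {order_id} received")


@app.post("/orders", status_code=201)
async def create_order(order: Order, tasks: BackgroundTasks) -> dict:
    # Request-body validation happens automatically via the Pydantic model;
    # the audit write runs after the response is sent.
    tasks.add_task(audit_log, order.id)
    return {"status": "accepted", "id": order.id}
```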
Posted 1 week ago
8.0 - 13.0 years
20 - 30 Lacs
Hyderabad
Work from Office
Job Duties & Responsibilities:
Analyze transportation and logistics data to identify trends, anomalies, and improvement opportunities.
Build and maintain dashboards and reports using tools like Power BI or Tableau.
Collaborate with business stakeholders to understand reporting needs and deliver actionable insights.
Support data validation and quality checks during TMS implementation.
Facilitate technical design of complex data sourcing, transformation, and aggregation logic, ensuring business analytics requirements are met.
Use multiple data systems, tools, and platforms to analyze key business trends, formulate hypotheses, and present meaningful business insights.
Communicate effectively with multiple stakeholders while owning various business initiatives and delivering quality output.
Work closely with data engineers to ensure data pipelines meet analytical needs.

Required Skills:
Bachelor's degree in a related field or equivalent experience.
8+ years of end-to-end experience working with data, including data discovery, data integration, data analysis, data manipulation, and creation of reports and visualizations.
Deeply knowledgeable in the Business Intelligence life cycle, from data ingestion to data visualization.
Strong SQL skills and experience working with Snowflake or similar cloud data platforms.
Proficiency in data visualization tools (Power BI, Tableau, etc.).
Experience in supply chain or transportation analytics preferred.
Strong presentation and collaboration skills, with the ability to work in cross-functional, multi-cultural, global teams and communicate all aspects of the job requirements, including the creation of formal documentation.
Independent: strong critical thinking, decision making, troubleshooting, and problem-solving skills.
Go-getter: possesses strong planning, execution, and multitasking skills and a demonstrated ability to reprioritize accordingly. Must be able to manage quickly changing priorities while meeting deadlines.
Posted 1 week ago
6.0 - 11.0 years
25 - 40 Lacs
Gurugram, Bengaluru
Hybrid
Salary: 25 to 40 LPA
Exp: 7 to 11 years
Location: Bangalore / Gurgaon
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, ETL, etc.

Roles and Responsibilities
Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools (the sketch below shows the kind of advanced SQL involved).
Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
6-10 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
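For context, a hedged example of the "advanced SQL" level implied here: a CTE feeding a window function that ranks customers by monthly spend. The schema and data are invented; the same pattern carries over to Hive or any warehouse dialect. It is shown against in-memory SQLite only so the snippet runs as-is.

```python
# A hedged advanced-SQL sketch: CTE + window function, run on SQLite
# purely for demonstration. Table and column names are invented.
import sqlite3

SQL = """
WITH monthly AS (
    SELECT customer_id,
           strftime('%Y-%m', txn_date) AS month,
           SUM(amount)                 AS spend
    FROM transactions
    GROUP BY customer_id, month
)
SELECT customer_id, month, spend,
       RANK() OVER (PARTITION BY month ORDER BY spend DESC) AS spend_rank
FROM monthly
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (customer_id, txn_date, amount)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(1, "2024-01-05", 100.0), (2, "2024-01-20", 250.0), (1, "2024-02-02", 75.0)],
)
for row in conn.execute(SQL):
    print(row)
```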
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Exp: 3 to 8 years
Location: Pune / Bangalore / Gurgaon (Hybrid)
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, etc.

Roles and Responsibilities
Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
3-8 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
Posted 1 week ago
3.0 - 8.0 years
15 - 30 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 20 to 35 LPA
Exp: 3 to 8 years
Location: Gurgaon (Hybrid)
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, etc.

Roles and Responsibilities
Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
3-8 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Hiring - Sr. SQL DBA
Location: Gurugram (Hybrid), Fulltime
Education: Bachelor's or higher degree.

10+ years of experience working in an IT services organization.
At least 8+ years of SQL DBA experience covering implementation, production support, and project work.
At least 5+ years of exposure in an L3/Lead/SME support role.
Knowledge and understanding of the ITIL process.
In client-facing role(s) globally for at least 5 years; excellent communication skills.
On-call support for at least 5 years.
Excellent documentation skills in MS SharePoint 2010/2013.

Mandatory skills - SQL Server and cloud administration:
Senior-level Microsoft SQL Server DBA with experience in large and critical environments.
Excellent knowledge of performance tuning at both the server level and query level; must have knowledge of perfmon and SQL Server dynamic management views (a small automation sketch follows below).
Knowledge of SQL Server internals.
Knowledge of SQL partitioning and compression.
Hands-on experience with mirroring, replication, and Always On.
Must have automation experience and good knowledge of scripting: ability to write T-SQL scripts, PowerShell, or Python.
Excellent knowledge of SSIS packages.
Hands-on experience with SQL Server clustering.
Hands-on experience with Integration Services, Reporting Services, and Analysis Services.
Must have good knowledge of SQL Server permission/security policies.
At least 6 months of experience with AWS Cloud RDS.

Good-to-have skills:
MySQL knowledge / experience.
Linux knowledge.
Experience with Snowflake.

Certifications desired:
MS certifications on the latest MSSQL versions.
AWS certifications.
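As a flavor of the scripting-plus-DMV work the listing mentions, here is a hedged Python sketch that pulls the top CPU-consuming statements from SQL Server's dynamic management views via pyodbc. The connection string is a placeholder.

```python
# A hedged DBA-automation sketch: query SQL Server DMVs from Python
# to surface the heaviest statements by average CPU. Server details
# are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=master;Trusted_Connection=yes;"  # hypothetical server
)

TOP_CPU_SQL = """
SELECT TOP 10
       qs.total_worker_time / qs.execution_count AS avg_cpu_us,
       qs.execution_count,
       SUBSTRING(st.text, 1, 200)                AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_us DESC;
"""

# total_worker_time is reported in microseconds.
for avg_cpu_us, runs, text in conn.execute(TOP_CPU_SQL):
    print(f"{avg_cpu_us:>12} us avg over {runs} runs: {text.strip()}")
```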
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics & Insights (GAI) team is looking for a Senior Data Engineer to lead the build of the data infrastructure for Avalara's core data assets, empowering us with accurate data to make data-backed decisions. As a Senior Data Engineer, you will help architect, implement, and maintain our data infrastructure using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will immerse yourself in our financial, marketing, and sales data to become an expert in Avalara's domain. You will have deep SQL experience, an understanding of modern data stacks and technology, a desire to build things the right way using modern software principles, and experience with data and all things data related.

What Your Responsibilities Will Be
Architect repeatable, reusable solutions to keep our technology stack DRY.
Conduct technical and architecture reviews with engineers, ensuring all contributions meet quality expectations.
Develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools (a minimal orchestration sketch follows below).
Implement and maintain scalable data orchestration and transformation, ensuring data accuracy and consistency.
Collaborate with cross-functional teams to understand complex requirements and translate them into technical solutions.
Build scalable, complex dbt models.
Demonstrate ownership of complex projects and calculations of core financial metrics and processes.
Work with Data Engineering teams to define and maintain scalable data pipelines.
Promote automation and optimization of reporting processes to improve efficiency.
You will report to a Senior Manager.

What You'll Need to Be Successful
Bachelor's degree in Computer Science, Engineering, or a related field.
6+ years of experience in the data engineering field, with advanced SQL knowledge.
4+ years of working with Git, with demonstrated experience collaborating with other engineers across repositories.
4+ years of working with Snowflake.
3+ years of working with dbt (dbt Core).
3+ years of working with Infrastructure as Code (Terraform).
3+ years of working with CI/CD, with a demonstrated ability to build and operate pipelines.
AWS certified; Terraform certified.
Experience working with complex Salesforce data.
Snowflake and dbt certified.
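A minimal sketch of the orchestration pattern named here, Airflow driving dbt, with the DAG id, schedule, and project path as assumptions rather than Avalara specifics.

```python
# A hedged Airflow DAG sketch: ingest, then run and test dbt models.
# Requires Airflow 2.4+ (for the `schedule` argument); all paths and
# names are invented for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="finance_marts_refresh",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest.py",  # placeholder script
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/finance --target prod",
    )
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/finance --target prod",
    )

    # Run order: ingest raw data, build models, then test them.
    ingest >> transform >> test
```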
Posted 1 week ago
1.0 - 5.0 years
3 - 5 Lacs
Gurugram
Work from Office
Role & responsibilities
As part of the Home Credit analytics team, the successful candidate will be responsible for developing, analyzing, and executing ideas and initiatives designed to deliver business reports. They would need to learn the HCIN database, absorb current reporting, and be able to create new reports as per business requirements. Should have a strong base in SQL and Power BI reporting.

Mandatory Skills
SQL coding, Power BI, Excel report preparation, dashboards, reporting, advanced Excel, PowerPoint, data analysis

Preferred candidate profile
SQL coding, MS Office, Power BI, PowerPoint, dashboards, advanced Excel, and Outlook
Good communication, analytics, and decision making
Notice Period - Immediate to 30 days max
Posted 1 week ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment.
Develop and maintain a strong working relationship with business and technical members of the team.
Relentless focus on quality and continuous improvement.
Perform root cause analysis of report issues.
Development / evolutionary maintenance of the environment, performance, capability, and availability.
Assist in defining technical requirements and developing solutions.
Effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
5+ years of experience with BI tools, with expertise and/or certification in at least one major BI platform; Tableau preferred.
Advanced knowledge of SQL, including the ability to write complex stored procedures, views, and functions.
Proven capability in data storytelling and visualization, delivering actionable insights through compelling presentations.
Excellent communication skills, with the ability to convey complex analytical findings to non-technical stakeholders in a clear, concise, and meaningful way.
Identifying and analyzing industry trends, geographic variations, competitor strategies, and emerging customer behavior.

Preferred technical and professional experience
Troubleshooting capabilities to debug data controls.
Capable of converting business requirements into a workable model.
Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
Posted 1 week ago
5.0 - 8.0 years
18 - 22 Lacs
Bengaluru
Work from Office
Job Title - Sales Excellence - COE - Data Engineering Specialist
Management Level: 9 - Team Lead/Consultant
Location: Mumbai, MDC2C
Must-have skills: Sales
Good-to-have skills: Data Science, SQL, Automation, Machine Learning
Job Summary: Apply deep statistical tools and techniques to find relationships between variables.

Roles & Responsibilities:
- Apply deep statistical tools and techniques to find relationships between variables.
- Develop intellectual property for analytical methodologies and optimization techniques.
- Identify data requirements and develop analytic solutions to solve business issues.

Job Title - Analytics & Modelling Specialist
Management Level: 9 - Specialist
Location: Bangalore / Gurgaon / Hyderabad / Mumbai
Must-have skills: Python, Data Analysis, Data Visualization, SQL
Good-to-have skills: Machine Learning
Job Summary: The Center of Excellence (COE) makes sure that the sales and pricing methods and offerings of Sales Excellence are effective. The COE supports salespeople through its business partners and Analytics and Sales Operations teams. The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization.

Roles & Responsibilities:
Build and manage data models that bring together data from different sources.
Help consolidate and cleanse data for use by the modeling and development teams.
Structure data for use in analytics applications.
Lead a team of Data Engineers effectively.

Professional & Technical Skills:
A bachelor's degree or equivalent.
Total experience range: 5-8 years in the relevant field.
A minimum of 3 years of GCP experience with exposure to machine learning / data science.
Experience in configuring machine learning workflows in GCP.
A minimum of 5 years of advanced SQL knowledge and experience working with relational databases.
A minimum of 3 years of familiarity and hands-on experience with different SQL objects such as stored procedures, functions, views, etc.
A minimum of 3 years of building data flow components and processing systems to extract, transform, load, and integrate data from various sources.
A minimum of 3 years of hands-on experience with advanced Excel topics such as cube functions, VBA automation, Power Pivot, etc.
A minimum of 3 years of hands-on experience in Python.

Additional Information:
Understanding of sales processes and systems.
Master's degree in a technical field.
Experience with quality assurance processes.
Experience in project management.

You May Also Need:
Ability to work flexible hours according to business needs.
Must have good internet connectivity and a distraction-free environment for working at home, in accordance with local guidelines.

Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Additional Information:
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.

About Our Company | Accenture
Qualification: Experience: 8 to 10 years. Educational Qualification: B.Com
Posted 1 week ago
4.0 - 9.0 years
15 - 30 Lacs
Gurugram, Chennai
Work from Office
Role & responsibilities
Assume ownership of Data Engineering projects from inception to completion.
Implement fully operational Unified Data Platform solutions in production environments using technologies like Databricks, Snowflake, Azure Synapse, etc.
Showcase proficiency in Data Modelling and Data Architecture.
Utilize modern data transformation tools such as DBT (Data Build Tool) to streamline and automate data pipelines (nice to have).
Implement DevOps practices for continuous integration and deployment (CI/CD) to ensure robust and scalable data solutions (nice to have).
Maintain code versioning and collaborate effectively within a version-controlled environment.
Familiarity with data ingestion and orchestration tools such as Azure Data Factory, Azure Synapse, AWS Glue, etc.
Set up processes for data management and templatized analytical modules/deliverables.
Continuously improve processes with a focus on automation, and partner with different teams to develop system capability.
Proactively seek opportunities to help and mentor team members by sharing knowledge and expanding skills.
Communicate effectively with internal and external stakeholders.
Coordinate with cross-functional team members to ensure high quality in deliverables with no impact on timelines.

Preferred candidate profile
Expertise in computer programming languages such as Python and advanced SQL (a short PySpark sketch follows below).
Working knowledge of Data Warehousing, Data Marts, and Business Intelligence, with hands-on experience implementing fully operational data warehouse solutions in production environments.
3+ years of working knowledge of big data tools (Hive, Spark) along with ETL tools and cloud platforms.
3+ years of relevant experience in either Snowflake or Databricks; certification in Snowflake or Databricks is highly recommended.
Proficient in Data Modelling and ELT techniques.
Experienced with any of the ETL / data pipeline orchestration tools such as Azure Data Factory, AWS Glue, Azure Synapse, Airflow, etc.
Experience ingesting data from different data sources such as RDBMS, ERP systems, APIs, etc.
Knowledge of modern data transformation tools, particularly DBT (Data Build Tool), for streamlined and automated data pipelines (nice to have).
Experience in implementing DevOps practices for CI/CD to ensure robust and scalable data solutions (nice to have).
Proficient in maintaining code versioning and effective collaboration within a version-controlled environment.
Ability to work effectively as an individual contributor and in small teams; should have experience mentoring junior team members.
Excellent problem-solving and troubleshooting ability, with experience supporting and working with cross-functional teams in a dynamic environment.
Strong verbal and written communication skills, with the ability to communicate effectively and articulate results and issues to internal and client teams.
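To ground the Spark expectation, here is a hedged sketch of a batch curation job: read raw files, aggregate, and write a partitioned table. Paths and column names are invented for illustration.

```python
# A hedged PySpark batch-transformation sketch. All paths, columns,
# and the app name are invented placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

raw = spark.read.parquet("s3://raw-zone/orders/")  # hypothetical path

curated = (
    raw.filter(F.col("status") == "COMPLETE")
       .withColumn("order_month", F.date_format("order_ts", "yyyy-MM"))
       .groupBy("customer_id", "order_month")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)

# Plain parquet shown for simplicity; a Delta/Snowflake target would
# typically use MERGE for incremental upserts instead of overwrite.
curated.write.mode("overwrite").partitionBy("order_month").parquet(
    "s3://curated-zone/customer_monthly_orders/"
)
```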
Posted 1 week ago
5.0 - 8.0 years
18 - 25 Lacs
Noida, Hyderabad, Chennai
Hybrid
Hiring for SQL SSIS Developer for Hyderabad / Chennai / Noida / Pune locations.
Looking for immediate joiners: 0-1 week.
Primary Skills: Advanced SQL, SSIS
PF deduction is mandatory for all companies.

Role & responsibilities
5 years of practical experience with SSIS, advanced SQL, and T-SQL; experience with the Microsoft SQL Server platform and Transact-SQL stored procedures and triggers.
Working knowledge of Azure Synapse and Snowflake.
Experience executing SSIS packages via SQL Server Job Agent.
Experience with complex queries, including the use of CTEs, table variables, MERGE, and dynamic SQL (a brief sketch follows below).
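A hedged sketch of the CTE + MERGE pattern called out above, expressed as T-SQL carried in a Python string, as one might execute it from a pyodbc job alongside SSIS. Table names are invented.

```python
# A hedged T-SQL upsert sketch: a CTE dedupes staging rows, then MERGE
# applies them to the target table. DSN and schema are placeholders.
import pyodbc

UPSERT_SQL = """
WITH latest AS (
    SELECT customer_id, email, updated_at,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY updated_at DESC) AS rn
    FROM staging.customers
)
MERGE dbo.customers AS tgt
USING (SELECT customer_id, email FROM latest WHERE rn = 1) AS src
      ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET tgt.email = src.email
WHEN NOT MATCHED THEN
    INSERT (customer_id, email) VALUES (src.customer_id, src.email);
"""

conn = pyodbc.connect("DSN=warehouse")  # placeholder DSN
conn.execute(UPSERT_SQL)
conn.commit()
```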
Posted 1 week ago
10.0 - 15.0 years
22 - 27 Lacs
Gurugram
Work from Office
Position Summary
Core objectives in this role: drive revenue growth at a key account, in partnership with onshore Client Partners and offshore Delivery Leaders, by focusing on delivery excellence (leading to increased customer satisfaction), strategic account management, and business development.

Work Experience
10+ years of relevant delivery experience in a large/midsize IT services / consulting / analytics company.
10 years of delivery management: business information management (major) and commercial ops (minor).
5+ years of account management / business development / solutioning experience in the IT services / consulting industry.
Extensive pharma / life science industry experience: commercial and clinical datasets.

Tech Skills
Azure ADF
Informatica Intelligent Cloud Services (IICS)
PowerShell scripting
Tidal for scheduling
Azure cloud services
Advanced SQL
Databricks (second priority)

Job Responsibilities
Program management and delivery governance: coordinate with delivery teams to represent the 'voice of the client'; serve as the glue between onshore-offshore and cross-LOB stakeholders; anticipate and mitigate risks; uncover new areas to add value to the client's business.
Business development: activities to grow existing accounts; account strategy and tactics; proposal writing (storyline, slides, documents); bid management; drive to uncover cross-sell and up-sell opportunities.
Process: activities to ensure smooth financial and project operations; track and follow up on CRM opportunities and active projects; prepare reports/decks on pipeline, revenue, profitability, and invoicing.

Being successful in this role requires an attitude of 'ownership' of all aspects of a business. It entails working collaboratively in a matrix organization with the full range of stakeholders: onshore client-facing partners; Finance, Legal, and Marketing; and delivery teams both onshore and offshore.

Education
MBA / PG Diploma in Management

Behavioural Competencies
Project management; client engagement and relationship building; attention to P&L impact; capability building / thought leadership; customer focus.

Technical Competencies
Account management; Azure Data Factory; data governance; pharma data analytics; delivery management (BIM / cloud information management); Informatica; Azure SQL.
Posted 1 week ago
1.0 - 3.0 years
3 - 5 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE
Role Description
We are seeking an MDM Associate Analyst with 2-5 years of development experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment.

To succeed in this role, the candidate must have strong MDM (Master Data Management) experience covering configuration (L3 configuration, asset creation, data modeling, etc.), ETL and data mappings (CAI, CDI), data mastering (match/merge and survivorship rules; a toy illustration follows below), and source and target integrations (REST API, batch integration, integration with Databricks tables, etc.).

Roles & Responsibilities
Analyze and manage customer master data using Reltio or Informatica MDM solutions.
Perform advanced SQL queries and data analysis to validate and ensure master data integrity.
Leverage Python, PySpark, and Databricks for scalable data processing and automation.
Collaborate with business and data engineering teams for continuous improvement in MDM solutions.
Implement data stewardship processes and workflows, including approval and DCR mechanisms.
Utilize AWS cloud services for data storage and compute processes related to MDM.
Contribute to metadata and data modeling activities.
Track and manage data issues using tools such as JIRA, and document processes in Confluence.
Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience
Master's degree with 1-3 years of experience in Business, Engineering, IT, or a related field; OR
Bachelor's degree with 2-5 years of experience in Business, Engineering, IT, or a related field; OR
Diploma with 6-8 years of experience in Business, Engineering, IT, or a related field.

Functional Skills
Must-Have Skills:
Strong experience with Informatica or Reltio MDM platforms, building configurations from scratch (L3 configuration, data modeling, asset creation, setting up API integrations, orchestration).
Strong experience in building data mappings and data profiling, and in creating and implementing business rules for data quality and data transformation.
Strong experience in implementing match and merge rules and survivorship of golden records.
Expertise in integrating master data records with downstream systems.
Very good understanding of DWH basics and good knowledge of data modeling.
Experience with IDQ, data modeling, and approval workflows/DCR.
Advanced SQL expertise and data wrangling.
Exposure to Python and PySpark for data transformation workflows.
Knowledge of MDM, data governance, stewardship, and profiling practices.

Good-to-Have Skills:
Familiarity with Databricks and AWS architecture.
Background in Life Sciences/Pharma industries.
Familiarity with project tools like JIRA and Confluence.
Basics of data engineering concepts.

Professional Certifications
Any ETL certification (e.g., Informatica).
Any data analysis certification (SQL, Python, Databricks).
Any cloud certification (AWS or Azure).

Soft Skills
Strong analytical abilities to assess and improve master data processes and solutions.
Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
Effective problem-solving skills to address data-related issues and implement scalable solutions.
Ability to work effectively with global, virtual teams.
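Purely as a toy illustration of match/merge survivorship (in practice this logic lives in platform rules inside Informatica or Reltio, not in pandas), here is a sketch where the most trusted, most recently updated record in each match group survives. The data and the source-trust ranking are invented.

```python
# A toy survivorship sketch: rank records in each match group by source
# trust then recency, and keep the "golden" values. All data invented.
import pandas as pd

records = pd.DataFrame({
    "match_group": [1, 1, 2],
    "source":      ["CRM", "ERP", "CRM"],
    "email":       ["a@x.com", None, "b@y.com"],
    "updated_at":  pd.to_datetime(["2024-03-01", "2024-05-01", "2024-04-10"]),
})

SOURCE_TRUST = {"CRM": 1, "ERP": 2}  # lower = more trusted (assumption)

golden = (
    records.assign(trust=records["source"].map(SOURCE_TRUST))
           .sort_values(["match_group", "trust", "updated_at"],
                        ascending=[True, True, False])
           .groupby("match_group", as_index=False)
           # .first() takes the first NON-NULL value per column, so a null
           # email on the top-ranked record falls through to the next
           # record's value: attribute-level survivorship.
           .first()
)
print(golden)
```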
Posted 1 week ago
6.0 - 11.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
We are looking for a Snowflake Developer with deep expertise in Snowflake and DBT or SQL to help us build and scale our modern data platform.

Key Responsibilities:
Design and build scalable ELT pipelines in Snowflake using DBT/SQL.
Develop efficient, well-tested DBT models (staging, intermediate, and marts layers).
Implement data quality, testing, and monitoring frameworks to ensure data reliability and accuracy.
Optimize Snowflake queries, storage, and compute resources for performance and cost-efficiency.
Collaborate with cross-functional teams to gather data requirements and deliver data solutions.

Required Qualifications:
5+ years of experience as a Data Engineer, with at least 4 years working with Snowflake.
Proficient with DBT (Data Build Tool), including Jinja templating, macros, and model dependency management.
Strong understanding of ELT patterns and modern data stack principles.
Advanced SQL skills and experience with performance tuning in Snowflake.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the below details:
Candidate's name
Email and alternate email ID
Contact and alternate contact no.
Total exp
Relevant experience
Current org
Notice period
CCTC
ECTC
Current location
Preferred location
Pancard no.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
About the Role:
Grade Level (for internal use): 10

Position summary
Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do
You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud-native environment to:
Build and support data ingestion and processing pipelines in the cloud. This will entail extraction, load, and transformation of big data from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies.
Partner with the product team to assemble large, complex data sets that meet functional and non-functional business requirements; ensure the build-out of data dictionaries / data catalogues and detailed documentation and knowledge around these data assets, metrics, and KPIs.
Warehouse this data, and build data marts, data aggregations, metrics, KPIs, and business logic that lead to actionable insights into our product efficacy, marketing platform, customer behaviour, retention, etc.
Build real-time monitoring dashboards and alerting systems.
Coach and mentor other team members.

Who you are
6+ years of experience in Big Data and Data Engineering.
Strong knowledge of advanced SQL, data warehousing concepts, and DataMart design.
Strong programming skills in SQL, Python/PySpark, etc.
Experience in the design and development of data pipelines and ETL/ELT processes, on-premises and in the cloud.
Experience with one of the cloud providers: GCP, Azure, AWS.
Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
Experience with workflow management tools: Airflow, AWS Data Pipeline, Google Cloud Composer, etc.
Experience with distributed version control environments such as Git and Azure DevOps.
Building Docker images, fetching/promoting and deploying them to production; integrating a Docker container orchestration framework using Kubernetes by creating pods, ConfigMaps, and deployments using Terraform.
Should be able to convert business queries into technical documentation.
Strong problem-solving and communication skills.
Bachelor's or an advanced degree in Computer Science or a related engineering discipline.
Good to have: exposure to Business Intelligence (BI) tools like Tableau, Dundas, and Power BI; agile software development methodologies; working in multi-functional, multi-location teams.

Grade: 10
Location: Gurugram
Hybrid model: twice a week work from office
Shift time: 12 pm to 9 pm IST

What You'll Love About Us - do ask us about these!
Total Rewards. Monetary, beneficial, and developmental rewards!
Work-Life Balance. You can't do a good job if your job is all you do!
Prepare for the Future. Academy: we are all learners; we are all teachers!
Employee Assistance Program. Confidential and professional counselling and consulting.
Diversity & Inclusion. HeForShe!
Internal Mobility. Grow with us!

About automotiveMastermind
Who we are:
Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry, and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales.
Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them. automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We're an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of Drive and Help have been at the core of what we do, and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What we do:
Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What's In It For You
Our Purpose:
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries
Posted 2 weeks ago
8.0 - 12.0 years
20 - 22 Lacs
Chennai, Bengaluru
Work from Office
Experience working closely with other data scientists, data engineers, software engineers, data managers, and business partners. 7+ years in designing, planning, prototyping, productionizing, and maintaining data solutions. Knowledge of Python, Go, Java, and SQL.
Posted 2 weeks ago
4.0 - 7.0 years
4 - 9 Lacs
Pune
Hybrid
Role Overview:
This hybrid role sits within the Distribution Data Stewardship Team and combines operational and technical responsibilities to ensure data accuracy, integrity, and process optimization across sales reporting functions.

Key Responsibilities:
Support sales reporting inquiries from sales staff at all levels.
Reconcile omnibus activity with sales reporting systems.
Analyze data flows to assess impact on commissions and reporting.
Perform data audits and updates to ensure integrity.
Lead process optimization and automation initiatives.
Manage wholesaler commission processes, including adjustments and manual submissions.
Oversee manual data integration from intermediaries.
Execute territory alignment changes to meet business objectives.
Contribute to team initiatives and other responsibilities as assigned.

Growth Opportunities:
Exposure to all facets of sales reporting and commission processes.
Opportunities to develop project and relationship management skills.
Potential to explore leadership or technical specialist roles within the firm.

Qualifications:
Bachelor's degree in Computer Engineering or a related field.
4-7 years of experience with Python programming and automation.
Strong background in SQL and data analysis.
Experience in relationship/customer management and leading teams.
Experience working with Salesforce is a plus.

Required Skills:
Technical proficiency in Python and SQL.
Strong communication skills and stakeholder engagement.
High attention to data integrity and detail.
Self-directed with excellent time management.
Project coordination and documentation skills.
Proficiency in MS Office, especially Excel.
Posted 2 weeks ago
8.0 - 10.0 years
30 - 35 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities
AWS Architect
Primary skills: AWS (Redshift, Glue, Lambda, ETL, and Aurora), advanced SQL, Python, PySpark
Note: Aurora database is a mandatory skill
Experience: 8+ yrs
Notice period: Immediate joiner
Location: Any Brillio location (Bangalore preferred)

Job Description:
Years of IT experience with deep expertise in S3, Redshift, Aurora, Glue, and Lambda services.
At least one instance of proven experience in developing a data platform end to end using AWS.
Hands-on programming experience with DataFrames, Python, and unit testing the Python as well as Glue code.
Experience with orchestration mechanisms like Airflow, Step Functions, etc.
Experience working on AWS Redshift is mandatory.
Must have experience writing stored procedures, an understanding of the Redshift Data API (see the sketch below), and writing federated queries.
Experience in Redshift performance tuning.
Good communication and problem-solving skills; very good stakeholder communication and management.
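A hedged boto3 sketch of the Redshift Data API usage mentioned above. The cluster identifier, database, user, and SQL are placeholders; the Data API is asynchronous, so the statement must be polled before fetching results.

```python
# A hedged Redshift Data API sketch via boto3. Cluster, database,
# user, region, and SQL are invented placeholders.
import time

import boto3

client = boto3.client("redshift-data", region_name="ap-south-1")

resp = client.execute_statement(
    ClusterIdentifier="analytics-cluster",  # hypothetical cluster
    Database="dev",
    DbUser="etl_user",
    Sql="SELECT COUNT(*) FROM sales WHERE sale_date = CURRENT_DATE;",
)

# The Data API is asynchronous: poll until the statement finishes.
while True:
    status = client.describe_statement(Id=resp["Id"])["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(1)

# get_statement_result raises if the statement failed; fine for a sketch.
result = client.get_statement_result(Id=resp["Id"])
print(result["Records"])
```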
Posted 2 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment.
Develop and maintain a strong working relationship with business and technical members of the team.
Relentless focus on quality and continuous improvement.
Perform root cause analysis of report issues.
Development / evolutionary maintenance of the environment, performance, capability, and availability.
Assist in defining technical requirements and developing solutions.
Effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
Tableau Desktop Specialist; strong understanding of SQL for querying databases.
Good to have: Python, Snowflake, statistics, ETL experience.
Extensive knowledge of creating impactful visualizations using Tableau.
Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
Must have experience working with different databases and how to blend data and create relationships in Tableau.
Must have extensive knowledge of creating Custom SQL to pull desired data from databases.
Troubleshooting capabilities to debug data controls.

Preferred technical and professional experience
Troubleshooting capabilities to debug data controls.
Capable of converting business requirements into a workable model.
Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
Posted 2 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.
Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
Tableau Desktop Specialist; strong understanding of SQL for querying databases.
Good to have: Python, Snowflake, statistics, ETL experience.
Extensive knowledge of creating impactful visualizations using Tableau.
Must have a thorough understanding of SQL and advanced SQL (joins and relationships).

Preferred technical and professional experience
Must have experience working with different databases and how to blend data and create relationships in Tableau.
Must have extensive knowledge of creating Custom SQL to pull desired data from databases.
Troubleshooting capabilities to debug data controls.
Capable of converting business requirements into a workable model.
Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
Posted 2 weeks ago