6.0 - 10.0 years
13 - 14 Lacs
Jaipur, Delhi / NCR, Bengaluru
Hybrid
Location: Delhi / NCR / Jaipur / Bangalore / Hyderabad
Work Mode: Hybrid - 2 days WFO
Working Time: 1:00 PM to 10:00 PM IST
iSource Services is hiring for one of their USA-based clients for the position of Data Integration Specialist.
About the Role - We are seeking a skilled Data Integration Specialist to manage data ingestion, unification, and activation across Salesforce Data Cloud and other platforms. You will design and implement robust integration workflows, leveraging APIs and ETL tools to enable seamless data flow and support a unified customer experience.
Key Responsibilities:
- Design and implement data ingestion workflows into Salesforce Data Cloud
- Unify data from multiple sources to create a 360-degree customer view
- Develop integrations using APIs, ETL tools, and middleware (e.g., MuleSoft)
- Collaborate with cross-functional teams to gather and fulfill data integration requirements
- Monitor integration performance and ensure real-time data availability
- Ensure compliance with data privacy and governance standards
- Enable data activation across Salesforce Marketing, Sales, and Service Clouds
Must-Have Skills:
- Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery)
- Salesforce certifications (e.g., Data Cloud Consultant, Integration Architect)
- Hands-on experience with Salesforce Data Cloud (CDP)
- Proficiency in ETL, data transformation, and data mapping
- Strong knowledge of REST/SOAP APIs and integration tools
- Solid understanding of data modeling and customer data platforms
- Familiarity with data privacy regulations (e.g., GDPR, CCPA)
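To make the ingestion responsibility above concrete, here is a minimal Python sketch of pushing records to a REST ingestion endpoint. The URL, object names, payload shape, and token handling are placeholders for illustration only; the real Salesforce Data Cloud Ingestion API has its own authentication flow and schema requirements.

```python
# Hedged sketch: POST a small batch of records to a REST ingestion endpoint.
# Endpoint, payload shape, and credentials below are hypothetical.
import requests

INGEST_URL = "https://example-host/ingest/sources/crm/contacts"  # placeholder
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_TOKEN"

records = [
    {"contact_id": "C-1001", "email": "a@example.com", "city": "Jaipur"},
    {"contact_id": "C-1002", "email": "b@example.com", "city": "Delhi"},
]

resp = requests.post(
    INGEST_URL,
    json={"data": records},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Ingestion request accepted:", resp.status_code)
```

In practice the same pattern is wrapped in retry and batching logic, and the unified profiles are then activated into the Marketing, Sales, and Service Clouds downstream.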
Posted 3 days ago
4.0 - 7.0 years
10 - 15 Lacs
Mumbai
Work from Office
We are seeking a skilled Business Intelligence Manager to build and maintain analytics and reporting solutions that convert data into actionable insights. The BI Manager role is pivotal, involving the conversion of provided data into meaningful insights through user-friendly dashboards and reports. An ideal BI Manager is proficient in Business Intelligence tools and technology, oversees the creation and administration of BI tools with comprehensive knowledge of the BI system, and manages stakeholder expectations while ensuring the team delivers to those expectations. This role demands a grasp of business concepts, strong problem-solving abilities, and prior experience in data and business analysis. Analytical prowess and effective communication skills are highly valued attributes for this position.
BI Responsibilities. The day-to-day responsibilities include, but are not limited to:
- Develop actionable insights that can be used to make business decisions by building reports and dashboards.
- Understand business stakeholders' objectives, the metrics that are most important to them, and how they measure performance.
- Translate data into highly leveraged and effective visualizations.
- Share knowledge and skills with your teammates to grow analytics impact.
- Come up with an overall design strategy for all analytics that improves the user experience.
- Influence and educate stakeholders on the appropriate data, tools, and visualizations.
- Review all analytics for quality before final outputs are delivered to stakeholders.
- Take responsibility for version control and creating technical documentation.
- Partner with IT to provide different ways of improving existing processes.
- Contribute to delivery through the development and implementation of best-in-class data visualization and insights.
- Build strong relationships with business stakeholders to ensure understanding of business needs.
- Improve performance of all visualizations through optimized code; experience with custom/third-party visuals.
- Design, implement, and maintain scalable data pipelines and architectures.
Essential Traits. Qualifications/Skills:
- Graduate or equivalent level qualification, preferably in a related discipline; Master's degree preferred.
- 6-8 years of analytical experience in Data and Analytics: building reports and dashboards.
- 6-8 years of experience with visualization tools such as Power BI.
- Hands-on experience in DAX, Power Query, and SQL, and the ability to build data models that generate meaningful insights.
- Experience working with and creating analytics to enable stakeholders to make data-driven decisions.
- 4+ years of experience with requirements gathering.
- Expert-level proficiency in data transformation/configuration and connecting data to Power BI dashboards.
- Exposure to implementing row-level security and bookmarks.
Competencies:
- Highly motivated and influential team player with a proven track record of driving results.
- Strong communicator and collaborator with exceptional interpersonal skills.
- Analytical problem-solver with a passion for innovation and continuous improvement.
- Teachable, embraces best practices, and leverages feedback as a means of continuous improvement.
- Consistently high achiever marked by perseverance, humility, and a positive outlook in the face of challenges.
- Strong problem-solving, quantitative, and analytical abilities.
- Solid written and verbal communication skills and the ability to build strong relationships.
Preferred: Microsoft or other BI certification.
About Kroll. In a world of disruption and increasingly complex business challenges, our professionals bring truth into focus with the Kroll Lens. Our sharp analytical skills, paired with the latest technology, allow us to give our clients clarity—not just answers—in all areas of business. We value the diverse backgrounds and perspectives that enable us to think globally. As part of One team, One Kroll, you'll contribute to a supportive and collaborative work environment that empowers you to excel. Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight and independence to help our clients make sound decisions. As an organization, we think globally—and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit. In order to be considered for a position, you must formally apply via careers.kroll.com.
Posted 3 days ago
3.0 - 4.0 years
11 - 14 Lacs
Mumbai
Work from Office
AEP Data Architect
Key Responsibilities and Requirements & Qualifications:
- 10+ years of strong experience with data transformation & ETL on large data sets.
- Experience with designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of Data Modeling experience (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Experience in advanced Data Warehouse concepts.
- Experience in industry ETL tools (e.g., Informatica, Unifi).
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with the Sales team and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.
- Degree in Computer Science, Information Systems, Data Science, or a related field.
Posted 3 days ago
5.0 - 8.0 years
10 - 20 Lacs
Chennai, Bengaluru
Work from Office
ODI Developer (Chennai/Bangalore). WFO only.
- 5-8 years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). ODI expertise is a must; profiles with ETL experience only in Informatica or other tools (without ODI) will be rejected.
- Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL development.
- Experience in data integration, transformation, and loading from heterogeneous data sources.
- Strong understanding of data modeling concepts and ETL best practices.
- Familiarity with performance tuning and troubleshooting of ETL processes.
- Knowledge of scripting languages (e.g., Python, JavaScript) for automation is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication skills to work effectively with cross-functional teams.
For more information, please call Varsha at 7200847046.
Posted 3 days ago
5.0 - 8.0 years
18 - 25 Lacs
Pune
Work from Office
We are seeking an experienced Modern Microservice Developer to join our team and contribute to the design, development, and optimization of scalable microservices and data processing workflows. The ideal candidate will have expertise in Python, containerization, and orchestration tools, along with strong skills in SQL and data integration.
Key Responsibilities:
- Develop and optimize data processing workflows and large-scale data transformations using Python.
- Write and maintain complex SQL queries in Snowflake to support efficient data extraction, manipulation, and aggregation.
- Integrate diverse data sources and perform validation testing to ensure data accuracy and integrity.
- Design and deploy containerized applications using Docker, ensuring scalability and reliability.
- Build and maintain RESTful APIs to support microservices architecture.
- Implement CI/CD pipelines and manage orchestration tools such as Kubernetes or ECS for automated deployments.
- Monitor and log application performance, ensuring high availability and quick issue resolution.
Requirements (Mandatory):
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5-8 years of experience in Python development, with a focus on data processing and automation.
- Proficiency in SQL, with hands-on experience in Snowflake.
- Strong experience with Docker and containerized application development.
- Solid understanding of RESTful APIs and microservices architecture.
- Familiarity with CI/CD pipelines and orchestration tools like Kubernetes or ECS.
- Knowledge of logging and monitoring tools to ensure system health and performance.
Preferred Skills: Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
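The responsibilities above combine REST microservices with Snowflake-backed data access. As a rough, hedged illustration (the endpoint, table, and query are invented for this sketch rather than taken from the posting), a containerizable FastAPI service might look like this:

```python
# Minimal FastAPI microservice sketch; the "orders" table and query are
# hypothetical examples. Connection handling is simplified for brevity.
from fastapi import FastAPI
import snowflake.connector

app = FastAPI()

def run_query(sql: str):
    # Credentials would normally come from environment variables or a vault.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(sql)
        cols = [c[0] for c in cur.description]
        return [dict(zip(cols, row)) for row in cur.fetchall()]
    finally:
        conn.close()

@app.get("/orders/summary")
def orders_summary():
    # Simple aggregation endpoint backing a microservice API.
    sql = "SELECT region, COUNT(*) AS order_count FROM orders GROUP BY region"
    return {"rows": run_query(sql)}
```

A Dockerfile running the app under uvicorn, plus Kubernetes or ECS manifests, would cover the deployment side described in the posting; pooling, auth, and error handling are omitted here.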
Posted 3 days ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Cloud Data Scientist to build and scale data science solutions in cloud-native environments. Ideal for candidates who specialize in analytics and machine learning using cloud ecosystems. Key Responsibilities: Design predictive and prescriptive models using cloud ML tools Use BigQuery, SageMaker, or Azure ML Studio for scalable experimentation Collaborate on data sourcing, transformation, and governance in the cloud Visualize insights and present findings to stakeholders Required Skills & Qualifications: Strong Python/R skills and experience with cloud ML stacks (AWS, GCP, or Azure) Familiarity with cloud-native data warehousing and storage (Redshift, BigQuery, Data Lake) Hands-on with model deployment, CI/CD, and A/B testing in the cloud Bonus: Background in NLP, time series, or geospatial analysis Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies
Posted 3 days ago
3.0 - 8.0 years
4 - 9 Lacs
Mumbai Suburban
Work from Office
Job Title: Data Processing (DP) Executive
Location: MIDC, Andheri East, Mumbai
Work Mode: Work From Office (WFO)
Work Days: Monday to Friday
Work Hours: 9:00 PM to 6:00 AM IST (Night Shift)
Job Summary: We are seeking a highly skilled and detail-oriented Data Processing (DP) Executive to join our team. The ideal candidate will have a solid background in data analysis and processing, strong proficiency in industry-standard tools, and the ability to manage large data sets efficiently. This role is critical in ensuring data integrity and delivering accurate insights for business decision-making.
Key Responsibilities:
- Manage and process data using tools like SPSS and Q programming.
- Perform data cleaning, transformation, and statistical analysis.
- Collaborate with research and analytics teams to interpret and format data for reporting.
- Create reports and dashboards; experience with Tableau or similar visualization tools is an advantage.
- Utilize SQL for data querying and validation.
- Ensure accuracy and consistency of data deliverables across projects.
- Handle multiple projects simultaneously with a keen eye for detail and timelines.
Technical Skills:
- Proficiency in SPSS and Q programming.
- Strong understanding of data processing techniques and statistical methods.
- Familiarity with Tableau or other data visualization tools (preferred).
- Basic working knowledge of SQL.
Educational Qualifications: Bachelor's degree in Statistics, Computer Science, Data Science, or a related field.
Experience: Minimum 3 years of experience in data processing or a similar analytical role.
Soft Skills:
- Excellent analytical and problem-solving abilities.
- Strong attention to detail and accuracy.
- Good communication skills and the ability to work in a team-oriented environment.
- Self-motivated with the ability to work independently and manage multiple tasks effectively.
Posted 4 days ago
8.0 - 13.0 years
25 - 30 Lacs
Pune
Work from Office
What You'll Do
We are seeking a highly skilled and motivated Senior Data Engineer to join our Data Operations team. The ideal candidate will have deep expertise in Python, Snowflake SQL, modern ETL tools, and business intelligence platforms such as Power BI. This role also requires experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will be responsible for building and maintaining data pipelines, developing robust data models, and ensuring seamless data integrations that support business analytics and reporting. The role requires flexibility to collaborate in US time zones as needed.
What Your Responsibilities Will Be
- Design, develop, and maintain scalable data pipelines and workflows using modern ETL tools and Python.
- Build and optimize SQL queries and data models on Snowflake to support analytics and reporting needs.
- Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors.
- Develop and support dashboards and reports using Power BI and other reporting tools.
- Work closely with data analysts, business users, and other engineering teams to gather requirements and deliver high-quality solutions.
- Ensure data quality, accuracy, and consistency across systems and datasets.
- Write clean, well-documented, and testable code with a focus on performance and reliability.
- Participate in peer code reviews and contribute to best practices in data engineering.
- Be available for meetings and collaboration in US time zones as required.
What You'll Need To Be Successful
- 5+ years' experience in the data engineering field, with deep SQL knowledge.
- Strong experience in Snowflake SQL, Python, AWS services, Power BI, and ETL tools (dbt, Airflow) is a must.
- Proficiency in Python for data transformation and scripting.
- Proficiency in writing complex SQL queries and stored procedures.
- Strong experience in data warehouse, data modeling, and ETL design concepts.
- Should have integrated SaaS systems like Salesforce, Zuora, and NetSuite along with relational databases, REST APIs, FTP/SFTP, etc.
- Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.).
- Excellent communication skills, with the ability to translate technical issues for non-technical stakeholders.
- Flexibility to work during US business hours as required for team meetings and collaboration.
How We'll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.
What You Need To Know About Avalara
We're Avalara. We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, that empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too.
We're An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company: we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
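The responsibilities above centre on pulling data from SaaS REST APIs and landing it for Snowflake-based modelling. Below is a minimal, hedged sketch of that extract-and-stage pattern; the endpoint, fields, and file name are invented for illustration and do not reflect Salesforce's, Zuora's, or NetSuite's actual APIs.

```python
# Hypothetical extract step: page through a SaaS REST endpoint and stage the
# result as a CSV that a Snowflake COPY INTO / Snowpipe step could then load.
import requests
import pandas as pd

BASE_URL = "https://api.example-saas.com/v1/invoices"  # placeholder endpoint
TOKEN = "REPLACE_WITH_TOKEN"

rows, page = [], 1
while True:
    resp = requests.get(
        BASE_URL,
        params={"page": page, "page_size": 200},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    batch = resp.json().get("items", [])
    if not batch:
        break
    rows.extend(batch)
    page += 1

df = pd.DataFrame(rows)
if "amount" in df.columns:
    # Light cleanup before staging.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df.to_csv("invoices_stage.csv", index=False)
print(f"Staged {len(df)} invoice records")
```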
Posted 4 days ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Data Engineering Manager to lead a team building data pipelines, models, and analytics infrastructure. Ideal for experienced engineers who can manage both technical delivery and team growth. Key Responsibilities: Lead development of ETL/ELT pipelines and data platforms Manage data engineers and collaborate with analytics/data science teams Architect systems for data ingestion, quality, and warehousing Define best practices for data architecture, testing, and monitoring Required Skills & Qualifications: Strong experience with big data tools (Spark, Kafka, Airflow) Proficiency in SQL, Python, and cloud data services (e.g., Redshift, BigQuery) Proven leadership and team management in data engineering contexts Bonus: Experience with real-time streaming and ML pipeline integration Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
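Since the role above owns ETL/ELT pipeline delivery with tools such as Airflow, here is a minimal, illustrative Airflow 2.x DAG skeleton; the task names, schedule, and transform logic are placeholders rather than a prescribed design.

```python
# Minimal Airflow 2.x DAG sketch: one extract task feeding one transform task.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull raw records from a source system.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

def transform(**context):
    # Read the upstream result via XCom and apply a trivial aggregation.
    records = context["ti"].xcom_pull(task_ids="extract")
    total = sum(r["value"] for r in records)
    print(f"Transformed {len(records)} records, total value {total}")

with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```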
Posted 5 days ago
4.0 - 7.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Description Summary The Data Scientist will work in teams addressing statistical, machine learning and data understanding problems in a commercial technology and consultancy development environment. In this role, you will contribute to the development and deployment of modern machine learning, operational research, semantic analysis, and statistical methods for finding structure in large data sets. Job Description Site Overview Established in 2000, the John F. Welch Technology Center (JFWTC) in Bengaluru is GE Aerospaces multidisciplinary research and engineering center. Pushing the boundaries of innovation every day, engineers and scientists at JFWTC have contributed to hundreds of aviation patents, pioneering breakthroughs in engine technologies, advanced materials, and additive manufacturing. Role Overview: As a Data Scientist, you will be part of a data science or cross-disciplinary team on commercially-facing development projects, typically involving large, complex data sets. These teams typically include statisticians, computer scientists, software developers, engineers, product managers, and end users, working in concert with partners in GE business units. Potential application areas include remote monitoring and diagnostics across infrastructure and industrial sectors, financial portfolio risk assessment, and operations optimization. In this role, you will: Develop analytics within well-defined projects to address customer needs and opportunities. Work alongside software developers and software engineers to translate algorithms into commercially viable products and services. Work in technical teams in development, deployment, and application of applied analytics, predictive analytics, and prescriptive analytics. Perform exploratory and targeted data analyses using descriptive statistics and other methods. Work with data engineers on data quality assessment, data cleansing and data analytics Generate reports, annotated code, and other projects artifacts to document, archive, and communicate your work and outcomes. Share and discuss findings with team members. Required Qualifications: Bachelor's Degree in Computer Science or STEM Majors (Science, Technology, Engineering and Math) with basic experience. Desired Characteristics: - Expertise in one or more programming languages and analytic software tools (e.g., Python, R, SAS, SPSS). Strong understanding of machine learning algorithms, statistical methods, and data processing techniques. - Exceptional ability to analyze large, complex data sets and derive actionable insights. Proficiency in applying descriptive, predictive, and prescriptive analytics to solve real-world problems. - Demonstrated skill in data cleansing, data quality assessment, and data transformation. Experience working with big data technologies and tools (e.g., Hadoop, Spark, SQL). - Excellent communication skills, both written and verbal. Ability to convey complex technical concepts to non-technical stakeholders and collaborate effectively with cross-functional teams - Demonstrated commitment to continuous learning and staying up-to-date with the latest advancements in data science, machine learning, and related fields. Active participation in the data science community through conferences, publications, or contributions to open-source projects. - Ability to thrive in a fast-paced, dynamic environment and adapt to changing priorities and requirements. Flexibility to work on diverse projects across various domains. 
Preferred Qualifications: - Awareness of feature extraction and real-time analytics methods. - Understanding of analytic prototyping, scaling, and solutions integration. - Ability to work with large, complex data sets and derive meaningful insights. - Familiarity with machine learning techniques and their application in solving real-world problems. - Strong problem-solving skills and the ability to work independently and collaboratively in a team environment. - Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. Domain Knowledge: Demonstrated awareness of industry and technology trends in data science Demonstrated awareness of customer and stakeholder management and business metrics Leadership: Demonstrated awareness of how to function in a team setting Demonstrated awareness of critical thinking and problem solving methods Demonstrated awareness of presentation skills Personal Attributes: Demonstrated awareness of how to leverage curiosity and creativity to drive business impact Humble: respectful, receptive, agile, eager to learn Transparent: shares critical information, speaks with candor, contributes constructively Focused: quick learner, strategically prioritizes work, committed Leadership ability: strong communicator, decision-maker, collaborative Problem solver: analytical-minded, challenges existing processes, critical thinker Whether we are manufacturing components for our engines, driving innovation in fuel and noise reduction, or unlocking new opportunities to grow and deliver more productivity, our GE Aerospace teams are dedicated and making a global impact. Join us and help move the aerospace industry forward . Additional Information Relocation Assistance Provided: No
Posted 5 days ago
4.0 - 6.0 years
12 - 18 Lacs
Noida, Greater Noida
Work from Office
Role & responsibilities Utilize Python (specifically Pandas) to clean, transform, and analyze data, automate repetitive tasks, and create custom reports and visualizations. Analyze and interpret complex datasets, deriving actionable insights to support business decisions. Write and optimize advanced SQL queries for data extraction, manipulation, and analysis from various sources, including relational databases and cloud-based data storage. Collaborate with cross-functional teams to understand data needs and deliver data-driven solutions. Create and maintain dashboards and reports that visualize key metrics and performance indicators. Identify trends, patterns, and anomalies in data to support business intelligence efforts and provide strategic recommendations. Ensure data integrity and accuracy by developing and implementing data validation techniques. Support data migration, transformation, and ETL processes within cloud environments. Requirements 3 - 5 years of experience as a Data analyst or equivalent role. Good experience in Python, with hands-on experience using Pandas for data analysis and manipulation. Expertise in analytical SQL, including writing complex queries for data extraction, aggregation, and transformation. Knowledge of cloud platforms, particularly AWS (Amazon Web Services). Strong analytical thinking, problem-solving, and troubleshooting abilities. Familiarity with data visualization tools (e.g., Tableau, Power BI, Quicksight , Superset etc.) is a plus. Excellent communication skills, with the ability to explain complex data insights in a clear and actionable manner. Detail-oriented with a focus on data quality and accuracy. Preferred Qualifications: Experience working in a cloud-based data analytics environment. Familiarity with additional cloud services and tools (e.g. Snowflake , Athena). Experience working in an Agile environment or with data-oriented teams.
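As a concrete, deliberately simplified example of the Pandas clean-transform-aggregate work described above (the orders.csv file and its column names are invented for the sketch):

```python
# Small pandas sketch: clean a raw extract and produce a dashboard-ready summary.
import pandas as pd

df = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Basic cleaning: drop exact duplicates, normalise text, coerce numerics.
df = df.drop_duplicates()
df["region"] = df["region"].str.strip().str.title()
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["amount"])

# Monthly revenue per region, the kind of aggregate a BI dashboard would consume.
summary = (
    df.assign(month=df["order_date"].dt.to_period("M").astype(str))
      .groupby(["month", "region"], as_index=False)["amount"]
      .sum()
      .rename(columns={"amount": "revenue"})
)
print(summary.head())
```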
Posted 5 days ago
5.0 - 10.0 years
18 - 25 Lacs
Bengaluru
Remote
Job Title: Data Engineer (ETL & Spatial Data Expert)
Locations: Bengaluru / Gurugram / Nagpur / Remote
Department: Data Engineering / GIS / ETL
Experience: As per requirement (CTC capped at 3.5x of experience in years)
Notice Period: Max 30 days
Role Overview: We are looking for a detail-oriented and technically proficient Data Engineer with strong experience in FME, spatial data handling, and ETL pipelines. The role involves building, transforming, validating, and automating complex geospatial datasets and dashboards to support operational and analytical needs. Candidates will work closely with internal teams, local authorities (LA), and HMLR specs.
Key Responsibilities:
1. Data Integration & Transformation
- Build ETL pipelines using FME to ingest and transform data from Idox/CCF systems.
- Create Custom Transformers in FME to apply reusable business rules.
- Use Python (standalone or within FME) for custom transformations, date parsing, and validations.
- Conduct data profiling to assess completeness, consistency, and accuracy.
2. Spatial Data Handling
- Manage and query spatial datasets using PostgreSQL/PostGIS.
- Handle spatial formats like GeoPackage, GML, GeoJSON, and Shapefiles.
- Fix geometry issues like overlaps or invalid polygons using FME or SQL.
- Ensure proper coordinate system alignment (e.g., EPSG:27700).
3. Automation & Workflow Orchestration
- Use FME Server/FME Cloud to automate and monitor ETL workflows.
- Schedule batch processes via CI/CD, Cron, or Python.
- Implement audit trails and logs for all data processes and rule applications.
4. Dashboard & Reporting Integration
- Write SQL views and aggregations to support dashboard visualizations.
- Optionally integrate with Power BI, Grafana, or Superset.
- Maintain metadata tagging for each data batch.
5. Collaboration & Communication
- Interpret validation reports and collaborate with Analysts/Ops teams.
- Translate business rules into FME logic or SQL queries.
- Map data to LA/HMLR schemas accurately.
Preferred Tools & Technologies:
- ETL: FME (Safe Software), Talend (optional), Python
- Spatial DB: PostGIS, Oracle Spatial
- GIS Tools: QGIS, ArcGIS
- Scripting: Python, SQL
- Validation: FME Testers, AttributeValidator, SQL views
- Formats: CSV, JSON, GPKG, XML, Shapefiles
- Collaboration: Jira, Confluence, Git
Ideal Candidate Profile:
- Strong hands-on experience with FME workflows and spatial data transformation.
- Proficient in scripting using Python and working with PostGIS.
- Demonstrated ability to build scalable data automation pipelines.
- Effective communicator capable of converting requirements into technical logic.
- Past experience with LA or HMLR data specifications is a plus.
Required Qualifications: B.E./B.Tech. (Computer Science, IT, or ECE), B.Sc. (IT/CS), or full-time MCA.
Strict Screening Criteria:
- No employment gaps over 4 months.
- Do not consider candidates from Jawaharlal Nehru University.
- Exclude profiles from Hyderabad or Andhra Pradesh (education or employment).
- Reject profiles with BCA, B.Com, Diploma, or open university backgrounds.
- Projects must detail technical tools/skills used clearly.
- Max CTC is 3.5x of total years of experience.
- No flexibility on notice period or compensation.
- No candidates from Noida for Gurugram location.
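One of the spatial responsibilities above is repairing invalid polygons in SQL. A minimal sketch of that check-and-fix step, run from Python against PostGIS, might look like the following; the parcels table, geometry column, and connection details are assumptions made for illustration.

```python
# Sketch: find invalid geometries in PostGIS and repair them with ST_MakeValid.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="gis", user="gis_user", password="***")
with conn, conn.cursor() as cur:
    # Count invalid polygons before repair.
    cur.execute("SELECT COUNT(*) FROM parcels WHERE NOT ST_IsValid(geom);")
    print("Invalid geometries:", cur.fetchone()[0])

    # Repair in place; ST_MakeValid preserves the original vertices where possible.
    cur.execute(
        "UPDATE parcels SET geom = ST_MakeValid(geom) WHERE NOT ST_IsValid(geom);"
    )
conn.close()
```

The same rule could equally be expressed as an FME workflow (a validator-style transformer followed by a geometry repair step) when the pipeline lives in FME rather than SQL.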
Posted 6 days ago
10.0 - 15.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Novo Nordisk Global Business Services (GBS) India
Department - Global Data & Artificial Intelligence
Are you passionate about building scalable data pipelines and optimising data workflows? Do you want to work at the forefront of data engineering, collaborating with cross-functional teams to drive innovation? If so, we are looking for a talented Data Engineer to join our Global Data & AI team at Novo Nordisk. Read on and apply today for a life-changing career!
The Position
As a Senior Data Engineer, you will play a key role in designing, developing, and maintaining data pipelines and integration solutions to support analytics, Artificial Intelligence workflows, and business intelligence. This includes:
- Design, implement, and maintain scalable data pipelines and integration solutions aligned with the overall data architecture and strategy.
- Implement data transformation workflows using modern ETL/ELT approaches while establishing best practices for data engineering, including testing methodologies and documentation.
- Optimize data workflows by harmonizing and securely transferring data across systems, while collaborating with stakeholders to deliver high-performance solutions for analytics and Artificial Intelligence.
- Monitor and maintain data systems to ensure their reliability.
- Support data governance by ensuring data quality and consistency, while contributing to architectural decisions shaping the data platform's future.
- Mentor junior engineers and foster a culture of engineering excellence.
Qualifications
- Bachelor's or master's degree in Computer Science, Software Development, or Engineering.
- Over 10 years of overall professional experience, including more than 4 years of specialized expertise in data engineering.
- Experience in developing production-grade data pipelines using Python, Databricks, and Azure cloud, with a strong foundation in software engineering principles.
- Experience in the clinical data domain, with knowledge of standards such as CDISC SDTM and ADaM (good to have).
- Experience working in a regulated industry (good to have).
About the department
You will be part of the Global Data & AI team. Our department is globally distributed, and its mission is to harness the power of Data and Artificial Intelligence, integrating it seamlessly into the fabric of Novo Nordisk's operations. We serve as the vital link, weaving together the realms of Data and Artificial Intelligence throughout the whole organization, empowering Novo Nordisk to realize its strategic ambitions through our pivotal initiatives. The atmosphere is fast-paced and dynamic, with a strong focus on collaboration and innovation. We work closely with various business domains to create actionable insights and drive commercial excellence.
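To ground the requirement for production-grade data pipelines using Python and Databricks, here is a minimal PySpark transformation sketch; the paths, column names, and cleaning rules are illustrative assumptions, not Novo Nordisk specifics.

```python
# Minimal PySpark sketch of a transformation step in an ETL/ELT pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_transform").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/lab_results.csv")

cleaned = (
    raw.withColumn("result_value", F.col("result_value").cast("double"))
       .filter(F.col("result_value").isNotNull())
       .withColumn("collected_date", F.to_date("collected_date", "yyyy-MM-dd"))
)

# Write an analytics-ready copy. Delta tables would be typical on Databricks;
# Parquet keeps this sketch dependency-free.
cleaned.write.mode("overwrite").partitionBy("collected_date").parquet(
    "/data/curated/lab_results"
)

spark.stop()
```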
Posted 6 days ago
2.0 - 4.0 years
3 - 5 Lacs
Bengaluru
Work from Office
Description of the position/role:
- Design, develop, and maintain interactive dashboards and reports in Power BI to showcase key performance indicators and business trends.
- Write complex SQL queries to extract, manipulate, and analyze data from relational databases, ensuring data accuracy and integrity.
- Develop and implement Macros and VBA scripts to automate repetitive tasks and streamline data processing workflows.
- Collaborate with cross-functional teams to gather requirements, understand business needs, and translate them into technical specifications for reporting and analytics.
- Perform data analysis to identify trends, patterns, and anomalies, and provide recommendations based on findings.
- Ensure timely delivery of reports and analyses to meet business objectives and support decision-making processes.
- Troubleshoot and resolve any data-related issues, ensuring high-quality data for reporting and analysis.
- Stay updated on industry trends and best practices related to data visualization and analytics.
Requirements:
- Bachelor's degree/diploma in Data Science, Information Technology, Business Analytics, or a related field.
- 2 to 5 years of experience as a Data or Business Analyst; night shift (06:00 PM to 03:00 AM).
- Proven experience in data analysis and business intelligence, specifically using Power BI, SQL, and Macros/VBA.
- Strong understanding of database management systems and ETL processes.
- Proficient in writing SQL queries for data extraction and manipulation.
- Experience with Power BI, including DAX functions, data modeling, and report design; building relationships, parameters, and measures, and back-end formatting.
- Knowledge of Macros and VBA programming to automate tasks within Excel and other applications.
- Excellent analytical and problem-solving skills, with the ability to interpret data and offer actionable insights.
- Strong attention to detail and the ability to work independently as well as in a team environment.
- Good communication skills to effectively present findings and collaborate with stakeholders.
Posted 6 days ago
6.0 - 9.0 years
27 - 42 Lacs
Pune
Work from Office
Job Summary
We are seeking a Developer with 4 to 9 years of experience to join our team. The ideal candidate will have strong technical skills and hands-on experience in integration development using Workato.
Key Responsibilities
- Design and implement robust, reusable, and scalable integrations using Workato Recipes, Connectors, and Workbot.
- Work closely with business stakeholders, architects, and product teams to understand integration needs and translate them into technical requirements.
- Develop custom connectors and scripts using JavaScript, HTTP connectors, and Webhook listeners within Workato.
- Maintain and enhance existing integrations, troubleshoot issues, and ensure high availability and performance.
- Implement data mapping, transformation, and error handling best practices.
- Leverage the Workato SDK (if needed) to create reusable components and extend platform capabilities.
- Monitor and optimize recipe performance and perform root cause analysis for failed jobs.
- Mentor junior developers and contribute to integration governance frameworks and best practices.
- Participate in agile ceremonies, provide input on story estimations, and contribute to technical documentation.
Required Skills
- 2+ years of experience in integration development using Workato.
- Deep understanding of Workato platform features: Recipes, Recipe Functions, Collections, Lookup Tables, Connections, Jobs, and Logs.
- Strong experience in REST/SOAP API consumption, authentication (OAuth 2.0, API keys), and data formats (JSON, XML).
- Proficiency in SQL, JavaScript, and data transformation logic within integrations.
- Experience in building custom connectors using the Workato Connector SDK (preferred).
- Solid understanding of error handling, logging, and retry mechanisms.
- Workato Automation Pro certifications (e.g., Level 1, 2, or Workato Partner Certification).
Posted 6 days ago
6.0 - 11.0 years
13 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
Required Skills:
- AWS Glue: mandatory.
- AWS S3 and AWS Lambda: should have some experience.
- Snowflake: must have used Snowpipe to build integration pipelines, and able to build procedures from scratch.
- Writing complex SQL queries.
- Python: NumPy and pandas.
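Since hands-on Snowpipe setup is called out above, here is a hedged sketch of the usual pattern: an external stage plus an auto-ingest pipe, created through the Snowflake Python connector. All object names, the bucket, and the credentials are invented for illustration.

```python
# Sketch: create a stage, a landing table, and an auto-ingest Snowpipe pipe.
import snowflake.connector

ddl_statements = [
    """
    CREATE STAGE IF NOT EXISTS raw_events_stage
      URL = 's3://example-bucket/events/'
      CREDENTIALS = (AWS_KEY_ID = '***' AWS_SECRET_KEY = '***')
      FILE_FORMAT = (TYPE = 'JSON')
    """,
    "CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)",
    """
    CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events FROM @raw_events_stage
    """,
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    for stmt in ddl_statements:
        cur.execute(stmt)
    print("Stage, table, and pipe created (or already present).")
finally:
    conn.close()
```

A real auto-ingest setup also needs the S3 bucket's event notifications wired to the pipe's notification channel; that part is omitted here.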
Posted 6 days ago
6.0 - 10.0 years
20 - 35 Lacs
Noida
Work from Office
Job Title: Solutions Architect
Type: Full-Time
Location: Sec 63, Noida
Salary: Best in Market
About the Role: We are building a Microsoft-focused IT consulting firm serving SMB clients across cloud solutions, cybersecurity, and digital transformation. We are seeking a client-facing, hands-on Solutions Architect who will not only design and deliver technical solutions but also lead projects and manage technical teams. This is a dual-role opportunity: you'll be the lead technical consultant on client calls and pre-sales discussions, and also the technical delivery lead, responsible for creating roadmaps and ensuring successful execution using internal and external resources. Prior experience leading technical teams and projects is a must.
Key Responsibilities:
Client-Facing & Business Consulting
- Join client discovery meetings with the sales team to identify pain points and technical needs.
- Present technical solutions clearly to both technical and non-technical stakeholders.
- Collaborate with sales on developing SOWs, proposals, and solution estimates.
- Build and maintain long-term relationships as the trusted technical advisor.
Solution Architecture & Delivery Leadership
- Design and deploy Microsoft-based solutions, including Azure infrastructure, SharePoint, Intune, MFA, and Office 365.
- Conduct cybersecurity assessments and penetration tests; implement Microsoft Defender and Azure Security Center.
- Plan and lead seamless Office 365 migrations with strong emphasis on user experience and uptime.
- Lead technical project teams, allocate tasks, and oversee execution from kickoff through post-implementation.
- Develop and maintain project roadmaps, timelines, and delivery milestones aligned with client goals.
- Be hands-on when needed and help resolve complex issues directly or by guiding team members.
- Build and manage a trusted network of independent technical experts for flexible delivery capacity.
Documentation & Continuous Improvement
- Produce architecture diagrams, SOWs, security reports, test results, and audit-ready deliverables.
- Stay up to date with Microsoft technologies and bring new ideas to evolve our offerings.
Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Required Experience:
- Minimum 2+ years leading technical teams and IT projects from planning through execution; this is mandatory.
- 5+ years in solution architecture or IT consulting, with hands-on Microsoft ecosystem experience.
- Proven success in Azure (IaaS/PaaS), Intune, SharePoint, MFA, and Office 365 migrations.
- Experience conducting penetration tests and implementing security/compliance frameworks (HIPAA, CMMC, etc.).
- Previous experience working in or with an IT consulting or managed services firm.
Certifications (Preferred):
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft 365 Certified: Enterprise Administrator Expert
- CEH / OSCP / CISSP / CompTIA Security+
Skills:
- Strong leadership and team coordination experience.
- Ability to translate technical requirements into business-focused solutions.
- Excellent communication: clear, confident, and able to represent the company to executive-level stakeholders.
- Proficient with project management tools (Jira, MS Project, Asana, etc.).
- Capable of independently managing project delivery and motivating technical contributors.
Nice to Have:
- A strong network of IT professionals and contractors for rapid scaling.
- Experience with SMB clients in regulated industries (healthcare, finance, etc.).
Who You Are:
- A natural team leader with strong client-facing presence.
- A technical expert who thrives in a fast-paced, high-responsibility role.
- A builder, excited about contributing to the foundation of a growing consulting firm.
To speed up processing, you might also send a copy of your profile along with a brief write-up supporting your case to: vinod@apetanco.com, rajni@apetan.com, riya@apetan.com
Posted 6 days ago
8.0 - 10.0 years
10 - 12 Lacs
Pune
Work from Office
Key Responsibilities
- Develop and maintain supply chain analytics to monitor operational performance and trends.
- Lead and participate in Six Sigma and supply chain improvement initiatives.
- Ensure data integrity and consistency across all analytics and reporting platforms.
- Design and implement reporting solutions for key supply chain KPIs.
- Analyze KPIs to identify improvement opportunities and develop actionable insights.
- Build and maintain repeatable, scalable analytics using business systems and BI tools.
- Conduct scenario modeling and internal/external benchmarking.
- Provide financial analysis to support supply chain decisions.
- Collaborate with global stakeholders to understand requirements and deliver impactful solutions.
External Qualifications and Competencies
- Bachelor's degree in Engineering, Computer Science, Supply Chain, or a related field.
- Relevant certifications in BI tools, Agile methodologies, or cloud platforms are a plus.
- This position may require licensing for compliance with export controls or sanctions regulations.
Additional Responsibilities Unique to this Position
Experience
- 8-10 years of total experience, with at least 6 years in a relevant analytics or supply chain role.
- Proven experience in leading small teams and managing cross-functional projects.
Technical Skills
- Expertise in SQL, SQL Server, SSIS, SSAS, and Power BI.
- Advanced DAX development for complex reporting needs.
- Performance optimization for SQL and SSAS environments.
- Cloud and data engineering: Azure Synapse, Azure Data Factory (ADF), Python, Snowflake.
- Agile methodology: experience working in Agile teams and sprints.
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- Graduate with a minimum of 6+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well-versed with data warehouse schemas and OLAP techniques.
Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization.
- Ability to communicate complex business problems and technical solutions.
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
In your role, you will be responsible for:
- Working across multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Python and SQL work experience is a must; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with the customer.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered, including the purpose/KPIs for which the data transformation was done.
Preferred technical and professional experience:
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
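As a small illustration of the BigQuery-on-GCP analysis work described above (the project, dataset, and query are placeholders, and application-default credentials are assumed to be configured):

```python
# Minimal BigQuery sketch: run an aggregation query and iterate the results.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT country, COUNT(*) AS sessions
    FROM `my-analytics-project.web.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY country
    ORDER BY sessions DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(f"{row.country}: {row.sessions}")
```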
Posted 1 week ago
5.0 - 10.0 years
35 - 40 Lacs
Bengaluru
Hybrid
Expert in the operating model & AJG data governance, SOPs for Collibra, Collibra Data Catalog KPIs, manual stitching of assets in Collibra, and related technical skills. Workflow design & stakeholder management; hands-on experience in data governance & Collibra.
Required Candidate Profile: Implementation, configuration, and maintenance of the Collibra Data Governance Platform; experience working with stewards, data owners, and stakeholders; knowledge of data governance, data quality, and data integration principles.
Posted 1 week ago
7.0 - 12.0 years
18 - 30 Lacs
Chennai, Bengaluru, Thiruvananthapuram
Hybrid
Seeking a skilled Intermediate WebMethods Developer to design and implement integration solutions using Software AG's WebMethods Integration Server platform. The ideal candidate will ensure seamless connectivity and data exchange between enterprise systems, contributing to the overall efficiency and effectiveness of our IT operations.
Key Responsibilities:
- Design, develop, and deploy integration solutions using WebMethods Integration Server.
- Configure and utilize WebMethods components such as Broker, Universal Messaging, and Trading Networks.
- Create Flow Services, Java Services, and Adapter Services for integration.
- Implement web services (SOAP/REST), XML, JSON, and messaging protocols (JMS, MQ).
- Configure WebMethods adapters (e.g., JDBC, SAP, JMS) for system connectivity.
- Perform data transformation and mapping using XSLT, EDI, and related technologies.
- Handle errors, debug issues, and perform performance tuning of integration solutions.
- Monitor integration processes using MyWebMethods Server (MWS) and other tools.
- Collaborate with business stakeholders, technical teams, and end-users to ensure integration solutions meet requirements.
- Document integration solutions, designs, and processes clearly and comprehensively.
- Contribute to the continuous improvement of integration processes and best practices.
Required Skills:
- 6-8 years of experience working with Software AG's WebMethods platform.
- Strong understanding of integration concepts and middleware technologies.
- Proficiency in web services (SOAP/REST), XML, JSON, and messaging protocols (JMS, MQ).
Posted 1 week ago
7.0 - 12.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Individual Accountabilities
Collaboration
- Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs.
- Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented.
- Contributes to prototype or proof-of-concept efforts.
- Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions.
- Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution.
- Suggests architecture designs in collaboration with the Ontologies and MDM teams.
Technical skills & design
- Significant experience working with structured and unstructured data at scale and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses.
- Deep understanding of modern data services in leading cloud environments, and able to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity.
- Creates data architecture artifacts such as architecture diagrams, data models, design documents, etc.
- Guides domain architects on the value of a modern data and analytics platform.
- Researches, designs, tests, and evaluates new technologies, platforms, and third-party products.
- Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a valuable asset.
- Expert troubleshooting skills and experience.
Leadership
- Mentors aspiring data architects, typically operating in data engineering and software engineering roles.
Key shared accountabilities
- Leads medium to large data services projects.
- Provides technical partnership to product owners.
- Shared stewardship, with domain architects, of the Arcadis data ecosystem.
- Actively participates in the Arcadis Tech Architect community.
Key profile requirements
- Minimum of 7 years of experience in designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines.
- Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies to help guide and steer decisions.
- Experience working in large-scale development and cloud environments.
Posted 1 week ago
1.0 - 6.0 years
5 - 10 Lacs
Maharashtra
Work from Office
We seek a Digital Marketing Specialist, Mid Level with Marketing Media Mix who shares our passion for innovation and change. This role is critical to helping our business partners evolve and adapt to consumers' personalized expectations in this new technological era.
Roles and Responsibilities:
Delivery of MMM/ROI projects: delivery of multiple high-impact ROI projects, as described below:
- Data preparation, harmonization, and transformation for model feeds.
- Thorough understanding of different types of media, sales, and equity data.
- Media ROI modelling with granular insights at the campaign level, and by objectives, formats, and ad types.
- Promo ROI modelling at granular and retailer level, with deep dives into various promotion techniques.
- Sales and equity ROI modelling at channel and format level, with deep dives into marketing drivers.
- Ecomm ROI projects with deep dives into retail media, influencer spends, and online promotions, along with other drivers.
- Develop and implement advanced market mix models to quantify the ROI of marketing activities across various product categories and channels.
- Interpret and analyse model outputs, identifying key trends and actionable insights for marketing optimization.
- Foster a collaborative and results-oriented team environment.
Technical Skills and Project experience requirements:
- Should have executed ROI/MMM projects in the past.
- Understanding of statistical modelling techniques, including ensemble modelling, Bayesian HLM analysis, and multivariate analysis.
- Understanding of the impact of product propositions and marketing drivers on brand equity metrics, and of deriving the incremental lift from those drivers to uplift brand sales.
- Understanding of data types used in MMM, including sales, equity, and macro-economic data.
- Experience with data transformation techniques, including nonlinear transformation and adstock transformation of variables used in the modelling.
- Experience in Ecomm modelling with deep dives on retail media, influencer, and online promotions.
Education: Bachelor's degree in Engineering/Statistics or specialization in Business Analytics.
Prior work experience: minimum 1 year of relevant experience.
This job can be filled in Bangalore/Pune. #LI-Hybrid
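The adstock transformation mentioned above has a simple recursive form: the transformed value at time t is the current spend plus a decayed carryover of the previous transformed value. A minimal NumPy sketch follows; the decay rate of 0.5 is an arbitrary illustrative choice.

```python
# Geometric adstock: y[t] = x[t] + decay * y[t-1], used in MMM to model the
# carryover effect of media spend beyond the period it was incurred.
import numpy as np

def geometric_adstock(spend, decay=0.5):
    """Apply a geometric adstock transformation to a spend series."""
    transformed = np.zeros(len(spend), dtype=float)
    carryover = 0.0
    for t, x in enumerate(spend):
        carryover = x + decay * carryover
        transformed[t] = carryover
    return transformed

weekly_tv_spend = [100, 0, 0, 50, 0, 0, 0]
print(geometric_adstock(weekly_tv_spend, decay=0.5))
# -> 100, 50, 25, 62.5, 31.25, 15.625, 7.8125
```

In practice the decay rate (and often a saturation curve on top) is estimated as part of the model rather than fixed by hand.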
Posted 1 week ago
India has seen a significant rise in the demand for data transformation professionals in recent years. With the increasing importance of data in business decision-making, companies across various industries are actively seeking skilled individuals who can transform raw data into valuable insights. If you are considering a career in data transformation in India, here is a comprehensive guide to help you navigate the job market.
Major tech hubs such as Bengaluru, Hyderabad, Pune, Mumbai, Chennai, and the Delhi NCR (the locations that recur throughout the postings above) are known for their thriving tech industries and have a high demand for data transformation professionals.
The average salary range for data transformation professionals in India varies based on experience levels. Entry-level positions typically start at INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
A typical career path in data transformation may include roles such as Data Analyst, Data Engineer, Data Scientist, and Data Architect. As professionals gain experience and expertise, they may progress to roles like Senior Data Scientist, Lead Data Engineer, and Chief Data Officer.
In addition to data transformation skills, professionals in this field are often expected to have knowledge of programming languages (such as Python, R, or SQL), data visualization tools (like Tableau or Power BI), statistical analysis, and machine learning techniques.
As the demand for data transformation professionals continues to rise in India, now is a great time to explore opportunities in this field. By honing your skills, gaining relevant experience, and preparing for interviews, you can position yourself for a successful career in data transformation. Good luck!