
96701 Analytics Jobs - Page 28

JobPe aggregates listings for easy access; you apply directly on the original job portal.

3.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions.
2. Strong hands-on SQL (No FLEX).
3. Experience with data integration and migration projects.
4. Proficiency in BigQuery SQL (No FLEX).
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience in cloud solutions, mainly data platform services; GCP certifications.
6. Experience in shell scripting, Python (NO FLEX), Oracle, and SQL.

Technical Experience:
1. Expert in Python (NO FLEX). Strong hands-on knowledge of SQL (NO FLEX) and of Python programming with Pandas and NumPy; a deep understanding of data structures (dictionaries, arrays, lists, trees, etc.); experience with pytest and code coverage preferred.
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (NO FLEX).
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, Jira, and Confluence.
4. An open mindset and the ability to quickly adopt new technologies.
5. Performance tuning of BigQuery SQL scripts.
6. GCP certification preferred.
7. Experience working in an agile environment.

Professional Attributes:
1. Good communication skills.
2. Ability to collaborate with different teams and suggest solutions.
3. Ability to work independently with little supervision, or as part of a team.
4. Good analytical and problem-solving skills.
5. Good team-handling skills.

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
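For readers unfamiliar with the stack, here is a minimal sketch of the kind of BigQuery ETL step this role revolves around, using the official google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical; this is illustrative, not the employer's actual code.

```python
# Minimal, illustrative BigQuery transform-and-load step.
# Project, dataset, and table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

# Aggregate raw events into a daily summary table.
sql = """
CREATE OR REPLACE TABLE `example-project.analytics.daily_orders` AS
SELECT
  DATE(order_ts) AS order_date,
  COUNT(*)       AS order_count,
  SUM(amount)    AS total_amount
FROM `example-project.raw.orders`
GROUP BY order_date
"""

job = client.query(sql)   # starts the query job
job.result()              # waits for completion; raises on SQL errors
print(f"Job {job.job_id} finished.")
```

In practice such statements are parameterized and scheduled from an orchestrator such as Composer (Airflow), but the client call pattern stays the same.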

Posted 12 hours ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Company Description: Codes For Tomorrow specializes in delivering products and services based on JavaScript frameworks such as ReactJS, NodeJS, Angular, AngularJS, and core JavaScript. We also have expertise in working with the world's most popular CMS, WordPress, to deliver headless WordPress websites. We are open to partnering with other technology companies to meet their product needs.

Role Description: This is a full-time, on-site role for an SEO Intern located in Indore. The SEO Intern will be responsible for conducting keyword research, performing SEO audits, and building backlinks. Day-to-day tasks will also include working on on-page SEO optimizations and analyzing web analytics data to improve search engine rankings and visibility.

Qualifications:
- Proficiency in keyword research and on-page SEO
- Experience conducting SEO audits and link building
- Skilled in using web analytics tools
- Strong analytical and problem-solving skills
- Excellent verbal and written communication skills
- Ability to work independently and as part of a team
- Relevant coursework or experience in digital marketing is a plus
- Currently pursuing or recently completed a degree in Marketing, Communications, or a related field

Posted 12 hours ago

Apply

1.0 years

0 Lacs

Kochi, Kerala, India

On-site

Digital Marketing Specialist

Responsibilities:
- Plan and execute all digital marketing, including SEO/SEM, the marketing database, email, social media, and display advertising campaigns.
- Prepare weekly and monthly reports on marketing performance for client projects, with insights and next action steps.
- Brainstorm new and creative growth strategies.
- Identify trends and insights, and optimize spend and performance accordingly.
- Handle business development, client meetings, and client relationships within digital marketing.
- Collaborate with other team members to support their marketing strategies and change processes as needed.
- Collaborate with agencies and other vendor partners.
- Evaluate emerging technologies.
- Ensure projects run smoothly, campaigns remain on track, and project goals are met.
- Create online opportunities that accelerate growth.

Requirements:
- 1+ years of experience in digital media strategy, planning, online sales, strategic alliances, and social media marketing.
- Well versed in Google Analytics, AdWords, PPC, and all digital marketing activities.
- Working knowledge of HTML, CSS, and JavaScript development and constraints.
- Ability to work with a technical team and drive the business through marketing initiatives.
- Well versed in CRM, analytics, and scheduling tools.
- Good team-building and management skills.
- Solid experience handling the entire gamut of digital marketing: SEO, SMM, PPC, and content writing.
- A good command of the English language.

What you've got:
- CRM management through web chatbots and email marketing.
- Experience working with the product team to create high-conversion funnels for our audiences on web and app.

Posted 12 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Shift: General shift; all five days work from office (WFO).

- Strong functional knowledge of Six Sigma, statistical tools, and quality and process re-engineering.
- Exposure to ISO, CMMI, or COPC is an added advantage.
- Consulting experience in the shared-services setup/BPO space is desirable.
- Single point of contact for quality support of client account(s); connect and collaborate with client program managers independently on a daily basis.
- Lead and guide a team of improvement consultants.
- Ensure implementation of quality standards in the account.
- Design and deploy a consistent improvement framework to enhance customer satisfaction.
- Facilitate metrics management (CPMs/KPIs).
- Provide high-quality business analytics support to the management team.
- Execute client projects on cost reduction, customer experience improvement, process re-engineering, process improvement, and workforce optimization.
- Project planning, project management, change management at client locations, stakeholder management, and communication.
- Facilitate workshops and remote groups, and lead process diagnostics to articulate process issues and formulate solutions.
- Benchmark processes, set up management dashboards, and build a best-practices repository.
- Leverage business optimization and innovation tools and applications for process re-engineering.
- Bring expertise in moderate AI/ML-capability-driven transformation.
- Any experience driving process improvement in the geo-maps and SDV space is an added advantage.
- Keep oneself updated on, aware of, and compliant with all company policies and procedures, including Information Security Management Systems.
- Ensure that all company information, including customer information, is kept confidential and secured as part of organizational policy.
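To make the "statistical tools" requirement concrete, here is a small, self-contained Python sketch computing classic 3-sigma control limits for a defect-rate KPI, a staple of Six Sigma metrics management. The figures are invented for illustration and are not from the employer.

```python
# Illustrative Six Sigma-style check: 3-sigma control limits for a
# daily defect-rate KPI. All numbers are made up for the sketch.
import statistics

daily_defect_rates = [0.021, 0.018, 0.025, 0.019, 0.022, 0.020, 0.023]

mean = statistics.mean(daily_defect_rates)
sd = statistics.stdev(daily_defect_rates)  # sample standard deviation

# Classic 3-sigma control limits for an individuals chart;
# the lower limit is clipped at zero since a rate cannot be negative.
ucl = mean + 3 * sd
lcl = max(0.0, mean - 3 * sd)
print(f"mean={mean:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}")

# Flag out-of-control days for the improvement team to investigate.
outliers = [r for r in daily_defect_rates if r > ucl or r < lcl]
print("out-of-control points:", outliers)
```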

Posted 12 hours ago

Apply

1.0 years

0 Lacs

Chandigarh, India

On-site

Company Description: QuadB Tech is a boutique blockchain development studio specializing in cutting-edge blockchain solutions. The company integrates the latest technologies, such as AI, the Metaverse, and Web3, to deliver innovative products. QuadB Tech is focused on pushing the boundaries of what is possible with blockchain technology.

About the Role: As a Social Media Assistant, you'll manage publishing and engagement across agency channels. You'll help run daily social campaigns, monitor community engagement, and support content execution across Twitter and LinkedIn.

Key Responsibilities:
- Schedule and publish posts for Mythic Studio's Twitter and LinkedIn
- Engage with the community (100+ replies per day across all accounts)
- Format copy, coordinate with designers, and repurpose posts
- Help gather analytics, post insights, and weekly performance recaps
- Keep the agency voice consistent across multiple brand touchpoints

Who We're Looking For:
- 6 months to 1 year of experience in social media or marketing
- Familiarity with Twitter, LinkedIn, and tools like Buffer or Hootsuite
- A quick learner who is detail-oriented and comfortable with fast turnarounds
- A team player who thrives on feedback and execution

Posted 12 hours ago

Apply

0.0 - 3.0 years

0 - 0 Lacs

Rajajipuram, Lucknow, Uttar Pradesh

On-site

Job Title: Senior Team Leader – Collections & Recovery
Department: Collections & Recovery
Location: INSPIRE BPO, 356/088, 3rd Floor, Alamnagar Road, near Bawali Police Chauki, Rajajipuram, Lucknow, Uttar Pradesh 226017
Reports To: Collections Manager / Head of Recovery
Employment Type: Full-Time

Job Summary: We are seeking a highly motivated and experienced Senior Team Leader to oversee and manage a team of collection officers focused on debt recovery. The ideal candidate will bring strong leadership, strategic thinking, and a results-driven mindset to improve recovery rates, maintain compliance, and ensure high performance within the team.

Key Responsibilities:

Team Management:
- Lead, coach, and manage a team of collection agents to achieve daily, weekly, and monthly recovery targets.
- Monitor team performance, provide feedback, and implement corrective actions where necessary.
- Motivate and inspire the team to maintain high levels of productivity and morale.

Debt Recovery Operations:
- Oversee the execution of collection strategies for delinquent accounts (early, mid, and late stage).
- Handle escalated or complex recovery cases and negotiate with high-value or sensitive customers.
- Ensure compliance with all regulatory and legal requirements in recovery processes.

Performance Monitoring & Reporting:
- Track KPIs and prepare regular reports on team performance and recovery metrics.
- Analyze trends, identify gaps, and recommend process improvements.
- Collaborate with analytics or MIS teams to develop insight-driven strategies.

Process & Compliance Oversight:
- Ensure adherence to standard operating procedures, internal policies, and regulatory guidelines.
- Conduct regular audits of collection activities and documentation.
- Provide training and upskilling for team members on compliance and soft skills.

Stakeholder Management:
- Liaise with the legal, operations, and customer service departments for case coordination.
- Represent the collections team in internal reviews and strategic planning meetings.

Qualifications & Experience:
- At least 12th pass; Bachelor's degree in Business Administration, Finance, or a related field.
- 5–8 years of experience in collections or recovery, with at least 2–3 years in a leadership or supervisory role.
- Strong knowledge of collection tools, legal recovery processes, and regulatory requirements.
- Excellent communication, negotiation, and interpersonal skills.
- Proficiency in MS Excel, CRM systems, and collection software tools.
- Ability to work under pressure, handle escalations, and drive results.

Key Competencies: leadership and team management; analytical and problem-solving skills; negotiation and conflict resolution; attention to detail; adaptability and strategic thinking; strong ethical standards and integrity.

Job Type: Full-time
Pay: ₹22,000.00 – ₹25,000.00 per month
Application Question(s): Do you have advanced knowledge of Excel?
Work Location: In person
Speak with the employer: +91 8882287773

Posted 12 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Kadapa Mandal, Andhra Pradesh, India

On-site

Experience: 5–8 years
Qualification: Degree in Electrical Engineering
Location: Sprng 250 MW, Kadapa, Andhra Pradesh

1. Sound knowledge of the basic principles of electrical components.
2. Preparation of various documents as per ISO standards.
3. Experience in problem solving using tools such as Root Cause Analysis and the Corrective Action & Preventive Action (CAPA) process.
4. Raising non-conformities, conducting RCA and CAPA, and tracking closure of NCs for timely completion.
5. Supervising and verifying preventive maintenance, as per schedule, for solar power plant equipment (e.g., modules, inverters).
6. Checking inverters and strings for faults; checking and rectifying defective strings and modules.
7. Coordinating with OEMs for spares, AMC, warranty, and service requests.
8. Analysis of various plant performance parameters against the plant design (PVsyst).

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions.
2. Strong hands-on SQL (No FLEX).
3. Experience with data integration and migration projects.
4. Proficiency in BigQuery SQL (No FLEX).
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience in cloud solutions, mainly data platform services; GCP certifications.
6. Experience in shell scripting, Python (NO FLEX), Oracle, and SQL.

Technical Experience:
1. Expert in Python (NO FLEX). Strong hands-on knowledge of SQL (NO FLEX) and of Python programming with Pandas and NumPy; a deep understanding of data structures (dictionaries, arrays, lists, trees, etc.); experience with pytest and code coverage preferred.
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (NO FLEX).
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, Jira, and Confluence.
4. An open mindset and the ability to quickly adopt new technologies.
5. Performance tuning of BigQuery SQL scripts.
6. GCP certification preferred.
7. Experience working in an agile environment.

Professional Attributes:
1. Good communication skills.
2. Ability to collaborate with different teams and suggest solutions.
3. Ability to work independently with little supervision, or as part of a team.
4. Good analytical and problem-solving skills.
5. Good team-handling skills.

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
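Since this listing singles out Pandas, NumPy, and pytest with code coverage, here is a small hypothetical example of that style of work: a data-quality transform and a unit test for it. The column names and cleaning rules are invented for the sketch.

```python
# Illustrative Pandas + pytest sketch: a small data-quality transform
# and a unit test. Column names and rules are hypothetical.
import numpy as np
import pandas as pd

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with missing order IDs and clip negative amounts to zero."""
    out = df.dropna(subset=["order_id"]).copy()
    out["amount"] = np.clip(out["amount"], a_min=0, a_max=None)
    return out

# pytest discovers functions named test_*.
def test_clean_orders():
    raw = pd.DataFrame(
        {"order_id": [1, None, 3], "amount": [10.0, 5.0, -2.0]}
    )
    cleaned = clean_orders(raw)
    assert len(cleaned) == 2               # the row with a missing ID is gone
    assert (cleaned["amount"] >= 0).all()  # negative amounts were clipped
```

Running this with the pytest-cov plugin (`pytest --cov`) produces the code-coverage signal the listing mentions.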

Posted 12 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions.
2. Strong hands-on SQL (No FLEX).
3. Experience with data integration and migration projects.
4. Proficiency in BigQuery SQL (No FLEX).
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience in cloud solutions, mainly data platform services; GCP certifications.
6. Experience in shell scripting, Python (NO FLEX), Oracle, and SQL.

Technical Experience:
1. Expert in Python (NO FLEX). Strong hands-on knowledge of SQL (NO FLEX) and of Python programming with Pandas and NumPy; a deep understanding of data structures (dictionaries, arrays, lists, trees, etc.); experience with pytest and code coverage preferred.
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (NO FLEX).
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, Jira, and Confluence.
4. An open mindset and the ability to quickly adopt new technologies.
5. Performance tuning of BigQuery SQL scripts.
6. GCP certification preferred.
7. Experience working in an agile environment.

Professional Attributes:
1. Good communication skills.
2. Ability to collaborate with different teams and suggest solutions.
3. Ability to work independently with little supervision, or as part of a team.
4. Good analytical and problem-solving skills.
5. Good team-handling skills.

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
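For the Pub/Sub side of the pipelines this listing names, here is a minimal publish step using the official google-cloud-pubsub client; the project and topic names are hypothetical.

```python
# Illustrative Pub/Sub publish step (google-cloud-pubsub).
# The project and topic names are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "orders-events")

event = {"order_id": 42, "amount": 99.5}
# Pub/Sub payloads are bytes; serialize the event as UTF-8 JSON.
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print(f"Published message {future.result()}")  # blocks until the server acks
```

Subscriber-side code follows the same client pattern; the publish future's result() returns the server-assigned message ID.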

Posted 12 hours ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Join PwC US - Acceleration Center as a Manager of GenAI Data Science to lead innovative projects and drive significant advancements in GenAI solutions. We offer a competitive compensation package, a collaborative work environment, and ample opportunities for professional growth and impact.

Years of Experience: Candidates with 8+ years of hands-on experience.

Responsibilities:
- Lead and mentor a team of data scientists in understanding business requirements and applying GenAI technologies to solve complex problems.
- Oversee the development, implementation, and optimization of machine learning models and algorithms for various GenAI projects.
- Direct the data preparation process, including data cleaning, preprocessing, and feature engineering, to ensure data quality and readiness for analysis.
- Collaborate with data engineers and software developers to streamline data processing and integration into machine learning pipelines.
- Evaluate model performance rigorously using advanced metrics and testing methodologies to ensure robustness and effectiveness (see the sketch after this listing).
- Spearhead the deployment of production-ready machine learning applications, ensuring scalability and reliability.
- Apply expert programming skills in Python, R, or Scala to develop high-quality software components for data analysis and machine learning.
- Utilize Kubernetes for efficient container orchestration and deployment of machine learning applications.
- Design and implement innovative data-driven solutions, such as chatbots, using the latest GenAI technologies.
- Communicate complex data insights and recommendations to senior stakeholders through compelling visualizations, reports, and presentations.
- Lead the adoption of cutting-edge GenAI technologies and methodologies to continuously improve data science practices.
- Champion knowledge sharing and skill development within the team to foster an environment of continuous learning and innovation.

Requirements:
- 8-10 years of relevant experience in data science, with significant expertise in GenAI projects.
- Advanced programming skills in Python, R, or Scala, and proficiency in machine learning libraries like TensorFlow, PyTorch, or scikit-learn.
- Extensive experience in data preprocessing, feature engineering, and statistical analysis.
- Strong knowledge of cloud computing platforms such as AWS, Azure, or Google Cloud, and data visualization techniques.
- Demonstrated leadership in managing data science teams and projects.
- Exceptional problem-solving, analytical, and project management skills.
- Excellent communication and interpersonal skills, with the ability to lead and collaborate effectively in a dynamic environment.

Preferred Qualifications:
- Experience with object-oriented programming languages such as Java, C++, or C#.
- Proven track record of developing and deploying machine learning applications in production environments.
- Understanding of data privacy and compliance regulations in a corporate setting.
- Relevant advanced certifications in data science or GenAI technologies.

Nice-to-Have Skills:
- Experience with specific tools such as Azure AI Search, Azure Document Intelligence, Azure OpenAI, AWS Textract, AWS OpenSearch, and AWS Bedrock.
- Familiarity with LLM-backed agent frameworks like AutoGen, LangChain, and Semantic Kernel, and experience in chatbot development.

Professional and Educational Background: Any graduate / BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master’s Degree / MBA
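
As a rough illustration of the model-evaluation work described in this listing, here is a minimal sketch using scikit-learn; the synthetic dataset, model choice, and metrics are assumptions for illustration, not part of the role description.

```python
# Minimal sketch: rigorous model evaluation with cross-validation and
# held-out metrics. The data here is synthetic, standing in for a real,
# preprocessed feature matrix.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)

# Cross-validation guards against a lucky single train/test split.
cv_auc = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")
print(f"5-fold CV ROC-AUC: {cv_auc.mean():.3f} +/- {cv_auc.std():.3f}")

# Final evaluation on held-out data, with per-class metrics.
model.fit(X_train, y_train)
proba = model.predict_proba(X_test)[:, 1]
print(f"Held-out ROC-AUC: {roc_auc_score(y_test, proba):.3f}")
print(classification_report(y_test, model.predict(X_test)))
```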

Posted 12 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Skill required: Marketing Operations - Campaign Management
Designation: Campaign Management Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do? We are seeking a highly skilled and detail-oriented Ad Operations Specialist to join our dynamic team. The ideal candidate will be responsible for managing and optimizing the delivery of digital advertising campaigns, ensuring smooth execution, and maintaining the quality and performance of digital ad operations. The role requires a strong understanding of digital advertising platforms, analytics, and the ability to troubleshoot issues effectively.

1. Campaign Management:
- Set up, monitor, and optimize digital ad campaigns across various platforms/products (display, video, social media, etc.).
- Ensure proper targeting, scheduling, and creative deployment for optimal campaign delivery.
- Manage creative assets and ad trafficking, ensuring the correct formats and specifications are used.
- Work closely with the client and provide analytical/campaign reports, track KPIs, and optimize campaigns based on performance metrics.
- Troubleshoot and resolve campaign issues related to delivery, tracking, and ad quality.

2. Technical Setup & Troubleshooting:
- Perform ad trafficking tasks, ensuring that all campaigns are set up properly and execute without errors.
- Troubleshoot technical issues, such as discrepancies in reporting, creative issues, or campaign performance problems.
- Coordinate with vendors or partners to resolve any issues impacting campaign delivery.

3. Client Servicing:
- Collaborate with account managers, clients, and internal teams to align campaign objectives with ad execution.
- Communicate with internal and external teams to ensure smooth campaign delivery.
- Excellent written and verbal communication skills for internal and client-facing interactions.
- Good at articulating problems and challenges in simple words.
- Proactive in identifying issues and challenges, and able to use technical knowledge to suggest solutions.

What are we looking for?

4. Reporting & Analysis:
- Deliver campaign performance reports and actionable insights to clients and stakeholders (see the sketch after this listing).
- Help with the analysis of campaign data to identify trends, opportunities, and areas for improvement.

5. Platform Expertise:
- Stay up to date with the latest trends and updates in digital advertising platforms (Google Ad Manager, Magnite, etc.).
- Maintain expert knowledge of ad-serving technologies, tracking methods, and optimization tools.

6. Quality Assurance:
- Review and ensure all creative assets meet technical specifications and are free from errors. Resolve any discrepancies before the ads go live.

Qualifications & Skills:
- Education: Bachelor’s degree, preferably in Marketing, Advertising, or a related field.
- Experience: 2-3 years of experience in Campaign Management, Ad Operations, or Digital Marketing.
- Technical Skills: Familiarity with ad-serving platforms (DoubleClick, Sizmek, Google Ad Manager, etc.) and analytics tools (Google Analytics, Magnite, Tableau, etc.).
- Attention to Detail: Strong ability to manage and optimize campaigns with a focus on precision and accuracy.
- Analytical Mindset: Strong data analysis skills and comfort with numbers to make informed decisions.
- Communication Skills: Excellent written and verbal communication skills for internal and client-facing interactions.
- Problem-Solving: Ability to troubleshoot and resolve issues in a timely and efficient manner.
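
As a rough illustration of the reporting work in point 4 above, a minimal sketch of a campaign KPI report built with pandas; the CSV file and column names are hypothetical, not tied to any specific ad platform's export format.

```python
# Minimal sketch: campaign performance report (CTR, CPC, CPM, CVR)
# from a hypothetical delivery export. File and column names are assumptions.
import pandas as pd

df = pd.read_csv("campaign_delivery.csv")  # hypothetical platform export

report = df.groupby("campaign", as_index=False).agg(
    impressions=("impressions", "sum"),
    clicks=("clicks", "sum"),
    conversions=("conversions", "sum"),
    spend=("spend", "sum"),
)
# Standard delivery KPIs, one row per campaign.
report["ctr_pct"] = 100 * report["clicks"] / report["impressions"]
report["cpc"] = report["spend"] / report["clicks"]
report["cpm"] = 1000 * report["spend"] / report["impressions"]
report["cvr_pct"] = 100 * report["conversions"] / report["clicks"]

print(report.sort_values("spend", ascending=False).round(2))
```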

Posted 12 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Company Description: Reimagining Oral Care: Where Sustainability meets Safety! Salt Oral Care is a brand under the parent company Manor Rama Care Pvt. Ltd. We are committed to providing eco-friendly, toxin-free, gluten-free, and 100% vegan oral care products. Our products are designed with minimal plastic and feature recyclable or eco-friendly packaging. Certified by GMP, ISO 9001-2015, and FDA approved, our products ensure safety and sustainability, free from harmful chemicals like Fluoride, SLS, GMO, Paraben, Triclosan, and Carrageenan.

Job Summary: We are seeking a dynamic and results-driven Growth - HORECA Distribution Manager to expand our presence in the Hotels, Restaurants, and Catering (HORECA) sector. The ideal candidate will be responsible for developing and executing strategies to drive sales, build strong partnerships with key accounts, and enhance brand visibility within the luxury oral care industry. This role requires a deep understanding of HORECA distribution, sales management, and business development.

Key Responsibilities:
- Develop and implement a strategic distribution plan to grow sales in the HORECA sector.
- Identify and establish partnerships with hotels, restaurants, and catering businesses.
- Manage and nurture relationships with key accounts, ensuring strong customer engagement and retention.
- Drive revenue growth by negotiating contracts, pricing, and distribution agreements with HORECA clients.
- Collaborate with the marketing team to create tailored promotional campaigns and brand activations.
- Monitor market trends, competitor activities, and customer preferences to optimize sales strategies.
- Work closely with supply chain and logistics teams to ensure timely order fulfillment and inventory management.
- Utilize data analytics and AI tools to track sales performance and identify new business opportunities.
- Lead the expansion of HORECA distribution channels in new markets and regions.
- Ensure compliance with industry regulations and company policies.

Key Requirements:
- Education: Bachelor's degree in Business, Sales, Marketing, or a related field.
- Experience: 5-7 years of experience in sales, distribution, or business development within the HORECA industry.
- Proven track record of driving growth and managing key accounts in the hospitality sector.
- Strong understanding of HORECA distribution networks and B2B sales strategies.
- Excellent negotiation, communication, and relationship-building skills.
- Experience in luxury, premium, or FMCG brands is preferred.

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire MS SQL professionals in the following areas:

Experience: 5-7 Years

Job Summary: We are looking for a skilled SQL Server and Snowflake Developer to join our data and analytics team. The ideal candidate will have strong experience in developing and maintaining data solutions using SQL Server and Snowflake. You will play a key role in building scalable data pipelines, designing data models, and delivering business intelligence solutions.

Key Responsibilities:
- Develop and optimize complex SQL queries, stored procedures, and ETL processes in SQL Server.
- Design and implement data pipelines and models in Snowflake.
- Build and maintain SSIS packages for ETL workflows.
- Migrate and integrate data between on-premise SQL Server and the Snowflake cloud platform (see the sketch after this listing).
- Collaborate with business analysts and stakeholders to understand reporting needs.
- Ensure data quality, performance tuning, and error handling across all solutions.
- Maintain technical documentation and support data governance initiatives.

Required Skills & Qualifications:
- 5-7 years of experience with SQL Server (T-SQL).
- 2+ years of hands-on experience with Snowflake.
- Strong understanding of ETL/ELT processes and data warehousing principles.
- Experience with data modeling, performance tuning, and data integration.
- Familiarity with Azure cloud platforms is a plus.
- Good communication and problem-solving skills.

Preferred / Good-to-Have Skills:
- Experience with Azure Data Factory (ADF) for orchestrating data workflows.
- Experience with Power BI or other visualization tools.
- Exposure to CI/CD pipelines and DevOps practices in data environments.

Required Technical/Functional Competencies:
- Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools & methodologies. Able to analyse the impact of a change request/enhancement/defect fix, and identify dependencies or interrelationships among requirements and transition requirements for an engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture adhering to industry standards/practices. Able to analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
- Architecture Tools and Frameworks: Working knowledge of industry architecture tools & frameworks. Able to identify the pros and cons of tools & frameworks available in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
- Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
- Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees. Able to identify the cause of errors and their potential solutions.
- Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
- Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
- Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
- Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets.
- Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
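
To make the migration bullet concrete, here is a minimal sketch of copying one table from on-premise SQL Server into Snowflake in Python, assuming the pyodbc and snowflake-connector-python[pandas] packages; all connection parameters and table names are placeholders. A production pipeline would add chunking, incremental loads, error handling, and validation.

```python
# Minimal sketch: copy one table from on-premise SQL Server to Snowflake.
# Connection parameters and table names below are placeholders.
import pandas as pd
import pyodbc
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Extract from SQL Server.
mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-host;DATABASE=sales;UID=etl_user;PWD=***"
)
df = pd.read_sql("SELECT * FROM dbo.orders", mssql)

# 2. Load into Snowflake; auto_create_table builds the target if missing.
sf = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
success, n_chunks, n_rows, _ = write_pandas(
    sf, df, table_name="ORDERS", auto_create_table=True
)
print(f"Loaded {n_rows} rows in {n_chunks} chunk(s); success={success}")
```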

Posted 12 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the team and your role: We are currently looking for integration consultants who are passionate about integration and understand what it takes to deliver top-quality integration solutions to our clients and partners. You have an eagle eye for identifying integration challenges and the ability to translate those business challenges into the best integration solutions. You can listen, stay calm under pressure, and be the linking pin between business and IT. You have seen integrations in different shapes, sizes, and colours; you can integrate any-to-any, either on-premise or in the cloud. You advise our clients on the best integration strategies based on best practices, analysts' recommendations, and architecture patterns. Last but not least, to be successful in this position we expect you to apply your strong consultancy and integration skills. Part of your responsibilities is to support the different development teams during the entire lifecycle of an interface: from requirements gathering, analysis, design, development, and testing to handover to operational support.

We are looking for experienced Enterprise Integration Consultants to join our team. The ideal candidate has:
- Strong knowledge of integration principles and the consultancy skills to translate business requirements into IT requirements.
- Hands-on experience in the integration of SAP and non-SAP systems in A2A and B2B scenarios.
- Deep expertise and hands-on experience in Boomi as the primary integration platform.
- Working knowledge of API Management platforms (e.g., SAP API Management, Apigee, or others).
- Familiarity with event-driven architecture and distributed streaming platforms like Solace or Confluent Kafka.
- In-depth technical, functional, and architectural expertise in integrating applications using different technologies such as, but not limited to, REST, SOAP, ALE-IDocs, EDI, RFC, XI, HTTP, IDOC, JDBC, File/FTP, Mail, and JMS.
- Solid middleware knowledge and web service skills.
- Good understanding of REST APIs and Web Services.
- Extensive experience with integrating 3rd-party applications using REST-based services.
- Experience using tools like Postman and SOAPUI for service testing (see the sketch after this listing).
- Proven experience with full-lifecycle integration implementation or rollout projects.
- Demonstrated experience with deliverables planning, client-facing roles, and high-pace environments.

What is Rojo all about? Founded in 2011, Rojo Integrations has transformed from a consulting firm into a comprehensive SAP integration leader, partnering with top software vendors like SAP, Coupa, SnapLogic, and Solace. As the leading SAP integration partner and ultimate expert, we provide seamless enterprise integration and data analytics solutions, enabling real-time insights and empowering digital transformation. Trusted by global blue-chip companies such as Heineken and Siemens, we deliver tailored services to meet unique business needs. Rojo is headquartered in the Netherlands and operates globally from its offices in the Netherlands, Spain, and India. We specialize in SAP integration modernization and business processes, improving data integration and business strategies. Our 360-degree portfolio includes consultancy, software development, and managed services to streamline integration, enhance observability, and drive growth.

Requirements to succeed in this role:
- Experience using Boomi, SAP PO, SAP Cloud Integration, SnapLogic, and/or API Management.
- Quick learner, able to adapt to new tools and technologies and evaluate their applicability.
- Team player with good technical, analytical, and communication skills and a client-driven mindset.
- A bright mind and the ability to understand a complex platform.
- Ability to understand technical/engineering concepts and to learn integration product functionality and applications.
- Demonstrated user-focused technical writing ability; must be able to communicate complex technical concepts clearly and effectively.
- Strong analytical and problem-solving skills.
- Ability to work independently in a dynamic environment.
- Ability to work on multiple complex projects simultaneously.
- Strong interpersonal communication skills; communicates effectively in one-to-one and group situations.
- At least three years of previous experience in a similar role.

Additional desired skills:
- At least a Bachelor's degree in computer engineering or a related field.
- Experience with any API Management platform.
- Experience with distributed streaming platforms and event-based integration architecture such as Kafka or Solace.
- Extensive experience in integration of SAP and non-SAP systems in A2A and B2B scenarios using SAP Integration Suite or Cloud Integration (CPI).
- Experience in integration with the main SAP backend systems (SAP ERP, SAP S/4HANA, SAP S/4HANA Cloud).
- SAP PO experience in programming UDFs, Modules, Look-Ups (RFC, SOAP, JDBC), BPM, and inbound and outbound ABAP proxies.
- Extensive knowledge of Java, JavaScript and/or Groovy.
- Good understanding of CI/CD concepts.
- Fluent spoken and written English.
- Affinity and experience with integration platforms/software like Boomi, SAP Cloud Integration, or SnapLogic is desirable.

What do we offer?
- The chance to gain work experience in a dynamic and inspiring environment and launch your career.
- Plenty of growth opportunities while working in a high-energy, fun environment.
- The opportunity to work on innovative projects with colleagues who are genuinely proud of their contribution.
- Training and mentoring to support your professional development, with a yearly education budget.
- An international atmosphere and a multicultural environment (around 20 nationalities).
- A global, inclusive and diverse working climate within a world-conscious organization.
- Plus other exciting benefits specific to each region.

Rojo is committed to achieving diversity & inclusion in terms of gender, caste, race, religion, nationality, ethnic origin, sexual orientation, disability, age, pregnancy, or other status. All qualified candidates are encouraged to apply. No one fits a job description perfectly, and there is no such thing as the perfect candidate. If you don't meet all the criteria, we'd still love to hear from you. Does that spark your interest? Apply now.
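
As an illustration of the kind of service testing done in Postman or SOAPUI, here is a minimal automated smoke test of a REST endpoint using Python's requests library; the endpoint, token, and payload are hypothetical, not from any actual client system.

```python
# Minimal sketch: automated REST smoke test (the scripted equivalent of a
# Postman/SOAPUI check). Endpoint, token, and payload are placeholders.
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical integration endpoint
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

def check_order_roundtrip() -> None:
    # Create a resource, then read it back and verify the round trip.
    payload = {"orderId": "TEST-001", "quantity": 2}
    created = requests.post(f"{BASE_URL}/orders", json=payload,
                            headers=HEADERS, timeout=10)
    created.raise_for_status()  # fail fast on any 4xx/5xx status

    fetched = requests.get(f"{BASE_URL}/orders/TEST-001",
                           headers=HEADERS, timeout=10)
    fetched.raise_for_status()
    assert fetched.json()["quantity"] == payload["quantity"]
    print("Round trip OK:", fetched.status_code)

if __name__ == "__main__":
    check_order_roundtrip()
```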

Posted 12 hours ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Location: Pune, India
Experience: 5–7 Years
Employment Type: Full-Time
Work Mode: Onsite

About the Role: We are seeking a Senior Data Engineer with extensive experience in Dataiku to join our expanding data engineering team. In this role, you will be responsible for designing, developing, and optimizing data pipelines, workflows, and automation using Dataiku's capabilities. You will work closely with data scientists, analysts, and business stakeholders to enable advanced analytics and deliver scalable, production-ready data solutions.

Key Responsibilities:
- Design and Develop Data Pipelines: Build, manage, and optimize scalable data pipelines using Dataiku for data ingestion, transformation, and delivery (see the sketch after this listing).
- Leverage Dataiku Features: Utilize the full capabilities of Dataiku, including flow design, visual and code recipes, scenarios, metrics, and plugins to support end-to-end data solutions.
- Workflow Automation: Develop and automate workflows and repeatable processes to support analytics and machine learning use cases.
- ETL/ELT Development: Design and build ETL/ELT pipelines using Dataiku, SQL, Python, and related technologies.
- Model Integration Support: Collaborate with data science teams to integrate and deploy models as part of Dataiku flows.
- Data Governance & Quality: Ensure high data quality, consistency, lineage, and compliance with data governance policies.
- Performance Optimization: Monitor, troubleshoot, and tune data workflows for performance and scalability.
- Documentation & Best Practices: Maintain clear documentation of workflows, design decisions, and engineering standards within Dataiku projects.

Required Skills and Qualifications:
- 5–7 years of experience as a Data Engineer with strong hands-on expertise in Dataiku.
- Proven ability to build complex data workflows and solutions using visual and code-based components in Dataiku.
- Strong proficiency in SQL and Python for data processing, cleaning, and transformation.
- Familiarity with cloud environments (AWS, Azure, or GCP) and experience integrating cloud-based data sources with Dataiku.
- Deep understanding of data modeling, ETL/ELT concepts, and pipeline orchestration.
- Experience working with large-scale datasets and building high-performance data solutions.
- Knowledge of Git or other version control tools.

Nice to Have:
- Exposure to MLOps practices and integrating machine learning workflows in Dataiku.
- Experience with Airflow, Docker, or other orchestration/container tools.
- Familiarity with data warehouse platforms like Snowflake, BigQuery, or Redshift.
- Knowledge of data governance tools or frameworks.
- Dataiku certifications or experience in enterprise-grade Dataiku implementations.

Apply now at tisha.goyal@rsquaresoft.com
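
To give a flavour of the code-recipe work described above, here is a minimal sketch of a Dataiku Python recipe that reads an input dataset, transforms it with pandas, and writes the output; it assumes it runs inside a Dataiku project, and the dataset and column names are hypothetical.

```python
# Minimal sketch of a Dataiku Python recipe: read an input dataset,
# transform it with pandas, write the result. Runs inside a Dataiku
# project; dataset and column names are placeholders.
import dataiku
import pandas as pd

# Read the recipe's input dataset into a DataFrame.
orders = dataiku.Dataset("orders_raw").get_dataframe()

# Example transformation: clean, then aggregate to daily totals.
orders["order_date"] = pd.to_datetime(orders["order_date"])
daily = (
    orders.dropna(subset=["amount"])
          .groupby(orders["order_date"].dt.date)
          .agg(total_amount=("amount", "sum"), n_orders=("amount", "count"))
          .reset_index()
)

# Write the recipe's output dataset, inferring the schema from the DataFrame.
dataiku.Dataset("orders_daily").write_with_schema(daily)
```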

Posted 12 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies