
3557 Redshift Jobs - Page 26

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 - 0 Lacs

Gurgaon, Haryana, India

On-site

About Us
KlearNow.AI digitizes and contextualizes unstructured trade documents to create shipment visibility, business intelligence, and advanced analytics for supply chain stakeholders. It provides unparalleled transparency and insights, empowering businesses to operate efficiently. We futurize supply chains with AI/ML-powered collaborative digital platforms created by ingesting required trade documentation without the pain of complex integrations. We achieve our goals by assembling a team of the best talent. As we expand, it is crucial to maintain and strengthen our culture, which places a high value on our people and teams; our collective growth and triumphs are intrinsically linked to the success and well-being of every team member.

OUR VISION
To futurize global trade, empowering people and optimizing processes with AI-powered clarity.

YOUR MISSION
As part of a diverse, high-energy workplace, you will challenge the status quo of supply chain operations with your knack for engaging clients and sharing great stories. KlearNow is operational and a certified customs business provider in the US, Canada, UK, Spain, and the Netherlands, with plans to grow into many more markets in the near future.

Join our vibrant and forward-thinking team at KlearNow.ai as we continue to push the boundaries of AI/ML technology. We offer a competitive salary, flexible work arrangements, and ample opportunities for professional growth. We are committed to diversity, equality, and inclusion. If you are passionate about shaping the future of logistics and supply chain and making a difference, we invite you to apply.

Business Analyst - Data Science & Business Intelligence
Location: India
Employment Type: Full-time

The Role: Join our Data & Analytics team as a Business Analyst, where you'll transform data from our modern data warehouse into actionable business insights and strategic recommendations. You'll work with advanced analytics tools and techniques to create compelling reports, dashboards, and predictive models that drive data-driven decision making across the organization.

Key Responsibilities:
- Analyze data from cloud data warehouses (like Amazon Redshift) to identify business trends and opportunities
- Create interactive dashboards and reports using Business Intelligence platforms (like ThoughtSpot, Power BI)
- Develop statistical models and perform predictive analytics using tools like Python and R
- Collaborate with stakeholders to understand business requirements and translate them into analytical solutions
- Design and implement KPIs, metrics, and performance indicators for various business functions
- Conduct ad-hoc analysis to support strategic business decisions and initiatives
- Present findings and recommendations to leadership through compelling data visualizations
- Monitor and troubleshoot existing reports and dashboards to ensure accuracy and performance
- Ensure data quality and consistency in all analytical outputs and reporting
- Support business teams with self-service analytics training and best practices

Required Qualifications:
- Strong analytical and problem-solving skills with business acumen
- Experience with Business Intelligence tools and dashboard creation
- Proficiency in data analysis using programming languages (like Python, R) or advanced Excel
- Experience querying cloud data warehouses and relational databases
- Strong data visualization and storytelling capabilities
- Experience with statistical analysis and basic predictive modeling

Preferred Qualifications:
- Experience with advanced BI platforms (like ThoughtSpot) is a significant advantage
- Machine learning and advanced statistical modeling experience
- Experience with modern analytics tools and frameworks
- Advanced data visualization and presentation skills
- Experience with business process optimization and data-driven strategy
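For context, here is a minimal sketch of the kind of analysis this role describes: pulling data from Redshift (which speaks the Postgres wire protocol, so psycopg2 works) and computing a simple KPI with pandas. The cluster host, table, and column names are illustrative placeholders, not taken from the posting.

```python
import pandas as pd
import psycopg2

# Hypothetical Redshift connection details; Redshift listens on port 5439.
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="analyst", password="...",
)

query = """
    SELECT date_trunc('month', order_date) AS month,
           SUM(order_amount)               AS revenue
    FROM sales.orders
    GROUP BY 1
    ORDER BY 1;
"""
df = pd.read_sql(query, conn)

# Month-over-month growth as a simple business KPI.
df["mom_growth_pct"] = df["revenue"].pct_change() * 100
print(df.tail(12))
conn.close()
```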

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorganChase within Consumer and Community Banking, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job Responsibilities:
- Work with the Cloud Architect to identify data components and process flows
- Design and develop data ingestion processes into the Hadoop/AWS platform
- Collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Identify, analyze, and interpret trends or patterns in complex data sets
- Innovate new ways of managing, transforming, and validating data
- Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
- Apply quality assurance best practices to all work products

Required Qualifications, Capabilities, and Skills:
- Formal training or certification on software engineering concepts and 2+ years of applied experience
- Experience with Big Data technologies (Spark, Glue, Hive, Redshift, Kafka, etc.)
- Experience programming in Python/Java
- Experience performing data analysis (not data science) on AWS platforms
- Experience with data management processes on AWS is a huge plus
- Experience implementing complex ETL transformations on big data platforms, including NoSQL databases (MongoDB, DynamoDB, Cassandra)
- Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
- Strong development discipline and adherence to best practices and standards
- Demonstrated independent problem-solving skills and the ability to develop solutions to complex analytical/data-driven problems
- Experience working in development teams using agile techniques
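As a hedged illustration of the ingestion pattern this posting describes (not JPMorganChase's actual pipeline), the sketch below reads raw JSON from S3 with PySpark, applies basic validation, and writes partitioned Parquet back out. Bucket paths and column names are assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-raw-events").getOrCreate()

# Hypothetical raw landing zone on S3.
raw = spark.read.json("s3://example-raw-bucket/events/")

clean = (
    raw.filter(F.col("event_id").isNotNull())            # basic completeness rule
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("dt", F.to_date("event_ts"))          # partition key
       .dropDuplicates(["event_id"])                     # consistency rule
)

# Partitioned Parquet keeps downstream scans cheap.
(clean.write
      .mode("append")
      .partitionBy("dt")
      .parquet("s3://example-curated-bucket/events/"))
```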

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities / Key Attributes:
- Experience implementing and delivering data solutions and pipelines on the AWS cloud platform
- Design, implement, and maintain the data architecture for all AWS data services
- A strong understanding of data modelling, data structures, databases (Redshift), and ETL processes
- Work with stakeholders to identify business needs and requirements for data-related projects
- Strong SQL and/or Python or PySpark knowledge
- Create data models that can be used to extract information from various sources and store it in a usable format
- Optimize data models for performance and efficiency
- Write SQL queries to support data analysis and reporting
- Monitor and troubleshoot data pipelines
- Collaborate with software engineers to design and implement data-driven features
- Perform root cause analysis on data issues
- Maintain documentation of the data architecture and ETL processes
- Identify opportunities to improve performance through better database structure or indexing methods
- Maintain existing applications by updating existing code or adding new features to meet new requirements
- Design and implement security measures to protect data from unauthorized access or misuse
- Recommend infrastructure changes to improve capacity or performance
- Experience in the process industry

Mandatory skill sets: Data Modelling, AWS, ETL
Preferred skill sets: Data Modelling, AWS, ETL
Years of experience required: 4-8 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: ETL Development
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:
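To make the Redshift modelling point above concrete, here is a minimal sketch, assuming a psycopg2 connection, of the kind of physical design choice the posting alludes to: distribution and sort keys picked to speed up joins and date-range scans. The table, keys, and connection details are hypothetical.

```python
import psycopg2

# Redshift-specific DDL: DISTKEY co-locates rows that join on customer_id,
# SORTKEY lets the engine skip blocks on date-range filters.
ddl = """
CREATE TABLE IF NOT EXISTS dw.fact_shipments (
    shipment_id   BIGINT        NOT NULL,
    customer_id   BIGINT        NOT NULL,
    ship_date     DATE          NOT NULL,
    amount        DECIMAL(12,2)
)
DISTKEY (customer_id)
SORTKEY (ship_date);
"""

conn = psycopg2.connect(host="example-cluster.region.redshift.amazonaws.com",
                        port=5439, dbname="dw", user="etl", password="...")
with conn, conn.cursor() as cur:
    cur.execute(ddl)
```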

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

JOB_POSTING-3-72576

Job Description
Role Title: Analyst, Analytics - Data Quality Developer (L08)

Company Overview:
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work For by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and among Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer flexibility and choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on advancing diverse talent into leadership roles.

Organizational Overview:
Our Analytics organization comprises data analysts who focus on enabling strategies to enhance customer and partner experience and optimize business performance through data management and the development of full-stack descriptive-to-prescriptive analytics solutions using cutting-edge technologies, thereby enabling business growth.

Role Summary/Purpose:
The Analyst, Analytics - Data Quality Developer (Individual Contributor) role is located in the India Analytics Hub (IAH) as part of Synchrony’s enterprise Data Office. This role is responsible for the proactive design, implementation, execution, and monitoring of Data Quality process capabilities within Synchrony’s public and private cloud and on-prem environments within the Chief Data Office. The Data Quality Developer - Analyst will work within the IT organization to support and participate in build and run activities and environments (e.g. DevOps) for Data Quality.

Key Responsibilities:
- Monitor and maintain Data Quality and Data Issue Management operating level agreements in support of data quality rule execution and reporting
- Assist in performing root cause analysis for data quality issues and data usage challenges, particularly for the workload migration to the public cloud
- Recommend, design, implement and refine/remediate data quality specifications within Synchrony’s approved Data Quality platforms
- Participate in the solution design of data quality and data issue management technical and procedural solutions, including metric reporting
- Work closely with Technology teams and key stakeholders to ensure data quality issues are prioritized, analyzed and addressed
- Regularly communicate the status of data quality issues and progress to key stakeholders
- Participate in the planning and execution of agile release cycles and iterations

Qualifications/Requirements:
- Minimum of 1 year’s experience in data quality management, including implementing data quality rules, data profiling and root cause analysis for data issues, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure
- Minimum of 1 year’s experience with data quality or data integration tools such as Ab Initio, Informatica, Collibra, Stonebranch or Tableau, gained through hands-on experience or projects
- Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail

Desired Characteristics:
- Broad understanding of banking, credit card, payment solutions, collections, marketing, risk, and regulatory & compliance
- Experience using data governance and data quality tools such as Collibra, Ab Initio Express>IT, and Ab Initio MetaHub
- Proficient in writing/understanding SQL
- Experience querying/analyzing data in cloud-based environments (e.g., AWS, Redshift)
- AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty
- Intermediate to advanced MS Office Suite skills, including PowerPoint, Excel, Access, and Visio
- Strong relationship management and influencing skills to build enduring and productive alliances across matrix organizations
- Demonstrated success in managing multiple deliverables concurrently, often within aggressive timeframes; ability to cope under time pressure
- Experience partnering with a diverse team composed of staff and consultants located in multiple locations and time zones

Eligibility Criteria:
Bachelor’s degree, preferably in Engineering or Computer Science, with more than 1 year’s hands-on Data Management experience, or in lieu of a degree, more than 3 years’ experience.

Work Timings:
This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and requires the incumbent to be available between 06:00 AM and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams; the remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs; please discuss with the hiring manager for details.

For Internal Applicants:
- Understand the criteria or mandatory skills required for the role before applying
- Inform your manager and HRM before applying for any role on Workday
- Ensure that your professional profile is updated (fields such as education, prior experience, other skills) and upload your updated resume (Word or PDF format)
- Must not be on any corrective action plan (Formal/Final Formal) or PIP
- L4 to L7 employees who have completed 12 months in the organization and 12 months in their current role and level are eligible
- L8+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible

Grade/Level: 08
Job Family Group: Information Technology
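As an illustration of the data-quality rule work this role automates, the sketch below profiles a dataset for nulls, duplicates, and out-of-range values and emits a pass/fail result per rule. The column names, thresholds, and input file are assumptions, not Synchrony's actual rules.

```python
import pandas as pd

def run_quality_rules(df: pd.DataFrame) -> list[dict]:
    """Evaluate simple declarative data-quality rules against a frame."""
    rules = [
        ("account_id not null", df["account_id"].notna().all()),
        ("account_id unique",   not df["account_id"].duplicated().any()),
        ("balance >= 0",        (df["balance"] >= 0).all()),
        ("open_date parseable",
         pd.to_datetime(df["open_date"], errors="coerce").notna().all()),
    ]
    return [{"rule": name, "passed": bool(ok)} for name, ok in rules]

df = pd.read_csv("accounts_extract.csv")   # hypothetical extract
for result in run_quality_rules(df):
    print(result)                          # feed failures into issue management
```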

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Company Description
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Job Description
About the Role: Nielsen is seeking an organized, detail-oriented team player to join the ITAM Back Office Engineering team in the role of Software Engineer. Nielsen’s Audience Measurement Engineering platforms support the measurement of television viewing in more than 30 countries around the world. Ideal candidates will have exceptional skills in programming, testing, debugging and problem solving, as well as effective communication and writing skills.

Responsibilities:
- System Deployment: Conceive, design and build new features in the existing backend processing pipelines.
- CI/CD Implementation: Design and implement CI/CD pipelines for automated build, test, and deployment processes. Ensure continuous integration and delivery of features, improvements, and bug fixes.
- Code Quality and Best Practices: Enforce coding standards, best practices, and design principles. Conduct code reviews and provide constructive feedback to maintain high code quality.
- Performance Optimization: Identify and address performance bottlenecks in reading, processing and writing data to the backend data stores.
- Mentorship and Collaboration: Mentor junior engineers, providing guidance on technical aspects and best practices. Collaborate with cross-functional teams to ensure a cohesive and unified approach to software development.
- Security and Compliance: Implement security best practices for all tiers of the system. Ensure compliance with industry standards and regulations related to AWS platform security.

Key Skills:
- Bachelor's or Master’s degree in Computer Science, Software Engineering, or a related field
- Proven experience (minimum 8 years) in high-volume data processing development using ETL tools such as AWS Glue or PySpark, plus Python, SQL and databases such as Postgres
- Experience in development on an AWS platform
- Strong understanding of CI/CD principles and tools; GitLab a plus
- Excellent problem-solving and debugging skills
- Strong communication and collaboration skills, with the ability to communicate complex technical concepts and align the organization on decisions
- Sound problem-solving skills with the ability to quickly process complex information and present it clearly and simply
- Uses team collaboration to create innovative solutions efficiently

Other Desirable Skills:
- Knowledge of networking principles and security best practices
- AWS certifications
- Experience with data warehouses, ETL, and/or data lakes very desirable
- Experience with Redshift, Airflow, Python, Lambda, Prometheus, Grafana, and OpsGenie a bonus
- Exposure to the Google Cloud Platform (GCP)

Additional Information
Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.
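One plausible backend step of the kind this team owns, sketched under assumptions (schema, paths, and connection details are placeholders, not Nielsen's): aggregate viewing events with PySpark and write the rollup to Postgres over JDBC.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("audience-rollup").getOrCreate()

# Hypothetical event store on S3.
events = spark.read.parquet("s3://example-bucket/viewing-events/")

# Daily minutes viewed per panel member.
daily = (events.groupBy("panel_id", F.to_date("viewed_at").alias("view_date"))
               .agg(F.sum("minutes_viewed").alias("total_minutes")))

# Land the rollup in Postgres via the standard Spark JDBC sink.
(daily.write.format("jdbc")
      .option("url", "jdbc:postgresql://example-host:5432/measurement")
      .option("dbtable", "rollups.daily_viewing")
      .option("user", "etl").option("password", "...")
      .mode("append")
      .save())
```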

Posted 2 weeks ago

Apply

2.0 - 4.0 years

9 Lacs

India

On-site

Data Engineer
Experience: 2-4 years
Location: Kochi, Kerala (Work From Office)

Key Responsibilities:
- Build and manage data lakes and data warehouses using services like Amazon S3, Redshift, and Athena
- Design and build secure, scalable, and efficient ETL/ELT pipelines on AWS using services like Glue, Lambda, and Step Functions
- Work on SAP Datasphere to build and maintain Spaces, Data Builders, Views, and Consumption Layers
- Support data integration between AWS, Datasphere, and various source systems (SAP S/4HANA, non-SAP apps, flat files, etc.)
- Develop and maintain scalable data models and optimize queries for performance
- Monitor and optimize data workflows to ensure reliability, performance, and cost-efficiency
- Collaborate with Data Analysts and BI teams to provide clean, validated, and well-documented datasets
- Monitor, troubleshoot, and enhance data workflows and pipelines
- Ensure data quality, integrity, and governance policies are met

Required Skills:
- Strong SQL skills and experience with relational databases like MySQL or SQL Server
- Proficiency in Python or Scala for data transformation and scripting
- Familiarity with cloud platforms like AWS (S3, Redshift, Glue), Datasphere, and Azure

Good-to-Have Skills:
- AWS Certification - AWS Certified Data Analytics
- Exposure to modern data stack tools like Snowflake
- Experience in cloud-based projects and working in an Agile environment
- Understanding of data governance, security best practices, and compliance standards

Job Types: Full-time, Permanent
Pay: Up to ₹960,000.00 per year
Application Question(s): Willing to take up Work From Office mode in Kochi location?
Experience (required):
- Data Engineer / ETL Developer: 2 years
- AWS: 2 years
- SQL and (Python OR Scala): 2 years
- Datasphere OR "SAP BW" OR "SAP S/4HANA": 2 years
- AWS (S3, Redshift, Glue), Datasphere, Azure: 2 years
- PostgreSQL and MySQL or SQL Server: 2 years
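Since the posting pairs S3 data lakes with Athena, here is a minimal sketch of querying a lake table through boto3's Athena client. The database, table, region, and results bucket are illustrative; in production the polling loop would typically be handled by Step Functions or a callback.

```python
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

qid = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM shipments GROUP BY status",
    QueryExecutionContext={"Database": "lake_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(
        QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows:   # first row is the header
        print([col.get("VarCharValue") for col in row["Data"]])
```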

Posted 2 weeks ago

Apply

8.0 years

20 Lacs

Hyderābād

On-site

Job Title: Senior Database Administrator
Job Type: Full Time
Experience Required: 8+ Years

Job Description:
We are seeking an experienced and strategic Senior Database Administrator (DBA) with deep expertise in SQL/MySQL, AWS Redshift, and infrastructure automation using Terraform. This role requires someone who can design scalable data solutions, lead database optimization efforts, and support modern data platforms in a cloud-native environment.

Key Responsibilities:
- Design, deploy, and manage highly available, scalable databases, with a strong emphasis on SQL, MySQL, and AWS Redshift
- Implement and maintain infrastructure as code using Terraform for automating database and AWS infrastructure provisioning
- Optimize performance and reliability across relational and NoSQL databases including Redshift, MySQL, SQL Server, DynamoDB, and Neo4j
- Lead data platform integration efforts with applications developed in Node.js and other backend technologies
- Manage real-time and batch data pipelines using tools like Qlik Replicate and Kafka
- Architect and maintain workflows using a range of AWS services, such as Kinesis, Lambda, Glue, S3, Step Functions, SNS, SQS, EventBridge, EC2, CloudFormation, and API Gateway
- Establish robust observability using tools like New Relic for database monitoring and performance

Required Skills and Qualifications:
- 8+ years of professional experience in database administration and data engineering
- Extensive hands-on experience with SQL and MySQL, and managing AWS Redshift in production environments
- Strong command of Terraform for infrastructure automation and provisioning
- Proficiency in PowerShell and Python for scripting and automation
- Solid experience with Node.js or a similar programming language for integration
- Working knowledge of Neo4j, DynamoDB, and SQL Server
- Experience with Qlik Replicate and Kafka for data replication and streaming
- Deep understanding of cloud architecture, event-driven systems, and serverless AWS environments
- Proficiency with monitoring and observability tools such as New Relic
- Familiarity with Okta for identity and access management
- Excellent problem-solving and communication skills; ability to lead initiatives and mentor junior team members

Education:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field is required

Job Type: Full-time
Pay: Up to ₹2,000,000.00 per year
Schedule: Day shift
Application Question(s):
- What is your expected CTC?
- Do you have 6+ years of hands-on experience with SQL, MySQL, and AWS?
Work Location: In person
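Terraform would own the cluster definition itself in this role; as a complement, here is a small Python sketch of the operational side: checking Redshift cluster health and taking a manual snapshot before a risky change, using boto3. The cluster identifier and region are hypothetical.

```python
import datetime
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

# Inspect the cluster's current state before any maintenance work.
cluster = redshift.describe_clusters(
    ClusterIdentifier="example-prod-cluster")["Clusters"][0]
print(cluster["ClusterStatus"], cluster["NodeType"], cluster["NumberOfNodes"])

# Manual snapshot ahead of a schema change; automated snapshots run separately.
stamp = datetime.datetime.utcnow().strftime("%Y%m%d%H%M")
redshift.create_cluster_snapshot(
    SnapshotIdentifier=f"pre-change-{stamp}",
    ClusterIdentifier="example-prod-cluster",
)
```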

Posted 2 weeks ago

Apply

1.0 - 3.0 years

8 - 9 Lacs

Hyderābād

Remote

Working with Us
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible.

Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us.

Job Description Summary
The Software Engineer II, Aera DI role is accountable for developing data solutions and operations support of the enterprise data lake. The role will be accountable for developing the pipelines for data enablement projects, production/application support and enhancements, and supporting data operations activities. Additional responsibilities include data analysis, data operations processes and tools, data cataloguing, and developing data SME skills in the Global Product Development and Supply - Data and Analytics Enablement organization.

Key Responsibilities:
- Deliver high-quality data products and analytics-ready data solutions
- Develop and maintain data models to support our reporting and analysis needs
- Develop ad-hoc analytic solutions, from solution design to testing, deployment, and full lifecycle management
- Collaborate with data architects, data analysts and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements
- Ensure data quality and integrity through data validation and testing
- Proficiency in Python/Node.js along with UI technologies like React.js, plus Spark, SQL, AWS Redshift, AWS S3, Glue/Glue Studio, Athena, IAM and other native AWS services, with familiarity with Domino/data lake principles

Required:
- 1-3 years of experience in the information technology field developing software applications
- Working experience with Aera Decision Intelligence is preferred
- Good understanding of cloud technologies, preferably AWS, and related services for delivering and supporting data and analytics solutions/data lakes
- Proficiency in Java/ReactJS/NodeJS, SQL, Python, and Spark; Java knowledge can be an added advantage for exploring Java-based DI tools

Ideal Candidates Would Also Have:
- Prior experience in global life sciences, especially in the GPS functional area
- Experience working internationally with a globally dispersed team, including diverse stakeholders and management of offshore technical development teams
- Strong communication and presentation skills

Other Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Computer Engineering or equivalent is preferred

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers
With a single vision as inspiring as Transforming patients' lives through science™, every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol
BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters.

BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/

Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
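Given the Redshift-plus-serverless stack listed above, one pattern worth knowing is the Redshift Data API, which runs SQL without holding database connections open. This is a hedged sketch with assumed cluster, database, and table names, not BMS's actual pipeline.

```python
import boto3

client = boto3.client("redshift-data")

def handler(event, context):
    """Hypothetical Lambda handler: kick off an asynchronous Redshift query."""
    resp = client.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="datalake",
        DbUser="etl_user",
        Sql="SELECT batch_id, COUNT(*) FROM staging.loads GROUP BY batch_id",
    )
    # The statement runs asynchronously; poll describe_statement or use
    # EventBridge to react when it completes.
    return {"statement_id": resp["Id"]}
```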

Posted 2 weeks ago

Apply

2.0 years

1 - 10 Lacs

Hyderābād

On-site

You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorganChase within Consumer and Community Banking, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job Responsibilities:
- Work with the Cloud Architect to identify data components and process flows
- Design and develop data ingestion processes into the Hadoop/AWS platform
- Collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Identify, analyze, and interpret trends or patterns in complex data sets
- Innovate new ways of managing, transforming, and validating data
- Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
- Apply quality assurance best practices to all work products

Required Qualifications, Capabilities, and Skills:
- Formal training or certification on software engineering concepts and 2+ years of applied experience
- Experience with Big Data technologies (Spark, Glue, Hive, Redshift, Kafka, etc.)
- Experience programming in Python/Java
- Experience performing data analysis (not data science) on AWS platforms
- Experience with data management processes on AWS is a huge plus
- Experience implementing complex ETL transformations on big data platforms, including NoSQL databases (MongoDB, DynamoDB, Cassandra)
- Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
- Strong development discipline and adherence to best practices and standards
- Demonstrated independent problem-solving skills and the ability to develop solutions to complex analytical/data-driven problems
- Experience working in development teams using agile techniques
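This listing also names Kafka among its Big Data technologies; complementing the PySpark sketch shown with the earlier posting, here is an illustrative kafka-python consumer that batches events and lands them in S3 as JSON lines. Topic, brokers, and bucket are placeholders.

```python
import json
import boto3
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "raw-transactions",                       # hypothetical topic
    bootstrap_servers=["broker1:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")

batch = []
for msg in consumer:
    batch.append(msg.value)
    if len(batch) >= 1000:                    # flush every 1,000 events
        body = "\n".join(json.dumps(r) for r in batch)
        key = f"landing/transactions/offset={msg.offset}.jsonl"
        s3.put_object(Bucket="example-landing-bucket", Key=key, Body=body)
        batch = []
```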

Posted 2 weeks ago

Apply

8.0 years

30 - 38 Lacs

Gurgaon

Remote

Role: AWS Data Engineer
Location: Gurugram
Mode: Hybrid
Type: Permanent

Job Description:
We are seeking a talented and motivated Data Engineer with the requisite years of hands-on experience to join our growing data team. The ideal candidate will have experience working with large datasets, building data pipelines, and utilizing AWS public cloud services to support the design, development, and maintenance of scalable data architectures. This is an excellent opportunity for individuals who are passionate about data engineering and cloud technologies and want to make an impact in a dynamic and innovative environment.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and optimize end-to-end data pipelines for extracting, transforming, and loading (ETL) large volumes of data from diverse sources into data warehouses or lakes.
- Cloud Infrastructure Management: Implement and manage data processing and storage solutions in AWS using services like S3, Redshift, Lambda, Glue, Kinesis, and others.
- Data Modeling: Collaborate with data scientists, analysts, and business stakeholders to define data requirements and design optimal data models for reporting and analysis.
- Performance Tuning & Optimization: Identify bottlenecks and optimize query performance, pipeline processes, and cloud resources to ensure cost-effective and scalable data workflows.
- Automation & Scripting: Develop automated data workflows and scripts to improve operational efficiency using Python, SQL, or other scripting languages.
- Collaboration & Documentation: Work closely with data analysts, data scientists, and other engineering teams to ensure data availability, integrity, and quality. Document processes, architectures, and solutions clearly.
- Data Quality & Governance: Ensure the accuracy, consistency, and completeness of data. Implement and maintain data governance policies to ensure compliance and security standards are met.
- Troubleshooting & Support: Provide ongoing support for data pipelines and troubleshoot issues related to data integration, performance, and system reliability.

Qualifications

Essential Skills:
- Experience: 8+ years of professional experience as a Data Engineer, with a strong background in building and optimizing data pipelines and working with large-scale datasets.
- AWS Experience: Hands-on experience with AWS cloud services, particularly S3, Lambda, Glue, Redshift, RDS, and EC2.
- ETL Processes: Strong understanding of ETL concepts, tools, and frameworks; experience with data integration, cleansing, and transformation.
- Programming Languages: Proficiency in Python, SQL, and other scripting languages (e.g., Bash, Scala, Java).
- Data Warehousing: Experience with relational and non-relational databases, including data warehousing solutions like AWS Redshift, Snowflake, or similar platforms.
- Data Modeling: Experience in designing data models, schema design, and data architecture for analytical systems.
- Version Control & CI/CD: Familiarity with version control tools (e.g., Git) and CI/CD pipelines.
- Problem-Solving: Strong troubleshooting skills, with an ability to optimize performance and resolve technical issues across the data pipeline.

Desirable Skills:
- Big Data Technologies: Experience with Hadoop, Spark, or other big data technologies.
- Containerization & Orchestration: Knowledge of Docker, Kubernetes, or similar containerization/orchestration technologies.
- Data Security: Experience implementing security best practices in the cloud and managing data privacy requirements.
- Data Streaming: Familiarity with data streaming technologies such as AWS Kinesis or Apache Kafka.
- Business Intelligence Tools: Experience with BI tools (Tableau, QuickSight) for visualization and reporting.
- Agile Methodology: Familiarity with Agile development practices and tools (Jira, Trello, etc.).

Job Type: Permanent
Pay: ₹3,000,000.00 - ₹3,800,000.00 per year
Benefits: Work from home
Schedule: Day shift, Monday to Friday
Experience (required):
- AWS Glue Catalog: 4 years
- Data Engineering: 5 years
- AWS CDK, CloudFormation, Lambda, Step Functions: 5 years
- AWS Elastic MapReduce (EMR): 4 years
Work Location: In person
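For the streaming side of this stack, here is a minimal sketch of publishing records to a Kinesis data stream with boto3, partitioned by a device key so related events stay ordered within a shard. Stream name and field names are assumptions.

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish(event: dict) -> None:
    """Push one event to a hypothetical Kinesis stream."""
    kinesis.put_record(
        StreamName="example-events-stream",
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["device_id"]),  # keeps per-device ordering
    )

publish({"device_id": 42, "metric": "temp_c", "value": 21.5})
```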

Posted 2 weeks ago

Apply

2.0 years

10 Lacs

Gurgaon

On-site

Gurgaon, India

We are seeking an Associate Consultant to join our India team based in Gurgaon. This role at Viscadia offers a unique opportunity to gain hands-on experience in the healthcare industry, with comprehensive training in core consulting skills such as critical thinking, market analysis, and executive communication. Through project work and direct mentorship, you will develop a deep understanding of healthcare business dynamics and build a strong foundation for a successful consulting career.

ROLES AND RESPONSIBILITIES

Technical Responsibilities:
- Design and build full-stack forecasting and simulation platforms using modern web technologies (e.g., React, Node.js, Python) hosted on AWS infrastructure (e.g., Lambda, EC2, S3, RDS, API Gateway)
- Automate data pipelines and model workflows using Python for data preprocessing, time-series modeling (e.g., ARIMA, Exponential Smoothing), and backend services (see the forecasting sketch after this section)
- Develop and enhance product positioning, messaging, and resources that support the differentiation of Viscadia from its competitors
- Conduct research and focus groups to elicit key insights that augment positioning and messaging
- Replace legacy Excel/VBA tools with scalable, cloud-native applications, integrating dynamic reporting features and user controls via a web UI
- Use SQL and cloud databases (e.g., AWS RDS, Redshift) to query and transform large datasets as inputs to models and dashboards
- Develop interactive web dashboards using frameworks like React + D3.js, or embed tools like Power BI/Tableau into web portals to communicate insights effectively
- Implement secure, modular APIs and microservices to support modularity, scalability, and seamless data exchange across platforms
- Ensure cost-effective and reliable deployment of solutions via AWS services, CI/CD pipelines, and infrastructure-as-code (e.g., CloudFormation, Terraform)

Business Responsibilities:
- Support the development and enhancement of forecasting and analytics platforms tailored to the needs of pharmaceutical clients across various therapeutic areas
- Build an in-depth understanding of pharma forecasting concepts, disease areas, treatment landscapes, and market dynamics to contextualize forecasting models and inform platform features
- Partner with cross-functional teams to ensure forecast deliverables align with client objectives, timelines, and decision-making needs
- Contribute to a culture of knowledge sharing and continuous improvement by mentoring junior team members and helping codify best practices in forecasting and business analytics
- Grow into a client-facing role, combining an understanding of commercial strategy with forecasting expertise to lead engagements and drive value for clients

QUALIFICATIONS
- Bachelor’s degree (B.Tech/B.E.) from a premier engineering institute, preferably in Computer Science, Information Technology, Electrical Engineering, or related disciplines
- 2+ years of experience in full-stack development, with a strong focus on designing, developing, and maintaining AWS-based applications and services

SKILLS & TECHNICAL PROFICIENCIES

Technical Skills:
- Proficient in Python, with practical experience using libraries such as pandas, NumPy, matplotlib/seaborn, and statsmodels for data analysis and statistical modeling
- Strong command of SQL for data querying, transformation, and seamless integration with backend systems
- Hands-on experience in designing and maintaining ETL/ELT data pipelines, ensuring efficient and scalable data workflows
- Solid understanding and applied experience with cloud platforms, particularly AWS; working familiarity with Azure and Google Cloud Platform (GCP)
- Full-stack web development expertise, including building and deploying modern web applications, web hosting, and API integration
- Proficient in Microsoft Excel and PowerPoint, with advanced skills in data visualization and delivering professional presentations

Soft Skills:
- Excellent verbal and written communication skills, with the ability to effectively engage both technical and non-technical stakeholders
- Strong analytical thinking and problem-solving abilities, with a structured and solution-oriented mindset
- Demonstrated ability to work independently as well as collaboratively within cross-functional teams
- Adaptable and proactive, with a willingness to thrive in a dynamic, fast-growing environment
- Genuine passion for consulting, with a focus on delivering tangible business value for clients

Domain Expertise (Good to Have):
- Strong understanding of pharmaceutical commercial models, including treatment journeys, market dynamics, and key therapeutic areas
- Experience working with and interpreting industry-standard datasets such as IQVIA, Symphony Health, or similar secondary data sources
- Familiarity with product lifecycle management, market access considerations, and sales performance tracking metrics used across the pharmaceutical value chain
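The posting names ARIMA and Exponential Smoothing explicitly; below is a small statsmodels sketch of a monthly demand forecast. The series here is synthetic for illustration; real inputs would come from the SQL/Redshift layer described above.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic 36-month history: linear trend plus noise.
idx = pd.date_range("2021-01-01", periods=36, freq="MS")
demand = pd.Series(100 + np.arange(36) * 2 + np.random.normal(0, 5, 36),
                   index=idx)

# Additive trend and seasonality, 12-month seasonal cycle.
model = ExponentialSmoothing(demand, trend="add", seasonal="add",
                             seasonal_periods=12).fit()

forecast = model.forecast(12)   # next 12 months
print(forecast.round(1))
```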

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics - Senior - AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The Opportunity
We’re looking for Senior Cloud Experts with design experience in Big Data cloud implementations.

Your Key Responsibilities
- Experience with Kafka, Flume and the AWS tool stack, such as Redshift and Kinesis, is preferred
- Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
- Experience in PySpark/Spark/Scala
- Experience using software version control tools (Git, Jenkins, Apache Subversion)
- AWS certifications or other related professional technical certifications
- Experience with cloud or on-premise middleware and other enterprise integration technologies
- Experience in writing MapReduce and/or Spark jobs
- Demonstrated strength in architecting data warehouse solutions and integrating technical components
- Good analytical skills with excellent knowledge of SQL
- 4+ years of work experience with very large data warehousing environments
- Excellent communication skills, both written and verbal, formal and informal
- 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools
- 4+ years of experience with data modelling concepts
- 3+ years of Python and/or Java development experience
- 3+ years’ experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive)
- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Ability to multi-task under pressure and work independently with minimal supervision
- Must be a team player who enjoys working in a cooperative and collaborative team environment
- Adaptable to new technologies and standards

Skills and Attributes for Success
- Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates
- Strong communication, presentation and team-building skills, and experience producing high-quality reports, papers, and presentations
- Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint
- Exposure to tools like Tableau, Alteryx, etc.
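As a hedged example of the Spark work this role references, the sketch below joins a large fact table against a dimension on S3 and writes a warehouse-ready aggregate; it could run on the EMR or Glue environments the posting lists. Paths and columns are illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades-by-sector").getOrCreate()

# Hypothetical fact and dimension tables on S3.
trades = spark.read.parquet("s3://example-dw/facts/trades/")
firms = spark.read.parquet("s3://example-dw/dims/firms/")

result = (trades.join(firms, "firm_id")
                .groupBy("sector", "trade_date")
                .agg(F.sum("notional").alias("total_notional"),
                     F.count("*").alias("trade_count")))

result.write.mode("overwrite").parquet("s3://example-dw/marts/sector_daily/")
```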

Posted 2 weeks ago

Apply

3.0 years

6 - 9 Lacs

Ahmedabad

Remote

Job Title: Power BI Developer
Location: Ahmedabad, Gujarat (Preferred)
Experience Required: 3+ Years
Employment Type: Full-time (Immediate Joiners Preferred)

About IGNEK:
At IGNEK, we specialize in remote staff augmentation and custom development solutions, offering expert teams in technologies like Liferay, AEM, Java, React, and Node.js. We help global clients meet their project goals efficiently by delivering innovative and scalable digital solutions.

Job Summary:
We’re looking for an experienced Power BI Developer to join our analytics team at IGNEK. The ideal candidate will be responsible for transforming complex data into visually impactful dashboards and providing actionable insights for data-driven decision-making.

Key Responsibilities:
- Develop, maintain, and optimize interactive Power BI dashboards and reports.
- Write complex SQL queries to extract, clean, and join data from multiple sources, including data warehouses and APIs.
- Understand business requirements and collaborate with cross-functional teams to deliver scalable BI solutions.
- Ensure data accuracy and integrity across all reporting outputs.
- Create robust data models and DAX measures within Power BI.
- Work with data engineers and analysts to streamline data pipelines.
- Maintain documentation for all dashboards, definitions, and processes.
- (Optional) Use Python for automation, data manipulation, or API integration.

Requirements:
- 3+ years of experience in BI or analytics roles.
- Strong expertise in Power BI, including DAX, Power Query, and data modeling.
- Advanced SQL skills and experience with relational databases or cloud data warehouses (e.g., SQL Server, Redshift, Snowflake).
- Understanding of ETL processes and data quality management.
- Ability to communicate data-driven insights effectively to stakeholders.
- Bonus: working knowledge of Python for scripting or automation.

Preferred Qualifications:
- Hands-on experience with Power BI Service, Power BI Gateway, or Azure.
- Exposure to agile methodologies and collaborative development teams.
- Familiarity with key business metrics across functions like sales, operations, or finance.

How to Apply:
Please send your resume and a cover letter detailing your experience to

Job Type: Full-time
Pay: ₹600,000.00 - ₹900,000.00 per year
Benefits: Flexible schedule, leave encashment, Provident Fund, work from home
Work Location: In person
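The posting lists Python as optional for automation and data manipulation; here is an illustrative refresh-prep step under assumed names: pull a cleaned extract with SQL and save it where a Power BI dataset could pick it up. The DSN, schema, and file path are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical SQL Server connection via a pyodbc DSN.
engine = create_engine("mssql+pyodbc://user:pass@example-dsn")

df = pd.read_sql(
    """
    SELECT region, product, order_date, SUM(amount) AS sales
    FROM dbo.orders
    GROUP BY region, product, order_date
    """,
    engine,
)

# Light cleanup that Power Query would otherwise repeat on every refresh.
df["order_date"] = pd.to_datetime(df["order_date"])
df.to_csv("powerbi_extract/sales_summary.csv", index=False)
```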

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 5-8 Years

Job Description:
- Design, develop, and maintain dashboards and reports using Sigma Computing.
- Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
- Write and optimize SQL queries that run directly on cloud data warehouses.
- Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
- Apply row-level security and user-level filters to ensure proper data access controls (see the sketch after this list).
- Partner with data engineering teams to validate data accuracy and ensure model alignment.
- Troubleshoot performance or data issues in reports and dashboards.
- Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
- 5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
- Hands-on experience with Sigma Computing is highly preferred.
- Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Experience with data modeling concepts and modern data stacks.
- Ability to translate business requirements into technical solutions.
- Familiarity with data governance, security, and role-based access controls.
- Excellent communication and stakeholder management skills.
- Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
- Familiarity with dbt, Fivetran, or other ELT/ETL tools.
- Exposure to Agile or Scrum methodologies.
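Row-level security of the kind mentioned above is usually enforced in the warehouse layer rather than the BI tool alone. This is a generic sketch, not Sigma's own mechanism: a user-scoped query built with bound parameters instead of string concatenation, against an assumed table.

```python
import psycopg2

def fetch_orders_for_user(conn, user_region: str):
    """Return only the rows this user's region entitles them to see."""
    sql = """
        SELECT order_id, region, amount
        FROM analytics.orders
        WHERE region = %s   -- user-level filter applied server-side
    """
    with conn.cursor() as cur:
        cur.execute(sql, (user_region,))  # bound parameter, no SQL injection
        return cur.fetchall()
```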

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics – Senior – AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers, asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Senior Cloud Experts with design experience in big data cloud implementations.

Your Key Responsibilities
Experience with Kafka, Flume and the AWS tool stack, such as Redshift and Kinesis, is preferred. Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc. (a short PySpark sketch follows this listing). Experience in PySpark/Spark/Scala. Experience using software version control tools (Git, Jenkins, Apache Subversion). AWS certifications or other related professional technical certifications. Experience with cloud or on-premises middleware and other enterprise integration technologies. Experience in writing MapReduce and/or Spark jobs. Demonstrated strength in architecting data warehouse solutions and integrating technical components. Good analytical skills with excellent knowledge of SQL. 4+ years of work experience with very large data warehousing environments. 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools. 4+ years of experience with data modelling concepts. 3+ years of Python and/or Java development experience. 3+ years’ experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive). Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent written and verbal communication skills, formal and informal. Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment, and who is adaptable to new technologies and standards.

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills, with experience in producing high-quality reports, papers and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Tableau, Alteryx, etc.

To qualify for the role, you must have
BE/BTech/MCA/MBA, a minimum of 4 years’ hands-on experience in one or more key areas, and a minimum of 7 years’ industry experience.

What We Look For
People with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
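The posting above centers on an S3-plus-Spark-plus-Redshift stack, so as referenced in the responsibilities, here is a minimal, hypothetical PySpark batch job of that shape. The bucket paths, dataset, and column names are invented for illustration; they are not from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical S3 locations -- placeholders, not real buckets.
RAW_PATH = "s3://example-raw-bucket/orders/"
CURATED_PATH = "s3://example-curated-bucket/orders_daily/"

spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Read raw JSON events landed in S3 (e.g., delivered by Kinesis Firehose).
orders = spark.read.json(RAW_PATH)

# Light cleansing plus a daily aggregate -- a typical curation step.
daily = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Date-partitioned Parquet output, ready to be cataloged in Glue and
# queried from Redshift Spectrum.
daily.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```

Partitioning the curated output by date is a common convention because Glue-cataloged tables and Redshift Spectrum can then prune partitions at query time.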

Posted 2 weeks ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Description: Data Engineer (Entry-Level)

About Amvion Labs
Amvion Labs is on a mission to help organizations unlock the full potential of their data using cutting-edge cloud and analytics technologies. As we expand our Data Science and Engineering team, we’re looking for dynamic young professionals ready to learn, innovate, and contribute to impactful projects across industries. This is your opportunity to work with Snowflake, modern cloud platforms (AWS/Azure), and advanced data engineering tools while being mentored by experienced leaders.

Key Responsibilities
· Building and maintaining data pipelines using Python, SQL, and ETL frameworks (a minimal sketch follows this listing).
· Supporting data preparation, cleaning, and transformation for analytics and reporting.
· Designing and implementing data warehouse solutions on Snowflake.
· Developing and maintaining interactive dashboards using Power BI or Tableau for business insights.
· Helping optimize data queries and models for performance and scalability.
· Integrating data from multiple sources like MySQL, APIs, CSVs, and cloud storage systems.
· Collaborating with data scientists, analysts, and business teams to solve real-world data challenges.
· Staying up to date with emerging trends in cloud data engineering, Snowflake, and big data technologies.

Required Skills & Qualifications
· Bachelor’s or Master’s degree in Computer Science, Data Analytics, or related fields.
· Strong foundation in Python (Pandas, NumPy) and SQL.
· Understanding of database concepts (schemas, tables, views, stored procedures).
· Familiarity with ETL concepts and data transformation workflows.
· Basic knowledge of visualization tools (Power BI, Tableau, or Looker Studio).
· Excellent problem-solving and analytical thinking abilities.
· Exposure to cloud environments (AWS, Azure, GCP, or Oracle Cloud).
· Knowledge of Snowflake, Redshift, BigQuery, or similar cloud data warehouses (nice to have).
· Familiarity with Big Data tools (PySpark, Spark) or AI/ML concepts (nice to have).
· Academic projects, internships, or certifications in data engineering or analytics (nice to have).

Why Join Us?
· Hands-on training in Snowflake and cloud data engineering.
· Opportunity to work on live client projects from Day 1.
· Mentorship from seasoned Data Science and Cloud leaders.
· Dynamic, collaborative culture focused on continuous learning and innovation.
· Competitive salary with fast growth opportunities.
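To give a flavor of the pipeline work listed above, here is a minimal extract-transform-load pass in Python. The CSV name, table, and connection string are hypothetical, and the MySQL URL assumes the pymysql driver is installed.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical source file and connection string -- placeholders only.
SOURCE_CSV = "sales_export.csv"
engine = create_engine("mysql+pymysql://etl_user:password@localhost/analytics")

# Extract: load the raw export, parsing dates up front.
raw = pd.read_csv(SOURCE_CSV, parse_dates=["order_date"])

# Transform: the kind of basic cleaning an entry-level pipeline performs.
clean = (
    raw
    .dropna(subset=["order_id", "amount"])
    .drop_duplicates(subset="order_id")
    .assign(amount=lambda df: df["amount"].astype(float))
)

# Load: append into a reporting table that Power BI or Tableau can read.
clean.to_sql("fact_sales", engine, if_exists="append", index=False)
```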

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics – Senior – AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers, asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Senior Cloud Experts with design experience in big data cloud implementations.

Your Key Responsibilities
Experience with Kafka, Flume and the AWS tool stack, such as Redshift and Kinesis, is preferred (a Redshift bulk-load sketch follows this listing). Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc. Experience in PySpark/Spark/Scala. Experience using software version control tools (Git, Jenkins, Apache Subversion). AWS certifications or other related professional technical certifications. Experience with cloud or on-premises middleware and other enterprise integration technologies. Experience in writing MapReduce and/or Spark jobs. Demonstrated strength in architecting data warehouse solutions and integrating technical components. Good analytical skills with excellent knowledge of SQL. 4+ years of work experience with very large data warehousing environments. 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools. 4+ years of experience with data modelling concepts. 3+ years of Python and/or Java development experience. 3+ years’ experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive). Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent written and verbal communication skills, formal and informal. Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment, and who is adaptable to new technologies and standards.

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills, with experience in producing high-quality reports, papers and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Tableau, Alteryx, etc.

To qualify for the role, you must have
BE/BTech/MCA/MBA, a minimum of 4 years’ hands-on experience in one or more key areas, and a minimum of 7 years’ industry experience.

What We Look For
People with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
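Because this listing repeats the Redshift-centric stack above, here is a complementary sketch of the idiomatic bulk-load path into Redshift: issuing a COPY from S3 through a standard Postgres driver. The cluster endpoint, credentials, table, and IAM role ARN are all hypothetical placeholders.

```python
import psycopg2

# All connection details below are hypothetical placeholders.
conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="<secret>",
)

# COPY is Redshift's bulk-load statement; it reads from S3 in parallel.
copy_sql = """
    COPY analytics.orders_daily
    FROM 's3://example-curated-bucket/orders_daily/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET;
"""

with conn, conn.cursor() as cur:
    cur.execute(copy_sql)
conn.close()
```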

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Lead Analytics Engineer

We are seeking a talented, motivated and self-driven professional to join the HH Digital, Data & Analytics (HHDDA) organization and play an active role in the Human Health transformation journey to become the premier “Data First” commercial biopharma organization.

As a Lead Analytics Engineer, you will be part of the HHDDA Commercial Data Solutions team, providing technical and data expertise for the development of analytical data products that enable data science and analytics use cases. In this role, you will create and maintain data assets/domains used in the commercial/marketing analytics space, developing best-in-class data pipelines and products and working closely with data product owners to translate data product requirements and user stories into development activities throughout all phases of design, planning, execution, testing, deployment and delivery.

Your specific responsibilities will include:
Design and implementation of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices
Enabling data science and analytics teams to drive data modeling and feature engineering activities aligned with business questions, utilizing datasets in an optimal way (see the feature-engineering sketch after this listing)
Developing deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for
Building data products based on automated data models, aligned with use case requirements, and advising data scientists, analysts and visualization developers on how to use these data models
Developing analytical data products for reusability, governance and compliance by design
Aligning with organization strategy and implementing a semantic layer for analytics data products
Supporting data stewards and other engineers in maintaining data catalogs, data quality measures and governance frameworks

Education
B.Tech/B.S., M.Tech/M.S. or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field

Required Experience
8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets)
High proficiency in SQL, Python and AWS
Experience creating and adapting data models to meet requirements from Marketing, Data Science and Visualization stakeholders
Experience with feature engineering
Experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.)
Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot and low-code tools (e.g. Dataiku)
Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders
Experience in analytics use cases of pharmaceutical products and vaccines
Experience in market analytics and related use cases

Preferred Experience
Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines
Experience with Agile ways of working, leading or working as part of scrum teams
Certifications in AWS and/or modern data technologies
Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors
Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers and business stakeholders
Experience with data visualization technologies (e.g., Power BI)

Our Human Health Division maintains a “patient first, profits later” ideology. The organization is comprised of sales, marketing, market access, digital analytics and commercial professionals who are passionate about their role in bringing our medicines to our customers worldwide.

We are proud to be a company that embraces the value of bringing diverse, talented, and committed people together. The fastest way to breakthrough innovation is when diverse ideas come together in an inclusive environment. We encourage our colleagues to respectfully challenge one another’s thinking and approach problems collectively. We are an equal opportunity employer, committed to fostering an inclusive and diverse workplace.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business Intelligence (BI), Data Management, Data Modeling, Data Visualization, Measurement Analysis, Stakeholder Relationship Management, Waterfall Model
Job Posting End Date: 04/30/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.
Requisition ID: R323237
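As a concrete illustration of the feature-engineering responsibility above, the following hypothetical Python sketch rolls raw interaction events up into an analysis-ready feature table. The dataset, path, and column names are invented; reading Parquet from S3 with pandas also assumes the s3fs package is available.

```python
import pandas as pd

# Hypothetical interaction-level dataset; activity_date is assumed to be
# stored as a timestamp column.
interactions = pd.read_parquet("s3://example-bucket/hcp_interactions/")

# Per-entity monthly engagement features derived from raw events.
interactions["month"] = interactions["activity_date"].dt.to_period("M")
features = (
    interactions
    .groupby(["hcp_id", "month"])
    .agg(
        total_interactions=("interaction_id", "count"),
        channels_used=("channel", "nunique"),
    )
    .reset_index()
)

# A trailing three-month engagement signal per entity -- a typical
# model-ready feature.
features["interactions_3m"] = (
    features.sort_values("month")
            .groupby("hcp_id")["total_interactions"]
            .transform(lambda s: s.rolling(3, min_periods=1).sum())
)
```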

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

This role is for one of Weekday's clients.
Min Experience: 4 years
Location: Bengaluru
Job Type: Full-time

Key Responsibilities
As a Data Engineer, you will play a crucial role in designing and maintaining scalable and high-performance data systems (see the pipeline sketch after this listing). Your responsibilities will include:

Data Pipeline Development and Management: Design, build, test, and maintain efficient data pipelines and data management systems. Develop and manage ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to integrate data from diverse sources such as databases, APIs, and real-time streams.
Data Modeling and Architecture: Design data models and implement schemas for data warehouses and data lakes to support analytics and business operations. Optimize data storage, access, and performance for scalability and maintainability.
Data Quality and Integrity: Implement validation, cleansing, and monitoring to maintain data accuracy, consistency, and reliability. Define and enforce best practices and standards for data governance and quality.
Infrastructure Management: Manage and monitor key data infrastructure components, including databases, data lakes, and distributed computing environments. Apply data security protocols and ensure proper access controls are in place.
Automation and Optimization: Automate data workflows and pipelines to improve reliability and performance. Continuously monitor and fine-tune systems for operational efficiency.
Collaboration and Support: Partner with data scientists, analysts, software engineers, and business stakeholders to gather requirements and provide scalable data solutions. Document processes, workflows, and system designs; support cross-functional teams with technical guidance.
Technology Evaluation: Stay current with emerging tools and technologies in the data engineering space. Evaluate and recommend new solutions to enhance data capabilities and performance.

Education and Experience
Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, Data Science, or a related field. 5 to 7 years of experience in data engineering, software development, or a similar domain.

Required Skills & Qualifications
Programming: Strong experience in Python and SQL.
Databases: Proficient in relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
Data Warehousing & Lakes: Hands-on experience with platforms like Snowflake, Redshift, and BigQuery.
ETL/ELT Tools: Proficiency with tools like Apache Airflow, AWS Glue, Azure Data Factory, and Talend.
Big Data: Working knowledge of Apache Spark or similar big data technologies.
Cloud Platforms: Experience with AWS, Azure, or GCP for data engineering workflows.
Data Modeling: Strong understanding of modeling techniques and best practices.
API Integration: Ability to build and consume APIs for data integration.
Version Control: Experience with Git or other version control systems.

Soft Skills
Analytical mindset with a strong problem-solving approach. Excellent communication skills for both technical and non-technical audiences. Team player with a collaborative work ethic. Detail-oriented with a commitment to data quality. Adaptability to new technologies and changing project requirements.

Key Skills: ETL, Data Modeling, Data Architecture, Cloud Data Platforms, Python, SQL, Big Data, Data Warehousing, API Integration
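As referenced in the responsibilities above, a skeleton of an orchestrated pipeline in Apache Airflow might look like the sketch below. The DAG id and task bodies are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task callables -- stand-ins for real ETL logic.
def extract():
    print("pull data from source systems and land it in staging")

def transform():
    print("clean and reshape the staged data")

def load():
    print("write the transformed data into the warehouse")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies read left to right: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```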

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Kochi, Kerala, India

On-site

The Data Architect is responsible for defining and leading the Data Architecture, Data Quality, and Data Governance functions, ingesting, processing, and storing millions of rows of data per day. This hands-on role helps solve real big data problems. You will work with our product, business, and engineering stakeholders, understand our current ecosystems, and then build consensus on solution designs, write code and automation, define standards, establish best practices across the company, and build world-class data solutions and applications that power crucial business decisions throughout the organization. We are looking for an open-minded, structured thinker passionate about building systems at scale.

Role
Design, implement and lead Data Architecture, Data Quality, and Data Governance
Define data modeling standards and foundational best practices
Develop and evangelize data quality standards and practices
Establish data governance processes, procedures, policies, and guidelines to maintain the integrity and security of the data
Drive the successful adoption of organizational data utilization and self-serviced data platforms
Create and maintain critical data standards and metadata that allow data to be understood and leveraged as a shared asset
Develop standards and write template code for sourcing, collecting, and transforming data for streaming or batch processing (see the streaming sketch after this listing)
Design data schemas, object models, and flow diagrams to structure, store, process, and integrate data
Provide architectural assessments, strategies, and roadmaps for data management
Apply hands-on subject matter expertise in the architecture and administration of Big Data platforms and Data Lake technologies (AWS S3/Hive), plus experience with ML and Data Science platforms
Implement and manage industry-best-practice tools and processes such as Data Lake, Databricks, Delta Lake, S3, Spark ETL, Airflow, Hive Catalog, Redshift, Kafka, Kubernetes, Docker, and CI/CD
Translate big data and analytics requirements into data models that will operate at large scale and high performance, and guide the data analytics engineers on these data models
Define templates and processes for the design and analysis of data models, data flows, and integration
Lead and mentor Data Analytics team members in best practices, processes, and technologies in data platforms

Qualifications
B.S. or M.S. in Computer Science, or an equivalent degree
10+ years of hands-on experience in Data Warehouse, ETL, Data Modeling & Reporting
7+ years of hands-on experience in productionizing and deploying Big Data platforms and applications
Hands-on experience working with: relational/SQL databases, distributed columnar data stores/NoSQL databases, time-series databases, Spark Streaming, Kafka, Hive, Delta, Parquet, Avro, and more
Extensive experience in understanding a variety of complex business use cases and modeling the data in the data warehouse
Highly skilled in SQL, Python, Spark, AWS S3, Hive Data Catalog, Parquet, Redshift, Airflow, and Tableau or similar tools
Proven experience in building a custom enterprise data warehouse or implementing tools like data catalogs, Spark, Tableau, Kubernetes, and Docker
Knowledge of infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience in Amazon Web Services (AWS)
Strong verbal and written communication skills; must work effectively across internal and external organizations and virtual teams
Demonstrated industry leadership in the fields of Data Warehousing, Data Science, and Big Data related technologies
Strong understanding of distributed systems and container-based development using the Docker and Kubernetes ecosystem
Deep knowledge of data structures and algorithms
Experience working in large teams using CI/CD and agile methodologies
Unique ID -
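As referenced in the role description, here is a minimal PySpark Structured Streaming sketch of a Kafka-to-lake flow. The broker, topic, schema, and S3 paths are hypothetical, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clickstream-ingest").getOrCreate()

# Subscribe to a Kafka topic as an unbounded streaming source.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka delivers bytes; cast to string and parse JSON with a DDL schema.
parsed = (
    events
    .select(F.from_json(
        F.col("value").cast("string"),
        "user_id STRING, ts TIMESTAMP, url STRING",
    ).alias("e"))
    .select("e.*")
)

# Land the stream in the lake; a Delta sink would use format("delta") instead.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3://example-lake/clickstream/")
    .option("checkpointLocation", "s3://example-lake/_checkpoints/clickstream/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```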

Posted 2 weeks ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Bengaluru

Hybrid

Hi all, we are hiring for the role of C&S ETL Engineer.
Experience: 6 - 8 Years
Location: Bangalore
Notice Period: Immediate - 15 Days
Mandatory Skills: AWS Glue

Job Description: Minimum experience of 6 years in building, optimizing, and maintaining scalable data pipelines as an ETL Engineer. Hands-on experience in coding techniques with a proven record. Hands-on experience in end-to-end data workflows, including pulling data from third-party and in-house tools via APIs, transforming and loading it into data warehouses, and improving performance across the ETL lifecycle. Hands-on experience with scripting (Python, shell scripting), relational databases (PostgreSQL, Redshift), REST APIs (OAuth, JWT, Basic Auth), a job scheduler (cron), a version control system (Git), and the AWS environment. Hands-on experience in integrating data from various data sources. Understanding of Agile processes and principles. Good communication, presentation, and documentation skills.

Preferred: Ability to understand business problems and customer needs and provide data solutions. Hands-on experience in working with Qualys and its APIs. Understanding of business intelligence tools such as Power BI. Knowledge of data security and privacy.

Responsibilities: Design, develop, implement, and maintain robust and scalable ETL pipelines using Python and SQL, as well as AWS Glue and AWS Lambda, for data ingestion, transformation, and loading into various data targets (e.g., PostgreSQL, Amazon S3, Redshift, Aurora) and structured data management. A sketch of a typical API-to-warehouse load follows this listing.

If you are interested, drop your resume at mojesh.p@acesoftlabs.com or call 9701971793.
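As mentioned above, here is a compact sketch of the API-to-warehouse pattern this role describes. The endpoint, token, table, and columns are hypothetical placeholders; a real source such as the Qualys API would have its own authentication and pagination.

```python
import psycopg2
import requests
from psycopg2.extras import execute_values

# Hypothetical endpoint and bearer token -- placeholders only.
API_URL = "https://api.example.com/v1/findings"
HEADERS = {"Authorization": "Bearer <token>"}

# Extract: pull one page of records from the REST API.
resp = requests.get(API_URL, headers=HEADERS, timeout=30)
resp.raise_for_status()
rows = [
    (f["id"], f["severity"], f["detected_at"])
    for f in resp.json()["findings"]
]

# Load: batch-insert into Postgres; assumes findings.id is the primary key.
conn = psycopg2.connect("dbname=security host=db.example.com user=etl_user")
with conn, conn.cursor() as cur:
    execute_values(
        cur,
        "INSERT INTO findings (id, severity, detected_at) VALUES %s "
        "ON CONFLICT (id) DO NOTHING",
        rows,
    )
conn.close()
```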

Posted 2 weeks ago

Apply

0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Data Engineering
Architect, develop, and maintain highly scalable and robust data pipelines using Apache Kafka, Apache Spark, and Apache Airflow. Design and optimize data storage solutions, including Amazon Redshift, S3, or comparable platforms, to support large-scale analytics. Ensure data quality, integrity, security, and compliance across all data platforms.

Data Science
Design, develop, and deploy sophisticated machine learning models to solve complex business challenges. Build and optimize end-to-end ML pipelines, including data preprocessing, feature engineering, model training, and deployment (a minimal pipeline sketch follows this listing). Drive Generative AI initiatives, creating innovative products and solutions. Conduct in-depth analysis of large, complex datasets to generate actionable insights and recommendations. Work closely with cross-functional stakeholders to understand business requirements and deliver data-driven strategies.

Collaboration and Mentorship
Mentor and guide junior data scientists and data engineers, fostering a culture of continuous learning and professional growth. Contribute to the development of best practices in Data Science, Data Engineering, and Generative AI.

General
Write clean, efficient, and maintainable code in Python and at least one other language (e.g., C#, Go, or equivalent). Participate in code reviews, ensuring adherence to best practices and coding standards. Stay abreast of the latest industry trends, tools, and technologies in Data Engineering, Data Science, and Generative AI. Document processes, models, and workflows to ensure knowledge sharing and reproducibility.
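As referenced above, an end-to-end ML pipeline in the scikit-learn sense bundles preprocessing and the model so the same transforms apply at training and serving time. This is a hypothetical sketch; the dataset, columns, and target are invented.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical training frame with a binary target column.
df = pd.read_parquet("features.parquet")
X, y = df.drop(columns="churned"), df["churned"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Preprocessing and the model travel together as one fitted object.
pipeline = Pipeline([
    ("prep", ColumnTransformer([
        ("num", StandardScaler(), ["tenure_days", "monthly_spend"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["plan", "region"]),
    ])),
    ("model", GradientBoostingClassifier()),
])

pipeline.fit(X_train, y_train)
print("holdout accuracy:", pipeline.score(X_test, y_test))
```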

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description:
Design, develop, and maintain dashboards and reports using Sigma Computing. Collaborate with business stakeholders to understand data requirements and deliver actionable insights. Write and optimize SQL queries that run directly on cloud data warehouses. Enable self-service analytics for business users via Sigma's spreadsheet interface and templates. Apply row-level security and user-level filters to ensure proper data access controls (illustrated in the generic sketch after this listing). Partner with data engineering teams to validate data accuracy and ensure model alignment. Troubleshoot performance or data issues in reports and dashboards. Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles. Hands-on experience with Sigma Computing is highly preferred. Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.). Experience with data modeling concepts and modern data stacks. Ability to translate business requirements into technical solutions. Familiarity with data governance, security, and role-based access controls. Excellent communication and stakeholder management skills. Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight). Familiarity with dbt, Fivetran, or other ELT/ETL tools. Exposure to Agile or Scrum methodologies.

Skills: Agile, Tableau, BigQuery, SQL, T-SQL, data modeling, Snowflake, Redshift, CDP Platform, Power BI, Data Visualization.
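Sigma configures row-level security inside the product rather than in hand-written SQL, so the sketch below is only a generic illustration of the user-level filter pattern that BI tools push down to the warehouse. The tables, mapping scheme, and connection string are hypothetical.

```python
import psycopg2

# Hypothetical warehouse connection -- placeholder only.
conn = psycopg2.connect("dbname=warehouse host=wh.example.com user=bi_service")

# Restrict rows to the regions mapped to the requesting user.
query = """
    SELECT region, SUM(revenue) AS revenue
    FROM analytics.sales
    WHERE region IN (
        SELECT region
        FROM analytics.user_region_access
        WHERE user_email = %s
    )
    GROUP BY region;
"""

with conn, conn.cursor() as cur:
    cur.execute(query, ("analyst@example.com",))
    for region, revenue in cur.fetchall():
        print(region, revenue)
conn.close()
```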

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka

On-site

3-5 years of professional experience designing, building, and maintaining a highly available data and analytics platform. 3+ years of experience in data engineering, with a focus on building large-scale data processing systems. Hands-on experience with AWS or a similar cloud platform, building data engineering solutions for analytics and science (2+ years). Must have experience building complex data pipelines, batch and/or real-time event-based processing (2+ years). Strong experience in designing, building and maintaining a data warehouse in Redshift or similar cloud-based solutions (2+ years). Experience in Matillion or a similar ETL/ELT tool for developing data ingestion and curation flows (2+ years). Must have strong hands-on experience in SQL (2+ years). Strong hands-on experience in modern scripting languages, particularly Python (2+ years). Experience building complex ETL using Spark (Scala or Python) for event-based big data processing (1+ years). Strong hands-on experience with NoSQL DBs such as MongoDB, Cassandra or DynamoDB (1+ years; see the DynamoDB sketch after this listing). Strong experience with AWS deployment using a CI/CD pipeline is preferred (1+ years). Experience in infrastructure-as-code services like Terraform preferred (1+ years). Experience building mission-critical systems running 24x7. Desire to work within a team of engineers at all levels of experience. Desire to mentor junior developers, maximizing their productivity. Good written and spoken communication skills.
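As referenced in the NoSQL requirement above, here is a minimal boto3 sketch of the core DynamoDB access pattern. The table (with partition key user_id and sort key event_ts), region, and items are hypothetical.

```python
import boto3
from boto3.dynamodb.conditions import Key

# Hypothetical table keyed by (user_id, event_ts) -- placeholder only.
dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")
table = dynamodb.Table("user_events")

# Write a single event item.
table.put_item(Item={
    "user_id": "u123",
    "event_ts": "2025-01-01T00:00:00Z",
    "event_type": "login",
})

# Query the whole partition back in sort-key order.
resp = table.query(KeyConditionExpression=Key("user_id").eq("u123"))
for item in resp["Items"]:
    print(item["event_ts"], item["event_type"])
```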

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra

Remote

Solution Engineering - Cloud & AI - Data
Mumbai, Maharashtra, India

Date posted: Jul 16, 2025
Job number: 1847893
Work site: Up to 50% work from home
Travel: 25-50%
Role type: Individual Contributor
Profession: Technology Sales
Discipline: Solution Engineering
Employment type: Full-Time

Overview
Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning.

Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products, including IaaS and PaaS services on the Azure platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way.

As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proofs of Concept, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, and hone your technical skills, becoming adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Qualifications
10+ years technical pre-sales or technical consulting experience; OR Bachelor's Degree in Computer Science, Information Technology, or a related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's Degree in Computer Science, Information Technology, or a related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience.
Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps.
Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance.
Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.
6+ years technical pre-sales, technical consulting, technology delivery, or related experience; OR equivalent experience.
4+ years experience with cloud and hybrid or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management.
Proficient in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2.

Responsibilities
Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft's cloud platforms.
Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, and Purview Data Governance, and in Azure Databases: SQL DB, Cosmos DB, PostgreSQL.
Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions.
Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry-leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies