3.0 - 6.0 years
17 - 25 Lacs
Hyderabad
Hybrid
Job Description
Role & Responsibilities: Develop and maintain SQL Server databases, stored procedures, and SSIS packages. Build and optimize ADF pipelines and integrate data from various sources. Write Python scripts for automation and data-handling tasks. Create robust, high-performing ETL solutions in an Azure environment. Participate in requirement gathering, debugging, testing, and performance tuning. Collaborate with business and technical teams using Agile practices. Work on API-based data integrations and monitor SQL jobs. Document solutions and maintain coding standards and best practices.
Shift Timing: 2:00 PM to 11:00 PM IST. Work Location: Hyderabad (In-Person). Interview Mode: In-Person only.
Preferred Candidate Profile: 3-5 years of experience in SQL Server, SSIS, ADF, and Python. Strong ETL knowledge and Azure SQL/Data Factory experience. Experience with debugging, job scheduling, and performance optimization. Strong communication and stakeholder coordination skills. Open to working a late shift and available for an in-person interview in Hyderabad.
Bonus: Knowledge of full stack (MVC.NET) or BI tools.
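For illustration only, here is a minimal sketch of the kind of Python automation script this role describes: loading a flat-file extract into a SQL Server staging table. The connection string, file name, and table name are assumptions, not details from the posting.

```python
# Minimal sketch: load a CSV extract into a SQL Server staging table.
# Connection string, file path, and table name are illustrative assumptions.
import pandas as pd
from sqlalchemy import create_engine

ENGINE = create_engine(
    "mssql+pyodbc://user:password@myserver/StagingDB"
    "?driver=ODBC+Driver+17+for+SQL+Server"  # assumed ODBC driver name
)

def load_csv_to_staging(csv_path: str, table: str) -> int:
    """Read a flat-file extract, apply light cleanup, and bulk-load it."""
    df = pd.read_csv(csv_path)
    df = df.drop_duplicates()  # simple data-handling step before loading
    df.to_sql(table, ENGINE, if_exists="append", index=False, chunksize=1000)
    return len(df)

if __name__ == "__main__":
    rows = load_csv_to_staging("daily_extract.csv", "stg_daily_extract")
    print(f"Loaded {rows} rows")
```

In practice a script like this would typically be wrapped by an ADF pipeline activity or SQL Agent job rather than run ad hoc.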
Posted 19 hours ago
2.0 - 5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Product Architecture Planner - Platform & Systems Integration
Department: Product Strategy / Architecture & Compliance
Location: Gurugram
Employment Type: Full-Time
Experience Required: 2-5 Years
Reporting To: AVP, Product & Architecture
About the Role
As a Product Architecture Planner, you will be responsible for designing, aligning, and ensuring seamless integration of product platforms across multiple ecosystems. You will act as the custodian of product-level architectural standards, compliance protocols, and interoperability guidelines. This role demands an innovative mindset, strong attention to detail, and the ability to draft processes, SoPs, and frameworks that will scale across business functions and partner ecosystems.
Key Responsibilities
Platform Architecture & Design: Define and evolve the overall product and platform architecture roadmap. Ensure modular, scalable, and future-ready design for cross-platform products. Collaborate with technology, product, and business teams to translate requirements into robust architecture. Lead the strategy and execution for interoperability across systems and platforms. Ensure smooth API integrations, ecosystem connectivity, and data exchange protocols. Monitor and ensure compliance of platform architecture with industry standards, legal, data privacy, and security protocols. Stay updated with evolving regulatory frameworks and integrate them into product/system designs. Conduct audits and compliance reviews, recommending corrective actions where required. Draft and maintain Standard Operating Procedures (SoPs), technical documentation, and architectural guidelines. Define policies for change management, quality control, and versioning of platform updates. Engage with internal and external stakeholders to align on architecture decisions and integration roadmaps. Act as the single point of contact (SPOC) for architecture-related discussions, technical compliance, and integration partners. Assess system performance and recommend enhancements for efficiency, reliability, and scalability. Conduct benchmarking against industry best practices to ensure competitive advantage. Own architecture governance processes to maintain product ecosystem integrity. Explore emerging technologies, frameworks, and integration models to enhance platform capabilities.
Candidate Profile & Requirements
Experience: 2-4 years of experience in platform architecture, systems integration, product compliance, or enterprise solution planning. Strong knowledge of software architectures (microservices, cloud-native, modular frameworks). Proven track record of managing integration architecture (APIs, data pipelines, middleware).
Educational Qualification: Relevant experience; B.E./B.Tech in Computer Science, IT, or Electronics. MBA / M.Tech preferred.
Exposure to compliance frameworks (ISO, GDPR, DPDP Act, cybersecurity regulations) is highly desirable. Proficiency in documentation, SoP development, and architectural frameworks (TOGAF, Zachman, etc.). Strong analytical and problem-solving ability to balance business requirements with technical feasibility. Excellent communication, stakeholder management, and presentation skills. High ownership, attention to detail, and an innovation-driven mindset.
A Typical Day in This Role
Review ongoing platform integrations and identify improvements. Draft or refine SoPs for new integration processes or compliance updates. Collaborate with product, tech, and legal teams to align on platform architecture decisions.
Monitor compliance reports and suggest corrective or preventive actions. Evaluate requests from business/product teams for new cross-platform features. Research emerging tools, frameworks, and compliance requirements to future-proof the product architecture.
Posted 1 day ago
12.0 - 15.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About Us
Anand Rathi Shares and Stock Brokers Limited is a leading financial services company with 30 years of proven market excellence. The company offers a wide range of products and services under brokerage and distribution of equities, commodities, currencies and mutual funds. Anand Rathi has a strong presence globally and serves a diverse client base. We are scaling our digital-first business with a next-generation, high-performance trading platform that delivers lightning-fast executions and rich market insights to a broader spectrum of our retail clients. Our commitment to transparent, responsible product innovation drives every enhancement: intuitive onboarding flows, seamless multi-asset trading, and personalized research tools. By partnering closely with the digital product team, we will optimize every customer interaction and streamline onboarding for Anand Rathi's digital investment platforms, empowering investors with the speed, reliability, and cutting-edge offerings they need to achieve their financial goals.
POSITION OVERVIEW
Lead the design and implementation of scalable, high-performance backend systems for financial applications. Drive technical architecture decisions while ensuring regulatory compliance and system reliability for mission-critical trading platforms.
KEY RESPONSIBILITIES
Architecture & Design: Create scalable backend architectures for high-volume financial transactions, real-time trading systems, and core banking platforms. Define technical roadmaps and evaluate emerging technologies aligned with business objectives.
Technical Leadership: Mentor development teams, conduct architecture and code reviews, drive adoption of engineering best practices, and lead technical decision-making for complex architectural challenges.
System Integration: Design seamless integrations with third-party financial systems, payment gateways, regulatory platforms, and external APIs while maintaining system integrity and performance.
Performance & Security: Architect high-throughput systems with optimized databases, caching strategies, and load balancing, and implement robust security measures ensuring adherence to financial regulations including SEBI, GDPR, and SOC 2.
TECHNICAL REQUIREMENTS
Core Programming: Expert-level proficiency in Java with Spring Framework, or Golang and related frameworks. Working knowledge of JavaScript/Node.js and microservices architecture patterns.
Backend Technologies: Strong experience with REST APIs, GraphQL, message queues (Kafka, RabbitMQ), event-driven architectures, distributed systems, and real-time data processing frameworks.
Database Expertise: Advanced skills in PostgreSQL and MySQL with query optimization, plus NoSQL databases including MongoDB, Redis, and Elasticsearch.
GenAI and Data Engineering: Experience building performant data pipelines for processing TBs of data per day. Conversant with LLMs, GenAI, AI agents, and agentic workflows and their related tools and frameworks (CrewAI, LangGraph, etc.).
Cloud & Infrastructure: Hands-on experience with AWS, Azure, or Google Cloud platforms. Proficiency with Docker, Kubernetes, Terraform, CI/CD pipelines, infrastructure as code, and comprehensive monitoring solutions.
Financial Services: Understanding of trading systems, payment processing, risk management platforms, regulatory compliance frameworks, high-frequency trading, and real-time financial data processing.
EXPERIENCE & QUALIFICATIONS
Professional Background: 12-15+ years in backend development with 5+ years in architecture roles, 2-5+ years in Fintech or financial services, and 2+ years driving organisational technical vision.
Education: Bachelor's or Master's degree from a top-tier college in Computer Science, Software Engineering, or a related technical field with strong computer science fundamentals.
Certifications: AWS/Azure/GCP Solutions Architect certification preferred.
Key Competencies: Strong analytical and problem-solving skills, excellent communication and stakeholder management abilities, business acumen with financial services knowledge, and commitment to continuous learning with emerging technologies.
PERFORMANCE EXPECTATIONS
Design and deliver backend systems that maintain 99.999%+ availability with millisecond response times for critical financial applications. Lead architecture standardization across development teams while ensuring regulatory compliance and successful audit outcomes. Drive technical excellence that supports business scalability and reduces operational costs through efficient system design.
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Senior Data Engineer with 5+ years of experience for our Bengaluru location (max 30 days notice period). The ideal candidate will have strong expertise in designing, developing, and maintaining robust data ingestion frameworks, scalable pipelines, and DBT-based transformations. Responsibilities include building and optimizing DBT models, architecting ELT pipelines with orchestration tools like Airflow/Prefect, integrating workflows with AWS services (S3, Lambda, Glue, RDS), and ensuring performance optimization on platforms like Snowflake, Redshift, and Databricks. The candidate will implement CI/CD best practices for DBT, manage automated deployments, troubleshoot pipeline issues, and collaborate cross-functionally to deliver cloud-based real-time and batch data solutions. Strong SQL, scripting, API integrations, and AWS experience are essential.
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Data Engineer with 5+ years of experience for our Bengaluru location (max 30 days notice period). The ideal candidate will have strong expertise in designing, developing, and maintaining robust data ingestion frameworks, scalable pipelines, and DBT-based transformations. Responsibilities include building and optimizing DBT models, architecting ELT pipelines with orchestration tools like Airflow/Prefect, integrating workflows with AWS services (S3, Lambda, Glue, RDS), and ensuring performance optimization on platforms like Snowflake, Redshift, and Databricks. The candidate will implement CI/CD best practices for DBT, manage automated deployments, troubleshoot pipeline issues, and collaborate cross-functionally to deliver cloud-based real-time and batch data solutions. Strong SQL, scripting, API integrations, and AWS experience are essential.
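As a rough illustration of the DBT-plus-orchestration pattern described above, below is a hypothetical Airflow DAG that runs and then tests a DBT project on a nightly schedule; the project path, target name, and schedule are assumptions rather than anything specified in the posting.

```python
# Hypothetical Airflow DAG: orchestrate a DBT run followed by DBT tests.
# Project path, target, and cron schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # nightly run; adjust to the real refresh SLA
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test  # only test models after the run succeeds
```

A production setup would usually add alerting on failure and separate staging/production targets, but the run-then-test ordering is the core of the pattern.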
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience: 3-5 years. Experience with Apache Spark is a must. Looking for a skilled backend developer with strong experience in Java, Spring Boot, and Apache Spark. Responsible for building scalable microservices and processing large datasets in real-time or batch environments. Must have a solid understanding of REST APIs, distributed systems, and data pipelines. Experience with cloud platforms (AWS/GCP) is a plus.
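The posting targets Java and Spring Boot, but the batch-processing pattern it describes can be sketched briefly in PySpark; the S3 paths and column names below are illustrative assumptions only.

```python
# Illustrative PySpark batch job for large-dataset aggregation.
# Paths and columns are assumed; the role itself is Java/Spring Boot based,
# so treat this only as a sketch of the processing pattern.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("order-batch-aggregation").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # assumed input path
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")
spark.stop()
```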
Posted 2 days ago
3.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Hi, greetings from Zensar! We are looking for an ETL L2 Support role for our Hyderabad location.
Experience: 3-6 years. Notice: immediate to 15 days.
3-5 years of production support experience on Informatica/Python/AWS technologies and applications. Must have a good understanding and technical knowledge of Informatica architecture and client components such as Workflow Manager, Mapping Designer, Workflow Monitor, and Repository Manager. Excellent knowledge of AWS/Python concepts. Informatica-to-cloud migration. Hands-on expertise in debugging Informatica ETL mappings to narrow down issues. Hands-on experience in ETL transformations such as lookup/joiner/source qualifier/normalizer. Hands-on experience in dealing with various types of sources such as flat files, mainframes, XML files, and databases. Experience with the AWS environment, data pipelines, RDS, and reporting tools. Hands-on experience in Unix scripting and file operations. Strong knowledge of SQL/PL-SQL and Oracle databases; able to debug complex queries. Good understanding of scheduling tools such as TWS/Tidal/others. Worked at least 2 years on ServiceNow for application incident management and problem management in a 24x7 model. Strong communication skills, both written and verbal, with the ability to follow processes.
If interested, please share resumes to [HIDDEN TEXT]
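For context on the lookup/joiner debugging mentioned above, here is a small, hypothetical Python sketch of reproducing such a step outside Informatica to narrow down a row-count mismatch; the file names and join key are assumptions, not part of the posting.

```python
# Hypothetical sketch: reproduce an Informatica lookup/joiner step in pandas
# to isolate rows that fail the lookup. File names and key are assumptions.
import pandas as pd

source = pd.read_csv("source_extract.csv", dtype=str)    # source qualifier output
lookup = pd.read_csv("customer_lookup.csv", dtype=str)   # lookup table extract

# Joiner: left-join source rows to the lookup on the business key.
joined = source.merge(lookup, on="customer_id", how="left", indicator=True)

# Rows that failed the lookup are the usual suspects when target counts drop.
missing = joined[joined["_merge"] == "left_only"]
print(f"{len(missing)} source rows had no matching lookup entry")
missing.to_csv("lookup_misses.csv", index=False)
```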
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Lexington Partners is one of the world's largest and most trusted managers of secondary private equity and co-investment funds. Since our founding in 1994, we have been at the forefront of innovation in private equity investing, managing over $70 billion in committed capital and partnering with a global network of institutional investors, private equity firms, and portfolio companies.
What are the ongoing responsibilities of the Associate Software Engineer (Data Engineer)?
We are building a growing Data and AI team. You will play a critical role in the efforts to centralize structured and unstructured data for the firm. We seek a candidate with skills in data modeling, data management, and data governance who can contribute first-hand to the firm's data strategy. The ideal candidate is a self-starter with a strong technical foundation, a collaborative mindset, and the ability to navigate complex data challenges.
What ideal qualifications, skills & experience would help someone to be successful?
Bachelor's degree in computer science or computer applications, or equivalent experience in lieu of a degree, with 3 years of industry experience. Strong expertise in data modeling and data management concepts. Experience in implementing master data management is preferred. Sound knowledge of Snowflake and data warehousing techniques. Experience in building, optimizing, and maintaining data pipelines and data management frameworks to support business needs. Proficiency in at least one programming language, preferably Python. Collaborate with cross-functional teams to translate business needs into scalable data and AI-driven solutions. Take ownership of projects from ideation to production, operating in a startup-like culture within an enterprise environment. Excellent communication, collaboration, and an ownership mindset. Foundational knowledge of API development and integration. Knowledge of Tableau and Alteryx is good to have.
Work Shift Timings: 2:00 PM - 11:00 PM IST
Posted 2 days ago
8.0 - 10.0 years
12 - 16 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
Notice Period: Immediate joiners or those available within 15 days preferred.
Share Your Resume With: Current CTC, Expected CTC, Notice Period, Preferred Job Location.
Primary Skills: MSSQL, Redshift, Snowflake; T-SQL, LinkSQL, Stored Procedures; ETL Pipeline Development; Query Optimization & Indexing; Schema Design & Partitioning; Data Quality, SLAs, Data Refresh; Source Control (Git/Bitbucket), CI/CD; Data Modeling, Versioning; Performance Tuning & Troubleshooting.
What You Will Do: Design scalable, partitioned schemas for MSSQL, Redshift, and Snowflake. Optimize complex queries, stored procedures, indexing, and performance tuning. Build and maintain robust data pipelines to ensure timely, reliable delivery of data. Own SLAs for data refreshes, ensuring reliability and consistency. Collaborate with engineers, analysts, and DevOps to align data models with product and business needs. Troubleshoot performance issues, implement proactive monitoring, and improve workflows. Enforce best practices for data security, governance, and compliance. Utilize schema migration/versioning tools for database changes.
What You'll Bring: Bachelor's or Master's in Computer Science, Engineering, or a related field. 8+ years of experience in database engineering or backend data systems. Expertise in MySQL, Redshift, Snowflake, and schema optimization. Strong experience in writing functions, procedures, and robust SQL scripts. Proficiency with ETL processes, data modeling, and data freshness SLAs. Experience handling production performance issues and being the go-to database expert. Hands-on with Git, CI/CD pipelines, and data observability tools. Strong problem-solving, collaboration, and analytical skills.
If you're interested and meet the above criteria, please share your resume with your current CTC, expected CTC, notice period, and preferred job location. Immediate or 15-day joiners will be prioritized.
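As an illustration of the "own SLAs for data refreshes" responsibility above, a minimal freshness-check sketch follows; the connection URL, audit column, and SLA thresholds are assumptions rather than anything specified in the posting.

```python
# Minimal data-freshness SLA check. Connection URL, table list, audit column,
# and thresholds are all illustrative assumptions.
from datetime import datetime, timedelta

from sqlalchemy import create_engine, text

# Assumes the snowflake-sqlalchemy dialect; swap the URL for MSSQL/Redshift as needed.
ENGINE = create_engine("snowflake://user:pass@my_account/DW/PUBLIC?warehouse=ETL_WH")

SLA_HOURS = {"fact_orders": 6, "dim_customer": 24}  # hypothetical refresh SLAs

def check_freshness() -> list[str]:
    breaches = []
    now = datetime.utcnow()  # assumes loaded_at is stored as naive UTC
    with ENGINE.connect() as conn:
        for table, max_age in SLA_HOURS.items():
            latest = conn.execute(
                text(f"SELECT MAX(loaded_at) FROM {table}")  # assumed audit column
            ).scalar()
            if latest is None or now - latest > timedelta(hours=max_age):
                breaches.append(f"{table} is stale (last load: {latest})")
    return breaches

if __name__ == "__main__":
    for breach in check_freshness():
        print("SLA breach:", breach)
```

A check like this is typically scheduled alongside the refresh jobs so that SLA misses raise alerts before downstream consumers notice.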
Posted 2 days ago
15.0 - 19.0 years
0 Lacs
Karnataka
On-site
As a leading provider of AI-powered solutions to clean dirty data and unlock its hidden potential for healthcare transformation, HiLabs is committed to revolutionizing the healthcare industry through innovation, collaboration, and a relentless focus on improving patient outcomes. Join a multidisciplinary team of industry leaders, healthcare domain experts, AI/ML, and data science professionals from prestigious institutions worldwide, including Harvard, Yale, Carnegie Mellon, Duke, Georgia Tech, IIM, and IIT. Be part of a team that leverages advanced AI, ML, and big data technologies to develop cutting-edge healthcare technology platforms and deliver innovative business solutions. Position: Delivery Head Location: Bangalore or Pune, India Experience: 15+ years in software/product delivery leadership Job Summary: We are looking for an experienced and dynamic Delivery Head to oversee the end-to-end delivery of multiple enterprise software and data products for US-based clients. Based in Bangalore or Pune, you will lead cross-functional teams in software development, data engineering, and data science to ensure timely, high-quality, and scalable releases. This leadership role demands strong technical expertise, strategic planning, and exceptional stakeholder management to drive innovation and operational excellence in a fast-paced environment. Responsibilities: - Own the complete delivery lifecycle of 4-5 major product releases annually, aligning with client expectations, timelines, and quality standards. - Lead, mentor, and inspire a diverse team of 50+ engineers, data scientists, and QA professionals to cultivate a high-performance culture. - Collaborate with US-based clients, product managers, and architects to translate business requirements into actionable technical plans. - Drive Agile/Scrum best practices and DevOps principles for efficient and predictable delivery cycles. - Manage budgets, resource allocation, and proactively mitigate risks to avoid delays and cost overruns. - Ensure all deliverables comply with scalability, security, and regulatory requirements, such as HIPAA and GDPR. - Act as the primary escalation point for critical delivery issues, resolving conflicts and dependencies promptly. - Define and implement KPIs to monitor delivery performance and enhance processes continuously. Desired Profile: - 15+ years of experience in software/product delivery leadership, including 5 years in a senior role like Delivery Head or Program Director. - Strong technical background with hands-on experience in full stack development using React.js and Java (Spring Boot). - Expertise in data engineering (ETL, data pipelines, data warehousing) and data science (machine learning, AI models, analytics). - Proven track record of delivering enterprise-grade SaaS and data solutions for US clients. - Thorough understanding of Agile methodologies, DevOps practices, CI/CD pipelines, and cloud platforms (AWS, Azure, or GCP). - Excellent stakeholder management skills bridging gaps between technical teams and business leadership. - Strong analytical, problem-solving, and decision-making abilities. Preferred Qualifications: - Experience in the healthcare domain or with healthcare-related compliance standards. - Familiarity with regulatory compliance frameworks like HIPAA and GDPR. - Exposure to managing large, distributed teams and cross-geographical collaboration. - Advanced certifications in Agile, Scrum, or project/program management (e.g., PMP, SAFe) are advantageous. 
If you are a visionary leader ready to drive impactful product deliveries and lead high-performing teams, we encourage you to apply and join our innovative journey at HiLabs. Why Join HiLabs - Work on MCheck, our AI-powered platform that has analyzed over 15 billion records, impacting 1 in 3 insured Americans. - Shape the future of healthcare technology with explainable AI and data-driven design. - Competitive salary, performance-based accelerators, 401(k) matching, medical coverage, and more. - Access mentorship, training, conferences, and certifications for professional growth. - Enjoy a supportive, inclusive culture with flexible PTO and collaboration with diverse global teams. HiLabs is an Equal Opportunity Employer, committed to diversity and excellence. Apply now by sending your resume and portfolio to careers@hilabs.com and be part of our impactful journey. Thank you for considering HiLabs. We look forward to the impact you will make!
Posted 3 days ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
We are looking for a Senior Delivery Manager to lead a cross-functional team in implementing and configuring product solutions for enterprise-level clients. This role requires strong technical leadership, people management, and project ownership skills, ensuring quality checks and innovative approaches to streamline delivery. Collaboration with internal engineering, R&D, QA, and support teams is crucial, with high-level requirements coming from Technical Program Managers. As a Senior Delivery Manager, you will be responsible for: Strategic Leadership & Team Management: - Managing, mentoring, and inspiring a team of engineers and project leads for high performance, engagement, and growth - Developing and optimizing team structures, resource planning, and talent development strategies - Fostering a culture of ownership, accountability, innovation, and continuous learning Senior-Level Project Delivery: - Translating complex requirements from Technical Program Managers into project roadmaps and execution plans - Overseeing multiple implementation projects, ensuring timely and high-quality releases - Communicating progress, risks, and mitigation strategies with senior stakeholders and leadership Quality Assurance & Release Management: - Establishing and upholding rigorous quality controls, including functionality, data integrity, and performance - Collaborating with QA resources on testing strategies, automation frameworks, and release protocols - Driving continuous improvement in release readiness processes and performance metrics Technical Innovation & Problem-Solving: - Providing technical guidance for complex challenges, from code-level debugging to data-level analysis - Collaborating with R&D to seamlessly integrate new product features for enterprise-scale use - Introducing cutting-edge tools like GenAI to automate and simplify implementation workflows Process Optimization & Automation: - Identifying and implementing strategic improvements to boost delivery speed and maintain quality - Promoting documentation, best practices, and standardized procedures for efficiency and scalability - Leveraging metrics and user feedback to evolve best-in-class delivery methodologies Requirements: - 10+ years of experience in software delivery or technical program management - Proven track record of managing large-scale projects in fast-paced environments - Bachelor's degree in Computer Science, Engineering, or related field; advanced degree is a plus About You: - Ability to work effectively with all levels within the organization - Passion for a clean energy future and innovation - Results-driven with a focus on quality and precision - Strong leadership, communication, and stakeholder management skills Perks: - Growth potential with a startup - Collaborative environment focused on a clean energy future - Unique tools provided for success - Group Health Insurance, Internet/Telephone Reimbursement, Professional Development Allowance, Gratuity, and more - Mentorship programs and flexible work arrangements Diversity, Equity, Inclusion, and Equal Opportunity: Bidgely is an equal-opportunity employer committed to diversity and equal opportunity. Hiring decisions are based on skills, talent, and passion. Join Bidgely to be part of a team that values diversity and strives to build a better future and workforce.
Posted 3 days ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
SAIGroup is a private investment firm that has committed $1 billion to incubate and scale revolutionary AI-powered enterprise software application companies. Our portfolio consists of rapidly growing AI companies catering to over 2,000+ major global customers, generating almost $800 million in annual revenue, and employing a global workforce of over 4,000 individuals. We invest in new ventures based on breakthrough AI-based products that have the potential to disrupt existing enterprise software markets. Our latest investment, JazzX AI, is a pioneering technology company focused on shaping the future of work through an AGI platform designed for the enterprise. JazzX AI is not just creating another AI tool; it is reimagining business processes to enable seamless collaboration between humans and intelligent systems. This transformation leads to a significant increase in productivity, efficiency, and decision velocity, empowering enterprises to become industry leaders and set new standards for innovation and excellence. **Client Technical Solutions Architect (AI Solutions Deployment)** **Role Overview:** The Client Technical Solutions Architect is a strategic, client-facing role responsible for deploying and customizing advanced AI solutions on top of our company's AI platform. In this hands-on position, you will collaborate closely with clients to understand their business challenges and translate them into effective technical solutions. You will also work internally with product and engineering teams to ensure the AI platform is configured and extended to provide maximum value for each client. The ideal candidate possesses strong technical aptitude and excellent stakeholder management skills to ensure the smooth delivery of complex AI solutions aligned with client objectives across various industries. **Responsibilities:** - **Deploy and Customize AI Solutions:** Implement and tailor our platform's AI capabilities to address clients' specific business problems, both on-site at client locations and through remote collaboration. - **Client Engagement and Requirements Gathering:** Work directly with client stakeholders to comprehend their requirements, pain points, and goals. Translate business needs into technical solutions and actionable project plans to ensure alignment between client expectations and delivered functionality. - **Solution Design and Implementation:** Design end-to-end workflows and AI solution architectures that integrate with clients' existing systems. Develop and test custom modules, data pipelines, and prompts on our platform to meet use-case requirements. - **Internal Collaboration:** Partner with internal product managers and engineering teams to enhance platform features and capabilities. Provide feedback from field deployments to influence product roadmaps and enhance our AI platform's effectiveness for all clients. - **Project Management and Delivery:** Oversee the deployment lifecycle from proof-of-concept to production rollout at client sites. Ensure timely delivery of solutions meeting quality standards while proactively managing technical issues or risks. - **Stakeholder Management:** Manage relationships with client technical teams and business leaders. Communicate progress, present solution demos, and adjust plans based on stakeholder feedback. Act as a trusted technical advisor aligning the AI solution with clients' strategic objectives and demonstrating value.
- **Training and Knowledge Transfer:** When necessary, provide hands-on training and guidance to client teams on using and maintaining the AI solutions. Document configurations and best practices to enable clients to become self-sufficient with the deployed technology. **Requirements:** - **Education and Experience:** Bachelor's degree in Computer Science, Engineering, or related field. 7+ years of experience in software engineering, data engineering, or a similar technical role involving building or deploying software solutions. - **AI Expertise:** Practical experience with AI/ML technologies, particularly Natural Language Processing. Familiarity with prompt engineering techniques and working with large language models or conversational AI systems is highly desirable. - **Technical Proficiency:** Strong programming and scripting skills (e.g., Python, Java) to customize solutions and integrate systems. Experience with APIs, data pipelines, and cloud platforms or ML frameworks for deploying AI solutions. - **Solution Architecture Skills:** Ability to design scalable, secure, and maintainable system architectures and workflows. Experience integrating AI solutions into existing enterprise environments (data sources, databases, APIs, etc.). - **Problem-Solving:** Proven ability to troubleshoot technical issues and optimize AI models or pipelines. Creative mindset to adapt and tailor solutions to unique client scenarios, focusing on delivering business value. - **Communication and Stakeholder Management:** Excellent communication skills to explain complex technical concepts to both technical and non-technical stakeholders. Experience gathering requirements and maintaining client relationships in a professional services or customer-facing capacity. - **Team Collaboration:** Comfortable working in cross-functional teams and fast-paced environments. Able to coordinate between client personnel and internal teams, balancing priorities and ensuring alignment on solution delivery.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
Andhra Pradesh
On-site
You will be responsible for assembling large, complex sets of data that meet non-functional and functional business requirements. You will develop and maintain scalable data pipelines, as well as build new API integrations to support increasing data volume and complexity. Collaboration with analytics and business teams is crucial to improve data models feeding business intelligence tools, increasing data accessibility, and promoting data-driven decision making across the organization. Your role will involve building the necessary infrastructure for optimal extraction, transformation, and loading of data from various sources using SQL and other technologies. You will implement processes and systems to monitor data quality, ensuring accurate production data is always available for key stakeholders and business processes. Additionally, you will write unit/integration tests, contribute to the engineering wiki, and document your work. Data analysis will be a key part of your responsibilities to troubleshoot data-related issues and aid in resolving them. Working closely with frontend and backend engineers, product managers, and analysts is essential. You will define company data assets, develop Spark, SparkSQL, and HiveSQL jobs to populate data models, design data integrations, and establish a data quality framework. Collaboration with all business units and engineering teams will be necessary to devise a strategy for long-term data platform architecture. Moreover, you will create analytical tools utilizing the data pipeline to provide actionable insights into key business performance metrics such as operational efficiency and customer acquisition. Qualifications / Skills: - Knowledge of best practices and IT operations for an always-up, always-available service - Experience with Agile Software Development methodologies - Previous experience as a data engineer or similar role - Technical expertise in data models, data mining, and segmentation techniques - Proficiency in programming languages such as Java and Python - Hands-on experience with SQL database design - Strong numerical and analytical skills - Excellent problem-solving and troubleshooting abilities - Process-oriented with exceptional documentation skills - Outstanding oral and written communication skills coupled with a strong customer service orientation.
Posted 5 days ago
9.0 - 13.0 years
0 Lacs
Maharashtra
On-site
The Data Engineering team within the AI, Data, and Analytics (AIDA) organization plays a crucial role in supporting data-driven sales and marketing operations by providing a strong foundation for transformative insights and data innovation. The team focuses on integration, curation, quality, and data expertise across diverse sources to develop world-class solutions that advance Pfizer's mission of making a global impact through data-driven decision-making. As a Senior Data Solutions Engineering Senior Manager, you will be responsible for designing and developing robust, scalable data models to optimize the consumption of data sources and generate unique insights from Pfizer's extensive data ecosystems. Your technical expertise will be essential in collaborating with engineering and developer team members to create and maintain data capabilities that enable advanced analytics and data-driven decision-making. In this role, you will work closely with stakeholders to understand their needs and design end-to-end data solutions. This includes creating data models and pipelines, establishing robust CI/CD procedures, and implementing the right architecture to build reusable data products and solutions to support various analytics use cases. Key Responsibilities: - Project solutioning, scoping, and estimation - Data sourcing, investigation, and profiling - Prototyping and design thinking - Designing and developing data pipelines and complex data workflows - Creating standard procedures for efficient CI/CD - Developing data quality and integrity standards and controls - Collaborating with internal and external partners to deliver best-in-class data products globally - Demonstrating outstanding collaboration and operational excellence - Driving best practices and world-class product capabilities Qualifications: - Bachelor's degree in a technical area such as computer science, engineering, or management information science (Master's degree preferred) - 9+ years of combined data warehouse/data lake experience as a data lake/warehouse developer or data engineer - Experience in developing data products and features for analytics and AI use cases - Domain knowledge in the pharmaceutical industry preferred - Good knowledge of data governance and data cataloging best practices Technical Skillset: - Proficiency in SQL, Python, and object-oriented scripting languages for building data pipelines - Experience in designing and delivering data lake/data warehousing projects - Hands-on experience in data modeling - Working knowledge of cloud native SQL and NoSQL database platforms (Snowflake experience desirable) - Familiarity with AWS services such as EC2, EMR, RDS, Spark - Understanding of Scrum/Agile methodologies and CI/CD practices - Knowledge of data privacy standards, governance principles, and industry practices compliance Pfizer is an equal opportunity employer that values diversity and complies with all applicable equal employment opportunity legislation.
Posted 5 days ago
5.0 - 7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.
Why Join Us
To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated, and we know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We're building a more open world. Join us.
Data Engineer III
Introduction to the Team
Expedia Technology teams partner with our Product teams to create innovative products, services, and tools to deliver high-quality experiences for travelers, partners, and our employees. A singular technology platform powered by data and machine learning provides secure, differentiated, and personalized experiences that drive loyalty and traveler satisfaction.
Expedia Group is seeking a skilled and motivated Data Engineer III to join our Finance Business Intelligence team supporting the Product & Technology Finance organization. In this role, you will help drive data infrastructure and analytics solutions that support strategic financial planning, reporting, and operational decision-making across the Global Finance community. You'll work closely with Finance and Technology partners to ensure data accuracy, accessibility, and usability in support of Expedia's business objectives.
As a Data Engineer III, you have strong experience working with a variety of datasets, data environments, tools, and analytical techniques. You enjoy a fun, collaborative, and stimulating team environment. Successful candidates should be able to own projects end-to-end, including identifying problems and solutions, building and maintaining data pipelines and dashboards, distilling key insights, and communicating them to stakeholders.
In This Role, You Will
Develop new and improve existing end-to-end Business Intelligence products (data pipelines, Tableau dashboards, and machine learning predictive forecasting models). Drive internal efficiencies through streamlined code/documentation/Tableau development to maintain high data integrity. Troubleshoot and resolve production issues with the team's products (automation opportunities, optimizations, back-end data issues, data reconciliations). Proactively reach out to subject matter experts/stakeholders and collaborate to solve problems. Respond to ad hoc data requests and conduct analysis to provide valuable insights to stakeholders. Collaborate and coordinate with team members/stakeholders to translate complex data into meaningful insights that improve the analytical capabilities of the business. Apply knowledge of database design to support migration of data pipelines from on-prem to a cloud environment (including data extraction, ingestion, and processing of large data sets). Support dashboard development in the cloud environment to enable self-service reporting.
Communicate clearly on current work status and design considerations. Think broadly and comprehend the how, why, and what behind data architecture designs.
Experience & Qualifications
Bachelor's in Computer Science, Mathematics, Statistics, Information Systems, or a related field. 5+ years of experience in a Data Analyst, Data Engineer, or Business Analyst role. Proven expertise in SQL, with practical experience utilizing query engines including SQL Server, Starburst, Trino, and Querybook, and data science tools such as Python/R and SparkSQL. Proficient visualization skills (Tableau, Looker, or similar) and Excel modeling/report automation. Exceptional understanding of relational and dimensional datasets, data warehousing, and data mining, and applies database design principles to solve data requirements. Experience building robust data extract, load, and transform (ELT) processes that source data from multiple databases. Demonstrated record of defining and executing key analyses and solving problems with minimal supervision. Dynamic individual contributor who consistently enhances operational playbooks to address business problems. 3+ years working in a hybrid environment that uses both on-premise and cloud technologies is preferred. Experience working in an environment that manipulates large datasets on a cloud platform is preferred. Background in analytics, finance, or a comparable reporting and analytics role is preferred.
Accommodation requests
If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named as a Best Place to Work on Glassdoor in 2024 and be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.
Expedia Group's family of brands includes: Brand Expedia, Hotels.com, Expedia Partner Solutions, Vrbo, trivago, Orbitz, Travelocity, Hotwire, Wotif, ebookers, CheapTickets, Expedia Group Media Solutions, Expedia Local Expert, CarRentals.com, and Expedia Cruises. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50
Employment opportunities and job offers at Expedia Group will always come from Expedia Group's Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you're confident who the recipient is. Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs.
Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
The role at Prudential involves performing strategic analysis of structured and unstructured data from various sources. You will be responsible for developing data structures and pipelines to organize, collect, cleanse, and standardize data to generate actionable insights and address reporting needs. Your role will also include defining data requirements, gathering and validating information, and supporting the creation of Data Quality rules for formal governance. Additionally, you will identify innovative opportunities to develop data insights, maximize data usage, and improve business performance. Acting as a Subject Matter Expert, you will provide complex Business Intelligence solutions involving SQL and Python, guiding junior team members and engaging extensively with users to understand and translate their requirements into user-friendly dashboards and insights. In this position, you will be expected to up-skill continuously, venture into Advanced Analytics, and become familiar with data science languages like Python and R. You will also be responsible for identifying and managing risks within your area of responsibility, including resolving blockers and bottlenecks. To be successful in this role, you should hold a university degree in Computer Science, Data Analysis, or a related field, along with a minimum of 4 years' experience as a data analyst. Experience in analyzing mobile applications data, preparing business reports, and possessing exceptional analytical skills are essential. You should have a good understanding of the power and value of data, the ability to apply technology solutions to meet business needs, and assess stakeholder requirements to enhance customer experience. Moreover, you must display resilience under pressure, provide high-quality solutions, meet deadlines consistently, and effectively handle requests and queries from senior management. Technical requirements include demonstrable experience in data-related roles, knowledge of ETL processes, data warehousing principles, and expertise in data visualization using advanced Tableau skills. Proficiency in SQL, Python, or Scala is necessary, along with familiarity with business tools like JIRA and Confluence. This role demands flexibility to work with various technologies and a commitment to continuous learning and development.
Posted 1 week ago
7.0 - 15.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an experienced professional in the field of technology leadership, your role as part of our team will involve shaping and implementing the enterprise-wide AI and automation strategy in alignment with Cyient's digital transformation objectives. Collaborating with various business units such as Sales, HR, Finance, Engineering, Operations, and IT, you will identify high-impact AI opportunities and define an AI adoption roadmap for both short-term wins and long-term transformation initiatives. You will take charge of designing, developing, and deploying AI agents to automate knowledge-intensive and repetitive processes, focusing on creating custom AI agent ecosystems that seamlessly integrate with our existing systems and workflows. By evaluating emerging technologies like multi-agent orchestration platforms, LLMs, and generative AI, you will contribute towards enhancing our enterprise deployment capabilities. Driving process re-engineering efforts, you will leverage AI, ML, and advanced analytics to maximize automation outcomes and deliver measurable improvements in productivity, efficiency, quality, and customer experience. Establishing governance frameworks for AI-enabled automation will be essential to ensure scalability, security, and compliance within our operations. Acting as a trusted advisor to the executive leadership team, you will provide insights on AI opportunities and risks while fostering a culture of AI awareness and adoption through structured communication and engagement activities. Leading change management initiatives will be crucial to ensure the seamless adoption of AI-driven automation solutions across all organizational functions. Your responsibilities will also involve building and leading a high-performing AI Automation Center of Excellence (CoE) within Cyient, where you will mentor teams in AI technologies, engineering, ML model development, and AI agent orchestration. Additionally, you will establish partnerships with technology providers, startups, and academic institutions to accelerate AI innovation. Ensuring compliance with ethical standards, data privacy laws, and responsible AI practices will be a key focus, along with establishing guardrails for transparency, fairness, and accountability in AI-driven decision-making processes. Key Deliverables - Develop an AI Adoption Roadmap and business case documentation. - Deploy enterprise AI agent ecosystem across multiple functions. - Achieve annual productivity improvements and ROI from automation initiatives. - Foster the development of AI skillsets and culture across the organization. Required Skills & Qualifications Education: - Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred). - Certifications in AI/ML, Data Science, or Automation will be advantageous. Experience: - 15+ years of experience in technology leadership roles, with at least 7 years in AI/ML and Intelligent Automation. - Proven track record in deploying AI/automation solutions at scale across global enterprises. - Experience in building AI agents, LLM-based solutions, and orchestration platforms. Technical Skills: - Deep expertise in AI/ML, NLP, GenAI (LLMs), RPA, Process Mining, and intelligent automation. - Familiarity with cloud AI platforms (Azure AI, AWS SageMaker, GCP AI), orchestration tools, APIs, and data pipelines. - Strong understanding of data architecture, security, and compliance in AI systems. Leadership Skills: - Exceptional stakeholder management, communication, and influencing skills. 
- Ability to lead cross-functional, multi-geography teams and drive complex transformation initiatives. - Strategic thinker with a strong execution focus. Join us in shaping the future of AI and automation at Cyient, where your expertise and leadership will play a pivotal role in driving innovation and transformation across the organization.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a Senior Data Platform Engineer responsible for leading the design, development, and optimization of the data platform infrastructure. Your primary focus will be on driving scalability, reliability, and performance across data systems to enable data-driven decision-making at scale. Working closely with data engineers, analysts, and product teams, you will play a crucial role in enhancing the overall data platform. Your responsibilities will include architecting and implementing scalable, secure, and high-performance data platforms on AWS cloud using Databricks. You will be building and managing data pipelines and ETL processes utilizing modern data engineering tools such as AWS RDS, REST APIs, and S3-based ingestions. Monitoring and maintaining production data pipelines, along with working on enhancements, will be essential tasks. Optimizing data systems for improved performance, reliability, and cost efficiency will also fall under your purview. Implementation of data governance, quality, and observability best practices in line with Freshworks standards will be a key focus area. Collaboration with cross-functional teams to support diverse data needs is also a critical aspect of this role. Qualifications for this position include a Bachelor's/Master's degree in Computer Science, Information Technology, or a related field. You should have good exposure to data structures and algorithms, coupled with proven backend development experience using Scala, Spark, or Python. A strong understanding of REST API development, web services, and microservices architecture is essential. Experience with Kubernetes and containerized deployment is considered a plus. Proficiency in working with relational databases like MySQL, PostgreSQL, or similar platforms is required. A solid understanding and hands-on experience with AWS cloud services are also important. Knowledge of code versioning tools such as Git and Jenkins is necessary. Excellent problem-solving skills, critical thinking, and keen attention to detail will be valuable assets in this role.
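One of the duties above is REST API and S3-based ingestion; a minimal sketch of that landing pattern follows, with the API endpoint, bucket, and key layout invented purely for illustration.

```python
# Minimal sketch of a "REST API to S3 raw zone" ingestion step.
# Endpoint, bucket, and key layout are hypothetical.
import json
from datetime import date

import boto3
import requests

API_URL = "https://api.example.com/v1/tickets"  # assumed source endpoint
BUCKET = "example-raw-zone"                     # assumed landing bucket

def ingest_to_s3() -> str:
    response = requests.get(
        API_URL, params={"updated_since": str(date.today())}, timeout=30
    )
    response.raise_for_status()
    key = f"tickets/ingest_date={date.today()}/payload.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(response.json()).encode("utf-8"),
    )
    return key

if __name__ == "__main__":
    print("Landed raw payload at", ingest_to_s3())
```

Downstream, a Databricks or Spark job would typically pick the landed files up from the raw zone for transformation.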
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
As a software engineer at Google, you will play a crucial role in developing cutting-edge technologies that revolutionize the way billions of users worldwide connect, explore, and engage with information. Our projects require handling massive amounts of data and go beyond traditional web search. We are seeking talented engineers who can bring innovative ideas from various domains such as information retrieval, distributed computing, system design, networking, security, artificial intelligence, UI design, and more. You will have the opportunity to work on vital projects that cater to Google's evolving needs, with the flexibility to switch between teams and projects as both you and our dynamic business progress. Versatility, leadership skills, and a passion for tackling new challenges across the entire technology stack are essential qualities we look for in our engineers as we continue to drive technological advancements. In the realm of Google Search, we are reshaping the concept of information retrieval in diverse ways and locations. To achieve this, we must surmount complex engineering obstacles, expand our infrastructure, and uphold a universally accessible and valuable user experience that people across the globe depend on. By joining the Search team, you will have the chance to make a significant impact on billions of individuals worldwide. Your responsibilities will include leading a team of software engineers, providing guidance in planning, design, execution, and mentorship. You will take charge of resolving substantial technical challenges that span multiple teams and components, guiding your team systematically towards solutions. Understanding intricate technical details and facilitating connections between teams to tackle complex technical issues within this domain will be a key aspect of your role. Additionally, you will need to adapt quickly to changes and advancements, ensuring you stay abreast of all progress within the field.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
You will be responsible for leading the migration of SQL-based data warehouses to Snowflake using SnowConvert. Your role will involve analyzing existing SQL code, stored procedures, and ETL jobs to design equivalent Snowflake solutions. Additionally, you will optimize Snowflake performance through best practices in partitioning, clustering, and caching. Collaboration with data architects, analysts, and business stakeholders will be crucial to ensure successful delivery. Providing technical leadership, mentoring, and code reviews within the team is also expected from you. The ideal candidate for this role should possess strong expertise in Snowflake development and architecture. Hands-on experience with SnowConvert for automated SQL migration is a must. Deep understanding of SQL, procedural logic, and performance tuning is essential. Experience with cloud platforms such as AWS, Azure, or GCP, and data integration tools is required. Excellent problem-solving skills and communication abilities are highly valued. Experience in CI/CD for data pipelines and Snowflake certification will be considered as additional advantages for this position.
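SnowConvert handles the code conversion itself; as a hedged illustration of the post-migration validation work such a project typically includes, here is a small row-count reconciliation sketch, with all connection details and table names assumed for the example.

```python
# Post-migration validation sketch: compare row counts between a legacy SQL Server
# table and its migrated Snowflake counterpart. Connection details and table names
# are assumptions; the conversion itself (via SnowConvert) is not shown here.
import pyodbc
import snowflake.connector

def sqlserver_count(table: str) -> int:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=legacy-host;DATABASE=DW;Trusted_Connection=yes"  # assumed DSN details
    )
    with conn:
        return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def snowflake_count(table: str) -> int:
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",  # assumed credentials
        database="DW", schema="PUBLIC",
    )
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]

if __name__ == "__main__":
    for table in ["FACT_SALES", "DIM_PRODUCT"]:  # assumed tables in migration scope
        src, tgt = sqlserver_count(table), snowflake_count(table)
        status = "OK" if src == tgt else "MISMATCH"
        print(f"{table}: source={src} target={tgt} -> {status}")
```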
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
You will be joining Zoca, a company dedicated to revolutionizing the growth process for local businesses. Zoca's AI agents are designed to assist talented beauty professionals in expanding their reach by managing all their marketing needs automatically. With a focus on hyperlocal search optimization, 24/7 lead conversion, and retention campaigns, Zoca has already made a significant impact, having raised $6M in funding led by Accel, helped over 1,000 businesses achieve full booking capacity, and generated more than $10M in revenue for local entrepreneurs. As an AI Product Manager at Zoca, you will have a unique and multifaceted role in the development, optimization, and scaling of one of the core AI agents. Your primary objective will be to elevate the current state of the agent to a world-class, autonomous system that guarantees positive outcomes for local businesses. This position goes beyond the traditional product management role, requiring you to blend elements of product management, AI research, customer advocacy, and execution prowess to enhance the intelligence, reliability, and impact of the AI agents. You will take ownership of either the Lead Generation Agent, Retention/Loyalty Agent, or Local SEO Agent based on your experience and interest. Your responsibilities will include driving the end-to-end development of your assigned AI agent, collaborating with stakeholders to implement customer-centric improvements, optimizing agent performance with the help of ML engineers and data science, working cross-functionally to ship agent enhancements, and strategizing the evolution of your agent over the coming months. Additionally, you will be accountable for tracking and reporting on the impact of your agent through various performance metrics, business outcomes, and customer satisfaction levels. To excel in this role, you should possess 2-4 years of experience in product management, AI/ML product development, or technical program management. Direct experience in building or optimizing AI-powered products is essential, along with a solid understanding of machine learning workflows, model training, and performance optimization. Your ability to collaborate effectively with engineering teams, ship products that address real customer needs, and communicate technical concepts to non-technical stakeholders will be crucial for success. Furthermore, familiarity with LLMs, prompt engineering, conversational AI systems, AI agent frameworks, and automation workflows will be beneficial. Your comfort level with working on APIs, data pipelines, and technical integrations, along with your proficiency in metrics analysis, cross-functional collaboration, and product management, will be integral to driving the growth and impact of the AI agents at Zoca.
Posted 1 week ago
8.0 - 13.0 years
0 Lacs
India
On-site
Job Overview
The Sr. Engineering Manager/Engineering Manager will oversee the technological direction and strategy of the company. This key role requires a blend of strategic vision and technical expertise to align the company's technology with its business goals. The Sr. Engineering Manager/Engineering Manager will lead the technology and product development teams, drive innovation, and ensure the company remains competitive through the adoption of cutting-edge technologies. They will also collaborate with other senior leaders to create scalable systems, establish high-performance engineering practices, and foster a culture of continuous improvement and technological excellence.
Key Responsibilities
Technology Strategy: Define and execute the company's technology vision, strategy, and roadmap aligned with business objectives.
Leadership & Management: Lead and manage a team of off-chain engineers, providing guidance, mentorship, and support to help them achieve their goals and objectives.
Innovation & R&D: Identify emerging technology trends and guide the organisation in adopting innovative solutions to drive growth and efficiency.
Product & Development: Oversee the development of new products and services, ensuring alignment with customer needs and market demands.
Collaboration with Other Functions: Work closely with marketing, sales, and operations teams to ensure seamless integration of technology into the broader business strategy.
Budget & Resource Allocation: Manage the technology budget, ensuring that financial resources are allocated effectively to meet organizational goals, and recruit and onboard new engineering talent to support the team's growth and expansion.
Performance Review & Feedback: Conduct regular performance reviews and provide feedback to help team members grow and develop in their roles.
Technical Operations: Ensure the reliability, security, and scalability of technology systems and infrastructure.
Data Security & Compliance: Maintain and improve cybersecurity practices and ensure compliance with industry standards and regulations.
Stakeholder Communication: Regularly update executive leadership, board members, and stakeholders on technological developments, challenges, and opportunities.
Qualifications
Educational & Professional Background: Bachelor's or Master's degree in Computer Science, Engineering, or a related field (or equivalent experience); 8-13 years of experience in software engineering, prototyping, product innovation, or events platforms; experience taking a product from 0 to 1, scaling a startup, and building a team.
Technical Expertise: Proficient in at least one primary programming language (e.g., JavaScript, TypeScript, Go); demonstrated ability to quickly pick up new languages and frameworks (e.g., Go, React, Node.js, TensorFlow); familiarity with on-chain products and prior on-chain work is a significant plus; familiarity with AI/ML, data pipelines, or cloud infrastructure is a plus.
Innovation & Experimentation Mindset: Proven track record of rapid prototyping, hackathons, or personal side projects; comfortable with failure, able to pivot quickly, and thrives in a fast-changing environment; able to balance short-term experimentation with long-term product vision.
Collaboration & Communication: Excellent communication skills, capable of distilling complex ideas into clear recommendations; experience working in cross-functional teams and presenting ideas to stakeholders.
Growth & Curiosity: High level of intellectual curiosity and a desire to keep learning; ability to research, adapt, and implement new technologies ahead of the curve.
Preferred Qualities
Experimental Mindset: You focus on learning fast rather than chasing perfection from day one.
Curiosity: You always ask "Why?" and "What if?"
Technical Range: You can quickly move from writing front-end code for a new feature to spinning up a simple backend service.
Bias for Action: You would rather build, test, fail, and iterate than debate.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly motivated and skilled Python Data Science Professional to join our dynamic Production Management AIOps team. In this role, you will leverage data science techniques and Python to develop innovative solutions that optimize production processes, enhance predictive capabilities, and integrate AI/ML models into operational workflows. The ideal candidate is a passionate data scientist with a strong understanding of Python's data science ecosystem, experience in building and deploying machine learning models, and a desire to work with large datasets in a fast-paced, collaborative environment.
Responsibilities:
Data Analysis & Modeling:
- Analyze large, complex datasets to identify trends, patterns, and anomalies.
- Develop and implement machine learning models using Python libraries to address business challenges related to production optimization, predictive maintenance, and anomaly detection.
- Evaluate and refine model performance, ensuring accuracy and reliability.
- Deploy and maintain machine learning models in production environments.
Data Engineering & Pipelines:
- Design and implement data pipelines to collect, process, and transform data from various sources.
- Work with structured and unstructured data, ensuring data quality and integrity.
- Collaborate with data engineers to integrate data science solutions into existing systems.
Visualization & Communication:
- Create clear and compelling visualizations to communicate data insights to technical and non-technical audiences.
- Present findings and recommendations to stakeholders, effectively conveying the value of data-driven solutions.
Collaboration & Continuous Learning:
- Participate actively in agile development processes, collaborating with other data scientists, engineers, and product managers.
- Stay up-to-date with the latest advancements in data science, machine learning, and AI.
- Contribute to a positive and collaborative team environment by sharing knowledge and supporting colleagues.
Qualifications:
Technical Skills:
- 5+ years of proven experience as a Data Scientist or similar role, with a strong focus on Python.
- Proficiency in Python and essential data science libraries such as pandas, NumPy, scikit-learn, TensorFlow, and PyTorch.
- Experience with statistical modeling, machine learning algorithms, and data mining techniques.
- Strong data visualization skills using libraries like Matplotlib, Seaborn, or Plotly.
- Experience with data engineering tools and techniques such as SQL, Spark, and cloud-based data warehousing solutions.
- Knowledge of version control systems like Git.
- Familiarity with agile development methodologies.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Ability to manage multiple tasks and prioritize effectively.
- Passion for data science and a desire to learn and grow.
Education:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
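As a hedged illustration of the anomaly-detection work listed above, the minimal Python sketch below trains a scikit-learn IsolationForest on hypothetical production telemetry; the metric names, sample values, and contamination rate are assumptions for illustration, not details from the posting.

import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical production telemetry: one row per batch with a few health metrics.
metrics = pd.DataFrame({
    "cpu_util":   [0.42, 0.45, 0.40, 0.95, 0.43, 0.41],
    "error_rate": [0.01, 0.02, 0.01, 0.30, 0.01, 0.02],
    "latency_ms": [120,  130,  118,  900,  125,  122],
})

# Isolation Forest flags points that are easy to isolate as likely anomalies.
model = IsolationForest(contamination=0.2, random_state=42)
model.fit(metrics)

# predict() returns 1 for normal rows and -1 for suspected anomalies.
metrics["anomaly"] = model.predict(metrics)
print(metrics[metrics["anomaly"] == -1])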
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
We are seeking an experienced Lead Data Engineer with strong proficiency in Google BigQuery and a working knowledge of Looker to support a data modernization project. In this role, you will lead efforts to revamp the data architecture, optimize data pipelines, and meet reporting requirements, while mentoring an internal client developer to ensure a seamless transition after the engagement ends. Your responsibilities will include assessing and enhancing existing data models, pipelines, and architecture within BigQuery, optimizing data flows and storage for performance, scalability, and cost-efficiency, and implementing best practices in data engineering and governance. You will also review and improve data collection and transformation processes to ensure high-quality, consistent, and reliable data, enhance and maintain existing Looker reports and dashboards, and design new reports as business requirements evolve. Collaboration with business and technical stakeholders is essential: you will gather data and reporting needs, translate business requirements into technical deliverables, and guide the internal client developer to ensure a smooth handover for long-term maintainability after the project's completion. The ideal candidate has strong experience in Google BigQuery, data modeling, query optimization, and performance tuning; proficiency in building and managing data pipelines and ETL/ELT workflows; solid SQL skills; experience creating and modifying Looker dashboards and an understanding of LookML; and familiarity with version control (e.g., Git) and CI/CD for data solutions. The ability to work in Agile environments and with remote teams is necessary, exposure to GCP services beyond BigQuery (e.g., Dataflow, Cloud Functions) is highly valued, and experience mentoring or coaching team members is a plus. In short, the successful candidate combines a solid technical foundation in data engineering, BigQuery, Looker, and related tools with excellent communication and interpersonal skills and a proven ability to work independently in a client-facing role.
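To make the BigQuery optimization work above concrete, here is a minimal Python sketch (using the google-cloud-bigquery client) that dry-runs a query to estimate scanned bytes and then creates a partitioned, clustered copy of a table; the project, dataset, table, and column names are hypothetical placeholders, not details from the posting.

from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Cost check: a dry run reports bytes that would be scanned without executing the query.
sql = """
    SELECT order_date, region, SUM(amount) AS revenue
    FROM `my_project.analytics.orders`
    WHERE order_date >= '2024-01-01'
    GROUP BY order_date, region
"""
dry_run = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False))
print(f"Query would scan {dry_run.total_bytes_processed / 1e9:.2f} GB")

# Partitioning by date and clustering by region lets BigQuery prune data for queries like the one above.
ddl = """
    CREATE TABLE IF NOT EXISTS `my_project.analytics.orders_optimized`
    PARTITION BY order_date
    CLUSTER BY region AS
    SELECT * FROM `my_project.analytics.orders`
"""
client.query(ddl).result()  # wait for the DDL job to finish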
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Full-Stack Developer with 1 to 4 years of experience, you will develop scalable, high-performance applications using JavaScript, TypeScript, Node.js, React, Angular, and Redux. You will have hands-on experience creating microservice architectures, serverless architectures, data pipelines, and message queue systems, will work with both NoSQL and RDBMS databases, and will be proficient in containerization using Docker and Kubernetes. Key skills include expertise in JavaScript and TypeScript with a focus on building front-end applications in Angular or React, competence in Node.js with frameworks such as Express.js or Sails.js, and, preferably, familiarity with Python or Go along with practical experience designing microservice architectures and MonoRepo structures. You will lead technical architecture discussions and design solutions, drawing on a strong understanding of software design and architecture patterns to create robust and scalable systems. An AWS or Azure certification with experience in cloud development services is a must for this role. Desirable extras include hands-on experience with GenAI projects, LLM frameworks, and GenAI tools across the development lifecycle, proficiency in Python programming, excellent communication skills, and the ability to work as a self-driven individual contributor who ensures high-quality deliverables.
Posted 1 week ago