4.0 - 8.0 years
0 Lacs
karnataka
On-site
As an Odoo Developer at CGI, you will be responsible for developing and customizing solutions on the Odoo Platform v15+. With a minimum of 4 years of experience, you will have a deep understanding of Odoo modules, architecture, APIs, and the ability to integrate Odoo with other systems and data sources. You will work on Odoo deployments with more than 1000 logged-in users, ensuring scalability for a large number of users and transactions. Proficiency in Python is essential, and experience with other programming languages such as Java or Scala is a plus.

In this role, you will have the opportunity to analyze and interpret complex data sets, utilize data visualization tools like Superset, and work with technologies such as Cassandra and Presto for data analysis and reporting. Your experience with SQL, relational databases like PostgreSQL or MySQL, ETL tools, and data warehousing concepts will be crucial for success. Familiarity with big data technologies like Hadoop and Spark is advantageous.

DevSecOps practices are integral to the role, requiring experience with containerization, Docker, Kubernetes clusters, and CI/CD using GitLab. Knowledge of SCRUM and Agile methodologies is essential, as well as proficiency in Linux/Windows operating systems and tools like Jira, GitLab, and Confluence.

As a successful candidate, you will demonstrate strong problem-solving and analytical skills, effective communication and collaboration abilities, attention to detail, and a commitment to data quality. You will thrive in a fast-paced, dynamic environment and contribute to turning meaningful insights into action.

At CGI, you will be part of a team that values ownership, teamwork, respect, and belonging. You will have the opportunity to shape your career, develop innovative solutions, and access global capabilities while being supported by leaders who care about your well-being and growth. Join CGI as a partner and contribute to one of the largest IT and business consulting services firms in the world.
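The listing above centres on customizing Odoo modules and integrating Odoo with other systems through its APIs. As a minimal, hedged illustration of that integration work, the sketch below reads records through Odoo's external XML-RPC API from Python; the URL, database name, credentials, and search filter are placeholders for illustration, not details from the posting.

```python
import xmlrpc.client

# Hypothetical connection details -- replace with a real Odoo instance.
URL = "https://odoo.example.com"
DB, USER, PASSWORD = "mydb", "admin@example.com", "secret"

# Authenticate against the common endpoint to obtain a user id.
common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
uid = common.authenticate(DB, USER, PASSWORD, {})

# Query a handful of company partners through the object endpoint.
models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")
partners = models.execute_kw(
    DB, uid, PASSWORD,
    "res.partner", "search_read",
    [[["is_company", "=", True]]],
    {"fields": ["name", "email"], "limit": 5},
)
for partner in partners:
    print(partner["name"], partner.get("email"))
```

The same `execute_kw` pattern covers create/write/unlink calls, which is typically how an external system would push data into Odoo.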
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
The role of Staff Engineer - Data in SonyLIV's Digital Business is to lead the data engineering strategy, architect scalable data infrastructure, drive innovation in data processing, ensure operational excellence, and build a high-performance team to enable data-driven insights for OTT content and user engagement. This position is based in Mumbai and requires a minimum of 8 years of experience in the field.

Responsibilities include defining the technical vision for scalable data infrastructure using modern technologies like Spark, Kafka, Snowflake, and cloud services, leading innovation in data processing and architecture through real-time data processing and streaming analytics, ensuring operational excellence in data systems by setting and enforcing standards for data reliability and privacy, building and mentoring a high-caliber data engineering team, collaborating with cross-functional teams, and driving data quality and business insights through automated quality frameworks and BI dashboards.

The successful candidate should have 8+ years of experience in data engineering, business intelligence, and data warehousing, with expertise in high-volume, real-time data environments. They should possess a proven track record in building and managing large data engineering teams, designing and implementing scalable data architectures, proficiency in SQL, experience with object-oriented programming languages, and knowledge of A/B testing methodologies and statistical analysis. Preferred qualifications include a degree in a related technical field, experience managing the end-to-end data engineering lifecycle, working with large-scale infrastructure, familiarity with automated data lineage and auditing tools, expertise with BI and visualization tools, and advanced processing frameworks.

Joining SonyLIV offers the opportunity to drive the future of data-driven entertainment by collaborating with industry professionals, working with comprehensive data sets, leveraging cutting-edge technology, and making a tangible impact on product delivery and user engagement. The ideal candidate will bring a strong foundation in data infrastructure, experience in leading and scaling data teams, and a focus on operational excellence to enhance efficiency.
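Since the role calls out real-time data processing with Spark and Kafka, here is a minimal, hedged PySpark Structured Streaming sketch of the kind of pipeline implied; the broker address, topic name, and one-minute windowing are illustrative assumptions, and the job presumes the spark-sql-kafka connector package is on the Spark classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("playback-events").getOrCreate()

# Read a stream of raw events from a hypothetical Kafka topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "playback-events")
    .load()
)

# Kafka delivers bytes; cast the payload and count events per 1-minute window.
counts = (
    events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

# Console sink for demonstration; a real pipeline would write to a warehouse or lake.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```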
Posted 5 days ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.)
3+ years of professional experience with Enterprise Domains like HR, Finance, Supply Chain
4+ years of professional experience with more than one SQL and relational databases, including expertise in Presto, Spark, and MySQL
Professional experience designing and implementing real-time pipelines (Apache Kafka, or similar technologies)
5+ years of professional experience in custom ETL design, implementation, and maintenance
3+ years of professional experience with Data Modeling, including expertise in Data Warehouse design and Dimensional Modeling
5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar)
Experience with data quality and validation (using Apache Airflow)
Experience with anomaly/outlier detection
Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools)
Experience with Airflow or similar workflow management systems
Experience querying massive datasets using Spark, Presto, Hive, or similar
Experience building systems integrations and tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.)
Experience in data visualization using Power BI and Tableau
Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications
Professional fluency in English required
Mandatory Skills: Data Analysis. Experience: 3-5 Years.
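Among these requirements, data quality and validation with Apache Airflow is concrete enough to illustrate. Below is a minimal, hedged Airflow DAG sketch with two simple checks; the file path, column name, threshold, and schedule are assumptions made for the example, not details from the posting.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical staging extract produced by an upstream ETL task.
SOURCE_PATH = "/data/staging/orders.csv"


def check_row_count():
    # Fail the task (and the run) if the extract is empty.
    df = pd.read_csv(SOURCE_PATH)
    if len(df) == 0:
        raise ValueError("Quality check failed: extract is empty")


def check_null_keys():
    # Fail if more than 1% of rows are missing the business key.
    df = pd.read_csv(SOURCE_PATH)
    null_rate = df["order_id"].isna().mean()
    if null_rate > 0.01:
        raise ValueError(f"Quality check failed: {null_rate:.2%} null order_id")


with DAG(
    dag_id="orders_quality_checks",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    row_count = PythonOperator(task_id="check_row_count", python_callable=check_row_count)
    null_keys = PythonOperator(task_id="check_null_keys", python_callable=check_null_keys)
    row_count >> null_keys
```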
Posted 5 days ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Long Description
Bachelor's Degree preferred, or equivalent combination of education, training, and experience.
5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.)
3+ years of professional experience with Enterprise Domains like HR, Finance, Supply Chain
6+ years of professional experience with more than one SQL and relational databases, including expertise in Presto, Spark, and MySQL
Professional experience designing and implementing real-time pipelines (Apache Kafka, or similar technologies)
5+ years of professional experience in custom ETL design, implementation, and maintenance
3+ years of professional experience with Data Modeling, including expertise in Data Warehouse design and Dimensional Modeling
5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar)
Experience with data quality and validation (using Apache Airflow)
Experience with anomaly/outlier detection
Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools)
Experience with Airflow or similar workflow management systems
Experience querying massive datasets using Spark, Presto, Hive, or similar
Experience building systems integrations and tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.)
Experience in data visualization using Power BI and Tableau
Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications
Professional fluency in English required
Mandatory Skills: Data Analysis. Experience: 5-8 Years.
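One of the line items above is experience with anomaly/outlier detection. As a small, hedged illustration, the snippet below flags outliers in a synthetic daily order series using a plain z-score; the data, the 2-standard-deviation threshold, and the pandas approach are illustrative choices, not anything prescribed by the posting.

```python
import pandas as pd

# Synthetic daily order counts; in practice this would come from the warehouse.
daily = pd.Series(
    [1020, 980, 1015, 990, 1005, 2400, 1010],
    index=pd.date_range("2024-01-01", periods=7, freq="D"),
    name="orders",
)

# Z-score each day against the series mean; the threshold of 2 is a tunable assumption.
z = (daily - daily.mean()) / daily.std()
outliers = daily[z.abs() > 2]
print(outliers)  # the 2400-order spike is flagged
```

Robust variants (median/MAD, rolling baselines, seasonal decomposition) follow the same shape once the scoring function is swapped out.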
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a Software Engineer (A2), your main responsibilities will revolve around designing and developing AI-driven data ingestion frameworks and real-time processing solutions to enhance data analysis and machine learning capabilities across the full technology stack. You will be tasked with deploying, maintaining, and supporting application codes and machine learning models in production environments while ensuring seamless integration with front-end and back-end systems. Additionally, you will create and improve AI solutions that facilitate the smooth flow of data across the data ecosystem, enabling advanced analytics and insights for end users. Your role will also involve conducting business analysis to gather requirements and develop ETL processes, scripts, and machine learning pipelines that meet technical specifications and business needs using both server-side and client-side technologies. You will be responsible for developing real-time data ingestion and stream-analytic solutions leveraging technologies such as Kafka, Apache Spark, Python, and cloud platforms to support AI applications. Utilizing multiple programming languages and tools like Python, Spark, Hive, Presto, Java, and JavaScript frameworks, you will build prototypes for AI models and assess their effectiveness and feasibility. It will be essential to develop application systems adhering to standard software development methodologies to deliver high-performance AI solutions across the full stack. Collaborating with other engineers, you will provide system support to resolve issues and enhance system performance for both front-end and back-end components. Furthermore, you will operationalize open-source AI and data-analytic tools for enterprise-scale applications, ensuring alignment with organizational needs and user interfaces. Compliance with data governance policies by implementing and validating data lineage, quality checks, and data classification in AI projects will be a crucial aspect of your role. You will need to understand and follow the company's software development lifecycle effectively to develop, deploy, and deliver AI solutions. In terms of technical skills, you are expected to have a strong proficiency in Python, Java, C++, and familiarity with machine learning frameworks like TensorFlow and PyTorch. A deep understanding of ML, Deep Learning, and NLP algorithms is also required. Proficiency in building backend services using frameworks like FastAPI, Flask, and Django, as well as full-stack development skills with JavaScript frameworks such as React and Angular, will be essential for integrating user interfaces with AI models and data solutions. Preferred technical skills include expertise in big data processing technologies like Azure Databricks and Apache Spark to handle, analyze, and process large datasets for machine learning and AI applications. Additionally, certifications such as Microsoft Certified: Azure Data Engineer Associate or Azure AI Engineer are considered advantageous. To excel in this role, you should possess strong oral and written communication skills to effectively communicate technical and non-technical concepts to peers and stakeholders. Being open to collaborative learning, able to manage project components beyond individual tasks, and having a good understanding of business objectives driving data needs will be key behavioral attributes for success. 
This role is suitable for individuals holding a Bachelor's or Master's degree in Computer Science with 2 to 4 years of software engineering experience.
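The posting above mentions building backend services with frameworks like FastAPI, Flask, or Django to expose AI models to front-end systems. A minimal, hedged FastAPI sketch of such a service follows; the endpoint name, feature fields, and the placeholder scoring rule are assumptions standing in for a real trained model.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-service")  # hypothetical service name


class Features(BaseModel):
    sessions_last_7d: int
    minutes_watched: float


@app.post("/predict")
def predict(features: Features) -> dict:
    # Placeholder scoring rule standing in for a loaded ML model.
    engagement = min(1.0, 0.1 + 0.02 * features.sessions_last_7d)
    return {"churn_probability": round(1.0 - engagement, 3)}

# Run with: uvicorn app:app --reload   (assumes this file is saved as app.py)
```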
Posted 6 days ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a Software Engineer (A2), you will be responsible for designing and developing AI-driven data ingestion frameworks and real-time processing solutions to enhance data analysis and machine learning capabilities across the full technology stack. Your duties will include deploying, maintaining, and supporting application codes and machine learning models in production environments, ensuring seamless integration with front-end and back-end systems. You will also create and enhance AI solutions that enable seamless integration and flow of data across the data ecosystem, facilitating advanced analytics and insights for end users. Additionally, you will conduct business analysis to gather requirements and develop ETL processes, scripts, and machine learning pipelines that meet technical specifications and business needs. You will be tasked with developing real-time data ingestion and stream-analytic solutions using technologies such as Kafka, Apache Spark, Python, and cloud platforms to support AI applications. Utilizing multiple programming languages and tools, including Python, Spark, Hive, Presto, Java, and JavaScript frameworks, you will build prototypes for AI models and evaluate their effectiveness and feasibility. Furthermore, you will develop application systems adhering to standard software development methodologies, ensuring robust design, programming, backup, and recovery processes to deliver high-performance AI solutions across the full stack. As part of a team rotation, you will provide system support, collaborating with other engineers to resolve issues and enhance system performance regarding both front-end and back-end components. You will operationalize open-source AI and data-analytic tools for enterprise-scale applications, aligning them with organizational needs and user interfaces. Your responsibilities will also include ensuring compliance with data governance policies by implementing and validating data lineage, quality checks, and data classification in AI projects. You will need to understand and follow the company's software development lifecycle to effectively develop, deploy, and deliver AI solutions. Designing and developing AI frameworks leveraging open-source tools and advanced data processing frameworks, integrating them with user-facing applications, will also be part of your role. You will lead the design and execution of complex AI projects, ensuring alignment with ethical guidelines and principles under the guidance of senior team members. In terms of mandatory technical skills, you should have a strong proficiency in Python, Java, C++, and familiarity with machine learning frameworks such as TensorFlow and PyTorch. An in-depth knowledge of ML, Deep Learning, and NLP algorithms is essential, along with hands-on experience in building backend services using frameworks like FastAPI, Flask, or Django. Proficiency in front-end and back-end technologies, including JavaScript frameworks like React and Angular, is required to integrate user interfaces with AI models and data solutions. Developing and maintaining data pipelines for AI applications to ensure efficient data extraction, transformation, and loading processes is also a key aspect of the role. Moreover, possessing strong oral and written communication skills to effectively convey technical and non-technical concepts to peers and stakeholders is crucial. 
Preferred technical skills include utilizing big data technologies such as Azure Databricks and Apache Spark, developing real-time data ingestion and stream-analytic solutions leveraging various technologies, and holding relevant certifications like Microsoft Certified: Azure Data Engineer Associate or Azure AI Engineer. The ideal candidate for this role will have a Bachelor's or Master's degree in Computer Science and 2 to 4 years of Software Engineering experience. If you are open to collaborative learning, adept at managing project components beyond individual tasks, and strive to understand business objectives driving data needs, then this role is tailored for you.
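This second posting for the role again emphasizes real-time ingestion with Kafka feeding downstream pipelines. As a hedged complement to the Spark streaming example earlier, here is a bare-bones Python consumer using the kafka-python client; the topic, broker address, consumer group, and event fields are illustrative assumptions.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; adjust to the real cluster.
consumer = KafkaConsumer(
    "clickstream-events",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="ingestion-demo",
)

for message in consumer:
    event = message.value
    # Downstream, this is where validation, enrichment, or writes would happen.
    print(event.get("event_type"), event.get("user_id"))
```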
Posted 1 week ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about Target in India
At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

About the Role
The Senior RBX Data Specialist role at Target in India involves the end-to-end management of data: building and maintaining pipelines through ETL/ELT and data modeling, ensuring data accuracy and system performance, and resolving data flow issues. It also requires analyzing data to generate insights, creating visualizations for stakeholders, automating processes for efficiency, and collaborating effectively across both business and technical teams. You will also answer ad-hoc questions from your business users by conducting quick analysis on relevant data, identify trends and correlations, and form hypotheses to explain the observations. Some of this will lead to bigger projects of increased complexity, where you will have to work as part of a bigger team, but also independently execute specific tasks. Finally, you are expected to always adhere to the project schedule and technical rigor, as well as requirements for documentation, code versioning, etc.

Key Responsibilities
Data Pipeline and Maintenance: Monitor data pipelines and warehousing systems to ensure optimal health and performance. Ensure data integrity and accuracy throughout the data lifecycle.
Incident Management and Resolution: Drive the resolution of data incidents and document their causes and fixes, collaborating with teams to prevent recurrence.
Automation and Process Improvement: Identify and implement automation opportunities and DataOps best practices to enhance the efficiency, reliability, and scalability of data processes.
Collaboration and Communication: Work closely with data teams and stakeholders to understand data pipeline architecture and dependencies, ensuring timely and accurate data delivery while effectively communicating data issues and participating in relevant discussions.
Data Quality and Governance: Implement and enforce data quality standards, monitor metrics for improvement, and support data governance by ensuring policy compliance.
Documentation and Reporting: Create and maintain clear and concise documentation of data pipelines, processes, and troubleshooting steps. Develop and generate reports on data operations performance and key metrics.
Core responsibilities are described within this job description. Job duties may change at any time due to business needs.

About You
B.Tech / B.E. or equivalent (completed) degree
5+ years of relevant work experience
Experience in Marketing/Customer/Loyalty/Retail analytics is preferable
Exposure to A/B testing
Familiarity with big data technologies, data languages, and visualization tools
Exposure to languages such as Python and R for data analysis and modelling
Proficiency in SQL for data extraction, manipulation, and analysis, with experience in big data query frameworks such as Hive, Presto, SQL, or BigQuery
Solid foundational knowledge of mathematics, statistics, and predictive modelling techniques, including Linear Regression, Logistic Regression, time-series models, and classification techniques
Ability to simplify complex technical and analytical methodologies for easier comprehension by broad audiences
Ability to identify process and tool improvements and implement change
Excellent written and verbal English communication skills for global working
Motivation to initiate, build, and maintain global partnerships
Ability to function in group and/or individual settings
Willing and able to work from our office location (Bangalore HQ) as required by business needs and brand initiatives

Useful Links
Life at Target: https://india.target.com/
Benefits: https://india.target.com/life-at-target/workplace/benefits
Culture: https://india.target.com/life-at-target/belonging
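The About You section above lists predictive modelling techniques such as Logistic Regression alongside Python. As a small, hedged illustration, the snippet below fits a logistic regression on synthetic conversion-style data with scikit-learn; the features, labels, and train/test split are fabricated for the example and stand in for whatever warehouse extract a real analysis would use.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for customer features (e.g. visits, basket size, tenure)
# and a binary conversion label.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Evaluate ranking quality of the predicted conversion probabilities.
probs = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, probs), 3))
```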
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be joining Lifesight as a Data Engineer in our Bengaluru office, playing a pivotal role in the Data and Business Intelligence organization. Your primary focus will be on leading deep data engineering projects and contributing to the growth of our data platform team. This is an exciting opportunity to shape our technical strategy and foster a strong data engineering team culture in India.

As a Data Engineer at Lifesight, you will be responsible for designing and constructing data platforms and services, managing data infrastructure in cloud environments, and enabling strategic business decisions across Lifesight products. Your role will involve building highly scalable, fault-tolerant distributed data processing systems, optimizing data quality in pipelines, and owning data mapping, transformations, and business logic. You will also engage in low-level system debugging, performance optimization, and actively participate in architecture discussions to drive new projects forward.

The ideal candidate for this position will possess proficiency in Python and PySpark, along with a deep understanding of Apache Spark, Spark tuning, and building data frames. Experience with big data technologies such as HDFS, YARN, Map-Reduce, Hive, Kafka, and Airflow, as well as NoSQL databases and cloud platforms like AWS and GCP, is essential. You should have at least 5 years of professional experience in data or software engineering, demonstrating expertise in data quality, data engineering, and various big data frameworks and tools.

In summary, as a Data Engineer at Lifesight, you will have the opportunity to work on cutting-edge data projects, collaborate with a talented team of engineers, and contribute to the ongoing success and innovation of Lifesight's data platform.
Posted 1 week ago
5.0 - 8.0 years
4 - 7 Lacs
Bengaluru
Work from Office
About The Role
Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do?
Data & AI: Analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for?
Data Analytics, with a specialization in the marketing domain.
Domain-specific skills: Familiarity with ad tech and B2B sales.
Technical skills:
Proficiency in SQL and Python
Experience in efficiently building, publishing, and maintaining robust data models and warehouses for self-serve querying and advanced data science and ML analytic purposes
Experience in conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies
Strong proficiency with SQL dialects on distributed or data lake style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience in nested data structure manipulation, windowing functions, query optimization, data partitioning techniques, etc. Knowledge of Google BigQuery optimization is a plus.
Experience in schema design and data modeling strategies (e.g. dimensional modeling, data vault, etc.)
Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines
General knowledge of Jinja templating in Python
Hands-on experience with cloud provider integration and automation via CLIs and APIs
Soft skills: Ability to work well in a team, agility for quick learning, written and verbal communication.

Roles and Responsibilities:
In this role you are required to do analysis and solving of increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions that are made by you impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
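The technical skills above include Jinja templating in Python alongside SQL windowing functions on distributed systems. The hedged sketch below renders a parameterized, window-function query with Jinja2; the project, dataset, table, and column names are invented for illustration.

```python
from jinja2 import Template  # pip install Jinja2

# Hypothetical table and column names; the template parameterizes project,
# dataset, start date, and an optional campaign filter.
SQL_TEMPLATE = Template("""
SELECT
    campaign_id,
    event_date,
    SUM(spend) OVER (
        PARTITION BY campaign_id
        ORDER BY event_date
    ) AS cumulative_spend
FROM {{ project }}.{{ dataset }}.campaign_spend
WHERE event_date >= DATE '{{ start_date }}'
{% if campaign_ids %}
  AND campaign_id IN ({{ campaign_ids | join(', ') }})
{% endif %}
""")

query = SQL_TEMPLATE.render(
    project="analytics",
    dataset="marketing",
    start_date="2024-01-01",
    campaign_ids=[101, 102],
)
print(query)  # the rendered SQL would be submitted to Presto/BigQuery/etc.
```

This is essentially the templating pattern that tools like dbt build on, with the rendered SQL handed to whichever query engine backs the warehouse.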
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
About the Team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant that it is today. We value speed over perfection, and see failures as opportunities to become better. We've taken steps to inculcate a strong Founder's Mindset across our engineering teams, making us grow and move fast. We place special emphasis on the continuous growth of each team member - and we do this with regular 1-1s and open communication. As Engineering Manager, you will be part of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favourite books and games or even gossipping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the Role
We are looking for a seasoned Engineering Manager well-versed with emerging technologies to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry on collaborations effectively. You will also transform newbies into experts and build reports on the progress of all projects.

What you will do
Design tasks for other engineers, keeping Meesho's guidelines and standards in mind
Keep a close watch on various projects and monitor the progress
Drive excellence in quality across the organisation and solutioning of product problems
Collaborate with the sales and design teams to create new products
Manage engineers and take ownership of the project while ensuring product scalability
Conduct regular meetings to plan and develop reports on the progress of projects

What you will need
Bachelor's / Master's in computer science
At least 8+ years of professional experience
At least 4+ years of experience in managing software development teams
Experience in building large-scale distributed systems
Experience in scalable platforms
Expertise in Java/Python/Go-Lang and multithreading
Good understanding of Spark and its internals
Deep understanding of transactional and NoSQL DBs
Deep understanding of messaging systems like Kafka
Good experience on cloud infrastructure - AWS preferably
Ability to drive sprints and OKRs with good stakeholder management experience
Exceptional team managing skills
Experience in managing a team of 4-5 junior engineers
Good understanding of streaming and real-time pipelines
Good understanding of data modelling concepts and data quality tools
Good knowledge of Business Intelligence tools - Metabase, Superset, Tableau, etc.
Good to have: knowledge of Trino, Flink, Presto, Druid, Pinot, etc.
Good to have: knowledge of data pipeline building
Posted 3 weeks ago
1.0 - 3.0 years
15 - 30 Lacs
Bengaluru
Work from Office
About the Role
Does digging deep for data and turning it into useful, impactful insights get you excited? Then you could be our next SDE II Data - Real Time Streaming. In this role, you'll oversee your entire team's work, ensuring that each individual is working towards achieving their personal goals and Meesho's organisational goals. Moreover, you'll keep an eye on all engineering projects and ensure the team is not straying from the right track. You'll also be tasked with directing programming activities, evaluating system performance, and designing new programs and features for smooth functioning.

What you will do
Build a platform for ingesting and processing multiple terabytes of data daily
Curate, build and transform raw data into scalable information
Create prototypes and proofs-of-concept for iterative development
Reduce technical debt with quality coding
Keep a close watch on various projects and monitor the progress
Carry on smooth collaborations with the sales team and engineering teams
Provide management mentorship which sets the tone for holistic growth
Ensure everyone is on the same page and taking ownership of the project

What you will need
Bachelor's/Master's degree in Computer Science
At least 1 to 3 years of professional experience
Exceptional coding skills using Java, Scala, Python
Working knowledge of Redis, MySQL and messaging systems like Kafka
Knowledge of RxJava, Java Spring Boot, Microservices architecture
Hands-on experience with distributed systems architecture dealing with high throughput
Experience in building streaming and real-time solutions using Apache Flink/Spark Streaming/Samza
Familiarity with software engineering best practices across all stages of software development
Expertise in data system internals
Strong problem-solving and analytical skills
Familiarity with Big Data systems (Spark/EMR, Hive/Impala, Delta Lake, Presto, Airflow, Data Lineage) is an advantage
Familiarity with data modeling, end-to-end data pipelining, OLAP data cubes and BI tools is a plus
Experience as a contributor/committer to the Big Data stack is a plus
Posted 3 weeks ago
8.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Urgent Opening for Solution Architect - Data Warehouse - Bangalore
Posted On: 04th Jul 2019 12:25 PM
Location: Bangalore
Role / Position: Solution Architect - Data Warehouse
Experience (required): 8 plus years
Description:
8-10 years of experience in consulting or IT supporting Enterprise Data Warehouses & Business Intelligence environments, including experience with data warehouse architecture & design, ETL design/development, and Analytics. Responsible for defining the data strategy and for ensuring that the programs and projects align to that strategy. Provides thought leadership in the following areas:
- Data Access, Data Integration, Data Visualization, Data Modeling, Data Quality and Metadata management
- Analytics, Data Discovery, use of Statistical methods, Database Design and Implementation
Expertise in Database Appliances, RDBMS, Teradata, Netezza. Hands-on experience with data architecting, data mining, large-scale data modeling, and business requirements gathering/analysis. Experience in ETL and Data Migration Tools. Direct experience in implementing enterprise data management processes, procedures, and decision support. Strong understanding of relational data structures, theories, principles, and practices. Strong familiarity with metadata management and associated processes. Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, and data profiling tools. Demonstrated expertise with repository creation, and data and information system life cycle methodologies. Experience with business requirements analysis, entity relationship planning, database design, reporting structures, and so on. Ability to manage data and metadata migration. Experience with data processing flowcharting techniques. Hands-on experience in Big Data technologies (5 years) - Hadoop, MapReduce, MongoDB, and integration with legacy environments - would be preferred. Experience with Spark using Scala or Python is a big plus. Experience in Cloud technologies (primarily AWS, Azure) and integration with existing on-premise data warehouse technologies. Good knowledge of S3, Redshift, Blob Storage, Presto DB, etc. Attitude to learn and adopt emerging technologies.
Send Resumes to girish.expertiz@gmail.com
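The listing above leans heavily on data warehouse architecture and dimensional design. As a compact, hedged illustration of a star schema, the snippet below creates toy fact and dimension tables in an in-memory SQLite database and runs a typical rollup; all table and column names are invented for the example and are not from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse; schema is illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20240115
    full_date    TEXT,
    fiscal_month TEXT
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    quantity     INTEGER,
    net_amount   REAL
);
""")

cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 'FY24-M10')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# A typical star-join rollup: revenue by category and fiscal month.
for row in cur.execute("""
    SELECT p.category, d.fiscal_month, SUM(f.net_amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date   d ON d.date_key    = f.date_key
    GROUP BY p.category, d.fiscal_month
"""):
    print(row)
```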
Posted 3 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Mumbai
Work from Office
The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables to identify discrepancies and propose optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Senior Process Manager Roles and responsibilities:
Understand the business model on why things are the way they are, ask relevant questions and get them clarified. Break down complex problems into small solvable components, to be able to identify problem areas in each component. Conduct cost/benefit analysis and feasibility studies for proposed projects to aid in decision-making. Facilitate the implementation of new or improved business processes and systems. Coordinate with business stakeholders to identify gaps in data, processes and suggest process improvements. Understand and follow the project roadmap, plan data availability and coordinate with the execution team to ensure a successful execution of projects. Prescribe suitable solutions with an understanding of the limitations of toolsets and available data. Manage procurement of data from various sources and perform data audits. Fetch and analyze data from disparate sources and drive meaningful insights. Provide recommendations on the business rules for effective campaign targeting. Interpret analytical results and provide insights; present key findings and recommended next steps to clients. Develop tangible analytical projects; communicate project details to clients and internal delivery team via written documents and presentations, in forms of specifications, diagrams, and data/process models. Audit deliverables ensuring accuracy by critically examining the data and reports against requirements. Collaborate on regional/global analytic initiatives and localize inputs for country campaign practices. Actively work on audience targeting insights, optimize campaigns and improve comm governance.
Technical and Functional Skills:
Must Have: BS/BA degree or equivalent professional experience required. Minimum 8-10 years of professional experience in advanced analytics for a Fortune 500-scale company or a prominent consulting organization. Experience in Data Extraction tools, Advanced Excel, CRM Analytics, Campaign Marketing, and Analytics knowledge - Campaign Analytics. Strong in numerical and analytical skills. Strong in Advanced Excel (prior experience with Google sheets is an added plus). Strong analytical and storytelling skills; ability to derive relevant insights from large reports and piles of disparate data. Comfortable working autonomously with broad guidelines. Passion for data and analytics for marketing and eagerness to learn. Excellent communications skills, both written and spoken; ability to explain complex technical concepts in plain English.
Ability to manage multiple priorities and projects, aligning teams to project timelines and ensuring quality of deliverables. Work with business teams to identify business use cases and develop solutions to meet these needs using analytical approaches. Manage regular reporting and ad-hoc data extract from other departments. Knowledge on analyzing digital campaigns and the tools/technologies of performance marketing. Experience with Google sheet/Excel. Good To Have Hands-on experience in digital marketing and/or 1:1 marketing in any channel; expert level knowledge in database marketing and CRM. Working knowledge in data visualization tools (Tableau, QlikView, etc.). Working knowledge of analytical/statistical techniques. Experience in Hadoop environment Hive, Presto is a plus. Experience in Python/R. Previous consulting experience is a definite plus.
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Chandigarh
Work from Office
The candidate must possess knowledge relevant to the functional area, and act as a subject matter expert in providing advice in the area of expertise, and also focus on continuous improvement for maximum efficiency. It is vital to focus on the high standard of delivery excellence, provide top-notch service quality and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, and generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive, and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.
Senior Process Manager Roles and responsibilities:
Understand the business model on why things are the way they are, ask relevant questions and get them clarified. Break down complex problems into small solvable components, to be able to identify problem areas in each component. Conduct cost/benefit analysis and feasibility studies for proposed projects to aid in decision-making. Facilitate the implementation of new or improved business processes and systems. Coordinate with business stakeholders to identify gaps in data, processes and suggest process improvements. Understand and follow the project roadmap, plan data availability and coordinate with the execution team to ensure a successful execution of projects. Prescribe suitable solutions with an understanding of the limitations of toolsets and available data. Manage procurement of data from various sources and perform data audits. Fetch and analyze data from disparate sources and drive meaningful insights. Provide recommendations on the business rules for effective campaign targeting. Interpret analytical results and provide insights; present key findings and recommended next steps to clients. Develop tangible analytical projects; communicate project details to clients and internal delivery team via written documents and presentations, in forms of specifications, diagrams, and data/process models. Audit deliverables ensuring accuracy by critically examining the data and reports against requirements. Collaborate on regional/global analytic initiatives and localize inputs for country campaign practices. Actively work on audience targeting insights, optimize campaigns and improve comm governance.
Technical and Functional Skills:
Must Have: BS/BA degree or equivalent professional experience required. Minimum 7+ years of professional experience in advanced analytics for a Fortune 500-scale company or a prominent consulting organization. Experience in Data Extraction tools, Advanced Excel, CRM Analytics, Campaign Marketing, and Analytics knowledge (Campaign Analytics). Strong in numerical and analytical skills. Strong in Advanced Excel (prior experience with Google sheets is an added plus). Strong analytical and storytelling skills; ability to derive relevant insights from large reports and piles of disparate data. Comfortable working autonomously with broad guidelines.
Passion for data and analytics for marketing and eagerness to learn. Excellent communications skills, both written and spoken; ability to explain complex technical concepts in plain English. Ability to manage multiple priorities and projects, aligning teams to project timelines and ensuring quality of deliverables. Work with business teams to identify business use cases and develop solutions to meet these needs using analytical approaches. Manage regular reporting and ad-hoc data extract from other departments. Knowledge on analyzing digital campaigns and the tools/technologies of performance marketing. Experience with Google sheet/Excel.
Good To Have: Hands-on experience in digital marketing and/or 1:1 marketing in any channel; expert level knowledge in database marketing and CRM. Working knowledge in data visualization tools (Tableau, QlikView, etc.). Working knowledge of analytical/statistical techniques. Experience in Hadoop environment (Hive, Presto) is a plus. Experience in Python/R. Previous consulting experience is a definite plus.
Posted 3 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Mumbai
Work from Office
The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables to identify discrepancies and propose optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Senior Process Manager Roles and responsibilities:
Understand the business model on why things are the way they are, ask relevant questions and get them clarified. Break down complex problems into small solvable components, to be able to identify problem areas in each component. Conduct cost/benefit analysis and feasibility studies for proposed projects to aid in decision-making. Facilitate the implementation of new or improved business processes and systems. Coordinate with business stakeholders to identify gaps in data, processes and suggest process improvements. Understand and follow the project roadmap, plan data availability and coordinate with the execution team to ensure a successful execution of projects. Prescribe suitable solutions with an understanding of the limitations of toolsets and available data. Manage procurement of data from various sources and perform data audits. Fetch and analyze data from disparate sources and drive meaningful insights. Provide recommendations on the business rules for effective campaign targeting. Interpret analytical results and provide insights; present key findings and recommended next steps to clients. Develop tangible analytical projects; communicate project details to clients and internal delivery team via written documents and presentations, in forms of specifications, diagrams, and data/process models. Audit deliverables ensuring accuracy by critically examining the data and reports against requirements. Collaborate on regional/global analytic initiatives and localize inputs for country campaign practices. Actively work on audience targeting insights, optimize campaigns and improve comm governance.
Technical and Functional Skills:
Must Have: BS/BA degree or equivalent professional experience required. Minimum 8-10 years of professional experience in advanced analytics for a Fortune 500-scale company or a prominent consulting organization. Experience in Data Extraction tools, Advanced Excel, CRM Analytics, Campaign Marketing, and Analytics knowledge - Campaign Analytics. Strong in numerical and analytical skills. Strong in Advanced Excel (prior experience with Google sheets is an added plus). Strong analytical and storytelling skills; ability to derive relevant insights from large reports and piles of disparate data. Comfortable working autonomously with broad guidelines. Passion for data and analytics for marketing and eagerness to learn. Excellent communications skills, both written and spoken; ability to explain complex technical concepts in plain English.
Ability to manage multiple priorities and projects, aligning teams to project timelines and ensuring quality of deliverables. Work with business teams to identify business use cases and develop solutions to meet these needs using analytical approaches. Manage regular reporting and ad-hoc data extract from other departments. Knowledge on analyzing digital campaigns and the tools/technologies of performance marketing. Experience with Google sheet/Excel. Good To Have Hands-on experience in digital marketing and/or 1:1 marketing in any channel; expert level knowledge in database marketing and CRM. Working knowledge in data visualization tools (Tableau, QlikView, etc.). Working knowledge of analytical/statistical techniques. Experience in Hadoop environment Hive, Presto is a plus. Experience in Python/R. Previous consulting experience is a definite plus.
Posted 3 weeks ago
6.0 - 11.0 years
4 - 8 Lacs
Hyderabad, New Delhi
Hybrid
Working Mode: Hybrid
Payroll: IDESLABS
Location: Pan India
PF Detection is mandatory
Job Description:
Key Responsibilities:
Work on AS400 with expertise in I5250 and Presto for inventory and order fulfillment. Utilize modern RPG techniques including RPG, RPGLE, and Embedded SQL. Design and develop applications using RPG, RPGLE, CL400, and CLLE. Perform database design and development on DB2/400, ensuring efficient query execution and optimization. Develop and troubleshoot applications using AS400 tools like SDA, RLU, and PDM. Write and optimize complex SQL queries for embedded and interactive sessions. Perform end-to-end unit testing, debug interactive and batch jobs, and ensure seamless functionality of applications. Collaborate directly with client teams to gather business requirements and deliver appropriate solutions. Maintain and enhance existing applications, ensuring smooth operations and performance. Work with one or more change management tools such as Aldon, Turnover, or Thenon (preferred). Be adaptable to switching between tasks, handling both development and maintenance activities.
Posted 3 weeks ago
4.0 - 9.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Key Responsibilities
Work on AS400 with expertise in I5250 and Presto for inventory and order fulfillment. Utilize modern RPG techniques including RPG, RPGLE, and Embedded SQL. Design and develop applications using RPG, RPGLE, CL400, and CLLE. Perform database design and development on DB2/400, ensuring efficient query execution and optimization. Develop and troubleshoot applications using AS400 tools like SDA, RLU, and PDM. Write and optimize complex SQL queries for embedded and interactive sessions. Perform end-to-end unit testing, debug interactive and batch jobs, and ensure seamless functionality of applications. Collaborate directly with client teams to gather business requirements and deliver appropriate solutions. Maintain and enhance existing applications, ensuring smooth operations and performance. Work with one or more change management tools such as Aldon, Turnover, or Thenon (preferred). Be adaptable to switching between tasks, handling both development and maintenance activities.
Posted 3 weeks ago
8.0 - 13.0 years
5 - 9 Lacs
Hyderabad
Hybrid
Immediate Openings on AS400 Developer - Pan India - Contract
Skill: AS400 Developer
Notice Period: Immediate
Employment Type: Contract
Work on AS400 with expertise in I5250 and Presto for inventory and order fulfillment. Utilize modern RPG techniques including RPG, RPGLE, and Embedded SQL. Design and develop applications using RPG, RPGLE, CL400, and CLLE. Perform database design and development on DB2/400, ensuring efficient query execution and optimization. Develop and troubleshoot applications using AS400 tools like SDA, RLU, and PDM. Write and optimize complex SQL queries for embedded and interactive sessions. Perform end-to-end unit testing, debug interactive and batch jobs, and ensure seamless functionality of applications. Collaborate directly with client teams to gather business requirements and deliver appropriate solutions. Maintain and enhance existing applications, ensuring smooth operations and performance. Work with one or more change management tools such as Aldon, Turnover, or Thenon (preferred). Be adaptable to switching between tasks, handling both development and maintenance activities.
Posted 3 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
WHAT YOU WILL WORK ON
Serve as a liaison between product, engineering & data consumers by analyzing the data, finding gaps, and helping drive the roadmap
Support and troubleshoot issues (data & process), identify root cause, and proactively recommend sustainable corrective actions by collaborating with engineering/product teams
Communicate actionable insights using data, often for stakeholders and non-technical audiences
Ability to write technical specifications describing requirements for data movement, transformation & quality checks

WHAT YOU BRING
Bachelor's Degree in Computer Science, MIS, other quantitative disciplines, or related fields
3-7 years of relevant analytical experience that can translate strategic vision into requirements, working with the best engineers, product managers, and data scientists
Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision
Experience identifying and defining KPIs using data for business areas such as Sales, Consumer Behaviour, Supply Chain, etc.
Exceptional SQL skills
Experience with a modern visualization tool stack, such as Tableau, Power BI, Domo, etc.
Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, Presto, etc.
Incredible attention to detail, with a structured problem-solving approach
Excellent communication skills (written and verbal)
Experience with agile development methodologies
Experience with retail or ecommerce domains is a plus.
Posted 3 weeks ago
6.0 - 11.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Experience in Cloud platforms, e.g., AWS, GCP, Azure, etc.
Experience in distributed technology tools, viz. SQL, Spark, Python, PySpark, Scala
Performance Tuning: optimize SQL and PySpark for performance
Airflow workflow scheduling tool for creating data pipelines
GitHub source control tool and experience with creating/configuring Jenkins pipelines
Experience in EMR/EC2, Databricks, etc.
DWH tools incl. SQL database, Presto, and Snowflake
Streaming, Serverless Architecture
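Because the list above pairs PySpark with performance tuning, here is a short, hedged sketch of three common tuning moves (broadcast join, key-based repartitioning, selective caching); the S3 paths, table names, and partition count are illustrative assumptions, and the job presumes an S3-capable Spark build.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# Hypothetical datasets: a large fact table and a small dimension table.
orders = spark.read.parquet("s3://example-bucket/warehouse/orders/")
countries = spark.read.parquet("s3://example-bucket/warehouse/countries/")

# 1. Broadcast the small side so the join avoids a full shuffle.
enriched = orders.join(broadcast(countries), "country_code")

# 2. Repartition on the key used downstream to reduce skewed tasks.
enriched = enriched.repartition(200, "country_code")

# 3. Cache only when the result is reused by more than one action.
enriched.cache()
print(enriched.count())

# 4. Write partitioned output so consumers can prune at read time.
enriched.write.mode("overwrite").partitionBy("country_code").parquet(
    "s3://example-bucket/warehouse/orders_enriched/"
)
```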
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity?
Must have skills required: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL development, Data Lake Architecture, Python, Scala, GCP BigQuery, Dataproc, Dataflow, Cloud Composer, AWS, Big Data Stack, Azure, GCP

Wayfair is Looking for:
About the job
The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience on all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages and product carousels. You will build and scale data platforms that enable measuring the effectiveness of Wayfair's ad costs and media attribution, which helps decide on day-to-day and major marketing spends.

About the Role:
As a Data Engineer, you will be part of the Data Engineering team, with this role being inherently multi-functional; the ideal candidate will work with Data Scientists, Analysts, and Application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, understanding requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering skills and communication and a belief that data-driven processes lead to phenomenal products.

What you'll do:
Build and launch data pipelines and data products focussed on the SMART Org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models.
Build cross-functional relationships to understand data needs, build key metrics and standardize their usage across the organization.
Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What You'll Need:
Bachelor's/Master's degree in Computer Science or related technical subject area, or equivalent combination of education and experience
3+ years of relevant work experience in the Data Engineering field with web-scale data sets
Demonstrated strength in data modeling, ETL development, and data lake architecture
Data warehousing experience with Big Data Technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.)
Coding proficiency in at least one modern programming language (Python, Scala, etc.)
Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing, and query performance tuning skills for large data sets
Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets
Strong business acumen
Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms like AWS, Azure, etc.
Be a team player and introduce/follow best practices in the data engineering space
Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization
Good to have: Understanding of NoSQL databases and Pub-Sub architecture setup. Familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or any similar tools.
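Given that the role above leans on GCP BigQuery for large-scale analytics, here is a minimal, hedged sketch of querying BigQuery from Python; the project, dataset, table, and column names are invented, and the snippet assumes application-default credentials are already configured.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical project; dataset/table names below are placeholders.
client = bigquery.Client(project="example-project")

sql = """
    SELECT event_date, COUNT(*) AS impressions
    FROM `example-project.ads.media_attribution`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# Submit the query and iterate over result rows by attribute name.
for row in client.query(sql).result():
    print(row.event_date, row.impressions)
```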
Posted 4 weeks ago
5.0 - 10.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Job description
About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.
Role: Senior Data Analyst
Experience: 5+ Years
Skill Set: Data Analysis, SQL and Cloud (AWS, Azure, GCP), Retail domain exposure (must)
Location: Pune, Hyderabad, Gurgaon
Key Requirements:
Bachelor's degree in Computer Science, MIS, or related fields.
6-7 years of relevant analytical experience, translating strategic vision into actionable requirements.
Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision.
Experience identifying and defining KPIs for business areas such as Sales, Consumer Behavior, Supply Chain, etc.
Exceptional SQL skills.
Experience with modern visualization tools like Tableau, Power BI, Domo, etc.
Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, Presto, etc.
Incredible attention to detail with a structured problem-solving approach.
Excellent communication skills (written and verbal).
Experience with agile development methodologies.
Experience in retail or e-commerce domains is a plus.
How to Apply: Interested candidates can share their CV at pragati.jha@gspann.com.
Posted 1 month ago
2.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Flexing It is a freelance consulting marketplace that connects freelancers and independent consultants with organisations seeking independent talent.

Flexing It has partnered with our client, a global leader in energy management and automation, which is seeking a Data Engineer to prepare data and make it available in an efficient and optimized format for its different data consumers, ranging from BI and analytics to data science applications. The role involves working with current technologies, in particular Apache Spark, Lambda and Step Functions, Glue Data Catalog, and Redshift, in an AWS environment.

Key Responsibilities:
Design and develop new data ingestion patterns into the IntelDS Raw and/or Unified data layers, based on the requirements and needs for connecting new data sources or building new data objects. Working with ingestion patterns allows the data pipelines to be automated.
Participate in and apply DevSecOps practices by automating the integration and delivery of data pipelines in a cloud environment. This can include the design and implementation of end-to-end data integration tests and/or CI/CD pipelines.
Analyze existing data models, and identify and implement performance optimizations for data ingestion and data consumption. The objective is to accelerate data availability within the platform and to consumer applications.
Support client applications in connecting to and consuming data from the platform, and ensure they follow our guidelines and best practices.
Participate in the monitoring of the platform and the debugging of detected issues and bugs.

Skills required:
Minimum of 3 years of prior experience as a data engineer, with proven experience in Big Data and data lakes in a cloud environment.
Bachelor's or Master's degree in computer science or applied mathematics (or equivalent).
Proven experience working with data pipelines / ETL / BI, regardless of the technology.
Proven experience working with AWS, including at least 3 of: Redshift, S3, EMR, CloudFormation, DynamoDB, RDS, Lambda.
Big Data technologies and distributed systems: one of Spark, Presto, or Hive.
Python language: scripting and object-oriented programming.
Fluency in SQL for data warehousing (Redshift in particular is a plus).
Good understanding of data warehousing and data modelling concepts.
Familiarity with Git, Linux, and CI/CD pipelines is a plus.
Strong systems/process orientation with demonstrated analytical thinking, organizational skills, and problem-solving skills.
Ability to self-manage, prioritize, and execute tasks in a demanding environment.
Strong consultancy orientation and experience, with the ability to form collaborative, productive working relationships across diverse teams and cultures, is a must.
Willingness and ability to train and teach others.
Ability to facilitate meetings and follow up with resulting action items.
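As a purely illustrative sketch of the kind of ingestion work described above (assuming a raw CSV drop on S3 promoted to a partitioned Parquet "unified" layer; the bucket, table, and column names are hypothetical, not from the client's IntelDS platform), a minimal PySpark job might look like this:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_meter_readings").getOrCreate()

# Hypothetical raw layer: CSV files dropped by an upstream source system.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-raw-bucket/meter_readings/")
)

# Light standardisation before promoting to the unified layer:
# rename columns, parse timestamps, derive a partition column, deduplicate.
unified = (
    raw.withColumnRenamed("MeterID", "meter_id")
       .withColumn("reading_ts", F.to_timestamp("reading_ts"))
       .withColumn("reading_date", F.to_date("reading_ts"))
       .dropDuplicates(["meter_id", "reading_ts"])
)

# Write as partitioned Parquet; a Glue crawler (or explicit catalog update)
# would then expose the table to Redshift Spectrum / Athena consumers.
(
    unified.write
    .mode("overwrite")
    .partitionBy("reading_date")
    .parquet("s3://example-unified-bucket/meter_readings/")
)

spark.stop()
```

In practice a job like this would be wrapped in a Step Functions or CI/CD workflow with integration tests, per the DevSecOps responsibilities listed above.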
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Digantara is a leading Space Surveillance and Intelligence company focused on ensuring orbital safety and sustainability. With expertise in space-based detection, tracking, identification, and monitoring, Digantara provides comprehensive domain awareness across various regimes, allowing end users to gain actionable intelligence on a single platform. At the core of its infrastructure lies a sophisticated integration of hardware and software capabilities aligned with the key principles of situational awareness: perception (data collection), comprehension (data processing), and prediction (analytics). This holistic approach empowers Digantara to monitor all Resident Space Objects (RSOs) in orbit, fostering comprehensive domain awareness.

We are seeking a skilled Back End Developer to join our dynamic team. As a Back End Developer, you will be responsible for designing, developing, and maintaining the server-side logic of our web platform. Your primary focus will be on building an efficient event streaming platform that can handle massive data pipelines. You will work closely with front-end developers, astrodynamics engineers, and other team members to ensure seamless integration between the multiple microservices.

Why Us?
Competitive incentives, a blazing team, and pretty much everything you have heard about a startup, plus you get to work on space technology.
Hustle in a well-funded startup with a flat hierarchy that allows you to take charge of your responsibilities and create your moonshot.

Ideal Candidate:
Someone experienced in building distributed event streaming platforms capable of handling massive data pipelines.

Responsibilities:
Build the space situational awareness platform with clean, modular, and well-documented code that complies with best practices and coding standards.
Handle continuous streams of data, complex event processing, and asynchronous communication to build real-time data pipelines and event-driven architecture.
Manage databases and handle the big data that powers our web platform.
Troubleshoot and debug issues to ensure smooth functionality across different systems.
Stay up to date with the latest backend development trends, tools, and techniques.
Participate in code reviews, providing constructive feedback and suggestions for improvement.
Contribute to the continuous improvement of development processes and workflows.

Required Qualifications:
2 or more years of experience in designing APIs and databases.
Solid understanding of event streaming platforms and messaging queues like Apache Kafka.
Proficiency in any server-side programming language and runtime, preferably Golang.
Experience with relational databases like Postgres.
Knowledge of version control systems (e.g., Git) and collaborative development workflows.
Excellent problem-solving and debugging skills.
Ability to work effectively in a fast-paced, collaborative team environment.
Strong communication skills, with the ability to articulate technical concepts to non-technical stakeholders.

Preferred Qualities:
Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
Proficiency in Golang would be an added advantage.
Familiarity with the Hadoop ecosystem.
Being acquainted with Hive, HDFS, Presto, HBase, and Spark would be an added benefit.
Knowledge of cloud platforms and their performance optimisation techniques (e.g., AWS, Azure, GCP).
Prior experience in a product or start-up company would further enhance your appeal to us.
Demonstrates a proactive attitude towards learning, is highly adaptable, and is eager to acquire new knowledge and skills.

General Requirements:
Ability to work in a mission-focused, operational environment.
Ability to think critically and make independent decisions.
Interpersonal skills to enable working in a diverse and dynamic team.
Maintain a regular and predictable work schedule.
Verbal and written communication skills as well as organisational skills.
Travel occasionally as necessary.

Job Location: Hebbal, Bengaluru, Karnataka, India
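The role above centres on Kafka-based event streaming (with Golang preferred). Purely as an illustrative sketch, and keeping the code examples in this listing in Python for consistency, a minimal consumer for a continuous observation stream might look like the following; the topic, broker address, group id, and field names are all hypothetical.

```python
import json

from kafka import KafkaConsumer  # kafka-python client

# Hypothetical topic carrying raw observation events as JSON payloads.
consumer = KafkaConsumer(
    "rso-observations",                       # illustrative topic name
    bootstrap_servers=["localhost:9092"],
    group_id="tracking-pipeline",
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

def handle(event: dict) -> None:
    """Placeholder for real processing, e.g. validation and a Postgres upsert."""
    print(f"object={event.get('object_id')} ts={event.get('observed_at')}")

# Iterating the consumer blocks and yields records as they arrive.
for message in consumer:
    handle(message.value)
```

A production service would add batching, retries, dead-letter handling, and explicit offset commits rather than relying on auto-commit.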
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
Notice period: Immediate to 15 days
Timings: 1:00 PM to 10:00 PM (IST)
Work Mode: WFO (Mon-Fri)

We are seeking a strategic and innovative Senior Data Scientist to join our high-performing Data Science team. In this role, you will lead the design, development, and deployment of advanced analytics and machine learning solutions that directly impact business outcomes. You will collaborate cross-functionally with product, engineering, and business teams to translate complex data into actionable insights and data products.

Key Responsibilities:
Lead and execute end-to-end data science projects, encompassing problem definition, data exploration, model creation, assessment, and deployment.
Develop and deploy predictive models, optimization techniques, and statistical analyses to address tangible business needs.
Articulate complex findings through clear and persuasive storytelling for both technical experts and non-technical stakeholders.
Spearhead experimentation methodologies, such as A/B testing, to enhance product features and overall business outcomes.
Partner with data engineering teams to establish dependable and scalable data infrastructure and production-ready models.
Guide and mentor junior data scientists, while also fostering team best practices and contributing to research endeavors.

Required Qualifications & Skills:
Master's or PhD in Computer Science, Statistics, Mathematics, or a related field.
5+ years of practical experience in data science, including deploying models to production.
Expertise in Python and SQL.
Solid background in ML frameworks such as scikit-learn, TensorFlow, and PyTorch.
Competence in data visualization tools like Tableau, Power BI, and matplotlib.
Comprehensive knowledge of statistics, machine learning principles, and experimental design.
Experience with cloud platforms (AWS, GCP, or Azure) and Git for version control.
Exposure to MLOps tools and methodologies (e.g., MLflow, Kubeflow, Docker, CI/CD).
Familiarity with NLP, time series forecasting, or recommendation systems is a plus.
Knowledge of big data technologies (Spark, Hive, Presto) is desirable.
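As a small, purely illustrative sketch of the end-to-end model creation and assessment this role describes (using scikit-learn's built-in breast-cancer dataset as a stand-in; nothing here comes from the employer's actual data or stack), a minimal training-and-evaluation pipeline might look like this:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in dataset; in practice this would be a feature table built upstream.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Simple end-to-end pipeline: scaling plus a gradient-boosted classifier.
model = make_pipeline(StandardScaler(), GradientBoostingClassifier(random_state=42))
model.fit(X_train, y_train)

# Assessment on held-out data.
probs = model.predict_proba(X_test)[:, 1]
preds = model.predict(X_test)
print("ROC AUC:", round(roc_auc_score(y_test, probs), 3))
print(classification_report(y_test, preds))
```

In production, a pipeline like this would typically be tracked with an experiment tool such as MLflow and deployed behind an API, in line with the MLOps exposure the posting asks for.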
Posted 1 month ago