
1932 Clustering Jobs - Page 40

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 6.0 years

3 - 9 Lacs

Gurgaon

Remote

Job description

About this role

About Aladdin Financial Engineering (AFE): Join a diverse and collaborative team of over 300 modelers and technologists in Aladdin Financial Engineering (AFE) within BlackRock Solutions, the business responsible for the research and development of Aladdin's financial models. This group is also accountable for analytics production, enhancing the infrastructure platform, and delivering analytics content to portfolio and risk management professionals (both within BlackRock and across the Aladdin client community). The models developed and supported by AFE span a wide array of financial products covering equities, fixed income, commodities, derivatives, and private markets. AFE provides investment insights that range from an analysis of cash flows on a single bond to the overall financial risk associated with an entire portfolio, balance sheet, or enterprise.

Role Description: We are looking for a person to join the Advanced Data Analytics team within AFE Single Security. Advanced Data Analytics is a team of quantitative data and product specialists focused on delivering Single Security data content, governance, product solutions, and the research platform. The team leverages data, cloud, and emerging technologies to build an innovative data platform, with a focus on business and research use cases in the Single Security space. The team uses various statistical and mathematical methodologies to derive insights and generate content that helps develop predictive models, clustering, and classification solutions and enables governance. The team works on Mortgage, Structured & Credit Products. We are looking for a person to help build and expand Data & Analytics Content in the Credit space. The person will be responsible for building, enhancing, and maintaining the Credit Content Suite.
The person will work on the below:
- Credit Derived Data Content
- Model & Data Governance
- Credit Model & Analytics

Experience:
- Experience with Scala
- Knowledge of ETL, data curation, and analytical jobs using a distributed computing framework such as Spark
- Knowledge of and experience working with large enterprise databases like Snowflake and Cassandra, and cloud managed services like Dataproc and Databricks
- Knowledge of financial instruments like corporate bonds, derivatives, etc.
- Knowledge of regression methodologies
- Aptitude for designing and building tools for data governance
- Python knowledge is a plus

Qualifications:
- Bachelor's/Master's in Computer Science with a major in Math, Econ, or a related field
- 3-6 years of relevant experience

Our benefits: To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock: At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being.
Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Job Requisition # R253234
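The BlackRock listing above asks for knowledge of regression methodologies. As a minimal illustration, here is a pure-Python sketch of simple ordinary least squares for one predictor; the data points are hypothetical.

```python
# Minimal ordinary least squares for one predictor: fit y ~ a + b*x.
# Closed form: b = cov(x, y) / var(x), a = mean(y) - b * mean(x).

def ols_fit(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    b = cov / var
    a = my - b * mx
    return a, b

# Hypothetical data generated as y = 2 + 3x, so OLS recovers a=2, b=3.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [5.0, 8.0, 11.0, 14.0]
a, b = ols_fit(xs, ys)
```

In a Spark-based pipeline like the one described, the same fit would typically be done with MLlib over distributed data; the closed form above only shows the underlying arithmetic.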

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Job Description

Are you excited to work with innovative security products? Do you enjoy working with innovative and strategic solutions to solve complex problems? Join our IT Support team.

Akamai's Infrastructure Services team manages and supports corporate IT infrastructure for internal business functions. In this role, you will manage daily operations of Data Protection and Network Storage within Corporate IT. You will also collaborate with technical teams and business functions to deliver IT solutions efficiently.

Partner with the best: Akamai's Infrastructure Services team is seeking a Unix System Administrator II. The role involves providing operational support for Data Protection and Network Storage infrastructure and delivering IT support services efficiently.

As a Systems Administrator II UNIX, you will be responsible for:
- Ongoing administration support of the NetApp platform (installation, upgrade, patching, monitoring)
- Using system resource utilization and monitoring tools like GAP (Grafana, Alertmanager & Prometheus) to identify potential problem areas before they result in incidents
- Assisting in change management of platforms for new versions of ONTAP, hotfixes, sysadmin tasks, etc.
- Automating routine administrative and maintenance tasks by scripting the procedures through shell scripting, Perl, or Ansible
- Handling major/critical incidents for problem resolution and working through root cause analysis and preventive actions
- Collaborating with engineering teams on project implementation

Do What You Love: To be successful in this role you will:
- Have a Bachelor's degree in Computer Science or Engineering
- Have 5+ years of experience in IT infrastructure management and service delivery
- Have hands-on expertise in managing NetApp storage arrays and Cisco MDS fabric switches (5+ years)
- Have 2+ years of experience with Linux, Windows servers, virtualization, clustering, and data protection
- Possess advanced knowledge of TCP/IP, VLANs, DNS, DHCP, and high-availability networking
- Demonstrate good communication and documentation skills, and hold relevant certifications (NetApp, RedHat)

Work in a way that works for you: FlexBase, Akamai's Global Flexible Working Program, is based on the principles that are helping us create the best workplace in the world. When our colleagues said that flexible working was important to them, we listened. We also know flexible working is important to many of the incredible people considering joining Akamai. FlexBase gives 95% of employees the choice to work from their home, their office, or both (in the country advertised). This permanent workplace flexibility program is consistent and fair globally, to help us find incredible talent virtually anywhere. We are happy to discuss working options for this role and encourage you to speak with your recruiter in more detail when you apply.

Learn what makes Akamai a great place to work: Connect with us on social media and see what life at Akamai is like! We power and protect life online by solving the toughest challenges, together. At Akamai, we're curious, innovative, collaborative, and tenacious. We celebrate diversity of thought and hold an unwavering belief that we can make a meaningful difference. Our teams use their global perspectives to put customers at the forefront of everything they do, so if you are people-centric, you'll thrive here.

Working for you: At Akamai, we will provide you with opportunities to grow, flourish, and achieve great things. Our benefit options are designed to meet your individual needs and budget, both today and in the future. We provide benefits surrounding all aspects of your life: your health, your finances, your family, your time at work, and your time pursuing other endeavors.

About Us: Akamai powers and protects life online.
Leading companies worldwide choose Akamai to build, deliver, and secure their digital experiences, helping billions of people live, work, and play every day. With the world's most distributed compute platform, from cloud to edge, we make it easy for customers to develop and run applications, while we keep experiences closer to users and threats farther away.

Join us: Are you seeking an opportunity to make a real difference in a company with global reach and exciting services and clients? Come join us and grow with a team of people who will energize and inspire you!
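The Akamai role above involves using monitoring tools to identify potential problem areas before they result in incidents. The stdlib-Python sketch below shows the shape of such a threshold check on hypothetical volume statistics; real NetApp monitoring would pull these figures from ONTAP APIs or Prometheus exporters rather than hard-coded pairs.

```python
# Sketch: flag volumes whose utilization exceeds a threshold.
# The volume names and byte counts below are hypothetical test data.

def over_threshold(volumes, threshold=0.85):
    """volumes: dict of name -> (total_bytes, used_bytes)."""
    alerts = []
    for name, (total, used) in volumes.items():
        pct = used / total
        if pct >= threshold:
            alerts.append((name, round(pct, 2)))
    return sorted(alerts)

volumes = {
    "vol_home": (1000, 950),  # 95% full, should alert
    "vol_data": (1000, 400),  # 40% full, fine
}
alerts = over_threshold(volumes)
```

A real deployment would feed these alerts into Alertmanager rather than returning a list, but the comparison logic is the same.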

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Noida

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: SAP for Utilities Cust Financial Mgt FICA
Good-to-have skills: NA
Minimum experience: 12 years
Educational Qualification: Graduate

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems that apply across multiple teams. With your expertise and leadership, you will contribute to the success of the project and drive innovation in application development.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems that apply across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have: Proficiency in SAP for Utilities Cust Financial Mgt FICA
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP for Utilities Cust Financial Mgt FICA
- This position is based in Noida
- A Graduate degree is required

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Calcutta

Remote

3+ years of experience.

Responsibilities:
- Create and maintain databases, warehouses, users, schemas, file formats, and stages in Snowflake
- Add and adjust required clusters to the warehouse based on warehouse load
- Implement Snowflake database security changes as per requirements
- Implement network policies for managing network configurations to the Snowflake service
- Should have excellent knowledge of clustering and defining cluster keys on tables

Discover a career with a greater purpose at CBNITS: Build resilience and nimbleness through automation. Clearly define and evangelise your mission/vision to the organisation. Recognize and pay off technical debt. See your people, measure your data.

BE A PART OF THE SMARTEST TEAM: This is your chance to work in a team that is full of smart people with excellent tech knowledge.

GET RECOGNIZED FOR YOUR CONTRIBUTION: Even your smallest contribution will get recognised. We express real care that goes beyond the standard paycheck and benefits package.

FLEXIBLE WORKING HOURS: Work from home and work flexible hours; we allow you to tailor your work to suit your life outside the office.

CAREER DEVELOPMENT AND OPPORTUNITIES: From arranging virtual workshops to e-learning, we make it easy for employees to improve their core skills.

WHO WE ARE: CBNITS LLC, an MNC headquartered in Fremont, USA, is the place where you are inspired to explore your passions and where your talent is nurtured and cultivated. We have one development centre in India (Kolkata) that has been providing full IT solutions to our clients for the last 7 years. We mostly deal with projects like Big Data Hadoop, Dynamics 365, IoT, SAP, Machine Learning, Deep Learning, Blockchain, Flutter, React JS & React Native, DevOps & Cloud AWS, Golang, etc.
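The Snowflake listing above mentions defining cluster keys on tables. As a hedged illustration, the sketch below only assembles the DDL statement as a string (the table and column names are hypothetical); actually running it would require an authenticated Snowflake session, e.g. via the snowflake-connector-python package.

```python
def cluster_by_ddl(table, columns):
    """Build an ALTER TABLE ... CLUSTER BY statement for Snowflake.

    Cluster keys tell Snowflake how to co-locate rows in micro-partitions,
    which improves pruning for queries that filter on those columns.
    """
    cols = ", ".join(columns)
    return f"ALTER TABLE {table} CLUSTER BY ({cols});"

# Hypothetical table and columns
ddl = cluster_by_ddl("sales.orders", ["order_date", "region"])
```

Choosing low-to-moderate cardinality columns that appear in common filter predicates is the usual guidance for cluster keys.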

Posted 3 weeks ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Primary skills: Database Administration -> SQL Database Administration (SQL DBA); Technology -> Database Administration -> MS SQL Server; Technology -> Database -> Oracle Database; Technology -> Infrastructure Database Administration -> SQL administration

Good knowledge of SQL DBA or Oracle DBA is required (experience in both would be an added advantage).
- For SQL DBA applicants: experience in mirroring, clustering, log shipping, HADR activities, and Always On is preferred.
- For Oracle DBA applicants: experience in RAC, RMAN, Data Guard, Exadata, and GoldenGate is preferred.

Other requirements:
- Basics of the business domain, to understand business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.

Additional skills:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements to systems requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Lead Data Scientist

About Us: Cognitio Analytics, founded in 2013, aims to be the preferred provider of AI/ML-driven productivity solutions for large enterprises. The company has received awards for its Smart Operations and Total Rewards Analytics solutions and is dedicated to innovation, R&D, and creating sustained value for clients. Cognitio Analytics has been recognized as a "Great Place to Work" for its commitment to fostering an innovative work environment and employee satisfaction. Our solutions include Total Rewards Analytics, powered by Cognitio's Total Rewards Data Factory; these solutions help our clients achieve better outcomes and higher ROI on investments in all kinds of Total Rewards programs. Our smart operations solutions drive productivity in complex operations such as claims processing, commercial underwriting, etc. These solutions, based on proprietary capabilities in AI, advanced process and task mining, and a deep understanding of operations, drive effective digital transformation for our clients.

We are actively seeking a talented and results-driven Data Scientist to join our team and take on a leadership role in driving business outcomes through the power of data analytics and insights. Your contributions will be instrumental in making data-informed decisions, identifying growth opportunities, and propelling our organization to new levels of success.

Ideal qualifications, skills, and experience:
- Doctorate/Master's/Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, Economics, Commerce, or a related field
- Minimum of 6 years of experience working as a Data Scientist or in a similar analytical role, with experience leading data science projects and teams
- Experience in the healthcare domain with exposure to clinical operations, financial, risk rating, fraud, digital, sales and marketing, and wellness; e-commerce or ed-tech industry experience is a plus
- Proven ability to lead and mentor a team of data scientists, fostering an innovative environment
- Strong decision-making and problem-solving skills to guide strategic initiatives
- Expertise in programming languages such as Python and R, and proficiency with data manipulation, analysis, and visualization libraries (e.g., pandas, NumPy, Matplotlib, seaborn)
- Very strong Python skills; exceptional with pandas, NumPy, and advanced Python (pytest, classes, inheritance, docstrings)
- Deep understanding of machine learning algorithms, model evaluation, and feature engineering; experience with frameworks like scikit-learn, TensorFlow, or PyTorch
- Experience leading a team and handling projects with end-to-end ownership is a must
- Deep understanding of ML and deep learning is a must; basic NLP experience is highly valuable
- PySpark experience is highly valuable
- Competitive coding experience (LeetCode) is highly valuable
- Strong expertise in statistical modelling techniques such as regression, clustering, time series analysis, and hypothesis testing
- Experience building and deploying machine learning models in a cloud environment, Microsoft Azure preferred (Databricks, Synapse, Data Factory, etc.)
- Basic MLOps experience with FastAPI and Docker is highly valuable, as is AI governance experience
- Ability to understand business objectives, market dynamics, and strategic priorities
- Demonstrated experience translating data insights into tangible business outcomes and driving data-informed decision-making
- Excellent verbal and written communication skills
- Proven experience leading data science projects, managing timelines, and delivering results within deadlines
- Strong collaboration skills, with the ability to work effectively in cross-functional teams, build relationships, and foster a culture of knowledge sharing and continuous learning

Cognitio Analytics is an equal-opportunity employer. We are committed to a work environment that celebrates diversity. We do not discriminate against any individual based on race, color, sex, national origin, age, religion, marital status, sexual orientation, gender identity, gender expression, military or veteran status, disability, or any factors protected by applicable law.

All Cognitio employees are expected to understand and adhere to all Cognitio security- and privacy-related policies in order to protect Cognitio data and our clients' data.

Our salary ranges are based on paying competitively for our size and industry and are one part of the total compensation package, which also includes a bonus plan, equity, benefits, and other opportunities at Cognitio. Individual pay decisions are based on a number of factors, including qualifications for the role, experience level, and skill set. (ref:hirist.tech)
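Among the statistical techniques the listing above names is hypothesis testing. As a small illustration, the sketch below computes Welch's t statistic for two independent samples using only the standard library; the sample values are made up, and in practice scipy.stats.ttest_ind would also supply the p-value.

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (ma - mb) / se

# Hypothetical measurements from a control and a treatment group
t = welch_t([2.1, 2.0, 1.9, 2.2], [2.6, 2.5, 2.7, 2.4])
```

A large-magnitude t (here strongly negative, since the first group's mean is lower) indicates the group means likely differ; the significance level still requires the degrees of freedom and a t-distribution lookup.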

Posted 3 weeks ago

Apply

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

About The Role: We are looking for a passionate and skilled AI/ML Developer to join our dynamic team. The ideal candidate will have strong experience with LangChain, Retrieval-Augmented Generation (RAG), and agentic AI systems, as well as expertise in clustering, regression, deep learning, and data transformation/cleaning techniques. You will work on building intelligent, autonomous AI solutions that drive innovative business applications.

Responsibilities:
- Design and develop AI applications using LangChain and agentic AI architectures
- Implement RAG pipelines for advanced knowledge retrieval and reasoning tasks
- Build machine learning models for clustering, regression, and prediction tasks
- Develop and fine-tune deep learning models for text-based or tabular data
- Perform thorough data cleaning, transformation, and feature engineering
- Write efficient, production-quality Python code
- Work with SQL databases for data querying and preparation
- Research and stay updated with new techniques in LLMs, agentic AI, and machine learning

Requirements:
- Excellent programming skills in Python
- Proficiency in SQL for database querying and data manipulation
- Strong understanding of machine learning algorithms (especially clustering and regression)
- Hands-on experience with deep learning frameworks (TensorFlow, PyTorch, or Keras)
- Practical experience with LangChain, RAG, and building autonomous AI agents
- Expertise in data preprocessing, cleaning, and transformation techniques

Preferred Skills:
- Experience working with LLMs (Large Language Models) and prompt engineering
- Knowledge of vector databases like Pinecone, FAISS, or ChromaDB
- Strong analytical thinking and problem-solving skills
- Ability to work independently and in collaborative environments

Skills: data transformation, machine learning, artificial intelligence, regression, retrieval-augmented generation (RAG), Python, LangChain, deep learning, data cleaning, clustering, agentic AI systems, SQL
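This listing asks for hands-on RAG experience. The sketch below is a deliberately simplified, stdlib-only version of the retrieval step: it "embeds" text as bag-of-words vectors and ranks documents by cosine similarity. A production pipeline would use a learned embedding model and a vector store such as FAISS, and pass the retrieved text into an LLM prompt; the documents and query here are hypothetical.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve(query, docs, k=1):
    """Rank documents by similarity to the query; top-k would feed the LLM prompt."""
    scored = sorted(docs, key=lambda d: cosine(embed(query), embed(d)), reverse=True)
    return scored[:k]

docs = [
    "snowflake stores warehouse data",
    "kmeans clustering groups similar points",
]
top = retrieve("what is clustering", docs)
```

The key design point RAG relies on is exactly this separation: retrieval narrows a large corpus to a few relevant passages, and the generator only ever sees those passages.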

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are looking for a Lead Data Scientist to join our collaborative team. You will play a key role in developing and implementing AI solutions across various applications, from statistical analysis to natural language processing. If you are passionate about leveraging data to create impactful solutions, we encourage you to apply.

Responsibilities:
- Develop and implement AI solutions including classification, clustering, and anomaly detection
- Conduct statistical data analysis and apply machine learning techniques
- Manage complete project delivery, from data preparation to model evaluation
- Utilize Python programming and SQL for data manipulation and analysis
- Engage in MLOps and model development workflows
- Create models that are accessible for business use
- Collaborate with teams using software development methodologies and version control
- Document processes and maintain project tracking tools such as Jira
- Stay updated with new technologies and apply problem-solving skills effectively
- Deliver production-ready solutions and facilitate knowledge sharing

Requirements:
- 9+ years of experience in software engineering, specializing in Data Science
- At least 1 year of relevant leadership experience
- Proficiency in statistical data analysis, machine learning, and NLP, with a clear understanding of practical applications and limitations
- Experience in developing and implementing AI solutions, including classification, clustering, anomaly detection, and NLP
- Expertise in complete project delivery, from data preparation to model building, evaluation, and visualization
- Proficiency in Python programming and SQL, with experience in production-level code and data analysis libraries
- Familiarity with MLOps, model development workflows, and feature engineering techniques
- Capability in manipulating data and developing models accessible for business use, with experience in Azure AI Search
- Competence in software development methodologies, code versioning (e.g., GitLab), and project tracking tools (e.g., Jira)
- Enthusiasm for learning new technologies, with expertise in problem-solving and delivering production-ready solutions
- Fluency in the UNIX command line
- Familiarity with Agile development practices
- Excellent communication skills in English, with a minimum proficiency level of B2+

Nice to have:
- Knowledge of cloud computing
- Experience with Big Data tools
- Familiarity with visualization tools
- Proficiency in containerization tools

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- Translate complex business problems into data science solutions and articulate the value and limitations of these solutions to non-technical stakeholders
- Develop, deploy, and monitor predictive models, statistical analyses, and optimization algorithms
- Work hands-on with large datasets: cleaning, transforming, and analysing them to extract actionable insights
- Collaborate with cross-functional teams including Business, Product, Engineering, and Operations to ensure data science outputs are actionable and aligned with strategic goals
- Present findings and recommendations to senior leadership in a clear and compelling manner
- Continuously validate and improve models based on real-world feedback and changing business needs
- Leverage Microsoft ecosystem tools (Azure ML, Power BI, SQL Server, Excel, etc.) extensively for data preparation, model development, deployment, and visualization

Key Skills & Competencies:
- 5+ years of experience in data science, machine learning, or advanced analytics roles
- Proficiency in Python for data analysis and machine learning
- Strong working knowledge of Microsoft technologies: Azure Data Services (Azure ML, Azure Data Factory, Azure Synapse); Power BI for data visualization (optional); Microsoft SQL Server and Azure SQL Database; advanced Excel (Power Query, Power Pivot)
- Solid understanding of a range of machine learning techniques (classification, regression, clustering, recommendation systems, etc.) and their real-world trade-offs
- Strong business acumen, with the ability to contextualize technical outputs into strategic business insights
- Excellent communication skills (written and verbal), with the ability to influence decision-making
- Bachelor's or Master's degree in Computer Science, Data Science, Mathematics, Statistics, Engineering, or a related field
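Clustering appears throughout these listings as a core technique. As a compact illustration of the idea, here is a pure-Python k-means on one-dimensional data with fixed initial centroids; the data and starting centroids are made up, and in practice a library implementation such as scikit-learn's KMeans handles initialization, higher dimensions, and convergence checks.

```python
def kmeans_1d(points, centroids, iters=10):
    """Plain k-means on 1-D data with given initial centroids."""
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)), key=lambda j: abs(p - centroids[j]))
            clusters[i].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# Hypothetical data with two obvious groups, around 1 and around 10
pts = [0.9, 1.0, 1.1, 9.9, 10.0, 10.1]
centers = kmeans_1d(pts, centroids=[0.0, 5.0])
```

The two alternating steps, assignment and update, are the entire algorithm; everything else in production implementations is initialization strategy and efficiency.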

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a Senior Data Scientist to join our team and drive innovation by leveraging your expertise in statistical data analysis, machine learning, and NLP to create and deliver impactful AI solutions. As a Senior Data Scientist, you will work on challenging projects that require end-to-end involvement, from data preparation to model deployment, while collaborating with cross-functional teams and delivering production-ready solutions.

Responsibilities:
- Develop, implement, and evaluate AI solutions, including classification, clustering, anomaly detection, and NLP
- Apply advanced statistical techniques and machine learning algorithms to solve complex business problems
- Utilize Python and SQL to write production-level code and perform comprehensive data analysis
- Implement model development workflows, including MLOps and feature engineering techniques
- Utilize Azure AI Search and other tools to make data and models accessible to stakeholders
- Collaborate with software development and project management teams, leveraging version control tools like GitLab and project tracking software like Jira
- Optimize data pipelines and model performance for real-world applications
- Communicate technical concepts effectively to both technical and non-technical audiences
- Stay updated on emerging technologies, applying a problem-solving mindset to integrate them into projects
- Ensure adherence to Agile development practices and maintain fluency in UNIX command line operations

Requirements:
- 4+ years of experience in Data Science
- Proficiency in statistical data analysis, machine learning, and NLP, with an understanding of practical applications and limitations
- Expertise in Python programming and SQL, with experience in data analysis libraries and production-level code
- Background in developing AI solutions, including classification, clustering, anomaly detection, or NLP
- Familiarity with MLOps and feature engineering techniques, with hands-on experience in model workflows
- Flexibility to use tools like Azure AI Search to make models accessible for business use
- Competency in software development methodologies and code versioning tools such as GitLab
- Knowledge of project management tools such as Jira and Agile development practices
- Experience working with the UNIX command line and problem-solving with innovative technologies
- B2 level of English or higher, with an emphasis on technical communication skills

Nice to have:
- Familiarity with cloud computing, Big Data tools, and/or containerization technologies
- Proficiency in data visualization tools for clear communication of insights
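Anomaly detection is one of the AI solutions the listing above names. A minimal z-score version, using only the standard library, flags readings far from the mean; the readings and threshold are hypothetical, and real systems typically prefer more robust methods (median/MAD, isolation forests) since the mean and standard deviation are themselves skewed by outliers.

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return [v for v in values if sd and abs(v - mu) / sd > threshold]

# Hypothetical sensor readings; 30.0 is the obvious outlier
readings = [10.0, 10.2, 9.8, 10.1, 9.9, 30.0]
outliers = zscore_anomalies(readings, threshold=2.0)
```

Note the lowered threshold here: with only six points, the single outlier inflates the standard deviation enough that the conventional 3-sigma cut would miss it.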

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary Position Summary Senior Analyst - Technical Specialist – Windows, VMWare & Linux - Deloitte Support Services India Private Limited Do you thrive on developing creative and innovative insights to solve complex challenges? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? Work with premiere thought leaders in your field? Work for a world-class organization that provides an exceptional career experience with an inclusive and collaborative culture? Want to make an impact that matters? Consider Deloitte Global. Work you’ll do: The Deloitte Technical Operations Center (TOC) has a broad responsibility to maintain and enhance IT service availability 24x7x365. This includes infrastructure and application services for consumption internally (by Deloitte professionals) as well as by Deloitte’s clients, worldwide. As a TOC Operations Technical Specialist, you will respond to early indicators of system distress to avoid business disruption. You will participate in service restoration efforts through the Major Incident Management process, leading diverse teams of technical professionals in complex troubleshooting efforts. Your contribution to and participation in internal learning delivery will enable the TOC to maintain and improve system availability from multiple locations in support of business operations. As the organization matures, you will provide valuable input to the service design process creating resilient systems. Critical to this role is a mature, cross disciplinary skill set, spanning multiple aspects of service design and delivery, advanced troubleshooting, and the ability to lead disparate technical teams in the pursuit of rapid solutions to complex issues. Technical Skills 24x365 Windows, Virtualization and Linux operations and management Working knowledge of RedHat v7, v8+ in a support setting. Working knowledge of scripting methods such as ansible. 
Willingness to get involved with technologies that interface with UNIX (such as, but not limited to) LAMP stack, TSM, Samba, LDAP, SAS, GPFS, Oracle, SAP, NAS, SAN, storage, SFTP, etc. Experience working with HP, IBM, Dell, or other datacenter server hardware and storage systems. Experience working with cloud architecture (GCP, Azure, AWS). In-depth knowledge of VMware. In-depth knowledge of automating and orchestrating tasks using PowerShell and APIs. In-depth knowledge of Windows Server operating systems. In-depth knowledge of Microsoft SQL Server as it pertains to virtualization and storage best practices. In-depth knowledge of Windows Clustering. Managing disk space, processor, memory, and network utilization related to server support during software installation. LVM management and file system management. OS hardening and OS troubleshooting. Experience with Application Performance Monitoring tools such as AppInsights or New Relic. Experience troubleshooting both Microsoft Windows Server and top-tier Linux distributions (RHEL, SLES, Ubuntu). Identify repeatable operational tasks and issues; create automated resolutions to these situations to reduce operational overhead within the virtualization function as well as other enabling areas as required. Support infrastructure applications from a virtualization function as required, including VMware Cloud Foundation (ESX, VSAN, vRealize), SQL Server, and Microsoft Clustering. Optimization and performance of hardware infrastructure. Ability to diagnose issues with RedHat Linux OS and Windows Server OS to a competent level within a pressured environment, and able to meet tight project deadlines to deliver new services to the organization. Responsible for training and knowledge sharing of the service delivery process with other members. Responsible for management of server lifecycles within multiple large-scale data center environments across the globe. 
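The posting's emphasis on automating repeatable operational tasks (managing disk space, processor, memory, and network utilization) can be illustrated with a minimal sketch. The paths and threshold below are illustrative assumptions, not part of the role description:

```python
import shutil

def check_disk_usage(paths, threshold_pct=90.0):
    """Return (path, percent_used) pairs for filesystems over the threshold.

    A toy version of the kind of check that would feed an automated alert.
    """
    alerts = []
    for path in paths:
        usage = shutil.disk_usage(path)        # total/used/free in bytes
        pct_used = usage.used / usage.total * 100
        if pct_used >= threshold_pct:
            alerts.append((path, round(pct_used, 1)))
    return alerts

# Example: scan the current filesystem with a hypothetical 90% threshold.
print(check_disk_usage(["."]))
```

In a TOC setting a script like this would typically run on a schedule and forward alerts to a ticketing or monitoring system rather than print them.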
Compliance with Change Control Processes and adherence to standards and documentation. Working with project teams, network engineers, and the technical support group implementing new services/systems as required. Performs all related job functions following established processes and procedures to preserve the confidentiality of information hosted and managed by the Deloitte Technology organization from unauthorized disclosure. What you’ll be part of—our Deloitte Global culture: At Deloitte, we expect results. Incredible—tangible—results. And Deloitte Global professionals play a unique role in delivering those results. We reach across disciplines and borders to serve our global organization. We are the engine of Deloitte. We develop and lead global strategies and provide programs and services that unite our network. In Deloitte Global, everyone has an opportunity to lead. We see the importance of your perspective and your ability to create value. We want you to fit in—with an inclusive culture, focus on work-life fit and well-being, and a supportive, connected environment; but we also want you to stand out—with opportunities to have a strategic impact, innovate, and take the risks necessary to make your mark. Who you’ll work with: Global Technology Services works at the forefront of technology development and processes to support and protect Deloitte around the world. In this truly global environment, we operate not in "what is" but rather "what can be" to help Deloitte deliver and connect with its clients, its communities, and one another in ways not previously conceived. Qualifications Required: Mastery of English language skills (oral and written); Bachelor’s degree in Computer Science, Business Information Systems, or relevant experience and accomplishments; 4 years of experience in the IT field; 3+ years of experience with Windows, VMware, and Linux technologies. 
Fluent in ITIL methodology Working knowledge of at least one of the following scripting languages: PowerShell, Python Working knowledge of ServiceNow or similar service management platform Certifications preferred: Windows, VMware or Linux How you’ll grow: Deloitte Global inspires leaders at every level. We believe in investing in you, helping you embrace leadership opportunities at every step of your career, and helping you identify and hone your unique strengths. We encourage you to grow by providing formal and informal development programs, coaching and mentoring, and on-the-job challenges. We want you to ask questions, take chances, and explore the possible. Benefits you’ll receive: Deloitte’s Total Rewards program reflects our continued commitment to lead from the front in everything we do — that’s why we take pride in offering a comprehensive variety of programs and resources to support your health and well-being needs. We provide the benefits, competitive compensation, and recognition to help sustain your efforts in making an impact that matters. Corporate Citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. 
Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 301015

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description: MS SQL Server Administration: Minimum 5+ years of experience, with in-depth knowledge of SQL database monitoring, performance tuning, and troubleshooting. High Availability & Disaster Recovery (HA/DR): Advanced knowledge and implementation of HA/DR strategies. T-SQL: Hands-on experience in T-SQL is mandatory (a DBA without T-SQL expertise is not considered a DBA engineer). Transaction Replication: Proficient in managing and troubleshooting transaction replication. Availability Groups & Clustering: Expertise in configuring and maintaining high availability and clustering solutions. Communication Skills: Strong verbal and written communication is essential. Automation & Scripting: Development of scripts and automation tools (e.g., PowerShell) to reduce manual tasks and enhance efficiency. Cloud Migration: Contribution to planning and executing Azure cloud migration strategies. Replication Management: Implementation, maintenance, and troubleshooting of database replication using SQL Server tools and third-party solutions (e.g., Fivetran, Qlik). Requirements Experience: 7+ years of production DBA experience. 5+ years of SQL development experience. Hands-on expertise in PowerShell scripting. Cloud Expertise: Proven experience with cloud migrations, particularly Microsoft Azure and/or AWS. Multi-datacenter environment experience, including Azure/AWS integration. Technical Skills: Strong analytical and problem-solving skills with a focus on automation and innovation. High proficiency in HA and DR solutions for MSSQL and Azure SQL. Expertise in database replication across multi-server/datacenter setups. SSRS development and SSDT with TFS/Git source control. Skills: SQL DBA, T-SQL, PowerShell

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

We are looking for a Lead Data Scientist to join our collaborative team. You will play a key role in developing and implementing AI solutions across various applications, from statistical analysis to natural language processing. If you are passionate about leveraging data to create impactful solutions, we encourage you to apply. Responsibilities Develop and implement AI solutions including classification, clustering, and anomaly detection Conduct statistical data analysis and apply machine learning techniques Manage complete project delivery from data preparation to model evaluation Utilize Python programming and SQL for data manipulation and analysis Engage in ML Ops and model development workflows Create models that are accessible for business use Collaborate with teams using software development methodologies and version control Document processes and maintain project tracking tools such as Jira Stay updated with new technologies and apply problem-solving skills effectively Deliver production-ready solutions and facilitate knowledge sharing Requirements 9+ years of experience in software engineering, specializing in Data Science At least 1 year of relevant leadership experience Proficiency in statistical data analysis, machine learning, and NLP, with a clear understanding of practical applications and limitations Experience in developing and implementing AI solutions, including classification, clustering, anomaly detection, and NLP Expertise in complete project delivery, from data preparation to model building, evaluation, and visualization Proficiency in Python programming and SQL, with experience in production-level code and data analysis libraries Familiarity with ML Ops, model development workflows, and feature engineering techniques Capability in manipulating data and developing models accessible for business use, with experience in Azure AI Search Competence in software development methodologies, code versioning (e.g., GitLab), and project tracking tools 
(e.g., Jira) Enthusiasm for learning new technologies, with expertise in problem-solving and delivering production-ready solutions Fluency in UNIX command line Familiarity with Agile development practices Excellent communication skills in English, with a minimum proficiency level of B2+ Nice to have Knowledge of Cloud Computing Experience with Big Data tools Familiarity with visualization tools Proficiency in containerization tools
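The classification and clustering work this role describes is normally done with libraries such as scikit-learn; as a dependency-free illustration of the core idea behind clustering, here is a toy one-dimensional k-means. The data and seed are invented for the example:

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    """Toy 1-D k-means: alternate assignment and centroid update steps."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # random initial centroids
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                       # assign each point to nearest centroid
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]  # recompute means
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two well-separated groups around 1.0 and 10.0:
data = [1.0, 1.2, 0.8, 10.0, 10.5, 9.5]
print(kmeans_1d(data, k=2))  # -> [1.0, 10.0]
```

In production one would reach for `sklearn.cluster.KMeans`, which handles multi-dimensional data, smarter initialization, and convergence checks.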

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are looking for a Lead Data Scientist to join our collaborative team. You will play a key role in developing and implementing AI solutions across various applications, from statistical analysis to natural language processing. If you are passionate about leveraging data to create impactful solutions, we encourage you to apply. Responsibilities Develop and implement AI solutions including classification, clustering, and anomaly detection Conduct statistical data analysis and apply machine learning techniques Manage complete project delivery from data preparation to model evaluation Utilize Python programming and SQL for data manipulation and analysis Engage in ML Ops and model development workflows Create models that are accessible for business use Collaborate with teams using software development methodologies and version control Document processes and maintain project tracking tools such as Jira Stay updated with new technologies and apply problem-solving skills effectively Deliver production-ready solutions and facilitate knowledge sharing Requirements 9+ years of experience in software engineering, specializing in Data Science At least 1 year of relevant leadership experience Proficiency in statistical data analysis, machine learning, and NLP, with a clear understanding of practical applications and limitations Experience in developing and implementing AI solutions, including classification, clustering, anomaly detection, and NLP Expertise in complete project delivery, from data preparation to model building, evaluation, and visualization Proficiency in Python programming and SQL, with experience in production-level code and data analysis libraries Familiarity with ML Ops, model development workflows, and feature engineering techniques Capability in manipulating data and developing models accessible for business use, with experience in Azure AI Search Competence in software development methodologies, code versioning (e.g., GitLab), and project tracking tools 
(e.g., Jira) Enthusiasm for learning new technologies, with expertise in problem-solving and delivering production-ready solutions Fluency in UNIX command line Familiarity with Agile development practices Excellent communication skills in English, with a minimum proficiency level of B2+ Nice to have Knowledge of Cloud Computing Experience with Big Data tools Familiarity with visualization tools Proficiency in containerization tools

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are looking for a Lead Data Scientist to join our collaborative team. You will play a key role in developing and implementing AI solutions across various applications, from statistical analysis to natural language processing. If you are passionate about leveraging data to create impactful solutions, we encourage you to apply. Responsibilities Develop and implement AI solutions including classification, clustering, and anomaly detection Conduct statistical data analysis and apply machine learning techniques Manage complete project delivery from data preparation to model evaluation Utilize Python programming and SQL for data manipulation and analysis Engage in ML Ops and model development workflows Create models that are accessible for business use Collaborate with teams using software development methodologies and version control Document processes and maintain project tracking tools such as Jira Stay updated with new technologies and apply problem-solving skills effectively Deliver production-ready solutions and facilitate knowledge sharing Requirements 9+ years of experience in software engineering, specializing in Data Science At least 1 year of relevant leadership experience Proficiency in statistical data analysis, machine learning, and NLP, with a clear understanding of practical applications and limitations Experience in developing and implementing AI solutions, including classification, clustering, anomaly detection, and NLP Expertise in complete project delivery, from data preparation to model building, evaluation, and visualization Proficiency in Python programming and SQL, with experience in production-level code and data analysis libraries Familiarity with ML Ops, model development workflows, and feature engineering techniques Capability in manipulating data and developing models accessible for business use, with experience in Azure AI Search Competence in software development methodologies, code versioning (e.g., GitLab), and project tracking tools 
(e.g., Jira) Enthusiasm for learning new technologies, with expertise in problem-solving and delivering production-ready solutions Fluency in UNIX command line Familiarity with Agile development practices Excellent communication skills in English, with a minimum proficiency level of B2+ Nice to have Knowledge of Cloud Computing Experience with Big Data tools Familiarity with visualization tools Proficiency in containerization tools

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are seeking a Senior Data Scientist to join our team and drive innovation by leveraging your expertise in statistical data analysis, machine learning, and NLP to create and deliver impactful AI solutions. As a Senior Data Scientist, you will work on challenging projects that require end-to-end involvement, from data preparation to model deployment, all while working collaboratively with cross-functional teams and delivering production-ready solutions. Responsibilities Develop, implement, and evaluate AI solutions, including classification, clustering, anomaly detection, and NLP Apply advanced statistical techniques and machine learning algorithms to solve complex business problems Utilize Python and SQL to write production-level code and perform comprehensive data analysis Implement model development workflows, including ML Ops, and feature engineering techniques Utilize Azure AI Search and other tools to make data and models accessible to stakeholders Collaborate with software development and project management teams, leveraging version control tools like GitLab and project tracking software like Jira Optimize data pipelines and model performance for real-world applications Communicate technical concepts effectively to both technical and non-technical audiences Stay updated on emerging technologies, applying a problem-solving mindset to integrate them into projects Ensure adherence to Agile development practices and maintain fluency in UNIX command line operations Requirements 4+ years of experience in Data Science Proficiency in statistical data analysis, machine learning, and NLP with practical applications and limitations Expertise in Python programming and SQL, with experience in data analysis libraries and production-level code Background in developing AI solutions, including classification, clustering, anomaly detection, or NLP Familiarity with ML Ops and feature engineering techniques, with hands-on experience in model workflows Flexibility to use tools 
like Azure AI Search to make models accessible for business use Competency in software development methodologies and code versioning tools such as GitLab Knowledge of project management tools such as Jira and Agile development practices Experience working with the UNIX command line and problem-solving with innovative technologies B2 level of English or higher, with an emphasis on technical communication skills Nice to have Familiarity with Cloud Computing, Big Data tools, and/or containerization technologies Proficiency in data visualization tools for clear communication of insights
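One of the AI solutions this posting names is anomaly detection. The simplest classical approach is a z-score filter, sketched below with only the standard library; the sample values and the 3-sigma threshold are illustrative conventions, not requirements from the posting:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose distance from the mean exceeds `threshold` std devs."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs((v - mu) / sigma) > threshold]

# Thirty typical readings and one outlier:
print(zscore_anomalies([10] * 30 + [100]))  # -> [100]
```

Real pipelines usually prefer robust statistics (median/MAD) or model-based detectors, since the mean and standard deviation are themselves distorted by the outliers being hunted.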

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Join MSBC as an AI/ML Engineer – Deliver Real-World Intelligent Systems At MSBC, we design and implement practical AI solutions that solve real business problems across industries. As an AI/ML Engineer, you will play a key role in building and deploying machine learning models and data-driven systems that are used in production. This role is ideal for engineers with solid hands-on experience delivering end-to-end AI/ML projects. Key Tools and Frameworks • Programming Languages – Python (FastAPI, Flask, Django) • Machine Learning Libraries – scikit-learn, XGBoost, TensorFlow or PyTorch • Data Pipelines – Pandas, Spark, Airflow • Model Deployment – FastAPI, Flask, Docker, MLflow • Cloud Platforms – AWS, GCP, Azure (any one) • Version Control – Git Key Responsibilities • Design and develop machine learning models to address business requirements. • Build and manage data pipelines for training and inference workflows. • Train, evaluate, and optimise models for accuracy and performance. • Deploy models in production environments using containerised solutions. • Work with structured and unstructured data from various sources. • Ensure robust monitoring, retraining, and versioning of models. • Contribute to architecture and design discussions for AI/ML systems. • Document processes, results, and deployment procedures clearly. • Collaborate with software engineers, data engineers, and business teams. Required Skills and Qualifications • 4+ years of hands-on experience delivering ML solutions in production environments. • Strong programming skills in Python and deep understanding of ML fundamentals. • Experience with supervised and unsupervised learning, regression, classification, and clustering techniques. • Practical experience in model deployment and lifecycle management. • Good understanding of data preprocessing, feature engineering, and model evaluation. • Experience with APIs, containers, and cloud deployment. • Familiarity with CI/CD practices and version control. 
• Ability to work independently and deliver results in fast-paced projects. • Excellent English communication skills for working with distributed teams. • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. MSBC Group has been a trusted technology partner for over 20 years, delivering the latest systems and software solutions for financial services, manufacturing, logistics, construction, and startup ecosystems. Our expertise includes Accessible AI, Custom Software Solutions, Staff Augmentation, Managed Services, and Business Process Outsourcing. We are at the forefront of developing advanced AI-enabled services and supporting transformative projects. Operating globally, we drive innovation, making us a trusted AI and automation partner.
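The "train, evaluate, and optimise models" responsibility rests on a holdout evaluation: keep part of the labelled data unseen during training and score predictions on it. A minimal dependency-free sketch (function names and the seed are our own illustration, not MSBC's stack):

```python
import random

def train_test_split(data, labels, test_frac=0.25, seed=42):
    """Shuffle indices and carve out a holdout set for evaluation."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * (1 - test_frac))
    train, test = idx[:cut], idx[cut:]
    return ([data[i] for i in train], [labels[i] for i in train],
            [data[i] for i in test], [labels[i] for i in test])

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

X = list(range(8))
y = [0, 0, 0, 0, 1, 1, 1, 1]
X_tr, y_tr, X_te, y_te = train_test_split(X, y)
print(len(X_tr), len(X_te))  # -> 6 2
```

Production workflows layer cross-validation, stratification, and metrics beyond accuracy on top of this basic split.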

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description And Requirements CareerArc Code CA-DN Hybrid "At BMC trust is not just a word - it's a way of life!" We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation! The DSOM product line includes BMC’s industry-leading Digital Services and Operation Management products. We have many interesting SaaS products, in the fields of: Predictive IT service management, Automatic discovery of inventories, intelligent operations management, and more! We continuously grow by adding and implementing the most cutting-edge technologies and investing in Innovation! Our team is a global and versatile group of professionals, and we LOVE to hear our employees’ innovative ideas. So, if Innovation is close to your heart – this is the place for you! BMC is looking for a Senior Java Developer, an innovator at heart, to join us and design, develop, and implement complex applications, using the latest technologies. Here is how, through this exciting role, YOU will contribute to BMC's and your own success: You will play a pivotal role in the design, development, and delivery of the BMC Helix Suite. You will collaborate with cross-functional teams, including Product Management, Architects, Quality Engineering, and DevSecOps, to lead initiatives that align with our vision of creating intelligent, customer-centric solutions. This role is ideal for a highly skilled technologist passionate about creating scalable, high-performance SaaS products in a fast-paced, Agile environment. 
You will play an important role in the development of core features and modules for the Helix ITSM platform, and in the design and implementation of microservices using modern frameworks and technologies like Java, Spring Boot, Kubernetes, and RESTful APIs. Ensure high-quality code by adhering to best practices and industry standards. Collaborate with Product Managers to understand requirements and align technical solutions with business goals. Work closely with Quality Engineers to define robust testing strategies, including automated and performance testing. Drive innovation through R&D, focusing on enhancing AI-driven capabilities in areas like predictive analytics and automation. Analyze and resolve complex technical issues, ensuring scalability and performance of the product. Collaborate with DevSecOps teams to ensure seamless CI/CD pipelines and maintain product security standards. Participate in product deployment, upgrade, and security processes, ensuring compliance with industry regulations. To ensure you’re set up for success, you will bring the following skillset & experience: You have 8+ years in software development, with at least 2+ years as a Senior Developer or equivalent. You are proficient in Java (11+/17+), Spring Boot, RESTful API development, and microservices architecture, with experience in open-source Java frameworks such as OSGi, Spring, Hibernate, Maven, JSecurity, JMS, JPA, JTA, and JDBC. You are experienced in designing and developing complex framework and platform solutions with practical use of design patterns. You have expertise with unit/integration testing, test-driven development, and related modern best practices/technologies. You are experienced with server-side issues such as caching, clustering, persistence, security, SSO, state management, high scalability/availability, and failover. You have experience in implementing Business Process Management software and integrating complex enterprise systems. 
You are experienced with PostgreSQL, Oracle, or MS-SQL databases and aspect-oriented architectures. Good understanding of web services and SOA standards like REST, SOAP, XML, etc., and microservices architecture, including Kubernetes, Docker, and Kafka. You are experienced in open-source participation and Apache projects and the patent process; in-depth knowledge of app server architectures and SaaS or PaaS enabling platforms is a big plus. Hands-on experience with container orchestration tools like Kubernetes and Docker. Knowledge of DevSecOps tools (e.g., Jenkins, Terraform, Helm, Ansible) and security practices. Experience/knowledge with networking and web communication protocols, Kaazing Gateway or equivalent, is a big plus. Security: Authentication and Authorization: Experience with OAuth, SAML, or similar identity management systems. Encryption: Knowledge of TLS/SSL protocols to secure WebSocket communication. Firewall and Network Security: Understanding how to secure WebSocket connections in an enterprise environment. Whilst these are nice to have, our team can help you develop the following skills: Familiarity with cloud platforms (AWS, Azure, GCP) and SaaS environments. Experience with AI/ML integration in SaaS applications. Knowledge of ITIL/ITSM processes and tools. Certifications: SAFe Agilist, AWS Cloud Practitioner, or equivalent. BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process. At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 3,315,400 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. 
Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices. ( Returnship@BMC ) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to know more and how to apply. Min salary 2,486,550 Our commitment to you! BMC’s culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won’t be known just by your employee number, but for your true authentic self. BMC lets you be YOU! If after reading the above, you’re unsure if you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talents from diverse backgrounds and experience to ensure we face the world together with the best ideas! BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page. Mid point salary 3,315,400 Max salary 4,144,250

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Analyst – MySQL Database Administration Job Description: · Install, configure, and maintain database systems, schemas, tables, indexes, procedures, and permissions. · Perform backup and recovery of MySQL databases. · Ensure SQL transactional replication (create / monitor / resolve) is occurring successfully. · Conduct regular performance tuning, indexing, and normalization of database systems. · Develop and test SQL scripts for task automation and reporting. · Install, migrate, and administer MySQL to MariaDB. · Ensure server downtime resulting from maintenance and improvements is limited to non-production hours whenever possible. · Database patching and upgrades/migrations; resource monitoring and requirements analysis. Position Requirements: · Bachelor’s degree in Information Technology, Computer Science, or other related discipline a plus. · Minimum of 5 years of experience in database administration, specifically in MySQL. · Must have knowledge of MySQL installation and replication. · Thorough training and demonstrated analytical and technical skills in installing, configuring, integrating, and administering MySQL within the Linux / LAMP stack. · Experience including, but not limited to, MySQL, MariaDB, Linux RedHat, Apache, and PHP or other scripting languages. · Hands-on experience supporting highly available (HA) database environments with solutions such as clustering. This should include supporting a three-node MySQL cluster. · Excellent written and verbal communication skills. · Ability to work as a team member in a fast-paced environment. · Ability to manage and communicate project tasks, status, and deadlines. · Looking only for immediate joiners. · Candidate should be within Pune location. · Should be ready to work from office and in US shift. Additional Qualifications: · Experienced in working within a multi-tiered architecture environment, supporting B2B and/or hosting operations. · Experience with open-source software, including the LAMP software bundle. 
· Working knowledge of NoSQL/MongoDB
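As an illustration of the "SQL scripts for task automation and reporting" duty, here is a sketch of an automated per-table row-count report. It uses Python's built-in sqlite3 purely as a runnable stand-in; against MySQL/MariaDB one would use a driver such as mysql-connector-python and query `information_schema` instead, and the schema below is invented:

```python
import sqlite3

def table_row_counts(conn):
    """Report row counts per user table -- a typical scheduled health report."""
    cur = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    return {name: conn.execute(f"SELECT COUNT(*) FROM {name}").fetchone()[0]
            for (name,) in cur.fetchall()}

# Demo with an in-memory database and an invented table:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("a",), ("b",)])
print(table_row_counts(conn))  # -> {'users': 2}
```

Scheduled via cron and diffed against the previous run, a report like this catches silent replication or ingestion failures early.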

Posted 3 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About Us: Traya is an Indian direct-to-consumer hair care brand whose platform provides holistic treatment for consumers dealing with hair loss. The Company provides personalized consultations that help determine the root cause of hair fall among individuals, along with a range of hair care products that are curated from a combination of Ayurveda, Allopathy, and Nutrition. Traya's secret lies in the power of diagnosis. Our unique platform diagnoses the patient’s hair & health history to identify the root cause behind hair fall and delivers customized hair kits to them right at their doorstep. We have a strong adherence system in place via medically-trained hair coaches and proprietary tech, where we guide the customer across their hair growth journey and help them stay on track. Traya is founded by Saloni Anand, a techie-turned-marketeer, and Altaf Saiyed, a Stanford Business School alumnus. Our Vision: Traya was created with a global vision to create awareness around hair loss and de-stigmatise it, while empathizing with customers over its emotional and psychological impact. Most importantly, to combine 3 different sciences (Ayurveda, Allopathy, and Nutrition) to create the perfect holistic solution for hair loss patients. Responsibilities: Data Analysis and Exploration: Conduct in-depth analysis of large and complex datasets to identify trends, patterns, and anomalies. Perform exploratory data analysis (EDA) to understand data distributions, relationships, and quality. Machine Learning and Statistical Modeling: Develop and implement machine learning models (e.g., regression, classification, clustering, time series analysis) to solve business problems. Evaluate and optimize model performance using appropriate metrics and techniques. Apply statistical methods to design and analyze experiments and A/B tests. Implement and maintain models in production environments. 
Data Engineering and Infrastructure: Collaborate with data engineers to ensure data quality and accessibility. Contribute to the development and maintenance of data pipelines and infrastructure. Work with cloud platforms (e.g., AWS, GCP, Azure) and big data technologies (e.g., Spark, Hadoop). Communication and Collaboration: Effectively communicate technical findings and recommendations to both technical and non-technical audiences. Collaborate with product managers, engineers, and other stakeholders to define and prioritize projects. Document code, models, and processes for reproducibility and knowledge sharing. Present findings to leadership. Research and Development: Stay up to date with the latest advancements in data science and machine learning. Explore and evaluate new tools and techniques to improve data science capabilities. Contribute to internal research projects. Qualifications: Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field. 3-5 years of experience as a Data Scientist or in a similar role. Experience with Amazon SageMaker, including SageMaker Studio, Autopilot, Experiments, Pipelines, and Inference, to optimize model development and deployment workflows. Proficiency in Python and relevant libraries (e.g., scikit-learn, pandas, NumPy, TensorFlow, PyTorch). Solid understanding of statistical concepts and machine learning algorithms. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Experience deploying models to production. Experience with version control (Git). Preferred Qualifications: Experience with specific industry domains (e.g., e-commerce, finance, healthcare). Experience with natural language processing (NLP) or computer vision. Experience with building recommendation engines. Experience with time series forecasting.
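The clustering work this listing names can be illustrated in miniature. Below is a toy 1-D k-means loop in pure Python with made-up data; it is a sketch of the assign/update iteration only, and a role like this would use scikit-learn's KMeans or Spark ML in practice.

```python
# Toy 1-D k-means (illustrative only; real work would use scikit-learn).

def kmeans_1d(points, k, iters=20):
    """Cluster 1-D points into k groups by iterative mean refinement."""
    # Initialise centroids from the first k distinct sorted points.
    centroids = sorted(set(points))[:k]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # Update step: move each centroid to its cluster mean.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

centroids, clusters = kmeans_1d([1.0, 1.2, 0.8, 9.8, 10.0, 10.2], k=2)
# Two well-separated groups converge to centroids near 1.0 and 10.0.
```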

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for an enthusiastic Machine Learning Engineer to join our growing team. The hire will be responsible for working in collaboration with other data scientists and engineers across the organization to develop production-quality models for a variety of problems across Razorpay. Some possible problems include: making recommendations to merchants from Razorpay’s suite of products, cost optimisation of transactions for merchants, automatic address disambiguation/correction to enable tracking customer purchases using advanced natural language processing techniques, computer vision techniques for auto-verifications, running large-scale bandit experiments to optimize Razorpay’s merchant-facing web pages at scale, and many more. In addition to this, we expect the MLE to be adept at productionising ML models using state-of-the-art systems. As part of the DS team @ Razorpay, you’ll work with some of the smartest engineers/architects/data scientists/product leaders in the industry and have the opportunity to solve complex and critical problems for Razorpay. As a Senior MLE, you will also have the opportunity to partner with and be mentored by senior engineers across the organization and lay the foundation for a world-class DS team here at Razorpay. Come with the right attitude, and fun and growth are guaranteed!
Required qualifications: 5+ years of experience doing ML in a production environment and productionising ML models at scale. Bachelor's (required) or Master's degree in a quantitative field such as computer science, operations research, statistics, mathematics, or physics. Familiarity with basic machine learning techniques: regression, classification, clustering, model metrics and performance (AUC, ROC, precision, recall, and their various flavors). Basic knowledge of advanced machine learning techniques: regression, clustering, recommender systems, ranking systems, and neural networks. Expertise in coding in Python and good knowledge of at least one language from C, C++, and Java, plus at least one scripting language (Perl, shell commands). Experience with big data tools like Spark and experience working with Databricks / DataRobot. Experience with AWS’ suite of tools for production-quality ML work, or alternatively familiarity with Microsoft Azure / GCP. Experience deploying complex ML algorithms to production in collaboration with engineers using Flask, MLflow, Seldon, etc. Good to have: Excellent communication skills and ability to keep stakeholders informed of progress / blockers.
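The model metrics this listing names (precision, recall) reduce to counts over a confusion matrix. A minimal pure-Python sketch with made-up labels, purely to illustrate the definitions (in practice sklearn.metrics provides these):

```python
# Hand-rolled precision/recall for binary labels (illustrative only).

def precision_recall(y_true, y_pred):
    """Precision = TP/(TP+FP); recall = TP/(TP+FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# 2 true positives, 1 false positive, 1 false negative below.
p, r = precision_recall([1, 0, 1, 1, 0], [1, 1, 1, 0, 0])
# p == 2/3, r == 2/3
```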

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

The Role: We are looking for an enthusiastic Senior Data Scientist to join our growing team. The hire will be responsible for working in collaboration with other data scientists and engineers across the organization to develop production-quality models for a variety of problems across Razorpay. Some possible problems include: making recommendations to merchants from Razorpay’s suite of products, cost optimization of transactions for merchants, and automatic address disambiguation/correction to enable tracking customer purchases using advanced natural language processing techniques. As part of the DS team @ Razorpay, you’ll work with some of the smartest engineers/architects/data scientists in the industry and have the opportunity to solve complex and critical problems for Razorpay. Responsibilities: Apply advanced data science, mathematics, and machine learning techniques to solve complex business problems. Collaborate with cross-functional teams to design and deploy data science solutions. Analyze large volumes of data to derive actionable insights. Present findings and recommendations to stakeholders, effectively communicating complex concepts. Identify key metrics, conduct exploratory data analysis, and create executive-level dashboards. Manage multiple projects in a fast-paced environment, ensuring high-quality deliverables. Train and maintain machine learning models, utilizing deep learning frameworks and big data tools. Continuously improve solutions, evaluating their effectiveness and optimizing performance. Deploy data-driven solutions and effectively communicate results to stakeholders. Mandatory Qualifications: 5+ years of experience working with machine learning in a production environment. Bachelor's or Master's degree in a quantitative field (e.g., Computer Science, Operations Research, Statistics, Mathematics, Physics).
Strong knowledge of fundamental machine learning techniques, such as regression, classification, clustering, and model evaluation metrics. Proficiency in Python and familiarity with languages like C, C++, or Java. Experience with scripting languages like Perl and command-line Unix is a plus. Experience with deep learning frameworks (TensorFlow, Keras, PyTorch) and big data tools like Spark, and 2-3 years of experience building production-quality machine learning code on platforms like Databricks. Experience with AWS / GCP / Microsoft Azure for building production-quality ML models and systems. Ability to conduct end-to-end ML experimentation, including model experimentation, success reporting, A/B testing, and testing metrics. Excellent communication skills and the ability to keep stakeholders informed of progress and potential blockers.
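The A/B testing mentioned in this listing often comes down to comparing two conversion rates. A minimal two-proportion z-test sketch with invented numbers, assuming the usual pooled normal approximation (real experiments also need power analysis, sample-size planning, and multiple-testing care):

```python
import math

# Two-proportion z-test sketch for a simple A/B comparison
# (illustrative; conversion counts below are made up).

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """z-score for the difference in conversion rates of B vs A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converts 26% vs 20% for A, 1000 users each.
z = ab_z_score(200, 1000, 260, 1000)
# |z| > 1.96 means significant at the 5% level (two-sided).
```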

Posted 3 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Job Title: AI/ML Developer (Intern) Company: VASPP Technologies Pvt. Ltd. Location: Bengaluru, Karnataka, India Job Type: Full-Time Experience: Fresher (0–1 year) Department: Technology / Development About VASPP Technologies: VASPP Technologies Pvt. Ltd. is a fast-growing software company focused on delivering cutting-edge digital transformation solutions for global enterprises. Our innovative projects span AI/ML, data analytics, enterprise solutions, and cloud computing. We foster a collaborative and dynamic environment that encourages learning and growth. Job Summary: We are seeking a motivated and enthusiastic AI/ML Developer (Fresher) to join our growing technology team. The ideal candidate will have a foundational understanding of machine learning algorithms, data analysis, and model deployment. You will work closely with senior developers to contribute to real-world AI/ML projects and software applications. Responsibilities: Assist in the design, development, training, and deployment of AI and machine learning models. Collaborate with cross-functional teams including software engineers, data scientists, and product managers to build intelligent applications. Perform data collection, cleaning, transformation, and exploratory data analysis (EDA). Test various ML algorithms (e.g., classification, regression, clustering) and optimize them for performance. Implement model evaluation metrics and fine-tune hyperparameters. Contribute to integrating ML models into software applications using REST APIs or embedded services. Stay updated with the latest AI/ML frameworks, research papers, and industry trends. Document all work, including model development, experiments, and deployment steps, in a structured format. Required Skills: Proficiency in Python and libraries such as NumPy, Pandas, Scikit-learn, TensorFlow, or PyTorch. Solid understanding of machine learning principles: supervised/unsupervised learning, overfitting, cross-validation, etc.
Familiarity with data visualization tools: Matplotlib, Seaborn, Plotly. Basic knowledge of SQL and working with relational databases. Good understanding of software development basics, version control (Git), and collaborative tools. Strong problem-solving mindset, eagerness to learn, and ability to work in a team environment. Educational Qualification: Bachelor’s degree in Computer Science, Information Technology, Data Science, Artificial Intelligence, or related fields from a recognized institution. Preferred Qualifications (Optional): Internship or academic projects related to AI/ML. Participation in online competitions (e.g., Kaggle, DrivenData) or open-source contributions. Exposure to cloud platforms like AWS, Google Cloud (GCP), or Microsoft Azure. Familiarity with model deployment techniques using Flask/FastAPI, Docker, or Streamlit. Compensation: CTC/Stipend: ₹5,000 or ₹8,000 per month How to Apply: Send your updated resume and portfolio to: Email: piyush.vs@vaspp.com or aparna.bs@vaspp.com Job Type: Internship Contract length: 2 months Pay: ₹5,000.00 - ₹8,000.00 per month Benefits: Paid sick time Work from home Schedule: Monday to Friday Morning shift Application Question(s): This is a 2-month internship and the stipend will be based on performance and the interview process; is that okay for you? Education: Bachelor's (Preferred) Experience: AI: 1 year (Preferred) Language: English (Preferred) Location: Bangalore, Karnataka (Required) Work Location: In person Application Deadline: 14/06/2025
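The cross-validation named in this listing's required skills is just a disciplined way of splitting data into folds. A minimal pure-Python fold generator, as a sketch of the idea only (scikit-learn's KFold is what would actually be used):

```python
# Minimal k-fold index splitter (illustrative; use sklearn KFold in practice).

def kfold_indices(n, k):
    """Yield (train, test) index lists for k roughly equal folds."""
    # Distribute the remainder so fold sizes differ by at most one.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = [i for i in range(n) if i < start or i >= start + size]
        yield train, test
        start += size

folds = list(kfold_indices(10, 3))
# 3 folds; every index appears in exactly one test set.
```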

Posted 3 weeks ago

Apply

0.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Role: Senior Analyst - Data Engineering Experience: 3 to 6 years Location: Bengaluru, Karnataka, India (BLR) Job Description: We are seeking a highly experienced and skilled Senior Data Engineer to join our dynamic team. This role requires hands-on experience with databases such as Snowflake and Teradata, as well as advanced knowledge of various data science and AI techniques. The successful candidate will play a pivotal role in driving data-driven decision-making and innovation within our organization. Job Responsibilities: Design, develop, and implement advanced machine learning models to solve complex business problems. Apply AI techniques and generative AI models to enhance data analysis and predictive capabilities. Utilize Tableau and other visualization tools to create insightful and actionable dashboards for stakeholders. Manage and optimize large datasets using Snowflake and Teradata databases. Collaborate with cross-functional teams to understand business needs and translate them into analytical solutions. Stay updated with the latest advancements in data science, machine learning, and AI technologies. Mentor and guide junior data scientists, fostering a culture of continuous learning and development. Communicate complex analytical concepts and results to non-technical stakeholders effectively. Key Technologies & Skills: Machine Learning Models: Supervised learning, unsupervised learning, reinforcement learning, deep learning, neural networks, decision trees, random forests, support vector machines (SVM), clustering algorithms, etc. AI Techniques: Natural language processing (NLP), computer vision, generative adversarial networks (GANs), transfer learning, etc. Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn, Plotly, etc. Databases: Snowflake, Teradata, SQL, NoSQL databases. Programming Languages: Python (essential), R, SQL. Python Libraries: TensorFlow, PyTorch, scikit-learn, pandas, NumPy, Keras, SciPy, etc.
Data Processing: ETL processes, data warehousing, data lakes. Cloud Platforms: AWS, Azure, Google Cloud Platform. Big Data Technologies: Apache Spark, Hadoop. Job Snapshot Updated Date 11-06-2025 Job ID J_3679 Location Bengaluru, Karnataka, India Experience 3 - 6 Years Employee Type Permanent
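The ETL processes this listing mentions follow an extract, transform, load pattern. A toy pure-Python pass over invented rows, sketching the "transform" stage only (real pipelines here would run on Spark, Snowflake, or a warehouse loader):

```python
# Toy transform stage of an ETL pass (illustrative; rows are made up).

raw_rows = [
    {"id": "1", "amount": " 250.5 ", "region": "south"},
    {"id": "2", "amount": "99", "region": "NORTH"},
]

def transform(row):
    """Normalise types, whitespace, and casing before loading."""
    return {
        "id": int(row["id"]),
        "amount": float(row["amount"].strip()),
        "region": row["region"].strip().lower(),
    }

clean = [transform(r) for r in raw_rows]
# Strings become typed values with consistent casing.
```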

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Bhopal, Madhya Pradesh, India

On-site

Key Skills: PostgreSQL, PL/SQL Education Qualification: Any Graduate Minimum Years of Experience: 3+ Years Type of Employment: Permanent Job Description: Our dynamic and growing company is actively seeking an experienced PostgreSQL Database Developer to join our team. As a PostgreSQL Database Developer, you will play a crucial role in designing, implementing, and maintaining our database systems. The ideal candidate should have a strong background in database development, performance optimization, and data modeling. Job Responsibilities: Design, implement, and maintain database schemas in PostgreSQL and perform data modeling to ensure efficiency, reliability, and scalability. Optimize and tune SQL queries for improved performance, and identify and resolve performance bottlenecks in database systems. Manage data migration and integration processes between different systems and ensure data consistency and integrity during the migration process. Develop and maintain stored procedures, functions, and triggers to support application requirements, and implement business logic within the database layer. Implement and maintain database security policies and manage user roles, permissions, and access control within the database. Implement and oversee database backup and recovery processes, and ensure data availability and reliability. Collaborate with cross-functional teams, including application developers, system administrators, and business analysts, to understand database requirements. Create and maintain documentation related to database design, processes, and best practices. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Database Developer with a focus on PostgreSQL. In-depth knowledge of database design principles, normalization, and data modeling.
Strong proficiency in writing and optimizing SQL queries. Experience with performance tuning and query optimization techniques. Familiarity with database security best practices and access control. Hands-on experience with data migration, integration, and ETL processes. Proficiency in scripting languages (e.g., Python, Bash) for automation tasks. Knowledge of backup and recovery processes. Excellent communication and collaboration skills. Ability to work independently and as part of a team. Preferred Skills: Experience with PostgreSQL replication and clustering. Familiarity with NoSQL databases. Knowledge of cloud database solutions (e.g., AWS RDS, Azure Database for PostgreSQL). Understanding of DevOps practices and tools (ref:hirist.tech)

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies