8.0 - 12.0 years
0 Lacs
Tiruchirappalli, Tamil Nadu
On-site
Join us at Gnapi Technologies as a seasoned Full Stack Tech Lead / Architect and take on the exciting opportunity to lead the development of innovative multi-tenant SaaS products, including mobile apps and web portals. In this role, you will design a robust architecture that harmonizes functional and non-functional requirements, prioritizing user experience, compliance, security, and performance. Your expertise will guide our tech teams across front-end, back-end, DevOps, cloud, and security to create scalable solutions compliant with regulations such as HIPAA.

As an architectural leader, you will establish excellence in architecture, ensuring that B2C mobile application architectures make use of technologies like React Native, React.js, and Python for maintainability, reusability, and testability. You will implement rigorous security frameworks for healthcare applications in line with the OWASP Top 10 and architect Azure-native applications drawing on extensive knowledge of Azure services. Developing a robust data architecture that supports both transactional and analytical needs using SQL and NoSQL databases will be a key part of your role, along with a strong understanding of cloud services spanning networking, security, databases, and AI.

Your role will also involve performance optimization, applying advanced techniques throughout the application stack to enhance user experience under diverse network conditions. You will optimize applications for various devices and environments while providing technical guidance and mentorship to front-end developers, promoting high standards in code quality and architectural practice. Data integration will be another critical aspect of the role, ensuring seamless integration between front-end components and back-end services through RESTful APIs and GraphQL.
You will need proven experience as a full stack developer or application architect, working with technologies such as JavaScript (React.js, TypeScript), Python, and mobile app development for the Android and iOS platforms. Your extensive experience with Microsoft Azure, including Azure Active Directory and Azure DevOps, will be highly valuable in this role. Preferred qualifications include prior experience on Utility or Telecom IT projects and knowledge of additional cloud services such as AWS and Google Cloud for scalable hosting. A Bachelor's or Master's degree in Computer Science, Engineering, or a related field is required for this full-time position at Gnapi Technologies.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
If you are excited about shaping the future of technology and driving significant business impact in financial services, we are looking for people just like you. Join our team and help us develop game-changing, high-quality solutions.

As a Senior Lead Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you will be a key member of the Data Product Solutions Architecture Team. Your role involves designing, developing, and implementing analytical data solutions that align with the organization's strategic goals. You will leverage your expertise in data architecture, data modeling, data migrations, and data integration, collaborating with cross-functional teams to achieve target-state architecture goals.

Responsibilities:
- Represent the Data Product Solutions Architecture team in various forums, advising on data product solutions.
- Lead the design and maintenance of scalable data solutions, including data lakes and warehouses.
- Collaborate with cross-functional teams to ensure data product solutions support business needs and enable data-driven decision-making.
- Evaluate and select data technologies, driving the adoption of emerging technologies.
- Develop architectural models using ArchiMate, the C4 Model, and other artifacts to support data initiatives.
- Serve as a subject matter expert in specific areas.
- Contribute to the data engineering community and advocate for firm-wide data practices.
- Engage in hands-on coding and design to implement production solutions.
- Optimize system performance by resolving inefficiencies.
- Influence product design and technical operations.
- Develop multi-year roadmaps aligned with business and data technology strategies.
- Design reusable data frameworks using new technologies.

Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science or a related field with 10+ years of experience.
- 5+ years as a Data Product Solution Architect or in a similar role leading technologists to manage, anticipate, and solve complex technical items within your domain of expertise.
- Hands-on experience in system design, application development, and operational stability.
- Expertise in architecture disciplines and programming languages.
- Deep knowledge of data architecture, modeling, integration, cloud data services, data domain-driven design, best practices, and industry trends in data engineering.
- Practical experience with AWS, big data technologies, and data engineering disciplines.
- Advanced experience in one or more data engineering disciplines, e.g., streaming, ETL/ELT, event processing.
- Proficiency in SQL and data warehousing solutions using Teradata or similar cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
- Strong problem-solving, communication, and interpersonal skills.
- Ability to evaluate and recommend technologies for the future-state architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience, especially in card and banking.
- Experience with modern data processing technologies such as Kafka streaming, dbt, Spark, Python, Java, and Airflow, using data mesh and data lake patterns.
- Business architecture knowledge and experience with architecture assessment frameworks.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Technical Data Analyst, you will be responsible for analyzing data to meet business requirements. You will be involved in designing data platform solutions for the seamless flow of data from source to target, and your expertise in mapping source data elements to target tables will be crucial to the success of the projects.

Proficiency in SQL (Teradata, Oracle, DB2, MSSQL, DAS) is a must for this role. Experience with ETL/EDW/Informatica, data lakes/Azure, and data warehouse architecture will be highly beneficial, and your knowledge of data modeling and data architecture will play a vital role in shaping the data infrastructure. Prior experience in the banking domain will be considered a significant advantage for this position.

If you are passionate about data analysis, data platform solutions, and data architecture, and have a strong technical background in SQL and related technologies, we encourage you to apply for this role and be part of our dynamic team.
Posted 1 day ago
10.0 - 15.0 years
0 Lacs
Kolkata, West Bengal
On-site
Excited to work in the IT software product space and collaborate with a team on cutting-edge products at the intersection of GenAI and data transformation? Our client is looking for a Data Management Lead to join their R&D team in Kolkata. As the Data Management Lead, you will leverage your 10-15 years of experience in data management to spearhead the development and maintenance of data management modules.

Your key responsibilities will include driving the design, development, and deployment of data management and storage modules; overseeing data architecture and integration processes; enabling ETL and ELT processes; ensuring data quality and performance; and optimizing storage technologies. You will also ensure that the platform's data management and storage modules uphold data governance principles, data security, access controls, data masking, encryption processes, and a central technical and business metadata layer. Leading a team of developers, you will collaborate with product management to align with market and customer trends, and with QA to deliver industry-standard software quality.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science or a related field, with a preference for a BTech. You should also have proven experience in data management and team leadership, along with proficiency in big data technologies, SQL, data warehousing solutions, ETL and ELT tools, data modeling, data quality concepts, metadata management, data governance principles, and product lifecycle planning. Experience with large product or services companies is advantageous.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
Embark on a transformative journey as a Workday Data Analyst at Barclays. At Barclays, we are committed to pushing boundaries. This role involves enabling HR strategy by fostering and embedding a comprehensive understanding of how data can be utilized, bringing thought leadership on best practices and new capabilities to deliver business outcomes. You will lead data initiatives to ensure the effective and efficient use of data for informed decision-making, support regulatory and control requirements, improve operational efficiency, and generate business value.

Key skills required for this role include:
- Workday functional knowledge and hands-on experience.
- Proven ability to effectively mobilize, lead, and manage multiple priorities/projects, ensuring consistently high delivery.
- Ability to build trust through credible use of facts and data, and creatively resolve problems and issues at pace.
- Understanding of HR products and technology with a pragmatic and solution-oriented approach.
- Strong communication and collaboration skills.

Some other highly valued skills may include:
- Exposure to Workday Prism and data architecture.
- Experience with and understanding of ETL processes.
- Experience working with chatbots.
- Knowledge of Python or equivalent.
- Experience working with Qlik Sense or equivalent.

You may be assessed on critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. This role is based in the Pune office.

**Purpose of the role:** To implement data quality processes and procedures, ensuring that data is reliable and trustworthy, and extracting actionable insights to help the organization improve its operations and optimize resources.

**Accountabilities:**
- Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification.
- Execution of data cleansing and transformation tasks to prepare data for analysis.
- Designing and building data pipelines to automate data movement and processing.
- Development and application of advanced analytical techniques, including machine learning and AI, to solve complex business problems.
- Documentation of data quality findings and recommendations for improvement.

**Vice President Expectations:**
- Contribute to or set strategy, drive requirements, and make recommendations for change.
- Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures.
- For an individual contributor, be a subject matter expert within own discipline and guide technical direction.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Demonstrate leadership and accountability for managing risk and strengthening controls.
- Collaborate with other areas of work for business-aligned support areas.
- Create solutions based on sophisticated analytical thought and in-depth analysis.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As Senior Manager, AI & Data Systems Architecture at AGCO, you will play a crucial role in leading the design, creation, and evolution of system architectures for AI, analytics, and data systems within the organization. Your primary responsibility will be to ensure the development of scalable, high-performance data and AI architectures across cloud platforms such as AWS, Google Cloud Platform, and Databricks, aligning technological solutions with business objectives. Collaboration with cross-functional teams, including data engineers, data scientists, and other IT professionals, will be essential in delivering cutting-edge AI and data initiatives. Your leadership will drive efficiency, scalability, and innovation in the architecture of AI and data systems.

Your impact will include owning the end-to-end architecture for AI and data systems, ensuring cost-effective scalability, performance, and security across cloud and on-premises environments. You will design, implement, and manage data infrastructure and AI platforms, champion cloud adoption strategies, and drive the continuous improvement and evolution of data and AI architectures to meet emerging business needs and industry trends. Partnering with business and technology stakeholders, you will translate long-term goals into architectural frameworks and roadmaps that drive business value. Additionally, you will oversee the implementation of best practices in data governance, security, and compliance across AI and data systems.

To excel in this role, you should have at least 10 years of experience in data architecture, AI systems, or cloud infrastructure, with a minimum of 3-5 years in a leadership position. Deep hands-on experience with cloud platforms like AWS, Google Cloud Platform (GCP), and Databricks is essential, along with familiarity with CRM systems such as Salesforce and the AI capabilities within those solutions.

Your technical expertise should include designing architectures for AI, machine learning, analytics, and large-scale data processing systems. Proficiency in data architecture, containerization, infrastructure as code, and big data ecosystems is also required. Strong leadership and communication skills, along with the ability to work in a collaborative and fast-paced environment, are crucial for success in this role. A bachelor's degree in Computer Science, Data Science, or a related field is necessary, with a master's degree or relevant certifications such as AWS Certified Solutions Architect preferred. Experience in industries such as manufacturing, agriculture, or supply chain, and familiarity with regulatory requirements related to data governance and security, will be advantageous.

At AGCO, we value diversity, inclusion, and innovation, and we are committed to providing a positive workplace culture where every individual can contribute to feeding the world. Join us in bringing agriculture into the future by applying for the Senior Manager, AI & Data Systems Architecture position today! AGCO is an Equal Opportunity Employer and offers benefits such as health care and wellness plans, as well as flexible and virtual work options to support your professional growth and development.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Database Administrator specializing in PostgreSQL, you will be responsible for a broad range of database management tasks. Your role will involve designing data architecture, monitoring database performance, and enforcing database standards to ensure optimal functionality.

Your key responsibilities will include managing and maintaining PostgreSQL databases; ensuring database performance, integrity, and security; and designing and implementing database systems. You will also monitor and optimize database performance, develop backup and recovery processes, plan and execute disaster recovery strategies, collaborate with development and operations teams, automate database tasks and processes, and perform database tuning and optimization. Troubleshooting database issues, providing solutions, and maintaining database documentation and standards will also be part of your daily tasks.

Moreover, you will be responsible for implementing database upgrades, patches, and migrations to the latest versions, monitoring database health and performance metrics, and ensuring compliance with data protection regulations. You will also provide support for database-related issues, develop and maintain database scripts and tools, conduct database capacity planning, and stay updated on the latest database technologies and trends.

In summary, as a Database Administrator focusing on PostgreSQL, your role will be instrumental in ensuring the efficient and secure management of databases while staying informed about industry best practices and trends.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
Vadodara, Gujarat
On-site
As the Head of Business Intelligence & AI at our organization, your primary responsibility will be to lead our data strategy, design scalable data models, and drive analytical and AI innovation. You will play a crucial role in aligning data initiatives with business objectives and improving operational efficiency. The ideal candidate combines strategic thinking, technical expertise, and effective communication skills.

Your role will involve leveraging AI and analytics across functions such as R&D, production, supply chain, sales, marketing, finance, HR, and compliance. You will be responsible for incorporating dashboarding, ETL processes, and a data lake to enable data-driven decision-making.

Key Responsibilities:
- Define and drive the enterprise-wide business intelligence and analytics strategy
- Align BI initiatives with overall business goals and digital transformation priorities
- Formulate a comprehensive AI and analytics roadmap aligned with the organization's goals
- Oversee the design and maintenance of a centralized data lake
- Identify cross-functional use cases for AI and analytics
- Collaborate with executive leadership and functional heads to identify analytics needs
- Lead the Analytics and AI team and provide strategic insights
- Develop and maintain interactive dashboards for all functions
- Deliver KPIs, scorecards, and predictive models to enable strategic decision-making
- Spearhead AI and Generative AI initiatives
- Ensure best practices in data governance, security, and quality management

Education Qualification:
- Bachelor's or Master's in Computer Science, Data Science, Statistics, or a related field. A PhD is a plus.

Experience:
- 10+ years of experience in analytics, data architecture, or related roles
- Strong knowledge of data modeling techniques
- Understanding of data science (SQL, Python, R, and at least one cloud platform)
- Experience with modern data warehousing tools and orchestration
- Familiarity with analytics tools and their integration with other systems

Technical Competencies/Skills:
- Deep understanding of manufacturing processes and best practices
- Proven track record of implementing enterprise analytics solutions and predictive modeling at scale
- Strong hands-on experience with tools like Power BI, Tableau, Python/R, SQL, and cloud platforms
- Experience setting up and managing data lakes and developing end-to-end data pipelines
- Sound understanding of AI/ML techniques and emerging technologies in data science

Behavioural Competencies:
- Strong leadership and team management skills
- Excellent communication and interpersonal skills
- High level of initiative and a proactive approach to problem-solving
- Ability to work under pressure and manage multiple priorities
- Excellent verbal and written communication skills
- Strong analytical and problem-solving skills
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Job Description: Avinka Solutions Pvt. Ltd. is seeking a Senior MongoDB Developer/Data Modeler for a full-time on-site role based in Indore. In this role, you will be responsible for designing and implementing MongoDB databases, developing data models, managing data governance and data quality initiatives, and creating data architecture strategies. You will also be involved in Extract, Transform, Load (ETL) operations and collaborate with cross-functional teams to ensure effective data management.

The ideal candidate will have experience in data governance and data quality management, proficiency in data modeling and data architecture, hands-on experience with ETL processes, a strong understanding of MongoDB and related database technologies, excellent problem-solving and analytical skills, and the ability to work collaboratively in a team environment. A Bachelor's degree in Computer Science, Information Technology, or a related field is required, and prior experience in a similar role is an advantage.

Join our team at Avinka Solutions Pvt. Ltd. and be part of a dynamic environment where you can contribute to innovative IT solutions that drive business success. Empower businesses to reach new heights through streamlined project outcomes, enhanced talent acquisition, and optimized costs. Experience the excitement of simplifying manpower management in today's dynamic market and help businesses thrive.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Intermediate Programmer Analyst position is an intermediate-level role that involves participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

You will be responsible for utilizing your knowledge of applications development procedures and concepts, along with basic knowledge of other technical areas, to identify and define necessary system enhancements. This includes using scripting tools, analyzing and interpreting code, consulting with users, clients, and other technology groups on issues, recommending programming solutions, and installing and supporting customer exposure systems. Additionally, you will apply fundamental knowledge of programming languages to design specifications, analyze applications to identify vulnerabilities and security issues, conduct testing and debugging, and serve as an advisor or coach to new or lower-level analysts.

Your role will also involve identifying problems, analyzing information, and making evaluative judgments to recommend and implement solutions. You will resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. You should be able to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and/or other team members.

Moreover, you will need to appropriately assess risk when making business decisions, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets. This includes driving compliance with applicable laws, rules, and regulations, adhering to policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-6 years of proven experience in developing and managing big data solutions; Apache Spark and Scala are a must
- Strong programming skills in Scala, Java, or Python
- Hands-on experience with technologies like Apache Hive, Apache Kafka, HBase, Couchbase, Sqoop, and Flume
- Proficiency in SQL and experience with relational databases (Oracle/PL/SQL)
- Experience working on Kafka and JMS/MQ applications across multiple operating systems (Unix, Linux, Windows)
- Familiarity with data warehousing concepts, ETL processes, data modeling, data architecture, and data integration techniques
- Knowledge of best practices for data security, privacy, and compliance
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Experience developing frameworks and utility services, including logging/monitoring
- Experience delivering high-quality software following continuous delivery and using code quality tools (JIRA, GitHub, Jenkins, Sonar, etc.)
- Experience creating large-scale, multi-tiered, distributed applications with Hadoop and Spark
- Profound knowledge of implementing different data storage solutions such as RDBMS (Oracle), Hive, HBase, Impala, and NoSQL databases

Education:
- Bachelor's degree/University degree or equivalent experience

Please note that this job description provides a high-level review of the types of work performed, and other job-related duties may be assigned as required.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
You will be responsible for leading the delivery of complex solutions, coding larger features from start to finish, actively participating in planning, and performing code and architecture reviews of your team's product. You will help ensure the quality and integrity of the Software Development Life Cycle (SDLC) for your team and identify opportunities to improve how the team works through the use of recommended tools and practices. Leading the triage of complex production issues across systems and demonstrating creativity and initiative in solving complex problems will be a crucial part of your role.

As a high performer, you will consistently deliver a high volume of story points relative to your team and stay aware of the technology landscape to plan the delivery of coarse-grained business needs spanning multiple applications. Influencing technical peers outside your team, setting a consistent example of agile development practices, and coaching other engineers to work as a team with Product and UX will be essential. Additionally, creating and improving internal libraries and tools, providing technical leadership on the product, and determining the technical approach will be part of your responsibilities. You will proactively communicate status and issues to your manager, collaborate with other teams to find creative solutions to customer issues, and show a commitment to delivery deadlines, especially seasonal and vendor partner deadlines critical to Best Buy's continued success.

Requirements:
- 7+ years of relevant technical professional experience with a bachelor's degree, OR equivalent professional experience
- 2+ years of experience with Google Cloud services, including Dataflow, BigQuery, and Looker
- 1+ years of experience with Adobe Analytics, Contentsquare, or similar technologies
- Hands-on experience with data engineering and visualization, including SQL, Airflow, dbt, Power BI, Tableau, and Looker
- Strong understanding of real-time data processing and issue detection
- Expertise in data architecture, database design, data quality standards/implementation, and data modeling

Preferred qualifications include experience working in an omni-channel retail environment, experience connecting technical issues with business performance metrics, experience with Forsta or similar customer feedback systems, certification in Google Cloud Platform services, and a good understanding of data governance, data privacy laws and regulations, and best practices.

Best Buy India is a vibrant hub of tech talent, driving innovation and accelerated business outcomes for Best Buy, customers, employees, and partners every day. The inclusive culture empowers you to learn, grow, collaborate, and make a real impact. Best Buy is North America's No. 1 tech specialty retailer, enriching lives through technology by helping personalize and humanize technology for millions of customers.

This position is based in Bengaluru. Best Buy India operates under a hybrid work model, with an expectation for employees to be in the office three days a week. As a global organization, maintaining collaboration across the globe is a key proposition, and employees may be required to work extended hours during critical periods.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should be an expert with 3 to 6 years of experience in advanced topics related to modern data warehousing, databases, master data, and data visualization. Your responsibilities will include defining, designing, and building MDM data models and data transformation rules to align with business requirements. You must possess strong hands-on experience with Master Data Management (MDM) processes and solid foundational and advanced skills in SQL.

Additionally, you will apply and implement data standards and guidelines regarding data ownership, coding structures, and data replication to ensure data access and integrity. Data cleaning to eliminate old, unused, or duplicate data for efficient management and quick access will be part of your tasks. You will also develop and execute strategies to translate business requirements and models into practical MDM designs.

Your role will require a strong understanding of data architecture and solid database skills, covering data consumption through rendering, for effective data implementation. Proficiency in Agile processes and principles, including the SDLC and CI/CD, is essential. You will provide data consulting to support business and IT initiatives and improve client database systems. Effective people management and stakeholder engagement skills will be crucial for success, and diverse experience with MDM application tools, languages, and frameworks is expected.

Mandatory technical skills include expertise in PostgreSQL, SQL scripting, databases, data warehousing, and master data. Familiarity with Oracle, Python scripting, ETL, and XMLs and XSDs is considered a secondary skill set.

This is a full-time position with a work schedule from Monday to Friday. The work location is on-site, requiring your presence in person.
Posted 2 days ago
12.0 - 16.0 years
0 Lacs
Karnataka
On-site
As a Technical Product Expert specializing in Dynamics 365 for Finance and Supply at Aptean in Bangalore, you will be a key player in the transformation of global businesses through bespoke ERP solutions. With a focus on providing targeted solutions that leverage cutting-edge technology, you will play a crucial role in driving greater results for our diverse client base.

Your role will require a deep understanding of ERP platforms, with a significant emphasis on D365 F&O (or AX 2012) and expertise in finance, supply chain, and warehousing processes. Drawing on your 12-15 years of hands-on experience, including at least 5 years in architectural or lead design roles, you will be responsible for architecting and delivering highly available, high-volume D365 solutions within multi-entity, multi-legal environments.

Collaboration will be key in this role, as you partner with product owners, enterprise architects, and engineering leads to ensure the delivery of robust, scalable, and maintainable solutions. Your responsibilities will include defining and owning enterprise-grade architecture, leading technical strategy aligned with product growth, and mentoring engineers and tech leads to enhance team capability.

To excel in this role, you must possess a strong command of X++, LCS, data architecture, SQL optimization, and enterprise integration patterns. Your ability to lead code reviews, enforce coding standards, and evaluate new tools and best practices will be essential in driving solution quality, user experience, and regulatory compliance.

Aptean is committed to fostering a company culture that values diversity, equity, and inclusion. By embracing our unique differences and harnessing individual strengths, we aim to deliver innovative solutions that maximize success for our customers, employees, and company as a whole. If you are ready to join a dynamic team and contribute to our shared success, we invite you to explore opportunities at Aptean today.
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
guwahati, assam
On-site
As a PHP Developer with 2+ years of experience, you will be responsible for managing back-end services and facilitating the exchange of data between the server and users. Your primary skills and responsibilities will include:
- Designing and architecting highly scalable and usable web applications using PHP frameworks such as CodeIgniter; knowledge of Laravel is an added advantage.
- Developing high-end web applications using PHP frameworks, MySQL, AJAX, JavaScript, CSS, HTML, XHTML, and XML.
- Working with REST APIs for mobile applications, with familiarity with JavaScript libraries such as jQuery and Google Web Toolkit (GWT).
- Expertise in database queries, schemas, stored procedures, cursors, views, triggers, and best practices.
- A strong understanding of Object-Oriented Programming (OOP), PHP, and MVC frameworks.
- Experience in data architecture, including schema design, data constraints, integrity, stored procedures, and query optimization.
- Integrating and consuming both public and private APIs.
- Testing and debugging, along with knowledge of UI design.
- A high level of responsibility, with ownership at the module level.
If you possess the aforementioned skills and are looking to contribute to the development of innovative web applications, we encourage you to apply for this PHP Developer position.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As the Enterprise Data Architect, you play a critical role in our organization by designing, managing, and implementing data frameworks that support our strategic goals. Your responsibility includes crafting an architecture that ensures data accuracy, availability, and security across all data-related initiatives. You will collaborate with various stakeholders, including IT teams, data analysts, and business leaders, to align data solutions with business requirements and identify opportunities for data integration and optimization. Your deep technical knowledge paired with a strategic mindset will enable you to address complex data challenges and foster a culture of data-driven decision-making. You will be a key influencer in adopting best practices in data management and governance, driving efficiency and innovation while ensuring compliance with regulatory requirements. Your ability to analyze and interpret vast amounts of data is essential for our organization's growth and sustainability, making a significant impact on our overall performance. 
Your key responsibilities will include:
- Designing and implementing enterprise data strategy and architecture
- Developing and maintaining data models that align with business needs
- Ensuring data integrity, quality, and consistency across various platforms
- Collaborating with cross-functional teams to define data requirements
- Leading the development of data governance policies and standards
- Implementing best practices in data architecture and management
- Facilitating data migration projects and system integrations
- Evaluating and selecting appropriate data technologies and tools
- Monitoring and assessing the effectiveness of data solutions
- Documenting architecture designs, processes, and standards
- Providing technical guidance and support to data teams
- Staying current with industry trends in data technology and management
- Conducting training sessions on data best practices for staff
- Advising on data security measures and compliance requirements
- Engaging with stakeholders to communicate the architecture vision and progress
To be successful in this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, and at least 5 years of experience in data architecture or related roles. Requirements include: strong knowledge of database technologies such as SQL and NoSQL; experience with cloud platforms (e.g., AWS, Azure, Google Cloud); familiarity with ETL tools and big data technologies (e.g., Hadoop, Spark); certification in data management or architecture (e.g., CDMP, CDBL); proven experience with data governance and compliance frameworks; strong analytical, problem-solving, and communication skills; experience with data visualization tools (e.g., Tableau, Power BI); the ability to manage multiple projects and prioritize tasks effectively; familiarity with Agile project management methodologies; an understanding of data security practices and data privacy laws; experience collaborating with various teams and stakeholders; and the ability to translate business requirements into technical specifications. Proficiency in programming languages such as Python or Java is a plus.

Key skills: programming languages, database management, communication, Power BI, Google Cloud, Java, Agile methodologies, Azure, problem-solving, cloud platforms, NoSQL, ETL tools, data governance, data visualization tools, analytical skills, data architecture, Tableau, project management, enterprise data, database technologies, SQL, Spark, AWS, Hadoop, data modeling, and big data technologies.
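The ETL familiarity this role calls for can be illustrated with a minimal extract-transform-load sketch. It uses only Python's standard library with an in-memory SQLite database, and the table and column names are invented for the example rather than taken from any specific tool in the posting:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (an inline string standing in for a real source file)
raw = "region,revenue\nnorth,100\nsouth,250\nnorth,50\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast revenue to int and normalize the region code
records = [(r["region"].upper(), int(r["revenue"])) for r in rows]

# Load: insert into a warehouse-style table, then aggregate
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
totals = dict(conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region"))
# totals == {'NORTH': 150, 'SOUTH': 250}
```

Production ETL tools such as Hadoop or Spark distribute these same three stages across a cluster, but the extract/transform/load shape of the pipeline is unchanged.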
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
At Norconsulting, we are currently seeking a Data Architect to join our team in Chennai for a freelance opportunity with a major banking organization. This is a long-term assignment based in Chennai, India, offering a daily rate of 150 USD (approximately 3300 USD monthly) for full-time work (8 hours/day, Monday to Friday). The ideal candidate for this role should possess a strong understanding of Data Modeling, Data Architecture, and technology systems. Key responsibilities include designing, developing, migrating, integrating, and configuring a robust Data Warehouse solution. This involves analyzing client operations, applications, and programming to determine database structural requirements, in addition to reviewing objectives with clients and evaluating current systems. The Data Architect will define the database's physical structure and functional capabilities to meet data integration requirements, security protocols, backup procedures, storage needs, and recovery specifications. Collaboration with business clients to comprehend their needs and requirements, as well as maintaining client relationships as necessary, is essential. Furthermore, the candidate will engage with various IT teams within the organization to ensure that the application fulfills all bank and stakeholder requirements. This involves considerations such as security, redundancy, storage, performance, mobile readiness, and reporting capabilities. Building stakeholder consensus, developing business cases, enterprise architecture blueprints, and detailed plans are also part of the responsibilities. The Data Architect should possess the ability to identify multiple solutions for a problem and recommend the most suitable option based on measurable factors.
Managing and documenting issues and actions, providing overall support for the Analysis, Design, Development, and Deployment of the Data Warehouse solution, and assisting in training stakeholders on Data Architecture and modeling are crucial aspects of the role. Other responsibilities include preparing and delivering presentations to project stakeholders and management using tools like MS PowerPoint and Visio, evaluating new products or initiatives for required technology support, assessing gaps between current and desired IT environments, proposing recommendations based on industry best practices, and estimating work effort and completion timelines. This is a challenging yet rewarding opportunity for a Data Architect to contribute significantly to the success of a major banking organization.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere - on premises, or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications. MongoDB Technical Services Engineers use their exceptional problem-solving and customer service skills, along with their deep technical experience, to advise customers and to solve their complex MongoDB problems. Technical Service Engineers are experts in the entire MongoDB ecosystem - database server, drivers, cloud, and infrastructure. This also includes services such as Atlas (database as a service), or Cloud Manager (which helps customers with automation, backup, and monitoring of their MongoDB systems). Our engineers combine their MongoDB expertise with passion, initiative, teamwork, and a great sense of humor to help our customers to be successful with MongoDB. We are looking to speak to candidates who are based in Bangalore for our hybrid working model. Cool things you'll do: You'll be working alongside our largest customers, solving their complex challenges - resolving questions on architecture, performance, recovery, security, and everything in between. You'll be an expert resource on best practices in running MongoDB at scale, whatever that scale may be. 
You'll be an advocate for customers' needs - interfacing with our product management and development teams on their behalf. And you'll contribute to internal projects, including software development of support tools for performance, benchmarking, and diagnostics. This role specifically follows a weekend support model (Sunday to Thursday, with Friday and Saturday as the week-off) and requires adherence to EMEA Hours (2pm to 10pm IST). If you're passionate about being a Technical Services Engineer - Core and are open to flexible, weekend-oriented scheduling, we encourage you to apply! As an ideal candidate, you will have: We consider all candidates with an eye for those who are self-taught, curious, and multi-faceted. Our ideal TSE candidate should also have: - 5+ years of relevant experience - Strong technical experience in one (or more) of the following areas: Systems administration, Scalable and Highly available distributed systems, Network Administration, Database architecture and administration, Application Architecture, Data architecture and design, Performance tuning and benchmarking - A B.Tech / B.S.
or equivalent work experience. Nice to have: - Basic understanding of AI, including ML, LLMs, and RAG principles - Experience in one or more of: Java, Python, Ruby, C, C++, C#, JavaScript, Node.js, Go, PHP, or Perl. It's crucial for every candidate that they can check off all of these boxes: - Excellent communication skills, both written and verbal - Genuine desire to help people - Uncontrollable urge to investigate and solve problems, with advanced diagnostic and troubleshooting skills - Ability to think on your feet, remain calm under pressure, and solve problems in real-time - Desire and ability to rapidly learn a wide variety of new technical skills - Strong teamwork: willingness and ability to get help from team members when required, and the good judgment to know when to seek help Success Measures: - In 3 months, you'll have gained a deep understanding of MongoDB and its ecosystem. You will complete New Hire Training. - In 6 months, you will be comfortable working frontline with our customers. You will also complete the MongoDB Certified DBA Associate exam. - In 12 months, you will work on gaining expertise to be a part of a technical experts group within the MongoDB ecosystem and will be helping your peer engineers with advanced diagnostics. You will also be encouraged to handle technical escalations independently. To drive the personal growth and business impact of our employees, we're committed to developing a supportive and enriching culture for everyone. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it's like to work at MongoDB, and help us make an impact on the world! MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process.
To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
This job is based in Hyderabad, Bangalore, or Chennai. At Gramener, you will find a welcoming work environment with diverse colleagues, a clear career path, and ample opportunities for growth and innovation. The company aims to develop a range of easily configurable data applications focused on storytelling for both public and private use. As part of your role, you will be involved in various impactful customer technical data platform projects, taking the lead on strategic initiatives that cover the design, development, and deployment of cutting-edge solutions. Collaboration with platform engineering teams will be key in implementing Databricks services effectively within the company's infrastructure. Your expertise with Databricks Unity Catalog will be crucial in establishing strong data governance and lineage capabilities. Implementing CI/CD practices to streamline the deployment of Databricks solutions will be part of your responsibilities. Additionally, you will contribute to the development of a data mesh architecture that promotes decentralized data ownership and accessibility across the organization, enabling teams to derive actionable insights from their data. In terms of skills and qualifications, you should have expertise in Azure architecture and platform services, including Azure Data Lake, AI/ML model hosting, Key Vault, Event Hub, Logic Apps, and other Azure cloud services. Strong integration with Azure, workflow orchestration, and governance in Databricks development is required. Hands-on experience with scalable ETL/ELT pipelines, Delta Lake, and enterprise data management is essential in data engineering and architecture. Your coding and implementation skills should encompass modular design, CI/CD, version control, and best coding and design practices in software engineering. Proficiency in Python and PySpark is necessary for building reusable packages and components catering to both technical and business users.
Experience with enterprise processes, structured environments, and compliance frameworks is valuable, along with knowledge of data lineage, GxP, HIPAA, and GDP compliance and regulatory requirements in Governance and Security. Familiarity with the Pharma, MedTech, and Life Sciences domains is a plus, including an understanding of industry-specific data, regulatory constraints, and security considerations. Gramener specializes in providing data-driven decision-making solutions to organizations, helping them leverage data as a strategic asset. The company offers strategic data consulting services to guide organizations in making data-driven decisions and transforming data into a competitive advantage. Through a range of products, solutions, and services, Gramener analyzes and visualizes large volumes of data to drive insights and decision-making processes. To learn more about Gramener, visit the company's website and blog.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
This role, based in Balewadi, Pune, requires over 8 years of experience. You possess a strong understanding of data architecture and have led data-driven projects. Your expertise includes knowledge of data modelling paradigms like Kimball, Inmon, Data Marts, Data Vault, Medallion, etc. Experience with cloud-based data strategies, particularly AWS, is preferred. Designing data pipelines for ETL with expert knowledge of ingestion, transformation, and data quality is a must, along with hands-on experience in SQL. An in-depth understanding of PostgreSQL development, query optimization, and index design is a key requirement. Proficiency in PL/pgSQL for complex warehouse workflows is necessary. You should be able to write intermediate to complex SQL, use advanced SQL concepts like RANK and DENSE_RANK, and apply advanced statistical concepts through SQL. Working experience with PostgreSQL extensions like PostGIS is desired. Expertise in writing ETL pipelines combining Python and SQL is required, as well as an understanding of data manipulation libraries in Python like Pandas, Polars, and DuckDB. Experience in designing data visualizations with tools such as Tableau and Power BI is desirable. Your responsibilities include participating in designing and developing features in the existing data warehouse, providing leadership in establishing connections between the engineering, product, and analytics/data science teams, designing, implementing, and updating existing and new batch ETL pipelines, defining and implementing data architecture, and working with data orchestration tools like Apache Airflow, Dagster, and Prefect. Collaboration with engineers and data analysts to build reliable datasets that can be trusted and used by the company is essential. You should be comfortable in a fast-paced start-up environment, passionate about your job, and enjoy a dynamic international working environment.
Background or experience in the telecom industry is a plus, though not mandatory. You should have a penchant for automating tasks and enjoy monitoring processes.
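The advanced SQL concepts this posting names (RANK, DENSE_RANK) can be sketched in a few lines. The example below uses Python's built-in sqlite3 module rather than PostgreSQL itself, since SQLite (3.25+) supports the same window-function syntax; the scores table is invented purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, points INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)",
                 [("a", 90), ("b", 90), ("c", 75)])

# RANK() skips positions after a tie; DENSE_RANK() does not
result = sorted(conn.execute("""
    SELECT name,
           RANK()       OVER (ORDER BY points DESC) AS rnk,
           DENSE_RANK() OVER (ORDER BY points DESC) AS dense_rnk
    FROM scores
""").fetchall())
# result == [('a', 1, 1), ('b', 1, 1), ('c', 3, 2)]
```

The same query runs unchanged in PostgreSQL, where a PARTITION BY clause can be added to the OVER() window to rank within groups.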
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
As an Assistant Vice President, Data Engineering Expert at Analytics & Information Management (AIM) in Gurugram, you will play a crucial role in leading the Data/Information Management Team. Your responsibilities will include driving the development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of the COO (Chief Operating Office). You will be expected to build and manage high-performing teams, deliver impactful insights, and foster a data-driven culture within the organization. In this role, you will be responsible for supporting Business Execution, Legal Data & Reporting activities for the Chief Operating Office by implementing data engineering solutions to manage banking operations. This will involve establishing monitoring routines, scorecards, and escalation workflows, as well as overseeing Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques. Additionally, you will be required to enable proactive issue detection, implement a governance framework, and interface between business and technology partners for digitizing data collection. You will also need to communicate findings and recommendations to senior management, stay updated with the latest trends in analytics, ensure compliance with data governance policies, and set up a governance operating framework to enable operationalization of data domains. To excel in this role, you should have at least 8 years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, and SQL. Strong understanding of Data Transformation, Data Strategy, Data Architecture, Data Tracing & Lineage, and Database Management & Optimization will be essential. Additionally, experience in AI solutions, banking operations, and regulatory requirements related to data privacy and security will be beneficial. 
A Bachelor's/University degree in STEM is required for this position, with a Master's degree being preferred. Your ability to work as a senior member in a team of data engineering professionals and effectively manage end-to-end conceptualization & implementation of data strategies will be critical for success in this role. If you are excited about the opportunity to lead a dynamic Data/Information Management Team and drive impactful insights through data analytics solutions, we encourage you to apply for this position and be a part of our talented team at AIM, Gurugram.
Posted 1 week ago
20.0 - 24.0 years
0 Lacs
hyderabad, telangana
On-site
A senior operations leader is required to oversee holistic IT Service Management, Service Operations, Operational Quality Management, continual service improvement, and operational governance across a function. You will be responsible for developing and implementing a comprehensive enterprise data strategy, guiding the business data strategy, establishing long-term roadmaps, policies, procedures, and standards for data management. You will ensure data quality, privacy, and security, align data initiatives with business objectives, and work with businesses to implement data quality initiatives for trusted decision-making. Additionally, you will drive data architecture practices such as cataloguing, glossary, and lineage for traceability and transparency of data. In collaboration with businesses, Enterprise Data Owners (EDOs), IT teams, and strategic partners, you will transform the vision, build and execute a roadmap for enterprise data management. You will drive executive data governance, cross-domain data governance committees, stakeholder engagement, and collaboration to ensure transparency and progress of data management across the enterprise. As a critical role in ensuring regulatory compliance and risk management, you will establish data governance frameworks, implement controls for data protection, and monitor data usage to mitigate risks. Moreover, you will drive innovation by leveraging data and emerging technologies to create new business models, enhance products or services, and improve customer engagement. Promoting a data-driven culture within the organization, you will democratize data access, promote data literacy, empower teams to make informed decisions based on data insights, and leverage data assets for revenue generation and cost optimization. Additionally, you will ensure technology strategy alignment with Enterprise Data Management (EDM), drive next-generation capabilities in data management, and build an inventory of data assets. 
The ideal candidate should have over 20 years of overall experience, with at least 15 years in leading Service Delivery teams, including 5 years in Pharma/Healthcare. Proficiency in ITIL-based IT Service Management, extensive experience in managing ITIL life-cycle processes, and expertise in conducting IT audits are essential. Strong leadership in IT shared services, managing budgets, controlling costs, and managing risks in a dynamic IT environment are required. Novartis is committed to creating an inclusive work environment and diverse teams. Joining Novartis means being part of a community that strives to improve and extend people's lives through innovative science and collaborative efforts. If you are passionate about making a difference and want to be part of a mission-driven organization, consider joining the Novartis team.
Posted 1 week ago
5.0 - 15.0 years
0 Lacs
karnataka
On-site
As the Head of Delivery Management in our organization, you will play a crucial role in leading our delivery operations with a focus on Data Engineering and Data Analytics. Your primary responsibility will be to oversee the end-to-end execution of projects related to data pipelines, analytics platforms, and data-driven solutions. Your expertise in managing projects, optimizing delivery processes, and fostering continuous improvement will be essential in working collaboratively with cross-functional teams comprising data scientists, analysts, and engineers. Your key responsibilities will include leading and overseeing delivery teams, developing strategies for data-centric project delivery, ensuring successful delivery of data solutions, monitoring delivery performance, and collaborating with teams to address challenges in data architecture, integration, and scalability. You will be required to drive continuous improvement in processes, methodologies, and tools tailored to data projects, maintain strong client and stakeholder relationships, and ensure adherence to best practices in data security, privacy, and compliance. Effective resource management, fostering a culture of innovation, collaboration, and accountability within the delivery team will also be important aspects of your role. To be successful in this position, you should have a minimum of 15 years of experience in delivery management, with at least 5 years specifically in Data Engineering or Data Analytics domains. Your proven track record in delivering large-scale data projects involving ETL processes, cloud platforms, or data warehouses, along with a strong understanding of data architecture, big data technologies, and analytics frameworks will be highly valuable. 
Exceptional leadership and team management skills, excellent project management abilities with exposure to agile methodologies, and familiarity with tools like Tableau, Power BI, Snowflake, Hadoop, or similar platforms are essential requirements. Moreover, your strong analytical and problem-solving skills, experience with financial planning and resource management in data projects, deep understanding of industry trends in data and analytics, and proven ability to drive stakeholder alignment and ensure delivery excellence will set you up for success in this role. If you are passionate about leading teams and delivering excellence in data-driven initiatives, we welcome you to bring your expertise to our team and contribute to our mission of driving innovation and success in the data engineering and analytics space.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As an Applications Support Engineer at Nordex, you will play a crucial role in providing technical support for web applications used within the engineering department. Your responsibilities will include troubleshooting and resolving issues related to web technologies such as HTML, CSS, and JavaScript. Ensuring seamless access to historical data stored in databases will also be a key part of your role. You will be responsible for managing and maintaining the datalake implemented with Databricks, monitoring data ingestion processes, and ensuring data integrity. Addressing any issues related to data storage and retrieval will be essential to maintain the efficiency of the system. Monitoring and analyzing data from sensors on wind turbines, interpreting statistical measures, and identifying anomalies will be part of your tasks related to alarm and sensor data handling. Additionally, providing support for systems generating alarms and notifications will be crucial for maintaining the reliability of the systems. Your role will also involve overseeing the daily execution of ETL pipelines, identifying and resolving data gaps, and ensuring smooth data flow from various sources to the datalake. Configuring and managing download agents that connect to wind turbines, troubleshooting connectivity and data transfer issues, and centralizing data from multiple devices for analysis and reporting will also be part of your responsibilities. Effective communication of technical issues and solutions to non-technical users, documenting support processes, troubleshooting steps, and resolutions, as well as preparing regular reports on system performance and incidents will be essential for ensuring effective collaboration and continuous improvement within the team. To be successful in this role, you should have experience in application support, knowledge of web technologies such as Angular, HTML, CSS, JavaScript, databases, and accessing historical data. 
Familiarity with Databricks, data lake management, and ETL pipeline management and monitoring, as well as experience handling download agents and knowledge of the wind energy environment, wind turbine components, and the IIoT (Industrial Internet of Things) environment, will be beneficial. Nordex is committed to providing equal employment opportunities, and all decisions are made without regard to protected characteristics, in full compliance with all applicable laws and legislation. Join us at Nordex, where we are passionate about driving forward the expansion of alternative energies worldwide, and be a part of the #TeamNordex. We look forward to receiving your application!
Posted 1 week ago
5.0 - 15.0 years
0 Lacs
karnataka
On-site
As a Senior eCommerce Data Analyst Consultant with over 15 years of experience, you will play a crucial role in our data analytics organization. Your responsibilities will include conducting advanced data analysis, providing actionable insights, designing scalable data solutions, and guiding the strategic direction of our analytics capabilities. Leveraging your deep understanding of B2B eCommerce, you will drive significant business impact and empower data-driven decision-making across the company. In addition, you will mentor junior analysts and collaborate closely with engineering teams to build a robust and efficient data infrastructure. Your key responsibilities will involve leading data analysis initiatives, spearheading complex projects such as ecommerce clickstream and user behavior analysis, and identifying strategic opportunities for influencing business decisions and product strategy. You will be tasked with designing and implementing scalable data models and data warehousing solutions within GCP BigQuery to support our growing analytics needs. Providing technical guidance and mentorship to junior data analysts will be essential in fostering their growth and development in data analysis techniques, tools, and best practices. Collaborating with product management and engineering teams, you will define and prioritize data-related requirements for the product roadmap. You will work directly with the customer ecommerce product team to articulate and present analytical findings, as well as brainstorm ideas for the product roadmap. Advanced clickstream and user behavior analysis using Adobe Analytics will be a key part of your role, helping optimize conversion funnels and personalize user experiences. You will define frameworks for monitoring the performance of our ecommerce platform and key business metrics, proactively identifying areas for optimization and improvement. 
Collaborating with data engineering teams, you will design, build, and maintain robust and reliable data pipelines that feed our analytics platforms. Advocating for data quality and governance, you will establish and enforce data quality standards and governance policies to ensure the accuracy, consistency, and integrity of our data assets. Additionally, you will research and evaluate emerging data analytics technologies and tools, recommending and implementing solutions that enhance our analytical capabilities and efficiency. Effectively communicating complex data insights and technical solutions to both technical and non-technical audiences, including senior leadership, through compelling visualizations and presentations will be a key aspect of your role. To qualify for this position, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, Engineering, or a related quantitative field. You should have extensive experience as a Data Analyst, with a significant focus on ecommerce analytics and architectural responsibilities. Expert-level proficiency in Adobe Analytics, mastery of SQL, advanced programming skills in Python, and experience in designing and implementing data models and data warehousing solutions in a cloud environment are required. Strong analytical, communication, and interpersonal skills, along with the ability to mentor and guide junior team members, will be essential for success in this role. If you are passionate about leveraging data analytics to drive business impact and are looking to join a dynamic team at LTIMindtree, apply now to be a part of our global technology consulting and digital solutions company.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Development Lead in Power Automate and Python at Astellas Pharma Inc., you will play a key role in designing and implementing robust, high-performance applications in Python. Your responsibilities will include analyzing business processes, designing automated workflows using Microsoft Power Automate, and developing scalable, efficient process automations. You will collaborate with cross-functional teams to integrate user-facing workflow solutions and ensure smooth operations.

You will stay current with the latest trends and technologies in automation, workflow management, and Python development. You will be responsible for monitoring and troubleshooting workflow processes, training users on workflow management tools, and documenting procedures for workflow management systems. Additionally, you will provide technical support to internal users, champion continuous improvement initiatives, and participate in the continuous delivery pipeline.

To qualify for this position, you should have a Bachelor's degree in computer science or a related field and 5-7 years of experience in Python/automation tool development and workflow management. You must have a solid understanding of Python libraries and frameworks, as well as experience in software development and coding in automation languages. Knowledge of front-end technologies, database languages, and frameworks/libraries is advantageous. Excellent problem-solving skills, analytical thinking, and communication abilities are essential for this role. Experience working in agile development environments, adherence to DevOps principles, and technical proficiency in SQL, ML, Python, Microsoft Power Automate, and related technologies are also required. Prior experience within the Life Sciences/Pharma/Manufacturing industry is preferred.
Certifications in automation platforms, Microsoft Power Automate, Python, or related areas, as well as training in machine learning or artificial intelligence, are desirable. Subject-matter expertise in data architecture, engineering, operations, or reporting within the Life Sciences/Pharma industry is a plus, and experience with cloud-based automation, DevOps practices, and agile methodologies will be beneficial.

This permanent position is based in Bengaluru, India, with a hybrid work model of 2-3 days per week on-site. Successful candidates should be willing to work across different time zones and locations based on demand. Astellas is committed to equality of opportunity in all aspects of employment, including Disability/Protected Veterans.

Join Astellas and contribute to the development of innovative therapies that bring value and hope to patients worldwide. Your expertise in Power Automate and Python will be instrumental in driving the continuous improvement and delivery of critical IT solutions at Astellas.
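The workflow monitoring and troubleshooting duties above can be illustrated with a minimal Python sketch. This is not Power Automate itself; the step names (`normalize`, `validate`) and the payload are hypothetical, and the point is only the pattern of running ordered steps with logging and fail-fast error handling.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("workflow")

def run_workflow(steps, payload):
    """Run steps in order, logging each one; stop on the first failure."""
    for step in steps:
        try:
            payload = step(payload)
            log.info("step %s ok", step.__name__)
        except Exception as exc:
            log.error("step %s failed: %s", step.__name__, exc)
            raise
    return payload

# Illustrative steps; names and logic are invented for the sketch.
def normalize(record):
    # Trim and lowercase field names so downstream steps see a stable schema.
    return {k.strip().lower(): v for k, v in record.items()}

def validate(record):
    if "id" not in record:
        raise ValueError("missing id")
    return record

result = run_workflow([normalize, validate], {" ID ": 1, "Name": "x"})
print(result)  # {'id': 1, 'name': 'x'}
```

Because every step is logged by name, a failed run points directly at the step to troubleshoot, which is the operational habit the posting asks for.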
Posted 1 week ago
The data architecture job market in India is rapidly expanding, with a high demand for skilled professionals in this field. As companies increasingly rely on data to drive business decisions, the role of data architects has become crucial in designing and managing data infrastructure.
India's major technology hubs are known for their thriving tech industries and offer ample opportunities for data architecture professionals.
The average salary range for data architecture professionals in India varies based on experience levels. Entry-level positions typically start at INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
A typical career path in data architecture may include roles such as Data Analyst, Data Engineer, Data Architect, Senior Data Architect, and Chief Data Officer. As professionals gain more experience and expertise, they can progress to leadership positions such as Director of Data Architecture or Chief Data Officer.
In addition to core data architecture skills, professionals in this field are often expected to have expertise in data modeling, database management, ETL processes, data warehousing, and data governance. Knowledge of programming languages such as SQL, Python, and Java is also beneficial.
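The ETL skills mentioned above can be shown in a minimal, self-contained sketch using only the Python standard library. The column names, salary figures, and seniority bands below are invented for illustration; a real pipeline would read from production sources rather than an inline CSV string.

```python
import csv
import io
import sqlite3

# Extract: parse hypothetical CSV data into dict rows.
raw_csv = "name,salary_lakhs\nasha,6\nravi,18\n"
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: cast salary to int and tag a seniority level (bands invented).
for r in rows:
    r["salary_lakhs"] = int(r["salary_lakhs"])
    r["level"] = "senior" if r["salary_lakhs"] >= 15 else "entry"

# Load: write the transformed rows into a SQLite table and query it back.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (name TEXT, salary_lakhs INTEGER, level TEXT)")
conn.executemany("INSERT INTO staff VALUES (:name, :salary_lakhs, :level)", rows)

senior = conn.execute("SELECT name FROM staff WHERE level = 'senior'").fetchall()
print(senior)  # [('ravi',)]
```

The same extract-transform-load shape scales up to the warehouse tooling named in these listings; only the sources, targets, and transformation logic change.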
As you explore data architecture jobs in India, remember to continuously enhance your skills and stay updated with the latest trends in the field. Prepare thoroughly for interviews by understanding the core concepts of data architecture and showcasing your problem-solving abilities. With dedication and perseverance, you can excel in this dynamic and rewarding career path. Good luck!