Ciklum
We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.
About the role:
As a Data Architect with Microsoft Fabric, become a part of a cross-functional development team engineering experiences of tomorrow.
We are looking for an experienced Data Architect to design and implement an enterprise-scale data fabric solution leveraging Medallion Architecture (Bronze, Silver, Gold layers). The architect will play a key role in data ingestion, transformation, governance, and analytics enablement on Microsoft Fabric, ensuring seamless integration with cloud and enterprise applications. This role demands a blend of data platform expertise, solution architecture, and business alignment to accelerate our organization's data-driven strategy.
Responsibilities:
Architecture & Strategy
- Design and implement end-to-end Microsoft Fabric architecture using the Medallion data model (Bronze, Silver, Gold).
- Define standards and best practices for data ingestion, curation, and consumption within Microsoft Fabric.
- Establish a governed, metadata-driven approach for lineage, observability, and data quality.
- Partner with business stakeholders to align architecture with analytics, BI, and AI/ML use cases.
Data Ingestion & Transformation
- Architect data pipelines for batch and streaming ingestion into Fabric (from ERP, CRM, APIs, flat files, and external data).
- Implement Bronze layer for raw/landing data, Silver layer for cleansed and conformed data, and Gold layer for business-ready curated data.
- Optimize dataflows, pipelines, and notebooks within Microsoft Fabric for performance and scalability.
- Ensure reusability of transformation logic and conformed dimensions for enterprise-wide reporting.
Governance & Security
- Define frameworks for data governance, security, and compliance across Fabric workspaces.
- Implement role-based access control (RBAC), data masking, and sensitivity labels for regulatory compliance (GDPR, HIPAA, etc.).
- Set up data cataloging, lineage, and quality monitoring using Fabric's built-in capabilities and Microsoft Purview.
Technology & Tools
- Leverage Fabric's components:
  - Data Factory – for orchestration & pipelines
  - Data Engineering – for Spark notebooks & transformations
  - Data Science – for ML pipeline integration
  - Real-Time Analytics – for streaming data ingestion
  - Synapse Data Warehouse (Lakehouse & SQL Endpoint) – for modeling curated data
  - Power BI – for dashboards, semantic models, and self-service analytics
- Integrate Fabric with Microsoft Purview, Azure Active Directory, and external data platforms.
Collaboration & Leadership
- Work closely with data engineers, BI developers, and business analysts to ensure data products meet business requirements.
- Provide technical leadership and mentorship on Microsoft Fabric adoption.
- Act as the subject matter expert (SME) for Medallion architecture implementation in Fabric.
Requirements:
We know that sometimes, you can’t tick every box. We would still love to hear from you if you think you’re a good fit!
- 5+ years of experience coding in SQL, Java, Python, Scala, with solid CS fundamentals including data structures and algorithm design
- 3+ years contributing to production deployments of large backend data processing and analysis systems as a team lead
- 2+ years of hands-on implementation experience working with a combination of the following technologies: Hadoop, MapReduce, Pig, Hive, Impala, Spark, Kafka, Storm, SQL and NoSQL data warehouses such as HBase and Cassandra
- 3+ years of experience in cloud data platforms (AWS, Azure, GCP)
- Detailed knowledge and understanding of Data Governance process definition, development and deployment, including data quality, security and other data management standards & practices
- Demonstrable experience in leading development teams, including managing the team's backlog, assigning tasks to team members, and maintaining development standards and the peer review process
- Ability to work on and manage multiple projects simultaneously while adapting to changing priorities in a fast-paced environment
- Knowledge of SQL and MPP databases (e.g. Vertica, Netezza, Greenplum, Aster Data)
- Knowledge of professional software engineering best practices across the full software development life cycle
- Knowledge of Data Warehousing, design, implementation and optimization
- Knowledge of Data Quality testing, automation and results visualization
- Knowledge of BI report and dashboard design and implementation (Power BI, Tableau)
- Knowledge of development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Experience participating in an Agile software development team, e.g. Scrum
- Experience designing, documenting, and defending designs for key components in large distributed computing systems
- A consistent track record of delivering exceptionally high-quality software on large, complex, cross-functional projects
- Demonstrated ability to learn new technologies quickly and independently
- Understanding of cloud infrastructure design and implementation
- Undergraduate degree in Computer Science or Engineering from a top CS program required. Master's preferred
- Experience with supporting data scientists and complex statistical use cases highly desirable
Desirable:
- Experience in data science and machine learning
- Experience in backend development and deployment
- Experience in CI/CD configuration
- Good knowledge of data analysis in enterprises
- Experience with Databricks, Snowflake
- Experience with Kubernetes
What's in it for you?
- Care: your mental and physical health is our priority. We provide comprehensive company-paid medical insurance, as well as financial and legal consultation
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Flexibility: hybrid work mode at Chennai or Pune
- Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team can help you change projects when needed so you can grow, excel professionally and fulfil your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events
About us:
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you’ll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.
India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level.
Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn.
Explore, empower, engineer with Ciklum!
Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.