About Lowe's
Lowe's is a FORTUNE 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.
Lowe's India
Lowe's India, the Global Capability Center of Lowe's Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe's India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.
About the Team
This team is responsible for building and maintaining critical enterprise platforms and frameworks that empower internal developers and drive key business functions. Their work spans the entire software development lifecycle and customer journey, encompassing tools like an Internal Developer Portal, front-end frameworks, A/B testing and customer insights platforms, workflow and API management solutions, a Customer Data Platform (CDP), and robust testing capabilities including performance and chaos testing. This team is instrumental in providing the foundational technology that enables innovation, efficiency, and a deep understanding of their customers.
Job Summary:
As a Software Engineer with a focus on data engineering, you will play a critical role in building and optimizing our data infrastructure. Your responsibilities will include designing and implementing scalable data pipelines, working with various data processing frameworks, and ensuring data quality and availability for analytics and decision-making processes.
Roles & Responsibilities:
Core Responsibilities:
- Data Engineering: Design, build, and maintain robust data pipelines to ingest, process, and analyze large datasets.
- System Design & Architecture: Contribute to the design and implementation of distributed systems that ensure high availability, low latency, and fault tolerance at scale.
- Testing & Quality Assurance: Implement best practices for data governance, data quality, and security in data engineering workflows.
- Debugging & Troubleshooting: Investigate and resolve bugs and performance issues in both development and production environments.
- Code Review & Mentorship: Participate in code reviews, share knowledge with peers, and learn from senior engineers to continuously improve code quality and team productivity.
- Technical Documentation: Document system design, APIs, and service behavior to facilitate maintainability and cross-team collaboration.
- Cross-functional Collaboration: Work with engineering and analytics teams to understand their needs, ensure compliance, and provide tools that empower them to make data-driven decisions.
- Innovation & Learning: Stay updated with emerging trends in data engineering and continuously suggest improvements to our existing systems.
- Operational Excellence: Monitor data pipeline performance and troubleshoot issues to ensure high availability and reliability of data services.
Years of Experience:
- 2 years of experience in software development or a related field
- 2 years of experience working on projects involving the implementation of solutions applying development life cycles (SDLC) through iterative agile development
- 2 years of experience in data engineering or software development with a strong emphasis on data processing
Required Minimum Qualifications:
- 2 years of experience writing technical documentation in a software environment and developing and implementing business systems within an organization
- Bachelor's degree in computer science, computer information systems, or a related field (or equivalent work experience in lieu of a degree)
Skill Set Required
- Solid knowledge of software design patterns, distributed systems, and microservices architecture.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience working with OLAP databases like Druid and ClickHouse.
- Familiarity with tools such as Kafka, Git/Bitbucket, the ELK Stack, Prometheus, and Grafana.
Secondary Skills (desired)
- Exposure to containerization and orchestration tools (e.g., Docker, Kubernetes).
- Understanding of performance tuning and system reliability engineering concepts.
- Knowledge of security best practices and compliance in data handling.
- Familiarity with large-scale data processing frameworks (e.g., Apache Flink or Spark).
- Experience with cloud platforms such as Google Cloud Platform (GCP) or others.