Periyapalaiyam, Tamil Nadu
INR 0.15 - 0.18 Lacs P.A.
Work from Office
Full Time
Monitoring and overseeing plant operations. Assisting with startup, shutdown, and operation of facility equipment. Identifying problems that arise and resolving them. Ensuring that safety and environmental rules and programs are strictly adhered to. Conceptualizing and recommending plant improvement strategies. Carrying out site inspections and audits. Performing preventive maintenance. Observing gauges, dials, switches, alarms, and other indicators to ensure that all machines are working properly. Training new hires and cross-training other staff members. Maintaining a clean, hazard-free work environment.

Work location: Periyapalaiyam, Medavakkam
Experience: 2 to 5 years
Qualification: Completed Diploma or ITI preferred
Male candidates only
Job Type: Full-time
Pay: ₹15,000.00 - ₹18,000.00 per month
Benefits: Cell phone reimbursement
Schedule: Day shift; weekend availability
Supplemental Pay: Commission pay; performance bonus; yearly bonus
Language: English (Preferred)
Work Location: In person
India
INR Not disclosed
Remote
Full Time
Python JD

Role Summary:
We are seeking a skilled Python Developer with strong experience in data engineering, distributed computing, and cloud-native API development. The ideal candidate will have hands-on expertise in Apache Spark, Pandas, and workflow orchestration using Airflow or similar tools, along with deep familiarity with AWS cloud services. You'll work with cross-functional teams to build, deploy, and manage high-performance data pipelines, APIs, and ML integrations.

Key Responsibilities:
- Develop scalable and reliable data pipelines using PySpark and Pandas.
- Orchestrate data workflows using Apache Airflow or similar tools (e.g., Prefect, Dagster, AWS Step Functions).
- Design, build, and maintain RESTful and GraphQL APIs that support backend systems and integrations.
- Collaborate with data scientists to deploy machine learning models into production.
- Build cloud-native solutions on AWS, leveraging services like S3, Glue, Lambda, EMR, RDS, and ECS.
- Support a microservices architecture with containerized deployments using Docker and Kubernetes.
- Implement CI/CD pipelines and maintain version-controlled, production-ready code.

Required Qualifications:
- 3–5 years of experience in Python programming with a focus on data processing.
- Expertise in Apache Spark (PySpark) and Pandas for large-scale data transformations.
- Experience with workflow orchestration using Airflow or similar platforms.
- Solid background in API development (RESTful and GraphQL) and microservices integration.
- Proven hands-on experience with AWS cloud services and cloud-native architectures.
- Familiarity with containerization (Docker) and CI/CD tools (GitHub Actions, CodeBuild, etc.).
- Excellent communication and cross-functional collaboration skills.

Preferred Skills:
- Exposure to infrastructure-as-code (IaC) tools like Terraform or CloudFormation.
- Experience with data lake/warehouse technologies such as Redshift, Athena, or Snowflake.
- Knowledge of data security best practices, IAM role management, and encryption.
- Familiarity with monitoring/logging tools like Datadog, CloudWatch, or Prometheus.

PySpark, Pandas, and data transformation or workflow experience is a MUST (at least 2 years).
Pay: Attractive salary
Interested candidates can call or WhatsApp their resume to 9092626364.
Job Type: Full-time
Benefits: Cell phone reimbursement; work from home
Schedule: Day shift; weekend availability
Work Location: In person
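The JD above asks for hands-on Pandas data-transformation experience. As a rough illustration of the kind of pipeline step it describes (the dataset and column names `user_id` / `amount` are invented for this sketch, not taken from the posting):

```python
import pandas as pd

# Hypothetical raw event data standing in for a pipeline input.
raw = pd.DataFrame(
    {
        "user_id": ["a", "a", "b", "b", "b"],
        "amount": [10.0, 5.0, 3.0, 4.0, 8.0],
    }
)

# A typical transformation step: filter rows, then aggregate per key.
summary = (
    raw[raw["amount"] > 3.0]              # keep transactions above 3.0
    .groupby("user_id", as_index=False)   # one row per user
    .agg(total=("amount", "sum"), n_txns=("amount", "count"))
)

print(summary)
#   user_id  total  n_txns
# 0       a   15.0       2
# 1       b   12.0       2
```

In a production pipeline the same filter/groupby/aggregate shape would typically run on a PySpark DataFrame for large datasets, with the step scheduled as a task in an Airflow DAG.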