About
Aspiring Data Engineer with a strong foundation in Python and SQL, specializing in building efficient data workflows that have improved processing speeds by up to 30%. Eager to apply expertise in real-time data processing and advanced analytics to deliver impactful solutions in a dynamic tech environment.
Work
Ahmedabad, Gujarat, India
Summary
Currently contributing to data engineering and automation initiatives for a leading IT solutions provider, focusing on real-time data ingestion and workflow optimization.
Highlights
Engineered and deployed real-time data ingestion pipelines using Apache NiFi, reducing manual data processing by 60%.
Optimized complex SQL queries, enhancing data retrieval speed by 50% for critical data analysis and reporting.
Developed and implemented Python-based automation scripts for data workflows, minimizing manual processing effort and improving operational efficiency.
Managed Linux system administration tasks, including user access, system updates, and performance monitoring, ensuring stable data infrastructure.
Leveraged Hadoop and HDFS for robust batch processing of large-scale datasets, supporting critical data analytics initiatives.
Education
Skills
Programming Languages
Python, SQL (MySQL).
Cloud Platforms
AWS S3, AWS EC2, AWS RDS, AWS DynamoDB, AWS Lambda, AWS IAM, Snowflake.
Big Data Technologies
Databricks, PySpark, Apache Hadoop, Apache Spark, Apache NiFi, MongoDB, HDFS.
Operating Systems
Linux.
Business Intelligence & Visualization
Power BI, QuickSight.
Professional Skills
Communication, Problem-Solving, Time Management, Adaptability, Mentoring, Data Workflow Automation, ETL, Schema Detection, Serverless Architecture.
Projects
S3 Auto-Upload with Email Alerts
Summary
Automated S3 uploads by scanning local directories with Python (boto3 + os.walk) and uploading files to timestamped S3 subfolders. Configured Apache NiFi to send email notifications after each successful upload, providing real-time visibility and reliable backups.
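The upload flow described above could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the bucket name, the `backups` prefix, and the helper names are hypothetical, and running the upload itself assumes AWS credentials are configured in the environment.

```python
# Hypothetical sketch: walk a local directory and upload each file to a
# timestamped subfolder in S3 using boto3. Names and paths are illustrative.
import os
from datetime import datetime, timezone


def build_s3_key(root: str, path: str, prefix: str, stamp: str) -> str:
    """Map a local file path to an S3 key under a timestamped subfolder."""
    rel = os.path.relpath(path, root).replace(os.sep, "/")
    return f"{prefix}/{stamp}/{rel}"


def upload_directory(root: str, bucket: str, prefix: str = "backups") -> list:
    """Walk `root` and upload every file found to `bucket` under `prefix`."""
    import boto3  # assumes AWS credentials are available to boto3

    s3 = boto3.client("s3")
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    uploaded = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            local = os.path.join(dirpath, name)
            key = build_s3_key(root, local, prefix, stamp)
            s3.upload_file(local, bucket, key)  # boto3's high-level upload
            uploaded.append(key)
    return uploaded
```

Keeping the key construction in a separate pure function makes the path-to-key mapping easy to test without touching AWS; the NiFi email notification would then be triggered downstream of these uploads.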