When I assumed responsibility for the purchasing department, I began collaborating with the development team
to build web scrapers and crawlers.
Our goal was to monitor competitor pricing, enabling us to make better-informed purchase decisions and improve
our margins.
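A minimal version of one of these scrapers might look like the sketch below; the URL, CSS selectors, and table layout are illustrative placeholders, not a real target:

```python
import sqlite3

import requests
from bs4 import BeautifulSoup

# Hypothetical competitor listing page and selectors; the real targets,
# parsers, and storage were specific to each retailer.
URL = "https://competitor.example.com/category/widgets"


def scrape_prices(url: str) -> list[tuple[str, float]]:
    """Fetch a listing page and extract (product_name, price) pairs."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    items = []
    for card in soup.select("div.product-card"):
        name = card.select_one("h2.title").get_text(strip=True)
        price_text = card.select_one("span.price").get_text(strip=True)
        price = float(price_text.replace("$", "").replace(",", ""))
        items.append((name, price))
    return items


def store_prices(rows: list[tuple[str, float]]) -> None:
    """Persist scraped prices so SQL queries can compare them over time."""
    con = sqlite3.connect("prices.db")
    con.execute(
        "CREATE TABLE IF NOT EXISTS competitor_prices "
        "(product TEXT, price REAL, scraped_at TEXT DEFAULT CURRENT_TIMESTAMP)"
    )
    con.executemany(
        "INSERT INTO competitor_prices (product, price) VALUES (?, ?)", rows
    )
    con.commit()
    con.close()


if __name__ == "__main__":
    store_prices(scrape_prices(URL))
```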
We implemented alerts for low prices and other key metrics, using Python and SQL for these tasks.
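The alert itself was essentially a SQL check over the scraped data. Here is a simplified sketch, assuming the competitor_prices table above plus a hypothetical our_prices table, with a print standing in for the real notification channel:

```python
import sqlite3

# Flag products where a competitor currently undercuts our price by
# more than 5%. Table and column names are illustrative assumptions.
ALERT_QUERY = """
SELECT c.product, c.price AS competitor_price, o.price AS our_price
FROM competitor_prices AS c
JOIN our_prices AS o ON o.product = c.product
WHERE c.price < o.price * 0.95
"""


def check_low_prices(db_path: str = "prices.db") -> None:
    con = sqlite3.connect(db_path)
    for product, competitor_price, our_price in con.execute(ALERT_QUERY):
        # In production this would go to email or chat; here we just print.
        print(f"ALERT: {product} at {competitor_price} vs our {our_price}")
    con.close()


if __name__ == "__main__":
    check_low_prices()
```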
I then adopted Apache Airflow to orchestrate our data pipelines and extended them with report generation
through the Google Sheets API.
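A stripped-down version of one of those pipelines could look like this, using the gspread client as one common way to talk to the Google Sheets API; the DAG id, schedule, credentials file, and sheet name are placeholders:

```python
from datetime import date, datetime

import gspread
from airflow import DAG
from airflow.operators.python import PythonOperator


def publish_report() -> None:
    """Append one summary row to a Google Sheet via the Sheets API."""
    # Service-account credentials file and spreadsheet name are placeholders.
    gc = gspread.service_account(filename="service_account.json")
    worksheet = gc.open("Pricing Report").sheet1
    # In the real pipeline this row came out of a SQL aggregation.
    worksheet.append_row([date.today().isoformat(), "daily price summary"])


with DAG(
    dag_id="daily_pricing_report",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="publish_report", python_callable=publish_report)
```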
We also explored rapid prototyping of applications within the Google ecosystem.
This let us test new ideas quickly and cheaply, without committing to extensive development.
In parallel, I developed multiple automation processes to reduce manual workloads across different teams.
This included integrating third-party APIs to access key information, such as product stock levels
and sales conditions.
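These integrations generally followed the same shape; the endpoint, auth token, and response fields in this sketch are hypothetical stand-ins for the actual supplier APIs:

```python
import os
import sqlite3

import requests

# Hypothetical supplier endpoint and token; each real integration had
# its own auth scheme and payload format.
API_URL = "https://api.supplier.example.com/v1/stock"
API_TOKEN = os.environ.get("SUPPLIER_API_TOKEN", "")


def fetch_stock_levels() -> list[dict]:
    """Pull current stock levels from a third-party API."""
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: [{"sku": "...", "quantity": 0}, ...]
    return response.json()


def upsert_stock(rows: list[dict], db_path: str = "purchasing.db") -> None:
    """Mirror the supplier's stock levels into our local database."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS supplier_stock "
        "(sku TEXT PRIMARY KEY, quantity INTEGER)"
    )
    con.executemany(
        "INSERT INTO supplier_stock (sku, quantity) VALUES (?, ?) "
        "ON CONFLICT(sku) DO UPDATE SET quantity = excluded.quantity",
        [(item["sku"], item["quantity"]) for item in rows],
    )
    con.commit()
    con.close()


if __name__ == "__main__":
    upsert_stock(fetch_stock_levels())
```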
Most recently, I took on building our company's Data Warehouse single-handedly.
The first data mart I established models customer orders; I keep it up to date with SQL queries
scheduled through Airflow.
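The refresh boils down to a scheduled SQL upsert. This sketch assumes a Postgres warehouse with illustrative staging and mart table names, and uses Airflow's Postgres provider for the scheduled execution:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

# Hypothetical table and column names; the real mart models customer
# orders in more detail. Rows touched in the last day are upserted.
REFRESH_SQL = """
INSERT INTO mart_customer_orders (order_id, customer_id, order_total, updated_at)
SELECT order_id, customer_id, order_total, now()
FROM staging_orders
WHERE updated_at > now() - interval '1 day'
ON CONFLICT (order_id) DO UPDATE
SET order_total = EXCLUDED.order_total,
    updated_at = EXCLUDED.updated_at;
"""

with DAG(
    dag_id="refresh_customer_orders_mart",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PostgresOperator(
        task_id="refresh_mart",
        postgres_conn_id="warehouse",  # assumed Airflow connection id
        sql=REFRESH_SQL,
    )
```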