You will be successful in the role if you have:
Confidence with text and string manipulation and regular expressions
Strong production-level Python experience
Experience developing ETL data processing pipelines
Knowledge of RESTful APIs
Understanding of data formats such as XML, JSON, YAML, CSV, and TSV
Good working knowledge of Linux operating systems
Solid understanding of SQL and NoSQL databases
Experience with cloud-based infrastructures (AWS, Azure, GCP)
Attention to detail
Excellent verbal and written communication skills
Good documentation skills (e.g. writing blog posts and technical process documentation)
Responsibilities will include:
Develop data processing pipelines in Python using SDKs and RESTful APIs
Create, maintain, and improve data ingestion and processing pipeline architecture
Work with data pipelines that process large data sets
Integrate data from multiple disparate systems using APIs, Webhooks, JSON, REST, SFTP, etc.
This job comes with several perks and benefits
Time is precious. Make it count. Morning person or night owl, this job is for you.
Get your caffeine fix to get you started and keep you going.
Easy access and a treehugger-friendly workplace.
Social gatherings and games; hang out with your colleagues.
You know how you perform best. Work from your couch, your favorite cafe or abroad when you feel like it.
Your local office is your anchor point, so we placed it centrally for your convenience.