What you’ll do
- Design and implement data ingestion and processing pipelines in Airflow.
- Write backend code, creating APIs and microservices for our PCR primer analysis and vaccine design systems.
- Operate large-scale batch data pipelines and backend services.
- Deploy services on a Kubernetes cluster.
- Create systems for MLOps and experiment management.
Background and Experience
- 2+ years of commercial experience writing Python code
- Deep understanding of Python
- Experience building orchestrated pipelines (e.g. Airflow)
- Familiarity with Linux-based development
- Experience building APIs
- Experience working in a team and using automated CI/testing
- Exposure to cloud platforms and tools such as Docker/Kubernetes
Nice to have
- Experience with Azure
- Interest in MLOps
- Experience with Kubernetes
- Experience with asynchronous communication
- Familiarity with Kubeflow, MLflow, or SageMaker