About the job
Responsibilities:
Design, develop, and maintain full-stack applications tailored for data acquisition, including internal tools and comprehensive dashboards.
Work collaboratively with cross-functional teams, such as Data Processing, Architecture, and Scaling, to facilitate efficient data ingestion and workflow management.
Craft and implement APIs that streamline data interactions between internal systems and external data sources.
Enhance user experience by creating intuitive web-based interfaces for effective management and monitoring of data pipelines.
Optimize backend services for performance, scalability, and security in a distributed computing environment.
Collaborate with legal and compliance teams to ensure our data acquisition practices meet privacy regulations and adhere to industry best practices.
Deploy and maintain infrastructure leveraging Kubernetes and Infrastructure-as-Code (IaC) methodologies.
Analyze system performance, conduct thorough experiments, and refine data workflows to drive operational efficiency.
Qualifications:
Bachelor's, Master's, or PhD in Computer Science or a related discipline.
4+ years of professional experience in full-stack development.
Expertise in frontend frameworks (e.g., React, Vue) and backend technologies such as Python, Node.js, or Go.
Strong command of RESTful APIs, GraphQL, and database design (both SQL and NoSQL).
Experience in developing data-intensive applications capable of managing large-scale datasets.
Familiarity with cloud platforms (AWS, GCP, Azure) and with containerization and orchestration technologies (Docker, Kubernetes).
Previous experience in web crawling and large-scale data processing is advantageous.
Exceptional problem-solving abilities and the capacity to manage multiple priorities in a fast-paced environment.
Outstanding communication and collaboration skills.