Experience Level
Entry Level
Qualifications
We are looking for candidates with a strong foundation in data engineering, particularly in web scraping technologies. Familiarity with programming languages such as Python or JavaScript, experience with data extraction frameworks, and a solid understanding of data structures and algorithms are essential. A Bachelor’s Degree in Computer Science, Data Science, or a related field is preferred.
About the job
Join 10alabs as a Data Engineer specializing in web scraping, where you will play a pivotal role in gathering and processing data from diverse online sources. Your expertise will help drive our data-driven decision-making processes, enhance our data pipelines, and optimize our web scraping solutions.
About 10alabs
10alabs is an innovative technology company based in San Francisco, dedicated to leveraging data to drive business insights and solutions. Our team is passionate about technology and committed to creating impactful products that enhance the way businesses operate.
Similar jobs
Freelance Python Data Scraping Engineer
Part-time|$32/hr|Remote — Virginia, United States
Mindrift seeks a Freelance Python Data Scraping Engineer to support specialized data workflows for the Tendem project. This is a remote, part-time contract based in Virginia, United States. The role centers on complex data extraction, quality control, and collaboration with Tendem Agents who automate routine tasks.

Role overview
This position involves designing and managing data scraping operations for dynamic and interactive websites. The focus is on delivering structured, reliable datasets that meet project requirements. As an "AI Pilot," the engineer will handle the more challenging aspects of extraction while ensuring data quality and actionable outcomes.

What you will do
Lead end-to-end extraction from complex sites, producing accurate and structured data.
Utilize internal tools like Apify and OpenRouter, along with custom Python workflows, for collecting, validating, and processing data.
Adjust techniques to extract information from JavaScript-heavy or interactive sources.
Perform validation checks and consistency controls to maintain high data quality before delivery.
Scale operations for large datasets by batching or parallelizing tasks, monitor for failures, and adapt to changes in website structures.

Compensation
Rates reach up to $32 per hour, based on experience, speed, and project complexity. Actual pay may vary depending on the specific assignment and required skills.

How to apply
Submit an application to this post and complete the qualification steps. Candidates who qualify may join projects that fit their technical background and availability. Assignments include coding, automation, and refining AI-driven outputs, contributing to practical AI solutions.
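The scaling approach these listings repeatedly describe (batching plus parallelization, with validation checks before delivery) can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not Mindrift's actual tooling; the fetch_record function is a stand-in for a real HTTP or Apify call, and the field names are invented.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_record(url):
    # Stand-in for a real HTTP request (e.g. via requests or an Apify actor);
    # it fabricates a record so the sketch stays self-contained and runnable.
    return {"url": url, "title": f"item from {url}"}

def validate(record):
    # Minimal quality gate: required fields present and non-empty.
    return bool(record.get("url")) and bool(record.get("title"))

def scrape_in_batches(urls, batch_size=10, workers=4):
    """Process URLs in fixed-size batches, parallelizing within each batch."""
    results = []
    for start in range(0, len(urls), batch_size):
        batch = urls[start:start + batch_size]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for record in pool.map(fetch_record, batch):
                if validate(record):  # drop records that fail the checks
                    results.append(record)
    return results

urls = [f"https://example.com/page/{i}" for i in range(25)]
records = scrape_in_batches(urls, batch_size=10, workers=4)
print(len(records))  # 25
```

Batching keeps memory bounded on large crawls, while the thread pool parallelizes the I/O-bound fetches inside each batch; a production version would add retries and failure monitoring.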
Part-time|$32/hr|Remote — San Antonio, Texas, United States
Mindrift brings together technical specialists and AI-driven projects from major technology innovators. The platform’s goal is to blend generative AI with real-world expertise from a global network.

Role overview
This part-time, contract position focuses on building and maintaining Python-based data scraping workflows for the Tendem project. The role is open to candidates in San Antonio, Texas, or working remotely from anywhere in the United States. As a Freelance Python Data Scraping Engineer, you will work within a hybrid system that combines AI and human input. Internally, this position is known as an AI Pilot, collaborating with Tendem Agents who manage routine tasks. The AI Pilot uses domain expertise and quality assurance skills to deliver reliable, actionable datasets.

What you will do
Oversee end-to-end data extraction workflows on complex websites, ensuring accurate and well-structured results.
Utilize internal tools such as Apify and OpenRouter, along with custom Python scripts, to manage data collection, validation, and processing.
Adjust extraction methods for dynamic or JavaScript-heavy sites, refining techniques as website behaviors evolve.
Apply data quality checks, including validation, cross-source consistency, and adherence to formatting standards before delivering datasets.
Scale up scraping operations for large data sets using batching or parallelization, monitor workflow stability, and address errors from minor site changes.

Requirements
Minimum of 3 years’ experience in data engineering, web scraping, automation, or a closely related technical field.
Strong Python programming skills and practical experience with data extraction tools.
Demonstrated ability to think critically and solve problems.
High attention to detail and a strong focus on data quality.

Compensation
Earn up to $32 per hour, depending on experience and contribution speed. Actual pay depends on project scope, complexity, and required expertise. Other projects may offer different rates.

How to apply
Submit your application to this post. Qualified candidates may be invited to work on projects that align with their technical skills and availability. Projects include coding, automation, and optimizing AI outputs, contributing directly to practical AI applications.
Part-time|$32/hr|Remote — Iowa, United States
Mindrift brings together professionals from around the world to work on AI projects for major technology companies. The team’s focus is on advancing Generative AI by connecting specialists with real-world expertise.

Role overview
This part-time, remote contract is for a Freelance Python Data Scraping Engineer (AI Pilot) supporting the Tendem project. Candidates must be based in Iowa, United States. The work centers on managing and carrying out web data extraction tasks, collaborating closely with Tendem Agents, and applying critical thinking to ensure the accuracy and relevance of collected data. Quality assurance is a key part of the position.

What you will do
Manage end-to-end data extraction workflows for complex websites, delivering structured datasets with precision and reliability.
Use internal tools such as Apify and OpenRouter, along with custom workflows, to collect, validate, and process data according to project needs.
Adapt scraping methods for dynamic web sources, including handling JavaScript-rendered content and responding to changing site behaviors.
Apply strict data quality standards, running validation checks and systematic verification before delivering results.
Scale operations for large datasets using batching or parallelization, monitor for failures, and maintain stability when site structures change.

Requirements
Minimum 3 years of experience in data engineering, web scraping, automation, or a related field.

Compensation
Earn up to $32 per hour, based on expertise and contribution speed. Actual pay may vary depending on project scope and complexity, and may differ across projects on the Mindrift platform.

How to apply
Submit an application through this posting to be considered for projects that fit your technical background and availability. Work may involve coding, automation, or refining AI outputs, all contributing to AI advancement and practical use cases.
Part-time|$32/hr|Remote — New York, United States
Mindrift connects technical experts with AI-driven projects, focusing on the intersection of generative AI and specialized human knowledge. The company partners with technology leaders to deliver real-world solutions powered by collaborative intelligence.

Role overview
This part-time, remote contract position centers on advanced data extraction and processing for the Tendem project. As a Freelance Python Data Scraping Engineer (called "AI Pilot" at Mindrift), the role works within an AI-human hybrid system. Engineers collaborate with Tendem Agents, who handle routine tasks, while focusing on critical thinking and applying domain expertise to produce accurate, actionable data insights.

What you will do
Run end-to-end workflows to extract data from complex websites, delivering structured datasets with high accuracy.
Use internal tools such as Apify and OpenRouter, as well as custom-built solutions, to collect, validate, and process data according to project needs.
Adapt extraction methods for dynamic or JavaScript-rendered content and evolving site structures.
Ensure data quality through validation checks, cross-source comparisons, formatting standards, and systematic verification before delivery.
Optimize large-scale scraping with batching or parallelization, monitor for failures, and maintain resilience to minor layout changes.

Requirements
Minimum of 3 years of experience in data engineering, web scraping, automation, or a closely related technical field.

Compensation
Earn up to $32 per hour on this project, depending on experience and pace. Actual rates may vary by project scope, complexity, and required skills. Other projects on the Mindrift platform may offer different compensation based on their needs.

How to apply
Submit an application to this posting. After qualifying, join projects that fit your technical strengths and work on a flexible schedule. Tasks may include coding, automation, and refining AI outputs, all contributing to the advancement of AI and its real-world uses.
Part-time|$32/hr|Remote — Wisconsin, United States
Mindrift connects specialists with AI-driven projects from technology innovators. The company blends generative AI with expertise from contributors worldwide.

Role overview
This part-time, remote position is open to candidates based in Wisconsin, United States. As a Freelance Python Data Scraping Engineer, you will support the Tendem project by executing specialized data scraping workflows within an AI and human collaboration system. Internally, this role is called an AI Pilot. Work closely with Tendem Agents to tackle repetitive tasks, applying critical thinking and domain knowledge to deliver accurate, actionable data. Consistent quality control and attention to detail are essential in this role.

Main responsibilities
Manage end-to-end data extraction workflows across complex websites, ensuring thorough coverage and accuracy.
Use internal tools such as Apify and OpenRouter, along with custom-built workflows, to accelerate data collection, validation, and task completion.
Adapt extraction strategies for dynamic web sources, including those with JavaScript-rendered content or changing structures.
Apply validation checks and systematic verification to maintain data quality before delivering results.
Scale scraping operations for large datasets using efficient methods that remain stable as target sites evolve.

Compensation
Earn up to $32 per hour, depending on experience and pace of contribution. Actual pay may vary by project scope and complexity.

Application process
Submit your application and demonstrate your technical skills to be considered. This freelance role offers the chance to contribute to real-world AI projects while working on a flexible schedule.
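The "validation checks and systematic verification" mentioned across these roles often amount to a pre-delivery quality report over the scraped records. A hypothetical sketch in plain Python follows; the record fields (id, name, price) are invented for illustration and would differ per project.

```python
def quality_report(records, required=("id", "name", "price")):
    """Run basic pre-delivery checks: required fields, empty values, duplicate ids."""
    # Records missing a required field, or carrying an empty value for one.
    missing = [
        r for r in records
        if any(k not in r or r[k] in (None, "") for k in required)
    ]
    # Cross-record consistency check: the same id must not appear twice.
    seen, dupes = set(), []
    for r in records:
        rid = r.get("id")
        if rid in seen:
            dupes.append(rid)
        seen.add(rid)
    return {
        "total": len(records),
        "missing_fields": len(missing),
        "duplicate_ids": len(dupes),
    }

sample = [
    {"id": 1, "name": "Widget", "price": 9.99},
    {"id": 1, "name": "Widget copy", "price": 9.99},  # duplicate id
    {"id": 2, "name": "", "price": 4.50},             # empty required field
]
print(quality_report(sample))  # {'total': 3, 'missing_fields': 1, 'duplicate_ids': 1}
```

A delivery pipeline would typically refuse to ship a batch whose report exceeds agreed thresholds, rather than silently passing flawed data downstream.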
Jobgether is seeking a Senior Python Data Scraping Engineer for a remote freelance role based in the US. This position involves working with a partner company to deliver high-quality data extraction solutions.

Role overview
The main focus is on building and maintaining web data extraction systems that can handle both scale and complexity. Projects often involve scraping dynamic websites and processing large datasets, with a strong emphasis on accuracy and reliability. The workflow combines AI-driven agents with human oversight to ensure quality control as requirements shift.

What you will do
Develop and support scalable, reliable web scraping systems using Python
Tackle advanced scraping challenges, including dynamic content and sizable data volumes
Deliver well-structured data for downstream processes
Collaborate within a hybrid AI-human workflow to maintain accuracy and quality
Adjust scraping strategies as websites and project needs change

Requirements
Significant experience with Python in web scraping contexts
Strong problem-solving skills and adaptability in evolving technical settings
Ability to work independently and deliver precise, high-quality outcomes
Background in managing large, complex datasets

Benefits
Fully remote freelance arrangement with flexible scheduling
Engage with projects that support advanced AI and analytics initiatives
Contribute to dependable datasets for emerging technologies
Contract|$90/hr|Remote — Virginia, United States
Please submit your CV in English and indicate your English proficiency level. Toloka AI, working with Mindrift, offers project-based freelance roles for professionals who want to help test, evaluate, and improve artificial intelligence systems for leading technology companies. These are contract assignments, not permanent positions.

Role overview
The Freelance Data Science Engineer (Python & SQL) works remotely from Virginia, United States, and takes on a variety of AI-related projects. Assignments change from project to project, but the core work centers on designing and validating computational data science challenges that reflect real-world analytical problems across industries like telecommunications, finance, government, e-commerce, and healthcare.

What you will do
Develop data science problems that require Python programming, using libraries such as Pandas, Numpy, Scipy, Sklearn, Statsmodels, Matplotlib, and Seaborn.
Ensure tasks are complex enough to need significant computation and cannot be solved manually in a short time.
Create scenarios involving advanced data processing, statistical analysis, feature engineering, predictive modeling, and business insight generation.
Design deterministic problems with reproducible results, using fixed random seeds if randomness is necessary.
Base assignments on real business challenges like customer analytics, risk assessment, fraud detection, forecasting, optimization, and operational efficiency.
Build end-to-end tasks that cover the data science workflow: data ingestion, cleaning, exploration, modeling, validation, and deployment considerations.
Integrate big data scenarios that require scalable computation strategies.
Validate solutions using Python, standard data science libraries, and statistical methods.
Document each problem clearly, including realistic business contexts and verified solutions.

Requirements
Minimum 5 years of hands-on data science experience with proven business results.
Portfolio of completed projects or publications that highlight practical problem-solving skills.
Advanced Python programming for data science, especially with Pandas, Numpy, Scipy, and Scikit-learn.
Strong background in statistical analysis and machine learning, including algorithms and real-world applications.
Proficiency in SQL and database operations for data analysis.
Experience with Generative AI (LLMs, RAG, prompt engineering, vector databases).
Understanding of MLOps and model deployment processes.
Familiarity with tools such as TensorFlow, PyTorch, and LangChain.
Excellent written English skills at C1 level or higher.

How to join
Apply
Pass qualifications
Join a project
Complete tasks
Receive compensation
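The requirement that problems be "deterministic, with reproducible results, using fixed random seeds" can be illustrated with a small synthetic-dataset generator. This is a toy sketch using only the standard library; the churn scenario, field names, and label rule are all invented for the example.

```python
import random

def generate_challenge_dataset(n=1000, seed=42):
    """Deterministically generate a toy 'customer churn' dataset.

    A fixed seed guarantees every run (and every grader) reproduces
    exactly the same rows, which makes the challenge verifiable.
    """
    rng = random.Random(seed)  # local generator: no global-state side effects
    rows = []
    for i in range(n):
        tenure = rng.randint(1, 72)                # months as a customer
        monthly = round(rng.uniform(20, 120), 2)   # monthly charge in dollars
        churned = int(tenure < 12 and monthly > 80)  # deterministic label rule
        rows.append({
            "customer_id": i,
            "tenure": tenure,
            "monthly": monthly,
            "churned": churned,
        })
    return rows

# Two independent runs with the same seed produce identical data.
a = generate_challenge_dataset(seed=7)
b = generate_challenge_dataset(seed=7)
print(a == b)  # True
```

The same idea carries over to NumPy (numpy.random.default_rng(seed)) and scikit-learn (random_state=...), which the listings name for real challenge authoring.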
Lead Python Engineer - Data Infrastructure

About AscentAI
AscentAI is at the forefront of developing intelligent software solutions tailored for risk and compliance teams within financial institutions. Our innovative platform simplifies complex regulatory information into actionable insights, empowering teams to mitigate risks, enhance operational efficiency, and proactively adapt to changes in global regulations. As a vibrant, mission-driven organization, we are pushing the limits of machine learning and artificial intelligence, combined with human-in-the-loop systems, to tackle some of the most challenging issues in regulatory compliance.

The Role
We are seeking a skilled Python Engineer to join our dynamic team. In this pivotal role, you will lead the design and development of robust, large-scale web scraping platforms that underpin AscentAI's data infrastructure. You will work collaboratively with fellow engineers and analysts to define data requirements, architect efficient data pipelines, and ensure the delivery of reliable, high-quality data at scale. Your expertise will also be critical in advising on scraping strategies, counteracting anti-bot measures, and implementing best practices in data extraction for cross-functional stakeholders in engineering, data science, and product development. This is a significant role that offers ownership and visibility, providing an opportunity to influence our technical architecture and overall business success.

What You’ll Do
Lead the design and development of large-scale web scraping platforms using Python and related frameworks.
Mentor junior developers, providing technical guidance and conducting code reviews to ensure high-quality and maintainable code.
Devise advanced strategies to navigate and overcome sophisticated anti-bot defenses such as CAPTCHAs, Cloudflare, and IP blocking, while adhering to legal and ethical standards and website terms of service.
Collaborate with data analysts and engineers to establish data requirements and facilitate seamless data integration into databases.
Optimize scrapers for performance, speed, and stability; set up real-time monitoring and alert systems to quickly respond to failures or changes in target sites.
Create comprehensive technical documentation and engage effectively with cross-functional teams to ensure alignment and manage expectations.
Part-time|$32/hr|Remote — United States
This role is exclusively available to candidates currently residing in the United States. Your location may influence both eligibility and compensation. Please submit your resume in English, indicating your proficiency level.

Mindrift is on the lookout for talented Vibecode specialists to become integral members of the Tendem project (https://tendem.ai/), leading specialized data scraping initiatives within our innovative hybrid AI and human framework. As an AI Pilot—the title we use at Mindrift for this position—you will work closely with Tendem Agents managing routine tasks, while you apply your critical thinking, domain expertise, and quality assurance skills to ensure the delivery of precise and actionable insights. This part-time remote position is perfect for technical professionals with proven experience in web scraping, data extraction, and processing.

About Mindrift
The Mindrift platform bridges the gap between specialists and AI projects from leading technology innovators. Our vision is to harness the potential of Generative AI by leveraging real-world expertise from around the globe.

Role Overview
This freelance role focuses on the Tendem project. As a Vibecode specialist, you will manage data scraping tasks that demand technical accuracy for web extraction and processing, utilizing various tools including Apify and OpenRouter, as well as your own inventive strategies.

Key Responsibilities
Oversee comprehensive data extraction workflows across intricate websites, ensuring thorough coverage, accuracy, and dependable delivery of structured datasets.
Utilize internal tools (Apify, OpenRouter) and custom workflows to expedite data collection, validation, and task execution while adhering to established requirements.
Guarantee reliable extraction from dynamic and interactive web sources, adjusting methodologies as necessary to accommodate JavaScript-rendered content and evolving site behaviors.
Implement data quality standards through validation checks, consistency controls, formatting adherence, and systematic verifications prior to data delivery.
Scale scraping operations for extensive datasets utilizing efficient batching or parallel processing, monitor for failures, and maintain operational stability against minor site structure changes.

Application Process
To apply, simply submit your application. If qualified, you will have the opportunity to contribute to projects that align with your technical expertise, on a flexible schedule. From coding and automation to refining AI outputs, you will play a pivotal role in enhancing AI capabilities and real-world applications.
Contract|$90/hr|Remote — Iowa, United States
Please submit your resume in English and indicate your English proficiency level. Mindrift connects skilled professionals with project-based AI work for leading technology companies. Projects focus on testing, evaluating, and improving AI systems. This is a freelance, project-based position rather than a permanent staff role.

Role overview
This freelance Data Science Engineer position centers on designing and validating challenging data science problems for real-world business scenarios. Work is fully remote and project-based, with a focus on Python and SQL.

What you will do
Design data science problems that mirror analytical challenges in industries such as telecommunications, finance, government, e-commerce, and healthcare.
Create tasks requiring Python programming, using libraries like Pandas, Numpy, Scipy, scikit-learn, Statsmodels, Matplotlib, and Seaborn.
Ensure problems are computationally intensive, with solutions that may require days or weeks to process.
Develop scenarios involving advanced data processing, statistical analysis, feature engineering, predictive modeling, and generating business insights.
Write deterministic problems with reproducible results by using fixed random seeds or avoiding stochastic elements.
Base challenges on real business cases, including customer analytics, risk assessment, fraud detection, forecasting, optimization, and efficiency improvements.
Cover the full data science workflow: data ingestion, cleaning, exploratory analysis, modeling, validation, and deployment considerations.
Incorporate big data scenarios that require scalable computation strategies.
Validate all solutions in Python, using standard data science libraries and statistical techniques.
Document each problem clearly within a realistic business context and provide accurate, verified answers.

Requirements
Minimum 5 years of hands-on data science experience with proven business results.
Portfolio of completed projects or publications demonstrating real-world problem solving.
Advanced Python skills for data science, including experience with pandas, numpy, scipy, scikit-learn, and statsmodels.
Strong background in statistical analysis and machine learning, with deep understanding of algorithms and their practical use.
Expertise in SQL and database operations for data analysis and manipulation.
Familiarity with Generative AI tools and concepts (LLMs, Retrieval-Augmented Generation, prompt engineering, vector databases).
Understanding of MLOps and model deployment workflows.
Knowledge of modern frameworks such as TensorFlow, PyTorch, and LangChain.
Excellent written English communication skills at C1 level or above.

How to join
Apply
Complete qualifications
Join a project
Fulfill assigned tasks
Receive compensation

Location: Remote, Iowa, United States
Contract|$90/hr|Remote — Wisconsin, United States
Toloka AI offers project-based freelance roles for experienced Data Science Engineers. This position supports leading technology companies by designing and validating data science challenges that reflect real-world analytics scenarios. The engagement is freelance, not permanent employment.

Role overview
Freelance Data Science Engineers at Toloka AI create computational challenges for industries such as telecom, finance, government, e-commerce, and healthcare. Projects require designing problems that use Python and SQL, focusing on tasks that cannot be solved manually. Each challenge covers the full data science workflow, from data ingestion and cleaning to modeling and deployment considerations. Engineers use libraries such as Pandas, NumPy, SciPy, Scikit-learn, Statsmodels, Matplotlib, and Seaborn. Problems must be deterministic, reproducible, and based on realistic business scenarios like customer analytics, risk assessment, fraud detection, forecasting, and operational efficiency. Documentation of each challenge includes business context and a verified solution. Big data processing and scalable computational approaches are often required.

Requirements
At least 5 years of hands-on data science experience with measurable business outcomes
Portfolio of completed projects or publications showing real-world problem-solving
Advanced Python skills for data science, including pandas, numpy, scipy, scikit-learn, and statsmodels
Strong background in statistical analysis and machine learning algorithms
Proficiency in SQL and database operations for data analysis
Familiarity with Generative AI tools and concepts (LLMs, RAG, prompt engineering, vector databases)
Understanding of MLOps practices and model deployment workflows
Knowledge of frameworks such as TensorFlow, PyTorch, or LangChain
Excellent written English skills at C1 level or higher

How to apply
Submit your CV in English and specify your English proficiency level
Complete the qualification process
Join a project after qualification
Carry out assigned tasks
Receive payment upon completion of work

Location: Remote, Wisconsin, United States
Full-time|$75K/yr - $150K/yr|Remote — New Jersey, United States
Role Overview
mlabs is hiring a Web Scraping Specialist to support large-scale data extraction for AI model training. This full-time position is fully remote, but requires at least six hours of workday overlap with the Eastern Standard Time (EST) zone. The team manages distributed crawlers and complex pipelines that process billions of data points, including video, transcripts, and audio.

Compensation
Annual salary ranges from $75,000 to $150,000, depending on experience.

What You Will Do
Develop and optimize code: Build, test, and refine high-performance scraping solutions for a wide range of online sources. Focus on reliability and efficiency.
Oversee data retrieval: Manage complex extraction tasks, including handling pagination and dynamic content such as AJAX-loaded pages.
Ensure data quality: Clean and format collected data to meet strict standards for downstream analysis and processing.
Database management: Organize and store scraped data in appropriate databases, with attention to speed and long-term integrity.
Monitor and maintain systems: Continuously track scraping operations and infrastructure to quickly identify and resolve issues, maintaining steady data flow.

Work Environment
This role suits someone who enjoys technical challenges and prefers working without heavy bureaucracy. Expect a hands-on, collaborative setting focused on delivering results.
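Handling pagination, as this listing describes, usually means following a cursor or "next page" token until it is exhausted. A minimal, hypothetical sketch follows; fetch_page is a stub standing in for a real HTTP or AJAX request, and its cursor scheme is invented for the example.

```python
def fetch_page(cursor=0):
    # Stub for an HTTP call to a paginated endpoint: returns one page of
    # items plus the next cursor, or None when no pages remain.
    data = list(range(23))  # pretend server-side dataset of 23 items
    page = data[cursor:cursor + 10]
    next_cursor = cursor + 10 if cursor + 10 < len(data) else None
    return page, next_cursor

def scrape_all_pages():
    """Follow pagination cursors until exhausted, accumulating all items."""
    items, cursor = [], 0
    while cursor is not None:
        page, cursor = fetch_page(cursor)
        items.extend(page)
    return items

print(len(scrape_all_pages()))  # 23
```

For AJAX-loaded content the same loop applies, except fetch_page would call the site's underlying JSON endpoint (or drive a headless browser) instead of slicing a local list.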
Part-time|$55/hr|Remote — United States
We kindly ask you to submit your CV in English and specify your English proficiency level. At Mindrift, we specialize in connecting skilled professionals with project-based AI opportunities from leading technology firms, focusing on the assessment, testing, and enhancement of AI systems. Please note that participation is project-based and does not constitute permanent employment.

What This Role Entails
While each project presents distinct tasks, contributors may be involved in:
Developing mechanical engineering problems at both graduate and industry levels, based on practical applications.
Assessing AI-generated outputs for accuracy, underlying assumptions, and sound engineering principles.
Utilizing Python (including libraries such as NumPy, SciPy, and Pandas) to validate analytical or numerical results.
Enhancing AI reasoning to ensure alignment with fundamental principles and established engineering norms.
Employing structured scoring criteria to evaluate multi-step problem-solving processes.

Qualifications We Seek
This opportunity is particularly suitable for mechanical engineers with Python experience who are open to part-time, non-permanent projects. Ideal candidates will possess:
A degree in Mechanical Engineering or related disciplines (e.g., Thermodynamics, Fluid Mechanics, Mechanical Design, Computational Mechanics).
A minimum of 3 years of professional experience in mechanical engineering.
Excellent written English skills (C1/C2 level).
Proficiency in Python for numerical validation.
A reliable internet connection.
Professional certifications (e.g., PE, CEng, PMP) and experience with international or applied projects will be considered advantageous.

How the Process Works
Step 1: Apply → Step 2: Pass qualifications → Step 3: Join a project → Step 4: Complete tasks → Step 5: Receive payment.

Project Commitment Expectations
For this project, tasks are estimated to require approximately 10–20 hours per week during active phases, depending on project needs. This estimate is not a guarantee of workload and only applies while the project remains active.

Compensation
Contributions will be compensated at rates up to $55/hour*. Compensation may be structured as a fixed project rate or individual rates, depending on the specific project. Some projects may offer incentive payments.
*Note: Compensation rates vary based on expertise, skill assessments, location, project requirements, and other factors. Highly specialized experts may receive higher rates, while lower rates may apply during onboarding or non-core project phases. Specific payment details will be provided for each project.
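Using Python to validate an analytical engineering result numerically, as this role describes, typically means comparing a closed-form answer against an independent numerical computation. Here is a toy free-fall example using only the standard library (a real task would likely use the NumPy and SciPy routines the listing names); the tolerance and step size are illustrative choices.

```python
import math

def analytical_fall_time(h, g=9.81):
    """Closed-form time for an object to fall height h from rest: t = sqrt(2h/g)."""
    return math.sqrt(2 * h / g)

def numerical_fall_time(h, g=9.81, dt=1e-5):
    """Independently recompute the fall time by stepping the equations of
    motion with semi-implicit Euler integration (velocity updated first)."""
    t, y, v = 0.0, 0.0, 0.0
    while y < h:
        v += g * dt   # accelerate under gravity
        y += v * dt   # advance position with the updated velocity
        t += dt
    return t

t_exact = analytical_fall_time(20.0)
t_num = numerical_fall_time(20.0)
print(abs(t_exact - t_num) < 1e-3)  # True: the two methods agree
```

Agreement within a stated tolerance gives confidence in the analytical derivation; disagreement flags either an algebra error or an inadequate numerical scheme, which is exactly the judgment these validation tasks exercise.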
Contract|$90/hr|Remote — Iowa, United States
Please submit your CV in English and mention your English proficiency level. Mindrift connects experienced professionals with project-based AI assignments for technology companies. The platform focuses on testing, evaluating, and improving AI systems. This is a project-based, non-permanent position. Role overview The Freelance Data Scientist (Python & SQL) - AI Trainer designs and validates data science challenges that mirror real-world analytics problems from fields like telecom, finance, government, e-commerce, and healthcare. Projects are remote and based in Iowa, United States. What you will do Create computational data science challenges based on genuine business scenarios, such as customer analytics, risk assessment, fraud detection, forecasting, optimization, and operational efficiency. Develop problems that require advanced Python programming, using libraries like Pandas, Numpy, Scipy, Sklearn, Statsmodels, Matplotlib, and Seaborn. Ensure challenges are computationally intensive and cannot be solved by hand. Design tasks involving data processing, statistical analysis, feature engineering, predictive modeling, and extracting business insights. Produce deterministic problems with reproducible solutions, using fixed seeds or avoiding randomness. Build comprehensive challenges that cover the entire data science workflow, from data ingestion to deployment considerations. Include big data scenarios that require scalable solutions. Validate solutions using Python, standard data science libraries, and statistical methods. Document each problem clearly, providing realistic business context and verified answers. Requirements Minimum 5 years of hands-on data science experience with proven business impact. Portfolio of completed projects or publications showing real-world problem-solving abilities. Advanced Python skills for data science (including Pandas, Numpy, Scipy, Sklearn, Statsmodels). 
Strong knowledge of statistical analysis and machine learning, including algorithm applications. Proficiency in SQL and database operations for data analysis and manipulation. Experience with Generative AI (LLMs, RAG, prompt engineering, vector databases). Familiarity with MLOps practices and deploying machine learning models. Understanding of frameworks such as TensorFlow, PyTorch, or LangChain. Strong written English skills at C1 level or above. Application process: Apply. Pass qualifications. Join a project. Complete tasks. Receive compensation.
Contract|$90/hr - $90/hr|Remote|Remote — United States
We invite you to submit your resume in English, specifying your level of English proficiency. At Mindrift, we merge innovation with opportunity, harnessing the power of collective intelligence to ethically shape the future of AI. About Us The Mindrift platform serves as a bridge connecting specialists with AI projects from leading tech innovators. Our mission is to unleash the potential of Generative AI by leveraging real-world expertise from around the globe. Role Overview As a Data Science AI Trainer, you will play a pivotal role in advancing GenAI models to tackle specialized questions and enhance complex reasoning skills. You will have the chance to collaborate on a variety of unique projects. Your responsibilities may include: Crafting original computational data science challenges that replicate real-world analytical processes across sectors such as telecom, finance, government, e-commerce, and healthcare. Designing problems that require Python programming solutions (using libraries like pandas, numpy, scipy, sklearn, statsmodels, matplotlib, and seaborn). Ensuring that challenges are computationally demanding and cannot be solved manually within feasible timeframes (days or weeks). Creating scenarios that require sophisticated reasoning in data processing, statistical analysis, feature engineering, predictive modeling, and insight extraction. Formulating deterministic problems with reproducible outcomes, avoiding stochastic elements or using fixed random seeds for exact results. Grounding challenges in actual business scenarios focused on customer analytics, risk assessment, fraud detection, forecasting, optimization, and operational efficiency. Designing comprehensive problems that cover the entire data science pipeline (data ingestion → cleaning → EDA → modeling → validation → deployment considerations). Integrating big data processing requirements that call for scalable computational strategies. Validating solutions in Python using standard data science libraries and statistical techniques. Clearly documenting problem statements within realistic business contexts and providing verified correct answers. Getting Started To join us, simply apply to this posting, meet the qualifications, and you can contribute to exciting projects that align with your skills, all while working on your own schedule. By creating training prompts and refining model responses, you will help shape the future of AI, ensuring that technology serves everyone.
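The "deterministic problems with fixed random seeds" requirement in the listing above can be sketched in a few lines. This is a minimal illustration, not part of the posting: the dataset shape, the churn-style label, and the seed are all hypothetical choices.

```python
import numpy as np

def make_dataset(seed: int):
    """Generate a small synthetic churn-style dataset deterministically.

    The generator is seeded, so the same seed always yields identical
    data; a challenge built on it therefore has one verifiable answer.
    """
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(500, 4))                  # four numeric features
    noise = rng.normal(scale=0.5, size=500)
    # Hypothetical label driven by two features plus noise.
    y = (X[:, 0] + 0.5 * X[:, 1] + noise > 0).astype(int)
    return X, y

# Two independent calls with the same seed reproduce the exact same arrays,
# which is what makes the problem's reference solution checkable.
X1, y1 = make_dataset(42)
X2, y2 = make_dataset(42)
assert np.array_equal(X1, X2) and np.array_equal(y1, y2)
```

Avoiding module-level randomness (e.g. unseeded `np.random` calls) and pinning every stochastic step to an explicit seed is what keeps the "verified correct answer" stable across graders and library versions.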
Are you passionate about Python and eager to share your expertise with a vast community of developers? The Real Python tutorial team is renowned for delivering top-tier Python tutorials online. Our mission is to empower Python developers globally to enhance their skills. With over 3 million monthly visitors, we are excited about our journey thus far, but believe we can reach even greater heights! To elevate our tutorial quality and broaden our content offerings, we are seeking enthusiastic video course instructors who: have a deep love for Python and a desire to assist learners in advancing their skills; understand the significance of clarity and tone in educational video content; aspire to refine their craft and leverage our comprehensive publishing process; and can commit to producing one or more new recordings each month while adhering to deadlines. This position is fully remote. For more details, visit: realpython.com/jobs/video-course-instructor. Ideal candidates will: possess several years of programming experience; be passionate about teaching programming concepts and have experience recording screencasts (the content you create will primarily be derived from existing written tutorials, so your ability to transform written material into engaging short videos is crucial); and have the ability to integrate Real Python into their weekly routine, as this role requires a notable time commitment. Joining the Real Python team comes with numerous benefits. Continuous Learning: engage in ongoing learning and enjoy the process, enhancing your skills as a developer, writer, and communicator while forming valuable connections. Wide Reach: our website attracts significant traffic, over 3 million visitors monthly and consistently growing. We are frequently highlighted in various Python publications and manage one of the largest email newsletters and social media channels in the community. Our YouTube channel boasts over 150,000 subscribers, ensuring that your published video series garners substantial viewership and appreciation. Content Enhancement: upon submission of a video series to Real Python, our dedicated team will work with you to ensure the highest-quality output.
Contract|$80/hr - $80/hr|Remote|Remote — United States
Please submit your CV in English and specify your level of English proficiency. Mindrift is your gateway to project-based AI opportunities, connecting skilled professionals with top-tier tech companies focused on testing, evaluating, and enhancing AI systems. This is a project-based collaboration, not a permanent position. About the Role We are looking for a seasoned Python Engineer with extensive functional testing expertise. The ideal candidate will have robust skills in Linux and Docker, proficiency in reading code across multiple languages (such as C, Rust, and Go) with the aid of LLMs, and the capability to translate migration requirements effectively. Familiarity with tools like Roo Code or Claude Code to streamline iterative development is essential. Key Responsibilities Develop and implement functional black-box tests for sizable codebases across various programming languages. Set up and oversee Docker environments to guarantee fully reproducible builds and test executions across platforms. Monitor code coverage and develop automated scoring criteria aligned with industry benchmarks. Utilize LLMs (such as Roo Code and Claude) to enhance development cycles, automate repetitive tasks, and elevate overall code quality. Requirements 5+ years of software engineering experience, primarily in Python. In-depth knowledge of pytest (including fixtures, session scoping, and timeouts) and experience designing black-box functional tests for CLI tools. Advanced proficiency with Docker (including reproducible Dockerfiles, user contexts, and secure workspaces). Strong skills in Linux and Bash scripting and debugging within containers. Familiarity with modern Python tooling (such as uv, pyproject.toml, and packaging). Ability to interpret and understand multiple programming languages with LLM support (such as C, C++, Rust, or Go). Experience leveraging LLMs (Claude Code, Roo Code, Cursor) to accelerate iterative development and generate test cases. English proficiency at a B2 level or higher. Preferred Qualifications Previous experience with agent evaluation platforms and MCP CLI. Tools and Technologies: Python (pytest, uv, Pillow), Docker, Bash, Git Submodules, C/C++/Rust/Go (reading), Dagger, GitHub Codespaces, LLMs (Claude Code, Roo Code, Cursor), coverage.py, gcov, kcov. Benefits Project-based freelance collaboration via the Mindrift platform (powered by Toloka AI). Fully remote and flexible participation: choose your working hours and commitment (20-30 hours per week). Compensation based on task performance, up to $80/hour*.
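A black-box functional test of a CLI tool, in the sense this listing uses, exercises only the command's observable contract (arguments, stdout, exit code) and never imports its internals. Below is a minimal pytest-style sketch; the one-liner stub CLI stands in for whatever real tool would actually be under test.

```python
import subprocess
import sys

# Stand-in CLI: a tiny one-liner that sums its integer arguments.
# In a real project this would be the packaged command under test.
STUB_CLI = [sys.executable, "-c",
            "import sys; print(sum(int(a) for a in sys.argv[1:]))"]

def run_cli(*args: str) -> subprocess.CompletedProcess:
    """Invoke the tool as a black box: a subprocess, not an import."""
    return subprocess.run([*STUB_CLI, *args],
                          capture_output=True, text=True, timeout=10)

def test_sums_arguments():
    result = run_cli("2", "3", "5")
    assert result.returncode == 0
    assert result.stdout.strip() == "10"

def test_rejects_non_integer_input():
    result = run_cli("two")
    assert result.returncode != 0  # the stub raises ValueError, so a non-zero exit
```

Because the test only observes the process boundary, the same test file keeps working if the tool is later rewritten in another language, which is exactly the migration scenario the role describes.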
Contract|$90/hr - $90/hr|Remote|Remote — Wisconsin, United States
Please submit your CV in English and indicate your English proficiency level. This freelance, project-based contract connects experienced data scientists with AI training assignments for major technology clients. Projects focus on testing, evaluating, and improving AI systems. This is not a permanent employment position. Role overview The AI Training Specialist will design advanced computational data science challenges that mirror real-world analytical workflows. Scenarios span industries such as telecom, finance, government, e-commerce, and healthcare. Each challenge should require deep analytical thinking and practical coding skills. What you will do Create data science problems that require Python programming with libraries like Pandas, Numpy, Scipy, Sklearn, Statsmodels, Matplotlib, and Seaborn. Develop computationally intensive tasks that cannot be solved manually in a few days or weeks. Formulate challenges involving complex data processing, statistical analysis, feature engineering, predictive modeling, and generating insights. Ensure all problems are deterministic, with replicable solutions and fixed random seeds. Base scenarios on business needs such as customer analytics, risk assessment, fraud detection, forecasting, optimization, and operational efficiency. Cover the full data science pipeline, from data ingestion to deployment considerations. Include tasks that require scalable computational approaches and big data processing. Validate solutions using standard Python data science libraries and statistical methods. Document each problem clearly, providing realistic business context and verified answers. Requirements Minimum 5 years of hands-on data science experience with measurable business results. Portfolio of projects or publications demonstrating real-world problem-solving. Advanced Python skills for data science (pandas, numpy, scipy, scikit-learn, statsmodels). 
Strong background in statistical analysis and machine learning, including practical applications and algorithms. Expertise in SQL and database management for data manipulation and analysis. Familiarity with Generative AI tools and concepts (LLMs, RAG, prompt engineering, vector databases). Understanding of MLOps and model deployment workflows. Experience with frameworks such as TensorFlow, PyTorch, and LangChain. Excellent written English at C1 level or above. Application process: Apply. Pass qualification(s). Join a project. Complete tasks. Receive payment. This contract is remote and open to candidates based in Wisconsin, United States.
Contract|$76/hr - $76/hr|Remote|Remote — San Antonio, Texas, United States
Submit your CV in English and indicate your proficiency level. Role overview This freelance, project-based role at Mindrift focuses on evaluating, testing, and improving AI systems for leading technology companies. Assignments connect specialized optical engineering talent with AI-driven projects. The position is not permanent employment, and work is remote from San Antonio, Texas, or elsewhere in the United States. What you will do Create original computational physics problems modeled after real-world research workflows. Develop Python-based solutions using libraries such as Numpy, SciPy, and Sympy. Design computationally intensive problems that cannot be solved by hand in a reasonable time. Formulate advanced problems in mechanics, electromagnetism, thermodynamics, and quantum mechanics. Draw from authentic research challenges or practical physics applications. Validate solutions using Python and standard physics simulation libraries. Document each problem statement and provide verified, correct solutions. Requirements Degree in Physics (theoretical, experimental, or computational) or a related discipline. Proficiency in Python for numerical validation. Experience with MATLAB, R, C, SQL, Numpy, Pandas, SciPy, or similar tools is a plus. At least 2 years of professional experience in applied, research, or teaching roles. Background in numerical simulation techniques. Ability to design problems reflecting real research workflows in physics. Creativity in developing problems across multiple areas of physics. Understanding of physics modeling and approximation methods. Strong written English skills at C1 level or above. How to apply Submit your CV. Meet the qualification criteria. Join a project. Complete assigned tasks. Receive compensation. Project commitment During active project periods, expect to dedicate around 10–20 hours per week. Actual workload may vary and is not guaranteed outside of these phases. 
Compensation Earnings can reach up to $76 per hour, depending on expertise and the rate of task completion. Compensation varies by project scope, complexity, and required proficiency. Other projects on the Mindrift platform may offer different rates based on specific requirements.
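"Validating solutions using Python and standard physics simulation libraries," as the listing above puts it, usually means checking a numerical result against a known closed-form one. A small sketch, assuming SciPy is available; the oscillator parameters here are arbitrary choices, not from the posting.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Damped harmonic oscillator: x'' + 2*zeta*w0*x' + w0**2 * x = 0,
# with x(0) = 1, x'(0) = 0 (parameters chosen arbitrarily).
w0, zeta = 2.0, 0.1

def rhs(t, state):
    x, v = state
    return [v, -2.0 * zeta * w0 * v - w0**2 * x]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0],
                rtol=1e-10, atol=1e-12, dense_output=True)

# Closed-form underdamped solution for the same initial conditions.
wd = w0 * np.sqrt(1.0 - zeta**2)
def exact(t):
    return np.exp(-zeta * w0 * t) * (np.cos(wd * t)
                                     + (zeta * w0 / wd) * np.sin(wd * t))

# The integrator and the analytic answer must agree to tight tolerance.
for t in (1.0, 5.0, 9.5):
    assert abs(sol.sol(t)[0] - exact(t)) < 1e-6
```

Cross-checking a solver against an analytic special case like this is a common way to certify that a computational problem statement has a single, verified correct answer before it is handed to a model.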
Apr 22, 2026