Web scraping is the process of extracting data or information from an online source such as a website, database, or application. Web Scraping Specialists help people collect valuable digital data and quickly find the information they need from websites, mobile apps, and APIs. They typically use scraping tools and automation to collect large amounts of targeted data without any manual work on the client's side.

With web scraping, tasks that would otherwise take a long time can be automated and completed far faster. Our experienced Web Scraping Specialists develop scripts that continuously target structured and unstructured data sources.

Here are some projects our expert Web Scraping Specialists have made real:

  • Web searches and data collection
  • Data transfers between websites
  • Downloading images from URLs and inserting them into a database
  • Automating the sending of emails and SMS messages
  • Collecting website data and exporting it to spreadsheets
  • Creating custom bots to generate or collect online user feedback
  • Collecting contact details, business leads, influencers, or any other specific data
  • Creating dictionaries for official world languages other than English

Web Scraping Specialists are skilled professionals who help businesses optimize processes while collecting the rich, structured data they need for their specific purposes. Our experts speed up the process and return accurate results in less time, so customers can make better decisions more quickly without any manual labour. If you are looking for a talented professional to build a web scraping project for you, you have come to the right place. Here on Freelancer.com you can find talented professionals who will get the job done with top-quality results! Post your project now and see what our Web Scraping professionals can do for you!
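
As a rough illustration of the parsing half of that workflow, here is a minimal Python sketch that pulls link targets and their text out of an HTML page using only the standard library. Real projects usually reach for libraries such as requests and BeautifulSoup instead; this is just a sketch of the idea.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects (href, link text) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self._in_a = False
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._in_a:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_a:
            self.links.append((self._href, "".join(self._text).strip()))
            self._in_a = False

def extract_links(html: str):
    """Return every (href, text) pair found in the given HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

The same callback pattern extends naturally to prices, titles, or any other field a client asks for.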

From 361,585 reviews, clients rate our Web Scraping Specialists 4.9 out of 5 stars.
Hire Web Scraping Specialists


    88 jobs found
    Shopify Bulk Product Upload
    6 days left
    Verified

    I need you to take more than 200 products that currently appear on my suppliers' website (all content is in text and image format) and publish them correctly in my Shopify store, and also remove the ones that are discontinued. Scope of work • Copy the name, description, price, variants, and key attributes of each product. • Download and upload the images in high quality, associating them with the corresponding product. • Create/adjust collections, tags, and metadata to improve navigation and Shopify's internal SEO. • Verify that each product page has inventory, SKU, and shipping options configured. • Maintain visual and formatting consistency across...

    $150 Average bid
    26 bids

    SENIOR AI ENGINEER: ON-PREMISE MULTIMODAL RAG SYSTEM WITH CONTINUOUS LEARNING 1. CONTEXT AND REAL CHALLENGE A project in the wire-drawing and galvanizing sector with more than 40 active production lines. The challenge is not a lack of information, but that critical knowledge is volatile: it resides in the experience of veteran supervisors and operators and is passed on verbally. When a technical solution emerges on the shop floor, it is not documented and is lost to the next shift. We want to develop an AI ecosystem that not only answers questions but also captures and democratizes the technical knowledge that arises day to day. 2. THE SOLUTION: "THE KNOWLEDGE LOOP" ...

    $9 / hr Average bid
    12 bids
    Rapid Black SEO Ranking Surge
    6 days left
    Verified

    The site is brand-new and well-structured, yet still invisible to every search engine. I’m ready to handle manual indexing as soon as our plan is in place, but I want a seasoned SEO partner who can execute an aggressive 3-4-day push that mixes black-hat or grey tactics for an immediate spike. I’m comfortable with high-risk manoeuvres—PBN blasts, automated link wheels, or any other rapid-fire strategy you trust—so long as we complement them with on-page adjustments, fresh meta tags, and stabilising signals to soften potential penalties. I’ll supply quick access to the codebase to update titles, descriptions, schema, and internal link structure the moment you recommend a change. What I need from you: • A clear 72-hour action plan outlining the high-impact black/...

    $418 Average bid
    94 bids

    I am looking for a developer to build a browser-based automation workflow that collects publicly visible information from selected websites and transfers the data into Microsoft Excel in a structured format. The automation should operate through a standard web browser environment and simulate normal user interaction where appropriate. The solution must be stable, maintainable, and designed in a way that complies with the usage policies of the websites involved. Scope of Work Browser automation to navigate specified web pages Capture and structure visible data fields Automatic transfer of data into Excel Scheduled or continuous updates Error handling and recovery if pages fail to load or change Technical Requirements Experience with browser automation tools (e.g., Selenium, Playwright, or s...

    $544 Average bid
    23 bids
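
A sketch of the data-shaping step such a workflow needs, assuming the browser layer (Selenium or Playwright) has already captured the fields: records with inconsistent or missing keys are normalized into a fixed-column CSV that Excel opens directly. The column names here are hypothetical placeholders, not the client's actual field list.

```python
import csv
import io

# Hypothetical fields; the real list would come from the client's spec.
COLUMNS = ["name", "price", "url"]

def rows_to_csv(records):
    """Normalize scraped records (dicts with possibly missing or extra keys)
    into a fixed-column CSV string that Excel can open directly."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        # Missing fields become empty cells; unknown fields are dropped.
        writer.writerow({col: rec.get(col, "") for col in COLUMNS})
    return buf.getvalue()
```

Separating capture from export this way keeps the fragile browser code isolated from the stable output format.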

    I am looking for an experienced developer with strong expertise in Python and web automation to build a smart system for monitoring ticket availability and event updates on the Webook platform. The system should focus on automation, notifications, and usability while following best technical and compliance practices. Scope of Work • Develop a Python-based automation system to monitor events and ticket availability. • Send real-time notifications when: • New events are published • New ticket batches become available • Build a clean and user-friendly dashboard to: • Manage monitoring settings • Control alerts and configurations • Implement structured and scalable automation logic. • Ensure the solution is maintainable and adaptable to f...

    $7631 Average bid
    60 bids
    Scrape 500+ LinkedIn Profiles Data
    6 days left
    Verified

    For an upcoming market research study, I need a fully-automated workflow that gathers and enriches data from well over 500 LinkedIn profiles. The automation should locate the profiles that match criteria I will provide, pull the key public details, then append reliable off-platform contact information so I can reach those professionals directly. Please design the script or low-code sequence with any reliable stack you prefer—Python, Selenium, PhantomBuster, Sales Navigator API, or comparable tools are fine as long as the method is repeatable and respects rate limits. Deliverables • CSV/Excel file containing one row per person with: – Current job title – Company name – Verified email (and phone, when available) • Source code or workflow fi...

    $420 Average bid
    125 bids
    Pepsico Portal Data Extractor.
    6 days left
    Verified

    I need a small utility—either a straightforward C# console app or a PowerShell script—that signs in to my Pepsico web portal with the credentials I will supply, discovers whatever REST or JSON endpoints the site exposes (you can confirm them quickly through Chrome DevTools), pulls every available data set returned once authenticated, and drops the raw response into a clean XLSX workbook. It should not open a Chrome browser and must involve no UI steps; everything should happen in the backend. A few specifics to keep in mind: • Authentication happens through the live portal, so the code should reproduce whatever token, cookie, or header sequence Chrome shows. • No filtering or sorting is required on my side; simply fetch all records the endpoint offers. • The final step w...

    $12 Average bid
    6 bids
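
One hedged way to approach the "dump every authenticated response" step is to flatten the nested JSON the endpoints return into flat rows before writing the workbook (with a spreadsheet library such as openpyxl). The sketch below shows only that flattening step, in Python for consistency with the other examples here; the field names in it are invented for illustration.

```python
def flatten_record(record, prefix=""):
    """Flatten nested JSON objects into a single dict with dotted keys,
    e.g. {"customer": {"id": 1}} -> {"customer.id": 1}, so each record
    maps cleanly onto one spreadsheet row."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten_record(value, name + "."))
        else:
            flat[name] = value
    return flat
```

The dotted keys become the worksheet's column headers, so arbitrarily nested responses still land in a readable table.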

    I have a growing list of company names, and I need a small, reliable Python script that can: Automatically find each company’s career/jobs page where open positions are posted (pages may be built using HTML, JavaScript, or modern front-end frameworks) Navigate through all job listings, including: Pagination (page numbers, next/previous, etc.) “Load more” buttons Infinite scrolling Ability to fetch data from multiple pages (e.g., page 3, 4, or beyond) Apply job filters, especially location-based filtering, so that only job links for specific locations are collected Extract only individual job posting links after filters are applied Visit each job link and scrape complete job details, including: Job title Job description Location Employment type (if available) Department / ...

    $19 Average bid
    23 bids
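
The pagination requirement in a job like this can be isolated from the fetching layer. A minimal sketch, assuming the fetcher (requests, Selenium, or Playwright underneath) is injected as a function that returns a page's items plus the next URL, or None when there is no next page:

```python
def walk_pages(fetch, start_url, max_pages=100):
    """Yield items from every page, following 'next page' links until
    exhausted. `fetch` is injected so the same walking logic works no
    matter what drives the browser or HTTP layer underneath: it takes a
    URL and returns (items, next_url_or_None)."""
    url, pages = start_url, 0
    while url and pages < max_pages:   # max_pages guards against loops
        items, url = fetch(url)
        pages += 1
        for item in items:
            yield item
```

"Load more" buttons and infinite scroll fit the same shape: the fetcher just clicks or scrolls and reports whether more content appeared.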

    My website was recently wiped and restored from a 5-month-old backup. It is currently "broken"—images are missing, theme settings are off, and it’s outdated. I need a talented developer to fix the damage first, then complete a series of specific scraping, data, and checkout upgrades. The template I use boasts numerous features and options such as Elementor Page Builder, Mega Menu, Slider Revolution, Responsive & Mobile Optimized, Ajax Advanced Search, Ajax Add to Cart, Ajax Pagination, Ajax Load More, Infinite Scroll, Color/Image Swatches, Quick View, Wishlist, Compare, easy one-click demo data import, and much more. The template has a 99 desktop performance score and a 98 mobile performance score, and it is optimised for SEO. Phase 1: The "Emergency Room"...

    $102 Average bid
    131 bids

    I need help building my catalogue of automotive spare parts by pairing every OEM number I supply with a clean, high-resolution product photo and basic part information. The scope covers the full range of engine, suspension and brake system components, so you’ll be digging through manufacturer websites and trustworthy e-commerce listings until you find an image that is crisp, watermark-free and matches the exact OEM reference. Once you locate a match, capture the part name exactly as it appears on the source page, copy the product-page link, download the image at its highest available resolution, and note everything in a structured Google Sheet. File naming should mirror the OEM numbers so that images and rows line up perfectly. Deliverables • A Google Sheet containing OEM num...

    $547 Average bid
    19 bids

    Senior Automation Engineer for Traffic Simulation & Referrer Spoofing I am looking for a specialized Automation/Growth Engineer to build a custom Traffic Orchestration System. The goal is to simulate "Viral" traffic spikes to specific URLs to test search engine ranking signals. The Technical Challenge: This is not a simple headless browser task. You must solve the problem of taking high-volume, raw human traffic (via Pop-Under/PPV APIs) and "cleaning" it through a Bridge/Redirect layer to spoof specific social referrers while maintaining session integrity. Core Deliverables: - Referrer Bridge: Build a script/server that receives raw hits and uses a "Double Meta Refresh" or similar logic to spoof (e.g., masking traffic to appear as if it's coming f...

    $24 / hr Average bid
    94 bids

    I need a clean, freshly-sourced list of 5,000–10,000 tech-startup contacts for a one-to-one outreach campaign promoting GrowthAI’s free trial. Every record has to come from information that is already public—think company websites, press pages, blog author bios, event directories, Crunchbase-style listings—never scraped LinkedIn data, leaked dumps, or anything that could be considered private. What the sheet must contain • Company name • Website URL • Contact name (when it’s on the site) • Public business email only (no personal Gmail/Yahoo unless the firm itself lists it as its main contact) • Industry tag • Country Target profile • Primary industry: Tech Startups • Regions: North America, Europe, and ...

    $105 Average bid
    22 bids

    Please Read Carefully Before Applying It does not matter whether you consider yourself a “vibe coder” or a traditional software engineer; we accept both here. What matters is whether you can make this system work reliably at scale. We operate a production scraper that processes 500+ leaderboard sites per hour. All sites we scrape are leaderboards, but no two sites are the same. This is not a basic scraper. What Makes This Scraper Different The leaderboards we scrape vary heavily in structure and behavior: Dynamic buttons, tabs, and switchers JavaScript-rendered content Hybrid navigation (UI interaction + background API calls) Tables, card layouts, podium layouts, or combinations of all three Masked usernames and inconsistent rank formats Different ordering of wager / prize data ...

    $20 Average bid
    25 bids

    I need a small proof-of-concept scraper written in Python that pulls user information from a set of static website pages and exports it into a clean CSV file. The pages load without JavaScript, so a lightweight stack such as requests + BeautifulSoup (or lxml) should be all that’s required; no browser automation is necessary unless you can justify a clear advantage. I will supply the page URLs and highlight the exact fields to capture (name, profile link, location, and any other visible user meta). Your code should handle pagination where applicable, respect polite crawl rates, and be easy for me to adjust if the HTML structure shifts. Deliverables • Well-commented Python script (.py) • Sample CSV containing the extracted records • README with setup steps and a qu...

    $129 Average bid
    120 bids
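
For the "polite crawl rates" requirement mentioned above, one simple pattern is a fetch loop with a fixed delay between requests and bounded retries on transient errors. This is a sketch with the HTTP call injected as a function, not a definitive implementation; a real run would pass a wrapper around requests.get.

```python
import time

def polite_fetch(urls, fetch, delay=1.0, retries=3, sleep=time.sleep):
    """Fetch each URL politely: a fixed delay between pages and a simple
    retry on transient errors. `fetch` takes a URL and returns the page
    body; `sleep` is injectable so the logic is testable without waiting."""
    results = {}
    for i, url in enumerate(urls):
        if i:
            sleep(delay)               # crawl delay between pages
        for attempt in range(1, retries + 1):
            try:
                results[url] = fetch(url)
                break
            except OSError:            # network-level failure
                if attempt == retries:
                    results[url] = None   # give up on this page
                else:
                    sleep(delay)          # back off before retrying
    return results
```

Keeping the delay and retry policy in one place also makes it easy to adjust when a site's tolerance changes.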

    I need a high-end automation tool for a scheduling portal. Requirements: Handle 50+ unique browser profiles. Integrated media stream handling for verification steps. Automation of form filling and fast navigation. Anti-detection measures to avoid bot blocks. Budget: $1,500. Milestone based only.

    $1245 Average bid
    80 bids

    I have an existing Flutter mobile app (Firebase backend + RevenueCat + web scraping). Most functionality is already implemented. I need an experienced Flutter developer to update and refine several features. I believe this should not take more than a few days for an experienced developer. Scope of Work: 1. Sync Local Storage with Firestore (Offline Support) - Keep using local storage for offline mode - Sync shift data with Firebase Firestore - Handle offline → online auto sync - Prevent duplicates (unique shift ID) - Secure Firestore rules (user-only access) - Ensure cross-device sync works properly 2. Fix Email Verification (Spam Issue) - Configure Firebase Auth to use custom domain () - Set up SPF, DKIM, DMARC - Improve email template - Ensure emails land in inbox (Gmail/Outlook) ...

    $149 Average bid
    167 bids
    DRM-Protected Video download
    5 days left
    Verified

    The contractor is commissioned to download DRM-protected videos from an online portal to which the client has legitimate access and usage rights. The videos must be processed as follows: - Download approximately 240 videos from the portal, about 18 hours of video material in total - The videos have an average length of approximately 5 minutes - Original video titles must be preserved - The videos must be organized into folders according to the portal order/structure - All files must be uploaded and stored on Google Drive - The final folder structure on Google Drive must match the structure on the portal

    $128 Average bid
    81 bids
    Daily Car Dealer Data Extraction
    5 days left
    Verified

    I need a reliable specialist who can log into our dealership’s backend every weekday, pull fresh customer information, and feed it straight into our call-tracking platform the same day. The only data I’m after are contact details and service records—nothing else—so the extraction script or manual process can stay laser-focused on those two fields for speed and accuracy. Turnaround is critical. If you can set this up and have the first full export/import cycle running smoothly right away, I’m happy to add a rush bonus on top of the agreed rate. Accuracy must be spot-on and the data has to land in the tracking system without duplicates or formatting hiccups. Deliverables each weekday: • Clean export of new customer contact details and service record...

    $453 Average bid
    206 bids
    Product Data Research & Entry
    5 days left
    Verified

    I have an Excel template ready and a list of items I need populated with reliable, up-to-date product details. For every product on the list, please pull information only from official brand websites, leading eCommerce platforms, and the customer-review sections of those sites. What I expect captured for each item: • Current price and stock status • Key features or technical specifications exactly as stated by the manufacturer or retailer • Average customer rating plus any standout review insights (e.g., “4.5/5 from 230 reviews”) Accuracy matters more than speed, so cross-check conflicting figures before entering them. Add the source URL next to every data point so I can verify quickly. Once the sheet is complete, send it back in the same format...

    $6 / hr Average bid
    60 bids

    I have a spreadsheet with 200 US-based websites and I need the direct phone numbers of each owner. The numbers are not published on the sites themselves, so please pull them through your own account. Alongside every number, include the owner’s LinkedIn profile URL; no other fields are required. What I expect from you • A clean CSV or Google Sheet with three columns: Website, Owner Phone Number, LinkedIn Profile • Accuracy checked against Apollo’s latest data • Completion within 24 hours of project acceptance This is a quick job for an experienced user. I will review the sheet immediately and release payment within 24 hours once the data is verified.

    $11 Average bid
    21 bids

    Every week I compile a fresh list of Danish houses and apartments that may have changed hands in the previous seven days. Your job is to open the specific web link I supply for each property and confirm whether the listing now shows as “Solgt / Sold.” No phone calls to agents, no site visits—everything happens inside the browser, one URL at a time. I need someone who can commit to roughly 50 hours of this work each week on an ongoing basis and who is comfortable updating a shared Google Sheet (or Excel file, if you prefer) as you go. For each address you will: • mark the sale status (Sold / no data) • attach or link a screenshot of the listing as proof That’s it. The task is straightforward but must be done manually—no bots or scraping tools. Wh...

    $7 / hr Average bid
    55 bids

    I'm looking for a comprehensive list of home decor small businesses in Florida. The list should be organized by city and delivered in an Excel spreadsheet format. Requirements: - Contact details - Product offerings - Customer reviews - Categorized by city Ideal skills and experience: - Attention to detail - Experience with data collection - Proficient in spreadsheet software

    $393 Average bid
    166 bids

    I want to replace several manual reporting routines with an end-to-end AI workflow that ingests data from our internal finance databases and live web sources, then produces clear, timely analytics for management. Reporting and analytics are the sole focus—no transaction execution—so the system must excel at pulling, cleaning, and interpreting numbers rather than booking them. We also want to compare legal documents against term sheets and Excel spreadsheets. Data sources • Company databases (SQL, flat files, Excel exports) - all our files are in Dropbox • Extensive web scraping for competitor benchmarks and investment-market signals If you have ideas for safely adding external financial APIs later, let me know, but the two feeds above are mandatory. - Th...

    $22 / hr Average bid
    109 bids

    I need an experienced engineer to analyze and improve a high-demand online booking workflow so bookings can succeed reliably even under extreme traffic. I already have a working Playwright-based browser automation, but during peak demand all sessions currently land on a “high demand / unavailable” state. The goal is to improve the success rate through deeper system understanding, better timing, and smarter flow control. The work involves analyzing booking flow and state transitions, understanding how availability actually appears during high demand (including delayed or staggered releases), improving timing, retries, waiting strategy, and navigation logic, eliminating race conditions and aborted navigations, and designing the automation to be long-running, reactive, and resilient ...

    $2895 Average bid
    49 bids
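
The retry portion of such a flow can be sketched independently of Playwright. Below, both the booking attempt and the transient-error test are injected functions; the only logic shown is exponential backoff with a cap, and all parameter values are illustrative rather than taken from the posting.

```python
import time

def book_with_backoff(attempt_booking, is_transient,
                      max_tries=8, base=0.5, sleep=time.sleep):
    """Retry a booking attempt with capped exponential backoff.
    `attempt_booking` returns a result or raises; `is_transient` decides
    whether an error (e.g. a 'high demand' page) is worth retrying."""
    delay = base
    for attempt in range(1, max_tries + 1):
        try:
            return attempt_booking()
        except Exception as exc:
            # Permanent errors and exhausted budgets propagate immediately.
            if attempt == max_tries or not is_transient(exc):
                raise
            sleep(delay)
            delay = min(delay * 2, 30.0)   # cap the backoff interval
```

A real system would add jitter and react to observed release timing, but isolating the retry policy like this makes those refinements easy to layer in.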

    Hi. I have an Excel list with around 170,000 local businesses in Spain with their emails: "File 1". I have another Excel list with around 65,000 local businesses with their websites but without email addresses: "File 2". Tasks that I need from your side: 1º With the help of AI or another tool, check every website of every business in "File 2" to try to obtain their emails. I will add the emails obtained in task 1º to "File 1" to create a complete file, File 3. 2º With the help of some tool, verify whether the email of every business in File 3 is active. 3º With the help of some tool, verify whether the website of every business in File 3 is active. 4º With the help of AI or another tool, check every activ...

    $239 Average bid
    30 bids
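
For task 1º above (finding emails on each business website), a common first pass is a regex scan of the fetched page text with order-preserving deduplication. A minimal sketch; a real run would also fetch the pages and handle obfuscated addresses like "info [at] example.com":

```python
import re

# A pragmatic pattern for visible addresses, not a full RFC 5322 validator.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(text: str):
    """Return unique email addresses found in page text, lowercased,
    preserving the order of first appearance."""
    seen, out = set(), []
    for match in EMAIL_RE.findall(text):
        email = match.lower()
        if email not in seen:
            seen.add(email)
            out.append(email)
    return out
```

The active-address checks in tasks 2º and 3º are a separate concern (MX lookups and HTTP probes) and are deliberately not covered by this sketch.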

    I am looking for a full-stack developer to build a Web Compliance Audit platform. The goal is to create a solution that scans websites for GDPR, Privacy, and Cookie compliance using AI (OpenAI API) and generates professional audit reports. The front-end will be WordPress (for user management, payments, and UI), while the "brain" will be a Python-based engine running on a Linux VPS. Key Features & Requirements: 1. WordPress Frontend: Landing Page: Professional UI where users can enter a URL for a "Quick Scan." User Dashboard: Where clients can see their scan history and download PDF reports. Payment Integration: WooCommerce or Paid Member Subscriptions (Stripe/PayPal) for one-time reports or monthly monitoring plans. API Integration: A custom function to send the U...

    $526 Average bid
    159 bids

    I need a developer to collect data from multiple public websites and deliver it in a clean, structured format. This is for legitimate data extraction from publicly available pages. I will share the target URLs and exact data fields with shortlisted candidates. Scope of work Scrape data from multiple public websites (details shared after shortlisting) Extract specific fields consistently and handle pagination/filtering where needed Normalize/clean the data (remove duplicates, consistent formatting) Export results to CSV/Excel/JSON (format to be confirmed) Provide a repeatable solution (script or small app) that I can run on demand Basic documentation: how to run it, how to adjust settings, where outputs go Quality requirements Reliable scraping with error handling and retries Resp...

    $141 Average bid
    179 bids
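
The normalize-and-deduplicate requirement in a brief like this often reduces to a small pure function. A sketch, with the key fields used here ("name", "url") as stand-ins for whatever fields the client actually specifies:

```python
def normalize(record, key_fields):
    """Trim whitespace on string values and build a case-insensitive
    dedup key from the chosen fields."""
    rec = {k: (v.strip() if isinstance(v, str) else v) for k, v in record.items()}
    key = tuple(str(rec.get(k, "")).lower() for k in key_fields)
    return rec, key

def dedupe(records, key_fields=("name", "url")):
    """Drop records whose key fields match an earlier record,
    keeping the first occurrence."""
    seen, out = set(), []
    for record in records:
        rec, key = normalize(record, key_fields)
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out
```

Running this before the CSV/Excel/JSON export step keeps the output format logic free of cleaning concerns.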
    Betfair API Data Retrieval (AU/UK)
    5 days left
    Verified

    Hi, ******Need someone based ONLY in AUSTRALIA or the UK for my Betfair API development. I’m expanding an existing trading suite and now need clean, reliable access to Betfair’s Market Data API. Because of licensing considerations, I can only work with developers physically based in Australia or the United Kingdom. Scope of work – Connect to the Betfair Exchange API (non-interactive login is already in place on my side) – Retrieve real-time odds, price ladders and market status updates for Horse Racing, Football, Tennis and Cricket markets only – Structure the responses so they can be consumed by my Python back-end (JSON is fine) – Handle throttling limits and session renewal gracefully Deliverables 1. Well-commented source co...

    $2096 Average bid
    41 bids

    I need a reliable legal researcher who can provide legal commentary on our old company filings from the U.S. SEC’s EDGAR database for publicly traded companies. The immediate task is to locate, download, and organise the relevant documents in a structured way that lets me review them quickly—filing date, form type, company name, CIK, and a direct link back to the source must all be captured. Acceptance: the commentary must be capable of proving whether to re-establish this old application. Turnaround is flexible, but please outline how long you need per 100 filings so I can plan the next milestones.

    $19 Average bid
    19 bids

    I’m looking for a large data dump of UK-focused domains in the property, home improvement/construction, landlord, and property service niches. The aim is to find unloved, good-quality SEO websites with affiliate or lead-gen potential that I can purchase and grow. See attached for the types of websites/niches that would be suitable. This is a quantity-first task; no manual website review or qualitative judgements required - I will do the outreach. Use Ahrefs or Semrush only to export domains that rank organically for broad UK property topics such as: landlord guides/compliance, property finance (BTL, bridging), property investment & deal sourcing, refurb/renovation planning, EPC & energy efficiency, planning permission & permitted development, HMO licensing, serviced accommodat...

    $146 Average bid
    42 bids
    Scrape Zillow Agent Data
    4 days left
    Verified

    I have an urgent need for a clean, well-structured dataset containing the listing agent’s first name, last name, mailing address, and phone number for well over 500 active Zillow listings. Speed is critical, but accuracy matters just as much; the final file should be ready for immediate import into my CRM. You are free to use whichever stack you prefer—Python with BeautifulSoup or Scrapy, Selenium, residential proxies, even the unofficial Zillow API—so long as rate-limits are respected and the data is complete. I don’t need property details or price history; the focus is strictly on the agent contact fields. Deliverables • CSV or XLSX with a separate column for each required field • A short read-me explaining the script or method so I can rerun it la...

    $25 Average bid
    68 bids
    Apify Scraping Actor Development
    4 days left
    Verified

    I have a set of websites whose data I need to capture automatically, and I want the whole process built as a reusable Apify actor. I will share the exact URLs, the fields to be collected, and the desired output format once we agree to proceed, but the common theme is structured extraction (think product specs, profile info, or similar). Here’s the outcome I’m expecting: • A clean Node.js actor that runs on the Apify platform, uses the latest Apify SDK, and follows best practices for request queuing, proxy rotation, and error handling. • Configurable input schema so I can plug in new target URLs or tweak search parameters without touching the code. • Output saved to an Apify dataset (JSON/CSV) and pushed to my Google Drive via webhook on each successful r...

    $83 Average bid
    22 bids

    Looking for a technical expert to build a custom workflow on n8n. Required Stack: • n8n (advanced logic & error handling) • OpenAI API integration • HTTP Request / Webhooks / API connections • Database management (Airtable or similar) The Project: Build a scalable infrastructure to connect lead data with AI-driven personalization and automated outreach. The system needs to be modular so it can be replicated easily. Detailed technical requirements and the specific workflow logic will be provided during the interview. To apply: 1. Send a screenshot or video of your most complex n8n workflow. 2. Tell me which hosting you recommend for n8n to ensure 99% uptime. 3. Quote me your price for a 30-day build & test period.

    $566 Average bid
    152 bids
    Website Location Data Scraping
    4 days left
    Verified

    I need a clean pull of every location listed on For each branch please capture: country, state, complete address, service type, phone number, and email address. The final deliverable is a single Microsoft Excel workbook containing one sheet only. All columns should be clearly labelled and the range converted to an official Excel Table so I can apply native filters instantly. No additional filtering is required on your side; just be sure the table structure supports easy filtering by any column once I open the file. Accuracy matters more than speed—every location on the site has to be included and the contact details must match what is shown online. When you hand over the file I will spot-check a sample of entries against the live site to confirm completeness and correctness bef...

    $120 Average bid
    203 bids
    Retail Express SKU Migration
    4 días left
    Verificado

    We are mid-migration to Retail Express and must have every product from 33 suppliers—about 4,000 SKUs—ready for a single, clean upload. I already have Retail Express’ import template, plus partial Excel sheets from each supplier, yet gaps remain. Your task is to take the existing files, reconcile them against the template, and fill in any blanks you discover for the three critical fields: • Missing SKUs • Missing Barcodes • Missing Descriptions Where information is absent, you’ll need to source it directly from the supplier catalogues or websites, confirm accuracy, and then complete the master spreadsheet. Once validated, the final file must load into Retail Express without errors or duplicates. Success is measured by a fully populated impo...

    $15 / hr Average bid
    58 bids

    I need Octoparse templates built for roughly fifty manufacturer sites in the flooring & renovation niche. Each template must crawl the full product catalog and push clean, structured data into my Supabase database. The extraction scope includes: high-quality images, complete text descriptions and feature lists, links to warranty documents or other disclosures, detailed dimensions and specifications, style and color information, collection / color-family, and every SKU shown on the page. Price data is nice-to-have when present, but its absence should not break the run. Many product pages list matching accessories (trim, transitions, quarter-round, etc.). Your logic must identify those by shared style and color so they enter the database as related items. Typical sites you will start ...

    $26 Average bid
    50 bids
    Apify Web Data Scraper
    4 days left
    Verified

    I need an Apify actor that crawls a single website and delivers two things for every page: the text blocks I specify and the page's image URLs. You can use Puppeteer, Playwright or any other Apify helper library that keeps the run stable and fast. Here’s how I see the workflow: • I’ll share the target domain, URL pattern, and the exact text blocks I care about. • You create or fork an Apify actor in JavaScript/TypeScript, configure the request queues, handle pagination where needed, and store results in a dataset. • The final dataset should export cleanly to JSON and CSV, and the image URLs should be downloadable in bulk (a simple link list or an Apify key-value export is fine). • When the crawl completes, I want a brief README so I can rerun it myself later without touching the code. Acceptance crit...

    $10 Average bid
    12 bids
    Shopify B2B Data Migration
    3 days left
    Verified

    We’re consolidating a Shopify B2B store into a blended B2B/B2C store. This is a data migration project, not theme or app development. Scope: • Export Companies, Company Locations, and Customers via Matrixify • Clean and normalize large CSV datasets (dedupe emails, fix relationships) • Rebuild import-ready files that match Shopify B2B requirements • Import in the correct order (Companies → Locations → Customers) • Validate that customer ↔ company/location links are intact Requirements: • Hands-on experience with Shopify B2B (Companies + Locations) • Matrixify migrations involving Companies, not just products • Comfortable handling large datasets (automation preferred)
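The clean-and-normalize step in a migration like this usually comes down to canonicalizing the dedupe key and checking referential integrity before import. A minimal Python sketch, assuming the Matrixify exports are already loaded as lists of dicts (the column names here are placeholders, not the exact Matrixify headers):

```python
# Sketch of the dedupe/validation pass for a Companies + Customers migration.
# Data shapes and column names are illustrative placeholders.

def normalize_email(email):
    """Lowercase and trim so 'A@B.com ' and 'a@b.com' dedupe together."""
    return email.strip().lower()

def dedupe_customers(customers):
    """Keep the first row per normalized email; drop later duplicates."""
    seen, clean = set(), []
    for row in customers:
        key = normalize_email(row["Email"])
        if key not in seen:
            seen.add(key)
            clean.append({**row, "Email": key})
    return clean

def broken_links(customers, companies):
    """Return customers referencing a company missing from the import set."""
    known = {c["Company Name"] for c in companies}
    return [r for r in customers if r["Company"] not in known]

companies = [{"Company Name": "Acme Ltd"}]
customers = [
    {"Email": "Jane@Acme.com ", "Company": "Acme Ltd"},
    {"Email": "jane@acme.com", "Company": "Acme Ltd"},   # duplicate after normalization
    {"Email": "bob@other.com", "Company": "Ghost Co"},   # orphaned company link
]

clean = dedupe_customers(customers)
orphans = broken_links(clean, companies)
```

Running the orphan check before the Customers import is what enforces the Companies → Locations → Customers ordering the brief calls for: anything in `orphans` would fail or mis-link on import.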

    $130 Average bid
    89 bids

    I’m looking for a dependable script or lightweight application that can collect sports betting odds from a web-based platform I have access to and export them into a structured Excel (XLSX) file. The initial focus will be on outright winner markets for: • Golf • Cycling • Baseball The Excel output should remain clean and well-organized, grouping rows by sport, league, and event, so the data can be easily filtered and analyzed later. Update Frequency: • Data refresh every 5 minutes • Real-time or in-play updates are not required • Accuracy and stability are more important than speed Technical Expectations: • Ability to handle dynamic web content • Robust approach that runs consistently over time • Technology stack is flexible (Python, browser automation, or other suitable solutions) Clear...
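The sport → league → event grouping the brief asks for is straightforward once the scraped rows are in hand. A stdlib-only sketch of just that step, with made-up odds rows (the actual scrape, the XLSX writer, and the 5-minute refresh loop are out of scope here):

```python
# Grouping sketch: sort then group so each (sport, league, event) section
# stays contiguous when written out to the spreadsheet. Sample rows are invented.
from itertools import groupby

def group_rows(rows):
    """Return {(sport, league, event): [rows...]} with sections contiguous."""
    key = lambda r: (r["sport"], r["league"], r["event"])
    # groupby only merges adjacent equal keys, so sorting first is required
    rows = sorted(rows, key=key)
    return {k: list(g) for k, g in groupby(rows, key=key)}

rows = [
    {"sport": "Golf", "league": "PGA", "event": "Masters", "runner": "A", "odds": 9.0},
    {"sport": "Baseball", "league": "MLB", "event": "World Series", "runner": "B", "odds": 4.5},
    {"sport": "Golf", "league": "PGA", "event": "Masters", "runner": "C", "odds": 21.0},
]
grouped = group_rows(rows)
```

In a real run this would sit inside a scheduler (e.g. a loop with `time.sleep(300)`, or cron) and each group would be written as a block of rows to the XLSX file.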

    $144 Average bid
    88 bids

    I already have valid login credentials to my coaching app, but the platform doesn’t give a built-in option to save lessons offline. I need every video class I’m enrolled in pulled down from both the Android app and the web version, then handed back to me neatly organised (Course → Module → Lesson, MP4 or the source format). Use whatever reliable method you prefer—Python scripts, yt-dl, network-capture tools, or similar—to grab the streams, keep the original resolution, and avoid quality loss. A light DRM layer may be present, so prior experience with HLS/DASH stream extraction will help. Deliverables • Full set of video files, correctly named and structured • Short guide or reusable script so I can repeat the download when new classes appea...

    $51 Average bid
    9 bids
    Retrieve Pre-Scraped Data Sets
    3 days left
    Verified

    Need previously scraped TruePeopleSearch data. Only bid if you already have the dataset.

    $118 Average bid
    28 bids
    LinkedIn Navigator Access Needed
    3 days left
    Verified

    I already have a scenario set up and ready to run—what I’m missing is an active, fully-functional LinkedIn Sales Navigator seat that I can connect my Vayne module to. If you currently hold such an account and can grant API or session-cookie access (whichever method you normally use for integrations), let’s work together. Once connected, I’ll handle the filtering logic inside , but I need your account to serve as the data source and, ideally, your guidance to be sure the pull limits stay within LinkedIn’s acceptable use. When you reply, focus on your experience—how you’ve successfully linked Sales Navigator with automation platforms before, any anti-scrape precautions you follow, and typical daily search volumes you’ve handled without issu...

    $122 Average bid
    57 bids
    iGaming Website Database Creation
    3 days left
    Verified

    We are looking for a freelancer to build a database of 1,000 active iGaming websites (casino / sportsbook / betting operators). Scope of Work: You will identify and collect 1,000 unique, live iGaming operator websites and enter them into our provided form/database. Each record will have several different data points, including but not limited to: - Website URL - Contact emails - Company / Brand Name - Country / Jurisdiction (where the operator is based or licensed) - Website Languages (select all that apply) - Availability of Providers Important Guidelines - Only live, operational websites (no affiliates, review sites, or news portals) - No duplicates (we control this in the record entry form) - We prefer speed and volume over excessive research - Do not spend significant time trying to ...

    $175 - $525
    Sealed
    19 bids

    I want a Telegram bot that can reliably extract the client’s phone number, the property owner’s number, and the unit number from listings on Bayut, Propertyfinder, and similar real-estate sites—even though these fields aren’t shown in the public UI and no official API is used. Here’s the flow I’m after: I drop a listing URL (or several) into the chat, the bot quietly scrapes the page, jumps through whatever loophole is needed to reveal the hidden contact and unit details, then replies with a single, structured template that looks something like: Property: <Title> Unit No.: <unit_number> Client: <client_phone> Owner: <owner_phone> Source: <URL> Key points • No reliance on the Bayut or Propertyfinder APIs&m...

    $127 Average bid
    43 bids

    I have an unstructured text file that needs to end up as clean, well-organized rows and columns in Google Sheets. The data will come strictly from this file, not manual keying or an API, so the first step is parsing whatever patterns you can detect—line breaks, repeating markers, dates, or any other cues that let you segment the content logically. Once parsed, I want each logical field mapped to its own column in a Google Sheet I’ll share with you. If a repeatable rule can be established, please codify it in either Google Apps Script or a small Python script so I can reuse the process whenever a new file arrives. Deliverables: • The populated Google Sheet, fully checked for alignment and obvious anomalies • The script or documented method you used to transform th...
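A brief like this hinges on finding a repeatable segmentation rule. A minimal Python sketch, assuming records are separated by blank lines and fields look like "Label: value"; the real file's markers may well differ, in which case only the two regexes change:

```python
# Parsing sketch: split the raw text into records, then each record into
# labelled fields. The separator and field patterns are assumptions.
import re

def parse_records(text):
    records = []
    for block in re.split(r"\n\s*\n", text.strip()):   # blank line = record break
        row = {}
        for line in block.splitlines():
            m = re.match(r"\s*([^:]+):\s*(.+)", line)  # "Label: value" lines
            if m:
                row[m.group(1).strip()] = m.group(2).strip()
        if row:
            records.append(row)
    return records

sample = """Name: Alice
Date: 2024-01-05

Name: Bob
Date: 2024-02-10"""
rows = parse_records(sample)
```

Once each record is a dict, every key maps naturally to a Google Sheet column, and the same function can be wrapped in a small script (or ported to Apps Script) for reuse on future files, as the deliverables require.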

    $10 / hr Average bid
    45 bids

    I need to scrape a website with public content and export it to an organized Excel file. It's approximately 700k pages with specific data. Some of the data I need is missing, but I have an Excel file with this data that should be used to autocomplete the missing information. In summary: 1. Scrape the website to an Excel file (I will give an example) 2. Autocomplete the missing information based on my Excel file. After this project, I will need another one, which will require finding contact information with some precision, perhaps using AI or some specific logic, but that will be a topic for later. Thank you all
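Step 2 of this brief (filling gaps from the provided Excel file) is a keyed fill-only merge. A sketch assuming both sources are already loaded as Python structures keyed on a shared ID column; the key and field names are placeholders:

```python
# Fill-only merge sketch: blanks in the scraped rows are completed from a
# reference lookup, but scraped values are never overwritten.
# "id", "name", "brand" are illustrative column names.

def autocomplete(scraped_rows, reference, key="id"):
    filled = []
    for row in scraped_rows:
        ref = reference.get(row[key], {})
        merged = {**row}
        for field, value in ref.items():
            if not merged.get(field):   # only fill empty/missing fields
                merged[field] = value
        filled.append(merged)
    return filled

scraped = [{"id": "p1", "name": "Widget", "brand": ""}]
reference = {"p1": {"brand": "Acme", "name": "SHOULD NOT OVERWRITE"}}
result = autocomplete(scraped, reference)
```

At ~700k pages, building the reference lookup as a dict once (rather than scanning the Excel sheet per row) keeps this step linear in the number of scraped rows.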

    $189 Average bid
    50 entries

    I’m interested in buying an existing dataset you’ve already scraped from TruePeopleSearch. What I specifically need is the contact information section—phone numbers for sure, and any email addresses or other direct lines you may have captured at the same time. I don’t need the address history or relatives/associates fields, so feel free to leave those out if they’re present. This is not a fresh scrape request; I only want data that you currently have on hand. Anything compiled within roughly the last twelve months is perfect, but older archives could still be useful if they’re large and well-structured. I’m flexible on format: CSV, Excel, or JSON all work. Just let me know which one your files are in, when the data was pulled, and roughly how man...

    $117 Average bid
    26 bids

    I need help compiling a clean, well-structured spreadsheet of information pulled directly from publicly available websites and social media platforms. Scope • Locate the pages I specify (or similar ones you suggest) and extract relevant text content—company descriptions, service lists, post captions, etc. • Capture all visible contact information on those same pages, including email addresses, phone numbers, and any listed location details. Requirements • Manual collection only; no automated scraping tools that violate terms of service. • Record each source URL alongside the data so I can verify accuracy. • Maintain consistent formatting in Excel or Google Sheets—one row per entity, separate columns for each data point. • Double-c...

    $7 / hr Average bid
    48 bids
    EDGAR Company Filings Research
    3 days left
    Verified

    I need a reliable researcher who can pull company filings from the U.S. SEC’s EDGAR database for publicly traded companies. The immediate task is to locate, download, and organise the relevant documents in a structured way that lets me review them quickly—filing date, form type, company name, CIK, and a direct link back to the source must all be captured. Scope • Search the EDGAR system and collect every filing that matches the criteria I’ll send (ticker list or CIKs). • Save each document in its original format (HTML, PDF, or XBRL when available) and label the files clearly. • Build a spreadsheet that lists each filing with the key metadata mentioned above. Acceptance The work is complete when I receive a zipped folder containing the documents an...
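Locating filings per CIK is mostly a matter of building the right EDGAR URLs. A small helper sketch using the SEC's public URL conventions (the submissions endpoint keys on a ten-digit zero-padded CIK, and accession numbers drop their dashes in archive paths); note that actual requests to SEC endpoints should carry a descriptive User-Agent header per SEC fair-access guidance, and the download step itself is omitted here:

```python
# URL-building helpers for EDGAR research. Only the URL formats are used;
# no network calls are made in this sketch.

def submissions_url(cik):
    """JSON metadata for a company's filing history (filing date, form
    type, accession numbers) lives at a zero-padded-CIK endpoint."""
    return f"https://data.sec.gov/submissions/CIK{int(cik):010d}.json"

def filing_index_url(cik, accession_no):
    """Index page for one filing: un-padded CIK plus the accession number
    with its dashes removed."""
    acc = accession_no.replace("-", "")
    return f"https://www.sec.gov/Archives/edgar/data/{int(cik)}/{acc}/"

apple_meta = submissions_url("320193")
apple_filing = filing_index_url(320193, "0000320193-24-000123")
```

The submissions JSON already contains the filing date, form type, company name, and CIK the brief wants captured, so the tracking spreadsheet can be populated from it directly, with `filing_index_url` supplying the direct link back to the source documents.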

    $24 Average bid
    42 bids

    I need a detail-oriented data entry operator who can take raw information and input it manually with flawless accuracy. The task is straightforward but demands precision: transfer data from source documents into the designated spreadsheets and cross-check entries to avoid any discrepancies. You’ll be working with standard tools—Excel or Google Sheets—so please be comfortable navigating formulas for quick validation and spotting inconsistencies. Speed is welcome, yet accuracy is non-negotiable; I will run random checks to verify every batch before sign-off. Deliverables • A fully populated, error-free spreadsheet formatted exactly as the template provided • A brief changelog noting any unclear or missing source information you encountered I’m r...

    $10 / hr Average bid
    111 bids
