Web scraping is the process of extracting data or information from an online source such as a website, database, or application. Web Scraping Specialists have the skills to help people collect valuable digital data and quickly find the useful information they need from websites, mobile apps, and APIs. These experts typically use web scraping tools and advanced technologies to collect large amounts of targeted data without any manual work on the client's side.
With web scraping, tasks that otherwise may require a lot of time can be automated and done faster. Our experienced Web Scraping Specialists use their expertise to develop scripts that continuously target structured and unstructured data sources.
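As a minimal sketch of what such a script does, the example below uses only the Python standard library's `html.parser` to pull a page title and every link out of raw HTML; the page content and class names here are invented for illustration, and a production scraper would typically use BeautifulSoup or Scrapy instead.

```python
from html.parser import HTMLParser

class TitleAndLinkParser(HTMLParser):
    """Collects the page <title> and every href -- a minimal structured-data pass."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page source standing in for a fetched document.
html = ("<html><head><title>Demo Shop</title></head>"
        "<body><a href='/p/1'>A</a><a href='/p/2'>B</a></body></html>")
parser = TitleAndLinkParser()
parser.feed(html)
print(parser.title, parser.links)
```

The same callback pattern scales to any field you can recognize from tags and attributes, which is what makes these scripts reusable across structured sources.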
Here are some projects our expert Web Scraping Specialists have made real:
Web Scraping Specialists are skilled professionals who know how to help businesses optimize processes while collecting the rich structured data they need for their specific purposes. Our experts speed up the process and return accurate results in less time, so the customer can make better decisions more quickly without any manual labour. If you are looking for a talented professional to build a web scraping project for you, you have come to the right place. Here on Freelancer.com you can find talented professionals who will get the job done with top-quality results! Post your project now and see what our Web Scraping professionals can do for you!
From 369,869 reviews, clients rate our Web Scraping Specialists 4.9 out of 5 stars.
Data Sniper B2B (Dominican Republic) - Hunting for Open Vacancies and Phone Numbers. THE EXACT MISSION (WHAT YOU WILL LOOK FOR): I am not after static lists. Your sole mission is to track down companies in the Dominican Republic that have OPEN POSITIONS / VACANCIES (preferably middle-management and managerial roles that have sat stalled for days). An open vacancy is the only valid trigger for capturing a prospect. THE DAILY DELIVERABLE (80 Prospects): You will deliver the data in our 12-field matrix. ATTENTION: THE PHONE NUMBER IS KING. Email is secondary. If you hand me a prospect with a perfect email but no valid phone number for contacting the...
R Expert Needed: Advanced Signal Processing for Maternal, Fetal & Neonatal Health We are seeking a senior R scientist with deep experience in audio / seismic data, wavelet methods, and Bayesian modeling to help demonstrate that R is just as good as Python at surfacing early physiological health markers. This role is for someone comfortable working below the noise floor—where the signal of interest is subtle, nonstationary, and embedded in the mechanics of living systems. What you’ll work on 1. Signal conditioning & noise removal * Design and evaluate signal conditioning pipelines for low-amplitude physiological data * Implement filtering strategies to remove mains contamination (50/60 Hz and harmonics), including: o Notch / comb filters o Adaptive and time-varying filt...
I have roughly one thousand JPEG images named. Each filename matches a numbered row already laid out in my Excel/Numbers workbook, and every image needs to land in the specific cell that carries its number. I also need every photo resized so it fits cleanly inside its cell without overflowing. I will pass you the folder of JPEGs plus the workbook. You return the same file, now populated with the images in their correct positions and neatly scaled. Accuracy is critical here, so please be comfortable with bulk image insertion and cell-sized resizing in either Excel or Apple Numbers—whichever you prefer as long as the final file opens perfectly on both Mac and Windows. Deliverables: • Updated workbook with all 1,000 images placed in their matching cells • Images resized to...
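The actual insertion would normally be done with a library such as openpyxl, but the two pieces of logic the job turns on can be sketched independently: mapping a numbered filename to its row, and computing the uniform scale that fits an image inside a cell. The filename pattern and cell dimensions below are hypothetical.

```python
import re

def target_row(filename):
    """Extract the numeric part of a filename like 'photo_0042.jpg' -> row 42."""
    match = re.search(r"(\d+)", filename)
    return int(match.group(1)) if match else None

def fit_scale(img_w, img_h, cell_w, cell_h):
    """Largest uniform scale factor that keeps the image inside the cell."""
    return min(cell_w / img_w, cell_h / img_h)

print(target_row("photo_0042.jpg"))   # 42
w, h = 800, 600
s = fit_scale(w, h, 200, 120)
print(round(w * s), round(h * s))     # 160 120
```

Using `min` of the two ratios guarantees neither dimension overflows the cell, which is exactly the "resized so it fits cleanly" requirement.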
I need a rock-solid n8n workflow that, whenever I trigger it, navigates through selected e-commerce sites and public business directories, captures every piece of business information that is publicly available, and stores it in a clean, query-ready format. The data I care about includes the business name, category or type, “about” text, founders’ names, any additional corporate details the site reveals, plus all images properly downloaded and tagged. I will be running various data-analysis models on the output, so accuracy, consistency, and tidy structuring are non-negotiable. The flow must: • Accept a list of target URLs and run on demand (no fixed schedule). • Respect site rate limits while still remaining efficient. • Handle pagination, lazy-...
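The pagination-plus-deduplication loop at the heart of a flow like this can be sketched in a few lines; here the HTTP call is replaced by a stand-in function returning invented records, and the delay value is illustrative rather than a recommendation for any particular site.

```python
import time

def fetch_page(url, page):
    """Stand-in for the real HTTP call (hypothetical data for illustration)."""
    catalog = {1: [{"name": "Acme Ltd"}, {"name": "Beta GmbH"}],
               2: [{"name": "Beta GmbH"}, {"name": "Gamma Inc"}],
               3: []}
    return catalog.get(page, [])

def crawl(url, delay=0.01):
    """Walk pages until one comes back empty, de-duplicating by business name."""
    seen, results, page = set(), [], 1
    while True:
        batch = fetch_page(url, page)
        if not batch:
            break
        for record in batch:
            if record["name"] not in seen:
                seen.add(record["name"])
                results.append(record)
        time.sleep(delay)   # crude rate limiting between page requests
        page += 1
    return results

names = [r["name"] for r in crawl("https://example.com/directory")]
print(names)
```

An n8n implementation would express the same loop as a paginated HTTP Request node feeding a dedupe step, but the control flow is identical.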
I have a detailed set of internal criteria and need an experienced researcher to apply it to a large pool of informational websites. The goal is simple: surface only the sites that truly fit my content-curation needs.
I will send you a single Excel workbook that already contains a clean template and the full list of 500 company names. Your task is to visit each company’s official website and capture three fields—contact name, email address, and phone number—then enter them in the exact columns provided. Please work strictly with information that is publicly available on the company site; do not pull data from LinkedIn or any third-party source. Consistency matters, so enter phone numbers in international format where it is shown and keep the spelling and punctuation of names exactly as they appear online. Deliverable • The same Excel file returned, fully populated and double-checked for accuracy and spelling. I will review the sheet for completeness (500 rows, three data poin...
I need product details captured from a set of websites and delivered in a clean, structured format I can load straight into Excel or a database. The job involves visiting the URLs I provide, pulling every product’s name, price, SKU, description, and any other specifications that appear on the page, then handing everything back to me in a .csv or similar flat file. A lightweight script—Python with BeautifulSoup, Scrapy, or a comparable tool—would be ideal so I can rerun the extraction whenever the catalogue changes, but I’m happy to discuss whether you deliver only the compiled dataset or include the code as well. Please keep the workflow ethical (no site overload, respect rate limits where applicable) and ensure the final data set is complete, deduplicated, and readable wi...
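The extract-then-flatten shape of such a script can be sketched with the standard library alone; the HTML snippet and field layout below are invented, and a real job would use BeautifulSoup or Scrapy selectors rather than a regular expression.

```python
import csv, io, re

# Hypothetical page source with a repeated product block.
html = """
<div class='product'><h2>Widget A</h2><span class='price'>19.99</span><span class='sku'>WA-1</span></div>
<div class='product'><h2>Widget B</h2><span class='price'>5.50</span><span class='sku'>WB-2</span></div>
"""

# Pull name/price/SKU per product block.
pattern = re.compile(
    r"<h2>(?P<name>[^<]+)</h2><span class='price'>(?P<price>[^<]+)</span>"
    r"<span class='sku'>(?P<sku>[^<]+)</span>")
rows = [m.groupdict() for m in pattern.finditer(html)]

# Flatten to CSV, the "load straight into Excel" deliverable.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "price", "sku"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue().strip())
```

Keeping extraction (the pattern) separate from serialization (the writer) is what makes the script cheap to rerun when the catalogue changes.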
Hello! :) I'm seeking a versatile virtual assistant to join my team for 15+ hours per week (minimum). The role involves a mix of marketing and admin-related support tasks (content, landing pages, research etc). The ideal candidate should be skilled in traffic generation tasks: SEO/GEO, Reddit, blog content, and social media management as the key task is to help setup offer pages (landing pages) and drive traffic to them (organic traffic). Your success in this role will be determined by your ability to generate traffic for the projects you're assigned to over a 3-month probation period. After which your hourly rate will increase and your contract will be extended for a further 9-12-month contract. If you perform exceptionally (above traffic targets), you may be offered a reve...
I need an experienced developer to build a fully automated affiliate marketing intelligence system that discovers trending products, generates platform-optimized content, and publishes across 5 social media platforms with embedded affiliate links — all running on autopilot via N8N workflows. This is a complete end-to-end system, not a simple automation. Read the full spec before bidding. System Modules Required Module 1: Trend Discovery Engine Scrape Google Trends via pytrends for rising product keywords (runs every 6 hours) Cross-reference with Amazon Best Sellers to validate commercial intent Score and rank products by trend velocity, search volume, and commission potential Store ranked product queue in PostgreSQL database Module 2: Affiliate Data Collector Connect to Amazon Pr...
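The "score and rank" step in Module 1 reduces to a weighted sum over the trend metrics; the weights and product data below are purely illustrative, not values from the brief.

```python
def score(product, w_velocity=0.5, w_volume=0.3, w_commission=0.2):
    """Weighted score over trend velocity, search volume, and commission potential."""
    return (w_velocity * product["velocity"]
            + w_volume * product["volume"]
            + w_commission * product["commission"])

queue = [
    {"name": "air fryer", "velocity": 0.9, "volume": 0.6, "commission": 0.3},
    {"name": "led strip", "velocity": 0.4, "volume": 0.9, "commission": 0.8},
    {"name": "desk mat",  "velocity": 0.2, "volume": 0.3, "commission": 0.9},
]
ranked = sorted(queue, key=score, reverse=True)
print([p["name"] for p in ranked])
```

In the full system the ranked queue would be persisted to PostgreSQL; the scoring function itself stays this small regardless of storage.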
Regular comprehensive snapshot. There are 3,000 products, with 20 columns per product, captured page by page. I’m looking for a repeatable, fully automated workflow built on a Python-based stack (Scrapy, BeautifulSoup, Selenium, Playwright, or an equivalent you prefer). Robustness is key: the crawler should cope with pagination and JavaScript-rendered content. Clear, well-commented code is part of the deliverable so my team can review and rerun it internally. Each quarterly hand-off must include: • Cleaned CSV or JSON containing the structured product records • The raw HTML or a compressed WARC snapshot for auditing • The executable script(s) plus a brief change log highlighting any site-structure updates you handled Please outline your proposed tool chain, an example of a large scrape yo...
I want a clean table of the top 5 browser game websites, with their stats extracted from SimilarWeb (traffic, global position, share of desktop devices, top 3 countries), as well as a contact email/contact form address. Another column should be a proposed backlink price for their domain based on your own experience. I do not like too many questions, so I am looking forward to seeing the result rather than fielding additional questions. This is a test task; if you are chosen, we will continue to work on a regular basis: I will send you websites for analysis, and you will add these new ones to the existing table, with the information gathered from various sources.
I’m looking to start a Python-based project purely for personal use and I’m intentionally keeping the brief open so creative developers can pitch ideas that excite both of us. Whether it’s a handy automation script, a data-driven dashboard, a lightweight Flask or Django web app, a web-scraping utility, or even a small game, I’m happy to explore any direction—as long as it showcases clean, well-documented Python code. Because I do not have a strict deadline (no time limit), I prefer quality over speed. Take the time to think through the concept, architecture, and tech stack; then send me a detailed project proposal that explains: • The core idea and its personal value • Key Python libraries or frameworks you plan to use (e.g., Pandas, Selenium, Fa...
**Title:** Data Research – Collect Email Addresses of Local Businesses (South London Postcodes) **Description:** I’m looking for a freelancer to build a list of local businesses located in the following London postcodes: SE22 SE23 SE24 SE25 SE26 SE27 SE28 The task is to identify **shops/businesses that do NOT have a website** and collect their **publicly available email addresses**. **Requirements:** For each business, please provide the following in a spreadsheet (Excel or Google Sheets): * Business Name * Email Address * Business Address * Postcode * Phone Number (if available) * Business Category (e.g., barber, café, convenience store, etc.) **Important:** * Only include businesses **without their own website**. * Email addresses must be **publicly available*...
I’m running detailed market research and need a clean, verifiable list of U.S.-based Shopify stores that actively use Shopify Fraud Protection. The focus is on small-to-mid tier shops, capped at roughly 400 000 visits per month, so the big names are out. Please leave out grocery, clothing, perfume, and pet stores; categories such as electronics, home & garden, beauty & health—or any other niche that isn’t excluded—are welcome. I’m flexible on how you gather the data: scraping, APIs, or well-documented manual checks are all fine as long as the results are accurate. Deliverables • Spreadsheet (Excel or Google Sheets) listing: – Store URL and brand name – Estimated monthly traffic figure and data source (Similarweb, BuiltWit...
hi, need someone to scrape some data. I need a google sheet with the name, email, phone and website of every company listed here https://www.freizeitmesse.de/ausstellerverzeichnis/#/suche/f=h-entity_orga;v_sg=0;v_fg=0;v_fpa=FUTURE Should be around 500 entries. Please share your rate and timeframe.
I have a growing list of websites from which I routinely pull business details, and I’m looking for a single, reusable script that lets me swap in a new URL, hit “run,” and walk away with clean data. The fields I always need are: Name, Address, Phone number, Email, Website, and a short description of Services. Python 3 is my first choice because I’m already set up with it locally, and I’m comfortable installing BeautifulSoup, Requests, Scrapy, or Selenium if the page structure calls for it. If you feel another stack (e.g., Node-JS with Cheerio or Puppeteer) would shorten development time or handle tricky JavaScript sites better, let me know—flexibility is more important than sticking to a single library. What matters to me is that: • The URL (or...
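The "swap in a URL and walk away" goal usually comes down to separating fetching from field extraction; a minimal sketch of the extraction half, using deliberately loose regular expressions over page text (the business details below are invented):

```python
import re

def extract_contacts(text):
    """Pull recurring contact fields from raw page text; patterns kept loose on purpose."""
    emails = sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text)))
    phones = sorted(set(re.findall(r"\+?\d[\d\s().-]{7,}\d", text)))
    return {"emails": emails, "phones": phones}

page_text = "Contact Acme Plumbing: info@acme-plumbing.example, tel +1 (555) 010-7788."
result = extract_contacts(page_text)
print(result)
```

Because the extractor takes plain text, the same function works whether the page was fetched with Requests, rendered with Selenium, or crawled by Scrapy, which is what makes the script reusable across sites.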
I need someone to collect freelancer profiles from freelancer platforms and put them into a Google Sheet or Excel file. Criteria: • Freelancer must speak Dutch (Native, Bilingual, or Fluent) • Their hourly rate must be between $5 and $35 For each freelancer, collect: • Name • Profile link • Hourly rate • Country • Skills • Languages The final output should be a Google Sheet or Excel file. Please confirm you understand the criteria.
Project Description: We are looking for a freelancer to help with a data research and filtering task using LinkedIn Sales Navigator and some online tools. Workflow: 1. I will provide LinkedIn Sales Navigator filter criteria to find companies. 2. Using those filters, you need to extract the list of companies from Sales Navigator. 3. For each company, collect: - Company name - LinkedIn company page - Website domain 4. Next, check the MX records of each website using the MX checker link that I will provide. 5. If the domain uses Google Workspace (Google MX records) → proceed to the next step. 6. Then check the BIMI record using the BIMI checker link I will provide. 7. If: - MX = Google Workspace - BIMI = Not enabled → Then collect: - CEO LinkedIn pr...
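The MX lookup itself would typically use a resolver library such as dnspython (`dns.resolver.resolve(domain, "MX")`), but the screening rule in step 5 reduces to checking whether any MX host belongs to Google; the hostnames below are illustrative.

```python
GOOGLE_MX_SUFFIXES = ("google.com", "googlemail.com")

def is_google_workspace(mx_hosts):
    """True if any MX host belongs to Google (the Google Workspace screening rule)."""
    return any(host.lower().rstrip(".").endswith(GOOGLE_MX_SUFFIXES)
               for host in mx_hosts)

print(is_google_workspace(["ASPMX.L.GOOGLE.COM."]))          # True
print(is_google_workspace(["mail.protection.outlook.com"]))  # False
```

Normalizing case and trailing dots before comparing avoids false negatives on fully qualified DNS answers.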
I have a curated list of specific company websites and I need an automated solution that extracts complete contact information from each one. The goal is to turn every URL into a clean, ready-to-use lead. WEBSITE : The scraper should capture: • Email addresses • Phone numbers • Mailing addresses • LinkedIn profile link • Location (city / state / country) • First and last name • Occupation / job title • Company name • Company website A well-structured CSV or Excel file is the preferred output, with each field in its own column. I am comfortable with your choice of tech—Python with BeautifulSoup, Scrapy, or Selenium are all fine—as long as the script runs reliably and respects rate limits where required. Ac...
Project Title: RPA Automation for RTO Document Upload (Excel to Web) Project Description: I need a robust automation (RPA) solution, preferably using Power Automate Desktop (PAD), to automate the document uploading process on an RTO (Regional Transport Office) portal. The bot needs to handle data from an Excel sheet, match it with local PC folders, and upload documents to the website. Detailed Workflow: 1. Data Input (Excel Integration): The bot should read an Excel file containing two primary columns: Vehicle Number and Chassis Number. It must iterate through each row one by one. 2. Web Navigation & Search: Open the RTO portal and log in (if required). Input the Vehicle Number and Chassis Number from Excel into the website's search fields to fetch the specific vehicle's deta...
We’re looking for someone experienced in setting up and stabilizing large batches of TikTok accounts targeting the US market. The goal is to properly configure the environment so accounts look legitimate and can safely scale activity. This is not beginner work. You should already understand device/IP hygiene, account aging, and how TikTok detects suspicious behavior. What you’ll be responsible for: • Setting up multiple TikTok accounts for US usage • Configuring clean IP environments (residential/mobile preferred) • Ensuring each account has a unique fingerprint • Proper account warm-up strategy (activity simulation, browsing, engagement) • Avoiding bans, shadow bans, or verification loops • Creating a repeatable setup process for scaling accoun...
I need a small, always-on scraper that keeps an eye on a popular second-hand marketplace and alerts me the moment any Electronics listing matching my keywords appears. My priority is speed—ideally I hear about a new post within seconds, certainly no longer than a minute after it goes live. Here’s what the script must do: • Crawl the marketplace continuously without being blocked, parse every new listing, and filter it against a configurable set of electronics keywords. • Extract and store the Price and Condition fields so I can track changes and avoid duplicates. • Push an instant notification (email, SMS, or Slack—whichever you prefer to wire up) each time a fresh match is found. I’m comfortable with a Python 3 stack—think Requests/...
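One polling pass of such a monitor combines a keyword filter, an already-seen set for duplicate suppression, and a pluggable notifier; the listings and keywords below are invented, and `alerts.append` stands in for a real email/SMS/Slack hook.

```python
KEYWORDS = {"iphone", "rtx", "thinkpad"}   # configurable electronics keywords

def matches(listing):
    title = listing["title"].lower()
    return any(kw in title for kw in KEYWORDS)

def poll_once(listings, seen, notify):
    """One polling pass: keep new keyword matches, record price/condition, alert."""
    for item in listings:
        if item["id"] in seen or not matches(item):
            continue
        seen.add(item["id"])
        notify(f"{item['title']} - {item['price']} ({item['condition']})")

alerts = []
seen = set()
batch = [
    {"id": 1, "title": "ThinkPad X1 Carbon", "price": "450", "condition": "used"},
    {"id": 2, "title": "Garden chair", "price": "15", "condition": "new"},
]
poll_once(batch, seen, alerts.append)
poll_once(batch, seen, alerts.append)   # second pass: duplicate suppressed
print(alerts)
```

Running `poll_once` inside a tight scheduler loop (with polite delays and rotating sessions to avoid blocks) gives the seconds-level latency the post asks for.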
Job Description I am looking for a technically proficient developer with a background in Computer Science and Local SEO to build and manage a Mobile Mechanic Lead-Gen Engine. Your primary responsibility is to create and scale unique digital entities for mobile car repair services, ensuring each location dominates the Google Maps 3-Pack through automated "Trust Signals." Project Scope & Compensation • Phase 1 (Trial): Management of 3 initial locations at $300/month. • Phase 2 (Growth): Upon reaching Day 50 performance metrics, scaling to 10 locations per developer at $1,000/month. • Payment Trigger: Success is measured by the completion of the 50-day roadmap and the achievement of "Human Dialogue" metrics (verified customer interactions). Key Responsib...
I need a search engine that connects to my Excel sheet for efficient data retrieval. The sheet primarily contains text data across 1-5 columns. Key requirements: - Ability to perform exact match searches - Simple and user-friendly interface Ideal skills and experience: - Experience with search engine development - Proficiency in handling Excel data - Strong programming skills, preferably in Python or similar languages Looking forward to your bids!
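Exact-match search over a small sheet is a straight row filter; the sketch below reads CSV text as a stand-in for the Excel sheet (a real build would load the workbook with openpyxl or pandas first), and the columns are invented.

```python
import csv, io

# Hypothetical sheet contents, exported as CSV for the sketch.
SHEET = """id,name,city
1,Acme,Berlin
2,Beta,Munich
3,Acme,Hamburg
"""

def exact_search(csv_text, column, value):
    """Return every row whose given column equals value exactly."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row[column] == value]

hits = exact_search(SHEET, "name", "Acme")
print(hits)
```

A thin Tkinter or web front end over this function would satisfy the "simple and user-friendly interface" requirement without changing the search logic.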
Project Title: WhatsApp to Web Portal Automation (Python) - Multi-Recharge Distributor Project Description: I am looking for a developer to automate a repetitive task for my multi-recharge business. I am a distributor for a portal () and I currently manage retailer balance transfers manually via WhatsApp. Current Workflow: Retailers send a payment screenshot and a message via WhatsApp (Format: PAY [ID] [Amount]). I manually log in to the web portal or mobile app. I enter the Retailer ID and the Amount to transfer the wallet balance. I do not verify screenshots instantly; I manually verify bank statements at night. What I Need: I need a "Robot" or an automation script (using Python and Selenium) that can: Trigger: Read incoming WhatsApp notifications. Extract Data: Automatica...
READ FULLY BEFORE BIDDING. Bids that ask questions already answered here will be rejected. Bids over $1,500 USD will be rejected automatically. Only developers who have previously built similar portal automation systems will be considered. ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ PROJECT OVERVIEW ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ We need a Python-based multi-client automation system that monitors a government visa appointment portal (Turkey), detects available slots in real time, and completes the reservation process automatically on behalf of multiple applicants. The system manages a queue of applicants, runs configurable parallel sessions, and operates 24/7 as a background service. ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ TARGET PORTALS — Phase 1 (5 countries) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ ...
Title Build Comprehensive Global Movie & TV Metadata Database for Recommendation Engine Project Overview I am building a personal recommendation engine that predicts my rating for movies and TV shows based on a large history of titles I have already rated. To support accurate predictions, I need a global media metadata database that contains rich structured information for movies and TV series. The dataset should combine multiple trusted sources and be designed for machine-learning comparison against my rating history. Scope of Work Build a master dataset containing global movie and TV metadata. This will serve as the candidate pool for prediction models. The database should include titles from: • IMDb official dataset • The Movie Database API • Optional enri...
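Combining the sources comes down to joining on a shared key (IMDb ids, which TMDB also exposes); a sketch in SQLite with two invented rows standing in for the IMDb dataset and the TMDB API responses:

```python
import sqlite3

# Hypothetical rows standing in for the two sources.
imdb = [("tt0111161", "The Shawshank Redemption", 1994)]
tmdb = [("tt0111161", 9.3, "Drama")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE titles (imdb_id TEXT PRIMARY KEY, title TEXT, year INT)")
conn.execute("CREATE TABLE extra (imdb_id TEXT PRIMARY KEY, rating REAL, genre TEXT)")
conn.executemany("INSERT INTO titles VALUES (?, ?, ?)", imdb)
conn.executemany("INSERT INTO extra VALUES (?, ?, ?)", tmdb)

# Join on the shared IMDb id to build the candidate pool for the model.
row = conn.execute("""SELECT t.title, t.year, e.rating, e.genre
                      FROM titles t JOIN extra e USING (imdb_id)""").fetchone()
print(row)
```

Keying everything on a stable external id is the design choice that lets optional enrichment sources be bolted on later without re-matching titles by name.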
PROJECT TITLE Web Scraping Developer for Global Legal & Regulatory Data Collection PROJECT OVERVIEW We are looking for a developer who can build an automated system to collect legal and regulatory documents from multiple global sources. The goal is to create a scalable automated pipeline that can gather legal data across multiple jurisdictions and regulatory domains. DATA COLLECTION SCOPE The system will collect information related to: - Medical law and healthcare regulation - Medical advertising regulation - Corporate formation and company governance laws - Investment regulation (stocks, cryptocurrency, real estate) - Tax law and administrative tax rulings - Beauty and cosmetic regulation - Medical and cosmetic manufacturing compliance - Import and export law - Customs and tariff...
More details: Where will the product information come from? From all 3 sources above What information should successful freelancers include in their application? This question was skipped by the user How soon do you need your project completed? ASAP I would prefer to speak Aromanian for this project.
Each week I have a narrow booking window on our club’s Northstar Technologies app and I need that tee time grabbed the instant the system opens. The script must log in automatically, choose one of our three courses based on a simple setting I can change each week, then secure a slot that falls anywhere inside a time range I provide. Your solution has to run reliably on Windows and cope gracefully with captchas, latency, or the app’s occasional hiccups. As long as it is rock-solid I’m flexible on the language and tooling you use, but I do want clear instructions for installing and scheduling it so the task fires off at the exact minute I specify. Deliverables • Source code and any auxiliary files • A short README explaining setup, configuration for cour...
I need an AI-driven pipeline that takes a topic idea and automatically does everything that follows: it finds the most-searched Google keywords, retrieves the top 10 URLs for each term, scrapes the images and other media assets from those pages, adapts or re-edits the visuals to match my branding guidelines, then publishes the finished piece to my website while triggering the appropriate lead-capture sequence. All three stages—keyword research and analysis, scraping plus in-house editing, and final posting with lead automation—must run without manual intervention. A multi-agent architecture is ideal, so feel free to leverage LangChain, CrewAI, AutoGPT, or any comparable framework that lets independent agents pass tasks between one another. Think of one agent focused on Google ...
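Stripped of the frameworks, a multi-agent pipeline is a chain of stages that each accept the previous stage's output; the stand-in functions below only illustrate the hand-off pattern (a real build would wire LangChain or CrewAI agents in their place, with all names here hypothetical).

```python
def keyword_agent(topic):
    """Stage 1 stand-in: would call a keyword-research API."""
    return {"topic": topic, "keywords": [f"{topic} guide", f"best {topic}"]}

def scrape_agent(task):
    """Stage 2 stand-in: would fetch top URLs and pull media assets."""
    task["assets"] = [f"img_for_{kw.replace(' ', '_')}.png" for kw in task["keywords"]]
    return task

def publish_agent(task):
    """Stage 3 stand-in: would post to the site and fire the lead-capture hook."""
    return f"published '{task['topic']}' with {len(task['assets'])} assets"

pipeline = [keyword_agent, scrape_agent, publish_agent]
result = "standing desks"
for stage in pipeline:
    result = stage(result)
print(result)
```

The value of the agent frameworks is in retries, tool access, and message passing; the overall control flow stays this linear hand-off.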
We are building a cloud-based cross-listing engine for resellers (eBay, Poshmark, Mercari, Vinted, etc.). What you’ll be doing Architecting a headless browser fleet that scales horizontally. Implementing "human-like" behavioral simulation. Building a Local Proxy Tunneling system. Building a robust "Session Mirroring" engine that captures, encrypts, and replays storage states (cookies/localStorage) from our extension to our cloud workers. You should be able to explain the difference between a JA3 fingerprint and a Canvas fingerprint. Experience with serverless execution (Azure Functions) and container orchestration. You’ve spent time in Chrome DevTools looking at how sites track "Device IDs" or handle rolling tokens. To prove you actually...
I need an experienced Python developer to build a commercial multi-client visa appointment automation system for Turkey-based applicants. Full source code ownership is required upon delivery. --- PROJECT BACKGROUND I run a visa consultancy service in Turkey. My clients need visa appointments from VFS Global portals. The current manual process is too slow and I need a fully automated, scalable system that handles 50-100+ clients simultaneously. --- TARGET PORTALS Source: Turkey () Target countries (minimum 6-7): - United Kingdom - Germany - France - Netherlands - Italy - Spain - Sweden System must be modular so new countries can be added later. --- CORE FEATURES REQUIRED [1] MULTI-CLIENT MANAGEMENT DASHBOARD - Register and manage 100+ client records - Per client: Full name, Passpo...
I need a Node JS service that pulls every piece of live information available on and its Android app (). The feed must refresh virtually every second, covering scores and statistics, detailed player profiles, and full match schedules. Your code should expose the data through robust JSON endpoints while respecting the target platform’s structure. Alongside the scraper, build a lightweight admin panel where I can: • Whitelist or block consuming domains / mobile apps in real time • View scrape and API-access logs at a glance • Add, edit or disable user accounts, assigning granular access rights Please design the solution so it can scale gracefully, use common Node JS tooling (TypeScript, Express, Puppeteer, Cheerio, or similar), and keep response latency low enoug...
I’m looking for a single application that can scan—and when possible automatically secure—usernames across several social platforms. The must-cover list is Instagram, TikTok and Snapchat, with bonus points if you can also include Gunslol, Discord and any other networks you already know how to tap into. Core workflow • I enter a series of prefixes or exact names, choose whether I want 2, 3 or 4-character checks, then hit start. • Your script pings each platform’s availability endpoint or page, handles any “Are you a robot?” challenge in the background, and reports back instantly. • The moment a name is free, the tool registers it with a pre-configured email account so the handle is locked in before anyone else grabs it. Notificati...
Project Overview I am seeking an expert Magento 2 Developer with strong data extraction (scraping) capabilities to build out a comprehensive medical supply catalog. The previous developer started a sample set but is no longer on the project. I need a professional to scrape high-volume product data and perform a structured, complex import into my Magento 2 staging site. Scope of Work 1. Data Scraping & Extraction Source: Target website (medical supply industry) to be provided. Requirements: Full extraction of Product Titles, High-Res Images, Long/Short Descriptions, SKUs, and Technical Specifications. Variation Mapping: Correctly identify and link "Simple" products to their "Configurable" parents (e.g., mapping different sizes or packaging options to a single prod...
Contact selection criteria: Upholstered furniture manufacturing
Area: Romania, Bulgaria, Serbia
Data scope:
• Company name: 100%
• Country: 100%
• Telephone: 70%
• Email: 100%
• Industry: 100%
• Turnover (for available companies)
Number of contacts: 1,100
Thanks for looking. I urgently need data for a set of local businesses in and around Berkshire and London, UK. We will pay per 2,000-record list for the industries we will send upon acceptance. Email addresses must not be role-based or trip any spam traps. I need a one-time extraction of verified email addresses from reputable online business directories. No other data fields are required—just the clean list of emails. Please choose whatever approach you prefer—Python with Scrapy/BeautifulSoup, browser automation with Selenium, or a similar tool chain—as long as the result is accurate and the scraping respects each site’s terms of service and rate limits. Deliverable • A CSV or XLSX file containing every unique email address you capture, de-duplicated and ready ...
I need a quick-turnaround PHP script that pulls the live gold price from three different publicly accessible websites (no authentication required). The current feed I rely on has gone down, so this will act as an immediate rescue solution. Key requirements • Parse each site’s HTML/JavaScript to extract the latest price. • Merge the three readings into one final value (average or last-in wins—whichever is fastest to implement). • Refresh the value every 30 seconds and write it to a variable or simple cache file I can echo anywhere on my site. • Code must be plain PHP (I edit in Notepad++), well-commented, and ready to drop into an existing page. Timing is critical: I need the working script delivered within 30 minutes of awarding the project. If you...
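The post asks for plain PHP, but the parse-and-merge logic is language-independent; sketched here in Python with invented page snippets, using a loose price regex and a simple average as the merge rule (the "fastest to implement" option).

```python
import re

def parse_price(html):
    """Find the first figure that looks like a price in the page source."""
    match = re.search(r"(\d{1,3}(?:,\d{3})*\.\d{2})", html)
    return float(match.group(1).replace(",", "")) if match else None

# Hypothetical fragments standing in for the three fetched pages.
sources = [
    "<span id='gold'>2,045.30</span>",
    "<td>Gold (oz): 2,046.10</td>",
    "<div class='px'>2,044.90 USD</div>",
]
readings = [p for p in (parse_price(s) for s in sources) if p is not None]
merged = sum(readings) / len(readings)   # simple average of the three feeds
print(round(merged, 2))
```

In PHP the same shape is `preg_match` per source plus `array_sum(...) / count(...)`, written to a cache file every 30 seconds; averaging three feeds also gives graceful degradation if one source goes down again.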
I have a spreadsheet filled with email addresses and I need each one matched to its correct LinkedIn URL, focusing only on professional profiles (no company pages or recruiter accounts). The task is straightforward: for every email I supply, locate the corresponding LinkedIn profile and return a clean file that pairs the original email with the verified profile link. Accuracy is more important than speed; every URL should open directly to the individual’s main profile and clearly relate to the supplied email. Please avoid automated matches that haven’t been double-checked—if a link cannot be confirmed, flag it rather than guessing. Deliverables • Updated spreadsheet with two columns: Email | LinkedIn URL • A brief note on any emails that could not be match...
I need a freelancer to download text data from a single web page.

Requirements:
- Access the specified web page
- Download and format the text data as required
- Ensure accuracy and completeness of the data

Ideal Skills and Experience:
- Experience with web scraping or data extraction
- Familiarity with handling and formatting text data
- Attention to detail and reliability

Please provide samples of similar work and estimated time for completion.
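The "download and format the text" step for a single page needs no heavy tooling. A minimal stdlib sketch of the text-extraction half (the fetch via `urllib` is omitted; the sample HTML is illustrative):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

p = TextExtractor()
p.feed("<html><style>a{}</style><h1>Title</h1><p>Body text.</p></html>")
print(" ".join(p.parts))  # Title Body text.
```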
You’ll be working inside a private, password-protected website that I’ll grant you access to once we start. After logging in, I need every profile scanned for four key fields—Name, Email, Phone number, and Location (captured as City and State only). Accuracy matters: no missed entries, no duplicates, and the information must remain strictly confidential. The finished data should arrive in a single, neatly structured Excel workbook. If you prefer to stage it in Google Sheets or supply a quick CSV for interim checks, that’s fine, but the final hand-off must be Excel. Clear column headers (Name, Company, Email, Phone, City, State, Notes) are essential so I can filter and sort easily. I value meticulous web research experience, especially on projects with large record...
I need a Python developer to create a Windows-based automation script that helps monitor shift availability on an online hiring portal. The tool should check the schedule page at regular intervals and assist with quickly selecting shifts that match predefined preferences. The goal is to reduce the need for manual page refreshing and speed up the shift selection process.

Key requirements:
• Developed in Python and compatible with Windows
• Ability to monitor the hiring portal and detect newly available shifts
• Smart filtering based on preferences such as day of week, time window, and shift duration
• Automatic navigation of the schedule interface to help select matching shifts
• Auto-login capability after the initial setup using a saved browser session ...
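The "smart filtering" requirement above can be sketched independently of any particular portal. Everything here is hypothetical: the shift schema, the preference keys, and `poll_once` stand in for the portal-specific automation (e.g. Selenium calls) that would fetch and book shifts.

```python
# Illustrative preference set: weekend shifts between 08:00 and 18:00.
PREFERENCES = {"days": {"Sat", "Sun"}, "earliest": 8, "latest": 18}

def matches(shift, prefs):
    """Filter on day of week and time window, per the brief."""
    return (shift["day"] in prefs["days"]
            and prefs["earliest"] <= shift["start_hour"]
            and shift["end_hour"] <= prefs["latest"])

def poll_once(shifts, prefs):
    """One schedule check: return the shifts worth selecting."""
    return [s for s in shifts if matches(s, prefs)]

# One simulated check; a real script would loop with a sleep between checks.
available = [
    {"id": 1, "day": "Sat", "start_hour": 9, "end_hour": 17},
    {"id": 2, "day": "Mon", "start_hour": 9, "end_hour": 17},
]
print(poll_once(available, PREFERENCES))  # only the Saturday shift matches
```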
My site used to rank on page one and I want it back there. The immediate priority is to tighten up usability and navigation while at the same time correcting on-page SEO. Here is what I need from you:

• Audit the current menu structure and internal links, then implement clear, intuitive navigation that keeps visitors moving and reduces bounce.
• Create roughly twenty suburb-focused landing pages and fold the same suburbs naturally into key existing pages, so potential customers in each local area find us quickly.
• Optimise every new and updated page for search: titles, meta descriptions, H-tags, image alts, schema where helpful, and clean URL slugs.
• Once changes are live, submit an updated XML sitemap and see that everything is indexed correctly.

I am happy...
I’m in the UK and need a fresh, reliable list of genuine car importers across Europe, Asia and Africa. For every company you include, I must have full contact details, the specific makes or categories of vehicles they handle and a realistic picture of their monthly or annual import volume. Please double-check each entry—phone or email verification rather than simple web scraping—so I can reach out with confidence. Organise everything by region and deliver the final list as a well-formatted PDF report, complete with a one-page executive summary of totals. I’ll spot-check roughly 10 % of the leads; once they prove accurate, the milestone is complete. If you’ve sourced automotive contacts before, a quick note about a similar project will help me choose the ...
I need a clean, well-documented Python script that can scrape Airbnb listings for a neighbourhood or city that I will specify before you start. The scraper must gather every listing it finds and export the results to a single Excel file (.xlsx).

Essential columns
• URL
• Title
• Price per Night (+ Airbnb Service Fee)
• Ratings
• Reviews ‑ include the review text and the number of nights the guest stayed
• Description
• Address

Deliverables
1. Python source code with clear comments.
2. A dependency list (requirements.txt or similar) so I can recreate the environment.
3. The populated Excel file for the first run.
4. A short README explaining how to change the target location and rerun the scraper.

Acceptance criteria
• All columns above are present and correctly po...
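The column schema and export step of a posting like this can be sketched up front. This is an assumption-laden sketch: `to_row` stands in for the per-listing scraping logic, and CSV is used here only to keep the example stdlib-only (the brief asks for .xlsx, which would swap `csv` for openpyxl or pandas in the real script).

```python
import csv

# Column order taken from the brief's "Essential columns" list.
COLUMNS = ["URL", "Title", "Price per Night", "Ratings",
           "Reviews", "Description", "Address"]

def to_row(listing):
    """Normalise one scraped listing dict into the agreed column order."""
    return [listing.get(col, "") for col in COLUMNS]

def export(listings, path):
    """Write the header plus one row per listing."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(COLUMNS)
        writer.writerows(to_row(l) for l in listings)

# Hypothetical single listing to show the output shape.
export([{"URL": "https://example.com/rooms/1", "Title": "Cosy flat"}],
       "listings.csv")
```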
I need a robust yet easy-to-maintain web scraper that pulls player statistics from four different sports sites—a blend of official league pages, sports news outlets, and a couple of well-known fan forums. All scraped data should flow into a single database and surface through a lightweight web dashboard where I can search by player, season, and team, compare numbers side by side, and export results to CSV. My ideal flow looks like this: enter or schedule the URLs, run or auto-run the scraper, watch progress logs, and then immediately view fresh stats inside the dashboard—no command-line work once everything is deployed. If any source changes its HTML, the scraper should fail gracefully and flag the issue in the UI so I can react quickly. Tech stack is flexible; Python with Be...
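The "fail gracefully and flag the issue in the UI" requirement above is mostly a wrapping pattern: each source parser runs inside a guard that converts a layout change into a logged issue instead of a crash. A minimal sketch, with `broken_parser` as a hypothetical stand-in for a source whose HTML has changed:

```python
def run_source(name, parser, html, issues):
    """Run one source parser; on failure, record an issue for the dashboard."""
    try:
        return parser(html)
    except Exception as exc:
        issues.append({"source": name, "error": str(exc)})  # surfaced in the UI
        return []  # empty result keeps the other sources flowing

def broken_parser(html):
    raise ValueError("expected stats table not found")

issues = []
stats = run_source("fan-forum", broken_parser, "<html></html>", issues)
print(stats, issues[0]["source"])  # [] fan-forum
```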
**Task Description:**

We are developing a Selenium-based automation tool and need a **JavaScript stealth patch specifically optimized for Microsoft Edge (Chromium)**. The goal is to make the automated browser appear **as close as possible to a normal real user browser** when checked on fingerprinting test sites.

**Requirements:**

1. Create a **JavaScript stealth patch** that can be injected using Selenium CDP:

   ```python
   driver.execute_cdp_cmd("Page.addScriptToEvaluateOnNewDocument", {"source": JS_PATCH})
   ```

2. The patch should help ensure that on fingerprint testing sites such as:
   * PixelScan
   * BrowserScan
   * similar browser fingerprint check tools

   the browser appears **natural and not flagged as automated**.

3. The patch should address common automation signals including (but not limited to):
   * `...
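For illustration, the injection pattern the requirement describes looks like the following. This sketch shows only the best-known automation signal, `navigator.webdriver`; it is not the full patch the posting asks for, and the browser launch itself is omitted.

```python
# Hypothetical fragment of the JS payload: hide navigator.webdriver before
# any page script runs. Page.addScriptToEvaluateOnNewDocument is the CDP
# command that evaluates a script on every new document in the target.
JS_PATCH = """
Object.defineProperty(Navigator.prototype, 'webdriver', {
  get: () => undefined,
  configurable: true,
});
"""

# In the real tool, after creating the Edge driver:
# driver.execute_cdp_cmd("Page.addScriptToEvaluateOnNewDocument",
#                        {"source": JS_PATCH})
print("webdriver" in JS_PATCH)  # True
```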