We are seeking a skilled professional or team to develop an advanced AI-driven solution for sourcing and ranking property development opportunities. The AI agent will focus on identifying office and retail properties with high potential for residential conversion, leveraging data from various sources and performing detailed profitability analyses.
Scope of Work
The project involves the following key deliverables:
1. Web Scraping Framework
   - Scrape property listings from a variety of sources, including aggregators, local agencies, and major commercial property firms.
   - Integrate planning history and nearby development data from local councils and public databases to enhance contextual understanding.
2. Data Enrichment
   - Enrich property data with additional insights, including:
     - Residential and commercial market values.
     - Planning history of the identified properties.
     - Details and proximity of nearby major developments (e.g., consented projects or completed developments).
3. Property Ranking System
   - Develop a ranking model using a Profitability Score.
     - **Formula:** Projected Residential Value - Existing Commercial Value (see the sketch after this list).
   - Incorporate additional factors such as planning favorability and the influence of nearby developments.
4. Geospatial Analysis (Optional)
   - Integrate GIS tools for mapping properties, planning contexts, and major developments.
   - Provide distance-based insights to assess the impact of surrounding projects on profitability.
5. Data Output and Visualization
   - Create a user-friendly database and an interactive dashboard with filters for:
     - Location.
     - Residential and commercial values.
     - Planning history and nearby developments.
     - Constraints and other contextual data.
   - Alternatively, provide the data output in a structured format like an Excel file.
6. Discounted Property Insights
   - Include a feature to flag discounted properties and document the reasons for the discount, if available.
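A minimal sketch of the ranking logic from item 3, assuming illustrative field names and placeholder weights for planning favorability and nearby-development uplift; the exact adjustment factors would be agreed during the project.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    address: str
    projected_residential_value: float  # £, estimated value after residential conversion
    existing_commercial_value: float    # £, current office/retail value
    planning_favorability: float        # 0.0-1.0, placeholder score for planning context
    nearby_development_uplift: float    # e.g. 0.0-0.2, placeholder uplift from nearby schemes

def profitability_score(c: Candidate) -> float:
    # Core formula from the brief: Projected Residential Value - Existing Commercial Value.
    base = c.projected_residential_value - c.existing_commercial_value
    # Illustrative adjustment for planning favorability and nearby developments;
    # the weights here are assumptions, not part of the brief.
    return base * (1 + c.nearby_development_uplift) * (0.5 + 0.5 * c.planning_favorability)

def rank_candidates(candidates: list[Candidate]) -> list[tuple[Candidate, float]]:
    # Highest adjusted profitability first.
    return sorted(((c, profitability_score(c)) for c in candidates),
                  key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    demo = [
        Candidate("12 Example St, SW1", 2_400_000, 1_500_000, 0.8, 0.10),
        Candidate("Unit 3, Sample Rd, SE1", 1_900_000, 1_600_000, 0.4, 0.05),
    ]
    for c, score in rank_candidates(demo):
        print(f"{c.address}: £{score:,.0f}")
```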
Data Sources to Scrape
Property Listing Websites
• Aggregators - examples include: EG Property Link, Rightmove Commercial, Zoopla Commercial, CoStar.
• Local Agencies - examples include: Lewis and Co, Andrew Scott Robertson, Kalmars, Frost Meadowcroft.
• Major Agencies - examples include: CBRE, Knight Frank, Savills, JLL.
• Government websites
Planning Context
• Planning History: local council planning portals (e.g., Planning Portal UK). Data points: past applications, approval/denial status, constraints (e.g., conservation areas, flood zones).
• Nearby Major Developments: local council development plans, regional planning announcements. Data points: project type, size, status, proximity.
• News Outlets: check whether there are any articles about the property or the surrounding area.
Market Value Data
• APIs or public databases providing residential sale/rental values and commercial sale values. Rightmove and Zoopla cover residential values well; commercial values can be taken from the listing.
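As a rough illustration of the scraping framework in item 1, the sketch below pulls listing cards from a hypothetical search page; the URL and CSS selectors are placeholders, and each real source above would need its own parser and should be scraped in line with its terms of service.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical listings page for illustration only; each real source
# (aggregators, local agencies, major agencies) needs its own parser.
LISTINGS_URL = "https://example-commercial-listings.co.uk/search?use=office"

def fetch_listings(url: str) -> list[dict]:
    response = requests.get(url, headers={"User-Agent": "property-sourcing-bot/0.1"}, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    listings = []
    for card in soup.select(".listing-card"):  # placeholder selector
        listings.append({
            "title": card.select_one(".listing-title").get_text(strip=True),
            "price": card.select_one(".listing-price").get_text(strip=True),
            "url": card.select_one("a")["href"],
            "source": url,
        })
    return listings

if __name__ == "__main__":
    for listing in fetch_listings(LISTINGS_URL):
        print(listing)
```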
Data Points to Collect

Property Information
• Title, listing type (sale/rent), source URL, use type (office/retail).
• Location (address, nearest tube or train station).
• Sale price (where the unit is for rent or the price is not listed, a function will be needed to reach out to the agent).
• Rent (total £ and £ per sq ft).
• Size (sq ft).
• Number of floors.
• Agent details (name, email, phone number, website).
Financial Data
• Residential value: average price/rent per square foot in the area. Rightmove and Zoopla cover residential values well.
• Commercial value: average lease/sale price for commercial properties; this can come from the listing.
• Price history and vacancy duration.
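To make the data points above concrete, one possible record schema is sketched below; the field names, types, and units are suggestions to be finalized with the client, not a fixed specification.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PropertyRecord:
    # Property information
    title: str
    listing_type: str                        # "sale" or "rent"
    source_url: str
    use_type: str                            # "office" or "retail"
    address: str
    nearest_station: Optional[str] = None
    sale_price: Optional[float] = None       # £; may require agent outreach if not listed
    rent_total: Optional[float] = None       # £ per annum
    rent_psf: Optional[float] = None         # £ per sq ft
    size_sqft: Optional[float] = None
    floors: Optional[int] = None
    agent_name: Optional[str] = None
    agent_email: Optional[str] = None
    agent_phone: Optional[str] = None
    agent_website: Optional[str] = None
    # Financial data
    residential_value_psf: Optional[float] = None  # £ per sq ft in the area
    commercial_value: Optional[float] = None       # £, typically from the listing
    price_history: list[dict] = field(default_factory=list)
    vacancy_duration_days: Optional[int] = None
```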
Deliverables
• A system that collects property data from various sources and organizes it in a user-friendly way.
• A database containing enriched property information, including market values and planning context.
• An interactive dashboard to visualize and filter property information based on user needs.
• Simple documentation to explain how to use the system and its features.
• A system that keeps track of issues during data collection and provides error reports if something goes wrong.
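For the error-tracking deliverable, one minimal approach, assuming standard Python logging and a placeholder log file name, could look like this:

```python
import logging

# Log collection issues to a file so a simple error report can be produced per run.
logging.basicConfig(
    filename="scrape_errors.log",
    level=logging.WARNING,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logger = logging.getLogger("property_scraper")

def safe_scrape(source_name: str, scrape_fn):
    """Run one scraper and record any failure instead of stopping the whole run."""
    try:
        return scrape_fn()
    except Exception as exc:  # broad by design: one failing source should not halt the rest
        logger.error("Failed to scrape %s: %s", source_name, exc)
        return []
```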
Additional Considerations
• Real-Time Updates: potential to schedule regular scrapes to keep the data current. Implement change detection to update existing records.
• Scalability: design the system for high-volume scraping and analysis.
• Error Handling: handle site changes, rate limiting, and data inconsistencies gracefully.
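For the change-detection and rate-limiting points above, a hedged sketch: hash each record to detect changes between scheduled scrapes, and back off when a site returns HTTP 429. The fingerprinting approach, retry count, and delays are assumptions to be tuned during the project.

```python
import hashlib
import json
import time

import requests

def record_fingerprint(record: dict) -> str:
    """Stable hash of a listing record, used to detect changes between scrapes."""
    canonical = json.dumps(record, sort_keys=True, default=str)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def has_changed(new_record: dict, stored_fingerprint: str) -> bool:
    # Update an existing database row only when its fingerprint has changed.
    return record_fingerprint(new_record) != stored_fingerprint

def get_with_backoff(url: str, max_retries: int = 4) -> requests.Response:
    """Fetch a URL, backing off when the site signals rate limiting (HTTP 429)."""
    delay = 2.0
    for _ in range(max_retries):
        response = requests.get(url, timeout=30)
        if response.status_code != 429:
            response.raise_for_status()
            return response
        time.sleep(delay)
        delay *= 2  # exponential backoff between retries
    raise RuntimeError(f"Rate limited after {max_retries} attempts: {url}")
```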