Helvedom: an AI tool to estimate real estate prices

I am looking for a place to buy and got frustrated with what’s out there. Comparis and Homegate show you listings, but they tell you nothing about whether a price makes sense, or how municipality X compares to Y once you factor in taxes and commute. The tools that actually do this (PriceHubble, Wüest Partner) either cost thousands per month or want to harvest your data to upsell you a service.

Long story short, I scraped about 650k Swiss buy listings going back to 2010, trained an ML model on them, and built a website around it: helvedom.ch

It does price estimates (median error ~12-13%, roughly what the professional tools achieve), but the part I personally find more interesting is the factor breakdown. The model shows you why something costs what it costs: location is about 55% of it, size/rooms 18%, the municipality’s tax burden around 10%, ÖV access 5%. You can see how much a balcony, lift, or Minergie certification actually adds in CHF.
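To make the factor breakdown concrete, here is a minimal sketch (not the site’s actual implementation) of how per-feature attributions, e.g. mean absolute SHAP values, could be rolled up into group shares like the ones quoted above. The feature names and magnitudes are invented for illustration.

```python
# Hypothetical sketch: roll per-feature attributions (e.g. mean absolute
# SHAP values) up into factor groups. Names and magnitudes are invented.
attributions = {
    "hex_cell": 0.30, "canton": 0.15, "commute_time": 0.10,  # location
    "living_area": 0.12, "rooms": 0.06,                      # size/rooms
    "tax_multiplier": 0.10,                                  # taxes
    "transit_class": 0.05,                                   # ÖV access
    "year_built": 0.07, "amenities": 0.05,                   # everything else
}

groups = {
    "location": ["hex_cell", "canton", "commute_time"],
    "size/rooms": ["living_area", "rooms"],
    "taxes": ["tax_multiplier"],
    "ÖV access": ["transit_class"],
    "other": ["year_built", "amenities"],
}

total = sum(attributions.values())
shares = {
    group: sum(attributions[f] for f in feats) / total
    for group, feats in groups.items()
}

for group, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{group}: {share:.0%}")  # e.g. "location: 55%"
```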

There’s also a “best place to live” tool where you put in your workplace, max commute, income, Eigenkapital, and property requirements. It ranks all ~2,800 municipalities by total annual cost: mortgage + taxes + Nebenkosten.
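A rough sketch of how such a ranking could work. The interest rate, the ~1% Nebenkosten rule of thumb, and the toy flat-tax treatment are my assumptions for illustration, not necessarily the site’s actual formulas.

```python
# Hypothetical sketch of a "total annual cost" ranking across municipalities.
MORTGAGE_RATE = 0.02   # assumed interest rate on the loan
NEBENKOSTEN = 0.01     # rule of thumb: ~1% of property value per year

def annual_cost(price, eigenkapital, income, tax_rate):
    """Annual cost of living in a municipality: interest + upkeep + taxes."""
    loan = max(price - eigenkapital, 0.0)
    interest = loan * MORTGAGE_RATE
    upkeep = price * NEBENKOSTEN
    taxes = income * tax_rate  # toy: municipality-specific effective rate
    return interest + upkeep + taxes

municipalities = [
    # (name, median price of the desired property, effective tax rate)
    ("A", 900_000, 0.12),
    ("B", 750_000, 0.16),
    ("C", 1_100_000, 0.09),
]

ranked = sorted(
    municipalities,
    key=lambda m: annual_cost(m[1], 200_000, 120_000, m[2]),
)
```

A cheaper property in a high-tax municipality can easily lose to a pricier one in a low-tax one, which is exactly the trade-off the tool surfaces.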

Other stuff: a hex-grid price map of Switzerland and a municipality browser with median prices and tax rates, all available in EN/DE/FR/IT.

The whole thing is free, no lead gen, no data selling. Still work in progress, so feedback is welcome if you have a look.

17 Likes

Very cool! We recently had a house professionally estimated. Your tool was more than 20% lower than the average estimations we got for it. How have you modeled price appreciation trends?

This is interesting. The price is taken as the advertised price, rather than the selling price, right? So the actual valuation could differ depending on what was finally agreed.

Also, do you manage to make any adjustment for condition? E.g. an identical property could have a drastically different price depending on whether it needs a full renovation or was just renovated. Did you attempt any adjustment for this?

It’s a nice site! BTW, what source did you use for the historical data? I found a site once which had historical listings but then I lost the URL.

Nice tool, very professional look. I’ve just tried it for a property I’m in the process of putting on sale and the tool gave me a price which is 65% of the lowest estimate I have and less than 50% of the highest one.

It might be a bit of an edge case: it’s a studio in an area where small flats are rare, and it has a big outside terrace (I didn’t find a balcony/terrace option, so I selected garden instead).

Still, it gave me CHF 7,250/m² in an area where essentially everything goes for more than CHF 10,000/m².

Maybe the old data is overweighted? Anyway, an expert is going to evaluate the property in a month’s time; I’ll give you more feedback then.

1 Like

Yeah, definitely something I need to look into. Various temporal features (listing date, national price index, cantonal trend year-on-year, BIS price index) are added to the data set, which (in theory) should allow the model to learn the time trends. Moreover, the KNN comparison features use a 1-year exponential half-life, so a comp that is 1 year older has 0.5x weight, 2 years older 0.25x, 3 years older 0.125x.
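The half-life weighting described above boils down to a few lines. A sketch with a weighted comp average on top; `comp_weight` and `weighted_comp_price` are illustrative names, not the actual code:

```python
HALF_LIFE_YEARS = 1.0  # as described: comps lose half their weight per year

def comp_weight(age_years: float) -> float:
    """Exponential time-decay weight for a comparable listing."""
    return 0.5 ** (age_years / HALF_LIFE_YEARS)

def weighted_comp_price(comps):
    """Time-weighted average price of comps.

    comps: iterable of (price_per_m2, age_years) pairs.
    """
    num = sum(price * comp_weight(age) for price, age in comps)
    den = sum(comp_weight(age) for _, age in comps)
    return num / den
```

So a fresh comp at 10,000/m² and a one-year-old comp at 8,000/m² average out closer to the fresh one.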

However, since the appreciation has been so strong in Switzerland over the last 5 years, it could be that the prediction error is still lower for older listings and higher for newer ones. If that is the case, I might need to change the loss function and give more weight to more recent listings.
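If that turns out to be the case, one common fix is per-sample recency weights in the training loss. A sketch; the 3-year half-life is an arbitrary example, not a tuned value:

```python
def recency_weight(age_years: float, half_life_years: float = 3.0) -> float:
    """Assumed: listings lose half their loss weight every half_life_years."""
    return 0.5 ** (age_years / half_life_years)

def weighted_mse(y_true, y_pred, ages_years, half_life_years=3.0):
    """Mean squared error with recency-weighted samples."""
    w = [recency_weight(a, half_life_years) for a in ages_years]
    sq = [(t - p) ** 2 for t, p in zip(y_true, y_pred)]
    return sum(wi * e for wi, e in zip(w, sq)) / sum(w)

# Most gradient-boosting libraries accept this directly, e.g.
# model.fit(X, y, sample_weight=[recency_weight(a) for a in ages])
```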

Yes, exactly. I would love to get a dataset with selling prices, but based on my research they are either not available or extremely expensive.

I try to extract textual features from the advertisement text and expose them to the model (whether it needs renovation, plus possible proxies such as a heat pump or Minergie), and these are some of the checkboxes shown on the page. But they are not perfect, and there is probably a pretty big bias there: sellers overreport good condition and underreport bad condition.
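For illustration, such text flags could be as simple as keyword matching over the ad text. The keyword lists below are invented, not the actual rules used:

```python
import re

# Hypothetical condition/quality proxies extracted from listing text.
FLAGS = {
    "needs_renovation": r"renovationsbed(ü|ue)rftig|zu renovieren|needs renovation",
    "recently_renovated": r"frisch renoviert|k(ü|ue)rzlich renoviert|newly renovated",
    "heat_pump": r"w(ä|ae)rmepumpe|heat pump",
    "minergie": r"minergie",
}

def text_features(ad_text: str) -> dict:
    """Return a boolean flag per proxy, matched case-insensitively."""
    text = ad_text.lower()
    return {name: bool(re.search(pattern, text)) for name, pattern in FLAGS.items()}
```

Keyword flags like these inherit exactly the reporting bias mentioned above: a seller who never mentions the leaky roof produces no `needs_renovation` signal.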

Scraped it from various portals. I guess showing the actual historical listings would be nice, but I’m not sure about the legality there.

1 Like

If your source has photos, you might want to run them through a vision model to make a rough condition assessment, which you can feed back into the main model.

I just took it for a spin on a couple of properties whose value I know exactly, and the app got it right down to a franc.

Seriously amazed

2 Likes

My first impression was also that it is an amazing tool. There are obviously a couple of things you’re still ironing out, but nice work!

1 Like

@rbo Oh, another thought: have you thought about scraping rentals as well? Those values are much more reliable and numerous, and it is useful to have a rental estimate too.

On top of that, you can then correlate rents to the purchase price to help refine purchase price valuations.

2 Likes

Actual transaction prices are critical for accuracy but very hard to come by. Fahrländer, Wüest Partner, and PriceHubble work closely with banks that have those at scale.

@RBO What micro-location features are you using so far? In my experience these can explain quite a lot of price variance, particularly POIs (schools, public transport, shopping…), but also proximity to main roads or train tracks with high noise levels.

Great idea, but I unfortunately do not have any photos, and this would probably be pretty expensive with a decent vision model.

Yes, I planned this initially but have not had the chance to implement it yet. I guess the prediction itself is probably not that useful for rentals, since you are not that interested in what the “market rent” for a property is and often cannot really negotiate in the current rental market. But it would be a pretty nice addition to the best-place-to-live map (with predicted rent instead of mortgage, and affordability calculated as rent < income / 3).
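The rent variant could look something like this; the rent < income / 3 rule comes from the post above, the rest is assumed for illustration:

```python
def rent_affordable(monthly_rent: float, monthly_net_income: float) -> bool:
    """Affordability rule from the post: rent must stay under a third of income."""
    return monthly_rent < monthly_net_income / 3

def annual_cost_renting(monthly_rent, annual_income, tax_rate):
    """Rent replaces mortgage + Nebenkosten in the municipality ranking.

    tax_rate: toy municipality-specific effective rate (an assumption).
    """
    return 12 * monthly_rent + annual_income * tax_rate
```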

Unfortunately I’m very busy at work at the moment, but it’s a good idea for my to-do list when I have more free time again (famous last words :grinning_face_with_smiling_eyes:)

Currently:

  • Public transport: Swiss transit quality class plus travel time to the
    nearest urban center
  • Schools / daily needs: travel time to nearest school, grocery and hospital
  • POIs: local POI density from OpenStreetMap, including shops / restaurants / leisure,
    plus distance to nearest school / kindergarten and number of schools within
    1 km
  • Noise: road and rail noise exposure, day and night
  • Very local spatial context: micro-location hex cells plus nearby
    comparable listings, which should capture unexplained street / neighborhood-level
    variation

I tried to incorporate as many openly available features as possible (mainly from the BFS; luckily, Switzerland is great when it comes to open data).

4 Likes

Any IPO expected soon?

1 Like

Cool project! May I ask what you plan to do with it once it’s ready? B2C (like Houzy, RealAdvisor, etc.), B2B, or will it just remain a free hobby project?

No real plans for commercialization at the moment. I guess B2C would probably be better suited for something like this, but I have not really thought about it.

1 Like

The rental price would actually be a good signal to feed into the purchase price prediction algo.
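One simple way to use rent as a price signal (a sketch, not the site’s method) is via gross yield: price ≈ annual rent / typical yield for the area, optionally blended with the hedonic estimate. The yield figure and blend weight below are illustrative assumptions.

```python
def implied_price(monthly_rent: float, gross_yield: float = 0.035) -> float:
    """Rent-implied price via an assumed area-typical gross yield."""
    return 12 * monthly_rent / gross_yield

def blended_estimate(hedonic_price, rent_implied, alpha=0.8):
    """Toy blend of the hedonic model's estimate with the rent-implied price."""
    return alpha * hedonic_price + (1 - alpha) * rent_implied
```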