Development Application Tracking & Visualization

We've spent a lot of time over the last few months talking to real estate developers and industry professionals in Toronto. The single most requested feature has been a way to quickly explore the development applications that already exist. This information is extremely useful for understanding the development potential of a site, as it conveys local approval precedents and neighbourhood activity. Unfortunately, it is very difficult to get this data in a useful format. The City of Toronto hosts a search portal, but it is difficult to filter results down to a set that is relevant to a particular address.

Today, we are extremely pleased to announce that development applications can now be viewed in 3D for the entire City of Toronto!

dev applications.png

Information available includes the current application status, proposed storeys, descriptive text and more. Just click on the parcel associated with the application! We have colour-coded each application according to its status (a simple status-to-colour lookup, sketched after the list):

  • Yellow = SPA (Site Plan Approval) / ZBA (Zoning By-law Amendment) / OPA (Official Plan Amendment) application submitted & under review

  • Green = Approved

  • Red = Under Appeal

  • Light Blue = Building Permit applied / under construction

  • Dark Blue = Completed
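
Under the hood, this is just a lookup from application status to a fill colour. A minimal sketch; the status strings and hex values below are illustrative, not the exact values in the city's data or our map style:

```python
# Hypothetical status labels and colours for the map styling.
STATUS_COLOURS = {
    "Under Review": "#E8C547",           # yellow: SPA / ZBA / OPA submitted & under review
    "Approved": "#4CAF50",               # green
    "Under Appeal": "#E53935",           # red
    "Permit / Construction": "#81D4FA",  # light blue: permit applied / under construction
    "Completed": "#1A4F8B",              # dark blue
}

def colour_for(status):
    """Return the fill colour for an application status, grey if unrecognized."""
    return STATUS_COLOURS.get(status, "#9E9E9E")
```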

We have translated the applied-for heights (number of storeys multiplied by 3.5 m) and represented them with a 650 m² cylindrical extrusion in the middle of each site. Please note that this is NOT a literal representation of the applied-for building form but rather an abstraction for understanding scale.
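
In code, that abstraction is tiny. A minimal sketch using Shapely, assuming the parcel geometry is in a metre-based projected coordinate system (the function and constant names are ours, for illustration only):

```python
import math
from shapely.geometry import Polygon

STOREY_HEIGHT_M = 3.5      # assumed floor-to-floor height per storey
FOOTPRINT_AREA_SM = 650.0  # area of the abstract cylinder, in square metres

def application_extrusion(parcel, storeys):
    """Return a 650 m2 circular footprint centred on the parcel, plus its height in metres."""
    radius = math.sqrt(FOOTPRINT_AREA_SM / math.pi)  # ~14.4 m
    footprint = parcel.centroid.buffer(radius)       # circle, approximated as a polygon
    return footprint, storeys * STOREY_HEIGHT_M

parcel = Polygon([(0, 0), (40, 0), (40, 30), (0, 30)])         # toy parcel, metres
footprint, height = application_extrusion(parcel, storeys=12)  # height == 42.0
```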

The technical nuts and bolts:

For a bit of a deep dive into how (and why!) we developed this feature, please read on.

Let’s start with the City of Toronto site for finding development applications: http://app.toronto.ca/DevelopmentApplications/mapSearchSetup.do?action=init.

City Dev App.png

First, the good stuff. The city has all the information in a database with geospatial attributes, and they actively maintain it. They make this site available to anyone, for free.

Now the not-so-good stuff. Because there is a map on the page, a user has the reasonable expectation that the data can be explored via the map. But there is no data shown on the map without first engaging with the filter search on the left side of the page. Even worse, the filter search is not capable of returning all the data in the database at once, and the filtering options presume a sophisticated knowledge of the whole process of making an application for development. "Community Planning" or "Committee of Adjustment"? Maybe what I'm looking for is under "Toronto Local Appeal Body"? Who knows? I am a registered professional planner and I find this interface massively confusing.

So, the first thing to do is scrape the data that this page provides access to and store it in more effective ways. At Ratio.City we lean heavily on PostgreSQL (with the PostGIS extensions) for high-performance querying of geospatial data. We also rely on MapBox, with their excellent map-tile hosting service and their WebGL-enabled 3D mapping framework. We want to translate the data on the city site into both of these formats.

Some structured exploration with the site's filter search (while observing the requests and responses in the developer tools of your favourite web browser) reveals clear patterns in how the data on this site is retrieved and structured. This leads to the most frustrating thing about this site: the returned data IS hierarchical, but it is returned as a flattened list with internal referencing rather than in a natively hierarchical format like XML or, better yet, JSON. Compounding the problem, getting all the available data requires querying each combination of application type and city ward and parsing each response. Much Python scripting was done to manage this querying and parsing until we had nice, tidy JSON records.
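
As a rough illustration, here is the shape of that loop. The endpoint, query parameters, and record keys below are hypothetical stand-ins for what the developer tools actually reveal:

```python
import itertools
import requests

APP_TYPES = ["Community Planning", "Committee of Adjustment", "Toronto Local Appeal Body"]
WARDS = range(1, 26)  # one query per ward; adjust to the city's current ward numbering

SEARCH_URL = "https://app.toronto.ca/DevelopmentApplications/search.do"  # hypothetical

def fetch(app_type, ward):
    """Query one (application type, ward) combination and return the flat record list."""
    resp = requests.get(SEARCH_URL, params={"type": app_type, "ward": ward})  # hypothetical params
    resp.raise_for_status()
    return resp.json()

def nest(flat_records):
    """Rebuild the hierarchy implied by each record's internal parent reference."""
    by_id = {r["id"]: {**r, "children": []} for r in flat_records}  # "id" is a hypothetical key
    roots = []
    for rec in by_id.values():
        parent = by_id.get(rec.get("parentId"))  # "parentId" is a hypothetical key
        (parent["children"] if parent else roots).append(rec)
    return roots

flat = []
for app_type, ward in itertools.product(APP_TYPES, WARDS):
    flat.extend(fetch(app_type, ward))
applications = nest(flat)
```

So where do we store this newly cleaned data?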

EM Screen Shot 2018-09-21 at 5.08.21 PM.png

Fortunately, at Ratio.City we already run a high-performance API on top of our PostGIS database. (For the curious, our APIs are built with Django REST Framework and deployed to Amazon's serverless infrastructure with Zappa, which is an amazing tech combo.) This allows us to simply pass the JSON records we have extracted as the data payload of PUT requests to our API endpoints, and our backend services convert them to database records with indexing that enables extremely fast spatial queries. Our first data transformation is complete!
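
To make that concrete, here is a minimal sketch of the pattern; the model fields and endpoint URL are hypothetical, not our actual schema or routes:

```python
# models.py -- a GeoDjango model backed by PostGIS (hypothetical fields)
from django.contrib.gis.db import models

class DevelopmentApplication(models.Model):
    application_number = models.CharField(max_length=64, unique=True)
    status = models.CharField(max_length=64)
    storeys = models.IntegerField(null=True)
    details = models.JSONField(default=dict)  # the rest of the cleaned record
    location = models.PointField(srid=4326)   # PostGIS point; spatially indexed by default


# load.py -- push each cleaned record to the API (hypothetical endpoint)
import requests

def upload(records, base_url="https://api.example.com/applications"):
    for record in records:
        resp = requests.put(f"{base_url}/{record['application_number']}/", json=record)
        resp.raise_for_status()
```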

Next we need to generate map tiles to host on MapBox. MapBox provides a great command-line tool called Tippecanoe for converting various geospatial file formats to MapBox's .mbtiles format. Keen observers will note that we don't have a geospatial file; we have a bunch of records in a geospatial database. True, and there are a few different ways to deal with this, but our preferred approach is simply to ask our API for all the relevant development application records and dump them out as a GeoJSON file. Then we run this file through Tippecanoe to get our map tiles. The resulting .mbtiles file contains not just the geometric data we want to visualize on our website but also a number of other attributes that we believe our users will be interested in. We upload this file to MapBox and our web application displays it in glorious 3D!
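
Scripted, that step looks roughly like this. The API URL and layer name are hypothetical; the Tippecanoe flags shown are real ones:

```python
import json
import subprocess
import requests

# Hypothetical endpoint: ask our own API for every development application as GeoJSON.
geojson = requests.get("https://api.example.com/applications.geojson").json()
with open("dev_applications.geojson", "w") as f:
    json.dump(geojson, f)

# Convert the GeoJSON dump into an .mbtiles tileset.
subprocess.run(
    [
        "tippecanoe",
        "-o", "dev_applications.mbtiles",  # output tileset
        "-f",                              # overwrite any existing output
        "-zg",                             # guess an appropriate max zoom from the data
        "-l", "dev_applications",          # layer name the web map will reference
        "dev_applications.geojson",
    ],
    check=True,
)
```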

One final wrinkle is that the City of Toronto updates their applications database on a daily basis. All the steps described in this post are tedious to carry out, even if a computer is doing most of the heavy lifting. The last step is to automate the whole process on a virtual machine on AWS. A bash script runs each of the steps in sequence and emails us with a success or failure status every morning at 2am. Voila! A tricky bit of data acquisition and organization running automagically every night while we sleep.
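
Our actual runner is a plain bash script fired by cron (something like 0 2 * * * /opt/refresh_dev_apps.sh). Purely for illustration, the same shape in Python, with hypothetical script names and email addresses:

```python
import smtplib
import subprocess
from email.message import EmailMessage

# The pipeline, one step per script (names are hypothetical).
STEPS = [
    ["python", "scrape_city_portal.py"],
    ["python", "upload_to_api.py"],
    ["python", "export_geojson.py"],
    ["python", "build_and_upload_tiles.py"],
]

def notify(subject, body):
    """Email the team the run's outcome (addresses and SMTP host are hypothetical)."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "pipeline@example.com"
    msg["To"] = "team@example.com"
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

try:
    for step in STEPS:
        subprocess.run(step, check=True)  # stop at the first failing step
except subprocess.CalledProcessError as exc:
    notify("Dev app refresh FAILED", f"Step {exc.cmd} exited with code {exc.returncode}")
else:
    notify("Dev app refresh succeeded", "All steps completed.")
```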

brxxto-346280-unsplash.jpg