How can we interact intuitively with our digital models?
Augmented reality takes our generated parametric model “out” of the screen and “places” it within an actual physical model, here the Nordwestbahnhof in Vienna. Decision makers and collaborators can now interact with the model in a natural way, while still having the power of urban analysis and AI at hand.
We can use devices like the Microsoft HoloLens or tablets to access the model from different parts of the world, and exchange design suggestions at the click of a button. This means that we can collaborate with clients, municipalities and other partners in real time, and globally.
Any change to the model, such as to the street network or to building density, is reflected immediately on all active devices.
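This kind of propagation can be thought of as an observer pattern: every edit to a shared model is pushed to all subscribed devices. The sketch below is a minimal in-process illustration with hypothetical names; a real deployment would run over a networked message bus rather than local callbacks.

```python
# Minimal sketch of model synchronisation: each edit to the shared
# model is broadcast to every subscribed device. All class and field
# names here are illustrative assumptions, not the actual system.

class SharedModel:
    def __init__(self):
        self.state = {"street_network": [], "building_density": {}}
        self._devices = []  # subscribed HoloLens / tablet clients

    def subscribe(self, device):
        self._devices.append(device)

    def apply_change(self, key, value):
        self.state[key] = value          # mutate the shared model...
        for device in self._devices:
            device.notify(key, value)    # ...and push to every device

class Device:
    def __init__(self, name):
        self.name = name
        self.view = {}  # the device's local copy of the model

    def notify(self, key, value):
        self.view[key] = value  # local view updates immediately

model = SharedModel()
hololens, tablet = Device("HoloLens"), Device("Tablet")
model.subscribe(hololens)
model.subscribe(tablet)
model.apply_change("building_density", {"block_3": 2.4})
print(tablet.view == hololens.view)  # both devices see the same state
```

The key design point is that devices never edit their local copies directly; every change flows through the shared model, which keeps all views consistent.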
On top of that, we can generate real-time accessibility analyses, such as walking distances to parks or retail services. The stakeholders involved can plan collaboratively and receive immediate urban analysis results.
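A walking-distance analysis like this can be computed as a multi-source shortest-path search over the street network, starting from all park (or retail) entrances at once. The following is a small sketch under that assumption, with a toy graph in place of real project data:

```python
# Sketch of a walking-distance accessibility analysis: multi-source
# Dijkstra from all park entrances over a street graph, giving each
# node its distance (in metres) to the nearest park. The graph and
# distances are illustrative, not project data.
import heapq

def walking_distance(graph, sources):
    """graph: {node: [(neighbour, metres), ...]}; sources: park nodes."""
    dist = {node: float("inf") for node in graph}
    heap = [(0.0, s) for s in sources]
    for s in sources:
        dist[s] = 0.0
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue  # stale queue entry
        for nbr, metres in graph[node]:
            nd = d + metres
            if nd < dist[nbr]:
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

streets = {
    "A": [("B", 120)],
    "B": [("A", 120), ("C", 80)],
    "C": [("B", 80), ("park", 60)],
    "park": [("C", 60)],
}
print(walking_distance(streets, ["park"]))
# node "A" is 260 m from the nearest park in this toy network
```

Because all sources are seeded into one search, the cost is a single traversal of the network rather than one search per park, which is what makes recomputing the analysis after every design change feasible in real time.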
What is more, we can distribute retail manually, or we can simulate its distribution by estimating retail location choices and deriving attractiveness rankings for each street. In this way, we depart from a purely top-down master-planning approach that focuses on defining land use, and instead aim to understand and plan with the market forces already inscribed in the city.
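One common way to estimate such rankings is a gravity-style model, where nearby residents contribute more to a street's attractiveness than distant ones. The sketch below illustrates that idea only; the decay exponent, the scoring function and the toy data are assumptions, not the project's actual model.

```python
# Illustrative gravity-style attractiveness ranking for streets:
# each street's score sums resident counts weighted by an inverse
# power of distance (distance decay). beta and the toy coordinates
# are assumptions for the sake of the example.

def attractiveness(streets, residents, beta=1.5):
    """streets: {name: (x, y)}; residents: [(x, y, count), ...]."""
    scores = {}
    for name, (sx, sy) in streets.items():
        score = 0.0
        for rx, ry, count in residents:
            d = max(((sx - rx) ** 2 + (sy - ry) ** 2) ** 0.5, 1.0)
            score += count / d ** beta  # gravity term: count * d^-beta
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)

streets = {"Main St": (0, 0), "Side St": (500, 500)}
residents = [(10, 0, 300), (480, 500, 120)]
print(attractiveness(streets, residents))
# → ['Main St', 'Side St']
```

Ranking streets this way, rather than assigning land use outright, is what lets the tool suggest where retail is likely to thrive instead of merely where it is permitted.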
Through the use of augmented reality, we can quickly adjust our designs and get real-time performance results for them. This allows for a fine-grained decision-making process, as we can consider the impact on multiple fields of interest simultaneously and collaboratively.