The Future of Information Modelling and the End of Theory
Less is Limited, More is Different
The computational design strategist Cynthia
Ottchen, who was previously Head of Research and
Innovation at OMA, offers insights into the future
of building information modelling (BIM). Now in
the Petabyte Age of the data deluge, she argues
that in our adoption of BIM we have to surpass
mere data collection and technical optimisation
and open up new ways of thinking with the
creative use of ‘soft data’.
OMA Research and Innovation Parametrics Cell, Facade Study for the Slab, 2008
This parametric facade strategy experimented with the soft data of sociocultural material through scripts that articulated the building’s internal public zones and registered the site’s external urban pattern, while incorporating technical, functional, structural and material parameters. The series of diagrams shows how the context was translated into data through a process analogous to that of creating a physical impression: 1) the facade grid was conceptually ‘flipped down’ on to the existing ruins of a site; 2) an impression of each building’s location and height was measured; and 3) the data was registered into the corresponding cell of an Excel sheet. A special script was created that keyed this input data to a degree and type of mutation in the facade grid.
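The three steps in the caption can be sketched in code. This is a hypothetical illustration, not OMA’s actual script: the function names, the grid-as-nested-lists stand-in for the Excel sheet, and the simple linear mapping from registered height to mutation degree are all assumptions.

```python
# Hypothetical sketch of the caption's process: sample the heights of
# existing buildings under each cell of the 'flipped-down' facade grid,
# register them in a spreadsheet-like 2-D table, and key each registered
# value to a degree of mutation of the corresponding facade cell.

def register_site_impression(site_heights, rows, cols, cell_lookup):
    """Steps 1-3: build the grid of sampled heights (the 'Excel sheet').

    site_heights maps a site location to a building height; cell_lookup
    maps a grid cell (row, col) to the site location it lands on.
    """
    grid = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            grid[r][c] = site_heights.get(cell_lookup(r, c), 0.0)
    return grid

def mutate_facade(grid, max_mutation=1.0):
    """Key each cell's registered height to a mutation degree in [0, 1]."""
    peak = max(max(row) for row in grid) or 1.0
    return [[max_mutation * h / peak for h in row] for row in grid]
```

Under this sketch, the tallest registered ruin produces the strongest mutation of its facade cell, and empty cells leave the grid untouched.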
options, consider more types of information, and generally be more creative with how we use them. In the Petabyte Age we can ‘view data mathematically first and establish a context for it later’.9 We can incorporate data from many sources, including aesthetics and the sociocultural, political and historical dimensions, because massive data is the new meaning. Rather than necessitating the mere reduction of the qualitative into the quantitative, this approach actually creates the conditions for the optimistic potential of emergence: the cumulative effects of complexity and multiplicity may themselves result in the production of new qualities, which in turn allow new relationships between architecture and culture to emerge.10

Reconceptualised Role
Such an approach implies that the architect’s role should be reconceptualised from that of romantic genius or technological guru to that of multidisciplinary strategist. The new architect is still ultimately responsible for design intent and needs to be able to look at the big picture: to decide which factors to parameterise, to give limits to the parameters, to assign a weight to each factor and to determine the order and method of the information modelling process; in summary, to strategise which factors and methods will be used, how they will be applied or generated, and to judge what they contribute. But, just as in the ‘new, softer science’, algorithms assume the burden of rigorous quantitative methodology and the mind is left ‘free to move around the data in the most creative way’.11 Technically this implies a broader use of creative scripting in the initial stages of design, one that is capable of
OMA Research and Innovation Parametrics Cell, Study for Phototropic Tower, 2008
The surface geometry of this iteration of the experimental project was generated through a digital
‘growth’ process based on recursive fractals and their fragmentation. The trajectories of two
possible ‘branch’ vectors were aligned with the edges of each building facet and the
parametrically limited segmentation of the ‘branches’ created a varied, but algorithmic,
distribution of structural members. The density and size of the facade members on each facet were keyed to both the programme behind it and its general position in the building: public programmes were assigned a more open structure, and areas that were cantilevered or carrying more vertical load required greater structural dimensions for members.
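The recursive ‘growth’ process the caption describes can be sketched as follows. This is a minimal illustration under stated assumptions: the perpendicular branch vectors, the depth limit standing in for the parametrically limited segmentation, and the sizing rule keyed to programme and load are all hypothetical, not the project’s actual script.

```python
# Hedged sketch of a recursive branching growth: from each segment, two
# candidate 'branch' vectors recurse, segmentation is limited by a depth
# parameter, and member size is keyed to programme (public -> more open,
# lighter structure) and to vertical load (heavier -> larger members).

def grow(origin, direction, depth, members, public=False, load=1.0):
    """Recursively grow two branches per segment, collecting members
    as (start, end, size) tuples."""
    if depth == 0:
        return
    end = (origin[0] + direction[0], origin[1] + direction[1])
    # Illustrative sizing rule: public programmes halve member size,
    # greater vertical load scales it up.
    size = load * (0.5 if public else 1.0)
    members.append((origin, end, size))
    left = (direction[1], -direction[0])    # two possible branch vectors,
    right = (-direction[1], direction[0])   # perpendicular to the segment
    grow(end, left, depth - 1, members, public, load)
    grow(end, right, depth - 1, members, public, load)
```

A depth of n produces 2^n − 1 members, so the single depth parameter already gives the varied but algorithmic distribution the caption describes.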
folding in diverse formal strategies and sociocultural data
(such as programmatic volumes and adjacencies, the
external legibility of programme, site history and so on)
beyond the limited scope of, for example, shape
grammars. In terms of software, it suggests that rather
than trying to develop a single and ideal design
environment of embedded tools in a closed feedback
loop, we should strive for seamless interoperability –
without loss of metadata – between scripting protocols,
the more pragmatic stages of BIM and a variety of
analytical software.
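The interoperability the text calls for — moving parametric data between scripting protocols, BIM and analytical software without losing metadata — can be sketched with a neutral serialisation that carries each parameter’s value together with its provenance. The schema below is a hypothetical illustration, not any actual BIM exchange format.

```python
# Minimal sketch of metadata-preserving interchange: every parameter
# travels with its provenance and units, and the round trip through the
# neutral format is lossless. The schema is an illustrative assumption.
import json

def export_model(parameters):
    """Serialise parameters, values and metadata together."""
    return json.dumps(parameters, sort_keys=True)

def import_model(payload):
    """Read the model back with all metadata intact."""
    return json.loads(payload)

model = {
    "solar_weight": {"value": 0.7, "source": "insolation analysis",
                     "units": "dimensionless"},
    "view_weight": {"value": 0.3, "source": "view-foci study",
                    "units": "dimensionless"},
}
assert import_model(export_model(model)) == model  # lossless round trip
```

The point is the design constraint, not the format: whatever tools sit at each stage, the check at the end — that the imported model equals the exported one, metadata included — is what ‘seamless interoperability without loss of metadata’ demands.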
Conclusion
As proponents of BIM we need to acknowledge the
implications of the massive expansion of data and move
on from a performative analytical model to a more
comprehensive conceptualisation of information
modelling that opens up creative options leading to new
qualities and relationships, and does not just streamline a
process. It should expand the ways we use data rather
than merely generating taxonomies or collecting an
envelope of constraints. Some of the best designs have
been those that have broken the rules and gone beyond
technical optimisations or the prescribed constraints of
clients and municipalities. Rather than limiting our
choices, information modelling can open us up to a new way of thinking and its massive potential.
Notes
1. George EP Box and Norman R Draper, Empirical Model-Building and
Response Surfaces, Wiley (New York), 1987, p 424.
2. See Manuel DeLanda, ‘Philosophies of design: The case for modeling
software’, in Verb Processing: Architecture Boogazine, Actar (Barcelona),
2002, pp 131–43.
3. Chris Anderson, ‘The End of Theory: The Data Deluge Makes the
Scientific Method Obsolete’, Wired, August 2008, pp 106–29.
4. Philip Anderson, ‘On My Mind’, Seed, July/August 2008, p 32.
Anderson acknowledges the limits of known laws of physics at larger
scales and primitive levels, but also believes computers have limits of
error when simulating micro-level interactions.
5. ‘Data Set: Dark Matter Observation’, Seed, July/August 2008, p 32.
Only about 5 per cent of the composition of the universe is understood
by scientists for certain. The rest is composed of dark energy and dark
matter about which little is known.
6. Chris Anderson, op cit, p 108.
7. Ibid, p 109. For instance, J Craig Venter’s application of high-speed gene sequencing to entire ecosystems has discovered thousands of previously unknown species of life, based entirely on unique statistical blips rather than on a scientific model or theory.
8. Ibid, p 109.
9. Ibid, p 108.
10. See Gilles Deleuze and Felix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, University of Minnesota Press (Minneapolis), 1987, p 21.
11. Gloria Origgi, ‘Miscellanea: A Softer Science’, 3 July 2008. See http://gloriaoriggi.blogspot.com/2008/07/softer-science-reply-to-chris-andersons.html.

This parametrically designed mixed-use tower project strategically privileges analyses of contextual solar patterns and view foci over normative economic and zoning constraints. Point clouds were developed by processing the following data: sunlight averaged over the year to determine the primary orientation of the building; programmatic data to determine the general floor-plate size and volume; solar insolation data (radiation energy) to determine the first level of deformation of each floor plate; and selected view foci to determine the second level of deformation of each floor plate. Variations of potential building forms were produced by differently weighting the solar and view factors and by the application of varying degrees of faceting (expressed as a percentage) to the point cloud.
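The weighting step in the tower caption above can be sketched as a simple blend. This is a hedged illustration only: the scores, the normalised linear blend and the idea of reading the result as a degree of deformation are assumptions, not the project’s actual algorithm.

```python
# Hypothetical sketch of 'differently weighting the solar and view
# factors': blend the two scores for a floor plate with adjustable
# weights, normalise, and treat the result as a deformation degree.
# Re-weighting the same data yields a different building variation.

def blended_deformation(solar_score, view_score, solar_weight, view_weight):
    """Combine the two factor scores into one deformation degree."""
    total = solar_weight + view_weight
    return (solar_weight * solar_score + view_weight * view_score) / total
```

Running the same scores through different weight pairs is exactly how the caption describes the variations being produced: the data is fixed, and only the strategist’s weighting changes.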
Text © 2009 John Wiley & Sons Ltd. Images © OMA/Rem Koolhaas