The typical architectural design process is based on the designer's interpretation of, and the connections established between, sets of rules and restrictions, driven by the designer's intelligence and the client's demands. It is a common opinion that good design is simply an answer to a set of questions, a problem-solving process involving economic, historical, aesthetic, technological, cultural and many other aspects blooming around each step the designer needs to make. One of the overlapping steps is the architectural programming process, which is, to use a highly simplified parallel, the part in which the designer sets up programs for each space in the building, considering connections between spaces, users' demands, the relation between inside and outside, etc. Successful results of the programming process are often based on the designer's knowledge of building typologies1, personal experience, ability to play out certain scenarios, and technological restrictions, as well as the so-called aesthetic touch. Certain periods in architectural history were marked by overlapping tendencies: following architectural guidebooks, or a frontal attack on the traditional way of thinking about architectural typology2. Nevertheless, one thing was always common: a complicated tree of connections emerges every time a multilevel building plan is created. The more technically restricted the building (e.g. a hospital), the higher the complexity level involved in the programming process.
Thanks to computational possibilities, there is a growing number of tools prepared to help designers comprehend the complexity of the programming process. Aedas R&D (http://aedasresearch.com/) developed a series of applications to help architects visualise, manage and understand relationships between designed parts of a building, or any other complex system of relations (pic. 01). The generative program-finding approach is more and more present, not only as a tool but as an integral, heuristically driven part of the computational approach to architectural design. The New Media School contest entry for the ACADIA International Competition "Expanding Bodies" 2007 by MisoSoupDesign is one example of how programming can be literally integrated into the overall design as part of a bottom-up design process. In this example, the conventional program assembly process, represented by cubical volumes, was directly translated into the building's structure and appearance through a 3D Voronoi algorithm. A similar process was undertaken by members of Collective Architects in a Hybrid study design project at Warsaw University of Technology, Poland (pic. 02).
INTEGRATED PROGRAM FINDING PROCESS
Regardless of the undisputed usefulness of the mentioned tools and their partial integration in the design process, the genesis of spatial distribution within the building is still controlled in a top-down convention by the designer. In other words, those generative systems are capable of being fed by something more than just the designer's subjective decision-making process. Space distribution, appearance and building performance can be linked directly to the first stage of the building morphogenesis3 process, in a situation where the building "understands" its location and "grows" its program based on data acquired from the site. This kind of approach, while still unable to comprehend all aspects requiring consideration during the programming process, has growing research and study value for the future development of architectural design.
"Computer-based models allow complex exploration not possible with the real system.(...) Such models can provide existence proof, which show that given mechanisms are sufficient to generate a given phenomenon. They can also suggest critical patterns and interesting hypotheses to the prepared observer(...)"
John H. Holland
To understand the true value of a building morphogenesis process based on integrated programming, we need to set some relevant aspects straight. The first step we need to take in order to perform building evolution is to simulate its environment. Two key aspects are highly relevant here. 1) To simulate4 does not mean to faithfully recreate the existing environment. Every computer simulation is based on input data selection, generalisation and an attempt to underline certain factors. 2) Creating a computational, data-based environment for building evolution is the root of a new kind of contextualism in architecture. Every period in architectural history was driven by the context of its times. Even attempts to ignore context (read more in Rem Koolhaas's writings) are somehow driven by the present cultural state of mind in society. In other words, context can be described as anything emerging from the philosophy of a given time; ignoring context creates a social context of separateness and ignorance. The second step is to assure differences within the simulated environment. The notion of difference as a noumenon5 is highly important and will be further explored in the coming article "Far From Equilibrium Architecture"; for now, the only thing we need to keep in mind, as a dogmatic assumption, is that the differences within the environment are responsible for every state of matter, interaction and emergence process we are looking for. The third step is gathering and processing the information acquired from the simulation. This can be done in many different ways; the point is that data from the simulation in its pure form (in our particular case, a very narrow part of a complex active system) is not enough to proceed with form generation. In full-scale cas6 simulations, data can be directly connected with the form, but this approach requires a broader-scale simulation including evolution and adaptation of the system, as well as function diversification.
In our case we are going to disregard the extensive characteristics of complex adaptive systems (cas), looking only at one aspect of cas: agent motion (flow) within the simulated environment.
Summing up: the integrated programming process requires simulating the building's environment in computational space, creating a new context for the building to grow from. In our particular case, we are going to take a closer look at a narrow branch of agent-based simulation7 within a simulated environment. Agents interact thanks to differences in their characteristics as well as in the environment, and the data acquired from this interaction is post-produced in a form-generation process.
Based on observation and empirical research, the initial physical location of the investigation site was established and rebuilt as a boundary for agent interaction within echo. Agent emission into echo8 is based on key emitter points representing physical entrances to the site as well as building entrances on the site. These points (emitters) are responsible for creating agents and for establishing their possible destinations in their future movement.
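The emitter mechanism described above can be sketched in a few lines of code. This is a minimal illustration, not the project's actual implementation; all emitter names, coordinates and dictionary fields are assumptions made for the example.

```python
import random

# Hypothetical emitter points: physical entrances to the site and to the
# building, each mapped to a position on the site plan (assumed coordinates).
EMITTERS = {
    "site_entrance_north": (0.0, 50.0),
    "site_entrance_south": (0.0, -50.0),
    "building_entrance":   (20.0, 0.0),
}

def emit_agent(origin_name, rng=random):
    """Create an agent at an emitter and pick any other emitter as its goal."""
    candidates = [n for n in EMITTERS if n != origin_name]
    destination = rng.choice(candidates)
    return {
        "position": EMITTERS[origin_name],
        "origin": origin_name,
        "destination": destination,
        "goal": EMITTERS[destination],
    }

agent = emit_agent("site_entrance_north")
```

In this sketch each emitter both spawns agents and serves as a possible destination for agents spawned elsewhere, mirroring the double role the text assigns to emitter points.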
The next step in establishing echo is providing resources. In our case, resources are environmental and spatial parameters within the simulation's boundaries. Based on daily observation of the site, emission points for resources such as green area centres, existing communication nodes, water etc. have been programmed to provide information (resources) about noise, temperature, sunlight, etc. These parameters are stratified through two-hour daytime interpolation as well as a division between the seasons of the year (pic. 03). Additional statistical information about weather conditions at the seasonal scale was acquired from The Royal Netherlands Meteorological Institute Annual Weather Reports (http://www.knmi.nl/), as well as through sunlight and shading analysis performed in environmental analysis software9. (mov. 01)
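The two-hour stratification of resource parameters can be read as simple linear interpolation between sampled values. The sketch below assumes a noise level sampled every two hours at one resource point; the sample values are invented for illustration and do not come from the project's data.

```python
# Each resource emitter stores one intensity sample per two-hour slot
# (hours 0, 2, ..., 24 -> 13 samples); intermediate times are interpolated.

def interpolate(samples, hour):
    """Linearly interpolate between two-hour samples (hour in [0, 24])."""
    lo = int(hour // 2) * 2          # lower sampled hour
    hi = min(lo + 2, 24)             # upper sampled hour
    t = (hour - lo) / 2.0 if hi != lo else 0.0
    return samples[lo // 2] * (1 - t) + samples[hi // 2] * t

# Assumed noise intensity near a traffic node, sampled every two hours.
noise = [0.1, 0.1, 0.2, 0.5, 0.9, 0.8, 0.7, 0.8, 0.9, 0.7, 0.4, 0.2, 0.1]

morning_peak = interpolate(noise, 8)    # value exactly at a sample point
between = interpolate(noise, 7.0)       # halfway between hours 6 and 8
```

The same scheme extends to the seasonal division mentioned in the text by keeping one such sample list per season.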
An echo environment is of no use until it is populated with elements able to act on data acquired from the area of interaction. The generated agents, in our particular case, are representations of humans moving through the site. On its route to a destination, each agent is driven by certain desires common to all agents but multiplied by a value characteristic of its particular agent type. The particular needs of a particular agent are also stratified through the two-hour interpolated daytime, as well as changing environmental factors based on the season of the year. Initial location, amount, kind and destination: all these factors are linked to the parametric echo environment.
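The "common desires multiplied by a type-specific value" idea can be sketched as a shared base profile scaled per agent type and per two-hour slot. All profile names, multipliers and time factors below are assumptions for illustration, not values from the study.

```python
# Desires shared by all agents (assumed profile).
BASE_DESIRES = {"shade": 0.4, "green": 0.6, "quiet": 0.5}

# Assumed per-type multipliers: kids seek stimulation more strongly, etc.
TYPE_MULTIPLIER = {"kid": 1.4, "adult": 1.0, "senior": 0.8}

# Assumed intensity factor per two-hour daytime slot (keyed by start hour).
TIME_FACTOR = {8: 0.7, 10: 0.9, 12: 1.2, 14: 1.2, 16: 1.0}

def desires(agent_type, hour_slot):
    """Scale the common desire profile by agent type and time of day."""
    m = TYPE_MULTIPLIER[agent_type] * TIME_FACTOR[hour_slot]
    return {name: value * m for name, value in BASE_DESIRES.items()}

midday_kid = desires("kid", 12)
```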
As mentioned before, the aspect of our system considered in this particular case is agent flow. Driven by local (personal) desires, agents wander through the site, attracted or repelled by environmental resources, to finally reach their destination (mov. 02). There is a finite number of agents, and their starting and destination points are likewise limited; nevertheless, the possible movement path is undefined and slightly different for every simulation made using the same set-up. This is because at every movement step the agent's trajectory is a sum of local decisions. Tracing this kind of movement at the broader scale of many agents interacting at the same time, and overlapping the path traces resulting from simulations made under different global set-ups, generates characteristic patterns to be further interpreted by the designer operating the generative system.
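The "trajectory as a sum of local decisions" can be illustrated with a basic steering step: a unit pull toward the destination plus attraction/repulsion forces from resources, weighted by type-specific preferences. This is a generic steering sketch under assumed weights and positions, not the simulation actually used in the project.

```python
import math

# Assumed desire multipliers per agent type: kids are strongly drawn to
# green areas and repelled by noise, adults less so.
TYPE_WEIGHTS = {
    "kid":   {"green": +1.5, "noise": -1.0},
    "adult": {"green": +0.5, "noise": -0.3},
}

# Assumed resource layout: (kind, position, strength).
RESOURCES = [
    ("green", (10.0, 10.0), 1.0),
    ("noise", (-5.0, 0.0), 1.0),
]

def step(pos, goal, agent_type, speed=1.0):
    """One movement step: goal pull plus weighted resource forces."""
    fx, fy = goal[0] - pos[0], goal[1] - pos[1]
    d = math.hypot(fx, fy) or 1.0
    fx, fy = fx / d, fy / d                       # unit pull toward goal
    for kind, (rx, ry), strength in RESOURCES:
        w = TYPE_WEIGHTS[agent_type].get(kind, 0.0) * strength
        dx, dy = rx - pos[0], ry - pos[1]
        rd = math.hypot(dx, dy) or 1.0
        fx += w * dx / (rd * rd)                  # inverse-distance falloff
        fy += w * dy / (rd * rd)
    n = math.hypot(fx, fy) or 1.0                 # normalise the sum
    return (pos[0] + speed * fx / n, pos[1] + speed * fy / n)

new_pos = step((0.0, 0.0), (50.0, 0.0), "kid")
```

Because each step re-evaluates the local forces, two runs with slightly different agent mixes trace slightly different paths even under the same global set-up, which is exactly the variability the text describes.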
This level of visual data simplification includes all environmental and behavioural information in a kind of curve-based diagram. The overlapping colourful lines carry information about agent type, destination, starting point, and the site resources in relation to agent preferences at the informational level, and interesting, inspiring patterns at the global level of visual appearance. (pic. 04)
"Who knows why people do what they do? With high enough numbers, the data speaks for itself"
Having information about agent movement through the site, a further synthesis of information useful for architectural programming is required. Each curve can be interpolated into a set of points, each embedded with information about the actual state of the agent and its type. The translation from curves to point-sets is a purely geometrical transformation, increasing the readability of the diagram and underlining important patterns created by the point-set (pic. 05). The outcome of this transformation is clusters of over 1000 points marking different types of agents at their positions, and this is where a direct parallel to architectural function programming is built. Being able to read statistical information for each point about what is going on around it, in terms of the kinds of surrounding agents, creates an open-ended10 set of possibilities for building function-finding rules. Let us say, for example, that a certain area has, within a certain radius, mostly agent types representing kids, and that the total number of kids + youth is higher than the total number of all other agents; then we can predict that this area will be populated mostly by young people having certain desires. This is only an example condition; the point is that with massive data covering over 180,000 possible combinations between agents, and the ability to check statistical data for each of those combinations, the possible top-down11 interpretations of the data are endless.
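The example rule given in the text (kids + youth outnumbering all other agent types within a radius) translates directly into a counting query over the point-set. The points, radius and program labels below are invented for illustration.

```python
from math import hypot

# Assumed point-set: (x, y, agent_type) samples interpolated from the
# movement curves, standing in for the >1000-point clusters in the text.
points = [
    (0.0, 0.0, "kid"), (1.0, 0.5, "youth"), (0.5, 1.0, "kid"),
    (0.8, 0.2, "adult"), (6.0, 6.0, "senior"), (7.0, 6.5, "adult"),
]

def label_area(center, radius, points):
    """Count agent types within the radius and apply the youth-majority rule."""
    counts = {}
    for x, y, agent_type in points:
        if hypot(x - center[0], y - center[1]) <= radius:
            counts[agent_type] = counts.get(agent_type, 0) + 1
    young = counts.get("kid", 0) + counts.get("youth", 0)
    others = sum(v for k, v in counts.items() if k not in ("kid", "youth"))
    return "youth_program" if young > others else "general_program"

label = label_area((0.5, 0.5), 1.0, points)   # -> "youth_program"
```

Any number of such predicates can be stacked over the same point-set, which is what makes the rule space open-ended in the sense the text describes.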
Thanks to the computational potential of programming, writing a script capable of real-time data interpretation based on predefined rules, and assigning program locations enriched with additional spatial information, is an easy thing to do. An example application (mov. 03) of this kind of program-finding system, including connection generation, has been developed during this research and has been the main tool in further architectural form generation.
While still marked by a high level of over-formalisation, integrated bottom-up architectural programming opens up broad possibilities in the research area as well as for further development and implementation in the actual process of architectural design. Broader practical application requires developing a complex adaptive system with the full range of characteristics responsible for direct form finding and evolution. Adaptation and evolution of the system during the simulation are needed in order to develop a higher complexity not available in the narrow patch of path-tracing analysis. Nevertheless, exploration of the bottom-up12 approach in architectural programming might be the starting point for developing higher orders of building complexity impossible to achieve within the traditional approach. Also highly attractive is the perspective of linking environmental data and structural analysis systems as a feed for the overall system. An efficient, environmentally friendly and low-cost building can be described with extreme, open-source, code-based precision, which in turn generates possibilities for the specialist input of every discipline involved in architectural design. Having no hard boundaries between the aspects integrated into architectural design, but rather smooth parallel cooperation based on one and the same building genotype, modified individually yet in a coherent way, is a thing worth going for.
| MARIUSZ G. POLSKI Eng. Arch. |
Steven Johnson "Emergence" Penguin Books, 2002
John H. Holland "Hidden Order. How Adaptation Builds Complexity" Helix Books, 1996
"Deleuze and the Open-Ended Becoming of the World" article by Manuel De Landa, first print 1999
"The Future of Information Modelling and the End of Theory" article by Cynthia Ottchen, AD vol. 79 No 2, pages 22-27
http://www.misosoupdesign.com/feidad/NewMediaSchool/ - visited on 28.02.2011
http://www.acadia.org/ - visited on 28.02.2011
http://www.misosoupdesign.com/ - visited on 28.02.2011
1. typology in general refers to the study of types; in architecture and urban planning it is the taxonomic classification of (usually physical) characteristics commonly found in buildings and urban places, according to their association with different categories, such as intensity of development (from natural or rural to highly urban), degrees of formality, and school of thought (for example, modernist or traditional). Individual characteristics form patterns. Patterns relate elements hierarchically across physical scales (from small details to large systems) (source: wikipedia.org)
2. read more about the post-Second World War discourse in Great Britain between the neo-Palladians and the New Brutalists (Colin Rowe's neo-classical approach as an attack on modern architecture, and Peter Smithson's brutalism as an extension of the modern tradition in architecture)
3. morphogenesis (from the Greek morphê shape and genesis creation, literally, "beginning of the shape"), is the biological process that causes an organism to develop its shape. It is one of three fundamental aspects of developmental biology along with the control of cell growth and cellular differentiation. Digital morphogenesis is a process of shape development (or morphogenesis) enabled by computation. While this concept is applicable in many areas, the term "digital morphogenesis" is used primarily in architecture. (source: wikipedia.org)
4. simulation is the imitation of some real thing, state of affairs, or process. The act of simulating something generally entails representing certain key characteristics or behaviours of a selected physical or abstract system. (source: wikipedia.org)
5. noumenon (from Gr. νοούμενoν, present participle of νοέω "I think, I mean"; plural: νοούμενα - noumena) is a posited object or event that is known (if at all) without the use of the senses. Classically, the noumenal realm is the higher reality known to the philosophical mind. However, the term is better known from the philosophy of Immanuel Kant, where noumena are regarded as unknowable to humans. The term is generally used in contrast with, or in relation to, "phenomenon", which in philosophy refers to anything that appears, or objects of the senses. Here used as a reference to Deleuze's philosophical approach as interpreted by Manuel De Landa in the article "Deleuze and the Open-Ended Becoming of the World" (source: wikipedia.org)
6. cas (complex adaptive systems) are special cases of complex systems. They are complex in that they are dynamic networks of interactions and relationships not aggregations of static entities. They are adaptive in that their individual and collective behaviour changes as a result of experience. Here used in a reference to John H. Holland work where cas describes complex, self-similar collection of interacting adaptive agents that interact and adapt or learn in order to achieve higher level of organisation. (source: wikipedia.org)
7. agent-based simulation is based on an agent-based model (ABM) (also sometimes related to the term multi-agent system), a class of computational models for simulating the actions and interactions of autonomous agents (either individual or collective entities, such as organizations or groups) with a view to assessing their effects on the system as a whole. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to introduce randomness. ABMs are also called individual-based models. (source: wikipedia.org)
8. echo is a simulation tool developed to investigate mechanisms which regulate diversity and information-processing in systems comprised of many interacting adaptive agents, or complex adaptive systems (CAS). Echo agents interact via combat, trade and mating and develop strategies to ensure survival in resource-limited environments. Individual genotypes encode rules for interactions. In a typical simulation, populations of these genomes evolve interaction networks which regulate the flow of resources. Resulting networks resemble species communities in ecological systems. Flexibly defined parameters and initial conditions enable researchers to conduct a range of "what-if" experiments.
9. environmental analysis software is a comprehensive concept-to-detail sustainable building design tool, usually offering a wide range of simulation and building energy analysis functionality that can improve the performance of existing buildings and new building designs. Online energy, water, and carbon-emission analysis capabilities integrate with tools that enable designers to visualize and simulate a building's performance within the context of its environment. An example is Autodesk Ecotect Analysis
10. open-ended refers to dynamic situations or scenarios that allow the individual to determine the outcome while having a limited range of possible moves in a single step.
11.12. Top-down and bottom-up are strategies of information processing and knowledge ordering, mostly involving software, but also other humanistic and scientific theories (see systemics). In practice, they can be seen as a style of thinking and teaching. In many cases top-down is used as a synonym of analysis or decomposition, and bottom-up of synthesis. (source: wikipedia.org)
| text based on |
| HYPERBODY DESIGN STUDIO | Msc 1 | infoMATTERS |
| authors | MARIUSZ G. POLSKI | KEN ZONG | BRETA BISHOP |
| tutors | Dr. NIMISH BILORIA | Ir. HANG FENG |