space syntax 3 - theory vs practice and ABM

During his lecture at our university Marvin Malecha said: "In theory there is no difference between theory and practice; in practice there is." Funnily enough, this famous quote by an American athlete is so universal that it can be applied everywhere - space syntax and ABM included.

The world of these analyses is truly attractive. Wouldn't it be marvellous to have tools which could make judgements about spatial experience objective and simulations of dangerous situations possible? For the needs of this entry, let's say it would. Unfortunately, when you go deeper and deeper into the subject, you can feel a little bit like at a Peter Eisenman project presentation session. You see graphs and maps which represent, for example, the relation between metro accessibility and apartment prices, and there is no doubt that this is a rather sophisticated tool. But when you want to produce this sort of map yourself, it is not so easy to find a clear explanation of how it was done. So can we trust it?

Going even deeper into the subject, I made two discoveries. The first, encouraging one, was that the people who investigate these subjects are generally more scientists than magicians. At least some of them present the mathematical rules used in their calculations. Some of them do not show the backstage, but they are still rather convincing: http://www.geosimulation.org/geosim/abms.htm. As far as ABM is concerned, the second discovery is not so encouraging... nothing is really tested yet, and if you want to do something you basically have to do it yourself. Only very simple things are available to the public. What I mean is that you basically have to write the program yourself.

http://www.swarm.org/index.php/Main_Page

so it is rather a philosophy of simulating systems than a set of tools

even if you go to

http://www.swarm.org/index.php/Tools_for_Agent-Based_Modelling

and you expect to get some simple tool, you will mostly find fully programmable environments

like http://education.mit.edu/starlogo/

so it is a great thing if you really want to do something and have full control over it, but if you are an architect and you just want to examine your proposals, it is simply too much trouble.

Most of this software is home-made, which is why it is not stable. And if you find something more user-friendly, like

http://www.xjtek.com/

it is not free anymore...

CONCLUSION

If for each model I basically have to develop my system from scratch, maybe it is better to just do the math by myself and present the key points of the solution instead of becoming a programmer, because in a few years standard architectural packages will have simple ABM analysis built into them. It is a truly interesting topic for a group of researchers from various fields, but if somebody is focused on designing, it is probably not the most important thing to invest time in.

1 comment

SPACE SYNTAX 2



SPACE SYNTAX
FLOW SIMULATION METHODS: CHARACTERISTICS AND COMPARISON
There are two main approaches to flow simulation: agent-based methods
and space syntax methods.
Within agent-based methods we can find different solutions related to the particular scale and purpose of the analysis.
- big models: land use and transportation models (LUTM, TRANSIMS)
http://www.geosimulation.org/geosim/abms.htm (including interesting software links and some lectures)

- small models: Drager, Galea, Helbing
http://www.youtube.com/watch?v=zhTnlxhq0Tc&feature=player_embedded
unfortunately the movie itself is not very communicative, so you will probably have to consult the publications, which are not free :(
http://www.soms.ethz.ch/
Space syntax tools, on the other hand, use various space calculation models. The problem with these calculations is that the solution should be extremely simple and still work for various space types (from orthogonal grids to organic structures). As far as agent methods are concerned, the systems are populated with generations of agents who have absolute knowledge about their environment, which is very far from reality. Neither pure methodology
seems satisfying, but merging the two approaches into one gives an analysis which reflects about 80% of movement in a homogeneous environment like a shopping mall.
Having in mind how poor a data set the system works on, I want to examine
existing and designed models using this type of methodology.


isovist and visibility graph





algorithm for agents incorporating visibility analysis








The connection between the two models could look like that;
unfortunately, at the moment I can't get access to software which uses this algorithm.
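
Since I can't reach that software, below is only my own rough Python sketch of how such a connection might work, assuming the visibility graph has already been pre-computed for the whole layout; the "random next step among visible cells" rule is a crude simplification of the published agent algorithm, and every name and parameter here is hypothetical.

import random

# Hypothetical sketch (my own names): agents walking on a pre-computed
# visibility graph. 'visibility' maps each cell to the set of cells that can
# be seen from it; each agent repeatedly jumps to a randomly chosen visible
# cell, as a crude stand-in for the 'random next step' rule.
def simulate_agents(visibility, start_cells, steps=100, seed=1):
    rng = random.Random(seed)
    footfall = {cell: 0 for cell in visibility}   # how often each cell is entered
    agents = list(start_cells)
    for _ in range(steps):
        for i, cell in enumerate(agents):
            options = list(visibility[cell])
            if not options:
                continue                          # nothing visible, agent stays put
            agents[i] = rng.choice(options)       # random next step among visible cells
            footfall[agents[i]] += 1
    return footfall

The footfall counts could then be laid over the space syntax maps to check how well the two models agree.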

0 comments

Space Syntax - universal spatial rules

So far we don't have mathematical rules for making architecture or urbanism, but there are mathematical tools which may be helpful in the design process. Space syntax is a set of methodologies which describe the effect of space geometry. Having in mind that this system presents just a part of reality, not taking into account many other factors and aspects of space, I am starting my research on this still-growing movement. This first blog entry consists mainly of a collage of texts from the net.


Abstract - cities are similar: they consist of spaces and barriers, movement and occupation

Space Syntax is an advanced spatial technology as well as a highly influential theory of architecture and town planning. It was originally developed by Professor Bill Hillier and his colleagues at University College London (UCL).
Through over twenty years of research-informed consulting, they have developed a powerful computer-based modelling technique that demonstrates the key role of spatial layout in shaping patterns of human behaviour. These patterns include movement on foot, on cycles and in vehicles; wayfinding and purchasing in retail environments; vulnerability and criminal activity in buildings and urban settings; co-presence and communications in the workplace.
Space syntax derives from a set of analytic measures of configuration that have been shown to correlate well with how people move through and use buildings and urban environments. Space syntax represents the open space of an environment in terms of the intervisibility of points in space. The measures are thus purely configurational, and take no account of attractors, nor do they make any assumptions about origins and destinations or path planning. Space syntax has found that, despite many proposed higher-level cognitive models, there appears to be a fundamental process that informs human and social usage of an environment. In this paper we describe an exosomatic visual architecture, based on space syntax visibility graphs, giving many agents simultaneous access to the same pre-processed information about the configuration of a space layout. Results of experiments in a simulated retail environment show that a surprisingly simple 'random next step' based rule outperforms a more complex 'destination based' rule in reproducing observed human movement behaviour. We conclude that the effects of spatial configuration on movement patterns that space syntax studies have found are consistent with a model of individual decision behaviour based on the spatial affordances offered by the morphology of the local visual field.


The general idea is that spaces can be broken down into components, analyzed as networks of choices, then represented as maps and graphs that describe the relative connectivity and integration of those spaces.

BASIC SPACE CONCEPTS

AN ISOVIST, also known as a viewshed or visibility polygon, is the field of view from any particular point.
A single isovist is the volume of space visible from a given point in space, together with a specification of the location of that point. Isovists are naturally three-dimensional, but they may also be studied in two dimensions: either in horizontal section ("plan") or in other vertical sections through the three-dimensional isovist. Every point in physical space has an isovist associated with it.
The boundary-shape of an isovist may or may not vary with location in, say, a room. If the room is convex, for example (like a rectangle or circle), then the boundary-shape of every isovist in that room is the same; and so is its volume (or area, if we are thinking in plan). But the location of the viewpoint relative to the boundary would or could be different. If the room were non-convex, however (say an L-shaped room, or a rectangular room with partitions), then there would be many isovists whose volume (area) would be less than the whole room's, and perhaps some equal to it; and many would have different, perhaps unique shapes... large and small, narrow and wide, centric and eccentric, whole and shredded.
One can also think of the isovist as the volume of space illuminated by a point source of light.
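
To make the idea more concrete for myself, here is a minimal Python sketch of a 2D isovist, assuming the plan is rasterised into an occupancy grid (all the names are mine, not from any space syntax package): rays are cast from the viewpoint and stopped at the first wall, and the cells they reach approximate the visibility polygon.

import math

# Minimal sketch: approximate a 2D isovist on a boolean occupancy grid.
# grid[y][x] == True means a solid cell (wall), False means open space.
def isovist_cells(grid, origin, n_rays=360, max_dist=200.0, step=0.5):
    h, w = len(grid), len(grid[0])
    ox, oy = origin
    visible = set()
    for i in range(n_rays):
        angle = 2.0 * math.pi * i / n_rays
        dx, dy = math.cos(angle), math.sin(angle)
        dist = 0.0
        while dist < max_dist:
            cx, cy = int(ox + dx * dist), int(oy + dy * dist)
            if not (0 <= cx < w and 0 <= cy < h):
                break                       # ray left the plan
            if grid[cy][cx]:
                break                       # ray hit a wall
            visible.add((cx, cy))           # open cell seen from the origin
            dist += step
    return visible

# Example: a 10 x 10 room with a partition; the isovist area is len(visible).
room = [[False] * 10 for _ in range(10)]
for y in range(2, 8):
    room[y][5] = True                       # vertical partition at x = 5
print(len(isovist_cells(room, (2, 2))))

Computing this for every open cell and linking mutually visible cells is one way to arrive at the visibility graph mentioned above.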

AXIAL SPACE a straight sight-line and possible path.

CONVEX SPACE an occupiable void where, if imagined as a wireframe diagram, no line between two of its points goes outside its perimeter(???)

These components describe how easily navigable any space is.
Space syntax has also been applied to predict the correlation between spatial layouts and social effects such as crime, traffic flow, sales per unit area, etc.
In general, the analysis uses one of many software programs that allow researchers to analyse graphs of one (or more) of the primary spatial components.

ANALYSIS METHODS

INTEGRATION measures how many turns one has to make from a street segment to reach all other street segments in the network, using shortest paths. If the number of turns required for reaching all segments in the graph is analyzed, then the analysis is said to measure integration at radius 'n'. The first intersecting segment requires only one turn, the second two turns and so on. The street segments that require the least number of turns to reach all other streets are called 'most integrated' and are usually represented with hotter colors, such as red or yellow. Integration can also be analyzed at a local scale, instead of the scale of the whole network. In the case of radius 4, for instance, only four turns are counted departing from each street segment. Theoretically, the integration measure shows the cognitive complexity of reaching a street, and is often argued to 'predict' the pedestrian use of a street. It is argued that the easier it is to reach a street, the more popularly it should be used. While there is some evidence of this being true, the method is also biased towards long, straight streets that intersect with lots of other streets. Such streets, like Oxford Street in London, come out as especially strongly integrated. However, a slightly curvy street of the same length would typically not be counted as a single line, but instead be segmented into individual straight segments, which makes curvy streets appear less integrated in the analysis.
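
As a sanity check of my own understanding, here is a small Python sketch of the raw "turn depth" behind integration, assuming the axial map is already reduced to a graph where two lines are connected if they intersect; real space syntax software normalises this mean depth further, which I skip here.

from collections import deque

# Sketch (my own simplification): mean number of turns from one axial line /
# street segment to the others. 'graph' maps each segment to the segments it
# intersects; crossing to an intersecting segment counts as one turn.
def mean_depth(graph, start, radius=None):
    depth = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if radius is not None and depth[node] >= radius:
            continue                       # stop expanding beyond the local radius
        for nxt in graph[node]:
            if nxt not in depth:
                depth[nxt] = depth[node] + 1
                queue.append(nxt)
    others = [d for n, d in depth.items() if n != start]
    return sum(others) / len(others) if others else float('inf')

# Toy axial map: a long 'high street' A crossed by four side streets.
axial = {'A': {'B', 'C', 'D', 'E'},
         'B': {'A'}, 'C': {'A'}, 'D': {'A'}, 'E': {'A'}}
for line in axial:
    print(line, round(mean_depth(axial, line), 2))   # lower depth = better integrated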

CHOICE measure is easiest to understand as a 'water-flow' in the street network. Imagine that each street segment is given an initial load of one unit of water, which then starts pouring out of the starting street segment onto all the other segments that successively connect to it. Each time an intersection appears, the remaining value of flow is divided equally amongst the splitting streets, until all the other street segments in the graph are reached. For instance, at the first intersection with a single other street, the initial value of one is split into two remaining values of one half, and allocated to the two intersecting street segments. Moving further down, the remaining one half value is again split among the intersecting streets and so on. When the same procedure has been conducted using each segment as a starting point for the initial value of one, then a graph of final values appears. The streets with the highest total values of accumulated flow are said to have the highest choice values. Like Integration, Choice analysis too can be restricted to limited local radii, for instance 400m, 800m, 1600m etc. Interpreting Choice analysis is trickier than Integration. Space Syntax argues that these values often predict the car traffic flow of streets. However, strictly speaking, Choice analysis can also be thought to represent the amount of intersections that need to be crossed to reach a street. However, since flow values are divided, not subtracted at each intersection, the output shows an exponential distribution. It is considered best to take a log of base two of the final values in order to get a more accurate picture.
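
The water-flow description above can be followed almost literally; the sketch below is only that literal reading (real software computes choice as a betweenness-type count), with the extra simplification that each segment is fed only by the first branch that reaches it.

from collections import deque
from math import log2

# Sketch of the 'water-flow' reading of choice described above.
def choice_values(graph):
    accumulated = {node: 0.0 for node in graph}
    for start in graph:                            # pour one unit from every segment
        flow = {start: 1.0}
        visited = {start}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            downstream = [n for n in graph[node] if n not in visited]
            if not downstream:
                continue
            share = flow[node] / len(downstream)   # split equally at the intersection
            for nxt in downstream:
                flow[nxt] = share
                accumulated[nxt] += share
                visited.add(nxt)
                queue.append(nxt)
    # log base two of the totals, as suggested above (+1 only to avoid log(0))
    return {n: log2(v + 1.0) for n, v in accumulated.items()}

streets = {'A': {'B', 'C'}, 'B': {'A', 'C'}, 'C': {'A', 'B', 'D'}, 'D': {'C'}}
print(choice_values(streets))                      # 'C' collects the most flow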

DEPTH DISTANCE is the most intuitive of the three analysis methods: it expresses the linear distance from the center point of each street segment to the center points of all the other segments. If every segment is successively chosen as a starting point, then a graph of cumulative final values is achieved. The streets with the lowest Depth Distance values are said to be nearest to all the other streets. Again, the search radius can be limited to any distance.
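
The metric version is just a shortest-path computation; a minimal sketch, assuming the network is given as segment centres with the distances between connected centres:

import heapq

# Sketch: total metric distance from one segment centre to all the others.
# 'graph' maps a segment to {neighbouring segment: distance between centres}.
def depth_distance(graph, start):
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue                            # stale queue entry
        for nxt, w in graph[node].items():
            nd = d + w
            if nd < dist.get(nxt, float('inf')):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return sum(d for n, d in dist.items() if n != start)

# Toy chain of segments with centres 100 m apart: A - B - C - D.
chain = {'A': {'B': 100}, 'B': {'A': 100, 'C': 100},
         'C': {'B': 100, 'D': 100}, 'D': {'C': 100}}
for seg in chain:
    print(seg, depth_distance(chain, seg))      # B and C are nearest to the rest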

Space syntax's mathematical reliability has recently come under scrutiny because of a number of paradoxes that arise under certain geometric configurations. These paradoxes have been highlighted by Carlo Ratti at the Massachusetts Institute of Technology, in a passionate academic exchange with
Bill Hillier and Alan Penn. There have also been moves to combine space syntax with more traditional transport engineering models, using intersections as nodes and constructing visibility graphs to link them, by various researchers including Bin Jiang, Valerio Cutini and Mike Batty.

Recently there have also been research developments that combine space syntax
with geographic accessibility analysis in GIS, such as the place syntax models developed by the research group Spatial Analysis and Design at the Royal Institute of Technology in Stockholm, Sweden.

With the theory in mind, I am starting the phase of testing the available software. I hope that after some practical experience the theory will become more understandable.

Recommended links:

SPACE SYNTAX
http://en.wikipedia.org/wiki/Space_syntax
SPACE SYNTAX software
http://en.wikipedia.org/wiki/Spatial_network_analysis_software
free 2D software
http://taubmancollege.umich.edu/architecture/faculty/research_and_outreach/syntax2d/index.php
Visibility graph analysis
http://www.vr.ucl.ac.uk/research/vga/
Space Syntax Based Agent Simulation
http://www.vr.ucl.ac.uk/publications/penn2002-000.html
Visualisations
http://www.zupastudio.com/projects/ssx_visualisation/ssx_visualisation.shtml

0 comments

introductory note

I am interested in general laws which do not go out of fashion. With this type of approach I want to investigate contemporary architectural discourse. Simultaneously I want to fill the gaps in my knowledge and understand themes which should have been understood a long time ago but were postponed...


THE FUTURE SEEMS TO BE KNOWN, as far as the direction of professional practice software solutions is concerned

http://www.petermiller.com/proddetail.asp?prod=01974-08&cat=10

it is just a matter of time (and politics) before all options, studies and simulations are integrated into one piece of software. Our tools will become more stable and will offer multilevel real-time analyses...

Being very optimistic, we can predict that, operating in a more comfortable environment, architects' interest will focus on PURE CREATIVITY instead of data transformations.


The interesting part is what is unknown. Instead of treating the computer as a stronger calculator, there is a dream of a dialogue with the digital world...

and, trying to be a little bit responsible in order to continue this stream of consciousness,

I have to investigate the following:


http://en.wikipedia.org/wiki/Artificial_intelligence

http://en.wikipedia.org/wiki/Greedy_algorithm

http://en.wikipedia.org/wiki/Heuristic

http://en.wikipedia.org/wiki/Genetic_algorithm


I have always had the impression that digitally involved architects have a tendency to lean towards evolutionary approaches just because they sound cool, but maybe these methods are not only fashion and do lead to some potential. Let's figure it out after doing the homework and getting back in touch with S. Lem.

3 comments