
Computing in Architecture
Harmonic Sculpting of Concrete Shell

Chien-Chun Kuo | Pinaki Mohanty | Tzu-Ching Wen
I. INTRODUCTION
● GOAL
● BACKGROUND
● PROBLEM SET-UP


GOAL

To find the most optimized shape(s) for a free-form concrete shell using novel strategies.


Shape Optimisation using NURBS

The shell forms found using physical models tend to be rather unvarying for a fixed plan and support conditions, but with the development of more robust optimization algorithms the field of generative design has become fairly broad. One example where such an approach has been successfully applied is the Kakamigahara Crematorium in Gifu, Japan [3]. The free-form surface is represented using NURBS with a sufficient number of control points to control the shape. The advantage of a NURBS discretization over a mesh is that the number of parametric variables for optimization is greatly reduced while an adequate degree of continuity is maintained.

Fig. 1. Overview of the morphogenetic design process of the Kakamigahara Crematorium using an evolutionary algorithm [3].

Fig. 2. Kakamigahara Crematorium, Gifu, Japan, designed by Toyo Ito together with Mutsuro Sasaki [3].

Fig. 3. NURBS representation of the crematorium roof [3].


Eigen Shells

Another way of choosing the parametric variables for shape optimisation is to find a set of shapes that are intrinsically linked to the geometry of the shell, so that the designer can choose from this set of shapes, or a combination of them, one that is visually satisfactory. For this purpose one can use the eigenfunctions of a discrete Laplacian. Each eigenfunction represents an eigen shape giving a displacement field, and a linear combination of these shapes using weight factors controls the shape of the shell structure. The weight factors therefore become the parameters for optimization.
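As a minimal illustrative sketch of this idea (the project itself uses Karamba mode shapes, as described below; the function names and the dense solver here are only for illustration), one can build the combinatorial graph Laplacian of the flat mesh, take its first eigenvectors as eigen shapes, and blend them with weight factors to obtain a vertical displacement per vertex:

import numpy as np

def graph_laplacian(n_vertices, edges):
    """Combinatorial graph Laplacian L = D - A built from a mesh edge list."""
    L = np.zeros((n_vertices, n_vertices))
    for i, j in edges:
        L[i, j] -= 1.0
        L[j, i] -= 1.0
        L[i, i] += 1.0
        L[j, j] += 1.0
    return L

def eigen_shapes(L, n_modes=15):
    """First n_modes eigenvectors of L (smallest eigenvalues), one eigen shape per column."""
    _, vecs = np.linalg.eigh(L)            # symmetric eigen decomposition, ascending order
    return vecs[:, :n_modes]

def combine_shapes(modes, weights):
    """Weighted linear combination of eigen shapes = vertical displacement per vertex."""
    return modes @ np.asarray(weights)

The 15 weights passed to combine_shapes are exactly the kind of parameters the optimizer later varies.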

Although Grasshopper plugins like Millipede allow one to evaluate eigenvectors, they offer no features to include the boundary conditions. The plugin is virtually a black box, and extracting the matrix in order to manipulate it is not feasible. We therefore chose to use the modal shapes instead, which not only take into account the mass and area properties of the shell but also deal quite accurately with different boundary conditions. The suitability of this approach was studied and documented by Cecilie Brandt-Olsen in her MPhil thesis at the University of Bath [5].

Fig. 4. First few eigenfunctions of a circular potential [4].


Eigen Shells

Fig. 5. Comparison between the first eight mode shapes of a flat square mesh calculated from an eigen decomposition of the graph Laplacian (left) and a modal analysis in Autodesk Robot (right) [5].

Fig. 6. Frequency comparison between the results obtained from a modal analysis in Autodesk Robot and the eigenvalues from an eigen decomposition of the different discretisations of the Laplacian matrix, for the first ten modes of a flat square mesh. A linear relation is observed. [5]

The relationship between the modal frequencies obtained using the two methods is observed to be linear, which establishes that the weight factors can still be used as parameters for optimization. We therefore used the Karamba plugin to evaluate the mode shapes and proceeded.


Problem Set-Up

Fig. 7. Google Maps satellite view of the site.

Fig. 9. Google Street View of the site.

Fig. 10. Column positions of the proposed shell structure [7].

For our assignment we selected a parking garage site near the famous Sun Moon Lake in Taiwan in order to test our shell design method. Figure 7 and Figure 8 show the site location and other aspects clearly. We wish to create a shed made of a concrete shell over the parking lot, marked in red in Figure 8. This area serves as the starting flat surface for computing the modal shapes. The columns, which act as point supports for the shell structure, are placed so as to give the least hindrance to the parking lot, with the largest span of around 30 m.
Fig. 8. Site drawing. The exact location is marked in red [7].


Workflow
DEFINE 2D REGION → DEFINE SUPPORTS → COMBINE EIGEN SHAPES (w1 × λ1 + w2 × λ2 + …) → FINAL SHAPE → Obj 1: Material Cost, Obj 2: Fabrication Efficiency

Fig. 12. Workflow of the optimization process.

As shown in the figure above, the basic workflow is as follows. We first define the 2D region, the flat shape on which the modal analysis is to be carried out. After defining the boundary conditions, or support locations, the modal eigen shapes are evaluated using Karamba. The eigen shapes are then linearly combined using suitable weight factors; these weight factors serve as the parameters for optimisation. The final shape is evaluated to obtain the material cost and the fabrication efficiency, which are the first and the second objective respectively.


Parameters

Parameter 1: Shell Thickness
Parameter 2: Weight Factors for the 15 Mode Shapes

Fig. 13. Parameters controlling the shape of the shell structure.

The shell thickness serves as the first parameter, and since we chose the first 15 modes, the weight factor of each mode shape also becomes a parameter. Therefore there are 16 parameters in all.


Objective 1: Material Cost

Material cost = 180 €/m³ × concrete volume (m³) + 1.1 €/kg × steel weight (kg)
Soft penalty p = 0 if the deflection limit is respected, otherwise d/dmax + pmin − 1
Final cost = material cost + wp × √p, with wp = 100000

Fig. 14. Summary of the evaluation procedure for material cost.

Material cost includes the concrete volume and the steel weight. The rates used to compute the total cost are taken from the Construction Cost Analysis report by Bilfinger Tebodin [6]. A soft penalty is applied to the final cost to take excessive deflection into account. This helps the algorithm avoid iterating in regions of the design space which give very high deflections.
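A minimal Python sketch of this evaluation is given below. The rates (180 €/m³, 1.1 €/kg) and wp = 100000 come from the slide; the deflection limit dmax is supplied by the user, and pmin = 1 is an assumed value, since the slide does not state it:

import math

CONCRETE_RATE_EUR_M3 = 180.0    # [6]
STEEL_RATE_EUR_KG = 1.1         # [6]
PENALTY_WEIGHT = 100_000        # w_p
P_MIN = 1.0                     # assumed value of p_min (not stated on the slide)

def material_cost(concrete_volume_m3, steel_weight_kg):
    """Material cost from concrete volume and steel weight."""
    return CONCRETE_RATE_EUR_M3 * concrete_volume_m3 + STEEL_RATE_EUR_KG * steel_weight_kg

def soft_penalty(deflection, deflection_max, p_min=P_MIN):
    """p = 0 when the deflection limit is respected, otherwise d/dmax + pmin - 1."""
    if deflection <= deflection_max:
        return 0.0
    return deflection / deflection_max + p_min - 1.0

def objective_1(concrete_volume_m3, steel_weight_kg, deflection, deflection_max):
    """Final cost = material cost + w_p * sqrt(p)."""
    p = soft_penalty(deflection, deflection_max)
    return material_cost(concrete_volume_m3, steel_weight_kg) + PENALTY_WEIGHT * math.sqrt(p)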


Objective 2: Fabrication Efficiency Indicator

Panelize NURBS Surface → Planarization Optimization → Calculate Indicators of the Double-Curved Surface

Fig. 15. Summary of the evaluation procedure for the fabrication efficiency indicator.

The fabrication efficiency indicator is based on the assumption that the shell structure could be made either of precast panels or with an in-situ concrete pour. Regardless of the method, one prime factor which controls the cost during construction is the curvature of the shuttering panels or the precast panels. For this reason we first discretised the shell into diamond-shaped panels and evaluated the flatness of each of these panels. The cumulative flatness is then reported as the value of Objective 2. A higher value indicates a higher cumulative curvature and hence a higher fabrication/construction cost. For this reason it has been termed the fabrication efficiency indicator.
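The exact planarity measure computed in Grasshopper is not detailed here; as an illustrative sketch, one common choice is the deviation of each panel's corners from their best-fit plane, summed over all panels:

import numpy as np

def panel_flatness(corners):
    """Sum of distances of a panel's corner points (n x 3 array) from their best-fit plane."""
    pts = np.asarray(corners, dtype=float)
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred)       # plane normal = right singular vector
    normal = vt[-1]                         # ... belonging to the smallest singular value
    return float(np.abs(centred @ normal).sum())

def fabrication_indicator(panels):
    """Cumulative flatness over all diamond panels of the discretised shell."""
    return sum(panel_flatness(p) for p in panels)

A perfectly planar panel scores 0, so the sum grows with the cumulative curvature of the panelisation, matching the behaviour described above.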

II . RESULTS
● SOO
● MOO


BENCHMARK RESULTS
Objective - Material Cost

Fig. 16. Convergence graph of the three chosen single-objective optimisation algorithms.
Fig. 17. Robustness graph of the three chosen single-objective optimisation algorithms.

From Figure 16 on the left it is quite clear that both RBFOpt and GA are better at estimating the hypervolume, even after 1000 iterations. The DIRECT algorithm, although it converges earlier, does not converge to a result as optimised as the others. Looking at the robustness graph, GA is not as robust as the others: DIRECT is the most robust, followed by RBFOpt. This result is as one would expect, since GA is a metaheuristic algorithm and DIRECT is a direct search algorithm.


BENCHMARK RESULTS
Objective - Material Cost

Algorithm   Steel (kg)   Concrete (m³)   Material Cost (€)   Thickness (cm)
RBFOpt      16678        242             61589               7.7
DIRECT      12880        239             57172               8
GA          14062        230             56821               7.7

Above are the documented values from the different SOO algorithms. They all lie in the same range.


BENCHMARK RESULTS
Objectives - Material Cost & Fabrication Indicator

Fig. 18. Convergence graph of the three chosen multi-objective optimisation algorithms.
Fig. 19. Robustness graph of the three chosen multi-objective optimisation algorithms.

Each algorithm was run 3 times with 1000 evaluations per run. The convergence graph on the left shows that although RBFMOpt converges later than the other algorithms, its hypervolume approximation is the best. HypE is better than NSGA-II, which is understandable considering NSGA-II is a metaheuristic algorithm. Surprisingly, NSGA-II appears to be more robust than the other two, although not by a big margin. These observations show that one must look at both the convergence and the robustness graphs to come to a sound conclusion: an algorithm showing high robustness may still not have converged to a satisfactory value even after a large number of iterations.


BENCHMARK RESULTS
Objectives - Material Cost & Fabrication Indicator

Fig. 20. Pareto front approximation for the multi-objective optimisation.

The Pareto fronts are fairly smooth and appear complete for all the algorithms, which suggests that the optimization runs were successful. A clearly convex Pareto front indicates that the objectives are indeed conflicting, strengthening the case for using multi-objective optimization in the first place. RBFMOpt, as expected, covers a wider range and hence heavily influences the best known front.

III . RESULTS
● Unsupervised Machine Learning


RESULTS - Pareto Rank 1, 2 & 3


Fig. 21. All results in pareto rank 1, 2 and 3

Here all the results from Pareto ranks 1, 2 and 3 are plotted, drawn from all of the data generated during the MOO iterations. In the picture above they are unclustered.


CLUSTERING METHOD (without PCA)

k = 10

Fig. 22. The elbow method applied to the data from Pareto ranks 1, 2 and 3.
Fig. 23. The elbow method as explained in the scikit documentation [8].

The scikit documentation states that "It is important to remember that the 'elbow' method does not work well if the data is not very clustered. In this case, you might see a smooth curve and the optimal value of K will be unclear" [8]. Hence, as can be seen in Figure 22, it is not very clear where the elbow occurs. We nevertheless chose to adopt the best available result of k = 10.
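A minimal scikit-learn sketch of this procedure is given below, assuming X holds one row per Pareto rank 1-3 design (shell thickness plus the 15 weight factors); the variable names are illustrative:

from sklearn.cluster import KMeans

def elbow_curve(X, k_values=range(2, 16)):
    """Within-cluster sum of squares (inertia) for each candidate k, for the elbow plot."""
    inertias = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
                for k in k_values]
    return list(k_values), inertias

def cluster_designs(X, k=10):
    """Cluster the designs with the chosen k = 10 and return one label per design."""
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)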


CLUSTERING RESULTS (without PCA) - Pareto Rank 1, 2, 3


Fig. 24. Clustering results based on slab thickness and weight factors of mode shapes.

All the geometries from Pareto ranks 1, 2 and 3 are clustered based on the parameters. The figure shows the morphologically similar geometries laid out and sorted together. Looking at clusters 1, 8, 9 and 10, they seem to be mostly dominated by a single mode, while the ones in between are variations of different mode shape weight factors, except for one geometry in cluster 4 which appears to have been influenced by the shell thickness in that cluster.

IV . RESULTS
● Supervised Machine Learning


DATA ANALYSIS

Fig. 25. Plot of Material Cost vs. Slab Thickness.
Fig. 26. Plot of Fabrication Indicator vs. Slab Thickness.
Fig. 27. Plot of Material Cost vs. Mode 1 Weight Factor.

The plot of material cost vs. slab thickness shows a clear convex relationship and suggests that a good fit could be achieved with a polynomial regression model. The plot of the fabrication indicator vs. slab thickness is highly scattered and does not appear to have a good correlation. We also plotted the mode 1 weight factor against the material cost; again no very clear correlation is seen, but the plot is less scattered than the previous one. This suggests that the influence of mode shape 1 is fairly consistent across most of the iterations, and a linear model can predict the values reasonably well.


TRAINING DATASET

Dataset (HypE): 3 × 1000 = 3000 | Dataset (RBFMOpt): 3 × 1000 = 3000 | Dataset (NSGA2): 3 × 1000 = 3000 → 9000 in total

Training data: 60 % | Testing data: 20 % | Validation data: 20 %

Fig. 27. Data split for model training and performance evaluation.

We used all the datasets generated during the MOO iterations and split them such that we get 60% for training and 20% each for testing and validation. We used a cross-validation scheme to evaluate all our models.
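A small scikit-learn sketch of such a split (variable names are illustrative; X is the 9000 × 16 parameter matrix and y the corresponding objective values):

from sklearn.model_selection import train_test_split

def split_60_20_20(X, y, seed=0):
    """60 % training, 20 % testing, 20 % validation."""
    X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=seed)
    X_test, X_val, y_test, y_val = train_test_split(X_rest, y_rest, test_size=0.5, random_state=seed)
    return (X_train, y_train), (X_test, y_test), (X_val, y_val)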


MODEL PERFORMANCE

Table 1: CV scores for polynomial regression models

MODEL                                        CV SCORE
Polynomial Linear Regression (degree=1)      0.59
Polynomial Linear Regression (degree=2)      0.78
Polynomial Linear Regression (degree=3)      0.80

Table 2: CV scores for support vector regression models

MODEL                                                             CV SCORE
Support Vector Regression (poly, degree 1), C=100, epsilon=1      0.54
Support Vector Regression (poly, degree 2), C=1000, epsilon=1     0.51
Support Vector Regression (poly, degree 3), C=1000, epsilon=1     0.54
Support Vector Regression (rbf, degree 1), C=100, epsilon=1       0.55
Support Vector Regression (rbf, degree 2), C=10000, epsilon=1     0.75
Support Vector Regression (rbf, degree 3), C=10000, epsilon=1     0.75

We tried polynomial regression models of various degrees and support vector regression models with both poly and rbf kernels and various values of C and epsilon. The polynomial linear regression model of degree 3 clearly stands out. We were unable to calculate CV scores for higher degrees of polynomial regression due to serious computational limitations.
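As a sketch of how such CV scores can be produced with scikit-learn (the exact pipeline and scoring we used are not listed on the slide; R² with 5-fold CV and feature scaling for the SVR are assumptions here):

from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

def cv_score_examples(X_train, y_train):
    """Mean 5-fold CV score for two of the model families compared above."""
    poly3 = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
    svr_rbf = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10000, epsilon=1))
    return {
        "Polynomial Linear Regression (degree=3)": cross_val_score(poly3, X_train, y_train, cv=5).mean(),
        "SVR (rbf, C=10000, epsilon=1)": cross_val_score(svr_rbf, X_train, y_train, cv=5).mean(),
    }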


SUPERVISED ML
Simulated Value vs. Prediction Value
Model: Polynomial Regression (Degree 3)

Table 3: Predicted values using Polynomial regression model

SLAB THICKNESS   GROUND TRUTH          PREDICTION            PERCENTAGE DIFFERENCE
19.5 cm          Obj 1: 175470.07      Obj 1: 172037.42      Obj 1: 1.96 %
                 Obj 2: 200.32         Obj 2: 199.76         Obj 2: 0.28 %
10 cm            Obj 1: 326806.52      Obj 1: 420399.84      Obj 1: 28.63 %
                 Obj 2: 136.9          Obj 2: 134.09         Obj 2: 2.05 %
35.8 cm          Obj 1: 238054.69      Obj 1: 253518.55      Obj 1: 6.49 %
                 Obj 2: 179.23         Obj 2: 179.08         Obj 2: 0.08 %

V. RESULTS
● Deep Learning


DEEP LEARNING

Following the previous chapter, the deep learning method is now applied to train and build a surrogate model to predict the Fabrication Efficiency Indicator and the Cost.

Models and Dataset

The datasets generated by the HypE, NSGA2 and RBFMOpt multi-objective optimization algorithms are used as the training and testing dataset, 9000 samples in total. The deep learning models are built using the TensorFlow and Keras machine learning libraries.

Steps

The main process is as follows:

1. Two deep learning models are constructed and trained with the training dataset. Both models have two hidden layers with 64 neurons in each layer; one is trained for 2000 epochs and the other for 3000 epochs.
2. Compare the MAE and loss and choose the model with the better performance to create the HDF5 file, which stores the trained weights.
3. Run the Python script that loads the pre-trained HDF5 model, with TensorFlow running.
4. Run the Grasshopper definition to get the parameters of the features.
5. Send the parameters of the features to the pre-trained model for prediction.
6. The prediction values are sent back to the Grasshopper environment.
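A minimal Keras sketch of the network described in step 1 (two hidden layers of 64 neurons; the activation, optimizer and loss are assumptions, since the slide does not name them):

from tensorflow import keras

def build_surrogate(n_features=16):
    """Two hidden layers with 64 neurons each; outputs cost and fabrication indicator."""
    model = keras.Sequential([
        keras.layers.Input(shape=(n_features,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(2),                    # Obj 1 (cost) and Obj 2 (indicator)
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# model = build_surrogate()
# model.fit(X_train, y_train, epochs=2000, validation_split=0.2)
# model.save("shell_surrogate.h5")   # HDF5 file holding the trained weights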


DEEP LEARNING
Results Comparison: Sum of Ground Truth from Fabrication Efficiency Indicator & Cost vs. Prediction

Fig. 28. Mean absolute error plot for 2000 epochs.
Fig. 29. Mean absolute error plot for 3000 epochs.

From Figure 28 it is clear that the model starts to radically reduce the error after more than 500 epochs. The two models were trained for up to 2000 and 3000 epochs respectively, and both have converged and reduced the mean absolute error.


DEEP LEARNING

Results Comparison: Sum of Ground Truth from Fabrication Efficiency Indicator & Cost vs. Prediction

Fig. 30. Prediction vs. true values plot for 2000 epochs.
Fig. 31. Prediction vs. true values plot for 3000 epochs.

From Figure 30, the model trained for 2000 epochs shows a proportional relationship between the true values and the predicted values. However, the model trained for 3000 epochs has a less proportional relationship, i.e., increasing the number of epochs from 2000 to 3000 has decreased the prediction accuracy.


DEEP LEARNING

Predictions Accuracy

Fig. 32. Sum of Difference vs. Shell Thickness.
Fig. 33. Fabrication Efficiency Indicator Difference vs. Shell Thickness.

The graph (Figure 32) shows the difference between the simulated values and the values predicted by the surrogate model trained with the deep learning method. When the shell thickness is between 30 and 50 mm, the predicted values show smaller differences from the simulated values, i.e., the surrogate model predicts much more accurately in this range.


DEEP LEARNING
Comparison of Supervised Learning and Deep Learning Surrogates Models

Table 3: Predicted values using the polynomial regression model

SLAB THICKNESS   GROUND TRUTH       PREDICTION         PERCENTAGE DIFFERENCE
19.5 cm          Obj 1: 175470      Obj 1: 172037      Obj 1: 1.96 %
                 Obj 2: 200.32      Obj 2: 199.76      Obj 2: 0.28 %
10 cm            Obj 1: 326806      Obj 1: 420399      Obj 1: 28.63 %
                 Obj 2: 136.9       Obj 2: 134.09      Obj 2: 2.05 %
35.8 cm          Obj 1: 238054      Obj 1: 253518      Obj 1: 6.49 %
                 Obj 2: 179.23      Obj 2: 179.08      Obj 2: 0.08 %

Table 4: Predicted values using the DL surrogate model

SLAB THICKNESS   GROUND TRUTH       PREDICTION         PERCENTAGE DIFFERENCE
19.5 cm          Obj 1: 327737      Obj 1: 389832      Obj 1: 18.94 %
                 Obj 2: 142.9       Obj 2: 133.8       Obj 2: 6.38 %
10 cm            Obj 1: 238790      Obj 1: 368116      Obj 1: 54.15 %
                 Obj 2: 165.4       Obj 2: 129.2       Obj 2: 21.88 %
35.8 cm          Obj 1: 253655      Obj 1: 388426      Obj 1: 53.13 %
                 Obj 2: 142.7       Obj 2: 134.2       Obj 2: 5.93 %

Comparing the prediction values from the supervised learning surrogate and the deep learning surrogate, the tables above show that the supervised learning model has much better prediction accuracy and more stable prediction results. This result is slightly different from what we expected.

The reasons why the deep learning surrogate model is less accurate might be the following:
1. Small training dataset.
2. Too few hidden layers.
3. Too few neurons in each layer.
4. Too few epochs.

Increasing the training dataset, the number of hidden layers, the number of neurons in each layer, and the number of epochs might help to improve the performance of the deep learning model.

VI . RESULTS
1. Generative Deep Learning I: Style Transfer for Shell Substructure
2. Generative Deep Learning II: Style Transfer for Brick Facade Patterning


GENERATIVE DEEP LEARNING I : Style Transfer for Shell Substructure


Origin Image → Profile Pattern from Nature → Final Result Image


Fig. 34. Workflow of style transfer on 2D shape

The workflow of the entire process is as follows:

- Keep the shell with its double-curved panels
- Generate the pattern from the algorithm
- Scatter the grid points on the surface
- Map the values from the style transfer image to the points (see the sketch after this list)
- Extrude the curves and lines as ribs which stiffen the shell
- Analyse the structural performance
- Compare to the original shell
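A sketch of the value-mapping step is given below, assuming the grid points carry (u, v) coordinates in [0, 1]² and that brighter pixels of the style-transferred image produce deeper ribs; the actual mapping lives in the Grasshopper definition, so the function, the Pillow dependency and max_depth are illustrative only:

import numpy as np
from PIL import Image

def sample_rib_depths(image_path, uv_points, max_depth=0.3):
    """Map grey values of the style-transferred image to rib depths at the grid points."""
    grey = np.asarray(Image.open(image_path).convert("L"), dtype=float) / 255.0
    h, w = grey.shape
    depths = []
    for u, v in uv_points:
        col = min(int(u * (w - 1)), w - 1)
        row = min(int((1.0 - v) * (h - 1)), h - 1)   # image rows run top to bottom
        depths.append(grey[row, col] * max_depth)    # brighter pixel -> deeper rib (assumed)
    return depths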


GENERATIVE DEEP LEARNING I : Style Transfer for Shell Substructure


Style Transfer

Texture 01 Texture 02 Texture 03 Texture 04


Fig. 35 Style transfer results using each of the textures.


GENERATIVE DEEP LEARNING I : Style Transfer for Shell Substructure

Fig. 36. 3D view of the shell with ribs from Texture 04.
Fig. 37. Plan view of the shell with ribs from Texture 04.

The optimised result from the SOO was taken to test this idea. Figure 36 above shows the rib pattern on the shell, and Figure 37 shows exactly how the curves and lines were placed after the style transfer, in plan view. In Table 4 we compare the results with the unstiffened shell. As one would expect, the displacement is reduced, albeit by a small amount; this is probably because of the additional weight introduced into the structure by the ribs. Nevertheless, the stiffening helps and in some cases enhances the appearance of the shell.

Table 4: Comparison of structural performance.

                    Maximum Displacement (cm)   Elastic Energy (kN·m)
Original Shell      2.599757                    22.36022
Shell with Ribs     2.475245                    29.999619


GENERATIVE DEEP LEARNING II : Brick Facade Patterning


Inspirations

Beijing National Stadium
Beijing National Aquatics Center (Water Cube)

Fig. 38. Beijing National Stadium (top) [9] and Beijing National Aquatics Center (bottom) [10].
Fig. 39. Common cell texture from creatures and plantations [11].


GENERATIVE DEEP LEARNING II : Brick Facade Patterning


WORKFLOW

Brick Pattern Generation → Double Style Transfer (First Style Transfer, then Second Style Transfer) → Image Sampling

Fig. 40. Workflow of the entire process of brick facade patterning.


GENERATIVE DEEP LEARNING II : Brick Facade Patterning


WORKFLOW - STEP 2: Image Sampling

Brick Pattern → Grid Layout → Image Sampler Values → Rotation → Remap the Bricks → Translation Result

Fig. 41. Grasshopper implementation of the brick facade patterning.


GENERATIVE DEEP LEARNING II : Brick Facade Patterning


CONTEXT APPLICATION

Original Content Image → Style Images → Final Images → Brick Patterns

Fig. 42. Application of style transfer images on facades.


GENERATIVE DEEP LEARNING II : Brick Facade Patterning


CONTEXT APPLICATION

Casa da Música - Rem Koolhaas

Building Faces | Building Faces with Pattern Transfer

Fig. 43. Brick pattern applied to all facade faces of Casa da Música.


THANK YOU

Tzu-Ching Wen | Pinaki Mohanty


REFERENCES

[1] J. Ochsendorf and P. Block, "Exploring Shell Forms," in Shell Structures for Architecture - Form Finding and Optimization, Routledge, 2014, pp. 7-12.
[2] D. Piker, "Kangaroo: Form Finding with Computational Physics," Architectural Design, pp. 136-137, 2013.
[3] A. Pugnale and M. Sassone, "Morphogenesis and Structural Optimization of Shell Structures with the Aid of a Genetic Algorithm," Journal of the International Association for Shell and Spatial Structures, pp. 161-166, 2007.
[4] P. Michalatos and S. Kaijima, "Eigenshells - Structural Patterns on Modal Forms," in Shell Structures for Architecture: Form Finding and Optimization, Routledge, 2014.
[5] C. Brandt-Olsen, "Harmonic Form-Finding for the Design of Curvature-Stiffened Shells," MPhil thesis, University of Bath, 2015.
[6] Bilfinger Tebodin, "Construction Cost Analysis - Industrial Projects Central and Eastern Europe," 2019.
[7] "Norihiko Dan and Associates", ArchDaily, 2021. [Online]. Available: https://www.archdaily.cn/cn/771310/ri-yue-tan-you-ke-zhong-xin-norihiko-dan-and-associates. [Accessed: 13 July 2021].
[8] "Elbow Method — Yellowbrick v1.3.post1 documentation", Scikit-yb.org, 2021. [Online]. Available: https://www.scikit-yb.org/en/latest/api/cluster/elbow.html. [Accessed: 13 July 2021].
[9] "Bird's Nest Beijing: Beijing National Stadium, How to Visit Bird's Nest, Location", Chinabeijingprivatetour.com, 2021. [Online]. Available: https://www.chinabeijingprivatetour.com/attractions/show/bird_s_nest.htm. [Accessed: 14 July 2021].
[10] "Beijing National Aquatics Center - Wikipedia", En.wikipedia.org, 2021. [Online]. Available: https://en.wikipedia.org/wiki/Beijing_National_Aquatics_Center. [Accessed: 14 July 2021].
[11] "Common cell texture from creatures and plantations". [Online]. Available: https://unsplash.com/s/photos/natural-texture

TOOLS/PLUGINS
1. Modelling and Simulation Environment - Rhino/Grasshopper
2. Simulation plug-ins - Karamba
3. Optimisation Plugins - MOpossum, Goat, Octopus, Galapagos
4. Programming languages - Python in conda environment
5. Google maps

