Technologies and Techniques For Collaborative Robotics in Architecture
1. Introduction
The background for this work is the capacity of industrial robotic arms to
engage and change the way architects explore and fabricate novel structures
and material compositions. In addition, advances in computational design
processes and CAD-related technologies have made possible the modelling,
analysis and simulation of complex performance-driven constructs. The recent
development of robotic arms, with their versatile and highly customizable setups,
has made the fabrication of these architectural constructs feasible. In the
most exceptional cases, the robotic arm is so well integrated into the
design exploration that the resulting design cannot be separated from its means
of fabrication. The use of industrial robots has become a “transformational
technology in architecture” (Daas and Wit, 2018). Despite these advancements, in
RE: Anthropocene, Proceedings of the 25th International Conference of the Association for Computer-Aided
Architectural Design Research in Asia (CAADRIA) 2020, Volume 2, 293-302. © 2020 and published by the
Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), Hong Kong.
294 M.B. JENSEN AND A. DAS
processes, as in the wine pouring case study in the RoboSense project. To obtain
a collaborative process, the robot needs to perform in a way that exceeds what
can be anticipated by the designer - a robotic agent that contributes actions
and intentions that assist the human designer in exploring unknown areas of,
and connections within, a given problem-solution space.
In the field of architectural robotics, a growing range of technologies and
methods are emerging, of which some hold great potential for supporting a
framework for collaborative human-robot design exploration. As displayed in the
diagram in Figure 1, the construction of a framework for collaborative robotic
design processes requires certain key elements. While some are well described,
and standard in the field of computational architecture and robot-based design,
others need to be adopted from other research fields. Another important aspect of
the proposed framework is the option of parallel exploration of the physical design
object and its digital twin, which demands that the physical object (or system) can
be reproduced for further exploration in a CAD environment.
This study aims to explore and establish the missing links needed to connect these
fragments into a combined framework for design exploration with collaborative
robotics. It is therefore helpful to look briefly at the existing work for each of the
key elements displayed in Figure 1.
Most of the elements needed to construct a collaborative framework already
exist; however, a method for connecting and controlling all the sequential
processes needs to be defined. As the behaviour of each key element is likely
to pass through a series of clearly defined steps, triggered by input from either
the internal processes of the system or from external user input, the concept of
state machines is of interest. A state machine can be defined “by identifying
what states the system can be in, what inputs or events trigger state transitions,
and how the system will behave in each state” (Wright, 2005) and can be used to
control the behaviour of simple systems, as in the example in Figure 2, or of very
complex UI systems. As Grasshopper (GH) is based on dataflow programming, suitable methods
will have to be investigated to ‘break’ this flow and construct a customizable state
machine.
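To make the state-machine concept concrete, a minimal sketch in Python is given below. The state and event names (e.g. `human_mode`, `robot_scan`) are illustrative assumptions, not the states of the paper's actual GH implementation; they merely demonstrate the ‘State’, ‘Event’, ‘Action’ and ‘Transition’ concepts of Figure 2.

```python
# Minimal finite state machine: states, events, optional actions, transitions.
# State/event names below are hypothetical, chosen only for illustration.

class StateMachine:
    def __init__(self, initial):
        self.state = initial
        self.transitions = {}  # (state, event) -> (next_state, action)

    def add(self, state, event, next_state, action=None):
        self.transitions[(state, event)] = (next_state, action)

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return self.state  # ignore events with no defined transition
        next_state, action = self.transitions[key]
        if action is not None:
            action()  # perform the transition's action, if any
        self.state = next_state
        return self.state

# Hypothetical modes for a collaborative robot process
sm = StateMachine("human_mode")
sm.add("human_mode", "button_1", "robot_scan")
sm.add("robot_scan", "scan_done", "robot_rotate")
sm.add("robot_rotate", "rotate_done", "human_mode")

sm.fire("button_1")   # -> "robot_scan"
sm.fire("scan_done")  # -> "robot_rotate"
```

Because each `fire` call is an explicit, discrete step, a construct like this can interrupt GH's continuous dataflow and hold the system in a given mode until a user or sensor event occurs.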
Figure 2. Example of a State Machine Diagram, with the key concepts ‘State’, ‘Event’,
‘Action’ and ‘Transition’.
2. Methods
To investigate the technologies and techniques needed to support a collaborative
robotic-based design process, the study applied a research-by-design strategy
(Hauberg, 2011). The strategy relies on physical and digital prototyping for
uncovering possible solutions and allows for a continuous and parallel process of
designing the framework and designing with the framework. The design process
thereby informs the development of the framework and vice versa.
input, two simple push-buttons were connected to the digital inputs in the control
box of the UR10.
Figure 3. The physical design system consists of 24 wood lamellas mounted in a steel frame.
Variations caused by rotation of the individual elements create potential for directed views and
intentional blocking of sunlight.
3. Results
3.1. COMPUTATIONAL FRAMEWORK FOR COLLABORATIVE DESIGN
EXPLORATION
The study resulted in a collaborative framework that integrates visual analysis
methods and a state machine to successfully allow for human-robot design
exploration of a material system. The computational design framework allows for
an interactive design exploration where a human agent, guided by design intentions
regarding obstruction of desired view lines and sun shading, can manually alter
the rotation of the wood lamellas. Subsequently, the robotic agent can be initiated
and, via the mounted camera, register the current rotation of each lamella. This
information allows the computational design model to perform environmental
simulations based on a series of alternative lamella configurations and to suggest a
new and improved version by robotic rotation of the wood lamellas. As visualised
in the flow chart in Figure 4, the framework allows interactions by the human
designer (cyan coloured circles in the flow chart) to occur both during the physical
material-based design exploration and the robotic fabrication process.
Figure 4. Flow chart for the proposed human-robot framework for collaborative design
exploration. The cyan-coloured circles represent human actions during the design process
while the three types of colored flow-lines refer to processes within the ‘Human mode’ (pink),
‘Robot mode 1’ (orange), and ‘Robot mode 2’ (green).
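The “suggest a new and improved version” step can be pictured as a search over candidate lamella rotations. The sketch below is a deliberately simplified stand-in: the scoring function is a toy shading metric, not the paper's actual environmental simulation, and the exhaustive search is only feasible for a handful of lamellas.

```python
import itertools
import math

def shading_score(rotations, sun_azimuth_deg):
    # Toy stand-in for the environmental simulation: a lamella blocks
    # more sun the closer its rotation is to the sun direction.
    return sum(math.cos(math.radians(r - sun_azimuth_deg)) for r in rotations)

def suggest_configuration(candidates_per_lamella, sun_azimuth_deg):
    # Exhaustive search over per-lamella candidate rotations; a real setup
    # with 24 lamellas would need a heuristic or evolutionary search instead.
    best = max(itertools.product(*candidates_per_lamella),
               key=lambda cfg: shading_score(cfg, sun_azimuth_deg))
    return list(best)

# Three lamellas, each allowed 0, 45 or 90 degrees, sun at 45 degrees:
suggest_configuration([[0, 45, 90]] * 3, 45.0)  # -> [45, 45, 45]
```

The resulting configuration would then be handed to the robotic agent as rotation targets, closing the loop between simulation and physical actuation.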
The main result of the study is the design of the collaborative framework in its
entirety and the design process it supports. However, two aspects were crucial for
successfully achieving this objective: the integration of visual analysis features in
GH and the introduction of a state machine for controlling the interactive robotic
processes.
Figure 5. Top: Diagram of the visual analysis process from image data to rotational data.
Bottom: Example of the visual analysis routine performed on the top side of the wood lamellas
by using OpenCV in Python. The first picture shows the cropped image recorded by the
robot-mounted webcam. The second picture shows the masking out of all unwanted colours.
The third picture shows the result of applying the mask to the cropped image.
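The final step of the pipeline in Figure 5, from detected markers to rotational data, can be sketched as below. The two-marker convention and the pixel coordinates are assumptions made for illustration; the paper's exact marker layout and OpenCV routine are not reproduced here.

```python
import math

def lamella_rotation(end_a, end_b):
    # Given the pixel centroids of the two colour markers detected on a
    # lamella's top side (an assumed convention), return the lamella's
    # rotation in degrees relative to the image x-axis.
    dx = end_b[0] - end_a[0]
    dy = end_b[1] - end_a[1]
    return math.degrees(math.atan2(dy, dx))

lamella_rotation((100, 50), (150, 50))   # -> 0.0 (parallel to the x-axis)
lamella_rotation((100, 50), (100, 100))  # -> 90.0
```

Repeating this per lamella yields the list of current rotations that the computational design model needs as input for its simulations.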
Figure 6. Example of a simple State Machine in Grasshopper. The Grasshopper definition uses
custom python components and MetaHopper components to change and keep track of states.
The State Machine integrated in the proposed framework is an expansion of this setup.
4. Discussion
Following a research-by-design approach, this study has investigated and
established a framework that allows a designer to engage directly with a physical
design object, while in succession obtaining new design suggestions from a robotic
system. The development process and the final result reveal essential aspects of
human-robot design exploration.
The camera used to capture the coloured markers and the ArUco markers has an
automatic focus feature which often affects the image capturing process and results
in blurry photos, with a negative effect on the colour detection procedure. During
the prototyping process, this was resolved by inserting a time delay (approx. 2
seconds) between the robot (and camera) arriving at the capture position and the actual
capture. Another challenge, well known in the field of visual analysis,
is the importance of lighting conditions. The colour detection algorithm used
in the framework takes absolute HSV values from the colour system and
detects colours within a given range of hue, saturation and value. In
the physical setup, due to the presence of a skylight in the indoor environment,
the ambient light varied enough to cause significant errors in the
colour detection. This issue can be mitigated by using exclusively artificial light or by
minimizing the effect of varying coloured light in the system.
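A hedged sketch of the kind of HSV range test involved is given below, using the Python standard library's `colorsys` rather than OpenCV. The hue centre and tolerances are illustrative assumptions; widening the saturation and value lower bounds is one way to absorb some of the variation in ambient light discussed above.

```python
import colorsys

def in_hsv_range(rgb, hue_center, hue_tol, s_min=0.3, v_min=0.3):
    # rgb: 0-255 integer triple. Generous saturation/value lower bounds
    # make the test more tolerant of varying ambient light; the hue axis
    # wraps around at 1.0, so hue distance is taken on the circle.
    h, s, v = colorsys.rgb_to_hsv(rgb[0] / 255.0, rgb[1] / 255.0, rgb[2] / 255.0)
    hue_dist = min(abs(h - hue_center), 1.0 - abs(h - hue_center))
    return hue_dist <= hue_tol and s >= s_min and v >= v_min

in_hsv_range((255, 30, 30), hue_center=0.0, hue_tol=0.05)  # red marker -> True
in_hsv_range((30, 30, 255), hue_center=0.0, hue_tol=0.05)  # blue pixel -> False
```

In practice the thresholds would have to be calibrated per marker colour and per lighting condition, which is precisely the sensitivity the discussion points out.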
Figure 7. Physical demonstrator placed in an outdoor environment. The façade system clearly
displays its environmental performance towards shading the sun and directing views.
When taking part in a creative and collaborative design process, the experience
of time and the maintenance of creative flow is essential. An important aspect
of successful collaborative work is knowing the intention of the co-workers - a
challenging aspect when working with robots. Not knowing what goes on “behind
the scenes” during, for example, the time-span of a computational performance search, which
often took 30-90 seconds, leaves the designer in a state of passive waiting. Initial
experiments show a significant difference in running the collaborative process with
or without the opportunity to see GH-based visualisations of the computational
References
OpenCV: 2019, “OpenCV - Open Source Computer Vision”. Available from <https://docs.opencv.org/trunk/d5/dae/tutorial_aruco_detection.html> (accessed 10th December 2019).
Daas, M. and Wit, A.J.: 2018, Introduction, in M. Daas and A.J. Wit (eds.), Towards a Robotic
Architecture, Applied Research and Design Publishing, Novato, CA, 8-11.
Dorst, K. and Cross, N.: 2001, Creativity in the design process: Co-evolution of
problem-solution, Design Studies, 22(5), 425-437.
Dubor, A., Camprodom, G., Diaz, G.B., Reinhardt, D., Saunders, R., Dunn, K., Niemelä, M.,
Horlyck, S., Alarcon-Licona, S., Wozniak-O’Connor, D. and Watt, R.: 2016, Sensors and
Workflow Evolutions: Developing a Framework for Instant Robotic Toolpath Revision,
Robotic Fabrication in Architecture, Art and Design 2016, Cham, 410-425.
Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. and Marín-Jiménez, M.: 2014,
Automatic generation and detection of highly reliable fiducial markers under occlusion,
Pattern Recognition, 47(6), 2280-2292.
Hauberg, J.: 2011, Research by design: a research strategy, Architecture & Education Journal,
1(2), 1-11.
Johns, R.L., Kilian, A. and Foley, N.: 2014, Design Approaches Through Augmented
Materiality and Embodied Computation, Robotic Fabrication in Architecture, Art and
Design 2014, Cham, 319-332.
Lawson, B.: 2005, How Designers Think - The Design Process Demystified, Taylor & Francis.
Maher, M.L.: 1994, Creative design using a genetic algorithm, Computing in Civil Engineering,
New York, 2014-2021.
Moorman, A., Sabin, J.E. and Liu, J.: 2016, RoboSense: Context-Dependent Robotic Design
Protocols and Tools, ACADIA 2016 Posthuman frontiers, 174-183.
Pigram, D., Maxwell, I. and Mcgee, W.: 2016, Towards Real-Time Adaptive Fabrication-Aware
Form Finding in Architecture, Robotic Fabrication in Architecture, Art and Design 2016, 1,
427-437.
Wright, D.R.: 2005, “Finite State Machines”. Available from <http://www4.ncsu.edu/~drwrigh3/docs/courses/csc216/fsm-notes.pdf>.