Integrated Environmental Design and Robotics Workflow for Double-Skin Facade Perforation Onsite

Featured Application: Robots can perforate facades on the construction site in compliance with environmental requirements. Visual algorithms can integrate facade design and robotic construction, so that environmentally compliant perforation designs are milled on site. Near real-time monitoring of the robotic construction is possible using extended reality.

Abstract: In contemporary design practice, there is a disconnect between the design techniques used for early-stage design experimentation and performance analysis and those used for manufacturing and construction. This study addresses the problems in developing an integrated digital design workflow and provides a research framework for integrating environmental performance requirements with robotic manufacturing processes on a construction site. The proposed method enables the user to import a design surface, identify design parameters, set several environmental performance goals, and then simulate and select a robotic building strategy. Based on these inputs, design alternatives are developed and evaluated against their performance criteria, taking their robotically simulated constructibility into account. To validate the proposed method, an experiment was conducted in which a double-skin facade perforation was generated using the proposed methodology. The results suggest a heuristic feature for improving the simulated robotic constructibility. Moreover, the functionality of the prototype is demonstrated.


Introduction
Automated systems and robotics are considered technologies that will revolutionize the construction industry by addressing profitability problems while increasing efficiency [1,2]. On-site robotics has recently received significant attention in building applications. It has been utilized in various construction operations such as milling, perforation, inspection, structural safety monitoring [3][4][5], manufacturing of building components [2], assembly of building components [6][7][8][9], material management [6,10,11], and construction surveying and control [12][13][14].
The rapid development of the architecture, engineering, and construction (AEC) industry's digital design methodologies and innovations, and their combination with modern manufacturing and construction processes, have helped architects move away from Fordist standardization to a post-Fordist domain with previously unattainable design possibilities [15,16]. Although symmetry and repetition have yielded economic efficiencies in the AEC industry, these principles are now outdated, having been overtaken by an age of digital design, analysis, and manufacturing [6].
Building information modeling (BIM) enables the discovery of new forms and multiple design alternatives that are effective for a wide range of pre- and post-rationalization design approaches [6,15]. Moreover, simulations help designers formulate informed design decisions by ensuring consistency in the decision-making process for dynamic and synthetic design [17].
The current construction trend is to demonstrate the value of non-standard architecture and use its intricacy and distinction to provide higher-performing design solutions in a multi-objective and analytical manner as well as qualitative aesthetic metrics and delightful communication [3,13,18,19]. Therefore, robotic manufacturing provides a unique opportunity to reconsider the architectural design process by designing not only the building but also the construction process itself, which manifests in its own aesthetic and intrinsic post-Fordist performance gains [1,7,20]. The rapid convergence of advanced digital and robotic modeling techniques has contributed to reconsideration of an architect as a contemporary master-builder and a digital toolmaker [21,22]. This further illustrates the need for more comprehensive solutions in the AEC industry and the importance of developing new design methodologies that endorse a more holistic approach to robotics adoption [23].
Despite several advances, there is a lack of intuitive workflows for designers to integrate simulation results as drivers and constraints early in the design process [21]. Furthermore, diverse software is used in the various design phases (concept generation, modeling, manufacturing, and construction planning), and consequently, the continuous exchange of knowledge between the AEC disciplines is a challenge [24][25][26]. The current design methods and analytical resources lack foresight and restrict the transfer of design parameters and constraints from the upstream to the downstream [16].
In general, these approaches do not consider the limitations of assembly and construction, the sensitivity of projects to local conditions, and the nature of comparable real-world noise [24,25]. There are currently few robust and efficient programming strategies available to manage multiple robots under custom semi-automated construction conditions [2]. Challenges for these programming tasks include a) a constantly evolving environment (building sites); b) significantly smaller production volumes (buildings are one-off products); and c) a wider variety of tasks involved [27,28].
This study presents an integrated approach to double-skin facade perforation on the construction site using a robot arm, compliant with the Leadership in Energy and Environmental Design (LEED) v4 daylight index, utilizing visual programming and extended reality technologies. Initially, we focus on coupling environmental performance simulations and double-skin facade perforations to conduct robotic construction simulations for informing perforation design configurations based on our system. A case study was developed and tested to examine the benefits and disadvantages of the proposed program. This study hypothesizes that the next steps in digital design and robotic manufacturing are the development of methodologies that consider computers and robotics as collaborative partners in the design process and have the capacity to record contextual factors, environmental analyses, and construction constraints to provide architects with design alternatives that meet complex requirements. We developed an integrated robotic development workflow architecture to address the problem, bypassing several proprietary software systems, which require specialized training and significant annual investments. The purpose of this analysis is to develop a framework that considers the significant facade perforation factors and combines architecture and compliance with climate, construction, and quality assurance requirements in a simple-to-use and effective platform. The proposed system is developed to provide a medium for the workflow between architects, engineers, specialist consultants, and manufacturers and the effective exchange of information on a single platform. This study is limited to double-facade perforation design and does not consider other facade design iterations.

Involvement of robotics in architectural design and construction
The use of large-scale CAD/CAM manufacturing and robotics in the AEC industry fills the gap between what is digitally designed and what can be achieved physically [20]. The AEC industry, however, remains a trade-oriented and labor-intensive sector with limited job automation. Research on construction-specific robots started in Japan in the 1980s and focused on various building-related activities [29].
However, the unpredictability of the construction sites, equipment limitations, and high costs have resulted in the practical use of a few construction robots on construction sites [12]. Related work has focused on either applying various approaches to specific robotic construction subtasks, such as assembly robots, or large-scale additive manufacturing techniques [7,30]. These approaches use machine control to take advantage of the superior surface-shaping capabilities of concrete and adobe frameworks or to incorporate autonomous distributed robotic systems to execute basic tasks collectively [1,31].
Recently, in architecture, multifunctional industrial robots have become popular in automation because, like personal computers, they are not designed for a particular objective but are appropriate for a variety of applications [21]. In particular, the widespread use of industrial robots has enabled the development of specialized manufacturing of architectural objects and mass-personalized manufacturing. These advancements have led to an increasing interest in reviving traditional techniques, or "digital craftsmanship," which contribute to creative and regionally appropriate architectural design that is formally complicated and complex, and often yield higher outputs [16,32].
To date, automated modeling methods have been used as an efficient digital drafting table rather than as active assistance in the design process [18,20]. Although current methods integrate performance input, the use of results-based simulation to inform generative design has only recently been successful [17,33]. Although tools for designing complex geometric shapes are now common, there is a lack of intuitive tools that enable designers to design and explore construction processes, particularly robotic solutions, and their highly complex and precise geometric configurations [2]. This is due, firstly, to the open-endedness of design problems and their unknown complexity, often involving hard-to-calculate synthesis [15]; secondly, to the general lack of a deep understanding and abstraction of fundamental shaping processes that would enable building tools to aid design intuition [20]; and thirdly, to the exponential advancement of automated production and robotic technology on the one hand, and the growing need for architects to develop their programming skills on the other [16,34]. Together, these factors have inhibited the development and role of computing and robotics in shaping and further contextualizing design awareness and integrating it into architectural design decisions [35].

Automated continuous construction progress monitoring using 3D scans and extended reality
In the construction industry, there are increasing concerns related to timely and reliable information on the status of building projects [4]. Unfortunately, one of the most challenging aspects of building project management is constant supervision and regulation at all stages of the project [36,37]. This involves assessing progress via site assessments and alignment with the project schedule; however, the accuracy of progress assessment results depends highly on the expertise of the surveyor and the consistency of the measurements [38,39].
Manual visual assessments and standard development reports help to provide reviews of progress measurements [39], hardware and inventory tracking [40], health planning [6], and production tracking [41]. However, such processes are time-consuming, vulnerable to error, and therefore uncommon [16]. Kinect is a common 3D scanning tool used for building texture analysis [40], assessing the spatial accuracy of depth data, indoor mapping applications [42,43], near-range 3D measurements and modeling [42,44], and automatic identification of worker actions.
According to Kwon et al. (2014), introducing BIM models and AR offers an opportunity to improve current inspection models for defect and quality control [45,46]. That is, specific details such as sketches, schedules, and materials from a BIM model are linked to a marker that renders the actual components, together with these details, on-site. Park et al. (2013) describe the AR implementation process in three stages: defining objects and estimating information, predicting the positions of virtual objects based on the estimated information, and finally, integrating the real world with the virtual elements [47,48].
Several studies have evaluated the combined use of BIM and AR/VR. Bae et al. (2013) analyzed a mobile AR app wherein the user takes photographs of the site; by comparing them with pre-collected site photographs, the program determines the user's position relative to the site and delivers cyber-information accurately depending on the user's role [49]. The framework displays a model that is updated at various times, and users can use a mobile device to monitor the progress of different areas. The integration of the two techniques was also used in scanning a historical house to provide a virtual reality interface with enhanced degrees of information [50].
Despite these advancements, a practical workflow for designers to integrate simulation outcomes as drivers and constraints early in the design phase is still missing [21]. Additionally, diverse software is used in the different design processes (concept development, modeling, production, and construction planning), and therefore, the continuous sharing of knowledge between the AEC processes is a challenge. Furthermore, based on the results of previous studies, none of the existing approaches offers extensive and accurate construction surveillance covering an entire building (outdoor and indoor) during the construction phase.

Methodology
The objective of this study is to develop a facade construction methodology for the construction site using a robot arm, as an extension of our robot-based facade assembly optimization [18]. This extension of the iFOBOT system performs double-skin facade perforation on the construction site, incorporating environmental analysis into the design stage and a monitoring tool into the construction stage.
The research questions are as follows. How can we develop a simple facade design tool compliant with LEED environmental analysis rules? How can robotics constraints and environmental analysis validate simulations as performance drivers to improve a designer's decision-making process? Can the design and construction of a double-skin facade be integrated using the iFOBOT system, limiting human input to selecting design parameters and monitoring the perforation process performed by a robot arm?
The bottom-up approach to facade design, wherein environmental analysis and robot-operating strategies drive the perforation size and location from the beginning of the facade design, can provide designers with optimized solutions to constructability problems and feedback on the perforation process activity on the construction site [8,51].
This study, therefore, explores the hypothesis that robotic simulation can become an integral part of the digital design workflow, combining robotic agency with other objectives, including environmental analysis and heuristic estimation of construction efficiency.
This study proposes double-skin facade perforation on the construction site using a robot arm and environmental analysis. This system is called iFOBOT, wherein "i" denotes minimum human input, "F" denotes facade design and fabrication, and "OBOT" represents the involvement of the robot arm in the system.
The proposed system architecture comprises four modules: (1) the iFOBOT-environment module provides the essential environmental analysis to guide the main double-skin facade outlines, (2) iFOBOT-design generates a perforated exterior layer of the double-skin facade according to the outlines provided by the previous module, (3) the iFOBOT-construct module controls the robot arm movement on the construction site for facade perforation, and (4) the iFOBOT-monitor module observes the perforation activity on the construction site and transfers data between the construction office and the job site in near real-time. The iFOBOT system utilizes the graphical algorithm editor Grasshopper [52] in the commercially developed 3D modeling software Rhinoceros 3D [53].
The proposed system inputs are targeted at building envelopes and include the interior space, interior facade, and exterior facade layer geometry. Additionally, they comprise the LEED regulations, including environmental information, rule constraints, and geospatial location information used for the environmental evaluation. The system input is imported into iFOBOT-environment, which produces a LEED compliance heatmap of the exterior facade layer and transmits it to the other system modules. Thereafter, iFOBOT-design uses the heatmap from iFOBOT-environment as a guide to generate a perforated facade layer and transmits it to the next module. Subsequently, iFOBOT-construct uses a perforation toolpath to guide the robot arm to mill the exterior facade wall onsite. The iFOBOT-construct trade-off metrics are daylight factor analysis, collision detection between the robot arm body and surrounding objects, singularity avoidance, reachability detection, and reduced movement travel time. Finally, iFOBOT-monitor registers the robot movement and perforation details and transmits them to the construction office for quality evaluation. The review is sent back to the construction job site using 3D scanner and extended reality technologies. The system outputs are the double-skin perforated facade geometry, perforation drawings, environmental analysis, the KUKA KRL file to run the robot arm, and the construction progress steps, as depicted in Figure 1.
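The four-module dataflow described above can be summarized as a pipeline in which each module consumes the previous module's output. The following is a minimal, hypothetical sketch of that pipeline; the function and data names are illustrative stand-ins for the actual Grasshopper definitions, not the system's real interfaces.

```python
# Hypothetical sketch of the iFOBOT four-module dataflow.
# Names and data structures are illustrative, not the actual implementation.

def ifobot_environment(envelope, weather, leed_rules):
    """Produce a per-panel LEED compliance heatmap (0.0 = fail, 1.0 = pass)."""
    return {panel_id: 1.0 for panel_id in envelope["exterior_panels"]}

def ifobot_design(heatmap):
    """Generate perforation outlines only where the heatmap allows openings."""
    return [pid for pid, score in heatmap.items() if score >= 0.5]

def ifobot_construct(perforations):
    """Turn perforation outlines into milling operations (one per opening)."""
    return [("mill", pid) for pid in perforations]

def ifobot_monitor(toolpath):
    """Report perforation progress as completed vs. total operations."""
    return {"completed": 0, "total": len(toolpath)}

# Example run on a toy three-panel envelope.
envelope = {"exterior_panels": ["P1", "P2", "P3"]}
heatmap = ifobot_environment(envelope, weather=None, leed_rules=None)
toolpath = ifobot_construct(ifobot_design(heatmap))
status = ifobot_monitor(toolpath)
```

The point of the sketch is the strict one-directional handoff: design never bypasses the environmental heatmap, and construction never bypasses the design outlines, mirroring Figure 1.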

iFOBOT-environment
The iFOBOT-environment module uses predefined input data including weather analysis data to develop an environment rule compliance examination tool that generates an environment-friendly facade perforation heatmap and passes it to iFOBOT-design. The input to iFOBOT-environment includes weather data, building location, and orientation. These inputs are used to build the sun path control and climate-based performance simulation using the Ladybug and Honeybee plugins [54] within the Grasshopper canvas.
This method is developed to measure the annual sun exposure (ASE), a metric that identifies the potential for visual discomfort in the interior space. iFOBOT-environment calculates the ASE to determine the LEED v4 daylight credit [18]. No more than 10% of the interior space should receive direct sunlight greater than 1000 lux for more than 250 h per year (ASE1000,250). The ASE calculation grid size should be no greater than 600 mm square at a work plane height of 750 mm above the finished floor. An hourly time-step analysis based on typical meteorological year data, or an equivalent, from the nearest accessible weather station is used. Any existing obstructions in the interior space are included, and movable furniture and partitions are excluded.
iFOBOT-environment partitions the ASE calculation into discrete components so that it can be computed for the exterior layer geometry of the double-skin facade. In particular, the ASE determines the percentage of floor area receiving more than 1000 lux of direct sunlight for more than 250 h per year. The visual program script builds on the ASE formulation in the LEED credit index [55]. Consequently, iFOBOT-environment begins by measuring only the direct sunlight that falls on the exterior horizontal surface. This is thereafter multiplied by the window transmittance and used to select the sun positions with values just above the 1000 lux threshold. The resulting sun vectors are then used in the "sunlight hours" measurement to calculate the floor area that reaches 250 h per year. The resulting values indicate whether the facade design perforations meet or fail these criteria, as depicted in Figure 2.

iFOBOT-construct
iFOBOT-construct is responsible for robot movement during the perforation activity on the construction site. It receives the facade perforation design from iFOBOT-design, in accordance with the design parameters, and generates the robot language control data.
The iFOBOT-design provides the perforation outlines and depth required for the robot arm to mill the exterior facade layer. Designers need to input predefined parameters to iFOBOT-construct for the digital robot movement simulation to be performed. The input data include the robot arm type selection, location, orientation, milling spindle tool, robot brand, construction environment 3D model, perforation activity, and surrounding collision objects on the construction site. This module processes the input data to calculate the robot arm safe working space, as depicted in Figure 3 (components d, e). Thereafter, the module divides the building envelope into workstations (WS) according to the robot arm safe working space parameters, as depicted in Figure 3 (component c). The workstation script was developed based on the iFOBOT-B dataset [17].
iFOBOT-construct uses the KUKA|prc plugin [56] in the Grasshopper environment. It builds a parametric model of the robot arm movement and a kinematic simulation of the robot. The outputs of this module are workstation 3D models of the perforation activity, including the robot arm, facade exterior wall, construction constraints, and perforation toolpath. The output also includes the KUKA KRL file that can be executed on the KUKA robot without additional software, as depicted in Figure 3.
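The division of the building envelope into workstations (Figure 3, component c) can be illustrated with a strongly simplified one-dimensional model. The sketch below assumes a flat wall strip and a fixed safe reach to either side of the robot base; the actual module derives the safe working space from the KUKA|prc kinematic model rather than from a constant radius, so this is only a conceptual stand-in.

```python
# Illustrative workstation (WS) planning for a flat facade wall.
# Assumption: the robot safely covers `reach` metres to either side of
# its base; the real module computes this from the kinematic simulation.
import math

def plan_workstations(wall_length, reach):
    """Return base positions along the wall, one per workstation."""
    width = 2 * reach                       # span covered per base position
    count = math.ceil(wall_length / width)  # number of workstations needed
    # Centre each base on its segment, clamped so the last WS stays on the wall.
    return [min((i + 0.5) * width, wall_length - reach) for i in range(count)]

# A 12 m wall with a 2 m safe reach needs three workstations.
bases = plan_workstations(wall_length=12.0, reach=2.0)
```

In the real workflow, each base position then gets its own perforation toolpath, collision set, and KRL export.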

iFOBOT-monitor
The iFOBOT-monitor module monitors the robot arm at the construction site using 3D camera scanning and extended reality. The iFOBOT-monitor was added as an essential subsystem of iFOBOT because the robot arm operates in a dynamic environment with high risk, and an effective monitoring tool that records the robot arm movement and exchanges data with the construction office is essential in onsite perforation activities.
The iFOBOT-monitor is initialized with predefined settings including the scanner coordinates on the job site, field of view, resolution quality, robot arm color coding, robot workstation model location, and BIM level of detail. None of the previously reported methods in the literature is capable of providing extensive and accurate construction monitoring that covers an entire building (outdoor and indoor) during the construction process. Therefore, our study focused on comparing 4D as-built and 4D as-designed BIM models. We aimed to create a system that does not require additional effort for point cloud acquisition and solves the problem of incomplete point clouds owing to occluded or missing building elements (e.g., during online indoor or outdoor scanning). The basic idea is to create 4D point clouds continuously by collecting 3D scans from the site using suitable low-cost 3D scanning tools, such as Kinect [43,47]. Although we utilized the Kinect v2 sensor in our case study experiments, several other tools are also appropriate (e.g., the Xtion PRO Live 3D sensor or the Structure Sensor). The approach is explained in detail in subsequent sections.
The 3D laser scanner obtains the point cloud data of the workstation including robot movements, facade perforations, construction constraints, and surroundings and registers the data in a portable computer on the job site using Quokka Kinect data processing [57] and Firefly real-time data processing [58] plugins in a Grasshopper environment. Thereafter, it processes the target objects' (facade perforations herein) point cloud data into a 3D mesh geometry with reduced vertices to yield a light and transformable file. Subsequently, the iFOBOT-monitor utilizes a Speckle plugin [59] to transmit the 3D mesh geometry to the construction office and synchronize/update digital models in the BIM environment. The inspector utilizes virtual reality to examine the progress rate, assess the quality, add notes, and view schedules and compares the BIM model with data from the job site using the Mindesk plugin [60] in the Grasshopper environment. Finally, the inspector sends the inspection digital report feedback to the job site for the worker to view in augmented reality using the Fologram plugin [61] in the Grasshopper environment, as depicted in Figure 4.
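Before meshing and transmission, the monitor pipeline crops the scan to the target wall and thins the cloud so the geometry sent to the office stays light. The following is a pure-Python, hedged stand-in for that pre-processing step; the actual system performs it with the Quokka and Firefly components in Grasshopper, and the box bounds and thinning factor here are illustrative.

```python
# Sketch of the monitoring pre-processing: crop the scanned point cloud
# to the facade wall region, then thin it before meshing. A stand-in for
# the Quokka/Firefly Grasshopper components, not their actual API.

def crop_and_thin(points, box_min, box_max, keep_every=4):
    """Keep points inside the axis-aligned box, then keep every Nth point."""
    inside = [p for p in points
              if all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))]
    return inside[::keep_every]

# Synthetic "scan": 100 points along the x-axis at a fixed depth.
cloud = [(x * 0.1, 0.0, 0.5) for x in range(100)]
wall = crop_and_thin(cloud, box_min=(0, -1, 0), box_max=(5, 1, 3), keep_every=4)
```

The reduced cloud is what gets triangulated into the lightweight mesh that Speckle streams to the construction office.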

Case study setup
The core thermal model variables are illustrated in Figure 5. The model comprised a single-zone layout with all surfaces assumed to be adiabatic except the glass façade, representing a single room in a large building. For the thermal, daylight, and glare simulations, each shading variant was applied to the model's exterior. The case study was located at 471-080 Galmae-dong, Guri-si, Gyeonggi-do, South Korea, near Seoul.
Four design choices for the facade were examined to evaluate the efficiency of the traditional shading approaches against a rigorous constraint-focused solution. The iFOBOT process began by simulating four different exterior facade designs in iFOBOT-environment and examining the portion of space that receives more than 250 h of daylight and the LEED status of the geometry. The first facade design yielded a portion of space with > 250 h of 54% and did not comply with the LEED v4 index. The second facade design yielded a portion of space with > 250 h of 0% and complied with the LEED v4 index. The third facade design yielded a portion of space with > 250 h of 20% and a LEED status of geometry of B. The fourth facade design yielded a portion of space with > 250 h of 1% and complied with the LEED v4 index. The façade designs are shown in Figure 5 (denoted by 1, 2, 3, and 4).
The heatmaps of the four facade alternatives followed the traditional sun-path solution model. Determined by an hourly thermal simulation, the solar gain desirability is mapped to the region of the sky where the sun is located at the respective hour. The chart further highlights that weather conditions around the summer solstice are not symmetrical; a particular location of the sun can be advantageous in terms of solar gain in the spring, whereas the same location can be undesirable in the fall. The 3D design model developed herein included all the features required to incorporate detailing and structural constraints. Because the model is developed in a software framework that also supports environmental optimization, it is simple to re-run the Ladybug simulation on the final design in the Rhino environment. The simulation helps investigate the impact of incorporating construction constraints on the shading system performance. An iterative feedback process between simulation and concept detailing is thereby enabled. Once a satisfactory solution is obtained, the final geometry is selected for use in the robotic code generation required for the production process.

Case study validating iFOBOT-environment and iFOBOT-design
The iFOBOT-design module then used the heatmap of the best facade design (facade design 4 herein) to generate the perforation corresponding to the initial facade openings using an image sampler component in the Grasshopper environment. The iFOBOT-design and iFOBOT-environment passed the perforated facade openings to iFOBOT-construct for onsite construction using robot arms, as depicted in Figure 5.
The geometry of the perforation was further modified to satisfy different building requirements and material constraints. The environmental design phase produced a continuous surface model of the perforation that was subdivided in the next stage to fit the supporting framework and the corresponding environmental heatmap. A Grasshopper definition designed to balance the user-defined supporting structure and accept the desired relational information was used to automate the surface subdivision. The resulting individual perforation surface geometry was used to identify the component's central axis and to locate mounting mechanisms on the support.

Case study validating iFOBOT-construct
Whereas the previous stages were conducted by the construction team comprising the contractor, façade specialist, and environmental engineer, the next stage was performed by the designer in operation.
A 6-axis commercial robotic arm, the KUKA KR 30/60 HA, was used for the perforation. A series of custom Grasshopper definitions was created from the optimized geometry-positioning commands developed in the previous stages of the workflow. Herein, every digital part of the central digital construction model was perforated using a flat end mill bit and was rotated and positioned relative to the manufacturing process requirements. Additional construction restrictions can be considered in the scripts and included as components. Furthermore, high-quality industrial facade perforation products were dimensionally rectified at the end of production to achieve specific tolerances in the perforation process. Similarly, the manufacturing process developed uses robotic post-processing in the green state to provide a consistent thickness and an exact shape of the component, as illustrated in Figure 6. Herein, robotic milling was used to achieve dimensional rectification using the automation tools developed. Using the KUKA|prc plugin in Grasshopper, code for surface milling tool paths and contour cuts was rapidly generated [56]. Without manual tooling, all necessary code was generated directly from the central Rhinoceros model for the 6-axis robotic work cell.
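The per-hole milling sequence generated from the perforation outlines can be pictured as a plunge-and-contour toolpath. The sketch below is a strongly simplified, hypothetical version for a circular opening; the actual toolpaths are produced through KUKA|prc from the Rhino model, and the hole dimensions and sampling density here are illustrative only.

```python
# Simplified plunge-and-contour toolpath for one circular perforation,
# milled with a flat end mill. Dimensions in metres; all values illustrative.
import math

def contour_toolpath(cx, cy, radius, depth, steps=8):
    """Approach and plunge at the centre, trace the contour, then retract."""
    path = [(cx, cy, 0.0), (cx, cy, -depth)]   # approach and plunge
    for i in range(steps + 1):                 # closed contour at full depth
        a = 2 * math.pi * i / steps
        path.append((cx + radius * math.cos(a),
                     cy + radius * math.sin(a),
                     -depth))
    path.append((cx, cy, 0.0))                 # retract to the surface
    return path

# A 100 mm diameter hole milled 20 mm deep.
path = contour_toolpath(cx=0.0, cy=0.0, radius=0.05, depth=0.02)
```

In the real workflow, a point list of this kind is converted by KUKA|prc into linear-motion commands in the exported KRL file.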
The integration process was examined with the full-scale prototypical facade segment developed. The perforations on the facade follow the iFOBOT-environment heatmap. Figure 7 shows that the design appropriately addresses the material and construction parameters.

Case study validating iFOBOT-monitor
The iFOBOT-monitor performs perforation activity monitoring and exchanges progress updates between the job site and construction office. It first scans the scene using a 3D camera and captures 3D point cloud data. Thereafter, it converts point cloud modeling data to a 3D mesh. Then, it sends the site data to the construction office. An inspector examines the data using virtual reality and prepares an inspection report. Finally, the worker on the job site receives the inspection feedback and visualizes it using augmented reality technology.
In our case study, the monitoring commenced by placing a Kinect V2 camera safely beyond the range of the robot arm movement. The camera was manually adjusted to capture the robot arm body, facade wall, and surroundings. Thereafter, the exterior wall of the double-skin facade was cropped from the point cloud to reduce computational time and data. It was converted to a 3D mesh in the iFOBOT-monitor initialization stage. We used the Speckle plugin to transmit the 3D mesh to the construction office. The inspector synchronized the BIM model in the office and received the scanned 3D mesh from the job site. The inspector used an Oculus Rift S headset as a virtual reality host to inspect the as-built model and compared it to the designed facade wall. The Mindesk plugin in the Grasshopper environment enabled the inspector to zoom in, write notes, take measurements, draw shapes, examine the quality, monitor coordinates, and examine progress in the VR mode. The inspector then prepared a feedback report and connected it to the Fologram plugin. The worker on the job site received the update report on a smartphone pre-synchronized with the construction office and connected to the same Wi-Fi network. Finally, the worker was able to visualize the inspector's report and overlap the as-designed model with the as-built model using a QR code. The iFOBOT-monitor is an essential part of iFOBOT, as it allows rapid and easy exchange of robotic perforation data between the job site and construction office because the different technologies are connected to the same platform. Each loop of data exchange lasted 17 min on average. The digital model overlaps with the as-built point cloud at the construction site to assess whether the robot is performing the correct joint movements and milling the perforations accurately, as depicted in Figure 7.

Discussion
This research developed and evaluated an optimized iFOBOT system for on-site double-skin surface perforation. Our case study provided insights into the effect of one design parameter, the angle of perforation, on the system's design generation and behavior. The double-skin facade perforation scale and shape are managed by the iFOBOT-design and guided by the iFOBOT-environment, which indicates whether the façade complies with the LEED v4 index. In our case study, the fourth iteration successfully passed the environmental rule compliance examination. The iFOBOT-construct then developed a milling toolpath for the robot arm to initiate perforation at the job site. The experimental test was successful, and approximately 1.5 h were required to complete half of the wall milling task. The iFOBOT-monitor successfully registered a 3D point cloud of the perforation activity and converted the target wall to a 3D mesh. Thereafter, it sent the 3D mesh geometry to the construction office for inspection and quality examination. The inspector used the virtual reality interface connected to the iFOBOT system to review, comment, take measurements, and examine the perforation progress. Finally, the worker received an inspection report at the construction site and visualized it using augmented reality technology. Each loop of this data exchange requires an average of 17 min. The Kinect camera used in this study performs best in indoor environments with controlled illumination but has limitations outdoors. Therefore, in the future, we plan to use outdoor depth-capturing sensors, such as the ZED 2 [62], and integrate them with the iFOBOT system. We will collect and evaluate data to further improve the heuristic method, as well as to refine the individual modules and the negotiation of data between them. Furthermore, the full integration of the geometry and the data flowing through the modules of iFOBOT will be considered.
Improving the dataflow to provide essential feedback via inputs from the building and simulation is an important enhancement to the iFOBOT framework.

Conclusion
This study introduced an integrated concept for the robotic manufacturing workflow. Although this workflow was developed here for a high-performance ceramic facade system, it is transferable to other material systems. Custom Grasshopper components and scripts were merged into a single software framework covering all required actions. This approach to design automation promotes cooperation among the team leaders engaged in tailor-made, high-performance façade concept creation and construction planning. Furthermore, the automated workflow directly facilitates the manufacturing process. It enables iterative simulation of the building geometry and examination of whether the geometry generated in the early stages of design performs as expected.
Environmental optimization creates highly customized perforation forms on facades, which can result in visually rich and complex façade patterns. In practice, there is widespread interest in the purely formal expression of parametric designs, an interest that can generate demand for similarly complex geometries even in the absence of performance optimization. By considering existing manufacturing processes and integrating the related digital design processes into production environments, highly variable geometries can be readily implemented in physical form.
The field of architecture has long framed the design problem as a choice between expensive one-off architecture and inexpensive yet relentless high-volume industrial manufacturing. Robotic technologies and related approaches to design automation can provide a third route to solving this age-old problem. Focusing on environmental efficiency as a configuration engine does not mean sacrificing architecture; the two are no longer mutually exclusive.
