Research Article  |  Open Access  |  24 Mar 2024

Robotized technologies for enhanced shipyard operations: challenges and solutions

Green Manuf Open 2024;2:6.
10.20517/gmo.2023.102601 |  © The Author(s) 2024.

Abstract

Large component manufacturing relies heavily on manual operations and human workers. Human-centric solutions can preserve industry-specific knowledge, extend capabilities, and improve job performance. Three robotized technologies were developed for shipyard operations: ABB™ and KUKA™ robot hand-guiding systems (HGS), a lightweight collaborative system for plasma cutting, and a cost-effective 3D projection system for retrofitting. These technologies were developed at the open didactic factory, which served as a platform for rapid technological advancement. The HGS was integrated with ABB™ and KUKA™ robots, and the 3D projection technology and lightweight collaborative system offered a cost-effective solution for small and medium shipyards. However, transitioning to non-flat surfaces presents challenges due to geometric variations and discrepancies between the computer-aided design model and the actual component.

Keywords

Large component manufacturing, human-robot collaboration, 3D projection, hand-guiding systems

INTRODUCTION

Manufacturing large components involves designing components into sub-assemblies, which are then assembled in a specific sequence. This process is influenced by factors such as materials, geometry, and accessibility. Shipbuilding often opts for manual labor over digital robotized solutions due to considerations such as tolerances, expensive equipment, and the need for immediate access to components[1]. At the same time, shipbuilding is a significant contributor to global greenhouse gas emissions. Amid growing political and social pressure to reduce the environmental impact, the shipbuilding industry is adopting advanced solutions such as Shipyard 4.0 to enhance efficiency and sustainability[2,3]. Digital transformation of the shipbuilding sector is needed to address technology adoption and reduce carbon footprints[3]. While integrated data management[4] improves product traceability and quality, human skills remain crucial for sustainability[5]. Tools such as human-centric robotics technologies powered by artificial intelligence (AI) and Augmented Reality/Virtual Reality (AR/VR) can support the human workforce in improving the manufacturing digitization, efficiency, quality, and sustainability of large-sized components[6].

This research contributes to human-centric robotics, enhancing shipyard operations through improved production, efficiency, and green innovation. These technologies facilitate the adoption of green technologies, boosting productivity and green product development, thereby reducing energy use and reliance on traditional energy sources. This paper focuses on hand-guiding systems (HGS), 3D projection, and collaborative robotic technologies.

● High-payload industrial robots, a mature technology, are used in manufacturing large-sized components[7]. However, they present challenges in flexibility, necessitating significant investment and dedicated space, which limits their adaptability in operations. HG industrial robots can mitigate these issues by enabling the manipulation of heavy components, reducing reliance on traditional jigs and fixtures, and leveraging digitization capabilities. This not only enhances production system efficiency but also aids in emission reduction. Nonetheless, in practice, HGS often struggle to balance the support of heavy weights with the transparency required for human manipulation.

● Significant research efforts have been dedicated to exploring the application of AR/VR for construction supervision, maintenance, and assembly instructions. These instructions can include textual, visual, or auditory information[8]. There is also a growing interest in projection-based AR, which offers immersive human-machine interfaces (HMI) to help human operators perform tasks more quickly, accurately, and with fewer mistakes, thereby reducing the dependence on engineering paper-based drawings[9]. However, projection systems often fall short in terms of portability, range, and flexibility.

● Portable collaborative robots, capable of operating within closed structures, have been a focus of past research[10]. Despite their potential, many tasks are still performed manually, which is not only error-prone and energy-intensive but also hinders the efficiency and sustainability of production processes. Collaborative robotic technology, a mature field, is becoming more accessible thanks to the development of small, portable robots. This advancement opens up new possibilities for applications such as welding and cutting in narrow ship structures, with the ability to integrate digital infrastructure. However, there remains a need for human-centric, portable robotic solutions to address customized, low-volume manufacturing tasks.

Finally, this opens the possibility to explore the concept of open didactic factories[11] for technology validation. This can also be extendable to other topics, such as demonstrating technology specifications and preserving technology skills, although these are outside the scope of this paper. Technology validation was achieved to raise awareness among industrial companies about the advantages and benefits of industrial robotics, portable projection systems, and mobile manipulators, and their impact. This concept is designed to equip and enhance the workforce for emerging manufacturing ecosystems at different stages of their product life cycle. In this way, their overall impact, including on the environment, can be assessed before their deployment at the shipyards.

This paper is structured as follows: we begin by detailing the technology-wise methods, including an explanation of the scenario at the didactic factories, the materials, and the procedures involved. This is followed by a presentation of the results, organized according to the different technologies used. The paper concludes with a discussion section, which discusses the results and outlines potential future research directions.

METHODS

HG technology for industrial robots

Scenario

A manipulation scenario for HG industrial robots was selected to move large components within the shipyard and was simulated in the didactic factory [Figure 1A]. This task, designed to simulate a human-robot shared workspace, used the hand-guided industrial robot to manipulate large pre-fabrication components. A safety system was devised that automatically and dynamically switches between autonomous and collaborative modes; this paper focuses on its HGS aspect. When the robot picked up a large component, the collaborative mode was activated. Assisted by the HGS, the human operator then maneuvered the robot to the desired position for the next fabrication process.


Figure 1. HGS. (A) ABB™ and KUKA™ robot moved by an operator using HGS; (B) Hardware and software relations. HGS: Hand-guiding systems.

Material

The system consisted of three main components: an industrial robot (ABB™ and KUKA™), a force/torque (FT) sensor (SCHUNK™), and an HMI (joystick from ABB™) [Figure 1B]. The software comprised a two-layered architecture: one layer provided an interface to communicate with the hardware, and the other was the main HG module layer. The HG module was centered on a PID-based force controller whose inputs are the current robot position and the measured force, and whose output is a position/speed command, with the proportional part adjusting the response to the applied force and torque.

Procedure

The configuration of the HGS involved four steps. First, the controller parameters were computed to achieve a soft and correct controller response. Next, the sensor was calibrated to identify adapters and devices attached to the robot tool. In the third step, the controller was enabled and the operator moved the end effector/tool to the piece position for pickup. Finally, this process was repeated with unknown loads.

The hand-guided, reaction-based force controller detects human intent to move the robot using a joystick and a calibrated FT sensor. This controller processes the FT sensor information in conjunction with the position data of the robot to generate the desired robot trajectory during hand-guided motion. To ensure better robot movement performance, it is important to set the Denavit-Hartenberg (DH) parameters of the specific robot model in the main controller and to calibrate the FT sensor to account for the load from tools, the joystick, and other components attached to the sensor.
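The force-to-motion mapping described above can be sketched as a small PID-style loop. This is a minimal illustration, not the authors' implementation: the gains, the dead-band threshold, and the class name are illustrative placeholders, and a real controller would also handle torques and joint limits.

```python
import numpy as np

class HandGuideController:
    """Sketch of a PID-based force controller for hand-guiding.

    Gains and the dead-band threshold are illustrative placeholders,
    not values from the paper. Input is the force part of the wrench
    measured by the FT sensor; output is a Cartesian velocity command.
    """

    def __init__(self, kp=0.002, ki=0.0001, kd=0.0005, dt=0.004, dead_band=2.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.dead_band = dead_band          # N: ignore sensor noise below this
        self.integral = np.zeros(3)
        self.prev_force = np.zeros(3)

    def step(self, force):
        """Map a measured force (N) to a Cartesian velocity command (m/s)."""
        f = np.where(np.abs(force) < self.dead_band, 0.0, force)
        self.integral += f * self.dt
        derivative = (f - self.prev_force) / self.dt
        self.prev_force = f
        return self.kp * f + self.ki * self.integral + self.kd * derivative
```

The 4 ms step time corresponds to the 250 Hz communication rate mentioned below; pushing along one axis yields a velocity command along that axis while the dead band suppresses sensor noise on the others.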

The externally guided motion (EGM) feature of the ABB™ robot software and the robot sensor interface (RSI) feature of the KUKA™ robot software were leveraged to communicate with externally developed controllers. The chosen communication protocols support high-speed (250 Hz) communication. However, KUKA™ RSI operates on position commands, while the ABB™ system is velocity-commanded. Deploying this controller on a KUKA™ robot therefore requires an additional computation that integrates the commanded speed over the control loop time before sending it to the robot.
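The extra integration step for the position-commanded KUKA™ interface can be illustrated as follows. This is a hypothetical helper, not the authors' code; the two-period look-ahead window mirrors the approach reported in the results section, and the default period is an assumption.

```python
def velocity_to_position_command(current_pos, velocity, control_period=0.05):
    """Convert a hand-guiding velocity into an absolute position command.

    A velocity-commanded interface (e.g. ABB EGM) can consume `velocity`
    directly; a position-commanded one (e.g. KUKA RSI) needs the target
    position instead. Integrating over twice the control period, as the
    paper describes, avoids the robot stopping between set-points.
    """
    window = 2.0 * control_period           # look-ahead window (s)
    return [p + v * window for p, v in zip(current_pos, velocity)]
```

For example, with a 0.05 s control period, a 0.1 m/s velocity on one axis produces a set-point 0.01 m ahead of the current position on that axis.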

Portable and cost-effective 3D projection technology

Scenario

The retrofitting application was selected within a didactic factory testbed, simulating a real-world scenario [Figure 2A and B]. The computer-aided design (CAD) file of the retrofitting structure was available; the aim was to project the anticipated fabrication work onto the structure, guiding operators through their tasks without resorting to traditional paper drawings and cumbersome measurement tools.


Figure 2. 3D projection Pan/tilt mechanism. (A) Real prototype inside didactic factory; (B) Real prototype on the ceiling; (C) Range of movement of pan/tilt mechanism; (D) FOV range envelope of the pan/tilt mechanism. FOV: Field of view.

Material

The newly developed pan/tilt mechanism, featuring two degrees of freedom, offered a range of motion from 0º to 306º in the pan axis and -66.5º to +66.5º in the tilt axis, as demonstrated in Figure 2C and D. Weighing approximately 10 kg, the unit can support up to 9.5 kg payload and is mountable on any flat ferromagnetic surface via a MagMount Superior 300 MagSwitch. The system, equipped with an EPSON EH-TW5400 projector and a Zed 2i camera, delivered a resolution of 1,280 × 720 at 60 fps, enabling efficient perception and projection.

Procedure

The procedure encompasses three primary steps: scanning the target area, applying a localization algorithm, and projecting operational data. Initially, the pan/tilt system scans the area, moving the projector bidirectionally to gather 3D data for the digital reconstruction of the target area for future fabrication. The localization algorithm then compares this reconstructed model with its corresponding CAD model, using an Iterative Closest Point (ICP) algorithm[12] to minimize the distance between the observed points and the CAD points. Lastly, the procedure involves projecting dimensions or operational information onto the scanned area, a process whose accuracy depends on the localization of the projector relative to its surroundings.
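The ICP alignment at the heart of the localization step can be sketched in a few lines. This is a minimal point-to-point variant (nearest-neighbour correspondences plus a Kabsch/SVD rigid fit), not the production pipeline, which the paper bases on the Point Cloud Library[12] with proper correspondence rejection and convergence checks.

```python
import numpy as np

def icp_step(source, target):
    """One point-to-point ICP iteration: returns R, t moving source toward target."""
    # Brute-force nearest-neighbour correspondences (fine for small clouds)
    d = np.linalg.norm(source[:, None, :] - target[None, :, :], axis=2)
    matched = target[np.argmin(d, axis=1)]
    # Optimal rigid transform via SVD (Kabsch algorithm)
    mu_s, mu_m = source.mean(0), matched.mean(0)
    H = (source - mu_s).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_s
    return R, t

def icp(source, target, iterations=20):
    """Iteratively align the scanned cloud with the CAD-derived cloud."""
    src = source.copy()
    for _ in range(iterations):
        R, t = icp_step(src, target)
        src = src @ R.T + t
    return src
```

As the next paragraph notes, this kind of scheme only converges when the initial pose is close to the solution, which is why the two initialization approaches below were developed.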

The intrinsic parameters of the 3D camera were calibrated using chessboard pattern methods, and a similar procedure with known circular patterns was used for projector calibration[13]. The real-world location of the system and its relation to the CAD model were established by registering a captured point cloud with the ICP algorithm. Since the efficiency of the algorithm depends on accurate initialization, two approaches were developed to prevent failures when the initial state is far from the solution. The first, a Feature Search algorithm, employs 2D features to estimate the initial pose of an object, leveraging the ability of 3D cameras to generate 2D images; it struggles when the object is symmetrical or lacks unique features, in which case markers must be placed on both the CAD model and the real object, or an alternative method must be used. The second is a manual initial pose estimation in which a human selects an approximate viewpoint through a user interface. This method is particularly suitable for objects with symmetries or limited distinctive features, as it provides an initial approximation of their pose without requiring markers.

After establishing the spatial relationship between the object and the camera-projector system, the next step was to project information onto the object. Using the calibration matrices and the transformation determined through ICP, an accurate projection is achieved.
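The final projection step amounts to a standard pinhole mapping: apply the ICP-recovered transform to each CAD point and project it through the projector's calibrated intrinsics. The function below is an illustrative sketch; the intrinsic values in the usage example are invented, not the calibrated ones from the paper.

```python
import numpy as np

def project_point(point_cad, K, R, t):
    """Map a CAD-frame 3D point to projector pixel coordinates.

    K: 3x3 projector intrinsics (from the circular-pattern calibration);
    R, t: CAD-to-projector rigid transform recovered by ICP localization.
    """
    p_proj = R @ point_cad + t              # CAD frame -> projector frame
    uvw = K @ p_proj                        # perspective projection
    return uvw[:2] / uvw[2]                 # homogeneous -> pixel (u, v)
```

A point on the projector's optical axis lands at the principal point of the image, which is a quick sanity check for the calibration matrices.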

Portable lightweight collaborative robotic technology

Scenario

A plasma cutting operation during pre-fabrication and retrofitting was selected for developing the lightweight robotic solution inside the didactic factory [Figure 3]. The robot was positioned at an arbitrary location on the wall within the double hull structure; it can also be mounted on the ceiling or floor using the magnetic base. The CAD files of the double hull structure were accessible. The main objective of the robotic system was to localize itself using the perception system and scan the surrounding environment. It then virtually projected the cutting profile from the digitized operation drawings onto the localized digital representation of the double hull. Finally, the robot autonomously followed the cutting profile, completing the cutting job. This technology can similarly be applied to welding scenarios.


Figure 3. The collaborative robot fixed to the wall of the double hull structure of the didactic factory with the plasma cutting torch.

Material

As depicted in Figure 3, the MagSwitch magnetic base was used to position the robot, a UR10 equipped with an Intel RealSense D435 camera and a Hypertherm Duramax Hyamp 180 plasma torch powered by a Hypertherm Powermax 85 power supply, on the hull block wall. Plasma cutting parameters were defined as follows: the tool speed was set at 60 mm/s, the torch-piece distance at 1.5 mm, and the perforation delay time at 0.5 s. The intensity was selected at 80 A, the initial perforation height at 3.8 mm, and the air pressure at 6.5 bar. Finally, the material width was selected at 6.5 mm.
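The cutting parameters above can be grouped into a single configuration object, which makes it straightforward to pass a consistent parameter set to the cutting routine or log it alongside each job. This is an illustrative structure (the class and field names are invented); the default values are the ones reported in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlasmaCutParams:
    """Plasma cutting parameters as reported in the paper (units in comments)."""
    tool_speed: float = 60.0                 # mm/s
    torch_piece_distance: float = 1.5        # mm
    perforation_delay: float = 0.5           # s
    intensity: float = 80.0                  # A
    initial_perforation_height: float = 3.8  # mm
    air_pressure: float = 6.5                # bar
    material_width: float = 6.5              # mm
```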

Procedure

As mentioned earlier, the robotic system primarily aimed to employ the perception system for self-localization and to scan the surroundings. The configuration of the collaborative plasma cutting consisted of four main steps. First, the cutting parameters and scanning areas were defined. Next, the perception system was triggered to perform localization and optimization. Then, the calculated information was shared with the robot, and the cutting profile from the digitized operation drawings was virtually projected onto the localized digital representation of the double hull. Finally, the robot performed the cutting operation by following the respective profile, which was verified with the help of the onboard camera.

The RealSense depth camera was mounted on the UR10 collaborative robot flange in an eye-in-hand configuration and positioned inside the hull block, facing the processing area. The robot was moved to several positions near the processing area to capture an image at each position. The point cloud images taken with the camera were stitched into a larger point cloud. The last captured image was utilized to manually locate the robot in its real surroundings, offering a first approximation by comparing the image with a nearby hull block CAD view.
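The stitching step above can be sketched as transforming each per-view cloud into a common base frame and concatenating. This is an illustrative sketch: it assumes the camera pose at each capture position is known (e.g. from the robot's forward kinematics and a hand-eye calibration, which the paper does not detail).

```python
import numpy as np

def stitch_clouds(clouds, poses):
    """Fuse per-view point clouds into one scene cloud.

    clouds: list of (N_i, 3) arrays in the camera frame at capture time;
    poses:  matching list of 4x4 camera-to-base homogeneous transforms,
            assumed known from the robot pose at each capture position.
    """
    fused = []
    for cloud, T in zip(clouds, poses):
        homogeneous = np.hstack([cloud, np.ones((len(cloud), 1))])
        fused.append((homogeneous @ T.T)[:, :3])   # camera -> base frame
    return np.vstack(fused)
```

The fused cloud is what the ICP-based localization then registers against the hull block CAD model.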

The ICP algorithm and CAD-matching localization methodology were applied in the same way as for the 3D projection technology; the only difference was the hardware. After localization, the intended cutting operation was virtually projected over the hull block and the information was shared with the robot, which then automatically executed the routines, considering the cutting parameters and desired shape. For this study, simple shapes, such as circles, squares, and triangles, were considered.

RESULTS

HG technology for industrial robots

In this section, the results of the HG of industrial robots for manufacturing large components are grouped into communication verification and FT sensor validation for the ABB™ and KUKA™ robots.

First, the communication between the FT sensors and the HG modules, and between the robot controller and the dynamic switching of operational modes, was established with the help of the Robot Operating System (ROS). The findings verified the software abstraction between the ABB™ and KUKA™ robots and the high-speed communication between the HG controller and the ABB™ EGM and KUKA™ RSI interfaces. In addition, the Operational Mode feature was confirmed, switching smoothly between automatic and manual modes. We demonstrated our results for the ABB™ and KUKA™ robots and presented the possible extension of the technology to COMAU™ robots. Other industrial robots were not presented as they are outside the scope of this study; however, we anticipate that, with a similar procedure, the technology could be extended to any industrial robot.

Second, the validation and verification of the FT sensor were carried out for the ABB™ and KUKA™ robots.

● In the ABB™ robot tests, the system was in “Manual Mode” and integrated with the HMI (joystick) [Figure 4]. The applied forces and the real Cartesian position of the robot (xyz) were analyzed. The controller's noise correction of the force input data can be observed in Figure 4A and D (dashed). This data was recorded from an ABB™ robot mounted on a support with rotations in the Y and Z axes; thus, Z movement at the sensor influenced the other positions. At the last applied force peak in Figure 4A and D (dashed), the tool center point (TCP) moved in the -X direction. In the Z axis, the TCP initially maintained its position; when more -Z effort was applied, the robot rose along the Z axis, as demonstrated in Figure 4C and F (dashed). The results showed that the module generated a smooth desired position by processing the noisy force and torque input applied by the operator at the joystick and sensed by the FT sensor.


Figure 4. Hand-Guiding Module random force and position results in three axes. (A-C) The HG force torque sensor output in the form of the wrench with respect to time; (D-F) The cartesian position when HG wrench was applied at the HMI corresponding to the sample time. HG: Hand-guiding; HMI: human-machine interfaces.

● In the KUKA™ robot tests, a smooth response was achieved by computing the command to send over a window interval of twice the control period. The new command was computed from the velocity obtained from the HG module, the current position of each joint, and twice the control loop time, following the uniform rectilinear motion equations. The controller therefore commands every 0.05 s but computes the target value over 0.1 s, ensuring that the robot moves smoothly without stopping and restarting at each sent command.

Portable and cost-effective 3D projection technology

Figure 5A shows the validation results of the projection technologies in five steps. The red point cloud represents the scanned area of the retrofitting structure. The system calibration axis is also depicted in the middle. The scanning outcomes were consistent, with continuous recording of the structural information across three consecutive scans in the pan direction. Figure 5B presents the fused point cloud information from the required scans; the digital representations from the scans were stitched homogeneously. Figure 5C illustrates the ICP iteration process with 820 iterations, aligning the grey target (the CAD-model-based point cloud) with the green point cloud taken from the scans. Both point clouds aligned well. Figure 5D displays the results of projecting the thickness of the beams in the retrofitting structure and window profiles. The projections correctly followed the structure profiles of interest, exhibiting good accuracy for display features perpendicular to the projector position. Distortions were visible for oblique and distant display features.


Figure 5. Projection results. (A) Scanning; (B) Stitching; (C) Optimization; (D) Feature display.

Portable lightweight collaborative robotic technology

The localization and plasma cutting results are shown in Figure 6. Figure 6A showcases the point cloud reconstruction of the double hull structure. The reconstruction was completed successfully, and the main features, such as the openings, corners, and steps, can be verified with the naked eye [Figure 6B and C]. An initial guess of the location of the system with respect to the structure can be observed. Figure 6D shows the optimization result of the point cloud and the CAD model. Figure 6E shows the overlap of the green point cloud on the grey CAD model; it also displays the result of overlaying the round cutting operation on top of the localized double hull digital representation. Finally, Figure 6F presents the result of the execution of the plasma operation.


Figure 6. Lightweight collaborative robots for cut openings results. (A) Quick deployment of a UR10 portable cobot to a magnetic base for reconstruction; (B) Generation of a CAD model, and comparison of the initial guess with the generated point cloud; (C) Optimization; (D) Initial guess results; (E) Representation of the cutting operation overlay on top of the point cloud overlay; (F) Demonstration by robotized cutting. CAD: Computer-aided design.

DISCUSSION

Three human-centric, robotized technologies designed for large-scale manufacturing operations in shipyards are presented in this paper. These innovations include ABB™ and KUKA™ robot HGS for manipulation tasks, a cost-effective projection system for information display during retrofitting and pre-fabrication processes, and a lightweight collaborative system for plasma cutting.

The proposed HGS has been successfully integrated with two major industrial robot controllers, ABB™ and KUKA™, and the COMAU™ robot. The integration process involved the use of a developed ROS controller, which allowed us to analyze the topics and endpoint data types. The analysis led to the conclusion that the KUKA™ approach would be suitable for COMAU™ robots, as both are position-commanded through ROS controllers. This technology has been extended to other industrial robots, with an example from the Mari4_YARD project[14] where the technology was implemented for the COMAU™ robot. However, there are inherent limitations to HG technology. These include position-based control strategies, safety, standardization, and legislation.

In the future, more research will focus on velocity, higher derivative control strategies, and standardization activities. These areas present opportunities for further exploration and development of this technology.

The proposed 3D projection technology offers a promising solution for small and medium shipyards. It provides a cost-effective and lightweight system with a wide tilt angle range. However, transitioning to non-flat surfaces, particularly during oblique projections, presents certain challenges. A notable increase in error is observed, which can, under certain conditions, extend to several centimeters, because angular errors are magnified by the curvatures and irregularities of the surface. Additionally, distortions occur since the projected light falls on a larger surface. This variability in error underscores the system's sensitivity to geometric variations. Another factor contributing to the observed errors is the discrepancy between the CAD model and the actual component. These deviations can originate from manufacturing variations, unforeseen deformations, or unrecorded alterations in the design. While these inconsistencies may seem minor during the design phase, they become more prominent in projection accuracy. Despite these limitations, we envision that the system can be used in shipbuilding applications such as identifying structural failures, locating cables and pipes in a wall, or inspecting small components. In the future, the focus will be on enhancing the accuracy of this system.

In the case of collaborative robots, off-the-shelf sensors are utilized for area scanning. The portability of this technology was evaluated for welding applications, enhancing its usability. This technology, however, shares similar limitations and future research directions with the 3D projection system. In addition, the future integration of this system with the enterprise resource planning (ERP) systems of the shipyards is envisioned.

The development of these technologies took place at the didactic factories. These facilities serve as platforms for rapid technological advancement, allowing for short development cycles. Such didactic factories will significantly contribute to the evolution of emerging technologies. In the future, the research will focus on the vertical integration and data management of the presented technologies in the didactic factories for the full design life-cycle demonstration of large components and their environmental impact.

DECLARATIONS

Authors’ contributions

Conceptualization, supervision, and writing-reviewing and editing: Masood J, Vidal F

Writing-reviewing and editing, methodology, results, and discussion: Castro D, Pertusa AM, Feijoo A

Availability of data and materials

Not applicable.

Financial support and sponsorship

The work was supported by the Spanish “Ministerio de Ciencia e Innovación” through the “Centro para el Desarrollo Tecnológico Industrial” (CDTI) and the European Union’s Horizon 2020 R&D program. The support was provided for the “5R - Cervera network of robotic technologies in smart manufacturing” project under Grant EXP00139978CER-20211007, the PENELOPE project under Grant Agreement No. 958303, and the Mari4_YARD project under Grant Agreement No. 101006798.

Conflicts of interest

All authors declared that there are no conflicts of interest.

Ethical approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Copyright

© The Author(s) 2024.

REFERENCES

1. Iwańkowicz R, Rutkowski R. Digital twin of shipbuilding process in shipyard 4.0. Sustainability 2023;15:9733.

2. Müller R, Esser M, Vette M. Reconfigurable handling systems as an enabler for large components in mass customized production. J Intell Manuf 2013;24:977-90.

3. Monostori L, Kádár B, Bauernhansl T, et al. Cyber-physical systems in manufacturing. Cirp Annals 2016;65:621-41.

4. Vidal F, Alonso L, de Luca P, Duplessis-Kergomard Y, Castillo R. Development of a digital thread for orchestrating data along the product lifecycle for large-part and high-precision manufacturing. In: International Conference on Flexible Automation and Intelligent Manufacturing. FAIM 2023: Flexible Automation and Intelligent Manufacturing: Establishing Bridges for More Sustainable Manufacturing Systems. Springer, Cham; 2023. pp. 668-76.

5. Sdoukopoulos E, Tsafonias G, Perra VM, Boile M, Lago LF. Identifying skill shortages and education and training gaps for the shipbuilding industry in Europe. In: Sustainable development and innovations in marine technologies. 2019. Available from: https://www.taylorfrancis.com/chapters/edit/10.1201/9780367810085-61/identifying-skill-shortages-education-training-gaps-shipbuilding-industry-europe-sdoukopoulos-tsafonias-perra-boile-fraga-lago. [Last accessed on 21 Mar 2024].

6. Heavin C, Power DJ. Challenges for digital transformation - towards a conceptual decision support guide for managers. J Decis Syst 2018;27:38-45.

7. Ferreira LA, Figueira YL, Iglesias IF, Souto MÁ. Offline CAD-based robot programming and welding parametrization of a flexible and adaptive robotic cell using enriched CAD/CAM system for shipbuilding. Procedia Manuf 2017;11:215-23.

8. Friedrich W. ARVIKA-augmented reality for development, production and service. In: Proceedings. International Symposium on Mixed and Augmented Reality; 2002 Oct 01-01; Darmstadt, Germany. IEEE; 2002. p. 3-4.

9. Serván J, Rubio JM, Mas F, Gomez A, Ríos J. Augmented reality using laser projection for the Airbus A400M wing assembly. In: Proceedings of the 29th International Manufacturing Conference. 2012. pp. 154-62.

10. Dharmawan AG, Vibhute AA, Foong S, Soh GS, Otto K. A survey of platform designs for portable robotic welding in large scale structures. In: 2014 13th international conference on control automation robotics & vision (ICARCV); 2014 Dec 10-12; Singapore. IEEE; 2014. pp. 1683-8.

11. Masood J, Vidal F. Didactic factories for emerging technologies: occupational exoskeleton. Adv Rob Tec 2023;1:000103.

12. Holz D, Ichim AE, Tombari F, Rusu RB, Behnke S. Registration with the point cloud library: a modular framework for aligning in 3-D. IEEE Robot Automat Mag 2015;22:110-24.

13. Hartley R, Zisserman A. Multiple view geometry in computer vision. 2nd edition. New York: Cambridge University Press. 2003. Available from: https://www.r-5.org/files/books/computers/algo-list/image-processing/vision/Richard_Hartley_Andrew_Zisserman-Multiple_View_Geometry_in_Computer_Vision-EN.pdf. [Last accessed on 21 Mar 2024]

14. MARI4 YARD. 2023. Available from: https://www.mari4yard.eu/. [Last accessed on 21 Mar 2024].


