Rotating 3D laser mapping system for multi-rotor drones
Abstract
Accurate position estimation is an essential prerequisite for the normal operation of autonomous rotary-wing Unmanned Aerial Vehicles (UAVs). However, the limited field of view (FOV) of lidar makes it more challenging to localize a rotary-wing UAV in an unknown environment. To address the insufficient FOV and observation blind zones of lidar on rotor drones in complex environments, this paper designs a rotorcraft UAV system based on a rotating 3D lidar and proposes a simultaneous localization and mapping (SLAM) algorithm for the rotating 3D lidar. The algorithm first distinguishes planar and edge features by the curvature of the point cloud. Then, to reduce the distortion caused by UAV motion and lidar rotation, Inertial Measurement Unit (IMU) measurements and real-time rotation angles are used to compensate for these motions in two stages; the IMU measurements are also used for state prediction, and an error-state iterative extended Kalman filter updates the state from the residuals of matching line and plane features against a sub-map. Finally, smoother high-rate odometry is obtained through IMU pre-integration and a first-order low-pass filter. Experimental results in indoor and outdoor conditions show that the proposed rotating lidar unit gives the rotorcraft UAV a larger FOV, which not only improves its environmental perception capability and position estimation accuracy but also enhances its positioning reliability and flight stability in complex environments.
1. INTRODUCTION
In recent years, multi-rotor Unmanned Aerial Vehicles (UAVs) have gradually been adopted in many industries, such as aerial photography, agricultural plant protection, power inspection, film and television shooting, and post-disaster search and rescue, owing to their autonomous take-off and landing, hovering capability, and high mobility and agility[1-4]. They have also been widely studied by both academia and industry[5]. However, remotely piloted rotor UAVs are limited by factors such as transmission range and the operator's field of view (FOV), so autonomous rotor UAVs are necessary. For autonomous rotorcraft UAVs, reliable localization information and strong environmental awareness are prerequisites for performing subsequent tasks[6]. Positioning information can be obtained with simultaneous localization and mapping (SLAM) technology, which is crucial in industrial automation, autonomous driving, and surveying[7,8]. Throughout its development, cameras, lidar, and Inertial Measurement Units (IMUs) have served as the primary sensors for information acquisition. Although cameras are lightweight and small while providing rich color information, their lack of direct range measurements, high computational demands, and sensitivity to illumination limit their use in SLAM. Meanwhile, with advances in manufacturing, 3D lidar not only avoids these shortcomings of the camera but has also been widely used in SLAM systems because of its strong environmental perception, high measurement accuracy, strong anti-jamming ability, and precise mapping detail[9].
LOAM[10] and LeGO-LOAM[11] are classical pure laser odometry methods based on lidar, which match only extracted edge and planar features; in complex scenes or under intense robot motion, this leads to inaccurate pose estimation and thus degraded SLAM accuracy. The development of multi-sensor fusion has drawn more attention to laser-inertial SLAM based on lidar and IMU[12]; lidar-inertial odometry fusion methods can be divided into loosely coupled and tightly coupled. Compared with loosely coupled methods, tightly coupled methods achieve better localization results because the IMU bias is estimated online. LIOM, proposed by Ye et al., uses the same features as LOAM while optimizing IMU pre-integration and point-to-plane constraints with a factor graph[13]. Shan et al. proposed LIO-SAM, which optimizes four different factors in a factor graph, further improving pose estimation accuracy and scalability[14]. As the number of feature points and system dimensions increase, variants of the Kalman filter have been widely used, such as the extended Kalman filter, the unscented Kalman filter, and the iterated Kalman filter. Xu et al. proposed a tightly coupled laser SLAM method based on an iterated Kalman filter that uses a new Kalman gain formula to reduce the computational complexity from the measurement dimension to the state dimension[15]. Although SLAM can provide high-frequency, accurate position information for autonomous rotary-wing UAVs, obtaining more map information from the environment and enhancing environmental awareness remain urgent problems. The vertical FOV of the lidar in the above methods is limited, which weakens a mobile robot's perception of its surroundings in complex environments and creates blind zones in its FOV that significantly affect subsequent work. Conventional remedies are to increase the number of lidar scanning beams to expand the vertical FOV or to add more lidars; these alleviate the FOV blindness problem to some extent but pose manufacturing challenges[16]. Rotating the entire lidar around a fixed axis to obtain a wider vertical scanning range is a more effective solution[17]. Droeschel et al. proposed a small, continuously rotating 3D laser scanner with a 360° × 270° FOV for UAV applications[18]. Bosse et al. proposed a scan-matching method based on a rotating 2D lidar that generates accurate local maps and reliable vehicle pose estimates[19]. However, there remains room for improvement in the carrier's perception of the environment.
Therefore, this paper first designs a UAV system with a rotating 3D lidar, in which the rotating device theoretically extends the vertical FOV of the lidar to 360 degrees. This paper then proposes a SLAM algorithm for the rotating 3D lidar, built on the FAST-LIO framework, to achieve reliable localization of the UAV in complex environments and real-time mapping of the surrounding environment.
2. METHODS
2.1. Unmanned aerial system
The rotary-wing UAV system based on a rotating 3D lidar is shown in Figure 1, and the sensors and their parameters are listed in Table 1. The system mainly comprises an environment sensing unit, a data processing unit, and a UAV platform. The UAV platform is built on a four-axis airframe and is mainly responsible for carrying the rotating 3D lidar device to complete subsequent tasks; the environment sensing unit realizes the UAV's perception of the surrounding environment and provides real-time position information; the data processing unit is responsible for data processing and issuing UAV control commands to ensure autonomous flight.
Table 1. Sensor parameters

| Sensor name | Parameters |
| --- | --- |
| Flight controller | Pixhawk |
| Lidar | Velodyne VLP-16 |
| Inertial measurement unit | Xsens MTi-300 |
| Motor | Pulse Tower Intelligence |
| On-board computer | Manifold 2-C |
The rotating 3D lidar unit added in this paper serves as the UAV system's environment sensing unit. It mainly consists of a 3D lidar, an IMU, and a motor with an encoder; the periphery is its protective housing. By mounting the lidar with its horizontal scanning plane vertical on the rotating motor and spinning it in one direction, the FOV becomes 360° × 360°. Compared with conventional rotor UAVs, the proposed rotor UAV effectively increases the FOV in the vertical direction.
2.2. Rotating external parameter calibration
As mentioned earlier, the rotating 3D lidar sensor platform consists of three main parts, and installation errors, i.e., external parameters, inevitably arise during assembly. Incorrect external parameters can distort the built map, cause localization drift, and so on. To eliminate the effect of installation errors on mapping and localization, the external parameters must be calibrated; in this paper, we use the team's previous work[20], a calibration method that requires no dedicated calibration target and can be performed in any environment. Since the angle about the motor rotation axis is unobservable, only the two remaining rotational degrees of freedom are calibrated. To combine the information from the rotating motor and the 3D lidar, their messages must first be time-synchronized. Under the Robot Operating System (ROS), the system subscribes to the motor angle and point cloud topics and pairs messages with similar timestamps, aligning each point cloud with the servo angle at which it was captured; this alignment is also reused for the subsequent three-dimensional mapping and other work. The calibration itself rotates the lidar one revolution at constant speed and proceeds in three steps. First, the rotation angle is used to divide all scanned points into two half-scans. Next, the points of each half-scan are divided into small grids such that grids at the same position but from different half-scans are consistent, and the approximate probability that each grid contains a plane is estimated. Finally, based on these probabilities, informative grids are selected to extract planar points, associate planes, and efficiently estimate the parameters: the scanned point cloud is mapped into 3D space using the motor angle, an objective function is established from the extracted and matched planes, and it is optimized with the Ceres library. The workflow is shown in Figure 2.
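As an illustration of this synchronization step, the ROS sketch below pairs each point cloud with the motor angle message closest in time using an approximate-time policy. The topic names, the use of `JointState` to carry the encoder angle, and the queue sizes are assumptions for illustration only; the paper does not specify them.

```cpp
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <sensor_msgs/JointState.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>

// Pair each lidar scan with the motor angle whose stamp is closest in time,
// so every point cloud can later be mapped into 3D with the angle at which
// it was captured.
using SyncPolicy = message_filters::sync_policies::ApproximateTime<
    sensor_msgs::PointCloud2, sensor_msgs::JointState>;

void onSyncedPair(const sensor_msgs::PointCloud2ConstPtr& cloud,
                  const sensor_msgs::JointStateConstPtr& angle) {
  // cloud and angle now carry near-identical timestamps; hand them to the
  // calibration (or mapping) pipeline together.
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "angle_cloud_sync");
  ros::NodeHandle nh;
  // Topic names are placeholders; substitute the actual driver topics.
  message_filters::Subscriber<sensor_msgs::PointCloud2> cloud_sub(nh, "/velodyne_points", 10);
  message_filters::Subscriber<sensor_msgs::JointState> angle_sub(nh, "/motor/joint_state", 10);
  message_filters::Synchronizer<SyncPolicy> sync(SyncPolicy(20), cloud_sub, angle_sub);
  sync.registerCallback(&onSyncedPair);
  ros::spin();
  return 0;
}
```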
We denote the estimation parameters as the two observable rotational degrees of freedom between the lidar and the motor axis; the objective function penalizes the misalignment between corresponding planes, where the planes are extracted from the two half-scans and associated through the shared grids described above.
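To make the plane-matching objective concrete, the following is a minimal Ceres sketch under stated assumptions: the motor axis is taken as x (its angle is unobservable, as noted above), the two calibrated degrees of freedom are rotations about y and z, and each correspondence supplies a point from one half-scan plus a matched plane (unit normal and offset) from the other. This is an illustrative formulation, not the authors' exact cost function from [20].

```cpp
#include <ceres/ceres.h>
#include <Eigen/Dense>
#include <vector>

// One point-plane correspondence between the two half-scans (hypothetical).
struct Correspondence {
  Eigen::Vector3d point;   // point from half-scan A, mapped into 3D
  Eigen::Vector3d normal;  // unit normal of the matched plane from half-scan B
  double d;                // plane offset: normal . x + d = 0
};

// Residual: after applying the candidate two-angle extrinsic rotation, the
// point should lie on its matched plane, i.e., its signed distance is zero.
struct HalfScanPlaneResidual {
  explicit HalfScanPlaneResidual(const Correspondence& c) : c_(c) {}

  template <typename T>
  bool operator()(const T* const angles, T* residual) const {
    using std::cos; using std::sin;
    // angles[0]: rotation about y; angles[1]: rotation about z. The rotation
    // about the motor axis (x here) is unobservable and is not estimated.
    const T cy = cos(angles[0]), sy = sin(angles[0]);
    const T cz = cos(angles[1]), sz = sin(angles[1]);
    Eigen::Matrix<T, 3, 3> Ry, Rz;
    Ry << cy, T(0), sy,   T(0), T(1), T(0),   -sy, T(0), cy;
    Rz << cz, -sz, T(0),  sz, cz, T(0),       T(0), T(0), T(1);
    const Eigen::Matrix<T, 3, 1> q = Rz * Ry * c_.point.cast<T>();
    residual[0] = c_.normal.cast<T>().dot(q) + T(c_.d);  // point-to-plane distance
    return true;
  }

  Correspondence c_;
};

// Estimate the two rotation angles from a set of correspondences.
void calibrate(const std::vector<Correspondence>& corrs, double angles[2]) {
  ceres::Problem problem;
  for (const auto& c : corrs)
    problem.AddResidualBlock(
        new ceres::AutoDiffCostFunction<HalfScanPlaneResidual, 1, 2>(
            new HalfScanPlaneResidual(c)),
        new ceres::HuberLoss(0.1),  // robustness against bad plane matches
        angles);
  ceres::Solver::Options opts;
  ceres::Solver::Summary summary;
  ceres::Solve(opts, &problem, &summary);
}
```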
2.3. Rotational laser SLAM method
2.3.1. FAST-LIO algorithm
FAST-LIO is a filter-based, tightly coupled laser SLAM algorithm that uses a lidar and an IMU as sensors and does not require an external reference between them. First, state prediction is performed by forward propagation of the IMU measurements, and point cloud features are extracted in the same way as in the literature [9]. Second, using the IMU data during back-propagation of the state, all newly formed feature points are transformed to the end moment of each lidar scan frame to remove motion distortion, and the residuals are computed. Then, the error-state iterated extended Kalman filter (ESIEKF) sub-module checks for convergence: if it has not converged, the updated state is used to re-project the feature points and repeat the matching until the optimal state is reached; otherwise, the global map and odometry are updated. This tightly coupled approach significantly improves localization accuracy and mapping quality.
2.3.2. Rotating 3D laser SLAM algorithm
In order to enable the multi-rotor UAV to obtain more information about its surrounding environment, this paper proposes a rotating 3D laser SLAM algorithm based on FAST-LIO, whose workflow is shown in Figure 3. First, edge and planar features are extracted from the lidar output, and two additional motion compensations are applied using the encoder angle and the IMU measurements, respectively; the IMU measurements also serve as the state prediction of the system. Then, the structural parameters and the external parameters between sensors are combined to update the state and perform feature matching with an error-state iterated Kalman filter; the optimal state at the end of the iterations is output as the odometry, and the global map is updated. Finally, the odometry output frequency is increased by IMU pre-integration, and the odometry data are smoothed with a first-order low-pass filter to satisfy the conditions for stable flight of the rotorcraft UAV.
Figure 3. Rotational laser SLAM workflow diagram. IMU: Inertial Measurement Unit; SLAM: simultaneous localization and mapping.
(1) Angular and IMU-based motion compensation
Before state estimation, the raw data obtained from the sensors are preprocessed. The preprocessing of the proposed algorithm mainly contains three parts: the first extracts line and surface features from the raw point cloud output by the lidar; the second obtains the real-time rotation angle of the lidar from the encoder to complete the first motion compensation; subsequently, the real-time IMU measurements are used for the second motion compensation and for state prediction.
(a) Extraction of point cloud feature
Feature point extraction in the rotating 3D lidar SLAM method is the same as in the literature [10]. The extraction is based on the curvature of each point: points with larger curvature are treated as edge points and points with smaller curvature as planar points, where the curvature of point $i$ is computed as

$$c = \frac{1}{|S| \cdot \left\| \mathbf{X}_i \right\|} \left\| \sum_{j \in S,\, j \neq i} \left( \mathbf{X}_i - \mathbf{X}_j \right) \right\|$$

with $S$ the set of neighboring points of $i$ on the same scan line and $\mathbf{X}_i$ the coordinates of point $i$ in the lidar frame.
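A sketch of this classification in the spirit of LOAM [10] is given below: a curvature proxy is computed for each point from its neighbors along the scan line and thresholded. The neighborhood size and the two thresholds are illustrative assumptions and would need tuning.

```cpp
#include <Eigen/Dense>
#include <vector>

struct LabeledPoint {
  Eigen::Vector3f xyz;
  bool is_edge = false, is_plane = false;
};

// LOAM-style curvature: for point i, sum the differences to its K neighbors
// on each side along the scan line; a large norm means the local surface
// bends (edge point), a small norm means it is locally flat (planar point).
void classifyFeatures(std::vector<LabeledPoint>& scanline, int K = 5,
                      float edge_thresh = 1.0f, float plane_thresh = 0.1f) {
  const int n = static_cast<int>(scanline.size());
  for (int i = K; i < n - K; ++i) {
    Eigen::Vector3f diff = Eigen::Vector3f::Zero();
    for (int j = i - K; j <= i + K; ++j)
      if (j != i) diff += scanline[i].xyz - scanline[j].xyz;
    // Normalize by neighborhood size and range (|S| = 2K here), matching the
    // curvature formula above, so c is comparable across distances.
    const float c = diff.norm() / (2 * K * scanline[i].xyz.norm());
    scanline[i].is_edge  = c > edge_thresh;
    scanline[i].is_plane = c < plane_thresh;
  }
}
```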
(b) Motion compensation
The rotating 3D laser SLAM algorithm applies two motion compensations to the acquired raw point cloud. First, the real-time rotation angle information is used to compensate for the motion imparted to the lidar by the rotation. For each lidar scan frame, the angle values at the start and end moments are available, so every feature point in between is assigned an angle by linear interpolation; the feature points are then converted from the rotating lidar coordinate system $L$ to the motor-axis coordinate system $M$:

$$\mathbf{p}^{M}_{i} = \mathbf{R}\!\left( \theta_i \right) \mathbf{p}^{L}_{i}$$

where $\mathbf{p}^{L}_{i}$ is feature point $i$ in the lidar frame, $\theta_i$ is its linearly interpolated rotation angle, and $\mathbf{R}(\theta_i)$ is the corresponding rotation about the motor axis.
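A minimal sketch of this first compensation, assuming each point carries a relative timestamp within the scan and the motor spins about the z-axis of the motor frame (both assumptions for illustration):

```cpp
#include <Eigen/Dense>
#include <Eigen/Geometry>
#include <vector>

struct TimedPoint {
  Eigen::Vector3f xyz;
  float rel_time;  // position of the point within the scan, in [0, 1]
};

// First motion compensation: each point is assigned a linearly interpolated
// encoder angle and rotated from the (spinning) lidar frame into the fixed
// motor-axis frame. The sign convention depends on the encoder definition.
void compensateRotation(std::vector<TimedPoint>& scan,
                        float angle_start, float angle_end /* rad */) {
  for (auto& pt : scan) {
    const float theta = angle_start + pt.rel_time * (angle_end - angle_start);
    // Assumed: motor axis = z-axis of the motor frame.
    pt.xyz = Eigen::AngleAxisf(theta, Eigen::Vector3f::UnitZ()) * pt.xyz;
  }
}
```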
Each frame of data acquired by the lidar is accumulated over a period of time rather than captured at a single instant, during which the lidar or its carrier usually moves, making the origins of points within the current frame inconsistent. Therefore, the IMU measurements are borrowed for the second motion compensation, converting the current frame to the coordinate system of the last moment, i.e., converting the feature points from the motor-axis coordinate system $M$ to the body (IMU) coordinate system at the scan-end moment:

$$\mathbf{p}^{I_{k}}_{i} = \left( \mathbf{T}^{W}_{I_{k}} \right)^{-1} \mathbf{T}^{W}_{I_{i}}\, \mathbf{T}^{I}_{M}\, \mathbf{p}^{M}_{i}$$

where $\mathbf{T}^{I}_{M}$ is the extrinsic transform from the motor-axis frame to the IMU frame, $\mathbf{T}^{W}_{I_{i}}$ is the IMU pose at the capture time of point $i$ obtained by propagating the IMU measurements, and $\mathbf{T}^{W}_{I_{k}}$ is the IMU pose at the scan-end moment $k$.
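The second compensation can be sketched as follows, assuming the IMU pose at each point's capture time is already available from propagating the measurements; the actual algorithm obtains these poses by back-propagation as in FAST-LIO, and the struct layout here is illustrative.

```cpp
#include <Eigen/Dense>
#include <Eigen/Geometry>

struct ImuPose {
  double t;
  Eigen::Quaternionf q;  // body-to-world rotation
  Eigen::Vector3f p;     // body position in world
};

// Second motion compensation: express a point captured at time t_i in the
// body frame at the scan-end time t_end, removing carrier-motion distortion.
Eigen::Vector3f toScanEnd(const Eigen::Vector3f& pt_body_ti,
                          const ImuPose& pose_ti, const ImuPose& pose_end) {
  // World coordinates of the point as seen at its capture time t_i ...
  const Eigen::Vector3f pt_world = pose_ti.q * pt_body_ti + pose_ti.p;
  // ... then pulled back into the body frame at t_end.
  return pose_end.q.conjugate() * (pt_world - pose_end.p);
}
```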
(2) State estimation with rotational external parameters
In this paper, the algorithm of the literature [15] is used as the main framework, and the rotation angle information of the lidar is written into the state equation on that basis. The discrete state equation can be described as follows:

$$\mathbf{x}_{i+1} = \mathbf{x}_{i} \boxplus \left( \Delta t \, \mathbf{f}\left( \mathbf{x}_{i}, \mathbf{u}_{i}, \mathbf{w}_{i} \right) \right)$$

where $\Delta t$ is the IMU sampling period, $\mathbf{u}_{i}$ is the IMU input (angular velocity and acceleration), $\mathbf{w}_{i}$ is the process noise, $\boxplus$ is the on-manifold "plus" operation, and the state $\mathbf{x}$ contains the attitude, position, velocity, IMU biases, gravity vector, and the rotation angle of the lidar.

In this paper, the state is predicted and updated using an error-state iterative extended Kalman filter. State prediction starts when IMU data are received, giving the propagated state

$$\widehat{\mathbf{x}}_{i+1} = \widehat{\mathbf{x}}_{i} \boxplus \left( \Delta t \, \mathbf{f}\left( \widehat{\mathbf{x}}_{i}, \mathbf{u}_{i}, \mathbf{0} \right) \right)$$

where $\widehat{\mathbf{x}}$ denotes the predicted state obtained by setting the noise to zero; in this instance, the system state covariance matrix $\widehat{\mathbf{P}}$ is propagated as

$$\widehat{\mathbf{P}}_{i+1} = \mathbf{F}_{\widetilde{\mathbf{x}}} \widehat{\mathbf{P}}_{i} \mathbf{F}_{\widetilde{\mathbf{x}}}^{T} + \mathbf{F}_{\mathbf{w}} \mathbf{Q} \mathbf{F}_{\mathbf{w}}^{T}$$

where $\mathbf{F}_{\widetilde{\mathbf{x}}}$ and $\mathbf{F}_{\mathbf{w}}$ are the Jacobians of the error-state dynamics with respect to the error state and the noise, respectively, and $\mathbf{Q}$ is the process noise covariance; the error state is defined as

$$\widetilde{\mathbf{x}} = \mathbf{x} \boxminus \widehat{\mathbf{x}}$$

where $\boxminus$ is the inverse operation of $\boxplus$, mapping the difference between the true and predicted states into the local tangent space.

Then, based on the error between the optimal state at the previous moment and the predicted value at the current moment, an iterative state update is performed, and the updated state is output when the update error falls below a threshold or the maximum number of iterations is reached. The residual $z_{j}$ of the $j$-th feature point is its point-to-plane distance after projection into the world frame with the current state estimate:

$$z_{j} = \mathbf{u}_{j}^{T}\left( {}^{W}\mathbf{p}_{j} - \mathbf{q}_{j} \right)$$

where $\mathbf{u}_{j}$ is the normal vector of the plane in the sub-map matched to the feature point, ${}^{W}\mathbf{p}_{j}$ is the feature point transformed into the world frame using the current state estimate together with the rotational external parameters, and $\mathbf{q}_{j}$ is a point on the matched plane. The Kalman gain is

$$\mathbf{K} = \left( \mathbf{H}^{T} \mathbf{R}^{-1} \mathbf{H} + \widehat{\mathbf{P}}^{-1} \right)^{-1} \mathbf{H}^{T} \mathbf{R}^{-1}$$

where $\mathbf{H}$ is the Jacobian of the residual with respect to the error state and $\mathbf{R}$ is the measurement noise covariance; this form inverts a matrix of the state dimension rather than the measurement dimension.

At this point, the updated state $\bar{\mathbf{x}}$ and covariance $\bar{\mathbf{P}}$ at iteration $\kappa$ are

$$\bar{\mathbf{x}} = \widehat{\mathbf{x}}^{\kappa} \boxplus \left( -\mathbf{K} \mathbf{z}^{\kappa} - \left( \mathbf{I} - \mathbf{K H} \right) \left( \mathbf{J}^{\kappa} \right)^{-1} \left( \widehat{\mathbf{x}}^{\kappa} \boxminus \widehat{\mathbf{x}} \right) \right), \qquad \bar{\mathbf{P}} = \left( \mathbf{I} - \mathbf{K H} \right) \widehat{\mathbf{P}}$$

where $\mathbf{J}^{\kappa}$ is the Jacobian of the $\boxminus$ operation evaluated at the current iterate; the converged state is taken as the odometry output, and the matched feature points are registered into the global map.
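As a concrete, deliberately simplified illustration of the gain computation above, the sketch below performs one Euclidean iterated-EKF update step with Eigen: the on-manifold $\boxplus$/$\boxminus$ operations and the extra Jacobian term of the full FAST-LIO update are replaced by plain vector addition, and a scalar measurement noise is assumed.

```cpp
#include <Eigen/Dense>

// One iterated-EKF update step (Euclidean simplification of the on-manifold
// update). H: m x n residual Jacobian, z: m x 1 residual vector, P: n x n
// prior covariance, r_inv: inverse measurement noise (scalar for brevity).
Eigen::VectorXd esikfUpdate(const Eigen::MatrixXd& H, const Eigen::VectorXd& z,
                            const Eigen::MatrixXd& P, double r_inv,
                            Eigen::VectorXd x /* current estimate */) {
  // K = (H^T R^-1 H + P^-1)^-1 H^T R^-1: the inversion is n x n (state
  // dimension), not m x m (measurement dimension), which is the point of
  // this gain formulation when the number of feature points m >> n.
  const Eigen::MatrixXd S = r_inv * H.transpose() * H + P.inverse();
  const Eigen::MatrixXd K = S.ldlt().solve(r_inv * H.transpose());
  // Innovation is (0 - z) since the residual z should vanish at convergence;
  // iterate until the correction K * z drops below a threshold.
  return x - K * z;
}
```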
(3) High-frequency odometry based on IMU and filtering

The odometry frequency after the error-state iterated Kalman filter optimization is far below what normal rotorcraft flight requires. The IMU has a high output frequency, so this paper first raises the overall odometry frequency through IMU pre-integration between filter updates, which can be calculated by

$$\begin{aligned} \mathbf{p}_{t+1} &= \mathbf{p}_{t} + \mathbf{v}_{t} \Delta t + \frac{1}{2} \left( \mathbf{R}_{t} \left( \mathbf{a}_{m} - \mathbf{b}_{a} \right) + \mathbf{g} \right) \Delta t^{2} \\ \mathbf{v}_{t+1} &= \mathbf{v}_{t} + \left( \mathbf{R}_{t} \left( \mathbf{a}_{m} - \mathbf{b}_{a} \right) + \mathbf{g} \right) \Delta t \\ \mathbf{R}_{t+1} &= \mathbf{R}_{t} \operatorname{Exp}\!\left( \left( \boldsymbol{\omega}_{m} - \mathbf{b}_{g} \right) \Delta t \right) \end{aligned}$$

In this case, $\mathbf{p}$, $\mathbf{v}$, and $\mathbf{R}$ are the position, velocity, and attitude propagated from the latest optimized state; $\mathbf{a}_{m}$ and $\boldsymbol{\omega}_{m}$ are the accelerometer and gyroscope measurements; $\mathbf{b}_{a}$ and $\mathbf{b}_{g}$ are the corresponding biases; $\mathbf{g}$ is the gravity vector; and $\Delta t$ is the IMU sampling period. The propagated odometry is then smoothed with a first-order low-pass filter

$$y_{k} = \alpha\, x_{k} + \left( 1 - \alpha \right) y_{k-1}$$

where $x_{k}$ is the raw odometry value at step $k$, $y_{k}$ is the filtered output, and $\alpha \in (0, 1]$ is the filtering coefficient.
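A minimal sketch of this high-rate stage, under the assumption that the accelerometer measures specific force and gravity is $(0, 0, -9.81)$ in the world frame; the `State` layout and helper names are illustrative, not the authors' implementation.

```cpp
#include <Eigen/Dense>
#include <Eigen/Geometry>

struct State {
  Eigen::Quaterniond q;    // attitude, body to world
  Eigen::Vector3d p, v;    // position and velocity in the world frame
  Eigen::Vector3d bg, ba;  // gyroscope / accelerometer biases
};

// Propagate the last optimized state with one IMU sample (gyro w, accel a),
// matching the pre-integration equations above.
void imuPropagate(State& s, const Eigen::Vector3d& w, const Eigen::Vector3d& a,
                  double dt,
                  const Eigen::Vector3d& g = Eigen::Vector3d(0, 0, -9.81)) {
  const Eigen::Vector3d acc_w = s.q * (a - s.ba) + g;  // world acceleration
  s.p += s.v * dt + 0.5 * acc_w * dt * dt;
  s.v += acc_w * dt;
  const Eigen::Vector3d dtheta = (w - s.bg) * dt;      // incremental rotation
  const double ang = dtheta.norm();
  if (ang > 1e-12)
    s.q = (s.q * Eigen::Quaterniond(
               Eigen::AngleAxisd(ang, dtheta / ang))).normalized();
}

// First-order low-pass filter used to smooth each odometry channel:
// y_k = alpha * x_k + (1 - alpha) * y_{k-1}.
struct LowPass {
  double alpha, y = 0;
  bool init = false;
  explicit LowPass(double a) : alpha(a) {}
  double step(double x) {
    y = init ? alpha * x + (1 - alpha) * y : x;
    init = true;
    return y;
  }
};
```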
3. RESULTS
To verify the effectiveness of the rotorcraft UAV and the positioning algorithm designed in this paper, rotating external parameter calibration experiments, simulation tests, and indoor and outdoor positioning and mapping tests were conducted. The simulation experiments were run under Ubuntu 18.04 to test the mapping effect. The real-world tests were then carried out indoors and outdoors using the UAV in Figure 1; the sensors and other equipment used are detailed in Table 1.
3.1. External parameter calibration experiment
As mentioned before, the sensing platform consists of a rotating motor, a 3D lidar, and an IMU, which are necessarily linked by external parameters. As can be seen from the workflow diagram of the algorithm, the point cloud acquired by the rotating 3D lidar is first converted to a common moment in time to remove distortion, and the pose is then estimated using the external parameters between the IMU and the lidar; thus, two sets of external parameters are involved in the whole process. One set relates the IMU and the 3D lidar and is estimated in real time within the odometry method; the other relates the rotating motor and the 3D lidar and must be calibrated before running the SLAM algorithm. In this paper, the sensing platform is rotated one revolution at constant speed indoors, and the calibration algorithm then estimates the external parameters; the final calibration result is 0.2 for the y-axis and 0.3 for the z-axis. The effect before and after calibration is shown in Figure 5.
3.2. Simulation test
To verify the robustness and flight safety of the proposed algorithm, the proposed system and a UAV system without a rotating lidar were first built on the Gazebo simulation platform under Ubuntu 18.04, as in Figure 6A and B, respectively. Mapping experiments were then carried out in this scenario, as in Figure 7A and B. These figures show that, compared with the non-rotating 3D lidar, the rotating 3D lidar has an improved vertical FOV and can build a complete map of the surrounding environment; the proposed lidar device has a theoretical FOV of 360 degrees in both the vertical and horizontal directions, although there may be partial occlusion due to the mounting. Since the FAST-LIO algorithm does not model the external parameters between the lidar and the rotating motor, it fails once the lidar rotates; therefore, for comparison, the lidar was fixed at a certain angle, independent experiments were conducted in the same scene, and the positioning accuracy was compared with that of the proposed algorithm.
3.3. Indoor positioning and map building test
Because the PX4 flight controller requires a position input frequency of no less than 30 Hz, to ensure smooth UAV flight experiments, the odometry output frequency in this paper is 50 Hz. Meanwhile, to obtain smoother odometry data, low-pass filtering is applied to the pose results of the SLAM algorithm after pre-integration, with a filtering coefficient of 0.62.
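Tying this to the low-pass-filter sketch in Section 2.3.2, the paper's settings would correspond to per-axis filters with α = 0.62 ticking at the 50 Hz odometry rate, e.g.:

```cpp
// Per-axis smoothing at the 50 Hz odometry rate with the paper's coefficient.
LowPass fx(0.62), fy(0.62), fz(0.62);
// At each 20 ms tick, publish {fx.step(p.x()), fy.step(p.y()), fz.step(p.z())}.
```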
3.3.1. Indoor positioning data validation
A rectangular trajectory was flown under an indoor motion capture system, and the x-, y-, and z-axis displacement outputs of the rotating 3D laser SLAM odometry, fixed-angle FAST-LIO, and rotating FAST-LIO (denoted Ours, F-FAST-LIO, and R-FAST-LIO, respectively) were compared with the positioning results of the motion capture system (Nokov), as in Figure 8A and B. The experimental data show that the maximum positioning error of the proposed method is 0.08 m, the average error is 0.04 m, and the minimum is 0.005 m. For fixed-angle FAST-LIO, the maximum positioning error is 0.147 m, the average error is 0.062 m, and the minimum error is 0.005 m. The absolute trajectory error (ATE) and root mean square error (RMSE) are also compared; the results are detailed in Table 2.
Table 2. Algorithm positioning error comparison results

| Scene | Algorithm | ATE [m] | RMSE [m] |
| --- | --- | --- | --- |
| Indoor | R-FAST-LIO | Failed | Failed |
| Indoor | F-FAST-LIO | 0.062 | 0.070 |
| Indoor | Ours | 0.034 | 0.028 |
| Outdoor | R-FAST-LIO | Failed | Failed |
| Outdoor | F-FAST-LIO | 1.25 | 1.302 |
| Outdoor | Ours | 0.599 | 0.654 |
3.3.2. Indoor positioning and mapping experiment
To further verify the performance of the rotating 3D lidar SLAM algorithm in indoor localization and mapping, experiments were conducted in the underground parking lot shown in Figure 9A and compared with the FAST-LIO localization and mapping results in the same scene under linear motion. The experimental localization results are in Figure 9B, and the mapping results are in Figure 10A and B, from which it can be seen that, compared with non-rotating laser mapping, the proposed algorithm builds a more complete environmental map and provides more environmental information for the UAV.
3.4. Outdoor positioning and mapping experiment
To verify the environmental perception capability of the system in complex outdoor scenes, the UAV platform was used to conduct localization and mapping experiments outdoors. The global positioning system (GPS) data of the PX4 were used as the reference to compare against the positioning data of the proposed algorithm and of the FAST-LIO algorithm, as in Figure 11A and B, respectively. As can be seen from the charts, the GPS data are not stable in outdoor scenes, which has some influence on the error evaluation. The average absolute trajectory error of the proposed algorithm is 0.599 m, compared with 1.25 m for the fixed-angle FAST-LIO algorithm (Table 2). The outdoor scene is shown in Figure 12A and B.
4. DISCUSSION
The algorithm proposed in this article, built on FAST-LIO, can effectively provide reliable pose information for rotor drones in indoor and outdoor scenarios; as shown in Figures 8A and 9B, the experimental results are within an acceptable range compared with the ground truth and the comparison algorithm. From Figures 7A and 8B, it can be concluded that, compared with traditional positioning and mapping platforms, our platform has a larger FOV and obtains more environmental information. This also broadens the application scenarios of rotor drones: for example, when exploring unknown environments, they can acquire more of the environment map in a short time and have more options for path planning.
5. CONCLUSIONS
To address the observation blindness of rotorcraft in complex environments for self pose estimation and environment mapping, a rotorcraft UAV system based on a rotating 3D lidar is first introduced, and a tightly coupled rotating laser SLAM algorithm is then proposed. The algorithm uses FAST-LIO as the main framework, adds the lidar rotation angle information, and obtains odometry after two motion compensations and error-state iterated Kalman filter updates of the state. It also combines IMU pre-integration and low-pass filtering to raise the odometry output frequency so as to meet the conditions for autonomous rotorcraft flight. The absolute trajectory error is 0.034 m indoors and 0.599 m outdoors; these results show that the system obtains reliable positioning data and high-quality maps in both indoor and outdoor complex scenes and has stronger environment sensing ability than traditional UAVs, meeting the requirements of various application scenarios. However, the algorithm is prone to degradation in scenarios with a single geometric constraint, so further research will focus on improving its robustness in single-feature scenarios such as corridors and tunnels; we also plan to apply the system to the autonomous exploration of underground mines by drones.
DECLARATIONS
Authors' contributions
Made substantial contributions to the conception and design of the study and performed data analysis and interpretation: Fu M, Zhang H
Performed process guidance for the planning and execution of the study and for the evolution of the overarching research aims, provided critical review and material support, and reviewed and revised the original draft: Wang S, Shui Y
Availability of data and materials
Not applicable.
Financial support and sponsorship
This research was funded by the National Natural Science Foundation of China (No.12205245) and the Natural Science Foundation of Sichuan Province (No.2023NSFSC1437).
Conflicts of interest
All authors declared that there are no conflicts of interest.
Ethical approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Copyright
© The Author(s) 2023.
REFERENCES
1. Li Z, Chen Y, Lu H, Wu H, Cheng L. UAV autonomous landing technology based on apriltags vision positioning algorithm. In: 2019 Chinese Control Conference (CCC); 2019 Jul 27-30; Guangzhou, China. IEEE; 2019. p. 8148-53.
2. Takaya K, Ohta H, Kroumov V, Shibayama K, Nakamura M. Development of UAV system for autonomous power line inspection. In: 2019 23rd International Conference on System Theory, Control and Computing (ICSTCC); 2019 Oct 09-11; Sinaia, Romania. IEEE; 2019. p. 762-7.
3. Siean AI, Vatavu RD, Vanderdonckt J. Taking that perfect aerial photo: a synopsis of interactions for drone-based aerial photography and video. In: Proceedings of the 2021 ACM International Conference on Interactive Media Experiences. 2021. p. 275-9.
4. Reinecke M, Prinsloo T. The influence of drone monitoring on crop health and harvest size. In: 2017 1st International Conference on Next Generation Computing Applications (NextComp); 2017 Jul 19-21; Mauritius. IEEE; 2017. p. 5-10.
5. Gargoum S, El-Basyouny K. Automated extraction of road features using LiDAR data: a review of LiDAR applications in transportation. In: 2017 4th International Conference on Transportation Information and Safety (ICTIS); 2017 Aug 08-10; Banff, Canada. IEEE; 2017. p. 563-74.
6. Zhao S, Fang Z, Li H, Scherer S. A robust laser-inertial odometry and mapping method for large-scale highway environments. In: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2019 Nov 03-08; Macau, China. IEEE; 2020. p. 1285-92.
7. Dissanayake MWMG, Newman P, Clark S, Durrant-Whyte HF, Csorba M. A solution to the simultaneous localization and map building (SLAM) problem. IEEE Trans Robot Automat 2001;17:229-41.
8. Yuan X, Liu H, Qi R. Research on key technologies of autonomous driving platform. J Phys Conf Ser 2021;1754:012127.
9. Wang H, Wang C, Xie L. Lightweight 3-D localization and mapping for solid-state LiDAR. IEEE Robot Autom Lett 2021;6:1801-7.
10. Zhang J, Singh S. LOAM: lidar odometry and mapping in real-time. In: Proceedings of Robotics: Science and Systems; California, USA. 2014. p. 1-9. Available from: https://www.ri.cmu.edu/pub_files/2014/7/Ji_LidarMapping_RSS2014_v8.pdf. [Last accessed on 21 Nov 2023].
11. Shan T, Englot B. LeGO-LOAM: lightweight and ground-optimized lidar odometry and mapping on variable terrain. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2018 Oct 01-05; Madrid, Spain. IEEE; 2018. p. 4758-65.
12. Xu X, Zhang L, Yang J, et al. A review of multi-sensor fusion SLAM systems based on 3D LIDAR. Remote Sens 2022;14:2835.
13. Ye H, Chen Y, Liu M. Tightly coupled 3D lidar inertial odometry and mapping. In: 2019 International Conference on Robotics and Automation (ICRA), 2019 May 20-24; Montreal, Canada. IEEE; 2019. p. 3144-50.
14. Shan T, Englot B, Meyers D, Wang W, Ratti C, Rus D. LIO-SAM: tightly-coupled lidar inertial odometry via smoothing and mapping. In: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2020 Oct 24-2021 Jan 24; Las Vegas, USA. IEEE; 2020. p. 5135-42.
15. Xu W, Zhang F. FAST-LIO: a fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman Filter. IEEE Robot Autom Lett 2021;6:3317-24.
16. Wang S, Zhang H, Wang G. OMC-SLIO: online multiple calibrations spinning LiDAR inertial odometry. Sensors 2023;23:248.
17. Harchowdhury A, Kleeman L, Vachhani L. Coordinated nodding of a two-dimensional lidar for dense three-dimensional range measurements. IEEE Robot Autom Lett 2018;3:4108-15.
18. Droeschel D, Holz D, Behnke S. Omnidirectional perception for lightweight MAVs using a continuously rotating 3D laser. Photogramm Fernerkund Geoinform 2014;5:451-64.
19. Bosse M, Zlot R. Continuous 3D scan-matching with a spinning 2D laser. In: 2009 IEEE International Conference on Robotics and Automation; 2009 May 12-17; Kobe, Japan. IEEE; 2009. p. 4312-9.
20. Wang S, Zhang H, Wang G, Liu R, Huo J, Chen B. FGRSC: improved calibration for spinning LiDAR in unprepared scenes. IEEE Sens J 2022;22:14250-62.