<?xml version="1.0" encoding="UTF-8" ?>
<modsCollection xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.loc.gov/mods/v3" xmlns:slims="http://slims.web.id" xsi:schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-3.xsd">
<mods version="3.3" ID="21822">
<titleInfo>
<title><![CDATA[Advanced Robotics Vol 32, 2018, issue 19]]></title>
</titleInfo>
<name type="personal" authority="">
<namePart>Koh Hosoda</namePart>
<role><roleTerm type="text">Author</roleTerm></role>
</name>
<typeOfResource manuscript="yes" collection="yes"><![CDATA[mixed material]]></typeOfResource>
<genre authority="marcgt"><![CDATA[bibliography]]></genre>
<originInfo>
<place><placeTerm type="text"><![CDATA[Japan]]></placeTerm></place>
<publisher><![CDATA[Osaka University, Osaka, Japan]]></publisher>
<dateIssued><![CDATA[2018]]></dateIssued>
<issuance><![CDATA[continuing]]></issuance>
<frequency><![CDATA[Bi-Monthly]]></frequency>
<edition><![CDATA[Publish]]></edition>
</originInfo>
<language>
<languageTerm type="code"><![CDATA[en]]></languageTerm>
<languageTerm type="text"><![CDATA[English]]></languageTerm>
</language>
<physicalDescription>
<form authority="gmd"><![CDATA[Text]]></form>
<extent><![CDATA[]]></extent>
</physicalDescription>
<note>Formation control of multiple quadcopters using model predictive control
Loïc Dubois (a)* and Satoshi Suzuki (b)
(a) School of Engineering, Swiss Federal Institute of Technology, Lausanne, Switzerland; (b) Department of Mechanical Engineering and Robotics, Shinshu University, Ueda-shi, Japan
ABSTRACT
This paper presents the formation control of a fleet of three small quadcopters in a motion capture environment. The dynamic model of a single quadcopter is derived for model predictive control (MPC); the constraints are then formulated so that they can be included in the cost function of the optimization problem, which is solved by the C/GMRES method. Two control architectures, centralized and decentralized, were implemented in the ROS framework and tested on the Crazyflie quadcopter. Performance is first assessed for a static reference (the formation regulation problem), then for a dynamic reference (the formation tracking problem). Finally, the computational cost of the MPC controllers is evaluated.
KEYWORDS
Formation control; quadcopters; collaborative robotics; model predictive control; decentralized control

Stable impact and contact force control by UAV for inspection of floor slab of bridge
Takahiro Ikeda (a), Shogo Yasui (a), Satoshi Minamiyama (a), Kenichi Ohara (a), Satoshi Ashizawa (a), Akihiko Ichikawa (a), Akihisa Okino (b), Takeo Oomichi (a) and Toshio Fukuda (a,c)
(a) Department of Mechatronics Engineering, Meijo University, Tempaku-ku, Nagoya, Japan; (b) Okino INDUSTRIES, LTD., Kodama-Gun, Saitama pref., Japan; (c) Beijing Institute of Technology, Haidian Qu, Beijing Shi, People’s Republic of China
ABSTRACT
This paper describes contact force control for an unmanned aerial vehicle (UAV) developed to inspect the floor slabs of bridges. The UAV is equipped with a three-degree-of-freedom manipulator on top of its body. To keep the UAV in stable contact with the slab surface, the impact force must be considered; it is modeled based on Hertzian contact stress. The control strategy is a cascade separated into attitude control and position-force control. The attitude, position, and force feedback loops use PID control. The force feedback is integrated seamlessly into the position feedback, and its output is added to the desired endpoint position of the manipulator. Because this paper focuses on contact between the UAV and the floor slab, the UAV is modeled considering the impact force in the vertical direction. The vertical control method is described, and then the altitude control and the contact force control are assessed. The altitude of the UAV was controlled with a 0.45 s delay during ascent and a 1.76 s delay during descent. The UAV could control the contact force with a mean error of 1.61 ± 1.08 N against a desired contact force of 3 N.
KEYWORDS
UAV; contact force control; Hertzian contact stress; bridge inspection

Three-dimensional position estimation method via AM pulse light modulation and an application to control multiple UAVs
Seita Sukisaki , Ryo Shimomura and Hajime Nobuhara
Department of Intelligent Interaction and Technologies, University of Tsukuba, Tsukuba City, Ibaraki, Japan
ABSTRACT
In this study, a three-dimensional position estimation method is proposed that estimates self-position via amplitude-modulated pulsed infrared light, addressing the limitations of existing positioning devices used for cooperative unmanned aerial vehicles (UAVs) in various environments. The proposed method estimates self-position, namely the distance between the receiver and the transmitter and the angle of direction, from the attenuation of the amplitude-modulated pulsed infrared light. By attaching a transmitter to a leader UAV and receivers to follower UAVs, cooperative flight can be achieved. Three experiments are conducted: (1) evaluation of the accuracy of self-position estimation, (2) implementation of the proposed method on a UAV, and (3) evaluation of the robustness of the proposed method in an outdoor environment. The findings confirm the effectiveness of the proposed method, which estimates position at high speed and with good precision in various environments using low-cost devices.
KEYWORDS
Self-position estimation; unmanned aerial vehicle; drone; infrared light; sensor

Visual-GPS combined ‘follow-me’ tracking for selfie drones
T. Tuan Do and Heejune Ahn
Department of Electrical and Information Engineering, Seoul National University of Science and Technology, Seoul, Republic of Korea
ABSTRACT
The ‘follow-me’ mode, in which the drone autonomously follows and films the user or target, is a new and attractive feature for camera drones, especially selfie drones. For this purpose, today’s commercial drones use the difference between GPS readings from the drone and a user-side mobile GCS, e.g. a smartphone, but the targeting performance is often unsatisfactory because of GPS inaccuracy, which ranges from a few to tens of meters. Visual tracking can be considered as a solution to this problem, but its reliability is still questionable for long-term tracking in unexpected operating environments. The paper proposes a hybrid approach that combines the high short-term accuracy of a visual tracking algorithm with the long-term reliability of a GPS-based one. Experiments with our prototype drone system demonstrate that the proposed combined approach accomplishes the follow-me operation successfully, keeping the target in the center of the video with over 50% higher accuracy than GPS-based methods. Extreme-scenario experiments also verify that the system can recover from vision tracking failures and Wi-Fi failures quickly, within a short time, e.g. 3–7 s.
KEYWORDS
Selfie drone; follow-me mode; sensor fusion; visual tracking; GPS</note>
<classification><![CDATA[]]></classification><identifier type="isbn"><![CDATA[20190314]]></identifier><location>
<physicalLocation><![CDATA[E-Library POLIJE Sistem Elektronik Tesis Dan Disertasi]]></physicalLocation>
<shelfLocator><![CDATA[E-J005-Vol.32,No.19,2018]]></shelfLocator>
<holdingSimple>
<copyInformation>
<numerationAndChronology type="1"><![CDATA[E-J005-Vol.32,No.19,]]></numerationAndChronology>
<sublocation><![CDATA[perpuspolije]]></sublocation>
<shelfLocator><![CDATA[E-J005-Vol.32,No.19,2018]]></shelfLocator>
</copyInformation>
</holdingSimple>
</location>
<slims:digitals>
<slims:digital_item id="3168" url="" path="/Advanced Robotics Vol 32, 2018 issue 19.pdf" mimetype="application/pdf"><![CDATA[Advanced Robotics Vol 32, 2018, issue 19]]></slims:digital_item>
</slims:digitals><slims:image><![CDATA[advance_robotik.jpg.jpg]]></slims:image>
<recordInfo>
<recordIdentifier><![CDATA[21822]]></recordIdentifier>
<recordCreationDate encoding="w3cdtf"><![CDATA[2019-03-14 14:50:44]]></recordCreationDate>
<recordChangeDate encoding="w3cdtf"><![CDATA[2019-03-14 14:51:03]]></recordChangeDate>
<recordOrigin><![CDATA[machine generated]]></recordOrigin>
</recordInfo></mods></modsCollection>