<?xml version="1.0" encoding="UTF-8" ?>
<oai_dc:dc schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dc:title>3D laser from RGBD projections in robot local navigation</dc:title>
<dc:creator>Calderita, Luis Vicente</dc:creator>
<dc:creator>Bandera-Rubio, Juan Pedro</dc:creator>
<dc:creator>Manso, Luis J.</dc:creator>
<dc:creator>Vázquez-Martín, Ricardo</dc:creator>
<dc:subject>Robots autónomos</dc:subject>
<dc:subject>Mobile robots</dc:subject>
<dc:subject>Reactive navigation</dc:subject>
<dc:subject>RGBD</dc:subject>
<dc:subject>Sensor arrays</dc:subject>
<dc:description>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</dc:description>
<dc:description>Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.</dc:description>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-06</dc:date>
<dc:date>2014-11-03</dc:date>
<dc:type>info:eu-repo/semantics/conferenceObject</dc:type>
<dc:identifier>http://hdl.handle.net/10630/8352</dc:identifier>
<dc:identifier>https://orcid.org/0000-0003-3814-0335</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>Workshop on Physical Agents WAF 2014</dc:relation>
<dc:relation>León</dc:relation>
<dc:relation>12-12 June 2014</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
</oai_dc:dc>
<?xml version="1.0" encoding="UTF-8" ?>
<d:DIDL schemaLocation="urn:mpeg:mpeg21:2002:02-DIDL-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/did/didl.xsd">
<d:DIDLInfo>
<dcterms:created schemaLocation="http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/dcterms.xsd">2014-11-03T13:24:39Z</dcterms:created>
</d:DIDLInfo>
<d:Item id="hdl_10630_8352">
<d:Descriptor>
<d:Statement mimeType="application/xml; charset=utf-8">
<dii:Identifier schemaLocation="urn:mpeg:mpeg21:2002:01-DII-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/dii/dii.xsd">urn:hdl:10630/8352</dii:Identifier>
</d:Statement>
</d:Descriptor>
<d:Descriptor>
<d:Statement mimeType="application/xml; charset=utf-8">
<oai_dc:dc schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dc:title>3D laser from RGBD projections in robot local navigation</dc:title>
<dc:creator>Calderita, Luis Vicente</dc:creator>
<dc:creator>Bandera-Rubio, Juan Pedro</dc:creator>
<dc:creator>Manso, Luis J.</dc:creator>
<dc:creator>Vázquez-Martín, Ricardo</dc:creator>
<dc:subject>Robots autónomos</dc:subject>
<dc:description>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</dc:description>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-06</dc:date>
<dc:date>2014-11-03</dc:date>
<dc:type>info:eu-repo/semantics/conferenceObject</dc:type>
<dc:identifier>http://hdl.handle.net/10630/8352</dc:identifier>
<dc:identifier>https://orcid.org/0000-0003-3814-0335</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>Workshop on Physical Agents WAF 2014</dc:relation>
<dc:relation>León</dc:relation>
<dc:relation>12-12 June 2014</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
</oai_dc:dc>
</d:Statement>
</d:Descriptor>
<d:Component id="10630_8352_1">
</d:Component>
</d:Item>
</d:DIDL>
<?xml version="1.0" encoding="UTF-8" ?>
<dim:dim schemaLocation="http://www.dspace.org/xmlns/dspace/dim http://www.dspace.org/schema/dim.xsd">
<dim:field authority="6103b4a8-fb61-43b8-b5a4-5d2138abecbf" confidence="500" element="contributor" mdschema="dc" qualifier="author">Calderita, Luis Vicente</dim:field>
<dim:field authority="153" confidence="500" element="contributor" mdschema="dc" qualifier="author">Bandera-Rubio, Juan Pedro</dim:field>
<dim:field authority="6892e2f1-473d-46e5-afa3-4440f4b68177" confidence="500" element="contributor" mdschema="dc" qualifier="author">Manso, Luis J.</dim:field>
<dim:field authority="a2e428a6-88bc-4040-b1ed-9ef52c60547d" confidence="500" element="contributor" mdschema="dc" qualifier="author">Vázquez-Martín, Ricardo</dim:field>
<dim:field element="date" mdschema="dc" qualifier="accessioned">2014-11-03T13:24:39Z</dim:field>
<dim:field element="date" mdschema="dc" qualifier="available">2014-11-03T13:24:39Z</dim:field>
<dim:field element="date" mdschema="dc" qualifier="created">2014-06</dim:field>
<dim:field element="date" mdschema="dc" qualifier="issued">2014-11-03</dim:field>
<dim:field element="identifier" mdschema="dc" qualifier="uri">http://hdl.handle.net/10630/8352</dim:field>
<dim:field element="identifier" lang="es_ES" mdschema="dc" qualifier="orcid">https://orcid.org/0000-0003-3814-0335</dim:field>
<dim:field element="description" lang="es_ES" mdschema="dc" qualifier="abstract">Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</dim:field>
<dim:field element="description" lang="es_ES" mdschema="dc" qualifier="sponsorship">Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.</dim:field>
<dim:field element="language" lang="es_ES" mdschema="dc" qualifier="iso">eng</dim:field>
<dim:field element="rights" lang="es_ES" mdschema="dc">info:eu-repo/semantics/openAccess</dim:field>
<dim:field element="subject" lang="es_ES" mdschema="dc">Robots autónomos</dim:field>
<dim:field element="subject" lang="es_ES" mdschema="dc" qualifier="other">Mobile robots</dim:field>
<dim:field element="subject" lang="es_ES" mdschema="dc" qualifier="other">Reactive navigation</dim:field>
<dim:field element="subject" lang="es_ES" mdschema="dc" qualifier="other">RGBD</dim:field>
<dim:field element="subject" lang="es_ES" mdschema="dc" qualifier="other">Sensor arrays</dim:field>
<dim:field element="title" lang="es_ES" mdschema="dc">3D laser from RGBD projections in robot local navigation</dim:field>
<dim:field element="type" lang="es_ES" mdschema="dc">info:eu-repo/semantics/conferenceObject</dim:field>
<dim:field element="relation" lang="es_ES" mdschema="dc" qualifier="eventtitle">Workshop on Physical Agents WAF 2014</dim:field>
<dim:field element="relation" lang="es_ES" mdschema="dc" qualifier="eventplace">León</dim:field>
<dim:field element="relation" lang="es_ES" mdschema="dc" qualifier="eventdate">12-12 June 2014</dim:field>
</dim:dim>
<?xml version="1.0" encoding="UTF-8" ?>
<europeana:record schemaLocation="http://www.europeana.eu/schemas/ese/ http://www.europeana.eu/schemas/ese/ESE-V3.4.xsd">
<dc:title>3D laser from RGBD projections in robot local navigation</dc:title>
<dc:creator>Calderita, Luis Vicente</dc:creator>
<dc:creator>Bandera-Rubio, Juan Pedro</dc:creator>
<dc:creator>Manso, Luis J.</dc:creator>
<dc:creator>Vázquez-Martín, Ricardo</dc:creator>
<dc:subject>Robots autónomos</dc:subject>
<dc:subject>Mobile robots</dc:subject>
<dc:subject>Reactive navigation</dc:subject>
<dc:subject>RGBD</dc:subject>
<dc:subject>Sensor arrays</dc:subject>
<dc:description>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</dc:description>
<dc:description>Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.</dc:description>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-06</dc:date>
<dc:date>2014-11-03</dc:date>
<dc:type>info:eu-repo/semantics/conferenceObject</dc:type>
<dc:identifier>http://hdl.handle.net/10630/8352</dc:identifier>
<dc:identifier>https://orcid.org/0000-0003-3814-0335</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>Workshop on Physical Agents WAF 2014</dc:relation>
<dc:relation>León</dc:relation>
<dc:relation>12-12 June 2014</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
<europeana:provider>Universidad de Málaga</europeana:provider>
<europeana:type>TEXT</europeana:type>
<europeana:rights>http://rightsstatements.org/vocab/CNE/1.0/</europeana:rights>
<europeana:dataProvider>Universidad de Málaga</europeana:dataProvider>
<europeana:isShownAt>http://hdl.handle.net/10630/8352</europeana:isShownAt>
</europeana:record>
<?xml version="1.0" encoding="UTF-8" ?>
<thesis schemaLocation="http://www.ndltd.org/standards/metadata/etdms/1.0/ http://www.ndltd.org/standards/metadata/etdms/1.0/etdms.xsd">
<title>3D laser from RGBD projections in robot local navigation</title>
<creator>Calderita, Luis Vicente</creator>
<creator>Bandera-Rubio, Juan Pedro</creator>
<creator>Manso, Luis J.</creator>
<creator>Vázquez-Martín, Ricardo</creator>
<subject>Robots autónomos</subject>
<description>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</description>
<date>2014-11-03</date>
<date>2014-11-03</date>
<date>2014-06</date>
<date>2014-11-03</date>
<type>info:eu-repo/semantics/conferenceObject</type>
<identifier>http://hdl.handle.net/10630/8352</identifier>
<identifier>https://orcid.org/0000-0003-3814-0335</identifier>
<language>eng</language>
<relation>Workshop on Physical Agents WAF 2014</relation>
<relation>León</relation>
<relation>12-12 June 2014</relation>
<rights>info:eu-repo/semantics/openAccess</rights>
</thesis>
<?xml version="1.0" encoding="UTF-8" ?>
<record schemaLocation="http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
<leader>00925njm 22002777a 4500</leader>
<datafield ind1=" " ind2=" " tag="042">
<subfield code="a">dc</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Calderita, Luis Vicente</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Bandera-Rubio, Juan Pedro</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Manso, Luis J.</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Vázquez-Martín, Ricardo</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="260">
<subfield code="c">2014-11-03</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="520">
<subfield code="a">Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">http://hdl.handle.net/10630/8352</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">https://orcid.org/0000-0003-3814-0335</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="653">
<subfield code="a">Robots autónomos</subfield>
</datafield>
<datafield ind1="0" ind2="0" tag="245">
<subfield code="a">3D laser from RGBD projections in robot local navigation</subfield>
</datafield>
</record>
<?xml version="1.0" encoding="UTF-8" ?>
<mets ID="DSpace_ITEM_10630-8352" OBJID="hdl:10630/8352" PROFILE="DSpace METS SIP Profile 1.0" TYPE="DSpace ITEM" schemaLocation="http://www.loc.gov/METS/ http://www.loc.gov/standards/mets/mets.xsd">
<metsHdr CREATEDATE="2018-07-03T09:17:50Z">
<agent ROLE="CUSTODIAN" TYPE="ORGANIZATION">
<name>Repositorio Institucional de la Universidad de Málaga</name>
</agent>
</metsHdr>
<dmdSec ID="DMD_10630_8352">
<mdWrap MDTYPE="MODS">
<xmlData schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-1.xsd">
<mods:mods schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-1.xsd">
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Calderita, Luis Vicente</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Bandera-Rubio, Juan Pedro</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Manso, Luis J.</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Vázquez-Martín, Ricardo</mods:namePart>
</mods:name>
<mods:extension>
<mods:dateAccessioned encoding="iso8601">2014-11-03T13:24:39Z</mods:dateAccessioned>
</mods:extension>
<mods:extension>
<mods:dateAvailable encoding="iso8601">2014-11-03T13:24:39Z</mods:dateAvailable>
</mods:extension>
<mods:originInfo>
<mods:dateIssued encoding="iso8601">2014-11-03</mods:dateIssued>
</mods:originInfo>
<mods:identifier type="uri">http://hdl.handle.net/10630/8352</mods:identifier>
<mods:identifier type="orcid">https://orcid.org/0000-0003-3814-0335</mods:identifier>
<mods:abstract>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</mods:abstract>
<mods:language>
<mods:languageTerm authority="rfc3066">eng</mods:languageTerm>
</mods:language>
<mods:accessCondition type="useAndReproduction">info:eu-repo/semantics/openAccess</mods:accessCondition>
<mods:subject>
<mods:topic>Robots autónomos</mods:topic>
</mods:subject>
<mods:titleInfo>
<mods:title>3D laser from RGBD projections in robot local navigation</mods:title>
</mods:titleInfo>
<mods:genre>info:eu-repo/semantics/conferenceObject</mods:genre>
</mods:mods>
</xmlData>
</mdWrap>
</dmdSec>
<amdSec ID="TMD_10630_8352">
<rightsMD ID="RIG_10630_8352">
<mdWrap MDTYPE="OTHER" MIMETYPE="text/plain" OTHERMDTYPE="DSpaceDepositLicense">
<binData>CjEuIEFjZXB0YW5kbyBlc3RhIGxpY2VuY2lhLCB1c3RlZCAoZWwgYXV0b3IvZXMgbyBlbCBwcm9waWV0YXJpby9zIGRlIGxvcyBkZXJlY2hvcyBkZSBhdXRvcikgZ2FyYW50aXphIGEgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBlbCBkZXJlY2hvIG5vIGV4Y2x1c2l2byBkZSBhcmNoaXZhciwgcmVwcm9kdWNpciwgY29udmVydGlyIChjb21vIHNlIGRlZmluZSBtw6FzIGFiYWpvKSwgY29tdW5pY2FyIHkvbyBkaXN0cmlidWlyIHN1IGRvY3VtZW50byBtdW5kaWFsbWVudGUgZW4gZm9ybWF0byBlbGVjdHLDs25pY28uCgoyLiBUYW1iacOpbiBlc3TDoSBkZSBhY3VlcmRvIGNvbiBxdWUgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBwdWVkYSBjb25zZXJ2YXIgbcOhcyBkZSB1bmEgY29waWEgZGUgZXN0ZSBkb2N1bWVudG8geSwgc2luIGFsdGVyYXIgc3UgY29udGVuaWRvLCBjb252ZXJ0aXJsbyBhIGN1YWxxdWllciBmb3JtYXRvIGRlIGZpY2hlcm8sIG1lZGlvIG8gc29wb3J0ZSwgcGFyYSBwcm9ww7NzaXRvcyBkZSBzZWd1cmlkYWQsIHByZXNlcnZhY2nDs24geSBhY2Nlc28uCgozLiBEZWNsYXJhIHF1ZSBlbCBkb2N1bWVudG8gZXMgdW4gdHJhYmFqbyBvcmlnaW5hbCBzdXlvIHkvbyBxdWUgdGllbmUgZWwgZGVyZWNobyBwYXJhIG90b3JnYXIgbG9zIGRlcmVjaG9zIGNvbnRlbmlkb3MgZW4gZXN0YSBsaWNlbmNpYS4gVGFtYmnDqW4gZGVjbGFyYSBxdWUgc3UgZG9jdW1lbnRvIG5vIGluZnJpbmdlLCBlbiB0YW50byBlbiBjdWFudG8gbGUgc2VhIHBvc2libGUgc2FiZXIsIGxvcyBkZXJlY2hvcyBkZSBhdXRvciBkZSBuaW5ndW5hIG90cmEgcGVyc29uYSBvIGVudGlkYWQuCgo0LiBTaSBlbCBkb2N1bWVudG8gY29udGllbmUgbWF0ZXJpYWxlcyBkZSBsb3MgY3VhbGVzIG5vIHRpZW5lIGxvcyBkZXJlY2hvcyBkZSBhdXRvciwgZGVjbGFyYSBxdWUgaGEgb2J0ZW5pZG8gZWwgcGVybWlzbyBzaW4gcmVzdHJpY2Npw7NuIGRlbCBwcm9waWV0YXJpbyBkZSBsb3MgZGVyZWNob3MgZGUgYXV0b3IgcGFyYSBvdG9yZ2FyIGEgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBsb3MgZGVyZWNob3MgcmVxdWVyaWRvcyBwb3IgZXN0YSBsaWNlbmNpYSwgeSBxdWUgZXNlIG1hdGVyaWFsIGN1eW9zIGRlcmVjaG9zIHNvbiBkZSB0ZXJjZXJvcyBlc3TDoSBjbGFyYW1lbnRlIGlkZW50aWZpY2FkbyB5IHJlY29ub2NpZG8gZW4gZWwgdGV4dG8gbyBjb250ZW5pZG8gZGVsIGRvY3VtZW50byBlbnRyZWdhZG8uCgo1LiBTaSBlbCBkb2N1bWVudG8gc2UgYmFzYSBlbiB1bmEgb2JyYSBxdWUgaGEgc2lkbyBwYXRyb2NpbmFkYSBvIGFwb3lhZGEgcG9yIHVuYSBhZ2VuY2lhIHUgb3JnYW5pemFjacOzbiBkaWZlcmVudGUgZGUgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSwgc2UgcHJlc3Vwb25lIHF1ZSBzZSBoYSBjdW1wbGlkbyBjb24gY3VhbHF1aWVyIGRlcmVjaG8gZGUgcmV2aXNpw7NuIHUgb3RyYXMgb2JsaWdhY2lvbmVzIHJlcXVlcmlkYXMgcG9yIGV
zdGUgY29udHJhdG8gbyBhY3VlcmRvLgoKNi4gTGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBpZGVudGlmaWNhcsOhIGNsYXJhbWVudGUgc3UvcyBub21icmUvcyBjb21vIGVsL2xvcyBhdXRvci9lcyBvIHByb3BpZXRhcmlvL3MgZGUgbG9zIGRlcmVjaG9zIGRlbCBkb2N1bWVudG8sIHkgbm8gaGFyw6EgbmluZ3VuYSBhbHRlcmFjacOzbiBkZSBzdSBkb2N1bWVudG8gZGlmZXJlbnRlIGEgbGFzIHBlcm1pdGlkYXMgZW4gZXN0YSBsaWNlbmNpYS4KCg==</binData>
</mdWrap>
</rightsMD>
</amdSec>
<amdSec ID="FO_10630_8352_1">
<techMD ID="TECH_O_10630_8352_1">
<mdWrap MDTYPE="PREMIS">
<xmlData schemaLocation="http://www.loc.gov/standards/premis http://www.loc.gov/standards/premis/PREMIS-v1-0.xsd">
<premis:premis>
<premis:object>
<premis:objectIdentifier>
<premis:objectIdentifierType>URL</premis:objectIdentifierType>
<premis:objectIdentifierValue>https://riuma.uma.es/xmlui/bitstream/10630/8352/1/PresentWAF2014.pdf</premis:objectIdentifierValue>
</premis:objectIdentifier>
<premis:objectCategory>File</premis:objectCategory>
<premis:objectCharacteristics>
<premis:fixity>
<premis:messageDigestAlgorithm>MD5</premis:messageDigestAlgorithm>
<premis:messageDigest>03a2523662636bb51bdd52528df7e617</premis:messageDigest>
</premis:fixity>
<premis:size>378791</premis:size>
<premis:format>
<premis:formatDesignation>
<premis:formatName>application/pdf</premis:formatName>
</premis:formatDesignation>
</premis:format>
</premis:objectCharacteristics>
<premis:originalName>PresentWAF2014.pdf</premis:originalName>
</premis:object>
</premis:premis>
</xmlData>
</mdWrap>
</techMD>
</amdSec>
<fileSec>
<fileGrp USE="ORIGINAL">
<file ADMID="FO_10630_8352_1" CHECKSUM="03a2523662636bb51bdd52528df7e617" CHECKSUMTYPE="MD5" GROUPID="GROUP_BITSTREAM_10630_8352_1" ID="BITSTREAM_ORIGINAL_10630_8352_1" MIMETYPE="application/pdf" SEQ="1" SIZE="378791">
</file>
</fileGrp>
</fileSec>
<structMap LABEL="DSpace Object" TYPE="LOGICAL">
<div ADMID="DMD_10630_8352" TYPE="DSpace Object Contents">
<div TYPE="DSpace BITSTREAM">
</div>
</div>
</structMap>
</mets>
<?xml version="1.0" encoding="UTF-8" ?>
<mods:mods schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-1.xsd">
<mods:name>
<mods:namePart>Calderita, Luis Vicente</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Bandera-Rubio, Juan Pedro</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Manso, Luis J.</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Vázquez-Martín, Ricardo</mods:namePart>
</mods:name>
<mods:extension>
<mods:dateAvailable encoding="iso8601">2014-11-03T13:24:39Z</mods:dateAvailable>
</mods:extension>
<mods:extension>
<mods:dateAccessioned encoding="iso8601">2014-11-03T13:24:39Z</mods:dateAccessioned>
</mods:extension>
<mods:originInfo>
<mods:dateIssued encoding="iso8601">2014-11-03</mods:dateIssued>
</mods:originInfo>
<mods:identifier type="uri">http://hdl.handle.net/10630/8352</mods:identifier>
<mods:identifier type="orcid">https://orcid.org/0000-0003-3814-0335</mods:identifier>
<mods:abstract>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</mods:abstract>
<mods:language>
<mods:languageTerm>eng</mods:languageTerm>
</mods:language>
<mods:accessCondition type="useAndReproduction">info:eu-repo/semantics/openAccess</mods:accessCondition>
<mods:subject>
<mods:topic>Robots autónomos</mods:topic>
</mods:subject>
<mods:titleInfo>
<mods:title>3D laser from RGBD projections in robot local navigation</mods:title>
</mods:titleInfo>
<mods:genre>info:eu-repo/semantics/conferenceObject</mods:genre>
</mods:mods>
<?xml version="1.0" encoding="UTF-8" ?>
<atom:entry schemaLocation="http://www.w3.org/2005/Atom http://www.kbcafe.com/rss/atom.xsd.xml">
<atom:id>http://hdl.handle.net/10630/8352/ore.xml</atom:id>
<atom:published>2014-11-03T13:24:39Z</atom:published>
<atom:updated>2014-11-03T13:24:39Z</atom:updated>
<atom:source>
<atom:generator>Repositorio Institucional de la Universidad de Málaga</atom:generator>
</atom:source>
<atom:title>3D laser from RGBD projections in robot local navigation</atom:title>
<atom:author>
<atom:name>Calderita, Luis Vicente</atom:name>
</atom:author>
<atom:author>
<atom:name>Bandera-Rubio, Juan Pedro</atom:name>
</atom:author>
<atom:author>
<atom:name>Manso, Luis J.</atom:name>
</atom:author>
<atom:author>
<atom:name>Vázquez-Martín, Ricardo</atom:name>
</atom:author>
<oreatom:triples>
<rdf:Description about="http://hdl.handle.net/10630/8352/ore.xml#atom">
<dcterms:modified>2014-11-03T13:24:39Z</dcterms:modified>
</rdf:Description>
<rdf:Description about="https://riuma.uma.es/xmlui/bitstream/10630/8352/1/PresentWAF2014.pdf">
<dcterms:description>ORIGINAL</dcterms:description>
</rdf:Description>
<rdf:Description about="https://riuma.uma.es/xmlui/bitstream/10630/8352/2/license.txt">
<dcterms:description>LICENSE</dcterms:description>
</rdf:Description>
</oreatom:triples>
</atom:entry>
<?xml version="1.0" encoding="UTF-8" ?>
<qdc:qualifieddc schemaLocation="http://purl.org/dc/elements/1.1/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dc.xsd http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dcterms.xsd http://dspace.org/qualifieddc/ http://www.ukoln.ac.uk/metadata/dcmi/xmlschema/qualifieddc.xsd">
<dc:title>3D laser from RGBD projections in robot local navigation</dc:title>
<dc:creator>Calderita, Luis Vicente</dc:creator>
<dc:creator>Bandera-Rubio, Juan Pedro</dc:creator>
<dc:creator>Manso, Luis J.</dc:creator>
<dc:creator>Vázquez-Martín, Ricardo</dc:creator>
<dc:subject>Robots autónomos</dc:subject>
<dcterms:abstract>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</dcterms:abstract>
<dcterms:dateAccepted>2014-11-03T13:24:39Z</dcterms:dateAccepted>
<dcterms:available>2014-11-03T13:24:39Z</dcterms:available>
<dcterms:created>2014-11-03T13:24:39Z</dcterms:created>
<dcterms:issued>2014-11-03</dcterms:issued>
<dc:type>info:eu-repo/semantics/conferenceObject</dc:type>
<dc:identifier>http://hdl.handle.net/10630/8352</dc:identifier>
<dc:identifier>https://orcid.org/0000-0003-3814-0335</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>Workshop on Physical Agents WAF 2014</dc:relation>
<dc:relation>León</dc:relation>
<dc:relation>12-12 June 2014</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
</qdc:qualifieddc>
<?xml version="1.0" encoding="UTF-8" ?>
<rdf:RDF schemaLocation="http://www.openarchives.org/OAI/2.0/rdf/ http://www.openarchives.org/OAI/2.0/rdf.xsd">
<ow:Publication about="oai:riuma.uma.es:10630/8352">
<dc:title>3D laser from RGBD projections in robot local navigation</dc:title>
<dc:creator>Calderita, Luis Vicente</dc:creator>
<dc:creator>Bandera-Rubio, Juan Pedro</dc:creator>
<dc:creator>Manso, Luis J.</dc:creator>
<dc:creator>Vázquez-Martín, Ricardo</dc:creator>
<dc:subject>Robots autónomos</dc:subject>
<dc:description>Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</dc:description>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-11-03T13:24:39Z</dc:date>
<dc:date>2014-06</dc:date>
<dc:date>2014-11-03</dc:date>
<dc:type>info:eu-repo/semantics/conferenceObject</dc:type>
<dc:identifier>http://hdl.handle.net/10630/8352</dc:identifier>
<dc:identifier>https://orcid.org/0000-0003-3814-0335</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>Workshop on Physical Agents WAF 2014</dc:relation>
<dc:relation>León</dc:relation>
<dc:relation>12-12 June 2014</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
</ow:Publication>
</rdf:RDF>
<?xml version="1.0" encoding="UTF-8" ?>
<metadata schemaLocation="http://www.lyncode.com/xoai http://www.lyncode.com/xsd/xoai.xsd">
<element name="dc">
<element name="contributor">
<element name="author">
<element name="none">
<field name="value">Calderita, Luis Vicente</field>
<field name="authority">6103b4a8-fb61-43b8-b5a4-5d2138abecbf</field>
<field name="confidence">500</field>
<field name="value">Bandera-Rubio, Juan Pedro</field>
<field name="authority">153</field>
<field name="confidence">500</field>
<field name="value">Manso, Luis J.</field>
<field name="authority">6892e2f1-473d-46e5-afa3-4440f4b68177</field>
<field name="confidence">500</field>
<field name="value">Vázquez-Martín, Ricardo</field>
<field name="authority">a2e428a6-88bc-4040-b1ed-9ef52c60547d</field>
<field name="confidence">500</field>
</element>
</element>
</element>
<element name="date">
<element name="accessioned">
<element name="none">
<field name="value">2014-11-03T13:24:39Z</field>
</element>
</element>
<element name="available">
<element name="none">
<field name="value">2014-11-03T13:24:39Z</field>
</element>
</element>
<element name="created">
<element name="none">
<field name="value">2014-06</field>
</element>
</element>
<element name="issued">
<element name="none">
<field name="value">2014-11-03</field>
</element>
</element>
</element>
<element name="identifier">
<element name="uri">
<element name="none">
<field name="value">http://hdl.handle.net/10630/8352</field>
</element>
</element>
<element name="orcid">
<element name="es_ES">
<field name="value">https://orcid.org/0000-0003-3814-0335</field>
</element>
</element>
</element>
<element name="description">
<element name="abstract">
<element name="es_ES">
<field name="value">Social robots are required to work in daily life environments. The navigation algorithms they need to safely move through these environments require reliable sensor data. We present a novel approach to increase the obstacle-avoidance abilities of robots by mounting several sensors and fusing all their data into a single representation. In particular, we fuse data from multiple RGBD cameras into a single emulated two-dimensional laser reading of up to 360 degrees. While the output of this virtual laser is two-dimensional, it integrates the obstacles detected at any height, so it can be safely used as input for regular two-dimensional navigation algorithms (both VFH* and R-ORM have been tested). Experiments conducted on real scenarios demonstrate the usefulness and efficiency of the proposed solution, which allows the robot to reach goals while avoiding static and dynamic obstacles.</field>
</element>
</element>
<element name="sponsorship">
<element name="es_ES">
<field name="value">Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.</field>
</element>
</element>
</element>
<element name="language">
<element name="iso">
<element name="es_ES">
<field name="value">eng</field>
</element>
</element>
</element>
<element name="rights">
<element name="es_ES">
<field name="value">info:eu-repo/semantics/openAccess</field>
</element>
</element>
<element name="subject">
<element name="es_ES">
<field name="value">Robots autónomos</field>
</element>
<element name="other">
<element name="es_ES">
<field name="value">Mobile robots</field>
<field name="value">Reactive navigation</field>
<field name="value">RGBD</field>
<field name="value">Sensor arrays</field>
</element>
</element>
</element>
<element name="title">
<element name="es_ES">
<field name="value">3D laser from RGBD projections in robot local navigation</field>
</element>
</element>
<element name="type">
<element name="es_ES">
<field name="value">info:eu-repo/semantics/conferenceObject</field>
</element>
</element>
<element name="relation">
<element name="eventtitle">
<element name="es_ES">
<field name="value">Workshop on Physical Agents WAF 2014</field>
</element>
</element>
<element name="eventplace">
<element name="es_ES">
<field name="value">León</field>
</element>
</element>
<element name="eventdate">
<element name="es_ES">
<field name="value">12-12 June 2014</field>
</element>
</element>
</element>
</element>
<element name="bundles">
<element name="bundle">
<field name="name">ORIGINAL</field>
<element name="bitstreams">
<element name="bitstream">
<field name="name">PresentWAF2014.pdf</field>
<field name="originalName">PresentWAF2014.pdf</field>
<field name="description">Presentation of the paper given at the conference</field>
<field name="format">application/pdf</field>
<field name="size">378791</field>
<field name="url">https://riuma.uma.es/xmlui/bitstream/10630/8352/1/PresentWAF2014.pdf</field>
<field name="checksum">03a2523662636bb51bdd52528df7e617</field>
<field name="checksumAlgorithm">MD5</field>
<field name="sid">1</field>
</element>
</element>
</element>
<element name="bundle">
<field name="name">LICENSE</field>
<element name="bitstreams">
<element name="bitstream">
<field name="name">license.txt</field>
<field name="originalName">license.txt</field>
<field name="format">text/plain; charset=utf-8</field>
<field name="size">1747</field>
<field name="url">https://riuma.uma.es/xmlui/bitstream/10630/8352/2/license.txt</field>
<field name="checksum">71c3055fe9fdc9820f2aca3c57ab7400</field>
<field name="checksumAlgorithm">MD5</field>
<field name="sid">2</field>
</element>
</element>
</element>
</element>
<element name="others">
<field name="handle">10630/8352</field>
<field name="identifier">oai:riuma.uma.es:10630/8352</field>
<field name="lastModifyDate">2017-12-04 16:02:59.281</field>
</element>
<element name="repository">
<field name="name">Repositorio Institucional de la Universidad de Málaga</field>
<field name="mail">riuma@uma.es</field>
</element>
<element name="license">
<field name="bin">CjEuIEFjZXB0YW5kbyBlc3RhIGxpY2VuY2lhLCB1c3RlZCAoZWwgYXV0b3IvZXMgbyBlbCBwcm9waWV0YXJpby9zIGRlIGxvcyBkZXJlY2hvcyBkZSBhdXRvcikgZ2FyYW50aXphIGEgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBlbCBkZXJlY2hvIG5vIGV4Y2x1c2l2byBkZSBhcmNoaXZhciwgcmVwcm9kdWNpciwgY29udmVydGlyIChjb21vIHNlIGRlZmluZSBtw6FzIGFiYWpvKSwgY29tdW5pY2FyIHkvbyBkaXN0cmlidWlyIHN1IGRvY3VtZW50byBtdW5kaWFsbWVudGUgZW4gZm9ybWF0byBlbGVjdHLDs25pY28uCgoyLiBUYW1iacOpbiBlc3TDoSBkZSBhY3VlcmRvIGNvbiBxdWUgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBwdWVkYSBjb25zZXJ2YXIgbcOhcyBkZSB1bmEgY29waWEgZGUgZXN0ZSBkb2N1bWVudG8geSwgc2luIGFsdGVyYXIgc3UgY29udGVuaWRvLCBjb252ZXJ0aXJsbyBhIGN1YWxxdWllciBmb3JtYXRvIGRlIGZpY2hlcm8sIG1lZGlvIG8gc29wb3J0ZSwgcGFyYSBwcm9ww7NzaXRvcyBkZSBzZWd1cmlkYWQsIHByZXNlcnZhY2nDs24geSBhY2Nlc28uCgozLiBEZWNsYXJhIHF1ZSBlbCBkb2N1bWVudG8gZXMgdW4gdHJhYmFqbyBvcmlnaW5hbCBzdXlvIHkvbyBxdWUgdGllbmUgZWwgZGVyZWNobyBwYXJhIG90b3JnYXIgbG9zIGRlcmVjaG9zIGNvbnRlbmlkb3MgZW4gZXN0YSBsaWNlbmNpYS4gVGFtYmnDqW4gZGVjbGFyYSBxdWUgc3UgZG9jdW1lbnRvIG5vIGluZnJpbmdlLCBlbiB0YW50byBlbiBjdWFudG8gbGUgc2VhIHBvc2libGUgc2FiZXIsIGxvcyBkZXJlY2hvcyBkZSBhdXRvciBkZSBuaW5ndW5hIG90cmEgcGVyc29uYSBvIGVudGlkYWQuCgo0LiBTaSBlbCBkb2N1bWVudG8gY29udGllbmUgbWF0ZXJpYWxlcyBkZSBsb3MgY3VhbGVzIG5vIHRpZW5lIGxvcyBkZXJlY2hvcyBkZSBhdXRvciwgZGVjbGFyYSBxdWUgaGEgb2J0ZW5pZG8gZWwgcGVybWlzbyBzaW4gcmVzdHJpY2Npw7NuIGRlbCBwcm9waWV0YXJpbyBkZSBsb3MgZGVyZWNob3MgZGUgYXV0b3IgcGFyYSBvdG9yZ2FyIGEgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBsb3MgZGVyZWNob3MgcmVxdWVyaWRvcyBwb3IgZXN0YSBsaWNlbmNpYSwgeSBxdWUgZXNlIG1hdGVyaWFsIGN1eW9zIGRlcmVjaG9zIHNvbiBkZSB0ZXJjZXJvcyBlc3TDoSBjbGFyYW1lbnRlIGlkZW50aWZpY2FkbyB5IHJlY29ub2NpZG8gZW4gZWwgdGV4dG8gbyBjb250ZW5pZG8gZGVsIGRvY3VtZW50byBlbnRyZWdhZG8uCgo1LiBTaSBlbCBkb2N1bWVudG8gc2UgYmFzYSBlbiB1bmEgb2JyYSBxdWUgaGEgc2lkbyBwYXRyb2NpbmFkYSBvIGFwb3lhZGEgcG9yIHVuYSBhZ2VuY2lhIHUgb3JnYW5pemFjacOzbiBkaWZlcmVudGUgZGUgbGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSwgc2UgcHJlc3Vwb25lIHF1ZSBzZSBoYSBjdW1wbGlkbyBjb24gY3VhbHF1aWVyIGRlcmVjaG8gZGUgcmV2aXNpw7NuIHUgb3RyYXMgb2JsaWdhY2lvbmVzIHJlcXVlcmlkYX
MgcG9yIGVzdGUgY29udHJhdG8gbyBhY3VlcmRvLgoKNi4gTGEgVW5pdmVyc2lkYWQgZGUgTcOhbGFnYSBpZGVudGlmaWNhcsOhIGNsYXJhbWVudGUgc3UvcyBub21icmUvcyBjb21vIGVsL2xvcyBhdXRvci9lcyBvIHByb3BpZXRhcmlvL3MgZGUgbG9zIGRlcmVjaG9zIGRlbCBkb2N1bWVudG8sIHkgbm8gaGFyw6EgbmluZ3VuYSBhbHRlcmFjacOzbiBkZSBzdSBkb2N1bWVudG8gZGlmZXJlbnRlIGEgbGFzIHBlcm1pdGlkYXMgZW4gZXN0YSBsaWNlbmNpYS4KCg==</field>
</element>
</metadata>