ForestHash : semantic hashing with shallow random forests and tiny convolutional networks
Abstract:
In this paper, we introduce a random forest semantic hashing scheme that embeds tiny convolutional neural networks (CNN) into shallow random forests. A binary hash code for a data point is obtained by a set of decision trees, setting ‘1’ for the visited tree leaf, and ‘0’ for the rest. We propose to first randomly group the classes arriving at each tree split node into two groups, obtaining a significantly simplified two-class classification problem that can be handled with a light-weight CNN weak learner. Code uniqueness is achieved via the random class grouping, whilst code consistency is achieved using a low-rank loss in the CNN weak learners that encourages intra-class compactness for the two random class groups. Finally, we introduce an information-theoretic approach for aggregating codes of individual trees into a single hash code, producing a near-optimal unique hash for each class. The proposed approach significantly outperforms state-of-the-art hashing methods for image retrieval tasks on large-scale public datasets, and is comparable to image classification methods while utilizing a more compact, efficient and scalable representation. This work proposes a principled and robust procedure to train and deploy in parallel an ensemble of light-weight CNNs, instead of simply going deeper.
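The abstract describes how a data point's binary code is read off the forest: each tree routes the sample through binary splits to a leaf, and the code sets ‘1’ for the visited leaf and ‘0’ for all others. The sketch below is only an illustration of that encoding step under the assumption of already-trained trees; the names (`TreeNode`, `route`, `forest_hash`) are hypothetical, and simple threshold splits stand in for the paper's CNN weak learners, low-rank loss, and information-theoretic aggregation, which are not reproduced here. For retrieval, such binary codes would typically be compared by Hamming distance.

```python
import numpy as np

class TreeNode:
    """A split node holding a binary decision over two random class groups,
    or a leaf identified by `leaf_id`."""
    def __init__(self, split_fn=None, left=None, right=None, leaf_id=None):
        self.split_fn = split_fn   # maps a sample x -> False (go left) / True (go right)
        self.left, self.right = left, right
        self.leaf_id = leaf_id     # set only on leaves

def route(node, x):
    """Follow splits until a leaf is reached; return its id."""
    while node.leaf_id is None:
        node = node.right if node.split_fn(x) else node.left
    return node.leaf_id

def forest_hash(trees, n_leaves_per_tree, x):
    """One-hot encode the visited leaf of each tree ('1' for the visited leaf,
    '0' for the rest) and concatenate the per-tree blocks into one binary code."""
    code = np.zeros(len(trees) * n_leaves_per_tree, dtype=np.uint8)
    for t, root in enumerate(trees):
        code[t * n_leaves_per_tree + route(root, x)] = 1
    return code

# Toy usage: depth-1 trees whose threshold splits stand in for trained CNN weak learners.
toy_tree = TreeNode(split_fn=lambda x: x[0] > 0.5,
                    left=TreeNode(leaf_id=0), right=TreeNode(leaf_id=1))
print(forest_hash([toy_tree, toy_tree], n_leaves_per_tree=2, x=np.array([0.7])))
# -> [0 1 0 1]  (each tree contributes a one-hot block over its leaves)
```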
Year | 2018 |
Subjects | Computing methodologies; Machine learning; Procesamiento de Señales (Signal Processing) |
Language | English |
Institution | Universidad de la República |
Repository | COLIBRI |
URI | https://hdl.handle.net/20.500.12008/43551 |
Access | Open access |
License | Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0) |
_version_ | 1807522994947358720 |
---|---|
author | Sapiro, Guillermo |
author2 | Bronstein, Alex; Lezama, José; Qiu, Qiang |
author2_role | author; author; author |
author_facet | Sapiro, Guillermo; Bronstein, Alex; Lezama, José; Qiu, Qiang |
author_role | author |
collection | COLIBRI |
dc.creator.none.fl_str_mv | Sapiro, Guillermo; Bronstein, Alex; Lezama, José; Qiu, Qiang |
dc.date.accessioned.none.fl_str_mv | 2024-04-16T16:21:22Z |
dc.date.available.none.fl_str_mv | 2024-04-16T16:21:22Z |
dc.date.issued.es.fl_str_mv | 2018 |
dc.date.submitted.es.fl_str_mv | 20240416 |
dc.description.abstract.none.fl_txt_mv | In this paper, we introduce a random forest semantic hashing scheme that embeds tiny convolutional neural networks (CNN) into shallow random forests. A binary hash code for a data point is obtained by a set of decision trees, setting ‘1’ for the visited tree leaf, and ‘0’ for the rest. We propose to first randomly group the classes arriving at each tree split node into two groups, obtaining a significantly simplified two-class classification problem that can be handled with a light-weight CNN weak learner. Code uniqueness is achieved via the random class grouping, whilst code consistency is achieved using a low-rank loss in the CNN weak learners that encourages intra-class compactness for the two random class groups. Finally, we introduce an information-theoretic approach for aggregating codes of individual trees into a single hash code, producing a near-optimal unique hash for each class. The proposed approach significantly outperforms state-of-the-art hashing methods for image retrieval tasks on large-scale public datasets, and is comparable to image classification methods while utilizing a more compact, efficient and scalable representation. This work proposes a principled and robust procedure to train and deploy in parallel an ensemble of light-weight CNNs, instead of simply going deeper. |
dc.description.es.fl_txt_mv | Paper presented at the 15th European Conference on Computer Vision (ECCV 2018), Munich, Germany, September 8-14, 2018 |
dc.identifier.citation.es.fl_str_mv | Qiu, Q.; Lezama, J.; Bronstein, A.; Sapiro, G. "ForestHash : semantic hashing with shallow random forests and tiny convolutional networks". Published in: Proceedings of the 15th European Conference on Computer Vision (ECCV 2018), Munich, Germany, September 8-14, 2018, Part II, pp. 442-459. https://doi.org/10.1007/978-3-030-01216-8_27 |
dc.identifier.uri.none.fl_str_mv | https://hdl.handle.net/20.500.12008/43551 |
dc.language.iso.none.fl_str_mv | en eng |
dc.rights.license.none.fl_str_mv | Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0) |
dc.rights.none.fl_str_mv | info:eu-repo/semantics/openAccess |
dc.source.none.fl_str_mv | reponame:COLIBRI instname:Universidad de la República instacron:Universidad de la República |
dc.subject.es.fl_str_mv | Computing methodologies Machine learning |
dc.subject.other.es.fl_str_mv | Procesamiento de Señales |
dc.title.none.fl_str_mv | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
dc.type.es.fl_str_mv | Ponencia (conference paper) |
dc.type.none.fl_str_mv | info:eu-repo/semantics/conferenceObject |
dc.type.version.none.fl_str_mv | info:eu-repo/semantics/publishedVersion |
description | Paper presented at the 15th European Conference on Computer Vision (ECCV 2018), Munich, Germany, September 8-14, 2018 |
eu_rights_str_mv | openAccess |
format | conferenceObject |
id | COLIBRI_3a75e21c2f40fb607ba6aea1bce2fb28 |
identifier_str_mv | Qiu, Q.; Lezama, J.; Bronstein, A.; Sapiro, G. "ForestHash : semantic hashing with shallow random forests and tiny convolutional networks". Published in: Proceedings of the 15th European Conference on Computer Vision (ECCV 2018), Munich, Germany, September 8-14, 2018, Part II, pp. 442-459. https://doi.org/10.1007/978-3-030-01216-8_27 |
instacron_str | Universidad de la República |
institution | Universidad de la República |
instname_str | Universidad de la República |
language | eng |
language_invalid_str_mv | en |
network_acronym_str | COLIBRI |
network_name_str | COLIBRI |
oai_identifier_str | oai:colibri.udelar.edu.uy:20.500.12008/43551 |
publishDate | 2018 |
reponame_str | COLIBRI |
repository.mail.fl_str_mv | mabel.seroubian@seciu.edu.uy |
repository.name.fl_str_mv | COLIBRI - Universidad de la República |
repository_id_str | 4771 |
rights_invalid_str_mv | Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0) |
spellingShingle | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks Sapiro, Guillermo Computing methodologies Machine learning Procesamiento de Señales |
status_str | publishedVersion |
title | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
title_full | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
title_fullStr | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
title_full_unstemmed | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
title_short | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
title_sort | ForestHash : semantic hashing with shallow random forests and tiny convolutional networks |
topic | Computing methodologies Machine learning Procesamiento de Señales |
url | https://hdl.handle.net/20.500.12008/43551 |