Generation of English question answer exercises from texts using Transformers-based models
Abstract:
This paper studies the use of NLP techniques, in particular, neural language models, for the generation of question/answer exercises from English texts. The experiments aim to generate beginner-level exercises from simple texts, to be used in teaching ESL (English as a Second Language) to children. The approach we present in this paper is based on four stages: a pre-processing stage that, among other basic tasks, applies a co-reference resolution tool; an answer candidate selection stage, which is based on semantic role labeling; a question generation stage, which takes as input the text with the resolved co-references and returns a set of questions for each answer candidate using a language model based on the Transformers architecture; and a post-processing stage that adjusts the format of the generated questions. The question generation model was evaluated on a benchmark obtaining similar results to those of previous works, and the complete pipeline was evaluated on a corpus specifically created for this task, achieving good results.
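To make the pipeline description above concrete, the sketch below shows one way such a four-stage system could be wired together in Python. It is not the authors' implementation: the co-reference resolution and semantic role labeling stages are stubbed out with placeholders, and the generation stage uses a publicly available T5 question-generation checkpoint (valhalla/t5-base-qg-hl, assumed here purely for illustration) instead of the model trained for the paper.

```python
"""Minimal sketch of the four-stage question-generation pipeline (illustrative only)."""
from transformers import pipeline

# Assumption: an off-the-shelf highlight-based question-generation checkpoint.
QG_MODEL = "valhalla/t5-base-qg-hl"


def resolve_coreferences(text: str) -> str:
    """Stage 1 (pre-processing): replace pronouns with their antecedents.

    Placeholder: a real system would call a co-reference resolution tool here.
    """
    return text


def select_answer_candidates(text: str) -> list[str]:
    """Stage 2: choose answer candidates.

    Placeholder heuristic (capitalized tokens); the paper instead selects
    candidates from semantic role labeling arguments.
    """
    return [tok.strip(".,") for tok in text.split() if tok[:1].isupper()]


def postprocess(question: str) -> str:
    """Stage 4 (post-processing): fix capitalization and the final question mark."""
    question = question.strip()
    if question and not question.endswith("?"):
        question += "?"
    return question[:1].upper() + question[1:]


def generate_questions(text: str, answers: list[str]) -> list[tuple[str, str]]:
    """Stage 3: generate one question per answer candidate with a Transformer model."""
    qg = pipeline("text2text-generation", model=QG_MODEL)
    pairs = []
    for answer in answers:
        # This checkpoint expects the answer span highlighted with <hl> tokens.
        highlighted = text.replace(answer, f"<hl> {answer} <hl>", 1)
        result = qg(f"generate question: {highlighted}", max_length=64)
        pairs.append((postprocess(result[0]["generated_text"]), answer))
    return pairs


if __name__ == "__main__":
    story = "Anna has a small dog. She walks the dog in the park every morning."
    resolved = resolve_coreferences(story)
    for question, answer in generate_questions(resolved, select_answer_candidates(resolved)):
        print(f"{question}  ->  {answer}")
```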
Year: 2022
Funding: Agencia Nacional de Investigación e Innovación. Proyecto FSED_2_2020_1_163587.
Keywords: NLP for language teaching; Question & answering; Transformers; Neural language models
Language: English
Institution: Universidad de la República
Repository: COLIBRI
URI: https://hdl.handle.net/20.500.12008/37155
Access: Open access
License: Creative Commons Attribution-NonCommercial-NoDerivatives license (CC BY-NC-ND 4.0)
Field | Value |
---|---|
_version_ | 1807522945855127552 |
author | Berger, Gonzalo |
author2 | Rischewski, Tatiana; Chiruzzo, Luis; Rosá, Aiala |
author2_role | author; author; author |
author_facet | Berger, Gonzalo; Rischewski, Tatiana; Chiruzzo, Luis; Rosá, Aiala |
author_role | author |
collection | COLIBRI |
dc.contributor.filiacion.none.fl_str_mv | Berger Gonzalo, Universidad de la República (Uruguay). Facultad de Ingeniería; Rischewski Tatiana, Universidad de la República (Uruguay). Facultad de Ingeniería; Chiruzzo Luis, Universidad de la República (Uruguay). Facultad de Ingeniería; Rosá Aiala, Universidad de la República (Uruguay). Facultad de Ingeniería |
dc.creator.none.fl_str_mv | Berger, Gonzalo; Rischewski, Tatiana; Chiruzzo, Luis; Rosá, Aiala |
dc.date.accessioned.none.fl_str_mv | 2023-05-16T16:27:08Z |
dc.date.available.none.fl_str_mv | 2023-05-16T16:27:08Z |
dc.date.issued.none.fl_str_mv | 2022 |
dc.description.abstract.none.fl_txt_mv | This paper studies the use of NLP techniques, in particular, neural language models, for the generation of question/answer exercises from English texts. The experiments aim to generate beginner-level exercises from simple texts, to be used in teaching ESL (English as a Second Language) to children. The approach we present in this paper is based on four stages: a pre-processing stage that, among other basic tasks, applies a co-reference resolution tool; an answer candidate selection stage, which is based on semantic role labeling; a question generation stage, which takes as input the text with the resolved co-references and returns a set of questions for each answer candidate using a language model based on the Transformers architecture; and a post-processing stage that adjusts the format of the generated questions. The question generation model was evaluated on a benchmark obtaining similar results to those of previous works, and the complete pipeline was evaluated on a corpus specifically created for this task, achieving good results. |
dc.description.es.fl_txt_mv | 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI), 23-25 November 2022, Montevideo, Uruguay. |
dc.description.sponsorship.none.fl_txt_mv | Agencia Nacional de Investigación e Innovación. Proyecto FSED_2_2020_1_163587. |
dc.format.extent.es.fl_str_mv | 5 p. |
dc.format.mimetype.es.fl_str_mv | application/pdf |
dc.identifier.citation.es.fl_str_mv | Berger, G., Rischewski, T., Chiruzzo, L. et al. Generation of english question answer exercises from texts using transformers based models [online]. In: 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI), 23-25 November 2022, Montevideo, Uruguay. 5 p. DOI: 10.1109/LA-CCI54402.2022.9981171 |
dc.identifier.doi.none.fl_str_mv | 10.1109/LA-CCI54402.2022.9981171 |
dc.identifier.uri.none.fl_str_mv | https://hdl.handle.net/20.500.12008/37155 |
dc.language.iso.none.fl_str_mv | en eng |
dc.publisher.es.fl_str_mv | IEEE |
dc.rights.license.none.fl_str_mv | Creative Commons Attribution-NonCommercial-NoDerivatives license (CC BY-NC-ND 4.0) |
dc.rights.none.fl_str_mv | info:eu-repo/semantics/openAccess |
dc.source.none.fl_str_mv | reponame:COLIBRI; instname:Universidad de la República; instacron:Universidad de la República |
dc.subject.es.fl_str_mv | NLP for language teaching; Question & answering; Transformers; Neural language models |
dc.title.none.fl_str_mv | Generation of english question answer exercises from texts using transformers based models |
dc.type.es.fl_str_mv | Conference paper (Ponencia) |
dc.type.none.fl_str_mv | info:eu-repo/semantics/conferenceObject |
dc.type.version.none.fl_str_mv | info:eu-repo/semantics/publishedVersion |
description | 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI), 23-25 November 2022, Montevideo, Uruguay. |
eu_rights_str_mv | openAccess |
format | conferenceObject |
id | COLIBRI_e9fd37fc3f5c5a971a1a77b37a2c1f58 |
identifier_str_mv | Berger, G., Rischewski, T., Chiruzzo, L. et al. Generation of english question answer exercises from texts using transformers based models [online]. In: 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI), 23-25 November 2022, Montevideo, Uruguay. 5 p. DOI: 10.1109/LA-CCI54402.2022.9981171; 10.1109/LA-CCI54402.2022.9981171 |
instacron_str | Universidad de la República |
institution | Universidad de la República |
instname_str | Universidad de la República |
language | eng |
language_invalid_str_mv | en |
network_acronym_str | COLIBRI |
network_name_str | COLIBRI |
oai_identifier_str | oai:colibri.udelar.edu.uy:20.500.12008/37155 |
publishDate | 2022 |
reponame_str | COLIBRI |
repository.mail.fl_str_mv | mabel.seroubian@seciu.edu.uy |
repository.name.fl_str_mv | COLIBRI - Universidad de la República |
repository_id_str | 4771 |
rights_invalid_str_mv | Creative Commons Attribution-NonCommercial-NoDerivatives license (CC BY-NC-ND 4.0) |
spellingShingle | Generation of english question answer exercises from texts using transformers based models; Berger, Gonzalo; NLP for language teaching; Question & answering; Transformers; Neural language models |
status_str | publishedVersion |
title | Generation of english question answer exercises from texts using transformers based models |
title_full | Generation of english question answer exercises from texts using transformers based models |
title_fullStr | Generation of english question answer exercises from texts using transformers based models |
title_full_unstemmed | Generation of english question answer exercises from texts using transformers based models |
title_short | Generation of english question answer exercises from texts using transformers based models |
title_sort | Generation of english question answer exercises from texts using transformers based models |
topic | NLP for language teaching; Question & answering; Transformers; Neural language models |
url | https://hdl.handle.net/20.500.12008/37155 |