Distributed sparse linear regression

Mateos, Gonzalo - Bazerque, Juan Andrés - Giannakis, Georgios B

Abstract:

The Lasso is a popular technique for joint estimation and continuous variable selection, especially well-suited for sparse and possibly under-determined linear regression problems. This paper develops algorithms to estimate the regression coefficients via the Lasso when the training data are distributed across different agents, and their communication to a central processing unit is prohibited due to, e.g., communication cost or privacy reasons. A motivating application is explored in the context of wireless communications, whereby sensing cognitive radios collaborate to estimate the radio-frequency power spectrum density. Attaining different tradeoffs between complexity and convergence speed, three novel algorithms are obtained after reformulating the Lasso into a separable form, which is iteratively minimized using the alternating-direction method of multipliers so as to gain the desired degree of parallelization. Interestingly, the per-agent estimate updates are given by simple soft-thresholding operations, and the inter-agent communication overhead remains at an affordable level. Without exchanging elements from the different training sets, the local estimates reach consensus on the global Lasso solution, i.e., the fit that would be obtained if the entire data set were centrally available. Numerical experiments with both simulated and real data demonstrate the merits of the proposed distributed schemes, corroborating their convergence and global optimality. The ideas in this paper can easily be extended to fit related models in a distributed fashion, including the adaptive Lasso, elastic net, fused Lasso, and nonnegative garrote.
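To illustrate the kind of per-agent soft-thresholding updates the abstract refers to, here is a minimal sketch of consensus ADMM for the Lasso over row-partitioned data. It follows the spirit of the approach described above (each agent keeps a local copy of the regression vector, solves a small subproblem with its own data only, and a shared variable is updated by soft-thresholding), but it is not the paper's exact recursions: it assumes a simple averaging step in place of the paper's in-network consensus protocol, and all names and parameters (consensus_lasso_admm, rho, lam) are illustrative.

```python
# Hedged sketch: consensus ADMM for the Lasso over data partitioned across agents.
# This is NOT the paper's exact algorithm; it is the standard global-consensus
# formulation, included only to show how the per-agent updates reduce to a small
# local least-squares step plus an entrywise soft-thresholding step.
import numpy as np

def soft_threshold(v, tau):
    """Entrywise soft-thresholding operator S_tau(v) = sign(v) * max(|v| - tau, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def consensus_lasso_admm(X_parts, y_parts, lam=0.1, rho=1.0, n_iter=200):
    """Approximate min_w sum_j 0.5*||y_j - X_j w||^2 + lam*||w||_1 via consensus ADMM.

    X_parts, y_parts : per-agent design matrices / response vectors.
    Returns the consensus estimate z (approximately the centralized Lasso fit).
    """
    J = len(X_parts)
    p = X_parts[0].shape[1]
    w = [np.zeros(p) for _ in range(J)]   # local copies of the regression vector
    u = [np.zeros(p) for _ in range(J)]   # scaled dual variables
    z = np.zeros(p)                       # shared (consensus) variable
    # Pre-factor each agent's ridge system (X_j^T X_j + rho I) once.
    factors = [np.linalg.cholesky(X.T @ X + rho * np.eye(p)) for X in X_parts]
    for _ in range(n_iter):
        # Local updates: each agent uses only its own data; no raw data is exchanged.
        for j in range(J):
            rhs = X_parts[j].T @ y_parts[j] + rho * (z - u[j])
            w[j] = np.linalg.solve(factors[j].T, np.linalg.solve(factors[j], rhs))
        # Consensus update: soft-thresholding of the average of (w_j + u_j).
        z = soft_threshold(np.mean([w[j] + u[j] for j in range(J)], axis=0),
                           lam / (rho * J))
        # Dual updates.
        for j in range(J):
            u[j] += w[j] - z
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.5, 1.0]                      # sparse ground truth
    X = rng.standard_normal((60, 20))
    y = X @ w_true + 0.05 * rng.standard_normal(60)
    X_parts, y_parts = np.array_split(X, 4), np.array_split(y, 4)  # 4 "agents"
    print(consensus_lasso_admm(X_parts, y_parts, lam=0.5))
```

On this toy data the consensus estimate should be close to the Lasso fit one would obtain from the stacked (centralized) data, mirroring the convergence-to-the-global-solution property claimed in the abstract, though with the simplifying assumptions noted above.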


Bibliographic Details
2010
Distributed linear regression
Lasso
Parallel optimization
Sparse estimation
Sistemas y Control
English
Universidad de la República
COLIBRI
https://hdl.handle.net/20.500.12008/38723
Open access
Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0)
author Mateos, Gonzalo
author2 Bazerque, Juan Andrés
Giannakis, Georgios B
author2_role author
author
author_facet Mateos, Gonzalo
Bazerque, Juan Andrés
Giannakis, Georgios B
author_role author
collection COLIBRI
dc.creator.none.fl_str_mv Mateos, Gonzalo
Bazerque, Juan Andrés
Giannakis, Georgios B
dc.date.accessioned.none.fl_str_mv 2023-08-01T20:33:30Z
dc.date.available.none.fl_str_mv 2023-08-01T20:33:30Z
dc.date.issued.es.fl_str_mv 2010
dc.date.submitted.es.fl_str_mv 20230801
dc.description.abstract.none.fl_txt_mv The Lasso is a popular technique for joint estimation and continuous variable selection, especially well-suited for sparse and possibly under-determined linear regression problems. This paper develops algorithms to estimate the regression coefficients via the Lasso when the training data are distributed across different agents, and their communication to a central processing unit is prohibited due to, e.g., communication cost or privacy reasons. A motivating application is explored in the context of wireless communications, whereby sensing cognitive radios collaborate to estimate the radio-frequency power spectrum density. Attaining different tradeoffs between complexity and convergence speed, three novel algorithms are obtained after reformulating the Lasso into a separable form, which is iteratively minimized using the alternating-direction method of multipliers so as to gain the desired degree of parallelization. Interestingly, the per-agent estimate updates are given by simple soft-thresholding operations, and the inter-agent communication overhead remains at an affordable level. Without exchanging elements from the different training sets, the local estimates reach consensus on the global Lasso solution, i.e., the fit that would be obtained if the entire data set were centrally available. Numerical experiments with both simulated and real data demonstrate the merits of the proposed distributed schemes, corroborating their convergence and global optimality. The ideas in this paper can easily be extended to fit related models in a distributed fashion, including the adaptive Lasso, elastic net, fused Lasso, and nonnegative garrote.
dc.identifier.citation.es.fl_str_mv Mateos, G., Bazerque, J. A., and Giannakis, G. B., “Distributed sparse linear regression,” IEEE Transactions on Signal Processing, vol. 58, no. 10, 2010.
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/20.500.12008/38723
dc.language.iso.none.fl_str_mv en
eng
dc.rights.license.none.fl_str_mv Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0)
dc.rights.none.fl_str_mv info:eu-repo/semantics/openAccess
dc.source.none.fl_str_mv reponame:COLIBRI
instname:Universidad de la República
instacron:Universidad de la República
dc.subject.es.fl_str_mv Distributed linear regression
Lasso
Parallel optimization
Sparse estimation
dc.subject.other.es.fl_str_mv Sistemas y Control
dc.title.none.fl_str_mv Distributed sparse linear regression
dc.type.es.fl_str_mv Article
dc.type.none.fl_str_mv info:eu-repo/semantics/article
dc.type.version.none.fl_str_mv info:eu-repo/semantics/publishedVersion
description The Lasso is a popular technique for joint estimation and continuous variable selection, especially well-suited for sparse and possibly under-determined linear regression problems. This paper develops algorithms to estimate the regression coefficients via the Lasso when the training data are distributed across different agents, and their communication to a central processing unit is prohibited due to, e.g., communication cost or privacy reasons. A motivating application is explored in the context of wireless communications, whereby sensing cognitive radios collaborate to estimate the radio-frequency power spectrum density. Attaining different tradeoffs between complexity and convergence speed, three novel algorithms are obtained after reformulating the Lasso into a separable form, which is iteratively minimized using the alternating-direction method of multipliers so as to gain the desired degree of parallelization. Interestingly, the per-agent estimate updates are given by simple soft-thresholding operations, and the inter-agent communication overhead remains at an affordable level. Without exchanging elements from the different training sets, the local estimates reach consensus on the global Lasso solution, i.e., the fit that would be obtained if the entire data set were centrally available. Numerical experiments with both simulated and real data demonstrate the merits of the proposed distributed schemes, corroborating their convergence and global optimality. The ideas in this paper can easily be extended to fit related models in a distributed fashion, including the adaptive Lasso, elastic net, fused Lasso, and nonnegative garrote.
eu_rights_str_mv openAccess
format article
id COLIBRI_6a820c236a3164fba3062507ec5d3acc
identifier_str_mv Mateos, G., Bazerque, J. A., and Giannakis, G. B., “Distributed sparse linear regression,” IEEE Transactions on Signal Processing, vol. 58, no. 10, 2010.
instacron_str Universidad de la República
institution Universidad de la República
instname_str Universidad de la República
language eng
language_invalid_str_mv en
network_acronym_str COLIBRI
network_name_str COLIBRI
oai_identifier_str oai:colibri.udelar.edu.uy:20.500.12008/38723
publishDate 2010
reponame_str COLIBRI
repository.mail.fl_str_mv mabel.seroubian@seciu.edu.uy
repository.name.fl_str_mv COLIBRI - Universidad de la República
repository_id_str 4771
rights_invalid_str_mv Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0)
spellingShingle Distributed sparse linear regression
Mateos, Gonzalo
Distributed linear regression
Lasso
Parallel optimization
Sparse estimation
Sistemas y Control
status_str publishedVersion
title Distributed sparse linear regression
title_full Distributed sparse linear regression
title_fullStr Distributed sparse linear regression
title_full_unstemmed Distributed sparse linear regression
title_short Distributed sparse linear regression
title_sort Distributed sparse linear regression
topic Distributed linear regression
Lasso
Parallel optimization
Sparse estimation
Sistemas y Control
url https://hdl.handle.net/20.500.12008/38723