This repository has been archived by the owner on May 21, 2022. It is now read-only.

Implement Sparse Coding #2

Open
tbreloff opened this issue Sep 7, 2016 · 2 comments

@tbreloff
Member

tbreloff commented Sep 7, 2016

Sparse Coding is a framework for finding basis vectors and weights that decompose an input vector into a linear combination of a small subset of the basis vectors. The framework is very similar to Empirical Risk Minimization in that there's a loss and a penalty, except that the penalty is on the output, not the parameters. I believe sparse coding refers specifically to a linear model, but of course any combination of transformation/loss/penalty could be used... we just need to add a penalty on the output. Should we just go ahead and add this to RegularizedObjective? It can default to NoPenalty.
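
In ERM terms, that would mean minimizing something like loss + λ*penalty(output) rather than the usual loss + λ*penalty(parameters).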

ref:
http://ufldl.stanford.edu/wiki/index.php/Sparse_Coding
http://gabgoh.github.io/SARG/

cc: @gabgoh

@tbreloff
Member Author

tbreloff commented Sep 7, 2016

Rethinking this a bit, I don't think this is correct:

> The framework is very similar to Empirical Risk Minimization in that there's a loss and a penalty, except that the penalty is on the output, not the parameters

If you consider the input x as being approximated by the transformation y = w*a, then we want to enforce sparsity on a, not y. Couple that with the fact that both w and a are "parameters" to be learned, and you have something very different from ERM. I think it requires a separate type/implementation.
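
As a rough sketch of what such an implementation might compute (hypothetical names, nothing here exists in the package yet): alternate between a sparse-coding step for a (ISTA with an L1 penalty) and a gradient step for w, e.g.:

```julia
using LinearAlgebra

# Soft-thresholding operator, the proximal map of the L1 penalty.
soft_threshold(z, t) = sign(z) * max(abs(z) - t, zero(z))

# Sparse-coding step: minimize 0.5*norm(x - w*a)^2 + λ*norm(a, 1) over the
# code `a`, holding the dictionary `w` fixed (ISTA / proximal gradient).
function sparse_code(w, x, λ; iters = 200)
    a = zeros(size(w, 2))
    L = opnorm(w)^2 + eps()          # Lipschitz constant of the smooth part
    for _ in 1:iters
        grad = w' * (w * a - x)      # gradient of the reconstruction loss
        a = soft_threshold.(a .- grad ./ L, λ / L)
    end
    return a
end

# Dictionary step: one gradient update on `w` with `a` fixed, then rescale
# the columns to unit norm so the sparsity penalty on `a` stays meaningful.
function update_dictionary(w, x, a; η = 0.01)
    w = w - η * (w * a - x) * a'
    for j in 1:size(w, 2)
        w[:, j] ./= max(norm(w[:, j]), eps())
    end
    return w
end

# Alternate over both "parameters" w and a for a single input x.
function dictionary_learning(x; k = 50, outer = 50, λ = 0.1)
    w = randn(length(x), k)
    a = zeros(k)
    for _ in 1:outer
        a = sparse_code(w, x, λ)
        w = update_dictionary(w, x, a)
    end
    return w, a
end

w, a = dictionary_learning(randn(20))
```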

@gabgoh

gabgoh commented Sep 7, 2016

I think there are two separate problems to consider: the Dictionary Learning problem (learning both w and a) and Sparse coding (learning just a with w fixed). The latter is also known as sparse regression, which could be ERM? (I'm not familiar with ERM, but it sounds like a generalized regression.)
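
Concretely: with w fixed, minimizing ||x - w*a||^2 + λ*||a||_1 over a alone is just the lasso, so it would fit an ERM template with a playing the role of the parameter vector. The joint problem over both w and a (dictionary learning) is non-convex and is typically solved by alternating between the two.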
