Commit ec78e73d authored by briggan2's avatar briggan2

updated README

parent 0dc9739b
This repository contains:
## Instructions
### 1. Complete omp.py:
`omp.py` contains the following subroutine
```python
def omp(D, x, a, L=None, tol=1e-6):
    """ perform orthogonal matching pursuit
    inputs:
        D: dictionary (k x d numpy array s.t. k > d)
        x: image/data vector (1 x d numpy array)
        a: sparse vector (1 x k numpy array)
        L: maximum number of sparse coefficients (default: 0.5*k)
        tol: tolerance on the residual norm (default: 1e-6)
    outputs:
        a: updated sparse vector (1 x k numpy array)
    """
    # dims
    k, d = D.shape
    # default parameter
    if L is None:
        L = int(0.5 * k)
    # *** init parameters here ***
    # *** put code for omp here ***
    return a
```
which needs to be completed using only the `numpy` package.
**No other packages should be used**
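For orientation, one possible greedy OMP loop under the docstring's conventions (atoms as unit-norm rows of `D`, so `x ≈ a @ D`) is sketched below. This is only an illustrative sketch, not the required implementation; the name `omp_sketch` and the re-fitting strategy are assumptions.

```python
import numpy as np

def omp_sketch(D, x, L=None, tol=1e-6):
    """Illustrative OMP: approximate x (length d) by a @ D with at most
    L nonzero entries in a (length k). Assumes the rows of D (the atoms)
    have unit norm."""
    k, d = D.shape
    if L is None:
        L = int(0.5 * k)
    x = np.asarray(x, dtype=float).ravel()
    a = np.zeros(k)
    r = x.copy()                     # current residual
    support = []                     # indices of selected atoms
    coef = np.zeros(0)
    for _ in range(int(L)):
        # pick the atom most correlated with the residual
        j = int(np.argmax(np.abs(D @ r)))
        if j not in support:
            support.append(j)
        # re-fit x on all selected atoms by least squares
        coef, *_ = np.linalg.lstsq(D[support].T, x, rcond=None)
        r = x - D[support].T @ coef
        if np.linalg.norm(r) < tol:
            break
    a[support] = coef
    return a
```

The key invariant is that after each selection the coefficients of *all* chosen atoms are re-solved jointly, which is what distinguishes orthogonal matching pursuit from plain matching pursuit.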
### 2. Complete dictlearn.py
`dictlearn.py` includes the following subroutine that needs to be completed.
```python
def ksvd(X, **args):
    ''' k-svd algorithm
    inputs:
        X: images (n x d numpy array)
        D: optional dictionary. If D is None, the dictionary is learned.
           Otherwise, only the sparse coefficients are learned. (default None)
    args:
        k: number of dictionary atoms (default 64)
        L: sparsity parameter (default 100)
        nsteps: maximum number of iterations (default 200)
        tol: tolerance for convergence (default 1e-3)
    returns:
        J: objective error
        D: dictionary (k x d numpy array)
        A: sparse coefficients (n x k numpy array)
    '''
    # default parameters
    if not len(args):
        args['nsteps'] = 200
        args['k'] = 64
        args['L'] = 100
        args['tol'] = 1e-3
    nsteps = args['nsteps']
    k = args['k']
    L = args['L']
    tol = args['tol']
    # dims
    n, d = X.shape
    # *** initialize variables here ***
    # *** start main loop ***
    for step in range(nsteps):
        # *** sparse coding using orthogonal matching pursuit here ***
        # print loss
        print('step: {} / {}, J: {}'.format(step, nsteps, J[step]))
        # *** dictionary update here ***
    return J, D, A
```
`ksvd` needs to be completed using only the following packages:
* `numpy`
* `scipy.linalg`
**No other packages should be used**
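As a hint at the structure of the dictionary-update step, here is one hedged sketch of a single K-SVD pass using the allowed packages. The function name `ksvd_dictionary_update` and its interface are illustrative assumptions; the assignment's `ksvd` wraps this kind of update inside the main loop together with OMP sparse coding.

```python
import numpy as np
from scipy.linalg import svd

def ksvd_dictionary_update(X, D, A):
    """Illustrative K-SVD dictionary update for the model X ~ A @ D.
    Updates each atom (row of D) and its nonzero coefficients in place
    via a rank-1 SVD of the residual restricted to the signals using it."""
    k, d = D.shape
    for j in range(k):
        users = np.nonzero(A[:, j])[0]          # signals that use atom j
        if users.size == 0:
            continue
        # residual of those signals with atom j's contribution removed
        E = X[users] - A[users] @ D + np.outer(A[users, j], D[j])
        # best rank-1 fit: E ~ outer(S[0] * U[:, 0], Vt[0])
        U, S, Vt = svd(E, full_matrices=False)
        D[j] = Vt[0]                            # new unit-norm atom
        A[users, j] = S[0] * U[:, 0]            # new coefficients
    return D, A
```

Because each atom update solves the restricted rank-1 problem exactly while leaving the sparsity pattern of `A` fixed, the reconstruction error is non-increasing over a pass.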
### Parameters
`dictlearn.py` also provides a sample main that loads cifar10 using `cifar10.py`, performs dictionary learning using the `omp` and `ksvd` subroutines, and displays the learned dictionary `D`.
**Note that values of parameters such as the convergence tolerance `tol`, the number of dictionary atoms `k`, and the number of optimization steps `nsteps` may need to be changed**
### Example
The following image shows an example dictionary learned from 8 x 8 image patches (assuming k=512 and L=25).
![Test Image 1](cifar_ica_basis_64.png)