Commit 0c09f36

SudhiMohanOmkarPathak authored and committed
Added machine learning program (#15)
* created folder machine learning
* Added gradient descent program
* Delete file1
1 parent 377cf55 commit 0c09f36

File tree

3 files changed: +241 -0 lines changed


MachineLearning/gradient_descent.py

Lines changed: 102 additions & 0 deletions
@@ -0,0 +1,102 @@
####################################################################################
## PROBLEM1: Gradient Descent
## Gradient descent is a popular optimization technique to solve many
## machine learning problems. In this case, we will explore the gradient
## descent algorithm to fit a line for the given set of 2-D points.
## ref: https://tinyurl.com/yc4jbjzs
## ref: https://spin.atomicobject.com/2014/06/24/gradient-descent-linear-regression/
##
##
## input: 2-D points in ./data/1_points.csv
## function for reading points is provided
##
##
## your task: fill the following functions:
## evaluate_cost
## evaluate_gradient
## update_params
## NOTE: do NOT change values of 'init_params' and 'max_iterations' in optimizer
##
##
## output: cost after convergence (rmse, lower the better)
##
##
## NOTE: all required modules are imported. DO NOT import new modules.
## NOTE: references are given inline
## tested on Ubuntu14.04, 22Oct2017, Abhilash Srikantha
####################################################################################

import numpy as np
import matplotlib.pyplot as plt
import time

def load_data(fname):
    points = np.loadtxt(fname, delimiter=',')
    y_ = points[:, 1]
    # append '1' to account for the intercept
    x_ = np.ones([len(y_), 2])
    x_[:, 0] = points[:, 0]
    # display plot
    #plt.plot(x_[:,0], y_, 'ro')
    #plt.xlabel('x-axis')
    #plt.ylabel('y-axis')
    #plt.show()
    print('data loaded. x:{} y:{}'.format(x_.shape, y_.shape))
    return x_, y_

def evaluate_cost(x_, y_, params):
    # sum of squared residuals, scaled by a fixed constant
    tempcost = 0
    for i in range(len(y_)):
        tempcost += (y_[i] - ((params[0] * x_[i, 0]) + params[1])) ** 2
    return tempcost / float(10000)

def evaluate_gradient(x_, y_, params):
    # partial derivatives of the mean squared error w.r.t. slope (params[0]) and intercept (params[1])
    m_gradient = 0
    b_gradient = 0
    N = float(len(y_))
    for i in range(len(y_)):
        m_gradient += -(2/N) * (x_[i, 0] * (y_[i] - ((params[0] * x_[i, 0]) + params[1])))
        b_gradient += -(2/N) * (y_[i] - ((params[0] * x_[i, 0]) + params[1]))
    return [m_gradient, b_gradient]

def update_params(old_params, grad, alpha):
    # step against the gradient with learning rate alpha
    new_m = old_params[0] - (alpha * grad[0])
    new_b = old_params[1] - (alpha * grad[1])
    return [new_m, new_b]

# initialize the optimizer
optimizer = {'init_params': np.array([4.5, 2.0]),
             'max_iterations': 10000,
             'alpha': 0.69908,
             'eps': 0.0000001,
             'inf': 1e10}

# load data
x_, y_ = load_data("./data/1_points.csv")

# time stamp
start = time.time()

try:
    # gradient descent
    params = optimizer['init_params']
    old_cost = 1e10
    for iter_ in range(optimizer['max_iterations']):
        # evaluate cost and gradient
        cost = evaluate_cost(x_, y_, params)
        grad = evaluate_gradient(x_, y_, params)
        # display
        if iter_ % 10 == 0:
            print('iter: {} cost: {} params: {}'.format(iter_, cost, params))
        # check convergence
        if abs(old_cost - cost) < optimizer['eps']:
            break
        # update parameters
        params = update_params(params, grad, optimizer['alpha'])
        old_cost = cost
except Exception:
    cost = optimizer['inf']

# final output
print('time elapsed: {}'.format(time.time() - start))
print('cost at convergence: {} (lower the better)'.format(cost))
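
As a quick sanity check (not part of the committed file), the slope and intercept recovered by the loop above can be compared against a closed-form least-squares fit. A minimal sketch, assuming x_ and y_ are the arrays returned by load_data:

import numpy as np

# np.polyfit with degree 1 returns [slope, intercept], the same ordering used in 'params'
slope, intercept = np.polyfit(x_[:, 0], y_, 1)
print('closed-form fit: m={:.4f} b={:.4f}'.format(slope, intercept))

Once the gradient-descent loop has converged, the two fits should agree closely.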

MachineLearning/readme.txt

Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
The assignment consists of four problems based on basic machine learning and computer vision.
Numerous problems in these areas are well studied in statistics and applied mathematics.
Solutions are to be implemented in Python by filling out the required functions in each Python file.
A basic framework for data i/o and evaluation is already provided (see the header comments in each Python file).
Please note that all required libraries are already imported, so please DO NOT import anything new.

The four problems are briefly discussed below.

1. Gradient Descent: This is a popular optimization technique for minimizing differentiable cost functions.
Typically, learning problems involve minimizing a cost function by appropriately setting model parameters.
In this task, we are given a set of (noisy) points on a line and we wish to recover the model parameters (intercept and slope) through gradient descent.
Please refer to 'gradient_descent.py' and inline comments for further details.
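For reference, the gradient used in 'gradient_descent.py' above can also be written in vectorized NumPy form. A minimal sketch, assuming x holds the x-coordinates and y the corresponding targets:

import numpy as np

def gradient_step(m, b, x, y, alpha):
    # one gradient-descent step for fitting the line y = m*x + b under mean squared error
    residual = y - (m * x + b)             # per-point prediction error
    grad_m = -2.0 * np.mean(x * residual)  # d(MSE)/dm
    grad_b = -2.0 * np.mean(residual)      # d(MSE)/db
    return m - alpha * grad_m, b - alpha * grad_b
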
2. Eigenfaces: This is a popular application of learning a basis representation of input data.
The technique underlies simple face recognition and compression algorithms.
In this task, we want to learn an orthonormal basis, via PCA, from images that correspond to faces.
Please refer to 'eigenfaces.py' and inline comments for further details.
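As a rough illustration of the PCA step (eigenfaces.py itself is not included in this commit), a minimal sketch assuming the face images have already been flattened into the rows of a matrix:

import numpy as np

def eigenfaces(faces, k):
    # faces: (n_images, n_pixels) array of flattened face images; k: number of basis vectors to keep
    mean_face = faces.mean(axis=0)
    centered = faces - mean_face                             # PCA operates on mean-centered data
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)  # rows of Vt form an orthonormal basis
    return mean_face, Vt[:k]

Projecting a mean-subtracted face onto these k directions gives its compressed representation.
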
3. Classification: This is one of the basic tasks in machine learning.
Here, we will learn a classifier using ground-truth labels on the training data to distinguish between two object classes.
You will use the scikit-learn library to learn two classifiers (SVM and random forest).
Feel free to explore the parameters of both models to maximize classifier performance.
Please refer to 'classification.py' and inline comments for further details.
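The scikit-learn calls involved look roughly like the following. This is only a sketch and not the interface expected by classification.py (which is not included in this commit); X_train, y_train, X_test, y_test stand in for the provided data:

from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

def train_and_score(X_train, y_train, X_test, y_test):
    # fit both classifiers and report their accuracy on held-out data
    models = {'svm': SVC(kernel='rbf', C=1.0),                            # kernel and C are worth tuning
              'random forest': RandomForestClassifier(n_estimators=100)}  # likewise n_estimators, max_depth
    for name, clf in models.items():
        clf.fit(X_train, y_train)
        print('{} accuracy: {:.3f}'.format(name, clf.score(X_test, y_test)))
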
4. Disparity map: This is among the basic tasks of 3D computer vision.
Here, given two different perspectives of the same scene, we will reconstruct an approximation of the depth map.
This is called the disparity map (higher disparity corresponds to lower depth).
You will use the scikit library to implement the module. Feel free to explore the parameters 'downsample' and 'patchsize'.
Please refer to 'disparity.py' and inline comments for further details.
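Since disparity.py is not included in this commit, the following is only a plain-NumPy sketch of patch-based (block-matching) disparity estimation; the names patchsize and max_disp are illustrative and not necessarily the module's actual parameters:

import numpy as np

def disparity_map(left, right, patchsize=5, max_disp=32):
    # brute-force block matching on rectified, float-valued grayscale images:
    # for each pixel, find the horizontal shift that minimizes the
    # sum of squared differences (SSD) between patches
    h, w = left.shape
    half = patchsize // 2
    disp = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1]
            best_ssd, best_d = np.inf, 0
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y-half:y+half+1, x-d-half:x-d+half+1]
                ssd = np.sum((patch - cand) ** 2)
                if ssd < best_ssd:
                    best_ssd, best_d = ssd, d
            disp[y, x] = best_d
    return disp

Downsampling the images first (the 'downsample' parameter mentioned above) keeps this brute-force search tractable.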

MachineLearning/version_list.txt

Lines changed: 109 additions & 0 deletions
@@ -0,0 +1,109 @@
alabaster==0.7.10
angles==1.9.11
astroid==1.5.3
Babel==2.5.0
backports.weakref==1.0rc1
bleach==1.5.0
chardet==3.0.4
configparser==3.5.0
cycler==0.10.0
decorator==4.1.2
docutils==0.14
entrypoints==0.2.3
html5lib==0.9999999
imagesize==0.7.1
imutils==0.4.3
ipykernel==4.6.1
ipython==6.1.0
ipython-genutils==0.2.0
ipywidgets==6.0.0
isort==4.2.15
jedi==0.10.2
Jinja2==2.9.6
jsonschema==2.6.0
jupyter==1.0.0
jupyter-client==5.1.0
jupyter-console==5.2.0
jupyter-core==4.3.0
lazy-object-proxy==1.3.1
lxml==3.8.0
Mako==1.0.6
Markdown==2.6.9
MarkupSafe==1.0
matplotlib==2.0.2
mistune==0.7.4
mock==2.0.0
mpmath==0.19
nbconvert==5.2.1
nbformat==4.4.0
networkx==1.11
nose==1.3.7
notebook==5.0.0
numpy==1.13.1
numpydoc==0.7.0
olefile==0.44
opencv==1.0.1
pandas==0.20.3
pandocfilters==1.4.2
pbr==3.1.1
pexpect==4.2.1
pickleshare==0.7.4
Pillow==3.4.2
prompt-toolkit==1.0.15
protobuf==3.4.0
psutil==5.2.2
ptyprocess==0.5.2
pycodestyle==2.3.1
pyflakes==1.6.0
Pygments==2.2.0
pygpu==0.6.9
pylint==1.7.2
pyparsing==2.2.0
python-dateutil==2.6.1
python-qt-binding==0.2.19
pytz==2017.2
PyWavelets==0.5.2
pyzmq==16.0.2
qt-dotgraph==0.2.32
qt-gui==0.2.32
qt-gui-py-common==0.2.32
QtAwesome==0.4.4
qtconsole==4.3.1
QtPy==1.3.1
requests==2.14.2
rope-py3k==0.9.4.post1
rosboost-cfg==1.11.14
rosclean==1.11.14
roscreate==1.11.14
rosgraph==1.11.21
roslint==0.10.0
roslz4==1.11.21
rosmaster==1.11.21
rosparam==1.11.21
scikit-image==0.13.0
scikit-learn==0.19.0
scipy==0.19.1
simplegeneric==0.8.1
singledispatch==3.4.0.3
six==1.10.0
sklearn-theano==0.0.1
smach==2.0.1
smclib==1.7.19
snowballstemmer==1.2.1
Sphinx==1.6.3
sphinxcontrib-websupport==1.0.1
spyder==3.2.3
sympy==1.1.1
tensorflow==1.3.0
tensorflow-tensorboard==0.1.5
terminado==0.6
testpath==0.3
Theano==0.9.0
tornado==4.5.2
traitlets==4.3.2
wcwidth==0.1.7
webencodings==0.5
Werkzeug==0.12.2
widgetsnbextension==3.0.1
wrapt==1.10.11
xdot==2.0.1
