Showing 1 changed file with 1 addition and 0 deletions.
{"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"Beating University of Oxford-IIIT Pet Model.ipynb","provenance":[],"collapsed_sections":["-zPn180gQ0sz","wmuF2z7vV15M"]},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"code","metadata":{"id":"SYOWmy55OMKh","colab_type":"code","colab":{}},"source":[""],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"J7eUhwn-Ppx4","colab_type":"text"},"source":[""]},{"cell_type":"markdown","metadata":{"id":"-zPn180gQ0sz","colab_type":"text"},"source":["# University of Oxford's IIIT Pet Model\n","The Oxford-IIIT Pet dataset was created for the fine-grained categorisation problem of identifying the family and breed of pets (both;cats and dogs). Three different tasks and corresponding baseline algorithms were proposed and investigated, obtaining a mere 42% probability.\n","[Comparison of 3 models of Oxford's IIT pet ](https://http://www.robots.ox.ac.uk/~vgg/publications/2012/parkhi12a/parkhi12a.pdf)"]},{"cell_type":"markdown","metadata":{"id":"sqoPsSSVPo_Y","colab_type":"text"},"source":[""]},{"cell_type":"markdown","metadata":{"id":"PLpJAkVTTeze","colab_type":"text"},"source":["# Using CNN to build an Image- Classifier Model"]},{"cell_type":"code","metadata":{"id":"T7R6ghvhTyMp","colab_type":"code","colab":{}},"source":["# Going to start with the following three lines; ensuring that any edits to libraries one make get reloaded here automatically, and also that any charts or images displayed are shown in this notebook.\n","\n","%reload_ext autoreload\n","%autoreload 2\n","%matplotlib inline\n","\n","# importing the necessary packages\n","\n","from fastai.vision import *\n","from fastai.metrics import error_rate"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"mNYGRcsPUc93","colab_type":"code","colab":{}},"source":["bs = 64\n","# bs = 16 # uncomment this line if you run out of memory even after clicking Kernel->Restart"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"jCkGDhQAUhsA","colab_type":"code","colab":{}},"source":["# going to use the untar_data function to which we must pass a URL as an argument and which will download and extract the data.\n","\n","help(untar_data)"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"7zEiasq1Uuie","colab_type":"code","colab":{}},"source":["path = untar_data(URLs.PETS); path"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"EfJ7wG42Uvl5","colab_type":"code","colab":{}},"source":["path.ls()"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"5OyK8SbGUxvC","colab_type":"code","colab":{}},"source":["fnames = get_image_files(path_img)\n","fnames[:5]"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"Or1eM3YyU2_Y","colab_type":"code","colab":{}},"source":["np.random.seed(2)\n","pat = r'/([^/]+)_\\d+.jpg$'"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"whyFYrkpU6_Y","colab_type":"code","colab":{}},"source":["data = ImageDataBunch.from_name_re(path_img, fnames, pat, ds_tfms=get_transforms(), size=224, bs=bs\n"," ).normalize(imagenet_stats)"],"execution_count":0,"outputs":[]},{"cell_type":"code","metadata":{"id":"UN_DUI1vU-Pe","colab_type":"code","colab":{}},"source":["data.show_batch(rows=3, 
```python
data.show_batch(rows=3, figsize=(7,6))
```

```python
print(data.classes)
len(data.classes),data.c
```

## Using resnet34

```python
learn = cnn_learner(data, models.resnet34, metrics=error_rate)
```

```python
learn.model
```

```python
learn.fit_one_cycle(4)
```

```python
learn.save('stage-1')
```

### Interpretation

```python
interp = ClassificationInterpretation.from_learner(learn)

losses,idxs = interp.top_losses()

len(data.valid_ds)==len(losses)==len(idxs)
```

```python
interp.plot_top_losses(9, figsize=(15,11))
```

```python
interp.plot_confusion_matrix(figsize=(12,12), dpi=60)
```

```python
interp.most_confused(min_val=2)
```

### Unfreezing and fine-tuning

```python
learn.unfreeze()
```

```python
learn.fit_one_cycle(1)
```

```python
learn.load('stage-1');
```

```python
learn.lr_find()
```

```python
learn.recorder.plot()
```

```python
learn.unfreeze()
learn.fit_one_cycle(2, max_lr=slice(1e-6,1e-4))
```
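As a follow-up not present in the original notebook, here is a minimal sketch of how the fine-tuned learner could classify a single image, assuming fastai v1's `open_image` and `Learner.predict`; the image filename is hypothetical:

```python
# Open one image from the dataset (hypothetical filename) and classify it.
img = open_image(path_img/'Abyssinian_12.jpg')

# predict returns the predicted category, its index, and the per-class probabilities.
pred_class, pred_idx, probs = learn.predict(img)
print(pred_class, probs[pred_idx])
```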
# CNN: A Fine Development

*A regularized version of the multilayer perceptron*, the convolutional neural network (CNN) is a class of deep neural networks most commonly applied to analyzing visual imagery. Also known as **shift-invariant or space-invariant artificial neural networks (SIANN)** because of their shared-weights architecture and translation-invariance characteristics, convolutional networks were inspired by biological processes, in that the connectivity pattern between neurons resembles the organization of the animal visual cortex.

**CNNs use relatively little pre-processing compared to other image classification algorithms: the network learns the filters that in traditional algorithms were hand-engineered. This independence from prior knowledge and human effort in feature design is a major advantage.**
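To make the shared-weights and translation-invariance points concrete, here is a small illustrative sketch (not from the notebook) in plain PyTorch, on which fastai is built. It shows that a convolutional layer's parameter count depends only on its filters, not on the input size, and that shifting the input shifts the output feature map by the same amount:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A single conv layer: 16 filters of size 3x3 over 3 input channels.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1, bias=True)

# Shared weights: the parameter count (3*3*3*16 weights + 16 biases = 448) is fixed,
# whether the input image is 224x224 or 448x448.
print(sum(p.numel() for p in conv.parameters()))    # 448
print(conv(torch.randn(1, 3, 224, 224)).shape)      # torch.Size([1, 16, 224, 224])
print(conv(torch.randn(1, 3, 448, 448)).shape)      # torch.Size([1, 16, 448, 448])

# Translation equivariance: shifting the input shifts the feature map by the same amount.
x = torch.randn(1, 3, 64, 64)
shifted = torch.roll(x, shifts=8, dims=3)           # shift the image 8 pixels to the right
y, y_shifted = conv(x), conv(shifted)
# Away from the borders, the shifted output matches the output of the shifted input.
print(torch.allclose(torch.roll(y, shifts=8, dims=3)[..., 16:-16],
                     y_shifted[..., 16:-16], atol=1e-5))
```

Because the same 3x3 filters are reused at every spatial position, the layer recognises a pattern regardless of where it appears, instead of learning a separate weight for every pixel.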