Commit

nn article: bottom section and bp update
mlu-explain committed May 2, 2023
1 parent 9f0d604 commit 1d7123b
Showing 17 changed files with 409 additions and 33 deletions.
1 change: 1 addition & 0 deletions code/nn/public/assets/styles/global.css
@@ -59,6 +59,7 @@
--size-default: 16px;

--viz-height: 80vh;
--max-viz-height: 750px;
}

body {
9 changes: 5 additions & 4 deletions code/nn/src/App.svelte
@@ -18,6 +18,7 @@
import Resources from "./Components/new/Resources.svelte";
import BackProp from "./Components/new/BackProp.svelte";
import VizNet from "./Components/new/VizNet.svelte";
import OtherArchitectures from "./Components/new/OtherArchitectures.svelte";
function handleResize() {
$mobile = window.innerWidth <= 950;
@@ -41,13 +42,13 @@

<svelte:window on:resize={handleResize} />

<!-- <Logo />
<Logo />
<Title />
<Intro />
<NetworkScroll />
<ActivationFunctions /> -->
<ActivationFunctions />
<BackPropagation />
<BackProp />
<!-- <VizNet /> -->
<!-- <CommonArchitectures /> -->
<VizNet />
<OtherArchitectures />
<Resources />
25 changes: 13 additions & 12 deletions code/nn/src/Components/CommonArchitectures.svelte
@@ -6,21 +6,22 @@

<section>
<hr />
<h1>Activation Functions</h1>
<h1>Going Forward: Other Neural Network Architectures</h1>
<p class="body-text">
Linear Regression is a simple and powerful model for predicting a numeric
response from a set of one or more independent variables. This article will
focus mostly on how the method is used in machine learning, so we won't
cover common use cases like causal inference or experimental design. And
although it may seem like linear regression is overlooked in modern machine
learning's ever-increasing world of complex neural network architectures,
the algorithm is still widely used across a large number of domains because
it is effective, easy to interpret, and easy to extend. The key ideas in
linear regression are recycled everywhere, so understanding the algorithm is
a must-have for a strong foundation in machine learning.
Up to this point, we've described a specific neural network architecture
where values flow forward through the network in a single direction, and
gradients flow backwards through it during training. These are often
referred to as <span class="bold">feedforward neural networks</span>, or
<span class="bold">artificial neural networks</span> (the word 'artificial' comes
from the network's composition of artificial neurons). However, this is just
the tip of the iceberg when it comes to the field of neural networks. While feedforward
neural networks have been incredibly successful in a wide range of applications,
many other types of neural network architectures exist that are suited to
different kinds of data and tasks. In this section, we will briefly explore
some of the other commonly used network architectures and why they are
necessary for solving different types of problems.
</p>
</section>
<Table />
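The forward flow described in the paragraph above can be sketched in a few lines of plain Python (a minimal illustration, not code from this commit; the layer sizes, weights, and inputs are all made up):

```python
import math

def feedforward(x, layers):
    """Forward pass: each layer is a (weights, biases) pair; values flow
    strictly from one layer to the next, with no cycles."""
    a = x
    for W, b in layers:
        # weighted sum of the previous layer's activations plus a bias,
        # passed through a tanh activation
        a = [math.tanh(sum(w_ij * a_j for w_ij, a_j in zip(row, a)) + b_i)
             for row, b_i in zip(W, b)]
    return a

# a tiny 2-input -> 2-hidden -> 1-output network with invented weights
layers = [
    ([[0.5, -0.2], [0.3, 0.8]], [0.1, -0.1]),  # hidden layer
    ([[1.0, -1.0]], [0.0]),                    # output layer
]
print(feedforward([1.0, 2.0], layers))
```

Values only ever move forward through the list of layers; gradients would retrace the same path in reverse during backpropagation.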

<style>
h1 {
2 changes: 1 addition & 1 deletion code/nn/src/Components/DynamicNetworkChart.svelte
@@ -355,8 +355,8 @@
}
#network-chart {
width: 100%;
max-height: 100%;
height: 100%;
max-height: var(--max-viz-height);
background: conic-gradient(
from 90deg at 1px 1px,
#0000 90deg,
1 change: 1 addition & 0 deletions code/nn/src/Components/NetworkScroll.svelte
@@ -391,6 +391,7 @@
.chart-holder {
width: 100%;
height: 100%;
max-height: var(--max-viz-height);
}
.step {
47 changes: 47 additions & 0 deletions code/nn/src/Components/new/Architectures/CNN.svelte
@@ -0,0 +1,47 @@
<section>
<h2>Convolutional Neural Networks (CNNs)</h2>
<div class="wrapper">
<div class="left">
<svg><!-- Insert SVG here --></svg>
</div>
<div class="right">
<p class="body-text">
Convolutional Neural Networks (CNNs) are a type of neural network
architecture that has been particularly successful in computer vision
tasks. The key difference between CNNs and traditional feedforward
neural networks is the presence of convolutional layers, which are
specialized layers designed to process images and other types of
multidimensional data. In a CNN, the convolutional layers learn to
detect and extract features from the input data by sliding a set of
filters over the input data and performing convolution operations.
</p>
</div>
</div>
</section>

<style>
h2 {
font-size: var(--size-default);
text-decoration: underline;
}
.wrapper {
display: grid;
grid-template-columns: 30% 70%;
grid-gap: 1rem;
align-items: center;
max-width: 100%;
margin: auto;
}
.left {
display: flex;
justify-content: center;
}
.right {
/* border: 2px solid black; */
}
svg {
max-width: 100%;
max-height: 100%;
border: 2px solid red;
}
</style>
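The convolution operation described in the paragraph above can be sketched in plain Python (an illustration only, not code from this commit; the image and kernel values are invented):

```python
def conv2d(image, kernel):
    """Slide a kernel over a 2-D input and take the weighted sum at each
    position (valid padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# a vertical-edge filter applied to a tiny 4x4 "image"
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1],
          [1, -1]]
print(conv2d(image, kernel))
```

The same small filter is reused at every position, which is what lets a CNN detect a feature (here, a vertical edge) wherever it appears in the input.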
47 changes: 47 additions & 0 deletions code/nn/src/Components/new/Architectures/GAN.svelte
@@ -0,0 +1,47 @@
<section>
<h2>Generative Adversarial Networks (GANs)</h2>
<div class="wrapper">
<div class="left">
<svg><!-- Insert SVG here --></svg>
</div>
<div class="right">
<p class="body-text">
Generative Adversarial Networks (GANs) are a type of neural network
architecture that is used for generative modeling. GANs consist of two
networks: a generator network and a discriminator network. The generator
network learns to generate samples that resemble the training data,
while the discriminator network learns to distinguish between real and
fake samples. The two networks are trained simultaneously, with the
generator network trying to produce samples that fool the discriminator
network, and the discriminator network trying to accurately distinguish
real samples from generated ones.
</p>
</div>
</div>
</section>

<style>
h2 {
font-size: var(--size-default);
text-decoration: underline;
}
.wrapper {
display: grid;
grid-template-columns: 30% 70%;
grid-gap: 1rem;
align-items: center;
max-width: 100%;
margin: auto;
}
.left {
display: flex;
justify-content: center;
}
.right {
/* border: 2px solid black; */
}
svg {
max-width: 100%;
max-height: 100%;
border: 2px solid red;
}
</style>
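The adversarial objective described above can be made concrete with the standard GAN cross-entropy losses (a hedged sketch, not code from this commit; the discriminator output probabilities are invented):

```python
import math

def discriminator_loss(d_real, d_fake):
    """Loss the discriminator minimizes: be confident that real samples
    are real (D(x) near 1) and generated samples are fake (D(G(z)) near 0)."""
    return -(math.log(d_real) + math.log(1.0 - d_fake))

def generator_loss(d_fake):
    """Loss the generator minimizes: it wins when the discriminator
    scores its output as real (D(G(z)) near 1)."""
    return -math.log(d_fake)

# an undecided discriminator that outputs 0.5 for everything...
print(discriminator_loss(0.5, 0.5))  # 2 * ln(2)
print(generator_loss(0.5))           # ln(2)
# ...versus one that separates real from fake with confidence
print(discriminator_loss(0.9, 0.1))
```

The two losses pull in opposite directions through D(G(z)), which is the "adversarial" part: each network's improvement makes the other's task harder.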
46 changes: 46 additions & 0 deletions code/nn/src/Components/new/Architectures/RNN.svelte
@@ -0,0 +1,46 @@
<section>
<h2>Recurrent Neural Networks (RNNs)</h2>
<div class="wrapper">
<div class="left">
<svg><!-- Insert SVG here --></svg>
</div>
<div class="right">
<p class="body-text">
Recurrent Neural Networks (RNNs) are a type of neural network
architecture that is particularly well-suited for processing sequential
data, such as speech or text. The key difference between RNNs and
traditional feedforward neural networks is the presence of recurrent
connections, which allow information to persist over time within the
network. In an RNN, each neuron receives an input as well as a hidden
state from the previous time step, which allows the network to use
information from earlier in the sequence.
</p>
</div>
</div>
</section>

<style>
h2 {
font-size: var(--size-default);
text-decoration: underline;
}
.wrapper {
display: grid;
grid-template-columns: 30% 70%;
grid-gap: 1rem;
align-items: center;
max-width: 100%;
margin: auto;
}
.left {
display: flex;
justify-content: center;
}
.right {
/* border: 2px solid black; */
}
svg {
max-width: 100%;
max-height: 100%;
border: 2px solid red;
}
</style>
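The recurrent step described above can be sketched in plain Python (illustrative only, not code from this commit; the weights and input sequence are made up):

```python
import math

def rnn_step(x, h_prev, w_x, w_h, b):
    """One recurrent step: the new hidden state mixes the current input
    with the hidden state carried over from the previous time step."""
    return math.tanh(w_x * x + w_h * h_prev + b)

# process a sequence one element at a time, threading the hidden state
h = 0.0
for x in [1.0, 0.5, -0.5]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.0)
    print(round(h, 3))
```

Because `h` is fed back in at every step, each output depends not just on the current input but on everything the network has seen so far.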
46 changes: 46 additions & 0 deletions code/nn/src/Components/new/Architectures/Transformer.svelte
@@ -0,0 +1,46 @@
<section>
<h2>Transformers (Attention Mechanisms)</h2>
<div class="wrapper">
<div class="left">
<svg><!-- Insert SVG here --></svg>
</div>
<div class="right">
<p class="body-text">
Transformers are a type of neural network architecture built around
attention mechanisms, and they are particularly well-suited for natural
language processing tasks. The key difference between Transformers and
traditional feedforward neural networks is the use of attention, which
allows the network to focus on specific parts of the input data. Attention
works by assigning a weight to each element of the input data, allowing
the network to focus more on the most relevant elements.
</p>
</div>
</div>
</section>

<style>
h2 {
font-size: var(--size-default);
text-decoration: underline;
}
.wrapper {
display: grid;
grid-template-columns: 30% 70%;
grid-gap: 1rem;
align-items: center;
max-width: 100%;
margin: auto;
}
.left {
display: flex;
justify-content: center;
}
.right {
/* border: 2px solid black; */
}
svg {
max-width: 100%;
max-height: 100%;
border: 2px solid red;
}
</style>
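The weighting scheme described above can be sketched as a softmax over relevance scores (an illustration under simplified assumptions, not code from this commit; the scores and values are invented):

```python
import math

def attention_weights(scores):
    """Softmax turns raw relevance scores into positive weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(values, scores):
    """Weighted sum of values: high-scoring elements dominate the output."""
    weights = attention_weights(scores)
    return sum(w * v for w, v in zip(weights, values))

# the second element gets the highest score, so it contributes the most
values = [10.0, 20.0, 30.0]
scores = [0.1, 2.0, 0.5]
print(attend(values, scores))
```

In a real Transformer the scores themselves are computed from learned query and key projections, but the focusing behavior is exactly this weighted sum.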
1 change: 1 addition & 0 deletions code/nn/src/Components/new/BackProp.svelte
@@ -376,6 +376,7 @@
.chart-holder-backprop {
width: 100%;
height: 100%;
max-height: 750px;
}
.step-bp {
6 changes: 4 additions & 2 deletions code/nn/src/Components/new/BackPropOutput.svelte
@@ -11,6 +11,8 @@
export let width;
// init to false so don't show drawing during rendering
const rectDim = 160;
$: xScale = scaleLinear()
.domain([-1, $numLayers])
.range([$marginScroll.left, width - $marginScroll.right]);
@@ -19,9 +21,9 @@
.range([height - $marginScroll.bottom, $marginScroll.top]);
// responsive dimensions for scatter plot
$: scatterWidth = xScale(1) - xScale(0);
$: scatterWidth = rectDim ? rectDim : xScale(1) - xScale(0);
$: yVals = positionElements(3, maxNumNeurons);
$: scatterHeight = yScale(yVals[0]) - yScale(yVals[2]);
$: scatterHeight = rectDim ? rectDim : yScale(yVals[0]) - yScale(yVals[1]);
</script>

<!-- scatterplot -->
93 changes: 93 additions & 0 deletions code/nn/src/Components/new/Difficulty.svelte
@@ -0,0 +1,93 @@
<script>
let open = false;
let selected = "Intro";
function handleOptionClick(option) {
selected = option;
console.log(`Selected option: ${option}`);
}
function toggle() {
open = !open;
}
</script>

<template>
<div class="container">
<div class="header" on:click={() => toggle()}>
<span>Select Text Difficulty:</span>
<span class="selected">{selected}</span>
<span class="arrow">{open ? "▲" : "▼"}</span>
</div>
{#if open}
<div class="options">
<button
on:click={() => handleOptionClick("Intro")}
class:selected={selected === "Intro"}>Intro</button
>
<button
on:click={() => handleOptionClick("Experienced")}
class:selected={selected === "Experienced"}>Experienced</button
>
<button
on:click={() => handleOptionClick("Expert")}
class:selected={selected === "Expert"}>Expert</button
>
</div>
{/if}
</div>
</template>

<style>
.container {
position: fixed;
top: 0;
left: 0;
background-color: pink;
height: 50px;
width: 300px;
padding: 10px;
box-sizing: border-box;
display: flex;
flex-direction: column;
}
.header {
display: flex;
flex-direction: row;
justify-content: space-between;
align-items: center;
cursor: pointer;
}
.options {
display: flex;
flex-direction: row;
justify-content: center;
align-items: center;
}
.arrow {
font-size: 0.8em;
}
.selected {
font-weight: bold;
}
button {
margin-right: 10px;
background-color: transparent;
border: none;
outline: none;
font-size: 1em;
font-weight: bold;
text-transform: uppercase;
color: white;
cursor: pointer;
}
button:hover {
background-color: rgba(255, 255, 255, 0.2);
}
</style>