Downloading data in a temporary folder
The `/tmp` folder is available on most Unix-flavored systems (including Ubuntu and macOS) and is a good place to put temporary data like MNIST.

Good is subject to subjectivity :-)
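As a portability side note (a sketch, not part of the commit): Windows has no `/tmp`, so a script that must run everywhere can derive the temp location from the standard library instead of hard-coding it. Only the `data_path` name below comes from the tutorials; the rest is an illustrative assumption.

```python
import os
import tempfile

# '/tmp' exists on most Unix-flavored systems (Linux, macOS), but not on
# Windows. tempfile.gettempdir() resolves the platform's temp directory
# portably (it also honors TMPDIR/TEMP environment variables when set).
data_path = os.path.join(tempfile.gettempdir(), "data", "mnist")

# Create the folder up front; torchvision's datasets.MNIST(root=data_path,
# download=True) would then place the raw files under it.
os.makedirs(data_path, exist_ok=True)
print(data_path)
```

On a typical Linux box this prints a path under `/tmp`, matching the value hard-coded in the diff below.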
laurentperrinet committed Oct 10, 2023
1 parent 97f60f5 commit 6d1c48e
Showing 24 changed files with 25 additions and 25 deletions.
2 changes: 1 addition & 1 deletion docs/quickstart.rst
@@ -48,7 +48,7 @@ Define variables for dataloading.
::

batch_size = 128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

Load MNIST dataset.
2 changes: 1 addition & 1 deletion docs/tutorials/legacy/tutorial_1_old.rst
@@ -51,7 +51,7 @@ Let's define a few variables:

# Training Parameters
batch_size=128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
num_classes = 10

# Temporal Dynamics
2 changes: 1 addition & 1 deletion docs/tutorials/legacy/tutorial_3_old.rst
@@ -161,7 +161,7 @@ Much of the following code has already been explained in the first two tutorials

# Training Parameters
batch_size=128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'

# Temporal Dynamics
num_steps = 25
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_1.rst
@@ -95,7 +95,7 @@ Install the latest PyPi distribution of snnTorch:

# Training Parameters
batch_size=128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
num_classes = 10 # MNIST has 10 output classes
# Torch Variables
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_5.rst
@@ -388,7 +388,7 @@ training a fully-connected spiking neural net.

# dataloader arguments
batch_size = 128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
dtype = torch.float
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_6.rst
@@ -191,7 +191,7 @@ here. <https://snntorch.readthedocs.io/en/latest/snntorch.surrogate.html>`__

# dataloader arguments
batch_size = 128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
dtype = torch.float
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_ipu_1.rst
@@ -62,7 +62,7 @@ Load in the MNIST dataset.
from torchvision import datasets, transforms

batch_size = 128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
# Define a transform
transform = transforms.Compose([
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_pop.rst
@@ -62,7 +62,7 @@ Define variables for dataloading.
::

batch_size = 128
-data_path='/data/fmnist'
+data_path='/tmp/data/fmnist'
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

Load FashionMNIST dataset.
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_regression_2.rst
@@ -332,7 +332,7 @@ temporal data is an exercise left to the reader/coder.
::

batch_size = 128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
# Define a transform
transform = transforms.Compose([
2 changes: 1 addition & 1 deletion docs/tutorials/tutorial_sae.rst
@@ -177,7 +177,7 @@
# dataloader arguments
batch_size = 250
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
dtype = torch.float
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")
2 changes: 1 addition & 1 deletion docs/tutorials/zh-cn/tutorial_1_cn.rst
@@ -85,7 +85,7 @@ The snnTorch tutorial series is based on the following paper. If you find these resources or code

# Training Parameters
batch_size=128
-data_path='/data/mnist'
+data_path='/tmp/data/mnist'
num_classes = 10 # MNIST has 10 output classes
# Torch Variables
2 changes: 1 addition & 1 deletion examples/legacy/Alpha_neuron_training_example.ipynb
@@ -93,7 +93,7 @@
"\n",
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"# Temporal Dynamics\n",
"num_steps = 25\n",
2 changes: 1 addition & 1 deletion examples/legacy/CIFAR_temp.ipynb
@@ -145,7 +145,7 @@
"\n",
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"# Temporal Dynamics\n",
"num_steps = 25\n",
2 changes: 1 addition & 1 deletion examples/legacy/FCN_truncatedfromscratch.ipynb
@@ -107,7 +107,7 @@
"\n",
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"# Temporal Dynamics\n",
"num_steps = 25\n",
2 changes: 1 addition & 1 deletion examples/legacy/TBPTT.ipynb
@@ -121,7 +121,7 @@
"\n",
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"# Temporal Dynamics\n",
"num_steps = 25\n",
2 changes: 1 addition & 1 deletion examples/legacy/tutorial_3_FCN.ipynb
@@ -239,7 +239,7 @@
"\n",
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"# Temporal Dynamics\n",
"num_steps = 25\n",
2 changes: 1 addition & 1 deletion examples/legacy/tutorial_4_CNN.ipynb
@@ -99,7 +99,7 @@
"\n",
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"# Temporal Dynamics\n",
"num_steps = 25\n",
2 changes: 1 addition & 1 deletion examples/quickstart.ipynb
@@ -88,7 +88,7 @@
"outputs": [],
"source": [
"batch_size = 128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"mps\") if torch.backends.mps.is_available() else torch.device(\"cpu\")"
]
},
2 changes: 1 addition & 1 deletion examples/tutorial_1_spikegen.ipynb
@@ -151,7 +151,7 @@
"source": [
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"num_classes = 10 # MNIST has 10 output classes\n",
"\n",
"# Torch Variables\n",
2 changes: 1 addition & 1 deletion examples/tutorial_5_FCN.ipynb
@@ -398,7 +398,7 @@
"source": [
"# dataloader arguments\n",
"batch_size = 128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"dtype = torch.float\n",
"device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"mps\") if torch.backends.mps.is_available() else torch.device(\"cpu\")"
2 changes: 1 addition & 1 deletion examples/tutorial_6_CNN.ipynb
@@ -261,7 +261,7 @@
"source": [
"# dataloader arguments\n",
"batch_size = 128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"dtype = torch.float\n",
"device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"mps\") if torch.backends.mps.is_available() else torch.device(\"cpu\")"
4 changes: 2 additions & 2 deletions examples/tutorial_pop.ipynb
@@ -113,8 +113,8 @@
"outputs": [],
"source": [
"batch_size = 128\n",
-"data_path='/data/fmnist'\n",
-"device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"mps\") if torch.backends.mps.is_available() else torch.device(\"cpu\")"
+"data_path='/tmp/data/fmnist'\n",
+"device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device('mps') if torch.backends.mps.is_available() else torch.device(\"cpu\")"
]
},
{
2 changes: 1 addition & 1 deletion examples/tutorial_sae.ipynb
@@ -199,7 +199,7 @@
"source": [
"# dataloader arguments\n",
"batch_size = 250\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"\n",
"dtype = torch.float\n",
"device = torch.device(\"cuda\") if torch.cuda.is_available() else torch.device(\"mps\") if torch.backends.mps.is_available() else torch.device(\"cpu\")"
2 changes: 1 addition & 1 deletion examples/zh-cn/tutorial_1_spikegen_cn.ipynb
@@ -152,7 +152,7 @@
"source": [
"# Training Parameters\n",
"batch_size=128\n",
-"data_path='/data/mnist'\n",
+"data_path='/tmp/data/mnist'\n",
"num_classes = 10 # MNIST has 10 output classes\n",
"\n",
"# Torch Variables\n",
