Project description

torchdiy

A hobby project inspired by torch to learn how to design a deep learning framework.

This project also includes torchdiy.transformers, a DIY counterpart to Hugging Face's transformers library.

Installation

To install the torchdiy package, run the following command:

$ pip install torchdiy

Running the Example

To run the example on the MNIST dataset, execute the following command:

$ python mnist.py

The mnist.py file:

from torchvision import datasets, transforms
import torchdiy as torch

nn = torch.nn
optim = torch.optim
DataLoader = torch.utils.data.DataLoader

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,))
])

train_dataset = datasets.MNIST(root='./data', train=True, download=True, transform=transform)
test_dataset = datasets.MNIST(root='./data', train=False, download=True, transform=transform)

train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
test_loader = DataLoader(test_dataset, batch_size=64, shuffle=False)

class MLP(nn.Module):
    def __init__(self):
        super(MLP, self).__init__()
        self.fc1 = nn.Linear(28 * 28, 512)
        self.fc2 = nn.Linear(512, 256)
        self.fc3 = nn.Linear(256, 10)
        self.relu = nn.ReLU()
        # self.dropout = nn.Dropout(0.5)

    def forward(self, x):
        x = x.view(-1, 28 * 28)
        x = self.relu(self.fc1(x))
        # x = self.dropout(x)
        x = self.relu(self.fc2(x))
        # x = self.dropout(x)
        x = self.fc3(x)
        return x

class CNN(nn.Module):
    def __init__(self):
        super(CNN, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3, stride=1, padding=1)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, stride=1, padding=1)
        self.fc1 = nn.Linear(64 * 7 * 7, 128)
        self.fc2 = nn.Linear(128, 10)
        self.relu = nn.ReLU()
        self.maxpool = nn.MaxPool2d(kernel_size=2, stride=2)
        # self.dropout = nn.Dropout(0.5)

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.maxpool(x)
        x = self.relu(self.conv2(x))
        x = self.maxpool(x)
        x = x.view(-1, 64 * 7 * 7)
        x = self.relu(self.fc1(x))
        # x = self.dropout(x)
        x = self.fc2(x)
        return x

# model = CNN()
model = MLP()

criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)

num_epochs = 3
for epoch in range(num_epochs):
    model.train()
    running_loss = 0.0
    for i, (images, labels) in enumerate(train_loader):
        optimizer.zero_grad()
        outputs = model(images)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()

        running_loss += loss.item()
        if (i + 1) % 100 == 0:
            print(f'Epoch [{epoch + 1}/{num_epochs}], Step [{i + 1}/{len(train_loader)}], Loss: {loss.item():.4f}')

    print(f'Epoch [{epoch + 1}/{num_epochs}], Loss: {running_loss / len(train_loader):.4f}')

model.eval()
correct = 0
total = 0
with torch.no_grad():
    for images, labels in test_loader:
        outputs = model(images)
        _, predicted = torch.max(outputs, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

print(f'Accuracy of the model on the 10000 test images: {100 * correct / total:.2f}%')

torch.save(model.state_dict(), 'mnist_mlp.pth')

model.load_state_dict(torch.load('mnist_mlp.pth'))
model.eval()

with torch.no_grad():
    sample_image, true_label = test_dataset[0] 
    sample_image = sample_image.unsqueeze(0)
    output = model(sample_image)
    _, predicted = torch.max(output, 1)
    print(f'Predicted: {predicted.item()}, True Label: {true_label}')
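A note on the CNN variant's nn.Linear(64 * 7 * 7, 128): the two MaxPool2d(kernel_size=2, stride=2) layers each halve the 28x28 input, while the 3x3 convolutions with padding=1 preserve spatial size, so the flattened feature count can be checked with quick arithmetic:

```python
# Input images are 1x28x28. Each MaxPool2d(kernel_size=2, stride=2) halves
# the spatial size; the padding=1 convs leave it unchanged.
size = 28
for _ in range(2):          # two conv + pool stages
    size //= 2              # 28 -> 14 -> 7
flat_features = 64 * size * size  # 64 output channels from conv2
print(flat_features)        # 3136, matching nn.Linear(64 * 7 * 7, 128)
```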

Output example:

$ python mnist.py
Epoch [1/3], Step [100/938], Loss: 0.1205
Epoch [1/3], Step [200/938], Loss: 0.1729
...
Accuracy of the model on the 10000 test images: 96.92%
Predicted: 7, True Label: 7
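As in torch, the nn.CrossEntropyLoss used in the training loop applies log-softmax to the raw logits and takes the negative log-probability of the true class. A minimal pure-Python sketch for a single sample, using made-up logits for illustration:

```python
import math

# Hypothetical raw logits for the 10 digit classes; true class is 7.
logits = [0.1, -0.5, 0.2, 0.0, 1.0, -1.2, 0.3, 2.5, 0.1, -0.4]
target = 7

# Cross-entropy for one sample: -log(softmax(logits)[target])
#   = log(sum_j exp(z_j)) - z_target
log_sum = math.log(sum(math.exp(z) for z in logits))
loss = log_sum - logits[target]
print(f'{loss:.4f}')  # loss is about 0.60 for these logits
```

nn.CrossEntropyLoss averages this quantity over the batch, which is why the script passes raw logits from fc3/fc2 to the criterion without a softmax layer.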

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torchdiy-0.4.3.tar.gz (15.3 kB)


File details

Details for the file torchdiy-0.4.3.tar.gz.

File metadata

  • Download URL: torchdiy-0.4.3.tar.gz
  • Upload date:
  • Size: 15.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.16

File hashes

Hashes for torchdiy-0.4.3.tar.gz:

  • SHA256: a6071303f2332372104e39edf27fed5f867bd95ec6a2cda093f97ec0d96393c7
  • MD5: 0d14d91346326d5a9ee095fc4b723e90
  • BLAKE2b-256: d5ab887a94edb75cb9c21a9b90b84f6e95fdad1a19a8180353c9bd8694ed2c68
