Mansar Youness CVxTz

@thomwolf / gradient_accumulation.py
Last active November 23, 2024 20:53
PyTorch gradient accumulation training loop
model.zero_grad()                                   # Reset gradient tensors
for i, (inputs, labels) in enumerate(training_set):
    predictions = model(inputs)                     # Forward pass
    loss = loss_function(predictions, labels)       # Compute loss
    loss = loss / accumulation_steps                # Normalize loss (if it is averaged over the batch)
    loss.backward()                                 # Backward pass; gradients accumulate in .grad
    if (i + 1) % accumulation_steps == 0:           # Wait for several backward passes
        optimizer.step()                            # Optimizer step on the accumulated gradients
        model.zero_grad()                           # Reset gradient tensors
        if (i + 1) % evaluation_steps == 0:         # Evaluate the model periodically
            evaluate_model()                        # ... evaluation code goes here
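Why dividing each loss by `accumulation_steps` is the right normalization: summing the per-chunk gradients of a mean loss, each scaled by `1 / accumulation_steps`, reproduces the gradient of the mean loss over the full batch. A minimal self-contained sketch with NumPy and a linear least-squares loss (synthetic data and the `grad` helper are illustrative assumptions, not part of the gist):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))   # full batch of 8 samples, 3 features
y = rng.normal(size=8)
w = rng.normal(size=3)

def grad(Xb, yb, w):
    # Gradient of the mean squared error 0.5 * mean((Xb @ w - yb)**2) w.r.t. w
    return Xb.T @ (Xb @ w - yb) / len(yb)

accumulation_steps = 4
accumulated = np.zeros_like(w)
# Each mini-batch gradient is divided by accumulation_steps, mirroring
# loss = loss / accumulation_steps in the training loop above.
for Xb, yb in zip(np.split(X, accumulation_steps), np.split(y, accumulation_steps)):
    accumulated += grad(Xb, yb, w) / accumulation_steps

full = grad(X, y, w)          # gradient over the full batch in one pass
print(np.allclose(accumulated, full))  # True
```

This is why gradient accumulation lets you emulate a large effective batch size on limited memory: the optimizer step sees (numerically) the same gradient it would have seen from the big batch.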
@AlexanderFabisch / load_sarcos.py
Last active September 1, 2020 09:45
Load SARCOS data in Python
# Get the dataset here: http://www.gaussianprocess.org/gpml/data/
import scipy.io
# Load training set
train = scipy.io.loadmat("sarcos_inv.mat")
# Inputs (7 joint positions, 7 joint velocities, 7 joint accelerations)
Xtrain = train["sarcos_inv"][:, :21]
# Outputs (7 joint torques)
Ytrain = train["sarcos_inv"][:, 21:]
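`scipy.io.loadmat` returns a dict keyed by the MATLAB variable name, so the slicing above splits one `(n_samples, 28)` array into 21 input columns and 7 output columns. A self-contained sketch of that round trip, using a small synthetic array in place of the real SARCOS file (the file path and data here are stand-ins, not the actual dataset):

```python
import os
import tempfile

import numpy as np
import scipy.io

# Synthetic stand-in: 2 samples x 28 columns, same layout as sarcos_inv
data = np.arange(2 * 28, dtype=float).reshape(2, 28)
path = os.path.join(tempfile.mkdtemp(), "sarcos_inv.mat")
scipy.io.savemat(path, {"sarcos_inv": data})

train = scipy.io.loadmat(path)            # dict keyed by MATLAB variable name
Xtrain = train["sarcos_inv"][:, :21]      # 21 inputs: positions, velocities, accelerations
Ytrain = train["sarcos_inv"][:, 21:]      # 7 outputs: joint torques
print(Xtrain.shape, Ytrain.shape)         # (2, 21) (2, 7)
```

The held-out set from the same page follows the identical pattern, just with a different file and variable name.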