Erik Strand / satori: Repository graph
Selected revision: fdf08d82e8f639fb8f09fb3756ebef217237b80c
Branches (3): master (default, protected), develop, pi
Commit graph (18-19 Mar), newest first:
Write an MPI pi calculation for GPUs  (develop, master)
Run on eight GPUs (two nodes)
Run on four GPUs (one node)
Fix indexing errors and clean up code
Separate kernels
Add MPI pi calculation (GPU)
Write an MPI pi calculation for CPUs
Determine the most CPUs that slurm will hand out
Evaluate more terms
Run 32 tasks per node
Increase NPTS, log CPU data
Fix off by one error
Add MPI pi calculation (CPU)
Add a CUDA pi calculation
Write an MPI hello world test
Run without cuda module loaded
Remove debug prints
Try removing CUDA runtime
Try 2 nodes
Up memory to 1G
Add debug prints
Try exactly as it says in the docs
Move output files
Try a different openmpi module
Add slurm batch script
Check SLURM_LOCALID environment variable
Start working on an MPI demo
Fix things
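
The history above includes an MPI pi calculation for CPUs ("Add MPI pi calculation (CPU)", "Fix off by one error", "Increase NPTS, log CPU data"). A minimal sketch of that kind of program, assuming a midpoint-rule integration of 4/(1+x^2) over [0,1] with the work split across ranks by stride; the NPTS value and integration scheme here are illustrative assumptions, not taken from the repository:

```c
/* Sketch of an MPI pi calculation for CPUs (assumed scheme, not the repo's code). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Integrate 4 / (1 + x^2) over [0, 1] with the midpoint rule;
     * the NPTS actually used in the repository is unknown. */
    const long NPTS = 100000000L;
    const double h = 1.0 / (double)NPTS;

    double local_sum = 0.0;
    /* Each rank handles a strided subset of the intervals. */
    for (long i = rank; i < NPTS; i += size) {
        double x = h * ((double)i + 0.5);
        local_sum += 4.0 / (1.0 + x * x);
    }
    local_sum *= h;

    /* Combine the partial sums on rank 0. */
    double pi = 0.0;
    MPI_Reduce(&local_sum, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        printf("pi ~= %.15f\n", pi);
    }

    MPI_Finalize();
    return 0;
}
```

Built with mpicc and launched under slurm (for example with the 32 tasks per node the commits mention), each rank computes its partial sum independently and only the final reduction communicates.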
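"Add a CUDA pi calculation" points at a single-GPU version of the same sum. A sketch under assumed launch parameters: each thread accumulates a strided slice of the series and the partial results are combined with a double-precision atomicAdd, which requires a GPU of compute capability 6.0 or newer; none of the constants here come from the repository.

```cuda
/* Sketch of a CUDA pi calculation (assumed kernel, not the repo's code). */
#include <cstdio>
#include <cuda_runtime.h>

/* Each thread integrates a strided subset of 4 / (1 + x^2) over [0, 1]
 * and adds its scaled partial sum into a single global total. */
__global__ void pi_kernel(long npts, double h, double *total) {
    long stride = (long)blockDim.x * gridDim.x;
    double sum = 0.0;
    for (long i = blockIdx.x * (long)blockDim.x + threadIdx.x; i < npts; i += stride) {
        double x = h * ((double)i + 0.5);
        sum += 4.0 / (1.0 + x * x);
    }
    atomicAdd(total, sum * h);   /* double atomicAdd needs sm_60 or newer */
}

int main() {
    const long NPTS = 1L << 30;          /* assumed problem size */
    const double h = 1.0 / (double)NPTS;

    double *d_total;
    cudaMalloc(&d_total, sizeof(double));
    cudaMemset(d_total, 0, sizeof(double));

    pi_kernel<<<256, 256>>>(NPTS, h, d_total);

    double pi = 0.0;
    cudaMemcpy(&pi, d_total, sizeof(double), cudaMemcpyDeviceToHost);
    printf("pi ~= %.15f\n", pi);

    cudaFree(d_total);
    return 0;
}
```

The "Separate kernels" commit suggests the repository eventually split the work into more than one kernel (for example a compute kernel plus a reduction), but that structure is not visible from the graph.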
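The later GPU commits ("Add MPI pi calculation (GPU)", "Check SLURM_LOCALID environment variable", "Run on eight GPUs (two nodes)") combine MPI with CUDA, which means every rank has to pick a GPU on its own node. A hedged sketch of one common mapping, preferring SLURM_LOCALID when slurm sets it; the repository's actual rank-to-GPU logic is not visible in the graph.

```cuda
/* Sketch of per-rank GPU selection for an MPI + CUDA run (assumed mapping). */
#include <mpi.h>
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int ndev = 0;
    cudaGetDeviceCount(&ndev);

    /* SLURM_LOCALID is the node-local task index under slurm;
     * fall back to the global rank if it is not set. */
    int local_id = rank;
    const char *env = getenv("SLURM_LOCALID");
    if (env != NULL) {
        local_id = atoi(env);
    }

    if (ndev > 0) {
        cudaSetDevice(local_id % ndev);
        printf("rank %d using GPU %d of %d\n", rank, local_id % ndev, ndev);
    } else {
        printf("rank %d found no GPUs\n", rank);
    }

    MPI_Finalize();
    return 0;
}
```

The "four GPUs (one node)" and "eight GPUs (two nodes)" runs suggest four GPUs per node, i.e. four and eight ranks respectively under a mapping of this kind.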