How can I run several independent jobs using one bash script?
I have several independent jobs that I want to run simultaneously on a compute node.
I want each job to use only one core of the requested node. How can I write a bash script to do this?
This is the bash script that I wrote, but it doesn't work:
#!/bin/bash
#SBATCH --job-name=20
#SBATCH --partition=the_partition
#SBATCH --nodes=1
#SBATCH --ntasks=20

for n in {1..20}; do
    cd "dictionary$n"
    ./ the job
done
How can I modify the script to run the 20 independent jobs simultaneously?
Tags: bash, slurm
– Cyrus (Nov 1 '18 at 10:45): Replace "./ the job" with "./ the job & cd .."?
– Walter A (Nov 1 '18 at 11:02): Start jobs in the background.
– Jon (Nov 1 '18 at 14:23): As Cyrus says, put an "&" at the end of the "./ the job" line. If you want to wait for all the jobs to finish, add a line "wait" at the end of the script.
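The fix the comments describe can be sketched as follows. This is a minimal, self-contained demonstration of the background-and-wait pattern: mkdir and echo stand in for the asker's existing directories and real job command, which are placeholders here, not the original program.

```shell
#!/bin/bash
# Launch 20 independent jobs in parallel, then wait for all of them.
for n in {1..20}; do
    mkdir -p "dictionary$n"   # stand-in for the pre-existing job directories
    # Run each job in a subshell so the 'cd' does not leak into the next
    # iteration, and background it with '&' so all 20 run at the same time.
    ( cd "dictionary$n" && echo "job $n done" > output.txt ) &
done
wait   # block until every background job has finished
echo "all 20 jobs finished"
```

In the real Slurm script, the subshell body would run the actual executable instead of echo; the subshell also makes Cyrus's "cd .." unnecessary, since each job gets its own working directory.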
asked Nov 1 '18 at 10:40 by meTchaikovsky; edited Nov 1 '18 at 11:02 by Stephan
1 Answer
The easiest solution is a job array:

#!/bin/bash
#SBATCH --job-name=20
#SBATCH --partition=the_partition
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --array=1-20

cd "dictionary$SLURM_ARRAY_TASK_ID"
./ the job

The above script creates a job array with 20 jobs. Each job cds into a directory chosen by the $SLURM_ARRAY_TASK_ID variable, which takes a different value between 1 and 20 for each job in the array.
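If the cluster limits how many jobs may run at once, Slurm's array syntax also accepts a throttle suffix after a "%" sign; for example (a standard sbatch option, shown here as an optional variation of the directive above):

```shell
#SBATCH --array=1-20%5   # run at most 5 of the 20 array tasks at a time
```

Each array task is otherwise identical to the script above; Slurm simply queues the remaining tasks until running ones finish.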
answered Nov 19 '18 at 12:21 by damienfrancois; edited Nov 22 '18 at 7:14