Submit new jobs with snakemake when previous ones are not yet finished on SLURM cluster
I am running Snakemake on a SLURM cluster and I have the following problem: the cluster allows me to submit only a limited number of jobs (around 20) at a time. After running snakemake.sh, which is:
#!/bin/bash
INPUT_DIR=...
snakemake -j 190 --latency-wait 1000 --cluster-config cluster.json \
    --cluster "sbatch -A {cluster.A} -p {cluster.p} -t {cluster.time} \
    --output {cluster.output} --error {cluster.error} --nodes {cluster.nodes} \
    --ntasks {cluster.ntasks} --cpus-per-task {cluster.cpus} --mem {cluster.mem}"
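For reference, a cluster.json supplying those keys could look like the following (the account, partition, and resource values here are invented placeholders, not taken from the question; Snakemake substitutes them into the {cluster.X} fields of the sbatch command):

```json
{
    "__default__": {
        "A": "myaccount",
        "p": "mypartition",
        "time": "01:00:00",
        "output": "logs/{rule}.out",
        "error": "logs/{rule}.err",
        "nodes": 1,
        "ntasks": 1,
        "cpus": 4,
        "mem": "8G"
    }
}
```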
only 20 jobs run (not 190), so I end up waiting until all 20 finish and then rerunning the script. This is obviously not optimal. Let's say 15 jobs have completed but 5 are still running: is there a way to submit an additional 15 somehow?
hpc slurm snakemake
asked Mar 9 at 3:27
Nikita Vlasenko
1 Answer
A couple of thoughts:
Are you sure that additional jobs can be submitted before the previous ones have finished? It may be that downstream jobs require as input the file(s) produced by the previous 20 jobs; this is typically the case for a rule that merges files.
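The merge scenario can be sketched with a minimal, hypothetical Snakefile fragment (sample and rule names invented for illustration): because the merge rule lists every per-sample file as input, Snakemake cannot schedule it, or anything downstream of it, until all per-sample jobs have finished.

```
SAMPLES = ["s1", "s2", "s3"]

# Per-sample jobs are independent and can run in parallel...
rule per_sample:
    output:
        "results/{sample}.txt"
    shell:
        "touch {output}"

# ...but "merge" needs all of them, acting as a barrier in the DAG.
rule merge:
    input:
        expand("results/{sample}.txt", sample=SAMPLES)
    output:
        "results/merged.txt"
    shell:
        "cat {input} > {output}"
```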
You say "the cluster allows me to submit only a number (around 20) jobs at a time". Maybe check that the limit really comes from the cluster rather than from Snakemake. Try submitting a batch of dummy jobs and see whether SLURM accepts them all into the queue. Something like (not tested, just to give the idea):
for i in {1..30}
do
    sbatch --wrap "sleep 30 && touch test$i.tmp"
done
answered Mar 10 at 8:44
dariober