Now we will run the code, but this time we set the job parameters using ''qsub'' command-line options.
===== Options =====

Commonly used ''qsub'' options include:

^ Option ^ Description ^
| ''-N name'' | Assigns a name to the job |
| ''-l resource_list'' | Requests resources, e.g. ''-l nodes=1:ppn=2,mem=4gb,walltime=01:00:00'' |
| ''-o path'' | Redirects standard output to the given file |
| ''-e path'' | Redirects standard error to the given file |
| ''-j oe'' | Joins standard output and standard error into a single file |
| ''-q queue'' | Submits the job to the given queue |
| ''-m abe'' | Sends mail when the job aborts (a), begins (b), or ends (e) |
| ''-M email'' | Sets the e-mail address for notifications |
| ''-V'' | Exports all environment variables of the current session to the job |
| ''-I'' | Runs the job interactively |
| ''-t range'' | Submits a job array, e.g. ''-t 1-10'' |
| ''-d path'' | Sets the working directory of the job |

You can find detailed information in the ''qsub'' man page (''man qsub'').
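A submission script that combines several of these options might look like the following sketch; the job name, resource values, e-mail address, and executable are illustrative:

<code bash myjob.pbs>
#!/bin/bash
#PBS -N myjob
#PBS -l nodes=1:ppn=2,mem=4gb,walltime=01:00:00
#PBS -j oe
#PBS -m ae
#PBS -M your_email@example.edu

# Torque starts the job in your home directory; move to the submission directory
cd $PBS_O_WORKDIR
./run_experiment
</code>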
===== Monitoring and Removing Jobs =====
</code>
Moreover, you can use the following command to see detailed information about a specific job:

<code bash>
qstat -f <job_id>
</code>
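If you no longer need a job, you can remove it from the queue with ''qdel'' (the job id below is illustrative):

<code bash>
qdel 12345
</code>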
==== Queues ====
===== Examples =====
==== Submitting Jobs with Memory Limits ====
You can use the option ''-l'' to limit the amount of memory available to your job:

<code bash limited.sh>
qsub -l mem=4gb job.pbs
</code>

Sometimes your job needs more memory. You can choose a larger memory size with the same option, either on the command line or as a directive in the submission script:

<code bash large.pbs>
#PBS -l mem=16gb
</code>
==== Running MATLAB ====
You just have to create a submission job which looks like this:
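A minimal sketch of such a submission script, assuming your MATLAB code is in ''my_script.m'' (file and job names are illustrative):

<code bash matlab.pbs>
#!/bin/bash
#PBS -N matlab_job
#PBS -l nodes=1:ppn=1

cd $PBS_O_WORKDIR
# Run MATLAB without a display and exit when the script finishes
matlab -nodisplay -nosplash -singleCompThread -r "my_script; exit"
</code>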
<note tip>Use **-singleCompThread** when starting MATLAB to prevent it from spawning computational threads on all cores of the node.</note>
==== Running Solvers ====
In order to run solvers (such as Gurobi/CPLEX) on the compute nodes, you need to submit your job with the ''-V'' flag:

<code bash>
qsub -V job.pbs
</code>

This flag exports your current environment variables to the job, which enables the solver to find the necessary authentication (license) information.
==== Interactive Jobs ====
If you do not care where you run your job, just use ''qsub -I'' to get an interactive session on any available node.
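For example, to request an interactive session with two cores and a two-hour walltime (resource values are illustrative):

<code bash>
qsub -I -l nodes=1:ppn=2,walltime=02:00:00
</code>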
However, you first need permission to use the GPU (granted by Prof. Takac) -- this is just a formality that allows certain users to use the video driver on polyp30.
+ | |||
If you are using TensorFlow, you can set a limit on the amount of GPU memory a process takes using:
<code python>
config_tf = tf.ConfigProto()
config_tf.gpu_options.per_process_gpu_memory_fraction = p
</code>
in which ''p'' is the fraction (between 0 and 1) of the total GPU memory to allocate.
==== Running MPI and Parallel Jobs ====
==== TensorFlow with GPU ====
To use TensorFlow with a specific GPU, say GPU 1, you can simply set
<code bash>
export CUDA_VISIBLE_DEVICES=1
</code>
and then schedule your jobs with Torque to perform the experiments on GPU 1.
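Putting this together, a submission script for a GPU job might look like the following sketch; the script name, resource values, and ''train.py'' are illustrative:

<code bash gpu_job.pbs>
#!/bin/bash
#PBS -N tf_gpu_job
#PBS -l nodes=1:ppn=2

# Restrict TensorFlow to GPU 1 (illustrative device index)
export CUDA_VISIBLE_DEVICES=1

cd $PBS_O_WORKDIR
python train.py
</code>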