Cluster 9
Nodes Summary
Total Number of CPUs: 768
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
down | 33 | 528 | 0 | 0.00 |
down,offline | 15 | 240 | 0 | 0.00 |
Free CPUs (nodewise)
There is no free CPU available now.
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
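Concretely, the two legend formulas work out as follows. This is a minimal sketch with hypothetical job records (not data from this report); CPU time and walltime are assumed to be in the same units:

```python
# Minimal sketch of the two efficiency metrics defined above.
# The job records are hypothetical; "cpu_time" and "walltime" share units (hours).
jobs = [
    {"cpu_time": 80.0, "walltime": 100.0, "ncpus": 1},  # serial job, 80% efficient
    {"cpu_time": 180.0, "walltime": 50.0, "ncpus": 4},  # parallel job, 90% efficient
]

# Avg. Efficiency per CPU = sum(CPU time / Walltime) / sum(No. of CPUs assigned)
avg_eff_per_cpu = sum(j["cpu_time"] / j["walltime"] for j in jobs) \
    / sum(j["ncpus"] for j in jobs)

# Overall Efficiency = sum(CPU time) / sum(Walltime x No. of CPUs assigned)
overall_eff = sum(j["cpu_time"] for j in jobs) \
    / sum(j["walltime"] * j["ncpus"] for j in jobs)

print(f"{avg_eff_per_cpu:.2%}", f"{overall_eff:.2%}")
```

When every running job has the same walltime the two metrics coincide; otherwise they generally differ, which is why the tables report both.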
Job State | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|
No jobs are running now.
Cluster 9
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|
compute-0-0 | 16 | down | 0 | 0 | ||
compute-0-1 | 16 | down,offline | 0 | 0 | ||
compute-0-2 | 16 | down | 0 | 0 | ||
compute-0-3 | 16 | down,offline | 0 | 0 | ||
compute-0-4 | 16 | down | 0 | 0 | ||
compute-0-5 | 16 | down,offline | 0 | 0 | ||
compute-0-6 | 16 | down | 0 | 0 | ||
compute-0-7 | 16 | down | 0 | 0 | ||
compute-0-8 | 16 | down | 0 | 0 | ||
compute-0-9 | 16 | down | 0 | 0 | ||
compute-0-10 | 16 | down | 0 | 0 | ||
compute-0-11 | 16 | down | 0 | 0 | ||
compute-0-12 | 16 | down | 0 | 0 | ||
compute-0-13 | 16 | down | 0 | 0 | ||
compute-0-14 | 16 | down | 0 | 0 | ||
compute-0-15 | 16 | down,offline | 0 | 0 | ||
compute-0-16 | 16 | down | 0 | 0 | ||
compute-0-17 | 16 | down | 0 | 0 | ||
compute-0-18 | 16 | down | 0 | 0 | ||
compute-0-19 | 16 | down | 0 | 0 | ||
compute-0-20 | 16 | down | 0 | 0 | ||
compute-0-21 | 16 | down,offline | 0 | 0 | ||
compute-0-22 | 16 | down,offline | 0 | 0 | ||
compute-0-23 | 16 | down,offline | 0 | 0 | ||
compute-0-24 | 16 | down,offline | 0 | 0 | ||
compute-0-25 | 16 | down | 0 | 0 | ||
compute-0-26 | 16 | down | 0 | 0 | ||
compute-0-27 | 16 | down | 0 | 0 | ||
compute-0-28 | 16 | down | 0 | 0 | ||
compute-0-29 | 16 | down,offline | 0 | 0 | ||
compute-0-30 | 16 | down,offline | 0 | 0 | ||
compute-0-31 | 16 | down | 0 | 0 | ||
compute-0-32 | 16 | down | 0 | 0 | ||
compute-0-33 | 16 | down | 0 | 0 | ||
compute-0-34 | 16 | down | 0 | 0 | ||
compute-0-35 | 16 | down | 0 | 0 | ||
compute-0-36 | 16 | down,offline | 0 | 0 | ||
compute-0-37 | 16 | down | 0 | 0 | ||
compute-0-38 | 16 | down | 0 | 0 | ||
compute-0-39 | 16 | down,offline | 0 | 0 | ||
compute-0-40 | 16 | down | 0 | 0 | ||
compute-0-41 | 16 | down | 0 | 0 | ||
compute-0-42 | 16 | down | 0 | 0 | ||
compute-0-43 | 16 | down | 0 | 0 | ||
compute-0-44 | 16 | down,offline | 0 | 0 | ||
compute-0-45 | 16 | down,offline | 0 | 0 | ||
compute-0-46 | 16 | down | 0 | 0 | ||
compute-0-47 | 16 | down,offline | 0 | 0 | ||
Cluster 9
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Job Name | Job State | Walltime Used | No. of CPUs Used | Memory Used | Efficiency† |
---|---|---|---|---|---|---|---|
No jobs are running now.
Cluster 10
Nodes Summary
Total Number of CPUs: 280
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
down | 12 | 240 | 0 | 0.00 |
down,offline | 1 | 20 | 0 | 0.00 |
free | 1 | 0 | 20 | 7.14 |
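As a sanity check on the summary above, the last column is simply free CPUs as a share of the cluster total; recomputing from the Cluster 10 figures:

```python
# "% of total CPUs free" for Cluster 10, recomputed from the table above.
total_cpus = 280  # Total Number of CPUs
free_cpus = 20    # free CPUs (all on compute-0-8)
pct_free = round(100 * free_cpus / total_cpus, 2)
print(pct_free)  # 7.14, matching the table
```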
Free CPUs (nodewise)
Node name | No. of free CPUs |
---|---|
compute-0-8 | 20 |
Total | 20 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|
No jobs are running now.
Cluster 10
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|
compute-0-0 | 20 | down | 0 | 0 | ||
compute-0-1 | 20 | down | 0 | 0 | ||
compute-0-2 | 20 | down | 0 | 0 | ||
compute-0-4 | 20 | down | 0 | 0 | ||
compute-0-5 | 20 | down | 0 | 0 | ||
compute-0-6 | 20 | down | 0 | 0 | ||
compute-0-7 | 20 | down,offline | 0 | 0 | ||
compute-0-8 | 20 | free | 0 | 20 | ||
compute-0-10 | 20 | down | 0 | 0 | ||
compute-0-11 | 20 | down | 0 | 0 | ||
compute-0-12 | 20 | down | 0 | 0 | ||
compute-0-13 | 20 | down | 0 | 0 | ||
compute-0-3 | 20 | down | 0 | 0 | ||
compute-0-9 | 20 | down | 0 | 0 | ||
Cluster 10
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Job Name | Job State | Walltime Used | No. of CPUs Used | Memory Used | Efficiency† |
---|---|---|---|---|---|---|---|
No jobs are running now.
Cluster 11
Nodes Summary
Total Number of CPUs: 960
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
free | 28 | 24 | 648 | 67.50 |
down | 11 | 264 | 0 | 0.00 |
down,job-exclusive | 1 | 24 | 0 | 0.00 |
Free CPUs (nodewise)
Node name | No. of free CPUs |
---|---|
compute000 | 24 |
compute001 | 24 |
compute002 | 24 |
compute003 | 24 |
compute005 | 24 |
compute007 | 24 |
compute008 | 24 |
compute009 | 24 |
compute010 | 24 |
compute011 | 24 |
compute012 | 24 |
compute013 | 24 |
compute014 | 24 |
compute015 | 24 |
compute017 | 24 |
compute020 | 24 |
compute021 | 24 |
compute022 | 24 |
compute024 | 24 |
compute026 | 24 |
compute027 | 24 |
compute028 | 19 |
compute030 | 24 |
compute032 | 24 |
compute034 | 24 |
compute035 | 18 |
compute036 | 13 |
compute039 | 22 |
Total | 648 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|
R | tanoykanti | 12 | 12 | 1.25% | 48 days 12:31:19 hrs | 77.27% | 77.27% |
R | sud98 | 5 | 5 | 0.52% | 23 days 03:50:08 hrs | 64.11% | 64.11% |
R | tanmoymondal | 24 | 24 | 2.50% | 1 day 18:04:15 hrs | 100.11% | 100.11% |
R | arghyamaity | 4 | 4 | 0.42% | 4 days 08:18:30 hrs | 100.09% | 100.09% |
R | aparajita | 3 | 3 | 0.31% | 11:52:41 hrs | 100.08% | 100.08% |
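The walltime strings in these tables mix two formats ("48 days 12:31:19 hrs" and plain "11:52:41 hrs"). A sketch of one way to normalize them to hours for aggregation; the function name and regex are ours, not part of the report generator:

```python
import re

def walltime_to_hours(s: str) -> float:
    """Convert report walltimes like '48 days 12:31:19 hrs' or '11:52:41 hrs' to hours."""
    m = re.match(r"(?:(\d+)\s+days?\s+)?(\d+):(\d+):(\d+)\s+hrs", s)
    if m is None:
        raise ValueError(f"unrecognized walltime: {s!r}")
    days, hours, minutes, seconds = (int(g or 0) for g in m.groups())
    return days * 24 + hours + minutes / 60 + seconds / 3600

print(walltime_to_hours("1 day 18:04:15 hrs"))  # ~42.07 hours
```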
Cluster 11
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|
compute000 | 24 | free | 0 | 24 | ||
compute001 | 24 | free | 0 | 24 | ||
compute002 | 24 | free | 0 | 24 | ||
compute003 | 24 | free | 0 | 24 | ||
compute004 | 24 | down | 0 | 0 | ||
compute005 | 24 | free | 0 | 24 | ||
compute006 | 24 | down | 0 | 0 | ||
compute007 | 24 | free | 0 | 24 | ||
compute008 | 24 | free | 0 | 24 | ||
compute009 | 24 | free | 0 | 24 | ||
compute010 | 24 | free | 0 | 24 | ||
compute011 | 24 | free | 0 | 24 | ||
compute012 | 24 | free | 0 | 24 | ||
compute013 | 24 | free | 0 | 24 | ||
compute014 | 24 | free | 0 | 24 | ||
compute015 | 24 | free | 0 | 24 | ||
compute016 | 24 | down | 0 | 0 | ||
compute017 | 24 | free | 0 | 24 | ||
compute018 | 24 | down | 0 | 0 | ||
compute019 | 24 | down | 0 | 0 | ||
compute020 | 24 | free | 0 | 24 | ||
compute021 | 24 | free | 0 | 24 | ||
compute022 | 24 | free | 0 | 24 | ||
compute023 | 24 | down | 0 | 0 | ||
compute024 | 24 | free | 0 | 24 | ||
compute025 | 24 | down | 0 | 0 | ||
compute026 | 24 | free | 0 | 24 | ||
compute027 | 24 | free | 0 | 24 | ||
compute028 | 24 | free | 5 | 19 | ||
compute029 | 24 | down,job-exclusive | 24 | 0 | ||
compute030 | 24 | free | 0 | 24 | ||
compute031 | 24 | down | 0 | 0 | ||
compute032 | 24 | free | 0 | 24 | ||
compute033 | 24 | down | 0 | 0 | ||
compute034 | 24 | free | 0 | 24 | ||
compute035 | 24 | free | 6 | 18 | ||
compute036 | 24 | free | 11 | 13 | ||
compute037 | 24 | down | 0 | 0 | ||
compute038 | 24 | down | 0 | 0 | ||
compute039 | 24 | free | 2 | 22 | ||
Cluster 11
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Job Name | Job State | Walltime Used | No. of CPUs Used | Memory Used | Efficiency† |
---|---|---|---|---|---|---|---|
651705 | tanoykanti | fin07 | R | 48 days 12:34:14 hrs | 1 | 33.69 MB | 77.27% |
651706 | tanoykanti | fin08 | R | 48 days 12:33:44 hrs | 1 | 33.84 MB | 77.27% |
651707 | tanoykanti | fin09 | R | 48 days 12:33:10 hrs | 1 | 33.75 MB | 77.28% |
651708 | tanoykanti | fin10 | R | 48 days 12:32:26 hrs | 1 | 33.76 MB | 77.27% |
651709 | tanoykanti | fin11 | R | 48 days 12:31:54 hrs | 1 | 33.69 MB | 77.27% |
651710 | tanoykanti | fin12 | R | 48 days 12:31:25 hrs | 1 | 33.73 MB | 77.27% |
651711 | tanoykanti | fin13 | R | 48 days 12:31:07 hrs | 1 | 33.77 MB | 77.27% |
651712 | tanoykanti | fin14 | R | 48 days 12:30:37 hrs | 1 | 33.78 MB | 77.27% |
651713 | tanoykanti | fin15 | R | 48 days 12:30:09 hrs | 1 | 31.87 MB | 77.28% |
651714 | tanoykanti | fin16 | R | 48 days 12:29:33 hrs | 1 | 33.69 MB | 77.27% |
651715 | tanoykanti | fin17 | R | 48 days 12:29:03 hrs | 1 | 33.77 MB | 77.28% |
651716 | tanoykanti | fin18 | R | 48 days 12:28:29 hrs | 1 | 33.81 MB | 77.27% |
659000 | sud98 | 5_qubit_1 | R | 23 days 04:09:20 hrs | 1 | 14.04 MB | 64.11% |
659001 | sud98 | 5_qubit_2 | R | 23 days 04:08:38 hrs | 1 | 14.07 MB | 64.16% |
659002 | sud98 | 5_qubit_3 | R | 23 days 03:49:26 hrs | 1 | 14.07 MB | 64.14% |
659003 | sud98 | 5_qubit_4 | R | 23 days 03:32:01 hrs | 1 | 14.04 MB | 64.07% |
659004 | sud98 | 5_qubit_5 | R | 23 days 03:31:17 hrs | 1 | 14.04 MB | 64.07% |
659150 | tanmoymondal | sf_2_3.0_8_0.5_0.038593 | R | 1 day 18:05:07 hrs | 1 | 706.75 MB | 100.12% |
659151 | tanmoymondal | sf_2_3.0_8_0.5_0.035084 | R | 1 day 18:05:01 hrs | 1 | 654.37 MB | 100.31% |
659152 | tanmoymondal | sf_2_3.0_8_0.5_0.031895 | R | 1 day 18:04:58 hrs | 1 | 663.71 MB | 100.28% |
659153 | tanmoymondal | sf_2_3.0_8_0.5_0.028995 | R | 1 day 18:04:52 hrs | 1 | 651.44 MB | 100.04% |
659154 | tanmoymondal | sf_2_3.0_8_0.5_0.026359 | R | 1 day 18:04:49 hrs | 1 | 615.97 MB | 100.05% |
659155 | tanmoymondal | sf_2_3.0_8_0.5_0.023963 | R | 1 day 18:04:44 hrs | 1 | 650.77 MB | 99.99% |
659156 | tanmoymondal | sf_2_3.0_8_0.5_0.021785 | R | 1 day 18:04:40 hrs | 1 | 617.48 MB | 100.17% |
659157 | tanmoymondal | sf_2_3.0_8_0.5_0.019804 | R | 1 day 18:04:35 hrs | 1 | 613.68 MB | 100.12% |
659158 | tanmoymondal | sf_2_3.0_8_0.5_0.018004 | R | 1 day 18:04:30 hrs | 1 | 621.50 MB | 100.21% |
659159 | tanmoymondal | sf_2_3.0_8_0.5_0.016367 | R | 1 day 18:04:27 hrs | 1 | 640.98 MB | 100.17% |
659160 | tanmoymondal | sf_2_3.0_8_0.5_0.014879 | R | 1 day 18:04:21 hrs | 1 | 615.02 MB | 100.06% |
659161 | tanmoymondal | sf_2_3.0_8_0.5_0.013527 | R | 1 day 18:04:18 hrs | 1 | 607.94 MB | 99.95% |
659162 | tanmoymondal | sf_2_3.0_8_0.5_0.012297 | R | 1 day 18:04:14 hrs | 1 | 613.84 MB | 100.15% |
659163 | tanmoymondal | sf_2_3.0_8_0.5_0.011179 | R | 1 day 18:04:10 hrs | 1 | 618.93 MB | 99.97% |
659164 | tanmoymondal | sf_2_3.0_8_0.5_0.010163 | R | 1 day 18:04:05 hrs | 1 | 617.34 MB | 100.14% |
659165 | tanmoymondal | sf_2_3.0_8_0.5_0.009239 | R | 1 day 18:04:01 hrs | 1 | 627.28 MB | 100.18% |
659166 | tanmoymondal | sf_2_3.0_8_0.5_0.008399 | R | 1 day 18:03:57 hrs | 1 | 616.14 MB | 100.20% |
659167 | tanmoymondal | sf_2_3.0_8_0.5_0.007635 | R | 1 day 18:03:53 hrs | 1 | 619.93 MB | 100.17% |
659168 | tanmoymondal | sf_2_3.0_8_0.5_0.006941 | R | 1 day 18:03:48 hrs | 1 | 626.80 MB | 100.16% |
659169 | tanmoymondal | sf_2_3.0_8_0.5_0.006310 | R | 1 day 18:03:44 hrs | 1 | 615.64 MB | 99.94% |
659170 | tanmoymondal | sf_2_3.0_8_0.5_0.005737 | R | 1 day 18:03:36 hrs | 1 | 616.26 MB | 100.15% |
659171 | tanmoymondal | sf_2_3.0_8_0.5_0.005215 | R | 1 day 18:03:35 hrs | 1 | 618.10 MB | 99.92% |
659172 | tanmoymondal | sf_2_3.0_8_0.5_0.004741 | R | 1 day 18:03:31 hrs | 1 | 613.44 MB | 100.07% |
659173 | tanmoymondal | sf_2_3.0_8_0.5_0.004310 | R | 1 day 18:03:27 hrs | 1 | 613.11 MB | 100.17% |
660320 | arghyamaity | KC_s1.2 | R | 4 days 08:29:49 hrs | 1 | 23.42 MB | 100.09% |
660321 | arghyamaity | KC_s1.5 | R | 4 days 08:25:21 hrs | 1 | 23.43 MB | 100.09% |
660323 | arghyamaity | KC_s1.2 | R | 4 days 08:15:14 hrs | 1 | 23.41 MB | 100.09% |
660328 | arghyamaity | KC_s1.2 | R | 4 days 08:03:38 hrs | 1 | 23.42 MB | 100.09% |
660513 | aparajita | gg_1.2_12_g | R | 12:08:59 hrs | 1 | 48.30 MB | 100.08% |
660515 | aparajita | gg_1.2_12_g | R | 11:46:24 hrs | 1 | 50.37 MB | 100.08% |
660516 | aparajita | gg_1.2_12_g | R | 11:42:40 hrs | 1 | 45.53 MB | 100.09% |
Cluster 12
Nodes Summary
Total Number of CPUs: 1056
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 21 | 504 | 0 | 0.00 |
free | 22 | 44 | 484 | 45.83 |
down | 1 | 24 | 0 | 0.00 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | node3 | 24 |
workq | node9 | 9 |
workq | node11 | 23 |
workq | node12 | 23 |
workq | node16 | 24 |
workq | node17 | 16 |
workq | node19 | 12 |
workq | node20 | 24 |
workq | node21 | 24 |
workq | node22 | 24 |
workq | node1 | 24 |
workq | node13 | 17 |
workq | node36 | 24 |
workq | node37 | 24 |
workq | node38 | 24 |
workq | node39 | 24 |
workq | node40 | 24 |
workq | node41 | 24 |
workq | node42 | 24 |
workq | node33 | 24 |
workq | node43 | 24 |
workq | node44 | 24 |
Total | | 484 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | debarupa11 | 86 | 86 | 8.14% | 44 days 00:44:24 hrs | 100.05% | 100.05% |
R | workq | subhojit | 2 | 48 | 4.55% | 14 days 15:20:27 hrs | 100.06% | 100.05% |
R | workq | debs | 1 | 48 | 4.55% | 18 days 08:20:31 hrs | 0.00% | 0.00% |
R | workq | shuvam | 3 | 3 | 0.28% | 12 days 00:54:11 hrs | 100.05% | 100.05% |
R | workq | bikashvbu | 2 | 72 | 6.82% | 4 days 12:17:44 hrs | 100.00% | 99.99% |
R | workq | sejalahuja | 3 | 3 | 0.28% | 5 days 04:36:04 hrs | 100.07% | 100.07% |
R | workq | arijeetsarangi | 1 | 144 | 13.64% | 4 days 00:20:00 hrs | 100.04% | 100.04% |
R | workq | swapnild | 6 | 144 | 13.64% | 01:37:13 hrs | 99.97% | 99.98% |
Cluster 12
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|---|
node2 | workq | 24 | job-busy | 24 | 0 | |
node3 | workq | 24 | free | 0 | 24 | |
node4 | workq | 24 | job-busy | 24 | 0 | |
node5 | workq | 24 | job-busy | 24 | 0 | |
node8 | workq | 24 | job-busy | 24 | 0 | |
node9 | workq | 24 | free | 15 | 9 | |
node10 | workq | 24 | job-busy | 24 | 0 | |
node11 | workq | 24 | free | 1 | 23 | |
node12 | workq | 24 | free | 1 | 23 | |
node14 | workq | 24 | job-busy | 24 | 0 | |
node15 | workq | 24 | job-busy | 24 | 0 | |
node16 | workq | 24 | free | 0 | 24 | |
node17 | workq | 24 | free | 8 | 16 | |
node18 | workq | 24 | job-busy | 24 | 0 | |
node19 | workq | 24 | free | 12 | 12 | |
node20 | workq | 24 | free | 0 | 24 | |
node21 | workq | 24 | free | 0 | 24 | |
node22 | workq | 24 | free | 0 | 24 | |
node1 | workq | 24 | free | 0 | 24 | |
node23 | workq | 24 | job-busy | 24 | 0 | |
node24 | workq | 24 | job-busy | 24 | 0 | |
node25 | workq | 24 | job-busy | 24 | 0 | |
node6 | workq | 24 | job-busy | 24 | 0 | |
node7 | workq | 24 | job-busy | 24 | 0 | |
node26 | workq | 24 | down | 0 | 0 | |
node27 | workq | 24 | job-busy | 24 | 0 | |
node13 | workq | 24 | free | 7 | 17 | |
node28 | workq | 24 | job-busy | 24 | 0 | |
node29 | workq | 24 | job-busy | 24 | 0 | |
node30 | workq | 24 | job-busy | 24 | 0 | |
node31 | workq | 24 | job-busy | 24 | 0 | |
node32 | workq | 24 | job-busy | 24 | 0 | |
node34 | workq | 24 | job-busy | 24 | 0 | |
node35 | workq | 24 | job-busy | 24 | 0 | |
node36 | workq | 24 | free | 0 | 24 | |
node37 | workq | 24 | free | 0 | 24 | |
node38 | workq | 24 | free | 0 | 24 | |
node39 | workq | 24 | free | 0 | 24 | |
node40 | workq | 24 | free | 0 | 24 | |
node41 | workq | 24 | free | 0 | 24 | |
node42 | workq | 24 | free | 0 | 24 | |
node33 | workq | 24 | free | 0 | 24 | |
node43 | workq | 24 | free | 0 | 24 | |
node44 | workq | 24 | free | 0 | 24 | |
Cluster 12
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPUs Used | Memory Used | Efficiency† |
---|---|---|---|---|---|---|---|---|
102710 | debarupa11 | workq | GMBF100000000 | R | 69 days 01:18:19 hrs | 1 | 6.76 MB | 99.28% |
104309 | debarupa11 | workq | Therm1.00 | R | 48 days 04:52:01 hrs | 1 | 7.28 MB | 100.06% |
104310 | debarupa11 | workq | Therm1.02 | R | 48 days 04:51:49 hrs | 1 | 9.28 MB | 100.06% |
104311 | debarupa11 | workq | Therm1.04 | R | 48 days 04:51:39 hrs | 1 | 9.26 MB | 100.06% |
104312 | debarupa11 | workq | Therm1.06 | R | 48 days 04:51:28 hrs | 1 | 7.27 MB | 100.06% |
104313 | debarupa11 | workq | Therm1.08 | R | 48 days 04:51:17 hrs | 1 | 9.28 MB | 100.06% |
104314 | debarupa11 | workq | Therm1.10 | R | 48 days 04:51:06 hrs | 1 | 9.27 MB | 100.06% |
104315 | debarupa11 | workq | Therm1.12 | R | 48 days 04:50:55 hrs | 1 | 7.28 MB | 100.06% |
104316 | debarupa11 | workq | Therm1.14 | R | 48 days 04:50:44 hrs | 1 | 7.27 MB | 100.06% |
104318 | debarupa11 | workq | Therm1.18 | R | 48 days 04:50:23 hrs | 1 | 7.27 MB | 100.06% |
104319 | debarupa11 | workq | Therm1.20 | R | 48 days 04:50:12 hrs | 1 | 7.28 MB | 100.06% |
104320 | debarupa11 | workq | Therm1.22 | R | 48 days 04:50:01 hrs | 1 | 7.28 MB | 100.06% |
104321 | debarupa11 | workq | Therm1.24 | R | 48 days 04:49:50 hrs | 1 | 7.28 MB | 100.06% |
104322 | debarupa11 | workq | Therm1.26 | R | 48 days 04:49:40 hrs | 1 | 9.26 MB | 100.06% |
104323 | debarupa11 | workq | Therm1.28 | R | 48 days 04:49:29 hrs | 1 | 7.27 MB | 100.06% |
104324 | debarupa11 | workq | Therm1.30 | R | 48 days 04:49:18 hrs | 1 | 7.27 MB | 100.06% |
104325 | debarupa11 | workq | Therm1.32 | R | 48 days 04:49:07 hrs | 1 | 9.27 MB | 100.06% |
104326 | debarupa11 | workq | Therm1.34 | R | 48 days 04:48:57 hrs | 1 | 9.27 MB | 100.06% |
104327 | debarupa11 | workq | Therm1.36 | R | 48 days 04:48:46 hrs | 1 | 9.27 MB | 100.06% |
104328 | debarupa11 | workq | Therm1.38 | R | 48 days 04:48:35 hrs | 1 | 7.28 MB | 100.06% |
104330 | debarupa11 | workq | Therm1.42 | R | 48 days 04:48:13 hrs | 1 | 7.28 MB | 100.06% |
104331 | debarupa11 | workq | Therm1.44 | R | 48 days 04:48:02 hrs | 1 | 7.29 MB | 100.05% |
104332 | debarupa11 | workq | Therm1.46 | R | 48 days 04:47:51 hrs | 1 | 9.28 MB | 100.06% |
104333 | debarupa11 | workq | Therm1.48 | R | 48 days 04:47:36 hrs | 1 | 9.30 MB | 100.07% |
104335 | debarupa11 | workq | Therm1.52 | R | 48 days 04:47:15 hrs | 1 | 7.29 MB | 100.06% |
104337 | debarupa11 | workq | Therm1.56 | R | 48 days 04:46:53 hrs | 1 | 7.30 MB | 100.06% |
104338 | debarupa11 | workq | Therm1.58 | R | 48 days 04:46:42 hrs | 1 | 7.30 MB | 100.06% |
104339 | debarupa11 | workq | Therm1.60 | R | 48 days 04:46:31 hrs | 1 | 9.30 MB | 100.06% |
104340 | debarupa11 | workq | Therm1.62 | R | 48 days 04:46:21 hrs | 1 | 7.29 MB | 100.06% |
104341 | debarupa11 | workq | Therm1.64 | R | 48 days 04:46:10 hrs | 1 | 7.29 MB | 100.06% |
104342 | debarupa11 | workq | Therm1.66 | R | 48 days 04:45:59 hrs | 1 | 7.30 MB | 100.06% |
104343 | debarupa11 | workq | Therm1.68 | R | 48 days 04:45:48 hrs | 1 | 7.30 MB | 100.07% |
104344 | debarupa11 | workq | Therm1.70 | R | 48 days 04:45:37 hrs | 1 | 9.29 MB | 100.06% |
104345 | debarupa11 | workq | Therm1.72 | R | 48 days 04:45:27 hrs | 1 | 9.29 MB | 100.06% |
104350 | debarupa11 | workq | Therm1.82 | R | 48 days 04:44:32 hrs | 1 | 7.29 MB | 100.07% |
105277 | debarupa11 | workq | T3.00 | R | 41 days 05:02:21 hrs | 1 | 6.99 MB | 100.06% |
105278 | debarupa11 | workq | T3.02 | R | 41 days 05:02:11 hrs | 1 | 8.98 MB | 100.06% |
105279 | debarupa11 | workq | T3.04 | R | 41 days 05:02:01 hrs | 1 | 8.97 MB | 100.06% |
105280 | debarupa11 | workq | T3.06 | R | 41 days 05:01:51 hrs | 1 | 6.99 MB | 100.07% |
105281 | debarupa11 | workq | T3.08 | R | 41 days 05:01:41 hrs | 1 | 6.99 MB | 100.06% |
105282 | debarupa11 | workq | T3.10 | R | 41 days 05:01:31 hrs | 1 | 6.99 MB | 100.07% |
105283 | debarupa11 | workq | T3.12 | R | 41 days 05:01:21 hrs | 1 | 6.98 MB | 100.07% |
105284 | debarupa11 | workq | T3.14 | R | 41 days 05:01:11 hrs | 1 | 8.99 MB | 100.07% |
105285 | debarupa11 | workq | T3.16 | R | 41 days 05:01:01 hrs | 1 | 8.98 MB | 100.06% |
105286 | debarupa11 | workq | T3.18 | R | 41 days 05:00:51 hrs | 1 | 7.00 MB | 100.07% |
105287 | debarupa11 | workq | T3.20 | R | 41 days 05:00:41 hrs | 1 | 6.97 MB | 100.07% |
105288 | debarupa11 | workq | T3.22 | R | 41 days 05:00:31 hrs | 1 | 6.99 MB | 100.07% |
105289 | debarupa11 | workq | T3.24 | R | 41 days 05:01:59 hrs | 1 | 6.98 MB | 100.07% |
105290 | debarupa11 | workq | T3.26 | R | 41 days 05:01:49 hrs | 1 | 6.97 MB | 100.07% |
105291 | debarupa11 | workq | T3.28 | R | 41 days 05:01:39 hrs | 1 | 6.98 MB | 100.07% |
105292 | debarupa11 | workq | T3.30 | R | 41 days 05:01:29 hrs | 1 | 6.98 MB | 100.07% |
105294 | debarupa11 | workq | T3.34 | R | 41 days 05:01:09 hrs | 1 | 6.98 MB | 100.07% |
105295 | debarupa11 | workq | T3.36 | R | 41 days 05:00:59 hrs | 1 | 8.97 MB | 100.07% |
105296 | debarupa11 | workq | T3.38 | R | 41 days 05:00:49 hrs | 1 | 6.97 MB | 100.07% |
105297 | debarupa11 | workq | T3.40 | R | 41 days 05:00:39 hrs | 1 | 6.99 MB | 100.07% |
105298 | debarupa11 | workq | T3.42 | R | 41 days 05:00:28 hrs | 1 | 8.98 MB | 100.07% |
105299 | debarupa11 | workq | T3.44 | R | 41 days 05:00:18 hrs | 1 | 6.99 MB | 100.07% |
105300 | debarupa11 | workq | T3.46 | R | 41 days 05:00:07 hrs | 1 | 6.98 MB | 100.07% |
105301 | debarupa11 | workq | T3.48 | R | 41 days 04:58:09 hrs | 1 | 6.99 MB | 100.07% |
105302 | debarupa11 | workq | T3.50 | R | 41 days 04:57:59 hrs | 1 | 6.98 MB | 100.06% |
105303 | debarupa11 | workq | T3.52 | R | 41 days 04:57:49 hrs | 1 | 6.98 MB | 100.07% |
105304 | debarupa11 | workq | T3.54 | R | 41 days 04:57:39 hrs | 1 | 6.98 MB | 100.07% |
105305 | debarupa11 | workq | T3.56 | R | 41 days 04:57:29 hrs | 1 | 6.99 MB | 100.06% |
105306 | debarupa11 | workq | T3.58 | R | 41 days 04:57:19 hrs | 1 | 6.97 MB | 100.06% |
105307 | debarupa11 | workq | T3.60 | R | 41 days 04:57:09 hrs | 1 | 6.98 MB | 100.07% |
105308 | debarupa11 | workq | T3.62 | R | 41 days 04:56:59 hrs | 1 | 6.98 MB | 100.07% |
105309 | debarupa11 | workq | T3.64 | R | 41 days 04:57:37 hrs | 1 | 6.98 MB | 100.07% |
105310 | debarupa11 | workq | T3.66 | R | 41 days 04:57:27 hrs | 1 | 6.98 MB | 100.07% |
105311 | debarupa11 | workq | T3.68 | R | 41 days 04:57:17 hrs | 1 | 6.98 MB | 100.07% |
105312 | debarupa11 | workq | T3.70 | R | 41 days 04:57:07 hrs | 1 | 8.98 MB | 100.06% |
105313 | debarupa11 | workq | T3.72 | R | 41 days 04:56:57 hrs | 1 | 6.97 MB | 100.07% |
105314 | debarupa11 | workq | T3.74 | R | 41 days 04:56:47 hrs | 1 | 6.98 MB | 100.06% |
105315 | debarupa11 | workq | T3.76 | R | 41 days 04:56:37 hrs | 1 | 6.98 MB | 100.07% |
105316 | debarupa11 | workq | T3.78 | R | 41 days 04:56:27 hrs | 1 | 6.98 MB | 100.06% |
105317 | debarupa11 | workq | T3.80 | R | 41 days 04:56:17 hrs | 1 | 8.96 MB | 100.07% |
105318 | debarupa11 | workq | T3.82 | R | 41 days 04:56:07 hrs | 1 | 6.99 MB | 100.06% |
105319 | debarupa11 | workq | T3.84 | R | 41 days 04:55:57 hrs | 1 | 6.99 MB | 100.07% |
105320 | debarupa11 | workq | T3.86 | R | 41 days 04:55:48 hrs | 1 | 6.98 MB | 100.06% |
105321 | debarupa11 | workq | T3.88 | R | 41 days 04:56:25 hrs | 1 | 7.00 MB | 100.06% |
105322 | debarupa11 | workq | T3.90 | R | 41 days 04:56:15 hrs | 1 | 8.99 MB | 100.07% |
105323 | debarupa11 | workq | T3.92 | R | 41 days 04:56:05 hrs | 1 | 6.98 MB | 100.06% |
105324 | debarupa11 | workq | T3.94 | R | 41 days 04:55:55 hrs | 1 | 7.00 MB | 100.07% |
105325 | debarupa11 | workq | T3.96 | R | 41 days 04:55:45 hrs | 1 | 6.99 MB | 100.07% |
105326 | debarupa11 | workq | T3.98 | R | 41 days 04:55:35 hrs | 1 | 7.00 MB | 100.07% |
105327 | debarupa11 | workq | T4.00 | R | 41 days 04:55:25 hrs | 1 | 6.98 MB | 100.06% |
109277 | subhojit | workq | test | R | 9 days 12:20:07 hrs | 24 | 4.11 GB | 100.07% |
109278 | subhojit | workq | test | R | 19 days 18:20:47 hrs | 24 | 4.44 GB | 100.04% |
109963 | debarupa11 | workq | 5dcheck0.00 | R | 18 days 09:45:11 hrs | 1 | 7.07 MB | 100.06% |
110172 | debs | workq | initial_phono | R | 18 days 08:20:31 hrs | 48 | 35.77 MB | 0.00% |
110602 | shuvam | workq | 0.40_0.80_0.10 | R | 14 days 13:53:45 hrs | 1 | 12.48 MB | 100.06% |
110629 | shuvam | workq | 0.30_0.60_0.20 | R | 14 days 13:46:03 hrs | 1 | 332.94 MB | 100.04% |
115965 | shuvam | workq | two_qbit | R | 6 days 23:02:46 hrs | 1 | 77.74 MB | 100.05% |
115968 | bikashvbu | workq | Tuya | R | 6 days 15:42:05 hrs | 48 | 6.78 GB | 99.99% |
116009 | sejalahuja | workq | N7_qmcl | R | 5 days 12:00:40 hrs | 1 | 16.33 MB | 100.07% |
116012 | sejalahuja | workq | N7_qml | R | 5 days 11:55:43 hrs | 1 | 14.41 MB | 100.07% |
116027 | sejalahuja | workq | N7_amdam | R | 4 days 13:51:50 hrs | 1 | 16.40 MB | 100.07% |
116053 | arijeetsarangi | workq | test-md | R | 4 days 00:20:00 hrs | 144 | 9.99 GB | 100.04% |
116310 | bikashvbu | workq | JCM | R | 05:29:05 hrs | 24 | 8.32 GB | 100.00% |
116313 | swapnild | workq | c2_3 | R | 04:25:15 hrs | 24 | 57.87 GB | 100.00% |
116326 | swapnild | workq | 4hCintr1_3 | R | 01:06:27 hrs | 24 | 50.85 GB | 99.98% |
116327 | swapnild | workq | 4hCintr2_3 | R | 01:06:23 hrs | 24 | 51.20 GB | 99.95% |
116328 | swapnild | workq | 4hsiintr1_3 | R | 01:04:21 hrs | 24 | 64.49 GB | 99.98% |
116329 | swapnild | workq | 4hsiintr2_3 | R | 01:02:24 hrs | 24 | 52.08 GB | 99.96% |
116330 | swapnild | workq | si2_3 | R | 00:58:31 hrs | 24 | 58.12 GB | 99.92% |
Cluster 13
Nodes Summary
Total Number of CPUs: 1024
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 26 | 832 | 0 | 0.00 |
free | 4 | 1 | 127 | 12.40 |
down | 2 | 64 | 0 | 0.00 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | c13node8 | 31 |
neutrino | c13node28 | 32 |
neutrino | c13node30 | 32 |
neutrino | c13node31 | 32 |
Total | | 127 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | vanshreep | 1 | 128 | 12.50% | 2 days 04:13:49 hrs | 99.66% | 99.66% |
R | workq | prajna | 1 | 128 | 12.50% | 2 days 12:51:17 hrs | 99.64% | 99.64% |
R | workq | pradhi | 1 | 128 | 12.50% | 1 day 09:17:43 hrs | 99.61% | 99.61% |
R | workq | tanoykanti | 1 | 1 | 0.10% | 11:49:08 hrs | 99.75% | 99.75% |
R | workq | arijeetsarangi | 1 | 128 | 12.50% | 06:04:52 hrs | 99.65% | 99.65% |
R | workq | manasagb | 1 | 128 | 12.50% | 08:09:25 hrs | 99.70% | 99.70% |
R | workq | shilendra | 1 | 64 | 6.25% | 03:58:40 hrs | 99.60% | 99.60% |
R | workq | swapnild | 1 | 128 | 12.50% | 01:10:33 hrs | 99.49% | 99.49% |
Job State | Queue | User | No. of Jobs | No. of CPUs Requested |
---|---|---|---|---|
Q | workq | vanshreep | 2 | 256 |
Q | workq | arijeetsarangi | 1 | 128 |
Cluster 13
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|---|
c13node1 | workq | 32 | job-busy | 32 | 0 | |
c13node2 | workq | 32 | job-busy | 32 | 0 | |
c13node3 | workq | 32 | job-busy | 32 | 0 | |
c13node4 | workq | 32 | job-busy | 32 | 0 | |
c13node5 | workq | 32 | job-busy | 32 | 0 | |
c13node7 | workq | 32 | job-busy | 32 | 0 | |
c13node8 | workq | 32 | free | 1 | 31 | |
c13node9 | workq | 32 | job-busy | 32 | 0 | |
c13node10 | workq | 32 | job-busy | 32 | 0 | |
c13node11 | workq | 32 | job-busy | 32 | 0 | |
c13node12 | workq | 32 | job-busy | 32 | 0 | |
c13node14 | workq | 32 | job-busy | 32 | 0 | |
c13node15 | workq | 32 | job-busy | 32 | 0 | |
c13node0 | workq | 32 | job-busy | 32 | 0 | |
c13node16 | workq | 32 | job-busy | 32 | 0 | |
c13node17 | workq | 32 | job-busy | 32 | 0 | |
c13node18 | workq | 32 | job-busy | 32 | 0 | |
c13node6 | workq | 32 | down | 0 | 0 | |
c13node19 | workq | 32 | job-busy | 32 | 0 | |
c13node20 | workq | 32 | job-busy | 32 | 0 | |
c13node22 | workq | 32 | job-busy | 32 | 0 | |
c13node13 | workq | 32 | job-busy | 32 | 0 | |
c13node23 | workq | 32 | job-busy | 32 | 0 | |
c13node21 | workq | 32 | job-busy | 32 | 0 | |
c13node24 | workq | 32 | job-busy | 32 | 0 | |
c13node25 | workq | 32 | job-busy | 32 | 0 | |
c13node26 | workq | 32 | job-busy | 32 | 0 | |
c13node27 | workq | 32 | job-busy | 32 | 0 | |
c13node28 | neutrino | 32 | free | 0 | 32 | |
c13node29 | neutrino | 32 | down | 0 | 0 | |
c13node30 | neutrino | 32 | free | 0 | 32 | |
c13node31 | neutrino | 32 | free | 0 | 32 | |
Cluster 13
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPUs Used | Memory Used | Efficiency† |
---|---|---|---|---|---|---|---|---|
405185 | vanshreep | workq | psb5.0cs | R | 2 days 04:13:49 hrs | 128 | 51.87 GB | 99.66% |
405214 | prajna | workq | relax | R | 2 days 12:51:17 hrs | 128 | 11.04 GB | 99.64% |
405244 | vanshreep | workq | psb4.5cs | Q | | 128 | | 0.00% |
405255 | pradhi | workq | mos2_all | R | 1 day 09:17:44 hrs | 128 | 32.01 GB | 99.61% |
405264 | tanoykanti | workq | n_610 | R | 11:49:08 hrs | 1 | 78.79 MB | 99.75% |
405267 | arijeetsarangi | workq | phonon | R | 06:04:53 hrs | 128 | 40.17 GB | 99.65% |
405269 | manasagb | workq | test | R | 08:09:25 hrs | 128 | 34.60 GB | 99.70% |
405275 | shilendra | workq | opt_0.99 | R | 03:58:40 hrs | 64 | 33.28 GB | 99.60% |
405276 | vanshreep | workq | psb2.0t | Q | | 128 | | 0.00% |
405278 | swapnild | workq | wL | R | 01:10:33 hrs | 128 | 49.92 GB | 99.49% |
405279 | arijeetsarangi | workq | phonon | Q | | 128 | | 0.00% |
Cluster 14
Nodes Summary
Total Number of CPUs: 1040
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 14 | 784 | 0 | 0.00 |
free | 5 | 0 | 256 | 24.62 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | node3 | 56 |
workq | node4 | 56 |
workq | node11 | 56 |
workq | node13 | 56 |
Total | | 256 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | prajna | 1 | 56 | 5.38% | 2 days 11:46:03 hrs | 99.56% | 99.56% |
R | workq | mab5 | 1 | 112 | 10.77% | 1 day 13:46:50 hrs | 99.38% | 99.38% |
R | workq | swapnild | 1 | 112 | 10.77% | 1 day 12:40:39 hrs | 99.52% | 99.52% |
R | workq | mshashank | 1 | 56 | 5.38% | 13:52:31 hrs | 99.14% | 99.14% |
R | workq | arijeetsarangi | 1 | 112 | 10.77% | 07:51:34 hrs | 99.70% | 99.70% |
R | workq | manasagb | 1 | 112 | 10.77% | 07:41:40 hrs | 99.70% | 99.70% |
R | workq | shilendra | 1 | 112 | 10.77% | 06:03:21 hrs | 99.09% | 99.09% |
R | workq | vanshreep | 1 | 112 | 10.77% | 04:19:03 hrs | 99.53% | 99.53% |
Cluster 14
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|---|
node1 | workq | 56 | job-busy | 56 | 0 | |
node2 | workq | 56 | job-busy | 56 | 0 | |
node3 | workq | 56 | free | 0 | 56 | |
node4 | workq | 56 | free | 0 | 56 | |
node5 | workq | 56 | job-busy | 56 | 0 | |
node6 | workq | 56 | job-busy | 56 | 0 | |
node7 | workq | 56 | job-busy | 56 | 0 | |
node8 | workq | 56 | job-busy | 56 | 0 | |
node9 | workq | 56 | job-busy | 56 | 0 | |
node10 | workq | 56 | job-busy | 56 | 0 | |
node11 | workq | 56 | free | 0 | 56 | |
node12 | workq | 56 | job-busy | 56 | 0 | |
node13 | workq | 56 | free | 0 | 56 | |
node14 | workq | 56 | job-busy | 56 | 0 | |
node15 | workq | 56 | job-busy | 56 | 0 | |
node16 | workq | 56 | job-busy | 56 | 0 | |
node17 | workq | 56 | job-busy | 56 | 0 | |
node18 | workq | 56 | job-busy | 56 | 0 | |
gpu1 | neutrino | 32 | free | 0 | 32 | |
Cluster 14
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPUs Used | Memory Used | Efficiency† |
---|---|---|---|---|---|---|---|---|
20659.c14m1.clusternet | prajna@c14m2.clusternet | workq | LaNbO2N_010 | R | 2 days 11:46:03 hrs | 56 | 102.95 GB | 99.56% |
20686.c14m1.clusternet | mab5@c14m2.clusternet | workq | _p1_SR_BR_ | R | 1 day 13:46:51 hrs | 112 | 25.36 GB | 99.38% |
20687.c14m1.clusternet | swapnild@c14m2.clusternet | workq | wOAc | R | 1 day 12:40:40 hrs | 112 | 127.62 GB | 99.52% |
20726.c14m1.clusternet | mshashank@c14m2.clusternet | workq | f4000 | R | 13:52:31 hrs | 56 | 112.25 GB | 99.14% |
20741.c14m1.clusternet | arijeetsarangi@c14m2.clusternet | workq | test | R | 07:51:35 hrs | 112 | 76.97 GB | 99.70% |
20748.c14m1.clusternet | manasagb@c14m2.clusternet | workq | test | R | 07:41:40 hrs | 112 | 76.97 GB | 99.70% |
20749.c14m1.clusternet | shilendra@c14m2.clusternet | workq | opt_98 | R | 06:03:21 hrs | 112 | 24.69 GB | 99.09% |
20766.c14m1.clusternet | vanshreep@c14m2.clusternet | workq | NMO | R | 04:19:03 hrs | 112 | 26.68 GB | 99.53% |