Cluster 9
Nodes Summary
Total Number of CPUs: 768
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
down | 33 | 528 | 0 | 0.00 |
down,offline | 15 | 240 | 0 | 0.00 |
Free CPUs (nodewise)
There is no free CPU available now.
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
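As a sketch with hypothetical numbers (not taken from this report), the two summary metrics defined above can be computed as follows; they differ whenever jobs have unequal walltimes:

```python
# Hypothetical jobs (times in hours); values are illustrative only.
jobs = [
    {"cpu_time": 10.0,  "walltime": 10.0,  "ncpus": 1},  # fully efficient
    {"cpu_time": 100.0, "walltime": 200.0, "ncpus": 1},  # 50% efficient
]

# Avg. Efficiency per CPU: sum of per-job CPU-time/walltime ratios,
# divided by the total number of CPUs assigned.
avg_eff = (sum(j["cpu_time"] / j["walltime"] for j in jobs)
           / sum(j["ncpus"] for j in jobs))

# Overall Efficiency: total CPU time over total reserved CPU-hours,
# which weights each job by walltime × no. of CPUs.
overall_eff = (sum(j["cpu_time"] for j in jobs)
               / sum(j["walltime"] * j["ncpus"] for j in jobs))

print(f"{avg_eff:.2%}")      # 75.00%
print(f"{overall_eff:.2%}")  # 52.38%
```

The first metric treats every job equally; the second weights long, wide jobs more heavily, which is why the two columns in the tables below can disagree.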
There is no running job now.
Cluster 9
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs | ||
---|---|---|---|---|---|---|
compute-0-0 | 16 | down | 0 | 0 | ||
compute-0-1 | 16 | down,offline | 0 | 0 | ||
compute-0-2 | 16 | down | 0 | 0 | ||
compute-0-3 | 16 | down,offline | 0 | 0 | ||
compute-0-4 | 16 | down | 0 | 0 | ||
compute-0-5 | 16 | down,offline | 0 | 0 | ||
compute-0-6 | 16 | down | 0 | 0 | ||
compute-0-7 | 16 | down | 0 | 0 | ||
compute-0-8 | 16 | down | 0 | 0 | ||
compute-0-9 | 16 | down | 0 | 0 | ||
compute-0-10 | 16 | down | 0 | 0 | ||
compute-0-11 | 16 | down | 0 | 0 | ||
compute-0-12 | 16 | down | 0 | 0 | ||
compute-0-13 | 16 | down | 0 | 0 | ||
compute-0-14 | 16 | down | 0 | 0 | ||
compute-0-15 | 16 | down,offline | 0 | 0 | ||
compute-0-16 | 16 | down | 0 | 0 | ||
compute-0-17 | 16 | down | 0 | 0 | ||
compute-0-18 | 16 | down | 0 | 0 | ||
compute-0-19 | 16 | down | 0 | 0 | ||
compute-0-20 | 16 | down | 0 | 0 | ||
compute-0-21 | 16 | down,offline | 0 | 0 | ||
compute-0-22 | 16 | down,offline | 0 | 0 | ||
compute-0-23 | 16 | down,offline | 0 | 0 | ||
compute-0-24 | 16 | down,offline | 0 | 0 | ||
compute-0-25 | 16 | down | 0 | 0 | ||
compute-0-26 | 16 | down | 0 | 0 | ||
compute-0-27 | 16 | down | 0 | 0 | ||
compute-0-28 | 16 | down | 0 | 0 | ||
compute-0-29 | 16 | down,offline | 0 | 0 | ||
compute-0-30 | 16 | down,offline | 0 | 0 | ||
compute-0-31 | 16 | down | 0 | 0 | ||
compute-0-32 | 16 | down | 0 | 0 | ||
compute-0-33 | 16 | down | 0 | 0 | ||
compute-0-34 | 16 | down | 0 | 0 | ||
compute-0-35 | 16 | down | 0 | 0 | ||
compute-0-36 | 16 | down,offline | 0 | 0 | ||
compute-0-37 | 16 | down | 0 | 0 | ||
compute-0-38 | 16 | down | 0 | 0 | ||
compute-0-39 | 16 | down,offline | 0 | 0 | ||
compute-0-40 | 16 | down | 0 | 0 | ||
compute-0-41 | 16 | down | 0 | 0 | ||
compute-0-42 | 16 | down | 0 | 0 | ||
compute-0-43 | 16 | down | 0 | 0 | ||
compute-0-44 | 16 | down,offline | 0 | 0 | ||
compute-0-45 | 16 | down,offline | 0 | 0 | ||
compute-0-46 | 16 | down | 0 | 0 | ||
compute-0-47 | 16 | down,offline | 0 | 0 | ||
Cluster 9
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
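As a sketch with made-up numbers, a job's efficiency per the formula above is its accumulated CPU time divided by the CPU-hours it has reserved:

```python
# Hypothetical job; numbers are illustrative, not from this report.
cpu_time_hours = 300.0   # total CPU time accumulated across all cores
walltime_hours = 100.0   # wall-clock time the job has been running
ncpus_assigned = 4       # CPUs reserved for the job

# A job keeping all 4 CPUs busy would accumulate 400 CPU-hours here,
# so 300 CPU-hours corresponds to 75% parallelization efficiency.
efficiency = cpu_time_hours / (walltime_hours * ncpus_assigned)
print(f"{efficiency:.2%}")  # 75.00%
```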
There is no running job now.
Cluster 10
Nodes Summary
Total Number of CPUs: 280
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
down | 12 | 240 | 0 | 0.00 |
down,offline | 1 | 20 | 0 | 0.00 |
free | 1 | 0 | 20 | 7.14 |
Free CPUs (nodewise)
Node name | No. of free CPUs |
---|---|
compute-0-8 | 20 |
Total | 20 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
There is no running job now.
Cluster 10
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs | ||
---|---|---|---|---|---|---|
compute-0-0 | 20 | down | 0 | 0 | ||
compute-0-1 | 20 | down | 0 | 0 | ||
compute-0-2 | 20 | down | 0 | 0 | ||
compute-0-4 | 20 | down | 0 | 0 | ||
compute-0-5 | 20 | down | 0 | 0 | ||
compute-0-6 | 20 | down | 0 | 0 | ||
compute-0-7 | 20 | down,offline | 0 | 0 | ||
compute-0-8 | 20 | free | 0 | 20 | ||
compute-0-10 | 20 | down | 0 | 0 | ||
compute-0-11 | 20 | down | 0 | 0 | ||
compute-0-12 | 20 | down | 0 | 0 | ||
compute-0-13 | 20 | down | 0 | 0 | ||
compute-0-3 | 20 | down | 0 | 0 | ||
compute-0-9 | 20 | down | 0 | 0 | ||
Cluster 10
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
There is no running job now.
Cluster 11
Nodes Summary
Total Number of CPUs: 960
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
free | 14 | 73 | 263 | 27.40 |
job-exclusive | 13 | 312 | 0 | 0.00 |
down,job-exclusive | 1 | 24 | 0 | 0.00 |
down | 8 | 192 | 0 | 0.00 |
down,offline | 3 | 72 | 0 | 0.00 |
offline | 1 | 24 | 0 | 0.00 |
Free CPUs (nodewise)
Node name | No. of free CPUs |
---|---|
compute000 | 24 |
compute005 | 8 |
compute010 | 11 |
compute011 | 24 |
compute012 | 24 |
compute013 | 24 |
compute014 | 24 |
compute015 | 24 |
compute017 | 24 |
compute020 | 24 |
compute022 | 24 |
compute024 | 24 |
compute031 | 3 |
compute034 | 1 |
Total | 263 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|
R | |||||||
ganeshchandra | 1 | 12 | 1.25% | 15 days 21:54:28 hrs | 50.01% | 50.01% | |
tanoykanti | 6 | 17 | 1.77% | 19 days 22:11:11 hrs | 143.78% | 113.24% | |
aparajita | 20 | 20 | 2.08% | 70 days 22:28:35 hrs | 97.94% | 98.59% | |
shilendra | 1 | 144 | 15.00% | 27 days 16:31:30 hrs | 100.08% | 100.08% | |
priyaghosh | 80 | 80 | 8.33% | 11 days 05:01:23 hrs | 77.49% | 77.46% | |
debs | 1 | 144 | 15.00% | 9 days 03:50:00 hrs | 100.06% | 100.06% |
Cluster 11
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs | ||
---|---|---|---|---|---|---|
compute000 | 24 | free | 0 | 24 | ||
compute001 | 24 | job-exclusive | 24 | 0 | ||
compute002 | 24 | job-exclusive | 24 | 0 | ||
compute003 | 24 | job-exclusive | 24 | 0 | ||
compute004 | 24 | down,job-exclusive | 24 | 0 | ||
compute005 | 24 | free | 16 | 8 | ||
compute006 | 24 | down | 0 | 0 | ||
compute007 | 24 | down | 0 | 0 | ||
compute008 | 24 | job-exclusive | 24 | 0 | ||
compute009 | 24 | job-exclusive | 24 | 0 | ||
compute010 | 24 | free | 13 | 11 | ||
compute011 | 24 | free | 0 | 24 | ||
compute012 | 24 | free | 0 | 24 | ||
compute013 | 24 | free | 0 | 24 | ||
compute014 | 24 | free | 0 | 24 | ||
compute015 | 24 | free | 0 | 24 | ||
compute016 | 24 | down,offline | 0 | 0 | ||
compute017 | 24 | free | 0 | 24 | ||
compute018 | 24 | down | 0 | 0 | ||
compute019 | 24 | down | 0 | 0 | ||
compute020 | 24 | free | 0 | 24 | ||
compute021 | 24 | down | 12 | 0 | ||
compute022 | 24 | free | 0 | 24 | ||
compute023 | 24 | down,offline | 0 | 0 | ||
compute024 | 24 | free | 0 | 24 | ||
compute025 | 24 | down,offline | 0 | 0 | ||
compute026 | 24 | job-exclusive | 24 | 0 | ||
compute027 | 24 | job-exclusive | 24 | 0 | ||
compute028 | 24 | job-exclusive | 24 | 0 | ||
compute029 | 24 | job-exclusive | 24 | 0 | ||
compute030 | 24 | job-exclusive | 24 | 0 | ||
compute031 | 24 | free | 21 | 3 | ||
compute032 | 24 | job-exclusive | 24 | 0 | ||
compute033 | 24 | offline | 0 | 0 | ||
compute034 | 24 | free | 23 | 1 | ||
compute035 | 24 | job-exclusive | 24 | 0 | ||
compute036 | 24 | job-exclusive | 24 | 0 | ||
compute037 | 24 | down | 0 | 0 | ||
compute038 | 24 | down | 0 | 0 | ||
compute039 | 24 | down | 0 | 0 | ||
Cluster 11
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Job Name | Job State | Walltime Used | No. of CPUs in use | Memory in use | Efficiency† |
---|---|---|---|---|---|---|---|
560857 | ganeshchandra | submit_p.sh | R | 15 days 21:54:29 hrs | 12 | 25.15 GB | 50.01% |
575781 | tanoykanti | sr_2 | R | 99 days 20:06:16 hrs | 1 | 5.96 MB | 90.93% |
577935 | aparajita | k1=10000_N=70 | R | 74 days 18:37:21 hrs | 1 | 1.43 GB | 98.85% |
577936 | aparajita | k1=10000_N=80 | R | 74 days 18:36:05 hrs | 1 | 2.51 GB | 98.85% |
577937 | aparajita | k1=10000_N=90 | R | 74 days 18:34:23 hrs | 1 | 2.89 GB | 98.84% |
577938 | aparajita | k1=10000_N=100 | R | 74 days 18:32:39 hrs | 1 | 4.92 GB | 98.85% |
577939 | aparajita | k1=10000_N=110 | R | 74 days 18:31:20 hrs | 1 | 5.49 GB | 98.84% |
577940 | aparajita | k1=10000_N=120 | R | 74 days 18:30:09 hrs | 1 | 6.17 GB | 98.85% |
577941 | aparajita | k1=10000_N=130 | R | 74 days 18:28:49 hrs | 1 | 10.10 GB | 98.84% |
577942 | aparajita | k1=10000_N=140 | R | 74 days 18:27:25 hrs | 1 | 10.97 GB | 98.85% |
577943 | aparajita | k1=10000_N=150 | R | 74 days 18:25:53 hrs | 1 | 12.12 GB | 100.08% |
577944 | aparajita | k1=10000_N=160 | R | 74 days 18:24:52 hrs | 1 | 19.59 GB | 100.09% |
577945 | aparajita | k1=10000_N=170 | R | 74 days 18:23:42 hrs | 1 | 20.98 GB | 100.08% |
577946 | aparajita | k1=10000_N=180 | R | 74 days 18:22:45 hrs | 1 | 22.55 GB | 100.08% |
577947 | aparajita | k1=10000_N=190 | R | 74 days 18:21:37 hrs | 1 | 23.78 GB | 100.09% |
577948 | aparajita | k1=10000_N=200 | R | 74 days 18:20:39 hrs | 1 | 26.16 GB | 100.09% |
578069 | aparajita | k1=10000_N=128 | R | 72 days 20:47:03 hrs | 1 | 9.93 GB | 100.02% |
579100 | aparajita | k1=50000_N=64 | R | 66 days 23:04:06 hrs | 1 | 1.29 GB | 98.70% |
579101 | aparajita | k1=30000_N=64 | R | 66 days 23:03:09 hrs | 1 | 1.29 GB | 98.70% |
579112 | aparajita | k1=20000_N=64 | R | 66 days 22:52:58 hrs | 1 | 1.29 GB | 98.70% |
582065 | aparajita | k1=5000_N=128_p=2 | R | 63 days 01:57:38 hrs | 1 | 9.93 GB | 97.23% |
584795 | aparajita | k1=10000_N=60_tmax=1 | R | 35 days 03:09:09 hrs | 1 | 834.30 MB | 74.14% |
589467 | shilendra | CsPb | R | 27 days 16:31:30 hrs | 144 | 5.39 GB | 100.08% |
595668 | tanoykanti | QFI_vst_bose_55 | R | 14 days 22:19:00 hrs | 16 | 1.52 GB | 25.02% |
603646 | tanoykanti | b_1 | R | 11 days 22:21:25 hrs | 0 | 941.26 MB | 0.00% |
603647 | tanoykanti | b_3 | R | 11 days 22:21:03 hrs | 0 | 919.36 MB | 0.00% |
603648 | tanoykanti | b_5 | R | 11 days 22:20:42 hrs | 0 | 905.62 MB | 0.00% |
603649 | tanoykanti | b_10 | R | 11 days 22:20:28 hrs | 0 | 921.27 MB | 0.00% |
603758 | priyaghosh | gau_0.01_1 | R | 11 days 05:27:00 hrs | 1 | 29.80 MB | 23.56% |
603759 | priyaghosh | gau_0.01_2 | R | 11 days 05:26:57 hrs | 1 | 25.65 MB | 23.58% |
603760 | priyaghosh | gau_0.01_3 | R | 11 days 05:26:54 hrs | 1 | 23.69 MB | 23.58% |
603761 | priyaghosh | gau_0.01_4 | R | 11 days 05:26:49 hrs | 1 | 25.55 MB | 23.57% |
603762 | priyaghosh | gau_0.03_1 | R | 11 days 05:26:47 hrs | 1 | 23.69 MB | 23.56% |
603763 | priyaghosh | gau_0.03_2 | R | 11 days 05:26:43 hrs | 1 | 22.85 MB | 23.56% |
603764 | priyaghosh | gau_0.03_3 | R | 11 days 05:26:41 hrs | 1 | 22.86 MB | 23.58% |
603765 | priyaghosh | gau_0.03_4 | R | 11 days 05:26:37 hrs | 1 | 23.69 MB | 23.57% |
603766 | priyaghosh | gau_0.05_1 | R | 11 days 05:26:34 hrs | 1 | 23.69 MB | 23.59% |
603767 | priyaghosh | gau_0.05_2 | R | 11 days 05:26:30 hrs | 1 | 23.71 MB | 23.61% |
603768 | priyaghosh | gau_0.05_3 | R | 11 days 05:26:27 hrs | 1 | 23.72 MB | 23.57% |
603769 | priyaghosh | gau_0.05_4 | R | 11 days 05:26:24 hrs | 1 | 23.69 MB | 23.61% |
603770 | priyaghosh | gau_0.08_1 | R | 11 days 05:26:20 hrs | 1 | 24.72 MB | 23.58% |
603771 | priyaghosh | gau_0.08_2 | R | 11 days 05:26:17 hrs | 1 | 23.69 MB | 23.59% |
603772 | priyaghosh | gau_0.08_3 | R | 11 days 05:26:14 hrs | 1 | 22.85 MB | 23.60% |
603773 | priyaghosh | gau_0.08_4 | R | 11 days 05:26:11 hrs | 1 | 25.62 MB | 23.57% |
603774 | priyaghosh | gau_0.1_1 | R | 11 days 05:26:04 hrs | 1 | 22.86 MB | 23.62% |
603775 | priyaghosh | gau_0.1_2 | R | 11 days 05:26:03 hrs | 1 | 25.64 MB | 23.58% |
603776 | priyaghosh | gau_0.1_3 | R | 11 days 05:25:43 hrs | 1 | 22.86 MB | 100.09% |
603777 | priyaghosh | gau_0.1_4 | R | 11 days 05:25:40 hrs | 1 | 29.69 MB | 100.09% |
603943 | priyaghosh | gau_0.01_5 | R | 11 days 05:21:18 hrs | 1 | 22.87 MB | 100.09% |
603948 | priyaghosh | gau_0.01_6 | R | 11 days 05:21:16 hrs | 1 | 23.71 MB | 84.16% |
603952 | priyaghosh | gau_0.01_7 | R | 11 days 05:21:12 hrs | 1 | 23.69 MB | 84.16% |
603956 | priyaghosh | gau_0.01_8 | R | 11 days 05:21:06 hrs | 1 | 20.41 MB | 100.09% |
603960 | priyaghosh | gau_0.03_5 | R | 11 days 05:21:04 hrs | 1 | 20.41 MB | 84.21% |
603965 | priyaghosh | gau_0.03_6 | R | 11 days 05:21:01 hrs | 1 | 20.39 MB | 84.74% |
603969 | priyaghosh | gau_0.03_7 | R | 11 days 05:20:57 hrs | 1 | 22.86 MB | 84.27% |
603974 | priyaghosh | gau_0.03_8 | R | 11 days 05:20:52 hrs | 1 | 20.40 MB | 84.17% |
603977 | priyaghosh | gau_0.05_5 | R | 11 days 05:20:49 hrs | 1 | 20.40 MB | 84.59% |
603981 | priyaghosh | gau_0.05_6 | R | 11 days 05:20:40 hrs | 1 | 23.71 MB | 100.09% |
603985 | priyaghosh | gau_0.05_7 | R | 11 days 05:20:40 hrs | 1 | 20.11 MB | 84.18% |
603989 | priyaghosh | gau_0.05_8 | R | 11 days 05:20:35 hrs | 1 | 22.86 MB | 84.31% |
603993 | priyaghosh | gau_0.08_5 | R | 11 days 05:20:33 hrs | 1 | 20.40 MB | 84.50% |
603998 | priyaghosh | gau_0.08_6 | R | 11 days 05:20:30 hrs | 1 | 20.10 MB | 84.32% |
604002 | priyaghosh | gau_0.08_7 | R | 11 days 05:20:52 hrs | 1 | 20.41 MB | 91.86% |
604007 | priyaghosh | gau_0.08_8 | R | 11 days 05:20:47 hrs | 1 | 20.41 MB | 91.86% |
604011 | priyaghosh | gau_0.1_5 | R | 11 days 05:20:17 hrs | 1 | 20.10 MB | 84.60% |
604016 | priyaghosh | gau_0.1_6 | R | 11 days 05:20:12 hrs | 1 | 20.42 MB | 84.10% |
604020 | priyaghosh | gau_0.1_7 | R | 11 days 05:20:08 hrs | 1 | 20.10 MB | 84.15% |
604024 | priyaghosh | gau_0.1_8 | R | 11 days 05:20:00 hrs | 1 | 24.74 MB | 84.07% |
604182 | priyaghosh | uni_0.01_1 | R | 11 days 03:39:46 hrs | 1 | 25.67 MB | 100.01% |
604186 | priyaghosh | uni_0.01_2 | R | 11 days 05:16:51 hrs | 1 | 25.55 MB | 84.16% |
604190 | priyaghosh | uni_0.01_3 | R | 11 days 05:16:49 hrs | 1 | 25.57 MB | 84.62% |
604194 | priyaghosh | uni_0.01_4 | R | 11 days 05:17:13 hrs | 1 | 25.55 MB | 91.87% |
604198 | priyaghosh | uni_0.03_1 | R | 11 days 05:17:08 hrs | 1 | 23.69 MB | 91.86% |
604202 | priyaghosh | uni_0.03_2 | R | 11 days 05:16:38 hrs | 1 | 23.70 MB | 100.09% |
604207 | priyaghosh | uni_0.03_3 | R | 11 days 05:16:37 hrs | 1 | 25.54 MB | 100.09% |
604211 | priyaghosh | uni_0.03_4 | R | 11 days 05:16:20 hrs | 1 | 23.69 MB | 84.11% |
604216 | priyaghosh | uni_0.05_1 | R | 11 days 05:16:44 hrs | 1 | 23.69 MB | 91.85% |
604220 | priyaghosh | uni_0.05_2 | R | 11 days 05:16:43 hrs | 1 | 22.85 MB | 91.86% |
604224 | priyaghosh | uni_0.05_3 | R | 11 days 05:16:16 hrs | 1 | 25.58 MB | 100.08% |
604228 | priyaghosh | uni_0.05_4 | R | 11 days 05:16:14 hrs | 1 | 23.69 MB | 100.09% |
604233 | priyaghosh | uni_0.08_1 | R | 11 days 05:16:12 hrs | 1 | 27.54 MB | 100.09% |
604237 | priyaghosh | uni_0.08_2 | R | 11 days 05:16:05 hrs | 1 | 29.75 MB | 84.57% |
604241 | priyaghosh | uni_0.08_3 | R | 11 days 05:16:01 hrs | 1 | 23.70 MB | 84.60% |
604245 | priyaghosh | uni_0.08_4 | R | 11 days 03:39:45 hrs | 1 | 22.85 MB | 100.03% |
604250 | priyaghosh | uni_0.1_1 | R | 11 days 05:16:15 hrs | 1 | 25.56 MB | 91.85% |
604254 | priyaghosh | uni_0.1_2 | R | 11 days 05:15:49 hrs | 1 | 25.54 MB | 100.09% |
604258 | priyaghosh | uni_0.1_3 | R | 11 days 05:15:47 hrs | 1 | 23.68 MB | 100.09% |
604262 | priyaghosh | uni_0.1_4 | R | 11 days 05:15:45 hrs | 1 | 25.55 MB | 100.08% |
604435 | priyaghosh | uni_0.01_5 | R | 11 days 03:39:28 hrs | 1 | 23.68 MB | 100.03% |
604438 | priyaghosh | uni_0.01_6 | R | 11 days 05:12:26 hrs | 1 | 22.86 MB | 91.86% |
604443 | priyaghosh | uni_0.01_7 | R | 11 days 05:11:51 hrs | 1 | 22.86 MB | 100.09% |
604447 | priyaghosh | uni_0.01_8 | R | 11 days 05:11:47 hrs | 1 | 22.86 MB | 100.09% |
604451 | priyaghosh | uni_0.03_5 | R | 11 days 05:12:12 hrs | 1 | 20.40 MB | 91.86% |
604455 | priyaghosh | uni_0.03_6 | R | 11 days 05:11:45 hrs | 1 | 20.40 MB | 100.09% |
604459 | priyaghosh | uni_0.03_7 | R | 11 days 03:39:27 hrs | 1 | 20.39 MB | 100.00% |
604463 | priyaghosh | uni_0.03_8 | R | 11 days 03:39:25 hrs | 1 | 20.39 MB | 100.00% |
604467 | priyaghosh | uni_0.05_5 | R | 11 days 05:12:08 hrs | 1 | 20.39 MB | 91.86% |
604471 | priyaghosh | uni_0.05_6 | R | 11 days 03:39:23 hrs | 1 | 22.85 MB | 100.03% |
604476 | priyaghosh | uni_0.05_7 | R | 11 days 03:39:20 hrs | 1 | 20.39 MB | 99.99% |
604480 | priyaghosh | uni_0.05_8 | R | 11 days 05:11:37 hrs | 1 | 23.70 MB | 100.09% |
604484 | priyaghosh | uni_0.08_5 | R | 11 days 03:39:17 hrs | 1 | 20.39 MB | 99.98% |
604488 | priyaghosh | uni_0.08_6 | R | 11 days 03:39:15 hrs | 1 | 22.85 MB | 100.01% |
604492 | priyaghosh | uni_0.08_7 | R | 11 days 03:39:13 hrs | 1 | 20.10 MB | 100.03% |
604496 | priyaghosh | uni_0.08_8 | R | 11 days 03:38:46 hrs | 1 | 20.09 MB | 84.17% |
604500 | priyaghosh | uni_0.1_5 | R | 11 days 03:39:06 hrs | 1 | 20.38 MB | 99.98% |
604504 | priyaghosh | uni_0.1_6 | R | 11 days 03:39:05 hrs | 1 | 20.40 MB | 100.01% |
604508 | priyaghosh | uni_0.1_7 | R | 11 days 03:39:01 hrs | 1 | 20.09 MB | 100.02% |
604512 | priyaghosh | uni_0.1_8 | R | 11 days 03:38:33 hrs | 1 | 22.85 MB | 84.07% |
608007 | debs | CsPb_441ph | R | 9 days 03:50:00 hrs | 144 | 62.61 GB | 100.06% |
Cluster 12
Nodes Summary
Total Number of CPUs: 1056
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
free | 10 | 105 | 135 | 12.78 |
job-busy | 33 | 792 | 0 | 0.00 |
down | 1 | 24 | 0 | 0.00 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | ||
node2 | 15 | |
node8 | 8 | |
node16 | 12 | |
node17 | 12 | |
node6 | 24 | |
node7 | 4 | |
node13 | 20 | |
node36 | 4 | |
node37 | 12 | |
node41 | 24 | |
Total | 135 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | ||||||||
workq | ||||||||
debarupa11 | 40 | 40 | 3.79% | 9 days 23:44:49 hrs | 99.87% | 99.39% | ||
souravmal | 1 | 24 | 2.27% | 17 days 13:59:59 hrs | 100.02% | 100.02% | ||
shuvam | 17 | 65 | 6.16% | 9 days 22:10:43 hrs | 26.17% | 25.50% | ||
bikashvbu | 4 | 96 | 9.09% | 1 day 13:16:54 hrs | 99.96% | 99.98% | ||
swapnild | 2 | 144 | 13.64% | 3 days 11:13:32 hrs | 100.02% | 100.02% | ||
pradhi | 2 | 336 | 31.82% | 1 day 19:10:43 hrs | 99.83% | 99.83% | ||
ponnappa | 1 | 24 | 2.27% | 1 day 23:20:31 hrs | 100.02% | 100.02% | ||
shilendra | 1 | 144 | 13.64% | 15:20:04 hrs | 100.07% | 100.07% | ||
pchaki | 12 | 24 | 2.27% | 0:04:26 hrs | 99.93% | 99.92% |
Job State | Queue | User | No. of Jobs | No. of CPUs Requested |
---|---|---|---|---|
Q | ||||
workq | ||||
sudip | 2 | 168 |
Cluster 12
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs | |
---|---|---|---|---|---|---|
node2 | workq | 24 | free | 9 | 15 | |
node3 | workq | 24 | job-busy | 24 | 0 | |
node4 | workq | 24 | job-busy | 24 | 0 | |
node5 | workq | 24 | job-busy | 24 | 0 | |
node8 | workq | 24 | free | 16 | 8 | |
node9 | workq | 24 | job-busy | 24 | 0 | |
node10 | workq | 24 | job-busy | 24 | 0 | |
node11 | workq | 24 | job-busy | 24 | 0 | |
node12 | workq | 24 | job-busy | 24 | 0 | |
node14 | workq | 24 | job-busy | 24 | 0 | |
node15 | workq | 24 | job-busy | 24 | 0 | |
node16 | workq | 24 | free | 12 | 12 | |
node17 | workq | 24 | free | 12 | 12 | |
node18 | workq | 24 | job-busy | 24 | 0 | |
node19 | workq | 24 | job-busy | 24 | 0 | |
node20 | workq | 24 | job-busy | 24 | 0 | |
node21 | workq | 24 | job-busy | 24 | 0 | |
node22 | workq | 24 | job-busy | 24 | 0 | |
node1 | workq | 24 | job-busy | 24 | 0 | |
node23 | workq | 24 | job-busy | 24 | 0 | |
node24 | workq | 24 | job-busy | 24 | 0 | |
node25 | workq | 24 | job-busy | 24 | 0 | |
node6 | workq | 24 | free | 0 | 24 | |
node7 | workq | 24 | free | 20 | 4 | |
node26 | workq | 24 | down | 0 | 0 | |
node27 | workq | 24 | job-busy | 24 | 0 | |
node13 | workq | 24 | free | 4 | 20 | |
node28 | workq | 24 | job-busy | 24 | 0 | |
node29 | workq | 24 | job-busy | 24 | 0 | |
node30 | workq | 24 | job-busy | 24 | 0 | |
node31 | workq | 24 | job-busy | 24 | 0 | |
node32 | workq | 24 | job-busy | 24 | 0 | |
node34 | workq | 24 | job-busy | 24 | 0 | |
node35 | workq | 24 | job-busy | 24 | 0 | |
node36 | workq | 24 | free | 20 | 4 | |
node37 | workq | 24 | free | 12 | 12 | |
node38 | workq | 24 | job-busy | 24 | 0 | |
node39 | workq | 24 | job-busy | 24 | 0 | |
node40 | workq | 24 | job-busy | 24 | 0 | |
node41 | workq | 24 | free | 0 | 24 | |
node42 | workq | 24 | job-busy | 24 | 0 | |
node33 | workq | 24 | job-busy | 24 | 0 | |
node43 | workq | 24 | job-busy | 24 | 0 | |
node44 | workq | 24 | job-busy | 24 | 0 | |
Cluster 12
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPUs in use | Memory in use | Efficiency† |
---|---|---|---|---|---|---|---|---|
93725 | debarupa11 | workq | GMBF100000000 | R | 33 days 02:58:34 hrs | 1 | 6.76 MB | 95.95% |
93726 | debarupa11 | workq | GMBF1000000000 | R | 33 days 02:58:25 hrs | 1 | 6.75 MB | 95.95% |
94538 | souravmal | workq | smps-914568 | R | 17 days 13:59:59 hrs | 24 | 10.17 GB | 100.02% |
95168 | shuvam | workq | zpt30 | R | 10 days 05:20:55 hrs | 4 | 12.45 MB | 25.02% |
95176 | shuvam | workq | zpt10 | R | 10 days 03:12:30 hrs | 4 | 12.48 MB | 25.02% |
95198 | shuvam | workq | zpt_coh_0.15 | R | 9 days 23:48:34 hrs | 4 | 12.55 MB | 25.02% |
95199 | shuvam | workq | zpt_coh_0.20 | R | 9 days 23:48:34 hrs | 4 | 14.44 MB | 25.02% |
95200 | shuvam | workq | zpt_coh_0.25 | R | 9 days 23:50:11 hrs | 4 | 12.54 MB | 25.02% |
95202 | shuvam | workq | zpt_coh_0.35 | R | 9 days 23:50:11 hrs | 4 | 12.54 MB | 25.02% |
95204 | shuvam | workq | zpt_coh_0.45 | R | 9 days 23:50:11 hrs | 4 | 12.54 MB | 25.02% |
95205 | shuvam | workq | zpt_coh_0.50 | R | 9 days 23:49:29 hrs | 4 | 12.54 MB | 25.02% |
95206 | shuvam | workq | zpt_coh_0.55 | R | 9 days 23:49:29 hrs | 4 | 14.45 MB | 25.02% |
95208 | shuvam | workq | zpt_coh_0.65 | R | 9 days 23:49:29 hrs | 4 | 14.43 MB | 25.02% |
95209 | shuvam | workq | zpt_coh_0.70 | R | 9 days 23:49:29 hrs | 4 | 12.54 MB | 25.02% |
95210 | shuvam | workq | zpt_coh_0.75 | R | 9 days 23:49:29 hrs | 4 | 14.38 MB | 25.02% |
95212 | shuvam | workq | zpt_coh_0.85 | R | 9 days 23:50:21 hrs | 4 | 12.54 MB | 25.02% |
95213 | shuvam | workq | zpt_coh_0.90 | R | 9 days 23:50:21 hrs | 4 | 12.54 MB | 25.02% |
95214 | shuvam | workq | zpt_coh_0.95 | R | 9 days 23:50:21 hrs | 4 | 12.53 MB | 25.02% |
95226 | shuvam | workq | zpt4d_coh_0.65 | R | 9 days 22:57:18 hrs | 4 | 12.86 MB | 25.02% |
95319 | debarupa11 | workq | 3D0.52 | R | 9 days 00:07:54 hrs | 1 | 6.86 MB | 100.08% |
95320 | debarupa11 | workq | 3D0.54 | R | 9 days 00:08:02 hrs | 1 | 6.87 MB | 100.07% |
95322 | debarupa11 | workq | 3D0.58 | R | 9 days 00:07:45 hrs | 1 | 6.87 MB | 100.07% |
95323 | debarupa11 | workq | 3D0.60 | R | 9 days 00:07:36 hrs | 1 | 6.87 MB | 100.07% |
95325 | debarupa11 | workq | 3D0.64 | R | 9 days 00:07:18 hrs | 1 | 8.87 MB | 100.07% |
95360 | debarupa11 | workq | 5D0.66 | R | 8 days 23:46:35 hrs | 1 | 9.10 MB | 100.07% |
95361 | debarupa11 | workq | 5D0.68 | R | 8 days 23:46:25 hrs | 1 | 7.10 MB | 100.07% |
95364 | debarupa11 | workq | 5D0.74 | R | 8 days 23:45:55 hrs | 1 | 7.10 MB | 100.07% |
95366 | debarupa11 | workq | 5D0.78 | R | 8 days 23:45:35 hrs | 1 | 7.11 MB | 100.07% |
95377 | debarupa11 | workq | 4D0.60 | R | 8 days 23:38:38 hrs | 1 | 8.92 MB | 100.07% |
95379 | debarupa11 | workq | 4D0.62 | R | 8 days 23:36:58 hrs | 1 | 8.93 MB | 100.07% |
95380 | debarupa11 | workq | 4D0.63 | R | 8 days 23:38:11 hrs | 1 | 6.93 MB | 100.07% |
95381 | debarupa11 | workq | 4D0.64 | R | 8 days 23:38:01 hrs | 1 | 6.92 MB | 100.07% |
95383 | debarupa11 | workq | 4D0.66 | R | 8 days 23:37:43 hrs | 1 | 8.91 MB | 100.07% |
95384 | debarupa11 | workq | 4D0.67 | R | 8 days 23:37:34 hrs | 1 | 6.93 MB | 100.07% |
95385 | debarupa11 | workq | 4D0.68 | R | 8 days 23:37:25 hrs | 1 | 8.92 MB | 100.07% |
95386 | debarupa11 | workq | 4D0.69 | R | 8 days 23:35:54 hrs | 1 | 8.93 MB | 100.07% |
95387 | debarupa11 | workq | 4D0.70 | R | 8 days 23:37:06 hrs | 1 | 6.93 MB | 100.07% |
95388 | debarupa11 | workq | 4D0.71 | R | 8 days 23:36:57 hrs | 1 | 6.92 MB | 100.08% |
95389 | debarupa11 | workq | 4D0.72 | R | 8 days 23:36:48 hrs | 1 | 8.92 MB | 100.07% |
95390 | debarupa11 | workq | 4D0.73 | R | 8 days 23:36:39 hrs | 1 | 6.92 MB | 100.07% |
95394 | debarupa11 | workq | 3D0.54 | R | 8 days 23:32:29 hrs | 1 | 6.86 MB | 100.08% |
95395 | debarupa11 | workq | 3D0.56 | R | 8 days 23:32:20 hrs | 1 | 6.87 MB | 100.08% |
95396 | debarupa11 | workq | 3D0.58 | R | 8 days 23:32:11 hrs | 1 | 6.87 MB | 100.08% |
95397 | debarupa11 | workq | 3D0.60 | R | 8 days 23:32:02 hrs | 1 | 6.87 MB | 100.07% |
95399 | debarupa11 | workq | 3D0.64 | R | 8 days 23:31:45 hrs | 1 | 6.86 MB | 100.07% |
95400 | debarupa11 | workq | 3D0.66 | R | 8 days 23:31:36 hrs | 1 | 6.86 MB | 100.08% |
95407 | debarupa11 | workq | 3D0.53 | R | 8 days 20:48:37 hrs | 1 | 6.87 MB | 100.07% |
95408 | debarupa11 | workq | 3D0.54 | R | 8 days 20:48:28 hrs | 1 | 6.86 MB | 100.07% |
95410 | debarupa11 | workq | 3D0.56 | R | 8 days 20:48:10 hrs | 1 | 6.87 MB | 100.07% |
95412 | debarupa11 | workq | 3D0.58 | R | 8 days 20:47:53 hrs | 1 | 8.86 MB | 100.07% |
95413 | debarupa11 | workq | 3D0.59 | R | 8 days 20:47:44 hrs | 1 | 6.86 MB | 100.07% |
95415 | debarupa11 | workq | 3D0.61 | R | 8 days 20:48:48 hrs | 1 | 6.86 MB | 100.08% |
95416 | debarupa11 | workq | 3D0.62 | R | 8 days 20:48:39 hrs | 1 | 6.86 MB | 100.08% |
95417 | debarupa11 | workq | 3D0.63 | R | 8 days 20:47:09 hrs | 1 | 6.87 MB | 100.07% |
95419 | debarupa11 | workq | 3D0.65 | R | 8 days 20:48:13 hrs | 1 | 7.05 MB | 100.07% |
95420 | debarupa11 | workq | 3D0.66 | R | 8 days 20:48:04 hrs | 1 | 6.87 MB | 100.07% |
96214 | bikashvbu | workq | Tuya | R | 4 days 16:13:30 hrs | 24 | 12.75 GB | 100.02% |
96217 | shuvam | workq | zpt4d_coh_0.95 | R | 4 days 04:29:30 hrs | 1 | 13.11 MB | 100.07% |
96221 | swapnild | workq | GenAI | R | 3 days 22:23:10 hrs | 72 | 13.68 GB | 100.02% |
96281 | swapnild | workq | GenAI_e | R | 3 days 00:03:54 hrs | 72 | 13.77 GB | 100.02% |
96314 | pradhi | workq | 3d_alf3_rlx | R | 2 days 21:23:06 hrs | 168 | 15.80 GB | 99.83% |
96418 | ponnappa | workq | H23 | R | 1 day 23:20:31 hrs | 24 | 18.69 GB | 100.02% |
96424 | debarupa11 | workq | 3D0.00 | R | 1 day 23:30:48 hrs | 1 | 7.09 MB | 100.07% |
96459 | pradhi | workq | 3d_al2o3_rlx | R | 16:58:20 hrs | 168 | 16.09 GB | 99.83% |
96464 | bikashvbu | workq | Mousumi | R | 1 day 02:03:56 hrs | 24 | 5.03 GB | 99.81% |
96491 | shilendra | workq | opt_FeCo | R | 15:20:04 hrs | 144 | 19.43 GB | 100.07% |
96493 | bikashvbu | workq | JCM | R | 08:29:36 hrs | 24 | 7.88 GB | 100.01% |
96494 | sudip | workq | Hf2Te10_211 | Q | 96 | 0.00% | ||
96496 | bikashvbu | workq | Mousumi | R | 02:20:36 hrs | 24 | 6.08 GB | 100.00% |
96497 | pchaki | workq | cheb_0.4 | R | 00:10:51 hrs | 2 | 4.64 GB | 99.92% |
96498 | pchaki | workq | cheb_.01 | R | 00:06:28 hrs | 2 | 4.64 GB | 100.00% |
96499 | pchaki | workq | cheb_.02 | R | 00:05:57 hrs | 2 | 4.64 GB | 100.00% |
96500 | pchaki | workq | cheb_.03 | R | 00:05:26 hrs | 2 | 4.64 GB | 99.85% |
96501 | pchaki | workq | cheb_.04 | R | 00:04:55 hrs | 2 | 4.64 GB | 99.83% |
96502 | pchaki | workq | cheb_.05 | R | 00:04:23 hrs | 2 | 4.64 GB | 100.00% |
96503 | pchaki | workq | cheb_.06 | R | 00:03:52 hrs | 2 | 4.64 GB | 100.00% |
96504 | sudip | workq | Hf2Te10_211 | Q | 72 | 0.00% | ||
96505 | pchaki | workq | cheb_.07 | R | 00:03:21 hrs | 2 | 4.64 GB | 99.75% |
96506 | pchaki | workq | cheb_.08 | R | 00:02:50 hrs | 2 | 4.64 GB | 99.41% |
96507 | pchaki | workq | cheb_.09 | R | 00:02:18 hrs | 2 | 4.64 GB | 100.36% |
96508 | pchaki | workq | cheb_.10 | R | 00:01:47 hrs | 2 | 4.64 GB | 100.00% |
96509 | pchaki | workq | cheb_.11 | R | 00:01:16 hrs | 2 | 4.64 GB | 100.00% |
Cluster 13
Nodes Summary
Total Number of CPUs: 1024
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 20 | 640 | 0 | 0.00 |
down | 2 | 64 | 0 | 0.00 |
free | 10 | 0 | 320 | 31.25 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | ||
c13node20 | 32 | |
c13node13 | 32 | |
c13node21 | 32 | |
neutrino | ||
c13node24 | 32 | |
c13node25 | 32 | |
c13node26 | 32 | |
c13node27 | 32 | |
c13node28 | 32 | |
c13node30 | 32 | |
c13node31 | 32 | |
Total | 320 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | ||||||||
workq | ||||||||
sankalpa | 1 | 128 | 12.50% | 32 days 03:50:15 hrs | 99.66% | 99.66% | ||
jagjitkaur | 1 | 128 | 12.50% | 16 days 17:46:15 hrs | 99.64% | 99.64% | ||
ayushitripathi | 1 | 128 | 12.50% | 11 days 09:07:09 hrs | 99.67% | 99.67% | ||
ponnappa | 1 | 128 | 12.50% | 4 days 08:29:00 hrs | 99.66% | 99.66% | ||
shilendra | 1 | 128 | 12.50% | 17:59:39 hrs | 99.66% | 99.66% |
Job State | Queue | User | No. of Jobs | No. of CPUs Requested |
---|---|---|---|---|
Q | ||||
workq | ||||
pradhi | 3 | 384 | ||
shilendra | 1 | 128 |
Cluster 13
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs | |
---|---|---|---|---|---|---|
c13node1 | workq | 32 | job-busy | 32 | 0 | |
c13node2 | workq | 32 | job-busy | 32 | 0 | |
c13node3 | workq | 32 | job-busy | 32 | 0 | |
c13node4 | workq | 32 | job-busy | 32 | 0 | |
c13node5 | workq | 32 | job-busy | 32 | 0 | |
c13node7 | workq | 32 | job-busy | 32 | 0 | |
c13node8 | workq | 32 | job-busy | 32 | 0 | |
c13node9 | workq | 32 | job-busy | 32 | 0 | |
c13node10 | workq | 32 | job-busy | 32 | 0 | |
c13node11 | workq | 32 | job-busy | 32 | 0 | |
c13node12 | workq | 32 | job-busy | 32 | 0 | |
c13node14 | workq | 32 | job-busy | 32 | 0 | |
c13node15 | workq | 32 | job-busy | 32 | 0 | |
c13node0 | workq | 32 | job-busy | 32 | 0 | |
c13node16 | workq | 32 | job-busy | 32 | 0 | |
c13node17 | workq | 32 | job-busy | 32 | 0 | |
c13node18 | workq | 32 | job-busy | 32 | 0 | |
c13node6 | workq | 32 | down | 0 | 0 | |
c13node19 | workq | 32 | job-busy | 32 | 0 | |
c13node20 | workq | 32 | free | 0 | 32 | |
c13node22 | workq | 32 | job-busy | 32 | 0 | |
c13node13 | workq | 32 | free | 0 | 32 | |
c13node23 | workq | 32 | job-busy | 32 | 0 | |
c13node21 | workq | 32 | free | 0 | 32 | |
c13node24 | neutrino | 32 | free | 0 | 32 | |
c13node25 | neutrino | 32 | free | 0 | 32 | |
c13node26 | neutrino | 32 | free | 0 | 32 | |
c13node27 | neutrino | 32 | free | 0 | 32 | |
c13node28 | neutrino | 32 | free | 0 | 32 | |
c13node29 | neutrino | 32 | down | 0 | 0 | |
c13node30 | neutrino | 32 | free | 0 | 32 | |
c13node31 | neutrino | 32 | free | 0 | 32 | |
Cluster 13
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPUs in use | Memory in use | Efficiency† |
---|---|---|---|---|---|---|---|---|
398079 | sankalpa | workq | CeSbTe_scfh | R | 32 days 03:50:15 hrs | 128 | 39.47 GB | 99.66% |
398331 | jagjitkaur | workq | nafe | R | 16 days 17:46:15 hrs | 128 | 56.56 GB | 99.64% |
398384 | ayushitripathi | workq | ErOCl12 | R | 11 days 09:07:09 hrs | 128 | 171.83 GB | 99.67% |
398439 | ponnappa | workq | ph61_72 | R | 4 days 08:29:00 hrs | 128 | 31.56 GB | 99.66% |
398464 | shilendra | workq | GGCO_h33 | R | 17:59:39 hrs | 128 | 27.17 GB | 99.66% |
398465 | pradhi | workq | 3cat_rlx | Q | | 128 | | 0.00% |
398466 | pradhi | workq | lytcf_dos | Q | | 128 | | 0.00% |
398469 | shilendra | workq | GGCO_h33 | Q | | 128 | | 0.00% |
398470 | pradhi | workq | lytcf_rlx | Q | | 128 | | 0.00% |
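The † efficiency reported in the last column can be recomputed from a job's accounting fields. A minimal sketch (the function name and the example figures are illustrative, not taken from any scheduler API):

```python
def job_efficiency(cpu_time_s: float, walltime_s: float, ncpus: int) -> float:
    """Efficiency of parallelization = CPU time / (Walltime x No. of CPUs assigned)."""
    return cpu_time_s / (walltime_s * ncpus)

# Hypothetical job: 128 CPUs for 1 hour of walltime, accumulating 127.5 CPU-hours.
eff = job_efficiency(cpu_time_s=127.5 * 3600, walltime_s=3600, ncpus=128)
print(f"{eff:.2%}")  # → 99.61%
```

Queued (Q) jobs have consumed no CPU time yet, which is why their efficiency shows as 0.00%.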
Cluster 14
Nodes Summary
Total Number of CPUs: 1040

State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 11 | 616 | 0 | 0.00 |
free | 8 | 144 | 280 | 26.92 |
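The percentage column follows directly from the totals. A worked check using the figures in this table:

```python
TOTAL_CPUS = 1040   # total CPUs in Cluster 14
free_cpus = 280     # free CPUs across the 8 "free" nodes

pct_free = 100 * free_cpus / TOTAL_CPUS
print(f"{pct_free:.2f}")  # → 26.92
```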
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | node2 | 2 |
workq | node5 | 2 |
workq | node8 | 44 |
workq | node9 | 56 |
workq | node10 | 32 |
workq | node13 | 56 |
workq | node15 | 56 |
neutrino | gpu1 | 32 |
Total | | 280 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)

†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)

[Sums are over all the running jobs.]
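The two summary metrics defined above can be sketched as follows. The job tuples are hypothetical, not read from the scheduler; the point is that the two sums weight jobs differently, so the metrics diverge when jobs have unequal walltimes:

```python
# Each running job: (cpu_time, walltime, ncpus), all times in seconds.
jobs = [
    (127.5 * 3600, 3600.0, 128),     # hypothetical well-scaled 128-CPU job
    (0.45 * 7200 * 4, 7200.0, 4),    # hypothetical poorly scaled 4-CPU job (45% efficient)
]

# † Avg. Efficiency per CPU = Σ(CPU time / Walltime) / Σ(No. of CPUs assigned)
avg_eff_per_cpu = sum(c / w for c, w, _ in jobs) / sum(n for _, _, n in jobs)

# †† Overall Efficiency = Σ CPU time / Σ(Walltime × No. of CPUs assigned)
overall_eff = sum(c for c, _, _ in jobs) / sum(w * n for _, w, n in jobs)

print(f"{avg_eff_per_cpu:.2%}, {overall_eff:.2%}")  # → 97.95%, 96.40%
```

With a single running job per user, both metrics coincide, which is why the two rightmost columns below are identical row by row.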
Job State | Queue | User | No. of Jobs | No. of CPUs in use | % of total CPUs in use | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | slgupta | 1 | 56 | 5.38% | 29 days 21:39:41 hrs | 99.56% | 99.56% |
R | workq | tisita | 1 | 108 | 10.38% | 20 days 11:34:35 hrs | 99.46% | 99.46% |
R | workq | tanmoymondal | 9 | 36 | 3.46% | 7 days 21:37:00 hrs | 46.61% | 46.61% |
R | workq | shilendra | 1 | 112 | 10.77% | 7 days 19:09:49 hrs | 99.55% | 99.55% |
R | workq | vanshreep | 1 | 56 | 5.38% | 2 days 21:47:38 hrs | 99.67% | 99.67% |
R | workq | pradhi | 1 | 112 | 10.77% | 2 days 21:20:58 hrs | 99.50% | 99.50% |
R | workq | ponnappa | 1 | 112 | 10.77% | 07:25:39 hrs | 98.32% | 98.32% |
R | workq | swapnild | 1 | 112 | 10.77% | 15:35:04 hrs | 99.75% | 99.75% |
R | workq | tanoykanti | 8 | 56 | 5.38% | 01:14:59 hrs | 99.78% | 99.78% |
Job State | Queue | User | No. of Jobs | No. of CPUs requested |
---|---|---|---|---|
Q | workq | vanshreep | 1 | 112 |
Q | workq | pradhi | 1 | 112 |
Cluster 14
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs | |
---|---|---|---|---|---|---|
node1 | workq | 56 | job-busy | 56 | 0 | |
node2 | workq | 56 | free | 54 | 2 | |
node3 | workq | 56 | job-busy | 56 | 0 | |
node4 | workq | 56 | job-busy | 56 | 0 | |
node5 | workq | 56 | free | 54 | 2 | |
node6 | workq | 56 | job-busy | 56 | 0 | |
node7 | workq | 56 | job-busy | 56 | 0 | |
node8 | workq | 56 | free | 12 | 44 | |
node9 | workq | 56 | free | 0 | 56 | |
node10 | workq | 56 | free | 24 | 32 | |
node11 | workq | 56 | job-busy | 56 | 0 | |
node12 | workq | 56 | job-busy | 56 | 0 | |
node13 | workq | 56 | free | 0 | 56 | |
node14 | workq | 56 | job-busy | 56 | 0 | |
node15 | workq | 56 | free | 0 | 56 | |
node16 | workq | 56 | job-busy | 56 | 0 | |
node17 | workq | 56 | job-busy | 56 | 0 | |
node18 | workq | 56 | job-busy | 56 | 0 | |
gpu1 | neutrino | 32 | free | 0 | 32 | |
Cluster 14
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPUs in use | Memory in use | Efficiency† |
---|---|---|---|---|---|---|---|---|
12847.c14m1.clusternet | slgupta@c14m2.clusternet | workq | 0104_LMTO | R | 29 days 21:39:41 hrs | 56 | 137.28 GB | 99.56% |
13133.c14m1.clusternet | tisita@c14m2.clusternet | workq | on_P_neb | R | 20 days 11:34:35 hrs | 108 | 14.50 GB | 99.46% |
13647.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.0004 | R | 7 days 21:37:55 hrs | 4 | 122.50 MB | 40.71% |
13648.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.0008 | R | 7 days 21:37:52 hrs | 4 | 116.48 MB | 40.72% |
13649.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.00125 | R | 7 days 21:37:50 hrs | 4 | 120.68 MB | 40.69% |
13650.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.00175 | R | 7 days 21:36:40 hrs | 4 | 129.62 MB | 49.57% |
13651.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.0025 | R | 7 days 21:36:38 hrs | 4 | 127.14 MB | 49.56% |
13652.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.003 | R | 7 days 21:36:35 hrs | 4 | 128.31 MB | 49.57% |
13653.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.008 | R | 7 days 21:36:33 hrs | 4 | 126.63 MB | 49.57% |
13654.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.01 | R | 7 days 21:36:31 hrs | 4 | 128.59 MB | 49.56% |
13655.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.02 | R | 7 days 21:36:28 hrs | 4 | 128.62 MB | 49.56% |
13660.c14m1.clusternet | shilendra@c14m2.clusternet | workq | CGGO_hbBi | R | 7 days 19:09:49 hrs | 112 | 53.56 GB | 99.55% |
13783.c14m1.clusternet | vanshreep@c14m2.clusternet | workq | cre | R | 2 days 21:47:38 hrs | 56 | 83.00 GB | 99.67% |
13784.c14m1.clusternet | vanshreep@c14m2.clusternet | workq | shl | Q | | 112 | | 0.00% |
13785.c14m1.clusternet | pradhi@c14m2.clusternet | workq | 3d_al2o3_rlx | R | 2 days 21:20:58 hrs | 112 | 78.95 GB | 99.50% |
13786.c14m1.clusternet | pradhi@c14m2.clusternet | workq | 3d_alf3_rlx | Q | | 112 | | 0.00% |
13857.c14m1.clusternet | ponnappa@c14m2.clusternet | workq | SSe | R | 07:25:39 hrs | 112 | 497.46 GB | 98.32% |
13923.c14m1.clusternet | swapnild@c14m2.clusternet | workq | InAs_blk1 | R | 15:35:04 hrs | 112 | 249.79 GB | 99.75% |
13966.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | test_bose_82_U | R | 01:29:24 hrs | 7 | 282.14 MB | 99.79% |
13967.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | test_bose_83_U | R | 01:27:33 hrs | 7 | 2.80 GB | 99.78% |
13968.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | bose83a0.5_test | R | 01:15:32 hrs | 7 | 2.80 GB | 99.78% |
13969.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | bose83a1_test | R | 01:14:45 hrs | 7 | 2.80 GB | 99.79% |
13970.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | bose83a2_test | R | 01:10:33 hrs | 7 | 2.80 GB | 99.78% |
13971.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | bose83a3_test | R | 01:09:21 hrs | 7 | 2.80 GB | 99.77% |
13972.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | bose83a5_test | R | 01:08:26 hrs | 7 | 2.80 GB | 99.77% |
13973.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | test_fermi_83_U | R | 01:04:23 hrs | 7 | 59.32 GB | 99.76% |