Metric | r0 | r1 | r2 | r3
---|---|---|---|---
Total Time (s) | 12.94 | 13.01 | 11.17 | 12.85
Max (Thread Active Time) (s) | 12.26 | 12.30 | 10.55 | 12.10
Average Active Time (s) | 11.91 | 11.95 | 10.27 | 11.79
Activity Ratio (%) | 96.2 | 96.1 | 96.9 | 96.1
Average number of active threads | 88.320 | 88.170 | 88.277 | 88.031
Affinity Stability (%) | 99.5 | 98.0 | 99.3 | 97.9
Time in analyzed loops (%) | 69.3 | 68.9 | 73.2 | 70.0
Time in analyzed innermost loops (%) | 68.5 | 68.3 | 72.4 | 66.8
Time in user code (%) | 70.1 | 69.4 | 74.0 | 70.5
Compilation Options Score (%) | 99.2 | 75.0 | 99.3 | 87.5
Array Access Efficiency (%) | 66.6 | 64.0 | 92.4 | 63.2
Potential Speedups | | | |
Perfect Flow Complexity | 1.00 | 1.00 | 1.00 | 1.00
Perfect OpenMP/MPI/Pthread/TBB | 1.22 | 1.31 | 1.23 | 1.29
Perfect OpenMP/MPI/Pthread/TBB + Perfect Load Distribution | 1.47 | 1.48 | 1.38 | 1.45
No Scalar Integer: Potential Speedup | 1.00 | 1.00 | 1.14 | 1.00
No Scalar Integer: Nb Loops to get 80% | 3 | 4 | 1 | 3
FP Vectorised: Potential Speedup | 1.00 | 1.00 | 1.06 | 1.00
FP Vectorised: Nb Loops to get 80% | 2 | 2 | 1 | 2
Fully Vectorised: Potential Speedup | 1.01 | 1.00 | 1.28 | 1.00
Fully Vectorised: Nb Loops to get 80% | 5 | 7 | 1 | 7
Only FP Arithmetic: Potential Speedup | 1.01 | 1.01 | 1.58 | 1.01
Only FP Arithmetic: Nb Loops to get 80% | 7 | 9 | 1 | 7
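
A rough way to read the Potential Speedups block: each factor is a projected whole-application speedup, so dividing a run's Total Time by it gives an estimated wall-clock time if that optimization were fully applied. The short sketch below is not part of the MAQAO output; it simply replays that arithmetic on the figures from the table above.

```python
# Illustrative only: project wall-clock time from the whole-application
# potential speedups in the table (projected_time = total_time / speedup).
total_time = {"r0": 12.94, "r1": 13.01, "r2": 11.17, "r3": 12.85}  # seconds

fully_vectorised = {"r0": 1.01, "r1": 1.00, "r2": 1.28, "r3": 1.00}
only_fp_arith    = {"r0": 1.01, "r1": 1.01, "r2": 1.58, "r3": 1.01}

for run, t in total_time.items():
    print(f"{run}: {t:.2f} s measured, "
          f"{t / fully_vectorised[run]:.2f} s if fully vectorised, "
          f"{t / only_fp_arith[run]:.2f} s with only FP arithmetic")
# e.g. r2: 11.17 s measured, 8.73 s if fully vectorised, 7.07 s with only FP arithmetic
```

Under that reading, r2 stands to gain the most from further vectorisation, while the other runs are already close to their vector ceiling.
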
Source Object | Issue
---|---
▼libllama.so |
▼llama-vocab.cpp |
○ | -mcpu=native is missing.
▼llama-sampling.cpp |
○ | -mcpu=native is missing.
▼llama-impl.cpp |
○ | -mcpu=native is missing.
▼hashtable_policy.h |
○ | -mcpu=native is missing.
▼hashtable.h |
○ | -mcpu=native is missing.
▼libggml-cpu.so |
▼binary-ops.cpp |
○ |
▼repack.cpp |
○ |
▼vec.cpp |
○ |
▼ggml-cpu.cpp |
○ |
▼traits.cpp |
○ |
▼quants.c |
○ |
▼ggml-cpu.c |
○ |
▼ops.cpp |
○ |
▼exec |
▼console.cpp |
○ | -mcpu=native is missing.
▼sampling.cpp |
○ | -mcpu=native is missing.
▼vector.tcc |
○ | -mcpu=native is missing.
▼basic_string.h |
○ | -mcpu=native is missing.
▼regex_executor.tcc |
○ | -mcpu=native is missing.
▼[vdso] |
▼ |
○ | -g is missing for some functions (possibly ones added by the compiler); it is needed for more accurate reports. Other recommended flags are: -O2/-O3, -march=(target).
○ | -O2, -O3 or -Ofast is missing.
○ | -mcpu=native is missing.
▼libggml-base.so |
▼ggml-alloc.c |
○ | -mcpu=native is missing.
▼ggml.c |
○ | -mcpu=native is missing.
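
Nearly all of the issues above are compilation units built without a CPU-targeting flag. As an illustration (not something the report generates), a script along the lines below could scan a CMake-generated compile_commands.json for the flags called out here; the database path and the exact flag set are assumptions to adapt to the build at hand.

```python
#!/usr/bin/env python3
"""Illustrative check: list translation units whose compile command lacks the
flags the report recommends (-mcpu=..., -g, -O2/-O3/-Ofast). Assumes the build
was configured with -DCMAKE_EXPORT_COMPILE_COMMANDS=ON; adjust the path."""
import json
import re
from pathlib import Path

COMPILE_DB = Path("build/compile_commands.json")  # assumed location

CHECKS = {
    "-mcpu": re.compile(r"-mcpu=\S+"),
    "-g": re.compile(r"(^|\s)-g(\s|$)"),
    "-O2/-O3/-Ofast": re.compile(r"(^|\s)-O(2|3|fast)(\s|$)"),
}

for entry in json.loads(COMPILE_DB.read_text()):
    cmd = entry.get("command") or " ".join(entry.get("arguments", []))
    missing = [name for name, pattern in CHECKS.items() if not pattern.search(cmd)]
    if missing:
        print(f"{entry['file']}: missing {', '.join(missing)}")
```

Checking per translation unit mirrors how the report attributes issues to individual source objects rather than to whole binaries.
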
 | r0 | r1 | r2 | r3
---|---|---|---|---
Experiment Name | ||||
Application | /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/oneview_runs/defaults/orig/exec | /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/base_runs/defaults/gcc/exec | /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/binaries/armclang_3/exec | /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/run/binaries/gcc_5/exec |
Timestamp | 2025-09-16 13:35:29 | 2025-09-16 13:37:24 | 2025-09-16 13:49:12 | 2025-09-16 13:49:52 |
Experiment Type | MPI; OpenMP; | same as r0 | same as r0 | same as r0 |
Machine | ip-172-31-47-249.ec2.internal | same as r0 | same as r0 | same as r0 |
Architecture | aarch64 | same as r0 | same as r0 | same as r0 |
Micro Architecture | ARM_NEOVERSE_V2 | same as r0 | same as r0 | same as r0 |
Model Name | ||||
Cache Size | ||||
Number of Cores | ||||
Maximal Frequency | 0 GHz | same as r0 | same as r0 | same as r0 |
OS Version | Linux 6.1.109-118.189.amzn2023.aarch64 #1 SMP Tue Sep 10 08:58:40 UTC 2024 | same as r0 | same as r0 | same as r0 |
Architecture used during static analysis | aarch64 | same as r0 | same as r0 | same as r0 |
Micro Architecture used during static analysis | ARM_NEOVERSE_V2 | same as r0 | same as r0 | same as r0 |
Compilation Options | + [vdso]: N/A exec: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 --driver-mode=g++ -D GGML_BACKEND_SHARED -D GGML_SHARED -D GGML_USE_BLAS -D GGML_USE_CPU -D LLAMA_SHARED -D LLAMA_USE_CURL -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/common/. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/common/../vendor -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/../include -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -O3 -D NDEBUG -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT common/CMakeFiles/common.dir/sampling.cpp.o -MF common/CMakeFiles/common.dir/sampling.cpp.o.d -o common/CMakeFiles/common.dir/sampling.cpp.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/common/sampling.cpp libggml-base.so: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 -D GGML_BUILD -D GGML_COMMIT=\"unknown\" -D GGML_SCHED_MAX_COPIES=4 -D GGML_SHARED -D GGML_VERSION=\"0.0.0\" -D _GNU_SOURCE -D _XOPEN_SOURCE=600 -D ggml_base_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -O3 -D NDEBUG -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -std=gnu11 -MD -MT ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o -MF ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o.d -o ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/ggml.c libggml-cpu.so: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 -D GGML_BACKEND_BUILD -D GGML_BACKEND_SHARED -D GGML_SCHED_MAX_COPIES=4 -D GGML_SHARED -D GGML_USE_CPU_REPACK -D GGML_USE_LLAMAFILE -D GGML_USE_OPENMP -D _GNU_SOURCE -D _XOPEN_SOURCE=600 -D ggml_cpu_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/.. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/. 
-I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/ggml-cpu -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -O3 -D NDEBUG -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -mcpu=native+dotprod+i8mm+sve+nosme -fopenmp=libomp -std=gnu11 -MD -MT ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/arch/arm/quants.c.o -MF ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/arch/arm/quants.c.o.d -o ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/arch/arm/quants.c.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/ggml-cpu/arch/arm/quants.c libllama.so: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 --driver-mode=g++ -D GGML_BACKEND_SHARED -D GGML_SHARED -D GGML_USE_BLAS -D GGML_USE_CPU -D LLAMA_BUILD -D LLAMA_SHARED -D llama_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/../include -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -O3 -D NDEBUG -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT src/CMakeFiles/llama.dir/llama-vocab.cpp.o -MF src/CMakeFiles/llama.dir/llama-vocab.cpp.o.d -o src/CMakeFiles/llama.dir/llama-vocab.cpp.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/llama-vocab.cpp Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 --driver-mode=g++ -D GGML_BACKEND_SHARED -D GGML_SHARED -D GGML_USE_BLAS -D GGML_USE_CPU -D LLAMA_BUILD -D LLAMA_SHARED -D llama_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/. 
-I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/../include -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -O3 -D NDEBUG -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT src/CMakeFiles/llama.dir/llama.cpp.o -MF src/CMakeFiles/llama.dir/llama.cpp.o.d -o src/CMakeFiles/llama.dir/llama.cpp.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/llama.cpp | exec: GNU C++17 14.2.0 -mlittle-endian -mabi=lp64 -g -O3 -O3 -fno-omit-frame-pointer -fcf-protection=none -fPIC libggml-base.so: GNU C11 14.2.0 -mlittle-endian -mabi=lp64 -g -O3 -O3 -std=gnu11 -fno-omit-frame-pointer -fcf-protection=none -fPIC libggml-cpu.so: GNU C11 14.2.0 -mcpu=neoverse-v2+crc+sve2-aes+sve2-sha3+nossbs+dotprod+i8mm+sve+nosme -mlittle-endian -mabi=lp64 -g -O3 -O3 -std=gnu11 -fno-omit-frame-pointer -fcf-protection=none -fPIC -fopenmp libllama.so: GNU C++17 14.2.0 -mlittle-endian -mabi=lp64 -g -O3 -O3 -fno-omit-frame-pointer -fcf-protection=none -fPIC | + [vdso]: N/A exec: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 --driver-mode=g++ -D GGML_BACKEND_SHARED -D GGML_SHARED -D GGML_USE_BLAS -D GGML_USE_CPU -D LLAMA_SHARED -D LLAMA_USE_CURL -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/common/. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/common/../vendor -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/../include -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -O3 -mcpu=neoverse-v2+nosve+nosve2 -armpl -ffast-math -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -fno-finite-math-only -O3 -D NDEBUG -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT common/CMakeFiles/common.dir/regex-partial.cpp.o -MF common/CMakeFiles/common.dir/regex-partial.cpp.o.d -o common/CMakeFiles/common.dir/regex-partial.cpp.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/common/regex-partial.cpp GNU C17 14.2.0 -mlittle-endian -mabi=lp64 -g -g -g -O2 -O2 -O2 -fbuilding-libgcc -fno-stack-protector -fPIC libggml-base.so: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 -D GGML_BUILD -D GGML_COMMIT=\"unknown\" -D GGML_SCHED_MAX_COPIES=4 -D GGML_SHARED -D GGML_VERSION=\"0.0.0\" -D _GNU_SOURCE -D _XOPEN_SOURCE=600 -D ggml_base_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/. 
-I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -O3 -mcpu=neoverse-v2+nosve+nosve2 -armpl -ffast-math -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -fno-finite-math-only -O3 -D NDEBUG -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -std=gnu11 -MD -MT ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o -MF ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o.d -o ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/ggml.c libggml-cpu.so: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 -D GGML_BACKEND_BUILD -D GGML_BACKEND_SHARED -D GGML_SCHED_MAX_COPIES=4 -D GGML_SHARED -D GGML_USE_CPU_REPACK -D GGML_USE_LLAMAFILE -D GGML_USE_OPENMP -D _GNU_SOURCE -D _XOPEN_SOURCE=600 -D ggml_cpu_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/.. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/. -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/ggml-cpu -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -O3 -mcpu=neoverse-v2+nosve+nosve2 -armpl -ffast-math -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -fno-finite-math-only -O3 -D NDEBUG -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -fopenmp=libomp -std=gnu11 -MD -MT ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/arch/arm/quants.c.o -MF ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/arch/arm/quants.c.o.d -o ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/arch/arm/quants.c.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/ggml-cpu/arch/arm/quants.c libllama.so: Arm C/C++/Fortran Compiler version 24.10.1 (build number 4) (based on LLVM 19.1.0) /opt/arm/arm-linux-compiler-24.10.1_AmazonLinux-2023/llvm-bin/clang-19 --driver-mode=g++ -D GGML_BACKEND_SHARED -D GGML_SHARED -D GGML_USE_BLAS -D GGML_USE_CPU -D LLAMA_BUILD -D LLAMA_SHARED -D llama_EXPORTS -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/. 
-I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/../include -I /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/ggml/src/../include -O3 -O3 -mcpu=neoverse-v2+nosve+nosve2 -armpl -ffast-math -g -fno-omit-frame-pointer -fcf-protection=none -no-pie -grecord-command-line -fno-finite-math-only -O3 -D NDEBUG -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT src/CMakeFiles/llama.dir/llama-vocab.cpp.o -MF src/CMakeFiles/llama.dir/llama-vocab.cpp.o.d -o src/CMakeFiles/llama.dir/llama-vocab.cpp.o -c /home/eoseret/Tools/QaaS/qaas_runs/ip-172-31-47-249.ec2.internal/175-802-9624/llama.cpp/build/llama.cpp/src/llama-vocab.cpp | + [vdso]: N/A exec: GNU C++17 14.2.0 -mcpu=neoverse-v2 -mlittle-endian -mabi=lp64 -g -O2 -funroll-loops -ffast-math -fno-omit-frame-pointer -fcf-protection=none -fno-finite-math-only -fPIC libggml-base.so: GNU C11 14.2.0 -mcpu=neoverse-v2 -mlittle-endian -mabi=lp64 -g -O2 -std=gnu11 -funroll-loops -ffast-math -fno-omit-frame-pointer -fcf-protection=none -fno-finite-math-only -fPIC libggml-cpu.so: GNU C11 14.2.0 -mcpu=neoverse-v2 -mlittle-endian -mabi=lp64 -g -O2 -std=gnu11 -funroll-loops -ffast-math -fno-omit-frame-pointer -fcf-protection=none -fno-finite-math-only -fPIC -fopenmp libllama.so: GNU C++17 14.2.0 -mcpu=neoverse-v2 -mlittle-endian -mabi=lp64 -g -O2 -funroll-loops -ffast-math -fno-omit-frame-pointer -fcf-protection=none -fno-finite-math-only -fPIC |
Number of processes observed | 1 | same as r0 | same as r0 | same as r0 |
Number of threads observed | 96 | same as r0 | same as r0 | same as r0 |
Frequency Driver | NA | same as r0 | same as r0 | same as r0 |
Frequency Governor | NA | same as r0 | same as r0 | same as r0 |
Huge Pages | madvise | same as r0 | same as r0 | same as r0 |
Hyperthreading | off | same as r0 | same as r0 | same as r0 |
Number of sockets | 1 | same as r0 | same as r0 | same as r0 |
Number of cores per socket | 96 | same as r0 | same as r0 | same as r0 |
MAQAO version | 2025.1.2 | same as r0 | same as r0 | same as r0 |
MAQAO build | Build information not available | same as r0 | same as r0 | same as r0 |
Comments | | same as r0 | same as r0 | same as r0
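
The Compilation Options row above is a raw dump of the recorded command lines and is hard to compare across runs. The snippet below is a hand-condensed view of the distinguishing flags per run, extracted from that row; treat it as a summary rather than the full command lines, since per-binary details (e.g. -fopenmp for libggml-cpu.so, warning flags, include paths) differ.

```python
# Condensed, hand-extracted view of the "Compilation Options" row above.
# Only the distinguishing flags are listed; see the full command lines for details.
runs = {
    "r0": {  # Arm Compiler 24.10.1 (clang 19), default build
        "compiler": "armclang 24.10.1",
        "common_flags": "-O3 -g -fno-omit-frame-pointer",
        "libggml_cpu_extra": "-mcpu=native+dotprod+i8mm+sve+nosme -fopenmp=libomp",
    },
    "r1": {  # GCC 14.2.0 baseline
        "compiler": "gcc/g++ 14.2.0",
        "common_flags": "-O3 -g -fno-omit-frame-pointer",
        "libggml_cpu_extra": "-mcpu=neoverse-v2+crc+sve2-aes+sve2-sha3+nossbs"
                             "+dotprod+i8mm+sve+nosme -fopenmp",
    },
    "r2": {  # Arm Compiler, tuned build (armclang_3)
        "compiler": "armclang 24.10.1",
        "common_flags": "-O3 -mcpu=neoverse-v2+nosve+nosve2 -armpl -ffast-math "
                        "-fno-finite-math-only -g",
    },
    "r3": {  # GCC, tuned build (gcc_5)
        "compiler": "gcc/g++ 14.2.0",
        "common_flags": "-O2 -mcpu=neoverse-v2 -funroll-loops -ffast-math "
                        "-fno-finite-math-only -g",
    },
}
```
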