Testing -- chunked dataset with none-selection (selnone)
Testing -- parallel extend Chunked allocation on serial file (calloc)
Testing -- parallel read of dataset written serially with filters (fltread)
Testing -- compressed dataset collective read (cmpdsetr)
Proc 0: *** Parallel ERROR ***
    VRFY (H5Dwrite succeeded) failed at line 2610 in ../../testpar/t_dset.c
aborting MPI processes
Proc 3: *** Parallel ERROR ***
    VRFY (H5Dwrite succeeded) failed at line 2610 in ../../testpar/t_dset.c
aborting MPI processes
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Proc 1: *** Parallel ERROR ***
    VRFY (H5Dwrite succeeded) failed at line 2610 in ../../testpar/t_dset.c
aborting MPI processes
Proc 2: *** Parallel ERROR ***
    VRFY (H5Dwrite succeeded) failed at line 2610 in ../../testpar/t_dset.c
aborting MPI processes
[winterrose.scrye.com:3310549] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[winterrose.scrye.com:3310549] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Command exited with non-zero status 1
0.17user 0.12system 0:16.63elapsed 1%CPU (0avgtext+0avgdata 15868maxresident)k
0inputs+16472outputs (2057major+1742minor)pagefaults 0swaps
make[4]: Leaving directory '/builddir/build/BUILD/hdf5-1.12.1/openmpi/testpar'
make[3]: Leaving directory '/builddir/build/BUILD/hdf5-1.12.1/openmpi/testpar'
make[2]: Leaving directory '/builddir/build/BUILD/hdf5-1.12.1/openmpi/testpar'
make[1]: Leaving directory '/builddir/build/BUILD/hdf5-1.12.1/openmpi/testpar'
make: Leaving directory '/builddir/build/BUILD/hdf5-1.12.1/openmpi'

RPM build errors:
make[4]: *** [Makefile:1591: testphdf5.chkexe_] Error 1
make[3]: *** [Makefile:1717: build-check-p] Error 1
make[2]: *** [Makefile:1572: test] Error 2
make[1]: *** [Makefile:1323: check-am] Error 2
make: *** [Makefile:718: check-recursive] Error 1
error: Bad exit status from /var/tmp/rpm-tmp.8E9f0x (%check)
    Bad exit status from /var/tmp/rpm-tmp.8E9f0x (%check)
Child return code was: 1
EXCEPTION: [Error()]
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/mockbuild/trace_decorator.py", line 93, in trace
    result = func(*args, **kw)
  File "/usr/lib/python3.9/site-packages/mockbuild/util.py", line 600, in do_with_status
    raise exception.Error("Command failed: \n # %s\n%s" % (command, output), child.returncode)
mockbuild.exception.Error: Command failed:
 # bash --login -c /usr/bin/rpmbuild -bb --target riscv64 --nodeps /builddir/build/SPECS/hdf5.spec