
Information for RPM ncnn-20240102-1.fc40.src.rpm

ID: 1290824
Name: ncnn
Version: 20240102
Release: 1.fc40
Epoch: (none)
Arch: src
Summary: A high-performance neural network inference framework
Description: ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. It has been designed from the start with deployment and use on mobile phones in mind and has no third-party dependencies. It is cross-platform and runs faster than all known open-source frameworks on mobile phone CPUs. With the efficient ncnn implementation, developers can easily deploy deep learning models to mobile platforms, create intelligent apps, and bring artificial intelligence to your fingertips. ncnn is currently used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu.
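
The description above refers to deploying models through ncnn's C++ API. The indented sketch below is only an illustration of that workflow, not part of this package's metadata: the model file names (model.param/model.bin), the blob names ("data", "output"), and the input size are placeholders that depend on the converted model.

    // Minimal, illustrative ncnn inference sketch; file and blob names are placeholders.
    #include "net.h"   // main ncnn header (ncnn::Net, ncnn::Mat, ncnn::Extractor)

    int main()
    {
        ncnn::Net net;
        // A model converted to ncnn's format ships as a .param (structure) and .bin (weights) pair.
        if (net.load_param("model.param") != 0 || net.load_model("model.bin") != 0)
            return 1;

        // Dummy 227x227 RGB input filled with zeros; real code would build this from image
        // pixels (e.g. with ncnn::Mat::from_pixels) and apply the model's normalization.
        ncnn::Mat in(227, 227, 3);
        in.fill(0.f);

        ncnn::Extractor ex = net.create_extractor();
        ex.input("data", in);        // input blob name depends on the model

        ncnn::Mat out;
        ex.extract("output", out);   // forward pass runs here; output blob name depends on the model
        return out.empty() ? 1 : 0;
    }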
Build Time: 2024-04-15 15:39:43 GMT
Size: 12.16 MB
Checksum (md5): 8416ccd604d9c1ea573323bfcd9b4c0b
License: BSD-3-Clause AND BSD-2-Clause AND Zlib
Buildroot: f40-build-806305-144971
Provides:
ncnn = 20240102-1.fc40
ncnn-debuginfo = 20240102-1.fc40
ncnn-debugsource = 20240102-1.fc40
ncnn-devel = 20240102-1.fc40
Obsoletes: No Obsoletes
Conflicts: No Conflicts
Requires:
cmake
cmake(glslang)
gcc-c++
ninja-build
pkgconfig(protobuf)
rpmlib(CompressedFileNames) <= 3.0.4-1
rpmlib(FileDigests) <= 4.6.0-1
Recommends: No Recommends
Suggests: No Suggests
Supplements: No Supplements
Enhances: No Enhances
Files:
Name                     Size
ncnn-20240102.tar.gz     12.20 MB
ncnn.spec                2.14 KB
Component of: No Buildroots