Open-source name: sony/nnabla
Open-source URL: https://github.com/sony/nnabla
Open-source language: Python 60.1%

# Neural Network Libraries

Neural Network Libraries is a deep learning framework intended for research, development and production. We aim to have it running everywhere: desktop PCs, HPC clusters, embedded devices and production servers.
## Installation

Installing Neural Network Libraries is easy: `pip install nnabla`.
This installs the CPU version of Neural Network Libraries. GPU acceleration can be added by installing the CUDA extension, e.g. `pip install nnabla-ext-cuda110`.
The above command is for CUDA Toolkit 11.0; other CUDA versions (10.1, 9.x and 8.x) are not supported now. For more details, see the installation section of the documentation.

## Building from Source

See Build Manuals.

## Running on Docker

For details on running on Docker, see the installation section of the documentation.

## Features

### Easy, flexible and expressive

The Python API built on the Neural Network Libraries C++11 core gives you flexibility and
productivity. For example, a two-layer neural network with classification loss can be defined in the following 5 lines of code (hyperparameters are enclosed by `<>`):

```python
import nnabla as nn
import nnabla.functions as F
import nnabla.parametric_functions as PF

x = nn.Variable(<input_shape>)
t = nn.Variable(<target_shape>)
h = F.tanh(PF.affine(x, <hidden_size>, name='affine1'))
y = PF.affine(h, <target_size>, name='affine2')
loss = F.mean(F.softmax_cross_entropy(y, t))
```

Training can be done by:

```python
import nnabla.solvers as S

# Create a solver (parameter updater)
solver = S.Adam(<solver_params>)
solver.set_parameters(nn.get_parameters())

# Training iteration
for n in range(<num_training_iterations>):
    # Set data from any data source
    x.d = <set data>
    t.d = <set label>
    # Initialize gradients
    solver.zero_grad()
    # Forward and backward execution
    loss.forward()
    loss.backward()
    # Update parameters with the computed gradients
    solver.update()
```

The dynamic computation graph enables flexible runtime network construction. Neural Network Libraries can use both paradigms of static and dynamic graphs, with the same API:

```python
x.d = <set data>
t.d = <set label>
drop_depth = np.random.rand(<num_stochastic_layers>) < <layer_drop_ratio>
with nn.auto_forward():
    h = F.relu(PF.convolution(x, <hidden_size>, (3, 3), pad=(1, 1), name='conv0'))
    for i in range(<num_stochastic_layers>):
        if drop_depth[i]:
            continue  # Stochastically drop a layer
        h2 = F.relu(PF.convolution(h, <hidden_size>, (3, 3), pad=(1, 1),
                                   name='conv%d' % (i + 1)))
        h = F.add2(h, h2)
    y = PF.affine(h, <target_size>, name='classification')
    loss = F.mean(F.softmax_cross_entropy(y, t))
# Backward computation (can also be done in a dynamically executed graph)
loss.backward()
```

You can differentiate to any order with `nn.grad`:

```python
import nnabla as nn
import nnabla.functions as F
import numpy as np

x = nn.Variable.from_numpy_array(np.random.randn(2, 2)).apply(need_grad=True)
x.grad.zero()
y = F.sin(x)

def grad(y, x, n=1):
    dx = [y]
    for _ in range(n):
        dx = nn.grad([dx[0]], [x])
    return dx[0]

dnx = grad(y, x, n=10)
dnx.forward()
print(np.allclose(-np.sin(x.d), dnx.d))

dnx.backward()
print(np.allclose(-np.cos(x.d), x.g))
```
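As an aside, the values checked above can be verified with plain NumPy, since the n-th derivative of sin(x) cycles with period 4 (sin, cos, -sin, -cos): the 10th derivative is -sin and the 11th is -cos. A minimal, nnabla-free sketch (the `nth_derivative_of_sin` helper is ours, for illustration only):

```python
import numpy as np

def nth_derivative_of_sin(x, n):
    # Hypothetical helper: closed form for d^n/dx^n sin(x),
    # exploiting the period-4 cycle (sin, cos, -sin, -cos).
    cycle = [np.sin, np.cos, lambda v: -np.sin(v), lambda v: -np.cos(v)]
    return cycle[n % 4](x)

x = np.random.randn(2, 2)
# 10 % 4 == 2, so the 10th derivative is -sin; 11 % 4 == 3 gives -cos.
print(np.allclose(nth_derivative_of_sin(x, 10), -np.sin(x)))  # True
print(np.allclose(nth_derivative_of_sin(x, 11), -np.cos(x)))  # True
```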
```python
# Show the registry status
from nnabla.backward_functions import show_registry

show_registry()
```

### Command line utility

Neural Network Libraries provides a command line utility.
For more details, see the Documentation.

### Portable and multi-platform

### Extensible

### Efficient

## Documentation

https://nnabla.readthedocs.org

## Getting started
## Contribution guide

The technology is rapidly progressing, and researchers and developers often want to add their own custom features to a deep learning framework. NNabla is really nice in this respect: the architecture of Neural Network Libraries is clean and quite simple, and you can add new features very easily with the help of our code template generating system. See the following link for details.

## License & Notice

Neural Network Libraries is provided under the Apache License Version 2.0. It also depends on some open-source software packages. For more information, see LICENSES.

## Citation