Version: v1.0.1
Published: Jun 17, 2023
License: MIT
Imports: 16
Imported by: 0
README
sin_attention
This example demonstrates how to use the tnn library to build an attention-based sequence model. The model consists of two Transformer blocks, a Flatten layer, a Sigmoid layer, and an Output layer, with the following parameters:
```go
const lr = 1e-3
const epoch = 1000
const batchSize = 128
const steps = 32
const dims = 8
const transformerSize = 2
```
The following figure shows the curve fitted by the model after training:
![attention](https://github.com/lwch/tnn/raw/v1.0.1/example/sin_attention/pred.png)
Documentation
There is no documentation for this package.