Training of RNN in TensorFlow

17 Mar 2025 | 5 min read

A recurrent neural network (RNN) is a deep-learning algorithm that follows a sequential approach. In a traditional neural network, every input and output is assumed to be independent of all the others. These networks are called recurrent because they perform the same mathematical computation at every step of a sequence, with each step depending on the result of the previous one.

The steps for training a recurrent neural network are as follows:

Step 1 − Input a specific example from the dataset.

Step 2 − The network takes the example and performs some computations using randomly initialized variables.

Step 3 − A predicted result is then computed.

Step 4 − Comparing the actual result with the expected value produces an error.

Step 5 − To trace the error, it is propagated back through the same path, and the variables are adjusted along the way.

Step 6 − Steps 1 to 5 are repeated until we are confident that the variables used to obtain the output are defined properly.

Step 7 − In the last step, a systematic prediction is made by applying these variables to new, unseen input.

[Figure: schematic representation of a recurrent neural network]

Complete code for the TensorFlow implementation of a recurrent neural network

Output

Instructions for updating:
Future major versions of TensorFlow will allow gradients to flow
into the label's input on backprop by default.
See `tf.nn.softmax_cross_entropy_with_logits_v2`.

Step 1, Minibatch Loss= 2.6592, Training Accuracy= 0.148
Step 200, Minibatch Loss= 2.1379, Training Accuracy= 0.250
Step 400, Minibatch Loss= 1.8860, Training Accuracy= 0.445
Step 600, Minibatch Loss= 1.8542, Training Accuracy= 0.367
Step 800, Minibatch Loss= 1.7489, Training Accuracy= 0.477
Step 1000, Minibatch Loss= 1.6399, Training Accuracy= 0.492
Step 1200, Minibatch Loss= 1.4379, Training Accuracy= 0.570
Step 1400, Minibatch Loss= 1.4319, Training Accuracy= 0.500
Step 1600, Minibatch Loss= 1.3899, Training Accuracy= 0.547
Step 1800, Minibatch Loss= 1.3563, Training Accuracy= 0.570
Step 2000, Minibatch Loss= 1.2134, Training Accuracy= 0.617
Step 2200, Minibatch Loss= 1.2582, Training Accuracy= 0.609
Step 2400, Minibatch Loss= 1.2412, Training Accuracy= 0.578
Step 2600, Minibatch Loss= 1.1655, Training Accuracy= 0.625
Step 2800, Minibatch Loss= 1.0927, Training Accuracy= 0.656
Step 3000, Minibatch Loss= 1.2648, Training Accuracy= 0.617
Step 3200, Minibatch Loss= 0.9734, Training Accuracy= 0.695
Step 3400, Minibatch Loss= 0.8705, Training Accuracy= 0.773
Step 3600, Minibatch Loss= 1.0188, Training Accuracy= 0.680
Step 3800, Minibatch Loss= 0.8047, Training Accuracy= 0.719
Step 4000, Minibatch Loss= 0.8417, Training Accuracy= 0.758
Step 4200, Minibatch Loss= 0.8516, Training Accuracy= 0.703
Step 4400, Minibatch Loss= 0.8496, Training Accuracy= 0.773
Step 4600, Minibatch Loss= 0.9925, Training Accuracy= 0.719
Step 4800, Minibatch Loss= 0.6316, Training Accuracy= 0.812
Step 5000, Minibatch Loss= 0.7585, Training Accuracy= 0.750
Step 5200, Minibatch Loss= 0.6965, Training Accuracy= 0.797
Step 5400, Minibatch Loss= 0.7134, Training Accuracy= 0.836
Step 5600, Minibatch Loss= 0.6509, Training Accuracy= 0.812
Step 5800, Minibatch Loss= 0.7797, Training Accuracy= 0.750
Step 6000, Minibatch Loss= 0.6225, Training Accuracy= 0.859
Step 6200, Minibatch Loss= 0.6776, Training Accuracy= 0.781
Step 6400, Minibatch Loss= 0.6090, Training Accuracy= 0.781
Step 6600, Minibatch Loss= 0.5446, Training Accuracy= 0.836
Step 6800, Minibatch Loss= 0.6514, Training Accuracy= 0.750
Step 7000, Minibatch Loss= 0.7421, Training Accuracy= 0.758
Step 7200, Minibatch Loss= 0.5114, Training Accuracy= 0.844
Step 7400, Minibatch Loss= 0.5999, Training Accuracy= 0.844
Step 7600, Minibatch Loss= 0.5764, Training Accuracy= 0.789
Step 7800, Minibatch Loss= 0.6225, Training Accuracy= 0.805
Step 8000, Minibatch Loss= 0.4691, Training Accuracy= 0.875
Step 8200, Minibatch Loss= 0.4859, Training Accuracy= 0.852
Step 8400, Minibatch Loss= 0.5820, Training Accuracy= 0.828
Step 8600, Minibatch Loss= 0.4873, Training Accuracy= 0.883
Step 8800, Minibatch Loss= 0.5194, Training Accuracy= 0.828
Step 9000, Minibatch Loss= 0.6888, Training Accuracy= 0.820
Step 9200, Minibatch Loss= 0.6094, Training Accuracy= 0.812
Step 9400, Minibatch Loss= 0.5852, Training Accuracy= 0.852
Step 9600, Minibatch Loss= 0.4656, Training Accuracy= 0.844
Step 9800, Minibatch Loss= 0.4595, Training Accuracy= 0.875
Step 10000, Minibatch Loss= 0.4404, Training Accuracy= 0.883
Optimization Finished!
Testing Accuracy: 0.890625
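The seven training steps above can be sketched as a minimal vanilla RNN trained with backpropagation through time. This is an illustrative NumPy sketch, not the TensorFlow listing whose output is shown above; the toy task (classifying whether a random sequence sums to a positive number), the hidden size, learning rate, and all variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H = 5, 8                        # sequence length, hidden size (assumed)
Wx = rng.normal(0, 0.5, H)         # input-to-hidden weights (scalar input)
Wh = rng.normal(0, 0.2, (H, H))    # hidden-to-hidden (recurrent) weights
b = np.zeros(H)                    # hidden bias
w = rng.normal(0, 0.5, H)          # hidden-to-output weights
c = 0.0                            # output bias

def forward(x):
    """Steps 2-3: run the RNN over the sequence, keep states for backprop."""
    hs = [np.zeros(H)]
    for t in range(T):
        hs.append(np.tanh(Wx * x[t] + Wh @ hs[-1] + b))
    logit = w @ hs[-1] + c         # predicted score from the final state
    return hs, logit

def train_step(x, y, lr=0.05):
    """Steps 4-5: compare with the expected value, propagate the error back."""
    global Wx, Wh, b, w, c
    hs, logit = forward(x)
    p = 1.0 / (1.0 + np.exp(-logit))          # predicted probability
    dlogit = p - y                            # cross-entropy error signal
    dw, dc = dlogit * hs[-1], dlogit
    dWx, dWh, db = np.zeros_like(Wx), np.zeros_like(Wh), np.zeros_like(b)
    dh = dlogit * w
    for t in range(T, 0, -1):                 # walk the same path backwards
        dz = dh * (1 - hs[t] ** 2)            # through the tanh nonlinearity
        dWx += dz * x[t - 1]
        dWh += np.outer(dz, hs[t - 1])
        db += dz
        dh = Wh.T @ dz                        # pass error to the earlier step
    Wx -= lr * dWx; Wh -= lr * dWh; b -= lr * db
    w -= lr * dw; c -= lr * dc

# Steps 1 and 6: feed examples and repeat until the variables settle.
for _ in range(4000):
    x = rng.normal(0, 1, T)
    train_step(x, float(x.sum() > 0))

# Step 7: systematic prediction on new, unseen inputs.
tests = rng.normal(0, 1, (500, T))
correct = sum((forward(x)[1] > 0) == (x.sum() > 0) for x in tests)
accuracy = correct / len(tests)
print(f"held-out accuracy: {accuracy:.3f}")
```

The same unrolled-and-backpropagated computation is what TensorFlow builds automatically; the per-step losses and accuracies it prints during optimization are what the output log above reports.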