<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Backpropagation on Answer</title>
    <link>https://answer.freetools.me/tags/%E5%8F%8D%E5%90%91%E4%BC%A0%E6%92%AD/</link>
    <description>Recent content in Backpropagation on Answer</description>
    <generator>Hugo -- 0.152.2</generator>
    <language>zh-cn</language>
    <lastBuildDate>Thu, 12 Mar 2026 16:06:50 +0800</lastBuildDate>
    <atom:link href="https://answer.freetools.me/tags/%E5%8F%8D%E5%90%91%E4%BC%A0%E6%92%AD/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>How Neural Networks Learn: A Complete Walkthrough of the Training Process, from Forward Propagation to Backpropagation</title>
      <link>https://answer.freetools.me/%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C%E6%98%AF%E5%A6%82%E4%BD%95%E5%AD%A6%E4%B9%A0%E7%9A%84%E4%BB%8E%E5%89%8D%E5%90%91%E4%BC%A0%E6%92%AD%E5%88%B0%E5%8F%8D%E5%90%91%E4%BC%A0%E6%92%AD%E7%9A%84%E5%AE%8C%E6%95%B4%E8%AE%AD%E7%BB%83%E8%BF%87%E7%A8%8B%E8%A7%A3%E6%9E%90/</link>
      <pubDate>Thu, 12 Mar 2026 16:06:50 +0800</pubDate>
      <guid>https://answer.freetools.me/%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C%E6%98%AF%E5%A6%82%E4%BD%95%E5%AD%A6%E4%B9%A0%E7%9A%84%E4%BB%8E%E5%89%8D%E5%90%91%E4%BC%A0%E6%92%AD%E5%88%B0%E5%8F%8D%E5%90%91%E4%BC%A0%E6%92%AD%E7%9A%84%E5%AE%8C%E6%95%B4%E8%AE%AD%E7%BB%83%E8%BF%87%E7%A8%8B%E8%A7%A3%E6%9E%90/</guid>
      <description>How Neural Networks Learn: A Complete Walkthrough of the Training Process, from Forward Propagation to Backpropagation</description>
    </item>
    <item>
      <title>Automatic Differentiation and Backpropagation: Why This Sixty-Year-Old Algorithm Is the Cornerstone of Deep Learning</title>
      <link>https://answer.freetools.me/%E8%87%AA%E5%8A%A8%E5%BE%AE%E5%88%86%E4%B8%8E%E5%8F%8D%E5%90%91%E4%BC%A0%E6%92%AD%E4%B8%BA%E4%BB%80%E4%B9%88%E8%BF%99%E4%B8%AA%E5%85%AD%E5%8D%81%E5%B2%81%E7%9A%84%E7%AE%97%E6%B3%95%E6%98%AF%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E7%9A%84%E5%9F%BA%E7%9F%B3/</link>
      <pubDate>Thu, 12 Mar 2026 06:24:59 +0800</pubDate>
      <guid>https://answer.freetools.me/%E8%87%AA%E5%8A%A8%E5%BE%AE%E5%88%86%E4%B8%8E%E5%8F%8D%E5%90%91%E4%BC%A0%E6%92%AD%E4%B8%BA%E4%BB%80%E4%B9%88%E8%BF%99%E4%B8%AA%E5%85%AD%E5%8D%81%E5%B2%81%E7%9A%84%E7%AE%97%E6%B3%95%E6%98%AF%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E7%9A%84%E5%9F%BA%E7%9F%B3/</guid>
      <description>An in-depth look at forward-mode and reverse-mode automatic differentiation, the construction and traversal of computational graphs, the implementation details of the PyTorch autograd engine, and solutions to numerical stability problems such as vanishing and exploding gradients.</description>
    </item>
    <item>
      <title>Vanishing and Exploding Gradients: Why Deep Neural Networks Could Once Only Stack Five Layers</title>
      <link>https://answer.freetools.me/%E6%A2%AF%E5%BA%A6%E6%B6%88%E5%A4%B1%E4%B8%8E%E6%A2%AF%E5%BA%A6%E7%88%86%E7%82%B8%E4%B8%BA%E4%BB%80%E4%B9%88%E6%B7%B1%E5%B1%82%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C%E6%9B%BE%E7%BB%8F%E5%8F%AA%E8%83%BD%E5%A0%86%E5%8F%A0%E4%BA%94%E5%B1%82/</link>
      <pubDate>Thu, 12 Mar 2026 00:23:55 +0800</pubDate>
      <guid>https://answer.freetools.me/%E6%A2%AF%E5%BA%A6%E6%B6%88%E5%A4%B1%E4%B8%8E%E6%A2%AF%E5%BA%A6%E7%88%86%E7%82%B8%E4%B8%BA%E4%BB%80%E4%B9%88%E6%B7%B1%E5%B1%82%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C%E6%9B%BE%E7%BB%8F%E5%8F%AA%E8%83%BD%E5%A0%86%E5%8F%A0%E4%BA%94%E5%B1%82/</guid>
      <description>From Hochreiter's 1991 discovery of the vanishing gradient problem to ResNet breaking the 1000-layer training barrier in 2015, deep learning's "depth" dilemma took twenty-five years of technical breakthroughs to resolve. This article analyzes the mathematical essence of gradient problems, their historical evolution, and their solutions.</description>
    </item>
  </channel>
</rss>
