# PyTorch Learning Notes (Preface)

[![GitHub](https://img.shields.io/github/stars/zhangxiann/PyTorch_Practice?label=Stars\&style=flat-square\&logo=GitHub)](https://github.com/zhangxiann/PyTorch_Practice) [![Website](https://img.shields.io/website?label=%E5%9C%A8%E7%BA%BF%E7%94%B5%E5%AD%90%E4%B9%A6\&style=flat-square\&down_color=blue\&down_message=%E7%82%B9%E8%BF%99%E9%87%8C\&up_color=blue\&up_message=%E7%82%B9%E8%BF%99%E9%87%8C\&url=https://pytorch.zhangxiann.com/\&logo=Gitea)](https://pytorch.zhangxiann.com/)

[![](https://img.shields.io/badge/%E4%BD%9C%E8%80%85-@zhangxiann-000000.svg?style=flat-square\&logo=GitHub)](https://www.github.com/zhangxiann) [![](https://img.shields.io/badge/%E7%9F%A5%E4%B9%8E-@%E5%BC%A0%E8%B4%A4%E5%90%8C%E5%AD%A6-000000.svg?style=flat-square\&logo=Zhihu)](https://www.zhihu.com/people/zhangxian/posts) [![](https://img.shields.io/badge/%E5%85%AC%E4%BC%97%E5%8F%B7-@%E5%BC%A0%E8%B4%A4%E5%90%8C%E5%AD%A6-000000.svg?style=flat-square\&logo=WeChat)](https://image.zhangxiann.com/QRcode_8cm.jpg)

This is a collection of the study notes I took while learning PyTorch, comprising **25** articles written while working through a **PyTorch** framework course.

The notes follow the order of the course and are organized into 8 weeks, progressing step by step and **aiming to be easy to understand**.

## Code

Companion code: <https://github.com/zhangxiann/PyTorch_Practice>

All of the code has been tested in PyCharm; it is recommended to clone the repository locally with git and run it from there.
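Once the repository is cloned, a quick sanity check (a minimal sketch, not part of the repo) can confirm that PyTorch and autograd are working before you run the chapter code:

```python
import torch

# Create a tensor that tracks gradients, run a tiny computation,
# and backpropagate -- the core workflow used throughout these notes.
x = torch.ones(2, 2, requires_grad=True)
y = (x * 3).sum()
y.backward()

print(torch.__version__)
print(x.grad)  # every element's gradient is 3
```

If this prints a version number and a 2×2 tensor of 3s, your environment is ready.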

## Data

The code uses several third-party datasets, so a Baidu Cloud download link is provided below (if you know of a better way to host the data, please let me know).

Data download: <https://pan.baidu.com/s/1f9wQM7gvkMVx2x5z6xC9KQ> (access code: w7xt)

## Intended Audience

This tutorial assumes you have some background in machine learning and deep learning.

If you have not studied machine learning or deep learning before, it is recommended to first take Andrew Ng's Deep Learning course: <https://mooc.study.163.com/university/deeplearning_ai#/c>.

Working through that course first will make this tutorial easier to follow.

## Study Plan

These notes contain 25 chapters spread over 8 weeks, roughly 3 chapters per week (feel free to adjust the pace to suit yourself); each chapter takes about 30 minutes to 2 hours.

The outline is as follows:

* [PyTorch Learning Notes (Preface)](https://pytorch.zhangxiann.com/readme)
* [1 Basic Concepts](https://pytorch.zhangxiann.com/1-ji-ben-gai-nian)
  * [1.1 Introduction to PyTorch and Installation](https://pytorch.zhangxiann.com/1-ji-ben-gai-nian/1.1-pytorch-jian-jie-yu-an-zhuang)
  * [1.2 Introduction to Tensors](https://pytorch.zhangxiann.com/1-ji-ben-gai-nian/1.2-tensor-zhang-liang-jie-shao)
  * [1.3 Tensor Operations and Linear Regression](https://pytorch.zhangxiann.com/1-ji-ben-gai-nian/1.3-zhang-liang-cao-zuo-yu-xian-xing-hui-gui)
  * [1.4 Computational Graphs and the Dynamic Graph Mechanism](https://pytorch.zhangxiann.com/1-ji-ben-gai-nian/1.4-ji-suan-tu-yu-dong-tai-tu-ji-zhi)
  * [1.5 autograd and Logistic Regression](https://pytorch.zhangxiann.com/1-ji-ben-gai-nian/1.5-autograd-yu-luo-ji-hui-gui)
* [2 Image Processing and Data Loading](https://pytorch.zhangxiann.com/2-tu-pian-chu-li-yu-shu-ju-jia-zai)
  * [2.1 DataLoader and Dataset](https://pytorch.zhangxiann.com/2-tu-pian-chu-li-yu-shu-ju-jia-zai/2.1-dataloader-yu-dataset)
  * [2.2 The transforms Module for Image Preprocessing](https://pytorch.zhangxiann.com/2-tu-pian-chu-li-yu-shu-ju-jia-zai/2.2-tu-pian-yu-chu-li-transforms-mo-kuai-ji-zhi)
  * [2.3 Twenty-Two transforms Image Preprocessing Methods](https://pytorch.zhangxiann.com/2-tu-pian-chu-li-yu-shu-ju-jia-zai/2.3-er-shi-er-zhong-transforms-tu-pian-shu-ju-yu-chu-li-fang-fa)
* [3 Model Construction](https://pytorch.zhangxiann.com/3-mo-xing-gou-jian)
  * [3.1 Model Creation Steps and nn.Module](https://pytorch.zhangxiann.com/3-mo-xing-gou-jian/3.1-mo-xing-chuang-jian-bu-zhou-yu-nn.module)
  * [3.2 Convolutional Layers](https://pytorch.zhangxiann.com/3-mo-xing-gou-jian/3.2-juan-ji-ceng)
  * [3.3 Pooling, Linear, and Activation Layers](https://pytorch.zhangxiann.com/3-mo-xing-gou-jian/3.3-chi-hua-ceng-xian-xing-ceng-he-ji-huo-han-shu-ceng)
* [4 Model Training](https://pytorch.zhangxiann.com/4-mo-xing-xun-lian)
  * [4.1 Weight Initialization](https://pytorch.zhangxiann.com/4-mo-xing-xun-lian/4.1-quan-zhi-chu-shi-hua)
  * [4.2 Loss Functions](https://pytorch.zhangxiann.com/4-mo-xing-xun-lian/4.2-sun-shi-han-shu)
  * [4.3 Optimizers](https://pytorch.zhangxiann.com/4-mo-xing-xun-lian/4.3-you-hua-qi)
* [5 Visualization and Hooks](https://pytorch.zhangxiann.com/5-ke-shi-hua-yu-hook)
  * [5.1 Introduction to TensorBoard](https://pytorch.zhangxiann.com/5-ke-shi-hua-yu-hook/5.1-tensorboard-jie-shao)
  * [5.2 Hook Functions and the CAM Algorithm](https://pytorch.zhangxiann.com/5-ke-shi-hua-yu-hook/5.2-hook-han-shu-yu-cam-suan-fa)
* [6 Regularization](https://pytorch.zhangxiann.com/6-zheng-ze-hua)
  * [6.1 Weight Decay and Dropout](https://pytorch.zhangxiann.com/6-zheng-ze-hua/6.1-weight-decay-he-dropout)
  * [6.2 Normalization](https://pytorch.zhangxiann.com/6-zheng-ze-hua/6.2-normalization)
* [7 Other Model Operations](https://pytorch.zhangxiann.com/7-mo-xing-qi-ta-cao-zuo)
  * [7.1 Saving and Loading Models](https://pytorch.zhangxiann.com/7-mo-xing-qi-ta-cao-zuo/7.1-mo-xing-bao-cun-yu-jia-zai)
  * [7.2 Model Fine-tuning](https://pytorch.zhangxiann.com/7-mo-xing-qi-ta-cao-zuo/7.2-mo-xing-finetune)
  * [7.3 Training Models on the GPU](https://pytorch.zhangxiann.com/7-mo-xing-qi-ta-cao-zuo/7.3-shi-yong-gpu-xun-lian-mo-xing)
* [8 Practical Applications](https://pytorch.zhangxiann.com/8-shi-ji-ying-yong)
  * [8.1 Image Classification Overview and ResNet Source Code Analysis](https://pytorch.zhangxiann.com/8-shi-ji-ying-yong/8.1-tu-xiang-fen-lei-jian-shu-yu-resnet-yuan-ma-fen-xi)
  * [8.2 Introduction to Object Detection](https://pytorch.zhangxiann.com/8-shi-ji-ying-yong/8.2-mu-biao-jian-ce-jian-jie)
  * [8.3 Introduction to GANs (Generative Adversarial Networks)](https://pytorch.zhangxiann.com/8-shi-ji-ying-yong/8.3-gan-sheng-cheng-dui-kang-wang-luo-jian-jie)
  * [8.4 Implementing an RNN by Hand](https://pytorch.zhangxiann.com/8-shi-ji-ying-yong/8.4-shou-dong-shi-xian-rnn)
* [9 Miscellaneous](https://pytorch.zhangxiann.com/9-qi-ta)
  * [Common PyTorch Error Messages](https://pytorch.zhangxiann.com/9-qi-ta/pytorch-chang-jian-bao-cuo-xin-xi)
  * [Getting Started with PyTorch Geometric for Graph Neural Networks](https://pytorch.zhangxiann.com/9-qi-ta/tu-shen-jing-wang-luo-pytorch-geometric-ru-men-jiao-cheng)

If these PyTorch learning notes help you, please consider giving the repo a star:

<https://github.com/zhangxiann/PyTorch_Practice>

If you find this article helpful, a like would give me more motivation to keep writing good articles.

You are also welcome to scan the QR code below to follow my WeChat official account, **张贤同学**.

![](https://image.zhangxiann.com/QRcode_8cm.jpg)

