Initial blog skeleton
docs/topics/llm.md (new file, 18 lines)
@@ -0,0 +1,18 @@
# Large Language Models

This topic page organizes papers, notes, and method write-ups related to LLMs.

## Topics in scope

- Transformer fundamentals
- Pretraining objectives
- Instruction tuning
- Alignment training
- RAG
- Inference optimization
- Agent systems

## Currently linked content

- [Attention Is All You Need](../papers/attention-is-all-you-need.md)
- [Transformer attention mechanism](../notes/transformer-attention.md)