Spaces:
Running
Running
thibaud frere
commited on
Commit
·
2225c34
1
Parent(s):
6b70860
add latex to blogpost feature
Browse filesThis view is limited to 50 files because it contains too many changes.
See raw diff
- .gitignore +3 -2
- app/package.json +0 -0
- app/scripts/latex-converter/README.md +0 -107
- app/scripts/latex-converter/bibliography-cleaner.mjs +0 -123
- app/scripts/latex-converter/config.mjs +0 -59
- app/scripts/latex-converter/converter.mjs +0 -456
- app/scripts/latex-converter/image-transformer.mjs +0 -179
- app/scripts/latex-converter/index.mjs +0 -75
- app/scripts/latex-converter/preprocessor.mjs +0 -115
- app/scripts/latex-converter/robust-preprocessor.mjs +0 -399
- app/scripts/latex-to-mdx/README.md +169 -0
- app/scripts/latex-to-mdx/bib-cleaner.mjs +104 -0
- app/scripts/latex-to-mdx/filters/equation-ids.lua +134 -0
- app/scripts/latex-to-mdx/index.mjs +138 -0
- app/scripts/latex-to-mdx/input/.gitignore +13 -0
- app/scripts/latex-to-mdx/input/README.md +64 -0
- app/scripts/latex-to-mdx/input/_minted/62B8750C0ACEBDA39A95140434E540A8.highlight.minted +52 -0
- app/scripts/latex-to-mdx/input/_minted/_FAD58DE7366495DB4650CFEFAC2FCD61.index.minted +10 -0
- app/scripts/latex-to-mdx/input/_minted/colorful.style.minted +100 -0
- app/scripts/latex-to-mdx/input/fancyhdr.sty +485 -0
- app/scripts/latex-to-mdx/input/figures/ch1/ch1-lerobot-figure1.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-approaches.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-classical-limitations.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-cost-accessibility.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-floor-box.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-floor-shelf.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-floor.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-free.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-platforms.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch2/ch2-so100-to-planar-manipulator.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-agent-env.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-duck-sim-vs-real.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-hil-serl-examples.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-learning-atlas.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-learning-benefits.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-many-ducks.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-rl-algorithms-atlas.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-rl-examples.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-act-decoder.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-act-encoder.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-act.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-action-vs-observation-distribution.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-async-inference.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-bc-trajectories.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-policy.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-robot-actions.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-vs-flowmatching.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-issues-with-bc.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-latent-variable-model.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-many-latents.png +3 -0
.gitignore
CHANGED
|
@@ -19,7 +19,7 @@ node_modules/
|
|
| 19 |
*.env
|
| 20 |
*.cache
|
| 21 |
|
| 22 |
-
app/scripts/latex-converter/
|
| 23 |
|
| 24 |
# PDF export
|
| 25 |
app/public/*.pdf
|
|
@@ -27,4 +27,5 @@ app/public/*.png
|
|
| 27 |
app/public/*.jpg
|
| 28 |
app/public/data/**/*
|
| 29 |
|
| 30 |
-
.astro/
|
|
|
|
|
|
| 19 |
*.env
|
| 20 |
*.cache
|
| 21 |
|
| 22 |
+
app/scripts/latex-converter/output/
|
| 23 |
|
| 24 |
# PDF export
|
| 25 |
app/public/*.pdf
|
|
|
|
| 27 |
app/public/*.jpg
|
| 28 |
app/public/data/**/*
|
| 29 |
|
| 30 |
+
.astro/
|
| 31 |
+
|
app/package.json
CHANGED
|
Binary files a/app/package.json and b/app/package.json differ
|
|
|
app/scripts/latex-converter/README.md
DELETED
|
@@ -1,107 +0,0 @@
|
|
| 1 |
-
# Convertisseur LaTeX vers Markdown
|
| 2 |
-
|
| 3 |
-
Conversion robuste de projets LaTeX complexes vers Markdown/MDX pour Astro.
|
| 4 |
-
|
| 5 |
-
## 🚀 Usage rapide
|
| 6 |
-
|
| 7 |
-
```bash
|
| 8 |
-
# Conversion standard
|
| 9 |
-
node scripts/latex-converter/index.mjs
|
| 10 |
-
|
| 11 |
-
# Avec nettoyage du dossier de sortie
|
| 12 |
-
node scripts/latex-converter/index.mjs --clean
|
| 13 |
-
|
| 14 |
-
# Chemins personnalisés
|
| 15 |
-
node scripts/latex-converter/index.mjs \
|
| 16 |
-
--input=../tools/latex-to-markdown/input \
|
| 17 |
-
--output=src/content \
|
| 18 |
-
--clean
|
| 19 |
-
```
|
| 20 |
-
|
| 21 |
-
## 📁 Architecture
|
| 22 |
-
|
| 23 |
-
```
|
| 24 |
-
scripts/latex-converter/
|
| 25 |
-
├── index.mjs # Point d'entrée principal
|
| 26 |
-
├── config.mjs # Configuration et mappings
|
| 27 |
-
├── preprocessor.mjs # Préprocesseur LaTeX
|
| 28 |
-
├── bibliography-cleaner.mjs # Nettoyeur de bibliographie
|
| 29 |
-
├── converter.mjs # Convertisseur principal
|
| 30 |
-
└── README.md # Documentation
|
| 31 |
-
```
|
| 32 |
-
|
| 33 |
-
## 🔧 Fonctionnalités
|
| 34 |
-
|
| 35 |
-
### ✅ Ce qui est géré
|
| 36 |
-
- **412+ commandes personnalisées** (math, text, projet-spécifique)
|
| 37 |
-
- **Environnements custom** (`tldr`, `callout`, `finding`)
|
| 38 |
-
- **41 figures** avec organisation par chapitre
|
| 39 |
-
- **2247 entrées bibliographiques** avec nettoyage automatique
|
| 40 |
-
- **Citations** et références croisées
|
| 41 |
-
- **Structure MDX** compatible Astro
|
| 42 |
-
|
| 43 |
-
### 🛠️ Transformations automatiques
|
| 44 |
-
|
| 45 |
-
#### Commandes LaTeX → Markdown
|
| 46 |
-
```latex
|
| 47 |
-
\lerobot → **LeRobot**
|
| 48 |
-
\lerobotdataset → `LeRobotDataset`
|
| 49 |
-
\huggingface → 🤗 **Hugging Face**
|
| 50 |
-
\eg → e.g.,
|
| 51 |
-
\X → \mathcal{X}
|
| 52 |
-
```
|
| 53 |
-
|
| 54 |
-
#### Environnements → Callouts
|
| 55 |
-
```latex
|
| 56 |
-
\begin{tldr}
|
| 57 |
-
Content here
|
| 58 |
-
\end{tldr}
|
| 59 |
-
```
|
| 60 |
-
→
|
| 61 |
-
```markdown
|
| 62 |
-
> **TL;DR**
|
| 63 |
-
> Content here
|
| 64 |
-
```
|
| 65 |
-
|
| 66 |
-
#### Bibliographie
|
| 67 |
-
- `{{Title}}` → `Title` (suppression doubles accolades)
|
| 68 |
-
- `\&` → `&` (déséchappement)
|
| 69 |
-
- Nettoyage général du formatting
|
| 70 |
-
|
| 71 |
-
## 📊 Statistiques exemple
|
| 72 |
-
|
| 73 |
-
```
|
| 74 |
-
⏱️ Time: 1.02s
|
| 75 |
-
📄 Files: 9 sections converties
|
| 76 |
-
🖼️ Figures: 41 images copiées
|
| 77 |
-
📚 Citations: Detection automatique
|
| 78 |
-
🔧 Commands replaced: 34 transformations
|
| 79 |
-
📦 Environments processed: 4 environnements
|
| 80 |
-
📚 Bibliography: 159 entries, 403 fixes
|
| 81 |
-
```
|
| 82 |
-
|
| 83 |
-
## 🎯 Résultat
|
| 84 |
-
|
| 85 |
-
Structure finale dans `src/content/`:
|
| 86 |
-
```
|
| 87 |
-
src/content/
|
| 88 |
-
├── article.mdx # Article principal avec imports
|
| 89 |
-
├── bibliography.bib # Bibliographie nettoyée
|
| 90 |
-
├── chapters/ # Sections converties
|
| 91 |
-
│ ├── 00_abstract.mdx
|
| 92 |
-
│ ├── 01_introduction.mdx
|
| 93 |
-
│ └── ...
|
| 94 |
-
└── assets/image/ # Figures organisées
|
| 95 |
-
├── ch1/
|
| 96 |
-
├── ch2/
|
| 97 |
-
└── ...
|
| 98 |
-
```
|
| 99 |
-
|
| 100 |
-
## ⚠️ Prérequis
|
| 101 |
-
|
| 102 |
-
- **Pandoc** installé (`brew install pandoc`)
|
| 103 |
-
- Node.js avec support ESM
|
| 104 |
-
|
| 105 |
-
## 🔍 Debugging
|
| 106 |
-
|
| 107 |
-
Les warnings sont normaux pour les sections avec math complexe non supporté par Pandoc. Le convertisseur continue et produit un résultat utilisable.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/bibliography-cleaner.mjs
DELETED
|
@@ -1,123 +0,0 @@
|
|
| 1 |
-
/**
|
| 2 |
-
* Nettoyeur de bibliographie - Corrige les doubles accolades et problèmes de formatting
|
| 3 |
-
*/
|
| 4 |
-
|
| 5 |
-
export class BibliographyCleaner {
|
| 6 |
-
constructor() {
|
| 7 |
-
this.stats = {
|
| 8 |
-
entriesProcessed: 0,
|
| 9 |
-
doubleAccoladesFixed: 0,
|
| 10 |
-
escapedCharsFixed: 0,
|
| 11 |
-
mathExpressionsFixed: 0
|
| 12 |
-
};
|
| 13 |
-
}
|
| 14 |
-
|
| 15 |
-
cleanContent(content) {
|
| 16 |
-
let cleaned = content;
|
| 17 |
-
|
| 18 |
-
// Count entries
|
| 19 |
-
this.stats.entriesProcessed = (content.match(/@\w+\{/g) || []).length;
|
| 20 |
-
|
| 21 |
-
// Fix double accolades
|
| 22 |
-
cleaned = this.fixDoubleAccolades(cleaned);
|
| 23 |
-
|
| 24 |
-
// Fix escaped characters
|
| 25 |
-
cleaned = this.fixEscapedCharacters(cleaned);
|
| 26 |
-
|
| 27 |
-
// Fix malformed math expressions
|
| 28 |
-
cleaned = this.fixMathExpressions(cleaned);
|
| 29 |
-
|
| 30 |
-
// General cleanup
|
| 31 |
-
cleaned = this.generalCleanup(cleaned);
|
| 32 |
-
|
| 33 |
-
return cleaned;
|
| 34 |
-
}
|
| 35 |
-
|
| 36 |
-
fixDoubleAccolades(content) {
|
| 37 |
-
let fixed = content;
|
| 38 |
-
let fixCount = 0;
|
| 39 |
-
|
| 40 |
-
fixed = fixed.replace(/\{\{([^}]+)\}\}/g, (match, inner) => {
|
| 41 |
-
fixCount++;
|
| 42 |
-
|
| 43 |
-
// Keep accolades for important terms
|
| 44 |
-
if (/^[A-Z][A-Z0-9]*$/.test(inner) || // Acronyms like "API", "ML"
|
| 45 |
-
/^[A-Z][a-z]*(?:\s+[A-Z][a-z]*)*$/.test(inner) || // Proper nouns
|
| 46 |
-
inner.includes('++') || // Languages like "C++"
|
| 47 |
-
inner.includes('$') // Math
|
| 48 |
-
) {
|
| 49 |
-
return `{${inner}}`;
|
| 50 |
-
}
|
| 51 |
-
|
| 52 |
-
return inner;
|
| 53 |
-
});
|
| 54 |
-
|
| 55 |
-
this.stats.doubleAccoladesFixed = fixCount;
|
| 56 |
-
return fixed;
|
| 57 |
-
}
|
| 58 |
-
|
| 59 |
-
fixEscapedCharacters(content) {
|
| 60 |
-
let fixed = content;
|
| 61 |
-
let fixCount = 0;
|
| 62 |
-
|
| 63 |
-
const replacements = [
|
| 64 |
-
[/\\&/g, '&'],
|
| 65 |
-
[/\\\$/g, '$'],
|
| 66 |
-
[/\\%/g, '%'],
|
| 67 |
-
[/\\#/g, '#'],
|
| 68 |
-
[/\\_/g, '_']
|
| 69 |
-
];
|
| 70 |
-
|
| 71 |
-
for (const [pattern, replacement] of replacements) {
|
| 72 |
-
const matches = fixed.match(pattern);
|
| 73 |
-
if (matches) {
|
| 74 |
-
fixCount += matches.length;
|
| 75 |
-
fixed = fixed.replace(pattern, replacement);
|
| 76 |
-
}
|
| 77 |
-
}
|
| 78 |
-
|
| 79 |
-
this.stats.escapedCharsFixed = fixCount;
|
| 80 |
-
return fixed;
|
| 81 |
-
}
|
| 82 |
-
|
| 83 |
-
fixMathExpressions(content) {
|
| 84 |
-
let fixed = content;
|
| 85 |
-
let fixCount = 0;
|
| 86 |
-
|
| 87 |
-
// Fix specific problematic patterns
|
| 88 |
-
const mathFixes = [
|
| 89 |
-
// ${$\pi_$}0$ → $\pi_0$
|
| 90 |
-
[/\$\{\$\\pi_\$\}([0-9]+)\$/g, '$\\pi_$1$'],
|
| 91 |
-
// ${$something$}text$ → $something_text$
|
| 92 |
-
[/\$\{\$([^}]+)\$\}([^$]*)\$/g, '$$$1_$2$$'],
|
| 93 |
-
// Fix other malformed patterns
|
| 94 |
-
[/\$\{([^}]+)\}\$/g, '$$$1$$'],
|
| 95 |
-
[/\$([^$]*)\\\$([^$]*)\$/g, '$$$1$2$$']
|
| 96 |
-
];
|
| 97 |
-
|
| 98 |
-
for (const [pattern, replacement] of mathFixes) {
|
| 99 |
-
const matches = fixed.match(pattern);
|
| 100 |
-
if (matches) {
|
| 101 |
-
fixCount += matches.length;
|
| 102 |
-
fixed = fixed.replace(pattern, replacement);
|
| 103 |
-
}
|
| 104 |
-
}
|
| 105 |
-
|
| 106 |
-
this.stats.mathExpressionsFixed = fixCount;
|
| 107 |
-
return fixed;
|
| 108 |
-
}
|
| 109 |
-
|
| 110 |
-
generalCleanup(content) {
|
| 111 |
-
let cleaned = content;
|
| 112 |
-
|
| 113 |
-
// Normalize whitespace
|
| 114 |
-
cleaned = cleaned.replace(/\n{3,}/g, '\n\n');
|
| 115 |
-
cleaned = cleaned.trim() + '\n';
|
| 116 |
-
|
| 117 |
-
return cleaned;
|
| 118 |
-
}
|
| 119 |
-
|
| 120 |
-
getStats() {
|
| 121 |
-
return this.stats;
|
| 122 |
-
}
|
| 123 |
-
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/config.mjs
DELETED
|
@@ -1,59 +0,0 @@
|
|
| 1 |
-
/**
|
| 2 |
-
* Configuration et mappings pour la conversion LaTeX vers Markdown
|
| 3 |
-
*/
|
| 4 |
-
|
| 5 |
-
export const COMMAND_MAPPINGS = {
|
| 6 |
-
// Math shortcuts
|
| 7 |
-
'X': '\\mathcal{X}',
|
| 8 |
-
'Z': '\\mathcal{Z}',
|
| 9 |
-
'G': '\\mathcal{G}',
|
| 10 |
-
'D': '\\mathcal{D}',
|
| 11 |
-
'F': '\\mathcal{F}',
|
| 12 |
-
'R': '\\mathcal{R}',
|
| 13 |
-
|
| 14 |
-
// Text commands
|
| 15 |
-
'eg': 'e.g.,',
|
| 16 |
-
'ie': 'i.e.,',
|
| 17 |
-
'versus': 'vs.',
|
| 18 |
-
'wrt': 'w.r.t.',
|
| 19 |
-
'etc': 'etc.',
|
| 20 |
-
|
| 21 |
-
// Project-specific
|
| 22 |
-
'lerobot': '**LeRobot**',
|
| 23 |
-
'lerobotdataset': '`LeRobotDataset`',
|
| 24 |
-
'huggingface': '🤗 **Hugging Face**',
|
| 25 |
-
|
| 26 |
-
// Functions
|
| 27 |
-
'qfunction': 'Q-function',
|
| 28 |
-
'qopt': 'Q^*'
|
| 29 |
-
};
|
| 30 |
-
|
| 31 |
-
export const ENVIRONMENT_MAPPINGS = {
|
| 32 |
-
'tldr': {
|
| 33 |
-
start: '> **TL;DR**\n> ',
|
| 34 |
-
end: '\n',
|
| 35 |
-
type: 'callout'
|
| 36 |
-
},
|
| 37 |
-
'callout': {
|
| 38 |
-
start: '> **Note**\n> ',
|
| 39 |
-
end: '\n',
|
| 40 |
-
type: 'callout'
|
| 41 |
-
},
|
| 42 |
-
'finding': {
|
| 43 |
-
start: '> **🔍 Finding**: ',
|
| 44 |
-
end: '\n',
|
| 45 |
-
type: 'finding'
|
| 46 |
-
}
|
| 47 |
-
};
|
| 48 |
-
|
| 49 |
-
export const PANDOC_OPTIONS = [
|
| 50 |
-
'--from=latex',
|
| 51 |
-
'--to=markdown',
|
| 52 |
-
'--wrap=preserve',
|
| 53 |
-
'--markdown-headings=atx'
|
| 54 |
-
];
|
| 55 |
-
|
| 56 |
-
export const DEFAULT_PATHS = {
|
| 57 |
-
input: '../tools/latex-to-markdown/input',
|
| 58 |
-
output: 'src/content'
|
| 59 |
-
};
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/converter.mjs
DELETED
|
@@ -1,456 +0,0 @@
|
|
| 1 |
-
/**
|
| 2 |
-
* Convertisseur principal LaTeX vers Markdown
|
| 3 |
-
*/
|
| 4 |
-
|
| 5 |
-
import { spawn } from 'node:child_process';
|
| 6 |
-
import { promises as fs } from 'node:fs';
|
| 7 |
-
import { resolve, dirname, basename, join } from 'node:path';
|
| 8 |
-
|
| 9 |
-
import { LaTeXPreprocessor } from './preprocessor.mjs';
|
| 10 |
-
import { RobustLaTeXPreprocessor } from './robust-preprocessor.mjs';
|
| 11 |
-
import { BibliographyCleaner } from './bibliography-cleaner.mjs';
|
| 12 |
-
import { DEFAULT_PATHS, PANDOC_OPTIONS } from './config.mjs';
|
| 13 |
-
|
| 14 |
-
export class LaTeXConverter {
|
| 15 |
-
constructor() {
|
| 16 |
-
this.preprocessor = new LaTeXPreprocessor();
|
| 17 |
-
this.robustPreprocessor = new RobustLaTeXPreprocessor();
|
| 18 |
-
this.bibCleaner = new BibliographyCleaner();
|
| 19 |
-
this.stats = {
|
| 20 |
-
totalFiles: 0,
|
| 21 |
-
totalFigures: 0,
|
| 22 |
-
totalCitations: 0,
|
| 23 |
-
conversionTime: 0
|
| 24 |
-
};
|
| 25 |
-
this.warnings = [];
|
| 26 |
-
this.errors = [];
|
| 27 |
-
}
|
| 28 |
-
|
| 29 |
-
async convert(inputDir, outputDir, options = {}) {
|
| 30 |
-
const startTime = Date.now();
|
| 31 |
-
|
| 32 |
-
console.log('🚀 LaTeX to Markdown Converter');
|
| 33 |
-
console.log(`📁 Input: ${inputDir}`);
|
| 34 |
-
console.log(`📁 Output: ${outputDir}`);
|
| 35 |
-
|
| 36 |
-
try {
|
| 37 |
-
// Setup
|
| 38 |
-
await this.setupOutput(outputDir, options.clean);
|
| 39 |
-
|
| 40 |
-
// Convert sections
|
| 41 |
-
await this.convertSections(inputDir, outputDir);
|
| 42 |
-
|
| 43 |
-
// Handle assets
|
| 44 |
-
await this.handleAssets(inputDir, outputDir);
|
| 45 |
-
|
| 46 |
-
// Create main article
|
| 47 |
-
await this.createMainArticle(outputDir);
|
| 48 |
-
|
| 49 |
-
// Generate report
|
| 50 |
-
this.stats.conversionTime = Date.now() - startTime;
|
| 51 |
-
this.generateReport();
|
| 52 |
-
|
| 53 |
-
console.log('🎉 Conversion completed successfully!');
|
| 54 |
-
return true;
|
| 55 |
-
|
| 56 |
-
} catch (error) {
|
| 57 |
-
this.errors.push(`Conversion failed: ${error.message}`);
|
| 58 |
-
throw error;
|
| 59 |
-
}
|
| 60 |
-
}
|
| 61 |
-
|
| 62 |
-
async setupOutput(outputDir, clean = false) {
|
| 63 |
-
if (clean) {
|
| 64 |
-
console.log('🧹 Cleaning output directory...');
|
| 65 |
-
await fs.rm(outputDir, { recursive: true, force: true });
|
| 66 |
-
}
|
| 67 |
-
|
| 68 |
-
await fs.mkdir(outputDir, { recursive: true });
|
| 69 |
-
await fs.mkdir(join(outputDir, 'chapters'), { recursive: true });
|
| 70 |
-
await fs.mkdir(join(outputDir, 'assets', 'image'), { recursive: true });
|
| 71 |
-
}
|
| 72 |
-
|
| 73 |
-
async convertSections(inputDir, outputDir) {
|
| 74 |
-
console.log('\n📄 Converting sections...');
|
| 75 |
-
|
| 76 |
-
const sectionsDir = join(inputDir, 'sections');
|
| 77 |
-
const outputChaptersDir = join(outputDir, 'chapters');
|
| 78 |
-
|
| 79 |
-
try {
|
| 80 |
-
const files = await fs.readdir(sectionsDir);
|
| 81 |
-
const texFiles = files.filter(f => f.endsWith('.tex'));
|
| 82 |
-
|
| 83 |
-
for (const file of texFiles) {
|
| 84 |
-
const inputPath = join(sectionsDir, file);
|
| 85 |
-
const outputPath = join(outputChaptersDir, file.replace('.tex', '.mdx'));
|
| 86 |
-
|
| 87 |
-
console.log(` Converting ${file}...`);
|
| 88 |
-
await this.convertSingleFile(inputPath, outputPath);
|
| 89 |
-
}
|
| 90 |
-
|
| 91 |
-
this.stats.totalFiles = texFiles.length;
|
| 92 |
-
|
| 93 |
-
} catch (error) {
|
| 94 |
-
this.errors.push(`Section conversion failed: ${error.message}`);
|
| 95 |
-
}
|
| 96 |
-
}
|
| 97 |
-
|
| 98 |
-
async convertSingleFile(inputPath, outputPath) {
|
| 99 |
-
try {
|
| 100 |
-
// Read and preprocess with robust preprocessor
|
| 101 |
-
let content = await fs.readFile(inputPath, 'utf-8');
|
| 102 |
-
content = this.robustPreprocessor.preprocessContent(content, basename(inputPath));
|
| 103 |
-
|
| 104 |
-
// Create temp file for Pandoc
|
| 105 |
-
const tempPath = inputPath + '.temp';
|
| 106 |
-
await fs.writeFile(tempPath, content);
|
| 107 |
-
|
| 108 |
-
// Convert with Pandoc
|
| 109 |
-
const pandocArgs = [tempPath, '-o', outputPath, ...PANDOC_OPTIONS];
|
| 110 |
-
await this.runPandoc(pandocArgs);
|
| 111 |
-
|
| 112 |
-
// Cleanup
|
| 113 |
-
await fs.unlink(tempPath);
|
| 114 |
-
|
| 115 |
-
// Post-process
|
| 116 |
-
await this.postProcessFile(outputPath);
|
| 117 |
-
|
| 118 |
-
} catch (error) {
|
| 119 |
-
this.warnings.push(`Failed to convert ${basename(inputPath)}: ${error.message}`);
|
| 120 |
-
}
|
| 121 |
-
}
|
| 122 |
-
|
| 123 |
-
async runPandoc(args) {
|
| 124 |
-
return new Promise((resolve, reject) => {
|
| 125 |
-
const child = spawn('pandoc', args, {
|
| 126 |
-
stdio: ['pipe', 'pipe', 'pipe'],
|
| 127 |
-
shell: false
|
| 128 |
-
});
|
| 129 |
-
|
| 130 |
-
let stderr = '';
|
| 131 |
-
child.stderr.on('data', (data) => {
|
| 132 |
-
stderr += data.toString();
|
| 133 |
-
});
|
| 134 |
-
|
| 135 |
-
child.on('error', reject);
|
| 136 |
-
child.on('exit', (code) => {
|
| 137 |
-
if (code === 0) {
|
| 138 |
-
resolve();
|
| 139 |
-
} else {
|
| 140 |
-
reject(new Error(`Pandoc failed: ${stderr}`));
|
| 141 |
-
}
|
| 142 |
-
});
|
| 143 |
-
});
|
| 144 |
-
}
|
| 145 |
-
|
| 146 |
-
fixMalformedMath(content) {
|
| 147 |
-
let fixed = content;
|
| 148 |
-
|
| 149 |
-
// Fix problematic expressions like ${$\pi_$}0$
|
| 150 |
-
fixed = fixed.replace(/\$\{\$([^$}]+)\$\}([^$]*)\$/g, '$$$1_{$2}$$');
|
| 151 |
-
|
| 152 |
-
// Fix nested math delimiters
|
| 153 |
-
fixed = fixed.replace(/\$\$([^$]*)\$([^$]*)\$([^$]*)\$\$/g, '$$$1 $2 $3$$');
|
| 154 |
-
|
| 155 |
-
// Fix incomplete math expressions
|
| 156 |
-
fixed = fixed.replace(/\$([^$]*)\{([^}]*)\$([^$]*)\$/g, '$$$1\\{$2\\}$3$$');
|
| 157 |
-
|
| 158 |
-
// Fix math with unescaped braces
|
| 159 |
-
fixed = fixed.replace(/\$([^$]*)\{([^}]*)\}([^$]*)\$/g, '$$$1\\{$2\\}$3$$');
|
| 160 |
-
|
| 161 |
-
// Fix common pi expressions
|
| 162 |
-
fixed = fixed.replace(/\$\\pi_\$([0-9]+)\$/g, '$\\pi_$1$');
|
| 163 |
-
fixed = fixed.replace(/\$\{\\pi_\}([0-9]+)\$/g, '$\\pi_$1$');
|
| 164 |
-
|
| 165 |
-
// Fix doubled dollar signs (but preserve display math)
|
| 166 |
-
fixed = fixed.replace(/\$\$\$+/g, '$$');
|
| 167 |
-
|
| 168 |
-
// Ensure proper spacing around math
|
| 169 |
-
fixed = fixed.replace(/([a-zA-Z])\$([^$]+)\$([a-zA-Z])/g, '$1 $$$2$$ $3');
|
| 170 |
-
|
| 171 |
-
return fixed;
|
| 172 |
-
}
|
| 173 |
-
|
| 174 |
-
fixMDXUrls(content) {
|
| 175 |
-
let fixed = content;
|
| 176 |
-
|
| 177 |
-
// Fix all escaped markdown that should be unescaped for MDX
|
| 178 |
-
fixed = fixed.replace(/\\\*/g, '*');
|
| 179 |
-
fixed = fixed.replace(/\\\[/g, '[');
|
| 180 |
-
fixed = fixed.replace(/\\\]/g, ']');
|
| 181 |
-
fixed = fixed.replace(/\\\(/g, '(');
|
| 182 |
-
fixed = fixed.replace(/\\\)/g, ')');
|
| 183 |
-
fixed = fixed.replace(/\\>/g, '>');
|
| 184 |
-
fixed = fixed.replace(/\\!/g, '!');
|
| 185 |
-
|
| 186 |
-
// Fix angle bracket URLs that are MDX-incompatible
|
| 187 |
-
fixed = fixed.replace(/\*\*<(https?:\/\/[^>]+)>\*\*/g, '**[$1]($1)**');
|
| 188 |
-
fixed = fixed.replace(/<(https?:\/\/[^>]+)>/g, '[$1]($1)');
|
| 189 |
-
|
| 190 |
-
// Fix malformed math expressions with escaped braces
|
| 191 |
-
fixed = fixed.replace(/\\\{/g, '{');
|
| 192 |
-
fixed = fixed.replace(/\\\}/g, '}');
|
| 193 |
-
|
| 194 |
-
// Escape all braces in math expressions for MDX compatibility
|
| 195 |
-
fixed = fixed.replace(/\$([^$]*)\$/g, (match, mathContent) => {
|
| 196 |
-
const escaped = mathContent.replace(/\{/g, '\\{').replace(/\}/g, '\\}');
|
| 197 |
-
return `$${escaped}$`;
|
| 198 |
-
});
|
| 199 |
-
|
| 200 |
-
fixed = fixed.replace(/\$\$([^$]*)\$\$/g, (match, mathContent) => {
|
| 201 |
-
const escaped = mathContent.replace(/\{/g, '\\{').replace(/\}/g, '\\}');
|
| 202 |
-
return `$$${escaped}$$`;
|
| 203 |
-
});
|
| 204 |
-
|
| 205 |
-
// Fix Section references that are malformed
|
| 206 |
-
fixed = fixed.replace(/Section\s+([a-zA-Z-]+:[a-zA-Z0-9-]+)\\/g, 'the referenced figure');
|
| 207 |
-
fixed = fixed.replace(/Figure\s+Section\s+([a-zA-Z-]+:[a-zA-Z0-9-]+)\\/g, 'the referenced figure');
|
| 208 |
-
|
| 209 |
-
return fixed;
|
| 210 |
-
}
|
| 211 |
-
|
| 212 |
-
async postProcessFile(filePath) {
|
| 213 |
-
try {
|
| 214 |
-
let content = await fs.readFile(filePath, 'utf-8');
|
| 215 |
-
|
| 216 |
-
// Fix common issues
|
| 217 |
-
content = content.replace(/\\\\#/g, '#');
|
| 218 |
-
content = content.replace(/\\\\!/g, '!');
|
| 219 |
-
content = content.replace(/\\\\\*/g, '*');
|
| 220 |
-
|
| 221 |
-
// Fix citations
|
| 222 |
-
content = content.replace(/\\citep\{([^}]+)\}/g, '[@$1]');
|
| 223 |
-
content = content.replace(/\\citet\{([^}]+)\}/g, '@$1');
|
| 224 |
-
content = content.replace(/\\cite\{([^}]+)\}/g, '[@$1]');
|
| 225 |
-
|
| 226 |
-
// Remove section labels from headers
|
| 227 |
-
content = content.replace(/^(#{1,6}.*?)\s*\{#[^}]+\}/gm, '$1');
|
| 228 |
-
|
| 229 |
-
// Fix complex LaTeX references like [\[sec:xxx\]](#sec:xxx){reference-type="ref" reference="sec:xxx"}
|
| 230 |
-
content = content.replace(/\[\\?\[([^\]]+)\\?\]\]\(#[^)]+\)\{[^}]*reference[^}]*\}/g, 'Section $1');
|
| 231 |
-
|
| 232 |
-
// Fix simple references [\[ref\]](#ref)
|
| 233 |
-
content = content.replace(/\[\\?\[([^\]]+)\\?\]\]\(#[^)]+\)/g, '$1');
|
| 234 |
-
|
| 235 |
-
// Fix remaining malformed references like "Section Section sec:classical\"
|
| 236 |
-
content = content.replace(/Section\s+Section\s+([^\\]+)\\/g, 'Section $1');
|
| 237 |
-
content = content.replace(/Section\s+Section\s+([^\\]+)/g, 'Section $1');
|
| 238 |
-
|
| 239 |
-
// Remove remaining LaTeX labels and references
|
| 240 |
-
content = content.replace(/\\label\{[^}]+\}/g, '');
|
| 241 |
-
content = content.replace(/\\ref\{[^}]+\}/g, '[Reference]');
|
| 242 |
-
|
| 243 |
-
// Clean up section references with colons (be more specific)
|
| 244 |
-
content = content.replace(/Section\s+sec:([a-zA-Z-]+)/g, 'the following section');
|
| 245 |
-
|
| 246 |
-
// Fix broken section references that got mangled
|
| 247 |
-
content = content.replace(/Section\s+secs[a-zA-Z]*\s+/g, 'The following section ');
|
| 248 |
-
content = content.replace(/Section\s+sec[a-zA-Z]*\s+/g, 'The following section ');
|
| 249 |
-
|
| 250 |
-
// Count citations
|
| 251 |
-
const citations = content.match(/\[@[^\]]+\]/g) || [];
|
| 252 |
-
this.stats.totalCitations += citations.length;
|
| 253 |
-
|
| 254 |
-
// Fix malformed math expressions
|
| 255 |
-
content = this.fixMalformedMath(content);
|
| 256 |
-
|
| 257 |
-
// Fix MDX-incompatible URLs (post-pandoc)
|
| 258 |
-
content = this.fixMDXUrls(content);
|
| 259 |
-
|
| 260 |
-
// Final cleanup
|
| 261 |
-
content = content.replace(/\n{3,}/g, '\n\n');
|
| 262 |
-
content = content.replace(/\\texttt\{([^}]+)\}/g, '`$1`');
|
| 263 |
-
content = content.replace(/\\textbf\{([^}]+)\}/g, '**$1**');
|
| 264 |
-
content = content.replace(/\\emph\{([^}]+)\}/g, '*$1*');
|
| 265 |
-
content = content.trim();
|
| 266 |
-
|
| 267 |
-
await fs.writeFile(filePath, content);
|
| 268 |
-
|
| 269 |
-
} catch (error) {
|
| 270 |
-
this.warnings.push(`Post-processing failed for ${basename(filePath)}: ${error.message}`);
|
| 271 |
-
}
|
| 272 |
-
}
|
| 273 |
-
|
| 274 |
-
async handleAssets(inputDir, outputDir) {
|
| 275 |
-
console.log('\n🖼️ Handling assets...');
|
| 276 |
-
|
| 277 |
-
// Copy figures
|
| 278 |
-
try {
|
| 279 |
-
const figuresInputDir = join(inputDir, 'figures');
|
| 280 |
-
const assetsOutputDir = join(outputDir, 'assets', 'image');
|
| 281 |
-
|
| 282 |
-
await this.copyDirectoryRecursive(figuresInputDir, assetsOutputDir);
|
| 283 |
-
this.stats.totalFigures = await this.countFiles(assetsOutputDir, /\.(png|jpg|jpeg|pdf|svg)$/i);
|
| 284 |
-
|
| 285 |
-
console.log(` 📊 Copied ${this.stats.totalFigures} figures`);
|
| 286 |
-
} catch (error) {
|
| 287 |
-
this.warnings.push(`Could not copy figures: ${error.message}`);
|
| 288 |
-
}
|
| 289 |
-
|
| 290 |
-
// Handle bibliography
|
| 291 |
-
try {
|
| 292 |
-
const bibPath = join(inputDir, 'main.bib');
|
| 293 |
-
const outputBibPath = join(outputDir, 'bibliography.bib');
|
| 294 |
-
|
| 295 |
-
// Copy and clean bibliography
|
| 296 |
-
let bibContent = await fs.readFile(bibPath, 'utf-8');
|
| 297 |
-
bibContent = this.bibCleaner.cleanContent(bibContent);
|
| 298 |
-
await fs.writeFile(outputBibPath, bibContent);
|
| 299 |
-
|
| 300 |
-
const bibStats = this.bibCleaner.getStats();
|
| 301 |
-
console.log(` 📚 Bibliography: ${bibStats.entriesProcessed} entries, ${bibStats.doubleAccoladesFixed} fixes, ${bibStats.mathExpressionsFixed} math fixes`);
|
| 302 |
-
|
| 303 |
-
} catch (error) {
|
| 304 |
-
this.warnings.push(`Could not handle bibliography: ${error.message}`);
|
| 305 |
-
}
|
| 306 |
-
}
|
| 307 |
-
|
| 308 |
-
async copyDirectoryRecursive(src, dest) {
|
| 309 |
-
await fs.mkdir(dest, { recursive: true });
|
| 310 |
-
const entries = await fs.readdir(src, { withFileTypes: true });
|
| 311 |
-
|
| 312 |
-
for (const entry of entries) {
|
| 313 |
-
const srcPath = join(src, entry.name);
|
| 314 |
-
const destPath = join(dest, entry.name);
|
| 315 |
-
|
| 316 |
-
if (entry.isDirectory()) {
|
| 317 |
-
await this.copyDirectoryRecursive(srcPath, destPath);
|
| 318 |
-
} else {
|
| 319 |
-
await fs.copyFile(srcPath, destPath);
|
| 320 |
-
}
|
| 321 |
-
}
|
| 322 |
-
}
|
| 323 |
-
|
| 324 |
-
async countFiles(dir, pattern) {
|
| 325 |
-
let count = 0;
|
| 326 |
-
try {
|
| 327 |
-
const entries = await fs.readdir(dir, { withFileTypes: true });
|
| 328 |
-
|
| 329 |
-
for (const entry of entries) {
|
| 330 |
-
if (entry.isDirectory()) {
|
| 331 |
-
count += await this.countFiles(join(dir, entry.name), pattern);
|
| 332 |
-
} else if (pattern.test(entry.name)) {
|
| 333 |
-
count++;
|
| 334 |
-
}
|
| 335 |
-
}
|
| 336 |
-
} catch {
|
| 337 |
-
// Directory doesn't exist
|
| 338 |
-
}
|
| 339 |
-
|
| 340 |
-
return count;
|
| 341 |
-
}
|
| 342 |
-
|
| 343 |
-
async createMainArticle(outputDir) {
|
| 344 |
-
console.log('\n📝 Creating main article...');
|
| 345 |
-
|
| 346 |
-
try {
|
| 347 |
-
const chaptersDir = join(outputDir, 'chapters');
|
| 348 |
-
const files = await fs.readdir(chaptersDir);
|
| 349 |
-
const mdxFiles = files.filter(f => f.endsWith('.mdx')).sort();
|
| 350 |
-
|
| 351 |
-
const frontmatter = this.generateFrontmatter();
|
| 352 |
-
const { imports, components } = this.generateChapterImports(mdxFiles);
|
| 353 |
-
|
| 354 |
-
const articleContent = frontmatter + imports + '\n\n' + components;
|
| 355 |
-
|
| 356 |
-
const articlePath = join(outputDir, 'article.mdx');
|
| 357 |
-
await fs.writeFile(articlePath, articleContent);
|
| 358 |
-
|
| 359 |
-
console.log(` 📄 Created article.mdx with ${mdxFiles.length} chapters`);
|
| 360 |
-
|
| 361 |
-
} catch (error) {
|
| 362 |
-
this.errors.push(`Failed to create main article: ${error.message}`);
|
| 363 |
-
}
|
| 364 |
-
}
|
| 365 |
-
|
| 366 |
-
generateFrontmatter() {
|
| 367 |
-
const now = new Date().toISOString().split('T')[0];
|
| 368 |
-
|
| 369 |
-
return `---
|
| 370 |
-
title: "Robot Learning: A Tutorial"
|
| 371 |
-
subtitle: "From Classical Robotics to Foundation Models"
|
| 372 |
-
description: "A comprehensive guide to modern robot learning techniques"
|
| 373 |
-
date: "${now}"
|
| 374 |
-
authors:
|
| 375 |
-
- name: "Francesco Capuano"
|
| 376 |
-
affiliations: [1, 2]
|
| 377 |
-
- name: "Adil Zouitine"
|
| 378 |
-
affiliations: [2]
|
| 379 |
-
- name: "Pepijn Kooijmans"
|
| 380 |
-
affiliations: [2]
|
| 381 |
-
- name: "Thomas Wolf"
|
| 382 |
-
affiliations: [2]
|
| 383 |
-
- name: "Michel Aractingi"
|
| 384 |
-
affiliations: [2]
|
| 385 |
-
affiliations:
|
| 386 |
-
- name: "École Normale Supérieure Paris-Saclay"
|
| 387 |
-
url: "https://ens-paris-saclay.fr"
|
| 388 |
-
- name: "Hugging Face"
|
| 389 |
-
url: "https://huggingface.co"
|
| 390 |
-
tags:
|
| 391 |
-
- robotics
|
| 392 |
-
- machine-learning
|
| 393 |
-
- tutorial
|
| 394 |
-
bibliography: bibliography.bib
|
| 395 |
-
converted_from: "LaTeX"
|
| 396 |
-
---
|
| 397 |
-
|
| 398 |
-
`;
|
| 399 |
-
}
|
| 400 |
-
|
| 401 |
-
generateChapterImports(mdxFiles) {
|
| 402 |
-
let imports = '';
|
| 403 |
-
let components = '';
|
| 404 |
-
|
| 405 |
-
mdxFiles.forEach(file => {
|
| 406 |
-
const sectionName = basename(file, '.mdx');
|
| 407 |
-
const componentName = this.formatComponentName(sectionName);
|
| 408 |
-
|
| 409 |
-
imports += `import ${componentName} from "./chapters/${sectionName}.mdx";\n`;
|
| 410 |
-
components += `<${componentName} />\n\n`;
|
| 411 |
-
});
|
| 412 |
-
|
| 413 |
-
return { imports, components };
|
| 414 |
-
}
|
| 415 |
-
|
| 416 |
-
formatComponentName(sectionName) {
|
| 417 |
-
let componentName = sectionName
|
| 418 |
-
.split(/[_-]/)
|
| 419 |
-
.map(part => part.charAt(0).toUpperCase() + part.slice(1))
|
| 420 |
-
.join('');
|
| 421 |
-
|
| 422 |
-
if (/^\d/.test(componentName)) {
|
| 423 |
-
componentName = 'Chapter' + componentName;
|
| 424 |
-
}
|
| 425 |
-
|
| 426 |
-
if (componentName === 'AForword') componentName = 'Foreword';
|
| 427 |
-
if (componentName === 'Chapter00Abstract') componentName = 'Abstract';
|
| 428 |
-
|
| 429 |
-
return componentName;
|
| 430 |
-
}
|
| 431 |
-
|
| 432 |
-
generateReport() {
|
| 433 |
-
console.log('\n📊 Conversion Report:');
|
| 434 |
-
console.log('=====================');
|
| 435 |
-
console.log(`⏱️ Time: ${(this.stats.conversionTime / 1000).toFixed(2)}s`);
|
| 436 |
-
console.log(`📄 Files: ${this.stats.totalFiles}`);
|
| 437 |
-
console.log(`🖼️ Figures: ${this.stats.totalFigures}`);
|
| 438 |
-
console.log(`📚 Citations: ${this.stats.totalCitations}`);
|
| 439 |
-
console.log(`⚠️ Warnings: ${this.warnings.length}`);
|
| 440 |
-
console.log(`❌ Errors: ${this.errors.length}`);
|
| 441 |
-
|
| 442 |
-
const robustStats = this.robustPreprocessor.getStats();
|
| 443 |
-
console.log(`🔧 Commands replaced: ${robustStats.commandsReplaced}`);
|
| 444 |
-
console.log(`📦 Environments processed: ${robustStats.environmentsProcessed}`);
|
| 445 |
-
console.log(`🖼️ Figures processed: ${robustStats.figuresProcessed}`);
|
| 446 |
-
console.log(`📐 Math expressions fixed: ${robustStats.mathExpressionsFixed}`);
|
| 447 |
-
|
| 448 |
-
if (this.warnings.length > 0 && this.warnings.length <= 3) {
|
| 449 |
-
console.log('\n⚠️ Warnings:');
|
| 450 |
-
this.warnings.forEach(w => console.log(` - ${w}`));
|
| 451 |
-
} else if (this.warnings.length > 3) {
|
| 452 |
-
console.log(`\n⚠️ ${this.warnings.length} warnings:`);
|
| 453 |
-
this.warnings.forEach(w => console.log(` - ${w.substring(0, 150)}...`));
|
| 454 |
-
}
|
| 455 |
-
}
|
| 456 |
-
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/image-transformer.mjs
DELETED
|
@@ -1,179 +0,0 @@
|
|
| 1 |
-
/**
|
| 2 |
-
* Transformateur d'images : Markdown → ResponsiveImage Astro
|
| 3 |
-
* Convertit les images markdown en composants ResponsiveImage optimisés
|
| 4 |
-
*/
|
| 5 |
-
|
| 6 |
-
import { promises as fs } from 'node:fs';
|
| 7 |
-
import { dirname, basename, extname, resolve, relative } from 'node:path';
|
| 8 |
-
|
| 9 |
-
export class ImageTransformer {
|
| 10 |
-
constructor() {
|
| 11 |
-
this.stats = {
|
| 12 |
-
filesProcessed: 0,
|
| 13 |
-
imagesTransformed: 0,
|
| 14 |
-
importsAdded: 0
|
| 15 |
-
};
|
| 16 |
-
}
|
| 17 |
-
|
| 18 |
-
async transformImagesInDirectory(contentDir) {
|
| 19 |
-
const chaptersDir = resolve(contentDir, 'chapters');
|
| 20 |
-
|
| 21 |
-
try {
|
| 22 |
-
const files = await fs.readdir(chaptersDir);
|
| 23 |
-
const mdxFiles = files.filter(file => file.endsWith('.mdx'));
|
| 24 |
-
|
| 25 |
-
for (const file of mdxFiles) {
|
| 26 |
-
const filePath = resolve(chaptersDir, file);
|
| 27 |
-
await this.transformImagesInFile(filePath, contentDir);
|
| 28 |
-
this.stats.filesProcessed++;
|
| 29 |
-
}
|
| 30 |
-
|
| 31 |
-
console.log(`📸 Image transformation completed:`);
|
| 32 |
-
console.log(` 📄 Files processed: ${this.stats.filesProcessed}`);
|
| 33 |
-
console.log(` 🖼️ Images transformed: ${this.stats.imagesTransformed}`);
|
| 34 |
-
console.log(` 📦 Imports added: ${this.stats.importsAdded}`);
|
| 35 |
-
|
| 36 |
-
} catch (error) {
|
| 37 |
-
console.error('Error transforming images:', error.message);
|
| 38 |
-
}
|
| 39 |
-
}
|
| 40 |
-
|
| 41 |
-
async transformImagesInFile(filePath, contentDir) {
|
| 42 |
-
try {
|
| 43 |
-
let content = await fs.readFile(filePath, 'utf-8');
|
| 44 |
-
|
| 45 |
-
const imageInfo = this.extractImageInfo(content);
|
| 46 |
-
if (imageInfo.length === 0) {
|
| 47 |
-
return; // No images to transform
|
| 48 |
-
}
|
| 49 |
-
|
| 50 |
-
const imports = this.generateImports(imageInfo, filePath, contentDir);
|
| 51 |
-
const transformedContent = this.transformImageReferences(content, imageInfo);
|
| 52 |
-
|
| 53 |
-
// Add imports at the top of the file
|
| 54 |
-
const finalContent = this.addImportsToFile(transformedContent, imports);
|
| 55 |
-
|
| 56 |
-
await fs.writeFile(filePath, finalContent);
|
| 57 |
-
|
| 58 |
-
this.stats.imagesTransformed += imageInfo.length;
|
| 59 |
-
this.stats.importsAdded += imports.length;
|
| 60 |
-
|
| 61 |
-
} catch (error) {
|
| 62 |
-
console.error(`Error processing ${filePath}:`, error.message);
|
| 63 |
-
}
|
| 64 |
-
}
|
| 65 |
-
|
| 66 |
-
extractImageInfo(content) {
|
| 67 |
-
// More robust regex that handles complex alt text with brackets and parentheses
|
| 68 |
-
const imageRegex = /!\[([^\]]*(?:\[[^\]]*\][^\]]*)*)\]\(([^)]+)\)(?:\s*(#[^\s]+))?/g;
|
| 69 |
-
const images = [];
|
| 70 |
-
let match;
|
| 71 |
-
|
| 72 |
-
while ((match = imageRegex.exec(content)) !== null) {
|
| 73 |
-
const [fullMatch, alt, src, id] = match;
|
| 74 |
-
|
| 75 |
-
// Only process relative image paths (not external URLs)
|
| 76 |
-
if (!src.startsWith('http') && !src.startsWith('//')) {
|
| 77 |
-
images.push({
|
| 78 |
-
fullMatch,
|
| 79 |
-
alt: alt || 'Figure',
|
| 80 |
-
src,
|
| 81 |
-
id: id ? id.substring(1) : null, // Remove # from id
|
| 82 |
-
variableName: this.generateVariableName(src)
|
| 83 |
-
});
|
| 84 |
-
}
|
| 85 |
-
}
|
| 86 |
-
|
| 87 |
-
return images;
|
| 88 |
-
}
|
| 89 |
-
|
| 90 |
-
generateVariableName(imagePath) {
|
| 91 |
-
// Convert path to valid variable name
|
| 92 |
-
// assets/image/ch4/ch4-bc-trajectories.png → ch4BcTrajectories
|
| 93 |
-
const filename = basename(imagePath, extname(imagePath));
|
| 94 |
-
|
| 95 |
-
return filename
|
| 96 |
-
.replace(/[-_]/g, ' ')
|
| 97 |
-
.replace(/\b\w/g, l => l.toUpperCase())
|
| 98 |
-
.replace(/\s/g, '')
|
| 99 |
-
.replace(/^\d+/, 'Fig$&'); // Prefix with Fig if starts with number
|
| 100 |
-
}
|
| 101 |
-
|
| 102 |
-
generateImports(imageInfo, filePath, contentDir) {
|
| 103 |
-
const imports = [];
|
| 104 |
-
|
| 105 |
-
// Add ResponsiveImage import
|
| 106 |
-
imports.push("import ResponsiveImage from '../../components/ResponsiveImage.astro'");
|
| 107 |
-
|
| 108 |
-
// Add image imports
|
| 109 |
-
for (const image of imageInfo) {
|
| 110 |
-
const relativePath = this.getRelativeImagePath(image.src, filePath, contentDir);
|
| 111 |
-
imports.push(`import ${image.variableName} from '${relativePath}'`);
|
| 112 |
-
}
|
| 113 |
-
|
| 114 |
-
return imports;
|
| 115 |
-
}
|
| 116 |
-
|
| 117 |
-
getRelativeImagePath(imageSrc, filePath, contentDir) {
|
| 118 |
-
// Convert absolute image path to relative from chapter file
|
| 119 |
-
// From: chapters/04_imitation_learning.mdx
|
| 120 |
-
// To: ../assets/image/ch4/ch4-bc-trajectories.png
|
| 121 |
-
|
| 122 |
-
const chapterDir = dirname(filePath);
|
| 123 |
-
const imageAbsolutePath = resolve(contentDir, imageSrc);
|
| 124 |
-
const relativePath = relative(chapterDir, imageAbsolutePath);
|
| 125 |
-
|
| 126 |
-
return relativePath.startsWith('.') ? relativePath : `./${relativePath}`;
|
| 127 |
-
}
|
| 128 |
-
|
| 129 |
-
transformImageReferences(content, imageInfo) {
|
| 130 |
-
let transformed = content;
|
| 131 |
-
|
| 132 |
-
for (const image of imageInfo) {
|
| 133 |
-
const componentTag = this.generateResponsiveImageTag(image);
|
| 134 |
-
transformed = transformed.replace(image.fullMatch, componentTag);
|
| 135 |
-
}
|
| 136 |
-
|
| 137 |
-
return transformed;
|
| 138 |
-
}
|
| 139 |
-
|
| 140 |
-
generateResponsiveImageTag(image) {
|
| 141 |
-
const props = [
|
| 142 |
-
`src={${image.variableName}}`,
|
| 143 |
-
`alt="${image.alt}"`
|
| 144 |
-
];
|
| 145 |
-
|
| 146 |
-
if (image.id) {
|
| 147 |
-
props.push(`id="${image.id}"`);
|
| 148 |
-
}
|
| 149 |
-
|
| 150 |
-
return `<ResponsiveImage ${props.join(' ')} />`;
|
| 151 |
-
}
|
| 152 |
-
|
| 153 |
-
addImportsToFile(content, imports) {
|
| 154 |
-
if (imports.length === 0) {
|
| 155 |
-
return content;
|
| 156 |
-
}
|
| 157 |
-
|
| 158 |
-
// Check if there are already imports at the top
|
| 159 |
-
const lines = content.split('\n');
|
| 160 |
-
let insertIndex = 0;
|
| 161 |
-
|
| 162 |
-
// Skip existing imports
|
| 163 |
-
while (insertIndex < lines.length &&
|
| 164 |
-
(lines[insertIndex].startsWith('import ') ||
|
| 165 |
-
lines[insertIndex].trim() === '')) {
|
| 166 |
-
insertIndex++;
|
| 167 |
-
}
|
| 168 |
-
|
| 169 |
-
// Insert imports
|
| 170 |
-
const importBlock = imports.join('\n') + '\n\n';
|
| 171 |
-
lines.splice(insertIndex, 0, importBlock);
|
| 172 |
-
|
| 173 |
-
return lines.join('\n');
|
| 174 |
-
}
|
| 175 |
-
|
| 176 |
-
getStats() {
|
| 177 |
-
return this.stats;
|
| 178 |
-
}
|
| 179 |
-
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/index.mjs
DELETED
|
@@ -1,75 +0,0 @@
|
|
| 1 |
-
#!/usr/bin/env node
|
| 2 |
-
/**
|
| 3 |
-
* Point d'entrée principal pour la conversion LaTeX vers Markdown
|
| 4 |
-
*
|
| 5 |
-
* Usage: node scripts/latex-converter/index.mjs [--input=path] [--output=path] [--clean]
|
| 6 |
-
*/
|
| 7 |
-
|
| 8 |
-
import { resolve } from 'node:path';
|
| 9 |
-
import { spawn } from 'node:child_process';
|
| 10 |
-
import process from 'node:process';
|
| 11 |
-
|
| 12 |
-
import { LaTeXConverter } from './converter.mjs';
|
| 13 |
-
import { ImageTransformer } from './image-transformer.mjs';
|
| 14 |
-
import { DEFAULT_PATHS } from './config.mjs';
|
| 15 |
-
|
| 16 |
-
function parseArgs(argv) {
|
| 17 |
-
const out = {};
|
| 18 |
-
for (const arg of argv.slice(2)) {
|
| 19 |
-
if (!arg.startsWith('--')) continue;
|
| 20 |
-
const [k, v] = arg.replace(/^--/, '').split('=');
|
| 21 |
-
out[k] = v === undefined ? true : v;
|
| 22 |
-
}
|
| 23 |
-
return out;
|
| 24 |
-
}
|
| 25 |
-
|
| 26 |
-
async function checkPandoc() {
|
| 27 |
-
try {
|
| 28 |
-
const child = spawn('pandoc', ['--version'], { stdio: 'pipe' });
|
| 29 |
-
return new Promise((resolve) => {
|
| 30 |
-
child.on('exit', (code) => resolve(code === 0));
|
| 31 |
-
child.on('error', () => resolve(false));
|
| 32 |
-
});
|
| 33 |
-
} catch {
|
| 34 |
-
return false;
|
| 35 |
-
}
|
| 36 |
-
}
|
| 37 |
-
|
| 38 |
-
async function main() {
|
| 39 |
-
const cwd = process.cwd();
|
| 40 |
-
const args = parseArgs(process.argv);
|
| 41 |
-
|
| 42 |
-
// Vérifier Pandoc
|
| 43 |
-
const hasPandoc = await checkPandoc();
|
| 44 |
-
if (!hasPandoc) {
|
| 45 |
-
console.error('❌ Pandoc n\'est pas installé.');
|
| 46 |
-
console.error(' macOS: brew install pandoc');
|
| 47 |
-
console.error(' Ubuntu: apt-get install pandoc');
|
| 48 |
-
process.exit(1);
|
| 49 |
-
}
|
| 50 |
-
|
| 51 |
-
// Chemins
|
| 52 |
-
const inputDir = resolve(cwd, args.input || DEFAULT_PATHS.input);
|
| 53 |
-
const outputDir = resolve(cwd, args.output || DEFAULT_PATHS.output);
|
| 54 |
-
|
| 55 |
-
try {
|
| 56 |
-
const converter = new LaTeXConverter();
|
| 57 |
-
await converter.convert(inputDir, outputDir, {
|
| 58 |
-
clean: args.clean || false
|
| 59 |
-
});
|
| 60 |
-
|
| 61 |
-
// Transform images to ResponsiveImage components
|
| 62 |
-
console.log('\n📸 Transforming images to ResponsiveImage components...');
|
| 63 |
-
const imageTransformer = new ImageTransformer();
|
| 64 |
-
await imageTransformer.transformImagesInDirectory(outputDir);
|
| 65 |
-
|
| 66 |
-
} catch (error) {
|
| 67 |
-
console.error('❌ Conversion échouée:', error.message);
|
| 68 |
-
process.exit(1);
|
| 69 |
-
}
|
| 70 |
-
}
|
| 71 |
-
|
| 72 |
-
main().catch(err => {
|
| 73 |
-
console.error('❌ Erreur fatale:', err);
|
| 74 |
-
process.exit(1);
|
| 75 |
-
});
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/preprocessor.mjs
DELETED
|
@@ -1,115 +0,0 @@
|
|
| 1 |
-
/**
|
| 2 |
-
* Préprocesseur LaTeX - Nettoie et simplifie le contenu LaTeX
|
| 3 |
-
*/
|
| 4 |
-
|
| 5 |
-
import { COMMAND_MAPPINGS, ENVIRONMENT_MAPPINGS } from './config.mjs';
|
| 6 |
-
|
| 7 |
-
export class LaTeXPreprocessor {
|
| 8 |
-
constructor() {
|
| 9 |
-
this.stats = {
|
| 10 |
-
commandsReplaced: 0,
|
| 11 |
-
environmentsProcessed: 0,
|
| 12 |
-
figuresFixed: 0
|
| 13 |
-
};
|
| 14 |
-
}
|
| 15 |
-
|
| 16 |
-
preprocessContent(content) {
|
| 17 |
-
let processed = content;
|
| 18 |
-
|
| 19 |
-
// Remove comments
|
| 20 |
-
processed = processed.replace(/%.*$/gm, '');
|
| 21 |
-
|
| 22 |
-
// Apply command mappings
|
| 23 |
-
processed = this.applyCommandMappings(processed);
|
| 24 |
-
|
| 25 |
-
// Process custom environments
|
| 26 |
-
processed = this.processCustomEnvironments(processed);
|
| 27 |
-
|
| 28 |
-
// Fix figures
|
| 29 |
-
processed = this.fixFigures(processed);
|
| 30 |
-
|
| 31 |
-
// General cleanup
|
| 32 |
-
processed = processed.replace(/\n{3,}/g, '\n\n');
|
| 33 |
-
processed = processed.trim();
|
| 34 |
-
|
| 35 |
-
return processed;
|
| 36 |
-
}
|
| 37 |
-
|
| 38 |
-
applyCommandMappings(content) {
|
| 39 |
-
let processed = content;
|
| 40 |
-
|
| 41 |
-
for (const [command, replacement] of Object.entries(COMMAND_MAPPINGS)) {
|
| 42 |
-
const regex = new RegExp(`\\\\${command.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}(?![a-zA-Z])`, 'g');
|
| 43 |
-
const matches = processed.match(regex);
|
| 44 |
-
if (matches) {
|
| 45 |
-
this.stats.commandsReplaced += matches.length;
|
| 46 |
-
processed = processed.replace(regex, replacement);
|
| 47 |
-
}
|
| 48 |
-
}
|
| 49 |
-
|
| 50 |
-
return processed;
|
| 51 |
-
}
|
| 52 |
-
|
| 53 |
-
processCustomEnvironments(content) {
|
| 54 |
-
let processed = content;
|
| 55 |
-
|
| 56 |
-
// Convert tldr environment
|
| 57 |
-
processed = processed.replace(
|
| 58 |
-
/\\begin\{tldr\}(.*?)\\end\{tldr\}/gs,
|
| 59 |
-
(match, content) => {
|
| 60 |
-
this.stats.environmentsProcessed++;
|
| 61 |
-
return `> **TL;DR**\n> ${content.trim()}\n`;
|
| 62 |
-
}
|
| 63 |
-
);
|
| 64 |
-
|
| 65 |
-
// Convert callout environment
|
| 66 |
-
processed = processed.replace(
|
| 67 |
-
/\\begin\{callout\}\{([^}]*)\}(.*?)\\end\{callout\}/gs,
|
| 68 |
-
(match, title, content) => {
|
| 69 |
-
this.stats.environmentsProcessed++;
|
| 70 |
-
return `> **${title}**\n> ${content.trim()}\n`;
|
| 71 |
-
}
|
| 72 |
-
);
|
| 73 |
-
|
| 74 |
-
// Convert finding environment
|
| 75 |
-
processed = processed.replace(
|
| 76 |
-
/\\finding\{([^}]*)\}\{([^}]*)\}/g,
|
| 77 |
-
(match, number, content) => {
|
| 78 |
-
this.stats.environmentsProcessed++;
|
| 79 |
-
return `> **🔍 Finding ${number}**: ${content}\n`;
|
| 80 |
-
}
|
| 81 |
-
);
|
| 82 |
-
|
| 83 |
-
return processed;
|
| 84 |
-
}
|
| 85 |
-
|
| 86 |
-
fixFigures(content) {
|
| 87 |
-
let fixed = content;
|
| 88 |
-
|
| 89 |
-
// Fix complex figure environments
|
| 90 |
-
const figurePattern = /\\begin\{figure\}[\s\S]*?\\includegraphics(?:\[[^\]]*\])?\{([^}]+)\}[\s\S]*?\\caption\{([^}]+)\}[\s\S]*?(?:\\label\{([^}]+)\})?[\s\S]*?\\end\{figure\}/g;
|
| 91 |
-
|
| 92 |
-
fixed = fixed.replace(figurePattern, (match, imagePath, caption, label) => {
|
| 93 |
-
this.stats.figuresFixed++;
|
| 94 |
-
const cleanPath = imagePath.replace(/^figures\//, 'assets/image/');
|
| 95 |
-
const labelAttr = label ? ` {#fig-${label}}` : '';
|
| 96 |
-
return `\n${labelAttr}\n\n*${caption}*\n`;
|
| 97 |
-
});
|
| 98 |
-
|
| 99 |
-
// Fix simple includegraphics
|
| 100 |
-
fixed = fixed.replace(
|
| 101 |
-
/\\includegraphics(?:\[[^\]]*\])?\{([^}]+)\}/g,
|
| 102 |
-
(match, imagePath) => {
|
| 103 |
-
this.stats.figuresFixed++;
|
| 104 |
-
const cleanPath = imagePath.replace(/^figures\//, 'assets/image/');
|
| 105 |
-
return ``;
|
| 106 |
-
}
|
| 107 |
-
);
|
| 108 |
-
|
| 109 |
-
return fixed;
|
| 110 |
-
}
|
| 111 |
-
|
| 112 |
-
getStats() {
|
| 113 |
-
return this.stats;
|
| 114 |
-
}
|
| 115 |
-
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-converter/robust-preprocessor.mjs
DELETED
|
@@ -1,399 +0,0 @@
|
|
| 1 |
-
/**
|
| 2 |
-
* Préprocesseur LaTeX Ultra-Robuste
|
| 3 |
-
* Gère les cas complexes qui font planter Pandoc
|
| 4 |
-
*/
|
| 5 |
-
|
| 6 |
-
export class RobustLaTeXPreprocessor {
|
| 7 |
-
constructor() {
|
| 8 |
-
this.stats = {
|
| 9 |
-
figuresProcessed: 0,
|
| 10 |
-
citationsFixed: 0,
|
| 11 |
-
mathExpressionsFixed: 0,
|
| 12 |
-
environmentsProcessed: 0,
|
| 13 |
-
commandsReplaced: 0
|
| 14 |
-
};
|
| 15 |
-
this.debugMode = false;
|
| 16 |
-
}
|
| 17 |
-
|
| 18 |
-
preprocessContent(content, filename = 'unknown') {
|
| 19 |
-
if (this.debugMode) {
|
| 20 |
-
console.log(` 🔍 [DEBUG] Processing ${filename}...`);
|
| 21 |
-
}
|
| 22 |
-
|
| 23 |
-
let processed = content;
|
| 24 |
-
|
| 25 |
-
// Phase 1: Structure cleanup (most important first)
|
| 26 |
-
processed = this.phase1_StructureCleanup(processed);
|
| 27 |
-
|
| 28 |
-
// Phase 2: Content transformation
|
| 29 |
-
processed = this.phase2_ContentTransformation(processed);
|
| 30 |
-
|
| 31 |
-
// Phase 3: Final polish
|
| 32 |
-
processed = this.phase3_FinalPolish(processed);
|
| 33 |
-
|
| 34 |
-
return processed;
|
| 35 |
-
}
|
| 36 |
-
|
| 37 |
-
phase1_StructureCleanup(content) {
|
| 38 |
-
let cleaned = content;
|
| 39 |
-
|
| 40 |
-
// Remove comments (but preserve structure)
|
| 41 |
-
cleaned = cleaned.replace(/%.*$/gm, '');
|
| 42 |
-
|
| 43 |
-
// Fix broken line breaks that split words
|
| 44 |
-
cleaned = this.fixBrokenLineBreaks(cleaned);
|
| 45 |
-
|
| 46 |
-
// Fix broken equation environments
|
| 47 |
-
cleaned = this.fixBrokenEquations(cleaned);
|
| 48 |
-
|
| 49 |
-
// Fix broken figure environments BEFORE processing
|
| 50 |
-
cleaned = this.fixComplexFigures(cleaned);
|
| 51 |
-
|
| 52 |
-
// Handle problematic environments early
|
| 53 |
-
cleaned = this.handleProblematicEnvironments(cleaned);
|
| 54 |
-
|
| 55 |
-
return cleaned;
|
| 56 |
-
}
|
| 57 |
-
|
| 58 |
-
fixBrokenLineBreaks(content) {
|
| 59 |
-
let fixed = content;
|
| 60 |
-
|
| 61 |
-
// Fix hyphenated words broken across lines
|
| 62 |
-
// "length-\nT\nT" → "length-T"
|
| 63 |
-
fixed = fixed.replace(/([a-zA-Z])-\s*\n\s*([A-Z])\s*\n\s*\2/g, '$1-$2');
|
| 64 |
-
|
| 65 |
-
// Fix broken compound words
|
| 66 |
-
// "some-\nword" → "some-word"
|
| 67 |
-
fixed = fixed.replace(/([a-zA-Z])-\s*\n\s*([a-z])/g, '$1-$2');
|
| 68 |
-
|
| 69 |
-
// Fix sentences that got broken inappropriately
|
| 70 |
-
// "word.Sentence" → "word. Sentence"
|
| 71 |
-
fixed = fixed.replace(/([a-z])\.([A-Z])/g, '$1. $2');
|
| 72 |
-
|
| 73 |
-
return fixed;
|
| 74 |
-
}
|
| 75 |
-
|
| 76 |
-
fixBrokenEquations(content) {
|
| 77 |
-
let fixed = content;
|
| 78 |
-
|
| 79 |
-
// Fix mixed equation environments
|
| 80 |
-
// "\end{equation}$" → "$$"
|
| 81 |
-
fixed = fixed.replace(/\\end\{equation\}\$/g, '$$');
|
| 82 |
-
fixed = fixed.replace(/\$\\begin\{equation\}/g, '$$');
|
| 83 |
-
|
| 84 |
-
// Fix broken align environments
|
| 85 |
-
fixed = fixed.replace(/([^$])\s*&=\s*/g, '$1 &= ');
|
| 86 |
-
|
| 87 |
-
// Fix multiline math that lost structure
|
| 88 |
-
fixed = fixed.replace(/\$([^$]*?)&=([^$]*?)\$/g, '$$\\begin{align}\n$1 &= $2\n\\end{align}$$');
|
| 89 |
-
|
| 90 |
-
return fixed;
|
| 91 |
-
}
|
| 92 |
-
|
| 93 |
-
fixComplexFigures(content) {
|
| 94 |
-
let fixed = content;
|
| 95 |
-
|
| 96 |
-
// Strategy: Convert complex figures to simple markdown BEFORE Pandoc sees them
|
| 97 |
-
const figurePattern = /\\begin\{figure\*?\}([\s\S]*?)\\end\{figure\*?\}/g;
|
| 98 |
-
const wrapfigurePattern = /\\begin\{wrapfigure\}(?:\[[^\]]*\])?\{[^}]*\}\{[^}]*\}([\s\S]*?)\\end\{wrapfigure\}/g;
|
| 99 |
-
|
| 100 |
-
fixed = fixed.replace(figurePattern, (match, figureContent) => {
|
| 101 |
-
this.stats.figuresProcessed++;
|
| 102 |
-
|
| 103 |
-
// Extract components safely
|
| 104 |
-
const imageMatch = figureContent.match(/\\includegraphics(?:\[[^\]]*\])?\{([^}]+)\}/);
|
| 105 |
-
const captionMatch = figureContent.match(/\\caption\{([\s\S]*?)\}(?=\s*(?:\\label|\\end|\}|$))/);
|
| 106 |
-
const labelMatch = figureContent.match(/\\label\{([^}]+)\}/);
|
| 107 |
-
|
| 108 |
-
if (!imageMatch) {
|
| 109 |
-
return match; // Keep original if we can't parse it
|
| 110 |
-
}
|
| 111 |
-
|
| 112 |
-
const imagePath = imageMatch[1].replace(/^figures\//, 'assets/image/');
|
| 113 |
-
let caption = captionMatch ? captionMatch[1].trim() : 'Figure';
|
| 114 |
-
const label = labelMatch ? labelMatch[1] : '';
|
| 115 |
-
|
| 116 |
-
// Clean caption thoroughly
|
| 117 |
-
caption = this.cleanCaption(caption);
|
| 118 |
-
|
| 119 |
-
// Generate clean markdown
|
| 120 |
-
const labelAttr = label ? ` {#fig-${label}}` : '';
|
| 121 |
-
|
| 122 |
-
return `\n\n${labelAttr}\n\n*${caption}*\n\n`;
|
| 123 |
-
});
|
| 124 |
-
|
| 125 |
-
// Also handle wrapfigure environments
|
| 126 |
-
fixed = fixed.replace(wrapfigurePattern, (match, figureContent) => {
|
| 127 |
-
this.stats.figuresProcessed++;
|
| 128 |
-
|
| 129 |
-
// Extract components safely
|
| 130 |
-
const imageMatch = figureContent.match(/\\includegraphics(?:\[[^\]]*\])?\{([^}]+)\}/);
|
| 131 |
-
const captionMatch = figureContent.match(/\\caption\{([\s\S]*?)\}(?=\s*(?:\\label|\\end|\}|$))/);
|
| 132 |
-
const labelMatch = figureContent.match(/\\label\{([^}]+)\}/);
|
| 133 |
-
|
| 134 |
-
if (!imageMatch) {
|
| 135 |
-
return match; // Keep original if we can't parse it
|
| 136 |
-
}
|
| 137 |
-
|
| 138 |
-
const imagePath = imageMatch[1].replace(/^figures\//, 'assets/image/');
|
| 139 |
-
let caption = captionMatch ? captionMatch[1].trim() : 'Figure';
|
| 140 |
-
const label = labelMatch ? labelMatch[1] : '';
|
| 141 |
-
|
| 142 |
-
// Clean caption thoroughly
|
| 143 |
-
caption = this.cleanCaption(caption);
|
| 144 |
-
|
| 145 |
-
// Generate clean markdown (simpler for wrapfigure)
|
| 146 |
-
const labelAttr = label ? ` {#fig-${label}}` : '';
|
| 147 |
-
|
| 148 |
-
return `\n\n${labelAttr}\n\n`;
|
| 149 |
-
});
|
| 150 |
-
|
| 151 |
-
return fixed;
|
| 152 |
-
}
|
| 153 |
-
|
| 154 |
-
cleanCaption(caption) {
|
| 155 |
-
let cleaned = caption;
|
| 156 |
-
|
| 157 |
-
// Handle citations in captions properly
|
| 158 |
-
cleaned = cleaned.replace(/~\\cite[tp]?\{([^}]+)\}/g, ' [@$1]');
|
| 159 |
-
cleaned = cleaned.replace(/\\cite[tp]?\{([^}]+)\}/g, '[@$1]');
|
| 160 |
-
|
| 161 |
-
// Remove problematic LaTeX commands
|
| 162 |
-
cleaned = cleaned.replace(/\\textit\{([^}]+)\}/g, '*$1*');
|
| 163 |
-
cleaned = cleaned.replace(/\\textbf\{([^}]+)\}/g, '**$1**');
|
| 164 |
-
cleaned = cleaned.replace(/\\emph\{([^}]+)\}/g, '*$1*');
|
| 165 |
-
|
| 166 |
-
// Fix \textsc with complex content
|
| 167 |
-
cleaned = cleaned.replace(/\\textsc\{([^}]*\([^)]*\)[^}]*)\}/g, '**$1**');
|
| 168 |
-
|
| 169 |
-
// Handle nested braces safely
|
| 170 |
-
let depth = 0;
|
| 171 |
-
let result = '';
|
| 172 |
-
for (let i = 0; i < cleaned.length; i++) {
|
| 173 |
-
const char = cleaned[i];
|
| 174 |
-
if (char === '{') {
|
| 175 |
-
depth++;
|
| 176 |
-
if (depth === 1) continue; // Skip opening brace
|
| 177 |
-
} else if (char === '}') {
|
| 178 |
-
depth--;
|
| 179 |
-
if (depth === 0) continue; // Skip closing brace
|
| 180 |
-
} else {
|
| 181 |
-
result += char;
|
| 182 |
-
}
|
| 183 |
-
}
|
| 184 |
-
|
| 185 |
-
return result.trim();
|
| 186 |
-
}
|
| 187 |
-
|
| 188 |
-
handleProblematicEnvironments(content) {
|
| 189 |
-
let fixed = content;
|
| 190 |
-
|
| 191 |
-
// Handle algorithm environments
|
| 192 |
-
fixed = fixed.replace(/\\begin\{algorithm\}([\s\S]*?)\\end\{algorithm\}/g, (match, algContent) => {
|
| 193 |
-
return '\n```\nAlgorithm:\n' + algContent.replace(/\\[a-zA-Z]+/g, '') + '\n```\n';
|
| 194 |
-
});
|
| 195 |
-
|
| 196 |
-
// Handle complex math environments
|
| 197 |
-
fixed = fixed.replace(/\\begin\{align\*?\}([\s\S]*?)\\end\{align\*?\}/g, (match, mathContent) => {
|
| 198 |
-
const cleaned = mathContent.replace(/\\&/g, '').replace(/\\\\/g, '\n');
|
| 199 |
-
return '\n$$\n' + cleaned + '\n$$\n';
|
| 200 |
-
});
|
| 201 |
-
|
| 202 |
-
return fixed;
|
| 203 |
-
}
|
| 204 |
-
|
| 205 |
-
phase2_ContentTransformation(content) {
|
| 206 |
-
let transformed = content;
|
| 207 |
-
|
| 208 |
-
// Apply command mappings (safer order)
|
| 209 |
-
transformed = this.applyCommandMappings(transformed);
|
| 210 |
-
|
| 211 |
-
// Process custom environments
|
| 212 |
-
transformed = this.processCustomEnvironments(transformed);
|
| 213 |
-
|
| 214 |
-
// Handle remaining citations
|
| 215 |
-
transformed = this.processCitations(transformed);
|
| 216 |
-
|
| 217 |
-
return transformed;
|
| 218 |
-
}
|
| 219 |
-
|
| 220 |
-
applyCommandMappings(content) {
|
| 221 |
-
let processed = content;
|
| 222 |
-
|
| 223 |
-
// Safe command replacements (most common first)
|
| 224 |
-
const safeCommands = {
|
| 225 |
-
'eg': 'e.g.,',
|
| 226 |
-
'ie': 'i.e.,',
|
| 227 |
-
'versus': 'vs.',
|
| 228 |
-
'wrt': 'w.r.t.',
|
| 229 |
-
'etc': 'etc.',
|
| 230 |
-
'lerobot': '**LeRobot**',
|
| 231 |
-
'lerobotdataset': '`LeRobotDataset`',
|
| 232 |
-
'huggingface': '🤗 **Hugging Face**',
|
| 233 |
-
'qfunction': 'Q-function',
|
| 234 |
-
'qopt': 'Q^*',
|
| 235 |
-
// Robotics-specific commands from handles.tex
|
| 236 |
-
'actionchunk': '\\mathbf{A}',
|
| 237 |
-
'actionexpert': '\\mathbf{v}_\\theta',
|
| 238 |
-
'pizero': '\\pi_0',
|
| 239 |
-
'statespace': '\\mathcal{S}',
|
| 240 |
-
'actionspace': '\\mathcal{A}',
|
| 241 |
-
'obsspace': '\\mathcal{O}',
|
| 242 |
-
'dynamics': '\\mathcal{D}',
|
| 243 |
-
'stateplusone': 's_{t+1}',
|
| 244 |
-
'state': 's_t',
|
| 245 |
-
'action': 'a_t',
|
| 246 |
-
'transition': '(s_t, a_t, s_{t+1})',
|
| 247 |
-
'sars': '(s_t, a_t, r_t, s_{t+1})',
|
| 248 |
-
'transitiongiven': '(s_{t+1} | s_t, a_t)',
|
| 249 |
-
'transitionprob': '\\mathbb{P}(s_{t+1} | s_t, a_t)',
|
| 250 |
-
'trajectory': '(s_0, a_0, r_0, s_1, a_1, r_1, \\dots, s_{T-1}, a_{T-1}, r_{T-1}, s_T)',
|
| 251 |
-
'Jpi': 'J(\\pi_\\theta)',
|
| 252 |
-
'supp': '\\text{supp}',
|
| 253 |
-
'DKL': '\\text{D}_{\\text{KL}}',
|
| 254 |
-
'FK': '\\text{FK}',
|
| 255 |
-
'targetvel': '\\dot{p}^*',
|
| 256 |
-
'targetpos': 'p^*'
|
| 257 |
-
};
|
| 258 |
-
|
| 259 |
-
for (const [command, replacement] of Object.entries(safeCommands)) {
|
| 260 |
-
const regex = new RegExp(`\\\\${command}(?![a-zA-Z])`, 'g');
|
| 261 |
-
const matches = processed.match(regex);
|
| 262 |
-
if (matches) {
|
| 263 |
-
this.stats.commandsReplaced += matches.length;
|
| 264 |
-
processed = processed.replace(regex, replacement);
|
| 265 |
-
}
|
| 266 |
-
}
|
| 267 |
-
|
| 268 |
-
// Math commands (more careful)
|
| 269 |
-
const mathCommands = ['X', 'Z', 'G', 'D', 'F', 'R', 'S', 'T', 'U', 'Y'];
|
| 270 |
-
mathCommands.forEach(letter => {
|
| 271 |
-
const regex = new RegExp(`\\\\${letter}(?![a-zA-Z])`, 'g');
|
| 272 |
-
processed = processed.replace(regex, `\\mathcal{${letter}}`);
|
| 273 |
-
});
|
| 274 |
-
|
| 275 |
-
// Handle commands with subscripts (like \actionchunk_t)
|
| 276 |
-
processed = processed.replace(/\\actionchunk_t/g, '\\mathbf{A}_t');
|
| 277 |
-
processed = processed.replace(/\\actionexpert_([a-zA-Z0-9]+)/g, '\\mathbf{v}_{\\theta_$1}');
|
| 278 |
-
processed = processed.replace(/\\state_([a-zA-Z0-9]+)/g, 's_{$1}');
|
| 279 |
-
processed = processed.replace(/\\action_([a-zA-Z0-9]+)/g, 'a_{$1}');
|
| 280 |
-
|
| 281 |
-
// Fix problematic \textsc commands with complex content
|
| 282 |
-
processed = processed.replace(/\\textsc\{([^{}]*\([^)]*\)[^{}]*)\}/g, '**$1**');
|
| 283 |
-
processed = processed.replace(/\\textsc\{([^}]+)\}/g, '**$1**');
|
| 284 |
-
|
| 285 |
-
// Fix \url commands to make them MDX-compatible
|
| 286 |
-
processed = processed.replace(/\\textbf\{\\url\{([^}]+)\}\}/g, '**[$1]($1)**');
|
| 287 |
-
processed = processed.replace(/\\url\{([^}]+)\}/g, '[$1]($1)');
|
| 288 |
-
|
| 289 |
-
return processed;
|
| 290 |
-
}
|
| 291 |
-
|
| 292 |
-
processCustomEnvironments(content) {
|
| 293 |
-
let processed = content;
|
| 294 |
-
|
| 295 |
-
// TL;DR environment
|
| 296 |
-
processed = processed.replace(
|
| 297 |
-
/\\begin\{tldr\}([\s\S]*?)\\end\{tldr\}/g,
|
| 298 |
-
(match, content) => {
|
| 299 |
-
this.stats.environmentsProcessed++;
|
| 300 |
-
return `\n> **TL;DR**\n> ${content.trim()}\n\n`;
|
| 301 |
-
}
|
| 302 |
-
);
|
| 303 |
-
|
| 304 |
-
// Callout environment
|
| 305 |
-
processed = processed.replace(
|
| 306 |
-
/\\begin\{callout\}\{([^}]*)\}([\s\S]*?)\\end\{callout\}/g,
|
| 307 |
-
(match, title, content) => {
|
| 308 |
-
this.stats.environmentsProcessed++;
|
| 309 |
-
return `\n> **${title}**\n> ${content.trim()}\n\n`;
|
| 310 |
-
}
|
| 311 |
-
);
|
| 312 |
-
|
| 313 |
-
// Finding command
|
| 314 |
-
processed = processed.replace(
|
| 315 |
-
/\\finding\{([^}]*)\}\{([^}]*)\}/g,
|
| 316 |
-
(match, number, content) => {
|
| 317 |
-
this.stats.environmentsProcessed++;
|
| 318 |
-
return `\n> **🔍 Finding ${number}**: ${content}\n\n`;
|
| 319 |
-
}
|
| 320 |
-
);
|
| 321 |
-
|
| 322 |
-
return processed;
|
| 323 |
-
}
|
| 324 |
-
|
| 325 |
-
processCitations(content) {
|
| 326 |
-
let processed = content;
|
| 327 |
-
|
| 328 |
-
// Handle different citation types
|
| 329 |
-
processed = processed.replace(/\\citep\{([^}]+)\}/g, '[@$1]');
|
| 330 |
-
processed = processed.replace(/\\citet\{([^}]+)\}/g, '@$1');
|
| 331 |
-
processed = processed.replace(/\\cite\{([^}]+)\}/g, '[@$1]');
|
| 332 |
-
|
| 333 |
-
// Handle spaced citations (common issue)
|
| 334 |
-
processed = processed.replace(/~\\cite/g, ' \\cite');
|
| 335 |
-
processed = processed.replace(/~\[@/g, ' [@');
|
| 336 |
-
|
| 337 |
-
// Count citations
|
| 338 |
-
const citations = processed.match(/\[@[^\]]+\]/g) || [];
|
| 339 |
-
this.stats.citationsFixed += citations.length;
|
| 340 |
-
|
| 341 |
-
return processed;
|
| 342 |
-
}
|
| 343 |
-
|
| 344 |
-
phase3_FinalPolish(content) {
|
| 345 |
-
let polished = content;
|
| 346 |
-
|
| 347 |
-
// Fix math expressions
|
| 348 |
-
polished = this.fixMathExpressions(polished);
|
| 349 |
-
|
| 350 |
-
// Clean up whitespace and structure
|
| 351 |
-
polished = this.finalCleanup(polished);
|
| 352 |
-
|
| 353 |
-
return polished;
|
| 354 |
-
}
|
| 355 |
-
|
| 356 |
-
fixMathExpressions(content) {
|
| 357 |
-
let fixed = content;
|
| 358 |
-
|
| 359 |
-
// Fix common problematic patterns
|
| 360 |
-
fixed = fixed.replace(/\$\{([^}]+)\}\$/g, '$$$1$$'); // ${...}$ -> $...$
|
| 361 |
-
fixed = fixed.replace(/\$([^$]*)\\\$([^$]*)\$/g, '$$$1$2$$'); // $...\$...$ -> $...$
|
| 362 |
-
|
| 363 |
-
// Fix pi expressions specifically
|
| 364 |
-
fixed = fixed.replace(/\$\\pi_\$([0-9]+)\$/g, '$\\pi_$1$');
|
| 365 |
-
fixed = fixed.replace(/\$\{\\pi_\}([0-9]+)\$/g, '$\\pi_$1$');
|
| 366 |
-
|
| 367 |
-
// Fix malformed math delimiters
|
| 368 |
-
fixed = fixed.replace(/\$\$\$+/g, '$$');
|
| 369 |
-
|
| 370 |
-
this.stats.mathExpressionsFixed++;
|
| 371 |
-
|
| 372 |
-
return fixed;
|
| 373 |
-
}
|
| 374 |
-
|
| 375 |
-
finalCleanup(content) {
|
| 376 |
-
let cleaned = content;
|
| 377 |
-
|
| 378 |
-
// Normalize whitespace
|
| 379 |
-
cleaned = cleaned.replace(/\n{3,}/g, '\n\n');
|
| 380 |
-
cleaned = cleaned.replace(/[ \t]+$/gm, ''); // Trailing spaces
|
| 381 |
-
|
| 382 |
-
// Fix MDX-incompatible angle bracket URLs
|
| 383 |
-
cleaned = cleaned.replace(/\*\*<(https?:\/\/[^>]+)>\*\*/g, '**[$1]($1)**');
|
| 384 |
-
cleaned = cleaned.replace(/<(https?:\/\/[^>]+)>/g, '[$1]($1)');
|
| 385 |
-
|
| 386 |
-
// Ensure proper spacing around elements
|
| 387 |
-
cleaned = cleaned.replace(/\n\n\n+/g, '\n\n');
|
| 388 |
-
|
| 389 |
-
return cleaned.trim();
|
| 390 |
-
}
|
| 391 |
-
|
| 392 |
-
getStats() {
|
| 393 |
-
return this.stats;
|
| 394 |
-
}
|
| 395 |
-
|
| 396 |
-
setDebugMode(enabled) {
|
| 397 |
-
this.debugMode = enabled;
|
| 398 |
-
}
|
| 399 |
-
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
app/scripts/latex-to-mdx/README.md
ADDED
|
@@ -0,0 +1,169 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
# LaTeX to MDX Toolkit
|
| 2 |
+
|
| 3 |
+
Complete LaTeX to MDX (Markdown + JSX) conversion optimized for Astro with advanced support for references, interactive equations, and components.
|
| 4 |
+
|
| 5 |
+
## 🚀 Quick Start
|
| 6 |
+
|
| 7 |
+
```bash
|
| 8 |
+
# Complete LaTeX → MDX conversion with all features
|
| 9 |
+
node index.mjs
|
| 10 |
+
|
| 11 |
+
# For step-by-step debugging
|
| 12 |
+
node latex-converter.mjs # LaTeX → Markdown
|
| 13 |
+
node mdx-converter.mjs # Markdown → MDX
|
| 14 |
+
```
|
| 15 |
+
|
| 16 |
+
## 📁 Structure
|
| 17 |
+
|
| 18 |
+
```
|
| 19 |
+
latex-to-mdx/
|
| 20 |
+
├── index.mjs # Complete LaTeX → MDX pipeline
|
| 21 |
+
├── latex-converter.mjs # LaTeX → Markdown with Pandoc
|
| 22 |
+
├── mdx-converter.mjs # Markdown → MDX with Astro components
|
| 23 |
+
├── reference-preprocessor.mjs # LaTeX references cleanup
|
| 24 |
+
├── post-processor.mjs # Markdown post-processing
|
| 25 |
+
├── bib-cleaner.mjs # Bibliography cleaner
|
| 26 |
+
├── filters/
|
| 27 |
+
│ └── equation-ids.lua # Pandoc filter for KaTeX equations
|
| 28 |
+
├── input/ # LaTeX sources
|
| 29 |
+
│ ├── main.tex
|
| 30 |
+
│ ├── main.bib
|
| 31 |
+
│ └── sections/
|
| 32 |
+
└── output/ # Results
|
| 33 |
+
├── main.md # Intermediate Markdown
|
| 34 |
+
└── main.mdx # Final MDX for Astro
|
| 35 |
+
```
|
| 36 |
+
|
| 37 |
+
## ✨ Key Features
|
| 38 |
+
|
| 39 |
+
### 🎯 **Smart References**
|
| 40 |
+
- **Invisible anchors**: Automatic conversion of `\label{}` to `<span id="..." style="position: absolute;"></span>`
|
| 41 |
+
- **Clean links**: Identifier cleanup (`:` → `-`, removing prefixes `sec:`, `fig:`, `eq:`)
|
| 42 |
+
- **Cross-references**: Full support for `\ref{}` with functional links
|
| 43 |
+
|
| 44 |
+
### 🧮 **Interactive Equations**
|
| 45 |
+
- **KaTeX IDs**: Conversion of `\label{eq:...}` to `\htmlId{id}{equation}`
|
| 46 |
+
- **Equation references**: Clickable links to mathematical equations
|
| 47 |
+
- **Advanced KaTeX support**: `trust: true` configuration for `\htmlId{}`
|
| 48 |
+
|
| 49 |
+
### 🎨 **Automatic Styling**
|
| 50 |
+
- **Highlights**: `\highlight{text}` → `<span class="highlight">text</span>`
|
| 51 |
+
- **Auto cleanup**: Removal of numbering `(1)`, `(2)`, etc.
|
| 52 |
+
- **Astro components**: Images → `ResponsiveImage` with automatic imports
|
| 53 |
+
|
| 54 |
+
### 🔧 **Robust Pipeline**
|
| 55 |
+
- **LaTeX preprocessor**: Reference cleanup before Pandoc
|
| 56 |
+
- **Lua filter**: Equation processing in Pandoc AST
|
| 57 |
+
- **Post-processor**: Markdown cleanup and optimization
|
| 58 |
+
- **MDX converter**: Final transformation with Astro components
|
| 59 |
+
|
| 60 |
+
## 📊 Example Workflow
|
| 61 |
+
|
| 62 |
+
```bash
|
| 63 |
+
# 1. Prepare LaTeX sources
|
| 64 |
+
cp my-paper/* input/
|
| 65 |
+
|
| 66 |
+
# 2. Complete automatic conversion
|
| 67 |
+
node index.mjs
|
| 68 |
+
|
| 69 |
+
# 3. Generated results
|
| 70 |
+
ls output/
|
| 71 |
+
# → main.md (Intermediate Markdown)
|
| 72 |
+
# → main.mdx (Final MDX for Astro)
|
| 73 |
+
# → assets/image/ (extracted images)
|
| 74 |
+
```
|
| 75 |
+
|
| 76 |
+
### 📋 Conversion Result
|
| 77 |
+
|
| 78 |
+
The pipeline generates an MDX file optimized for Astro with:
|
| 79 |
+
|
| 80 |
+
```mdx
|
| 81 |
+
---
|
| 82 |
+
title: "Your Article Title"
|
| 83 |
+
description: "Generated from LaTeX"
|
| 84 |
+
---
|
| 85 |
+
|
| 86 |
+
import ResponsiveImage from '../components/ResponsiveImage.astro';
|
| 87 |
+
import figure1 from '../assets/image/figure1.png';
|
| 88 |
+
|
| 89 |
+
## Section with invisible anchor
|
| 90 |
+
<span id="introduction" style="position: absolute;"></span>
|
| 91 |
+
|
| 92 |
+
Here is some text with <span class="highlight">highlighted words</span>.
|
| 93 |
+
|
| 94 |
+
Reference to an interactive [equation](#equation-name).
|
| 95 |
+
|
| 96 |
+
Equation with KaTeX ID:
|
| 97 |
+
$$\htmlId{equation-name}{E = mc^2}$$
|
| 98 |
+
|
| 99 |
+
<ResponsiveImage src={figure1} alt="Description" />
|
| 100 |
+
```
|
| 101 |
+
|
| 102 |
+
## ⚙️ Required Astro Configuration
|
| 103 |
+
|
| 104 |
+
To use equations with IDs, add to `astro.config.mjs`:
|
| 105 |
+
|
| 106 |
+
```javascript
|
| 107 |
+
import rehypeKatex from 'rehype-katex';
|
| 108 |
+
|
| 109 |
+
export default defineConfig({
|
| 110 |
+
markdown: {
|
| 111 |
+
rehypePlugins: [
|
| 112 |
+
[rehypeKatex, { trust: true }], // ← Important for \htmlId{}
|
| 113 |
+
],
|
| 114 |
+
},
|
| 115 |
+
});
|
| 116 |
+
```
|
| 117 |
+
|
| 118 |
+
## 🛠️ Prerequisites
|
| 119 |
+
|
| 120 |
+
- **Node.js** with ESM support
|
| 121 |
+
- **Pandoc** (`brew install pandoc`)
|
| 122 |
+
- **Astro** to use the generated MDX
|
| 123 |
+
|
| 124 |
+
## 🎯 Technical Architecture
|
| 125 |
+
|
| 126 |
+
### 4-Stage Pipeline
|
| 127 |
+
|
| 128 |
+
1. **LaTeX Preprocessing** (`reference-preprocessor.mjs`)
|
| 129 |
+
- Cleanup of `\label{}` and `\ref{}`
|
| 130 |
+
- Conversion `\highlight{}` → CSS spans
|
| 131 |
+
- Removal of prefixes and problematic characters
|
| 132 |
+
|
| 133 |
+
2. **Pandoc + Lua Filter** (`equation-ids.lua`)
|
| 134 |
+
- LaTeX → Markdown conversion with `gfm+tex_math_dollars+raw_html`
|
| 135 |
+
- Equation processing: `\label{eq:name}` → `\htmlId{name}{equation}`
|
| 136 |
+
- Automatic image extraction
|
| 137 |
+
|
| 138 |
+
3. **Markdown Post-processing** (`post-processor.mjs`)
|
| 139 |
+
- KaTeX, Unicode, grouping commands cleanup
|
| 140 |
+
- Attribute correction with `:`
|
| 141 |
+
- Code snippet injection
|
| 142 |
+
|
| 143 |
+
4. **MDX Conversion** (`mdx-converter.mjs`)
|
| 144 |
+
- Images transformation → `ResponsiveImage`
|
| 145 |
+
- HTML span escaping correction
|
| 146 |
+
- Automatic imports generation
|
| 147 |
+
- MDX frontmatter
|
| 148 |
+
|
| 149 |
+
## 📊 Conversion Statistics
|
| 150 |
+
|
| 151 |
+
For a typical scientific document:
|
| 152 |
+
- **87 labels** detected and processed
|
| 153 |
+
- **48 invisible anchors** created
|
| 154 |
+
- **13 highlight spans** with CSS class
|
| 155 |
+
- **4 equations** with `\htmlId{}` KaTeX
|
| 156 |
+
- **40 images** converted to components
|
| 157 |
+
|
| 158 |
+
## ✅ Project Status
|
| 159 |
+
|
| 160 |
+
### 🎉 **Complete Features**
|
| 161 |
+
- ✅ **LaTeX → MDX Pipeline**: Full end-to-end functional conversion
|
| 162 |
+
- ✅ **Cross-document references**: Perfectly functional internal links
|
| 163 |
+
- ✅ **Interactive equations**: KaTeX support with clickable IDs
|
| 164 |
+
- ✅ **Automatic styling**: Highlights and Astro components
|
| 165 |
+
- ✅ **Robustness**: Automatic cleanup of all escaping
|
| 166 |
+
- ✅ **Optimization**: Clean code without unnecessary elements
|
| 167 |
+
|
| 168 |
+
### 🚀 **Production Ready**
|
| 169 |
+
The toolkit is now **100% operational** for converting complex scientific LaTeX documents to MDX/Astro with all advanced features (references, interactive equations, styling).
|
app/scripts/latex-to-mdx/bib-cleaner.mjs
ADDED
|
@@ -0,0 +1,104 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
#!/usr/bin/env node
|
| 2 |
+
|
| 3 |
+
import { readFileSync, writeFileSync, existsSync } from 'fs';
|
| 4 |
+
import { join, dirname, basename } from 'path';
|
| 5 |
+
|
| 6 |
+
/**
|
| 7 |
+
* Clean a BibTeX file by removing local file references and paths
|
| 8 |
+
* @param {string} inputBibFile - Path to the input .bib file
|
| 9 |
+
* @param {string} outputBibFile - Path to the output cleaned .bib file
|
| 10 |
+
* @returns {boolean} - Success status
|
| 11 |
+
*/
|
| 12 |
+
export function cleanBibliography(inputBibFile, outputBibFile) {
|
| 13 |
+
if (!existsSync(inputBibFile)) {
|
| 14 |
+
console.log(' ⚠️ No bibliography file found:', inputBibFile);
|
| 15 |
+
return false;
|
| 16 |
+
}
|
| 17 |
+
|
| 18 |
+
console.log('📚 Cleaning bibliography...');
|
| 19 |
+
let bibContent = readFileSync(inputBibFile, 'utf8');
|
| 20 |
+
|
| 21 |
+
// Remove file paths and local references
|
| 22 |
+
bibContent = bibContent.replace(/file = \{[^}]+\}/g, '');
|
| 23 |
+
|
| 24 |
+
// Remove empty lines created by file removal
|
| 25 |
+
bibContent = bibContent.replace(/,\s*\n\s*\n/g, '\n\n');
|
| 26 |
+
bibContent = bibContent.replace(/,\s*\}/g, '\n}');
|
| 27 |
+
|
| 28 |
+
// Clean up double commas
|
| 29 |
+
bibContent = bibContent.replace(/,,/g, ',');
|
| 30 |
+
|
| 31 |
+
// Remove trailing commas before closing braces
|
| 32 |
+
bibContent = bibContent.replace(/,(\s*\n\s*)\}/g, '$1}');
|
| 33 |
+
|
| 34 |
+
writeFileSync(outputBibFile, bibContent);
|
| 35 |
+
console.log(` 📄 Clean bibliography saved: ${outputBibFile}`);
|
| 36 |
+
|
| 37 |
+
return true;
|
| 38 |
+
}
|
| 39 |
+
|
| 40 |
+
/**
|
| 41 |
+
* CLI for bibliography cleaning
|
| 42 |
+
*/
|
| 43 |
+
function main() {
|
| 44 |
+
const args = process.argv.slice(2);
|
| 45 |
+
|
| 46 |
+
if (args.includes('--help') || args.includes('-h')) {
|
| 47 |
+
console.log(`
|
| 48 |
+
📚 BibTeX Bibliography Cleaner
|
| 49 |
+
|
| 50 |
+
Usage:
|
| 51 |
+
node bib-cleaner.mjs [input.bib] [output.bib]
|
| 52 |
+
node bib-cleaner.mjs --input=input.bib --output=output.bib
|
| 53 |
+
|
| 54 |
+
Options:
|
| 55 |
+
--input=FILE Input .bib file
|
| 56 |
+
--output=FILE Output cleaned .bib file
|
| 57 |
+
--help, -h Show this help
|
| 58 |
+
|
| 59 |
+
Examples:
|
| 60 |
+
# Clean main.bib to clean.bib
|
| 61 |
+
node bib-cleaner.mjs main.bib clean.bib
|
| 62 |
+
|
| 63 |
+
# Using flags
|
| 64 |
+
node bib-cleaner.mjs --input=references.bib --output=clean-refs.bib
|
| 65 |
+
`);
|
| 66 |
+
process.exit(0);
|
| 67 |
+
}
|
| 68 |
+
|
| 69 |
+
let inputFile, outputFile;
|
| 70 |
+
|
| 71 |
+
// Parse command line arguments
|
| 72 |
+
if (args.length >= 2 && !args[0].startsWith('--')) {
|
| 73 |
+
// Positional arguments
|
| 74 |
+
inputFile = args[0];
|
| 75 |
+
outputFile = args[1];
|
| 76 |
+
} else {
|
| 77 |
+
// Named arguments
|
| 78 |
+
for (const arg of args) {
|
| 79 |
+
if (arg.startsWith('--input=')) {
|
| 80 |
+
inputFile = arg.split('=')[1];
|
| 81 |
+
} else if (arg.startsWith('--output=')) {
|
| 82 |
+
outputFile = arg.split('=')[1];
|
| 83 |
+
}
|
| 84 |
+
}
|
| 85 |
+
}
|
| 86 |
+
|
| 87 |
+
if (!inputFile || !outputFile) {
|
| 88 |
+
console.error('❌ Both input and output files are required');
|
| 89 |
+
console.log('Use --help for usage information');
|
| 90 |
+
process.exit(1);
|
| 91 |
+
}
|
| 92 |
+
|
| 93 |
+
const success = cleanBibliography(inputFile, outputFile);
|
| 94 |
+
if (success) {
|
| 95 |
+
console.log('🎉 Bibliography cleaning completed!');
|
| 96 |
+
} else {
|
| 97 |
+
process.exit(1);
|
| 98 |
+
}
|
| 99 |
+
}
|
| 100 |
+
|
| 101 |
+
// Run CLI if called directly
|
| 102 |
+
if (import.meta.url === `file://${process.argv[1]}`) {
|
| 103 |
+
main();
|
| 104 |
+
}
|
app/scripts/latex-to-mdx/filters/equation-ids.lua
ADDED
|
@@ -0,0 +1,134 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
--[[
|
| 2 |
+
Pandoc Lua filter to add IDs to equations using KaTeX \htmlId syntax
|
| 3 |
+
|
| 4 |
+
This filter processes display math equations and inline math that contain
|
| 5 |
+
\label{} commands, and wraps them with \htmlId{clean-id}{content} for KaTeX.
|
| 6 |
+
|
| 7 |
+
Requirements:
|
| 8 |
+
- KaTeX renderer with trust: true option
|
| 9 |
+
- Equations with \label{} commands in LaTeX
|
| 10 |
+
--]]
|
| 11 |
+
|
| 12 |
+
-- Function to clean identifier strings (remove prefixes and colons)
|
| 13 |
+
function clean_identifier(id_str)
|
| 14 |
+
if id_str and type(id_str) == "string" then
|
| 15 |
+
-- Remove common prefixes and replace colons with dashes
|
| 16 |
+
local clean = id_str
|
| 17 |
+
:gsub("^(eq|equation):", "") -- Remove eq: prefix
|
| 18 |
+
:gsub(":", "-") -- Replace colons with dashes
|
| 19 |
+
:gsub("[^a-zA-Z0-9_-]", "-") -- Replace other problematic chars
|
| 20 |
+
:gsub("-+", "-") -- Collapse multiple dashes
|
| 21 |
+
:gsub("^-", "") -- Remove leading dash
|
| 22 |
+
:gsub("-$", "") -- Remove trailing dash
|
| 23 |
+
|
| 24 |
+
-- Ensure we don't have empty identifiers
|
| 25 |
+
if clean == "" then
|
| 26 |
+
clean = id_str:gsub(":", "-")
|
| 27 |
+
end
|
| 28 |
+
|
| 29 |
+
return clean
|
| 30 |
+
end
|
| 31 |
+
return id_str
|
| 32 |
+
end
|
| 33 |
+
|
| 34 |
+
-- Process Math elements (both inline and display)
|
| 35 |
+
function Math(el)
|
| 36 |
+
local math_content = el.text
|
| 37 |
+
|
| 38 |
+
-- Look for \label{...} commands in the math content
|
| 39 |
+
local label_match = math_content:match("\\label%{([^}]+)%}")
|
| 40 |
+
|
| 41 |
+
if label_match then
|
| 42 |
+
-- Clean the identifier
|
| 43 |
+
local clean_id = clean_identifier(label_match)
|
| 44 |
+
|
| 45 |
+
-- Remove the \label{} command from the math content
|
| 46 |
+
local clean_math = math_content:gsub("\\label%{[^}]+%}", "")
|
| 47 |
+
|
| 48 |
+
-- Clean up any extra whitespace or line breaks that might remain
|
| 49 |
+
clean_math = clean_math:gsub("%s*$", ""):gsub("^%s*", "")
|
| 50 |
+
|
| 51 |
+
-- Handle different equation environments appropriately
|
| 52 |
+
-- For align environments, preserve them as they work with KaTeX
|
| 53 |
+
local has_align = clean_math:match("\\begin%{align%}")
|
| 54 |
+
|
| 55 |
+
if has_align then
|
| 56 |
+
-- For align environments, we keep the structure and add ID as an attribute
|
| 57 |
+
-- KaTeX supports align environments natively
|
| 58 |
+
clean_math = clean_math:gsub("\\begin%{align%}", "\\begin{align}")
|
| 59 |
+
clean_math = clean_math:gsub("\\end%{align%}", "\\end{align}")
|
| 60 |
+
else
|
| 61 |
+
-- Remove other equation environments that don't work well with \htmlId
|
| 62 |
+
clean_math = clean_math:gsub("\\begin%{equation%}", ""):gsub("\\end%{equation%}", "")
|
| 63 |
+
clean_math = clean_math:gsub("\\begin%{equation%*%}", ""):gsub("\\end%{equation%*%}", "")
|
| 64 |
+
clean_math = clean_math:gsub("\\begin%{align%*%}", ""):gsub("\\end%{align%*%}", "")
|
| 65 |
+
end
|
| 66 |
+
|
| 67 |
+
-- Clean up any remaining whitespace
|
| 68 |
+
clean_math = clean_math:gsub("%s*$", ""):gsub("^%s*", "")
|
| 69 |
+
|
| 70 |
+
local new_math
|
| 71 |
+
if has_align then
|
| 72 |
+
-- For align environments, KaTeX doesn't support \htmlId with align
|
| 73 |
+
-- Instead, we add a special marker that the post-processor will convert to a span
|
| 74 |
+
-- This span will serve as an anchor for references
|
| 75 |
+
new_math = "%%ALIGN_ANCHOR_ID{" .. clean_id .. "}%%\n" .. clean_math
|
| 76 |
+
else
|
| 77 |
+
-- For other math, wrap with \htmlId{}
|
| 78 |
+
new_math = "\\htmlId{" .. clean_id .. "}{" .. clean_math .. "}"
|
| 79 |
+
end
|
| 80 |
+
|
| 81 |
+
-- Return new Math element with the updated content
|
| 82 |
+
return pandoc.Math(el.mathtype, new_math)
|
| 83 |
+
end
|
| 84 |
+
|
| 85 |
+
-- Return unchanged if no label found
|
| 86 |
+
return el
|
| 87 |
+
end
|
| 88 |
+
|
| 89 |
+
-- Optional: Process RawInline elements that might contain LaTeX math
|
| 90 |
+
function RawInline(el)
|
| 91 |
+
if el.format == "latex" or el.format == "tex" then
|
| 92 |
+
local content = el.text
|
| 93 |
+
|
| 94 |
+
-- Look for equation environments with labels
|
| 95 |
+
local label_match = content:match("\\label%{([^}]+)%}")
|
| 96 |
+
|
| 97 |
+
if label_match then
|
| 98 |
+
local clean_id = clean_identifier(label_match)
|
| 99 |
+
|
| 100 |
+
-- For raw LaTeX, we might need different handling
|
| 101 |
+
-- This is a simplified approach - adjust based on your needs
|
| 102 |
+
local clean_content = content:gsub("\\label%{[^}]+%}", "")
|
| 103 |
+
|
| 104 |
+
if clean_content:match("\\begin%{equation") or clean_content:match("\\begin%{align") then
|
| 105 |
+
-- For equation environments, we might need to wrap differently
|
| 106 |
+
-- This depends on how your KaTeX setup handles equation environments
|
| 107 |
+
return pandoc.RawInline(el.format, clean_content)
|
| 108 |
+
end
|
| 109 |
+
end
|
| 110 |
+
end
|
| 111 |
+
|
| 112 |
+
return el
|
| 113 |
+
end
|
| 114 |
+
|
| 115 |
+
-- Optional: Process RawBlock elements for display equations
|
| 116 |
+
function RawBlock(el)
|
| 117 |
+
if el.format == "latex" or el.format == "tex" then
|
| 118 |
+
local content = el.text
|
| 119 |
+
|
| 120 |
+
-- Look for equation environments with labels
|
| 121 |
+
local label_match = content:match("\\label%{([^}]+)%}")
|
| 122 |
+
|
| 123 |
+
if label_match then
|
| 124 |
+
local clean_id = clean_identifier(label_match)
|
| 125 |
+
local clean_content = content:gsub("\\label%{[^}]+%}", "")
|
| 126 |
+
|
| 127 |
+
-- For block equations, we might want to preserve the structure
|
| 128 |
+
-- but add the htmlId functionality
|
| 129 |
+
return pandoc.RawBlock(el.format, clean_content)
|
| 130 |
+
end
|
| 131 |
+
end
|
| 132 |
+
|
| 133 |
+
return el
|
| 134 |
+
end
|
app/scripts/latex-to-mdx/index.mjs
ADDED
|
@@ -0,0 +1,138 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
#!/usr/bin/env node
|
| 2 |
+
|
| 3 |
+
import { join, dirname } from 'path';
|
| 4 |
+
import { fileURLToPath } from 'url';
|
| 5 |
+
import { copyFileSync } from 'fs';
|
| 6 |
+
import { convertLatexToMarkdown } from './latex-converter.mjs';
|
| 7 |
+
import { convertToMdx } from './mdx-converter.mjs';
|
| 8 |
+
import { cleanBibliography } from './bib-cleaner.mjs';
|
| 9 |
+
|
| 10 |
+
const __filename = fileURLToPath(import.meta.url);
|
| 11 |
+
const __dirname = dirname(__filename);
|
| 12 |
+
|
| 13 |
+
// Default configuration
|
| 14 |
+
const DEFAULT_INPUT = join(__dirname, 'input', 'main.tex');
|
| 15 |
+
const DEFAULT_OUTPUT = join(__dirname, 'output');
|
| 16 |
+
const ASTRO_CONTENT_PATH = join(__dirname, '..', '..', 'src', 'content', 'article.mdx');
|
| 17 |
+
|
| 18 |
+
function parseArgs() {
|
| 19 |
+
const args = process.argv.slice(2);
|
| 20 |
+
const config = {
|
| 21 |
+
input: DEFAULT_INPUT,
|
| 22 |
+
output: DEFAULT_OUTPUT,
|
| 23 |
+
clean: false,
|
| 24 |
+
bibOnly: false,
|
| 25 |
+
convertOnly: false,
|
| 26 |
+
mdx: false,
|
| 27 |
+
};
|
| 28 |
+
|
| 29 |
+
for (const arg of args) {
|
| 30 |
+
if (arg.startsWith('--input=')) {
|
| 31 |
+
config.input = arg.split('=')[1];
|
| 32 |
+
} else if (arg.startsWith('--output=')) {
|
| 33 |
+
config.output = arg.split('=')[1];
|
| 34 |
+
} else if (arg === '--clean') {
|
| 35 |
+
config.clean = true;
|
| 36 |
+
} else if (arg === '--bib-only') {
|
| 37 |
+
config.bibOnly = true;
|
| 38 |
+
} else if (arg === '--convert-only') {
|
| 39 |
+
config.convertOnly = true;
|
| 40 |
+
}
|
| 41 |
+
}
|
| 42 |
+
|
| 43 |
+
return config;
|
| 44 |
+
}
|
| 45 |
+
|
| 46 |
+
function showHelp() {
|
| 47 |
+
console.log(`
|
| 48 |
+
🚀 LaTeX to Markdown Toolkit
|
| 49 |
+
|
| 50 |
+
Usage:
|
| 51 |
+
node index.mjs [options]
|
| 52 |
+
|
| 53 |
+
Options:
|
| 54 |
+
--input=PATH Input LaTeX file (default: input/main.tex)
|
| 55 |
+
--output=PATH Output directory (default: output/)
|
| 56 |
+
--clean Clean output directory before processing
|
| 57 |
+
--bib-only Only clean bibliography file
|
| 58 |
+
--convert-only Only convert LaTeX to Markdown (skip bib cleaning)
|
| 59 |
+
--help, -h Show this help
|
| 60 |
+
|
| 61 |
+
Examples:
|
| 62 |
+
# Full conversion with bibliography cleaning
|
| 63 |
+
node index.mjs --clean
|
| 64 |
+
|
| 65 |
+
# Only clean bibliography
|
| 66 |
+
node index.mjs --bib-only --input=paper.tex --output=clean/
|
| 67 |
+
|
| 68 |
+
# Only convert LaTeX (use existing clean bibliography)
|
| 69 |
+
node index.mjs --convert-only
|
| 70 |
+
|
| 71 |
+
# Custom paths
|
| 72 |
+
node index.mjs --input=../paper/main.tex --output=../results/ --clean
|
| 73 |
+
`);
|
| 74 |
+
}
|
| 75 |
+
|
| 76 |
+
function main() {
|
| 77 |
+
const args = process.argv.slice(2);
|
| 78 |
+
|
| 79 |
+
if (args.includes('--help') || args.includes('-h')) {
|
| 80 |
+
showHelp();
|
| 81 |
+
process.exit(0);
|
| 82 |
+
}
|
| 83 |
+
|
| 84 |
+
const config = parseArgs();
|
| 85 |
+
|
| 86 |
+
console.log('🚀 LaTeX to Markdown Toolkit');
|
| 87 |
+
console.log('==============================');
|
| 88 |
+
|
| 89 |
+
try {
|
| 90 |
+
if (config.bibOnly) {
|
| 91 |
+
// Only clean bibliography
|
| 92 |
+
console.log('📚 Bibliography cleaning mode');
|
| 93 |
+
const bibInput = config.input.replace('.tex', '.bib');
|
| 94 |
+
const bibOutput = join(config.output, 'main.bib');
|
| 95 |
+
|
| 96 |
+
cleanBibliography(bibInput, bibOutput);
|
| 97 |
+
console.log('🎉 Bibliography cleaning completed!');
|
| 98 |
+
|
| 99 |
+
} else if (config.convertOnly) {
|
| 100 |
+
// Only convert LaTeX
|
| 101 |
+
console.log('📄 Conversion only mode');
|
| 102 |
+
convertLatexToMarkdown(config.input, config.output);
|
| 103 |
+
|
| 104 |
+
} else {
|
| 105 |
+
// Full workflow
|
| 106 |
+
console.log('🔄 Full conversion workflow');
|
| 107 |
+
convertLatexToMarkdown(config.input, config.output);
|
| 108 |
+
|
| 109 |
+
// Convert to MDX if requested
|
| 110 |
+
const markdownFile = join(config.output, 'main.md');
|
| 111 |
+
const mdxFile = join(config.output, 'main.mdx');
|
| 112 |
+
|
| 113 |
+
console.log('📝 Converting Markdown to MDX...');
|
| 114 |
+
convertToMdx(markdownFile, mdxFile);
|
| 115 |
+
|
| 116 |
+
// Copy MDX to Astro content directory
|
| 117 |
+
console.log('📋 Copying MDX to Astro content directory...');
|
| 118 |
+
try {
|
| 119 |
+
copyFileSync(mdxFile, ASTRO_CONTENT_PATH);
|
| 120 |
+
console.log(` ✅ Copied to ${ASTRO_CONTENT_PATH}`);
|
| 121 |
+
} catch (error) {
|
| 122 |
+
console.warn(` ⚠️ Failed to copy MDX to Astro: ${error.message}`);
|
| 123 |
+
}
|
| 124 |
+
}
|
| 125 |
+
|
| 126 |
+
} catch (error) {
|
| 127 |
+
console.error('❌ Error:', error.message);
|
| 128 |
+
process.exit(1);
|
| 129 |
+
}
|
| 130 |
+
}
|
| 131 |
+
|
| 132 |
+
// Export functions for use as module
|
| 133 |
+
export { convertLatexToMarkdown, cleanBibliography };
|
| 134 |
+
|
| 135 |
+
// Run CLI if called directly
|
| 136 |
+
if (import.meta.url === `file://${process.argv[1]}`) {
|
| 137 |
+
main();
|
| 138 |
+
}
|
app/scripts/latex-to-mdx/input/.gitignore
ADDED
|
@@ -0,0 +1,13 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
.DS_store
|
| 2 |
+
|
| 3 |
+
*.aux
|
| 4 |
+
*.nav
|
| 5 |
+
*.log
|
| 6 |
+
*.snm
|
| 7 |
+
*.toc
|
| 8 |
+
*.out
|
| 9 |
+
*.vrb
|
| 10 |
+
*.blg
|
| 11 |
+
*latexmk*
|
| 12 |
+
*fls
|
| 13 |
+
*synctex*
|
app/scripts/latex-to-mdx/input/README.md
ADDED
|
@@ -0,0 +1,64 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
# Robot Learning: A Tutorial
|
| 2 |
+
|
| 3 |
+
Google "robot learning tutorial", and you will spend just as much time skimming through sources as actually learning about robot learning.
|
| 4 |
+
This tutorial solves this: a unified entry point to the field of robot learning, presenting the conceptual underpinnings of popular approaches in the field, as well as presenting practical examples of how to use SOTA algorithms in `lerobot`, an open-source library for full-stack robotics.
|
| 5 |
+
|
| 6 |
+
# TODO
|
| 7 |
+
|
| 8 |
+
```markdown
|
| 9 |
+
## 1. Introduction
|
| 10 |
+
- [x] 1.1 Motivation
|
| 11 |
+
- [x] 1.2 Structure of the Report
|
| 12 |
+
|
| 13 |
+
## 2. Classical Robotics
|
| 14 |
+
- [x] 2.1 Different kinds of motion
|
| 15 |
+
- [x] 2.2 Example: (Planar) Manipulation
|
| 16 |
+
- [x] 2.3.1 Adding Feedback Loops
|
| 17 |
+
- [x] 2.4 Limitations of Dynamics-based Robotics
|
| 18 |
+
|
| 19 |
+
## 3. Robot Learning
|
| 20 |
+
- [ ] 3.1 Reinforcement Learning (RL) for Robotics
|
| 21 |
+
- [ ] 3.1.1 A (Concise) Introduction to RL
|
| 22 |
+
- [ ] 3.2 Model-Free RL for Real-world Robotics
|
| 23 |
+
- [ ] 3.2.1 RL in lerobot: sample efficient, data-driven, and real-world
|
| 24 |
+
- [ ] 3.2.2 Code Example: HIL-SERL in lerobot
|
| 25 |
+
- [ ] 3.3 Limitations of RL in Real-World Robotics: Simulators and Reward Design
|
| 26 |
+
- [ ] 3.4 Behavioral Cloning (BC) for Robotics
|
| 27 |
+
- [ ] 4.1.1 Leveraging Real-World Demonstrations
|
| 28 |
+
- [ ] 4.1.2 Reward-Free Training and Betting on Data
|
| 29 |
+
|
| 30 |
+
## 4. Single-Task Policy Architectures
|
| 31 |
+
- [ ] 4.2 Action Chunking with Transformers (ACT)
|
| 32 |
+
- [ ] 4.2.1 Model Architecture and Training Objectives
|
| 33 |
+
- [ ] 4.2.2 Code Example: Use ACT in lerobot
|
| 34 |
+
- [ ] 4.3 Diffusion-Based Policy Models
|
| 35 |
+
- [ ] 4.3.1 Generative Modeling for Action Sequences
|
| 36 |
+
- [ ] 4.3.2 Code Example: Use Diffusion Policy in lerobot
|
| 37 |
+
|
| 38 |
+
## 5. Multi-task Policies: Vision-Language-Action (VLA) Models in Robotics
|
| 39 |
+
- [ ] 5.1 Multi-task Policies: Vision-Language-Action (VLA) Models in Robotics
|
| 40 |
+
- [ ] 5.1.1 Overview of Major Architectures: Pi0, SmolVLA
|
| 41 |
+
- [ ] 5.1.2 Practical Implementation: Using VLA in lerobot
|
| 42 |
+
|
| 43 |
+
## 6. Some Emerging Directions in Robot Learning
|
| 44 |
+
- [ ] 6.1 VLAs Post-Training
|
| 45 |
+
- [ ] 6.1.1 From Imitation to Refinement
|
| 46 |
+
- [ ] 6.1.2 EXPO
|
| 47 |
+
|
| 48 |
+
## 7. Conclusions
|
| 49 |
+
```
|
| 50 |
+
|
| 51 |
+
If time permits (vs current TOC):
|
| 52 |
+
|
| 53 |
+
- [ ] 3.3 Model-based RL for Robotics
|
| 54 |
+
- [ ] 3.3.1 TD-MPC
|
| 55 |
+
- [ ] 3.3.2 Code Example: Use TD-MPC in lerobot
|
| 56 |
+
- [ ] 3.5 Popular benchmarks in Robot Learning
|
| 57 |
+
|
| 58 |
+
- 4.3 Vector-Quantized Behavior Transformer (VQ-BeT)
|
| 59 |
+
- [ ] 4.3.1 Model Architecture and Training Objectives
|
| 60 |
+
- [ ] 4.3.2 Code Example: Use VQ-BeT in lerobot
|
| 61 |
+
|
| 62 |
+
- [ ] 6.1 Using World Models for Robotics
|
| 63 |
+
- [ ] 6.1.1 In the architecture: V-JEPA and V-JEPA2
|
| 64 |
+
- [ ] 6.1.2 In the simulation: GENIE
|
app/scripts/latex-to-mdx/input/_minted/62B8750C0ACEBDA39A95140434E540A8.highlight.minted
ADDED
|
@@ -0,0 +1,52 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
\begin{MintedVerbatim}[commandchars=\\\{\}]
|
| 2 |
+
\PYG{k+kn}{import}\PYG{+w}{ }\PYG{n+nn}{torch}
|
| 3 |
+
\PYG{k+kn}{from}\PYG{+w}{ }\PYG{n+nn}{lerobot}\PYG{n+nn}{.}\PYG{n+nn}{datasets}\PYG{n+nn}{.}\PYG{n+nn}{lerobot\PYGZus{}dataset}\PYG{+w}{ }\PYG{k+kn}{import} \PYG{n}{LeRobotDataset}
|
| 4 |
+
\PYG{k+kn}{from}\PYG{+w}{ }\PYG{n+nn}{lerobot}\PYG{n+nn}{.}\PYG{n+nn}{datasets}\PYG{n+nn}{.}\PYG{n+nn}{streaming\PYGZus{}dataset}\PYG{+w}{ }\PYG{k+kn}{import} \PYG{n}{StreamingLeRobotDataset}
|
| 5 |
+
|
| 6 |
+
\PYG{n}{delta\PYGZus{}timestamps} \PYG{o}{=} \PYG{p}{\PYGZob{}}
|
| 7 |
+
\PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{observation.images.wrist\PYGZus{}camera}\PYG{l+s+s2}{\PYGZdq{}}\PYG{p}{:} \PYG{p}{[}\PYG{o}{\PYGZhy{}}\PYG{l+m+mf}{0.2}\PYG{p}{,} \PYG{o}{\PYGZhy{}}\PYG{l+m+mf}{0.1}\PYG{p}{,} \PYG{l+m+mf}{0.0}\PYG{p}{]} \PYG{c+c1}{\PYGZsh{} 0.2, and 0.1 seconds *before* each frame}
|
| 8 |
+
\PYG{p}{\PYGZcb{}}
|
| 9 |
+
|
| 10 |
+
\PYG{c+c1}{\PYGZsh{} Optionally, use StreamingLeRobotDataset to avoid downloading the dataset}
|
| 11 |
+
\PYG{n}{dataset} \PYG{o}{=} \PYG{n}{LeRobotDataset}\PYG{p}{(}
|
| 12 |
+
\PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{lerobot/svla\PYGZus{}so101\PYGZus{}pickplace}\PYG{l+s+s2}{\PYGZdq{}}\PYG{p}{,}
|
| 13 |
+
\PYG{n}{delta\PYGZus{}timestamps}\PYG{o}{=}\PYG{n}{delta\PYGZus{}timestamps}
|
| 14 |
+
\PYG{p}{)}
|
| 15 |
+
|
| 16 |
+
\PYG{c+c1}{\PYGZsh{} Streams frames from the Hugging Face Hub without loading into memory}
|
| 17 |
+
\PYG{n}{streaming\PYGZus{}dataset} \PYG{o}{=} \PYG{n}{StreamingLeRobotDataset}\PYG{p}{(}
|
| 18 |
+
\PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{lerobot/svla\PYGZus{}so101\PYGZus{}pickplace}\PYG{l+s+s2}{\PYGZdq{}}\PYG{p}{,}
|
| 19 |
+
\PYG{n}{delta\PYGZus{}timestamps}\PYG{o}{=}\PYG{n}{delta\PYGZus{}timestamps}
|
| 20 |
+
\PYG{p}{)}
|
| 21 |
+
|
| 22 |
+
\PYG{c+c1}{\PYGZsh{} Get the 100th frame in the dataset by }
|
| 23 |
+
\PYG{n}{sample} \PYG{o}{=} \PYG{n}{dataset}\PYG{p}{[}\PYG{l+m+mi}{100}\PYG{p}{]}
|
| 24 |
+
\PYG{n+nb}{print}\PYG{p}{(}\PYG{n}{sample}\PYG{p}{)}
|
| 25 |
+
\PYG{c+c1}{\PYGZsh{} \PYGZob{}}
|
| 26 |
+
\PYG{c+c1}{\PYGZsh{} \PYGZsq{}observation.state\PYGZsq{}: tensor([...]), }
|
| 27 |
+
\PYG{c+c1}{\PYGZsh{} \PYGZsq{}action\PYGZsq{}: tensor([...]), }
|
| 28 |
+
\PYG{c+c1}{\PYGZsh{} \PYGZsq{}observation.images.wrist\PYGZus{}camera\PYGZsq{}: tensor([3, C, H, W]), for delta timesteps}
|
| 29 |
+
\PYG{c+c1}{\PYGZsh{} ...}
|
| 30 |
+
\PYG{c+c1}{\PYGZsh{} \PYGZcb{}}
|
| 31 |
+
|
| 32 |
+
\PYG{n}{batch\PYGZus{}size}\PYG{o}{=}\PYG{l+m+mi}{16}
|
| 33 |
+
\PYG{c+c1}{\PYGZsh{} wrap the dataset in a DataLoader to use process it batches for training purposes}
|
| 34 |
+
\PYG{n}{data\PYGZus{}loader} \PYG{o}{=} \PYG{n}{torch}\PYG{o}{.}\PYG{n}{utils}\PYG{o}{.}\PYG{n}{data}\PYG{o}{.}\PYG{n}{DataLoader}\PYG{p}{(}
|
| 35 |
+
\PYG{n}{dataset}\PYG{p}{,}
|
| 36 |
+
\PYG{n}{batch\PYGZus{}size}\PYG{o}{=}\PYG{n}{batch\PYGZus{}size}
|
| 37 |
+
\PYG{p}{)}
|
| 38 |
+
|
| 39 |
+
\PYG{c+c1}{\PYGZsh{} Iterate over the DataLoader in a training loop}
|
| 40 |
+
\PYG{n}{num\PYGZus{}epochs} \PYG{o}{=} \PYG{l+m+mi}{1}
|
| 41 |
+
\PYG{n}{device} \PYG{o}{=} \PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{cuda}\PYG{l+s+s2}{\PYGZdq{}} \PYG{k}{if} \PYG{n}{torch}\PYG{o}{.}\PYG{n}{cuda}\PYG{o}{.}\PYG{n}{is\PYGZus{}available}\PYG{p}{(}\PYG{p}{)} \PYG{k}{else} \PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{cpu}\PYG{l+s+s2}{\PYGZdq{}}
|
| 42 |
+
|
| 43 |
+
\PYG{k}{for} \PYG{n}{epoch} \PYG{o+ow}{in} \PYG{n+nb}{range}\PYG{p}{(}\PYG{n}{num\PYGZus{}epochs}\PYG{p}{)}\PYG{p}{:}
|
| 44 |
+
\PYG{k}{for} \PYG{n}{batch} \PYG{o+ow}{in} \PYG{n}{data\PYGZus{}loader}\PYG{p}{:}
|
| 45 |
+
\PYG{c+c1}{\PYGZsh{} Move data to the appropriate device (e.g., GPU)}
|
| 46 |
+
\PYG{n}{observations} \PYG{o}{=} \PYG{n}{batch}\PYG{p}{[}\PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{observation.state}\PYG{l+s+s2}{\PYGZdq{}}\PYG{p}{]}\PYG{o}{.}\PYG{n}{to}\PYG{p}{(}\PYG{n}{device}\PYG{p}{)}
|
| 47 |
+
\PYG{n}{actions} \PYG{o}{=} \PYG{n}{batch}\PYG{p}{[}\PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{action}\PYG{l+s+s2}{\PYGZdq{}}\PYG{p}{]}\PYG{o}{.}\PYG{n}{to}\PYG{p}{(}\PYG{n}{device}\PYG{p}{)}
|
| 48 |
+
\PYG{n}{images} \PYG{o}{=} \PYG{n}{batch}\PYG{p}{[}\PYG{l+s+s2}{\PYGZdq{}}\PYG{l+s+s2}{observation.images.wrist\PYGZus{}camera}\PYG{l+s+s2}{\PYGZdq{}}\PYG{p}{]}\PYG{o}{.}\PYG{n}{to}\PYG{p}{(}\PYG{n}{device}\PYG{p}{)}
|
| 49 |
+
|
| 50 |
+
\PYG{c+c1}{\PYGZsh{} Next, you can do amazing\PYGZus{}model.forward(batch)}
|
| 51 |
+
\PYG{o}{.}\PYG{o}{.}\PYG{o}{.}
|
| 52 |
+
\end{MintedVerbatim}
|
app/scripts/latex-to-mdx/input/_minted/_FAD58DE7366495DB4650CFEFAC2FCD61.index.minted
ADDED
|
@@ -0,0 +1,10 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
{
|
| 2 |
+
"jobname": "main",
|
| 3 |
+
"md5": "FAD58DE7366495DB4650CFEFAC2FCD61",
|
| 4 |
+
"timestamp": "20250911180655",
|
| 5 |
+
"cachefiles": [
|
| 6 |
+
"62B8750C0ACEBDA39A95140434E540A8.highlight.minted",
|
| 7 |
+
"_FAD58DE7366495DB4650CFEFAC2FCD61.index.minted",
|
| 8 |
+
"colorful.style.minted"
|
| 9 |
+
]
|
| 10 |
+
}
|
app/scripts/latex-to-mdx/input/_minted/colorful.style.minted
ADDED
|
@@ -0,0 +1,100 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
\makeatletter
|
| 2 |
+
\def\PYG@reset{\let\PYG@it=\relax \let\PYG@bf=\relax%
|
| 3 |
+
\let\PYG@ul=\relax \let\PYG@tc=\relax%
|
| 4 |
+
\let\PYG@bc=\relax \let\PYG@ff=\relax}
|
| 5 |
+
\def\PYG@tok#1{\csname PYG@tok@#1\endcsname}
|
| 6 |
+
\def\PYG@toks#1+{\ifx\relax#1\empty\else%
|
| 7 |
+
\PYG@tok{#1}\expandafter\PYG@toks\fi}
|
| 8 |
+
\def\PYG@do#1{\PYG@bc{\PYG@tc{\PYG@ul{%
|
| 9 |
+
\PYG@it{\PYG@bf{\PYG@ff{#1}}}}}}}
|
| 10 |
+
\def\PYG#1#2{\PYG@reset\PYG@toks#1+\relax+\PYG@do{#2}}
|
| 11 |
+
|
| 12 |
+
\@namedef{PYG@tok@w}{\def\PYG@tc##1{\textcolor[rgb]{0.73,0.73,0.73}{##1}}}
|
| 13 |
+
\@namedef{PYG@tok@c}{\def\PYG@tc##1{\textcolor[rgb]{0.53,0.53,0.53}{##1}}}
|
| 14 |
+
\@namedef{PYG@tok@cp}{\def\PYG@tc##1{\textcolor[rgb]{0.33,0.47,0.60}{##1}}}
|
| 15 |
+
\@namedef{PYG@tok@cs}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.80,0.00,0.00}{##1}}}
|
| 16 |
+
\@namedef{PYG@tok@k}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.53,0.00}{##1}}}
|
| 17 |
+
\@namedef{PYG@tok@kp}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.20,0.53}{##1}}}
|
| 18 |
+
\@namedef{PYG@tok@kt}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.20,0.20,0.60}{##1}}}
|
| 19 |
+
\@namedef{PYG@tok@o}{\def\PYG@tc##1{\textcolor[rgb]{0.20,0.20,0.20}{##1}}}
|
| 20 |
+
\@namedef{PYG@tok@ow}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.00,0.00}{##1}}}
|
| 21 |
+
\@namedef{PYG@tok@nb}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.44,0.13}{##1}}}
|
| 22 |
+
\@namedef{PYG@tok@nf}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.40,0.73}{##1}}}
|
| 23 |
+
\@namedef{PYG@tok@nc}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.73,0.00,0.40}{##1}}}
|
| 24 |
+
\@namedef{PYG@tok@nn}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.05,0.52,0.71}{##1}}}
|
| 25 |
+
\@namedef{PYG@tok@ne}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{1.00,0.00,0.00}{##1}}}
|
| 26 |
+
\@namedef{PYG@tok@nv}{\def\PYG@tc##1{\textcolor[rgb]{0.60,0.40,0.20}{##1}}}
|
| 27 |
+
\@namedef{PYG@tok@vi}{\def\PYG@tc##1{\textcolor[rgb]{0.20,0.20,0.73}{##1}}}
|
| 28 |
+
\@namedef{PYG@tok@vc}{\def\PYG@tc##1{\textcolor[rgb]{0.20,0.40,0.60}{##1}}}
|
| 29 |
+
\@namedef{PYG@tok@vg}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.87,0.47,0.00}{##1}}}
|
| 30 |
+
\@namedef{PYG@tok@no}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.20,0.40}{##1}}}
|
| 31 |
+
\@namedef{PYG@tok@nl}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.60,0.47,0.00}{##1}}}
|
| 32 |
+
\@namedef{PYG@tok@ni}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.53,0.00,0.00}{##1}}}
|
| 33 |
+
\@namedef{PYG@tok@na}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.00,0.80}{##1}}}
|
| 34 |
+
\@namedef{PYG@tok@nt}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.47,0.00}{##1}}}
|
| 35 |
+
\@namedef{PYG@tok@nd}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.33,0.33,0.33}{##1}}}
|
| 36 |
+
\@namedef{PYG@tok@s}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 37 |
+
\@namedef{PYG@tok@sc}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.27,0.87}{##1}}}
|
| 38 |
+
\@namedef{PYG@tok@sd}{\def\PYG@tc##1{\textcolor[rgb]{0.87,0.27,0.13}{##1}}}
|
| 39 |
+
\@namedef{PYG@tok@si}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{0.93,0.93,0.93}{\strut ##1}}}}
|
| 40 |
+
\@namedef{PYG@tok@se}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.40,0.40,0.40}{##1}}\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 41 |
+
\@namedef{PYG@tok@sr}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.00,0.00}{##1}}\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,1.00}{\strut ##1}}}}
|
| 42 |
+
\@namedef{PYG@tok@ss}{\def\PYG@tc##1{\textcolor[rgb]{0.67,0.40,0.00}{##1}}}
|
| 43 |
+
\@namedef{PYG@tok@sx}{\def\PYG@tc##1{\textcolor[rgb]{0.87,0.13,0.00}{##1}}\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 44 |
+
\@namedef{PYG@tok@m}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.40,0.00,0.93}{##1}}}
|
| 45 |
+
\@namedef{PYG@tok@mi}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.00,0.87}{##1}}}
|
| 46 |
+
\@namedef{PYG@tok@mf}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.40,0.00,0.93}{##1}}}
|
| 47 |
+
\@namedef{PYG@tok@mh}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.33,0.53}{##1}}}
|
| 48 |
+
\@namedef{PYG@tok@mo}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.27,0.00,0.93}{##1}}}
|
| 49 |
+
\@namedef{PYG@tok@gh}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.00,0.50}{##1}}}
|
| 50 |
+
\@namedef{PYG@tok@gu}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.50,0.00,0.50}{##1}}}
|
| 51 |
+
\@namedef{PYG@tok@gd}{\def\PYG@tc##1{\textcolor[rgb]{0.63,0.00,0.00}{##1}}}
|
| 52 |
+
\@namedef{PYG@tok@gi}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.63,0.00}{##1}}}
|
| 53 |
+
\@namedef{PYG@tok@gr}{\def\PYG@tc##1{\textcolor[rgb]{1.00,0.00,0.00}{##1}}}
|
| 54 |
+
\@namedef{PYG@tok@ge}{\let\PYG@it=\textit}
|
| 55 |
+
\@namedef{PYG@tok@gs}{\let\PYG@bf=\textbf}
|
| 56 |
+
\@namedef{PYG@tok@ges}{\let\PYG@bf=\textbf\let\PYG@it=\textit}
|
| 57 |
+
\@namedef{PYG@tok@gp}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.78,0.36,0.04}{##1}}}
|
| 58 |
+
\@namedef{PYG@tok@go}{\def\PYG@tc##1{\textcolor[rgb]{0.53,0.53,0.53}{##1}}}
|
| 59 |
+
\@namedef{PYG@tok@gt}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.27,0.87}{##1}}}
|
| 60 |
+
\@namedef{PYG@tok@err}{\def\PYG@tc##1{\textcolor[rgb]{1.00,0.00,0.00}{##1}}\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.67,0.67}{\strut ##1}}}}
|
| 61 |
+
\@namedef{PYG@tok@kc}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.53,0.00}{##1}}}
|
| 62 |
+
\@namedef{PYG@tok@kd}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.53,0.00}{##1}}}
|
| 63 |
+
\@namedef{PYG@tok@kn}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.53,0.00}{##1}}}
|
| 64 |
+
\@namedef{PYG@tok@kr}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.53,0.00}{##1}}}
|
| 65 |
+
\@namedef{PYG@tok@bp}{\def\PYG@tc##1{\textcolor[rgb]{0.00,0.44,0.13}{##1}}}
|
| 66 |
+
\@namedef{PYG@tok@fm}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.40,0.73}{##1}}}
|
| 67 |
+
\@namedef{PYG@tok@vm}{\def\PYG@tc##1{\textcolor[rgb]{0.60,0.40,0.20}{##1}}}
|
| 68 |
+
\@namedef{PYG@tok@sa}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 69 |
+
\@namedef{PYG@tok@sb}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 70 |
+
\@namedef{PYG@tok@dl}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 71 |
+
\@namedef{PYG@tok@s2}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 72 |
+
\@namedef{PYG@tok@sh}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 73 |
+
\@namedef{PYG@tok@s1}{\def\PYG@bc##1{{\setlength{\fboxsep}{0pt}\colorbox[rgb]{1.00,0.94,0.94}{\strut ##1}}}}
|
| 74 |
+
\@namedef{PYG@tok@mb}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.40,0.00,0.93}{##1}}}
|
| 75 |
+
\@namedef{PYG@tok@il}{\let\PYG@bf=\textbf\def\PYG@tc##1{\textcolor[rgb]{0.00,0.00,0.87}{##1}}}
|
| 76 |
+
\@namedef{PYG@tok@ch}{\def\PYG@tc##1{\textcolor[rgb]{0.53,0.53,0.53}{##1}}}
|
| 77 |
+
\@namedef{PYG@tok@cm}{\def\PYG@tc##1{\textcolor[rgb]{0.53,0.53,0.53}{##1}}}
|
| 78 |
+
\@namedef{PYG@tok@cpf}{\def\PYG@tc##1{\textcolor[rgb]{0.53,0.53,0.53}{##1}}}
|
| 79 |
+
\@namedef{PYG@tok@c1}{\def\PYG@tc##1{\textcolor[rgb]{0.53,0.53,0.53}{##1}}}
|
| 80 |
+
|
| 81 |
+
\def\PYGZbs{\char`\\}
|
| 82 |
+
\def\PYGZus{\char`\_}
|
| 83 |
+
\def\PYGZob{\char`\{}
|
| 84 |
+
\def\PYGZcb{\char`\}}
|
| 85 |
+
\def\PYGZca{\char`\^}
|
| 86 |
+
\def\PYGZam{\char`\&}
|
| 87 |
+
\def\PYGZlt{\char`\<}
|
| 88 |
+
\def\PYGZgt{\char`\>}
|
| 89 |
+
\def\PYGZsh{\char`\#}
|
| 90 |
+
\def\PYGZpc{\char`\%}
|
| 91 |
+
\def\PYGZdl{\char`\$}
|
| 92 |
+
\def\PYGZhy{\char`\-}
|
| 93 |
+
\def\PYGZsq{\char`\'}
|
| 94 |
+
\def\PYGZdq{\char`\"}
|
| 95 |
+
\def\PYGZti{\char`\~}
|
| 96 |
+
% for compatibility with earlier versions
|
| 97 |
+
\def\PYGZat{@}
|
| 98 |
+
\def\PYGZlb{[}
|
| 99 |
+
\def\PYGZrb{]}
|
| 100 |
+
\makeatother
|
app/scripts/latex-to-mdx/input/fancyhdr.sty
ADDED
|
@@ -0,0 +1,485 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
% fancyhdr.sty version 3.2
|
| 2 |
+
% Fancy headers and footers for LaTeX.
|
| 3 |
+
% Piet van Oostrum,
|
| 4 |
+
% Dept of Computer and Information Sciences, University of Utrecht,
|
| 5 |
+
% Padualaan 14, P.O. Box 80.089, 3508 TB Utrecht, The Netherlands
|
| 6 |
+
% Telephone: +31 30 2532180. Email: piet@cs.uu.nl
|
| 7 |
+
% ========================================================================
|
| 8 |
+
% LICENCE:
|
| 9 |
+
% This file may be distributed under the terms of the LaTeX Project Public
|
| 10 |
+
% License, as described in lppl.txt in the base LaTeX distribution.
|
| 11 |
+
% Either version 1 or, at your option, any later version.
|
| 12 |
+
% ========================================================================
|
| 13 |
+
% MODIFICATION HISTORY:
|
| 14 |
+
% Sep 16, 1994
|
| 15 |
+
% version 1.4: Correction for use with \reversemargin
|
| 16 |
+
% Sep 29, 1994:
|
| 17 |
+
% version 1.5: Added the \iftopfloat, \ifbotfloat and \iffloatpage commands
|
| 18 |
+
% Oct 4, 1994:
|
| 19 |
+
% version 1.6: Reset single spacing in headers/footers for use with
|
| 20 |
+
% setspace.sty or doublespace.sty
|
| 21 |
+
% Oct 4, 1994:
|
| 22 |
+
% version 1.7: changed \let\@mkboth\markboth to
|
| 23 |
+
% \def\@mkboth{\protect\markboth} to make it more robust
|
| 24 |
+
% Dec 5, 1994:
|
| 25 |
+
% version 1.8: corrections for amsbook/amsart: define \@chapapp and (more
|
| 26 |
+
% importantly) use the \chapter/sectionmark definitions from ps@headings if
|
| 27 |
+
% they exist (which should be true for all standard classes).
|
| 28 |
+
% May 31, 1995:
|
| 29 |
+
% version 1.9: The proposed \renewcommand{\headrulewidth}{\iffloatpage...
|
| 30 |
+
% construction in the doc did not work properly with the fancyplain style.
|
| 31 |
+
% June 1, 1995:
|
| 32 |
+
% version 1.91: The definition of \@mkboth wasn't restored on subsequent
|
| 33 |
+
% \pagestyle{fancy}'s.
|
| 34 |
+
% June 1, 1995:
|
| 35 |
+
% version 1.92: The sequence \pagestyle{fancyplain} \pagestyle{plain}
|
| 36 |
+
% \pagestyle{fancy} would erroneously select the plain version.
|
| 37 |
+
% June 1, 1995:
|
| 38 |
+
% version 1.93: \fancypagestyle command added.
|
| 39 |
+
% Dec 11, 1995:
|
| 40 |
+
% version 1.94: suggested by Conrad Hughes <chughes@maths.tcd.ie>
|
| 41 |
+
% CJCH, Dec 11, 1995: added \footruleskip to allow control over footrule
|
| 42 |
+
% position (old hardcoded value of .3\normalbaselineskip is far too high
|
| 43 |
+
% when used with very small footer fonts).
|
| 44 |
+
% Jan 31, 1996:
|
| 45 |
+
% version 1.95: call \@normalsize in the reset code if that is defined,
|
| 46 |
+
% otherwise \normalsize.
|
| 47 |
+
% this is to solve a problem with ucthesis.cls, as this doesn't
|
| 48 |
+
% define \@currsize. Unfortunately for latex209 calling \normalsize doesn't
|
| 49 |
+
% work as this is optimized to do very little, so there \@normalsize should
|
| 50 |
+
% be called. Hopefully this code works for all versions of LaTeX known to
|
| 51 |
+
% mankind.
|
| 52 |
+
% April 25, 1996:
|
| 53 |
+
% version 1.96: initialize \headwidth to a magic (negative) value to catch
|
| 54 |
+
% most common cases that people change it before calling \pagestyle{fancy}.
|
| 55 |
+
% Note it can't be initialized when reading in this file, because
|
| 56 |
+
% \textwidth could be changed afterwards. This is quite probable.
|
| 57 |
+
% We also switch to \MakeUppercase rather than \uppercase and introduce a
|
| 58 |
+
% \nouppercase command for use in headers. and footers.
|
| 59 |
+
% May 3, 1996:
|
| 60 |
+
% version 1.97: Two changes:
|
| 61 |
+
% 1. Undo the change in version 1.8 (using the pagestyle{headings} defaults
|
| 62 |
+
% for the chapter and section marks. The current version of amsbook and
|
| 63 |
+
% amsart classes don't seem to need them anymore. Moreover the standard
|
| 64 |
+
% latex classes don't use \markboth if twoside isn't selected, and this is
|
| 65 |
+
% confusing as \leftmark doesn't work as expected.
|
| 66 |
+
% 2. include a call to \ps@empty in ps@@fancy. This is to solve a problem
|
| 67 |
+
% in the amsbook and amsart classes, that make global changes to \topskip,
|
| 68 |
+
% which are reset in \ps@empty. Hopefully this doesn't break other things.
|
| 69 |
+
% May 7, 1996:
|
| 70 |
+
% version 1.98:
|
| 71 |
+
% Added % after the line \def\nouppercase
|
| 72 |
+
% May 7, 1996:
|
| 73 |
+
% version 1.99: This is the alpha version of fancyhdr 2.0
|
| 74 |
+
% Introduced the new commands \fancyhead, \fancyfoot, and \fancyhf.
|
| 75 |
+
% Changed \headrulewidth, \footrulewidth, \footruleskip to
|
| 76 |
+
% macros rather than length parameters, In this way they can be
|
| 77 |
+
% conditionalized and they don't consume length registers. There is no need
|
| 78 |
+
% to have them as length registers unless you want to do calculations with
|
| 79 |
+
% them, which is unlikely. Note that this may make some uses of them
|
| 80 |
+
% incompatible (i.e. if you have a file that uses \setlength or \xxxx=)
|
| 81 |
+
% May 10, 1996:
|
| 82 |
+
% version 1.99a:
|
| 83 |
+
% Added a few more % signs
|
| 84 |
+
% May 10, 1996:
|
| 85 |
+
% version 1.99b:
|
| 86 |
+
% Changed the syntax of \f@nfor to be resistent to catcode changes of :=
|
| 87 |
+
% Removed the [1] from the defs of \lhead etc. because the parameter is
|
| 88 |
+
% consumed by the \@[xy]lhead etc. macros.
|
| 89 |
+
% June 24, 1997:
|
| 90 |
+
% version 1.99c:
|
| 91 |
+
% corrected \nouppercase to also include the protected form of \MakeUppercase
|
| 92 |
+
% \global added to manipulation of \headwidth.
|
| 93 |
+
% \iffootnote command added.
|
| 94 |
+
% Some comments added about \@fancyhead and \@fancyfoot.
|
| 95 |
+
% Aug 24, 1998
|
| 96 |
+
% version 1.99d
|
| 97 |
+
% Changed the default \ps@empty to \ps@@empty in order to allow
|
| 98 |
+
% \fancypagestyle{empty} redefinition.
|
| 99 |
+
% Oct 11, 2000
|
| 100 |
+
% version 2.0
|
| 101 |
+
% Added LPPL license clause.
|
| 102 |
+
%
|
| 103 |
+
% A check for \headheight is added. An errormessage is given (once) if the
|
| 104 |
+
% header is too large. Empty headers don't generate the error even if
|
| 105 |
+
% \headheight is very small or even 0pt.
|
| 106 |
+
% Warning added for the use of 'E' option when twoside option is not used.
|
| 107 |
+
% In this case the 'E' fields will never be used.
|
| 108 |
+
%
|
| 109 |
+
% Mar 10, 2002
|
| 110 |
+
% version 2.1beta
|
| 111 |
+
% New command: \fancyhfoffset[place]{length}
|
| 112 |
+
% defines offsets to be applied to the header/footer to let it stick into
|
| 113 |
+
% the margins (if length > 0).
|
| 114 |
+
% place is like in fancyhead, except that only E,O,L,R can be used.
|
| 115 |
+
% This replaces the old calculation based on \headwidth and the marginpar
|
| 116 |
+
% area.
|
| 117 |
+
% \headwidth will be dynamically calculated in the headers/footers when
|
| 118 |
+
% this is used.
|
| 119 |
+
%
|
| 120 |
+
% Mar 26, 2002
|
| 121 |
+
% version 2.1beta2
|
| 122 |
+
% \fancyhfoffset now also takes h,f as possible letters in the argument to
|
| 123 |
+
% allow the header and footer widths to be different.
|
| 124 |
+
% New commands \fancyheadoffset and \fancyfootoffset added comparable to
|
| 125 |
+
% \fancyhead and \fancyfoot.
|
| 126 |
+
% Errormessages and warnings have been made more informative.
|
| 127 |
+
%
|
| 128 |
+
% Dec 9, 2002
|
| 129 |
+
% version 2.1
|
| 130 |
+
% The defaults for \footrulewidth, \plainheadrulewidth and
|
| 131 |
+
% \plainfootrulewidth are changed from \z@skip to 0pt. In this way when
|
| 132 |
+
% someone inadvertantly uses \setlength to change any of these, the value
|
| 133 |
+
% of \z@skip will not be changed, rather an errormessage will be given.
|
| 134 |
+
|
| 135 |
+
% March 3, 2004
|
| 136 |
+
% Release of version 3.0
|
| 137 |
+
|
| 138 |
+
% Oct 7, 2004
|
| 139 |
+
% version 3.1
|
| 140 |
+
% Added '\endlinechar=13' to \fancy@reset to prevent problems with
|
| 141 |
+
% includegraphics in header when verbatiminput is active.
|
| 142 |
+
|
| 143 |
+
% March 22, 2005
|
| 144 |
+
% version 3.2
|
| 145 |
+
% reset \everypar (the real one) in \fancy@reset because spanish.ldf does
|
| 146 |
+
% strange things with \everypar between << and >>.
|
| 147 |
+
|
| 148 |
+
\def\ifancy@mpty#1{\def\temp@a{#1}\ifx\temp@a\@empty}
|
| 149 |
+
|
| 150 |
+
\def\fancy@def#1#2{\ifancy@mpty{#2}\fancy@gbl\def#1{\leavevmode}\else
|
| 151 |
+
\fancy@gbl\def#1{#2\strut}\fi}
|
| 152 |
+
|
| 153 |
+
\let\fancy@gbl\global
|
| 154 |
+
|
| 155 |
+
\def\@fancyerrmsg#1{%
|
| 156 |
+
\ifx\PackageError\undefined
|
| 157 |
+
\errmessage{#1}\else
|
| 158 |
+
\PackageError{Fancyhdr}{#1}{}\fi}
|
| 159 |
+
\def\@fancywarning#1{%
|
| 160 |
+
\ifx\PackageWarning\undefined
|
| 161 |
+
\errmessage{#1}\else
|
| 162 |
+
\PackageWarning{Fancyhdr}{#1}{}\fi}
|
| 163 |
+
|
| 164 |
+
% Usage: \@forc \var{charstring}{command to be executed for each char}
|
| 165 |
+
% This is similar to LaTeX's \@tfor, but expands the charstring.
|
| 166 |
+
|
| 167 |
+
\def\@forc#1#2#3{\expandafter\f@rc\expandafter#1\expandafter{#2}{#3}}
|
| 168 |
+
\def\f@rc#1#2#3{\def\temp@ty{#2}\ifx\@empty\temp@ty\else
|
| 169 |
+
\f@@rc#1#2\f@@rc{#3}\fi}
|
| 170 |
+
\def\f@@rc#1#2#3\f@@rc#4{\def#1{#2}#4\f@rc#1{#3}{#4}}
|
| 171 |
+
|
| 172 |
+
% Usage: \f@nfor\name:=list\do{body}
|
| 173 |
+
% Like LaTeX's \@for but an empty list is treated as a list with an empty
|
| 174 |
+
% element
|
| 175 |
+
|
| 176 |
+
\newcommand{\f@nfor}[3]{\edef\@fortmp{#2}%
|
| 177 |
+
\expandafter\@forloop#2,\@nil,\@nil\@@#1{#3}}
|
| 178 |
+
|
| 179 |
+
% Usage: \def@ult \cs{defaults}{argument}
|
| 180 |
+
% sets \cs to the characters from defaults appearing in argument
|
| 181 |
+
% or defaults if it would be empty. All characters are lowercased.
|
| 182 |
+
|
| 183 |
+
\newcommand\def@ult[3]{%
|
| 184 |
+
\edef\temp@a{\lowercase{\edef\noexpand\temp@a{#3}}}\temp@a
|
| 185 |
+
\def#1{}%
|
| 186 |
+
\@forc\tmpf@ra{#2}%
|
| 187 |
+
{\expandafter\if@in\tmpf@ra\temp@a{\edef#1{#1\tmpf@ra}}{}}%
|
| 188 |
+
\ifx\@empty#1\def#1{#2}\fi}
|
| 189 |
+
%
|
| 190 |
+
% \if@in <char><set><truecase><falsecase>
|
| 191 |
+
%
|
| 192 |
+
\newcommand{\if@in}[4]{%
|
| 193 |
+
\edef\temp@a{#2}\def\temp@b##1#1##2\temp@b{\def\temp@b{##1}}%
|
| 194 |
+
\expandafter\temp@b#2#1\temp@b\ifx\temp@a\temp@b #4\else #3\fi}
|
| 195 |
+
|
| 196 |
+
\newcommand{\fancyhead}{\@ifnextchar[{\f@ncyhf\fancyhead h}%
|
| 197 |
+
{\f@ncyhf\fancyhead h[]}}
|
| 198 |
+
\newcommand{\fancyfoot}{\@ifnextchar[{\f@ncyhf\fancyfoot f}%
|
| 199 |
+
{\f@ncyhf\fancyfoot f[]}}
|
| 200 |
+
\newcommand{\fancyhf}{\@ifnextchar[{\f@ncyhf\fancyhf{}}%
|
| 201 |
+
{\f@ncyhf\fancyhf{}[]}}
|
| 202 |
+
|
| 203 |
+
% New commands for offsets added
|
| 204 |
+
|
| 205 |
+
\newcommand{\fancyheadoffset}{\@ifnextchar[{\f@ncyhfoffs\fancyheadoffset h}%
|
| 206 |
+
{\f@ncyhfoffs\fancyheadoffset h[]}}
|
| 207 |
+
\newcommand{\fancyfootoffset}{\@ifnextchar[{\f@ncyhfoffs\fancyfootoffset f}%
|
| 208 |
+
{\f@ncyhfoffs\fancyfootoffset f[]}}
|
| 209 |
+
\newcommand{\fancyhfoffset}{\@ifnextchar[{\f@ncyhfoffs\fancyhfoffset{}}%
|
| 210 |
+
{\f@ncyhfoffs\fancyhfoffset{}[]}}
|
| 211 |
+
|
| 212 |
+
% The header and footer fields are stored in command sequences with
|
| 213 |
+
% names of the form: \f@ncy<x><y><z> with <x> for [eo], <y> from [lcr]
|
| 214 |
+
% and <z> from [hf].
|
| 215 |
+
|
| 216 |
+
\def\f@ncyhf#1#2[#3]#4{%
|
| 217 |
+
\def\temp@c{}%
|
| 218 |
+
\@forc\tmpf@ra{#3}%
|
| 219 |
+
{\expandafter\if@in\tmpf@ra{eolcrhf,EOLCRHF}%
|
| 220 |
+
{}{\edef\temp@c{\temp@c\tmpf@ra}}}%
|
| 221 |
+
\ifx\@empty\temp@c\else
|
| 222 |
+
\@fancyerrmsg{Illegal char `\temp@c' in \string#1 argument:
|
| 223 |
+
[#3]}%
|
| 224 |
+
\fi
|
| 225 |
+
\f@nfor\temp@c{#3}%
|
| 226 |
+
{\def@ult\f@@@eo{eo}\temp@c
|
| 227 |
+
\if@twoside\else
|
| 228 |
+
\if\f@@@eo e\@fancywarning
|
| 229 |
+
{\string#1's `E' option without twoside option is useless}\fi\fi
|
| 230 |
+
\def@ult\f@@@lcr{lcr}\temp@c
|
| 231 |
+
\def@ult\f@@@hf{hf}{#2\temp@c}%
|
| 232 |
+
\@forc\f@@eo\f@@@eo
|
| 233 |
+
{\@forc\f@@lcr\f@@@lcr
|
| 234 |
+
{\@forc\f@@hf\f@@@hf
|
| 235 |
+
{\expandafter\fancy@def\csname
|
| 236 |
+
f@ncy\f@@eo\f@@lcr\f@@hf\endcsname
|
| 237 |
+
{#4}}}}}}
|
| 238 |
+
|
| 239 |
+
\def\f@ncyhfoffs#1#2[#3]#4{%
|
| 240 |
+
\def\temp@c{}%
|
| 241 |
+
\@forc\tmpf@ra{#3}%
|
| 242 |
+
{\expandafter\if@in\tmpf@ra{eolrhf,EOLRHF}%
|
| 243 |
+
{}{\edef\temp@c{\temp@c\tmpf@ra}}}%
|
| 244 |
+
\ifx\@empty\temp@c\else
|
| 245 |
+
\@fancyerrmsg{Illegal char `\temp@c' in \string#1 argument:
|
| 246 |
+
[#3]}%
|
| 247 |
+
\fi
|
| 248 |
+
\f@nfor\temp@c{#3}%
|
| 249 |
+
{\def@ult\f@@@eo{eo}\temp@c
|
| 250 |
+
\if@twoside\else
|
| 251 |
+
\if\f@@@eo e\@fancywarning
|
| 252 |
+
{\string#1's `E' option without twoside option is useless}\fi\fi
|
| 253 |
+
\def@ult\f@@@lcr{lr}\temp@c
|
| 254 |
+
\def@ult\f@@@hf{hf}{#2\temp@c}%
|
| 255 |
+
\@forc\f@@eo\f@@@eo
|
| 256 |
+
{\@forc\f@@lcr\f@@@lcr
|
| 257 |
+
{\@forc\f@@hf\f@@@hf
|
| 258 |
+
{\expandafter\setlength\csname
|
| 259 |
+
f@ncyO@\f@@eo\f@@lcr\f@@hf\endcsname
|
| 260 |
+
{#4}}}}}%
|
| 261 |
+
\fancy@setoffs}
|
| 262 |
+
|
| 263 |
+
% Fancyheadings version 1 commands. These are more or less deprecated,
|
| 264 |
+
% but they continue to work.
|
| 265 |
+
|
| 266 |
+
\newcommand{\lhead}{\@ifnextchar[{\@xlhead}{\@ylhead}}
|
| 267 |
+
\def\@xlhead[#1]#2{\fancy@def\f@ncyelh{#1}\fancy@def\f@ncyolh{#2}}
|
| 268 |
+
\def\@ylhead#1{\fancy@def\f@ncyelh{#1}\fancy@def\f@ncyolh{#1}}
|
| 269 |
+
|
| 270 |
+
\newcommand{\chead}{\@ifnextchar[{\@xchead}{\@ychead}}
|
| 271 |
+
\def\@xchead[#1]#2{\fancy@def\f@ncyech{#1}\fancy@def\f@ncyoch{#2}}
|
| 272 |
+
\def\@ychead#1{\fancy@def\f@ncyech{#1}\fancy@def\f@ncyoch{#1}}
|
| 273 |
+
|
| 274 |
+
\newcommand{\rhead}{\@ifnextchar[{\@xrhead}{\@yrhead}}
|
| 275 |
+
\def\@xrhead[#1]#2{\fancy@def\f@ncyerh{#1}\fancy@def\f@ncyorh{#2}}
|
| 276 |
+
\def\@yrhead#1{\fancy@def\f@ncyerh{#1}\fancy@def\f@ncyorh{#1}}
|
| 277 |
+
|
| 278 |
+
\newcommand{\lfoot}{\@ifnextchar[{\@xlfoot}{\@ylfoot}}
|
| 279 |
+
\def\@xlfoot[#1]#2{\fancy@def\f@ncyelf{#1}\fancy@def\f@ncyolf{#2}}
|
| 280 |
+
\def\@ylfoot#1{\fancy@def\f@ncyelf{#1}\fancy@def\f@ncyolf{#1}}
|
| 281 |
+
|
| 282 |
+
\newcommand{\cfoot}{\@ifnextchar[{\@xcfoot}{\@ycfoot}}
|
| 283 |
+
\def\@xcfoot[#1]#2{\fancy@def\f@ncyecf{#1}\fancy@def\f@ncyocf{#2}}
|
| 284 |
+
\def\@ycfoot#1{\fancy@def\f@ncyecf{#1}\fancy@def\f@ncyocf{#1}}
|
| 285 |
+
|
| 286 |
+
\newcommand{\rfoot}{\@ifnextchar[{\@xrfoot}{\@yrfoot}}
|
| 287 |
+
\def\@xrfoot[#1]#2{\fancy@def\f@ncyerf{#1}\fancy@def\f@ncyorf{#2}}
|
| 288 |
+
\def\@yrfoot#1{\fancy@def\f@ncyerf{#1}\fancy@def\f@ncyorf{#1}}
|
| 289 |
+
|
| 290 |
+
\newlength{\fancy@headwidth}
|
| 291 |
+
\let\headwidth\fancy@headwidth
|
| 292 |
+
\newlength{\f@ncyO@elh}
|
| 293 |
+
\newlength{\f@ncyO@erh}
|
| 294 |
+
\newlength{\f@ncyO@olh}
|
| 295 |
+
\newlength{\f@ncyO@orh}
|
| 296 |
+
\newlength{\f@ncyO@elf}
|
| 297 |
+
\newlength{\f@ncyO@erf}
|
| 298 |
+
\newlength{\f@ncyO@olf}
|
| 299 |
+
\newlength{\f@ncyO@orf}
|
| 300 |
+
\newcommand{\headrulewidth}{0.4pt}
|
| 301 |
+
\newcommand{\footrulewidth}{0pt}
|
| 302 |
+
\newcommand{\footruleskip}{.3\normalbaselineskip}
|
| 303 |
+
|
| 304 |
+
% Fancyplain stuff shouldn't be used anymore (rather
|
| 305 |
+
% \fancypagestyle{plain} should be used), but it must be present for
|
| 306 |
+
% compatibility reasons.
|
| 307 |
+
|
| 308 |
+
\newcommand{\plainheadrulewidth}{0pt}
|
| 309 |
+
\newcommand{\plainfootrulewidth}{0pt}
|
| 310 |
+
\newif\if@fancyplain \@fancyplainfalse
|
| 311 |
+
\def\fancyplain#1#2{\if@fancyplain#1\else#2\fi}
|
| 312 |
+
|
| 313 |
+
\headwidth=-123456789sp %magic constant
|
| 314 |
+
|
| 315 |
+
% Command to reset various things in the headers:
|
| 316 |
+
% a.o. single spacing (taken from setspace.sty)
|
| 317 |
+
% and the catcode of ^^M (so that epsf files in the header work if a
|
| 318 |
+
% verbatim crosses a page boundary)
|
| 319 |
+
% It also defines a \nouppercase command that disables \uppercase and
|
| 320 |
+
% \Makeuppercase. It can only be used in the headers and footers.
|
| 321 |
+
\let\fnch@everypar\everypar% save real \everypar because of spanish.ldf
|
| 322 |
+
\def\fancy@reset{\fnch@everypar{}\restorecr\endlinechar=13
|
| 323 |
+
\def\baselinestretch{1}%
|
| 324 |
+
\def\nouppercase##1{{\let\uppercase\relax\let\MakeUppercase\relax
|
| 325 |
+
\expandafter\let\csname MakeUppercase \endcsname\relax##1}}%
|
| 326 |
+
\ifx\undefined\@newbaseline% NFSS not present; 2.09 or 2e
|
| 327 |
+
\ifx\@normalsize\undefined \normalsize % for ucthesis.cls
|
| 328 |
+
\else \@normalsize \fi
|
| 329 |
+
\else% NFSS (2.09) present
|
| 330 |
+
\@newbaseline%
|
| 331 |
+
\fi}
|
| 332 |
+
|
| 333 |
+
% Initialization of the head and foot text.
|
| 334 |
+
|
| 335 |
+
% The default values still contain \fancyplain for compatibility.
|
| 336 |
+
\fancyhf{} % clear all
|
| 337 |
+
% lefthead empty on ``plain'' pages, \rightmark on even, \leftmark on odd pages
|
| 338 |
+
% evenhead empty on ``plain'' pages, \leftmark on even, \rightmark on odd pages
|
| 339 |
+
\if@twoside
|
| 340 |
+
\fancyhead[el,or]{\fancyplain{}{\sl\rightmark}}
|
| 341 |
+
\fancyhead[er,ol]{\fancyplain{}{\sl\leftmark}}
|
| 342 |
+
\else
|
| 343 |
+
\fancyhead[l]{\fancyplain{}{\sl\rightmark}}
|
| 344 |
+
\fancyhead[r]{\fancyplain{}{\sl\leftmark}}
|
| 345 |
+
\fi
|
| 346 |
+
\fancyfoot[c]{\rm\thepage} % page number
|
| 347 |
+
|
| 348 |
+
% Use box 0 as a temp box and dimen 0 as temp dimen.
|
| 349 |
+
% This can be done, because this code will always
|
| 350 |
+
% be used inside another box, and therefore the changes are local.
|
| 351 |
+
|
| 352 |
+
\def\@fancyvbox#1#2{\setbox0\vbox{#2}\ifdim\ht0>#1\@fancywarning
|
| 353 |
+
{\string#1 is too small (\the#1): ^^J Make it at least \the\ht0.^^J
|
| 354 |
+
We now make it that large for the rest of the document.^^J
|
| 355 |
+
This may cause the page layout to be inconsistent, however\@gobble}%
|
| 356 |
+
\dimen0=#1\global\setlength{#1}{\ht0}\ht0=\dimen0\fi
|
| 357 |
+
\box0}
|
| 358 |
+
|
| 359 |
+
% Put together a header or footer given the left, center and
|
| 360 |
+
% right text, fillers at left and right and a rule.
|
| 361 |
+
% The \lap commands put the text into an hbox of zero size,
|
| 362 |
+
% so overlapping text does not generate an errormessage.
|
| 363 |
+
% These macros have 5 parameters:
|
| 364 |
+
% 1. LEFTSIDE BEARING % This determines at which side the header will stick
|
| 365 |
+
% out. When \fancyhfoffset is used this calculates \headwidth, otherwise
|
| 366 |
+
% it is \hss or \relax (after expansion).
|
| 367 |
+
% 2. \f@ncyolh, \f@ncyelh, \f@ncyolf or \f@ncyelf. This is the left component.
|
| 368 |
+
% 3. \f@ncyoch, \f@ncyech, \f@ncyocf or \f@ncyecf. This is the middle comp.
|
| 369 |
+
% 4. \f@ncyorh, \f@ncyerh, \f@ncyorf or \f@ncyerf. This is the right component.
|
| 370 |
+
% 5. RIGHTSIDE BEARING. This is always \relax or \hss (after expansion).
|
| 371 |
+
|
| 372 |
+
\def\@fancyhead#1#2#3#4#5{#1\hbox to\headwidth{\fancy@reset
|
| 373 |
+
\@fancyvbox\headheight{\hbox
|
| 374 |
+
{\rlap{\parbox[b]{\headwidth}{\raggedright#2}}\hfill
|
| 375 |
+
\parbox[b]{\headwidth}{\centering#3}\hfill
|
| 376 |
+
\llap{\parbox[b]{\headwidth}{\raggedleft#4}}}\headrule}}#5}
|
| 377 |
+
|
| 378 |
+
\def\@fancyfoot#1#2#3#4#5{#1\hbox to\headwidth{\fancy@reset
|
| 379 |
+
\@fancyvbox\footskip{\footrule
|
| 380 |
+
\hbox{\rlap{\parbox[t]{\headwidth}{\raggedright#2}}\hfill
|
| 381 |
+
\parbox[t]{\headwidth}{\centering#3}\hfill
|
| 382 |
+
\llap{\parbox[t]{\headwidth}{\raggedleft#4}}}}}#5}
|
| 383 |
+
|
| 384 |
+
\def\headrule{{\if@fancyplain\let\headrulewidth\plainheadrulewidth\fi
|
| 385 |
+
\hrule\@height\headrulewidth\@width\headwidth \vskip-\headrulewidth}}
|
| 386 |
+
|
| 387 |
+
\def\footrule{{\if@fancyplain\let\footrulewidth\plainfootrulewidth\fi
|
| 388 |
+
\vskip-\footruleskip\vskip-\footrulewidth
|
| 389 |
+
\hrule\@width\headwidth\@height\footrulewidth\vskip\footruleskip}}
|
| 390 |
+
|
| 391 |
+
\def\ps@fancy{%
|
| 392 |
+
\@ifundefined{@chapapp}{\let\@chapapp\chaptername}{}%for amsbook
|
| 393 |
+
%
|
| 394 |
+
% Define \MakeUppercase for old LaTeXen.
|
| 395 |
+
% Note: we used \def rather than \let, so that \let\uppercase\relax (from
|
| 396 |
+
% the version 1 documentation) will still work.
|
| 397 |
+
%
|
| 398 |
+
\@ifundefined{MakeUppercase}{\def\MakeUppercase{\uppercase}}{}%
|
| 399 |
+
\@ifundefined{chapter}{\def\sectionmark##1{\markboth
|
| 400 |
+
{\MakeUppercase{\ifnum \c@secnumdepth>\z@
|
| 401 |
+
\thesection\hskip 1em\relax \fi ##1}}{}}%
|
| 402 |
+
\def\subsectionmark##1{\markright {\ifnum \c@secnumdepth >\@ne
|
| 403 |
+
\thesubsection\hskip 1em\relax \fi ##1}}}%
|
| 404 |
+
{\def\chaptermark##1{\markboth {\MakeUppercase{\ifnum \c@secnumdepth>\m@ne
|
| 405 |
+
\@chapapp\ \thechapter. \ \fi ##1}}{}}%
|
| 406 |
+
\def\sectionmark##1{\markright{\MakeUppercase{\ifnum \c@secnumdepth >\z@
|
| 407 |
+
\thesection. \ \fi ##1}}}}%
|
| 408 |
+
%\csname ps@headings\endcsname % use \ps@headings defaults if they exist
|
| 409 |
+
\ps@@fancy
|
| 410 |
+
\gdef\ps@fancy{\@fancyplainfalse\ps@@fancy}%
|
| 411 |
+
% Initialize \headwidth if the user didn't
|
| 412 |
+
%
|
| 413 |
+
\ifdim\headwidth<0sp
|
| 414 |
+
%
|
| 415 |
+
% This catches the case that \headwidth hasn't been initialized and the
|
| 416 |
+
% case that the user added something to \headwidth in the expectation that
|
| 417 |
+
% it was initialized to \textwidth. We compensate this now. This loses if
|
| 418 |
+
% the user intended to multiply it by a factor. But that case is more
|
| 419 |
+
% likely done by saying something like \headwidth=1.2\textwidth.
|
| 420 |
+
% The doc says you have to change \headwidth after the first call to
|
| 421 |
+
% \pagestyle{fancy}. This code is just to catch the most common cases were
|
| 422 |
+
% that requirement is violated.
|
| 423 |
+
%
|
| 424 |
+
\global\advance\headwidth123456789sp\global\advance\headwidth\textwidth
|
| 425 |
+
\fi}
|
| 426 |
+
\def\ps@fancyplain{\ps@fancy \let\ps@plain\ps@plain@fancy}
|
| 427 |
+
\def\ps@plain@fancy{\@fancyplaintrue\ps@@fancy}
|
| 428 |
+
\let\ps@@empty\ps@empty
|
| 429 |
+
\def\ps@@fancy{%
|
| 430 |
+
\ps@@empty % This is for amsbook/amsart, which do strange things with \topskip
|
| 431 |
+
\def\@mkboth{\protect\markboth}%
|
| 432 |
+
\def\@oddhead{\@fancyhead\fancy@Oolh\f@ncyolh\f@ncyoch\f@ncyorh\fancy@Oorh}%
|
| 433 |
+
\def\@oddfoot{\@fancyfoot\fancy@Oolf\f@ncyolf\f@ncyocf\f@ncyorf\fancy@Oorf}%
|
| 434 |
+
\def\@evenhead{\@fancyhead\fancy@Oelh\f@ncyelh\f@ncyech\f@ncyerh\fancy@Oerh}%
|
| 435 |
+
\def\@evenfoot{\@fancyfoot\fancy@Oelf\f@ncyelf\f@ncyecf\f@ncyerf\fancy@Oerf}%
|
| 436 |
+
}
|
| 437 |
+
% Default definitions for compatibility mode:
|
| 438 |
+
% These cause the header/footer to take the defined \headwidth as width
|
| 439 |
+
% And to shift in the direction of the marginpar area
|
| 440 |
+
|
| 441 |
+
\def\fancy@Oolh{\if@reversemargin\hss\else\relax\fi}
|
| 442 |
+
\def\fancy@Oorh{\if@reversemargin\relax\else\hss\fi}
|
| 443 |
+
\let\fancy@Oelh\fancy@Oorh
|
| 444 |
+
\let\fancy@Oerh\fancy@Oolh
|
| 445 |
+
|
| 446 |
+
\let\fancy@Oolf\fancy@Oolh
|
| 447 |
+
\let\fancy@Oorf\fancy@Oorh
|
| 448 |
+
\let\fancy@Oelf\fancy@Oelh
|
| 449 |
+
\let\fancy@Oerf\fancy@Oerh
|
| 450 |
+
|
| 451 |
+
% New definitions for the use of \fancyhfoffset
|
| 452 |
+
% These calculate the \headwidth from \textwidth and the specified offsets.
|
| 453 |
+
|
| 454 |
+
\def\fancy@offsolh{\headwidth=\textwidth\advance\headwidth\f@ncyO@olh
|
| 455 |
+
\advance\headwidth\f@ncyO@orh\hskip-\f@ncyO@olh}
|
| 456 |
+
\def\fancy@offselh{\headwidth=\textwidth\advance\headwidth\f@ncyO@elh
|
| 457 |
+
\advance\headwidth\f@ncyO@erh\hskip-\f@ncyO@elh}
|
| 458 |
+
|
| 459 |
+
\def\fancy@offsolf{\headwidth=\textwidth\advance\headwidth\f@ncyO@olf
|
| 460 |
+
\advance\headwidth\f@ncyO@orf\hskip-\f@ncyO@olf}
|
| 461 |
+
\def\fancy@offself{\headwidth=\textwidth\advance\headwidth\f@ncyO@elf
|
| 462 |
+
\advance\headwidth\f@ncyO@erf\hskip-\f@ncyO@elf}
|
| 463 |
+
|
| 464 |
+
\def\fancy@setoffs{%
|
| 465 |
+
% Just in case \let\headwidth\textwidth was used
|
| 466 |
+
\fancy@gbl\let\headwidth\fancy@headwidth
|
| 467 |
+
\fancy@gbl\let\fancy@Oolh\fancy@offsolh
|
| 468 |
+
\fancy@gbl\let\fancy@Oelh\fancy@offselh
|
| 469 |
+
\fancy@gbl\let\fancy@Oorh\hss
|
| 470 |
+
\fancy@gbl\let\fancy@Oerh\hss
|
| 471 |
+
\fancy@gbl\let\fancy@Oolf\fancy@offsolf
|
| 472 |
+
\fancy@gbl\let\fancy@Oelf\fancy@offself
|
| 473 |
+
\fancy@gbl\let\fancy@Oorf\hss
|
| 474 |
+
\fancy@gbl\let\fancy@Oerf\hss}
|
| 475 |
+
|
| 476 |
+
\newif\iffootnote
|
| 477 |
+
\let\latex@makecol\@makecol
|
| 478 |
+
\def\@makecol{\ifvoid\footins\footnotetrue\else\footnotefalse\fi
|
| 479 |
+
\let\topfloat\@toplist\let\botfloat\@botlist\latex@makecol}
|
| 480 |
+
\def\iftopfloat#1#2{\ifx\topfloat\empty #2\else #1\fi}
|
| 481 |
+
\def\ifbotfloat#1#2{\ifx\botfloat\empty #2\else #1\fi}
|
| 482 |
+
\def\iffloatpage#1#2{\if@fcolmade #1\else #2\fi}
|
| 483 |
+
|
| 484 |
+
\newcommand{\fancypagestyle}[2]{%
|
| 485 |
+
\@namedef{ps@#1}{\let\fancy@gbl\relax#2\relax\ps@fancy}}
|
app/scripts/latex-to-mdx/input/figures/ch1/ch1-lerobot-figure1.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-approaches.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-classical-limitations.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-cost-accessibility.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-floor-box.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-floor-shelf.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-floor.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-planar-manipulator-free.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-platforms.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch2/ch2-so100-to-planar-manipulator.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-agent-env.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-duck-sim-vs-real.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-hil-serl-examples.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-learning-atlas.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-learning-benefits.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-many-ducks.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-rl-algorithms-atlas.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch3/ch3-rl-examples.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-act-decoder.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-act-encoder.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-act.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-action-vs-observation-distribution.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-async-inference.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-bc-trajectories.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-policy.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-robot-actions.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-vs-flowmatching.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-issues-with-bc.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-latent-variable-model.png
ADDED
|
Git LFS Details
|
app/scripts/latex-to-mdx/input/figures/ch4/ch4-many-latents.png
ADDED
|
Git LFS Details
|