commit 422f95e27b
43 changed files with 1360 additions and 173 deletions
@@ -11,7 +11,7 @@ steps:
   path: /_cache
   commands:
   - pwd
-  - apt-get update && apt-get install -y cmake g++
+  - apt-get update && apt-get install -y cmake g++ libreadline-dev
   - shards install
   - shards build --production --static
   - strip bin/docmachine
1 .gitignore vendored
@@ -1,2 +1,3 @@
 /bin
 /lib
33 Makefile
@@ -1,6 +1,37 @@
# SPDX-License-Identifier: GPL-3.0-or-later
#
# SPDX-FileCopyrightText: 2023 Glenn Y. Rolland <glenux@glenux.net>
# Copyright © 2023 Glenn Y. Rolland <glenux@glenux.net>

CURRENT_UID := $(shell id -u)
CURRENT_GID := $(shell id -g)
ifeq ($(CURRENT_UID),0)
	PREFIX=/usr
else
	PREFIX=$(HOME)/.local
endif

all: build

prepare:
	shards install

build:
	shards build
	shards build --error-trace -Dpreview_mt
	@echo SUCCESS

watch:
	watchexec --restart --delay-run 3 -c -e cr make build

spec: test
test:
	crystal spec --error-trace

install:
	install \
		-m 755 \
		bin/docmachine \
		$(PREFIX)/bin

.PHONY: spec test build all prepare install
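The Makefile above selects an install prefix from the current user id (`/usr` for root, `~/.local` otherwise). The same rule can be sketched in plain shell to check what `make install` would do on a given machine (an illustrative sketch, not part of the repository):

```shell
# Mirror of the Makefile's PREFIX selection logic.
if [ "$(id -u)" = "0" ]; then
  PREFIX=/usr            # root installs system-wide
else
  PREFIX="$HOME/.local"  # regular users install into ~/.local
fi
echo "binaries would be installed to $PREFIX/bin"
```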
24 data/prompts/01-each-chapter--build-toc.tpl Normal file
@@ -0,0 +1,24 @@
{# GOAL: Build sections object containing subsections #}

Thank you.

Follow the structure of the PROGRAMME DE FORMATION (course syllabus).

Focus on the chapter « {{chapter.title}} ».

Start by writing "boundary: {{ delimiter }}".
Then write the detailed table of contents for this chapter, following this structure:

«
{
  "type": "section",
  "title": "",
  "subsections": [
    { "type": "subsection", "title": "...", "keywords": ["...", "..."] },
    { "type": "subsection", "title": "...", "keywords": ["...", "..."] },
    { "type": "subsection", "title": "...", "keywords": ["...", "..."] }
  ]
}
»

End by writing "boundary: {{ delimiter }}".
5 data/prompts/chats-sessions.md Normal file
@@ -0,0 +1,5 @@

# Chat sessions

* MongoDB training (Formation MongoDB)
  <https://chat.openai.com/c/e30a46c3-9bed-4896-bcfe-159865b501e3>
89 data/prompts/prompt-00-main.md Normal file
@@ -0,0 +1,89 @@
[TECHY is a technology expert with more than 20 years of experience. TECHY is specialized in systems, networks and infrastructures. They understand technology strengths, opportunities, weaknesses and threats. TECHY focuses only on the technical and technological aspects of things. TECHY is methodical and analytical in his answers. TECHY admits his incompetence on other topics. TECHY behaves like a blend of Linus Torvalds, Marc Andreessen, Paul Graham, Bruce Schneier, Bram Cohen, Philip Zimmermann, Jon Callas, Derek Atkins, Adam Back, Theo de Raadt, Nick Szabo, Runa Sandvik, David Chaum, Richard Matthew Stallman, Alan Turing, Ada Lovelace. In all messages, you will start your reply with "TECHY:" and then answer the questions or the prompt as if TECHY were answering himself.]

## Context

A top-quality training course, of the kind given at a prestigious university or school such as ENS Ulm, École Centrale, École Polytechnique, MIT or UC Berkeley.

## Target audience

* Database administrators
* System administrators
* Operations and DevOps professionals
* Anyone who wants to fully master MongoDB

## Prerequisites

* General knowledge of information systems, operating systems and IP networks.
* Good working knowledge of Linux
* Familiarity with basic and intermediate MongoDB concepts
* Several years of hands-on experience with MongoDB

## Objectives

* Manipulate and query data at an advanced level
* Know the best practices for performance optimization
* Understand advanced indexing and special collections
* Work on performance and high availability with sharding and replication
* Detect the causes of poor performance and fix them
* Handle growing load with load balancing.
* Build a backup strategy

## Course syllabus (PROGRAMME DE FORMATION)

### Advanced data manipulation

* Tuning the Mongo shell
* Efficient CRUD operations (inserts, queries, updates, deletes)
* Useful administration commands

### Performance optimization

* Built-in monitoring tools: mongotop, mongostat
* Analyzing memory and I/O performance
* MongoDB Cloud Manager and Munin
* Identifying sub-optimal queries. Using the query profiler.
* Storage engines: MMAPv1 and WiredTiger
* Explainable objects

### Indexing and special collections

* How indexes work and how to manage them
* Unique and compound field indexes
* Indexes on arrays and sub-documents
* Geospatial indexes
* Capped collections, TTL indexes and cursors

### Aggregation

* Single-purpose aggregation
* Aggregation pipelines
* Map-reduce

### Replication

* Asynchronous replication in MongoDB
* Setting up and maintaining a replica set
* Using "write concern" and "read preference"
* Handling replication failures

### Sharding

* Automatic sharding
* Setting up a MongoDB sharded cluster
* Choosing a shard key wisely
* Advanced administration of a sharded cluster
* Handling an unbalanced sharded cluster
* Managing chunks (splitting, merging, migration)

### Security

* Authentication and authorization in replica sets and sharded clusters
* Managing privileges and custom roles
* Recommendations for a safe deployment

### Backup and restore plans

* Filesystem-based strategies
* Using mongodump and mongorestore
* Point-in-time recovery
27 data/prompts/prompt-02-section-loop.md Normal file
@@ -0,0 +1,27 @@
GOAL: Build subsections object containing markdown

PROMPT:

Great.

Follow the structure of the PROGRAMME DE FORMATION (course syllabus).

Focus on the chapter « {{this.chapter.title}} ».

Within this chapter, focus on the section « {{this.section.title}} ».

Focus more specifically on the following subsection « {{this.subsection.title}} »,
which contains the following elements:
«
{% for child in this.children %}
* {{ child }}
{% endfor %}
»

Write the detailed content for these sub-topics, as short sentences and bullet-point lists, to fill the content of a PowerPoint presentation.

Give more precise and more technical information.

Clarify the words "manage" and "allow" whenever you use them.

Indicate where to find the information in MongoDB, in the CLI or in other tools. Give the commands to use and code examples where necessary.
18 data/prompts/prompt-03-subsection-loop.md Normal file
@@ -0,0 +1,18 @@
GOAL: provide examples and code for given subsection

PROMPT:

Great.
Follow the structure of the PROGRAMME DE FORMATION (course syllabus).

Focus on the chapter « {{this.chapter.title}} ». Focus more specifically on the following section « {{this.section}} » and the subsection « {{this.subsection.title}} ».

Explain the following topics:
«
{% for child in this.children %}
* {{child}}
{% endfor %}
»

Also give examples: shell commands, code snippets, or configuration excerpts to illustrate your explanations.
7 data/prompts/prompt-04-fix-content.md Normal file
@@ -0,0 +1,7 @@
@@ TEXTE A CORRIGER

[[FIXME: text]]

@@ REQUETE

In the TEXTE A CORRIGER (text to fix), point out the errors and approximations, and suggest corrections to improve the content (for an OpenStack training course).
67 data/prompts/syllabus.md Normal file
@@ -0,0 +1,67 @@

## Course syllabus

Please use the following syllabus for the 'beginner to advanced OpenStack administrator course'.

Week 1: Introduction to OpenStack

* Overview of the OpenStack platform and its components
* Setting up an OpenStack development environment
* Basic OpenStack commands and usage

Week 2: OpenStack Compute (Nova)

* Understanding Nova architecture and components
* Managing virtual machines and instances
* Configuring and managing flavors and images

Week 3: OpenStack Networking (Neutron)

* Understanding Neutron architecture and components
* Managing virtual networks and subnets
* Configuring security groups and firewall rules

Week 4: OpenStack Storage (Cinder and Swift)

* Understanding Cinder and Swift architecture and components
* Managing block and object storage
* Creating and managing volumes and snapshots

Week 5: OpenStack Identity (Keystone)

* Understanding Keystone architecture and components
* Managing users and projects
* Configuring authentication and authorization

Week 6: OpenStack Dashboard (Horizon)

* Understanding Horizon architecture and components
* Navigating and using the OpenStack dashboard
* Customizing and extending the dashboard

Week 7: Advanced OpenStack topics

* Deploying and managing a production OpenStack environment
* Managing and scaling an OpenStack cloud
* Troubleshooting and monitoring OpenStack
* OpenStack in a multi-node environment

Week 8: Project and exam

* Students will be given a project to deploy a multi-node OpenStack cloud, which they will need to complete
* A written exam will be held to evaluate each student's understanding of the course content and their ability to apply the knowledge to real-world scenarios


## ME

Please follow the syllabus structure and write the detailed content for "week 1: introduction to openstack".

## ME

Follow the syllabus and the content structure above, and write the detailed course content for "Basic Openstack commands and usage".

## ME

Please provide all explanations and commands for the "Configuring and managing flavors and images" part, as the content of multiple slides in markdown, separated by "----":
3 data/samples/content.01.01.01.md Normal file
@@ -0,0 +1,3 @@
## SUBSECTION TITLE

FIXME
9 data/samples/content.01.01.json Normal file
@@ -0,0 +1,9 @@
{
  "type": "section",
  "title": "",
  "subsections": [
    { "type": "subsection", "title": "...", "keywords": ["...", "..."] },
    { "type": "subsection", "title": "...", "keywords": ["...", "..."] },
    { "type": "subsection", "title": "...", "keywords": ["...", "..."] }
  ]
}
10 data/samples/content.01.json Normal file
@@ -0,0 +1,10 @@
{
  "type": "chapter",
  "title": "",
  "keywords": ["...", "...", "..."],
  "sections": [
    { "type": "section", "title": "...", "keywords": ["...", "..."] },
    { "type": "section", "title": "...", "keywords": ["...", "..."] },
    { "type": "section", "title": "...", "keywords": ["...", "..."] }
  ]
}
9 data/samples/content.json Normal file
@@ -0,0 +1,9 @@
{
  "type": "root-element",
  "title": "",
  "chapters": [
    { "type": "chapter", "title": "...", "keywords": ["...", "..."] },
    { "type": "chapter", "title": "...", "keywords": ["...", "..."] },
    { "type": "chapter", "title": "...", "keywords": ["...", "..."] }
  ]
}
38 docs/config.md Normal file
@@ -0,0 +1,38 @@

## Usage

Prepare a `docmachine.yaml` file:

    personae:
      - TECHY
      - GEOPOLY

    context: |
      some text here

    audience: |
      some text here

    prerequisites: |
      some text here

    goals: |
      some text here

    pattern: "{{chapter.index}}.{{section.index}}.{{chapter.slug}}.json"

Run docmachine to build the main TOC:

    docmachine plan -c docmachine.yaml --level 0 --output .

Run docmachine to build the TOC for all chapters:

    docmachine plan -c docmachine.yaml --level 1 --output .

Run docmachine to build the TOC (if missing):

    docmachine plan -c docmachine.yaml --level 0 --output .
91 docs/prompts.md Normal file
@@ -0,0 +1,91 @@

[TECHY is a technology expert with more than 20 years of experience. TECHY is specialized in systems, networks and infrastructures. They understand technology strengths, opportunities, weaknesses and threats. TECHY focuses only on the technical and technological aspects of things. TECHY is methodical and analytical in his answers. TECHY admits his incompetence on other topics. TECHY behaves like a blend of Linus Torvalds, Marc Andreessen, Paul Graham, Bruce Schneier, Bram Cohen, Philip Zimmermann, Jon Callas, Derek Atkins, Adam Back, Theo de Raadt, Nick Szabo, Runa Sandvik, David Chaum, Richard Matthew Stallman, Alan Turing, Ada Lovelace. In all messages, you will start your reply with "TECHY:" and then answer the questions or the prompt as if TECHY were answering himself.]

## Context

A top-quality training course, of the kind given at a prestigious university or school such as ENS Ulm, École Centrale, École Polytechnique, MIT or UC Berkeley.

## Target audience

* Database administrators
* System administrators
* Operations and DevOps professionals
* Anyone who wants to fully master MongoDB

## Prerequisites

* General knowledge of information systems, operating systems and IP networks.
* Good working knowledge of Linux
* Familiarity with basic and intermediate MongoDB concepts
* Several years of hands-on experience with MongoDB

## Objectives

* Manipulate and query data at an advanced level
* Know the best practices for performance optimization
* Understand advanced indexing and special collections
* Work on performance and high availability with sharding and replication
* Detect the causes of poor performance and fix them
* Handle growing load with load balancing.
* Build a backup strategy

## Course syllabus (PROGRAMME DE FORMATION)

### Advanced data manipulation

* Tuning the Mongo shell
* Efficient CRUD operations (inserts, queries, updates, deletes)
* Useful administration commands

### Performance optimization

* Built-in monitoring tools: mongotop, mongostat
* Analyzing memory and I/O performance
* MongoDB Cloud Manager and Munin
* Identifying sub-optimal queries. Using the query profiler.
* Storage engines: MMAPv1 and WiredTiger
* Explainable objects

### Indexing and special collections

* How indexes work and how to manage them
* Unique and compound field indexes
* Indexes on arrays and sub-documents
* Geospatial indexes
* Capped collections, TTL indexes and cursors

### Aggregation

* Single-purpose aggregation
* Aggregation pipelines
* Map-reduce

### Replication

* Asynchronous replication in MongoDB
* Setting up and maintaining a replica set
* Using "write concern" and "read preference"
* Handling replication failures

### Sharding

* Automatic sharding
* Setting up a MongoDB sharded cluster
* Choosing a shard key wisely
* Advanced administration of a sharded cluster
* Handling an unbalanced sharded cluster
* Managing chunks (splitting, merging, migration)

### Security

* Authentication and authorization in replica sets and sharded clusters
* Managing privileges and custom roles
* Recommendations for a safe deployment

### Backup and restore plans

* Filesystem-based strategies
* Using mongodump and mongorestore
* Point-in-time recovery
34 shard.lock Normal file
@@ -0,0 +1,34 @@
version: 2.0
shards:
  baked_file_system:
    git: https://github.com/schovi/baked_file_system.git
    version: 0.10.0

  cor:
    git: https://github.com/watzon/cor.git
    version: 0.1.0+git.commit.9c9e51ac6168f3bd4fdc51d679b65de09ef76cac

  crinja:
    git: https://github.com/straight-shoota/crinja.git
    version: 0.8.1

  ioctl:
    git: https://github.com/crystal-posix/ioctl.cr.git
    version: 1.0.0

  term-cursor:
    git: https://github.com/crystal-term/cursor.git
    version: 0.1.0+git.commit.8805d5f686d153db92cf2ce3333433f8ed3708d0

  term-prompt:
    git: https://github.com/crystal-term/prompt.git
    version: 0.1.0+git.commit.bf2b17f885a6c660aea0dda62b0b9da4343ab295

  term-reader:
    git: https://github.com/crystal-term/reader.git
    version: 0.1.0+git.commit.cd022d4d4628e5d9de47e669a770ccb7df412863

  term-screen:
    git: https://github.com/crystal-term/screen.git
    version: 0.1.0+git.commit.ea51ee8d1f6c286573c41a7e784d31c80af7b9bb
16 shard.yml
@@ -12,13 +12,15 @@ targets:
  docmachine:
    main: src/main.cr

# dependencies:
#   pg:
#     github: will/crystal-pg
#     version: "~> 0.5"
dependencies:
  term-prompt:
    github: crystal-term/prompt
  crinja:
    github: straight-shoota/crinja
  baked_file_system:
    github: schovi/baked_file_system
    version: 0.10.0

# development_dependencies:
#   webmock:
#     github: manastech/webmock.cr
# FIXME: for prompts rendering

license: MIT
53 src/build/cli.cr Normal file
@@ -0,0 +1,53 @@

require "./config"

module DocMachine::Build
  class Cli
    def self.add_options(opts, args, parent_config, commands)
      config = Config.new(parent_config)

      opts.on("build", "Build content and produce HTML & PDF deliverables") do
        opts.banner = [
          "Usage: #{PROGRAM_NAME} build [options]",
          "",
          "Main options:"
        ].join("\n")

        opts.separator ""
        opts.separator "Builder Options:"

        opts.on("-a", "--action ACTION", "Action (watch, build, shell, etc.)") do |action|
          config.action = action
        end

        opts.on("--no-cache", "Disable cache") do |_|
          config.enable_cache = false
        end

        opts.on("-d", "--data-dir DIR", "Content directory") do |dir|
          config.data_dir = dir
        end

        opts.on("-p", "--port PORT", "Set base port to PORT") do |port|
          config.port = port.to_i
        end

        opts.on("-m", "--multiple", "Allow multiple instances per dir") do |_|
          config.enable_multiple = true
        end

        opts.on("-t", "--tty", "Enable TTY mode (needed for shell)") do
          config.enable_tty = true
        end

        commands << ->() : Nil do
          app = DocMachine::Build::Run.new(config)
          app.prepare
          app.start
          app.wait
        end
      end

    end
  end
end
14 src/build/config.cr Normal file
@@ -0,0 +1,14 @@

module DocMachine::Build
  class Config
    property data_dir : String = Dir.current
    property action : String = "watch"
    property enable_tty : Bool = false
    property port : Int32 = 5100
    property enable_multiple : Bool = false
    property enable_cache : Bool = false

    def initialize(@parent : DocMachine::Config)
    end
  end
end
7 src/build/module.cr Normal file
@@ -0,0 +1,7 @@

require "../module"
require "log"

module DocMachine::Build
  Log = DocMachine::Log.for("docmachine")
end
210 src/build/run.cr Normal file
@@ -0,0 +1,210 @@

require "path"
require "file_utils"
require "socket"

require "./module"
require "./config"

module DocMachine::Build
  class Run
    Log = DocMachine::Build::Log.for("run")

    def initialize(@config : DocMachine::Build::Config)
      data = "#{@config.data_dir}:#{@config.port}"
      @basehash = Digest::SHA256.hexdigest(data)[0..6]
      @docker_name = "docmachine-#{@basehash}"
      @docker_image = "glenux/docmachine:latest"
      @docker_opts = [] of String
      @process = nil
    end

    # cleanup environment
    # create directories
    # setup permissions
    def prepare()
      Log.info { "basedir = #{@config.data_dir}" }
      Log.info { "docker_image = #{@docker_image}" }
      Log.info { "action = #{@config.action}" }

      self._pull_image()
      self._avoid_duplicates() unless @config.enable_multiple
    end

    private def _find_port(port_base)
      (port_base..65535).each do |port|
        return port if _port_available?(port)
      end
      raise "No port available"
    end

    private def _port_available?(port)
      sock = Socket.new(Socket::Family::INET, Socket::Type::STREAM)
      sock.bind(Socket::IPAddress.new("0.0.0.0", port))
      sock.close
      return true
    rescue ex : Socket::BindError
      return false
    end

    private def _avoid_duplicates
      Log.info { "Multiple Instances: stopping duplicate containers (for #{@docker_name})" }
      docker_cid = %x{docker ps -f "name=#{@docker_name}" -q}.strip

      Log.info { "Multiple Instances: docker_name: #{@docker_name}" }
      Log.info { "Multiple Instances: docker_cid: #{docker_cid || "-"}" }

      if !docker_cid.empty?
        Process.run("docker", ["kill", @docker_name])
        Process.run("docker", ["rm", @docker_name])
      end
    end

    def _pull_image
      # FIXME: add option to force update
      data_cache_dir = if ENV["XDG_CACHE_HOME"]?
                         Path[ENV["XDG_CACHE_HOME"], "docmachine"]
                       else
                         Path[ENV["HOME"], ".cache", "docmachine"]
                       end

      ## Build cache if it doesn't exist
      data_cache_file = data_cache_dir / "image.tar"
      Log.info { "Checking cache #{data_cache_file}..." }
      if !File.exists? data_cache_file.to_s
        Log.info { "Downloading #{@docker_image} image..." }
        Process.run("docker", ["pull", @docker_image], output: STDOUT)
        Log.info { "Building cache for image (#{data_cache_dir})" }
        FileUtils.mkdir_p(data_cache_dir)
        status = Process.run(
          "docker",
          ["image", "save", @docker_image, "-o", data_cache_file.to_s],
          output: STDOUT
        )
        if status.success?
          Log.info { "done" }
        else
          Log.error { "Unable to save cache image" }
          exit 1
        end
      else
        Log.info { "Cache already exists. Skipping." }
      end

      if @config.enable_cache
        Log.info { "Loading #{@docker_image} image from cache..." }
        docker_image_loaded = false
        status = Process.run(
          "docker",
          ["image", "load", "-i", data_cache_file.to_s],
          output: STDOUT
        )
        if status.success?
          Log.info { "done" }
        else
          Log.error { "Unable to load cache image" }
          exit 1
        end
      else
        Log.info { "Loading #{@docker_image} image from local registry..." }
        # FIXME: check that local image exists
      end
    end

    def start()
      uid = %x{id -u}.strip
      gid = %x{id -g}.strip
      Log.info { "uid: #{uid}" }
      Log.info { "gid: #{gid}" }

      docker_opts = [] of String
      docker_opts << "run"
      docker_opts << "-i"
      # add tty support
      docker_opts << "-t" if @config.enable_tty
      # add container name
      docker_opts.concat ["--name", @docker_name]
      docker_opts << "--rm"
      docker_opts << "--shm-size=1gb"
      docker_opts.concat ["-e", "EXT_UID=#{uid}"]
      docker_opts.concat ["-e", "EXT_GID=#{gid}"]
      docker_opts.concat ["-v", "#{@config.data_dir}/docs:/app/docs"]
      docker_opts.concat ["-v", "#{@config.data_dir}/slides:/app/slides"]
      docker_opts.concat ["-v", "#{@config.data_dir}/images:/app/images"]
      docker_opts.concat ["-v", "#{@config.data_dir}/_build:/app/_build"]

      ## Detect Marp SCSS
      if File.exists?("#{@config.data_dir}/.marp/theme.scss")
        docker_opt_marp_theme = ["-v", "#{@config.data_dir}/.marp:/app/.marp"]
        docker_opts.concat docker_opt_marp_theme
        Log.info { "Theme: detected Marp files. Adding option to command line (#{docker_opt_marp_theme})" }
      else
        Log.info { "Theme: no theme detected. Using default files" }
      end

      ## Detect Mkdocs configuration - old format (full)
      if File.exists?("#{@config.data_dir}/mkdocs.yml")
        Log.info { "Docs: detected mkdocs.yml file. Please rename to mkdocs-patch.yml" }
        exit 1
      end

      ## Detect Mkdocs configuration - new format (patch)
      if File.exists?("#{@config.data_dir}/mkdocs-patch.yml")
        docker_opt_mkdocs_config = ["-v", "#{@config.data_dir}/mkdocs-patch.yml:/app/mkdocs-patch.yml"]
        docker_opts.concat docker_opt_mkdocs_config
        Log.info { "Docs: detected mkdocs-patch.yml file. Adding option to command line (#{docker_opt_mkdocs_config})" }
      else
        Log.info { "Docs: no mkdocs-patch.yml detected. Using default files" }
      end

      ## Detect docs
      if Dir.exists?("#{@config.data_dir}/docs")
        Log.info { "Docs: detected docs directory." }
        mkdocs_port = _find_port(@config.port)
        docker_opt_mkdocs_port = ["-p", "#{mkdocs_port}:5100"]
        docker_opts.concat docker_opt_mkdocs_port
        Log.notice { "Using port #{mkdocs_port} for docs" }
        Log.info { "Docs: Adding option to command line (#{docker_opt_mkdocs_port})" }
      else
        Log.info { "Docs: no docs detected." }
      end

      ## Detect slides
      if Dir.exists?("#{@config.data_dir}/slides")
        Log.info { "Slides: detected slides directory." }
        marp_port = _find_port(@config.port + 100)
        docker_opt_marp_port = ["-p", "#{marp_port}:5200"]
        docker_opts.concat docker_opt_marp_port
        Log.info { "Slides: Adding option to command line (#{docker_opt_marp_port})" }
        Log.notice { "Slides: Using port #{marp_port} for slides" }
      else
        Log.info { "Slides: no slides directory detected." }
      end

      docker_opts << @docker_image
      docker_opts << @config.action

      Log.info { docker_opts.inspect.colorize(:yellow) }
      @process = Process.new("docker", docker_opts, output: STDOUT, error: STDERR)
    end

    def wait()
      process = @process
      return if process.nil?

      Signal::INT.trap do
        Log.warn { "Received CTRL-C" }
        process.signal(Signal::KILL)
        Process.run("docker", ["kill", @docker_name])
      end
      process.wait
    end

    def stop()
    end

    def docker_opts()
    end
  end
end
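`Run#initialize` derives the container name from the first seven hex digits of a SHA-256 over `data_dir:port`, and `_pull_image` caches the image tarball under the XDG cache directory (falling back to `~/.cache`). The same naming scheme can be sketched in shell (an illustrative sketch assuming coreutils `sha256sum`; the example `data_dir` and `port` values are arbitrary):

```shell
# Mirror of the naming scheme in Run#initialize and _pull_image.
data_dir=/tmp/project
port=5100

# Crystal's hexdigest[0..6] keeps 7 characters -> cut -c1-7
basehash=$(printf '%s' "$data_dir:$port" | sha256sum | cut -c1-7)
docker_name="docmachine-$basehash"

# XDG_CACHE_HOME fallback, as in _pull_image
cache_dir="${XDG_CACHE_HOME:-$HOME/.cache}/docmachine"
cache_file="$cache_dir/image.tar"

echo "$docker_name"
echo "$cache_file"
```

Because the hash covers both the directory and the port, two instances watching the same directory on different ports get distinct container names.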
96 src/cli.cr
@@ -2,65 +2,73 @@ require "option_parser"
 require "digest/sha256"
 require "colorize"
 
-require "./launcher"
+require "./log"
+require "./config"
+require "./build/cli"
+require "./build/run"
+require "./scaffold/cli"
+require "./scaffold/run"
+require "./plan/cli"
+require "./plan/run"
+require "./write/cli"
+require "./write/run"
 
 module DocMachine
   class Cli
+    Log = DocMachine::Log.for("cli")
+
     def initialize
     end
 
-    def start(argv)
-      options = {} of Symbol => String
+    def start(args)
+      config = Config.new
+      commands = [] of Proc(Nil)
 
       parser = OptionParser.new do |opts|
-        opts.banner = "Usage: script.cr [options]"
+        opts.banner = [
+          "Usage: #{PROGRAM_NAME} [options]",
+          "",
+          "Main options:"
+        ].join("\n")
 
-        opts.on("-d", "--data-dir DIR", "Content directory") do |dir|
-          options[:data_dir] = dir
-        end
-
-        opts.on("-a", "--action ACTION", "Action (watch, build, shell, etc.)") do |action|
-          options[:action] = action
-        end
-
-        opts.on("-t", "--tty", "Enable TTY mode (needed for shell)") do |tty|
-          options[:tty] = tty
-        end
-
-        opts.on("-v", "--verbose", "Enable verbosity") do |verbose|
-          options[:verbose] = true.to_s
+        opts.on("-v", "--verbosity LEVEL", "Change verbosity level to LEVEL (0..3)") do |verbose|
+          verbose_i = verbose.to_i
+          verbose_i = 0 if verbose.to_i < 0
+          verbose_i = 3 if verbose.to_i > 3
+          config.verbosity = ::Log::Severity.from_value(3 - verbose_i)
+        rescue ex : ArgumentError
+          Log.error { "Wrong value for parameter --verbosity" }
+          exit 1
         end
 
         opts.on("-h", "--help", "Show this help") do
           puts opts
           exit
         end
-      end
 
-      parser.parse(ARGV)
-
-      basedir = options[:data_dir]? ? options[:data_dir] : Dir.current
-      basehash = Digest::SHA256.hexdigest(basedir)[0..6]
-      action = options[:action]? ? options[:action] : "watch"
-      verbosity = options[:verbose]? ? options[:verbose] : 0
-      docker_image = "glenux/docmachine:latest"
-
-      if options[:help]?
-        puts "Usage: script.cr [options]"
-        puts ""
-        puts "-d, --data-dir DIR   Content directory"
-        puts "-a, --action ACTION  Action (watch, build, shell, etc.)"
-        puts "-t, --tty            Enable TTY mode (needed for shell)"
-        puts "-v, --verbose        Enable verbosity"
-        puts "-h, --help           Show this help"
-        Log.notice { opts }
-        exit
-      end
-
-      launcher = DocMachine::Launcher.new(options)
-      launcher.prepare
-      launcher.start
-      launcher.wait
+        opts.separator ""
+        opts.separator "Commands:"
+
+        DocMachine::Scaffold::Cli.add_options(opts, args, config, commands)
+        DocMachine::Plan::Cli.add_options(opts, args, config, commands)
+        DocMachine::Write::Cli.add_options(opts, args, config, commands)
+        DocMachine::Build::Cli.add_options(opts, args, config, commands)
+      end
+
+      parser.parse(args)
+      ::Log.setup(config.verbosity, ::Log::IOBackend.new(formatter: BaseFormat))
+      Log.notice { "verbosity level = #{config.verbosity}" }
+      Log.debug { "commands = #{commands}" }
+
+      if commands.size < 1
+        Log.error { parser.to_s }
+        Log.error { "No command defined" }
+      end
+
+      commands.each do |command|
+        Log.debug { "== Running #{command}" }
+        command.call()
+      end
     end
   end
 end
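The `--verbosity` handler above maps a user-facing level 0..3 onto Crystal's `Log::Severity` by clamping the input and inverting it (`3 - level`), so that a higher `-v` value means a lower (more verbose) severity threshold. A minimal standalone sketch of that mapping, assuming the stock `Log::Severity` enum where lower values are more verbose; the helper name `severity_for` is illustrative, not part of the codebase:

```crystal
require "log"

# Illustrative helper (not in the codebase): clamp a -v LEVEL argument
# to 0..3, then invert it so higher levels select more verbose logging.
def severity_for(level : Int32) : ::Log::Severity
  level = level.clamp(0, 3)
  ::Log::Severity.from_value(3 - level)
end

puts severity_for(0) # default threshold, matching Config's Notice default
puts severity_for(9) # clamped to 3, the most verbose of the four levels
```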
14 src/common/docker.cr Normal file
@@ -0,0 +1,14 @@

class Docker
  property image : String

  def initialize(@image)
  end

  def store_image
  end

  def image_load
  end
end
13 src/config.cr Normal file
@@ -0,0 +1,13 @@

require "./module"

module DocMachine
  class Config
    property verbosity = ::Log::Severity::Notice

    def initialize
    end
  end
end
124 src/launcher.cr
@@ -1,124 +0,0 @@

module DocMachine
  class Launcher
    def initialize(config)
      @basedir = config[:data_dir]? ? config[:data_dir] : Dir.current
      @basehash = Digest::SHA256.hexdigest(@basedir)[0..6]
      @action = config[:action]? ? config[:action] : "watch"
      # @verbosity = config[:verbose]? ? config[:verbose] : 0
      @docker_name = "docmachine-#{@basehash}"
      @docker_image = "glenux/docmachine:latest"
      @docker_opts = [] of String
      @enable_tty = !!config[:tty]?
      @process = nil
    end

    # cleanup environment
    # create directories
    # setup permissions
    def prepare()
      puts "basedir = #{@basedir}"
      puts "docker_image = #{@docker_image}"
      puts "action = #{@action}"

      docker_cid = %x{docker ps -f "name=#{@docker_name}" -q}.strip

      puts "docker_name: #{@docker_name}"
      puts "docker_cid: #{docker_cid}"

      if !docker_cid.empty?
        Process.run("docker", ["kill", @docker_name])
      end
    end

    def start()
      uid = %x{id -u}.strip
      gid = %x{id -g}.strip
      puts "uid: #{uid}"
      puts "cid: #{gid}"

      docker_opts = [] of String
      docker_opts << "run"
      docker_opts << "-i"
      # add tty support
      docker_opts << "-t" if @enable_tty
      # add container name
      docker_opts.concat ["--name", @docker_name]
      docker_opts << "--rm"
      docker_opts << "--shm-size=1gb"
      docker_opts.concat ["-e", "EXT_UID=#{uid}"]
      docker_opts.concat ["-e", "EXT_GID=#{gid}"]
      docker_opts.concat ["-v", "#{@basedir}/docs:/app/docs"]
      docker_opts.concat ["-v", "#{@basedir}/slides:/app/slides"]
      docker_opts.concat ["-v", "#{@basedir}/images:/app/images"]
      docker_opts.concat ["-v", "#{@basedir}/_build:/app/_build"]

      ## Detect Marp SCSS
      if File.exists?("#{@basedir}/.marp/theme.scss")
        docker_opt_marp_theme = ["-v", "#{@basedir}/.marp:/app/.marp"]
        docker_opts.concat docker_opt_marp_theme
        puts "Theme: detected Marp files. Adding option to command line (#{docker_opt_marp_theme})"
      else
        puts "Theme: no theme detected. Using default files"
      end

      ## Detect Mkdocs configuration - old format (full)
      if File.exists?("#{@basedir}/mkdocs.yml")
        puts "Mkdocs: detected mkdocs.yml file. Please rename to mkdocs-patch.yml"
        exit 1
      end

      ## Detect Mkdocs configuration - new format (patch)
      if File.exists?("#{@basedir}/mkdocs-patch.yml")
        docker_opt_mkdocs_config = ["-v", "#{@basedir}/mkdocs-patch.yml:/app/mkdocs-patch.yml"]
        docker_opts.concat docker_opt_mkdocs_config
        puts "Mkdocs: detected mkdocs-patch.yml file. Adding option to command line (#{docker_opt_mkdocs_config})"
      else
        puts "Mkdocs: no mkdocs-patch.yml detected. Using default files"
      end

      ## Detect slides
      if Dir.exists?("#{@basedir}/slides")
        docker_opt_marp_port = ["-p", "5200:5200"]
        docker_opts.concat docker_opt_marp_port
        puts "Slides: detected slides directory. Adding option to command line (#{docker_opt_marp_port})"
      else
        puts "Slides: no slides directory detected."
      end

      ## Detect docs
      if Dir.exists?("#{@basedir}/docs")
        docker_opt_marp_port = ["-p", "5100:5100"]
        docker_opts.concat docker_opt_marp_port
        puts "Slides: detected docs directory. Adding option to command line (#{docker_opt_marp_port})"
      else
        puts "Slides: no slides docs detected."
      end

      docker_opts << @docker_image
      docker_opts << @action

      puts docker_opts.inspect.colorize(:yellow)
      @process = Process.new("docker", docker_opts, output: STDOUT, error: STDERR)
    end

    def wait()
      process = @process
      return if process.nil?

      Signal::INT.trap do
        STDERR.puts "Received CTRL-C"
        process.signal(Signal::KILL)
        Process.run("docker", ["kill", @docker_name])
      end
      process.wait
    end

    def stop()
    end

    def docker_opts()
    end
  end
end
38 src/log.cr Normal file
@@ -0,0 +1,38 @@

require "log"
require "colorize"

struct DebugFormat < Log::StaticFormatter
  def run
    string @entry.severity.label[0].downcase
    string ": "
    source
    string ": "
    message
  end
end

struct BaseFormat < Log::StaticFormatter
  def run
    io = ::IO::Memory.new

    color = case @entry.severity
            when ::Log::Severity::Error
              Colorize.with.red.bold
            when ::Log::Severity::Warn
              Colorize.with.yellow
            when ::Log::Severity::Notice
              Colorize.with.bold
            else
              Colorize.with
            end

    color.surround(io) do
      io << @entry.message
    end

    string io.to_s
  end
end

# Log.define_formatter BaseFormat, "#{severity.to_s.lstrip}(#{source}): #{message}"
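The BaseFormat struct above leans on `Colorize::Object#surround`, which wraps everything written inside the block in the object's escape sequence. A minimal standalone sketch of that pattern, assuming Crystal's standard `colorize` API:

```crystal
require "colorize"

io = IO::Memory.new
# Everything appended inside the block is wrapped in the red+bold
# escape sequence when color output is enabled for the target IO.
Colorize.with.red.bold.surround(io) do
  io << "error text"
end
puts io.to_s
```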

@@ -1,6 +1,9 @@
 require "./cli"
+require "./log"
 
+::Log.setup(:notice, Log::IOBackend.new(formatter: BaseFormat))
+::Log.progname = "(root)"
 app = DocMachine::Cli.new
 app.start(ARGV)
6 src/module.cr Normal file
@@ -0,0 +1,6 @@

require "log"

module DocMachine
  Log = ::Log.for("docmachine")
end
17 src/plan/cli.cr Normal file
@@ -0,0 +1,17 @@

require "./config"

module DocMachine::Plan
  class Cli
    def self.add_options(opts, args, parent_config, command)
      config = Config.new(parent_config)

      opts.on("plan", "Generate content structure (beta)") do
        opts.banner = "Usage: #{PROGRAM_NAME} plan [options]"
        opts.on("-t", "--test", "Test") do
          Log.info { "Test" }
        end
      end
    end
  end
end
9 src/plan/config.cr Normal file
@@ -0,0 +1,9 @@

module DocMachine
  module Plan
    class Config
      def initialize(@parent : DocMachine::Config)
      end
    end
  end
end
48 src/plan/run.cr Normal file
@@ -0,0 +1,48 @@

# Ref. https://platform.openai.com/docs/api-reference/chat

## -x --expertise XXX
## -a --audience
## -o --objectives

# System

# System (common)
#
# Characters
#   TEACHER is ...
#   AUDITOR is ...
#
# Content Type
#   ...
#
# Audience
#   ...
#
# Topic
#   ...

# Message (1) : please define main TOC
#
# Request
#   Write TOC for a ...
#
# Constraints
#   ...
#
# => JSON

# Message (2) : please define chapter TOC
#
# TOC
#   ...
# Request
#   Write content structure for chapter ...
#   ...
# => JSON

# Message (3) : please write

# [auto] Request
# [auto]
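The comment outline above describes a chat transcript to send to the API: one system message for guidance, then user messages for context, audience, objectives, and TOC requests. Purely as an illustration of that intended request shape (the `messages` structure is hypothetical, not from the codebase):

```crystal
# Hypothetical message list mirroring the outline above; roles follow
# the OpenAI chat API referenced in the header comment.
messages = [
  {role: "system", content: "Characters, content type, audience, topic ..."},
  {role: "user", content: "Please define the main TOC ... => JSON"},
  {role: "user", content: "Please define the chapter TOC ... => JSON"},
]
messages.each { |m| puts "#{m[:role]}: #{m[:content]}" }
```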
31 src/scaffold/cli.cr Normal file
@@ -0,0 +1,31 @@
require "./config"
require "./run"

module DocMachine::Scaffold
  class Cli
    def self.add_options(opts, args, parent_config, commands)
      config = Config.new(parent_config)

      opts.on("scaffold", "Scaffold target directory (beta)") do
        opts.banner = "Usage: #{PROGRAM_NAME} scaffold [options] TARGET"

        opts.on("-f", "--force", "Don't ask for confirmation") do
          config.force = true
        end

        commands << ->() : Nil do
          if args.size < 1
            Log.error { "ERROR: No target given!" }
            exit 1
          end
          config.target_directory = args[0]

          app = DocMachine::Scaffold::Run.new(config)
          app.prepare
          app.start
          app.wait
        end
      end
    end
  end
end
11 src/scaffold/config.cr Normal file
@@ -0,0 +1,11 @@

module DocMachine::Scaffold
  class Config
    property target_directory : String = "."
    property force : Bool = false

    def initialize(@parent : DocMachine::Config)
    end
  end
end
0 src/scaffold/module.cr Normal file
61 src/scaffold/run.cr Normal file
@@ -0,0 +1,61 @@

# Core
require "file_utils"

# Internal
require "./config"

# Shards
require "term-prompt"

module DocMachine::Scaffold
  class Run
    private property config : DocMachine::Scaffold::Config

    def initialize(@config)
    end

    # Verify parameters
    def prepare()
      if ! File.directory? @config.target_directory
        Log.error { "ERROR: target must be a directory" }
        exit 1
      end

      Log.info { "Target directory: #{@config.target_directory}" }

      if !@config.force
        prompt = Term::Prompt.new
        confirm = prompt.no?("Are you sure you want to proceed?")
        exit 1 if !confirm
      end
    end

    def start()
      Log.info { "== Scaffolding #{@config.target_directory}" }
      p = Path.new(@config.target_directory)
      cwd = Dir.current
      ["docs", "slides", "images"].each do |dir|
        p_sub = p.join(dir)
        Log.info { "-- creating #{p_sub}" }
        FileUtils.mkdir_p(p_sub)
      end
      ["docs", "slides"].each do |dir|
        p_sub = p.join(dir)
        FileUtils.cd(p_sub)
        Log.info { "-- creating link to images in #{p_sub}" }
        if File.symlink? "images"
          FileUtils.rm "images"
        end
        FileUtils.ln_sf(Path.new("..", "images"), Path.new("images"))
        FileUtils.cd(cwd)
      end
      Log.info { "-- creating README.md" }
      FileUtils.touch("README.md")
    end

    # Wait for completion
    def wait()
    end
  end
end
45 src/write/cli.cr Normal file
@@ -0,0 +1,45 @@

require "./config"
require "./run"
require "./module"

module DocMachine::Write
  class Cli
    Log = DocMachine::Write::Log.for("cli")

    def self.add_options(opts, args, parent_config, commands)
      config = Config.new(parent_config)

      opts.on("write", "Write content target for plan (beta)") do
        opts.banner = "Usage: #{PROGRAM_NAME} write [options] TARGET"

        opts.on("-f", "--force", "Don't ask for confirmation") do
          config.force = true
        end

        opts.on("-t", "--template TEMPLATE", "Use given template") do |template_name|
          config.template_name = template_name
        end

        commands << ->() : Nil do
          Log.debug { "before any" }
          if args.size < 1
            Log.error { "No target given!" }
            exit 1
          end
          config.target_directory = args[0]

          Log.debug { "before new" }
          app = DocMachine::Write::Run.new(config)
          Log.debug { "before prepare" }
          app.prepare
          Log.debug { "before start" }
          app.start
          Log.debug { "before wait" }
          app.wait
        end
      end
    end
  end
end
14 src/write/config.cr Normal file
@@ -0,0 +1,14 @@

module DocMachine::Write
  class Config
    Log = DocMachine::Write::Log.for("config")

    property target_directory : String = "."
    property force : Bool = false
    property template_name : String = ""

    def initialize(@parent : DocMachine::Config)
    end
  end
end
7 src/write/file_storage.cr Normal file
@@ -0,0 +1,7 @@
require "baked_file_system"

class FileStorage
  extend BakedFileSystem

  bake_folder "../../data/prompts"
end
6 src/write/module.cr Normal file
@@ -0,0 +1,6 @@

require "../module"

module DocMachine::Write
  Log = DocMachine::Log.for("write")
end
6 src/write/nodes/module.cr Normal file
@@ -0,0 +1,6 @@

require "../module"

module DocMachine::Write::Nodes
  Log = DocMachine::Write::Log.for("nodes")
end
46 src/write/nodes/root_node.cr Normal file
@@ -0,0 +1,46 @@

require "./module"

module DocMachine::Write::Nodes
  class RootNode
    property context : String = ""
    property audience : String = ""
    property goals : String = ""
    property constraints : String = ""
    property chapters = [] of ChapterNode

    def build_chapters()
      [] of ChapterNode
    end
  end

  class ChapterNode
    property sections = [] of SectionNode

    def build_sections()
      [] of SectionNode
    end
  end

  class SectionNode
    property subsections = [] of SubsectionNode

    def build_subsections()
      [] of SubsectionNode
    end
  end

  class SubsectionNode
    property content = [] of String

    def build_content()
      [] of String
    end

    def fix_content()
      [] of String
    end
  end
end
179 src/write/run.cr Normal file
@@ -0,0 +1,179 @@

# Core
require "file_utils"
require "path"

# Internal
require "./config"
require "./nodes/root_node"

# Shards
require "term-prompt"
require "crinja"

module DocMachine::Write
  class Run
    private property config : DocMachine::Write::Config
    property root = Nodes::RootNode.new

    def initialize(@config)
    end

    def validate_build_dir()
      if ! File.directory? @config.target_directory
        Log.error { "Target must be a directory" }
        exit 1
      end

      Log.info { "Target directory: #{@config.target_directory}" }
    end

    def ask_confirmation
      # if !@config.force
      #   prompt = Term::Prompt.new
      #   confirm = prompt.no?("Are you sure you want to proceed?")
      #   exit 1 if !confirm
      # end
    end

    def load_templates
      pp @config

      context_file = Path[@config.target_directory] / "CONTEXT.tpl"
      if ! File.exists? context_file
        raise "Context file #{context_file} is missing"
      end
      @root.context = File.read(context_file)

      audience_file = Path[@config.target_directory] / "AUDIENCE.tpl"
      if ! File.exists? audience_file
        raise "Audience file #{audience_file} is missing"
      end
      @root.audience = File.read(audience_file)

      goals_file = Path[@config.target_directory] / "GOALS.tpl"
      if ! File.exists? goals_file
        raise "Goals file #{goals_file} is missing"
      end
      @root.goals = File.read(goals_file)

      constraints_file = Path[@config.target_directory] / "CONSTRAINTS.tpl"
      if ! File.exists? constraints_file
        raise "Constraints file #{constraints_file} is missing"
      end
      @root.constraints = File.read(constraints_file)
    end

    # Verify parameters
    def prepare()
      validate_build_dir()
      ask_confirmation()
      load_templates()
      Log.debug { "done" }
    end

    ##
    ## ContentNode
    ##   type: ...
    ##   title: ...
    ##   keywords: ...
    ##   content: ...
    ##

    def start()
      @root.chapters = root.build_chapters()

      @root.chapters.each do |chapter|
        chapter.sections = chapter.build_sections()

        chapter.sections.each do |section|
          section.subsections = section.build_subsections()
          # FIXME(later): section.exercises = section.build_exercises()

          section.subsections.each do |subsection|
            subsection.content = subsection.build_content()
            subsection.content = subsection.fix_content()
            # FIXME(later): subsection.exercises = subsection.build_exercises()
          end
        end
      end
    end

    ## Level 0 - each topic : build TOC (chapter list)
    def from_topic_build_chapters
      chapter_build_toc_template = FileStorage.get("./../data/prompts/01-each-chapter--build-toc.tpl")
      chapters = [{ "title": "Terraform on Azure" }]
    end

    ## Level 1 - each chapter : build TOC (section list)
    # 1. build chat
    #    - (system) quality & style guidance
    #    - (user) context
    #    - (user) audience
    #    - (user) objectives
    #    - (user) main toc (chapters)
    def from_chapter_build_sections()
      delimiter = "34e127df" # FIXME: random 8 bytes hex string
      chapters.each do |chapter|
        template_vars = {
          delimiter: delimiter,
          chapter: chapter
        }
        render = Crinja.render(chapter_build_toc_template, template_vars)
        puts render
      end
    end

    ## Level 2 - each section : build TOC (subsection list)
    # 1. build chat
    #    - (system) quality & style guidance
    #    - (user) context
    #    - (user) audience
    #    - (user) objectives
    #    - (user) main toc (chapters)
    #    - (user) chapter toc (sections)
    # 2. make openai request
    # 3. validate result structure
    # 4. create section objects in memory
    def from_section_build_subsections()
      section_build_toc_tpl = File.read(DOCMACHINE_DATA_PATH / "prompts" / "02-each-section--build-toc.tpl")
    end

    ## Level 2 - each section : build EXERCISES
    # 1. build chat
    #    - (system) quality & style guidance
    #    - (user) context
    #    - (user) audience
    #    - (user) objectives
    #    - (user) main toc (chapters)
    #    - (user) chapter toc (sections)
    # 2. make openai request
    # 3. validate result structure
    # 4. create exercises objects in memory
    def from_section_build_exercises()
      section_build_toc_tpl = File.read(DOCMACHINE_DATA_PATH / "prompts" / "02-each-section--build-exercises.tpl")
    end

    def from_subsection_build_content()
      ## Level 3 - each subsection : build CONTENT
      section_build_toc_tpl = File.read(DOCMACHINE_DATA_PATH / "prompts" / "02-each-subsection--build-content.tpl")
    end

    def from_subsection_fix_content()
      ## Level 4 - each subsection : build FIXED CONTENT
      section_build_toc_tpl = File.read(DOCMACHINE_DATA_PATH / "prompts" / "02-each-subsection--fix-content.tpl")
    end

    def from_subsection_build_exercises()
      ## Level 1 - each subsection : build EXERCISES
      section_build_toc_tpl = File.read(DOCMACHINE_DATA_PATH / "prompts" / "02-each-subsection--build-exercises.tpl")
    end

    def wait()
    end
  end
end
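In `from_chapter_build_sections` above, the `delimiter` and `chapter` variables are fed into a Crinja (Jinja2-style) template. A self-contained sketch of that render step, with an inline template string standing in for the baked `.tpl` file:

```crystal
require "crinja"

# Inline stand-in for data/prompts/01-each-chapter--build-toc.tpl
template = "boundary: {{ delimiter }} / chapter: {{ chapter.title }}"
vars = {
  "delimiter" => "34e127df",
  "chapter"   => {"title" => "Terraform on Azure"},
}
result = Crinja.render(template, vars)
puts result
```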