Compare commits


No commits in common. "main" and "custom_content" have entirely different histories.

34 changed files with 1484 additions and 2671 deletions


@@ -3,36 +3,12 @@
the configuration format, with defaults, is documented below:
```toml
[engine]
mode = "markdown" # choose which post engine to use
# options: "markdown", "blag"
# absolutely do not use "blag" unless you know exactly
# what you are getting yourself into.
[engine.markdown] # options for the `markdown` engine
root = "posts" # where posts are served from
raw_access = true # allow visitors to see the raw markdown of a post
[engine.markdown.render]
escape = false # escape HTML in the markdown source instead of
# clobbering it (https://docs.rs/comrak/latest/comrak/struct.RenderOptions.html#structfield.escape)
unsafe = false # allow HTML and dangerous links (https://docs.rs/comrak/latest/comrak/struct.RenderOptions.html#structfield.unsafe_)
[engine.markdown.render.syntect]
load_defaults = false # include default syntect themes
themes_dir = "themes" # directory to include themes from
theme = "Catppuccin Mocha" # theme file name (without `.tmTheme`)
[engine.blag]
root = "posts" # where posts are served from
bin = "blag" # path to the `blag` binary
raw_access = true # allow visitors to see the raw bash of a post
[style]
title = "bingus-blog" # title of the blog
# description of the blog
description = "blazingly fast blog software written in rust memory safe"
js_enable = true # enable javascript (required for sorting and dates)
description = "blazingly fast markdown blog software written in rust memory safe"
markdown_access = true # allow users to see the raw markdown of a post
# endpoint: /posts/<name>.md
js_enable = true # enable javascript (required for below 2 options)
date_format = "RFC3339" # format string used to format dates in the backend
# it's highly recommended to leave this as default,
# so the date can be formatted by the browser.
@@ -40,39 +16,36 @@ date_format = "RFC3339" # format string used to format dates in the backend
default_sort = "date" # default sorting method ("date" or "name")
#default_color = "#f5c2e7" # default embed color, optional
[style.display_dates]
creation = true # display creation ("written") dates
modification = true # display modified ("last modified") dates
[rss]
enable = false # serve an rss feed under /feed.xml
# this may be a bit resource intensive
link = "https://..." # public url of the blog, required if rss is enabled
[custom] # custom override directories
[dirs]
posts = "posts" # where posts are stored
media = "media" # directory served under /media/
templates = "templates" # handlebars template drop-ins dir
static = "static" # custom static directory
custom_templates = "templates" # custom templates dir
custom_static = "static" # custom static dir
# see CUSTOM.md for documentation
[http]
host = "::" # ip to listen on
host = "0.0.0.0" # ip to listen on
port = 3000 # port to listen on
[cache]
enable = true # save metadata and rendered posts into RAM
# highly recommended, only turn off if absolutely necessary
#ttl = 5 # how long an item should persist in cache,
# in milliseconds
# uncomment to enable
cleanup = true # clean cache, highly recommended
#cleanup_interval = 86400000 # clean the cache regularly instead of
# just at startup, value in milliseconds
#cleanup_interval = 86400000 # clean the cache regularly instead of just at startup
# uncomment to enable
persistence = true # save the cache to disk on shutdown and load on startup
file = "cache" # file to save the cache to
compress = true # compress the cache file
compression_level = 3 # zstd compression level, 3 is recommended
[render]
syntect.load_defaults = false # include default syntect themes
syntect.themes_dir = "themes" # directory to include themes from
syntect.theme = "Catppuccin Mocha" # theme file name (without `.tmTheme`)
```
configuration is done in [TOML](https://toml.io/)
@@ -81,7 +54,7 @@ a default value
you don't have to copy the whole thing from here,
it's generated by the program if it doesn't exist
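every option has a default and the file is generated if missing, so a real `config.toml` only needs the values you want to change. a minimal sketch (the values here are hypothetical, keys as in the reference above):

```toml
# override only what differs from the defaults
[http]
port = 8080

[cache]
persistence = false
```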
## Specifying the configuration file
## Specifying Configuration
the configuration file is loaded from `config.toml` by default, but the path
can be overridden by setting the environment variable `BINGUS_BLOG_CONFIG`,
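for example, pointing the server at a system-wide file could look like this (the path is hypothetical):

```shell
# BINGUS_BLOG_CONFIG overrides the default ./config.toml lookup
export BINGUS_BLOG_CONFIG=/etc/bingus-blog/config.toml
```

the server is then started normally and reads its configuration from that path.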


@@ -3,7 +3,7 @@
bingus-blog supports loading custom content such as templates and static files
at runtime from custom locations.
the configuration options `custom.templates` and `custom.static`
the configuration options `dirs.custom_templates` and `dirs.custom_static`
allow you to set where these files are loaded from.
customizing the error page, other than CSS, is not supported at this time.
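a sketch of the relevant config fragment, with hypothetical directory names (the `[dirs]` table is documented in CONFIG.md):

```toml
[dirs]
custom_templates = "my-templates" # handlebars drop-ins checked first
custom_static = "my-static"       # files here shadow the embedded static files
```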
@@ -37,7 +37,7 @@ at startup. if you delete/create the directory, you must restart the program.
## Custom Static Files
GET requests to `/static` will first be checked against `custom.static`.
GET requests to `/static` will first be checked against `dirs.custom_static`.
if the file is not found in the *custom static directory*, bingus-blog will try
to serve it from the directory embedded in the executable. this means you can
add whatever you want in the *custom static directory* and it will be served
@@ -45,5 +45,5 @@ under `/static`.
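the fallback behaviour described above can be sketched in plain Rust. this is a simplified stand-in, not the project's actual implementation: the real server embeds files with `include_dir`, while a `HashMap` plays that role here:

```rust
use std::collections::HashMap;
use std::path::Path;

// resolve a /static request: a file in the custom directory on disk wins;
// otherwise fall back to the files embedded in the binary
// (modeled here as a HashMap of name -> bytes)
fn resolve_static(
    custom_dir: &Path,
    embedded: &HashMap<&str, &[u8]>,
    name: &str,
) -> Option<Vec<u8>> {
    if let Ok(bytes) = std::fs::read(custom_dir.join(name)) {
        return Some(bytes); // the custom file shadows the embedded one
    }
    embedded.get(name).map(|b| b.to_vec())
}
```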
## Custom Media
the endpoint `/media` is served from `custom.media`. no other logic or mechanism
the endpoint `/media` is served from `dirs.media`. no other logic or mechanism
is present.

Cargo.lock (generated, 1539 changed lines): file diff suppressed because it is too large.


@@ -3,22 +3,22 @@ name = "bingus-blog"
version = "0.1.0"
edition = "2021"
default-run = "bingus-blog"
repository = "https://git.slonk.ing/slonk/bingus-blog"
[features]
default = []
tokio-console = ["dep:console-subscriber"]
[profile.release]
lto = "thin"
lto = "fat"
opt-level = 3
strip = true
[dependencies]
arc-swap = { version = "1.7.1", features = ["serde"] }
askama = { version = "0.12.1", default-features = false }
async-trait = "0.1.85"
axum = { version = "0.8.1", features = [
askama = { version = "0.12.1", features = [
"with-axum",
], default-features = false }
askama_axum = { version = "0.4.0", default-features = false }
axum = { version = "0.7.5", features = [
"http1",
"json",
"query",
@@ -32,38 +32,32 @@ chrono = { version = "0.4.37", features = [
"serde",
], default-features = false }
color-eyre = "0.6.3"
comrak = { version = "0.35.0", features = [
comrak = { version = "0.22.0", features = [
"syntect",
"bon",
], default-features = false }
console-subscriber = { version = "0.4.1", optional = true }
const-str = "0.6.0"
console-subscriber = { version = "0.2.0", optional = true }
derive_more = "0.99.17"
fronma = "0.2.0"
futures = "0.3.31"
handlebars = "6.0.0"
include_dir = "0.7.4"
indexmap = { version = "2.7.0", features = ["serde"] }
mime_guess = "2.0.5"
notify-debouncer-full = { version = "0.5.0", default-features = false }
notify-debouncer-full = { version = "0.3.1", default-features = false }
rss = "2.0.7"
scc = { version = "2.1.0", features = ["serde"] }
serde = { version = "1.0.197", features = ["derive", "rc"] }
serde-value = "0.7.0"
serde = { version = "1.0.197", features = ["derive"] }
serde_json = { version = "1.0.124", features = ["preserve_order"] }
syntect = "5.2.0"
thiserror = "2.0.9"
thiserror = "1.0.58"
tokio = { version = "1.37.0", features = [
"fs",
"macros",
"rt-multi-thread",
"signal",
"process",
"sync",
] }
tokio-util = { version = "0.7.10", default-features = false }
toml = "0.8.12"
tower = { version = "0.5.2", features = ["util"] }
tower-http = { version = "0.6.2", features = [
tower = "0.4.13"
tower-http = { version = "0.5.2", features = [
"compression-gzip",
"fs",
"trace",


@@ -2,7 +2,7 @@
title: README
description: the README.md file of this project
author: slonkazoid
written_at: 2024-04-18T04:15:26+03:00
created_at: 2024-04-18T04:15:26+03:00
---
# bingus-blog
@@ -17,7 +17,7 @@ for bingus-blog viewers: [see original document](https://git.slonk.ing/slonk/bin
can write posts from anywhere and sync it with the server without headache
- RSS is supported
- the look of the blog is extremely customizable, with support for
[custom drop-ins](CUSTOM.md) for both templates and static content
[custom drop-ins](/CUSTOM.md) for both templates and static content
- really easy to deploy (the server is one executable file)
- blazingly fast
@@ -26,20 +26,18 @@ for bingus-blog viewers: [see original document](https://git.slonk.ing/slonk/bin
- [ ] blog thumbnail and favicon
- [ ] sort asc/desc
- [ ] extend syntect options
- [x] ^ fix syntect mutex poisoning
- [ ] better error reporting and error pages
- [ ] better tracing
- [ ] replace HashMap with HashCache once i implement [this](https://github.com/wvwwvwwv/scalable-concurrent-containers/issues/139)
- [ ] make date parsing less strict
- [x] improve home page
- [ ] improve home page
- [ ] multi-language support
- [x] add credits
- [x] be blazingly fast
- [x] 100+ MiB binary size
## Configuration
see [CONFIG.md](CONFIG.md)
see [CONFIG.md](/CONFIG.md)
## Building
@@ -54,7 +52,7 @@ cargo +nightly build --release
the executable will be located at `target/release/bingus-blog`.
see [BUILDING.md](BUILDING.md) for more information and detailed instructions.
see [BUILDING.md](/BUILDING.md) for more information and detailed instructions.
## Writing Posts
@@ -83,7 +81,7 @@ author: Blubber256 # author of the post
icon: /media/first-post/icon.png # icon/thumbnail of post used in embeds
icon_alt: Picture of a computer running DOOM
color: "#00aacc" # color of post, also used in embeds
written_at: 2024-04-18T04:15:26+03:00 # date of writing, this is highly
created_at: 2024-04-18T04:15:26+03:00 # date of writing, this is highly
# recommended if you are on a system which doesn't have btime (like musl),
# because this is fetched from file stats by default
#modified_at: ... # see above. this is also fetched from the filesystem
@@ -94,7 +92,7 @@ tags: # tags, or keywords, used in meta and also in the ui
only the first 3 fields are required. if it can't find the other 2 fields, it will
get them from filesystem metadata. if you are on musl and you omit the
`written_at` field, it will just not show up
`created_at` field, it will just not show up
the dates must follow the [RFC 3339](https://datatracker.ietf.org/doc/html/rfc3339)
standard. examples of valid and invalid dates:
@@ -110,8 +108,7 @@ standard. examples of valid and invalid dates:
## Non-static Routes
- `GET /`: index page, lists posts
- `GET /posts`: small preview of posts for embedding in other sites and such
- `GET /posts.json`: returns a list of all posts with metadata in JSON format
- `GET /posts`: returns a list of all posts with metadata in JSON format
- `GET /posts/<name>`: view a post
- `GET /posts/<name>.md`: view the raw markdown of a post
- `GET /post/*`: redirects to `/posts/*`
@@ -144,7 +141,3 @@ feature requests provided they fit the project and its values.
most people just ping me on discord with feature requests, but if your request is
non-trivial, please create an issue [here](https://git.slonk.ing/slonk/bingus-blog/issues).
## Blagging
you've scrolled this far. you deserve to know [the truth](https://git.slonk.ing/slonk/blag).


@@ -1,11 +1,11 @@
<div class="table">
{{#if (and (ne this.written_at null) style.display_dates.creation)}}
{{#if (ne this.created_at null)}}
<div class="created">written</div>
<div class="created value">{{>span_date dt=this.written_at df=style.date_format}}</div>
<div class="created value">{{>span_date date_time=this.created_at}}</div>
{{/if}}
{{#if (and (ne this.modified_at null) style.display_dates.modification)}}
{{#if (ne this.modified_at null)}}
<div class="modified">last modified</div>
<div class="modified value">{{>span_date dt=this.modified_at df=style.date_format}}</div>
<div class="modified value">{{>span_date date_time=this.modified_at}}</div>
{{/if}}
{{#if (gt (len this.tags) 0)}}
<div class="tags">tags</div>


@@ -1 +1 @@
<span class="date {{#if (eq df "RFC3339")}}date-rfc3339{{/if}}">{{date dt df}}</span>
<span class="date {{#if (eq df "RFC3339")}}date-rfc3339{{/if}}">{{date date_time df}}</span>


@@ -1,85 +1,63 @@
use std::collections::HashMap;
use std::sync::Arc;
use std::time::Duration;
use arc_swap::access::DynAccess;
use axum::extract::{Path, Query, State};
use axum::http::header::CONTENT_TYPE;
use axum::http::Request;
use axum::response::{Html, IntoResponse, Redirect, Response};
use axum::response::{IntoResponse, Redirect, Response};
use axum::routing::get;
use axum::{Json, Router};
use handlebars::Handlebars;
use include_dir::{include_dir, Dir};
use indexmap::IndexMap;
use rss::{Category, ChannelBuilder, ItemBuilder};
use serde::{Deserialize, Serialize};
use serde_value::Value;
use serde_json::Map;
use tokio::sync::RwLock;
use tower::service_fn;
use tower_http::services::ServeDir;
use tower_http::trace::TraceLayer;
use tracing::{info, info_span, Span};
use crate::config::{DirsConfig, RssConfig, StyleConfig};
use crate::config::{Config, DateFormat, Sort};
use crate::error::{AppError, AppResult};
use crate::path::SafePath;
use crate::post::{Filter, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::post::{MarkdownPosts, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::serve_dir_included::handle;
const STATIC: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/static");
#[derive(Serialize)]
pub struct BingusInfo {
pub name: &'static str,
pub version: &'static str,
pub repository: &'static str,
}
const BINGUS_INFO: BingusInfo = BingusInfo {
name: env!("CARGO_PKG_NAME"),
version: env!("CARGO_PKG_VERSION"),
repository: env!("CARGO_PKG_REPOSITORY"),
};
#[derive(Clone)]
#[non_exhaustive]
pub struct AppState {
pub rss: Arc<dyn DynAccess<RssConfig> + Send + Sync>,
pub style: Arc<dyn DynAccess<StyleConfig> + Send + Sync>,
pub posts: Arc<dyn PostManager + Send + Sync>,
pub templates: Arc<RwLock<Handlebars<'static>>>,
pub config: Arc<Config>,
pub posts: Arc<MarkdownPosts<Arc<Config>>>,
pub reg: Arc<RwLock<Handlebars<'static>>>,
}
#[derive(Serialize)]
struct IndexTemplate<'a> {
bingus_info: &'a BingusInfo,
title: &'a str,
description: &'a str,
posts: Vec<PostMetadata>,
rss: bool,
df: &'a DateFormat,
js: bool,
tags: IndexMap<Arc<str>, u64>,
color: Option<&'a str>,
sort: Sort,
tags: Map<String, serde_json::Value>,
joined_tags: String,
style: &'a StyleConfig,
}
#[derive(Serialize)]
struct PostsTemplate<'a> {
bingus_info: &'a BingusInfo,
posts: Vec<PostMetadata>,
js: bool,
style: &'a StyleConfig,
}
#[derive(Serialize)]
struct PostTemplate<'a> {
bingus_info: &'a BingusInfo,
meta: &'a PostMetadata,
body: Arc<str>,
perf: RenderStats,
rendered: String,
rendered_in: RenderStats,
markdown_access: bool,
df: &'a DateFormat,
js: bool,
color: Option<&'a str>,
joined_tags: String,
style: &'a StyleConfig,
raw_name: Option<String>,
}
#[derive(Deserialize)]
@@ -87,16 +65,14 @@ struct QueryParams {
tag: Option<String>,
#[serde(rename = "n")]
num_posts: Option<usize>,
#[serde(flatten)]
other: IndexMap<String, Value>,
}
fn collect_tags(posts: &Vec<PostMetadata>) -> IndexMap<Arc<str>, u64> {
let mut tags = IndexMap::new();
fn collect_tags(posts: &Vec<PostMetadata>) -> Map<String, serde_json::Value> {
let mut tags = HashMap::new();
for post in posts {
for tag in &post.tags {
if let Some((existing_tag, count)) = tags.swap_remove_entry(tag) {
if let Some((existing_tag, count)) = tags.remove_entry(tag) {
tags.insert(existing_tag, count + 1);
} else {
tags.insert(tag.clone(), 1);
@@ -104,13 +80,21 @@ fn collect_tags(posts: &Vec<PostMetadata>) -> IndexMap<Arc<str>, u64> {
}
}
tags.sort_unstable_by(|k1, _v1, k2, _v2| k1.cmp(k2));
tags.sort_by(|_k1, v1, _k2, v2| v2.cmp(v1));
let mut tags: Vec<(String, u64)> = tags.into_iter().collect();
tags
tags.sort_unstable_by_key(|(v, _)| v.clone());
tags.sort_by_key(|(_, v)| -(*v as i64));
let mut map = Map::new();
for tag in tags.into_iter() {
map.insert(tag.0, tag.1.into());
}
map
}
fn join_tags_for_meta(tags: &IndexMap<Arc<str>, u64>, delim: &str) -> String {
fn join_tags_for_meta(tags: &Map<String, serde_json::Value>, delim: &str) -> String {
let mut s = String::new();
let tags = tags.keys().enumerate();
let len = tags.len();
@@ -123,151 +107,102 @@ fn join_tags_for_meta(tags: &IndexMap<Arc<str>, u64>, delim: &str) -> String {
s
}
async fn index(
async fn index<'a>(
State(AppState {
rss,
style,
posts,
templates,
..
config, posts, reg, ..
}): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<impl IntoResponse> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(
query.num_posts,
query.tag.as_deref(),
&query.other,
)
.get_max_n_post_metadata_with_optional_tag_sorted(query.num_posts, query.tag.as_ref())
.await?;
let tags = collect_tags(&posts);
let joined_tags = join_tags_for_meta(&tags, ", ");
let reg = templates.read().await;
let style = style.load();
let reg = reg.read().await;
let rendered = reg.render(
"index",
&IndexTemplate {
bingus_info: &BINGUS_INFO,
title: &config.title,
description: &config.description,
posts,
rss: rss.load().enable,
js: style.js_enable,
rss: config.rss.enable,
df: &config.date_format,
js: config.js_enable,
color: config.default_color.as_deref(),
sort: config.default_sort,
tags,
joined_tags,
style: &style,
},
);
drop((style, reg));
Ok(Html(rendered?))
drop(reg);
Ok(([(CONTENT_TYPE, "text/html")], rendered?))
}
async fn posts_json(
async fn all_posts(
State(AppState { posts, .. }): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<Json<Vec<PostMetadata>>> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(
query.num_posts,
query.tag.as_deref(),
&query.other,
)
.get_max_n_post_metadata_with_optional_tag_sorted(query.num_posts, query.tag.as_ref())
.await?;
Ok(Json(posts))
}
async fn posts(
State(AppState {
posts,
templates,
style,
..
}): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<Html<String>> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(
query.num_posts,
query.tag.as_deref(),
&query.other,
)
.await?;
let reg = templates.read().await;
let style = style.load();
let rendered = reg.render(
"posts",
&PostsTemplate {
bingus_info: &BINGUS_INFO,
posts,
js: style.js_enable,
style: &style,
},
);
drop((style, reg));
Ok(Html(rendered?))
}
async fn rss(
State(AppState {
rss, style, posts, ..
}): State<AppState>,
State(AppState { config, posts, .. }): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<Response> {
if !rss.load().enable {
if !config.rss.enable {
return Err(AppError::RssDisabled);
}
let posts = posts
.get_all_posts(
query
.get_all_posts(|metadata, _| {
!query
.tag
.as_ref()
.and(Some(Filter::Tags(query.tag.as_deref().as_slice())))
.as_slice(),
&query.other,
)
.is_some_and(|tag| !metadata.tags.contains(tag))
})
.await?;
let rss = rss.load();
let style = style.load();
let mut channel = ChannelBuilder::default();
channel
.title(&*style.title)
.link(rss.link.to_string())
.description(&*style.description);
.title(&config.title)
.link(config.rss.link.to_string())
.description(&config.description);
//TODO: .language()
for (metadata, content, _) in posts {
channel.item(
ItemBuilder::default()
.title(metadata.title.to_string())
.description(metadata.description.to_string())
.author(metadata.author.to_string())
.title(metadata.title)
.description(metadata.description)
.author(metadata.author)
.categories(
metadata
.tags
.into_iter()
.map(|tag| Category {
name: tag.to_string(),
name: tag,
domain: None,
})
.collect::<Vec<Category>>(),
)
.pub_date(metadata.written_at.map(|date| date.to_rfc2822()))
.content(content.to_string())
.pub_date(metadata.created_at.map(|date| date.to_rfc2822()))
.content(content)
.link(
rss.link
config
.rss
.link
.join(&format!("/posts/{}", metadata.name))?
.to_string(),
)
.build(),
);
}
drop((style, rss));
let body = channel.build().to_string();
drop(channel);
@@ -277,68 +212,56 @@ async fn rss(
async fn post(
State(AppState {
style,
posts,
templates,
..
config, posts, reg, ..
}): State<AppState>,
SafePath(name): SafePath<Arc<str>>,
Query(query): Query<QueryParams>,
Path(name): Path<String>,
) -> AppResult<impl IntoResponse> {
match posts.get_post(name.clone(), &query.other).await? {
ReturnedPost::Rendered {
ref meta,
body,
perf,
raw_name,
} => {
match posts.get_post(&name).await? {
ReturnedPost::Rendered(ref meta, rendered, rendered_in) => {
let joined_tags = meta.tags.join(", ");
let reg = templates.read().await;
let style = style.load();
let reg = reg.read().await;
let rendered = reg.render(
"post",
&PostTemplate {
bingus_info: &BINGUS_INFO,
meta,
body,
perf,
js: style.js_enable,
color: meta.color.as_deref().or(style.default_color.as_deref()),
rendered,
rendered_in,
markdown_access: config.markdown_access,
df: &config.date_format,
js: config.js_enable,
color: meta.color.as_deref().or(config.default_color.as_deref()),
joined_tags,
style: &style,
raw_name,
},
);
drop((style, reg));
Ok(Html(rendered?).into_response())
drop(reg);
Ok(([(CONTENT_TYPE, "text/html")], rendered?).into_response())
}
ReturnedPost::Raw(body, content_type) => {
Ok(([(CONTENT_TYPE, content_type)], body).into_response())
}
ReturnedPost::Raw {
buffer,
content_type,
} => Ok(([(CONTENT_TYPE, content_type)], buffer).into_response()),
}
}
pub fn new(dirs: &DirsConfig) -> Router<AppState> {
pub fn new(config: &Config) -> Router<AppState> {
Router::new()
.route("/", get(index))
.route(
"/post/{name}",
get(|Path(name): Path<String>| async move { Redirect::to(&format!("/posts/{name}")) }),
"/post/:name",
get(
|Path(name): Path<String>| async move { Redirect::to(&format!("/posts/{}", name)) },
),
)
.route("/posts/{name}", get(post))
.route("/posts", get(posts))
.route("/posts.json", get(posts_json))
.route("/posts/:name", get(post))
.route("/posts", get(all_posts))
.route("/feed.xml", get(rss))
.nest_service(
"/static",
ServeDir::new(&dirs.static_)
ServeDir::new(&config.dirs.custom_static)
.precompressed_gzip()
.fallback(service_fn(|req| handle(req, &STATIC))),
)
.nest_service("/media", ServeDir::new(&dirs.media))
.nest_service("/media", ServeDir::new(&config.dirs.media))
.layer(
TraceLayer::new_for_http()
.make_span_with(|request: &Request<_>| {


@@ -1,80 +1,73 @@
use std::borrow::Cow;
use std::env;
use std::net::{IpAddr, Ipv6Addr};
use std::num::NonZeroU64;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use std::time::Duration;
use std::path::PathBuf;
use arc_swap::ArcSwap;
use color_eyre::eyre::{self, bail, Context};
use const_str::{concat, convert_ascii_case};
use notify_debouncer_full::notify::RecursiveMode;
use notify_debouncer_full::{new_debouncer, DebouncedEvent};
use color_eyre::eyre::{bail, Context, Result};
use serde::{Deserialize, Serialize};
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::select;
use tokio_util::sync::CancellationToken;
use tracing::{error, info, instrument, trace};
use tracing::{error, info, instrument};
use url::Url;
use crate::de::*;
use crate::ranged_i128_visitor::RangedI128Visitor;
#[derive(Serialize, Deserialize, Debug, PartialEq, Eq, Hash)]
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Hash)]
#[serde(default)]
pub struct SyntectConfig {
pub load_defaults: bool,
pub themes_dir: Option<Box<Path>>,
pub theme: Option<Box<str>>,
pub themes_dir: Option<PathBuf>,
pub theme: Option<String>,
}
#[derive(Serialize, Deserialize, Debug)]
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Hash, Default)]
#[serde(default)]
pub struct RenderConfig {
pub syntect: SyntectConfig,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct CacheConfig {
pub enable: bool,
#[serde(deserialize_with = "check_millis")]
pub ttl: Option<NonZeroU64>,
pub cleanup: bool,
#[serde(deserialize_with = "check_millis")]
pub cleanup_interval: Option<NonZeroU64>,
pub cleanup_interval: Option<u64>,
pub persistence: bool,
pub file: Box<Path>,
pub file: PathBuf,
pub compress: bool,
#[serde(deserialize_with = "check_zstd_level_bounds")]
pub compression_level: i32,
}
#[derive(Serialize, Deserialize, Debug)]
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct HttpConfig {
pub host: IpAddr,
pub port: u16,
}
#[derive(Serialize, Deserialize, Debug)]
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct DirsConfig {
pub media: Box<Path>,
#[serde(rename = "static")]
pub static_: Box<Path>,
pub templates: Box<Path>,
pub posts: PathBuf,
pub media: PathBuf,
pub custom_static: PathBuf,
pub custom_templates: PathBuf,
}
#[derive(Serialize, Deserialize, Debug)]
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct RssConfig {
pub enable: bool,
pub link: Url,
}
#[derive(Serialize, Deserialize, Debug, Default)]
#[derive(Serialize, Deserialize, Debug, Clone, Default)]
pub enum DateFormat {
#[default]
RFC3339,
#[serde(untagged)]
Strftime(Box<str>),
Strftime(String),
}
#[derive(Serialize, Deserialize, Debug, Clone, Copy, PartialEq, Eq, Default)]
#[derive(Serialize, Deserialize, Debug, Clone, Default, Copy, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
#[repr(u8)]
pub enum Sort {
@@ -83,107 +76,33 @@ pub enum Sort {
Name,
}
#[derive(Serialize, Deserialize, Debug)]
#[serde(default)]
pub struct StyleConfig {
pub title: Box<str>,
pub description: Box<str>,
pub js_enable: bool,
pub display_dates: DisplayDates,
pub date_format: DateFormat,
pub default_sort: Sort,
pub default_color: Option<Box<str>>,
}
impl Default for StyleConfig {
fn default() -> Self {
Self {
title: "bingus-blog".into(),
description: "blazingly fast markdown blog software written in rust memory safe".into(),
js_enable: true,
display_dates: Default::default(),
date_format: Default::default(),
default_sort: Default::default(),
default_color: Default::default(),
}
}
}
#[derive(Serialize, Deserialize, Debug, Clone, Copy)]
#[serde(default)]
pub struct DisplayDates {
pub creation: bool,
pub modification: bool,
}
#[derive(Serialize, Deserialize, Debug, PartialEq, Eq, Hash, Default)]
#[serde(default)]
pub struct MarkdownRenderConfig {
pub syntect: SyntectConfig,
pub escape: bool,
#[serde(rename = "unsafe")]
pub unsafe_: bool,
}
#[derive(Serialize, Deserialize, Debug)]
#[serde(default)]
pub struct MarkdownConfig {
pub root: Box<Path>,
pub render: MarkdownRenderConfig,
pub raw_access: bool,
}
impl Default for MarkdownConfig {
fn default() -> Self {
Self {
root: PathBuf::from("posts").into(),
render: Default::default(),
raw_access: true,
}
}
}
#[derive(Serialize, Deserialize, Debug)]
#[serde(default)]
pub struct BlagConfig {
pub root: Box<Path>,
pub bin: Box<Path>,
pub raw_access: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone, Copy, Default)]
#[serde(rename_all = "lowercase")]
pub enum EngineMode {
#[default]
Markdown,
Blag,
}
#[derive(Serialize, Deserialize, Debug, Default)]
#[serde(default, rename_all = "lowercase")]
pub struct Engine {
pub mode: EngineMode,
pub markdown: MarkdownConfig,
pub blag: BlagConfig,
}
#[derive(Serialize, Deserialize, Debug)]
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct Config {
pub engine: Engine,
pub style: StyleConfig,
pub title: String,
pub description: String,
pub markdown_access: bool,
pub js_enable: bool,
pub date_format: DateFormat,
pub default_sort: Sort,
pub default_color: Option<String>,
pub rss: RssConfig,
#[serde(rename = "custom")]
pub dirs: DirsConfig,
pub http: HttpConfig,
pub render: RenderConfig,
pub cache: CacheConfig,
}
impl Default for Config {
fn default() -> Self {
Self {
engine: Default::default(),
style: Default::default(),
title: "bingus-blog".into(),
description: "blazingly fast markdown blog software written in rust memory safe".into(),
markdown_access: true,
js_enable: true,
date_format: Default::default(),
default_sort: Default::default(),
default_color: None,
// i have a love-hate relationship with serde
// it was enigmatic at first, but then i started actually using it
// writing my own serialize and deserialize implementations.. spending
@@ -196,26 +115,19 @@ impl Default for Config {
},
dirs: Default::default(),
http: Default::default(),
render: Default::default(),
cache: Default::default(),
}
}
}
impl Default for DisplayDates {
fn default() -> Self {
Self {
creation: true,
modification: true,
}
}
}
impl Default for DirsConfig {
fn default() -> Self {
Self {
media: PathBuf::from("media").into_boxed_path(),
static_: PathBuf::from("static").into_boxed_path(),
templates: PathBuf::from("templates").into_boxed_path(),
posts: "posts".into(),
media: "media".into(),
custom_static: "static".into(),
custom_templates: "templates".into(),
}
}
}
@@ -233,7 +145,7 @@ impl Default for SyntectConfig {
fn default() -> Self {
Self {
load_defaults: false,
themes_dir: Some(PathBuf::from("themes").into_boxed_path()),
themes_dir: Some("themes".into()),
theme: Some("Catppuccin Mocha".into()),
}
}
@@ -243,38 +155,28 @@ impl Default for CacheConfig {
fn default() -> Self {
Self {
enable: true,
ttl: None,
cleanup: true,
cleanup_interval: None,
persistence: true,
file: PathBuf::from("cache").into(),
file: "cache".into(),
compress: true,
compression_level: 3,
}
}
}
impl Default for BlagConfig {
fn default() -> Self {
Self {
root: PathBuf::from("posts").into(),
bin: PathBuf::from("blag").into(),
raw_access: true,
}
}
}
fn config_path() -> Cow<'static, str> {
env::var(concat!(
convert_ascii_case!(shouty_snake, env!("CARGO_BIN_NAME")),
"_CONFIG"
#[instrument(name = "config")]
pub async fn load() -> Result<Config> {
let config_file = env::var(format!(
"{}_CONFIG",
env!("CARGO_BIN_NAME").to_uppercase().replace('-', "_")
))
.map(Into::into)
.unwrap_or("config.toml".into())
}
pub async fn load_from(path: (impl AsRef<Path> + std::fmt::Debug)) -> eyre::Result<Config> {
match tokio::fs::OpenOptions::new().read(true).open(&path).await {
.unwrap_or(String::from("config.toml"));
match tokio::fs::OpenOptions::new()
.read(true)
.open(&config_file)
.await
{
Ok(mut file) => {
let mut buf = String::new();
file.read_to_string(&mut buf)
@@ -290,7 +192,7 @@ pub async fn load_from(path: (impl AsRef<Path> + std::fmt::Debug)) -> eyre::Resu
.write(true)
.create(true)
.truncate(true)
.open(&path)
.open(&config_file)
.await
{
Ok(mut file) => file
@@ -300,104 +202,22 @@ pub async fn load_from(path: (impl AsRef<Path> + std::fmt::Debug)) -> eyre::Resu
.as_bytes(),
)
.await
.unwrap_or_else(|err| error!("couldn't write configuration: {err}")),
Err(err) => error!("couldn't open file {path:?} for writing: {err}"),
.unwrap_or_else(|err| error!("couldn't write configuration: {}", err)),
Err(err) => {
error!("couldn't open file {:?} for writing: {}", &config_file, err)
}
}
Ok(config)
}
_ => bail!("couldn't open config file: {err}"),
_ => bail!("couldn't open config file: {}", err),
},
}
}
#[instrument]
pub async fn load() -> eyre::Result<(Config, Cow<'static, str>)> {
let config_file = config_path();
let config = load_from(&*config_file).await?;
Ok((config, config_file))
}
async fn process_event(
event: DebouncedEvent,
config_file: &Path,
swapper: &ArcSwap<Config>,
) -> eyre::Result<()> {
if !event.kind.is_modify() && !event.kind.is_create()
|| !event.paths.iter().any(|p| p == config_file)
{
trace!("not interested: {event:?}");
return Ok(());
}
let config = load_from(config_file).await?;
info!("reloaded config from {config_file:?}");
swapper.store(Arc::new(config));
Ok(())
}
#[instrument(skip_all)]
pub async fn watcher(
config_file: impl AsRef<str>,
watcher_token: CancellationToken,
swapper: Arc<ArcSwap<Config>>,
) -> eyre::Result<()> {
let config_file = tokio::fs::canonicalize(config_file.as_ref())
.await
.context("failed to canonicalize path")?;
let (tx, mut rx) = tokio::sync::mpsc::channel(1);
let mut debouncer = new_debouncer(Duration::from_millis(100), None, move |events| {
tx.blocking_send(events)
.expect("failed to send message over channel")
})?;
let dir = config_file
.as_path()
.parent()
.expect("absolute path to have parent");
debouncer
.watch(dir, RecursiveMode::NonRecursive)
.with_context(|| format!("failed to watch {dir:?}"))?;
'event_loop: while let Some(ev) = select! {
_ = watcher_token.cancelled() => {
info!("2");
break 'event_loop;
},
ev = rx.recv() => ev,
} {
let events = match ev {
Ok(events) => events,
Err(err) => {
error!("error getting events: {err:?}");
continue;
}
};
for event in events {
if let Err(err) = process_event(event, &config_file, &swapper).await {
error!("error while processing event: {err}");
}
}
}
Ok(())
}
fn check_zstd_level_bounds<'de, D>(d: D) -> Result<i32, D::Error>
where
D: serde::Deserializer<'de>,
{
d.deserialize_i32(RangedI64Visitor::<1, 22>)
d.deserialize_i32(RangedI128Visitor::<1, 22>)
.map(|x| x as i32)
}
fn check_millis<'de, D>(d: D) -> Result<Option<NonZeroU64>, D::Error>
where
D: serde::Deserializer<'de>,
{
d.deserialize_option(MillisVisitor)
}


@@ -1,86 +0,0 @@
use std::num::NonZeroU64;
use serde::de::Error;
use serde::{
de::{Unexpected, Visitor},
Deserializer,
};
pub struct RangedI64Visitor<const START: i64, const END: i64>;
impl<const START: i64, const END: i64> serde::de::Visitor<'_> for RangedI64Visitor<START, END> {
type Value = i64;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "an integer between {START} and {END}")
}
fn visit_i32<E>(self, v: i32) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i64(v as i64)
}
fn visit_i64<E>(self, v: i64) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
if v >= START && v <= END {
Ok(v)
} else {
Err(E::custom(format!(
"integer is out of bounds ({START}..{END})"
)))
}
}
fn visit_i128<E>(self, v: i128) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i64(v as i64)
}
}
pub struct U64Visitor;
impl Visitor<'_> for U64Visitor {
type Value = u64;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "a non-negative integer")
}
fn visit_u64<E>(self, v: u64) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
Ok(v)
}
fn visit_i64<E>(self, v: i64) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
u64::try_from(v).map_err(|_| E::invalid_value(Unexpected::Signed(v), &self))
}
}
pub struct MillisVisitor;
impl<'de> Visitor<'de> for MillisVisitor {
type Value = Option<NonZeroU64>;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "a positive integer")
}
fn visit_some<D>(self, deserializer: D) -> Result<Self::Value, D::Error>
where
D: Deserializer<'de>,
{
let n = deserializer.deserialize_i64(U64Visitor)?;
NonZeroU64::new(n)
.ok_or(D::Error::invalid_value(Unexpected::Unsigned(n), &self))
.map(Some)
}
}
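The visitors above funnel every integer width through a single `i64` range test: `visit_i32` and `visit_i128` delegate to `visit_i64`, which rejects values outside `START..END`. A standalone sketch of that check (std only, no serde; it uses `try_from` where the visitor uses a plain cast, so oversized `i128` values are rejected rather than wrapped):

```rust
// Sketch of the bounds check RangedI64Visitor performs: all widths
// funnel into one i64 range test before the value is accepted.
fn visit_ranged(v: i128, start: i64, end: i64) -> Result<i64, String> {
    // reject values that do not even fit in i64 before comparing
    let v = i64::try_from(v)
        .map_err(|_| format!("integer is out of bounds ({start}..{end})"))?;
    if v >= start && v <= end {
        Ok(v)
    } else {
        Err(format!("integer is out of bounds ({start}..{end})"))
    }
}

fn main() {
    // 1..22 is the zstd level range used by check_zstd_level_bounds
    assert_eq!(visit_ranged(3, 1, 22), Ok(3));
    assert!(visit_ranged(0, 1, 22).is_err());
    assert!(visit_ranged(23, 1, 22).is_err());
    assert!(visit_ranged(i128::MAX, 1, 22).is_err());
}
```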


@@ -1,52 +1,48 @@
use std::sync::Arc;
use std::fmt::Display;
use askama::Template;
use askama_axum::Template;
use axum::http::StatusCode;
use axum::response::{Html, IntoResponse, Response};
use color_eyre::eyre;
use axum::response::{IntoResponse, Response};
use thiserror::Error;
use tracing::error;
#[derive(Debug)]
#[repr(transparent)]
pub struct FronmaError(fronma::error::Error);
impl std::error::Error for FronmaError {}
impl Display for FronmaError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str("failed to parse front matter: ")?;
match &self.0 {
fronma::error::Error::MissingBeginningLine => f.write_str("missing beginning line"),
fronma::error::Error::MissingEndingLine => f.write_str("missing ending line"),
fronma::error::Error::SerdeYaml(yaml_error) => write!(f, "{}", yaml_error),
}
}
}
#[derive(Error, Debug)]
#[allow(clippy::enum_variant_names)]
pub enum PostError {
#[error("io error: {0}")]
#[error(transparent)]
IoError(#[from] std::io::Error),
#[error("failed to parse post metadata: {0}")]
ParseError(String),
#[error("failed to render post: {0}")]
RenderError(String),
#[error(transparent)]
AskamaError(#[from] askama::Error),
#[error(transparent)]
ParseError(#[from] FronmaError),
#[error("post {0:?} not found")]
NotFound(Arc<str>),
#[error("unexpected: {0}")]
Other(#[from] eyre::Error),
NotFound(String),
}
impl From<fronma::error::Error> for PostError {
fn from(value: fronma::error::Error) -> Self {
let binding;
Self::ParseError(format!(
"failed to parse front matter: {}",
match value {
fronma::error::Error::MissingBeginningLine => "missing beginning line",
fronma::error::Error::MissingEndingLine => "missing ending line",
fronma::error::Error::SerdeYaml(yaml_error) => {
binding = yaml_error.to_string();
&binding
}
}
))
}
}
impl From<serde_json::Error> for PostError {
fn from(value: serde_json::Error) -> Self {
Self::ParseError(value.to_string())
Self::ParseError(FronmaError(value))
}
}
impl IntoResponse for PostError {
fn into_response(self) -> Response {
fn into_response(self) -> axum::response::Response {
(StatusCode::INTERNAL_SERVER_ERROR, self.to_string()).into_response()
}
}
@@ -80,25 +76,17 @@ struct ErrorTemplate {
impl IntoResponse for AppError {
fn into_response(self) -> Response {
let error = self.to_string();
error!("error while handling request: {error}");
let status_code = match &self {
AppError::PostError(PostError::NotFound(_)) => StatusCode::NOT_FOUND,
AppError::RssDisabled => StatusCode::FORBIDDEN,
_ => StatusCode::INTERNAL_SERVER_ERROR,
};
match (ErrorTemplate { error }.render()) {
Ok(rendered) => (status_code, Html(rendered)).into_response(),
Err(err) => {
error!("error while rendering error template: {err}");
(
StatusCode::INTERNAL_SERVER_ERROR,
"error while trying to show error. good job",
)
.into_response()
}
}
(
status_code,
ErrorTemplate {
error: self.to_string(),
},
)
.into_response()
}
}

src/hash_arc_store.rs Normal file

@@ -0,0 +1,37 @@
use std::hash::{DefaultHasher, Hash, Hasher};
use std::marker::PhantomData;
use std::sync::Arc;
pub struct HashArcStore<T, Lookup>
where
Lookup: Hash,
{
inner: Option<Arc<T>>,
hash: Option<u64>,
_phantom: PhantomData<Lookup>,
}
impl<T, Lookup> HashArcStore<T, Lookup>
where
Lookup: Hash,
{
pub fn new() -> Self {
Self {
inner: None,
hash: None,
_phantom: PhantomData,
}
}
pub fn get_or_init(&mut self, key: &Lookup, init: impl Fn(&Lookup) -> Arc<T>) -> Arc<T> {
let mut h = DefaultHasher::new();
key.hash(&mut h);
let hash = h.finish();
if !self.hash.is_some_and(|inner_hash| inner_hash == hash) {
self.inner = Some(init(key));
self.hash = Some(hash);
}
        // SAFETY: the branch above sets `inner` to Some whenever the stored
        // hash is missing or stale, so `inner` is always Some at this point.
unsafe { self.inner.as_ref().unwrap_unchecked().clone() }
}
}
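`HashArcStore::get_or_init` hashes the lookup key and rebuilds the cached `Arc` only when that hash changes; repeated calls with the same key hand back the same allocation. A safe, std-only sketch of the same pattern (names here are illustrative, and the `unsafe` unwrap is replaced by a checked one):

```rust
use std::cell::Cell;
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::sync::Arc;

// Rebuild the cached Arc only when the key's hash differs from the
// one stored on the previous call.
struct HashStore<T> {
    inner: Option<Arc<T>>,
    hash: Option<u64>,
}

impl<T> HashStore<T> {
    fn new() -> Self {
        Self { inner: None, hash: None }
    }

    fn get_or_init<K: Hash>(&mut self, key: &K, init: impl Fn(&K) -> Arc<T>) -> Arc<T> {
        let mut h = DefaultHasher::new();
        key.hash(&mut h);
        let hash = h.finish();
        if self.hash != Some(hash) {
            self.inner = Some(init(key));
            self.hash = Some(hash);
        }
        // the branch above guarantees `inner` is Some, so this cannot panic
        self.inner.as_ref().unwrap().clone()
    }
}

fn main() {
    let mut store: HashStore<String> = HashStore::new();
    let builds = Cell::new(0u32);
    let init = |k: &&str| {
        builds.set(builds.get() + 1);
        Arc::new(k.to_string())
    };
    let a = store.get_or_init(&"config-a", init);
    let b = store.get_or_init(&"config-a", init);
    assert!(Arc::ptr_eq(&a, &b)); // same key: the Arc is reused
    assert_eq!(builds.get(), 1);  // init ran only once
    store.get_or_init(&"config-b", init);
    assert_eq!(builds.get(), 2);  // key changed: rebuilt
}
```

This is the memoization the renderer relies on: as long as the `RenderConfig` hashes the same, every call shares one `SyntectAdapter`.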


@@ -1,14 +1,14 @@
#![feature(let_chains, pattern, path_add_extension, if_let_guard)]
#![feature(let_chains, pattern)]
mod app;
mod config;
mod de;
mod error;
mod hash_arc_store;
mod helpers;
mod markdown_render;
mod path;
mod platform;
mod post;
mod ranged_i128_visitor;
mod serve_dir_included;
mod systemtime_as_secs;
mod templates;
@@ -19,10 +19,7 @@ use std::process::exit;
use std::sync::Arc;
use std::time::Duration;
use arc_swap::access::Map;
use arc_swap::ArcSwap;
use color_eyre::eyre::{self, Context};
use config::{Config, EngineMode};
use tokio::net::TcpListener;
use tokio::sync::RwLock;
use tokio::task::JoinSet;
@@ -30,13 +27,12 @@ use tokio::time::Instant;
use tokio::{select, signal};
use tokio_util::sync::CancellationToken;
use tracing::level_filters::LevelFilter;
use tracing::{debug, error, info, warn};
use tracing::{debug, error, info, info_span, warn, Instrument};
use tracing_subscriber::layer::SubscriberExt;
use tracing_subscriber::{util::SubscriberInitExt, EnvFilter};
use crate::app::AppState;
use crate::post::cache::{load_cache, Cache, CacheGuard, CACHE_VERSION};
use crate::post::{Blag, MarkdownPosts, PostManager};
use crate::post::{MarkdownPosts, PostManager};
use crate::templates::new_registry;
use crate::templates::watcher::watch_templates;
@@ -45,7 +41,13 @@ async fn main() -> eyre::Result<()> {
color_eyre::install()?;
let reg = tracing_subscriber::registry();
#[cfg(feature = "tokio-console")]
let reg = reg.with(console_subscriber::spawn());
let reg = reg
.with(
EnvFilter::builder()
.with_default_directive(LevelFilter::TRACE.into())
.from_env_lossy(),
)
.with(console_subscriber::spawn());
#[cfg(not(feature = "tokio-console"))]
let reg = reg.with(
EnvFilter::builder()
@@ -54,79 +56,45 @@ async fn main() -> eyre::Result<()> {
);
reg.with(tracing_subscriber::fmt::layer()).init();
let config = Arc::new(
config::load()
.await
.context("couldn't load configuration")?,
);
let socket_addr = SocketAddr::new(config.http.host, config.http.port);
let mut tasks = JoinSet::new();
let cancellation_token = CancellationToken::new();
let (config, config_file) = config::load()
.await
.context("couldn't load configuration")?;
let config = Arc::new(config);
let swapper = Arc::new(ArcSwap::from(config.clone()));
let config_cache_access: crate::post::cache::ConfigAccess =
Box::new(arc_swap::access::Map::new(swapper.clone(), |c: &Config| {
&c.cache
}));
info!("loaded config from {config_file:?}");
let start = Instant::now();
// NOTE: use tokio::task::spawn_blocking if this ever turns into a concurrent task
let mut reg =
new_registry(&config.dirs.templates).context("failed to create handlebars registry")?;
let mut reg = new_registry(&config.dirs.custom_templates)
.context("failed to create handlebars registry")?;
reg.register_helper("date", Box::new(helpers::date));
reg.register_helper("duration", Box::new(helpers::duration));
debug!(duration = ?start.elapsed(), "registered all templates");
let registry = Arc::new(RwLock::new(reg));
let reg = Arc::new(RwLock::new(reg));
debug!("setting up watcher");
let watcher_token = cancellation_token.child_token();
tasks.spawn(watch_templates(
config.dirs.templates.clone(),
watcher_token.clone(),
registry.clone(),
));
let cache = if config.cache.enable {
if config.cache.persistence && tokio::fs::try_exists(&config.cache.file).await? {
info!("loading cache from file");
let mut cache = load_cache(&config.cache).await.unwrap_or_else(|err| {
error!("failed to load cache: {}", err);
info!("using empty cache");
Cache::new(config.cache.ttl)
});
if cache.version() < CACHE_VERSION {
warn!("cache version changed, clearing cache");
cache = Cache::new(config.cache.ttl);
};
Some(cache)
} else {
Some(Cache::new(config.cache.ttl))
}
} else {
None
}
.map(|cache| CacheGuard::new(cache, config_cache_access))
.map(Arc::new);
let posts: Arc<dyn PostManager + Send + Sync> = match config.engine.mode {
EngineMode::Markdown => {
let access = Map::new(swapper.clone(), |c: &Config| &c.engine.markdown);
Arc::new(MarkdownPosts::new(access, cache.clone()).await?)
}
EngineMode::Blag => {
let access = Map::new(swapper.clone(), |c: &Config| &c.engine.blag);
Arc::new(Blag::new(access, cache.clone()))
}
let posts = Arc::new(MarkdownPosts::new(Arc::clone(&config)).await?);
let state = AppState {
config: Arc::clone(&config),
posts: Arc::clone(&posts),
reg: Arc::clone(&reg),
};
debug!("setting up config watcher");
let token = cancellation_token.child_token();
tasks.spawn(config::watcher(config_file, token, swapper.clone()));
debug!("setting up watcher");
tasks.spawn(
watch_templates(
config.dirs.custom_templates.clone(),
watcher_token.clone(),
reg,
)
.instrument(info_span!("custom_template_watcher")),
);
if config.cache.enable && config.cache.cleanup {
if let Some(millis) = config.cache.cleanup_interval {
@@ -134,7 +102,7 @@ async fn main() -> eyre::Result<()> {
let token = cancellation_token.child_token();
debug!("setting up cleanup task");
tasks.spawn(async move {
let mut interval = tokio::time::interval(Duration::from_millis(millis.into()));
let mut interval = tokio::time::interval(Duration::from_millis(millis));
loop {
select! {
_ = token.cancelled() => break Ok(()),
@@ -149,15 +117,8 @@ async fn main() -> eyre::Result<()> {
}
}
let state = AppState {
rss: Arc::new(Map::new(swapper.clone(), |c: &Config| &c.rss)),
style: Arc::new(Map::new(swapper.clone(), |c: &Config| &c.style)),
posts,
templates: registry,
};
let app = app::new(&config.dirs).with_state(state.clone());
let app = app::new(&config).with_state(state.clone());
let socket_addr = SocketAddr::new(config.http.host, config.http.port);
let listener = TcpListener::bind(socket_addr)
.await
.with_context(|| format!("couldn't listen on {}", socket_addr))?;


@@ -1,36 +1,38 @@
use color_eyre::eyre::{self, Context};
use comrak::adapters::SyntaxHighlighterAdapter;
use std::sync::{Arc, OnceLock, RwLock};
use comrak::markdown_to_html_with_plugins;
use comrak::plugins::syntect::{SyntectAdapter, SyntectAdapterBuilder};
use comrak::ComrakOptions;
use comrak::RenderPlugins;
use comrak::{markdown_to_html_with_plugins, Plugins};
use syntect::highlighting::ThemeSet;
use crate::config::MarkdownRenderConfig;
use crate::config::RenderConfig;
use crate::hash_arc_store::HashArcStore;
pub fn build_syntect(config: &MarkdownRenderConfig) -> eyre::Result<SyntectAdapter> {
fn syntect_adapter(config: &RenderConfig) -> Arc<SyntectAdapter> {
static STATE: OnceLock<RwLock<HashArcStore<SyntectAdapter, RenderConfig>>> = OnceLock::new();
let lock = STATE.get_or_init(|| RwLock::new(HashArcStore::new()));
let mut guard = lock.write().unwrap();
guard.get_or_init(config, build_syntect)
}
fn build_syntect(config: &RenderConfig) -> Arc<SyntectAdapter> {
let mut theme_set = if config.syntect.load_defaults {
ThemeSet::load_defaults()
} else {
ThemeSet::new()
};
if let Some(path) = config.syntect.themes_dir.as_ref() {
theme_set
.add_from_folder(path)
.with_context(|| format!("failed to add themes from {path:?}"))?;
theme_set.add_from_folder(path).unwrap();
}
let mut builder = SyntectAdapterBuilder::new().theme_set(theme_set);
if let Some(theme) = config.syntect.theme.as_ref() {
builder = builder.theme(theme);
}
Ok(builder.build())
Arc::new(builder.build())
}
pub fn render(
markdown: &str,
config: &MarkdownRenderConfig,
syntect: Option<&dyn SyntaxHighlighterAdapter>,
) -> String {
pub fn render(markdown: &str, config: &RenderConfig) -> String {
let mut options = ComrakOptions::default();
options.extension.table = true;
options.extension.autolink = true;
@@ -39,15 +41,15 @@ pub fn render(
options.extension.strikethrough = true;
options.extension.multiline_block_quotes = true;
options.extension.header_ids = Some(String::new());
options.render.escape = config.escape;
options.render.unsafe_ = config.unsafe_;
let render_plugins = RenderPlugins {
codefence_syntax_highlighter: syntect,
..Default::default()
};
let mut render_plugins = RenderPlugins::default();
let syntect = syntect_adapter(config);
render_plugins.codefence_syntax_highlighter = Some(syntect.as_ref());
let plugins = Plugins::builder().render(render_plugins).build();
let plugins = comrak::PluginsBuilder::default()
.render(render_plugins)
.build()
.unwrap();
markdown_to_html_with_plugins(markdown, &options, &plugins)
}


@ -1,51 +0,0 @@
use axum::extract::rejection::PathRejection;
use axum::extract::{FromRequestParts, Path};
use axum::http::request::Parts;
use axum::http::StatusCode;
use axum::response::{IntoResponse, Response};
use serde::de::DeserializeOwned;
pub struct SafePath<T>(pub T);
impl<S, T> FromRequestParts<S> for SafePath<T>
where
T: DeserializeOwned,
T: AsRef<str>,
T: Send + Sync,
S: Send + Sync,
{
type Rejection = SafePathRejection;
async fn from_request_parts(parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection> {
let s = Path::<T>::from_request_parts(parts, state).await?.0;
if s.as_ref().contains("..") || s.as_ref().contains('/') {
return Err(SafePathRejection::Invalid);
}
Ok(SafePath(s))
}
}
#[derive(Debug)]
pub enum SafePathRejection {
Invalid,
PathRejection(PathRejection),
}
impl From<PathRejection> for SafePathRejection {
fn from(value: PathRejection) -> Self {
Self::PathRejection(value)
}
}
impl IntoResponse for SafePathRejection {
fn into_response(self) -> Response {
match self {
SafePathRejection::Invalid => {
(StatusCode::BAD_REQUEST, "path contains invalid characters").into_response()
}
SafePathRejection::PathRejection(err) => err.into_response(),
}
}
}
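The whole of `SafePath`'s protection is the two-substring check on the deserialized segment: anything containing `..` or `/` is rejected before it can reach a filesystem join. A std-only sketch of that predicate (note it does not cover `\` separators, matching the extractor above):

```rust
// The validation SafePath applies to a deserialized path segment:
// reject traversal ("..") and nested paths ("/") outright.
fn is_safe_segment(s: &str) -> bool {
    !(s.contains("..") || s.contains('/'))
}

fn main() {
    assert!(is_safe_segment("hello-post"));
    assert!(!is_safe_segment("../etc/passwd"));
    assert!(!is_safe_segment("nested/post"));
    assert!(!is_safe_segment("a/../b"));
}
```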


@@ -1,359 +0,0 @@
use std::collections::BTreeSet;
use std::hash::{DefaultHasher, Hash, Hasher};
use std::path::Path;
use std::process::Stdio;
use std::sync::Arc;
use std::time::Duration;
use arc_swap::access::Access;
use async_trait::async_trait;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use futures::stream::FuturesUnordered;
use futures::{FutureExt, StreamExt};
use indexmap::IndexMap;
use serde::Deserialize;
use serde_value::Value;
use tokio::fs::OpenOptions;
use tokio::io::{AsyncBufReadExt, AsyncReadExt, BufReader};
use tokio::time::Instant;
use tracing::{debug, error, info, instrument};
use crate::config::BlagConfig;
use crate::error::PostError;
use crate::post::Filter;
use crate::systemtime_as_secs::as_secs;
use super::cache::{CacheGuard, CacheValue};
use super::{ApplyFilters, PostManager, PostMetadata, RenderStats, ReturnedPost};
#[derive(Deserialize, Debug)]
struct BlagMetadata {
pub title: Arc<str>,
pub description: Arc<str>,
pub author: Arc<str>,
pub icon: Option<Arc<str>>,
pub icon_alt: Option<Arc<str>>,
pub color: Option<Arc<str>>,
#[serde(alias = "created_at")]
pub written_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
#[serde(default)]
pub tags: BTreeSet<Arc<str>>,
pub dont_cache: bool,
pub raw: Option<Arc<str>>,
}
impl BlagMetadata {
pub fn into_full(self, name: Arc<str>) -> (PostMetadata, bool, Option<Arc<str>>) {
(
PostMetadata {
name,
title: self.title,
description: self.description,
author: self.author,
icon: self.icon,
icon_alt: self.icon_alt,
color: self.color,
written_at: self.written_at,
modified_at: self.modified_at,
tags: self.tags.into_iter().collect(),
},
self.dont_cache,
self.raw,
)
}
}
pub struct Blag<A> {
config: A,
cache: Option<Arc<CacheGuard>>,
_fastblag: bool,
}
enum RenderResult {
Normal(PostMetadata, String, (Duration, Duration), bool),
Raw(Vec<u8>, Arc<str>),
}
impl<A> Blag<A>
where
A: Access<BlagConfig>,
A: Sync,
A::Guard: Send,
{
pub fn new(config: A, cache: Option<Arc<CacheGuard>>) -> Self {
Self {
config,
cache,
_fastblag: false,
}
}
async fn render(
&self,
name: Arc<str>,
path: impl AsRef<Path>,
query_json: String,
) -> Result<RenderResult, PostError> {
let start = Instant::now();
let bin = self.config.load().bin.clone();
debug!(%name, "rendering");
let mut cmd = tokio::process::Command::new(&*bin)
.arg(path.as_ref())
.env("BLAG_QUERY", query_json)
.stdout(Stdio::piped())
.stderr(Stdio::inherit())
.stdin(Stdio::null())
.spawn()
.map_err(|err| {
error!("failed to spawn {bin:?}: {err}");
err
})?;
let stdout = cmd.stdout.take().unwrap();
let mut reader = BufReader::new(stdout);
let mut buf = String::new();
reader.read_line(&mut buf).await?;
let blag_meta: BlagMetadata = serde_json::from_str(&buf)?;
debug!("blag meta: {blag_meta:?}");
let (meta, dont_cache, raw) = blag_meta.into_full(name);
buf.clear();
// this is morally reprehensible
if let Some(raw) = raw {
let mut buf = buf.into_bytes();
reader.read_to_end(&mut buf).await?;
return Ok(RenderResult::Raw(buf, raw));
}
let parsed = start.elapsed();
let rendering = Instant::now();
reader.read_to_string(&mut buf).await?;
let status = cmd.wait().await?;
debug!("exited: {status}");
if !status.success() {
return Err(PostError::RenderError(status.to_string()));
}
let rendered = rendering.elapsed();
Ok(RenderResult::Normal(
meta,
buf,
(parsed, rendered),
dont_cache,
))
}
fn as_raw(name: &str) -> String {
let mut buf = String::with_capacity(name.len() + 3);
buf += name;
buf += ".sh";
buf
}
fn is_raw(name: &str) -> bool {
name.ends_with(".sh")
}
}
#[async_trait]
impl<A> PostManager for Blag<A>
where
A: Access<BlagConfig>,
A: Sync,
A::Guard: Send,
{
async fn get_all_posts(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
) -> Result<Vec<(PostMetadata, Arc<str>, RenderStats)>, PostError> {
let root = &self.config.load().root;
let mut set = FuturesUnordered::new();
let mut posts = Vec::new();
let mut files = tokio::fs::read_dir(&root).await?;
loop {
let entry = match files.next_entry().await {
Ok(Some(v)) => v,
Ok(None) => break,
Err(err) => {
error!("error while getting next entry: {err}");
continue;
}
};
let stat = tokio::fs::metadata(entry.path()).await?;
if stat.is_file() {
let mut name = match entry.file_name().into_string() {
Ok(v) => v,
Err(_) => {
continue;
}
};
if Self::is_raw(&name) {
name.truncate(name.len() - 3);
let name = name.into();
set.push(self.get_post(Arc::clone(&name), query).map(|v| (name, v)));
}
}
}
while let Some((name, result)) = set.next().await {
let post = match result {
Ok(v) => v,
Err(err) => {
error!("error while rendering blagpost {name:?}: {err}");
continue;
}
};
if let ReturnedPost::Rendered {
meta, body, perf, ..
} = post
&& meta.apply_filters(filters)
{
posts.push((meta, body, perf));
}
}
debug!("collected posts");
Ok(posts)
}
#[instrument(skip(self))]
async fn get_post(
&self,
name: Arc<str>,
query: &IndexMap<String, Value>,
) -> Result<ReturnedPost, PostError> {
let start = Instant::now();
let BlagConfig {
ref root,
ref raw_access,
..
} = &*self.config.load();
if Self::is_raw(&name) {
let mut buffer = Vec::new();
let mut file = OpenOptions::new()
.read(true)
.open(root.join(&*name))
.await
.map_err(|err| match err.kind() {
std::io::ErrorKind::NotFound => PostError::NotFound(name),
_ => PostError::IoError(err),
})?;
file.read_to_end(&mut buffer).await?;
return Ok(ReturnedPost::Raw {
buffer,
content_type: HeaderValue::from_static("text/x-shellscript"),
});
}
let raw_name = Self::as_raw(&name);
let path = root.join(&raw_name);
let raw_name = raw_access.then_some(raw_name);
let stat = tokio::fs::metadata(&path)
.await
.map_err(|err| match err.kind() {
std::io::ErrorKind::NotFound => PostError::NotFound(name.clone()),
_ => PostError::IoError(err),
})?;
if !stat.is_file() {
return Err(PostError::NotFound(name));
}
let mtime = as_secs(stat.modified()?);
let query_json = serde_json::to_string(&query).expect("this should not fail");
let mut hasher = DefaultHasher::new();
query_json.hash(&mut hasher);
let query_hash = hasher.finish();
let post = if let Some(cache) = &self.cache
&& let Some(CacheValue { meta, body, .. }) =
cache.lookup(name.clone(), mtime, query_hash).await
{
ReturnedPost::Rendered {
meta,
body,
perf: RenderStats::Cached(start.elapsed()),
raw_name,
}
} else {
let (meta, content, (parsed, rendered), dont_cache) =
match self.render(name.clone(), path, query_json).await? {
RenderResult::Normal(x, y, z, w) => (x, y, z, w),
RenderResult::Raw(buffer, content_type) => {
return Ok(ReturnedPost::Raw {
buffer,
content_type: HeaderValue::from_str(&content_type)
.map_err(Into::into)
.map_err(PostError::Other)?,
});
}
};
let body = content.into();
if !dont_cache && let Some(cache) = &self.cache {
cache
.insert(name, meta.clone(), mtime, Arc::clone(&body), query_hash)
.await;
}
let total = start.elapsed();
ReturnedPost::Rendered {
meta,
body,
perf: RenderStats::Rendered {
total,
parsed,
rendered,
},
raw_name,
}
};
if let ReturnedPost::Rendered { perf, .. } = &post {
info!("rendered blagpost in {:?}", perf);
}
Ok(post)
}
async fn cleanup(&self) {
if let Some(cache) = &self.cache {
let root = &self.config.load().root;
cache
.cleanup(|key, value| {
let mtime = std::fs::metadata(root.join(Self::as_raw(&key.name)))
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(as_secs);
match mtime {
Some(mtime) => mtime <= value.mtime,
None => false,
}
})
.await
}
}
}
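The `as_raw`/`is_raw` pair is the naming contract the whole engine hangs on: a post named `foo` is backed by the script `foo.sh`, and a request that already ends in `.sh` asks for the raw script bytes. A self-contained copy of the two helpers plus the suffix-stripping done in `get_all_posts`:

```rust
// Mirror of Blag::as_raw / Blag::is_raw.
fn as_raw(name: &str) -> String {
    let mut buf = String::with_capacity(name.len() + 3);
    buf += name;
    buf += ".sh";
    buf
}

fn is_raw(name: &str) -> bool {
    name.ends_with(".sh")
}

fn main() {
    assert_eq!(as_raw("hello"), "hello.sh");
    assert!(is_raw(&as_raw("hello")));
    // get_all_posts truncates the ".sh" suffix (3 bytes) to recover the name
    let mut name = String::from("hello.sh");
    name.truncate(name.len() - 3);
    assert_eq!(name, "hello");
}
```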


@@ -1,80 +1,53 @@
use std::fmt::Debug;
use std::io::{Read, Write};
use std::num::NonZeroU64;
use std::ops::Deref;
use std::sync::Arc;
use std::time::SystemTime;
use std::hash::{DefaultHasher, Hash, Hasher};
use std::io::Read;
use crate::config::CacheConfig;
use crate::config::{Config, RenderConfig};
use crate::post::PostMetadata;
use arc_swap::access::DynAccess;
use color_eyre::eyre::{self, Context};
use scc::HashMap;
use serde::{Deserialize, Serialize};
use tokio::io::AsyncReadExt;
use tracing::{debug, info, instrument, trace, Span};
use tracing::{debug, instrument};
/// do not persist the cache if this version number has changed
pub const CACHE_VERSION: u16 = 5;
pub const CACHE_VERSION: u16 = 2;
fn now() -> u128 {
crate::systemtime_as_secs::as_millis(SystemTime::now())
}
#[derive(Serialize, Deserialize, Clone, Debug)]
#[derive(Serialize, Deserialize, Clone)]
pub struct CacheValue {
pub meta: PostMetadata,
pub body: Arc<str>,
pub metadata: PostMetadata,
pub rendered: String,
pub mtime: u64,
/// when the item was inserted into cache, in milliseconds since epoch
pub cached_at: u128,
config_hash: u64,
}
#[derive(Serialize, Deserialize, Clone)]
pub struct Cache {
map: HashMap<CacheKey, CacheValue>,
version: u16,
#[serde(skip)]
ttl: Option<NonZeroU64>,
}
pub struct Cache(HashMap<String, CacheValue>, u16);
#[derive(Serialize, Deserialize, Hash, Eq, PartialEq, Clone, Debug)]
#[repr(C)]
pub struct CacheKey {
pub name: Arc<str>,
pub extra: u64,
impl Default for Cache {
fn default() -> Self {
Self(Default::default(), CACHE_VERSION)
}
}
impl Cache {
pub fn new(ttl: Option<NonZeroU64>) -> Self {
Cache {
map: Default::default(),
version: CACHE_VERSION,
ttl,
}
}
fn up_to_date(&self, cached: &CacheValue, mtime: u64) -> bool {
mtime <= cached.mtime
&& self
.ttl
.is_none_or(|ttl| cached.cached_at + u64::from(ttl) as u128 >= now())
}
#[instrument(level = "debug", skip(self), fields(entry_mtime))]
pub async fn lookup(&self, name: Arc<str>, mtime: u64, extra: u64) -> Option<CacheValue> {
trace!("looking up in cache");
match self.map.get_async(&CacheKey { name, extra }).await {
pub async fn lookup(
&self,
name: &str,
mtime: u64,
config: &RenderConfig,
) -> Option<CacheValue> {
match self.0.get_async(name).await {
Some(entry) => {
let cached = entry.get();
Span::current().record("entry_mtime", cached.mtime);
trace!("found in cache");
if self.up_to_date(cached, mtime) {
trace!("entry up-to-date");
if mtime <= cached.mtime && {
let mut hasher = DefaultHasher::new();
config.hash(&mut hasher);
hasher.finish()
} == cached.config_hash
{
Some(cached.clone())
} else {
let _ = entry.remove();
debug!("removed stale entry");
None
}
}
@@ -82,24 +55,14 @@ impl Cache {
}
}
#[instrument(level = "debug", skip(self), fields(entry_mtime))]
pub async fn lookup_metadata(
&self,
name: Arc<str>,
mtime: u64,
extra: u64,
) -> Option<PostMetadata> {
trace!("looking up metadata in cache");
match self.map.get_async(&CacheKey { name, extra }).await {
pub async fn lookup_metadata(&self, name: &str, mtime: u64) -> Option<PostMetadata> {
match self.0.get_async(name).await {
Some(entry) => {
let cached = entry.get();
Span::current().record("entry_mtime", cached.mtime);
if self.up_to_date(cached, mtime) {
trace!("entry up-to-date");
Some(cached.meta.clone())
if mtime <= cached.mtime {
Some(cached.metadata.clone())
} else {
let _ = entry.remove();
debug!("removed stale entry");
None
}
}
@@ -107,165 +70,78 @@ impl Cache {
}
}
#[instrument(level = "debug", skip(self))]
pub async fn insert(
&self,
name: Arc<str>,
name: String,
metadata: PostMetadata,
mtime: u64,
rendered: Arc<str>,
extra: u64,
) -> Option<CacheValue> {
trace!("inserting into cache");
rendered: String,
config: &RenderConfig,
) -> Result<(), (String, (PostMetadata, String))> {
let mut hasher = DefaultHasher::new();
config.hash(&mut hasher);
let hash = hasher.finish();
let r = self
.map
.upsert_async(
CacheKey { name, extra },
CacheValue {
meta: metadata,
body: rendered,
mtime,
cached_at: now(),
},
)
.await;
let value = CacheValue {
metadata,
rendered,
mtime,
config_hash: hash,
};
debug!(
"{} cache",
match r {
Some(_) => "updated in",
None => "inserted into",
}
);
r
if self
.0
.update_async(&name, |_, _| value.clone())
.await
.is_none()
{
self.0
.insert_async(name, value)
.await
.map_err(|x| (x.0, (x.1.metadata, x.1.rendered)))
} else {
Ok(())
}
}
#[instrument(level = "debug", skip(self))]
#[allow(unused)]
pub async fn remove(&self, name: Arc<str>, extra: u64) -> Option<(CacheKey, CacheValue)> {
trace!("removing from cache");
let r = self.map.remove_async(&CacheKey { name, extra }).await;
debug!(
"item {} cache",
match r {
Some(_) => "removed from",
None => "did not exist in",
}
);
r
pub async fn remove(&self, name: &str) -> Option<(String, CacheValue)> {
self.0.remove_async(name).await
}
pub async fn retain(&self, predicate: impl Fn(&CacheKey, &CacheValue) -> bool) {
let old_size = self.map.len();
#[instrument(name = "cleanup", skip_all)]
pub async fn cleanup(&self, get_mtime: impl Fn(&str) -> Option<u64>) {
let old_size = self.0.len();
let mut i = 0;
// TODO: multithread
// not urgent as this is run concurrently anyways
self.map
self.0
.retain_async(|k, v| {
if predicate(k, v) {
if get_mtime(k).is_some_and(|mtime| mtime == v.mtime) {
true
} else {
debug!("removing {k:?} from cache");
debug!("removing {k} from cache");
i += 1;
false
}
})
.await;
let new_size = self.len();
let new_size = self.0.len();
debug!("removed {i} entries ({old_size} -> {new_size} entries)");
}
#[instrument(level = "debug", skip_all)]
pub async fn cleanup(&self, predicate: impl Fn(&CacheKey, &CacheValue) -> bool) {
self.retain(|k, v| {
self.ttl
.is_none_or(|ttl| v.cached_at + u64::from(ttl) as u128 >= now())
&& predicate(k, v)
})
.await
}
pub fn len(&self) -> usize {
self.map.len()
}
#[inline(always)]
pub fn version(&self) -> u16 {
self.version
self.1
}
}
pub type ConfigAccess = Box<dyn DynAccess<CacheConfig> + Send + Sync>;
pub struct CacheGuard {
inner: Cache,
config: ConfigAccess,
}
impl CacheGuard {
pub fn new(cache: Cache, config: ConfigAccess) -> Self {
Self {
inner: cache,
config,
}
}
fn try_drop(&mut self) -> Result<(), eyre::Report> {
let config = self.config.load();
// write cache to file
let path = &*config.file;
let serialized = bitcode::serialize(&self.inner).context("failed to serialize cache")?;
let mut cache_file = std::fs::File::create(path)
.with_context(|| format!("failed to open cache at {}", path.display()))?;
let compression_level = config.compression_level;
if config.compress {
std::io::Write::write_all(
&mut zstd::stream::write::Encoder::new(cache_file, compression_level)?
.auto_finish(),
&serialized,
)
} else {
cache_file.write_all(&serialized)
}
.context("failed to write cache to file")?;
info!("wrote cache to {path:?}");
Ok(())
}
}
impl Deref for CacheGuard {
type Target = Cache;
fn deref(&self) -> &Self::Target {
&self.inner
}
}
impl AsRef<Cache> for CacheGuard {
fn as_ref(&self) -> &Cache {
&self.inner
}
}
impl Drop for CacheGuard {
fn drop(&mut self) {
self.try_drop().expect("cache to save successfully")
}
}
pub(crate) async fn load_cache(config: &CacheConfig) -> Result<Cache, eyre::Report> {
let path = &config.file;
pub(crate) async fn load_cache(config: &Config) -> Result<Cache, eyre::Report> {
let path = &config.cache.file;
let mut cache_file = tokio::fs::File::open(&path)
.await
.context("failed to open cache file")?;
let serialized = if config.compress {
let serialized = if config.cache.compress {
let cache_file = cache_file.into_std().await;
tokio::task::spawn_blocking(move || {
let mut buf = Vec::with_capacity(4096);
@@ -283,10 +159,5 @@ pub(crate) async fn load_cache(config: &CacheConfig) -> Result<Cache, eyre::Repo
buf
};
let mut cache: Cache =
bitcode::deserialize(serialized.as_slice()).context("failed to parse cache")?;
cache.ttl = config.ttl;
Ok(cache)
bitcode::deserialize(serialized.as_slice()).context("failed to parse cache")
}
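The freshness rule in `Cache::up_to_date` combines two conditions: the post file must not have been modified after the cached entry's mtime, and if a TTL is configured, the entry must not have outlived it. A std-only sketch of that predicate (units follow the source: `cached_at` and the TTL are milliseconds since the epoch; `map_or` stands in for the newer `Option::is_none_or`):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

// An entry is fresh when the file has not changed since it was cached
// and, if a TTL is set, the entry has not expired.
fn up_to_date(file_mtime: u64, entry_mtime: u64, cached_at: u128, ttl_ms: Option<u64>) -> bool {
    let now = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock before Unix epoch")
        .as_millis();
    file_mtime <= entry_mtime
        && ttl_ms.map_or(true, |ttl| cached_at + u128::from(ttl) >= now)
}

fn main() {
    // no TTL configured: only the modification time decides
    assert!(up_to_date(100, 100, 0, None));
    assert!(!up_to_date(101, 100, 0, None)); // file changed after caching
    // a 1 ms TTL on an entry cached at the epoch has long expired
    assert!(!up_to_date(100, 100, 0, Some(1)));
}
```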


@@ -1,54 +1,44 @@
use std::collections::BTreeSet;
use std::hash::{DefaultHasher, Hash, Hasher};
use std::io;
use std::io::{self, Write};
use std::ops::Deref;
use std::path::Path;
use std::sync::Arc;
use std::time::Duration;
use std::time::Instant;
use std::time::SystemTime;
use arc_swap::access::Access;
use async_trait::async_trait;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use color_eyre::eyre::{self, Context};
use comrak::plugins::syntect::SyntectAdapter;
use fronma::parser::{parse, ParsedData};
use indexmap::IndexMap;
use serde::Deserialize;
use serde_value::Value;
use tokio::fs;
use tokio::io::AsyncReadExt;
use tracing::{error, info, instrument};
use tracing::{error, info, warn};
use crate::config::MarkdownConfig;
use crate::markdown_render::{build_syntect, render};
use crate::config::Config;
use crate::markdown_render::render;
use crate::post::cache::{load_cache, Cache, CACHE_VERSION};
use crate::post::{PostError, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::systemtime_as_secs::as_secs;
use super::cache::{CacheGuard, CacheKey, CacheValue};
use super::{
ApplyFilters, Filter, PostError, PostManager, PostMetadata, RenderStats, ReturnedPost,
};
#[derive(Deserialize)]
struct FrontMatter {
pub title: Arc<str>,
pub description: Arc<str>,
pub author: Arc<str>,
pub icon: Option<Arc<str>>,
pub icon_alt: Option<Arc<str>>,
pub color: Option<Arc<str>>,
#[serde(alias = "created_at")]
pub written_at: Option<DateTime<Utc>>,
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub icon_alt: Option<String>,
pub color: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
#[serde(default)]
pub tags: BTreeSet<Arc<str>>,
pub tags: BTreeSet<String>,
}
impl FrontMatter {
pub fn into_full(
self,
name: Arc<str>,
name: String,
created: Option<SystemTime>,
modified: Option<SystemTime>,
) -> PostMetadata {
@@ -60,47 +50,62 @@ impl FrontMatter {
icon: self.icon,
icon_alt: self.icon_alt,
color: self.color,
written_at: self.written_at.or_else(|| created.map(|t| t.into())),
created_at: self.created_at.or_else(|| created.map(|t| t.into())),
modified_at: self.modified_at.or_else(|| modified.map(|t| t.into())),
tags: self.tags.into_iter().collect(),
}
}
}
pub struct MarkdownPosts<A> {
cache: Option<Arc<CacheGuard>>,
config: A,
render_hash: u64,
syntect: SyntectAdapter,
pub struct MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
cache: Option<Cache>,
config: C,
}
impl<A> MarkdownPosts<A>
impl<C> MarkdownPosts<C>
where
A: Access<MarkdownConfig>,
A: Sync,
A::Guard: Send,
C: Deref<Target = Config>,
{
pub async fn new(config: A, cache: Option<Arc<CacheGuard>>) -> eyre::Result<Self> {
let syntect = build_syntect(&config.load().render)
.context("failed to create syntax highlighting engine")?;
pub async fn new(config: C) -> eyre::Result<MarkdownPosts<C>> {
if config.cache.enable {
if config.cache.persistence && tokio::fs::try_exists(&config.cache.file).await? {
info!("loading cache from file");
let mut cache = load_cache(&config).await.unwrap_or_else(|err| {
error!("failed to load cache: {}", err);
info!("using empty cache");
Default::default()
});
let mut hasher = DefaultHasher::new();
config.load().render.hash(&mut hasher);
let render_hash = hasher.finish();
if cache.version() < CACHE_VERSION {
warn!("cache version changed, clearing cache");
cache = Default::default();
};
Ok(Self {
cache,
config,
render_hash,
syntect,
})
Ok(Self {
cache: Some(cache),
config,
})
} else {
Ok(Self {
cache: Some(Default::default()),
config,
})
}
} else {
Ok(Self {
cache: None,
config,
})
}
}
async fn parse_and_render(
&self,
name: Arc<str>,
name: String,
path: impl AsRef<Path>,
) -> Result<(PostMetadata, Arc<str>, (Duration, Duration)), PostError> {
) -> Result<(PostMetadata, String, (Duration, Duration)), PostError> {
let parsing_start = Instant::now();
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(val) => val,
@@ -121,233 +126,221 @@ where
let parsing = parsing_start.elapsed();
let before_render = Instant::now();
let post = render(body, &self.config.load().render, Some(&self.syntect)).into();
let post = render(body, &self.config.render);
let rendering = before_render.elapsed();
if let Some(cache) = &self.cache {
if let Some(cache) = self.cache.as_ref() {
cache
.insert(
name.clone(),
name.to_string(),
metadata.clone(),
as_secs(modified),
Arc::clone(&post),
self.render_hash,
as_secs(&modified),
post.clone(),
&self.config.render,
)
.await;
}
.await
.unwrap_or_else(|err| warn!("failed to insert {:?} into cache", err.0))
};
Ok((metadata, post, (parsing, rendering)))
}
fn is_raw(name: &str) -> bool {
name.ends_with(".md")
fn cache(&self) -> Option<&Cache> {
self.cache.as_ref()
}
fn as_raw(name: &str) -> Option<String> {
let mut buf = String::with_capacity(name.len() + 3);
buf += name;
buf += ".md";
Some(buf)
fn try_drop(&mut self) -> Result<(), eyre::Report> {
// write cache to file
let config = &self.config.cache;
if config.enable
&& config.persistence
&& let Some(cache) = self.cache()
{
let path = &config.file;
let serialized = bitcode::serialize(cache).context("failed to serialize cache")?;
let mut cache_file = std::fs::File::create(path)
.with_context(|| format!("failed to open cache at {}", path.display()))?;
let compression_level = config.compression_level;
if config.compress {
std::io::Write::write_all(
&mut zstd::stream::write::Encoder::new(cache_file, compression_level)?
.auto_finish(),
&serialized,
)
} else {
cache_file.write_all(&serialized)
}
.context("failed to write cache to file")?;
info!("wrote cache to {}", path.display());
}
Ok(())
}
}
#[async_trait]
impl<A> PostManager for MarkdownPosts<A>
impl<C> Drop for MarkdownPosts<C>
where
A: Access<MarkdownConfig>,
A: Sync,
A::Guard: Send,
C: Deref<Target = Config>,
{
async fn get_all_posts(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
) -> Result<Vec<(PostMetadata, Arc<str>, RenderStats)>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.load().root).await?;
while let Some(entry) = read_dir.next_entry().await? {
if let Err(err) = async {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string()
.into();
let post = self.get_post(Arc::clone(&name), query).await?;
if let ReturnedPost::Rendered {
meta, body, perf, ..
} = post
&& meta.apply_filters(filters)
{
posts.push((meta, body, perf));
}
}
color_eyre::eyre::Ok(())
}
.await
{
error!("error while getting post: {err}");
continue;
};
}
Ok(posts)
fn drop(&mut self) {
self.try_drop().unwrap()
}
}
impl<C> PostManager for MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
async fn get_all_post_metadata(
&self,
filters: &[Filter<'_>],
_query: &IndexMap<String, Value>,
filter: impl Fn(&PostMetadata) -> bool,
) -> Result<Vec<PostMetadata>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.load().root).await?;
let mut read_dir = fs::read_dir(&self.config.dirs.posts).await?;
while let Some(entry) = read_dir.next_entry().await? {
if let Err(err) = async {
let path = entry.path();
let stat = fs::metadata(&path).await?;
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let mtime = as_secs(stat.modified()?);
let name: Arc<str> =
String::from(path.file_stem().unwrap().to_string_lossy()).into();
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let mtime = as_secs(&stat.modified()?);
// TODO. this?
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
if let Some(cache) = &self.cache
&& let Some(hit) = cache
.lookup_metadata(name.clone(), mtime, self.render_hash)
.await
&& hit.apply_filters(filters)
{
posts.push(hit);
} else {
let (metadata, ..) = self.parse_and_render(name, path).await?;
if metadata.apply_filters(filters) {
posts.push(metadata);
if let Some(cache) = self.cache.as_ref()
&& let Some(hit) = cache.lookup_metadata(&name, mtime).await
&& filter(&hit)
{
posts.push(hit);
} else {
match self.parse_and_render(name, path).await {
Ok((metadata, ..)) => {
if filter(&metadata) {
posts.push(metadata);
}
}
Err(err) => match err {
PostError::IoError(ref io_err)
if matches!(io_err.kind(), io::ErrorKind::NotFound) =>
{
warn!("TOCTOU: {}", err)
}
_ => return Err(err),
},
}
}
color_eyre::eyre::Ok(())
}
.await
{
error!("error while getting post metadata: {err}");
continue;
};
}
Ok(posts)
}
#[instrument(level = "info", skip(self))]
async fn get_post(
async fn get_all_posts(
&self,
name: Arc<str>,
_query: &IndexMap<String, Value>,
) -> Result<ReturnedPost, PostError> {
let config = self.config.load();
let post = if config.raw_access && Self::is_raw(&name) {
let path = config.root.join(&*name);
filter: impl Fn(&PostMetadata, &str) -> bool,
) -> Result<Vec<(PostMetadata, String, RenderStats)>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.dirs.posts).await?;
while let Some(entry) = read_dir.next_entry().await? {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
let post = self.get_post(&name).await?;
if let ReturnedPost::Rendered(meta, content, stats) = post
&& filter(&meta, &content)
{
posts.push((meta, content, stats));
}
}
}
Ok(posts)
}
async fn get_post(&self, name: &str) -> Result<ReturnedPost, PostError> {
if self.config.markdown_access && name.ends_with(".md") {
let path = self.config.dirs.posts.join(name);
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(value) => value,
Err(err) => {
return match err.kind() {
io::ErrorKind::NotFound => Err(PostError::NotFound(name)),
_ => Err(PostError::IoError(err)),
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
if let Some(cache) = self.cache.as_ref() {
cache.remove(name).await;
}
return Err(PostError::NotFound(name.to_string()));
}
}
_ => return Err(PostError::IoError(err)),
},
};
let mut buffer = Vec::with_capacity(4096);
let mut buf = Vec::with_capacity(4096);
file.read_to_end(&mut buffer).await?;
file.read_to_end(&mut buf).await?;
ReturnedPost::Raw {
buffer,
content_type: HeaderValue::from_static("text/plain"),
}
Ok(ReturnedPost::Raw(
buf,
HeaderValue::from_static("text/plain"),
))
} else {
let start = Instant::now();
let raw_name = Self::as_raw(&name).unwrap_or_else(|| unreachable!());
let path = config.root.join(&raw_name);
let path = self.config.dirs.posts.join(name.to_owned() + ".md");
let stat = match tokio::fs::metadata(&path).await {
Ok(value) => value,
Err(err) => {
return match err.kind() {
io::ErrorKind::NotFound => Err(PostError::NotFound(name)),
_ => Err(PostError::IoError(err)),
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
if let Some(cache) = self.cache.as_ref() {
cache.remove(name).await;
}
return Err(PostError::NotFound(name.to_string()));
}
}
_ => return Err(PostError::IoError(err)),
},
};
let mtime = as_secs(stat.modified()?);
let mtime = as_secs(&stat.modified()?);
let (meta, body, perf) = if let Some(cache) = &self.cache
&& let Some(CacheValue { meta, body, .. }) =
cache.lookup(name.clone(), mtime, self.render_hash).await
if let Some(cache) = self.cache.as_ref()
&& let Some(hit) = cache.lookup(name, mtime, &self.config.render).await
{
(meta, body, RenderStats::Cached(start.elapsed()))
Ok(ReturnedPost::Rendered(
hit.metadata,
hit.rendered,
RenderStats::Cached(start.elapsed()),
))
} else {
let (meta, body, stats) = self.parse_and_render(name, path).await?;
(
meta,
body,
RenderStats::Rendered {
total: start.elapsed(),
parsed: stats.0,
rendered: stats.1,
},
)
};
ReturnedPost::Rendered {
meta,
body,
perf,
raw_name: config.raw_access.then_some(raw_name),
let (metadata, rendered, stats) =
self.parse_and_render(name.to_string(), path).await?;
Ok(ReturnedPost::Rendered(
metadata,
rendered,
RenderStats::ParsedAndRendered(start.elapsed(), stats.0, stats.1),
))
}
};
if let ReturnedPost::Rendered { perf, .. } = &post {
info!("rendered post in {:?}", perf);
}
Ok(post)
}
async fn cleanup(&self) {
if let Some(cache) = &self.cache {
if let Some(cache) = self.cache.as_ref() {
cache
.cleanup(|CacheKey { name, extra }, value| {
// nuke entries with different render options
if self.render_hash != *extra {
return false;
}
let mtime = std::fs::metadata(
self.config
.load()
.root
.join(Self::as_raw(name).unwrap_or_else(|| unreachable!())),
)
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(as_secs);
match mtime {
Some(mtime) => mtime <= value.mtime,
None => false,
}
.cleanup(|name| {
std::fs::metadata(self.config.dirs.posts.join(name.to_owned() + ".md"))
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(|mtime| as_secs(&mtime))
})
.await
}


@@ -1,128 +1,68 @@
pub mod blag;
pub mod cache;
pub mod markdown_posts;
use std::sync::Arc;
use std::time::Duration;
use async_trait::async_trait;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use indexmap::IndexMap;
use serde::{Deserialize, Serialize};
use serde_value::Value;
use crate::error::PostError;
pub use blag::Blag;
pub use markdown_posts::MarkdownPosts;
pub use crate::post::markdown_posts::MarkdownPosts;
// TODO: replace String with Arc<str>
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PostMetadata {
pub name: Arc<str>,
pub title: Arc<str>,
pub description: Arc<str>,
pub author: Arc<str>,
pub icon: Option<Arc<str>>,
pub icon_alt: Option<Arc<str>>,
pub color: Option<Arc<str>>,
pub written_at: Option<DateTime<Utc>>,
pub name: String,
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub icon_alt: Option<String>,
pub color: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
pub tags: Vec<Arc<str>>,
pub tags: Vec<String>,
}
#[derive(Serialize, Debug, Clone)]
#[allow(unused)]
#[derive(Serialize)]
pub enum RenderStats {
Cached(Duration),
Rendered {
total: Duration,
parsed: Duration,
rendered: Duration,
},
Fetched(Duration),
Other {
verb: Arc<str>,
time: Duration,
},
Unknown,
// format: Total, Parsed in, Rendered in
ParsedAndRendered(Duration, Duration, Duration),
}
#[allow(clippy::large_enum_variant)] // Raw will be returned very rarely
#[derive(Debug, Clone)]
pub enum ReturnedPost {
Rendered {
meta: PostMetadata,
body: Arc<str>,
perf: RenderStats,
raw_name: Option<String>,
},
Raw {
buffer: Vec<u8>,
content_type: HeaderValue,
},
Rendered(PostMetadata, String, RenderStats),
Raw(Vec<u8>, HeaderValue),
}
pub enum Filter<'a> {
Tags(&'a [&'a str]),
}
impl Filter<'_> {
pub fn apply(&self, meta: &PostMetadata) -> bool {
match self {
Filter::Tags(tags) => tags
.iter()
.any(|tag| meta.tags.iter().any(|meta_tag| &**meta_tag == *tag)),
}
}
}
pub trait ApplyFilters {
fn apply_filters(&self, filters: &[Filter<'_>]) -> bool;
}
impl ApplyFilters for PostMetadata {
fn apply_filters(&self, filters: &[Filter<'_>]) -> bool {
for filter in filters {
if !filter.apply(self) {
return false;
}
}
true
}
}
#[async_trait]
pub trait PostManager {
async fn get_all_post_metadata(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
filter: impl Fn(&PostMetadata) -> bool,
) -> Result<Vec<PostMetadata>, PostError> {
self.get_all_posts(filters, query)
self.get_all_posts(|m, _| filter(m))
.await
.map(|vec| vec.into_iter().map(|(meta, ..)| meta).collect())
}
async fn get_all_posts(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
) -> Result<Vec<(PostMetadata, Arc<str>, RenderStats)>, PostError>;
filter: impl Fn(&PostMetadata, &str) -> bool,
) -> Result<Vec<(PostMetadata, String, RenderStats)>, PostError>;
async fn get_max_n_post_metadata_with_optional_tag_sorted(
&self,
n: Option<usize>,
tag: Option<&str>,
query: &IndexMap<String, Value>,
tag: Option<&String>,
) -> Result<Vec<PostMetadata>, PostError> {
let filters = tag.and(Some(Filter::Tags(tag.as_slice())));
let mut posts = self
.get_all_post_metadata(filters.as_slice(), query)
.get_all_post_metadata(|metadata| !tag.is_some_and(|tag| !metadata.tags.contains(tag)))
.await?;
// we still want some semblance of order if created_at is None so sort by mtime as well
posts.sort_unstable_by_key(|metadata| metadata.modified_at.unwrap_or_default());
posts.sort_by_key(|metadata| metadata.written_at.unwrap_or_default());
posts.sort_by_key(|metadata| metadata.created_at.unwrap_or_default());
posts.reverse();
if let Some(n) = n {
posts.truncate(n);
@@ -132,22 +72,14 @@ pub trait PostManager {
}
#[allow(unused)]
async fn get_post_metadata(
&self,
name: Arc<str>,
query: &IndexMap<String, Value>,
) -> Result<PostMetadata, PostError> {
match self.get_post(name.clone(), query).await? {
ReturnedPost::Rendered { meta, .. } => Ok(meta),
ReturnedPost::Raw { .. } => Err(PostError::NotFound(name)),
async fn get_post_metadata(&self, name: &str) -> Result<PostMetadata, PostError> {
match self.get_post(name).await? {
ReturnedPost::Rendered(metadata, ..) => Ok(metadata),
ReturnedPost::Raw(..) => Err(PostError::NotFound(name.to_string())),
}
}
async fn get_post(
&self,
name: Arc<str>,
query: &IndexMap<String, Value>,
) -> Result<ReturnedPost, PostError>;
async fn get_post(&self, name: &str) -> Result<ReturnedPost, PostError>;
async fn cleanup(&self) {}
async fn cleanup(&self);
}


@@ -0,0 +1,37 @@
pub struct RangedI128Visitor<const START: i128, const END: i128>;
impl<'de, const START: i128, const END: i128> serde::de::Visitor<'de>
for RangedI128Visitor<START, END>
{
type Value = i128;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "an integer between {START} and {END}")
}
fn visit_i32<E>(self, v: i32) -> std::result::Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i128(v as i128)
}
fn visit_i64<E>(self, v: i64) -> std::prelude::v1::Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i128(v as i128)
}
fn visit_i128<E>(self, v: i128) -> std::prelude::v1::Result<Self::Value, E>
where
E: serde::de::Error,
{
if v >= START && v <= END {
Ok(v)
} else {
Err(E::custom(format!(
"integer is out of bounds ({START}..{END})"
)))
}
}
}


@@ -39,6 +39,9 @@ pub async fn handle(
req: Request,
included_dir: &'static Dir<'static>,
) -> Result<Response, Infallible> {
#[cfg(windows)]
compile_error!("this is not safe");
let path = req.uri().path();
let has_dotdot = path.split('/').any(|seg| seg == "..");


@@ -1,13 +1,9 @@
use std::time::SystemTime;
pub fn as_secs(t: SystemTime) -> u64 {
t.duration_since(SystemTime::UNIX_EPOCH)
.unwrap_or_else(|err| err.duration())
.as_secs()
}
pub fn as_millis(t: SystemTime) -> u128 {
t.duration_since(SystemTime::UNIX_EPOCH)
.unwrap_or_else(|err| err.duration())
.as_millis()
pub fn as_secs(t: &SystemTime) -> u64 {
match t.duration_since(SystemTime::UNIX_EPOCH) {
Ok(duration) => duration,
Err(err) => err.duration(),
}
.as_secs()
}


@@ -8,6 +8,7 @@ use thiserror::Error;
use tracing::{debug, error, info_span, trace};
const TEMPLATES: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/templates");
const PARTIALS: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/partials");
#[derive(Error, Debug)]
#[allow(clippy::enum_variant_names)]
@@ -28,7 +29,7 @@ fn is_ext(path: impl AsRef<Path>, ext: &str) -> bool {
}
}
fn get_template_name(path: &Path) -> Option<&str> {
pub(self) fn get_template_name<'a>(path: &'a Path) -> Option<&'a str> {
if !is_ext(path, "hbs") {
return None;
}
@@ -46,16 +47,25 @@
Ok(())
}
fn register_path(
fn register_path<'a>(
path: impl AsRef<std::path::Path>,
name: &str,
registry: &mut Handlebars<'_>,
registry: &mut Handlebars<'a>,
) -> Result<(), TemplateError> {
let template = compile_path(path)?;
registry.register_template(name, template);
Ok(())
}
fn register_partial(
file: &include_dir::File<'_>,
name: &str,
registry: &mut Handlebars,
) -> Result<(), TemplateError> {
registry.register_partial(name, file.contents_utf8().ok_or(TemplateError::UTF8Error)?)?;
Ok(())
}
fn compile_included_file(file: &include_dir::File<'_>) -> Result<Template, TemplateError> {
let contents = file.contents_utf8().ok_or(TemplateError::UTF8Error)?;
@@ -75,7 +85,7 @@ fn compile_path(path: impl AsRef<std::path::Path>) -> Result<Template, TemplateE
Ok(template)
}
async fn compile_path_async_io(
pub(self) async fn compile_path_async_io(
path: impl AsRef<std::path::Path>,
) -> Result<Template, TemplateError> {
use tokio::fs::OpenOptions;
@@ -115,6 +125,29 @@ pub fn new_registry<'a>(custom_templates_path: impl AsRef<Path>) -> io::Result<H
};
}
for entry in PARTIALS.entries() {
let file = match entry.as_file() {
Some(file) => file,
None => continue,
};
let span = info_span!("register_partial", path = ?file.path());
let _handle = span.enter();
let name = match get_template_name(file.path()) {
Some(v) => v,
None => {
trace!("skipping file");
continue;
}
};
match register_partial(file, name, &mut reg) {
Ok(()) => debug!("registered partial {name:?}"),
Err(err) => error!("error while registering partial: {err}"),
};
}
let read_dir = match std::fs::read_dir(custom_templates_path) {
Ok(v) => v,
Err(err) => match err.kind() {


@@ -3,12 +3,12 @@ use std::sync::Arc;
use std::time::Duration;
use handlebars::{Handlebars, Template};
use notify_debouncer_full::notify::{self};
use notify_debouncer_full::notify::{self, Watcher};
use notify_debouncer_full::{new_debouncer, DebouncedEvent};
use tokio::select;
use tokio::sync::RwLock;
use tokio_util::sync::CancellationToken;
use tracing::{debug, debug_span, error, info, instrument, trace};
use tracing::{debug, error, info, trace, trace_span};
use crate::templates::*;
@@ -20,7 +20,7 @@ async fn process_event(
notify::EventKind::Create(notify::event::CreateKind::File)
| notify::EventKind::Modify(_) => {
for path in &event.paths {
let span = debug_span!("modify_event", ?path);
let span = trace_span!("modify_event", ?path);
let _handle = span.enter();
let template_name = match get_template_name(path) {
@@ -33,13 +33,13 @@ async fn process_event(
trace!("processing recompilation");
let compiled = compile_path_async_io(path).await?;
debug!("compiled template {template_name:?}");
trace!("compiled template {template_name:?}");
templates.push((template_name.to_owned(), compiled));
}
}
notify::EventKind::Remove(notify::event::RemoveKind::File) => {
for path in &event.paths {
let span = debug_span!("remove_event", ?path);
let span = trace_span!("remove_event", ?path);
let _handle = span.enter();
let (file_name, template_name) = match path
@@ -60,7 +60,7 @@ async fn process_event(
let file = TEMPLATES.get_file(file_name);
if let Some(file) = file {
let compiled = compile_included_file(file)?;
debug!("compiled template {template_name:?}");
trace!("compiled template {template_name:?}");
templates.push((template_name.to_owned(), compiled));
}
}
@@ -71,12 +71,11 @@
Ok(())
}
#[instrument(skip_all)]
pub async fn watch_templates<'a>(
path: impl AsRef<Path>,
watcher_token: CancellationToken,
reg: Arc<RwLock<Handlebars<'a>>>,
) -> Result<(), color_eyre::eyre::Report> {
) -> Result<(), Box<dyn std::error::Error + Send + Sync + 'static>> {
let path = path.as_ref();
let (tx, mut rx) = tokio::sync::mpsc::channel(1);
@@ -86,7 +85,9 @@ pub async fn watch_templates<'a>(
.expect("failed to send message over channel")
})?;
debouncer.watch(path, notify::RecursiveMode::NonRecursive)?;
debouncer
.watcher()
.watch(path, notify::RecursiveMode::NonRecursive)?;
'event_loop: while let Some(events) = select! {
_ = watcher_token.cancelled() => {
@@ -106,21 +107,19 @@ pub async fn watch_templates<'a>(
let mut templates = Vec::new();
for event in events {
trace!("file event: {event:?}");
if let Err(err) = process_event(event, &mut templates).await {
error!("error while processing event: {err}");
}
}
if !templates.is_empty() {
let mut reg = reg.write().await;
for template in templates.into_iter() {
debug!("registered template {}", template.0);
reg.register_template(&template.0, template.1);
}
drop(reg);
info!("updated custom templates");
let mut reg = reg.write().await;
for template in templates.into_iter() {
debug!("registered template {}", template.0);
reg.register_template(&template.0, template.1);
}
drop(reg);
info!("updated custom templates");
}
Ok(())


@@ -1,8 +1,7 @@
function replaceDates() {
for (let el of document.querySelectorAll(".date-rfc3339")) {
let date = new Date(Date.parse(el.textContent));
el.title = el.textContent;
el.textContent = date.toLocaleString();
el.classList.replace("date-rfc3339", "tooltipped");
el.classList.remove("date-rfc3339");
}
}


@@ -54,22 +54,3 @@ th,
td:nth-child(1) {
word-break: keep-all;
}
blockquote {
margin-left: 1em;
padding-left: 1.5em;
border-left: 0.5em solid;
border-color: var(--blue);
& > blockquote {
border-color: var(--mauve);
& > blockquote {
border-color: var(--pink);
& > blockquote {
border-color: var(--rosewater);
& > blockquote {
border-color: var(--text);
}
}
}
}
}


@@ -1,5 +1,4 @@
/* colors from catppuccin https://github.com/catppuccin/catppuccin
licensed under the MIT license, available in the source tree */
/* colors */
:root {
--base: #1e1e2e;
--text: #cdd6f4;
@@ -28,21 +27,15 @@
}
}
:root {
/* please have at least one good monospace font */
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Source Code Pro", "Ubuntu Mono", monospace,
"Noto Sans", sans-serif;
}
:root,
code {
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Source Code Pro", monospace;
/* please have at least one good monospace font */
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Ubuntu Mono", monospace, sans-serif;
}
:root {
background-color: var(--base);
color: var(--text);
overflow-wrap: break-word;
}
a {
@@ -143,15 +136,15 @@ body > main > h1:first-child {
margin-top: 0;
}
@media (min-width: 900px) {
:root:has(body.cool) {
--target-ratio: 0.6;
@media (min-width: 956px) {
:root {
--target-ratio: 0.7; /* 669px - 1344px */
--width: min(100% * var(--target-ratio), 1920px * var(--target-ratio));
--padding: 4em;
--padded-width: calc(var(--width) - var(--padding) * 2);
}
body.cool {
body {
padding: 4em 0;
min-height: calc(100vh - 8em);
@@ -165,13 +158,13 @@ body > main > h1:first-child {
);
}
body.cool > * {
body > * {
margin: auto;
padding: var(--padding);
width: var(--padded-width);
}
body.cool > footer {
body > footer {
padding: initial;
width: var(--width);
}


@@ -5,9 +5,8 @@
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>error</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/custom/style.css" />
</head>
<body class="cool">
<body>
<main>
<h1>error</h1>
<p>{{ error }}</p>


@@ -1,25 +0,0 @@
running
<a href="{{bingus_info.repository}}" target="_blank">{{bingus_info.name}}</a>
v{{bingus_info.version}}
{{#if perf}}
<b> - </b>
{{/if}}
{{#each perf}}
{{#if (eq @key "Rendered")}}
<span class="tooltipped" title="parsing metadata took {{duration this.parsed}}">parsed meta</span>
and
<span class="tooltipped" title="rendering took {{duration this.rendered}}">rendered</span>
in
{{duration this.total}}
{{else if (eq @key "Cached")}}
retrieved from cache in
{{duration this}}
{{else if (eq @key "Fetched")}}
fetched in
{{duration this}}
{{/if}}
{{/each}}
{{#if raw_name}}
-
<a href="/posts/{{raw_name}}">view raw</a>
{{/if}}


@@ -1,20 +1,18 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="{{style.title}}" />
<meta property="og:title" content="{{style.title}}" />
<meta property="og:description" content="{{style.description}}" />
<meta name="description" content="{{title}}" />
<meta property="og:title" content="{{title}}" />
<meta property="og:description" content="{{description}}" />
<meta name="keywords" content="{{joined_tags}}" />
{{#if (ne color null)}}
<meta name="theme-color" content="{{style.color}}" />
<meta name="theme-color" content="{{color}}" />
{{/if}}
<title>{{style.title}}</title>
<title>{{title}}</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/custom/style.css" />
{{#if rss}}
<link rel="alternate" type="application/rss+xml" title="{{style.title}}" href="/feed.xml" />
<link rel="alternate" type="application/rss+xml" title="{{title}}" href="/feed.xml" />
{{/if}}
{{#if js}}
<script src="/static/date.js" defer></script>
@@ -22,24 +20,33 @@
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body class="cool">
<body>
<main>
{{>title}}
<h1>{{title}}</h1>
<p>{{description}}</p>
<h2>posts</h2>
<div>
{{#if js}}
<form id="sort" style="display: none">
sort by: {{sort}}
<br />
<input type="radio" name="sort" id="sort-date" value="date" {{#if (eq style.default_sort "date")}}checked{{/if}} />
<input type="radio" name="sort" id="sort-date" value="date" {{#if (eq sort "date")}}checked{{/if}} />
<label for="sort-date">date</label>
<input type="radio" name="sort" id="sort-name" value="name" {{#if (eq style.default_sort "name")}}checked{{/if}} />
<input type="radio" name="sort" id="sort-name" value="name" {{#if (eq sort "name")}}checked{{/if}} />
<label for="sort-name">name</label>
</form>
{{/if}}
<div id="posts">
{{>post_list}}
</div>
{{#each posts}}
<div id="posts">
<div class="post">
<a href="/posts/{{name}}"><b>{{title}}</b></a>
<span class="post-author">- by {{author}}</span>
<br />
{{description}}<br />
{{>post_table post df=@root.df}}
</div>
</div>
{{else}} there are no posts right now. check back later! {{/each}}
</div>
{{#if (gt (len tags) 0)}}
@@ -52,8 +59,5 @@
<span class="post-author">- {{this}} post{{#if (ne this 1)}}s{{/if}}</span><br />
{{/each}}
</main>
<footer>
{{>footer}}
</footer>
</body>
</html>


@@ -1,4 +1,3 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
@@ -22,7 +21,7 @@
<meta property="twitter:image:alt" content="{{meta.icon_alt}}" />
{{/if}}{{/if}}
{{#if (ne color null)}}
<meta name="theme-color" content="{{color}}" />
<meta name="theme-color" content="{{meta.color}}" />
{{/if}}
<title>{{meta.title}}</title>
<link rel="stylesheet" href="/static/style.css" />
@@ -34,7 +33,7 @@
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body class="cool">
<body>
<main>
<h1 class="post-title">
{{meta.title}}
@@ -42,15 +41,31 @@
</h1>
<p class="post-desc">{{meta.description}}</p>
<div class="post">
{{>post_table meta style=@root.style}}
{{>post_table meta df=@root.df}}
<a href="/posts/{{meta.name}}">link</a><br />
<a href="/">back to home</a>
</div>
<hr />
{{{body}}}
{{{rendered}}}
</main>
<footer>
{{>footer}}
{{#each rendered_in}}
{{#if (eq @key "ParsedAndRendered")}}
<span class="tooltipped" title="parsing took {{duration this.[1]}}">parsed</span>
and
<span class="tooltipped" title="rendering took {{duration this.[2]}}">rendered</span>
in
{{duration this.[0]}}
{{else if (eq @key "Cached")}}
retrieved from cache in
{{duration this}}
{{/if}}
{{/each}}
{{#if markdown_access}}
-
<a href="/posts/{{meta.name}}.md">view raw</a>
{{/if}}
</footer>
</body>
</html>


@@ -1,10 +0,0 @@
{{#each posts}}
<div class="post">
<section>
<a href="/posts/{{name}}"><b>{{title}}</b></a>
<span class="post-author">- by {{author}}</span>
</section>
<section role="paragraph">{{description}}</section>
{{>post_table post style=@root.style}}
</div>
{{else}} there are no posts right now. check back later! {{/each}}


@@ -1,25 +0,0 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>{{title}}</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/custom/style.css" />
{{#if rss}}
<link rel="alternate" type="application/rss+xml" title="{{style.title}}" href="/feed.xml" />
{{/if}}
{{#if js}}
<script src="/static/date.js" defer></script>
<script src="/static/sort.js" defer></script>
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body>
<main>
<div id="posts">
{{>post_list}}
</div>
</main>
</body>
</html>


@@ -1,2 +0,0 @@
<h1>{{style.title}}</h1>
<p>{{style.description}}</p>